Sample records for continuous random variables

  1. Maximum-entropy probability distributions under Lp-norm constraints

    NASA Technical Reports Server (NTRS)

    Dolinar, S.

    1991-01-01

    Continuous probability density functions and discrete probability mass functions are tabulated which maximize the differential entropy or absolute entropy, respectively, among all probability distributions with a given Lp norm (i.e., a given pth absolute moment when p is a finite integer) and unconstrained or constrained value set. Expressions for the maximum entropy are evaluated as functions of the Lp norm. The most interesting results are obtained and plotted for unconstrained (real valued) continuous random variables and for integer valued discrete random variables. The maximum entropy expressions are obtained in closed form for unconstrained continuous random variables, and in this case there is a simple straight line relationship between the maximum differential entropy and the logarithm of the Lp norm. Corresponding expressions for arbitrary discrete and constrained continuous random variables are given parametrically; closed form expressions are available only for special cases. However, simpler alternative bounds on the maximum entropy of integer valued discrete random variables are obtained by applying the differential entropy results to continuous random variables which approximate the integer valued random variables in a natural manner. All the results are presented in an integrated framework that includes continuous and discrete random variables, constraints on the permissible value set, and all possible values of p. Understanding such as this is useful in evaluating the performance of data compression schemes.
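
    As a quick numerical illustration of the closed-form continuous case (a sketch, not the paper's tabulation): for p = 2, the maximum-entropy density with a fixed L2 norm is the zero-mean Gaussian, whose differential entropy is linear in the logarithm of that norm; any competing density with the same norm, e.g. a Laplace, has strictly lower entropy.

    ```python
    # Illustrative check, assuming p = 2 and an unconstrained real-valued variable.
    import numpy as np
    from scipy import stats

    s = 1.7                                    # fixed L2 norm: E[X^2] = s**2
    h_gauss = stats.norm(scale=s).entropy()    # 0.5 * ln(2*pi*e*s**2), in nats
    print(np.isclose(h_gauss, np.log(s) + 0.5 * np.log(2 * np.pi * np.e)))  # True

    b = s / np.sqrt(2)                         # Laplace with the same E[X^2] = 2*b**2
    h_laplace = stats.laplace(scale=b).entropy()
    print(h_gauss > h_laplace)                 # True: the Gaussian maximizes entropy
    ```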

  2. A Geometrical Framework for Covariance Matrices of Continuous and Categorical Variables

    ERIC Educational Resources Information Center

    Vernizzi, Graziano; Nakai, Miki

    2015-01-01

    It is well known that a categorical random variable can be represented geometrically by a simplex. Accordingly, several measures of association between categorical variables have been proposed and discussed in the literature. Moreover, the standard definitions of covariance and correlation coefficient for continuous random variables have been…

  3. Randomized trial of intermittent or continuous amnioinfusion for variable decelerations.

    PubMed

    Rinehart, B K; Terrone, D A; Barrow, J H; Isler, C M; Barrilleaux, P S; Roberts, W E

    2000-10-01

    To determine whether continuous or intermittent bolus amnioinfusion is more effective in relieving variable decelerations. Patients with repetitive variable decelerations were randomized to an intermittent bolus or continuous amnioinfusion. The intermittent bolus infusion group received boluses of 500 mL of normal saline, each over 30 minutes, with boluses repeated if variable decelerations recurred. The continuous infusion group received a bolus infusion of 500 mL of normal saline over 30 minutes and then 3 mL per minute until delivery occurred. The ability of the amnioinfusion to abolish variable decelerations was analyzed, as were maternal demographic and pregnancy outcome variables. Power analysis indicated that 64 patients would be required. Thirty-five patients were randomized to intermittent infusion and 30 to continuous infusion. There were no differences between groups in terms of maternal demographics, gestational age, delivery mode, neonatal outcome, median time to resolution of variable decelerations, or the number of times variable decelerations recurred. The median volume infused in the intermittent infusion group (500 mL) was significantly less than that in the continuous infusion group (905 mL, P =.003). Intermittent bolus amnioinfusion is as effective as continuous infusion in relieving variable decelerations in labor. Further investigation is necessary to determine whether either of these techniques is associated with increased occurrence of rare complications such as cord prolapse or uterine rupture.

  4. Students' Misconceptions about Random Variables

    ERIC Educational Resources Information Center

    Kachapova, Farida; Kachapov, Ilias

    2012-01-01

    This article describes some misconceptions about random variables and related counter-examples, and makes suggestions about teaching initial topics on random variables in general form instead of doing it separately for discrete and continuous cases. The focus is on post-calculus probability courses. (Contains 2 figures.)

  5. Random Item IRT Models

    ERIC Educational Resources Information Center

    De Boeck, Paul

    2008-01-01

    It is common practice in IRT to consider items as fixed and persons as random. Both continuous and categorical person parameters are most often random variables, whereas for items only continuous parameters are used and they are commonly of the fixed type, although exceptions occur. It is shown in the present article that random item parameters…

  6. Approximating prediction uncertainty for random forest regression models

    Treesearch

    John W. Coulston; Christine E. Blinn; Valerie A. Thomas; Randolph H. Wynne

    2016-01-01

    The use of machine learning approaches such as random forest has increased for the spatial modeling and mapping of continuous variables. Random forest is a non-parametric ensemble approach, and unlike traditional regression approaches there is no direct quantification of prediction error. Understanding prediction uncertainty is important when using model-based continuous maps as...
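
    The abstract is truncated, but the general idea invites a sketch. One common way (an assumption here, not necessarily the authors' procedure) to approximate per-prediction uncertainty for random forest regression is the spread of the individual trees' predictions, which scikit-learn exposes directly:

    ```python
    # Hedged sketch: per-tree spread as a crude prediction-uncertainty proxy.
    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor

    X, y = make_regression(n_samples=500, n_features=8, noise=10.0, random_state=0)
    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

    per_tree = np.stack([t.predict(X[:5]) for t in rf.estimators_])  # (200, 5)
    print(per_tree.mean(axis=0))   # equals rf.predict(X[:5])
    print(per_tree.std(axis=0))    # larger spread = less certain prediction
    ```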

  7. Multivariate normal maximum likelihood with both ordinal and continuous variables, and data missing at random.

    PubMed

    Pritikin, Joshua N; Brick, Timothy R; Neale, Michael C

    2018-04-01

    A novel method for the maximum likelihood estimation of structural equation models (SEM) with both ordinal and continuous indicators is introduced using a flexible multivariate probit model for the ordinal indicators. A full information approach ensures unbiased estimates for data missing at random. Exceeding the capability of prior methods, up to 13 ordinal variables can be included before integration time increases beyond 1 s per row. The method relies on the axiom of conditional probability to split apart the distribution of continuous and ordinal variables. Due to the symmetry of the axiom, two similar methods are available. A simulation study provides evidence that the two similar approaches offer equal accuracy. A further simulation is used to develop a heuristic to automatically select the most computationally efficient approach. Joint ordinal continuous SEM is implemented in OpenMx, free and open-source software.

  8. Compositions, Random Sums and Continued Random Fractions of Poisson and Fractional Poisson Processes

    NASA Astrophysics Data System (ADS)

    Orsingher, Enzo; Polito, Federico

    2012-08-01

    In this paper we consider the relation between random sums and compositions of different processes. In particular, for independent Poisson processes $N_\alpha(t)$, $N_\beta(t)$, $t>0$, we have that $N_\alpha(N_\beta(t)) \stackrel{d}{=} \sum_{j=1}^{N_\beta(t)} X_j$, where the $X_j$ are Poisson random variables. We present a series of similar cases, where the outer process is Poisson with different inner processes. We highlight generalisations of these results where the external process is infinitely divisible. A section of the paper concerns compositions of the form $N_\alpha(\tau_k^\nu)$, $\nu\in(0,1]$, where $\tau_k^\nu$ is the inverse of the fractional Poisson process, and we show how these compositions can be represented as random sums. Furthermore we study compositions of the form $\Theta(N(t))$, $t>0$, which can be represented as random products. The last section is devoted to studying continued fractions of Cauchy random variables with a Poisson number of levels. We evaluate the exact distribution and derive the scale parameter in terms of ratios of Fibonacci numbers.
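
    A quick Monte Carlo check of the composition identity (rates and time chosen arbitrarily): conditionally on $N_\beta(t) = k$, both sides are Poisson with mean $\alpha k$, so the two samples should agree in distribution.

    ```python
    # Illustrative check of N_a(N_b(t)) =d= sum_{j=1}^{N_b(t)} X_j, X_j ~ Poisson(a).
    import numpy as np

    rng = np.random.default_rng(0)
    a, b, t, n = 2.0, 3.0, 1.5, 100_000        # rates alpha, beta; time; sample size

    nb = rng.poisson(b * t, size=n)            # N_b(t)
    lhs = rng.poisson(a * nb)                  # N_a evaluated at the random time N_b(t)
    rhs = np.array([rng.poisson(a, size=k).sum() for k in nb])  # the random sum

    print(lhs.mean(), rhs.mean())              # both ≈ a * b * t = 9
    print(lhs.var(), rhs.var())                # matching second moments as well
    ```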

  9. Random dopant fluctuations and statistical variability in n-channel junctionless FETs

    NASA Astrophysics Data System (ADS)

    Akhavan, N. D.; Umana-Membreno, G. A.; Gu, R.; Antoszewski, J.; Faraone, L.

    2018-01-01

    The influence of random dopant fluctuations on the statistical variability of the electrical characteristics of the n-channel silicon junctionless nanowire transistor (JNT) has been studied using three-dimensional quantum simulations based on the non-equilibrium Green’s function (NEGF) formalism. Average randomly distributed body doping densities of 2 × 10¹⁹, 6 × 10¹⁹ and 1 × 10²⁰ cm⁻³ have been considered, employing an atomistic model for JNTs with gate lengths of 5, 10 and 15 nm. We demonstrate that by properly adjusting the doping density in the JNT, near-ideal statistical variability and electrical performance can be achieved, which can pave the way for the continuation of scaling in silicon CMOS technology.

  10. Single-photon continuous-variable quantum key distribution based on the energy-time uncertainty relation.

    PubMed

    Qi, Bing

    2006-09-15

    We propose a new quantum key distribution protocol in which information is encoded on continuous variables of a single photon. In this protocol, Alice randomly encodes her information on either the central frequency of a narrowband single-photon pulse or the time delay of a broadband single-photon pulse, while Bob randomly chooses to do either frequency measurement or time measurement. The security of this protocol rests on the energy-time uncertainty relation, which prevents Eve from simultaneously determining both frequency and time information with arbitrarily high resolution. Since no interferometer is employed in this scheme, it is more robust against various channel noises, such as polarization and phase fluctuations.

  11. Uncertain dynamic analysis for rigid-flexible mechanisms with random geometry and material properties

    NASA Astrophysics Data System (ADS)

    Wu, Jinglai; Luo, Zhen; Zhang, Nong; Zhang, Yunqing; Walker, Paul D.

    2017-02-01

    This paper proposes an uncertain modelling and computational method to analyze the dynamic responses of rigid-flexible multibody systems (or mechanisms) with random geometry and material properties. Firstly, the deterministic model for the rigid-flexible multibody system is built with the absolute nodal coordinate formulation (ANCF), in which the flexible parts are modeled using ANCF elements, while the rigid parts are described by ANCF reference nodes (ANCF-RNs). Secondly, uncertainty in the geometry of the rigid parts is expressed as uniform random variables, while uncertainty in the material properties of the flexible parts is modeled as a continuous random field, which is further discretized into Gaussian random variables using a series expansion method. Finally, a non-intrusive numerical method is developed to solve the dynamic equations of systems involving both types of random variables, which systematically integrates the deterministic generalized-α solver with Latin Hypercube sampling (LHS) and Polynomial Chaos (PC) expansion. The benchmark slider-crank mechanism is used as a numerical example to demonstrate the characteristics of the proposed method.
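
    A schematic of the non-intrusive sampling step (shapes and parameter values are assumed, and `solver` is a hypothetical stand-in for the deterministic generalized-α multibody solve): draw Latin Hypercube samples, map them to the uniform geometric and Gaussian material parameters, run the solver per sample, and collect response statistics.

    ```python
    import numpy as np
    from scipy.stats import norm, qmc

    sampler = qmc.LatinHypercube(d=2, seed=0)
    u = sampler.random(n=64)                               # 64 LHS points in [0,1)^2
    length = qmc.scale(u[:, [0]], [0.98], [1.02]).ravel()  # uniform geometric parameter
    E = norm.ppf(u[:, 1], loc=200e9, scale=10e9)           # Gaussian material parameter

    def solver(length, modulus):
        # hypothetical stand-in for the deterministic dynamic simulation
        return length / modulus

    responses = np.array([solver(l, e) for l, e in zip(length, E)])
    print(responses.mean(), responses.std())               # sampled response statistics
    ```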

  12. Decision tree modeling using R.

    PubMed

    Zhang, Zhongheng

    2016-08-01

    In the machine learning field, the decision tree learner is powerful and easy to interpret. It employs a recursive binary partitioning algorithm that splits the sample on the partitioning variable with the strongest association with the response variable. The process continues until some stopping criteria are met. In the example I focus on the conditional inference tree, which incorporates tree-structured regression models into conditional inference procedures. Because a single tree is sensitive to small changes in the training data, the random forests procedure is introduced to address this problem. The sources of diversity for random forests are random sampling and the restricted set of input variables considered at each split. Finally, I introduce R functions to perform model-based recursive partitioning. This method incorporates recursive partitioning into conventional parametric model building.
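
    The article's examples are in R (conditional inference trees via party/partykit); as a rough cross-language sketch of the same single-tree-versus-forest contrast, in Python with scikit-learn (dataset and hyperparameters are arbitrary choices here):

    ```python
    from sklearn.datasets import load_diabetes
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeRegressor

    X, y = load_diabetes(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # One tree: recursive binary partitioning until a stopping rule is met.
    tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X_tr, y_tr)
    # Forest: diversity from resampling and per-split feature restriction.
    forest = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
    print(tree.score(X_te, y_te), forest.score(X_te, y_te))
    ```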

  13. Global solutions to random 3D vorticity equations for small initial data

    NASA Astrophysics Data System (ADS)

    Barbu, Viorel; Röckner, Michael

    2017-11-01

    One proves the existence and uniqueness in $(L^p(\mathbb{R}^3))^3$, $3/2 < p < 2$, of a global mild solution to random vorticity equations associated to stochastic 3D Navier-Stokes equations with linear multiplicative Gaussian noise of convolution type, for sufficiently small initial vorticity. This resembles some earlier deterministic results of T. Kato [16] and is obtained by treating the equation in vorticity form and reducing the latter to a random nonlinear parabolic equation. The solution has maximal regularity in the spatial variables and is weakly continuous in $(L^3 \cap L^{3p/(4p-6)})^3$ with respect to the time variable. Furthermore, we obtain the pathwise continuous dependence of solutions with respect to the initial data. In particular, one gets a locally unique solution of the 3D stochastic Navier-Stokes equation in vorticity form up to some explosion stopping time τ adapted to the Brownian motion.

  14. Why we should use simpler models if the data allow this: relevance for ANOVA designs in experimental biology.

    PubMed

    Lazic, Stanley E

    2008-07-21

    Analysis of variance (ANOVA) is a common statistical technique in physiological research, and often one or more of the independent/predictor variables such as dose, time, or age can be treated as a continuous, rather than a categorical, variable during analysis - even if subjects were randomly assigned to treatment groups. While this is not common practice, such an approach has a number of advantages, including greater statistical power due to increased precision, a simpler and more informative interpretation of the results, greater parsimony, and the possibility of transforming the predictor variable. An example is given from an experiment where rats were randomly assigned to receive either 0, 60, 180, or 240 mg/L of fluoxetine in their drinking water, with performance on the forced swim test as the outcome measure. Dose was treated as either a categorical or continuous variable during analysis, with the latter analysis leading to a more powerful test (p = 0.021 vs. p = 0.159). This will be true in general, and the reasons for this are discussed. There are many advantages to treating variables as continuous numeric variables if the data allow this, and this should be employed more often in experimental biology. Failure to use the optimal analysis runs the risk of missing significant effects or relationships.
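
    A hedged sketch of the core comparison on simulated data (invented numbers, not the original rat data): the linear model spends one degree of freedom on dose where the ANOVA spends three, so a monotone trend is tested with more power.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    dose = np.repeat([0, 60, 180, 240], 10)                 # 10 subjects per group
    y = 50 - 0.03 * dose + rng.normal(0, 8, dose.size)      # weak linear trend
    df = pd.DataFrame({"dose": dose, "y": y})

    anova = smf.ols("y ~ C(dose)", data=df).fit()           # dose as categorical
    linear = smf.ols("y ~ dose", data=df).fit()             # dose as continuous
    print(anova.f_pvalue, linear.pvalues["dose"])           # trend test usually smaller
    ```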

  15. Smooth conditional distribution function and quantiles under random censorship.

    PubMed

    Leconte, Eve; Poiraud-Casanova, Sandrine; Thomas-Agnan, Christine

    2002-09-01

    We consider a nonparametric random design regression model in which the response variable is possibly right censored. The aim of this paper is to estimate the conditional distribution function and the conditional alpha-quantile of the response variable. We restrict attention to the case where the response variable as well as the explanatory variable are unidimensional and continuous. We propose and discuss two classes of estimators which are smooth with respect to the response variable as well as to the covariate. Some simulations demonstrate that the new methods have better mean square error performances than the generalized Kaplan-Meier estimator introduced by Beran (1981) and considered in the literature by Dabrowska (1989, 1992) and Gonzalez-Manteiga and Cadarso-Suarez (1994).

  16. Examining solutions to missing data in longitudinal nursing research.

    PubMed

    Roberts, Mary B; Sullivan, Mary C; Winchester, Suzy B

    2017-04-01

    Longitudinal studies are highly valuable in pediatrics because they provide useful data about developmental patterns of child health and behavior over time. When data are missing, the value of the research is impacted. The study's purpose was to (1) introduce a three-step approach to assess and address missing data and (2) illustrate this approach using categorical and continuous-level variables from a longitudinal study of premature infants. A three-step approach with simulations was followed to assess the amount and pattern of missing data and to determine the most appropriate imputation method for the missing data. Patterns of missingness were Missing Completely at Random, Missing at Random, and Not Missing at Random. Missing continuous-level data were imputed using mean replacement, stochastic regression, multiple imputation, and fully conditional specification (FCS). Missing categorical-level data were imputed using last value carried forward, hot-decking, stochastic regression, and FCS. Simulations were used to evaluate these imputation methods under different patterns of missingness at different levels of missing data. The rate of missingness was 16-23% for continuous variables and 1-28% for categorical variables. FCS imputation provided the least difference in mean and standard deviation estimates for continuous measures. FCS imputation was acceptable for categorical measures. Results obtained through simulation reinforced and confirmed these findings. Significant investments are made in the collection of longitudinal data. The prudent handling of missing data can protect these investments and potentially improve the scientific information contained in pediatric longitudinal studies. © 2017 Wiley Periodicals, Inc.
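
    As a sketch of the fully conditional specification (chained equations) step for continuous variables (the authors' exact software is not given here; scikit-learn's IterativeImputer serves as a stand-in):

    ```python
    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 4))
    X[:, 3] += 0.8 * X[:, 0]                   # correlated columns aid imputation
    mask = rng.random(X.shape) < 0.2           # ~20% missing, cf. the 16-23% rates
    X_miss = np.where(mask, np.nan, X)

    imputer = IterativeImputer(sample_posterior=True, random_state=0)
    X_imp = imputer.fit_transform(X_miss)
    print(np.abs(X_imp.mean(axis=0) - X.mean(axis=0)))  # small mean differences
    ```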

  17. Continuous-variable phase estimation with unitary and random linear disturbance

    NASA Astrophysics Data System (ADS)

    Delgado de Souza, Douglas; Genoni, Marco G.; Kim, M. S.

    2014-10-01

    We address the problem of continuous-variable quantum phase estimation in the presence of linear disturbance at the Hamiltonian level by means of Gaussian probe states. In particular we discuss both unitary and random disturbance by considering the parameter which characterizes the unwanted linear term present in the Hamiltonian as fixed (unitary disturbance) or random with a given probability distribution (random disturbance). We derive the optimal input Gaussian states at fixed energy, maximizing the quantum Fisher information over the squeezing angle and the squeezing energy fraction, and we discuss the scaling of the quantum Fisher information in terms of the output number of photons, n_out. We observe that, in the case of unitary disturbance, the optimal state is a squeezed vacuum state and the quadratic scaling is conserved. As regards the random disturbance, we observe that the optimal squeezing fraction may not be equal to one and, for any nonzero value of the noise parameter, the quantum Fisher information scales linearly with the average number of photons. Finally, we discuss the performance of homodyne measurement by comparing the achievable precision with the ultimate limit imposed by the quantum Cramér-Rao bound.

  18. Continuous-Time Random Walk with multi-step memory: an application to market dynamics

    NASA Astrophysics Data System (ADS)

    Gubiec, Tomasz; Kutner, Ryszard

    2017-11-01

    An extended version of the Continuous-Time Random Walk (CTRW) model with memory is herein developed. This memory involves the dependence between an arbitrary number of successive jumps of the process, while waiting times between jumps are considered as i.i.d. random variables. This dependence was established by analyzing empirical histograms for the stochastic process of a single share price on a market within the high-frequency time scale. It was then justified theoretically by considering the bid-ask bounce mechanism containing some delay characteristic of any double-auction market. Our model turns out to be exactly solvable analytically. It therefore enables a direct comparison of its predictions with their empirical counterparts, for instance, with the empirical velocity autocorrelation function. Thus, the present research significantly extends the capabilities of the CTRW formalism. Contribution to the Topical Issue "Continuous Time Random Walk Still Trendy: Fifty-year History, Current State and Outlook", edited by Ryszard Kutner and Jaume Masoliver.

  19. Bubble CPAP versus CPAP with variable flow in newborns with respiratory distress: a randomized controlled trial.

    PubMed

    Yagui, Ana Cristina Zanon; Vale, Luciana Assis Pires Andrade; Haddad, Luciana Branco; Prado, Cristiane; Rossi, Felipe Souza; Deutsch, Alice D Agostini; Rebello, Celso Moura

    2011-01-01

    To evaluate the efficacy and safety of nasal continuous positive airway pressure (NCPAP) using devices with variable flow or bubble continuous positive airway pressure (CPAP) regarding CPAP failure, presence of air leaks, total CPAP and oxygen time, and length of intensive care unit and hospital stay in neonates with moderate respiratory distress (RD) and birth weight (BW) ≥ 1,500 g. Forty newborns requiring NCPAP were randomized into two study groups: variable flow group (VF) and continuous flow group (CF). The study was conducted between October 2008 and April 2010. Demographic data, CPAP failure, presence of air leaks, and total CPAP and oxygen time were recorded. Categorical outcomes were tested using the chi-square test or the Fisher's exact test. Continuous variables were analyzed using the Mann-Whitney test. The level of significance was set at p < 0.05. There were no differences between the groups with regard to demographic data, CPAP failure (21.1 and 20.0% for VF and CF, respectively; p = 1.000), air leak syndrome (10.5 and 5.0%, respectively; p = 0.605), total CPAP time (median: 22.0 h, interquartile range [IQR]: 8.00-31.00 h and median: 22.0 h, IQR: 6.00-32.00 h, respectively; p = 0.822), and total oxygen time (median: 24.00 h, IQR: 7.00-85.00 h and median: 21.00 h, IQR: 9.50-66.75 h, respectively; p = 0.779). In newborns with BW ≥ 1,500 g and moderate RD, the use of continuous flow NCPAP showed the same benefits as the use of variable flow NCPAP.

  20. Examining Solutions to Missing Data in Longitudinal Nursing Research

    PubMed Central

    Roberts, Mary B.; Sullivan, Mary C.; Winchester, Suzy B.

    2017-01-01

    Purpose Longitudinal studies are highly valuable in pediatrics because they provide useful data about developmental patterns of child health and behavior over time. When data are missing, the value of the research is impacted. The study’s purpose was to: (1) introduce a 3-step approach to assess and address missing data; (2) illustrate this approach using categorical and continuous level variables from a longitudinal study of premature infants. Methods A three-step approach with simulations was followed to assess the amount and pattern of missing data and to determine the most appropriate imputation method for the missing data. Patterns of missingness were Missing Completely at Random, Missing at Random, and Not Missing at Random. Missing continuous-level data were imputed using mean replacement, stochastic regression, multiple imputation, and fully conditional specification. Missing categorical-level data were imputed using last value carried forward, hot-decking, stochastic regression, and fully conditional specification. Simulations were used to evaluate these imputation methods under different patterns of missingness at different levels of missing data. Results The rate of missingness was 16–23% for continuous variables and 1–28% for categorical variables. Fully conditional specification imputation provided the least difference in mean and standard deviation estimates for continuous measures. Fully conditional specification imputation was acceptable for categorical measures. Results obtained through simulation reinforced and confirmed these findings. Practice Implications Significant investments are made in the collection of longitudinal data. The prudent handling of missing data can protect these investments and potentially improve the scientific information contained in pediatric longitudinal studies. PMID:28425202

  21. Effect of Technology-Enhanced Continuous Progress Monitoring on Math Achievement

    ERIC Educational Resources Information Center

    Ysseldyke, Jim; Bolt, Daniel M.

    2007-01-01

    We examined the extent to which use of a technology-enhanced continuous progress monitoring system would enhance the results of math instruction, examined variability in teacher implementation of the program, and compared math results in classrooms in which teachers did and did not use the system. Classrooms were randomly assigned to within-school…

  22. Continuous operation of four-state continuous-variable quantum key distribution system

    NASA Astrophysics Data System (ADS)

    Matsubara, Takuto; Ono, Motoharu; Oguri, Yusuke; Ichikawa, Tsubasa; Hirano, Takuya; Kasai, Kenta; Matsumoto, Ryutaroh; Tsurumaru, Toyohiro

    2016-10-01

    We report on the development of a continuous-variable quantum key distribution (CV-QKD) system that is based on discrete quadrature amplitude modulation (QAM) and homodyne detection of coherent states of light. We use a pulsed light source whose wavelength is 1550 nm and whose repetition rate is 10 MHz. The CV-QKD system can continuously generate a secret key that is secure against the entangling cloner attack. The key generation rate is 50 kbps when the quantum channel is a 10 km optical fiber. The CV-QKD system we have developed utilizes the four-state and post-selection protocol [T. Hirano et al., Phys. Rev. A 68, 042331 (2003)]: Alice randomly sends one of four states {|±α⟩, |±iα⟩}, and Bob randomly performs x- or p-measurement by homodyne detection. A commercially available balanced receiver is used to realize shot-noise-limited pulsed homodyne detection. GPU cards are used to accelerate the software-based post-processing. We use a non-binary LDPC code for error correction (reverse reconciliation) and Toeplitz matrix multiplication for privacy amplification.

  23. A prospective, randomized, controlled study comparing Gynemesh, a synthetic mesh, and Pelvicol, a biologic graft, in the surgical treatment of recurrent cystocele.

    PubMed

    Natale, F; La Penna, C; Padoa, A; Agostini, M; De Simone, E; Cervigni, M

    2009-01-01

    We compared the safety and efficacy of Gynemesh PS and Pelvicol for recurrent cystocele repair. One hundred ninety patients were randomly divided into Gynemesh PS and Pelvicol groups and underwent tension-free cystocele repair. The chi-square test was used to compare categorical variables, the paired t test for continuous parametric variables, and the Mann-Whitney test for continuous nonparametric variables. Ninety-six Gynemesh PS patients and 94 Pelvicol patients were studied. Mesh erosions occurred in 6.3% of Gynemesh PS patients; no erosions were observed in Pelvicol patients (p = 0.02). Objective cure was 71.9% for Gynemesh PS and 56.4% for Pelvicol (p = 0.06). Subjective cure was the same in both groups, except for better sexuality in the Pelvicol group. At 24 months of follow-up, only Gynemesh PS patients had mesh erosions. Anatomical outcome was similar in the two groups. Pelvicol had a better impact on voiding and sexuality.

  24. Lack of difference between continuous versus intermittent heparin infusion on maintenance of intra-arterial catheter in postoperative pediatric surgery: a randomized controlled study

    PubMed Central

    Witkowski, Maria Carolina; de Moraes, Maria Antonieta P.; Firpo, Cora Maria F.

    2013-01-01

    OBJECTIVE: To compare two systems of arterial catheter maintenance in postoperative pediatric surgery using intermittent or continuous infusion of heparin solution, and to analyze adverse events related to the site of catheter insertion and the volume of infused heparin solution. METHODS: Randomized controlled trial with 140 patients selected for the continuous infusion group (CIG) and the intermittent infusion group (IIG). The variables analyzed were: type of heart disease, permanence time and size of the catheter, insertion site, technique used, volume of heparin solution and adverse events. The descriptive variables were analyzed by Student's t-test and the categorical variables by the chi-square test, with p<0.05 considered significant. RESULTS: The median age was 11 (0-22) months, and 77 (55%) were females. No significant differences between the studied variables were found, except for the volume used in the CIG (12.0±1.2mL/24 hours) when compared to the IIG (5.3±3.5mL/24 hours), with p<0.0003. CONCLUSIONS: The continuous infusion system and the intermittent infusion of heparin solution can both be used for intra-arterial catheter maintenance in postoperative pediatric surgery, regardless of the patient's clinical and demographic characteristics. Adverse events up to the third postoperative day occurred similarly in both groups. However, use of the intermittent infusion system should be considered in underweight children, due to the lower volume of infused heparin solution [ClinicalTrials.gov Identifier: NCT01097031]. PMID:24473958

  25. Nasal Jet-CPAP (variable flow) versus Bubble-CPAP in preterm infants with respiratory distress: an open label, randomized controlled trial.

    PubMed

    Bhatti, A; Khan, J; Murki, S; Sundaram, V; Saini, S S; Kumar, P

    2015-11-01

    To compare the failure rates between the Jet continuous positive airway pressure device (J-CPAP, variable flow) and the Bubble continuous positive airway pressure device (B-CPAP) in preterm infants with respiratory distress. Preterm newborns <34 weeks gestation with onset of respiratory distress within 6 h of life were randomized to receive J-CPAP (a variable flow device) or B-CPAP (a continuous flow device). A standardized protocol was followed for titration, weaning and removal of CPAP. Pressure was monitored close to the nares in both devices every 6 hours and settings were adjusted to provide the desired CPAP. The primary outcome was the CPAP failure rate within 72 h of life. Secondary outcomes were CPAP failure within 7 days of life, need for surfactant post-randomization, time to CPAP failure, duration of CPAP and complications of prematurity. An intention-to-treat analysis was done. One hundred seventy neonates were randomized, 80 to J-CPAP and 90 to B-CPAP. CPAP failure rates within 72 h were similar in infants who received J-CPAP and in those who received B-CPAP (29 versus 21%; relative risk 1.4 (0.8 to 2.3), P=0.25). Mean (95% confidence interval) time to CPAP failure was 59 h (54 to 64) in the Jet CPAP group in comparison with 65 h (62 to 68) in the Bubble CPAP group (log-rank P=0.19). All other secondary outcomes were similar between the two groups. In preterm infants with respiratory distress starting within 6 h of life, CPAP failure rates were similar with Jet CPAP and Bubble CPAP.

  26. Making statistical inferences about software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1988-01-01

    Failure times of software undergoing random debugging can be modelled as order statistics of independent but nonidentically distributed exponential random variables. Using this model inferences can be made about current reliability and, if debugging continues, future reliability. This model also shows the difficulty inherent in statistical verification of very highly reliable software such as that used by digital avionics in commercial aircraft.

  27. Benford's law and continuous dependent random variables

    NASA Astrophysics Data System (ADS)

    Becker, Thealexa; Burt, David; Corcoran, Taylor C.; Greaves-Tunnell, Alec; Iafrate, Joseph R.; Jing, Joy; Miller, Steven J.; Porfilio, Jaclyn D.; Ronan, Ryan; Samranvedhya, Jirapat; Strauch, Frederick W.; Talbut, Blaine

    2018-01-01

    Many mathematical, man-made and natural systems exhibit a leading-digit bias, where a first digit (base 10) of 1 occurs not 11% of the time, as one would expect if all digits were equally likely, but rather 30%. This phenomenon is known as Benford's Law. Analyzing which datasets adhere to Benford's Law and how quickly Benford behavior sets in are the two most important problems in the field. Most previous work studied systems of independent random variables, and relied on the independence in their analyses. Inspired by natural processes such as particle decay, we study the dependent random variables that emerge from models of decomposition of conserved quantities. We prove that in many instances the distribution of lengths of the resulting pieces converges to Benford behavior as the number of divisions grows, and give several conjectures for other fragmentation processes. The main difficulty is that the resulting random variables are dependent. We handle this by using tools from Fourier analysis and irrationality exponents to obtain quantified convergence rates, as well as by introducing and developing techniques to measure and control the dependencies. The construction of these tools is one of the major motivations of this work, as our approach can be applied to many other dependent systems. As an example, we show that the n! entries in the determinant expansions of n × n matrices with entries independently drawn from nice random variables converge to Benford's Law.
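
    A Monte Carlo sketch of the fragmentation model described above (generation count and seed are arbitrary): repeatedly split every piece of a unit stick at an independent uniform point; after several generations the leading digits of the piece lengths sit close to the Benford probabilities log10(1 + 1/d).

    ```python
    import math
    import numpy as np

    rng = np.random.default_rng(5)
    pieces = np.array([1.0])
    for _ in range(12):                        # 2**12 = 4096 pieces, total length 1
        cuts = rng.uniform(size=pieces.size)
        pieces = np.concatenate([pieces * cuts, pieces * (1 - cuts)])

    digits = (10 ** (np.log10(pieces) % 1)).astype(int)     # leading digits, 1..9
    for d in range(1, 10):
        print(d, float(np.mean(digits == d)), round(math.log10(1 + 1 / d), 4))
    ```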

  28. Biochemical Network Stochastic Simulator (BioNetS): software for stochastic modeling of biochemical networks.

    PubMed

    Adalsteinsson, David; McMillen, David; Elston, Timothy C

    2004-03-08

    Intrinsic fluctuations due to the stochastic nature of biochemical reactions can have large effects on the response of biochemical networks. This is particularly true for pathways that involve transcriptional regulation, where generally there are two copies of each gene and the number of messenger RNA (mRNA) molecules can be small. Therefore, there is a need for computational tools for developing and investigating stochastic models of biochemical networks. We have developed the software package Biochemical Network Stochastic Simulator (BioNetS) for efficiently and accurately simulating stochastic models of biochemical networks. BioNetS has a graphical user interface that allows models to be entered in a straightforward manner, and allows the user to specify the type of random variable (discrete or continuous) for each chemical species in the network. The discrete variables are simulated using an efficient implementation of the Gillespie algorithm. For the continuous random variables, BioNetS constructs and numerically solves the appropriate chemical Langevin equations. The software package has been developed to scale efficiently with network size, thereby allowing large systems to be studied. BioNetS runs as a BioSpice agent and can be downloaded from http://www.biospice.org. BioNetS also can be run as a stand alone package. All the required files are accessible from http://x.amath.unc.edu/BioNetS. We have developed BioNetS to be a reliable tool for studying the stochastic dynamics of large biochemical networks. Important features of BioNetS are its ability to handle hybrid models that consist of both continuous and discrete random variables and its ability to model cell growth and division. We have verified the accuracy and efficiency of the numerical methods by considering several test systems.
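
    For the discrete species BioNetS uses the Gillespie algorithm; here is a minimal self-contained sketch of that algorithm (illustrative only, not BioNetS code) for a one-species birth-death network with production rate k and degradation rate g·m:

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    k, g = 10.0, 1.0                  # production and per-molecule degradation rates
    t, m, t_end = 0.0, 0, 500.0
    occupancy = {}                    # total time spent at each copy number
    while t < t_end:
        total = k + g * m             # sum of reaction propensities
        dt = rng.exponential(1 / total)
        occupancy[m] = occupancy.get(m, 0.0) + dt
        t += dt
        m += 1 if rng.random() < k / total else -1   # birth or death event

    mean = sum(mm * w for mm, w in occupancy.items()) / sum(occupancy.values())
    print(mean)                       # ≈ k/g = 10, the Poisson stationary mean
    ```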

  29. Random variable transformation for generalized stochastic radiative transfer in finite participating slab media

    NASA Astrophysics Data System (ADS)

    El-Wakil, S. A.; Sallah, M.; El-Hanbaly, A. M.

    2015-10-01

    The stochastic radiative transfer problem is studied in a participating planar finite continuously fluctuating medium. The problem is considered for specularly- and diffusely-reflecting boundaries with linear anisotropic scattering. The random variable transformation (RVT) technique is used to get the complete average for the solution functions, which are represented by the probability density function (PDF) of the solution process. In the RVT algorithm, a simple integral transformation is applied to the input stochastic process (the extinction function of the medium). This linear transformation enables us to rewrite the stochastic transport equations in terms of the optical random variable (x) and the optical random thickness (L). The transport equation is then solved deterministically to get a closed form for the solution as a function of x and L. This solution is used to obtain the PDF of the solution functions by applying the RVT technique between the input random variable (L) and the output process (the solution functions). The obtained averages of the solution functions are used to get complete analytical averages for some interesting physical quantities, namely, the reflectivity and transmissivity at the medium boundaries. In terms of the average reflectivity and transmissivity, the averages of the partial heat fluxes for the generalized problem with an internal source of radiation are obtained and represented graphically.

  30. On Probability Domains IV

    NASA Astrophysics Data System (ADS)

    Frič, Roman; Papčo, Martin

    2017-12-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one (a quantum phenomenon) and, dually, an observable can map a crisp random event to a genuine fuzzy random event (a fuzzy phenomenon). The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.

  31. Randomized, multicenter study: on-demand versus continuous maintenance treatment with esomeprazole in patients with non-erosive gastroesophageal reflux disease.

    PubMed

    Bayerdörffer, Ekkehard; Bigard, Marc-Andre; Weiss, Werner; Mearin, Fermín; Rodrigo, Luis; Dominguez Muñoz, Juan Enrique; Grundling, Hennie; Persson, Tore; Svedberg, Lars-Erik; Keeling, Nanna; Eklund, Stefan

    2016-04-14

    Most patients with gastroesophageal reflux disease experience symptomatic relapse after stopping acid-suppressive medication. The aim of this study was to compare willingness to continue treatment with esomeprazole on-demand versus continuous maintenance therapy for symptom control in patients with non-erosive reflux disease (NERD) after 6 months. This multicenter, open-label, randomized, parallel-group study enrolled adults with NERD who were heartburn-free after 4 weeks' treatment with esomeprazole 20 mg daily. Patients received esomeprazole 20 mg daily continuously or on-demand for 6 months. The primary variable was discontinuation due to unsatisfactory treatment. On-demand treatment was considered non-inferior if the upper limit of the one-sided 95 % confidence interval (CI) for the difference between treatments was <10 %. Of 877 patients enrolled, 598 were randomized to maintenance treatment (continuous: n = 297; on-demand: n = 301). Discontinuation due to unsatisfactory treatment was 6.3 % for on-demand and 9.8 % for continuous treatment (difference -3.5 % [90 % CI: -7.1 %, 0.2 %]). In total, 82.1 and 86.2 % of patients taking on-demand and continuous therapy, respectively, were satisfied with the treatment of heartburn and regurgitation symptoms, a secondary variable (P = NS). Mean study drug consumption was 0.41 and 0.91 tablets/day, respectively. Overall, 5 % of the on-demand group developed reflux esophagitis versus none in the continuous group (P < 0.0001). The Gastrointestinal Symptom Rating Scale Reflux dimension was also improved for continuous versus on-demand treatment. Esomeprazole was well tolerated. In terms of willingness to continue treatment, on-demand treatment with esomeprazole 20 mg was non-inferior to continuous maintenance treatment and reduced medication usage in patients with NERD who had achieved symptom control with initial esomeprazole treatment. ClinicalTrials.gov identifier (NCT number): NCT02670642 ; Date of registration: December 2015.

  32. Regression dilution bias: tools for correction methods and sample size calculation.

    PubMed

    Berglund, Lars

    2012-08-01

    Random errors in measurement of a risk factor will introduce downward bias of an estimated association to a disease or a disease marker. This phenomenon is called regression dilution bias. A bias correction may be made with data from a validity study or a reliability study. In this article we give a non-technical description of designs of reliability studies with emphasis on selection of individuals for a repeated measurement, assumptions of measurement error models, and correction methods for the slope in a simple linear regression model where the dependent variable is a continuous variable. Also, we describe situations where correction for regression dilution bias is not appropriate. The methods are illustrated with the association between insulin sensitivity measured with the euglycaemic insulin clamp technique and fasting insulin, where measurement of the latter variable carries noticeable random error. We provide software tools for estimation of a corrected slope in a simple linear regression model assuming data for a continuous dependent variable and a continuous risk factor from a main study and an additional measurement of the risk factor in a reliability study. Also, we supply programs for estimation of the number of individuals needed in the reliability study and for choice of its design. Our conclusion is that correction for regression dilution bias is seldom applied in epidemiological studies. This may cause important effects of risk factors with large measurement errors to be neglected.
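
    A toy demonstration of the bias and its correction on simulated data (not the authors' software): the naive slope is attenuated by the reliability ratio λ = var(true)/var(observed), which a repeated measurement from a reliability study lets us estimate.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n = 2000
    x_true = rng.normal(0, 1, n)
    y = 2.0 * x_true + rng.normal(0, 1, n)     # true slope = 2
    x1 = x_true + rng.normal(0, 0.7, n)        # error-prone measurement (main study)
    x2 = x_true + rng.normal(0, 0.7, n)        # repeat (reliability study)

    slope_naive = np.polyfit(x1, y, 1)[0]              # biased toward zero
    lam = np.cov(x1, x2)[0, 1] / np.var(x1, ddof=1)    # estimated reliability ratio
    print(slope_naive, slope_naive / lam)              # ≈ 1.34, then ≈ 2.0
    ```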

  33. Predicting declines in perceived relationship continuity using practice deprivation scores: a longitudinal study in primary care.

    PubMed

    Levene, Louis S; Baker, Richard; Walker, Nicola; Williams, Christopher; Wilson, Andrew; Bankart, John

    2018-06-01

    Increased relationship continuity in primary care is associated with better health outcomes, greater patient satisfaction, and fewer hospital admissions. Greater socioeconomic deprivation is associated with lower levels of continuity, as well as poorer health outcomes. To investigate whether deprivation scores predicted variations in the decline over time of patient-perceived relationship continuity of care, after adjustment for practice organisational and population factors. An observational study in 6243 primary care practices with more than one GP, in England, using a longitudinal multilevel linear model, 2012-2017 inclusive. Patient-perceived relationship continuity was calculated using two questions from the GP Patient Survey. The effect of deprivation on the linear slope of continuity over time was modelled, adjusting for nine confounding variables (practice population and organisational factors). Clustering of measurements within general practices was adjusted for by using a random intercepts and random slopes model. Descriptive statistics and univariable analyses were also undertaken. Relationship continuity declined by 27.5% between 2012 and 2017, and at all deprivation levels. Deprivation scores from 2012 did not predict variations in the decline of relationship continuity at practice level, after accounting for the effects of organisational and population confounding variables, which themselves did not predict, or weakly predicted with very small effect sizes, the decline of continuity. Cross-sectionally, continuity and deprivation were negatively correlated within each year. The decline in relationship continuity of care has been marked and widespread. Measures to maximise continuity will need to be feasible for individual practices with diverse population and organisational characteristics. © British Journal of General Practice 2018.

  34. Circularly-symmetric complex normal ratio distribution for scalar transmissibility functions. Part I: Fundamentals

    NASA Astrophysics Data System (ADS)

    Yan, Wang-Ji; Ren, Wei-Xin

    2016-12-01

    Recent advances in signal processing and structural dynamics have spurred the adoption of transmissibility functions in academia and industry alike. Due to the inherent randomness of measurement and the variability of environmental conditions, uncertainty impacts their applications. This study is focused on statistical inference for raw scalar transmissibility functions modeled as complex ratio random variables. The goal is achieved through companion papers. This paper (Part I) is dedicated to the formal mathematical proof. New theorems on the multivariate circularly-symmetric complex normal ratio distribution are proved on the basis of the principle of probabilistic transformation of continuous random vectors. The closed-form distributional formulas for multivariate ratios of correlated circularly-symmetric complex normal random variables are analytically derived. Afterwards, several properties are deduced as corollaries and lemmas to the new theorems. Monte Carlo simulation (MCS) is utilized to verify the accuracy of some representative cases. This work lays the mathematical groundwork to find probabilistic models for raw scalar transmissibility functions, which are expounded in detail in Part II of this study.

  35. Eliminating Survivor Bias in Two-stage Instrumental Variable Estimators.

    PubMed

    Vansteelandt, Stijn; Walter, Stefan; Tchetgen Tchetgen, Eric

    2018-07-01

    Mendelian randomization studies commonly focus on elderly populations. This makes the instrumental variables analysis of such studies sensitive to survivor bias, a type of selection bias. A particular concern is that the instrumental variable conditions, even when valid for the source population, may be violated for the selective population of individuals who survive the onset of the study. This is potentially very damaging because Mendelian randomization studies are known to be sensitive to bias due to even minor violations of the instrumental variable conditions. Interestingly, the instrumental variable conditions continue to hold within certain risk sets of individuals who are still alive at a given age when the instrument and unmeasured confounders exert additive effects on the exposure, and moreover, the exposure and unmeasured confounders exert additive effects on the hazard of death. In this article, we will exploit this property to derive a two-stage instrumental variable estimator for the effect of exposure on mortality, which is insulated against the above described selection bias under these additivity assumptions.

  36. Record statistics of a strongly correlated time series: random walks and Lévy flights

    NASA Astrophysics Data System (ADS)

    Godrèche, Claude; Majumdar, Satya N.; Schehr, Grégory

    2017-08-01

    We review recent advances on the record statistics of strongly correlated time series, whose entries denote the positions of a random walk or a Lévy flight on a line. After a brief survey of the theory of records for independent and identically distributed random variables, we focus on random walks. During the last few years, it was indeed realized that random walks are a very useful ‘laboratory’ to test the effects of correlations on the record statistics. We start with the simple one-dimensional random walk with symmetric jumps (both continuous and discrete) and discuss in detail the statistics of the number of records, as well as of the ages of the records, i.e. the lapses of time between two successive record breaking events. Then we review the results that were obtained for a wide variety of random walk models, including random walks with a linear drift, continuous time random walks, constrained random walks (like the random walk bridge) and the case of multiple independent random walkers. Finally, we discuss further observables related to records, like the record increments, as well as some questions raised by physical applications of record statistics, like the effects of measurement error and noise.
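
    A small Monte Carlo illustration of the universality discussed in the review (parameters assumed): for any symmetric continuous jump distribution, Lévy flights included, the mean number of records after n steps grows like 2√(n/π).

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    trials, n = 2000, 1000
    jumps = rng.standard_cauchy((trials, n))   # Lévy flight; the count is universal
    walks = np.concatenate([np.zeros((trials, 1)), np.cumsum(jumps, axis=1)], axis=1)
    is_record = walks == np.maximum.accumulate(walks, axis=1)  # ties have probability 0
    print(is_record.sum(axis=1).mean(), 2 * np.sqrt(n / np.pi))  # both ≈ 36
    ```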

  37. Interval estimation and optimal design for the within-subject coefficient of variation for continuous and binary variables

    PubMed Central

    Shoukri, Mohamed M; Elkum, Nasser; Walter, Stephen D

    2006-01-01

    Background In this paper we propose the use of the within-subject coefficient of variation as an index of a measurement's reliability. For continuous variables and based on its maximum likelihood estimation we derive a variance-stabilizing transformation and discuss confidence interval construction within the framework of a one-way random effects model. We investigate sample size requirements for the within-subject coefficient of variation for continuous and binary variables. Methods We investigate the validity of the approximate normal confidence interval by Monte Carlo simulations. In designing a reliability study, a crucial issue is the balance between the number of subjects to be recruited and the number of repeated measurements per subject. We discuss efficiency of estimation and cost considerations for the optimal allocation of the sample resources. The approach is illustrated by an example on Magnetic Resonance Imaging (MRI). We also discuss the issue of sample size estimation for dichotomous responses with two examples. Results For the continuous variable we found that the variance-stabilizing transformation improves the asymptotic coverage probabilities of confidence intervals for the within-subject coefficient of variation. The maximum likelihood estimation and sample size estimation based on a pre-specified width of the confidence interval are novel contributions to the literature for the binary variable. Conclusion Using the sample size formulas, we hope to help clinical epidemiologists and practicing statisticians to efficiently design reliability studies using the within-subject coefficient of variation, whether the variable of interest is continuous or binary. PMID:16686943
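
    An illustrative computation under a one-way random-effects layout (simulated data; the ANOVA-type estimator below is a standard choice and stands in for the paper's maximum likelihood estimator):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    subjects, reps = 40, 3
    mu, sigma_b, sigma_w = 100.0, 10.0, 5.0            # grand mean, between, within
    true_means = mu + rng.normal(0, sigma_b, subjects)
    data = true_means[:, None] + rng.normal(0, sigma_w, (subjects, reps))

    ms_within = data.var(axis=1, ddof=1).mean()        # pooled within-subject variance
    wscv = np.sqrt(ms_within) / data.mean()            # within-subject CV estimate
    print(wscv, sigma_w / mu)                          # both ≈ 0.05
    ```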

  38. Spatial generalised linear mixed models based on distances.

    PubMed

    Melo, Oscar O; Mateu, Jorge; Melo, Carlos E

    2016-10-01

    Risk models derived from environmental data have been widely shown to be effective in delineating geographical areas of risk because they are intuitively easy to understand. We present a new method based on distances, which allows the modelling of continuous and non-continuous random variables through distance-based spatial generalised linear mixed models. The parameters are estimated using Markov chain Monte Carlo maximum likelihood, which is a feasible and useful technique. The proposed method depends on a detrending step built from continuous or categorical explanatory variables, or a mixture of them, by using an appropriate Euclidean distance. The method is illustrated through the analysis of the variation in the prevalence of Loa loa among a sample of village residents in Cameroon, where the explanatory variables included elevation, together with the maximum normalised-difference vegetation index and the standard deviation of the normalised-difference vegetation index calculated from repeated satellite scans over time. © The Author(s) 2013.

  39. Analysis on flood generation processes by means of a continuous simulation model

    NASA Astrophysics Data System (ADS)

    Fiorentino, M.; Gioia, A.; Iacobellis, V.; Manfreda, S.

    2006-03-01

    In the present research, we exploited continuous hydrological simulation to investigate the key variables responsible for flood peak formation. With this purpose, a distributed hydrological model (DREAM) is used in cascade with a rainfall generator (IRP, Iterated Random Pulse) to simulate a large number of extreme events, providing insight into the main controls of flood generation mechanisms. The investigated variables are those used in theoretically derived probability distributions of floods based on the concept of partial contributing area (e.g. Iacobellis and Fiorentino, 2000). The continuous simulation model is used to investigate the hydrological losses occurring during extreme events, the variability of the source area contributing to the flood peak, and its lag-time. Results suggest interesting simplifications for the theoretical probability distribution of floods according to the different climatic and geomorphologic environments. The study is applied to two basins located in Southern Italy with different climatic characteristics.

  40. Patient characteristics and variability in adherence and competence in cognitive-behavioral therapy for panic disorder.

    PubMed

    Boswell, James F; Gallagher, Matthew W; Sauer-Zavala, Shannon E; Bullis, Jacqueline; Gorman, Jack M; Shear, M Katherine; Woods, Scott; Barlow, David H

    2013-06-01

    Although associations with outcome have been inconsistent, therapist adherence and competence continue to garner attention, particularly within the context of increasing interest in the dissemination, implementation, and sustainability of evidence-based treatments. To date, research on therapist adherence and competence has focused on average levels across therapists. With a few exceptions, research has failed to address multiple sources of variability in adherence and competence, identify important factors that might account for variability, or take these sources of variability into account when examining associations with symptom change. The aims were to: (a) statistically demonstrate between- and within-therapist variability in adherence and competence ratings and examine patient characteristics as predictors of this variability; and (b) examine the relationship between adherence/competence and symptom change. Randomly selected audiotaped sessions from a randomized controlled trial of cognitive-behavioral therapy for panic disorder were rated for therapist adherence and competence. Patients completed a self-report measure of panic symptom severity prior to each session and the Inventory of Interpersonal Problems-Personality Disorder Scale prior to the start of treatment. Significant between- and within-therapist variability in adherence and competence was observed. Adherence and competence deteriorated significantly over the course of treatment. Higher patient interpersonal aggression was associated with decrements in both adherence and competence. Neither adherence nor competence predicted subsequent panic severity. Variability and "drift" in adherence and competence can be observed in controlled trials. Training and implementation efforts should involve continued consultation over multiple cases in order to account for relevant patient factors and promote sustainability across sessions and patients.

  41. High performance frame synchronization for continuous variable quantum key distribution systems.

    PubMed

    Lin, Dakai; Huang, Peng; Huang, Duan; Wang, Chao; Peng, Jinye; Zeng, Guihua

    2015-08-24

    Considering a practical continuous-variable quantum key distribution (CVQKD) system, synchronization is of significant importance, as it is hardly possible to extract secret keys from unsynchronized strings. In this paper, we propose a high-performance frame synchronization method for CVQKD systems which is capable of operating under low signal-to-noise ratios (SNRs) and is compatible with the random phase shift induced by the quantum channel. A practical implementation of this method with low complexity is presented and its performance is analysed. By adjusting the length of the synchronization frame, this method can work well with a large range of SNR values, which paves the way for longer-distance CVQKD.
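
    As a toy illustration of frame synchronization in general (not the proposed method, which additionally copes with the random phase shift): a publicly known synchronization frame can be located in a noisy stream by cross-correlation, with the robustness at low SNR coming from the frame length.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    sync = rng.choice([-1.0, 1.0], size=256)     # publicly known sync frame
    stream = rng.normal(0.0, 2.0, 5000)          # channel noise, SNR = -6 dB
    offset = 1234
    stream[offset:offset + sync.size] += sync    # embedded (phase-aligned) frame

    corr = np.correlate(stream, sync, mode="valid")
    print(int(np.argmax(corr)))                  # 1234: the offset is recovered
    ```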

  42. Universal quantum computation with temporal-mode bilayer square lattices

    NASA Astrophysics Data System (ADS)

    Alexander, Rafael N.; Yokoyama, Shota; Furusawa, Akira; Menicucci, Nicolas C.

    2018-03-01

    We propose an experimental design for universal continuous-variable quantum computation that incorporates recent innovations in linear-optics-based continuous-variable cluster state generation and cubic-phase gate teleportation. The first ingredient is a protocol for generating the bilayer-square-lattice cluster state (a universal resource state) with temporal modes of light. With this state, measurement-based implementation of Gaussian unitary gates requires only homodyne detection. Second, we describe a measurement device that implements an adaptive cubic-phase gate, up to a random phase-space displacement. It requires a two-step sequence of homodyne measurements and consumes a (non-Gaussian) cubic-phase state.

  3. Integration of quantum key distribution and private classical communication through continuous variable

    NASA Astrophysics Data System (ADS)

    Wang, Tianyi; Gong, Feng; Lu, Anjiang; Zhang, Damin; Zhang, Zhengping

    2017-12-01

    In this paper, we propose a scheme that integrates quantum key distribution and private classical communication via continuous variables. The integrated scheme employs both quadratures of a weak coherent state, with encrypted bits encoded on the signs and Gaussian random numbers encoded on the values of the quadratures. The integration enables quantum and classical data to share the same physical and logical channel. Simulation results based on practical system parameters demonstrate that both classical and quantum communication can be implemented over distances of tens of kilometers, thus providing a potential solution for the simultaneous transmission of quantum and classical data.

  4. Continuous Variable Quantum Key Distribution Using Polarized Coherent States

    NASA Astrophysics Data System (ADS)

    Vidiella-Barranco, A.; Borelli, L. F. M.

    We discuss a continuous-variable method of quantum key distribution employing strongly polarized coherent states of light. The key encoding is performed using the variables known as Stokes parameters, rather than the field quadratures. Their quantum counterparts, the Stokes operators Ŝi (i=1,2,3), constitute a set of non-commuting operators, with the precision of simultaneous measurements of a pair of them limited by an uncertainty-like relation. Alice transmits a conveniently modulated two-mode coherent state, and Bob randomly measures one of the Stokes parameters of the incoming beam. After performing reconciliation and privacy amplification procedures, it is possible to distill a secret common key. We also consider a non-ideal situation, in which coherent states with thermal noise, instead of pure coherent states, are used for encoding.

  5. Reconstruction of Sensory Stimuli Encoded with Integrate-and-Fire Neurons with Random Thresholds

    PubMed Central

    Lazar, Aurel A.; Pnevmatikakis, Eftychios A.

    2013-01-01

    We present a general approach to the reconstruction of sensory stimuli encoded with leaky integrate-and-fire neurons with random thresholds. The stimuli are modeled as elements of a Reproducing Kernel Hilbert Space. The reconstruction is based on finding a stimulus that minimizes a regularized quadratic optimality criterion. We discuss in detail the reconstruction of sensory stimuli modeled as absolutely continuous functions as well as stimuli with absolutely continuous first-order derivatives. Reconstruction results are presented for stimuli encoded with single as well as a population of neurons. Examples are given that demonstrate the performance of the reconstruction algorithms as a function of threshold variability. PMID:24077610

  6. Multiple imputation in the presence of non-normal data.

    PubMed

    Lee, Katherine J; Carlin, John B

    2017-02-20

    Multiple imputation (MI) is becoming increasingly popular for handling missing data. Standard approaches for MI assume normality for continuous variables (conditionally on the other variables in the imputation model). However, it is unclear how to impute non-normally distributed continuous variables. Using simulation and a case study, we compared various transformations applied prior to imputation, including a novel non-parametric transformation, to imputation on the raw scale and using predictive mean matching (PMM) when imputing non-normal data. We generated data from a range of non-normal distributions, and set 50% to missing completely at random or missing at random. We then imputed missing values on the raw scale, following a zero-skewness log, Box-Cox or non-parametric transformation and using PMM with both type 1 and 2 matching. We compared inferences regarding the marginal mean of the incomplete variable and the association with a fully observed outcome. We also compared results from these approaches in the analysis of depression and anxiety symptoms in parents of very preterm compared with term-born infants. The results provide novel empirical evidence that the decision regarding how to impute a non-normal variable should be based on the nature of the relationship between the variables of interest. If the relationship is linear in the untransformed scale, transformation can introduce bias irrespective of the transformation used. However, if the relationship is non-linear, it may be important to transform the variable to accurately capture this relationship. A useful alternative is to impute the variable using PMM with type 1 matching. Copyright © 2016 John Wiley & Sons, Ltd.
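
    For readers unfamiliar with PMM, the following minimal sketch (assumptions mine; real MI software also draws the regression coefficients from their posterior and repeats the imputation several times) shows type-1-style matching for a single skewed incomplete variable:

        import numpy as np

        rng = np.random.default_rng(1)

        def pmm_impute(x, y, k=5):
            """Impute NaNs in y by borrowing observed values from the k
            cases whose linear predictions are closest."""
            obs = ~np.isnan(y)
            X = np.column_stack([np.ones_like(x), x])
            beta, *_ = np.linalg.lstsq(X[obs], y[obs], rcond=None)
            yhat = X @ beta
            y_imp = y.copy()
            for i in np.flatnonzero(~obs):
                nearest = np.argsort(np.abs(yhat[obs] - yhat[i]))[:k]
                donors = np.flatnonzero(obs)[nearest]
                y_imp[i] = y[rng.choice(donors)]   # borrow an observed value
            return y_imp

        x = rng.normal(size=200)
        y = np.exp(0.5 * x + rng.normal(scale=0.5, size=200))  # skewed outcome
        y[rng.random(200) < 0.5] = np.nan                      # 50% missing
        print(float(np.mean(pmm_impute(x, y))))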

  7. A generator for unique quantum random numbers based on vacuum states

    NASA Astrophysics Data System (ADS)

    Gabriel, Christian; Wittmann, Christoffer; Sych, Denis; Dong, Ruifang; Mauerer, Wolfgang; Andersen, Ulrik L.; Marquardt, Christoph; Leuchs, Gerd

    2010-10-01

    Random numbers are a valuable component in diverse applications that range from simulation and gambling to cryptography. The quest for true randomness in these applications has engendered a large variety of proposals for producing random numbers based on the foundational unpredictability of quantum mechanics. However, most approaches do not consider that a potential adversary could have knowledge about the generated numbers, so the numbers are not verifiably random and unique. Here we present a simple experimental setup based on homodyne measurements that uses the purity of a continuous-variable quantum vacuum state to generate unique random numbers. We use the intrinsic randomness in measuring the quadratures of a mode in the lowest energy vacuum state, which cannot be correlated to any other state. The simplicity of our source and its verifiably unique randomness are important attributes for achieving high-reliability, high-speed and low-cost quantum random number generators.
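
    The core mechanism lends itself to a two-line demonstration: vacuum-state homodyne outcomes are Gaussian-distributed, so (in this simulated stand-in for the real measurement data) unbiased bits follow from the sign of each sample:

        import numpy as np

        rng = np.random.default_rng(2)
        quadratures = rng.normal(size=10_000)   # stand-in for homodyne outcomes
        bits = (quadratures > 0).astype(int)    # symmetric about 0, hence unbiased
        print(bits[:16], float(bits.mean()))    # mean close to 0.5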

  8. Two-time scale subordination in physical processes with long-term memory

    NASA Astrophysics Data System (ADS)

    Stanislavsky, Aleksander; Weron, Karina

    2008-03-01

    We describe dynamical processes in continuous media with a long-term memory. Our consideration is based on a stochastic subordination idea and concerns two physical examples in detail. First we study a temporal evolution of the species concentration in a trapping reaction in which a diffusing reactant is surrounded by a sea of randomly moving traps. The analysis uses the random-variable formalism of anomalous diffusive processes. We find that the empirical trapping-reaction law, according to which the reactant concentration decreases in time as a product of an exponential and a stretched exponential function, can be explained by a two-time scale subordination of random processes. Another example is connected with a state equation for continuous media with memory. If the pressure and the density of a medium are subordinated in two different random processes, then the ordinary state equation becomes fractional with two-time scales. This allows one to arrive at the Bagley-Torvik type of state equation.

  9. Regression Discontinuity for Causal Effect Estimation in Epidemiology.

    PubMed

    Oldenburg, Catherine E; Moscoe, Ellen; Bärnighausen, Till

    Regression discontinuity analyses can generate estimates of the causal effects of an exposure when a continuously measured variable is used to assign the exposure to individuals based on a threshold rule. Individuals just above the threshold are expected to be similar in their distribution of measured and unmeasured baseline covariates to individuals just below the threshold, resulting in exchangeability. At the threshold exchangeability is guaranteed if there is random variation in the continuous assignment variable, e.g., due to random measurement error. Under exchangeability, causal effects can be identified at the threshold. The regression discontinuity intention-to-treat (RD-ITT) effect on an outcome can be estimated as the difference in the outcome between individuals just above (or below) versus just below (or above) the threshold. This effect is analogous to the ITT effect in a randomized controlled trial. Instrumental variable methods can be used to estimate the effect of exposure itself utilizing the threshold as the instrument. We review the recent epidemiologic literature reporting regression discontinuity studies and find that while regression discontinuity designs are beginning to be utilized in a variety of applications in epidemiology, they are still relatively rare, and analytic and reporting practices vary. Regression discontinuity has the potential to greatly contribute to the evidence base in epidemiology, in particular on the real-life and long-term effects and side-effects of medical treatments that are provided based on threshold rules - such as treatments for low birth weight, hypertension or diabetes.
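
    A minimal numerical illustration of the RD-ITT estimate described above (simulated data and bandwidth are assumptions, not a reviewed study): fit a local linear regression on each side of the threshold and take the difference of intercepts.

        import numpy as np

        rng = np.random.default_rng(3)
        z = rng.uniform(-1, 1, 5000)            # continuous assignment variable
        treated = z >= 0.0                      # threshold rule at zero
        y = 1.0 + 0.8 * z + 0.5 * treated + rng.normal(scale=0.3, size=z.size)

        def limit_at_threshold(zs, ys):
            X = np.column_stack([np.ones_like(zs), zs])
            beta, *_ = np.linalg.lstsq(X, ys, rcond=None)
            return beta[0]                      # intercept = limit at z = 0

        h = 0.2                                 # bandwidth around the threshold
        lo, hi = (z > -h) & (z < 0), (z >= 0) & (z < h)
        rd_itt = limit_at_threshold(z[hi], y[hi]) - limit_at_threshold(z[lo], y[lo])
        print(round(rd_itt, 3))                 # close to the true jump of 0.5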

  10. Quantum random bit generation using energy fluctuations in stimulated Raman scattering.

    PubMed

    Bustard, Philip J; England, Duncan G; Nunn, Josh; Moffatt, Doug; Spanner, Michael; Lausten, Rune; Sussman, Benjamin J

    2013-12-02

    Random number sequences are a critical resource in modern information processing systems, with applications in cryptography, numerical simulation, and data sampling. We introduce a quantum random number generator based on the measurement of pulse energy quantum fluctuations in Stokes light generated by spontaneously-initiated stimulated Raman scattering. Bright Stokes pulse energy fluctuations up to five times the mean energy are measured with fast photodiodes and converted to unbiased random binary strings. Since the pulse energy is a continuous variable, multiple bits can be extracted from a single measurement. Our approach can be generalized to a wide range of Raman active materials; here we demonstrate a prototype using the optical phonon line in bulk diamond.

  11. Distillation of squeezing from non-Gaussian quantum states.

    PubMed

    Heersink, J; Marquardt, Ch; Dong, R; Filip, R; Lorenz, S; Leuchs, G; Andersen, U L

    2006-06-30

    We show that single copy distillation of squeezing from continuous variable non-Gaussian states is possible using linear optics and conditional homodyne detection. A specific non-Gaussian noise source, corresponding to a random linear displacement, is investigated experimentally. Conditioning the signal on a tap measurement, we observe probabilistic recovery of squeezing.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ide, Toshiki; Hofmann, Holger F.; JST-CREST, Graduate School of Advanced Sciences of Matter, Hiroshima University, Kagamiyama 1-3-1, Higashi Hiroshima 739-8530

    The information encoded in the polarization of a single photon can be transferred to a remote location by two-channel continuous-variable quantum teleportation. However, the finite entanglement used in the teleportation causes random changes in photon number. If more than one photon appears in the output, the continuous-variable teleportation accidentally produces clones of the original input photon. In this paper, we derive the polarization statistics of the N-photon output components and show that they can be decomposed into an optimal cloning term and completely unpolarized noise. We find that the accidental cloning of the input photon is nearly optimal at experimentally feasible squeezing levels, indicating that the loss of polarization information is partially compensated by the availability of clones.

  13. Comparison of a step-down dose of once-daily ciclesonide with a continued dose of twice-daily fluticasone propionate in maintaining control of asthma.

    PubMed

    Knox, A; Langan, J; Martinot, J-B; Gruss, C; Häfner, D

    2007-10-01

    To compare a step-down approach in well-controlled asthma patients, as recommended by treatment guidelines, from fluticasone propionate 250 microg twice daily (FP250 BID), or equivalent, to ciclesonide 160 microg once daily (CIC160 OD) with continued FP250 BID treatment. Patients with well-controlled asthma prior to study entry were included in two identical, randomized, double-blind, double-dummy, parallel-group studies. After a 2-week run-in period with FP250 BID, patients were randomized to CIC160 OD (n = 58) or FP250 BID (n = 53) for 12 weeks. Primary endpoints were percentage of days with asthma control, asthma symptom-free days, rescue medication-free days and nocturnal awakening-free days. Secondary endpoints included lung function variables, asthma symptom scores, rescue medication use and asthma exacerbations. Safety variables were also recorded. Patients had ≥ 97% of days with asthma control, 98% asthma symptom-free days and 100% of days free from rescue medication use and nocturnal awakenings in both treatment groups (median values). There were no significant between-treatment differences for any of the primary or secondary efficacy variables. Overall, 42 treatment-emergent adverse events (TEAEs) were reported in the CIC160 OD group and 49 TEAEs were reported in the FP250 BID group. There were no clinically relevant changes from baseline in the safety variables in either treatment group. Patients well controlled on FP250 BID, or equivalent, who were stepped down to CIC160 OD, maintained similar asthma control compared with patients who received continued treatment standardized to FP250 BID.

  14. The comparison of landslide ratio-based and general logistic regression landslide susceptibility models in the Chishan watershed after 2009 Typhoon Morakot

    NASA Astrophysics Data System (ADS)

    WU, Chunhung

    2015-04-01

    The research built the original logistic regression landslide susceptibility model (abbreviated as or-LRLSM) and the landslide ratio-based logistic regression landslide susceptibility model (abbreviated as lr-LRLSM), compared the performance of the two models, and explained their error sources. The research assumes that the performance of the logistic regression model can be better if the distribution of the weighted value of each variable is similar to the distribution of landslide ratio. Landslide ratio is the ratio of landslide area to total area in a specific area and a useful index for evaluating the seriousness of landslide disasters in Taiwan. The research adopted the landslide inventory induced by 2009 Typhoon Morakot in the Chishan watershed, the most serious disaster event of the last decade in Taiwan. The research adopted the 20 m grid as the basic unit in building the LRLSM, and six variables, including elevation, slope, aspect, geological formation, accumulated rainfall, and bank erosion, were included in the two models. In building the or-LRLSM, the six variables were divided into continuous variables (elevation, slope, and accumulated rainfall) and categorical variables (aspect, geological formation, and bank erosion), while in building the lr-LRLSM all variables, classified based on landslide ratio, were treated as categorical. Because the number of basic units in the Chishan watershed was too large to process with commercial software, the research used random sampling instead of the whole set of basic units, with equal proportions of landslide and non-landslide units in the logistic regression analysis. The research performed random sampling 10 times and selected the group with the best Cox & Snell R2 and Nagelkerke R2 values as the database for the following analysis. Based on the best result from the 10 random sampling groups, the or-LRLSM (lr-LRLSM) is significant at the 1% level with Cox & Snell R2 = 0.190 (0.196) and Nagelkerke R2 = 0.253 (0.260). Units with a landslide susceptibility value > 0.5 (≤ 0.5) are classified as predicted landslide units (non-landslide units). The AUC, i.e. the area under the relative operating characteristic curve, of the or-LRLSM in the Chishan watershed is 0.72, while that of the lr-LRLSM is 0.77. Furthermore, the average correct ratio of the lr-LRLSM (73.3%) is better than that of the or-LRLSM (68.3%). The research analyzed in detail the error sources of the two models. For continuous variables, the landslide ratio-based classification used in building the lr-LRLSM makes the distribution of weighted values more similar to the distribution of landslide ratio across the range of each variable than in the or-LRLSM. For categorical variables, the landslide ratio-based classification groups together categories with similar landslide ratios. The mean correct ratio for continuous variables (categorical variables) under the lr-LRLSM is better than that under the or-LRLSM by 0.6~2.6% (1.7%~6.0%). Building a landslide susceptibility model with landslide ratio-based classification is practical and performs better than the original logistic regression.

  15. A stochastic-geometric model of soil variation in Pleistocene patterned ground

    NASA Astrophysics Data System (ADS)

    Lark, Murray; Meerschman, Eef; Van Meirvenne, Marc

    2013-04-01

    In this paper we examine the spatial variability of soil in parent material with complex spatial structure which arises from complex non-linear geomorphic processes. We show that this variability can be better modelled by a stochastic-geometric model than by a standard Gaussian random field. The benefits of the new model are seen in the reproduction of features of the target variable which influence processes like water movement and pollutant dispersal. Complex non-linear processes in the soil give rise to properties with non-Gaussian distributions. Even under a transformation to approximate marginal normality, such variables may have a more complex spatial structure than the Gaussian random field model of geostatistics can accommodate. In particular the extent to which extreme values of the variable are connected in spatially coherent regions may be misrepresented. As a result, for example, geostatistical simulation generally fails to reproduce the pathways for preferential flow in an environment where coarse infill of former fluvial channels or coarse alluvium of braided streams creates routes for rapid movement of water. Multiple point geostatistics has been developed to deal with this problem. Multiple point methods proceed by sampling from a set of training images which can be assumed to reproduce the non-Gaussian behaviour of the target variable. The challenge is to identify appropriate sources of such images. In this paper we consider a mode of soil variation in which the soil varies continuously, exhibiting short-range lateral trends induced by local effects of the factors of soil formation which vary across the region of interest in an unpredictable way. The trends in soil variation are therefore only apparent locally, and the soil variation at regional scale appears random. We propose a stochastic-geometric model for this mode of soil variation called the Continuous Local Trend (CLT) model. We consider a case study of soil formed in relict patterned ground with pronounced lateral textural variations arising from the presence of infilled ice-wedges of Pleistocene origin. We show how knowledge of the pedogenetic processes in this environment, along with some simple descriptive statistics, can be used to select and fit a CLT model for the apparent electrical conductivity (ECa) of the soil. We use the model to simulate realizations of the CLT process, and compare these with realizations of a fitted Gaussian random field. We show how statistics that summarize the spatial coherence of regions with small values of ECa, which are expected to have coarse texture and so larger saturated hydraulic conductivity, are better reproduced by the CLT model than by the Gaussian random field. This suggests that the CLT model could be used to generate an unlimited supply of training images to allow multiple point geostatistical simulation or prediction of this or similar variables.

  16. Instrumental variables estimation of exposure effects on a time-to-event endpoint using structural cumulative survival models.

    PubMed

    Martinussen, Torben; Vansteelandt, Stijn; Tchetgen Tchetgen, Eric J; Zucker, David M

    2017-12-01

    The use of instrumental variables for estimating the effect of an exposure on an outcome is popular in econometrics, and increasingly so in epidemiology. This increasing popularity may be attributed to the natural occurrence of instrumental variables in observational studies that incorporate elements of randomization, either by design or by nature (e.g., random inheritance of genes). Instrumental variables estimation of exposure effects is well established for continuous outcomes and to some extent for binary outcomes. It is, however, largely lacking for time-to-event outcomes because of complications due to censoring and survivorship bias. In this article, we make a novel proposal under a class of structural cumulative survival models which parameterize time-varying effects of a point exposure directly on the scale of the survival function; these models are essentially equivalent to a semi-parametric variant of the instrumental variables additive hazards model. We propose a class of recursive instrumental variable estimators for these exposure effects, and derive their large sample properties along with inferential tools. We examine the performance of the proposed method in simulation studies and illustrate it in a Mendelian randomization study to evaluate the effect of diabetes on mortality using data from the Health and Retirement Study. We further use the proposed method to investigate potential benefit from breast cancer screening on subsequent breast cancer mortality based on the HIP study. © 2017, The International Biometric Society.

  17. Mind wandering at the fingertips: automatic parsing of subjective states based on response time variability

    PubMed Central

    Bastian, Mikaël; Sackur, Jérôme

    2013-01-01

    Research from the last decade has successfully used two kinds of thought reports to assess whether the mind is wandering: random thought-probes and spontaneous reports. However, neither of these methods allows any assessment of the subjective state of the participant between two reports. In this paper, we present a step-by-step elaboration and testing of a continuous index based on response time variability within Sustained Attention to Response Tasks (N = 106, for a total of 10 conditions). We first show that increased response time variability predicts mind wandering. We then compute a continuous index of response time variability throughout full experiments and show that the temporal position of a probe relative to the nearest local peak of the continuous index is predictive of mind wandering. This suggests that our index carries information about the subjective state of the subject even when he or she is not probed, and opens the way for on-line tracking of mind wandering. Finally, we go a step further and infer the internal attentional states on the basis of the variability of response times. To this end we use the Hidden Markov Model framework, which allows us to estimate the durations of on-task and off-task episodes. PMID:24046753
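
    A bare-bones version of such an index (details assumed; the paper's construction and its Hidden Markov Model layer are more elaborate) is the rolling standard deviation of response times:

        import numpy as np

        def rt_variability_index(rts, window=9):
            """Rolling SD of response times, one value per trial."""
            half = window // 2
            padded = np.pad(rts, half, mode="edge")
            return np.array([padded[i:i + window].std()
                             for i in range(len(rts))])

        rng = np.random.default_rng(4)
        on_task = rng.normal(0.45, 0.04, 80)    # stable RTs (seconds)
        off_task = rng.normal(0.45, 0.12, 40)   # more variable RTs
        index = rt_variability_index(np.concatenate([on_task, off_task]))
        print(index[:3].round(3), index[-3:].round(3))  # variability rises late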

  18. The living Drake equation of the Tau Zero Foundation

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2011-03-01

    The living Drake equation is our statistical generalization of the Drake equation such that it can take into account any number of factors. This new result opens up the possibility of enriching the equation with new factors as scientific knowledge increases. The adjective "Living" refers to this continuous enrichment of the Drake equation, which is the goal of a new research project that the Tau Zero Foundation has entrusted to this author, the discoverer of the statistical Drake equation described hereafter. From a simple product of seven positive numbers, the Drake equation is now turned into the product of seven positive random variables. We call this "the Statistical Drake Equation". The mathematical consequences of this transformation are then derived. The proof of our results is based on the Central Limit Theorem (CLT) of statistics. In loose terms, the CLT states that the sum of any number of independent random variables, each of which may be arbitrarily distributed, approaches a Gaussian (i.e. normal) random variable. This is called the Lyapunov form of the CLT, or the Lindeberg form of the CLT, depending on the mathematical constraints assumed on the third moments of the various probability distributions. In conclusion, we show that the new random variable N, yielding the number of communicating civilizations in the Galaxy, follows the lognormal distribution. The mean value, standard deviation, mode, median and all the moments of this lognormal N can then be derived from the means and standard deviations of the seven input random variables. In fact, the seven factors in the ordinary Drake equation now become seven independent positive random variables. The probability distribution of each random variable may be arbitrary; the CLT in the so-called Lyapunov or Lindeberg forms (neither of which assumes the factors to be identically distributed) allows for that. In other words, the CLT "translates" into our statistical Drake equation by allowing an arbitrary probability distribution for each factor. This is both physically realistic and practically very useful. An application of our statistical Drake equation then follows. The (average) distance between any two neighbouring and communicating civilizations in the Galaxy may be shown to be inversely proportional to the cubic root of N. This distance thus becomes a new random variable. We derive the relevant probability density function, apparently previously unknown (dubbed the "Maccone distribution" by Paul Davies). Finally, any positive number of random variables in the statistical Drake equation is compatible with the CLT, so our generalization allows many more factors to be added in the future as more refined scientific knowledge about each factor becomes available. This capability to make room for future factors we call the "Data Enrichment Principle", and we regard it as the key to more profound, future results in Astrobiology and SETI.
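
    The lognormal claim is easy to check numerically; the sketch below (distributions chosen arbitrarily, as the CLT permits) multiplies seven independent positive random variables and confirms that log N is nearly symmetric, as an approximately normal variable should be.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 100_000
        factors = [rng.uniform(0.1, 10.0, n), rng.gamma(2.0, 1.0, n),
                   rng.lognormal(0.0, 0.5, n), rng.uniform(0.5, 2.0, n),
                   rng.gamma(3.0, 0.5, n), rng.uniform(0.01, 1.0, n),
                   rng.lognormal(1.0, 0.3, n)]
        logN = np.log(np.prod(factors, axis=0))   # log of the product = sum of logs
        skew = float(np.mean((logN - logN.mean()) ** 3) / logN.std() ** 3)
        print(round(logN.mean(), 3), round(logN.std(), 3), round(skew, 3))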

  19. The Central Limit Theorem for Supercritical Oriented Percolation in Two Dimensions

    NASA Astrophysics Data System (ADS)

    Tzioufas, Achillefs

    2018-04-01

    We consider the cardinality of supercritical oriented bond percolation in two dimensions. We show that, whenever the origin is conditioned to percolate, the process appropriately normalized converges asymptotically in distribution to the standard normal law. This resolves a longstanding open problem pointed out in several instances in the literature. The result applies also to the continuous-time analog of the process, viz. the basic one-dimensional contact process. We also derive general random-indices central limit theorems for associated random variables as byproducts of our proof.

  1. Investigation of the performance characteristics of Doppler radar technique for aircraft collision hazard warning, phase 3

    NASA Technical Reports Server (NTRS)

    1972-01-01

    System studies, equipment simulation, hardware development and flight tests which were conducted during the development of aircraft collision hazard warning system are discussed. The system uses a cooperative, continuous wave Doppler radar principle with pseudo-random frequency modulation. The report presents a description of the system operation and deals at length with the use of pseudo-random coding techniques. In addition, the use of mathematical modeling and computer simulation to determine the alarm statistics and system saturation characteristics in terminal area traffic of variable density is discussed.

  2. Three-part joint modeling methods for complex functional data mixed with zero-and-one-inflated proportions and zero-inflated continuous outcomes with skewness.

    PubMed

    Li, Haocheng; Staudenmayer, John; Wang, Tianying; Keadle, Sarah Kozey; Carroll, Raymond J

    2018-02-20

    We take a functional data approach to longitudinal studies with complex bivariate outcomes. This work is motivated by data from a physical activity study that measured 2 responses over time in 5-minute intervals. One response is the proportion of time active in each interval, a continuous proportion with excess zeros and ones. The other response, energy expenditure rate in the interval, is a continuous variable with excess zeros and skewness. This outcome is complex because there are 3 possible activity patterns in each interval (inactive, partially active, and completely active), and those patterns, which are observed, induce both nonrandom and random associations between the responses. More specifically, the inactive pattern requires a zero value in both the proportion for active behavior and the energy expenditure rate; a partially active pattern means that the proportion of activity is strictly between zero and one and that the energy expenditure rate is greater than zero and likely to be moderate; and the completely active pattern means that the proportion of activity is exactly one and the energy expenditure rate is greater than zero and likely to be higher. To address these challenges, we propose a 3-part functional data joint modeling approach. The first part is a continuation-ratio model to reorder the 3 ordinal-valued activity patterns. The second part models the proportions when they are in the interval (0,1). The last component specifies the skewed continuous energy expenditure rate with Box-Cox transformations when they are greater than zero. In this 3-part model, the regression structures are specified as smooth curves measured at various time points with random effects that have a correlation structure. The smoothed random curves for each variable are summarized using a few important principal components, and the association of the 3 longitudinal components is modeled through the association of the principal component scores. The difficulties in handling the ordinal and proportional variables are addressed using a quasi-likelihood type approximation. We develop an efficient algorithm to fit the model that also involves the selection of the number of principal components. The method is applied to physical activity data and is evaluated empirically by a simulation study. Copyright © 2017 John Wiley & Sons, Ltd.

  3. Recourse-based facility-location problems in hybrid uncertain environment.

    PubMed

    Wang, Shuming; Watada, Junzo; Pedrycz, Witold

    2010-08-01

    The objective of this paper is to study facility-location problems in the presence of a hybrid uncertain environment involving both randomness and fuzziness. A two-stage fuzzy-random facility-location model with recourse (FR-FLMR) is developed in which both the demands and costs are assumed to be fuzzy-random variables. The bounds of the optimal objective value of the two-stage FR-FLMR are derived. As, in general, the fuzzy-random parameters of the FR-FLMR can be regarded as continuous fuzzy-random variables with an infinite number of realizations, the computation of the recourse requires solving infinite second-stage programming problems. Owing to this requirement, the recourse function cannot be determined analytically, and, hence, the model cannot benefit from the use of techniques of classical mathematical programming. In order to solve the location problems of this nature, we first develop a technique of fuzzy-random simulation to compute the recourse function. The convergence of such simulation scenarios is discussed. In the sequel, we propose a hybrid mutation-based binary ant-colony optimization (MBACO) approach to the two-stage FR-FLMR, which comprises the fuzzy-random simulation and the simplex algorithm. A numerical experiment illustrates the application of the hybrid MBACO algorithm. The comparison shows that the hybrid MBACO finds better solutions than the one using other discrete metaheuristic algorithms, such as binary particle-swarm optimization, genetic algorithm, and tabu search.

  4. Stochastic reduced order models for inverse problems under uncertainty

    PubMed Central

    Warner, James E.; Aquino, Wilkins; Grigoriu, Mircea D.

    2014-01-01

    This work presents a novel methodology for solving inverse problems under uncertainty using stochastic reduced order models (SROMs). Given statistical information about an observed state variable in a system, unknown parameters are estimated probabilistically through the solution of a model-constrained, stochastic optimization problem. The point of departure and crux of the proposed framework is the representation of a random quantity using a SROM - a low dimensional, discrete approximation to a continuous random element that permits efficient and non-intrusive stochastic computations. Characterizing the uncertainties with SROMs transforms the stochastic optimization problem into a deterministic one. The non-intrusive nature of SROMs facilitates efficient gradient computations for random vector unknowns and relies entirely on calls to existing deterministic solvers. Furthermore, the method is naturally extended to handle multiple sources of uncertainty in cases where state variable data, system parameters, and boundary conditions are all considered random. The new and widely-applicable SROM framework is formulated for a general stochastic optimization problem in terms of an abstract objective function and constraining model. For demonstration purposes, however, we study its performance in the specific case of inverse identification of random material parameters in elastodynamics. We demonstrate the ability to efficiently recover random shear moduli given material displacement statistics as input data. We also show that the approach remains effective for the case where the loading in the problem is random as well. PMID:25558115

  5. Effect of study design on the reported effect of cardiac resynchronization therapy (CRT) on quantitative physiological measures: stratified meta-analysis in narrow-QRS heart failure and implications for planning future studies.

    PubMed

    Jabbour, Richard J; Shun-Shin, Matthew J; Finegold, Judith A; Afzal Sohaib, S M; Cook, Christopher; Nijjer, Sukhjinder S; Whinnett, Zachary I; Manisty, Charlotte H; Brugada, Josep; Francis, Darrel P

    2015-01-06

    Biventricular pacing (CRT) shows clear benefits in heart failure with wide QRS, but results in narrow QRS have appeared conflicting. We tested the hypothesis that study design might have influenced findings. We identified all reports of CRT-P/D therapy in subjects with narrow QRS reporting effects on continuous physiological variables. Twelve studies (2074 patients) met these criteria. Studies were stratified by presence of bias-resistance steps: the presence of a randomized control arm over a single arm, and blinded outcome measurement. Change in each endpoint was quantified using a standardized effect size (Cohen's d). We conducted separate meta-analyses for each variable in turn, stratified by trial quality. In non-randomized, non-blinded studies, the majority of variables (10 of 12, 83%) showed significant improvement, ranging from a standardized mean effect size of +1.57 (95%CI +0.43 to +2.7) for ejection fraction to +2.87 (+1.78 to +3.95) for NYHA class. In the randomized, non-blinded study, only 3 out of 6 variables (50%) showed improvement. For the randomized blinded studies, 0 out of 9 variables (0%) showed benefit, ranging from -0.04 (-0.31 to +0.22) for ejection fraction to -0.1 (-0.73 to +0.53) for 6-minute walk test. Differences in degrees of resistance to bias, rather than choice of endpoint, explain the variation between studies of CRT in narrow-QRS heart failure addressing physiological variables. When bias-resistance features are implemented, it becomes clear that these patients do not improve in any tested physiological variable. Guidance from studies without careful planning to resist bias may be far less useful than commonly perceived. © 2015 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
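
    For concreteness, a standardized mean effect size of the kind pooled above can be computed as Cohen's d with a pooled standard deviation (illustrative data, not the study's):

        import numpy as np

        def cohens_d(x, y):
            nx, ny = len(x), len(y)
            pooled = ((nx - 1) * np.var(x, ddof=1) +
                      (ny - 1) * np.var(y, ddof=1)) / (nx + ny - 2)
            return (np.mean(x) - np.mean(y)) / np.sqrt(pooled)

        rng = np.random.default_rng(6)
        control = rng.normal(30.0, 8.0, 40)     # e.g. baseline endpoint values
        treated = rng.normal(34.0, 8.0, 40)     # e.g. post-treatment values
        print(round(cohens_d(treated, control), 2))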

  6. Recursive Branching Simulated Annealing Algorithm

    NASA Technical Reports Server (NTRS)

    Bolcar, Matthew; Smith, J. Scott; Aronstein, David

    2012-01-01

    This innovation is a variation of a simulated-annealing optimization algorithm that uses a recursive-branching structure to parallelize the search of a parameter space for the globally optimal solution to an objective. The algorithm has been demonstrated to be more effective at searching a parameter space than traditional simulated-annealing methods for a particular problem of interest, and it can readily be applied to a wide variety of optimization problems, including those with a parameter space having both discrete-value parameters (combinatorial) and continuous-variable parameters. It can take the place of a conventional simulated-annealing, Monte-Carlo, or random-walk algorithm. In a conventional simulated-annealing (SA) algorithm, a starting configuration is randomly selected within the parameter space. The algorithm randomly selects another configuration from the parameter space and evaluates the objective function for that configuration. If the objective function value is better than the previous value, the new configuration is adopted as the new point of interest in the parameter space. If the objective function value is worse than the previous value, the new configuration may be adopted, with a probability determined by a temperature parameter, used in analogy to annealing in metals. As the optimization continues, the region of the parameter space from which new configurations can be selected shrinks, and in conjunction with lowering the annealing temperature (and thus lowering the probability for adopting configurations in parameter space with worse objective functions), the algorithm can converge on the globally optimal configuration. The Recursive Branching Simulated Annealing (RBSA) algorithm shares some features with the SA algorithm, notably including the basic principles that a starting configuration is randomly selected from within the parameter space, the algorithm tests other configurations with the goal of finding the globally optimal solution, and the region from which new configurations can be selected shrinks as the search continues. The key difference between these algorithms is that in the SA algorithm, a single path, or trajectory, is taken in parameter space, from the starting point to the globally optimal solution, while in the RBSA algorithm, many trajectories are taken; by exploring multiple regions of the parameter space simultaneously, the algorithm has been shown to converge on the globally optimal solution about an order of magnitude faster than when using conventional algorithms. Novel features of the RBSA algorithm include: 1. More efficient searching of the parameter space due to the branching structure, in which multiple random configurations are generated and multiple promising regions of the parameter space are explored; 2. The implementation of a trust region for each parameter in the parameter space, which provides a natural way of enforcing upper- and lower-bound constraints on the parameters; and 3. The optional use of a constrained gradient-search optimization, performed on the continuous variables around each branch's configuration in parameter space to improve search efficiency by allowing for fast fine-tuning of the continuous variables within the trust region at that configuration point.
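
    A toy rendering of the branching idea (my own simplification, not the NASA code): each promising configuration seeds several random candidates inside a trust region, and both the region and the annealing temperature shrink with depth.

        import numpy as np

        rng = np.random.default_rng(7)

        def objective(p):
            return float(np.sum((p - 1.7) ** 2))   # any cost function

        def branching_sa(center, radius, temp, depth, branches=4):
            best_p, best_f = center, objective(center)
            if depth == 0:
                return best_p, best_f
            for _ in range(branches):
                cand = center + rng.uniform(-radius, radius, size=center.shape)
                f = objective(cand)
                # Metropolis-style rule lets worse candidates seed branches:
                if f < best_f or rng.random() < np.exp((best_f - f) / temp):
                    p, fv = branching_sa(cand, radius * 0.5, temp * 0.5, depth - 1)
                    if fv < best_f:
                        best_p, best_f = p, fv
            return best_p, best_f

        p, f = branching_sa(np.zeros(3), radius=4.0, temp=1.0, depth=6)
        print(np.round(p, 2), round(f, 4))   # approaches the optimum (1.7, 1.7, 1.7)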

  7. Diffusion Processes Satisfying a Conservation Law Constraint

    DOE PAGES

    Bakosi, J.; Ristorcelli, J. R.

    2014-03-04

    We investigate coupled stochastic differential equations governing N non-negative continuous random variables that satisfy a conservation principle. In various fields a conservation law requires that a set of fluctuating variables be non-negative and (if appropriately normalized) sum to one. As a result, any stochastic differential equation model to be realizable must not produce events outside of the allowed sample space. We develop a set of constraints on the drift and diffusion terms of such stochastic models to ensure that both the non-negativity and the unit-sum conservation law constraint are satisfied as the variables evolve in time. We investigate the consequences of the developed constraints on the Fokker-Planck equation, the associated system of stochastic differential equations, and the evolution equations of the first four moments of the probability density function. We show that random variables, satisfying a conservation law constraint, represented by stochastic diffusion processes, must have diffusion terms that are coupled and nonlinear. The set of constraints developed enables the development of statistical representations of fluctuating variables satisfying a conservation law. We exemplify the results with the bivariate beta process and the multivariate Wright-Fisher, Dirichlet, and Lochner’s generalized Dirichlet processes.
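
    One of the named examples can be simulated in a few lines: for the neutral Wright-Fisher diffusion with two components, x and 1 - x, the diffusion coefficient sqrt(x(1-x)) is exactly the kind of coupled, nonlinear term the constraints require, and the pair remains non-negative with unit sum. The Euler-Maruyama discretization and the boundary guard below are my own choices, not the paper's.

        import numpy as np

        rng = np.random.default_rng(8)
        dt, steps = 1e-4, 20_000
        x = 0.5                                  # initial fraction
        for _ in range(steps):
            x += np.sqrt(max(x * (1.0 - x), 0.0) * dt) * rng.standard_normal()
            x = min(max(x, 0.0), 1.0)            # numerical guard at the boundary
        print(x, 1.0 - x)                        # components still sum to one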

  8. Diffusion Processes Satisfying a Conservation Law Constraint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bakosi, J.; Ristorcelli, J. R.

    We investigate coupled stochastic differential equations governing N non-negative continuous random variables that satisfy a conservation principle. In various fields a conservation law requires that a set of fluctuating variables be non-negative and (if appropriately normalized) sum to one. As a result, any stochastic differential equation model to be realizable must not produce events outside of the allowed sample space. We develop a set of constraints on the drift and diffusion terms of such stochastic models to ensure that both the non-negativity and the unit-sum conservation law constraint are satisfied as the variables evolve in time. We investigate the consequences of the developed constraints on the Fokker-Planck equation, the associated system of stochastic differential equations, and the evolution equations of the first four moments of the probability density function. We show that random variables, satisfying a conservation law constraint, represented by stochastic diffusion processes, must have diffusion terms that are coupled and nonlinear. The set of constraints developed enables the development of statistical representations of fluctuating variables satisfying a conservation law. We exemplify the results with the bivariate beta process and the multivariate Wright-Fisher, Dirichlet, and Lochner’s generalized Dirichlet processes.

  9. Estimating the Impact of the PROMISE Scholarship Using Propensity Score Weighted Frontier Fuzzy Regression Discontinuity Design

    ERIC Educational Resources Information Center

    Shobo, Yetty; Wong, Jen D.; Bell, Angie

    2014-01-01

    Regression discontinuity (RD), an "as good as randomized" research design, has become increasingly prominent in education research in recent years; the design gets eligible quasi-experimental designs as close as possible to experimental designs by using a stated threshold on a continuous baseline variable to assign individuals to a…

  10. Inverse Ising problem in continuous time: A latent variable approach

    NASA Astrophysics Data System (ADS)

    Donner, Christian; Opper, Manfred

    2017-12-01

    We consider the inverse Ising problem: the inference of network couplings from observed spin trajectories for a model with continuous time Glauber dynamics. By introducing two sets of auxiliary latent random variables we render the likelihood into a form which allows for simple iterative inference algorithms with analytical updates. The variables are (1) Poisson variables to linearize an exponential term which is typical for point process likelihoods and (2) Pólya-Gamma variables, which make the likelihood quadratic in the coupling parameters. Using the augmented likelihood, we derive an expectation-maximization (EM) algorithm to obtain the maximum likelihood estimate of network parameters. Using a third set of latent variables we extend the EM algorithm to sparse couplings via L1 regularization. Finally, we develop an efficient approximate Bayesian inference algorithm using a variational approach. We demonstrate the performance of our algorithms on data simulated from an Ising model. For data which are simulated from a more biologically plausible network with spiking neurons, we show that the Ising model captures well the low order statistics of the data and how the Ising couplings are related to the underlying synaptic structure of the simulated network.

  11. Design and Methods of a Randomized Trial of Continuous Glucose Monitoring in Persons With Type 1 Diabetes With Impaired Glycemic Control Treated With Multiple Daily Insulin Injections (GOLD Study).

    PubMed

    Lind, Marcus; Polonsky, William; Hirsch, Irl B; Heise, Tim; Bolinder, Jan; Dahlqvist, Sofia; Pehrsson, Nils-Gunnar; Moström, Peter

    2016-05-01

    The majority of individuals with type 1 diabetes today have glucose levels exceeding guidelines. The primary aim of this study was to evaluate whether continuous glucose monitoring (CGM), using the Dexcom G4 stand-alone system, improves glycemic control in adults with type 1 diabetes treated with multiple daily insulin injections (MDI). Individuals with type 1 diabetes and inadequate glycemic control (HbA1c ≥ 7.5% = 58 mmol/mol) treated with MDI were randomized in a cross-over design to the Dexcom G4 versus conventional care for 6 months followed by a 4-month wash-out period. Masked CGM was performed before randomization, during conventional treatment, and during the wash-out period to evaluate effects on hypoglycemia, hyperglycemia, and glycemic variability. Questionnaires were used to evaluate diabetes treatment satisfaction, fear of hypoglycemia, hypoglycemia confidence, diabetes-related distress, overall well-being, and physical activity during the different phases of the trial. The primary endpoint was the difference in HbA1c at the end of each treatment phase. A total of 205 patients were screened, of whom 161 were randomized between February and December 2014. Study completion is anticipated in April 2016. It is expected that the results of this study will establish whether using the Dexcom G4 stand-alone system in individuals with type 1 diabetes treated with MDI improves glycemic control, reduces hypoglycemia, and influences quality-of-life indicators and glycemic variability. © 2016 Diabetes Technology Society.

  12. Generating variable and random schedules of reinforcement using Microsoft Excel macros.

    PubMed

    Bancroft, Stacie L; Bourret, Jason C

    2008-01-01

    Variable reinforcement schedules are used to arrange the availability of reinforcement following varying response ratios or intervals of time. Random reinforcement schedules are subtypes of variable reinforcement schedules that can be used to arrange the availability of reinforcement at a constant probability across number of responses or time. Generating schedule values for variable and random reinforcement schedules can be difficult. The present article describes the steps necessary to write macros in Microsoft Excel that will generate variable-ratio, variable-interval, variable-time, random-ratio, random-interval, and random-time reinforcement schedule values.
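
    A Python analogue of the macros' outputs (my own rendering of the standard schedule definitions, not the article's Excel/VBA code): variable schedules sample values around a target mean, while random schedules impose a constant probability per response or a constant hazard in time.

        import numpy as np

        rng = np.random.default_rng(9)

        def variable_ratio(mean, n):
            """Response requirements scattered around the mean."""
            return rng.integers(1, 2 * mean, size=n)       # averages ~ mean

        def random_ratio(mean, n):
            """Constant probability 1/mean per response -> geometric values."""
            return rng.geometric(1.0 / mean, size=n)

        def random_interval(mean_s, n):
            """Constant hazard in time -> exponential intervals (seconds)."""
            return rng.exponential(mean_s, size=n)

        print(variable_ratio(10, 5), random_ratio(10, 5),
              np.round(random_interval(30, 5), 1))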

  13. Effectiveness of a stepped primary care smoking cessation intervention: cluster randomized clinical trial (ISTAPS study).

    PubMed

    Cabezas, Carmen; Advani, Mamta; Puente, Diana; Rodriguez-Blanco, Teresa; Martin, Carlos

    2011-09-01

    To evaluate the effectiveness in primary care of a stepped smoking cessation intervention based on the transtheoretical model of change. Cluster randomized trial; unit of randomization: basic care unit (family physician and nurse who care for the same group of patients); and intention-to-treat analysis. All interested basic care units (n = 176) that worked in 82 primary care centres belonging to the Spanish Preventive Services and Health Promotion Research Network in 13 regions of Spain. A total of 2,827 smokers (aged 14-85 years) who consulted a primary care centre for any reason, provided written informed consent and had valid interviews. The outcome variable was the 1-year continuous abstinence rate at the 2-year follow-up. The main variable was the study group (intervention/control). Intervention involved 6-month implementation of recommendations from a Clinical Practice Guideline which included brief motivational interviews for smokers at the precontemplation-contemplation stage, brief intervention for smokers in preparation-action who do not want help, intensive intervention with pharmacotherapy for smokers in preparation-action who want help and reinforcing intervention in the maintenance stage. Control group involved usual care. Among others, characteristics of tobacco use and motivation to quit variables were also collected. The 1-year continuous abstinence rate at the 2-year follow-up was 8.1% in the intervention group and 5.8% in the control group (P = 0.014). In the multivariate logistic regression, the odds of quitting of the intervention versus control group was 1.50 (95% confidence interval = 1.05-2.14). A stepped smoking cessation intervention based on the transtheoretical model significantly increased smoking abstinence at a 2-year follow-up among smokers visiting primary care centres. © 2011 The Authors, Addiction © 2011 Society for the Study of Addiction.

  14. Probabilistic Reasoning for Plan Robustness

    NASA Technical Reports Server (NTRS)

    Schaffer, Steve R.; Clement, Bradley J.; Chien, Steve A.

    2005-01-01

    A planning system must reason about the uncertainty of continuous variables in order to accurately project the possible system state over time. A method is devised for directly reasoning about the uncertainty in continuous activity duration and resource usage for planning problems. By representing random variables as parametric distributions, computing the projected system state can be simplified in some cases. Common approximation methods and novel methods are compared for over-constrained and lightly constrained domains within an iterative repair planner. Results show improvements in robustness over the conventional non-probabilistic representation by reducing the number of constraint violations witnessed by execution. The improvement is more significant for larger problems and problems with higher resource subscription levels but diminishes as the system is allowed to accept higher risk levels.
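
    The payoff of a parametric representation is easy to see in a small example (numbers invented): if activity durations are independent normal random variables, the completion time is again normal, so the probability of missing a deadline has a closed form that a planner can evaluate without sampling.

        import math
        import numpy as np

        means = np.array([5.0, 3.0, 7.0])        # activity durations
        sds = np.array([1.0, 0.5, 2.0])
        deadline = 18.0

        mu, sigma = float(means.sum()), math.sqrt(float(np.sum(sds ** 2)))
        p_late = 0.5 * math.erfc((deadline - mu) / (sigma * math.sqrt(2)))

        rng = np.random.default_rng(10)          # Monte Carlo cross-check
        sims = rng.normal(means, sds, size=(100_000, 3)).sum(axis=1)
        print(round(p_late, 4), round(float((sims > deadline).mean()), 4))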

  15. Statistical optics

    NASA Astrophysics Data System (ADS)

    Goodman, J. W.

    This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.
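
    One of the book's central constructions, the random phasor sum, can be demonstrated directly: many unit phasors with independent uniform phases add to an approximately circular complex Gaussian, so the resultant amplitude is Rayleigh-distributed (fully developed speckle). The normalization below is a common convention, assumed here.

        import numpy as np

        rng = np.random.default_rng(11)
        n_phasors, n_trials = 500, 20_000
        phases = rng.uniform(0, 2 * np.pi, size=(n_trials, n_phasors))
        resultant = np.exp(1j * phases).sum(axis=1) / np.sqrt(n_phasors)
        intensity = np.abs(resultant) ** 2
        print(round(float(intensity.mean()), 3))   # ~1.0; intensity is exponential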

  16. Generating Variable and Random Schedules of Reinforcement Using Microsoft Excel Macros

    PubMed Central

    Bancroft, Stacie L; Bourret, Jason C

    2008-01-01

    Variable reinforcement schedules are used to arrange the availability of reinforcement following varying response ratios or intervals of time. Random reinforcement schedules are subtypes of variable reinforcement schedules that can be used to arrange the availability of reinforcement at a constant probability across number of responses or time. Generating schedule values for variable and random reinforcement schedules can be difficult. The present article describes the steps necessary to write macros in Microsoft Excel that will generate variable-ratio, variable-interval, variable-time, random-ratio, random-interval, and random-time reinforcement schedule values. PMID:18595286

  17. Bayesian data fusion for spatial prediction of categorical variables in environmental sciences

    NASA Astrophysics Data System (ADS)

    Gengler, Sarah; Bogaert, Patrick

    2014-12-01

    First developed to predict continuous variables, Bayesian Maximum Entropy (BME) has become a complete framework in the context of space-time prediction since it has been extended to predict categorical variables and mixed random fields. This method proposes solutions to combine several sources of data whatever the nature of the information. However, the various attempts that were made to adapt the BME methodology to categorical variables and mixed random fields faced some limitations, such as a high computational burden. The main objective of this paper is to overcome this limitation by generalizing the Bayesian Data Fusion (BDF) theoretical framework to categorical variables, which is somehow a simplification of the BME method through the convenient conditional independence hypothesis. The BDF methodology for categorical variables is first described and then applied to a practical case study: the estimation of soil drainage classes using a soil map and point observations in the sandy area of Flanders around the city of Mechelen (Belgium). The BDF approach is compared to BME along with more classical approaches, such as Indicator CoKriging (ICK) and logistic regression. Estimators are compared using various indicators, namely the Percentage of Correctly Classified locations (PCC) and the Average Highest Probability (AHP). Although the BDF methodology for categorical variables is somehow a simplification of the BME approach, both methods lead to similar results and have strong advantages compared to ICK and logistic regression.

  18. Divided dosing reduces prednisolone-induced hyperglycaemia and glycaemic variability: a randomized trial after kidney transplantation.

    PubMed

    Yates, Christopher J; Fourlanos, Spiros; Colman, Peter G; Cohney, Solomon J

    2014-03-01

    Prednisolone is a major risk factor for hyperglycaemia and new-onset diabetes after transplantation. Uncontrolled observational data suggest that divided dosing may reduce requirements for hypoglycaemic agents. This study aims to compare the glycaemic effects of divided twice daily (BD) and once daily (QD) prednisolone. Twenty-two kidney transplant recipients without diabetes were randomized to BD or QD prednisolone. Three weeks post-transplant, a continuous glucose monitor (iPro2(®) Medtronic) was applied for 5 days with subjects continuing their initial prednisolone regimen (Days 1-2) before crossover to the alternative regimen. Mean glucose, peak glucose, nadir glucose, exposure to hyperglycaemia (glucose ≥7.8 mmol/L) and glycaemic variability were assessed. The mean ± standard deviation (SD) age of subjects was 50 ± 10 years and 77% were male. Median (interquartile range) daily prednisolone dose was 25 (20, 25) mg. BD prednisolone was associated with decreased mean glucose (mean 7.9 ± 1.7 versus 8.1 ± 2.3 mmol/L, P < 0.001), peak glucose [median 10.4 (9.5, 11.4) versus 11.4 (10.3, 13.4) mmol/L, P< 0.001] and exposure to hyperglycaemia [median 25.5 (14.6, 30.3) versus 40.4 (33.2, 51.2) mmol/L/h, P = 0.003]. Median glucose peaked between 14:55-15.05 h with BD and 15:25-15:30 h with QD. Median glycaemic variability scores were decreased with BD: SD (1.1 versus 1.9, P < 0.001), mean amplitude of glycaemic excursion (1.5 versus 2.2, P = 0.001), continuous overlapping net glycaemic action-1 (CONGA-1; 1.0 versus 1.2, P = 0.039), CONGA-2 (1.2 versus 1.4, P = 0.008) and J-index (25 versus 31, P = 0.003). Split prednisolone dosing reduces glycaemic variability and hyperglycaemia early post-kidney transplant.
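
    Two of the variability scores above have compact common definitions (the trial's exact implementations may differ, and the J-index constant 0.324 assumes glucose in mmol/L): CONGA-n is the standard deviation of differences between readings n hours apart, and the J-index combines mean and SD.

        import numpy as np

        def conga(glucose, n_hours, samples_per_hour=12):   # 5-min CGM sampling
            lag = n_hours * samples_per_hour
            return float(np.std(glucose[lag:] - glucose[:-lag], ddof=1))

        def j_index(glucose):                    # 0.324 * (mean + SD)^2, mmol/L
            return 0.324 * (np.mean(glucose) + np.std(glucose, ddof=1)) ** 2

        rng = np.random.default_rng(12)
        cgm = 8.0 + np.cumsum(rng.normal(0, 0.08, size=24 * 12))  # one day, mmol/L
        print(round(conga(cgm, 1), 2), round(conga(cgm, 2), 2),
              round(float(j_index(cgm)), 1))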

  19. Effect of progestin vs. combined oral contraceptive pills on lactation: A double-blind randomized controlled trial

    PubMed Central

    Espey, Eve; Ogburn, Tony; Leeman, Larry; Singh, Rameet; Schrader, Ronald

    2013-01-01

    Objective To estimate the effect of progestin-only vs. combined hormonal contraceptive pills on rates of breastfeeding continuation in postpartum women. Secondary outcomes include infant growth parameters, contraceptive method continuation and patient satisfaction with breastfeeding and contraceptive method. Methods In this randomized controlled trial, postpartum breastfeeding women who desired oral contraceptives were assigned to progestin-only vs. combined hormonal contraceptive pills. At two and eight weeks postpartum, participants completed in-person questionnaires that assessed breastfeeding continuation and contraceptive use. Infant growth parameters including weight, length and head circumference were assessed at eight weeks postpartum. Telephone questionnaires assessing breastfeeding, contraceptive continuation and satisfaction were completed at 3-7 weeks and 4 and 6 months. Breastfeeding continuation was compared between groups using Cox proportional hazards regression. Differences in baseline demographic characteristics and in variables between the two intervention groups were compared using chi-square tests, Fisher’s Exact test, or two-sample t-tests as appropriate. Results Breastfeeding continuation rates, contraceptive continuation, and infant growth parameters did not differ between users of progestin-only and combined hormonal contraceptive pills. Infant formula supplementation and maternal perception of inadequate milk supply were associated with decreased rates of breastfeeding in both groups. Conclusions Choice of combined or progestin-only birth control pills administered two weeks postpartum did not adversely affect breastfeeding continuation. PMID:22143258

  20. Exercise training in adults with repaired tetralogy of Fallot: A randomized controlled pilot study of continuous versus interval training.

    PubMed

    Novaković, Marko; Prokšelj, Katja; Rajkovič, Uroš; Vižintin Cuderman, Tjaša; Janša Trontelj, Katja; Fras, Zlatko; Jug, Borut

    2018-03-15

    Adults with repaired tetralogy of Fallot (ToF) have impaired exercise capacity, vascular and cardiac autonomic function, and quality of life (QoL). The specific effects of high-intensity interval versus moderate continuous exercise training on these parameters in adults with repaired ToF remain unknown. Thirty adults with repaired ToF were randomized to high-intensity interval training, moderate-intensity continuous training (36 sessions, 2-3 times a week), or usual care (no supervised exercise). Exercise capacity, flow-mediated vasodilation, pulse wave velocity, NT-proBNP and fibrinogen levels, heart rate variability and recovery, and QoL (SF-36 questionnaire) were determined at baseline and after the intervention period. Twenty-seven patients (mean age 39 ± 9 years, 63% females, 9 from each group) completed this pilot study. Both training groups improved in at least some parameters of cardiovascular health compared with no exercise. Interval, but not continuous, training improved VO2peak (21.2 to 22.9 mL/kg/min, p=0.004), flow-mediated vasodilation (8.4 to 12.9%, p=0.019), pulse wave velocity (5.4 to 4.8 m/s, p=0.028), NT-proBNP (202 to 190 ng/L, p=0.032) and fibrinogen levels (2.67 to 2.46 g/L, p=0.018). Conversely, continuous, but not interval, training improved heart rate variability (low-frequency domain, 0.32 to 0.22, p=0.039), heart rate recovery at 2 min post-exercise (40 to 47 beats, p=0.023) and the mental domain of the SF-36 (87 to 95, p=0.028). Both interval and continuous exercise training modalities were safe. Interval training seems more efficacious in improving exercise capacity, vascular function, and NT-proBNP and fibrinogen levels, while continuous training seems more efficacious in improving cardiac autonomic function and QoL. (Clinicaltrials.gov, NCT02643810). Copyright © 2018 Elsevier Ireland Ltd. All rights reserved.

  1. Testing concordance of instrumental variable effects in generalized linear models with application to Mendelian randomization

    PubMed Central

    Dai, James Y.; Chan, Kwun Chuen Gary; Hsu, Li

    2014-01-01

    Instrumental variable regression is one way to overcome unmeasured confounding and estimate causal effects in observational studies. Built on structural mean models, considerable work has recently been developed for consistent estimation of causal relative risks and causal odds ratios. Such models can sometimes suffer from identification issues for weak instruments, which has hampered the applicability of Mendelian randomization analysis in genetic epidemiology. When there are multiple genetic variants available as instrumental variables, and the causal effect is defined in a generalized linear model in the presence of unmeasured confounders, we propose to test concordance between instrumental variable effects on the intermediate exposure and instrumental variable effects on the disease outcome, as a means of testing the causal effect. We show that a class of generalized least squares estimators provide valid and consistent tests of causality. For the causal effect of a continuous exposure on a dichotomous outcome in logistic models, the proposed estimators are shown to be asymptotically conservative. When the disease outcome is rare, such estimators are consistent due to the log-linear approximation of the logistic function. Optimality of such estimators relative to the well-known two-stage least squares estimator and the double-logistic structural mean model is further discussed. PMID:24863158
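
    For context, the well-known two-stage least squares (2SLS) estimator that the paper benchmarks against can be sketched in a few lines on simulated data; the variant coefficients, confounder strength and sample size below are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 5000
    G = rng.binomial(2, 0.3, size=(n, 3))       # 3 genetic variants as instruments
    U = rng.normal(size=n)                      # unmeasured confounder
    X = G @ np.array([0.3, 0.2, 0.1]) + U + rng.normal(size=n)  # exposure
    Y = 0.5 * X + U + rng.normal(size=n)        # outcome; true causal effect 0.5

    # Stage 1: regress exposure on instruments; Stage 2: regress outcome on fitted X.
    Z = np.column_stack([np.ones(n), G])
    xhat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]
    D = np.column_stack([np.ones(n), xhat])
    beta = np.linalg.lstsq(D, Y, rcond=None)[0]
    print("2SLS causal-effect estimate:", beta[1])   # close to 0.5
    ```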

  2. Learning dependence from samples.

    PubMed

    Seth, Sohan; Príncipe, José C

    2014-01-01

    Mutual information, conditional mutual information and interaction information have been widely used in scientific literature as measures of dependence, conditional dependence and mutual dependence. However, these concepts suffer from several computational issues; they are difficult to estimate in continuous domain, the existing regularised estimators are almost always defined only for real or vector-valued random variables, and these measures address what dependence, conditional dependence and mutual dependence imply in terms of the random variables but not finite realisations. In this paper, we address the issue that given a set of realisations in an arbitrary metric space, what characteristic makes them dependent, conditionally dependent or mutually dependent. With this novel understanding, we develop new estimators of association, conditional association and interaction association. Some attractive properties of these estimators are that they do not require choosing free parameter(s), they are computationally simpler, and they can be applied to arbitrary metric spaces.

  3. Comparison of methods for the analysis of relatively simple mediation models.

    PubMed

    Rijnhart, Judith J M; Twisk, Jos W R; Chinapaw, Mai J M; de Boer, Michiel R; Heymans, Martijn W

    2017-09-01

    Statistical mediation analysis is an often-used method in trials to unravel the pathways underlying the effect of an intervention on a particular outcome variable. Throughout the years, several methods have been proposed, such as ordinary least squares (OLS) regression, structural equation modeling (SEM), and the potential outcomes framework. Most applied researchers do not know that these methods are mathematically equivalent when applied to mediation models with a continuous mediator and outcome variable. Therefore, the aim of this paper was to demonstrate the similarities between OLS regression, SEM, and the potential outcomes framework in three mediation models: 1) a crude model, 2) a confounder-adjusted model, and 3) a model with an interaction term for exposure-mediator interaction. We performed a secondary data analysis of a randomized controlled trial that included 546 schoolchildren. In our data example, the mediator and outcome variable were both continuous. We compared the estimates of the total, direct and indirect effects, proportion mediated, and 95% confidence intervals (CIs) for the indirect effect across OLS regression, SEM, and the potential outcomes framework. OLS regression, SEM, and the potential outcomes framework yielded the same effect estimates in the crude mediation model, the confounder-adjusted mediation model, and the mediation model with an interaction term for exposure-mediator interaction. Since OLS regression, SEM, and the potential outcomes framework yield the same results in three mediation models with a continuous mediator and outcome variable, researchers can continue using the method that is most convenient to them.
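
    A minimal sketch of the OLS route for the crude model, using the product-of-coefficients form of the indirect effect (a × b); the simulated data and path coefficients are assumptions, not the trial's values.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 546
    x = rng.binomial(1, 0.5, n)                  # randomized exposure
    m = 0.6 * x + rng.normal(size=n)             # continuous mediator
    y = 0.4 * m + 0.3 * x + rng.normal(size=n)   # continuous outcome

    def ols(y, X):
        """OLS coefficients with an intercept prepended."""
        X = np.column_stack([np.ones(len(y))] + list(X))
        return np.linalg.lstsq(X, y, rcond=None)[0]

    a = ols(m, [x])[1]                  # exposure -> mediator path
    b, c_prime = ols(y, [m, x])[1:3]    # mediator -> outcome path, direct effect
    total = ols(y, [x])[1]
    print("indirect a*b =", a * b, " direct c' =", c_prime, " total =", total)
    print("proportion mediated ~", a * b / total)
    ```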

  4. Leveraging prognostic baseline variables to gain precision in randomized trials

    PubMed Central

    Colantuoni, Elizabeth; Rosenblum, Michael

    2015-01-01

    We focus on estimating the average treatment effect in a randomized trial. If baseline variables are correlated with the outcome, then appropriately adjusting for these variables can improve precision. An example is the analysis of covariance (ANCOVA) estimator, which applies when the outcome is continuous, the quantity of interest is the difference in mean outcomes comparing treatment versus control, and a linear model with only main effects is used. ANCOVA is guaranteed to be at least as precise as the standard unadjusted estimator, asymptotically, under no parametric model assumptions, and is also locally semiparametric efficient. Recently, several estimators have been developed that extend these desirable properties to more general settings that allow any real-valued outcome (e.g., binary or count), contrasts other than the difference in mean outcomes (such as the relative risk), and estimators based on a large class of generalized linear models (including logistic regression). To the best of our knowledge, we give the first simulation study in the context of randomized trials that compares these estimators. Furthermore, our simulations are not based on parametric models; instead, they resample data from completed randomized trials in stroke and HIV to assess estimator performance in realistic scenarios. We provide practical guidance on when these estimators are likely to provide substantial precision gains and describe a quick assessment method that allows clinical investigators to determine whether these estimators could be useful in their specific trial contexts. PMID:25872751
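
    A minimal sketch contrasting the unadjusted difference-in-means with the ANCOVA estimator on simulated trial data (the effect sizes and baseline correlation are assumptions):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 400
    baseline = rng.normal(size=n)                  # prognostic baseline variable
    treat = rng.binomial(1, 0.5, n)                # randomized assignment
    outcome = 1.0 * treat + 2.0 * baseline + rng.normal(size=n)

    # Unadjusted estimator: difference in means.
    unadj = outcome[treat == 1].mean() - outcome[treat == 0].mean()

    # ANCOVA: main-effects linear model; the treatment coefficient is the estimate.
    X = np.column_stack([np.ones(n), treat, baseline])
    ancova = np.linalg.lstsq(X, outcome, rcond=None)[0][1]
    print("unadjusted:", unadj, " ANCOVA:", ancova)
    ```

    Over repeated simulated trials both estimators are unbiased, but the ANCOVA estimate has smaller variance whenever the baseline variable is prognostic, which is the precision gain the paper quantifies.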

  5. Placebo cessation in binge eating disorder: effect on anthropometric, cardiovascular, and metabolic variables.

    PubMed

    Blom, Thomas J; Guerdjikova, Anna I; Mori, Nicole; Casuto, Leah S; McElroy, Susan L

    2015-01-01

    The aim of this study was to evaluate the effects of cessation of binge eating in response to placebo treatment in binge eating disorder (BED) on anthropometric, cardiovascular, and metabolic variables. We pooled participant-level data from 10 randomized, double-blind, placebo-controlled trials of medication for BED. We then compared patients who stopped binge eating with those who did not on changes in weight, body mass index (BMI), systolic and diastolic blood pressure, pulse, and fasting lipids and glucose. Of 234 participants receiving placebo, 60 (26%) attained cessation from binge eating. Patients attaining cessation showed modestly decreased diastolic blood pressure compared with patients who continued to binge eat. Weight and BMI remained stable in patients who stopped binge eating, but increased somewhat in those who continued to binge eat. Patients who stopped binge eating with placebo had greater reductions in diastolic blood pressure and gained less weight than patients who continued to binge eat. Self-report of eating pathology in BED may predict physiologic variables. Copyright © 2014 John Wiley & Sons, Ltd and Eating Disorders Association.

  6. Vascular Glucose Sensor Symposium: Continuous Glucose Monitoring Systems (CGMS) for Hospitalized and Ambulatory Patients at Risk for Hyperglycemia, Hypoglycemia, and Glycemic Variability.

    PubMed

    Joseph, Jeffrey I; Torjman, Marc C; Strasma, Paul J

    2015-07-01

    Hyperglycemia, hypoglycemia, and glycemic variability have been associated with increased morbidity, mortality, length of stay, and cost in a variety of critical care and non-critical care patient populations in the hospital. The results from prospective randomized clinical trials designed to determine the risks and benefits of intensive insulin therapy and tight glycemic control have been confusing and, at times, conflicting. The limitations of point-of-care blood glucose (BG) monitoring in the hospital highlight the great clinical need for an automated real-time continuous glucose monitoring system (CGMS) that can accurately measure the concentration of glucose every few minutes. Automation and standardization of the glucose measurement process have the potential to significantly improve BG control, clinical outcome, safety and cost. © 2015 Diabetes Technology Society.

  7. Practical security analysis of continuous-variable quantum key distribution with jitter in clock synchronization

    NASA Astrophysics Data System (ADS)

    Xie, Cailang; Guo, Ying; Liao, Qin; Zhao, Wei; Huang, Duan; Zhang, Ling; Zeng, Guihua

    2018-03-01

    How to narrow the gap between theoretical and practical security has been a notoriously urgent problem in quantum cryptography. Here, we analyze and provide experimental evidence of the clock jitter effect on a practical continuous-variable quantum key distribution (CV-QKD) system. Clock jitter is a random noise that is always present in the clock synchronization of a practical CV-QKD system; it may compromise system security through its impact on data sampling and parameter estimation. In particular, the practical security of CV-QKD with different amounts of clock jitter against collective attacks is analyzed theoretically for different repetition frequencies, and the numerical simulations indicate that clock jitter has a greater impact in high-speed scenarios. Furthermore, a simplified experiment is designed to investigate the influence of the clock jitter.

  8. Multi-Observation Continuous Density Hidden Markov Models for Anomaly Detection in Full Motion Video

    DTIC Science & Technology

    2012-06-01

    [Garbled extraction of this DTIC record; what survives is list-of-figures residue ("response profiles", "Method for measuring angular movement versus average direction of movement", "Method for calculating Angular Deviation, Θ", "HMM produced by K-Means Learning for agent H") and fragments of the feature definitions: Θ, Angular Deviation, a random variable giving the difference in heading (in degrees) from the overall direction of movement over the sequence; S, Speed.]

  9. Design and simulation of stratified probability digital receiver with application to the multipath communication

    NASA Technical Reports Server (NTRS)

    Deal, J. H.

    1975-01-01

    One approach to the problem of simplifying complex nonlinear filtering algorithms is the use of stratified probability approximations, in which the continuous probability density functions of certain random variables are represented by discrete mass approximations. This technique is developed in this paper and used to simplify the filtering algorithms developed for the optimum receiver for signals corrupted by both additive and multiplicative noise.
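
    The core idea, replacing a continuous density by discrete point masses, can be sketched as follows for a Gaussian density (the grid, truncation range and number of masses are arbitrary choices for illustration):

    ```python
    import numpy as np
    from math import erf, sqrt

    def gaussian_cdf(x, mu=0.0, sigma=1.0):
        return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

    def discretize(mu, sigma, lo, hi, k):
        """Replace a Gaussian density by k point masses: each mass carries the
        probability of its cell and sits at the cell midpoint."""
        edges = np.linspace(lo, hi, k + 1)
        probs = np.diff([gaussian_cdf(e, mu, sigma) for e in edges])
        points = 0.5 * (edges[:-1] + edges[1:])
        return points, probs / probs.sum()   # renormalize the truncated tail mass

    pts, p = discretize(0.0, 1.0, -4, 4, 8)
    # The discrete masses approximately reproduce the continuous moments:
    print("E[X] ~", np.dot(pts, p), "  E[X^2] ~", np.dot(pts ** 2, p))
    ```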

  10. Random center vortex lines in continuous 3D space-time

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Höllwieser, Roman; Altarawneh, Derar

    2016-01-22

    We present a model of center vortices, represented by closed random lines in continuous 2+1-dimensional space-time. These random lines are modeled as being piece-wise linear, and an ensemble is generated by Monte Carlo methods. The physical space in which the vortex lines are defined is a cuboid with periodic boundary conditions. Besides moving, growing and shrinking of the vortex configuration, reconnections are also allowed. Our ensemble therefore contains not a fixed, but a variable number of closed vortex lines, which is expected to be important for realizing the deconfining phase transition. Using the model, we study both vortex percolation and the potential V(R) between quark and anti-quark as a function of distance R at different vortex densities, vortex segment lengths, reconnection conditions and temperatures. We have found three deconfinement phase transitions: as a function of density, as a function of vortex segment length, and as a function of temperature. The model reproduces the qualitative features of confinement physics seen in SU(2) Yang-Mills theory.

  11. Dynamical Localization for Discrete Anderson Dirac Operators

    NASA Astrophysics Data System (ADS)

    Prado, Roberto A.; de Oliveira, César R.; Carvalho, Silas L.

    2017-04-01

    We establish dynamical localization for random Dirac operators on the d-dimensional lattice, with d ∈ {1, 2, 3}, in the three usual regimes: large disorder, band edge and 1D. These operators are discrete versions of the continuous Dirac operators and consist of the sum of a discrete free Dirac operator and a random potential. The potential is a diagonal matrix formed by different scalar potentials, which are sequences of independent and identically distributed random variables according to an absolutely continuous probability measure with bounded density and compact support. We prove the exponential decay of fractional moments of the Green function for such models in each of the above regimes, i.e., (i) throughout the spectrum at large disorder, (ii) for energies near the band edges at arbitrary disorder and (iii) in dimension one, for all energies in the spectrum and arbitrary disorder. Dynamical localization in these regimes follows from the fractional moments method. The result in the one-dimensional regime contrasts with one previously obtained for the 1D Dirac model with Bernoulli potential.

  12. Bayesian adjustment for measurement error in continuous exposures in an individually matched case-control study.

    PubMed

    Espino-Hernandez, Gabriela; Gustafson, Paul; Burstyn, Igor

    2011-05-14

    In epidemiological studies, explanatory variables are frequently subject to measurement error. The aim of this paper is to develop a Bayesian method to correct for measurement error in multiple continuous exposures in individually matched case-control studies. This is a topic that has not been widely investigated. The new method is illustrated using data from an individually matched case-control study of the association between thyroid hormone levels during pregnancy and exposure to perfluorinated acids. The objective of the motivating study was to examine the risk of maternal hypothyroxinemia due to exposure to three perfluorinated acids measured on a continuous scale. Results from the proposed method are compared with those obtained from a naive analysis. Using a Bayesian approach, the developed method considers a classical measurement error model for the exposures, as well as the conditional logistic regression likelihood as the disease model, together with a random-effect exposure model. Proper and diffuse prior distributions are assigned, and results from a quality control experiment are used to estimate the perfluorinated acids' measurement error variability. As a result, posterior distributions and 95% credible intervals of the odds ratios are computed. A sensitivity analysis of the method's performance in this particular application under different measurement error variability was performed. The proposed Bayesian method to correct for measurement error is feasible and can be implemented using statistical software. For the study on perfluorinated acids, a comparison of the inferences which are corrected for measurement error to those which ignore it indicates that little adjustment is manifested for the level of measurement error actually exhibited in the exposures. Nevertheless, a sensitivity analysis shows that more substantial adjustments arise if larger measurement errors are assumed. In individually matched case-control studies, the use of the conditional logistic regression likelihood as a disease model in the presence of measurement error in multiple continuous exposures can be justified by having a random-effect exposure model. The proposed method can be successfully implemented in WinBUGS to correct individually matched case-control studies for several mismeasured continuous exposures under a classical measurement error model.
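
    The classical measurement error model used here, W = X + U with U independent of X, biases naive regression coefficients toward zero, which is what the Bayesian correction targets. A small simulation of that attenuation in a logistic disease model (all parameter values are assumptions):

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(4)
    n = 20000
    x = rng.normal(size=n)                    # true continuous exposure
    w = x + rng.normal(scale=0.8, size=n)     # classical error: W = X + U
    p = 1 / (1 + np.exp(-(-1.0 + 0.7 * x)))   # true log-odds slope 0.7
    y = rng.binomial(1, p)

    # Large C makes the fit effectively unpenalized.
    fit_true = LogisticRegression(C=1e6).fit(x.reshape(-1, 1), y)
    fit_naive = LogisticRegression(C=1e6).fit(w.reshape(-1, 1), y)
    print("slope with true X:", fit_true.coef_[0][0])    # close to 0.7
    print("naive slope with W:", fit_naive.coef_[0][0])  # attenuated toward 0
    ```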

  13. Regularization of the big bang singularity with random perturbations

    NASA Astrophysics Data System (ADS)

    Belbruno, Edward; Xue, BingKan

    2018-03-01

    We show how to regularize the big bang singularity in the presence of random perturbations modeled by Brownian motion, using stochastic methods. We prove that the physical variables in a contracting universe dominated by a scalar field can be continuously and uniquely extended through the big bang as a function of time to an expanding universe only for a discrete set of values of the equation of state satisfying special co-prime number conditions. This significantly generalizes a previous result (Xue and Belbruno 2014 Class. Quantum Grav. 31 165002) that did not model random perturbations. It implies that the extension from a contracting to an expanding universe for the discrete set of co-prime equations of state is robust, which is surprising. Implications for a purely expanding universe are discussed, such as a non-smooth, randomly varying scale factor near the big bang.

  14. Effect of cinnamon on glucose control and lipid parameters.

    PubMed

    Baker, William L; Gutierrez-Williams, Gabriela; White, C Michael; Kluger, Jeffrey; Coleman, Craig I

    2008-01-01

    To perform a meta-analysis of randomized controlled trials of cinnamon to better characterize its impact on glucose and plasma lipids. A systematic literature search through July 2007 was conducted to identify randomized placebo-controlled trials of cinnamon that reported data on A1C, fasting blood glucose (FBG), or lipid parameters. The mean change in each study end point from baseline was treated as a continuous variable, and the weighted mean difference was calculated as the difference between the mean value in the treatment and control groups. A random-effects model was used. Five prospective randomized controlled trials (n = 282) were identified. Upon meta-analysis, the use of cinnamon did not significantly alter A1C, FBG, or lipid parameters. Subgroup and sensitivity analyses did not significantly change the results. Cinnamon does not appear to improve A1C, FBG, or lipid parameters in patients with type 1 or type 2 diabetes.

  15. Predictive modeling of cardiovascular complications in incident hemodialysis patients.

    PubMed

    Ion Titapiccolo, J; Ferrario, M; Barbieri, C; Marcelli, D; Mari, F; Gatti, E; Cerutti, S; Smyth, P; Signorini, M G

    2012-01-01

    The administration of hemodialysis (HD) treatment leads to the continuous collection of a vast quantity of medical data. Many variables related to the patient's health status, to the treatment, and to dialyzer settings can be recorded and stored at each treatment session. In this study, a dataset of 42 variables and 1526 patients extracted from the Fresenius Medical Care database EuCliD was used to develop and apply a random forest predictive model for the prediction of cardiovascular events in the first year of HD treatment. A ridge-lasso logistic regression algorithm was then applied to the subset of variables most involved in the prediction model to gain insight into the mechanisms underlying the incidence of cardiovascular complications in this high-risk population of patients.
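
    A sketch of this kind of pipeline with scikit-learn, using stand-in simulated features in place of the EuCliD variables (only the dataset dimensions, 1526 patients by 42 variables, are taken from the abstract):

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(5)
    n, p = 1526, 42                          # patients x recorded variables
    X = rng.normal(size=(n, p))              # stand-in for EuCliD-style features
    risk = X[:, 0] + 0.5 * X[:, 1]           # two informative variables
    y = rng.binomial(1, 1 / (1 + np.exp(-risk)))   # cardiovascular event indicator

    clf = RandomForestClassifier(n_estimators=300, random_state=0)
    print("CV AUC:", cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean())

    # Variable importances hint at which inputs drive the prediction,
    # analogous to the subset the authors pass to ridge-lasso regression.
    clf.fit(X, y)
    print("top features:", np.argsort(clf.feature_importances_)[-5:])
    ```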

  16. Continuous glucose monitoring in acute coronary syndrome.

    PubMed

    Rodríguez-Quintanilla, Karina Alejandra; Lavalle-González, Fernando Javier; Mancillas-Adame, Leonardo Guadalupe; Zapata-Garrido, Alfonso Javier; Villarreal-Pérez, Jesús Zacarías; Tamez-Pérez, Héctor Eloy

    2013-01-01

    Diabetes mellitus is an independent risk factor for cardiovascular disease. To compare the efficacy of continuous glucose monitoring devices and capillary glucose monitoring in hospitalized patients with acute coronary syndrome using the following parameters: time to achieve normoglycemia, period of time in normoglycemia, and episodes of hypoglycemia. We performed a pilot, non-randomized, unblinded clinical trial that included 16 patients with acute coronary syndrome, a capillary or venous blood glucose ≥140 mg/dl, and treatment with a continuous infusion of fast-acting human insulin. These patients were assigned to 2 groups: a conventional group, in which capillary measurement and recording as well as insulin adjustment were made every 4 h, and an intervention group, in which measurement and recording as well as insulin adjustment were made every hour with a subcutaneous continuous monitoring system. Student's t-test was applied for mean differences and the χ² test for qualitative variables. We observed a statistically significant difference in the mean time to achieve normoglycemia, favoring the conventional group (P = 0.02). Continuous monitoring systems are as useful as capillary monitoring for achieving normoglycemia. Copyright © 2012 Instituto Nacional de Cardiología Ignacio Chávez. Published by Masson Doyma México S.A. All rights reserved.

  17. Effectiveness of Anti-Dementia Drugs in Extremely Severe Alzheimer's Disease: A 12-Week, Multicenter, Randomized, Single-Blind Study.

    PubMed

    Hong, Yun Jeong; Choi, Seong Hye; Jeong, Jee Hyang; Park, Kyung Won; Na, Hae Ri

    2018-01-01

    There is insufficient evidence to guide decisions concerning how long anti-dementia drug (ADD) regimens should be maintained in severe Alzheimer's disease (AD). We investigated whether patients with extremely severe AD who were already receiving donepezil or memantine benefited from continuing treatment. In this randomized and rater-blinded trial, 65 AD patients with a Mini-Mental State Examination score from 0 to 5 and a score of 6c or worse on Functional Assessment Staging were randomly assigned to an ADD-continuation group (N = 30) or an ADD-discontinuation group (N = 35). The current use of donepezil or memantine was maintained for 12 weeks in the ADD-continuation group and was discontinued after baseline in the ADD-discontinuation group. Efficacy measures were obtained at baseline and 12 weeks. The primary efficacy variable was the change from baseline to the end of the study in Baylor Profound Mental State Examination (BPMSE) scores. The change in the BPMSE from baseline to the end of the study in the ADD-continuation group (a 0.4-point improvement) was not equivalent to that in the ADD-discontinuation group (a 0.5-point decline), as determined by two one-sided tests of equivalence. Study withdrawals due to adverse events (11.4% versus 6.7%) were more frequent in the ADD-discontinuation group than in the ADD-continuation group. Continued treatment with donepezil or memantine was not equivalent to withdrawal of the drugs and might be superior in terms of the effects on global cognition in patients with extremely severe AD. Current Controlled Trials number: KCT0000874 (CRIS).

  18. A randomized controlled trial of post-extubation bubble continuous positive airway pressure versus Infant Flow Driver continuous positive airway pressure in preterm infants with respiratory distress syndrome.

    PubMed

    Gupta, Samir; Sinha, Sunil K; Tin, Win; Donn, Steven M

    2009-05-01

    To compare the efficacy and safety of bubble continuous positive airway pressure (CPAP) and Infant Flow Driver (IFD) CPAP for the post-extubation management of preterm infants with respiratory distress syndrome (RDS). A total of 140 preterm infants at 24 to 29 weeks' gestation or with a birth weight of 600 to 1500 g who were ventilated at birth for RDS were randomized to receive either IFD CPAP (a variable-flow device) or bubble CPAP (a continuous-flow device). A standardized protocol was used for extubation and CPAP. No crossover was allowed. The primary outcome was successful extubation maintained for at least 72 hours. Secondary outcomes included successful extubation maintained for 7 days, total duration of CPAP support, chronic lung disease, and complications of prematurity. Seventy-one infants were randomized to bubble CPAP, and 69 were randomized to IFD CPAP. Mean gestational age and birth weight were similar in the 2 groups, as were the proportions of infants who achieved successful extubation for 72 hours and for 7 days. However, the median duration of CPAP support was 50% shorter in the infants on bubble CPAP. Moreover, in the subset of infants who were ventilated for less than 14 days, the infants on bubble CPAP had a significantly lower extubation failure rate. There was no difference in the incidence of chronic lung disease or other complications between the 2 study groups. Bubble CPAP is as effective as IFD CPAP in the post-extubation management of infants with RDS; however, in infants ventilated for ≤14 days, bubble CPAP is associated with a significantly higher rate of successful extubation. Bubble CPAP also is associated with a significantly reduced duration of CPAP support.

  19. A stochastic event-based continuous time step rainfall generator based on Poisson rectangular pulse and microcanonical random cascade models

    NASA Astrophysics Data System (ADS)

    Pohle, Ina; Niebisch, Michael; Zha, Tingting; Schümberg, Sabine; Müller, Hannes; Maurer, Thomas; Hinz, Christoph

    2017-04-01

    Rainfall variability within a storm is of major importance for fast hydrological processes, e.g. surface runoff, erosion and solute dissipation from surface soils. To investigate and simulate the impacts of within-storm variabilities on these processes, long time series of rainfall with high resolution are required. Yet, observed precipitation records of hourly or higher resolution are in most cases available only for a small number of stations and only for a few years. To obtain long time series of alternating rainfall events and interstorm periods while conserving the statistics of observed rainfall events, the Poisson model can be used. Multiplicative microcanonical random cascades have been widely applied to disaggregate rainfall time series from coarse to fine temporal resolution. We present a new coupling approach of the Poisson rectangular pulse model and the multiplicative microcanonical random cascade model that preserves the characteristics of rainfall events as well as inter-storm periods. In the first step, a Poisson rectangular pulse model is applied to generate discrete rainfall events (duration and mean intensity) and inter-storm periods (duration). The rainfall events are subsequently disaggregated to high-resolution time series (user-specified, e.g. 10 min resolution) by a multiplicative microcanonical random cascade model. One of the challenges of coupling these models is to parameterize the cascade model for the event durations generated by the Poisson model. In fact, the cascade model is best suited to downscaling rainfall data with a constant time step, such as daily precipitation data. Without starting from a fixed time step duration (e.g. daily), the disaggregation of events requires some modifications of the multiplicative microcanonical random cascade model proposed by Olsson (1998): firstly, the parameterization of the cascade model for events of different durations requires continuous functions for the probabilities of the multiplicative weights, which we implemented through sigmoid functions; secondly, the branching of the first and last box is constrained to preserve the rainfall event durations generated by the Poisson rectangular pulse model. The event-based continuous time step rainfall generator has been developed and tested using 10 min and hourly rainfall data from four stations in North-Eastern Germany. The model performs well in comparison to observed rainfall in terms of event durations and mean event intensities as well as wet spell and dry spell durations. It is currently being tested using data from other stations across Germany and in different climate zones. Furthermore, the rainfall event generator is being applied in modelling approaches aimed at understanding the impact of rainfall variability on hydrological processes. Reference: Olsson, J.: Evaluation of a scaling cascade model for temporal rainfall disaggregation, Hydrology and Earth System Sciences, 2, 19-30, 1998.
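
    The microcanonical property, that the two halves of each branching receive weights w and 1 - w so that mass is conserved exactly, can be sketched as below. The weight distribution here (a lumped all-or-nothing probability plus beta-distributed interior splits) is a toy stand-in, not the sigmoid-parameterized probabilities the authors propose.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    def microcanonical_cascade(total_mm, levels, p_zero=0.3):
        """Disaggregate an event total over 2**levels equal sub-intervals.
        At each branching a weight w is drawn; the two halves receive
        w and 1 - w, so mass is conserved exactly at every level."""
        series = np.array([total_mm])
        for _ in range(levels):
            w = np.where(rng.random(series.size) < p_zero,
                         rng.integers(0, 2, series.size).astype(float),  # all-or-nothing
                         rng.beta(2, 2, series.size))                    # interior split
            series = np.column_stack([w * series, (1 - w) * series]).ravel()
        return series

    event = microcanonical_cascade(12.0, levels=5)   # 12 mm over 32 sub-steps
    print(event.round(2), "sum =", event.sum())      # the sum stays exactly 12.0
    ```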

  20. Quantum key distribution using basis encoding of Gaussian-modulated coherent states

    NASA Astrophysics Data System (ADS)

    Huang, Peng; Huang, Jingzheng; Zhang, Zheshen; Zeng, Guihua

    2018-04-01

    Continuous-variable quantum key distribution (CVQKD) has been demonstrated to be viable for practical secure quantum cryptography. However, its performance is strongly restricted by the channel excess noise and the reconciliation efficiency. In this paper, we present a quantum key distribution (QKD) protocol that encodes the secret keys in the random choices between two measurement bases: the conjugate quadratures X and P. The employed encoding method can dramatically weaken the effects of channel excess noise and reconciliation efficiency on the performance of the QKD protocol. Consequently, the proposed scheme can tolerate much higher excess noise and enables a much longer secure transmission distance, even at lower reconciliation efficiency. The proposal can work alternatively to strengthen significantly the performance of the known Gaussian-modulated CVQKD protocol and serve as a multiplier for practical secure quantum cryptography with continuous variables.

  1. MIMICKING COUNTERFACTUAL OUTCOMES TO ESTIMATE CAUSAL EFFECTS.

    PubMed

    Lok, Judith J

    2017-04-01

    In observational studies, treatment may be adapted to covariates at several times without a fixed protocol, in continuous time. Treatment influences covariates, which influence treatment, which influences covariates, and so on. Then even time-dependent Cox-models cannot be used to estimate the net treatment effect. Structural nested models have been applied in this setting. Structural nested models are based on counterfactuals: the outcome a person would have had had treatment been withheld after a certain time. Previous work on continuous-time structural nested models assumes that counterfactuals depend deterministically on observed data, while conjecturing that this assumption can be relaxed. This article proves that one can mimic counterfactuals by constructing random variables, solutions to a differential equation, that have the same distribution as the counterfactuals, even given past observed data. These "mimicking" variables can be used to estimate the parameters of structural nested models without assuming the treatment effect to be deterministic.

  2. Comparison of tofogliflozin 20 mg and ipragliflozin 50 mg used together with insulin glargine 300 U/mL using continuous glucose monitoring (CGM): A randomized crossover study.

    PubMed

    Takeishi, Soichi; Tsuboi, Hiroki; Takekoshi, Shodo

    2017-10-28

    To investigate whether sodium glucose co-transporter 2 inhibitors (SGLT2i), tofogliflozin or ipragliflozin, achieve optimal glycemic variability, when used together with insulin glargine 300 U/mL (Glargine 300). Thirty patients with type 2 diabetes were randomly allocated to 2 groups. For the first group: After admission, tofogliflozin 20 mg was administered; Fasting plasma glucose (FPG) levels were titrated using an algorithm and stabilized at 80 mg/dL level with Glargine 300 for 5 days; Next, glucose levels were continuously monitored for 2 days using continuous glucose monitoring (CGM); Tofogliflozin was then washed out over 5 days; Subsequently, ipragliflozin 50 mg was administered; FPG levels were titrated using the same algorithm and stabilized at 80 mg/dL level with Glargine 300 for 5 days; Next, glucose levels were continuously monitored for 2 days using CGM. For the second group, ipragliflozin was administered prior to tofogliflozin, and the same regimen was maintained. Glargine 300 and SGLT2i were administered at 8:00 AM. Data collected on the second day of measurement (mean amplitude of glycemic excursion [MAGE], average daily risk range [ADRR]; on all days of measurement) were analyzed. Area over the glucose curve (<70 mg/dL; 0:00 to 6:00, 24-h), M value, standard deviation, MAGE, ADRR, and mean glucose levels (24-h, 8:00 to 24:00) were significantly lower in patients on tofogliflozin than in those on ipragliflozin. Tofogliflozin, which reduces glycemic variability by preventing nocturnal hypoglycemia and decreasing postprandial glucose levels, is an ideal SGLT2i when used together with Glargine 300 during basal insulin therapy.

  3. An Entropy-Based Measure of Dependence between Two Groups of Random Variables. Research Report. ETS RR-07-20

    ERIC Educational Resources Information Center

    Kong, Nan

    2007-01-01

    In multivariate statistics, the linear relationship among random variables has been fully explored in the past. This paper looks into the dependence of one group of random variables on another group of random variables using (conditional) entropy. A new measure, called the K-dependence coefficient or dependence coefficient, is defined using…
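
    The report's exact definition is truncated above; as one plausible illustration of an entropy-based dependence measure, the coefficient below normalizes the mutual information by H(Y), so it is 0 under independence and 1 when Y is a deterministic function of X. It is not necessarily the paper's K-dependence coefficient.

    ```python
    import numpy as np
    from collections import Counter

    def entropy(labels):
        n = len(labels)
        return -sum(c / n * np.log2(c / n) for c in Counter(labels).values())

    def dependence_coefficient(x, y):
        """(H(Y) - H(Y|X)) / H(Y) = I(X;Y) / H(Y) for discrete samples."""
        h_y = entropy(y)
        h_y_given_x = sum(
            (x.count(v) / len(x)) * entropy([yi for xi, yi in zip(x, y) if xi == v])
            for v in set(x))
        return (h_y - h_y_given_x) / h_y if h_y > 0 else 0.0

    x = ["a", "a", "b", "b", "a", "b"] * 50
    y = [0, 0, 1, 1, 0, 1] * 50                  # Y fully determined by X
    print(dependence_coefficient(x, y))           # -> 1.0
    ```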

  4. Complete versus partial preservation of mitral valve apparatus during mitral valve replacement: meta-analysis and meta-regression of 1535 patients.

    PubMed

    Sá, Michel Pompeu Barros De Oliveira; Escobar, Rodrigo Renda; Ferraz, Paulo Ernando; Vasconcelos, Frederico Pires; Lima, Ricardo Carvalho

    2013-11-01

    To determine whether there is any real difference between complete preservation (CP) and partial preservation (PP) of the mitral valve apparatus during mitral valve replacement (MVR) in terms of hard outcomes. MEDLINE, EMBASE, CENTRAL/CCTR, SciELO, LILACS, Google Scholar and the reference lists of relevant articles were searched for clinical studies that compared outcomes [30-day mortality, postoperative low cardiac output syndrome (LCOS), 5-year mortality or left ventricular ejection fraction (LVEF) before and after surgery] between MVR-CP and MVR-PP until July 2012. The principal summary measures were odds ratios (ORs) with 95% confidence intervals (CIs) for categorical variables (30-day mortality, postoperative LCOS, 5-year mortality), differences in means with standard errors (SEs) for continuous variables (LVEF before and after surgery), and P values (considered statistically significant when <0.05). The ORs were combined across studies using the DerSimonian-Laird random-effects weighted model; the same procedure was applied to the continuous variables, using the differences in means. Eight studies (2 randomized and 6 non-randomized) were identified, including a total of 1535 patients (597 for MVR-CP and 938 for MVR-PP). There was no significant difference between the MVR-CP and MVR-PP groups in the risk of 30-day mortality (OR 0.87; 95% CI 0.50-1.52; P = 0.63), postoperative LCOS (OR 0.35; 95% CI 0.11-1.08; P = 0.07) or 5-year mortality (OR 0.70; 95% CI 0.43-1.14; P = 0.15). With regard to LVEF, neither MVR-CP nor MVR-PP demonstrated a statistically significant change from before to after surgery, and the two strategies did not differ from each other. No publication bias was observed. We found evidence that argues against any superiority of either technique of preservation (complete or partial) of the mitral valve apparatus during MVR.
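
    The DerSimonian-Laird random-effects weighting named here is a short computation: estimate the between-study variance τ² by the method of moments from Cochran's Q, then pool with inverse-variance weights 1/(vᵢ + τ²). The study effects and variances below are hypothetical.

    ```python
    import numpy as np

    def dersimonian_laird(effects, variances):
        """Random-effects pooled estimate with a 95% confidence interval."""
        y, v = np.asarray(effects, float), np.asarray(variances, float)
        w = 1 / v
        fixed = np.sum(w * y) / np.sum(w)
        q = np.sum(w * (y - fixed) ** 2)                  # Cochran's Q
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - (len(y) - 1)) / c)           # moment estimate of tau^2
        w_star = 1 / (v + tau2)
        pooled = np.sum(w_star * y) / np.sum(w_star)
        se = np.sqrt(1 / np.sum(w_star))
        return pooled, pooled - 1.96 * se, pooled + 1.96 * se

    # Hypothetical log odds ratios and within-study variances from 8 studies:
    print(dersimonian_laird([-0.2, 0.1, -0.4, 0.0, -0.3, -0.1, 0.2, -0.5],
                            [0.04, 0.09, 0.06, 0.05, 0.08, 0.07, 0.10, 0.06]))
    ```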

  5. A method for fitting regression splines with varying polynomial order in the linear mixed model.

    PubMed

    Edwards, Lloyd J; Stewart, Paul W; MacDougall, James E; Helms, Ronald W

    2006-02-15

    The linear mixed model has become a widely used tool for longitudinal analysis of continuous variables. The use of regression splines in these models offers the analyst additional flexibility in the formulation of descriptive analyses, exploratory analyses and hypothesis-driven confirmatory analyses. We propose a method for fitting piecewise polynomial regression splines with varying polynomial order in the fixed effects and/or random effects of the linear mixed model. The polynomial segments are explicitly constrained by side conditions for continuity and some smoothness at the points where they join. By using a reparameterization of this explicitly constrained linear mixed model, an implicitly constrained linear mixed model is constructed that simplifies implementation of fixed-knot regression splines. The proposed approach is relatively simple, handles splines in one variable or multiple variables, and can be easily programmed using existing commercial software such as SAS or S-plus. The method is illustrated using two examples: an analysis of longitudinal viral load data from a study of subjects with acute HIV-1 infection and an analysis of 24-hour ambulatory blood pressure profiles.
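
    As a sketch of the explicit-constraint idea: a truncated power basis of degree d yields a piecewise polynomial that is automatically continuous, with d - 1 continuous derivatives, at each knot. The varying-order and mixed-model extensions of the paper are not reproduced; the knots, degree and data below are arbitrary.

    ```python
    import numpy as np

    def truncated_power_basis(t, knots, degree=2):
        """Design columns for a degree-d regression spline: 1, t, ..., t^d,
        plus one truncated power term (t - k)_+^d per knot, which enforces
        continuity up to order d-1 at the knots."""
        t = np.asarray(t, float)
        cols = [t ** p for p in range(degree + 1)]
        cols += [np.clip(t - k, 0, None) ** degree for k in knots]
        return np.column_stack(cols)

    # Fit a piecewise-quadratic mean profile by ordinary least squares:
    t = np.linspace(0, 10, 200)
    y = np.sin(t) + np.random.default_rng(7).normal(0, 0.2, t.size)
    X = truncated_power_basis(t, knots=[3.0, 7.0], degree=2)
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    print("fitted spline coefficients:", beta.round(3))
    ```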

  6. Anomalous dispersion in correlated porous media: a coupled continuous time random walk approach

    NASA Astrophysics Data System (ADS)

    Comolli, Alessandro; Dentz, Marco

    2017-09-01

    We study the causes of anomalous dispersion in Darcy-scale porous media characterized by spatially heterogeneous hydraulic properties. Spatial variability in hydraulic conductivity leads to spatial variability in the flow properties through Darcy's law and thus impacts on solute and particle transport. We consider purely advective transport in heterogeneity scenarios characterized by broad distributions of heterogeneity length scales and point values. Particle transport is characterized in terms of the stochastic properties of equidistantly sampled Lagrangian velocities, which are determined by the flow and conductivity statistics. The persistence length scales of flow and transport velocities are imprinted in the spatial disorder and reflect the distribution of heterogeneity length scales. Particle transitions over the velocity length scales are kinematically coupled with the transition time through velocity. We show that the average particle motion follows a coupled continuous time random walk (CTRW), which is fully parameterized by the distribution of flow velocities and the medium geometry in terms of the heterogeneity length scales. The coupled CTRW provides a systematic framework for the investigation of the origins of anomalous dispersion in terms of heterogeneity correlation and the distribution of conductivity point values. We derive analytical expressions for the asymptotic scaling of the moments of the spatial particle distribution and first arrival time distribution (FATD), and perform numerical particle tracking simulations of the coupled CTRW to capture the full average transport behavior. Broad distributions of heterogeneity point values and length scales may lead to very similar dispersion behaviors in terms of the spatial variance. Their mechanisms, however, are very different, and this manifests in the distributions of particle positions and arrival times, which play a central role in predicting the fate of dissolved substances in heterogeneous natural and engineered porous materials. Contribution to the Topical Issue "Continuous Time Random Walk Still Trendy: Fifty-year History, Current State and Outlook", edited by Ryszard Kutner and Jaume Masoliver.
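
    A minimal coupled-CTRW simulation conveys the key mechanism: the step duration is kinematically coupled to a random velocity, t = ℓ/v, so a broad velocity distribution produces heavy-tailed waiting times and anomalous spreading. The velocity law and parameters below are illustrative assumptions, not the paper's flow statistics.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)

    def coupled_ctrw(n_particles=20000, n_steps=400, ell=1.0, alpha=0.8, t_obs=200.0):
        """Fixed-length steps ell; each step's duration is coupled to a random
        velocity via t = ell / v.  With P(V < v) = v**alpha on (0, 1), the
        waiting times have tail ~ t**(-alpha): heavy-tailed for alpha < 1."""
        v = rng.random((n_particles, n_steps)) ** (1.0 / alpha)
        arrival = (ell / v).cumsum(axis=1)            # time after each transition
        steps_done = (arrival <= t_obs).sum(axis=1)   # transitions completed by t_obs
        return ell * steps_done                        # particle positions at t_obs

    x = coupled_ctrw()
    print("mean displacement:", x.mean(), " variance:", x.var())
    ```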

  7. SLEEP AND MENTAL DISORDERS: A META-ANALYSIS OF POLYSOMNOGRAPHIC RESEARCH

    PubMed Central

    Baglioni, Chiara; Nanovska, Svetoslava; Regen, Wolfram; Spiegelhalder, Kai; Feige, Bernd; Nissen, Christoph; Reynolds, Charles F.; Riemann, Dieter

    2016-01-01

    Investigating sleep in mental disorders has the potential to reveal both disorder-specific and transdiagnostic psychophysiological mechanisms. This meta-analysis aimed at determining the polysomnographic (PSG) characteristics of several mental disorders. Relevant studies were searched through standard strategies. Controlled PSG studies evaluating sleep in affective, anxiety, eating, pervasive developmental, borderline and antisocial personality disorders, ADHD, and schizophrenia were included. PSG variables of sleep continuity, depth, and architecture, as well as rapid eye movement (REM) sleep were considered. Calculations were performed with the “Comprehensive Meta-Analysis” and “R” software packages. Using random-effects modeling, for each disorder and each variable a separate meta-analysis was conducted if at least 3 studies were available for calculation of effect sizes as standardized mean differences (Hedges' g). Sources of variability, i.e., sex, age, and comorbidity of mental disorders, were evaluated in subgroup analyses. Sleep alterations were evidenced in all disorders, with the exception of ADHD and seasonal affective disorders. Sleep continuity problems were observed in most mental disorders. Sleep depth and REM pressure alterations were associated with affective, anxiety, autism and schizophrenia disorders. Comorbidity was associated with enhanced REM sleep pressure and more inhibition of sleep depth. No sleep parameter was exclusively altered in one condition; however, no two conditions shared the same PSG profile. Sleep continuity disturbances imply a transdiagnostic imbalance in the arousal system, likely representing a basic dimension of mental health. Sleep depth and REM variables might play a key role in psychiatric comorbidity processes. Constellations of sleep alterations may define distinct disorders better than alterations in any single variable. PMID:27416139

  8. THE DISTRIBUTION OF ROUNDS FIRED IN STOCHASTIC DUELS

    DTIC Science & Technology

    This paper continues the development of the theory of Stochastic Duels to include the distribution of the number of rounds fired. Most generally...the duel between two contestants who fire at each other with constant kill probabilities per round is considered. The time between rounds fired may be...at the beginning of the duel may be limited and is a discrete random variable. Besides the distribution of rounds fired, its first two moments and

  9. A randomized controlled trial to compare the effects of sulphonylurea gliclazide MR (modified release) and the DPP-4 inhibitor vildagliptin on glycemic variability and control measured by continuous glucose monitoring (CGM) in Brazilian women with type 2 diabetes.

    PubMed

    Vianna, Andre Gustavo Daher; Lacerda, Claudio Silva; Pechmann, Luciana Muniz; Polesel, Michelle Garcia; Marino, Emerson Cestari; Faria-Neto, Jose Rocha

    2018-05-01

    This study aims to evaluate whether there is a difference between the effects of vildagliptin and gliclazide MR (modified release) on glycemic variability (GV) in women with type 2 diabetes (T2DM), as evaluated by continuous glucose monitoring (CGM). An open-label, randomized study was conducted in T2DM women on steady-dose metformin monotherapy who were treated with 50 mg vildagliptin twice daily or 60-120 mg of gliclazide MR once daily. CGM and calculation of GV indices were performed at baseline and after 24 weeks. In total, 42 patients (age: 61.9 ± 5.9 years, baseline glycated hemoglobin (HbA1c): 7.3 ± 0.56%) were selected and 37 completed the 24-week protocol. Vildagliptin and gliclazide MR reduced GV, as measured by the mean amplitude of glycemic excursions (MAGE, p = 0.007 and 0.034, respectively). The difference between the groups did not reach statistical significance. Vildagliptin also significantly decreased the standard deviation of the mean glucose (SD) and the mean of the daily differences (MODD) (p = 0.007 and 0.030). Vildagliptin and gliclazide MR similarly reduced the MAGE in women with T2DM after 24 weeks of treatment. Further studies are required to attest to differences between vildagliptin and gliclazide MR regarding glycemic variability. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. Effect of source tampering in the security of quantum cryptography

    NASA Astrophysics Data System (ADS)

    Sun, Shi-Hai; Xu, Feihu; Jiang, Mu-Sheng; Ma, Xiang-Chun; Lo, Hoi-Kwong; Liang, Lin-Mei

    2015-08-01

    The security of the source has become an increasingly important issue in quantum cryptography. In the framework of measurement-device-independent quantum key distribution (MDI-QKD), the source is the only region exploitable by a potential eavesdropper (Eve). Phase randomization is a cornerstone assumption in most discrete-variable (DV) quantum communication protocols (e.g., QKD, quantum coin tossing, weak-coherent-state blind quantum computing, and so on), and the violation of such an assumption is thus fatal to the security of those protocols. In this paper, we show a simple quantum hacking strategy, using commercial and homemade pulsed lasers, that allows Eve to actively tamper with the source and violate this assumption without leaving a trace afterwards. Furthermore, our attack may also be valid for continuous-variable (CV) QKD, the other main class of QKD protocol, since, beyond the phase randomization assumption, other parameters that directly determine the security of CV-QKD (e.g., intensity) could also be changed.

  11. Signal and noise extraction from analog memory elements for neuromorphic computing.

    PubMed

    Gong, N; Idé, T; Kim, S; Boybat, I; Sebastian, A; Narayanan, V; Ando, T

    2018-05-29

    Dense crossbar arrays of non-volatile memory (NVM) can potentially enable massively parallel and highly energy-efficient neuromorphic computing systems. The key requirements for the NVM elements are continuous (analog-like) conductance tuning capability and switching symmetry with acceptable noise levels. However, most NVM devices show non-linear and asymmetric switching behaviors. Such non-linear behaviors render the separation of signal and noise extremely difficult with conventional characterization techniques. In this study, we establish a practical methodology based on Gaussian process regression to address this issue. The methodology is agnostic to switching mechanisms and applicable to various NVM devices. We show the tradeoff between switching symmetry and signal-to-noise ratio for HfO2-based resistive random access memory. Then, we characterize 1000 phase-change memory devices based on Ge2Sb2Te5 and separate the total variability into device-to-device variability and the inherent randomness of individual devices. These results highlight the usefulness of our methodology for realizing ideal NVM devices for neuromorphic computing.
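
    A sketch of the Gaussian-process idea with scikit-learn: an RBF kernel captures the smooth conductance response while a WhiteKernel absorbs the noise floor, so the fitted noise level separates inherent randomness from the switching signal. The synthetic response curve and noise level are assumptions.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel, ConstantKernel

    rng = np.random.default_rng(9)
    pulse = np.arange(100).reshape(-1, 1)            # programming pulse index
    true_g = 1 - np.exp(-pulse.ravel() / 30)         # non-linear conductance response
    g = true_g + rng.normal(0, 0.08, 100)            # read noise on top

    # Smooth trend (ConstantKernel * RBF) plus a fitted noise floor (WhiteKernel).
    kernel = ConstantKernel(1.0) * RBF(10.0) + WhiteKernel(0.01)
    gp = GaussianProcessRegressor(kernel=kernel).fit(pulse, g)
    signal = gp.predict(pulse)

    print("estimated noise SD:", np.sqrt(gp.kernel_.k2.noise_level), "(true 0.08)")
    print("max |signal - true|:", np.abs(signal - true_g).max())
    ```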

  12. Contextuality in canonical systems of random variables

    NASA Astrophysics Data System (ADS)

    Dzhafarov, Ehtibar N.; Cervantes, Víctor H.; Kujala, Janne V.

    2017-10-01

    Random variables representing measurements, broadly understood to include any responses to any inputs, form a system in which each of them is uniquely identified by its content (that which it measures) and its context (the conditions under which it is recorded). Two random variables are jointly distributed if and only if they share a context. In a canonical representation of a system, all random variables are binary, and every content-sharing pair of random variables has a unique maximal coupling (the joint distribution imposed on them so that they coincide with maximal possible probability). The system is contextual if these maximal couplings are incompatible with the joint distributions of the context-sharing random variables. We propose to represent any system of measurements in a canonical form and to consider the system contextual if and only if its canonical representation is contextual. As an illustration, we establish a criterion for contextuality of the canonical system consisting of all dichotomizations of a single pair of content-sharing categorical random variables. This article is part of the themed issue `Second quantum revolution: foundational questions'.
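
    For two binary (Bernoulli) variables the maximal coupling is explicit: the largest achievable agreement probability is 1 - |p - q|, attained by stacking as much mass as possible on the diagonal. A small sketch of that construction:

    ```python
    def maximal_coupling(p, q):
        """Joint pmf over (x, y) in {0,1}^2 with Bernoulli(p), Bernoulli(q)
        margins that maximizes P(X == Y) = 1 - |p - q|."""
        both1 = min(p, q)
        both0 = min(1 - p, 1 - q)
        return {(1, 1): both1, (0, 0): both0,
                (1, 0): p - both1, (0, 1): q - both1}

    joint = maximal_coupling(0.7, 0.4)
    print(joint)
    print("P(X == Y) =", joint[(1, 1)] + joint[(0, 0)])   # 1 - |0.7 - 0.4| = 0.7
    ```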

  13. Continuous glucose monitoring to assess the ecologic validity of dietary glycemic index and glycemic load

    PubMed Central

    Ebbeling, Cara B; Wadden, Thomas A; Ludwig, David S

    2011-01-01

    Background: The circumstances under which the glycemic index (GI) and glycemic load (GL) are derived do not reflect real-world eating behavior. Thus, the ecologic validity of these constructs is incompletely known. Objective: This study examined the relation of dietary intake to glycemic response when foods are consumed under free-living conditions. Design: Participants were 26 overweight or obese adults with type 2 diabetes who participated in a randomized trial of lifestyle modification. The current study includes baseline data, before initiation of the intervention. Participants wore a continuous glucose monitor and simultaneously kept a food diary for 3 d. The dietary variables included GI, GL, and intakes of energy, fat, protein, carbohydrate, sugars, and fiber. The glycemic response variables included AUC, mean and SD of continuous glucose monitoring (CGM) values, percentage of CGM values in euglycemic and hyperglycemic ranges, and mean amplitude of glycemic excursions. Relations between daily dietary intake and glycemic outcomes were examined. Results: Data were available from 41 d of monitoring. Partial correlations, controlled for energy intake, indicated that GI or GL was significantly associated with each glycemic response outcome. In multivariate analyses, dietary GI accounted for 10% to 18% of the variance in each glycemic variable, independent of energy and carbohydrate intakes (P < 0.01). Conclusions: The data support the ecologic validity of the GI and GL constructs in free-living obese adults with type 2 diabetes. GI was the strongest and most consistent independent predictor of glycemic stability and variability. PMID:22071699

  14. A universal self-charging system driven by random biomechanical energy for sustainable operation of mobile electronics

    NASA Astrophysics Data System (ADS)

    Niu, Simiao; Wang, Xiaofeng; Yi, Fang; Zhou, Yu Sheng; Wang, Zhong Lin

    2015-12-01

    Human biomechanical energy is characterized by fluctuating amplitudes and variable low frequency, and an effective utilization of such energy cannot be achieved by classical energy-harvesting technologies. Here we report a highly efficient self-charging power system for sustainable operation of mobile electronics exploiting exclusively human biomechanical energy, which consists of a high-output triboelectric nanogenerator, a power management circuit to convert the random a.c. energy to d.c. electricity at 60% efficiency, and an energy storage device. With palm tapping as the only energy source, this power unit provides a continuous d.c. electricity of 1.044 mW (7.34 W m⁻³) in a regulated and managed manner. This self-charging unit can be universally applied as a standard `infinite-lifetime' power source for continuously driving numerous conventional electronics, such as thermometers, electrocardiograph systems, pedometers, wearable watches, scientific calculators and wireless radio-frequency communication systems, which indicates the immediate and broad applications in personal sensor systems and the internet of things.

  15. A universal self-charging system driven by random biomechanical energy for sustainable operation of mobile electronics.

    PubMed

    Niu, Simiao; Wang, Xiaofeng; Yi, Fang; Zhou, Yu Sheng; Wang, Zhong Lin

    2015-12-11

    Human biomechanical energy is characterized by fluctuating amplitudes and variable low frequency, and an effective utilization of such energy cannot be achieved by classical energy-harvesting technologies. Here we report a highly efficient self-charging power system for sustainable operation of mobile electronics exploiting exclusively human biomechanical energy, which consists of a high-output triboelectric nanogenerator, a power management circuit to convert the random a.c. energy to d.c. electricity at 60% efficiency, and an energy storage device. With palm tapping as the only energy source, this power unit provides a continuous d.c. electricity of 1.044 mW (7.34 W m⁻³) in a regulated and managed manner. This self-charging unit can be universally applied as a standard 'infinite-lifetime' power source for continuously driving numerous conventional electronics, such as thermometers, electrocardiograph systems, pedometers, wearable watches, scientific calculators and wireless radio-frequency communication systems, which indicates the immediate and broad applications in personal sensor systems and the internet of things.

  16. Naltrexone and Cognitive Behavioral Therapy for the Treatment of Alcohol Dependence

    PubMed Central

    Baros, AM; Latham, PK; Anton, RF

    2008-01-01

    Background Sex differences in regard to pharmacotherapy for alcoholism are a topic of concern following publications suggesting that naltrexone, one of the longest-approved treatments for alcoholism, is not as effective in women as in men. This study combined two randomized placebo-controlled clinical trials that used similar methodologies and personnel, amalgamating the data to evaluate sex effects in a reasonably sized sample. Methods 211 alcoholics (57 female; 154 male) were randomized to the naltrexone/CBT or placebo/CBT arm of the two clinical trials analyzed. Baseline variables were examined for differences between sex and treatment groups via analysis of variance (ANOVA) for continuous variables or chi-square tests for categorical variables. All initial outcome analysis was conducted under an intent-to-treat analysis plan. Effect sizes for naltrexone over placebo were determined by Cohen's d. Results The effect size of naltrexone over placebo was similar in men and women for %days abstinent (PDA, d=0.36), %heavy drinking days (PHDD, d=0.36), and total standard drinks (TSD, d=0.36). Only in men were the differences significant, secondary to the larger sample size (PDA p=0.03; PHDD p=0.03; TSD p=0.04). For a few variables (GGT change from baseline to week 12: men d=0.36, p=0.05; women d=0.20, p=0.45; and drinks per drinking day: men d=0.36, p=0.05; women d=0.28, p=0.34) the naltrexone effect size was greater in men than in women. In women, naltrexone tended to increase continuous abstinent days before a first drink (women d=0.46, p=0.09; men d=0.00, p=0.44). Conclusions The effect size of naltrexone over placebo appeared similar in women and men in our hands, suggesting that reported sex differences in naltrexone response may reflect sample size and/or endpoint drinking variables rather than any inherent pharmacological or biological difference in response. PMID:18336635
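
    For reference, the effect-size measure used above is Cohen's d, the difference in group means divided by the pooled standard deviation. A small Python sketch on simulated arm data (the means, SDs, and group sizes below are illustrative, not the trial's):

      # Cohen's d for an active-vs-placebo contrast on a continuous outcome.
      import numpy as np

      def cohens_d(a, b):
          """Standardized mean difference using the pooled SD."""
          na, nb = len(a), len(b)
          pooled_var = ((na - 1) * np.var(a, ddof=1) +
                        (nb - 1) * np.var(b, ddof=1)) / (na + nb - 2)
          return (np.mean(a) - np.mean(b)) / np.sqrt(pooled_var)

      rng = np.random.default_rng(1)
      naltrexone = rng.normal(62, 25, 105)   # %days abstinent, active arm (simulated)
      placebo = rng.normal(53, 25, 106)      # %days abstinent, placebo arm (simulated)
      print(round(cohens_d(naltrexone, placebo), 2))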

  17. Multi-Agent Methods for the Configuration of Random Nanocomputers

    NASA Technical Reports Server (NTRS)

    Lawson, John W.

    2004-01-01

    As computational devices continue to shrink, the cost of manufacturing such devices is expected to grow exponentially. One alternative to the costly, detailed design and assembly of conventional computers is to place the nano-electronic components randomly on a chip. The price for such a trivial assembly process is that the resulting chip would not be programmable by conventional means. In this work, we show that such random nanocomputers can be adaptively programmed using multi-agent methods. This is accomplished through the optimization of an associated high-dimensional error function. By representing each of the independent variables as a reinforcement learning agent, we are able to achieve convergence much faster than with other methods, including simulated annealing. Standard combinational logic circuits such as adders and multipliers are implemented in a straightforward manner. In addition, we show that the intrinsic flexibility of these adaptive methods allows the random computers to be reconfigured easily, making them reusable. Recovery from faults is also demonstrated.
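
    The paper's exact agent update rule is not given in the abstract; the following Python sketch only shows the general flavor of the approach, with one learning agent per binary configuration variable nudging its sampling probability whenever a sampled configuration matches or improves the best error seen (the error function and dimensions are toy assumptions):

      # Per-variable learning agents minimizing a high-dimensional error function.
      import numpy as np

      rng = np.random.default_rng(2)
      target = rng.integers(0, 2, 32)            # hidden "correct" configuration (toy)
      error = lambda c: np.mean(c != target)     # error function over configurations

      p = np.full(32, 0.5)                       # each agent's P(bit = 1)
      best = 1.0
      for _ in range(2000):
          c = (rng.random(32) < p).astype(int)   # agents sample their bits
          e = error(c)
          if e <= best:                          # reward: nudge toward this sample
              best = e
              p += 0.1 * (c - p)
          p = np.clip(p, 0.01, 0.99)             # keep exploration alive
      print(best)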

  18. Can adverse maternal and perinatal outcomes be predicted when blood pressure becomes elevated? Secondary analyses from the CHIPS (Control of Hypertension In Pregnancy Study) randomized controlled trial.

    PubMed

    Magee, Laura A; von Dadelszen, Peter; Singer, Joel; Lee, Terry; Rey, Evelyne; Ross, Susan; Asztalos, Elizabeth; Murphy, Kellie E; Menzies, Jennifer; Sanchez, Johanna; Gafni, Amiram; Gruslin, Andrée; Helewa, Michael; Hutton, Eileen; Lee, Shoo K; Logan, Alexander G; Ganzevoort, Wessel; Welch, Ross; Thornton, Jim G; Moutquin, Jean Marie

    2016-07-01

    For women with chronic or gestational hypertension in CHIPS (Control of Hypertension In Pregnancy Study, NCT01192412), we aimed to examine whether clinical predictors collected at randomization could predict adverse outcomes. This was a planned, secondary analysis of data from the 987 women in the CHIPS Trial. Logistic regression was used to examine the impact of 19 candidate predictors on the probability of adverse perinatal (pregnancy loss or high level neonatal care for >48 h, or birthweight <10th percentile) or maternal outcomes (severe hypertension, preeclampsia, or delivery at <34 or <37 weeks). A model containing all candidate predictors was used to start the stepwise regression process based on goodness of fit as measured by the Akaike information criterion. For face validity, these variables were forced into the model: treatment group ("less tight" or "tight" control), antihypertensive type at randomization, and blood pressure within 1 week before randomization. Continuous variables were represented continuously or dichotomized based on the smaller p-value in univariate analyses. An area-under-the-receiver-operating-curve (AUC ROC) of ≥0.70 was taken to reflect a potentially useful model. Point estimates for AUC ROC were <0.70 for all but severe hypertension (0.70, 95% CI 0.67-0.74) and delivery at <34 weeks (0.71, 95% CI 0.66-0.75). Therefore, no model warranted further assessment of performance. CHIPS data suggest that when women with chronic hypertension develop an elevated blood pressure in pregnancy, or formerly normotensive women develop new gestational hypertension, maternal and current pregnancy clinical characteristics cannot predict adverse outcomes in the index pregnancy. © 2016 The Authors. Acta Obstetricia et Gynecologica Scandinavica published by John Wiley & Sons Ltd on behalf of Nordic Federation of Societies of Obstetrics and Gynecology (NFOG).
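
    The modeling step described above, a logistic regression whose usefulness is judged by whether the AUC ROC reaches 0.70, can be sketched in Python as follows (synthetic data and placeholder predictors, not the CHIPS variables; the published analysis also used stepwise selection by AIC, which is not shown):

      # Logistic model for an adverse outcome with an AUC ROC usefulness check.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(3)
      X = rng.normal(size=(987, 5))              # candidate predictors (synthetic)
      logit = 0.8 * X[:, 0] - 0.5 * X[:, 1]
      y = rng.random(987) < 1 / (1 + np.exp(-logit))

      model = LogisticRegression().fit(X, y)
      auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
      print(f"AUC ROC = {auc:.2f}, potentially useful = {auc >= 0.70}")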

  19. The effects of demand uncertainty on strategic gaming in the merit-order electricity pool market

    NASA Astrophysics Data System (ADS)

    Frem, Bassam

    In a merit-order electricity pool market, generating companies (Gencos) game with their offered incremental cost to meet the electricity demand and earn bigger market shares and higher profits. However, when the demand is treated as a random variable instead of as a known constant, these Genco gaming strategies become more complex. After a brief introduction of electricity markets and gaming, the effects of demand uncertainty on strategic gaming are studied in two parts: (1) demand modelled as a discrete random variable, and (2) demand modelled as a continuous random variable. In the first part, we proposed the discrete stochastic strategy (DSS) algorithm, which generates a strategic set of offers from the perspective of the Gencos' profits. The DSS offers were tested and compared to the deterministic Nash equilibrium (NE) offers based on the predicted demand. This comparison, based on the expected Genco profits, showed the DSS to be a better strategy in a probabilistic sense than the deterministic NE. In the second part, we presented three gaming strategies: (1) deterministic NE, (2) no-risk, and (3) risk-taking. The strategies were then tested and their profit performances were compared using two assessment tools: (a) expected value and standard deviation, and (b) inverse cumulative distribution. We concluded that despite yielding higher profit performance under the right conjectures, risk-taking strategies are very sensitive to incorrect conjectures about the competitors' gaming decisions. As such, despite its lower profit performance, the no-risk strategy was deemed preferable.

  20. On the Use of the Beta Distribution in Probabilistic Resource Assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olea, Ricardo A., E-mail: olea@usgs.gov

    2011-12-15

    The triangular distribution is a popular choice when it comes to modeling bounded continuous random variables. Its wide acceptance derives mostly from its simple analytic properties and the ease with which modelers can specify its three parameters through the extremes and the mode. On the negative side, hardly any real process follows a triangular distribution, which from the outset puts at a disadvantage any model employing triangular distributions. At a time when numerical techniques such as the Monte Carlo method are displacing analytic approaches in stochastic resource assessments, easy specification remains the most attractive characteristic of the triangular distribution. The beta distribution is another continuous distribution defined within a finite interval offering wider flexibility in style of variation, thus allowing consideration of models in which the random variables closely follow the observed or expected styles of variation. Despite its more complex definition, generation of values following a beta distribution is as straightforward as generating values following a triangular distribution, leaving the selection of parameters as the main impediment to practically considering beta distributions. This contribution intends to promote the acceptance of the beta distribution by explaining its properties and offering several suggestions to facilitate the specification of its two shape parameters. In general, given the same distributional parameters, use of the beta distributions in stochastic modeling may yield significantly different results, yet better estimates, than the triangular distribution.
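
    One common way to make the beta distribution as easy to specify as the triangular, in the spirit of the suggestions above, is to set its two shape parameters from the mode and a concentration parameter kappa = a + b. A Python sketch comparing the two on the same bounds (the kappa value and bounds are illustrative, and this reparameterization is one common choice, not necessarily the paper's):

      # Beta distribution on [lo, hi] specified by mode and concentration,
      # next to its triangular counterpart.
      import numpy as np

      rng = np.random.default_rng(4)
      lo, mode, hi, kappa = 10.0, 25.0, 80.0, 6.0

      m = (mode - lo) / (hi - lo)                # mode rescaled to [0, 1]
      a = m * (kappa - 2) + 1                    # valid for kappa > 2
      b = (1 - m) * (kappa - 2) + 1

      tri = rng.triangular(lo, mode, hi, 100_000)
      bet = lo + (hi - lo) * rng.beta(a, b, 100_000)
      print(tri.mean(), bet.mean())              # same bounds and mode,
                                                 # different style of variation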

  1. On the minimum of independent geometrically distributed random variables

    NASA Technical Reports Server (NTRS)

    Ciardo, Gianfranco; Leemis, Lawrence M.; Nicol, David

    1994-01-01

    The expectations E(X_(1)), E(Z_(1)), and E(Y_(1)) of the minimum of n independent geometric, modified geometric, or exponential random variables with matching expectations differ. We show how this is accounted for by stochastic variability and how E(X_(1))/E(Y_(1)) equals the expected number of ties at the minimum for the geometric random variables. We then introduce the 'shifted geometric distribution' and show that there is a unique value of the shift for which the individual shifted geometric and exponential random variables match expectations both individually and in the minimum.
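
    The ratio claim is easy to check by simulation: compare the minimum of n i.i.d. geometric variables with the minimum of exponential variables of matching mean, and count the components tied at the geometric minimum. A Python sketch (parameters chosen arbitrarily):

      # Minima of geometric vs exponential variables with matching means,
      # plus the expected number of ties at the geometric minimum.
      import numpy as np

      rng = np.random.default_rng(5)
      n, p, reps = 5, 0.3, 200_000
      geo = rng.geometric(p, (reps, n))                 # mean 1/p each
      exp = rng.exponential(1 / p, (reps, n))           # matching mean

      g_min = geo.min(axis=1)
      e_min = exp.min(axis=1)
      ties = (geo == g_min[:, None]).sum(axis=1)        # components equal to the min
      print(g_min.mean() / e_min.mean(), ties.mean())   # the two numbers agree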

  2. Descriptive parameter for photon trajectories in a turbid medium

    NASA Astrophysics Data System (ADS)

    Gandjbakhche, Amir H.; Weiss, George H.

    2000-06-01

    In many applications of laser techniques for diagnostic or therapeutic purposes it is necessary to be able to characterize photon trajectories to know which parts of the tissue are being interrogated. In this paper, we consider the cw reflectance experiment on a semi-infinite medium with uniform optical parameters and having a planar interface. The analysis is carried out in terms of a continuous-time random walk and the relation between the occupancy of a plane parallel to the surface to the maximum depth reached by the random walker is studied. The first moment of the ratio of average depth to the average maximum depth yields information about the volume of tissue interrogated as well as giving some indication of the region of tissue that gets the most light. We have also calculated the standard deviation of this random variable. It is not large enough to qualitatively affect information contained in the first moment.
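
    A heavily idealized Python sketch of the depth statistics in question, using a 1-D lattice walk below an absorbing surface and the per-trajectory ratio of mean occupied depth to maximum depth (a simplification of the paper's continuous-time random walk, and of its ratio of averages):

      # Toy half-space walk: ratio of mean depth to maximum depth per trajectory.
      import numpy as np

      rng = np.random.default_rng(19)
      ratios = []
      for _ in range(2000):
          steps = rng.choice((-1, 1), size=5000)
          depth = 1 + np.cumsum(steps)                 # depth after each step
          absorbed = np.nonzero(depth == 0)[0]
          end = absorbed[0] if absorbed.size else depth.size  # truncate rare survivors
          path = np.concatenate(([1], depth[:end]))
          ratios.append(path.mean() / path.max())
      print(np.mean(ratios), np.std(ratios))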

  3. Sleep and mental disorders: A meta-analysis of polysomnographic research.

    PubMed

    Baglioni, Chiara; Nanovska, Svetoslava; Regen, Wolfram; Spiegelhalder, Kai; Feige, Bernd; Nissen, Christoph; Reynolds, Charles F; Riemann, Dieter

    2016-09-01

    Investigating sleep in mental disorders has the potential to reveal both disorder-specific and transdiagnostic psychophysiological mechanisms. This meta-analysis aimed at determining the polysomnographic (PSG) characteristics of several mental disorders. Relevant studies were searched through standard strategies. Controlled PSG studies evaluating sleep in affective, anxiety, eating, pervasive developmental, borderline and antisocial personality disorders, attention-deficit-hyperactivity disorder (ADHD), and schizophrenia were included. PSG variables of sleep continuity, depth, and architecture, as well as rapid-eye movement (REM) sleep were considered. Calculations were performed with the "Comprehensive Meta-Analysis" and "R" software. Using random effects modeling, for each disorder and each variable, a separate meta-analysis was conducted if at least 3 studies were available for calculation of effect sizes as standardized means (Hedges' g). Sources of variability, that is, sex, age, and mental disorder comorbidity, were evaluated in subgroup analyses. Sleep alterations were evidenced in all disorders, with the exception of ADHD and seasonal affective disorders. Sleep continuity problems were observed in most mental disorders. Sleep depth and REM pressure alterations were associated with affective, anxiety, autism and schizophrenia disorders. Comorbidity was associated with enhanced REM sleep pressure and more inhibition of sleep depth. No sleep parameter was exclusively altered in 1 condition; however, no 2 conditions shared the same PSG profile. Sleep continuity disturbances imply a transdiagnostic imbalance in the arousal system likely representing a basic dimension of mental health. Sleep depth and REM variables might play a key role in psychiatric comorbidity processes. Constellations of sleep alterations may define distinct disorders better than alterations in 1 single variable. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  4. A Multivariate Randomization Test of Association Applied to Cognitive Test Results

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert; Beard, Bettina

    2009-01-01

    Randomization tests provide a conceptually simple, distribution-free way to implement significance testing. We have applied this method to the problem of evaluating the significance of the association among a number (k) of variables. The randomization method was the random re-ordering of k-1 of the variables. The criterion variable was the value of the largest eigenvalue of the correlation matrix.
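
    The procedure is short enough to state in code: compute the largest eigenvalue of the correlation matrix, then rebuild the null distribution by randomly re-ordering k-1 of the variables. A Python sketch on synthetic data:

      # Randomization test of association via the largest eigenvalue of the
      # correlation matrix; the null comes from re-ordering k-1 variables.
      import numpy as np

      rng = np.random.default_rng(6)
      n, k = 60, 4
      X = rng.normal(size=(n, k))
      X[:, 1] += 0.6 * X[:, 0]                   # build in some association

      def largest_eig(data):
          return np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[-1]

      observed = largest_eig(X)
      null = []
      for _ in range(2000):
          Xp = X.copy()
          for j in range(1, k):                  # re-order k-1 of the variables
              Xp[:, j] = rng.permutation(Xp[:, j])
          null.append(largest_eig(Xp))
      p_value = np.mean(np.array(null) >= observed)
      print(observed, p_value)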

  5. Echocardiographic agreement in the diagnostic evaluation for infective endocarditis.

    PubMed

    Lauridsen, Trine Kiilerich; Selton-Suty, Christine; Tong, Steven; Afonso, Luis; Cecchi, Enrico; Park, Lawrence; Yow, Eric; Barnhart, Huiman X; Paré, Carlos; Samad, Zainab; Levine, Donald; Peterson, Gail; Stancoven, Amy Butler; Johansson, Magnus Carl; Dickerman, Stuart; Tamin, Syahidah; Habib, Gilbert; Douglas, Pamela S; Bruun, Niels Eske; Crowley, Anna Lisa

    2016-07-01

    Echocardiography is essential for the diagnosis and management of infective endocarditis (IE). However, the reproducibility for the echocardiographic assessment of variables relevant to IE is unknown. Objectives of this study were: (1) To define the reproducibility for IE echocardiographic variables and (2) to describe a methodology for assessing quality in an observational cohort containing site-interpreted data. IE reproducibility was assessed on a subset of echocardiograms from subjects enrolled in the International Collaboration on Endocarditis registry. Specific echocardiographic case report forms were used. Intra-observer agreement was assessed from six site readers on ten randomly selected echocardiograms. Inter-observer agreement between sites and an echocardiography core laboratory was assessed on a separate random sample of 110 echocardiograms. Agreement was determined using intraclass correlation (ICC), coverage probability (CP), and limits of agreement for continuous variables and kappa statistics (κweighted) and CP for categorical variables. Intra-observer agreement for LVEF was excellent [ICC = 0.93 ± 0.1 and all pairwise differences for LVEF (CP) were within 10 %]. For IE categorical echocardiographic variables, intra-observer agreement was best for aortic abscess (κweighted = 1.0, CP = 1.0 for all readers). Highest inter-observer agreement for IE categorical echocardiographic variables was obtained for vegetation location (κweighted = 0.95; 95 % CI 0.92-0.99) and lowest agreement was found for vegetation mobility (κweighted = 0.69; 95 % CI 0.62-0.86). Moderate to excellent intra- and inter-observer agreement is observed for echocardiographic variables in the diagnostic assessment of IE. A pragmatic approach for determining echocardiographic data reproducibility in a large, multicentre, site interpreted observational cohort is feasible.

  6. High-speed free-space optical continuous-variable quantum key distribution enabled by three-dimensional multiplexing.

    PubMed

    Qu, Zhen; Djordjevic, Ivan B

    2017-04-03

    A high-speed four-state continuous-variable quantum key distribution (CV-QKD) system, enabled by wavelength-division multiplexing, polarization multiplexing, and orbital angular momentum (OAM) multiplexing, is studied in the presence of atmospheric turbulence. The atmospheric turbulence channel is emulated by two spatial light modulators (SLMs) on which two randomly generated azimuthal phase patterns yielding Andrews' spectrum are recorded. The phase noise is mitigated by the phase noise cancellation (PNC) stage, and channel transmittance can be monitored directly by the D.C. level in our PNC stage. After the system calibration, a total secret key rate (SKR) of >1.68 Gbit/s can be reached in the ideal system, featuring a lossless channel and no excess noise. In our experiment, based on commercial photodetectors, minimum transmittances of 0.21 and 0.29 are required for OAM states of 2 (or -2) and 6 (or -6), respectively, to guarantee secure transmission, while a total SKR of 120 Mbit/s can be obtained at the mean transmittances.

  7. Refining the Use of Nasal High-Flow Therapy as Primary Respiratory Support for Preterm Infants.

    PubMed

    Manley, Brett J; Roberts, Calum T; Frøisland, Dag H; Doyle, Lex W; Davis, Peter G; Owen, Louise S

    2018-05-01

    To identify clinical and demographic variables that predict nasal high-flow (nHF) treatment failure when used as a primary respiratory support for preterm infants. This secondary analysis used data from a multicenter, randomized, controlled trial comparing nHF with continuous positive airway pressure as primary respiratory support in preterm infants 28-36 completed weeks of gestation. Treatment success or failure with nHF was determined using treatment failure criteria within the first 72 hours after randomization. Infants in whom nHF treatment failed received continuous positive airway pressure, and were then intubated if failure criteria were again met. There were 278 preterm infants included, with a mean gestational age (GA) of 32.0 ± 2.1 weeks and a birth weight of 1737 ± 580 g; of these, nHF treatment failed in 71 infants (25.5%). Treatment failure was moderately predicted by a lower GA and higher prerandomization fraction of inspired oxygen (FiO2): area under a receiver operating characteristic curve of 0.76 (95% CI, 0.70-0.83). Nasal HF treatment success was more likely in infants born at ≥30 weeks GA and with prerandomization FiO2 <0.30. In preterm infants ≥28 weeks' GA enrolled in a randomized, controlled trial, lower GA and higher FiO2 before randomization predicted early nHF treatment failure. Infants were more likely to be successfully treated with nHF from soon after birth if they were born at ≥30 weeks GA and had a prerandomization FiO2 <0.30. However, even in this select population, continuous positive airway pressure remains superior to nHF as early respiratory support in preventing treatment failure. Australian New Zealand Clinical Trials Registry: ACTRN12613000303741. Copyright © 2018 Elsevier Inc. All rights reserved.

  8. Variable order fractional Fokker-Planck equations derived from Continuous Time Random Walks

    NASA Astrophysics Data System (ADS)

    Straka, Peter

    2018-08-01

    Continuous Time Random Walk models (CTRW) of anomalous diffusion are studied, where the anomalous exponent β(x) ∈ (0, 1) varies in space. This type of situation occurs e.g. in biophysics, where the density of the intracellular matrix varies throughout a cell. Scaling limits of CTRWs are known to have probability distributions which solve fractional Fokker-Planck type equations (FFPE). This correspondence between stochastic processes and FFPE solutions has many useful extensions, e.g. to nonlinear particle interactions and reactions, but has not yet been sufficiently developed for FFPEs of the "variable order" type with non-constant β(x). In this article, variable order FFPEs (VOFFPE) are derived from scaling limits of CTRWs. The key mathematical tool is the 1-1 correspondence of a CTRW scaling limit to a bivariate Langevin process, which tracks the cumulative sum of jumps in one component and the cumulative sum of waiting times in the other. The spatially varying anomalous exponent is modelled by spatially varying β(x)-stable Lévy noise in the waiting time component. The VOFFPE displays a spatially heterogeneous temporal scaling behaviour, with generalized diffusivity and drift coefficients whose units are length²/time^β(x) and length/time^β(x), respectively. A global change of the time scale results in a spatially varying change in diffusivity and drift. A consequence of the mathematical derivation of a VOFFPE from CTRW limits in this article is that a solution of a VOFFPE can be approximated via Monte Carlo simulations. Based on such simulations, we are able to confirm that the VOFFPE is consistent under a change of the global time scale.
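
    The Monte Carlo approximation mentioned in the closing sentences can be sketched as follows: Gaussian jumps in space, and a waiting time drawn from a totally skewed β(x)-stable law at the walker's current position. This Python sketch uses scipy's levy_stable for the one-sided stable draws; the β(x) profile, step sizes, and horizon are illustrative assumptions, not the paper's scheme:

      # CTRW with a space-dependent anomalous exponent beta(x).
      import numpy as np
      from scipy.stats import levy_stable

      rng = np.random.default_rng(7)
      beta = lambda x: 0.5 + 0.3 * np.tanh(x)        # anomalous exponent in (0, 1)

      def ctrw_path(t_max, dt=0.1):
          x, t = 0.0, 0.0
          while t < t_max:
              b = float(beta(x))
              # one-sided b-stable waiting time; dt**(1/b) is the subordinator scaling
              t += dt ** (1.0 / b) * levy_stable.rvs(b, 1.0, random_state=rng)
              x += np.sqrt(dt) * rng.normal()        # Gaussian jump in space
          return x

      # a handful of paths keeps the demo fast; levy_stable sampling is slow
      samples = [ctrw_path(2.0) for _ in range(50)]
      print(np.mean(samples), np.std(samples))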

  9. A Random Forest Approach to Predict the Spatial Distribution ...

    EPA Pesticide Factsheets

    Modeling the magnitude and distribution of sediment-bound pollutants in estuaries is often limited by incomplete knowledge of the site and inadequate sample density. To address these modeling limitations, a decision-support tool framework was conceived that predicts sediment contamination from the sub-estuary to broader estuary extent. For this study, a Random Forest (RF) model was implemented to predict the distribution of a model contaminant, triclosan (5-chloro-2-(2,4-dichlorophenoxy)phenol) (TCS), in Narragansett Bay, Rhode Island, USA. TCS is an unregulated contaminant used in many personal care products. The RF explanatory variables were associated with TCS transport and fate (proxies) and direct and indirect environmental entry. The continuous RF TCS concentration predictions were discretized into three levels of contamination (low, medium, and high) for three different quantile thresholds. The RF model explained 63% of the variance with a minimum number of variables. Total organic carbon (TOC) (transport and fate proxy) was a strong predictor of TCS contamination causing a mean squared error increase of 59% when compared to permutations of randomized values of TOC. Additionally, combined sewer overflow discharge (environmental entry) and sand (transport and fate proxy) were strong predictors. The discretization models identified a TCS area of greatest concern in the northern reach of Narragansett Bay (Providence River sub-estuary), which was validated wi
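
    A compact Python sketch of the modeling pattern described above, a random forest regressor whose predictors are ranked by permutation importance (synthetic stand-ins for TOC, CSO discharge, and sand; the real model used site covariates and a quantile discretization step not shown here):

      # Random forest regression with permutation importance for predictors.
      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.inspection import permutation_importance

      rng = np.random.default_rng(8)
      n = 300
      toc = rng.uniform(0, 5, n)                  # total organic carbon proxy
      cso = rng.uniform(0, 1, n)                  # sewer-overflow discharge proxy
      sand = rng.uniform(0, 100, n)               # deliberately uninformative here
      tcs = 2.0 * toc + 1.0 * cso + rng.normal(0, 0.5, n)

      X = np.column_stack([toc, cso, sand])
      rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, tcs)
      imp = permutation_importance(rf, X, tcs, n_repeats=10, random_state=0)
      for name, m in zip(["TOC", "CSO", "sand"], imp.importances_mean):
          print(name, round(m, 3))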

  10. Selecting Optimal Random Forest Predictive Models: A Case Study on Predicting the Spatial Distribution of Seabed Hardness

    PubMed Central

    Li, Jin; Tran, Maggie; Siwabessy, Justy

    2016-01-01

    Spatially continuous predictions of seabed hardness are important baseline environmental information for sustainable management of Australia’s marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy and can be inferred from underwater video footage at limited locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models to predict seabed hardness using random forest (RF) based on the point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods that are variable importance (VI), averaged variable importance (AVI), knowledge informed AVI (KIAVI), Boruta and regularized RF (RRF) were tested based on predictive accuracy. Effects of highly correlated, important and unimportant predictors on the accuracy of RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach used to pre-select predictive variables by excluding highly correlated variables needs to be re-examined; 4) the identification of the important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods select the most accurate predictive model(s) instead of the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to ‘small p and large n’ problems in environmental sciences. Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency and caution should be taken when applying filter FS methods in selecting predictive models. PMID:26890307
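
    Of the feature-selection methods named above, averaged variable importance (AVI) is the simplest to illustrate: importance scores are averaged over repeated random-forest fits before weak predictors are dropped. A simplified Python reading of that idea (data, repeat count, and threshold are toy choices, not the paper's protocol):

      # AVI-style selection: average RF importances over repeated fits.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(18)
      n = 400
      X = rng.normal(size=(n, 6))                    # backscatter-like predictors
      y = (X[:, 0] + 0.8 * X[:, 1] + rng.normal(0, 1, n)) > 0  # hardness class

      scores = np.zeros(X.shape[1])
      for seed in range(20):                         # average importance over repeats
          rf = RandomForestClassifier(n_estimators=200, random_state=seed)
          scores += rf.fit(X, y).feature_importances_
      scores /= 20
      keep = np.where(scores > scores.mean())[0]     # toy retention threshold
      print(scores.round(3), "keep:", keep)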

  11. Selecting Optimal Random Forest Predictive Models: A Case Study on Predicting the Spatial Distribution of Seabed Hardness.

    PubMed

    Li, Jin; Tran, Maggie; Siwabessy, Justy

    2016-01-01

    Spatially continuous predictions of seabed hardness are important baseline environmental information for sustainable management of Australia's marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy and can be inferred from underwater video footage at limited locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models to predict seabed hardness using random forest (RF) based on the point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods that are variable importance (VI), averaged variable importance (AVI), knowledge informed AVI (KIAVI), Boruta and regularized RF (RRF) were tested based on predictive accuracy. Effects of highly correlated, important and unimportant predictors on the accuracy of RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach used to pre-select predictive variables by excluding highly correlated variables needs to be re-examined; 4) the identification of the important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods select the most accurate predictive model(s) instead of the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to 'small p and large n' problems in environmental sciences. Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency and caution should be taken when applying filter FS methods in selecting predictive models.

  12. Continuous Decision Support

    DTIC Science & Technology

    2015-12-24

    but the fundamental approach remains unchanged. We consider the case of a sports memorabilia shop whose owner is an avid personal collector of baseball...collector's competition 15 days from now. Between now and then, as customers bring in antique baseball cards, he must decide which ones to purchase for his...purchased from the shop each day is a random variable that is Poisson distributed with λ_out = 2. • 20% of cards are 5.25 in², 10% are 9.97 in², and 70% are
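
    Only fragments of the problem statement survive in this excerpt, but the one fully specified element, daily purchases that are Poisson distributed with λ_out = 2, can be simulated directly. A minimal Python sketch of that piece alone (the 15-day horizon is taken from the excerpt; everything else is left unreconstructed):

      # Daily card sales as Poisson(2) over a 15-day horizon.
      import numpy as np

      rng = np.random.default_rng(9)
      daily_sales = rng.poisson(lam=2.0, size=(100_000, 15))
      print(daily_sales.sum(axis=1).mean())      # ~30 cards over 15 days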

  13. Effect of Acarbose on Glycemic Variability in Patients with Poorly Controlled Type 2 Diabetes Mellitus Receiving Stable Background Therapy: A Placebo-Controlled Trial.

    PubMed

    Derosa, Giuseppe; Franzetti, Ivano; Querci, Fabrizio; D'Angelo, Angela; Maffioli, Pamela

    2015-11-01

    To evaluate the effect of acarbose on glycemic control and glycemic variability, using a continuous glucose-monitoring system, in patients with type 2 diabetes mellitus who were not well controlled on metformin and vildagliptin therapy. Multicenter, randomized, double-blind, placebo-controlled study. Clinical research units at three hospitals in Italy. Fifty-three patients with type 2 diabetes who were taking stable dosages of metformin 850 mg 3 times/day and vildagliptin 50 mg twice/day for at least 3 months and who were not adequately controlled with these therapies. Patients were randomized to either placebo or acarbose 100 mg 3 times/day to be added to their metformin-vildagliptin regimen. Glycemic excursions were assessed by using a continuous glucose-monitoring system for 1 week. Glycemic control was estimated as the mean blood glucose (MBG) level, the area under the glucose concentration-time curve for a glucose level above 70 mg/dl (AUC above 70) or 180 mg/dl (AUC above 180), and the percentage of time that the glucose level was above 70 mg/dl (T above 70) or 180 mg/dl (T above 180). Intraday glycemic variability was assessed by the standard deviation of the blood glucose level, the mean amplitude of glycemic excursions (MAGE), the M value, and continuous overlapping net glycemic action. Day-to-day glycemic variability was assessed as the mean of daily difference (MODD). The MBG level was ~20 mg/dl lower in the acarbose group than in the placebo group (p<0.05), particularly during the postprandial period. The AUC above 70 did not significantly differ between the two groups, whereas the AUC above 180 was ~40% lower in the acarbose group than in the placebo group during the daytime (p<0.01). The T above 180 was significantly higher in the placebo group than in the acarbose group (31% vs 8%, p<0.01). Moreover, the standard deviation and MAGE values were significantly lower in the acarbose group. The MODD value was not significantly changed in either group, and no significant differences were recorded between groups. All adverse events were mild in both groups, with only a significantly greater frequency of flatulence noted in the acarbose group (5% with acarbose vs 0.5% with placebo, p<0.05). The addition of acarbose to metformin and vildagliptin background therapy in patients with inadequately controlled type 2 diabetes decreased intraday glycemic variability, especially postprandial variability, but it was not associated with a significant change in interday glycemic variability. © 2015 Pharmacotherapy Publications, Inc.
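
    Two of the variability metrics named above are simple enough to compute directly from a CGM trace: the within-day standard deviation and the mean of daily differences (MODD), the mean absolute difference between glucose values 24 h apart. A Python sketch on simulated data (MAGE is omitted because it requires an excursion-detection convention):

      # Within-day SD and MODD from a simulated CGM trace sampled every 5 min.
      import numpy as np

      rng = np.random.default_rng(10)
      per_day = 24 * 12                           # 5-min samples per day
      days = 7
      t = np.arange(per_day)
      base = 140 + 40 * np.sin(2 * np.pi * t / per_day)   # daily glucose pattern
      cgm = base + rng.normal(0, 15, (days, per_day))     # mg/dl, simulated

      sd_intraday = cgm.std(axis=1, ddof=1).mean()
      modd = np.abs(np.diff(cgm, axis=0)).mean()  # |g(t) - g(t - 24 h)| averaged
      print(round(sd_intraday, 1), round(modd, 1))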

  14. Using Structured Additive Regression Models to Estimate Risk Factors of Malaria: Analysis of 2010 Malawi Malaria Indicator Survey Data

    PubMed Central

    Chirombo, James; Lowe, Rachel; Kazembe, Lawrence

    2014-01-01

    Background After years of implementing Roll Back Malaria (RBM) interventions, the changing landscape of malaria in terms of risk factors and spatial pattern has not been fully investigated. This paper uses the 2010 malaria indicator survey data to investigate if known malaria risk factors remain relevant after many years of interventions. Methods We adopted a structured additive logistic regression model that allowed for spatial correlation, to more realistically estimate malaria risk factors. Our model included child and household level covariates, as well as climatic and environmental factors. Continuous variables were modelled by assuming second order random walk priors, while spatial correlation was specified as a Markov random field prior, with fixed effects assigned diffuse priors. Inference was fully Bayesian resulting in an under five malaria risk map for Malawi. Results Malaria risk increased with increasing age of the child. With respect to socio-economic factors, the greater the household wealth, the lower the malaria prevalence. A general decline in malaria risk was observed as altitude increased. Minimum temperatures and average total rainfall in the three months preceding the survey did not show a strong association with disease risk. Conclusions The structured additive regression model offered a flexible extension to standard regression models by enabling simultaneous modelling of possible nonlinear effects of continuous covariates, spatial correlation and heterogeneity, while estimating usual fixed effects of categorical and continuous observed variables. Our results confirmed that malaria epidemiology is a complex interaction of biotic and abiotic factors, both at the individual, household and community level and that risk factors are still relevant many years after extensive implementation of RBM activities. PMID:24991915
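
    For readers unfamiliar with the notation, the standard form of such a structured additive predictor and its priors can be written as follows (a generic formulation, not necessarily the authors' exact specification):

      \eta_i = \mathbf{x}_i^{\top}\boldsymbol{\beta} + \sum_{j} f_j(z_{ij}) + f_{\mathrm{spat}}(s_i),
      \qquad f_j(t) - 2 f_j(t-1) + f_j(t-2) \sim \mathcal{N}(0, \tau_j^{-1}),

    with the Markov random field prior conditioning each areal effect on its neighbours,

      f_{\mathrm{spat}}(s) \mid f_{\mathrm{spat}}(s'),\ s' \sim s \;\sim\; \mathcal{N}\Big( \tfrac{1}{n_s} \sum_{s' \sim s} f_{\mathrm{spat}}(s'),\ \tfrac{1}{n_s \tau_{\mathrm{spat}}} \Big),

    where n_s is the number of neighbours of area s. The second-order random walk on f_j is the prior the abstract assigns to continuous covariates, and the fixed effects β receive diffuse priors.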

  15. Using structured additive regression models to estimate risk factors of malaria: analysis of 2010 Malawi malaria indicator survey data.

    PubMed

    Chirombo, James; Lowe, Rachel; Kazembe, Lawrence

    2014-01-01

    After years of implementing Roll Back Malaria (RBM) interventions, the changing landscape of malaria in terms of risk factors and spatial pattern has not been fully investigated. This paper uses the 2010 malaria indicator survey data to investigate if known malaria risk factors remain relevant after many years of interventions. We adopted a structured additive logistic regression model that allowed for spatial correlation, to more realistically estimate malaria risk factors. Our model included child and household level covariates, as well as climatic and environmental factors. Continuous variables were modelled by assuming second order random walk priors, while spatial correlation was specified as a Markov random field prior, with fixed effects assigned diffuse priors. Inference was fully Bayesian resulting in an under five malaria risk map for Malawi. Malaria risk increased with increasing age of the child. With respect to socio-economic factors, the greater the household wealth, the lower the malaria prevalence. A general decline in malaria risk was observed as altitude increased. Minimum temperatures and average total rainfall in the three months preceding the survey did not show a strong association with disease risk. The structured additive regression model offered a flexible extension to standard regression models by enabling simultaneous modelling of possible nonlinear effects of continuous covariates, spatial correlation and heterogeneity, while estimating usual fixed effects of categorical and continuous observed variables. Our results confirmed that malaria epidemiology is a complex interaction of biotic and abiotic factors, both at the individual, household and community level and that risk factors are still relevant many years after extensive implementation of RBM activities.

  16. The impact of a disease management programme for type 2 diabetes on health-related quality of life: multilevel analysis of a cluster-randomised controlled trial.

    PubMed

    Panisch, Sigrid; Johansson, Tim; Flamm, Maria; Winkler, Henrike; Weitgasser, Raimund; Sönnichsen, Andreas C

    2018-01-01

    Type 2 diabetes is a chronic disease associated with poorer health outcomes and decreased health-related quality of life (HRQoL). The aim of this analysis was to explore the impact of a disease management programme (DMP) in type 2 diabetes on HRQoL. This was a cluster-randomised controlled trial analysis of the secondary endpoint HRQoL; a multilevel model was used to explain the variation in EQ-VAS. Our study population were general practitioners and patients in the province of Salzburg. The DMP "Therapie-Aktiv" was implemented in the intervention group, and controls received usual care. Outcome measure was a change in EQ-VAS after 12 months. For comparison of rates, we used Fisher's exact test; for continuous variables the independent t test or Welch test were used. In the multilevel modeling, we examined various models, continuously adding variables to explain the variation in the dependent variable, starting with an empty model including only the random intercept. We analysed random effects parameters in order to disentangle variation of the final EQ-VAS. The EQ-VAS significantly increased within the intervention group (mean difference 2.19, p = 0.005). There was no significant difference in EQ-VAS between groups (mean difference 1.00, p = 0.339). In the intervention group the improvement was more distinct in women (2.46, p = 0.036) compared to men (1.92, p = 0.063). In multilevel modeling, sex, age, family and work circumstances, any macrovascular diabetic complication, duration of diabetes, baseline body mass index and baseline EQ-VAS significantly influence final EQ-VAS, while DMP does not. The final model explains 28.9% (EQ-VAS) of the total variance. Most of the unexplained variance was found on patient-level (95%) and less on GP-level (5%). DMP "Therapie-Aktiv" has no significant impact on final EQ-VAS. The impact of DMPs in type 2 diabetes on HRQoL is still unclear and future programmes should focus on patient-specific needs and predictors in order to improve HRQoL. Trial registration Current Controlled Trials Ltd., ISRCTN27414162.
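
    The variance decomposition reported above (95% of unexplained variance at patient level, 5% at GP level) is the kind of output a random-intercept model provides directly. A minimal Python sketch with statsmodels on simulated data (column names, effect sizes, and group counts are hypothetical; the published model had many more covariates):

      # Two-level model of final EQ-VAS with a random intercept per GP practice.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(11)
      n_gp, per_gp = 40, 20
      gp = np.repeat(np.arange(n_gp), per_gp)
      gp_effect = rng.normal(0, 2, n_gp)[gp]             # small GP-level variance
      baseline = rng.normal(70, 15, n_gp * per_gp)
      dmp = rng.integers(0, 2, n_gp * per_gp)
      eqvas = (20 + 0.7 * baseline + 1.0 * dmp + gp_effect
               + rng.normal(0, 12, n_gp * per_gp))       # patient-level noise dominates

      df = pd.DataFrame(dict(eqvas=eqvas, baseline=baseline, dmp=dmp, gp=gp))
      fit = smf.mixedlm("eqvas ~ baseline + dmp", df, groups=df["gp"]).fit()
      print(fit.summary())                               # "Group Var" vs residual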

  17. The random continued fraction transformation

    NASA Astrophysics Data System (ADS)

    Kalle, Charlene; Kempton, Tom; Verbitskiy, Evgeny

    2017-03-01

    We introduce a random dynamical system related to continued fraction expansions. It uses random combinations of the Gauss map and the Rényi (or backwards) continued fraction map. We explore the continued fraction expansions that this system produces, as well as the dynamical properties of the system.
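
    Concretely, the two maps are G(x) = 1/x mod 1 (Gauss) and R(x) = 1/(1-x) mod 1 (Rényi, or backwards), applied in random order. A Python sketch that iterates a fair coin-flip mixture of the two and records the digit each application produces (the equal weights are an assumption for illustration):

      # Random iteration of the Gauss and Renyi continued fraction maps.
      import numpy as np

      rng = np.random.default_rng(12)
      x = rng.random()                           # starting point in (0, 1)
      digits = []
      for _ in range(10):
          if rng.random() < 0.5:                 # Gauss map G(x) = 1/x mod 1
              y = 1.0 / x
              digits.append(("G", int(y)))
          else:                                  # Renyi map R(x) = 1/(1-x) mod 1
              y = 1.0 / (1.0 - x)
              digits.append(("R", int(y)))
          x = y - int(y)
      print(digits)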

  18. A joint modeling and estimation method for multivariate longitudinal data with mixed types of responses to analyze physical activity data generated by accelerometers.

    PubMed

    Li, Haocheng; Zhang, Yukun; Carroll, Raymond J; Keadle, Sarah Kozey; Sampson, Joshua N; Matthews, Charles E

    2017-11-10

    A mixed effect model is proposed to jointly analyze multivariate longitudinal data with continuous, proportion, count, and binary responses. The association of the variables is modeled through the correlation of random effects. We use a quasi-likelihood type approximation for nonlinear variables and transform the proposed model into a multivariate linear mixed model framework for estimation and inference. Via an extension to the EM approach, an efficient algorithm is developed to fit the model. The method is applied to physical activity data, which uses a wearable accelerometer device to measure daily movement and energy expenditure information. Our approach is also evaluated by a simulation study. Copyright © 2017 John Wiley & Sons, Ltd.

  19. Turbulent, Extreme Multi-zone Model for Simulating Flux and Polarization Variability in Blazars

    NASA Astrophysics Data System (ADS)

    Marscher, Alan P.

    2014-01-01

    The author presents a model for variability of the flux and polarization of blazars in which turbulent plasma flowing at a relativistic speed down a jet crosses a standing conical shock. The shock compresses the plasma and accelerates electrons to energies up to γ_max ≳ 10⁴ times their rest-mass energy, with the value of γ_max determined by the direction of the magnetic field relative to the shock front. The turbulence is approximated in a computer code as many cells, each with a uniform magnetic field whose direction is selected randomly. The density of high-energy electrons in the plasma changes randomly with time in a manner consistent with the power spectral density of flux variations derived from observations of blazars. The variations in flux and polarization are therefore caused by continuous noise processes rather than by singular events such as explosive injection of energy at the base of the jet. Sample simulations illustrate the behavior of flux and linear polarization versus time that such a model produces. The variations in γ-ray flux generated by the code are often, but not always, correlated with those at lower frequencies, and many of the flares are sharply peaked. The mean degree of polarization of synchrotron radiation is higher and its timescale of variability shorter toward higher frequencies, while the polarization electric vector sometimes randomly executes apparent rotations. The slope of the spectral energy distribution exhibits sharper breaks than can arise solely from energy losses. All of these results correspond to properties observed in blazars.
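
    The many-cell construction has a well-known statistical consequence: summing N randomly oriented uniform-field cells in the Stokes parameters gives a net polarization that fluctuates around p_max/√N. A Python sketch of that single ingredient (p_max ≈ 0.75 for optically thin synchrotron; the cell count is an arbitrary choice, not the paper's):

      # Net polarization from N cells with randomly oriented uniform fields.
      import numpy as np

      rng = np.random.default_rng(13)
      p_max, n_cells = 0.75, 169

      def net_polarization():
          chi = rng.uniform(0, np.pi, n_cells)       # random field orientations
          q = np.mean(p_max * np.cos(2 * chi))       # Stokes Q per unit intensity
          u = np.mean(p_max * np.sin(2 * chi))       # Stokes U per unit intensity
          return np.hypot(q, u)

      sims = [net_polarization() for _ in range(5000)]
      print(np.mean(sims), p_max / np.sqrt(n_cells))  # same order of magnitude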

  20. On the Use of the Beta Distribution in Probabilistic Resource Assessments

    USGS Publications Warehouse

    Olea, R.A.

    2011-01-01

    The triangular distribution is a popular choice when it comes to modeling bounded continuous random variables. Its wide acceptance derives mostly from its simple analytic properties and the ease with which modelers can specify its three parameters through the extremes and the mode. On the negative side, hardly any real process follows a triangular distribution, which from the outset puts at a disadvantage any model employing triangular distributions. At a time when numerical techniques such as the Monte Carlo method are displacing analytic approaches in stochastic resource assessments, easy specification remains the most attractive characteristic of the triangular distribution. The beta distribution is another continuous distribution defined within a finite interval offering wider flexibility in style of variation, thus allowing consideration of models in which the random variables closely follow the observed or expected styles of variation. Despite its more complex definition, generation of values following a beta distribution is as straightforward as generating values following a triangular distribution, leaving the selection of parameters as the main impediment to practically considering beta distributions. This contribution intends to promote the acceptance of the beta distribution by explaining its properties and offering several suggestions to facilitate the specification of its two shape parameters. In general, given the same distributional parameters, use of the beta distributions in stochastic modeling may yield significantly different results, yet better estimates, than the triangular distribution. © 2011 International Association for Mathematical Geology (outside the USA).

  1. Biologically-variable rhythmic auditory cues are superior to isochronous cues in fostering natural gait variability in Parkinson's disease.

    PubMed

    Dotov, D G; Bayard, S; Cochen de Cock, V; Geny, C; Driss, V; Garrigue, G; Bardy, B; Dalla Bella, S

    2017-01-01

    Rhythmic auditory cueing improves certain gait symptoms of Parkinson's disease (PD). Cues are typically stimuli or beats with a fixed inter-beat interval. We show that isochronous cueing has an unwanted side-effect in that it exacerbates one of the motor symptoms characteristic of advanced PD. Whereas the parameters of the stride cycle of healthy walkers and early patients possess a persistent correlation in time, or long-range correlation (LRC), isochronous cueing renders stride-to-stride variability random. Random stride cycle variability is also associated with reduced gait stability and lack of flexibility. To investigate how to prevent patients from acquiring a random stride cycle pattern, we tested rhythmic cueing which mimics the properties of variability found in healthy gait (biological variability). PD patients (n=19) and age-matched healthy participants (n=19) walked with three rhythmic cueing stimuli: isochronous, with random variability, and with biological variability (LRC). Synchronization was not instructed. The persistent correlation in gait was preserved only with stimuli with biological variability, equally for patients and controls (p's<0.05). In contrast, cueing with isochronous or randomly varying inter-stimulus/beat intervals removed the LRC in the stride cycle. Notably, the individual's tendency to synchronize steps with beats determined the amount of negative effects of isochronous and random cues (p's<0.05) but not the positive effect of biological variability. Stimulus variability and patients' propensity to synchronize play a critical role in fostering healthier gait dynamics during cueing. The beneficial effects of biological variability provide useful guidelines for improving existing cueing treatments. Copyright © 2016 Elsevier B.V. All rights reserved.
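
    Long-range correlation of stride-time series is conventionally quantified with detrended fluctuation analysis (DFA), where a scaling exponent α ≈ 0.5 indicates uncorrelated variability and α > 0.5 persistence. A compact Python sketch of DFA (the scale set is a typical choice, not the paper's exact analysis pipeline):

      # Detrended fluctuation analysis: slope of log F(s) vs log s.
      import numpy as np

      def dfa_alpha(x, scales=(4, 8, 16, 32, 64)):
          y = np.cumsum(x - np.mean(x))              # integrated series
          f = []
          for s in scales:
              n_win = len(y) // s
              rms = []
              for w in range(n_win):
                  seg = y[w * s:(w + 1) * s]
                  t = np.arange(s)
                  trend = np.polyval(np.polyfit(t, seg, 1), t)
                  rms.append(np.mean((seg - trend) ** 2))
              f.append(np.sqrt(np.mean(rms)))        # fluctuation at this scale
          slope, _ = np.polyfit(np.log(scales), np.log(f), 1)
          return slope

      rng = np.random.default_rng(14)
      print(dfa_alpha(rng.normal(size=1024)))        # white noise: alpha ~ 0.5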

  2. Effects of Continuous Positive Airway Pressure on Neurocognitive Function in Obstructive Sleep Apnea Patients: The Apnea Positive Pressure Long-term Efficacy Study (APPLES)

    PubMed Central

    Kushida, Clete A.; Nichols, Deborah A.; Holmes, Tyson H.; Quan, Stuart F.; Walsh, James K.; Gottlieb, Daniel J.; Simon, Richard D.; Guilleminault, Christian; White, David P.; Goodwin, James L.; Schweitzer, Paula K.; Leary, Eileen B.; Hyde, Pamela R.; Hirshkowitz, Max; Green, Sylvan; McEvoy, Linda K.; Chan, Cynthia; Gevins, Alan; Kay, Gary G.; Bloch, Daniel A.; Crabtree, Tami; Dement, William C.

    2012-01-01

    Study Objective: To determine the neurocognitive effects of continuous positive airway pressure (CPAP) therapy on patients with obstructive sleep apnea (OSA). Design, Setting, and Participants: The Apnea Positive Pressure Long-term Efficacy Study (APPLES) was a 6-month, randomized, double-blind, 2-arm, sham-controlled, multicenter trial conducted at 5 U.S. university, hospital, or private practices. Of 1,516 participants enrolled, 1,105 were randomized, and 1,098 participants diagnosed with OSA contributed to the analysis of the primary outcome measures. Intervention: Active or sham CPAP. Measurements: Three neurocognitive variables, each representing a neurocognitive domain: Pathfinder Number Test-Total Time (attention and psychomotor function [A/P]), Buschke Selective Reminding Test-Sum Recall (learning and memory [L/M]), and Sustained Working Memory Test-Overall Mid-Day Score (executive and frontal-lobe function [E/F]). Results: The primary neurocognitive analyses showed a difference between groups for only the E/F variable at the 2 month CPAP visit, but no difference at the 6 month CPAP visit or for the A/P or L/M variables at either the 2 or 6 month visits. When stratified by measures of OSA severity (AHI or oxygen saturation parameters), the primary E/F variable and one secondary E/F neurocognitive variable revealed transient differences between study arms for those with the most severe OSA. Participants in the active CPAP group had a significantly greater ability to remain awake whether measured subjectively by the Epworth Sleepiness Scale or objectively by the maintenance of wakefulness test. Conclusions: CPAP treatment improved both subjectively and objectively measured sleepiness, especially in individuals with severe OSA (AHI > 30). CPAP use resulted in mild, transient improvement in the most sensitive measures of executive and frontal-lobe function for those with severe disease, which suggests the existence of a complex OSA-neurocognitive relationship. Clinical Trial Information: Registered at clinicaltrials.gov. Identifier: NCT00051363. Citation: Kushida CA; Nichols DA; Holmes TH; Quan SF; Walsh JK; Gottlieb DJ; Simon RD; Guilleminault C; White DP; Goodwin JL; Schweitzer PK; Leary EB; Hyde PR; Hirshkowitz M; Green S; McEvoy LK; Chan C; Gevins A; Kay GG; Bloch DA; Crabtree T; Dement WC. Effects of continuous positive airway pressure on neurocognitive function in obstructive sleep apnea patients: the Apnea Positive Pressure Long-term Efficacy Study (APPLES). SLEEP 2012;35(12):1593-1602. PMID:23204602

  3. An instrumental variable random-coefficients model for binary outcomes

    PubMed Central

    Chesher, Andrew; Rosen, Adam M

    2014-01-01

    In this paper, we study a random-coefficients model for a binary outcome. We allow for the possibility that some or even all of the explanatory variables are arbitrarily correlated with the random coefficients, thus permitting endogeneity. We assume the existence of observed instrumental variables Z that are jointly independent with the random coefficients, although we place no structure on the joint determination of the endogenous variable X and instruments Z, as would be required for a control function approach. The model fits within the spectrum of generalized instrumental variable models, and we thus apply identification results from our previous studies of such models to the present context, demonstrating their use. Specifically, we characterize the identified set for the distribution of random coefficients in the binary response model with endogeneity via a collection of conditional moment inequalities, and we investigate the structure of these sets by way of numerical illustration. PMID:25798048

  4. Polynomial chaos expansion with random and fuzzy variables

    NASA Astrophysics Data System (ADS)

    Jacquelin, E.; Friswell, M. I.; Adhikari, S.; Dessombz, O.; Sinou, J.-J.

    2016-06-01

    A dynamical uncertain system is studied in this paper. Two kinds of uncertainties are addressed, where the uncertain parameters are described through random variables and/or fuzzy variables. A general framework is proposed to deal with both kinds of uncertainty using a polynomial chaos expansion (PCE). It is shown that fuzzy variables may be expanded in terms of polynomial chaos when Legendre polynomials are used. The components of the PCE are a solution of an equation that does not depend on the nature of uncertainty. Once this equation is solved, the post-processing of the data gives the moments of the random response when the uncertainties are random or gives the response interval when the variables are fuzzy. With the PCE approach, it is also possible to deal with mixed uncertainty, when some parameters are random and others are fuzzy. The results provide a fuzzy description of the response statistical moments.
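
    A one-dimensional illustration of the random case: for ξ uniform on [-1, 1], the Legendre polynomials satisfy E[P_j P_k] = δ_jk/(2k+1), so PCE coefficients follow from quadrature and the mean and variance drop out of orthogonality. A Python sketch with a toy model response:

      # Legendre polynomial chaos expansion of y = g(xi), xi ~ Uniform(-1, 1).
      import numpy as np
      from numpy.polynomial import legendre as L

      g = lambda xi: np.exp(0.5 * xi)              # toy model response
      order = 6
      nodes, weights = L.leggauss(order + 4)       # Gauss-Legendre on [-1, 1]

      coeffs = []
      for k in range(order + 1):
          pk = L.legval(nodes, [0] * k + [1])      # P_k at the quadrature nodes
          # c_k = E[g P_k] / E[P_k^2], with E[.] = 0.5 * integral over [-1, 1]
          coeffs.append(0.5 * np.sum(weights * g(nodes) * pk) * (2 * k + 1))

      mean = coeffs[0]                             # first moment from c_0
      var = sum(c ** 2 / (2 * k + 1) for k, c in enumerate(coeffs) if k > 0)
      print(mean, var)                             # matches 2*sinh(0.5), etc.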

  5. End-point detection in potentiometric titration by continuous wavelet transform.

    PubMed

    Jakubowska, Małgorzata; Baś, Bogusław; Kubiak, Władysław W

    2009-10-15

    The aim of this work was the construction of a new wavelet function and verification that a continuous wavelet transform with a specially defined dedicated mother wavelet is a useful tool for precise detection of the end-point in a potentiometric titration. The proposed algorithm does not require any initial information about the nature or the type of analyte and/or the shape of the titration curve. Signal imperfection, as well as random noise or spikes, has no influence on the operation of the procedure. The optimization of the new algorithm was done using simulated curves, and then experimental data were considered. In the case of well-shaped and noise-free titration data, the proposed method gives the same accuracy and precision as commonly used algorithms. But in the case of noisy or badly shaped curves, the presented approach works well (relative error mainly below 2% and coefficients of variability below 5%) while traditional procedures fail. Therefore, the proposed algorithm may be useful in the interpretation of experimental data and also in the automation of typical titration analysis, especially in the case when random noise interferes with the analytical signal.
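
    The published method uses a purpose-built mother wavelet, which is not reproduced here; the following Python sketch shows the general mechanism with a Gaussian-derivative wavelet instead, locating the end-point as the extremum of the wavelet response to a noisy sigmoidal titration curve (all curve parameters are simulated):

      # End-point detection: wavelet response peaks at the steepest rise.
      import numpy as np

      def dgauss(points, a):
          """Gaussian first-derivative wavelet on a symmetric grid."""
          t = np.arange(points) - points // 2
          return -t * np.exp(-0.5 * (t / a) ** 2)

      rng = np.random.default_rng(15)
      v = np.linspace(0, 20, 400)                      # titrant volume, mL
      emf = 200 + 150 * np.tanh((v - 12.5) / 0.4)      # sigmoidal E-vs-V curve
      emf += rng.normal(0, 2.0, v.size)                # measurement noise

      scale = 12
      w = dgauss(8 * scale + 1, scale)                 # odd length, zero mean
      response = np.convolve(emf, w, mode="same")      # one CWT row
      print(v[np.argmax(response)])                    # ~12.5 mL end-point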

  6. A universal self-charging system driven by random biomechanical energy for sustainable operation of mobile electronics

    PubMed Central

    Niu, Simiao; Wang, Xiaofeng; Yi, Fang; Zhou, Yu Sheng; Wang, Zhong Lin

    2015-01-01

    Human biomechanical energy is characterized by fluctuating amplitudes and variable low frequency, and an effective utilization of such energy cannot be achieved by classical energy-harvesting technologies. Here we report a highly efficient self-charging power system for sustainable operation of mobile electronics exploiting exclusively human biomechanical energy, which consists of a high-output triboelectric nanogenerator, a power management circuit to convert the random a.c. energy to d.c. electricity at 60% efficiency, and an energy storage device. With palm tapping as the only energy source, this power unit provides a continuous d.c. electricity of 1.044 mW (7.34 W m⁻³) in a regulated and managed manner. This self-charging unit can be universally applied as a standard 'infinite-lifetime' power source for continuously driving numerous conventional electronics, such as thermometers, electrocardiograph systems, pedometers, wearable watches, scientific calculators and wireless radio-frequency communication systems, which indicates immediate and broad applications in personal sensor systems and the internet of things. PMID:26656252

  7. Computer simulation of random variables and vectors with arbitrary probability distribution laws

    NASA Technical Reports Server (NTRS)

    Bogdan, V. M.

    1981-01-01

    Assume that there is given an arbitrary n-dimensional probability distribution F. A recursive construction is found for a sequence of functions x_1 = f_1(U_1, ..., U_n), ..., x_n = f_n(U_1, ..., U_n) such that if U_1, ..., U_n are independent random variables having uniform distribution over the open interval (0,1), then the joint distribution of the variables x_1, ..., x_n coincides with the distribution F. Since uniform independent random variables can be well simulated by means of a computer, this result allows one to simulate arbitrary n random variables if their joint probability distribution is known.
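
    For n = 2 the construction reduces to the familiar conditional inverse-transform recipe: X_1 = F_1^{-1}(U_1) and X_2 = F_{2|1}^{-1}(U_2 | X_1). A Python sketch with a joint law chosen purely for illustration:

      # Conditional inverse transform: X1 ~ Exp(1), X2 | X1 = x ~ Exp(x).
      import numpy as np

      rng = np.random.default_rng(16)
      u1, u2 = rng.random(100_000), rng.random(100_000)

      x1 = -np.log(1 - u1)                 # F1^{-1}(u) for Exp(1)
      x2 = -np.log(1 - u2) / x1            # F^{-1}_{2|1}(u | x1) for Exp(x1)

      # sanity check: E[X2 | X1 = x] = 1/x, so E[X1 * X2] should be 1
      print(np.mean(x1 * x2))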

  8. Evidence for attractors in English intonation.

    PubMed

    Braun, Bettina; Kochanski, Greg; Grabe, Esther; Rosner, Burton S

    2006-06-01

    Although the pitch of the human voice is continuously variable, some linguists contend that intonation in speech is restricted to a small, limited set of patterns. This claim is tested by asking subjects to mimic a block of 100 randomly generated intonation contours and then to imitate themselves in several successive sessions. The produced f0 contours gradually converge towards a limited set of distinct, previously recognized basic English intonation patterns. These patterns are "attractors" in the space of possible intonation English contours. The convergence does not occur immediately. Seven of the ten participants show continued convergence toward their attractors after the first iteration. Subjects retain and use information beyond phonological contrasts, suggesting that intonational phonology is not a complete description of their mental representation of intonation.

  9. A guidance and navigation system for continuous low-thrust vehicles. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Jack-Chingtse, C.

    1973-01-01

    A midcourse guidance and navigation system for continuous low thrust vehicles was developed. The equinoctial elements are the state variables. Uncertainties are modelled statistically by random vectors and stochastic processes. The motion of the vehicle and the measurements are described by nonlinear stochastic differential and difference equations respectively. A minimum time trajectory is defined; equations of motion and measurements are linearized about this trajectory. An exponential cost criterion is constructed and a linear feedback guidance law is derived. An extended Kalman filter is used for state estimation. A short mission using this system is simulated. It is indicated that this system is efficient for short missions, but longer missions require accurate trajectory and ground based measurements.

  10. Measures of Residual Risk with Connections to Regression, Risk Tracking, Surrogate Models, and Ambiguity

    DTIC Science & Technology

    2015-01-07

    The random variable of interest is viewed in concert with a related auxiliary random vector that helps to manage, predict, and mitigate the risk in the original variable. Residual risk can be exemplified as a quantification of the improved...

  11. Raw and Central Moments of Binomial Random Variables via Stirling Numbers

    ERIC Educational Resources Information Center

    Griffiths, Martin

    2013-01-01

    We consider here the problem of calculating the moments of binomial random variables. It is shown how formulae for both the raw and the central moments of such random variables may be obtained in a recursive manner utilizing Stirling numbers of the first kind. Suggestions are also provided as to how students might be encouraged to explore this…
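
    For context, a standard identity expresses the raw moments of X ~ Binomial(n, p) through Stirling numbers of the second kind and falling factorials: E[X^r] = Σ_{j=0}^{r} S(r, j) · n(n−1)⋯(n−j+1) · p^j. This is a different route from the first-kind recursion mentioned in the abstract; the sketch below only checks that identity numerically:

    ```python
    from math import comb

    def stirling2(r, j):
        """Stirling number of the second kind S(r, j), via the usual recurrence."""
        if j == 0:
            return 1 if r == 0 else 0
        if j > r:
            return 0
        return j * stirling2(r - 1, j) + stirling2(r - 1, j - 1)

    def raw_moment_stirling(n, p, r):
        """E[X^r] for X ~ Binomial(n, p) via S(r, j) and falling factorials."""
        total = 0.0
        for j in range(r + 1):
            falling = 1
            for i in range(j):                 # n (n-1) ... (n-j+1)
                falling *= n - i
            total += stirling2(r, j) * falling * p**j
        return total

    def raw_moment_direct(n, p, r):
        """Brute-force E[X^r] from the binomial pmf, for verification."""
        return sum(k**r * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))

    assert abs(raw_moment_stirling(10, 0.3, 4) - raw_moment_direct(10, 0.3, 4)) < 1e-9
    ```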

  12. Clinical Use of Continuous Glucose Monitoring in Adults with Type 1 Diabetes.

    PubMed

    Slattery, David; Choudhary, Pratik

    2017-05-01

    With the emphasis on intensive management of type 1 diabetes, data from studies support frequent monitoring of glucose levels to improve glycemic control and reduce glucose variability, which has been related to an increase in macro- and microvascular complications. However, few patients perform capillary blood glucose testing that frequently. There are currently two available alternatives that this review will discuss: continuous glucose monitoring (CGM) and flash glucose monitoring. CGM has become an important diagnostic and therapeutic option in optimizing diabetes management. CGM systems are now more accurate, smaller, and easier to use compared to original models. Randomized controlled trials (RCTs) have demonstrated that CGM can improve Hemoglobin A1c (HbA1C) and reduce glucose variability in both continuous subcutaneous insulin infusion and multiple daily injection users. When used in an automated "insulin-suspend" system, reduced frequency of hypoglycemia and shorter time spent in the hypoglycemic range have been demonstrated. Despite the potential benefits CGM has to offer in clinical practice, concerns exist about the accuracy of these devices and patient compliance with therapy, which may prevent the clinical benefit of CGM observed in RCTs from being achieved in practice. Flash glucose monitoring systems (FreeStyle® Libre™, Abbott Diabetes Care, Alameda, CA) are as accurate as many available CGM systems and have the added benefit of being factory calibrated. Studies have shown that flash glucose monitoring systems are very well tolerated by patients and effectively reduce glucose variability, increasing time in range.

  13. Hybrid modeling of spatial continuity for application to numerical inverse problems

    USGS Publications Warehouse

    Friedel, Michael J.; Iwashita, Fabio

    2013-01-01

    A novel two-step modeling approach is presented to obtain optimal starting values and geostatistical constraints for numerical inverse problems otherwise characterized by spatially-limited field data. First, a type of unsupervised neural network, called the self-organizing map (SOM), is trained to recognize nonlinear relations among environmental variables (covariates) occurring at various scales. The values of these variables are then estimated at random locations across the model domain by iterative minimization of SOM topographic error vectors. Cross-validation is used to ensure unbiasedness and compute prediction uncertainty for select subsets of the data. Second, analytical functions are fit to experimental variograms derived from original plus resampled SOM estimates producing model variograms. Sequential Gaussian simulation is used to evaluate spatial uncertainty associated with the analytical functions and probable range for constraining variables. The hybrid modeling of spatial continuity is demonstrated using spatially-limited hydrologic measurements at different scales in Brazil: (1) physical soil properties (sand, silt, clay, hydraulic conductivity) in the 42 km² Vargem de Caldas basin; (2) well yield and electrical conductivity of groundwater in the 132 km² fractured crystalline aquifer; and (3) specific capacity, hydraulic head, and major ions in a 100,000 km² transboundary fractured-basalt aquifer. These results illustrate the benefits of exploiting nonlinear relations among sparse and disparate data sets for modeling spatial continuity, but the actual application of these spatial data to improve numerical inverse modeling requires testing.

  14. Logistic quantile regression provides improved estimates for bounded avian counts: A case study of California Spotted Owl fledgling production

    USGS Publications Warehouse

    Cade, Brian S.; Noon, Barry R.; Scherer, Rick D.; Keane, John J.

    2017-01-01

    Counts of avian fledglings, nestlings, or clutch size that are bounded below by zero and above by some small integer form a discrete random variable distribution that is not approximated well by conventional parametric count distributions such as the Poisson or negative binomial. We developed a logistic quantile regression model to provide estimates of the empirical conditional distribution of a bounded discrete random variable. The logistic quantile regression model requires that counts are randomly jittered to a continuous random variable, logit transformed to bound them between specified lower and upper values, then estimated in conventional linear quantile regression, repeating the 3 steps and averaging estimates. Back-transformation to the original discrete scale relies on the fact that quantiles are equivariant to monotonic transformations. We demonstrate this statistical procedure by modeling 20 years of California Spotted Owl fledgling production (0−3 per territory) on the Lassen National Forest, California, USA, as related to climate, demographic, and landscape habitat characteristics at territories. Spotted Owl fledgling counts increased nonlinearly with decreasing precipitation in the early nesting period, in the winter prior to nesting, and in the prior growing season; with increasing minimum temperatures in the early nesting period; with adult compared to subadult parents; when there was no fledgling production in the prior year; and when percentage of the landscape surrounding nesting sites (202 ha) with trees ≥25 m height increased. Changes in production were primarily driven by changes in the proportion of territories with 2 or 3 fledglings. Average variances of the discrete cumulative distributions of the estimated fledgling counts indicated that temporal changes in climate and parent age class explained 18% of the annual variance in owl fledgling production, which was 34% of the total variance. Prior fledgling production explained as much of the variance in the fledgling counts as climate, parent age class, and landscape habitat predictors. Our logistic quantile regression model can be used for any discrete response variables with fixed upper and lower bounds.
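
    A condensed sketch of the jitter-logit-average procedure described above, with hypothetical data names; the bound handling (the +1 widening for jittered counts and the small eps) is an illustrative choice, and statsmodels supplies the linear quantile fit:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)

    def logistic_quantreg(y, x, tau=0.5, lower=0, upper=3, n_jitter=20, eps=0.001):
        """Sketch of logistic quantile regression for counts y bounded in [lower, upper].

        Per the abstract: jitter the counts to a continuous variable, logit-transform
        to the bounding interval, fit linear quantile regression; repeat and average.
        """
        lo, hi = lower - eps, upper + 1 + eps      # jittered values lie in (lo, hi)
        coefs = []
        for _ in range(n_jitter):
            z = np.asarray(y, dtype=float) + rng.uniform(size=len(y))  # jitter
            t = np.log((z - lo) / (hi - z))                            # logit to bounds
            df = pd.DataFrame({"t": t, "x": x})
            fit = smf.quantreg("t ~ x", df).fit(q=tau)
            coefs.append(fit.params.to_numpy())
        b0, b1 = np.mean(coefs, axis=0)
        # Back-transformation relies on quantiles being equivariant to
        # monotonic transformations:
        predict = lambda x0: lo + (hi - lo) / (1.0 + np.exp(-(b0 + b1 * x0)))
        return b0, b1, predict
    ```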

  15. A Random Variable Related to the Inversion Vector of a Partial Random Permutation

    ERIC Educational Resources Information Center

    Laghate, Kavita; Deshpande, M. N.

    2005-01-01

    In this article, we define the inversion vector of a permutation of the integers 1, 2,..., n. We set up a particular kind of permutation, called a partial random permutation. The sum of the elements of the inversion vector of such a permutation is a random variable of interest.
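
    One common convention for the inversion vector, with a small worked check (the article's exact convention may differ):

    ```python
    def inversion_vector(perm):
        """Inversion vector (one common convention): entry i counts the elements
        larger than perm[i] appearing to its left.  perm is a permutation of 1..n."""
        return [sum(1 for left in perm[:i] if left > v) for i, v in enumerate(perm)]

    # The sum of the inversion vector equals the total number of inversions.
    p = [3, 1, 4, 2]
    v = inversion_vector(p)          # [0, 1, 0, 2]
    assert sum(v) == 3               # inversions: (3,1), (3,2), (4,2)
    ```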

  16. Motor output variability, deafferentation, and putative deficits in kinesthetic reafference in Parkinson's disease.

    PubMed

    Torres, Elizabeth B; Cole, Jonathan; Poizner, Howard

    2014-01-01

    Parkinson's disease (PD) is a neurodegenerative disorder defined by motor impairments that include rigidity, systemic slowdown of movement (bradykinesia), postural problems, and tremor. While the progressive decline in motor output functions is well documented, less understood are impairments linked to the continuous kinesthetic sensation emerging from the flow of motions. There is growing evidence in recent years that kinesthetic problems are also part of the symptoms of PD, but objective methods to readily quantify continuously unfolding motions across different contexts have been lacking. Here we present evidence from a deafferented subject (IW) and a new statistical platform that enables new analyses of motor output variability measured as a continuous flow of kinesthetic reafferent input. Systematic increasing similarities between the patterns of motor output variability in IW and the participants with increasing degrees of PD severity suggest potential deficits in kinesthetic sensing in PD. We propose that these deficits may result from persistent, noisy, and random motor patterns as the disorder progresses. The stochastic signatures from the unfolding motions revealed levels of noise in the motor output fluctuations of these patients bound to decrease the kinesthetic signal's bandwidth. The results are interpreted in light of the concept of kinesthetic reafference (Von Holst and Mittelstaedt, 1950). In this context, noisy motor output variability from voluntary movements in PD leads to a returning stream of noisy afference caused, in turn, by those faulty movements themselves. Faulty efferent output re-enters the CNS as corrupted sensory motor input. We find here that severity level in PD leads to the persistence of such patterns, thus bringing the statistical signatures of the subjects with PD systematically closer to those of the subject without proprioception.

  17. Meta-Analysis of Drainage Versus No Drainage After Laparoscopic Cholecystectomy

    PubMed Central

    Lucarelli, Pierino; Di Filippo, Annalisa; De Angelis, Francesco; Stipa, Francesco; Spaziani, Erasmo

    2014-01-01

    Background and Objectives: Routine drainage after laparoscopic cholecystectomy is still controversial. This meta-analysis was performed to assess the role of drains in reducing complications in laparoscopic cholecystectomy. Methods: An electronic search of Medline, Science Citation Index Expanded, Scopus, and the Cochrane Library database from January 1990 to June 2013 was performed to identify randomized clinical trials that compare prophylactic drainage with no drainage in laparoscopic cholecystectomy. The odds ratio for qualitative variables and standardized mean difference for continuous variables were calculated. Results: Twelve randomized controlled trials were included in the meta-analysis, involving 1939 patients randomized to a drain (960) versus no drain (979). The morbidity rate was lower in the no drain group (odds ratio, 1.97; 95% confidence interval, 1.26 to 3.10; P = .003). The wound infection rate was lower in the no drain group (odds ratio, 2.35; 95% confidence interval, 1.22 to 4.51; P = .01). Abdominal pain 24 hours after surgery was less severe in the no drain group (standardized mean difference, 2.30; 95% confidence interval, 1.27 to 3.34; P < .0001). No significant difference was present with respect to the presence and quantity of subhepatic fluid collection, shoulder tip pain, parenteral ketorolac consumption, nausea, vomiting, and hospital stay. Conclusion: This study was unable to prove that drains were useful in reducing complications in laparoscopic cholecystectomy. PMID:25516708
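
    The abstract does not state the pooling model, but as one standard possibility, DerSimonian-Laird random-effects pooling of study log odds ratios looks like this (inputs below are invented):

    ```python
    import numpy as np

    def pool_log_or(log_or, var):
        """DerSimonian-Laird random-effects pooling of study log odds ratios.
        (Illustrative; the review's exact weighting scheme is not stated.)"""
        log_or, var = np.asarray(log_or), np.asarray(var)
        w = 1.0 / var                                   # fixed-effect weights
        mu_fe = np.sum(w * log_or) / np.sum(w)
        q = np.sum(w * (log_or - mu_fe) ** 2)           # Cochran's Q
        df = len(log_or) - 1
        c = np.sum(w) - np.sum(w**2) / np.sum(w)
        tau2 = max(0.0, (q - df) / c)                   # between-study variance
        w_re = 1.0 / (var + tau2)                       # random-effects weights
        mu = np.sum(w_re * log_or) / np.sum(w_re)
        se = np.sqrt(1.0 / np.sum(w_re))
        return np.exp(mu), np.exp(mu - 1.96 * se), np.exp(mu + 1.96 * se)

    # Invented example inputs: three studies' log ORs and their variances.
    pooled_or, ci_lo, ci_hi = pool_log_or([0.68, 0.85, 0.42], [0.09, 0.12, 0.30])
    ```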

  18. Impact of Targeted Preoperative Optimization on Clinical Outcome in Emergency Abdominal Surgeries: A Prospective Randomized Trial.

    PubMed

    Sethi, Ashish; Debbarma, Miltan; Narang, Neeraj; Saxena, Anudeep; Mahobia, Mamta; Tomar, Gaurav Singh

    2018-01-01

    Perforation peritonitis continues to be one of the most common surgical emergencies, requiring surgical intervention most of the time, and anesthesiologists are invariably involved in managing such cases in the perioperative period. The study assessed the Acute Physiology and Chronic Health Evaluation II (APACHE II) score at presentation and 24 h after goal-directed optimization, administration of empirical broad-spectrum antibiotics, and definitive source control postoperatively; outcome was also assessed in terms of duration of hospital stay and mortality with or without optimization. It is a prospective, randomized, double-blind controlled study in a hospital setting. One hundred and one patients aged ≥18 years, of American Society of Anesthesiologists physical status I and II (E), with a clinical diagnosis of perforation peritonitis and posted for surgery, were enrolled and randomly divided into two groups: Group A was optimized by a goal-directed optimization protocol in the preoperative holding room by anesthesiology residents, whereas Group S was managed by surgery residents in the surgical wards without any fixed algorithm. The APACHE II score was assessed first on admission and again 24 h postoperatively; duration of hospital stay and mortality in both groups were also measured and compared. Categorical data are presented as frequency counts (percent) and compared using the Chi-square or Fisher's exact test; statistical significance for categorical variables was determined by Chi-square analysis, and for continuous variables a two-sample t-test was applied. The mean APACHE II score on admission was comparable between the case and control groups. Significant lowering of serial scores was observed in the case group as compared to the control group (P = 0.02). The mean duration of hospital stay was significantly lower in the case group (9.8 ± 1.7 days) than in the control group (P = 0.007). Furthermore, a significant decline in death rate was noted in the case group as compared to the control group (P = 0.03). Goal-directed optimized patients with perforation peritonitis were discharged earlier and had significantly lower mortality in the perioperative period than patients managed without the protocol.

  19. MODIS Aerosol Optical Depth Bias Adjustment Using Machine Learning Algorithms

    NASA Technical Reports Server (NTRS)

    Albayrak, Arif; Wei, Jennifer; Petrenko, Maksym; Lary, David; Leptoukh, Gregory

    2011-01-01

    To monitor the earth's atmosphere and its surface changes, satellite-based instruments collect continuous data. While some data are used directly, other quantities, such as aerosol properties, are retrieved indirectly from the observations. Although retrieved variables (RVs) form very powerful products, they do not come without obstacles. Different satellite viewing geometries, calibration issues, dynamically changing atmospheric and surface conditions, and complex interactions between observed entities and their environment all affect them greatly, resulting in random and systematic errors in the final products.

  1. Statistical analysis of multivariate atmospheric variables. [cloud cover

    NASA Technical Reports Server (NTRS)

    Tubbs, J. D.

    1979-01-01

    Topics covered include: (1) estimation in discrete multivariate distributions; (2) a procedure to predict cloud cover frequencies in the bivariate case; (3) a program to compute conditional bivariate normal parameters; (4) the transformation of nonnormal multivariate distributions to near-normal; (5) a test of fit for the extreme value distribution based upon the generalized minimum chi-square; (6) a test of fit for continuous distributions based upon the generalized minimum chi-square; (7) the effect of correlated observations on confidence sets based upon chi-square statistics; and (8) the generation of random variates from specified distributions.

  2. Conserved directed percolation: exact quasistationary distribution of small systems and Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    César Mansur Filho, Júlio; Dickman, Ronald

    2011-05-01

    We study symmetric sleepy random walkers, a model exhibiting an absorbing-state phase transition in the conserved directed percolation (CDP) universality class. Unlike most examples of this class studied previously, this model possesses a continuously variable control parameter, facilitating analysis of critical properties. We study the model using two complementary approaches: analysis of the numerically exact quasistationary (QS) probability distribution on rings of up to 22 sites, and Monte Carlo simulation of systems of up to 32 000 sites. The resulting estimates for critical exponents β, …

  3. Multilevel Model Prediction

    ERIC Educational Resources Information Center

    Frees, Edward W.; Kim, Jee-Seon

    2006-01-01

    Multilevel models are proven tools in social research for modeling complex, hierarchical systems. In multilevel modeling, statistical inference is based largely on quantification of random variables. This paper distinguishes among three types of random variables in multilevel modeling--model disturbances, random coefficients, and future response…
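
    The three kinds of random variables distinguished above (model disturbances, random coefficients, future responses) all appear in an ordinary random-coefficients model; a minimal sketch with simulated two-level data (all names and values invented):

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(2)

    # Hypothetical two-level data: students (rows) nested in schools (groups).
    n_schools, n_per = 30, 20
    school = np.repeat(np.arange(n_schools), n_per)
    x = rng.normal(size=n_schools * n_per)
    u = rng.normal(scale=0.5, size=n_schools)      # school-level random intercepts
    e = rng.normal(scale=1.0, size=len(x))         # level-1 model disturbances
    y = 1.0 + 0.8 * x + u[school] + e

    df = pd.DataFrame({"y": y, "x": x, "school": school})
    # Random intercept and random slope on x: the "random coefficients".
    model = sm.MixedLM.from_formula("y ~ x", groups="school", re_formula="~x", data=df)
    result = model.fit()
    print(result.summary())
    ```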

  4. A statistical model for interpreting computerized dynamic posturography data

    NASA Technical Reports Server (NTRS)

    Feiveson, Alan H.; Metter, E. Jeffrey; Paloski, William H.

    2002-01-01

    Computerized dynamic posturography (CDP) is widely used for assessment of altered balance control. CDP trials are quantified using the equilibrium score (ES), which ranges from zero to 100, as a decreasing function of peak sway angle. The problem of how best to model and analyze ESs from a controlled study is considered. The ES often exhibits a skewed distribution in repeated trials, which can lead to incorrect inference when applying standard regression or analysis of variance models. Furthermore, CDP trials are terminated when a patient loses balance. In these situations, the ES is not observable, but is assigned the lowest possible score--zero. As a result, the response variable has a mixed discrete-continuous distribution, further compromising inference obtained by standard statistical methods. Here, we develop alternative methodology for analyzing ESs under a stochastic model extending the ES to a continuous latent random variable that always exists, but is unobserved in the event of a fall. Loss of balance occurs conditionally, with probability depending on the realized latent ES. After fitting the model by a form of quasi-maximum-likelihood, one may perform statistical inference to assess the effects of explanatory variables. An example is provided, using data from the NIH/NIA Baltimore Longitudinal Study on Aging.

  5. Mean convergence theorems and weak laws of large numbers for weighted sums of random variables under a condition of weighted integrability

    NASA Astrophysics Data System (ADS)

    Ordóñez Cabrera, Manuel; Volodin, Andrei I.

    2005-05-01

    From the classical notion of uniform integrability of a sequence of random variables, a new concept of integrability (called h-integrability) is introduced for an array of random variables, concerning an array of constants. We prove that this concept is weaker than other previous related notions of integrability, such as Cesàro uniform integrability [Chandra, Sankhya Ser. A 51 (1989) 309-317], uniform integrability concerning the weights [Ordóñez Cabrera, Collect. Math. 45 (1994) 121-132] and Cesàro α-integrability [Chandra and Goswami, J. Theoret. Probab. 16 (2003) 655-669]. Under this condition of integrability and appropriate conditions on the array of weights, mean convergence theorems and weak laws of large numbers for weighted sums of an array of random variables are obtained when the random variables are subject to some special kinds of dependence: (a) rowwise pairwise negative dependence, (b) rowwise pairwise non-positive correlation, (c) when the sequence of random variables in every row is φ-mixing. Finally, we consider the general weak law of large numbers in the sense of Gut [Statist. Probab. Lett. 14 (1992) 49-52] under this new condition of integrability for a Banach space setting.
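
    For orientation, a commonly cited form of such a weighted integrability condition for an array {X_nk} with respect to constants {a_nk} is given below; the paper's exact formulation may differ, so treat this as an assumption:

    ```latex
    % h-integrability (assumed form): h increasing with h(n) -> infinity
    \sup_{n\ge 1}\sum_{k=1}^{k_n} |a_{nk}|\,\mathbb{E}|X_{nk}| < \infty,
    \qquad
    \lim_{n\to\infty}\sum_{k=1}^{k_n} |a_{nk}|\,
      \mathbb{E}\bigl[|X_{nk}|\,\mathbf{1}\{|X_{nk}|>h(n)\}\bigr] = 0.
    ```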

  6. Model of random center vortex lines in continuous 2+1-dimensional spacetime

    NASA Astrophysics Data System (ADS)

    Altarawneh, Derar; Engelhardt, Michael; Höllwieser, Roman

    2016-12-01

    A picture of confinement in QCD based on a condensate of thick vortices with fluxes in the center of the gauge group (center vortices) is studied. Previous concrete model realizations of this picture utilized a hypercubic space-time scaffolding, which, together with many advantages, also has some disadvantages, e.g., in the treatment of vortex topological charge. In the present work, we explore a center vortex model which does not rely on such a scaffolding. Vortices are represented by closed random lines in continuous 2+1-dimensional space-time. These random lines are modeled as being piecewise linear, and an ensemble is generated by Monte Carlo methods. The physical space in which the vortex lines are defined is a torus with periodic boundary conditions. Besides moving, growing, and shrinking of the vortex configurations, reconnections are also allowed. Our ensemble therefore contains not a fixed but a variable number of closed vortex lines. This is expected to be important for realizing the deconfining phase transition. We study both vortex percolation and the potential V(R) between quark and antiquark as a function of distance R at different vortex densities, vortex segment lengths, reconnection conditions, and at different temperatures. We find three deconfinement phase transitions, as a function of density, as a function of vortex segment length, and as a function of temperature.

  7. Gait analysis following treadmill training with body weight support versus conventional physical therapy: a prospective randomized controlled single blind study.

    PubMed

    Lucareli, P R; Lima, M O; Lima, F P S; de Almeida, J G; Brech, G C; D'Andréa Greve, J M

    2011-09-01

    Single-blind randomized, controlled clinical study. To evaluate, using kinematic gait analysis, the results obtained from gait training on a treadmill with body weight support versus those obtained with conventional gait training and physiotherapy. Thirty patients with sequelae of traumatic incomplete spinal cord injuries sustained at least 12 months earlier participated; all were able to walk and were classified according to motor function as ASIA (American Spinal Injury Association) impairment scale C or D. Patients were divided randomly into two groups of 15 patients by the drawing of opaque envelopes: group A (weight support) and group B (conventional). After an initial assessment, both groups underwent 30 sessions of gait training. Sessions occurred twice a week, lasted 30 min each and continued for four months. All of the patients were evaluated by a single blinded examiner using movement analysis to measure angular and linear kinematic gait parameters. Six patients (three from group A and three from group B) were excluded because they attended fewer than 85% of the training sessions. There were no statistically significant differences in intra-group comparisons among the spatial-temporal variables in group B. In group A, the following significant differences in the studied spatial-temporal variables were observed: increases in velocity, distance, cadence, step length, swing phase and gait cycle duration, in addition to a reduction in stance phase. There were also no significant differences in intra-group comparisons among the angular variables in group B. However, group A achieved significant improvements in maximum hip extension and plantar flexion during stance. Gait training with body weight support was more effective than conventional physiotherapy for improving the spatial-temporal and kinematic gait parameters among patients with incomplete spinal cord injuries.

  8. Evaluating the importance of policy amenable factors in explaining influenza vaccination: a cross-sectional multinational study

    PubMed Central

    Wheelock, Ana; Miraldo, Marisa; Thomson, Angus; Vincent, Charles; Sevdalis, Nick

    2017-01-01

    Objectives: Despite continuous efforts to improve influenza vaccination coverage, uptake among high-risk groups remains suboptimal. We aimed to identify policy amenable factors associated with vaccination and to measure their importance in order to assist in the monitoring of vaccination sentiment and the design of communication strategies and interventions to improve vaccination rates. Setting: The USA, the UK and France. Participants: A total of 2412 participants were surveyed across the three countries. Outcome measures: Self-reported influenza vaccination. Methods: Between March and April 2014, a stratified random sampling strategy was employed with the aim of obtaining nationally representative samples in the USA, the UK and France through online databases and random-digit dialling. Participants were asked about vaccination practices, perceptions and feelings. Multivariable logistic regression was used to identify factors associated with past influenza vaccination. Results: The models were able to explain 64%–80% of the variance in vaccination behaviour. Overall, sociopsychological variables, which are inherently amenable to policy, were better at explaining past vaccination behaviour than demographic, socioeconomic and health variables. Explanatory variables included social influence (physician), influenza and vaccine risk perceptions and traumatic childhood experiences. Conclusions: Our results indicate that evidence-based sociopsychological items should be considered for inclusion into national immunisation surveys to gauge the public's views, identify emerging concerns and thus proactively and opportunely address potential barriers and harness vaccination drivers. PMID:28706088

  9. Impact of exercise on diurnal and nocturnal markers of glycaemic variability and oxidative stress in obese individuals with type 2 diabetes or impaired glucose tolerance.

    PubMed

    Farabi, Sarah S; Carley, David W; Smith, Donald; Quinn, Lauretta

    2015-09-01

    We measured the effects of a single bout of exercise on diurnal and nocturnal oxidative stress and glycaemic variability in obese subjects with type 2 diabetes mellitus or impaired glucose tolerance versus obese healthy controls. Subjects (in random order) performed either a single 30-min bout of moderate-intensity exercise or remained sedentary for 30 min at two separate visits. To quantify glycaemic variability, standard deviation of glucose (measured by continuous glucose monitoring system) and continuous overlapping net glycaemic action of 1-h intervals (CONGA-1) were calculated for three 12-h intervals during each visit. Oxidative stress was measured by 15-isoprostane F(2t) levels in urine collections for matching 12-h intervals. Exercise reduced daytime glycaemic variability (ΔCONGA-1 = -12.62 ± 5.31 mg/dL, p = 0.04) and urinary isoprostanes (Δ15-isoprostane F(2t) = -0.26 ± 0.12 ng/mg, p = 0.04) in the type 2 diabetes mellitus/impaired glucose tolerance group. Daytime exercise-induced change in urinary 15-isoprostane F(2t) was significantly correlated with both daytime standard deviation (r = 0.68, p = 0.03) and with subsequent overnight standard deviation (r = 0.73, p = 0.027) in the type 2 diabetes mellitus/impaired glucose tolerance group. Exercise significantly impacts the relationship between diurnal oxidative stress and nocturnal glycaemic variability in individuals with type 2 diabetes mellitus/impaired glucose tolerance. © The Author(s) 2015.
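
    CONGA-1 is commonly computed as the standard deviation of differences between readings taken 1 h apart; a small sketch for a regularly sampled CGM trace (the study's exact windowing is an assumption):

    ```python
    import numpy as np

    def glycaemic_variability(glucose, samples_per_hour=12):
        """SD of glucose plus CONGA-1 for a regularly sampled CGM trace.

        CONGA-1 here = SD of differences between readings 1 h apart
        (a common formulation; details vary across papers)."""
        g = np.asarray(glucose, dtype=float)
        sd = g.std(ddof=1)
        diffs = g[samples_per_hour:] - g[:-samples_per_hour]   # 1-hour lag
        conga1 = diffs.std(ddof=1)
        return sd, conga1

    # Example: 12 h of synthetic 5-min readings (invented trace).
    rng = np.random.default_rng(3)
    trace = 120 + 25 * np.sin(np.linspace(0, 6 * np.pi, 144)) + rng.normal(0, 8, 144)
    print(glycaemic_variability(trace))
    ```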

  10. Effects of neurofeedback on the short-term memory and continuous attention of patients with moderate traumatic brain injury: A preliminary randomized controlled clinical trial.

    PubMed

    Rostami, Reza; Salamati, Payman; Yarandi, Kourosh Karimi; Khoshnevisan, Alireza; Saadat, Soheil; Kamali, Zeynab Sadat; Ghiasi, Somaie; Zaryabi, Atefeh; Ghazi Mir Saeid, Seyed Shahab; Arjipour, Mehdi; Rezaee-Zavareh, Mohammad Saeid; Rahimi-Movaghar, Vafa

    2017-10-01

    There are some studies showing that neurofeedback therapy (NFT) can be effective in clients with a history of traumatic brain injury (TBI). However, randomized controlled clinical trials are still needed to evaluate this treatment as a standard option. This preliminary study aimed to evaluate the effect of NFT on continuous attention (CA) and short-term memory (STM) in clients with moderate TBI using a randomized controlled clinical trial (RCT). In this preliminary RCT, seventeen eligible patients with moderate TBI were randomly allocated to intervention and control groups. All patients were evaluated for CA and STM using the visual continuous attention test and the Wechsler memory scale-4th edition (WMS-IV) test, respectively, both at inclusion in the project and four weeks later. The intervention group participated in 20 sessions of NFT during the first four weeks, whereas the control group participated in the same NF sessions from the fifth to the eighth week of the project. Eight subjects in the intervention group and five in the control group completed the study. The mean and standard deviation of participants' age were 26.75 ± 15.16 years and 27.60 ± 8.17 years in the experimental and control groups, respectively. All subjects were male. No significant improvement was observed in any variables of the visual continuous attention test or the WMS-IV test between the two groups (p ≥ 0.05). Based on our literature review, ours appears to be the only study of the effect of NFT on TBI patients that included a control group. NFT had no effect on CA and STM in patients with moderate TBI. More RCTs with larger sample sizes, more treatment sessions, longer follow-up and different protocols are recommended. Copyright © 2017 Daping Hospital and the Research Institute of Surgery of the Third Military Medical University. Production and hosting by Elsevier B.V. All rights reserved.

  11. Vehicular traffic noise prediction using soft computing approach.

    PubMed

    Singh, Daljeet; Nigam, S P; Agrawal, V P; Kumar, Maneek

    2016-12-01

    A new approach for the development of vehicular traffic noise prediction models is presented. Four different soft computing methods, namely, Generalized Linear Model, Decision Trees, Random Forests and Neural Networks, have been used to develop models to predict the hourly equivalent continuous sound pressure level, Leq, at different locations in the Patiala city in India. The input variables include the traffic volume per hour, percentage of heavy vehicles and average speed of vehicles. The performance of the four models is compared on the basis of performance criteria of coefficient of determination, mean square error and accuracy. 10-fold cross validation is done to check the stability of the Random Forest model, which gave the best results. A t-test is performed to check the fit of the model with the field data. Copyright © 2016 Elsevier Ltd. All rights reserved.
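
    A minimal version of the best-performing setup described above, a Random Forest evaluated with 10-fold cross-validation, using synthetic stand-ins for the paper's inputs and target:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(4)

    # Hypothetical stand-ins for the paper's inputs: traffic volume per hour,
    # percentage of heavy vehicles, and average vehicle speed.
    X = np.column_stack([
        rng.uniform(200, 3000, 500),    # vehicles/hour
        rng.uniform(2, 40, 500),        # % heavy vehicles
        rng.uniform(20, 70, 500),       # average speed (km/h)
    ])
    # Synthetic Leq target (dB): purely illustrative, not the paper's data.
    leq = (50 + 8 * np.log10(X[:, 0]) + 0.08 * X[:, 1] + 0.05 * X[:, 2]
           + rng.normal(0, 1.5, 500))

    model = RandomForestRegressor(n_estimators=300, random_state=0)
    scores = cross_val_score(model, X, leq, cv=10, scoring="r2")   # 10-fold CV
    print(f"mean 10-fold R^2: {scores.mean():.3f}")
    ```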

  12. Transition in the decay rates of stationary distributions of Lévy motion in an energy landscape.

    PubMed

    Kaleta, Kamil; Lőrinczi, József

    2016-02-01

    The time evolution of random variables with Lévy statistics has the ability to develop jumps, displaying behavior very different from continuously fluctuating cases. Such patterns appear in an ever-broadening range of examples, including random lasers, non-Gaussian kinetics, and foraging strategies. The penalizing or reinforcing effect of the environment, however, has been little explored so far. We report a new phenomenon that manifests as a qualitative transition in the spatial decay behavior of the stationary measure of a jump process under an external potential, occurring upon a combined change in the characteristics of the process and the lowest eigenvalue resulting from the effect of the potential. This also provides insight into the fundamental question of what mechanism governs the spatial decay of a ground state.

  13. An exact solution of solute transport by one-dimensional random velocity fields

    USGS Publications Warehouse

    Cvetkovic, V.D.; Dagan, G.; Shapiro, A.M.

    1991-01-01

    The problem of one-dimensional transport of passive solute by a random steady velocity field is investigated. This problem is representative of solute movement in porous media, for example, in vertical flow through a horizontally stratified formation of variable porosity with a constant flux at the soil surface. Relating moments of particle travel time and displacement, exact expressions for the advection and dispersion coefficients in the Fokker-Planck equation are compared with the perturbation results for large distances. The first- and second-order approximations for the dispersion coefficient are robust for a lognormal velocity field. The mean Lagrangian velocity is the harmonic mean of the Eulerian velocity for large distances. This is an artifact of one-dimensional flow where the continuity equation provides for a divergence-free fluid flux, rather than a divergence-free fluid velocity. © 1991 Springer-Verlag.
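
    The harmonic-mean result is easy to verify numerically: with constant flux through a stack of layers, the travel time in each layer is Δx/v, so the effective (Lagrangian) velocity over a long column tends to the harmonic mean of the Eulerian values. A quick check under an assumed lognormal velocity field:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # 1-D column of layers with lognormal Eulerian velocities (illustrative field).
    n_layers, dx = 100_000, 1.0
    v = rng.lognormal(mean=0.0, sigma=0.8, size=n_layers)

    travel_time = np.sum(dx / v)                  # time to traverse all layers
    v_lagrangian = (n_layers * dx) / travel_time  # effective velocity = L / T
    v_harmonic = 1.0 / np.mean(1.0 / v)           # harmonic mean of sampled field

    # L / T equals the harmonic mean of the sampled velocities by construction,
    # and for long columns both converge to 1 / E[1/V] of the Eulerian law.
    print(v_lagrangian, v_harmonic)
    ```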

  14. Qualitatively Assessing Randomness in SVD Results

    NASA Astrophysics Data System (ADS)

    Lamb, K. W.; Miller, W. P.; Kalra, A.; Anderson, S.; Rodriguez, A.

    2012-12-01

    Singular Value Decomposition (SVD) is a powerful tool for identifying regions of significant co-variability between two spatially distributed datasets. SVD has been widely used in atmospheric research to define relationships between sea surface temperatures, geopotential height, wind, precipitation and streamflow data for myriad regions across the globe. A typical application for SVD is to identify leading climate drivers (as observed in the wind or pressure data) for a particular hydrologic response variable such as precipitation, streamflow, or soil moisture. One can also investigate the lagged relationship between a climate variable and the hydrologic response variable using SVD. When performing these studies it is important to limit the spatial bounds of the climate variable to reduce the chance of random co-variance relationships being identified. On the other hand, a climate region that is too small may ignore climate signals which have more than a statistical relationship to a hydrologic response variable. The proposed research seeks to identify a qualitative method of identifying random co-variability relationships between two data sets. The research identifies the heterogeneous correlation maps from several past results and compares these results with correlation maps produced using purely random and quasi-random climate data. The comparison identifies a methodology to determine if a particular region on a correlation map may be explained by a physical mechanism or is simply statistical chance.
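
    The core computation, an SVD of the cross-covariance between two anomaly fields yielding coupled modes and their squared-covariance fractions, can be sketched as follows (field names and sizes are invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # Hypothetical anomaly fields: n_time samples of two spatial grids.
    n_time, n_left, n_right = 200, 50, 40
    left = rng.normal(size=(n_time, n_left))     # e.g., SST anomalies
    right = rng.normal(size=(n_time, n_right))   # e.g., streamflow anomalies

    # Remove temporal means, then decompose the cross-covariance matrix.
    left = left - left.mean(axis=0)
    right = right - right.mean(axis=0)
    C = left.T @ right / (n_time - 1)            # cross-covariance (n_left x n_right)
    U, s, Vt = np.linalg.svd(C, full_matrices=False)

    # Expansion coefficients of the leading coupled mode, and its strength.
    a = left @ U[:, 0]
    b = right @ Vt[0, :]
    frac = s[0] ** 2 / np.sum(s ** 2)            # squared-covariance fraction
    print(f"leading mode explains {frac:.1%} of squared covariance")
    ```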

  15. Extraversion and cardiovascular responses to recurrent social stress: Effect of stress intensity.

    PubMed

    Lü, Wei; Xing, Wanying; Hughes, Brian M; Wang, Zhenhong

    2017-10-28

    The present study sought to establish whether the effects of extraversion on cardiovascular responses to recurrent social stress are contingent on stress intensity. A 2×5×1 mixed-factorial experiment was conducted, with social stress intensity as a between-subject variable, study phase as a within-subject variable, extraversion as a continuous independent variable, and cardiovascular parameter (HR, SBP, DBP, or RSA) as a dependent variable. Extraversion (NEO-FFI), subjective stress, and physiological stress were measured in 166 undergraduate students randomly assigned to undergo moderate (n=82) or high-intensity (n=84) social stress (a public speaking task with different levels of social evaluation). All participants underwent continuous physiological monitoring while facing two consecutive stress exposures distributed across five laboratory phases: baseline, stress exposure 1, post-stress 1, stress exposure 2, post-stress 2. Results indicated that under moderate-intensity social stress, participants higher on extraversion exhibited lower HR reactivity to stress than participants lower on extraversion, while under high-intensity social stress, they exhibited greater HR, SBP, DBP and RSA reactivity. Under both moderate- and high-intensity social stress, participants higher on extraversion exhibited pronounced SBP and DBP response adaptation to repeated stress, and showed either a better degree of HR recovery or a greater amount of SBP and DBP recovery after stress. These findings suggest that individuals higher on extraversion exhibit physiological flexibility to cope with social challenges and benefit from adaptive cardiovascular responses. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Influence of body position on the displacement of nasal prongs in preterm newborns receiving continuous positive airway pressure

    PubMed Central

    Brunherotti, Marisa Afonso Andrade; Martinez, Francisco Eulógio

    2015-01-01

    Objective: To evaluate the influence of body position on the displacement of nasal prongs in preterm infants. Methods: This prospective, randomized, crossover study enrolled infants born at a mean gestational age of 29.7±2 weeks, with birth weight of 1353±280 g and 2.9±2.2 days of life, submitted to continuous positive airway pressure by nasal prongs. The main outcome was the number of times the nasal prongs were displaced after the infant was placed in each of the following body positions: prone, right lateral, left lateral, and supine, according to a pre-established random order. Moreover, cardiorespiratory variables (respiratory rate, heart rate, and oxygen saturation) were evaluated for each body position. Data for each position were collected every 10 min, over a period of 60 min. An occurrence was defined as displacement of the nasal prongs from the nostrils after 3 min in the desired position, requiring intervention of the examiner. Results: Among the 16 studied infants, displacement of the nasal prongs was observed only in the prone position (9 infants - 56.2%) and in the left lateral position (2 infants - 12.5%). The prongs were displaced 11 times in the prone position (7 within the first 10 min) and twice in the left lateral position (1 within the first 10 min). No clinically significant changes were observed in the cardiorespiratory variables. Conclusions: Maintaining the nasal prongs to provide adequate noninvasive respiratory support was harder in the prone position. PMID:26116326

  17. Comparative effects of a contraceptive vaginal ring delivering a nonandrogenic progestin and continuous ethinyl estradiol and a combined oral contraceptive containing levonorgestrel on hemostasis variables.

    PubMed

    Rad, Mandana; Kluft, Cornelis; Ménard, Joël; Burggraaf, Jacobus; de Kam, Marieke L; Meijer, Piet; Sivin, Irving; Sitruk-Ware, Regine L

    2006-07-01

    This study aimed to compare the effects on hemostasis variables of a contraceptive vaginal ring with those of an oral contraceptive. Twenty-three and 22 healthy premenopausal women were randomized to the contraceptive vaginal ring (150 microg Nestorone and 15 microg ethinyl estradiol) or Stediril 30 during 3 cycles. Analysis of covariance was performed with baseline values as covariate. The contraceptive vaginal ring changed most hemostasis variables similarly but raised (95% confidence intervals of percent treatment differences) Factor VIIt (28% to 49%), extrinsic activated protein C resistance (14% to 65%), and sex hormone-binding globulin (117% to 210%) and lowered Protein S (-32% to -16%) and the global activated partial thromboplastin time-based activated protein C resistance (-12% to -2%) more than the oral contraceptive. The contraceptive vaginal ring affected some measured hemostasis variables and sex hormone-binding globulin differently from the oral contraceptive, most likely because of difference in androgenicity of the progestins. The results suggest that the contraindications for oral contraceptive use would also apply to the tested contraceptive vaginal ring.

  18. The effects of free-living interval-walking training on glycemic control, body composition, and physical fitness in type 2 diabetic patients: a randomized, controlled trial.

    PubMed

    Karstoft, Kristian; Winding, Kamilla; Knudsen, Sine H; Nielsen, Jens S; Thomsen, Carsten; Pedersen, Bente K; Solomon, Thomas P J

    2013-02-01

    To evaluate the feasibility of free-living walking training in type 2 diabetic patients and to investigate the effects of interval-walking training versus continuous-walking training upon physical fitness, body composition, and glycemic control. Subjects with type 2 diabetes were randomized to a control (n = 8), continuous-walking (n = 12), or interval-walking group (n = 12). Training groups were prescribed five sessions per week (60 min/session) and were controlled with an accelerometer and a heart-rate monitor. Continuous walkers performed all training at moderate intensity, whereas interval walkers alternated 3-min repetitions at low and high intensity. Before and after the 4-month intervention, the following variables were measured: VO(2)max, body composition, and glycemic control (fasting glucose, HbA(1c), oral glucose tolerance test, and continuous glucose monitoring [CGM]). Training adherence was high (89 ± 4%), and training energy expenditure and mean intensity were comparable. VO(2)max increased 16.1 ± 3.7% in the interval-walking group (P < 0.05), whereas no changes were observed in the continuous-walking or control group. Body mass and adiposity (fat mass and visceral fat) decreased in the interval-walking group only (P < 0.05). Glycemic control (elevated mean CGM glucose levels and increased fasting insulin) worsened in the control group (P < 0.05), whereas mean (P = 0.05) and maximum (P < 0.05) CGM glucose levels decreased in the interval-walking group. The continuous walkers showed no changes in glycemic control. Free-living walking training is feasible in type 2 diabetic patients. Continuous walking offsets the deterioration in glycemia seen in the control group, and interval walking is superior to energy expenditure-matched continuous walking for improving physical fitness, body composition, and glycemic control.

  19. Anomalous diffusion on a random comblike structure

    NASA Astrophysics Data System (ADS)

    Havlin, Shlomo; Kiefer, James E.; Weiss, George H.

    1987-08-01

    We have recently studied a random walk on a comblike structure as an analog of diffusion on a fractal structure. In our earlier work, the comb was assumed to have a deterministic structure, with teeth of infinite length. In the present paper we study diffusion on a one-dimensional random comb, the lengths of whose teeth are random variables with an asymptotic stable-law distribution φ(L) ~ L^{-(1+γ)}, where 0 < γ ≤ 1. Two mean-field methods are used for the analysis, one based on the continuous-time random walk, and the second a self-consistent scaling theory. Both lead to the same conclusions. We find that the diffusion exponent characterizing the mean-square displacement along the backbone of the comb is d_w = 4/(1+γ) for γ < 1 and d_w = 2 for γ ≥ 1. The probability of being at the origin at time t is P_0(t) ~ t^{-d_s/2} for large t, with d_s = (3−γ)/2 for γ < 1 and d_s = 1 for γ > 1. When a field is applied along the backbone of the comb, the diffusion exponent is d_w = 2/(1+γ) for γ < 1 and d_w = 1 for γ ≥ 1. The theoretical results are confirmed using the exact enumeration method.

  1. Design approaches to experimental mediation.

    PubMed

    Pirlott, Angela G; MacKinnon, David P

    2016-09-01

    Identifying causal mechanisms has become a cornerstone of experimental social psychology, and editors in top social psychology journals champion the use of mediation methods, particularly innovative ones when possible (e.g. Halberstadt, 2010, Smith, 2012). Commonly, studies in experimental social psychology randomly assign participants to levels of the independent variable and measure the mediating and dependent variables, and the mediator is assumed to causally affect the dependent variable. However, participants are not randomly assigned to levels of the mediating variable(s), i.e., the relationship between the mediating and dependent variables is correlational. Although researchers likely know that correlational studies pose a risk of confounding, this problem seems forgotten when thinking about experimental designs randomly assigning participants to levels of the independent variable and measuring the mediator (i.e., "measurement-of-mediation" designs). Experimentally manipulating the mediator provides an approach to solving these problems, yet these methods contain their own set of challenges (e.g., Bullock, Green, & Ha, 2010). We describe types of experimental manipulations targeting the mediator (manipulations demonstrating a causal effect of the mediator on the dependent variable and manipulations targeting the strength of the causal effect of the mediator) and types of experimental designs (double randomization, concurrent double randomization, and parallel), provide published examples of the designs, and discuss the strengths and challenges of each design. Therefore, the goals of this paper include providing a practical guide to manipulation-of-mediator designs in light of their challenges and encouraging researchers to use more rigorous approaches to mediation because manipulation-of-mediator designs strengthen the ability to infer causality of the mediating variable on the dependent variable.

  2. The effect of dexmedetomidine continuous infusion as an adjuvant to general anesthesia on sevoflurane requirements: A study based on entropy analysis.

    PubMed

    Patel, Chirag Ramanlal; Engineer, Smita R; Shah, Bharat J; Madhu, S

    2013-07-01

    Dexmedetomidine, an α2 agonist used as an adjuvant in general anesthesia, has anesthetic- and analgesic-sparing properties. To evaluate the effect of continuous infusion of dexmedetomidine alone, without opioids, on the requirement of sevoflurane during general anesthesia, with continuous monitoring of depth of anesthesia by entropy analysis. Sixty patients were randomly divided into 2 groups of 30 each. In group A, fentanyl 2 mcg/kg was given, while in group B, dexmedetomidine was given intravenously as a loading dose of 1 mcg/kg over 10 min prior to induction. After induction with thiopentone, dexmedetomidine was given in group B as an infusion at a dose of 0.2-0.8 mcg/kg. Sevoflurane was used as the inhalation agent in both groups. Hemodynamic variables, sevoflurane inspired fraction (FIsevo), sevoflurane expired fraction (ETsevo), and entropy (response entropy and state entropy) were continuously recorded. Statistical analysis was done by unpaired Student's t-test and Chi-square test for continuous and categorical variables, respectively. A P-value < 0.05 was considered significant. The use of dexmedetomidine with sevoflurane was associated with a statistically significant decrease in ETsevo at 5 minutes post-intubation (1.49 ± 0.11) and 60 minutes post-intubation (1.11 ± 0.28) as compared to group A [1.73 ± 0.30 (5 minutes); 1.68 ± 0.50 (60 minutes)]. There was an average 21.5% decrease in ETsevo in group B as compared to group A. Dexmedetomidine, as an adjuvant in general anesthesia, decreases the requirement of sevoflurane for maintaining an adequate depth of anesthesia.

  3. Continuous-time quantum random walks require discrete space

    NASA Astrophysics Data System (ADS)

    Manouchehri, K.; Wang, J. B.

    2007-11-01

    Quantum random walks are shown to have non-intuitive dynamics which make them an attractive area of study for devising quantum algorithms for long-standing open problems as well as those arising in the field of quantum computing. In the case of continuous-time quantum random walks, such peculiar dynamics can arise from simple evolution operators closely resembling the quantum free-wave propagator. We investigate the divergence of quantum walk dynamics from the free-wave evolution and show that, in order for continuous-time quantum walks to display their characteristic propagation, the state space must be discrete. This behavior rules out many continuous quantum systems as possible candidates for implementing continuous-time quantum random walks.
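
    The continuous-time quantum walk itself is compact to state: |ψ(t)⟩ = e^(−iHt)|ψ(0)⟩, with H commonly taken as the graph adjacency matrix (a usual convention, assumed here). A sketch on a cycle of discrete sites:

    ```python
    import numpy as np
    from scipy.linalg import expm

    # Continuous-time quantum walk on a cycle of N sites.
    N = 64
    H = np.zeros((N, N))
    for j in range(N):                       # adjacency matrix of the ring
        H[j, (j + 1) % N] = H[(j + 1) % N, j] = 1.0

    psi0 = np.zeros(N, dtype=complex)
    psi0[N // 2] = 1.0                       # walker starts at the middle site

    t = 10.0
    psi_t = expm(-1j * t * H) @ psi0         # |psi(t)> = exp(-iHt)|psi(0)>
    prob = np.abs(psi_t) ** 2                # characteristic ballistic spreading
    print(prob.argmax(), prob.max())
    ```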

  4. Defining fitness in an uncertain world.

    PubMed

    Crewe, Paul; Gratwick, Richard; Grafen, Alan

    2018-04-01

    The recently elucidated definition of fitness employed by Fisher in his fundamental theorem of natural selection is combined with reproductive values as appropriately defined in the context of both random environments and continuing fluctuations in the distribution over classes in a class-structured population. We obtain astonishingly simple results, generalisations of the Price Equation and the fundamental theorem, that show natural selection acting only through the arithmetic expectation of fitness over all uncertainties, in contrast to previous studies with fluctuating demography, in which natural selection looks rather complicated. Furthermore, our setting permits each class to have its characteristic ploidy, thus covering haploidy, diploidy and haplodiploidy at the same time; and allows arbitrary classes, including continuous variables such as condition. The simplicity is achieved by focussing just on the effects of natural selection on genotype frequencies: while other causes are present in the model, and the effect of natural selection is assessed in their presence, these causes will have their own further effects on genotype frequencies that are not assessed here. Also, Fisher's uses of reproductive value are shown to have two ambivalences, and a new axiomatic foundation for reproductive value is endorsed. The results continue the formal darwinism project, and extend support for the individual-as-maximising-agent analogy to finite populations with random environments and fluctuating class-distributions. The model may also lead to improved ways to measure fitness in real populations.

  5. Sensitivity subgroup analysis based on single-center vs. multi-center trial status when interpreting meta-analyses pooled estimates: the logical way forward.

    PubMed

    Alexander, Paul E; Bonner, Ashley J; Agarwal, Arnav; Li, Shelly-Anne; Hariharan, Abishek; Izhar, Zain; Bhatnagar, Neera; Alba, Carolina; Akl, Elie A; Fei, Yutong; Guyatt, Gordon H; Beyene, Joseph

    2016-06-01

    Prior studies regarding whether single-center trial estimates are larger than multi-center estimates are equivocal. We examined the extent to which single-center trials yield systematically larger effects than multi-center trials. We searched the 119 core clinical journals and the Cochrane Database of Systematic Reviews for meta-analyses (MAs) of randomized controlled trials (RCTs) published during 2012. In this meta-epidemiologic study we computed, for binary outcomes, the pooled ratio of odds ratios (RORs) and, for continuous outcomes, the mean difference in standardized mean differences (SMDs), using weighted random-effects meta-regression and random-effects MA modeling. Our primary analyses were restricted to MAs that included at least five RCTs and in which at least 25% of the studies used each of the single-center (SC) and multi-center (MC) designs. We identified 81 MAs for the odds ratio (OR) and 43 for the SMD outcome measures. Based on our analytic plan, our primary (core) analysis is based on 25 MAs/241 RCTs (binary outcome) and 18 MAs/173 RCTs (continuous outcome). Based on the core analysis, we found no difference in magnitude of effect between SC and MC for binary outcomes [RORs: 1.02; 95% confidence interval (CI): 0.83, 1.24; I(2) 20.2%]. Effect sizes were systematically larger for SC than MC for the continuous outcome measure (mean difference in SMDs: -0.13; 95% CI: -0.21, -0.05; I(2) 0%). Our results do not support prior findings of larger effects in SC than MC trials addressing binary outcomes but show a very similar small increase in effect in SC relative to MC trials addressing continuous outcomes. Authors of systematic reviews would be wise to include all trials irrespective of SC vs. MC design and address SC vs. MC status as a possible explanation of heterogeneity (and consider sensitivity analyses). Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Chance vs. necessity in living systems: a false antinomy.

    PubMed

    Buiatti, Marcello; Buiatti, Marco

    2008-01-01

    The concepts of order and randomness are crucial to understanding living systems' structural and dynamical rules. In the history of biology, they lay behind the everlasting debate on the relative roles of chance and determinism in evolution. Jacques Monod [1970] built a theory where chance (randomness) and determinism (order) were considered as two complementary aspects of life. In the present paper, we will give an up-to-date version of the problem, going beyond the dichotomy between chance and determinism. To this end, we will first see how the view on living systems has evolved from the mechanistic one of the 19th century to the one stemming from the most recent literature, where they emerge as complex systems continuously evolving through multiple interactions among their components and with the surrounding environment. We will then report on the ever-increasing evidence of "friendly" co-existence in living beings between a number of "variability generators", fixed by evolution, and the "spontaneous order" derived from interactions between components. We will propose that the "disorder" generated is "benevolent" because it allows living systems to rapidly adapt to changes in the environment by continuously changing, while keeping their internal harmony.

  7. Compliance-Effect Correlation Bias in Instrumental Variables Estimators

    ERIC Educational Resources Information Center

    Reardon, Sean F.

    2010-01-01

    Instrumental variable estimators hold the promise of enabling researchers to estimate the effects of educational treatments that are not (or cannot be) randomly assigned but that may be affected by randomly assigned interventions. Examples of the use of instrumental variables in such cases are increasingly common in educational and social science…

  8. Reliability Coupled Sensitivity Based Design Approach for Gravity Retaining Walls

    NASA Astrophysics Data System (ADS)

    Guha Ray, A.; Baidya, D. K.

    2012-09-01

    Sensitivity analysis involving different random variables and different potential failure modes of a gravity retaining wall highlights the fact that high sensitivity of a particular variable for a particular mode of failure does not necessarily imply a remarkable contribution to the overall failure probability. The present paper aims at identifying a probabilistic risk factor (R_f) for each random variable based on the combined effects of the failure probability (P_f) of each mode of failure of a gravity retaining wall and the sensitivity of each random variable for these failure modes. P_f is calculated by Monte Carlo simulation, and the sensitivity of each random variable is assessed by F-test analysis. The structure, redesigned by modifying the original random variables with the risk factors, is safe against all the variations of the random variables. It is observed that R_f for the friction angle of the backfill soil (φ₁) increases, and that for the cohesion of the foundation soil (c₂) decreases, with increasing variation of φ₁, while R_f for the unit weights (γ₁ and γ₂) of both soils and for the friction angle of the foundation soil (φ₂) remains almost constant under variation of the soil properties. The results compare well with some existing deterministic and probabilistic methods and are found to be cost-effective. It is seen that if the variation of φ₁ remains within 5%, a significant reduction in cross-sectional area can be achieved, but if the variation exceeds 7-8%, the structure needs to be modified. Finally, design guidelines for different wall dimensions, based on the present approach, are proposed.
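
    To make the Monte Carlo step concrete, here is a minimal Python sketch estimating the failure probability P_f of a single (hypothetical) sliding limit state; the distributions, wall geometry, and limit-state function are invented for illustration and are not the paper's.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000

    # Hypothetical random variables for a sliding check of a gravity wall
    phi1 = np.radians(rng.normal(32.0, 2.0, n))  # backfill friction angle
    gamma1 = rng.normal(18.0, 0.9, n)            # backfill unit weight, kN/m^3
    c2 = rng.normal(20.0, 4.0, n)                # foundation cohesion, kPa

    W, H, B = 150.0, 5.0, 2.5                    # weight kN/m, height m, base m
    Ka = (1 - np.sin(phi1)) / (1 + np.sin(phi1)) # Rankine active coefficient
    Pa = 0.5 * Ka * gamma1 * H**2                # active thrust, kN/m
    resistance = W * np.tan(phi1) + c2 * B       # base friction + adhesion
    g = resistance - Pa                          # g < 0 defines failure

    print(f"estimated P_f ~ {np.mean(g < 0.0):.4f}")
    ```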

  9. δ-exceedance records and random adaptive walks

    NASA Astrophysics Data System (ADS)

    Park, Su-Chan; Krug, Joachim

    2016-08-01

    We study a modified record process where the kth record in a series of independent and identically distributed random variables is defined recursively through the condition Y_k > Y_{k-1} - δ_{k-1}, with a deterministic sequence δ_k > 0 called the handicap. For constant δ_k ≡ δ and exponentially distributed random variables it has been shown in previous work that the process displays a phase transition as a function of δ between a normal phase where the mean record value increases indefinitely and a stationary phase where the mean record value remains bounded and a finite fraction of all entries are records (Park et al 2015 Phys. Rev. E 91 042707). Here we explore the behavior for general probability distributions and for decreasing and increasing sequences δ_k, focusing in particular on the case when δ_k matches the typical spacing between subsequent records in the underlying simple record process without handicap. We find that a continuous phase transition occurs only in the exponential case, but a novel kind of first-order transition emerges when δ_k is increasing. The problem is partly motivated by the dynamics of evolutionary adaptation in biological fitness landscapes, where δ_k corresponds to the change of the deterministic fitness component after k mutational steps. The results for the record process are used to compute the mean number of steps that a population performs in such a landscape before being trapped at a local fitness maximum.
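
    The record process itself is easy to simulate; the sketch below (our own illustration, not the authors' code) counts δ-exceedance records among iid Exp(1) entries for a constant handicap and shows how the record fraction changes with δ.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def record_fraction(n, delta):
        """Fraction of delta-exceedance records in n iid Exp(1) entries:
        a new record requires Y_k > (current record value) - delta."""
        y = rng.exponential(size=n)
        count, current = 1, y[0]
        for value in y[1:]:
            if value > current - delta:
                count += 1
                current = value
        return count / n

    # Sweeping delta illustrates the qualitative change in the record
    # fraction discussed in the abstract.
    for delta in (0.5, 1.0, 2.0):
        print(f"delta={delta}: fraction ~ {record_fraction(200_000, delta):.4f}")
    ```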

  10. Efficacy and safety of regional citrate anticoagulation in critically ill patients undergoing continuous renal replacement therapy.

    PubMed

    Zhang, Zhongheng; Hongying, Ni

    2012-01-01

    Regional citrate anticoagulation (RCA) is an attractive anticoagulation mode in continuous renal replacement therapy (CRRT) because it restricts the anticoagulatory effect to the extracorporeal circuit. In recent years, several randomized controlled trials have been conducted to investigate its superiority over other anticoagulation modes. Thus, we performed a systematic review of the available evidence on the efficacy and safety of RCA. A systematic review of randomized controlled trials investigating the efficacy and safety of RCA was performed. PubMed, Current Contents, CINAHL, and EMBASE databases were searched to identify relevant articles. Data on circuit life span, bleeding events, metabolic derangement, and mortality were abstracted. Mean difference was used for continuous variables, and risk ratio was used for binary variables. The random-effects or fixed-effect model was used to combine these data according to heterogeneity. The software Review Manager 5.1 was used for the meta-analysis. Six studies met our inclusion criteria, involving a total of 658 circuits; these studies excluded patients with liver failure or a high risk of bleeding. The circuit life span in the RCA group was significantly longer than that in the control group, with a mean difference of 23.03 h (95% CI 0.45-45.61 h). RCA was able to reduce the risk of bleeding, with a risk ratio of 0.28 (95% CI 0.15-0.50). Metabolic stability (electrolyte and acid-base stability) with RCA was comparable to that with other anticoagulation modes, and metabolic derangements (hypernatremia, metabolic alkalosis, and hypocalcemia) could be easily controlled without significant clinical consequences. Two studies compared the mortality rate between the RCA and control groups, one reporting a similar mortality rate and the other reporting superiority of RCA over the control group (hazard ratio 0.7). RCA is effective in maintaining circuit patency and reducing the risk of bleeding, and thus can be recommended for CRRT if and when metabolic monitoring is adequate and the protocol is followed. However, the safety of citrate in patients with liver failure cannot be concluded from the current analysis. Survival benefit from RCA is still controversial due to limited evidence.

  11. Anderson localization for radial tree-like random quantum graphs

    NASA Astrophysics Data System (ADS)

    Hislop, Peter D.; Post, Olaf

    We prove that certain random models associated with radial, tree-like, rooted quantum graphs exhibit Anderson localization at all energies. The two main examples are the random length model (RLM) and the random Kirchhoff model (RKM). In the RLM, the lengths of each generation of edges form a family of independent, identically distributed random variables (iid). For the RKM, the iid random variables are associated with each generation of vertices and moderate the current flow through the vertex. We consider extensions to various families of decorated graphs and prove stability of localization with respect to decoration. In particular, we prove Anderson localization for the random necklace model.

  12. The Effect of Head Massage on the Regulation of the Cardiac Autonomic Nervous System: A Pilot Randomized Crossover Trial.

    PubMed

    Fazeli, Mir Sohail; Pourrahmat, Mir-Masoud; Liu, Mailan; Guan, Ling; Collet, Jean-Paul

    2016-01-01

    To evaluate the effect of a single 10-minute session of Chinese head massage on the activity of the cardiac autonomic nervous system via measurement of heart rate variability (HRV). In this pilot randomized crossover trial, each participant received both head massage and the control intervention in randomized order. The study was conducted at the Children's & Women's Health Centre of British Columbia between June and November 2014. Ten otherwise healthy adults (6 men and 4 women) were enrolled. The intervention comprised 10 minutes of head massage therapy (HMT) in a seated position, compared with a control intervention of sitting quietly on the same chair with eyes closed for an equal amount of time (no HMT). The primary outcome measures were the main parameters of HRV, including total power (TP), high frequency (HF), HF as a normalized unit, pre-ejection period, and heart rate (HR). A single short session (10 minutes) of head massage produced an increase in TP that continued up to 20 minutes after massage and reached statistical significance at 10 minutes after massage (relative change from baseline, 66% for HMT versus -6.6% for no HMT; p = 0.017). The effect on HF also peaked at 10 minutes after massage (59.4% for HMT versus 4% for no HMT; p = 0.139). Head massage also decreased HR more than three times as much as the control intervention did. This study shows the potential benefits of head massage in modulating the cardiac autonomic nervous system through an increase in total variability and a shift toward higher parasympathetic nervous system activity. Randomized controlled trials with larger sample sizes and multiple sessions of massage are needed to substantiate these findings.

  13. A computer model of molecular arrangement in a n-paraffinic liquid

    NASA Astrophysics Data System (ADS)

    Vacatello, Michele; Avitabile, Gustavo; Corradini, Paolo; Tuzi, Angela

    1980-07-01

    A computer model of a bulk liquid polymer was built to investigate the problem of local order. The model is made of C30 n-alkane molecules; it is not a lattice model, but it allows for a continuous variability of torsion angles and interchain distances, subject to realistic intra- and intermolecular potentials. Experimental x-ray scattering curves and radial distribution functions are well reproduced. Calculated properties like end-to-end distances, distribution of torsion angles, radial distribution functions, and chain direction correlation parameters, all indicate a random coil conformation and no tendency to form bundles of parallel chains.

  14. Kinematic Methods of Designing Free Form Shells

    NASA Astrophysics Data System (ADS)

    Korotkiy, V. A.; Khmarova, L. I.

    2017-11-01

    The geometrical shell model is formed in light of the set requirements expressed through surface parameters. The shell is modelled using the kinematic method, according to which the shell is formed as a continuous one-parameter set of curves. The authors offer a kinematic method based on the use of second-order curves with variable eccentricity as the form-making element. Additional guiding ruled surfaces are used to control the form of the designed surface. The authors developed a software application that plots a second-order curve specified by an arbitrary set of five coplanar points and tangents.

  15. Unbiased split variable selection for random survival forests using maximally selected rank statistics.

    PubMed

    Wright, Marvin N; Dankowski, Theresa; Ziegler, Andreas

    2017-04-15

    The most popular approach for analyzing survival data is the Cox regression model. The Cox model may, however, be misspecified, and its proportionality assumption may not always be fulfilled. An alternative approach for survival prediction is random forests for survival outcomes. The standard split criterion for random survival forests is the log-rank test statistic, which favors splitting variables with many possible split points. Conditional inference forests avoid this split variable selection bias. However, linear rank statistics are utilized by default in conditional inference forests to select the optimal splitting variable, which cannot detect non-linear effects in the independent variables. An alternative is to use maximally selected rank statistics for the split point selection. As in conditional inference forests, splitting variables are compared on the p-value scale. However, instead of the conditional Monte-Carlo approach used in conditional inference forests, p-value approximations are employed. We describe several p-value approximations and the implementation of the proposed random forest approach. A simulation study demonstrates that unbiased split variable selection is possible. However, there is a trade-off between unbiased split variable selection and runtime. In benchmark studies of prediction performance on simulated and real datasets, the new method performs better than random survival forests if informative dichotomous variables are combined with uninformative variables with more categories and better than conditional inference forests if non-linear covariate effects are included. In a runtime comparison, the method proves to be computationally faster than both alternatives, if a simple p-value approximation is used. Copyright © 2017 John Wiley & Sons, Ltd.
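
    To show the core idea of a maximally selected statistic, the toy Python sketch below scans candidate cutpoints of one splitting variable and keeps the largest standardized rank-sum statistic; it uses uncensored data and a Wilcoxon-type statistic for brevity, whereas the method above uses log-rank scores and dedicated p-value approximations.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    x = rng.normal(size=100)   # candidate split variable
    y = rng.normal(size=100)   # outcome (toy, uncensored)

    ranks = stats.rankdata(y)
    n = len(ranks)
    best = 0.0
    for cut in np.quantile(x, np.linspace(0.1, 0.9, 17)):
        left = ranks[x <= cut]                  # rank sum left of the cut
        n1 = len(left)
        mean = n1 * (n + 1) / 2.0               # null mean of the rank sum
        var = n1 * (n - n1) * (n + 1) / 12.0    # null variance
        best = max(best, abs(left.sum() - mean) / np.sqrt(var))

    # A p-value approximation (e.g., of the Lausen-Schumacher type) would
    # then correct `best` for the multiple cutpoints examined.
    print(f"maximally selected |z| = {best:.2f}")
    ```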

  16. Simulation of multivariate stationary stochastic processes using dimension-reduction representation methods

    NASA Astrophysics Data System (ADS)

    Liu, Zhangjun; Liu, Zenghui; Peng, Yongbo

    2018-03-01

    In view of the Fourier-Stieltjes integral formula for multivariate stationary stochastic processes, a unified formulation accommodating the spectral representation method (SRM) and proper orthogonal decomposition (POD) is deduced. By introducing random functions as constraints correlating the orthogonal random variables involved in the unified formulation, the dimension-reduction spectral representation method (DR-SRM) and the dimension-reduction proper orthogonal decomposition (DR-POD) are addressed. The proposed schemes are capable of representing the multivariate stationary stochastic process with a few elementary random variables, bypassing the challenge of the high-dimensional random variables inherent in conventional Monte Carlo methods. In order to accelerate the numerical simulation, the technique of the Fast Fourier Transform (FFT) is integrated with the proposed schemes. For illustrative purposes, the simulation of the horizontal wind velocity field along the deck of a large-span bridge is carried out using the proposed methods with 2 and 3 elementary random variables. The numerical simulations reveal the usefulness of the dimension-reduction representation methods.
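
    As background for the FFT step, here is a univariate toy version of the spectral representation method (our sketch, with an invented target spectrum; the paper's schemes are multivariate and dimension-reduced):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N, dw = 2**12, 0.01                   # frequency components, step (rad/s)
    w = np.arange(N) * dw
    S = 1.0 / (1.0 + w**4)                # hypothetical one-sided spectrum

    phi = rng.uniform(0.0, 2.0 * np.pi, N)        # independent random phases
    A = np.sqrt(2.0 * S * dw) * np.exp(1j * phi)  # complex amplitudes
    x = np.real(np.fft.ifft(A) * N)               # sample function via one FFT

    print(f"sample variance = {x.var():.3f}, "
          f"target variance = {(S * dw).sum():.3f}")
    ```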

  17. Generating Variable and Random Schedules of Reinforcement Using Microsoft Excel Macros

    ERIC Educational Resources Information Center

    Bancroft, Stacie L.; Bourret, Jason C.

    2008-01-01

    Variable reinforcement schedules are used to arrange the availability of reinforcement following varying response ratios or intervals of time. Random reinforcement schedules are subtypes of variable reinforcement schedules that can be used to arrange the availability of reinforcement at a constant probability across number of responses or time.…
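
    The distinction between the two schedule types is easy to express in code. The sketch below generates response requirements for a variable-ratio and a random-ratio schedule in Python rather than the article's Excel macros; the mean ratio and spread are arbitrary illustrative choices.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def variable_ratio(mean_ratio, n, spread=0.5):
        """Variable-ratio: requirements drawn from a band around the mean."""
        lo = max(1, round(mean_ratio * (1 - spread)))
        hi = round(mean_ratio * (1 + spread))
        return rng.integers(lo, hi + 1, size=n)

    def random_ratio(mean_ratio, n):
        """Random-ratio: constant reinforcement probability per response,
        i.e. a geometric number of responses per reinforcer."""
        return rng.geometric(1.0 / mean_ratio, size=n)

    print("VR-5 requirements:", variable_ratio(5, 10))
    print("RR-5 requirements:", random_ratio(5, 10))
    ```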

  18. Evaluation of occupational health interventions using a randomized controlled trial: challenges and alternative research designs.

    PubMed

    Schelvis, Roosmarijn M C; Oude Hengel, Karen M; Burdorf, Alex; Blatter, Birgitte M; Strijk, Jorien E; van der Beek, Allard J

    2015-09-01

    Occupational health researchers regularly conduct evaluative intervention research for which a randomized controlled trial (RCT) may not be the most appropriate design (e.g., effects of policy measures, organizational interventions on work schedules). This article demonstrates the appropriateness of alternative designs that permit causal inferences for the evaluation of occupational health interventions, grouped into two study design approaches: experimental (stepped-wedge) and observational (propensity scores, instrumental variables, multiple baseline design, interrupted time series, difference-in-differences, and regression discontinuity). For each design, the unique characteristics are presented, including the advantages and disadvantages compared to the RCT, illustrated by empirical examples in occupational health. This overview shows that several appropriate alternatives to the RCT design are feasible and available, which may provide sufficiently strong evidence to guide decisions on the implementation of interventions in workplaces. Researchers are encouraged to continue exploring these designs and thus contribute to evidence-based occupational health.

  19. Optimization Of Mean-Semivariance-Skewness Portfolio Selection Model In Fuzzy Random Environment

    NASA Astrophysics Data System (ADS)

    Chatterjee, Amitava; Bhattacharyya, Rupak; Mukherjee, Supratim; Kar, Samarjit

    2010-10-01

    The purpose of this paper is to construct a mean-semivariance-skewness portfolio selection model in a fuzzy random environment. The objective is to maximize the skewness with a predefined maximum risk tolerance and minimum expected return. Here the security returns in the objectives and constraints are assumed to be fuzzy random variables, and the vagueness of these fuzzy random variables is transformed into fuzzy variables similar to trapezoidal numbers. The resulting fuzzy model is then converted into a deterministic optimization model. The feasibility and effectiveness of the proposed method are verified by a numerical example based on data from the Bombay Stock Exchange (BSE). The exact parameters of the fuzzy membership function and probability density function are obtained through fuzzy random simulation of past data.

  20. Improvement in latent variable indirect response joint modeling of a continuous and a categorical clinical endpoint in rheumatoid arthritis.

    PubMed

    Hu, Chuanpu; Zhou, Honghui

    2016-02-01

    Improving the quality of exposure-response modeling is important in clinical drug development. The general joint modeling of multiple endpoints is made possible in part by recent progress on latent variable indirect response (IDR) modeling for ordered categorical endpoints. This manuscript aims to investigate, when modeling a continuous and a categorical clinical endpoint, the level of improvement achievable by joint modeling in the latent variable IDR framework through the sharing of model parameters between the individual endpoints, guided by an appropriate representation of drug and placebo mechanisms. This was illustrated with data from two phase III clinical trials of intravenously administered mAb X for the treatment of rheumatoid arthritis, in which the 28-joint disease activity score (DAS28) and 20, 50, and 70% improvement in the American College of Rheumatology disease severity criteria (ACR20, ACR50, and ACR70) were used as efficacy endpoints. The joint modeling framework led to a parsimonious final model with reasonable performance, evaluated by visual predictive check. The results showed that, compared with the more common approach of modeling the endpoints separately, the joint model can be more parsimonious and yet better describe the individual endpoints. In particular, the joint model may better describe one endpoint through subject-specific random effects that would not have been estimable from the data of that endpoint alone.

  1. Effects of carprofen on renal function during medetomidine-propofol-isoflurane anesthesia in dogs.

    PubMed

    Frendin, Jan H M; Boström, Ingrid M; Kampa, Naruepon; Eksell, Per; Häggström, Jens U; Nyman, Görel C

    2006-12-01

    To investigate the effects of carprofen on indices of renal function, results of serum biochemical analyses, and cardiovascular variables during medetomidine-propofol-isoflurane anesthesia in dogs. 8 healthy male Beagles. A randomized crossover study was conducted with treatments consisting of saline (0.9% NaCl) solution (0.08 mL/kg) and carprofen (4 mg/kg) administered IV. Saline solution or carprofen was administered 30 minutes before induction of anesthesia and immediately before administration of medetomidine (20 microg/kg, IM). Anesthesia was induced with propofol and maintained with inspired isoflurane in oxygen. Blood gas concentrations and ventilation were measured. Cardiovascular variables were continuously monitored via pulse contour cardiac output (CO) measurement. Renal function was assessed via glomerular filtration rate (GFR), renal blood flow (RBF), scintigraphy, serum biochemical analyses, urinalysis, and continuous CO measurements. Hematologic analysis was performed. Values did not differ significantly between the carprofen and saline solution groups. For both treatments, sedation and anesthesia caused changes in the results of serum biochemical and hematologic analyses; a transient, significant increase in urine alkaline phosphatase activity; and diversion of blood flow to the kidneys. The GFR increased significantly in both groups despite decreases in CO, mean arterial pressure, and absolute RBF during anesthesia. Carprofen administered IV before anesthesia did not cause detectable, significant adverse effects on renal function during medetomidine-propofol-isoflurane anesthesia in healthy Beagles.

  2. Random Variables: Simulations and Surprising Connections.

    ERIC Educational Resources Information Center

    Quinn, Robert J.; Tomlinson, Stephen

    1999-01-01

    Features activities for advanced second-year algebra students in grades 11 and 12. Introduces three random variables and considers an empirical and theoretical probability for each. Uses coins, regular dice, decahedral dice, and calculators. (ASK)
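
    In the same spirit as the classroom activities, a few lines of Python (our illustration) compare the empirical and theoretical probability of one dice-based random variable:

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    # Sum of two fair dice: empirical vs. theoretical P(sum = 7)
    rolls = rng.integers(1, 7, size=(100_000, 2)).sum(axis=1)
    print(f"empirical {np.mean(rolls == 7):.4f} vs. theoretical {6/36:.4f}")
    ```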

  3. Binomial leap methods for simulating stochastic chemical kinetics.

    PubMed

    Tian, Tianhai; Burrage, Kevin

    2004-12-01

    This paper discusses efficient simulation methods for stochastic chemical kinetics. Based on the tau-leap and midpoint tau-leap methods of Gillespie [D. T. Gillespie, J. Chem. Phys. 115, 1716 (2001)], binomial random variables are used in these leap methods rather than Poisson random variables. The motivation for this approach is to improve the efficiency of the Poisson leap methods by using larger stepsizes. Unlike Poisson random variables, whose range of sample values is from zero to infinity, binomial random variables have a finite range of sample values. This probabilistic property has been used to restrict possible reaction numbers and to avoid negative molecular numbers in stochastic simulations when larger stepsizes are used. In this approach a binomial random variable is defined for a single reaction channel in order to keep the reaction number of this channel below the number of molecules that undergo this reaction channel. A sampling technique is also designed for the total reaction number of a reactant species that undergoes two or more reaction channels, so that samples for the total reaction number do not exceed the molecular number of this species. In addition, probability properties of the binomial random variables provide stepsize conditions for restricting reaction numbers in a chosen time interval. These stepsize conditions are important properties of robust leap control strategies. Numerical results indicate that the proposed binomial leap methods can be applied to a wide range of chemical reaction systems with very good accuracy and significant improvement in efficiency over existing approaches. (c) 2004 American Institute of Physics.
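
    A minimal sketch of the bounding idea for a single bimolecular channel A + B -> C follows; the rate constant, counts, and step size are invented, and real implementations add the stepsize-selection logic described in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def binomial_leap(a, b, c, k, tau, steps):
        """Binomial tau-leaping for A + B -> C: the number of firings is a
        binomial draw bounded by min(a, b), so counts never go negative."""
        for _ in range(steps):
            limit = min(a, b)
            if limit == 0:
                break
            p = min(1.0, k * a * b * tau / limit)  # expected firings / bound
            r = rng.binomial(limit, p)
            a, b, c = a - r, b - r, c + r
        return a, b, c

    print(binomial_leap(a=1000, b=800, c=0, k=1e-4, tau=0.05, steps=100))
    ```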

  4. Improving Global Vascular Risk Prediction with Behavioral and Anthropometric Factors: The Multi-ethnic Northern Manhattan Cohort Study

    PubMed Central

    Sacco, Ralph L.; Khatri, Minesh; Rundek, Tatjana; Xu, Qiang; Gardener, Hannah; Boden-Albala, Bernadette; Di Tullio, Marco R.; Homma, Shunichi; Elkind, Mitchell SV; Paik, Myunghee C

    2010-01-01

    Objective To improve global vascular risk prediction with behavioral and anthropometric factors. Background Few cardiovascular risk models are designed to predict the global vascular risk of MI, stroke, or vascular death in multi-ethnic individuals, and existing schemes do not fully include behavioral risk factors. Methods A randomly derived, population-based, prospective cohort of 2737 community participants free of stroke and coronary artery disease was followed annually for a median of 9.0 years in the Northern Manhattan Study (mean age 69 years; 63.2% women; 52.7% Hispanic, 24.9% African-American, 19.9% white). A global vascular risk score (GVRS) predictive of stroke, myocardial infarction, or vascular death was developed by adding variables to the traditional Framingham cardiovascular variables based on the likelihood ratio criterion. Model utility was assessed through receiver operating characteristics, calibration, and effect on reclassification of subjects. Results Variables that significantly added to the traditional Framingham profile included waist circumference, alcohol consumption, and physical activity. Continuous measures of blood pressure and fasting blood sugar were used instead of hypertension and diabetes. Ten-year event-free probabilities were 0.95 for the first quartile of GVRS, 0.89 for the second quartile, 0.79 for the third quartile, and 0.56 for the fourth quartile. The addition of behavioral factors in our model improved prediction of 10-year event rates compared to a model restricted to the traditional variables. Conclusion A global vascular risk score that combines traditional, behavioral, and anthropometric risk factors, uses continuous variables for physiological parameters, and is applicable to non-white subjects could improve primary prevention strategies. PMID:19958966

  5. Do bioclimate variables improve performance of climate envelope models?

    USGS Publications Warehouse

    Watling, James I.; Romañach, Stephanie S.; Bucklin, David N.; Speroterra, Carolina; Brandt, Laura A.; Pearlstine, Leonard G.; Mazzotti, Frank J.

    2012-01-01

    Climate envelope models are widely used to forecast potential effects of climate change on species distributions. A key issue in climate envelope modeling is the selection of predictor variables that most directly influence species. To determine whether model performance and spatial predictions were related to the selection of predictor variables, we compared models using bioclimate variables with models constructed from monthly climate data for twelve terrestrial vertebrate species in the southeastern USA using two different algorithms (random forests or generalized linear models), and two model selection techniques (using uncorrelated predictors or a subset of user-defined biologically relevant predictor variables). There were no differences in performance between models created with bioclimate or monthly variables, but one metric of model performance was significantly greater using the random forest algorithm compared with generalized linear models. Spatial predictions between maps using bioclimate and monthly variables were very consistent using the random forest algorithm with uncorrelated predictors, whereas we observed greater variability in predictions using generalized linear models.

  6. Investigating Factorial Invariance of Latent Variables Across Populations When Manifest Variables Are Missing Completely

    PubMed Central

    Widaman, Keith F.; Grimm, Kevin J.; Early, Dawnté R.; Robins, Richard W.; Conger, Rand D.

    2013-01-01

    Difficulties arise in multiple-group evaluations of factorial invariance if particular manifest variables are missing completely in certain groups. Ad hoc analytic alternatives can be used in such situations (e.g., deleting manifest variables), but some common approaches, such as multiple imputation, are not viable. At least 3 solutions to this problem are viable: analyzing differing sets of variables across groups, using pattern mixture approaches, and a new method using random number generation. The latter solution, proposed in this article, is to generate pseudo-random normal deviates for all observations for manifest variables that are missing completely in a given sample and then to specify multiple-group models in a way that respects the random nature of these values. An empirical example is presented in detail comparing the 3 approaches. The proposed solution can enable quantitative comparisons at the latent variable level between groups using programs that require the same number of manifest variables in each group. PMID:24019738
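
    The random-number device is simple to set up in practice; below is a small Python sketch (hypothetical variable names) that fills a manifest variable missing completely in one group with standard normal pseudo-random deviates, so that software requiring identical variable sets across groups can run. The model for that group would then fix the variable's loadings so the random column carries no structural information.

    ```python
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)

    # Group 1 measured x1-x3; group 2 never measured x3 (names hypothetical).
    g1 = pd.DataFrame(rng.normal(size=(200, 3)), columns=["x1", "x2", "x3"])
    g2 = pd.DataFrame(rng.normal(size=(150, 2)), columns=["x1", "x2"])

    # Fill the completely missing column with pseudo-random normal deviates.
    g2["x3"] = rng.normal(size=len(g2))

    print(g1.shape, g2.shape)  # both groups now carry the same variables
    ```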

  7. Inference of median difference based on the Box-Cox model in randomized clinical trials.

    PubMed

    Maruo, K; Isogawa, N; Gosho, M

    2015-05-10

    In randomized clinical trials, many medical and biological measurements are not normally distributed and are often skewed. The Box-Cox transformation is a powerful procedure for comparing two treatment groups for skewed continuous variables in terms of a statistical test. However, it is difficult to directly estimate and interpret the location difference between the two groups on the original scale of the measurement. We propose a helpful method that infers the difference of the treatment effect on the original scale in a more easily interpretable form. We also provide statistical analysis packages that consistently include an estimate of the treatment effect, covariance adjustments, standard errors, and statistical hypothesis tests. The simulation study that focuses on randomized parallel group clinical trials with two treatment groups indicates that the performance of the proposed method is equivalent to or better than that of the existing non-parametric approaches in terms of the type-I error rate and power. We illustrate our method with cluster of differentiation 4 data in an acquired immune deficiency syndrome clinical trial. Copyright © 2015 John Wiley & Sons, Ltd.
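
    For intuition, the following simplified Python sketch estimates a median difference on the original scale by transforming, comparing means, and back-transforming; it uses a common Box-Cox lambda from the pooled data and ignores the covariance adjustment and standard-error machinery of the proposed method.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    treat = rng.lognormal(0.9, 0.6, 120)     # hypothetical skewed outcomes
    control = rng.lognormal(1.1, 0.6, 120)

    _, lam = stats.boxcox(np.concatenate([treat, control]))

    def bc(x, lam):      # Box-Cox transform
        return (x**lam - 1.0) / lam if lam != 0 else np.log(x)

    def bc_inv(z, lam):  # inverse transform back to the original scale
        return (lam * z + 1.0) ** (1.0 / lam) if lam != 0 else np.exp(z)

    med_t = bc_inv(bc(treat, lam).mean(), lam)    # model-based medians
    med_c = bc_inv(bc(control, lam).mean(), lam)
    print(f"lambda = {lam:.2f}, median difference = {med_t - med_c:.2f}")
    ```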

  8. A unified approach for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties

    NASA Astrophysics Data System (ADS)

    Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie

    2017-09-01

    Automotive brake systems are always subjected to various types of uncertainties and two types of random-fuzzy uncertainties may exist in the brakes. In this paper, a unified approach is proposed for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties. In the proposed approach, two uncertainty analysis models with mixed variables are introduced to model the random-fuzzy uncertainties. The first one is the random and fuzzy model, in which random variables and fuzzy variables exist simultaneously and independently. The second one is the fuzzy random model, in which uncertain parameters are all treated as random variables while their distribution parameters are expressed as fuzzy numbers. Firstly, the fuzziness is discretized by using α-cut technique and the two uncertainty analysis models are simplified into random-interval models. Afterwards, by temporarily neglecting interval uncertainties, the random-interval models are degraded into random models, in which the expectations, variances, reliability indexes and reliability probabilities of system stability functions are calculated. And then, by reconsidering the interval uncertainties, the bounds of the expectations, variances, reliability indexes and reliability probabilities are computed based on Taylor series expansion. Finally, by recomposing the analysis results at each α-cut level, the fuzzy reliability indexes and probabilities can be obtained, by which the brake squeal instability can be evaluated. The proposed approach gives a general framework to deal with both types of random-fuzzy uncertainties that may exist in the brakes and its effectiveness is demonstrated by numerical examples. It will be a valuable supplement to the systematic study of brake squeal considering uncertainty.
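
    The α-cut step at the heart of the procedure is compact; the sketch below (our illustration, with invented numbers) discretizes a triangular fuzzy parameter into the nested intervals that feed the random-interval analysis.

    ```python
    import numpy as np

    # Alpha-cut of a triangular fuzzy number (a, m, b): each level alpha
    # gives the interval [a + alpha*(m - a), b - alpha*(b - m)].
    def alpha_cuts(a, m, b, levels):
        return {float(al): (a + al * (m - a), b - al * (b - m))
                for al in levels}

    # Hypothetical fuzzy friction coefficient of a brake pad
    for al, (lo, hi) in alpha_cuts(0.3, 0.4, 0.5, np.linspace(0, 1, 5)).items():
        print(f"alpha = {al:.2f}: parameter in [{lo:.3f}, {hi:.3f}]")
    ```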

  9. Animal social networks as substrate for cultural behavioural diversity.

    PubMed

    Whitehead, Hal; Lusseau, David

    2012-02-07

    We used individual-based stochastic models to examine how social structure influences the diversity of socially learned behaviour within a non-human population. For continuous behavioural variables we modelled three forms of dyadic social learning: averaging the behavioural values of the two individuals, random transfer of information from one individual to the other, and directional transfer from the individual with the higher behavioural value to the other. Learning had potential error. We also examined the transfer of categorical behaviour between individuals with random directionality and two forms of error: the adoption of a randomly chosen existing behavioural category, or the innovation of a new type of behaviour. In populations without social structuring, the diversity of culturally transmitted behaviour increased with learning error and population size. When the populations were structured socially, either by making individuals members of permanent social units or by giving them overlapping ranges, behavioural diversity increased with network modularity under all scenarios, although the proportional increase varied considerably between continuous and categorical behaviour, with the transmission mechanism, and with population size. Although functions of the form e^(c1) m^(-c2) + c3 log(N) predicted the mean increase in diversity with modularity (m) and population size (N), behavioural diversity could be highly unpredictable, both between simulations with the same set of parameters and within runs. Errors in social learning and social structuring generally promote behavioural diversity. Consequently, social learning may be considered to produce culture in populations whose social structure is sufficiently modular. Copyright © 2011 Elsevier Ltd. All rights reserved.
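
    The three dyadic learning rules for continuous behaviour translate directly into code. The sketch below (our simplification: an unstructured population, so no modularity effect) compares the behavioural diversity each rule sustains.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    def diversity(mode, n=100, steps=20_000, error=0.05):
        """Simulate dyadic social learning of a continuous behaviour and
        return the final behavioural diversity (standard deviation)."""
        b = rng.normal(size=n)
        for _ in range(steps):
            i, j = rng.choice(n, size=2, replace=False)
            if mode == "average":                 # blend the pair
                b[i] = b[j] = (b[i] + b[j]) / 2 + rng.normal(0, error)
            elif mode == "random":                # copy a random partner
                src, dst = (i, j) if rng.random() < 0.5 else (j, i)
                b[dst] = b[src] + rng.normal(0, error)
            else:                                 # copy the higher value
                hi, lo = (i, j) if b[i] >= b[j] else (j, i)
                b[lo] = b[hi] + rng.normal(0, error)
        return b.std()

    for mode in ("average", "random", "directional"):
        print(mode, round(diversity(mode), 3))
    ```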

  10. Evaluating the importance of policy amenable factors in explaining influenza vaccination: a cross-sectional multinational study.

    PubMed

    Wheelock, Ana; Miraldo, Marisa; Thomson, Angus; Vincent, Charles; Sevdalis, Nick

    2017-07-12

    Despite continuous efforts to improve influenza vaccination coverage, uptake among high-risk groups remains suboptimal. We aimed to identify policy amenable factors associated with vaccination and to measure their importance in order to assist in the monitoring of vaccination sentiment and the design of communication strategies and interventions to improve vaccination rates. The USA, the UK and France. A total of 2412 participants were surveyed across the three countries. Self-reported influenza vaccination. Between March and April 2014, a stratified random sampling strategy was employed with the aim of obtaining nationally representative samples in the USA, the UK and France through online databases and random-digit dialling. Participants were asked about vaccination practices, perceptions and feelings. Multivariable logistic regression was used to identify factors associated with past influenza vaccination. The models were able to explain 64%-80% of the variance in vaccination behaviour. Overall, sociopsychological variables, which are inherently amenable to policy, were better at explaining past vaccination behaviour than demographic, socioeconomic and health variables. Explanatory variables included social influence (physician), influenza and vaccine risk perceptions and traumatic childhood experiences. Our results indicate that evidence-based sociopsychological items should be considered for inclusion into national immunisation surveys to gauge the public's views, identify emerging concerns and thus proactively and opportunely address potential barriers and harness vaccination drivers. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  11. Evaluation of participant comprehension of information received in an exercise and diet intervention trial: The DR's EXTRA study.

    PubMed

    Länsimies-Antikainen, Helena; Pietilä, Anna-Maija; Kiviniemi, Vesa; Rauramaa, Rainer; Laitinen, Tomi

    2010-01-01

    The informed consent process is the legal and ethical cornerstone of health research and is essential to ensure that participants in health research really understand the information they receive. Clinical studies often fail to provide data that clarify how much participants have understood. To evaluate the comprehension of older volunteer participants in health research. The subjects are a random population sample of 1,410 men and women aged 57-78 years who are participating in a 4-year randomized controlled intervention trial on the effects of physical exercise and diet on atherosclerosis, endothelial function and cognition. A questionnaire about informed consent was given to all willing participants (n = 1,324) 3 months after randomization. In addition, participants' long-term continuation in the intervention trial in relation to understanding was evaluated 2 years after randomization. The response rate was 91%. The majority of respondents (89%) were satisfied with the intelligibility of the information received. In addition, comprehension of the information received appeared adequate in 82% of the whole study population. Among the background variables, higher education (p < 0.001) and satisfaction with one's own health (p = 0.01) were associated with adequate comprehension of the provided information. Furthermore, participants who felt themselves to be healthy were more likely to continue participating in the intervention after 2 years. The findings of this study indicated sufficient understanding of the received information in older research participants. However, our results indicate that special efforts should be made with participants with lower educational levels or subjective feelings of impaired health. This study highlights the need for researchers to critically analyze the quality of information and how it is provided, which is especially important in long-term follow-up studies. 2009 S. Karger AG, Basel.

  12. Memory Effects on Movement Behavior in Animal Foraging

    PubMed Central

    Bracis, Chloe; Gurarie, Eliezer; Van Moorter, Bram; Goodwin, R. Andrew

    2015-01-01

    An individual’s choices are shaped by its experience, a fundamental property of behavior important to understanding complex processes. Learning and memory are observed across many taxa and can drive behaviors, including foraging behavior. To explore the conditions under which memory provides an advantage, we present a continuous-space, continuous-time model of animal movement that incorporates learning and memory. Using simulation models, we evaluate the benefit memory provides across several types of landscapes with variable-quality resources and compare the memory model within a nested hierarchy of simpler models (behavioral switching and random walk). We find that memory almost always leads to improved foraging success, but that this effect is most marked in landscapes containing sparse, contiguous patches of high-value resources that regenerate relatively fast and are located in an otherwise devoid landscape. In these cases, there is a large payoff for finding a resource patch, due to size, value, or locational difficulty. While memory-informed search is difficult to differentiate from other factors using solely movement data, our results suggest that disproportionate spatial use of higher value areas, higher consumption rates, and consumption variability all point to memory influencing the movement direction of animals in certain ecosystems. PMID:26288228

  14. Selection of Variables in Cluster Analysis: An Empirical Comparison of Eight Procedures

    ERIC Educational Resources Information Center

    Steinley, Douglas; Brusco, Michael J.

    2008-01-01

    Eight different variable selection techniques for model-based and non-model-based clustering are evaluated across a wide range of cluster structures. It is shown that several methods have difficulties when non-informative variables (i.e., random noise) are included in the model. Furthermore, the distribution of the random noise greatly impacts the…

  15. Sampling-Based Stochastic Sensitivity Analysis Using Score Functions for RBDO Problems with Correlated Random Variables

    DTIC Science & Technology

    2010-08-01

    This study presents a methodology for computing stochastic sensitivities with respect to the design variables, which are the …

  16. Reward and uncertainty in exploration programs

    NASA Technical Reports Server (NTRS)

    Kaufman, G. M.; Bradley, P. G.

    1971-01-01

    A set of variables crucial to the economic outcome of petroleum exploration is discussed. These are treated as random variables; the values they assume indicate the number of successes that occur in a drilling program and determine, for a particular discovery, the unit production cost and net economic return if that reservoir is developed. In specifying the joint probability law for these variables, extreme and probably unrealistic assumptions are made; in particular, the different random variables are assumed to be independently distributed. Using postulated probability functions and specified parameters, values are generated for selected random variables, such as reservoir size. From this set of values the economic magnitudes of interest, net return and unit production cost, are computed. This constitutes a single trial, and the procedure is repeated many times. The resulting histograms approximate the probability density functions of the variables that describe the economic outcomes of an exploratory drilling program.
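
    The repeated-trial procedure is the archetype of a Monte Carlo study, and a compressed Python sketch of it follows; every distribution and economic parameter below is invented for illustration, not taken from the report.

    ```python
    import numpy as np

    rng = np.random.default_rng(2024)
    trials, wells = 50_000, 10

    p_success, well_cost = 0.15, 2.0e6            # per-well assumptions
    successes = rng.binomial(wells, p_success, trials)
    # Independent lognormal reservoir sizes, summed over the discoveries:
    size = rng.lognormal(13.0, 1.2, size=(trials, wells))
    reserves = np.where(np.arange(wells) < successes[:, None],
                        size, 0.0).sum(axis=1)

    price, dev_cost = 20.0, 4.0                   # $/bbl sold, $/bbl developed
    net_return = reserves * (price - dev_cost) - wells * well_cost
    # Histogramming net_return over trials approximates its density:
    print(f"P(net return < 0) ~ {np.mean(net_return < 0):.3f}")
    print(f"median net return ~ ${np.median(net_return) / 1e6:.1f}M")
    ```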

  17. A single-loop optimization method for reliability analysis with second order uncertainty

    NASA Astrophysics Data System (ADS)

    Xie, Shaojun; Pan, Baisong; Du, Xiaoping

    2015-08-01

    Reliability analysis may involve random variables and interval variables. In addition, some of the random variables may have interval distribution parameters owing to limited information. This kind of uncertainty is called second order uncertainty. This article develops an efficient reliability method for problems involving the three aforementioned types of uncertain input variables. The analysis produces the maximum and minimum reliability and is computationally demanding because two loops are needed: a reliability analysis loop with respect to random variables and an interval analysis loop for extreme responses with respect to interval variables. The first order reliability method and nonlinear optimization are used for the two loops, respectively. For computational efficiency, the two loops are combined into a single loop by treating the Karush-Kuhn-Tucker (KKT) optimal conditions of the interval analysis as constraints. Three examples are presented to demonstrate the proposed method.

  18. Heralded processes on continuous-variable spaces as quantum maps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferreyrol, Franck; Spagnolo, Nicolò; Blandino, Rémi

    2014-12-04

    Heralding processes, which succeed only when a measurement on part of the system gives the desired result, are particularly interesting for continuous variables. They permit non-Gaussian transformations that are necessary for several continuous-variable quantum information tasks. However, although maps and quantum process tomography are commonly used to describe quantum transformations in discrete-variable spaces, they are much rarer in the continuous-variable domain. Moreover, no convenient tool for representing maps in a way better adapted to the particularities of continuous variables has yet been explored. In this paper we try to fill this gap by presenting such a tool.

  19. Extragalactic Science With Kepler

    NASA Astrophysics Data System (ADS)

    Fanelli, Michael N.; Marcum, P.

    2012-01-01

    Although designed as an exoplanet and stellar astrophysics experiment, the Kepler mission provides a unique capability to explore the essentially unknown photometric stability of galactic systems at millimag levels using Kepler's blend of high precision and continuous monitoring. Time series observations of galaxies are sensitive to both quasi-continuous variability, driven by accretion activity from embedded active nuclei, and random, episodic events, such as supernovae. In general, galaxies lacking active nuclei are not expected to be variable with the timescales and amplitudes observed in stellar sources and are free of source motions that affect stars (e.g., parallax). These sources can serve as a population of quiescent, non-variable sources, which may be used to quantify the photometric stability and noise characteristics of the Kepler photometer. A factor limiting galaxy monitoring in the Kepler FOV is the overall lack of detailed quantitative information for the galaxy population. Despite these limitations, a significant number of galaxies are being observed, forming the Kepler Galaxy Archive. Observed sources total approximately 100, 250, and 700 in Cycles 1-3 (Cycle 3 began in June 2011). In this poster we interpret the properties of a set of 20 galaxies monitored during quarters 4 through 8, their associated light curves, photometric and astrometric precision and potential variability. We describe data analysis issues relevant to extended sources and available software tools. In addition, we detail ongoing surveys that are providing new photometric and morphological information for galaxies over the entire field. These new datasets will both aid the interpretation of the time series, and improve source selection, e.g., help identify candidate AGNs and starburst systems, for further monitoring.

  20. Mid- and Long-Term Efficacy of Non-Invasive Ventilation in Obesity Hypoventilation Syndrome: The Pickwick's Study.

    PubMed

    López-Jiménez, María José; Masa, Juan F; Corral, Jaime; Terán, Joaquín; Ordaz, Estrella; Troncoso, Maria F; González-Mangado, Nicolás; González, Mónica; Lopez-Martínez, Soledad; De Lucas, Pilar; Marín, José M; Martí, Sergi; Díaz-Cambriles, Trinidad; Díaz-de-Atauri, Josefa; Chiner, Eusebi; Aizpuru, Felipe; Egea, Carlos; Romero, Auxiliadora; Benítez, José M; Sánchez-Gómez, Jesús; Golpe, Rafael; Santiago-Recuerda, Ana; Gómez, Silvia; Barbe, Ferrán; Bengoa, Mónica

    2016-03-01

    The Pickwick project was a prospective, randomized and controlled study addressing obesity hypoventilation syndrome (OHS), a growing problem in developed countries. OHS patients were divided according to apnea-hypopnea index (AHI) ≥30 or <30, determined by polysomnography. The group with AHI≥30 was randomized to intervention with lifestyle changes, noninvasive ventilation (NIV) or continuous positive airway pressure (CPAP); the group with AHI<30 received NIV or lifestyle changes. The aim of the study was to evaluate the efficacy of NIV treatment, CPAP and lifestyle changes (control) in the medium- and long-term management of patients with OHS. The primary variables were PaCO2 and days of hospitalization, and the operating variables were the percentage of dropouts for medical reasons and mortality. Secondary medium-term objectives were: (i) to evaluate clinical-functional effectiveness on quality of life and on echocardiographic and polysomnographic variables; (ii) to investigate the importance of apneic events and leptin in the pathogenesis of daytime alveolar hypoventilation and their change under the different treatments; (iii) to investigate whether metabolic, biochemical and vascular endothelial dysfunction disorders depend on the presence of apneas and hypopneas; and (iv) changes in inflammatory markers and endothelial damage according to treatment. Secondary long-term objectives were to evaluate: (i) clinical and functional effectiveness and quality of life with NIV and CPAP; (ii) changes in leptin, inflammatory markers and endothelial damage according to treatment; (iii) changes in pulmonary hypertension and other echocardiographic variables, as well as blood pressure and the incidence of cardiovascular events; and (iv) the dropout rate and mortality. Copyright © 2015 SEPAR. Published by Elsevier Espana. All rights reserved.

  1. Reliability of performance velocity for jump squats under feedback and nonfeedback conditions.

    PubMed

    Randell, Aaron D; Cronin, John B; Keogh, Justin Wl; Gill, Nicholas D; Pedersen, Murray C

    2011-12-01

    Randell, AD, Cronin, JB, Keogh, JWL, Gill, ND, and Pedersen, MC. Reliability of performance velocity for jump squats under feedback and nonfeedback conditions. J Strength Cond Res 25(12): 3514-3518, 2011-Advancements in the monitoring of kinematic and kinetic variables during resistance training have made it possible to continuously monitor performance and provide feedback during training. If equipment and software can provide reliable instantaneous feedback on the variable of interest during training, this may result in goal-oriented movement tasks that increase the likelihood of transference to on-field performance, or at the very least improve the mechanical variable of interest. The purpose of this study was to determine the reliability of performance velocity for jump squats under feedback and nonfeedback conditions over 3 consecutive training sessions. Twenty subjects were randomly allocated to a feedback or nonfeedback group, and each group performed a total of 3 "jump squat" training sessions with the velocity of each repetition measured using a linear position transducer. There was less change in mean velocities between sessions 1-2 and sessions 2-3 (0.07 and 0.02 vs. 0.13 and -0.04 m·s⁻¹), less random variation (TE = 0.06 and 0.06 vs. 0.10 and 0.07 m·s⁻¹), and greater consistency (intraclass correlation coefficient = 0.83 and 0.87 vs. 0.53 and 0.74) between sessions for the feedback condition as compared to the nonfeedback condition. It was concluded that there is approximately a 50-50 probability that the provision of feedback was beneficial to performance in the squat jump over multiple sessions. It is suggested that this has the potential to increase transference to on-field performance or at the very least improve the mechanical variable of interest.

  2. The effect of dexmedetomidine continuous infusion as an adjuvant to general anesthesia on sevoflurane requirements: A study based on entropy analysis

    PubMed Central

    Patel, Chirag Ramanlal; Engineer, Smita R; Shah, Bharat J; Madhu, S

    2013-01-01

    Background: Dexmedetomidine, an α2-agonist used as an adjuvant in general anesthesia, has anesthetic- and analgesic-sparing properties. Aims: To evaluate the effect of a continuous infusion of dexmedetomidine alone, without the use of opioids, on the requirement for sevoflurane during general anesthesia, with continuous monitoring of the depth of anesthesia by entropy analysis. Materials and Methods: Sixty patients were randomly divided into 2 groups of 30 each. In group A, fentanyl 2 mcg/kg was given, while in group B, dexmedetomidine was given intravenously as a loading dose of 1 mcg/kg over 10 min prior to induction. After induction with thiopentone, dexmedetomidine was given to group B as an infusion at a dose of 0.2-0.8 mcg/kg. Sevoflurane was used as the inhalation agent in both groups. Hemodynamic variables, sevoflurane inspired fraction (FIsevo), sevoflurane expired fraction (ETsevo), and entropy (response entropy and state entropy) were continuously recorded. Statistical analysis was done by unpaired Student's t-test and Chi-square test for continuous and categorical variables, respectively. A P-value < 0.05 was considered significant. Results: The use of dexmedetomidine with sevoflurane was associated with a statistically significant decrease in ETsevo at 5 minutes post-intubation (1.49 ± 0.11) and 60 minutes post-intubation (1.11 ± 0.28) as compared to group A [1.73 ± 0.30 (5 minutes); 1.68 ± 0.50 (60 minutes)]. There was an average 21.5% decrease in ETsevo in group B as compared to group A. Conclusions: Dexmedetomidine, as an adjuvant in general anesthesia, decreases the requirement for sevoflurane for maintaining an adequate depth of anesthesia. PMID:24106354

  3. A comparison of VO2max and metabolic variables between treadmill running and treadmill skating.

    PubMed

    Koepp, Kriston K; Janot, Jeffrey M

    2008-03-01

    The purpose of this study was to determine differences in VO2max and metabolic variables between treadmill running and treadmill skating. This study also examined VO2max responses during a continuous skating treadmill protocol and a discontinuous skating treadmill protocol. Sixteen male high school hockey players, who had a mean age of 16 +/- 1 years and were of an above-average fitness level, participated in this study. All subjects completed 4 exercise trials: a 1-hour skating treadmill familiarization trial, a treadmill running trial, and 2 randomized skating treadmill trials. Minute ventilation (VE), oxygen consumption (VO2), carbon dioxide production (VCO2), respiratory exchange ratio (RER), and heart rate were averaged every 15 seconds up to VO2max for each exercise test. The results showed a significant difference (P < 0.05) in VO2max (mL·kg⁻¹·min⁻¹) and maximal VCO2 (L·min⁻¹) between the running treadmill protocol and the discontinuous skating treadmill protocol. There was also a significant difference in maximal RER between the discontinuous and continuous skating treadmill protocols and between the discontinuous skating treadmill protocol and the running treadmill protocol. In conclusion, the running treadmill elicited a greater VO2max (mL·kg⁻¹·min⁻¹) than the skating treadmill did, but when it comes to the specificity of ice skating, the skating treadmill may be ideal. Also, there was no significant difference between the discontinuous and continuous skating treadmill protocols. Therefore, a continuous protocol is possible on the skating treadmill without compromising correct skating position and physiologic responses. However, the continuous skating treadmill protocol should undergo validation before other scientists, coaches, and strength and conditioning professionals can apply it correctly.

  4. Upscaling Ameriflux observations to assess drought impacts on gross primary productivity across the Southwest

    NASA Astrophysics Data System (ADS)

    Barnes, M.; Moore, D. J.; Scott, R. L.; MacBean, N.; Ponce-Campos, G. E.; Breshears, D. D.

    2017-12-01

    Both satellite observations and eddy covariance estimates provide crucial information about the Earth's carbon, water and energy cycles. Continuous measurements from flux towers facilitate exploration of the exchange of carbon dioxide, water and energy between the land surface and the atmosphere at fine temporal and spatial scales, while satellite observations can fill in the large spatial gaps of in-situ measurements and provide long-term temporal continuity. The Southwest (Southwest United States and Northwest Mexico) and other semi-arid regions represent a key uncertainty in interannual variability in carbon uptake. Comparisons of existing global upscaled gross primary production (GPP) products with flux tower data at sites across the Southwest show widespread mischaracterization of seasonality in vegetation carbon uptake, resulting in large (up to 200%) errors in annual carbon uptake estimates. Here, remotely sensed and distributed meteorological inputs are used to upscale GPP estimates from 25 Ameriflux towers across the Southwest to the regional scale using a machine learning approach. Our random forest model incorporates two novel features that improve the spatial and temporal variability in GPP. First, we incorporate a multi-scalar drought index at multiple timescales to account for differential seasonality between ecosystem types. Second, our machine learning algorithm was trained on twenty-five ecologically diverse sites to optimize both the monthly variability and the seasonal cycle of GPP. The product and its components will be used to examine drought impacts on terrestrial carbon cycling across the Southwest, including the effects of drought seasonality on carbon uptake. Our spatially and temporally continuous upscaled GPP product, drawing on both ground and satellite data over the Southwest region, helps us understand linkages between the carbon and water cycles in semi-arid ecosystems and informs predictions of vegetation response to future climate conditions.
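
    A skeletal version of such an upscaling pipeline fits in a few lines; the sketch below trains a random forest on synthetic stand-ins for tower-month data and validates it with leave-sites-out cross-validation. Predictor names and all values are invented; the real workflow would read Ameriflux and satellite records.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import GroupKFold, cross_val_score

    rng = np.random.default_rng(9)

    n = 25 * 120                              # 25 towers x 120 months
    site = np.repeat(np.arange(25), 120)      # tower identifier per row
    X = rng.normal(size=(n, 3))               # e.g. NDVI, drought index, Tair
    gpp = 2.0 + 1.5 * X[:, 0] + 0.8 * X[:, 1] + rng.normal(0, 0.5, n)

    model = RandomForestRegressor(n_estimators=300, min_samples_leaf=5,
                                  random_state=0)
    # Holding out whole sites guards against overfitting to tower identity:
    scores = cross_val_score(model, X, gpp, groups=site,
                             cv=GroupKFold(n_splits=5), scoring="r2")
    print(f"site-held-out R^2: {scores.mean():.2f} +/- {scores.std():.2f}")
    ```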

  5. Context-invariant quasi hidden variable (qHV) modelling of all joint von Neumann measurements for an arbitrary Hilbert space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loubenets, Elena R.

    We prove, for each Hilbert space, the existence of two new quasi hidden variable (qHV) models, statistically noncontextual and context-invariant, reproducing all the von Neumann joint probabilities via non-negative values of real-valued measures and all the quantum product expectations via the qHV (classical-like) average of the product of the corresponding random variables. In a context-invariant model, a quantum observable X can be represented by a variety of random variables satisfying the functional condition required in quantum foundations, but each of these random variables equivalently models X under all joint von Neumann measurements, regardless of their contexts. The proved existence of this model negates the general opinion that, in terms of random variables, the Hilbert space description of all the joint von Neumann measurements for dim H ≥ 3 can be reproduced only contextually. The existence of a statistically noncontextual qHV model, in particular, implies that every N-partite quantum state admits a local quasi hidden variable model introduced in Loubenets [J. Math. Phys. 53, 022201 (2012)]. The new results of the present paper point also to the generality of the quasi-classical probability model proposed in Loubenets [J. Phys. A: Math. Theor. 45, 185306 (2012)].

  6. Considerations of multiple imputation approaches for handling missing data in clinical trials.

    PubMed

    Quan, Hui; Qi, Li; Luo, Xiaodong; Darchy, Loic

    2018-07-01

    Missing data exist in all clinical trials, and the missing data issue seriously affects the interpretability of trial results. There is no universally applicable solution for all missing data problems. The methods used for handling missing data depend on the circumstances, particularly on the assumptions about the missing data mechanisms. In recent years, when the missing at random mechanism cannot be assumed, conservative approaches such as the control-based and return-to-baseline multiple imputation approaches have been applied for dealing with missing data. In this paper, we focus on the variability in the data analysis of these approaches. As demonstrated by examples, the choice of the variability can impact the conclusion of the analysis. Besides methods for continuous endpoints, we also discuss methods for binary and time-to-event endpoints, as well as considerations for non-inferiority assessment. Copyright © 2018. Published by Elsevier Inc.
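    Whatever imputation model is chosen (control-based, return-to-baseline, or standard), the m completed-data estimates are combined with Rubin's rules, and the variability question the authors raise enters through the between-imputation term B. A minimal sketch of the pooling step, with made-up numbers:

```python
# Rubin's rules for pooling m multiply-imputed analyses.
import numpy as np

def rubin_pool(estimates, variances):
    """estimates[i] and variances[i]: point estimate and squared standard
    error from the analysis of the i-th completed data set."""
    q = np.asarray(estimates, float)
    u = np.asarray(variances, float)
    m = len(q)
    qbar = q.mean()               # pooled point estimate
    ubar = u.mean()               # within-imputation variance
    b = q.var(ddof=1)             # between-imputation variance
    total = ubar + (1 + 1 / m) * b
    return qbar, total

est, var = rubin_pool([0.42, 0.38, 0.45, 0.40, 0.44], [0.01] * 5)
print("pooled estimate %.3f, SE %.3f" % (est, var ** 0.5))
```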

  7. Does mental arithmetic before head up tilt have an effect on the orthostatic cardiovascular and hormonal responses?

    NASA Astrophysics Data System (ADS)

    Goswami, Nandu; Lackner, Helmut Karl; Papousek, Ilona; Montani, Jean-Pierre; Jezova, Daniela; Hinghofer-Szalkay, Helmut G.

    2011-05-01

    Passive head up tilt (HUT) and mental arithmetic (MA) are commonly used for providing orthostatic and mental challenges, respectively. In animal experiments, even a single exposure to a stressor has been shown to modify the response to a subsequent stress stimulus. We investigated whether MA applied before HUT elicits synergistic responses in orthostatic heart rate (HR), cardiac output (CO), heart rate variability and arterial blood pressure. Fifteen healthy young males were subjected to two protocols: (a) HUT alone and (b) HUT preceded by MA, with sessions randomized and ≥2 weeks apart. Beat-to-beat hemodynamic variables were measured continuously, and saliva samples were taken for hormonal assay. HUT alone increased HR from 59±7 (baseline) to 80±10 bpm (mean±SD) and mean blood pressure (MBP) from 88±10 to 91±14 mmHg. HUT results after MA were not different from those with HUT alone. The activity of alpha amylase showed differences during the experiments irrespective of the protocol. We conclude that a mental challenge applied beforehand does not affect orthostatic cardiovascular responses; the timing of mental loading seems to be critical if it is intended to alter cardiovascular responses to upright standing.

  8. Stochastic transfer of polarized radiation in finite cloudy atmospheric media with reflective boundaries

    NASA Astrophysics Data System (ADS)

    Sallah, M.

    2014-03-01

    The problem of monoenergetic radiative transfer in a finite planar stochastic atmospheric medium with polarized (vector) Rayleigh scattering is considered. The solution is presented for arbitrary absorption and scattering cross sections. The extinction function of the medium is assumed to be a continuous random function of position, with fluctuations about the mean taken as Gaussian distributed. The joint probability distribution function of these Gaussian random variables is used to calculate ensemble-averaged quantities, such as reflectivity and transmissivity, for an arbitrary correlation function. A modified Gaussian probability distribution function is also used to average the solution, in order to exclude probable negative values of the optical variable. The Pomraning-Eddington approximation is first used to obtain the deterministic analytical solution for both the total intensity and the difference function used to describe the polarized radiation. The problem is treated with specular reflecting boundaries and an angular-dependent external flux incident on the medium from one side, with no flux from the other side. For the sake of comparison, two different forms of the weight function, introduced to force the boundary conditions to be fulfilled, are used. Numerical results for the average reflectivity and average transmissivity are obtained for both the Gaussian and modified Gaussian probability density functions at different degrees of polarization.

  9. Patellar denervation with electrocautery in total knee arthroplasty without patellar resurfacing: a meta-analysis.

    PubMed

    Cheng, Tao; Zhu, Chen; Guo, Yongyuan; Shi, Sifeng; Chen, Desheng; Zhang, Xianlong

    2014-11-01

    The impact of patellar denervation with electrocautery in total knee arthroplasty (TKA) on post-operative outcomes has been under debate. This study aims to conduct a meta-analysis and systematic review to compare the benefits and risks of circumpatellar electrocautery with those of non-electrocautery in primary TKAs. Comparative and randomized clinical studies were identified by conducting an electronic search of articles dated up to September 2012 in PubMed, EMBASE, Scopus, and the Cochrane databases. Six studies covering a total of 849 knees were analysed. A random-effects model was fitted using the inverse-variance method for continuous variables and the Mantel-Haenszel method for dichotomous variables. There was no significant difference in the incidence of anterior knee pain between the electrocautery and non-electrocautery groups. In terms of patellar score and Knee Society Score, circumpatellar electrocautery improved clinical outcomes compared with non-electrocautery in TKAs. The statistical differences were in favour of the electrocautery group but had minimal clinical significance. In addition, the overall complications indicate no statistically significant difference between the two groups. This study shows no strong evidence either for or against electrocautery compared with non-electrocautery in TKAs. Therapeutic study (systematic review and meta-analysis), Level III.
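    As a concrete illustration of the pooling described above, the sketch below computes a fixed-effect inverse-variance weighted mean difference; the six effect sizes are invented, and a faithful reproduction of the random-effects analysis would additionally estimate a between-study variance (e.g. DerSimonian-Laird) and add it to each weight's denominator.

```python
# Inverse-variance pooling of study-level mean differences (sketch).
import numpy as np

def inverse_variance_pool(effects, ses):
    effects, ses = np.asarray(effects, float), np.asarray(ses, float)
    w = 1.0 / ses**2                         # precision weights
    pooled = np.sum(w * effects) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    return pooled, pooled_se

# Hypothetical mean differences in Knee Society Score from six studies.
theta, se = inverse_variance_pool([2.1, 3.4, 1.2, 2.8, 0.9, 1.8],
                                  [1.0, 1.5, 0.8, 1.2, 1.1, 0.9])
print("pooled difference %.2f (SE %.2f)" % (theta, se))
```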

  10. Properties of behavior under different random ratio and random interval schedules: A parametric study.

    PubMed

    Dembo, M; De Penfold, J B; Ruiz, R; Casalta, H

    1985-03-01

    Four pigeons were trained to peck a key under different values of a temporally defined independent variable (T) and different probabilities of reinforcement (p). Parameter T is a fixed repeating time cycle, and p the probability of reinforcement for the first response of each cycle T. Two dependent variables were used: mean response rate and mean postreinforcement pause. For all values of p, a critical value of the independent variable T was found (T = 1 sec) at which marked changes took place in response rate and postreinforcement pauses. Behavior typical of random ratio schedules was obtained at T < 1 sec and behavior typical of random interval schedules at T > 1 sec. Copyright © 1985. Published by Elsevier B.V.

  11. Small area estimation for semicontinuous data.

    PubMed

    Chandra, Hukum; Chambers, Ray

    2016-03-01

    Survey data often contain measurements for variables that are semicontinuous in nature, i.e. they either take a single fixed value (we assume this is zero) or they have a continuous, often skewed, distribution on the positive real line. Standard methods for small area estimation (SAE) based on the use of linear mixed models can be inefficient for such variables. We discuss SAE techniques for semicontinuous variables under a two part random effects model that allows for the presence of excess zeros as well as the skewed nature of the nonzero values of the response variable. In particular, we first model the excess zeros via a generalized linear mixed model fitted to the probability of a nonzero, i.e. strictly positive, value being observed, and then model the response, given that it is strictly positive, using a linear mixed model fitted on the logarithmic scale. Empirical results suggest that the proposed method leads to efficient small area estimates for semicontinuous data of this type. We also propose a parametric bootstrap method to estimate the MSE of the proposed small area estimator. These bootstrap estimates of the MSE are compared to the true MSE in a simulation study. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
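    A minimal sketch of the two-part structure described above, substituting plain fixed-effects models for the paper's mixed (random-effects) versions: a logistic model for whether the response is nonzero, a linear model for the log of the positive part, and a lognormal-style back-transformation; all data are synthetic.

```python
# Two-part model sketch for a semicontinuous response y >= 0 (synthetic data).
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))
p = 1 / (1 + np.exp(-(0.5 + X @ [1.0, -0.5, 0.2])))          # P(y > 0)
y = np.where(rng.random(1000) < p,
             np.exp(0.3 + X @ [0.8, 0.1, -0.4] + rng.normal(0, 0.5, 1000)),
             0.0)

nz = y > 0
part1 = LogisticRegression().fit(X, nz)                      # zero vs nonzero
part2 = LinearRegression().fit(X[nz], np.log(y[nz]))         # log of positives

# E[y|x] = P(y>0|x) * E[y|y>0,x], with a lognormal sigma^2/2 correction.
s2 = np.var(np.log(y[nz]) - part2.predict(X[nz]), ddof=1)
e_y = part1.predict_proba(X)[:, 1] * np.exp(part2.predict(X) + s2 / 2)
print(e_y[:5])
```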

  12. The Expected Sample Variance of Uncorrelated Random Variables with a Common Mean and Some Applications in Unbalanced Random Effects Models

    ERIC Educational Resources Information Center

    Vardeman, Stephen B.; Wendelberger, Joanne R.

    2005-01-01

    There is a little-known but very simple generalization of the standard result that for uncorrelated random variables with common mean μ and variance σ², the expected value of the sample variance is σ². The generalization justifies the use of the usual standard error of the sample mean in possibly…
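    The generalization alluded to above can be stated compactly. For uncorrelated X_1, ..., X_n with common mean μ and possibly unequal variances σ_i², a short calculation under just those assumptions gives:

```latex
(n-1)\,\mathbb{E}[S^2]
  = \sum_{i=1}^{n}\mathbb{E}[X_i^2] - n\,\mathbb{E}[\bar X^2]
  = \sum_i\bigl(\sigma_i^2+\mu^2\bigr)
    - n\Bigl(\tfrac{1}{n^2}\textstyle\sum_i\sigma_i^2+\mu^2\Bigr)
  = \Bigl(1-\tfrac{1}{n}\Bigr)\sum_i\sigma_i^2
```

    so E[S²] = (1/n) Σ σ_i², the average of the individual variances. Since Var(X̄) = (1/n²) Σ σ_i² = E[S²]/n, the usual standard error S/√n remains unbiased on average, which is the justification referred to in the abstract.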

  13. Continuous Positive Airway Pressure During Exercise Improves Walking Time in Patients Undergoing Inpatient Cardiac Rehabilitation After Coronary Artery Bypass Graft Surgery: A RANDOMIZED CONTROLLED TRIAL.

    PubMed

    Pantoni, Camila Bianca Falasco; Di Thommazo-Luporini, Luciana; Mendes, Renata Gonçalves; Caruso, Flávia Cristina Rossi; Mezzalira, Daniel; Arena, Ross; Amaral-Neto, Othon; Catai, Aparecida Maria; Borghi-Silva, Audrey

    2016-01-01

    Continuous positive airway pressure (CPAP) has been used as an effective support to decrease the negative pulmonary effects of coronary artery bypass graft (CABG) surgery. However, it is unknown whether CPAP can positively influence patients undergoing CABG during exercise. This study evaluated the effectiveness of CPAP on the first day of ambulation after CABG in patients undergoing inpatient cardiac rehabilitation (CR). Fifty-four patients after CABG surgery were randomly assigned to receive either inpatient CR and CPAP (CPG) or standard CR without CPAP (CG). Cardiac rehabilitation included walking, and CPAP pressures were set between 10 and 12 cmH2O. Participants were assessed on the first day of walking at rest and during walking. Outcome measures included breathing pattern variables, exercise time in seconds (ETs), dyspnea/leg effort ratings, and peripheral oxygen saturation (SpO2). Twenty-seven patients (13 CPG vs 14 CG) completed the study. Compared with walking without noninvasive ventilation assistance, CPAP increased ETs by 43.4 seconds (P = .040) during walking, promoted better thoracoabdominal coordination, increased ventilation during walking by 12.5 L/min (P = .001), increased SpO2 values at the end of walking by 2.6% (P = .016), and reduced dyspnea ratings by 1 point (P = .008). Continuous positive airway pressure can positively influence exercise tolerance, ventilatory function, and breathing pattern in response to a single bout of exercise after CABG.

  14. Comparison of Prophylactic Naftopidil, Tamsulosin, and Silodosin for 125I Brachytherapy-Induced Lower Urinary Tract Symptoms in Patients With Prostate Cancer: Randomized Controlled Trial

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsumura, Hideyasu, E-mail: sugan@pd5.so-net.ne.jp; Satoh, Takefumi; Ishiyama, Hiromichi

    2011-11-15

    Purpose: To compare the efficacy of three α1A/α1D-adrenoceptor (AR) antagonists (naftopidil, tamsulosin, and silodosin) that have differing affinities for the α1-AR subtypes in treating urinary morbidities in Japanese men with 125I prostate implantation (PI) for prostate cancer. Methods and Materials: This single-institution prospective randomized controlled trial compared naftopidil, tamsulosin, and silodosin in patients undergoing PI. Patients were randomized and received either naftopidil, tamsulosin, or silodosin. Treatment began 1 day after PI and continued for 1 year. The primary efficacy variables were the changes in total International Prostate Symptom Score (IPSS) and postvoid residual urine (PVR). The secondary efficacy variables were changes in IPSS storage score and IPSS voiding score from baseline to set points during the study (1, 3, 6, and 12 months). Results: Two hundred twelve patients were evaluated in this study between June 2006 and February 2009: 71, 70, and 71 patients in the naftopidil, tamsulosin, and silodosin groups, respectively. With respect to the primary efficacy variables, the mean changes in the total IPSS at 1 month after PI in the naftopidil, tamsulosin, and silodosin groups were +10.3, +8.9, and +7.5, respectively. There were significantly greater decreases with silodosin than naftopidil at 1 month in the total IPSS. The mean changes in the PVR at 6 months were +14.6, +23.7, and +5.7 mL in the naftopidil, tamsulosin, and silodosin groups, respectively; silodosin showed a significant improvement in the PVR at 6 months vs. tamsulosin. With respect to the secondary efficacy variables, the mean changes in the IPSS voiding score at 1 month in the naftopidil, tamsulosin, and silodosin groups were +6.5, +5.6, and +4.5, respectively; silodosin showed a significant improvement in the IPSS voiding score at 1 month vs. naftopidil. Conclusions: Silodosin has a greater impact on improving PI-induced lower urinary tract symptoms than the other two agents.

  15. Predictors and Moderators of Treatment Response in Childhood Anxiety Disorders: Results from the CAMS Trial

    PubMed Central

    Compton, Scott N.; Peris, Tara S.; Almirall, Daniel; Birmaher, Boris; Sherrill, Joel; Kendall, Phillip C.; March, John S.; Gosch, Elizabeth A.; Ginsburg, Golda S.; Rynn, Moira A.; Piacentini, John C.; McCracken, James T.; Keeton, Courtney P.; Suveg, Cynthia M.; Aschenbrand, Sasha G.; Sakolsky, Dara; Iyengar, Satish; Walkup, John T.; Albano, Anne Marie

    2014-01-01

    Objective To examine predictors and moderators of treatment outcomes among 488 youth ages 7-17 years (50% female; 74% ≤ 12 years) with DSM-IV diagnoses of separation anxiety disorder, social phobia, or generalized anxiety disorder who were randomly assigned to receive either cognitive behavior therapy (CBT), sertraline (SRT), their combination (COMB), or medication management with pill placebo (PBO) in the Child/Adolescent Anxiety Multimodal Study (CAMS). Method Six classes of predictor and moderator variables (22 variables) were identified from the literature and examined using continuous (Pediatric Anxiety Ratings Scale; PARS) and categorical (Clinical Global Impression Scale-Improvement; CGI-I) outcome measures. Results Three baseline variables predicted better outcomes (independent of treatment condition) on the PARS, including low anxiety severity (as measured by parents and independent evaluators) and caregiver strain. No baseline variables were found to predict week 12 responder status (CGI-I). Participant's principal diagnosis moderated treatment outcomes, but only on the PARS. No baseline variables were found to moderate treatment outcomes on week 12 responder status (CGI-I). Discussion Overall, anxious children responded favorably to CAMS treatments. However, having more severe and impairing anxiety, greater caregiver strain, and a principal diagnosis of social phobia were associated with less favorable outcomes. Clinical implications of these findings are discussed. PMID:24417601

  16. Do little interactions get lost in dark random forests?

    PubMed

    Wright, Marvin N; Ziegler, Andreas; König, Inke R

    2016-03-31

    Random forests have often been claimed to uncover interaction effects. However, if and how interaction effects can be differentiated from marginal effects remains unclear. In extensive simulation studies, we investigate whether random forest variable importance measures capture or detect gene-gene interactions. By capturing interactions, we mean the ability to identify a variable that acts through an interaction with another one, while detection is the ability to identify an interaction effect as such. Of the single importance measures, the Gini importance captured interaction effects in most of the simulated scenarios; however, these effects were masked by marginal effects of other variables. With the permutation importance, the proportion of captured interactions was lower in all cases. Pairwise importance measures performed about equally, with a slight advantage for the joint variable importance method. However, the overall fraction of detected interactions was low. In almost all scenarios, the detection fraction in a model with only marginal effects was larger than in a model with an interaction effect only. Random forests are generally capable of capturing gene-gene interactions, but current variable importance measures are unable to detect them as interactions. In most cases, interactions are masked by marginal effects and cannot be differentiated from marginal effects. Consequently, caution is warranted when claiming that random forests uncover interactions.

  17. The influence of an uncertain force environment on reshaping trial-to-trial motor variability.

    PubMed

    Izawa, Jun; Yoshioka, Toshinori; Osu, Rieko

    2014-09-10

    Motor memory is updated to generate ideal movements in a novel environment. When the environment changes randomly from trial to trial, how does the brain incorporate this uncertainty into motor memory? To investigate how the brain adapts to an uncertain environment, we considered a reach adaptation protocol in which individuals practiced moving in a force field into which noise was injected. After they had adapted, we measured the trial-to-trial variability in the temporal profiles of the produced hand force. We found that motor variability was significantly magnified by adaptation to the random force field. Temporal profiles of the motor variance were significantly dissociable between the two different types of random force fields experienced. A model-based analysis suggests that the variability is generated by noise in the gains of the internal model. It further suggests that the trial-to-trial motor variability magnified by adaptation in a random force field is generated by the uncertainty of the internal model formed in the brain as a result of the adaptation.

  18. Glucose-lowering effect and glycaemic variability of insulin glargine, insulin detemir and insulin lispro protamine in people with type 1 diabetes.

    PubMed

    Derosa, G; Franzetti, I; Querci, F; Romano, D; D'Angelo, A; Maffioli, P

    2015-06-01

    To compare, using a continuous glucose monitoring (CGM) system, the effect on glycaemic variability of insulin glargine, detemir and lispro protamine. A total of 49 white people with type 1 diabetes, not well controlled by three times daily insulin lispro taken at a stable dose for at least 2 months before the study, were enrolled. The study participants were randomized to add insulin glargine, detemir or lispro protamine, once daily, in the evening. We used a CGM system, the iPro Digital Recorder (Medtronic MiniMed, Northridge, CA, USA), for 1 week. Glycaemic control was assessed according to mean blood glucose values, the area under the glucose curve above 3.9 mmol/l (AUC(>3.9)) or above 10.0 mmol/l (AUC(>10.0)), and the percentage of time spent with glucose values >3.9 or >10.0 mmol/l. Intraday glycaemic variability was assessed using standard deviation (s.d.) values, the mean amplitude of glycaemic excursions (MAGE) and continuous overlapping net glycaemic action (CONGA). Day-to-day glycaemic variability was assessed using the mean of daily differences (MODD). The s.d. was significantly lower with insulin lispro protamine and glargine compared with insulin detemir. AUC(>3.9) was higher and AUC(>10.0) was lower with insulin lispro protamine and glargine compared with detemir. MAGE and CONGA values were lower with insulin lispro protamine and glargine compared with detemir. In addition, MODD was significantly lower with insulin lispro protamine and glargine compared with detemir. Fewer hypoglycaemic events were recorded during the night-time with insulin lispro protamine compared with glargine and detemir. The results suggest that insulin lispro protamine and glargine are more effective than detemir in reducing glycaemic variability and improving glycaemic control in people with type 1 diabetes. Insulin lispro protamine seems to lead to fewer hypoglycaemic events than the other insulin regimens. © 2015 John Wiley & Sons Ltd.
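    Two of the variability metrics named above are simple to compute from a regularly sampled CGM trace; the sketch below implements the s.d. and the mean of daily differences (MODD) under the assumption of a fixed number of readings per day, with synthetic data. MAGE and CONGA need more bookkeeping and are omitted.

```python
# Intraday (s.d.) and day-to-day (MODD) glycaemic variability (sketch).
import numpy as np

def modd(glucose, samples_per_day):
    """Mean Of Daily Differences: mean |difference| between readings taken
    at the same clock time on consecutive days."""
    g = np.asarray(glucose, float)
    n_days = len(g) // samples_per_day
    days = g[: n_days * samples_per_day].reshape(n_days, samples_per_day)
    return np.mean(np.abs(np.diff(days, axis=0)))

rng = np.random.default_rng(1)
week = 7 * 288                                  # 288 readings/day = 5-min sampling
trace = 7 + 2 * np.sin(np.linspace(0, 14 * np.pi, week)) + rng.normal(0, 0.8, week)
print("s.d. %.2f mmol/l, MODD %.2f mmol/l" % (np.std(trace, ddof=1),
                                              modd(trace, samples_per_day=288)))
```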

  19. Origins and applications of the Montroll-Weiss continuous time random walk

    NASA Astrophysics Data System (ADS)

    Shlesinger, Michael F.

    2017-05-01

    The Continuous Time Random Walk (CTRW) was introduced by Montroll and Weiss in 1965 in a purely mathematical paper. Its antecedents and later applications beginning in 1973 are discussed, especially for the case of fractal time where the mean waiting time between jumps is infinite. Contribution to the Topical Issue: "Continuous Time Random Walk Still Trendy: Fifty-year History, Current State and Outlook", edited by Ryszard Kutner and Jaume Masoliver.
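    For reference, the central result of that 1965 paper can be written in one line. With waiting-time density ψ(t) and jump-length density λ(x), the Fourier-Laplace transform of the probability density P(x, t) of being at x at time t is:

```latex
% Montroll--Weiss equation; hat = Fourier (x -> k), tilde = Laplace (t -> s)
\hat{\tilde{P}}(k,s) \;=\; \frac{1-\tilde{\psi}(s)}{s}\,
                           \frac{1}{1-\tilde{\psi}(s)\,\hat{\lambda}(k)}
```

    The fractal-time regime mentioned above corresponds to ψ with infinite mean waiting time, e.g. ψ(t) ~ t^(-1-α) with 0 < α < 1, for which the walk becomes subdiffusive.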

  20. Psychotherapy integration under scrutiny: investigating the impact of integrating emotion-focused components into a CBT-based approach: a study protocol of a randomized controlled trial.

    PubMed

    Babl, Anna; Grosse Holtforth, Martin; Heer, Sara; Lin, Mu; Stähli, Annabarbara; Holstein, Dominique; Belz, Martina; Egenolf, Yvonne; Frischknecht, Eveline; Ramseyer, Fabian; Regli, Daniel; Schmied, Emma; Flückiger, Christoph; Brodbeck, Jeannette; Berger, Thomas; Caspar, Franz

    2016-11-24

    This currently recruiting randomized controlled trial investigates the effects of integrating components of Emotion-Focused Therapy (EFT) into Psychological Therapy (PT), an integrative form of cognitive-behavioral therapy in a manner that is directly mirroring common integrative practice in the sense of assimilative integration. Aims of the study are to understand how both, an existing therapy approach as well as the elements to be integrated, are affected by the integration and to clarify the role of emotional processing as a mediator of therapy outcome. A total of 130 adults with a diagnosed unipolar depressive, anxiety or adjustment disorder (seeking treatment at a psychotherapy outpatient clinic) are randomized to either treatment as usual (PT) with integrated emotion-focused components (TAU + EFT) or PT (TAU). Primary outcome variables are psychopathology and symptom severity at the end of therapy and at follow up; secondary outcome variables are interpersonal problems, psychological wellbeing, quality of life, attainment of individual therapy goals, and emotional competency. Furthermore, process variables such as the quality of the therapeutic relationship are studied as well as aptitude-treatment interactions. Variables are assessed at baseline, after 8 and 16 sessions, at the end of therapy, after 25 ± 3 sessions, and at 6, 12 and 36 month follow-up. Underlying mechanisms of change are investigated. Statistical analyses will be conducted using the appropriate multilevel approaches, mainly two-level regression and growth analysis. The results of this study will indicate whether the integration of emotion-focused elements into treatment as usual increases the effectiveness of Psychological Therapy. If advantages are found, which may be limited to particular variables or subgroups of patients, recommendations for a systematic integration, and caveats if also disadvantages are detected, can be formulated. On a more abstract level, a cognitive behavioral (represented by PT) and humanistic/experiential (represented by EFT) approach will be integrated. It must be emphasized that mimicking common practice in the development and continued education of psychotherapists, EFT is not integrated as a whole, but only elements of EFT that are considered particularly important, and can be trained in an 8-day training plus supervision of therapies. ClinicalTrials.gov, NCT02822443 , 22 June 2016, retrospectively registered.

  1. Regression Tree-Based Methodology for Customizing Building Energy Benchmarks to Individual Commercial Buildings

    NASA Astrophysics Data System (ADS)

    Kaskhedikar, Apoorva Prakash

    According to the U.S. Energy Information Administration, commercial buildings represent about 40% of the United States' energy consumption, of which office buildings consume a major portion. Gauging the extent to which an individual building consumes energy in excess of its peers is the first step in initiating energy efficiency improvement. Energy benchmarking offers an initial building energy performance assessment without rigorous evaluation. Energy benchmarking tools based on the Commercial Buildings Energy Consumption Survey (CBECS) database are investigated in this thesis. This study proposes a new benchmarking methodology based on decision trees, where a relationship between energy use intensities (EUI) and building parameters (continuous and categorical) is developed for different building types. This methodology was applied to the medium office and school building types contained in the CBECS database. The Random Forest technique was used to find the most influential parameters that impact building energy use intensities. Significant correlations between EUIs and CBECS variables were then identified. Other than floor area, some of the important variables were number of workers, location, number of PCs and main cooling equipment. The coefficient of variation was used to evaluate the effectiveness of the new model. The customization technique proposed in this thesis was compared with another benchmarking model that is widely used by building owners and designers, namely ENERGY STAR's Portfolio Manager. This tool relies on standard linear regression methods, which can only handle continuous variables. The proposed model uses a data-mining technique and was found to perform slightly better than the Portfolio Manager. The broader impact of the new benchmarking methodology is that it allows for identifying important categorical variables, and then incorporating them in a local, as against a global, model framework for EUI pertinent to the building type. The ability to identify and rank the important variables is of great importance in the practical implementation of benchmarking tools, which rely on query-based building and HVAC variable filters specified by the user.

  2. A guidance and navigation system for continuous low thrust vehicles. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Tse, C. J. C.

    1973-01-01

    A midcourse guidance and navigation system for continuous low thrust vehicles is described. A set of orbit elements, known as the equinoctial elements, is selected as the state variables. The uncertainties are modelled statistically by random vectors and stochastic processes. The motion of the vehicle and the measurements are described by nonlinear stochastic differential and difference equations, respectively. A minimum-time nominal trajectory is defined, and the equation of motion and the measurement equation are linearized about this nominal trajectory. An exponential cost criterion is constructed and a linear feedback guidance law is derived to control the thrusting direction of the engine. Using this guidance law, the vehicle will fly in a trajectory neighboring the nominal trajectory. The extended Kalman filter is used for state estimation. Finally, a short mission using this system is simulated. The results indicate that this system is very efficient for short missions.

  3. Point Processes.

    DTIC Science & Technology

    1987-05-01

    [Fragmentary OCR excerpt; only the recoverable mathematics is reproduced.] A point process N on a space E is defined by (i) N(∅) = 0 and N(B) < ∞ a.s. for each bounded B, and (ii) N(∪n Bn) = Σn N(Bn) a.s. for any disjoint B1, B2, ...; the random variable N(A) represents the number of points in A. A distributional criterion is stated: if (ν, X1, X2, ...) =d (ν, X'1, X'2, ...), then N =d N'; the converse is true when E = R+ or R and the Xn are the ordered points Tn. The remaining fragment concerns functions f: E → R+ that are continuous and such that {x: f(x) > 0} is a bounded set, and a Theorem 1.4 comparing two point processes N and N' on E.

  4. How does informational heterogeneity affect the quality of forecasts?

    NASA Astrophysics Data System (ADS)

    Gualdi, S.; De Martino, A.

    2010-01-01

    We investigate a toy model of inductive interacting agents aiming to forecast a continuous, exogenous random variable E. Private information on E is spread heterogeneously across agents. Herding turns out to be the preferred forecasting mechanism when heterogeneity is maximal. However in such conditions aggregating information efficiently is hard even in the presence of learning, as the herding ratio rises significantly above the efficient market expectation of 1 and remarkably close to the empirically observed values. We also study how different parameters (interaction range, learning rate, cost of information and score memory) may affect this scenario and improve efficiency in the hard phase.

  5. Evolution of bioconvective patterns in variable gravity

    NASA Technical Reports Server (NTRS)

    Noever, David A.

    1991-01-01

    Measurements are reported of the evolution of bioconvective patterns in shallow, dense cultures of microorganisms subjected to varying gravity. Various statistical properties of this random, quasi-two-dimensional structure have been found: Aboav's law is obeyed, the average vertex angles follow predictions for regular polygons, and the area of a pattern varies linearly with its number of sides. As gravity varies between 1 g and 1.8 g, these statistical properties continue to hold despite a tripling of the number of polygons and a reduced average polygon dimension by a third. This work compares with experiments on soap foams, Langmuir monolayer foams, metal grains, and simulations.

  6. Continuous variable quantum cryptography using coherent states.

    PubMed

    Grosshans, Frédéric; Grangier, Philippe

    2002-02-04

    We propose several methods for quantum key distribution (QKD) based on the generation and transmission of random distributions of coherent or squeezed states, and we show that they are secure against individual eavesdropping attacks. These protocols require that the transmission of the optical line between Alice and Bob is larger than 50%, but they do not rely on "sub-shot-noise" features such as squeezing. Their security is a direct consequence of the no-cloning theorem, which limits the signal-to-noise ratio of possible quantum measurements on the transmission line. Our approach can also be used for evaluating various QKD protocols using light with Gaussian statistics.

  7. A Random Forest approach to predict the spatial distribution of sediment pollution in an estuarine system

    PubMed Central

    Kreakie, Betty J.; Cantwell, Mark G.; Nacci, Diane

    2017-01-01

    Modeling the magnitude and distribution of sediment-bound pollutants in estuaries is often limited by incomplete knowledge of the site and inadequate sample density. To address these modeling limitations, a decision-support tool framework was conceived that predicts sediment contamination from the sub-estuary to the broader estuary extent. For this study, a Random Forest (RF) model was implemented to predict the distribution of a model contaminant, triclosan (5-chloro-2-(2,4-dichlorophenoxy)phenol) (TCS), in Narragansett Bay, Rhode Island, USA. TCS is an unregulated contaminant used in many personal care products. The RF explanatory variables were associated with TCS transport and fate (proxies) and direct and indirect environmental entry. The continuous RF TCS concentration predictions were discretized into three levels of contamination (low, medium, and high) for three different quantile thresholds. The RF model explained 63% of the variance with a minimum number of variables. Total organic carbon (TOC), a transport and fate proxy, was a strong predictor of TCS contamination, causing a 59% increase in mean squared error when its values were randomly permuted. Additionally, combined sewer overflow discharge (environmental entry) and sand (transport and fate proxy) were strong predictors. The discretization models identified a TCS area of greatest concern in the northern reach of Narragansett Bay (Providence River sub-estuary), which was validated with independent test samples. This decision-support tool performed well at the sub-estuary extent and provided the means to identify areas of concern and prioritize bay-wide sampling. PMID:28738089
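    The "59% increase in mean squared error" reported for TOC is the standard permutation-importance diagnostic, which scikit-learn exposes directly; a minimal sketch on synthetic data, with the variable names as hypothetical stand-ins:

```python
# Permutation importance for a random forest regressor (sketch, synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 3))                 # stand-ins for [TOC, CSO, sand]
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 300)

rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)

# Shuffling one predictor at a time and measuring the loss in accuracy
# mirrors the MSE-increase criterion described in the abstract.
imp = permutation_importance(rf, X, y, n_repeats=20, random_state=0,
                             scoring="neg_mean_squared_error")
for name, drop in zip(["TOC", "CSO", "sand"], imp.importances_mean):
    print(name, round(drop, 3))
```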

  8. Scenario generation for stochastic optimization problems via the sparse grid method

    DOE PAGES

    Chen, Michael; Mehrotra, Sanjay; Papp, David

    2015-04-19

    We study the use of sparse grids in the scenario generation (or discretization) problem in stochastic programming problems where the uncertainty is modeled using a continuous multivariate distribution. We show that, under a regularity assumption on the random function involved, the sequence of optimal objective function values of the sparse grid approximations converges to the true optimal objective function values as the number of scenarios increases. The rate of convergence is also established. We treat separately the special case when the underlying distribution is an affine transform of a product of univariate distributions, and show how the sparse grid method can be adapted to the distribution by the use of quadrature formulas tailored to the distribution. We numerically compare the performance of the sparse grid method using different quadrature rules with classic quasi-Monte Carlo (QMC) methods, optimal rank-one lattice rules, and Monte Carlo (MC) scenario generation, using a series of utility maximization problems with up to 160 random variables. The results show that the sparse grid method is very efficient, especially if the integrand is sufficiently smooth. In such problems the sparse grid scenario generation method is found to need several orders of magnitude fewer scenarios than MC and QMC scenario generation to achieve the same accuracy. The results indicate that the method scales well with the dimension of the distribution, especially when the underlying distribution is an affine transform of a product of univariate distributions, in which case the method appears scalable to thousands of random variables.

  9. Continuous-variable quantum homomorphic signature

    NASA Astrophysics Data System (ADS)

    Li, Ke; Shang, Tao; Liu, Jian-wei

    2017-10-01

    Quantum cryptography is believed to be unconditionally secure because its security is ensured by physical laws rather than computational complexity. According to spectrum characteristic, quantum information can be classified into two categories, namely discrete variables and continuous variables. Continuous-variable quantum protocols have gained much attention for their ability to transmit more information with lower cost. To verify the identities of different data sources in a quantum network, we propose a continuous-variable quantum homomorphic signature scheme. It is based on continuous-variable entanglement swapping and provides additive and subtractive homomorphism. Security analysis shows the proposed scheme is secure against replay, forgery and repudiation. Even under nonideal conditions, it supports effective verification within a certain verification threshold.

  10. Power calculator for instrumental variable analysis in pharmacoepidemiology

    PubMed Central

    Walker, Venexia M; Davies, Neil M; Windmeijer, Frank; Burgess, Stephen; Martin, Richard M

    2017-01-01

    Background Instrumental variable analysis, for example with physicians’ prescribing preferences as an instrument for medications issued in primary care, is an increasingly popular method in the field of pharmacoepidemiology. Existing power calculators for studies using instrumental variable analysis, such as Mendelian randomization power calculators, do not allow for the structure of research questions in this field. This is because the analysis in pharmacoepidemiology will typically have stronger instruments and detect larger causal effects than in other fields. Consequently, there is a need for dedicated power calculators for pharmacoepidemiological research. Methods and Results The formula for calculating the power of a study using instrumental variable analysis in the context of pharmacoepidemiology is derived before being validated by a simulation study. The formula is applicable for studies using a single binary instrument to analyse the causal effect of a binary exposure on a continuous outcome. An online calculator, as well as packages in both R and Stata, are provided for the implementation of the formula by others. Conclusions The statistical power of instrumental variable analysis in pharmacoepidemiological studies to detect a clinically meaningful treatment effect is an important consideration. Research questions in this field have distinct structures that must be accounted for when calculating power. The formula presented differs from existing instrumental variable power formulae due to its parametrization, which is designed specifically for ease of use by pharmacoepidemiologists. PMID:28575313
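    The closed-form formula itself is given in the paper and its R/Stata packages, but its logic can be checked by brute force: simulate the single-binary-instrument, binary-exposure, continuous-outcome design and count rejections. A hedged sketch with invented parameter values (for a valid instrument, testing the reduced-form difference in outcomes is equivalent to testing a zero causal effect):

```python
# Simulation-based power check for a single binary instrument (sketch).
import numpy as np
from scipy import stats

def iv_power(n=5000, p_z=0.5, compliance=0.3, effect=0.2,
             reps=2000, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    z_crit = stats.norm.ppf(1 - alpha / 2)
    rejections = 0
    for _ in range(reps):
        z = rng.random(n) < p_z                     # binary instrument
        x = rng.random(n) < (0.3 + compliance * z)  # binary exposure
        y = effect * x + rng.normal(0, 1, n)        # continuous outcome
        # Reduced-form test: difference in mean outcome across instrument arms.
        dy = y[z].mean() - y[~z].mean()
        se = np.sqrt(y[z].var(ddof=1) / z.sum() + y[~z].var(ddof=1) / (~z).sum())
        rejections += abs(dy / se) > z_crit
    return rejections / reps

print(iv_power())
```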

  11. Multi-hazard Assessment and Scenario Toolbox (MhAST): A Framework for Analyzing Compounding Effects of Multiple Hazards

    NASA Astrophysics Data System (ADS)

    Sadegh, M.; Moftakhari, H.; AghaKouchak, A.

    2017-12-01

    Many natural hazards are driven by multiple forcing variables, and concurrent or consecutive extreme events significantly increase the risk of infrastructure/system failure. It is common practice to use univariate analysis based upon a perceived ruling driver to estimate design quantiles and/or return periods of extreme events. A multivariate analysis, however, permits modeling simultaneous occurrence of multiple forcing variables. In this presentation, we introduce the Multi-hazard Assessment and Scenario Toolbox (MhAST), which comprehensively analyzes marginal and joint probability distributions of natural hazards. MhAST also offers a wide range of scenarios of return period and design levels and their likelihoods. The contribution of this study is four-fold: 1. comprehensive analysis of the marginal and joint probability of multiple drivers through 17 continuous distributions and 26 copulas; 2. multiple scenario analysis of concurrent extremes based upon the most likely joint occurrence, one ruling variable, and weighted random sampling of joint occurrences with similar exceedance probabilities; 3. weighted average scenario analysis based on an expected event; and 4. uncertainty analysis of the most likely joint occurrence scenario using a Bayesian framework.

  12. Effects of a Flexibility and Relaxation Programme, Walking, and Nordic Walking on Parkinson's Disease

    PubMed Central

    Reuter, I.; Mehnert, S.; Leone, P.; Kaps, M.; Oechsner, M.; Engelhardt, M.

    2011-01-01

    Symptoms of Parkinson's disease (PD) progress despite optimized medical treatment. The present study investigated the effects of a flexibility and relaxation programme, walking, and Nordic walking (NW) on walking speed, stride length, stride length variability, Parkinson-specific disability (UPDRS), and health-related quality of life (PDQ 39). 90 PD patients were randomly allocated to the 3 treatment groups. Patients participated in a 6-month study with 3 exercise sessions per week, each lasting 70 min. Assessment after completion of the training showed that pain was reduced in all groups, and balance and health-related quality of life were improved. Furthermore, walking and Nordic walking additionally improved stride length, gait variability, maximal walking speed, exercise capacity at the submaximal level, and Parkinson-specific disability on the UPDRS. Nordic walking was superior to the flexibility and relaxation programme and to walking in improving postural stability, stride length, gait pattern and gait variability. No significant injuries occurred during the training. All patients of the Nordic walking group continued Nordic walking after completing the study. PMID:21603199

  13. Coupled continuous time-random walks in quenched random environment

    NASA Astrophysics Data System (ADS)

    Magdziarz, M.; Szczotka, W.

    2018-02-01

    We introduce a coupled continuous-time random walk with the coupling characteristic of Lévy walks. Additionally, we assume that the walker moves in a quenched random environment, i.e. the site disorder at each lattice point is fixed in time. We analyze the scaling limit of such a random walk. We show that for large times the behaviour of the analyzed process is exactly the same as in the case of the uncoupled quenched trap model for Lévy flights.

  14. Comonotonic bounds on the survival probabilities in the Lee-Carter model for mortality projection

    NASA Astrophysics Data System (ADS)

    Denuit, Michel; Dhaene, Jan

    2007-06-01

    In the Lee-Carter framework, future survival probabilities are random variables with an intricate distribution function. In large homogeneous portfolios of life annuities, value-at-risk or conditional tail expectation of the total yearly payout of the company are approximately equal to the corresponding quantities involving random survival probabilities. This paper aims to derive some bounds in the increasing convex (or stop-loss) sense on these random survival probabilities. These bounds are obtained with the help of comonotonic upper and lower bounds on sums of correlated random variables.
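    The central comparison behind such bounds is the comonotonic upper bound: whatever the true dependence among the X_i (here, the survival probabilities across years), their sum is dominated in the convex (stop-loss) sense by the sum obtained when a single uniform variable drives all the marginals,

```latex
S=\sum_{i=1}^{n}X_i \;\preceq_{\mathrm{cx}}\;
S^{c}=\sum_{i=1}^{n}F_{X_i}^{-1}(U),\qquad U\sim\mathrm{Uniform}(0,1)
```

    so that E[(S - d)_+] ≤ E[(S^c - d)_+] for every retention d; lower bounds are built analogously by conditioning each term on a suitable summary variable.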

  15. Variance approach for multi-objective linear programming with fuzzy random of objective function coefficients

    NASA Astrophysics Data System (ADS)

    Indarsih, Indrati, Ch. Rini

    2016-02-01

    In this paper, we define the variance of fuzzy random variables through alpha levels. We present a theorem that establishes that the variance of a fuzzy random variable is a fuzzy number. We consider a multi-objective linear programming (MOLP) problem with fuzzy random objective function coefficients and solve it by a variance approach. The approach transforms the MOLP with fuzzy random objective function coefficients into an MOLP with fuzzy objective function coefficients. By weighting methods, we obtain a linear programming problem with fuzzy coefficients, which we solve by the simplex method for fuzzy linear programming.

  16. Exploiting Data Missingness in Bayesian Network Modeling

    NASA Astrophysics Data System (ADS)

    Rodrigues de Morais, Sérgio; Aussem, Alex

    This paper proposes a framework built on the use of Bayesian networks (BN) for representing statistical dependencies between the existing random variables and additional dummy boolean variables, which represent the presence/absence of the respective random variable's value. We show how augmenting the BN with these additional variables helps pinpoint the mechanism through which missing data contribute to the classification task. The missing data mechanism is thus explicitly taken into account to predict the class variable using the data at hand. Extensive experiments on synthetic and real-world incomplete data sets reveal that the missingness information improves classification accuracy.
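    The augmentation step described above is mechanical; a sketch in pandas, where each variable with missing values gets a companion boolean that the network then treats as an ordinary node (column names are hypothetical):

```python
# Augment a data set with missingness-indicator dummies (sketch).
import numpy as np
import pandas as pd

df = pd.DataFrame({"age": [34, np.nan, 51, 29],
                   "income": [np.nan, 42000, 58000, np.nan]})

# One dummy boolean per variable: True where the value is present.
for col in ["age", "income"]:
    df[col + "_observed"] = df[col].notna()

print(df)
# A BN learned over the original columns plus the *_observed nodes makes
# any dependence involving missingness an explicit edge in the graph.
```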

  17. Nonrecurrence and Bell-like inequalities

    NASA Astrophysics Data System (ADS)

    Danforth, Douglas G.

    2017-12-01

    The general class, Λ, of Bell hidden variables is composed of two subclasses ΛR and ΛN such that ΛR ∪ ΛN = Λ and ΛR ∩ ΛN = {}. The class ΛN is very large and contains random variables whose domain is the continuum, the reals. There are uncountably many reals, and every instance of a real random variable is unique; the probability of two instances being equal is zero, exactly zero. ΛN induces sample independence. All correlations are context dependent, but not in the usual sense; there is no "spooky action at a distance". Random variables belonging to ΛN are independent from one experiment to the next. The existence of the class ΛN makes it impossible to derive any of the standard Bell inequalities used to define quantum entanglement.

  18. Perturbed effects at radiation physics

    NASA Astrophysics Data System (ADS)

    Külahcı, Fatih; Şen, Zekâi

    2013-09-01

    Perturbation methodology is applied in order to assess the behavior of the linear attenuation coefficient, mass attenuation coefficient and cross-section when random components are present in the basic variables, such as the radiation amounts frequently used in radiation physics and chemistry. Additionally, the layer attenuation coefficient (LAC) and perturbed LAC (PLAC) are proposed for different contact materials. Perturbation methodology provides the opportunity to obtain results with random deviations from the average behavior of each variable that enters the whole mathematical expression. The basic photon-intensity expression, the inverse exponential power law (the Beer-Lambert law), is adopted for the exposition of the perturbation method. Perturbed results are presented not only in terms of the mean but also the standard deviation and the correlation coefficients. Such perturbation expressions allow one to assess small random variability in the basic variables.
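    The flavor of these perturbed results can be seen in one line. Writing the attenuation coefficient as its mean plus a zero-mean Gaussian fluctuation, μ = μ̄ + δ with δ ~ N(0, σ²), averaging the Beer-Lambert law over δ is just a Gaussian moment-generating-function computation:

```latex
I(x)=I_0 e^{-\mu x},\qquad
\langle I(x)\rangle
 = I_0\,e^{-\bar\mu x}\,\mathbb{E}\bigl[e^{-\delta x}\bigr]
 = I_0\,\exp\!\bigl(-\bar\mu x + \tfrac{1}{2}\sigma^2 x^2\bigr)
```

    so random variability in μ systematically raises the mean transmitted intensity above the deterministic value e^(-μ̄x), with the deviation growing with depth.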

  19. Evaluation of two-fold fully conditional specification multiple imputation for longitudinal electronic health record data

    PubMed Central

    Welch, Catherine A; Petersen, Irene; Bartlett, Jonathan W; White, Ian R; Marston, Louise; Morris, Richard W; Nazareth, Irwin; Walters, Kate; Carpenter, James

    2014-01-01

    Most implementations of multiple imputation (MI) of missing data are designed for simple rectangular data structures ignoring temporal ordering of data. Therefore, when applying MI to longitudinal data with intermittent patterns of missing data, some alternative strategies must be considered. One approach is to divide data into time blocks and implement MI independently at each block. An alternative approach is to include all time blocks in the same MI model. With increasing numbers of time blocks, this approach is likely to break down because of co-linearity and over-fitting. The new two-fold fully conditional specification (FCS) MI algorithm addresses these issues, by only conditioning on measurements, which are local in time. We describe and report the results of a novel simulation study to critically evaluate the two-fold FCS algorithm and its suitability for imputation of longitudinal electronic health records. After generating a full data set, approximately 70% of selected continuous and categorical variables were made missing completely at random in each of ten time blocks. Subsequently, we applied a simple time-to-event model. We compared efficiency of estimated coefficients from a complete records analysis, MI of data in the baseline time block and the two-fold FCS algorithm. The results show that the two-fold FCS algorithm maximises the use of data available, with the gain relative to baseline MI depending on the strength of correlations within and between variables. Using this approach also increases plausibility of the missing at random assumption by using repeated measures over time of variables whose baseline values may be missing. PMID:24782349

  20. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology

    EPA Science Inventory

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological datasets there is limited guidance on variable selection methods for RF modeling. Typically, e...

  1. P-Wave Indices and Risk of Ischemic Stroke: A Systematic Review and Meta-Analysis.

    PubMed

    He, Jinli; Tse, Gary; Korantzopoulos, Panagiotis; Letsas, Konstantinos P; Ali-Hasan-Al-Saegh, Sadeq; Kamel, Hooman; Li, Guangping; Lip, Gregory Y H; Liu, Tong

    2017-08-01

    Atrial cardiomyopathy is associated with an increased risk of ischemic stroke. P-wave terminal force in lead V1, P-wave duration, and maximum P-wave area are electrocardiographic parameters that have been used to assess left atrial abnormalities related to developing atrial fibrillation. The aim of this systematic review and meta-analysis was to examine their values for predicting ischemic stroke risk. PubMed and EMBASE databases were searched until December 2016 for studies that evaluated the association between P-wave indices and stroke risk. Both fixed- and random-effects models were used to calculate the overall effect estimates. Ten studies examining P-wave terminal force in lead V1, P-wave duration, and maximum P-wave area were included. P-wave terminal force in lead V1 was found to be an independent predictor of stroke as both a continuous variable (odds ratio [OR] per 1 SD change, 1.18; 95% confidence interval [CI], 1.12-1.25; P < 0.0001) and a categorical variable (OR, 1.59; 95% CI, 1.10-2.28; P = 0.01). P-wave duration was a significant predictor of incident ischemic stroke when analyzed as a categorical variable (OR, 1.86; 95% CI, 1.37-2.52; P < 0.0001) but not when analyzed as a continuous variable (OR, 1.05; 95% CI, 0.98-1.13; P = 0.15). Maximum P-wave area also predicted the risk of incident ischemic stroke (OR per 1 SD change, 1.10; 95% CI, 1.04-1.17). P-wave terminal force in lead V1, P-wave duration, and maximum P-wave area are useful electrocardiographic markers that can be used to stratify the risk of incident ischemic stroke. © 2017 American Heart Association, Inc.

  2. Factors affecting exits from homelessness among persons with serious mental illness and substance use disorders

    PubMed Central

    Gabrielian, Sonya; Bromley, Elizabeth; Hellemann, Gerhard S.; Kern, Robert S.; Goldenson, Nicholas I.; Danley, Megan E.; Young, Alexander S.

    2015-01-01

    Objective We sought to understand the housing trajectories of homeless consumers with serious mental illness (SMI) and co-occurring substance use disorders (SUD) and to identify factors that best predicted achievement of independent housing. Methods Using administrative data, we identified homeless persons with SMI and SUD admitted to a residential rehabilitation program from 12/2008-11/2011. On a random sample (n=36), we assessed a range of potential predictors of housing outcomes, including symptoms, cognition, and social/community supports. We used the Residential Time-Line Follow-Back (TLFB) Inventory to gather housing histories since exiting rehabilitation and identify housing outcomes. We used recursive partitioning to identify variables that best differentiated participants by these outcomes. Results We identified three housing trajectories: stable housing (n=14); unstable housing (n=15); and continuously engaged in housing services (n=7). Using recursive partitioning, two variables, the Symbol Digit Modalities Test (SDMT; a neurocognitive speed-of-processing measure) and the Behavior and Symptom Identification Scale (BASIS) Relationships subscale (which quantifies symptoms affecting relationships), were sufficient to capture the information provided by 26 predictors to classify participants by housing outcome. Participants predicted to continuously engage in services had impaired processing speeds (SDMT score < 32.5). Among consumers with SDMT score ≥ 32.5, those predicted to achieve stable housing had fewer interpersonal symptoms (BASIS Relationships score < 0.81) than those predicted to have unstable housing. This model explains 57% of this sample's variability and 14% of this population's variability in housing outcomes. Conclusion As cognition and symptoms influencing relationships predicted housing outcomes for homeless adults with SMI and SUD, cognitive and social skills trainings may be useful for this population. PMID:25919839

  3. Factors affecting exits from homelessness among persons with serious mental illness and substance use disorders.

    PubMed

    Gabrielian, Sonya; Bromley, Elizabeth; Hellemann, Gerhard S; Kern, Robert S; Goldenson, Nicholas I; Danley, Megan E; Young, Alexander S

    2015-04-01

    We sought to understand the housing trajectories of homeless consumers with serious mental illness (SMI) and co-occurring substance use disorders (SUD) and to identify factors that best predicted achievement of independent housing. Using administrative data, we identified homeless persons with SMI and SUD admitted to a residential rehabilitation program from December 2008 to November 2011. Our primary outcome measure was independent housing status. On a random sample (N = 36), we assessed a range of potential predictors of housing outcomes, including symptoms, cognition, and social/community supports. We used the Residential Time-Line Follow-Back (TLFB) Inventory to gather housing histories since exiting rehabilitation and to identify housing outcomes. We used Recursive Partitioning (RP) to identify variables that best differentiated participants by these outcomes. We identified 3 housing trajectories: stable housing (n = 14), unstable housing (n = 15), and continuously engaged in housing services (n = 7). In RP analysis, 2 variables (Symbol Digit Modalities Test [SDMT], a neurocognitive speed of processing measure, and Behavior and Symptom Identification Scale [BASIS-24] Relationships subscale, which quantifies symptoms affecting relationships) were sufficient to capture information provided by 26 predictors to classify participants by housing outcome. Participants predicted to continuously engage in services had impaired processing speeds (SDMT score < 32.5). Among consumers with SDMT score ≥ 32.5, those predicted to achieve stable housing had fewer interpersonal symptoms (BASIS-24 Relationships subscale score < 0.81) than those predicted to have unstable housing. This model explains 57% of this sample's variability and 14% of this population's variability in housing outcomes. Because cognition and symptoms influencing relationships predicted housing outcomes for homeless adults with SMI and SUD, cognitive and social skills training may be useful for this population. © Copyright 2015 Physicians Postgraduate Press, Inc.

  4. Random function representation of stationary stochastic vector processes for probability density evolution analysis of wind-induced structures

    NASA Astrophysics Data System (ADS)

    Liu, Zhangjun; Liu, Zenghui

    2018-06-01

    This paper develops a hybrid approach of spectral representation and random functions for simulating stationary stochastic vector processes. In the proposed approach, the high-dimensional random variables included in the original spectral representation (OSR) formula can be effectively reduced to only two elementary random variables by introducing random functions that serve as random constraints. Based on this, satisfactory simulation accuracy can be guaranteed by selecting a small representative point set of the elementary random variables. The probability information of the stochastic excitations can be fully captured through just several hundred sample functions generated by the proposed approach. Therefore, combined with the probability density evolution method (PDEM), it enables dynamic response analysis and reliability assessment of engineering structures. For illustrative purposes, a stochastic turbulence wind velocity field acting on a frame-shear-wall structure is simulated by constructing three types of random functions to demonstrate the accuracy and efficiency of the proposed approach. Careful and in-depth studies concerning the probability density evolution analysis of the wind-induced structure have been conducted to better illustrate the application prospects of the proposed approach. Numerical examples also show that the proposed approach possesses good robustness.
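    For orientation, the sketch below simulates a one-dimensional stationary process with the classical spectral representation the paper starts from; the target spectrum is illustrative. The paper's contribution, replacing the 2N independent normals A_k, B_k with random functions of just two elementary variables, is not reproduced here.

```python
# Classical spectral representation of a stationary Gaussian process (sketch).
import numpy as np

def simulate_srm(S, omega_max, N, t, rng):
    """X(t) = sum_k sqrt(2 S(w_k) dw) [A_k cos(w_k t) + B_k sin(w_k t)],
    with A_k, B_k i.i.d. standard normal (the high-dimensional variables
    the paper reduces via random functions)."""
    dw = omega_max / N
    w = (np.arange(N) + 0.5) * dw
    amp = np.sqrt(2.0 * S(w) * dw)
    A, B = rng.normal(size=N), rng.normal(size=N)
    return (amp * (np.cos(np.outer(t, w)) * A +
                   np.sin(np.outer(t, w)) * B)).sum(axis=1)

S = lambda w: 1.0 / (1.0 + w**2)       # illustrative two-sided target spectrum
t = np.linspace(0.0, 10.0, 1000)
x = simulate_srm(S, omega_max=20.0, N=512, t=t, rng=np.random.default_rng(0))
print("sample std: %.3f" % x.std())
```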

  5. Probabilistic Finite Elements (PFEM) structural dynamics and fracture mechanics

    NASA Technical Reports Server (NTRS)

    Liu, Wing-Kam; Belytschko, Ted; Mani, A.; Besterfield, G.

    1989-01-01

    The purpose of this work is to develop computationally efficient methodologies for assessing the effects of randomness in loads, material properties, and other aspects of a problem by a finite element analysis. The resulting group of methods is called probabilistic finite elements (PFEM). The overall objective of this work is to develop methodologies whereby the lifetime of a component can be predicted, accounting for the variability in the material and geometry of the component, the loads, and other aspects of the environment; and the range of response expected in a particular scenario can be presented to the analyst in addition to the response itself. Emphasis has been placed on methods which are not statistical in character; that is, they do not involve Monte Carlo simulations. The reason for this choice of direction is that Monte Carlo simulations of complex nonlinear response require a tremendous amount of computation. The focus of efforts so far has been on nonlinear structural dynamics. However, in the continuation of this project, emphasis will be shifted to probabilistic fracture mechanics so that the effect of randomness in crack geometry and material properties can be studied interactively with the effect of random load and environment.

  6. Sample size calculations for the design of cluster randomized trials: A summary of methodology.

    PubMed

    Gao, Fei; Earnest, Arul; Matchar, David B; Campbell, Michael J; Machin, David

    2015-05-01

    Cluster randomized trial designs are growing in popularity in, for example, cardiovascular medicine research and other clinical areas, and this has stimulated parallel statistical developments concerned with the design and analysis of such trials. Nevertheless, reviews suggest that the design issues associated with cluster randomized trials are often poorly appreciated, and there remain inadequacies in, for example, describing how the trial size was determined and presenting the associated results. In this paper, our aim is to provide pragmatic guidance for researchers on methods of calculating sample sizes. We focus attention on designs with the primary purpose of comparing two interventions with respect to continuous, binary, ordered categorical, incidence rate and time-to-event outcome variables. Issues of aggregate and non-aggregate cluster trials, adjustment for variation in cluster size, and the effect size are detailed. The problem of establishing the anticipated magnitude of between- and within-cluster variation, to enable planning values of the intra-cluster correlation coefficient and the coefficient of variation, is also described. Illustrative examples of calculations of trial sizes for each endpoint type are included. Copyright © 2015 Elsevier Inc. All rights reserved.
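
    A minimal sketch of the core calculation for a continuous outcome, assuming equal cluster sizes m and the usual design-effect inflation DE = 1 + (m - 1) * ICC; the function name and the default alpha and power are our choices, not the paper's:

      import numpy as np
      from scipy.stats import norm

      def clusters_per_arm(delta, sd, icc, m, alpha=0.05, power=0.80):
          z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
          n_ind = 2 * (z * sd / delta) ** 2       # per-arm size under individual randomization
          de = 1 + (m - 1) * icc                  # design effect: inflation from clustering
          return int(np.ceil(n_ind * de / m))     # clusters needed per arm

      print(clusters_per_arm(delta=5.0, sd=10.0, icc=0.05, m=20))   # -> 7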

  7. Soccer vs. running training effects in young adult men: which programme is more effective in improvement of body composition? Randomized controlled trial

    PubMed Central

    Pantelić, S; Kostić, R; Trajković, N; Sporiš, G

    2015-01-01

    The aims of this study were: 1) To determine the effects of a 12-week recreational soccer training programme and continuous endurance running on body composition of young adult men and 2) to determine which of these two programmes was more effective concerning body composition. Sixty-four participants completed the randomized controlled trial and were randomly assigned to one of three groups: a soccer training group (SOC; n=20), a running group (RUN; n=21) or a control group performing no physical training (CON; n=23). Training programmes for SOC and RUN lasted 12 weeks, with 3 training sessions per week. Soccer sessions consisted of 60 min of ordinary five-a-side, six-a-side or seven-a-side matches on a 30-45 m wide and 45-60 m long plastic grass pitch. Running sessions consisted of 60 min of continuous moderate intensity running at the same average heart rate as in SOC (~80% HRmax). All participants, regardless of group assignment, were tested for each of the following dependent variables: body weight, body height, body mass index, percent body fat, body fat mass, fat-free mass and total body water. In the SOC and RUN groups there was a significant decrease (p < 0.05) in body composition parameters from pre- to post-training values for all measures with the exception of fat-free mass and total body water. Body mass index, percent body fat and body fat mass did not differ between groups at baseline, but by week 12 were significantly lower (p < 0.05) in the SOC and RUN groups compared to CON. To conclude, recreational soccer training provides at least the same changes in body composition parameters as continuous running in young adult men when the training intensity is well matched. PMID:26681832

  8. A Unifying Probability Example.

    ERIC Educational Resources Information Center

    Maruszewski, Richard F., Jr.

    2002-01-01

    Presents an example from probability and statistics that ties together several topics including the mean and variance of a discrete random variable, the binomial distribution and its particular mean and variance, the sum of independent random variables, the mean and variance of the sum, and the central limit theorem. Uses Excel to illustrate these…

  9. Variable selection with random forest: Balancing stability, performance, and interpretation in ecological and environmental modeling

    EPA Science Inventory

    Random forest (RF) is popular in ecological and environmental modeling, in part, because of its insensitivity to correlated predictors and resistance to overfitting. Although variable selection has been proposed to improve both performance and interpretation of RF models, it is u...

  10. The Statistical Drake Equation

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2010-12-01

    We provide the statistical generalization of the Drake equation. From a simple product of seven positive numbers, the Drake equation is now turned into the product of seven positive random variables. We call this "the Statistical Drake Equation". The mathematical consequences of this transformation are then derived. The proof of our results is based on the Central Limit Theorem (CLT) of Statistics. In loose terms, the CLT states that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable. This is called the Lyapunov Form of the CLT, or the Lindeberg Form of the CLT, depending on the mathematical constraints assumed on the third moments of the various probability distributions. In conclusion, we show that: The new random variable N, yielding the number of communicating civilizations in the Galaxy, follows the LOGNORMAL distribution. Then, as a consequence, the mean value of this lognormal distribution is the ordinary N in the Drake equation. The standard deviation, mode, and all the moments of this lognormal N are also found. The seven factors in the ordinary Drake equation now become seven positive random variables. The probability distribution of each random variable may be ARBITRARY. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into our statistical Drake equation by allowing an arbitrary probability distribution for each factor. This is both physically realistic and practically very useful, of course. An application of our statistical Drake equation then follows. The (average) DISTANCE between any two neighboring and communicating civilizations in the Galaxy may be shown to be inversely proportional to the cubic root of N. Then, in our approach, this distance becomes a new random variable. We derive the relevant probability density function, apparently previously unknown and dubbed "Maccone distribution" by Paul Davies. DATA ENRICHMENT PRINCIPLE. It should be noticed that ANY positive number of random variables in the Statistical Drake Equation is compatible with the CLT. So, our generalization allows for many more factors to be added in the future as more refined scientific knowledge about each factor becomes available. This capability to make room for more future factors in the statistical Drake equation, we call the "Data Enrichment Principle," and we regard it as the key to more profound future results in the fields of Astrobiology and SETI. Finally, a practical example is given of how our statistical Drake equation works numerically. We work out in detail the case where each of the seven random variables is uniformly distributed around its own mean value and has a given standard deviation. For instance, the number of stars in the Galaxy is assumed to be uniformly distributed around (say) 350 billion with a standard deviation of (say) 1 billion. Then, the resulting lognormal distribution of N is computed numerically by virtue of a MathCad file that the author has written. This shows that the mean value of the lognormal random variable N is actually of the same order as the classical N given by the ordinary Drake equation, as one might expect from a good statistical generalization.
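
    The lognormal claim is easy to check numerically: log N is a sum of seven independent terms, so the CLT makes N approximately lognormal. A minimal Monte Carlo sketch, with purely illustrative factor means rather than the author's MathCad inputs:

      import numpy as np

      # Each factor uniform around an illustrative mean; N = product of the seven.
      rng = np.random.default_rng(1)
      means = [3.5e11, 0.5, 0.3, 0.35, 0.2, 0.1, 1e-7]
      factors = [rng.uniform(0.5 * m, 1.5 * m, 200_000) for m in means]
      N = np.prod(factors, axis=0)

      logN = np.log(N)
      print("mean, std of log N:", logN.mean(), logN.std())
      # A normal fit to log N (i.e., a lognormal fit to N) tracks the histogram closely.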

  11. Relapse of Graves' disease after successful outcome of antithyroid drug therapy: results of a prospective randomized study on the use of levothyroxine.

    PubMed

    Hoermann, Rudolf; Quadbeck, Beate; Roggenbuck, Ulla; Szabolcs, István; Pfeilschifter, Johannes; Meng, Wieland; Reschke, Kirsten; Hackenberg, Klaus; Dettmann, Juergen; Prehn, Brigitte; Hirche, Herbert; Mann, Klaus

    2002-12-01

    Antithyroid drugs are effective in restoring euthyroidism in Graves' disease, but many patients experience relapse after withdrawal. Prevention of recurrence would therefore be a desirable goal. In a prospective study, patients with a successful outcome of 12 to 15 months of antithyroid drug therapy were stratified for risk factors and randomly assigned to receive levothyroxine in a variable thyrotropin (TSH)-suppressive dose for 2 years or no treatment. The levothyroxine group was randomized to continue or discontinue levothyroxine after 1 year. End points included relapse of overt hyperthyroidism. Of 346 patients with Graves' disease enrolled, 225 were euthyroid 4 weeks after antithyroid drug withdrawal and were randomly assigned to receive levothyroxine (114 patients) or no treatment (controls, 111 patients). Of those not randomized, 39 patients showed early relapse within 4 weeks, 61 had endogenous TSH suppression, 7 had TSH elevation, and 14 had to be excluded. The dropout rate during the study was 13.3%. Kaplan-Meier analyses showed relapse rates to be similar in the levothyroxine group (20% after 1 year, 32% after 2 years) and the randomized controls (18%, 24%), whereas relapses were significantly more frequent in the follow-up group of patients with endogenously suppressed TSH (33%, 49%). Levothyroxine therapy did not influence TSH-receptor antibody levels, nor did it reduce goiter size. The best prognostic marker available was basal TSH determined 4 weeks after withdrawal of antithyroid drugs (posttreatment TSH). The study demonstrates that levothyroxine does not prevent relapse of hyperthyroidism after successful restoration of euthyroid function by antithyroid drugs and characterizes posttreatment TSH as a main prognostic marker.

  12. Random effects coefficient of determination for mixed and meta-analysis models

    PubMed Central

    Demidenko, Eugene; Sargent, James; Onega, Tracy

    2011-01-01

    The key feature of a mixed model is the presence of random effects. We have developed a coefficient, called the random effects coefficient of determination, Rr2, that estimates the proportion of the conditional variance of the dependent variable explained by random effects. This coefficient takes values from 0 to 1 and indicates how strong the random effects are. The difference from the earlier suggested fixed effects coefficient of determination is emphasized. If Rr2 is close to 0, there is weak support for random effects in the model because the reduction of the variance of the dependent variable due to random effects is small; consequently, random effects may be ignored and the model simplifies to standard linear regression. A value of Rr2 away from 0 indicates evidence of variance reduction in support of the mixed model. If Rr2 is close to 1, the variance of the random effects is very large and the random effects turn into free fixed effects; the model can then be estimated using the dummy variable approach. We derive explicit formulas for Rr2 in three special cases: the random intercept model, the growth curve model, and the meta-analysis model. Theoretical results are illustrated with three mixed model examples: (1) travel time to the nearest cancer center for women with breast cancer in the U.S., (2) cumulative time watching alcohol related scenes in movies among young U.S. teens, as a risk factor for early drinking onset, and (3) the classic example of the meta-analysis model for the combination of 13 studies on tuberculosis vaccine. PMID:23750070
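
    A minimal sketch of the idea for the random-intercept special case, on simulated data with statsmodels; here the share of conditional variance due to random effects reduces to s2_b / (s2_b + s2_e), which only approximates the paper's general definition of Rr2:

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      # 30 clusters of 20: y = 2*x + b_i + e, with Var(b_i) = Var(e) = 1.
      rng = np.random.default_rng(12)
      g = np.repeat(np.arange(30), 20)
      x = rng.normal(size=g.size)
      y = 2.0 * x + rng.normal(size=30)[g] + rng.normal(size=g.size)
      df = pd.DataFrame({"y": y, "x": x, "g": g})

      fit = smf.mixedlm("y ~ x", df, groups=df["g"]).fit()
      s2_b = float(fit.cov_re.iloc[0, 0])           # random-intercept variance
      s2_e = float(fit.scale)                       # residual variance
      print("Rr2 ~", s2_b / (s2_b + s2_e))          # ~0.5 for this simulation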

  13. Correlated resistive/capacitive state variability in solid TiO2 based memory devices

    NASA Astrophysics Data System (ADS)

    Li, Qingjiang; Salaoru, Iulia; Khiat, Ali; Xu, Hui; Prodromakis, Themistoklis

    2017-05-01

    In this work, we experimentally demonstrated the correlated resistive/capacitive switching and state variability in practical TiO2 based memory devices. Based on the filamentary functional mechanism, we argue that the impedance state variability stems from randomly distributed defects inside the oxide bulk. Finally, our assumption was verified via a current percolation circuit model that takes into account the random distribution of defects and the coexistence of memristive and memcapacitive behavior.

  14. Algebraic Functions of H-Functions with Specific Dependency Structure.

    DTIC Science & Technology

    1984-05-01

    a study of its characteristic function. Such analysis is reproduced in books by Springer (17), Anderson (23), Feller (34, 35), Mood and Graybill (52)... The following linearity property for expectations of jointly distributed random variables is derived. Theorem 1.1: If X and Y are real random variables... appear in American Journal of Mathematical and Management Science. 13. Mathai, A.M., and R.K. Saxena, "On linear combinations of stochastic variables

  15. A spatial error model with continuous random effects and an application to growth convergence

    NASA Astrophysics Data System (ADS)

    Laurini, Márcio Poletti

    2017-10-01

    We propose a spatial error model with continuous random effects based on Matérn covariance functions and apply this model for the analysis of income convergence processes (β -convergence). The use of a model with continuous random effects permits a clearer visualization and interpretation of the spatial dependency patterns, avoids the problems of defining neighborhoods in spatial econometrics models, and allows projecting the spatial effects for every possible location in the continuous space, circumventing the existing aggregations in discrete lattice representations. We apply this model approach to analyze the economic growth of Brazilian municipalities between 1991 and 2010 using unconditional and conditional formulations and a spatiotemporal model of convergence. The results indicate that the estimated spatial random effects are consistent with the existence of income convergence clubs for Brazilian municipalities in this period.
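
    A minimal sketch of the Matérn covariance family the model is built on, in a common parameterization (the paper's exact parameterization and parameter values may differ):

      import numpy as np
      from scipy.special import gamma, kv

      def matern_cov(h, sigma2=1.0, rho=1.0, nu=1.5):
          # C(h) = sigma2 * 2^(1-nu)/Gamma(nu) * (sqrt(2 nu) h/rho)^nu * K_nu(sqrt(2 nu) h/rho)
          h = np.asarray(h, dtype=float)
          scaled = np.sqrt(2.0 * nu) * np.where(h > 0, h, 1.0) / rho  # dummy value at h = 0
          c = sigma2 * (2.0 ** (1.0 - nu) / gamma(nu)) * scaled ** nu * kv(nu, scaled)
          return np.where(h > 0, c, sigma2)   # C(0) = sigma2 by definition

      print(matern_cov([0.0, 0.5, 1.0, 2.0]))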

  16. A stochastic hybrid systems based framework for modeling dependent failure processes

    PubMed Central

    Fan, Mengfei; Zeng, Zhiguo; Zio, Enrico; Kang, Rui; Chen, Ying

    2017-01-01

    In this paper, we develop a framework to model and analyze systems that are subject to dependent, competing degradation processes and random shocks. The degradation processes are described by stochastic differential equations, whereas transitions between the system discrete states are triggered by random shocks. The modeling is, then, based on Stochastic Hybrid Systems (SHS), whose state space is comprised of a continuous state determined by stochastic differential equations and a discrete state driven by stochastic transitions and reset maps. A set of differential equations are derived to characterize the conditional moments of the state variables. System reliability and its lower bounds are estimated from these conditional moments, using the First Order Second Moment (FOSM) method and Markov inequality, respectively. The developed framework is applied to model three dependent failure processes from literature and a comparison is made to Monte Carlo simulations. The results demonstrate that the developed framework is able to yield an accurate estimation of reliability with less computational costs compared to traditional Monte Carlo-based methods. PMID:28231313

  17. A stochastic hybrid systems based framework for modeling dependent failure processes.

    PubMed

    Fan, Mengfei; Zeng, Zhiguo; Zio, Enrico; Kang, Rui; Chen, Ying

    2017-01-01

    In this paper, we develop a framework to model and analyze systems that are subject to dependent, competing degradation processes and random shocks. The degradation processes are described by stochastic differential equations, whereas transitions between the system discrete states are triggered by random shocks. The modeling is, then, based on Stochastic Hybrid Systems (SHS), whose state space is comprised of a continuous state determined by stochastic differential equations and a discrete state driven by stochastic transitions and reset maps. A set of differential equations are derived to characterize the conditional moments of the state variables. System reliability and its lower bounds are estimated from these conditional moments, using the First Order Second Moment (FOSM) method and Markov inequality, respectively. The developed framework is applied to model three dependent failure processes from literature and a comparison is made to Monte Carlo simulations. The results demonstrate that the developed framework is able to yield an accurate estimation of reliability with less computational costs compared to traditional Monte Carlo-based methods.
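
    A minimal sketch of the final estimation step, assuming the conditional mean and variance of a nonnegative state variable X have already been obtained from the moment equations; the failure threshold x_f and the moment values are illustrative:

      import numpy as np
      from scipy.stats import norm

      def fosm_reliability(mean, var, x_f):
          beta = (x_f - mean) / np.sqrt(var)   # FOSM reliability index
          return norm.cdf(beta)                # P(X < x_f) under a normal approximation

      def markov_lower_bound(mean, x_f):
          # For X >= 0, Markov gives P(X >= x_f) <= mean / x_f, hence a
          # distribution-free lower bound on reliability.
          return max(0.0, 1.0 - mean / x_f)

      print(fosm_reliability(mean=1.0, var=0.04, x_f=1.5))   # ~0.994
      print(markov_lower_bound(mean=1.0, x_f=1.5))           # ~0.333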

  18. The Statistical Fermi Paradox

    NASA Astrophysics Data System (ADS)

    Maccone, C.

    This paper provides the statistical generalization of the Fermi paradox. The statistics of habitable planets may be based on a set of ten (and possibly more) astrobiological requirements first pointed out by Stephen H. Dole in his book Habitable planets for man (1964). The statistical generalization of the original and by now too simplistic Dole equation is provided by replacing a product of ten positive numbers by the product of ten positive random variables. This is denoted the SEH, an acronym standing for "Statistical Equation for Habitables". The proof in this paper is based on the Central Limit Theorem (CLT) of Statistics, stating that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable (Lyapunov form of the CLT). It is then shown that: 1. The new random variable NHab, yielding the number of habitables (i.e. habitable planets) in the Galaxy, follows the log-normal distribution. By construction, the mean value of this log-normal distribution is the total number of habitable planets as given by the statistical Dole equation. 2. The ten (or more) astrobiological factors are now positive random variables. The probability distribution of each random variable may be arbitrary. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into the SEH by allowing an arbitrary probability distribution for each factor. This is both astrobiologically realistic and useful for any further investigations. 3. By applying the SEH it is shown that the (average) distance between any two nearby habitable planets in the Galaxy is inversely proportional to the cubic root of NHab. This distance is denoted by the new random variable D. The relevant probability density function is derived, which was named the "Maccone distribution" by Paul Davies in 2008. 4. A practical example is then given of how the SEH works numerically. Each of the ten random variables is uniformly distributed around its own mean value as given by Dole (1964) and a standard deviation of 10% is assumed. The conclusion is that the average number of habitable planets in the Galaxy should be around 100 million ±200 million, and the average distance in between any two nearby habitable planets should be about 88 light years ±40 light years. 5. The SEH results are matched against the results of the Statistical Drake Equation from reference 4. As expected, the number of currently communicating ET civilizations in the Galaxy turns out to be much smaller than the number of habitable planets (about 10,000 against 100 million, i.e. one ET civilization out of 10,000 habitable planets). The average distance between any two nearby habitable planets is much smaller than the average distance between any two neighbouring ET civilizations: 88 light years vs. 2000 light years, respectively. This means the average ET distance is about 20 times larger than the average distance between any pair of adjacent habitable planets. 6. Finally, a statistical model of the Fermi Paradox is derived by applying the above results to the coral expansion model of Galactic colonization. The symbolic manipulator "Macsyma" is used to solve these difficult equations. A new random variable Tcol, representing the time needed to colonize a new planet, is introduced, which follows the lognormal distribution. Then the new quotient random variable Tcol/D is studied and its probability density function is derived by Macsyma. Finally, a linear transformation of random variables yields the overall time TGalaxy needed to colonize the whole Galaxy. We believe that our mathematical work in deriving this STATISTICAL Fermi Paradox is highly innovative and fruitful for the future.

  19. Mutilating Data and Discarding Variance: The Dangers of Dichotomizing Continuous Variables.

    ERIC Educational Resources Information Center

    Kroff, Michael W.

    This paper reviews issues involved in converting continuous variables to nominal variables to be used in the OVA techniques. The literature dealing with the dangers of dichotomizing continuous variables is reviewed. First, the assumptions invoked by OVA analyses are reviewed in addition to concerns regarding the loss of variance and a reduction in…

  20. On the distribution of a product of N Gaussian random variables

    NASA Astrophysics Data System (ADS)

    Stojanac, Željka; Suess, Daniel; Kliesch, Martin

    2017-08-01

    The product of Gaussian random variables appears naturally in many applications in probability theory and statistics. It has been known that the distribution of a product of N such variables can be expressed in terms of a Meijer G-function. Here, we compute a similar representation for the corresponding cumulative distribution function (CDF) and provide a power-log series expansion of the CDF based on the theory of the more general Fox H-functions. Numerical computations show that for small values of the argument the CDF of products of Gaussians is well approximated by the lowest orders of this expansion. Analogous results are also shown for the absolute value as well as the square of such products of N Gaussian random variables. For the latter two settings, we also compute the moment generating functions in terms of Meijer G-functions.
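
    The CDF is straightforward to approximate by Monte Carlo, which makes a handy cross-check on the Meijer G / Fox H series expansions; a minimal sketch for N = 3:

      import numpy as np

      # Monte Carlo CDF of a product of N independent standard Gaussians.
      rng = np.random.default_rng(10)
      N = 3
      prod = rng.standard_normal((1_000_000, N)).prod(axis=1)

      def cdf_mc(z):
          return (prod <= z).mean()

      print(cdf_mc(0.0))   # 0.5 by symmetry
      print(cdf_mc(1.0))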

  1. Increasing body mass index z-score is continuously associated with complications of overweight in children, even in the healthy weight range.

    PubMed

    Bell, Lana M; Byrne, Sue; Thompson, Alisha; Ratnam, Nirubasini; Blair, Eve; Bulsara, Max; Jones, Timothy W; Davis, Elizabeth A

    2007-02-01

    Overweight/obesity in children is increasing. Incidence data for medical complications use arbitrary cutoff values for categories of overweight and obesity. Continuous relationships are seldom reported. The objective of this study is to report relationships of child body mass index (BMI) z-score as a continuous variable with the medical complications of overweight. This study is a part of the larger, prospective cohort Growth and Development Study. Children were recruited from the community through randomly selected primary schools. Overweight children seeking treatment were recruited through tertiary centers. Children aged 6-13 yr were community-recruited normal weight (n = 73), community-recruited overweight (n = 53), and overweight treatment-seeking (n = 51). Medical history, family history, and symptoms of complications of overweight were collected by interview, and physical examination was performed. Investigations included oral glucose tolerance tests, fasting lipids, and liver function tests. Adjusted regression was used to model each complication of obesity with age- and sex-specific child BMI z-scores entered as a continuous dependent variable. Adjusted logistic regression showed the proportion of children with musculoskeletal pain, obstructive sleep apnea symptoms, headaches, depression, anxiety, bullying, and acanthosis nigricans increased with child BMI z-score. Adjusted linear regression showed BMI z-score was significantly related to systolic and diastolic blood pressure, insulin during oral glucose tolerance test, total cholesterol, high-density lipoprotein, triglycerides, and alanine aminotransferase. Child's BMI z-score is independently related to complications of overweight and obesity in a linear or curvilinear fashion. Children's risks of most complications increase across the entire range of BMI values and are not defined by thresholds.

  2. Accounting for stimulus-specific variation in precision reveals a discrete capacity limit in visual working memory

    PubMed Central

    Pratte, Michael S.; Park, Young Eun; Rademaker, Rosanne L.; Tong, Frank

    2016-01-01

    If we view a visual scene that contains many objects, then momentarily close our eyes, some details persist while others seem to fade. Discrete models of visual working memory (VWM) assume that only a few items can be actively maintained in memory, beyond which pure guessing will emerge. Alternatively, continuous resource models assume that all items in a visual scene can be stored with some precision. Distinguishing between these competing models is challenging, however, as resource models that allow for stochastically variable precision (across items and trials) can produce error distributions that resemble random guessing behavior. Here, we evaluated the hypothesis that a major source of variability in VWM performance arises from systematic variation in precision across the stimuli themselves; such stimulus-specific variability can be incorporated into both discrete-capacity and variable-precision resource models. Participants viewed multiple oriented gratings, and then reported the orientation of a cued grating from memory. When modeling the overall distribution of VWM errors, we found that the variable-precision resource model outperformed the discrete model. However, VWM errors revealed a pronounced “oblique effect”, with larger errors for oblique than cardinal orientations. After this source of variability was incorporated into both models, we found that the discrete model provided a better account of VWM errors. Our results demonstrate that variable precision across the stimulus space can lead to an unwarranted advantage for resource models that assume stochastically variable precision. When these deterministic sources are adequately modeled, human working memory performance reveals evidence of a discrete capacity limit. PMID:28004957

  3. Accounting for stimulus-specific variation in precision reveals a discrete capacity limit in visual working memory.

    PubMed

    Pratte, Michael S; Park, Young Eun; Rademaker, Rosanne L; Tong, Frank

    2017-01-01

    If we view a visual scene that contains many objects, then momentarily close our eyes, some details persist while others seem to fade. Discrete models of visual working memory (VWM) assume that only a few items can be actively maintained in memory, beyond which pure guessing will emerge. Alternatively, continuous resource models assume that all items in a visual scene can be stored with some precision. Distinguishing between these competing models is challenging, however, as resource models that allow for stochastically variable precision (across items and trials) can produce error distributions that resemble random guessing behavior. Here, we evaluated the hypothesis that a major source of variability in VWM performance arises from systematic variation in precision across the stimuli themselves; such stimulus-specific variability can be incorporated into both discrete-capacity and variable-precision resource models. Participants viewed multiple oriented gratings, and then reported the orientation of a cued grating from memory. When modeling the overall distribution of VWM errors, we found that the variable-precision resource model outperformed the discrete model. However, VWM errors revealed a pronounced "oblique effect," with larger errors for oblique than cardinal orientations. After this source of variability was incorporated into both models, we found that the discrete model provided a better account of VWM errors. Our results demonstrate that variable precision across the stimulus space can lead to an unwarranted advantage for resource models that assume stochastically variable precision. When these deterministic sources are adequately modeled, human working memory performance reveals evidence of a discrete capacity limit. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
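
    A minimal sketch of the discrete-capacity ("slots plus guessing") response model referred to above, on simulated data; parameter values are illustrative, not fitted values from this study:

      import numpy as np

      # With probability p_mem the probed item is in memory and the report
      # error is von Mises noise; otherwise the response is a uniform guess.
      rng = np.random.default_rng(2)

      def simulate_errors(n, p_mem=0.6, kappa=8.0):
          in_mem = rng.random(n) < p_mem
          noisy = rng.vonmises(0.0, kappa, n)        # remembered: concentrated errors
          guess = rng.uniform(-np.pi, np.pi, n)      # forgotten: flat errors
          return np.where(in_mem, noisy, guess)

      errors = simulate_errors(10_000)
      # Fitting p_mem and kappa (and, per this paper, stimulus-specific kappa)
      # is what distinguishes the discrete model from variable-precision models.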

  4. The Comparison of Relationship Beliefs and Couples Burnout in women who apply for Divorce and Women Who Want to Continue their Marital Life

    PubMed Central

    Adibrad, Nastaran; Sedgh poor, Bahram Saleh

    2010-01-01

    Objective: The aim of this study was to compare relationship beliefs and couple burnout in women who had applied for divorce and women who wanted to continue their marital life. Method: For this study, 50 women who had been referred to judicial centers and 50 women who stated that they wanted to continue their marital life were randomly selected. Participants were asked to complete the Relationship Beliefs Inventory and a marital burnout questionnaire. Descriptive statistical methods such as standard deviation, mean, Student's t-test for independent groups, correlation, multi-variable regression and the independent groups' correlation difference test were used. Results: The relationship beliefs of the 2 groups (those wanting to divorce and women wanting to continue their marital life) differed significantly (p < 0.1). In addition, marital burnout differed significantly between the 2 groups (p < 0.1). Discussion: Women who were about to divorce differed significantly from those who wanted to continue their marital relationship on the general measure of relationship beliefs and on the factors of believing that disagreement is destructive and that partners cannot change their undesirable behaviors. In other words, women who were applying for divorce had more unreasonable thoughts and greater burnout compared to those who wanted to continue their marital life. PMID:22952488

  5. Random parameter models for accident prediction on two-lane undivided highways in India.

    PubMed

    Dinu, R R; Veeraragavan, A

    2011-02-01

    Generalized linear modeling (GLM), with the assumption of a Poisson or negative binomial error structure, has been widely employed in road accident modeling. A number of explanatory variables related to traffic, road geometry, and environment that contribute to accident occurrence have been identified and accident prediction models have been proposed. The accident prediction models reported in the literature largely employ the fixed parameter modeling approach, where the magnitude of influence of an explanatory variable is considered to be fixed for any observation in the population. Similar models have been proposed for Indian highways too, which include additional variables representing traffic composition. The mixed traffic on Indian highways exhibits considerable internal variability, ranging from differences in vehicle types to variability in driver behavior. This could result in variability in the effect of explanatory variables on accidents across locations. Random parameter models, which can capture some of this variability, are expected to be more appropriate for the Indian situation. The present study is an attempt to employ random parameter modeling for accident prediction on two-lane undivided rural highways in India. Three years of accident history, from nearly 200 km of highway segments, is used to calibrate and validate the models. The results of the analysis suggest that the model coefficients for traffic volume, proportion of cars, motorized two-wheelers and trucks in traffic, driveway density, and horizontal and vertical curvature are randomly distributed across locations. The paper concludes with a discussion of the modeling results and the limitations of the present study. Copyright © 2010 Elsevier Ltd. All rights reserved.
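
    For context, a minimal sketch of the fixed-parameter negative binomial model that the random-parameter approach generalizes, fitted to simulated segment data; the covariates are illustrative stand-ins, and a random-parameter fit would additionally let coefficients vary across segments (e.g., via simulated maximum likelihood), which is not shown:

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(3)
      n = 200
      aadt = rng.uniform(2000, 20000, n)           # annual average daily traffic
      drives = rng.poisson(5, n)                   # driveways per segment
      mu = np.exp(-6.0 + 0.8 * np.log(aadt) + 0.05 * drives)
      y = rng.poisson(mu)                          # synthetic 3-year crash counts

      X = sm.add_constant(np.column_stack([np.log(aadt), drives]))
      fit = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=1.0)).fit()
      print(fit.params)                            # ~ [-6.0, 0.8, 0.05]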

  6. Bias, Confounding, and Interaction: Lions and Tigers, and Bears, Oh My!

    PubMed

    Vetter, Thomas R; Mascha, Edward J

    2017-09-01

    Epidemiologists seek to make a valid inference about the causal effect between an exposure and a disease in a specific population, using representative sample data from that population. Clinical researchers likewise seek to make a valid inference about the association between an intervention and outcome(s) in a specific population, based upon their randomly collected, representative sample data. Both do so by using the available data about the sample variable to make a valid estimate about its corresponding or underlying, but unknown, population parameter. Random error in an experiment can be due to the natural, periodic fluctuation or variation in the accuracy or precision of virtually any data sampling technique or health measurement tool or scale. In a clinical research study, random error can be due not only to innate human variability but also to pure chance. Systematic error in an experiment arises from an innate flaw in the data sampling technique or measurement instrument. In the clinical research setting, systematic error is more commonly referred to as systematic bias. The most commonly encountered types of bias in anesthesia, perioperative, critical care, and pain medicine research include recall bias, observational bias (Hawthorne effect), attrition bias, misclassification or informational bias, and selection bias. A confounding variable (confounding factor or confounder) is a variable that is associated with, and correlates (positively or negatively) with, both the exposure of interest and the outcome of interest. Confounding is typically not an issue in a randomized trial because the randomized groups are sufficiently balanced on all potential confounding variables, both observed and nonobserved. However, confounding can be a major problem with any observational (nonrandomized) study. Ignoring confounding in an observational study will often result in a "distorted" or incorrect estimate of the association or treatment effect. Interaction among variables, also known as effect modification, exists when the effect of 1 explanatory variable on the outcome depends on the particular level or value of another explanatory variable. Bias and confounding are common potential explanations for statistically significant associations between exposure and outcome when the true relationship is noncausal. Understanding interactions is vital to proper interpretation of treatment effects. These complex concepts should be consistently and appropriately considered whenever one is not only designing but also analyzing and interpreting data from a randomized trial or observational study.

  7. Knee strength retention and analgesia with continuous perineural fentanyl infusion after total knee replacement: randomized controlled trial.

    PubMed

    Mangar, Devanand; Karlnoski, Rachel A; Sprenker, Collin J; Downes, Katheryne L; Taffe, Narrene; Wainwright, Robert; Gustke, Kenneth; Bernasek, Thomas L; Camporesi, Enrico

    2014-04-01

    Despite providing adequate pain relief, a femoral nerve block can induce postoperative muscle weakness after total knee arthroplasty (TKA). Fentanyl has been shown to have peripheral effects but has not been used alone as a perineural infusate after TKA. Sixty patients scheduled for TKA were randomized to one of three blinded groups: a continuous 24 h infusion of either fentanyl 3 μg/ml, ropivacaine 0.1%, or 0.9% normal saline through a femoral nerve sheath catheter at 10 ml/h. The main outcome was maximum voluntary isometric contraction (MVIC) of the quadriceps femoris (knee extension), measured by a handheld dynamometer (Nm/kg). Other variables assessed were preoperative and postoperative visual analog scale (VAS) scores, hamstrings MVIC (knee flexion), active range of motion of the operative knee, distance ambulated, incidence of knee buckling, supplemental morphine usage, postoperative side effects, and serum fentanyl levels. Quadriceps MVIC values were significantly greater in the fentanyl group than in the ropivacaine group (median values, 0.08 vs. 0.03 Nm/kg; p = 0.028). The incidence of postoperative knee buckling upon ambulation was higher in the ropivacaine group than in the fentanyl group, although not statistically significantly so (40% vs. 15%, respectively; p = 0.077). VAS scores while ambulating were not significantly different between the fentanyl group and the ropivacaine group (p = 0.270). Postoperative morphine consumption, nausea and vomiting, and resting VAS scores were similar among the three groups. A continuous perineural infusion of fentanyl produced greater strength retention than ropivacaine post-TKA.

  8. Measures of Residual Risk with Connections to Regression, Risk Tracking, Surrogate Models, and Ambiguity

    DTIC Science & Technology

    2015-04-15

    The random variable of interest is viewed in concert with a related random vector that helps to manage, predict, and mitigate the risk in the original variable. Residual risk can be exemplified as a quantification of the improved situation faced...

  9. Secondary outcome analysis for data from an outcome-dependent sampling design.

    PubMed

    Pan, Yinghao; Cai, Jianwen; Longnecker, Matthew P; Zhou, Haibo

    2018-04-22

    An outcome-dependent sampling (ODS) scheme is a cost-effective way to conduct a study. For a study with a continuous primary outcome, an ODS scheme can be implemented in which the expensive exposure is measured only on a simple random sample plus supplemental samples selected from the 2 tails of the primary outcome variable. With the tremendous cost invested in collecting the primary exposure information, investigators often would like to use the available data to study the relationship between a secondary outcome and the obtained exposure variable. This is referred to as secondary analysis. Secondary analysis in ODS designs can be tricky, as the ODS sample is not a random sample from the general population. In this article, we use inverse probability weighted and augmented inverse probability weighted estimating equations to analyze the secondary outcome for data obtained from the ODS design. We do not make any parametric assumptions on the primary and secondary outcomes and only specify the form of the regression mean models, thus allowing an arbitrary error distribution. Our approach is robust to second- and higher-order moment misspecification. It also leads to more precise estimates of the parameters by effectively using all the available participants. Through simulation studies, we show that the proposed estimator is consistent and asymptotically normal. Data from the Collaborative Perinatal Project are analyzed to illustrate our method. Copyright © 2018 John Wiley & Sons, Ltd.
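
    A minimal sketch of the IPW building block on simulated data: subjects are kept with known, design-based probabilities that oversample the tails of the outcome driving the design, and each kept subject is weighted by 1/pi_i. For simplicity a single regression is shown; in the paper the design depends on the primary outcome while interest lies in a secondary one, and the augmented (AIPW) version adds a regression augmentation term not shown here:

      import numpy as np

      rng = np.random.default_rng(4)
      n = 2000
      x = rng.normal(size=n)                        # expensive exposure
      y = 1.0 + 2.0 * x + rng.normal(size=n)        # outcome driving the design
      pi = np.where(np.abs(y - y.mean()) > y.std(), 0.9, 0.2)  # tails oversampled
      keep = rng.random(n) < pi                     # realized ODS sample

      X = np.column_stack([np.ones(keep.sum()), x[keep]])
      w = 1.0 / pi[keep]                            # inverse probability weights
      Xw = X * w[:, None]
      beta = np.linalg.solve(Xw.T @ X, Xw.T @ y[keep])  # weighted normal equations
      print(beta)                                   # ~ [1.0, 2.0]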

  10. Semiparametric methods for estimation of a nonlinear exposure-outcome relationship using instrumental variables with application to Mendelian randomization.

    PubMed

    Staley, James R; Burgess, Stephen

    2017-05-01

    Mendelian randomization, the use of genetic variants as instrumental variables (IV), can test for and estimate the causal effect of an exposure on an outcome. Most IV methods assume that the function relating the exposure to the expected value of the outcome (the exposure-outcome relationship) is linear. However, in practice, this assumption may not hold. Indeed, often the primary question of interest is to assess the shape of this relationship. We present two novel IV methods for investigating the shape of the exposure-outcome relationship: a fractional polynomial method and a piecewise linear method. We divide the population into strata using the exposure distribution, and estimate a causal effect, referred to as a localized average causal effect (LACE), in each stratum of population. The fractional polynomial method performs metaregression on these LACE estimates. The piecewise linear method estimates a continuous piecewise linear function, the gradient of which is the LACE estimate in each stratum. Both methods were demonstrated in a simulation study to estimate the true exposure-outcome relationship well, particularly when the relationship was a fractional polynomial (for the fractional polynomial method) or was piecewise linear (for the piecewise linear method). The methods were used to investigate the shape of relationship of body mass index with systolic blood pressure and diastolic blood pressure. © 2017 The Authors Genetic Epidemiology Published by Wiley Periodicals, Inc.
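
    A minimal sketch of the stratified LACE construction shared by both methods, on simulated data with a quadratic true effect; strata are formed on the "IV-free" exposure (exposure minus its genetic component), and the within-stratum IV ratio estimates the local gradient of the exposure-outcome curve:

      import numpy as np

      rng = np.random.default_rng(5)
      n = 50_000
      g = rng.binomial(2, 0.3, n)                     # genetic instrument
      u = rng.normal(size=n)                          # unmeasured confounder
      x = 0.5 * g + u + rng.normal(size=n)            # exposure
      y = 0.3 * x**2 + u + rng.normal(size=n)         # outcome, nonlinear in x

      beta_gx = np.polyfit(g, x, 1)[0]
      x0 = x - beta_gx * g                            # IV-free exposure defines strata
      stratum = np.digitize(x0, np.quantile(x0, [0.25, 0.5, 0.75]))
      for k in range(4):
          s = stratum == k
          lace = np.polyfit(g[s], y[s], 1)[0] / np.polyfit(g[s], x[s], 1)[0]
          print(f"stratum {k}: LACE ~ {lace:.2f}, mean exposure {x[s].mean():.2f}")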

  11. Assessing mediation using marginal structural models in the presence of confounding and moderation

    PubMed Central

    Coffman, Donna L.; Zhong, Wei

    2012-01-01

    This paper presents marginal structural models (MSMs) with inverse propensity weighting (IPW) for assessing mediation. Generally, individuals are not randomly assigned to levels of the mediator. Therefore, confounders of the mediator and outcome may exist that limit causal inferences, a goal of mediation analysis. Either regression adjustment or IPW can be used to take confounding into account, but IPW has several advantages. Regression adjustment of even one confounder of the mediator and outcome that has been influenced by treatment results in biased estimates of the direct effect (i.e., the effect of treatment on the outcome that does not go through the mediator). One advantage of IPW is that it can properly adjust for this type of confounding, assuming there are no unmeasured confounders. Further, we illustrate that IPW estimation provides unbiased estimates of all effects when there is a baseline moderator variable that interacts with the treatment, when there is a baseline moderator variable that interacts with the mediator, and when the treatment interacts with the mediator. IPW estimation also provides unbiased estimates of all effects in the presence of non-randomized treatments. In addition, for testing mediation we propose a test of the null hypothesis of no mediation. Finally, we illustrate this approach with an empirical data set in which the mediator is continuous, as is often the case in psychological research. PMID:22905648
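
    A minimal sketch of IPW for a continuous mediator on simulated data, using stabilized weights built from normal density models; variable names and the data-generating step are ours, and a full MSM analysis would also check weight behavior and model specification:

      import numpy as np
      import statsmodels.api as sm
      from scipy.stats import norm

      rng = np.random.default_rng(6)
      n = 2000
      t = rng.binomial(1, 0.5, n)                  # randomized treatment
      c = rng.normal(size=n)                       # confounder of the m -> y path
      m = 0.5 * t + 0.4 * c + rng.normal(size=n)   # mediator
      y = 0.3 * t + 0.6 * m + 0.5 * c + rng.normal(size=n)

      X = sm.add_constant(np.column_stack([t, c]))
      cond = sm.OLS(m, X).fit()                    # model for f(m | t, c)
      dens_cond = norm.pdf(m, cond.fittedvalues, np.sqrt(cond.scale))
      dens_marg = norm.pdf(m, m.mean(), m.std())   # numerator stabilizes the weights
      w = dens_marg / dens_cond

      msm = sm.WLS(y, sm.add_constant(np.column_stack([t, m])), weights=w).fit()
      print(msm.params)                            # ~ [intercept, direct t, mediator m]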

  12. Semiparametric methods for estimation of a nonlinear exposure‐outcome relationship using instrumental variables with application to Mendelian randomization

    PubMed Central

    Staley, James R.

    2017-01-01

    ABSTRACT Mendelian randomization, the use of genetic variants as instrumental variables (IV), can test for and estimate the causal effect of an exposure on an outcome. Most IV methods assume that the function relating the exposure to the expected value of the outcome (the exposure‐outcome relationship) is linear. However, in practice, this assumption may not hold. Indeed, often the primary question of interest is to assess the shape of this relationship. We present two novel IV methods for investigating the shape of the exposure‐outcome relationship: a fractional polynomial method and a piecewise linear method. We divide the population into strata using the exposure distribution, and estimate a causal effect, referred to as a localized average causal effect (LACE), in each stratum of population. The fractional polynomial method performs metaregression on these LACE estimates. The piecewise linear method estimates a continuous piecewise linear function, the gradient of which is the LACE estimate in each stratum. Both methods were demonstrated in a simulation study to estimate the true exposure‐outcome relationship well, particularly when the relationship was a fractional polynomial (for the fractional polynomial method) or was piecewise linear (for the piecewise linear method). The methods were used to investigate the shape of relationship of body mass index with systolic blood pressure and diastolic blood pressure. PMID:28317167

  13. Effect of an orientation group for patients with chronic heart failure: randomized controlled trial 1

    PubMed Central

    Arruda, Cristina Silva; Pereira, Juliana de Melo Vellozo; Figueiredo, Lyvia da Silva; Scofano, Bruna dos Santos; Flores, Paula Vanessa Peclat; Cavalcanti, Ana Carla Dantas

    2018-01-01

    ABSTRACT Objective: To evaluate the effect of the orientation group on therapeutic adherence and self-care among patients with chronic heart failure. Method: Randomized controlled trial with 27 patients with chronic heart failure. The intervention group received nursing consultations and participated in group meetings with the multi-professional team. The control group received only nursing consultations over a period of four months. Questionnaires validated for use in Brazil were applied at the beginning and at the end of the study to assess self-care outcomes and adherence to treatment. Categorical variables were expressed through frequency and percentage distributions and continuous variables through mean and standard deviation. The comparison between the initial and final scores of the intervention and control groups was done using Student's t-test. Results: The mean adherence in the intervention group was 13.9 ± 3.6 before the study and 4.8 ± 2.3 after the study. In the control group it was 14.2 ± 3.4 before the study and 14.7 ± 3.5 after the study. The self-care confidence score was lower after the intervention (p=0.01). Conclusion: The orientation group does not improve adherence to treatment and self-care management and maintenance, and it may reduce confidence in self-care. Registry REBEC RBR-7r9f2m. PMID:29319747

  14. Effect of an orientation group for patients with chronic heart failure: randomized controlled trial.

    PubMed

    Arruda, Cristina Silva; Pereira, Juliana de Melo Vellozo; Figueiredo, Lyvia da Silva; Scofano, Bruna Dos Santos; Flores, Paula Vanessa Peclat; Cavalcanti, Ana Carla Dantas

    2018-01-08

    To evaluate the effect of the orientation group on therapeutic adherence and self-care among patients with chronic heart failure. Randomized controlled trial with 27 patients with chronic heart failure. The intervention group received nursing consultations and participated in group meetings with the multi-professional team. The control group received only nursing consultations over a period of four months. Questionnaires validated for use in Brazil were applied at the beginning and at the end of the study to assess self-care outcomes and adherence to treatment. Categorical variables were expressed through frequency and percentage distributions and continuous variables through mean and standard deviation. The comparison between the initial and final scores of the intervention and control groups was done using Student's t-test. The mean adherence in the intervention group was 13.9 ± 3.6 before the study and 4.8 ± 2.3 after the study. In the control group it was 14.2 ± 3.4 before the study and 14.7 ± 3.5 after the study. The self-care confidence score was lower after the intervention (p=0.01). The orientation group does not improve adherence to treatment and self-care management and maintenance, and it may reduce confidence in self-care. Registry REBEC RBR-7r9f2m.

  15. Sums and Products of Jointly Distributed Random Variables: A Simplified Approach

    ERIC Educational Resources Information Center

    Stein, Sheldon H.

    2005-01-01

    Three basic theorems concerning expected values and variances of sums and products of random variables play an important role in mathematical statistics and its applications in education, business, the social sciences, and the natural sciences. A solid understanding of these theorems requires that students be familiar with the proofs of these…
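
    The three classic results are easy to verify numerically; a minimal sketch for independent X and Y:

      import numpy as np

      # E[X+Y] = E[X]+E[Y]; Var(X+Y) = Var(X)+Var(Y); E[XY] = E[X]E[Y]
      # (the latter two requiring independence).
      rng = np.random.default_rng(7)
      x = rng.normal(2.0, 1.0, 1_000_000)
      y = rng.uniform(0.0, 4.0, 1_000_000)   # independent of x

      print((x + y).mean(), x.mean() + y.mean())
      print((x + y).var(), x.var() + y.var())
      print((x * y).mean(), x.mean() * y.mean())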

  16. A Strategy to Use Soft Data Effectively in Randomized Controlled Clinical Trials.

    ERIC Educational Resources Information Center

    Kraemer, Helena Chmura; Thiemann, Sue

    1989-01-01

    Sees soft data, measures having substantial intrasubject variability due to errors of measurement or response inconsistency, as important measures of response in randomized clinical trials. Shows that using intensive design and slope of response on time as outcome measure maximizes sample retention and decreases within-group variability, thus…

  17. Statistical Analysis for Multisite Trials Using Instrumental Variables with Random Coefficients

    ERIC Educational Resources Information Center

    Raudenbush, Stephen W.; Reardon, Sean F.; Nomi, Takako

    2012-01-01

    Multisite trials can clarify the average impact of a new program and the heterogeneity of impacts across sites. Unfortunately, in many applications, compliance with treatment assignment is imperfect. For these applications, we propose an instrumental variable (IV) model with person-specific and site-specific random coefficients. Site-specific IV…

  18. Bayesian approach to non-Gaussian field statistics for diffusive broadband terahertz pulses.

    PubMed

    Pearce, Jeremy; Jian, Zhongping; Mittleman, Daniel M

    2005-11-01

    We develop a closed-form expression for the probability distribution function for the field components of a diffusive broadband wave propagating through a random medium. We consider each spectral component to provide an individual observation of a random variable, the configurationally averaged spectral intensity. Since the intensity determines the variance of the field distribution at each frequency, this random variable serves as the Bayesian prior that determines the form of the non-Gaussian field statistics. This model agrees well with experimental results.
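
    A minimal sketch of the compound (scale-mixture) construction: conditionally Gaussian field components whose variance is set by a random spectral intensity. The exponential prior here is purely illustrative, not the paper's measured intensity distribution:

      import numpy as np

      rng = np.random.default_rng(8)
      n = 200_000
      intensity = rng.exponential(1.0, n)              # random variance (the Bayesian prior)
      field = rng.normal(0.0, np.sqrt(intensity / 2))  # E ~ N(0, I/2) given I

      kurt = ((field - field.mean()) ** 4).mean() / field.var() ** 2 - 3.0
      print("excess kurtosis:", kurt)                  # > 0: heavier-than-Gaussian tails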

  19. Continuously-Variable Positive-Mesh Power Transmission

    NASA Technical Reports Server (NTRS)

    Johnson, J. L.

    1982-01-01

    Proposed transmission with continuously-variable speed ratio couples two mechanical trigonometric-function generators. Transmission is expected to handle higher loads than conventional variable-pulley drives and, unlike a variable pulley, provides positive traction through the entire drive train, with no reliance on friction to transmit power. Able to vary speed continuously through zero and into reverse. Possible applications in instrumentation where drive-train slippage cannot be tolerated.

  20. Characterization of cancer and normal tissue fluorescence through wavelet transform and singular value decomposition

    NASA Astrophysics Data System (ADS)

    Gharekhan, Anita H.; Biswal, Nrusingh C.; Gupta, Sharad; Pradhan, Asima; Sureshkumar, M. B.; Panigrahi, Prasanta K.

    2008-02-01

    The statistical and characteristic features of the polarized fluorescence spectra from cancerous, normal and benign human breast tissues are studied through wavelet transform and singular value decomposition. The discrete wavelets enabled one to isolate high- and low-frequency spectral fluctuations, which revealed substantial randomization in the cancerous tissues that was not present in the normal cases. In particular, the fluctuations fitted well with a Gaussian distribution for the cancerous tissues in the perpendicular component. One finds non-Gaussian behavior for the spectral variations of normal and benign tissues. The study of the difference of intensities in the parallel and perpendicular channels, which is free from the diffusive component, revealed weak fluorescence activity in the 630 nm domain for the cancerous tissues. This may be ascribable to porphyrin emission. The role of both scatterers and fluorophores in the observed minor intensity peak for the cancer case is experimentally confirmed through tissue-phantom experiments. The continuous Morlet wavelet also highlighted this domain for the cancerous tissue fluorescence spectra. Correlation in the spectral fluctuations is further studied in different tissue types through singular value decomposition. Apart from identifying different domains of spectral activity for diseased and non-diseased tissues, we found random matrix support for the spectral fluctuations. The small eigenvalues of the perpendicular polarized fluorescence spectra of cancerous tissues fitted remarkably well with the random matrix prediction for Gaussian random variables, confirming our observations about spectral fluctuations in the wavelet domain.
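
    A minimal sketch of the wavelet step on a synthetic spectrum, using the PyWavelets package to split a spectrum into a smooth trend and its fluctuation scales; the wavelet choice and decomposition level are ours:

      import numpy as np
      import pywt

      rng = np.random.default_rng(11)
      wavelength = np.linspace(500, 700, 512)
      spectrum = np.exp(-((wavelength - 580.0) / 40.0) ** 2) + 0.02 * rng.standard_normal(512)

      coeffs = pywt.wavedec(spectrum, "db4", level=4)
      details = coeffs[1:]                               # high-frequency fluctuation scales
      fluct = pywt.waverec([np.zeros_like(coeffs[0])] + details, "db4")
      print("fluctuation std:", fluct.std())             # Gaussianity of this can be tested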

  1. Pharmacokinetics of S-ketamine during prolonged sedation at the pediatric intensive care unit.

    PubMed

    Flint, Robert B; Brouwer, Carole N M; Kränzlin, Anne S C; Lie-A-Huen, Loraine; Bos, Albert P; Mathôt, Ron A A

    2017-11-01

    S-ketamine is the S(+)-enantiomer of the racemic mixture ketamine, an anesthetic drug providing both sedation and analgesia. In clinical practice, significant interpatient variability in the drug effect of S-ketamine is observed during long-term sedation. The aim of this study was to evaluate the pharmacokinetic variability of S-ketamine in children aged 0-18 years during long-term sedation. Twenty-five children (median age: 0.42 years, range: 0.02-12.5) received continuous intravenous administrations of 0.3-3.6 mg/kg/h S-ketamine for sedation during mechanical ventilation. Infusion rates were adjusted to the desired level of sedation and analgesia based on the COMFORT-B score and Visual Analog Scale. Blood samples were drawn once daily at random time-points, and at 1 and 4 hours after discontinuation of S-ketamine infusion. Time profiles of plasma concentrations of S-ketamine and its active metabolite S-norketamine were analyzed using nonlinear mixed-effects modeling software. Clearance and volume of distribution were allometrically scaled using the ¾ power model. A total of 86 blood samples were collected. A 2-compartment and a 1-compartment model adequately described the PK of S-ketamine and S-norketamine, respectively. The typical parameter estimates for clearance and the central and peripheral volumes of distribution were: CL(S-ketamine) = 112 L/h/70 kg, V1(S-ketamine) = 7.7 L/70 kg, V2(S-ketamine) = 545 L/70 kg, Q(S-ketamine) = 196 L/h/70 kg, and CL(S-norketamine) = 53 L/h/70 kg. Interpatient variability of CL(S-ketamine) and CL(S-norketamine) was considerable, with values of 40% and 104%, respectively, leading to marked variability in steady-state plasma concentrations. Substantial interpatient variability in pharmacokinetics in children complicates the development of an adequate dosage regimen for continuous sedation. © 2017 John Wiley & Sons Ltd.
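
    A minimal sketch of the allometric scaling convention behind the "per 70 kg" values reported above: clearances scale with weight to the 0.75 power and volumes linearly with weight. The typical values are the abstract's; the function itself is ours:

      # Allometric scaling to a standard 70 kg adult (3/4 power model for CL).
      def scale_pk(wt_kg):
          cl_sket = 112.0 * (wt_kg / 70.0) ** 0.75   # S-ketamine clearance, L/h
          v1_sket = 7.7 * (wt_kg / 70.0)             # central volume, L
          return cl_sket, v1_sket

      print(scale_pk(5.0))   # e.g., a 5 kg infant: (~15.5 L/h, ~0.55 L)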

  2. Optimal estimation of spatially variable recharge and transmissivity fields under steady-state groundwater flow. Part 1. Theory

    NASA Astrophysics Data System (ADS)

    Graham, Wendy D.; Tankersley, Claude D.

    1994-05-01

    Stochastic methods are used to analyze two-dimensional steady groundwater flow subject to spatially variable recharge and transmissivity. Approximate partial differential equations are developed for the covariances and cross-covariances between the random head, transmissivity and recharge fields. Closed-form solutions of these equations are obtained using Fourier transform techniques. The resulting covariances and cross-covariances can be incorporated into a Bayesian conditioning procedure which provides optimal estimates of the recharge, transmissivity and head fields given available measurements of any or all of these random fields. Results show that head measurements contain valuable information for estimating the random recharge field. However, when recharge is treated as a spatially variable random field, the value of head measurements for estimating the transmissivity field can be reduced considerably. In a companion paper, the method is applied to a case study of the Upper Floridan Aquifer in NE Florida.
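
    A minimal sketch of the conditioning step, assuming the covariances and cross-covariances have already been obtained from the closed-form solutions; the matrices below are illustrative, not values from the paper:

      import numpy as np

      # Linear Bayesian update: head measurements h_obs update the
      # transmissivity estimate through the cross-covariance C_th.
      C_tt = np.array([[1.0, 0.3], [0.3, 1.0]])        # prior covariance of T at 2 points
      C_th = np.array([[0.4, 0.1], [0.1, 0.4]])        # cross-covariance of T and h
      C_hh = np.array([[0.5, 0.2], [0.2, 0.5]])        # covariance of h at 2 wells
      mu_t = np.zeros(2)
      mu_h = np.zeros(2)
      h_obs = np.array([0.8, -0.2])

      K = C_th @ np.linalg.inv(C_hh)                   # gain matrix
      mu_post = mu_t + K @ (h_obs - mu_h)              # conditional (kriged) mean
      C_post = C_tt - K @ C_th.T                       # reduced uncertainty
      print(mu_post, np.diag(C_post))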

  3. Random Effects: Variance Is the Spice of Life.

    PubMed

    Jupiter, Daniel C

    Covariates in regression analyses allow us to understand how independent variables of interest impact our dependent outcome variable. Often, we consider fixed effects covariates (e.g., gender or diabetes status) for which we examine subjects at each value of the covariate. We examine both men and women and, within each gender, examine both diabetic and nondiabetic patients. Occasionally, however, we consider random effects covariates for which we do not examine subjects at every value. For example, we examine patients from only a sample of hospitals and, within each hospital, examine both diabetic and nondiabetic patients. The random sampling of hospitals is in contrast to the complete coverage of all genders. In this column I explore the differences in meaning and analysis when thinking about fixed and random effects variables. Copyright © 2016 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.
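
    To make the distinction concrete, here is an illustrative mixed model (not from the column): a random intercept for hospital, because hospitals are only a sample of all hospitals, alongside a fixed effect for diabetes status. Data are simulated and statsmodels is assumed.

```python
# Hedged sketch: random intercept per hospital + fixed diabetes effect.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_hospitals, n_per = 20, 50
hospital = np.repeat(np.arange(n_hospitals), n_per)
hospital_effect = rng.normal(0.0, 2.0, n_hospitals)[hospital]  # random effect
diabetes = rng.integers(0, 2, n_hospitals * n_per)             # fixed effect
outcome = 10 + 1.5 * diabetes + hospital_effect + rng.normal(0, 1, hospital.size)

df = pd.DataFrame({"outcome": outcome, "diabetes": diabetes, "hospital": hospital})
model = smf.mixedlm("outcome ~ diabetes", df, groups=df["hospital"]).fit()
print(model.summary())  # reports the fixed effect and the hospital variance
```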

  4. Continuous-variable controlled-Z gate using an atomic ensemble

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang Mingfeng; Jiang Nianquan; Jin Qingli

    2011-06-15

    The continuous-variable controlled-Z gate is a canonical two-mode gate for universal continuous-variable quantum computation. It is considered as one of the most fundamental continuous-variable quantum gates. Here we present a scheme for realizing continuous-variable controlled-Z gate between two optical beams using an atomic ensemble. The gate is performed by simply sending the two beams propagating in two orthogonal directions twice through a spin-squeezed atomic medium. Its fidelity can run up to one if the input atomic state is infinitely squeezed. Considering the noise effects due to atomic decoherence and light losses, we show that the observed fidelities of the schememore » are still quite high within presently available techniques.« less

  5. Sampling Strategies for Evaluating the Rate of Adventitious Transgene Presence in Non-Genetically Modified Crop Fields.

    PubMed

    Makowski, David; Bancal, Rémi; Bensadoun, Arnaud; Monod, Hervé; Messéan, Antoine

    2017-09-01

    According to E.U. regulations, the maximum allowable rate of adventitious transgene presence in non-genetically modified (GM) crops is 0.9%. We compared four sampling methods for the detection of transgenic material in agricultural non-GM maize fields: random sampling, stratified sampling, random sampling + ratio reweighting, random sampling + regression reweighting. Random sampling involves simply sampling maize grains from different locations selected at random from the field concerned. The stratified and reweighting sampling methods make use of an auxiliary variable corresponding to the output of a gene-flow model (a zero-inflated Poisson model) simulating cross-pollination as a function of wind speed, wind direction, and distance to the closest GM maize field. With the stratified sampling method, an auxiliary variable is used to define several strata with contrasting transgene presence rates, and grains are then sampled at random from each stratum. With the two methods involving reweighting, grains are first sampled at random from various locations within the field, and the observations are then reweighted according to the auxiliary variable. Data collected from three maize fields were used to compare the four sampling methods, and the results were used to determine the extent to which transgene presence rate estimation was improved by the use of stratified and reweighting sampling methods. We found that transgene rate estimates were more accurate and that substantially smaller samples could be used with sampling strategies based on an auxiliary variable derived from a gene-flow model. © 2017 Society for Risk Analysis.
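
    A hedged sketch of the ratio-reweighting idea on synthetic numbers: sampled rates are adjusted by an auxiliary variable whose field-wide mean is known, standing in for the gene-flow model output. The study's actual zero-inflated Poisson gene-flow model is not reproduced.

```python
# Ratio estimator: y_bar / x_bar * X_bar_population, with x an auxiliary
# variable correlated with the transgene presence rate y. All data synthetic.
import numpy as np

rng = np.random.default_rng(2)
n_field, n_sample = 10_000, 50

x_field = rng.gamma(shape=0.5, scale=0.004, size=n_field)            # model-predicted rate
y_field = np.clip(x_field + rng.normal(0, 0.001, n_field), 0, None)  # "true" rate

idx = rng.choice(n_field, n_sample, replace=False)
y_bar, x_bar = y_field[idx].mean(), x_field[idx].mean()

plain_estimate = y_bar
ratio_estimate = y_bar / x_bar * x_field.mean()  # reweight by known auxiliary mean

print(f"true mean rate : {y_field.mean():.5f}")
print(f"plain sample   : {plain_estimate:.5f}")
print(f"ratio estimate : {ratio_estimate:.5f}")
```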

  6. Continuous-variable quantum authentication of physical unclonable keys

    NASA Astrophysics Data System (ADS)

    Nikolopoulos, Georgios M.; Diamanti, Eleni

    2017-04-01

    We propose a scheme for authentication of physical keys that are materialized by optical multiple-scattering media. The authentication relies on the optical response of the key when probed by randomly selected coherent states of light, and on the use of standard wavefront-shaping techniques that direct the scattered photons coherently to a specific target mode at the output. The quadratures of the electromagnetic field of the scattered light at the target mode are analysed using a homodyne detection scheme, and acceptance or rejection of the key is decided on the basis of the measurement outcomes. The proposed scheme can be implemented with current technology and offers collision resistance and robustness against key cloning.

  7. Lindley frailty model for a class of compound Poisson processes

    NASA Astrophysics Data System (ADS)

    Kadilar, Gamze Özel; Ata, Nihal

    2013-10-01

    The Lindley distribution has gained importance in survival analysis owing to its similarity to the exponential distribution and its allowance for different shapes of the hazard function. Frailty models provide an alternative to the proportional hazards model in which misspecified or omitted covariates are described by an unobservable random variable. Although the frailty distribution is generally assumed to be continuous, discrete frailty distributions are appropriate in some circumstances. In this paper, frailty models with a discrete compound Poisson process for Lindley-distributed failure times are introduced. Survival functions are derived, and maximum likelihood estimation procedures for the parameters are studied. The fit of the models to an earthquake data set from Turkey is then examined.
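
    For reference, a small sketch of the Lindley distribution named above: its density, survival function, and the resulting non-constant hazard that distinguishes it from the exponential. The parameter value is illustrative.

```python
# Lindley(theta): f(x) = theta^2/(theta+1) * (1+x) * exp(-theta x),
#                 S(x) = (1 + theta x/(theta+1)) * exp(-theta x).
import numpy as np

def lindley_pdf(x, theta):
    return theta**2 / (theta + 1.0) * (1.0 + x) * np.exp(-theta * x)

def lindley_survival(x, theta):
    return (1.0 + theta * x / (theta + 1.0)) * np.exp(-theta * x)

x = np.linspace(0.0, 5.0, 6)
theta = 1.2
hazard = lindley_pdf(x, theta) / lindley_survival(x, theta)  # not constant, unlike exponential
print(np.round(hazard, 3))
```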

  8. Continuous-variable quantum authentication of physical unclonable keys: Security against an emulation attack

    NASA Astrophysics Data System (ADS)

    Nikolopoulos, Georgios M.

    2018-01-01

    We consider a recently proposed entity authentication protocol in which a physical unclonable key is interrogated by random coherent states of light, and the quadratures of the scattered light are analyzed by means of coarse-grained homodyne detection. We derive a sufficient condition for the protocol to be secure against an emulation attack in which an adversary knows the challenge-response properties of the key and, moreover, can access the challenges during verification. The security analysis relies on Holevo's bound and Fano's inequality, and suggests that the protocol is secure against the emulation attack for a broad range of physical parameters that are within reach of today's technology.

  9. Dynamics of non-stationary processes that follow the maximum of the Rényi entropy principle.

    PubMed

    Shalymov, Dmitry S; Fradkov, Alexander L

    2016-01-01

    We propose dynamics equations which describe the behaviour of non-stationary processes that follow the maximum Rényi entropy principle. The equations are derived on the basis of the speed-gradient principle originating in control theory. The maximum Rényi entropy principle is analysed for discrete and continuous cases, using both a discrete random variable and a probability density function (PDF). We consider mass conservation and energy conservation constraints and demonstrate the uniqueness of the limit distribution and asymptotic convergence of the PDF for both cases. The coincidence of the limit distribution of the proposed equations with the Rényi distribution is examined.
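
    A minimal sketch of the Rényi entropy that the dynamics above maximize, recovering the Shannon limit as alpha approaches 1. The speed-gradient equations themselves are not reproduced here.

```python
# Renyi entropy H_alpha(p) = log(sum(p_i^alpha)) / (1 - alpha);
# the alpha -> 1 limit is the Shannon entropy.
import numpy as np

def renyi_entropy(p, alpha):
    p = np.asarray(p, dtype=float)
    if np.isclose(alpha, 1.0):            # Shannon limit
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

p = np.array([0.5, 0.25, 0.125, 0.125])
for a in (0.5, 1.0, 2.0):
    print(f"alpha={a}: H={renyi_entropy(p, a):.4f}")
```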

  10. Dynamics of non-stationary processes that follow the maximum of the Rényi entropy principle

    PubMed Central

    2016-01-01

    We propose dynamics equations which describe the behaviour of non-stationary processes that follow the maximum Rényi entropy principle. The equations are derived on the basis of the speed-gradient principle originating in control theory. The maximum Rényi entropy principle is analysed for discrete and continuous cases, using both a discrete random variable and a probability density function (PDF). We consider mass conservation and energy conservation constraints and demonstrate the uniqueness of the limit distribution and asymptotic convergence of the PDF for both cases. The coincidence of the limit distribution of the proposed equations with the Rényi distribution is examined. PMID:26997886

  11. A prospective, randomized, open-label study comparing the efficacy and safety of preprandial and prandial insulin in combination with acarbose in elderly, insulin-requiring patients with type 2 diabetes mellitus.

    PubMed

    Yang, Guang; Li, Chunlin; Gong, Yanping; Li, Jian; Cheng, Xiaoling; Tian, Hui

    2013-06-01

    By delaying absorption of carbohydrates, acarbose can reduce preprandial hyperglycemia and delay the emergence of postprandial hyperglycemia. To evaluate whether acarbose can shorten the desirable time interval between insulin injection and meals, 60 elderly (≥60 years) patients with unsatisfactorily controlled type 2 diabetes mellitus despite insulin use were enrolled in a randomized, open-label study of 16 weeks' duration. Two groups (n=20 each) were randomized to receive isophane protamine biosynthetic human insulin 70/30 injections twice daily 30 min before meals plus acarbose 50 mg once daily (Group A) or three times daily (Group B) before meals, whereas the third group (n=20) received isophane protamine biosynthetic human insulin 70/30 injections twice daily immediately before meals plus acarbose 50 mg three times daily before meals (Group C). The required insulin dosage at study end was significantly less in Groups B and C than in Group A. Both continuous glucose monitoring data and the patients' self-monitoring data indicated that blood glucose variability parameters were significantly improved in Groups B and C in comparison with Group A, but there were no significant differences between Groups B and C. The incidence of hypoglycemia was low in all three groups. The absence of a significant difference in glucose variability between Groups B and C suggests that the addition of acarbose permitted adjustment of the insulin administration time from 30 min before meals to immediately before meals, which may be more convenient for patients, without affecting glycemic control.

  12. Civamide cream 0.075% in patients with osteoarthritis of the knee: a 12-week randomized controlled clinical trial with a longterm extension.

    PubMed

    Schnitzer, Thomas J; Pelletier, Jean-Pierre; Haselwood, Doug M; Ellison, William T; Ervin, John E; Gordon, Richard D; Lisse, Jeffrey R; Archambault, W Tad; Sampson, Allan R; Fezatte, Heidi B; Phillips, Scott B; Bernstein, Joel E

    2012-03-01

    To evaluate the safety and efficacy of civamide cream 0.075% for the treatment of osteoarthritis (OA) of the knee. We conducted a 12-week, multicenter, randomized, double-blind study with a 52-week open-label extension. Patients with OA of the knee received either civamide cream 0.075% or a lower dose of civamide cream, 0.01%, as the control. The 3 co-primary endpoints in the double-blind study were the time-weighted average (TWA) of change from baseline to Day 84 in the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) pain subscale, the WOMAC physical function subscale, and the Subject Global Evaluation (SGE). In the 52-week open-label extension study, the Osteoarthritis Pain Score and SGE were assessed. A total of 695 patients were randomized to receive civamide cream 0.075% (n = 351) or civamide cream 0.01% (control; n = 344) in the double-blind study. Significance in favor of civamide cream 0.075% was achieved for the TWA for all 3 co-primary efficacy variables: WOMAC pain (p = 0.009), WOMAC physical function (p < 0.001), and SGE (p = 0.008); and at Day 84 for these 3 variables (p = 0.013, p < 0.001, and p = 0.049, respectively). These analyses accounted for significant baseline-by-treatment interactions. In the 52-week open-label extension, efficacy was maintained. Civamide cream 0.075% was well tolerated throughout the studies. These studies demonstrate the efficacy of civamide cream for up to 1 year of continuous use. Civamide cream, with its lack of systemic absorption, does not have the potential for serious systemic toxicity, in contrast to several other OA treatments.

  13. Atomoxetine could improve intra-individual variability in drug-naïve adults with attention-deficit/hyperactivity disorder comparably with methylphenidate: A head-to-head randomized clinical trial.

    PubMed

    Ni, Hsing-Chang; Hwang Gu, Shoou-Lian; Lin, Hsiang-Yuan; Lin, Yu-Ju; Yang, Li-Kuang; Huang, Hui-Chun; Gau, Susan Shur-Fen

    2016-05-01

    Intra-individual variability in reaction time (IIV-RT) is common in individuals with attention-deficit/hyperactivity disorder (ADHD). It can be improved by stimulants. However, the effects of atomoxetine on IIV-RT are inconclusive. We aimed to investigate the effects of atomoxetine on IIV-RT, and directly compared its efficacy with methylphenidate in adults with ADHD. An 8-10 week, open-label, head-to-head, randomized clinical trial was conducted in 52 drug-naïve adults with ADHD, who were randomly assigned to two treatment groups: immediate-release methylphenidate (n=26) thrice daily (10-20 mg per dose) and atomoxetine once daily (n=26) (0.5-1.2 mg/kg/day). IIV-RT, derived from the Conners' continuous performance test (CCPT), was represented by the Gaussian (reaction time standard error, RTSE) and ex-Gaussian models (sigma and tau). Other neuropsychological functions, including response errors and mean reaction time, were also measured. Participants received CCPT assessments at baseline and week 8-10 (60.4±6.3 days). We found comparable improvements in performance on the CCPT between the immediate-release methylphenidate- and atomoxetine-treated groups. Both medications significantly improved IIV-RT in terms of reducing tau values, with comparable efficacy. In addition, both medications significantly improved inhibitory control by reducing commission errors. Our results provide evidence that atomoxetine can improve IIV-RT and inhibitory control, with efficacy comparable to immediate-release methylphenidate, in drug-naïve adults with ADHD. Shared and unique mechanisms underpinning these medication effects on IIV-RT await further investigation. © The Author(s) 2016.

  14. Shake It Off: A Randomized Pilot Study of the Effect of Whole Body Vibration on Pain in Healing Burn Wounds.

    PubMed

    Ray, Juliet J; Alvarez, Angel D; Ulbrich, Sondra L; Lessner-Eisenberg, Sharon; Satahoo, Shevonne S; Meizoso, Jonathan P; Karcutskie, Charles A; Mundra, Leela S; Namias, Nicholas; Pizano, Louis R; Schulman, Carl I

    Whole body vibration (WBV) has been shown to improve strength in extremities with healed burn wounds. We hypothesize that WBV reduces pain during rehabilitation compared to standard therapy alone. Patients with ≥1% TBSA burn to one or more extremities from October 2014 to December 2015 were randomized to vibration (VIBE) or control. Each burned extremity was tested separately within the assigned group. Patients underwent one to three therapy sessions (S1, S2, S3) consisting of five upper and/or lower extremity exercises with or without WBV. Pain was assessed pre-, mid-, and postsession on a scale of 1 to 10. Mean pain scores at S1 to S3 were compared between groups with paired samples t-tests. An independent t-test was used to compare differences in pain scores between groups. Continuous variables were compared using a t-test or Mann-Whitney U test, and categorical variables were compared using a χ² or Fisher's exact test, as appropriate. Forty-eight randomized test extremities (VIBE = 26, control = 22) were analyzed from a total of 31 subjects. There were no significant differences between groups in age, gender, overall TBSA, TBSA in the test extremity, pain medication use before therapy session, or skin grafting before therapy session. At S1, S2, and S3, there was a statistically significant decrease in mid- and postsession pain compared to presession pain in VIBE vs controls. Exposure to WBV decreased pain during and after physical therapy. This modality may be applicable to a variety of soft tissue injuries and warrants additional investigation.

  15. Image discrimination models predict detection in fixed but not random noise

    NASA Technical Reports Server (NTRS)

    Ahumada, A. J. Jr; Beard, B. L.; Watson, A. B. (Principal Investigator)

    1997-01-01

    By means of a two-interval forced-choice procedure, contrast detection thresholds for an aircraft positioned on a simulated airport runway scene were measured with fixed and random white-noise masks. The term fixed noise refers to a constant, or unchanging, noise pattern for each stimulus presentation. The random noise was either the same or different in the two intervals. Contrary to simple image discrimination model predictions, the same random noise condition produced greater masking than the fixed noise. This suggests that observers seem unable to hold a new noisy image for comparison. Also, performance appeared limited by internal process variability rather than by external noise variability, since similar masking was obtained for both random noise types.

  16. Comparison of Random Forest and Parametric Imputation Models for Imputing Missing Data Using MICE: A CALIBER Study

    PubMed Central

    Shah, Anoop D.; Bartlett, Jonathan W.; Carpenter, James; Nicholas, Owen; Hemingway, Harry

    2014-01-01

    Multivariate imputation by chained equations (MICE) is commonly used for imputing missing data in epidemiologic research. The “true” imputation model may contain nonlinearities which are not included in default imputation models. Random forest imputation is a machine learning technique which can accommodate nonlinearities and interactions and does not require a particular regression model to be specified. We compared parametric MICE with a random forest-based MICE algorithm in 2 simulation studies. The first study used 1,000 random samples of 2,000 persons drawn from the 10,128 stable angina patients in the CALIBER database (Cardiovascular Disease Research using Linked Bespoke Studies and Electronic Records; 2001–2010) with complete data on all covariates. Variables were artificially made “missing at random,” and the bias and efficiency of parameter estimates obtained using different imputation methods were compared. Both MICE methods produced unbiased estimates of (log) hazard ratios, but random forest was more efficient and produced narrower confidence intervals. The second study used simulated data in which the partially observed variable depended on the fully observed variables in a nonlinear way. Parameter estimates were less biased using random forest MICE, and confidence interval coverage was better. This suggests that random forest imputation may be useful for imputing complex epidemiologic data sets in which some patients have missing data. PMID:24589914

  17. Comparison of random forest and parametric imputation models for imputing missing data using MICE: a CALIBER study.

    PubMed

    Shah, Anoop D; Bartlett, Jonathan W; Carpenter, James; Nicholas, Owen; Hemingway, Harry

    2014-03-15

    Multivariate imputation by chained equations (MICE) is commonly used for imputing missing data in epidemiologic research. The "true" imputation model may contain nonlinearities which are not included in default imputation models. Random forest imputation is a machine learning technique which can accommodate nonlinearities and interactions and does not require a particular regression model to be specified. We compared parametric MICE with a random forest-based MICE algorithm in 2 simulation studies. The first study used 1,000 random samples of 2,000 persons drawn from the 10,128 stable angina patients in the CALIBER database (Cardiovascular Disease Research using Linked Bespoke Studies and Electronic Records; 2001-2010) with complete data on all covariates. Variables were artificially made "missing at random," and the bias and efficiency of parameter estimates obtained using different imputation methods were compared. Both MICE methods produced unbiased estimates of (log) hazard ratios, but random forest was more efficient and produced narrower confidence intervals. The second study used simulated data in which the partially observed variable depended on the fully observed variables in a nonlinear way. Parameter estimates were less biased using random forest MICE, and confidence interval coverage was better. This suggests that random forest imputation may be useful for imputing complex epidemiologic data sets in which some patients have missing data.
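
    The study's analyses were run in the R mice framework; purely as an illustration, scikit-learn's IterativeImputer can play the same chained-equations role with a random forest as the per-variable model. The data, forest size, and iteration count below are assumptions.

```python
# Hedged sketch: chained-equations imputation with a random forest, on
# synthetic data with a nonlinear dependence (the second simulation's setting).
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
n = 500
x1 = rng.normal(size=n)
x2 = x1**2 + rng.normal(scale=0.3, size=n)  # nonlinear dependence on x1
X = np.column_stack([x1, x2])

X_missing = X.copy()
mask = rng.random(n) < 0.3                  # 30% of x2 made missing at random
X_missing[mask, 1] = np.nan

imputer = IterativeImputer(
    estimator=RandomForestRegressor(n_estimators=50, random_state=0),
    max_iter=5, random_state=0,
)
X_imputed = imputer.fit_transform(X_missing)
rmse = np.sqrt(np.mean((X_imputed[mask, 1] - X[mask, 1]) ** 2))
print(f"imputation RMSE on masked values: {rmse:.3f}")
```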

  18. "Congratulations, you have been randomized into the control group!(?)": issues to consider when recruiting schools for matched-pair randomized control trials of prevention programs.

    PubMed

    Ji, Peter; DuBois, David L; Flay, Brian R; Brechling, Vanessa

    2008-03-01

    Recruiting schools into a matched-pair randomized control trial (MP-RCT) to evaluate the efficacy of a school-level prevention program presents challenges for researchers. We considered which of 2 procedures would be most effective for recruiting schools into the study and assigning them to conditions. In 1 procedure (recruit and match/randomize), we would recruit schools and match them prior to randomization, and in the other (match/randomize and recruit), we would match schools and randomize them prior to recruitment. We considered how each procedure impacted the randomization process and our ability to recruit schools into the study. After implementing the selected procedure, the equivalence of both treatment and control group schools and the participating and nonparticipating schools on school demographic variables was evaluated. We decided on the recruit and match/randomize procedure because we thought it would provide the opportunity to build rapport with the schools and prepare them for the randomization process, thereby increasing the likelihood that they would accept their randomly assigned conditions. Neither the treatment and control group schools nor the participating and nonparticipating schools exhibited statistically significant differences from each other on any of the school demographic variables. Recruitment of schools prior to matching and randomization in an MP-RCT may facilitate the recruitment of schools and thus enhance both the statistical power and the representativeness of study findings. Future research would benefit from the consideration of a broader range of variables (e.g., readiness to implement a comprehensive prevention program) both in matching schools and in evaluating their representativeness to nonparticipating schools.

  19. Random effects coefficient of determination for mixed and meta-analysis models.

    PubMed

    Demidenko, Eugene; Sargent, James; Onega, Tracy

    2012-01-01

    The key feature of a mixed model is the presence of random effects. We have developed a coefficient, called the random effects coefficient of determination, that estimates the proportion of the conditional variance of the dependent variable explained by random effects. This coefficient takes values from 0 to 1 and indicates how strong the random effects are. The difference from the earlier suggested fixed effects coefficient of determination is emphasized. If the coefficient is close to 0, there is weak support for random effects in the model, because the reduction of the variance of the dependent variable due to random effects is small; consequently, random effects may be ignored and the model simplifies to standard linear regression. A value apart from 0 indicates evidence of variance reduction in support of the mixed model. If the random effects coefficient of determination is close to 1, the variance of the random effects is very large and the random effects turn into free fixed effects; the model can then be estimated using the dummy variable approach. We derive explicit formulas for the coefficient in three special cases: the random intercept model, the growth curve model, and the meta-analysis model. Theoretical results are illustrated with three mixed model examples: (1) travel time to the nearest cancer center for women with breast cancer in the U.S., (2) cumulative time watching alcohol-related scenes in movies among young U.S. teens, as a risk factor for early drinking onset, and (3) the classic example of the meta-analysis model for combination of 13 studies on tuberculosis vaccine.

  20. Prospective randomized trial to assess effects of continuing hormone therapy on cerebral function in postmenopausal women at risk for dementia.

    PubMed

    Rasgon, Natalie L; Geist, Cheri L; Kenna, Heather A; Wroolie, Tonita E; Williams, Katherine E; Silverman, Daniel H S

    2014-01-01

    The objective of this study was to examine the effects of estrogen-based hormone therapy (HT) on regional cerebral metabolism in postmenopausal women (mean age = 58, SD = 5) at risk for development of dementia. The prospective clinical trial design included pre- and post-intervention neuroimaging of women randomized to continue (HT+) or discontinue (HT-) therapy following an average of 10 years of use. The primary outcome measure was change in brain metabolism during the subsequent two years, as assessed with fluorodeoxyglucose-18 positron emission tomography (FDG-PET). Longitudinal FDG-PET data were available for 45 study completers. Results showed that women randomized to continue HT experienced relative preservation of frontal and parietal cortical metabolism, compared with women randomized to discontinue HT. Women who discontinued 17-β estradiol (17βE)-based HT, as well as women who continued conjugated equine estrogen (CEE)-based HT, exhibited significant decline in metabolism of the precuneus/posterior cingulate cortical (PCC) area. Significant decline in PCC metabolism was additionally seen in women taking concurrent progestins (with either 17βE or CEE). Together, these findings suggest that among postmenopausal subjects at risk for developing dementia, regional cerebral cortical metabolism is relatively preserved for at least two years in women randomized to continue HT, compared with women randomized to discontinue HT. In addition, continuing unopposed 17βE therapy is associated specifically with preservation of metabolism in PCC, known to undergo the most significant decline in the earliest stages of Alzheimer's disease. ClinicalTrials.gov NCT00097058.

  1. Random vectors and spatial analysis by geostatistics for geotechnical applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, D.S.

    1987-08-01

    Geostatistics is extended to the spatial analysis of vector variables by defining the estimation variance and vector variogram in terms of the magnitude of difference vectors. Many random variables in geotechnology are vectorial rather than scalar, and their structural analysis requires interpolation of sampled variables to construct and characterize structural models. A better local estimator results in higher-quality input models, and geostatistics can provide such estimators: kriging estimators. The efficiency of geostatistics for vector variables is demonstrated in a case study of rock joint orientations in geological formations. The positive cross-validation encourages application of geostatistics to the spatial analysis of random vectors in geoscience as well as various geotechnical fields, including optimum site characterization, rock mechanics for mining and civil structures, cavability analysis of block cavings, petroleum engineering, and hydrologic and hydraulic modeling.
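
    A hedged sketch of the central definition: an empirical variogram for a vector-valued variable, computed from squared magnitudes of difference vectors rather than scalar differences. The locations, vectors, and lag bins below are synthetic.

```python
# Vector variogram: gamma(h) = 0.5 * mean(|v(x) - v(x+h)|^2) over pairs in lag bin h.
import numpy as np

rng = np.random.default_rng(4)
n = 200
coords = rng.uniform(0, 100, size=(n, 2))  # sample locations
vectors = rng.normal(size=(n, 3))          # vector variable (e.g., joint orientation)

def vector_variogram(coords, vectors, bins):
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = np.sum((vectors[:, None, :] - vectors[None, :, :]) ** 2, axis=-1)
    iu = np.triu_indices(len(coords), k=1)  # unique pairs only
    lags, gamma = [], []
    for lo, hi in zip(bins[:-1], bins[1:]):
        sel = (d[iu] >= lo) & (d[iu] < hi)
        if sel.any():
            lags.append(0.5 * (lo + hi))
            gamma.append(0.5 * sq[iu][sel].mean())  # semivariance of difference vectors
    return np.array(lags), np.array(gamma)

lags, gamma = vector_variogram(coords, vectors, bins=np.linspace(0, 50, 11))
print(np.round(gamma, 2))
```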

  2. Stronger steerability criterion for more uncertain continuous-variable systems

    NASA Astrophysics Data System (ADS)

    Chowdhury, Priyanka; Pramanik, Tanumoy; Majumdar, A. S.

    2015-10-01

    We derive a fine-grained uncertainty relation for the measurement of two incompatible observables on a single quantum system of continuous variables, and show that continuous-variable systems are more uncertain than discrete-variable systems. Using the derived fine-grained uncertainty relation, we formulate a stronger steering criterion that is able to reveal the steerability of NOON states, which has hitherto not been possible using other criteria. We further obtain a monogamy relation for our steering inequality which leads to an in-principle improved lower bound on the secret key rate of a one-sided device-independent quantum key distribution protocol for continuous variables.

  3. Robustness of quantum key distribution with discrete and continuous variables to channel noise

    NASA Astrophysics Data System (ADS)

    Lasota, Mikołaj; Filip, Radim; Usenko, Vladyslav C.

    2017-06-01

    We study the robustness of quantum key distribution protocols using discrete or continuous variables to channel noise. We introduce a model of such noise, based on coupling of the signal to a thermal reservoir (typical for continuous-variable quantum key distribution), to the discrete-variable case. We then compare the bounds on tolerable channel noise between these two kinds of protocols using the same noise parametrization, for implementations that are otherwise perfect. The results show that continuous-variable protocols can exhibit similar robustness to channel noise when the transmittance of the channel is relatively high. For strong loss, however, discrete-variable protocols are superior and can overcome even the infinite-squeezing continuous-variable protocol while using limited nonclassical resources. The requirement on the single-photon production probability that a practical photon source would have to fulfill in order to demonstrate such superiority is feasible thanks to the recent rapid development in this field.

  4. Investigation of continuous effect modifiers in a meta-analysis on higher versus lower PEEP in patients requiring mechanical ventilation--protocol of the ICEM study.

    PubMed

    Kasenda, Benjamin; Sauerbrei, Willi; Royston, Patrick; Briel, Matthias

    2014-05-20

    Categorizing an inherently continuous predictor in prognostic analyses raises several critical methodological issues: dependence of the statistical significance on the number and position of the chosen cut-point(s), loss of statistical power, and faulty interpretation of the results if a non-linear association is incorrectly assumed to be linear. This also applies to a therapeutic context where investigators of randomized clinical trials (RCTs) are interested in interactions between treatment assignment and one or more continuous predictors. Our goal is to apply the multivariable fractional polynomial interaction (MFPI) approach to investigate interactions between continuous patient baseline variables and the allocated treatment in an individual patient data meta-analysis of three RCTs (N = 2,299) from the intensive care field. For each study, MFPI will provide a continuous treatment effect function. Functions from each of the three studies will be averaged by a novel meta-analysis approach for functions. We will plot treatment effect functions separately for each study and also the averaged function. The averaged function with a related confidence interval will provide a suitable basis to assess whether a continuous patient characteristic modifies the treatment comparison and may be relevant for clinical decision-making. The compared interventions will be a higher or lower positive end-expiratory pressure (PEEP) ventilation strategy in patients requiring mechanical ventilation. The continuous baseline variables body mass index, PaO2/FiO2, respiratory compliance, and oxygenation index will be the investigated potential effect modifiers. Clinical outcomes for this analysis will be in-hospital mortality, time to death, time to unassisted breathing, and pneumothorax. This project will be the first meta-analysis to combine continuous treatment effect functions derived by the MFPI procedure separately in each of several RCTs. Such an approach requires individual patient data (IPD). They are available from an earlier IPD meta-analysis using different methods for analysis. This new analysis strategy allows assessing whether treatment effects interact with continuous baseline patient characteristics and avoids categorization-based subgroup analyses. These interaction analyses of the present study will be exploratory in nature. However, they may help to foster future research using the MFPI approach to improve interaction analyses of continuous predictors in RCTs and IPD meta-analyses. This study is registered in PROSPERO (CRD42012003129).

  5. Modeling Randomness in Judging Rating Scales with a Random-Effects Rating Scale Model

    ERIC Educational Resources Information Center

    Wang, Wen-Chung; Wilson, Mark; Shih, Ching-Lin

    2006-01-01

    This study presents the random-effects rating scale model (RE-RSM) which takes into account randomness in the thresholds over persons by treating them as random-effects and adding a random variable for each threshold in the rating scale model (RSM) (Andrich, 1978). The RE-RSM turns out to be a special case of the multidimensional random…

  6. The effect of bedding system selected by manual muscle testing on sleep-related cardiovascular functions.

    PubMed

    Kuo, Terry B J; Li, Jia-Yi; Lai, Chun-Ting; Huang, Yu-Chun; Hsu, Ya-Chuan; Yang, Cheryl C H

    2013-01-01

    Different types of mattresses affect sleep quality and waking muscle power. Whether manual muscle testing (MMT) predicts the cardiovascular effects of the bedding system was explored using ten healthy young men. For each participant, two bedding systems, one inducing the strongest limb muscle force (strong bedding system) and the other inducing the weakest limb force (weak bedding system), were identified using MMT. Each bedding system, in total five mattresses and eight pillows of different firmness, was used for two continuous weeks at the participant's home in a random and double-blind sequence. A sleep log, a questionnaire, and a polysomnography were used to differentiate the two bedding systems. Heart rate variability and arterial pressure variability analyses showed that the strong bedding system resulted in decreased cardiovascular sympathetic modulation, increased cardiac vagal activity, and increased baroreceptor reflex sensitivity during sleep as compared to the weak bedding system. Different bedding systems have distinct cardiovascular effects during sleep that can be predicted by MMT.

  7. The Effect of Bedding System Selected by Manual Muscle Testing on Sleep-Related Cardiovascular Functions

    PubMed Central

    Kuo, Terry B. J.; Li, Jia-Yi; Lai, Chun-Ting; Huang, Yu-Chun; Hsu, Ya-Chuan; Yang, Cheryl C. H.

    2013-01-01

    Background. Different types of mattresses affect sleep quality and waking muscle power. Whether manual muscle testing (MMT) predicts the cardiovascular effects of the bedding system was explored using ten healthy young men. Methods. For each participant, two bedding systems, one inducing the strongest limb muscle force (strong bedding system) and the other inducing the weakest limb force (weak bedding system), were identified using MMT. Each bedding system, in total five mattresses and eight pillows of different firmness, was used for two continuous weeks at the participant's home in a random and double-blind sequence. A sleep log, a questionnaire, and a polysomnography were used to differentiate the two bedding systems. Results and Conclusion. Heart rate variability and arterial pressure variability analyses showed that the strong bedding system resulted in decreased cardiovascular sympathetic modulation, increased cardiac vagal activity, and increased baroreceptor reflex sensitivity during sleep as compared to the weak bedding system. Different bedding systems have distinct cardiovascular effects during sleep that can be predicted by MMT. PMID:24371836

  8. A conceptual model for site-level ecology of the giant gartersnake (Thamnophis gigas) in the Sacramento Valley, California

    USGS Publications Warehouse

    Halstead, Brian J.; Wylie, Glenn D.; Casazza, Michael L.; Hansen, Eric C.; Scherer, Rick D.; Patterson, Laura C.

    2015-08-14

    Bayesian networks further provide a clear visual display of the model that facilitates understanding among various stakeholders (Marcot and others, 2001; Uusitalo, 2007). Empirical data and expert judgment can be combined, as continuous or categorical variables, to update knowledge about the system (Marcot and others, 2001; Uusitalo, 2007). Importantly, Bayesian network models allow inference from causes to consequences, but also from consequences to causes, so that data can inform the states of nodes (values of different random variables) in either direction (Marcot and others, 2001; Uusitalo, 2007). Because they can incorporate both decision nodes that represent management actions and utility nodes that quantify the costs and benefits of outcomes, Bayesian networks are ideally suited to risk analysis and adaptive management (Nyberg and others, 2006; Howes and others, 2010). Thus, Bayesian network models are useful in situations where empirical data are not available, such as questions concerning the responses of giant gartersnakes to management.

  9. Variable complexity online sequential extreme learning machine, with applications to streamflow prediction

    NASA Astrophysics Data System (ADS)

    Lima, Aranildo R.; Hsieh, William W.; Cannon, Alex J.

    2017-12-01

    In situations where new data arrive continually, online learning algorithms are computationally much less costly than batch learning ones in maintaining the model up-to-date. The extreme learning machine (ELM), a single hidden layer artificial neural network with random weights in the hidden layer, is solved by linear least squares, and has an online learning version, the online sequential ELM (OSELM). As more data become available during online learning, information on the longer time scale becomes available, so ideally the model complexity should be allowed to change, but the number of hidden nodes (HN) remains fixed in OSELM. A variable complexity VC-OSELM algorithm is proposed to dynamically add or remove HN in the OSELM, allowing the model complexity to vary automatically as online learning proceeds. The performance of VC-OSELM was compared with OSELM in daily streamflow predictions at two hydrological stations in British Columbia, Canada, with VC-OSELM significantly outperforming OSELM in mean absolute error, root mean squared error and Nash-Sutcliffe efficiency at both stations.
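
    For orientation, a compact sketch of the fixed-complexity OSELM that VC-OSELM extends: random hidden weights, a least-squares output layer, and a recursive update for each arriving chunk. The dynamic addition and removal of hidden nodes is the paper's contribution and is not shown; data and sizes are assumptions.

```python
# Online sequential extreme learning machine (OSELM), fixed hidden layer.
import numpy as np

rng = np.random.default_rng(5)
n_in, n_hidden = 3, 20
W = rng.normal(size=(n_in, n_hidden))  # random hidden weights, never retrained
b = rng.normal(size=n_hidden)

def hidden(X):
    return np.tanh(X @ W + b)

# Initial batch: solve the output weights by (ridge-stabilized) least squares.
X0, y0 = rng.normal(size=(100, n_in)), rng.normal(size=(100, 1))
H0 = hidden(X0)
P = np.linalg.inv(H0.T @ H0 + 1e-6 * np.eye(n_hidden))
beta = P @ H0.T @ y0

# Online chunks: recursive least-squares update of beta as new data arrive.
for _ in range(10):
    Xk, yk = rng.normal(size=(20, n_in)), rng.normal(size=(20, 1))
    Hk = hidden(Xk)
    K = P @ Hk.T @ np.linalg.inv(np.eye(len(Xk)) + Hk @ P @ Hk.T)
    P = P - K @ Hk @ P
    beta = beta + P @ Hk.T @ (yk - Hk @ beta)
```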

  10. Variability in the management of lithium poisoning.

    PubMed

    Roberts, Darren M; Gosselin, Sophie

    2014-01-01

    Three patterns of lithium poisoning are recognized: acute, acute-on-chronic, and chronic. Intravenous fluids with or without an extracorporeal treatment are the mainstay of treatment; their respective roles may differ depending on the mode of poisoning being treated. Recommendations for treatment selection are available but these are based on a small number of observational studies and their uptake by clinicians is not known. Clinician decision-making in the treatment of four cases of lithium poisoning was assessed at a recent clinical toxicology meeting using an audience response system. Variability in treatment decisions was evident in addition to discordance with published recommendations. Participants did not consistently indicate that hemodialysis was the first-line treatment, instead opting for a conservative approach, and continuous modalities were viewed favorably; this is in contrast to recommendations in some references. The development of multidisciplinary consensus guidelines may improve the management of patients with lithium poisoning but prospective randomized controlled trials are required to more clearly define the role of extracorporeal treatments. © 2014 Wiley Periodicals, Inc.

  11. Optimizing Multi-Product Multi-Constraint Inventory Control Systems with Stochastic Replenishments

    NASA Astrophysics Data System (ADS)

    Allah Taleizadeh, Ata; Aryanezhad, Mir-Bahador; Niaki, Seyed Taghi Akhavan

    Multi-periodic inventory control problems are mainly studied under two assumptions. The first is continuous review, where, depending on the inventory level, orders can happen at any time; the other is periodic review, where orders can only happen at the beginning of each period. In this study, we relax these assumptions and assume that the periodic replenishments are stochastic in nature. Furthermore, we assume that the periods between two replenishments are independent and identically distributed random variables. For the problem at hand, the decision variables are of integer type and there are two kinds of constraints, on space and service level, for each product. We develop a model of the problem in which a combination of back-orders and lost sales is considered for the shortages. We then show that the model is an integer nonlinear program, and a search algorithm can be utilized to solve it. We employ a simulated annealing approach and provide a numerical example to demonstrate the applicability of the proposed methodology.
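
    A hedged sketch of a simulated-annealing search over integer order quantities with a penalized space constraint, in the spirit of the approach above. The cost function is a placeholder, not the paper's inventory model.

```python
# Simulated annealing over integer decision variables with a penalty term.
import numpy as np

rng = np.random.default_rng(6)
n_products = 4

def cost(q):
    holding = np.sum(0.5 * q**2)
    shortage = np.sum(100.0 / (1 + q))          # stand-in for back-order/lost-sales cost
    space_penalty = 1e3 * max(0, q.sum() - 60)  # space constraint, penalized
    return holding + shortage + space_penalty

q = rng.integers(1, 20, n_products)
best, best_cost, T = q.copy(), cost(q), 100.0
for step in range(5000):
    cand = np.clip(q + rng.integers(-2, 3, n_products), 1, None)  # integer neighborhood move
    delta = cost(cand) - cost(q)
    if delta < 0 or rng.random() < np.exp(-delta / T):  # accept downhill, sometimes uphill
        q = cand
        if cost(q) < best_cost:
            best, best_cost = q.copy(), cost(q)
    T *= 0.999  # geometric cooling
print(best, round(best_cost, 1))
```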

  12. Practical secure quantum communications

    NASA Astrophysics Data System (ADS)

    Diamanti, Eleni

    2015-05-01

    We review recent advances in the field of quantum cryptography, focusing in particular on practical implementations of two central protocols for quantum network applications, namely key distribution and coin flipping. The former allows two parties to share secret messages with information-theoretic security, even in the presence of a malicious eavesdropper in the communication channel, which is impossible with classical resources alone. The latter enables two distrustful parties to agree on a random bit, again with information-theoretic security, and with a cheating probability lower than the one that can be reached in a classical scenario. Our implementations rely on continuous-variable technology for quantum key distribution and on a plug and play discrete-variable system for coin flipping, and necessitate a rigorous security analysis adapted to the experimental schemes and their imperfections. In both cases, we demonstrate the protocols with provable security over record long distances in optical fibers and assess the performance of our systems as well as their limitations. The reported advances offer a powerful toolbox for practical applications of secure communications within future quantum networks.

  13. Maternal and Child Health Handbook use for maternal and child care: a cluster randomized controlled study in rural Java, Indonesia.

    PubMed

    Osaki, Keiko; Hattori, Tomoko; Toda, Akemi; Mulati, Erna; Hermawan, Lukas; Pritasari, Kirana; Bardosono, Saptawati; Kosen, Soewarta

    2018-01-09

    Effectiveness of the Maternal and Child Health Handbook (MCHHB), a home-based booklet for pregnancy, delivery and postnatal/child health, was evaluated on care acquisition and home care in rural Java, a low service-coverage area. We conducted a health centre-based randomized trial, with a 2-year follow-up. Intervention included (i) MCHHB provision at antenatal care visits; (ii) records and guides by health personnel on and with the MCHHB; and (iii) sensitization of care by volunteers using the MCHHB. The follow-up rate was 70.2% (183, intervention area; 271, control area). Respondents in the intervention area received consecutive MCH services including two doses of tetanus toxoid injections and antenatal care four times or more during pregnancy, professional assistance during child delivery and vitamin A supplements administration to their children, after adjustment for confounding variables and cluster effects (OR = 2.03, 95% CI: 1.19-3.47). In the intervention area, home care (continued breastfeeding; introducing complementary feeding; proper feeding order; varied foods feeding; self-feeding training; and care for cough), perceived support by husbands, and lower underweight rates and stunting rates among children were observed. MCHHB use promoted continuous care acquisition and care at home from pregnancy to early child-rearing stages in rural Java. © The Author(s) 2018. Published by Oxford University Press on behalf of Faculty of Public Health.

  14. Thermodynamics and mechanics of stretch-induced crystallization in rubbers

    NASA Astrophysics Data System (ADS)

    Guo, Qiang; Zaïri, Fahmi; Guo, Xinglin

    2018-05-01

    The aim of the present paper is to provide a quantitative prediction of the stretch-induced crystallization in natural rubber, the exclusive reason for its history-dependent thermomechanical features. A constitutive model based on a micromechanism inspired molecular chain approach is formulated within the context of the thermodynamic framework. The molecular configuration of the partially crystallized single chain is analyzed and calculated by means of some statistical mechanical methods. The random thermal oscillation of the crystal orientation, considered as a continuous random variable, is treated by means of a representative angle. The physical expression of the chain free energy is derived according to a two-step strategy by separating crystallization and stretching. This strategy ensures that the stretch-induced part of the thermodynamic crystallization force is null at the initial instant and allows, without any additional constraint, the formulation of a simple linear relationship for the crystallinity evolution law. The model contains very few physically interpretable material constants to simulate the complex mechanism: two chain-scale constants, one crystallinity kinetics constant, three thermodynamic constants related to the newly formed crystallites, and a function controlling the crystal orientation with respect to the chain. The model is used to discuss some important aspects of the micromechanism and the macroresponse under the equilibrium state and the nonequilibrium state involved during stretching and recovery, and continuous relaxation.

  15. Generic versus disorder specific cognitive behavior therapy for social anxiety disorder in youth: A randomized controlled trial using internet delivery.

    PubMed

    Spence, Susan H; Donovan, Caroline L; March, Sonja; Kenardy, Justin A; Hearn, Cate S

    2017-03-01

    The study examined whether the efficacy of cognitive behavioral treatment for Social Anxiety Disorder for children and adolescents is increased if intervention addresses specific cognitive and behavioral factors linked to the development and maintenance of SAD in young people, over and above the traditional generic CBT approach. Participants were 125 youth, aged 8-17 years, with a primary diagnosis of SAD, who were randomly assigned to generic CBT (CBT-GEN), social anxiety specific CBT (CBT-SAD) or a wait list control (WLC). Intervention was delivered using a therapist-supported online program. After 12-weeks, participants who received treatment (CBT-SAD or CBT-GEN) showed significantly greater reduction in social anxiety and post-event processing, and greater improvement in global functioning than the WLC but there was no significant difference between CBT-SAD and CBT-GEN on any outcome variable at 12-weeks or 6-month follow-up. Despite significant reductions in anxiety, the majority in both treatment conditions continued to meet diagnostic criteria for SAD at 6-month follow-up. Decreases in social anxiety were associated with decreases in post-event processing. Future research should continue to investigate disorder-specific interventions for SAD in young people, drawing on evidence regarding causal or maintaining factors, in order to enhance treatment outcomes for this debilitating condition. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Short-term effects of a randomized computer-based out-of-school smoking prevention trial aimed at elementary schoolchildren.

    PubMed

    Ausems, Marlein; Mesters, Ilse; van Breukelen, Gerard; De Vries, Hein

    2002-06-01

    Smoking prevention programs usually run during school hours. In our study, an out-of-school program was developed consisting of a computer-tailored intervention aimed at the age group before school transition (11- to 12-year-old elementary schoolchildren). The aim of this study is to evaluate the additional effect of out-of-school smoking prevention. One hundred fifty-six participating schools were randomly allocated to one of four research conditions: (a) the in-school condition, an existing seven-lesson program; (b) the out-of-school condition, three computer-tailored letters sent to the students' homes; (c) the in-school and out-of-school condition, a combined approach; (d) the control condition. Pretest and 6 months follow-up data on smoking initiation and continuation, and data on psychosocial variables were collected from 3,349 students. Control and out-of-school conditions differed regarding posttest smoking initiation (18.1 and 10.4%) and regarding posttest smoking continuation (23.5 and 13.1%). Multilevel logistic regression analyses showed positive effects regarding the out-of-school program. Significant effects were not found regarding the in-school program, nor did the combined approach show stronger effects than the single-method approaches. The findings of this study suggest that smoking prevention trials for elementary schoolchildren can be effective when using out-of-school computer-tailored interventions. Copyright 2002 Elsevier Science (USA).

  17. Method and apparatus for executing an asynchronous clutch-to-clutch shift in a hybrid transmission

    DOEpatents

    Demirovic, Besim; Gupta, Pinaki; Kaminsky, Lawrence A.; Naqvi, Ali K.; Heap, Anthony H.; Sah, Jy-Jen F.

    2014-08-12

    A hybrid transmission includes first and second electric machines. A method for operating the hybrid transmission in response to a command to execute a shift from an initial continuously variable mode to a target continuously variable mode includes increasing torque of an oncoming clutch associated with operating in the target continuously variable mode and correspondingly decreasing a torque of an off-going clutch associated with operating in the initial continuously variable mode. Upon deactivation of the off-going clutch, torque outputs of the first and second electric machines and the torque of the oncoming clutch are controlled to synchronize the oncoming clutch. Upon synchronization of the oncoming clutch, the torque for the oncoming clutch is increased and the transmission is operated in the target continuously variable mode.

  18. Instrumental Variable Analysis with a Nonlinear Exposure–Outcome Relationship

    PubMed Central

    Davies, Neil M.; Thompson, Simon G.

    2014-01-01

    Background: Instrumental variable methods can estimate the causal effect of an exposure on an outcome using observational data. Many instrumental variable methods assume that the exposure–outcome relation is linear, but in practice this assumption is often in doubt, or perhaps the shape of the relation is a target for investigation. We investigate this issue in the context of Mendelian randomization, the use of genetic variants as instrumental variables. Methods: Using simulations, we demonstrate the performance of a simple linear instrumental variable method when the true shape of the exposure–outcome relation is not linear. We also present a novel method for estimating the effect of the exposure on the outcome within strata of the exposure distribution. This enables the estimation of localized average causal effects within quantile groups of the exposure or as a continuous function of the exposure using a sliding window approach. Results: Our simulations suggest that linear instrumental variable estimates approximate a population-averaged causal effect. This is the average difference in the outcome if the exposure for every individual in the population is increased by a fixed amount. Estimates of localized average causal effects reveal the shape of the exposure–outcome relation for a variety of models. These methods are used to investigate the relations between body mass index and a range of cardiovascular risk factors. Conclusions: Nonlinear exposure–outcome relations should not be a barrier to instrumental variable analyses. When the exposure–outcome relation is not linear, either a population-averaged causal effect or the shape of the exposure–outcome relation can be estimated. PMID:25166881
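
    An illustrative two-stage least squares run on simulated data in the Mendelian-randomization spirit described above, with a deliberately nonlinear (quadratic) exposure-outcome relation so the linear IV estimate behaves as a population-averaged slope. Everything below is synthetic; the paper's sliding-window localized estimates are not reproduced.

```python
# Two-stage least squares with a simulated genetic instrument.
import numpy as np

rng = np.random.default_rng(7)
n = 20_000
g = rng.binomial(2, 0.3, n)             # instrument: genetic variant (0, 1, or 2 alleles)
u = rng.normal(size=n)                  # unobserved confounder
x = 0.5 * g + u + rng.normal(size=n)    # exposure
y = 0.2 * x**2 + u + rng.normal(size=n) # nonlinear causal effect of x on y

# Stage 1: regress exposure on instrument; Stage 2: regress outcome on fitted exposure.
X1 = np.column_stack([np.ones(n), g])
x_hat = X1 @ np.linalg.lstsq(X1, x, rcond=None)[0]
X2 = np.column_stack([np.ones(n), x_hat])
beta_iv = np.linalg.lstsq(X2, y, rcond=None)[0][1]
print(f"linear IV estimate (population-averaged slope): {beta_iv:.3f}")
```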

  19. A Chlorhexidine Solution Reduces Aerobic Organism Growth in Operative Splash Basins in a Randomized Controlled Trial.

    PubMed

    Lindgren, Kevin E; Pelt, Christopher E; Anderson, Mike B; Peters, Christopher L; Spivak, Emily S; Gililland, Jeremy M

    2018-01-01

    Despite recommendations against the use of splash basins, due to the potential of bacterial contamination, our observation has been that they continue to be used in operating theaters. In hopes of decontaminating the splash basin, we sought to determine if the addition of chlorhexidine gluconate (CHG) would eliminate aerobic bacterial growth within the splash basin. After Institutional Review Board approval, we began enrollment in a randomized controlled trial comparing 2 splash basin solutions. Splash basins (n = 111) were randomized to either the standard of care (control) solution of sterile water or the experimental solution containing 0.05% CHG. One 20 mL aliquot was taken from the basin at the end of the surgical case and delivered to an independent laboratory. Samples were plated on tryptic soy agar (medium) and incubated at 30°C-35°C to encourage growth. After 48-72 hours, the agar plates were examined for growth and a standard plate count of aerobic cultures was performed. The sterile water group was found to have bacterial growth in 9% of samples compared to no growth in the CHG group (P = .045). The organisms included Micrococcus luteus, Staphylococcus hominis, Gram-variable coccobacilli, and unidentifiable Gram-positive rods. Given the safety and efficacy of a concentration of 0.05% CHG in reducing the bacterial contamination in the operative splash basin, it would seem that if the practice of using a splash basin in the operating theater is to be continued, the addition of an antiseptic solution such as that studied here should be considered. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Capture of visual direction in dynamic vergence is reduced with flashed monocular lines.

    PubMed

    Jaschinski, Wolfgang; Jainta, Stephanie; Schürer, Michael

    2006-08-01

    The visual direction of a continuously presented monocular object is captured by the visual direction of a closely adjacent binocular object, which questions the reliability of nonius lines for measuring vergence. This was shown by Erkelens, C. J., and van Ee, R. (1997a,b) [Capture of the visual direction: An unexpected phenomenon in binocular vision. Vision Research, 37, 1193-1196; Capture of the visual direction of monocular objects by adjacent binocular objects. Vision Research, 37, 1735-1745] stimulating dynamic vergence by a counter-phase oscillation of two square random-dot patterns (one to each eye) that contained a smaller central dot-free gap (of variable width) with a vertical monocular line oscillating in phase with the random-dot pattern of the respective eye; subjects adjusted the motion-amplitude of the line until it was perceived as (nearly) stationary. With a continuously presented monocular line, we replicated capture of visual direction provided the dot-free gap was narrow: the adjusted motion-amplitude of the line was similar to the motion-amplitude of the random-dot pattern, although large vergence errors occurred. However, when we flashed the line for 67 ms at the moments of maximal and minimal disparity of the vergence stimulus, we found that the adjusted motion-amplitude of the line was smaller; thus, the capture effect appeared to be reduced with flashed nonius lines. Accordingly, we found that the objectively measured vergence gain was significantly correlated (r=0.8) with the motion-amplitude of the flashed monocular line when the separation between the line and the fusion contour was at least 32 min arc. In conclusion, if one wishes to estimate the dynamic vergence response with psychophysical methods, effects of capture of visual direction can be reduced by using flashed nonius lines.

  1. Automatic control of tracheal tube cuff pressure in ventilated patients in semirecumbent position: a randomized trial.

    PubMed

    Valencia, Mauricio; Ferrer, Miquel; Farre, Ramon; Navajas, Daniel; Badia, Joan Ramon; Nicolas, Josep Maria; Torres, Antoni

    2007-06-01

    The aspiration of subglottic secretions colonized by bacteria pooled around the tracheal tube cuff due to inadvertent deflation (<20 cm H2O) of the cuff plays a relevant role in the pathogenesis of ventilator-associated pneumonia. We assessed the efficacy of an automatic, validated device for the continuous regulation of tracheal tube cuff pressure in preventing ventilator-associated pneumonia. Prospective randomized controlled trial. Respiratory intensive care unit and general medical intensive care unit. One hundred and forty-two mechanically ventilated patients (age, 64 +/- 17 yrs; Acute Physiology and Chronic Health Evaluation II score, 18 +/- 6) without pneumonia or aspiration at admission. Within 24 hrs of intubation, patients were randomly allocated to undergo continuous regulation of the cuff pressure with the automatic device (n = 73) or routine care of the cuff pressure (control group, n = 69). Patients remained in a semirecumbent position in bed. The primary end point variable was the incidence of ventilator-associated pneumonia. Main causes for intubation were decreased consciousness (43, 30%) and exacerbation of chronic respiratory diseases (38, 27%). Cuff pressure <20 cm H2O was more frequently observed in the control group than in the automatic group (45.3 vs. 0.7% determinations, p < .001). However, the rate of ventilator-associated pneumonia with clinical criteria (16, 22% vs. 20, 29%) and microbiological confirmation (11, 15% vs. 10, 15%), the distribution of early and late onset, the causative microorganisms, and intensive care unit (20, 27% vs. 16, 23%) and hospital mortality (30, 41% vs. 23, 33%) were similar for the automatic and control groups, respectively. Cuff pressure is better controlled with the automatic device. However, it did not provide additional benefit beyond the semirecumbent position alone in preventing ventilator-associated pneumonia.

  2. Effectiveness of sensor-augmented pump therapy in children and adolescents with type 1 diabetes in the STAR 3 study.

    PubMed

    Slover, Robert H; Welsh, John B; Criego, Amy; Weinzimer, Stuart A; Willi, Steven M; Wood, Michael A; Tamborlane, William V

    2012-02-01

    Maintenance of appropriate A1C values and minimization of hyperglycemic excursions are difficult for many pediatric patients with type 1 diabetes. Continuous glucose monitoring (CGM) sensor-augmented pump (SAP) therapy is an alternative to multiple daily injection (MDI) therapy in this population. Sensor-augmented pump therapy for A1C reduction (STAR 3) was a 1-yr trial that included 82 children (aged 7-12) and 74 adolescents (aged 13-18) with A1C values ranging from 7.4 to 9.5% who were randomized to either SAP or MDI therapy. Quarterly A1C values were obtained from all subjects. CGM studies were carried out at baseline, 6 months, and 12 months to quantify glycemic excursions [calculated as area under the glucose concentration-time curve (AUC)] and variability. In the SAP group, sensor compliance was recorded. Baseline A1C values were similar in subjects randomized to the SAP (8.26 ± 0.55%) and MDI groups (8.30 ± 0.53%). All subsequent A1C values showed significant (p < 0.05) treatment group differences favoring SAP therapy. Compared with the MDI group, subjects in the SAP group were more likely to meet age-specific A1C targets and had lower AUC values for hyperglycemia with no increased risk of hypoglycemia. Glucose variability improved in the SAP group compared to the MDI group. Children wore CGM sensors more often and were more likely to reach age-specific A1C targets than adolescents. SAP therapy allows both children and adolescents with marginally or inadequately controlled type 1 diabetes to reduce A1C values, hyperglycemic excursions, and glycemic variability in a rapid, sustainable, and safe manner. © 2011 John Wiley & Sons A/S.

  3. Prevalence of kidney stones and associated risk factors in the Shunyi District of Beijing, China.

    PubMed

    Jiang, Y G; He, L H; Luo, G T; Zhang, X D

    2017-10-01

    Kidney stone formation is a multifactorial condition that involves interaction of environmental and genetic factors. Presence of kidney stones is strongly related to other diseases, which may result in a heavy economic and social burden. Clinical data on the prevalence and influencing factors in kidney stone disease in the north of China are scarce. In this study, we explored the prevalence of kidney stones and potentially associated risk factors in the Shunyi District of Beijing, China. A population-based cross-sectional study was conducted from December 2011 to November 2012 in a northern area of China. Participants were interviewed in randomly selected towns. Univariate analysis of continuous and categorical variables was first performed by calculation of Spearman's correlation coefficient and Pearson Chi squared value, respectively. Variables with statistical significance were further analysed by multivariate logistic regression to explore the potential influencing factors. A total of 3350 participants (1091 males and 2259 females) completed the survey and the response rate was 99.67%. Among the participants, 3.61% were diagnosed with kidney stones. Univariate analysis showed that significant differences were evident in 31 variables. Blood and urine tests were performed in 100 randomly selected patients with kidney stones and 100 healthy controls. Serum creatinine, calcium, and uric acid were significantly different between the patients with kidney stones and healthy controls. Multivariate logistic regression revealed that being male (odds ratio=102.681; 95% confidence interval, 1.062-9925.797), daily intake of white spirits (6.331; 1.204-33.282), and a history of urolithiasis (1797.775; 24.228-133,396.982) were factors potentially associated with kidney stone prevalence. Male gender, drinking white spirits, and a history of urolithiasis are potentially associated with kidney stone formation.

  4. [Effect of Sijunzi Decoction and enteral nutrition on T-cell subsets and nutritional status in patients with gastric cancer after operation: a randomized controlled trial].

    PubMed

    Cai, Jun; Wang, Hua; Zhou, Sheng; Wu, Bin; Song, Hua-Rong; Xuan, Zheng-Rong

    2008-01-01

    To observe the effect of perioperative application of Sijunzi Decoction and enteral nutrition on T-cell subsets and nutritional status in patients with gastric cancer after operation. In this prospective, single-blinded, controlled clinical trial, fifty-nine patients with gastric cancer were randomly divided into three groups: control group (n=20) and two study groups (group A, n=21; group B, n=18). Sijunzi Decoction (100 ml) was administered via nasogastric tube to patients in study group B from the second to the ninth postoperative day. Patients in the two study groups were given an isocaloric and isonitrogenous enteral diet, which was started on the second day after operation and continued for eight days. Patients in the control group were given an isocaloric and isonitrogenous parenteral diet for 9 days. All variables of nutritional status such as serum albumin (ALB), prealbumin (PA), transferrin (TRF) and T-cell subsets were measured one day before operation, and one day and 10 days after operation. All the nutritional variables and the levels of CD3(+), CD4(+), CD4(+)/CD8(+) were decreased significantly after operation. Ten days after operation, T-cell subsets and nutritional variables in the two study groups were increased compared with the control group. The levels of ALB, TRF and T-cell subsets in study group B were increased significantly as compared with study group A (P<0.05). Enteral nutrition assisted with Sijunzi Decoction can positively improve and optimize cellular immune function and nutritional status in patients with gastric cancer after operation.

  5. Postoperative pain after manual and mechanical glide path: a randomized clinical trial.

    PubMed

    Pasqualini, Damiano; Mollo, Livio; Scotti, Nicola; Cantatore, Giuseppe; Castellucci, Arnaldo; Migliaretti, Giuseppe; Berutti, Elio

    2012-01-01

    This prospective randomized clinical trial evaluated the incidence of postoperative pain after glide path performed with PathFile (PF) (Dentsply Maillefer, Ballaigues, Switzerland) versus stainless-steel K-file (KF). In 149 subjects, the mechanical glide path was performed with nickel-titanium (NiTi) rotary PF; in 146 subjects, the manual glide path was performed with stainless-steel KFs. Postoperative pain, analgesics consumption, and the number of days to complete pain resolution were evaluated over the following 7 days. An analysis of variance model for repeated measures was used to compare the variation of pain-scale values (P < .05). Student's t test was used for normally distributed continuous variables, the nonparametric Mann-Whitney U test for nonnormally distributed variables, and the chi-square test for dichotomous variables (P < .05). Despite homogeneous baseline conditions at diagnosis, tooth type, pain prevalence, and scores, the postoperative pain prevalence curves in the PF group showed a more favorable trend in terms of time to pain resolution compared with the KF group (P = .004). The difference was also evident in the model adjusted for analgesics consumption in both groups (P = .012). The mean analgesics intake per subject was significantly higher in the KF group (3.7 ± 2.2) compared with the PF group (2 ± 1.7) (P < .001). Mean time to pain resolution was also significantly longer in the KF group (2.7) than in the PF group (1.7) (P = .001). The glide path with NiTi rotary PF leads to less postoperative pain and faster symptom resolution. Copyright © 2012 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  6. Empirical study of seven data mining algorithms on different characteristics of datasets for biomedical classification applications.

    PubMed

    Zhang, Yiyan; Xin, Yi; Li, Qin; Ma, Jianshe; Li, Shuai; Lv, Xiaodan; Lv, Weiqi

    2017-11-02

    New data mining algorithms are continually proposed as related disciplines develop, and their applicable scopes and performances differ. Hence, finding a suitable algorithm for a dataset is becoming an important task for biomedical researchers who must solve practical problems promptly. In this paper, seven widely used algorithms, namely, C4.5, support vector machine, AdaBoost, k-nearest neighbor, naïve Bayes, random forest, and logistic regression, were selected as the research objects. The seven algorithms were applied to the 12 most-accessed UCI public datasets for the task of classification, and their performances were compared through induction and analysis. The sample size, number of attributes, number of missing values, sample size of each class, correlation coefficients between variables, class entropy of the task variable, and the ratio of the sample size of the largest class to that of the smallest class were calculated to characterize the 12 datasets. The two ensemble algorithms reached high classification accuracy on most datasets. Moreover, random forest performed better than AdaBoost on unbalanced multi-class datasets. Simple algorithms, such as naïve Bayes and logistic regression, are suitable for small datasets with high correlation between the task variable and the other attribute variables. The k-nearest neighbor and C4.5 decision tree algorithms performed well on binary- and multi-class datasets. The support vector machine was better suited to balanced, small binary-class datasets. No algorithm maintained the best performance across all datasets. The applicability of the seven data mining algorithms to datasets with different characteristics was summarized to provide a reference for biomedical researchers and beginners in different fields.
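
    As a rough illustration of the kind of benchmark this abstract describes, the sketch below cross-validates several off-the-shelf classifiers on a single public dataset. It is not the authors' pipeline: the scikit-learn estimators (including an entropy-based decision tree standing in for C4.5) and the choice of dataset are assumptions for the example.

    ```python
    # Hedged sketch: cross-validated comparison of seven classifier families
    # on one UCI-style dataset, in the spirit of the benchmark above.
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.naive_bayes import GaussianNB
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.svm import SVC
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_breast_cancer(return_X_y=True)  # stand-in public dataset

    models = {
        "C4.5-like tree": DecisionTreeClassifier(criterion="entropy"),
        "SVM": SVC(),
        "AdaBoost": AdaBoostClassifier(),
        "kNN": KNeighborsClassifier(),
        "naive Bayes": GaussianNB(),
        "random forest": RandomForestClassifier(),
        "logistic regression": LogisticRegression(max_iter=5000),
    }
    for name, model in models.items():
        scores = cross_val_score(model, X, y, cv=10)  # 10-fold accuracy
        print(f"{name:20s} mean accuracy = {scores.mean():.3f}")
    ```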

  7. Effects of Person-Centered Physical Therapy on Fatigue-Related Variables in Persons With Rheumatoid Arthritis: A Randomized Controlled Trial.

    PubMed

    Feldthusen, Caroline; Dean, Elizabeth; Forsblad-d'Elia, Helena; Mannerkorpi, Kaisa

    2016-01-01

    To examine effects of person-centered physical therapy on fatigue and related variables in persons with rheumatoid arthritis (RA). Randomized controlled trial. Hospital outpatient rheumatology clinic. Persons with RA aged 20 to 65 years (N=70): intervention group (n=36) and reference group (n=34). The 12-week intervention, with 6-month follow-up, focused on partnership between participant and physical therapist and tailored health-enhancing physical activity and balancing life activities. The reference group continued with regular activities; both groups received usual health care. Primary outcome was general fatigue (visual analog scale). Secondary outcomes included multidimensional fatigue (Bristol Rheumatoid Arthritis Fatigue Multi-Dimensional Questionnaire) and fatigue-related variables (ie, disease, health, function). At posttest, general fatigue improved more in the intervention group than in the reference group (P=.042). Improvement in median general fatigue reached minimal clinically important differences between and within groups at posttest and follow-up. Improvement was also observed for anxiety (P=.0099), and trends toward improvement were observed for most multidimensional aspects of fatigue (P=.023-.048), leg strength/endurance (P=.024), and physical activity (P=.023). Compared with the reference group at follow-up, improvement in the intervention group was observed for leg strength/endurance (P=.001), and the trends toward improvement persisted for physical (P=.041) and living-related (P=.031) aspects of fatigue, physical activity (P=.019), anxiety (P=.015), self-rated health (P=.010), and self-efficacy (P=.046). Person-centered physical therapy focused on health-enhancing physical activity and balancing life activities showed significant benefits on fatigue in persons with RA. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  8. Spatial analysis of factors influencing long-term stress in the grizzly bear (Ursus arctos) population of Alberta, Canada.

    PubMed

    Bourbonnais, Mathieu L; Nelson, Trisalyn A; Cattet, Marc R L; Darimont, Chris T; Stenhouse, Gordon B

    2013-01-01

    Non-invasive measures for assessing long-term stress in free ranging mammals are an increasingly important approach for understanding physiological responses to landscape conditions. Using a spatially and temporally expansive dataset of hair cortisol concentrations (HCC) generated from a threatened grizzly bear (Ursus arctos) population in Alberta, Canada, we quantified how variables representing habitat conditions and anthropogenic disturbance impact long-term stress in grizzly bears. We characterized spatial variability in male and female HCC point data using kernel density estimation and quantified variable influence on spatial patterns of male and female HCC stress surfaces using random forests. Separate models were developed for regions inside and outside of parks and protected areas to account for substantial differences in anthropogenic activity and disturbance within the study area. Variance explained in the random forest models ranged from 55.34% to 74.96% for males and 58.15% to 68.46% for females. Predicted HCC levels were higher for females compared to males. Generally, high spatially continuous female HCC levels were associated with parks and protected areas while low-to-moderate levels were associated with increased anthropogenic disturbance. In contrast, male HCC levels were low in parks and protected areas and low-to-moderate in areas with increased anthropogenic disturbance. Spatial variability in gender-specific HCC levels reveals that the type and intensity of external stressors are not uniform across the landscape and that male and female grizzly bears may be exposed to, or perceive, potential stressors differently. We suggest that observed spatial patterns of long-term stress may be the result of the availability and distribution of foods related to disturbance features, potential sexual segregation in available habitat selection, and may not be influenced by sources of mortality which represent acute traumas. In this wildlife system and others, conservation and management efforts can benefit by understanding spatial- and gender-based stress responses to landscape conditions.
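
    The workflow described here (kernel density surfaces of point data plus random-forest models of the response) can be sketched with standard Python tools. The following is a minimal illustration on synthetic data; all variable names and values are invented, and impurity-based importances stand in for the paper's measures of variable influence.

    ```python
    # Illustrative sketch only (synthetic data), mirroring the HCC workflow.
    import numpy as np
    from scipy.stats import gaussian_kde
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    xy = rng.uniform(0, 100, size=(300, 2))               # point locations
    hcc = 2 + 0.02 * xy[:, 0] + rng.normal(0, 0.5, 300)   # synthetic "HCC"

    # Kernel density estimate of sampling intensity over the study area
    kde = gaussian_kde(xy.T)
    grid = np.mgrid[0:100:50j, 0:100:50j].reshape(2, -1)
    density = kde(grid)                                   # 50x50 surface

    # Random forest linking synthetic landscape covariates to HCC
    covariates = np.column_stack([xy, rng.uniform(size=300)])  # x, y, "disturbance"
    rf = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0)
    rf.fit(covariates, hcc)
    print("OOB R^2:", rf.oob_score_)                      # cf. 55-75% in the paper
    print("impurity importances:", rf.feature_importances_)
    ```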

  9. Spatial Analysis of Factors Influencing Long-Term Stress in the Grizzly Bear (Ursus arctos) Population of Alberta, Canada

    PubMed Central

    Bourbonnais, Mathieu L.; Nelson, Trisalyn A.; Cattet, Marc R. L.; Darimont, Chris T.; Stenhouse, Gordon B.

    2013-01-01

    Non-invasive measures for assessing long-term stress in free ranging mammals are an increasingly important approach for understanding physiological responses to landscape conditions. Using a spatially and temporally expansive dataset of hair cortisol concentrations (HCC) generated from a threatened grizzly bear (Ursus arctos) population in Alberta, Canada, we quantified how variables representing habitat conditions and anthropogenic disturbance impact long-term stress in grizzly bears. We characterized spatial variability in male and female HCC point data using kernel density estimation and quantified variable influence on spatial patterns of male and female HCC stress surfaces using random forests. Separate models were developed for regions inside and outside of parks and protected areas to account for substantial differences in anthropogenic activity and disturbance within the study area. Variance explained in the random forest models ranged from 55.34% to 74.96% for males and 58.15% to 68.46% for females. Predicted HCC levels were higher for females compared to males. Generally, high spatially continuous female HCC levels were associated with parks and protected areas while low-to-moderate levels were associated with increased anthropogenic disturbance. In contrast, male HCC levels were low in parks and protected areas and low-to-moderate in areas with increased anthropogenic disturbance. Spatial variability in gender-specific HCC levels reveals that the type and intensity of external stressors are not uniform across the landscape and that male and female grizzly bears may be exposed to, or perceive, potential stressors differently. We suggest that observed spatial patterns of long-term stress may be the result of the availability and distribution of foods related to disturbance features, potential sexual segregation in available habitat selection, and may not be influenced by sources of mortality which represent acute traumas. In this wildlife system and others, conservation and management efforts can benefit by understanding spatial- and gender-based stress responses to landscape conditions. PMID:24386273

  10. Deterministic modelling and stochastic simulation of biochemical pathways using MATLAB.

    PubMed

    Ullah, M; Schmidt, H; Cho, K H; Wolkenhauer, O

    2006-03-01

    The analysis of complex biochemical networks is conducted within two popular conceptual modelling frameworks. The deterministic approach requires the solution of ordinary differential equations (ODEs, reaction rate equations) with concentrations as continuous state variables. The stochastic approach involves the simulation of differential-difference equations (chemical master equations, CMEs) with probabilities as variables, generating counts of molecules for chemical species as realisations of random variables drawn from the probability distribution described by the CMEs. Although numerous tools are available, many of them free, the modelling and simulation environment MATLAB is widely used in the physical and engineering sciences. We describe a collection of MATLAB functions to construct and solve ODEs for deterministic simulation and to implement realisations of CMEs for stochastic simulation using advanced MATLAB coding (Release 14). The program was successfully applied to pathway models from the literature for both cases. The results were compared to implementations using alternative tools for dynamic modelling and simulation of biochemical networks. The aim is to provide a concise set of MATLAB functions that encourage experimentation with systems biology models. All the script files are available from www.sbi.uni-rostock.de/publications_matlab-paper.html.
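
    To make the two frameworks concrete, the sketch below solves the same birth-death pathway deterministically (reaction-rate ODE) and stochastically (Gillespie's direct method as one realisation of the CME). It is written in Python rather than the paper's MATLAB, and the rate constants are invented.

    ```python
    # Minimal sketch: deterministic ODE vs one stochastic CME realisation.
    import numpy as np
    from scipy.integrate import solve_ivp

    k_prod, k_deg = 10.0, 0.5          # production and degradation rates
    x0, t_end = 0, 20.0

    # Deterministic reaction-rate equation dx/dt = k_prod - k_deg * x
    sol = solve_ivp(lambda t, x: k_prod - k_deg * x, (0, t_end), [x0])

    # Gillespie direct method: one trajectory of the chemical master equation
    rng = np.random.default_rng(1)
    t, n, times, counts = 0.0, x0, [0.0], [x0]
    while t < t_end:
        rates = np.array([k_prod, k_deg * n])     # propensities
        total = rates.sum()
        t += rng.exponential(1.0 / total)         # time to next reaction
        n += 1 if rng.uniform() < rates[0] / total else -1
        times.append(t)
        counts.append(n)

    print("ODE final value:", sol.y[0, -1])       # ~ k_prod / k_deg = 20
    print("SSA final count:", counts[-1])
    ```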

  11. Shrinkage Estimation of Varying Covariate Effects Based On Quantile Regression

    PubMed Central

    Peng, Limin; Xu, Jinfeng; Kutner, Nancy

    2013-01-01

    Varying covariate effects often manifest meaningful heterogeneity in covariate-response associations. In this paper, we adopt a quantile regression model that assumes linearity at a continuous range of quantile levels as a tool to explore such data dynamics. The consideration of potential non-constancy of covariate effects necessitates a new perspective for variable selection, which, under the assumed quantile regression model, is to retain variables that have effects on all quantiles of interest as well as those that influence only part of quantiles considered. Current work on l1-penalized quantile regression either does not concern varying covariate effects or may not produce consistent variable selection in the presence of covariates with partial effects, a practical scenario of interest. In this work, we propose a shrinkage approach by adopting a novel uniform adaptive LASSO penalty. The new approach enjoys easy implementation without requiring smoothing. Moreover, it can consistently identify the true model (uniformly across quantiles) and achieve the oracle estimation efficiency. We further extend the proposed shrinkage method to the case where responses are subject to random right censoring. Numerical studies confirm the theoretical results and support the utility of our proposals. PMID:25332515
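
    The modelling setting, though not the paper's penalized estimator, can be illustrated by fitting an unpenalized quantile regression over a grid of quantile levels and watching the covariate effect vary. The uniform adaptive LASSO itself is not implemented here, and the data are synthetic.

    ```python
    # Sketch of varying covariate effects across quantiles (no penalty).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    n = 1000
    x = rng.uniform(0, 1, n)
    # Heteroscedastic errors make the slope grow with the quantile level
    y = 1.0 + 2.0 * x + (0.5 + 1.5 * x) * rng.normal(size=n)
    df = pd.DataFrame({"x": x, "y": y})

    for tau in [0.1, 0.25, 0.5, 0.75, 0.9]:
        fit = smf.quantreg("y ~ x", df).fit(q=tau)
        print(f"tau={tau:.2f}  slope={fit.params['x']:.2f}")
    ```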

  12. Continuing or Temporarily Stopping Prestroke Antihypertensive Medication in Acute Stroke: An Individual Patient Data Meta-Analysis.

    PubMed

    Woodhouse, Lisa J; Manning, Lisa; Potter, John F; Berge, Eivind; Sprigg, Nikola; Wardlaw, Joanna; Lees, Kennedy R; Bath, Philip M; Robinson, Thompson G

    2017-05-01

    Over 50% of patients are already taking blood pressure-lowering therapy on hospital admission for acute stroke. An individual patient data meta-analysis from randomized controlled trials was undertaken to determine the effect of continuation versus temporarily stopping preexisting antihypertensive medication in acute stroke. Key databases were searched for trials against the following inclusion criteria: randomized design; stroke onset ≤48 hours; investigating the effect of continuation versus stopping prestroke antihypertensive medication; and follow-up of ≥2 weeks. Two randomized controlled trials were identified and included in this meta-analysis of individual patient data from 2860 patients enrolled within 48 hours of acute stroke onset. Risk of bias in each study was low. In adjusted logistic regression and multiple regression analyses (using random effects), we found no significant association between continuation of prestroke antihypertensive therapy (versus stopping) and risk of death or dependency at final follow-up: odds ratio 0.96 (95% confidence interval, 0.80-1.14). No significant associations were found between continuation (versus stopping) of therapy and secondary outcomes at final follow-up. Analyses for death and dependency in prespecified subgroups revealed no significant associations with continuation versus temporarily stopping therapy, with the exception of patients randomized ≤12 hours after onset, in whom a difference favoring stopping treatment reached statistical significance. We found no significant benefit with continuation of antihypertensive treatment in the acute stroke period. Therefore, there is no urgency to administer preexisting antihypertensive therapy in the first few hours or days after stroke, unless indicated for other comorbid conditions. © 2017 American Heart Association, Inc.

  13. Weight Change After Smoking Cessation Using Variable Doses of Transdermal Nicotine Replacement

    PubMed Central

    Dale, Lowell C; Schroeder, Darrell R; Wolter, Troy D; Croghan, Ivana T; Hurt, Richard D; Offord, Kenneth P

    1998-01-01

    OBJECTIVE Examine weight change in subjects receiving variable doses of transdermal nicotine replacement for smoking cessation. DESIGN Randomized, double-blind clinical trial. SETTING One-week inpatient treatment with outpatient follow-up through 1 year. INTERVENTION This report examines weight change after smoking cessation for 70 subjects randomized to placebo or to 11, 22, or 44 mg/d doses of transdermal nicotine. The study included 1 week of intensive inpatient treatment for nicotine dependence with active patch therapy continuing for another 7 weeks. Counseling sessions were provided weekly for the 8 weeks of patch therapy and with long-term follow-up visits at 3, 6, 9, and 12 months. MEASUREMENTS AND MAIN RESULTS Forty-two subjects were confirmed biochemically (i.e., by expired carbon monoxide) to be nonsmokers at all weekly visits during patch therapy. Their 8-week weight change from baseline was 3.0 ± 2.0 kg. For these subjects, 8-week weight change was found to be negatively correlated with percentage of cotinine replacement (r=−.38, p=.012) and positively correlated with baseline weight (r=.48, p=.001) and age (r=.35, p=.025). Men had higher (p=.003) 8-week weight gain (4.0 ± 1.8 kg) than women (2.1 ± 1.7 kg). Of the 21 subjects who abstained continuously for the entire year, 20 had their weight measured at 1-year follow-up. Among these 20 subjects, 1-year weight change was not found to be associated with gender, baseline weight, baseline smoking rate, total dose of transdermal nicotine, or average percentage of cotinine replacement during the 8 weeks of patch therapy. CONCLUSIONS This study suggests that higher replacement levels of nicotine may delay postcessation weight gain. This effect is consistent for both men and women. We could not identify any factors that predict weight change with long-term abstinence from smoking. PMID:9462489

  14. Long-distance continuous-variable quantum key distribution by controlling excess noise

    NASA Astrophysics Data System (ADS)

    Huang, Duan; Huang, Peng; Lin, Dakai; Zeng, Guihua

    2016-01-01

    Quantum cryptography founded on the laws of physics could revolutionize the way in which communication information is protected. Significant progress in long-distance quantum key distribution based on discrete variables has made secure quantum communication available in real-world conditions. However, the alternative approach implemented with continuous variables has not yet reached secure distances beyond 100 km. Here, we overcome the previous range limitation by controlling system excess noise and report a long-distance continuous-variable quantum key distribution experiment. Our result paves the way to large-scale secure quantum communication with continuous variables and serves as a stepping stone in the quest for a quantum network.

  15. Long-distance continuous-variable quantum key distribution by controlling excess noise.

    PubMed

    Huang, Duan; Huang, Peng; Lin, Dakai; Zeng, Guihua

    2016-01-13

    Quantum cryptography founded on the laws of physics could revolutionize the way in which communication information is protected. Significant progress in long-distance quantum key distribution based on discrete variables has made secure quantum communication available in real-world conditions. However, the alternative approach implemented with continuous variables has not yet reached secure distances beyond 100 km. Here, we overcome the previous range limitation by controlling system excess noise and report a long-distance continuous-variable quantum key distribution experiment. Our result paves the way to large-scale secure quantum communication with continuous variables and serves as a stepping stone in the quest for a quantum network.

  16. Long-distance continuous-variable quantum key distribution by controlling excess noise

    PubMed Central

    Huang, Duan; Huang, Peng; Lin, Dakai; Zeng, Guihua

    2016-01-01

    Quantum cryptography founded on the laws of physics could revolutionize the way in which communication information is protected. Significant progress in long-distance quantum key distribution based on discrete variables has made secure quantum communication available in real-world conditions. However, the alternative approach implemented with continuous variables has not yet reached secure distances beyond 100 km. Here, we overcome the previous range limitation by controlling system excess noise and report a long-distance continuous-variable quantum key distribution experiment. Our result paves the way to large-scale secure quantum communication with continuous variables and serves as a stepping stone in the quest for a quantum network. PMID:26758727

  17. The Integration of Continuous and Discrete Latent Variable Models: Potential Problems and Promising Opportunities

    ERIC Educational Resources Information Center

    Bauer, Daniel J.; Curran, Patrick J.

    2004-01-01

    Structural equation mixture modeling (SEMM) integrates continuous and discrete latent variable models. Drawing on prior research on the relationships between continuous and discrete latent variable models, the authors identify 3 conditions that may lead to the estimation of spurious latent classes in SEMM: misspecification of the structural model,…

  18. Random field theory to interpret the spatial variability of lacustrine soils

    NASA Astrophysics Data System (ADS)

    Russo, Savino; Vessia, Giovanna

    2015-04-01

    The lacustrine soils are quaternary soils, dated from the Pleistocene to Holocene periods, generated in low-energy depositional environments and characterized by a soil mixture of clays, sands and silts with alternations of finer and coarser grain size layers. They are often met at shallow depth, filling several tens of meters of tectonic or erosive basins typically located in internal Apennine areas. The lacustrine deposits are often locally interbedded with detritic soils resulting from the failure of surrounding reliefs. Their heterogeneous lithology is associated with high spatial variability of physical and mechanical properties along both horizontal and vertical directions. The deterministic approach is still commonly adopted to accomplish the mechanical characterization of these heterogeneous soils, where undisturbed sampling is practically not feasible (if the incoherent fraction is prevalent) or not spatially representative (if the cohesive fraction prevails). The deterministic approach consists of performing in situ tests, such as Standard Penetration Tests (SPT) or Cone Penetration Tests (CPT), and deriving design parameters through "expert judgment" interpretation of the measured profiles. The readings of tip and lateral resistance (Rp and RL, respectively) are almost continuous but highly variable in soil classification according to Schmertmann (1978). Thus, neglecting the spatial variability is not the best strategy for estimating spatially representative values of physical and mechanical parameters of lacustrine soils for engineering applications. Hereafter, a method to draw the spatial variability structure of the aforementioned measured profiles is presented. It is based on random field theory (Vanmarcke 1984) applied to vertical readings of Rp measures from mechanical CPTs. The proposed method relies on regression analysis, by which the spatial mean trend and the fluctuations about this trend are derived. Moreover, the scale of fluctuation is calculated to measure the maximum length beyond which profiles of measures are independent. The spatial mean trend can be used to identify "quasi-homogeneous" soil layers, within which the standard deviation and the scale of fluctuation can be calculated. In this study, five Rp profiles performed in the lacustrine deposits of the high River Pescara Valley have been analyzed, where silty clay deposits with thicknesses ranging from a few meters to about 60 m, locally rich in sands and peats, are investigated. Vertical trends of the Rp profiles have been derived to be converted into design parameter mean trends. Furthermore, the variability structure derived from the Rp readings can be propagated to design parameters to calculate the "characteristic values" requested by the European building codes. References: Schmertmann J.H. 1978. Guidelines for Cone Penetration Test, Performance and Design. Report No. FHWA-TS-78-209, U.S. Department of Transportation, Washington, D.C., pp. 145. Vanmarcke E.H. 1984. Random Fields, analysis and synthesis. Cambridge (USA): MIT Press.
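
    A minimal sketch of the procedure, on synthetic data, might look as follows: fit a linear mean trend by regression, compute the sample autocorrelation of the residuals, and back out the scale of fluctuation from an exponential fit (for rho(tau) = exp(-tau/c), Vanmarcke's scale is theta = 2c). The spacing, rates, and AR(1) generator below are assumptions for the example, not values from the Pescara Valley profiles.

    ```python
    # Hedged sketch: trend regression plus scale-of-fluctuation estimate.
    import numpy as np

    rng = np.random.default_rng(3)
    dz = 0.02                                  # reading spacing, m
    z = np.arange(0, 20, dz)
    c_true = 0.25                              # true correlation length, m

    # Correlated fluctuations via AR(1) with coefficient exp(-dz/c_true)
    phi = np.exp(-dz / c_true)
    eps = np.empty_like(z)
    eps[0] = rng.normal()
    for i in range(1, len(z)):
        eps[i] = phi * eps[i - 1] + rng.normal(scale=np.sqrt(1 - phi**2))
    rp = 2.0 + 0.3 * z + 0.8 * eps             # synthetic Rp profile, MPa

    # 1) linear mean trend, then residual fluctuations
    a, b = np.polyfit(z, rp, 1)
    resid = rp - (a * z + b)

    # 2) sample autocorrelation and exponential fit: log(acf) ~ -lag/c
    lags = np.arange(1, 100)
    acf = np.array([np.corrcoef(resid[:-k], resid[k:])[0, 1] for k in lags])
    pos = acf > 0.05                           # keep the reliable positive part
    c_hat = -1.0 / np.polyfit(lags[pos] * dz, np.log(acf[pos]), 1)[0]
    print(f"trend slope {a:.2f} MPa/m, scale of fluctuation ~ {2 * c_hat:.2f} m")
    ```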

  19. Base stock system for patient vs impatient customers with varying demand distribution

    NASA Astrophysics Data System (ADS)

    Fathima, Dowlath; Uduman, P. Sheik

    2013-09-01

    An optimal base-stock inventory policy for patient and impatient customers is examined using finite-horizon models. The base-stock system for patient and impatient customers is a distinct type of inventory policy. In Model I, the base stock for the patient-customer case is evaluated using the truncated exponential distribution. Model II studies base-stock inventory policies for impatient customers. A study of these systems reveals that customers either wait until the arrival of the next order or leave the system, which leads to lost sales. In both models, demand during the period [0, t] is taken to be a random variable. In this paper, the truncated exponential distribution satisfies the base-stock policy for the patient customer as a continuous model. Previously, the base stock for impatient customers led to a discrete case; here we model this condition as a continuous case. We justify this approach mathematically as well as numerically.

  20. Financial Data Analysis by means of Coupled Continuous-Time Random Walk in Rachev-Rűschendorf Model

    NASA Astrophysics Data System (ADS)

    Jurlewicz, A.; Wyłomańska, A.; Żebrowski, P.

    2008-09-01

    We adapt the continuous-time random walk formalism to describe asset price evolution. We expand the idea proposed by Rachev and Rűschendorf, who analyzed the binomial pricing model in discrete time with randomization of the number of price changes. As a result, in the framework of the proposed model we obtain a mixture of the Gaussian and a generalized arcsine law as the limiting distribution of log-returns. Moreover, we derive a European call option price that extends the Black-Scholes formula. We apply the obtained theoretical results to model actual financial data and try to show that the continuous-time random walk offers alternative tools to deal with several complex issues of financial markets.
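
    A toy version of the underlying mechanism (not the Rachev-Rűschendorf limit itself) can be simulated directly: exponential waiting times between trades coupled with binomial price jumps, aggregated over a fixed horizon. All parameters below are invented.

    ```python
    # Generic CTRW illustration for log-returns; parameters are invented.
    import numpy as np

    rng = np.random.default_rng(4)
    n_paths, horizon = 2000, 1.0
    log_returns = np.empty(n_paths)
    for i in range(n_paths):
        t, x = 0.0, 0.0
        while True:
            t += rng.exponential(0.01)              # waiting time between trades
            if t > horizon:
                break
            x += 0.01 * rng.choice([-1.0, 1.0])     # binomial price jump
        log_returns[i] = x

    m, v = log_returns.mean(), log_returns.var()
    print("mean:", m, "std:", np.sqrt(v))
    # Excess kurtosis quantifies the departure from a pure Gaussian limit
    print("excess kurtosis:", ((log_returns - m) ** 4).mean() / v**2 - 3)
    ```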

  1. A continuous glucose monitoring and problem-solving intervention to change physical activity behavior in women with type 2 diabetes: a pilot study.

    PubMed

    Allen, Nancy; Whittemore, Robin; Melkus, Gail

    2011-11-01

    Diabetes technology has the potential to provide useful data for theory-based behavioral counseling. The aims of this study are to evaluate the feasibility, acceptability, and preliminary efficacy of a continuous glucose monitoring and problem-solving counseling intervention to change physical activity (PA) behavior in women with type 2 diabetes. Women (n=29) with type 2 diabetes were randomly assigned to one of two treatment conditions: continuous glucose monitoring counseling and problem-solving skills or continuous glucose monitoring counseling and general diabetes education. Feasibility data were obtained on intervention dose, implementation, and satisfaction. Preliminary efficacy data were collected at baseline and 12 weeks on the following measures: PA amount and intensity, diet, problem-solving skills, self-efficacy for PA, depression, hemoglobin A1c, weight, and blood pressure. Demographic and implementation variables were described using frequency distributions and summary statistics. Satisfaction data were analyzed using Wilcoxon rank tests. Differences between groups were analyzed using linear mixed modeling. Women were mostly white/non-Latina with a mean age of 53 years, a 6.5-year history of diabetes, and suboptimal glycemic control. Continuous glucose monitoring plus problem-solving group participants had significantly greater problem-solving skills and had greater, although not statistically significant, dietary adherence, moderate activity minutes, weight loss, and higher intervention satisfaction pre- to post-intervention than did participants in the continuous glucose monitoring plus education group. A continuous glucose monitoring plus problem-solving intervention was feasible and acceptable, and participants had greater problem-solving skills than continuous glucose monitoring plus education group participants.

  2. Reliability analysis of structures under periodic proof tests in service

    NASA Technical Reports Server (NTRS)

    Yang, J.-N.

    1976-01-01

    A reliability analysis of structures subjected to random service loads and periodic proof tests treats gust loads and maneuver loads as random processes. Crack initiation, crack propagation, and strength degradation are treated as the fatigue process. The time to fatigue crack initiation and ultimate strength are random variables. Residual strength decreases during crack propagation, so that failure rate increases with time. When a structure fails under periodic proof testing, a new structure is built and proof-tested. The probability of structural failure in service is derived from treatment of all the random variables, strength degradations, service loads, proof tests, and the renewal of failed structures. Some numerical examples are worked out.

  3. Randomized controlled dissemination study of community-to-clinic navigation to promote CRC screening: Study design and implications.

    PubMed

    Larkey, Linda; Szalacha, Laura; Herman, Patricia; Gonzalez, Julie; Menon, Usha

    2017-02-01

    Regular screening facilitates early diagnosis of colorectal cancer (CRC) and reduction of CRC morbidity and mortality. Screening rates for minorities and low-income populations remain suboptimal. Provider referral for CRC screening is one of the strongest predictors of adherence, but referrals are unlikely among those who have no clinic home (common among poor and minority populations). This group-randomized controlled study will test the effectiveness of an evidence-based tailored messaging intervention in a community-to-clinic navigation context compared with no navigation. Multicultural, underinsured individuals from community sites will be randomized (by site) to receive CRC screening education only, or education plus navigation. In Phase I, those randomized to education plus navigation will be guided to make a clinic appointment to receive a provider referral for CRC screening. Patients attending clinic appointments will continue to receive navigation until screened (Phase II) regardless of initial arm assignment. We hypothesize that those receiving education plus navigation will be more likely to attend clinic appointments (H1) and show higher rates of screening (H2) compared to those receiving education only. Phase I group assignment will be used as a control variable in analysis of screening follow-through in Phase II. Costs per screening achieved will be evaluated for each condition and the RE-AIM framework will be used to examine dissemination results. The novelty of our study design is the translational dissemination model that will allow us to assess the real-world application of an efficacious intervention previously tested in a randomized controlled trial. Copyright © 2016. Published by Elsevier Inc.

  4. Robotic-assisted versus laparoscopic colorectal surgery: a meta-analysis of four randomized controlled trials

    PubMed Central

    2014-01-01

    Background Robotic-assisted laparoscopy is widely performed for colorectal disease. The objective of this meta-analysis was to compare the safety and efficacy of robotic-assisted colorectal surgery (RCS) and laparoscopic colorectal surgery (LCS) for colorectal disease based on randomized controlled trial studies. Methods Literature searches of electronic databases (Pubmed, Web of Science, and Cochrane Library) were performed to identify randomized controlled trial studies that compared the clinical or oncologic outcomes of RCS and LCS. This meta-analysis was performed using the Review Manager (RevMan) software (version 5.2) that is provided by the Cochrane Collaboration. The data used were mean differences and odds ratios for continuous and dichotomous variables, respectively. Fixed-effects or random-effects models were adopted according to heterogeneity. Results Four randomized controlled trial studies were identified for this meta-analysis. In total, 110 patients underwent RCS, and 116 patients underwent LCS. The results revealed that estimated blood losses (EBLs), conversion rates and times to the recovery of bowel function were significantly reduced following RCS compared with LCS. There were no significant differences in complication rates, lengths of hospital stays, proximal margins, distal margins or harvested lymph nodes between the two techniques. Conclusions RCS is a promising technique and is a safe and effective alternative to LCS for colorectal surgery. The advantages of RCS include reduced EBLs, lower conversion rates and shorter times to the recovery of bowel function. Further studies are required to define the financial effects of RCS and the effects of RCS on long-term oncologic outcomes. PMID:24767102
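
    The pooling arithmetic behind such a meta-analysis can be shown in a few lines. The sketch below computes a DerSimonian-Laird random-effects mean difference, the kind of model RevMan applies to continuous outcomes such as estimated blood loss; the per-trial numbers are invented, not taken from the four trials above.

    ```python
    # Illustrative DerSimonian-Laird random-effects pooling (invented data).
    import numpy as np

    md = np.array([-30.0, -55.0, -20.0, -40.0])    # per-trial mean differences
    se = np.array([12.0, 20.0, 10.0, 15.0])        # their standard errors
    w = 1 / se**2                                  # fixed-effect weights

    # Heterogeneity: Cochran's Q and the DL estimate of tau^2
    fixed_mean = np.sum(w * md) / w.sum()
    q = np.sum(w * (md - fixed_mean) ** 2)
    tau2 = max(0.0, (q - (len(md) - 1)) / (w.sum() - np.sum(w**2) / w.sum()))

    w_re = 1 / (se**2 + tau2)                      # random-effects weights
    pooled = np.sum(w_re * md) / w_re.sum()
    se_pooled = np.sqrt(1 / w_re.sum())
    print(f"pooled MD = {pooled:.1f} "
          f"(95% CI {pooled - 1.96 * se_pooled:.1f} to "
          f"{pooled + 1.96 * se_pooled:.1f}), tau^2 = {tau2:.1f}")
    ```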

  5. AUTOCLASSIFICATION OF THE VARIABLE 3XMM SOURCES USING THE RANDOM FOREST MACHINE LEARNING ALGORITHM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farrell, Sean A.; Murphy, Tara; Lo, Kitty K., E-mail: s.farrell@physics.usyd.edu.au

    In the current era of large surveys and massive data sets, autoclassification of astrophysical sources using intelligent algorithms is becoming increasingly important. In this paper we present the catalog of variable sources in the Third XMM-Newton Serendipitous Source catalog (3XMM) autoclassified using the Random Forest machine learning algorithm. We used a sample of manually classified variable sources from the second data release of the XMM-Newton catalogs (2XMMi-DR2) to train the classifier, obtaining an accuracy of ∼92%. We also evaluated the effectiveness of identifying spurious detections using a sample of spurious sources, achieving an accuracy of ∼95%. Manual investigation of a random sample of classified sources confirmed these accuracy levels and showed that the Random Forest machine learning algorithm is highly effective at automatically classifying 3XMM sources. Here we present the catalog of classified 3XMM variable sources. We also present three previously unidentified unusual sources that were flagged as outlier sources by the algorithm: a new candidate supergiant fast X-ray transient, a 400 s X-ray pulsar, and an eclipsing 5 hr binary system coincident with a known Cepheid.

  6. A Probabilistic Design Method Applied to Smart Composite Structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Chamis, Christos C.

    1995-01-01

    A probabilistic design method is described and demonstrated using a smart composite wing. Probabilistic structural design incorporates naturally occurring uncertainties including those in constituent (fiber/matrix) material properties, fabrication variables, structure geometry and control-related parameters. Probabilistic sensitivity factors are computed to identify those parameters that have a great influence on a specific structural reliability. Two performance criteria are used to demonstrate this design methodology. The first criterion requires that the actuated angle at the wing tip be bounded by upper and lower limits at a specified reliability. The second criterion requires that the probability of ply damage due to random impact load be smaller than an assigned value. When the relationship between reliability improvement and the sensitivity factors is assessed, the results show that a reduction in the scatter of the random variable with the largest sensitivity factor (absolute value) provides the lowest failure probability. An increase in the mean of the random variable with a negative sensitivity factor will reduce the failure probability. Therefore, the design can be improved by controlling or selecting distribution parameters associated with random variables. This can be implemented during the manufacturing process to obtain maximum benefit with minimum alterations.

  7. Violation of Bell's Inequality Using Continuous Variable Measurements

    NASA Astrophysics Data System (ADS)

    Thearle, Oliver; Janousek, Jiri; Armstrong, Seiji; Hosseini, Sara; Schünemann Mraz, Melanie; Assad, Syed; Symul, Thomas; James, Matthew R.; Huntington, Elanor; Ralph, Timothy C.; Lam, Ping Koy

    2018-01-01

    A Bell inequality is a fundamental test to rule out local hidden variable model descriptions of correlations between two physically separated systems. There have been a number of experiments in which a Bell inequality has been violated using discrete-variable systems. We demonstrate a violation of Bell's inequality using continuous variable quadrature measurements. By creating a four-mode entangled state with homodyne detection, we recorded a clear violation with a Bell value of B = 2.31 ± 0.02. This opens new possibilities for using continuous variable states for device independent quantum protocols.

  8. Extended q-Gaussian and q-exponential distributions from gamma random variables

    NASA Astrophysics Data System (ADS)

    Budini, Adrián A.

    2015-05-01

    The family of q-Gaussian and q-exponential probability densities fits the statistical behavior of diverse complex self-similar nonequilibrium systems. These distributions, independently of the underlying dynamics, can rigorously be obtained by maximizing Tsallis "nonextensive" entropy under appropriate constraints, as well as from superstatistical models. In this paper we provide an alternative and complementary scheme for deriving these objects. We show that q-Gaussian and q-exponential random variables can always be expressed as a function of two statistically independent gamma random variables with the same scale parameter. Their shape index determines the complexity q parameter. This result also allows us to define an extended family of asymmetric q-Gaussian and modified q-exponential densities, which reduce to the standard ones when the shape parameters are the same. Furthermore, we demonstrate that a simple change of variables always allows relating any of these distributions with a beta stochastic variable. The extended distributions are applied in the statistical description of different complex dynamics such as log-return signals in financial markets and motion of point defects in a fluid flow.
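
    Under the assumption that the relevant function of the two gamma variables is their ratio (so the common scale cancels), the construction can be checked empirically: the ratio follows a beta-prime law, whose power-law tail is of the q-exponential type for q > 1. The shapes below are invented for illustration.

    ```python
    # Empirical check: ratio of two same-scale gammas vs the beta-prime law.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    a, b, theta = 1.0, 3.0, 2.0                  # gamma shapes and common scale
    g1 = rng.gamma(a, theta, size=200_000)
    g2 = rng.gamma(b, theta, size=200_000)
    ratio = g1 / g2                              # the common scale cancels

    # Kolmogorov-Smirnov comparison against the analytic beta-prime CDF
    print(stats.kstest(ratio, stats.betaprime(a, b).cdf))
    ```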

  9. Application of random effects to the study of resource selection by animals

    USGS Publications Warehouse

    Gillies, C.S.; Hebblewhite, M.; Nielsen, S.E.; Krawchuk, M.A.; Aldridge, Cameron L.; Frair, J.L.; Saher, D.J.; Stevens, C.E.; Jerde, C.L.

    2006-01-01

    1. Resource selection estimated by logistic regression is used increasingly in studies to identify critical resources for animal populations and to predict species occurrence.2. Most frequently, individual animals are monitored and pooled to estimate population-level effects without regard to group or individual-level variation. Pooling assumes that both observations and their errors are independent, and resource selection is constant given individual variation in resource availability.3. Although researchers have identified ways to minimize autocorrelation, variation between individuals caused by differences in selection or available resources, including functional responses in resource selection, have not been well addressed.4. Here we review random-effects models and their application to resource selection modelling to overcome these common limitations. We present a simple case study of an analysis of resource selection by grizzly bears in the foothills of the Canadian Rocky Mountains with and without random effects.5. Both categorical and continuous variables in the grizzly bear model differed in interpretation, both in statistical significance and coefficient sign, depending on how a random effect was included. We used a simulation approach to clarify the application of random effects under three common situations for telemetry studies: (a) discrepancies in sample sizes among individuals; (b) differences among individuals in selection where availability is constant; and (c) differences in availability with and without a functional response in resource selection.6. We found that random intercepts accounted for unbalanced sample designs, and models with random intercepts and coefficients improved model fit given the variation in selection among individuals and functional responses in selection. Our empirical example and simulations demonstrate how including random effects in resource selection models can aid interpretation and address difficult assumptions limiting their generality. This approach will allow researchers to appropriately estimate marginal (population) and conditional (individual) responses, and account for complex grouping, unbalanced sample designs and autocorrelation.
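
    The pooling problem raised in points 2-3 is easy to reproduce by simulation. The sketch below, on synthetic data rather than the grizzly bear dataset, fits one logistic regression per individual and one pooled fit, showing how between-individual variation in selection coefficients distorts the pooled estimate. A full random-effects fit would need a mixed-model package; this only illustrates the motivation.

    ```python
    # Simulation sketch of the pooling issue (synthetic, not the paper's data).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(10)
    slopes = rng.normal(1.0, 1.5, size=20)     # per-individual selection coefficients
    xs, ys, per_ind = [], [], []
    for b in slopes:
        x = rng.normal(size=200)               # resource covariate
        y = rng.uniform(size=200) < 1 / (1 + np.exp(-b * x))
        # sklearn's default L2 penalty shrinks estimates slightly
        per_ind.append(LogisticRegression().fit(x[:, None], y).coef_[0, 0])
        xs.append(x)
        ys.append(y)

    pooled = LogisticRegression().fit(
        np.concatenate(xs)[:, None], np.concatenate(ys)).coef_[0, 0]
    print("mean individual slope:", np.mean(per_ind))
    print("pooled slope:", pooled)             # attenuated relative to the mean
    ```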

  10. Application of random effects to the study of resource selection by animals.

    PubMed

    Gillies, Cameron S; Hebblewhite, Mark; Nielsen, Scott E; Krawchuk, Meg A; Aldridge, Cameron L; Frair, Jacqueline L; Saher, D Joanne; Stevens, Cameron E; Jerde, Christopher L

    2006-07-01

    1. Resource selection estimated by logistic regression is used increasingly in studies to identify critical resources for animal populations and to predict species occurrence. 2. Most frequently, individual animals are monitored and pooled to estimate population-level effects without regard to group or individual-level variation. Pooling assumes that both observations and their errors are independent, and resource selection is constant given individual variation in resource availability. 3. Although researchers have identified ways to minimize autocorrelation, variation between individuals caused by differences in selection or available resources, including functional responses in resource selection, have not been well addressed. 4. Here we review random-effects models and their application to resource selection modelling to overcome these common limitations. We present a simple case study of an analysis of resource selection by grizzly bears in the foothills of the Canadian Rocky Mountains with and without random effects. 5. Both categorical and continuous variables in the grizzly bear model differed in interpretation, both in statistical significance and coefficient sign, depending on how a random effect was included. We used a simulation approach to clarify the application of random effects under three common situations for telemetry studies: (a) discrepancies in sample sizes among individuals; (b) differences among individuals in selection where availability is constant; and (c) differences in availability with and without a functional response in resource selection. 6. We found that random intercepts accounted for unbalanced sample designs, and models with random intercepts and coefficients improved model fit given the variation in selection among individuals and functional responses in selection. Our empirical example and simulations demonstrate how including random effects in resource selection models can aid interpretation and address difficult assumptions limiting their generality. This approach will allow researchers to appropriately estimate marginal (population) and conditional (individual) responses, and account for complex grouping, unbalanced sample designs and autocorrelation.

  11. Discrete-continuous variable structural synthesis using dual methods

    NASA Technical Reports Server (NTRS)

    Schmit, L. A.; Fleury, C.

    1980-01-01

    Approximation concepts and dual methods are extended to solve structural synthesis problems involving a mix of discrete and continuous sizing type of design variables. Pure discrete and pure continuous variable problems can be handled as special cases. The basic mathematical programming statement of the structural synthesis problem is converted into a sequence of explicit approximate primal problems of separable form. These problems are solved by constructing continuous explicit dual functions, which are maximized subject to simple nonnegativity constraints on the dual variables. A newly devised gradient projection type of algorithm called DUAL 1, which includes special features for handling dual function gradient discontinuities that arise from the discrete primal variables, is used to find the solution of each dual problem. Computational implementation is accomplished by incorporating the DUAL 1 algorithm into the ACCESS 3 program as a new optimizer option. The power of the method set forth is demonstrated by presenting numerical results for several example problems, including a pure discrete variable treatment of a metallic swept wing and a mixed discrete-continuous variable solution for a thin delta wing with fiber composite skins.

  12. Optimal allocation of testing resources for statistical simulations

    NASA Astrophysics Data System (ADS)

    Quintana, Carolina; Millwater, Harry R.; Singh, Gulshan; Golden, Patrick

    2015-07-01

    Statistical estimates from simulation involve uncertainty caused by the variability in the input random variables due to limited data. Allocating resources to obtain more experimental data of the input variables to better characterize their probability distributions can reduce the variance of statistical estimates. The methodology proposed determines the optimal number of additional experiments required to minimize the variance of the output moments given single or multiple constraints. The method uses multivariate t-distribution and Wishart distribution to generate realizations of the population mean and covariance of the input variables, respectively, given an amount of available data. This method handles independent and correlated random variables. A particle swarm method is used for the optimization. The optimal number of additional experiments per variable depends on the number and variance of the initial data, the influence of the variable in the output function and the cost of each additional experiment. The methodology is demonstrated using a fretting fatigue example.
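
    A rough sketch of the resampling step, with invented numbers: given a small sample of two correlated inputs, draw plausible population covariances (a Wishart draw centred on the sample covariance) and population means (a multivariate t draw), which could then be propagated through the output function. The exact distributional choices below are assumptions for illustration, not the paper's formulation.

    ```python
    # Hedged sketch: realisations of population mean and covariance from data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    data = rng.multivariate_normal([10.0, 5.0], [[4.0, 1.5], [1.5, 2.0]], size=15)
    n, p = data.shape
    xbar, s = data.mean(axis=0), np.cov(data, rowvar=False)

    # One population-covariance realisation: Wishart draw centred on s
    cov_draw = stats.wishart(df=n - 1, scale=s / (n - 1)).rvs(random_state=rng)

    # One population-mean realisation reflecting the limited sample size
    mean_draw = stats.multivariate_t(loc=xbar, shape=s / n, df=n - p).rvs(
        random_state=rng)
    print("mean draw:", mean_draw)
    print("cov draw:\n", cov_draw)
    ```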

  13. A comparison of multiple imputation methods for handling missing values in longitudinal data in the presence of a time-varying covariate with a non-linear association with time: a simulation study.

    PubMed

    De Silva, Anurika Priyanjali; Moreno-Betancur, Margarita; De Livera, Alysha Madhu; Lee, Katherine Jane; Simpson, Julie Anne

    2017-07-25

    Missing data is a common problem in epidemiological studies, and is particularly prominent in longitudinal data, which involve multiple waves of data collection. Traditional multiple imputation (MI) methods (fully conditional specification (FCS) and multivariate normal imputation (MVNI)) treat repeated measurements of the same time-dependent variable as just another 'distinct' variable for imputation and therefore do not make the most of the longitudinal structure of the data. Only a few studies have explored extensions to the standard approaches to account for the temporal structure of longitudinal data. One suggestion is the two-fold fully conditional specification (two-fold FCS) algorithm, which restricts the imputation of a time-dependent variable to time blocks where the imputation model includes measurements taken at the specified and adjacent times. To date, no study has investigated the performance of two-fold FCS and standard MI methods for handling missing data in a time-varying covariate with a non-linear trajectory over time - a commonly encountered scenario in epidemiological studies. We simulated 1000 datasets of 5000 individuals based on the Longitudinal Study of Australian Children (LSAC). Three missing data mechanisms: missing completely at random (MCAR), and a weak and a strong missing at random (MAR) scenarios were used to impose missingness on body mass index (BMI) for age z-scores; a continuous time-varying exposure variable with a non-linear trajectory over time. We evaluated the performance of FCS, MVNI, and two-fold FCS for handling up to 50% of missing data when assessing the association between childhood obesity and sleep problems. The standard two-fold FCS produced slightly more biased and less precise estimates than FCS and MVNI. We observed slight improvements in bias and precision when using a time window width of two for the two-fold FCS algorithm compared to the standard width of one. We recommend the use of FCS or MVNI in a similar longitudinal setting, and when encountering convergence issues due to a large number of time points or variables with missing values, the two-fold FCS with exploration of a suitable time window.
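
    For readers unfamiliar with FCS-style imputation, the sketch below runs scikit-learn's IterativeImputer (chained equations) on synthetic three-wave data with a non-linear trajectory and MCAR missingness. The two-fold time-window restriction discussed above is not implemented; this only shows the standard approach the paper compares against, and all data are invented.

    ```python
    # Minimal chained-equations imputation on synthetic longitudinal data.
    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer

    rng = np.random.default_rng(7)
    n = 500
    waves = np.array([0.0, 1.0, 2.0])              # three measurement waves
    base = rng.normal(size=(n, 1))                 # stable individual level
    bmi = base + 0.8 * waves - 0.2 * waves**2 + rng.normal(0, 0.3, (n, 3))

    mask = rng.uniform(size=bmi.shape) < 0.3       # ~30% missing completely at random
    observed = np.where(mask, np.nan, bmi)

    imputed = IterativeImputer(max_iter=10, random_state=0).fit_transform(observed)
    rmse = np.sqrt(np.mean((imputed[mask] - bmi[mask]) ** 2))
    print("imputed-vs-true RMSE:", rmse)
    ```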

  14. A study of potential pharmacokinetic and pharmacodynamic interactions between dextromethorphan/quinidine and memantine in healthy volunteers.

    PubMed

    Pope, Laura E; Schoedel, Kerri A; Bartlett, Cynthia; Sellers, Edward M

    2012-08-01

    Dextromethorphan/quinidine (DMQ) is the first agent indicated for the treatment of pseudobulbar affect. Dextromethorphan, the active ingredient, is a low-affinity, uncompetitive N-methyl-D-aspartate (NMDA) receptor antagonist. This study evaluated the potential for a drug-drug interaction (DDI) of DMQ with memantine, which is also an NMDA receptor antagonist. This open-label, randomized, parallel-group study enrolled healthy adults who were randomized into one of two treatment groups. Group 1 subjects were administered memantine at a starting dose of 5 mg once daily, which was titrated over a 3-week period to a dose of 10 mg twice daily (every 12 hours) and continued for another 11 days to attain steady state; DMQ 30 mg (dextromethorphan 30 mg/quinidine 30 mg) every 12 hours was then added for a further 8 days. Group 2 subjects received DMQ 30 mg every 12 hours for 8 days to attain steady state; memantine was then added, titrated on the same schedule as in group 1, and continued at 10 mg every 12 hours for an additional 11 days. Pharmacokinetic blood sampling was performed to assess the primary endpoints of the 90% confidence intervals (CIs) for the geometric mean ratios of the areas under the plasma concentration-time curves (AUCs) for memantine, dextromethorphan, dextrorphan - the dextromethorphan metabolite - and quinidine during concomitant therapy versus monotherapy. Safety/tolerability and pharmacodynamic variables were also assessed. A total of 52 subjects were randomized. In both group 1 (n = 23) and group 2 (n = 29), the 90% CIs for the ratios of the AUCs during concomitant therapy versus monotherapy were within the predefined range to indicate similarity (0.8-1.25) for memantine, dextromethorphan and dextrorphan, indicating no pharmacokinetic DDI. The 90% CI for the AUC ratio for quinidine was slightly above the predefined range; however, the mean AUC increased by only 25%. In both groups, incidence of adverse events was similar, and pharmacodynamic variables were either similar or slightly improved with DMQ added to memantine and memantine added to DMQ, compared to monotherapy with either agent. Minimal pharmacokinetic and pharmacodynamic interactions were observed between memantine and DMQ, suggesting they can be coadministered without dose adjustment.

  15. The Randomized CRM: An Approach to Overcoming the Long-Memory Property of the CRM

    PubMed Central

    Koopmeiners, Joseph S.; Wey, Andrew

    2017-01-01

    The primary objective of a phase I clinical trial is to determine the maximum tolerated dose (MTD). Typically, the MTD is identified using a dose-escalation study, where initial subjects are treated at the lowest dose level and subsequent subjects are treated at progressively higher dose levels until the MTD is identified. The continual reassessment method (CRM) is a popular model-based dose-escalation design, which utilizes a formal model for the relationship between dose and toxicity to guide dose-finding. Recently, it was shown that the CRM has a tendency to get “stuck” on a dose-level, with little escalation or de-escalation in the late stages of the trial, due to the long-memory property of the CRM. We propose the randomized CRM (rCRM), which introduces random escalation and de-escalation into the standard CRM dose-finding algorithm, as well as a hybrid approach that incorporates escalation and de-escalation only when certain criteria are met. Our simulation results show that both the rCRM and the hybrid approach reduce the trial-to-trial variability in the number of cohorts treated at the MTD but that the hybrid approach has a more favorable trade-off with respect to the average number treated at the MTD. PMID:28340333

  16. The Randomized CRM: An Approach to Overcoming the Long-Memory Property of the CRM.

    PubMed

    Koopmeiners, Joseph S; Wey, Andrew

    2017-01-01

    The primary objective of a Phase I clinical trial is to determine the maximum tolerated dose (MTD). Typically, the MTD is identified using a dose-escalation study, where initial subjects are treated at the lowest dose level and subsequent subjects are treated at progressively higher dose levels until the MTD is identified. The continual reassessment method (CRM) is a popular model-based dose-escalation design, which utilizes a formal model for the relationship between dose and toxicity to guide dose finding. Recently, it was shown that the CRM has a tendency to get "stuck" on a dose level, with little escalation or de-escalation in the late stages of the trial, due to the long-memory property of the CRM. We propose the randomized CRM (rCRM), which introduces random escalation and de-escalation into the standard CRM dose-finding algorithm, as well as a hybrid approach that incorporates escalation and de-escalation only when certain criteria are met. Our simulation results show that both the rCRM and the hybrid approach reduce the trial-to-trial variability in the number of cohorts treated at the MTD but that the hybrid approach has a more favorable tradeoff with respect to the average number treated at the MTD.
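
    A minimal sketch of the idea, assuming a one-parameter power ("empiric") working model p_i(a) = skeleton_i^exp(a) with a normal prior on a, and an illustrative jitter probability; the actual rCRM and hybrid escalation criteria are specified in the paper, and the usual safety rules (e.g., no dose skipping) are omitted here.

    ```python
    # Sketch of a CRM update with random (de-)escalation in the spirit of the
    # rCRM. Skeleton, prior, true toxicities and the jitter probability are
    # all illustrative.
    import numpy as np

    rng = np.random.default_rng(2)
    skeleton = np.array([0.05, 0.12, 0.25, 0.40, 0.55])  # prior toxicity guesses
    true_tox = np.array([0.08, 0.15, 0.30, 0.45, 0.60])
    target, eps = 0.25, 0.10                             # target DLT rate, jitter prob.
    grid = np.linspace(-3.0, 3.0, 601)                   # grid over model parameter a
    prior = np.exp(-grid**2 / (2 * 2.0))
    prior /= prior.sum()                                 # a ~ N(0, 2), discretized

    doses, tox = [], []
    level = 0
    for cohort in range(10):
        for _ in range(3):                               # cohorts of three subjects
            doses.append(level)
            tox.append(rng.random() < true_tox[level])
        # Power model: p_i(a) = skeleton_i ** exp(a); posterior over a by quadrature
        p = skeleton[np.array(doses)][None, :] ** np.exp(grid)[:, None]
        like = np.prod(np.where(np.array(tox)[None, :], p, 1.0 - p), axis=1)
        post = prior * like
        post /= post.sum()
        est = (skeleton[None, :] ** np.exp(grid)[:, None] * post[:, None]).sum(axis=0)
        level = int(np.argmin(np.abs(est - target)))     # standard CRM dose choice
        if rng.random() < eps:                           # rCRM-style random jitter
            level = int(np.clip(level + rng.choice([-1, 1]), 0, len(skeleton) - 1))
    print("recommended dose level:", level)
    ```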

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dondero, Rachel Elizabeth

    The increased use of Field Programmable Gate Arrays (FPGAs) in critical systems brings new challenges in securing the diversely programmable fabric from cyber-attacks. FPGAs are an inexpensive, efficient, and flexible alternative to Application Specific Integrated Circuits (ASICs), which are becoming increasingly expensive and impractical for low volume manufacturing as technology nodes continue to shrink. Unfortunately, FPGAs are not designed for high security applications, and their high flexibility lends itself to low security and vulnerability to malicious attacks. As with securing an ASIC’s functionality, FPGA programmers can exploit the inherent randomness introduced into hardware structures during fabrication for security applications. Physically Unclonable Functions (PUFs) are one such solution that uses the die-specific variability in hardware fabrication for both secret key generation and verification. PUFs strive to be random, unique, and reliable. In recent years many PUF structures have been presented to try to maximize these three design constraints, reliability being the most difficult of the three to achieve. This thesis presents a new PUF structure that combines two elementary PUF concepts (a bi-stable SRAM PUF and a delay-based arbiter PUF) to create a PUF with increased reliability, while maintaining both random and unique qualities. Properties of the new PUF are discussed, as well as the various design modifications that can be made to tweak the desired performance and overhead.

  18. A Randomized Clinical Trial of Methadone Maintenance for Prisoners: Prediction of Treatment Entry and Completion in Prison

    PubMed Central

    GORDON, MICHAEL S.; KINLOCK, TIMOTHY W.; COUVILLION, KATHRYN A.; SCHWARTZ, ROBERT P.; O’GRADY, KEVIN

    2014-01-01

    The present report is an intent-to-treat analysis involving secondary data drawn from the first randomized clinical trial of prison-initiated methadone in the United States. This study examined predictors of treatment entry and completion in prison. A sample of 211 adult male prerelease inmates with preincarceration heroin dependence was randomly assigned to one of three treatment conditions: counseling only (counseling in prison; n = 70); counseling plus transfer (counseling in prison with transfer to methadone maintenance treatment upon release; n = 70); and counseling plus methadone (methadone maintenance in prison, continued in a community-based methadone maintenance program upon release; n = 71). Entry into prison treatment (p < .01) and completion of prison treatment (p < .001) were significantly predicted by the set of 10 explanatory variables, and the results favored the treatment conditions receiving methadone. The present results indicate that individuals who are older and have longer prison sentences may have better outcomes than younger individuals with shorter sentences, in that they are more likely to enter and complete prison-based treatment. The findings also suggest that, in treating prisoners with prior heroin dependence and in conducting clinical trials, it is important to examine individual characteristics and, possibly, patient preference. PMID:25392605

  19. Universality in a Neutral Evolution Model

    NASA Astrophysics Data System (ADS)

    King, Dawn; Scott, Adam; Maric, Nevena; Bahar, Sonya

    2013-03-01

    Agent-based models are ideal for investigating the complex problems of biodiversity and speciation because they allow for complex interactions between individuals and between individuals and the environment. Presented here is a "null" model that investigates three mating types - assortative, bacterial, and random - in phenotype space, as a function of the percentage of random death δ. Previous work has shown phase transition behavior in an assortative mating model with variable fitness landscapes as the maximum mutation size (μ) was varied (Dees and Bahar, 2010). Similarly, this behavior was recently presented in the work of Scott et al. (submitted), on a completely neutral landscape, for bacterial-like fission as well as for assortative mating. Here, in order to achieve an appropriate "null" hypothesis, the random death process was changed so that each individual, in each generation, has the same probability of death. Results show a continuous nonequilibrium phase transition for the order parameters of the population size and the number of clusters (analogue of species) as δ is varied for three different mutation sizes of the system. The system shows increasing robustness as μ increases. Universality classes and percolation properties of this system are also explored. This research was supported by funding from the University of Missouri Research Board and the James S. McDonnell Foundation.

  20. A Randomized Clinical Trial of Methadone Maintenance for Prisoners: Prediction of Treatment Entry and Completion in Prison.

    PubMed

    Gordon, Michael S; Kinlock, Timothy W; Couvillion, Kathryn A; Schwartz, Robert P; O'Grady, Kevin

    2012-05-01

    The present report is an intent-to-treat analysis involving secondary data drawn from the first randomized clinical trial of prison-initiated methadone in the United States. This study examined predictors of treatment entry and completion in prison. A sample of 211 adult male prerelease inmates with preincarceration heroin dependence was randomly assigned to one of three treatment conditions: counseling only (counseling in prison; n = 70); counseling plus transfer (counseling in prison with transfer to methadone maintenance treatment upon release; n = 70); and counseling plus methadone (methadone maintenance in prison, continued in a community-based methadone maintenance program upon release; n = 71). Entry into prison treatment (p < .01) and completion of prison treatment (p < .001) were significantly predicted by the set of 10 explanatory variables, and the results favored the treatment conditions receiving methadone. The present results indicate that individuals who are older and have longer prison sentences may have better outcomes than younger individuals with shorter sentences, in that they are more likely to enter and complete prison-based treatment. The findings also suggest that, in treating prisoners with prior heroin dependence and in conducting clinical trials, it is important to examine individual characteristics and, possibly, patient preference.

  1. Evaluating physical habitat and water chemistry data from statewide stream monitoring programs to establish least-impacted conditions in Washington State

    USGS Publications Warehouse

    Wilmoth, Siri K.; Irvine, Kathryn M.; Larson, Chad

    2015-01-01

    Various GIS-generated land-use predictor variables, physical habitat metrics, and water chemistry variables from 75 reference streams and 351 randomly sampled sites throughout Washington State were evaluated for effectiveness at discriminating reference from random sites within level III ecoregions. A combination of multivariate clustering and ordination techniques was used. We describe average observed conditions for a subset of predictor variables and propose statistical criteria for establishing reference conditions for stream habitat in Washington. Using these criteria, we determined whether any of the random sites met expectations for reference condition and whether any of the established reference sites failed to meet those expectations. Establishing these criteria will set a benchmark against which future data can be compared.

  2. Non-manipulation quantitative designs.

    PubMed

    Rumrill, Phillip D

    2004-01-01

    The article describes non-manipulation quantitative designs of two types, correlational and causal comparative studies. Both of these designs are characterized by the absence of random assignment of research participants to conditions or groups and non-manipulation of the independent variable. Without random selection or manipulation of the independent variable, no attempt is made to draw causal inferences regarding relationships between independent and dependent variables. Nonetheless, non-manipulation studies play an important role in rehabilitation research, as described in this article. Examples from the contemporary rehabilitation literature are included. Copyright 2004 IOS Press

  3. Morinda citrifolia (Noni) as an Anti-Inflammatory Treatment in Women with Primary Dysmenorrhoea: A Randomised Double-Blind Placebo-Controlled Trial.

    PubMed

    Fletcher, H M; Dawkins, J; Rattray, C; Wharfe, G; Reid, M; Gordon-Strachan, G

    2013-01-01

    Introduction. Noni (Morinda citrifolia) has been used for many years as an anti-inflammatory agent. We tested the efficacy of Noni in women with dysmenorrhea. Method. We did a prospective randomized double-blind placebo-controlled trial in 100 university students aged 18 years and older over three menstrual cycles. Patients were invited to participate and randomly assigned to receive 400 mg Noni capsules or placebo. They were assessed for baseline demographic variables such as age, parity, and BMI. They were also assessed before and after treatment, for pain, menstrual blood loss, and laboratory variables: ESR, hemoglobin, and packed cell volume. Results. Of the 1027 women screened, 100 eligible women were randomized. Of the women completing the study, 42 had been randomized to Noni and 38 to placebo. There were no significant differences in any of the variables at randomization. There were also no significant differences in mean bleeding score or pain score at randomization. Both bleeding and pain scores gradually improved in both groups as the women were observed over three menstrual cycles; however, the improvement was not significantly different in the Noni group when compared to the controls. Conclusion. Noni did not show a reduction in menstrual pain or bleeding when compared to placebo.

  4. Morinda citrifolia (Noni) as an Anti-Inflammatory Treatment in Women with Primary Dysmenorrhoea: A Randomised Double-Blind Placebo-Controlled Trial

    PubMed Central

    Fletcher, H. M.; Dawkins, J.; Rattray, C.; Wharfe, G.; Reid, M.; Gordon-Strachan, G.

    2013-01-01

    Introduction. Noni (Morinda citrifolia) has been used for many years as an anti-inflammatory agent. We tested the efficacy of Noni in women with dysmenorrhea. Method. We did a prospective randomized double-blind placebo-controlled trial in 100 university students aged 18 years and older over three menstrual cycles. Patients were invited to participate and randomly assigned to receive 400 mg Noni capsules or placebo. They were assessed for baseline demographic variables such as age, parity, and BMI. They were also assessed before and after treatment, for pain, menstrual blood loss, and laboratory variables: ESR, hemoglobin, and packed cell volume. Results. Of the 1027 women screened, 100 eligible women were randomized. Of the women completing the study, 42 had been randomized to Noni and 38 to placebo. There were no significant differences in any of the variables at randomization. There were also no significant differences in mean bleeding score or pain score at randomization. Both bleeding and pain scores gradually improved in both groups as the women were observed over three menstrual cycles; however, the improvement was not significantly different in the Noni group when compared to the controls. Conclusion. Noni did not show a reduction in menstrual pain or bleeding when compared to placebo. PMID:23431314

  5. K-Means Algorithm Performance Analysis With Determining The Value Of Starting Centroid With Random And KD-Tree Method

    NASA Astrophysics Data System (ADS)

    Sirait, Kamson; Tulus; Budhiarti Nababan, Erna

    2017-12-01

    Clustering methods that have high accuracy and time efficiency are necessary for the filtering process. One method that has been widely known and applied in clustering is K-Means clustering. In its application, the determination of the initial cluster centers greatly affects the results of the K-Means algorithm. This research discusses the results of K-Means clustering with the starting centroids determined by a random method and by a KD-Tree method. Random initial centroid determination on a dataset of 1000 student academic records, used to classify students at risk of dropping out, gave an SSE value of 952972 for the quality variable and 232.48 for the GPA variable, whereas initial centroid determination by KD-Tree gave an SSE value of 504302 for the quality variable and 214.37 for the GPA variable. The smaller SSE values indicate that K-Means clustering with initial KD-Tree centroid selection has better accuracy than K-Means clustering with random initial centroid selection.
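
    For intuition, the snippet below contrasts the SSE (scikit-learn's inertia_) under two initializations. scikit-learn has no KD-Tree seeding, so k-means++ stands in here for the smarter initializer; the "random" option mirrors the baseline in the abstract.

    ```python
    # Sketch: SSE (inertia) of k-means under random vs. k-means++ seeding;
    # k-means++ is a stand-in for the KD-Tree initializer, which sklearn lacks.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.datasets import make_blobs

    X, _ = make_blobs(n_samples=1000, centers=5, random_state=0)
    for init in ("random", "k-means++"):
        km = KMeans(n_clusters=5, init=init, n_init=1, random_state=0).fit(X)
        print(f"{init:10s} SSE = {km.inertia_:.1f}")
    ```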

  6. Designing management strategies for carbon dioxide storage and utilization under uncertainty using inexact modelling

    NASA Astrophysics Data System (ADS)

    Wang, Yu; Fan, Jie; Xu, Ye; Sun, Wei; Chen, Dong

    2017-06-01

    Effective application of carbon capture, utilization and storage (CCUS) systems could help to alleviate the influence of climate change by reducing carbon dioxide (CO2) emissions. The research objective of this study is to develop an equilibrium chance-constrained programming model with bi-random variables (ECCP model) for supporting the CCUS management system under random circumstances. The major advantage of the ECCP model is that it tackles random variables as bi-random variables with a normal distribution, where the mean values follow a normal distribution. This could avoid irrational assumptions and oversimplifications in the process of parameter design and enrich the theory of stochastic optimization. The ECCP model is solved by an equilibrium chance-constrained programming algorithm, which provides convenience for decision makers to rank the solution set using the natural order of real numbers. The ECCP model is applied to a CCUS management problem, and the solutions could be useful in helping managers to design and generate rational CO2-allocation patterns under complexities and uncertainties.
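
    A toy sketch of a bi-random variable as described above: a normal variable whose mean is itself normally distributed, with a Monte Carlo check of a chance constraint. All numbers are illustrative, not taken from the CCUS case study.

    ```python
    # Sketch: a bi-random emission, i.e. a normal variable whose mean is itself
    # normal, checked against a chance constraint P(emission <= cap) >= alpha.
    import numpy as np

    rng = np.random.default_rng(3)
    mu = rng.normal(100.0, 5.0, size=100_000)   # random mean: N(100, 5^2)
    emission = rng.normal(mu, 10.0)             # bi-random variable: N(mu, 10^2)
    cap, alpha = 125.0, 0.90
    p = (emission <= cap).mean()
    print(f"P(emission <= cap) ~ {p:.3f}; constraint {'holds' if p >= alpha else 'fails'}")
    ```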

  7. Log-normal distribution from a process that is not multiplicative but is additive.

    PubMed

    Mouri, Hideaki

    2013-10-01

    The central limit theorem ensures that a sum of random variables tends to a Gaussian distribution as their total number tends to infinity. However, for a class of positive random variables, we find that the sum tends faster to a log-normal distribution. Although the sum tends eventually to a Gaussian distribution, the distribution of the sum is always close to a log-normal distribution rather than to any Gaussian distribution if the summands are numerous enough. This is in contrast to the current consensus that any log-normal distribution is due to a product of random variables, i.e., a multiplicative process, or equivalently to nonlinearity of the system. In fact, the log-normal distribution is also observable for a sum, i.e., an additive process that is typical of linear systems. We show conditions for such a sum, an analytical example, and an application to random scalar fields such as those of turbulence.
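
    The claim is easy to probe numerically. The sketch below sums heavy-tailed positive (lognormal) summands and compares Kolmogorov-Smirnov distances to fitted normal and lognormal laws; for a moderate number of summands the lognormal fit is markedly closer.

    ```python
    # Numerical check: a sum of positive heavy-tailed summands stays closer to
    # a lognormal than to a Gaussian for a moderate number of terms.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    n_terms, n_sums = 20, 20_000
    sums = np.exp(rng.normal(0.0, 1.2, size=(n_sums, n_terms))).sum(axis=1)

    ks_norm = stats.kstest(sums, "norm", args=(sums.mean(), sums.std())).statistic
    s, loc, scale = stats.lognorm.fit(sums)
    ks_logn = stats.kstest(sums, "lognorm", args=(s, loc, scale)).statistic
    print(f"KS distance to normal: {ks_norm:.3f}, to lognormal: {ks_logn:.3f}")
    ```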

  8. Variable density randomized stack of spirals (VDR-SoS) for compressive sensing MRI.

    PubMed

    Valvano, Giuseppe; Martini, Nicola; Landini, Luigi; Santarelli, Maria Filomena

    2016-07-01

    To develop a 3D sampling strategy based on a stack of variable density spirals for compressive sensing MRI. A random sampling pattern was obtained by rotating each spiral by a random angle and by delaying the gradient waveforms of the different interleaves by a few time steps. A three-dimensional (3D) variable sampling density was obtained by designing different variable density spirals for each slice encoding. The proposed approach was tested with phantom simulations up to a five-fold undersampling factor. Fully sampled 3D datasets of a human knee and of a human brain were obtained from a healthy volunteer. The proposed approach was tested with off-line reconstructions of the knee dataset up to a four-fold acceleration and compared with other noncoherent trajectories. The proposed approach outperformed the standard stack of spirals for various undersampling factors. The level of coherence and the reconstruction quality of the proposed approach were similar to those of other trajectories that, however, require 3D gridding for the reconstruction. The variable density randomized stack of spirals (VDR-SoS) is an easily implementable trajectory that could represent a valid sampling strategy for 3D compressive sensing MRI. It guarantees low levels of coherence without requiring 3D gridding. Magn Reson Med 76:59-69, 2016. © 2015 Wiley Periodicals, Inc.

  9. Random walks and diffusion on networks

    NASA Astrophysics Data System (ADS)

    Masuda, Naoki; Porter, Mason A.; Lambiotte, Renaud

    2017-11-01

    Random walks are ubiquitous in the sciences, and they are interesting from both theoretical and practical perspectives. They are one of the most fundamental types of stochastic processes; can be used to model numerous phenomena, including diffusion, interactions, and opinions among humans and animals; and can be used to extract information about important entities or dense groups of entities in a network. Random walks have been studied for many decades on both regular lattices and (especially in the last couple of decades) on networks with a variety of structures. In the present article, we survey the theory and applications of random walks on networks, restricting ourselves to simple cases of single and non-adaptive random walkers. We distinguish three main types of random walks: discrete-time random walks, node-centric continuous-time random walks, and edge-centric continuous-time random walks. We first briefly survey random walks on a line, and then we consider random walks on various types of networks. We extensively discuss applications of random walks, including ranking of nodes (e.g., PageRank), community detection, respondent-driven sampling, and opinion models such as voter models.
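
    As a concrete instance of the simplest case surveyed, the sketch below simulates a discrete-time random walk on an undirected network and checks the standard result that the stationary visit frequency of a node is degree/(2m).

    ```python
    # Sketch: empirical visit frequencies of a discrete-time random walk on an
    # undirected network converge to degree/(2m), the stationary distribution.
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(5)
    G = nx.karate_club_graph()
    visits = np.zeros(G.number_of_nodes())
    node = 0
    for _ in range(200_000):
        node = rng.choice(list(G.neighbors(node)))    # hop to a uniform neighbor
        visits[node] += 1

    stationary = np.array([d for _, d in G.degree()]) / (2 * G.number_of_edges())
    print("max |empirical - stationary|:", np.abs(visits / visits.sum() - stationary).max())
    ```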

  10. Effect of hydration and continuous urinary drainage on urine production in children.

    PubMed

    Galetseli, Marianthi; Dimitriou, Panagiotis; Tsapra, Helen; Moustaki, Maria; Nicolaidou, Polyxeni; Fretzayas, Andrew

    2008-01-01

    Although urine production depends on numerous physiological variables, there are no quantitative data regarding the effect of bladder decompression, by means of continuous catheter drainage, on urine production. The aim of this study was to investigate this effect. The study was carried out in two stages, each consisting of two phases. The effect of two distinct orally administered amounts of water was recorded, in relation to continuous bladder decompression, on the changes with time of urine volume and the urine production rate. In the first stage, 35 children were randomly divided into two groups and two different hydration schemes (290 and 580 ml of water/m2) were used. After the second urination of Phase 1, continuous drainage was employed in the phase that followed (Phase 2). In the second stage, a group of 10 children participated and Phase 2 was carried out 1 day after the completion of Phase 1. It was shown that the amount of urine produced increased in accordance with the degree of hydration and doubled or tripled with continuous urine drainage by catheter for the same degree of hydration and within the same time interval. This was also true for Stage 2, in which Phase 2 was performed 24 h after Phase 1, indicating that diuresis during Phase 2 (as a result of Phase 1) was negligible. It was shown that during continuous drainage of urine with bladder catheterization there is an increased need for fluids, which should be administered early.

  11. Atomic clocks and the continuous-time random-walk

    NASA Astrophysics Data System (ADS)

    Formichella, Valerio; Camparo, James; Tavella, Patrizia

    2017-11-01

    Atomic clocks play a fundamental role in many fields, most notably they generate Universal Coordinated Time and are at the heart of all global navigation satellite systems. Notwithstanding their excellent timekeeping performance, their output frequency does vary: it can display deterministic frequency drift; diverse continuous noise processes result in nonstationary clock noise (e.g., random-walk frequency noise, modelled as a Wiener process), and the clock frequency may display sudden changes (i.e., "jumps"). Typically, the clock's frequency instability is evaluated by the Allan or Hadamard variances, whose functional forms can identify the different operative noise processes. Here, we show that the Allan and Hadamard variances of a particular continuous-time random-walk, the compound Poisson process, have the same functional form as for a Wiener process with drift. The compound Poisson process, introduced as a model for observed frequency jumps, is an alternative to the Wiener process for modelling random walk frequency noise. This alternate model fits well the behavior of the rubidium clocks flying on GPS Block-IIR satellites. Further, starting from jump statistics, the model can be improved by considering a more general form of continuous-time random-walk, and this could bring new insights into the physics of atomic clocks.
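
    The functional-form claim can be checked numerically. The sketch below builds a compound-Poisson fractional-frequency record (jumps at random times with random sizes; the rate and size are illustrative, not fitted to any clock) and computes a plain non-overlapping Allan variance, which grows linearly in the averaging time, as it would for Wiener (random-walk) frequency noise.

    ```python
    # Sketch: Allan variance of a compound-Poisson fractional-frequency record.
    import numpy as np

    rng = np.random.default_rng(6)
    n, tau0, rate = 2**18, 1.0, 0.01                  # samples, sample time, jumps/tau0
    jump = rng.normal(0.0, 1e-12, size=n) * (rng.random(n) < rate * tau0)
    y = np.cumsum(jump)                               # frequency: random walk by jumps

    def avar(y, m):
        """Non-overlapping Allan variance at averaging time m*tau0."""
        ybar = y[: len(y) // m * m].reshape(-1, m).mean(axis=1)
        return 0.5 * np.mean(np.diff(ybar) ** 2)

    for m in (1, 4, 16, 64, 256):
        print(f"tau = {m * tau0:6.0f}  AVAR = {avar(y, m):.3e}")
    ```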

  12. Estimating mutual information using B-spline functions – an improved similarity measure for analysing gene expression data

    PubMed Central

    Daub, Carsten O; Steuer, Ralf; Selbig, Joachim; Kloska, Sebastian

    2004-01-01

    Background The information theoretic concept of mutual information provides a general framework to evaluate dependencies between variables. In the context of the clustering of genes with similar patterns of expression it has been suggested as a general quantity of similarity to extend commonly used linear measures. Since mutual information is defined in terms of discrete variables, its application to continuous data requires the use of binning procedures, which can lead to significant numerical errors for datasets of small or moderate size. Results In this work, we propose a method for the numerical estimation of mutual information from continuous data. We investigate the characteristic properties arising from the application of our algorithm and show that our approach outperforms commonly used algorithms: The significance, as a measure of the power of distinction from random correlation, is significantly increased. This concept is subsequently illustrated on two large-scale gene expression datasets and the results are compared to those obtained using other similarity measures. C++ source code of our algorithm is available for non-commercial use from kloska@scienion.de upon request. Conclusion The utilisation of mutual information as a similarity measure enables the detection of non-linear correlations in gene expression datasets. Frequently applied linear correlation measures, which are often used on an ad-hoc basis without further justification, are thereby extended. PMID:15339346
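
    For orientation, here is mutual information from continuous data via plain hard binning; the paper's contribution replaces the hard bin indicator with B-spline weights so each data point contributes to several bins, reducing binning artifacts. Only the hard-binning baseline is sketched.

    ```python
    # Sketch: mutual information (in nats) from continuous data via hard binning.
    import numpy as np

    def mutual_information(x, y, bins=10):
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy /= pxy.sum()
        px, py = pxy.sum(axis=1), pxy.sum(axis=0)
        nz = pxy > 0                              # avoid log(0) on empty bins
        return np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz]))

    rng = np.random.default_rng(7)
    x = rng.normal(size=2000)
    print("independent pair:", mutual_information(x, rng.normal(size=2000)))
    print("nonlinear pair  :", mutual_information(x, x**2 + 0.1 * rng.normal(size=2000)))
    ```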

  13. Probabilistic inference using linear Gaussian importance sampling for hybrid Bayesian networks

    NASA Astrophysics Data System (ADS)

    Sun, Wei; Chang, K. C.

    2005-05-01

    Probabilistic inference for Bayesian networks is in general NP-hard using either exact algorithms or approximate methods. However, for very complex networks, only approximate methods such as stochastic sampling can provide a solution under a given time constraint. Several simulation methods are currently available. They include logic sampling (the first proposed stochastic method for Bayesian networks), the likelihood weighting algorithm (the most commonly used simulation method because of its simplicity and efficiency), the Markov blanket scoring method, and the importance sampling algorithm. In this paper, we first briefly review and compare these available simulation methods, then we propose an improved importance sampling algorithm called linear Gaussian importance sampling (LGIS) for general hybrid models. LGIS is aimed at hybrid Bayesian networks consisting of both discrete and continuous random variables with arbitrary distributions. It uses a linear function and Gaussian additive noise to approximate the true conditional probability distribution of a continuous variable given both its parents and evidence in a Bayesian network. One of the most important features of the newly developed method is that it can adaptively learn the optimal importance function from previous samples. We test the inference performance of LGIS using a 16-node linear Gaussian model and a 6-node general hybrid model. The performance comparison with other well-known methods such as junction tree (JT) and likelihood weighting (LW) shows that LGIS is very promising.
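
    The adaptive idea can be miniaturized to a one-dimensional linear-Gaussian model where the posterior is known exactly. The sketch below refits a Gaussian proposal to the weighted samples at each round; this is an illustration of adaptive importance sampling, not the LGIS algorithm itself.

    ```python
    # Sketch: adaptive Gaussian importance sampling on a tiny linear-Gaussian
    # model, X ~ N(0,1) with evidence E = 2X + N(0,1) observed at e = 1; the
    # exact posterior is N(0.4, 0.2), so the estimate can be checked.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(8)
    e = 1.0

    def log_target(x):
        return stats.norm.logpdf(x) + stats.norm.logpdf(e, loc=2.0 * x)

    mu, sigma = 0.0, 1.0                          # initial proposal = prior
    for it in range(4):
        x = rng.normal(mu, sigma, size=5000)
        logw = log_target(x) - stats.norm.logpdf(x, mu, sigma)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        mu = np.sum(w * x)                        # refit proposal to weighted samples
        sigma = np.sqrt(np.sum(w * (x - mu) ** 2))
        print(f"round {it}: posterior mean ~ {mu:.3f} (exact 0.400)")
    ```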

  14. Cortical Components of Reaction-Time during Perceptual Decisions in Humans.

    PubMed

    Dmochowski, Jacek P; Norcia, Anthony M

    2015-01-01

    The mechanisms of perceptual decision-making are frequently studied through measurements of reaction time (RT). Classical sequential-sampling models (SSMs) of decision-making posit RT as the sum of non-overlapping sensory, evidence accumulation, and motor delays. In contrast, recent empirical evidence hints at a continuous-flow paradigm in which multiple motor plans evolve concurrently with the accumulation of sensory evidence. Here we employ a trial-to-trial reliability-based component analysis of encephalographic data acquired during a random-dot motion task to directly image continuous flow in the human brain. We identify three topographically distinct neural sources whose dynamics exhibit contemporaneous ramping to time-of-response, with the rate and duration of ramping discriminating fast and slow responses. Only one of these sources, a parietal component, exhibits dependence on strength-of-evidence. The remaining two components possess topographies consistent with origins in the motor system, and their covariation with RT overlaps in time with the evidence accumulation process. After fitting the behavioral data to a popular SSM, we find that the model decision variable is more closely matched to the combined activity of the three components than to their individual activity. Our results emphasize the role of motor variability in shaping RT distributions on perceptual decision tasks, suggesting that physiologically plausible computational accounts of perceptual decision-making must model the concurrent nature of evidence accumulation and motor planning.

  15. Benefits of a Continuous Ambulatory Peritoneal Dialysis (CAPD) Technique with One Icodextrin-Containing and Two Biocompatible Glucose-Containing Dialysates for Preservation of Residual Renal Function and Biocompatibility in Incident CAPD Patients

    PubMed Central

    2014-01-01

    In a prospective randomized controlled study, the efficacy and safety of a continuous ambulatory peritoneal dialysis (CAPD) technique were evaluated using one icodextrin-containing and two glucose-containing dialysates a day. Eighty incident CAPD patients were randomized to two groups: the GLU group, continuously using four glucose-containing dialysates (n=39), and the ICO group, using one icodextrin-containing and two glucose-containing dialysates (n=41). Variables related to residual renal function (RRF), metabolic and fluid control, dialysis adequacy, and dialysate effluent cancer antigen 125 (CA125) and interleukin 6 (IL-6) levels were measured. The GLU group showed a significant decrease in mean renal urea and creatinine clearance (-Δ1.2±2.9 mL/min/1.73 m2, P=0.027) and urine volume (-Δ363.6±543.0 mL/day, P=0.001) during 12 months, but the ICO group did not (-Δ0.5±2.7 mL/min/1.73 m2, P=0.266; -Δ108.6±543.3 mL/day, P=0.246). Peritoneal glucose absorption and dialysate calorie load were significantly lower in the ICO group than in the GLU group. The dialysate CA125 and IL-6 levels were significantly higher in the ICO group than in the GLU group. Dialysis adequacy, β2-microglobulin clearance and blood pressure did not differ between the two groups. The CAPD technique using one icodextrin-containing and two glucose-containing dialysates tends to better preserve RRF and is more biocompatible, with similar dialysis adequacy, compared to the technique using four glucose-containing dialysates in incident CAPD patients. [Clinical Trial Registry, ISRCTN23727549] PMID:25246739

  16. A comparison of confidence interval methods for the intraclass correlation coefficient in community-based cluster randomization trials with a binary outcome.

    PubMed

    Braschel, Melissa C; Svec, Ivana; Darlington, Gerarda A; Donner, Allan

    2016-04-01

    Many investigators rely on previously published point estimates of the intraclass correlation coefficient rather than on their associated confidence intervals to determine the required size of a newly planned cluster randomized trial. Although confidence interval methods for the intraclass correlation coefficient that can be applied to community-based trials have been developed for a continuous outcome variable, fewer methods exist for a binary outcome variable. The aim of this study is to evaluate confidence interval methods for the intraclass correlation coefficient applied to binary outcomes in community intervention trials enrolling a small number of large clusters. Existing methods for confidence interval construction are examined and compared to a new ad hoc approach based on dividing clusters into a large number of smaller sub-clusters and subsequently applying existing methods to the resulting data. Monte Carlo simulation is used to assess the width and coverage of confidence intervals for the intraclass correlation coefficient based on Smith's large sample approximation of the standard error of the one-way analysis of variance estimator, an inverted modified Wald test for the Fleiss-Cuzick estimator, and intervals constructed using a bootstrap-t applied to a variance-stabilizing transformation of the intraclass correlation coefficient estimate. In addition, a new approach is applied in which clusters are randomly divided into a large number of smaller sub-clusters with the same methods applied to these data (with the exception of the bootstrap-t interval, which assumes large cluster sizes). These methods are also applied to a cluster randomized trial on adolescent tobacco use for illustration. When applied to a binary outcome variable in a small number of large clusters, existing confidence interval methods for the intraclass correlation coefficient provide poor coverage. However, confidence intervals constructed using the new approach combined with Smith's method provide nominal or close to nominal coverage when the intraclass correlation coefficient is small (<0.05), as is the case in most community intervention trials. This study concludes that when a binary outcome variable is measured in a small number of large clusters, confidence intervals for the intraclass correlation coefficient may be constructed by dividing existing clusters into sub-clusters (e.g. groups of 5) and using Smith's method. The resulting confidence intervals provide nominal or close to nominal coverage across a wide range of parameters when the intraclass correlation coefficient is small (<0.05). Application of this method should provide investigators with a better understanding of the uncertainty associated with a point estimator of the intraclass correlation coefficient used for determining the sample size needed for a newly designed community-based trial. © The Author(s) 2015.
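
    A rough sketch of the sub-cluster device, assuming the one-way ANOVA estimator of the ICC; Smith's large-sample standard error (and hence the interval itself) is omitted, and the binary data generator is a simple random-p stand-in calibrated so that the true ICC is 0.03.

    ```python
    # Sketch: one-way ANOVA ICC after splitting large clusters into
    # sub-clusters of size 5, as in the approach proposed above.
    import numpy as np

    def anova_icc(groups):
        k, m = len(groups), len(groups[0])
        means = np.array([g.mean() for g in groups])
        grand = np.concatenate(groups).mean()
        msb = m * np.sum((means - grand) ** 2) / (k - 1)
        msw = sum(((g - g.mean()) ** 2).sum() for g in groups) / (k * (m - 1))
        return (msb - msw) / (msb + (m - 1) * msw)

    rng = np.random.default_rng(9)
    rho, n_clusters, size = 0.03, 8, 400                 # few large clusters
    sd = np.sqrt(rho * 0.3 * 0.7)                        # var(p) chosen so ICC = rho
    p_cluster = np.clip(rng.normal(0.3, sd, n_clusters), 0.0, 1.0)
    data = [rng.binomial(1, p, size).astype(float) for p in p_cluster]

    subs = [c for big in data for c in np.split(big, size // 5)]  # sub-clusters of 5
    print("ICC estimate from sub-clusters:", round(anova_icc(subs), 4))
    ```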

  17. Cognitive Improvement of Attention and Inhibition in the Late Afternoon in Children With Attention-Deficit Hyperactivity Disorder (ADHD) Treated With Osmotic-Release Oral System Methylphenidate.

    PubMed

    Slama, Hichem; Fery, Patrick; Verheulpen, Denis; Vanzeveren, Nathalie; Van Bogaert, Patrick

    2015-07-01

    Long-acting medications have been developed and approved for the treatment of attention-deficit hyperactivity disorder (ADHD). These compounds are intended to optimize and maintain symptom control throughout the day. We tested the prolonged effects of osmotic-release oral system methylphenidate on both attention and inhibition in the late afternoon. A double-blind, randomized, placebo-controlled study was conducted in 36 boys (7-12 years) with ADHD and 40 typically developing children. The ADHD children received an individualized dose of placebo or osmotic-release oral system methylphenidate. They were tested about 8 hours after intake with two continuous performance tests (continuous performance test-X [CPT-X] and continuous performance test-AX [CPT-AX]) and a counting Stroop. A positive effect of osmotic-release oral system methylphenidate was present in CPT-AX, with faster and less variable reaction times under osmotic-release oral system methylphenidate than under placebo, and no difference from typically developing children. In the counting Stroop, we found decreased interference with osmotic-release oral system methylphenidate but no difference between children with ADHD under placebo and typically developing children. © The Author(s) 2014.

  18. A continuous quality improvement project to improve the quality of cervical Papanicolaou smears.

    PubMed

    Burkman, R T; Ward, R; Balchandani, K; Kini, S

    1994-09-01

    To improve the quality of cervical Papanicolaou smears by continuous quality improvement techniques. The study used a Papanicolaou smear database of over 200,000 specimens collected between June 1988 and December 1992. A team approach employing techniques such as process flow-charting, cause-and-effect diagrams, run charts, and a randomized trial of collection methods was used to evaluate potential causes of Papanicolaou smear reports with the notation "inadequate" or "less than optimal" due to too few or absent endocervical cells. Once a key process variable (method of collection) was identified, the proportion of Papanicolaou smears with inadequate or absent endocervical cells was determined before and after employment of a collection technique using a spatula and Cytobrush. We measured the rate of less than optimal Papanicolaou smears due to too few or absent endocervical cells. Before the new collection technique was fully implemented in June 1990, the overall rate of less than optimal cervical Papanicolaou smears ranged from 20-25%; by December 1993, it had stabilized at about 10%. Continuous quality improvement can be used successfully to study a clinical process and implement change that will lead to improvement.

  19. Is the Non-Dipole Magnetic Field Random?

    NASA Technical Reports Server (NTRS)

    Walker, Andrew D.; Backus, George E.

    1996-01-01

    Statistical modelling of the Earth's magnetic field B has a long history. In particular, the spherical harmonic coefficients of scalar fields derived from B can be treated as Gaussian random variables. In this paper, we give examples of highly organized fields whose spherical harmonic coefficients pass tests for independent Gaussian random variables. The fact that coefficients at some depth may be usefully summarized as independent samples from a normal distribution need not imply that there really is some physical, random process at that depth. In fact, the field can be extremely structured and still be regarded for some purposes as random. Here, we examine the radial magnetic field B(sub r) produced by the core, but the results apply to any scalar field on the core-mantle boundary (CMB) which determines B outside the CMB.
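
    The point that structure can pass randomness tests is easy to reproduce: the deterministic sequence below (normal quantiles of an equally spaced grid, shuffled) is about as organized as data can be, yet a standard normality test does not reject it.

    ```python
    # Sketch: a deterministic, highly organized sequence passes a standard
    # test for Gaussianity, echoing the paper's point.
    import numpy as np
    from scipy import stats

    g = stats.norm.ppf((np.arange(1, 301) - 0.5) / 300)  # perfect normal scores
    rng = np.random.default_rng(13)
    rng.shuffle(g)
    print("normaltest p-value:", stats.normaltest(g).pvalue)
    ```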

  20. Random Walks in a One-Dimensional Lévy Random Environment

    NASA Astrophysics Data System (ADS)

    Bianchi, Alessandra; Cristadoro, Giampaolo; Lenci, Marco; Ligabò, Marilena

    2016-04-01

    We consider a generalization of a one-dimensional stochastic process known in the physical literature as Lévy-Lorentz gas. The process describes the motion of a particle on the real line in the presence of a random array of marked points, whose nearest-neighbor distances are i.i.d. and long-tailed (with finite mean but possibly infinite variance). The motion is a continuous-time, constant-speed interpolation of a symmetric random walk on the marked points. We first study the quenched random walk on the point process, proving the CLT and the convergence of all the accordingly rescaled moments. Then we derive the quenched and annealed CLTs for the continuous-time process.
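
    A quenched version of the setup is straightforward to simulate. The sketch below plants marked points with i.i.d. Pareto gaps (tail index 1.5: finite mean, infinite variance) and looks at the walker's displacement after n steps of a symmetric walk on the points; the constant-speed interpolation is not simulated, and the printed spread is only a rough numerical companion to the CLTs proved in the paper.

    ```python
    # Sketch of the quenched Lévy-Lorentz setup: a symmetric random walk on
    # marked points whose gaps are i.i.d. Pareto (finite mean, long-tailed).
    import numpy as np

    rng = np.random.default_rng(10)
    gaps = rng.pareto(1.5, size=400_001) + 1.0      # tail index 1.5
    points = np.cumsum(gaps)
    points -= points[len(points) // 2]              # walker starts at the middle point

    def displacement(n_steps, n_walks):
        # Final index offset of a symmetric +/-1 walk is 2*Binomial(n, 1/2) - n
        offsets = 2 * rng.binomial(n_steps, 0.5, size=n_walks) - n_steps
        return points[len(points) // 2 + offsets]

    for n in (1000, 4000, 16000):
        d = displacement(n, 5000)
        print(f"steps = {n:6d}  std of displacement = {d.std():10.1f}")
    ```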

  1. Are glucose levels, glucose variability and autonomic control influenced by inspiratory muscle exercise in patients with type 2 diabetes? Study protocol for a randomized controlled trial.

    PubMed

    Schein, Aso; Correa, Aps; Casali, Karina Rabello; Schaan, Beatriz D

    2016-01-20

    Physical exercise reduces glucose levels and glucose variability in patients with type 2 diabetes. Acute inspiratory muscle exercise has been shown to reduce these parameters in a small group of patients with type 2 diabetes, but these results have yet to be confirmed in a well-designed study. The aim of this study is to investigate the effect of acute inspiratory muscle exercise on glucose levels, glucose variability, and cardiovascular autonomic function in patients with type 2 diabetes. This study will use a randomized clinical trial crossover design. A total of 14 subjects will be recruited and randomly allocated to two groups to perform acute inspiratory muscle loading at 2 % of maximal inspiratory pressure (PImax, placebo load) or 60 % of PImax (experimental load). Inspiratory muscle training could be a novel exercise modality to be used to decrease glucose levels and glucose variability. ClinicalTrials.gov NCT02292810.

  2. Meteorological variables to aid forecasting deep slab avalanches on persistent weak layers

    USGS Publications Warehouse

    Marienthal, Alex; Hendrikx, Jordy; Birkeland, Karl; Irvine, Kathryn M.

    2015-01-01

    Deep slab avalanches are particularly challenging to forecast. These avalanches are difficult to trigger, yet when they release they tend to propagate far and can result in large and destructive avalanches. We utilized a 44-year record of avalanche control and meteorological data from Bridger Bowl ski area in southwest Montana to test the usefulness of meteorological variables for predicting seasons and days with deep slab avalanches. We defined deep slab avalanches as those that failed on persistent weak layers deeper than 0.9 m, and that occurred after February 1st. Previous studies often used meteorological variables from days prior to avalanches, but we also considered meteorological variables over the early months of the season. We used classification trees and random forests for our analyses. Our results showed seasons with either dry or wet deep slabs on persistent weak layers typically had less precipitation from November through January than seasons without deep slabs on persistent weak layers. Days with deep slab avalanches on persistent weak layers often had warmer minimum 24-hour air temperatures, and more precipitation over the prior seven days, than days without deep slabs on persistent weak layers. Days with deep wet slab avalanches on persistent weak layers were typically preceded by three days of above freezing air temperatures. Seasonal and daily meteorological variables were found useful to aid forecasting dry and wet deep slab avalanches on persistent weak layers, and should be used in combination with continuous observation of the snowpack and avalanche activity.
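
    In the spirit of the analysis described (classification trees and random forests on daily and seasonal meteorological summaries), here is a self-contained sketch with synthetic data; the feature names and effect sizes are hypothetical stand-ins for the Bridger Bowl record.

    ```python
    # Sketch: random forest on synthetic daily summaries of avalanche weather.
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(11)
    n = 2000
    X = pd.DataFrame({
        "min_temp_24h": rng.normal(-8.0, 6.0, n),      # deg C; warmer -> more risk
        "precip_7day": rng.gamma(2.0, 10.0, n),        # mm water equivalent
        "nov_jan_precip": rng.gamma(3.0, 60.0, n),     # early-season total, mm
    })
    logit = -1.0 + 0.08 * X.min_temp_24h + 0.03 * X.precip_7day - 0.004 * X.nov_jan_precip
    y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))   # day with a deep slab event

    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
    rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(Xtr, ytr)
    print("test accuracy:", round(rf.score(Xte, yte), 3))
    print(dict(zip(X.columns, rf.feature_importances_.round(3))))
    ```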

  3. Fast and Accurate Multivariate Gaussian Modeling of Protein Families: Predicting Residue Contacts and Protein-Interaction Partners

    PubMed Central

    Feinauer, Christoph; Procaccini, Andrea; Zecchina, Riccardo; Weigt, Martin; Pagnani, Andrea

    2014-01-01

    In the course of evolution, proteins show a remarkable conservation of their three-dimensional structure and their biological function, leading to strong evolutionary constraints on the sequence variability between homologous proteins. Our method aims at extracting such constraints from rapidly accumulating sequence data, and thereby at inferring protein structure and function from sequence information alone. Recently, global statistical inference methods (e.g. direct-coupling analysis, sparse inverse covariance estimation) have achieved a breakthrough towards this aim, and their predictions have been successfully implemented into tertiary and quaternary protein structure prediction methods. However, due to the discrete nature of the underlying variable (amino-acids), exact inference requires exponential time in the protein length, and efficient approximations are needed for practical applicability. Here we propose a very efficient multivariate Gaussian modeling approach as a variant of direct-coupling analysis: the discrete amino-acid variables are replaced by continuous Gaussian random variables. The resulting statistical inference problem is efficiently and exactly solvable. We show that the quality of inference is comparable or superior to the one achieved by mean-field approximations to inference with discrete variables, as done by direct-coupling analysis. This is true for (i) the prediction of residue-residue contacts in proteins, and (ii) the identification of protein-protein interaction partners in bacterial signal transduction. An implementation of our multivariate Gaussian approach is available at the website http://areeweb.polito.it/ricerca/cmp/code. PMID:24663061
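
    The core computation is compact enough to sketch: one-hot encode the alignment, regularize the empirical covariance, and read couplings off the precision matrix, scoring residue pairs by the Frobenius norm of the corresponding block. The planted duplicate column stands in for a strongly coupled pair; the published method adds refinements (e.g., reweighting of similar sequences) omitted here.

    ```python
    # Sketch: Gaussian direct-coupling analysis on a synthetic alignment with a
    # planted correlated pair of positions (2, 7).
    import numpy as np

    rng = np.random.default_rng(14)
    n_seq, L, q = 500, 12, 4                        # sequences, length, alphabet
    msa = rng.integers(0, q, size=(n_seq, L))
    msa[:, 7] = msa[:, 2]                           # plant the coupled pair

    onehot = np.eye(q)[msa].reshape(n_seq, L * q)
    cov = np.cov(onehot, rowvar=False) + 0.1 * np.eye(L * q)  # ridge keeps it invertible
    J = np.linalg.inv(cov)                          # Gaussian couplings

    scores = np.array([[np.linalg.norm(J[i*q:(i+1)*q, j*q:(j+1)*q]) if i != j else 0.0
                        for j in range(L)] for i in range(L)])
    i, j = np.unravel_index(np.argmax(scores), scores.shape)
    print("top coupled pair:", (i, j))              # expect (2, 7) or (7, 2)
    ```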

  4. Theory of Financial Risk and Derivative Pricing

    NASA Astrophysics Data System (ADS)

    Bouchaud, Jean-Philippe; Potters, Marc

    2009-01-01

    Foreword; Preface; 1. Probability theory: basic notions; 2. Maximum and addition of random variables; 3. Continuous time limit, Ito calculus and path integrals; 4. Analysis of empirical data; 5. Financial products and financial markets; 6. Statistics of real prices: basic results; 7. Non-linear correlations and volatility fluctuations; 8. Skewness and price-volatility correlations; 9. Cross-correlations; 10. Risk measures; 11. Extreme correlations and variety; 12. Optimal portfolios; 13. Futures and options: fundamental concepts; 14. Options: hedging and residual risk; 15. Options: the role of drift and correlations; 16. Options: the Black and Scholes model; 17. Options: some more specific problems; 18. Options: minimum variance Monte-Carlo; 19. The yield curve; 20. Simple mechanisms for anomalous price statistics; Index of most important symbols; Index.

  5. Theory of Financial Risk and Derivative Pricing - 2nd Edition

    NASA Astrophysics Data System (ADS)

    Bouchaud, Jean-Philippe; Potters, Marc

    2003-12-01

    Foreword; Preface; 1. Probability theory: basic notions; 2. Maximum and addition of random variables; 3. Continuous time limit, Ito calculus and path integrals; 4. Analysis of empirical data; 5. Financial products and financial markets; 6. Statistics of real prices: basic results; 7. Non-linear correlations and volatility fluctuations; 8. Skewness and price-volatility correlations; 9. Cross-correlations; 10. Risk measures; 11. Extreme correlations and variety; 12. Optimal portfolios; 13. Futures and options: fundamental concepts; 14. Options: hedging and residual risk; 15. Options: the role of drift and correlations; 16. Options: the Black and Scholes model; 17. Options: some more specific problems; 18. Options: minimum variance Monte-Carlo; 19. The yield curve; 20. Simple mechanisms for anomalous price statistics; Index of most important symbols; Index.

  6. Update on the therapy of Behçet disease

    PubMed Central

    Saleh, Zeinab

    2014-01-01

    Behçet disease is a chronic inflammatory systemic disorder, characterized by a relapsing and remitting course. It manifests with oral and genital ulcerations, skin lesions, uveitis, and vascular, central nervous system and gastrointestinal involvement. The main histopathological finding is a widespread vasculitis of the arteries and veins of any size. The cause of this disease is presumed to be multifactorial involving infectious triggers, genetic predisposition, and dysregulation of the immune system. As the clinical expression of Behçet disease is heterogeneous, pharmacological therapy is variable and depends largely on the severity of the disease and organ involvement. Treatment of Behçet disease continues to be based largely on anecdotal case reports, case series, and a few randomized clinical trials. PMID:24790727

  7. Comparative Efficacy and Durability of Continuation Phase Cognitive Therapy for Preventing Recurrent Depression: Design of a Double-Blinded, Fluoxetine- and Pill-Placebo–Controlled, Randomized Trial with 2-Year Follow-up

    PubMed Central

    Thase, Michael E.

    2010-01-01

    Background Major depressive disorder (MDD) is highly prevalent and associated with disability and chronicity. Although cognitive therapy (CT) is an effective short-term treatment for MDD, a significant proportion of responders subsequently suffer relapses or recurrences. Purpose This design prospectively evaluates: 1) a method to discriminate CT-treated responders at lower versus higher risk for relapse; and 2) the subsequent durability of 8-month continuation phase therapies in randomized higher risk responders followed for an additional 24-months. The primary prediction is: after protocol treatments are stopped, higher risk patients randomly assigned to continuation phase CT (C-CT) will have a lower risk of relapse/recurrence than those randomized to fluoxetine (FLX). Methods Outpatients, aged 18 to 70 years, with recurrent MDD received 12–14 weeks of CT provided by 15 experienced therapists from two sites. Responders (i.e., no MDD and 17-item Hamilton Rating Scale for Depression ≤ 12) were stratified into higher and lower risk groups based on stability of remission during the last 6 weeks of CT. The lower risk group entered follow-up for 32 months; the higher risk group was randomized to 8 months of continuation phase therapy with either C-CT or clinical management plus either double-blinded FLX or pill placebo. Following the continuation phase, higher risk patients were followed by blinded evaluators for 24 months. Results The trial began in 2000. Enrollment is complete (N=523). The follow-up continues. Conclusions The trial evaluates the preventive effects and durability of acute and continuation phase treatments in the largest known sample of CT responders collected worldwide. PMID:20451668

  8. Nonergodic property of the space-time coupled CTRW: Dependence on the long-tailed property and correlation

    NASA Astrophysics Data System (ADS)

    Liu, Jian; Li, Baohe; Chen, Xiaosong

    2018-02-01

    The space-time coupled continuous time random walk model is a stochastic framework of anomalous diffusion with many applications in physics, geology and biology. In this manuscript, the time averaged mean squared displacement and the nonergodic property of a space-time coupled continuous time random walk model are studied; the model is a prototype of the coupled continuous time random walk that has been presented and researched intensively with various methods. The results in the present manuscript show that the time averaged mean squared displacements increase linearly with lag time, which means that ergodicity breaking occurs. In addition, we find that the diffusion coefficient is intrinsically random and shows both aging and enhancement; the analysis indicates that the aging and enhancement phenomena are determined by the competition between the correlation exponent γ and the waiting time's long-tailed index α.
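
    As a numerical companion to the quantity under study, the sketch below generates an uncoupled CTRW trajectory with heavy-tailed waiting times (a stand-in for the space-time coupled model analyzed in the paper) and computes the time-averaged mean squared displacement over a sliding window, whose amplitude varies from trajectory to trajectory.

    ```python
    # Sketch: time-averaged MSD of one CTRW trajectory sampled on a regular
    # grid. Waiting times are Pareto with index alpha < 1; jumps are Gaussian.
    import numpy as np

    rng = np.random.default_rng(12)
    alpha, n_jumps, T = 0.7, 200_000, 1e6
    t_jump = np.cumsum(rng.pareto(alpha, n_jumps) + 1.0)   # long-tailed waiting times
    x_path = np.concatenate(([0.0], np.cumsum(rng.normal(size=n_jumps))))

    grid = np.linspace(0.0, T, 10_000)
    x = x_path[np.searchsorted(t_jump, grid, side="right")]  # position on the grid

    def tamsd(x, lag):
        """Time-averaged MSD at a lag measured in grid steps."""
        return np.mean((x[lag:] - x[:-lag]) ** 2)

    for lag in (1, 4, 16, 64):
        print(f"lag = {lag:3d} grid steps  TAMSD = {tamsd(x, lag):.2f}")
    ```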

  9. CONCEPTT: Continuous Glucose Monitoring in Women with Type 1 Diabetes in Pregnancy Trial: A multi-center, multi-national, randomized controlled trial - Study protocol.

    PubMed

    Feig, Denice S; Asztalos, Elizabeth; Corcoy, Rosa; De Leiva, Alberto; Donovan, Lois; Hod, Moshe; Jovanovic, Lois; Keely, Erin; Kollman, Craig; McManus, Ruth; Murphy, Kellie; Ruedy, Katrina; Sanchez, J Johanna; Tomlinson, George; Murphy, Helen R

    2016-07-18

    Women with type 1 diabetes strive for optimal glycemic control before and during pregnancy to avoid adverse obstetric and perinatal outcomes. For most women, optimal glycemic control is challenging to achieve and maintain. The aim of this study is to determine whether the use of real-time continuous glucose monitoring (RT-CGM) will improve glycemic control in women with type 1 diabetes who are pregnant or planning pregnancy. A multi-center, open label, randomized, controlled trial of women with type 1 diabetes who are either planning pregnancy with an HbA1c of 7.0 % to ≤10.0 % (53 to ≤ 86 mmol/mol) or are in early pregnancy (<13 weeks 6 days) with an HbA1c of 6.5 % to ≤10.0 % (48 to ≤ 86 mmol/mol). Participants will be randomized to either RT-CGM alongside conventional intermittent home glucose monitoring (HGM), or HGM alone. Eligible women will wear a CGM which does not display the glucose result for 6 days during the run-in phase. To be eligible for randomization, a minimum of 4 HGM measurements per day and a minimum of 96 hours total with 24 hours overnight (11 pm-7 am) of CGM glucose values are required. Those meeting these criteria are randomized to RT-CGM or HGM. A total of 324 women will be recruited (110 planning pregnancy, 214 pregnant). This takes into account 15 and 20 % attrition rates for the planning pregnancy and pregnant cohorts and will detect a clinically relevant 0.5 % difference between groups at 90 % power with 5 % significance. Randomization will stratify for type of insulin treatment (pump or multiple daily injections) and baseline HbA1c. Analyses will be performed according to intention to treat. The primary outcome is the change in glycemic control as measured by HbA1c from baseline to 24 weeks or conception in women planning pregnancy, and from baseline to 34 weeks gestation during pregnancy. Secondary outcomes include maternal hypoglycemia, CGM time in, above and below target (3.5-7.8 mmol/l), glucose variability measures, maternal and neonatal outcomes. This will be the first international multicenter randomized controlled trial to evaluate the impact of RT-CGM before and during pregnancy in women with type 1 diabetes. ClinicalTrials.gov Identifier: NCT01788527. Registration Date: December 19, 2012.

  10. Efficacy of Lisdexamfetamine in Adults With Moderate to Severe Binge-Eating Disorder: A Randomized Clinical Trial.

    PubMed

    Hudson, James I; McElroy, Susan L; Ferreira-Cornwell, M Celeste; Radewonuk, Jana; Gasior, Maria

    2017-09-01

    The ability of pharmacotherapies to prevent relapse and maintain efficacy with long-term treatment in psychiatric conditions is important. To assess lisdexamfetamine dimesylate maintenance of efficacy in adults with moderate to severe binge-eating disorder. A multinational, phase 3, double-blind, placebo-controlled, randomized withdrawal study including 418 participants was conducted at 49 clinical research study sites from January 27, 2014, to April 8, 2015. Eligible adults met DSM-IV-R binge-eating disorder criteria and had moderate to severe binge eating disorder (≥3 binge-eating days per week for 14 days before open-label baseline; Clinical Global Impressions-Severity [CGI-S] scores ≥4 [moderate severity] at screening and open-label baseline). Following a 12-week, open-label phase (dose optimization, 4 weeks [lisdexamfetamine dimesylate, 50 or 70 mg]; dose maintenance, 8 weeks), lisdexamfetamine responders (≤1 binge eating day per week for 4 consecutive weeks and CGI-S scores ≤2 at week 12) were randomized to placebo or continued lisdexamfetamine during a 26-week, double-blind, randomized withdrawal phase. Lisdexamfetamine administration. The primary outcome variable, time to relapse (≥2 binge-eating days per week for 2 consecutive weeks and ≥2-point CGI-S score increases from randomized withdrawal baseline), was analyzed using a log-rank test (primary analysis); the analysis was stratified for dichotomized 4-week cessation status. Safety assessments included treatment-emergent adverse events. Of the 418 participants enrolled in the open-label phase of the study, 411 (358 [87.1%] women; mean [SD] age, 38.3 [10.4] years) were included in the safety analysis set. Of 275 randomized lisdexamfetamine responders (placebo, n = 138; lisdexamfetamine, n = 137), the observed proportions of participants meeting relapse criteria were 3.7% (5 of 136) for lisdexamfetamine and 32.1% (42 of 131) for placebo. Lisdexamfetamine demonstrated superiority over placebo on the log-rank test (χ²(1) = 40.37; P < .001) for time to relapse; the hazard ratio, based on a Cox proportional hazards model for lisdexamfetamine vs placebo, was 0.09 (95% CI, 0.04-0.23). The treatment-emergent adverse events observed were generally consistent with the known profile of lisdexamfetamine. Risk of binge-eating relapse over 6 months was lower in participants continuing lisdexamfetamine than in those randomized to placebo. The hazard for relapse was lower with lisdexamfetamine than placebo. clinicaltrials.gov Identifier: NCT02009163.

  11. [Development and validation of quality standards for colonoscopy].

    PubMed

    Sánchez Del Río, Antonio; Baudet, Juan Salvador; Naranjo Rodríguez, Antonio; Campo Fernández de Los Ríos, Rafael; Salces Franco, Inmaculada; Aparicio Tormo, Jose Ramón; Sánchez Muñoz, Diego; Llach, Joseph; Hervás Molina, Antonio; Parra-Blanco, Adolfo; Díaz Acosta, Juan Antonio

    2010-01-30

    Before starting programs for colorectal cancer screening it is necessary to evaluate the quality of colonoscopy. Our objectives were to develop a set of easily applicable quality indicators for colonoscopy and to determine the variability of their achievement. After reviewing the literature we prepared 21 potential quality indicators that were submitted to a selection process in which we measured their face validity, content validity, reliability, and feasibility of measurement. We estimated the variability of their achievement by means of the coefficient of variation (CV) and the variability of the achievement of the standards by means of χ² tests. Six indicators passed the selection process: informed consent, medication administered, completed colonoscopy, complications, every polyp removed and recovered, and adenoma detection rate in patients older than 50 years. A total of 1,928 colonoscopies from eight endoscopy units were included. Every unit contributed the same number of colonoscopies, selected by means of simple random sampling with replacement. There was important variability in the achievement of some indicators and standards: medication administered (CV 43%, p<0.01), complications registered (CV 37%, p<0.01), every polyp removed and recovered (CV 12%, p<0.01) and adenoma detection rate in patients older than 50 years (CV 2%, p<0.01). We have validated six easily measurable quality indicators for colonoscopy. Important variability exists in the achievement of some indicators and standards. Our data highlight the importance of developing continuous quality improvement programmes for colonoscopy before starting colorectal cancer screening. Copyright (c) 2009 Elsevier España, S.L. All rights reserved.

  12. Biological community structure on patch reefs in Biscayne National Park, FL, USA

    USGS Publications Warehouse

    Kuffner, Ilsa B.; Grober-Dunsmore, Rikki; Brock, John C.; Hickey, T. Don

    2010-01-01

    Coral reef ecosystem management benefits from continual quantitative assessment of the resources being managed, plus assessment of factors that affect distribution patterns of organisms in the ecosystem. In this study, we investigate the relationships among physical, benthic, and fish variables in an effort to help explain the distribution patterns of organisms on patch reefs within Biscayne National Park, FL, USA. We visited a total of 196 randomly selected sampling stations on 12 shallow (<10 m) patch reefs and measured physical variables (e.g., substratum rugosity, substratum type) and benthic and fish community variables. We also incorporated data on substratum rugosity collected remotely via airborne laser surveying (Experimental Advanced Airborne Research Lidar—EAARL). Across all stations, only weak relationships were found between physical, benthic cover, and fish assemblage variables. Much of the variance was attributable to a “reef effect,” meaning that community structure and organism abundances were more variable at stations among reefs than within reefs. However, when the reef effect was accounted for and removed statistically, patterns were detected. Within reefs, juvenile scarids were most abundant at stations with high coverage of the fleshy macroalgae Dictyota spp., and the calcified alga Halimeda tuna was most abundant at stations with low EAARL rugosity. Explanations for the overwhelming importance of “reef” in explaining variance in our dataset could include the stochastic arrangement of organisms on patch reefs related to variable larval recruitment in space and time and/or strong historical effects due to patchy disturbances (e.g., hurricanes, fishing), as well as legacy effects of prior residents (“priority” effects).

  13. Combined use of intravenous and topical versus intravenous tranexamic acid in primary total joint arthroplasty: A meta-analysis of randomized controlled trials.

    PubMed

    Zhang, Xue-Qin; Ni, Jie; Ge, Wei-Hong

    2017-02-01

    To compare the safety and efficacy of combined intravenous and topical tranexamic acid with that of intravenous tranexamic acid alone in primary total joint arthroplasty. Literature was searched in the PubMed, Cochrane Library, Embase, Medline, and China National Knowledge Infrastructure databases. Only randomized controlled trials were included in our study. Data were pooled using fixed-effects or random-effects models, with standardized mean differences for continuous variables and risk ratios for dichotomous variables. Seven randomized controlled trials encompassing 683 patients were retrieved for this meta-analysis. Outcomes showed that, compared with intravenous tranexamic acid alone, combined use of intravenous and topical tranexamic acid significantly reduced total blood loss by a mean of 138.70 mL [95% confidence interval (CI): -196.14 to -81.26, p < 0.001] and reduced transfusion rates (risk ratio 0.42, 95% CI: 0.2 to 0.85, p < 0.001). No significant difference in the occurrence of deep vein thrombosis or pulmonary embolism was found between the two groups. This meta-analysis indicated that, compared with intravenous tranexamic acid alone, combined use of intravenous and topical tranexamic acid can significantly reduce blood loss and transfusion rate in primary total joint arthroplasty without increasing the risk of thrombotic complications. We therefore suggest that tranexamic acid be administered both intravenously and topically in primary total joint arthroplasty. Copyright © 2016. Published by Elsevier Ltd.
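
    The pooling described above is standard inverse-variance meta-analysis. As a minimal illustration (not the authors' code, and with made-up effect sizes rather than the seven trials' data), the sketch below pools hypothetical per-trial mean differences under a fixed-effect model and under a DerSimonian-Laird random-effects model:

        import numpy as np

        # Hypothetical per-trial mean differences in total blood loss (mL) and
        # their standard errors -- illustrative only, not the seven trials above.
        effects = np.array([-150.0, -120.0, -90.0, -170.0])
        se = np.array([40.0, 35.0, 50.0, 45.0])

        w = 1.0 / se**2                                   # inverse-variance weights
        pooled_fixed = np.sum(w * effects) / np.sum(w)    # fixed-effect estimate

        # DerSimonian-Laird estimate of the between-study variance tau^2
        q = np.sum(w * (effects - pooled_fixed) ** 2)     # Cochran's Q
        c = np.sum(w) - np.sum(w**2) / np.sum(w)
        tau2 = max(0.0, (q - (len(effects) - 1)) / c)

        w_re = 1.0 / (se**2 + tau2)                       # random-effects weights
        pooled_re = np.sum(w_re * effects) / np.sum(w_re)
        se_re = np.sqrt(1.0 / np.sum(w_re))
        print(pooled_fixed, pooled_re,
              (pooled_re - 1.96 * se_re, pooled_re + 1.96 * se_re))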

  14. Mixture Factor Analysis for Approximating a Nonnormally Distributed Continuous Latent Factor with Continuous and Dichotomous Observed Variables

    ERIC Educational Resources Information Center

    Wall, Melanie M.; Guo, Jia; Amemiya, Yasuo

    2012-01-01

    Mixture factor analysis is examined as a means of flexibly estimating nonnormally distributed continuous latent factors in the presence of both continuous and dichotomous observed variables. A simulation study compares mixture factor analysis with normal maximum likelihood (ML) latent factor modeling. Different results emerge for continuous versus…

  15. An Integrated Probabilistic-Fuzzy Assessment of Uncertainty Associated with Human Health Risk to MSW Landfill Leachate Contamination

    NASA Astrophysics Data System (ADS)

    Mishra, H.; Karmakar, S.; Kumar, R.

    2016-12-01

    Risk assessment ceases to be simple when it involves multiple uncertain variables. Uncertainty in risk assessment arises mainly from (1) lack of knowledge of the input variables (mostly random), and (2) data obtained from expert judgment or subjective interpretation of available information (non-random). An integrated probabilistic-fuzzy health risk approach has been proposed for the simultaneous treatment of random and non-random uncertainties associated with the input parameters of a health risk model. LandSim 2.5, a landfill simulator, has been used to simulate the activities of the Turbhe landfill (Navi Mumbai, India) over various time horizons. The LandSim-simulated concentrations of six heavy metals in groundwater have then been used in the health risk model. Water intake, exposure duration, exposure frequency, bioavailability and averaging time are treated as fuzzy variables, while the heavy-metal concentrations and body weight are considered probabilistic variables. Identical alpha-cut and reliability levels are considered for the fuzzy and probabilistic variables, respectively, and uncertainty in non-carcinogenic human health risk is then estimated using ten thousand Monte Carlo simulations (MCS). This is the first effort in which all the health risk variables have been considered non-deterministic for the estimation of uncertainty in the risk output. The non-exceedance probability of the Hazard Index (HI), the summation of hazard quotients, of the heavy metals Co, Cu, Mn, Ni, Zn and Fe for the male and female populations has been quantified and found to be high (HI>1) for all the time horizons considered, which evidently shows the possibility of adverse health effects on the population residing near the Turbhe landfill.
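
    As a rough illustration of the hybrid scheme described above (not the study's model or data), the sketch below samples probabilistic inputs by Monte Carlo while carrying a fuzzy water-intake parameter through an alpha-cut, yielding an interval-valued hazard quotient per draw; all distributions, the triangular membership function, and the reference dose are hypothetical:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 10_000

        # Probabilistic inputs (hypothetical distributions, not the study's data)
        conc = rng.lognormal(mean=np.log(0.05), sigma=0.5, size=n)   # mg/L in groundwater
        body_weight = rng.normal(70.0, 10.0, size=n)                 # kg

        # Fuzzy input: daily water intake as a triangular fuzzy number (L/day).
        # An alpha-cut turns it into an interval [lo, hi]; alpha=1 collapses to the mode.
        def alpha_cut(lo, mode, hi, alpha):
            return lo + alpha * (mode - lo), hi - alpha * (hi - mode)

        rfd = 0.02                      # hypothetical reference dose, mg/kg/day
        intake_lo, intake_hi = alpha_cut(1.0, 2.0, 3.0, alpha=0.5)

        # Hazard quotient HQ = (conc * intake) / (body_weight * RfD); the fuzzy
        # interval propagates to an interval-valued HQ for each Monte Carlo draw.
        hq_lo = conc * intake_lo / (body_weight * rfd)
        hq_hi = conc * intake_hi / (body_weight * rfd)

        # Non-exceedance probability that even the upper bound stays below HQ = 1
        print((hq_hi < 1.0).mean())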

  16. Optimizing a Sensor Network with Data from Hazard Mapping Demonstrated in a Heavy-Vehicle Manufacturing Facility.

    PubMed

    Berman, Jesse D; Peters, Thomas M; Koehler, Kirsten A

    2018-05-28

    To design a method that uses preliminary hazard mapping data to optimize the number and location of sensors within a network for a long-term assessment of occupational concentrations, while preserving temporal variability, accuracy, and precision of predicted hazards. Particle number concentrations (PNCs) and respirable mass concentrations (RMCs) were measured with direct-reading instruments in a large heavy-vehicle manufacturing facility at 80-82 locations during 7 mapping events, stratified by day and season. Using kriged hazard mapping, a statistical approach identified optimal orders for removing locations so as to capture the temporal variability and high prediction precision of PNC and RMC concentrations. We compared optimal-removal, random-removal, and least-optimal-removal orders to bound prediction performance. The temporal variability of PNC was found to be higher than that of RMC, with low correlation between the two particulate metrics (ρ = 0.30). Optimal-removal orders resulted in more accurate PNC kriged estimates (root mean square error [RMSE] = 49.2) at sample locations compared with the random-removal order (RMSE = 55.7). For estimates at locations having concentrations in the upper 10th percentile, the optimal-removal order preserved average estimated concentrations better than the random- or least-optimal-removal orders (P < 0.01). However, estimated average concentrations using optimal removal were not statistically different from those using random removal when averaged over the entire facility. No statistical difference was observed between the optimal- and random-removal methods for RMCs, which were less variable in time and space than PNCs. Optimized removal performed better than random removal in preserving the high temporal variability and accuracy of the hazard map for PNC, but not for the more spatially homogeneous RMC. These results can be used to reduce the number of locations used in a network of static sensors for long-term monitoring of hazards in the workplace, without sacrificing prediction performance.

  17. Variable Selection in the Presence of Missing Data: Imputation-based Methods.

    PubMed

    Zhao, Yize; Long, Qi

    2017-01-01

    Variable selection plays an essential role in regression analysis as it identifies important variables that are associated with outcomes and is known to improve the predictive accuracy of resulting models. Variable selection methods have been widely investigated for fully observed data. However, in the presence of missing data, methods for variable selection need to be carefully designed to account for missing data mechanisms and the statistical techniques used for handling missing data. Since imputation is arguably the most popular method for handling missing data due to its ease of use, statistical methods for variable selection that are combined with imputation are of particular interest. These methods, valid under the assumptions of missing at random (MAR) and missing completely at random (MCAR), largely fall into three general strategies. The first strategy applies existing variable selection methods to each imputed dataset and then combines the variable selection results across all imputed datasets. The second strategy applies existing variable selection methods to stacked imputed datasets. The third strategy combines resampling techniques such as the bootstrap with imputation. Despite recent advances, this area remains under-developed and offers fertile ground for further research.
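
    As a concrete illustration of the first strategy, here is a minimal sketch (not from the article) that runs lasso selection on each of several imputed datasets and keeps variables selected in a majority of them; the synthetic data, the choice of scikit-learn's IterativeImputer and LassoCV, and the majority-vote rule are all illustrative assumptions:

        import numpy as np
        from sklearn.experimental import enable_iterative_imputer  # noqa: F401
        from sklearn.impute import IterativeImputer
        from sklearn.linear_model import LassoCV

        rng = np.random.default_rng(0)

        # Toy data: 200 cases, 10 predictors, outcome driven by x0 and x1;
        # ~20% of entries set missing completely at random (MCAR).
        X = rng.normal(size=(200, 10))
        y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(size=200)
        X_missing = X.copy()
        X_missing[rng.random(X.shape) < 0.2] = np.nan

        m = 5                                  # number of imputations
        counts = np.zeros(X.shape[1])
        for k in range(m):
            imputer = IterativeImputer(sample_posterior=True, random_state=k)
            X_imp = imputer.fit_transform(X_missing)
            model = LassoCV(cv=5).fit(X_imp, y)
            counts += (model.coef_ != 0)       # record which variables were kept

        # Combine across imputations: keep variables selected in a majority
        selected = np.where(counts >= m / 2)[0]
        print(selected)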

  18. Heart rate variability (HRV) and posttraumatic stress disorder (PTSD): a pilot study.

    PubMed

    Tan, Gabriel; Dao, Tam K; Farmer, Lorie; Sutherland, Roy John; Gevirtz, Richard

    2011-03-01

    Exposure to combat experiences is associated with increased risk of developing posttraumatic stress disorder (PTSD). Prolonged exposure therapy and cognitive processing therapy have garnered a significant amount of empirical support for PTSD treatment; however, they are not universally effective, with some patients continuing to struggle with residual PTSD symptoms. Heart rate variability (HRV) is a measure of autonomic nervous system functioning and reflects an individual's ability to adaptively cope with stress. A pilot study was undertaken to determine whether veterans with PTSD (as measured by the Clinician-Administered PTSD Scale and the PTSD Checklist) would show significantly different HRV at baseline, prior to an intervention, compared to controls; specifically, to determine whether the HRV among veterans with PTSD is more depressed than that among veterans without PTSD. The study also aimed at assessing the feasibility, acceptability, and potential efficacy of providing HRV biofeedback as a treatment for PTSD. The findings suggest that implementing HRV biofeedback as a treatment for PTSD is effective, feasible, and acceptable for veterans. Veterans with combat-related PTSD displayed significantly depressed HRV compared to subjects without PTSD. When the veterans with PTSD were randomly assigned to receive either HRV biofeedback plus treatment as usual (TAU) or TAU alone, the results indicated that HRV biofeedback significantly increased HRV while reducing symptoms of PTSD, whereas TAU alone had no significant effect on either HRV or symptom reduction. A larger randomized controlled trial to validate these findings appears warranted.

  19. How to derive biological information from the value of the normalization constant in allometric equations.

    PubMed

    Kaitaniemi, Pekka

    2008-04-09

    Allometric equations are widely used in many branches of biological science. The potential information content of the normalization constant b in allometric equations of the form Y = bX^a has, however, remained largely neglected. To demonstrate the potential for utilizing this information, I generated a large number of artificial datasets that resembled those frequently encountered in biological studies, i.e., relatively small samples including measurement error or uncontrolled variation. The value of X was allowed to vary randomly within limits describing different data ranges, and a was set to a fixed theoretical value. The constant b was set to a range of values describing the effect of a continuous environmental variable. In addition, a normally distributed random error was added to the values of both X and Y. Two different approaches were then used to model the data. The traditional approach estimated both a and b using a regression model, whereas an alternative approach set the exponent a at its theoretical value and only estimated the value of b. Both approaches produced virtually the same model fit, with less than 0.3% difference in the coefficient of determination. Only the alternative approach was able to precisely reproduce the effect of the environmental variable, which was largely lost among noise variation when using the traditional approach. The results show how the value of b can be used as a source of valuable biological information if an appropriate regression model is selected.
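
    A minimal sketch of the simulation design described above, with hypothetical parameter values: fit the log-log regression once with the exponent free and once with it fixed at its theoretical value, and check which approach recovers the environmental signal carried by b:

        import numpy as np

        rng = np.random.default_rng(1)

        # Simulate data in the spirit of the study (all values hypothetical):
        # Y = b * X^a with a fixed theoretical exponent and b varying with an
        # environmental driver; multiplicative noise is added to both X and Y.
        a_true = 0.75
        env = np.linspace(0.0, 1.0, 50)               # environmental variable
        b_true = 1.0 + 0.5 * env                      # b carries the signal
        X = rng.uniform(1.0, 100.0, size=50)
        Y = b_true * X**a_true * rng.lognormal(0.0, 0.1, size=50)
        X_obs = X * rng.lognormal(0.0, 0.05, size=50)  # measurement error in X

        # Traditional approach: estimate both a and b by log-log regression
        A = np.column_stack([np.log(X_obs), np.ones_like(X_obs)])
        a_hat, log_b_free = np.linalg.lstsq(A, np.log(Y), rcond=None)[0]

        # Alternative approach: fix a at its theoretical value, estimate only b
        log_b_fixed = np.log(Y) - a_true * np.log(X_obs)   # per-observation log b

        # b under the fixed-exponent fit tracks the environmental effect
        print(a_hat, np.corrcoef(env, log_b_fixed)[0, 1])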

  20. Residential surface soil guidance values applied worldwide to the original 2001 Stockholm Convention POP pesticides.

    PubMed

    Jennings, Aaron A; Li, Zijian

    2015-09-01

    Surface soil contamination is a worldwide problem. Many regulatory jurisdictions attempt to control human exposures with regulatory guidance values (RGVs) that specify a soil's maximum allowable concentration. Pesticides are important soil contaminants because of their intentional toxicity and widespread surface soil application. Worldwide, at least 174 regulatory jurisdictions from 54 United Nations member states have published more than 19,400 pesticide RGVs for at least 739 chemically unique pesticides. This manuscript examines the variability of the guidance values that are applied worldwide to the original 2001 Stockholm Convention persistent organic pollutants (POP) pesticides (Aldrin, Chlordane, DDT, Dieldrin, Endrin, Heptachlor, Mirex, and Toxaphene), for which at least 1667 RGVs have been promulgated. Results indicate that the spans of the RGVs applied to each of these pesticides vary from 6.1 orders of magnitude for Toxaphene to 10.0 orders of magnitude for Mirex. The distribution of values across these spans resembles the distribution of lognormal random variables, but also contains non-random value clusters. Approximately 40% of all the POP RGVs fall within uncertainty bounds computed from the U.S. Environmental Protection Agency (USEPA) RGV cancer risk model. Another 22% of the values fall within uncertainty bounds computed from the USEPA's non-cancer risk model, but the cancer risk calculations yield the binding (lowest) value for all POP pesticides except Endrin. The results presented emphasize the continued need to rationalize the RGVs applied worldwide to important soil contaminants. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oikari, A.O.J.

    The relevance of the choice of a test organism intended to be representative of a given environment seems to be under continual debate in aquatic ecotoxicology. For instance, it is commonly argued that acute toxicity tests with rainbow trout, the species most often recommended as a standard cold-water teleost, are not representative for Nordic countries because the species is an alien in local faunas. A comparative study with several freshwater species was therefore initiated to clarify the validity of this assumption. As a first approximation, standard LC50 assays were conducted. The species used were chosen only on the basis of their local availability, i.e., they randomly represented the fish fauna of Nordic inland waters. Furthermore, inter-species variation in toxicity response was compared with certain other, quantitatively more important, intra-species sources of variability affecting the toxicity of chemicals. Use of reference toxicants has been recommended as a means of standardizing bioassays. Compounds characteristic of effluents from the pulp and paper industry were selected for the present study. The toxicity of organic acids such as phenols and resin acids, as well as that of pulp mill effluents, strongly depends on water pH. Because of the possibility that species differences could exist in this respect, the effect of water acidity on the toxicity of these types of substances to a randomly selected local species was investigated. Finally, as an example of a biological source of assay variability, the effect of yolk absorption, with a subsequent crisis period due to moderate starvation under laboratory conditions, was studied.

  2. Multifactor valuation models of energy futures and options on futures

    NASA Astrophysics Data System (ADS)

    Bertus, Mark J.

    The intent of this dissertation is to investigate continuous-time pricing models for commodity derivative contracts that consider mean reversion. The motivation for pricing commodity futures and options on futures contracts is improved practical risk management techniques in markets where uncertainty is increasing. In the dissertation, closed-form solutions for futures contracts are developed under mean-reverting one-factor, two-factor, and three-factor Brownian motions. These solutions are obtained through risk-neutral pricing methods that yield tractable expressions for futures prices, which are linear in the state variables, hence making them attractive for estimation. These functions, however, are expressed in terms of latent variables (i.e., spot prices, convenience yield), which complicates the estimation of the futures pricing equation. To address this complication, a discussion of dynamic factor analysis is given. This procedure estimates the latent variables using a Kalman filter, and illustrations show how the technique may be used for the analysis. In addition to the futures contracts, closed-form solutions for two option models are obtained. Solutions to the one- and two-factor models are tailored solutions of the Black-Scholes pricing model. Furthermore, since these contracts are written on the futures contracts, they too are influenced by the same underlying parameters of the state variables used to price the futures contracts. To conclude, the analysis finishes with an investigation of commodity futures options that incorporate random discrete jumps.
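
    The mean-reverting dynamics referred to above can be illustrated with a one-factor sketch in the spirit of Schwartz-type models. This is not the dissertation's model; the parameters are hypothetical, and the log-linear futures formula for an Ornstein-Uhlenbeck log price is the standard one-factor result, used here only for illustration:

        import numpy as np

        rng = np.random.default_rng(2)

        # Hypothetical parameters: X = ln(spot) follows the mean-reverting SDE
        # dX = kappa*(alpha - X) dt + sigma dW under the risk-neutral measure.
        kappa, alpha, sigma = 1.5, np.log(50.0), 0.3
        x0, T, n_steps, n_paths = np.log(60.0), 1.0, 252, 20_000
        dt = T / n_steps

        # Euler simulation of the OU process for the log spot price
        x = np.full(n_paths, x0)
        for _ in range(n_steps):
            x += kappa * (alpha - x) * dt + sigma * np.sqrt(dt) * rng.normal(size=n_paths)

        # Closed-form futures price, log-linear in the state variable x0
        decay = np.exp(-kappa * T)
        f_closed = np.exp(decay * x0 + (1 - decay) * alpha
                          + sigma**2 * (1 - np.exp(-2 * kappa * T)) / (4 * kappa))

        print(np.exp(x).mean(), f_closed)   # Monte Carlo mean vs closed form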

  3. Habitat of calling blue and fin whales in the Southern California Bight

    NASA Astrophysics Data System (ADS)

    Sirovic, A.; Chou, E.; Roch, M. A.

    2016-02-01

    Northeast Pacific blue whale B calls and fin whale 20 Hz calls were detected from passive acoustic data collected over seven years at 16 sites in the Southern California Bight (SCB). Calling blue whales were most common in the coastal areas during the summer and fall months. Fin whales began calling in fall and continued through winter, in the southcentral SCB. These data were used to develop habitat models of calling blue and fin whales in areas of high and low abundance in the SCB, using remotely sensed variables such as sea surface temperature, sea surface height, chlorophyll a, and primary productivity as model covariates. A random forest framework was used for variable selection, and generalized additive models were developed to explain functional relationships, evaluate the relative contribution of each significant variable, and investigate the predictive abilities of models of calling whales. A seasonal component was an important feature of all models. Additionally, areas of high calling blue and fin whale abundance both had a positive relationship with sea surface temperature. In areas of lower abundance, chlorophyll a concentration and primary productivity were important variables for blue whale models, and sea surface height and primary productivity were significant covariates in fin whale models. Predictive models were generally better at predicting general trends than absolute values, but there was a large degree of variation in year-to-year predictability across different sites.

  4. A master equation and moment approach for biochemical systems with creation-time-dependent bimolecular rate functions

    PubMed Central

    Chevalier, Michael W.; El-Samad, Hana

    2014-01-01

    Noise and stochasticity are fundamental to biology and derive from the very nature of biochemical reactions where thermal motion of molecules translates into randomness in the sequence and timing of reactions. This randomness leads to cell-to-cell variability even in clonal populations. Stochastic biochemical networks have been traditionally modeled as continuous-time discrete-state Markov processes whose probability density functions evolve according to a chemical master equation (CME). In diffusion reaction systems on membranes, the Markov formalism, which assumes constant reaction propensities, is not directly appropriate. This is because the instantaneous propensity for a diffusion reaction to occur depends on the creation times of the molecules involved. In this work, we develop a chemical master equation for systems of this type. While this new CME is computationally intractable, we make rational dimensional reductions to form an approximate equation, whose moments are also derived and are shown to yield efficient, accurate results. This new framework forms a more general approach than the Markov CME and expands upon the realm of possible stochastic biochemical systems that can be efficiently modeled. PMID:25481130

  5. Impact of Two Adolescent Pregnancy Prevention Interventions on Risky Sexual Behavior: A Three-Arm Cluster Randomized Control Trial.

    PubMed

    Barbee, Anita P; Cunningham, Michael R; van Zyl, Michiel A; Antle, Becky F; Langley, Cheri N

    2016-09-01

    To test the efficacy of Reducing the Risk (RTR) and Love Notes (LN) on reducing risky sexual behavior among youths yet to experience or cause a pregnancy. The four dependent variables were ever had sex, condom use, birth control use, and number of sexual partners at 3- and 6-month follow-up in a 3-arm cluster randomized controlled trial of 1448 impoverished youths, aged 14 to 19 years, in 23 community-based organizations in Louisville, Kentucky, from September 2011 through March 2014. At 3 and 6 months, compared with the control condition, youths in RTR reported fewer sexual partners and greater use of birth control. At 6 months, LN participants reported greater use of birth control and condoms, fewer sexual partners, and were less likely to have ever had sex compared with the control condition. We provided additional evidence for the continued efficacy of RTR and the first rigorous study of LN, which embeds sex education into a larger curriculum on healthy relationships and violence prevention.

  6. A master equation and moment approach for biochemical systems with creation-time-dependent bimolecular rate functions

    NASA Astrophysics Data System (ADS)

    Chevalier, Michael W.; El-Samad, Hana

    2014-12-01

    Noise and stochasticity are fundamental to biology and derive from the very nature of biochemical reactions where thermal motion of molecules translates into randomness in the sequence and timing of reactions. This randomness leads to cell-to-cell variability even in clonal populations. Stochastic biochemical networks have been traditionally modeled as continuous-time discrete-state Markov processes whose probability density functions evolve according to a chemical master equation (CME). In diffusion reaction systems on membranes, the Markov formalism, which assumes constant reaction propensities, is not directly appropriate. This is because the instantaneous propensity for a diffusion reaction to occur depends on the creation times of the molecules involved. In this work, we develop a chemical master equation for systems of this type. While this new CME is computationally intractable, we make rational dimensional reductions to form an approximate equation, whose moments are also derived and are shown to yield efficient, accurate results. This new framework forms a more general approach than the Markov CME and expands upon the realm of possible stochastic biochemical systems that can be efficiently modeled.

  7. Observational studies of patients in the emergency department: a comparison of 4 sampling methods.

    PubMed

    Valley, Morgan A; Heard, Kennon J; Ginde, Adit A; Lezotte, Dennis C; Lowenstein, Steven R

    2012-08-01

    We evaluate the ability of 4 sampling methods to generate representative samples of the emergency department (ED) population. We analyzed the electronic records of 21,662 consecutive patient visits at an urban, academic ED. From this population, we simulated different models of study recruitment in the ED by using 2 sample sizes (n=200 and n=400) and 4 sampling methods: true random, random 4-hour time blocks by exact sample size, random 4-hour time blocks by a predetermined number of blocks, and convenience or "business hours." For each method and sample size, we obtained 1,000 samples from the population. Using χ² tests, we measured the number of statistically significant differences between the sample and the population for 8 variables (age, sex, race/ethnicity, language, triage acuity, arrival mode, disposition, and payer source). Then, for each variable, method, and sample size, we compared the proportion of the 1,000 samples that differed from the overall ED population to the expected proportion (5%). Only the true random samples represented the population with respect to sex, race/ethnicity, triage acuity, mode of arrival, language, and payer source in at least 95% of the samples. Patient samples obtained using random 4-hour time blocks and business hours sampling systematically differed from the overall ED patient population for several important demographic and clinical variables. However, the magnitude of these differences was not large. Common sampling strategies selected for ED-based studies may affect parameter estimates for several representative population variables. However, the potential for bias for these variables appears small. Copyright © 2012. Published by Mosby, Inc.
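
    To see why business-hours recruitment can misrepresent an ED population, here is a toy simulation in the spirit of the study (entirely synthetic data, not the 21,662-visit dataset): an attribute whose prevalence varies by hour of arrival is estimated from a true random sample and from a business-hours sample:

        import numpy as np

        rng = np.random.default_rng(3)

        # Toy ED population: 20,000 visits with an arrival hour and one binary
        # attribute (e.g., high triage acuity) whose prevalence varies by hour.
        hours = rng.integers(0, 24, size=20_000)
        p_acute = np.where((hours >= 22) | (hours < 6), 0.35, 0.20)
        acute = rng.random(20_000) < p_acute

        def true_random(n):
            return rng.choice(20_000, size=n, replace=False)

        def business_hours(n):
            idx = np.where((hours >= 8) & (hours < 17))[0]
            return rng.choice(idx, size=n, replace=False)

        # Compare how often each method's sample prevalence misses the
        # population value by more than 2 standard errors of a random sample.
        pop = acute.mean()
        se = np.sqrt(pop * (1 - pop) / 400)
        for method in (true_random, business_hours):
            misses = sum(abs(acute[method(400)].mean() - pop) > 2 * se
                         for _ in range(1000))
            print(method.__name__, misses / 1000)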

  8. Key-Generation Algorithms for Linear Piece In Hand Matrix Method

    NASA Astrophysics Data System (ADS)

    Tadaki, Kohtaro; Tsujii, Shigeo

    The linear Piece In Hand (PH, for short) matrix method with random variables was proposed in our former work. It is a general prescription which can be applied to any type of multivariate public-key cryptosystem (MPKC) for the purpose of enhancing its security. Actually, we showed, in an experimental manner, that the linear PH matrix method with random variables can certainly enhance the security of HFE against the Gröbner basis attack, where HFE is one of the major variants of multivariate public-key cryptosystems. In 1998 Patarin, Goubin, and Courtois introduced the plus method as a general prescription which aims to enhance the security of any given MPKC, just like the linear PH matrix method with random variables. In this paper we prove the equivalence between the plus method and the primitive linear PH matrix method, which was introduced in our previous work to explain the notion of the PH matrix method in general in an illustrative manner, and not for practical use in enhancing the security of any given MPKC. Based on this equivalence, we show that the linear PH matrix method with random variables has a substantial advantage over the plus method with respect to security enhancement. In the linear PH matrix method with random variables, three matrices, including the PH matrix, play a central role in the secret key and public key. In this paper, we clarify how to generate these matrices and thus present two probabilistic polynomial-time algorithms to generate them. In particular, the second one has a concise form and is obtained as a byproduct of the proof of the equivalence between the plus method and the primitive linear PH matrix method.

  9. Long-acting reversible contraceptive acceptability and unintended pregnancy among women presenting for short-acting methods: a randomized patient preference trial.

    PubMed

    Hubacher, David; Spector, Hannah; Monteith, Charles; Chen, Pai-Lien; Hart, Catherine

    2017-02-01

    Measures of contraceptive effectiveness combine technology and user-related factors. Observational studies show higher effectiveness of long-acting reversible contraception compared with short-acting reversible contraception. Women who choose long-acting reversible contraception may differ in key ways from women who choose short-acting reversible contraception, and it may be these differences that are responsible for the high effectiveness of long-acting reversible contraception. Wider use of long-acting reversible contraception is recommended, but scientific evidence of acceptability and successful use is lacking in a population that typically opts for short-acting methods. The objective of the study was to reduce bias in measuring contraceptive effectiveness and better isolate the independent role that long-acting reversible contraception has in preventing unintended pregnancy relative to short-acting reversible contraception. We conducted a partially randomized patient preference trial and recruited women aged 18-29 years who were seeking a short-acting method (pills or injectable). Participants who agreed to randomization were assigned to 1 of 2 categories: long-acting reversible contraception or short-acting reversible contraception. Women who declined randomization but agreed to follow-up in the observational cohort chose their preferred method. Under randomization, participants chose a specific method in the category and received it for free, whereas participants in the preference cohort paid for the contraception in their usual fashion. Participants were followed up prospectively to measure primary outcomes of method continuation and unintended pregnancy at 12 months. Kaplan-Meier techniques were used to estimate method continuation probabilities. Intent-to-treat principles were applied after method initiation for comparing incidence of unintended pregnancy. We also measured acceptability in terms of level of happiness with the products. Of the 916 participants, 43% chose randomization and 57% chose the preference option. Complete loss to follow-up at 12 months was <2%. The 12-month method continuation probabilities were 63.3% (95% confidence interval, 58.9-67.3) (preference short-acting reversible contraception), 53.0% (95% confidence interval, 45.7-59.8) (randomized short-acting reversible contraception), and 77.8% (95% confidence interval, 71.0-83.2) (randomized long-acting reversible contraception) (P < .001 in the primary comparison involving randomized groups). The 12-month cumulative unintended pregnancy probabilities were 6.4% (95% confidence interval, 4.1-8.7) (preference short-acting reversible contraception), 7.7% (95% confidence interval, 3.3-12.1) (randomized short-acting reversible contraception), and 0.7% (95% confidence interval, 0.0-4.7) (randomized long-acting reversible contraception) (P = .01 when comparing randomized groups). In the secondary comparisons involving only short-acting reversible contraception users, the continuation probability was higher in the preference group compared with the randomized group (P = .04). However, the short-acting reversible contraception randomized group and short-acting reversible contraception preference group had statistically equivalent rates of unintended pregnancy (P = .77). Seventy-eight percent of randomized long-acting reversible contraception users were happy/neutral with their initial method, compared with 89% of randomized short-acting reversible contraception users (P < .05). 
However, among method continuers at 12 months, all groups were equally happy/neutral (>90%). Even in a typical population of women who presented to initiate or continue short-acting reversible contraception, long-acting reversible contraception proved highly acceptable. One year after initiation, women randomized to long-acting reversible contraception had high continuation rates and consequently experienced superior protection from unintended pregnancy compared with women using short-acting reversible contraception; these findings are attributable to the initial technology and not underlying factors that often bias observational estimates of effectiveness. The similarly patterned experiences of the 2 short-acting reversible contraception cohorts provide a bridge of generalizability between the randomized group and usual-care preference group. Benefits of increased voluntary uptake of long-acting reversible contraception may extend to wider populations than previously thought. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Air pollution, neighbourhood and maternal-level factors modify the effect of smoking on birth weight: a multilevel analysis in British Columbia, Canada.

    PubMed

    Erickson, Anders C; Ostry, Aleck; Chan, Hing Man; Arbour, Laura

    2016-07-16

    Maternal smoking during pregnancy negatively impacts fetal growth, but the effect is not homogenous across the population. We sought to determine how the relationship between cigarette use and fetal growth is modified by the social and physical environment. Birth records with covariates were obtained from the BC Perinatal Database Registry (N = 232,291). Maternal smoking status was self-reported as the number of cigarettes smoked per day, usually at the first prenatal care visit. Census dissemination areas (DAs) were used as neighbourhood-level units and linked to individual births using residential postal codes to assign exposure to particulate air pollution (PM2.5) and neighbourhood-level attributes such as socioeconomic status (SES), proportion of post-secondary education, immigrant density and living in a rural place. Random coefficient models were used with cigarettes/day modeled with a random slope to estimate its between-DA variability and test cross-level interactions with the neighbourhood-level variables on continuous birth weight. A significant negative and non-linear association was found between maternal smoking and birth weight. There was significant between-DA intercept variability in birth weight as well as between-DA slope variability of maternal smoking on birth weight, of which 68% and 30%, respectively, were explained with the inclusion of DA-level variables and their cross-level interactions. High DA-level SES had a strong positive association with birth weight but the effect was moderated with increased cigarettes/day. Conversely, heavy smokers showed the largest increases in birth weight with rising neighbourhood education levels. Increased levels of PM2.5 and immigrant density were negatively associated with birth weight, but showed positive interactions with increased levels of smoking. Older maternal age and suspected drug or alcohol use both had negative interactions with increased levels of maternal smoking. Maternal smoking had a negative and non-linear dose-response association with birth weight, which was highly variable between neighbourhoods, with evidence of effect modification by neighbourhood-level factors. These results suggest that focusing exclusively on individual behaviours may have limited success in improving outcomes without addressing the contextual influences at the neighbourhood level. Further studies are needed to corroborate our findings and to understand how neighbourhood-level attributes interact with smoking to affect birth outcomes.

  11. Variable versus conventional lung protective mechanical ventilation during open abdominal surgery: study protocol for a randomized controlled trial.

    PubMed

    Spieth, Peter M; Güldner, Andreas; Uhlig, Christopher; Bluth, Thomas; Kiss, Thomas; Schultz, Marcus J; Pelosi, Paolo; Koch, Thea; Gama de Abreu, Marcelo

    2014-05-02

    General anesthesia usually requires mechanical ventilation, which is traditionally accomplished with constant tidal volumes in volume- or pressure-controlled modes. Experimental studies suggest that the use of variable tidal volumes (variable ventilation) recruits lung tissue, improves pulmonary function and reduces systemic inflammatory response. However, it is currently not known whether patients undergoing open abdominal surgery might benefit from intraoperative variable ventilation. The PROtective VARiable ventilation trial ('PROVAR') is a single center, randomized controlled trial enrolling 50 patients scheduled for open abdominal surgery expected to last longer than 3 hours. PROVAR compares conventional (non-variable) lung protective ventilation (CV) with variable lung protective ventilation (VV) regarding pulmonary function and inflammatory response. The primary endpoint of the study is the forced vital capacity on the first postoperative day. Secondary endpoints include further lung function tests, plasma cytokine levels, spatial distribution of ventilation assessed by means of electrical impedance tomography and postoperative pulmonary complications. We hypothesize that VV improves lung function and reduces systemic inflammatory response compared to CV in patients receiving mechanical ventilation during general anesthesia for open abdominal surgery longer than 3 hours. PROVAR is the first randomized controlled trial aiming at intra- and postoperative effects of VV on lung function. This study may help to define the role of VV during general anesthesia requiring mechanical ventilation. Clinicaltrials.gov NCT01683578 (registered on September 3, 2012).

  12. A Random Variable Transformation Process.

    ERIC Educational Resources Information Center

    Scheuermann, Larry

    1989-01-01

    Provides a short BASIC program, RANVAR, which generates random variates for various theoretical probability distributions. The seven variates include: uniform, exponential, normal, binomial, Poisson, Pascal, and triangular. (MVL)
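
    RANVAR itself is a BASIC program and is not reproduced here; the sketch below shows the same inverse transform idea in Python for two of the seven listed distributions (exponential and triangular):

        import math
        import random

        # Inverse transform method: if U ~ Uniform(0,1) and F is a CDF, then
        # F^{-1}(U) has distribution F.

        def exponential(rate):
            u = random.random()
            return -math.log(1.0 - u) / rate          # F^{-1}(u) = -ln(1-u)/rate

        def triangular(lo, mode, hi):
            u = random.random()
            cut = (mode - lo) / (hi - lo)             # CDF value at the mode
            if u < cut:
                return lo + math.sqrt(u * (hi - lo) * (mode - lo))
            return hi - math.sqrt((1 - u) * (hi - lo) * (hi - mode))

        samples = [exponential(2.0) for _ in range(10_000)]
        print(sum(samples) / len(samples))            # should be near 1/rate = 0.5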

  13. High power tunable mid-infrared optical parametric oscillator enabled by random fiber laser.

    PubMed

    Wu, Hanshuo; Wang, Peng; Song, Jiaxin; Ye, Jun; Xu, Jiangming; Li, Xiao; Zhou, Pu

    2018-03-05

    Random fiber laser, as a kind of novel fiber laser that utilizes random distributed feedback as well as Raman gain, has become a research focus owing to its advantages of wavelength flexibility, modeless property and output stability. Herein, a tunable optical parametric oscillator (OPO) enabled by a random fiber laser is reported for the first time. By exploiting a tunable random fiber laser to pump the OPO, the central wavelength of idler light can be continuously tuned from 3977.34 to 4059.65 nm with stable temporal average output power. The maximal output power achieved is 2.07 W. So far as we know, this is the first demonstration of a continuous-wave tunable OPO pumped by a tunable random fiber laser, which could not only provide a new approach for achieving tunable mid-infrared (MIR) emission, but also extend the application scenarios of random fiber lasers.

  14. A Cautious Note on Auxiliary Variables That Can Increase Bias in Missing Data Problems.

    PubMed

    Thoemmes, Felix; Rose, Norman

    2014-01-01

    The treatment of missing data in the social sciences has changed tremendously during the last decade. Modern missing data techniques such as multiple imputation and full-information maximum likelihood are used much more frequently. These methods assume that data are missing at random. One very common approach to increase the plausibility of the missing at random assumption consists of including many covariates as so-called auxiliary variables. These variables are included either based on data considerations or in an inclusive fashion; that is, taking all available auxiliary variables. In this article, we point out that there are some instances in which auxiliary variables exhibit the surprising property of increasing bias in missing data problems. In a series of focused simulation studies, we highlight some situations in which this type of biasing behavior can occur. We briefly discuss possible ways to avoid selecting bias-inducing covariates as auxiliary variables.

  15. Detecting Random, Partially Random, and Nonrandom Minnesota Multiphasic Personality Inventory-2 Protocols

    ERIC Educational Resources Information Center

    Pinsoneault, Terry B.

    2007-01-01

    The ability of the Minnesota Multiphasic Personality Inventory-2 (MMPI-2; J. N. Butcher et al., 2001) validity scales to detect random, partially random, and nonrandom MMPI-2 protocols was investigated. Investigations included the Variable Response Inconsistency scale (VRIN), F, several potentially useful new F and VRIN subscales, and F-sub(b) - F…

  16. How Robust Is Linear Regression with Dummy Variables?

    ERIC Educational Resources Information Center

    Blankmeyer, Eric

    2006-01-01

    Researchers in education and the social sciences make extensive use of linear regression models in which the dependent variable is continuous-valued while the explanatory variables are a combination of continuous-valued regressors and dummy variables. The dummies partition the sample into groups, some of which may contain only a few observations.…

  17. Continuation Power Flow with Variable-Step Variable-Order Nonlinear Predictor

    NASA Astrophysics Data System (ADS)

    Kojima, Takayuki; Mori, Hiroyuki

    This paper proposes a new continuation power flow calculation method for drawing a P-V curve in power systems. The continuation power flow calculation successively evaluates power flow solutions by changing a specified value of the power flow calculation. In recent years, power system operators have become quite concerned with voltage instability due to the appearance of deregulated and competitive power markets. The continuation power flow calculation plays an important role in understanding load characteristics in the sense of static voltage instability. In this paper, a new continuation power flow with a variable-step variable-order (VSVO) nonlinear predictor is proposed. The proposed method evaluates optimal predicted points conforming to the shape of P-V curves. The proposed method is successfully applied to the IEEE 118-bus and IEEE 300-bus systems.

  18. Comparison of High-Intensity Interval Training and Moderate-to-Vigorous Continuous Training for Cardiometabolic Health and Exercise Enjoyment in Obese Young Women: A Randomized Controlled Trial.

    PubMed

    Kong, Zhaowei; Fan, Xitao; Sun, Shengyan; Song, Lili; Shi, Qingde; Nie, Jinlei

    2016-01-01

    The aim of this study was to compare the effects of 5-week high-intensity interval training (HIIT) and moderate-to-vigorous intensity continuous training (MVCT) on cardiometabolic health outcomes and enjoyment of exercise in obese young women. A randomized controlled experiment was conducted that involved thirty-one obese females (age range of 18-30) randomly assigned to either HIIT or MVCT five-week training programs. Participants in HIIT condition performed 20 min of repeated 8 s cycling interspersed with 12 s rest intervals, and those in MVCT condition cycled continuously for 40 min at 60-80% of peak oxygen consumption (V̇O2peak), both for four days in a week. Outcomes such as V̇O2peak, body composition estimated by bioimpedance analysis, blood lipids, and serum sexual hormones were measured at pre-and post-training. The scores of Physical Activity Enjoyment Scale (PAES) were collected during the intervention. After training, V̇O2peak increased significantly for both training programs (9.1% in HIIT and 10.3% in MVCT) (p = 0.010, η2 = 0.41). Although MVCT group had a significant reduction in total body weight (TBW, -1.8%, p = 0.034), fat mass (FM, -4.7%, p = 0.002) and percentage body fat (PBF, -2.9%, p = 0.016), there were no significant between-group differences in the change of the pre- and post-measures of these variables. The HIIT group had a higher score on PAES than the MVCT group during the intervention. For both conditions, exercise training led to a decline in resting testosterone and estradiol levels, but had no significant effect on blood lipids. Both HIIT and MVCT are effective in improving cardiorespiratory fitness and in reducing sexual hormones in obese young women; however, HIIT is a more enjoyable and time-efficient strategy. The mild-HIIT protocol seems to be useful for at least maintaining the body weight among sedentary individuals.

  19. Comparison of High-Intensity Interval Training and Moderate-to-Vigorous Continuous Training for Cardiometabolic Health and Exercise Enjoyment in Obese Young Women: A Randomized Controlled Trial

    PubMed Central

    Sun, Shengyan; Song, Lili; Shi, Qingde

    2016-01-01

    Objective The aim of this study was to compare the effects of 5-week high-intensity interval training (HIIT) and moderate-to-vigorous intensity continuous training (MVCT) on cardiometabolic health outcomes and enjoyment of exercise in obese young women. Methods A randomized controlled experiment was conducted that involved thirty-one obese females (age range of 18–30) randomly assigned to either HIIT or MVCT five-week training programs. Participants in HIIT condition performed 20 min of repeated 8 s cycling interspersed with 12 s rest intervals, and those in MVCT condition cycled continuously for 40 min at 60–80% of peak oxygen consumption (V̇O2peak), both for four days in a week. Outcomes such as V̇O2peak, body composition estimated by bioimpedance analysis, blood lipids, and serum sexual hormones were measured at pre-and post-training. The scores of Physical Activity Enjoyment Scale (PAES) were collected during the intervention. Results After training, V̇O2peak increased significantly for both training programs (9.1% in HIIT and 10.3% in MVCT) (p = 0.010, η2 = 0.41). Although MVCT group had a significant reduction in total body weight (TBW, −1.8%, p = 0.034), fat mass (FM, −4.7%, p = 0.002) and percentage body fat (PBF, −2.9%, p = 0.016), there were no significant between-group differences in the change of the pre- and post-measures of these variables. The HIIT group had a higher score on PAES than the MVCT group during the intervention. For both conditions, exercise training led to a decline in resting testosterone and estradiol levels, but had no significant effect on blood lipids. Conclusion Both HIIT and MVCT are effective in improving cardiorespiratory fitness and in reducing sexual hormones in obese young women; however, HIIT is a more enjoyable and time-efficient strategy. The mild-HIIT protocol seems to be useful for at least maintaining the body weight among sedentary individuals. PMID:27368057

  20. The Impact of Insulin Pump Therapy on Glycemic Profiles in Patients with Type 2 Diabetes: Data from the OpT2mise Study.

    PubMed

    Conget, Ignacio; Castaneda, Javier; Petrovski, Goran; Guerci, Bruno; Racault, Anne-Sophie; Reznik, Yves; Cohen, Ohad; Runzis, Sarah; de Portu, Simona; Aronson, Ronnie

    2016-01-01

    The OpT2mise randomized trial was designed to compare the effects of continuous subcutaneous insulin infusion (CSII) and multiple daily injections (MDI) on glucose profiles in patients with type 2 diabetes. Patients with glycated hemoglobin (HbA1c) levels of ≥8% (64 mmol/mol) and ≤12% (108 mmol/mol) despite insulin doses of 0.7-1.8 U/kg/day via MDI were randomized to CSII (n=168) or continued MDI (n=163). Changes in glucose profiles were evaluated using continuous glucose monitoring data collected over 6-day periods before and 6 months after randomization. After 6 months, reductions in HbA1c levels were significantly greater with CSII (-1.1±1.2% [-12.0±13.1 mmol/mol]) than with MDI (-0.4±1.1% [-4.4±12.0 mmol/mol]) (P<0.001). Similarly, compared with patients receiving MDI, those receiving CSII showed significantly greater reductions in 24-h mean sensor glucose (SG) (treatment difference, -17.1 mg/dL; P=0.0023), less exposure to SG >180 mg/dL (-12.4%; P=0.0004) and SG >250 mg/dL (-5.5%; P=0.0153), and more time in the SG range of 70-180 mg/dL (12.3%; P=0.0002), with no differences in exposure to SG<70 mg/dL or in glucose variability. Changes in postprandial (4-h) glucose area under the curve >180 mg/dL were significantly greater with CSII than with MDI after breakfast (-775.9±1,441.2 mg/dL/min vs. -160.7±1,074.1 mg/dL/min; P=0.0015) and after dinner (-731.4±1,580.7 mg/dL/min vs. -71.1±1,083.5 mg/dL/min; P=0.0014). In patients with suboptimally controlled type 2 diabetes, CSII significantly improves selected glucometrics, compared with MDI, without increasing the risk of hypoglycemia.

  1. Uncertainty in Random Forests: What does it mean in a spatial context?

    NASA Astrophysics Data System (ADS)

    Klump, Jens; Fouedjio, Francky

    2017-04-01

    Geochemical surveys are an important part of exploration for mineral resources and in environmental studies. The samples and chemical analyses are often laborious and difficult to obtain and therefore come at a high cost. As a consequence, these surveys are characterised by datasets with large numbers of variables but relatively few data points when compared to conventional big data problems. With more remote sensing platforms and sensor networks being deployed, large volumes of auxiliary data of the surveyed areas are becoming available. The use of these auxiliary data has the potential to improve the prediction of chemical element concentrations over the whole study area. Kriging is a well-established geostatistical method for the prediction of spatial data but requires significant pre-processing and makes some basic assumptions about the underlying distribution of the data. Some machine learning algorithms, on the other hand, may require less data pre-processing and are non-parametric. In this study we used a dataset provided by Kirkwood et al. [1] to explore the potential use of Random Forest in geochemical mapping. We chose Random Forest because it is a well understood machine learning method and has the advantage that it provides us with a measure of uncertainty. By comparing Random Forest to Kriging we found that both methods produced comparable maps of estimated values for our variables of interest. Kriging outperformed Random Forest for variables of interest with relatively strong spatial correlation. The measure of uncertainty provided by Random Forest seems to be quite different to the measure of uncertainty provided by Kriging. In particular, the lack of spatial context can give misleading results in areas without ground truth data. In conclusion, our preliminary results show that the model-driven approach in geostatistics gives us more reliable estimates for our target variables than Random Forest for variables with relatively strong spatial correlation. However, in cases of weak spatial correlation Random Forest, as a nonparametric method, may give the better results once we have a better understanding of the meaning of its uncertainty measures in a spatial context. References [1] Kirkwood, C., M. Cave, D. Beamish, S. Grebby, and A. Ferreira (2016), A machine learning approach to geochemical mapping, Journal of Geochemical Exploration, 163, 28-40, doi:10.1016/j.gexplo.2016.05.003.
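
    One common way to obtain the Random Forest uncertainty measure mentioned above is the spread of per-tree predictions. The sketch below (synthetic data and scikit-learn, not the Kirkwood et al. dataset) also hints at why that spread lacks spatial context: unlike kriging variance, it need not grow with distance from the sampled locations.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(4)

        # Toy "survey": 150 sampled locations with coordinates and two auxiliary
        # covariates standing in for remote-sensing layers (all values synthetic).
        coords = rng.uniform(0, 10, size=(150, 2))
        aux = rng.normal(size=(150, 2))
        X = np.hstack([coords, aux])
        y = np.sin(coords[:, 0]) + 0.5 * aux[:, 0] + 0.1 * rng.normal(size=150)

        rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, y)

        # Uncertainty as per-tree disagreement: note it reflects the ensemble's
        # spread, not distance to the nearest data point, which is what the
        # kriging variance captures.
        X_new = np.array([[5.0, 5.0, 0.0, 0.0],     # inside the sampled domain
                          [50.0, 50.0, 0.0, 0.0]])  # far outside it
        per_tree = np.stack([t.predict(X_new) for t in rf.estimators_])
        print(per_tree.mean(axis=0), per_tree.std(axis=0))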

  2. Random walk, diffusion and mixing in simulations of scalar transport in fluid flows

    NASA Astrophysics Data System (ADS)

    Klimenko, A. Y.

    2008-12-01

    Physical similarity and mathematical equivalence of continuous diffusion and particle random walk form one of the cornerstones of modern physics and the theory of stochastic processes. In many applied models used in simulation of turbulent transport and turbulent combustion, mixing between particles is used to reflect the influence of the continuous diffusion terms in the transport equations. We show that the continuous scalar transport and diffusion can be accurately specified by means of mixing between randomly walking Lagrangian particles with scalar properties and assess errors associated with this scheme. This gives an alternative formulation for the stochastic process which is selected to represent the continuous diffusion. This paper focuses on statistical errors and deals with relatively simple cases, where one-particle distributions are sufficient for a complete description of the problem.
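
    A minimal sketch of the random-walk/diffusion equivalence discussed above (the paper's particle-mixing scheme for scalar transport is not reproduced here): an ensemble of Gaussian-step walkers should spread with variance 2Dt for the matching diffusivity D.

        import numpy as np

        rng = np.random.default_rng(5)

        # Random-walk particles as a model of continuous diffusion: after many
        # independent steps, particle positions approximate the solution of the
        # diffusion equation with D = step_var / (2 * dt).
        n_particles, n_steps, dt = 50_000, 200, 0.01
        step_std = 0.1
        D = step_std**2 / (2 * dt)

        x = np.zeros(n_particles)
        for _ in range(n_steps):
            x += rng.normal(0.0, step_std, size=n_particles)

        t = n_steps * dt
        print(x.var(), 2 * D * t)   # sample variance vs diffusion-theory value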

  3. Continuous versus bolus tube feeds: Does the modality affect glycemic variability, tube feeding volume, caloric intake, or insulin utilization?

    PubMed

    Evans, David C; Forbes, Rachel; Jones, Christian; Cotterman, Robert; Njoku, Chinedu; Thongrong, Cattleya; Tulman, David; Bergese, Sergio D; Thomas, Sheela; Papadimos, Thomas J; Stawicki, Stanislaw P

    2016-01-01

    Enteral nutrition (EN) is very important to optimizing outcomes in critical illness. Debate exists regarding the best strategy for enteral tube feeding (TF), with concerns that bolus TF (BTF) may increase glycemic variability (GV) but result in fewer nutritional interruptions than continuous TF (CTF). This study examines whether there is a difference in GV, insulin usage, TF volume, and caloric delivery among intensive care patients receiving BTF versus CTF. We hypothesize that there are no significant differences between CTF and BTF when comparing the above parameters. A prospective, randomized pilot study of critically ill adult patients undergoing percutaneous endoscopic gastrostomy (PEG) placement for EN was performed between March 1, 2012 and May 15, 2014. Patients were randomized to BTF or CTF. Glucose values, insulin use, TF volume, and calories administered were recorded. Data were organized into 12-h epochs for statistical analyses and GV determination. In addition, time to ≥80% nutritional delivery goal, demographics, Acute Physiology and Chronic Health Evaluation II scores, and TF interruptions were examined. When performing BTF versus CTF assessments, continuous parameters were compared using the Mann-Whitney U-test or repeated measures t-test, as appropriate. Categorical data were analyzed using Fisher's exact test. No significant demographic or physiologic differences between the CTF (n = 24) and BTF (n = 26) groups were seen. The immediate post-PEG 12-h epoch showed significantly lower GV and median TF volume for patients in the CTF group. All subsequent epochs (up to 18 days post-PEG) showed no differences in GV, insulin use, TF volume, or caloric intake. Insulin use for both groups increased when comparing the first 24 h post-PEG values to measurements from day 8. There were no differences in TF interruptions, time to ≥80% nutritional delivery goal, or hypoglycemic episodes. This study demonstrated no clinically relevant differences in GV, insulin use, TF volume or caloric intake between BTF and CTF groups. Despite some shortcomings, our data suggest that providers should not feel limited to BTF or CTF because of concerns for GV, time to goal nutrition, insulin use, or caloric intake, and should consider other factors such as resource utilization, ease of administration, and/or institutional/patient characteristics.

  4. Permissive Attitude Towards Drug Use, Life Satisfaction, and Continuous Drug Use Among Psychoactive Drug Users in Hong Kong.

    PubMed

    Cheung, N Wt; Cheung, Y W; Chen, X

    2016-06-01

    To examine the effects of a permissive attitude towards regular and occasional drug use, life satisfaction, self-esteem, depression, and other psychosocial variables on the drug use of psychoactive drug users. Psychosocial factors that might affect a permissive attitude towards regular/occasional drug use and life satisfaction were further explored. We analysed data from a sample of psychoactive drug users from a longitudinal survey of psychoactive drug abusers in Hong Kong who were interviewed at 6 time points at 6-month intervals between January 2009 and December 2011. Data from the second to the sixth time points were stacked into an individual time point structure. Random-effects probit regression analysis was performed to estimate the relative contribution of the independent variables to the binary dependent variable of drug use in the last 30 days. A permissive attitude towards drug use, life satisfaction, and depression at the concurrent time point, and self-esteem at the previous time point, had direct effects on drug use in the last 30 days. Interestingly, permissiveness to occasional drug use was a stronger predictor of drug use than permissiveness to regular drug use. These 2 permissive attitude variables were affected by the belief that doing extreme things shows the vitality of young people (at the concurrent time point), life satisfaction (at the concurrent time point), and self-esteem (at the concurrent and previous time points). Life satisfaction was affected by a sense of uncertainty about the future (at the concurrent time point), self-esteem (at the concurrent time point), depression (at both the concurrent and previous time points), and being stricken by stressful events (at the previous time point). A number of psychosocial factors could affect the continuation or discontinuation of drug use, as well as the permissive attitude towards regular and occasional drug use, and life satisfaction. Implications of the findings for prevention and intervention work targeted at psychoactive drug users are discussed.

  5. Reliability Overhaul Model

    DTIC Science & Technology

    1989-08-01

    Random variables for the conditional exponential and conditional Weibull distributions are generated using the inverse transform method; normally distributed random variables are generated using a standard normal transformation together with the inverse transform method. An appendix lists the distributions supported by the model.
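
    For reference, the inverse transform method named in this excerpt can be sketched as follows; the function names and parameters are generic stand-ins, not the report's notation.

        import numpy as np

        rng = np.random.default_rng(1)

        def exponential_inverse_transform(lam, size):
            u = rng.uniform(size=size)
            return -np.log(1.0 - u) / lam  # inverse CDF of Exp(lam)

        def weibull_inverse_transform(shape, scale, size):
            u = rng.uniform(size=size)
            return scale * (-np.log(1.0 - u)) ** (1.0 / shape)  # inverse Weibull CDF

        print(exponential_inverse_transform(2.0, size=5))
        print(weibull_inverse_transform(1.5, 1.0, size=5))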

  6. Performance of DS/SSMA (Direct-Sequence Spread-Spectrum Multiple-Access) Communications in Impulsive Channels.

    DTIC Science & Technology

    1986-11-01

    The variance of a random variable with the density given by (A.1) is derived in (A.2) of the report.

  7. A New Approach to Extreme Value Estimation Applicable to a Wide Variety of Random Variables

    NASA Technical Reports Server (NTRS)

    Holland, Frederic A., Jr.

    1997-01-01

    Designing reliable structures requires an estimate of the maximum and minimum values (i.e., strength and load) that may be encountered in service. Yet designs based on very extreme values (to insure safety) can result in extra material usage and hence, uneconomic systems. In aerospace applications, severe over-design cannot be tolerated, making it almost mandatory to design closer to the assumed limits of the design random variables. The issue then is predicting extreme values that are practical, i.e., neither too conservative nor non-conservative. Obtaining design values by employing safety factors is well known to often result in overly conservative designs. Safety factor values have historically been selected rather arbitrarily, often lacking a sound rational basis. The question of how safe a design needs to be has led design theorists to probabilistic and statistical methods. The so-called three-sigma approach is one such method and has been described as the first step in utilizing information about the data dispersion. However, this method is based on the assumption that the random variable is dispersed symmetrically about the mean and is essentially limited to normally distributed random variables. Use of this method can therefore result in unsafe or overly conservative design allowables if the common assumption of normality is incorrect.

  8. Tests of Hypotheses Arising In the Correlated Random Coefficient Model*

    PubMed Central

    Heckman, James J.; Schmierer, Daniel

    2010-01-01

    This paper examines the correlated random coefficient model. It extends the analysis of Swamy (1971), who pioneered the uncorrelated random coefficient model in economics. We develop the properties of the correlated random coefficient model and derive a new representation of the variance of the instrumental variable estimator for that model. We develop tests of the validity of the correlated random coefficient model against the null hypothesis of the uncorrelated random coefficient model. PMID:21170148

  9. Latin Hypercube Sampling (LHS) UNIX Library/Standalone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2004-05-13

    The LHS UNIX Library/Standalone software provides the capability to draw random samples from over 30 distribution types. It performs the sampling by a stratified sampling method called Latin Hypercube Sampling (LHS). Multiple distributions can be sampled simultaneously, with user-specified correlations amongst the input distributions; LHS UNIX Library/Standalone thus provides a way to generate multivariate samples. The LHS samples can be generated either through a callable library (e.g., from within the DAKOTA software framework) or as a standalone capability. LHS is a constrained Monte Carlo sampling scheme. In LHS, the range of each variable is divided into non-overlapping intervals on the basis of equal probability, and a sample is selected at random with respect to the probability density in each interval. If multiple variables are sampled simultaneously, the values obtained for each variable are paired in a random manner with the values of the other variables. In some cases, the pairing is restricted to obtain specified correlations amongst the input variables. Many simulation codes have input parameters that are uncertain and can be specified by a distribution. To perform uncertainty analysis and sensitivity analysis, random values are drawn from the input parameter distributions, and the simulation is run with these values to obtain output values. If this is done repeatedly, with many input samples drawn, one can build up a distribution of the output as well as examine correlations between input and output variables.
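
    A compact sketch of the stratified scheme described above, for the uncorrelated case (the library's restricted pairing for user-specified correlations is omitted):

        import numpy as np

        def latin_hypercube(n_samples, n_vars, rng=None):
            rng = rng or np.random.default_rng(2)
            # one equal-probability stratum per sample, a random point inside
            # each stratum, then an independent shuffle for every variable
            u = (np.arange(n_samples)[:, None]
                 + rng.uniform(size=(n_samples, n_vars))) / n_samples
            for j in range(n_vars):
                rng.shuffle(u[:, j])
            return u  # uniform [0, 1) samples; map through inverse CDFs as needed

        print(latin_hypercube(10, 3))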

  10. Assessing differences in groups randomized by recruitment chain in a respondent-driven sample of Seattle-area injection drug users.

    PubMed

    Burt, Richard D; Thiede, Hanne

    2014-11-01

    Respondent-driven sampling (RDS) is a form of peer-based study recruitment and analysis that incorporates features designed to limit and adjust for biases in traditional snowball sampling. It is being widely used in studies of hidden populations. We report an empirical evaluation of RDS's consistency and variability, comparing groups recruited contemporaneously, by identical methods and using identical survey instruments. We randomized recruitment chains from the RDS-based 2012 National HIV Behavioral Surveillance survey of injection drug users in the Seattle area into two groups and compared them in terms of sociodemographic characteristics, drug-associated risk behaviors, sexual risk behaviors, human immunodeficiency virus (HIV) status and HIV testing frequency. The two groups differed in five of the 18 variables examined (P ≤ .001): race (e.g., 60% white vs. 47%), gender (52% male vs. 67%), area of residence (32% downtown Seattle vs. 44%), an HIV test in the previous 12 months (51% vs. 38%). The difference in serologic HIV status was particularly pronounced (4% positive vs. 18%). In four further randomizations, differences in one to five variables attained this level of significance, although the specific variables involved differed. We found some material differences between the randomized groups. Although the variability of the present study was less than has been reported in serial RDS surveys, these findings indicate caution in the interpretation of RDS results. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. Pathwise upper semi-continuity of random pullback attractors along the time axis

    NASA Astrophysics Data System (ADS)

    Cui, Hongyong; Kloeden, Peter E.; Wu, Fuke

    2018-07-01

    The pullback attractor of a non-autonomous random dynamical system is a time-indexed family of random sets, typically having the form {A_t(·)}_{t ∈ R}, with each A_t(·) a random set. This paper is concerned with the nature of such time-dependence. It is shown that the upper semi-continuity of the mapping t ↦ A_t(ω), for each ω fixed, is equivalent to the uniform compactness of the local union ∪_{s ∈ I} A_s(ω), where I ⊂ R is compact. Applied to a semi-linear degenerate parabolic equation with additive noise and a wave equation with multiplicative noise, we show that no additional conditions are required to prove the above locally uniform compactness and upper semi-continuity, in which sense the two properties appear to be general properties satisfied by a large number of real models.

  12. Metastability of Reversible Random Walks in Potential Fields

    NASA Astrophysics Data System (ADS)

    Landim, C.; Misturini, R.; Tsunoda, K.

    2015-09-01

    Let Ξ be an open and bounded subset of R^d, and let F : Ξ → R be a twice continuously differentiable function. Denote by Ξ_N the discretization of Ξ, Ξ_N = Ξ ∩ (N^{-1} Z^d), and denote by X_N(t) the continuous-time, nearest-neighbor random walk on Ξ_N which jumps from x to y at a rate determined by the potential difference F(y) - F(x). We examine in this article the metastable behavior of X_N(t) among the wells of the potential F.

  13. Management of Hypertension in Private Practice: A Randomized Controlled Trial in Continuing Medical Education.

    ERIC Educational Resources Information Center

    Gullion, David S.; And Others

    1988-01-01

    A randomized controlled trial was used to evaluate a physician education program designed to improve physician management of patients' hypertension, hypertension-related behaviors, and diastolic blood pressure. It was suggested that more intensive continuing medical education programs are needed to improve physician performance and patient outcome.…

  14. Smoking cessation and outcome after ischemic stroke or TIA.

    PubMed

    Epstein, Katherine A; Viscoli, Catherine M; Spence, J David; Young, Lawrence H; Inzucchi, Silvio E; Gorman, Mark; Gerstenhaber, Brett; Guarino, Peter D; Dixit, Anand; Furie, Karen L; Kernan, Walter N

    2017-10-17

    To assess whether smoking cessation after an ischemic stroke or TIA improves outcomes compared to continued smoking. We conducted a prospective observational cohort study of 3,876 nondiabetic men and women enrolled in the Insulin Resistance Intervention After Stroke (IRIS) trial who were randomized to pioglitazone or placebo within 180 days of a qualifying stroke or TIA and followed up for a median of 4.8 years. A tobacco use history was obtained at baseline and updated during annual interviews. The primary outcome, which was not prespecified in the IRIS protocol, was recurrent stroke, myocardial infarction (MI), or death. Cox regression models were used to assess the differences in stroke, MI, and death after 4.8 years, with correction for adjustment variables prespecified in the IRIS trial: age, sex, stroke (vs TIA) as index event, history of stroke, history of hypertension, history of coronary artery disease, and systolic and diastolic blood pressures. At the time of their index event, 1,072 (28%) patients were current smokers. By the time of randomization, 450 (42%) patients had quit smoking. Among quitters, the 5-year risk of stroke, MI, or death was 15.7% compared to 22.6% for patients who continued to smoke (adjusted hazard ratio 0.66, 95% confidence interval 0.48-0.90). Cessation of cigarette smoking after an ischemic stroke or TIA was associated with significant health benefits over 4.8 years in the IRIS trial cohort. © 2017 American Academy of Neurology.

  15. Efficient geostatistical inversion of transient groundwater flow using preconditioned nonlinear conjugate gradients

    NASA Astrophysics Data System (ADS)

    Klein, Ole; Cirpka, Olaf A.; Bastian, Peter; Ippisch, Olaf

    2017-04-01

    In the geostatistical inverse problem of subsurface hydrology, continuous hydraulic parameter fields, in most cases hydraulic conductivity, are estimated from measurements of dependent variables, such as hydraulic heads, under the assumption that the parameter fields are autocorrelated random space functions. Upon discretization, the continuous fields become large parameter vectors with O(10^4-10^7) elements. While cokriging-like inversion methods have been shown to be efficient for highly resolved parameter fields when the number of measurements is small, they require the calculation of the sensitivity of each measurement with respect to all parameters, which may become prohibitive with large sets of measured data such as those arising from transient groundwater flow. We present a Preconditioned Conjugate Gradient method for the geostatistical inverse problem, in which a single adjoint equation needs to be solved to obtain the gradient of the objective function. Using the autocovariance matrix of the parameters as preconditioning matrix, expensive multiplications with its inverse can be avoided, and the number of iterations is significantly reduced. We use a randomized spectral decomposition of the posterior covariance matrix of the parameters to perform a linearized uncertainty quantification of the parameter estimate. The feasibility of the method is tested by virtual examples of head observations in steady-state and transient groundwater flow. These synthetic tests demonstrate that transient data can reduce both parameter uncertainty and time spent conducting experiments, while the presented methods are able to handle the resulting large number of measurements.
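
    The role of the preconditioner can be illustrated on a small quadratic objective. The sketch below uses a Polak-Ribiere update with a fixed step in place of a line search, and an assumed matrix standing in for the autocovariance preconditioner; it is a toy analogue, not the paper's solver.

        import numpy as np

        def preconditioned_ncg(grad, precond, x0, n_iter=60, step=0.1):
            x, g = x0.copy(), grad(x0)
            z = precond(g)              # z = M^{-1} g, the preconditioned gradient
            d = -z
            for _ in range(n_iter):
                x = x + step * d        # fixed step; a line search would go here
                g_new = grad(x)
                z_new = precond(g_new)
                beta = max(0.0, z_new @ (g_new - g) / (z @ g))  # PR+ update
                d = -z_new + beta * d
                g, z = g_new, z_new
            return x

        A = np.diag([1.0, 10.0, 100.0])            # ill-conditioned "Hessian"
        b = np.ones(3)
        grad = lambda x: A @ x - b                 # objective 0.5 x'Ax - b'x
        precond = lambda g: np.linalg.solve(A, g)  # ideal preconditioner here
        print(preconditioned_ncg(grad, precond, np.zeros(3)))  # ~ A^{-1} b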

  16. Spatial modelling and mapping of female genital mutilation in Kenya.

    PubMed

    Achia, Thomas N O

    2014-03-25

    Female genital mutilation/cutting (FGM/C) is still prevalent in several communities in Kenya and other areas in Africa, as well as being practiced by some migrants from African countries living in other parts of the world. This study aimed at detecting clustering of FGM/C in Kenya, and identifying those areas within the country where women still intend to continue the practice. A broader goal of the study was to identify geographical areas where the practice continues unabated and where broad intervention strategies need to be introduced. The prevalence of FGM/C was investigated using the 2008 Kenya Demographic and Health Survey (KDHS) data. The 2008 KDHS used a multistage stratified random sampling plan to select women of reproductive age (15-49 years) and asked questions concerning their FGM/C status and their support for the continuation of FGM/C. A spatial scan statistical analysis was carried out using SaTScan™ to test for statistically significant clustering of the practice of FGM/C in the country. The risk of FGM/C was also modelled and mapped using a hierarchical spatial model under the Integrated Nested Laplace approximation approach using the INLA library in R. The prevalence of FGM/C stood at 28.2% and an estimated 10.3% of the women interviewed indicated that they supported the continuation of FGM. On the basis of the Deviance Information Criterion (DIC), hierarchical spatial models with spatially structured random effects were found to best fit the data for both response variables considered. Age, region, rural-urban classification, education, marital status, religion, socioeconomic status and media exposure were found to be significantly associated with FGM/C. The current FGM/C status of a woman was also a significant predictor of support for the continuation of FGM/C. Spatial scan statistics confirm FGM clusters in the North-Eastern and South-Western regions of Kenya (p<0.001). This suggests that the fight against FGM/C in Kenya is not yet over. There are still deep cultural and religious beliefs to be addressed in a bid to eradicate the practice. Interventions by government and other stakeholders must address these challenges and target the identified clusters.

  17. Ratio index variables or ANCOVA? Fisher's cats revisited.

    PubMed

    Tu, Yu-Kang; Law, Graham R; Ellison, George T H; Gilthorpe, Mark S

    2010-01-01

    Over 60 years ago Ronald Fisher demonstrated a number of potential pitfalls with statistical analyses using ratio variables. Nonetheless, these pitfalls are largely overlooked in contemporary clinical and epidemiological research, which routinely uses ratio variables in statistical analyses. This article aims to demonstrate how very different findings can be generated as a result of less than perfect correlations among the data used to generate ratio variables. These imperfect correlations result from measurement error and random biological variation. While the former can often be reduced by improvements in measurement, random biological variation is difficult to estimate and eliminate in observational studies. Moreover, wherever the underlying biological relationships among epidemiological variables are unclear, and hence the choice of statistical model is also unclear, the different findings generated by different analytical strategies can lead to contradictory conclusions. Caution is therefore required when interpreting analyses of ratio variables whenever the underlying biological relationships among the variables involved are unspecified or unclear. (c) 2009 John Wiley & Sons, Ltd.
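
    The pitfall is easy to reproduce: two quantities that are independent of each other become correlated once both are divided by a shared, noisy denominator. A small demonstration with synthetic data:

        import numpy as np

        rng = np.random.default_rng(3)
        n = 10_000
        x = rng.normal(10, 1, n)  # numerator 1, independent of y
        y = rng.normal(10, 1, n)  # numerator 2
        z = rng.normal(10, 1, n)  # shared denominator (e.g., body size)

        print("corr(x, y)     =", round(np.corrcoef(x, y)[0, 1], 3))          # ~ 0
        print("corr(x/z, y/z) =", round(np.corrcoef(x / z, y / z)[0, 1], 3))  # ~ 0.5, spurious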

  18. Smoking cessation and reduction in schizophrenia (SCARIS) with e-cigarette: study protocol for a randomized control trial.

    PubMed

    Caponnetto, Pasquale; Polosa, Riccardo; Auditore, Roberta; Minutolo, Giuseppe; Signorelli, Maria; Maglia, Marilena; Alamo, Angela; Palermo, Filippo; Aguglia, Eugenio

    2014-03-22

    It is well established in studies across several countries that tobacco smoking is more prevalent among schizophrenic patients than the general population. Electronic cigarettes are becoming increasingly popular with smokers worldwide. To date there are no large randomized trials of electronic cigarettes in schizophrenic smokers. A well-designed trial is needed to compare efficacy and safety of these products in this special population. We have designed a randomized controlled trial investigating the efficacy and safety of electronic cigarettes. The trial will take the form of a prospective 12-month randomized clinical study to evaluate smoking reduction, smoking abstinence and adverse events in schizophrenic smokers not intending to quit. We will also monitor quality of life, neurocognitive functioning and measure participants' perception and satisfaction of the product. A ≥50% reduction in the number of cigarettes/day from baseline will be calculated at each study visit ("reducers"). Abstinence from smoking will be calculated at each study visit ("quitters"). Smokers who leave the study protocol before its completion and carry out the Early Termination Visit, or who do not satisfy the criteria for "reducers" or "quitters", will be defined as "non-responders". Differences in continuous variables between the three groups will be evaluated with the Kruskal-Wallis test, followed by the Dunn multiple comparison test. Differences between the three groups for normally distributed data will be evaluated with one-way ANOVA, followed by the Newman-Keuls multiple comparison test. The normality of the distribution will be evaluated with the Kolmogorov-Smirnov test. Any correlations between the variables under evaluation will be assessed by Spearman's r correlation. Qualitative data will be compared using the chi-square test. The main strengths of the SCARIS study are the following: it is the first large RCT on schizophrenic patients, involving both inpatients and outpatients, with a three-arm study design and long-term follow-up (52 weeks). The goal is to propose an effective intervention to reduce the risk of tobacco smoking, as a complementary tool to treat tobacco addiction in schizophrenia. ClinicalTrials.gov, NCT01979796.

  19. Screening large-scale association study data: exploiting interactions using random forests.

    PubMed

    Lunetta, Kathryn L; Hayward, L Brooke; Segal, Jonathan; Van Eerdewegh, Paul

    2004-12-10

    Genome-wide association studies for complex diseases will produce genotypes on hundreds of thousands of single nucleotide polymorphisms (SNPs). A logical first approach to dealing with massive numbers of SNPs is to use some test to screen the SNPs, retaining only those that meet some criterion for further study. For example, SNPs can be ranked by p-value, and those with the lowest p-values retained. When SNPs have large interaction effects but small marginal effects in a population, they are unlikely to be retained when univariate tests are used for screening. However, model-based screens that pre-specify interactions are impractical for data sets with thousands of SNPs. Random forest analysis is an alternative method that produces a single measure of importance for each predictor variable that takes into account interactions among variables without requiring model specification. Interactions increase the importance for the individual interacting variables, making them more likely to be given high importance relative to other variables. We test the performance of random forests as a screening procedure to identify small numbers of risk-associated SNPs from among large numbers of unassociated SNPs using complex disease models with up to 32 loci, incorporating both genetic heterogeneity and multi-locus interaction. Keeping other factors constant, if risk SNPs interact, the random forest importance measure significantly outperforms the Fisher Exact test as a screening tool. As the number of interacting SNPs increases, the improvement in performance of random forest analysis relative to Fisher Exact test for screening also increases. Random forests perform similarly to the univariate Fisher Exact test as a screening tool when SNPs in the analysis do not interact. In the context of large-scale genetic association studies where unknown interactions exist among true risk-associated SNPs or SNPs and environmental covariates, screening SNPs using random forest analyses can significantly reduce the number of SNPs that need to be retained for further study compared to standard univariate screening methods.
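
    The screening idea can be sketched with synthetic genotypes: SNPs whose effect is purely interactive still receive elevated random-forest importance. scikit-learn is assumed; the data, sizes, and effect sizes are illustrative.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(4)
        n, n_snps = 500, 100
        X = rng.integers(0, 3, size=(n, n_snps))  # genotypes coded 0/1/2
        # risk depends on an interaction of SNP 0 and SNP 1, weak marginal effects
        logit = 0.9 * (X[:, 0] > 0) * (X[:, 1] > 0) - 0.4
        y = rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit))

        rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
        top = np.argsort(rf.feature_importances_)[::-1][:10]
        print("top-ranked SNPs:", top)  # the interacting SNPs 0 and 1 should rank high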

  20. Random variability explains apparent global clustering of large earthquakes

    USGS Publications Warehouse

    Michael, A.J.

    2011-01-01

    The occurrence of 5 Mw ≥ 8.5 earthquakes since 2004 has created a debate over whether or not we are in a global cluster of large earthquakes, temporarily raising risks above long-term levels. I use three classes of statistical tests to determine if the record of M ≥ 7 earthquakes since 1900 can reject a null hypothesis of independent random events with a constant rate plus localized aftershock sequences. The data cannot reject this null hypothesis. Thus, the temporal distribution of large global earthquakes is well-described by a random process, plus localized aftershocks, and apparent clustering is due to random variability. Therefore the risk of future events has not increased, except within ongoing aftershock sequences, and should be estimated from the longest possible record of events.
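
    One simple test in this spirit compares the dispersion of annual counts against a constant-rate Poisson null by Monte Carlo. The counts below are simulated stand-ins, not the actual catalog.

        import numpy as np

        rng = np.random.default_rng(5)
        counts = rng.poisson(15, size=111)             # stand-in for annual M>=7 counts
        disp_obs = counts.var(ddof=1) / counts.mean()  # index of dispersion

        disp_null = []
        for _ in range(10_000):
            c = rng.poisson(counts.mean(), size=counts.size)
            disp_null.append(c.var(ddof=1) / c.mean())
        p = np.mean(np.array(disp_null) >= disp_obs)
        print(f"observed dispersion {disp_obs:.2f}, one-sided p = {p:.3f}")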

  1. Nonlinear dual-axis biodynamic response of the semi-supine human body during longitudinal horizontal whole-body vibration

    NASA Astrophysics Data System (ADS)

    Huang, Ya; Griffin, Michael J.

    2008-04-01

    The resonance frequencies in frequency response functions of the human body (e.g. apparent mass and transmissibility) decrease with increasing vibration magnitude. This nonlinear biodynamic response is found with various sitting and standing postures requiring postural control. The present study measured the apparent mass of the body in a relaxed semi-supine posture with two types of longitudinal horizontal vibration (in the z-axis of the semi-supine body): (i) continuous random excitation (0.25-20 Hz) at five magnitudes (0.125, 0.25, 0.5, 0.75 and 1.0 m s⁻² rms); (ii) intermittent random excitation (0.25-20 Hz) alternately at 0.25 and 1.0 m s⁻² rms. With continuous random vibration, the dominant primary resonance frequency in the median normalised apparent mass decreased from 3.7 to 2.4 Hz as the vibration magnitude increased from 0.125 to 1.0 m s⁻² rms. A nonlinear response was apparent in both the horizontal (z-axis) apparent mass and the vertical (x-axis) cross-axis apparent mass. With intermittent random vibration, as the vibration magnitude increased from 0.25 to 1.0 m s⁻² rms, the median resonance frequency of the apparent mass decreased from 3.2 to 2.5 Hz whereas, with continuous random vibration over the same range of magnitudes, the resonance frequency decreased from 3.4 to 2.4 Hz. The median change in the resonance frequency (between 0.25 and 1.0 m s⁻² rms) was 0.6 Hz with the intermittent random vibration and 0.9 Hz with the continuous random vibration. With intermittent vibration, the resonance frequency was higher at the high magnitude and lower at the low magnitude than with continuous vibration at the same magnitudes. The responses were consistent with passive thixotropy being a primary cause of nonlinear biodynamic responses to whole-body vibration, although reflex activity of the muscles may also have an influence.

  2. Nonlinear dual-axis biodynamic response of the semi-supine human body during vertical whole-body vibration

    NASA Astrophysics Data System (ADS)

    Huang, Ya; Griffin, Michael J.

    2008-04-01

    Nonlinear biodynamic responses are evident in many studies of the apparent masses of sitting and standing subjects in static postures that require muscle activity for postural control. In the present study, 12 male subjects adopted a relaxed semi-supine posture assumed to involve less muscle activity than during static sitting and standing. The supine subjects were exposed to two types of vertical vibration (in the x-axis of the semi-supine body): (i) continuous random vibration (0.25-20 Hz) at five magnitudes (0.125, 0.25, 0.5, 0.75, and 1.0 m s⁻² rms); (ii) intermittent random vibration (0.25-20 Hz) alternately at 0.25 and 1.0 m s⁻² rms. With continuous random vibration, the dominant primary resonance frequency in the median normalised apparent mass decreased from 10.35 to 7.32 Hz as the vibration magnitude increased from 0.125 to 1.0 m s⁻² rms. This nonlinear response was apparent in both the vertical (x-axis) apparent mass and in the horizontal (z-axis) cross-axis apparent mass. As the vibration magnitude increased from 0.25 to 1.0 m s⁻² rms, the median resonance frequency of the apparent mass with intermittent random vibration decreased from 9.28 to 8.06 Hz whereas, over the same range of magnitudes with continuous random vibration, the resonance frequency decreased from 9.62 to 7.81 Hz. The median change in the resonance frequency (between 0.25 and 1.0 m s⁻² rms) was 1.37 Hz with the intermittent random vibration and 1.71 Hz with the continuous random vibration. With the intermittent vibration, the resonance frequency was higher at the high magnitude and lower at the low magnitude than with continuous vibration of the same magnitudes. The response was typical of thixotropy, which may be a primary cause of the nonlinear biodynamic responses to whole-body vibration.

  3. Probabilistic evaluation of fuselage-type composite structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Chamis, Christos C.

    1992-01-01

    A methodology is developed to computationally simulate the uncertain behavior of composite structures. The uncertain behavior includes buckling loads, natural frequencies, displacements, stress/strain etc., which are the consequences of the random variation (scatter) of the primitive (independent random) variables in the constituent, ply, laminate and structural levels. This methodology is implemented in the IPACS (Integrated Probabilistic Assessment of Composite Structures) computer code. A fuselage-type composite structure is analyzed to demonstrate the code's capability. The probability distribution functions of the buckling loads, natural frequency, displacement, strain and stress are computed. The sensitivity of each primitive (independent random) variable to a given structural response is also identified from the analyses.
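
    The propagation step can be mimicked generically by Monte Carlo: sample the primitive variables, push them through a response function, and rank each variable's influence. The response function and scatter below are notional placeholders, not the IPACS models.

        import numpy as np

        rng = np.random.default_rng(6)
        n = 20_000
        E = rng.normal(70e9, 3.5e9, n)  # ply modulus (Pa), illustrative scatter
        t = rng.normal(2e-3, 1e-4, n)   # laminate thickness (m)
        P = rng.normal(1e5, 1e4, n)     # applied load (N)

        response = E * t**3 / P         # notional buckling-type response

        for name, v in [("E", E), ("t", t), ("P", P)]:
            r = np.corrcoef(v, response)[0, 1]
            print(f"sensitivity of response to {name}: r = {r:+.2f}")
        print("5th-percentile response:", np.percentile(response, 5))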

  4. Bayesian dynamic modeling of time series of dengue disease case counts.

    PubMed

    Martínez-Bello, Daniel Adyro; López-Quílez, Antonio; Torres-Prieto, Alexander

    2017-07-01

    The aim of this study is to model the association between weekly time series of dengue case counts and meteorological variables, in a high-incidence city of Colombia, applying Bayesian hierarchical dynamic generalized linear models over the period January 2008 to August 2015. Additionally, we evaluate the model's short-term performance for predicting dengue cases. The methodology uses dynamic Poisson log-link models including constant or time-varying coefficients for the meteorological variables. Calendar effects were modeled using constant or first- or second-order random walk time-varying coefficients. The meteorological variables were modeled using constant coefficients and first-order random walk time-varying coefficients. We applied Markov Chain Monte Carlo simulations for parameter estimation, and the deviance information criterion (DIC) for model selection. We assessed the short-term predictive performance of the selected final model, at several time points within the study period, using the mean absolute percentage error. The best model included first-order random walk time-varying coefficients for the calendar trend and for the meteorological variables. Besides the computational challenges, interpreting the results implies a complete analysis of the time series of dengue with respect to the parameter estimates of the meteorological effects. We found small values of the mean absolute percentage error at one- or two-week out-of-sample predictions for most prediction points, associated with low-volatility periods in the dengue counts. We discuss the advantages and limitations of the dynamic Poisson models for studying the association between time series of dengue disease and meteorological variables. The key conclusion of the study is that dynamic Poisson models account for the dynamic nature of the variables involved in the modeling of time series of dengue disease, producing useful models for decision-making in public health.
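
    The out-of-sample metric used above is straightforward to compute; the arrays here are placeholders for observed and predicted weekly counts.

        import numpy as np

        def mape(observed, predicted):
            observed = np.asarray(observed, dtype=float)
            predicted = np.asarray(predicted, dtype=float)
            return 100.0 * np.mean(np.abs((observed - predicted) / observed))

        print(mape([120, 95, 143], [110, 100, 150]))  # about 6.2 (percent)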

  5. Financial Management of a Large Multi-site Randomized Clinical Trial

    PubMed Central

    Sheffet, Alice J.; Flaxman, Linda; Tom, MeeLee; Hughes, Susan E.; Longbottom, Mary E.; Howard, Virginia J.; Marler, John R.; Brott, Thomas G.

    2014-01-01

    Background The Carotid Revascularization Endarterectomy versus Stenting Trial (CREST) received five years’ funding ($21,112,866) from the National Institutes of Health to compare carotid stenting to surgery for stroke prevention in 2,500 randomized participants at 40 sites. Aims Herein we evaluate the change in the CREST budget from a fixed to variable-cost model and recommend strategies for the financial management of large-scale clinical trials. Methods Projections of the original grant’s fixed-cost model were compared to the actual costs of the revised variable-cost model. The original grant’s fixed-cost budget included salaries, fringe benefits, and other direct and indirect costs. For the variable-cost model, the costs were actual payments to the clinical sites and core centers based upon actual trial enrollment. We compared annual direct and indirect costs and per-patient cost for both the fixed and variable models. Differences between clinical site and core center expenditures were also calculated. Results Using a variable-cost budget for clinical sites, funding was extended by no-cost extension from five to eight years. Randomizing sites tripled from 34 to 109. Of the 2,500 targeted sample size, 138 (5.5%) were randomized during the first five years and 1,387 (55.5%) during the no-cost extension. The actual per-patient costs of the variable model were 9% ($13,845) of the projected per-patient costs ($152,992) of the fixed model. Conclusions Performance-based budgets conserve funding, promote compliance, and allow for additional sites at modest additional cost. Costs of large-scale clinical trials can thus be reduced through effective management without compromising scientific integrity. PMID:24661748

  6. Financial management of a large multisite randomized clinical trial.

    PubMed

    Sheffet, Alice J; Flaxman, Linda; Tom, MeeLee; Hughes, Susan E; Longbottom, Mary E; Howard, Virginia J; Marler, John R; Brott, Thomas G

    2014-08-01

    The Carotid Revascularization Endarterectomy versus Stenting Trial (CREST) received five years' funding ($21,112,866) from the National Institutes of Health to compare carotid stenting to surgery for stroke prevention in 2,500 randomized participants at 40 sites. Herein we evaluate the change in the CREST budget from a fixed to variable-cost model and recommend strategies for the financial management of large-scale clinical trials. Projections of the original grant's fixed-cost model were compared to the actual costs of the revised variable-cost model. The original grant's fixed-cost budget included salaries, fringe benefits, and other direct and indirect costs. For the variable-cost model, the costs were actual payments to the clinical sites and core centers based upon actual trial enrollment. We compared annual direct and indirect costs and per-patient cost for both the fixed and variable models. Differences between clinical site and core center expenditures were also calculated. Using a variable-cost budget for clinical sites, funding was extended by no-cost extension from five to eight years. Randomizing sites tripled from 34 to 109. Of the 2,500 targeted sample size, 138 (5.5%) were randomized during the first five years and 1,387 (55.5%) during the no-cost extension. The actual per-patient costs of the variable model were 9% ($13,845) of the projected per-patient costs ($152,992) of the fixed model. Performance-based budgets conserve funding, promote compliance, and allow for additional sites at modest additional cost. Costs of large-scale clinical trials can thus be reduced through effective management without compromising scientific integrity. © 2014 The Authors. International Journal of Stroke © 2014 World Stroke Organization.

  7. Visualizing Time-Varying Distribution Data in EOS Application

    NASA Technical Reports Server (NTRS)

    Shen, Han-Wei

    2004-01-01

    In this research, we have developed several novel visualization methods for spatial probability density function data. Our focus has been on 2D spatial datasets, where each pixel is a random variable with multiple samples that are the results of experiments on that random variable. We developed novel clustering algorithms as a means to reduce the information contained in these datasets and investigated different ways of interpreting and clustering the data.

  8. Decisions with Uncertain Consequences—A Total Ordering on Loss-Distributions

    PubMed Central

    König, Sandra; Schauer, Stefan

    2016-01-01

    Decisions are often based on imprecise, uncertain or vague information. Likewise, the consequences of an action are often equally unpredictable, thus putting the decision maker into a twofold jeopardy. Assuming that the effects of an action can be modeled by a random variable, then the decision problem boils down to comparing different effects (random variables) by comparing their distribution functions. Although the full space of probability distributions cannot be ordered, a properly restricted subset of distributions can be totally ordered in a practically meaningful way. We call these loss-distributions, since they provide a substitute for the concept of loss-functions in decision theory. This article introduces the theory behind the necessary restrictions and the hereby constructible total ordering on random loss variables, which enables decisions under uncertainty of consequences. Using data obtained from simulations, we demonstrate the practical applicability of our approach. PMID:28030572

  9. Bayes to the Rescue: Continuous Positive Airway Pressure Has Less Mortality Than High-Flow Oxygen.

    PubMed

    Modesto I Alapont, Vicent; Khemani, Robinder G; Medina, Alberto; Del Villar Guerra, Pablo; Molina Cambra, Alfred

    2017-02-01

    The merits of high-flow nasal cannula oxygen versus bubble continuous positive airway pressure are debated in children with pneumonia, with suggestions that randomized controlled trials are needed. In light of a previous randomized controlled trial showing a trend for lower mortality with bubble continuous positive airway pressure, we sought to determine the probability that a new randomized controlled trial would find high-flow nasal cannula oxygen superior to bubble continuous positive airway pressure through a "robust" Bayesian analysis. Sample data were extracted from the trial by Chisti et al, and requisite to "robust" Bayesian analysis, we specified three prior distributions to represent clinically meaningful assumptions. These priors (reference, pessimistic, and optimistic) were used to generate three scenarios to represent the range of possible hypotheses. 1) "Reference": we believe bubble continuous positive airway pressure and high-flow nasal cannula oxygen are equally effective with the same uninformative reference priors; 2) "Sceptic on high-flow nasal cannula oxygen": we believe that bubble continuous positive airway pressure is better than high-flow nasal cannula oxygen (bubble continuous positive airway pressure has an optimistic prior and high-flow nasal cannula oxygen has a pessimistic prior); and 3) "Enthusiastic on high-flow nasal cannula oxygen": we believe that high-flow nasal cannula oxygen is better than bubble continuous positive airway pressure (high-flow nasal cannula oxygen has an optimistic prior and bubble continuous positive airway pressure has a pessimistic prior). Finally, posterior empiric Bayesian distributions were obtained through 100,000 Markov Chain Monte Carlo simulations. In all three scenarios, there was a high probability for more death from high-flow nasal cannula oxygen compared with bubble continuous positive airway pressure (reference, 0.98; sceptic on high-flow nasal cannula oxygen, 0.982; enthusiastic on high-flow nasal cannula oxygen, 0.742). The posterior 95% credible interval on the difference in mortality identified a future randomized controlled trial would be extremely unlikely to find a mortality benefit for high-flow nasal cannula oxygen over bubble continuous positive airway pressure, regardless of the scenario. Interpreting these findings using the "range of practical equivalence" framework would recommend rejecting the hypothesis that high-flow nasal cannula oxygen is superior to bubble continuous positive airway pressure for these children. For children younger than 5 years with pneumonia, high-flow nasal cannula oxygen has higher mortality than bubble continuous positive airway pressure. A future randomized controlled trial in this population is unlikely to find high-flow nasal cannula oxygen superior to bubble continuous positive airway pressure.
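
    The posterior comparison can be reproduced in miniature with a conjugate Beta-Binomial sketch; the arm counts and the flat Beta(1,1) prior below are illustrative assumptions, whereas the paper used several informative priors and Markov Chain Monte Carlo simulation.

        import numpy as np

        rng = np.random.default_rng(7)
        deaths_hfnc, n_hfnc = 12, 79    # hypothetical arm totals
        deaths_bcpap, n_bcpap = 3, 79

        p_hfnc = rng.beta(1 + deaths_hfnc, 1 + n_hfnc - deaths_hfnc, 100_000)
        p_bcpap = rng.beta(1 + deaths_bcpap, 1 + n_bcpap - deaths_bcpap, 100_000)
        print("P(mortality HFNC > mortality bCPAP) =", np.mean(p_hfnc > p_bcpap))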

  10. Continuous equilibrium scores: factoring in the time before a fall.

    PubMed

    Wood, Scott J; Reschke, Millard F; Owen Black, F

    2012-07-01

    The equilibrium (EQ) score commonly used in computerized dynamic posturography is normalized between 0 and 100, with falls assigned a score of 0. The resulting mixed discrete-continuous distribution limits certain statistical analyses and treats all trials with falls equally. We propose a simple modification of the formula in which peak-to-peak sway data from trials with falls are scaled according to the percentage of the trial completed to derive a continuous equilibrium (cEQ) score. The cEQ scores for trials without falls remain unchanged from the original methodology. The cEQ factors in the time before a fall and results in a continuous variable retaining the central tendencies of the original EQ distribution. A random set of 5315 Sensory Organization Test trials were pooled that included 81 falls. A comparison of the original and cEQ distributions and their rank ordering demonstrated that trials with falls continue to constitute the lower range of scores with the cEQ methodology. The area under the receiver operating characteristic curve (0.997) demonstrates that the cEQ retained near-perfect discrimination between trials with and without falls. We conclude that the cEQ score provides the ability to discriminate ballistic falls from falls that occur later in the trial. This approach of incorporating time and sway magnitude can be easily extended to enhance other balance tests that include fall data or incomplete trials. Copyright © 2012 Elsevier B.V. All rights reserved.
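
    One plausible reading of the modification, sketched below: keep the usual EQ formula for completed trials, and on fall trials inflate the peak-to-peak sway by the fraction of the trial completed so that earlier falls score lower. The 12.5-degree ceiling follows the usual Sensory Organization Test convention; all constants here are assumptions, not the authors' exact formula.

        def ceq_score(theta_pp, fell=False, frac_completed=1.0, ceiling=12.5):
            # non-fall trials reduce to the standard EQ score
            if fell:
                theta_pp = theta_pp / max(frac_completed, 1e-9)  # early falls inflate sway
            return max(0.0, 100.0 * (1.0 - theta_pp / ceiling))

        print(ceq_score(4.0))                                 # completed trial
        print(ceq_score(6.0, fell=True, frac_completed=0.9))  # late fall
        print(ceq_score(6.0, fell=True, frac_completed=0.3))  # ballistic fall scores lower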

  11. Optimizing the models for rapid determination of chlorogenic acid, scopoletin and rutin in plant samples by near-infrared diffuse reflectance spectroscopy

    NASA Astrophysics Data System (ADS)

    Mao, Zhiyi; Shan, Ruifeng; Wang, Jiajun; Cai, Wensheng; Shao, Xueguang

    2014-07-01

    Polyphenols in plant samples have been extensively studied because phenolic compounds are ubiquitous in plants and can be used as antioxidants in promoting human health. A method for rapid determination of three phenolic compounds (chlorogenic acid, scopoletin and rutin) in plant samples using near-infrared diffuse reflectance spectroscopy (NIRDRS) is studied in this work. Partial least squares (PLS) regression was used for building the calibration models, and the effects of spectral preprocessing and variable selection on the models were investigated for optimization of the models. The results show that spectral preprocessing or variable selection alone has little influence on the models, but the combination of the techniques can significantly improve them. The combination of continuous wavelet transform (CWT) for removing the variant background, multiplicative scatter correction (MSC) for correcting the scattering effect, and randomization test (RT) for selecting the informative variables was found to be the best way to build the optimal models. For validation of the models, the polyphenol contents in an independent sample set were predicted. The correlation coefficients between the predicted values and the contents determined by high-performance liquid chromatography (HPLC) analysis are as high as 0.964, 0.948 and 0.934 for chlorogenic acid, scopoletin and rutin, respectively.
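
    Of the three steps named above, multiplicative scatter correction is the simplest to sketch: regress each spectrum on a reference (here the mean spectrum) and remove the fitted offset and slope. The spectra below are random placeholders.

        import numpy as np

        def msc(spectra):
            ref = spectra.mean(axis=0)  # reference = mean spectrum
            corrected = np.empty_like(spectra, dtype=float)
            for i, s in enumerate(spectra):
                slope, intercept = np.polyfit(ref, s, 1)  # fit s ~ intercept + slope*ref
                corrected[i] = (s - intercept) / slope
            return corrected

        spectra = np.random.default_rng(8).uniform(0.2, 1.0, size=(5, 200))
        print(msc(spectra).shape)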

  12. Factors affecting age at first dental exam for children seen at Federally Qualified Health Centers

    PubMed Central

    Kuthy, Raymond A.; Pendharkar, Bhagyashree; Momany, Elizabeth T.; Jones, Michael P.; Askelson, Natoshia M.; Chi, Donald L.; Wehby, George L.; Damiano, Peter C.

    2014-01-01

    Objective To estimate age at first dental visit (FDV) and identify variables predicting earlier visits for Medicaid-enrolled children at Federally Qualified Health Centers (FQHC). Methods Statewide Medicaid claims data were used to draw a random sample of children who received their FDV prior to 6 years of age at a FQHC, were Medicaid-enrolled within the first two months of life, and remained continuously enrolled over the study period. Forty children from each of 5 FQHCs had their dental charts abstracted and merged with other Medicaid records and birth certificate data. The logarithmic age at FDV was regressed against several predictor variables. Results Mean and median ages at FDV were 25.6 and 23 months, respectively. When controlling for other variables, there were differences in FDV age by mother's marital status (p=0.003), number of medical well-child visits (MCV) at a FQHC prior to the FDV (p<0.0001), and which FQHC the child visited. However, only 27.5% of these children had >1 MCV at the FQHC. Conclusion Medicaid-enrolled children who visited FQHCs for FDV were seen at an earlier age than previously recorded for such health centers (i.e., a mean of 4 years). Children who also received their MCVs at FQHCs were more likely to have earlier FDVs. PMID:23756303

  13. Strategic Use of Random Subsample Replication and a Coefficient of Factor Replicability

    ERIC Educational Resources Information Center

    Katzenmeyer, William G.; Stenner, A. Jackson

    1975-01-01

    The problem of demonstrating replicability of factor structure across random variables is addressed. Procedures are outlined which combine the use of random subsample replication strategies with the correlations between factor score estimates across replicate pairs to generate a coefficient of replicability and confidence intervals associated with…

  14. Simulation of the Effects of Random Measurement Errors

    ERIC Educational Resources Information Center

    Kinsella, I. A.; Hannaidh, P. B. O.

    1978-01-01

    Describes a simulation method for studying the effects of random measurement errors that requires only calculators and tables of random digits. Each student simulates the random behaviour of the component variables in the function, and by combining the results of all students, the outline of the sampling distribution of the function can be obtained. (GA)

  15. Population coding in sparsely connected networks of noisy neurons.

    PubMed

    Tripp, Bryan P; Orchard, Jeff

    2012-01-01

    This study examines the relationship between population coding and spatial connection statistics in networks of noisy neurons. Encoding of sensory information in the neocortex is thought to require coordinated neural populations, because individual cortical neurons respond to a wide range of stimuli, and exhibit highly variable spiking in response to repeated stimuli. Population coding is rooted in network structure, because cortical neurons receive information only from other neurons, and because the information they encode must be decoded by other neurons, if it is to affect behavior. However, population coding theory has often ignored network structure, or assumed discrete, fully connected populations (in contrast with the sparsely connected, continuous sheet of the cortex). In this study, we modeled a sheet of cortical neurons with sparse, primarily local connections, and found that a network with this structure could encode multiple internal state variables with high signal-to-noise ratio. However, we were unable to create high-fidelity networks by instantiating connections at random according to spatial connection probabilities. In our models, high-fidelity networks required additional structure, with higher cluster factors and correlations between the inputs to nearby neurons.

  16. Stochastic analysis of uncertain thermal parameters for random thermal regime of frozen soil around a single freezing pipe

    NASA Astrophysics Data System (ADS)

    Wang, Tao; Zhou, Guoqing; Wang, Jianzhou; Zhou, Lei

    2018-03-01

    The artificial ground freezing method (AGF) is widely used in civil and mining engineering, and the thermal regime of frozen soil around the freezing pipe affects the safety of design and construction. The thermal parameters can be truly random due to heterogeneity of the soil properties, which leads to randomness in the thermal regime of frozen soil around the freezing pipe. The purpose of this paper is to study the one-dimensional (1D) random thermal regime problem on the basis of a stochastic analysis model and the Monte Carlo (MC) method. Modeling the uncertain thermal parameters of frozen soil as random variables, stochastic processes, and random fields, the corresponding stochastic thermal regimes of frozen soil around a single freezing pipe are obtained and analyzed. Taking the variability of each stochastic parameter into account individually, the influence of each stochastic thermal parameter on the stochastic thermal regime is investigated. The results show that the mean temperatures of frozen soil around the single freezing pipe are the same for the three stochastic representations, while the standard deviations differ. The distributions of the standard deviation differ greatly across radial locations, and the larger standard deviations occur mainly in the phase-change region. The results computed with the random-variable and stochastic-process representations differ greatly from the measured data, while the results computed with the random-field representation agree well with the measurements. Each uncertain thermal parameter has a different effect on the standard deviation of the frozen soil temperature around the single freezing pipe. These results can provide a theoretical basis for the design and construction of AGF.

  17. Compiling probabilistic, bio-inspired circuits on a field programmable analog array

    PubMed Central

    Marr, Bo; Hasler, Jennifer

    2014-01-01

    A field programmable analog array (FPAA) is presented as an energy and computational efficiency engine: a mixed-mode processor for which functions can be compiled at significantly less energy cost using probabilistic computing circuits. More specifically, it is shown that the core computation of any dynamical system can be computed on the FPAA at significantly less energy per operation than a digital implementation. A stochastic system that is dynamically controllable via voltage-controlled amplifier and comparator thresholds is implemented, which computes Bernoulli random variables. From Bernoulli variables it is shown that exponentially distributed random variables, and random variables of an arbitrary distribution, can be computed. The Gillespie algorithm is simulated to show the utility of this system by calculating the trajectory of a biological system computed stochastically with this probabilistic hardware, where over a 127X performance improvement over current software approaches is shown. The relevance of this approach extends to any dynamical system. The initial circuits and ideas for this work were generated at the 2008 Telluride Neuromorphic Workshop. PMID:24847199
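
    A software analogue of the chain described above: Bernoulli bits are assembled into a uniform variate, which the inverse transform turns into an exponential variate. The hardware specifics (comparator thresholds, voltage-controlled amplifiers) are not modeled.

        import numpy as np

        rng = np.random.default_rng(9)

        def uniform_from_bernoulli(n_bits=32):
            bits = rng.random(n_bits) < 0.5  # fair Bernoulli bits
            return float((bits * 0.5 ** np.arange(1, n_bits + 1)).sum())  # binary fraction

        def exponential_from_bernoulli(lam=1.0):
            u = uniform_from_bernoulli()
            return -np.log(1.0 - u) / lam  # inverse transform

        print([round(exponential_from_bernoulli(), 3) for _ in range(5)])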

  18. Robustness-Based Design Optimization Under Data Uncertainty

    NASA Technical Reports Server (NTRS)

    Zaman, Kais; McDonald, Mark; Mahadevan, Sankaran; Green, Lawrence

    2010-01-01

    This paper proposes formulations and algorithms for design optimization under both aleatory (i.e., natural or physical variability) and epistemic uncertainty (i.e., imprecise probabilistic information), from the perspective of system robustness. The proposed formulations deal with epistemic uncertainty arising from both sparse and interval data without any assumption about the probability distributions of the random variables. A decoupled approach is proposed in this paper to un-nest the robustness-based design from the analysis of non-design epistemic variables to achieve computational efficiency. The proposed methods are illustrated for the upper stage design problem of a two-stage-to-orbit (TSTO) vehicle, where the information on the random design inputs are only available as sparse point and/or interval data. As collecting more data reduces uncertainty but increases cost, the effect of sample size on the optimality and robustness of the solution is also studied. A method is developed to determine the optimal sample size for sparse point data that leads to the solutions of the design problem that are least sensitive to variations in the input random variables.

  19. Not seeking yet trying long-acting reversible contraception: a 24-month randomized trial on continuation, unintended pregnancy and satisfaction.

    PubMed

    Hubacher, David; Spector, Hannah; Monteith, Charles; Chen, Pai-Lien

    2018-06-01

    To measure the 24-month impact on continuation, unintended pregnancy and satisfaction of trying long-acting reversible contraception (LARC) in a population seeking short-acting reversible contraception (SARC). We enrolled 916 women aged 18-29 who were seeking pills or injectables in a partially randomized patient preference trial. Women with strong preferences for pills or injectables started on those products, while others opted for randomization to LARC or SARC and received their methods gratis. We estimated continuation and unintended pregnancy rates through 24 months. Intent-to-treat principles were applied after method initiation for comparing incidence of unintended pregnancy. We also examined how satisfaction levels varied by cohort and how baseline negative LARC attitudes were associated with satisfaction over time. Forty-three percent chose randomization, and 57% chose the preference option. Complete loss to follow-up was <2%. The 24-month LARC continuation probability was 64.3% [95% confidence interval (CI): 56.6-70.9], statistically higher than in the SARC groups [25.5% (randomized) and 40.0% (preference)]. The 24-month cumulative unintended pregnancy probabilities were 9.9% (95% CI: 7.2-12.6) (preference-SARC), 6.9% (95% CI: 3.3-10.6) (randomized-SARC) and 3.6% (95% CI: 1.8-6.4) (randomized-LARC). Statistical tests for comparing randomized groups on unintended pregnancy were mixed: binomial at the 24-month time point (p=.02) and log-rank survival probabilities (p=.14 for first pregnancies and p=.07 when including second pregnancies). LARC satisfaction was high (80% happy/neutral, 73% would use LARC again, 81% would recommend to a friend). Baseline negative attitudes toward LARC (27%) were not clearly associated with satisfaction or early discontinuation. The decision to try LARC resulted in high continuation rates and substantial protection from unintended pregnancy over 24 months. Despite participants' initial desires to begin short-acting regimens, they had high satisfaction with LARC. Voluntary decisions to try LARC will benefit large proportions of typical SARC users. Even women who do not necessarily view LARC as a first choice may have a highly satisfying experience and avoid unintended pregnancy if they try it. Copyright © 2018 Elsevier Inc. All rights reserved.

  20. Testing quantum contextuality of continuous-variable states

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKeown, Gerard; Paternostro, Mauro; Paris, Matteo G. A.

    2011-06-15

    We investigate the violation of noncontextuality by a class of continuous-variable states, including variations of entangled coherent states and a two-mode continuous superposition of coherent states. We generalize the Kochen-Specker (KS) inequality discussed by Cabello [A. Cabello, Phys. Rev. Lett. 101, 210401 (2008)] by using effective bidimensional observables implemented through physical operations acting on continuous-variable states, in a way similar to an approach to the falsification of Bell-Clauser-Horne-Shimony-Holt inequalities put forward recently. We test for state-independent violation of KS inequalities under variable degrees of state entanglement and mixedness. We then demonstrate theoretically the violation of a KS inequality for any two-mode state by using pseudospin observables and a generalized quasiprobability function.

  1. Continuous-time random walks with reset events. Historical background and new perspectives

    NASA Astrophysics Data System (ADS)

    Montero, Miquel; Masó-Puigdellosas, Axel; Villarroel, Javier

    2017-09-01

    In this paper, we consider a stochastic process that may experience random reset events which relocate the system to its starting position. We focus our attention on a one-dimensional, monotonic continuous-time random walk with a constant drift: the process moves in a fixed direction between the reset events, either by the effect of the random jumps, or by the action of a deterministic bias. However, the orientation of its motion is randomly determined after each restart. As a result of these alternating dynamics, interesting properties do emerge. General formulas for the propagator as well as for two extreme statistics, the survival probability and the mean first-passage time, are also derived. The rigor of these analytical results is verified by numerical estimations, for particular but illuminating examples.
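
    Under simplifying assumptions (constant speed v, Poissonian resets at rate r back to the origin, direction re-drawn at each restart), the mean first-passage time to a level L can be estimated numerically, in the spirit of the statistics derived in the paper:

        import numpy as np

        rng = np.random.default_rng(10)
        v, r, L, dt = 1.0, 0.5, 2.0, 1e-3

        def first_passage_time(t_max=200.0):
            x, t = 0.0, 0.0
            sign = rng.choice([-1.0, 1.0])  # initial orientation
            while t < t_max:
                if rng.random() < r * dt:   # reset event
                    x, sign = 0.0, rng.choice([-1.0, 1.0])
                x += sign * v * dt
                t += dt
                if x >= L:
                    return t
            return np.nan                   # censored realization

        samples = np.array([first_passage_time() for _ in range(200)])
        print("estimated mean first-passage time:", np.nanmean(samples))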

  2. SETI and SEH (Statistical Equation for Habitables)

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2011-01-01

    The statistics of habitable planets may be based on a set of ten (and possibly more) astrobiological requirements first pointed out by Stephen H. Dole in his book "Habitable planets for man" (1964). In this paper, we first provide the statistical generalization of the original and by now too simplistic Dole equation. In other words, a product of ten positive numbers is now turned into the product of ten positive random variables. This we call the SEH, an acronym standing for "Statistical Equation for Habitables". The mathematical structure of the SEH is then derived. The proof is based on the central limit theorem (CLT) of Statistics. In loose terms, the CLT states that the sum of any number of independent random variables, each of which may be arbitrarily distributed, approaches a Gaussian (i.e. normal) random variable. This is called the Lyapunov form of the CLT, or the Lindeberg form of the CLT, depending on the mathematical constraints assumed on the third moments of the various probability distributions. In conclusion, we show that the new random variable NHab, yielding the number of habitables (i.e. habitable planets) in the Galaxy, follows the lognormal distribution. By construction, the mean value of this lognormal distribution is the total number of habitable planets as given by the statistical Dole equation. But now we also derive the standard deviation, the mode, the median and all the moments of this new lognormal NHab random variable. The ten (or more) astrobiological factors are now positive random variables. The probability distribution of each random variable may be arbitrary. The CLT in the so-called Lyapunov or Lindeberg forms (neither of which assumes the factors to be identically distributed) allows for that. In other words, the CLT "translates" into our SEH by allowing an arbitrary probability distribution for each factor. This is both astrobiologically realistic and useful for any further investigations. An application of our SEH then follows. The (average) distance between any two nearby habitable planets in the Galaxy may be shown to be inversely proportional to the cube root of NHab. Then, in our approach, this distance becomes a new random variable. We derive the relevant probability density function, apparently previously unknown and dubbed "Maccone distribution" by Paul Davies in 2008. Data Enrichment Principle. It should be noticed that ANY positive number of random variables in the SEH is compatible with the CLT. So, our generalization allows for many more factors to be added in the future as long as more refined scientific knowledge about each factor becomes available. This capability to make room for more future factors in the SEH we call the "Data Enrichment Principle", and we regard it as the key to more profound future results in the fields of Astrobiology and SETI. A practical example is then given of how our SEH works numerically. We work out in detail the case where each of the ten random variables is uniformly distributed around its own mean value as given by Dole back in 1964 and has an assumed standard deviation of 10%. The conclusion is that the average number of habitable planets in the Galaxy should be around 100 million ± 200 million, and the average distance between any couple of nearby habitable planets should be about 88 light years ± 40 light years. Finally, we match our SEH results against the results of the Statistical Drake Equation that we introduced in our 2008 IAC presentation. As expected, the number of currently communicating ET civilizations in the Galaxy turns out to be much smaller than the number of habitable planets (about 10,000 against 100 million, i.e. one ET civilization out of 10,000 habitable planets). And the average distance between any two nearby habitable planets turns out to be much smaller than the average distance between any two neighboring ET civilizations: 88 light years vs. 2000 light years, respectively. This means an ET average distance about 20 times greater than the average distance between any couple of adjacent habitable planets.
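
    The lognormal claim can be checked numerically: the log of a product of independent positive random variables is a sum of independent terms, so by the CLT the product is approximately lognormal. A short Python sketch with ten uniform factors of 10% relative standard deviation follows; the means below are placeholders for illustration, not Dole's actual values.

        import numpy as np

        # Illustrative check of the CLT argument (placeholder means, not
        # Dole's factor values): the log of a product of independent
        # positive random variables is a sum of independent terms, so the
        # product tends to a lognormal as the number of factors grows.
        rng = np.random.default_rng(0)
        means = np.array([0.9, 0.5, 2.0, 0.7, 1.5, 0.4, 1.1, 0.8, 0.6, 1.3])
        half_width = np.sqrt(3.0) * 0.10 * means   # uniform with 10% std dev
        samples = rng.uniform(means - half_width, means + half_width,
                              size=(100_000, means.size))
        product = samples.prod(axis=1)

        log_p = np.log(product)
        skew = ((log_p - log_p.mean())**3).mean() / log_p.std()**3
        print("skewness of log-product:", skew)   # near 0 => log is ~Gaussian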

  3. Design and implementation of a dental caries prevention trial in remote Canadian Aboriginal communities.

    PubMed

    Harrison, Rosamund; Veronneau, Jacques; Leroux, Brian

    2010-05-13

    The goal of this cluster randomized trial is to test the effectiveness of a counseling approach, Motivational Interviewing, to control dental caries in young Aboriginal children. Motivational Interviewing, a client-centred, directive counseling style, has not yet been evaluated as an approach for promotion of behaviour change in indigenous communities in remote settings. Aboriginal women were hired from the 9 communities to recruit expectant and new mothers to the trial, administer questionnaires and deliver the counseling to mothers in the test communities. The goal is for mothers to receive the intervention during pregnancy and at their child's immunization visits. Data on children's dental health status and family dental health practices will be collected when children are 30 months of age. The communities were randomly allocated to test or control group by a random "draw" over community radio. Sample size and power were determined based on an anticipated 20% reduction in caries prevalence. Randomization checks were conducted between groups. In the 5 test and 4 control communities, 272 of the original target sample size of 309 mothers have been recruited over a two-and-a-half year period. A power calculation using the actual attained sample size showed power to be 79% to detect a treatment effect. If an attrition fraction of 4% per year is maintained, power will remain at 80%. Power will still be > 90% to detect a 25% reduction in caries prevalence. The distribution of most baseline variables was similar for the two randomized groups of mothers. However, despite the random assignment of communities to treatment conditions, group differences exist for stage of pregnancy and prior tooth extractions in the family. Because of the group imbalances on certain variables, control of baseline variables will be done in the analyses of treatment effects. This paper explains the challenges of conducting randomized trials in remote settings, the importance of thorough community collaboration, and also illustrates the likelihood that some baseline variables that may be clinically important will be unevenly split in group-randomized trials when the number of groups is small. This trial is registered as ISRCTN41467632.
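
    For readers who want to reproduce this kind of calculation, here is a hedged back-of-envelope Python sketch of power for a cluster-randomized comparison of two prevalences, using a design effect for community randomization; the prevalences, intracluster correlation and cluster size are illustrative assumptions, not the trial's actual inputs.

        from math import sqrt
        from statistics import NormalDist

        # Back-of-envelope power for detecting a drop in caries prevalence
        # between two groups, with a design effect for cluster randomization.
        # Prevalences, ICC and cluster size are illustrative assumptions.
        p_control, p_test = 0.80, 0.60        # assumed 20-point reduction
        n_per_arm, cluster_size, icc = 136, 30, 0.02
        deff = 1 + (cluster_size - 1) * icc   # variance inflation for clustering
        n_eff = n_per_arm / deff              # effective sample size per arm

        z = NormalDist()
        p_bar = (p_control + p_test) / 2
        se = sqrt(2 * p_bar * (1 - p_bar) / n_eff)
        z_alpha = z.inv_cdf(1 - 0.05 / 2)     # two-sided alpha = 0.05
        power = 1 - z.cdf(z_alpha - abs(p_control - p_test) / se)
        print(f"approximate power: {power:.2f}")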

  4. Design and implementation of a dental caries prevention trial in remote Canadian Aboriginal communities

    PubMed Central

    2010-01-01

    Background The goal of this cluster randomized trial is to test the effectiveness of a counseling approach, Motivational Interviewing, to control dental caries in young Aboriginal children. Motivational Interviewing, a client-centred, directive counseling style, has not yet been evaluated as an approach for promotion of behaviour change in indigenous communities in remote settings. Methods/design Aboriginal women were hired from the 9 communities to recruit expectant and new mothers to the trial, administer questionnaires and deliver the counseling to mothers in the test communities. The goal is for mothers to receive the intervention during pregnancy and at their child's immunization visits. Data on children's dental health status and family dental health practices will be collected when children are 30 months of age. The communities were randomly allocated to test or control group by a random "draw" over community radio. Sample size and power were determined based on an anticipated 20% reduction in caries prevalence. Randomization checks were conducted between groups. Discussion In the 5 test and 4 control communities, 272 of the original target sample size of 309 mothers have been recruited over a two-and-a-half year period. A power calculation using the actual attained sample size showed power to be 79% to detect a treatment effect. If an attrition fraction of 4% per year is maintained, power will remain at 80%. Power will still be > 90% to detect a 25% reduction in caries prevalence. The distribution of most baseline variables was similar for the two randomized groups of mothers. However, despite the random assignment of communities to treatment conditions, group differences exist for stage of pregnancy and prior tooth extractions in the family. Because of the group imbalances on certain variables, control of baseline variables will be done in the analyses of treatment effects. This paper explains the challenges of conducting randomized trials in remote settings, the importance of thorough community collaboration, and also illustrates the likelihood that some baseline variables that may be clinically important will be unevenly split in group-randomized trials when the number of groups is small. Trial registration This trial is registered as ISRCTN41467632. PMID:20465831

  5. [Effects of two analgesic regimens on the postoperative analgesia and knee functional recovery after unilateral total knee arthroplasty-a randomized controlled trial].

    PubMed

    Ren, Li; Peng, Lihua; Qin, Peipei; Min, Su

    2015-07-01

    To evaluate the efficacy of continuous femoral nerve block for postoperative analgesia and functional recovery after total knee arthroplasty (TKA). Two hundred and eighty patients who underwent TKA were randomized into two groups: a group receiving continuous femoral nerve block (CFNB) and a group receiving patient-controlled intravenous analgesia (PCIA), with 140 participants in each. In group CFNB, femoral nerve block with ropivacaine was performed under ultrasound guidance; group PCIA received patient-controlled intravenous analgesia. Numerical rating scale (NRS) scores at rest and in motion at 24, 48 and 72 h and at 3, 6 and 12 months postoperatively, as well as NRS scores at hospital discharge, were recorded. The incidence of moderate-to-severe pain, the degree of knee flexion and the WOMAC scores at 3, 6 and 12 months after surgery were analyzed. Rescue analgesic administration and analgesia-related adverse effects were also recorded. Data were expressed as mean ± standard deviation (SD) for normally distributed continuous variables and as counts (percent frequency) for categorical variables; non-normally distributed data were expressed as median (interquartile range). Student's t-test or the Wilcoxon rank test was used to compare continuous variables, as appropriate. The chi-square test was used to compare categorical variables, with Fisher's exact test when the number of events was less than 5. The NRS score in motion of group CFNB was 3 (3-4) at discharge and 3 (2-4) and 3 (2-3) at 3 and 6 months postoperatively, while the corresponding scores of group PCIA were 4 (4-4), 3 (3-4) and 3 (3-4). At rest, the NRS scores of group CFNB were 3 (2-3), 1 (1-2) and 1 (1-1) at discharge and at 3 and 6 months postoperatively. Compared with group PCIA, the NRS scores in motion of group CFNB were significantly lower at discharge (Z=-5.174, P<0.05), at 3 months (Z=2.308, P=0.021) and at 6 months postoperatively (Z=-2.495, P=0.013), as were the NRS scores at rest (Z=-2.405, P=0.016; Z=-4.360, P<0.05; Z=-9.268, P<0.05). The degree of knee flexion of group CFNB at 3 and 6 months postoperatively was 92 (88-97) and 103 (99-106), versus 89 (86-95) and 100 (97-105) in group PCIA; the WOMAC scores of group CFNB at 3 and 6 months postoperatively were 21 (18-26) and 18 (16-22), versus 24 (20-27) and 21 (17-24) in group PCIA. WOMAC scores of group CFNB were lower than those of group PCIA at 3 (Z=-2.467, P=0.014) and 6 (Z=-2.537, P=0.011) months postoperatively, while the degree of knee flexion of group CFNB was higher (Z=-2.175, P=0.030; Z=-2.471, P=0.013). Moreover, the frequencies of bolus and of rescue analgesia in group CFNB were 2.3 and 0.6, versus 2.6 and 1.1 in group PCIA; both were lower in group CFNB (t=-2.984, P=0.003; t=-3.213, P=0.002). The incidence of adverse events such as muscle weakness of the lower limbs, nausea and vomiting was similar in the two groups (P>0.05). CFNB safely alleviates postoperative pain after TKA and helps improve short- to middle-term knee function and patients' quality of life.
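
    The analysis plan above follows a textbook decision rule that is easy to express in code. A hedged Python sketch with synthetic data (group sizes, score distributions and the normality threshold are assumptions for illustration, not the trial's data):

        import numpy as np
        from scipy import stats

        # Sketch of the test-selection logic described in the abstract,
        # on synthetic data; thresholds are common conventions.
        rng = np.random.default_rng(1)
        cfnb = rng.normal(3.0, 1.0, 140)     # e.g., NRS scores, group CFNB
        pcia = rng.normal(3.5, 1.0, 140)     # e.g., NRS scores, group PCIA

        # Continuous outcome: t-test if both samples look normal, else a
        # rank-based test.
        normal = (stats.shapiro(cfnb).pvalue > 0.05 and
                  stats.shapiro(pcia).pvalue > 0.05)
        if normal:
            stat, p = stats.ttest_ind(cfnb, pcia)
        else:
            stat, p = stats.ranksums(cfnb, pcia)   # Wilcoxon rank-sum
        print("continuous outcome p-value:", round(p, 4))

        # Categorical outcome (e.g., nausea yes/no per group): chi-square
        # unless an expected cell count is below 5, then Fisher's exact.
        table = np.array([[12, 128], [15, 125]])
        expected = stats.contingency.expected_freq(table)
        if (expected < 5).any():
            stat, p = stats.fisher_exact(table)
        else:
            stat, p, dof, _ = stats.chi2_contingency(table)
        print("categorical outcome p-value:", round(p, 4))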

  6. CAN'T MISS--conquer any number task by making important statistics simple. Part 1. Types of variables, mean, median, variance, and standard deviation.

    PubMed

    Hansen, John P

    2003-01-01

    Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 1, presents basic information about data including a classification system that describes the four major types of variables: continuous quantitative variable, discrete quantitative variable, ordinal categorical variable (including the binomial variable), and nominal categorical variable. A histogram is a graph that displays the frequency distribution for a continuous variable. The article also demonstrates how to calculate the mean, median, standard deviation, and variance for a continuous variable.
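
    A minimal Python illustration of the Part 1 quantities for a continuous variable (the sample values are invented for the example):

        from statistics import mean, median, stdev, variance

        # Compute the summary statistics the article describes for a
        # continuous quantitative variable.
        lengths_of_stay = [2.0, 3.5, 1.0, 4.0, 2.5, 6.0, 3.0]   # days (sample)

        print("mean:    ", mean(lengths_of_stay))
        print("median:  ", median(lengths_of_stay))
        print("variance:", variance(lengths_of_stay))   # sample variance (n - 1)
        print("std dev: ", stdev(lengths_of_stay))      # square root of variance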

  7. Random trinomial tree models and vanilla options

    NASA Astrophysics Data System (ADS)

    Ganikhodjaev, Nasir; Bayram, Kamola

    2013-09-01

    In this paper we introduce and study the random trinomial model. The usual trinomial model is prescribed by a triple of numbers (u, d, m); we call the triple (u, d, m) an environment of the trinomial model. A triple (Un, Dn, Mn), where {Un}, {Dn} and {Mn} are sequences of independent, identically distributed random variables with 0 < Dn < 1 < Un and Mn = 1 for all n, is called a random environment, and a trinomial tree model with a random environment is called a random trinomial model. The random trinomial model is found to produce more accurate results than the random binomial model or the usual trinomial model.
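
    The abstract does not spell out branch probabilities or factor distributions, so the following Python Monte Carlo sketch fills them in with assumptions (equal branch probabilities, uniform Un and Dn, zero interest rate) purely to illustrate how a vanilla option could be priced in a random environment:

        import numpy as np

        # Monte Carlo sketch of a random trinomial model: at each step the
        # environment draws i.i.d. factors with 0 < Dn < 1 < Un and Mn = 1,
        # and the price moves by one of the three factors. Branch
        # probabilities, factor laws, strike and rate are assumptions made
        # for illustration; the paper does not prescribe them here.
        rng = np.random.default_rng(2)
        n_paths, n_steps = 100_000, 50
        s0, strike, rate = 100.0, 100.0, 0.0   # zero rate keeps the sketch short

        u = rng.uniform(1.01, 1.10, size=(n_paths, n_steps))   # Un > 1
        d = rng.uniform(0.90, 0.99, size=(n_paths, n_steps))   # 0 < Dn < 1
        branch = rng.integers(0, 3, size=(n_paths, n_steps))   # 0=down,1=mid,2=up

        factors = np.where(branch == 2, u, np.where(branch == 0, d, 1.0))  # Mn = 1
        s_T = s0 * factors.prod(axis=1)
        call = np.exp(-rate * n_steps) * np.maximum(s_T - strike, 0.0).mean()
        print(f"vanilla call estimate: {call:.2f}")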

  8. Frequency and management of breakthrough bleeding with continuous use of the transvaginal contraceptive ring: a randomized controlled trial.

    PubMed

    Sulak, Patricia J; Smith, Virginia; Coffee, Andrea; Witt, Iris; Kuehl, Alicia L; Kuehl, Thomas J

    2008-09-01

    To assess bleeding patterns with continuous use of the transvaginal contraceptive ring. We performed a prospective analysis of daily menstrual flow during a 21/7 cycle, followed by 6 months of continuous use with a randomized protocol to manage breakthrough bleeding/spotting. Seventy-four women completed the baseline 21/7 phase and were randomized equally into two groups during the continuous phase. Group 1 was instructed to replace the ring monthly on the same calendar day with no ring-free days. Group 2 was instructed to use the same process, but if breakthrough bleeding/spotting occurred for 5 days or more, they were to remove the ring for 4 days, store it, and then reinsert that ring. Sixty-five women completed the continuous phase, with reduced average flow scores in the continuous phase compared with the 21/7 phase (P<.02). Most patients had no to minimal bleeding during continuous use, with group 2 experiencing a significantly greater percentage of days without breakthrough bleeding or spotting (95%) compared with group 1 (89%) (P=.016). Instituting a 4-day hormone-free interval was more effective (P<.001) in resolving breakthrough bleeding/spotting than continuing ring use. A reduction in bleeding occurred during continuous use with replacement of the transvaginal ring compared with baseline 21/7 use. Continuous vaginal ring use resulted in an acceptable bleeding profile in most patients, reduction in flow, reduction in pelvic pain, and a high continuation rate.

  9. Simultaneous escaping of explicit and hidden free energy barriers: application of the orthogonal space random walk strategy in generalized ensemble based conformational sampling.

    PubMed

    Zheng, Lianqing; Chen, Mengen; Yang, Wei

    2009-06-21

    To overcome the pseudoergodicity problem, conformational sampling can be accelerated via generalized ensemble methods, e.g., through the realization of random walks along prechosen collective variables, such as spatial order parameters, energy scaling parameters, or even system temperatures or pressures. As is commonly observed in generalized ensemble simulations, hidden barriers are likely to exist in the space perpendicular to the collective variable direction, and these residual free energy barriers can greatly diminish sampling efficiency. This sampling issue is particularly severe when the collective variable is defined on a low-dimensional subset of the target system; then the "Hamiltonian lagging" problem, in which necessary structural relaxation falls behind the motion of the collective variable, is likely to occur. To overcome this problem in equilibrium conformational sampling, we adopted the orthogonal space random walk (OSRW) strategy, which was originally developed in the context of free energy simulation [L. Zheng, M. Chen, and W. Yang, Proc. Natl. Acad. Sci. U.S.A. 105, 20227 (2008)]. Thereby, generalized ensemble simulations can simultaneously escape both the explicit barriers along the collective variable direction and the hidden barriers that are strongly coupled with the collective variable move. As demonstrated in our model studies, the present OSRW based generalized ensemble treatments show improved sampling capability over the corresponding classical generalized ensemble treatments.
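
    A toy Python illustration of the hidden-barrier problem that motivates OSRW (this is not an implementation of OSRW itself): a 2D double-well potential whose x barrier is cancelled by an assumed bias still leaves the orthogonal y coordinate crossing its own barrier only rarely.

        import numpy as np

        # Toy model of the "hidden barrier" pathology: an explicit barrier
        # along the collective variable x and a second barrier along the
        # orthogonal coordinate y. Flattening x alone (a crude bias assumed
        # here for illustration) leaves y's barrier untouched.
        def potential(x, y):
            return (x**2 - 1.0)**2 + 2.0 * (y**2 - 1.0)**2  # wells in x and y

        def bias(x):
            return -(x**2 - 1.0)**2   # cancels the x barrier only

        rng = np.random.default_rng(3)
        x, y = -1.0, -1.0
        y_sign_flips, prev = 0, -1.0
        for _ in range(200_000):
            xn, yn = x + rng.normal(0, 0.2), y + rng.normal(0, 0.2)
            dE = (potential(xn, yn) + bias(xn)) - (potential(x, y) + bias(x))
            if rng.random() < np.exp(-dE):        # Metropolis step at kT = 1
                x, y = xn, yn
            if y * prev < 0:                      # count barrier crossings in y
                y_sign_flips, prev = y_sign_flips + 1, y
        print("orthogonal (y) barrier crossings:", y_sign_flips)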

  10. Virtual continuity of measurable functions and its applications

    NASA Astrophysics Data System (ADS)

    Vershik, A. M.; Zatitskii, P. B.; Petrov, F. V.

    2014-12-01

    A classical theorem of Luzin states that a measurable function of one real variable is `almost' continuous. For measurable functions of several variables the analogous statement (continuity on a product of sets having almost full measure) does not hold in general. The search for a correct analogue of Luzin's theorem leads to a notion of virtually continuous functions of several variables. This apparently new notion implicitly appears in the statements of embedding theorems and trace theorems for Sobolev spaces. In fact it reveals the nature of such theorems as statements about virtual continuity. The authors' results imply that under the conditions of Sobolev theorems there is a well-defined integration of a function with respect to a wide class of singular measures, including measures concentrated on submanifolds. The notion of virtual continuity is also used for the classification of measurable functions of several variables and in some questions on dynamical systems, the theory of polymorphisms, and bistochastic measures. In this paper the necessary definitions and properties of admissible metrics are recalled, several definitions of virtual continuity are given, and some applications are discussed. Bibliography: 24 titles.
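
    For reference, a textbook form of the one-variable statement (our wording, not quoted from the paper): for a measurable function f on [0,1] and any \varepsilon > 0,

        \exists K \subseteq [0,1] \ \text{compact}, \quad \lambda([0,1] \setminus K) < \varepsilon, \quad f|_{K} \ \text{is continuous},

    whereas the naive several-variable analogue, continuity of f restricted to a product A \times B with \lambda(A), \lambda(B) > 1 - \varepsilon, can fail for measurable functions of two variables; virtual continuity is the corrected notion.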

  11. Relevance of anisotropy and spatial variability of gas diffusivity for soil-gas transport

    NASA Astrophysics Data System (ADS)

    Schack-Kirchner, Helmer; Kühne, Anke; Lang, Friederike

    2017-04-01

    Models of soil gas transport generally consider neither the direction dependence of gas diffusivity nor its small-scale variability. However, in a recent study we could provide evidence for anisotropy favouring vertical gas diffusion in natural soils. We hypothesize that gas transport models based on gas diffusion data measured with soil rings are strongly influenced by both anisotropy and spatial variability, and that the use of averaged diffusivities can be misleading. To test this we used a 2-dimensional model of soil gas transport under compacted wheel tracks to model the soil-air oxygen distribution. The model was parametrized with data obtained from soil-ring measurements, using their central tendency and variability, and includes vertical parameter variability as well as variation perpendicular to the elongated wheel track. Three parametrization types were tested: (i) averaged values for the wheel track and the undisturbed soil; (ii) randomly distributed soil cells with normally distributed variability within the strata; and (iii) randomly distributed soil cells with uniformly distributed variability within the strata. Each type was tested both for isotropic gas diffusivity and for horizontal gas diffusivity reduced by a constant factor, yielding six models in total. As expected, the different parametrizations had an important influence on the aeration state under wheel tracks, with the strongest oxygen depletion in the case of uniformly distributed variability and anisotropy towards higher vertical diffusivity. This simple simulation approach clearly shows the relevance of anisotropy and spatial variability even for identical central-tendency measures of gas diffusivity. It does not yet consider spatial dependency of the variability, which could aggravate the effects further. To account for anisotropy and spatial variability in gas transport models we recommend (a) measuring soil-gas transport parameters spatially explicitly, including different directions, and (b) using random-field stochastic models to assess the possible effects on gas-exchange models.
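
    To make the setting concrete, here is a minimal Python finite-difference sketch of a 2D aeration model in this spirit (grid, sink strength, boundary handling and the anisotropy factor are our illustrative assumptions, not the study's parameterization): oxygen enters through the open surface beside a sealed wheel track, is consumed throughout the soil, and horizontal diffusivity may be reduced by a constant factor.

        import numpy as np

        # Explicit finite-difference relaxation of a 2D oxygen field: fixed
        # concentration at the open surface (left half of the top row), a
        # sealed wheel track on the right half, no-flux boundaries elsewhere,
        # a uniform first-order sink, and optionally reduced horizontal
        # diffusivity. All parameters are illustrative assumptions.
        def oxygen_field(aniso=1.0, n=40, sink=0.005, steps=20_000):
            c = np.ones((n, n))
            for _ in range(steps):
                p = np.pad(c, 1, mode='edge')              # no-flux boundaries
                lap = (p[:-2, 1:-1] + p[2:, 1:-1] - 2 * c            # vertical
                       + aniso * (p[1:-1, :-2] + p[1:-1, 2:] - 2 * c))  # horizontal
                c = np.clip(c + 0.1 * lap - sink * c, 0.0, 1.0)
                c[0, :n // 2] = 1.0   # open surface beside the sealed track
            return c

        iso = oxygen_field(aniso=1.0)
        ani = oxygen_field(aniso=0.3)   # horizontal diffusion reduced threefold
        print("O2 under track (isotropic):  ", iso[5, 30].round(3))
        print("O2 under track (anisotropic):", ani[5, 30].round(3))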

  12. Using Latent Sleepiness to Evaluate an Important Effect of Promethazine

    NASA Technical Reports Server (NTRS)

    Feiveson, Alan H.; Hayat, Matthew; Vksman, Zalman; Putcha, Laksmi

    2007-01-01

    Astronauts often use promethazine (PMZ) to counteract space motion sickness; however, PMZ may cause drowsiness, which might impair cognitive function. In a NASA ground study, subjects received PMZ and their cognitive performance was then monitored over time. Subjects also reported sleepiness using the Karolinska Sleepiness Score (KSS), which ranges from 1 to 9. A problem arises when using KSS to establish an association between true sleepiness and performance because KSS scores tend to concentrate unduly on the values 3 (fairly awake) and 7 (moderately tired). Therefore, we defined a latent sleepiness measure as a continuous random variable describing a subject's actual, but unobserved, true state of sleepiness through time. The latent sleepiness and observed KSS are associated through a conditional probability model, which, when coupled with demographic factors, predicts performance.
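
    The threshold idea behind such a latent-variable model is easy to demonstrate. In this hedged Python sketch (the cutpoints are invented to mimic the reported clumping, not fitted to the study's data), a smooth latent sleepiness variable still produces KSS responses that pile up on 3 and 7:

        import numpy as np

        # An unobserved continuous sleepiness S maps to the 9-point KSS
        # through ordered thresholds. Unevenly spaced cutpoints (assumed
        # here) reproduce clumping on 3 and 7 even though S is smooth.
        rng = np.random.default_rng(4)
        latent = rng.normal(0.0, 1.0, 10_000)          # latent sleepiness S

        # 8 cutpoints define 9 categories; wide gaps around categories 3 and 7.
        cuts = np.array([-2.2, -1.9, -0.4, -0.1, 0.2, 0.5, 1.9, 2.2])
        kss = np.digitize(latent, cuts) + 1            # KSS in 1..9

        counts = np.bincount(kss, minlength=10)[1:]
        for score, n in enumerate(counts, start=1):
            print(f"KSS {score}: {'#' * (n // 200)}")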

  13. Intermittent Lagrangian velocities and accelerations in three-dimensional porous medium flow.

    PubMed

    Holzner, M; Morales, V L; Willmann, M; Dentz, M

    2015-07-01

    Intermittency of Lagrangian velocity and acceleration is a key to understanding transport in complex systems ranging from fluid turbulence to flow in porous media. High-resolution optical particle tracking in a three-dimensional (3D) porous medium provides detailed 3D information on Lagrangian velocities and accelerations. We find sharp transitions close to pore throats, and low flow variability in the pore bodies, which gives rise to stretched exponential Lagrangian velocity and acceleration distributions characterized by a sharp peak at low velocity, superlinear evolution of particle dispersion, and double-peak behavior in the propagators. The velocity distribution is quantified in terms of pore geometry and flow connectivity, which forms the basis for a continuous-time random-walk model that sheds light on the observed Lagrangian flow and transport behaviors.

  14. The partition function of the Bures ensemble as the τ-function of BKP and DKP hierarchies: continuous and discrete

    NASA Astrophysics Data System (ADS)

    Hu, Xing-Biao; Li, Shi-Hao

    2017-07-01

    The relationship between matrix integrals and integrable systems was revealed more than 20 years ago. As is known, matrix integrals over a Gaussian ensemble used in random matrix theory could act as the τ-function of several hierarchies of integrable systems. In this article, we will show that the time-dependent partition function of the Bures ensemble, whose measure has many interesting geometric properties, could act as the τ-function of BKP and DKP hierarchies. In addition, if discrete time variables are introduced, then this partition function could act as the τ-function of discrete BKP and DKP hierarchies. In particular, there are some links between the partition function of the Bures ensemble and Toda-type equations.

  15. Branching random walk with step size coming from a power law

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Ayan; Subhra Hazra, Rajat; Roy, Parthanil

    2015-09-01

    In their seminal work, Brunet and Derrida made predictions on the random point configurations associated with branching random walks. We discuss the limiting behavior of such point configurations when the displacement random variables come from a power law. In particular, we establish that two of their predictions remain valid in this setup, and we investigate various other issues mentioned in their paper.
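
    A quick Python simulation conveys the flavor of such point configurations (binary branching and the Pareto exponent are assumptions chosen for illustration): with heavy-tailed steps, the rightmost particles are typically dominated by a few exceptionally large displacements.

        import numpy as np

        # Branching random walk with power-law displacements: every particle
        # splits in two each generation and each child takes an independent
        # signed Pareto step. Exponent and depth are illustrative choices.
        rng = np.random.default_rng(5)
        positions = np.zeros(1)
        for generation in range(12):             # 2**12 = 4096 particles
            positions = np.repeat(positions, 2)  # binary branching
            steps = rng.pareto(a=2.5, size=positions.size) + 1.0  # power law
            signs = rng.choice((-1.0, 1.0), size=positions.size)
            positions += signs * steps

        top = np.sort(positions)[-5:][::-1]
        print("five rightmost particles:", np.round(top, 1))
        print("gap between the two leaders:", round(top[0] - top[1], 1))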

  16. Cognitive effects of hormone therapy continuation or discontinuation in a sample of women at risk for Alzheimer’s disease

    PubMed Central

    Wroolie, Tonita E.; Kenna, Heather A.; Williams, Katherine E.; Rasgon, Natalie L.

    2015-01-01

    Use of estrogen-based hormone therapy (HT) as protection from cognitive decline and Alzheimer's disease (AD) is controversial, although cumulative data support HT use when initiated close to menopause onset, with estrogen formulations containing 17β-estradiol (17β-E) being preferable to conjugated equine estrogen formulations. Little is known regarding specific populations of women who may derive benefit from HT. Women with heightened risk for AD (aged 49-69), all of whom had been taking HT for at least one year and most of whom initiated HT close to menopause onset, underwent cognitive assessment followed by randomization to continue or discontinue HT. Assessments were repeated two years after randomization. Women who continued HT performed better on cognitive domains composed of measures of verbal memory and of combined attention, working memory, and processing speed. Women who used 17β-E rather than conjugated equine estrogen, whether randomized to continue or discontinue HT, showed better verbal memory performance at the 2-year follow-up assessment. An interaction was also found between HT randomization and family history of AD in a first-degree relative. All women who were offspring of patients with AD declined in verbal memory; however, those who continued HT declined less than those who discontinued it. Women without a first-degree relative with AD showed verbal memory improvement (likely due to practice effects) with continuance and decline with discontinuance of HT. Continuation of HT use appears to protect cognition in women with heightened risk for AD when HT is initiated close to menopause onset. PMID:26209223

  17. The effects of nocturnal compared with conventional hemodialysis on mineral metabolism: A randomized-controlled trial.

    PubMed

    Walsh, Michael; Manns, Braden J; Klarenbach, Scott; Tonelli, Marcello; Hemmelgarn, Brenda; Culleton, Bruce

    2010-04-01

    Hyperphosphatemia is common among patients receiving dialysis and is associated with increased mortality. Nocturnal hemodialysis (NHD) is a long, slow dialytic modality that may improve hyperphosphatemia and disorders of mineral metabolism. We performed a randomized-controlled trial of NHD compared with conventional hemodialysis (CvHD); in this paper, we report detailed results of mineral metabolism outcomes. Prevalent patients were randomized to receive NHD 5 to 6 nights per week for 6 to 10 hours per night or to continue CvHD thrice weekly for 6 months. Oral phosphate binders and vitamin D analogs were adjusted to maintain phosphate, calcium and parathyroid hormone (PTH) levels within recommended targets. Compared with CvHD patients, patients in the NHD group had a significant decrease in serum phosphate over the course of the study (0.49 mmol/L, 95% confidence interval 0.24-0.74; P=0.002) despite a significant reduction in the use of phosphate binders. Sixty-one percent of patients in the NHD group compared with 20% in the CvHD group had a decline in intact PTH (P=0.003). Nocturnal hemodialysis lowers serum phosphate, calcium-phosphate product and requirement for phosphate binders. The effects of NHD on PTH are variable. The impact of these changes on long-term cardiovascular and bone-related outcomes requires further investigation.

  18. How happy is your web browsing? A model to quantify satisfaction of an Internet user searching for desired information

    NASA Astrophysics Data System (ADS)

    Banerji, Anirban; Magarkar, Aniket

    2012-09-01

    We feel happy when web browsing operations provide us with necessary information; otherwise, we feel bitter. How can this happiness (or bitterness) be measured? How does the profile of happiness grow and decay during the course of web browsing? We propose a probabilistic framework that models the evolution of user satisfaction on top of his/her continuous frustration at not finding the required information. It is found that the cumulative satisfaction profile of a web-searching individual can be modeled effectively as the sum of a random number of random terms, where the terms are mutually independent random variables originating from a ‘memoryless’ Poisson flow. Evolution of satisfaction over the entire time interval of a user’s browsing was modeled using auto-correlation analysis. A utilitarian marker, whose magnitude exceeding unity characterizes happy web-searching operations, and an empirical limit that connects the user’s satisfaction with his frustration level are proposed as well. The presence of pertinent information on the very first page of a website and the magnitude of the decay parameter of user satisfaction (frustration, irritation, etc.) are found to be the two key aspects that dominate the web user’s psychology. The proposed model employed different combinations of the decay parameter, searching time and number of helpful websites. The obtained results are found to match the results from three real-life case studies.
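
    The compound-Poisson picture is straightforward to simulate. A hedged Python sketch (the arrival rate, increment distribution and decay constant are illustrative assumptions, not the paper's fitted values):

        import numpy as np

        # Helpful pages arrive as a Poisson flow over the browsing interval,
        # each adding a random satisfaction increment that then decays
        # exponentially. All parameters are illustrative assumptions.
        rng = np.random.default_rng(6)
        T, rate, decay = 60.0, 0.2, 0.05      # minutes, finds/minute, 1/minute

        n_finds = rng.poisson(rate * T)       # random number of random terms
        arrivals = np.sort(rng.uniform(0.0, T, n_finds))
        gains = rng.exponential(1.0, n_finds) # i.i.d. satisfaction increments

        t = np.linspace(0.0, T, 601)
        satisfaction = np.zeros_like(t)
        for g, a in zip(gains, arrivals):
            satisfaction += g * np.exp(-decay * (t - a)) * (t >= a)

        print("peak satisfaction:", round(float(satisfaction.max()), 2))
        print("final satisfaction:", round(float(satisfaction[-1]), 2))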

  19. Transcranial Magnetic Stimulation for Obsessive-Compulsive Disorder: An Updated Systematic Review and Meta-analysis.

    PubMed

    Trevizol, Alisson Paulino; Shiozawa, Pedro; Cook, Ian A; Sato, Isa Albuquerque; Kaku, Caio Barbosa; Guimarães, Fernanda Bs; Sachdev, Perminder; Sarkhel, Sujit; Cordeiro, Quirino

    2016-12-01

    Transcranial magnetic stimulation (TMS) is a promising noninvasive brain stimulation intervention that has been proposed for obsessive-compulsive disorder (OCD) with encouraging results. To assess the efficacy of TMS for OCD in randomized clinical trials (RCTs), we conducted a systematic review using MEDLINE and EMBASE from the first RCT available until March 11, 2016. The main outcome was Hedges g for continuous Yale-Brown Obsessive Compulsive Scale scores in a random-effects model. Heterogeneity was evaluated with the I² statistic and the χ² test. Publication bias was evaluated using the Begg funnel plot. Metaregression was performed using the random-effects model modified by Knapp and Hartung. We included 15 RCTs (n = 483), most with small-to-modest sample sizes. Comparing active versus sham TMS, active stimulation was significantly superior for OCD symptoms (Hedges g = 0.45; 95% confidence interval, 0.2-0.71). The funnel plot showed that the risk of publication bias was low, and between-study heterogeneity was low (I² = 43%, P = 0.039 for the χ² test). Metaregression showed no particular influence of any variable on the results. Active TMS was superior to sham stimulation for the amelioration of OCD symptoms. Trials had moderately heterogeneous results, despite the different stimulation protocols used. Further RCTs with larger sample sizes are needed to clarify the precise impact of TMS on OCD symptoms.
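
    For concreteness, a minimal Python sketch of DerSimonian-Laird random-effects pooling of Hedges g values, the summary approach named above (the per-trial effects and variances are invented placeholders, not the 15 included RCTs):

        import numpy as np

        # DerSimonian-Laird random-effects pooling of standardized mean
        # differences. Effects and variances below are invented placeholders.
        g = np.array([0.30, 0.55, 0.20, 0.80, 0.45])       # Hedges g per trial
        v = np.array([0.040, 0.050, 0.030, 0.090, 0.060])  # within-trial variances

        w = 1.0 / v                                        # fixed-effect weights
        q = np.sum(w * (g - np.sum(w * g) / np.sum(w))**2)  # Cochran's Q
        df = g.size - 1
        tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

        w_re = 1.0 / (v + tau2)                            # random-effects weights
        pooled = np.sum(w_re * g) / np.sum(w_re)
        se = np.sqrt(1.0 / np.sum(w_re))
        print(f"pooled g = {pooled:.2f}, 95% CI "
              f"[{pooled - 1.96 * se:.2f}, {pooled + 1.96 * se:.2f}]")
        print(f"I^2 = {max(0.0, (q - df) / q) * 100:.0f}%")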

  20. Homeopathy for attention-deficit/hyperactivity disorder: a pilot randomized-controlled trial.

    PubMed

    Jacobs, Jennifer; Williams, Anna-Leila; Girard, Christine; Njike, Valentine Yanchou; Katz, David

    2005-10-01

    The aim of this study was to carry out a preliminary trial evaluating the effectiveness of homeopathy in the treatment of attention-deficit/hyperactivity disorder (ADHD). This work was a randomized, double-blind, placebo-controlled trial. This study was conducted in a private homeopathic clinic in the Seattle metropolitan area. Subjects included children 6-12 years of age meeting Diagnostic and Statistical Manual of Mental Disorders 4th edition (DSM-IV) criteria for ADHD. Forty-three subjects were randomized to receive a homeopathic consultation and either an individualized homeopathic remedy or placebo. Patients were seen by homeopathic physicians every 6 weeks for 18 weeks. Outcome measures included the Conner's Global Index-Parent, Conner's Global Index-Teacher, Conner's Parent Rating Scale-Brief, Continuous Performance Test, and the Clinical Global Impression Scale. There were no statistically significant differences between homeopathic remedy and placebo groups on the primary or secondary outcome variables. However, there were statistically and clinically significant improvements in both groups on many of the outcome measures. This pilot study provides no evidence to support a therapeutic effect of individually selected homeopathic remedies in children with ADHD. A therapeutic effect of the homeopathic encounter is suggested and warrants further evaluation. Future studies should be carried out over a longer period of time and should include a control group that does not receive the homeopathic consultation. Comparison to conventional stimulant medication for ADHD also should be considered.
