Science.gov

Sample records for zero-mean random variables

  1. Non-zero mean and asymmetry of neuronal oscillations have different implications for evoked responses.

    PubMed

    Nikulin, Vadim V; Linkenkaer-Hansen, Klaus; Nolte, Guido; Curio, Gabriel

    2010-02-01

    The aim of the present study was to show analytically and with simulations that it is the non-zero mean of neuronal oscillations, and not an amplitude asymmetry of peaks and troughs, that is a prerequisite for the generation of evoked responses through a mechanism of amplitude modulation of oscillations. Secondly, we detail the rationale and implementation of the "baseline-shift index" (BSI) for deducing whether empirical oscillations have non-zero mean. Finally, we illustrate with empirical data why the "amplitude fluctuation asymmetry" (AFA) index should be used with caution in research aimed at explaining variability in evoked responses through a mechanism of amplitude modulation of ongoing oscillations. An analytical approach, simulations and empirical MEG data were used to compare the specificity of BSI and AFA index to differentiate between a non-zero mean and a non-sinusoidal shape of neuronal oscillations. Both the BSI and the AFA index were sensitive to the presence of non-zero mean in neuronal oscillations. The AFA index, however, was also sensitive to the shape of oscillations even when they had a zero mean. Our findings indicate that it is the non-zero mean of neuronal oscillations, and not an amplitude asymmetry of peaks and troughs, that is a prerequisite for the generation of evoked responses through a mechanism of amplitude modulation of oscillations. A clear distinction should be made between the shape and non-zero mean properties of neuronal oscillations. This is because only the latter contributes to evoked responses, whereas the former does not. Copyright (c) 2009 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
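
    A hedged illustration of this mechanism (a toy Python simulation, not the authors' analysis): with random phase across trials, amplitude-modulated oscillations leave a residual in the trial average only when their mean is non-zero.

        # Toy simulation: amplitude modulation of ongoing oscillations produces
        # an "evoked response" in the trial average only for non-zero mean.
        import numpy as np

        rng = np.random.default_rng(1)
        t = np.linspace(0.0, 1.0, 1000)
        # stimulus-locked amplitude drop around t = 0.5
        envelope = 1.0 - 0.5 * np.exp(-((t - 0.5) ** 2) / 0.01)

        def trial(mean_offset):
            phase = rng.uniform(0.0, 2.0 * np.pi)
            osc = np.sin(2.0 * np.pi * 10.0 * t + phase) + mean_offset
            return envelope * osc

        for offset in (0.0, 0.3):  # zero-mean vs. non-zero-mean oscillations
            avg = np.mean([trial(offset) for _ in range(500)], axis=0)
            print(f"offset={offset}: peak-to-peak of trial average = {np.ptp(avg):.3f}")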

  2. Students' Misconceptions about Random Variables

    ERIC Educational Resources Information Center

    Kachapova, Farida; Kachapov, Ilias

    2012-01-01

    This article describes some misconceptions about random variables and related counter-examples, and makes suggestions about teaching initial topics on random variables in general form instead of doing it separately for discrete and continuous cases. The focus is on post-calculus probability courses. (Contains 2 figures.)

  3. [Random Variable Read Me File]

    NASA Technical Reports Server (NTRS)

    Teubert, Christopher; Sankararaman, Shankar; Cullo, Aiden

    2017-01-01

    Readme file for the Random Variable Toolbox. The toolbox is hosted on GitHub, a web-based Git version control and source code management service.

  4. Simulation Analysis of Zero Mean Flow Edge Turbulence in LAPD

    NASA Astrophysics Data System (ADS)

    Friedman, Brett Cory

    I model, simulate, and analyze the turbulence in a particular experiment on the Large Plasma Device (LAPD) at UCLA. The experiment, conducted by Schaffner et al. [D. Schaffner et al., Phys. Rev. Lett. 109, 135002 (2012)], nulls out the intrinsic mean flow in LAPD by limiter biasing. The model that I use in the simulation is an electrostatic reduced Braginskii two-fluid model that describes the time evolution of density, electron temperature, electrostatic potential, and parallel electron velocity fluctuations in the edge region of LAPD. The spatial domain is annular, encompassing the radial coordinates over which a significant equilibrium density gradient exists. My model breaks the independent variables in the equations into time-independent equilibrium parts and time-dependent fluctuating parts, and I use experimentally obtained values as input for the equilibrium parts. After an initial exponential growth period due to a linear drift wave instability, the fluctuations saturate and the frequency and azimuthal wavenumber spectra become broadband with no visible coherent peaks, at which point the fluctuations become turbulent. The turbulence develops intermittent pressure and flow filamentary structures that grow and dissipate, but look much different from the unstable linear drift waves, primarily in the extremely long axial wavelengths that the filaments possess. An energy dynamics analysis that I derive reveals the mechanism that drives these structures. The long k∥ ≈ 0 intermittent potential filaments convect equilibrium density across the equilibrium density gradient, setting up local density filaments. These density filaments, also with k∥ ≈ 0, produce azimuthal density gradients, which drive radially propagating secondary drift waves. These finite-k∥ drift waves nonlinearly couple to one another and reinforce the original convective filament, allowing the process to bootstrap itself. The growth of these structures is by nonlinear instability because

  5. A Random Variable Transformation Process.

    ERIC Educational Resources Information Center

    Scheuermann, Larry

    1989-01-01

    Provides a short BASIC program, RANVAR, which generates random variates for various theoretical probability distributions. The seven variates include: uniform, exponential, normal, binomial, Poisson, Pascal, and triangular. (MVL)
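
    For readers without BASIC at hand, a minimal Python sketch of the same seven generators, built only on uniform variates (function names and parameterizations are illustrative, not RANVAR's):

        # Illustrative generators for the seven distributions named above.
        import math
        import random

        def r_uniform(a, b):
            return a + (b - a) * random.random()

        def r_exponential(rate):
            return -math.log(1.0 - random.random()) / rate

        def r_normal(mu, sigma):
            # Box-Muller transform of two uniform variates
            u1, u2 = 1.0 - random.random(), random.random()
            return mu + sigma * math.sqrt(-2.0 * math.log(u1)) * math.cos(2.0 * math.pi * u2)

        def r_binomial(n, p):
            return sum(random.random() < p for _ in range(n))

        def r_poisson(lam):
            # Knuth's multiplicative method
            threshold, k, prod = math.exp(-lam), 0, random.random()
            while prod > threshold:
                k += 1
                prod *= random.random()
            return k

        def r_geometric(p):
            # trials up to and including the first success
            u = 1.0 - random.random()  # u in (0, 1]
            return max(1, int(math.ceil(math.log(u) / math.log(1.0 - p))))

        def r_pascal(r, p):
            # negative binomial: trials needed for r successes
            return sum(r_geometric(p) for _ in range(r))

        def r_triangular(a, c, b):
            # inverse-transform sampling; c is the mode
            u, fc = random.random(), (c - a) / (b - a)
            if u < fc:
                return a + math.sqrt(u * (b - a) * (c - a))
            return b - math.sqrt((1.0 - u) * (b - a) * (b - c))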

  6. Random Variables: Simulations and Surprising Connections.

    ERIC Educational Resources Information Center

    Quinn, Robert J.; Tomlinson, Stephen

    1999-01-01

    Features activities for advanced second-year algebra students in grades 11 and 12. Introduces three random variables and considers an empirical and theoretical probability for each. Uses coins, regular dice, decahedral dice, and calculators. (ASK)

  7. Contextuality in canonical systems of random variables

    NASA Astrophysics Data System (ADS)

    Dzhafarov, Ehtibar N.; Cervantes, Víctor H.; Kujala, Janne V.

    2017-10-01

    Random variables representing measurements, broadly understood to include any responses to any inputs, form a system in which each of them is uniquely identified by its content (that which it measures) and its context (the conditions under which it is recorded). Two random variables are jointly distributed if and only if they share a context. In a canonical representation of a system, all random variables are binary, and every content-sharing pair of random variables has a unique maximal coupling (the joint distribution imposed on them so that they coincide with maximal possible probability). The system is contextual if these maximal couplings are incompatible with the joint distributions of the context-sharing random variables. We propose to represent any system of measurements in a canonical form and to consider the system contextual if and only if its canonical representation is contextual. As an illustration, we establish a criterion for contextuality of the canonical system consisting of all dichotomizations of a single pair of content-sharing categorical random variables. This article is part of the themed issue `Second quantum revolution: foundational questions'.
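
    As a concrete instance of a maximal coupling (a standard fact, added for orientation): for binary variables with P(X = 1) = p and P(Y = 1) = q, the maximal coupling puts

        P(X = 1, Y = 1) = min(p, q),   P(X = 0, Y = 0) = min(1 - p, 1 - q),

    so that P(X = Y) = 1 - |p - q|, the largest agreement probability compatible with the two marginals.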

  8. Couette-Poiseuille flow experiment with zero mean advection velocity: Subcritical transition to turbulence

    NASA Astrophysics Data System (ADS)

    Klotz, L.; Lemoult, G.; Frontczak, I.; Tuckerman, L. S.; Wesfreid, J. E.

    2017-04-01

    We present an experimental setup that creates a shear flow with zero mean advection velocity achieved by counterbalancing the nonzero streamwise pressure gradient by moving boundaries, which generates plane Couette-Poiseuille flow. We obtain experimental results in the transitional regime for this flow. Using flow visualization, we characterize the subcritical transition to turbulence in Couette-Poiseuille flow and show the existence of turbulent spots generated by a permanent perturbation. Due to the zero mean advection velocity of the base profile, these turbulent structures are nearly stationary. We distinguish two regions of the turbulent spot: the active turbulent core, which is characterized by waviness of the streaks similar to traveling waves, and the surrounding region, which includes in addition the weak undisturbed streaks and oblique waves at the laminar-turbulent interface. We also study the dependence of the size of these two regions on Reynolds number. Finally, we show that the traveling waves move in the downstream (Poiseuille) direction.

  9. Benford's law and continuous dependent random variables

    NASA Astrophysics Data System (ADS)

    Becker, Thealexa; Burt, David; Corcoran, Taylor C.; Greaves-Tunnell, Alec; Iafrate, Joseph R.; Jing, Joy; Miller, Steven J.; Porfilio, Jaclyn D.; Ronan, Ryan; Samranvedhya, Jirapat; Strauch, Frederick W.; Talbut, Blaine

    2018-01-01

    Many mathematical, man-made and natural systems exhibit a leading-digit bias, where a first digit (base 10) of 1 occurs not 11% of the time, as one would expect if all digits were equally likely, but rather 30% of the time. This phenomenon is known as Benford's Law. Analyzing which datasets adhere to Benford's Law and how quickly Benford behavior sets in are the two most important problems in the field. Most previous work studied systems of independent random variables, and relied on the independence in their analyses. Inspired by natural processes such as particle decay, we study the dependent random variables that emerge from models of decomposition of conserved quantities. We prove that in many instances the distribution of lengths of the resulting pieces converges to Benford behavior as the number of divisions grows, and give several conjectures for other fragmentation processes. The main difficulty is that the resulting random variables are dependent. We handle this by using tools from Fourier analysis and irrationality exponents to obtain quantified convergence rates, as well as by introducing and developing techniques to measure and control the dependencies. The construction of these tools is one of the major motivations of this work, as our approach can be applied to many other dependent systems. As an example, we show that the distribution of leading digits of the n! terms in the determinant expansions of n × n matrices with entries independently drawn from nice random variables converges to Benford's Law.
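
    A toy stick-breaking simulation (illustrative only; not the paper's exact fragmentation model) shows the leading digits of the piece lengths approaching Benford's Law, P(d) = log10(1 + 1/d):

        # Split a conserved length repeatedly at uniform proportions and
        # compare leading-digit frequencies with Benford's Law.
        import math
        import random

        def fragment(length=1.0, levels=12):
            pieces = [length]
            for _ in range(levels):
                new = []
                for x in pieces:
                    u = random.random()
                    new.extend((u * x, (1.0 - u) * x))
                pieces = new
            return pieces

        def leading_digit(x):
            return int(f"{x:e}"[0])  # first digit of the scientific form

        pieces = fragment()
        counts = [0] * 10
        for x in pieces:
            counts[leading_digit(x)] += 1
        for d in range(1, 10):
            print(d, counts[d] / len(pieces), math.log10(1.0 + 1.0 / d))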

  10. Mass transfer from a sphere in an oscillating flow with zero mean velocity

    NASA Technical Reports Server (NTRS)

    Drummond, Colin K.; Lyman, Frederic A.

    1990-01-01

    A pseudospectral numerical method is used for the solution of the Navier-Stokes and mass transport equations for a sphere in a sinusoidally oscillating flow with zero mean velocity. The flow is assumed laminar and axisymmetric about the sphere's polar axis. Oscillating flow results were obtained for Reynolds numbers (based on the free-stream oscillatory flow amplitude) between 1 and 150, and Strouhal numbers between 1 and 1000. Sherwood numbers were computed and their dependency on the flow frequency and amplitude discussed. An assessment of the validity of the quasi-steady assumption for mass transfer is based on these results.

  11. A Comparison of Zero Mean Strain Rotating Beam Fatigue Test Methods for Nitinol Wire

    NASA Astrophysics Data System (ADS)

    Norwich, Dennis W.

    2014-07-01

    Zero mean strain rotating beam fatigue testing has become the standard for comparing the fatigue properties of Nitinol wire. Most commercially available equipment consists of either a two-chuck or a chuck and bushing system, where the wire length and center-to-center axis distance determine the maximum strain on the wire. For the two-chuck system, the samples are constrained at either end of the wire, and both chucks are driven at the same speed. For the chuck and bushing system, the sample is constrained at one end in a chuck and rides freely in a bushing at the other end. These equivalent systems will both be referred to herein as Chuck-to-Chuck systems. An alternate system uses a machined test block with a specific radius to guide the wire at a known strain during testing. In either system, the test parts can be immersed in a temperature-controlled fluid bath to eliminate any heating effect created in the specimen due to dissipative processes during cyclic loading (cyclic stress-induced formation of martensite; Wagner et al., Mater. Sci. Eng. A, 378, pp. 105-109). This study will compare the results of the same starting material tested with each system to determine whether the test system differences affect the final results. The advantages and disadvantages of each system will be highlighted and compared. The factors compared will include ease of setup, operator skill level required, consistency of strain measurement, equipment test limits, and data recovery and analysis. Also, the effect of test speed on the test results for each system will be investigated.

  12. On the minimum of independent geometrically distributed random variables

    NASA Technical Reports Server (NTRS)

    Ciardo, Gianfranco; Leemis, Lawrence M.; Nicol, David

    1994-01-01

    The expectations E(X_1), E(Z_1), and E(Y_1) of the minimum of n independent geometric, modified geometric, or exponential random variables with matching expectations differ. We show how this is accounted for by stochastic variability and how E(X_1)/E(Y_1) equals the expected number of ties at the minimum for the geometric random variables. We then introduce the 'shifted geometric distribution' and show that there is a unique value of the shift for which the individual shifted geometric and exponential random variables match expectations both individually and in their minima.
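
    A quick Monte Carlo sketch consistent with these claims (parameters illustrative): with matching means, E(Y_1) = 1/(np) for the exponential minimum, and the simulated mean number of ties at the geometric minimum approaches E(X_1)/E(Y_1) = np / (1 - (1-p)^n).

        # Check E(min) and the expected number of ties for geometric minima.
        import random

        def geom(p):
            k = 1
            while random.random() >= p:
                k += 1
            return k

        n, p, trials = 5, 0.3, 200_000
        tot_min = tot_ties = 0
        for _ in range(trials):
            xs = [geom(p) for _ in range(n)]
            m = min(xs)
            tot_min += m
            tot_ties += xs.count(m)

        exact_min = 1.0 / (1.0 - (1.0 - p) ** n)  # E(X_1)
        print("E[min]: simulated", tot_min / trials, "exact", exact_min)
        print("E[ties]: simulated", tot_ties / trials,
              "E(X_1)/E(Y_1)", exact_min * n * p)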

  13. Transcription, intercellular variability and correlated random walk.

    PubMed

    Müller, Johannes; Kuttler, Christina; Hense, Burkhard A; Zeiser, Stefan; Liebscher, Volkmar

    2008-11-01

    We develop a simple model for the random distribution of a gene product. It is assumed that the only source of variance is due to switching transcription on and off by a random process. Under the condition that the transition rates between on and off are constant we find that the amount of mRNA follows a scaled Beta distribution. Additionally, a simple positive feedback loop is considered. The simplicity of the model allows for an explicit solution also in this setting. These findings in turn allow, e.g., for easy parameter scans. We find that bistable behavior translates into bimodal distributions. These theoretical findings are in line with experimental results.

  14. An instrumental variable random-coefficients model for binary outcomes

    PubMed Central

    Chesher, Andrew; Rosen, Adam M

    2014-01-01

    In this paper, we study a random-coefficients model for a binary outcome. We allow for the possibility that some or even all of the explanatory variables are arbitrarily correlated with the random coefficients, thus permitting endogeneity. We assume the existence of observed instrumental variables Z that are jointly independent with the random coefficients, although we place no structure on the joint determination of the endogenous variable X and instruments Z, as would be required for a control function approach. The model fits within the spectrum of generalized instrumental variable models, and we thus apply identification results from our previous studies of such models to the present context, demonstrating their use. Specifically, we characterize the identified set for the distribution of random coefficients in the binary response model with endogeneity via a collection of conditional moment inequalities, and we investigate the structure of these sets by way of numerical illustration. PMID:25798048

  15. Polynomial chaos expansion with random and fuzzy variables

    NASA Astrophysics Data System (ADS)

    Jacquelin, E.; Friswell, M. I.; Adhikari, S.; Dessombz, O.; Sinou, J.-J.

    2016-06-01

    A dynamical uncertain system is studied in this paper. Two kinds of uncertainties are addressed, where the uncertain parameters are described through random variables and/or fuzzy variables. A general framework is proposed to deal with both kinds of uncertainty using a polynomial chaos expansion (PCE). It is shown that fuzzy variables may be expanded in terms of polynomial chaos when Legendre polynomials are used. The components of the PCE are a solution of an equation that does not depend on the nature of uncertainty. Once this equation is solved, the post-processing of the data gives the moments of the random response when the uncertainties are random or gives the response interval when the variables are fuzzy. With the PCE approach, it is also possible to deal with mixed uncertainty, when some parameters are random and others are fuzzy. The results provide a fuzzy description of the response statistical moments.

  16. Randomized trial of intermittent or continuous amnioinfusion for variable decelerations.

    PubMed

    Rinehart, B K; Terrone, D A; Barrow, J H; Isler, C M; Barrilleaux, P S; Roberts, W E

    2000-10-01

    To determine whether continuous or intermittent bolus amnioinfusion is more effective in relieving variable decelerations. Patients with repetitive variable decelerations were randomized to an intermittent bolus or continuous amnioinfusion. The intermittent bolus infusion group received boluses of 500 mL of normal saline, each over 30 minutes, with boluses repeated if variable decelerations recurred. The continuous infusion group received a bolus infusion of 500 mL of normal saline over 30 minutes and then 3 mL per minute until delivery occurred. The ability of the amnioinfusion to abolish variable decelerations was analyzed, as were maternal demographic and pregnancy outcome variables. Power analysis indicated that 64 patients would be required. Thirty-five patients were randomized to intermittent infusion and 30 to continuous infusion. There were no differences between groups in terms of maternal demographics, gestational age, delivery mode, neonatal outcome, median time to resolution of variable decelerations, or the number of times variable decelerations recurred. The median volume infused in the intermittent infusion group (500 mL) was significantly less than that in the continuous infusion group (905 mL, P =.003). Intermittent bolus amnioinfusion is as effective as continuous infusion in relieving variable decelerations in labor. Further investigation is necessary to determine whether either of these techniques is associated with increased occurrence of rare complications such as cord prolapse or uterine rupture.

  17. A Random Variable Related to the Inversion Vector of a Partial Random Permutation

    ERIC Educational Resources Information Center

    Laghate, Kavita; Deshpande, M. N.

    2005-01-01

    In this article, we define the inversion vector of a permutation of the integers 1, 2,..., n. We set up a particular kind of permutation, called a partial random permutation. The sum of the elements of the inversion vector of such a permutation is a random variable of interest.

  18. Random variability explains apparent global clustering of large earthquakes

    Michael, A.J.

    2011-01-01

    The occurrence of 5 Mw ≥ 8.5 earthquakes since 2004 has created a debate over whether or not we are in a global cluster of large earthquakes, temporarily raising risks above long-term levels. I use three classes of statistical tests to determine if the record of M ≥ 7 earthquakes since 1900 can reject a null hypothesis of independent random events with a constant rate plus localized aftershock sequences. The data cannot reject this null hypothesis. Thus, the temporal distribution of large global earthquakes is well-described by a random process, plus localized aftershocks, and apparent clustering is due to random variability. Therefore the risk of future events has not increased, except within ongoing aftershock sequences, and should be estimated from the longest possible record of events.

  19. Use of allele scores as instrumental variables for Mendelian randomization

    PubMed Central

    Burgess, Stephen; Thompson, Simon G

    2013-01-01

    Background An allele score is a single variable summarizing multiple genetic variants associated with a risk factor. It is calculated as the total number of risk factor-increasing alleles for an individual (unweighted score), or the sum of weights for each allele corresponding to estimated genetic effect sizes (weighted score). An allele score can be used in a Mendelian randomization analysis to estimate the causal effect of the risk factor on an outcome. Methods Data were simulated to investigate the use of allele scores in Mendelian randomization where conventional instrumental variable techniques using multiple genetic variants demonstrate ‘weak instrument’ bias. The robustness of estimates using the allele score to misspecification (for example non-linearity, effect modification) and to violations of the instrumental variable assumptions was assessed. Results Causal estimates using a correctly specified allele score were unbiased with appropriate coverage levels. The estimates were generally robust to misspecification of the allele score, but not to instrumental variable violations, even if the majority of variants in the allele score were valid instruments. Using a weighted rather than an unweighted allele score increased power, but the increase was small when genetic variants had similar effect sizes. Naive use of the data under analysis to choose which variants to include in an allele score, or for deriving weights, resulted in substantial biases. Conclusions Allele scores enable valid causal estimates with large numbers of genetic variants. The stringency of criteria for genetic variants in Mendelian randomization should be maintained for all variants in an allele score. PMID:24062299
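
    A minimal sketch of the two scores and the ratio (Wald) estimate an allele score can feed, under assumed data shapes (G an n × k matrix of risk-allele counts, w per-variant weights; names illustrative):

        import numpy as np

        def allele_scores(G, w):
            # unweighted: total count of risk-increasing alleles per individual
            # weighted: effect-size-weighted sum per individual
            return G.sum(axis=1), G @ w

        def iv_ratio(score, exposure, outcome):
            # causal effect estimate using the score as a single instrument
            return np.cov(score, outcome)[0, 1] / np.cov(score, exposure)[0, 1]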

  20. Leveraging prognostic baseline variables to gain precision in randomized trials

    PubMed Central

    Colantuoni, Elizabeth; Rosenblum, Michael

    2015-01-01

    We focus on estimating the average treatment effect in a randomized trial. If baseline variables are correlated with the outcome, then appropriately adjusting for these variables can improve precision. An example is the analysis of covariance (ANCOVA) estimator, which applies when the outcome is continuous, the quantity of interest is the difference in mean outcomes comparing treatment versus control, and a linear model with only main effects is used. ANCOVA is guaranteed to be at least as precise as the standard unadjusted estimator, asymptotically, under no parametric model assumptions and also is locally semiparametric efficient. Recently, several estimators have been developed that extend these desirable properties to more general settings that allow any real-valued outcome (e.g., binary or count), contrasts other than the difference in mean outcomes (such as the relative risk), and estimators based on a large class of generalized linear models (including logistic regression). To the best of our knowledge, we give the first simulation study in the context of randomized trials that compares these estimators. Furthermore, our simulations are not based on parametric models; instead, our simulations are based on resampling data from completed randomized trials in stroke and HIV in order to assess estimator performance in realistic scenarios. We provide practical guidance on when these estimators are likely to provide substantial precision gains and describe a quick assessment method that allows clinical investigators to determine whether these estimators could be useful in their specific trial contexts. PMID:25872751

  1. Generating Variable and Random Schedules of Reinforcement Using Microsoft Excel Macros

    PubMed Central

    Bancroft, Stacie L; Bourret, Jason C

    2008-01-01

    Variable reinforcement schedules are used to arrange the availability of reinforcement following varying response ratios or intervals of time. Random reinforcement schedules are subtypes of variable reinforcement schedules that can be used to arrange the availability of reinforcement at a constant probability across number of responses or time. Generating schedule values for variable and random reinforcement schedules can be difficult. The present article describes the steps necessary to write macros in Microsoft Excel that will generate variable-ratio, variable-interval, variable-time, random-ratio, random-interval, and random-time reinforcement schedule values. PMID:18595286
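
    The article's macros are written for Excel (VBA); as a language-neutral illustration of the constant-probability ("random") schedules, the Python sketch below assumes exponential inter-reinforcement intervals for random-interval schedules and geometric response requirements for random-ratio schedules:

        import random

        def random_interval(mean_s, n):
            # constant reinforcement probability per unit time
            return [random.expovariate(1.0 / mean_s) for _ in range(n)]

        def random_ratio(p, n):
            # constant reinforcement probability p per response
            values = []
            for _ in range(n):
                k = 1
                while random.random() >= p:
                    k += 1
                values.append(k)
            return values

        print(random_interval(30.0, 10))  # RI 30-s schedule values
        print(random_ratio(0.05, 10))     # RR 20 schedule values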

  2. Generating variable and random schedules of reinforcement using Microsoft Excel macros.

    PubMed

    Bancroft, Stacie L; Bourret, Jason C

    2008-01-01

    Variable reinforcement schedules are used to arrange the availability of reinforcement following varying response ratios or intervals of time. Random reinforcement schedules are subtypes of variable reinforcement schedules that can be used to arrange the availability of reinforcement at a constant probability across number of responses or time. Generating schedule values for variable and random reinforcement schedules can be difficult. The present article describes the steps necessary to write macros in Microsoft Excel that will generate variable-ratio, variable-interval, variable-time, random-ratio, random-interval, and random-time reinforcement schedule values.

  3. A Random Variable Approach to Nuclear Targeting and Survivability

    SciT

    Undem, Halvor A.

    We demonstrate a common mathematical formalism for analyzing problems in nuclear survivability and targeting. This formalism, beginning with a random variable approach, can be used to interpret past efforts in nuclear-effects analysis, including targeting analysis. It can also be used to analyze new problems brought about by the post Cold War Era, such as the potential effects of yield degradation in a permanently untested nuclear stockpile. In particular, we illustrate the formalism through four natural case studies or illustrative problems, linking these to actual past data, modeling, and simulation, and suggesting future uses. In the first problem, we illustrate the case of a deterministically modeled weapon used against a deterministically responding target. Classic "Cookie Cutter" damage functions result. In the second problem, we illustrate, with actual target test data, the case of a deterministically modeled weapon used against a statistically responding target. This case matches many of the results of current nuclear targeting modeling and simulation tools, including the result of distance damage functions as complementary cumulative lognormal functions in the range variable. In the third problem, we illustrate the case of a statistically behaving weapon used against a deterministically responding target. In particular, we show the dependence of target damage on weapon yield for an untested nuclear stockpile experiencing yield degradation. Finally, and using actual unclassified weapon test data, we illustrate in the fourth problem the case of a statistically behaving weapon used against a statistically responding target.

  4. Instrumental variables and Mendelian randomization with invalid instruments

    NASA Astrophysics Data System (ADS)

    Kang, Hyunseung

    Instrumental variables (IV) methods have been widely used to determine the causal effect of a treatment, exposure, policy, or an intervention on an outcome of interest. The IV method relies on having a valid instrument, a variable that is (A1) associated with the exposure, (A2) has no direct effect on the outcome, and (A3) is unrelated to the unmeasured confounders associated with the exposure and the outcome. However, in practice, finding a valid instrument, especially those that satisfy (A2) and (A3), can be challenging. For example, in Mendelian randomization studies where genetic markers are used as instruments, complete knowledge about instruments' validity is equivalent to complete knowledge about the involved genes' functions. The dissertation explores the theory, methods, and application of IV methods when invalid instruments are present. First, when we have multiple candidate instruments, we establish a theoretical bound whereby causal effects are only identified as long as less than 50% of instruments are invalid, without knowing which of the instruments are invalid. We also propose a fast penalized method, called sisVIVE, to estimate the causal effect. We find that sisVIVE outperforms traditional IV methods when invalid instruments are present both in simulation studies as well as in real data analysis. Second, we propose a robust confidence interval under the multiple invalid IV setting. This work is an extension of our work on sisVIVE. However, unlike sisVIVE which is robust to violations of (A2) and (A3), our confidence interval procedure provides honest coverage even if all three assumptions, (A1)-(A3), are violated. Third, we study the single IV setting where the one IV we have may actually be invalid. We propose a nonparametric IV estimation method based on full matching, a technique popular in causal inference for observational data, that leverages observed covariates to make the instrument more valid. We propose an estimator along with

  5. Rates of profit as correlated sums of random variables

    NASA Astrophysics Data System (ADS)

    Greenblatt, R. E.

    2013-10-01

    Profit realization is the dominant feature of market-based economic systems, determining their dynamics to a large extent. Rather than attaining an equilibrium, profit rates vary widely across firms, and the variation persists over time. Differing definitions of profit result in differing empirical distributions. To study the statistical properties of profit rates, I used data from a publicly available database for the US Economy for 2009-2010 (Risk Management Association). For each of three profit rate measures, the sample space consists of 771 points. Each point represents aggregate data from a small number of US manufacturing firms of similar size and type (NAICS code of principal product). When comparing the empirical distributions of profit rates, significant ‘heavy tails’ were observed, corresponding principally to a number of firms with larger profit rates than would be expected from simple models. An apparently novel correlated sum of random variables statistical model was used to model the data. In the case of operating and net profit rates, a number of firms show negative profits (losses), ruling out simple gamma or lognormal distributions as complete models for these data.

  6. On the fluctuations of sums of independent random variables.

    PubMed

    Feller, W

    1969-07-01

    If X_1, X_2, ... are independent random variables with zero expectation and finite variances, the cumulative sums S_n are, on the average, of the order of magnitude s_n, where s_n^2 = E(S_n^2). The occasional maxima of the ratios S_n/s_n are surprisingly large, and the problem is to estimate the extent of their probable fluctuations. Specifically, let S_n* = (S_n - b_n)/a_n, where {a_n} and {b_n} are two numerical sequences. For any interval I, denote by p(I) the probability that the event S_n* ∈ I occurs for infinitely many n. Under mild conditions on {a_n} and {b_n}, it is shown that p(I) equals 0 or 1 according as a certain series converges or diverges. To obtain the upper limit of S_n/a_n, one has to set b_n = ±ε a_n, but finer results are obtained with smaller b_n. No assumptions concerning the underlying distributions are made; the criteria explain structurally which features of {X_n} affect the fluctuations, but for concrete results something about P{S_n > a_n} must be known. For example, a complete solution is possible when the X_n are normal, replacing the classical law of the iterated logarithm. Further concrete estimates may be obtained by combining the new criteria with some recently developed limit theorems.
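
    For orientation (a standard fact, not part of the abstract): in the i.i.d. case with variance sigma^2, the classical law of the iterated logarithm that these criteria refine reads

        limsup_{n -> infinity} S_n / sqrt(2 sigma^2 n log log n) = 1   almost surely.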

  7. Improved ensemble-mean forecasting of ENSO events by a zero-mean stochastic error model of an intermediate coupled model

    NASA Astrophysics Data System (ADS)

    Zheng, Fei; Zhu, Jiang

    2017-04-01

    How to design a reliable ensemble prediction strategy with considering the major uncertainties of a forecasting system is a crucial issue for performing an ensemble forecast. In this study, a new stochastic perturbation technique is developed to improve the prediction skills of El Niño-Southern Oscillation (ENSO) through using an intermediate coupled model. We first estimate and analyze the model uncertainties from the ensemble Kalman filter analysis results through assimilating the observed sea surface temperatures. Then, based on the pre-analyzed properties of model errors, we develop a zero-mean stochastic model-error model to characterize the model uncertainties mainly induced by the missed physical processes of the original model (e.g., stochastic atmospheric forcing, extra-tropical effects, Indian Ocean Dipole). Finally, we perturb each member of an ensemble forecast at each step by the developed stochastic model-error model during the 12-month forecasting process, and add the zero-mean perturbations into the physical fields to mimic the presence of missing processes and high-frequency stochastic noises. The impacts of stochastic model-error perturbations on ENSO deterministic predictions are examined by performing two sets of 21-yr hindcast experiments, which are initialized from the same initial conditions and differentiated by whether they consider the stochastic perturbations. The comparison results show that the stochastic perturbations have a significant effect on improving the ensemble-mean prediction skills during the entire 12-month forecasting process. This improvement occurs mainly because the nonlinear terms in the model can form a positive ensemble-mean from a series of zero-mean perturbations, which reduces the forecasting biases and then corrects the forecast through this nonlinear heating mechanism.
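
    The rectification mechanism described in the last sentence can be seen in a toy model (illustrative, not the intermediate coupled model): a quadratic nonlinearity turns zero-mean perturbations into a non-zero ensemble mean.

        import numpy as np

        rng = np.random.default_rng(0)

        def step(x, eps):
            # toy nonlinear update; the quadratic term rectifies the noise
            return x + 0.1 * x ** 2 + eps

        members = np.zeros(1000)  # ensemble initialized at zero
        for _ in range(12):       # 12 "monthly" steps
            members = step(members, rng.normal(0.0, 0.5, size=members.size))
        print(members.mean())     # non-zero despite zero-mean perturbations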

  8. CDC6600 subroutine for normal random variables [RVNORM (RMU, SIG)]

    SciT

    Amos, D.E.

    1977-04-01

    A value y for a uniform variable on (0,1) is generated, and a table of 96 percent points for the (0,1) normal distribution is interpolated for a value of the normal variable x(0,1) on 0.02 ≤ y ≤ 0.98. For the tails, the inverse normal is computed by a rational Chebyshev approximation in an appropriate variable. Then X = xσ + μ gives the X(μ, σ) variable.
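
    A Python sketch of the same two-regime strategy (a table interpolation on the central 96 percent, with a library inverse CDF standing in for the rational Chebyshev tail approximation; names illustrative):

        import random
        from statistics import NormalDist

        _ND = NormalDist()
        _YS = [0.02 + 0.01 * i for i in range(97)]  # y = 0.02, 0.03, ..., 0.98
        _TABLE = [_ND.inv_cdf(y) for y in _YS]      # tabulated normal quantiles

        def rvnorm(mu, sigma):
            y = random.uniform(1e-12, 1.0 - 1e-12)  # avoid the exact endpoints
            if 0.02 <= y <= 0.98:
                # linear interpolation in the central table
                i = min(int((y - 0.02) / 0.01), 95)
                frac = (y - _YS[i]) / 0.01
                x = _TABLE[i] + frac * (_TABLE[i + 1] - _TABLE[i])
            else:
                # tail region: direct inverse normal evaluation
                x = _ND.inv_cdf(y)
            return x * sigma + mu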

  9. Raw and Central Moments of Binomial Random Variables via Stirling Numbers

    ERIC Educational Resources Information Center

    Griffiths, Martin

    2013-01-01

    We consider here the problem of calculating the moments of binomial random variables. It is shown how formulae for both the raw and the central moments of such random variables may be obtained in a recursive manner utilizing Stirling numbers of the first kind. Suggestions are also provided as to how students might be encouraged to explore this…

  10. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology

    EPA Science Inventory

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological datasets there is limited guidance on variable selection methods for RF modeling. Typically, e...

  11. The Expected Sample Variance of Uncorrelated Random Variables with a Common Mean and Some Applications in Unbalanced Random Effects Models

    ERIC Educational Resources Information Center

    Vardeman, Stephen B.; Wendelberger, Joanne R.

    2005-01-01

    There is a little-known but very simple generalization of the standard result that for uncorrelated random variables with common mean [mu] and variance [sigma][superscript 2], the expected value of the sample variance is [sigma][superscript 2]. The generalization justifies the use of the usual standard error of the sample mean in possibly…
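
    Presumably the generalization in question is the following easily verified identity: for uncorrelated X_1, ..., X_n with common mean and variances sigma_1^2, ..., sigma_n^2,

        E(S^2) = (sigma_1^2 + ... + sigma_n^2) / n,   where   S^2 = (1/(n-1)) * sum_i (X_i - Xbar)^2,

    which reduces to the standard result when all variances are equal and justifies the usual standard error of the sample mean in the unbalanced case.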

  12. An Entropy-Based Measure of Dependence between Two Groups of Random Variables. Research Report. ETS RR-07-20

    ERIC Educational Resources Information Center

    Kong, Nan

    2007-01-01

    In multivariate statistics, the linear relationship among random variables has been fully explored in the past. This paper looks into the dependence of one group of random variables on another group of random variables using (conditional) entropy. A new measure, called the K-dependence coefficient or dependence coefficient, is defined using…

  13. Generating Variable and Random Schedules of Reinforcement Using Microsoft Excel Macros

    ERIC Educational Resources Information Center

    Bancroft, Stacie L.; Bourret, Jason C.

    2008-01-01

    Variable reinforcement schedules are used to arrange the availability of reinforcement following varying response ratios or intervals of time. Random reinforcement schedules are subtypes of variable reinforcement schedules that can be used to arrange the availability of reinforcement at a constant probability across number of responses or time.…

  14. Sampling-Based Stochastic Sensitivity Analysis Using Score Functions for RBDO Problems with Correlated Random Variables

    DTIC Science & Technology

    2010-08-01

    This study presents a methodology for computing stochastic sensitivities with respect to the design variables for RBDO problems with correlated random variables.

  15. Computer simulation of random variables and vectors with arbitrary probability distribution laws

    NASA Technical Reports Server (NTRS)

    Bogdan, V. M.

    1981-01-01

    Assume that there is given an arbitrary n-dimensional probability distribution F. A recursive construction is found for a sequence of functions x_1 = f_1(U_1, ..., U_n), ..., x_n = f_n(U_1, ..., U_n) such that if U_1, ..., U_n are independent random variables having uniform distribution over the open interval (0,1), then the joint distribution of the variables x_1, ..., x_n coincides with the distribution F. Since uniform independent random variables can be well simulated by means of a computer, this result allows one to simulate arbitrary n random variables if their joint probability distribution is known.
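
    In two dimensions the construction amounts to inverse-transform sampling of the marginal and then of the conditional distribution, each fed by its own uniform variate. A sketch for a bivariate normal target F (an illustrative choice of F):

        import math
        import random
        from statistics import NormalDist

        def sample_bivariate_normal(rho):
            inv = NormalDist().inv_cdf
            u1 = random.uniform(1e-12, 1.0 - 1e-12)
            u2 = random.uniform(1e-12, 1.0 - 1e-12)
            x1 = inv(u1)                                         # from the marginal F_1
            x2 = rho * x1 + math.sqrt(1.0 - rho ** 2) * inv(u2)  # from F_{2|1}(. | x1)
            return x1, x2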

  16. Axial Fatigue Tests at Zero Mean Stress of 24S-T and 75S-T Aluminum-alloy Strips with a Central Circular Hole

    NASA Technical Reports Server (NTRS)

    Brueggeman, W C; Mayer, M JR

    1948-01-01

    Axial fatigue tests at zero mean stress have been made on 0.032- and 0.064-inch 24S-T and 0.032-inch 75S-T sheet-metal specimens 1/4, 1/2, 1, and 2 inches wide, without a hole and with central holes giving ratios of hole diameter D to specimen width W from 0.01 to 0.95. No systematic difference was noted between the results for the 0.032-inch and the 0.064-inch specimens, although the latter seemed the more consistent. In general, the fatigue strength based on the minimum section dropped sharply as the ratio D/W was increased from zero to about 0.25. The plain specimens showed quite a pronounced decrease in fatigue strength with increasing width. The holed specimens showed only slight and rather inconclusive evidence of this size effect. The fatigue stress-concentration factor was higher for 75S-T than for 24S-T alloy. Evidence was found that a very small hole would not cause any reduction in fatigue strength.

  17. Unbiased split variable selection for random survival forests using maximally selected rank statistics.

    PubMed

    Wright, Marvin N; Dankowski, Theresa; Ziegler, Andreas

    2017-04-15

    The most popular approach for analyzing survival data is the Cox regression model. The Cox model may, however, be misspecified, and its proportionality assumption may not always be fulfilled. An alternative approach for survival prediction is random forests for survival outcomes. The standard split criterion for random survival forests is the log-rank test statistic, which favors splitting variables with many possible split points. Conditional inference forests avoid this split variable selection bias. However, linear rank statistics are utilized by default in conditional inference forests to select the optimal splitting variable, which cannot detect non-linear effects in the independent variables. An alternative is to use maximally selected rank statistics for the split point selection. As in conditional inference forests, splitting variables are compared on the p-value scale. However, instead of the conditional Monte-Carlo approach used in conditional inference forests, p-value approximations are employed. We describe several p-value approximations and the implementation of the proposed random forest approach. A simulation study demonstrates that unbiased split variable selection is possible. However, there is a trade-off between unbiased split variable selection and runtime. In benchmark studies of prediction performance on simulated and real datasets, the new method performs better than random survival forests if informative dichotomous variables are combined with uninformative variables with more categories and better than conditional inference forests if non-linear covariate effects are included. In a runtime comparison, the method proves to be computationally faster than both alternatives, if a simple p-value approximation is used. Copyright © 2017 John Wiley & Sons, Ltd.

  18. Sums and Products of Jointly Distributed Random Variables: A Simplified Approach

    ERIC Educational Resources Information Center

    Stein, Sheldon H.

    2005-01-01

    Three basic theorems concerning expected values and variances of sums and products of random variables play an important role in mathematical statistics and its applications in education, business, the social sciences, and the natural sciences. A solid understanding of these theorems requires that students be familiar with the proofs of these…

  19. Statistical Analysis for Multisite Trials Using Instrumental Variables with Random Coefficients

    ERIC Educational Resources Information Center

    Raudenbush, Stephen W.; Reardon, Sean F.; Nomi, Takako

    2012-01-01

    Multisite trials can clarify the average impact of a new program and the heterogeneity of impacts across sites. Unfortunately, in many applications, compliance with treatment assignment is imperfect. For these applications, we propose an instrumental variable (IV) model with person-specific and site-specific random coefficients. Site-specific IV…

  20. Variable selection with random forest: Balancing stability, performance, and interpretation in ecological and environmental modeling

    EPA Science Inventory

    Random forest (RF) is popular in ecological and environmental modeling, in part, because of its insensitivity to correlated predictors and resistance to overfitting. Although variable selection has been proposed to improve both performance and interpretation of RF models, it is u...

  1. On the distribution of a product of N Gaussian random variables

    NASA Astrophysics Data System (ADS)

    Stojanac, Željka; Suess, Daniel; Kliesch, Martin

    2017-08-01

    The product of Gaussian random variables appears naturally in many applications in probability theory and statistics. It has been known that the distribution of a product of N such variables can be expressed in terms of a Meijer G-function. Here, we compute a similar representation for the corresponding cumulative distribution function (CDF) and provide a power-log series expansion of the CDF based on the theory of the more general Fox H-functions. Numerical computations show that for small values of the argument the CDF of products of Gaussians is well approximated by the lowest orders of this expansion. Analogous results are also shown for the absolute value as well as the square of such products of N Gaussian random variables. For the latter two settings, we also compute the moment generating functions in terms of Meijer G-functions.
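
    Simple moments give a quick numerical cross-check of such results (illustrative; the paper concerns the full CDF): for a product P of N independent standard Gaussians, E[P^2] = 1 and E[|P|] = (2/pi)^(N/2).

        import math
        import random

        N, trials = 4, 200_000
        s_abs = s_sq = 0.0
        for _ in range(trials):
            p = 1.0
            for _ in range(N):
                p *= random.gauss(0.0, 1.0)
            s_abs += abs(p)
            s_sq += p * p
        print(s_abs / trials, (2.0 / math.pi) ** (N / 2.0))  # should agree
        print(s_sq / trials, 1.0)                            # should agree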

  2. Network Mendelian randomization: using genetic variants as instrumental variables to investigate mediation in causal pathways

    PubMed Central

    Burgess, Stephen; Daniel, Rhian M; Butterworth, Adam S; Thompson, Simon G

    2015-01-01

    Background: Mendelian randomization uses genetic variants, assumed to be instrumental variables for a particular exposure, to estimate the causal effect of that exposure on an outcome. If the instrumental variable criteria are satisfied, the resulting estimator is consistent even in the presence of unmeasured confounding and reverse causation. Methods: We extend the Mendelian randomization paradigm to investigate more complex networks of relationships between variables, in particular where some of the effect of an exposure on the outcome may operate through an intermediate variable (a mediator). If instrumental variables for the exposure and mediator are available, direct and indirect effects of the exposure on the outcome can be estimated, for example using either a regression-based method or structural equation models. The direction of effect between the exposure and a possible mediator can also be assessed. Methods are illustrated in an applied example considering causal relationships between body mass index, C-reactive protein and uric acid. Results: These estimators are consistent in the presence of unmeasured confounding if, in addition to the instrumental variable assumptions, the effects of both the exposure on the mediator and the mediator on the outcome are homogeneous across individuals and linear without interactions. Nevertheless, a simulation study demonstrates that even considerable heterogeneity in these effects does not lead to bias in the estimates. Conclusions: These methods can be used to estimate direct and indirect causal effects in a mediation setting, and have potential for the investigation of more complex networks between multiple interrelated exposures and disease outcomes. PMID:25150977

  3. Random variable transformation for generalized stochastic radiative transfer in finite participating slab media

    NASA Astrophysics Data System (ADS)

    El-Wakil, S. A.; Sallah, M.; El-Hanbaly, A. M.

    2015-10-01

    The stochastic radiative transfer problem is studied in a participating planar finite continuously fluctuating medium. The problem is considered for specularly- and diffusely-reflecting boundaries with linear anisotropic scattering. The random variable transformation (RVT) technique is used to get the complete average of the solution functions, which are represented by the probability density function (PDF) of the solution process. In the RVT algorithm, a simple integral transformation is applied to the input stochastic process (the extinction function of the medium). This linear transformation enables us to rewrite the stochastic transport equations in terms of the optical random variable (x) and the optical random thickness (L). The transport equation is then solved deterministically to get a closed form for the solution as a function of x and L. This solution is used to obtain the PDF of the solution functions by applying the RVT technique to the input random variable (L) and the output process (the solution functions). The obtained averages of the solution functions are used to get complete analytical averages for some interesting physical quantities, namely, the reflectivity and transmissivity at the medium boundaries. In terms of the average reflectivity and transmissivity, the averages of the partial heat fluxes for the generalized problem with an internal radiation source are obtained and represented graphically.

  4. Variable density randomized stack of spirals (VDR-SoS) for compressive sensing MRI.

    PubMed

    Valvano, Giuseppe; Martini, Nicola; Landini, Luigi; Santarelli, Maria Filomena

    2016-07-01

    To develop a 3D sampling strategy based on a stack of variable density spirals for compressive sensing MRI. A random sampling pattern was obtained by rotating each spiral by a random angle and by delaying for a few time steps the gradient waveforms of the different interleaves. A three-dimensional (3D) variable sampling density was obtained by designing different variable density spirals for each slice encoding. The proposed approach was tested with phantom simulations up to a five-fold undersampling factor. Fully sampled 3D datasets of a human knee and of a human brain were obtained from a healthy volunteer. The proposed approach was tested with off-line reconstructions of the knee dataset up to a four-fold acceleration and compared with other noncoherent trajectories. The proposed approach outperformed the standard stack of spirals for various undersampling factors. The level of coherence and the reconstruction quality of the proposed approach were similar to those of other trajectories that, however, require 3D gridding for the reconstruction. The variable density randomized stack of spirals (VDR-SoS) is an easily implementable trajectory that could represent a valid sampling strategy for 3D compressive sensing MRI. It guarantees low levels of coherence without requiring 3D gridding. Magn Reson Med 76:59-69, 2016. © 2015 Wiley Periodicals, Inc.

  5. Random dopant fluctuations and statistical variability in n-channel junctionless FETs

    NASA Astrophysics Data System (ADS)

    Akhavan, N. D.; Umana-Membreno, G. A.; Gu, R.; Antoszewski, J.; Faraone, L.

    2018-01-01

    The influence of random dopant fluctuations on the statistical variability of the electrical characteristics of n-channel silicon junctionless nanowire transistor (JNT) has been studied using three dimensional quantum simulations based on the non-equilibrium Green’s function (NEGF) formalism. Average randomly distributed body doping densities of 2 × 1019, 6 × 1019 and 1 × 1020 cm-3 have been considered employing an atomistic model for JNTs with gate lengths of 5, 10 and 15 nm. We demonstrate that by properly adjusting the doping density in the JNT, a near ideal statistical variability and electrical performance can be achieved, which can pave the way for the continuation of scaling in silicon CMOS technology.

  6. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology.

    PubMed

    Fox, Eric W; Hill, Ryan A; Leibowitz, Scott G; Olsen, Anthony R; Thornbrugh, Darren J; Weber, Marc H

    2017-07-01

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological data sets, there is limited guidance on variable selection methods for RF modeling. Typically, either a preselected set of predictor variables are used or stepwise procedures are employed which iteratively remove variables according to their importance measures. This paper investigates the application of variable selection methods to RF models for predicting probable biological stream condition. Our motivating data set consists of the good/poor condition of n = 1365 stream survey sites from the 2008/2009 National Rivers and Stream Assessment, and a large set (p = 212) of landscape features from the StreamCat data set as potential predictors. We compare two types of RF models: a full variable set model with all 212 predictors and a reduced variable set model selected using a backward elimination approach. We assess model accuracy using RF's internal out-of-bag estimate, and a cross-validation procedure with validation folds external to the variable selection process. We also assess the stability of the spatial predictions generated by the RF models to changes in the number of predictors and argue that model selection needs to consider both accuracy and stability. The results suggest that RF modeling is robust to the inclusion of many variables of moderate to low importance. We found no substantial improvement in cross-validated accuracy as a result of variable reduction. Moreover, the backward elimination procedure tended to select too few variables and exhibited numerous issues such as upwardly biased out-of-bag accuracy estimates and instabilities in the spatial predictions. We use simulations to further support and generalize results from the analysis of real data. A main purpose of this work is to elucidate issues of model selection bias and instability to ecologists interested in
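
    A minimal backward-elimination sketch in the spirit described (not the authors' exact protocol; scikit-learn names assumed):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        def backward_elimination(X, y, min_vars=5, drop_frac=0.2, seed=0):
            kept = np.arange(X.shape[1])
            history = []
            while len(kept) >= min_vars:
                rf = RandomForestClassifier(n_estimators=500, random_state=seed)
                acc = cross_val_score(rf, X[:, kept], y, cv=5).mean()
                rf.fit(X[:, kept], y)
                history.append((kept.copy(), acc))
                # drop the least important fraction of remaining predictors
                order = np.argsort(rf.feature_importances_)
                kept = kept[order[max(1, int(drop_frac * len(kept))):]]
            return max(history, key=lambda h: h[1])

    As the abstract cautions, accuracy reported this way is only honest if the validation folds are kept external to the entire selection loop.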

  7. A New Approach to Extreme Value Estimation Applicable to a Wide Variety of Random Variables

    NASA Technical Reports Server (NTRS)

    Holland, Frederic A., Jr.

    1997-01-01

    Designing reliable structures requires an estimate of the maximum and minimum values (i.e., strength and load) that may be encountered in service. Yet designs based on very extreme values (to ensure safety) can result in extra material usage and, hence, uneconomic systems. In aerospace applications, severe over-design cannot be tolerated, making it almost mandatory to design closer to the assumed limits of the design random variables. The issue then is predicting extreme values that are practical, i.e., neither too conservative nor non-conservative. Obtaining design values by employing safety factors is well known to often result in overly conservative designs. Safety factor values have historically been selected rather arbitrarily, often lacking a sound rational basis. Answering the question of how safe a design needs to be has led design theorists to probabilistic and statistical methods. The so-called three-sigma approach is one such method and has been described as the first step in utilizing information about the data dispersion. However, this method is based on the assumption that the random variable is dispersed symmetrically about the mean and is essentially limited to normally distributed random variables. Use of this method can therefore result in unsafe or overly conservative design allowables if the common assumption of normality is incorrect.
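
    The three-sigma approach mentioned above, in minimal form (illustrative numbers):

        import statistics

        strengths = [412, 398, 405, 420, 401, 409, 395, 417]  # illustrative data
        mu = statistics.fmean(strengths)
        sigma = statistics.stdev(strengths)
        # symmetric design limits; meaningful only under (near-)normality,
        # which is exactly the assumption the abstract questions
        print(mu - 3.0 * sigma, mu + 3.0 * sigma)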

  8. AUTOCLASSIFICATION OF THE VARIABLE 3XMM SOURCES USING THE RANDOM FOREST MACHINE LEARNING ALGORITHM

    SciT

    Farrell, Sean A.; Murphy, Tara; Lo, Kitty K., E-mail: s.farrell@physics.usyd.edu.au

    In the current era of large surveys and massive data sets, autoclassification of astrophysical sources using intelligent algorithms is becoming increasingly important. In this paper we present the catalog of variable sources in the Third XMM-Newton Serendipitous Source catalog (3XMM) autoclassified using the Random Forest machine learning algorithm. We used a sample of manually classified variable sources from the second data release of the XMM-Newton catalogs (2XMMi-DR2) to train the classifier, obtaining an accuracy of ∼92%. We also evaluated the effectiveness of identifying spurious detections using a sample of spurious sources, achieving an accuracy of ∼95%. Manual investigation of a random sample of classified sources confirmed these accuracy levels and showed that the Random Forest machine learning algorithm is highly effective at automatically classifying 3XMM sources. Here we present the catalog of classified 3XMM variable sources. We also present three previously unidentified unusual sources that were flagged as outlier sources by the algorithm: a new candidate supergiant fast X-ray transient, a 400 s X-ray pulsar, and an eclipsing 5 hr binary system coincident with a known Cepheid.

  9. Extended q-Gaussian and q-exponential distributions from gamma random variables

    NASA Astrophysics Data System (ADS)

    Budini, Adrián A.

    2015-05-01

    The family of q-Gaussian and q-exponential probability densities fit the statistical behavior of diverse complex self-similar nonequilibrium systems. These distributions, independently of the underlying dynamics, can rigorously be obtained by maximizing Tsallis "nonextensive" entropy under appropriate constraints, as well as from superstatistical models. In this paper we provide an alternative and complementary scheme for deriving these objects. We show that q-Gaussian and q-exponential random variables can always be expressed as a function of two statistically independent gamma random variables with the same scale parameter. Their shape index determines the complexity q parameter. This result also allows us to define an extended family of asymmetric q-Gaussian and modified q-exponential densities, which reduce to the standard ones when the shape parameters are the same. Furthermore, we demonstrate that a simple change of variables always allows relating any of these distributions with a beta stochastic variable. The extended distributions are applied in the statistical description of different complex dynamics such as log-return signals in financial markets and motion of point defects in a fluid flow.
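
    As a small numerical illustration of the gamma-ratio idea (a sketch, not the paper's general construction): the ratio of an exponential, i.e. Gamma(1), variable to an independent Gamma(ν) variable is Lomax distributed, which is a q-exponential under one common convention (q = (ν+2)/(ν+1)).

    ```python
    # q-exponential (Lomax) samples from two independent gamma variables.
    import numpy as np

    rng = np.random.default_rng(42)
    nu, theta, n = 3.0, 1.0, 1_000_000
    g1 = rng.gamma(shape=1.0, scale=1.0, size=n)  # exponential = Gamma(1)
    g2 = rng.gamma(shape=nu, scale=1.0, size=n)
    y = theta * g1 / g2                            # Lomax(nu, theta)

    # The survival function should match (1 + t/theta)^(-nu).
    for t in (1.0, 5.0, 20.0):
        print(f"t={t:4.1f}: empirical {(y > t).mean():.5f}, "
              f"exact {(1 + t / theta) ** (-nu):.5f}")
    ```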

  10. Evaluation of variable selection methods for random forests and omics data sets.

    PubMed

    Degenhardt, Frauke; Seifert, Stephan; Szymczak, Silke

    2017-10-16

    Machine learning methods and in particular random forests are promising approaches for prediction based on high dimensional omics data sets. They provide variable importance measures to rank predictors according to their predictive power. If building a prediction model is the main goal of a study, often a minimal set of variables with good prediction performance is selected. However, if the objective is the identification of involved variables to find active networks and pathways, approaches that aim to select all relevant variables should be preferred. We evaluated several variable selection procedures based on simulated data as well as publicly available experimental methylation and gene expression data. Our comparison included the Boruta algorithm, the Vita method, recurrent relative variable importance, a permutation approach and its parametric variant (Altmann) as well as recursive feature elimination (RFE). In our simulation studies, Boruta was the most powerful approach, followed closely by the Vita method. Both approaches demonstrated similar stability in variable selection, while Vita was the most robust approach under a pure null model without any predictor variables related to the outcome. In the analysis of the different experimental data sets, Vita demonstrated slightly better stability in variable selection and was less computationally intensive than Boruta. In conclusion, we recommend the Boruta and Vita approaches for the analysis of high-dimensional data sets. Vita is considerably faster than Boruta and thus more suitable for large data sets, but only Boruta can also be applied in low-dimensional settings. © The Author 2017. Published by Oxford University Press.
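
    Of the compared methods, recursive feature elimination is the most readily reproduced outside R; a minimal scikit-learn sketch wrapping RFE around a random forest on synthetic data follows (Boruta and Vita live in separate packages and are not shown).

    ```python
    # Recursive feature elimination (RFE) around a random forest (sketch).
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.feature_selection import RFE

    X, y = make_classification(n_samples=300, n_features=50, n_informative=5,
                               random_state=0)
    selector = RFE(RandomForestClassifier(n_estimators=200, random_state=0),
                   n_features_to_select=10, step=5)  # drop 5 features per round
    selector.fit(X, y)
    print("selected features:",
          [i for i, keep in enumerate(selector.support_) if keep])
    ```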

  11. A Geostatistical Scaling Approach for the Generation of Non Gaussian Random Variables and Increments

    NASA Astrophysics Data System (ADS)

    Guadagnini, Alberto; Neuman, Shlomo P.; Riva, Monica; Panzeri, Marco

    2016-04-01

    We address manifestations of non-Gaussian statistical scaling displayed by many variables, Y, and their (spatial or temporal) increments. Evidence of such behavior includes symmetry of increment distributions at all separation distances (or lags) with sharp peaks and heavy tails which tend to decay asymptotically as lag increases. Variables reported to exhibit such distributions include quantities of direct relevance to hydrogeological sciences, e.g. porosity, log permeability, electrical resistivity, soil and sediment texture, sediment transport rate, rainfall, measured and simulated turbulent fluid velocity, and others. No model known to us captures all of the documented statistical scaling behaviors in a unique and consistent manner. We recently proposed a generalized sub-Gaussian model (GSG) which reconciles within a unique theoretical framework the probability distributions of a target variable and its increments. We presented an algorithm to generate unconditional random realizations of statistically isotropic or anisotropic GSG functions and illustrated it in two dimensions. In this context, we demonstrated the feasibility of estimating all key parameters of a GSG model underlying a single realization of Y by analyzing jointly spatial moments of Y data and corresponding increments. Here, we extend our GSG model to account for noisy measurements of Y at a discrete set of points in space (or time), present an algorithm to generate conditional realizations of the corresponding isotropic or anisotropic random field, and explore them on one- and two-dimensional synthetic test cases.

  12. Continuous-variable phase estimation with unitary and random linear disturbance

    NASA Astrophysics Data System (ADS)

    Delgado de Souza, Douglas; Genoni, Marco G.; Kim, M. S.

    2014-10-01

    We address the problem of continuous-variable quantum phase estimation in the presence of linear disturbance at the Hamiltonian level by means of Gaussian probe states. In particular we discuss both unitary and random disturbance by considering the parameter which characterizes the unwanted linear term present in the Hamiltonian as fixed (unitary disturbance) or random with a given probability distribution (random disturbance). We derive the optimal input Gaussian states at fixed energy, maximizing the quantum Fisher information over the squeezing angle and the squeezing energy fraction, and we discuss the scaling of the quantum Fisher information in terms of the output number of photons, n_out. We observe that, in the case of unitary disturbance, the optimal state is a squeezed vacuum state and the quadratic scaling is conserved. As regards the random disturbance, we observe that the optimal squeezing fraction may not be equal to one and, for any nonzero value of the noise parameter, the quantum Fisher information scales linearly with the average number of photons. Finally, we discuss the performance of homodyne measurement by comparing the achievable precision with the ultimate limit imposed by the quantum Cramér-Rao bound.

  13. Multivariate normal maximum likelihood with both ordinal and continuous variables, and data missing at random.

    PubMed

    Pritikin, Joshua N; Brick, Timothy R; Neale, Michael C

    2018-04-01

    A novel method for the maximum likelihood estimation of structural equation models (SEM) with both ordinal and continuous indicators is introduced using a flexible multivariate probit model for the ordinal indicators. A full information approach ensures unbiased estimates for data missing at random. Exceeding the capability of prior methods, up to 13 ordinal variables can be included before integration time increases beyond 1 s per row. The method relies on the axiom of conditional probability to split apart the distribution of continuous and ordinal variables. Due to the symmetry of the axiom, two similar methods are available. A simulation study provides evidence that the two similar approaches offer equal accuracy. A further simulation is used to develop a heuristic to automatically select the most computationally efficient approach. Joint ordinal continuous SEM is implemented in OpenMx, free and open-source software.

  14. Testing concordance of instrumental variable effects in generalized linear models with application to Mendelian randomization

    PubMed Central

    Dai, James Y.; Chan, Kwun Chuen Gary; Hsu, Li

    2014-01-01

    Instrumental variable regression is one way to overcome unmeasured confounding and estimate causal effects in observational studies. Built on structural mean models, considerable work has recently been devoted to consistent estimation of causal relative risk and causal odds ratio. Such models can sometimes suffer from identification issues for weak instruments, which has hampered the applicability of Mendelian randomization analysis in genetic epidemiology. When there are multiple genetic variants available as instrumental variables, and the causal effect is defined in a generalized linear model in the presence of unmeasured confounders, we propose to test concordance between instrumental variable effects on the intermediate exposure and instrumental variable effects on the disease outcome, as a means to test the causal effect. We show that a class of generalized least squares estimators provide valid and consistent tests of causality. For the causal effect of a continuous exposure on a dichotomous outcome in logistic models, the proposed estimators are shown to be asymptotically conservative. When the disease outcome is rare, such estimators are consistent due to the log-linear approximation of the logistic function. Optimality of such estimators relative to the well-known two-stage least squares estimator and the double-logistic structural mean model is further discussed. PMID:24863158

  15. Physical activity, mindfulness meditation, or heart rate variability biofeedback for stress reduction: a randomized controlled trial.

    PubMed

    van der Zwan, Judith Esi; de Vente, Wieke; Huizink, Anja C; Bögels, Susan M; de Bruin, Esther I

    2015-12-01

    In contemporary western societies stress is highly prevalent; the need for stress-reducing methods is therefore great. This randomized controlled trial compared the efficacy of self-help physical activity (PA), mindfulness meditation (MM), and heart rate variability biofeedback (HRV-BF) in reducing stress and its related symptoms. We randomly allocated 126 participants to PA, MM, or HRV-BF upon enrollment, of whom 76 agreed to participate. The interventions consisted of psycho-education and an introduction to the specific intervention techniques and 5 weeks of daily exercises at home. The PA exercises consisted of a vigorous-intensity activity of free choice. The MM exercises consisted of guided mindfulness meditation. The HRV-BF exercises consisted of slow breathing with a heart rate variability biofeedback device. Participants received daily reminders for their exercises and were contacted weekly to monitor their progress. They completed questionnaires prior to, directly after, and 6 weeks after the intervention. Results indicated an overall beneficial effect consisting of reduced stress, anxiety and depressive symptoms, and improved psychological well-being and sleep quality. No significant between-intervention effect was found, suggesting that PA, MM, and HRV-BF are equally effective in reducing stress and its related symptoms. These self-help interventions provide easily accessible help for people with stress complaints.

  16. Multivariate non-normally distributed random variables in climate research - introduction to the copula approach

    NASA Astrophysics Data System (ADS)

    Schölzel, C.; Friederichs, P.

    2008-10-01

    Probability distributions of multivariate random variables are generally more complex than their univariate counterparts, owing to possible nonlinear dependence between the random variables. One approach to this problem is the use of copulas, which have become popular over recent years, especially in fields such as econometrics, finance, risk management, and insurance. Since this newly emerging field includes various practices, a controversial discussion, and a vast literature, it is difficult to get an overview. The aim of this paper is therefore to provide a brief overview of copulas for application in meteorology and climate research. We examine the advantages and disadvantages compared to alternative approaches, such as mixture models, summarize the current problem of goodness-of-fit (GOF) tests for copulas, and discuss the connection with multivariate extremes. An application to station data shows the simplicity and the capabilities as well as the limitations of this approach. Observations of daily precipitation and temperature are fitted to a bivariate model and demonstrate that copulas are a valuable complement to the commonly used methods.
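
    The mechanics of the copula approach are compact enough to sketch: sample the dependence from a Gaussian copula, then impose arbitrary marginals by inverse-CDF transformation. The marginals and correlation below are illustrative assumptions, not fitted station data.

    ```python
    # Gaussian copula with arbitrary marginals (illustrative sketch).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    rho = 0.6
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=10_000)
    u = stats.norm.cdf(z)  # uniform margins carrying the Gaussian dependence

    temp = stats.norm.ppf(u[:, 0], loc=12.0, scale=5.0)    # temperature-like margin
    precip = stats.gamma.ppf(u[:, 1], a=2.0, scale=3.0)    # skewed, precipitation-like
    print("rank correlation:", stats.spearmanr(temp, precip)[0])
    ```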

  17. Stochastic effects in EUV lithography: random, local CD variability, and printing failures

    NASA Astrophysics Data System (ADS)

    De Bisschop, Peter

    2017-10-01

    Stochastic effects in lithography are usually quantified through local CD variability metrics, such as line-width roughness or local CD uniformity (LCDU), and these quantities have been measured and studied intensively, both in EUV and optical lithography. Next to the CD-variability, stochastic effects can also give rise to local, random printing failures, such as missing contacts or microbridges in spaces. When these occur, there often is no (reliable) CD to be measured locally, and then such failures cannot be quantified with the usual CD-measuring techniques. We have developed algorithms to detect such stochastic printing failures in regular line/space (L/S) or contact- or dot-arrays from SEM images, leading to a stochastic failure metric that we call NOK (not OK), which we consider a complementary metric to the CD-variability metrics. This paper will show how both types of metrics can be used to experimentally quantify dependencies of stochastic effects to, e.g., CD, pitch, resist, exposure dose, etc. As it is also important to be able to predict upfront (in the OPC verification stage of a production-mask tape-out) whether certain structures in the layout are likely to have a high sensitivity to stochastic effects, we look into the feasibility of constructing simple predictors, for both stochastic CD-variability and printing failure, that can be calibrated for the process and exposure conditions used and integrated into the standard OPC verification flow. Finally, we briefly discuss the options to reduce stochastic variability and failure, considering the entire patterning ecosystem.

  18. European Randomized Study of Screening for Prostate Cancer Risk Calculator: External Validation, Variability, and Clinical Significance.

    PubMed

    Gómez-Gómez, Enrique; Carrasco-Valiente, Julia; Blanca-Pedregosa, Ana; Barco-Sánchez, Beatriz; Fernandez-Rueda, Jose Luis; Molina-Abril, Helena; Valero-Rosa, Jose; Font-Ugalde, Pilar; Requena-Tapia, Maria José

    2017-04-01

    To externally validate the European Randomized Study of Screening for Prostate Cancer (ERSPC) risk calculator (RC) and to evaluate its variability between 2 consecutive prostate-specific antigen (PSA) values. We prospectively catalogued 1021 consecutive patients before prostate biopsy for suspicion of prostate cancer (PCa). The risk of PCa and significant PCa (Gleason score ≥7) from 749 patients was calculated according to ERSPC-RC (digital rectal examination-based version 3 of 4) for 2 consecutive PSA tests per patient. The calculators' predictions were analyzed using calibration plots and the area under the receiver operating characteristic curve (area under the curve). Cohen's kappa coefficient was used to compare ability and variability. Of 749 patients, PCa was detected in 251 (33.5%) and significant PCa was detected in 133 (17.8%). Calibration plots showed an acceptable parallelism and similar discrimination ability for both PSA levels with an area under the curve of 0.69 for PCa and 0.74 for significant PCa. The ERSPC showed 226 (30.2%) unnecessary biopsies with the loss of 10 significant PCa cases. The variability of the RC was 16% for PCa and 20% for significant PCa, and a higher variability was associated with a reduced risk of significant PCa. We can conclude that the performance of the ERSPC-RC in the present cohort shows high similarity between the 2 PSA levels; however, the RC variability value is associated with a decreased risk of significant PCa. The use of the ERSPC in our cohort detects a high number of unnecessary biopsies. Thus, the incorporation of ERSPC-RC could support the clinical decision to carry out a prostate biopsy. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Bubble CPAP versus CPAP with variable flow in newborns with respiratory distress: a randomized controlled trial.

    PubMed

    Yagui, Ana Cristina Zanon; Vale, Luciana Assis Pires Andrade; Haddad, Luciana Branco; Prado, Cristiane; Rossi, Felipe Souza; Deutsch, Alice D Agostini; Rebello, Celso Moura

    2011-01-01

    To evaluate the efficacy and safety of nasal continuous positive airway pressure (NCPAP) using devices with variable flow or bubble continuous positive airway pressure (CPAP) regarding CPAP failure, presence of air leaks, total CPAP and oxygen time, and length of intensive care unit and hospital stay in neonates with moderate respiratory distress (RD) and birth weight (BW) ≥ 1,500 g. Forty newborns requiring NCPAP were randomized into two study groups: variable flow group (VF) and continuous flow group (CF). The study was conducted between October 2008 and April 2010. Demographic data, CPAP failure, presence of air leaks, and total CPAP and oxygen time were recorded. Categorical outcomes were tested using the chi-square test or Fisher's exact test. Continuous variables were analyzed using the Mann-Whitney test. The level of significance was set at p < 0.05. There were no differences between the groups with regard to demographic data, CPAP failure (21.1 and 20.0% for VF and CF, respectively; p = 1.000), air leak syndrome (10.5 and 5.0%, respectively; p = 0.605), total CPAP time (median: 22.0 h, interquartile range [IQR]: 8.00-31.00 h and median: 22.0 h, IQR: 6.00-32.00 h, respectively; p = 0.822), and total oxygen time (median: 24.00 h, IQR: 7.00-85.00 h and median: 21.00 h, IQR: 9.50-66.75 h, respectively; p = 0.779). In newborns with BW ≥ 1,500 g and moderate RD, the use of continuous flow NCPAP showed the same benefits as the use of variable flow NCPAP.

  20. MODELING THE TIME VARIABILITY OF SDSS STRIPE 82 QUASARS AS A DAMPED RANDOM WALK

    SciT

    MacLeod, C. L.; Ivezic, Z.; Bullock, E.

    2010-10-01

    We model the time variability of ~9000 spectroscopically confirmed quasars in SDSS Stripe 82 as a damped random walk (DRW). Using 2.7 million photometric measurements collected over 10 yr, we confirm the results of Kelly et al. and Kozlowski et al. that this model can explain quasar light curves at an impressive fidelity level (0.01-0.02 mag). The DRW model provides a simple, fast (O(N) for N data points), and powerful statistical description of quasar light curves by a characteristic timescale (τ) and an asymptotic rms variability on long timescales (SF∞). We searched for correlations between these two variability parameters and physical parameters such as luminosity and black hole mass, and rest-frame wavelength. Our analysis shows SF∞ to increase with decreasing luminosity and rest-frame wavelength as observed previously, and without a correlation with redshift. We find a correlation between SF∞ and black hole mass with a power-law index of 0.18 ± 0.03, independent of the anti-correlation with luminosity. We find that τ increases with increasing wavelength with a power-law index of 0.17, remains nearly constant with redshift and luminosity, and increases with increasing black hole mass with a power-law index of 0.21 ± 0.07. The amplitude of variability is anti-correlated with the Eddington ratio, which suggests a scenario where optical fluctuations are tied to variations in the accretion rate. However, we find an additional dependence on luminosity and/or black hole mass that cannot be explained by the trend with Eddington ratio. The radio-loudest quasars have systematically larger variability amplitudes by about 30%, when corrected for the other observed trends, while the distribution of their characteristic timescale is indistinguishable from that of the full sample. We do not detect any statistically robust differences in the characteristic timescale and variability amplitude between
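
    The DRW is the Ornstein–Uhlenbeck process, so a light curve with a given τ and SF∞ can be simulated exactly even at irregular epochs. A short sketch with illustrative parameters:

    ```python
    # Damped random walk light curve via the exact OU update (sketch).
    import numpy as np

    rng = np.random.default_rng(7)
    tau, sf_inf, mean_mag = 200.0, 0.2, 19.0         # days, mag (illustrative)
    t = np.sort(rng.uniform(0.0, 3650.0, size=500))  # ~10 yr of irregular epochs

    m = np.empty_like(t)
    m[0] = mean_mag + rng.normal(0.0, sf_inf / np.sqrt(2))  # stationary start
    for i in range(1, len(t)):
        decay = np.exp(-(t[i] - t[i - 1]) / tau)
        var = 0.5 * sf_inf**2 * (1.0 - decay**2)     # stationary variance SF_inf^2 / 2
        m[i] = mean_mag + decay * (m[i - 1] - mean_mag) + rng.normal(0.0, np.sqrt(var))

    print("sample rms about mean:", round(m.std(), 3))  # ~ SF_inf / sqrt(2)
    ```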

  1. Effects of Yoga on Heart Rate Variability and Depressive Symptoms in Women: A Randomized Controlled Trial.

    PubMed

    Chu, I-Hua; Wu, Wen-Lan; Lin, I-Mei; Chang, Yu-Kai; Lin, Yuh-Jen; Yang, Pin-Chen

    2017-04-01

    The purpose of the study was to investigate the effects of a 12-week yoga program on heart rate variability (HRV) and depressive symptoms in depressed women. This was a randomized controlled trial. Twenty-six sedentary women scoring ≥14 on the Beck Depression Inventory-II were randomized to either the yoga or the control group. The yoga group completed a 12-week yoga program, which took place twice a week for 60 min per session and consisted of breathing exercises, yoga pose practice, and supine meditation/relaxation. The control group was instructed not to engage in any yoga practice and to maintain their usual level of physical activity during the course of the study. Participants' HRV, depressive symptoms, and perceived stress were assessed at baseline and post-test. The yoga group had a significant increase in high-frequency HRV and decreases in low-frequency HRV and low frequency/high frequency ratio after the intervention. The yoga group also reported significantly reduced depressive symptoms and perceived stress. No change was found in the control group. A 12-week yoga program was effective in increasing parasympathetic tone and reducing depressive symptoms and perceived stress in women with elevated depressive symptoms. Regular yoga practice may be recommended for women to cope with their depressive symptoms and stress and to improve their HRV.

  2. What variables are important in predicting bovine viral diarrhea virus? A random forest approach.

    PubMed

    Machado, Gustavo; Mendoza, Mariana Recamonde; Corbellini, Luis Gustavo

    2015-07-24

    Bovine viral diarrhea virus (BVDV) causes one of the most economically important diseases in cattle, and the virus is found worldwide. A better understanding of the disease-associated factors is a crucial step towards the definition of strategies for control and eradication. In this study we trained a random forest (RF) prediction model and performed variable importance analysis to identify factors associated with BVDV occurrence. In addition, we assessed the influence of feature selection on RF performance and evaluated its predictive power relative to other popular classifiers and to logistic regression. We found that the RF classification model resulted in an average error rate of 32.03% for the negative class (negative for BVDV) and 36.78% for the positive class (positive for BVDV). The RF model presented an area under the ROC curve of 0.702. Variable importance analysis revealed that important predictors of BVDV occurrence were: a) who inseminates the animals, b) the number of neighboring farms with cattle, and c) whether rectal palpation is performed routinely. Our results suggest that the use of machine learning algorithms, especially RF, is a promising methodology for the analysis of cross-sectional studies, presenting a satisfactory predictive power and the ability to identify predictors that represent potential risk factors for BVDV investigation. We examined classical predictors and found some new and hard-to-control practices that may lead to the spread of this disease within and among farms, mainly regarding poor or neglected reproduction management, which should be considered for disease control and eradication.

  3. Insight into Best Variables for COPD Case Identification: A Random Forests Analysis.

    PubMed

    Leidy, Nancy K; Malley, Karen G; Steenrod, Anna W; Mannino, David M; Make, Barry J; Bowler, Russ P; Thomashow, Byron M; Barr, R G; Rennard, Stephen I; Houfek, Julia F; Yawn, Barbara P; Han, Meilan K; Meldrum, Catherine A; Bacci, Elizabeth D; Walsh, John W; Martinez, Fernando

    This study is part of a larger, multi-method project to develop a questionnaire for identifying undiagnosed cases of chronic obstructive pulmonary disease (COPD) in primary care settings, with specific interest in the detection of patients with moderate to severe airway obstruction or risk of exacerbation. To examine 3 existing datasets for insight into key features of COPD that could be useful in the identification of undiagnosed COPD. Random forests analyses were applied to the following databases: COPD Foundation Peak Flow Study Cohort (N=5761), Burden of Obstructive Lung Disease (BOLD) Kentucky site (N=508), and COPDGene® (N=10,214). Four scenarios were examined to find the best, smallest sets of variables that distinguished cases and controls: (1) moderate to severe COPD (forced expiratory volume in 1 second [FEV1] <50% predicted) versus no COPD; (2) undiagnosed versus diagnosed COPD; (3) COPD with and without exacerbation history; and (4) clinically significant COPD (FEV1 <60% predicted or history of acute exacerbation) versus all others. From 4 to 8 variables were able to differentiate cases from controls, with sensitivity ≥73 (range: 73-90) and specificity >68 (range: 68-93). Across scenarios, the best models included age, smoking status or history, symptoms (cough, wheeze, phlegm), general or breathing-related activity limitation, episodes of acute bronchitis, and/or missed work days and non-work activities due to breathing or health. Results provide insight into variables that should be considered during the development of candidate items for a new questionnaire to identify undiagnosed cases of clinically significant COPD.

  4. Mendelian randomization with fine-mapped genetic data: Choosing from large numbers of correlated instrumental variables.

    PubMed

    Burgess, Stephen; Zuber, Verena; Valdes-Marquez, Elsa; Sun, Benjamin B; Hopewell, Jemma C

    2017-12-01

    Mendelian randomization uses genetic variants to make causal inferences about the effect of a risk factor on an outcome. With fine-mapped genetic data, there may be hundreds of genetic variants in a single gene region any of which could be used to assess this causal relationship. However, using too many genetic variants in the analysis can lead to spurious estimates and inflated Type 1 error rates. But if only a few genetic variants are used, then the majority of the data is ignored and estimates are highly sensitive to the particular choice of variants. We propose an approach based on summarized data only (genetic association and correlation estimates) that uses principal components analysis to form instruments. This approach has desirable theoretical properties: it takes the totality of data into account and does not suffer from numerical instabilities. It also has good properties in simulation studies: it is not particularly sensitive to varying the genetic variants included in the analysis or the genetic correlation matrix, and it does not have greatly inflated Type 1 error rates. Overall, the method gives estimates that are less precise than those from variable selection approaches (such as using a conditional analysis or pruning approach to select variants), but are more robust to seemingly arbitrary choices in the variable selection step. Methods are illustrated by an example using genetic associations with testosterone for 320 genetic variants to assess the effect of sex hormone related pathways on coronary artery disease risk, in which variable selection approaches give inconsistent inferences. © 2017 The Authors Genetic Epidemiology Published by Wiley Periodicals, Inc.

  5. Random field theory to interpret the spatial variability of lacustrine soils

    NASA Astrophysics Data System (ADS)

    Russo, Savino; Vessia, Giovanna

    2015-04-01

    The lacustrine soils are quaternary soils, dated from the Pleistocene to Holocene periods, generated in low-energy depositional environments and characterized by soil mixtures of clays, sands and silts with alternations of finer and coarser grain size layers. They are often encountered at shallow depth, filling several tens of meters of tectonic or erosive basins typically located in internal Apennine areas. The lacustrine deposits are often locally interbedded by detritic soils resulting from the failure of surrounding reliefs. Their heterogeneous lithology is associated with high spatial variability of physical and mechanical properties both along horizontal and vertical directions. The deterministic approach is still commonly adopted to accomplish the mechanical characterization of these heterogeneous soils, where undisturbed sampling is practically not feasible (if the incoherent fraction is prevalent) or not spatially representative (if the cohesive fraction prevails). The deterministic approach consists of performing in situ tests, such as Standard Penetration Tests (SPT) or Cone Penetration Tests (CPT), and deriving design parameters through "expert judgment" interpretation of the measured profiles. The readings of tip and lateral resistance (Rp and RL, respectively) are almost continuous but highly variable in terms of soil classification according to Schmertmann (1978). Thus, neglecting spatial variability is not a sound strategy for estimating spatially representative values of the physical and mechanical parameters of lacustrine soils for engineering applications. Hereafter, a method to characterize the spatial variability structure of the aforementioned profiles is presented. It is based on the theory of Random Fields (Vanmarcke 1984) applied to vertical readings of Rp from mechanical CPTs. The proposed method relies on regression analysis, by which the spatial mean trend and the fluctuations about this trend are derived. Moreover, the

  6. An AUC-based permutation variable importance measure for random forests

    PubMed Central

    2013-01-01

    Background The random forest (RF) method is a commonly used tool for classification with high dimensional data as well as for ranking candidate predictors based on the so-called random forest variable importance measures (VIMs). However the classification performance of RF is known to be suboptimal in case of strongly unbalanced data, i.e. data where response class sizes differ considerably. Suggestions were made to obtain better classification performance based either on sampling procedures or on cost sensitivity analyses. However to our knowledge the performance of the VIMs has not yet been examined in the case of unbalanced response classes. In this paper we explore the performance of the permutation VIM for unbalanced data settings and introduce an alternative permutation VIM based on the area under the curve (AUC) that is expected to be more robust towards class imbalance. Results We investigated the performance of the standard permutation VIM and of our novel AUC-based permutation VIM for different class imbalance levels using simulated data and real data. The results suggest that the new AUC-based permutation VIM outperforms the standard permutation VIM for unbalanced data settings while both permutation VIMs have equal performance for balanced data settings. Conclusions The standard permutation VIM loses its ability to discriminate between associated predictors and predictors not associated with the response for increasing class imbalance. It is outperformed by our new AUC-based permutation VIM for unbalanced data settings, while the performance of both VIMs is very similar in the case of balanced classes. The new AUC-based VIM is implemented in the R package party for the unbiased RF variant based on conditional inference trees. The codes implementing our study are available from the companion website: http://www.ibe.med.uni-muenchen.de/organisation/mitarbeiter/070_drittmittel/janitza/index.html. PMID:23560875

  7. An AUC-based permutation variable importance measure for random forests.

    PubMed

    Janitza, Silke; Strobl, Carolin; Boulesteix, Anne-Laure

    2013-04-05

    The random forest (RF) method is a commonly used tool for classification with high dimensional data as well as for ranking candidate predictors based on the so-called random forest variable importance measures (VIMs). However the classification performance of RF is known to be suboptimal in case of strongly unbalanced data, i.e. data where response class sizes differ considerably. Suggestions were made to obtain better classification performance based either on sampling procedures or on cost sensitivity analyses. However to our knowledge the performance of the VIMs has not yet been examined in the case of unbalanced response classes. In this paper we explore the performance of the permutation VIM for unbalanced data settings and introduce an alternative permutation VIM based on the area under the curve (AUC) that is expected to be more robust towards class imbalance. We investigated the performance of the standard permutation VIM and of our novel AUC-based permutation VIM for different class imbalance levels using simulated data and real data. The results suggest that the new AUC-based permutation VIM outperforms the standard permutation VIM for unbalanced data settings while both permutation VIMs have equal performance for balanced data settings. The standard permutation VIM loses its ability to discriminate between associated predictors and predictors not associated with the response for increasing class imbalance. It is outperformed by our new AUC-based permutation VIM for unbalanced data settings, while the performance of both VIMs is very similar in the case of balanced classes. The new AUC-based VIM is implemented in the R package party for the unbiased RF variant based on conditional inference trees. The codes implementing our study are available from the companion website: http://www.ibe.med.uni-muenchen.de/organisation/mitarbeiter/070_drittmittel/janitza/index.html.
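
    The AUC-based VIM itself is implemented in the R package party, as noted above; a rough scikit-learn analogue, scoring permutations by AUC rather than accuracy on unbalanced synthetic data, can be sketched as follows (an illustration, not the authors' implementation).

    ```python
    # Permutation importance scored by AUC vs. accuracy on unbalanced data.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.inspection import permutation_importance
    from sklearn.model_selection import train_test_split

    # Strongly unbalanced synthetic data (~5% positives).
    X, y = make_classification(n_samples=2000, n_features=20, n_informative=4,
                               weights=[0.95], random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

    rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
    imp_auc = permutation_importance(rf, X_te, y_te, scoring="roc_auc",
                                     n_repeats=20, random_state=0)
    imp_acc = permutation_importance(rf, X_te, y_te, scoring="accuracy",
                                     n_repeats=20, random_state=0)
    print("top feature by AUC-based VIM:     ", imp_auc.importances_mean.argmax())
    print("top feature by accuracy-based VIM:", imp_acc.importances_mean.argmax())
    ```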

  8. Variable order fractional Fokker-Planck equations derived from Continuous Time Random Walks

    NASA Astrophysics Data System (ADS)

    Straka, Peter

    2018-08-01

    Continuous Time Random Walk models (CTRW) of anomalous diffusion are studied, where the anomalous exponent β(x) ∈ (0, 1) varies in space. This type of situation occurs e.g. in biophysics, where the density of the intracellular matrix varies throughout a cell. Scaling limits of CTRWs are known to have probability distributions which solve fractional Fokker-Planck type equations (FFPE). This correspondence between stochastic processes and FFPE solutions has many useful extensions e.g. to nonlinear particle interactions and reactions, but has not yet been sufficiently developed for FFPEs of the "variable order" type with non-constant β(x). In this article, variable order FFPEs (VOFFPE) are derived from scaling limits of CTRWs. The key mathematical tool is the 1-1 correspondence of a CTRW scaling limit to a bivariate Langevin process, which tracks the cumulative sum of jumps in one component and the cumulative sum of waiting times in the other. The spatially varying anomalous exponent is modelled by spatially varying β(x)-stable Lévy noise in the waiting time component. The VOFFPE displays a spatially heterogeneous temporal scaling behaviour, with generalized diffusivity and drift coefficients whose units are length²/time^β(x) and length/time^β(x), respectively. A global change of the time scale results in a spatially varying change in diffusivity and drift. A consequence of the mathematical derivation of a VOFFPE from CTRW limits in this article is that a solution of a VOFFPE can be approximated via Monte Carlo simulations. Based on such simulations, we are able to confirm that the VOFFPE is consistent under a change of the global time scale.
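
    The Monte Carlo route mentioned at the end is easy to sketch. The toy CTRW below uses Gaussian jumps and Pareto waiting times with a position-dependent tail exponent β(x); Pareto waiting times share the stable law's tail, which is what matters for the scaling limit, and the form of β(x) is an illustrative assumption.

    ```python
    # Toy variable-order CTRW: Gaussian jumps, Pareto waiting times whose
    # tail exponent beta(x) depends on the current position (assumptions only).
    import numpy as np

    def beta(x):
        """Spatially varying anomalous exponent, kept inside (0, 1)."""
        return 0.5 + 0.3 * np.tanh(x)

    rng = np.random.default_rng(0)
    T, n_paths = 100.0, 5_000
    final = np.empty(n_paths)
    for k in range(n_paths):
        t, x = 0.0, 0.0
        while True:
            w = rng.uniform() ** (-1.0 / beta(x))  # Pareto: P(W > t) ~ t^-beta(x)
            if t + w > T:
                break
            t += w
            x += rng.normal()
        final[k] = x

    print("spread of walkers at time T:", round(final.std(), 2))
    ```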

  9. Divided dosing reduces prednisolone-induced hyperglycaemia and glycaemic variability: a randomized trial after kidney transplantation.

    PubMed

    Yates, Christopher J; Fourlanos, Spiros; Colman, Peter G; Cohney, Solomon J

    2014-03-01

    Prednisolone is a major risk factor for hyperglycaemia and new-onset diabetes after transplantation. Uncontrolled observational data suggest that divided dosing may reduce requirements for hypoglycaemic agents. This study aims to compare the glycaemic effects of divided twice daily (BD) and once daily (QD) prednisolone. Twenty-two kidney transplant recipients without diabetes were randomized to BD or QD prednisolone. Three weeks post-transplant, a continuous glucose monitor (iPro2®, Medtronic) was applied for 5 days with subjects continuing their initial prednisolone regimen (Days 1-2) before crossover to the alternative regimen. Mean glucose, peak glucose, nadir glucose, exposure to hyperglycaemia (glucose ≥7.8 mmol/L) and glycaemic variability were assessed. The mean ± standard deviation (SD) age of subjects was 50 ± 10 years and 77% were male. Median (interquartile range) daily prednisolone dose was 25 (20, 25) mg. BD prednisolone was associated with decreased mean glucose (mean 7.9 ± 1.7 versus 8.1 ± 2.3 mmol/L, P < 0.001), peak glucose [median 10.4 (9.5, 11.4) versus 11.4 (10.3, 13.4) mmol/L, P < 0.001] and exposure to hyperglycaemia [median 25.5 (14.6, 30.3) versus 40.4 (33.2, 51.2) mmol/L/h, P = 0.003]. Median glucose peaked between 14:55 and 15:05 h with BD and between 15:25 and 15:30 h with QD. Median glycaemic variability scores were decreased with BD: SD (1.1 versus 1.9, P < 0.001), mean amplitude of glycaemic excursion (1.5 versus 2.2, P = 0.001), continuous overlapping net glycaemic action-1 (CONGA-1; 1.0 versus 1.2, P = 0.039), CONGA-2 (1.2 versus 1.4, P = 0.008) and J-index (25 versus 31, P = 0.003). Split prednisolone dosing reduces glycaemic variability and hyperglycaemia early post-kidney transplant.

  10. The quotient of normal random variables and application to asset price fat tails

    NASA Astrophysics Data System (ADS)

    Caginalp, Carey; Caginalp, Gunduz

    2018-06-01

    The quotient of random variables with normal distributions is examined and proven to have power law decay, with density f(x) ≃ f0 x⁻², with the coefficient depending on the means and variances of the numerator and denominator and their correlation. We also obtain the conditional probability densities for each of the four quadrants given by the signs of the numerator and denominator for arbitrary correlation ρ ∈ [-1, 1). For ρ = -1 we obtain a particularly simple closed-form solution for all x ∈ R. The results are applied to a basic issue in economics and finance, namely the density of relative price changes. Classical finance stipulates a normal distribution of relative price changes, though empirical studies suggest a power law at the tail end. By considering the supply and demand in a basic price change model, we prove that the relative price change has a density that decays with an x⁻² power law. Various parameter limits are established.
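
    The x⁻² tail is easy to verify by simulation: for a density decaying as f0 x⁻², the survival function decays as f0/t, so t·P(X > t) should flatten to a constant (the parameters below are illustrative, not taken from the paper).

    ```python
    # Monte Carlo check of the x^-2 tail of a quotient of normals.
    import numpy as np

    rng = np.random.default_rng(3)
    num = rng.normal(1.0, 0.5, size=2_000_000)
    den = rng.normal(2.0, 0.8, size=2_000_000)
    x = num / den

    # If f(x) ~ f0 * x^-2 then P(X > t) ~ f0 / t, so t * P(X > t) flattens.
    for t in (5, 10, 20, 40):
        p = (x > t).mean()
        print(f"t = {t:3d}: P(X > t) = {p:.2e}, t * P = {t * p:.3f}")
    ```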

  11. Bell-Boole Inequality: Nonlocality or Probabilistic Incompatibility of Random Variables?

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2008-06-01

    The main aim of this report is to inform the quantum information community about investigations of the problem of probabilistic compatibility of a family of random variables: the possibility of realizing such a family on the basis of a single probability measure (constructing a single Kolmogorov probability space). These investigations were started over a hundred years ago by J. Boole (who invented Boolean algebras). The complete solution of the problem was obtained by the Soviet mathematician Vorobjev in the 1960s. Surprisingly, probabilists and statisticians obtained inequalities for probabilities and correlations among which one can find the famous Bell’s inequality and its generalizations. Such inequalities appeared simply as constraints for probabilistic compatibility. In this framework one cannot see a priori any link to such problems as nonlocality and “death of reality” which are typically linked to Bell-type inequalities in the physical literature. We analyze the difference between the positions of mathematicians and quantum physicists. In particular, we found that one of the most reasonable explanations of probabilistic incompatibility is the mixing, in Bell-type inequalities, of statistical data from a number of experiments performed under different experimental contexts.
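
    The flavor of such compatibility constraints can be shown in a few lines: for any four ±1-valued random variables defined on a single probability space, the CHSH combination is bounded by 2 in absolute value, and by convexity it suffices to check the deterministic assignments (the extreme points).

    ```python
    # CHSH bound as a probabilistic-compatibility constraint: for jointly
    # defined A1, A2, B1, B2 taking values +/-1, |A1B1 + A1B2 + A2B1 - A2B2| <= 2.
    from itertools import product

    max_s = max(abs(a1*b1 + a1*b2 + a2*b1 - a2*b2)
                for a1, a2, b1, b2 in product((-1, 1), repeat=4))
    print("max |S| over all joint +/-1 assignments:", max_s)  # -> 2
    ```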

  12. The behaviour of random forest permutation-based variable importance measures under predictor correlation.

    PubMed

    Nicodemus, Kristin K; Malley, James D; Strobl, Carolin; Ziegler, Andreas

    2010-02-27

    Random forests (RF) have been increasingly used in applications such as genome-wide association and microarray studies where predictor correlation is frequently observed. Recent works on permutation-based variable importance measures (VIMs) used in RF have come to apparently contradictory conclusions. We present an extended simulation study to synthesize results. In the case when both predictor correlation was present and predictors were associated with the outcome (HA), the unconditional RF VIM attributed a higher share of importance to correlated predictors, while under the null hypothesis that no predictors are associated with the outcome (H0) the unconditional RF VIM was unbiased. Conditional VIMs showed a decrease in VIM values for correlated predictors versus the unconditional VIMs under HA and was unbiased under H0. Scaled VIMs were clearly biased under HA and H0. Unconditional unscaled VIMs are a computationally tractable choice for large datasets and are unbiased under the null hypothesis. Whether the observed increased VIMs for correlated predictors may be considered a "bias" - because they do not directly reflect the coefficients in the generating model - or if it is a beneficial attribute of these VIMs is dependent on the application. For example, in genetic association studies, where correlation between markers may help to localize the functionally relevant variant, the increased importance of correlated predictors may be an advantage. On the other hand, we show examples where this increased importance may result in spurious signals.

  13. How Far Is Quasar UV/Optical Variability from a Damped Random Walk at Low Frequency?

    SciT

    Guo Hengxiao; Wang Junxian; Cai Zhenyi

    Studies have shown that UV/optical light curves of quasars can be described using the prevalent damped random walk (DRW) model, also known as the Ornstein–Uhlenbeck process. A white noise power spectral density (PSD) is expected at low frequency in this model; however, a direct observational constraint on the low-frequency PSD slope is difficult due to the limited lengths of the light curves available. Meanwhile, quasars show scatter in their DRW parameters that is too large to be attributed to uncertainties in the measurements and dependence on the variation of known physical factors. In this work we present simulations showing that, if the low-frequency PSD deviates from the DRW, the red noise leakage can naturally produce large scatter in the variation parameters measured from simulated light curves. The steeper the low-frequency PSD slope, the larger the scatter we expect. Based on observations of SDSS Stripe 82 quasars, we find that the low-frequency PSD slope should be no steeper than −1.3. The actual slope could be flatter, which consequently requires that quasar variability be influenced by other unknown factors. We speculate that the magnetic field and/or metallicity could be such additional factors.

  14. Standard errors and confidence intervals for variable importance in random forest regression, classification, and survival.

    PubMed

    Ishwaran, Hemant; Lu, Min

    2018-06-04

    Random forests are a popular nonparametric tree ensemble procedure with broad applications to data analysis. While its widespread popularity stems from its prediction performance, an equally important feature is that it provides a fully nonparametric measure of variable importance (VIMP). A current limitation of VIMP, however, is that no systematic method exists for estimating its variance. As a solution, we propose a subsampling approach that can be used to estimate the variance of VIMP and for constructing confidence intervals. The method is general enough that it can be applied to many useful settings, including regression, classification, and survival problems. Using extensive simulations, we demonstrate the effectiveness of the subsampling estimator and in particular find that the delete-d jackknife variance estimator, a close cousin, is especially effective under low subsampling rates due to its bias correction properties. These 2 estimators are highly competitive when compared with the .164 bootstrap estimator, a modified bootstrap procedure designed to deal with ties in out-of-sample data. Most importantly, subsampling is computationally fast, thus making it especially attractive for big data settings. Copyright © 2018 John Wiley & Sons, Ltd.
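
    A bare-bones version of the subsampling idea can be sketched with scikit-learn, using impurity-based importances in place of VIMP; this captures the procedure's spirit (refit on small subsamples, rescale the spread by b/n), not the authors' estimator or its delete-d jackknife refinement.

    ```python
    # Subsampling-based standard errors for variable importance (sketch).
    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor

    X, y = make_regression(n_samples=800, n_features=10, n_informative=3,
                           noise=5.0, random_state=0)
    n, b = X.shape[0], 200          # subsample size b << n
    rng = np.random.default_rng(0)

    imps = []
    for _ in range(50):
        idx = rng.choice(n, size=b, replace=False)
        rf = RandomForestRegressor(n_estimators=200, random_state=0)
        rf.fit(X[idx], y[idx])
        imps.append(rf.feature_importances_)
    imps = np.array(imps)

    # Subsampling variance estimate, rescaled by b/n (see the paper for theory).
    se = np.sqrt((b / n) * imps.var(axis=0, ddof=1))
    print("importance mean:", imps.mean(axis=0).round(3))
    print("importance SE:  ", se.round(3))
    ```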

  15. Variable stiffness colonoscope versus regular adult colonoscope: meta-analysis of randomized controlled trials.

    PubMed

    Othman, M O; Bradley, A G; Choudhary, A; Hoffman, R M; Roy, P K

    2009-01-01

    The variable stiffness colonoscope (VSC) may have theoretical advantages over standard adult colonoscopes (SACs), though data are conflicting. We conducted a meta-analysis to compare the efficacies of the VSC and SAC. We searched Medline (1966 - 2008) and abstracts of gastroenterology scientific meetings in the 5 years to February 2008, only for randomized clinical trials (RCTs) of adult patients. Trial quality was assessed using the Delphi list. In a meta-analysis with a fixed effects model, cecal intubation rates, cecal intubation times, abdominal pain scores, sedation used, and use of ancillary maneuvers, were compared in separate analyses, using weighted mean differences (WMDs), standardized mean differences (SMDs), or odds ratios (ORs). Seven RCTs satisfied the inclusion criteria (1923 patients), four comparing VSC with SAC procedures in adults, and three evaluating the pediatric VSC. There was no significant heterogeneity among the studies. The overall trial quality was adequate. Cecal intubation rate was higher with the use of VSC (OR = 2.08, 95 % confidence interval [CI] 1.29 to 3.36). The VSC was associated with lower abdominal pain scores and a decreased need for sedation during colonoscopy. Cecal intubation time was similar for the two colonoscope types (WMD = - 0.21 minutes, 95 % CI - 0.85 to 0.43). Because of the nature of the intervention, no studies were blinded. There was no universal method for using the VSC. Compared with the SAC, VSC use was associated with a higher cecal intubation rate, less abdominal pain, and decreased need for sedation. However, cecal intubation times were similar for the two colonoscope types.
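
    For readers unfamiliar with the pooling step, a fixed-effects, inverse-variance combination of per-trial log odds ratios looks like this (the counts below are made up for illustration, not the trial data from this meta-analysis).

    ```python
    # Fixed-effects (inverse-variance) pooling of log odds ratios (sketch).
    import numpy as np

    # (events_A, total_A, events_B, total_B) per trial -- hypothetical numbers.
    trials = [(95, 100, 85, 100), (180, 200, 160, 200), (48, 50, 40, 50)]

    log_or, weights = [], []
    for a, na, b, nb in trials:
        or_i = (a / (na - a)) / (b / (nb - b))
        var_i = 1/a + 1/(na - a) + 1/b + 1/(nb - b)  # Woolf variance of log OR
        log_or.append(np.log(or_i))
        weights.append(1.0 / var_i)

    pooled = np.average(log_or, weights=weights)
    se = 1.0 / np.sqrt(np.sum(weights))
    print(f"pooled OR = {np.exp(pooled):.2f}, "
          f"95% CI = ({np.exp(pooled - 1.96*se):.2f}, {np.exp(pooled + 1.96*se):.2f})")
    ```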

  16. Variable versus conventional lung protective mechanical ventilation during open abdominal surgery: study protocol for a randomized controlled trial.

    PubMed

    Spieth, Peter M; Güldner, Andreas; Uhlig, Christopher; Bluth, Thomas; Kiss, Thomas; Schultz, Marcus J; Pelosi, Paolo; Koch, Thea; Gama de Abreu, Marcelo

    2014-05-02

    General anesthesia usually requires mechanical ventilation, which is traditionally accomplished with constant tidal volumes in volume- or pressure-controlled modes. Experimental studies suggest that the use of variable tidal volumes (variable ventilation) recruits lung tissue, improves pulmonary function and reduces systemic inflammatory response. However, it is currently not known whether patients undergoing open abdominal surgery might benefit from intraoperative variable ventilation. The PROtective VARiable ventilation trial ('PROVAR') is a single center, randomized controlled trial enrolling 50 patients planned for open abdominal surgery expected to last longer than 3 hours. PROVAR compares conventional (non-variable) lung protective ventilation (CV) with variable lung protective ventilation (VV) regarding pulmonary function and inflammatory response. The primary endpoint of the study is the forced vital capacity on the first postoperative day. Secondary endpoints include further lung function tests, plasma cytokine levels, spatial distribution of ventilation assessed by means of electrical impedance tomography and postoperative pulmonary complications. We hypothesize that VV improves lung function and reduces systemic inflammatory response compared to CV in patients receiving mechanical ventilation during general anesthesia for open abdominal surgery longer than 3 hours. PROVAR is the first randomized controlled trial aiming at intra- and postoperative effects of VV on lung function. This study may help to define the role of VV during general anesthesia requiring mechanical ventilation. Clinicaltrials.gov NCT01683578 (registered on September 3, 2012).

  17. Pigeons' Choices between Fixed-Interval and Random-Interval Schedules: Utility of Variability?

    ERIC Educational Resources Information Center

    Andrzejewski, Matthew E.; Cardinal, Claudia D.; Field, Douglas P.; Flannery, Barbara A.; Johnson, Michael; Bailey, Kathleen; Hineline, Philip N.

    2005-01-01

    Pigeons' choosing between fixed-interval and random-interval schedules of reinforcement was investigated in three experiments using a discrete-trial procedure. In all three experiments, the random-interval schedule was generated by sampling a probability distribution at an interval (and in multiples of the interval) equal to that of the…

  18. Relative efficiency and sample size for cluster randomized trials with variable cluster sizes.

    PubMed

    You, Zhiying; Williams, O Dale; Aban, Inmaculada; Kabagambe, Edmond Kato; Tiwari, Hemant K; Cutter, Gary

    2011-02-01

    The statistical power of cluster randomized trials depends on two sample size components, the number of clusters per group and the number of individuals within clusters (cluster size). Variable cluster sizes are common and this variation alone may have a significant impact on study power. Previous approaches have taken this into account by either adjusting total sample size using a designated design effect or adjusting the number of clusters according to an assessment of the relative efficiency of unequal versus equal cluster sizes. This article defines a relative efficiency of unequal versus equal cluster sizes using noncentrality parameters, investigates properties of this measure, and proposes an approach for adjusting the required sample size accordingly. We focus on comparing two groups with normally distributed outcomes using the t-test, and use the noncentrality parameter to define the relative efficiency of unequal versus equal cluster sizes and show that statistical power depends only on this parameter for a given number of clusters. We calculate the sample size required for an unequal cluster sizes trial to have the same power as one with equal cluster sizes. Relative efficiency based on the noncentrality parameter is straightforward to calculate and easy to interpret. It connects the required mean cluster size directly to the required sample size with equal cluster sizes. Consequently, our approach first determines the sample size requirements with equal cluster sizes for a pre-specified study power and then calculates the required mean cluster size while keeping the number of clusters unchanged. Our approach allows adjustment in mean cluster size alone or simultaneous adjustment in mean cluster size and number of clusters, and is a flexible alternative to and a useful complement to existing methods. Comparison indicated that we have defined a relative efficiency that is greater than the relative efficiency in the literature under some conditions. Our measure
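
    A quick numerical feel for why cluster-size variation matters can be had from the common design-effect approximation DE = 1 + ((cv² + 1)·m̄ − 1)·ICC from the cluster-trial literature (this is a standard approximation, not this article's noncentrality-parameter formulation; the cluster sizes and ICC below are hypothetical).

    ```python
    # Effective sample size under equal vs. variable cluster sizes (sketch).
    import numpy as np

    sizes = np.array([10, 12, 35, 8, 20, 15, 40, 9, 11, 30])  # hypothetical clusters
    icc = 0.05
    m = sizes.mean()
    cv = sizes.std(ddof=1) / m

    de_equal = 1 + (m - 1) * icc                   # classic design effect
    de_var = 1 + ((cv**2 + 1) * m - 1) * icc       # variable-size approximation
    n_total = sizes.sum()
    print(f"effective n, equal sizes:    {n_total / de_equal:.1f}")
    print(f"effective n, variable sizes: {n_total / de_var:.1f}")
    print(f"relative efficiency:         {de_equal / de_var:.3f}")
    ```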

  19. Heart rate variability biofeedback in patients with alcohol dependence: a randomized controlled study

    PubMed Central

    Penzlin, Ana Isabel; Siepmann, Timo; Illigens, Ben Min-Woo; Weidner, Kerstin; Siepmann, Martin

    2015-01-01

    Background and objective In patients with alcohol dependence, ethyl-toxic damage of vasomotor and cardiac autonomic nerve fibers leads to autonomic imbalance with neurovascular and cardiac dysfunction, the latter resulting in reduced heart rate variability (HRV). Autonomic imbalance is linked to increased craving and cardiovascular mortality. In this study, we sought to assess the effects of HRV biofeedback training on HRV, vasomotor function, craving, and anxiety. Methods We conducted a randomized controlled study in 48 patients (14 females, ages 25–59 years) undergoing inpatient rehabilitation treatment. In the treatment group, patients (n=24) attended six sessions of HRV biofeedback over 2 weeks in addition to standard rehabilitative care, whereas, in the control group, subjects received standard care only. Psychometric testing for craving (Obsessive Compulsive Drinking Scale), anxiety (Symptom Checklist-90-Revised), HRV assessment using coefficient of variation of R-R intervals (CVNN) analysis, and vasomotor function assessment using laser Doppler flowmetry were performed at baseline, immediately after completion of treatment or control period, and 3 and 6 weeks afterward (follow-ups 1 and 2). Results Psychometric testing showed decreased craving in the biofeedback group immediately postintervention (OCDS scores: 8.6±7.9 post-biofeedback versus 13.7±11.0 baseline [mean ± standard deviation], P<0.05), whereas craving was unchanged at this time point in the control group. Anxiety was reduced at follow-ups 1 and 2 post-biofeedback, but was unchanged in the control group (P<0.05). Following biofeedback, CVNN tended to be increased (10.3%±2.8% post-biofeedback, 10.1%±3.5% follow-up 1, 10.1%±2.9% follow-up 2 versus 9.7%±3.6% baseline; P=not significant). There was no such trend in the control group. Vasomotor function assessed using the mean duration to 50% vasoconstriction of cutaneous vessels after deep inspiration was improved following biofeedback

  20. Simple, Efficient Estimators of Treatment Effects in Randomized Trials Using Generalized Linear Models to Leverage Baseline Variables

    PubMed Central

    Rosenblum, Michael; van der Laan, Mark J.

    2010-01-01

    Models, such as logistic regression and Poisson regression models, are often used to estimate treatment effects in randomized trials. These models leverage information in variables collected before randomization, in order to obtain more precise estimates of treatment effects. However, there is the danger that model misspecification will lead to bias. We show that certain easy to compute, model-based estimators are asymptotically unbiased even when the working model used is arbitrarily misspecified. Furthermore, these estimators are locally efficient. As a special case of our main result, we consider a simple Poisson working model containing only main terms; in this case, we prove the maximum likelihood estimate of the coefficient corresponding to the treatment variable is an asymptotically unbiased estimator of the marginal log rate ratio, even when the working model is arbitrarily misspecified. This is the log-linear analog of ANCOVA for linear models. Our results demonstrate one application of targeted maximum likelihood estimation. PMID:20628636
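
    The Poisson special case can be checked directly by simulation: fit a main-terms Poisson working model to data generated from a different (non-main-terms) mechanism and observe that the treatment coefficient still recovers the marginal log rate ratio. A sketch assuming statsmodels; the true marginal log rate ratio below is 0.5 by construction.

    ```python
    # Misspecified main-terms Poisson working model in a simulated RCT.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 5000
    w = rng.normal(size=n)                             # baseline covariate
    a = rng.integers(0, 2, size=n)                     # randomized treatment
    rate = np.exp(-1.0 + 0.5 * a + 0.8 * np.tanh(w))   # true model is NOT main-terms
    y = rng.poisson(rate)

    X = sm.add_constant(np.column_stack([a, w]))       # misspecified working model
    fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    print("treatment coefficient (approx. marginal log rate ratio):",
          round(fit.params[1], 3))                     # should be near 0.5
    ```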

  1. Models of multidimensional discrete distribution of probabilities of random variables in information systems

    NASA Astrophysics Data System (ADS)

    Gromov, Yu Yu; Minin, Yu V.; Ivanova, O. G.; Morozova, O. N.

    2018-03-01

    Multidimensional discrete probability distributions of independent random variables were obtained. Their one-dimensional counterparts are widely used in probability theory. Generating functions of these multidimensional distributions were also derived.

  2. Implementation of Instrumental Variable Bounds for Data Missing Not at Random.

    PubMed

    Marden, Jessica R; Wang, Linbo; Tchetgen, Eric J Tchetgen; Walter, Stefan; Glymour, M Maria; Wirth, Kathleen E

    2018-05-01

    Instrumental variables are routinely used to recover a consistent estimator of an exposure causal effect in the presence of unmeasured confounding. Instrumental variable approaches to account for nonignorable missing data also exist but are less familiar to epidemiologists. Like instrumental variables for exposure causal effects, instrumental variables for missing data rely on exclusion restriction and instrumental variable relevance assumptions. Yet these two conditions alone are insufficient for point identification. For estimation, researchers have invoked a third assumption, typically involving fairly restrictive parametric constraints. Inferences can be sensitive to these parametric assumptions, which are typically not empirically testable. The purpose of our article is to discuss another approach for leveraging a valid instrumental variable. Although the approach is insufficient for nonparametric identification, it can nonetheless provide informative inferences about the presence, direction, and magnitude of selection bias, without invoking a third untestable parametric assumption. An important contribution of this article is an Excel spreadsheet tool that can be used to obtain empirical evidence of selection bias and calculate bounds and corresponding Bayesian 95% credible intervals for a nonidentifiable population proportion. For illustrative purposes, we used the spreadsheet tool to analyze HIV prevalence data collected by the 2007 Zambia Demographic and Health Survey (DHS).
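
    The bounding logic behind the spreadsheet tool can be sketched numerically: worst-case (Manski-type) bounds are computed within each level of the instrument, and under the exclusion restriction the true proportion must lie in their intersection. The response rates and prevalences below are hypothetical, not the DHS data.

    ```python
    # Instrument-tightened worst-case bounds for a proportion (sketch).
    # Per instrument level: (P(response | Z=z), P(Y=1 | responded, Z=z)).
    levels = [(0.70, 0.12), (0.85, 0.11), (0.95, 0.10)]

    lowers, uppers = [], []
    for resp, prev in levels:
        lowers.append(prev * resp)               # all nonresponders negative
        uppers.append(prev * resp + (1 - resp))  # all nonresponders positive

    # Under the IV assumptions the true prevalence lies in every level's interval.
    print(f"intersected bounds: [{max(lowers):.3f}, {min(uppers):.3f}]")
    ```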

  3. Non-Random Variability in Functional Composition of Coral Reef Fish Communities along an Environmental Gradient.

    PubMed

    Plass-Johnson, Jeremiah G; Taylor, Marc H; Husain, Aidah A A; Teichberg, Mirta C; Ferse, Sebastian C A

    2016-01-01

    Changes in the coral reef complex can affect predator-prey relationships, resource availability and niche utilisation in the associated fish community, which may be reflected in decreased stability of the functional traits present in a community. This is because particular traits may be favoured by a changing environment, or by habitat degradation. Furthermore, other traits can be selected against because degradation can relax the association between fishes and benthic habitat. We characterised six important ecological traits for fish species occurring at seven sites across a disturbed coral reef archipelago in Indonesia, where reefs have been exposed to eutrophication and destructive fishing practices for decades. Functional diversity was assessed using two complementary indices (FRic and RaoQ) and correlated to important environmental factors (live coral cover and rugosity, representing local reef health, and distance from shore, representing a cross-shelf environmental gradient). Indices were examined for both a change in their mean, as well as temporal (short-term; hours) and spatial (cross-shelf) variability, to assess whether fish-habitat association became relaxed along with habitat degradation. Furthermore, variability in individual traits was examined to identify the traits that are most affected by habitat change. Increases in the general reef health indicators, live coral cover and rugosity (correlated with distance from the mainland), were associated with decreases in the variability of functional diversity and with community-level changes in the abundance of several traits (notably home range size, maximum length, microalgae, detritus and small invertebrate feeding and reproductive turnover). A decrease in coral cover increased variability of RaoQ while rugosity and distance both inversely affected variability of FRic; however, averages for these indices did not reveal patterns associated with the environment. These results suggest that increased

  4. Non-Random Variability in Functional Composition of Coral Reef Fish Communities along an Environmental Gradient

    PubMed Central

    Plass-Johnson, Jeremiah G.; Taylor, Marc H.; Husain, Aidah A. A.; Teichberg, Mirta C.; Ferse, Sebastian C. A.

    2016-01-01

    Changes in the coral reef complex can affect predator-prey relationships, resource availability and niche utilisation in the associated fish community, which may be reflected in decreased stability of the functional traits present in a community. This is because particular traits may be favoured by a changing environment, or by habitat degradation. Furthermore, other traits can be selected against because degradation can relax the association between fishes and benthic habitat. We characterised six important ecological traits for fish species occurring at seven sites across a disturbed coral reef archipelago in Indonesia, where reefs have been exposed to eutrophication and destructive fishing practices for decades. Functional diversity was assessed using two complementary indices (FRic and RaoQ) and correlated to important environmental factors (live coral cover and rugosity, representing local reef health, and distance from shore, representing a cross-shelf environmental gradient). Indices were examined for both a change in their mean, as well as temporal (short-term; hours) and spatial (cross-shelf) variability, to assess whether fish-habitat association became relaxed along with habitat degradation. Furthermore, variability in individual traits was examined to identify the traits that are most affected by habitat change. Increases in the general reef health indicators, live coral cover and rugosity (correlated with distance from the mainland), were associated with decreases in the variability of functional diversity and with community-level changes in the abundance of several traits (notably home range size, maximum length, microalgae, detritus and small invertebrate feeding and reproductive turnover). A decrease in coral cover increased variability of RaoQ while rugosity and distance both inversely affected variability of FRic; however, averages for these indices did not reveal patterns associated with the environment. These results suggest that increased

  5. General Exact Solution to the Problem of the Probability Density for Sums of Random Variables

    NASA Astrophysics Data System (ADS)

    Tribelsky, Michael I.

    2002-07-01

The exact explicit expression for the probability density pN(x) for a sum of N random, arbitrarily correlated summands is obtained. The expression is valid for any number N and any distribution of the random summands. Most attention is paid to the application of the developed approach to the case of independent and identically distributed summands. The obtained results reproduce all known exact solutions valid for the so-called stable distributions of the summands. It is also shown that if the distribution is not stable, the profile of pN(x) may be divided into three parts, namely a core (small x), a tail (large x), and a crossover from the core to the tail (moderate x). The quantitative description of all three parts as well as that for the entire profile is obtained. A number of particular examples are considered in detail.

  6. General exact solution to the problem of the probability density for sums of random variables.

    PubMed

    Tribelsky, Michael I

    2002-08-12

The exact explicit expression for the probability density p(N)(x) for a sum of N random, arbitrarily correlated summands is obtained. The expression is valid for any number N and any distribution of the random summands. Most attention is paid to the application of the developed approach to the case of independent and identically distributed summands. The obtained results reproduce all known exact solutions valid for the so-called stable distributions of the summands. It is also shown that if the distribution is not stable, the profile of p(N)(x) may be divided into three parts, namely a core (small x), a tail (large x), and a crossover from the core to the tail (moderate x). The quantitative description of all three parts as well as that for the entire profile is obtained. A number of particular examples are considered in detail.
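
As a numerical companion to the core/tail picture described above (not the paper's analytic solution), the sketch below builds p_N(x) for a sum of N iid exponential summands, a non-stable case, by repeated convolution, and compares it with the Gaussian core predicted by the central limit theorem.

```python
import numpy as np

dx = 0.01
x = np.arange(0, 60, dx)
p1 = np.exp(-x)                              # exponential summand (mean 1, var 1)

N = 8
pN = p1.copy()
for _ in range(N - 1):                       # density of the N-fold sum
    pN = np.convolve(pN, p1)[: len(x)] * dx

gauss = np.exp(-((x - N) ** 2) / (2 * N)) / np.sqrt(2 * np.pi * N)
i_core, i_tail = int(N / dx), int(3 * N / dx)
print("at the mean x=N:  p_N =", pN[i_core], " Gaussian core =", gauss[i_core])
print("far tail x=3N:    p_N =", pN[i_tail], " Gaussian core =", gauss[i_tail])
# The tail of p_N stays exponential-like, far heavier than the Gaussian core.
```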

  7. Motor Variability Arises from a Slow Random Walk in Neural State

    PubMed Central

    Chaisanguanthum, Kris S.; Shen, Helen H.

    2014-01-01

Even well-practiced movements cannot be repeated without variability. This variability is thought to reflect “noise” in movement preparation or execution. However, we show that, for both professional baseball pitchers and macaque monkeys making reaching movements, motor variability can be decomposed into two statistical components, a slowly drifting mean and fast trial-by-trial fluctuations about the mean. The preparatory activity of dorsal premotor cortex/primary motor cortex neurons in monkey exhibits similar statistics. Although the neural and behavioral drifts appear to be correlated, neural activity does not account for trial-by-trial fluctuations in movement, which must arise elsewhere, likely downstream. The statistics of this drift are well modeled by a double-exponential autocorrelation function, with time constants similar across the neural and behavioral drifts in two monkeys, as well as the drifts observed in baseball pitching. These time constants can be explained by an error-corrective learning process and agree with learning rates measured directly in previous experiments. Together, these results suggest that the central contributions to movement variability are not simply trial-by-trial fluctuations but are rather the result of longer-timescale processes that may arise from motor learning. PMID:25186752

  8. Using instrumental variables to disentangle treatment and placebo effects in blinded and unblinded randomized clinical trials influenced by unmeasured confounders

    NASA Astrophysics Data System (ADS)

    Chaibub Neto, Elias

    2016-11-01

Clinical trials traditionally employ blinding as a design mechanism to reduce the influence of placebo effects. In practice, however, it can be difficult or impossible to blind study participants, and unblinded trials are common in medical research. Here we show how instrumental variables can be used to quantify and disentangle treatment and placebo effects in randomized clinical trials comparing control and active treatments in the presence of confounders. The key idea is to use randomization to separately manipulate treatment assignment and psychological encouragement conversations/interactions that increase the participants’ desire for improved symptoms. The proposed approach is able to improve the estimation of treatment effects in blinded studies and, most importantly, opens the door to accounting for placebo effects in unblinded trials.

  9. Computing approximate random Delta v magnitude probability densities. [for spacecraft trajectory correction

    NASA Technical Reports Server (NTRS)

    Chadwick, C.

    1984-01-01

This paper describes the development and use of an algorithm to compute approximate statistics of the magnitude of a single random trajectory correction maneuver (TCM) Delta v vector. The TCM Delta v vector is modeled as a three-component Cartesian vector, each of whose components is a random variable having a normal (Gaussian) distribution with zero mean and possibly unequal standard deviations. The algorithm uses these standard deviations as input to produce approximations to (1) the mean and standard deviation of the magnitude of Delta v, (2) points of the probability density function of the magnitude of Delta v, and (3) points of the cumulative and inverse cumulative distribution functions of Delta v. The approximations are based on Monte Carlo techniques developed in a previous paper by the author and extended here. The algorithm described is expected to be useful in both pre-flight planning and in-flight analysis of maneuver propellant requirements for space missions.
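
A minimal Monte Carlo sketch of the quantities listed above, with placeholder standard deviations rather than mission values:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = np.array([1.0, 0.5, 0.2])            # per-axis std devs, illustrative only
dv = rng.normal(0.0, sigma, size=(100_000, 3))   # zero-mean Gaussian components
mag = np.linalg.norm(dv, axis=1)                 # |Delta v| per sample

print("mean |dv|:", mag.mean())
print("std  |dv|:", mag.std())
for q in (0.50, 0.90, 0.99):                 # inverse cumulative distribution points
    print(f"{q:.0%} quantile:", np.quantile(mag, q))
```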

  10. Probabilistic SSME blades structural response under random pulse loading

    NASA Technical Reports Server (NTRS)

    Shiao, Michael; Rubinstein, Robert; Nagpal, Vinod K.

    1987-01-01

    The purpose is to develop models of random impacts on a Space Shuttle Main Engine (SSME) turbopump blade and to predict the probabilistic structural response of the blade to these impacts. The random loading is caused by the impact of debris. The probabilistic structural response is characterized by distribution functions for stress and displacements as functions of the loading parameters which determine the random pulse model. These parameters include pulse arrival, amplitude, and location. The analysis can be extended to predict level crossing rates. This requires knowledge of the joint distribution of the response and its derivative. The model of random impacts chosen allows the pulse arrivals, pulse amplitudes, and pulse locations to be random. Specifically, the pulse arrivals are assumed to be governed by a Poisson process, which is characterized by a mean arrival rate. The pulse intensity is modelled as a normally distributed random variable with a zero mean chosen independently at each arrival. The standard deviation of the distribution is a measure of pulse intensity. Several different models were used for the pulse locations. For example, three points near the blade tip were chosen at which pulses were allowed to arrive with equal probability. Again, the locations were chosen independently at each arrival. The structural response was analyzed both by direct Monte Carlo simulation and by a semi-analytical method.
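
A hedged sketch of the load model as described (parameter values are illustrative, not the study's): Poisson pulse arrivals, zero-mean Gaussian amplitudes, and one of three equiprobable tip locations per pulse. Such draws would feed the Monte Carlo response simulation mentioned at the end of the abstract.

```python
import numpy as np

rng = np.random.default_rng(2)
rate = 20.0            # mean pulse arrival rate [1/s], illustrative
T = 1.0                # observation window [s]
sigma_amp = 5.0        # pulse intensity: std of the zero-mean amplitude

n_pulses = rng.poisson(rate * T)                        # Poisson arrival count
arrivals = np.sort(rng.uniform(0.0, T, n_pulses))       # arrival times
amplitudes = rng.normal(0.0, sigma_amp, n_pulses)       # zero-mean, iid amplitudes
locations = rng.integers(0, 3, n_pulses)                # 3 equiprobable tip points

for t, a, loc in zip(arrivals[:5], amplitudes[:5], locations[:5]):
    print(f"t = {t:.3f} s, amplitude = {a:+.2f}, location index = {loc}")
```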

  11. A new mean estimator using auxiliary variables for randomized response models

    NASA Astrophysics Data System (ADS)

    Ozgul, Nilgun; Cingi, Hulya

    2013-10-01

Randomized response models (RRMs) are commonly used in surveys dealing with sensitive questions such as abortion, alcoholism, sexual orientation, drug taking, annual income, and tax evasion, to ensure interviewee anonymity and to reduce nonresponse rates and biased responses. Starting from the pioneering work of Warner [7], many versions of RRMs have been developed that can deal with quantitative responses. In this study, a new mean estimator is suggested for RRMs involving quantitative responses. The mean square error is derived, and a simulation study is performed to show the efficiency of the proposed estimator relative to other existing estimators in RRMs.
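
The proposed estimator itself, which uses auxiliary variables, is not reproduced here. The sketch below only illustrates the basic mechanics of a quantitative randomized response survey with additive scrambling, where the sensitive mean is recovered by subtracting the known mean of the scrambling noise; all distributions are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000
y = rng.gamma(shape=2.0, scale=10.0, size=n)   # true sensitive values (never seen)
s = rng.normal(5.0, 2.0, size=n)               # scrambling noise with known E[S] = 5
z = y + s                                      # the only quantity the interviewer sees

y_hat = z.mean() - 5.0                         # de-scrambled mean estimate
print("true mean:", y.mean(), " randomized-response estimate:", y_hat)
```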

  12. A large-scale study of the random variability of a coding sequence: a study on the CFTR gene.

    PubMed

    Modiano, Guido; Bombieri, Cristina; Ciminelli, Bianca Maria; Belpinati, Francesca; Giorgi, Silvia; Georges, Marie des; Scotet, Virginie; Pompei, Fiorenza; Ciccacci, Cinzia; Guittard, Caroline; Audrézet, Marie Pierre; Begnini, Angela; Toepfer, Michael; Macek, Milan; Ferec, Claude; Claustres, Mireille; Pignatti, Pier Franco

    2005-02-01

    Coding single nucleotide substitutions (cSNSs) have been studied on hundreds of genes using small samples (n(g) approximately 100-150 genes). In the present investigation, a large random European population sample (average n(g) approximately 1500) was studied for a single gene, the CFTR (Cystic Fibrosis Transmembrane conductance Regulator). The nonsynonymous (NS) substitutions exhibited, in accordance with previous reports, a mean probability of being polymorphic (q > 0.005), much lower than that of the synonymous (S) substitutions, but they showed a similar rate of subpolymorphic (q < 0.005) variability. This indicates that, in autosomal genes that may have harmful recessive alleles (nonduplicated genes with important functions), genetic drift overwhelms selection in the subpolymorphic range of variability, making disadvantageous alleles behave as neutral. These results imply that the majority of the subpolymorphic nonsynonymous alleles of these genes are selectively negative or even pathogenic.

  13. A randomized controlled trial investigating the effects of craniosacral therapy on pain and heart rate variability in fibromyalgia patients.

    PubMed

    Castro-Sánchez, Adelaida María; Matarán-Peñarrocha, Guillermo A; Sánchez-Labraca, Nuria; Quesada-Rubio, José Manuel; Granero-Molina, José; Moreno-Lorenzo, Carmen

    2011-01-01

    Fibromyalgia is a prevalent musculoskeletal disorder associated with widespread mechanical tenderness, fatigue, non-refreshing sleep, depressed mood and pervasive dysfunction of the autonomic nervous system: tachycardia, postural intolerance, Raynaud's phenomenon and diarrhoea. To determine the effects of craniosacral therapy on sensitive tender points and heart rate variability in patients with fibromyalgia. A randomized controlled trial. Ninety-two patients with fibromyalgia were randomly assigned to an intervention group or placebo group. Patients received treatments for 20 weeks. The intervention group underwent a craniosacral therapy protocol and the placebo group received sham treatment with disconnected magnetotherapy equipment. Pain intensity levels were determined by evaluating tender points, and heart rate variability was recorded by 24-hour Holter monitoring. After 20 weeks of treatment, the intervention group showed significant reduction in pain at 13 of the 18 tender points (P < 0.05). Significant differences in temporal standard deviation of RR segments, root mean square deviation of temporal standard deviation of RR segments and clinical global impression of improvement versus baseline values were observed in the intervention group but not in the placebo group. At two months and one year post therapy, the intervention group showed significant differences versus baseline in tender points at left occiput, left-side lower cervical, left epicondyle and left greater trochanter and significant differences in temporal standard deviation of RR segments, root mean square deviation of temporal standard deviation of RR segments and clinical global impression of improvement. Craniosacral therapy improved medium-term pain symptoms in patients with fibromyalgia.

  14. Intrinsic Cellular Properties and Connectivity Density Determine Variable Clustering Patterns in Randomly Connected Inhibitory Neural Networks

    PubMed Central

    Rich, Scott; Booth, Victoria; Zochowski, Michal

    2016-01-01

The plethora of inhibitory interneurons in the hippocampus and cortex plays a pivotal role in generating rhythmic activity by clustering and synchronizing cell firing. Results of our simulations demonstrate that both the intrinsic cellular properties of neurons and the degree of network connectivity affect the characteristics of clustered dynamics exhibited in randomly connected, heterogeneous inhibitory networks. We quantify intrinsic cellular properties by the neuron's current-frequency relation (IF curve) and Phase Response Curve (PRC), a measure of how perturbations given at various phases of a neuron's firing cycle affect subsequent spike timing. We analyze network bursting properties of networks of neurons with Type I or Type II properties in both excitability and PRC profile; Type I PRCs strictly show phase advances and IF curves that exhibit frequencies arbitrarily close to zero at firing threshold, while Type II PRCs display both phase advances and delays and IF curves that have a non-zero frequency at threshold. Type II neurons whose properties arise with or without an M-type adaptation current are considered. We analyze network dynamics under different levels of cellular heterogeneity and as intrinsic cellular firing frequency and the time scale of decay of synaptic inhibition are varied. Many of the dynamics exhibited by these networks diverge from the predictions of the interneuron network gamma (ING) mechanism, as well as from results in all-to-all connected networks. Our results show that randomly connected networks of Type I neurons synchronize into a single cluster of active neurons while networks of Type II neurons organize into two mutually exclusive clusters segregated by the cells' intrinsic firing frequencies. Networks of Type II neurons containing the adaptation current behave similarly to networks of either Type I or Type II neurons depending on network parameters; however, the adaptation current creates differences in the cluster dynamics

  15. Spatiotemporal dynamics of random stimuli account for trial-to-trial variability in perceptual decision making

    PubMed Central

    Park, Hame; Lueckmann, Jan-Matthis; von Kriegstein, Katharina; Bitzer, Sebastian; Kiebel, Stefan J.

    2016-01-01

    Decisions in everyday life are prone to error. Standard models typically assume that errors during perceptual decisions are due to noise. However, it is unclear how noise in the sensory input affects the decision. Here we show that there are experimental tasks for which one can analyse the exact spatio-temporal details of a dynamic sensory noise and better understand variability in human perceptual decisions. Using a new experimental visual tracking task and a novel Bayesian decision making model, we found that the spatio-temporal noise fluctuations in the input of single trials explain a significant part of the observed responses. Our results show that modelling the precise internal representations of human participants helps predict when perceptual decisions go wrong. Furthermore, by modelling precisely the stimuli at the single-trial level, we were able to identify the underlying mechanism of perceptual decision making in more detail than standard models. PMID:26752272

  16. Evaluation of a Class of Simple and Effective Uncertainty Methods for Sparse Samples of Random Variables and Functions

    SciT

    Romero, Vicente; Bonney, Matthew; Schroeder, Benjamin

When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: central 95% of response; and 10^-4 probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depends on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large data base and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.
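
As a small demonstration of the motivating under-estimation problem (not of the report's methods): with only 10 samples from a skewed distribution, the naive sample range rarely bounds the central 95% of the population.

```python
import numpy as np
from math import erf, log, sqrt

def std_normal_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

rng = np.random.default_rng(4)
n_samples, n_trials = 10, 10_000
cover = []
for _ in range(n_trials):
    s = rng.lognormal(0.0, 1.0, n_samples)        # skewed source distribution
    # true probability mass of the lognormal inside the sample range:
    frac = std_normal_cdf(log(s.max())) - std_normal_cdf(log(s.min()))
    cover.append(frac >= 0.95)
print("fraction of trials whose sample range bounds the central 95%:",
      np.mean(cover))    # roughly 0.09 for n = 10
```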

  17. Randomized Trial of a Lifestyle Physical Activity Intervention for Breast Cancer Survivors: Effects on Transtheoretical Model Variables.

    PubMed

    Scruggs, Stacie; Mama, Scherezade K; Carmack, Cindy L; Douglas, Tommy; Diamond, Pamela; Basen-Engquist, Karen

    2018-01-01

This study examined whether a physical activity intervention affects transtheoretical model (TTM) variables that facilitate exercise adoption in breast cancer survivors. Sixty sedentary breast cancer survivors were randomized to a 6-month lifestyle physical activity intervention or standard care. TTM variables that have been shown to facilitate exercise adoption and progress through the stages of change, including self-efficacy, decisional balance, and processes of change, were measured at baseline, 3 months, and 6 months. Differences in TTM variables between groups were tested using repeated measures analysis of variance. The intervention group had significantly higher self-efficacy (F = 9.55, p = .003) and perceived significantly fewer cons of exercise (F = 5.416, p = .025) at 3 and 6 months compared with the standard care group. Self-liberation, counterconditioning, and reinforcement management processes of change increased significantly from baseline to 6 months in the intervention group, and self-efficacy and reinforcement management were significantly associated with improvement in stage of change. The stage-based physical activity intervention increased use of select processes of change, improved self-efficacy, decreased perceptions of the cons of exercise, and helped participants advance in stage of change. These results point to the importance of using a theory-based approach in interventions to increase physical activity in cancer survivors.

  18. The effects of yoga on psychosocial variables and exercise adherence: a randomized, controlled pilot study.

    PubMed

    Bryan, Stephanie; Pinto Zipp, Genevieve; Parasher, Raju

    2012-01-01

Physical inactivity is a serious issue for the American public. Because of conditions that result from inactivity, individuals incur close to $1 trillion USD in health-care costs, and approximately 250 000 premature deaths occur per year. Researchers have linked engaging in yoga to improved overall fitness, including improved muscular strength, muscular endurance, flexibility, and balance. Researchers have not yet investigated the impact of yoga on exercise adherence. The research team assessed the effects of 10 weeks of yoga classes held twice a week on exercise adherence in previously sedentary adults. The research team designed a randomized controlled pilot trial. The team collected data from the intervention (yoga) and control groups at baseline, midpoint, and posttest (posttest 1) and also collected data pertaining to exercise adherence for the yoga group at 5 weeks posttest (posttest 2). The pilot took place in a yoga studio in central New Jersey in the United States. The pretesting occurred at the yoga studio for all participants. Midpoint testing and posttesting occurred at the studio for the yoga group and by mail for the control group. Participants were 27 adults (mean age 51 y) who had been physically inactive for a period of at least 6 months prior to the study. Interventions: The intervention group (yoga group) received hour-long hatha yoga classes that met twice a week for 10 weeks. The control group did not participate in classes during the research study; however, they were offered complimentary post-research classes. Outcome Measures: The study's primary outcome measure was exercise adherence as measured by the 7-day Physical Activity Recall. The secondary measures included (1) exercise self-efficacy as measured by the Multidimensional Self-Efficacy for Exercise Scale, (2) general well-being as measured by the General Well-Being Schedule, (3) exercise-group cohesion as measured by the Group Environment Questionnaire (GEQ), (4) acute feeling response

  19. Hot-spot model for accretion disc variability as random process. II. Mathematics of the power-spectrum break frequency

    NASA Astrophysics Data System (ADS)

    Pecháček, T.; Goosmann, R. W.; Karas, V.; Czerny, B.; Dovčiak, M.

    2013-08-01

Context: We study some general properties of accretion disc variability in the context of stationary random processes. In particular, we are interested in mathematical constraints that can be imposed on the functional form of the Fourier power-spectrum density (PSD) that exhibits a multiply broken shape and several local maxima. Aims: We develop a methodology for determining the regions of the model parameter space that can in principle reproduce a PSD shape with a given number and position of local peaks and breaks of the PSD slope. Given the vast space of possible parameters, it is an important requirement that the method is fast in estimating the PSD shape for a given parameter set of the model. Methods: We generated and discussed the theoretical PSD profiles of a shot-noise-type random process with exponentially decaying flares. Then we determined conditions under which one, two, or more breaks or local maxima occur in the PSD. We calculated positions of these features and determined the changing slope of the model PSD. Furthermore, we considered the influence of the modulation by the orbital motion for a variability pattern assumed to result from an orbiting-spot model. Results: We suggest that our general methodology can be useful for describing non-monotonic PSD profiles (such as the trend seen, on different scales, in exemplary cases of the high-mass X-ray binary Cygnus X-1 and the narrow-line Seyfert galaxy Ark 564). We adopt a model where these power spectra are reproduced as a superposition of several Lorentzians with varying amplitudes in the X-ray-band light curve. Our general approach can help in constraining the model parameters and in determining which parts of the parameter space are accessible under various circumstances.
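
For reference, the PSD building block implied above can be sketched as a superposition of Lorentzians: a family of exponentially decaying flares with decay time tau contributes a spectrum that is flat at low frequency and breaks to a 1/f^2 slope near f_b = 1/(2*pi*tau). The time constants and weights below are illustrative, not fitted values from the paper.

```python
import numpy as np

def lorentzian_psd(f, tau, weight):
    # PSD of exponentially decaying shot-noise flares (Campbell's-theorem form);
    # the slope breaks near f_b = 1 / (2 * pi * tau).
    return weight * tau**2 / (1.0 + (2.0 * np.pi * f * tau) ** 2)

f = np.logspace(-4, 2, 601)
components = [(100.0, 1.0), (1.0, 0.05)]     # (tau, weight) pairs, illustrative
psd = sum(lorentzian_psd(f, tau, w) for tau, w in components)

for tau, _ in components:
    print("break frequency ~", 1.0 / (2.0 * np.pi * tau), "Hz for tau =", tau)
# Local log-log slope of the summed PSD, showing the multiply broken shape:
slope = np.gradient(np.log(psd), np.log(f))
print("slope ranges from", slope.min(), "to", slope.max())
```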

  20. Inference for binomial probability based on dependent Bernoulli random variables with applications to meta-analysis and group level studies.

    PubMed

    Bakbergenuly, Ilyas; Kulinskaya, Elena; Morgenthaler, Stephan

    2016-07-01

    We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group-level studies or in meta-analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log-odds and arcsine transformations of the estimated probability p̂, both for single-group studies and in combining results from several groups or studies in meta-analysis. Our simulations confirm that these biases are linear in ρ, for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta-analysis and result in abysmal coverage of the combined effect for large K. We also propose bias-correction for the arcsine transformation. Our simulations demonstrate that this bias-correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta-analyses of prevalence. © 2016 The Authors. Biometrical Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  1. Inference for binomial probability based on dependent Bernoulli random variables with applications to meta‐analysis and group level studies

    PubMed Central

    Bakbergenuly, Ilyas; Morgenthaler, Stephan

    2016-01-01

    We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group‐level studies or in meta‐analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log‐odds and arcsine transformations of the estimated probability p^, both for single‐group studies and in combining results from several groups or studies in meta‐analysis. Our simulations confirm that these biases are linear in ρ, for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta‐analysis and result in abysmal coverage of the combined effect for large K. We also propose bias‐correction for the arcsine transformation. Our simulations demonstrate that this bias‐correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta‐analyses of prevalence. PMID:27192062
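
A hedged re-creation of the kind of simulation described above (illustrative, not the authors' code): cluster-level probabilities are drawn from a beta distribution with mean p and intracluster correlation rho, and the resulting bias of the arcsine-square-root transform grows roughly linearly in rho.

```python
import numpy as np

rng = np.random.default_rng(5)
p, n, n_groups = 0.3, 50, 200_000
for rho in (0.01, 0.02, 0.05, 0.10):
    # Beta mixing distribution with mean p and ICC rho (so a + b = (1 - rho) / rho).
    a, b = p * (1 - rho) / rho, (1 - p) * (1 - rho) / rho
    p_i = rng.beta(a, b, n_groups)           # cluster-level probabilities
    y = rng.binomial(n, p_i)                 # overdispersed binomial counts
    bias = np.mean(np.arcsin(np.sqrt(y / n))) - np.arcsin(np.sqrt(p))
    print(f"rho = {rho:.2f}: arcsine-transform bias = {bias:+.5f}")
```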

  2. Comparison of electroacupuncture frequency-related effects on heart rate variability in healthy volunteers: a randomized clinical trial.

    PubMed

    Lee, Jong-Ho; Kim, Kyu-Hyeong; Hong, Jin-Woo; Lee, Won-Chul; Koo, Sungtae

    2011-06-01

    This study aimed to compare the effects of high frequency electroacupuncture (EA) and low-frequency EA on the autonomic nervous system by using a heart rate variability measuring device in normal individuals. Fourteen participants were recruited and each participated in the high-frequency and low-frequency sessions (crossover design). The order of sessions was randomized and the interval between the two sessions was over 2 weeks. Participants received needle insertion with 120-Hz stimulation during the high-frequency session (high-frequency EA group), and with 2-Hz stimulation during the low-frequency session (low-frequency EA group). Acupuncture needles were directly inserted perpendicularly to LI 4 and LI 11 acupoints followed by delivery of electric pulses to these points for 15 minutes. Heart rate variability was measured 5 minutes before and after EA stimulation by a heart rate variability measuring system. We found a significant increase in the standard deviation of the normal-to-normal interval in the high-frequency EA group, with no change in the low-frequency EA group. Both the high-frequency and low-frequency EA groups showed no significant differences in other parameters including high-frequency power, low-frequency power, and the ratio of low-frequency power to high-frequency power. Based on these findings, we concluded that high-frequency EA stimulation is more effective than low-frequency EA stimulation in increasing autonomic nervous activity and there is no difference between the two EA frequencies in enhancing sympathovagal balance. Copyright © 2011 Korean Pharmacopuncture Institute. Published by .. All rights reserved.

  3. Mean convergence theorems and weak laws of large numbers for weighted sums of random variables under a condition of weighted integrability

    NASA Astrophysics Data System (ADS)

    Ordóñez Cabrera, Manuel; Volodin, Andrei I.

    2005-05-01

From the classical notion of uniform integrability of a sequence of random variables, a new concept of integrability (called h-integrability) is introduced for an array of random variables, concerning an array of constants. We prove that this concept is weaker than other previous related notions of integrability, such as Cesàro uniform integrability [Chandra, Sankhya Ser. A 51 (1989) 309-317], uniform integrability concerning the weights [Ordóñez Cabrera, Collect. Math. 45 (1994) 121-132] and Cesàro α-integrability [Chandra and Goswami, J. Theoret. Probab. 16 (2003) 655-669]. Under this condition of integrability and appropriate conditions on the array of weights, mean convergence theorems and weak laws of large numbers for weighted sums of an array of random variables are obtained when the random variables are subject to some special kinds of dependence: (a) rowwise pairwise negative dependence, (b) rowwise pairwise non-positive correlation, (c) when the sequence of random variables in every row is φ-mixing. Finally, we consider the general weak law of large numbers in the sense of Gut [Statist. Probab. Lett. 14 (1992) 49-52] under this new condition of integrability for a Banach space setting.
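
For orientation, the definition of h-integrability is usually stated along the following lines (recalled from the cited literature, so treat it as an assumption rather than a quotation): for an array of random variables {X_ni}, an array of constants {a_ni}, and a function h increasing to infinity, the array is h-integrable with respect to {a_ni} if

```latex
% recalled form of the definition, not quoted from the paper
\sup_{n}\sum_{i}|a_{ni}|\,\mathbb{E}|X_{ni}| < \infty
\qquad\text{and}\qquad
\lim_{n\to\infty}\sum_{i}|a_{ni}|\,
\mathbb{E}\bigl[\,|X_{ni}|\,\mathbf{1}\{|X_{ni}|>h(n)\}\,\bigr] = 0 .
```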

  4. Variability in research ethics review of cluster randomized trials: a scenario-based survey in three countries

    PubMed Central

    2014-01-01

Background: Cluster randomized trials (CRTs) present unique ethical challenges. In the absence of a uniform standard for their ethical design and conduct, problems such as variability in procedures and requirements by different research ethics committees will persist. We aimed to assess the need for ethics guidelines for CRTs among research ethics chairs internationally, investigate variability in procedures for research ethics review of CRTs within and among countries, and elicit research ethics chairs’ perspectives on specific ethical issues in CRTs, including the identification of research subjects. The proper identification of research subjects is a necessary requirement in the research ethics review process, to help ensure, on the one hand, that subjects are protected from harm and exploitation, and on the other, that reviews of CRTs are completed efficiently. Methods: A web-based survey with closed- and open-ended questions was administered to research ethics chairs in Canada, the United States, and the United Kingdom. The survey presented three scenarios of CRTs involving cluster-level, professional-level, and individual-level interventions. For each scenario, a series of questions was posed with respect to the type of review required (full, expedited, or no review) and the identification of research subjects at cluster and individual levels. Results: A total of 189 (35%) of 542 chairs responded. Overall, 144 (84%, 95% CI 79 to 90%) agreed or strongly agreed that there is a need for ethics guidelines for CRTs and 158 (92%, 95% CI 88 to 96%) agreed or strongly agreed that research ethics committees could be better informed about distinct ethical issues surrounding CRTs. There was considerable variability among research ethics chairs with respect to the type of review required, as well as the identification of research subjects. The cluster-cluster and professional-cluster scenarios produced the most disagreement. Conclusions: Research ethics committees

  5. Semiparametric methods for estimation of a nonlinear exposure-outcome relationship using instrumental variables with application to Mendelian randomization.

    PubMed

    Staley, James R; Burgess, Stephen

    2017-05-01

Mendelian randomization, the use of genetic variants as instrumental variables (IV), can test for and estimate the causal effect of an exposure on an outcome. Most IV methods assume that the function relating the exposure to the expected value of the outcome (the exposure-outcome relationship) is linear. However, in practice, this assumption may not hold. Indeed, often the primary question of interest is to assess the shape of this relationship. We present two novel IV methods for investigating the shape of the exposure-outcome relationship: a fractional polynomial method and a piecewise linear method. We divide the population into strata using the exposure distribution, and estimate a causal effect, referred to as a localized average causal effect (LACE), in each stratum of the population. The fractional polynomial method performs metaregression on these LACE estimates. The piecewise linear method estimates a continuous piecewise linear function, the gradient of which is the LACE estimate in each stratum. Both methods were demonstrated in a simulation study to estimate the true exposure-outcome relationship well, particularly when the relationship was a fractional polynomial (for the fractional polynomial method) or was piecewise linear (for the piecewise linear method). The methods were used to investigate the shape of the relationship of body mass index with systolic blood pressure and diastolic blood pressure. © 2017 The Authors Genetic Epidemiology Published by Wiley Periodicals, Inc.

  6. Semiparametric methods for estimation of a nonlinear exposure‐outcome relationship using instrumental variables with application to Mendelian randomization

    PubMed Central

    Staley, James R.

    2017-01-01

Mendelian randomization, the use of genetic variants as instrumental variables (IV), can test for and estimate the causal effect of an exposure on an outcome. Most IV methods assume that the function relating the exposure to the expected value of the outcome (the exposure‐outcome relationship) is linear. However, in practice, this assumption may not hold. Indeed, often the primary question of interest is to assess the shape of this relationship. We present two novel IV methods for investigating the shape of the exposure‐outcome relationship: a fractional polynomial method and a piecewise linear method. We divide the population into strata using the exposure distribution, and estimate a causal effect, referred to as a localized average causal effect (LACE), in each stratum of the population. The fractional polynomial method performs metaregression on these LACE estimates. The piecewise linear method estimates a continuous piecewise linear function, the gradient of which is the LACE estimate in each stratum. Both methods were demonstrated in a simulation study to estimate the true exposure‐outcome relationship well, particularly when the relationship was a fractional polynomial (for the fractional polynomial method) or was piecewise linear (for the piecewise linear method). The methods were used to investigate the shape of the relationship of body mass index with systolic blood pressure and diastolic blood pressure. PMID:28317167
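
A much-simplified sketch of the stratified idea described above (not the authors' implementation; for brevity, the "IV-free" exposure below uses the true genetic coefficient rather than a first-stage estimate): within each stratum of the IV-free exposure, a LACE is formed as a Wald-type ratio of covariances.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 50_000
g = rng.binomial(2, 0.3, n)                  # genetic instrument (allele count)
u = rng.normal(size=n)                       # unmeasured confounder
x = 0.5 * g + u + rng.normal(size=n)         # exposure
# Nonlinear (threshold) causal effect of x on y, plus confounding through u:
y = np.where(x > 2, 2.0 * (x - 2), 0.0) + u + rng.normal(size=n)

x0 = x - 0.5 * g                             # IV-free exposure (true coefficient)
strata = np.quantile(x0, [0.0, 0.25, 0.5, 0.75, 1.0])
for lo, hi in zip(strata[:-1], strata[1:]):
    m = (x0 >= lo) & (x0 <= hi)
    lace = np.cov(y[m], g[m])[0, 1] / np.cov(x[m], g[m])[0, 1]
    print(f"stratum ({lo:+.2f}, {hi:+.2f}]: LACE = {lace:+.2f}")
```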

  7. Genetic variability of VEGF pathway genes in six randomized phase III trials assessing the addition of bevacizumab to standard therapy.

    PubMed

    de Haas, Sanne; Delmar, Paul; Bansal, Aruna T; Moisse, Matthieu; Miles, David W; Leighl, Natasha; Escudier, Bernard; Van Cutsem, Eric; Carmeliet, Peter; Scherer, Stefan J; Pallaud, Celine; Lambrechts, Diether

    2014-10-01

Despite extensive translational research, no validated biomarkers predictive of bevacizumab treatment outcome have been identified. We performed a meta-analysis of individual patient data from six randomized phase III trials in colorectal, pancreatic, lung, renal, breast, and gastric cancer to explore the potential relationships between 195 common genetic variants in the vascular endothelial growth factor (VEGF) pathway and bevacizumab treatment outcome. The analysis included 1,402 patients (716 bevacizumab-treated and 686 placebo-treated). Twenty variants were associated (P < 0.05) with progression-free survival (PFS) in bevacizumab-treated patients. Of these, 4 variants in EPAS1 survived correction for multiple testing (q < 0.05). Genotype-by-treatment interaction tests revealed that, across these 20 variants, 3 variants in VEGF-C (rs12510099), EPAS1 (rs4953344), and IL8RA (rs2234671) were potentially predictive (P < 0.05), but not resistant to multiple testing (q > 0.05). A weak genotype-by-treatment interaction effect was also observed for rs699946 in VEGF-A, whereas Bayesian genewise analysis revealed that genetic variability in VHL was associated with PFS in the bevacizumab arm (q < 0.05). Variants in VEGF-A, EPAS1, and VHL were located in expression quantitative trait loci derived from lymphoblastoid cell lines, indicating that they affect the expression levels of their respective genes. This large genetic analysis suggests that variants in VEGF-A, EPAS1, IL8RA, VHL, and VEGF-C have potential value in predicting bevacizumab treatment outcome across tumor types. Although these associations did not survive correction for multiple testing in a genotype-by-treatment interaction analysis, they are among the strongest predictive effects reported to date for genetic variants and bevacizumab efficacy.

  8. Random Distribution Pattern and Non-adaptivity of Genome Size in a Highly Variable Population of Festuca pallens

    PubMed Central

    Šmarda, Petr; Bureš, Petr; Horová, Lucie

    2007-01-01

    Background and Aims The spatial and statistical distribution of genome sizes and the adaptivity of genome size to some types of habitat, vegetation or microclimatic conditions were investigated in a tetraploid population of Festuca pallens. The population was previously documented to vary highly in genome size and is assumed as a model for the study of the initial stages of genome size differentiation. Methods Using DAPI flow cytometry, samples were measured repeatedly with diploid Festuca pallens as the internal standard. Altogether 172 plants from 57 plots (2·25 m2), distributed in contrasting habitats over the whole locality in South Moravia, Czech Republic, were sampled. The differences in DNA content were confirmed by the double peaks of simultaneously measured samples. Key Results At maximum, a 1·115-fold difference in genome size was observed. The statistical distribution of genome sizes was found to be continuous and best fits the extreme (Gumbel) distribution with rare occurrences of extremely large genomes (positive-skewed), as it is similar for the log-normal distribution of the whole Angiosperms. Even plants from the same plot frequently varied considerably in genome size and the spatial distribution of genome sizes was generally random and unautocorrelated (P > 0·05). The observed spatial pattern and the overall lack of correlations of genome size with recognized vegetation types or microclimatic conditions indicate the absence of ecological adaptivity of genome size in the studied population. Conclusions These experimental data on intraspecific genome size variability in Festuca pallens argue for the absence of natural selection and the selective non-significance of genome size in the initial stages of genome size differentiation, and corroborate the current hypothetical model of genome size evolution in Angiosperms (Bennetzen et al., 2005, Annals of Botany 95: 127–132). PMID:17565968

  9. Are glucose levels, glucose variability and autonomic control influenced by inspiratory muscle exercise in patients with type 2 diabetes? Study protocol for a randomized controlled trial.

    PubMed

    Schein, Aso; Correa, Aps; Casali, Karina Rabello; Schaan, Beatriz D

    2016-01-20

    Physical exercise reduces glucose levels and glucose variability in patients with type 2 diabetes. Acute inspiratory muscle exercise has been shown to reduce these parameters in a small group of patients with type 2 diabetes, but these results have yet to be confirmed in a well-designed study. The aim of this study is to investigate the effect of acute inspiratory muscle exercise on glucose levels, glucose variability, and cardiovascular autonomic function in patients with type 2 diabetes. This study will use a randomized clinical trial crossover design. A total of 14 subjects will be recruited and randomly allocated to two groups to perform acute inspiratory muscle loading at 2 % of maximal inspiratory pressure (PImax, placebo load) or 60 % of PImax (experimental load). Inspiratory muscle training could be a novel exercise modality to be used to decrease glucose levels and glucose variability. ClinicalTrials.gov NCT02292810 .

  10. Night-to-Night Sleep Variability in Older Adults With Chronic Insomnia: Mediators and Moderators in a Randomized Controlled Trial of Brief Behavioral Therapy (BBT-I)

    PubMed Central

    Chan, Wai Sze; Williams, Jacob; Dautovich, Natalie D.; McNamara, Joseph P.H.; Stripling, Ashley; Dzierzewski, Joseph M.; Berry, Richard B.; McCoy, Karin J.M.; McCrae, Christina S.

    2017-01-01

    Study Objectives: Sleep variability is a clinically significant variable in understanding and treating insomnia in older adults. The current study examined changes in sleep variability in the course of brief behavioral therapy for insomnia (BBT-I) in older adults who had chronic insomnia. Additionally, the current study examined the mediating mechanisms underlying reductions of sleep variability and the moderating effects of baseline sleep variability on treatment responsiveness. Methods: Sixty-two elderly participants were randomly assigned to either BBT-I or self-monitoring and attention control (SMAC). Sleep was assessed by sleep diaries and actigraphy from baseline to posttreatment and at 3-month follow-up. Mixed models were used to examine changes in sleep variability (within-person standard deviations of weekly sleep parameters) and the hypothesized mediation and moderation effects. Results: Variabilities in sleep diary-assessed sleep onset latency (SOL) and actigraphy-assessed total sleep time (TST) significantly decreased in BBT-I compared to SMAC (Pseudo R2 = .12, .27; P = .018, .008). These effects were mediated by reductions in bedtime and wake time variability and time in bed. Significant time × group × baseline sleep variability interactions on sleep outcomes indicated that participants who had higher baseline sleep variability were more responsive to BBT-I; their actigraphy-assessed TST, SOL, and sleep efficiency improved to a greater degree (Pseudo R2 = .15 to .66; P < .001 to .044). Conclusions: BBT-I is effective in reducing sleep variability in older adults who have chronic insomnia. Increased consistency in bedtime and wake time and decreased time in bed mediate reductions of sleep variability. Baseline sleep variability may serve as a marker of high treatment responsiveness to BBT-I. Clinical Trial Registration: ClinicalTrials.gov, Identifier: NCT02967185 Citation: Chan WS, Williams J, Dautovich ND, McNamara JP, Stripling A, Dzierzewski JM

  11. Dietary sodium influences the effect of mental stress on heart rate variability: a randomized trial in healthy adults.

    PubMed

    Allen, Alexander R; Gullixson, Leah R; Wolhart, Sarah C; Kost, Susan L; Schroeder, Darrell R; Eisenach, John H

    2014-02-01

    Dietary sodium influences intermediate physiological traits in healthy adults independent of changes in blood pressure. The purpose of this study was to test the hypothesis that dietary sodium affects cardiac autonomic modulation during mental stress. In a prospective, randomized cross-over design separated by 1 month between diets, 70 normotensive healthy young adults (F/M: 44/26, aged 18-38 years) consumed a 5-day low (10 mmol/day), normal (150 mmol), and high (400 mmol) sodium diet followed by heart rate variability (HRV) recordings at rest and during 5-min computerized mental arithmetic. Women were studied in the low hormone phase of the menstrual cycle following each diet. Diet did not affect resting blood pressure, but heart rate (HR) (mean ± SE) was 66 ± 1, 64 ± 1, and 63 ± 1 bpm in low, normal, and high sodium conditions, respectively (analysis of variance P = 0.02). For HRV, there was a main effect of sodium on resting SD of normalized RR intervals (SDNN), square root of the mean squared difference of successive normalized RR intervals (RMSSD), high frequency, low-frequency normalized units (LFnu), and high-frequency normalized units (HFnu) (P < 0.01 for all). The response to low sodium was most marked and consistent with sympathetic activation and reduced vagal activity, with increased LFnu and decreased SDNN, RMSSD, and HFnu compared to both normal and high sodium conditions (P ≤0.05 for all). Dietary sodium-by-mental stress interactions were significant for mean NN, RMSSD, high-frequency power, LFnu, and low frequency/high frequency ratio (P < 0.05 for all). The interactions signify that sodium restriction evoked an increase in resting sympathetic activity and reduced vagal activity to the extent that mental stress caused modest additional disruptions in autonomic balance. Conversely, normal and high sodium evoked a reduction in resting sympathetic activity and incremental increase in resting vagal activity, which were disrupted to a greater
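
For reference, the two time-domain HRV measures named above are simple to compute from an RR-interval series; the values below are toy numbers, not study data.

```python
import numpy as np

rr = np.array([850, 870, 845, 900, 880, 860, 875, 890, 855, 865], dtype=float)  # ms

sdnn = rr.std(ddof=1)                        # SD of normal-to-normal intervals (SDNN)
rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))   # root mean square of successive differences
print(f"SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms")
```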

  12. Effects of Yoga on Stress, Stress Adaption, and Heart Rate Variability Among Mental Health Professionals--A Randomized Controlled Trial.

    PubMed

    Lin, Shu-Ling; Huang, Ching-Ya; Shiu, Shau-Ping; Yeh, Shu-Hui

    2015-08-01

Mental health professionals experiencing work-related stress may experience burnout, leading to a negative impact on their organization and patients. The aim of this study was to examine the effects of yoga classes on work-related stress, stress adaptation, and autonomic nerve activity among mental health professionals. A randomized controlled trial was used, which compared the outcomes between the experimental (e.g., yoga program) and the control groups (e.g., no yoga exercise) for 12 weeks. Work-related stress and stress adaptation were assessed before and after the program. Heart rate variability (HRV) was measured at baseline, midpoint through the weekly yoga classes (6 weeks), and postintervention (after 12 weeks of yoga classes). The results showed that the mental health professionals in the yoga group experienced a significant reduction in work-related stress (t = -6.225, p < .001), and a significant enhancement of stress adaptation (t = 2.128, p = .042). Participants in the control group revealed no significant changes. Comparing the mean differences in pre- and posttest scores between yoga and control groups, we found the yoga group significantly decreased work-related stress (t = -3.216, p = .002), but there was no significant change in stress adaptation (p = .084). While controlling for the pretest scores of work-related stress, participants in the yoga group, but not the control group, revealed a significant increase in autonomic nerve activity at midpoint (6 weeks) test (t = -2.799, p = .007), and at posttest (12 weeks; t = -2.099, p = .040). Because mental health professionals experienced a reduction in work-related stress and an increase in autonomic nerve activity in a weekly yoga program for 12 weeks, clinicians, administrators, and educators should offer yoga classes as a strategy to help health professionals reduce their work-related stress and balance autonomic nerve activities. © 2015 The Authors. Worldviews on Evidence-Based Nursing published by Wiley

  13. Effect of an office worksite-based yoga program on heart rate variability: outcomes of a randomized controlled trial

    PubMed Central

    2013-01-01

    Background Chronic work-related stress is an independent risk factor for cardiometabolic diseases and associated mortality, particularly when compounded by a sedentary work environment. The purpose of this study was to determine if an office worksite-based hatha yoga program could improve physiological stress, evaluated via heart rate variability (HRV), and associated health-related outcomes in a cohort of office workers. Methods Thirty-seven adults employed in university-based office positions were randomized upon the completion of baseline testing to an experimental or control group. The experimental group completed a 10-week yoga program prescribed three sessions per week during lunch hour (50 min per session). An experienced instructor led the sessions, which emphasized asanas (postures) and vinyasa (exercises). The primary outcome was the high frequency (HF) power component of HRV. Secondary outcomes included additional HRV parameters, musculoskeletal fitness (i.e. push-up, side-bridge, and sit & reach tests) and psychological indices (i.e. state and trait anxiety, quality of life and job satisfaction). Results All measures of HRV failed to change in the experimental group versus the control group, except that the experimental group significantly increased LF:HF (p = 0.04) and reduced pNN50 (p = 0.04) versus control, contrary to our hypotheses. Flexibility, evaluated via sit & reach test increased in the experimental group versus the control group (p < 0.001). No other adaptations were noted. Post hoc analysis comparing participants who completed ≥70% of yoga sessions (n = 11) to control (n = 19) yielded the same findings, except that the high adherers also reduced state anxiety (p = 0.02) and RMSSD (p = 0.05), and tended to improve the push-up test (p = 0.07) versus control. Conclusions A 10-week hatha yoga intervention delivered at the office worksite during lunch hour did not improve HF power or other HRV parameters

  14. Random Initialisation of the Spectral Variables: an Alternate Approach for Initiating Multivariate Curve Resolution Alternating Least Square (MCR-ALS) Analysis.

    PubMed

    Kumar, Keshav

    2017-11-01

Multivariate curve resolution alternating least squares (MCR-ALS) analysis is the most commonly used curve resolution technique. The MCR-ALS model is fitted using the alternating least squares (ALS) algorithm, which needs initialisation of either the contribution profiles or the spectral profiles of each factor. The contribution profiles can be initialised using evolving factor analysis; in principle, however, this approach requires that the data belong to a sequential process. The initialisation of the spectral profiles is usually carried out using a pure-variable approach such as the SIMPLISMA algorithm; this approach demands that each factor have pure variables in the data set. Despite these limitations, the existing approaches have been quite successful for initiating the MCR-ALS analysis. The present work proposes an alternate approach for the initialisation of the spectral variables: generating random values within the limits spanned by the maxima and minima of each spectral variable of the data set. The proposed approach requires neither pure variables for each component of the multicomponent system nor a concentration direction that follows a sequential process. The proposed approach is successfully validated using excitation-emission matrix fluorescence data sets acquired for certain fluorophores with significant spectral overlap. The calculated contribution and spectral profiles of these fluorophores are found to correlate well with the experimental results. In summary, the present work proposes an alternate way to initiate the MCR-ALS analysis.
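
A minimal sketch of the proposed initialisation, as an interpretation of the abstract rather than the author's code: draw each initial spectral profile uniformly between the per-variable minima and maxima of the data matrix, then run a basic nonnegativity-constrained ALS loop.

```python
import numpy as np

def mcr_als(D, n_factors, n_iter=200, seed=0):
    """D: data matrix (n_samples x n_variables); returns contributions C and spectra S."""
    rng = np.random.default_rng(seed)
    lo, hi = D.min(axis=0), D.max(axis=0)
    # Random initial spectra, drawn within each variable's observed min/max limits.
    S = rng.uniform(lo, hi, size=(n_factors, D.shape[1])).T   # (n_variables x n_factors)
    for _ in range(n_iter):
        C = np.linalg.lstsq(S, D.T, rcond=None)[0].T          # contributions, D ~ C @ S.T
        C = np.clip(C, 0.0, None)                             # nonnegativity constraint
        S = np.linalg.lstsq(C, D, rcond=None)[0].T            # updated spectra
        S = np.clip(S, 0.0, None)
    return C, S

# Usage on a hypothetical spectral data matrix D: C, S = mcr_als(D, n_factors=3)
```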

  15. Estimating the efficacy of Alcoholics Anonymous without self-selection bias: An instrumental variables re-analysis of randomized clinical trials

    PubMed Central

    Humphreys, Keith; Blodgett, Janet C.; Wagner, Todd H.

    2014-01-01

    Background Observational studies of Alcoholics Anonymous’ (AA) effectiveness are vulnerable to self-selection bias because individuals choose whether or not to attend AA. The present study therefore employed an innovative statistical technique to derive a selection bias-free estimate of AA’s impact. Methods Six datasets from 5 National Institutes of Health-funded randomized trials (one with two independent parallel arms) of AA facilitation interventions were analyzed using instrumental variables models. Alcohol dependent individuals in one of the datasets (n = 774) were analyzed separately from the rest of sample (n = 1582 individuals pooled from 5 datasets) because of heterogeneity in sample parameters. Randomization itself was used as the instrumental variable. Results Randomization was a good instrument in both samples, effectively predicting increased AA attendance that could not be attributed to self-selection. In five of the six data sets, which were pooled for analysis, increased AA attendance that was attributable to randomization (i.e., free of self-selection bias) was effective at increasing days of abstinence at 3-month (B = .38, p = .001) and 15-month (B = 0.42, p = .04) follow-up. However, in the remaining dataset, in which pre-existing AA attendance was much higher, further increases in AA involvement caused by the randomly assigned facilitation intervention did not affect drinking outcome. Conclusions For most individuals seeking help for alcohol problems, increasing AA attendance leads to short and long term decreases in alcohol consumption that cannot be attributed to self-selection. However, for populations with high pre-existing AA involvement, further increases in AA attendance may have little impact. PMID:25421504

  16. Estimating the efficacy of Alcoholics Anonymous without self-selection bias: an instrumental variables re-analysis of randomized clinical trials.

    PubMed

    Humphreys, Keith; Blodgett, Janet C; Wagner, Todd H

    2014-11-01

    Observational studies of Alcoholics Anonymous' (AA) effectiveness are vulnerable to self-selection bias because individuals choose whether or not to attend AA. The present study, therefore, employed an innovative statistical technique to derive a selection bias-free estimate of AA's impact. Six data sets from 5 National Institutes of Health-funded randomized trials (1 with 2 independent parallel arms) of AA facilitation interventions were analyzed using instrumental variables models. Alcohol-dependent individuals in one of the data sets (n = 774) were analyzed separately from the rest of sample (n = 1,582 individuals pooled from 5 data sets) because of heterogeneity in sample parameters. Randomization itself was used as the instrumental variable. Randomization was a good instrument in both samples, effectively predicting increased AA attendance that could not be attributed to self-selection. In 5 of the 6 data sets, which were pooled for analysis, increased AA attendance that was attributable to randomization (i.e., free of self-selection bias) was effective at increasing days of abstinence at 3-month (B = 0.38, p = 0.001) and 15-month (B = 0.42, p = 0.04) follow-up. However, in the remaining data set, in which preexisting AA attendance was much higher, further increases in AA involvement caused by the randomly assigned facilitation intervention did not affect drinking outcome. For most individuals seeking help for alcohol problems, increasing AA attendance leads to short- and long-term decreases in alcohol consumption that cannot be attributed to self-selection. However, for populations with high preexisting AA involvement, further increases in AA attendance may have little impact. Copyright © 2014 by the Research Society on Alcoholism.

  17. Free variable selection QSPR study to predict 19F chemical shifts of some fluorinated organic compounds using Random Forest and RBF-PLS methods

    NASA Astrophysics Data System (ADS)

    Goudarzi, Nasser

    2016-04-01

    In this work, two new and powerful chemometrics methods are applied to the modeling and prediction of the 19F chemical shift values of some fluorinated organic compounds. The radial basis function-partial least squares (RBF-PLS) and random forest (RF) methods are employed to construct models to predict the 19F chemical shifts. No separate variable selection method was used, because the RF method can serve as both a variable selection and a modeling technique. The effects of the important parameters governing the predictive ability of RF, namely the number of trees (nt) and the number of randomly selected variables used to split each node (m), were investigated. The root-mean-square errors of prediction (RMSEP) for the training set and the prediction set for the RBF-PLS and RF models were 44.70, 23.86, 29.77, and 23.69, respectively. The correlation coefficients of the prediction set for the RBF-PLS and RF models were 0.8684 and 0.9313, respectively. The results obtained reveal that the RF model can be used as a powerful chemometrics tool for quantitative structure-property relationship (QSPR) studies.
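
Aside: the two RF tuning parameters named above map directly onto n_estimators (nt) and max_features (m) in scikit-learn. A minimal sketch on synthetic data follows; the descriptor matrix and every number in it are placeholders, not the paper's data set.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a molecular descriptor matrix X and 19F shifts y
rng = np.random.default_rng(0)
X = rng.standard_normal((150, 40))
y = X[:, :3] @ np.array([5.0, -3.0, 2.0]) + rng.standard_normal(150)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# nt -> n_estimators, m -> max_features (variables tried at each split)
rf = RandomForestRegressor(n_estimators=500, max_features=8, random_state=0)
rf.fit(X_tr, y_tr)

rmsep = np.sqrt(np.mean((rf.predict(X_te) - y_te) ** 2))
print(f"RMSEP: {rmsep:.2f}")
# Built-in importances act as the implicit variable selection the abstract mentions
top = np.argsort(rf.feature_importances_)[::-1][:5]
print("Most informative descriptors:", top)
```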

  18. Between-Batch Pharmacokinetic Variability Inflates Type I Error Rate in Conventional Bioequivalence Trials: A Randomized Advair Diskus Clinical Trial.

    PubMed

    Burmeister Getz, E; Carroll, K J; Mielke, J; Benet, L Z; Jones, B

    2017-03-01

    We previously demonstrated pharmacokinetic differences among manufacturing batches of a US Food and Drug Administration (FDA)-approved dry powder inhalation product (Advair Diskus 100/50) large enough to establish between-batch bio-inequivalence. Here, we provide independent confirmation of pharmacokinetic bio-inequivalence among Advair Diskus 100/50 batches, and quantify residual and between-batch variance component magnitudes. These variance estimates are used to consider the type I error rate of the FDA's current two-way crossover design recommendation. When between-batch pharmacokinetic variability is substantial, the conventional two-way crossover design cannot accomplish the objectives of FDA's statistical bioequivalence test (i.e., cannot accurately estimate the test/reference ratio and associated confidence interval). The two-way crossover, which ignores between-batch pharmacokinetic variability, yields an artificially narrow confidence interval on the product comparison. The unavoidable consequence is type I error rate inflation, to ∼25%, when between-batch pharmacokinetic variability is nonzero. This risk of a false bioequivalence conclusion is substantially higher than asserted by regulators as acceptable consumer risk (5%). © 2016 The Authors Clinical Pharmacology & Therapeutics published by Wiley Periodicals, Inc. on behalf of The American Society for Clinical Pharmacology and Therapeutics.
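
Aside: the type I error inflation described above can be reproduced with a small Monte Carlo experiment. The sketch below is illustrative only: it evaluates a product truly at the bioequivalence boundary with a standard 90%-CI (TOST) crossover analysis that ignores the single batch draw per product. All variance parameters and sample sizes are invented, not estimates from the trial.

```python
import numpy as np
from scipy import stats

def crossover_passes_be(n_subj, true_log_ratio, sd_batch, sd_within, rng):
    """One simulated two-way crossover. A single batch per product shifts the
    realized log ratio, but the 90% CI uses only within-subject variability,
    mimicking the conventional analysis criticized in the abstract."""
    batch_shift = rng.normal(0.0, sd_batch) - rng.normal(0.0, sd_batch)
    # Within-subject period differences (test - reference) on the log scale
    d = rng.normal(true_log_ratio + batch_shift, sd_within * np.sqrt(2), n_subj)
    se = d.std(ddof=1) / np.sqrt(n_subj)
    t = stats.t.ppf(0.95, n_subj - 1)
    lo, hi = d.mean() - t * se, d.mean() + t * se
    return (np.log(0.8) < lo) and (hi < np.log(1.25))   # TOST via 90% CI

rng = np.random.default_rng(0)
true_log_ratio = np.log(1.25)          # product truly at the BE boundary
passes = [crossover_passes_be(60, true_log_ratio, sd_batch=0.10,
                              sd_within=0.25, rng=rng) for _ in range(5000)]
print("Type I error rate:", np.mean(passes))
# With sd_batch = 0 this stays near the nominal 5%; with sd_batch > 0 the
# artificially narrow CI inflates it well above 5%, as the abstract reports.
```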

  19. The Relations of Cognitive, Behavioral, and Physical Activity Variables to Depression Severity in Traumatic Brain Injury: Reanalysis of Data From a Randomized Controlled Trial.

    PubMed

    Bombardier, Charles H; Fann, Jesse R; Ludman, Evette J; Vannoy, Steven D; Dyer, Joshua R; Barber, Jason K; Temkin, Nancy R

    Objective: To explore the relations of cognitive, behavioral, and physical activity variables to depression severity among people with traumatic brain injury (TBI) undergoing a depression treatment trial. Setting: Community. Participants: Adults (N = 88) who sustained complicated mild to severe TBI within the past 10 years, met criteria for major depressive disorder, and completed study measures. Design: Randomized controlled trial. Participants were randomized to cognitive-behavioral therapy (n = 58) or usual care (n = 42). Outcomes were measured at baseline and 16 weeks. We combined the groups and used regressions to explore the relations among theoretical variables and depression outcomes. Depression severity was measured with the Hamilton Depression Rating Scale and Symptom Checklist-20. Theory-based measures were the Dysfunctional Attitudes Scale (DAS), Automatic Thoughts Questionnaire (ATQ), Environmental Rewards Observation Scale (EROS), and the International Physical Activity Questionnaire (IPAQ). Compared with non-TBI norms, baseline DAS and ATQ scores were high and EROS and IPAQ scores were low. All outcomes improved from baseline to 16 weeks except the DAS. The ATQ was an independent predictor of baseline depression. An increase in EROS scores was correlated with decreased depression. Increasing participation in meaningful roles and pleasant activities may be a promising approach to treating depression after TBI.

  20. Comparison of structured and unstructured physical activity training on predicted VO2max and heart rate variability in adolescents - a randomized control trial.

    PubMed

    Sharma, Vivek Kumar; Subramanian, Senthil Kumar; Radhakrishnan, Krishnakumar; Rajendran, Rajathi; Ravindran, Balasubramanian Sulur; Arunachalam, Vinayathan

    2017-05-01

    Physical inactivity contributes to many health issues. The WHO-recommended physical activity for adolescents encompasses aerobic, resistance, and bone strengthening exercises aimed at achieving health-related physical fitness. Heart rate variability (HRV) and maximal aerobic capacity (VO2max) are considered noninvasive measures of cardiovascular health. The objective of this study is to compare the effect of structured and unstructured physical training on maximal aerobic capacity and HRV among adolescents. We designed a single-blinded, parallel, randomized active-controlled trial (Registration No. CTRI/2013/08/003897) to compare the physiological effects of 6 months of globally recommended structured physical activity (SPA) with that of unstructured physical activity (USPA) in healthy school-going adolescents. We recruited 439 healthy student volunteers (boys: 250, girls: 189) in the age group of 12-17 years. Randomization across the groups was done using an age- and gender-stratified randomization method, and the participants were divided into two groups: SPA (n=219, boys: 117, girls: 102) and USPA (n=220, boys: 119, girls: 101). Depending on their training status and gender, the participants in both SPA and USPA groups were further subdivided into the following four sub-groups: SPA athlete boys (n=22) and girls (n=17), SPA nonathlete boys (n=95) and girls (n=85), USPA athlete boys (n=23) and girls (n=17), and USPA nonathlete boys (n=96) and girls (n=84). We recorded HRV, body fat percentage, and VO2max (Rockport Walk Fitness Test) before and after the intervention. Maximal aerobic capacity and heart rate variability increased significantly, while heart rate, systolic blood pressure, diastolic blood pressure, and body fat percentage decreased significantly after both the SPA and USPA interventions. However, the improvement was greater with SPA than with USPA. SPA is more beneficial for improving cardiorespiratory fitness, HRV, and reducing body fat percentage in terms of

  1. Nasal Jet-CPAP (variable flow) versus Bubble-CPAP in preterm infants with respiratory distress: an open label, randomized controlled trial.

    PubMed

    Bhatti, A; Khan, J; Murki, S; Sundaram, V; Saini, S S; Kumar, P

    2015-11-01

    To compare the failure rates between the Jet continuous positive airway pressure device (J-CPAP-variable flow) and the Bubble continuous positive airway pressure device (B-CPAP) in preterm infants with respiratory distress. Preterm newborns <34 weeks' gestation with onset of respiratory distress within 6 h of life were randomized to receive J-CPAP (a variable flow device) or B-CPAP (a continuous flow device). A standardized protocol was followed for titration, weaning and removal of CPAP. Pressure was monitored close to the nares in both devices every 6 hours and settings were adjusted to provide the desired CPAP. The primary outcome was CPAP failure rate within 72 h of life. Secondary outcomes were CPAP failure within 7 days of life, need for surfactant post-randomization, time to CPAP failure, duration of CPAP and complications of prematurity. An intention-to-treat analysis was done. One hundred seventy neonates were randomized, 80 to J-CPAP and 90 to B-CPAP. CPAP failure rates within 72 h were similar in infants who received J-CPAP and in those who received B-CPAP (29 versus 21%; relative risk 1.4 (0.8 to 2.3), P=0.25). Mean (95% confidence intervals) time to CPAP failure was 59 h (54 to 64) in the Jet CPAP group in comparison with 65 h (62 to 68) in the Bubble CPAP group (log rank P=0.19). All other secondary outcomes were similar between the two groups. In preterm infants with respiratory distress starting within 6 h of life, CPAP failure rates were similar with Jet CPAP and Bubble CPAP.

  2. Attention Measures of Accuracy, Variability, and Fatigue Detect Early Response to Donepezil in Alzheimer's Disease: A Randomized, Double-blind, Placebo-Controlled Pilot Trial.

    PubMed

    Vila-Castelar, Clara; Ly, Jenny J; Kaplan, Lillian; Van Dyk, Kathleen; Berger, Jeffrey T; Macina, Lucy O; Stewart, Jennifer L; Foldi, Nancy S

    2018-04-09

    Donepezil is widely used to treat Alzheimer's disease (AD), but detecting early response remains challenging for clinicians. Acetylcholine is known to directly modulate attention, particularly under high cognitive conditions, but no studies to date have tested whether measures of attention under high load can detect early effects of donepezil. We hypothesized that load-dependent attention tasks are sensitive to short-term treatment effects of donepezil, while global and other domain-specific cognitive measures are not. This longitudinal, randomized, double-blind, placebo-controlled pilot trial (ClinicalTrials.gov Identifier: NCT03073876) evaluated 23 participants newly diagnosed with AD initiating de novo donepezil treatment (5 mg). After baseline assessment, participants were randomized into Drug (n = 12) or Placebo (n = 11) groups, and retested after approximately 6 weeks. Cognitive assessment included: (a) attention tasks (Foreperiod Effect, Attentional Blink, and Covert Orienting tasks) measuring processing speed, top-down accuracy, orienting, intra-individual variability, and fatigue; (b) global measures (Alzheimer's Disease Assessment Scale-Cognitive Subscale, Mini-Mental Status Examination, Dementia Rating Scale); and (c) domain-specific measures (memory, language, visuospatial, and executive function). The Drug but not the Placebo group showed benefits of treatment on high-load measures by preserving top-down accuracy, improving intra-individual variability, and averting fatigue. In contrast, the global and cognitive domain-specific measures could not detect treatment effects over the same treatment interval. This pilot study suggests that attention measures targeting accuracy, variability, and fatigue under high-load conditions could be sensitive to short-term cholinergic treatment. Given the central role of acetylcholine in attentional function, load-dependent attentional measures may be valuable cognitive markers of early treatment response.

  3. Interactions between serum leptin, the insulin-like growth factor-I system, and sex, age, anthropometric and body composition variables in a healthy population randomly selected.

    PubMed

    Gómez, José Manuel; Maravall, Francisco Javier; Gómez, Núria; Navarro, Miguel Angel; Casamitjana, Roser; Soler, Juan

    2003-02-01

    Leptin secretion is influenced by many factors and the GH/IGF axis plays an important role in the regulation of body composition, but the physiological interactions between leptin and the IGF-I system remain unknown. In this study we investigated the relationship between leptin, the IGF-I system, and sex, age, anthropometric and body composition variables in a randomly selected group of healthy adults. A cross-sectional study was performed. The study included 268 subjects, representative of the whole population of the city of L'Hospitalet de Llobregat in sex and age distribution: 134 men aged 41.4 years, range 15-70 years; and 134 women, aged 40.7 years, range 15-70 years. Body mass index (BMI) was calculated, and body composition was determined by using a bioelectrical impedance analyser. Serum leptin concentrations were determined by using a radioimmunoassay (RIA). Serum total IGF-I concentrations, after acid-ethanol extraction, were also measured by RIA. Serum free IGF-I concentrations were determined by an enzymoimmunometric assay. Serum IGFBP3 concentrations were determined by RIA. Plasma basal TSH concentrations were determined by a specific electrochemiluminescence assay. In men the BMI was similar across all decades and the waist/hip ratio increased in the last three decades. Fat-free mass decreased by decade. We observed an increase in leptin in the fourth decade with a decrease in IGF-I, free IGF-I and IGFBP3 throughout the decades. Basal TSH showed an increase in the last two decades. In women, BMI, waist/hip ratio and fat mass increased significantly in the last decades. Leptin concentrations increased in the last decades and total IGF-I, free IGF-I and IGFBP3 decreased by decade without changes in basal TSH concentration. In men, there was a positive correlation between leptin and BMI, waist/hip ratio, total body water, fat-free mass and fat mass, and these anthropometric and body composition variables showed a negative correlation with free IGF-I and IGFBP3, without any

  4. Analog model for quantum gravity effects: phonons in random fluids.

    PubMed

    Krein, G; Menezes, G; Svaiter, N F

    2010-09-24

    We describe an analog model for quantum gravity effects in condensed matter physics. The situation discussed is that of phonons propagating in a fluid with a random velocity wave equation. We consider that there are random fluctuations in the reciprocal of the bulk modulus of the system and study free phonons in the presence of Gaussian colored noise with zero mean. We show that, in this model, after performing the random averages over the noise function a free conventional scalar quantum field theory describing free phonons becomes a self-interacting model.
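
Aside: the noise entering this model is Gaussian, colored, and zero mean. A standard numerical recipe for such noise is an Ornstein-Uhlenbeck recursion with exponential correlation; the sketch below is a generic illustration with arbitrary correlation time and amplitude, and is not taken from the paper.

```python
import numpy as np

def ou_colored_noise(n_steps, dt, tau, sigma, rng=None):
    """Zero-mean Gaussian colored noise via an Ornstein-Uhlenbeck process:
    <xi(t)> = 0,  <xi(t) xi(t')> = sigma^2 * exp(-|t - t'| / tau).
    (Generic construction; parameters here are illustrative.)"""
    rng = np.random.default_rng(rng)
    xi = np.empty(n_steps)
    xi[0] = rng.normal(0.0, sigma)
    a = np.exp(-dt / tau)                # exact one-step decay factor
    b = sigma * np.sqrt(1.0 - a * a)     # keeps the stationary variance sigma^2
    for i in range(1, n_steps):
        xi[i] = a * xi[i - 1] + b * rng.standard_normal()
    return xi

noise = ou_colored_noise(n_steps=10_000, dt=0.01, tau=0.5, sigma=0.2, rng=0)
print("sample mean (should be ~0):", noise.mean())
print("sample variance (should be ~0.04):", noise.var())
```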

  5. Cruciferous Vegetables Have Variable Effects on Biomarkers of Systemic Inflammation in a Randomized Controlled Trial in Healthy Young Adults12

    PubMed Central

    Navarro, Sandi L.; Schwarz, Yvonne; Song, Xiaoling; Wang, Ching-Yun; Chen, Chu; Trudo, Sabrina P.; Kristal, Alan R.; Kratz, Mario; Eaton, David L.; Lampe, Johanna W.

    2014-01-01

    Background: Isothiocyanates in cruciferous vegetables modulate signaling pathways critical to carcinogenesis, including nuclear factor kappa-light-chain-enhancer of activated B cells (NF-κB), a central regulator of inflammation. Glutathione S-transferase (GST) M1 and GSTT1 metabolize isothiocyanates; genetic variants may result in differences in biologic response. Objective: The objective of this study was to test whether consumption of cruciferous or cruciferous plus apiaceous vegetables altered serum concentrations of interleukin (IL)-6, IL-8, C-reactive protein (CRP), tumor necrosis factor (TNF) α, and soluble TNF receptor (sTNFR) I and II, and whether this response was GSTM1/GSTT1 genotype dependent. Methods: In a randomized crossover trial, healthy men (n = 32) and women (n = 31) aged 20–40 y consumed 4 14-d controlled diets: basal (vegetable-free), single-dose cruciferous (1xC) [7 g vegetables/kg body weight (BW)], double-dose cruciferous (2xC) (14 g/kg BW), and cruciferous plus apiaceous (carrot family) (1xC+A) vegetables (7 and 4 g/kg BW, respectively), with a 21-d washout period between each intervention. Urinary isothiocyanate excretion was also evaluated as a marker of systemic isothiocyanate exposure. Fasting morning blood and urine samples were collected on days 0 and 14 and analyzed. Results: IL-6 concentrations were significantly lower on day 14 of the 2xC and 1xC+A diets than with the basal diet [−19% (95% CI: −30%, −0.1%) and −20% (95% CI: −31%, −0.7%), respectively]. IL-8 concentrations were higher after the 1xC+A diet (+16%; 95% CI: 4.2%, 35.2%) than after the basal diet. There were no effects of diet on CRP, TNF-α, or sTNFRI or II. There were significant differences between GSTM1-null/GSTT1+ individuals for several biomarkers in response to 1xC+A compared with basal diets (CRP: −37.8%; 95% CI: −58.0%, −7.4%; IL-6: −48.6%; 95% CI: −49.6%, −12.0%; IL-8: 16.3%; 95% CI: 6.7%, 57.7%) and with the 2xC diet compared with the

  6. Impact of Flavonols on Cardiometabolic Biomarkers: A Meta-Analysis of Randomized Controlled Human Trials to Explore the Role of Inter-Individual Variability

    PubMed Central

    Menezes, Regina; Rodriguez-Mateos, Ana; Kaltsatou, Antonia; González-Sarrías, Antonio; Greyling, Arno; Giannaki, Christoforos; Andres-Lacueva, Cristina; Milenkovic, Dragan; Gibney, Eileen R.; Dumont, Julie; Schär, Manuel; Garcia-Aloy, Mar; Palma-Duran, Susana Alejandra; Ruskovska, Tatjana; Maksimova, Viktorija; Combet, Emilie; Pinto, Paula

    2017-01-01

    Several epidemiological studies have linked flavonols with decreased risk of cardiovascular disease (CVD). However, some heterogeneity in the individual physiological responses to the consumption of these compounds has been identified. This meta-analysis aimed to study the effect of flavonol supplementation on biomarkers of CVD risk such as blood lipids, blood pressure and plasma glucose, as well as factors affecting their inter-individual variability. Data from 18 human randomized controlled trials were pooled and the effect was estimated using a fixed- or random-effects meta-analysis model and reported as difference in means (DM). Variability in the response of blood lipids to supplementation with flavonols was assessed by stratifying various population subgroups: age, sex, country, and health status. Results showed significant reductions in total cholesterol (DM = −0.10 mmol/L; 95% CI: −0.20, −0.01), LDL cholesterol (DM = −0.14 mmol/L; 95% CI: −0.21, −0.07), and triacylglycerol (DM = −0.10 mmol/L; 95% CI: −0.18, −0.03), and a significant increase in HDL cholesterol (DM = 0.05 mmol/L; 95% CI: 0.02, 0.07). A significant reduction was also observed in fasting plasma glucose (DM = −0.18 mmol/L; 95% CI: −0.29, −0.08), and in blood pressure (SBP: DM = −4.84 mmHg; 95% CI: −5.64, −4.04; DBP: DM = −3.32 mmHg; 95% CI: −4.09, −2.55). Subgroup analysis showed a more pronounced effect of flavonol intake in participants from Asian countries and in participants with diagnosed disease or dyslipidemia, compared with healthy participants and normal baseline values. In conclusion, flavonol consumption improved biomarkers of CVD risk; however, country of origin and health status may influence the effect of flavonol intake on blood lipid levels. PMID:28208791
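
Aside: the pooled "difference in means" (DM) figures quoted above come from inverse-variance weighting. Below is a generic sketch of fixed- and random-effects pooling with the DerSimonian-Laird tau-squared estimator; the three trials and their numbers are hypothetical, not data from this meta-analysis.

```python
import numpy as np

def pool_difference_in_means(dm, se, random_effects=True):
    """Inverse-variance pooled difference in means (DM) across trials,
    with DerSimonian-Laird tau^2 for the random-effects model."""
    w = 1.0 / se**2
    fixed = np.sum(w * dm) / np.sum(w)
    q = np.sum(w * (dm - fixed) ** 2)             # Cochran's Q heterogeneity
    if random_effects:
        df = len(dm) - 1
        c = np.sum(w) - np.sum(w**2) / np.sum(w)
        tau2 = max(0.0, (q - df) / c)             # between-trial variance
        w = 1.0 / (se**2 + tau2)
    est = np.sum(w * dm) / np.sum(w)
    se_pooled = 1.0 / np.sqrt(np.sum(w))
    return est, (est - 1.96 * se_pooled, est + 1.96 * se_pooled)

# Illustrative LDL-cholesterol differences (mmol/L) from three hypothetical trials
dm = np.array([-0.20, -0.10, -0.12])
se = np.array([0.05, 0.04, 0.06])
print(pool_difference_in_means(dm, se))
```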

  7. Effects of Aerobic Plus Resistance Exercise on Body Composition Related Variables in Pediatric Obesity: A Systematic Review and Meta-Analysis of Randomized Controlled Trials.

    PubMed

    García-Hermoso, Antonio; Sánchez-López, Mairena; Martínez-Vizcaíno, Vicente

    2015-11-01

    The purpose of this meta-analysis of randomized trials was to determine the effectiveness of aerobic plus resistance exercise interventions on body composition-related variables in overweight and obese youth. A computerized search was made of 7 databases. The analysis was restricted to randomized controlled trials that examined the effect of aerobic and resistance exercise on body composition (body weight, body mass index, fat mass, fat-free mass, and waist circumference) in obese youth. Two independent reviewers screened studies and extracted data. Weighted mean differences (WMD) and 95% confidence intervals were calculated. Nine studies were selected for meta-analysis as they fulfilled the inclusion criteria (n = 365). Aerobic plus resistance exercise interventions (8-24 weeks duration) produced a decrease in body weight (WMD = −3.31 kg), body mass index (WMD = −1.05 kg/m²), and fat mass (WMD = −1.93% and −5.05 kg), but changes in fat-free mass and waist circumference were not observed. These changes were accentuated through programs of at least 60 min of exercise per session, generating greater reductions in body weight (WMD = −4.11 kg) and fat mass (WMD = −4.07%), and an increase in fat-free mass (WMD = 2.45 kg). This meta-analysis provides insight into the effectiveness of short-term aerobic plus resistance exercise interventions for decreasing body weight, body mass index, and fat mass in pediatric obesity.

  8. Operant Variability: Some Random Thoughts

    ERIC Educational Resources Information Center

    Marr, M. Jackson

    2012-01-01

    Barba's (2012) paper is a serious and thoughtful analysis of a vexing problem in behavior analysis: Just what should count as an operant class and how do people know? The slippery issue of a "generalized operant" or functional response class illustrates one aspect of this problem, and "variation" or "novelty" as an operant appears to fall into…

  9. Variations of high frequency parameter of heart rate variability following osteopathic manipulative treatment in healthy subjects compared to control group and sham therapy: randomized controlled trial.

    PubMed

    Ruffini, Nuria; D'Alessandro, Giandomenico; Mariani, Nicolò; Pollastrelli, Alberto; Cardinali, Lucia; Cerritelli, Francesco

    2015-01-01

    Heart Rate Variability (HRV) indicates how heart rate changes in response to inner and external stimuli. HRV is linked to health status and it is an indirect marker of the autonomic nervous system (ANS) function. To investigate the influence of osteopathic manipulative treatment (OMT) on cardiac autonomic modulation in healthy subjects, compared with sham therapy and control group. Sixty-six healthy subjects, both male and female, were included in the present 3-armed, randomized, placebo-controlled, within-subject, cross-over, single-blinded study. Participants were asymptomatic adults (26.7 ± 8.4 y, 51% male, BMI 18.5 ± 4.8), both smokers and non-smokers and not on medications. At enrollment subjects were randomized into three groups: A, B, C. Standardized structural evaluation followed by a patient need-based osteopathic treatment was performed in the first session of group A and in the second session of group B. Standardized evaluation followed by a protocoled sham treatment was provided in the second session of group A and in the first session of group B. No intervention was performed in the two sessions of group C, acting as a time-control. The trial was registered on clinicaltrials.gov identifier: NCT01908920. HRV was calculated from electrocardiography before, during and after the intervention, for a total time of 25 min, considering frequency domain as well as linear and non-linear methods as outcome measures. OMT engendered a statistically significant increase of parasympathetic activity, as shown by High Frequency power (p < 0.001), expressed in normalized and absolute unit, and possibly decrease of sympathetic activity, as revealed by Low Frequency power (p < 0.01); results also showed a reduction of Low Frequency/High Frequency ratio (p < 0.001) and Detrended fluctuation scaling exponent (p < 0.05). Findings suggested that OMT can influence ANS activity increasing parasympathetic function and decreasing sympathetic activity, compared to sham therapy.
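
Aside: the frequency-domain HRV quantities used here (HF and LF power and their ratio) are conventionally computed from an evenly resampled RR-interval series. The sketch below is one common recipe (cubic interpolation to 4 Hz, Welch periodogram, band integration), not the authors' exact pipeline; the synthetic RR series is invented for illustration.

```python
import numpy as np
from scipy.signal import welch
from scipy.interpolate import interp1d

def lf_hf_power(rr_ms, fs=4.0):
    """Frequency-domain HRV from RR intervals (ms): LF (0.04-0.15 Hz) and
    HF (0.15-0.40 Hz) power via Welch's method on the resampled tachogram."""
    t = np.cumsum(rr_ms) / 1000.0                 # beat times in seconds
    grid = np.arange(t[0], t[-1], 1.0 / fs)       # even resampling at fs Hz
    tachogram = interp1d(t, rr_ms, kind="cubic")(grid)
    f, pxx = welch(tachogram - tachogram.mean(), fs=fs, nperseg=256)
    df = f[1] - f[0]
    band = lambda lo, hi: pxx[(f >= lo) & (f < hi)].sum() * df
    lf, hf = band(0.04, 0.15), band(0.15, 0.40)
    return lf, hf, lf / hf

# Synthetic RR series with a respiratory (HF-band, ~0.25 Hz) oscillation
rng = np.random.default_rng(0)
beats = np.arange(600)
rr = 800 + 40 * np.sin(2 * np.pi * 0.25 * beats * 0.8) \
         + 10 * rng.standard_normal(600)
print("LF, HF, LF/HF:", lf_hf_power(rr))
```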

  10. Comparison of the effect of two sugar-substituted chewing gums on different caries- and gingivitis-related variables: a double-blind, randomized, controlled clinical trial.

    PubMed

    Martínez-Pabón, María C; Duque-Agudelo, Lucas; Díaz-Gil, Juan D; Isaza-Guzmán, Diana M; Tobón-Arroyave, Sergio I

    2014-01-01

    The aim of this study was to compare the effect of two sugar-substituted chewing gums, in addition to toothbrushing, on different clinical, microbiological, and biochemical caries- and gingivitis-related variables. The study was designed as a double-blind, randomized, controlled trial with three parallel arms. A total of 130 dental students, who volunteered after signing an informed consent, were randomly allocated to receive one of the following interventions: hexitol-sweetened gum containing casein phosphopeptide-amorphous calcium phosphate (CPP-ACP), pentitol-sweetened gum containing no CPP-ACP, and control group with no gum. Subjects within the experimental groups chewed two gum pellets for 20 min three times a day after meals. The daily consumption level of both polyols was 6.0 g. Clinical examinations and salivary samplings were conducted at baseline and after 30 days of gum use. Pre- and post-intervention stimulated whole saliva samples were quantified for calcium/phosphate ionic concentration, total facultative bacterial load, Streptococcus mutans/Lactobacillus spp. counts, and Gram-negative percentage. A statistically significant reduction in visible plaque score was displayed in the hexitol/CPP-ACP gum group after the intervention when compared with baseline, but the magnitude of the effect was of the same order as the differences between the groups at baseline. A similar tendency was seen in both the pentitol/non-CPP-ACP gum and control groups regarding total salivary facultative bacterial load and S. mutans count, but median values of these parameters were more significantly reduced in the pentitol/non-CPP-ACP gum group in comparison with those of the control group. Alterations of salivary Lactobacillus spp. were demonstrated only in the pentitol/non-CPP-ACP gum group. Although these findings might indicate that a 30-day protocol of daily chewing of pentitol-sweetened gum containing no CPP-ACP might have some reducing effect on the salivary levels of facultative

  11. Effects of Person-Centered Physical Therapy on Fatigue-Related Variables in Persons With Rheumatoid Arthritis: A Randomized Controlled Trial.

    PubMed

    Feldthusen, Caroline; Dean, Elizabeth; Forsblad-d'Elia, Helena; Mannerkorpi, Kaisa

    2016-01-01

    To examine effects of person-centered physical therapy on fatigue and related variables in persons with rheumatoid arthritis (RA). Randomized controlled trial. Hospital outpatient rheumatology clinic. Persons with RA aged 20 to 65 years (N=70): intervention group (n=36) and reference group (n=34). The 12-week intervention, with 6-month follow-up, focused on partnership between participant and physical therapist and tailored health-enhancing physical activity and balancing life activities. The reference group continued with regular activities; both groups received usual health care. Primary outcome was general fatigue (visual analog scale). Secondary outcomes included multidimensional fatigue (Bristol Rheumatoid Arthritis Fatigue Multi-Dimensional Questionnaire) and fatigue-related variables (ie, disease, health, function). At posttest, general fatigue improved more in the intervention group than in the reference group (P=.042). Improvement in median general fatigue reached minimal clinically important differences between and within groups at posttest and follow-up. Improvement was also observed for anxiety (P=.0099), and trends toward improvements were observed for most multidimensional aspects of fatigue (P=.023-.048), leg strength/endurance (P=.024), and physical activity (P=.023). Compared with the reference group at follow-up, improvement was observed in the intervention group for leg strength/endurance (P=.001), and the trends toward improvements persisted for physical (P=.041) and living-related (P=.031) aspects of fatigue, physical activity (P=.019), anxiety (P=.015), self-rated health (P=.010), and self-efficacy (P=.046). Person-centered physical therapy focused on health-enhancing physical activity and balancing life activities showed significant benefits on fatigue in persons with RA. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  12. Sound propagation through a variable area duct - Experiment and theory

    NASA Technical Reports Server (NTRS)

    Silcox, R. J.; Lester, H. C.

    1981-01-01

    A comparison of experiment and theory has been made for the propagation of sound through a variable area axisymmetric duct with zero mean flow. Measurement of the acoustic pressure field on both sides of the constricted test section was resolved on a modal basis for various spinning mode sources. Transmitted and reflected modal amplitudes and phase angles were compared with finite element computations. Good agreement between experiment and computation was obtained over a wide range of frequencies and modal transmission variations. The study suggests that modal transmission through a variable area duct is governed by the throat modal cut-off ratio.
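
Aside: for a hard-walled circular duct with zero mean flow, the cut-off ratio invoked in this abstract is conventionally defined from the textbook cut-off frequency of the (m,n) spinning mode. The notation below is the standard one and is assumed here; it is not taken from the paper itself.

```latex
% Standard hard-walled circular duct result (zero mean flow); a = duct
% radius, c = sound speed, j'_{mn} = n-th zero of J'_m, the derivative
% of the Bessel function of the first kind of order m.
\[
  f_{c,mn} = \frac{j'_{mn}\, c}{2\pi a},
  \qquad
  \xi_{mn} = \frac{f}{f_{c,mn}}
\]
% The (m,n) mode is cut-on (propagates) for \xi_{mn} > 1 and decays
% evanescently for \xi_{mn} < 1.
```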

  13. Statistical variability study of random dopant fluctuation on gate-all-around inversion-mode silicon nanowire field-effect transistors

    NASA Astrophysics Data System (ADS)

    Yoon, Jun-Sik; Rim, Taiuk; Kim, Jungsik; Kim, Kihyun; Baek, Chang-Ki; Jeong, Yoon-Ha

    2015-03-01

    Random dopant fluctuation effects of gate-all-around inversion-mode silicon nanowire field-effect transistors (FETs) with different diameters and extension lengths are investigated. The nanowire FETs with smaller diameter and longer extension length reduce the average values and variations of subthreshold swing and drain-induced barrier lowering, thus improving short channel immunity. Relative variations of the drain currents increase as the diameter decreases because of decreased current drivability from narrower channel cross-sections. Absolute variations of the drain currents decrease markedly as the extension length increases, due to the decreasing number of arsenic dopants penetrating into the channel region. To understand the origins of drain current variability, variations of source/drain series resistance and low-field mobility are investigated; both parameters affect the variations of the drain currents concurrently. Nanowire FETs having extension lengths sufficient to prevent dopant penetration into the channel regions, while maintaining relatively large cross-sections, are suggested to achieve suitable short channel immunity and small variations of the drain currents.
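
Aside: the link between dopant count and variability follows from counting statistics: the number of dopants in a fixed volume is approximately Poisson distributed, so the relative fluctuation grows as 1/sqrt(N) as devices shrink. The sketch below illustrates only this scaling argument, not the paper's device simulations.

```python
import numpy as np

# Dopant-number statistics: the count of dopants in a given channel volume
# is approximately Poisson, so the relative fluctuation scales as 1/sqrt(N).
rng = np.random.default_rng(0)
for mean_dopants in (100, 25, 5):        # shrinking device -> fewer dopants
    counts = rng.poisson(mean_dopants, size=100_000)
    rel_fluct = counts.std() / counts.mean()
    print(f"<N> = {mean_dopants:4d}  relative fluctuation = {rel_fluct:.3f} "
          f"(1/sqrt(N) = {1 / np.sqrt(mean_dopants):.3f})")
```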

  14. Heart rate variability during acute psychosocial stress: A randomized cross-over trial of verbal and non-verbal laboratory stressors.

    PubMed

    Brugnera, Agostino; Zarbo, Cristina; Tarvainen, Mika P; Marchettini, Paolo; Adorni, Roberta; Compare, Angelo

    2018-05-01

    Acute psychosocial stress is typically investigated in laboratory settings using protocols with distinctive characteristics. For example, some tasks involve the action of speaking, which seems to alter Heart Rate Variability (HRV) through acute changes in respiration patterns. However, it is still unknown which task induces the strongest subjective and autonomic stress response. The present cross-over randomized trial sought to investigate the differences in perceived stress and in linear and non-linear analyses of HRV between verbal (Speech and Stroop) and non-verbal (Montreal Imaging Stress Task; MIST) stress tasks, in a sample of 60 healthy adults (51.7% females; mean age = 25.6 ± 3.83 years). Analyses were run controlling for respiration rates. Participants reported similar levels of perceived stress across the three tasks. However, MIST induced a stronger cardiovascular response than the Speech and Stroop tasks, even after controlling for respiration rates. Finally, women reported higher levels of perceived stress and lower HRV both at rest and in response to acute psychosocial stressors, compared to men. Taken together, our results suggest the presence of gender-related differences during psychophysiological experiments on stress. They also suggest that verbal activity masked the vagal withdrawal through altered respiration patterns imposed by speaking. Therefore, our findings support the use of a highly standardized math task, such as the MIST, as a valid and reliable alternative to verbal protocols during laboratory studies on stress. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. Effect of Transcutaneous Acupoint Electrical Stimulation on Post-Hemorrhoidectomy-Associated Pain, Anxiety, and Heart Rate Variability: A Randomized-Controlled Study.

    PubMed

    Yeh, Mei-Ling; Chung, Yu-Chu; Hsu, Lun-Chia; Hung, Shuo-Hui

    2018-05-01

    Hemorrhoidectomy is the current best treatment for severe hemorrhoids, but it causes significant postoperative pain and anxiety, which is associated with heart rate variability (HRV). Transcutaneous acupoint electrical stimulation (TAES) was assumed to alleviate pain and anxiety, and modify the autonomic nervous system. This study aimed to examine the effects of TAES intervention on postoperative pain, anxiety, and HRV in patients who received a hemorrhoidectomy. A randomized-controlled trial with five repeated measures was conducted. The TAES group (n = 39) received four 20-min sessions of electrical stimulation at chengshan (BL57) and erbai (EX-UE2) after hemorrhoidectomy, whereas the control group (n = 41) did not. Data were collected using Visual Analogue Scale (VAS), State Anxiety Inventory (STAI), and HRV physiological signal monitor. TAES resulted in a significant group difference in pain scores, anxiety levels, and some HRV parameters. The findings indicate that TAES can help reduce pain and anxiety associated with hemorrhoidectomy. TAES is a noninvasive, simple, and convenient modality for post-hemorrhoidectomy-associated pain control and anxiety reduction.

  16. Heart rate variability is enhanced in controls but not maladaptive perfectionists during brief mindfulness meditation following stress-induction: A stratified-randomized trial.

    PubMed

    Azam, Muhammad Abid; Katz, Joel; Fashler, Samantha R; Changoor, Tina; Azargive, Saam; Ritvo, Paul

    2015-10-01

    Heart rate variability (HRV) is a vagal nerve-mediated biomarker of cardiac function used to investigate chronic illness, psychopathology, stress and, more recently, attention-regulation processes such as meditation. This study investigated HRV in relation to maladaptive perfectionism, a stress-related personality factor, and mindfulness meditation, a stress coping practice expected to elevate HRV, and thereby promote relaxation. Maladaptive perfectionists (n=21) and Controls (n=39) were exposed to a lab-based assessment in which HRV was measured during (1) a 5-minute baseline resting phase, (2) a 5-minute cognitive stress-induction phase, and (3) a post-stress phase. In the post-stress phase, participants were randomly assigned to a 10-minute audio-instructed mindfulness meditation condition or a 10-minute rest condition with audio-description of mindfulness meditation. Analyses revealed a significant elevation in HRV during meditation for Controls but not for Perfectionists. These results suggest that mindfulness meditation promotes relaxation following cognitive stress and that the perfectionist personality hinders relaxation possibly because of decreased cardiac vagal tone. The results are discussed in the context of developing psychophysiological models to advance therapeutic interventions for distressed populations. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Variations of high frequency parameter of heart rate variability following osteopathic manipulative treatment in healthy subjects compared to control group and sham therapy: randomized controlled trial

    PubMed Central

    Ruffini, Nuria; D'Alessandro, Giandomenico; Mariani, Nicolò; Pollastrelli, Alberto; Cardinali, Lucia; Cerritelli, Francesco

    2015-01-01

    Context: Heart Rate Variability (HRV) indicates how heart rate changes in response to inner and external stimuli. HRV is linked to health status and it is an indirect marker of the autonomic nervous system (ANS) function. Objective: To investigate the influence of osteopathic manipulative treatment (OMT) on cardiac autonomic modulation in healthy subjects, compared with sham therapy and control group. Methods: Sixty-six healthy subjects, both male and female, were included in the present 3-armed, randomized, placebo-controlled, within-subject, cross-over, single-blinded study. Participants were asymptomatic adults (26.7 ± 8.4 y, 51% male, BMI 18.5 ± 4.8), both smokers and non-smokers and not on medications. At enrollment subjects were randomized into three groups: A, B, C. Standardized structural evaluation followed by a patient need-based osteopathic treatment was performed in the first session of group A and in the second session of group B. Standardized evaluation followed by a protocoled sham treatment was provided in the second session of group A and in the first session of group B. No intervention was performed in the two sessions of group C, acting as a time-control. The trial was registered on clinicaltrials.gov identifier: NCT01908920. Main Outcome Measures: HRV was calculated from electrocardiography before, during and after the intervention, for a total time of 25 min, considering frequency domain as well as linear and non-linear methods as outcome measures. Results: OMT engendered a statistically significant increase of parasympathetic activity, as shown by High Frequency power (p < 0.001), expressed in normalized and absolute unit, and possibly decrease of sympathetic activity, as revealed by Low Frequency power (p < 0.01); results also showed a reduction of Low Frequency/High Frequency ratio (p < 0.001) and Detrended fluctuation scaling exponent (p < 0.05). Conclusions: Findings suggested that OMT can influence ANS activity increasing parasympathetic function and decreasing sympathetic activity, compared to sham therapy.

  18. Impact of resistance training on body composition and metabolic syndrome variables during androgen deprivation therapy for prostate cancer: a pilot randomized controlled trial.

    PubMed

    Dawson, Jacqueline K; Dorff, Tanya B; Todd Schroeder, E; Lane, Christianne J; Gross, Mitchell E; Dieli-Conwright, Christina M

    2018-04-03

    Prostate cancer patients on androgen deprivation therapy (ADT) experience adverse effects such as lean mass loss, known as sarcopenia, fat gain, and changes in cardiometabolic factors that increase risk of metabolic syndrome (MetS). Resistance training can increase lean mass, reduce body fat, and improve physical function and quality of life, but no exercise interventions in prostate cancer patients on ADT have concomitantly improved body composition and MetS. This pilot trial investigated 12 weeks of resistance training on body composition and MetS changes in prostate cancer patients on ADT. An exploratory aim examined if a combined approach of training and protein supplementation would elicit greater changes in body composition. Prostate cancer patients on ADT were randomized to resistance training and protein supplementation (TRAINPRO), resistance training (TRAIN), protein supplementation (PRO), or control stretching (STRETCH). Exercise groups (EXE = TRAINPRO, TRAIN) performed supervised exercise 3 days per week for 12 weeks, while non-exercise groups (NoEXE = PRO, STRETCH) performed a home-based stretching program. TRAINPRO and PRO received 50 g·day⁻¹ of whey protein. The primary outcome was change in lean mass assessed through dual-energy X-ray absorptiometry. Secondary outcomes examined changes in sarcopenia, assessed through appendicular skeletal mass (ASM) index (kg/m²), body fat %, strength, physical function, quality of life, MetS score and the MetS components of waist circumference, blood pressure, glucose, high-density lipoprotein-cholesterol, and triglyceride levels. A total of 37 participants were randomized; 32 participated in the intervention (EXE n = 13; NoEXE n = 19). At baseline, 43.8% of participants were sarcopenic and 40.6% met the criteria for MetS. Post-intervention, EXE significantly improved lean mass (d = 0.9), sarcopenia prevalence (d = 0.8), body fat % (d = 1.1), strength (d = 0.8-3.0), and

  19. Rye-Based Evening Meals Favorably Affected Glucose Regulation and Appetite Variables at the Following Breakfast; A Randomized Controlled Study in Healthy Subjects.

    PubMed

    Sandberg, Jonna C; Björck, Inger M E; Nilsson, Anne C

    2016-01-01

    Whole grain has shown potential to prevent obesity, cardiovascular disease and type 2 diabetes. Possible mechanism could be related to colonic fermentation of specific indigestible carbohydrates, i.e. dietary fiber (DF). The aim of this study was to investigate effects on cardiometabolic risk factors and appetite regulation the next day when ingesting rye kernel bread rich in DF as an evening meal. Whole grain rye kernel test bread (RKB) or a white wheat flour based bread (reference product, WWB) was provided as late evening meals to healthy young adults in a randomized cross-over design. The test products RKB and WWB were provided in two priming settings: as a single evening meal or as three consecutive evening meals prior to the experimental days. Test variables were measured in the morning, 10.5-13.5 hours after ingestion of RKB or WWB. The postprandial phase was analyzed for measures of glucose metabolism, inflammatory markers, appetite regulating hormones and short chain fatty acids (SCFA) in blood, hydrogen excretion in breath and subjective appetite ratings. With the exception of serum CRP, no significant differences in test variables were observed depending on length of priming (P>0.05). The RKB evening meal increased plasma concentrations of PYY (0-120 min, P<0.001), GLP-1 (0-90 min, P<0.05) and fasting SCFA (acetate and butyrate, P<0.05, propionate, P = 0.05), compared to WWB. Moreover, RKB decreased blood glucose (0-120 min, P = 0.001), serum insulin response (0-120 min, P<0.05) and fasting FFA concentrations (P<0.05). Additionally, RKB improved subjective appetite ratings during the whole experimental period (P<0.05), and increased breath hydrogen excretion (P<0.001), indicating increased colonic fermentation activity. The results indicate that RKB evening meal has an anti-diabetic potential and that the increased release of satiety hormones and improvements of appetite sensation could be beneficial in preventing obesity. These effects could possibly be

  20. Rye-Based Evening Meals Favorably Affected Glucose Regulation and Appetite Variables at the Following Breakfast; A Randomized Controlled Study in Healthy Subjects

    PubMed Central

    Sandberg, Jonna C.; Björck, Inger M. E.; Nilsson, Anne C.

    2016-01-01

    Background: Whole grain has shown potential to prevent obesity, cardiovascular disease and type 2 diabetes. A possible mechanism could be related to colonic fermentation of specific indigestible carbohydrates, i.e. dietary fiber (DF). The aim of this study was to investigate effects on cardiometabolic risk factors and appetite regulation the next day when ingesting rye kernel bread rich in DF as an evening meal. Method: Whole grain rye kernel test bread (RKB) or a white wheat flour based bread (reference product, WWB) was provided as late evening meals to healthy young adults in a randomized cross-over design. The test products RKB and WWB were provided in two priming settings: as a single evening meal or as three consecutive evening meals prior to the experimental days. Test variables were measured in the morning, 10.5–13.5 hours after ingestion of RKB or WWB. The postprandial phase was analyzed for measures of glucose metabolism, inflammatory markers, appetite regulating hormones and short chain fatty acids (SCFA) in blood, hydrogen excretion in breath and subjective appetite ratings. Results: With the exception of serum CRP, no significant differences in test variables were observed depending on length of priming (P>0.05). The RKB evening meal increased plasma concentrations of PYY (0–120 min, P<0.001), GLP-1 (0–90 min, P<0.05) and fasting SCFA (acetate and butyrate, P<0.05, propionate, P = 0.05), compared to WWB. Moreover, RKB decreased blood glucose (0–120 min, P = 0.001), serum insulin response (0–120 min, P<0.05) and fasting FFA concentrations (P<0.05). Additionally, RKB improved subjective appetite ratings during the whole experimental period (P<0.05), and increased breath hydrogen excretion (P<0.001), indicating increased colonic fermentation activity. Conclusion: The results indicate that RKB evening meal has an anti-diabetic potential and that the increased release of satiety hormones and improvements of appetite sensation could be beneficial in preventing obesity.

  1. A randomized trial of high-dairy-protein, variable-carbohydrate diets and exercise on body composition in adults with obesity.

    PubMed

    Parr, Evelyn B; Coffey, Vernon G; Cato, Louise E; Phillips, Stuart M; Burke, Louise M; Hawley, John A

    2016-05-01

    This study determined the effects of 16-week high-dairy-protein, variable-carbohydrate (CHO) diets and exercise training (EXT) on body composition in men and women with overweight/obesity. One hundred and eleven participants (age 47 ± 6 years, body mass 90.9 ± 11.7 kg, BMI 33 ± 4 kg/m², values mean ± SD) were randomly stratified to diets with either: high dairy protein, moderate CHO (40% CHO: 30% protein: 30% fat; ∼4 dairy servings); high dairy protein, high CHO (55%: 30%: 15%; ∼4 dairy servings); or control (55%: 15%: 30%; ∼1 dairy serving). Energy restriction (500 kcal/day) was achieved through diet (∼250 kcal/day) and EXT (∼250 kcal/day). Body composition was measured using dual-energy X-ray absorptiometry before, midway, and upon completion of the intervention. Eighty-nine (25 M/64 F) of 115 participants completed the 16-week intervention, losing 7.7 ± 3.2 kg fat mass (P < 0.001) and gaining 0.50 ± 1.75 kg lean mass (P < 0.01). There was no difference in the changes in body composition (fat mass or lean mass) between groups. Compared to a healthy control diet, energy-restricted high-protein diets containing different proportions of fat and CHO confer no advantage to weight loss or change in body composition in the presence of an appropriate exercise stimulus. © 2016 The Obesity Society.

  2. Clustering according to urolithin metabotype explains the interindividual variability in the improvement of cardiovascular risk biomarkers in overweight-obese individuals consuming pomegranate: A randomized clinical trial.

    PubMed

    González-Sarrías, Antonio; García-Villalba, Rocío; Romo-Vaquero, María; Alasalvar, Cesarettin; Örem, Asim; Zafrilla, Pilar; Tomás-Barberán, Francisco A; Selma, María V; Espín, Juan Carlos

    2017-05-01

    The pomegranate lipid-lowering properties remain controversial, probably due to the interindividual variability in polyphenol (ellagitannins) metabolism. We aimed at investigating whether the microbially derived ellagitannin-metabolizing phenotypes, i.e. urolithin metabotypes A (UM-A), B (UM-B), and 0 (UM-0), influence the effects of pomegranate extract (PE) consumption on 18 cardiovascular risk biomarkers in healthy overweight-obese individuals. A double-blind, crossover, dose-response, randomized, placebo-controlled trial was conducted. The study (POMEcardio) consisted of two test phases (dose-1 and dose-2, lasting 3 weeks each) and a 3-week washout period between each phase. Forty-nine participants (BMI > 27 kg/m²) daily consumed one (dose-1, 160 mg phenolics/day) or four (dose-2, 640 mg phenolics/day) PE or placebo capsules. Notably, UM-B individuals showed the highest baseline cardiovascular risk. After dose-2, total cholesterol (−15.5 ± 3.7%), LDL-cholesterol (−14.9 ± 2.1%), small LDL-cholesterol (−47 ± 7%), non-HDL-cholesterol (−11.3 ± 2.5%), apolipoprotein-B (−12 ± 2.2%), and oxidized LDL-cholesterol (−24 ± 2.5%) dose dependently decreased (P < 0.05) but only in UM-B subjects. These effects were partially correlated with urolithin production and the increase in Gordonibacter levels. Three (50%) nonproducers (UM-0) became producers following PE consumption. UM clustering suggests a personalized effect of ellagitannin-containing foods and could explain the controversial pomegranate benefits. Research on the specific role of urolithins and the microbiota associated with each UM is warranted. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Pilot Randomized Study of a Gratitude Journaling Intervention on Heart Rate Variability and Inflammatory Biomarkers in Patients With Stage B Heart Failure.

    PubMed

    Redwine, Laura S; Henry, Brook L; Pung, Meredith A; Wilson, Kathleen; Chinh, Kelly; Knight, Brian; Jain, Shamini; Rutledge, Thomas; Greenberg, Barry; Maisel, Alan; Mills, Paul J

    2016-01-01

    Stage B, asymptomatic heart failure (HF) presents a therapeutic window for attenuating disease progression and development of HF symptoms, and improving quality of life. Gratitude, the practice of appreciating positive life features, is highly related to quality of life, leading to development of promising clinical interventions. However, few gratitude studies have investigated objective measures of physical health; most relied on self-report measures. We conducted a pilot study in Stage B HF patients to examine whether gratitude journaling improved biomarkers related to HF prognosis. Patients (n = 70; mean [standard deviation] age = 66.2 [7.6] years) were randomized to an 8-week gratitude journaling intervention or treatment as usual. Baseline (T1) assessments included the six-item Gratitude Questionnaire, resting heart rate variability (HRV), and an inflammatory biomarker index. At T2 (midintervention), the six-item Gratitude Questionnaire was measured. At T3 (postintervention), T1 measures were repeated but also included a gratitude journaling task. The gratitude intervention was associated with improved trait gratitude scores (F = 6.0, p = .017, η = 0.10), reduced inflammatory biomarker index score over time (F = 9.7, p = .004, η = 0.21), and increased parasympathetic HRV responses during the gratitude journaling task (F = 4.2, p = .036, η = 0.15), compared with treatment as usual. However, there were no resting preintervention to postintervention group differences in HRV (p values > .10). Gratitude journaling may improve biomarkers related to HF morbidity, such as reduced inflammation; large-scale studies with active control conditions are needed to confirm these findings. ClinicalTrials.gov identifier: NCT01615094.

  4. A novel aminoacid determinant of HIV-1 restriction in the TRIM5α variable 1 region isolated in a random mutagenic screen.

    PubMed

    Pham, Quang Toan; Veillette, Maxime; Brandariz-Nuñez, Alberto; Pawlica, Paulina; Thibert-Lefebvre, Caroline; Chandonnet, Nadia; Diaz-Griffero, Felipe; Berthoux, Lionel

    2013-05-01

    Human-derived antiretroviral transgenes are of great biomedical interest and are actively pursued. HIV-1 is efficiently inhibited at post-entry, pre-integration replication stages by point mutations in the variable region 1 (v1) of the human restriction factor TRIM5α. Here we use a mutated megaprimer approach to create a mutant library of TRIM5αHu v1 and to isolate a mutation at Gly330 (G330E) that inhibits transduction of an HIV-1 vector as efficiently as the previously described mutants at positions Arg332 and Arg335. As was the case for these other mutations, modification of the local v1 charge toward increased acidity was key to inhibiting HIV-1. G330E TRIM5αHu also disrupted replication-competent HIV-1 propagation in a human T cell line. Interestingly, G330E did not enhance restriction of HIV-1 when combined with mutations at Arg332 or Arg335. Accordingly, the triple mutant G330E-R332G-R335G bound purified recombinant HIV-1 capsid tubes less efficiently than the double mutant R332G-R335G did. In a structural model of the TRIM5αHu PRYSPRY domain, the addition of G330E to the double mutant R332G-R335G caused extensive changes to the capsid-binding surface, which may explain why the triple mutant was no more restrictive than the double mutant. The HIV-1 inhibitory potential of Gly330 mutants was not predicted by examination of natural TRIM5α orthologs that are known to strongly inhibit HIV-1. This work underlines the potential of random mutagenesis to isolate novel variants of human proteins with antiviral properties. Copyright © 2013 Elsevier B.V. All rights reserved.

  5. The impact of periodontal therapy and the adjunctive effect of antiseptics on breath odor-related outcome variables: a double-blind randomized study.

    PubMed

    Quirynen, Marc; Zhao, Hong; Soers, Catherine; Dekeyser, Christel; Pauwels, Martine; Coucke, Wim; Steenberghe, Daniel van

    2005-05-01

    Bad breath is often caused by periodontitis and/or tongue coating. This study followed the impact of initial periodontal therapy on several halitosis-related outcome variables over a 6-month period. Organoleptic ratings are often uncomfortable for the patient and have several disadvantages. They are, for instance, influenced by external parameters (e.g., food intake and cosmetics) and need to be calibrated among researchers worldwide. A second aim was to evaluate the reliability of saliva incubation as an in vitro indirect test for breath recording. In this double-blind, randomized, medium-term, parallel study 45 moderate periodontitis patients without obvious tongue coating were enrolled. Besides a one-stage, full-mouth disinfection and oral hygiene improvement (including daily tongue scraping), patients were instructed to rinse daily for 6 months with one of the following products (randomly allocated): chlorhexidine (CHX) 0.2% + alcohol, CHX 0.05% + cetyl pyridinium chloride (CPC) 0.05% without alcohol (a new formulation), or a placebo solution. At baseline and 3 and 6 months, a series of parameters were recorded including: concentration of volatile sulfide compounds (VSC), tongue coating, and an estimation of the microbial load (at anterior and posterior parts of the tongue, saliva, dental plaque). The intraoral VSC ratings were compared to in vitro VSC recordings and organoleptic evaluations of the headspace air from 1 and 2 hours incubated saliva (0.5 ml, 37 °C, anaerobic chamber). Even though the initial VSC values were not high (±90 ppb with only 18 patients revealing more than 100 ppb), significant (P < 0.05) reductions could be achieved in the CHX and CHX + CPC group, and to a lower extent in the placebo group (P = 0.10). Tongue scraping resulted in a significant reduction (P ≤ 0.05) of the tongue coating up to month 6 in the placebo and CHX + CPC group, but not in the CHX group (confusion due to staining). The CHX and CHX + CPC group showed

  6. Tai Chi for Reducing Dual-task Gait Variability, a Potential Mediator of Fall Risk in Parkinson’s Disease: A Pilot Randomized Controlled Trial

    PubMed Central

    Vergara-Diaz, Gloria; Osypiuk, Kamila; Hausdorff, Jeffrey M; Bonato, Paolo; Gow, Brian J; Miranda, Jose GV; Sudarsky, Lewis R; Tarsy, Daniel; Fox, Michael D; Gardiner, Paula; Thomas, Cathi A; Macklin, Eric A; Wayne, Peter M

    2018-01-01

    Objectives: To assess the feasibility and inform design features of a fully powered randomized controlled trial (RCT) evaluating the effects of Tai Chi (TC) in Parkinson’s disease (PD) and to select outcomes most responsive to TC assessed during off-medication states. Design: Two-arm, wait-list controlled RCT. Settings: Tertiary care hospital. Subjects: Thirty-two subjects aged 40–75 diagnosed with idiopathic PD within 10 years. Interventions: Six-month TC intervention added to usual care (UC) versus UC alone. Outcome Measures: Primary outcomes were feasibility-related (recruitment rate, adherence, and compliance). Change in dual-task (DT) gait stride-time variability (STV) from baseline to 6 months was defined, a priori, as the clinical outcome measure of primary interest. Other outcomes included: PD motor symptom progression (Unified Parkinson’s Disease Rating Scale [UPDRS]), PD-related quality of life (PDQ-39), executive function (Trail Making Test), balance confidence (Activity-Specific Balance Confidence Scale, ABC), and Timed Up and Go test (TUG). All clinical assessments were made in the off-state for PD medications. Results: Thirty-two subjects were enrolled into 3 sequential cohorts over 417 days at an average rate of 0.08 subjects per day. Seventy-five percent (12/16) in the TC group vs 94% (15/16) in the UC group completed the primary 6-month follow-up assessment. Mean TC exposure hours overall: 52. No AEs occurred during or as a direct result of TC exercise. Statistically nonsignificant improvements were observed in the TC group at 6 months in DT gait STV (TC [20.1%] vs UC [−0.1%] group [effect size 0.49; P = .47]), ABC, TUG, and PDQ-39. UPDRS progression was modest and very similar in TC and UC groups. Conclusions: Conducting an RCT of TC for PD is feasible, though measures to improve recruitment and adherence rates are needed. DT gait STV is a sensitive and logical outcome for evaluating the combined cognitive-motor effects of TC in PD.

  7. Tai Chi for Reducing Dual-task Gait Variability, a Potential Mediator of Fall Risk in Parkinson's Disease: A Pilot Randomized Controlled Trial.

    PubMed

    Vergara-Diaz, Gloria; Osypiuk, Kamila; Hausdorff, Jeffrey M; Bonato, Paolo; Gow, Brian J; Miranda, Jose Gv; Sudarsky, Lewis R; Tarsy, Daniel; Fox, Michael D; Gardiner, Paula; Thomas, Cathi A; Macklin, Eric A; Wayne, Peter M

    2018-01-01

    To assess the feasibility and inform design features of a fully powered randomized controlled trial (RCT) evaluating the effects of Tai Chi (TC) in Parkinson's disease (PD) and to select outcomes most responsive to TC assessed during off-medication states. Two-arm, wait-list controlled RCT. Tertiary care hospital. Thirty-two subjects aged 40-75 diagnosed with idiopathic PD within 10 years. Six-month TC intervention added to usual care (UC) versus UC alone. Primary outcomes were feasibility-related (recruitment rate, adherence, and compliance). Change in dual-task (DT) gait stride-time variability (STV) from baseline to 6 months was defined, a priori, as the clinical outcome measure of primary interest. Other outcomes included: PD motor symptom progression (Unified Parkinson's Disease Rating Scale [UPDRS]), PD-related quality of life (PDQ-39), executive function (Trail Making Test), balance confidence (Activity-Specific Balance Confidence Scale, ABC), and Timed Up and Go test (TUG). All clinical assessments were made in the off-state for PD medications. Thirty-two subjects were enrolled into 3 sequential cohorts over 417 days at an average rate of 0.08 subjects per day. Seventy-five percent (12/16) in the TC group vs 94% (15/16) in the UC group completed the primary 6-month follow-up assessment. Mean TC exposure hours overall: 52. No AEs occurred during or as a direct result of TC exercise. Statistically nonsignificant improvements were observed in the TC group at 6 months in DT gait STV (TC [20.1%] vs UC [-0.1%] group [effect size 0.49; P = .47]), ABC, TUG, and PDQ-39. UPDRS progression was modest and very similar in TC and UC groups. Conducting an RCT of TC for PD is feasible, though measures to improve recruitment and adherence rates are needed. DT gait STV is a sensitive and logical outcome for evaluating the combined cognitive-motor effects of TC in PD.

  8. Selection of Common Items as an Unrecognized Source of Variability in Test Equating: A Bootstrap Approximation Assuming Random Sampling of Common Items

    ERIC Educational Resources Information Center

    Michaelides, Michalis P.; Haertel, Edward H.

    2014-01-01

    The standard error of equating quantifies the variability in the estimation of an equating function. Because common items for deriving equated scores are treated as fixed, the only source of variability typically considered arises from the estimation of common-item parameters from responses of samples of examinees. Use of alternative, equally…
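
    As a rough illustration of the idea described above, the sketch below (Python; all data, sample sizes, and names are invented for illustration) bootstraps over the common items themselves, rather than over examinees only, to approximate the extra equating variability that random item sampling introduces.

    ```python
    # Hypothetical sketch: standard error of a mean-mean equating constant when
    # common items are treated as randomly sampled. Data are simulated.
    import numpy as np

    rng = np.random.default_rng(0)

    n_common = 20
    # Simulated difficulty estimates of the common items on Form X and Form Y.
    b_x = rng.normal(0.0, 1.0, n_common)
    b_y = b_x + 0.5 + rng.normal(0.0, 0.1, n_common)  # Form Y shifted by ~0.5

    def equating_constant(bx, by):
        """Mean-mean equating: shift placing Form X on the Form Y scale."""
        return by.mean() - bx.mean()

    # Bootstrap over common items (the usually ignored source of variability).
    boot = []
    for _ in range(2000):
        idx = rng.integers(0, n_common, n_common)  # resample items w/ replacement
        boot.append(equating_constant(b_x[idx], b_y[idx]))

    print(f"equating constant: {equating_constant(b_x, b_y):.3f}")
    print(f"bootstrap SE due to item sampling: {np.std(boot, ddof=1):.3f}")
    ```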

  9. Random Item IRT Models

    ERIC Educational Resources Information Center

    De Boeck, Paul

    2008-01-01

    It is common practice in IRT to consider items as fixed and persons as random. Both continuous and categorical person parameters are most often random variables, whereas for items only continuous parameters are used and they are commonly of the fixed type, although exceptions occur. It is shown in the present article that random item parameters…
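
    A minimal sketch of the random-item idea, assuming a simple Rasch model and synthetic data (all distributions and parameter values below are invented): both person and item parameters are drawn from distributions rather than item difficulties being treated as fixed.

    ```python
    # Illustrative simulation of a Rasch model with random item parameters.
    import numpy as np

    rng = np.random.default_rng(1)
    n_persons, n_items = 500, 30

    theta = rng.normal(0.0, 1.0, n_persons)  # random person parameters
    beta = rng.normal(0.0, 0.8, n_items)     # random item parameters

    # Probability of a correct response under the Rasch model.
    logits = theta[:, None] - beta[None, :]
    p = 1.0 / (1.0 + np.exp(-logits))
    responses = rng.random((n_persons, n_items)) < p  # simulated 0/1 data
    print(responses.mean(axis=0)[:5])  # empirical easiness of first 5 items
    ```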

  10. Effects of hyperthermic baths on depression, sleep and heart rate variability in patients with depressive disorder: a randomized clinical pilot trial.

    PubMed

    Naumann, Johannes; Grebe, Julian; Kaifel, Sonja; Weinert, Tomas; Sadaghiani, Catharina; Huber, Roman

    2017-03-28

    Despite advances in the treatment of depression, one-third of depressed patients fail to respond to conventional antidepressant medication. There is a need for more effective treatments with fewer side effects. The primary aim of this study was to determine whether hyperthermic baths reduce depressive symptoms in adults with depressive disorder. Randomized, two-arm, placebo-controlled, 8-week pilot trial. Medically stable outpatients with confirmed depressive disorder (ICD-10: F32/F33) who were moderately depressed, as determined by a 17-item Hamilton Scale for Depression (HAM-D) score ≥18, were randomly assigned to 2 hyperthermic baths (40 °C) per week for 4 weeks or a sham intervention with green light, with follow-up after 4 weeks. The main outcome measure was the change in HAM-D total score from baseline (T0) to the 2-week time point (T1). A total of 36 patients were randomized (hyperthermic baths, n = 17; sham condition, n = 19). The intention-to-treat analysis showed a significant (P = .037) difference of 3.14 points in the change in HAM-D total score after 4 interventions (T1), in favour of the hyperthermic bath group compared to the placebo group. This pilot study suggests that hyperthermic baths do have generalized efficacy in depressed patients. DRKS00004803 at drks-neu.uniklinik-freiburg.de, German Clinical Trials Register (registration date 2016-02-02), retrospectively registered.

  11. Demographic variables, design characteristics, and effect sizes of randomized, placebo-controlled, monotherapy trials of major depressive disorder and bipolar depression.

    PubMed

    Papakostas, George I; Martinson, Max A; Fava, Maurizio; Iovieno, Nadia

    2016-05-01

    The aim of this work is to compare the efficacy of pharmacologic agents for the treatment of major depressive disorder (MDD) and bipolar depression. MEDLINE/PubMed databases were searched for studies published in English between January 1980 and September 2014 by cross-referencing the search term placebo with each of the antidepressant agents identified and with bipolar. The search was supplemented by manual bibliography review. We selected double-blind, randomized, placebo-controlled trials of antidepressant monotherapies for the treatment of MDD and of oral drug monotherapies for the treatment of bipolar depression. 196 trials in MDD and 19 trials in bipolar depression were found eligible for inclusion in our analysis. Data were extracted by one of the authors and checked for accuracy by a second one. Data extracted included year of publication, number of patients randomized, probability of receiving placebo, duration of the trial, baseline symptom severity, dosing schedule, study completion rates, and clinical response rates. Response rates for drug versus placebo in trials of MDD and bipolar depression were 52.7% versus 37.5% and 54.7% versus 40.5%, respectively. The random-effects meta-analysis indicated that drug therapy was more effective than placebo in both MDD (risk ratio for response = 1.373; P < .001) and bipolar depression (risk ratio = 1.257; P < .001) trials. The meta-regression analysis suggested a statistically significant difference in the risk ratio of responding to drug versus placebo between MDD and bipolar depression trials in favor of MDD (P = .008). Although a statistically significantly greater treatment effect size was noted in MDD relative to bipolar depression studies, the absolute magnitude of the difference was numerically small. Therefore, the present study suggests no clinically significant differences in the overall short-term efficacy of pharmacologic monotherapies for MDD and bipolar depression. © Copyright 2016 Physicians

  12. Generating and controlling homogeneous air turbulence using random jet arrays

    NASA Astrophysics Data System (ADS)

    Carter, Douglas; Petersen, Alec; Amili, Omid; Coletti, Filippo

    2016-12-01

    The use of random jet arrays, already employed in water tank facilities to generate zero-mean-flow homogeneous turbulence, is extended to air as a working fluid. A novel facility is introduced that uses two facing arrays of individually controlled jets (256 in total) to force steady homogeneous turbulence with negligible mean flow, shear, and strain. Quasi-synthetic jet pumps are created by expanding pressurized air through small straight nozzles and are actuated by fast-response low-voltage solenoid valves. Velocity fields, two-point correlations, energy spectra, and second-order structure functions are obtained from 2D PIV and are used to characterize the turbulence from the integral to the Kolmogorov scales. Several metrics are defined to quantify how well zero-mean-flow homogeneous turbulence is approximated for a wide range of forcing and geometric parameters. With increasing jet firing time, both the velocity fluctuations and the integral length scales are augmented, and therefore the Reynolds number is increased. We reach a Taylor-microscale Reynolds number of 470, a large-scale Reynolds number of 74,000, and an integral-to-Kolmogorov length scale ratio of 680. The volume of the present homogeneous turbulence, the largest reported to date in a zero-mean-flow facility, is much larger than the integral length scale, allowing for the natural development of the energy cascade. The turbulence is found to be anisotropic irrespective of the distance between the jet arrays. Fine grids placed in front of the jets are effective at modulating the turbulence, reducing both velocity fluctuations and integral scales. Varying the jet-to-jet spacing within each array has no effect on the integral length scale, suggesting that this is dictated by the length scale of the jets.
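
    For readers unfamiliar with the diagnostics mentioned above, the toy sketch below (Python, with a synthetic stand-in velocity field, not the authors' PIV pipeline) computes a longitudinal second-order structure function D_LL(r) from one component of a 2D field.

    ```python
    # Minimal sketch, assuming synthetic data: longitudinal second-order
    # structure function D_LL(r) = <(u(x+r) - u(x))^2> along one direction.
    import numpy as np

    rng = np.random.default_rng(2)
    u = rng.normal(0.0, 1.0, (256, 256))  # stand-in for a PIV velocity component

    def structure_function(u, max_sep=64):
        d = []
        for r in range(1, max_sep):
            du = u[:, r:] - u[:, :-r]  # increments at separation r (grid units)
            d.append(np.mean(du**2))
        return np.array(d)

    D_LL = structure_function(u)
    print(D_LL[:5])  # flat for white noise; grows with r for real turbulence
    ```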

  13. Mendelian Randomization.

    PubMed

    Grover, Sandeep; Del Greco M, Fabiola; Stein, Catherine M; Ziegler, Andreas

    2017-01-01

    Confounding and reverse causality have prevented us from drawing meaningful clinical interpretations even from well-powered observational studies. Confounding may be attributed to our inability to randomize the exposure variable in observational studies. Mendelian randomization (MR) is one approach to overcome confounding. It utilizes one or more genetic polymorphisms as a proxy for the exposure variable of interest. Polymorphisms are randomly distributed in a population and are static throughout an individual's lifetime, and may thus help in inferring directionality in exposure-outcome associations. Genome-wide association studies (GWAS) or meta-analyses of GWAS are characterized by large sample sizes and the availability of many single nucleotide polymorphisms (SNPs), making GWAS-based MR an attractive approach. GWAS-based MR comes with specific challenges, including multiple causality. Despite these shortcomings, it remains one of the most powerful techniques for inferring causality. Because MR is still an evolving concept with complex statistical challenges, the literature offers few working examples that incorporate real datasets. In this chapter, we provide a step-by-step guide for causal inference based on the principles of MR with a real dataset, using both individual and summary data from unrelated individuals. We suggest best possible practices and give recommendations based on the current literature.
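
    As a hedged illustration of the core MR logic, not the chapter's own worked example, the sketch below uses simulated individual-level data (all effect sizes invented) to contrast a confounded regression estimate with a Wald-ratio estimate that uses a single SNP as an instrument for the exposure.

    ```python
    # Sketch of the Wald-ratio MR estimator on simulated data.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 10_000
    snp = rng.binomial(2, 0.3, n)            # instrument: genotype coded 0/1/2
    confounder = rng.normal(size=n)
    exposure = 0.4 * snp + confounder + rng.normal(size=n)
    outcome = 0.25 * exposure + confounder + rng.normal(size=n)  # true effect 0.25

    def slope(x, y):
        """OLS slope of y on x."""
        x = x - x.mean()
        return np.dot(x, y) / np.dot(x, x)

    beta_zx = slope(snp, exposure)  # SNP -> exposure association
    beta_zy = slope(snp, outcome)   # SNP -> outcome association
    print(f"Wald ratio (causal estimate): {beta_zy / beta_zx:.3f}")    # ~0.25
    print(f"confounded OLS estimate: {slope(exposure, outcome):.3f}")  # biased up
    ```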

  14. The effect of carbohydrate mouth rinse on performance, biochemical and psychophysiological variables during a cycling time trial: a crossover randomized trial.

    PubMed

    Ferreira, Amanda M J; Farias-Junior, Luiz F; Mota, Thaynan A A; Elsangedy, Hassan M; Marcadenti, Aline; Lemos, Telma M A M; Okano, Alexandre H; Fayh, Ana P T

    2018-01-01

    The hypothesis of a central effect of carbohydrate mouth rinse (CMR) on performance improvement in a fed state has not been established, and its psychophysiological responses have not yet been described. The aim of this study was to evaluate the effect of CMR in a fed state on performance and on biochemical and psychophysiological responses, compared to ad libitum water intake. Eleven trained male cyclists completed a randomized, crossover trial, which consisted of 30 km on a cycle ergometer at self-selected intensity and in a fed state. Subjects performed, in random order, the following interventions: CMR with a 6% unflavored maltodextrin solution; mouth rinsing with a placebo solution (PMR); drinking "ad libitum" (DAL). The time for completion of the test (min), heart rate (bpm), power (watts), rating of perceived exertion (RPE), affective response, blood glucose (mg/dL) and lactate (mmol/dL) were evaluated before, during and immediately after the test, while insulin (μIU/mL), cortisol (μg/dL) and creatine kinase (U/L) levels were measured before, immediately after and 30 min after the test. Time for completion of the 30 km trial did not differ significantly among the CMR, PMR and DAL interventions (means = 54.5 ± 2.9, 54.7 ± 2.9 and 54.5 ± 2.5 min, respectively; p = 0.82). RPE and affective response were higher in the DAL intervention (p < 0.01). Glucose, insulin, cortisol and creatine kinase responses showed no significant difference among interventions. In a fed state, CMR did not cause metabolic changes and did not improve physical performance compared to ad libitum water intake, but it demonstrated a possible central effect. ReBec registration number: RBR-4vpwkg. Available at http://www.ensaiosclinicos.gov.br/rg/?q=RBR-4vpwkg.

  15. Boosting of HIV envelope CD4 binding site antibodies with long variable heavy third complementarity determining region in the randomized double blind RV305 HIV-1 vaccine trial.

    PubMed

    Easterhoff, David; Moody, M Anthony; Fera, Daniela; Cheng, Hao; Ackerman, Margaret; Wiehe, Kevin; Saunders, Kevin O; Pollara, Justin; Vandergrift, Nathan; Parks, Rob; Kim, Jerome; Michael, Nelson L; O'Connell, Robert J; Excler, Jean-Louis; Robb, Merlin L; Vasan, Sandhya; Rerks-Ngarm, Supachai; Kaewkungwal, Jaranit; Pitisuttithum, Punnee; Nitayaphan, Sorachai; Sinangil, Faruk; Tartaglia, James; Phogat, Sanjay; Kepler, Thomas B; Alam, S Munir; Liao, Hua-Xin; Ferrari, Guido; Seaman, Michael S; Montefiori, David C; Tomaras, Georgia D; Harrison, Stephen C; Haynes, Barton F

    2017-02-01

    The canary pox vector and gp120 vaccine (ALVAC-HIV and AIDSVAX B/E gp120) in the RV144 HIV-1 vaccine trial conferred an estimated 31% vaccine efficacy. Although the vaccine Env AE.A244 gp120 is antigenic for the unmutated common ancestor of V1V2 broadly neutralizing antibodies (bnAbs), no plasma bnAb activity was induced. The RV305 (NCT01435135) HIV-1 clinical trial was a placebo-controlled randomized double-blinded study that assessed the safety and efficacy of vaccine boosting on B cell repertoires. HIV-1-uninfected RV144 vaccine recipients were reimmunized 6-8 years later with AIDSVAX B/E gp120 alone, ALVAC-HIV alone, or a combination of ALVAC-HIV and AIDSVAX B/E gp120 in the RV305 trial. Env-specific post-RV144 and RV305 boost memory B cell VH mutation frequencies increased from 2.9% post-RV144 to 6.7% post-RV305. The vaccine was well tolerated, with no reports of adverse events. While post-boost plasma did not have bnAb activity, the vaccine boosts expanded a pool of envelope CD4 binding site (bs)-reactive memory B cells with long third heavy chain complementarity determining regions (HCDR3) whose germline precursors and affinity matured B cell clonal lineage members neutralized the HIV-1 CRF01 AE tier 2 (difficult to neutralize) primary isolate, CNE8. Electron microscopy of two of these antibodies bound with near-native gp140 trimers showed that they recognized an open conformation of the Env trimer. Although late boosting of RV144 vaccinees expanded a novel pool of neutralizing B cell clonal lineages, we hypothesize that boosts with stably closed trimers would be necessary to elicit antibodies with greater breadth of tier 2 HIV-1 strains. ClinicalTrials.gov NCT01435135.

  16. Boosting of HIV envelope CD4 binding site antibodies with long variable heavy third complementarity determining region in the randomized double blind RV305 HIV-1 vaccine trial

    PubMed Central

    Ackerman, Margaret; Saunders, Kevin O.; Pollara, Justin; Vandergrift, Nathan; Parks, Rob; Michael, Nelson L.; O’Connell, Robert J.; Vasan, Sandhya; Rerks-Ngarm, Supachai; Kaewkungwal, Jaranit; Pitisuttithum, Punnee; Nitayaphan, Sorachai; Sinangil, Faruk; Phogat, Sanjay; Alam, S. Munir; Liao, Hua-Xin; Ferrari, Guido; Seaman, Michael S.; Montefiori, David C.; Harrison, Stephen C.; Haynes, Barton F.

    2017-01-01

    The canary pox vector and gp120 vaccine (ALVAC-HIV and AIDSVAX B/E gp120) in the RV144 HIV-1 vaccine trial conferred an estimated 31% vaccine efficacy. Although the vaccine Env AE.A244 gp120 is antigenic for the unmutated common ancestor of V1V2 broadly neutralizing antibodies (bnAbs), no plasma bnAb activity was induced. The RV305 (NCT01435135) HIV-1 clinical trial was a placebo-controlled randomized double-blinded study that assessed the safety and efficacy of vaccine boosting on B cell repertoires. HIV-1-uninfected RV144 vaccine recipients were reimmunized 6–8 years later with AIDSVAX B/E gp120 alone, ALVAC-HIV alone, or a combination of ALVAC-HIV and AIDSVAX B/E gp120 in the RV305 trial. Env-specific post-RV144 and RV305 boost memory B cell VH mutation frequencies increased from 2.9% post-RV144 to 6.7% post-RV305. The vaccine was well tolerated, with no reports of adverse events. While post-boost plasma did not have bnAb activity, the vaccine boosts expanded a pool of envelope CD4 binding site (bs)-reactive memory B cells with long third heavy chain complementarity determining regions (HCDR3) whose germline precursors and affinity matured B cell clonal lineage members neutralized the HIV-1 CRF01 AE tier 2 (difficult to neutralize) primary isolate, CNE8. Electron microscopy of two of these antibodies bound with near-native gp140 trimers showed that they recognized an open conformation of the Env trimer. Although late boosting of RV144 vaccinees expanded a novel pool of neutralizing B cell clonal lineages, we hypothesize that boosts with stably closed trimers would be necessary to elicit antibodies with greater breadth of tier 2 HIV-1 strains. Trial Registration: ClinicalTrials.gov NCT01435135 PMID:28235027

  17. The role of the immunological background of mice in the genetic variability of Schistosoma mansoni as detected by random amplification of polymorphic DNA.

    PubMed

    Cossa-Moiane, I L; Mendes, T; Ferreira, T M; Mauricio, I; Calado, M; Afonso, A; Belo, S

    2015-11-01

    Schistosomiasis is a parasitic disease caused by flatworms of the genus Schistosoma. Among the Schistosoma species known to infect humans, S. mansoni is the most frequent cause of intestinal schistosomiasis in sub-Saharan Africa and South America: the World Health Organization estimates that about 200,000 deaths per year result from schistosomiasis in sub-Saharan Africa alone. The Schistosoma life cycle requires two different hosts: a snail as intermediate host and a mammal as definitive host. People become infected when they come into contact with water contaminated with free-living larvae (e.g. when swimming, fishing, washing). Although S. mansoni has mechanisms for escaping the host immune system, only a minority of infecting larvae develop into adults, suggesting that strain selection occurs at the host level. To test this hypothesis, we compared the Belo Horizonte (BH) strain of S. mansoni recovered from definitive hosts with different immunological backgrounds using random amplification of polymorphic DNA-polymerase chain reaction (RAPD-PCR). Schistosoma mansoni DNA profiles of worms obtained from wild-type (CD1 and C57BL/6J) and mutant (Jα18- / - and TGFβRIIdn) mice were analysed. Four primers produced polymorphic profiles, which can therefore potentially be used as reference biomarkers. All male worms were genetically distinct from females isolated from the same host, with female worms showing more specific fragments than males. Of the four host-derived schistosome populations, female and male adults recovered from TGFβRIIdn mice showed RAPD-PCR profiles that were most similar to each other. Altogether, these data indicate that host immunological backgrounds can influence the genetic diversity of parasite populations.

  18. Exploring individual cognitions, self-regulation skills, and environmental-level factors as mediating variables of two versions of a Web-based computer-tailored nutrition education intervention aimed at adults: A randomized controlled trial.

    PubMed

    Springvloet, Linda; Lechner, Lilian; Candel, Math J J M; de Vries, Hein; Oenema, Anke

    2016-03-01

    This study explored whether the determinants that were targeted in two versions of a Web-based computer-tailored nutrition education intervention mediated the effects on fruit, high-energy snack, and saturated fat intake among adults who did not comply with dietary guidelines. An RCT was conducted with a basic group (a tailored intervention targeting individual cognitions and self-regulation), a plus group (additionally targeting environmental-level factors), and a control group (generic nutrition information). Participants were recruited from the general Dutch adult population and randomly assigned to one of the study groups. Online self-reported questionnaires assessed dietary intake and potential mediating variables (behavior-specific cognitions, action and coping planning, environmental-level factors) at baseline and one (T1) and four (T2) months post-intervention (i.e. four and seven months after baseline). The joint-significance test was used to establish mediating variables at different time points (T1-mediating variables - T2-intake; T1-mediating variables - T1-intake; T2-mediating variables - T2-intake). Educational differences were examined by testing interaction terms. The effect of the plus version on fruit intake was mediated (T2-T2) by intention and fruit availability at home, and for high-educated participants also by attitude. Among low/moderate-educated participants, high-energy snack availability at home mediated (T1-T1) the effect of the basic version on high-energy snack intake. Subjective norm mediated (T1-T1) the effect of the basic version on fat intake among high-educated participants. Only some of the targeted determinants mediated the effects of both intervention versions on fruit, high-energy snack, and saturated fat intake. A possible reason for not finding a more pronounced pattern of mediating variables is that the educational content was tailored to individual characteristics and that participants only received feedback for relevant and not for all

  19. Limits on relief through constrained exchange on random graphs

    NASA Astrophysics Data System (ADS)

    LaViolette, Randall A.; Ellebracht, Lory A.; Gieseler, Charles J.

    2007-09-01

    Agents are represented by nodes on a random graph (e.g., “small world”). Each agent is endowed with a zero-mean random value that may be either positive or negative. All agents attempt to find relief, i.e., to reduce the magnitude of that initial value, to zero if possible, through exchanges. The exchange occurs only between the agents that are linked, a constraint that turns out to dominate the results. The exchange process continues until Pareto equilibrium is achieved. Only 40-90% of the agents achieved relief on small-world graphs with mean degree between 2 and 40. Even fewer agents achieved relief on scale-free-like graphs with a truncated power-law degree distribution. The rate at which relief grew with increasing degree was slow, only at most logarithmic for all of the graphs considered; viewed in reverse, the fraction of nodes that achieve relief is resilient to the removal of links.
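
    A minimal simulation in the spirit of this abstract, with assumptions mine (pairwise cancellation along links as the exchange rule, networkx for the small-world graph, invented sizes), reproduces the qualitative finding that link constraints leave some agents unrelieved.

    ```python
    # Illustrative sketch: zero-mean endowments on a small-world graph, relieved
    # by cancelling opposite-signed values along links until no trade remains.
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(4)
    G = nx.watts_strogatz_graph(n=1000, k=4, p=0.1)    # "small world" graph
    value = rng.normal(0.0, 1.0, G.number_of_nodes())  # zero-mean endowments

    changed = True
    while changed:  # terminates: each trade permanently zeroes >= 1 agent
        changed = False
        for i, j in G.edges():
            if value[i] * value[j] < 0:                # opposite signs: can trade
                t = min(abs(value[i]), abs(value[j]))  # transfer relieving one agent
                value[i] -= np.sign(value[i]) * t
                value[j] -= np.sign(value[j]) * t
                changed = True

    relieved = np.mean(np.abs(value) < 1e-12)
    print(f"fraction of agents fully relieved: {relieved:.2f}")
    ```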

  20. Heart rate variability and hemodynamic change in the superior mesenteric artery by acupuncture stimulation of lower limb points: a randomized crossover trial.

    PubMed

    Kaneko, Soichiro; Watanabe, Masashi; Takayama, Shin; Numata, Takehiro; Seki, Takashi; Tanaka, Junichi; Kanemura, Seiki; Kagaya, Yutaka; Ishii, Tadashi; Kimura, Yoshitaka; Yaegashi, Nobuo

    2013-01-01

    Objective. We investigated the relationship between superior mesenteric artery blood flow volume (SMA BFV) and autonomic nerve activity in acupuncture stimulation of lower limb points through heart rate variability (HRV) evaluations. Methods. Twenty-six healthy volunteers underwent crossover applications of bilateral manual acupuncture stimulation at ST36 or LR3 or no stimulation. Heart rate, blood pressure, cardiac index, systemic vascular resistance index, SMA BFV, and HRV at rest and 30 min after the intervention were analyzed. Results. SMA BFV showed a significant increase after ST36 stimulation (0% to 14.1% ± 23.4%, P = 0.007); very low frequency (VLF), high frequency (HF), low frequency (LF), and LF/HF were significantly greater than those at rest (0% to 479.4% ± 1185.6%, P = 0.045; 0% to 78.9% ± 197.6%, P = 0.048; 0% to 123.9% ± 217.1%, P = 0.006; 0% to 71.5% ± 171.1%, P = 0.039). Changes in HF and LF also differed significantly from those resulting from LR3 stimulation (HF: 78.9% ± 197.6% versus -18.2% ± 35.8%, P = 0.015; LF: 123.9% ± 217.1% versus 10.6% ± 70.6%, P = 0.013). Conclusion. Increased vagus nerve activity after ST36 stimulation resulted in increased SMA BFV. This partly explains the mechanism of acupuncture-induced BFV changes.

  1. Heart Rate Variability and Hemodynamic Change in the Superior Mesenteric Artery by Acupuncture Stimulation of Lower Limb Points: A Randomized Crossover Trial

    PubMed Central

    Watanabe, Masashi; Tanaka, Junichi; Kanemura, Seiki; Kagaya, Yutaka; Ishii, Tadashi; Kimura, Yoshitaka; Yaegashi, Nobuo

    2013-01-01

    Objective. We investigated the relationship between superior mesenteric artery blood flow volume (SMA BFV) and autonomic nerve activity in acupuncture stimulation of lower limb points through heart rate variability (HRV) evaluations. Methods. Twenty-six healthy volunteers underwent crossover applications of bilateral manual acupuncture stimulation at ST36 or LR3 or no stimulation. Heart rate, blood pressure, cardiac index, systemic vascular resistance index, SMA BFV, and HRV at rest and 30 min after the intervention were analyzed. Results. SMA BFV showed a significant increase after ST36 stimulation (0% to 14.1% ± 23.4%, P = 0.007); very low frequency (VLF), high frequency (HF), low frequency (LF), and LF/HF were significantly greater than those at rest (0% to 479.4% ± 1185.6%, P = 0.045; 0% to 78.9% ± 197.6%, P = 0.048; 0% to 123.9% ± 217.1%, P = 0.006; 0% to 71.5% ± 171.1%, P = 0.039). Changes in HF and LF also differed significantly from those resulting from LR3 stimulation (HF: 78.9% ± 197.6% versus −18.2% ± 35.8%, P = 0.015; LF: 123.9% ± 217.1% versus 10.6% ± 70.6%, P = 0.013). Conclusion. Increased vagus nerve activity after ST36 stimulation resulted in increased SMA BFV. This partly explains the mechanism of acupuncture-induced BFV changes. PMID:24381632

  2. Post-exercise recovery of biological, clinical and metabolic variables after different temperatures and durations of cold water immersion: a randomized clinical trial.

    PubMed

    Vanderlei, Franciele M; de Albuquerque, Maíra C; de Almeida, Aline C; Machado, Aryane F; Netto, Jayme; Pastre, Carlos M

    2017-10-01

    Cold water immersion (CWI) is a commonly used recuperative strategy. However, there is a lack of standardization of protocols regarding the duration and temperature of application of the technique and the stress model. It is therefore important to study the issue of dose response in a specific stress model. Thus, the objective was to analyze and compare the effects of CWI during intense post-exercise recovery using different durations and temperatures of immersion. One hundred and five male individuals were divided into five groups: one control group (CG) and four recovery groups (G1: 5' at 9±1 °C; G2: 5' at 14±1 °C; G3: 15' at 9±1 °C; G4: 15' at 14±1 °C). The volunteers were submitted to an exhaustion protocol that consisted of a jump program and the Wingate Test. Immediately after the exhaustion protocol, the volunteers were directed to a tank with water and ice, where they were immersed for the recovery procedure, during which blood samples were collected for later lactate and creatine kinase (CK) analysis. Variables were collected prior to the exercise and 24, 48, 72, and 96 hours after its completion. For the CK concentration, 15 minutes at 14 °C was the best intervention option, considering the values at 72 hours after exercise, while for the moment of peak lactate an advantage was observed for immersion for 5 minutes at 14 °C. Regarding the perception of recovery, CWI for 5 minutes at 14 °C performed better long-term, from the time of the intervention to 96 hours post-exercise. For pain, no form of immersion performed better than the CG immediately post-intervention. There were no differences in behavior between the CWI intervention groups for the outcomes studied.

  3. The effects of cold water immersion with different dosages (duration and temperature variations) on heart rate variability post-exercise recovery: A randomized controlled trial.

    PubMed

    Almeida, Aline C; Machado, Aryane F; Albuquerque, Maíra C; Netto, Lara M; Vanderlei, Franciele M; Vanderlei, Luiz Carlos M; Junior, Jayme Netto; Pastre, Carlos M

    2016-08-01

    The aim of the present study was to investigate the effects of cold water immersion during post-exercise recovery, with different durations and temperatures, on heart rate variability indices. One hundred participants performed a protocol of jumps and a Wingate test, and immediately afterwards were immersed in cold water according to the characteristics of each group (CG: control; G1: 5' at 9±1°C; G2: 5' at 14±1°C; G3: 15' at 9±1°C; G4: 15' at 14±1°C). Analyses were performed at baseline, during the CWI recuperative technique (TRec) and 20, 30, 40, 50 and 60 min post-exercise. The following HRV indices were assessed: the mean of all RR-intervals in each analysis period (MeanRR), the standard deviation of normal RR-intervals (SDNN), the square root of the mean of the sum of the squares of differences between adjacent RR-intervals (RMSSD), the spectral components of very low frequency (VLF), low frequency (LF) and high frequency (HF), the scatter of points perpendicular to the line of identity of the Poincaré plot (SD1), and the scatter of points along the line of identity (SD2). MeanRR, VLF and LF returned to baseline values earlier in all intervention groups, but the same was observed for SDNN and SD2 only in the group immersed for 15 min at 14°C (G4). In addition, G4 presented higher values when compared to the CG. These findings demonstrate that if the purpose of the recovery process is restoration of cardiac autonomic modulation, the technique is recommended, specifically for 15 min at 14°C. Copyright © 2015 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  4. Effects of heart rate variability biofeedback during exposure to fear-provoking stimuli within spider-fearful individuals: study protocol for a randomized controlled trial.

    PubMed

    Schäfer, Sarah K; Ihmig, Frank R; Lara H, Karen A; Neurohr, Frank; Kiefer, Stephan; Staginnus, Marlene; Lass-Hennemann, Johanna; Michael, Tanja

    2018-03-16

    Specific phobias are among the most common anxiety disorders. Exposure therapy is the treatment of choice for specific phobias. However, not all patients respond equally well to it. Hence, current research focuses on therapeutic add-ons to increase and consolidate the effects of exposure therapy. One potential therapeutic add-on is biofeedback to increase heart rate variability (HRV). A recent meta-analysis shows beneficial effects of HRV biofeedback interventions on stress and anxiety symptoms. Therefore, the purpose of the current trial is to evaluate the effects of HRV biofeedback, practiced before and utilized during exposure, in spider-fearful individuals. Further, this trial is the first to differentiate between the effects of an HRV biofeedback intervention and those of a low-load working memory (WM) task. Eighty spider-fearful individuals participate in the study. All participants receive a training session in which they practice two tasks (HRV biofeedback and a motor pseudo-biofeedback task, or two motor pseudo-biofeedback tasks). Afterwards, they practice both tasks at home for 6 days. One week later, during the exposure session, they watch 16 1-min spider video clips. Participants are divided into four groups: group 1 practices the HRV biofeedback and one motor pseudo-task before exposure and utilizes HRV biofeedback during exposure. Group 2 receives the same training but continues the pseudo-biofeedback task during exposure. Group 3 practices two pseudo-biofeedback tasks and continues one of them during exposure. Group 4 practices two pseudo-biofeedback tasks and has no additional task during exposure. The primary outcome is fear of spiders (measured by the Fear of Spiders Questionnaire and the Behavioral Approach Test). Secondary outcomes are physiological measures based on electrodermal activity, electrocardiogram and respiration. This RCT is the first to investigate the effects of using a pre-trained HRV biofeedback during exposure in

  5. A randomized controlled trial to compare the effects of sulphonylurea gliclazide MR (modified release) and the DPP-4 inhibitor vildagliptin on glycemic variability and control measured by continuous glucose monitoring (CGM) in Brazilian women with type 2 diabetes.

    PubMed

    Vianna, Andre Gustavo Daher; Lacerda, Claudio Silva; Pechmann, Luciana Muniz; Polesel, Michelle Garcia; Marino, Emerson Cestari; Faria-Neto, Jose Rocha

    2018-05-01

    This study aims to evaluate whether there is a difference between the effects of vildagliptin and gliclazide MR (modified release) on glycemic variability (GV) in women with type 2 diabetes (T2DM), as evaluated by continuous glucose monitoring (CGM). An open-label, randomized study was conducted in T2DM women on steady-dose metformin monotherapy who were treated with 50 mg vildagliptin twice daily or 60-120 mg of gliclazide MR once daily. CGM and calculation of GV indices were performed at baseline and after 24 weeks. In total, 42 patients (age: 61.9 ± 5.9 years, baseline glycated hemoglobin (HbA1c): 7.3 ± 0.56) were selected and 37 completed the 24-week protocol. Vildagliptin and gliclazide MR reduced GV, as measured by the mean amplitude of glycemic excursions (MAGE, p = 0.007 and 0.034, respectively). The difference between the groups did not reach statistical significance. Vildagliptin also significantly decreased the standard deviation of the mean glucose (SD) and the mean of the daily differences (MODD) (p = 0.007 and 0.030). Vildagliptin and gliclazide MR similarly reduced the MAGE in women with T2DM after 24 weeks of treatment. Further studies are required to confirm differences between vildagliptin and gliclazide MR regarding glycemic variability. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. Modeling Linguistic Variables With Regression Models: Addressing Non-Gaussian Distributions, Non-independent Observations, and Non-linear Predictors With Random Effects and Generalized Additive Models for Location, Scale, and Shape

    PubMed Central

    Coupé, Christophe

    2018-01-01

    As statistical approaches are getting increasingly used in linguistics, attention must be paid to the choice of methods and algorithms used. This is especially true since they require assumptions to be satisfied to provide valid results, and because scientific articles still often fall short of reporting whether such assumptions are met. Progress is being, however, made in various directions, one of them being the introduction of techniques able to model data that cannot be properly analyzed with simpler linear regression models. We report recent advances in statistical modeling in linguistics. We first describe linear mixed-effects regression models (LMM), which address grouping of observations, and generalized linear mixed-effects models (GLMM), which offer a family of distributions for the dependent variable. Generalized additive models (GAM) are then introduced, which allow modeling non-linear parametric or non-parametric relationships between the dependent variable and the predictors. We then highlight the possibilities offered by generalized additive models for location, scale, and shape (GAMLSS). We explain how they make it possible to go beyond common distributions, such as Gaussian or Poisson, and offer the appropriate inferential framework to account for ‘difficult’ variables such as count data with strong overdispersion. We also demonstrate how they offer interesting perspectives on data when not only the mean of the dependent variable is modeled, but also its variance, skewness, and kurtosis. As an illustration, the case of phonemic inventory size is analyzed throughout the article. For over 1,500 languages, we consider as predictors the number of speakers, the distance from Africa, an estimation of the intensity of language contact, and linguistic relationships. We discuss the use of random effects to account for genealogical relationships, the choice of appropriate distributions to model count data, and non-linear relationships. Relying on GAMLSS

  7. Modeling Linguistic Variables With Regression Models: Addressing Non-Gaussian Distributions, Non-independent Observations, and Non-linear Predictors With Random Effects and Generalized Additive Models for Location, Scale, and Shape.

    PubMed

    Coupé, Christophe

    2018-01-01

    As statistical approaches are getting increasingly used in linguistics, attention must be paid to the choice of methods and algorithms used. This is especially true since they require assumptions to be satisfied to provide valid results, and because scientific articles still often fall short of reporting whether such assumptions are met. Progress is being, however, made in various directions, one of them being the introduction of techniques able to model data that cannot be properly analyzed with simpler linear regression models. We report recent advances in statistical modeling in linguistics. We first describe linear mixed-effects regression models (LMM), which address grouping of observations, and generalized linear mixed-effects models (GLMM), which offer a family of distributions for the dependent variable. Generalized additive models (GAM) are then introduced, which allow modeling non-linear parametric or non-parametric relationships between the dependent variable and the predictors. We then highlight the possibilities offered by generalized additive models for location, scale, and shape (GAMLSS). We explain how they make it possible to go beyond common distributions, such as Gaussian or Poisson, and offer the appropriate inferential framework to account for 'difficult' variables such as count data with strong overdispersion. We also demonstrate how they offer interesting perspectives on data when not only the mean of the dependent variable is modeled, but also its variance, skewness, and kurtosis. As an illustration, the case of phonemic inventory size is analyzed throughout the article. For over 1,500 languages, we consider as predictors the number of speakers, the distance from Africa, an estimation of the intensity of language contact, and linguistic relationships. We discuss the use of random effects to account for genealogical relationships, the choice of appropriate distributions to model count data, and non-linear relationships. Relying on GAMLSS, we
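
    As a minimal sketch of the GAMLSS idea only, not the R gamlss package the article relies on, the Python snippet below (synthetic data, invented coefficients) fits a location model and a scale model jointly by maximum likelihood, so that the variance of the response, not just its mean, depends on a predictor.

    ```python
    # Toy location-and-scale regression in the spirit of GAMLSS.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    rng = np.random.default_rng(5)
    x = rng.uniform(0, 1, 400)
    y = 2.0 + 3.0 * x + rng.normal(0, np.exp(-1.0 + 2.0 * x))  # variance grows with x

    def nll(params):
        b0, b1, g0, g1 = params
        mu = b0 + b1 * x             # location model
        sigma = np.exp(g0 + g1 * x)  # scale model (log link keeps sigma > 0)
        return -norm.logpdf(y, mu, sigma).sum()

    fit = minimize(nll, x0=[0, 0, 0, 0])
    print(fit.x)  # approximately [2, 3, -1, 2]
    ```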

  8. A double-blind, placebo-controlled, randomized trial of the effects of dark chocolate and cocoa on variables associated with neuropsychological functioning and cardiovascular health: clinical findings from a sample of healthy, cognitively intact older adults.

    PubMed

    Crews, W David; Harrison, David W; Wright, James W

    2008-04-01

    In recent years, there has been increased interest in the potential health-related benefits of antioxidant- and phytochemical-rich dark chocolate and cocoa. The objective of the study was to examine the short-term (6 wk) effects of dark chocolate and cocoa on variables associated with neuropsychological functioning and cardiovascular health in healthy older adults. A double-blind, placebo-controlled, fixed-dose, parallel-group clinical trial was used. Participants (n = 101) were randomly assigned to receive a 37-g dark chocolate bar and 8 ounces (237 mL) of an artificially sweetened cocoa beverage or similar placebo products each day for 6 wk. No significant group (dark chocolate and cocoa or placebo)-by-trial (baseline, midpoint, and end-of-treatment assessments) interactions were found for the neuropsychological, hematological, or blood pressure variables examined. In contrast, the midpoint and end-of-treatment mean pulse rate assessments in the dark chocolate and cocoa group were significantly higher than those at baseline and significantly higher than the midpoint and end-of-treatment rates in the control group. Results of a follow-up questionnaire item on the treatment products that participants believed they had consumed during the trial showed that more than half of the participants in both groups correctly identified the products that they had ingested during the experiment. This investigation failed to support the predicted beneficial effects of short-term dark chocolate and cocoa consumption on any of the neuropsychological or cardiovascular health-related variables included in this research. Consumption of dark chocolate and cocoa was, however, associated with significantly higher pulse rates at 3- and 6-wk treatment assessments.

  9. Quantization of Gaussian samples at very low SNR regime in continuous variable QKD applications

    NASA Astrophysics Data System (ADS)

    Daneshgaran, Fred; Mondin, Marina

    2016-09-01

    The main problem for information reconciliation in continuous variable Quantum Key Distribution (QKD) at low Signal to Noise Ratio (SNR) is the quantization and assignment of labels to the samples of the Gaussian Random Variables (RVs) observed at Alice and Bob. The trouble is that most of the samples, given that the Gaussian variable is zero-mean (which is de facto the case), tend to have small magnitudes and are easily disturbed by noise. Transmission over longer distances increases the losses, corresponding to a lower effective SNR and exacerbating the problem. This paper looks at the quantization of the Gaussian samples in the very low SNR regime from an information-theoretic point of view. We consider two-bit-per-sample quantization of the Gaussian RVs at Alice and Bob and derive expressions for the mutual information between the resulting bit strings. The quantization threshold for the Most Significant Bit (MSB) should be chosen to maximize the mutual information between the quantized bit strings. Furthermore, while the LSB strings at Alice and Bob are balanced, in the sense that their entropy is close to maximal, this is not the case for the second most significant bit even under the optimal threshold. We show that with two-bit quantization at an SNR of -3 dB we achieve 75.8% of the maximal achievable mutual information between Alice and Bob; hence, as the number of quantization bits increases beyond 2, the number of additional useful bits that can be extracted for secret key generation decreases rapidly. Furthermore, the error rates between the bit strings at Alice and Bob at the same significant-bit level are rather high, demanding very powerful error-correcting codes. While our calculations and simulations show that the mutual information between the LSBs at Alice and Bob is 0.1044 bits, that at the MSB level is only 0.035 bits. Hence, it is only by looking at the bits jointly that we are able to achieve a
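
    A toy Monte Carlo estimate in the same spirit, with assumptions mine (a unit-variance signal and sign-bit quantization only, so the labeling conventions differ from the paper's two-bit scheme), shows how the mutual information between Alice's and Bob's quantized bits can be computed empirically at -3 dB SNR.

    ```python
    # Sketch: empirical mutual information of sign-bit quantized Gaussians.
    import numpy as np

    rng = np.random.default_rng(6)
    n = 1_000_000
    snr_db = -3.0
    x = rng.normal(0.0, 1.0, n)                     # Alice's zero-mean samples
    noise_var = 10 ** (-snr_db / 10)                # SNR = 1 / noise_var here
    y = x + rng.normal(0.0, np.sqrt(noise_var), n)  # Bob's noisy observation

    a = (x > 0).astype(int)  # Alice's sign bit
    b = (y > 0).astype(int)  # Bob's sign bit

    # Empirical mutual information I(A;B) from the 2x2 joint histogram.
    joint = np.histogram2d(a, b, bins=2)[0] / n
    pa, pb = joint.sum(axis=1), joint.sum(axis=0)
    mask = joint > 0
    mi = np.sum(joint[mask] * np.log2(joint[mask] / np.outer(pa, pb)[mask]))
    print(f"I(sign_A; sign_B) ~ {mi:.3f} bits")
    ```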

  10. THE EFFECT OF HORMONE THERAPY ON MEAN BLOOD PRESSURE AND VISIT-TO-VISIT BLOOD PRESSURE VARIABILITY IN POSTMENOPAUSAL WOMEN: RESULTS FROM THE WOMEN’S HEALTH INITIATIVE RANDOMIZED CONTROLLED TRIALS

    PubMed Central

    Shimbo, Daichi; Wang, Lu; Lamonte, Michael J.; Allison, Matthew; Wellenius, Gregory A.; Bavry, Anthony A.; Martin, Lisa W.; Aragaki, Aaron; Newman, Jonathan D.; Swica, Yael; Rossouw, Jacques E.; Manson, JoAnn E.; Wassertheil-Smoller, Sylvia

    2014-01-01

    Objectives: Mean and visit-to-visit variability (VVV) of blood pressure are associated with an increased cardiovascular disease risk. We examined the effect of hormone therapy on mean and VVV of blood pressure in postmenopausal women from the Women’s Health Initiative (WHI) randomized controlled trials. Methods: Blood pressure was measured at baseline and annually in the two WHI hormone therapy trials, in which 10,739 and 16,608 postmenopausal women were randomized to conjugated equine estrogens (CEE, 0.625 mg/day) or placebo, and CEE plus medroxyprogesterone acetate (MPA, 2.5 mg/day) or placebo, respectively. Results: At the first annual visit (Year 1), mean systolic blood pressure was 1.04 mmHg (95% CI 0.58, 1.50) and 1.35 mmHg (95% CI 0.99, 1.72) higher in the CEE and CEE+MPA arms, respectively, compared to the corresponding placebos. These effects remained stable after Year 1. CEE also increased VVV of systolic blood pressure (ratio of VVV in CEE vs. placebo, 1.03, P<0.001), whereas CEE+MPA did not (ratio of VVV in CEE+MPA vs. placebo, 1.01, P=0.20). After accounting for study drug adherence, the effects of CEE and CEE+MPA on mean systolic blood pressure increased at Year 1, and the differences in the CEE and CEE+MPA arms vs. placebos also continued to increase after Year 1. Further, both CEE and CEE+MPA significantly increased VVV of systolic blood pressure (ratio of VVV in CEE vs. placebo, 1.04, P<0.001; ratio of VVV in CEE+MPA vs. placebo, 1.05, P<0.001). Conclusions: Among postmenopausal women, CEE and CEE+MPA at conventional doses increased mean and VVV of systolic blood pressure. PMID:24991872

  11. Atomoxetine could improve intra-individual variability in drug-naïve adults with attention-deficit/hyperactivity disorder comparably with methylphenidate: A head-to-head randomized clinical trial.

    PubMed

    Ni, Hsing-Chang; Hwang Gu, Shoou-Lian; Lin, Hsiang-Yuan; Lin, Yu-Ju; Yang, Li-Kuang; Huang, Hui-Chun; Gau, Susan Shur-Fen

    2016-05-01

    Intra-individual variability in reaction time (IIV-RT) is common in individuals with attention-deficit/hyperactivity disorder (ADHD). It can be improved by stimulants. However, the effects of atomoxetine on IIV-RT are inconclusive. We aimed to investigate the effects of atomoxetine on IIV-RT and directly compared its efficacy with methylphenidate in adults with ADHD. An 8-10 week, open-label, head-to-head, randomized clinical trial was conducted in 52 drug-naïve adults with ADHD, who were randomly assigned to two treatment groups: immediate-release methylphenidate (n=26) thrice daily (10-20 mg per dose) and atomoxetine (n=26) once daily (0.5-1.2 mg/kg/day). IIV-RT, derived from the Conners' continuous performance test (CCPT), was represented by the Gaussian (reaction time standard error, RTSE) and ex-Gaussian models (sigma and tau). Other neuropsychological functions, including response errors and mean reaction time, were also measured. Participants received CCPT assessments at baseline and at week 8-10 (60.4±6.3 days). We found comparable improvements in CCPT performance between the immediate-release methylphenidate- and atomoxetine-treated groups. Both medications significantly improved IIV-RT in terms of reducing tau values, with comparable efficacy. In addition, both medications significantly improved inhibitory control by reducing commission errors. Our results provide evidence that atomoxetine can improve IIV-RT and inhibitory control, with efficacy comparable to immediate-release methylphenidate, in drug-naïve adults with ADHD. The shared and unique mechanisms underpinning these medication effects on IIV-RT await further investigation. © The Author(s) 2016.
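
    For readers unfamiliar with the ex-Gaussian decomposition of IIV-RT mentioned above, the sketch below (synthetic reaction times, all parameter values invented) recovers mu, sigma, and tau with scipy's exponnorm distribution, whose shape parameter K equals tau/sigma.

    ```python
    # Sketch: fitting an ex-Gaussian to simulated reaction times.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(9)
    # Simulated RTs (ms): Gaussian part plus exponential tail (tau captures lapses).
    rt = rng.normal(450, 40, 2000) + rng.exponential(120, 2000)

    K, loc, scale = stats.exponnorm.fit(rt)
    tau = K * scale  # exponnorm's shape K = tau / sigma, so tau = K * sigma
    print(f"mu~{loc:.0f} ms, sigma~{scale:.0f} ms, tau~{tau:.0f} ms")
    ```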

  12. α-Glucosidase inhibitor miglitol attenuates glucose fluctuation, heart rate variability and sympathetic activity in patients with type 2 diabetes and acute coronary syndrome: a multicenter randomized controlled (MACS) study.

    PubMed

    Shimabukuro, Michio; Tanaka, Atsushi; Sata, Masataka; Dai, Kazuoki; Shibata, Yoshisato; Inoue, Yohei; Ikenaga, Hiroki; Kishimoto, Shinji; Ogasawara, Kozue; Takashima, Akira; Niki, Toshiyuki; Arasaki, Osamu; Oshiro, Koichi; Mori, Yutaka; Ishihara, Masaharu; Node, Koichi

    2017-07-06

    Little is known about the clinical associations between glucose fluctuations including hypoglycemia, heart rate variability (HRV), and the activity of the sympathetic nervous system (SNS) in patients in the acute phase of acute coronary syndrome (ACS). This pilot study aimed to evaluate the short-term effects of glucose fluctuations on HRV and SNS activity in type 2 diabetes mellitus (T2DM) patients with recent ACS. We also examined the effect of suppressing glucose fluctuations with miglitol on these variables. This prospective, randomized, open-label, blinded-endpoint, multicenter, parallel-group comparative study included 39 T2DM patients with recent ACS, who were randomly assigned to either a miglitol group (n = 19) or a control group (n = 20). After an initial 24-h Holter electrocardiogram (ECG) (Day 1), miglitol was commenced and another 24-h Holter ECG (Day 2) was recorded. In addition, continuous glucose monitoring (CGM) was performed throughout the Holter ECG. Although frequent episodes of subclinical hypoglycemia (≤4.44 mmol/L) during CGM were observed on Day 1 in both groups (35% of patients in the control group and 31% in the miglitol group), glucose fluctuations were decreased and the minimum glucose level was increased, with a substantial reduction in the episodes of subclinical hypoglycemia to 7.7%, in the miglitol group on Day 2. Holter ECG showed that the mean and maximum heart rate and mean LF/HF were increased on Day 2 in the control group, and these increases were attenuated by miglitol. When the 24-h recordings were divided into day-time (0700-1800 h), night-time (1800-0000 h), and bed-time (0000-0700 h) periods, we found increased SNS activity during day-time, an increased maximum heart rate during night-time, and glucose fluctuations during bed-time, which were attenuated by miglitol treatment. In T2DM patients with recent ACS, glucose fluctuations with subclinical hypoglycemia were associated with alterations of HRV and SNS activity, which were mitigated by

  13. Dynamic stability of spinning pretwisted beams subjected to axial random forces

    NASA Astrophysics Data System (ADS)

    Young, T. H.; Gau, C. Y.

    2003-11-01

    This paper studies the dynamic stability of a pretwisted cantilever beam spinning along its longitudinal axis and subjected to an axial random force at the free end. The axial force is assumed to be the sum of a constant force and a random process with zero mean. Due to this axial force, the beam may experience parametric random instability. In this work, the finite element method is first applied to yield the discretized system equations. The stochastic averaging method is then adopted to obtain Ito's equations for the response amplitudes of the system. Finally, the mean-square stability criterion is utilized to determine the stability condition of the system. Numerical results show that the stability boundary of the system converges when the first three modes are included in the calculation. Before convergence is reached, the predicted stability condition is not conservative enough.

  14. On Digital Simulation of Multicorrelated Random Processes and Its Applications. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Sinha, A. K.

    1973-01-01

    Two methods are described to simulate, on a digital computer, a set of correlated, stationary, and Gaussian time series with zero mean from the given matrix of power spectral densities and cross spectral densities. The first method is based upon trigonometric series with random amplitudes and deterministic phase angles. The random amplitudes are generated by using a standard random number generator subroutine. An example is given which corresponds to three components of wind velocities at two different spatial locations for a total of six correlated time series. In the second method, the whole process is carried out using the Fast Fourier Transform approach. This method gives more accurate results and works about twenty times faster for a set of six correlated time series.
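
    A compact sketch in the spirit of the second (FFT-based) method, with assumptions mine (a 2x2 cross-spectral matrix with flat coherence, and only rough output scaling): each frequency bin of complex white noise is colored by a Cholesky factor of the prescribed spectral matrix and transformed back to the time domain.

    ```python
    # Sketch: two correlated zero-mean Gaussian series from a prescribed
    # cross-spectral matrix S(f) via its Cholesky factor H(f).
    import numpy as np

    rng = np.random.default_rng(7)
    n = 4096
    freqs = np.fft.rfftfreq(n, d=1.0)

    # Prescribed spectra: low-pass autospectra, flat coherence of 0.8.
    s11 = 1.0 / (1.0 + (freqs / 0.1) ** 2)
    s22 = s11.copy()
    s12 = 0.8 * np.sqrt(s11 * s22)

    x_f = np.zeros((2, freqs.size), dtype=complex)
    w = (rng.normal(size=(2, freqs.size)) +
         1j * rng.normal(size=(2, freqs.size))) / np.sqrt(2)
    for k in range(freqs.size):
        S = np.array([[s11[k], s12[k]], [s12[k], s22[k]]])
        H = np.linalg.cholesky(S)  # S = H H^T (real and symmetric here)
        x_f[:, k] = H @ w[:, k]

    x = np.fft.irfft(x_f, n=n) * np.sqrt(n / 2)  # back to time domain (rough scaling)
    print(np.corrcoef(x[0], x[1])[0, 1])          # close to the prescribed 0.8
    ```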

  15. Mapping Variables.

    ERIC Educational Resources Information Center

    Stone, Mark H.; Wright, Benjamin D.; Stenner, A. Jackson

    1999-01-01

    Describes mapping variables, the principal technique for planning and constructing a test or rating instrument. A variable map is also useful for interpreting results. Provides several maps to show the importance and value of mapping a variable by person and item data. (Author/SLD)

  16. High Temporospatial Resolution Dynamic Contrast Enhanced (DCE) Wrist MRI with Variable-Density Pseudo-Random CIRcular Cartesian UnderSampling (CIRCUS) Acquisition: Evaluation of Perfusion in Rheumatoid Arthritis Patients

    PubMed Central

    Liu, Jing; Pedoia, Valentina; Heilmeier, Ursula; Ku, Eric; Su, Favian; Khanna, Sameer; Imboden, John; Graf, Jonathan; Link, Thomas; Li, Xiaojuan

    2016-01-01

    This study evaluates highly accelerated 3D dynamic contrast-enhanced (DCE) wrist MRI for the assessment of perfusion in rheumatoid arthritis (RA) patients. A pseudo-random variable-density undersampling strategy, CIRcular Cartesian UnderSampling (CIRCUS), was combined with k-t SPARSE-SENSE reconstruction to achieve highly accelerated 3D DCE wrist MRI. Two healthy volunteers and ten RA patients were studied. Two patients were on methotrexate (MTX) only (Group I) and the other eight were treated with a combination therapy of MTX and anti-tumour necrosis factor (TNF) therapy (Group II). Patients were scanned at baseline and at 3-month follow-up. DCE MR images were used to evaluate perfusion in synovitis and bone marrow edema pattern in the RA wrist joints. A series of perfusion parameters were derived and compared with clinical disease activity scores of 28 joints (DAS28). 3D DCE wrist MR images were obtained with a spatial resolution of 0.3 × 0.3 × 1.5 mm³ and a temporal resolution of 5 s (with an acceleration factor of 20). The derived perfusion parameters, most notably the transition time (dT) of synovitis, showed significant negative correlations with DAS28-ESR (r = -0.80, p < 0.05) and DAS28-CRP (r = -0.87, p < 0.05) at baseline, and also correlated significantly with treatment responses evaluated by clinical score changes between baseline and 3-month follow-up (with DAS28-ESR: r = -0.79, p < 0.05, and DAS28-CRP: r = -0.82, p < 0.05). Highly accelerated 3D DCE wrist MRI with improved temporospatial resolution has been achieved in RA patients and provides accurate assessment of neovascularization and perfusion in RA joints, showing promise as a potential tool for evaluating treatment responses. PMID:26608949
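
    As a loose illustration only, not the CIRCUS trajectory itself, the sketch below (invented matrix size and target acceleration) builds a variable-density pseudo-random Cartesian undersampling mask of the general kind such accelerated acquisitions rely on.

    ```python
    # Toy variable-density pseudo-random Cartesian undersampling mask.
    import numpy as np

    rng = np.random.default_rng(8)
    ny, nz, accel = 192, 32, 20
    ky, kz = np.meshgrid(np.linspace(-1, 1, ny), np.linspace(-1, 1, nz),
                         indexing="ij")
    r = np.sqrt(ky**2 + kz**2)
    density = (1 - r).clip(0.05) ** 2  # denser sampling near the k-space center
    density *= (ny * nz / accel) / density.sum()  # normalize to target acceleration
    mask = rng.random((ny, nz)) < density
    print(f"actual acceleration: {mask.size / mask.sum():.1f}x")
    ```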

  17. Individuals with tension and migraine headaches exhibit increased heart rate variability during post-stress mindfulness meditation practice but a decrease during a post-stress control condition - A randomized, controlled experiment.

    PubMed

    Azam, Muhammad Abid; Katz, Joel; Mohabir, Vina; Ritvo, Paul

    2016-12-01

    Current research suggests that associations between headache conditions (migraine, tension) and imbalances in the autonomic nervous system (ANS) are due to stress-related dysregulation in the activity of the parasympathetic and sympathetic branches. Mindfulness meditation has demonstrated effectiveness in reducing pain-related distress and in enhancing heart rate variability (HRV), a vagally mediated marker of ANS balance. This study examined HRV during cognitive stress and mindfulness meditation in individuals with migraine and tension headaches. Undergraduate students with tension and migraine headaches (n=36) and headache-free students (n=39) were recruited for an experiment involving HRV measurement during baseline, cognitive stress-induction, and after randomization to post-stress conditions of audio-guided mindfulness meditation practice (MMP) or mindfulness meditation description (MMD). HRV was derived from electrocardiograms as the absolute power in the high-frequency bandwidth (ms²). A three-way ANOVA tested the effects of Group (headache vs. headache-free), Phase (baseline, stress, and post-stress), and Condition (MMP vs. MMD) on HRV. The ANOVA revealed a significant three-way interaction. Simple effects tests indicated: 1) HRV increased significantly from stress to MMP for the headache and headache-free groups (p<0.001), 2) significantly greater HRV for the headache (p<0.001) and headache-free (p<0.05) groups during MMP compared to MMD, and 3) significantly lower HRV in the headache vs. headache-free group during the post-stress MMD condition (p<0.05). Results suggest mindfulness practice can promote effective heart rate regulation, and thereby promote effective recovery after a stressful event, for individuals with headache conditions. Moreover, headache conditions may be associated with dysregulated stress recovery, so more research is needed on the cardiovascular health and stress resilience of headache sufferers. Copyright © 2016 Elsevier B.V. All rights reserved.
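
    As a concrete illustration of the outcome measure, the sketch below computes HF-band power (ms²) from a series of RR intervals: resample the irregular tachogram onto a uniform grid, estimate the power spectral density with Welch's method, and integrate over 0.15-0.40 Hz. This is a generic textbook HRV computation, not the authors' processing pipeline; the resampling rate and band edges are common defaults.

        import numpy as np
        from scipy.signal import welch
        from scipy.interpolate import interp1d

        def hf_power(rr_ms, fs=4.0, band=(0.15, 0.40)):
            """High-frequency HRV power (ms^2) from RR intervals in ms."""
            t = np.cumsum(rr_ms) / 1000.0                 # beat times in seconds
            grid = np.arange(t[0], t[-1], 1.0/fs)         # uniform resampling grid
            rr_u = interp1d(t, rr_ms, kind="cubic")(grid)
            rr_u = rr_u - rr_u.mean()                     # zero-mean before the PSD
            f, pxx = welch(rr_u, fs=fs, nperseg=min(256, len(rr_u)))
            sel = (f >= band[0]) & (f < band[1])
            return np.trapz(pxx[sel], f[sel])             # integrate the HF band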

  18. Short-term effects of a hypocaloric diet with low glycemic index and low glycemic load on body adiposity, metabolic variables, ghrelin, leptin, and pregnancy rate in overweight and obese infertile women: a randomized controlled trial.

    PubMed

    Becker, Geórgia F; Passos, Eduardo P; Moulin, Cileide C

    2015-12-01

    Obesity is related to hormonal disorders that affect the reproductive system. Low-glycemic index (LGI) diets seem to exert a positive effect on weight loss and on metabolic changes that result from obesity. We investigated the effects of a hypocaloric diet with an LGI and low glycemic load on anthropometric and metabolic variables, ghrelin and leptin concentrations, and the pregnancy rate in overweight and obese infertile women who were undergoing in vitro fertilization (IVF). The study was a randomized block-design controlled trial in which we analyzed 26 overweight or obese infertile women. Patients were assigned to a hypocaloric LGI-diet group or a control group and followed the protocol for 12 wk. Body weight, body mass index (BMI), percentage of body fat, glucose, insulin, homeostasis model assessment of insulin resistance, serum lipids, reproductive hormones, leptin, acylated ghrelin, number of oocytes retrieved in the IVF cycle, and pregnancy rate were determined. There were greater reductions in body mass, BMI, percentage of body fat, waist:hip ratio, and leptin in the LGI-diet group than in the control group (P < 0.05). Despite a change of 18% in mean values, there was no significant increase in acylated ghrelin concentrations in the LGI group compared with the control group (P = 0.215). The LGI-diet group had 85.4% more oocytes retrieved than did the control group (7.75 ± 1.44 and 4.18 ± 0.87, respectively; P = 0.039) in the IVF cycle. Three patients (21.4%) in the LGI group experienced a spontaneous pregnancy during the follow-up, which generated 3 live births. The hypocaloric LGI diet promoted a decrease in BMI, percentage of body fat, and leptin concentrations, which improved oocyte development and pregnancy rate. These results support the clinical recommendation to advise overweight and obese women to lose weight through a balanced diet before being submitted for treatment with assisted reproduction technologies. A hypocaloric diet combined with LGI

  19. Tangent linear super-parameterization: attributable, decomposable moist processes for tropical variability studies

    NASA Astrophysics Data System (ADS)

    Mapes, B. E.; Kelly, P.; Song, S.; Hu, I. K.; Kuang, Z.

    2015-12-01

    An economical 10-layer global primitive equation solver is driven by time-independent forcing terms, derived from a training process, to produce a realistic eddying basic state with a tracer q trained to act like water vapor mixing ratio. Within this basic state, linearized anomaly moist physics in the column is applied in the form of a 20x20 matrix. The control matrix was derived from the results of Kuang (2010, 2012), who fitted a linear response function from a cloud resolving model in a state of deep convecting equilibrium. By editing this matrix in physical space and eigenspace, scaling and clipping its action, and optionally adding terms for processes that do not conserve moist static energy (radiation, surface fluxes), we can decompose and explain the model's diverse moist process coupled variability. Rectified effects of this variability on the general circulation and climate, even in strictly zero-mean centered anomaly physics cases, are also sometimes surprising.

  20. Modeling Randomness in Judging Rating Scales with a Random-Effects Rating Scale Model

    ERIC Educational Resources Information Center

    Wang, Wen-Chung; Wilson, Mark; Shih, Ching-Lin

    2006-01-01

    This study presents the random-effects rating scale model (RE-RSM) which takes into account randomness in the thresholds over persons by treating them as random-effects and adding a random variable for each threshold in the rating scale model (RSM) (Andrich, 1978). The RE-RSM turns out to be a special case of the multidimensional random…

  1. The effects of a fat loss supplement on resting metabolic rate and hemodynamic variables in resistance trained males: a randomized, double-blind, placebo-controlled, cross-over trial.

    PubMed

    Campbell, Bill I; Colquhoun, Ryan J; Zito, Gina; Martinez, Nic; Kendall, Kristina; Buchanan, Laura; Lehn, Matt; Johnson, Mallory; St Louis, Courtney; Smith, Yasmin; Cloer, Brad

    2016-01-01

    While it is known that dietary supplements containing a combination of thermogenic ingredients can increase resting metabolic rate (RMR), the magnitude can vary based on the active ingredient and/or combination of active ingredients. The purpose of this study was to examine the effects of a commercially available thermogenic fat loss supplement on RMR and hemodynamic variables in healthy, resistance trained males. Ten resistance-trained male participants (29 ± 9 years; 178 ± 4 cm; 85.7 ± 11 kg, and BMI = 26.8 ± 3.7) volunteered to participate in this randomized, double-blind, placebo controlled cross-over study. Participants underwent two testing sessions separated by at least 24 h. On their first visit, participants arrived to the laboratory after an overnight fast and a 24-h avoidance of exercise, and underwent a baseline RMR, HR, and BP assessment. Next, each participant ingested a thermogenic fat loss supplement (TFLS) or a placebo (PLA) and repeated the RMR, HR, and BP assessments at 60, 120, and 180 min post-ingestion. During the second visit the alternative supplement was ingested and the assessments were repeated in the exact same manner. Data were analyzed via a 2-factor [2x4] within-subjects repeated measures analysis of variance (ANOVA). Post-hoc tests were analyzed via paired samples t-tests. The criterion for significance was set at p ≤ 0.05. A significant main effect for time relative to raw RMR data (p = 0.014) was observed. Post-hoc analysis revealed that the TFLS significantly increased RMR at 60-min, 120-min, and 180-min post ingestion (p < 0.05) as compared to baseline RMR values. No significant changes in RMR were observed for the PLA treatment (p > 0.05). Specifically, RMR was increased by 7.8 % (from 1,906 to 2,057 kcal), 6.9 % (from 1,906 to 2,037 kcal), and 9.1 % (from 1,906 to 2,081 kcal) in the TFLS, while the PLA treatment increased RMR by 3.3 % (from 1,919 to 1,981 kcal), 3.1

  2. Random Fields

    NASA Astrophysics Data System (ADS)

    Vanmarcke, Erik

    1983-03-01

    Random variation over space and time is one of the few attributes that might safely be predicted as characterizing almost any given complex system. Random fields or "distributed disorder systems" confront astronomers, physicists, geologists, meteorologists, biologists, and other natural scientists. They appear in the artifacts developed by electrical, mechanical, civil, and other engineers. They even underlie the processes of social and economic change. The purpose of this book is to bring together existing and new methodologies of random field theory and indicate how they can be applied to these diverse areas where a "deterministic treatment is inefficient and conventional statistics insufficient." Many new results and methods are included. After outlining the extent and characteristics of the random field approach, the book reviews the classical theory of multidimensional random processes and introduces basic probability concepts and methods in the random field context. It next gives a concise account of the second-order analysis of homogeneous random fields, in both the space-time domain and the wave number-frequency domain. This is followed by a chapter on spectral moments and related measures of disorder and on level excursions and extremes of Gaussian and related random fields. After developing a new framework of analysis based on local averages of one-, two-, and n-dimensional processes, the book concludes with a chapter discussing ramifications in the important areas of estimation, prediction, and control. The mathematical prerequisite has been held to basic college-level calculus.

  3. Random trinomial tree models and vanilla options

    NASA Astrophysics Data System (ADS)

    Ganikhodjaev, Nasir; Bayram, Kamola

    2013-09-01

    In this paper we introduce and study the random trinomial model. The usual trinomial model is prescribed by a triple of numbers (u, d, m). We call the triple (u, d, m) the environment of the trinomial model. A triple (Un, Dn, Mn), where {Un}, {Dn} and {Mn} are sequences of independent, identically distributed random variables with 0 < Dn < 1 < Un and Mn = 1 for all n, is called a random environment, and a trinomial tree model with a random environment is called a random trinomial model. The random trinomial model is considered to produce more accurate results than the random binomial model or the usual trinomial model.
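
    A minimal Monte Carlo sketch of a random trinomial tree is given below: at every step the price moves by a factor Un, Dn, or Mn = 1, with the environment (Un, Dn) drawn iid as required (0 < Dn < 1 < Un). The uniform environment distributions and the equal move probabilities are illustrative choices, not the paper's calibration, so the number printed is not a risk-neutral option price.

        import numpy as np

        def random_trinomial_call(s0=100.0, strike=100.0, n_steps=50,
                                  n_paths=20000, seed=0):
            """Monte Carlo payoff of a vanilla call on a random trinomial tree:
            the price moves by Un, Dn or Mn = 1 at each step, with (Un, Dn) iid."""
            rng = np.random.default_rng(seed)
            u = rng.uniform(1.01, 1.05, (n_paths, n_steps))   # Un > 1
            d = rng.uniform(0.95, 0.99, (n_paths, n_steps))   # 0 < Dn < 1
            move = rng.integers(0, 3, (n_paths, n_steps))     # 0: up, 1: middle, 2: down
            factors = np.where(move == 0, u, np.where(move == 1, 1.0, d))
            s_t = s0 * factors.prod(axis=1)
            return np.mean(np.maximum(s_t - strike, 0.0))

        print(random_trinomial_call())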

  4. Random walks with random velocities.

    PubMed

    Zaburdaev, Vasily; Schmiedeberg, Michael; Stark, Holger

    2008-07-01

    We consider a random walk model that takes into account the velocity distribution of random walkers. Random motion with alternating velocities is inherent to various physical and biological systems. Moreover, the velocity distribution is often the first characteristic that is experimentally accessible. Here, we derive transport equations describing the dispersal process in the model and solve them analytically. The asymptotic properties of solutions are presented in the form of a phase diagram that shows all possible scaling regimes, including superdiffusive, ballistic, and superballistic motion. The theoretical results of this work are in excellent agreement with accompanying numerical simulations.
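
    The model class is easy to explore numerically. In the sketch below each walker performs flights of exponentially distributed duration, each flight at a fresh velocity drawn from the velocity distribution; the heavy-tailed (Cauchy) choice is used purely for illustration, since that regime produces the faster-than-diffusive spreading discussed in the abstract.

        import numpy as np

        def random_velocity_walk(n_walkers=5000, n_flights=200, seed=0):
            """Random walk with random velocities: each flight lasts an
            exponentially distributed time tau and is traversed at a fresh
            velocity v; returns the final walker positions."""
            rng = np.random.default_rng(seed)
            tau = rng.exponential(1.0, (n_walkers, n_flights))
            v = rng.standard_cauchy((n_walkers, n_flights))   # illustrative choice
            return (v * tau).sum(axis=1)

        x = random_velocity_walk()
        # with a heavy-tailed velocity distribution the spread grows faster than t^(1/2)
        print(np.median(np.abs(x)))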

  5. A novel approach to assess the treatment response using Gaussian random field in PET

    SciT

    Wang, Mengdie; Guo, Ning; Hu, Guangshu

    2016-02-15

    Purpose: The assessment of early therapeutic response to anticancer therapy is vital for treatment planning and patient management in the clinic. With the development of personalized treatment plans, early assessment of treatment response, especially before any anatomically apparent changes after treatment, has become an urgent need in the clinic. Positron emission tomography (PET) imaging serves an important role in clinical oncology for tumor detection, staging, and therapy response assessment. Many studies on therapy response involve interpretation of differences between two PET images, usually in terms of standardized uptake values (SUVs). However, the quantitative accuracy of this measurement is limited. This work proposes a statistically robust approach for therapy response assessment based on Gaussian random fields (GRF) to provide a statistically more meaningful scale to evaluate therapy effects. Methods: The authors propose a new criterion for therapeutic assessment by incorporating image noise into the traditional SUV method. An analytical method based on approximate expressions of the Fisher information matrix was applied to model the variance of individual pixels in reconstructed images. A zero-mean, unit-variance GRF under the null hypothesis (no response to therapy) was obtained by normalizing each pixel of the post-therapy image with the mean and standard deviation of the pre-therapy image. The performance of the proposed method was evaluated by Monte Carlo simulation, where XCAT phantoms (128×128 pixels) with lesions of various diameters (2-6 mm), multiple tumor-to-background contrasts (3-10), and different changes in intensity (6.25%-30%) were used. The receiver operating characteristic curves and the corresponding areas under the curve were computed for both the proposed method and the traditional methods, whose figure of merit is the percentage change of SUVs. The formula for the false positive rate (FPR) estimation was developed for the proposed therapy
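
    The core normalisation step can be summarised compactly. The sketch below forms a zero-mean, unit-variance field under the null hypothesis of no response by standardising the post-therapy image with statistics estimated from the pre-therapy image; unlike the paper, which models per-pixel variance from the Fisher information matrix, this simplified version pools statistics over a region of interest.

        import numpy as np

        def response_z_map(pre, post, roi=None):
            """Approximate zero-mean, unit-variance field under the null of no
            response: standardise the post-therapy image with mean/std taken
            from the pre-therapy image (pooled over a region of interest)."""
            mask = np.ones(pre.shape, bool) if roi is None else roi
            mu, sd = pre[mask].mean(), pre[mask].std(ddof=1)
            return (post - mu) / sd

        # large |z| values flag pixels unlikely under the no-response hypothesis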

  6. Random Dynamics

    NASA Astrophysics Data System (ADS)

    Bennett, D. L.; Brene, N.; Nielsen, H. B.

    1987-01-01

    The goal of random dynamics is the derivation of the laws of Nature as we know them (standard model) from inessential assumptions. The inessential assumptions made here are expressed as sets of general models at extremely high energies: gauge glass and spacetime foam. Both sets of models lead tentatively to the standard model.

  7. Random Vibrations

    NASA Technical Reports Server (NTRS)

Messaro, Semma; Harrison, Phillip

    2010-01-01

    Ares I zonal random vibration environments due to acoustic impingement and combustion processes are developed for liftoff, ascent and reentry. Random vibration test criteria for Ares I Upper Stage pyrotechnic components are developed by enveloping the applicable zonal environments where each component is located. Random vibration tests will be conducted to assure that these components will survive and function appropriately after exposure to the expected vibration environments. Methodology: Random vibration test criteria were desired for the Ares I Upper Stage pyrotechnic components that would envelop all the applicable environments where each component is located. Applicable Ares I vehicle drawings and design information needed to be assessed to determine the location(s) of each component on the Ares I Upper Stage. Design and test criteria needed to be developed by plotting and enveloping the applicable environments using Microsoft Excel and documenting them in a report using Microsoft Word. Conclusion: Random vibration liftoff, ascent, and green-run design and test criteria for the Upper Stage pyrotechnic components were developed by using Microsoft Excel to envelop the zonal environments applicable to each component. Results were transferred from Excel into a report using Microsoft Word. After the report is reviewed and edited by my mentor it will be submitted for publication as an attachment to a memorandum. Pyrotechnic component designers will extract criteria from my report for incorporation into the design and test specifications for components. Eventually the hardware will be tested to the environments I developed to assure that the components will survive and function appropriately after exposure to the expected vibration environments.

  8. Investigating Factorial Invariance of Latent Variables Across Populations When Manifest Variables Are Missing Completely

    PubMed Central

    Widaman, Keith F.; Grimm, Kevin J.; Early, Dawnté R.; Robins, Richard W.; Conger, Rand D.

    2013-01-01

    Difficulties arise in multiple-group evaluations of factorial invariance if particular manifest variables are missing completely in certain groups. Ad hoc analytic alternatives can be used in such situations (e.g., deleting manifest variables), but some common approaches, such as multiple imputation, are not viable. At least 3 solutions to this problem are viable: analyzing differing sets of variables across groups, using pattern mixture approaches, and a new method using random number generation. The latter solution, proposed in this article, is to generate pseudo-random normal deviates for all observations for manifest variables that are missing completely in a given sample and then to specify multiple-group models in a way that respects the random nature of these values. An empirical example is presented in detail comparing the 3 approaches. The proposed solution can enable quantitative comparisons at the latent variable level between groups using programs that require the same number of manifest variables in each group. PMID:24019738

  9. Probability distribution for the Gaussian curvature of the zero level surface of a random function

    NASA Astrophysics Data System (ADS)

    Hannay, J. H.

    2018-04-01

    A rather natural construction for a smooth random surface in space is the level surface of value zero, or 'nodal' surface f(x, y, z) = 0, of a (real) random function f; the interface between positive and negative regions of the function. A physically significant local attribute at a point of a curved surface is its Gaussian curvature (the product of its principal curvatures) because, when integrated over the surface, it gives the Euler characteristic. Here the probability distribution for the Gaussian curvature at a random point on the nodal surface f = 0 is calculated for a statistically homogeneous ('stationary') and isotropic zero mean Gaussian random function f. Capitalizing on the isotropy, a 'fixer' device for axes supplies the probability distribution directly as a multiple integral. Its evaluation yields an explicit algebraic function with a simple average. Indeed, this average Gaussian curvature has long been known. For a non-zero level surface instead of the nodal one, the probability distribution is not fully tractable, but is supplied as an integral expression.

  10. Randomized Item Response Theory Models

    ERIC Educational Resources Information Center

    Fox, Jean-Paul

    2005-01-01

    The randomized response (RR) technique is often used to obtain answers to sensitive questions. A new method is developed to measure latent variables using the RR technique, because direct questioning leads to biased results. Within the RR technique, the probability of the true response is modeled by an item response theory (IRT) model. The RR…

  11. A Phase 2, randomized, partially blinded, active-controlled study assessing the efficacy and safety of variable anticoagulation reversal using the REG1 system in patients with acute coronary syndromes: results of the RADAR trial

    PubMed Central

    Povsic, Thomas J.; Vavalle, John P.; Aberle, Laura H.; Kasprzak, Jaroslaw D.; Cohen, Mauricio G.; Mehran, Roxana; Bode, Christoph; Buller, Christopher E.; Montalescot, Gilles; Cornel, Jan H.; Rynkiewicz, Andrzej; Ring, Michael E.; Zeymer, Uwe; Natarajan, Madhu; Delarche, Nicolas; Zelenkofske, Steven L.; Becker, Richard C.; Alexander, John H.

    2013-01-01

    Aims We sought to determine the degree of anticoagulation reversal required to mitigate bleeding, and assess the feasibility of using pegnivacogin to prevent ischaemic events in acute coronary syndrome (ACS) patients managed with an early invasive approach. REG1 consists of pegnivacogin, an RNA aptamer selective factor IXa inhibitor, and its complementary controlling agent, anivamersen. REG1 has not been studied in invasively managed patients with ACS nor has an optimal level of reversal allowing safe sheath removal been defined. Methods and results Non-ST-elevation ACS patients (n = 640) with planned early cardiac catheterization via femoral access were randomized 2:1:1:2:2 to pegnivacogin with 25, 50, 75, or 100% anivamersen reversal or heparin. The primary endpoint was total ACUITY bleeding through 30 days. Secondary endpoints included major bleeding and the composite of death, myocardial infarction, urgent target vessel revascularization, or recurrent ischaemia. Enrolment in the 25% reversal arm was suspended after 41 patients. Enrolment was stopped after three patients experienced allergic-like reactions. Bleeding occurred in 65, 34, 35, 30, and 31% of REG1 patients with 25, 50, 75, and 100% reversal and heparin. Major bleeding occurred in 20, 11, 8, 7, and 10% of patients. Ischaemic events occurred in 3.0 and 5.7% of REG1 and heparin patients, respectively. Conclusion At least 50% reversal is required to allow safe sheath removal after cardiac catheterization. REG1 appears a safe strategy to anticoagulate ACS patients managed invasively and warrants further investigation in adequately powered clinical trials of patients who require short-term high-intensity anticoagulation. Clinical Trials Registration: ClinicalTrials.gov NCT00932100. PMID:22859796

  12. Understanding Solar Cycle Variability

    SciT

    Cameron, R. H.; Schüssler, M., E-mail: cameron@mps.mpg.de

    The level of solar magnetic activity, as exemplified by the number of sunspots and by energetic events in the corona, varies on a wide range of timescales. Most prominent is the 11-year solar cycle, which is significantly modulated on longer timescales. Drawing from dynamo theory, together with the empirical results of past solar activity and similar phenomena for solar-like stars, we show that the variability of the solar cycle can be essentially understood in terms of a weakly nonlinear limit cycle affected by random noise. In contrast to ad hoc "toy models" for the solar cycle, this leads to a generic normal-form model, whose parameters are all constrained by observations. The model reproduces the characteristics of the variable solar activity on timescales between decades and millennia, including the occurrence and statistics of extended periods of very low activity (grand minima). Comparison with results obtained with a Babcock-Leighton-type dynamo model confirms the validity of the normal-form approach.
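
    The notion of a weakly nonlinear limit cycle affected by random noise can be made concrete with a noisy normal-form (Stuart-Landau-type) amplitude equation, as sketched below. Parameter values are illustrative, not the observationally constrained values of the paper; long runs show the cycle amplitude wandering and occasionally collapsing toward grand-minimum-like states.

        import numpy as np

        def noisy_limit_cycle(T=1000.0, dt=0.01, omega=2*np.pi/11.0,
                              gamma=0.1, sigma=0.05, seed=0):
            """Euler-Maruyama integration of a noisy normal-form amplitude equation
                dB = (gamma - |B|^2 + i*omega) B dt + sigma dW,
            whose modulus |B| mimics the fluctuating cycle amplitude."""
            rng = np.random.default_rng(seed)
            n = int(T / dt)
            B = np.empty(n, dtype=complex)
            B[0] = np.sqrt(gamma)          # start on the deterministic limit cycle
            for i in range(1, n):
                dW = (rng.normal() + 1j*rng.normal()) * np.sqrt(dt/2)
                drift = (gamma - abs(B[i-1])**2 + 1j*omega) * B[i-1]
                B[i] = B[i-1] + drift*dt + sigma*dW
            return np.abs(B)               # cycle amplitude time series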

  13. Is random access memory random?

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1986-01-01

    Most software is constructed on the assumption that the programs and data are stored in random access memory (RAM). Physical limitations on the relative speeds of processor and memory elements lead to a variety of memory organizations that match the processor addressing rate with the memory service rate. These include interleaved and cached memory. A very high fraction of a processor's address requests can be satisfied from the cache without reference to the main memory. The cache requests information from main memory in blocks that can be transferred at the full memory speed. Programmers who organize algorithms for locality can realize the highest performance from these computers.

  14. Gaussian closure technique applied to the hysteretic Bouc model with non-zero mean white noise excitation

    NASA Astrophysics Data System (ADS)

    Waubke, Holger; Kasess, Christian H.

    2016-11-01

    Devices that emit structure-borne sound are commonly decoupled by elastic components to shield the environment from acoustic noise and vibrations. The elastic elements often have a hysteretic behavior that is typically neglected. In order to take hysteretic behavior into account, Bouc developed a differential equation for such materials, especially joints made of rubber or equipped with dampers. In this work, the Bouc model is solved by means of the Gaussian closure technique based on the Kolmogorov equation. Kolmogorov developed a method to derive probability density functions for arbitrary explicit first-order vector differential equations under white noise excitation, using a partial differential equation of a multivariate conditional probability distribution. Up to now, no analytical solution of the Kolmogorov equation in conjunction with the Bouc model exists; therefore, a wide range of approximate solutions, especially statistical linearization, has been developed. Using the Gaussian closure technique, an approximation to the Kolmogorov equation that assumes a multivariate Gaussian distribution, an analytic solution is derived in this paper for the Bouc model. For the stationary case the two methods yield equivalent results; however, in contrast to statistical linearization, the presented solution allows the transient behavior to be calculated explicitly. Further, the stationary case leads to an implicit set of equations that can be solved iteratively, with a small number of iterations and without instabilities, for specific parameter sets.
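
    For comparison with the analytic Gaussian closure solution, the transient and stationary response statistics can also be estimated by direct simulation. The sketch below integrates a single-degree-of-freedom oscillator with Bouc-Wen hysteresis under white noise with a non-zero mean using Euler-Maruyama; all parameter values are illustrative, and this Monte Carlo route is an alternative to, not a transcription of, the paper's method.

        import numpy as np

        def bouc_wen_response(mean_force=0.5, noise_std=1.0, dt=1e-3, T=50.0,
                              m=1.0, c=0.1, k=1.0, alpha=0.5,
                              A=1.0, beta=0.5, gamma=0.5, n_exp=1.0, seed=0):
            """Euler-Maruyama simulation of
                m x'' + c x' + alpha k x + (1-alpha) k z = F + sigma xi(t)
                z'  = A x' - beta |x'| |z|^(n-1) z - gamma x' |z|^n
            returning time histories of displacement x and hysteretic variable z."""
            rng = np.random.default_rng(seed)
            steps = int(T / dt)
            x = v = z = 0.0
            xs, zs = np.empty(steps), np.empty(steps)
            for i in range(steps):
                dW = rng.normal(0.0, np.sqrt(dt))
                restoring = alpha*k*x + (1 - alpha)*k*z
                v += ((mean_force - c*v - restoring)/m)*dt + (noise_std/m)*dW
                x += v*dt
                z += (A*v - beta*abs(v)*abs(z)**(n_exp - 1)*z
                      - gamma*v*abs(z)**n_exp)*dt
                xs[i], zs[i] = x, z
            return xs, zs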

  15. Compliance-Effect Correlation Bias in Instrumental Variables Estimators

    ERIC Educational Resources Information Center

    Reardon, Sean F.

    2010-01-01

    Instrumental variable estimators hold the promise of enabling researchers to estimate the effects of educational treatments that are not (or cannot be) randomly assigned but that may be affected by randomly assigned interventions. Examples of the use of instrumental variables in such cases are increasingly common in educational and social science…

  16. Effects of neuromuscular electrical stimulation, laser therapy and LED therapy on the masticatory system and the impact on sleep variables in cerebral palsy patients: a randomized, five arms clinical trial.

    PubMed

    Giannasi, Lilian Chrystiane; Matsui, Miriam Yumi; de Freitas Batista, Sandra Regina; Hardt, Camila Teixeira; Gomes, Carla Paes; Amorim, José Benedito Oliveira; de Carvalho Aguiar, Isabella; Collange, Luanda; Dos Reis Dos Santos, Israel; Dias, Ismael Souza; de Oliveira, Cláudia Santos; de Oliveira, Luis Vicente Franco; Gomes, Mônica Fernandes

    2012-05-15

    Few studies demonstrate the effectiveness of therapies for the oral rehabilitation of patients with cerebral palsy (CP), given the difficulties in chewing, swallowing and speech, besides the intellectual, sensory and social limitations. Due to upper airway obstruction, these patients are also vulnerable to sleep disorders. This study aims to assess sleep variables, through polysomnography, and masticatory dynamics, using electromyography, before and after neuromuscular electrical stimulation, associated or not with low-power laser (gallium-aluminum arsenide, λ = 780 nm) and LED (λ = 660 nm) irradiation, in CP patients. 50 patients with CP, of both genders, aged between 19 and 60 years, will be enrolled in this study. The inclusion criteria are: voluntary participation, patient with hemiparetic, quadriparetic or diparetic CP, with the ability to understand and respond to verbal commands. The exclusion criteria are: patients undergoing or having undergone orthodontic, functional maxillary orthopedic or botulinum toxin treatment. Polysomnographic exams and surface electromyographic exams of the masseter, temporalis and suprahyoid muscles will be carried out for the whole sample. A questionnaire assessing oral characteristics will be applied. The sample will be divided into 5 treatment groups: Group 1: neuromuscular electrical stimulation; Group 2: laser therapy; Group 3: LED therapy; Group 4: neuromuscular electrical stimulation and laser therapy; and Group 5: neuromuscular electrical stimulation and LED therapy. All patients will be treated for 8 consecutive weeks. After treatment, polysomnographic and electromyographic exams will be collected again. This paper describes a five-arm clinical trial assessing sleep quality and masticatory function in patients with CP under non-invasive therapies. The protocol for this study is registered with the Brazilian Registry of Clinical Trials - ReBEC RBR-994XFS.

  17. Effects of neuromuscular electrical stimulation, laser therapy and LED therapy on the masticatory system and the impact on sleep variables in cerebral palsy patients: a randomized, five arms clinical trial

    PubMed Central

    2012-01-01

    Background Few studies demonstrate the effectiveness of therapies for the oral rehabilitation of patients with cerebral palsy (CP), given the difficulties in chewing, swallowing and speech, besides the intellectual, sensory and social limitations. Due to upper airway obstruction, these patients are also vulnerable to sleep disorders. This study aims to assess sleep variables, through polysomnography, and masticatory dynamics, using electromyography, before and after neuromuscular electrical stimulation, associated or not with low-power laser (gallium-aluminum arsenide, λ = 780 nm) and LED (λ = 660 nm) irradiation, in CP patients. Methods/design 50 patients with CP, of both genders, aged between 19 and 60 years, will be enrolled in this study. The inclusion criteria are: voluntary participation, patient with hemiparetic, quadriparetic or diparetic CP, with the ability to understand and respond to verbal commands. The exclusion criteria are: patients undergoing or having undergone orthodontic, functional maxillary orthopedic or botulinum toxin treatment. Polysomnographic exams and surface electromyographic exams of the masseter, temporalis and suprahyoid muscles will be carried out for the whole sample. A questionnaire assessing oral characteristics will be applied. The sample will be divided into 5 treatment groups: Group 1: neuromuscular electrical stimulation; Group 2: laser therapy; Group 3: LED therapy; Group 4: neuromuscular electrical stimulation and laser therapy; and Group 5: neuromuscular electrical stimulation and LED therapy. All patients will be treated for 8 consecutive weeks. After treatment, polysomnographic and electromyographic exams will be collected again. Discussion This paper describes a five-arm clinical trial assessing sleep quality and masticatory function in patients with CP under non-invasive therapies. Trial registration The protocol for this study is registered with the Brazilian Registry of Clinical Trials - ReBEC RBR-994XFS. Descriptors: Cerebral Palsy. Stomatognathic System

  18. Diffusion in random networks

    DOE PAGES

    Zhang, Duan Z.; Padrino, Juan C.

    2017-06-01

    The ensemble averaging technique is applied to model mass transport by diffusion in random networks. The system consists of an ensemble of random networks, where each network is made of pockets connected by tortuous channels. Inside a channel, fluid transport is assumed to be governed by the one-dimensional diffusion equation. Mass balance leads to an integro-differential equation for the pocket mass density. The so-called dual-porosity model is found to be equivalent to the leading-order approximation of the integration kernel when the diffusion time scale inside the channels is small compared to the macroscopic time scale. As a test problem, we consider one-dimensional mass diffusion in a semi-infinite domain. Because of the time required to establish the linear concentration profile inside a channel, for early times the similarity variable is xt^(-1/4) rather than xt^(-1/2) as in the traditional theory. We found that this early-time similarity can be explained by random walk theory through the network.

  19. Anderson localization for radial tree-like random quantum graphs

    NASA Astrophysics Data System (ADS)

    Hislop, Peter D.; Post, Olaf

    We prove that certain random models associated with radial, tree-like, rooted quantum graphs exhibit Anderson localization at all energies. The two main examples are the random length model (RLM) and the random Kirchhoff model (RKM). In the RLM, the lengths of each generation of edges form a family of independent, identically distributed random variables (iid). For the RKM, the iid random variables are associated with each generation of vertices and moderate the current flow through the vertex. We consider extensions to various families of decorated graphs and prove stability of localization with respect to decoration. In particular, we prove Anderson localization for the random necklace model.

  20. True Randomness from Big Data

    NASA Astrophysics Data System (ADS)

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-09-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.

  1. True Randomness from Big Data.

    PubMed

    Papakonstantinou, Periklis A; Woodruff, David P; Yang, Guang

    2016-09-26

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.

  2. True Randomness from Big Data

    PubMed Central

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-01-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests. PMID:27666514

  3. A Multivariate Randomization Test of Association Applied to Cognitive Test Results

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert; Beard, Bettina

    2009-01-01

    Randomization tests provide a conceptually simple, distribution-free way to implement significance testing. We have applied this method to the problem of evaluating the significance of the association among a number (k) of variables. The randomization method was the random re-ordering of k-1 of the variables. The criterion variable was the value of the largest eigenvalue of the correlation matrix.
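
    The procedure is straightforward to implement. The sketch below follows the scheme described in the abstract: the test statistic is the largest eigenvalue of the correlation matrix of the k variables, and the null distribution is built by randomly re-ordering k-1 of them; the data in the usage example are synthetic.

        import numpy as np

        def eigenvalue_randomization_test(X, n_perm=2000, seed=0):
            """Randomization test of association among the k columns of X:
            statistic = largest eigenvalue of the correlation matrix; the null
            distribution re-orders k-1 of the variables (the first stays fixed)."""
            rng = np.random.default_rng(seed)
            k = X.shape[1]
            stat = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)).max()
            null = np.empty(n_perm)
            for b in range(n_perm):
                Xp = X.copy()
                for j in range(1, k):                 # re-order k-1 variables
                    rng.shuffle(Xp[:, j])
                null[b] = np.linalg.eigvalsh(np.corrcoef(Xp, rowvar=False)).max()
            return (1 + np.sum(null >= stat)) / (1 + n_perm)  # permutation p-value

        # usage: correlated columns should give a small p-value
        rng = np.random.default_rng(1)
        z = rng.normal(size=(60, 1))
        X = z + 0.7*rng.normal(size=(60, 4))
        print(eigenvalue_randomization_test(X))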

  4. Tests of Hypotheses Arising In the Correlated Random Coefficient Model*

    PubMed Central

    Heckman, James J.; Schmierer, Daniel

    2010-01-01

    This paper examines the correlated random coefficient model. It extends the analysis of Swamy (1971), who pioneered the uncorrelated random coefficient model in economics. We develop the properties of the correlated random coefficient model and derive a new representation of the variance of the instrumental variable estimator for that model. We develop tests of the validity of the correlated random coefficient model against the null hypothesis of the uncorrelated random coefficient model. PMID:21170148

  5. Machine learning search for variable stars

    NASA Astrophysics Data System (ADS)

    Pashchenko, Ilya N.; Sokolovsky, Kirill V.; Gavras, Panagiotis

    2018-04-01

    Photometric variability detection is often considered as a hypothesis testing problem: an object is variable if the null hypothesis that its brightness is constant can be ruled out given the measurements and their uncertainties. The practical applicability of this approach is limited by uncorrected systematic errors. We propose a new variability detection technique sensitive to a wide range of variability types while being robust to outliers and underestimated measurement uncertainties. We consider variability detection as a classification problem that can be approached with machine learning. Logistic Regression (LR), Support Vector Machines (SVM), k Nearest Neighbours (kNN), Neural Nets (NN), Random Forests (RF), and Stochastic Gradient Boosting classifier (SGB) are applied to 18 features (variability indices) quantifying scatter and/or correlation between points in a light curve. We use a subset of Optical Gravitational Lensing Experiment phase two (OGLE-II) Large Magellanic Cloud (LMC) photometry (30 265 light curves) that was searched for variability using traditional methods (168 known variable objects) as the training set and then apply the NN to a new test set of 31 798 OGLE-II LMC light curves. Among 205 candidates selected in the test set, 178 are real variables, while 13 low-amplitude variables are new discoveries. The machine learning classifiers considered are found to be more efficient (select more variables and fewer false candidates) compared to traditional techniques using individual variability indices or their linear combination. The NN, SGB, SVM, and RF show a higher efficiency compared to LR and kNN.

  6. Qualitatively Assessing Randomness in SVD Results

    NASA Astrophysics Data System (ADS)

    Lamb, K. W.; Miller, W. P.; Kalra, A.; Anderson, S.; Rodriguez, A.

    2012-12-01

    Singular Value Decomposition (SVD) is a powerful tool for identifying regions of significant co-variability between two spatially distributed datasets. SVD has been widely used in atmospheric research to define relationships between sea surface temperatures, geopotential height, wind, precipitation and streamflow data for myriad regions across the globe. A typical application for SVD is to identify leading climate drivers (as observed in the wind or pressure data) for a particular hydrologic response variable such as precipitation, streamflow, or soil moisture. One can also investigate the lagged relationship between a climate variable and the hydrologic response variable using SVD. When performing these studies it is important to limit the spatial bounds of the climate variable to reduce the chance of identifying random co-variance relationships. On the other hand, a climate region that is too small may miss climate signals that have more than a statistical relationship to a hydrologic response variable. The proposed research seeks a qualitative method for identifying random co-variability relationships between two data sets. It takes the heterogeneous correlation maps from several past results and compares them with correlation maps produced using purely random and quasi-random climate data. The comparison yields a methodology for determining whether a particular region on a correlation map can be explained by a physical mechanism or is simply statistical chance.
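
    The SVD (maximum-covariance) computation underlying such studies is compact, as sketched below: form the cross-covariance matrix between the two anomaly fields and take its singular value decomposition; the squared covariance fraction and the expansion coefficients then feed the heterogeneous correlation maps. Repeating the analysis with the climate field replaced by random data, as the abstract proposes, gives a reference for judging whether a map could be statistical chance. Variable names are illustrative.

        import numpy as np

        def svd_coupled_modes(P, Q, n_modes=3):
            """Maximum-covariance analysis of P (time x space1, e.g. SST) and
            Q (time x space2, e.g. streamflow): SVD of the cross-covariance
            matrix gives paired spatial patterns, the squared covariance
            fraction (SCF), and expansion coefficients."""
            Pa = P - P.mean(axis=0)            # remove the time mean
            Qa = Q - Q.mean(axis=0)
            C = Pa.T @ Qa / (P.shape[0] - 1)   # cross-covariance matrix
            U, s, Vt = np.linalg.svd(C, full_matrices=False)
            scf = s**2 / np.sum(s**2)
            # correlating Qa with the P-side coefficients A yields the
            # heterogeneous correlation maps discussed in the abstract
            A, B = Pa @ U[:, :n_modes], Qa @ Vt[:n_modes].T
            return U[:, :n_modes], Vt[:n_modes].T, scf[:n_modes], A, B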

  7. Detecting Random, Partially Random, and Nonrandom Minnesota Multiphasic Personality Inventory-2 Protocols

    ERIC Educational Resources Information Center

    Pinsoneault, Terry B.

    2007-01-01

    The ability of the Minnesota Multiphasic Personality Inventory-2 (MMPI-2; J. N. Butcher et al., 2001) validity scales to detect random, partially random, and nonrandom MMPI-2 protocols was investigated. Investigations included the Variable Response Inconsistency scale (VRIN), F, several potentially useful new F and VRIN subscales, and F(b) - F…

  8. Variable-bias coin tossing

    NASA Astrophysics Data System (ADS)

    Colbeck, Roger; Kent, Adrian

    2006-03-01

    Alice is a charismatic quantum cryptographer who believes her parties are unmissable; Bob is a (relatively) glamorous string theorist who believes he is an indispensable guest. To prevent possibly traumatic collisions of self-perception and reality, their social code requires that decisions about invitation or acceptance be made via a cryptographically secure variable-bias coin toss (VBCT). This generates a shared random bit by the toss of a coin whose bias is secretly chosen, within a stipulated range, by one of the parties; the other party learns only the random bit. Thus one party can secretly influence the outcome, while both can save face by blaming any negative decisions on bad luck. We describe here some cryptographic VBCT protocols whose security is guaranteed by quantum theory and the impossibility of superluminal signaling, setting our results in the context of a general discussion of secure two-party computation. We also briefly discuss other cryptographic applications of VBCT.

  9. Neyman Pearson detection of K-distributed random variables

    NASA Astrophysics Data System (ADS)

    Tucker, J. Derek; Azimi-Sadjadi, Mahmood R.

    2010-04-01

    In this paper a new detection method for sonar imagery in K-distributed background clutter is developed. The equation for the log-likelihood is derived and compared with the corresponding counterparts derived under Gaussian and Rayleigh assumptions. Test results of the proposed method on a data set of synthetic underwater sonar images are also presented. This database contains images with targets of different shapes inserted into backgrounds generated using a correlated K-distributed model. Results illustrating the effectiveness of the K-distribution-based detector are presented in terms of probability of detection, false alarm, and correct classification rates for various bottom clutter scenarios.

  10. Probability Distributions of Minkowski Distances between Discrete Random Variables.

    ERIC Educational Resources Information Center

    Schroger, Erich; And Others

    1993-01-01

    Minkowski distances are used to indicate the similarity of two vectors in an N-dimensional space. The paper shows how to compute the probability function, the expectation, and the variance of Minkowski distances, including the special cases of the city-block distance and the Euclidean distance. Critical values for tests of significance are presented in tables. (SLD)

  11. How to Do Random Allocation (Randomization)

    PubMed Central

    Shin, Wonshik

    2014-01-01

    Purpose To explain the concept and procedure of random allocation as used in a randomized controlled study. Methods We explain the general concept of random allocation and demonstrate how to perform the procedure easily and how to report it in a paper. PMID:24605197
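
    A common way to perform the allocation in practice is randomization by permuted blocks, sketched below: each block contains every arm equally often in shuffled order, so group sizes stay balanced as recruitment proceeds. This is one standard scheme consistent with the paper's topic, not a transcription of its procedure.

        import random

        def block_randomize(n_subjects, arms=("A", "B"), block_size=4, seed=42):
            """Random allocation by permuted blocks: within each block every arm
            appears equally often and the order is shuffled."""
            rng = random.Random(seed)
            per_block = block_size // len(arms)
            allocation = []
            while len(allocation) < n_subjects:
                block = list(arms) * per_block
                rng.shuffle(block)
                allocation.extend(block)
            return allocation[:n_subjects]

        print(block_randomize(10))   # e.g. ['B', 'A', 'A', 'B', ...]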

  12. Soil variability in engineering applications

    NASA Astrophysics Data System (ADS)

    Vessia, Giovanna

    2014-05-01

    Natural geomaterials, such as soils and rocks, show spatial variability and heterogeneity of physical and mechanical properties, which can be measured by in-field and laboratory testing. Heterogeneity concerns different values of litho-technical parameters in similar lithological units placed close to each other. By contrast, variability is inherent to the formation and evolution processes experienced by each geological unit (homogeneous geomaterials on average) and is captured as a spatial structure of fluctuation of physical property values about their mean trend, e.g. the unit weight, the hydraulic permeability, the friction angle, and the cohesion, among others. These spatial variations must be handled by engineering models to accomplish reliable design of structures and infrastructures. Matheron (1962) introduced geostatistics as the most comprehensive tool to manage the spatial correlation of parameter measurements, used in a wide range of earth science applications. In the field of engineering geology, Vanmarcke (1977) made the first pioneering attempts to describe and manage the inherent variability in geomaterials, although Terzaghi (1943) had already highlighted that the spatial fluctuations of physical and mechanical parameters used in geotechnical design cannot be neglected. A few years later, Mandelbrot (1983) and Turcotte (1986) interpreted the internal arrangement of geomaterials according to fractal theory. In the same years, Vanmarcke (1983) proposed the random field theory, providing mathematical tools to deal with the inherent variability of each geological unit or stratigraphic succession that can be treated as one material. In this approach, measured fluctuations of physical parameters are interpreted through a spatial variability structure consisting of the correlation function and the scale of fluctuation. Fenton and Griffiths (1992) combined random field simulation with the finite element method to produce the Random
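
    A random field in Vanmarcke's sense is easy to sample once a mean trend, a variance, and a correlation structure are chosen. The sketch below generates a one-dimensional stationary Gaussian profile (for example, a soil property versus depth) with an exponential correlation function via Cholesky factorization of the covariance matrix; all parameter values are illustrative.

        import numpy as np

        def gaussian_random_field_1d(n=200, dx=0.5, mean=30.0, std=3.0,
                                     corr_len=4.0, seed=0):
            """1D stationary Gaussian random field with exponential correlation
            rho(tau) = exp(-2|tau|/theta), where theta = corr_len plays the role
            of Vanmarcke's scale of fluctuation."""
            z = dx * np.arange(n)
            cov = std**2 * np.exp(-2.0*np.abs(z[:, None] - z[None, :])/corr_len)
            L = np.linalg.cholesky(cov + 1e-10*np.eye(n))   # jitter for stability
            rng = np.random.default_rng(seed)
            return mean + L @ rng.normal(size=n)

        profile = gaussian_random_field_1d()   # e.g. friction angle vs depth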

  13. Variable Pitch Propellers

    NASA Technical Reports Server (NTRS)

    1920-01-01

    In this report are described four different types of propellers which appeared at widely separated dates, but which were exhibited together at the last Salon de l'Aeronautique. The four propellers are the Chaviere variable pitch propeller, the variable pitch propeller used on the Clement Bayard dirigible, the variable pitch propeller used on Italian dirigibles, and the Levasseur variable pitch propeller.

  14. A Geometrical Framework for Covariance Matrices of Continuous and Categorical Variables

    ERIC Educational Resources Information Center

    Vernizzi, Graziano; Nakai, Miki

    2015-01-01

    It is well known that a categorical random variable can be represented geometrically by a simplex. Accordingly, several measures of association between categorical variables have been proposed and discussed in the literature. Moreover, the standard definitions of covariance and correlation coefficient for continuous random variables have been…

  15. Confirmation of radial velocity variability in Arcturus

    NASA Technical Reports Server (NTRS)

    Cochran, William D.

    1988-01-01

    The paper presents results of high-precision measurements of radial-velocity variations in Alpha Boo. Significant radial-velocity variability is detected well in excess of the random and systematic measurement errors. The radial velocity varies by an amount greater than 200 m/sec with a period of around 2 days.

  16. Effects of a single administration of prostaglandin F2alpha, or a combination of prostaglandin F2alpha and prostaglandin E2, or placebo on fertility variables in dairy cows 3-5 weeks post partum, a randomized, double-blind clinical trial.

    PubMed

    Hirsbrunner, Gaby; Burkhardt, Heinz W; Steiner, Adrian

    2006-12-21

    Delayed uterine involution has negative effects on the fertility of cows; use of prostaglandin F2alpha alone as a single treatment has not been shown to consistently improve fertility. Combined administration of PGF2alpha and PGE2 increased uterine pressure in healthy cows. We hypothesized that the combination of both prostaglandins would accelerate uterine involution and therefore have a positive effect on fertility variables. In commercial dairy farming, the benefit of a single post-partum combined prostaglandin treatment should be demonstrated. 383 cows from commercial dairy farms were included in this study. Uterine size and secretion were evaluated at treatment 21-35 days post partum and 14 days later. Cows were randomly allocated to one of three treatment groups: PGF2alpha and PGE2, PGF2alpha, or placebo. For every animal participating in the study, the following reproduction variables were recorded: interval from calving to first insemination, days open, number of artificial inseminations (AI) to conception, subsequent treatment of the uterus, and subsequent treatment of the ovaries. The plasma progesterone level at the time of treatment was used as a covariable. For continuous measurements, analysis of variance was performed. Fisher's exact test for categorical non-ordered data and the exact Kruskal-Wallis test for ordered data were used; pairwise group comparisons with Bonferroni adjustment of the significance level were performed. There was no significant difference among treatment groups in uterine size. Furthermore, there was no significant difference among treatments concerning days open, number of AI, and subsequent treatment of the uterus and ovaries. Days from calving to first insemination tended to be shorter for cows with a low progesterone level given PGF2alpha and PGE2 in combination than for the placebo group (P = 0.024). The results of this study indicate that the administration of PGF2alpha or a combination of PGF2alpha and PGE2 21 to 35 days post partum had no beneficial

  17. Effects of a single administration of prostaglandin F2alpha, or a combination of prostaglandin F2alpha and prostaglandin E2, or placebo on fertility variables in dairy cows 3–5 weeks post partum, a randomized, double-blind clinical trial

    PubMed Central

    Hirsbrunner, Gaby; Burkhardt, Heinz W; Steiner, Adrian

    2006-01-01

    Background Delayed uterine involution has negative effects on the fertility of cows; use of prostaglandin F2alpha alone as a single treatment has not been shown to consistently improve fertility. Combined administration of PGF2alpha and PGE2 increased uterine pressure in healthy cows. We hypothesized that the combination of both prostaglandins would accelerate uterine involution and therefore have a positive effect on fertility variables. In commercial dairy farming, the benefit of a single post-partum combined prostaglandin treatment should be demonstrated. Methods 383 cows from commercial dairy farms were included in this study. Uterine size and secretion were evaluated at treatment 21–35 days post partum and 14 days later. Cows were randomly allocated to one of three treatment groups: PGF2alpha and PGE2, PGF2alpha, or placebo. For every animal participating in the study, the following reproduction variables were recorded: interval from calving to first insemination, days open, number of artificial inseminations (AI) to conception, subsequent treatment of the uterus, and subsequent treatment of the ovaries. The plasma progesterone level at the time of treatment was used as a covariable. For continuous measurements, analysis of variance was performed. Fisher's exact test for categorical non-ordered data and the exact Kruskal-Wallis test for ordered data were used; pairwise group comparisons with Bonferroni adjustment of the significance level were performed. Results There was no significant difference among treatment groups in uterine size. Furthermore, there was no significant difference among treatments concerning days open, number of AI, and subsequent treatment of the uterus and ovaries. Days from calving to first insemination tended to be shorter for cows with a low progesterone level given PGF2alpha and PGE2 in combination than for the placebo group (P = 0.024). Conclusion The results of this study indicate that the administration of PGF2alpha or a combination of PGF2alpha and PGE2 21 to

  18. FOG Random Drift Signal Denoising Based on the Improved AR Model and Modified Sage-Husa Adaptive Kalman Filter.

    PubMed

    Sun, Jin; Xu, Xiaosu; Liu, Yiting; Zhang, Tao; Li, Yao

    2016-07-12

    In order to reduce the influence of fiber optic gyroscope (FOG) random drift error on inertial navigation systems, an improved auto-regressive (AR) model is put forward in this paper. First, based on real-time observations at each restart of the gyroscope, the model of FOG random drift can be established online. In the improved AR model, the measured FOG signal is employed instead of a zero-mean signal. Then, the modified Sage-Husa adaptive Kalman filter (SHAKF) is introduced, which can directly carry out real-time filtering of the FOG signals. Finally, static and dynamic experiments are performed to verify the effectiveness of the approach. The filtering results are analyzed with the Allan variance. The analysis shows that the improved AR model has high fitting accuracy and strong adaptability, with a minimum single-noise fitting accuracy of 93.2%. Based on the improved AR(3) model, the SHAKF denoising method is more effective than traditional methods, improving on them by more than 30%. The random drift error of the FOG is reduced effectively, and the precision of the FOG is improved.
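
    The two ingredients, an AR model fitted to the raw (not zero-meaned) signal and an adaptive Kalman filter, can be sketched as follows. The intercept term lets the AR(3) fit work on the measured FOG signal directly, and the filter uses a simple fading-memory update of the measurement-noise estimate in the spirit of (but far simpler than) the modified Sage-Husa scheme; all tuning values are illustrative.

        import numpy as np

        def fit_ar3_with_intercept(y):
            """Least-squares AR(3) fit with an intercept, so the raw signal can
            be modeled without removing its mean first."""
            Y = y[3:]
            X = np.column_stack([np.ones(len(Y)), y[2:-1], y[1:-2], y[:-3]])
            coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
            return coef, np.var(Y - X @ coef)   # coefficients, residual variance

        def ar3_kalman_denoise(y, coef, q, r0=1.0, b=0.98):
            """Kalman filter on the AR(3) state with a fading-memory estimate of
            the measurement-noise variance (a simplified adaptive update)."""
            c0, a1, a2, a3 = coef
            F = np.array([[a1, a2, a3], [1, 0, 0], [0, 1, 0]])
            u = np.array([c0, 0, 0])
            H = np.array([1.0, 0, 0])
            Q = np.diag([q, 0, 0])
            x = np.array([y[0]]*3, float)
            P = np.eye(3)
            r = r0
            out = np.empty(len(y))
            for k, yk in enumerate(y):
                x = F @ x + u                      # predict
                P = F @ P @ F.T + Q
                innov = yk - H @ x                 # innovation
                s = H @ P @ H.T + r
                K = P @ H / s                      # gain
                x = x + K * innov                  # update
                P = (np.eye(3) - np.outer(K, H)) @ P
                r = b*r + (1 - b)*max(innov**2 - H @ P @ H.T, 1e-12)  # adapt R
                out[k] = x[0]
            return out

        # usage: coef, q = fit_ar3_with_intercept(fog); smooth = ar3_kalman_denoise(fog, coef, q)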

  19. How random is a random vector?

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo

    2015-12-01

    Over 80 years ago Samuel Wilks proposed that the "generalized variance" of a random vector is the determinant of its covariance matrix. To date, the notion and use of the generalized variance is confined only to very specific niches in statistics. In this paper we establish that the "Wilks standard deviation" (the square root of the generalized variance) is indeed the standard deviation of a random vector. We further establish that the "uncorrelation index" (a derivative of the Wilks standard deviation) is a measure of the overall correlation between the components of a random vector. Both the Wilks standard deviation and the uncorrelation index are, respectively, special cases of two general notions that we introduce: "randomness measures" and "independence indices" of random vectors. In turn, these general notions give rise to "randomness diagrams": tangible planar visualizations that answer the question: how random is a random vector? The notion of "independence indices" yields a novel measure of correlation for Lévy laws. In general, the concepts and results presented in this paper are applicable to any field of science and engineering with random-vector empirical data.
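
    In sample terms the generalized variance is a one-liner, as sketched below. The Wilks standard deviation is the square root of the determinant of the sample covariance matrix; the uncorrelation index is taken here as its ratio to the product of the marginal standard deviations (equivalently, the square root of the determinant of the correlation matrix). That normalisation is one plausible reading of the abstract's description, not necessarily the paper's exact definition.

        import numpy as np

        def wilks_standard_deviation(X):
            """Square root of the generalized variance, i.e. of det(cov), from a
            sample X with one observation per row."""
            return np.sqrt(np.linalg.det(np.cov(X, rowvar=False)))

        def uncorrelation_index(X):
            """Wilks standard deviation divided by the product of the marginal
            standard deviations: 1 for uncorrelated components, near 0 for
            strongly correlated ones (an assumed normalisation)."""
            sd = np.sqrt(np.diag(np.cov(X, rowvar=False)))
            return wilks_standard_deviation(X) / np.prod(sd)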

  20. Hilbert-Huang spectral analysis for characterizing the intrinsic time-scales of variability in decennial time-series of surface solar radiation

    NASA Astrophysics Data System (ADS)

    Bengulescu, Marc; Blanc, Philippe; Wald, Lucien

    2016-04-01

    An analysis of the variability of the surface solar irradiance (SSI) at different local time-scales is presented in this study. Since geophysical signals, such as long-term measurements of the SSI, are often produced by the non-linear interaction of deterministic physical processes that may also be under the influence of non-stationary external forcings, the Hilbert-Huang transform (HHT), an adaptive, noise-assisted, data-driven technique, is employed to extract locally - in time and in space - the embedded intrinsic scales at which a signal oscillates. The transform consists of two distinct steps. First, by means of the Empirical Mode Decomposition (EMD), the time-series is "de-constructed" into a finite number - often small - of zero-mean components that have distinct temporal scales of variability, termed hereinafter the Intrinsic Mode Functions (IMFs). The signal model of the components is an amplitude modulation - frequency modulation (AM-FM) one, and can also be thought of as an extension of a Fourier series having both time-varying amplitude and frequency. Following the decomposition, Hilbert spectral analysis is then employed on the IMFs, yielding a time-frequency-energy representation that portrays changes in the spectral contents of the original data with respect to time. As measurements of surface solar irradiance may be contaminated by the manifestation of different types of stochastic processes (i.e. noise), the identification of real, physical processes against this background of random fluctuations is of interest. To this end, an adaptive background noise null hypothesis is assumed, based on the robust statistical properties of the EMD when applied to time-series of different classes of noise (e.g. white, red or fractional Gaussian). Since the algorithm acts as an efficient constant-Q dyadic, "wavelet-like", filter bank, the different noise inputs are decomposed into components having the same spectral shape, but that are translated to the
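
The Hilbert spectral analysis step can be sketched with scipy: given one zero-mean AM-FM component (here a synthetic stand-in for an IMF, not real SSI data), the analytic signal yields instantaneous amplitude and frequency:

```python
import numpy as np
from scipy.signal import hilbert

fs = 365.0                                # samples per "year" (daily data), illustrative
t = np.arange(0, 10, 1 / fs)
# Synthetic zero-mean AM-FM component, standing in for one IMF from the EMD.
imf = (1 + 0.3 * np.sin(2 * np.pi * 0.5 * t)) * \
      np.sin(2 * np.pi * (1.0 * t + 0.2 * np.sin(2 * np.pi * 0.1 * t)))

analytic = hilbert(imf)                   # analytic signal via the Hilbert transform
amplitude = np.abs(analytic)              # instantaneous amplitude envelope
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) * fs / (2 * np.pi)   # instantaneous frequency, cycles/year
print(amplitude.mean(), inst_freq.mean())       # mean frequency should hover around 1
```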

  1. Variable rate irrigation (VRI)

    Variable rate irrigation (VRI) technology is now offered by all major manufacturers of moving irrigation systems, mostly on center pivot irrigation systems. Variable irrigation depths may be controlled by sector only, in which case only the speed of the irrigation lateral is regulated. Or, variable ...

  2. Genetic variability in krill.

    PubMed

    Valentine, J W; Ayala, F J

    1976-02-01

    We have estimated genetic variability by gel electrophoresis in three species of krill, genus Euphausia (Arthropoda: Crustacea). Genetic variability is low where trophic resources are most seasonal, and high where trophic resources are most stable. Similar trends have been found in benthic marine invertebrates. The observed trends of genetic variability do not correlate with trends in the stability of physical environment parameters.

  3. Quantifying variability in delta experiments

    NASA Astrophysics Data System (ADS)

    Miller, K. L.; Berg, S. R.; McElroy, B. J.

    2017-12-01

    Large populations of people and wildlife make their homes on river deltas; therefore, it is important to be able to make useful and accurate predictions of how these landforms will change over time. However, making predictions can be a challenge due to the inherent variability of the natural system. Furthermore, when we extrapolate results from the laboratory to the field setting, we bring with us the random and systematic errors of the experiment. We seek to understand both the intrinsic and experimental variability of river delta systems to help better inform predictions of how these landforms will evolve. We run exact replicates of experiments with steady sediment and water discharge and record delta evolution with overhead time-lapse imaging. We measure aspects of topset progradation and channel dynamics and compare these metrics of delta morphology across the 6 replicated experimental runs. We also use data from all experimental runs collectively to build a large dataset from which to extract statistics of the system properties. We find that although natural variability exists, the processes in the experiments have outcomes that, after some time, no longer depend on their initial conditions. Applying these results to the field scale will aid our ability to make forecasts of how these landforms will progress.

  4. Impact of Flavonols on Cardiometabolic Biomarkers: A Meta-Analysis of Randomized Controlled Human Trials to Explore the Role of Inter-Individual Variability.

    PubMed

    Menezes, Regina; Rodriguez-Mateos, Ana; Kaltsatou, Antonia; González-Sarrías, Antonio; Greyling, Arno; Giannaki, Christoforos; Andres-Lacueva, Cristina; Milenkovic, Dragan; Gibney, Eileen R; Dumont, Julie; Schär, Manuel; Garcia-Aloy, Mar; Palma-Duran, Susana Alejandra; Ruskovska, Tatjana; Maksimova, Viktorija; Combet, Emilie; Pinto, Paula

    2017-02-09

    Several epidemiological studies have linked flavonols with decreased risk of cardiovascular disease (CVD). However, some heterogeneity in the individual physiological responses to the consumption of these compounds has been identified. This meta-analysis aimed to study the effect of flavonol supplementation on biomarkers of CVD risk such as blood lipids, blood pressure and plasma glucose, as well as factors affecting their inter-individual variability. Data from 18 human randomized controlled trials were pooled and the effect was estimated using fixed or random effects meta-analysis models and reported as difference in means (DM). Variability in the response of blood lipids to supplementation with flavonols was assessed by stratifying various population subgroups: age, sex, country, and health status. Results showed significant reductions in total cholesterol (DM = -0.10 mmol/L; 95% CI: -0.20, -0.01), LDL cholesterol (DM = -0.14 mmol/L; 95% CI: -0.21, -0.07), and triacylglycerol (DM = -0.10 mmol/L; 95% CI: -0.18, -0.03), and a significant increase in HDL cholesterol (DM = 0.05 mmol/L; 95% CI: 0.02, 0.07). A significant reduction was also observed in fasting plasma glucose (DM = -0.18 mmol/L; 95% CI: -0.29, -0.08), and in blood pressure (SBP: DM = -4.84 mmHg; 95% CI: -5.64, -4.04; DBP: DM = -3.32 mmHg; 95% CI: -4.09, -2.55). Subgroup analysis showed a more pronounced effect of flavonol intake in participants from Asian countries and in participants with diagnosed disease or dyslipidemia, compared to those with healthy and normal baseline values. In conclusion, flavonol consumption improved biomarkers of CVD risk; however, country of origin and health status may influence

  5. Properties of networks with partially structured and partially random connectivity

    NASA Astrophysics Data System (ADS)

    Ahmadian, Yashar; Fumarola, Francesco; Miller, Kenneth D.

    2015-01-01

    Networks studied in many disciplines, including neuroscience and mathematical biology, have connectivity that may be stochastic about some underlying mean connectivity represented by a non-normal matrix. Furthermore, the stochasticity may not be independent and identically distributed (iid) across elements of the connectivity matrix. More generally, the problem of understanding the behavior of stochastic matrices with nontrivial mean structure and correlations arises in many settings. We address this by characterizing large random N × N matrices of the form A = M + LJR, where M, L, and R are arbitrary deterministic matrices and J is a random matrix of zero-mean iid elements. M can be non-normal, and L and R allow correlations that have separable dependence on row and column indices. We first provide a general formula for the eigenvalue density of A. For A non-normal, the eigenvalues do not suffice to specify the dynamics induced by A, so we also provide general formulas for the transient evolution of the magnitude of activity and frequency power spectrum in an N-dimensional linear dynamical system with a coupling matrix given by A. These quantities can also be thought of as characterizing the stability and the magnitude of the linear response of a nonlinear network to small perturbations about a fixed point. We derive these formulas and work them out analytically for some examples of M, L, and R motivated by neurobiological models. We also argue that the persistence as N → ∞ of a finite number of randomly distributed outlying eigenvalues outside the support of the eigenvalue density of A, as previously observed, arises in regions of the complex plane Ω where there are nonzero singular values of L^{-1}(z1 - M)R^{-1} (for z ∈ Ω) that vanish as N → ∞. When such singular values do not exist and L and R are equal to the identity, there is a correspondence in the normalized Frobenius norm (but not in the operator norm) between the support of the spectrum
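
A quick numerical illustration of the setup: build A = M + LJR for particular deterministic M, L, R and inspect the spectrum. The feedforward shift for M and the sinusoidal gain profile for L below are arbitrary choices, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 500
M = np.diag(np.ones(N - 1), k=1)          # deterministic non-normal mean: a feedforward shift
L = np.diag(1.0 + 0.5 * np.sin(2 * np.pi * np.arange(N) / N))   # separable row profile
R = np.eye(N)
J = rng.standard_normal((N, N)) / np.sqrt(N)   # iid zero-mean disorder, spectrum O(1)

A = M + L @ J @ R
eigs = np.linalg.eigvals(A)
print("max |eigenvalue|:", np.abs(eigs).max())
```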

  6. Random walks on combs

    NASA Astrophysics Data System (ADS)

    Durhuus, Bergfinnur; Jonsson, Thordur; Wheater, John F.

    2006-02-01

    We develop techniques to obtain rigorous bounds on the behaviour of random walks on combs. Using these bounds, we calculate exactly the spectral dimension of random combs with infinite teeth at random positions or teeth with random but finite length. We also calculate exactly the spectral dimension of some fixed non-translationally invariant combs. We relate the spectral dimension to the critical exponent of the mass of the two-point function for random walks on random combs, and compute mean displacements as a function of walk duration. We prove that the mean first passage time is generally infinite for combs with anomalous spectral dimension.
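
A minimal simulation of a walk on a comb with an infinite tooth at every spine site is sketched below; the comb representation and the displacement scaling quoted in the comment are assumptions of the sketch, not results computed in the paper:

```python
import random

def comb_walk(steps, rng):
    """Random walk on a comb: spine along x, an infinite tooth in +y at
    every spine site. Each step moves to a uniformly chosen neighbor."""
    x, y = 0, 0
    for _ in range(steps):
        if y == 0:
            x, y = rng.choice([(x - 1, 0), (x + 1, 0), (x, 1)])  # spine: left/right/up
        else:
            y += rng.choice((-1, 1))                             # tooth: up/down
    return x

rng = random.Random(7)
for t in (1_000, 4_000, 16_000):
    mean_disp = sum(abs(comb_walk(t, rng)) for _ in range(200)) / 200
    print(t, round(mean_disp, 1))  # spine displacement grows roughly like t**0.25
```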

  7. Variable mechanical ventilation

    PubMed Central

    Fontela, Paula Caitano; Prestes, Renata Bernardy; Forgiarini Jr., Luiz Alberto; Friedman, Gilberto

    2017-01-01

    Objective To review the literature on the use of variable mechanical ventilation and the main outcomes of this technique. Methods Search, selection, and analysis of all original articles on variable ventilation, without restriction on the period of publication and language, available in the electronic databases LILACS, MEDLINE®, and PubMed, by searching the terms "variable ventilation" OR "noisy ventilation" OR "biologically variable ventilation". Results A total of 36 studies were selected. Of these, 24 were original studies, including 21 experimental studies and three clinical studies. Conclusion Several experimental studies reported the beneficial effects of distinct variable ventilation strategies on lung function using different models of lung injury and healthy lungs. Variable ventilation seems to be a viable strategy for improving gas exchange and respiratory mechanics and preventing lung injury associated with mechanical ventilation. However, further clinical studies are necessary to assess the potential of variable ventilation strategies for the clinical improvement of patients undergoing mechanical ventilation. PMID:28444076

  8. Biologically-variable rhythmic auditory cues are superior to isochronous cues in fostering natural gait variability in Parkinson's disease.

    PubMed

    Dotov, D G; Bayard, S; Cochen de Cock, V; Geny, C; Driss, V; Garrigue, G; Bardy, B; Dalla Bella, S

    2017-01-01

    Rhythmic auditory cueing improves certain gait symptoms of Parkinson's disease (PD). Cues are typically stimuli or beats with a fixed inter-beat interval. We show that isochronous cueing has an unwanted side-effect in that it exacerbates one of the motor symptoms characteristic of advanced PD. Whereas the parameters of the stride cycle of healthy walkers and early patients possess a persistent correlation in time, or long-range correlation (LRC), isochronous cueing renders stride-to-stride variability random. Random stride cycle variability is also associated with reduced gait stability and lack of flexibility. To investigate how to prevent patients from acquiring a random stride cycle pattern, we tested rhythmic cueing which mimics the properties of variability found in healthy gait (biological variability). PD patients (n=19) and age-matched healthy participants (n=19) walked with three rhythmic cueing stimuli: isochronous, with random variability, and with biological variability (LRC). Synchronization was not instructed. The persistent correlation in gait was preserved only with stimuli with biological variability, equally for patients and controls (p's<0.05). In contrast, cueing with isochronous or randomly varying inter-stimulus/beat intervals removed the LRC in the stride cycle. Notably, the individual's tendency to synchronize steps with beats determined the amount of negative effects of isochronous and random cues (p's<0.05) but not the positive effect of biological variability. Stimulus variability and patients' propensity to synchronize play a critical role in fostering healthier gait dynamics during cueing. The beneficial effects of biological variability provide useful guidelines for improving existing cueing treatments. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. The effect of modeled absolute timing variability and relative timing variability on observational learning.

    PubMed

    Grierson, Lawrence E M; Roberts, James W; Welsher, Arthur M

    2017-05-01

    There is much evidence to suggest that skill learning is enhanced by skill observation. Recent research on this phenomenon indicates a benefit of observing variable/erred demonstrations. In this study, we explore whether it is variability within the relative organization or absolute parameterization of a movement that facilitates skill learning through observation. To do so, participants were randomly allocated into groups that observed a model with no variability, absolute timing variability, relative timing variability, or variability in both absolute and relative timing. All participants performed a four-segment movement pattern with specific absolute and relative timing goals prior to and following the observational intervention, as well as in a 24h retention test and transfer tests that featured new relative and absolute timing goals. Absolute timing error indicated that all groups initially acquired the absolute timing, maintained their performance at 24h retention, and exhibited performance deterioration in both transfer tests. Relative timing error revealed that the observation of no variability and relative timing variability produced greater performance at the post-test, 24h retention and relative timing transfer tests, although performance in the no-variability group deteriorated in the absolute timing transfer test. The results suggest that the learning of absolute timing following observation unfolds irrespective of model variability. However, the learning of relative timing benefits from holding the absolute features constant, while the observation of no variability partially fails in transfer. We suggest learning by observing no variability and variable/erred models unfolds via similar neural mechanisms, although the latter benefits from the additional coding of information pertaining to movements that require a correction. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Quantifying and mapping spatial variability in simulated forest plots

    Gavin R. Corral; Harold E. Burkhart

    2016-01-01

    We used computer simulations to test the efficacy of multivariate statistical methods to detect, quantify, and map spatial variability of forest stands. Simulated stands were developed of regularly-spaced plantations of loblolly pine (Pinus taeda L.). We assumed no effects of competition or mortality, but random variability was added to individual tree characteristics...

  11. Measurement variability error for estimates of volume change

    James A. Westfall; Paul L. Patterson

    2007-01-01

    Using quality assurance data, measurement variability distributions were developed for attributes that affect tree volume prediction. Random deviations from the measurement variability distributions were applied to 19381 remeasured sample trees in Maine. The additional error due to measurement variation and measurement bias was estimated via a simulation study for...

  12. Cataclysmic Variable Stars

    NASA Astrophysics Data System (ADS)

    Hellier, Coel

    2001-01-01

    Cataclysmic variable stars are the most variable stars in the night sky, fluctuating in brightness continually on timescales from seconds to hours to weeks to years. The changes can be recorded using amateur telescopes, yet are also the subject of intensive study by professional astronomers. That study has led to an understanding of cataclysmic variables as binary stars, orbiting so closely that material transfers from one star to the other. The resulting process of accretion is one of the most important in astrophysics. This book presents the first account of cataclysmic variables at an introductory level. Assuming no previous knowledge of the field, it explains the basic principles underlying the variability, while providing an extensive compilation of cataclysmic variable light curves. Aimed at amateur astronomers, undergraduates, and researchers, the main text is accessible to those with no mathematical background, while supplementary boxes present technical details and equations.

  13. Ratio index variables or ANCOVA? Fisher's cats revisited.

    PubMed

    Tu, Yu-Kang; Law, Graham R; Ellison, George T H; Gilthorpe, Mark S

    2010-01-01

    Over 60 years ago Ronald Fisher demonstrated a number of potential pitfalls with statistical analyses using ratio variables. Nonetheless, these pitfalls are largely overlooked in contemporary clinical and epidemiological research, which routinely uses ratio variables in statistical analyses. This article aims to demonstrate how very different findings can be generated as a result of less than perfect correlations among the data used to generate ratio variables. These imperfect correlations result from measurement error and random biological variation. While the former can often be reduced by improvements in measurement, random biological variation is difficult to estimate and eliminate in observational studies. Moreover, wherever the underlying biological relationships among epidemiological variables are unclear, and hence the choice of statistical model is also unclear, the different findings generated by different analytical strategies can lead to contradictory conclusions. Caution is therefore required when interpreting analyses of ratio variables whenever the underlying biological relationships among the variables involved are unspecified or unclear. (c) 2009 John Wiley & Sons, Ltd.

  14. Genetic variability in krill.

    PubMed Central

    Valentine, J W; Ayala, F J

    1976-01-01

    We have estimated genetic variability by gel electrophoresis in three species of krill, genus Euphausia (Arthropoda: Crustacea). Genetic variability is low where trophic resources are most seasonal, and high where trophic resources are most stable. Similar trends have been found in benthic marine invertebrates. The observed trends of genetic variability do not correlate with trends in the stability of physical environment parameters. PMID:1061166

  15. On the Wigner law in dilute random matrices

    NASA Astrophysics Data System (ADS)

    Khorunzhy, A.; Rodgers, G. J.

    1998-12-01

    We consider ensembles of N × N symmetric matrices whose entries are weakly dependent random variables. We show that random dilution can change the limiting eigenvalue distribution of such matrices. We prove that under general and natural conditions the normalised eigenvalue counting function coincides with the semicircle (Wigner) distribution in the limit N → ∞. This can be explained by the observation that dilution (or more generally, random modulation) eliminates the weak dependence (or correlations) between random matrix entries. It also supports our earlier conjecture that the Wigner distribution is stable to random dilution and modulation.
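
The stability of the semicircle law under dilution is easy to probe numerically. The sketch below dilutes an iid symmetric Gaussian matrix (independent entries, a simpler case than the weakly dependent ensembles treated in the paper) and checks the spectral edge:

```python
import numpy as np

rng = np.random.default_rng(3)
N, p = 1000, 0.05                         # matrix size and dilution (keep ~5% of entries)
X = rng.standard_normal((N, N))
mask = rng.random((N, N)) < p
upper = np.triu(X * mask, k=1)
A = (upper + upper.T) / np.sqrt(N * p)    # symmetric, rescaled so the spectrum is O(1)

eigs = np.linalg.eigvalsh(A)
# Under the semicircle law the spectrum fills [-2, 2]; the edge should sit near 2.
print("spectral edge:", np.abs(eigs).max())
```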

  16. VARIABLE TIME DELAY MEANS

    DOEpatents

    Clemensen, R.E.

    1959-11-01

    An electrically variable time delay line is described which may be readily controlled simultaneously with variable impedance matching means coupled thereto such that reflections are prevented. Broadly, the delay line includes a signal winding about a magnetic core whose permeability is electrically variable. Inasmuch as the inductance of the line varies directly with the permeability, the time delay and characteristic impedance of the line both vary as the square root of the permeability. Consequently, impedance matching means may be varied similarly and simultaneously with the electrically variable permeability to match the line impedance over the entire range of time delay whereby reflections are prevented.

  17. Robust Optimum Invariant Tests for Random MANOVA Models.

    DTIC Science & Technology

    1986-10-01

    are assumed to be independent normal with zero mean and dispersions σ² and σ₁², respectively. Roy and Gnanadesikan (1959) considered the problem of... Part II: The multivariate case. Ann. Math. Statist. 31, 939-968. [7] Roy, S.N. and Gnanadesikan, R. (1959). Some contributions to ANOVA in one or more

  18. Quantum random number generation

    DOE PAGES

    Ma, Xiongfeng; Yuan, Xiao; Cao, Zhu; ...

    2016-06-28

    Quantum physics can be exploited to generate true random numbers, which play important roles in many applications, especially in cryptography. Genuine randomness from the measurement of a quantum system reveals the inherent nature of quantumness -- coherence, an important feature that differentiates quantum mechanics from classical physics. The generation of genuine randomness is generally considered impossible with only classical means. Based on the degree of trustworthiness on devices, quantum random number generators (QRNGs) can be grouped into three categories. The first category, practical QRNG, is built on fully trusted and calibrated devices and typically can generate randomness at a high speed by properly modeling the devices. The second category is self-testing QRNG, where verifiable randomness can be generated without trusting the actual implementation. The third category, semi-self-testing QRNG, is an intermediate category which provides a tradeoff between the trustworthiness on the device and the random number generation speed.

  19. Drop Spreading with Random Viscosity

    NASA Astrophysics Data System (ADS)

    Xu, Feng; Jensen, Oliver

    2016-11-01

    Airway mucus acts as a barrier to protect the lung. However, as a biological material, its physical properties are known imperfectly and can be spatially heterogeneous. In this study we assess the impact of these uncertainties on the rate of spreading of a drop (representing an inhaled aerosol) over a mucus film. We model the film as Newtonian, having a viscosity that depends linearly on the concentration of a passive solute (a crude proxy for mucin proteins). Given an initial random solute (and hence viscosity) distribution, described as a Gaussian random field with a given correlation structure, we seek to quantify the uncertainties in outcomes as the drop spreads. Using lubrication theory, we describe the spreading of the drop in terms of a system of coupled nonlinear PDEs governing the evolution of film height and the vertically-averaged solute concentration. We perform Monte Carlo simulations to predict the variability in the drop centre location and width (1D) or area (2D). We show how simulation results are well described (at much lower computational cost) by a low-order model using a weak disorder expansion. Our results show, for example, how variability in the drop location is a non-monotonic function of the solute correlation length. Engineering and Physical Sciences Research Council.

  20. Variable volume combustor

    DOEpatents

    Ostebee, Heath Michael; Ziminsky, Willy Steve; Johnson, Thomas Edward; Keener, Christopher Paul

    2017-01-17

    The present application provides a variable volume combustor for use with a gas turbine engine. The variable volume combustor may include a liner, a number of micro-mixer fuel nozzles positioned within the liner, and a linear actuator so as to maneuver the micro-mixer fuel nozzles axially along the liner.

  1. A variety of variables.

    PubMed

    Jupiter, Daniel C

    2014-01-01

    In designing studies and developing plans for analyses, we must consider which tests are appropriate for the types of variables we are using. Here I describe the types of variables available to us, and I briefly consider the appropriate tools to use in their analysis. Copyright © 2014 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.

  2. Basic properties and variability

    NASA Technical Reports Server (NTRS)

    Querci, Francois R.

    1987-01-01

    Giant and supergiant M, S, and C stars are discussed in this survey of research. Basic properties as determined by spectra, chemical composition, photometry, or variability type are discussed. Space motions and space distributions of cool giants are described. Distribution of these stars in our galaxy and those nearby is discussed. Mira variables in particular are surveyed with emphasis on the following topics: (1) phase lag phenomenon; (2) Mira light curves; (3) variations in color indices; (4) determination of multiple periods; (5) correlations between quantities such as period length, light-curve shape, infrared (IR) excess, and visible and IR color diagram; (6) semiregular (SR) variables and different time scales in SR light variations; (7) irregular variable Lb and Lc stars; (8) different time-scale light variations; (9) hydrogen-deficient carbon (HdC) stars, in particular RCB stars; and (10) irreversible changes and rapid evolution in red variable stars.

  3. Complete convergence of randomly weighted END sequences and its application.

    PubMed

    Li, Penghua; Li, Xiaoqin; Wu, Kehan

    2017-01-01

    We investigate the complete convergence of partial sums of randomly weighted extended negatively dependent (END) random variables. Some results of complete moment convergence, complete convergence and the strong law of large numbers for this dependent structure are obtained. As an application, we study the convergence of the state observers of linear-time-invariant systems. Our results extend the corresponding earlier ones.

  4. Simulation of the Effects of Random Measurement Errors

    ERIC Educational Resources Information Center

    Kinsella, I. A.; Hannaidh, P. B. O.

    1978-01-01

    Describes a simulation method for measurement errors that requires only calculators and tables of random digits. Each student simulates the random behaviour of the component variables in the function; by combining the results of all students, the outline of the sampling distribution of the function can be obtained. (GA)

  5. Are randomly grown graphs really random?

    PubMed

    Callaway, D S; Hopcroft, J E; Kleinberg, J M; Newman, M E; Strogatz, S H

    2001-10-01

    We analyze a minimal model of a growing network. At each time step, a new vertex is added; then, with probability delta, two vertices are chosen uniformly at random and joined by an undirected edge. This process is repeated for t time steps. In the limit of large t, the resulting graph displays surprisingly rich characteristics. In particular, a giant component emerges in an infinite-order phase transition at delta=1/8. At the transition, the average component size jumps discontinuously but remains finite. In contrast, a static random graph with the same degree distribution exhibits a second-order phase transition at delta=1/4, and the average component size diverges there. These dramatic differences between grown and static random graphs stem from a positive correlation between the degrees of connected vertices in the grown graph - older vertices tend to have higher degree, and to link with other high-degree vertices, merely by virtue of their age. We conclude that grown graphs, however randomly they are constructed, are fundamentally different from their static random graph counterparts.
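
The growth process is simple to simulate with a union-find structure; the sketch below (with illustrative parameter choices) estimates the giant-component fraction around the claimed transition at delta = 1/8:

```python
import random

class DSU:
    """Union-find, used to track connected components as the graph grows."""
    def __init__(self):
        self.parent = {}

    def find(self, a):
        while self.parent.setdefault(a, a) != a:
            self.parent[a] = self.parent[self.parent[a]]   # path halving
            a = self.parent[a]
        return a

    def union(self, a, b):
        self.parent[self.find(a)] = self.find(b)

def giant_fraction(t, delta, rng):
    dsu = DSU()
    for step in range(t):
        dsu.find(step)                         # add a new vertex
        if rng.random() < delta:               # join two uniformly random vertices
            dsu.union(rng.randrange(step + 1), rng.randrange(step + 1))
    sizes = {}
    for v in range(t):
        root = dsu.find(v)
        sizes[root] = sizes.get(root, 0) + 1
    return max(sizes.values()) / t

rng = random.Random(0)
for delta in (0.10, 0.125, 0.30):   # the infinite-order transition is claimed at 1/8
    print(delta, giant_fraction(100_000, delta, rng))
```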

  6. Selection of Variables in Cluster Analysis: An Empirical Comparison of Eight Procedures

    ERIC Educational Resources Information Center

    Steinley, Douglas; Brusco, Michael J.

    2008-01-01

    Eight different variable selection techniques for model-based and non-model-based clustering are evaluated across a wide range of cluster structures. It is shown that several methods have difficulties when non-informative variables (i.e., random noise) are included in the model. Furthermore, the distribution of the random noise greatly impacts the…

  7. Noisy oscillator: Random mass and random damping.

    PubMed

    Burov, Stanislav; Gitterman, Moshe

    2016-11-01

    The problem of a linear damped noisy oscillator is treated in the presence of two multiplicative sources of noise which imply a random mass and random damping. The additive noise and the noise in the damping are responsible for an influx of energy to the oscillator and its dissipation to the surrounding environment. A random mass implies that the surrounding molecules not only collide with the oscillator but may also adhere to it, thereby changing its mass. We present general formulas for the first two moments and address the question of mean and energetic stabilities. The phenomenon of stochastic resonance, i.e., the noise-induced enhancement of a system's response to an external periodic signal, is considered for separate and joint action of the two sources of noise and their characteristics.

  8. Rapidly variable relativistic absorption

    NASA Astrophysics Data System (ADS)

    Parker, M.; Pinto, C.; Fabian, A.; Lohfink, A.; Buisson, D.; Alston, W.; Jiang, J.

    2017-10-01

    I will present results from the 1.5Ms XMM-Newton observing campaign on the most X-ray variable AGN, IRAS 13224-3809. We find a series of nine absorption lines with a velocity of 0.24c from an ultra-fast outflow. For the first time, we are able to see extremely rapid variability of the UFO features, and can link this to the X-ray variability from the inner accretion disk. We find a clear flux dependence of the outflow features, suggesting that the wind is ionized by increasing X-ray emission.

  9. Magnetically Controlled Variable Transformer

    NASA Technical Reports Server (NTRS)

    Kleiner, Charles T.

    1994-01-01

    Improved variable-transformer circuit, output voltage and current of which are controlled by use of relatively small current supplied at relatively low power to control windings on its magnetic cores. Transformer circuits of this type are called "magnetic amplifiers" because the ratio between the controlled output power and the power driving the control current is large. This ratio - the power gain - can be as large as 100 in the present circuit. Variable-transformer circuit offers advantages of efficiency, safety, and controllability over some prior variable-transformer circuits.

  10. Random fiber laser.

    PubMed

    de Matos, Christiano J S; de S Menezes, Leonardo; Brito-Silva, Antônio M; Martinez Gámez, M A; Gomes, Anderson S L; de Araújo, Cid B

    2007-10-12

    We investigate the effects of two-dimensional confinement on the lasing properties of a classical random laser system operating in the incoherent feedback (diffusive) regime. A suspension of 250 nm rutile (TiO2) particles in a rhodamine 6G solution was inserted into the hollow core of a photonic crystal fiber generating the first random fiber laser and a novel quasi-one-dimensional random laser geometry. A comparison with similar systems in bulk format shows that the random fiber laser presents an efficiency that is at least 2 orders of magnitude higher.

  11. Partitioning neuronal variability

    PubMed Central

    Goris, Robbe L.T.; Movshon, J. Anthony; Simoncelli, Eero P.

    2014-01-01

    Responses of sensory neurons differ across repeated measurements. This variability is usually treated as stochasticity arising within neurons or neural circuits. However, some portion of the variability arises from fluctuations in excitability due to factors that are not purely sensory, such as arousal, attention, and adaptation. To isolate these fluctuations, we developed a model in which spikes are generated by a Poisson process whose rate is the product of a drive that is sensory in origin, and a gain summarizing stimulus-independent modulatory influences on excitability. This model provides an accurate account of response distributions of visual neurons in macaque LGN, V1, V2, and MT, revealing that variability originates in large part from excitability fluctuations which are correlated over time and between neurons, and which increase in strength along the visual pathway. The model provides a parsimonious explanation for observed systematic dependencies of response variability and covariability on firing rate. PMID:24777419
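
The modulated-Poisson idea can be illustrated in a few lines: spike counts are Poisson with rate drive × gain, where the gain fluctuates across trials. The gamma gain distribution and all parameters below are arbitrary stand-ins, not fits from the paper:

```python
import numpy as np

rng = np.random.default_rng(5)
n_trials = 10_000
drive = 10.0                                   # sensory drive: mean spikes per trial
# Stimulus-independent gain fluctuations across trials (gamma with mean 1).
gain = rng.gamma(shape=4.0, scale=0.25, size=n_trials)
counts = rng.poisson(drive * gain)

mean, var = counts.mean(), counts.var()
# A pure Poisson process gives a Fano factor (var/mean) of 1; multiplicative
# gain fluctuations push it above 1, increasingly so at higher rates.
print(f"mean {mean:.1f}, variance {var:.1f}, Fano factor {var / mean:.2f}")
```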

  12. Solar variability datalogger

    DOE PAGES

    Lave, Matthew; Stein, Joshua; Smith, Ryan

    2016-07-28

    To address the lack of knowledge of local solar variability, we have developed and deployed a low-cost solar variability datalogger (SVD). While most currently used solar irradiance sensors are expensive pyranometers with high accuracy (relevant for annual energy estimates), low-cost sensors display similar precision (relevant for solar variability) as high-cost pyranometers, even if they are not as accurate. In this work, we present an evaluation of various low-cost irradiance sensor types, describe the SVD, and present validation and comparison of the SVD collected data. In conclusion, the low cost and ease of use of the SVD will enable a greater understanding of local solar variability, which will reduce developer and utility uncertainty about the impact of solar photovoltaic (PV) installations and thus will encourage greater penetrations of solar energy.

  13. VARIABLE-THROW CAM

    DOEpatents

    Godsil, E.C.; Robinson, E.Y.

    1963-07-16

    A variable-throw cam comprising inner and outer eccentric sleeves which are adjustably locked together is described. The cam throw is varied by unlocking the inner and outer sleeves, rotating the outer sleeve relative to the inner one until the desired throw is obtained, and locking the sleeves together again. The cam is useful in applications wherein a continuously-variable throw is required, e.g., ram-and-die pressing operations, cyclic fatigue testing of materials, etc. (AEC)

  14. Deconstructed transverse mass variables

    DOE PAGES

    Ismail, Ahmed; Schwienhorst, Reinhard; Virzi, Joseph S.; ...

    2015-04-02

    Traditional searches for R-parity conserving natural supersymmetry (SUSY) require large transverse mass and missing energy cuts to separate the signal from large backgrounds. SUSY models with compressed spectra inherently produce signal events with small amounts of missing energy that are hard to explore. We use this difficulty to motivate the construction of "deconstructed" transverse mass variables which are designed to preserve information on both the norm and direction of the missing momentum. Here, we demonstrate the effectiveness of these variables in searches for the pair production of supersymmetric top-quark partners which subsequently decay into a final state with an isolated lepton, jets and missing energy. We show that the use of deconstructed transverse mass variables extends the accessible compressed spectra parameter space beyond the region probed by traditional methods. The parameter space can further be expanded to neutralino masses that are larger than the difference between the stop and top masses. In addition, we also discuss how these variables allow for novel searches of single stop production, in order to directly probe unconstrained stealth stops in the small stop- and neutralino-mass regime. We also demonstrate the utility of these variables for generic gluino and stop searches in all-hadronic final states. Overall, we demonstrate that deconstructed transverse variables are essential to any search wanting to maximize signal separation from the background when the signal has undetected particles in the final state.

  15. Exponential gain of randomness certified by quantum contextuality

    NASA Astrophysics Data System (ADS)

    Um, Mark; Zhang, Junhua; Wang, Ye; Wang, Pengfei; Kim, Kihwan

    2017-04-01

    We demonstrate a protocol for exponential gain of randomness certified by quantum contextuality in a trapped ion system. Genuine randomness can be produced by quantum principles and certified by quantum inequalities. Recently, randomness expansion protocols based on Bell-type inequalities and the Kochen-Specker (KS) theorem have been demonstrated. These schemes have been theoretically developed to exponentially expand randomness and to amplify randomness from a weak initial random seed. Here, we report experimental evidence of such exponential expansion of randomness. In the experiment, we use three states of a 138Ba+ ion: a ground state and two quadrupole states. In the 138Ba+ ion system there is no detection loophole, and we apply a method to rule out certain hidden variable models that obey a kind of extended noncontextuality.

  16. Programmable random interval generator

    NASA Technical Reports Server (NTRS)

    Lindsey, R. S., Jr.

    1973-01-01

    Random pulse generator can supply constant-amplitude randomly distributed pulses with average rate ranging from a few counts per second to more than one million counts per second. Generator requires no high-voltage power supply or any special thermal cooling apparatus. Device is uniquely versatile and provides wide dynamic range of operation.

  17. Quantum random number generator

    DOEpatents

    Pooser, Raphael C.

    2016-05-10

    A quantum random number generator (QRNG) and a photon generator for a QRNG are provided. The photon generator may be operated in a spontaneous mode below a lasing threshold to emit photons. Photons emitted from the photon generator may have at least one random characteristic, which may be monitored by the QRNG to generate a random number. In one embodiment, the photon generator may include a photon emitter and an amplifier coupled to the photon emitter. The amplifier may enable the photon generator to be used in the QRNG without introducing significant bias in the random number and may enable multiplexing of multiple random numbers. The amplifier may also desensitize the photon generator to fluctuations in power supplied thereto while operating in the spontaneous mode. In one embodiment, the photon emitter and amplifier may be a tapered diode amplifier.

  18. Random pulse generator

    NASA Technical Reports Server (NTRS)

    Lindsey, R. S., Jr. (Inventor)

    1975-01-01

    An exemplary embodiment of the present invention provides a source of random-width and randomly spaced rectangular voltage pulses whose mean or average frequency of operation is controllable within prescribed limits of about 10 hertz to 1 megahertz. A pair of thin-film metal resistors are used to provide a differential white noise voltage pulse source. Pulse shaping and amplification circuitry provide relatively short duration pulses of constant amplitude which are applied to anti-bounce logic circuitry to prevent ringing effects. The pulse outputs from the anti-bounce circuits are then used to control two one-shot multivibrators whose output comprises the random-length and randomly spaced rectangular pulses. Means are provided for monitoring, calibrating and evaluating the relative randomness of the generator.

  19. Autonomous Byte Stream Randomizer

    NASA Technical Reports Server (NTRS)

    Paloulian, George K.; Woo, Simon S.; Chow, Edward T.

    2013-01-01

    Net-centric networking environments are often faced with limited resources and must utilize bandwidth as efficiently as possible. In networking environments that span wide areas, the data transmission has to be efficient without any redundant or exuberant metadata. The Autonomous Byte Stream Randomizer software provides an extra level of security on top of existing data encryption methods. Randomizing the data's byte stream adds an extra layer to existing data protection methods, thus making it harder for an attacker to decrypt protected data. Based on a generated cryptographically secure random seed, a random sequence of numbers is used to intelligently and efficiently swap the organization of bytes in data using the unbiased and memory-efficient in-place Fisher-Yates shuffle method. Swapping bytes and reorganizing the crucial structure of the byte data renders the data file unreadable and leaves the data in a deconstructed state. This deconstruction adds an extra level of security requiring the byte stream to be reconstructed with the random seed in order to be readable. Once the data byte stream has been randomized, the software enables the data to be distributed to N nodes in an environment. Each piece of the data in randomized and distributed form is a separate entity unreadable in its own right, but when combined with all N pieces, is able to be reconstructed back to one. Reconstruction requires possession of the key used for randomizing the bytes, leading to the generation of the same cryptographically secure random sequence of numbers used to randomize the data. This software is a cornerstone capability possessing the ability to generate the same cryptographically secure sequence on different machines and time intervals, thus allowing this software to be used more heavily in net-centric environments where data transfer bandwidth is limited.
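
A toy version of the seeded, invertible Fisher-Yates byte randomization might look like the sketch below. Note that random.Random is used purely for illustration and is not cryptographically secure, unlike the seed source the description calls for:

```python
import random

def shuffle_bytes(data: bytes, seed: int) -> bytes:
    """In-place Fisher-Yates shuffle of a byte buffer, driven by a seeded PRNG."""
    buf = bytearray(data)
    rng = random.Random(seed)
    for i in range(len(buf) - 1, 0, -1):
        j = rng.randrange(i + 1)
        buf[i], buf[j] = buf[j], buf[i]
    return bytes(buf)

def unshuffle_bytes(data: bytes, seed: int) -> bytes:
    """Invert the shuffle by replaying the same swap sequence in reverse order."""
    buf = bytearray(data)
    rng = random.Random(seed)
    swaps = [(i, rng.randrange(i + 1)) for i in range(len(buf) - 1, 0, -1)]
    for i, j in reversed(swaps):
        buf[i], buf[j] = buf[j], buf[i]
    return bytes(buf)

msg = b"attack at dawn"
scrambled = shuffle_bytes(msg, seed=1234)
assert unshuffle_bytes(scrambled, seed=1234) == msg
```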

  20. Phenomenological picture of fluctuations in branching random walks

    NASA Astrophysics Data System (ADS)

    Mueller, A. H.; Munier, S.

    2014-10-01

    We propose a picture of the fluctuations in branching random walks, which leads to predictions for the distribution of a random variable that characterizes the position of the bulk of the particles. We also interpret the 1/√t correction to the average position of the rightmost particle of a branching random walk for large times t ≫ 1, computed by Ebert and Van Saarloos, as fluctuations on top of the mean-field approximation of this process with a Brunet-Derrida cutoff at the tip that simulates discreteness. Our analytical formulas successfully compare to numerical simulations of a particular model of a branching random walk.
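
A small simulation in the spirit of the Brunet-Derrida cutoff mentioned above: a binary branching random walk whose population is truncated to the K rightmost particles each generation. K and the Gaussian step distribution are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(8)
K = 2_000                     # population cap: a Brunet-Derrida-style cutoff at the tip
positions = np.zeros(1)
for generation in range(200):
    # Each particle branches in two; each child takes an independent Gaussian step.
    children = np.repeat(positions, 2) + rng.standard_normal(2 * positions.size)
    positions = np.sort(children)[-K:]    # keep only the K rightmost particles
rightmost = positions[-1]
print("rightmost after 200 generations:", rightmost, "speed:", rightmost / 200)
```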

  1. Variable stator radial turbine

    NASA Technical Reports Server (NTRS)

    Rogo, C.; Hajek, T.; Chen, A. G.

    1984-01-01

    A radial turbine stage with a variable area nozzle was investigated. A high work capacity turbine design with a known high performance base was modified to accept a fixed vane stagger angle moveable sidewall nozzle. The nozzle area was varied by moving the forward and rearward sidewalls. Diffusing and accelerating rotor inlet ramps were evaluated in combinations with hub and shroud rotor exit rings. Performance of contoured sidewalls and the location of the sidewall split line with respect to the rotor inlet was compared to the baseline. Performance and rotor exit survey data are presented for 31 different geometries. Detail survey data at the nozzle exit are given in contour plot format for five configurations. A data base is provided for a variable geometry concept that is a viable alternative to the more common pivoted vane variable geometry radial turbine.

  2. Variable Lifting Index (VLI)

    PubMed Central

    Waters, Thomas; Occhipinti, Enrico; Colombini, Daniela; Alvarez-Casado, Enrique; Fox, Robert

    2015-01-01

    Objective: We seek to develop a new approach for analyzing the physical demands of highly variable lifting tasks through an adaptation of the Revised NIOSH (National Institute for Occupational Safety and Health) Lifting Equation (RNLE) into a Variable Lifting Index (VLI). Background: There are many jobs that contain individual lifts that vary from lift to lift due to the task requirements. The NIOSH Lifting Equation is not suitable in its present form to analyze variable lifting tasks. Method: In extending the prior work on the VLI, two procedures are presented to allow users to analyze variable lifting tasks. One approach involves the sampling of lifting tasks performed by a worker over a shift and the calculation of the Frequency Independent Lift Index (FILI) for each sampled lift and the aggregation of the FILI values into six categories. The Composite Lift Index (CLI) equation is used with lifting index (LI) category frequency data to calculate the VLI. The second approach employs a detailed systematic collection of lifting task data from production and/or organizational sources. The data are organized into simplified task parameter categories and further aggregated into six FILI categories, which also use the CLI equation to calculate the VLI. Results: The two procedures will allow practitioners to systematically employ the VLI method to a variety of work situations where highly variable lifting tasks are performed. Conclusions: The scientific basis for the VLI procedure is similar to that for the CLI originally presented by NIOSH; however, the VLI method remains to be validated. Application: The VLI method allows an analyst to assess highly variable manual lifting jobs in which the task characteristics vary from lift to lift during a shift. PMID:26646300

  3. Stratospheric variability in summer

    NASA Technical Reports Server (NTRS)

    Rind, D.; Donn, W. L.; Robinson, W.

    1981-01-01

    Rocketsonde observations and infrasound results are used to investigate the variability of the summer stratopause region during one month in summer. Fluctuations of 2-3 days and about 16-day periods are evident, and they appear to be propagating vertically. In this month the 2-3 day oscillations have an amplitude envelope equal in period to the longer period oscillations, implying a connection between the two phenomena. Observations of the diurnal tide and shorter period variability during the month are also presented.

  4. Do bioclimate variables improve performance of climate envelope models?

    Watling, James I.; Romañach, Stephanie S.; Bucklin, David N.; Speroterra, Carolina; Brandt, Laura A.; Pearlstine, Leonard G.; Mazzotti, Frank J.

    2012-01-01

    Climate envelope models are widely used to forecast potential effects of climate change on species distributions. A key issue in climate envelope modeling is the selection of predictor variables that most directly influence species. To determine whether model performance and spatial predictions were related to the selection of predictor variables, we compared models using bioclimate variables with models constructed from monthly climate data for twelve terrestrial vertebrate species in the southeastern USA using two different algorithms (random forests or generalized linear models), and two model selection techniques (using uncorrelated predictors or a subset of user-defined biologically relevant predictor variables). There were no differences in performance between models created with bioclimate or monthly variables, but one metric of model performance was significantly greater using the random forest algorithm compared with generalized linear models. Spatial predictions between maps using bioclimate and monthly variables were very consistent using the random forest algorithm with uncorrelated predictors, whereas we observed greater variability in predictions using generalized linear models.

  5. Properties of behavior under different random ratio and random interval schedules: A parametric study.

    PubMed

    Dembo, M; De Penfold, J B; Ruiz, R; Casalta, H

    1985-03-01

    Four pigeons were trained to peck a key under different values of a temporally defined independent variable (T) and different probabilities of reinforcement (p). Parameter T is a fixed repeating time cycle and p the probability of reinforcement for the first response of each cycle T. Two dependent variables were used: mean response rate and mean postreinforcement pause. For all values of p a critical value of the independent variable T was found (T = 1 sec) at which marked changes took place in response rate and postreinforcement pauses. Behavior typical of random ratio schedules was obtained at T < 1 sec and behavior typical of random interval schedules at T > 1 sec. Copyright © 1985. Published by Elsevier B.V.

  6. Quantum random access memory.

    PubMed

    Giovannetti, Vittorio; Lloyd, Seth; Maccone, Lorenzo

    2008-04-25

    A random access memory (RAM) uses n bits to randomly address N = 2^n distinct memory cells. A quantum random access memory (QRAM) uses n qubits to address any quantum superposition of N memory cells. We present an architecture that exponentially reduces the requirements for a memory call: O(log N) switches need be thrown instead of the N used in conventional (classical or quantum) RAM designs. This yields a more robust QRAM algorithm, as it in general requires entanglement among exponentially fewer gates, and leads to an exponential decrease in the power needed for addressing. A quantum optical implementation is presented.

  7. Instrumental variables I: instrumental variables exploit natural variation in nonexperimental data to estimate causal relationships.

    PubMed

    Rassen, Jeremy A; Brookhart, M Alan; Glynn, Robert J; Mittleman, Murray A; Schneeweiss, Sebastian

    2009-12-01

    The gold standard of study design for treatment evaluation is widely acknowledged to be the randomized controlled trial (RCT). Trials allow for the estimation of causal effect by randomly assigning participants either to an intervention or comparison group; through the assumption of "exchangeability" between groups, comparing the outcomes will yield an estimate of causal effect. In the many cases where RCTs are impractical or unethical, instrumental variable (IV) analysis offers a nonexperimental alternative based on many of the same principles. IV analysis relies on finding a naturally varying phenomenon, related to treatment but not to outcome except through the effect of treatment itself, and then using this phenomenon as a proxy for the confounded treatment variable. This article demonstrates how IV analysis arises from an analogous but potentially impossible RCT design, and outlines the assumptions necessary for valid estimation. It gives examples of instruments used in clinical epidemiology and concludes with an outline on estimation of effects.
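
A synthetic two-variable example shows the mechanics: with an unobserved confounder u, ordinary regression of outcome on treatment is biased, while the simple Wald/IV ratio cov(z, y)/cov(z, x) recovers the causal effect. All coefficients below are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(11)
n = 50_000
z = rng.binomial(1, 0.5, n)                       # instrument (e.g. randomized encouragement)
u = rng.standard_normal(n)                        # unobserved confounder
x = 0.8 * z + 0.9 * u + rng.standard_normal(n)    # treatment, confounded by u
y = 2.0 * x + 1.5 * u + rng.standard_normal(n)    # outcome; true causal effect is 2.0

cm = np.cov(x, y)
ols = cm[0, 1] / cm[0, 0]                         # naive regression slope, biased by u
iv = np.cov(z, y)[0, 1] / np.cov(z, x)[0, 1]      # Wald/IV estimator
print(f"OLS: {ols:.2f} (biased upward), IV: {iv:.2f} (close to 2.0)")
```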

  8. Instrumental variables I: instrumental variables exploit natural variation in nonexperimental data to estimate causal relationships

    PubMed Central

    Rassen, Jeremy A.; Brookhart, M. Alan; Glynn, Robert J.; Mittleman, Murray A.; Schneeweiss, Sebastian

    2010-01-01

    The gold standard of study design for treatment evaluation is widely acknowledged to be the randomized controlled trial (RCT). Trials allow for the estimation of causal effect by randomly assigning participants either to an intervention or comparison group; through the assumption of “exchangeability” between groups, comparing the outcomes will yield an estimate of causal effect. In the many cases where RCTs are impractical or unethical, instrumental variable (IV) analysis offers a nonexperimental alternative based on many of the same principles. IV analysis relies on finding a naturally varying phenomenon, related to treatment but not to outcome except through the effect of treatment itself, and then using this phenomenon as a proxy for the confounded treatment variable. This article demonstrates how IV analysis arises from an analogous but potentially impossible RCT design, and outlines the assumptions necessary for valid estimation. It gives examples of instruments used in clinical epidemiology and concludes with an outline on estimation of effects. PMID:19356901

  9. Stochastic stability of parametrically excited random systems

    NASA Astrophysics Data System (ADS)

    Labou, M.

    2004-01-01

    Multidegree-of-freedom dynamic systems subjected to parametric excitation are analyzed for stochastic stability. The variation of excitation intensity with time is described by the sum of a harmonic function and a stationary random process. The stability boundaries are determined by the stochastic averaging method. The effect of random parametric excitation on the stability of trivial solutions of systems of differential equations for the moments of phase variables is studied. It is assumed that the frequency of harmonic component falls within the region of combination resonances. Stability conditions for the first and second moments are obtained. It turns out that additional parametric excitation may have a stabilizing or destabilizing effect, depending on the values of certain parameters of random excitation. As an example, the stability of a beam in plane bending is analyzed.

  10. Competitive Facility Location with Fuzzy Random Demands

    NASA Astrophysics Data System (ADS)

    Uno, Takeshi; Katagiri, Hideki; Kato, Kosuke

    2010-10-01

    This paper proposes a new location problem for competitive facilities, e.g. shops, in a plane, with uncertainty and vagueness in the demands for the facilities. By representing the demands for facilities as fuzzy random variables, the location problem can be formulated as a fuzzy random programming problem. To solve the fuzzy random programming problem, the α-level sets for fuzzy numbers are first used to transform it into a stochastic programming problem; secondly, by using their expectations and variances, it can be reformulated as a deterministic programming problem. After showing that an optimal solution can be found by solving 0-1 programming problems, a solution method is proposed that improves the tabu search algorithm with strategic oscillation. The efficiency of the proposed method is shown by applying it to numerical examples of facility location problems.

  11. Variable gravity research facility

    NASA Technical Reports Server (NTRS)

    Allan, Sean; Ancheta, Stan; Beine, Donna; Cink, Brian; Eagon, Mark; Eckstein, Brett; Luhman, Dan; Mccowan, Daniel; Nations, James; Nordtvedt, Todd

    1988-01-01

    Spin and despin requirements; sequence of activities required to assemble the Variable Gravity Research Facility (VGRF); power systems technology; life support; thermal control systems; emergencies; communication systems; space station applications; experimental activities; computer modeling and simulation of tether vibration; cost analysis; configuration of the crew compartments; and tether lengths and rotation speeds are discussed.

  12. Lake Ontario: Nearshore Variability

    EPA Science Inventory

    We conducted a high-resolution survey with towed electronic instrumentation along the Lake Ontario nearshore (720 km) at a 20 meter contour. The survey was conducted September 6-10, 2008, with a shorter 300 km survey conducted August 14-15 for comparison of temporal variability. ...

  13. Variable polarity arc welding

    NASA Technical Reports Server (NTRS)

    Bayless, E. O., Jr.

    1991-01-01

    Technological advances generate within themselves dissatisfactions that lead to further advances in a process. Briefly described here are a series of advances in welding technology that culminated in the Variable Polarity Plasma Arc (VPPA) welding process, and an advance instituted to overcome the latest dissatisfactions with that process: automated VPPA welding.

  14. TAC Variable Sweep Model

    1960-05-14

    Project: Wing Sweep Range Series. TAC Variable Sweep Model, configuration 8A. Taken at the 8-Foot Tunnel, building 641. L60-3412 through 3416. Model of proposed military supersonic attack airplane shows wing sweep range. TAC models taken at the 8-Foot Tunnel. Photograph published in Sixty Years of Aeronautical Research 1917-1977 by David A. Anderton, a NASA publication, page 53.

  15. Surfing wave climate variability

    NASA Astrophysics Data System (ADS)

    Espejo, Antonio; Losada, Iñigo J.; Méndez, Fernando J.

    2014-10-01

    International surfing destinations are highly dependent on specific combinations of wind-wave formation, thermal conditions and local bathymetry. Surf quality depends on a vast number of geophysical variables, and analyses of surf quality require the consideration of the seasonal, interannual and long-term variability of surf conditions on a global scale. A multivariable standardized index based on expert judgment is proposed for this purpose. This index makes it possible to analyze surf conditions objectively over a global domain. A summary of global surf resources based on a new index integrating existing wave, wind, tides and sea surface temperature databases is presented. According to general atmospheric circulation and swell propagation patterns, results show that west-facing low to middle-latitude coasts are more suitable for surfing, especially those in the Southern Hemisphere. Month-to-month analysis reveals strong seasonal variations in the occurrence of surfable events, enhancing the frequency of such events in the North Atlantic and the North Pacific. Interannual variability was investigated by comparing occurrence values with global and regional modes of low-frequency climate variability such as El Niño and the North Atlantic Oscillation, revealing their strong influence at both the global and the regional scale. Results of the long-term trends demonstrate an increase in the probability of surfable events on west-facing coasts around the world in recent years. The resulting maps provide useful information for surfers, the surf tourism industry and surf-related coastal planners and stakeholders.

  16. Variable camber rotor study

    NASA Technical Reports Server (NTRS)

    Dadone, L.; Cowan, J.; Mchugh, F. J.

    1982-01-01

    Deployment of variable camber concepts on helicopter rotors was analytically assessed. It was determined that variable camber extended the operating range of helicopters, provided that the correct compromise could be obtained between performance/loads gains and mechanical complexity. A number of variable camber concepts were reviewed on a two-dimensional basis to determine the usefulness of leading edge, trailing edge and overall camber variation schemes. The most powerful method of varying camber was through trailing edge flaps undergoing relatively small motions (-5 deg to +15 deg). The aerodynamic characteristics of the NASA/Ames A-1 airfoil with 35% and 50% plain trailing edge flaps were determined by means of current subcritical and transonic airfoil design methods and used by rotor performance and loads analysis codes. The most promising variable camber schedule reviewed was a configuration with a 35% plain flap deployed in an on/off mode near the tip of a blade. Preliminary results show that approximately 11% reduction in power is possible at 192 knots and a rotor thrust coefficient of 0.09. The demonstrated gains indicate a significant potential for expanding the operating envelope of the helicopter. Further investigation into improving the power saving and defining the improvement in the operational envelope of the helicopter is recommended.

  17. Variable Oakleaf Caterpillar

    Louis F. Wilson; Gordon A. Surgeoner

    1979-01-01

    The variable oakleaf caterpillar (Heterocampa manteo (Dbldy.)) is a common insect in deciduous forests of Eastern North America. It has been recorded from most of the Eastern Canadian Provinces and most of the States in the East to North Dakota in the West and south to eastern Texas, Louisiana, and Mississippi. Heavy defoliations of hosts may occur anywhere within this...

  18. Branching random walk with step size coming from a power law

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Ayan; Subhra Hazra, Rajat; Roy, Parthanil

    2015-09-01

    In their seminal work, Brunet and Derrida made predictions on the random point configurations associated with branching random walks. We shall discuss the limiting behavior of such point configurations when the displacement random variables come from a power law. In particular, we establish that two of the predictions of Brunet and Derrida remain valid in this setup, and investigate various other issues mentioned in their paper.
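
    As an aside for readers who want to experiment, the following is a minimal simulation sketch, not the authors' code: each particle branches into two offspring per generation (an illustrative choice), and each offspring is displaced by a step whose magnitude is drawn from a power law with tail index alpha.

        import numpy as np

        rng = np.random.default_rng(0)

        def branching_random_walk(generations=15, alpha=2.5):
            """Return the point configuration after the given number of generations."""
            positions = np.array([0.0])  # generation 0: one particle at the origin
            for _ in range(generations):
                parents = np.repeat(positions, 2)                   # binary branching
                steps = rng.pareto(alpha, size=parents.size) + 1.0  # power-law step sizes
                signs = rng.choice([-1.0, 1.0], size=parents.size)
                positions = parents + signs * steps
            return positions

        config = branching_random_walk()
        print("particles:", config.size, "max displacement:", config.max())

    With heavy-tailed steps the extremes of the configuration are typically dominated by a few large jumps, which is the regime the limit theorems above address.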

  19. Causal Inference and Omitted Variable Bias in Financial Aid Research: Assessing Solutions

    ERIC Educational Resources Information Center

    Riegg, Stephanie K.

    2008-01-01

    This article highlights the problem of omitted variable bias in research on the causal effect of financial aid on college-going. I first describe the problem of self-selection and the resulting bias from omitted variables. I then assess and explore the strengths and weaknesses of random assignment, multivariate regression, proxy variables, fixed…

  20. Bias and Bias Correction in Multi-Site Instrumental Variables Analysis of Heterogeneous Mediator Effects

    ERIC Educational Resources Information Center

    Reardon, Sean F.; Unlu, Faith; Zhu, Pei; Bloom, Howard

    2013-01-01

    We explore the use of instrumental variables (IV) analysis with a multi-site randomized trial to estimate the effect of a mediating variable on an outcome in cases where it can be assumed that the observed mediator is the only mechanism linking treatment assignment to outcomes, an assumption known in the instrumental variables literature as the…

  1. Tides and Decadal Variability

    NASA Technical Reports Server (NTRS)

    Ray, Richard D.

    2003-01-01

    This paper reviews the mechanisms by which oceanic tides and decadal variability in the oceans are connected. We distinguish between variability caused by tides and variability observed in the tides themselves. Both effects have been detected at some level. The most obvious connection with decadal timescales is through the 18.6-year precession of the moon's orbit plane. This precession gives rise to a small tide of the same period and to 18.6-year modulations in the phase and amplitudes of short-period tides. The 18.6-year "node tide" is very small, no more than 2 cm anywhere, and in sea level data it is dominated by the ocean's natural variability. Some authors have naively attributed climate variations with periods near 19 years directly to the node tide, but the amplitude of the tide is too small for this mechanism to be operative. The more likely explanation (Loder and Garrett, JGR, 83, 1967-70, 1978) is that the 18.6-year modulations in short-period tides, especially the principal tide M2, cause variations in ocean mixing, which is then observed in temperature and other climatic indicators. Tidally forced variability has also been proposed by some authors, either in response to occasional (and highly predictable) tidal extremes or as a nonlinear low-frequency oscillation caused by interactions between short-period tides. The former mechanism can produce only short-duration events hardly more significant than normal tidal ranges, but the latter mechanism can in principle induce low-frequency oscillations. The most recent proposal of this type is by Keeling and Whorf, who highlight the 1800-year spectral peak discovered by Bond et al. (1997). But the proposal appears contrived and should be considered, in the words of Munk et al. (2002), "as the most likely among unlikely candidates."

  2. Effect of randomness in logistic maps

    NASA Astrophysics Data System (ADS)

    Khaleque, Abdul; Sen, Parongama

    2015-01-01

    We study a random logistic map x_{t+1} = a_t x_t [1 - x_t], where the a_t are bounded random variables (q_1 ≤ a_t ≤ q_2) independently drawn from a distribution. x_t does not show any regular behavior in time. We find that x_t shows fully ergodic behavior when the maximum allowed value of a_t is 4. However, x_t averaged over different realizations reaches a fixed point. For 1 ≤ a_t ≤ 4, the system shows nonchaotic behavior and the Lyapunov exponent is strongly dependent on the asymmetry of the distribution from which a_t is drawn. Chaotic behavior is seen to occur beyond a threshold value of q_1 (q_2) when q_2 (q_1) is varied. The most striking result is that the random map is chaotic even when q_2 is less than the threshold value 3.5699⋯ at which chaos occurs in the nonrandom map. We also employ a different method in which a different set of random variables is used for the evolution of two initially identical x values; here the chaotic regime exists for all q_1 ≠ q_2 values.
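
    A minimal sketch of this map for readers who wish to reproduce the qualitative behavior (the uniform distribution for a_t and all parameter values are illustrative assumptions, not the paper's exact setup); the Lyapunov exponent is estimated as the time average of log|f'(x_t)| = log|a_t (1 - 2 x_t)|:

        import numpy as np

        def lyapunov_estimate(q1, q2, steps=100_000, x0=0.3, seed=1):
            """Estimate the Lyapunov exponent of x_{t+1} = a_t x_t (1 - x_t), a_t ~ U[q1, q2]."""
            rng = np.random.default_rng(seed)
            x, acc = x0, 0.0
            for _ in range(steps):
                a = rng.uniform(q1, q2)
                acc += np.log(abs(a * (1.0 - 2.0 * x)))  # log of the local derivative
                x = a * x * (1.0 - x)
            return acc / steps

        # A positive estimate signals chaos; per the abstract, this can occur
        # even when q2 stays below the nonrandom threshold 3.5699...
        print(lyapunov_estimate(2.8, 3.55), lyapunov_estimate(3.8, 4.0))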

  3. Random array grid collimator

    DOEpatents

    Fenimore, E.E.

    1980-08-22

    A hexagonally shaped quasi-random no-two-holes-touching grid collimator. The quasi-random array grid collimator eliminates contamination from small-angle off-axis rays by using a no-two-holes-touching pattern which simultaneously provides for a self-supporting array, increasing throughput by elimination of a substrate. The present invention also provides maximum throughput using hexagonally shaped holes in a hexagonal lattice pattern for diffraction-limited applications. Mosaicking is also disclosed for reducing fabrication effort.

  4. Randomized Dynamic Mode Decomposition

    NASA Astrophysics Data System (ADS)

    Erichson, N. Benjamin; Brunton, Steven L.; Kutz, J. Nathan

    2017-11-01

    The dynamic mode decomposition (DMD) is an equation-free, data-driven matrix decomposition that is capable of providing accurate reconstructions of spatio-temporal coherent structures arising in dynamical systems. We present randomized algorithms to compute the near-optimal low-rank dynamic mode decomposition for massive datasets. Randomized algorithms are simple, accurate and able to ease the computational challenges arising with `big data'. Moreover, randomized algorithms are amenable to modern parallel and distributed computing. The idea is to derive a smaller matrix from the high-dimensional input data matrix using randomness as a computational strategy. Then, the dynamic modes and eigenvalues are accurately learned from this smaller representation of the data, whereby the approximation quality can be controlled via oversampling and power iterations. Here, we present randomized DMD algorithms that are categorized by how many passes the algorithm takes through the data. Specifically, the single-pass randomized DMD does not require data to be stored for subsequent passes. Thus, it is possible to approximately decompose massive fluid flows (stored out of core memory, or not stored at all) using single-pass algorithms, which is infeasible with traditional DMD algorithms.
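
    A hedged sketch of the two-stage idea described above (not the authors' reference implementation; the rank and oversampling parameters are illustrative): first compress the snapshot matrix with a random test matrix, then run ordinary DMD on the compressed representation and lift the modes back.

        import numpy as np

        def randomized_dmd(D, rank=10, oversample=10, seed=0):
            """D holds snapshots as columns; returns DMD eigenvalues and modes."""
            rng = np.random.default_rng(seed)
            n = D.shape[1]
            # Stage A: sketch the column space with a random test matrix.
            Omega = rng.standard_normal((n, rank + oversample))
            Q, _ = np.linalg.qr(D @ Omega)
            B = Q.T @ D                      # small (rank+oversample) x n matrix
            # Stage B: exact DMD on the compressed snapshot pairs.
            B1, B2 = B[:, :-1], B[:, 1:]
            U, s, Vh = np.linalg.svd(B1, full_matrices=False)
            Atilde = U.T @ B2 @ Vh.T @ np.diag(1.0 / s)
            eigvals, W = np.linalg.eig(Atilde)
            modes = Q @ (U @ W)              # lift the modes back to the full space
            return eigvals, modes

    Power iterations (repeatedly multiplying the sketch by D and its transpose before the QR step) can sharpen the approximation when singular values decay slowly, which is the quality-control knob the abstract mentions.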

  5. Variable Permanent Magnet Quadrupole

    SciT

    Mihara, T.; Iwashita, Y.; /Kyoto U.

    A permanent magnet quadrupole (PMQ) is one of the candidates for the final focus lens in a linear collider. An over 120 T/m strong variable permanent magnet quadrupole is achieved by the introduction of saturated iron and a 'double ring structure'. A fabricated PMQ achieved 24 T integrated gradient with 20 mm bore diameter, 100 mm magnet diameter and 20 cm pole length. The strength of the PMQ is adjustable in 1.4 T steps, due to its 'double ring structure': the PMQ is split into two nested rings; the outer ring is sliced along the beam line into four parts and is rotated to change the strength. This paper describes the variable PMQ from fabrication to recent adjustments.

  6. Variable stiffness torsion springs

    NASA Astrophysics Data System (ADS)

    Alhorn, Dean C.; Polites, Michael E.

    1994-05-01

    In a torsion spring the spring action is a result of the relationships between the torque applied in twisting the spring, the angle through which the torsion spring twists, and the modulus of elasticity of the spring material in shear. Torsion springs employed industrially have been strips, rods, or bars, generally termed shafts, capable of being flexed by twisting their axes. They rely on the variations in shearing forces to furnish an internal restoring torque. In the torsion springs herein the restoring torque is external and therefore independent of the shearing modulus of elasticity of the torsion spring shaft. Also provided herein is a variable stiffness torsion spring. This torsion spring can be so adjusted as to have a given spring constant. Such variable stiffness torsion springs are extremely useful in gimballed payloads such as sensors, telescopes, and electronic devices on such platforms as a space shuttle or a space station.

  7. Variable stiffness torsion springs

    NASA Technical Reports Server (NTRS)

    Alhorn, Dean C. (Inventor); Polites, Michael E. (Inventor)

    1995-01-01

    In a torsion spring the spring action is a result of the relationships between the torque applied in twisting the spring, the angle through which the torsion spring twists, and the modulus of elasticity of the spring material in shear. Torsion springs employed industrially have been strips, rods, or bars, generally termed shafts, capable of being flexed by twisting their axes. They rely on the variations in shearing forces to furnish an internal restoring torque. In the torsion springs herein the restoring torque is external and therefore independent of the shearing modulus of elasticity of the torsion spring shaft. Also provided herein is a variable stiffness torsion spring. This torsion spring can be so adjusted as to have a given spring constant. Such variable stiffness torsion springs are extremely useful in gimballed payloads such as sensors, telescopes, and electronic devices on such platforms as a space shuttle or a space station.

  8. Variable stiffness torsion springs

    NASA Astrophysics Data System (ADS)

    Alhorn, Dean C.; Polites, Michael E.

    1995-08-01

    In a torsion spring the spring action is a result of the relationships between the torque applied in twisting the spring, the angle through which the torsion spring twists, and the modulus of elasticity of the spring material in shear. Torsion springs employed industrially have been strips, rods, or bars, generally termed shafts, capable of being flexed by twisting their axes. They rely on the variations in shearing forces to furnish an internal restoring torque. In the torsion springs herein the restoring torque is external and therefore independent of the shearing modulus of elasticity of the torsion spring shaft. Also provided herein is a variable stiffness torsion spring. This torsion spring can be so adjusted as to have a given spring constant. Such variable stiffness torsion springs are extremely useful in gimballed payloads such as sensors, telescopes, and electronic devices on such platforms as a space shuttle or a space station.

  9. Variable stiffness torsion springs

    NASA Technical Reports Server (NTRS)

    Alhorn, Dean C. (Inventor); Polites, Michael E. (Inventor)

    1994-01-01

    In a torsion spring the spring action is a result of the relationships between the torque applied in twisting the spring, the angle through which the torsion spring twists, and the modulus of elasticity of the spring material in shear. Torsion springs employed industrially have been strips, rods, or bars, generally termed shafts, capable of being flexed by twisting their axes. They rely on the variations in shearing forces to furnish an internal restoring torque. In the torsion springs herein the restoring torque is external and therefore independent of the shearing modulus of elasticity of the torsion spring shaft. Also provided herein is a variable stiffness torsion spring. This torsion spring can be so adjusted as to have a given spring constant. Such variable stiffness torsion springs are extremely useful in gimballed payloads such as sensors, telescopes, and electronic devices on such platforms as a space shuttle or a space station.

  10. Intrinsically variable stars

    NASA Technical Reports Server (NTRS)

    Bohm-Vitense, Erika; Querci, Monique

    1987-01-01

    The characteristics of intrinsically variable stars are examined, reviewing the results of observations obtained with the IUE satellite since its launch in 1978. Selected data on both medium-spectral-class pulsating stars (Delta Cep stars, W Vir stars, and related groups) and late-type variables (M, S, and C giants and supergiants) are presented in spectra, graphs, and tables and described in detail. Topics addressed include the calibration of the period-luminosity relation, Cepheid distance determination, checking stellar evolution theory by the giant companions of Cepheids, Cepheid masses, the importance of the hydrogen convection zone in Cepheids, temperature and abundance estimates for Population II pulsating stars, mass loss in Population II Cepheids, SWP and LWP images of cold giants and supergiants, temporal variations in the UV lines of cold stars, C-rich cold stars, and cold stars with highly ionized emission lines.

  11. Climate Variability Program

    NASA Technical Reports Server (NTRS)

    Halpern, David (Editor)

    2002-01-01

    The Annual Report of the Climate Variability Program briefly describes research activities of Principal Investigators who are funded by NASA's Earth Science Enterprise Research Division. The report is focused on the year 2001. Utilization of satellite observations is a distinguishing feature of research on climate science and technology at JPL (Jet Propulsion Laboratory). Research at JPL has two foci: to generate new knowledge and to develop new technology.

  12. Variable percentage sampler

    DOEpatents

    Miller, Jr., William H.

    1976-01-01

    A remotely operable sampler is provided for obtaining variable percentage samples of nuclear fuel particles and the like for analyses. The sampler has a rotating cup for a sample collection chamber designed so that the effective size of the sample inlet opening to the cup varies with rotational speed. Samples of a desired size are withdrawn from a flowing stream of particles without a deterrent to the flow of remaining particles.

  13. Variable depth core sampler

    DOEpatents

    Bourgeois, P.M.; Reger, R.J.

    1996-02-20

    A variable depth core sampler apparatus is described comprising a first circular hole saw member, having longitudinal sections that collapse to form a point and capture a sample, and a second circular hole saw member residing inside said first hole saw member to support the longitudinal sections of said first hole saw member and prevent them from collapsing to form a point. The second hole saw member may be raised and lowered inside said first hole saw member. 7 figs.

  14. Variable laser attenuator

    DOEpatents

    Foltyn, S.R.

    1987-05-29

    The disclosure relates to low loss, high power variable attenuators comprising one or more transmissive and/or reflective multilayer dielectric filters. The attenuator is particularly suitable for use with unpolarized lasers such as excimer lasers. Beam attenuation is a function of beam polarization and the angle of incidence between the beam and the filter and is controlled by adjusting the angle of incidence the beam makes to the filter or filters. Filters are selected in accordance with beam wavelength. 9 figs.

  15. Variable laser attenuator

    DOEpatents

    Foltyn, Stephen R.

    1988-01-01

    The disclosure relates to low loss, high power variable attenuators comprising one or more transmissive and/or reflective multilayer dielectric filters. The attenuator is particularly suitable for use with unpolarized lasers such as excimer lasers. Beam attenuation is a function of beam polarization and the angle of incidence between the beam and the filter and is controlled by adjusting the angle of incidence the beam makes to the filter or filters. Filters are selected in accordance with beam wavelength.

  16. Variable depth core sampler

    DOEpatents

    Bourgeois, Peter M.; Reger, Robert J.

    1996-01-01

    A variable depth core sampler apparatus comprising a first circular hole saw member, having longitudinal sections that collapse to form a point and capture a sample, and a second circular hole saw member residing inside said first hole saw member to support the longitudinal sections of said first hole saw member and prevent them from collapsing to form a point. The second hole saw member may be raised and lowered inside said first hole saw member.

  17. Compositions, Random Sums and Continued Random Fractions of Poisson and Fractional Poisson Processes

    NASA Astrophysics Data System (ADS)

    Orsingher, Enzo; Polito, Federico

    2012-08-01

    In this paper we consider the relation between random sums and compositions of different processes. In particular, for independent Poisson processes N_α(t), N_β(t), t > 0, we have that N_α(N_β(t)) is equal in distribution to Σ_{j=1}^{N_β(t)} X_j, where the X_j are Poisson random variables. We present a series of similar cases, where the outer process is Poisson with different inner processes. We highlight generalisations of these results where the external process is infinitely divisible. A section of the paper concerns compositions of the form N_α(τ_k^ν), ν ∈ (0,1], where τ_k^ν is the inverse of the fractional Poisson process, and we show how these compositions can be represented as random sums. Furthermore we study compositions of the form Θ(N(t)), t > 0, which can be represented as random products. The last section is devoted to studying continued fractions of Cauchy random variables with a Poisson number of levels. We evaluate the exact distribution and derive the scale parameter in terms of ratios of Fibonacci numbers.
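
    The first identity is easy to check numerically: conditioned on N_β(t) = n, the outer process evaluated at n is Poisson(αn), i.e. a sum of n iid Poisson(α) variables. A small sketch with illustrative rates:

        import numpy as np

        rng = np.random.default_rng(42)
        alpha, beta, t, trials = 1.3, 0.7, 5.0, 200_000

        inner = rng.poisson(beta * t, size=trials)   # N_beta(t)
        composition = rng.poisson(alpha * inner)     # N_alpha(N_beta(t))
        random_sum = np.array([rng.poisson(alpha, n).sum() for n in inner])

        # The two constructions agree in distribution, hence in moments.
        print(composition.mean(), random_sum.mean())
        print(composition.var(), random_sum.var())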

  18. Fatigue crack growth under variable amplitude loading

    NASA Astrophysics Data System (ADS)

    Sidawi, Jihad A.

    1994-09-01

    Fatigue crack growth tests were conducted on an Fe 510 E C-Mn steel and a submerged arc welded joint from the same material under constant, variable, and random loading amplitudes. Paris-Erdogan's crack growth rate law was tested for the evaluation of m and C using the stress intensity factor K, the J-integral, the effective stress intensity factor K(sub eff), and the root mean square stress intensity factor K(sub rms) fracture mechanics concepts. The effect of retardation and residual stresses resulting from welding was also considered. It was found that all concepts gave good life predictions in all cases.
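
    For reference, the Paris-Erdogan law tested above relates crack growth per load cycle to the range of the stress intensity factor; in the usual notation,

        \frac{da}{dN} = C \, (\Delta K)^{m}

    where a is the crack length, N the number of load cycles, ΔK the stress intensity factor range, and C and m the material constants evaluated in the study; the analogous fits substitute ΔJ, ΔK_eff, or ΔK_rms for ΔK.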

  19. Randomized central limit theorems: A unified theory.

    PubMed

    Eliazar, Iddo; Klafter, Joseph

    2010-08-01

    The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles' aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles' extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic: all ensemble components are scaled by a common deterministic scale. However, there are "random environment" settings in which the underlying scaling schemes are stochastic: the ensemble components are scaled by different random scales. Examples of such settings include Holtsmark's law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs), in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes, and present "randomized counterparts" to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.

  20. Randomized central limit theorems: A unified theory

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Klafter, Joseph

    2010-08-01

    The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles’ aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles’ extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic—scaling all ensemble components by a common deterministic scale. However, there are “random environment” settings in which the underlying scaling schemes are stochastic—scaling the ensemble components by different random scales. Examples of such settings include Holtsmark’s law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs)—in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes—and present “randomized counterparts” to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.

  1. On grey levels in random CAPTCHA generation

    NASA Astrophysics Data System (ADS)

    Newton, Fraser; Kouritzin, Michael A.

    2011-06-01

    A CAPTCHA is an automatically generated test designed to distinguish between humans and computer programs; specifically, they are designed to be easy for humans but difficult for computer programs to pass in order to prevent the abuse of resources by automated bots. They are commonly seen guarding webmail registration forms, online auction sites, and preventing brute force attacks on passwords. In the following, we address the question: How does adding a grey level to random CAPTCHA generation affect the utility of the CAPTCHA? We treat the problem of generating the random CAPTCHA as one of random field simulation: An initial state of background noise is evolved over time using Gibbs sampling and an efficient algorithm for generating correlated random variables. This approach has already been found to yield highly-readable yet difficult-to-crack CAPTCHAs. We detail how the requisite parameters for introducing grey levels are estimated and how we generate the random CAPTCHA. The resulting CAPTCHA will be evaluated in terms of human readability as well as its resistance to automated attacks in the forms of character segmentation and optical character recognition.
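
    A minimal sketch of the random-field idea (illustrative only; the paper's actual model, parameters, and character-rendering step are richer): an Ising-like binary field is evolved by Gibbs sampling, with each pixel resampled from its conditional distribution given its four neighbors.

        import numpy as np

        def gibbs_field(shape=(64, 64), coupling=0.8, sweeps=30, seed=0):
            """Sample a correlated +/-1 noise field by Gibbs sweeps."""
            rng = np.random.default_rng(seed)
            field = rng.choice([-1, 1], size=shape)
            rows, cols = shape
            for _ in range(sweeps):
                for i in range(rows):
                    for j in range(cols):
                        # Sum of the four nearest neighbors (wrap-around borders).
                        s = (field[(i - 1) % rows, j] + field[(i + 1) % rows, j]
                             + field[i, (j - 1) % cols] + field[i, (j + 1) % cols])
                        p_up = 1.0 / (1.0 + np.exp(-2.0 * coupling * s))  # P(+1 | neighbors)
                        field[i, j] = 1 if rng.random() < p_up else -1
            return field

        noise = gibbs_field()  # larger `coupling` gives blobbier, more correlated noise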

  2. Quantization of high dimensional Gaussian vector using permutation modulation with application to information reconciliation in continuous variable QKD

    NASA Astrophysics Data System (ADS)

    Daneshgaran, Fred; Mondin, Marina; Olia, Khashayar

    This paper is focused on the problem of Information Reconciliation (IR) for continuous variable Quantum Key Distribution (QKD). The main problem is quantization and assignment of labels to the samples of the Gaussian variables observed at Alice and Bob. The trouble is that most of the samples, given that the Gaussian variable is zero mean (which is de facto the case), tend to have small magnitudes and are easily disturbed by noise. Transmission over longer and longer distances increases the losses, corresponding to a lower effective Signal-to-Noise Ratio (SNR) and exacerbating the problem. Quantization over higher dimensions is advantageous since it allows for fractional bit-per-sample accuracy, which may be needed at very low SNR conditions whereby the achievable secret key rate is significantly less than one bit per sample. In this paper, we propose to use Permutation Modulation (PM) for quantization of Gaussian vectors potentially containing thousands of samples. PM is applied to the magnitudes of the Gaussian samples and we explore the dependence of the sign error probability on the magnitude of the samples. At very low SNR, we may transmit the entire label of the PM code from Bob to Alice in Reverse Reconciliation (RR) over a public channel. The side information extracted from this label can then be used by Alice to characterize the sign error probability of her individual samples. Forward Error Correction (FEC) coding can be used by Bob on each subset of samples with similar sign error probability to aid Alice in error correction. This can be done for different subsets of samples with similar sign error probabilities, leading to an Unequal Error Protection (UEP) coding paradigm.
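
    To make the permutation-modulation step concrete, here is a toy sketch (an illustration of PM in general, not the paper's code; the prototype vector is an arbitrary assumption): the codebook consists of all permutations of one fixed prototype, so a vector of magnitudes is quantized by assigning its k-th smallest entry the k-th smallest prototype value, and the permutation itself serves as the label sent over the public channel.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 1024
        prototype = np.sort(np.abs(rng.standard_normal(n)))  # fixed, known to both parties

        x = rng.standard_normal(n)        # one block of Gaussian samples
        order = np.argsort(np.abs(x))     # this permutation is the PM label
        quantized = np.empty(n)
        quantized[order] = prototype      # magnitudes taken from the prototype
        quantized *= np.sign(x)           # signs handled separately, as in the text

        print("magnitude quantization MSE:", np.mean((np.abs(x) - np.abs(quantized)) ** 2))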

  3. Characterizing the Optical Variability of Bright Blazars: Variability-based Selection of Fermi Active Galactic Nuclei

    NASA Astrophysics Data System (ADS)

    Ruan, John J.; Anderson, Scott F.; MacLeod, Chelsea L.; Becker, Andrew C.; Burnett, T. H.; Davenport, James R. A.; Ivezić, Željko; Kochanek, Christopher S.; Plotkin, Richard M.; Sesar, Branimir; Stuart, J. Scott

    2012-11-01

    We investigate the use of optical photometric variability to select and identify blazars in large-scale time-domain surveys, in part to aid in the identification of blazar counterparts to the ~30% of γ-ray sources in the Fermi 2FGL catalog still lacking reliable associations. Using data from the optical LINEAR asteroid survey, we characterize the optical variability of blazars by fitting a damped random walk model to individual light curves with two main model parameters, the characteristic timescale of variability τ and the driving amplitude on short timescales σ̂. Imposing cuts on minimum τ and σ̂ allows for blazar selection with high efficiency E and completeness C. To test the efficacy of this approach, we apply this method to optically variable LINEAR objects that fall within the several-arcminute error ellipses of γ-ray sources in the Fermi 2FGL catalog. Despite the extreme stellar contamination at the shallow depth of the LINEAR survey, we are able to recover previously associated optical counterparts to Fermi active galactic nuclei with E ≥ 88% and C = 88% in Fermi 95% confidence error ellipses having semimajor axis r < 8'. We find that the suggested radio counterpart to Fermi source 2FGL J1649.6+5238 has optical variability consistent with other γ-ray blazars and is likely to be the γ-ray source. Our results suggest that the variability of the non-thermal jet emission in blazars is stochastic in nature, with unique variability properties due to the effects of relativistic beaming. After correcting for beaming, we estimate that the characteristic timescale of blazar variability is ~3 years in the rest frame of the jet, in contrast with the ~320 day disk flux timescale observed in quasars. The variability-based selection method presented will be useful for blazar identification in time-domain optical surveys and is also a probe of jet physics.
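
    The damped random walk is an Ornstein-Uhlenbeck process, so a model light curve with parameters τ and σ̂ can be generated exactly with an AR(1) recursion; a brief sketch with made-up parameter values:

        import numpy as np

        def drw_light_curve(n=2000, dt=1.0, tau=300.0, sigma_hat=0.1,
                            mean_mag=18.0, seed=3):
            """Damped-random-walk light curve sampled every dt days."""
            rng = np.random.default_rng(seed)
            var_inf = sigma_hat**2 * tau / 2.0        # asymptotic variance
            decay = np.exp(-dt / tau)
            step_sd = np.sqrt(var_inf * (1.0 - decay**2))
            x = rng.normal(0.0, np.sqrt(var_inf))     # start in the stationary state
            mags = np.empty(n)
            for i in range(n):
                x = x * decay + rng.normal(0.0, step_sd)
                mags[i] = mean_mag + x
            return mags

        lc = drw_light_curve()

    Fitting this two-parameter model to each light curve and cutting on the recovered τ and σ̂ is the selection step described above.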

  4. Axial Fatigue Tests at Zero Mean Stress of 24S-T Aluminum-alloy Sheet with and Without a Circular Hole

    NASA Technical Reports Server (NTRS)

    Brueggeman, W C; Mayer, M JR; Smith, W H

    1944-01-01

    Axial fatigue tests were made on 189 coupon specimens of 0.032-inch 24S-T aluminum-alloy sheet and a few supplementary specimens of 0.004-inch sheet. The mean load was zero. The specimens were restrained against lateral buckling by lubricated solid guides described in a previous report on this project. About two-thirds of the 0.032-inch specimens were plain coupons nominally free from stress raisers. The remainder contained a 0.1285-inch drilled hole at the center where the reduced section was 0.5 inch wide. S-N diagrams were obtained for cycles to failure between about 1000 and 10^7 cycles for the plain specimens and 17 and 10^7 cycles for the drilled specimens. The fatigue stress concentration factor increased from about 1.08 for a stress amplitude causing failure at 0.25 cycles (static) to a maximum of 1.83 at 15,000 cycles and then decreased gradually. The graph for the drilled specimens showed less scatter than that for the plain specimens.

  5. Individual Movement Variability Magnitudes Are Explained by Cortical Neural Variability.

    PubMed

    Haar, Shlomi; Donchin, Opher; Dinstein, Ilan

    2017-09-13

    Humans exhibit considerable motor variability even across trivial reaching movements. This variability can be separated into specific kinematic components such as extent and direction that are thought to be governed by distinct neural processes. Here, we report that individual subjects (males and females) exhibit different magnitudes of kinematic variability, which are consistent (within individual) across movements to different targets and regardless of which arm (right or left) was used to perform the movements. Simultaneous fMRI recordings revealed that the same subjects also exhibited different magnitudes of fMRI variability across movements in a variety of motor system areas. These fMRI variability magnitudes were also consistent across movements to different targets when performed with either arm. Cortical fMRI variability in the posterior-parietal cortex of individual subjects explained their movement-extent variability. This relationship was apparent only in posterior-parietal cortex and not in other motor system areas, thereby suggesting that individuals with more variable movement preparation exhibit larger kinematic variability. We therefore propose that neural and kinematic variability are reliable and interrelated individual characteristics that may predispose individual subjects to exhibit distinct motor capabilities. SIGNIFICANCE STATEMENT Neural activity and movement kinematics are remarkably variable. Although intertrial variability is rarely studied, here, we demonstrate that individual human subjects exhibit distinct magnitudes of neural and kinematic variability that are reproducible across movements to different targets and when performing these movements with either arm. Furthermore, when examining the relationship between cortical variability and movement variability, we find that cortical fMRI variability in parietal cortex of individual subjects explained their movement extent variability. This enabled us to explain why some subjects

  6. Is the Non-Dipole Magnetic Field Random?

    NASA Technical Reports Server (NTRS)

    Walker, Andrew D.; Backus, George E.

    1996-01-01

    Statistical modelling of the Earth's magnetic field B has a long history. In particular, the spherical harmonic coefficients of scalar fields derived from B can be treated as Gaussian random variables. In this paper, we give examples of highly organized fields whose spherical harmonic coefficients pass tests for independent Gaussian random variables. The fact that coefficients at some depth may be usefully summarized as independent samples from a normal distribution need not imply that there really is some physical, random process at that depth. In fact, the field can be extremely structured and still be regarded for some purposes as random. In this paper, we examined the radial magnetic field B_r produced by the core, but the results apply to any scalar field on the core-mantle boundary (CMB) which determines B outside the CMB.

  7. Random Effects: Variance Is the Spice of Life.

    PubMed

    Jupiter, Daniel C

    Covariates in regression analyses allow us to understand how independent variables of interest impact our dependent outcome variable. Often, we consider fixed effects covariates (e.g., gender or diabetes status) for which we examine subjects at each value of the covariate. We examine both men and women and, within each gender, examine both diabetic and nondiabetic patients. Occasionally, however, we consider random effects covariates for which we do not examine subjects at every value. For example, we examine patients from only a sample of hospitals and, within each hospital, examine both diabetic and nondiabetic patients. The random sampling of hospitals is in contrast to the complete coverage of all genders. In this column I explore the differences in meaning and analysis when thinking about fixed and random effects variables. Copyright © 2016 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.
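
    In code, the hospital example corresponds to a mixed model with a fixed effect for diabetes status and a random intercept per hospital; a sketch with simulated data (column names and effect sizes are invented for illustration):

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n_hosp, per_hosp = 20, 50
        hospital = np.repeat(np.arange(n_hosp), per_hosp)
        hosp_shift = rng.normal(0.0, 1.0, n_hosp)[hospital]  # random hospital effect
        diabetic = rng.integers(0, 2, n_hosp * per_hosp)
        outcome = (10 + 2.0 * diabetic + hosp_shift
                   + rng.normal(0.0, 2.0, n_hosp * per_hosp))
        df = pd.DataFrame({"outcome": outcome, "diabetic": diabetic,
                           "hospital": hospital})

        # Fixed effect: diabetic. Random effect: intercept varying by hospital.
        fit = smf.mixedlm("outcome ~ diabetic", df, groups=df["hospital"]).fit()
        print(fit.summary())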

  8. Examining impulse-variability in overarm throwing.

    PubMed

    Urbin, M A; Stodden, David; Boros, Rhonda; Shannon, David

    2012-01-01

    The purpose of this study was to examine variability in overarm throwing velocity and spatial output error at various percentages of maximum, to test the prediction of an inverted-U function from impulse-variability theory and of a speed-accuracy trade-off from Fitts' Law. Thirty subjects (16 skilled, 14 unskilled) were instructed to throw a tennis ball at seven percentages of their maximum velocity (40-100%) in random order (9 trials per condition) at a target 30 feet away. Throwing velocity was measured with a radar gun and interpreted as an index of overall systemic power output. Within-subject throwing velocity variability was examined using within-subjects repeated-measures ANOVAs (7 repeated conditions) with built-in polynomial contrasts. Spatial error was analyzed using mixed model regression. Results indicated a quadratic fit with variability in throwing velocity increasing from 40% up to 60%, where it peaked, and then decreasing at each subsequent interval to maximum (p < .001, η² = .555). There was no linear relationship between speed and accuracy. Overall, these data support the notion of an inverted-U function in overarm throwing velocity variability as both skilled and unskilled subjects approach maximum effort. However, these data do not support the notion of a speed-accuracy trade-off. The consistent demonstration of an inverted-U function associated with systemic power output variability indicates an enhanced capability to regulate aspects of force production and relative timing between segments as individuals approach maximum effort, even in a complex ballistic skill.

  9. Variable camshaft timing system

    SciT

    Butterfield, R.P.; Smith, F.R.

    1989-09-05

    This patent describes an improvement in a variable camshaft timing system for an internal combustion engine having intake and exhaust valves and a camshaft for each of the intake and exhaust valves, an intake sprocket and an exhaust sprocket keyed to their respective camshafts, only one of the camshafts being directly driven by an engine crankshaft, and a timing chain engaging both sprockets. The improvement comprises a single bracket carrying at least one idler sprocket engaging the timing chain, the bracket being mounted for movement to alter the timing relationship between the intake and exhaust sprockets.

  10. Variable leak gas source

    DOEpatents

    Henderson, Timothy M.; Wuttke, Gilbert H.

    1977-01-01

    A variable leak gas source and a method for obtaining the same which includes filling a quantity of hollow glass micro-spheres with a gas, storing said quantity in a confined chamber having a controllable outlet, heating said chamber above room temperature, and controlling the temperature of said chamber to control the quantity of gas passing out of said controllable outlet. Individual gas filled spheres may be utilized for calibration purposes by breaking a sphere having a known quantity of a known gas to calibrate a gas detection apparatus.

  11. Intermittency and random matrices

    NASA Astrophysics Data System (ADS)

    Sokoloff, Dmitry; Illarionov, E. A.

    2015-08-01

    A spectacular phenomenon of intermittency, i.e. a progressive growth of higher statistical moments of a physical field excited by an instability in a random medium, attracted the attention of Zeldovich in the last years of his life. At that time, the mathematical aspects underlying the physical description of this phenomenon were still under development and relations between various findings in the field remained obscure. Contemporary results from the theory of the product of independent random matrices (the Furstenberg theory) allowed the elaboration of the phenomenon of intermittency in a systematic way. We consider applications of the Furstenberg theory to some problems in cosmology and dynamo theory.
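
    The central object of the Furstenberg theory, the top Lyapunov exponent of a product of iid random matrices, is straightforward to estimate numerically; a minimal sketch with an arbitrary 2x2 Gaussian ensemble:

        import numpy as np

        def top_lyapunov(n_steps=100_000, dim=2, seed=0):
            """Growth rate of |M_n ... M_1 v| for iid Gaussian matrices M_i."""
            rng = np.random.default_rng(seed)
            v = rng.standard_normal(dim)
            v /= np.linalg.norm(v)
            acc = 0.0
            for _ in range(n_steps):
                v = rng.standard_normal((dim, dim)) @ v
                norm = np.linalg.norm(v)
                acc += np.log(norm)      # accumulate the log growth
                v /= norm                # renormalize to avoid overflow
            return acc / n_steps

        print("estimated top Lyapunov exponent:", top_lyapunov())

    Intermittency shows up when one tracks the higher moments of the norm, whose growth exponents increase faster than linearly in the moment order.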

  12. Effects of field variables on infield biomass bales aggregation strategies

    Infield aggregation of bales, an essential logistics operation of clearing the field for subsequent cropping, is influenced by several field variables, such as field shape, area, randomness on bale layout, biomass yield per unit area, bale row spacing, number of bales handled simultaneously, collect...

  13. Rational Variability in Children's Causal Inferences: The Sampling Hypothesis

    ERIC Educational Resources Information Center

    Denison, Stephanie; Bonawitz, Elizabeth; Gopnik, Alison; Griffiths, Thomas L.

    2013-01-01

    We present a proposal--"The Sampling Hypothesis"--suggesting that the variability in young children's responses may be part of a rational strategy for inductive inference. In particular, we argue that young learners may be randomly sampling from the set of possible hypotheses that explain the observed data, producing different hypotheses with…

  14. Investigating Organizational Alienation Behavior in Terms of Some Variables

    ERIC Educational Resources Information Center

    Dagli, Abidin; Averbek, Emel

    2017-01-01

    The aim of this study is to detect the perceptions of public primary school teachers regarding organizational alienation behaviors in terms of some variables (gender, marital status and seniority). Survey model was used in this study. The research sample consists of randomly selected 346 teachers from 40 schools in the central district of Mardin,…

  15. Variable word length encoder reduces TV bandwidth requirements

    NASA Technical Reports Server (NTRS)

    Sivertson, W. E., Jr.

    1965-01-01

    Adaptive variable resolution encoding technique provides an adaptive compression pseudo-random noise signal processor for reducing television bandwidth requirements. Complementary processors are required in both the transmitting and receiving systems. The pretransmission processor is analog-to-digital, while the postreception processor is digital-to-analog.

  16. Approximating prediction uncertainty for random forest regression models

    John W. Coulston; Christine E. Blinn; Valerie A. Thomas; Randolph H. Wynne

    2016-01-01

    Machine learning approaches such as random forest are increasingly used for the spatial modeling and mapping of continuous variables. Random forest is a non-parametric ensemble approach, and unlike traditional regression approaches there is no direct quantification of prediction error. Understanding prediction uncertainty is important when using model-based continuous maps as...
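
    One simple way to approximate such uncertainty, sketched below, is to use the spread of the individual trees' predictions around the ensemble mean (this is a generic illustration; the paper's estimator may differ):

        import numpy as np
        from sklearn.datasets import make_regression
        from sklearn.ensemble import RandomForestRegressor

        X, y = make_regression(n_samples=500, n_features=8, noise=10.0, random_state=0)
        forest = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)

        # Each fitted tree is available in forest.estimators_; their disagreement
        # serves as a rough pointwise uncertainty proxy.
        per_tree = np.stack([tree.predict(X[:5]) for tree in forest.estimators_])
        print("ensemble prediction:", per_tree.mean(axis=0))
        print("between-tree std (uncertainty proxy):", per_tree.std(axis=0))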

  17. Aflatoxin variability in pistachios.

    PubMed Central

    Mahoney, N E; Rodriguez, S B

    1996-01-01

    Pistachio fruit components, including hulls (mesocarps and epicarps), seed coats (testas), and kernels (seeds), all contribute to variable aflatoxin content in pistachios. Fresh pistachio kernels were individually inoculated with Aspergillus flavus and incubated 7 or 10 days. Hulled, shelled kernels were either left intact or wounded prior to inoculation. Wounded kernels, with or without the seed coat, were readily colonized by A. flavus and after 10 days of incubation contained 37 times more aflatoxin than similarly treated unwounded kernels. The aflatoxin levels in the individual wounded pistachios were highly variable. Neither fungal colonization nor aflatoxin was detected in intact kernels without seed coats. Intact kernels with seed coats had limited fungal colonization and low aflatoxin concentrations compared with their wounded counterparts. Despite substantial fungal colonization of wounded hulls, aflatoxin was not detected in hulls. Aflatoxin levels were significantly lower in wounded kernels with hulls than in kernels of hulled pistachios. Both the seed coat and a water-soluble extract of hulls suppressed aflatoxin production by A. flavus. PMID:8919781

  18. Delta Scuti Variables

    NASA Astrophysics Data System (ADS)

    Handler, Gerald

    2009-09-01

    We review recent research on Delta Scuti stars from an observer's viewpoint. First, some signposts helping to find the way through the Delta Scuti jungle are placed. Then, some problems in studying individual pulsators in the framework of asteroseismology are given before a view on how the study of these variables has benefited (or not) from past and present high-precision asteroseismic space missions is presented. Some possible pitfalls in the analysis of data with a large dynamical range in pulsational amplitudes are pointed out, and a strategy to optimize the outcome of asteroseismic studies of Delta Scuti stars is suggested. We continue with some views on "hybrid" pulsators and interesting individual High Amplitude Delta Scuti stars, and then take a look at Delta Scuti stars in stellar systems of several different kinds. Recent results on pre-main sequence Delta Scuti stars are discussed, as are questions related to the instability strip of these variables. Finally, some remarkable new theoretical results are highlighted before, instead of a set of classical conclusions, questions to be solved in the future are raised.

  19. Quantifying Proportional Variability

    PubMed Central

    Heath, Joel P.; Borowski, Peter

    2013-01-01

    Real quantities can undergo such a wide variety of dynamics that the mean is often a meaningless reference point for measuring variability. Despite their widespread application, techniques like the Coefficient of Variation are not truly proportional and exhibit pathological properties. The non-parametric measure Proportional Variability (PV) [1] resolves these issues and provides a robust way to summarize and compare variation in quantities exhibiting diverse dynamical behaviour. Instead of being based on deviation from an average value, variation is simply quantified by comparing the numbers to each other, requiring no assumptions about central tendency or underlying statistical distributions. While PV has been introduced before and has already been applied in various contexts to population dynamics, here we present a deeper analysis of this new measure, derive analytical expressions for the PV of several general distributions and present new comparisons with the Coefficient of Variation, demonstrating cases in which PV is the more favorable measure. We show that PV provides an easily interpretable approach for measuring and comparing variation that can be generally applied throughout the sciences, from contexts ranging from stock market stability to climate variation. PMID:24386334
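
    A small sketch of the pairwise idea (following the description above; positive values are assumed, since each pair is compared via a min/max ratio):

        import itertools
        import numpy as np

        def proportional_variability(z):
            """Average, over all pairs, of 1 - min/max; 0 means no variation."""
            z = np.asarray(z, dtype=float)
            d = [1.0 - min(a, b) / max(a, b)
                 for a, b in itertools.combinations(z, 2)]
            return float(np.mean(d))

        print(proportional_variability([10, 12, 9, 11]))   # mild variation
        print(proportional_variability([1, 100, 3, 50]))   # strong variation

    Because no mean or distributional assumption enters the calculation, this is the sense in which the measure avoids the properties the authors criticize in the Coefficient of Variation.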

  20. Physics of Magnetospheric Variability

    NASA Astrophysics Data System (ADS)

    Vasyliūnas, Vytenis M.

    2011-01-01

    Many widely used methods for describing and understanding the magnetosphere are based on balance conditions for quasi-static equilibrium (this is particularly true of the classical theory of magnetosphere/ionosphere coupling, which in addition presupposes the equilibrium to be stable); they may therefore be of limited applicability for dealing with time-variable phenomena as well as for determining cause-effect relations. The large-scale variability of the magnetosphere can be produced both by changing external (solar-wind) conditions and by non-equilibrium internal dynamics. Its developments are governed by the basic equations of physics, especially Maxwell's equations combined with the unique constraints of large-scale plasma; the requirement of charge quasi-neutrality constrains the electric field to be determined by plasma dynamics (generalized Ohm's law) and the electric current to match the existing curl of the magnetic field. The structure and dynamics of the ionosphere/magnetosphere/solar-wind system can then be described in terms of three interrelated processes: (1) stress equilibrium and disequilibrium, (2) magnetic flux transport, (3) energy conversion and dissipation. This provides a framework for a unified formulation of settled as well as of controversial issues concerning, e.g., magnetospheric substorms and magnetic storms.

  1. Current Climate Variability & Change

    NASA Astrophysics Data System (ADS)

    Diem, J.; Criswell, B.; Elliott, W. C.

    2013-12-01

    Current Climate Variability & Change is the ninth among a suite of ten interconnected, sequential labs that address all 39 climate-literacy concepts in the U.S. Global Change Research Program's Climate Literacy: The Essential Principles of Climate Sciences. The labs are as follows: Solar Radiation & Seasons, Stratospheric Ozone, The Troposphere, The Carbon Cycle, Global Surface Temperature, Glacial-Interglacial Cycles, Temperature Changes over the Past Millennium, Climates & Ecosystems, Current Climate Variability & Change, and Future Climate Change. All are inquiry-based, on-line products designed in a way that enables students to construct their own knowledge of a topic. Questions representative of various levels of Webb's depth of knowledge are embedded in each lab. In addition to the embedded questions, each lab has three or four essential questions related to the driving questions for the lab suite. These essential questions are presented as statements at the beginning of the material to represent the lab objectives, and then are asked at the end as questions to function as a summative assessment. For example, the Current Climate Variability & Change is built around these essential questions: (1) What has happened to the global temperature at the Earth's surface, in the middle troposphere, and in the lower stratosphere over the past several decades?; (2) What is the most likely cause of the changes in global temperature over the past several decades and what evidence is there that this is the cause?; and (3) What have been some of the clearly defined effects of the change in global temperature on the atmosphere and other spheres of the Earth system? An introductory Prezi allows the instructor to assess students' prior knowledge in relation to these questions, while also providing 'hooks' to pique their interest related to the topic. The lab begins by presenting examples of and key differences between climate variability (e.g., Mt. Pinatubo eruption) and

  2. Variable Valve Actuation

    SciT

    Jeffrey Gutterman; A. J. Lasley

    2008-08-31

    Many approaches exist to enable advanced mode, low temperature combustion systems for diesel engines - such as premixed charge compression ignition (PCCI), Homogeneous Charge Compression Ignition (HCCI) or other HCCI-like combustion modes. The fuel properties and the quantity, distribution and temperature profile of air, fuel and residual fraction in the cylinder can have a marked effect on the heat release rate and combustion phasing. Figure 1 shows that a systems approach is required for HCCI-like combustion. While the exact requirements remain unclear (and will vary depending on fuel, engine size and application), some form of substantially variable valve actuation is a likely element in such a system. Variable valve actuation, for both intake and exhaust valve events, is a potent tool for controlling the parameters that are critical to HCCI-like combustion and expanding its operational range. Additionally, VVA can be used to optimize the combustion process as well as exhaust temperatures and impact the after treatment system requirements and its associated cost. Delphi Corporation has major manufacturing and product development and applied R&D expertise in the valve train area. Historical R&D experience includes the development of fully variable electro-hydraulic valve train on research engines as well as several generations of mechanical VVA for gasoline systems. This experience has enabled us to evaluate various implementations and determine the strengths and weaknesses of each. While a fully variable electro-hydraulic valve train system might be the 'ideal' solution technically for maximum flexibility in the timing and control of the valve events, its complexity, associated costs, and high power consumption make its implementation on low cost high volume applications unlikely. Conversely, a simple mechanical system might be a low cost solution but not deliver the flexibility required for HCCI operation. After modeling more than 200 variations of the

  3. Generating "Random" Integers

    ERIC Educational Resources Information Center

    Griffiths, Martin

    2011-01-01

    One of the author's undergraduate students recently asked him whether it was possible to generate a random positive integer. After some thought, the author realised that there were plenty of interesting mathematical ideas inherent in her question. So much so in fact, that the author decided to organise a workshop, open both to undergraduates and…

  4. Randomized branch sampling

    Harry T. Valentine

    2002-01-01

    Randomized branch sampling (RBS) is a special application of multistage probability sampling (see Sampling, environmental), which was developed originally by Jessen [3] to estimate fruit counts on individual orchard trees. In general, the method can be used to obtain estimates of many different attributes of trees or other branched plants. The usual objective of RBS is...
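
    A toy sketch of the estimator (invented tree and size measures, purely for illustration): at each fork one branch is selected with probability proportional to a size measure, and the attribute found at the selected terminal is divided by the product of the selection probabilities along the path, giving a Horvitz-Thompson-type unbiased estimate of the tree total.

        import random

        # Each node is (size_measure, children_or_leaf_attribute); leaves hold counts.
        tree = (1.0, [(3.0, [(1.0, 12), (2.0, 30)]),
                      (2.0, [(1.5, 20), (0.5, 5)])])

        def rbs_estimate(node):
            size, content = node
            prob = 1.0
            while isinstance(content, list):              # walk down to a terminal
                weights = [child[0] for child in content]
                child = random.choices(content, weights=weights)[0]
                prob *= child[0] / sum(weights)           # path selection probability
                content = child[1]
            return content / prob                         # unbiased for the total

        estimates = [rbs_estimate(tree) for _ in range(10_000)]
        print(sum(estimates) / len(estimates), "vs true total", 12 + 30 + 20 + 5)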

  5. Blood Pressure Variability and Cognitive Function Among Older African Americans: Introducing a New Blood Pressure Variability Measure.

    PubMed

    Tsang, Siny; Sperling, Scott A; Park, Moon Ho; Helenius, Ira M; Williams, Ishan C; Manning, Carol

    2017-09-01

    Although blood pressure (BP) variability has been reported to be associated with cognitive impairment, whether this relationship affects African Americans has been unclear. We sought correlations between systolic and diastolic BP variability and cognitive function in community-dwelling older African Americans, and introduced a new BP variability measure that can be applied to BP data collected in clinical practice. We assessed cognitive function in 94 cognitively normal older African Americans using the Mini-Mental State Examination (MMSE) and the Computer Assessment of Mild Cognitive Impairment (CAMCI). We used BP measurements taken at the patients' three most recent primary care clinic visits to generate three traditional BP variability indices, range, standard deviation, and coefficient of variation, plus a new index, random slope, which accounts for unequal BP measurement intervals within and across patients. MMSE scores did not correlate with any of the BP variability indices. Patients with greater diastolic BP variability were less accurate on the CAMCI verbal memory and incidental memory tasks. Results were similar across the four BP variability indices. In a sample of cognitively intact older African American adults, BP variability did not correlate with global cognitive function, as measured by the MMSE. However, higher diastolic BP variability correlated with poorer verbal and incidental memory. By accounting for differences in BP measurement intervals, our new BP variability index may help alert primary care physicians to patients at particular risk for cognitive decline.
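
    The three traditional indices are one-liners per patient; a sketch with invented readings (the new random-slope index instead requires a mixed model fit over the actual visit dates, which is what lets it handle unequal measurement intervals):

        import numpy as np

        def bp_variability(readings):
            """Range, SD, and coefficient of variation of a patient's BP readings."""
            r = np.asarray(readings, dtype=float)
            sd = r.std(ddof=1)
            return {"range": r.max() - r.min(), "sd": sd, "cv": sd / r.mean()}

        print(bp_variability([142, 128, 151]))   # systolic values from three visits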

  6. Alzheimer random walk

    NASA Astrophysics Data System (ADS)

    Odagaki, Takashi; Kasuya, Keisuke

    2017-09-01

    Using the Monte Carlo simulation, we investigate a memory-impaired self-avoiding walk on a square lattice in which a random walker marks each of the sites visited with a given probability p and makes a random walk avoiding the marked sites. Namely, p = 0 and p = 1 correspond to the simple random walk and the self-avoiding walk, respectively. When p > 0, there is a finite probability that the walker is trapped. We show that the trap time distribution can well be fitted by Stacy's Weibull distribution f(x) = b (a/b)^{(a+1)/b} [Γ((a+1)/b)]^{-1} x^a exp(-(a/b) x^b), where a and b are fitting parameters depending on p. We also find that the mean trap time diverges at p = 0 as p^{-α} with α = 1.89. In order to produce a sufficient number of long walks, we exploit the pivot algorithm and obtain the mean square displacement and its Flory exponent ν(p) as functions of p. We find that the exponent determined for 1000-step walks interpolates between both limits, ν(0) for the simple random walk and ν(1) for the self-avoiding walk, as [ν(p) - ν(0)] / [ν(1) - ν(0)] = p^β, with β = 0.388 when p ≪ 0.1 and β = 0.0822 when p ≫ 0.1. Contribution to the Topical Issue "Continuous Time Random Walk Still Trendy: Fifty-year History, Current State and Outlook", edited by Ryszard Kutner and Jaume Masoliver.
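
    The walk itself takes only a few lines to simulate; a sketch following the description above (the marking probability and repetition counts are arbitrary choices):

        import random

        def trap_time(p, max_steps=100_000):
            """Steps until the walker has no unmarked neighbor, or None."""
            x, y = 0, 0
            marked = set()
            for t in range(max_steps):
                if random.random() < p:
                    marked.add((x, y))            # mark the current site
                moves = [(x + dx, y + dy)
                         for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                         if (x + dx, y + dy) not in marked]
                if not moves:
                    return t                      # trapped
                x, y = random.choice(moves)
            return None

        times = [trap_time(0.5) for _ in range(200)]
        print("trapped walks out of 200:", sum(t is not None for t in times))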

  7. Do little interactions get lost in dark random forests?

    PubMed

    Wright, Marvin N; Ziegler, Andreas; König, Inke R

    2016-03-31

    Random forests have often been claimed to uncover interaction effects. However, if and how interaction effects can be differentiated from marginal effects remains unclear. In extensive simulation studies, we investigate whether random forest variable importance measures capture or detect gene-gene interactions. By capturing interactions we mean the ability to identify a variable that acts through an interaction with another one, while detection is the ability to identify an interaction effect as such. Of the single importance measures, the Gini importance captured interaction effects in most of the simulated scenarios; however, they were masked by marginal effects in other variables. With the permutation importance, the proportion of captured interactions was lower in all cases. Pairwise importance measures performed about equally, with a slight advantage for the joint variable importance method. However, the overall fraction of detected interactions was low. In almost all scenarios the detection fraction in a model with only marginal effects was larger than in a model with an interaction effect only. Random forests are generally capable of capturing gene-gene interactions, but current variable importance measures are unable to detect them as interactions. In most of the cases, interactions are masked by marginal effects and interactions cannot be differentiated from marginal effects. Consequently, caution is warranted when claiming that random forests uncover interactions.
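
    The flavor of these simulations is easy to reproduce; in the sketch below (an illustration, not the paper's exact design) two binary variables act purely through an XOR-style interaction, and both importance types are inspected:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.inspection import permutation_importance

        rng = np.random.default_rng(0)
        X = rng.integers(0, 2, size=(2000, 6)).astype(float)  # SNP-like binary inputs
        y = np.logical_xor(X[:, 0], X[:, 1]).astype(int)      # pure interaction effect

        forest = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)
        print("Gini importance:       ", forest.feature_importances_.round(3))
        perm = permutation_importance(forest, X, y, n_repeats=10, random_state=0)
        print("Permutation importance:", perm.importances_mean.round(3))

    High importance for the first two variables would show the interaction being captured, but neither number by itself identifies the effect as an interaction, which is the paper's point.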

  8. Random vectors and spatial analysis by geostatistics for geotechnical applications

    SciT

    Young, D.S.

    1987-08-01

    Geostatistics is extended to the spatial analysis of vector variables by defining the estimation variance and vector variogram in terms of the magnitude of difference vectors. Many random variables in geotechnology are vectorial rather than scalar, and their structural analysis requires sample variable interpolations to construct and characterize structural models. A better local estimator will result in greater quality of input models; geostatistics can provide such estimators: kriging estimators. The efficiency of geostatistics for vector variables is demonstrated in a case study of rock joint orientations in geological formations. The positive cross-validation encourages application of geostatistics to spatial analysis of random vectors in geoscience as well as various geotechnical fields including optimum site characterization, rock mechanics for mining and civil structures, cavability analysis of block cavings, petroleum engineering, and hydrologic and hydraulic modeling.
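
    On one natural reading of the construction described here, the vector variogram replaces the usual squared scalar increment with the squared magnitude of the difference vector,

        2\gamma(\mathbf{h}) = \mathrm{E}\!\left[\,\lVert \mathbf{V}(\mathbf{x}+\mathbf{h}) - \mathbf{V}(\mathbf{x}) \rVert^{2}\,\right]

    so that kriging weights can be derived exactly as in the scalar case, with the magnitudes of difference vectors supplying the structural model.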

  9. Heart rate variability.

    PubMed

    Cygankiewicz, Iwona; Zareba, Wojciech

    2013-01-01

    Heart rate variability (HRV) provides indirect insight into autonomic nervous system tone and has a well-established role as a marker of cardiovascular risk. Recent decades have brought increasing interest in HRV assessment as a diagnostic tool for detecting autonomic impairment and predicting prognosis in several neurological disorders. Both bedside analysis of simple HRV markers and more sophisticated HRV analyses, including time domain, frequency domain and nonlinear methods, have been proven to detect early autonomic involvement in several neurological disorders. Furthermore, altered HRV parameters have been shown to be related to cardiovascular risk, including the risk of sudden cardiac death, in patients with neurological diseases. This chapter reviews the clinical and prognostic applications of HRV analysis in diabetes, stroke, multiple sclerosis, muscular dystrophies, Parkinson's disease and epilepsy. © 2013 Elsevier B.V. All rights reserved.

  10. Variable speed controller

    NASA Technical Reports Server (NTRS)

    Estes, Christa; Spiggle, Charles; Swift, Shannon; Vangeffen, Stephen; Younger, Frank

    1992-01-01

    This report details a new design for a variable speed controller which can be used to operate lunar machinery without the astronaut using his or her upper body. In order to demonstrate the design, a treadle for an industrial sewing machine was redesigned to be used by a standing operator. Since the invention of an electrically powered sewing machine, the operator has been seated. Today, companies are switching from sit down to stand up operation involving modular stations. The old treadle worked well with a sitting operator, but problems have been found when trying to use the same treadle with a standing operator. Emphasis is placed on the ease of use by the operator along with the ergonomics involved. Included with the design analysis are suggestions for possible uses for the speed controller in other applications.

  11. Essential biodiversity variables

    Pereira, H.M.; Ferrier, S.; Walters, M.; Geller, G.N.; Jongman, R.H.G.; Scholes, Robert J.; Bruford, M.W.; Brummitt, N.; Butchart, S.H.M.; Cardoso, A.C.; Coops, N.C.; Dulloo, E.; Faith, D.P.; Freyhof, J.; Gregory, R.D.; Heip, C.; Höft, R.; Hurtt, G.; Jetz, W.; Karp, D.S.; McGeoch, M.A.; Obura, D.; Onada, Y.; Pettorelli, N.; Reyers, B.; Sayre, R.; Scharlemann, J.P.W.; Stuart, S.N.; Turak, E.; Walpole, M.; Wegmann, M.

    2013-01-01

    Reducing the rate of biodiversity loss and averting dangerous biodiversity change are international goals, reasserted by the Aichi Targets for 2020 by Parties to the United Nations (UN) Convention on Biological Diversity (CBD) after failure to meet the 2010 target (1, 2). However, there is no global, harmonized observation system for delivering regular, timely data on biodiversity change (3). With the first plenary meeting of the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES) soon under way, partners from the Group on Earth Observations Biodiversity Observation Network (GEO BON) (4) are developing—and seeking consensus around—Essential Biodiversity Variables (EBVs) that could form the basis of monitoring programs worldwide.

  12. On botulinum neurotoxin variability.

    PubMed

    Montecucco, Cesare; Rasotto, Maria Berica

    2015-01-06

    The rapidly growing number of botulinum neurotoxin sequences poses the problem of the possible evolutionary significance of the variability of these superpotent neurotoxins for toxin-producing Clostridium species. To progress in the understanding of this remarkable phenomenon, we suggest that researchers should (i) abandon an anthropocentric view of these neurotoxins as human botulism-causing agents or as human therapeutics, (ii) begin to investigate in depth the role of botulinum neurotoxins in animal botulism in the wilderness, and (iii) devote large efforts to next-generation sequencing of soil samples to identify novel botulinum neurotoxins. In order to compare the fitness of the different toxins, we suggest that assays of all the steps from toxin production to animal death should be performed. Copyright © 2015 Montecucco and Rasotto.

  13. Sequential variable fuel injection

    SciT

    Weglarz, M.W.; Vincent, M.T.; Prestel, J.F.

    This patent describes a fuel injection system for an engine of an automotive vehicle including cylinders, a spark plug for each of the cylinders, a distributor electrically connected to the spark plugs, a throttle body having a throttle valve connected to the engine to allow or prevent air to the cylinders, a fuel source, at least one fuel line connected to the fuel source, fuel injectors connected to the fuel line for delivering fuel to the cylinders, a sensor located near the distributor for sensing predetermined states of the distributor, and an electronic control unit (ECU) electrically connected to the sensor, distributor and fuel injectors. It comprises calculating a desired total injector on-time for current engine conditions; calculating a variable injection time (VIT) and a turn-on time based on the VIT; and firing the fuel injectors at the calculated turn-on time for the calculated total injector on-time.

  14. Variable mixer propulsion cycle

    NASA Technical Reports Server (NTRS)

    Rundell, D. J.; Mchugh, D. P.; Foster, T.; Brown, R. H. (Inventor)

    1978-01-01

    A design technique, method and apparatus are delineated for controlling the bypass gas stream pressure and varying the bypass ratio of a mixed flow gas turbine engine in order to achieve improved performance. The disclosed embodiments each include a mixing device for combining the core and bypass gas streams. The variable area mixing device permits the static pressures of the core and bypass streams to be balanced prior to mixing at widely varying bypass stream pressure levels. The mixed flow gas turbine engine therefore operates efficiently over a wide range of bypass ratios and the dynamic pressure of the bypass stream is maintained at a level which will keep the engine inlet airflow matched to an optimum design level throughout a wide range of engine thrust settings.

  15. Scenarios for Motivating the Learning of Variability: An Example in Finances

    ERIC Educational Resources Information Center

    Cordani, Lisbeth K.

    2013-01-01

    This article explores an example from finance in order to motivate the learning of random variables for beginners in statistics. In addition, it presents a relationship between the standard deviation and the range in a very specific situation.

  16. Variance approach for multi-objective linear programming with fuzzy random of objective function coefficients

    NASA Astrophysics Data System (ADS)

    Indarsih, Indrati, Ch. Rini

    2016-02-01

    In this paper, we define the variance of fuzzy random variables through alpha levels, and we prove a theorem establishing that the variance of a fuzzy random variable is a fuzzy number. We consider a multi-objective linear programming (MOLP) problem whose objective function coefficients are fuzzy random variables and solve it by a variance approach. The approach transforms the MOLP problem with fuzzy random objective function coefficients into an MOLP problem with fuzzy objective function coefficients. By a weighting method, we obtain a linear programming problem with fuzzy coefficients, which we solve by a simplex method for fuzzy linear programming.

  17. Anomalous diffusion on a random comblike structure

    NASA Astrophysics Data System (ADS)

    Havlin, Shlomo; Kiefer, James E.; Weiss, George H.

    1987-08-01

    We have recently studied a random walk on a comblike structure as an analog of diffusion on a fractal structure. In our earlier work, the comb was assumed to have a deterministic structure, with teeth of infinite length. In the present paper we study diffusion on a one-dimensional random comb, the lengths of whose teeth are random variables with an asymptotic stable-law distribution $\phi(L) \sim L^{-(1+\gamma)}$, where $0 < \gamma \le 1$. Two mean-field methods are used for the analysis, one based on the continuous-time random walk and the second a self-consistent scaling theory; both lead to the same conclusions. We find that the diffusion exponent characterizing the mean-square displacement along the backbone of the comb is $d_w = 4/(1+\gamma)$ for $\gamma < 1$ and $d_w = 2$ for $\gamma \ge 1$. The probability of being at the origin at time t is $P_0(t) \sim t^{-d_s/2}$ for large t, with $d_s = (3-\gamma)/2$ for $\gamma < 1$ and $d_s = 1$ for $\gamma > 1$. When a field is applied along the backbone of the comb, the diffusion exponent is $d_w = 2/(1+\gamma)$ for $\gamma < 1$ and $d_w = 1$ for $\gamma \ge 1$. The theoretical results are confirmed using the exact enumeration method.
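
    A rough backbone-diffusion simulation for such a comb (our sketch; the reflecting tooth tips and the wait-on-blocked-moves rule are simplifying assumptions):

      import numpy as np

      rng = np.random.default_rng(1)
      gamma, n_sites, n_walks, n_steps = 0.5, 20_001, 200, 5_000
      # Pareto-type tooth lengths: L = floor(u**(-1/gamma)) gives P(L >= l) ~ l**(-gamma)
      teeth = np.floor(rng.random(n_sites) ** (-1.0 / gamma)).astype(np.int64)

      origin = n_sites // 2
      msd = np.zeros(n_steps)
      for _ in range(n_walks):
          x, y = origin, 0
          for t in range(n_steps):
              if y == 0:                                # on the backbone
                  move = rng.integers(0, 4)
                  if move == 0 and x > 0:
                      x -= 1
                  elif move == 1 and x < n_sites - 1:
                      x += 1
                  elif move == 2 and teeth[x] > 0:
                      y = 1                             # step up into the tooth
                  # move == 3 (or a blocked move): wait this tick
              else:                                     # inside a tooth, reflecting ends
                  y = min(max(y + rng.choice((-1, 1)), 0), teeth[x])
              msd[t] += (x - origin) ** 2
      msd /= n_walks        # slope of log(msd) vs log(t) estimates 2/d_w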

  18. Correlated resistive/capacitive state variability in solid TiO2 based memory devices

    NASA Astrophysics Data System (ADS)

    Li, Qingjiang; Salaoru, Iulia; Khiat, Ali; Xu, Hui; Prodromakis, Themistoklis

    2017-05-01

    In this work, we experimentally demonstrate correlated resistive/capacitive switching and state variability in practical TiO2-based memory devices. Based on the filamentary conduction mechanism, we argue that the impedance-state variability stems from randomly distributed defects inside the oxide bulk. Finally, this assumption is verified via a current-percolation circuit model that takes into account the random distribution of defects and the coexistence of memristive and memcapacitive behavior.

  19. Packet Randomized Experiments for Eliminating Classes of Confounders

    PubMed Central

    Pavela, Greg; Wiener, Howard; Fontaine, Kevin R.; Fields, David A.; Voss, Jameson D.; Allison, David B.

    2014-01-01

    Background Although randomization is considered essential for causal inference, it is often not possible to randomize in nutrition and obesity research. To address this, we develop a framework for an experimental design—packet randomized experiments (PREs), which improves causal inferences when randomization on a single treatment variable is not possible. This situation arises when subjects are randomly assigned to a condition (such as a new roommate) which varies in one characteristic of interest (such as weight), but also varies across many others. There has been no general discussion of this experimental design, including its strengths, limitations, and statistical properties. As such, researchers are left to develop and apply PREs on an ad hoc basis, limiting its potential to improve causal inferences among nutrition and obesity researchers. Methods We introduce PREs as an intermediary design between randomized controlled trials and observational studies. We review previous research that used the PRE design and describe its application in obesity-related research, including random roommate assignments, heterochronic parabiosis, and the quasi-random assignment of subjects to geographic areas. We then provide a statistical framework to control for potential packet-level confounders not accounted for by randomization. Results PREs have successfully been used to improve causal estimates of the effect of roommates, altitude, and breastfeeding on weight outcomes. When certain assumptions are met, PREs can asymptotically control for packet-level characteristics. This has the potential to statistically estimate the effect of a single treatment even when randomization to a single treatment did not occur. Conclusions Applying PREs to obesity-related research will improve decisions about clinical, public health, and policy actions insofar as it offers researchers new insight into cause and effect relationships among variables. PMID:25444088

  20. Variable Star Observing in Hungary

    NASA Astrophysics Data System (ADS)

    Mizser, Attila

    1986-12-01

    Astronomy and variable star observing has a long history in Hungary, dating back to the private observatories erected by the Hungarian nobility in the late 19th Century. The first organized network of amateur variable star observers, the Variable Star Section of the new Hungarian Astronomical Association, was organized around the Urania Observatory in Budapest in 1948. Other groups, dedicated to various types of variables, have since been organized.

  1. Usefulness of Mendelian Randomization in Observational Epidemiology

    PubMed Central

    Bochud, Murielle; Rousson, Valentin

    2010-01-01

    Mendelian randomization refers to the random allocation of alleles at the time of gamete formation. In observational epidemiology, this refers to the use of genetic variants to estimate a causal effect between a modifiable risk factor and an outcome of interest. In this review, we recall the principles of a “Mendelian randomization” approach in observational epidemiology, which is based on the technique of instrumental variables; we provide simulations and an example based on real data to demonstrate its implications; we present the results of a systematic search on original articles having used this approach; and we discuss some limitations of this approach in view of what has been found so far. PMID:20616999
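
    A simulated illustration of the instrumental-variable logic (our toy data and effect sizes; the Wald ratio shown is the simplest such estimator, not the review's own analysis):

      import numpy as np

      rng = np.random.default_rng(7)
      n = 100_000
      G = rng.binomial(2, 0.3, n)              # genotype: allele count, randomized at conception
      U = rng.normal(size=n)                   # unobserved confounder
      X = 0.5 * G + U + rng.normal(size=n)     # modifiable exposure
      Y = 0.3 * X + U + rng.normal(size=n)     # outcome; true causal effect = 0.3

      naive = np.cov(X, Y)[0, 1] / np.var(X, ddof=1)     # biased upward by U
      wald = np.cov(G, Y)[0, 1] / np.cov(G, X)[0, 1]     # IV (Wald ratio) estimate
      print(f"naive slope {naive:.3f}, Mendelian-randomization estimate {wald:.3f}")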

  2. Transcranial Random Noise Stimulation of Visual Cortex: Stochastic Resonance Enhances Central Mechanisms of Perception.

    PubMed

    van der Groen, Onno; Wenderoth, Nicole

    2016-05-11

    Random noise enhances the detectability of weak signals in nonlinear systems, a phenomenon known as stochastic resonance (SR). Though counterintuitive at first, SR has been demonstrated in a variety of naturally occurring processes, including human perception, where it has been shown that adding noise directly to weak visual, tactile, or auditory stimuli enhances detection performance. These results indicate that random noise can push subthreshold receptor potentials across the transfer threshold, causing action potentials in an otherwise silent afference. Despite the wealth of evidence demonstrating SR for noise added to a stimulus, relatively few studies have explored whether noise added directly to cortical networks enhances sensory detection. Here we administered transcranial random noise stimulation (tRNS; 100-640 Hz zero-mean Gaussian white noise) to the occipital region of human participants. For increasing tRNS intensities (ranging from 0 to 1.5 mA), the detection accuracy of a visual stimulus changed according to an inverted-U-shaped function, typical of the SR phenomenon. When the optimal level of noise was added to visual cortex, detection performance improved significantly relative to a zero-noise condition (9.7 ± 4.6%) and to a similar extent as optimal noise added to the visual stimuli (11.2 ± 4.7%). Our results demonstrate that adding noise to cortical networks can improve human behavior and that tRNS is an appropriate tool to exploit this mechanism. Our findings suggest that neural processing at the network level exhibits nonlinear system properties that are sensitive to the stochastic resonance phenomenon and highlight the usefulness of tRNS as a tool to modulate human behavior. Since tRNS can be applied to all cortical areas, exploiting the SR phenomenon is not restricted to the perceptual domain, but can be used for other functions that depend on nonlinear neural dynamics (e.g., decision making, task switching, response inhibition, and
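
    A toy threshold-detector model of the inverted-U (ours, unrelated to the study's analysis pipeline): a subthreshold signal crosses a hard threshold only with the help of zero-mean Gaussian noise, and discriminability peaks at an intermediate noise level.

      import numpy as np

      rng = np.random.default_rng(3)
      signal, threshold, trials = 0.8, 1.0, 20_000     # the signal alone never crosses

      for sigma in (0.05, 0.1, 0.2, 0.4, 0.8, 1.6):
          hit = np.mean(signal + sigma * rng.normal(size=trials) > threshold)
          fa = np.mean(sigma * rng.normal(size=trials) > threshold)   # noise-only trials
          print(f"noise sd {sigma:4.2f}: hit - false alarm = {hit - fa:+.3f}")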

  3. Smooth random change point models.

    PubMed

    van den Hout, Ardo; Muniz-Terrera, Graciela; Matthews, Fiona E

    2011-03-15

    Change point models are used to describe processes over time that show a change in direction. An example of such a process is cognitive ability, where a decline a few years before death is sometimes observed. A broken-stick model consists of two linear parts and a breakpoint where the two lines intersect. Alternatively, models can be formulated that imply a smooth change between the two linear parts. Change point models can be extended by adding random effects to account for variability between subjects. A new smooth change point model is introduced and examples are presented that show how change point models can be estimated using functions in R for mixed-effects models. The Bayesian inference using WinBUGS is also discussed. The methods are illustrated using data from a population-based longitudinal study of ageing, the Cambridge City over 75 Cohort Study. The aim is to identify how many years before death individuals experience a change in the rate of decline of their cognitive ability. Copyright © 2010 John Wiley & Sons, Ltd.
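
    A minimal fitting sketch (ours; it uses a Bacon-Watts-style tanh transition for the smooth change point and, unlike the paper's mixed-effects models, omits the subject-level random effects):

      import numpy as np
      from scipy.optimize import curve_fit

      def smooth_stick(t, b0, b1, b2, tau, gamma):
          """Mean curve: slopes b1 - b2 before tau and b1 + b2 after, blended over width gamma."""
          z = t - tau
          return b0 + b1 * z + b2 * z * np.tanh(z / gamma)

      rng = np.random.default_rng(5)
      t = np.linspace(0, 10, 200)
      y = smooth_stick(t, 25.0, -0.5, -1.0, 7.0, 0.5) + rng.normal(scale=0.8, size=t.size)

      popt, _ = curve_fit(smooth_stick, t, y, p0=(20.0, 0.0, -1.0, 5.0, 1.0))
      print("estimated change point:", round(popt[3], 2))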

  4. Preservice Teachers' Understanding of Variable

    ERIC Educational Resources Information Center

    Brown, Sue; Bergman, Judy

    2013-01-01

    This study examines the research on middle school students' understanding of variables and explores preservice elementary and middle school teachers' knowledge of variables. According to research studies, middle school students have limited understanding of variables. Many studies have examined the performance of middle school students and offered…

  5. Variable Screening for Cluster Analysis.

    ERIC Educational Resources Information Center

    Donoghue, John R.

    Inclusion of irrelevant variables in a cluster analysis adversely affects subgroup recovery. This paper examines using moment-based statistics to screen variables; only variables that pass the screening are then used in clustering. Normal mixtures are analytically shown often to possess negative kurtosis. Two related measures, "m" and…

  6. Certified randomness in quantum physics.

    PubMed

    Acín, Antonio; Masanes, Lluis

    2016-12-07

    The concept of randomness plays an important part in many disciplines. On the one hand, the question of whether random processes exist is fundamental for our understanding of nature. On the other, randomness is a resource for cryptography, algorithms and simulations. Standard methods for generating randomness rely on assumptions about the devices that are often not valid in practice. However, quantum technologies enable new methods for generating certified randomness, based on the violation of Bell inequalities. These methods are referred to as device-independent because they do not rely on any modelling of the devices. Here we review efforts to design device-independent randomness generators and the associated challenges.

  7. The RANDOM computer program: A linear congruential random number generator

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1986-01-01

    The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer is described. This document describes the following: (1) the RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) the RANCYCLE and ARITH Computer Programs that provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
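
    For illustration, the linear congruential recurrence itself takes only a few lines in any language; a Python sketch with the widely used MINSTD parameters (which are not necessarily the parameters the report selects):

      class LCG:
          """x_{n+1} = (a * x_n + c) mod m, returned scaled to [0, 1)."""
          def __init__(self, seed=1, a=16807, c=0, m=2**31 - 1):
              self.state = seed % m
              self.a, self.c, self.m = a, c, m     # with c = 0 the seed must be nonzero
          def next(self):
              self.state = (self.a * self.state + self.c) % self.m
              return self.state / self.m

      gen = LCG(seed=12345)
      print([round(gen.next(), 6) for _ in range(5)])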

  8. Efficient Variable Selection Method for Exposure Variables on Binary Data

    NASA Astrophysics Data System (ADS)

    Ohno, Manabu; Tarumi, Tomoyuki

    In this paper, we propose a new variable selection method for "robust" exposure variables. We define "robust" as the property that the same variable is selected from both the original data and perturbed data; there have been few studies of effective selection methods of this kind. The problem of selecting exposure variables is almost the same as that of extracting correlation rules without robustness. [Brin 97] suggested that correlation rules can be extracted efficiently on binary data using the chi-squared statistic of a contingency table, which has a monotone property. However, the chi-squared value itself is not monotone, so a completely independent variable set is easily judged to be dependent as the dimension increases, and the method is therefore not usable for selecting robust exposure variables. To select robust independent variables, we assume an anti-monotone property for independent variables and use the apriori algorithm, one of the algorithms that find association rules in market-basket data; the apriori algorithm exploits the anti-monotone property of the support measure defined for association rules. Independence does not strictly have an anti-monotone property on the AIC of the independence probability model, but the tendency toward anti-monotonicity is strong, so variables selected using anti-monotonicity of the AIC are robust. Our method judges whether a certain variable is an exposure variable for an independent variable by comparison of the AIC values. Our numerical experiments show that our method can select robust exposure variables efficiently and precisely.

  9. Nova-like variables

    NASA Technical Reports Server (NTRS)

    Ladous, Constanze

    1993-01-01

    On the grounds of different observable characteristics, five classes of nova-like objects are distinguished: the UX Ursae Majoris stars, the anti-dwarf novae, the DQ Herculis stars, the AM Herculis stars, and the AM Canum Venaticorum stars. Some objects have not been classified specifically. Nova-like stars share most observable features with dwarf novae, except for the outburst behavior. The understanding is that dwarf novae, UX Ursae Majoris stars, and anti-dwarf novae are basically the same sort of object. The difference between them is that in UX Ursae Majoris stars the mass transfer through the accretion disc is always high, so the disc is stationary all the time; in anti-dwarf novae the mass transfer occasionally drops considerably for some time, for reasons unknown; and in dwarf novae it is low enough for the disc to undergo semiperiodic changes between high and low accretion events. DQ Herculis stars are believed to possess weakly magnetic white dwarfs which disrupt the inner disc at some distance from the central star; the rotation of the white dwarf can be seen as an additional photometric period. In AM Herculis stars, a strongly magnetic white dwarf entirely prevents the formation of an accretion disc and at the same time locks the rotation of the white dwarf to the binary orbit. Finally, AM Canum Venaticorum stars are believed to be cataclysmic variables that consist of two white dwarf components.

  10. Titan's Variable Plasma Interaction

    NASA Astrophysics Data System (ADS)

    Ledvina, S. A.; Brecht, S. H.

    2015-12-01

    Cassini observations have found that the plasma and magnetic field conditions upstream of Titan are far more complex than they were thought to be after the Voyager encounter. Rymer et al. (2009) used the Cassini Plasma Spectrometer (CAPS) electron observations to classify the plasma conditions along Titan's orbit into five types (Plasma Sheet, Lobe, Mixed, Magnetosheath and Misc.). Nemeth et al. (2011) found that the CAPS ion observations could also be separated into the same plasma regions as defined by Rymer et al. Additionally, the T-96 encounter found Titan in the solar wind, adding a sixth classification. Understanding the effects of the variable upstream plasma conditions on Titan's plasma interaction and on the evolution of Titan's ionosphere/atmosphere is one of the main objectives of the Cassini mission. To complement the mission, we perform hybrid simulations of Titan's plasma interaction to examine the effects of the incident plasma distribution function and the flow velocity. We closely examine the results on Titan's induced magnetosphere and the resulting pickup ion properties.

  11. Variable transmittance electrochromic windows

    SciT

    Rauh, R.D.

    1983-11-01

    Electrochromic apertures based on RF sputtered thin films of WO3 are projected to have widely different sunlight attenuation properties when converted to MxWO3 (M = H, Li, Na, Ag, etc.), depending on the initial preparation conditions. Amorphous WO3, prepared at low temperature, has a coloration spectrum centered in the visible, while high temperature crystalline WO3 attenuates infrared light most efficiently, but appears to become highly reflective at high values of x. The possibility therefore exists of producing variable light transmission apertures of the general form (a-MxWO3/FIC/c-WO3), where the FIC is an ion conducting thin film, such as LiAlF4 (for M = Li). The attenuation of 90% of the solar spectrum requires an injected charge of 30 to 40 mcoul/sq cm in either amorphous or crystalline WO3, corresponding to 0.2 Whr/sq m per coloration cycle. In order to produce windows with very high solar transparency in the bleached form, new counter electrode materials must be found with complementary electrochromism to WO3.

  12. Variable transmittance electrochromic windows

    NASA Astrophysics Data System (ADS)

    Rauh, R. D.

    1983-11-01

    Electrochromic apertures based on RF sputtered thin films of WO3 are projected to have widely different sunlight attenuation properties when converted to MxWO3 (M = H, Li, Na, Ag, etc.), depending on the initial preparation conditions. Amorphous WO3, prepared at low temperature, has a coloration spectrum centered in the visible, while high temperature crystalline WO3 attenuates infrared light most efficiently, but appears to become highly reflective at high values of x. The possibility therefore exists of producing variable light transmission apertures of the general form (a-MxWO3/FIC/c-WO3), where the FIC is an ion conducting thin film, such as LiAlF4 (for M = Li). The attenuation of 90% of the solar spectrum requires an injected charge of 30 to 40 mcoul/sq cm in either amorphous or crystalline WO3, corresponding to 0.2 Whr/sq m per coloration cycle. In order to produce windows with very high solar transparency in the bleached form, new counter electrode materials must be found with complementary electrochromism to WO3.

  13. Randomizing Roaches: Exploring the "Bugs" of Randomization in Experimental Design

    ERIC Educational Resources Information Center

    Wagler, Amy; Wagler, Ron

    2014-01-01

    Understanding the roles of random selection and random assignment in experimental design is a central learning objective in most introductory statistics courses. This article describes an activity, appropriate for a high school or introductory statistics course, designed to teach the concepts, values and pitfalls of random selection and assignment…

  14. Competitive Facility Location with Random Demands

    NASA Astrophysics Data System (ADS)

    Uno, Takeshi; Katagiri, Hideki; Kato, Kosuke

    2009-10-01

    This paper proposes a new location problem for competitive facilities, e.g. shops and stores, with uncertain demands in the plane. By representing the demands for facilities as random variables, the location problem is formulated as a stochastic programming problem, and for finding its solution three deterministic programming problems are considered: an expectation maximizing problem, a probability maximizing problem, and a satisfying level maximizing problem. After showing that an optimal solution of each can be found by solving 0-1 programming problems, a solution method is proposed that improves the tabu search algorithm with strategic vibration. The efficiency of the solution method is shown by applying it to numerical examples of facility location problems.
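
    A Monte Carlo sketch of the expectation-maximizing variant (a simplification of ours: nearest-facility customer choice in the unit square, exponential random demands, and brute-force scoring of candidates in place of the tabu search):

      import numpy as np

      rng = np.random.default_rng(11)
      customers = rng.random((50, 2))          # customer sites in the unit square
      competitor = np.array([0.5, 0.5])        # existing competing facility
      candidates = rng.random((20, 2))         # candidate sites for the new facility

      def expected_capture(site, n_scenarios=500):
          """Expected demand captured by `site` when customers buy from the nearest facility."""
          total = 0.0
          for _ in range(n_scenarios):
              demand = rng.exponential(1.0, len(customers))   # random demand weights
              ours = np.linalg.norm(customers - site, axis=1)
              theirs = np.linalg.norm(customers - competitor, axis=1)
              total += demand[ours < theirs].sum()
          return total / n_scenarios

      best = max(candidates, key=expected_capture)
      print("best candidate location:", best)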

  15. Greenland Glacier Albedo Variability

    NASA Technical Reports Server (NTRS)

    2004-01-01

    The Program for Arctic Regional Climate Assessment (PARCA) is a NASA-funded project with the prime goal of addressing the mass balance of the Greenland ice sheet. Since the formal initiation of the program in 1995, there has been a significant improvement in the estimates of the mass balance of the ice sheet. Results from this program reveal that the high-elevation regions of the ice sheet are approximately in balance, but the margins are thinning. Laser surveys reveal significant thinning along 70 percent of the ice sheet periphery below 2000 m elevations, and in at least one outlet glacier, Kangerdlugssuaq in southeast Greenland, thinning has been as much as 10 m/yr. This study examines the albedo variability in four outlet glaciers to help separate out the relative contributions of surface melting versus ice dynamics to the recent mass balance changes. Analysis of AVHRR Polar Pathfinder albedo shows that at the Petermann and Jakobshavn glaciers there has been a negative trend in albedo at the glacier terminus from 1981 to 2000, whereas the Storstrømmen and Kangerdlugssuaq glaciers show slightly positive trends in albedo. These findings are consistent with recent observations of melt extent from passive microwave data, which show more melt on the western side of Greenland and slightly less on the eastern side. The significance of albedo trends will depend on where and when the albedo changes occur. Since the majority of surface melt occurs on the shallow-sloping western margin of the ice sheet, where shortwave radiation dominates the summer energy balance (e.g., the Jakobshavn region), this region will be more sensitive to changes in albedo than regions where this is not the case. Near the Jakobshavn glacier, even larger changes in albedo have been observed, with decreases of as much as 20 percent per decade.

  16. Greenland Glacier Albedo Variability

    NASA Astrophysics Data System (ADS)

    2004-01-01

    The Program for Arctic Regional Climate Assessment (PARCA) is a NASA-funded project with the prime goal of addressing the mass balance of the Greenland ice sheet. Since the formal initiation of the program in 1995, there has been a significant improvement in the estimates of the mass balance of the ice sheet. Results from this program reveal that the high-elevation regions of the ice sheet are approximately in balance, but the margins are thinning. Laser surveys reveal significant thinning along 70 percent of the ice sheet periphery below 2000 m elevations, and in at least one outlet glacier, Kangerdlugssuaq in southeast Greenland, thinning has been as much as 10 m/yr. This study examines the albedo variability in four outlet glaciers to help separate out the relative contributions of surface melting versus ice dynamics to the recent mass balance changes. Analysis of AVHRR Polar Pathfinder albedo shows that at the Petermann and Jakobshavn glaciers there has been a negative trend in albedo at the glacier terminus from 1981 to 2000, whereas the Storstrømmen and Kangerdlugssuaq glaciers show slightly positive trends in albedo. These findings are consistent with recent observations of melt extent from passive microwave data, which show more melt on the western side of Greenland and slightly less on the eastern side. The significance of albedo trends will depend on where and when the albedo changes occur. Since the majority of surface melt occurs on the shallow-sloping western margin of the ice sheet, where shortwave radiation dominates the summer energy balance (e.g., the Jakobshavn region), this region will be more sensitive to changes in albedo than regions where this is not the case. Near the Jakobshavn glacier, even larger changes in albedo have been observed, with decreases of as much as 20 percent per decade.

  17. Variable Selection for Regression Models of Percentile Flows

    NASA Astrophysics Data System (ADS)

    Fouad, G.

    2017-12-01

    Percentile flows describe the flow magnitude equaled or exceeded for a given percent of time, and are widely used in water resource management. However, these statistics are normally unavailable since most basins are ungauged. Percentile flows of ungauged basins are often predicted using regression models based on readily observable basin characteristics, such as mean elevation. The number of these independent variables is too large to evaluate all possible models. A subset of models is typically evaluated using automatic procedures, like stepwise regression. This ignores a large variety of methods from the field of feature (variable) selection and physical understanding of percentile flows. A study of 918 basins in the United States was conducted to compare an automatic regression procedure to the following variable selection methods: (1) principal component analysis, (2) correlation analysis, (3) random forests, (4) genetic programming, (5) Bayesian networks, and (6) physical understanding. The automatic regression procedure only performed better than principal component analysis. Poor performance of the regression procedure was due to a commonly used filter for multicollinearity, which rejected the strongest models because they had cross-correlated independent variables. Multicollinearity did not decrease model performance in validation because of a representative set of calibration basins. Variable selection methods based strictly on predictive power (numbers 2-5 from above) performed similarly, likely indicating a limit to the predictive power of the variables. Similar performance was also reached using variables selected based on physical understanding, a finding that substantiates recent calls to emphasize physical understanding in modeling for predictions in ungauged basins. The strongest variables highlighted the importance of geology and land cover, whereas widely used topographic variables were the weakest predictors. Variables suffered from a high

  18. Automatic identification of variables in epidemiological datasets using logic regression.

    PubMed

    Lorenz, Matthias W; Abdi, Negin Ashtiani; Scheckenbach, Frank; Pflug, Anja; Bülbül, Alpaslan; Catapano, Alberico L; Agewall, Stefan; Ezhov, Marat; Bots, Michiel L; Kiechl, Stefan; Orth, Andreas

    2017-04-13

    For an individual participant data (IPD) meta-analysis, multiple datasets must be transformed into a consistent format, e.g. using uniform variable names. When large numbers of datasets have to be processed, this can be a time-consuming and error-prone task. Automated or semi-automated identification of variables can help to reduce the workload and improve data quality. For semi-automation, high sensitivity in the recognition of matching variables is particularly important, because it allows creating software which, for a target variable, presents a choice of source variables from which a user can choose the matching one, with only a low risk of having missed a correct source variable. For each variable in a set of target variables, a number of simple rules were manually created. With logic regression, an optimal Boolean combination of these rules was sought for every target variable, using a random subset of a large database of epidemiological and clinical cohort data (construction subset). These optimal combination rules were then validated in a second subset of the database (validation subset). In the construction sample, the 41 target variables were allocated with an average positive predictive value (PPV) of 34% and a negative predictive value (NPV) of 95%. In the validation sample, the PPV was 33%, whereas the NPV remained at 94%. The PPV was 50% or less for 63% of all variables in the construction sample, and for 71% of all variables in the validation sample. We demonstrate that the application of logic regression to a complex data management task in large epidemiological IPD meta-analyses is feasible. However, the performance of the algorithm is poor, which may require backup strategies.
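
    A hand-written miniature of the underlying idea (ours; the rules and names are invented for illustration): simple name-based rules are combined into one Boolean expression, whose PPV can be checked on labelled examples. Logic regression searches for such optimal combinations automatically.

      def rule_sbp(name):
          """Boolean combination of simple rules for 'systolic blood pressure' variables."""
          n = name.lower()
          has_sys = "sys" in n or "sbp" in n
          has_press = "bp" in n or "press" in n
          return has_sys and has_press and "diast" not in n

      labelled = [("sysbp_v1", True), ("SBP_mean", True), ("sys_press", True),
                  ("diastbp", False), ("sys_score", False), ("age", False)]
      tp = sum(rule_sbp(n) and y for n, y in labelled)
      fp = sum(rule_sbp(n) and not y for n, y in labelled)
      print("PPV on this toy set:", tp / (tp + fp))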

  19. Random numbers from vacuum fluctuations

    SciT

    Shi, Yicheng; Kurtsiefer, Christian, E-mail: christian.kurtsiefer@gmail.com; Center for Quantum Technologies, National University of Singapore, 3 Science Drive 2, Singapore 117543

    2016-07-25

    We implement a quantum random number generator based on a balanced homodyne measurement of vacuum fluctuations of the electromagnetic field. The digitized signal is directly processed with a fast randomness extraction scheme based on a linear feedback shift register. The random bit stream is continuously read in a computer at a rate of about 480 Mbit/s and passes an extended test suite for random numbers.
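
    A toy model of such a pipeline (ours, with large simplifications): pseudorandom Gaussian samples stand in for the digitized vacuum-quadrature data, and a von Neumann extractor stands in for the LFSR-based randomness extraction scheme.

      import numpy as np

      rng = np.random.default_rng(9)
      samples = rng.normal(size=100_000)           # stand-in for homodyne quadrature data
      raw = (samples > 0).astype(np.uint8)         # one raw bit per sample

      def von_neumann(bits):
          """Debias by keeping the first bit of unequal pairs, discarding 00 and 11."""
          pairs = bits[: len(bits) // 2 * 2].reshape(-1, 2)
          keep = pairs[:, 0] != pairs[:, 1]
          return pairs[keep, 0]

      out = von_neumann(raw)
      print(len(out), "extracted bits, mean", out.mean())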

  20. Missing Not at Random Models for Latent Growth Curve Analyses

    ERIC Educational Resources Information Center

    Enders, Craig K.

    2011-01-01

    The past decade has seen a noticeable shift in missing data handling techniques that assume a missing at random (MAR) mechanism, where the propensity for missing data on an outcome is related to other analysis variables. Although MAR is often reasonable, there are situations where this assumption is unlikely to hold, leading to biased parameter…

  1. Computer-Assisted Dieting: Effects of a Randomized Nutrition Intervention

    ERIC Educational Resources Information Center

    Schroder, Kerstin E. E.

    2011-01-01

    Objectives: To compare the effects of a computer-assisted dieting intervention (CAD) with and without self-management training on dieting among 55 overweight and obese adults. Methods: Random assignment to a single-session nutrition intervention (CAD-only) or a combined CAD plus self-management group intervention (CADG). Dependent variables were…

  2. The Not-so-Random Drunkard's Walk

    ERIC Educational Resources Information Center

    Ehrhardt, George

    2013-01-01

    This dataset contains the results of a quasi-experiment, testing Karl Pearson's "drunkard's walk" analogy for an abstract random walk. Inspired by the alternate hypothesis that drunkards stumble to the side of their dominant hand, it includes data on intoxicated test subjects walking a 10' line. Variables include: the…

  3. Control of variable speed variable pitch wind turbine based on a disturbance observer

    NASA Astrophysics Data System (ADS)

    Ren, Haijun; Lei, Xin

    2017-11-01

    In this paper, a novel sliding mode controller based on a disturbance observer (DOB) is developed and analyzed to optimize the efficiency of a variable speed variable pitch (VSVP) wind turbine. Because the VSVP system is highly nonlinear, the model is linearized to obtain a state-space model of the system. A conventional sliding mode controller is then designed, and a DOB is added to estimate the wind speed. The proposed control strategy can successfully deal with the random nature of wind speed, the nonlinearity of the VSVP system, parameter uncertainty, and external disturbances. Adding the observer to the sliding mode controller greatly reduces the chattering produced by the sliding mode switching gain. The simulation results show that the proposed control system is effective and robust.

  4. Investigating the Randomness of Numbers

    ERIC Educational Resources Information Center

    Pendleton, Kenn L.

    2009-01-01

    The use of random numbers is pervasive in today's world. Random numbers have practical applications in such far-flung arenas as computer simulations, cryptography, gambling, the legal system, statistical sampling, and even the war on terrorism. Evaluating the randomness of extremely large samples is a complex, intricate process. However, the…

  5. Cluster Randomized Controlled Trial

    PubMed Central

    Young, John; Chapman, Katie; Nixon, Jane; Patel, Anita; Holloway, Ivana; Mellish, Kirste; Anwar, Shamaila; Breen, Rachel; Knapp, Martin; Murray, Jenni; Farrin, Amanda

    2015-01-01

    Background and Purpose— We developed a new postdischarge system of care comprising a structured assessment covering longer-term problems experienced by patients with stroke and their carers, linked to evidence-based treatment algorithms and reference guides (the longer-term stroke care system of care) to address the poor longer-term recovery experienced by many patients with stroke. Methods— A pragmatic, multicentre, cluster randomized controlled trial of this system of care. Eligible patients referred to community-based Stroke Care Coordinators were randomized to receive the new system of care or usual practice. The primary outcome was improved patient psychological well-being (General Health Questionnaire-12) at 6 months; secondary outcomes included functional outcomes for patients, carer outcomes, and cost-effectiveness. Follow-up was through self-completed postal questionnaires at 6 and 12 months. Results— Thirty-two stroke services were randomized (29 participated); 800 patients (399 control; 401 intervention) and 208 carers (100 control; 108 intervention) were recruited. In intention to treat analysis, the adjusted difference in patient General Health Questionnaire-12 mean scores at 6 months was −0.6 points (95% confidence interval, −1.8 to 0.7; P=0.394) indicating no evidence of statistically significant difference between the groups. Costs of Stroke Care Coordinator inputs, total health and social care costs, and quality-adjusted life year gains at 6 months, 12 months, and over the year were similar between the groups. Conclusions— This robust trial demonstrated no benefit in clinical or cost-effectiveness outcomes associated with the new system of care compared with usual Stroke Care Coordinator practice. Clinical Trial Registration— URL: http://www.controlled-trials.com. Unique identifier: ISRCTN 67932305. PMID:26152298

  6. Variable Sampling Mapping

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey, S.; Aronstein, David L.; Dean, Bruce H.; Lyon, Richard G.

    2012-01-01

    The performance of an optical system (for example, a telescope) is limited by the misalignments and manufacturing imperfections of the optical elements in the system. The impact of these misalignments and imperfections can be quantified by the phase variations imparted on light traveling through the system. Phase retrieval is a methodology for determining these variations. Phase retrieval uses images taken with the optical system and a light source of known shape and characteristics. Unlike interferometric methods, which require an optical reference for comparison, and unlike Shack-Hartmann wavefront sensors, which require special optical hardware at the optical system's exit pupil, phase retrieval is an in situ, image-based method for determining the phase variations of light at the system's exit pupil. Phase retrieval can be used both as an optical metrology tool (during fabrication of optical surfaces and assembly of optical systems) and as a sensor used in active, closed-loop control of an optical system, to optimize performance. One class of phase-retrieval algorithms is the iterative transform algorithm (ITA). ITAs estimate the phase variations by iteratively enforcing known constraints in the exit pupil and at the detector, determined from modeled or measured data. The Variable Sampling Mapping (VSM) technique is a new method for enforcing these constraints in ITAs. VSM is an open framework for addressing a wide range of issues that have previously been considered detrimental to high-accuracy phase retrieval, including undersampled images, broadband illumination, images taken at or near best focus, chromatic aberrations, jitter or vibration of the optical system or detector, and dead or noisy detector pixels. The VSM is a model-to-data mapping procedure. In VSM, fully sampled electric fields at multiple wavelengths are modeled inside the phase-retrieval algorithm, and then these fields are mapped to intensities on the light detector, using the properties
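
    For orientation, a bare-bones iterative transform loop in the Gerchberg-Saxton style (our toy sketch, far simpler than VSM; it alternately enforces the known pupil amplitude and the measured detector magnitudes):

      import numpy as np

      rng = np.random.default_rng(10)
      n = 64
      pupil_amp = np.ones((n, n))                        # known illumination of the pupil
      true_phase = rng.normal(scale=0.3, size=(n, n))    # synthetic unknown aberration
      det_amp = np.abs(np.fft.fft2(pupil_amp * np.exp(1j * true_phase)))  # "measured" data

      phase = np.zeros((n, n))                           # initial guess
      for _ in range(200):
          F = np.fft.fft2(pupil_amp * np.exp(1j * phase))
          F = det_amp * np.exp(1j * np.angle(F))         # detector-plane constraint
          phase = np.angle(np.fft.ifft2(F))              # pupil-plane constraint
      # phase now approximates true_phase up to the usual piston/twin-image ambiguities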

  7. A new numerical benchmark for variably saturated variable-density flow and transport in porous media

    NASA Astrophysics Data System (ADS)

    Guevara, Carlos; Graf, Thomas

    2016-04-01

    In subsurface hydrological systems, spatial and temporal variations in solute concentration and/or temperature may affect fluid density and viscosity. These variations can lead to potentially unstable situations in which a dense fluid overlies a less dense fluid. Such situations can produce instabilities that appear as dense plume fingers migrating downwards, counteracted by vertical upwards flow of freshwater (Simmons et al., Transp. Porous Media, 2002). As a result of unstable variable-density flow, solute transport rates are increased over large distances and times compared with constant-density flow. The numerical simulation of variable-density flow in saturated and unsaturated media requires corresponding benchmark problems against which a computer model can be validated (Diersch and Kolditz, Adv. Water Resour., 2002). Recorded data from a laboratory-scale experiment of variable-density flow and solute transport in saturated and unsaturated porous media (Simmons et al., 2002) are used to define a new numerical benchmark. The HydroGeoSphere code (Therrien et al., 2004) coupled with PEST (www.pesthomepage.org) is used to obtain an optimized parameter set capable of adequately representing the data set of Simmons et al. (2002). Fingering in the numerical model is triggered using random hydraulic conductivity fields; due to the inherent randomness, a large number of simulations were conducted in this study. The optimized benchmark model adequately predicts the plume behavior and the fate of solutes. This benchmark is useful for model verification of variable-density flow problems in saturated and/or unsaturated media.
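
    A sketch of one common way to build the random hydraulic conductivity fields used to trigger fingering (ours; the grid size, correlation length and log-variance are illustrative, and HydroGeoSphere itself is not involved):

      import numpy as np
      from scipy.ndimage import gaussian_filter

      rng = np.random.default_rng(2)
      white = rng.normal(size=(128, 128))
      field = gaussian_filter(white, sigma=4)      # impose a spatial correlation length
      field *= 0.5 / field.std()                   # set the variance of log-K
      K = 1e-4 * np.exp(field)                     # log-normal K, geometric mean 1e-4 m/s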

  8. Assessing the accuracy and stability of variable selection ...

    EPA Pesticide Factsheets

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological datasets there is limited guidance on variable selection methods for RF modeling. Typically, either a preselected set of predictor variables are used, or stepwise procedures are employed which iteratively add/remove variables according to their importance measures. This paper investigates the application of variable selection methods to RF models for predicting probable biological stream condition. Our motivating dataset consists of the good/poor condition of n=1365 stream survey sites from the 2008/2009 National Rivers and Stream Assessment, and a large set (p=212) of landscape features from the StreamCat dataset. Two types of RF models are compared: a full variable set model with all 212 predictors, and a reduced variable set model selected using a backwards elimination approach. We assess model accuracy using RF's internal out-of-bag estimate, and a cross-validation procedure with validation folds external to the variable selection process. We also assess the stability of the spatial predictions generated by the RF models to changes in the number of predictors, and argue that model selection needs to consider both accuracy and stability. The results suggest that RF modeling is robust to the inclusion of many variables of moderate to low importance. We found no substanti

  9. Can randomization be informative?

    NASA Astrophysics Data System (ADS)

    Pereira, Carlos A. B.; Campos, Thiago F.; Silva, Gustavo M.; Wechsler, Sergio

    2012-10-01

    In this paper, the Pair of Siblings paradox introduced by Pereira [1] is extended by considering more than two children and more than one child observed for gender. We follow the same lines as Wechsler et al. [2], who generalize the Three Prisoners dilemma introduced by Gardner [3]. This paper's conjecture is that the Pair of Siblings and the Three Prisoners dilemma are dual paradoxes: looking at the possible likelihoods, the sure (randomized) selection is non-informative (informative) for the former, while the opposite holds for the latter. This situation is maintained under generalization. A non-informative likelihood here means that the prior and the posterior are equal.

  10. RNA-seq: technical variability and sampling

    PubMed Central

    2011-01-01

    Background RNA-seq is revolutionizing the way we study transcriptomes. mRNA can be surveyed without prior knowledge of gene transcripts. Alternative splicing of transcript isoforms and the identification of previously unknown exons are being reported. Initial reports of differences in exon usage and splicing between samples, as well as quantitative differences among samples, are beginning to surface. Biological variation has been reported to be larger than technical variation, and technical variation has been reported to be in line with expectations due to random sampling. However, strategies for dealing with technical variation will differ depending on its magnitude. The size of the technical variance and the role of sampling are examined in this manuscript. Results In this study three independent Solexa/Illumina experiments containing technical replicates are analyzed. When coverage is low, large disagreements between technical replicates are apparent. Exon detection between technical replicates is highly variable when the coverage is less than 5 reads per nucleotide, and estimates of gene expression are more likely to disagree when coverage is low, although large disagreements in the estimates of expression are observed at all levels of coverage. Conclusions Technical variability is too high to ignore. Technical variability results in inconsistent detection of exons at low levels of coverage. Further, the estimates of the relative abundance of a transcript can substantially disagree, even when coverage levels are high. This may be due to the low sampling fraction; if so, it will persist as an issue needing to be addressed in experimental design even as the next wave of technology produces larger numbers of reads. We provide practical recommendations for dealing with the technical variability, without dramatic cost increases. PMID:21645359
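
    A toy sampling model of the coverage effect (ours): reads are drawn multinomially from a fixed transcript pool, and detection agreement between two technical replicates is compared across sequencing depths.

      import numpy as np

      rng = np.random.default_rng(4)
      true_expr = rng.lognormal(mean=2.0, sigma=1.5, size=5_000)   # 5000 "exons"
      p = true_expr / true_expr.sum()

      for depth in (10_000, 100_000, 1_000_000):
          rep1 = rng.multinomial(depth, p)     # technical replicate 1
          rep2 = rng.multinomial(depth, p)     # technical replicate 2
          agree = ((rep1 > 0) == (rep2 > 0)).mean()
          print(f"depth {depth:>9,}: replicates agree on detection for {agree:.1%} of exons")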

  11. Dairy consumption, systolic blood pressure, and risk of hypertension: Mendelian randomization study

    Objective: To examine whether previous observed inverse associations of dairy intake with systolic blood pressure and risk of hypertension were causal. Design: Mendelian randomization study using the single nucleotide polymorphism rs4988235 related to lactase persistence as an instrumental variable...

  12. Dynamic Quantum Allocation and Swap-Time Variability in Time-Sharing Operating Systems.

    ERIC Educational Resources Information Center

    Bhat, U. Narayan; Nance, Richard E.

    The effects of dynamic quantum allocation and swap-time variability on central processing unit (CPU) behavior are investigated using a model that allows both quantum length and swap-time to be state-dependent random variables. Effective CPU utilization is defined to be the proportion of a CPU busy period that is devoted to program processing, i.e.…

  13. A Composite Likelihood Inference in Latent Variable Models for Ordinal Longitudinal Responses

    ERIC Educational Resources Information Center

    Vasdekis, Vassilis G. S.; Cagnone, Silvia; Moustaki, Irini

    2012-01-01

    The paper proposes a composite likelihood estimation approach that uses bivariate instead of multivariate marginal probabilities for ordinal longitudinal responses using a latent variable model. The model considers time-dependent latent variables and item-specific random effects to be accountable for the interdependencies of the multivariate…

  14. Bias and Bias Correction in Multisite Instrumental Variables Analysis of Heterogeneous Mediator Effects

    ERIC Educational Resources Information Center

    Reardon, Sean F.; Unlu, Fatih; Zhu, Pei; Bloom, Howard S.

    2014-01-01

    We explore the use of instrumental variables (IV) analysis with a multisite randomized trial to estimate the effect of a mediating variable on an outcome in cases where it can be assumed that the observed mediator is the only mechanism linking treatment assignment to outcomes, an assumption known in the IV literature as the exclusion restriction.…

  15. A generator for unique quantum random numbers based on vacuum states

    NASA Astrophysics Data System (ADS)

    Gabriel, Christian; Wittmann, Christoffer; Sych, Denis; Dong, Ruifang; Mauerer, Wolfgang; Andersen, Ulrik L.; Marquardt, Christoph; Leuchs, Gerd

    2010-10-01

    Random numbers are a valuable component in diverse applications that range from simulations over gambling to cryptography. The quest for true randomness in these applications has engendered a large variety of different proposals for producing random numbers based on the foundational unpredictability of quantum mechanics. However, most approaches do not consider that a potential adversary could have knowledge about the generated numbers, so the numbers are not verifiably random and unique. Here we present a simple experimental setup based on homodyne measurements that uses the purity of a continuous-variable quantum vacuum state to generate unique random numbers. We use the intrinsic randomness in measuring the quadratures of a mode in the lowest energy vacuum state, which cannot be correlated to any other state. The simplicity of our source, combined with its verifiably unique randomness, are important attributes for achieving high-reliability, high-speed and low-cost quantum random number generators.

  16. Observability-based Local Path Planning and Collision Avoidance Using Bearing-only Measurements

    DTIC Science & Technology

    2012-01-20

    Clark N. Taylor (Department of Electrical and Computer Engineering, Brigham Young University, Provo, Utah, 84602; Sensors Directorate, Air Force Research Laboratory). ... vit is the measurement noise that is assumed to be a zero-mean Gaussian random variable. Based on the state transition model expressed by Eqs. (1

  17. Quantifying Variability of Avian Colours: Are Signalling Traits More Variable?

    PubMed Central

    Delhey, Kaspar; Peters, Anne

    2008-01-01

    Background Increased variability in sexually selected ornaments, a key assumption of evolutionary theory, is thought to be maintained through condition-dependence. Condition-dependent handicap models of sexual selection predict that (a) sexually selected traits show amplified variability compared to equivalent non-sexually selected traits; that (b), since males are usually the sexually selected sex, males are more variable than females; and that (c) sexually dimorphic traits are more variable than monomorphic ones. So far these predictions have only been tested for metric traits. Surprisingly, they have not been examined for bright coloration, one of the most prominent sexual traits. This omission stems from computational difficulties: different types of colours are quantified on different scales, precluding the use of coefficients of variation. Methodology/Principal Findings Based on physiological models of avian colour vision, we develop an index to quantify the degree of discriminable colour variation as it can be perceived by conspecifics. A comparison of variability in ornamental and non-ornamental colours in six bird species confirmed (a) that those coloured patches that are sexually selected or act as indicators of quality show increased chromatic variability. However, we found no support for (b) that males generally show higher levels of variability than females, or (c) that sexual dichromatism per se is associated with increased variability. Conclusions/Significance We show that it is currently possible to realistically estimate the variability of animal colours as perceived by them, something difficult to achieve with other traits. The increased variability of known sexually-selected/quality-indicating colours in the studied species provides support for the predictions borne of sexual selection theory, but the lack of increased overall variability in males or dimorphic colours in general indicates that sexual differences might not always be shaped by similar selective

  18. Random-walk enzymes.

    PubMed

    Mak, Chi H; Pham, Phuong; Afif, Samir A; Goodman, Myron F

    2015-09-01

    Enzymes that rely on random walk to search for substrate targets in a heterogeneously dispersed medium can leave behind complex spatial profiles of their catalyzed conversions. The catalytic signatures of these random-walk enzymes are the result of two coupled stochastic processes: scanning and catalysis. Here we develop analytical models to understand the conversion profiles produced by these enzymes, comparing an intrusive model, in which scanning and catalysis are tightly coupled, against a loosely coupled passive model. Diagrammatic theory and path-integral solutions of these models revealed clearly distinct predictions. Comparison to experimental data from catalyzed deaminations deposited on single-stranded DNA by the enzyme activation-induced deoxycytidine deaminase (AID) demonstrates that catalysis and diffusion are strongly intertwined, where the chemical conversions give rise to new stochastic trajectories that were absent if the substrate DNA was homogeneous. The C→U deamination profiles in both analytical predictions and experiments exhibit a strong contextual dependence, where the conversion rate of each target site is strongly contingent on the identities of other surrounding targets, with the intrusive model showing an excellent fit to the data. These methods can be applied to deduce sequence-dependent catalytic signatures of other DNA modification enzymes, with potential applications to cancer, gene regulation, and epigenetics.
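
    A minimal sketch in the spirit of the loosely coupled, passive model (ours; the lattice size and catalysis rate are illustrative): a scanning enzyme performs a 1D random walk over target sites and converts the occupied site with a small probability per step, leaving a spatial conversion profile.

      import numpy as np

      rng = np.random.default_rng(8)
      L, n_steps, p_cat = 200, 20_000, 0.01
      converted = np.zeros(L, dtype=bool)
      x = L // 2
      for _ in range(n_steps):
          x = min(max(x + rng.choice((-1, 1)), 0), L - 1)   # random walk, reflecting ends
          if not converted[x] and rng.random() < p_cat:
              converted[x] = True                           # C -> U style conversion
      print(converted.sum(), "sites converted")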

  19. Random-walk enzymes

    PubMed Central

    Mak, Chi H.; Pham, Phuong; Afif, Samir A.; Goodman, Myron F.

    2015-01-01

    Enzymes that rely on random walk to search for substrate targets in a heterogeneously dispersed medium can leave behind complex spatial profiles of their catalyzed conversions. The catalytic signatures of these random-walk enzymes are the result of two coupled stochastic processes: scanning and catalysis. Here we develop analytical models to understand the conversion profiles produced by these enzymes, comparing an intrusive model, in which scanning and catalysis are tightly coupled, against a loosely coupled passive model. Diagrammatic theory and path-integral solutions of these models revealed clearly distinct predictions. Comparison to experimental data from catalyzed deaminations deposited on single-stranded DNA by the enzyme activation-induced deoxycytidine deaminase (AID) demonstrates that catalysis and diffusion are strongly intertwined, where the chemical conversions give rise to new stochastic trajectories that were absent if the substrate DNA was homogeneous. The C → U deamination profiles in both analytical predictions and experiments exhibit a strong contextual dependence, where the conversion rate of each target site is strongly contingent on the identities of other surrounding targets, with the intrusive model showing an excellent fit to the data. These methods can be applied to deduce sequence-dependent catalytic signatures of other DNA modification enzymes, with potential applications to cancer, gene regulation, and epigenetics. PMID:26465508

  20. Random catalytic reaction networks

    NASA Astrophysics Data System (ADS)

    Stadler, Peter F.; Fontana, Walter; Miller, John H.

    1993-03-01

    We study networks that are a generalization of replicator (or Lotka-Volterra) equations. They model the dynamics of a population of object types whose binary interactions determine the specific type of interaction product. Such a system always reduces its dimension to a subset that contains production pathways for all of its members. The network equation can be rewritten at a level of collectives in terms of two basic interaction patterns: replicator sets and cyclic transformation pathways among sets. Although the system contains well-known cases that exhibit very complicated dynamics, the generic behavior of randomly generated systems is found (numerically) to be extremely robust: convergence to a globally stable rest point. It is easy to tailor networks that display replicator interactions where the replicators are entire self-sustaining subsystems, rather than structureless units. A numerical scan of random systems highlights the special properties of elementary replicators: they reduce the effective interconnectedness of the system, resulting in enhanced competition, and strong correlations between the concentrations.
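
    The network equation described in this abstract lends itself to a compact numerical sketch. The following is a minimal, hypothetical Python illustration (not the authors' code): each pair of types (i, j) maps to a randomly chosen product type, and the outflow term keeps concentrations normalized, so randomly generated networks can be scanned for convergence to a stable rest point. All parameter values are arbitrary.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n = 8  # number of object types

    # Random catalytic network: the interaction of types i and j yields
    # a randomly chosen product type prod[i, j].
    prod = rng.integers(0, n, size=(n, n))

    def step(x, dt=0.01):
        """One Euler step of dx_k/dt = sum_{i,j} x_i x_j [prod(i,j) = k] - x_k * Phi,
        where Phi is the total production flux, keeping x on the simplex."""
        gain = np.zeros(n)
        for i in range(n):
            for j in range(n):
                gain[prod[i, j]] += x[i] * x[j]
        return x + dt * (gain - x * gain.sum())

    x = rng.dirichlet(np.ones(n))  # random initial concentrations
    for _ in range(20000):
        x = step(x)
    print(np.round(x, 3))  # generically converges to a globally stable rest point
    ```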

  1. Quincke random walkers

    NASA Astrophysics Data System (ADS)

    Pradillo, Gerardo; Heintz, Aneesh; Vlahovska, Petia

    2017-11-01

    The spontaneous rotation of a sphere in an applied uniform DC electric field (Quincke effect) has been utilized to engineer self-propelled particles: if the sphere is initially resting on a surface, it rolls. The Quincke rollers have been widely used as a model system to study collective behavior in "active" suspensions. If the applied field is DC, an isolated Quincke roller follows a straight-line trajectory. In this talk, we discuss the design of a Quincke roller that executes a random-walk-like behavior. We utilize an AC field: upon reversal of the field direction, a fluctuation in the axis of rotation (which is degenerate in the plane perpendicular to the field and parallel to the surface) introduces randomness in the direction of motion. The MSD of an isolated Quincke walker depends on frequency, amplitude, and waveform of the electric field. Experiment and theory are compared. We also investigate the collective behavior of Quincke walkers, the transport of inert particles in a bath of Quincke walkers, and the spontaneous motion of a drop containing Quincke active particles. Supported by NSF Grant CBET 1437545.

  2. Random-walk enzymes

    NASA Astrophysics Data System (ADS)

    Mak, Chi H.; Pham, Phuong; Afif, Samir A.; Goodman, Myron F.

    2015-09-01

    Enzymes that rely on random walk to search for substrate targets in a heterogeneously dispersed medium can leave behind complex spatial profiles of their catalyzed conversions. The catalytic signatures of these random-walk enzymes are the result of two coupled stochastic processes: scanning and catalysis. Here we develop analytical models to understand the conversion profiles produced by these enzymes, comparing an intrusive model, in which scanning and catalysis are tightly coupled, against a loosely coupled passive model. Diagrammatic theory and path-integral solutions of these models revealed clearly distinct predictions. Comparison to experimental data from catalyzed deaminations deposited on single-stranded DNA by the enzyme activation-induced deoxycytidine deaminase (AID) demonstrates that catalysis and diffusion are strongly intertwined, where the chemical conversions give rise to new stochastic trajectories that were absent if the substrate DNA was homogeneous. The C →U deamination profiles in both analytical predictions and experiments exhibit a strong contextual dependence, where the conversion rate of each target site is strongly contingent on the identities of other surrounding targets, with the intrusive model showing an excellent fit to the data. These methods can be applied to deduce sequence-dependent catalytic signatures of other DNA modification enzymes, with potential applications to cancer, gene regulation, and epigenetics.

  3. Randomness in Competitions

    NASA Astrophysics Data System (ADS)

    Ben-Naim, E.; Hengartner, N. W.; Redner, S.; Vazquez, F.

    2013-05-01

    We study the effects of randomness on competitions based on an elementary random process in which there is a finite probability that a weaker team upsets a stronger team. We apply this model to sports leagues and sports tournaments, and compare the theoretical results with empirical data. Our model shows that single-elimination tournaments are efficient but unfair: the number of games is proportional to the number of teams N, but the probability that the weakest team wins decays only algebraically with N. In contrast, leagues, where every team plays every other team, are fair but inefficient: the top √N teams remain in contention for the championship, while the probability that the weakest team becomes champion is exponentially small. We also propose a gradual elimination schedule that consists of a preliminary round and a championship round. Initially, teams play a small number of preliminary games, and subsequently, a few teams qualify for the championship round. This algorithm is fair and efficient: the best team wins with a high probability and the number of games scales as N^(9/5), whereas traditional leagues require N^3 games to fairly determine a champion.
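
    The upset process in this abstract is easy to simulate. Below is a minimal sketch (not the authors' code) of a single-elimination bracket in which a weaker team beats a stronger one with a fixed probability q; the value q = 0.25, the trial count, and the bracket sizes are illustrative assumptions.

    ```python
    import random

    def play(a, b, q):
        """One game: the weaker team (larger label) upsets the stronger
        (smaller label) with probability q."""
        weaker, stronger = max(a, b), min(a, b)
        return weaker if random.random() < q else stronger

    def single_elimination(n_teams, q):
        """Simulate one bracket over teams 0..n_teams-1 (0 = strongest)
        and return the champion; uses n_teams - 1 games in total."""
        teams = list(range(n_teams))
        random.shuffle(teams)  # random seeding
        while len(teams) > 1:
            teams = [play(teams[i], teams[i + 1], q) for i in range(0, len(teams), 2)]
        return teams[0]

    # How often does the weakest team win as N grows?
    q, trials = 0.25, 20000
    for n in (4, 16, 64):
        wins = sum(single_elimination(n, q) == n - 1 for _ in range(trials))
        print(f"N={n:3d}: P(weakest team is champion) ~ {wins / trials:.4f}")
    ```

    Consistent with the abstract, the weakest team's winning probability in this toy model decays only algebraically with N.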

  4. Randoms Counter Analysis

    NASA Astrophysics Data System (ADS)

    Hensley, Winston; Giovanetti, Kevin

    2008-10-01

    A 1 ppm precision measurement of the muon lifetime is being conducted by the MULAN collaboration. The reason for this new measurement lies in recent advances in theory that have reduced the uncertainty in calculating the Fermi Coupling Constant from the measured lifetime to a few tenths of a ppm. The largest uncertainty is now experimental. To achieve a 1 ppm level of precision it is necessary to control all sources of systematic error and to understand their influences on the lifetime measurement. James Madison University is contributing by examining the response of the timing system to uncorrelated events, "randoms". A radioactive source was placed in front of paired detectors similar to those in the main experiment. These detectors were integrated in an identical fashion into the data acquisition and measurement system, and data from these detectors were recorded during the entire experiment. The pair was placed in a shielded enclosure away from the main experiment to minimize interference. The data from these detectors should have a flat time spectrum, as the decay of a radioactive source is a random event and has no time correlation. Thus the spectrum can be used as an important diagnostic in studying the method of determining event times and timing system performance.

  5. VARIABLE TIME-INTERVAL GENERATOR

    DOEpatents

    Gross, J.E.

    1959-10-31

    This patent relates to a pulse generator and more particularly to a time interval generator wherein the time interval between pulses is precisely determined. The variable time generator comprises two oscillators with one having a variable frequency output and the other a fixed frequency output. A frequency divider is connected to the variable oscillator for dividing its frequency by a selected factor and a counter is used for counting the periods of the fixed oscillator occurring during a cycle of the divided frequency of the variable oscillator. This defines the period of the variable oscillator in terms of that of the fixed oscillator. A circuit is provided for selecting as a time interval a predetermined number of periods of the variable oscillator. The output of the generator consists of a first pulse produced by a trigger circuit at the start of the time interval and a second pulse marking the end of the time interval produced by the same trigger circuit.

  6. Smooth conditional distribution function and quantiles under random censorship.

    PubMed

    Leconte, Eve; Poiraud-Casanova, Sandrine; Thomas-Agnan, Christine

    2002-09-01

    We consider a nonparametric random design regression model in which the response variable is possibly right censored. The aim of this paper is to estimate the conditional distribution function and the conditional alpha-quantile of the response variable. We restrict attention to the case where the response variable as well as the explanatory variable are unidimensional and continuous. We propose and discuss two classes of estimators which are smooth with respect to the response variable as well as to the covariate. Some simulations demonstrate that the new methods have better mean square error performances than the generalized Kaplan-Meier estimator introduced by Beran (1981) and considered in the literature by Dabrowska (1989, 1992) and Gonzalez-Manteiga and Cadarso-Suarez (1994).

  7. Modeling variability in porescale multiphase flow experiments

    SciT

    Ling, Bowen; Bao, Jie; Oostrom, Mart

    Microfluidic devices and porescale numerical models are commonly used to study multiphase flow in biological, geological, and engineered porous materials. In this work, we perform a set of drainage and imbibition experiments in six identical microfluidic cells to study the reproducibility of multiphase flow experiments. We observe significant variations in the experimental results, which are smaller during the drainage stage and larger during the imbibition stage. We demonstrate that these variations are due to sub-porescale geometry differences in microcells (because of manufacturing defects) and variations in the boundary condition (i.e.,fluctuations in the injection rate inherent to syringe pumps). Computational simulationsmore » are conducted using commercial software STAR-CCM+, both with constant and randomly varying injection rate. Stochastic simulations are able to capture variability in the experiments associated with the varying pump injection rate.« less

  8. Variable Selection through Correlation Sifting

    NASA Astrophysics Data System (ADS)

    Huang, Jim C.; Jojic, Nebojsa

    Many applications of computational biology require a variable selection procedure to sift through a large number of input variables and select some smaller number that influence a target variable of interest. For example, in virology, only some small number of viral protein fragments influence the nature of the immune response during viral infection. Due to the large number of variables to be considered, a brute-force search for the subset of variables is in general intractable. To approximate this, methods based on ℓ1-regularized linear regression have been proposed and have been found to be particularly successful. It is well understood however that such methods fail to choose the correct subset of variables if these are highly correlated with other "decoy" variables. We present a method for sifting through sets of highly correlated variables which leads to higher accuracy in selecting the correct variables. The main innovation is a filtering step that reduces correlations among variables to be selected, making the ℓ1-regularization effective for datasets on which many methods for variable selection fail. The filtering step changes both the values of the predictor variables and output values by projections onto components obtained through a computationally-inexpensive principal components analysis. In this paper we demonstrate the usefulness of our method on synthetic datasets and on novel applications in virology. These include HIV viral load analysis based on patients' HIV sequences and immune types, as well as the analysis of seasonal variation in influenza death rates based on the regions of the influenza genome that undergo diversifying selection in the previous season.
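
    As a rough illustration of the filtering idea sketched above (this is one schematic reading of the method, not the authors' implementation): project out the leading principal component from both the predictors and the response, then apply ℓ1-regularized regression to the residuals. The data-generating model, the alpha value, and the single-component choice below are arbitrary assumptions.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)
    n, p = 300, 40

    # Highly correlated predictors: one latent factor drives many "decoy" variables.
    latent = rng.normal(size=(n, 1))
    X = latent + 0.3 * rng.normal(size=(n, p))
    y = X[:, 0] + 0.5 * rng.normal(size=n)   # only variable 0 is truly relevant

    # Filtering step: remove the leading principal component from both the
    # predictors and the response, reducing correlations among variables.
    pca = PCA(n_components=1).fit(X)
    scores = pca.transform(X)
    X_f = X - pca.inverse_transform(scores)
    beta, *_ = np.linalg.lstsq(scores, y, rcond=None)
    y_f = y - scores @ beta

    # l1-regularized regression is now effective on the filtered data.
    selected = np.flatnonzero(Lasso(alpha=0.05).fit(X_f, y_f).coef_)
    print("selected variables:", selected)
    ```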

  9. Subjective randomness as statistical inference.

    PubMed

    Griffiths, Thomas L; Daniels, Dylan; Austerweil, Joseph L; Tenenbaum, Joshua B

    2018-06-01

    Some events seem more random than others. For example, when tossing a coin, a sequence of eight heads in a row does not seem very random. Where do these intuitions about randomness come from? We argue that subjective randomness can be understood as the result of a statistical inference assessing the evidence that an event provides for having been produced by a random generating process. We show how this account provides a link to previous work relating randomness to algorithmic complexity, in which random events are those that cannot be described by short computer programs. Algorithmic complexity is both incomputable and too general to capture the regularities that people can recognize, but viewing randomness as statistical inference provides two paths to addressing these problems: considering regularities generated by simpler computing machines, and restricting the set of probability distributions that characterize regularity. Building on previous work exploring these different routes to a more restricted notion of randomness, we define strong quantitative models of human randomness judgments that apply not just to binary sequences - which have been the focus of much of the previous work on subjective randomness - but also to binary matrices and spatial clustering. Copyright © 2018 Elsevier Inc. All rights reserved.
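
    The statistical-inference view of randomness can be made concrete with a toy likelihood-ratio computation. The sketch below is an illustration under assumed models, not the authors' full framework: it scores a coin-flip string by the evidence that it came from a fair random generator rather than a simple "regular" repetition-biased process; the p_repeat value is an arbitrary assumption.

    ```python
    import math

    def log_p_random(seq):
        """Log-probability of the sequence under a fair Bernoulli(0.5) generator."""
        return len(seq) * math.log(0.5)

    def log_p_regular(seq, p_repeat=0.9):
        """Log-probability under a toy 'regular' generator: a Markov chain
        that repeats the previous symbol with probability p_repeat."""
        logp = math.log(0.5)  # first symbol is equally likely H or T
        for prev, cur in zip(seq, seq[1:]):
            logp += math.log(p_repeat if cur == prev else 1 - p_repeat)
        return logp

    def randomness_score(seq):
        """Log-likelihood ratio: evidence that seq was produced by the random process."""
        return log_p_random(seq) - log_p_regular(seq)

    print(randomness_score("HHHHHHHH"))  # strongly negative: looks regular
    print(randomness_score("HTHHTHTT"))  # positive: favors the random process
    ```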

  10. Randomized controlled trials in mild cognitive impairment

    PubMed Central

    Thomas, Ronald G.; Aisen, Paul S.; Mohs, Richard C.; Carrillo, Maria C.; Albert, Marilyn S.

    2017-01-01

    Objective: To examine the variability in performance among placebo groups in randomized controlled trials for mild cognitive impairment (MCI). Methods: Placebo group data were obtained from 2 National Institute on Aging (NIA) MCI randomized controlled trials, the Alzheimer's Disease Cooperative Study (ADCS) MCI trial and the Alzheimer's Disease Neuroimaging Initiative (ADNI), which is a simulated clinical trial, in addition to industry-sponsored clinical trials involving rivastigmine, galantamine, rofecoxib, and donepezil. The data were collated for common measurement instruments. The performance of the placebo participants from these studies was tracked on the Alzheimer's Disease Assessment Scale–cognitive subscale, Mini-Mental State Examination, and Clinical Dementia Rating–sum of boxes, and for progression on these measures to prespecified clinical study endpoints. APOE status, where available, was also analyzed for its effects. Results: The progression to clinical endpoints varied a great deal among the trials. The expected performances were seen for the participants in the 2 NIA trials, ADCS and ADNI, with generally worsening of performance over time; however, the industry-sponsored trials largely showed stable or improved performance in their placebo participants. APOE4 carrier status influenced results in an expected fashion on the study outcomes, including rates of progression and cognitive subscales. Conclusions: In spite of apparently similar criteria for MCI being adopted by the 7 studies, the implementation of the criteria varied a great deal. Several explanations including instruments used to characterize participants and variability among study populations contributed to the findings. PMID:28381516

  11. Fatigue Tests with Random Flight Simulation Loading

    NASA Technical Reports Server (NTRS)

    Schijve, J.

    1972-01-01

    Crack propagation was studied in a full-scale wing structure under different simulated flight conditions. Omission of low-amplitude gust cycles had a small effect on the crack rate. Truncation of the infrequently occurring high-amplitude gust cycles to a lower level had a noticeably accelerating effect on crack growth. The application of fail-safe load (100 percent limit load) effectively stopped subsequent crack growth under resumed flight-simulation loading. In another flight-simulation test series on sheet specimens, the variables studied are the design stress level and the cyclic frequency of the random gust loading. Inflight mean stresses vary from 5.5 to 10.0 kg/sq mm. The effect of the stress level is larger for the 2024 alloy than for the 7075 alloy. Three frequencies were employed: namely, 10 cps, 1 cps, and 0.1 cps. The frequency effect was small. The advantages and limitations of flight-simulation tests are compared with those of alternative test procedures such as constant-amplitude tests, program tests, and random-load tests. Various testing purposes are considered. The variables of flight-simulation tests are listed and their effects are discussed. A proposal is made for performing systematic flight-simulation tests in such a way that the compiled data may be used as a source of reference.

  12. Eliminating Survivor Bias in Two-stage Instrumental Variable Estimators.

    PubMed

    Vansteelandt, Stijn; Walter, Stefan; Tchetgen Tchetgen, Eric

    2018-07-01

    Mendelian randomization studies commonly focus on elderly populations. This makes the instrumental variables analysis of such studies sensitive to survivor bias, a type of selection bias. A particular concern is that the instrumental variable conditions, even when valid for the source population, may be violated for the selective population of individuals who survive the onset of the study. This is potentially very damaging because Mendelian randomization studies are known to be sensitive to bias due to even minor violations of the instrumental variable conditions. Interestingly, the instrumental variable conditions continue to hold within certain risk sets of individuals who are still alive at a given age when the instrument and unmeasured confounders exert additive effects on the exposure, and moreover, the exposure and unmeasured confounders exert additive effects on the hazard of death. In this article, we will exploit this property to derive a two-stage instrumental variable estimator for the effect of exposure on mortality, which is insulated against the above described selection bias under these additivity assumptions.

  13. Is the Sun a Long Period Variable

    NASA Technical Reports Server (NTRS)

    Sonett, Charles P.

    1990-01-01

    The inventory of atmospheric radiocarbon exhibits quasi-periodic variations of mean period λ̄ = 269 years over the entire 9000 year record. But the period is inconstant and subject to random variability (σ_m^(1/2) = 119 years). The radiocarbon maxima correspond to the quasiperiodic extension of the Maunder minimum throughout the Holocene and resolve the long-standing issue of Maunder cyclicity. The radiocarbon maxima are amplitude modulated by the approx. 2300 year period and thus vary significantly in peak value. The approx. 2300 year period in turn appears to not be modulated by the secular geomagnetic variation. Detection of a Maunder-like sequence of minima in tree ring growth of Bristlecone pine and its correlation with the Maunder (1890, 1922) cyclicity in the radiocarbon record supports the inference that solar forcing of the radiocarbon record is accompanied by a corresponding forcing of growth of timberline Bristlecone pine. Because of the random component of the Maunder period, prediction of climate, if tied to the Maunder cycle other than probabilistically, is significantly hindered. For the mean Maunder period of 269 years, the probability is 67 percent that a given climatic maximum lies anywhere between 150 and 388 years.

  14. Strategic Use of Random Subsample Replication and a Coefficient of Factor Replicability

    ERIC Educational Resources Information Center

    Katzenmeyer, William G.; Stenner, A. Jackson

    1975-01-01

    The problem of demonstrating replicability of factor structure across random variables is addressed. Procedures are outlined which combine the use of random subsample replication strategies with the correlations between factor score estimates across replicate pairs to generate a coefficient of replicability and confidence intervals associated with…

  15. Energy density and variability in abundance of pigeon guillemot prey: Support for the quality-variability trade-off hypothesis

    Litzow, Michael A.; Piatt, John F.; Abookire, Alisa A.; Robards, Martin D.

    2004-01-01

    1. The quality-variability trade-off hypothesis predicts that (i) energy density (kJ g⁻¹) and spatial-temporal variability in abundance are positively correlated in nearshore marine fishes; and (ii) prey selection by a nearshore piscivore, the pigeon guillemot (Cepphus columba Pallas), is negatively affected by variability in abundance. 2. We tested these predictions with data from a 4-year study that measured fish abundance with beach seines and pigeon guillemot prey utilization with visual identification of chick meals. 3. The first prediction was supported. Pearson's correlation showed that fishes with higher energy density were more variable on seasonal (r = 0.71) and annual (r = 0.66) time scales. Higher energy density fishes were also more abundant overall (r = 0.85) and more patchy at a scale of tens of km (r = 0.77). 4. Prey utilization by pigeon guillemots was strongly non-random. Relative preference, defined as the difference between log-ratio transformed proportions of individual prey taxa in chick diets and beach seine catches, was significantly different from zero for seven of the eight main prey categories. 5. The second prediction was also supported. We used principal component analysis (PCA) to summarize variability in correlated prey characteristics (energy density, availability and variability in abundance). Two PCA scores explained 32% of observed variability in pigeon guillemot prey utilization. Seasonal variability in abundance was negatively weighted by these PCA scores, providing evidence of risk-averse selection. Prey availability, energy density and km-scale variability in abundance were positively weighted. 6. Trophic interactions are known to create variability in resource distribution in other systems. We propose that links between resource quality and the strength of trophic interactions may produce resource quality-variability trade-offs.

  16. Gossip in Random Networks

    NASA Astrophysics Data System (ADS)

    Malarz, K.; Szvetelszky, Z.; Szekfü, B.; Kulakowski, K.

    2006-11-01

    We consider the average probability X of being informed on a gossip in a given social network. The network is modeled within the random graph theory of Erdős and Rényi. In this theory, a network is characterized by two parameters: the size N and the link probability p. Our experimental data suggest three levels of social inclusion of friendship. The critical value p_c, for which half of agents are informed, scales with the system size as N^(-γ) with γ ≈ 0.68. Computer simulations show that the probability X varies with p as a sigmoidal curve. Influence of the correlations between neighbors is also evaluated: with increasing clustering coefficient C, X decreases.
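
    A minimal way to reproduce the sigmoidal X(p) behavior is to spread a gossip over samples of G(N, p) and record the informed fraction. The sketch below is a simplified illustration that ignores the paper's friendship-inclusion levels; the network size, probabilities, and repetition counts are arbitrary assumptions.

    ```python
    import random
    from itertools import combinations

    def erdos_renyi(n, p, rng):
        """Adjacency lists of one sample of the G(n, p) random graph."""
        adj = {v: set() for v in range(n)}
        for u, v in combinations(range(n), 2):
            if rng.random() < p:
                adj[u].add(v)
                adj[v].add(u)
        return adj

    def informed_fraction(n, p, rng):
        """Start a gossip at one random agent, spread it along all edges,
        and return the fraction of agents eventually informed (a crude
        proxy for the probability X discussed above)."""
        adj = erdos_renyi(n, p, rng)
        start = rng.randrange(n)
        informed, frontier = {start}, [start]
        while frontier:
            v = frontier.pop()
            for w in adj[v]:
                if w not in informed:
                    informed.add(w)
                    frontier.append(w)
        return len(informed) / n

    rng = random.Random(1)
    n = 200
    for p in (0.002, 0.005, 0.01, 0.02):
        x = sum(informed_fraction(n, p, rng) for _ in range(50)) / 50
        print(f"p={p:.3f}: mean informed fraction ~ {x:.2f}")
    ```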

  17. THE COLOR VARIABILITY OF QUASARS

    SciT

    Schmidt, Kasper B.; Rix, Hans-Walter; Knecht, Matthias

    2012-01-10

    We quantify quasar color variability using an unprecedented variability database: ugriz photometry of 9093 quasars from Sloan Digital Sky Survey (SDSS) Stripe 82, observed over 8 years at ~60 epochs each. We confirm previous reports that quasars become bluer when brightening. We find a redshift dependence of this blueing in a given set of bands (e.g., g and r), but show that it is the result of the flux contribution from less-variable or delayed emission lines in the different SDSS bands at different redshifts. After correcting for this effect, quasar color variability is remarkably uniform, and independent not only of redshift, but also of quasar luminosity and black hole mass. The color variations of individual quasars, as they vary in brightness on year timescales, are much more pronounced than the ranges in color seen in samples of quasars across many orders of magnitude in luminosity. This indicates distinct physical mechanisms behind quasar variability and the observed range of quasar luminosities at a given black hole mass: quasar variations cannot be explained by changes in the mean accretion rate. We do find some dependence of the color variability on the characteristics of the flux variations themselves, with fast, low-amplitude brightness variations producing more color variability. The observed behavior could arise if quasar variability results from flares or ephemeral hot spots in an accretion disk.

  18. Variability in human body size

    NASA Technical Reports Server (NTRS)

    Annis, J. F.

    1978-01-01

    The range of variability found among homogeneous groups is described and illustrated. Those trends that show significantly marked differences between sexes and among a number of racial/ethnic groups are also presented. Causes of human-body size variability discussed include genetic endowment, aging, nutrition, protective garments, and occupation. The information is presented to aid design engineers of space flight hardware and equipment.

  19. Operant Variability: Procedures and Processes

    ERIC Educational Resources Information Center

    Machado, Armando; Tonneau, Francois

    2012-01-01

    Barba's (2012) article deftly weaves three main themes in one argument about operant variability. From general theoretical considerations on operant behavior (Catania, 1973), Barba derives methodological guidelines about response differentiation and applies them to the study of operant variability. In the process, he uncovers unnoticed features of…

  20. Predictor variables of clergy pedophiles.

    PubMed

    Ruzicka, M F

    1997-10-01

    File data on familial traits, past sexual experience as a victim, and other traits identified in the literature as leading toward pedophilia, were summarized for 10 convicted clergy pedophiles to construct a set of variables possibly useful for screening. Further research is underway to identify trauma in early life and those personality-related variables current studies indicate as relevant.

  1. Speed control variable rate irrigation

    Speed control variable rate irrigation (VRI) is used to address within field variability by controlling a moving sprinkler’s travel speed to vary the application depth. Changes in speed are commonly practiced over areas that slope, pond or where soil texture is predominantly different. Dynamic presc...

  2. Tutorial in Biostatistics: Instrumental Variable Methods for Causal Inference*

    PubMed Central

    Baiocchi, Michael; Cheng, Jing; Small, Dylan S.

    2014-01-01

    A goal of many health studies is to determine the causal effect of a treatment or intervention on health outcomes. Often, it is not ethically or practically possible to conduct a perfectly randomized experiment and instead an observational study must be used. A major challenge to the validity of observational studies is the possibility of unmeasured confounding (i.e., unmeasured ways in which the treatment and control groups differ before treatment administration which also affect the outcome). Instrumental variables analysis is a method for controlling for unmeasured confounding. This type of analysis requires the measurement of a valid instrumental variable, which is a variable that (i) is independent of the unmeasured confounding; (ii) affects the treatment; and (iii) affects the outcome only indirectly through its effect on the treatment. This tutorial discusses the types of causal effects that can be estimated by instrumental variables analysis; the assumptions needed for instrumental variables analysis to provide valid estimates of causal effects and sensitivity analysis for those assumptions; methods of estimation of causal effects using instrumental variables; and sources of instrumental variables in health studies. PMID:24599889
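
    The two-stage logic of instrumental variables estimation can be demonstrated in a few lines. The following sketch is illustrative only (the coefficients and noise levels are invented for the simulation): it shows how a valid instrument recovers a causal effect that naive regression misestimates under unmeasured confounding.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 5000

    # Simulated data with unmeasured confounding: U affects treatment and outcome.
    u = rng.normal(size=n)                       # unmeasured confounder
    z = rng.binomial(1, 0.5, size=n)             # instrument: affects treatment only
    x = 0.8 * z + 0.9 * u + rng.normal(size=n)   # treatment
    y = 2.0 * x + 1.5 * u + rng.normal(size=n)   # outcome; true causal effect = 2

    # Naive regression of y on x is biased by U.
    naive = np.polyfit(x, y, 1)[0]

    # Two-stage least squares:
    # Stage 1: regress the treatment on the instrument, keep fitted values.
    x_hat = np.poly1d(np.polyfit(z, x, 1))(z)
    # Stage 2: regress the outcome on the fitted treatment values.
    tsls = np.polyfit(x_hat, y, 1)[0]

    print(f"naive OLS estimate: {naive:.2f}  (biased away from 2.0)")
    print(f"2SLS estimate:      {tsls:.2f}  (close to the true effect 2.0)")
    ```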

  3. The Hubble Catalog of Variables

    NASA Astrophysics Data System (ADS)

    Gavras, P.; Bonanos, A. Z.; Bellas-Velidis, I.; Charmandaris, V.; Georgantopoulos, I.; Hatzidimitriou, D.; Kakaletris, G.; Karampelas, A.; Laskaris, N.; Lennon, D. J.; Moretti, M. I.; Pouliasis, E.; Sokolovsky, K.; Spetsieri, Z. T.; Tsinganos, K.; Whitmore, B. C.; Yang, M.

    2017-06-01

    The Hubble Catalog of Variables (HCV) is a 3 year ESA funded project that aims to develop a set of algorithms to identify variables among the sources included in the Hubble Source Catalog (HSC) and produce the HCV. We will process all HSC sources with more than a predefined number of measurements in a single filter/instrument combination and compute a range of lightcurve features to determine the variability status of each source. At the end of the project, the first release of the Hubble Catalog of Variables will be made available at the Mikulski Archive for Space Telescopes (MAST) and the ESA Science Archives. The variability detection pipeline will be implemented at the Space Telescope Science Institute (STScI) so that updated versions of the HCV may be created following the future releases of the HSC.

  4. An Undergraduate Research Experience on Studying Variable Stars

    NASA Astrophysics Data System (ADS)

    Amaral, A.; Percy, J. R.

    2016-06-01

    We describe and evaluate a summer undergraduate research project and experience by one of us (AA), under the supervision of the other (JP). The aim of the project was to sample current approaches to analyzing variable star data, and topics related to the study of Mira variable stars and their astrophysical importance. This project was done through the Summer Undergraduate Research Program (SURP) in astronomy at the University of Toronto. SURP allowed undergraduate students to explore and learn about many topics within astronomy and astrophysics, from instrumentation to cosmology. SURP introduced students to key skills which are essential for students hoping to pursue graduate studies in any scientific field. Variable stars proved to be an excellent topic for a research project. For beginners to independent research, it introduces key concepts in research such as critical thinking and problem solving, while illuminating previously learned topics in stellar physics. The focus of this summer project was to compare observations with structural and evolutionary models, including modelling the random walk behavior exhibited in the (O-C) diagrams of most Mira stars. We found that the random walk could be modelled by using random fluctuations of the period. This explanation agreed well with observations.
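
    The random-walk model of (O-C) behavior mentioned above amounts to letting the period fluctuate randomly from cycle to cycle and accumulating the differences against a constant-period ephemeris. A minimal sketch follows; the mean period, fluctuation size, and cycle count are invented illustrative values, not the project's actual fit parameters.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    p0 = 332.0        # adopted mean period (days), a typical Mira-like value
    sigma = 1.5       # assumed random cycle-to-cycle period fluctuation (days)
    n_cycles = 300

    # Each cycle's true period fluctuates randomly about the mean ...
    periods = p0 + rng.normal(0.0, sigma, size=n_cycles)
    observed = np.cumsum(periods)                  # observed times of maximum
    computed = p0 * np.arange(1, n_cycles + 1)     # ephemeris with constant period

    o_minus_c = observed - computed                # ... so (O-C) performs a random walk
    print(o_minus_c[:5])
    print(f"(O-C) spread after {n_cycles} cycles: {o_minus_c.std():.1f} days")
    ```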

  5. Methods and analysis of realizing randomized grouping.

    PubMed

    Hu, Liang-Ping; Bao, Xiao-Lei; Wang, Qi

    2011-07-01

    Randomization is one of the four basic principles of research design. The meaning of randomization includes two aspects: one is to randomly select samples from the population, which is known as random sampling; the other is to randomly group all the samples, which is called randomized grouping. Randomized grouping can be subdivided into three categories: completely, stratified and dynamically randomized grouping. This article mainly introduces the steps of complete randomization, the definition of dynamic randomization and the realization of random sampling and grouping by SAS software.
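
    The steps of complete randomization described in the article can be realized in any language with a seeded random number generator, not only the SAS software the authors use. A minimal hypothetical sketch:

    ```python
    import random

    def complete_randomization(subject_ids, n_groups=2, seed=2011):
        """Completely randomized grouping: shuffle all subjects once,
        then deal them into groups of (nearly) equal size."""
        rng = random.Random(seed)  # a fixed seed makes the allocation reproducible
        ids = list(subject_ids)
        rng.shuffle(ids)
        return {g: ids[g::n_groups] for g in range(n_groups)}

    groups = complete_randomization(range(1, 21), n_groups=2)
    for g, members in groups.items():
        print(f"group {g}: {sorted(members)}")
    ```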

  6. Groupies in multitype random graphs.

    PubMed

    Shang, Yilun

    2016-01-01

    A groupie in a graph is a vertex whose degree is not less than the average degree of its neighbors. Under some mild conditions, we show that the proportion of groupies is very close to 1/2 in multitype random graphs (such as stochastic block models), which include Erdős-Rényi random graphs, random bipartite, and multipartite graphs as special examples. Numerical examples are provided to illustrate the theoretical results.
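
    The groupie definition is straightforward to check numerically. The sketch below estimates the groupie proportion in one Erdős-Rényi sample; the values of n and p are arbitrary, and isolated vertices are skipped because their neighborhood average is undefined (a convention chosen here for illustration).

    ```python
    import random
    from itertools import combinations

    def groupie_fraction(n, p, seed=0):
        """Fraction of vertices whose degree is at least the average
        degree of their neighbors, in one sample of G(n, p)."""
        rng = random.Random(seed)
        adj = {v: [] for v in range(n)}
        for u, v in combinations(range(n), 2):
            if rng.random() < p:
                adj[u].append(v)
                adj[v].append(u)
        deg = {v: len(adj[v]) for v in range(n)}
        groupies = sum(
            1
            for v in range(n)
            if deg[v] > 0 and deg[v] >= sum(deg[w] for w in adj[v]) / deg[v]
        )
        return groupies / n

    print(groupie_fraction(2000, 0.01))  # close to 1/2, as the theorem predicts
    ```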

  7. Combined influence of CT random noise and HU-RSP calibration curve nonlinearities on proton range systematic errors

    NASA Astrophysics Data System (ADS)

    Brousmiche, S.; Souris, K.; Orban de Xivry, J.; Lee, J. A.; Macq, B.; Seco, J.

    2017-11-01

    Proton range random and systematic uncertainties are the major factors undermining the advantages of proton therapy, namely, a sharp dose falloff and a better dose conformality for lower doses in normal tissues. The influence of CT artifacts such as beam hardening or scatter can easily be understood and estimated due to their large-scale effects on the CT image, like cupping and streaks. In comparison, the effects of weakly-correlated stochastic noise are more insidious, and less attention is drawn to them, partly due to the common belief that they only contribute to proton range uncertainties and not to systematic errors thanks to some averaging effects. A new source of systematic errors on the range and relative stopping powers (RSP) has been highlighted and proved not to be negligible compared to the 3.5% uncertainty reference value used for safety margin design. Hence, we demonstrate that the angular points in the HU-to-RSP calibration curve are an intrinsic source of proton range systematic error for typical levels of zero-mean stochastic CT noise. Systematic errors on RSP of up to 1% have been computed for these levels. We also show that the range uncertainty does not generally vary linearly with the noise standard deviation. We define a noise-dependent effective calibration curve that better describes, for a given material, the RSP value that is actually used. The statistics of the RSP and the range continuous slowing down approximation (CSDA) have been analytically derived for the general case of a calibration curve obtained by the stoichiometric calibration procedure. These models have been validated against actual CSDA simulations for homogeneous and heterogeneous synthetic objects as well as on actual patient CTs for prostate and head-and-neck treatment planning situations.

  8. Blood pressure variability of two ambulatory blood pressure monitors.

    PubMed

    Kallem, Radhakrishna R; Meyers, Kevin E C; Cucchiara, Andrew J; Sawinski, Deirdre L; Townsend, Raymond R

    2014-04-01

    There are no data on the evaluation of blood pressure (BP) variability comparing two ambulatory blood pressure monitoring monitors worn at the same time. Hence, this study was carried out to compare variability of BP in healthy untreated adults using two ambulatory BP monitors worn at the same time over an 8-h period. An Accutorr device was used to measure office BP in the dominant and nondominant arms of 24 participants.Simultaneous 8-h BP and heart rate data were measured in 24 untreated adult volunteers by Mobil-O-Graph (worn for an additional 16 h after removing the Spacelabs monitor) and Spacelabs with both random (N=12) and nonrandom (N=12) assignment of each device to the dominant arm. Average real variability (ARV), SD, coefficient of variation, and variation independent of mean were calculated for systolic blood pressure, diastolic blood pressure, mean arterial pressure, and pulse pressure (PP). Whether the Mobil-O-Graph was applied to the dominant or the nondominant arm, the ARV of mean systolic (P=0.003 nonrandomized; P=0.010 randomized) and PP (P=0.009 nonrandomized; P=0.005 randomized) remained significantly higher than the Spacelabs device, whereas the ARV of the mean arterial pressure was not significantly different. The average BP readings and ARVs for systolic blood pressure and PP obtained by the Mobil-O-Graph were considerably higher for the daytime than the night-time. Given the emerging interest in the effect of BP variability on health outcomes, the accuracy of its measurement is important. Our study raises concerns about the accuracy of pooling international ambulatory blood pressure monitoring variability data using different devices.
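
    The variability indices compared in this study have simple, standard definitions: ARV is the mean absolute difference between successive readings, and the coefficient of variation is the SD expressed as a percentage of the mean. The sketch below computes them for a toy series of systolic readings; the numbers are invented for illustration, not taken from the study.

    ```python
    import statistics

    def average_real_variability(readings):
        """ARV: mean absolute difference between successive readings."""
        return sum(abs(b - a) for a, b in zip(readings, readings[1:])) / (len(readings) - 1)

    def coefficient_of_variation(readings):
        """CV (%): standard deviation as a percentage of the mean."""
        return 100 * statistics.stdev(readings) / statistics.mean(readings)

    # Illustrative systolic readings (mmHg) from one monitoring session.
    sbp = [118, 124, 121, 131, 126, 122, 129, 125]
    print(f"ARV = {average_real_variability(sbp):.1f} mmHg")
    print(f"SD  = {statistics.stdev(sbp):.1f} mmHg")
    print(f"CV  = {coefficient_of_variation(sbp):.1f} %")
    ```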

  9. Speckle phase near random surfaces

    NASA Astrophysics Data System (ADS)

    Chen, Xiaoyi; Cheng, Chuanfu; An, Guoqiang; Han, Yujing; Rong, Zhenyu; Zhang, Li; Zhang, Meina

    2018-03-01

    Based on Kirchhoff approximation theory, the speckle phase near random surfaces with different roughness is numerically simulated. As expected, the properties of the speckle phase near the random surfaces are different from that in far field. In addition, as scattering distances and roughness increase, the average fluctuations of the speckle phase become larger. Unusually, the speckle phase is somewhat similar to the corresponding surface topography. We have performed experiments to verify the theoretical simulation results. Studies in this paper contribute to understanding the evolution of speckle phase near a random surface and provide a possible way to identify a random surface structure based on its speckle phase.

  10. Imaging Variable Stars with HST

    NASA Astrophysics Data System (ADS)

    Karovska, Margarita

    2011-05-01

    The Hubble Space Telescope (HST) observations of astronomical sources, ranging from objects in our solar system to objects in the early Universe, have revolutionized our knowledge of the Universe, its origins and contents. I will highlight results from HST observations of variable stars obtained during the past twenty or so years. Multiwavelength observations of numerous variable stars and stellar systems were obtained using the superb HST imaging capabilities and its unprecedented angular resolution, especially in the UV and optical. The HST provided the first detailed images probing the structure of variable stars including their atmospheres and circumstellar environments. AAVSO observations and light curves have been critical for scheduling of many of these observations and provided important information and context for understanding of the imaging results of many variable sources. I will describe the scientific results from the imaging observations of variable stars including AGBs, Miras, Cepheids, semi-regular variables (including supergiants and giants), YSOs and interacting stellar systems with variable stellar components. These results have led to an unprecedented understanding of the spatial and temporal characteristics of these objects and their place in the stellar evolutionary chains, and in the larger context of the dynamic evolving Universe.

  11. Imaging Variable Stars with HST

    NASA Astrophysics Data System (ADS)

    Karovska, M.

    2012-06-01

    (Abstract only) The Hubble Space Telescope (HST) observations of astronomical sources, ranging from objects in our solar system to objects in the early Universe, have revolutionized our knowledge of the Universe, its origins and contents. I highlight results from HST observations of variable stars obtained during the past twenty or so years. Multiwavelength observations of numerous variable stars and stellar systems were obtained using the superb HST imaging capabilities and its unprecedented angular resolution, especially in the UV and optical. The HST provided the first detailed images probing the structure of variable stars including their atmospheres and circumstellar environments. AAVSO observations and light curves have been critical for scheduling of many of these observations and provided important information and context for understanding of the imaging results of many variable sources. I describe the scientific results from the imaging observations of variable stars including AGBs, Miras, Cepheids, semiregular variables (including supergiants and giants), YSOs and interacting stellar systems with variable stellar components. These results have led to an unprecedented understanding of the spatial and temporal characteristics of these objects and their place in the stellar evolutionary chains, and in the larger context of the dynamic evolving Universe.

  12. Random forests for classification in ecology

    Cutler, D.R.; Edwards, T.C.; Beard, K.H.; Cutler, A.; Hess, K.T.; Gibson, J.; Lawler, J.J.

    2007-01-01

    Classification procedures are some of the most widely used statistical methods in ecology. Random forests (RF) is a new and powerful statistical classifier that is well established in other disciplines but is relatively unknown in ecology. Advantages of RF compared to other statistical classifiers include (1) very high classification accuracy; (2) a novel method of determining variable importance; (3) ability to model complex interactions among predictor variables; (4) flexibility to perform several types of statistical data analysis, including regression, classification, survival analysis, and unsupervised learning; and (5) an algorithm for imputing missing values. We compared the accuracies of RF and four other commonly used statistical classifiers using data on invasive plant species presence in Lava Beds National Monument, California, USA, rare lichen species presence in the Pacific Northwest, USA, and nest sites for cavity nesting birds in the Uinta Mountains, Utah, USA. We observed high classification accuracy in all applications as measured by cross-validation and, in the case of the lichen data, by independent test data, when comparing RF to other common classification methods. We also observed that the variables that RF identified as most important for classifying invasive plant species coincided with expectations based on the literature. © 2007 by the Ecological Society of America.
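
    For readers unfamiliar with RF, the classification-plus-variable-importance workflow described above can be sketched in a few lines of scikit-learn. The synthetic dataset and parameter choices below are stand-ins, not the study's data.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    # Stand-in for a presence/absence dataset: 6 informative predictors
    # (think elevation, canopy cover, ...) plus uninformative noise variables.
    X, y = make_classification(n_samples=1000, n_features=10, n_informative=6,
                               random_state=0)

    rf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
    rf.fit(X, y)

    print(f"out-of-bag accuracy: {rf.oob_score_:.3f}")
    # Impurity-based importances correspond to the "variable importance"
    # measure highlighted in the abstract.
    for i, imp in sorted(enumerate(rf.feature_importances_), key=lambda t: -t[1])[:5]:
        print(f"variable {i}: importance {imp:.3f}")
    ```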

  13. Variable pixel size ionospheric tomography

    NASA Astrophysics Data System (ADS)

    Zheng, Dunyong; Zheng, Hongwei; Wang, Yanjun; Nie, Wenfeng; Li, Chaokui; Ao, Minsi; Hu, Wusheng; Zhou, Wei

    2017-06-01

    A novel ionospheric tomography technique based on variable pixel size was developed for the tomographic reconstruction of the ionospheric electron density (IED) distribution. In the variable pixel size computerized ionospheric tomography (VPSCIT) model, the IED distribution is parameterized by a decomposition of the lower and upper ionosphere with different pixel sizes. Thus, the lower and upper IED distributions may be determined very differently by the available data. The variable pixel size ionospheric tomography and constant pixel size tomography are similar in most other aspects. There are two main differences between models with constant and variable pixel size: one is that the segments of the GPS signal path must be assigned to the different kinds of pixels in the inversion; the other is that the smoothness constraint factor needs appropriate modification where the pixels change in size. For a real dataset, the variable pixel size method distinguishes different electron density distribution zones better than the constant pixel size method, and it rewards the effort spent to identify the regions of a model with the best data coverage. The variable pixel size method can not only greatly improve the efficiency of inversion, but also produce IED images whose fidelity matches that of a uniform pixel size method. In addition, variable pixel size tomography can reduce the underdetermination of an ill-posed inverse problem when the data coverage is irregular or sparse, by adjusting the relative proportion of pixels with different sizes. In comparison with constant pixel size tomography models, the variable pixel size ionospheric tomography technique achieved relatively good results in a numerical simulation. A careful validation of the reliability and superiority of variable pixel size ionospheric tomography was performed. Finally, according to the results of the statistical analysis and quantitative comparison, the

  14. Low-Frequency Temporal Variability in Mira and Semiregular Variables

    NASA Astrophysics Data System (ADS)

    Templeton, Matthew R.; Karovska, M.; Waagen, E. O.

    2012-01-01

    We investigate low-frequency variability in a large sample of Mira and semiregular variables with long-term visual light curves from the AAVSO International Database. Our aim is to determine whether we can detect and measure long-timescale variable phenomena in these stars, for example photometric variations that might be associated with supergranular convection. We analyzed the long-term light curves of 522 variable stars of the Mira and SRa, b, c, and d classes. We calculated their low-frequency time-series spectra to characterize red noise with the power density spectrum index, and then correlated this index with other observable characteristics such as spectral type and primary pulsation period. In our initial analysis of the sample, we see that the semiregular variables have a much broader range of spectral index than the Mira types, with the SRb subtype having the broadest range. Among Mira variables we see that the M- and S-type Miras have similarly wide ranges of index, while the C-types have the narrowest, with generally shallower slopes. There is also a trend of steeper slope with larger amplitude, but at a given amplitude, a wide range of slopes is seen. The ultimate goal of the project is to identify stars with strong intrinsic red noise components as possible targets for resolved surface imaging with interferometry.
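
    A power density spectrum index of the kind used here can be estimated by fitting a power law to a light curve's periodogram. A minimal sketch on simulated data follows; the random-walk light curve, unit sampling, and whole-range fit are illustrative assumptions, not the authors' pipeline.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 4096
    flux = np.cumsum(rng.normal(size=n))   # random-walk light curve: strong red noise

    # Periodogram via FFT, then a power-law fit log P = -alpha * log f + c.
    power = np.abs(np.fft.rfft(flux - flux.mean())) ** 2
    freq = np.fft.rfftfreq(n, d=1.0)
    mask = freq > 0                        # drop the zero-frequency bin
    alpha = -np.polyfit(np.log(freq[mask]), np.log(power[mask]), 1)[0]

    print(f"power density spectrum index ~ {alpha:.2f}  (expect ~2 for a random walk)")
    ```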

  15. A Strategy to Use Soft Data Effectively in Randomized Controlled Clinical Trials.

    ERIC Educational Resources Information Center

    Kraemer, Helena Chmura; Thiemann, Sue

    1989-01-01

    Sees soft data, measures having substantial intrasubject variability due to errors of measurement or response inconsistency, as important measures of response in randomized clinical trials. Shows that using intensive design and slope of response on time as outcome measure maximizes sample retention and decreases within-group variability, thus…

  16. On the variability of Vega

    NASA Astrophysics Data System (ADS)

    Butkovskaya, V. V.

    2014-06-01

    For 60 years Vega has been accepted as a standard star in the near infrared, optical, and ultraviolet ranges. However, a 21-year spectral and spectrophotometric variability of Vega has been revealed. Vega also demonstrates short-term unexplained variability. Recent spectropolarimetric studies have revealed a weak magnetic field on Vega. We analyze the results of 15-year observations performed at the Crimean Astrophysical Observatory and we hypothesize that the magnetic field variation is caused by stellar rotation. In the present work we summarize the results of investigations on the variability of Vega.

  17. Nonvolatile random access memory

    NASA Technical Reports Server (NTRS)

    Wu, Jiin-Chuan (Inventor); Stadler, Henry L. (Inventor); Katti, Romney R. (Inventor)

    1994-01-01

    A nonvolatile magnetic random access memory can be achieved by an array of magnet-Hall effect (M-H) elements. The storage function is realized with a rectangular thin-film ferromagnetic material having an in-plane, uniaxial anisotropy and in-plane bipolar remanent magnetization states. The thin-film magnetic element is magnetized by a local applied field, whose direction is used to form either a 0 or 1 state. The element remains in the 0 or 1 state until a switching field is applied to change its state. The stored information is detected by a Hall-effect sensor which senses the fringing field from the magnetic storage element. The circuit design for addressing each cell includes transistor switches for providing a current of selected polarity to store a binary digit through a separate conductor overlying the magnetic element of the cell. To read out a stored binary digit, transistor switches are employed to provide a current through a row of Hall-effect sensors connected in series and enabling a differential voltage amplifier connected to all Hall-effect sensors of a column in series. To avoid read-out voltage errors due to shunt currents through resistive loads of the Hall-effect sensors of other cells in the same column, at least one transistor switch is provided between every pair of adjacent cells in every row which are not turned on except in the row of the selected cell.

  18. Random walk of passive tracers among randomly moving obstacles.

    PubMed

    Gori, Matteo; Donato, Irene; Floriani, Elena; Nardecchia, Ilaria; Pettini, Marco

    2016-04-14

    This study is mainly motivated by the need of understanding how the diffusion behavior of a biomolecule (or even of a larger object) is affected by other moving macromolecules, organelles, and so on, inside a living cell, whence the possibility of understanding whether or not a randomly walking biomolecule is also subject to a long-range force field driving it to its target. By means of the Continuous Time Random Walk (CTRW) technique the topic of random walk in random environment is here considered in the case of a passively diffusing particle among randomly moving and interacting obstacles. The relevant physical quantity which is worked out is the diffusion coefficient of the passive tracer which is computed as a function of the average inter-obstacles distance. The results reported here suggest that if a biomolecule, let us call it a test molecule, moves towards its target in the presence of other independently interacting molecules, its motion can be considerably slowed down.
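
    The diffusion coefficient worked out in this study is the same quantity one extracts from the slope of a tracer's mean squared displacement; among moving obstacles the estimated D falls below the free-diffusion value. The sketch below computes only the obstacle-free baseline (walker counts, step sizes, and units are arbitrary assumptions; the CTRW treatment of interacting obstacles is beyond this illustration).

    ```python
    import random

    def msd_2d(n_walkers=2000, n_steps=500, step=1.0, seed=7):
        """Mean squared displacement of independent 2D lattice random walkers."""
        rng = random.Random(seed)
        moves = [(step, 0), (-step, 0), (0, step), (0, -step)]
        total = 0.0
        for _ in range(n_walkers):
            x = y = 0.0
            for _ in range(n_steps):
                dx, dy = rng.choice(moves)
                x += dx
                y += dy
            total += x * x + y * y
        return total / n_walkers

    n_steps = 500
    msd = msd_2d(n_steps=n_steps)
    D = msd / (4 * n_steps)  # in 2D, MSD = 4 D t, with t = n_steps unit time steps
    print(f"MSD = {msd:.1f}, estimated diffusion coefficient D = {D:.3f}")
    ```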

  19. Critical Behavior of the Annealed Ising Model on Random Regular Graphs

    NASA Astrophysics Data System (ADS)

    Can, Van Hao

    2017-11-01

    In Giardinà et al. (ALEA Lat Am J Probab Math Stat 13(1):121-161, 2016), the authors have defined an annealed Ising model on random graphs and proved limit theorems for the magnetization of this model on some random graphs including random 2-regular graphs. Then in Can (Annealed limit theorems for the Ising model on random regular graphs, arXiv:1701.08639, 2017), we generalized their results to the class of all random regular graphs. In this paper, we study the critical behavior of this model. In particular, we determine the critical exponents and prove a nonstandard limit theorem stating that the magnetization scaled by n^{3/4} converges to a specific random variable, with n the number of vertices of the random regular graphs.

  20. QUALITY CONTROL - VARIABILITY IN PROTOCOLS

    EPA Science Inventory

    The EPA Risk Reduction Engineering Laboratory’s Quality Assurance Office, which published the popular pocket guide Preparing Perfect Project Plans, is now introducing another quality assurance reference aid. The document Variability in Protocols (VIP) was initially designed as a ...

  1. Identifying Context Variables in Research.

    ERIC Educational Resources Information Center

    Piazza, Carolyn L.

    1987-01-01

    Identifies context variables in written composition from theoretical perspectives in cognitive psychology, sociology, and anthropology. Considers how multiple views of context from across the disciplines can build toward a broader definition of writing. (JD)

  2. Adapting to variable prismatic displacement

    NASA Technical Reports Server (NTRS)

    Welch, Robert B.; Cohen, Malcolm M.

    1989-01-01

    In each of two studies, subjects were exposed to a continuously changing prismatic displacement with a mean value of 19 prism diopters (variable displacement) and to a fixed 19-diopter displacement (fixed displacement). In Experiment 1, significant adaptation (post-pre shifts in hand-eye coordination) was found for fixed, but not for variable, displacement. Experiment 2 demonstrated that adaptation was obtained for variable displacement, but it was very fragile and was lost if the measures of adaptation were preceded by even a very brief exposure of the hand to normal or near-normal vision. Contrary to the results of some previous studies, no increase in the within-subject dispersion of target-pointing responses was found as a result of exposure to variable displacement.

  3. Climate Impact of Solar Variability

    NASA Technical Reports Server (NTRS)

    Schatten, Kenneth H. (Editor); Arking, Albert (Editor)

    1990-01-01

    The conference on The Climate Impact of Solar Variability was held at Goddard Space Flight Center from April 24 to 27, 1990. In recent years there has been renewed interest in the potential effects of increasing greenhouse gases on climate. Carbon dioxide, methane, nitrous oxide, and the chlorofluorocarbons have been increasing at rates that could significantly change climate. There is considerable uncertainty over the magnitude of this anthropogenic change. The climate system is very complex, with feedback processes that are not fully understood. Moreover, there are two sources of natural climate variability (volcanic aerosols and solar variability) added to the anthropogenic changes, which may confuse our interpretation of the observed temperature record. Thus, if we could understand the climatic impact of the natural variability, it would aid our interpretation and understanding of man-made climate changes.

  4. Redundant variables and Granger causality

    NASA Astrophysics Data System (ADS)

    Angelini, L.; de Tommaso, M.; Marinazzo, D.; Nitti, L.; Pellicoro, M.; Stramaglia, S.

    2010-03-01

    We discuss the use of multivariate Granger causality in the presence of redundant variables: the application of the standard analysis, in this case, leads to underestimation of causalities. Using the un-normalized version of the causality index, we quantitatively develop the notions of redundancy and synergy in the framework of causality and propose two approaches to group redundant variables: (i) for a given target, the remaining variables are grouped so as to maximize the total causality, and (ii) the whole set of variables is partitioned to maximize the sum of the causalities between subsets. We show the application to a real neurological experiment, aiming at a deeper understanding of the physiological basis of abnormal neuronal oscillations in the migraine brain. The outcome of our approach reveals the change in the informational pattern due to repetitive transcranial magnetic stimulations.

  5. A Cautious Note on Auxiliary Variables That Can Increase Bias in Missing Data Problems.

    PubMed

    Thoemmes, Felix; Rose, Norman

    2014-01-01

    The treatment of missing data in the social sciences has changed tremendously during the last decade. Modern missing data techniques such as multiple imputation and full-information maximum likelihood are used much more frequently. These methods assume that data are missing at random. One very common approach to increase the likelihood that missing at random is achieved consists of including many covariates as so-called auxiliary variables. These variables are either included based on data considerations or in an inclusive fashion; that is, taking all available auxiliary variables. In this article, we point out that there are some instances in which auxiliary variables exhibit the surprising property of increasing bias in missing data problems. In a series of focused simulation studies, we highlight some situations in which this type of biasing behavior can occur. We briefly discuss possible ways how one can avoid selecting bias-inducing covariates as auxiliary variables.

  6. Automatic Classification of Time-variable X-Ray Sources

    NASA Astrophysics Data System (ADS)

    Lo, Kitty K.; Farrell, Sean; Murphy, Tara; Gaensler, B. M.

    2014-05-01

    To maximize the discovery potential of future synoptic surveys, especially in the field of transient science, it will be necessary to use automatic classification to identify some of the astronomical sources. The data mining technique of supervised classification is suitable for this problem. Here, we present a supervised learning method to automatically classify variable X-ray sources in the Second XMM-Newton Serendipitous Source Catalog (2XMMi-DR2). Random Forest is our classifier of choice since it is one of the most accurate learning algorithms available. Our training set consists of 873 variable sources and their features are derived from time series, spectra, and other multi-wavelength contextual information. The 10-fold cross-validation accuracy on the training data is ~97% on a 7-class data set. We applied the trained classification model to 411 unknown variable 2XMM sources to produce a probabilistically classified catalog. Using the classification margin and the Random Forest derived outlier measure, we identified 12 anomalous sources, of which 2XMM J180658.7-500250 appears to be the most unusual source in the sample. Its X-ray spectrum is suggestive of an ultraluminous X-ray source, but its variability makes it highly unusual. Machine-learned classification and anomaly detection will facilitate scientific discoveries in the era of all-sky surveys.
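    A minimal sketch of this kind of pipeline in Python with scikit-learn; the features, labels, and the margin-based anomaly cut are placeholders rather than the paper's actual data or thresholds.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(873, 10))    # 873 labelled sources, placeholder features
    y_train = rng.integers(0, 7, size=873)  # 7 source classes
    X_unknown = rng.normal(size=(411, 10))  # 411 unclassified variable sources

    clf = RandomForestClassifier(n_estimators=500, random_state=0)
    clf.fit(X_train, y_train)
    print("10-fold CV accuracy:", cross_val_score(clf, X_train, y_train, cv=10).mean())

    # Probabilistic catalogue plus a simple margin: the gap between the two
    # highest class probabilities; small margins flag ambiguous/anomalous sources.
    proba = clf.predict_proba(X_unknown)
    top2 = np.sort(proba, axis=1)[:, -2:]
    margin = top2[:, 1] - top2[:, 0]
    anomalous = np.argsort(margin)[:12]     # the 12 least confident objects
    ```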

  7. Automatic classification of time-variable X-ray sources

    SciT

    Lo, Kitty K.; Farrell, Sean; Murphy, Tara

    2014-05-01

    To maximize the discovery potential of future synoptic surveys, especially in the field of transient science, it will be necessary to use automatic classification to identify some of the astronomical sources. The data mining technique of supervised classification is suitable for this problem. Here, we present a supervised learning method to automatically classify variable X-ray sources in the Second XMM-Newton Serendipitous Source Catalog (2XMMi-DR2). Random Forest is our classifier of choice since it is one of the most accurate learning algorithms available. Our training set consists of 873 variable sources and their features are derived from time series, spectra, and other multi-wavelength contextual information. The 10-fold cross-validation accuracy on the training data is ∼97% on a 7-class data set. We applied the trained classification model to 411 unknown variable 2XMM sources to produce a probabilistically classified catalog. Using the classification margin and the Random Forest derived outlier measure, we identified 12 anomalous sources, of which 2XMM J180658.7–500250 appears to be the most unusual source in the sample. Its X-ray spectrum is suggestive of an ultraluminous X-ray source, but its variability makes it highly unusual. Machine-learned classification and anomaly detection will facilitate scientific discoveries in the era of all-sky surveys.

  8. Variable spectra of active galaxies

    NASA Technical Reports Server (NTRS)

    Halpern, Jules P.

    1988-01-01

    The analysis of EXOSAT spectra of active galaxies is presented. The objects examined for X-ray spectral variability were MR 2251-178 and 3C 120. The results of these investigations are described, as well as additional results on X-ray spectral variability from EXOSAT observations of active galaxies. The dipping X-ray source 4U1624-49 was also investigated.

  9. Optimization Of Mean-Semivariance-Skewness Portfolio Selection Model In Fuzzy Random Environment

    NASA Astrophysics Data System (ADS)

    Chatterjee, Amitava; Bhattacharyya, Rupak; Mukherjee, Supratim; Kar, Samarjit

    2010-10-01

    The purpose of the paper is to construct a mean-semivariance-skewness portfolio selection model in a fuzzy random environment. The objective is to maximize the skewness subject to a predefined maximum risk tolerance and minimum expected return. The security returns in the objectives and constraints are assumed to be fuzzy random variables, and the vagueness of these fuzzy random variables is then transformed into fuzzy variables similar to trapezoidal numbers. The resulting fuzzy model is converted into a deterministic optimization model. The feasibility and effectiveness of the proposed method are verified by a numerical example extracted from the Bombay Stock Exchange (BSE). The exact parameters of the fuzzy membership function and probability density function are obtained through fuzzy random simulation of past data.

  10. Neural variability, or lack thereof

    PubMed Central

    Masquelier, Timothée

    2013-01-01

    We do not claim that the brain is completely deterministic, and we agree that noise may be beneficial in some cases. But we suggest that neuronal variability may often be overestimated, due to uncontrolled internal variables and/or the use of inappropriate reference times. These ideas are not new, but should be re-examined in the light of recent experimental findings: trial-to-trial variability is often correlated across neurons, across trials, greater for higher-order neurons, and reduced by attention, suggesting that “intrinsic” sources of noise can only account for a minimal part of it. While it is obviously difficult to control for all internal variables, the problem of reference time can be largely avoided by recording multiple neurons at the same time and looking at statistical structures in relative latencies. These relative latencies have another major advantage: they are insensitive to the variability that is shared across neurons, which is often a significant part of the total variability. Thus, we suggest that signal-to-noise ratios in the brain may be much higher than usually thought, leading to reactive systems that are economical in terms of number of neurons and energy efficient. PMID:23444270

  11. The random continued fraction transformation

    NASA Astrophysics Data System (ADS)

    Kalle, Charlene; Kempton, Tom; Verbitskiy, Evgeny

    2017-03-01

    We introduce a random dynamical system related to continued fraction expansions. It uses random combinations of the Gauss map and the Rényi (or backwards) continued fraction map. We explore the continued fraction expansions that this system produces, as well as the dynamical properties of the system.

  12. Unsupervised Unmixing of Hyperspectral Images Accounting for Endmember Variability.

    PubMed

    Halimi, Abderrahim; Dobigeon, Nicolas; Tourneret, Jean-Yves

    2015-12-01

    This paper presents an unsupervised Bayesian algorithm for hyperspectral image unmixing that accounts for endmember variability. The pixels are modeled by a linear combination of endmembers weighted by their corresponding abundances. However, the endmembers are modeled as random variables to account for their variability across the image. An additive noise is also considered in the proposed model, generalizing the normal compositional model. The proposed algorithm exploits the whole image to benefit from both spectral and spatial information. It estimates both the mean and the covariance matrix of each endmember in the image. This allows the behavior of each material to be analyzed and its variability to be quantified in the scene. A spatial segmentation is also obtained based on the estimated abundances. In order to estimate the parameters associated with the proposed Bayesian model, we propose to use a Hamiltonian Monte Carlo algorithm. The performance of the resulting unmixing strategy is evaluated through simulations conducted on both synthetic and real data.

  13. Random discrete linear canonical transform.

    PubMed

    Wei, Deyun; Wang, Ruikui; Li, Yuan-Min

    2016-12-01

    Linear canonical transforms (LCTs) are a family of integral transforms with wide applications in optical, acoustical, electromagnetic, and other wave propagation problems. In this paper, we propose the random discrete linear canonical transform (RDLCT) by randomizing the kernel transform matrix of the discrete linear canonical transform (DLCT). The RDLCT inherits excellent mathematical properties from the DLCT along with some distinctive features of its own. It has a greater degree of randomness because the randomization applies to both eigenvectors and eigenvalues. Numerical simulations demonstrate an important feature of the RDLCT: both the magnitude and the phase of its output are random. As an important application, the RDLCT can be used for image encryption. The simulation results demonstrate that the proposed encryption method is a security-enhanced image encryption scheme.

  14. The random fractional matching problem

    NASA Astrophysics Data System (ADS)

    Lucibello, Carlo; Malatesta, Enrico M.; Parisi, Giorgio; Sicuro, Gabriele

    2018-05-01

    We consider two formulations of the random-link fractional matching problem, a relaxed version of the more standard random-link (integer) matching problem. In one formulation, we allow each node to be linked to itself in the optimal matching configuration. In the other one, on the contrary, such a link is forbidden. Both problems have the same asymptotic average optimal cost as the random-link matching problem on the complete graph. Using a replica approach and previous results of Wästlund (2010 Acta Mathematica 204 91–150), we analytically derive the finite-size corrections to the asymptotic optimal cost. We compare our results with numerical simulations and we discuss the main differences between random-link fractional matching problems and the random-link matching problem.

  15. Mendelian randomization in nutritional epidemiology

    PubMed Central

    Qi, Lu

    2013-01-01

    Nutritional epidemiology aims to identify dietary and lifestyle causes for human diseases. Causality inference in nutritional epidemiology is largely based on evidence from studies of observational design, and may be distorted by unmeasured or residual confounding and reverse causation. Mendelian randomization is a recently developed methodology that combines genetic and classical epidemiological analysis to infer causality for environmental exposures, based on the principle of Mendel’s law of independent assortment. Mendelian randomization uses genetic variants as proxies for environmental exposures of interest. Associations derived from Mendelian randomization analysis are less likely to be affected by confounding and reverse causation. During the past 5 years, a body of studies examined the causal effects of diet/lifestyle factors and biomarkers on a variety of diseases. The Mendelian randomization approach also holds considerable promise in the study of intrauterine influences on offspring health outcomes. However, the application of Mendelian randomization in nutritional epidemiology has some limitations. PMID:19674341

  16. Financial management of a large multisite randomized clinical trial.

    PubMed

    Sheffet, Alice J; Flaxman, Linda; Tom, MeeLee; Hughes, Susan E; Longbottom, Mary E; Howard, Virginia J; Marler, John R; Brott, Thomas G

    2014-08-01

    The Carotid Revascularization Endarterectomy versus Stenting Trial (CREST) received five years' funding ($21 112 866) from the National Institutes of Health to compare carotid stenting to surgery for stroke prevention in 2500 randomized participants at 40 sites. Herein we evaluate the change in the CREST budget from a fixed to variable-cost model and recommend strategies for the financial management of large-scale clinical trials. Projections of the original grant's fixed-cost model were compared to the actual costs of the revised variable-cost model. The original grant's fixed-cost budget included salaries, fringe benefits, and other direct and indirect costs. For the variable-cost model, the costs were actual payments to the clinical sites and core centers based upon actual trial enrollment. We compared annual direct and indirect costs and per-patient cost for both the fixed and variable models. Differences between clinical site and core center expenditures were also calculated. Using a variable-cost budget for clinical sites, funding was extended by no-cost extension from five to eight years. Randomizing sites tripled from 34 to 109. Of the 2500 targeted sample size, 138 (5.5%) were randomized during the first five years and 1387 (55.5%) during the no-cost extension. The actual per-patient costs of the variable model were 9% ($13 845) of the projected per-patient costs ($152 992) of the fixed model. Performance-based budgets conserve funding, promote compliance, and allow for additional sites at modest additional cost. Costs of large-scale clinical trials can thus be reduced through effective management without compromising scientific integrity. © 2014 The Authors. International Journal of Stroke © 2014 World Stroke Organization.

  17. "Congratulations, you have been randomized into the control group!(?)": issues to consider when recruiting schools for matched-pair randomized control trials of prevention programs.

    PubMed

    Ji, Peter; DuBois, David L; Flay, Brian R; Brechling, Vanessa

    2008-03-01

    Recruiting schools into a matched-pair randomized control trial (MP-RCT) to evaluate the efficacy of a school-level prevention program presents challenges for researchers. We considered which of two procedures would be most effective for recruiting schools into the study and assigning them to conditions. In one procedure (recruit, then match/randomize), we would recruit schools and match them prior to randomization; in the other (match/randomize, then recruit), we would match schools and randomize them prior to recruitment. We considered how each procedure would affect the randomization process and our ability to recruit schools into the study. After implementing the selected procedure, the equivalence of the treatment and control group schools, and of the participating and nonparticipating schools, on school demographic variables was evaluated. We decided on the recruit-then-match/randomize procedure because we thought it would provide the opportunity to build rapport with the schools and prepare them for the randomization process, thereby increasing the likelihood that they would accept their randomly assigned conditions. Neither the treatment and control group schools nor the participating and nonparticipating schools exhibited statistically significant differences from each other on any of the school demographic variables. Recruitment of schools prior to matching and randomization in an MP-RCT may facilitate the recruitment of schools and thus enhance both the statistical power and the representativeness of study findings. Future research would benefit from the consideration of a broader range of variables (e.g., readiness to implement a comprehensive prevention program) both in matching schools and in evaluating their representativeness relative to nonparticipating schools.

  18. Coupled continuous time-random walks in quenched random environment

    NASA Astrophysics Data System (ADS)

    Magdziarz, M.; Szczotka, W.

    2018-02-01

    We introduce a coupled continuous-time random walk with coupling which is characteristic for Lévy walks. Additionally we assume that the walker moves in a quenched random environment, i.e. the site disorder at each lattice point is fixed in time. We analyze the scaling limit of such a random walk. We show that for large times the behaviour of the analyzed process is exactly the same as in the case of uncoupled quenched trap model for Lévy flights.

  19. A prospectus for a theory of variable variability

    NASA Technical Reports Server (NTRS)

    Childress, S.; Spiegel, E. A.

    1981-01-01

    It is proposed that the kind of stellar variability exhibited by the Sun in its magnetic activity cycle should be considered as a prototype of a class of stellar variability. The signature includes long 'periods' (compared to that of the radial fundamental mode), erratic behavior, and intermittency. As other phenomena in the same variability class we nominate the luminosity fluctuations of ZZ Ceti stars and the solar 160 min oscillation. We discuss the possibility that analogous physical mechanisms are at work in all these cases, namely instabilities driven in a thin layer. These instabilities should be favorable to grave modes (in angle) and should arise in conditions that may allow more than one kind of instability to occur at once. The interaction of these competing instabilities produces complicated temporal variations. Given suitable idealizations, it is shown how to begin to compute solutions of small, but finite, amplitude.

  20. Methods for sample size determination in cluster randomized trials

    PubMed Central

    Rutterford, Clare; Copas, Andrew; Eldridge, Sandra

    2015-01-01

    Background: The use of cluster randomized trials (CRTs) is increasing, along with the variety in their design and analysis. The simplest approach for their sample size calculation is to calculate the sample size assuming individual randomization and inflate this by a design effect to account for randomization by cluster. The assumptions of a simple design effect may not always be met; alternative or more complicated approaches are required. Methods: We summarise a wide range of sample size methods available for cluster randomized trials. For those familiar with sample size calculations for individually randomized trials but with less experience in the clustered case, this manuscript provides formulae for a wide range of scenarios with associated explanation and recommendations. For those with more experience, comprehensive summaries are provided that allow quick identification of methods for a given design, outcome and analysis method. Results: We present first those methods applicable to the simplest two-arm, parallel group, completely randomized design followed by methods that incorporate deviations from this design such as: variability in cluster sizes; attrition; non-compliance; or the inclusion of baseline covariates or repeated measures. The paper concludes with methods for alternative designs. Conclusions: There is a large amount of methodology available for sample size calculations in CRTs. This paper gives the most comprehensive description of published methodology for sample size calculation and provides an important resource for those designing these trials. PMID:26174515
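    For readers unfamiliar with the simple design effect mentioned in the Background, the textbook inflation formula is sketched below; the paper surveys many refinements of it.

    ```latex
    % Sample size for a cluster randomized trial via the simple design effect:
    % n_ind is the size under individual randomization, m the average cluster
    % size, and rho the intracluster correlation coefficient.
    \[
      n_{\mathrm{cluster}} = n_{\mathrm{ind}} \times \left[ 1 + (m - 1)\rho \right]
    \]
    % Example: n_ind = 200, m = 20, rho = 0.05 gives a design effect of
    % 1 + 19(0.05) = 1.95, i.e. 390 participants are required.
    ```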

  1. Evaluation of Kurtosis into the product of two normally distributed variables

    NASA Astrophysics Data System (ADS)

    Oliveira, Amílcar; Oliveira, Teresa; Seijas-Macías, Antonio

    2016-06-01

    Kurtosis (κ) is a measure of the "peakedness" of the distribution of a real-valued random variable. We study the evolution of the kurtosis for the product of two normally distributed variables. The product of two normal variables is a very common problem in areas of study such as physics, economics, and psychology. Normal variables have a constant value for kurtosis (κ = 3), independently of the values of the two parameters: mean and variance. In fact, the excess kurtosis is defined as κ − 3, so the Normal Distribution has zero excess kurtosis. The product of two normally distributed variables is a function of the parameters of the two variables and the correlation between them, and the range for the excess kurtosis is [0, 6] for independent variables and [0, 12] when correlation between them is allowed.
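    A quick Monte Carlo check of the limiting values quoted above (a sketch; the paper derives them analytically):

    ```python
    import numpy as np
    from scipy.stats import kurtosis  # Fisher (excess) kurtosis by default

    rng = np.random.default_rng(42)
    n = 2_000_000

    # Independent zero-mean factors: excess kurtosis of the product is ~6,
    # the top of the [0, 6] range for independent variables.
    z = rng.normal(0.0, 1.0, n) * rng.normal(0.0, 1.0, n)
    print(kurtosis(z))

    # Large means: the product is nearly normal, excess kurtosis ~0.
    w = rng.normal(10.0, 1.0, n) * rng.normal(10.0, 1.0, n)
    print(kurtosis(w))
    ```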

  2. The cataclysmic variable AE Aquarii: orbital variability in V band

    NASA Astrophysics Data System (ADS)

    Zamanov, R.; Latev, G.

    2017-07-01

    We present 62.7 hours of observations of the cataclysmic variable AE Aqr in the Johnson V band. These are previously unpublished archival electro-photometric data obtained during the period 1993 to 1999. We construct the orbital variability in the V band and obtain a Fourier fit to the double-wave quiescent light curve. The strongest flares in our data set fall in the phase interval 0.6-0.8. The data can be downloaded from http://www.astro.bas.bg/~rz/DATA/AEAqr.elphot.dat.

  3. All varieties of encoding variability are not created equal: Separating variable processing from variable tasks

    PubMed Central

    Huff, Mark J.; Bodner, Glen E.

    2014-01-01

    Whether encoding variability facilitates memory is shown to depend on whether item-specific and relational processing are both performed across study blocks, and on whether study items are weakly versus strongly related. Variable-processing groups studied a word list once using an item-specific task and once using a relational task. Variable-task groups' two different study tasks recruited the same type of processing in each block. Repeated-task groups performed the same study task in each block. Recall and recognition were greatest in the variable-processing group, but only with weakly related lists. A variable-processing benefit was also found when task-based processing and list-type processing were complementary (e.g., item-specific processing of a related list) rather than redundant (e.g., relational processing of a related list). The finding that performing both item-specific and relational processing across trials, or within a trial, yields encoding-variability benefits may help reconcile decades of contradictory findings in this area. PMID:25018583

  4. Random bursts determine dynamics of active filaments.

    PubMed

    Weber, Christoph A; Suzuki, Ryo; Schaller, Volker; Aranson, Igor S; Bausch, Andreas R; Frey, Erwin

    2015-08-25

    Constituents of living or synthetic active matter have access to a local energy supply that serves to keep the system out of thermal equilibrium. The statistical properties of such fluctuating active systems differ from those of their equilibrium counterparts. Using the actin filament gliding assay as a model, we studied how nonthermal distributions emerge in active matter. We found that the basic mechanism involves the interplay between local and random injection of energy, acting as an analog of a thermal heat bath, and nonequilibrium energy dissipation processes associated with sudden jump-like changes in the system's dynamic variables. We show here how such a mechanism leads to a nonthermal distribution of filament curvatures with a non-Gaussian shape. The experimental curvature statistics and filament relaxation dynamics are reproduced quantitatively by stochastic computer simulations and a simple kinetic model.

  5. Secure TRNG with random phase stimulation

    NASA Astrophysics Data System (ADS)

    Wieczorek, Piotr Z.

    2017-08-01

    In this paper a novel TRNG concept is proposed; such generators are a vital part of cryptographic systems. The proposed TRNG exploits the phase variability of a pair of ring oscillators (ROs) to force multiple metastable events in a flip-flop (FF). In this solution, the ROs are periodically activated to ensure violation of the FF timing and the resulting randomness of its state, while the TRNG circuit adapts the structure of the ROs to obtain maximum entropy and circuit security. The TRNG can be implemented in inexpensive re-programmable devices (CPLDs or FPGAs) without the use of Digital Clock Managers (DCMs). Preliminary test results proved the circuit's immunity to intentional frequency injection attacks.

  6. Discriminative parameter estimation for random walks segmentation.

    PubMed

    Baudin, Pierre-Yves; Goodman, Danny; Kumar, Puneet; Azzabou, Noura; Carlier, Pierre G; Paragios, Nikos; Kumar, M Pawan

    2013-01-01

    The Random Walks (RW) algorithm is one of the most efficient and easy-to-use probabilistic segmentation methods. By combining contrast terms with prior terms, it provides accurate segmentations of medical images in a fully automated manner. However, one of the main drawbacks of using the RW algorithm is that its parameters have to be hand-tuned. We propose a novel discriminative learning framework that estimates the parameters using a training dataset. The main challenge we face is that the training samples are not fully supervised. Specifically, they provide a hard segmentation of the images, instead of a probabilistic segmentation. We overcome this challenge by treating the optimal probabilistic segmentation that is compatible with the given hard segmentation as a latent variable. This allows us to employ the latent support vector machine formulation for parameter estimation. We show that our approach significantly outperforms the baseline methods on a challenging dataset consisting of real clinical 3D MRI volumes of skeletal muscles.

  7. Random bursts determine dynamics of active filaments

    PubMed Central

    Weber, Christoph A.; Suzuki, Ryo; Schaller, Volker; Aranson, Igor S.; Bausch, Andreas R.; Frey, Erwin

    2015-01-01

    Constituents of living or synthetic active matter have access to a local energy supply that serves to keep the system out of thermal equilibrium. The statistical properties of such fluctuating active systems differ from those of their equilibrium counterparts. Using the actin filament gliding assay as a model, we studied how nonthermal distributions emerge in active matter. We found that the basic mechanism involves the interplay between local and random injection of energy, acting as an analog of a thermal heat bath, and nonequilibrium energy dissipation processes associated with sudden jump-like changes in the system’s dynamic variables. We show here how such a mechanism leads to a nonthermal distribution of filament curvatures with a non-Gaussian shape. The experimental curvature statistics and filament relaxation dynamics are reproduced quantitatively by stochastic computer simulations and a simple kinetic model. PMID:26261319

  8. Entropy as a collective variable

    NASA Astrophysics Data System (ADS)

    Parrinello, Michele

    Sampling complex free energy surfaces that exhibit long-lived metastable states separated by kinetic bottlenecks is one of the most pressing issues in the atomistic simulation of matter. Not surprisingly, many solutions to this problem have been suggested. Many of them are based on the identification of appropriate collective variables that span the manifold of the slowly varying modes of the system. While much effort has been put into devising, and even constructing on the fly, appropriate collective variables, there is still a cogent need for simple, generic, physically transparent, and yet effective collective variables. Motivated by the physical observation that in many cases transitions between one metastable state and another result from a trade-off between enthalpy and entropy, we introduce collective variables that are able to represent these two physical properties in a simple way. We use these variables in the context of the recently introduced variationally enhanced sampling and apply them with success to the simulation of crystallization from the liquid and to conformational transitions in proteins.

  9. Variable sensory perception in autism.

    PubMed

    Haigh, Sarah M

    2018-03-01

    Autism is associated with sensory and cognitive abnormalities. Individuals with autism generally show normal or superior early sensory processing abilities compared to healthy controls, but deficits in complex sensory processing. In the current opinion paper, it will be argued that sensory abnormalities impact cognition by limiting the amount of signal that can be used to interpret and interact with the environment. There is a growing body of literature showing that individuals with autism exhibit greater trial-to-trial variability in behavioural and cortical sensory responses. If multiple sensory signals that are highly variable are added together to process more complex sensory stimuli, then this might destabilise later perception and impair cognition. Methods to improve sensory processing have shown improvements in more general cognition. Studies that specifically investigate differences in sensory trial-to-trial variability in autism, and the potential changes in variability before and after treatment, could ascertain whether trial-to-trial variability is a good mechanism to target for treatment in autism. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  10. Polycyclic Aromatic Hydrocarbons in Residential Dust: Sources of Variability

    PubMed Central

    Metayer, Catherine; Petreas, Myrto; Does, Monique; Buffler, Patricia A.; Rappaport, Stephen M.

    2013-01-01

    Background: There is interest in using residential dust to estimate human exposure to environmental contaminants. Objectives: We aimed to characterize the sources of variability for polycyclic aromatic hydrocarbons (PAHs) in residential dust and provide guidance for investigators who plan to use residential dust to assess exposure to PAHs. Methods: We collected repeat dust samples from 293 households in the Northern California Childhood Leukemia Study during two sampling rounds (from 2001 through 2007 and during 2010) using household vacuum cleaners, and measured 12 PAHs using gas chromatography–mass spectrometry. We used a random- and a mixed-effects model for each PAH to apportion observed variance into four components and to identify sources of variability. Results: Median concentrations for individual PAHs ranged from 10 to 190 ng/g of dust. For each PAH, total variance was apportioned into regional variability (1–9%), intraregional between-household variability (24–48%), within-household variability over time (41–57%), and within-sample analytical variability (2–33%). Regional differences in PAH dust levels were associated with estimated ambient air concentrations of PAH. Intraregional differences between households were associated with the residential construction date and the smoking habits of residents. For some PAHs, a decreasing time trend explained a modest fraction of the within-household variability; however, most of the within-household variability was unaccounted for by our mixed-effects models. Within-household differences between sampling rounds were largest when the interval between dust sample collections was at least 6 years in duration. Conclusions: Our findings indicate that it may be feasible to use residential dust for retrospective assessment of PAH exposures in studies of health effects. PMID:23461863

  11. Variable focal length deformable mirror

    DOEpatents

    Headley, Daniel [Albuquerque, NM; Ramsey, Marc [Albuquerque, NM; Schwarz, Jens [Albuquerque, NM

    2007-06-12

    A variable focal length deformable mirror has an inner ring and an outer ring that simply support and push axially on opposite sides of a mirror plate. The resulting variable clamping force deforms the mirror plate to provide a parabolic mirror shape. The rings are parallel planar sections of a single paraboloid and can provide an on-axis focus, if the rings are circular, or an off-axis focus, if the rings are elliptical. The focal length of the deformable mirror can be varied by changing the variable clamping force. The deformable mirror can generally be used in any application requiring the focusing or defocusing of light, including with both coherent and incoherent light sources.

  12. Variable geometry Darrieus wind machine

    NASA Astrophysics Data System (ADS)

    Pytlinski, J. T.; Serrano, D.

    1983-08-01

    A variable geometry Darrieus wind machine is proposed. The lower attachment of the blades to the rotor can move freely up and down the axle, allowing the blades to change shape during rotation. Experimental data for a 17 m diameter Darrieus rotor and a theoretical model for multiple streamtube performance prediction were used to develop a computer simulation program for studying parameters that affect the machine's performance. This new variable geometry concept is described and interrelated with multiple streamtube theory through aerodynamic parameters. The computer simulation study shows that governor behavior of a Darrieus turbine cannot be attained by a standard turbine operating within normally occurring rotational velocity limits. A second generation variable geometry Darrieus wind turbine which uses a telescopic blade is proposed as a potential improvement on the studied concept.

  13. Integrating Variable Renewable Energy - Russia

    SciT

    To foster sustainable, low-emission development, many countries are establishing ambitious renewable energy targets for their electricity supply. Because solar and wind tend to be more variable and uncertain than conventional sources, meeting these targets will involve changes to power system planning and operations. Grid integration is the practice of developing efficient ways to deliver variable renewable energy (VRE) to the grid. Good integration methods maximize the cost-effectiveness of incorporating VRE into the power system while maintaining or increasing system stability and reliability. When considering grid integration, policy makers, regulators, and system operators consider a variety of issues, which can be organized into four broad topics: New Renewable Energy Generation, New Transmission, Increased System Flexibility, Planning for a High RE Future. This is a Russian-language translation of Integrating Variable Renewable Energy into the Grid: Key Issues, Greening the Grid, originally published in English in May 2015.

  14. Progress with variable cycle engines

    NASA Technical Reports Server (NTRS)

    Westmoreland, J. S.

    1980-01-01

    The evaluation of components of an advanced propulsion system for a future supersonic cruise vehicle is discussed. These components, a high performance duct burner for thrust augmentation and a low jet noise coannular exhaust nozzle, are part of the variable stream control engine. An experimental test program involving both isolated component and complete engine tests was conducted for the high performance, low emissions duct burner, with excellent results. Nozzle model tests were completed which substantiate the inherent jet noise benefit associated with the unique velocity profile made possible by a coannular exhaust nozzle system on a variable stream control engine. Additional nozzle model performance tests have established high thrust efficiency levels at takeoff and supersonic cruise for this nozzle system. Large scale testing of these two critical components is conducted using an F100 engine as the testbed for simulating the variable stream control engine.

  15. Relation between random walks and quantum walks

    NASA Astrophysics Data System (ADS)

    Boettcher, Stefan; Falkner, Stefan; Portugal, Renato

    2015-05-01

    Based on studies of four specific networks, we conjecture a general relation between the walk dimensions dw of discrete-time random walks and quantum walks with the (self-inverse) Grover coin. In each case, we find that dw of the quantum walk takes on exactly half the value found for the classical random walk on the same geometry. Since walks on homogeneous lattices satisfy this relation trivially, our results for heterogeneous networks suggest that such a relation holds irrespective of whether translational invariance is maintained or not. To develop our results, we extend the renormalization-group analysis (RG) of the stochastic master equation to one with a unitary propagator. As in the classical case, the solution ρ(x, t) in space and time of this quantum-walk equation exhibits a scaling collapse for the variable x^dw/t in the weak limit, which defines dw and illuminates fundamental aspects of the walk dynamics, e.g., its mean-square displacement. We confirm the collapse for ρ(x, t) in each case with extensive numerical simulation. The exact values for dw themselves demonstrate that RG is a powerful complementary approach to study the asymptotics of quantum walks that weak-limit theorems have not been able to access, such as for systems lacking translational symmetries beyond simple trees.

  16. Adapted random sampling patterns for accelerated MRI.

    PubMed

    Knoll, Florian; Clason, Christian; Diwoky, Clemens; Stollberger, Rudolf

    2011-02-01

    Variable density random sampling patterns have recently become increasingly popular for accelerated imaging strategies, as they lead to incoherent aliasing artifacts. However, the design of these sampling patterns is still an open problem. Current strategies use model assumptions like polynomials of different order to generate a probability density function that is then used to generate the sampling pattern. This approach relies on the optimization of design parameters which is very time consuming and therefore impractical for daily clinical use. This work presents a new approach that generates sampling patterns by making use of power spectra of existing reference data sets and hence requires neither parameter tuning nor an a priori mathematical model of the density of sampling points. The approach is validated with downsampling experiments, as well as with accelerated in vivo measurements. The proposed approach is compared with established sampling patterns, and the generalization potential is tested by using a range of reference images. Quantitative evaluation is performed for the downsampling experiments using RMS differences to the original, fully sampled data set. Our results demonstrate that the image quality of the method presented in this paper is comparable to that of an established model-based strategy when optimization of the model parameter is carried out and yields superior results to non-optimized model parameters. However, no random sampling pattern showed superior performance when compared to conventional Cartesian subsampling for the considered reconstruction strategy.
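    A minimal sketch of the idea in Python, assuming the sampling density is taken directly from the normalized power spectrum of a reference image; the function name and sizes are illustrative only.

    ```python
    import numpy as np

    def adapted_mask(reference_image, n_samples, rng=np.random.default_rng(0)):
        """Draw a random k-space sampling mask from a reference power spectrum."""
        spectrum = np.abs(np.fft.fftshift(np.fft.fft2(reference_image))) ** 2
        pdf = spectrum / spectrum.sum()              # empirical sampling density
        idx = rng.choice(pdf.size, size=n_samples, replace=False, p=pdf.ravel())
        mask = np.zeros(pdf.size, dtype=bool)
        mask[idx] = True                             # True = acquire this k-space point
        return mask.reshape(pdf.shape)

    ref = np.random.default_rng(1).normal(size=(128, 128))  # placeholder reference
    mask = adapted_mask(ref, n_samples=128 * 128 // 4)      # 4x undersampling
    ```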

  17. Systematic random sampling of the comet assay.

    PubMed

    McArt, Darragh G; Wasson, Gillian R; McKerr, George; Saetzler, Kurt; Reed, Matt; Howard, C Vyvyan

    2009-07-01

    The comet assay is a technique used to quantify DNA damage and repair at a cellular level. In the assay, cells are embedded in agarose and the cellular content is stripped away, leaving only the DNA trapped in an agarose cavity which can then be electrophoresed. The damaged DNA can enter the agarose and migrate while the undamaged DNA cannot and is retained. DNA damage is measured as the proportion of the migratory 'tail' DNA compared to the total DNA in the cell. The fundamental basis of these arbitrary values is obtained in the comet acquisition phase using fluorescence microscopy with a stoichiometric stain in tandem with image analysis software. Comet selection in such an acquisition is expected to be both objective and random. In this paper we examine the 'randomness' of the acquisition phase and suggest an alternative method that offers objective and unbiased comet selection. To achieve this, we have adopted a survey sampling approach widely used in stereology, which offers a method of systematic random sampling (SRS). This is desirable as it offers an impartial and reproducible method of comet analysis that can be used either manually or automated. By making use of an unbiased sampling frame and using microscope verniers, we are able to increase the precision of estimates of DNA damage. Results obtained from a multiple-user pooled variation experiment showed that the SRS technique attained a lower variability than the traditional approach. The analysis of a single user with repetition showed greater individual variances while not being detrimental to overall averages. This suggests that the SRS method offers a better reflection of DNA damage for a given slide and also offers better user reproducibility.
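    In its simplest form the stereology-style systematic random sampling advocated here reduces to one random start plus a fixed step through the sampling frame; a minimal Python sketch with a placeholder frame:

    ```python
    import random

    def systematic_random_sample(frame, n):
        """Pick n items: uniform random start, then a constant step."""
        step = len(frame) / n
        start = random.uniform(0, step)
        return [frame[int(start + i * step)] for i in range(n)]

    comet_positions = list(range(1000))   # placeholder frame of comet locations
    selected = systematic_random_sample(comet_positions, 50)
    ```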

  18. Groundwater withdrawal in randomly heterogeneous coastal aquifers

    NASA Astrophysics Data System (ADS)

    Siena, Martina; Riva, Monica

    2018-05-01

    We analyze the combined effects of aquifer heterogeneity and pumping operations on seawater intrusion (SWI), a phenomenon which is threatening coastal aquifers worldwide. Our investigation is set within a probabilistic framework and relies on a numerical Monte Carlo approach targeting transient variable-density flow and solute transport in a three-dimensional randomly heterogeneous porous domain. The geological setting is patterned after the Argentona river basin, in the Maresme region of Catalonia (Spain). Our numerical study is concerned with exploring the effects of (a) random heterogeneity of the domain on SWI in combination with (b) a variety of groundwater withdrawal schemes. The latter have been designed by varying the screen location along the vertical direction and the distance of the wellbore from the coastline and from the location of the freshwater-saltwater mixing zone which is in place prior to pumping. For each random realization of the aquifer permeability field and for each pumping scheme, a quantitative depiction of SWI phenomena is inferred from an original set of metrics characterizing (a) the inland penetration of the saltwater wedge and (b) the width of the mixing zone across the whole three-dimensional system. Our results indicate that the stochastic nature of the system heterogeneity significantly affects the statistical description of the main features of the seawater wedge either in the presence or in the absence of pumping, yielding a general reduction of toe penetration and an increase of the width of the mixing zone. Simultaneous extraction of fresh and saltwater from two screens along the same wellbore located, prior to pumping, within the freshwater-saltwater mixing zone is effective in limiting SWI in the context of groundwater resources exploitation.

  19. Optimal partitioning of random programs across two processors

    NASA Technical Reports Server (NTRS)

    Nicol, D. M.

    1986-01-01

    The optimal partitioning of random distributed programs is discussed. It is concluded that the optimal partitioning of a homogeneous random program over a homogeneous distributed system either assigns all modules to a single processor, or distributes the modules as evenly as possible among all processors. The analysis rests heavily on the approximation which equates the expected maximum of a set of independent random variables with the set's maximum expectation. The results are strengthened by providing an approximation-free proof of this result for two processors under general conditions on the module execution time distribution. It is also shown that use of this approximation causes two of the previous central results to be false.
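    A quick numeric illustration of the approximation in question, E[max(X1, X2)] ≈ max(E[X1], E[X2]), here with two independent exponentials (a sketch; the report works with module execution times). The approximation always errs on the low side, since E[max] is at least the maximum of the expectations.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    x1 = rng.exponential(scale=1.0, size=1_000_000)  # E[X1] = 1.0
    x2 = rng.exponential(scale=1.5, size=1_000_000)  # E[X2] = 1.5

    print(np.maximum(x1, x2).mean())  # true E[max] = 1.0 + 1.5 - 1/(1 + 1/1.5) = 1.9
    print(max(x1.mean(), x2.mean()))  # approximation: 1.5
    ```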

  20. Random mechanics: Nonlinear vibrations, turbulences, seisms, swells, fatigue

    NASA Astrophysics Data System (ADS)

    Kree, P.; Soize, C.

    The random modeling of physical phenomena, together with probabilistic methods for the numerical calculation of random mechanical forces, are analytically explored. Attention is given to theoretical examinations such as probabilistic concepts, linear filtering techniques, and trajectory statistics. Applications of the methods to structures experiencing atmospheric turbulence, the quantification of turbulence, and the dynamic responses of the structures are considered. A probabilistic approach is taken to study the effects of earthquakes on structures and to the forces exerted by ocean waves on marine structures. Theoretical analyses by means of vector spaces and stochastic modeling are reviewed, as are Markovian formulations of Gaussian processes and the definition of stochastic differential equations. Finally, random vibrations with a variable number of links and linear oscillators undergoing the square of Gaussian processes are investigated.

  1. Regularization of the big bang singularity with random perturbations

    NASA Astrophysics Data System (ADS)

    Belbruno, Edward; Xue, BingKan

    2018-03-01

    We show how to regularize the big bang singularity in the presence of random perturbations modeled by Brownian motion using stochastic methods. We prove that the physical variables in a contracting universe dominated by a scalar field can be continuously and uniquely extended through the big bang as a function of time to an expanding universe only for a discrete set of values of the equation of state satisfying special co-prime number conditions. This result significantly generalizes a previous result (Xue and Belbruno 2014 Class. Quantum Grav. 31 165002) that did not model random perturbations. This result implies that the extension from a contracting to an expanding universe for this discrete set of co-prime equations of state is robust, which is surprising. Implications for a purely expanding universe are discussed, such as a non-smooth, randomly varying scale factor near the big bang.

  2. Parameter identification using a creeping-random-search algorithm

    NASA Technical Reports Server (NTRS)

    Parrish, R. V.

    1971-01-01

    A creeping-random-search algorithm is applied to different types of problems in the field of parameter identification. The studies are intended to demonstrate that a random-search algorithm can be applied successfully to these various problems, which often cannot be handled by conventional deterministic methods, and, also, to introduce methods that speed convergence to an extremal of the problem under investigation. Six two-parameter identification problems with analytic solutions are solved, and two application problems are discussed in some detail. Results of the study show that a modified version of the basic creeping-random-search algorithm chosen does speed convergence in comparison with the unmodified version. The results also show that the algorithm can successfully solve problems that contain limits on state or control variables, inequality constraints (both independent and dependent, and linear and nonlinear), or stochastic models.
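    A minimal creeping-random-search loop in Python; the step-contraction rule shown is one plausible convergence-speeding modification and not necessarily the variant studied in the report.

    ```python
    import numpy as np

    def creeping_random_search(cost, x0, sigma=1.0, iters=5000, shrink=0.99,
                               rng=np.random.default_rng(0)):
        """Perturb the current best point; keep improvements, shrink on failure."""
        x = np.asarray(x0, dtype=float)
        fx = cost(x)
        for _ in range(iters):
            candidate = x + rng.normal(scale=sigma, size=x.shape)
            fc = cost(candidate)
            if fc < fx:
                x, fx = candidate, fc   # creep toward the improvement
            else:
                sigma *= shrink         # contract the search neighbourhood
        return x, fx

    # Two-parameter identification toy problem: recover (a, b) from a quadratic cost.
    truth = np.array([2.0, -1.0])
    cost = lambda p: float(np.sum((p - truth) ** 2))
    print(creeping_random_search(cost, x0=[0.0, 0.0]))
    ```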

  3. Multi-Agent Methods for the Configuration of Random Nanocomputers

    NASA Technical Reports Server (NTRS)

    Lawson, John W.

    2004-01-01

    As computational devices continue to shrink, the cost of manufacturing such devices is expected to grow exponentially. One alternative to the costly, detailed design and assembly of conventional computers is to place the nano-electronic components randomly on a chip. The price for such a trivial assembly process is that the resulting chip would not be programmable by conventional means. In this work, we show that such random nanocomputers can be adaptively programmed using multi-agent methods. This is accomplished through the optimization of an associated high dimensional error function. By representing each of the independent variables as a reinforcement learning agent, we are able to achieve convergence much faster than with other methods, including simulated annealing. Standard combinational logic circuits such as adders and multipliers are implemented in a straightforward manner. In addition, we show that the intrinsic flexibility of these adaptive methods allows the random computers to be reconfigured easily, making them reusable. Recovery from faults is also demonstrated.

  4. Titan's highly variable plasma environment

    NASA Astrophysics Data System (ADS)

    Wolf, D. A.; Neubauer, F. M.

    1982-02-01

    It is noted that Titan's plasma environment is variable for two reasons. The variability of the solar wind is such that Titan may be located in the outer magnetosphere, the magnetosheath, or the interplanetary medium around noon Saturnian local time. What is more, there are local time variations in Saturn's magnetosphere. The location of the stagnation point of Saturn's magnetosphere is calculated, assuming a terrestrial type magnetosphere. Characteristic plasma parameters along the orbit of Titan are shown for high solar wind pressure. During crossings of the Saturnian magnetopause or bow shock by Titan, abrupt changes in the flow direction and stagnation pressure are expected, as are rapid associated changes in Titan's uppermost atmosphere.

  5. Variable conductance heat pipe technology

    NASA Technical Reports Server (NTRS)

    Marcus, B. D.; Edwards, D. K.; Anderson, W. T.

    1973-01-01

    Research and development programs in variable conductance heat pipe technology were conducted. The treatment has been comprehensive, involving theoretical and/or experimental studies in hydrostatics, hydrodynamics, heat transfer into and out of the pipe, fluid selection, and materials compatibility, in addition to the principal subject of variable conductance control techniques. Efforts were not limited to analytical work and laboratory experimentation, but extended to the development, fabrication and test of spacecraft hardware, culminating in the successful flight of the Ames Heat Pipe Experiment on the OAO-C spacecraft.

  6. Variable Selection in Logistic Regression.

    DTIC Science & Technology

    1987-06-01

    Z. D. Bai, P. R. Krishnaiah and L. C. Zhao, Center for Multivariate Analysis, University of Pittsburgh; Contract F49620-85-C-0008. (Only these bibliographic details are recoverable from the scanned report cover.)

  7. Relevance of anisotropy and spatial variability of gas diffusivity for soil-gas transport

    NASA Astrophysics Data System (ADS)

    Schack-Kirchner, Helmer; Kühne, Anke; Lang, Friederike

    2017-04-01

    Models of soil gas transport generally consider neither direction dependence of gas diffusivity nor its small-scale variability. However, in a recent study, we provided evidence for anisotropy favouring vertical gas diffusion in natural soils. We hypothesize that gas transport models based on gas diffusion data measured with soil rings are strongly influenced by both anisotropy and spatial variability, and that the use of averaged diffusivities could be misleading. To test this, we used a two-dimensional model of soil gas transport under compacted wheel tracks to model the soil-air oxygen distribution in the soil. The model was parametrized with data obtained from soil-ring measurements, using both their central tendency and their variability. The model includes vertical parameter variability as well as variation perpendicular to the elongated wheel track. Three parametrization types were tested: (i) averaged values for the wheel track and the undisturbed soil; (ii) randomly distributed soil cells with normally distributed variability within the strata; and (iii) randomly distributed soil cells with uniformly distributed variability within the strata. Each of the three types was tested for (j) isotropic gas diffusivity and (jj) horizontal gas diffusivity reduced by a constant factor, yielding six models in total. As expected, the different parametrizations had an important influence on the aeration state under wheel tracks, with the strongest oxygen depletion in the case of uniformly distributed variability and anisotropy towards higher vertical diffusivity. The simple simulation approach clearly showed the relevance of anisotropy and spatial variability in the case of identical central-tendency measures of gas diffusivity. However, it did not yet consider spatial dependency of the variability, which could aggravate the effects even further. To consider anisotropy and spatial variability in gas transport models we recommend a) to measure soil-gas transport parameters

  8. Random effects coefficient of determination for mixed and meta-analysis models

    PubMed Central

    Demidenko, Eugene; Sargent, James; Onega, Tracy

    2011-01-01

    The key feature of a mixed model is the presence of random effects. We have developed a coefficient, called the random effects coefficient of determination, R_r^2, that estimates the proportion of the conditional variance of the dependent variable explained by random effects. This coefficient takes values from 0 to 1 and indicates how strong the random effects are. The difference from the earlier suggested fixed effects coefficient of determination is emphasized. If R_r^2 is close to 0, there is weak support for random effects in the model because the reduction of the variance of the dependent variable due to random effects is small; consequently, random effects may be ignored and the model simplifies to standard linear regression. A value of R_r^2 away from 0 indicates evidence of variance reduction in support of the mixed model. If the random effects coefficient of determination is close to 1, the variance of random effects is very large and random effects turn into free fixed effects; the model can then be estimated using the dummy variable approach. We derive explicit formulas for R_r^2 in three special cases: the random intercept model, the growth curve model, and the meta-analysis model. Theoretical results are illustrated with three mixed model examples: (1) travel time to the nearest cancer center for women with breast cancer in the U.S., (2) cumulative time watching alcohol related scenes in movies among young U.S. teens, as a risk factor for early drinking onset, and (3) the classic example of the meta-analysis model for combination of 13 studies on tuberculosis vaccine. PMID:23750070
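    For the random intercept model, the proportion being estimated plausibly takes the familiar variance-ratio form sketched below; this is an illustration under simplifying assumptions, not necessarily the exact estimator derived in the paper.

    ```latex
    % Random intercept model: y_{ij} = x_{ij}'\beta + b_i + e_{ij},
    % with b_i ~ N(0, \sigma_b^2) and e_{ij} ~ N(0, \sigma^2).
    % Share of the conditional variance attributable to the random effects:
    \[
      R_r^2 \approx \frac{\sigma_b^2}{\sigma_b^2 + \sigma^2}
    \]
    % -> 0: random effects negligible, ordinary regression suffices;
    % -> 1: random intercepts dominate and act like free fixed effects.
    ```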

  9. Random effects coefficient of determination for mixed and meta-analysis models.

    PubMed

    Demidenko, Eugene; Sargent, James; Onega, Tracy

    2012-01-01

    The key feature of a mixed model is the presence of random effects. We have developed a coefficient, called the random effects coefficient of determination, R_r^2, that estimates the proportion of the conditional variance of the dependent variable explained by random effects. This coefficient takes values from 0 to 1 and indicates how strong the random effects are. The difference from the earlier suggested fixed effects coefficient of determination is emphasized. If R_r^2 is close to 0, there is weak support for random effects in the model because the reduction of the variance of the dependent variable due to random effects is small; consequently, random effects may be ignored and the model simplifies to standard linear regression. A value of R_r^2 away from 0 indicates evidence of variance reduction in support of the mixed model. If the random effects coefficient of determination is close to 1, the variance of random effects is very large and random effects turn into free fixed effects; the model can then be estimated using the dummy variable approach. We derive explicit formulas for R_r^2 in three special cases: the random intercept model, the growth curve model, and the meta-analysis model. Theoretical results are illustrated with three mixed model examples: (1) travel time to the nearest cancer center for women with breast cancer in the U.S., (2) cumulative time watching alcohol related scenes in movies among young U.S. teens, as a risk factor for early drinking onset, and (3) the classic example of the meta-analysis model for combination of 13 studies on tuberculosis vaccine.

  10. Latent class instrumental variables: a clinical and biostatistical perspective.

    PubMed

    Baker, Stuart G; Kramer, Barnett S; Lindeman, Karen S

    2016-01-15

    In some two-arm randomized trials, some participants receive the treatment assigned to the other arm as a result of technical problems, refusal of a treatment invitation, or a choice of treatment in an encouragement design. In some before-and-after studies, the availability of a new treatment changes from one time period to the next. Under assumptions that are often reasonable, the latent class instrumental variable (IV) method estimates the effect of treatment received in the aforementioned scenarios involving all-or-none compliance and all-or-none availability. Key aspects are four initial latent classes (sometimes called principal strata) based on treatment received if in each randomization group or time period, the exclusion restriction assumption (in which randomization group or time period is an instrumental variable), the monotonicity assumption (which drops an implausible latent class from the analysis), and the estimated effect of receiving treatment in one latent class (sometimes called efficacy, the local average treatment effect, or the complier average causal effect). Since its independent formulations in the biostatistics and econometrics literatures, the latent class IV method (which has no well-established name) has gained increasing popularity. We review the latent class IV method from a clinical and biostatistical perspective, focusing on underlying assumptions, methodological extensions, and applications in our fields of obstetrics and cancer research. Copyright © 2015 John Wiley & Sons, Ltd.
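    The "estimated effect of receiving treatment in one latent class" mentioned above takes the standard instrumental-variable (Wald) form, sketched here with randomization group Z, treatment received D, and outcome Y:

    ```latex
    % Complier average causal effect under the exclusion restriction and
    % monotonicity: the intention-to-treat contrast rescaled by the
    % difference in treatment-receipt rates between randomized groups.
    \[
      \mathrm{CACE} =
      \frac{E[Y \mid Z = 1] - E[Y \mid Z = 0]}
           {E[D \mid Z = 1] - E[D \mid Z = 0]}
    \]
    ```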

  11. Latent class instrumental variables: A clinical and biostatistical perspective

    PubMed Central

    Baker, Stuart G.; Kramer, Barnett S.; Lindeman, Karen S.

    2015-01-01

    In some two-arm randomized trials, some participants receive the treatment assigned to the other arm as a result of technical problems, refusal of a treatment invitation, or a choice of treatment in an encouragement design. In some before-and-after studies, the availability of a new treatment changes from one time period to the next. Under assumptions that are often reasonable, the latent class instrumental variable (IV) method estimates the effect of treatment received in the aforementioned scenarios involving all-or-none compliance and all-or-none availability. Key aspects are four initial latent classes (sometimes called principal strata) based on the treatment that would be received in each randomization group or time period, the exclusion restriction assumption (in which randomization group or time period is an instrumental variable), the monotonicity assumption (which drops an implausible latent class from the analysis), and the estimated effect of receiving treatment in one latent class (sometimes called efficacy, the local average treatment effect, or the complier average causal effect). Since its independent formulations in the biostatistics and econometrics literatures, the latent class IV method (which has no well-established name) has gained increasing popularity. We review the latent class IV method from a clinical and biostatistical perspective, focusing on underlying assumptions, methodological extensions, and applications in our fields of obstetrics and cancer research. PMID:26239275
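
    Under the assumptions listed above, the local average treatment effect reduces to the familiar Wald ratio: the intention-to-treat effect on the outcome divided by the effect of assignment on treatment receipt. A minimal Python sketch, with hypothetical arrays z (randomization arm), d (treatment received) and y (outcome), is:

      import numpy as np

      def cace_wald(z, d, y):
          """Wald/IV estimate of the complier average causal effect.

          z: 0/1 randomization arm, d: 0/1 treatment received, y: outcome."""
          z, d, y = map(np.asarray, (z, d, y))
          itt_y = y[z == 1].mean() - y[z == 0].mean()  # ITT effect on the outcome
          itt_d = d[z == 1].mean() - d[z == 0].mean()  # compliance gap between arms
          return itt_y / itt_d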

  12. Variable Selection in the Presence of Missing Data: Imputation-based Methods.

    PubMed

    Zhao, Yize; Long, Qi

    2017-01-01

    Variable selection plays an essential role in regression analysis: it identifies important variables that are associated with outcomes and is known to improve the predictive accuracy of the resulting models. Variable selection methods have been widely investigated for fully observed data. However, in the presence of missing data, methods for variable selection need to be carefully designed to account for missing data mechanisms and for the statistical techniques used to handle missing data. Since imputation is arguably the most popular method for handling missing data due to its ease of use, statistical methods for variable selection that are combined with imputation are of particular interest. These methods, valid under the assumptions of missing at random (MAR) and missing completely at random (MCAR), largely fall into three general strategies. The first strategy applies existing variable selection methods to each imputed dataset and then combines the variable selection results across all imputed datasets. The second strategy applies existing variable selection methods to stacked imputed datasets. The third strategy combines resampling techniques such as the bootstrap with imputation. Despite recent advances, this area remains under-developed and offers fertile ground for further research.
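
    As an illustration of the first strategy, the sketch below runs a lasso on each imputed dataset and keeps the variables selected in a majority of imputations. The majority-vote combining rule and the helper name are our illustrative choices, not a method prescribed by the paper.

      import numpy as np
      from sklearn.linear_model import LassoCV

      def select_across_imputations(imputed_datasets, threshold=0.5):
          """Run a lasso on each imputed dataset, then keep the variables
          selected in at least `threshold` of the imputations."""
          votes = None
          for X, y in imputed_datasets:  # each element: (n-by-p design matrix, response)
              coefs = LassoCV(cv=5).fit(X, y).coef_
              selected = (coefs != 0).astype(int)
              votes = selected if votes is None else votes + selected
          return np.where(votes >= threshold * len(imputed_datasets))[0]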

  13. Record statistics of a strongly correlated time series: random walks and Lévy flights

    NASA Astrophysics Data System (ADS)

    Godrèche, Claude; Majumdar, Satya N.; Schehr, Grégory

    2017-08-01

    We review recent advances on the record statistics of strongly correlated time series, whose entries denote the positions of a random walk or a Lévy flight on a line. After a brief survey of the theory of records for independent and identically distributed random variables, we focus on random walks. During the last few years, it was indeed realized that random walks are a very useful ‘laboratory’ to test the effects of correlations on the record statistics. We start with the simple one-dimensional random walk with symmetric jumps (both continuous and discrete) and discuss in detail the statistics of the number of records, as well as of the ages of the records, i.e. the lapses of time between two successive record breaking events. Then we review the results that were obtained for a wide variety of random walk models, including random walks with a linear drift, continuous time random walks, constrained random walks (like the random walk bridge) and the case of multiple independent random walkers. Finally, we discuss further observables related to records, like the record increments, as well as some questions raised by physical applications of record statistics, like the effects of measurement error and noise.
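
    The effect of correlations is easy to see numerically. The sketch below counts the records of a symmetric Gaussian random walk; for continuous symmetric jump distributions the mean record number is known to grow like 2*sqrt(n/pi), much faster than the log(n) growth for i.i.d. variables. The simulation setup is our illustration, not code from the review.

      import numpy as np

      rng = np.random.default_rng(0)

      def count_records(steps):
          """Number of records of the walk x_0 = 0, x_k = steps[0] + ... + steps[k-1].

          Counts positions equal to the running maximum; ties have
          probability zero for continuous jump distributions."""
          x = np.concatenate(([0.0], np.cumsum(steps)))
          return int(np.count_nonzero(x == np.maximum.accumulate(x)))

      n, trials = 1000, 2000
      mean_records = np.mean([count_records(rng.standard_normal(n)) for _ in range(trials)])
      print(mean_records, 2 * np.sqrt(n / np.pi))  # simulation vs. asymptotic growth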

  14. Local randomness: Examples and application

    NASA Astrophysics Data System (ADS)

    Fu, Honghao; Miller, Carl A.

    2018-03-01

    When two players achieve a superclassical score at a nonlocal game, their outputs must contain intrinsic randomness. This fact has many useful implications for quantum cryptography. Recently it has been observed [C. Miller and Y. Shi, Quantum Inf. Computat. 17, 0595 (2017)] that such scores also imply the existence of local randomness—that is, randomness known to one player but not to the other. This has potential implications for cryptographic tasks between two cooperating but mistrustful players. In the current paper we bring this notion toward practical realization, by offering near-optimal bounds on local randomness for the CHSH game, and also proving the security of a cryptographic application of local randomness (single-bit certified deletion).
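
    For orientation, the CHSH game referred to here is easy to state in code: on uniformly random input bits (x, y), the players win when the XOR of their answers equals x AND y. The sketch below scores deterministic classical strategies, whose maximum winning probability of 0.75 is the threshold a "superclassical" score must exceed (the quantum optimum is cos^2(pi/8) ≈ 0.854). This is background illustration, not the paper's bounds.

      from itertools import product

      def chsh_score(a_strategy, b_strategy):
          """Winning probability of deterministic strategies a(x), b(y)
          under uniform inputs: win iff a(x) XOR b(y) == x AND y."""
          wins = sum((a_strategy[x] ^ b_strategy[y]) == (x & y)
                     for x, y in product((0, 1), repeat=2))
          return wins / 4

      best = max(chsh_score(a, b)
                 for a in product((0, 1), repeat=2)
                 for b in product((0, 1), repeat=2))
      print(best)  # 0.75, the classical bound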

  15. On Edge Exchangeable Random Graphs

    NASA Astrophysics Data System (ADS)

    Janson, Svante

    2017-06-01

    We study a recent model for edge exchangeable random graphs introduced by Crane and Dempsey; in particular we study asymptotic properties of the random simple graph obtained by merging multiple edges. We study a number of examples, and show that the model can produce dense, sparse and extremely sparse random graphs. One example yields a power-law degree distribution. We give some examples where the random graph is dense and converges a.s. in the sense of graph limit theory, but also an example where a.s. every graph limit is the limit of some subsequence. Another example is sparse and yields convergence to a non-integrable generalized graphon defined on (0,∞).

  16. Cluster randomization and political philosophy.

    PubMed

    Chwang, Eric

    2012-11-01

    In this paper, I will argue that, while the ethical issues raised by cluster randomization can be challenging, they are not new. My thesis divides neatly into two parts. In the first, easier part I argue that many of the ethical challenges posed by cluster randomized human subjects research are clearly present in other types of human subjects research, and so are not novel. In the second, more difficult part I discuss the thorniest ethical challenge for cluster randomized research--cases where consent is genuinely impractical to obtain. I argue that once again these cases require no new analytic insight; instead, we should look to political philosophy for guidance. In other words, the most serious ethical problem that arises in cluster randomized research also arises in political philosophy. © 2011 Blackwell Publishing Ltd.

  17. Quantifying randomness in real networks

    NASA Astrophysics Data System (ADS)

    Orsini, Chiara; Dankulov, Marija M.; Colomer-de-Simón, Pol; Jamakovic, Almerima; Mahadevan, Priya; Vahdat, Amin; Bassler, Kevin E.; Toroczkai, Zoltán; Boguñá, Marián; Caldarelli, Guido; Fortunato, Santo; Krioukov, Dmitri

    2015-10-01

    Represented as graphs, real networks are intricate combinations of order and disorder. Fixing some of the structural properties of network models to their values observed in real networks, many other properties appear as statistical consequences of these fixed observables, plus randomness in other respects. Here we employ the dk-series, a complete set of basic characteristics of the network structure, to study the statistical dependencies between different network properties. We consider six real networks--the Internet, US airport network, human protein interactions, technosocial web of trust, English word network, and an fMRI map of the human brain--and find that many important local and global structural properties of these networks are closely reproduced by dk-random graphs whose degree distributions, degree correlations and clustering are as in the corresponding real network. We discuss important conceptual, methodological, and practical implications of this evaluation of network randomness, and release software to generate dk-random graphs.

  18. ADAPTIVE MATCHING IN RANDOMIZED TRIALS AND OBSERVATIONAL STUDIES

    PubMed Central

    van der Laan, Mark J.; Balzer, Laura B.; Petersen, Maya L.

    2014-01-01

    In many randomized and observational studies the allocation of treatment among a sample of n independent and identically distributed units is a function of the covariates of all sampled units. As a result, the treatment labels among the units are possibly dependent, complicating estimation and posing challenges for statistical inference. For example, cluster randomized trials frequently sample communities from some target population, construct matched pairs of communities from those included in the sample based on some metric of similarity in baseline community characteristics, and then randomly allocate a treatment and a control intervention within each matched pair. In this case, the observed data can neither be represented as the realization of n independent random variables, nor, contrary to current practice, as the realization of n/2 independent random variables (treating the matched pair as the independent sampling unit). In this paper we study estimation of the average causal effect of a treatment under experimental designs in which treatment allocation potentially depends on the pre-intervention covariates of all units included in the sample. We define efficient targeted minimum loss based estimators for this general design, present a theorem that establishes the desired asymptotic normality of these estimators and allows for asymptotically valid statistical inference, and discuss implementation of these estimators. We further investigate the relative asymptotic efficiency of this design compared with a design in which unit-specific treatment assignment depends only on the units’ covariates. Our findings have practical implications for the optimal design and analysis of pair-matched cluster randomized trials, as well as for observational studies in which treatment decisions may depend on characteristics of the entire sample. PMID:25097298

  19. Random sequential adsorption of cubes

    NASA Astrophysics Data System (ADS)

    Cieśla, Michał; Kubala, Piotr

    2018-01-01

    Random packings built of cubes are studied numerically using a random sequential adsorption algorithm. To compare the obtained results with previous reports, three different models of cube orientation sampling were used. Also, three different cube-cube intersection algorithms were tested to find the most efficient one. The study focuses on the mean saturated packing fraction as well as kinetics of packing growth. Microstructural properties of packings were analyzed using density autocorrelation function.
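
    A two-dimensional analogue conveys the flavour of the algorithm. The sketch below is our illustration (the paper studies oriented cubes in three dimensions): it deposits axis-aligned squares at uniformly random positions, rejects overlaps, and stops after a long run of consecutive failures as a crude proxy for saturation.

      import numpy as np

      rng = np.random.default_rng(1)

      def rsa_squares(box=1.0, side=0.05, max_failures=10_000):
          """Random sequential adsorption of axis-aligned squares in a box."""
          centers, failures = [], 0
          while failures < max_failures:
              p = rng.uniform(side / 2, box - side / 2, size=2)
              # two axis-aligned squares of equal side overlap iff both
              # centre offsets are smaller than one side length
              if all(np.max(np.abs(p - q)) >= side for q in centers):
                  centers.append(p)
                  failures = 0
              else:
                  failures += 1
          return np.array(centers)

      squares = rsa_squares()
      print("packing fraction ~", len(squares) * 0.05 ** 2 / 1.0 ** 2)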

  20. Staggered chiral random matrix theory

    SciT

    Osborn, James C.

    2011-02-01

    We present a random matrix theory for the staggered lattice QCD Dirac operator. The staggered random matrix theory is equivalent to the zero-momentum limit of the staggered chiral Lagrangian and includes all taste breaking terms at their leading order. This is an extension of previous work which only included some of the taste breaking terms. We will also present some results for the taste breaking contributions to the partition function and the Dirac eigenvalues.

  1. On Pfaffian Random Point Fields

    NASA Astrophysics Data System (ADS)

    Kargin, V.

    2014-02-01

    We study Pfaffian random point fields by using the Moore-Dyson quaternion determinants. First, we give sufficient conditions that ensure that a self-dual quaternion kernel defines a valid random point field, and then we prove a CLT for Pfaffian point fields. The proofs are based on a new quaternion extension of the Cauchy-Binet determinantal identity. In addition, we derive the Fredholm determinantal formulas for the Pfaffian point fields which use the quaternion determinant.

  2. Climate Variability and Ecosystem Response

    David Greenland; Lloyd W. Swift; [Editors]

    1990-01-01

    Nine papers describe studies of climate variability and ecosystem response. The studies were conducted at LTER (Long-Term Ecological Research) sites representing forest, agricultural, and aquatic ecosystems and systems in which extreme climates limit vegetational cover. An overview paper prepared by the LTER Climate Committee stresses the importance of (1) clear...

  3. City scale pollen concentration variability

    NASA Astrophysics Data System (ADS)

    van der Molen, Michiel; van Vliet, Arnold; Krol, Maarten

    2016-04-01

    Pollen are emitted into the atmosphere both in the countryside and in cities, yet the majority of the population is exposed to pollen in cities. Allergic reactions may be induced by short-term exposure to pollen. This raises the questions of how variable pollen concentrations in cities are, both temporally and spatially, and how much of the pollen in cities is actually produced in the urban region itself. We built a high resolution (1 × 1 km) pollen dispersion model based on WRF-Chem to study a city's pollen budget and the spatial and temporal variability in concentration. It shows that the concentrations are highly variable as a result of source distribution, wind direction and boundary layer mixing, as well as the release rate as a function of temperature, turbulence intensity and humidity. Hay fever forecasts based on such high resolution emission and physical dispersion modelling surpass traditional hay fever warnings based on temperature-sum methods. The model gives new insights into concentration variability, personal and community level exposure, and prevention. The model will be developed into a new forecast tool to help allergic people minimize their exposure and reduce nuisance, cost of medication, and sick leave. This is an innovative approach in hay fever warning systems.

  4. Optoacoustic Monitoring of Physiologic Variables

    PubMed Central

    Esenaliev, Rinat O.

    2017-01-01

    Optoacoustic (photoacoustic) technique is a novel diagnostic platform that can be used for noninvasive measurements of physiologic variables, functional imaging, and hemodynamic monitoring. This technique is based on generation and time-resolved detection of optoacoustic (thermoelastic) waves generated in tissue by short optical pulses. This provides probing of tissues and individual blood vessels with high optical contrast and ultrasound spatial resolution. Because the optoacoustic waves carry information on tissue optical and thermophysical properties, detection, and analysis of the optoacoustic waves allow for measurements of physiologic variables with high accuracy and specificity. We proposed to use the optoacoustic technique for monitoring of a number of important physiologic variables including temperature, thermal coagulation, freezing, concentration of molecular dyes, nanoparticles, oxygenation, and hemoglobin concentration. In this review we present origin of contrast and high spatial resolution in these measurements performed with optoacoustic systems developed and built by our group. We summarize data obtained in vitro, in experimental animals, and in humans on monitoring of these physiologic variables. Our data indicate that the optoacoustic technology may be used for monitoring of cerebral blood oxygenation in patients with traumatic brain injury and in neonatal patients, central venous oxygenation monitoring, total hemoglobin concentration monitoring, hematoma detection and characterization, monitoring of temperature, and coagulation and freezing boundaries during thermotherapy. PMID:29311964

  5. Optoacoustic Monitoring of Physiologic Variables.

    PubMed

    Esenaliev, Rinat O

    2017-01-01

    Optoacoustic (photoacoustic) technique is a novel diagnostic platform that can be used for noninvasive measurements of physiologic variables, functional imaging, and hemodynamic monitoring. This technique is based on generation and time-resolved detection of optoacoustic (thermoelastic) waves generated in tissue by short optical pulses. This provides probing of tissues and individual blood vessels with high optical contrast and ultrasound spatial resolution. Because the optoacoustic waves carry information on tissue optical and thermophysical properties, detection, and analysis of the optoacoustic waves allow for measurements of physiologic variables with high accuracy and specificity. We proposed to use the optoacoustic technique for monitoring of a number of important physiologic variables including temperature, thermal coagulation, freezing, concentration of molecular dyes, nanoparticles, oxygenation, and hemoglobin concentration. In this review we present origin of contrast and high spatial resolution in these measurements performed with optoacoustic systems developed and built by our group. We summarize data obtained in vitro, in experimental animals, and in humans on monitoring of these physiologic variables. Our data indicate that the optoacoustic technology may be used for monitoring of cerebral blood oxygenation in patients with traumatic brain injury and in neonatal patients, central venous oxygenation monitoring, total hemoglobin concentration monitoring, hematoma detection and characterization, monitoring of temperature, and coagulation and freezing boundaries during thermotherapy.

  6. Complex Variables throughout the Curriculum

    ERIC Educational Resources Information Center

    D'Angelo, John P.

    2017-01-01

    We offer many specific detailed examples, several of which are new, that instructors can use (in lecture or as student projects) to revitalize the role of complex variables throughout the curriculum. We conclude with three primary recommendations: revise the syllabus of Calculus II to allow early introductions of complex numbers and linear…

  7. Operant Variability: A Conceptual Analysis

    ERIC Educational Resources Information Center

    Barba, Lourenco de Souza

    2012-01-01

    Some researchers claim that variability is an operant dimension of behavior. The present paper reviews the concept of operant behavior and emphasizes that differentiation is the behavioral process that demonstrates an operant relation. Differentiation is conceived as change in the overlap between two probability distributions: the distribution of…

  8. Solar variability, weather, and climate

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Advances in the understanding of possible effects of solar variations on weather and climate are most likely to emerge by addressing the subject in terms of fundamental physical principles of atmospheric sciences and solar-terrestrial physics. The limits of variability of solar inputs to the atmosphere, and the depth in the atmosphere to which these variations have significant effects, are determined.

  9. Contextual Variability in Free Recall

    ERIC Educational Resources Information Center

    Lohnas, Lynn J.; Polyn, Sean M.; Kahana, Michael J.

    2011-01-01

    According to contextual-variability theory, experiences encoded at different times tend to be associated with different contextual states. The gradual evolution of context implies that spaced items will be associated with more distinct contextual states, and thus have more unique retrieval cues, than items presented in proximity. Ross and Landauer…

  10. Marginality and Variability in Esperanto.

    ERIC Educational Resources Information Center

    Brent, Edmund

    This paper discusses Esperanto as a planned language and refutes three myths connected to it, namely, that Esperanto is achronical, atopical, and apragmatic. The focus here is on a synchronic analysis. Synchronic variability is studied with reference to the structuralist determination of "marginality" and the dynamic linguistic…

  11. CONSTRAINTS ON VARIABLES IN SYNTAX.

    ERIC Educational Resources Information Center

    ROSS, JOHN ROBERT

    In attempting to define "syntactic variable," the author bases his discussion on the assumption that syntactic facts are a collection of two types of rules--context-free phrase structure rules (generating underlying or deep phrase markers) and grammatical transformations, which map underlying phrase markers onto superficial (or surface) phrase…

  12. Variable & Recode Definitions - SEER Documentation

    Cancer.gov

    Resources that define variables and provide documentation for reporting using SEER and related datasets. Choose from SEER coding and staging manuals plus instructions for recoding behavior, site, stage, cause of death, insurance, and several additional topics. Also guidance on months survived, calculating Hispanic mortality, and site-specific surgery.

  13. A variable acceleration calibration system

    NASA Astrophysics Data System (ADS)

    Johnson, Thomas H.

    2011-12-01

    A variable acceleration calibration system that applies loads using gravitational and centripetal acceleration serves as an alternative, efficient and cost-effective method for calibrating internal wind tunnel force balances. Two proof-of-concept variable acceleration calibration systems are designed, fabricated and tested. The NASA UT-36 force balance served as the test balance for the calibration experiments. The variable acceleration calibration systems are shown to be capable of performing three component calibration experiments with an approximate applied load error on the order of 1% of the full scale calibration loads. Sources of error are identified using experimental design methods and a propagation of uncertainty analysis. Three types of uncertainty are identified for the systems and are attributed to prediction error, calibration error and pure error. Angular velocity uncertainty is shown to be the largest identified source of prediction error. The calibration uncertainties using a production variable acceleration based system are shown to be potentially equivalent to current methods. The production quality system can be realized using lighter materials and more precise instrumentation. Further research is needed to account for balance deflection, forcing effects due to vibration, and large tare loads. A gyroscope measurement technique is shown to be capable of resolving the balance deflection angle calculation. Long term research objectives include a demonstration of a six degree of freedom calibration, and a large capacity balance calibration.

  14. Methodological Variables in Choral Reading

    ERIC Educational Resources Information Center

    Poore, Meredith A.; Ferguson, Sarah Hargus

    2008-01-01

    This preliminary study explored changes in prosodic variability during choral reading and investigated whether these changes are affected by the method of eliciting choral reading. Ten typical adult talkers recorded three reading materials (poetry, fiction and textbook) in three reading conditions: solo (reading aloud alone), track (reading aloud…

  15. Unsupervised classification of variable stars

    NASA Astrophysics Data System (ADS)

    Valenzuela, Lucas; Pichara, Karim

    2018-03-01

    During the past 10 years, a considerable amount of effort has been made to develop algorithms for automatic classification of variable stars. That has been primarily achieved by applying machine learning methods to photometric data sets where objects are represented as light curves. Classifiers require training sets to learn the underlying patterns that allow the separation among classes. Unfortunately, building training sets is an expensive process that demands a lot of human effort. Every time data come from new surveys, the only available training instances are the ones that have a cross-match with previously labelled objects, which yields training sets that are insufficient compared with the large amounts of unlabelled sources. In this work, we present an algorithm that performs unsupervised classification of variable stars, relying only on the similarity among light curves. We tackle the unsupervised classification problem by proposing an untraditional approach. Instead of trying to match classes of stars with clusters found by a clustering algorithm, we propose a query-based method where astronomers can find groups of variable stars ranked by similarity. We also develop a fast similarity function specific to light curves, based on a novel data structure that allows scaling the search over the entire data set of unlabelled objects. Experiments show that our unsupervised model achieves high accuracy in the classification of different types of variable stars and that the proposed algorithm scales up to massive amounts of light curves.
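
    The query-based idea can be caricatured in a few lines: interpolate candidate light curves onto a common grid, score them against the query curve with a simple distance, and return the closest matches. The distance below is a deliberately crude stand-in for the paper's specialized similarity function and data structure, and all names are illustrative.

      import numpy as np

      def lc_distance(t1, m1, t2, m2, grid_size=100):
          """RMS difference of two mean-subtracted light curves after
          interpolation onto a common normalized time grid (assumes the
          observation times t1, t2 are sorted)."""
          grid = np.linspace(0.0, 1.0, grid_size)
          f1 = np.interp(grid, (t1 - t1.min()) / np.ptp(t1), m1 - m1.mean())
          f2 = np.interp(grid, (t2 - t2.min()) / np.ptp(t2), m2 - m2.mean())
          return np.sqrt(np.mean((f1 - f2) ** 2))

      def query(target, catalogue, k=5):
          """Indices of the k catalogue light curves most similar to the target."""
          dists = [lc_distance(*target, *lc) for lc in catalogue]
          return np.argsort(dists)[:k]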

  16. Variable gas leak rate valve

    DOEpatents

    Eernisse, Errol P.; Peterson, Gary D.

    1976-01-01

    A variable gas leak rate valve which utilizes a poled piezoelectric element to control opening and closing of the valve. The gas flow may be around a cylindrical rod with a tubular piezoelectric member encircling the rod for seating thereagainst to block passage of gas and for reopening thereof upon application of suitable electrical fields.

  17. Perceptions of randomized security schedules.

    PubMed

    Scurich, Nicholas; John, Richard S

    2014-04-01

    Security of infrastructure is a major concern. Traditional security schedules are unable to provide omnipresent coverage; consequently, adversaries can exploit predictable vulnerabilities to their advantage. Randomized security schedules, which randomly deploy security measures, overcome these limitations, but public perceptions of such schedules have not been examined. In this experiment, participants were asked to make a choice between attending a venue that employed a traditional (i.e., search everyone) or a random (i.e., a probability of being searched) security schedule. The absolute probability of detecting contraband was manipulated (i.e., 1/10, 1/4, 1/2) but equivalent between the two schedule types. In general, participants were indifferent to either security schedule, regardless of the probability of detection. The randomized schedule was deemed more convenient, but the traditional schedule was considered fairer and safer. There were no differences between traditional and random schedule in terms of perceived effectiveness or deterrence. Policy implications for the implementation and utilization of randomized schedules are discussed. © 2013 Society for Risk Analysis.

  18. Cross-country transferability of multi-variable damage models

    NASA Astrophysics Data System (ADS)

    Wagenaar, Dennis; Lüdtke, Stefan; Kreibich, Heidi; Bouwer, Laurens

    2017-04-01

    Flood damage assessment is often done with simple damage curves based only on flood water depth. Additionally, damage models are often transferred in space and time, e.g. from region to region or from one flood event to another. Validation has shown that depth-damage curve estimates are associated with high uncertainties, particularly when applied in regions outside the area where the data for curve development was collected. Recently, progress has been made with multi-variable damage models created with data-mining techniques, i.e. Bayesian networks and random forests. However, it is still unknown to what extent and under which conditions model transfers are possible and reliable. Model validations in different countries will provide valuable insights into the transferability of multi-variable damage models. In this study we compare multi-variable models developed on the basis of flood damage datasets from Germany as well as from The Netherlands. Data from several German floods were collected using computer-aided telephone interviews. Data from the 1993 Meuse flood in the Netherlands are available, based on compensations paid by the government. The Bayesian network and random forest based models are applied and validated in both countries on the basis of the individual datasets. A major challenge was the harmonization of the variables between both datasets due to factors like differences in variable definitions, and regional and temporal differences in flood hazard and exposure characteristics. Results of model validations and comparisons in both countries are discussed, particularly with respect to encountered challenges and possible solutions for improving model transferability.
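
    To give a flavour of the multi-variable approach, the sketch below fits a random forest damage model on simulated building-level loss records. The variables, their distributions, and the loss relation are hypothetical stand-ins, not the harmonized German/Dutch variables used in the study.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(4)

      # Hypothetical stand-in for a harmonized flood damage dataset:
      # water depth, flood duration and a precaution score per building.
      n = 1000
      X = np.column_stack([rng.gamma(2.0, 0.5, n),   # water depth (m)
                           rng.gamma(2.0, 24.0, n),  # duration (h)
                           rng.integers(0, 5, n)])   # precaution score
      loss = np.clip(0.2 * X[:, 0] + 0.001 * X[:, 1] - 0.02 * X[:, 2]
                     + rng.normal(0, 0.05, n), 0, 1)  # relative loss in [0, 1]

      model = RandomForestRegressor(n_estimators=300, random_state=0)
      model.fit(X, loss)
      print(model.feature_importances_)  # depth dominates here, as in depth-damage curves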

  19. Inverse Ising problem in continuous time: A latent variable approach

    NASA Astrophysics Data System (ADS)

    Donner, Christian; Opper, Manfred

    2017-12-01

    We consider the inverse Ising problem: the inference of network couplings from observed spin trajectories for a model with continuous time Glauber dynamics. By introducing two sets of auxiliary latent random variables we render the likelihood into a form which allows for simple iterative inference algorithms with analytical updates. The variables are (1) Poisson variables to linearize an exponential term which is typical for point process likelihoods and (2) Pólya-Gamma variables, which make the likelihood quadratic in the coupling parameters. Using the augmented likelihood, we derive an expectation-maximization (EM) algorithm to obtain the maximum likelihood estimate of network parameters. Using a third set of latent variables we extend the EM algorithm to sparse couplings via L1 regularization. Finally, we develop an efficient approximate Bayesian inference algorithm using a variational approach. We demonstrate the performance of our algorithms on data simulated from an Ising model. For data which are simulated from a more biologically plausible network with spiking neurons, we show that the Ising model captures well the low order statistics of the data and how the Ising couplings are related to the underlying synaptic structure of the simulated network.
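
    For readers who want to experiment, the forward model is easy to simulate. The sketch below generates spin trajectories from continuous-time Glauber dynamics with a Gillespie-style scheme; it produces the kind of data the inverse problem starts from and does not implement the authors' EM or variational algorithms.

      import numpy as np

      rng = np.random.default_rng(2)

      def glauber_trajectory(J, h, t_max, beta=1.0):
          """Simulate continuous-time Glauber dynamics for an Ising model.

          J: symmetric coupling matrix, h: local fields.
          Returns flip times and the spin configuration after each flip."""
          n = len(h)
          s = rng.choice([-1, 1], size=n)
          t, times, states = 0.0, [0.0], [s.copy()]
          while t < t_max:
              local = J @ s + h
              rates = 0.5 * (1.0 - s * np.tanh(beta * local))  # spin-flip rates
              total = rates.sum()
              t += rng.exponential(1.0 / total)   # waiting time to the next flip
              i = rng.choice(n, p=rates / total)  # which spin flips
              s[i] *= -1
              times.append(t)
              states.append(s.copy())
          return np.array(times), np.array(states)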

  20. Power calculator for instrumental variable analysis in pharmacoepidemiology

    PubMed Central

    Walker, Venexia M; Davies, Neil M; Windmeijer, Frank; Burgess, Stephen; Martin, Richard M

    2017-01-01

    Background: Instrumental variable analysis, for example with physicians’ prescribing preferences as an instrument for medications issued in primary care, is an increasingly popular method in the field of pharmacoepidemiology. Existing power calculators for studies using instrumental variable analysis, such as Mendelian randomization power calculators, do not allow for the structure of research questions in this field, because analyses in pharmacoepidemiology typically have stronger instruments and detect larger causal effects than in other fields. Consequently, there is a need for dedicated power calculators for pharmacoepidemiological research. Methods and Results: The formula for calculating the power of a study using instrumental variable analysis in the context of pharmacoepidemiology is derived and then validated by a simulation study. The formula is applicable to studies using a single binary instrument to analyse the causal effect of a binary exposure on a continuous outcome. An online calculator, as well as packages in both R and Stata, are provided for the implementation of the formula by others. Conclusions: The statistical power of instrumental variable analysis in pharmacoepidemiological studies to detect a clinically meaningful treatment effect is an important consideration. Research questions in this field have distinct structures that must be accounted for when calculating power. The formula presented differs from existing instrumental variable power formulae due to its parametrization, which is designed specifically for ease of use by pharmacoepidemiologists. PMID:28575313
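
    A Monte Carlo check of such a power formula is straightforward to set up. The sketch below simulates a single binary instrument, a binary exposure and a continuous outcome, and estimates the rejection rate of the Wald estimator. The parametrization and the crude delta-method standard error are our simplifications, not the authors' formula or their calculator.

      import numpy as np

      rng = np.random.default_rng(3)

      def iv_power_sim(n, p_z=0.5, compliance=0.3, effect=0.2, sims=2000):
          """Monte Carlo power of the Wald IV estimator (5% two-sided test)."""
          rejections = 0
          for _ in range(sims):
              z = rng.binomial(1, p_z, n)                # binary instrument
              d = rng.binomial(1, 0.3 + compliance * z)  # binary exposure shifted by z
              y = effect * d + rng.standard_normal(n)    # continuous outcome
              num = y[z == 1].mean() - y[z == 0].mean()
              den = d[z == 1].mean() - d[z == 0].mean()
              # crude standard error: delta method on the numerator only,
              # ignoring the sampling variability of the denominator
              se = np.sqrt(y[z == 1].var() / (z == 1).sum()
                           + y[z == 0].var() / (z == 0).sum()) / abs(den)
              rejections += abs(num / den) > 1.96 * se
          return rejections / sims

      print(iv_power_sim(n=5000))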