Sample records for components jujo random

  1. A Numerical Investigation of the Non-Linear Mechanics of Wave Disturbances in Plane Poiseuille Flows

    DTIC Science & Technology

    1971-09-02

  2. Random Vibrations

    NASA Technical Reports Server (NTRS)

    Messaro, Semma; Harrison, Phillip

    2010-01-01

    Ares I Zonal Random vibration environments due to acoustic impingement and combustion processes are developed for liftoff, ascent, and reentry. Random Vibration test criteria for Ares I Upper Stage pyrotechnic components are developed by enveloping the applicable zonal environments where each component is located. Random vibration tests will be conducted to assure that these components will survive and function appropriately after exposure to the expected vibration environments. Methodology: Random Vibration test criteria for Ares I Upper Stage pyrotechnic components were desired that would envelope all the applicable environments where each component was located. Applicable Ares I Vehicle drawings and design information needed to be assessed to determine the location(s) for each component on the Ares I Upper Stage. Design and test criteria needed to be developed by plotting and enveloping the applicable environments using Microsoft Excel spreadsheet software and documenting them in a report using Microsoft Word word-processing software. Conclusion: Random vibration liftoff, ascent, and green run design and test criteria for the Upper Stage Pyrotechnic Components were developed by using Microsoft Excel to envelope zonal environments applicable to each component. Results were transferred from Excel into a report using Microsoft Word. After the report is reviewed and edited by my mentor, it will be submitted for publication as an attachment to a memorandum. Pyrotechnic component designers will extract criteria from my report for incorporation into the design and test specifications for components. Eventually the hardware will be tested to the environments I developed to assure that the components will survive and function appropriately after exposure to the expected vibration environments.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradonjic, Milan; Elsasser, Robert; Friedrich, Tobias

    In this work, we consider the random broadcast time on random geometric graphs (RGGs). The classic random broadcast model, also known as the push algorithm, is defined as follows: starting with one informed node, in each succeeding round every informed node chooses one of its neighbors uniformly at random and informs it. We consider the random broadcast time on RGGs, when with high probability: (i) the RGG is connected, (ii) there exists a giant component in the RGG. We show that the random broadcast time is bounded by O(√n + diam(component)), where diam(component) is the diameter of the entire graph, or of the giant component, for the regimes (i) or (ii), respectively. In other words, for both regimes, we derive the broadcast time to be Θ(diam(G)), which is asymptotically optimal.
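
    As a rough illustration of the push model summarized above, the following Python sketch (written for this listing, not taken from the paper, and assuming the networkx package) simulates push broadcasting on a random geometric graph and reports how many rounds are needed to inform the giant component.

```python
# Minimal push-broadcast simulation on a random geometric graph (illustrative
# parameters only; not the authors' code).
import math
import random
import networkx as nx

def push_broadcast_rounds(graph, start):
    """Run push broadcasting from `start`; return the number of rounds needed
    to inform every node in start's connected component."""
    component = nx.node_connected_component(graph, start)
    informed = {start}
    rounds = 0
    while len(informed) < len(component):
        newly = set()
        for node in informed:
            neighbors = list(graph.neighbors(node))
            if neighbors:
                newly.add(random.choice(neighbors))  # push to one random neighbor
        informed |= newly
        rounds += 1
    return rounds

n = 2000
r = 1.5 * math.sqrt(math.log(n) / (math.pi * n))   # above the connectivity threshold
G = nx.random_geometric_graph(n, r)
giant = max(nx.connected_components(G), key=len)
print("rounds to inform the giant component:", push_broadcast_rounds(G, next(iter(giant))))
```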

  4. Metal-backed versus all-polyethylene tibial components in primary total knee arthroplasty

    PubMed Central

    2011-01-01

    Background and purpose The choice of either all-polyethylene (AP) tibial components or metal-backed (MB) tibial components in total knee arthroplasty (TKA) remains controversial. We therefore performed a meta-analysis and systematic review of randomized controlled trials that have evaluated MB and AP tibial components in primary TKA. Methods The search strategy included a computerized literature search (Medline, EMBASE, Scopus, and the Cochrane Central Register of Controlled Trials) and a manual search of major orthopedic journals. A meta-analysis and systematic review of randomized or quasi-randomized trials that compared the performance of tibial components in primary TKA was performed using a fixed or random effects model. We assessed the methodological quality of studies using the Detsky quality scale. Results 9 randomized controlled trials (RCTs) published between 2000 and 2009 met the inclusion quality standards for the systematic review. The mean standardized Detsky score was 14 (SD 3). We found that the frequency of radiolucent lines in the MB group was significantly higher than that in the AP group. There were no statistically significant differences between the MB and AP tibial components regarding component positioning, knee score, knee range of motion, quality of life, and postoperative complications. Interpretation Based on evidence obtained from this study, the AP tibial component was comparable with or better than the MB tibial component in TKA. However, high-quality RCTs are required to validate the results. PMID:21895503

  5. Micro-Randomized Trials: An Experimental Design for Developing Just-in-Time Adaptive Interventions

    PubMed Central

    Klasnja, Predrag; Hekler, Eric B.; Shiffman, Saul; Boruvka, Audrey; Almirall, Daniel; Tewari, Ambuj; Murphy, Susan A.

    2015-01-01

    Objective This paper presents an experimental design, the micro-randomized trial, developed to support optimization of just-in-time adaptive interventions (JITAIs). JITAIs are mHealth technologies that aim to deliver the right intervention components at the right times and locations to optimally support individuals’ health behaviors. Micro-randomized trials offer a way to optimize such interventions by enabling modeling of causal effects and time-varying effect moderation for individual intervention components within a JITAI. Methods The paper describes the micro-randomized trial design, enumerates research questions that this experimental design can help answer, and provides an overview of the data analyses that can be used to assess the causal effects of studied intervention components and investigate time-varying moderation of those effects. Results Micro-randomized trials enable causal modeling of proximal effects of the randomized intervention components and assessment of time-varying moderation of those effects. Conclusions Micro-randomized trials can help researchers understand whether their interventions are having intended effects, when and for whom they are effective, and what factors moderate the interventions’ effects, enabling creation of more effective JITAIs. PMID:26651463

  6. Scaling Techniques for Combustion Device Random Vibration Predictions

    NASA Technical Reports Server (NTRS)

    Kenny, R. J.; Ferebee, R. C.; Duvall, L. D.

    2016-01-01

    This work compares scaling techniques that can be used to predict combustion device component random vibration levels with excitation due to the internal combustion dynamics. Acceleration and unsteady dynamic pressure data from multiple component test programs are compared and normalized per the two scaling approaches reviewed. Two scaling techniques are reviewed and compared against the collected component test data. The first technique is an existing approach developed by Barrett, and the second technique is an updated approach new to this work. Results from utilizing both techniques are presented and recommendations about future component random vibration prediction approaches are given.

  7. Randomized subspace-based robust principal component analysis for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Sun, Weiwei; Yang, Gang; Li, Jialin; Zhang, Dianfa

    2018-01-01

    A randomized subspace-based robust principal component analysis (RSRPCA) method for anomaly detection in hyperspectral imagery (HSI) is proposed. The RSRPCA combines advantages of randomized column subspace and robust principal component analysis (RPCA). It assumes that the background has low-rank properties, and the anomalies are sparse and do not lie in the column subspace of the background. First, RSRPCA implements random sampling to sketch the original HSI dataset from columns and to construct a randomized column subspace of the background. Structured random projections are also adopted to sketch the HSI dataset from rows. Sketching from columns and rows could greatly reduce the computational requirements of RSRPCA. Second, the RSRPCA adopts the columnwise RPCA (CWRPCA) to eliminate negative effects of sampled anomaly pixels and to purify the previously constructed randomized column subspace by removing sampled anomaly columns. The CWRPCA decomposes the submatrix of the HSI data into a low-rank matrix (i.e., background component), a noisy matrix (i.e., noise component), and a sparse anomaly matrix (i.e., anomaly component) with only a small proportion of nonzero columns. The algorithm of inexact augmented Lagrange multiplier is utilized to optimize the CWRPCA problem and estimate the sparse matrix. Nonzero columns of the sparse anomaly matrix point to sampled anomaly columns in the submatrix. Third, all the pixels are projected onto the complementary subspace of the purified randomized column subspace of the background, and the anomaly pixels in the original HSI data are finally located exactly. Several experiments on three real hyperspectral images are carefully designed to investigate the detection performance of RSRPCA, and the results are compared with four state-of-the-art methods. Experimental results show that the proposed RSRPCA outperforms four comparison methods both in detection performance and in computational time.
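
    The full CWRPCA optimization is involved; as a much-simplified sketch of the core sampling-and-projection idea only (synthetic data, arbitrarily chosen sizes, and no RPCA purification step), the following numpy code builds a randomized column subspace for the background and scores pixels by their residual in its orthogonal complement.

```python
# Simplified anomaly-detection sketch: randomized column subspace + residual scoring.
# Synthetic data; the RPCA purification step of the paper is deliberately omitted.
import numpy as np

rng = np.random.default_rng(0)
bands, pixels = 50, 10000
background = rng.normal(size=(bands, 5)) @ rng.normal(size=(5, pixels))   # low-rank background
data = background + 0.01 * rng.normal(size=(bands, pixels))
anomaly_idx = rng.choice(pixels, size=20, replace=False)
data[:, anomaly_idx] += rng.normal(scale=2.0, size=(bands, 20))           # sparse anomalies

# 1) Randomly sample columns (pixels) to sketch the background subspace.
sampled = data[:, rng.choice(pixels, size=500, replace=False)]
# 2) Orthonormal basis of the sampled column subspace (rank chosen by an energy threshold).
U, s, _ = np.linalg.svd(sampled, full_matrices=False)
k = int(np.searchsorted(np.cumsum(s**2) / np.sum(s**2), 0.999)) + 1
B = U[:, :k]
# 3) Project every pixel onto the orthogonal complement; large residual = anomaly.
residual = data - B @ (B.T @ data)
scores = np.linalg.norm(residual, axis=0)
detected = np.argsort(scores)[-20:]
print("recovered anomalies:", np.intersect1d(detected, anomaly_idx).size, "of 20")
```

    Because the purification step is omitted here, any anomaly pixels that happen to be sampled can leak into the background basis and be missed, which is exactly the failure mode the CWRPCA step in the paper is designed to prevent.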

  8. Fatigue Damage Spectrum calculation in a Mission Synthesis procedure for Sine-on-Random excitations

    NASA Astrophysics Data System (ADS)

    Angeli, Andrea; Cornelis, Bram; Troncossi, Marco

    2016-09-01

    In many real-life environments, certain mechanical and electronic components may be subjected to Sine-on-Random vibrations, i.e. excitations composed of random vibrations superimposed on deterministic (sinusoidal) contributions, in particular sine tones due to some rotating parts of the system (e.g. helicopters, engine-mounted components,...). These components must be designed to withstand the fatigue damage induced by the “composed” vibration environment, and qualification tests are advisable for the most critical ones. In the case of an accelerated qualification test, a proper test tailoring which starts from the real environment (measured vibration signals) and which preserves not only the accumulated fatigue damage but also the “nature” of the excitation (i.e. sinusoidal components plus random process) is important to obtain reliable results. In this paper, the classic time domain approach is taken as a reference for the comparison of different methods for the Fatigue Damage Spectrum (FDS) calculation in case of Sine-on-Random vibration environments. Then, a methodology to compute a Sine-on-Random specification based on a mission FDS is proposed.

  9. Components of effective randomized controlled trials of hydrotherapy programs for fibromyalgia syndrome: A systematic review.

    PubMed

    Perraton, Luke; Machotka, Zuzana; Kumar, Saravana

    2009-11-30

    Previous systematic reviews have found hydrotherapy to be an effective management strategy for fibromyalgia syndrome (FMS). The aim of this systematic review was to summarize the components of hydrotherapy programs used in randomized controlled trials. A systematic review of randomized controlled trials was conducted. Only trials that have reported significant FMS-related outcomes were included. Data relating to the components of hydrotherapy programs (exercise type, duration, frequency and intensity, environmental factors, and service delivery) were analyzed. Eleven randomized controlled trials were included in this review. Overall, the quality of trials was good. Aerobic exercise featured in all 11 trials and the majority of hydrotherapy programs included either a strengthening or flexibility component. Great variability was noted in both the environmental components of hydrotherapy programs and service delivery. Aerobic exercise, warm up and cool-down periods and relaxation exercises are common features of hydrotherapy programs that report significant FMS-related outcomes. Treatment duration of 60 minutes, frequency of three sessions per week and an intensity equivalent to 60%-80% maximum heart rate were the most commonly reported exercise components. Exercise appears to be the most important component of an effective hydrotherapy program for FMS, particularly when considering mental health-related outcomes.

  10. Technical Note: Introduction of variance component analysis to setup error analysis in radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matsuo, Yukinori, E-mail: ymatsuo@kuhp.kyoto-u.ac.

    Purpose: The purpose of this technical note is to introduce variance component analysis to the estimation of systematic and random components in setup error of radiotherapy. Methods: Balanced data according to the one-factor random effect model were assumed. Results: Analysis-of-variance (ANOVA)-based computation was applied to estimate the values and their confidence intervals (CIs) for systematic and random errors and the population mean of setup errors. The conventional method overestimates systematic error, especially in hypofractionated settings. The CI for systematic error becomes much wider than that for random error. The ANOVA-based estimation can be extended to a multifactor model considering multiple causes of setup errors (e.g., interpatient, interfraction, and intrafraction). Conclusions: Variance component analysis may lead to novel applications to setup error analysis in radiotherapy.
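
    A minimal numerical sketch (synthetic setup errors with assumed true values, not data from the note) of the balanced one-factor ANOVA estimators for the systematic and random components:

```python
# One-factor random-effects ANOVA estimates of systematic and random setup-error
# components, on synthetic balanced data (all numbers assumed for the demo).
import numpy as np

rng = np.random.default_rng(1)
patients, fractions = 20, 5          # balanced design
sigma_sys, sigma_rand = 2.0, 3.0     # true values in mm (assumed)

patient_means = rng.normal(0.0, sigma_sys, size=patients)
errors = patient_means[:, None] + rng.normal(0.0, sigma_rand, size=(patients, fractions))

grand_mean = errors.mean()
ms_between = fractions * ((errors.mean(axis=1) - grand_mean) ** 2).sum() / (patients - 1)
ms_within = ((errors - errors.mean(axis=1, keepdims=True)) ** 2).sum() / (patients * (fractions - 1))

sigma_rand_hat = np.sqrt(ms_within)                                       # random component
sigma_sys_hat = np.sqrt(max((ms_between - ms_within) / fractions, 0.0))   # systematic component
print(f"population mean: {grand_mean:.2f} mm")
print(f"systematic SD:   {sigma_sys_hat:.2f} mm (true {sigma_sys})")
print(f"random SD:       {sigma_rand_hat:.2f} mm (true {sigma_rand})")
```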

  11. Are randomly grown graphs really random?

    PubMed

    Callaway, D S; Hopcroft, J E; Kleinberg, J M; Newman, M E; Strogatz, S H

    2001-10-01

    We analyze a minimal model of a growing network. At each time step, a new vertex is added; then, with probability delta, two vertices are chosen uniformly at random and joined by an undirected edge. This process is repeated for t time steps. In the limit of large t, the resulting graph displays surprisingly rich characteristics. In particular, a giant component emerges in an infinite-order phase transition at delta=1/8. At the transition, the average component size jumps discontinuously but remains finite. In contrast, a static random graph with the same degree distribution exhibits a second-order phase transition at delta=1/4, and the average component size diverges there. These dramatic differences between grown and static random graphs stem from a positive correlation between the degrees of connected vertices in the grown graph: older vertices tend to have higher degree, and to link with other high-degree vertices, merely by virtue of their age. We conclude that grown graphs, however randomly they are constructed, are fundamentally different from their static random graph counterparts.
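
    A small simulation sketch of the growth model described above (parameter values chosen here for illustration; assumes networkx is available) that grows the graph for t steps and reports the fraction of vertices in the largest component for several values of delta:

```python
# Growing-network model: each step adds a vertex and, with probability delta,
# joins two uniformly chosen distinct vertices by an edge.
import random
import networkx as nx

def grow_graph(t, delta, seed=None):
    rng = random.Random(seed)
    g = nx.Graph()
    for v in range(t):
        g.add_node(v)
        if v >= 1 and rng.random() < delta:
            a, b = rng.sample(range(v + 1), 2)   # two distinct existing vertices
            g.add_edge(a, b)
    return g

t = 200000
for delta in (0.05, 0.125, 0.25, 0.5):
    g = grow_graph(t, delta, seed=42)
    giant = max(len(c) for c in nx.connected_components(g))
    print(f"delta={delta:.3f}  giant fraction={giant / t:.3f}")
```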

  12. Seismic random noise attenuation method based on empirical mode decomposition of Hausdorff dimension

    NASA Astrophysics Data System (ADS)

    Yan, Z.; Luan, X.

    2017-12-01

    Introduction: Empirical mode decomposition (EMD) is a noise suppression algorithm based on wave field separation, exploiting the scale differences between effective signal and noise. However, since the complexity of the real seismic wave field results in serious mode aliasing, denoising with this method alone is not ideal. Based on the multi-scale decomposition characteristics of the EMD algorithm, combined with Hausdorff dimension constraints, we propose a new method for seismic random noise attenuation. First, we apply the EMD algorithm to adaptively decompose the seismic data and obtain a series of intrinsic mode functions (IMFs) with different scales. Based on the difference in Hausdorff dimension between effective signals and random noise, we identify the IMF components mixed with random noise. Then we use a threshold correlation filtering process to separate the valid signal and the random noise effectively. Compared with the traditional EMD method, the results show that the new method of seismic random noise attenuation has a better suppression effect. The implementation process: The EMD algorithm is used to decompose seismic signals into IMF sets and analyze their spectra. Since most of the random noise is high-frequency noise, the IMF sets can be divided into three categories: the first category is the effective wave content at the larger scales; the second category is the noise part at the smaller scales; the third category is the IMF components containing random noise. Then, the third kind of IMF component is processed by the Hausdorff dimension algorithm, and an appropriate time window size, initial step, and increment are selected to calculate the instantaneous Hausdorff dimension of each component. The dimension of the random noise is between 1.0 and 1.05, while the dimension of the effective wave is between 1.05 and 2.0. On the basis of the previous steps, according to the dimension difference between the random noise and the effective signal, we extract for each IMF component the sample points whose fractal dimension is less than or equal to 1.05, to separate out the residual noise. Reconstructing from the IMF components after this dimension filtering, together with the effective wave IMF components retained in the first selection, gives the denoised result.

  13. Detection of mastitis in dairy cattle by use of mixture models for repeated somatic cell scores: a Bayesian approach via Gibbs sampling.

    PubMed

    Odegård, J; Jensen, J; Madsen, P; Gianola, D; Klemetsdal, G; Heringstad, B

    2003-11-01

    The distribution of somatic cell scores could be regarded as a mixture of at least two components depending on a cow's udder health status. A heteroscedastic two-component Bayesian normal mixture model with random effects was developed and implemented via Gibbs sampling. The model was evaluated using datasets consisting of simulated somatic cell score records. Somatic cell score was simulated as a mixture representing two alternative udder health statuses ("healthy" or "diseased"). Animals were assigned randomly to the two components according to the probability of group membership (Pm). Random effects (additive genetic and permanent environment), when included, had identical distributions across mixture components. Posterior probabilities of putative mastitis were estimated for all observations, and model adequacy was evaluated using measures of sensitivity, specificity, and posterior probability of misclassification. Fitting different residual variances in the two mixture components caused some bias in estimation of parameters. When the components were difficult to disentangle, so were their residual variances, causing bias in estimation of Pm and of location parameters of the two underlying distributions. When all variance components were identical across mixture components, the mixture model analyses returned parameter estimates essentially without bias and with a high degree of precision. Including random effects in the model increased the probability of correct classification substantially. No sizable differences in probability of correct classification were found between models in which a single cow effect (ignoring relationships) was fitted and models where this effect was split into genetic and permanent environmental components, utilizing relationship information. When genetic and permanent environmental effects were fitted, the between-replicate variance of estimates of posterior means was smaller because the model accounted for random genetic drift.
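
    As a toy illustration (a two-component normal mixture only, with fixed component variances and without the genetic and permanent-environment random effects of the full model), the following Gibbs sampler alternates between sampling component memberships, the mixing proportion Pm, and the component means:

```python
# Gibbs sampler for a simplified two-component normal mixture (synthetic data,
# fixed component variances, flat priors on means, Beta(1,1) prior on Pm).
import numpy as np

rng = np.random.default_rng(2)

# Synthetic somatic-cell-score-like data: "healthy" vs "diseased" components.
n, p_diseased = 2000, 0.3
z_true = rng.random(n) < p_diseased
y = np.where(z_true, rng.normal(5.0, 1.5, n), rng.normal(2.5, 1.0, n))

pm, mu = 0.5, np.array([2.0, 5.0])       # initial values (assumed)
sigma2 = np.array([1.0, 1.5]) ** 2       # fixed component variances (assumed)
draws = []
for it in range(2000):
    # 1) Sample component memberships given the current parameters.
    like0 = (1 - pm) * np.exp(-0.5 * (y - mu[0]) ** 2 / sigma2[0]) / np.sqrt(sigma2[0])
    like1 = pm * np.exp(-0.5 * (y - mu[1]) ** 2 / sigma2[1]) / np.sqrt(sigma2[1])
    z = rng.random(n) < like1 / (like0 + like1)
    # 2) Sample the mixing proportion given memberships.
    pm = rng.beta(1 + z.sum(), 1 + (~z).sum())
    # 3) Sample the component means given memberships (flat prior -> normal posterior).
    for k, members in enumerate([~z, z]):
        m = members.sum()
        if m > 0:
            mu[k] = rng.normal(y[members].mean(), np.sqrt(sigma2[k] / m))
    if it >= 500:                        # discard burn-in
        draws.append((pm, mu[0], mu[1]))

post = np.array(draws).mean(axis=0)
print(f"posterior means: Pm={post[0]:.2f}, mu_healthy={post[1]:.2f}, mu_diseased={post[2]:.2f}")
```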

  14. Components of effective randomized controlled trials of hydrotherapy programs for fibromyalgia syndrome: A systematic review

    PubMed Central

    Perraton, Luke; Machotka, Zuzana; Kumar, Saravana

    2009-01-01

    Aim Previous systematic reviews have found hydrotherapy to be an effective management strategy for fibromyalgia syndrome (FMS). The aim of this systematic review was to summarize the components of hydrotherapy programs used in randomized controlled trials. Method A systematic review of randomized controlled trials was conducted. Only trials that have reported significant FMS-related outcomes were included. Data relating to the components of hydrotherapy programs (exercise type, duration, frequency and intensity, environmental factors, and service delivery) were analyzed. Results Eleven randomized controlled trials were included in this review. Overall, the quality of trials was good. Aerobic exercise featured in all 11 trials and the majority of hydrotherapy programs included either a strengthening or flexibility component. Great variability was noted in both the environmental components of hydrotherapy programs and service delivery. Conclusions Aerobic exercise, warm up and cool-down periods and relaxation exercises are common features of hydrotherapy programs that report significant FMS-related outcomes. Treatment duration of 60 minutes, frequency of three sessions per week and an intensity equivalent to 60%–80% maximum heart rate were the most commonly reported exercise components. Exercise appears to be the most important component of an effective hydrotherapy program for FMS, particularly when considering mental health-related outcomes. PMID:21197303

  15. Disentangling giant component and finite cluster contributions in sparse random matrix spectra.

    PubMed

    Kühn, Reimer

    2016-04-01

    We describe a method for disentangling giant component and finite cluster contributions to sparse random matrix spectra, using sparse symmetric random matrices defined on Erdős-Rényi graphs as an example and test bed. Our methods apply to sparse matrices defined in terms of arbitrary graphs in the configuration model class, as long as they have finite mean degree.

  16. Simulating the component counts of combinatorial structures.

    PubMed

    Arratia, Richard; Barbour, A D; Ewens, W J; Tavaré, Simon

    2018-02-09

    This article describes and compares methods for simulating the component counts of random logarithmic combinatorial structures such as permutations and mappings. We exploit the Feller coupling for simulating permutations to provide a very fast method for simulating logarithmic assemblies more generally. For logarithmic multisets and selections, this approach is replaced by an acceptance/rejection method based on a particular conditioning relationship that represents the distribution of the combinatorial structure as that of independent random variables conditioned on a weighted sum. We show how to improve its acceptance rate. We illustrate the method by estimating the probability that a random mapping has no repeated component sizes, and establish the asymptotic distribution of the difference between the number of components and the number of distinct component sizes for a very general class of logarithmic structures. Copyright © 2018. Published by Elsevier Inc.
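
    A quick numerical sketch (not the authors' code) of the Feller coupling mentioned above for uniform random permutations: independent Bernoulli(1/i) variables are generated, and the gaps between successive ones give the cycle lengths. The mean number of cycles should match the harmonic number H_n.

```python
# Feller coupling: simulate the cycle counts of a uniform random permutation of n.
import numpy as np

def cycle_counts_feller(n, rng):
    """Return counts[l-1] = number of cycles of length l."""
    xi = rng.random(n) < 1.0 / np.arange(1, n + 1)   # independent Bernoulli(1/i); xi_1 = 1 always
    ones = np.flatnonzero(np.append(xi, True))       # positions of ones, with xi_{n+1} := 1
    spacings = np.diff(ones)                         # gaps between successive ones = cycle lengths
    return np.bincount(spacings, minlength=n + 1)[1:]

rng = np.random.default_rng(3)
n, reps = 100, 20000
total_cycles = np.array([cycle_counts_feller(n, rng).sum() for _ in range(reps)])
harmonic = np.sum(1.0 / np.arange(1, n + 1))
print(f"mean number of cycles: {total_cycles.mean():.3f} (theory H_n = {harmonic:.3f})")
```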

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradonjic, Milan; Hagberg, Aric; Hengartner, Nick

    We analyze component evolution in general random intersection graphs (RIGs) and give conditions on existence and uniqueness of the giant component. Our techniques generalize the existing methods for analysis of component evolution in RIGs. That is, we analyze survival and extinction properties of a dependent, inhomogeneous Galton-Watson branching process on general RIGs. Our analysis relies on bounding the branching processes and inherits the fundamental concepts from the study of component evolution in Erdős-Rényi graphs. The main challenge comes from the underlying structure of RIGs, where the number of offspring follows a binomial distribution with a different number of nodes and a different rate at each step during the evolution. RIGs can be interpreted as a model for large randomly formed non-metric data sets. Besides the mathematical analysis of component evolution, which we provide in this work, we perceive RIGs as an important random structure which has already found applications in social networks, epidemic networks, blog readership, and wireless sensor networks.

  18. Scaling Limit of Symmetric Random Walk in High-Contrast Periodic Environment

    NASA Astrophysics Data System (ADS)

    Piatnitski, A.; Zhizhina, E.

    2017-11-01

    The paper deals with the asymptotic properties of a symmetric random walk in a high contrast periodic medium in Z^d, d≥1. From the existing homogenization results it follows that under diffusive scaling the limit behaviour of this random walk need not be Markovian. The goal of this work is to show that if in addition to the coordinate of the random walk in Z^d we introduce an extra variable that characterizes the position of the random walk inside the period then the limit dynamics of this two-component process is Markov. We describe the limit process and observe that the components of the limit process are coupled. We also prove the convergence in the path space for the said random walk.

  19. Similarities between principal components of protein dynamics and random diffusion

    NASA Astrophysics Data System (ADS)

    Hess, Berk

    2000-12-01

    Principal component analysis, also called essential dynamics, is a powerful tool for finding global, correlated motions in atomic simulations of macromolecules. It has become an established technique for analyzing molecular dynamics simulations of proteins. The first few principal components of simulations of large proteins often resemble cosines. We derive the principal components for high-dimensional random diffusion, which are almost perfect cosines. This resemblance between protein simulations and noise implies that for many proteins the time scales of current simulations are too short to obtain convergence of collective motions.
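
    The following small numpy sketch (dimensions and step counts chosen arbitrarily) reproduces the qualitative observation above: the leading principal component of a high-dimensional random walk is nearly a half-period cosine in time.

```python
# Leading principal component of high-dimensional random diffusion vs. a cosine.
import numpy as np

rng = np.random.default_rng(4)
steps, dims = 5000, 300
walk = np.cumsum(rng.normal(size=(steps, dims)), axis=0)   # high-dimensional diffusion

centered = walk - walk.mean(axis=0)
u, s, vt = np.linalg.svd(centered, full_matrices=False)    # PCA via SVD of the trajectory
pc1 = u[:, 0] * s[0]                                       # first principal-component time series

t = np.arange(steps)
cosine = np.cos(np.pi * (t + 0.5) / steps)                 # half-period cosine
corr = np.corrcoef(pc1, cosine)[0, 1]
print(f"|correlation| between first PC and half-period cosine: {abs(corr):.3f}")
```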

  20. Temporal evolution of financial-market correlations.

    PubMed

    Fenn, Daniel J; Porter, Mason A; Williams, Stacy; McDonald, Mark; Johnson, Neil F; Jones, Nick S

    2011-08-01

    We investigate financial market correlations using random matrix theory and principal component analysis. We use random matrix theory to demonstrate that correlation matrices of asset price changes contain structure that is incompatible with uncorrelated random price changes. We then identify the principal components of these correlation matrices and demonstrate that a small number of components accounts for a large proportion of the variability of the markets that we consider. We characterize the time-evolving relationships between the different assets by investigating the correlations between the asset price time series and principal components. Using this approach, we uncover notable changes that occurred in financial markets and identify the assets that were significantly affected by these changes. We show in particular that there was an increase in the strength of the relationships between several different markets following the 2007-2008 credit and liquidity crisis.
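
    To make the random-matrix benchmark concrete, this short sketch (synthetic, uncorrelated returns rather than market data) compares the eigenvalues of a sample correlation matrix with the Marchenko-Pastur band; with real asset returns, eigenvalues outside this band carry the genuine correlation structure.

```python
# Eigenvalues of a correlation matrix of independent returns vs. the
# Marchenko-Pastur band (up to finite-size edge fluctuations).
import numpy as np

rng = np.random.default_rng(5)
assets, days = 100, 1000
returns = rng.normal(size=(days, assets))          # uncorrelated "price changes"

corr = np.corrcoef(returns, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)

q = assets / days
lam_min, lam_max = (1 - np.sqrt(q)) ** 2, (1 + np.sqrt(q)) ** 2
outside = np.sum((eigvals < lam_min) | (eigvals > lam_max))
print(f"Marchenko-Pastur band: [{lam_min:.2f}, {lam_max:.2f}]")
print(f"eigenvalues outside the band: {outside} of {assets}")
```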

  1. Temporal evolution of financial-market correlations

    NASA Astrophysics Data System (ADS)

    Fenn, Daniel J.; Porter, Mason A.; Williams, Stacy; McDonald, Mark; Johnson, Neil F.; Jones, Nick S.

    2011-08-01

    We investigate financial market correlations using random matrix theory and principal component analysis. We use random matrix theory to demonstrate that correlation matrices of asset price changes contain structure that is incompatible with uncorrelated random price changes. We then identify the principal components of these correlation matrices and demonstrate that a small number of components accounts for a large proportion of the variability of the markets that we consider. We characterize the time-evolving relationships between the different assets by investigating the correlations between the asset price time series and principal components. Using this approach, we uncover notable changes that occurred in financial markets and identify the assets that were significantly affected by these changes. We show in particular that there was an increase in the strength of the relationships between several different markets following the 2007-2008 credit and liquidity crisis.

  2. Evolution in fluctuating environments: decomposing selection into additive components of the Robertson-Price equation.

    PubMed

    Engen, Steinar; Saether, Bernt-Erik

    2014-03-01

    We analyze the stochastic components of the Robertson-Price equation for the evolution of quantitative characters that enables decomposition of the selection differential into components due to demographic and environmental stochasticity. We show how these two types of stochasticity affect the evolution of multivariate quantitative characters by defining demographic and environmental variances as components of individual fitness. The exact covariance formula for selection is decomposed into three components, the deterministic mean value, as well as stochastic demographic and environmental components. We show that demographic and environmental stochasticity generate random genetic drift and fluctuating selection, respectively. This provides a common theoretical framework for linking ecological and evolutionary processes. Demographic stochasticity can cause random variation in selection differentials independent of fluctuating selection caused by environmental variation. We use this model of selection to illustrate that the effect on the expected selection differential of random variation in individual fitness is dependent on population size, and that the strength of fluctuating selection is affected by how environmental variation affects the covariance in Malthusian fitness between individuals with different phenotypes. Thus, our approach enables us to partition out the effects of fluctuating selection from the effects of selection due to random variation in individual fitness caused by demographic stochasticity. © 2013 The Author(s). Evolution © 2013 The Society for the Study of Evolution.

  3. From micro-correlations to macro-correlations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eliazar, Iddo, E-mail: iddo.eliazar@intel.com

    2016-11-15

    Random vectors with a symmetric correlation structure share a common value of pair-wise correlation between their different components. The symmetric correlation structure appears in a multitude of settings, e.g. mixture models. In a mixture model the components of the random vector are drawn independently from a general probability distribution that is determined by an underlying parameter, and the parameter itself is randomized. In this paper we study the overall correlation of high-dimensional random vectors with a symmetric correlation structure. Considering such a random vector, and terming its pair-wise correlation “micro-correlation”, we use an asymptotic analysis to derive the random vector’s “macro-correlation”: a score that takes values in the unit interval, and that quantifies the random vector’s overall correlation. The method of obtaining macro-correlations from micro-correlations is then applied to a diverse collection of frameworks that demonstrate the method’s wide applicability.

  4. Bayesian approach to non-Gaussian field statistics for diffusive broadband terahertz pulses.

    PubMed

    Pearce, Jeremy; Jian, Zhongping; Mittleman, Daniel M

    2005-11-01

    We develop a closed-form expression for the probability distribution function for the field components of a diffusive broadband wave propagating through a random medium. We consider each spectral component to provide an individual observation of a random variable, the configurationally averaged spectral intensity. Since the intensity determines the variance of the field distribution at each frequency, this random variable serves as the Bayesian prior that determines the form of the non-Gaussian field statistics. This model agrees well with experimental results.

  5. Application of lifting wavelet and random forest in compound fault diagnosis of gearbox

    NASA Astrophysics Data System (ADS)

    Chen, Tang; Cui, Yulian; Feng, Fuzhou; Wu, Chunzhi

    2018-03-01

    Aiming at the weak compound-fault characteristic signals of an armored vehicle gearbox and the difficulty of identifying fault types, a fault diagnosis method based on the lifting wavelet and random forest is proposed. First of all, this method uses the lifting wavelet transform to decompose the original vibration signal in multiple layers, and reconstructs the multi-layer low-frequency and high-frequency components obtained by the decomposition to get multiple component signals. Then the time-domain feature parameters are obtained for each component signal to form multiple feature vectors, which are input into the random forest pattern recognition classifier to determine the compound fault type. Finally, the method is verified on a variety of compound fault data from a gearbox fault simulation test platform; the results show that the recognition accuracy of the fault diagnosis method combining the lifting wavelet and the random forest is up to 99.99%.
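
    A hedged end-to-end sketch of that pipeline on synthetic signals (a standard 'db4' discrete wavelet decomposition from PyWavelets stands in for the lifting wavelet transform, SciPy and scikit-learn are assumed, and all signal parameters are invented for the demo): decompose, extract time-domain features per component, and classify with a random forest.

```python
# Wavelet decomposition + time-domain features + random-forest fault classifier
# on crude synthetic "gearbox" signals.
import numpy as np
import pywt
from scipy.stats import kurtosis, skew
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)

def make_signal(fault, n=2048, fs=2048.0):
    """Synthetic vibration signal: mesh tone plus fault-dependent impulses/modulation."""
    t = np.arange(n) / fs
    x = np.sin(2 * np.pi * 150 * t) + 0.3 * rng.normal(size=n)
    if fault in (1, 2):                       # bearing-like periodic impulses
        x[::128] += 2.0
    if fault == 2:                            # additional gear-tooth modulation
        x += 0.5 * np.sin(2 * np.pi * 30 * t) * np.sin(2 * np.pi * 150 * t)
    return x

def features(x, wavelet="db4", level=3):
    coeffs = pywt.wavedec(x, wavelet, level=level)   # [approx, detail_L, ..., detail_1]
    feats = []
    for c in coeffs:
        feats += [np.sqrt(np.mean(c ** 2)), np.max(np.abs(c)), kurtosis(c), skew(c)]
    return feats

X, y = [], []
for fault in (0, 1, 2):                      # 0 = healthy, 1/2 = two compound-fault types
    for _ in range(60):
        X.append(features(make_signal(fault)))
        y.append(fault)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, np.array(X), np.array(y), cv=5)
print(f"cross-validated accuracy: {scores.mean():.3f}")
```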

  6. Correlation Energies from the Two-Component Random Phase Approximation.

    PubMed

    Kühn, Michael

    2014-02-11

    The correlation energy within the two-component random phase approximation accounting for spin-orbit effects is derived. The resulting plasmon equation is rewritten, analogously to the scalar relativistic case, in terms of the trace of two Hermitian matrices for (Kramers-restricted) closed-shell systems and then represented as an integral over imaginary frequency using the resolution of the identity approximation. The final expression is implemented in the TURBOMOLE program suite. The code is applied to the computation of equilibrium distances and vibrational frequencies of heavy diatomic molecules. The efficiency is demonstrated by calculation of the relative energies of the Oh-, D4h-, and C5v-symmetric isomers of Pb6. Results within the random phase approximation are obtained based on two-component Kohn-Sham reference-state calculations, using effective-core potentials. These values are finally compared to other two-component and scalar relativistic methods, as well as experimental data.

  7. Influence of Embedded Inhomogeneities on the Spectral Ratio of the Horizontal Components of a Random Field of Rayleigh Waves

    NASA Astrophysics Data System (ADS)

    Tsukanov, A. A.; Gorbatnikov, A. V.

    2018-01-01

    Study of the statistical parameters of the Earth's random microseismic field makes it possible to obtain estimates of the properties and structure of the Earth's crust and upper mantle. Different approaches are used to observe and process the microseismic records, which are divided into several groups of passive seismology methods. Among them are the well-known methods of surface-wave tomography, the spectral H/V ratio of the components in the surface wave, and microseismic sounding, currently under development, which uses the spectral ratio V/V0 of the vertical components between pairs of spatially separated stations. In the course of previous experiments, it became clear that these ratios are stable statistical parameters of the random field that do not depend on the properties of microseism sources. This paper proposes to expand the mentioned approach and study the possibilities for using the ratio of the horizontal components H1/H2 of the microseismic field. Numerical simulation was used to study the influence of an embedded velocity inhomogeneity on the spectral ratio of the horizontal components of the random field of fundamental Rayleigh modes, based on the concept that the Earth's microseismic field is represented by these waves in a significant part of the frequency spectrum.

  8. Role of small-norm components in extended random-phase approximation

    NASA Astrophysics Data System (ADS)

    Tohyama, Mitsuru

    2017-09-01

    The roles of the small-norm amplitudes in extended random-phase approximation (RPA) theories, such as the particle-particle and hole-hole components of the one-body amplitudes and the two-body amplitudes other than the two-particle/two-hole components, are investigated for the one-dimensional Hubbard model using an extended RPA derived from the time-dependent density matrix theory. It is found that these amplitudes cannot be neglected in strongly interacting regions where the effects of ground-state correlations are significant.

  9. Testing the concept of a modulation filter bank: the audibility of component modulation and detection of phase change in three-component modulators.

    PubMed

    Sek, Aleksander; Moore, Brian C J

    2003-05-01

    Two experiments were performed to test the concept that the auditory system contains a "modulation filter bank" (MFB). Experiment 1 examined the ability to "hear out" the modulation frequency of the central component of a three-component modulator applied to a 4-kHz sinusoidal carrier. On each trial, three modulated stimuli were presented. The modulator of the first stimulus contained three components. Within a run the frequencies of the outer two components were fixed and the frequency of the central ("target") component was drawn randomly from one of five values. The modulators of the second and third stimuli contained one component. One had a frequency equal to that of the target and the other had a frequency randomly selected from one of the other possible values. Subjects indicated whether the target corresponded to the second or third stimulus. Scores were around 80% correct when the components in the three-component modulator were widely spaced and when the frequencies of the target and comparison differed sufficiently. Experiment 2 examined the ability to hear a change in the relative phase of the components in a three-component modulator with harmonically spaced components, using a 3IFC task. The frequency of the central component, f(c), was either 50 or 100 Hz. Scores were 80%-90% correct when the component spacing was ≤ 0.5 f(c), but decreased markedly for greater spacings. Performance was only slightly impaired by randomizing the overall modulation depth from one stimulus to the next. The results of both experiments are broadly consistent with what would be expected from a MFB with a Q value of 1 or slightly less.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradonjic, Milan; Elsasser, Robert; Friedrich, Tobias

    A Random Geometric Graph (RGG) is constructed by distributing n nodes uniformly at random in the unit square and connecting two nodes if their Euclidean distance is at most r, for some prescribed r. They analyze the following randomized broadcast algorithm on RGGs. At the beginning, there is only one informed node. Then in each round, each informed node chooses a neighbor uniformly at random and informs it. They prove that this algorithm informs every node in the largest component of a RGG in O(√n/r) rounds with high probability. This holds for any value of r larger than the critical value for the emergence of a giant component. In particular, the result implies that the diameter of the giant component is Θ(√n/r).

  11. Color image encryption based on gyrator transform and Arnold transform

    NASA Astrophysics Data System (ADS)

    Sui, Liansheng; Gao, Bo

    2013-06-01

    A color image encryption scheme using gyrator transform and Arnold transform is proposed, which has two security levels. In the first level, the color image is separated into three components: red, green and blue, which are normalized and scrambled using the Arnold transform. The green component is combined with the first random phase mask and transformed to an interim using the gyrator transform. The first random phase mask is generated with the sum of the blue component and a logistic map. Similarly, the red component is combined with the second random phase mask and transformed to three-channel-related data. The second random phase mask is generated with the sum of the phase of the interim and an asymmetrical tent map. In the second level, the three-channel-related data are scrambled again and combined with the third random phase mask generated with the sum of the previous chaotic maps, and then encrypted into a gray scale ciphertext. The encryption result has stationary white noise distribution and camouflage property to some extent. In the process of encryption and decryption, the rotation angle of gyrator transform, the iterative numbers of Arnold transform, the parameters of the chaotic map and generated accompanied phase function serve as encryption keys, and hence enhance the security of the system. Simulation results and security analysis are presented to confirm the security, validity and feasibility of the proposed scheme.
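
    As one concrete piece of that scheme, the sketch below (numpy only, with a random array standing in for a normalized image channel; the gyrator transform and chaotic phase masks are not implemented) shows Arnold-transform scrambling and its exact inverse, using a fixed iteration count as the key.

```python
# Arnold (cat-map) scrambling of a square channel and its exact inverse.
import numpy as np

def arnold_transform(channel, iterations):
    """Scramble a square array with the Arnold map (x, y) -> (x + y, x + 2y) mod n."""
    n = channel.shape[0]
    assert channel.shape[0] == channel.shape[1], "Arnold map needs a square array"
    x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    out = channel
    for _ in range(iterations):
        new = np.empty_like(out)
        new[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = new
    return out

def inverse_arnold_transform(channel, iterations):
    """Undo the scrambling with the inverse map (x, y) -> (2x - y, y - x) mod n."""
    n = channel.shape[0]
    x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    out = channel
    for _ in range(iterations):
        new = np.empty_like(out)
        new[(2 * x - y) % n, (y - x) % n] = out[x, y]
        out = new
    return out

rng = np.random.default_rng(7)
green = rng.random((128, 128))                 # stand-in for a normalized green channel
scrambled = arnold_transform(green, iterations=10)
restored = inverse_arnold_transform(scrambled, iterations=10)
print("round trip exact:", np.allclose(green, restored))
```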

  12. Inverse kinematic problem for a random gradient medium in geometric optics approximation

    NASA Astrophysics Data System (ADS)

    Petersen, N. V.

    1990-03-01

    Scattering at random inhomogeneities in a gradient medium results in systematic deviations of the rays and travel times of refracted body waves from those corresponding to the deterministic velocity component. The character of the difference depends on the parameters of the deterministic and random velocity components. However, at great distances to the source, independently of the velocity parameters (weakly or strongly inhomogeneous medium), the most probable depth of the ray turning point is smaller than that corresponding to the deterministic velocity component, the most probable travel times also being lower. The relative uncertainty in the deterministic velocity component, derived from the mean travel times using methods developed for laterally homogeneous media (for instance, the Herglotz-Wiechert method), is systematic in character, but does not exceed the contrast of velocity inhomogeneities in magnitude. The gradient of the deterministic velocity component has a significant effect on the travel-time fluctuations. The variance at great distances to the source is mainly controlled by shallow inhomogeneities. The travel-time fluctuations are studied only for weakly inhomogeneous media.

  13. Time series, correlation matrices and random matrix models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vinayak; Seligman, Thomas H.

    2014-01-08

    In this set of five lectures the authors have presented techniques to analyze open classical and quantum systems using correlation matrices. For diverse reasons we shall see that random matrices play an important role in describing a null hypothesis or a minimum information hypothesis for the description of a quantum system or subsystem. In the former case we consider various forms of correlation matrices of time series associated with the classical observables of some system. The fact that such series are necessarily finite inevitably introduces noise, and this finite-time influence leads to a random or stochastic component in these time series. By consequence, random correlation matrices have a random component, and corresponding ensembles are used. In the latter we use random matrices to describe a high temperature environment or uncontrolled perturbations, ensembles of differing chaotic systems, etc. The common theme of the lectures is thus the importance of random matrix theory in a wide range of fields in and around physics.

  14. Analysis of Wind Tunnel Polar Replicates Using the Modern Design of Experiments

    NASA Technical Reports Server (NTRS)

    Deloach, Richard; Micol, John R.

    2010-01-01

    The role of variance in a Modern Design of Experiments analysis of wind tunnel data is reviewed, with distinctions made between explained and unexplained variance. The partitioning of unexplained variance into systematic and random components is illustrated, with examples of the elusive systematic component provided for various types of real-world tests. The importance of detecting and defending against systematic unexplained variance in wind tunnel testing is discussed, and the random and systematic components of unexplained variance are examined for a representative wind tunnel data set acquired in a test in which a missile is used as a test article. The adverse impact of correlated (non-independent) experimental errors is described, and recommendations are offered for replication strategies that facilitate the quantification of random and systematic unexplained variance.

  15. Statistics of Delta v magnitude for a trajectory correction maneuver containing deterministic and random components

    NASA Technical Reports Server (NTRS)

    Bollman, W. E.; Chadwick, C.

    1982-01-01

    A number of interplanetary missions now being planned involve placing deterministic maneuvers along the flight path to alter the trajectory. Lee and Boain (1973) examined the statistics of trajectory correction maneuver (TCM) magnitude with no deterministic ('bias') component. The Delta v vector magnitude statistics were generated for several values of random Delta v standard deviations using expansions in terms of infinite hypergeometric series. The present investigation uses a different technique (Monte Carlo simulation) to generate Delta v magnitude statistics for a wider selection of random Delta v standard deviations and also extends the analysis to the case of nonzero deterministic Delta v's. These Delta v magnitude statistics are plotted parametrically. The plots are useful in assisting the analyst in quickly answering questions about the statistics of Delta v magnitude for single TCM's consisting of both a deterministic and a random component. The plots provide quick insight into the nature of the Delta v magnitude distribution for the TCM.
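
    A quick Monte Carlo sketch (with invented numbers for the deterministic bias and the per-axis standard deviation) of the quantity analyzed above, the magnitude of a Delta v that combines a deterministic and a random component:

```python
# Monte Carlo statistics of |Delta v| = |deterministic bias + Gaussian random part|.
import numpy as np

rng = np.random.default_rng(8)
samples = 200000

deterministic = np.array([5.0, 0.0, 0.0])        # m/s, assumed bias component
sigma = 2.0                                       # m/s, assumed per-axis random std dev

dv = deterministic + rng.normal(0.0, sigma, size=(samples, 3))
magnitude = np.linalg.norm(dv, axis=1)

print(f"mean |Delta v|:   {magnitude.mean():.2f} m/s")
print(f"99th percentile:  {np.percentile(magnitude, 99):.2f} m/s")
print(f"pure-random case would have mean {np.sqrt(8 / np.pi) * sigma:.2f} m/s")
```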

  16. Markov and semi-Markov switching linear mixed models used to identify forest tree growth components.

    PubMed

    Chaubert-Pereira, Florence; Guédon, Yann; Lavergne, Christian; Trottier, Catherine

    2010-09-01

    Tree growth is assumed to be mainly the result of three components: (i) an endogenous component assumed to be structured as a succession of roughly stationary phases separated by marked change points that are asynchronous among individuals, (ii) a time-varying environmental component assumed to take the form of synchronous fluctuations among individuals, and (iii) an individual component corresponding mainly to the local environment of each tree. To identify and characterize these three components, we propose to use semi-Markov switching linear mixed models, i.e., models that combine linear mixed models in a semi-Markovian manner. The underlying semi-Markov chain represents the succession of growth phases and their lengths (endogenous component) whereas the linear mixed models attached to each state of the underlying semi-Markov chain represent, in the corresponding growth phase, both the influence of time-varying climatic covariates (environmental component) as fixed effects, and interindividual heterogeneity (individual component) as random effects. In this article, we address the estimation of Markov and semi-Markov switching linear mixed models in a general framework. We propose a Monte Carlo expectation-maximization like algorithm whose iterations decompose into three steps: (i) sampling of state sequences given random effects, (ii) prediction of random effects given state sequences, and (iii) maximization. The proposed statistical modeling approach is illustrated by the analysis of successive annual shoots along Corsican pine trunks influenced by climatic covariates. © 2009, The International Biometric Society.

  17. Formulation and Application of the Hierarchical Generalized Random-Situation Random-Weight MIRID

    ERIC Educational Resources Information Center

    Hung, Lai-Fa

    2011-01-01

    The process-component approach has become quite popular for examining many psychological concepts. A typical example is the model with internal restrictions on item difficulty (MIRID) described by Butter (1994) and Butter, De Boeck, and Verhelst (1998). This study proposes a hierarchical generalized random-situation random-weight MIRID. The…

  18. Improved estimation of random vibration loads in launch vehicles

    NASA Technical Reports Server (NTRS)

    Mehta, R.; Erwin, E.; Suryanarayan, S.; Krishna, Murali M. R.

    1993-01-01

    Random vibration induced load is an important component of the total design load environment for payload and launch vehicle components and their support structures. The current approach to random vibration load estimation is based, particularly at the preliminary design stage, on the use of Miles' equation which assumes a single degree-of-freedom (DOF) system and white noise excitation. This paper examines the implications of the use of multi-DOF system models and response calculation based on numerical integration using the actual excitation spectra for random vibration load estimation. The analytical study presented considers a two-DOF system and brings out the effects of modal mass, damping and frequency ratios on the random vibration load factor. The results indicate that load estimates based on the Miles' equation can be significantly different from the more accurate estimates based on multi-DOF models.
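
    For reference, a short worked example (with assumed values for the natural frequency, amplification factor, and input spectral density) of the Miles' equation estimate mentioned above, G_rms = sqrt((pi/2) * f_n * Q * ASD):

```python
# Miles' equation estimate of single-DOF RMS response to white-noise random vibration.
import math

f_n = 120.0    # natural frequency, Hz (assumed)
Q = 10.0       # amplification factor, i.e. damping ratio ~ 1/(2Q) = 5% (assumed)
asd = 0.04     # input acceleration spectral density at f_n, g^2/Hz (assumed flat)

g_rms = math.sqrt((math.pi / 2.0) * f_n * Q * asd)
print(f"Miles' equation RMS response: {g_rms:.2f} g")
print(f"3-sigma design load:          {3 * g_rms:.2f} g")
```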

  19. HUMAN RESPONDING ON RANDOM-INTERVAL SCHEDULES OF RESPONSE-COST PUNISHMENT: THE ROLE OF REDUCED REINFORCEMENT DENSITY

    PubMed Central

    Pietras, Cynthia J; Brandt, Andrew E; Searcy, Gabriel D

    2010-01-01

    An experiment with adult humans investigated the effects of response-contingent money loss (response-cost punishment) on monetary-reinforced responding. A yoked-control procedure was used to separate the effects on responding of the response-cost contingency from the effects of reduced reinforcement density. Eight adults pressed buttons for money on a three-component multiple reinforcement schedule. During baseline, responding in all components produced money gains according to a random-interval 20-s schedule. During punishment conditions, responding during the punishment component conjointly produced money losses according to a random-interval schedule. The value of the response-cost schedule was manipulated across conditions to systematically evaluate the effects on responding of response-cost frequency. Participants were assigned to one of two yoked-control conditions. For participants in the Yoked Punishment group, during punishment conditions money losses were delivered in the yoked component response independently at the same intervals that money losses were produced in the punishment component. For participants in the Yoked Reinforcement group, responding in the yoked component produced the same net earnings as produced in the punishment component. In 6 of 8 participants, contingent response cost selectively decreased response rates in the punishment component and the magnitude of the decrease was directly related to the punishment schedule value. Under punishment conditions, for participants in the Yoked Punishment group response rates in the yoked component also decreased, but the decrease was less than that observed in the punishment component, whereas for participants in the Yoked Reinforcement group response rates in the yoked component remained similar to rates in the no-punishment component. These results provide further evidence that contingent response cost functions similarly to noxious punishers in that it appears to suppress responding apart from its effects on reinforcement density. PMID:20676265

  20. A Randomized Controlled Trial of the Morningside Math Facts Curriculum on Fluency, Stability, Endurance and Application Outcomes

    ERIC Educational Resources Information Center

    McTiernan, Aoife; Holloway, Jennifer; Healy, Olive; Hogan, Michael

    2016-01-01

    A randomized controlled trial was used to evaluate the impact of a frequency-building curriculum to increase the fluency of component mathematics skills in a sample of 28 males aged 9-11 years. Assessments of mathematical ability were conducted before and after the training period to evaluate the impact of learning component skills fluently on…

  1. Probabilistic material degradation model for aerospace materials subjected to high temperature, mechanical and thermal fatigue, and creep

    NASA Technical Reports Server (NTRS)

    Boyce, L.

    1992-01-01

    A probabilistic general material strength degradation model has been developed for structural components of aerospace propulsion systems subjected to diverse random effects. The model has been implemented in two FORTRAN programs, PROMISS (Probabilistic Material Strength Simulator) and PROMISC (Probabilistic Material Strength Calibrator). PROMISS calculates the random lifetime strength of an aerospace propulsion component due to as many as eighteen diverse random effects. Results are presented in the form of probability density functions and cumulative distribution functions of lifetime strength. PROMISC calibrates the model by calculating the values of empirical material constants.

  2. Time series analysis of collective motions in proteins

    NASA Astrophysics Data System (ADS)

    Alakent, Burak; Doruker, Pemra; Çamurdan, Mehmet C.

    2004-01-01

    The dynamics of α-amylase inhibitor tendamistat around its native state is investigated using time series analysis of the principal components of the Cα atomic displacements obtained from molecular dynamics trajectories. Collective motion along a principal component is modeled as a homogeneous nonstationary process, which is the result of the damped oscillations in local minima superimposed on a random walk. The motion in local minima is described by a stationary autoregressive moving average model, consisting of the frequency, damping factor, moving average parameters and random shock terms. Frequencies for the first 50 principal components are found to be in the 3-25 cm-1 range, which are well correlated with the principal component indices and also with atomistic normal mode analysis results. Damping factors, though their correlation is less pronounced, decrease as principal component indices increase, indicating that low frequency motions are less affected by friction. The existence of a positive moving average parameter indicates that the stochastic force term is likely to disturb the mode in opposite directions for two successive sampling times, showing the mode's tendency to stay close to the minimum. All these four parameters affect the mean square fluctuations of a principal mode within a single minimum. The inter-minima transitions are described by a random walk model, which is driven by a random shock term considerably smaller than that for the intra-minimum motion. The principal modes are classified into three subspaces based on their dynamics: essential, semiconstrained, and constrained, at least in partial consistency with previous studies. The Gaussian-type distributions of the intermediate modes, called "semiconstrained" modes, are explained by asserting that this random walk behavior is not completely free but between energy barriers.

  3. Process Evaluation of a Multi-Component Intervention to Reduce Infectious Diseases and Improve Hygiene and Well-Being among School Children: The Hi Five Study

    ERIC Educational Resources Information Center

    Bonnesen, C. T.; Plauborg, R.; Denbaek, A. M.; Due, P.; Johansen, A.

    2015-01-01

    The Hi Five study was a three-armed cluster randomized controlled trial designed to reduce infections and improve hygiene and well-being among pupils. Participating schools (n = 43) were randomized into either control (n = 15) or one of two intervention groups (n = 28). The intervention consisted of three components: (i) a curriculum (ii)…

  4. On chemical distances and shape theorems in percolation models with long-range correlations

    NASA Astrophysics Data System (ADS)

    Drewitz, Alexander; Ráth, Balázs; Sapozhnikov, Artëm

    2014-08-01

    In this paper, we provide general conditions on a one parameter family of random infinite subsets of {{Z}}^d to contain a unique infinite connected component for which the chemical distances are comparable to the Euclidean distance. In addition, we show that these conditions also imply a shape theorem for the corresponding infinite connected component. By verifying these conditions for specific models, we obtain novel results about the structure of the infinite connected component of the vacant set of random interlacements and the level sets of the Gaussian free field. As a byproduct, we obtain alternative proofs to the corresponding results for random interlacements in the work of Černý and Popov ["On the internal distance in the interlacement set," Electron. J. Probab. 17(29), 1-25 (2012)], and while our main interest is in percolation models with long-range correlations, we also recover results in the spirit of the work of Antal and Pisztora ["On the chemical distance for supercritical Bernoulli percolation," Ann Probab. 24(2), 1036-1048 (1996)] for Bernoulli percolation. Finally, as a corollary, we derive new results about the (chemical) diameter of the largest connected component in the complement of the trace of the random walk on the torus.

  5. Effect of randomness on multi-frequency aeroelastic responses resolved by Unsteady Adaptive Stochastic Finite Elements

    NASA Astrophysics Data System (ADS)

    Witteveen, Jeroen A. S.; Bijl, Hester

    2009-10-01

    The Unsteady Adaptive Stochastic Finite Elements (UASFE) method resolves the effect of randomness in numerical simulations of single-mode aeroelastic responses with a constant accuracy in time for a constant number of samples. In this paper, the UASFE framework is extended to multi-frequency responses and continuous structures by employing a wavelet decomposition pre-processing step to decompose the sampled multi-frequency signals into single-frequency components. The effect of the randomness on the multi-frequency response is then obtained by summing the results of the UASFE interpolation at constant phase for the different frequency components. Results for multi-frequency responses and continuous structures show a three orders of magnitude reduction of computational costs compared to crude Monte Carlo simulations in a harmonically forced oscillator, a flutter panel problem, and the three-dimensional transonic AGARD 445.6 wing aeroelastic benchmark subject to random fields and random parameters with various probability distributions.
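
    A generic sketch of the pre-processing idea described above, assuming PyWavelets is available: a sampled multi-frequency signal is split into wavelet bands, each of which can then be treated as a (nearly) single-frequency component. The signal, wavelet family, and decomposition depth are illustrative and are not taken from the UASFE implementation.

        # Decompose a two-frequency signal into wavelet bands and reconstruct
        # each band separately; the bands sum back to the original signal.
        import numpy as np
        import pywt

        fs = 200.0                                    # sampling rate [Hz] (assumed)
        t = np.arange(0.0, 10.0, 1.0 / fs)
        signal = np.sin(2 * np.pi * 2.0 * t) + 0.5 * np.sin(2 * np.pi * 20.0 * t)

        coeffs = pywt.wavedec(signal, wavelet="db4", level=4)

        # Rebuild one band at a time by zeroing all other coefficient arrays.
        bands = []
        for i in range(len(coeffs)):
            kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
            bands.append(pywt.waverec(kept, wavelet="db4")[: len(signal)])

        # Linearity of the transform: the band-limited components sum to the signal.
        print(np.allclose(sum(bands), signal))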

  6. Peri-apatite coating decreases uncemented tibial component migration: long-term RSA results of a randomized controlled trial and limitations of short-term results.

    PubMed

    Van Hamersveld, Koen T; Marang-Van De Mheen, Perla J; Nelissen, Rob G H H; Toksvig-Larsen, Sören

    2018-05-09

    Background and purpose - Biological fixation of uncemented knee prostheses can be improved by applying hydroxyapatite coating around the porous surface via a solution deposition technique called Peri-Apatite (PA). The 2-year results of a randomized controlled trial, evaluating the effect of PA, revealed several components with continuous migration in the second postoperative year, particularly in the uncoated group. To evaluate whether absence of early stabilization is diagnostic of loosening, we now present long-term follow-up results. Patients and methods - 60 patients were randomized to PA-coated or uncoated (porous only) total knee arthroplasty of which 58 were evaluated with radiostereometric analysis (RSA) performed at baseline, at 3 months postoperatively and at 1, 2, 5, 7, and 10 years. A linear mixed-effects model was used to analyze the repeated measurements. Results - PA-coated components had a statistically significantly lower mean migration at 10 years of 0.94 mm (95% CI 0.72-1.2) compared with the uncoated group showing a mean migration of 1.72 mm (95% CI 1.4-2.1). Continuous migration in the second postoperative year was seen in 7 uncoated components and in 1 PA-coated component. All of these implants stabilized after 2 years except for 2 uncoated components. Interpretation - Peri-apatite enhances stabilization of uncemented components. The number of components that stabilized after 2 years emphasizes the importance of longer follow-up to determine full stabilization and risk of loosening in uncemented components with biphasic migration profiles.

  7. Theoretical Calculation of the Power Spectra of the Rolling and Yawing Moments on a Wing in Random Turbulence

    NASA Technical Reports Server (NTRS)

    Eggleston, John M; Diederich, Franklin W

    1957-01-01

    The correlation functions and power spectra of the rolling and yawing moments on an airplane wing due to the three components of continuous random turbulence are calculated. The rolling moments due to the longitudinal (horizontal) and normal (vertical) components depend on the spanwise distributions of instantaneous gust intensity, which are taken into account by using the inherent properties of symmetry of isotropic turbulence. The results consist of expressions for the correlation functions or spectra of the rolling moment in terms of the point correlation functions of the two components of turbulence. Specific numerical calculations are made for a pair of correlation functions given by simple analytic expressions which fit available experimental data quite well. Calculations are made for four lift distributions. Comparison is made with the results of previous analyses which assumed random turbulence along the flight path and linear variations of gust velocity across the span.

  8. Simulation of the Effects of Random Measurement Errors

    ERIC Educational Resources Information Center

    Kinsella, I. A.; Hannaidh, P. B. O.

    1978-01-01

    Describes a simulation method for random measurement errors that requires calculators and tables of random digits. Each student simulates the random behaviour of the component variables in the function, and by combining the results of all students, the outline of the sampling distribution of the function can be obtained. (GA)
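
    A small sketch of the exercise described above, done with a pseudo-random number generator instead of tables of random digits; the measured function, component values, and error sizes are invented for illustration.

        # Propagate random measurement errors through a function by simulation:
        # each simulated trial plays the role of one student's calculation.
        import numpy as np

        rng = np.random.default_rng(1)
        n_trials = 1000

        L_true, W_true = 12.0, 5.0          # "true" length and width (assumed)
        sigma_L, sigma_W = 0.2, 0.1         # measurement standard deviations (assumed)

        L = L_true + sigma_L * rng.standard_normal(n_trials)
        W = W_true + sigma_W * rng.standard_normal(n_trials)
        area = L * W                        # the derived quantity

        print(f"simulated: mean = {area.mean():.2f}, std = {area.std(ddof=1):.2f}")
        # First-order error propagation predicts std ~ sqrt((W sL)^2 + (L sW)^2).
        print(f"propagation formula: std = {np.hypot(W_true * sigma_L, L_true * sigma_W):.2f}")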

  9. Failure and recovery in dynamical networks.

    PubMed

    Böttcher, L; Luković, M; Nagler, J; Havlin, S; Herrmann, H J

    2017-02-03

    Failure, damage spread and recovery crucially underlie many spatially embedded networked systems ranging from transportation structures to the human body. Here we study the interplay between spontaneous damage, induced failure and recovery in both embedded and non-embedded networks. In our model the network's components follow three realistic processes that capture these features: (i) spontaneous failure of a component independent of the neighborhood (internal failure), (ii) failure induced by failed neighboring nodes (external failure) and (iii) spontaneous recovery of a component. We identify a metastable domain in the global network phase diagram spanned by the model's control parameters where dramatic hysteresis effects and random switching between two coexisting states are observed. This dynamics depends on the characteristic link length of the embedded system. For the Euclidean lattice in particular, hysteresis and switching only occur in an extremely narrow region of the parameter space compared to random networks. We develop a unifying theory which links the dynamics of our model to contact processes. Our unifying framework may help to better understand controllability in spatially embedded and random networks where spontaneous recovery of components can mitigate spontaneous failure and damage spread in dynamical networks.
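
    A toy, discrete-time rendering of the three processes named above (internal failure, neighbour-induced failure, spontaneous recovery) on an Erdős–Rényi graph; the graph size, per-step probabilities, and failure threshold are invented, and the update rule is a simplification of the model studied in the paper.

        # Simulate spontaneous failure, induced failure, and recovery on a
        # random graph and report the surviving fraction of active nodes.
        import numpy as np
        import networkx as nx

        rng = np.random.default_rng(2)
        G = nx.gnp_random_graph(300, 0.03, seed=2)
        nodes = list(G.nodes)
        active = np.ones(len(nodes), dtype=bool)

        p_internal, p_recover = 0.001, 0.05     # per-step probabilities (assumed)
        p_external, threshold = 0.2, 0.5        # induced failure if >50% of neighbours failed

        for step in range(500):
            failed_frac = np.array([
                0.0 if G.degree(n) == 0
                else np.mean([not active[m] for m in G.neighbors(n)])
                for n in nodes
            ])
            internal = rng.random(len(nodes)) < p_internal
            external = (failed_frac > threshold) & (rng.random(len(nodes)) < p_external)
            recover = (~active) & (rng.random(len(nodes)) < p_recover)
            active = (active & ~(internal | external)) | recover

        print(f"active fraction after {step + 1} steps: {active.mean():.2f}")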

  10. RANDOM MATRIX DIAGONALIZATION--A COMPUTER PROGRAM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fuchel, K.; Greibach, R.J.; Porter, C.E.

    A computer program is described which generates random matrices, diagonalizes them, and sorts appropriately the resulting eigenvalues and eigenvector components. FAP and FORTRAN listings for the IBM 7090 computer are included. (auth)
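
    The same task in a few lines of modern code, as a hedged sketch (NumPy rather than FAP/FORTRAN, with a GOE-like symmetric matrix as the example ensemble):

        # Generate a random real symmetric matrix, diagonalize it, and sort the
        # eigenvalues together with their eigenvector components.
        import numpy as np

        rng = np.random.default_rng(3)
        n = 200
        A = rng.standard_normal((n, n))
        H = (A + A.T) / 2.0                      # symmetric random matrix

        eigvals, eigvecs = np.linalg.eigh(H)     # eigh already returns ascending order
        order = np.argsort(eigvals)              # explicit sort kept to mirror the program
        eigvals, eigvecs = eigvals[order], eigvecs[:, order]

        print(eigvals[:5])                       # five lowest eigenvalues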

  11. Maraviroc, a chemokine receptor-5 antagonist, fails to demonstrate efficacy in the treatment of patients with rheumatoid arthritis in a randomized, double-blind placebo-controlled trial

    PubMed Central

    2012-01-01

    Introduction The purpose of this study was to determine whether maraviroc, a human CC chemokine receptor 5 (CCR5) antagonist, is safe and effective in the treatment of active rheumatoid arthritis (RA) in patients on background methotrexate (MTX). Methods This phase IIa study comprised two distinct components: an open-label safety study of the pharmacokinetics (PK) of MTX in the presence of maraviroc, and a randomized, double-blind, placebo-controlled, proof-of-concept (POC) component. In the PK component, patients were randomized 1:1 to receive maraviroc 150 or 300 mg twice daily (BID) for four weeks. In the POC component, patients were randomized 2:1 to receive maraviroc 300 mg BID or placebo for 12 weeks. Patients were not eligible for inclusion in both components. Results Sixteen patients were treated in the safety/PK component. Maraviroc was well tolerated and there was no evidence of drug-drug interaction with MTX. One hundred ten patients were treated in the POC component. The study was terminated after the planned interim futility analysis due to lack of efficacy, at which time 59 patients (38 maraviroc; 21 placebo) had completed their week 12 visit. There was no significant difference in the number of ACR20 responders between the maraviroc (23.7%) and placebo (23.8%) groups (treatment difference -0.13%; 90% CI -20.45, 17.70; P = 0.504). The most common all-causality treatment-emergent adverse events in the maraviroc group were constipation (7.8%), nausea (5.2%), and fatigue (3.9%). Conclusions Maraviroc was generally well tolerated over 12 weeks; however, selective antagonism of CCR5 with maraviroc 300 mg BID failed to improve signs and symptoms in patients with active RA on background MTX. Trial Registration ClinicalTrials.gov: NCT00427934 PMID:22251436

  12. Price percolation model

    NASA Astrophysics Data System (ADS)

    Kanai, Yasuhiro; Abe, Keiji; Seki, Yoichi

    2015-06-01

    We propose a price percolation model to reproduce the price distribution of components used in industrial finished goods. The intent is to show, using the price percolation model and a component category as an example, that percolation behaviors, which exist in the matter system, the ecosystem, and human society, also exist in abstract, random phenomena satisfying the power law. First, we discretize the total potential demand for a component category, considering it a random field. Second, we assume that the discretized potential demand corresponding to a function of a finished good turns into actual demand if the difficulty of function realization is less than the maximum difficulty of the realization. The simulations using this model suggest that changes in a component category's price distribution are due to changes in the total potential demand corresponding to the lattice size and the maximum difficulty of realization, which is an occupation probability. The results are verified using electronic components' sales data.

  13. Fatigue strength reduction model: RANDOM3 and RANDOM4 user manual. Appendix 2: Development of advanced methodologies for probabilistic constitutive relationships of material strength models

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Lovelace, Thomas B.

    1989-01-01

    FORTRAN programs RANDOM3 and RANDOM4 are documented in the form of a user's manual. Both programs are based on fatigue strength reduction, using a probabilistic constitutive model. The programs predict the random lifetime of an engine component to reach a given fatigue strength. The theoretical backgrounds, input data instructions, and sample problems illustrating the use of the programs are included.

  14. The voluntary-threat approach to control nonpoint source pollution under uncertainty.

    PubMed

    Li, Youping

    2013-11-15

    This paper extends the voluntary-threat approach of Segerson and Wu (2006) to the case in which the ambient level of nonpoint source pollution is stochastic. It is shown that when the random component is bounded from above, fine-tuning the cutoff value of the tax payments avoids the actual imposition of the tax, while the threat of such payments retains the necessary incentive for the polluters to engage in abatement at the optimal level. If the random component is not bounded, the imposition of the tax cannot be completely avoided, but the probability can be reduced by setting a higher cutoff value. It is also noted that the regulator has additional flexibility in randomizing the tax imposition, but the randomization process has to be credible. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. The Hi Five study: design of a school-based randomized trial to reduce infections and improve hygiene and well-being among 6-15 year olds in Denmark.

    PubMed

    Johansen, Anette; Denbæk, Anne Maj; Bonnesen, Camilla Thørring; Due, Pernille

    2015-03-01

    Infectious illnesses such as influenza and diarrhea are leading causes of absenteeism among Danish school children. Interventions in school settings addressing hand hygiene have been shown to reduce the number of infectious illnesses. However, most of these studies include small populations and almost none of them are conducted as randomized controlled trials. The overall aim of the Hi Five study was to develop, implement and evaluate a multi-component school-based intervention to improve hand hygiene and well-being and to reduce the prevalence of infections among school children in intervention schools by 20% compared to control schools. This paper describes the development and the evaluation design of Hi Five. The Hi Five study was designed as a three-armed cluster-randomized controlled trial. A national random sample of schools (n = 44) was randomized to one of two intervention groups (n = 29) or to a control group with no intervention (n = 15). A total of 8,438 six- to fifteen-year-old school children were enrolled in the study. The Hi Five intervention consisted of three components: 1) a curriculum component; 2) mandatory daily hand washing before lunch; and 3) extra cleaning of school toilets during the school day. Baseline data were collected from December 2011 to April 2012. The intervention period was August 2012 to June 2013. The follow-up data were collected from December 2012 to April 2013. The Hi Five study fills a gap in international research. This large randomized multi-component school-based hand hygiene intervention is the first to include education on healthy and appropriate toilet behavior as part of the curriculum. No previous studies have involved supplementary cleaning of the school toilets as an intervention component. The study will have the added value of providing new knowledge about the usability of short message service (SMS, text message) for collecting data on infectious illness and absenteeism in large study populations. Current Controlled Trials ISRCTN19287682, 21 December 2012.

  16. An Alternative Method for Computing Mean and Covariance Matrix of Some Multivariate Distributions

    ERIC Educational Resources Information Center

    Radhakrishnan, R.; Choudhury, Askar

    2009-01-01

    Computing the mean and covariance matrix of some multivariate distributions, in particular, multivariate normal distribution and Wishart distribution are considered in this article. It involves a matrix transformation of the normal random vector into a random vector whose components are independent normal random variables, and then integrating…

  17. Randomized central limit theorems: A unified theory.

    PubMed

    Eliazar, Iddo; Klafter, Joseph

    2010-08-01

    The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles' aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles' extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic-scaling all ensemble components by a common deterministic scale. However, there are "random environment" settings in which the underlying scaling schemes are stochastic-scaling the ensemble components by different random scales. Examples of such settings include Holtsmark's law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs)-in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes-and present "randomized counterparts" to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.

  18. Randomized central limit theorems: A unified theory

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Klafter, Joseph

    2010-08-01

    The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles’ aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles’ extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic—scaling all ensemble components by a common deterministic scale. However, there are “random environment” settings in which the underlying scaling schemes are stochastic—scaling the ensemble components by different random scales. Examples of such settings include Holtsmark’s law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs)—in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes—and present “randomized counterparts” to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.

  19. Assessing variance components in multilevel linear models using approximate Bayes factors: A case study of ethnic disparities in birthweight

    PubMed Central

    Saville, Benjamin R.; Herring, Amy H.; Kaufman, Jay S.

    2013-01-01

    Racial/ethnic disparities in birthweight are a large source of differential morbidity and mortality worldwide and have remained largely unexplained in epidemiologic models. We assess the impact of maternal ancestry and census tract residence on infant birth weights in New York City and the modifying effects of race and nativity by incorporating random effects in a multilevel linear model. Evaluating the significance of these predictors involves the test of whether the variances of the random effects are equal to zero. This is problematic because the null hypothesis lies on the boundary of the parameter space. We generalize an approach for assessing random effects in the two-level linear model to a broader class of multilevel linear models by scaling the random effects to the residual variance and introducing parameters that control the relative contribution of the random effects. After integrating over the random effects and variance components, the resulting integrals needed to calculate the Bayes factor can be efficiently approximated with Laplace’s method. PMID:24082430

  20. Uncrackable code for nuclear weapons

    ScienceCinema

    Hart, Mark

    2018-05-11

    Mark Hart, a scientist and engineer in Lawrence Livermore National Laboratory's (LLNL) Defense Technologies Division, has developed a new approach for ensuring nuclear weapons and their components can't fall prey to unauthorized use. The beauty of his approach: Let the weapon protect itself. "Using the random process of nuclear radioactive decay is the gold standard of random number generators," said Mark Hart. "You’d have a better chance of winning both Mega Millions and Powerball on the same day than getting control of IUC-protected components."

  1. Uncrackable code for nuclear weapons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Mark

    Mark Hart, a scientist and engineer in Lawrence Livermore National Laboratory's (LLNL) Defense Technologies Division, has developed a new approach for ensuring nuclear weapons and their components can't fall prey to unauthorized use. The beauty of his approach: Let the weapon protect itself. "Using the random process of nuclear radioactive decay is the gold standard of random number generators," said Mark Hart. "You’d have a better chance of winning both Mega Millions and Powerball on the same day than getting control of IUC-protected components."

  2. Self-correcting random number generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humble, Travis S.; Pooser, Raphael C.

    2016-09-06

    A system and method for generating random numbers. The system may include a random number generator (RNG), such as a quantum random number generator (QRNG), configured to self-correct or adapt in order to substantially achieve randomness from the output of the RNG. By adapting, the RNG may generate a random number that may be considered random regardless of whether the random number itself is tested as such. As an example, the RNG may include components to monitor one or more characteristics of the RNG during operation, and may use the monitored characteristics as a basis for adapting, or self-correcting, to provide a random number according to one or more performance criteria.

  3. Do Conditional Reinforcers Count?

    ERIC Educational Resources Information Center

    Davison, Michael; Baum, William M.

    2006-01-01

    Six pigeons were trained on a procedure in which seven components arranged different food-delivery ratios on concurrent variable-interval schedules each session. The components were unsignaled, lasted for 10 food deliveries, and occurred in random order with a 60-s blackout between components. The schedules were arranged using a switching-key…

  4. Financial Management and Job Social Skills Training Components in a Summer Business Institute

    ERIC Educational Resources Information Center

    Donohue, Brad; Conway, Debbie; Beisecker, Monica; Murphy, Heather; Farley, Alisha; Waite, Melissa; Gugino, Kristin; Knatz, Danielle; Lopez-Frank, Carolina; Burns, Jack; Madison, Suzanne; Shorty, Carrie

    2005-01-01

    Ninety-two adolescents, predominantly ethnic minority high school students, participated in a structured Summer Business Institute (SBI). Participating youth were randomly assigned to receive either job social skills or financial management skills training components. Students who additionally received the job social skills training component were…

  5. Considering Horn's Parallel Analysis from a Random Matrix Theory Point of View.

    PubMed

    Saccenti, Edoardo; Timmerman, Marieke E

    2017-03-01

    Horn's parallel analysis is a widely used method for assessing the number of principal components and common factors. We discuss the theoretical foundations of parallel analysis for principal components based on a covariance matrix by making use of arguments from random matrix theory. In particular, we show that (i) for the first component, parallel analysis is an inferential method equivalent to the Tracy-Widom test, (ii) its use to test high-order eigenvalues is equivalent to the use of the joint distribution of the eigenvalues, and thus should be discouraged, and (iii) a formal test for higher-order components can be obtained based on a Tracy-Widom approximation. We illustrate the performance of the two testing procedures using simulated data generated under both a principal component model and a common factors model. For the principal component model, the Tracy-Widom test performs consistently in all conditions, while parallel analysis shows unpredictable behavior for higher-order components. For the common factor model, including major and minor factors, both procedures are heuristic approaches, with variable performance. We conclude that the Tracy-Widom procedure is preferred over parallel analysis for statistically testing the number of principal components based on a covariance matrix.
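
    A minimal sketch of Horn's parallel analysis for principal components, on synthetic data: observed covariance eigenvalues are compared with a high percentile of eigenvalues from random data of the same size. The sample sizes, number of simulations, and the 95th-percentile cutoff are conventional choices, not values from the article.

        # Retain components whose sample eigenvalues exceed the 95th percentile
        # of eigenvalues obtained from uncorrelated random data of the same size.
        import numpy as np

        rng = np.random.default_rng(4)
        n_obs, n_var = 300, 10

        # Synthetic data with two genuine components plus noise.
        X = rng.standard_normal((n_obs, 2)) @ rng.standard_normal((2, n_var))
        X += rng.standard_normal((n_obs, n_var))

        def sorted_eigs(data):
            return np.sort(np.linalg.eigvalsh(np.cov(data, rowvar=False)))[::-1]

        observed = sorted_eigs(X)
        simulated = np.array([sorted_eigs(rng.standard_normal((n_obs, n_var)))
                              for _ in range(500)])
        threshold = np.percentile(simulated, 95, axis=0)

        print("components retained:", int(np.sum(observed > threshold)))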

  6. Surface effect investigation on multipactor in microwave components using the EM-PIC method

    NASA Astrophysics Data System (ADS)

    Li, Yun; Ye, Ming; He, Yong-Ning; Cui, Wan-Zhao; Wang, Dan

    2017-11-01

    Multipactor poses a great risk to microwave components in space and its accurate, controllable suppression is still lacking. To evaluate the effect of the secondary electron emission (SEE) of arbitrary surface states on multipactor, metal samples fabricated with ideal smoothness, random roughness, and micro-structures on the surface are investigated through SEE experiments and multipactor simulations. An accurate quantitative relationship between the SEE parameters and the multipactor discharge threshold in practical components has been established through Electromagnetic Particle-In-Cell (EM-PIC) simulation. Simulation results for microwave components, including an impedance transformer and a coaxial filter, exhibit an intuitive correlation between the critical SEE parameters, which vary with the surface state, and the multipactor thresholds. It is demonstrated that it is the surface micro-structures with a certain depth and morphology, rather than the random surface reliefs, that determine the average yield of secondaries. Both the random surface reliefs and the micro-structures have a scattering effect on SEE, and the yield tends to be identical for different elevation angles of the incident electrons. This offers great potential for optimizing and improving suppression technology without exhaustive sweeps of the technological parameters.

  7. Rocket Engine Nozzle Side Load Transient Analysis Methodology: A Practical Approach

    NASA Technical Reports Server (NTRS)

    Shi, John J.

    2005-01-01

    During the development stage, in order to design and size the rocket engine components and to reduce risks, the local dynamic environments as well as the dynamic interface loads must be defined. There are two kinds of dynamic environments: shock transients, and steady-state random and sinusoidal vibration environments. Usually, the steady-state random and sinusoidal vibration environments are scalable, but the shock environments are not. In other words, based on similarities, only random vibration environments can be defined for a new engine. The methodology covered in this paper provides a way to predict the shock environments and the dynamic loads for new engine systems and new engine components in the early stage of new engine development or engine nozzle modifications.

  8. New constraints on modelling the random magnetic field of the MW

    NASA Astrophysics Data System (ADS)

    Beck, Marcus C.; Beck, Alexander M.; Beck, Rainer; Dolag, Klaus; Strong, Andrew W.; Nielaba, Peter

    2016-05-01

    We extend the description of the isotropic and anisotropic random component of the small-scale magnetic field within the existing magnetic field model of the Milky Way from Jansson & Farrar, by including random realizations of the small-scale component. Using a magnetic-field power spectrum with Gaussian random fields, the NE2001 model for the thermal electrons and the Galactic cosmic-ray electron distribution from the current GALPROP model we derive full-sky maps for the total and polarized synchrotron intensity as well as the Faraday rotation-measure distribution. While previous work assumed that small-scale fluctuations average out along the line-of-sight or only computed ensemble averages of random fields, we show that these fluctuations need to be carefully taken into account. Comparing with observational data we obtain not only good agreement with 408 MHz total and WMAP7 22 GHz polarized intensity emission maps, but also an improved agreement with Galactic foreground rotation-measure maps and power spectra, whose amplitude and shape strongly depend on the parameters of the random field. We demonstrate that a correlation length of ≈22 pc (5 pc being a 5σ lower limit) is needed to match the slope of the observed power spectrum of Galactic foreground rotation-measure maps. Using multiple realizations also allows us to infer errors on individual observables. We find that previously used amplitudes for the random and anisotropic random magnetic field components need to be rescaled by factors of ≈0.3 and 0.6 to account for the new small-scale contributions. Our model predicts a rotation measure of -2.8±7.1 rad/m2 and 4.4±11.0 rad/m2 for the north and south Galactic poles respectively, in good agreement with observations. Applying our model to deflections of ultra-high-energy cosmic rays we infer a mean deflection of ≈3.5±1.1 degrees for 60 EeV protons arriving from CenA.
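
    One ingredient of the approach described above, sketched generically: drawing a random realization of a Gaussian random field with a prescribed power-law spectrum by filtering white noise in Fourier space. The grid size and spectral index are illustrative and not the values of the cited Galactic-field model.

        # Generate a 2-D Gaussian random field with a power-law spectrum.
        import numpy as np

        rng = np.random.default_rng(5)
        n = 256
        k = np.fft.fftfreq(n)
        kx, ky = np.meshgrid(k, k, indexing="ij")
        kmag = np.hypot(kx, ky)
        kmag[0, 0] = np.inf                       # suppress the mean (k = 0) mode

        spectral_index = -11.0 / 3.0              # Kolmogorov-like slope (assumed)
        amplitude = kmag ** (spectral_index / 2.0)

        white = np.fft.fft2(rng.standard_normal((n, n)))
        field = np.real(np.fft.ifft2(white * amplitude))
        field /= field.std()                      # normalize to unit rms

        print(f"field rms = {field.std():.2f}, mean = {field.mean():.2e}")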

  9. Least Principal Components Analysis (LPCA): An Alternative to Regression Analysis.

    ERIC Educational Resources Information Center

    Olson, Jeffery E.

    Often, all of the variables in a model are latent, random, or subject to measurement error, or there is not an obvious dependent variable. When any of these conditions exist, an appropriate method for estimating the linear relationships among the variables is Least Principal Components Analysis. Least Principal Components are robust, consistent,…

  10. Examining the Relationships of Component Reading Skills to Reading Comprehension in Struggling Adult Readers

    ERIC Educational Resources Information Center

    Tighe, Elizabeth L.; Schatschneider, Christopher

    2016-01-01

    The current study employed a meta-analytic approach to investigate the relative importance of component reading skills to reading comprehension in struggling adult readers. A total of 10 component skills were consistently identified across 16 independent studies and 2,707 participants. Random effects models generated 76 predictor-reading…

  11. Fatigue crack growth model RANDOM2 user manual. Appendix 1: Development of advanced methodologies for probabilistic constitutive relationships of material strength models

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Lovelace, Thomas B.

    1989-01-01

    FORTRAN program RANDOM2 is presented in the form of a user's manual. RANDOM2 is based on fracture mechanics using a probabilistic fatigue crack growth model. It predicts the random lifetime of an engine component to reach a given crack size. Details of the theoretical background, input data instructions, and a sample problem illustrating the use of the program are included.

  12. Parallel Optical Random Access Memory (PORAM)

    NASA Technical Reports Server (NTRS)

    Alphonse, G. A.

    1989-01-01

    It is shown that the need to minimize component count, power and size, and to maximize packing density require a parallel optical random access memory to be designed in a two-level hierarchy: a modular level and an interconnect level. Three module designs are proposed, in the order of research and development requirements. The first uses state-of-the-art components, including individually addressed laser diode arrays, acousto-optic (AO) deflectors and magneto-optic (MO) storage medium, aimed at moderate size, moderate power, and high packing density. The next design level uses an electron-trapping (ET) medium to reduce optical power requirements. The third design uses a beam-steering grating surface emitter (GSE) array to reduce size further and minimize the number of components.

  13. Two-Component Structure in the Entanglement Spectrum of Highly Excited States

    NASA Astrophysics Data System (ADS)

    Yang, Zhi-Cheng; Chamon, Claudio; Hamma, Alioscia; Mucciolo, Eduardo R.

    2015-12-01

    We study the entanglement spectrum of highly excited eigenstates of two known models that exhibit a many-body localization transition, namely the one-dimensional random-field Heisenberg model and the quantum random energy model. Our results indicate that the entanglement spectrum shows a "two-component" structure: a universal part that is associated with random matrix theory, and a nonuniversal part that is model dependent. The nonuniversal part manifests the deviation of the highly excited eigenstate from a true random state even in the thermalized phase where the eigenstate thermalization hypothesis holds. The fraction of the spectrum containing the universal part decreases as one approaches the critical point and vanishes in the localized phase in the thermodynamic limit. We use the universal part fraction to construct an order parameter for measuring the degree of randomness of a generic highly excited state, which is also a promising candidate for studying the many-body localization transition. Two toy models based on Rokhsar-Kivelson type wave functions are constructed and their entanglement spectra are shown to exhibit the same structure.

  14. A Practical Methodology for Quantifying Random and Systematic Components of Unexplained Variance in a Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Deloach, Richard; Obara, Clifford J.; Goodman, Wesley L.

    2012-01-01

    This paper documents a check standard wind tunnel test conducted in the Langley 0.3-Meter Transonic Cryogenic Tunnel (0.3M TCT) that was designed and analyzed using the Modern Design of Experiments (MDOE). The test was designed to partition the unexplained variance of typical wind tunnel data samples into two constituent components, one attributable to ordinary random error and one attributable to systematic error induced by covariate effects. Covariate effects in wind tunnel testing are discussed, with examples. The impact of systematic (non-random) unexplained variance on the statistical independence of sequential measurements is reviewed. The corresponding correlation among experimental errors is discussed, as is the impact of such correlation on experimental results generally. The specific experiment documented herein was organized as a formal test for the presence of unexplained variance in representative samples of wind tunnel data, in order to quantify the frequency with which such systematic error was detected and its magnitude relative to ordinary random error. Levels of systematic and random error reported here are representative of those quantified in other facilities, as cited in the references.
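
    A toy numerical illustration of the partitioning idea, not the MDOE analysis itself: replicate measurements taken in blocks separated in time, with within-block scatter estimating ordinary random error and excess between-block variance indicating systematic (covariate-induced) error. All numbers are invented.

        # One-way variance-component decomposition of blocked replicate data.
        import numpy as np

        rng = np.random.default_rng(6)
        n_blocks, n_rep = 8, 10
        sigma_random, sigma_drift = 0.010, 0.020      # assumed error magnitudes

        drift = sigma_drift * rng.standard_normal(n_blocks)          # slow covariate drift
        data = (0.500 + drift[:, None]
                + sigma_random * rng.standard_normal((n_blocks, n_rep)))

        within = data.var(axis=1, ddof=1).mean()      # estimates random variance
        between = data.mean(axis=1).var(ddof=1)       # drift variance + random/n_rep
        systematic = max(between - within / n_rep, 0.0)

        print(f"random (within-block) variance:    {within:.2e}")
        print(f"systematic (between-block excess): {systematic:.2e}")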

  15. Absence of a Tourniquet Does Not Affect Fixation of Cemented TKA: A Randomized RSA Study of 70 Patients.

    PubMed

    Ejaz, Ashir; Laursen, Anders C; Jakobsen, Thomas; Rasmussen, Sten; Nielsen, Poul Torben; Laursen, Mogens B

    2015-12-01

    We aimed to determine whether not using a tourniquet in cemented TKA would affect migration of the tibial component measured by radiostereometric analysis (RSA). Seventy patients were randomized into a tourniquet group and a non-tourniquet group and, using model-based RSA, the migration of the tibial component was analyzed. Primary and secondary outcome measures were maximum total point motion (MTPM) and translations and rotations. The follow-up period was 2 years. The tibial component was well fixated in both groups and no significant difference in migration between the two groups was detected (P=0.632). Mean MTPM (SD) was 0.47 mm (0.16) in the tourniquet group and 0.45 mm (0.21) in the non-tourniquet group. Absence of a tourniquet indicates that stable fixation of the tibial component can be achieved in cemented TKA. Copyright © 2015 Elsevier Inc. All rights reserved.

  16. Mass and Reliability Source (MaRS) Database

    NASA Technical Reports Server (NTRS)

    Valdenegro, Wladimir

    2017-01-01

    The Mass and Reliability Source (MaRS) Database consolidates component mass and reliability data for all Orbital Replacement Units (ORUs) on the International Space Station (ISS) into a single database. It was created to help engineers develop a parametric model that relates hardware mass and reliability. MaRS supplies relevant failure data at the lowest possible component level while providing support for risk, reliability, and logistics analysis. Random-failure data is usually linked to the ORU assembly. MaRS uses this data to identify and display the lowest possible component failure level. As seen in Figure 1, the failure point is identified to the lowest level: Component 2.1. This is useful for efficient planning of spare supplies, supporting long duration crewed missions, allowing quicker trade studies, and streamlining diagnostic processes. MaRS is composed of information from various databases: MADS (operating hours), VMDB (indentured part lists), and ISS PART (failure data). This information is organized in Microsoft Excel and accessed through a program made in Microsoft Access (Figure 2). The focus of the Fall 2017 internship tour was to identify the components that were the root cause of failure from the given random-failure data, develop a taxonomy for the database, and attach material headings to the component list. Secondary objectives included verifying the integrity of the data in MaRS, eliminating any part discrepancies, and generating documentation for future reference. Due to the nature of the random-failure data, data mining had to be done manually, without the assistance of an automated program, to ensure positive identification.

  17. Algorithm 971: An Implementation of a Randomized Algorithm for Principal Component Analysis

    PubMed Central

    LI, HUAMIN; LINDERMAN, GEORGE C.; SZLAM, ARTHUR; STANTON, KELLY P.; KLUGER, YUVAL; TYGERT, MARK

    2017-01-01

    Recent years have witnessed intense development of randomized methods for low-rank approximation. These methods target principal component analysis and the calculation of truncated singular value decompositions. The present article presents an essentially black-box, foolproof implementation for Mathworks’ MATLAB, a popular software platform for numerical computation. As illustrated via several tests, the randomized algorithms for low-rank approximation outperform or at least match the classical deterministic techniques (such as Lanczos iterations run to convergence) in basically all respects: accuracy, computational efficiency (both speed and memory usage), ease-of-use, parallelizability, and reliability. However, the classical procedures remain the methods of choice for estimating spectral norms and are far superior for calculating the least singular values and corresponding singular vectors (or singular subspaces). PMID:28983138
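
    A minimal randomized low-rank SVD in the spirit of the algorithms the article implements (Gaussian range finding, a few power iterations, then a small exact SVD); this NumPy sketch is not the published MATLAB code, and the oversampling and iteration counts are typical defaults rather than the article's.

        # Randomized truncated SVD: sample the range of A, project, and take an
        # exact SVD in the small subspace.
        import numpy as np

        def randomized_svd(A, rank, n_oversample=10, n_iter=2, seed=None):
            rng = np.random.default_rng(seed)
            Omega = rng.standard_normal((A.shape[1], rank + n_oversample))
            Y = A @ Omega                          # sample the column space of A
            for _ in range(n_iter):                # power iterations sharpen the basis
                Y = A @ (A.T @ Y)
            Q, _ = np.linalg.qr(Y)
            B = Q.T @ A                            # small (rank + p) x n matrix
            Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
            return (Q @ Ub)[:, :rank], s[:rank], Vt[:rank]

        rng = np.random.default_rng(7)
        A = rng.standard_normal((2000, 20)) @ rng.standard_normal((20, 500))  # rank-20 matrix
        U, s, Vt = randomized_svd(A, rank=20, seed=7)
        print(np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A))           # near machine precision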

  18. On fatigue crack growth under random loading

    NASA Astrophysics Data System (ADS)

    Zhu, W. Q.; Lin, Y. K.; Lei, Y.

    1992-09-01

    A probabilistic analysis of the fatigue crack growth, fatigue life and reliability of a structural or mechanical component is presented on the basis of fracture mechanics and theory of random processes. The material resistance to fatigue crack growth and the time-history of the stress are assumed to be random. Analytical expressions are obtained for the special case in which the random stress is a stationary narrow-band Gaussian random process, and a randomized Paris-Erdogan law is applicable. As an example, the analytical method is applied to a plate with a central crack, and the results are compared with those obtained from digital Monte Carlo simulations.
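
    A Monte Carlo sketch in the spirit of the analysis described above: a randomized Paris-Erdogan law driven by Rayleigh-distributed stress amplitudes (the narrow-band Gaussian case), with the Paris coefficient itself random. The material constants, geometry factor, and stress level are invented for illustration.

        # Simulate crack growth cycle by cycle until a critical size is reached
        # and collect the distribution of fatigue lives.
        import numpy as np

        rng = np.random.default_rng(8)
        a0, a_crit = 2e-3, 2e-2            # initial and critical crack sizes [m]
        m = 3.0                            # Paris exponent (assumed)
        C_median, C_scatter = 5e-11, 0.2   # lognormal Paris coefficient, MPa*sqrt(m) units
        sigma_rms = 120.0                  # RMS stress of the narrow-band process [MPa]

        lives = []
        for _ in range(50):
            C = C_median * np.exp(C_scatter * rng.standard_normal())
            a, cycles = a0, 0
            while a < a_crit and cycles < 2_000_000:
                # Peaks of a narrow-band Gaussian process are Rayleigh distributed.
                delta_sigma = sigma_rms * np.sqrt(-2.0 * np.log(1.0 - rng.random()))
                delta_K = delta_sigma * np.sqrt(np.pi * a)     # geometry factor Y = 1
                a += C * delta_K**m
                cycles += 1
            lives.append(cycles)

        lives = np.array(lives)
        print(f"median life {np.median(lives):.0f} cycles, "
              f"scatter (90th/10th percentile) {np.percentile(lives, 90) / np.percentile(lives, 10):.2f}")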

  19. Two-component Structure in the Entanglement Spectrum of Highly Excited States

    NASA Astrophysics Data System (ADS)

    Yang, Zhi-Cheng; Chamon, Claudio; Hamma, Alioscia; Mucciolo, Eduardo

    We study the entanglement spectrum of highly excited eigenstates of two known models which exhibit a many-body localization transition, namely the one-dimensional random-field Heisenberg model and the quantum random energy model. Our results indicate that the entanglement spectrum shows a "two-component" structure: a universal part that is associated with random matrix theory, and a non-universal part that is model dependent. The non-universal part manifests the deviation of the highly excited eigenstate from a true random state even in the thermalized phase where the Eigenstate Thermalization Hypothesis holds. The fraction of the spectrum containing the universal part decreases continuously as one approaches the critical point and vanishes in the localized phase in the thermodynamic limit. We use the universal part fraction to construct a new order parameter for the many-body delocalized-to-localized transition. Two toy models based on Rokhsar-Kivelson type wavefunctions are constructed and their entanglement spectra are shown to exhibit the same structure.

  20. Probabilistic analysis for fatigue strength degradation of materials

    NASA Technical Reports Server (NTRS)

    Royce, Lola

    1989-01-01

    This report presents the results of the first year of a research program conducted for NASA-LeRC by the University of Texas at San Antonio. The research included development of methodology that provides a probabilistic treatment of lifetime prediction of structural components of aerospace propulsion systems subjected to fatigue. Material strength degradation models, based on primitive variables, include both a fatigue strength reduction model and a fatigue crack growth model. Linear elastic fracture mechanics is utilized in the latter model. Probabilistic analysis is based on simulation, and both maximum entropy and maximum penalized likelihood methods are used for the generation of probability density functions. The resulting constitutive relationships are included in several computer programs, RANDOM2, RANDOM3, and RANDOM4. These programs determine the random lifetime of an engine component, in mechanical load cycles, to reach a critical fatigue strength or crack size. The material considered was a cast nickel base superalloy, one typical of those used in the Space Shuttle Main Engine.

  1. Optical Interactions at Randomly Rough Surfaces

    DTIC Science & Technology

    2003-03-10

    frequency range. The design of a random surface that acts as a Lambertian diffuser, especially in the infrared region of the optical spectrum, is...FTIR grazing angle microscopy. Recently, an experimental study was performed of the far-field scattering at small grazing angles, especially the enhanced...a specular component in the scattered light, in this frequency range.

  2. Randomness Amplification under Minimal Fundamental Assumptions on the Devices

    NASA Astrophysics Data System (ADS)

    Ramanathan, Ravishankar; Brandão, Fernando G. S. L.; Horodecki, Karol; Horodecki, Michał; Horodecki, Paweł; Wojewódka, Hanna

    2016-12-01

    Recently, the physically realistic protocol amplifying the randomness of Santha-Vazirani sources producing cryptographically secure random bits was proposed; however, for reasons of practical relevance, the crucial question remained open regarding whether this can be accomplished under the minimal conditions necessary for the task. Namely, is it possible to achieve randomness amplification using only two no-signaling components and in a situation where the violation of a Bell inequality only guarantees that some outcomes of the device for specific inputs exhibit randomness? Here, we solve this question and present a device-independent protocol for randomness amplification of Santha-Vazirani sources using a device consisting of two nonsignaling components. We show that the protocol can amplify any such source that is not fully deterministic into a fully random source while tolerating a constant noise rate and prove the composable security of the protocol against general no-signaling adversaries. Our main innovation is the proof that even the partial randomness certified by the two-party Bell test [a single input-output pair (u* , x* ) for which the conditional probability P (x*|u*) is bounded away from 1 for all no-signaling strategies that optimally violate the Bell inequality] can be used for amplification. We introduce the methodology of a partial tomographic procedure on the empirical statistics obtained in the Bell test that ensures that the outputs constitute a linear min-entropy source of randomness. As a technical novelty that may be of independent interest, we prove that the Santha-Vazirani source satisfies an exponential concentration property given by a recently discovered generalized Chernoff bound.

  3. Solving a mixture of many random linear equations by tensor decomposition and alternating minimization.

    DOT National Transportation Integrated Search

    2016-09-01

    We consider the problem of solving mixed random linear equations with k components. This is the noiseless setting of mixed linear regression. The goal is to estimate multiple linear models from mixed samples in the case where the labels (which sample...

  4. Random Signal Fluctuations Can Reduce Random Fluctuations in Regulated Components of Chemical Regulatory Networks

    NASA Astrophysics Data System (ADS)

    Paulsson, Johan; Ehrenberg, Måns

    2000-06-01

    Many intracellular components are present in low copy numbers per cell and subject to feedback control. We use chemical master equations to analyze a negative feedback system where species X and S regulate each other's synthesis with standard intracellular kinetics. For a given number of X-molecules, S-variation can be significant. We show that this signal noise does not necessarily increase X-variation as previously thought but, surprisingly, can be necessary to reduce it below a Poissonian limit. The principle resembles Stochastic Resonance in that signal noise improves signal detection.
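
    A Gillespie-type stochastic simulation of a toy negative feedback loop in the spirit of the system described above: S activates synthesis of X while X represses synthesis of S, and both are degraded. The rate constants and the repression function are invented, and the Fano factor reported at the end is a crude, event-sampled estimate.

        # Stochastic simulation algorithm (SSA) for a two-species feedback loop.
        import numpy as np

        rng = np.random.default_rng(9)
        x, s = 10, 10
        k_x, k_s, gamma, K = 5.0, 50.0, 1.0, 20.0      # assumed rate constants

        t, t_end, x_trace = 0.0, 200.0, []
        while t < t_end:
            rates = np.array([
                k_x * s,              # synthesis of X (activated by S)
                k_s * K / (K + x),    # synthesis of S (repressed by X)
                gamma * x,            # degradation of X
                gamma * s,            # degradation of S
            ])
            total = rates.sum()
            t += rng.exponential(1.0 / total)
            reaction = rng.choice(4, p=rates / total)
            if reaction == 0:
                x += 1
            elif reaction == 1:
                s += 1
            elif reaction == 2:
                x -= 1
            else:
                s -= 1
            x_trace.append(x)

        x_trace = np.array(x_trace)
        print(f"mean X = {x_trace.mean():.1f}, "
              f"Fano factor = {x_trace.var() / x_trace.mean():.2f}")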

  5. Contrast and autoshaping in multiple schedules varying reinforcer rate and duration.

    PubMed

    Hamilton, B E; Silberberg, A

    1978-07-01

    Thirteen master pigeons were exposed to multiple schedules in which reinforcement frequency (Experiment I) or duration (Experiment II) was varied. In Phases 1 and 3 of Experiment I, the values of the first and second components' random-interval schedules were 33 and 99 seconds, respectively. In Phase 2, these values were 99 seconds for both components. In Experiment II, a random-interval 33-second schedule was associated with each component. During Phases 1 and 3, the first and second components had hopper durations of 7.5 and 2.5 seconds respectively. During Phase 2, both components' hopper durations were 2.5 seconds. In each experiment, positive contrast obtained for about half the master subjects. The rest showed a rate increase in both components (positive induction). Each master subject's key colors and reinforcers were synchronously presented on a response-independent basis to a yoked control. Richer component key-pecking occurred during each experiment's Phases 1 and 3 among half these subjects. However, none responded during the contrast condition (unchanged component of each experiment's Phase 2). From this it is inferred that autoshaping did not contribute to the contrast and induction findings among master birds. Little evidence of local contrast (highest rate at beginning of richer component) was found in any subject. These data show that (a) contrast can occur independently from autoshaping, (b) contrast assays during equal-valued components may produce induction, (c) local contrast in multiple schedules often does not occur, and (d) differential hopper durations can produce autoshaping and contrast.

  6. Bayesian estimation of Karhunen–Loève expansions; A random subspace approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chowdhary, Kenny; Najm, Habib N.

    One of the most widely-used statistical procedures for dimensionality reduction of high dimensional random fields is Principal Component Analysis (PCA), which is based on the Karhunen-Loève expansion (KLE) of a stochastic process with finite variance. The KLE is analogous to a Fourier series expansion for a random process, where the goal is to find an orthogonal transformation for the data such that the projection of the data onto this orthogonal subspace is optimal in the L2 sense, i.e., one which minimizes the mean square error. In practice, this orthogonal transformation is determined by performing an SVD (Singular Value Decomposition) on the sample covariance matrix or on the data matrix itself. Sampling error is typically ignored when quantifying the principal components, or, equivalently, the basis functions of the KLE. Furthermore, it is exacerbated when the sample size is much smaller than the dimension of the random field. In this paper, we introduce a Bayesian KLE procedure, allowing one to obtain a probabilistic model on the principal components, which can account for inaccuracies due to limited sample size. The probabilistic model is built via Bayesian inference, from which the posterior becomes the matrix Bingham density over the space of orthonormal matrices. We use a modified Gibbs sampling procedure to sample on this space and then build probabilistic Karhunen-Loève expansions over random subspaces to obtain a set of low-dimensional surrogates of the stochastic process. We illustrate this probabilistic procedure with a finite dimensional stochastic process inspired by Brownian motion.

  7. Bayesian estimation of Karhunen–Loève expansions; A random subspace approach

    DOE PAGES

    Chowdhary, Kenny; Najm, Habib N.

    2016-04-13

    One of the most widely-used statistical procedures for dimensionality reduction of high dimensional random fields is Principal Component Analysis (PCA), which is based on the Karhunen-Loève expansion (KLE) of a stochastic process with finite variance. The KLE is analogous to a Fourier series expansion for a random process, where the goal is to find an orthogonal transformation for the data such that the projection of the data onto this orthogonal subspace is optimal in the L2 sense, i.e., one which minimizes the mean square error. In practice, this orthogonal transformation is determined by performing an SVD (Singular Value Decomposition) on the sample covariance matrix or on the data matrix itself. Sampling error is typically ignored when quantifying the principal components, or, equivalently, the basis functions of the KLE. Furthermore, it is exacerbated when the sample size is much smaller than the dimension of the random field. In this paper, we introduce a Bayesian KLE procedure, allowing one to obtain a probabilistic model on the principal components, which can account for inaccuracies due to limited sample size. The probabilistic model is built via Bayesian inference, from which the posterior becomes the matrix Bingham density over the space of orthonormal matrices. We use a modified Gibbs sampling procedure to sample on this space and then build probabilistic Karhunen-Loève expansions over random subspaces to obtain a set of low-dimensional surrogates of the stochastic process. We illustrate this probabilistic procedure with a finite dimensional stochastic process inspired by Brownian motion.
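
    A sketch of the deterministic step that the Bayesian procedure above builds on: estimating the Karhunen-Loève (principal component) basis from a finite sample via an SVD of the centered data matrix, using a Brownian-motion-like process as in the paper's illustration. The small sample size is chosen deliberately so the sampling error the paper addresses is visible; this is not the Gibbs-sampling procedure itself.

        # Sample KLE/PCA of discretized Brownian motion from few realizations.
        import numpy as np

        rng = np.random.default_rng(10)
        n_grid, n_samples = 200, 30              # field dimension >> sample size

        # Brownian-motion-like paths: cumulative sums of scaled white noise.
        X = np.cumsum(rng.standard_normal((n_samples, n_grid)) / np.sqrt(n_grid), axis=1)

        Xc = X - X.mean(axis=0)
        U, svals, Vt = np.linalg.svd(Xc, full_matrices=False)
        eigvals = svals**2 / (n_samples - 1)     # sample KLE eigenvalues
        modes = Vt                               # rows: estimated KLE basis functions

        # Exact Brownian-motion KLE eigenvalues are 1 / ((k - 1/2)^2 pi^2).
        exact = 1.0 / (((np.arange(1, 6) - 0.5) * np.pi) ** 2)
        print("sample eigenvalues:", np.round(eigvals[:5], 4))
        print("exact eigenvalues: ", np.round(exact, 4))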

  8. On the predictive control of foveal eye tracking and slow phases of optokinetic and vestibular nystagmus.

    PubMed Central

    Yasui, S; Young, L R

    1984-01-01

    Smooth pursuit and saccadic components of foveal visual tracking as well as more involuntary ocular movements of optokinetic (o.k.n.) and vestibular nystagmus slow phase components were investigated in man, with particular attention given to their possible input-adaptive or predictive behaviour. Each component in question was isolated from the eye movement records through a computer-aided procedure. The frequency response method was used with sinusoidal (predictable) and pseudo-random (unpredictable) stimuli. When the target motion was pseudo-random, the frequency response of pursuit eye movements revealed a large phase lead (up to about 90 degrees) at low stimulus frequencies. It is possible to interpret this result as a predictive effect, even though the stimulation was pseudo-random and thus 'unpredictable'. The pseudo-random-input frequency response intrinsic to the saccadic system was estimated in an indirect way from the pursuit and composite (pursuit + saccade) frequency response data. The result was fitted well by a servo-mechanism model, which has a simple anticipatory mechanism to compensate for the inherent neuromuscular saccadic delay by utilizing the retinal slip velocity signal. The o.k.n. slow phase also exhibited a predictive effect with sinusoidal inputs; however, pseudo-random stimuli did not produce such phase lead as found in the pursuit case. The vestibular nystagmus slow phase showed no noticeable sign of prediction in the frequency range examined (0 to approximately 0.7 Hz), in contrast to the results of the visually driven eye movements (i.e. saccade, pursuit and o.k.n. slow phase) at comparable stimulus frequencies. PMID:6707954

  9. Fatigue strength reduction model: RANDOM3 and RANDOM4 user manual, appendix 2

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Lovelace, Thomas B.

    1989-01-01

    The FORTRAN programs RANDOM3 and RANDOM4 are documented. They are based on fatigue strength reduction, using a probabilistic constitutive model. They predict the random lifetime of an engine component to reach a given fatigue strength. Included in this user manual are details regarding the theoretical backgrounds of RANDOM3 and RANDOM4. Appendix A gives information on the physical quantities, their symbols, FORTRAN names, and both SI and U.S. Customary units. Appendices B and C include photocopies of the actual computer printout corresponding to the sample problems. Appendices D and E detail the IMSL, Version 10(1), subroutines and functions called by RANDOM3 and RANDOM4 and the SAS/GRAPH(2) programs that can be used to plot both the probability density functions (p.d.f.) and the cumulative distribution functions (c.d.f.).

  10. Random matrix approach to plasmon resonances in the random impedance network model of disordered nanocomposites

    NASA Astrophysics Data System (ADS)

    Olekhno, N. A.; Beltukov, Y. M.

    2018-05-01

    Random impedance networks are widely used as a model to describe plasmon resonances in disordered metal-dielectric and other two-component nanocomposites. In the present work, the spectral properties of resonances in random networks are studied within the framework of the random matrix theory. We have shown that the appropriate ensemble of random matrices for the considered problem is the Jacobi ensemble (the MANOVA ensemble). The obtained analytical expressions for the density of states in such resonant networks show a good agreement with the results of numerical simulations in a wide range of metal filling fractions 0

  11. Randomized prospective study comparing tri-cortical iliac crest autograft to allograft in the lateral column lengthening component for operative correction of adult acquired flatfoot deformity.

    PubMed

    Dolan, Christopher M; Henning, Jeffrey A; Anderson, John G; Bohay, Donald R; Kornmesser, Marc J; Endres, Terrence J

    2007-01-01

    Operative treatment of stage II posterior tibial tendon insufficiency (PTTI) is controversial. Many soft-tissue and bony procedures and various combinations of the two have been reported for treatment of stage II PTTI. Orthopaedists recognize the lateral column lengthening component of the procedure as a successful reconstructive technique. The use of cortical allograft for lateral column lengthening in the correction of pes planus in the pediatric patient population has been routine. In the adult population, however, tricortical iliac crest autograft has been the bone graft of choice. Harvest of this autograft can precipitate significant morbidity and cost. Therefore, we undertook this randomized controlled trial to compare graft incorporation and healing of allograft and autograft in the lateral column lengthening component of adult flatfoot reconstruction. Lateral column lengthening was done as a component of operative correction for stage II PTTI in adult patients (older than 18 years) by two surgeons using similar procedures. The patients were randomized to either the allograft or autograft procedures. The primary endpoint was graft incorporation and healing as assessed by radiographs. The study included 33 randomized feet in 31 patients. We followed 18 feet in the allograft group and 15 in the autograft group to the point of union. There were 21 women and 10 men. There were no delayed unions, nonunions, or hardware failures. All patients in both groups achieved bony union by the 12-week followup evaluation. Two superficial foot infections were successfully treated with oral antibiotics. Two patients in the autograft group continued to have hip donor site pain at 3 months. This study suggests that union rates of allograft and autograft (iliac crest bone graft) are equal. The use of allograft in the lateral column lengthening component of operative correction of adult stage II PTTI appears to be a viable alternative to the use of iliac crest autograft and eliminates the morbidity and increased cost associated with autograft harvest.

  12. Theory-Based Interventions Combining Mental Simulation and Planning Techniques to Improve Physical Activity: Null Results from Two Randomized Controlled Trials.

    PubMed

    Meslot, Carine; Gauchet, Aurélie; Allenet, Benoît; François, Olivier; Hagger, Martin S

    2016-01-01

    Interventions to assist individuals in initiating and maintaining regular participation in physical activity are not always effective. Psychological and behavioral theories advocate the importance of both motivation and volition in interventions to change health behavior. Interventions adopting self-regulation strategies that foster motivational and volitional components may, therefore, have utility in promoting regular physical activity participation. We tested the efficacy of an intervention adopting motivational (mental simulation) and volitional (implementation intentions) components to promote a regular physical activity in two studies. Study 1 adopted a cluster randomized design in which participants ( n = 92) were allocated to one of three conditions: mental simulation plus implementation intention, implementation intention only, or control. Study 2 adopted a 2 (mental simulation vs. no mental simulation) × 2 (implementation intention vs. no implementation intention) randomized controlled design in which fitness center attendees ( n = 184) were randomly allocated one of four conditions: mental simulation only, implementation intention only, combined, or control. Physical activity behavior was measured by self-report (Study 1) or fitness center attendance (Study 2) at 4- (Studies 1 and 2) and 19- (Study 2 only) week follow-up periods. Findings revealed no statistically significant main or interactive effects of the mental simulation and implementation intention conditions on physical activity outcomes in either study. Findings are in contrast to previous research which has found pervasive effects for both intervention strategies. Findings are discussed in light of study limitations including the relatively small sample sizes, particularly for Study 1, deviations in the operationalization of the intervention components from previous research and the lack of a prompt for a goal intention. Future research should focus on ensuring uniformity in the format of the intervention components, test the effects of each component alone and in combination using standardized measures across multiple samples, and systematically explore effects of candidate moderators.

  13. Theory-Based Interventions Combining Mental Simulation and Planning Techniques to Improve Physical Activity: Null Results from Two Randomized Controlled Trials

    PubMed Central

    Meslot, Carine; Gauchet, Aurélie; Allenet, Benoît; François, Olivier; Hagger, Martin S.

    2016-01-01

    Interventions to assist individuals in initiating and maintaining regular participation in physical activity are not always effective. Psychological and behavioral theories advocate the importance of both motivation and volition in interventions to change health behavior. Interventions adopting self-regulation strategies that foster motivational and volitional components may, therefore, have utility in promoting regular physical activity participation. We tested the efficacy of an intervention adopting motivational (mental simulation) and volitional (implementation intentions) components to promote a regular physical activity in two studies. Study 1 adopted a cluster randomized design in which participants (n = 92) were allocated to one of three conditions: mental simulation plus implementation intention, implementation intention only, or control. Study 2 adopted a 2 (mental simulation vs. no mental simulation) × 2 (implementation intention vs. no implementation intention) randomized controlled design in which fitness center attendees (n = 184) were randomly allocated one of four conditions: mental simulation only, implementation intention only, combined, or control. Physical activity behavior was measured by self-report (Study 1) or fitness center attendance (Study 2) at 4- (Studies 1 and 2) and 19- (Study 2 only) week follow-up periods. Findings revealed no statistically significant main or interactive effects of the mental simulation and implementation intention conditions on physical activity outcomes in either study. Findings are in contrast to previous research which has found pervasive effects for both intervention strategies. Findings are discussed in light of study limitations including the relatively small sample sizes, particularly for Study 1, deviations in the operationalization of the intervention components from previous research and the lack of a prompt for a goal intention. Future research should focus on ensuring uniformity in the format of the intervention components, test the effects of each component alone and in combination using standardized measures across multiple samples, and systematically explore effects of candidate moderators. PMID:27899904

  14. Real-time fast physical random number generator with a photonic integrated circuit.

    PubMed

    Ugajin, Kazusa; Terashima, Yuta; Iwakawa, Kento; Uchida, Atsushi; Harayama, Takahisa; Yoshimura, Kazuyuki; Inubushi, Masanobu

    2017-03-20

    Random number generators are essential for applications in information security and numerical simulations. Most optical-chaos-based random number generators produce random bit sequences by offline post-processing with large optical components. We demonstrate a real-time hardware implementation of a fast physical random number generator with a photonic integrated circuit and a field programmable gate array (FPGA) electronic board. We generate 1-Tbit random bit sequences and evaluate their statistical randomness using NIST Special Publication 800-22 and TestU01. All of the BigCrush tests in TestU01 are passed using 410-Gbit random bit sequences. A maximum real-time generation rate of 21.1 Gb/s is achieved for random bit sequences in binary format stored in a computer, which can be directly used for applications involving secret keys in cryptography and random seeds in large-scale numerical simulations.

  15. Thermal transport in binary colloidal glasses: Composition dependence and percolation assessment

    NASA Astrophysics Data System (ADS)

    Ruckdeschel, Pia; Philipp, Alexandra; Kopera, Bernd A. F.; Bitterlich, Flora; Dulle, Martin; Pech-May, Nelson W.; Retsch, Markus

    2018-02-01

    The combination of various types of materials is often used to create superior composites that outperform the pure phase components. For any rational design, the thermal conductivity of the composite as a function of the volume fraction of the filler component needs to be known. When approaching the nanoscale, the homogeneous mixture of various components poses an additional challenge. Here, we investigate binary nanocomposite materials based on polymer latex beads and hollow silica nanoparticles. These form randomly mixed colloidal glasses on a sub-μ m scale. We focus on the heat transport properties through such binary assembly structures. The thermal conductivity can be well described by the effective medium theory. However, film formation of the soft polymer component leads to phase segregation and a mismatch between existing mixing models. We confirm our experimental data by finite element modeling. This additionally allowed us to assess the onset of thermal transport percolation in such random particulate structures. Our study contributes to a better understanding of thermal transport through heterostructured particulate assemblies.

  16. Synthesis of Sine-on-Random vibration profiles for accelerated life tests based on fatigue damage spectrum equivalence

    NASA Astrophysics Data System (ADS)

    Angeli, Andrea; Cornelis, Bram; Troncossi, Marco

    2018-03-01

    In many real life environments, mechanical and electronic systems are subjected to vibrations that may induce dynamic loads and potentially lead to an early failure due to fatigue damage. Thus, qualification tests by means of shakers are advisable for the most critical components in order to verify their durability throughout the entire life cycle. Nowadays the trend is to tailor the qualification tests according to the specific application of the tested component, considering the measured field data as reference to set up the experimental campaign, for example through the so called "Mission Synthesis" methodology. One of the main issues is to define the excitation profiles for the tests, that must have, besides the (potentially scaled) frequency content, also the same damage potential of the field data despite being applied for a limited duration. With this target, the current procedures generally provide the test profile as a stationary random vibration specified by a Power Spectral Density (PSD). In certain applications this output may prove inadequate to represent the nature of the reference signal, and the procedure could result in an unrealistic qualification test. For instance when a rotating part is present in the system the component under analysis may be subjected to Sine-on-Random (SoR) vibrations, namely excitations composed of sinusoidal contributions superimposed to random vibrations. In this case, the synthesized test profile should preserve not only the induced fatigue damage but also the deterministic components of the environmental vibration. In this work, the potential advantages of a novel procedure to synthesize SoR profiles instead of PSDs for qualification tests are presented and supported by the results of an experimental campaign.

  17. Effect of psycho-educational interventions on quality of life in patients with implantable cardioverter defibrillators: a meta-analysis of randomized controlled trials.

    PubMed

    Kao, Chi-Wen; Chen, Miao-Yi; Chen, Ting-Yu; Lin, Pai-Hui

    2016-09-30

    Implantable cardioverter defibrillators (ICD) were developed for primary and secondary prevention of sudden cardiac death. However, ICD recipients' mortality is significantly predicted by their quality of life (QOL). The aim of this meta-analysis was to evaluate the effects of psycho-educational interventions on QOL in patients with ICDs. We systematically searched PubMed, Medline, Cochrane Library, and CINAHL through April 2015 and references of relevant articles. Studies were reviewed if they met the following criteria: (1) randomized controlled trial, (2) participants were adults with an ICD, and (3) data were sufficient to evaluate the effect of psychological or educational interventions on QOL measured by the SF-36 or SF-12. Studies were independently selected and their data were extracted by two reviewers. Study quality was evaluated using a modified Jadad scale. The meta-analysis was conducted using the Cochrane Collaboration's Review Manager Software Package (RevMan 5). Study heterogeneity was assessed by the Q statistic and the I² statistic. Depending on heterogeneity, data were pooled across trials using fixed-effect or random-effect modeling. Seven randomized controlled trials fulfilled the inclusion and exclusion criteria, and included 1017 participants. The psycho-educational interventions improved physical component summary (PCS) scores in the intervention groups more than in control groups (mean difference 2.08, 95% CI 0.86 to 3.29, p < 0.001), but did not significantly affect mental component summary (MCS) scores (mean difference 0.84, 95% CI -1.68 to 3.35, p = 0.52). Our meta-analysis demonstrates that psycho-educational interventions improved the physical component, but not the mental component, of QOL in patients with ICDs.

  18. Random matrix theory and cross-correlations in global financial indices and local stock market indices

    NASA Astrophysics Data System (ADS)

    Nobi, Ashadun; Maeng, Seong Eun; Ha, Gyeong Gyun; Lee, Jae Woo

    2013-02-01

    We analyzed cross-correlations between price fluctuations of global financial indices (20 daily stock indices over the world) and local indices (daily indices of 200 companies in the Korean stock market) by using random matrix theory (RMT). We compared eigenvalues and components of the largest and the second largest eigenvectors of the cross-correlation matrix before, during, and after the global financial crisis in the year 2008. We find that the majority of its eigenvalues fall within the RMT bounds [λ₋, λ₊], where λ₋ and λ₊ are the lower and the upper bounds of the eigenvalues of random correlation matrices. The components of the eigenvectors for the largest positive eigenvalues indicate the identical financial market mode dominating the global and local indices. On the other hand, the components of the eigenvector corresponding to the second largest eigenvalue are alternately positive and negative. The components before the crisis change sign during the crisis, and those during the crisis change sign after the crisis. The largest inverse participation ratio (IPR), corresponding to the smallest eigenvector, is higher after the crisis than during any other period in the global and local indices. During the global financial crisis, the correlations among the global indices and among the local stock indices are perturbed significantly. However, the correlations between indices quickly recover the trends seen before the crisis.

  19. Rationale, design and methods of the HEALTHY study behavior intervention component

    USDA-ARS?s Scientific Manuscript database

    HEALTHY was a multi-center primary prevention trial designed to reduce risk factors for type 2 diabetes in adolescents. Seven centers each recruited six middle schools that were randomized to either intervention or control. The HEALTHY intervention integrated multiple components in nutrition, physic...

  20. Computer program uses Monte Carlo techniques for statistical system performance analysis

    NASA Technical Reports Server (NTRS)

    Wohl, D. P.

    1967-01-01

    Computer program with Monte Carlo sampling techniques determines the effect of a component part of a unit upon the overall system performance. It utilizes the full statistics of the disturbances and misalignments of each component to provide unbiased results through simulated random sampling.
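
    A minimal sketch of the idea (not the program itself, whose statistics and system model are not given here): draw random component disturbances, propagate them through a toy system model, and summarize the resulting performance spread.

        # Minimal Monte Carlo sketch (illustrative only): propagate random component
        # misalignments through a toy system model and summarize the performance spread.
        import random
        import statistics

        def system_error(misalignments):
            # Hypothetical system model: total error is the root-sum-square of the
            # individual component misalignments (purely illustrative).
            return sum(m ** 2 for m in misalignments) ** 0.5

        N_TRIALS = 10_000
        COMPONENT_SIGMAS = [0.10, 0.05, 0.20]   # assumed std. dev. per component

        samples = []
        for _ in range(N_TRIALS):
            draws = [random.gauss(0.0, s) for s in COMPONENT_SIGMAS]
            samples.append(system_error(draws))

        print("mean error :", statistics.mean(samples))
        print("95th pct   :", sorted(samples)[int(0.95 * N_TRIALS)])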

  1. A generator for unique quantum random numbers based on vacuum states

    NASA Astrophysics Data System (ADS)

    Gabriel, Christian; Wittmann, Christoffer; Sych, Denis; Dong, Ruifang; Mauerer, Wolfgang; Andersen, Ulrik L.; Marquardt, Christoph; Leuchs, Gerd

    2010-10-01

    Random numbers are a valuable component in diverse applications that range from simulations over gambling to cryptography. The quest for true randomness in these applications has engendered a large variety of different proposals for producing random numbers based on the foundational unpredictability of quantum mechanics. However, most approaches do not consider that a potential adversary could have knowledge about the generated numbers, so the numbers are not verifiably random and unique. Here we present a simple experimental setup based on homodyne measurements that uses the purity of a continuous-variable quantum vacuum state to generate unique random numbers. We use the intrinsic randomness in measuring the quadratures of a mode in the lowest energy vacuum state, which cannot be correlated to any other state. The simplicity of our source, combined with its verifiably unique randomness, are important attributes for achieving high-reliability, high-speed and low-cost quantum random number generators.

  2. Synthesis of wavelet envelope in 2-D random media having power-law spectra: comparison with FD simulations

    NASA Astrophysics Data System (ADS)

    Sato, Haruo; Fehler, Michael C.

    2016-10-01

    The envelope broadening and the peak delay of the S-wavelet of a small earthquake with increasing travel distance are results of scattering by random velocity inhomogeneities in the earth medium. As a simple mathematical model, Sato proposed a new stochastic synthesis of the scalar wavelet envelope in 3-D von Kármán type random media when the centre wavenumber of the wavelet is in the power-law spectral range of the random velocity fluctuation. The essential idea is to split the random medium spectrum into two components using the centre wavenumber as a reference: the long-scale (low-wavenumber spectral) component produces the peak delay and the envelope broadening by multiple scattering around the forward direction; the short-scale (high-wavenumber spectral) component attenuates wave amplitude by wide angle scattering. The former is calculated by the Markov approximation based on the parabolic approximation and the latter is calculated by the Born approximation. Here, we extend the theory for the envelope synthesis of a wavelet in 2-D random media, which makes it easy to compare with finite difference (FD) simulation results. The synthetic wavelet envelope is analytically written by using the random medium parameters in the angular frequency domain. For the case that the power spectral density function of the random velocity fluctuation has a steep roll-off at large wavenumbers, the envelope broadening is small and frequency independent, and scattering attenuation is weak. For the case of a small roll-off, however, the envelope broadening is large and increases with frequency, and the scattering attenuation is strong and increases with frequency. As a preliminary study, we compare synthetic wavelet envelopes with the average of FD simulation wavelet envelopes in 50 synthesized random media, which are characterized by the RMS fractional velocity fluctuation ε = 0.05, correlation scale a = 5 km and the background wave velocity V0 = 4 km s-1. We use the radiation of a 2 Hz Ricker wavelet from a point source. For all the cases of von Kármán order κ = 0.1, 0.5 and 1, we find the synthetic wavelet envelopes are a good match to the characteristics of FD simulation wavelet envelopes in a time window starting from the onset through the maximum peak to the time when the amplitude decreases to half the peak amplitude.

  3. Evaluation of some random effects methodology applicable to bird ringing data

    USGS Publications Warehouse

    Burnham, K.P.; White, Gary C.

    2002-01-01

    Existing models for ring recovery and recapture data analysis treat temporal variations in annual survival probability (S) as fixed effects. Often there is no explainable structure to the temporal variation in S1,..., Sk; random effects can then be a useful model: Si = E(S) + εi. Here, the temporal variation in survival probability is treated as random with variance E(ε²) = σ². This random effects model can now be fit in program MARK. Resultant inferences include point and interval estimation for the process variation, σ², and estimation of E(S) and var(Ê(S)), where the latter includes a component for σ² as well as the traditional sampling component var(Ŝ|S). Furthermore, the random effects model leads to shrinkage estimates, S̃i, as improved (in mean square error) estimators of Si compared to the MLE, Ŝi, from the unrestricted time-effects model. Appropriate confidence intervals based on the S̃i are also provided. In addition, AIC has been generalized to random effects models. This paper presents results of a Monte Carlo evaluation of inference performance under the simple random effects model. Examined by simulation, under the simple one-group Cormack-Jolly-Seber (CJS) model, are issues such as bias of σ̂², confidence interval coverage on σ², coverage and mean square error comparisons for inference about Si based on shrinkage versus maximum likelihood estimators, and performance of AIC model selection over three models: Si ≡ S (no effects), Si = E(S) + εi (random effects), and S1,..., Sk (fixed effects). For the cases simulated, the random effects methods performed well and were uniformly better than fixed effects MLE for the Si.
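
    The shrinkage idea can be illustrated with a simple method-of-moments sketch (this is not program MARK's estimator): given yearly MLEs and their sampling variances, estimate the process variance σ² and pull each MLE toward the overall mean in proportion to its noisiness. All numbers below are made up.

        # Illustrative empirical-Bayes style shrinkage for yearly survival estimates.
        import numpy as np

        S_hat = np.array([0.62, 0.71, 0.55, 0.68, 0.60])              # yearly MLEs (made up)
        var_hat = np.array([0.0010, 0.0015, 0.0012, 0.0010, 0.0018])  # sampling variances (made up)

        E_S = S_hat.mean()                                   # estimate of E(S)
        # Process variance sigma^2: total spread minus average sampling variance
        sigma2 = max(S_hat.var(ddof=1) - var_hat.mean(), 0.0)

        # Shrinkage: pull each MLE toward E(S) in proportion to its noisiness
        weights = sigma2 / (sigma2 + var_hat)
        S_shrunk = E_S + weights * (S_hat - E_S)

        print("sigma^2 estimate   :", round(sigma2, 5))
        print("shrinkage estimates:", np.round(S_shrunk, 3))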

  4. [Theory, method and application of method R on estimation of (co)variance components].

    PubMed

    Liu, Wen-Zhong

    2004-07-01

    The theory, method, and application of Method R for the estimation of (co)variance components are reviewed so that the method can be used appropriately. Estimation requires R values, which are regressions of predicted random effects calculated from the complete dataset on predicted random effects calculated from random subsets of the same data. By using a multivariate iteration algorithm based on a transformation matrix, combined with the preconditioned conjugate gradient method to solve the mixed model equations, the computational efficiency of Method R is much improved. Method R is computationally inexpensive, and the sampling errors and approximate credible intervals of estimates can be obtained. Disadvantages of Method R include a larger sampling variance than other methods for the same data, and biased estimates in small datasets. As an alternative method, Method R can be used on larger datasets. It is necessary to study its theoretical properties and broaden its application range further.

  5. Calculation of Dynamic Loads Due to Random Vibration Environments in Rocket Engine Systems

    NASA Technical Reports Server (NTRS)

    Christensen, Eric R.; Brown, Andrew M.; Frady, Greg P.

    2007-01-01

    An important part of rocket engine design is the calculation of random dynamic loads resulting from internal engine "self-induced" sources. These loads are random in nature and can greatly influence the weight of many engine components. Several methodologies for calculating random loads are discussed and then compared to test results using a dynamic testbed consisting of a 60K thrust engine. The engine was tested in a free-free condition with known random force inputs from shakers attached to three locations near the main noise sources on the engine. Accelerations and strains were measured at several critical locations on the engines and then compared to the analytical results using two different random response methodologies.

  6. A philosophical argument against evidence-based policy.

    PubMed

    Anjum, Rani Lill; Mumford, Stephen D

    2017-10-01

    Evidence-based medicine has two components. The methodological or ontological component consists of randomized controlled trials and their systematic review. This makes use of a difference-making conception of cause. But there is also a policy component that makes a recommendation for uniform intervention, based on the evidence from randomized controlled trials. The policy side of evidence-based medicine is basically a form of rule utilitarianism. But it is then subject to an objection from Smart that rule utilitarianism inevitably collapses. If one assumes (1) you should recommend the intervention that has brought most benefit (the core of evidence-based policy making), (2) individual variation (acknowledged by use of randomization) and (3) no intervention benefits all (contingent but true), then the objection can be brought to bear. A utility maximizer should always ignore the rule in an individual case where greater benefit can be secured through doing so. In the medical case, this would mean that a clinician who knows that a patient would not benefit from the recommended intervention has good reason to ignore the recommendation. This is indeed the feeling of many clinicians who would like to offer other interventions but for an aversion to breaking clinical guidelines. © 2016 John Wiley & Sons, Ltd.

  7. Multi-peak structure of generation spectrum of random distributed feedback fiber Raman lasers.

    PubMed

    Vatnik, I D; Zlobina, E A; Kablukov, S I; Babin, S A

    2017-02-06

    We study spectral features of the generation of a random distributed feedback fiber Raman laser arising from the two-peak shape of the Raman gain spectral profile realized in germanosilicate fibers. We demonstrate that the number of peaks can be calculated using a power balance model that considers different subcomponents within each Stokes component.

  8. Multi-component access to a commercially available weight loss program: A randomized controlled trial

    USDA-ARS?s Scientific Manuscript database

    This study examined weight loss between a community-based, intensive behavioral counseling program (Weight Watchers PointsPlus) that included three treatment access modes and a self-help condition. A total of 292 participants were randomized to a Weight Watchers (WW; n=147) or a self-help condition (...

  9. Cognitive Behavioral Principles within Group Mentoring: A Randomized Pilot Study

    ERIC Educational Resources Information Center

    Jent, Jason F.; Niec, Larissa N.

    2009-01-01

    This study evaluated the effectiveness of a group mentoring program that included components of empirically supported mentoring and cognitive behavioral techniques for children served at a community mental health center. Eighty-six 8- to 12-year-old children were randomly assigned to either group mentoring or a wait-list control group. Group…

  10. Random benzotrithiophene-based donor-acceptor copolymers for efficient organic photovoltaic devices.

    PubMed

    Nielsen, Christian B; Ashraf, Raja Shahid; Schroeder, Bob C; D'Angelo, Pasquale; Watkins, Scott E; Song, Kigook; Anthopoulos, Thomas D; McCulloch, Iain

    2012-06-14

    A series of benzotrithiophene-containing random terpolymers for polymer solar cells is reported. Through variations of the two other components in the terpolymers, the absorption profile and the frontier energy levels are optimized and maximum power conversion efficiencies are nearly doubled (5.14%) relative to the parent alternating copolymer.

  11. Impacts of a Comprehensive School Readiness Curriculum for Preschool Children at Risk for Educational Difficulties

    ERIC Educational Resources Information Center

    Lonigan, Christopher J.; Phillips, Beth M.; Clancy, Jeanine L.; Landry, Susan H.; Swank, Paul R.; Assel, Michael; Taylor, Heather B.; Klein, Alice; Starkey, Prentice; Domitrovich, Celene E.; Eisenberg, Nancy; Villiers, Jill; Villiers, Peter; Barnes, Marcia

    2015-01-01

    This article reports findings from a cluster-randomized study of an integrated literacy- and math-focused preschool curriculum, comparing versions with and without an explicit socioemotional lesson component to a business-as-usual condition. Participants included 110 classroom teachers from randomized classrooms and approximately eight students…

  12. Functional mixed effects spectral analysis

    PubMed Central

    KRAFTY, ROBERT T.; HALL, MARTICA; GUO, WENSHENG

    2011-01-01

    In many experiments, time series data can be collected from multiple units and multiple time series segments can be collected from the same unit. This article introduces a mixed effects Cramér spectral representation which can be used to model the effects of design covariates on the second-order power spectrum while accounting for potential correlations among the time series segments collected from the same unit. The transfer function is composed of a deterministic component to account for the population-average effects and a random component to account for the unit-specific deviations. The resulting log-spectrum has a functional mixed effects representation where both the fixed effects and random effects are functions in the frequency domain. It is shown that, when the replicate-specific spectra are smooth, the log-periodograms converge to a functional mixed effects model. A data-driven iterative estimation procedure is offered for the periodic smoothing spline estimation of the fixed effects, penalized estimation of the functional covariance of the random effects, and unit-specific random effects prediction via the best linear unbiased predictor. PMID:26855437

  13. Estimating rate uncertainty with maximum likelihood: differences between power-law and flicker–random-walk models

    USGS Publications Warehouse

    Langbein, John O.

    2012-01-01

    Recent studies have documented that global positioning system (GPS) time series of position estimates have temporal correlations which have been modeled as a combination of power-law and white noise processes. When estimating quantities such as a constant rate from GPS time series data, the estimated uncertainties on these quantities are more realistic when using a noise model that includes temporal correlations than when simply assuming temporally uncorrelated noise. However, the choice of the specific representation of correlated noise can affect the estimate of uncertainty. For many GPS time series, the background noise can be represented either (1) as a sum of flicker and random-walk noise or (2) as a power-law noise model that represents an average of the flicker and random-walk noise. For instance, if the underlying noise model is a combination of flicker and random-walk noise, then incorrectly choosing the power-law model could underestimate the rate uncertainty by a factor of two. Distinguishing between the two alternate noise models is difficult since the flicker component can dominate the assessment of the noise properties because it is spread over a significant portion of the measurable frequency band. But, although not necessarily detectable, the random-walk component can be a major constituent of the estimated rate uncertainty. Nonetheless, it is possible to determine the upper bound on the random-walk noise.
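
    To make the two candidate noise models concrete, the following sketch (illustrative only, not the paper's analysis) synthesizes colored noise with a power-law spectrum P(f) proportional to 1/f^alpha by spectral shaping of white noise, and combines a flicker (alpha = 1) and a random-walk (alpha = 2) component; amplitudes are arbitrary.

        # Sketch: synthesize power-law noise by shaping the spectrum of white noise,
        # then combine flicker (1/f) and random-walk (1/f^2) components.
        import numpy as np

        def power_law_noise(n, alpha, rng):
            white = rng.standard_normal(n)
            spec = np.fft.rfft(white)
            f = np.fft.rfftfreq(n, d=1.0)
            f[0] = f[1]                       # avoid division by zero at DC
            spec *= f ** (-alpha / 2.0)       # shape amplitude spectrum as f^(-alpha/2)
            return np.fft.irfft(spec, n)

        rng = np.random.default_rng(1)
        n = 2048
        flicker = power_law_noise(n, alpha=1.0, rng=rng)   # 1/f
        rwalk   = power_law_noise(n, alpha=2.0, rng=rng)   # 1/f^2 (random walk)
        series  = 2.0 * flicker + 0.5 * rwalk              # combined background noise

        print("std of combined series:", series.std())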

  14. Contrast and autoshaping in multiple schedules varying reinforcer rate and duration

    PubMed Central

    Hamilton, Bruce E.; Silberberg, Alan

    1978-01-01

    Thirteen master pigeons were exposed to multiple schedules in which reinforcement frequency (Experiment I) or duration (Experiment II) was varied. In Phases 1 and 3 of Experiment I, the values of the first and second components' random-interval schedules were 33 and 99 seconds, respectively. In Phase 2, these values were 99 seconds for both components. In Experiment II, a random-interval 33-second schedule was associated with each component. During Phases 1 and 3, the first and second components had hopper durations of 7.5 and 2.5 seconds respectively. During Phase 2, both components' hopper durations were 2.5 seconds. In each experiment, positive contrast obtained for about half the master subjects. The rest showed a rate increase in both components (positive induction). Each master subject's key colors and reinforcers were synchronously presented on a response-independent basis to a yoked control. Richer component key-pecking occurred during each experiment's Phases 1 and 3 among half these subjects. However, none responded during the contrast condition (unchanged component of each experiment's Phase 2). From this it is inferred that autoshaping did not contribute to the contrast and induction findings among master birds. Little evidence of local contrast (highest rate at beginning of richer component) was found in any subject. These data show that (a) contrast can occur independently from autoshaping, (b) contrast assays during equal-valued components may produce induction, (c) local contrast in multiple schedules often does not occur, and (d) differential hopper durations can produce autoshaping and contrast. PMID:16812081

  15. Disease Mapping of Zero-excessive Mesothelioma Data in Flanders

    PubMed Central

    Neyens, Thomas; Lawson, Andrew B.; Kirby, Russell S.; Nuyts, Valerie; Watjou, Kevin; Aregay, Mehreteab; Carroll, Rachel; Nawrot, Tim S.; Faes, Christel

    2016-01-01

    Purpose To investigate the distribution of mesothelioma in Flanders using Bayesian disease mapping models that account for both an excess of zeros and overdispersion. Methods The numbers of newly diagnosed mesothelioma cases within all Flemish municipalities between 1999 and 2008 were obtained from the Belgian Cancer Registry. To deal with overdispersion, zero-inflation and geographical association, the hurdle combined model was proposed, which has three components: a Bernoulli zero-inflation mixture component to account for excess zeros, a gamma random effect to adjust for overdispersion and a normal conditional autoregressive random effect to attribute spatial association. This model was compared with other existing methods in literature. Results The results indicate that hurdle models with a random effects term accounting for extra-variance in the Bernoulli zero-inflation component fit the data better than hurdle models that do not take overdispersion in the occurrence of zeros into account. Furthermore, traditional models that do not take into account excessive zeros but contain at least one random effects term that models extra-variance in the counts have better fits compared to their hurdle counterparts. In other words, the extra-variability, due to an excess of zeros, can be accommodated by spatially structured and/or unstructured random effects in a Poisson model such that the hurdle mixture model is not necessary. Conclusions Models taking into account zero-inflation do not always provide better fits to data with excessive zeros than less complex models. In this study, a simple conditional autoregressive model identified a cluster in mesothelioma cases near a former asbestos processing plant (Kapelle-op-den-Bos). This observation is likely linked with historical local asbestos exposures. Future research will clarify this. PMID:27908590

  16. Disease mapping of zero-excessive mesothelioma data in Flanders.

    PubMed

    Neyens, Thomas; Lawson, Andrew B; Kirby, Russell S; Nuyts, Valerie; Watjou, Kevin; Aregay, Mehreteab; Carroll, Rachel; Nawrot, Tim S; Faes, Christel

    2017-01-01

    To investigate the distribution of mesothelioma in Flanders using Bayesian disease mapping models that account for both an excess of zeros and overdispersion. The numbers of newly diagnosed mesothelioma cases within all Flemish municipalities between 1999 and 2008 were obtained from the Belgian Cancer Registry. To deal with overdispersion, zero inflation, and geographical association, the hurdle combined model was proposed, which has three components: a Bernoulli zero-inflation mixture component to account for excess zeros, a gamma random effect to adjust for overdispersion, and a normal conditional autoregressive random effect to attribute spatial association. This model was compared with other existing methods in literature. The results indicate that hurdle models with a random effects term accounting for extra variance in the Bernoulli zero-inflation component fit the data better than hurdle models that do not take overdispersion in the occurrence of zeros into account. Furthermore, traditional models that do not take into account excessive zeros but contain at least one random effects term that models extra variance in the counts have better fits compared to their hurdle counterparts. In other words, the extra variability, due to an excess of zeros, can be accommodated by spatially structured and/or unstructured random effects in a Poisson model such that the hurdle mixture model is not necessary. Models taking into account zero inflation do not always provide better fits to data with excessive zeros than less complex models. In this study, a simple conditional autoregressive model identified a cluster in mesothelioma cases near a former asbestos processing plant (Kapelle-op-den-Bos). This observation is likely linked with historical local asbestos exposures. Future research will clarify this. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Unbiased Estimates of Variance Components with Bootstrap Procedures

    ERIC Educational Resources Information Center

    Brennan, Robert L.

    2007-01-01

    This article provides general procedures for obtaining unbiased estimates of variance components for any random-model balanced design under any bootstrap sampling plan, with the focus on designs of the type typically used in generalizability theory. The results reported here are particularly helpful when the bootstrap is used to estimate standard…

  18. Response Strength in Extreme Multiple Schedules

    ERIC Educational Resources Information Center

    McLean, Anthony P.; Grace, Randolph C.; Nevin, John A.

    2012-01-01

    Four pigeons were trained in a series of two-component multiple schedules. Reinforcers were scheduled with random-interval schedules. The ratio of arranged reinforcer rates in the two components was varied over 4 log units, a much wider range than previously studied. When performance appeared stable, prefeeding tests were conducted to assess…

  19. Explanation of Social Relation Based on University's Psycho-Social Climate, Psychological Wellbeing Components, and Emotional Intelligence

    ERIC Educational Resources Information Center

    Oke, Kayode

    2015-01-01

    This study was conducted to explain social relation based on psycho-social climate, psychological wellbeing components, and emotional intelligence among undergraduates of Olabisi Onabanjo University, Ogun State, Nigeria. The statistical population consisted of all undergraduates of Olabisi Onabanjo University. Participants were randomly selected…

  20. Rationale, design and methods of the HEALTHY study physical education intervention component

    USDA-ARS?s Scientific Manuscript database

    The HEALTHY primary prevention trial was designed to reduce risk factors for type 2 diabetes in middle school students. Middle schools at seven centers across the United States participated in the 3-year study. Half of them were randomized to receive a multi-component intervention. The intervention ...

  1. System Lifetimes, The Memoryless Property, Euler's Constant, and Pi

    ERIC Educational Resources Information Center

    Agarwal, Anurag; Marengo, James E.; Romero, Likin Simon

    2013-01-01

    A "k"-out-of-"n" system functions as long as at least "k" of its "n" components remain operational. Assuming that component failure times are independent and identically distributed exponential random variables, we find the distribution of system failure time. After some examples, we find the limiting…

  2. Active non-volatile memory post-processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kannan, Sudarsun; Milojicic, Dejan S.; Talwar, Vanish

    A computing node includes an active Non-Volatile Random Access Memory (NVRAM) component which includes memory and a sub-processor component. The memory is to store data chunks received from a processor core, the data chunks comprising metadata indicating a type of post-processing to be performed on data within the data chunks. The sub-processor component is to perform post-processing of said data chunks based on said metadata.
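
    A purely hypothetical sketch of the dispatch idea (none of these names come from the cited work): each data chunk carries metadata naming the post-processing step, and the sub-processor selects the matching routine.

        # Hypothetical illustration only: chunks carry metadata that selects a
        # post-processing routine applied by a "sub-processor".
        from dataclasses import dataclass

        @dataclass
        class Chunk:
            metadata: str      # e.g. "checksum" or "sort"
            payload: list

        def checksum(data):
            return sum(data) & 0xFFFFFFFF

        def sort_chunk(data):
            return sorted(data)

        POST_PROCESSORS = {"checksum": checksum, "sort": sort_chunk}

        def sub_processor(chunks):
            # Apply the post-processing requested by each chunk's metadata.
            return [POST_PROCESSORS[c.metadata](c.payload) for c in chunks]

        chunks = [Chunk("checksum", [3, 1, 2]), Chunk("sort", [3, 1, 2])]
        print(sub_processor(chunks))   # -> [6, [1, 2, 3]]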

  3. A Spatial Poisson Hurdle Model for Exploring Geographic Variation in Emergency Department Visits

    PubMed Central

    Neelon, Brian; Ghosh, Pulak; Loebs, Patrick F.

    2012-01-01

    We develop a spatial Poisson hurdle model to explore geographic variation in emergency department (ED) visits while accounting for zero inflation. The model consists of two components: a Bernoulli component that models the probability of any ED use (i.e., at least one ED visit per year), and a truncated Poisson component that models the number of ED visits given use. Together, these components address both the abundance of zeros and the right-skewed nature of the nonzero counts. The model has a hierarchical structure that incorporates patient- and area-level covariates, as well as spatially correlated random effects for each areal unit. Because regions with high rates of ED use are likely to have high expected counts among users, we model the spatial random effects via a bivariate conditionally autoregressive (CAR) prior, which introduces dependence between the components and provides spatial smoothing and sharing of information across neighboring regions. Using a simulation study, we show that modeling the between-component correlation reduces bias in parameter estimates. We adopt a Bayesian estimation approach, and the model can be fit using standard Bayesian software. We apply the model to a study of patient and neighborhood factors influencing emergency department use in Durham County, North Carolina. PMID:23543242
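
    A simplified, non-spatial sketch of the hurdle likelihood described above (the paper's CAR random effects and Bayesian fitting are omitted): a Bernoulli term for any ED use plus a zero-truncated Poisson term for the count given use.

        # Non-spatial Poisson hurdle log-likelihood (illustrative only).
        import numpy as np
        from scipy.special import gammaln

        def hurdle_loglik(y, p, lam):
            # y: observed counts; p: probability of any use; lam: truncated-Poisson rate
            y = np.asarray(y, dtype=float)
            zero = (y == 0)
            bern = np.where(zero, np.log(1.0 - p), np.log(p))
            # Zero-truncated Poisson: P(Y=y | Y>0) = lam^y e^{-lam} / (y! (1 - e^{-lam}))
            trunc = y * np.log(lam) - lam - gammaln(y + 1.0) - np.log1p(-np.exp(-lam))
            return float(np.sum(np.where(zero, bern, bern + trunc)))

        counts = [0, 0, 1, 3, 0, 2]
        print(hurdle_loglik(counts, p=0.5, lam=1.8))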

  4. Boson expansions based on the random phase approximation representation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pedrocchi, V.G.; Tamura, T.

    1984-04-01

    A new boson expansion theory based on the random phase approximation is presented. The boson expansions are derived here directly in the random phase approximation representation with the help of a technique that combines the use of the Usui operator with that of a new bosonization procedure, called the term-by-term bosonization method. The present boson expansion theory is constructed by retaining a single collective quadrupole random phase approximation component, a truncation that allows for a perturbative treatment of the whole problem. Both Hermitian, as well as non-Hermitian boson expansions, valid for even nuclei, are obtained.

  5. The investigation of social networks based on multi-component random graphs

    NASA Astrophysics Data System (ADS)

    Zadorozhnyi, V. N.; Yudin, E. B.

    2018-01-01

    Methods for calibrating non-homogeneous random graphs are developed for social network simulation. The graphs are calibrated by the degree distributions of the vertices and the edges. The mathematical foundation of the methods is formed by the theory of random graphs with the nonlinear preferential attachment rule and the theory of Erdős–Rényi random graphs. Well-calibrated network graph models, and computer experiments with these models, would help developers (owners) of networks to predict their development correctly and to choose effective strategies for controlling network projects.
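
    As a small illustration of the two ingredient graph families named above (the calibration procedure itself is not reproduced), the following sketch contrasts the degree distributions of an Erdős–Rényi graph and a linear preferential-attachment graph using networkx; sizes and parameters are arbitrary.

        # Compare degree distributions of an Erdos-Renyi graph and a preferential-
        # attachment (Barabasi-Albert) graph. Illustrative only.
        import collections
        import networkx as nx

        n = 5000
        er = nx.gnp_random_graph(n, p=4.0 / n, seed=42)     # Erdos-Renyi, mean degree ~4
        ba = nx.barabasi_albert_graph(n, m=2, seed=42)      # linear preferential attachment

        def degree_hist(g):
            counts = collections.Counter(d for _, d in g.degree())
            return {k: counts[k] / g.number_of_nodes() for k in sorted(counts)}

        print("ER P(k), k<=8:", {k: round(v, 3) for k, v in degree_hist(er).items() if k <= 8})
        print("BA P(k), k<=8:", {k: round(v, 3) for k, v in degree_hist(ba).items() if k <= 8})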

  6. Physical constraints of cultural evolution of dialects in killer whales.

    PubMed

    Filatova, Olga A; Samarra, Filipa I P; Barrett-Lennard, Lance G; Miller, Patrick J O; Ford, John K B; Yurk, Harald; Matkin, Craig O; Hoyt, Erich

    2016-11-01

    Odontocete sounds are produced by two pairs of phonic lips situated in soft nares below the blowhole; the right pair is larger and is more likely to produce clicks, while the left pair is more likely to produce whistles. This has important implications for the cultural evolution of delphinid sounds: the greater the physical constraints, the greater the probability of random convergence. In this paper the authors examine the call structure of eight killer whale populations to identify structural constraints and to determine if they are consistent among all populations. Constraints were especially pronounced in two-voiced calls. In the calls of all eight populations, the lower component of two-voiced (biphonic) calls was typically centered below 4 kHz, while the upper component was typically above that value. The lower component of two-voiced calls had a narrower frequency range than single-voiced calls in all populations. This may be because some single-voiced calls are homologous to the lower component, while others are homologous to the higher component of two-voiced calls. Physical constraints on the call structure reduce the possible variation and increase the probability of random convergence, producing similar calls in different populations.

  7. A system identification technique based on the random decrement signatures. Part 1: Theory and simulation

    NASA Technical Reports Server (NTRS)

    Bedewi, Nabih E.; Yang, Jackson C. S.

    1987-01-01

    Identification of the system parameters of a randomly excited structure may be treated using a variety of statistical techniques. Of all these techniques, the Random Decrement is unique in that it provides the homogeneous component of the system response. Using this quality, a system identification technique was developed based on a least-squares fit of the signatures to estimate the mass, damping, and stiffness matrices of a linear randomly excited system. The mathematics of the technique is presented in addition to the results of computer simulations conducted to demonstrate the prediction of the response of the system and the random forcing function initially introduced to excite the system.
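
    A minimal sketch of the random decrement signature itself (the least-squares fit to mass, damping, and stiffness matrices is not shown): average the response segments that follow each upward crossing of a trigger level, here applied to a toy randomly excited oscillator.

        # Random decrement signature: average response segments following trigger crossings.
        import numpy as np
        from scipy.signal import lfilter

        def random_decrement(x, trigger, seg_len):
            segments = [x[i + 1:i + 1 + seg_len]
                        for i in range(len(x) - seg_len - 1)
                        if x[i] < trigger <= x[i + 1]]       # upward level crossings
            return np.mean(segments, axis=0)

        # Toy stationary response: white noise through a lightly damped resonator
        rng = np.random.default_rng(3)
        w, r = 2 * np.pi * 0.02, 0.995         # resonant frequency (cycles/sample), pole radius
        resp = lfilter([1.0], [1.0, -2 * r * np.cos(w), r * r], rng.standard_normal(200_000))

        sig = random_decrement(resp, trigger=resp.std(), seg_len=400)
        print("signature start (near trigger level):", round(sig[0], 3), "vs", round(resp.std(), 3))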

  8. On the efficiency of a randomized mirror descent algorithm in online optimization problems

    NASA Astrophysics Data System (ADS)

    Gasnikov, A. V.; Nesterov, Yu. E.; Spokoiny, V. G.

    2015-04-01

    A randomized online version of the mirror descent method is proposed. It differs from the existing versions by the randomization method. Randomization is performed at the stage of the projection of a subgradient of the function being optimized onto the unit simplex rather than at the stage of the computation of a subgradient, which is common practice. As a result, a componentwise subgradient descent with a randomly chosen component is obtained, which admits an online interpretation. This observation, for example, has made it possible to uniformly interpret results on weighting expert decisions and to propose the most efficient method for searching for an equilibrium in a zero-sum two-person matrix game with a sparse matrix.
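
    A toy sketch of the componentwise idea (the step size, unbiasing factor, and linear objective are illustrative choices, not the paper's algorithm): at each step only one uniformly chosen coordinate of the gradient is queried, and an entropic (multiplicative-weights) mirror step keeps the iterate on the unit simplex.

        # Randomized componentwise mirror descent on the probability simplex (toy example).
        import numpy as np

        rng = np.random.default_rng(0)
        n = 10
        c = rng.uniform(0.0, 1.0, size=n)       # minimize f(x) = <c, x> over the simplex

        x = np.full(n, 1.0 / n)
        eta = 0.05
        for t in range(20_000):
            j = rng.integers(n)                 # random coordinate of the gradient
            g = np.zeros(n)
            g[j] = n * c[j]                     # unbiased estimate of the full gradient
            x *= np.exp(-eta * g)               # entropic mirror (multiplicative) step
            x /= x.sum()                        # re-project onto the simplex

        best = int(np.argmin(c))
        print("best coordinate:", best, " mass placed there:", round(x[best], 3))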

  9. Parameters affecting the resilience of scale-free networks to random failures.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Link, Hamilton E.; LaViolette, Randall A.; Lane, Terran

    2005-09-01

    It is commonly believed that scale-free networks are robust to massive numbers of random node deletions. For example, Cohen et al. in (1) study scale-free networks including some which approximate the measured degree distribution of the Internet. Their results suggest that if each node in this network failed independently with probability 0.99, most of the remaining nodes would still be connected in a giant component. In this paper, we show that a large and important subclass of scale-free networks are not robust to massive numbers of random node deletions. In particular, we study scale-free networks which have a minimum node degree of 1 and a power-law degree distribution beginning with nodes of degree 1 (power-law networks). We show that, in a power-law network approximating the Internet's reported distribution, when the probability of deletion of each node is 0.5 only about 25% of the surviving nodes in the network remain connected in a giant component, and the giant component does not persist beyond a critical failure rate of 0.9. The new result is partially due to improved analytical accommodation of the large number of degree-0 nodes that result after node deletions. Our results apply to power-law networks with a wide range of power-law exponents, including Internet-like networks. We give both analytical and empirical evidence that such networks are not generally robust to massive random node deletions.
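
    The experiment described above can be approximated with networkx (the exponent, size, and configuration-model construction are illustrative choices, not the authors' code): build a power-law graph with minimum degree 1, delete each node independently with probability q, and report the surviving giant-component fraction.

        # Random node deletion on a power-law (minimum degree 1) network.
        import random
        import networkx as nx

        def power_law_graph(n, gamma=2.3, seed=7):
            rng = random.Random(seed)
            # Degree sequence k >= 1 with a power-law tail, via Pareto sampling
            degrees = [max(1, int(rng.paretovariate(gamma - 1))) for _ in range(n)]
            if sum(degrees) % 2:                 # configuration model needs an even degree sum
                degrees[0] += 1
            g = nx.configuration_model(degrees, seed=seed)
            return nx.Graph(g)                   # collapse parallel edges for this illustration

        def surviving_giant_fraction(g, q, seed=7):
            rng = random.Random(seed)
            keep = [v for v in g.nodes if rng.random() > q]
            sub = g.subgraph(keep)
            if sub.number_of_nodes() == 0:
                return 0.0
            giant = max(nx.connected_components(sub), key=len)
            return len(giant) / sub.number_of_nodes()

        g = power_law_graph(50_000)
        for q in (0.5, 0.9, 0.95):
            print(f"deletion prob {q}: giant fraction {surviving_giant_fraction(g, q):.2f}")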

  10. Functional Principal Component Analysis and Randomized Sparse Clustering Algorithm for Medical Image Analysis

    PubMed Central

    Lin, Nan; Jiang, Junhai; Guo, Shicheng; Xiong, Momiao

    2015-01-01

    Due to advances in sensor technology, the growing volume of medical image data makes it possible to visualize anatomical changes in biological tissues. As a consequence, medical images have the potential to enhance the diagnosis of disease, the prediction of clinical outcomes, and the characterization of disease progression. But in the meantime, the growing data dimensions pose great methodological and computational challenges for the representation and selection of features in image cluster analysis. To address these challenges, we first extend functional principal component analysis (FPCA) from one dimension to two dimensions to fully capture the spatial variation of the image signals. The image signals contain a large number of redundant features which provide no additional information for clustering analysis. The widely used methods for removing the irrelevant features are sparse clustering algorithms using a lasso-type penalty to select the features. However, the accuracy of clustering using a lasso-type penalty depends on the selection of the penalty parameters and the threshold value. In practice, they are difficult to determine. Recently, randomized algorithms have received a great deal of attention in big data analysis. This paper presents a randomized algorithm for accurate feature selection in image clustering analysis. The proposed method is applied to both the liver and kidney cancer histology image data from the TCGA database. The results demonstrate that the randomized feature selection method coupled with functional principal component analysis substantially outperforms the current sparse clustering algorithms in image cluster analysis. PMID:26196383

  11. Hodge Decomposition of Information Flow on Small-World Networks.

    PubMed

    Haruna, Taichi; Fujiki, Yuuya

    2016-01-01

    We investigate the influence of the small-world topology on the composition of information flow on networks. By appealing to the combinatorial Hodge theory, we decompose information flow generated by random threshold networks on the Watts-Strogatz model into three components: gradient, harmonic and curl flows. The harmonic and curl flows represent globally circular and locally circular components, respectively. The Watts-Strogatz model bridges the two extreme network topologies, a lattice network and a random network, by a single parameter that is the probability of random rewiring. The small-world topology is realized within a certain range between them. By numerical simulation we found that as networks become more random the ratio of harmonic flow to the total magnitude of information flow increases whereas the ratio of curl flow decreases. Furthermore, both quantities are significantly enhanced from the level when only network structure is considered for the network close to a random network and a lattice network, respectively. Finally, the sum of these two ratios takes its maximum value within the small-world region. These findings suggest that the dynamical information counterpart of global integration and that of local segregation are the harmonic flow and the curl flow, respectively, and that a part of the small-world region is dominated by internal circulation of information flow.

  12. Targeting Classrooms' Emotional Climate and Preschoolers' Socioemotional Adjustment: Implementation of the Chicago School Readiness Project

    PubMed Central

    Li-Grining, Christine P.; Raver, C. Cybele; Jones-Lewis, Darlene; Madison-Boyd, Sybil; Lennon, Jaclyn

    2015-01-01

    Children living in low-income families are more likely to experience less self-regulation, greater behavior problems, and lower academic achievement than higher income children. To help prevent children's later socioemotional and academic difficulties, the Chicago School Readiness Project (CSRP) team implemented a clustered, randomized controlled trial (RCT) in early childhood programs with Head Start funding. Head Start sites were randomly assigned to receive CSRP services, which were offered as part of a multi-component, classroom-based mental health intervention. Here, we provide an overview of the CSRP model, its components, and a descriptive portrait of its implementation. In so doing, we address various aspects of the implementation of three of its components: 1) the training of teachers, 2) MHCs' coaching of teachers, and 3) teachers' behavior management of children. We conclude with a discussion of factors potentially related to the implementation of CSRP and directions for future research. PMID:25321641

  13. Linear velocity fields in non-Gaussian models for large-scale structure

    NASA Technical Reports Server (NTRS)

    Scherrer, Robert J.

    1992-01-01

    Linear velocity fields in two types of physically motivated non-Gaussian models for large-scale structure are examined: seed models, in which the density field is a convolution of a density profile with a distribution of points, and local non-Gaussian fields, derived from a local nonlinear transformation on a Gaussian field. The distribution of a single component of the velocity is derived for seed models with randomly distributed seeds, and these results are applied to the seeded hot dark matter model and the global texture model with cold dark matter. An expression for the distribution of a single component of the velocity in arbitrary local non-Gaussian models is given, and these results are applied to such fields with chi-squared and lognormal distributions. It is shown that all seed models with randomly distributed seeds and all local non-Gaussian models have single-component velocity distributions with positive kurtosis.

  14. Randomly displaced phase distribution design and its advantage in page-data recording of Fourier transform holograms.

    PubMed

    Emoto, Akira; Fukuda, Takashi

    2013-02-20

    For Fourier transform holography, an effective random phase distribution with randomly displaced phase segments is proposed for obtaining a smooth finite optical intensity distribution in the Fourier transform plane. Since unitary phase segments are randomly distributed in-plane, the blanks give various spatial frequency components to an image, and thus smooth the spectrum. Moreover, by randomly changing the phase segment size, spike generation from the unitary phase segment size in the spectrum can be reduced significantly. As a result, a smooth spectrum including sidebands can be formed at a relatively narrow extent. The proposed phase distribution sustains the primary functions of a random phase mask for holographic-data recording and reconstruction. Therefore, this distribution is expected to find applications in high-density holographic memory systems, replacing conventional random phase mask patterns.

  15. True random numbers from amplified quantum vacuum.

    PubMed

    Jofre, M; Curty, M; Steinlechner, F; Anzolin, G; Torres, J P; Mitchell, M W; Pruneri, V

    2011-10-10

    Random numbers are essential for applications ranging from secure communications to numerical simulation and quantitative finance. Algorithms can rapidly produce pseudo-random outcomes, series of numbers that mimic most properties of true random numbers while quantum random number generators (QRNGs) exploit intrinsic quantum randomness to produce true random numbers. Single-photon QRNGs are conceptually simple but produce few random bits per detection. In contrast, vacuum fluctuations are a vast resource for QRNGs: they are broad-band and thus can encode many random bits per second. Direct recording of vacuum fluctuations is possible, but requires shot-noise-limited detectors, at the cost of bandwidth. We demonstrate efficient conversion of vacuum fluctuations to true random bits using optical amplification of vacuum and interferometry. Using commercially-available optical components we demonstrate a QRNG at a bit rate of 1.11 Gbps. The proposed scheme has the potential to be extended to 10 Gbps and even up to 100 Gbps by taking advantage of high speed modulation sources and detectors for optical fiber telecommunication devices.

  16. Non-Speech Oro-Motor Exercises in Post-Stroke Dysarthria Intervention: A Randomized Feasibility Trial

    ERIC Educational Resources Information Center

    Mackenzie, C.; Muir, M.; Allen, C.; Jensen, A.

    2014-01-01

    Background: There has been little robust evaluation of the outcome of speech and language therapy (SLT) intervention for post-stroke dysarthria. Non-speech oro-motor exercises (NSOMExs) are a common component of dysarthria intervention. A feasibility study was designed and executed, with participants randomized into two groups, in one of which…

  17. Randomized, Controlled Trial of a Comprehensive Program for Young Students with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Young, Helen E.; Falco, Ruth A.; Hanita, Makoto

    2016-01-01

    This randomized, controlled trial, comparing the Comprehensive Autism Program (CAP) and business as usual programs, studied outcomes for 3-5 year old students with autism spectrum disorder (ASD). Participants included 84 teachers and 302 students with ASD and their parents. CAP utilized specialized curricula and training components to implement…

  18. Asymptotic Effect of Misspecification in the Random Part of the Multilevel Model

    ERIC Educational Resources Information Center

    Berkhof, Johannes; Kampen, Jarl Kennard

    2004-01-01

    The authors examine the asymptotic effect of omitting a random coefficient in the multilevel model and derive expressions for the change in (a) the variance components estimator and (b) the estimated variance of the fixed effects estimator. They apply the method of moments, which yields a closed form expression for the omission effect. In…

  19. Optimizing occupational exposure measurement strategies when estimating the log-scale arithmetic mean value--an example from the reinforced plastics industry.

    PubMed

    Lampa, Erik G; Nilsson, Leif; Liljelind, Ingrid E; Bergdahl, Ingvar A

    2006-06-01

    When assessing occupational exposures, repeated measurements are in most cases required. Repeated measurements are more resource intensive than a single measurement, so careful planning of the measurement strategy is necessary to assure that resources are spent wisely. The optimal strategy depends on the objectives of the measurements. Here, two different models of random effects analysis of variance (ANOVA) are proposed for the optimization of measurement strategies by the minimization of the variance of the estimated log-transformed arithmetic mean value of a worker group, i.e. the strategies are optimized for precise estimation of that value. The first model is a one-way random effects ANOVA model. For that model it is shown that the best precision in the estimated mean value is always obtained by including as many workers as possible in the sample while restricting the number of replicates to two or at most three regardless of the size of the variance components. The second model introduces the 'shared temporal variation' which accounts for those random temporal fluctuations of the exposure that the workers have in common. It is shown for that model that the optimal sample allocation depends on the relative sizes of the between-worker component and the shared temporal component, so that if the between-worker component is larger than the shared temporal component more workers should be included in the sample and vice versa. The results are illustrated graphically with an example from the reinforced plastics industry. If there exists a shared temporal variation at a workplace, that variability needs to be accounted for in the sampling design and the more complex model is recommended.
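
    Under the one-way random-effects model described above, the variance of the estimated log-scale group mean for k workers with n replicates each is sigma_B^2/k + sigma_W^2/(k*n). The following sketch enumerates allocations under a fixed measurement budget; the variance components and budget are invented for illustration, not values from the study.

```python
def var_of_mean(k_workers, n_reps, var_between, var_within):
    """Variance of the estimated log-scale group mean under the one-way
    random-effects ANOVA model (no shared temporal component)."""
    return var_between / k_workers + var_within / (k_workers * n_reps)

# Illustrative variance components and measurement budget (assumed values).
var_between, var_within = 0.4, 1.2
budget = 24                                          # total measurements

# With the budget fixed, spreading it over more workers lowers the variance;
# at least two replicates are kept so both components remain estimable.
allocations = [(k, n) for k in range(2, budget + 1)
               for n in range(2, budget + 1) if k * n == budget]
for k, n in sorted(allocations):
    v = var_of_mean(k, n, var_between, var_within)
    print(f"{k:2d} workers x {n:2d} replicates -> Var(mean) = {v:.4f}")
```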

  20. Improving anxiety regulation in patients with breast cancer at the beginning of the survivorship period: a randomized clinical trial comparing the benefits of single-component and multiple-component group interventions.

    PubMed

    Merckaert, Isabelle; Lewis, Florence; Delevallez, France; Herman, Sophie; Caillier, Marie; Delvaux, Nicole; Libert, Yves; Liénard, Aurore; Nogaret, Jean-Marie; Ogez, David; Scalliet, Pierre; Slachmuylder, Jean-Louis; Van Houtte, Paul; Razavi, Darius

    2017-08-01

    To compare in a multicenter randomized controlled trial the benefits in terms of anxiety regulation of a 15-session single-component group intervention (SGI) based on support with those of a 15-session multiple-component structured manualized group intervention (MGI) combining support with cognitive-behavioral and hypnosis components. Patients with nonmetastatic breast cancer were randomly assigned at the beginning of the survivorship period to the SGI (n = 83) or MGI (n = 87). Anxiety regulation was assessed, before and after group interventions, through an anxiety regulation task designed to assess their ability to regulate anxiety psychologically (anxiety levels) and physiologically (heart rates). Questionnaires were used to assess psychological distress, everyday anxiety regulation, and fear of recurrence. Group allocation was computer generated and concealed till baseline completion. Compared with patients in the SGI group (n = 77), patients attending the MGI group (n = 82) showed significantly reduced anxiety after a self-relaxation exercise (P = .006) and after exposure to anxiety triggers (P = .013) and reduced heart rates at different time points throughout the task (P = .001 to P = .047). The MGI participants also reported better everyday anxiety regulation (P = .005), greater use of fear of recurrence-related coping strategies (P = .022), and greater reduction in fear of recurrence-related psychological distress (P = .017) compared with the SGI group. This study shows that an MGI combining support with cognitive-behavioral techniques and hypnosis is more effective than an SGI based only on support in improving anxiety regulation in patients with breast cancer. Copyright © 2016 John Wiley & Sons, Ltd.

  1. A Noise Reduction Method for Dual-Mass Micro-Electromechanical Gyroscopes Based on Sample Entropy Empirical Mode Decomposition and Time-Frequency Peak Filtering

    PubMed Central

    Shen, Chong; Li, Jie; Zhang, Xiaoming; Shi, Yunbo; Tang, Jun; Cao, Huiliang; Liu, Jun

    2016-01-01

    The different noise components in a dual-mass micro-electromechanical system (MEMS) gyroscope structure are analyzed in this paper, including mechanical-thermal noise (MTN), electronic-thermal noise (ETN), flicker noise (FN) and Coriolis signal in-phase noise (IPN). The structure's equivalent electronic model is established, and an improved white Gaussian noise reduction method for dual-mass MEMS gyroscopes is proposed which is based on sample entropy empirical mode decomposition (SEEMD) and time-frequency peak filtering (TFPF). There is a contradiction in TFPF: a short window length gives good preservation of signal amplitude but poor random noise reduction, whereas a long window length gives serious attenuation of the signal amplitude but effective random noise reduction. In order to achieve a good tradeoff between valid signal amplitude preservation and random noise reduction, SEEMD is adopted to improve TFPF. First, the original signal is decomposed into intrinsic mode functions (IMFs) by EMD, and the SE of each IMF is calculated in order to classify the numerous IMFs into three different components; then short-window TFPF is employed for the low-frequency IMF component and long-window TFPF for the high-frequency IMF component, while the noise IMF component is discarded directly; finally, the de-noised signal is obtained after reconstruction. Rotation and temperature experiments were carried out to verify the proposed SEEMD-TFPF algorithm; the verification and comparison results show that the de-noising performance of SEEMD-TFPF is better than that achievable with traditional wavelet, Kalman filter and fixed-window-length TFPF methods. PMID:27258276
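
    Sample entropy is the quantity used above to sort IMFs into noise-like and signal-like groups. The following plain O(N^2) sketch is not the authors' code; the template length, tolerance, and test signals are illustrative assumptions.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy of a 1-D signal (plain sketch).
    m: template length; tolerance r is a fraction of the signal std."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        count = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d <= r)
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

# White noise is more irregular (higher SE) than a sine wave; that contrast is
# what separates noise-dominated IMFs from signal-dominated ones.
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 1000)
print(sample_entropy(np.sin(2 * np.pi * 5 * t)))    # low
print(sample_entropy(rng.standard_normal(1000)))    # high
```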

  2. A Noise Reduction Method for Dual-Mass Micro-Electromechanical Gyroscopes Based on Sample Entropy Empirical Mode Decomposition and Time-Frequency Peak Filtering.

    PubMed

    Shen, Chong; Li, Jie; Zhang, Xiaoming; Shi, Yunbo; Tang, Jun; Cao, Huiliang; Liu, Jun

    2016-05-31

    The different noise components in a dual-mass micro-electromechanical system (MEMS) gyroscope structure are analyzed in this paper, including mechanical-thermal noise (MTN), electronic-thermal noise (ETN), flicker noise (FN) and Coriolis signal in-phase noise (IPN). The structure's equivalent electronic model is established, and an improved white Gaussian noise reduction method for dual-mass MEMS gyroscopes is proposed which is based on sample entropy empirical mode decomposition (SEEMD) and time-frequency peak filtering (TFPF). There is a contradiction in TFPF: a short window length gives good preservation of signal amplitude but poor random noise reduction, whereas a long window length gives serious attenuation of the signal amplitude but effective random noise reduction. In order to achieve a good tradeoff between valid signal amplitude preservation and random noise reduction, SEEMD is adopted to improve TFPF. First, the original signal is decomposed into intrinsic mode functions (IMFs) by EMD, and the SE of each IMF is calculated in order to classify the numerous IMFs into three different components; then short-window TFPF is employed for the low-frequency IMF component and long-window TFPF for the high-frequency IMF component, while the noise IMF component is discarded directly; finally, the de-noised signal is obtained after reconstruction. Rotation and temperature experiments were carried out to verify the proposed SEEMD-TFPF algorithm; the verification and comparison results show that the de-noising performance of SEEMD-TFPF is better than that achievable with traditional wavelet, Kalman filter and fixed-window-length TFPF methods.

  3. Fabricating a Microcomputer on a Single Silicon Wafer

    NASA Technical Reports Server (NTRS)

    Evanchuk, V. L.

    1983-01-01

    Concept for "microcomputer on a slice" reduces microcomputer costs by eliminating scribing, wiring, and packaging of individual circuit chips. Low-cost microcomputer on silicon slice contains redundant components. All components (central processing unit, input/output circuitry, read-only memory, and random-access memory: CPU, I/O, ROM, and RAM) are placed on a single silicon wafer.

  4. Accuracy of the Parallel Analysis Procedure with Polychoric Correlations

    ERIC Educational Resources Information Center

    Cho, Sun-Joo; Li, Feiming; Bandalos, Deborah

    2009-01-01

    The purpose of this study was to investigate the application of the parallel analysis (PA) method for choosing the number of factors in component analysis for situations in which data are dichotomous or ordinal. Although polychoric correlations are sometimes used as input for component analyses, the random data matrices generated for use in PA…

  5. Effects of Career Choice Intervention on Components of Career Preparation

    ERIC Educational Resources Information Center

    Koivisto, Petri; Vinokur, Amiram D.; Vuori, Jukka

    2011-01-01

    This randomized experimental study (N = 1,034) examines both the direct and the indirect effects of the Towards Working Life intervention on 2 components of adolescents' career preparation: preparedness for career choice and attitude toward career planning. The intervention comprised a 1-week workshop program, the proximal goals of which were to…

  6. Estimating individual influences of behavioral intentions: an application of random-effects modeling to the theory of reasoned action.

    PubMed

    Hedeker, D; Flay, B R; Petraitis, J

    1996-02-01

    Methods are proposed and described for estimating the degree to which relations among variables vary at the individual level. As an example of the methods, M. Fishbein and I. Ajzen's (1975; I. Ajzen & M. Fishbein, 1980) theory of reasoned action is examined, which posits first that an individual's behavioral intentions are a function of 2 components: the individual's attitudes toward the behavior and the subjective norms as perceived by the individual. A second component of their theory is that individuals may weight these 2 components differently in assessing their behavioral intentions. This article illustrates the use of empirical Bayes methods based on a random-effects regression model to estimate these individual influences, estimating an individual's weighting of both of these components (attitudes toward the behavior and subjective norms) in relation to their behavioral intentions. This method can be used when an individual's behavioral intentions, subjective norms, and attitudes toward the behavior are all repeatedly measured. In this case, the empirical Bayes estimates are derived as a function of the data from the individual, strengthened by the overall sample data.
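
    A hedged sketch of the general approach, not the authors' software: with repeated measures per person, a mixed (random-effects) regression with random slopes yields empirical-Bayes (BLUP) estimates of each individual's weighting of attitudes and norms. The simulated data, variable names, and statsmodels usage below are assumptions for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)

# Simulated repeated measures: each person weights attitude and norm differently.
n_people, n_obs = 50, 8
rows = []
for pid in range(n_people):
    w_att = 0.6 + 0.3 * rng.standard_normal()
    w_norm = 0.4 + 0.3 * rng.standard_normal()
    att = rng.standard_normal(n_obs)
    norm = rng.standard_normal(n_obs)
    intent = w_att * att + w_norm * norm + 0.5 * rng.standard_normal(n_obs)
    rows += [dict(person=pid, attitude=a, norm=s, intention=y)
             for a, s, y in zip(att, norm, intent)]
df = pd.DataFrame(rows)

# Random-effects regression: random slopes for attitude and norm per person.
model = sm.MixedLM.from_formula(
    "intention ~ attitude + norm", groups="person",
    re_formula="~0 + attitude + norm", data=df)
result = model.fit()

print(result.fe_params)          # population-average weights
print(result.random_effects[0])  # empirical Bayes deviations for person 0
```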

  7. Using variance components to estimate power in a hierarchically nested sampling design improving monitoring of larval Devils Hole pupfish

    USGS Publications Warehouse

    Dzul, Maria C.; Dixon, Philip M.; Quist, Michael C.; Dinsomore, Stephen J.; Bower, Michael R.; Wilson, Kevin P.; Gaines, D. Bailey

    2013-01-01

    We used variance components to assess allocation of sampling effort in a hierarchically nested sampling design for ongoing monitoring of early life history stages of the federally endangered Devils Hole pupfish (DHP) (Cyprinodon diabolis). Sampling design for larval DHP included surveys (5 days each spring 2007–2009), events, and plots. Each survey comprised three counting events, where DHP larvae on nine plots were counted plot by plot. Statistical analysis of larval abundance included three components: (1) evaluation of power from various sample size combinations, (2) comparison of power in fixed and random plot designs, and (3) assessment of yearly differences in the power of the survey. Results indicated that increasing the sample size at the lowest level of sampling represented the most realistic option to increase the survey's power, fixed plot designs had greater power than random plot designs, and the power of the larval survey varied by year. This study provides an example of how monitoring efforts may benefit from coupling variance components estimation with power analysis to assess sampling design.

  8. Modeling methodology for MLS range navigation system errors using flight test data

    NASA Technical Reports Server (NTRS)

    Karmali, M. S.; Phatak, A. V.

    1982-01-01

    Flight test data was used to develop a methodology for modeling MLS range navigation system errors. The data used corresponded to the constant velocity and glideslope approach segment of a helicopter landing trajectory. The MLS range measurement was assumed to consist of low frequency and random high frequency components. The random high frequency component was extracted from the MLS range measurements. This was done by appropriate filtering of the range residual generated from a linearization of the range profile for the final approach segment. This range navigation system error was then modeled as an autoregressive moving average (ARMA) process. Maximum likelihood techniques were used to identify the parameters of the ARMA process.
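
    A minimal sketch of the last step described above, fitting an ARMA model to a detrended range residual by maximum likelihood. The simulated residual and the (1, 0, 1) order are assumptions; the original work identified the order from flight data.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(3)

# Stand-in for the MLS range residual: the low-frequency trend is assumed to be
# already removed, leaving a correlated high-frequency error (simulated here).
n = 2000
e = rng.standard_normal(n)
residual = np.empty(n)
residual[0] = e[0]
for k in range(1, n):                  # simple ARMA(1,1)-style correlated noise
    residual[k] = 0.7 * residual[k - 1] + e[k] + 0.3 * e[k - 1]

# Identify an ARMA(p, q) model for the range navigation error.
fit = ARIMA(residual, order=(1, 0, 1)).fit()
print(fit.params)                      # AR, MA and innovation-variance estimates
print(fit.aic)
```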

  9. Stability of parental understanding of random assignment in childhood leukemia trials: an empirical examination of informed consent.

    PubMed

    Greenley, Rachel Neff; Drotar, Dennis; Zyzanski, Stephen J; Kodish, Eric

    2006-02-20

    To examine stability versus change in parental understanding of random assignment in randomized clinical trials (RCTs) for pediatric leukemia and to identify factors associated with changes in understanding. Eighty-four parents of children diagnosed with acute lymphoblastic leukemia or acute myeloid leukemia who were enrolled onto a pediatric leukemia RCT at one of six US children's hospitals participated. Parents were interviewed twice, once within 48 hours after the Informed Consent Conference (ICC; time 1 [T1]) and again 6 months later (time 2 [T2]). Interviews focused on parental understanding of key components of the RCT, including random assignment. Interviews were audiotaped, transcribed, and later analyzed. Changes in understanding of random assignment occurred in 19% of parents, with 17% of parents deteriorating in understanding from T1 to T2. Forty-nine percent of parents failed to understand random assignment at both times. Factors associated with understanding at both times included majority ethnicity, high socioeconomic status, parental reading of consent document, and presence of a nurse during the ICC. Physician discussion of specific components of the RCT was also associated with understanding at both times. Female caregivers and parents of low socioeconomic status were overrepresented among those who showed decay in understanding from T1 to T2. Parents showed little gain in understanding over time. Factors that predicted understanding at diagnosis as well as sustained understanding over time may be important intervention targets. Attention to both modifiable and nonmodifiable barriers is important for clinical practice.

  10. A system identification technique based on the random decrement signatures. Part 2: Experimental results

    NASA Technical Reports Server (NTRS)

    Bedewi, Nabih E.; Yang, Jackson C. S.

    1987-01-01

    Identification of the system parameters of a randomly excited structure may be treated using a variety of statistical techniques. Of all these techniques, the Random Decrement is unique in that it provides the homogeneous component of the system response. Using this quality, a system identification technique was developed based on a least-squares fit of the signatures to estimate the mass, damping, and stiffness matrices of a linear randomly excited system. The results of an experiment conducted on an offshore platform scale model to verify the validity of the technique and to demonstrate its application in damage detection are presented.

  11. How random is a random vector?

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo

    2015-12-01

    Over 80 years ago Samuel Wilks proposed that the "generalized variance" of a random vector is the determinant of its covariance matrix. To date, the notion and use of the generalized variance is confined only to very specific niches in statistics. In this paper we establish that the "Wilks standard deviation" (the square root of the generalized variance) is indeed the standard deviation of a random vector. We further establish that the "uncorrelation index" (a derivative of the Wilks standard deviation) is a measure of the overall correlation between the components of a random vector. Both the Wilks standard deviation and the uncorrelation index are, respectively, special cases of two general notions that we introduce: "randomness measures" and "independence indices" of random vectors. In turn, these general notions give rise to "randomness diagrams", tangible planar visualizations that answer the question: How random is a random vector? The notion of "independence indices" yields a novel measure of correlation for Lévy laws. In general, the concepts and results presented in this paper are applicable to any field of science and engineering with random-vectors empirical data.
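
    A small numerical sketch of the quantities named above: the Wilks standard deviation as the square root of the determinant of the covariance matrix, and one plausible normalization of an uncorrelation index (the paper's exact normalization may differ). The mixing matrix and sample sizes are illustrative.

```python
import numpy as np

def wilks_std(x):
    """Wilks standard deviation: square root of the generalized variance,
    i.e. of the determinant of the covariance matrix (rows = observations)."""
    return np.sqrt(np.linalg.det(np.cov(x, rowvar=False)))

def uncorrelation_index(x):
    """Ratio of the Wilks standard deviation to the product of per-component
    standard deviations: near 1 for uncorrelated components, smaller otherwise."""
    return wilks_std(x) / np.prod(np.std(x, axis=0, ddof=1))

rng = np.random.default_rng(4)
independent = rng.standard_normal((5000, 3))
mixed = independent @ np.array([[1.0, 0.8, 0.0],
                                [0.0, 1.0, 0.5],
                                [0.0, 0.0, 1.0]])
print(uncorrelation_index(independent))   # close to 1
print(uncorrelation_index(mixed))         # noticeably below 1
```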

  12. Fatigue crack growth model RANDOM2 user manual, appendix 1

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Lovelace, Thomas B.

    1989-01-01

    The FORTRAN program RANDOM2 is documented. RANDOM2 is based on fracture mechanics using a probabilistic fatigue crack growth model. It predicts the random lifetime of an engine component to reach a given crack size. Included in this user manual are details regarding the theoretical background of RANDOM2, input data, instructions and a sample problem illustrating the use of RANDOM2. Appendix A gives information on the physical quantities, their symbols, FORTRAN names, and both SI and U.S. Customary units. Appendix B includes photocopies of the actual computer printout corresponding to the sample problem. Appendices C and D detail the IMSL, Ver. 10(1), subroutines and functions called by RANDOM2 and a SAS/GRAPH(2) program that can be used to plot both the probability density function (p.d.f.) and the cumulative distribution function (c.d.f.).

  13. High-speed true random number generation based on paired memristors for security electronics

    NASA Astrophysics Data System (ADS)

    Zhang, Teng; Yin, Minghui; Xu, Changmin; Lu, Xiayan; Sun, Xinhao; Yang, Yuchao; Huang, Ru

    2017-11-01

    A true random number generator (TRNG) is a critical component in hardware security that is increasingly important in the era of mobile computing and the internet of things. Here we demonstrate a TRNG using intrinsic variation of memristors as a natural source of entropy that is otherwise undesirable in most applications. The random bits were produced by cyclically switching a pair of tantalum oxide based memristors and comparing their resistance values in the off state, taking advantage of the more pronounced resistance variation compared with that in the on state. Using an alternating read scheme in the designed TRNG circuit, the unbiasedness of the random numbers was significantly improved, and the bitstream passed standard randomness tests. The Pt/TaOx/Ta memristors fabricated in this work have fast programming/erasing speeds of ~30 ns, suggesting a high random number throughput. The approach proposed here thus holds great promise for physically-implemented random number generation.
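
    A toy simulation of why the alternating read scheme de-biases the output: a static mismatch between the two devices skews a naive comparison, but swapping the comparison on every other cycle cancels it on average. The resistance distributions below are invented numbers, not measurements from the paper.

```python
import numpy as np

rng = np.random.default_rng(5)
n_bits = 100_000

# Off-state resistances of the two memristors after each programming cycle,
# modeled as lognormal cycle-to-cycle variation (illustrative values only).
r_a = rng.lognormal(mean=np.log(1.0e6), sigma=0.3, size=n_bits)
r_b = rng.lognormal(mean=np.log(1.1e6), sigma=0.3, size=n_bits)  # slight mismatch

# Naive comparison: the fixed mismatch between the devices biases the bits.
raw = (r_a > r_b).astype(int)

# Alternating read scheme: invert the comparison on every other cycle, so a
# static mismatch cancels out on average.
alternating = raw.copy()
alternating[1::2] = 1 - raw[1::2]

print("raw ones fraction        :", raw.mean())
print("alternating ones fraction:", alternating.mean())
```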

  14. High-speed true random number generation based on paired memristors for security electronics.

    PubMed

    Zhang, Teng; Yin, Minghui; Xu, Changmin; Lu, Xiayan; Sun, Xinhao; Yang, Yuchao; Huang, Ru

    2017-11-10

    A true random number generator (TRNG) is a critical component in hardware security that is increasingly important in the era of mobile computing and the internet of things. Here we demonstrate a TRNG using intrinsic variation of memristors as a natural source of entropy that is otherwise undesirable in most applications. The random bits were produced by cyclically switching a pair of tantalum oxide based memristors and comparing their resistance values in the off state, taking advantage of the more pronounced resistance variation compared with that in the on state. Using an alternating read scheme in the designed TRNG circuit, the unbiasedness of the random numbers was significantly improved, and the bitstream passed standard randomness tests. The Pt/TaOx/Ta memristors fabricated in this work have fast programming/erasing speeds of ~30 ns, suggesting a high random number throughput. The approach proposed here thus holds great promise for physically-implemented random number generation.

  15. The Effectiveness of Mindfulness-based Cognitive Therapy on Psychological Symptoms and Quality of Life in Systemic Lupus Erythematosus Patients: A Randomized Controlled Trial.

    PubMed

    Solati, Kamal; Mousavi, Mohammad; Kheiri, Soleiman; Hasanpour-Dehkordi, Ali

    2017-09-01

    This study was conducted to determine the efficacy of mindfulness-based cognitive therapy (MBCT) on psychological symptoms and quality of life (QoL) in patients with systemic lupus erythematosus (SLE). We conducted a randomized single-blind clinical trial in patients with SLE referred from the Imam Ali Clinic in Shahrekord, southwest Iran. The patients (46 in total in two groups of 23 each) were randomly assigned to the experimental and control groups. Both groups underwent routine medical care, and the experimental group underwent eight group sessions of MBCT in addition to routine care. The patients' QoL was assessed using the General Health Questionnaire-28 and 36-Item Short Form Health Survey before, after, and six months after intervention (follow-up). A significant difference was seen in psychological symptoms and QoL between MBCT and control groups immediately after the intervention and at follow-up (p ≤ 0.050). However, the difference was not significant for the physical components of QoL (p ≥ 0.050). MBCT contributed to decreased psychological symptoms and improved QoL in patients with SLE, with a stable effect on psychological symptoms and psychological components of QoL, but an unstable effect on physical components.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Modeste Nguimdo, Romain, E-mail: Romain.Nguimdo@vub.ac.be; Tchitnga, Robert; Woafo, Paul

    We numerically investigate the possibility of using a coupling to increase the complexity in the simplest chaotic two-component electronic circuits operating at high frequency. We subsequently show that the complex behaviors generated in such coupled systems, together with the post-processing, are suitable for generating bit-streams which pass all the NIST tests for randomness. The electronic circuit is built up by unidirectionally coupling three two-component (one active and one passive) oscillators in a ring configuration through resistances. It turns out that, with such a coupling, highly chaotic signals can be obtained. By extracting points at a fixed interval of 10 ns (corresponding to a bit rate of 100 Mb/s) from such chaotic signals, each point being simultaneously converted into 16 bits (or 8 bits), we find that the binary sequence constructed by keeping the 10 (or 2) least significant bits passes statistical tests of randomness, meaning that bit-streams with random properties can be achieved with an overall bit rate up to 10 × 100 Mb/s = 1 Gb/s (or 2 × 100 Mb/s = 200 Mb/s). Moreover, by varying the bias voltages, we also investigate the parameter range for which more complex signals can be obtained. Besides being simple to implement, the two-component electronic circuit setup is very cheap compared with optical and electro-optical systems.
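
    The post-processing step above (quantize each sample and keep only the least significant bits) is easy to sketch. The logistic map below is a stand-in for the coupled circuit's chaotic voltage, and the sample count is arbitrary; only the bit-extraction pattern follows the record.

```python
import numpy as np

# Stand-in chaotic waveform (logistic map), not the coupled oscillator circuit.
n = 5000
x = np.empty(n)
x[0] = 0.37
for k in range(1, n):
    x[k] = 3.99 * x[k - 1] * (1.0 - x[k - 1])

# Quantize each sampled point to 16 bits and keep only the 10 least
# significant bits, mirroring the post-processing described above.
samples = (x * 65535).astype(np.uint16)
bits = ((samples[:, None] >> np.arange(10)) & 1).astype(np.uint8)  # 10 LSBs/sample
bitstream = bits.ravel()

print(len(bitstream), "bits, ones fraction =", bitstream.mean())
```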

  17. Comparative evaluation of Space Transportation System (STS)-3 flight and acoustic test random vibration response of the OSS-1 payload

    NASA Technical Reports Server (NTRS)

    On, F. J.

    1983-01-01

    A comparative evaluation of the Space Transportation System (STS)-3 flight and acoustic test random vibration response of the Office of Space Science-1 (OSS-1) payload is presented. The results provide insight into the characteristics of vibroacoustic response of pallet payload components in the payload bay during STS flights.

  18. Alcohol-Specific Parenting within a Cluster-Randomized Effectiveness Trial of a Swedish Primary Prevention Program

    ERIC Educational Resources Information Center

    Strandberg, Anna K.; Bodin, Maria C.

    2011-01-01

    Purpose: Within the framework of an ongoing cluster-randomized effectiveness trial of a parental prevention program, the aim of the present study is to investigate attitudes towards under-age drinking and use of program components, i.e. alcohol-specific parenting behaviors, in parents who did and did not take part in the programme.…

  19. Evaluation of a 2-Year Physical Activity and Healthy Eating Intervention in Middle School Children

    ERIC Educational Resources Information Center

    Haerens, Leen; Deforche, Benedicte; Maes, Lea; Cardon, Greet; Stevens, Veerle; De Bourdeaudhuij, Ilse

    2006-01-01

    The aim of the present study was to evaluate the effects of a middle school physical activity and healthy eating intervention, including an environmental and computer-tailored component, and to investigate the effects of parental involvement. A random sample of 15 schools with seventh and eighth graders was randomly assigned to one of three…

  20. Validating Components of Teacher Effectiveness: A Random Assignment Study of Value-Added, Observation, and Survey Scores

    ERIC Educational Resources Information Center

    Bacher-Hicks, Andrew; Chin, Mark; Kane, Thomas J.; Staiger, Douglas O.

    2015-01-01

    Policy changes from the past decade have resulted in a growing interest in identifying effective teachers and their characteristics. This study is the third study to use data from a randomized experiment to test the validity of measures of teacher effectiveness. The authors collected effectiveness measures across three school years from three…

  1. Electrophysiological Evidence for the Magnocellular-Dorsal Pathway Deficit in Dyslexia

    ERIC Educational Resources Information Center

    Jednorog, Katarzyna; Marchewka, Artur; Tacikowski, Pawel; Heim, Stefan; Grabowska, Anna

    2011-01-01

    In adults, the onset of coherent motion compared to random motion in a random dot kinematogram leads to a right hemispheric amplitude advantage of the N2 response. The source of this asymmetry is believed to lie in the motion selective MT+ cortex. Here, we tested whether the right temporo-parietal N2 component shows a similar regularity in children.…

  2. Infinite Systems of Interacting Chains with Memory of Variable Length—A Stochastic Model for Biological Neural Nets

    NASA Astrophysics Data System (ADS)

    Galves, A.; Löcherbach, E.

    2013-06-01

    We consider a new class of non-Markovian processes with a countable number of interacting components. At each time unit, each component can take two values, indicating whether or not it has a spike at that moment. The system evolves as follows. For each component, the probability of having a spike at the next time unit depends on the entire time evolution of the system after the last spike time of the component. This class of systems extends in a non-trivial way both the interacting particle systems, which are Markovian (Spitzer in Adv. Math. 5:246-290, 1970), and the stochastic chains with memory of variable length, which have finite state space (Rissanen in IEEE Trans. Inf. Theory 29(5):656-664, 1983). These features make it suitable for describing the time evolution of biological neural systems. We construct a stationary version of the process by using a probabilistic tool, a Kalikow-type decomposition either in random environment or in space-time. This construction implies uniqueness of the stationary process. Finally we consider the case where the interactions between components are given by a critical directed Erdös-Rényi-type random graph with a large but finite number of components. In this framework we obtain an explicit upper bound for the correlation between successive inter-spike intervals which is compatible with previous empirical findings.

  3. Normalization of neuronal responses in cortical area MT across signal strengths and motion directions

    PubMed Central

    Xiao, Jianbo; Niu, Yu-Qiong; Wiesner, Steven

    2014-01-01

    Multiple visual stimuli are common in natural scenes, yet it remains unclear how multiple stimuli interact to influence neuronal responses. We investigated this question by manipulating relative signal strengths of two stimuli moving simultaneously within the receptive fields (RFs) of neurons in the extrastriate middle temporal (MT) cortex. Visual stimuli were overlapping random-dot patterns moving in two directions separated by 90°. We first varied the motion coherence of each random-dot pattern and characterized, across the direction tuning curve, the relationship between neuronal responses elicited by bidirectional stimuli and by the constituent motion components. The tuning curve for bidirectional stimuli showed response normalization and can be accounted for by a weighted sum of the responses to the motion components. Allowing nonlinear, multiplicative interaction between the two component responses significantly improved the data fit for some neurons, and the interaction mainly had a suppressive effect on the neuronal response. The weighting of the component responses was not fixed but dependent on relative signal strengths. When two stimulus components moved at different coherence levels, the response weight for the higher-coherence component was significantly greater than that for the lower-coherence component. We also varied relative luminance levels of two coherently moving stimuli and found that MT response weight for the higher-luminance component was also greater. These results suggest that competition between multiple stimuli within a neuron's RF depends on relative signal strengths of the stimuli and that multiplicative nonlinearity may play an important role in shaping the response tuning for multiple stimuli. PMID:24899674

  4. Application of a computerized vibroacoustic data bank for random vibration criteria development

    NASA Technical Reports Server (NTRS)

    Ferebee, R. C.

    1982-01-01

    A computerized data bank system was developed for utilization of large amounts of vibration and acoustic data to formulate component random vibration design and test criteria. This system consists of a computer, graphics tablets, and a dry silver hard copier which are all desk top type hardware and occupy minimal space. Currently, the data bank contains data from the Saturn 5 and Titan 3 flight and static test programs. The vibration and acoustic data are stored in the form of power spectral density and one third octave band plots over the frequency range from 20 to 2000 Hz. The data were stored by digitizing each spectral plot by tracing with the graphics tablet. The digitized data were statistically analyzed, and the resulting 97.5 percent confidence levels were stored on tape along with the appropriate structural parameters. Standard extrapolation procedures were programmed for prediction of component random vibration test criteria for new launch vehicle and payload configurations. A user's manual is included to guide potential users through the programs.

  5. Computing approximate random Delta v magnitude probability densities. [for spacecraft trajectory correction

    NASA Technical Reports Server (NTRS)

    Chadwick, C.

    1984-01-01

    This paper describes the development and use of an algorithm to compute approximate statistics of the magnitude of a single random trajectory correction maneuver (TCM) Delta v vector. The TCM Delta v vector is modeled as a three-component Cartesian vector, each of whose components is a random variable having a normal (Gaussian) distribution with zero mean and possibly unequal standard deviations. The algorithm uses these standard deviations as input to produce approximations to (1) the mean and standard deviation of the magnitude of Delta v, (2) points of the probability density function of the magnitude of Delta v, and (3) points of the cumulative and inverse cumulative distribution functions of Delta v. The approximations are based on Monte Carlo techniques developed in a previous paper by the author and extended here. The algorithm described is expected to be useful in both pre-flight planning and in-flight analysis of maneuver propellant requirements for space missions.
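
    A brute-force Monte Carlo version of the same quantities is straightforward and serves as a reference for the approximations described above. The standard deviations below are invented example values.

```python
import numpy as np

def delta_v_magnitude_stats(sigmas, n_samples=200_000, seed=0):
    """Monte Carlo statistics of |Delta v| when the three Cartesian components
    are independent zero-mean Gaussians with the given standard deviations
    (a reference sketch, not the paper's approximation algorithm)."""
    rng = np.random.default_rng(seed)
    dv = rng.standard_normal((n_samples, 3)) * np.asarray(sigmas)
    mag = np.linalg.norm(dv, axis=1)
    quantiles = np.quantile(mag, [0.5, 0.9, 0.99])
    return mag.mean(), mag.std(), quantiles

mean, std, (q50, q90, q99) = delta_v_magnitude_stats([1.0, 1.0, 2.5])  # unequal sigmas
print(f"mean={mean:.3f}  std={std:.3f}  median={q50:.3f}  90%={q90:.3f}  99%={q99:.3f}")
```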

  6. Probabilistic Material Strength Degradation Model for Inconel 718 Components Subjected to High Temperature, High-Cycle and Low-Cycle Mechanical Fatigue, Creep and Thermal Fatigue Effects

    NASA Technical Reports Server (NTRS)

    Bast, Callie C.; Boyce, Lola

    1995-01-01

    The development of methodology for a probabilistic material strength degradation is described. The probabilistic model, in the form of a postulated randomized multifactor equation, provides for quantification of uncertainty in the lifetime material strength of aerospace propulsion system components subjected to a number of diverse random effects. This model is embodied in the computer program entitled PROMISS, which can include up to eighteen different effects. Presently, the model includes five effects that typically reduce lifetime strength: high temperature, high-cycle mechanical fatigue, low-cycle mechanical fatigue, creep and thermal fatigue. Results, in the form of cumulative distribution functions, illustrated the sensitivity of lifetime strength to any current value of an effect. In addition, verification studies comparing predictions of high-cycle mechanical fatigue and high temperature effects with experiments are presented. Results from this limited verification study strongly supported that material degradation can be represented by randomized multifactor interaction models.

  7. Observing Responses and Serial Stimuli: Searching for the Reinforcing Properties of the S-

    ERIC Educational Resources Information Center

    Escobar, Rogelio; Bruner, Carlos A.

    2009-01-01

    The control exerted by a stimulus associated with an extinction component (S-) on observing responses was determined as a function of its temporal relation with the onset of the reinforcement component (S+). Lever pressing by rats was reinforced on a mixed random-interval extinction schedule. Each press on a second lever produced stimuli…

  8. Predicting Personality Resiliency by Psychological Well-Being and Its Components in Girl Students of Islamic Azad University

    ERIC Educational Resources Information Center

    Kajbafnezhad, Hadi; Khaneh Keshi, Ali

    2015-01-01

    The aim of this study was to predict psychological resilience by psychological well-being and its components. The research sample consisted of 216 girl students who were selected through multistage random sampling. The data were collected by implementing psychological resilience and psychological well-being questionnaire and analyzed by using…

  9. Implications of random variation in the Stand Prognosis Model

    Treesearch

    David A. Hamilton

    1991-01-01

    Although the Stand Prognosis Model has several stochastic components, features have been included in the model in an attempt to minimize run-to-run variation attributable to these stochastic components. This has led many users to assume that comparisons of management alternatives could be made based on a single run of the model for each alternative. Recent analyses...

  10. The Practices of Critical Thinking Component and Its Impact in Malaysian Nurses Health Education

    ERIC Educational Resources Information Center

    Abdullah, Abdul Ghani Kanesan; Alzaidiyeen, Naser Jamil; Yee, Ng Mooi

    2010-01-01

    The purpose of this research is to study the impact of the critical thinking component in the health education curriculum of nurses for patients with different health needs. Data for this research was gathered from mixed approaches, quantitative and qualitative approaches. For the quantitative approach 84 student nurses were selected randomly to…

  11. On the Extraction of Components and the Applicability of the Factor Model.

    ERIC Educational Resources Information Center

    Dziuban, Charles D.; Harris, Chester W.

    A reanalysis of Shaycroft's matrix of intercorrelations of 10 test variables plus 4 random variables is discussed. Three different procedures were used in the reanalysis: (1) Image Component Analysis, (2) Uniqueness Rescaling Factor Analysis, and (3) Alpha Factor Analysis. The results of these analyses are presented in tables. It is concluded from…

  12. Use of a threshold animal model to estimate calving ease and stillbirth (co)variance components for US Holsteins

    USDA-ARS?s Scientific Manuscript database

    (Co)variance components for calving ease and stillbirth in US Holsteins were estimated using a single-trait threshold animal model and two different sets of data edits. Six sets of approximately 250,000 records each were created by randomly selecting herd codes without replacement from the data used...

  13. A method to identify aperiodic disturbances in the ionosphere

    NASA Astrophysics Data System (ADS)

    Wang, J.-S.; Chen, Z.; Huang, C.-M.

    2014-05-01

    In this paper, variations in the ionospheric F2 layer's critical frequency are decomposed into their periodic and aperiodic components. The latter include both disturbances caused by geophysical impacts on the ionosphere and random noise. The spectral whitening method (SWM), a signal-processing technique used in statistical estimation and/or detection, was used to identify aperiodic components in the ionosphere. The whitening algorithm adopted herein divides the Fourier transform of the observed data series by a real envelope function. As a result, periodic components are suppressed and aperiodic components emerge as the dominant contributors. The SWM was validated for identification of aperiodic components by applying it to a synthetic data set that reproduced the dominant simulated periodic features of ionospheric observations and contained artificial (and hence controllable) disturbances. Although the random noise was somewhat enhanced by post-processing, the artificial disturbances could still be clearly identified. The SWM was then applied to real ionospheric observations. It was found to be more sensitive than the often-used monthly median method at identifying geomagnetic effects. In addition, disturbances detected by the SWM were characterized by a Gaussian-type probability density function over all timescales, which further simplifies statistical analysis and suggests that the disturbances thus identified can be compared regardless of timescale.
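
    A minimal sketch of the whitening step, assuming a simple moving-average envelope of the amplitude spectrum (the envelope used in the paper may differ). The test signal, its periodicity, and the disturbance are illustrative.

```python
import numpy as np

def spectral_whitening(x, smooth_bins=25):
    """Divide the Fourier transform of x by a smooth real envelope of its
    amplitude spectrum, suppressing periodic peaks so that aperiodic
    components dominate (a minimal stand-in for the SWM described above)."""
    X = np.fft.rfft(x)
    envelope = np.convolve(np.abs(X), np.ones(smooth_bins) / smooth_bins, mode="same")
    envelope[envelope == 0] = 1.0          # guard against division by zero
    return np.fft.irfft(X / envelope, n=len(x))

# Example: a diurnal-like periodicity plus one localized disturbance.
t = np.arange(2048, dtype=float)
series = 10 * np.sin(2 * np.pi * t / 24)
series[1000:1010] += 8.0                   # aperiodic disturbance
whitened = spectral_whitening(series)
print(np.argmax(np.abs(whitened)))         # the disturbance region should stand out
```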

  14. Online neural monitoring of statistical learning

    PubMed Central

    Batterink, Laura J.; Paller, Ken A.

    2017-01-01

    The extraction of patterns in the environment plays a critical role in many types of human learning, from motor skills to language acquisition. This process is known as statistical learning. Here we propose that statistical learning has two dissociable components: (1) perceptual binding of individual stimulus units into integrated composites and (2) storing those integrated representations for later use. Statistical learning is typically assessed using post-learning tasks, such that the two components are conflated. Our goal was to characterize the online perceptual component of statistical learning. Participants were exposed to a structured stream of repeating trisyllabic nonsense words and a random syllable stream. Online learning was indexed by an EEG-based measure that quantified neural entrainment at the frequency of the repeating words relative to that of individual syllables. Statistical learning was subsequently assessed using conventional measures in an explicit rating task and a reaction-time task. In the structured stream, neural entrainment to trisyllabic words was higher than in the random stream, increased as a function of exposure to track the progression of learning, and predicted performance on the RT task. These results demonstrate that monitoring this critical component of learning via rhythmic EEG entrainment reveals a gradual acquisition of knowledge whereby novel stimulus sequences are transformed into familiar composites. This online perceptual transformation is a critical component of learning. PMID:28324696

  15. Percolation and epidemics in random clustered networks

    NASA Astrophysics Data System (ADS)

    Miller, Joel C.

    2009-08-01

    The social networks that infectious diseases spread along are typically clustered. Because of the close relation between percolation and epidemic spread, the behavior of percolation in such networks gives insight into infectious disease dynamics. A number of authors have studied percolation or epidemics in clustered networks, but the networks often contain preferential contacts in high degree nodes. We introduce a class of random clustered networks and a class of random unclustered networks with the same preferential mixing. Percolation in the clustered networks reduces the component sizes and increases the epidemic threshold compared to the unclustered networks.

  16. Beyond SaGMRotI: Conversion to SaArb, SaSN, and SaMaxRot

    USGS Publications Warehouse

    Watson-Lamprey, J. A.; Boore, D.M.

    2007-01-01

    In the seismic design of structures, estimates of design forces are usually provided to the engineer in the form of elastic response spectra. Predictive equations for elastic response spectra are derived from empirical recordings of ground motion. The geometric mean of the two orthogonal horizontal components of motion is often used as the response value in these predictive equations, although it is not necessarily the most relevant estimate of forces within the structure. For some applications it is desirable to estimate the response value on a randomly chosen single component of ground motion, and in other applications the maximum response in a single direction is required. We give adjustment factors that allow converting the predictions of geometric-mean ground-motion predictions into either of these other two measures of seismic ground-motion intensity. In addition, we investigate the relation of the strike-normal component of ground motion to the maximum response values. We show that the strike-normal component of ground motion seldom corresponds to the maximum horizontal-component response value (in particular, at distances greater than about 3 km from faults), and that focusing on this case in exclusion of others can result in the underestimation of the maximum component. This research provides estimates of the maximum response value of a single component for all cases, not just near-fault strike-normal components. We provide modification factors that can be used to convert predictions of ground motions in terms of the geometric mean to the maximum spectral acceleration (SaMaxRot) and the random component of spectral acceleration (SaArb). Included are modification factors for both the mean and the aleatory standard deviation of the logarithm of the motions.

  17. Random matrix approach to cross correlations in financial data

    NASA Astrophysics Data System (ADS)

    Plerou, Vasiliki; Gopikrishnan, Parameswaran; Rosenow, Bernd; Amaral, Luís A.; Guhr, Thomas; Stanley, H. Eugene

    2002-06-01

    We analyze cross correlations between price fluctuations of different stocks using methods of random matrix theory (RMT). Using two large databases, we calculate cross-correlation matrices C of returns constructed from (i) 30-min returns of 1000 US stocks for the 2-yr period 1994-1995, (ii) 30-min returns of 881 US stocks for the 2-yr period 1996-1997, and (iii) 1-day returns of 422 US stocks for the 35-yr period 1962-1996. We test the statistics of the eigenvalues λi of C against a ``null hypothesis'' - a random correlation matrix constructed from mutually uncorrelated time series. We find that a majority of the eigenvalues of C fall within the RMT bounds [λ-,λ+] for the eigenvalues of random correlation matrices. We test the eigenvalues of C within the RMT bound for universal properties of random matrices and find good agreement with the results for the Gaussian orthogonal ensemble of random matrices-implying a large degree of randomness in the measured cross-correlation coefficients. Further, we find that the distribution of eigenvector components for the eigenvectors corresponding to the eigenvalues outside the RMT bound display systematic deviations from the RMT prediction. In addition, we find that these ``deviating eigenvectors'' are stable in time. We analyze the components of the deviating eigenvectors and find that the largest eigenvalue corresponds to an influence common to all stocks. Our analysis of the remaining deviating eigenvectors shows distinct groups, whose identities correspond to conventionally identified business sectors. Finally, we discuss applications to the construction of portfolios of stocks that have a stable ratio of risk to return.
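
    The null-hypothesis test above can be sketched directly: build a correlation matrix from uncorrelated returns and compare its eigenvalues with the RMT (Marchenko-Pastur) bounds for the given ratio Q = T/N. The matrix sizes here are arbitrary stand-ins for the stock databases.

```python
import numpy as np

rng = np.random.default_rng(6)
N, T = 200, 1000                        # number of stocks, number of return observations

returns = rng.standard_normal((T, N))   # null case: mutually uncorrelated returns
C = np.corrcoef(returns, rowvar=False)
eigvals = np.linalg.eigvalsh(C)

# RMT bounds for the eigenvalues of a random correlation matrix with Q = T/N.
Q = T / N
lam_minus = (1 - np.sqrt(1 / Q)) ** 2
lam_plus = (1 + np.sqrt(1 / Q)) ** 2

outside = int(np.sum((eigvals < lam_minus) | (eigvals > lam_plus)))
print(f"RMT bounds: [{lam_minus:.3f}, {lam_plus:.3f}], eigenvalues outside: {outside}")
# With real market data, the few eigenvalues above lam_plus (and their eigenvectors)
# carry the market mode and sector structure discussed in the abstract.
```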

  18. HealthWorks: results of a multi-component group-randomized worksite environmental intervention trial for weight gain prevention.

    PubMed

    Linde, Jennifer A; Nygaard, Katherine E; MacLehose, Richard F; Mitchell, Nathan R; Harnack, Lisa J; Cousins, Julie M; Graham, Daniel J; Jeffery, Robert W

    2012-02-16

    U.S. adults are at unprecedented risk of becoming overweight or obese, and most scientists believe the primary cause is an obesogenic environment. Worksites provide an opportunity to shape the environments of adults to reduce obesity risk. The goal of this group-randomized trial was to implement a four-component environmental intervention at the worksite level to positively influence weight gain among employees over a two-year period. Environmental components focused on food availability and price, physical activity promotion, scale access, and media enhancements. Six worksites in a U.S. metropolitan area were recruited and randomized in pairs at the worksite level to either a two-year intervention or a no-contact control. Evaluations at baseline and two years included: 1) measured height and weight; 2) online surveys of individual dietary intake and physical activity behaviors; and 3) detailed worksite environment assessment. Mean participant age was 42.9 years (range 18-75), 62.6% were women, 68.5% were married or cohabiting, 88.6% were white, 2.1% Hispanic. Mean baseline BMI was 28.5 kg/m² (range 16.9-61.2 kg/m²). A majority of intervention components were successfully implemented. However, there were no differences between sites in the key outcome of weight change over the two-year study period (p = .36). Body mass was not significantly affected by environmental changes implemented for the trial. Results raise questions about whether environmental change at worksites is sufficient for population weight gain prevention. ClinicalTrials.gov: NCT00708461.

  19. Simulation of Crack Propagation in Engine Rotating Components under Variable Amplitude Loading

    NASA Technical Reports Server (NTRS)

    Bonacuse, P. J.; Ghosn, L. J.; Telesman, J.; Calomino, A. M.; Kantzos, P.

    1998-01-01

    The crack propagation life of tested specimens has been repeatedly shown to strongly depend on the loading history. Overloads and extended stress holds at temperature can either retard or accelerate the crack growth rate. Therefore, to accurately predict the crack propagation life of an actual component, it is essential to approximate the true loading history. In military rotorcraft engine applications, the loading profile (stress amplitudes, temperature, and number of excursions) can vary significantly depending on the type of mission flown. To accurately assess the durability of a fleet of engines, the crack propagation life distribution of a specific component should account for the variability in the missions performed (proportion of missions flown and sequence). In this report, analytical and experimental studies are described that calibrate/validate the crack propagation prediction capability for a disk alloy under variable amplitude loading. A crack closure based model was adopted to analytically predict the load interaction effects. Furthermore, a methodology has been developed to realistically simulate the actual mission mix loading on a fleet of engines over their lifetime. A sequence of missions is randomly selected and the number of repeats of each mission in the sequence is determined assuming a Poisson distributed random variable with a given mean occurrence rate. Multiple realizations of random mission histories are generated in this manner and are used to produce stress, temperature, and time points for fracture mechanics calculations. The result is a cumulative distribution of crack propagation lives for a given, life limiting, component location. This information can be used to determine a safe retirement life or inspection interval for the given location.

  20. Inhomogeneous fluid of penetrable-spheres: Application of the random phase approximation

    NASA Astrophysics Data System (ADS)

    Xiang, Yan; Frydel, Derek

    2017-05-01

    The focus of the present work is the application of the random phase approximation (RPA), derived for inhomogeneous fluids [Frydel and Ma, Phys. Rev. E 93, 062112 (2016)], to penetrable-spheres. As penetrable-spheres transform into hard-spheres with increasing interactions, they provide an interesting case for exploring the RPA, its shortcomings, and limitations, the weak- versus the strong-coupling limit. Two scenarios taken up by the present study are a one-component and a two-component fluid with symmetric interactions. In the latter case, the mean-field contributions cancel out and any contributions from particle interactions are accounted for by correlations. The accuracy of the RPA for this case is the result of a somewhat lucky cancellation of errors.

  1. Is Identification with School the Key Component in the "Black Box" of Education Outcomes? Evidence from a Randomized Experiment

    ERIC Educational Resources Information Center

    Fletcher, Jason M.

    2009-01-01

    In this paper, we follow up the important class size reduction randomized experiment in Tennessee in the mid 1980s (Project STAR) to attempt to further understand the long-lasting influences of early education interventions. While STAR led to large test score benefits during the intervention, these benefits quickly faded at its conclusion.…

  2. A Randomized Controlled Trial of the Social Tools and Rules for Teens (START) Program: An Immersive Socialization Intervention for Adolescents with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Vernon, Ty W.; Miller, Amber R.; Ko, Jordan A.; Barrett, Amy C.; McGarry, Elizabeth S.

    2018-01-01

    Adolescents with ASD face numerous personal and contextual barriers that impede the development of social motivation and core competencies, warranting the need for targeted intervention. A randomized controlled trial was conducted with 40 adolescents to evaluate the merits of a multi-component socialization intervention that places emphasis on…

  3. Modeling for Ultrasonic Health Monitoring of Foams with Embedded Sensors

    NASA Technical Reports Server (NTRS)

    Wang, L.; Rokhlin, S. I.; Rokhlin, Stanislav, I.

    2005-01-01

    In this report analytical and numerical methods are proposed to estimate the effective elastic properties of regular and random open-cell foams. The methods are based on the principle of minimum energy and on structural beam models. The analytical solutions are obtained using symbolic processing software. The microstructure of the random foam is simulated using Voronoi tessellation together with a rate-dependent random close-packing algorithm. The statistics of the geometrical properties of random foams corresponding to different packing fractions have been studied. The effects of the packing fraction on elastic properties of the foams have been investigated by decomposing the compliance into bending and axial compliance components. It is shown that the bending compliance increases and the axial compliance decreases when the packing fraction increases. Keywords: Foam; Elastic properties; Finite element; Randomness

  4. Bridges in complex networks

    NASA Astrophysics Data System (ADS)

    Wu, Ang-Kun; Tian, Liang; Liu, Yang-Yu

    2018-01-01

    A bridge in a graph is an edge whose removal disconnects the graph and increases the number of connected components. We calculate the fraction of bridges in a wide range of real-world networks and their randomized counterparts. We find that real networks typically have more bridges than their completely randomized counterparts, but they have a fraction of bridges that is very similar to their degree-preserving randomizations. We define an edge centrality measure, called bridgeness, to quantify the importance of a bridge in damaging a network. We find that certain real networks have a very large average and variance of bridgeness compared to their degree-preserving randomizations and other real networks. Finally, we offer an analytical framework to calculate the bridge fraction and the average and variance of bridgeness for uncorrelated random networks with arbitrary degree distributions.
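
    The bridge fraction and the degree-preserving comparison described above can be computed with networkx; the synthetic graphs below are stand-ins for real network data, so the printed numbers only illustrate the computation, not the paper's empirical finding.

```python
import networkx as nx

def bridge_fraction(G):
    """Fraction of edges whose removal would increase the number of components."""
    return sum(1 for _ in nx.bridges(G)) / G.number_of_edges()

# Stand-in for a real network; replace with a graph read from data.
G = nx.connected_watts_strogatz_graph(1000, 4, 0.1, seed=7)

# Degree-preserving randomization via repeated double edge swaps.
G_swap = G.copy()
nx.double_edge_swap(G_swap, nswap=10 * G.number_of_edges(), max_tries=10**6, seed=7)

# Complete randomization: Erdos-Renyi graph with the same numbers of nodes and edges.
G_er = nx.gnm_random_graph(G.number_of_nodes(), G.number_of_edges(), seed=7)

for name, graph in [("original", G), ("degree-preserving", G_swap), ("fully random", G_er)]:
    print(f"{name:18s} bridge fraction = {bridge_fraction(graph):.3f}")
```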

  5. The Cognitive Social Network in Dreams: Transitivity, Assortativity, and Giant Component Proportion Are Monotonic.

    PubMed

    Han, Hye Joo; Schweickert, Richard; Xi, Zhuangzhuang; Viau-Quesnel, Charles

    2016-04-01

    For five individuals, a social network was constructed from a series of his or her dreams. Three important network measures were calculated for each network: transitivity, assortativity, and giant component proportion. These were monotonically related; over the five networks as transitivity increased, assortativity increased and giant component proportion decreased. The relations indicate that characters appear in dreams systematically. Systematicity likely arises from the dreamer's memory of people and their relations, which is from the dreamer's cognitive social network. But the dream social network is not a copy of the cognitive social network. Waking life social networks tend to have positive assortativity; that is, people tend to be connected to others with similar connectivity. Instead, in our sample of dream social networks assortativity is more often negative or near 0, as in online social networks. We show that if characters appear via a random walk, negative assortativity can result, particularly if the random walk is biased as suggested by remote associations. Copyright © 2015 Cognitive Science Society, Inc.

  6. Probabilistic structural analysis methods for improving Space Shuttle engine reliability

    NASA Technical Reports Server (NTRS)

    Boyce, L.

    1989-01-01

    Probabilistic structural analysis methods are particularly useful in the design and analysis of critical structural components and systems that operate in very severe and uncertain environments. These methods have recently found application in space propulsion systems to improve the structural reliability of Space Shuttle Main Engine (SSME) components. A computer program, NESSUS, based on a deterministic finite-element program and a method of probabilistic analysis (fast probability integration) provides probabilistic structural analysis for selected SSME components. While computationally efficient, it considers both correlated and nonnormal random variables as well as an implicit functional relationship between independent and dependent variables. The program is used to determine the response of a nickel-based superalloy SSME turbopump blade. Results include blade tip displacement statistics due to the variability in blade thickness, modulus of elasticity, Poisson's ratio or density. Modulus of elasticity significantly contributed to blade tip variability while Poisson's ratio did not. Thus, a rational method for choosing parameters to be modeled as random is provided.

  7. Comprehensive analysis of low-frequency noise variability components in bulk and fully depleted silicon-on-insulator metal–oxide–semiconductor field-effect transistor

    NASA Astrophysics Data System (ADS)

    Maekawa, Keiichi; Makiyama, Hideki; Yamamoto, Yoshiki; Hasegawa, Takumi; Okanishi, Shinobu; Sonoda, Kenichiro; Shinkawata, Hiroki; Yamashita, Tomohiro; Kamohara, Shiro; Yamaguchi, Yasuo

    2018-04-01

    The low-frequency noise (LFN) variability in bulk and fully depleted silicon-on-insulator (FDSOI) metal–oxide–semiconductor field-effect transistors (MOSFETs) fabricated with silicon-on-thin-buried-oxide (SOTB) technology was investigated. LFN typically shows a flicker-noise component and a Lorentzian component caused by random telegraph noise (RTN). In weak inversion, random dopant fluctuation (RDF) in the channel strongly affects not only the RTN variability but also the flicker-noise variability in the bulk MOSFET, compared with the SOTB MOSFET, because of local carrier-number fluctuation in the channel. On the other hand, the typical LFN level in the SOTB MOSFET is slightly larger than that in the bulk MOSFET because of the additional interface on the buried oxide layer. However, considering the tail of the LFN variability distribution, the LFN in the SOTB MOSFET can be expected to be smaller than that in the bulk MOSFET, which enables low-voltage operation of analog circuits.

  8. [A magnetic therapy apparatus with an adaptable electromagnetic spectrum for the treatment of prostatitis and gynecopathies].

    PubMed

    Kuz'min, A A; Meshkovskiĭ, D V; Filist, S A

    2008-01-01

    Engineering and algorithm-development problems for magnetic therapy apparatuses with a pseudo-random radiation spectrum in the audio range, intended for the treatment of prostatitis and gynecopathies, are considered. A typical design based on a PIC 16F microcontroller is suggested; it includes a keyboard, an LCD indicator, an audio amplifier, an inductor, and software units. The problem of pseudo-random signal generation within the audio range is considered. A series of rectangular pulses is generated over a random-length interval on the basis of a three-component random vector. This series provides the required spectral characteristics of the therapeutic magnetic field and allows them to be adapted to the therapeutic conditions and the individual features of the patient.
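
    A loose, hypothetical illustration of the pulse-generation idea sketched above. The abstract does not specify what the three vector components are, so the mapping used here (pulse width, pause width, interval length) and all numerical ranges are assumptions for illustration only.

```python
# Illustrative only: build a pseudo-random rectangular pulse train from a
# three-component random vector (assumed meaning: pulse width, pause width,
# and interval length, all in milliseconds).
import random

random.seed(0)

def pseudo_random_pulse_train(n_intervals=5):
    """Return a list of (level, duration_ms) pairs forming the drive signal."""
    train = []
    for _ in range(n_intervals):
        width = random.uniform(0.1, 5.0)       # pulse width, ms (audio-range period)
        pause = random.uniform(0.1, 5.0)       # pause between pulses, ms
        interval = random.uniform(20.0, 200.0) # random-length interval, ms
        elapsed = 0.0
        while elapsed + width + pause <= interval:
            train += [(1, width), (0, pause)]
            elapsed += width + pause
        train.append((0, interval - elapsed))  # pad to the end of the interval
    return train

print(pseudo_random_pulse_train()[:6])
```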

  9. Simulating Pre-Asymptotic, Non-Fickian Transport Although Doing Simple Random Walks - Supported By Empirical Pore-Scale Velocity Distributions and Memory Effects

    NASA Astrophysics Data System (ADS)

    Most, S.; Jia, N.; Bijeljic, B.; Nowak, W.

    2016-12-01

    Pre-asymptotic characteristics are almost ubiquitous when analyzing solute transport processes in porous media. These pre-asymptotic aspects are caused by spatial coherence in the velocity field and by its heterogeneity. From the Lagrangian perspective of particle displacements, the causes of pre-asymptotic, non-Fickian transport are a skewed velocity distribution, statistical dependence between subsequent increments of particle positions (memory), and dependence among the x-, y- and z-components of the particle increments. Valid simulation frameworks should account for these factors. We propose a particle tracking random walk (PTRW) simulation technique that can use empirical pore-space velocity distributions as input, enforces memory between subsequent random walk steps, and considers cross-dependence. Thus, it is able to simulate pre-asymptotic non-Fickian transport phenomena. Our PTRW framework contains an advection/dispersion term plus a diffusion term. The advection/dispersion term produces time series of particle increments from the velocity CDFs. These time series are equipped with memory by enforcing that the CDF values of subsequent velocities change only slightly; this is achieved through a random walk on the axis of CDF values between 0 and 1. The virtual diffusion coefficient for that random walk is our only fitting parameter. Cross-dependence can be enforced by constraining the random walk to certain combinations of CDF values among the three velocity components in x, y and z. We show that this modelling framework is capable of simulating non-Fickian transport by comparison with a pore-scale transport simulation, and we analyze the approach to asymptotic behavior.
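
    A minimal sketch of the memory mechanism described above, with assumed parameter values rather than the authors' code: each particle carries a CDF value that performs a small reflected random walk, so successive velocities drawn from a (here synthetic) velocity distribution stay correlated, and a diffusive term is added on top.

```python
# Sketch of a one-dimensional PTRW with memory in CDF space (all parameters
# are illustrative assumptions).
import numpy as np

rng = np.random.default_rng(0)

# "Empirical" pore-scale velocity sample (a skewed lognormal stand-in)
v_sample = np.sort(rng.lognormal(mean=-1.0, sigma=1.2, size=50_000))

def velocity_from_cdf(u):
    """Inverse empirical CDF: map u in (0, 1) to a velocity quantile."""
    idx = np.clip((u * len(v_sample)).astype(int), 0, len(v_sample) - 1)
    return v_sample[idx]

n_particles, n_steps, dt = 5_000, 2_000, 1.0
sigma_u = 0.02      # fitting parameter: step size of the walk in CDF space
D = 1e-4            # molecular-diffusion-like coefficient

u = rng.uniform(0, 1, n_particles)          # initial CDF values
x = np.zeros(n_particles)                   # particle positions
for _ in range(n_steps):
    # memory: the CDF value changes only slightly, reflected back into (0, 1)
    u = np.abs(u + rng.normal(0, sigma_u, n_particles))
    u = np.where(u > 1, 2 - u, u)
    x += velocity_from_cdf(u) * dt + rng.normal(0, np.sqrt(2 * D * dt), n_particles)

print("mean displacement:", x.mean(), " variance:", x.var())
```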

  10. From Research to Practice: The Effect of Multi-Component Vocabulary Instruction on Increasing Vocabulary and Comprehension Performance in Social Studies

    ERIC Educational Resources Information Center

    Graham, Lori; Graham, Anna; West, Courtney

    2015-01-01

    This study was designed to demonstrate the effect of implementing multi-component vocabulary strategy instruction in fourth grade social studies. Curriculum was designed for a six-week period and was intended to actively engage students and reinforce retention of word meanings in isolation and in context. Teachers were randomly chosen for…

  11. Multiple-Component Remediation for Developmental Reading Disabilities: IQ, Socioeconomic Status, and Race as Factors in Remedial Outcome

    ERIC Educational Resources Information Center

    Morris, Robin D.; Lovett, Maureen W.; Wolf, Maryanne; Sevcik, Rose A.; Steinbach, Karen A.; Frijters, Jan C.; Shapiro, Marla B.

    2012-01-01

    Results from a controlled evaluation of remedial reading interventions are reported: 279 young disabled readers were randomly assigned to a program according to a 2 x 2 x 2 factorial design (IQ, socioeconomic status [SES], and race). The effectiveness of two multiple-component intervention programs for children with reading disabilities (PHAB +…

  12. Assessment of the Implementation of the Reading Component of the English Language Curriculum for Basic Education in Nigeria

    ERIC Educational Resources Information Center

    Yusuf, Hanna Onyi

    2014-01-01

    This study assessed the implementation of the reading component of the Junior Secondary School English Language Curriculum for Basic Education in Nigeria. Ten (10) randomly selected public and private secondary schools from Kaduna metropolis in Kaduna State of Nigeria were used for the study. Among the factors assessed in relation to the…

  13. Enhancing the Mental Health Promotion Component of a Health and Personal Development Programme in Irish Schools

    ERIC Educational Resources Information Center

    Fitzpatrick, Carol; Conlon, Andrea; Cleary, Deirdre; Power, Mike; King, Frances; Guerin, Suzanne

    2013-01-01

    This study set out to examine the impact of a health and personal development programme (the Social, Personal and Health Education Programme) which had been "enhanced" by the addition of a mental health promotion component. Students aged 12-16 years attending 17 secondary schools were randomly allocated as clusters to participate in…

  14. Sensitivity of directed networks to the addition and pruning of edges and vertices

    NASA Astrophysics Data System (ADS)

    Goltsev, A. V.; Timár, G.; Mendes, J. F. F.

    2017-08-01

    Directed networks have various topologically different extensive components, in contrast to a single giant component in undirected networks. We study the sensitivity (response) of the sizes of these extensive components in directed complex networks to the addition and pruning of edges and vertices. We introduce the susceptibility, which quantifies this sensitivity. We show that topologically different parts of a directed network have different sensitivity to the addition and pruning of edges and vertices and, therefore, they are characterized by different susceptibilities. These susceptibilities diverge at the critical point of the directed percolation transition, signaling the appearance (or disappearance) of the giant strongly connected component in the infinite size limit. We demonstrate this behavior in randomly damaged real and synthetic directed complex networks, such as the World Wide Web, Twitter, the Caenorhabditis elegans neural network, directed Erdős-Rényi graphs, and others. We reveal a nonmonotonic dependence of the sensitivity to random pruning of edges or vertices in the case of C. elegans and Twitter that manifests specific structural peculiarities of these networks. We propose the measurements of the susceptibilities during the addition or pruning of edges and vertices as a new method for studying structural peculiarities of directed networks.
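
    An illustrative sketch, not the paper's analytical susceptibility: it tracks how the giant strongly connected component (GSCC) of a directed Erdős–Rényi graph shrinks as edges are pruned at random, using the change in GSCC size per pruned edge as a crude numerical sensitivity.

```python
# Random edge pruning of a directed Erdos-Renyi graph; parameters are assumed.
import random
import networkx as nx

random.seed(0)
n, k_mean = 5000, 3.0
G = nx.gnp_random_graph(n, k_mean / n, seed=0, directed=True)

def gscc_size(H):
    return max((len(c) for c in nx.strongly_connected_components(H)), default=0)

edges = list(G.edges())
random.shuffle(edges)
step = len(edges) // 20
prev = gscc_size(G)
for i in range(0, len(edges), step):
    G.remove_edges_from(edges[i:i + step])
    cur = gscc_size(G)
    frac_removed = min(1.0, (i + step) / len(edges))
    sensitivity = (prev - cur) / step          # nodes lost from the GSCC per pruned edge
    print(f"removed {frac_removed:4.0%}  GSCC={cur:5d}  dGSCC/edge={sensitivity:.3f}")
    prev = cur
```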

  15. An Overview of Randomization and Minimization Programs for Randomized Clinical Trials

    PubMed Central

    Saghaei, Mahmoud

    2011-01-01

    Randomization is an essential component of sound clinical trials; it prevents selection biases and helps in blinding the allocations. Randomization is a process by which subsequent subjects are enrolled into trial groups only by chance, which essentially eliminates selection biases. A possible consequence of randomization, however, is severe imbalance among the treatment groups with respect to some prognostic factors, which can invalidate the trial results or necessitate complex and usually unreliable secondary analyses to remove the source of imbalance. Minimization, on the other hand, tends to allocate in such a way as to minimize the differences among groups with respect to prognostic factors. Pure minimization is therefore completely deterministic; that is, one can predict the allocation of the next subject by knowing the factor levels of previously enrolled subjects and the characteristics of the next subject. To eliminate this predictability, it is necessary to include some element of randomness in the minimization algorithm. In this article, brief descriptions of randomization and minimization are presented, followed by an introduction to selected randomization and minimization programs. PMID:22606659
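
    A schematic sketch of the two allocation schemes contrasted above: simple randomization versus a simplified, Pocock–Simon-style minimization rule with a random element. The factors, group labels, and the probability of following the minimizing choice are assumptions for illustration, not any specific program's algorithm.

```python
# Simple randomization vs. minimization with a random element (illustrative).
import random

random.seed(42)
GROUPS = ("A", "B")
FACTORS = ("sex", "age_band")            # prognostic factors to balance
counts = {g: {f: {} for f in FACTORS} for g in GROUPS}

def simple_randomization(_subject):
    return random.choice(GROUPS)

def minimization(subject, p_follow=0.8):
    """Assign to the group with the smaller marginal imbalance, but only with
    probability p_follow, so the next allocation stays unpredictable."""
    imbalance = {g: sum(counts[g][f].get(subject[f], 0) for f in FACTORS)
                 for g in GROUPS}
    best = min(GROUPS, key=lambda g: imbalance[g])
    chosen = best if random.random() < p_follow else random.choice(GROUPS)
    for f in FACTORS:
        counts[chosen][f][subject[f]] = counts[chosen][f].get(subject[f], 0) + 1
    return chosen

subjects = [{"sex": random.choice("MF"), "age_band": random.choice(["<65", ">=65"])}
            for _ in range(20)]
print("simple    :", [simple_randomization(s) for s in subjects])
print("minimized :", [minimization(s) for s in subjects])
```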

  16. Cemented tibial component fixation performs better than cementless fixation: a randomized radiostereometric study comparing porous-coated, hydroxyapatite-coated and cemented tibial components over 5 years.

    PubMed

    Carlsson, Ake; Björkman, Anders; Besjakov, Jack; Onsten, Ingemar

    2005-06-01

    The question of whether the tibial component of a total knee arthroplasty should be fixed to bone with or without bone cement has not yet been definitely answered. We studied movements between the tibial component and bone by radiostereometry (RSA) in total knee replacement (TKR) for 3 different types of fixation: cemented fixation (C-F), uncemented porous fixation (UC-F) and uncemented porous hydroxyapatite fixation (UCHA-F). 116 patients with osteoarthrosis, who had 146 TKRs, were included in 2 randomized series. The first series included 86 unilateral TKRs stratified into 1 of the 3 types of fixation. The second series included 30 patients who had simultaneous bilateral TKR surgery, and who were stratified into 3 subgroups of pairwise comparisons of the 3 types of fixation. After 5 years, 2 knees had been revised, neither of which was due to loosening. 1 UCHA-F knee in the unilateral series showed a large and continuous migration and a poor clinical result, and is a pending failure. The C-F knees rotated and migrated less than UC-F and UCHA-F knees over 5 years. UCHA-F migrated less than UC-F after 1 year. Cementing of the tibial component offers more stable bone-implant contact for 5 years compared to uncemented fixation. When using uncemented components, however, there is evidence that augmenting a porous surface with hydroxyapatite may mean less motion between implant and bone after the initial postoperative year.

  17. Generalized essential energy space random walks to more effectively accelerate solute sampling in aqueous environment

    NASA Astrophysics Data System (ADS)

    Lv, Chao; Zheng, Lianqing; Yang, Wei

    2012-01-01

    Molecular dynamics sampling can be enhanced by promoting potential-energy fluctuations, for instance with a Hamiltonian modified by the addition of a potential-energy-dependent biasing term. To overcome the diffusion sampling issue, namely that enlarging event-irrelevant energy fluctuations can destroy sampling efficiency, the essential energy space random walk (EESRW) approach was proposed earlier. To accelerate the sampling of solute conformations in aqueous environments more effectively, in the current work we generalize the EESRW method to a two-dimensional EESRW (2D-EESRW) strategy. Specifically, the essential internal energy component of a focused region and the essential interaction energy component between the focused region and its environment are employed to define the two-dimensional essential energy space. This proposal is motivated by the general observation that, in different conformational events, the two essential energy components have distinct interplays. Model studies on the alanine dipeptide and the aspartate-arginine peptide demonstrate sampling improvement over the original one-dimensional EESRW strategy; at the same biasing level, the present generalization accelerates the sampling of conformational transitions in aqueous solution more effectively. The 2D-EESRW generalization is readily extended to higher-dimensional schemes and can be employed in more advanced enhanced-sampling schemes, such as the recent orthogonal space random walk method.

  18. Methods to assess an exercise intervention trial based on 3-level functional data.

    PubMed

    Li, Haocheng; Kozey Keadle, Sarah; Staudenmayer, John; Assaad, Houssein; Huang, Jianhua Z; Carroll, Raymond J

    2015-10-01

    Motivated by data recording the effects of an exercise intervention on subjects' physical activity over time, we develop a model to assess the effects of a treatment when the data are functional with 3 levels (subjects, weeks and days in our application) and possibly incomplete. We develop a model with 3-level mean structure effects, all stratified by treatment and subject random effects, including a general subject effect and nested effects for the 3 levels. The mean and random structures are specified as smooth curves measured at various time points. The association structure of the 3-level data is induced through the random curves, which are summarized using a few important principal components. We use penalized splines to model the mean curves and the principal component curves, and cast the proposed model into a mixed effects model framework for model fitting, prediction and inference. We develop an algorithm to fit the model iteratively with the Expectation/Conditional Maximization Either (ECME) version of the EM algorithm and eigenvalue decompositions. Selection of the number of principal components and handling incomplete data issues are incorporated into the algorithm. The performance of the Wald-type hypothesis test is also discussed. The method is applied to the physical activity data and evaluated empirically by a simulation study. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  19. Large-scale randomized clinical trials of bioactives and nutrients in relation to human health and disease prevention - Lessons from the VITAL and COSMOS trials.

    PubMed

    Rautiainen, Susanne; Sesso, Howard D; Manson, JoAnn E

    2017-12-29

    Several bioactive compounds and nutrients in foods have physiological properties that are beneficial for human health. While nutrients typically have clear definitions with established levels of recommended intakes, bioactive compounds often lack such a definition. Although a food-based approach is often the optimal approach to ensure adequate intake of bioactives and nutrients, these components are also often produced as dietary supplements. However, many of these supplements are not sufficiently studied and have an unclear role in chronic disease prevention. Randomized trials are considered the gold standard of study designs, but have not been fully applied to understand the effects of bioactives and nutrients. We review the specific role of large-scale trials to test whether bioactives and nutrients have an effect on health outcomes through several crucial components of trial design, including selection of intervention, recruitment, compliance, outcome selection, and interpretation and generalizability of study findings. We will discuss these components in the context of two randomized clinical trials, the VITamin D and OmegA-3 TriaL (VITAL) and the COcoa Supplement and Multivitamin Outcomes Study (COSMOS). We will mainly focus on dietary supplements of bioactives and nutrients while also emphasizing the need for translation and integration with food-based trials that are of vital importance within nutritional research. Copyright © 2017. Published by Elsevier Ltd.

  20. Cluster pattern analysis of energy deposition sites for the brachytherapy sources 103Pd, 125I, 192Ir, 137Cs, and 60Co.

    PubMed

    Villegas, Fernanda; Tilly, Nina; Bäckström, Gloria; Ahnesjö, Anders

    2014-09-21

    Analysing the pattern of energy depositions may help elucidate differences in the severity of radiation-induced DNA strand breakage for different radiation qualities. It is often claimed that energy deposition (ED) sites from photon radiation form a uniform random pattern, but there is indication of differences in RBE values among different photon sources used in brachytherapy. The aim of this work is to analyse the spatial patterns of EDs from 103Pd, 125I, 192Ir, 137Cs sources commonly used in brachytherapy and a 60Co source as a reference radiation. The results suggest that there is both a non-uniform and a uniform random component to the frequency distribution of distances to the nearest neighbour ED. The closest neighbouring EDs show high spatial correlation for all investigated radiation qualities, whilst the uniform random component dominates for neighbours with longer distances for the three higher mean photon energy sources (192Ir, 137Cs, and 60Co). The two lower energy photon emitters (103Pd and 125I) present a very small uniform random component. The ratio of frequencies of clusters with respect to 60Co differs up to 15% for the lower energy sources and less than 2% for the higher energy sources when the maximum distance between each pair of EDs is 2 nm. At distances relevant to DNA damage, cluster patterns can be differentiated between the lower and higher energy sources. This may be part of the explanation to the reported difference in RBE values with initial DSB yields as an endpoint for these brachytherapy sources.
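
    A sketch of the nearest-neighbour distance analysis on synthetic 3D point patterns (a uniform random set versus one built from tight clusters); the geometry and cluster parameters are illustrative assumptions, not simulated track-structure data.

```python
# Nearest-neighbour distance distributions: uniform random vs. clustered points.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(5)
box = 1000.0                                       # nm, side of the scoring volume

uniform = rng.uniform(0, box, size=(20_000, 3))

centres = rng.uniform(0, box, size=(2_000, 3))     # clustered pattern: tight groups
clustered = (centres[:, None, :] + rng.normal(0, 2.0, size=(2_000, 10, 3))).reshape(-1, 3)

def nn_distances(points):
    d, _ = cKDTree(points).query(points, k=2)      # k=2: the first neighbour is the point itself
    return d[:, 1]

for name, pts in (("uniform", uniform), ("clustered", clustered)):
    d = nn_distances(pts)
    print(f"{name:9s} median NN distance = {np.median(d):6.2f} nm, "
          f"fraction < 2 nm = {(d < 2.0).mean():.2f}")
```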

  1. Intercellular Variability in Protein Levels from Stochastic Expression and Noisy Cell Cycle Processes

    PubMed Central

    Soltani, Mohammad; Vargas-Garcia, Cesar A.; Antunes, Duarte; Singh, Abhyudai

    2016-01-01

    Inside individual cells, expression of genes is inherently stochastic and manifests as cell-to-cell variability or noise in protein copy numbers. Since protein half-lives can be comparable to the cell-cycle length, randomness in cell-division times generates additional intercellular variability in protein levels. Moreover, as many mRNA/protein species are expressed at low copy numbers, errors incurred in partitioning of molecules between two daughter cells are significant. We derive analytical formulas for the total noise in protein levels when the cell-cycle duration follows a general class of probability distributions. Using a novel hybrid approach, the total noise is decomposed into components arising from (i) stochastic expression, (ii) partitioning errors at the time of cell division, and (iii) random cell-division events. These formulas reveal that random cell-division times not only generate additional extrinsic noise but also critically affect the mean protein copy numbers and intrinsic noise components. Counterintuitively, in some parameter regimes, noise in protein levels can decrease as cell-division times become more stochastic. Computations are extended to consider genome duplication, where the transcription rate is increased at a random point in the cell cycle. We systematically investigate how the timing of genome duplication influences different protein noise components. Intriguingly, results show that the noise contribution from stochastic expression is minimized at an optimal genome-duplication time. Our theoretical results motivate new experimental methods for decomposing protein noise levels from synchronized and asynchronous single-cell expression data. Characterizing the contributions of individual noise mechanisms will lead to precise estimates of gene expression parameters and techniques for altering stochasticity to change the phenotype of individual cells. PMID:27536771
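
    A toy lineage simulation of the three noise sources discussed above (stochastic production, partitioning errors at division, and random cell-division times); the parameter values and the lumped Poisson production step are assumptions for illustration, not the paper's analytical model.

```python
# Follow one daughter lineage and see how the protein-level noise changes as
# the cell-cycle length becomes more variable.
import numpy as np

rng = np.random.default_rng(1)

def simulate_lineage(n_cycles=4000, k_prod=5.0, mean_T=30.0, cv_T=0.3):
    """Return protein counts sampled just after each division."""
    shape = 1.0 / cv_T**2                     # gamma-distributed cycle length
    scale = mean_T / shape
    p = 50
    samples = []
    for _ in range(n_cycles):
        T = rng.gamma(shape, scale)           # random cell-division time
        p += rng.poisson(k_prod * T)          # production lumped into a Poisson step
        p = rng.binomial(p, 0.5)              # partitioning error at division
        samples.append(p)
    return np.array(samples[200:])            # discard the transient

for cv_T in (0.05, 0.3, 0.6):                 # increasingly variable division timing
    s = simulate_lineage(cv_T=cv_T)
    print(f"CV(T)={cv_T:.2f}  mean={s.mean():7.1f}  CV^2={s.var() / s.mean()**2:.4f}")
```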

  2. Methodological Challenges of Multiple-Component Intervention: Lessons Learned from a Randomized Controlled Trial of Functional Recovery After Hip Fracture

    PubMed Central

    Peterson, Margaret G.E.; Cornell, Charles N.; MacKenzie, C. Ronald; Robbins, Laura; Horton, Roberta; Ganz, Sandy B.; Ruchlin, Hirsch S.; Russo, Pamela Williams; Paget, Stephen A.; Charlson, Mary E.

    2006-01-01

    We conducted a randomized controlled trial to assess the efficacy and safety of a multiple-component intervention designed to improve functional recovery after hip fracture. One hundred seventy-six patients who underwent surgery for a primary unilateral hip fracture were assigned randomly to receive usual care (control arm, n = 86) or a brief motivational videotape, supportive peer counseling, and high-intensity muscle-strength training (intervention arm, n = 90). Between-group differences on the physical functioning, role-physical, and social functioning domains of the SF-36 were assessed postoperatively at 6 months. At the end of the trial, 32 intervention and 27 control patients (34%) completed the 6-month outcome assessment. Although patient compliance with all three components of the intervention was uneven, over 90% of intervention patients were exposed to the motivational videotape. Intervention patients experienced a significant (P = 0.03) improvement in the role-physical domain (mean change, −11 ± 33) compared to control patients (mean change, −37 ± 41). Change in general health (P = 0.2) and mental health (P = 0.1) domain scores was also directionally consistent with the study hypothesis. Although our findings are consistent with previous reports of comprehensive rehabilitation efforts for hip fracture patients, the trial was undermined by high attrition and the possibility of self-selection bias at 6-month follow-up. We discuss the methodological challenges and lessons learned in conducting a randomized controlled trial that sought to implement and assess the impact of a complex intervention in a population that proved difficult to follow up once they had returned to the community. PMID:18751772

  3. Family, Community and Clinic Collaboration to Treat Overweight and Obese Children: Stanford GOALS -- a Randomized Controlled Trial of a Three-Year, Multi-Component, Multi-Level, Multi-Setting Intervention

    PubMed Central

    Robinson, Thomas N.; Matheson, Donna; Desai, Manisha; Wilson, Darrell M.; Weintraub, Dana L.; Haskell, William L.; McClain, Arianna; McClure, Samuel; Banda, Jorge; Sanders, Lee M.; Haydel, K. Farish; Killen, Joel D.

    2013-01-01

    Objective To test the effects of a three-year, community-based, multi-component, multi-level, multi-setting (MMM) approach for treating overweight and obese children. Design Two-arm, parallel group, randomized controlled trial with measures at baseline, 12, 24, and 36 months after randomization. Participants Seven through eleven year old, overweight and obese children (BMI ≥ 85th percentile) and their parents/caregivers recruited from community locations in low-income, primarily Latino neighborhoods in Northern California. Interventions Families are randomized to the MMM intervention versus a community health education active-placebo comparison intervention. Interventions last for three years for each participant. The MMM intervention includes a community-based after school team sports program designed specifically for overweight and obese children, a home-based family intervention to reduce screen time, alter the home food/eating environment, and promote self-regulatory skills for eating and activity behavior change, and a primary care behavioral counseling intervention linked to the community and home interventions. The active-placebo comparison intervention includes semi-annual health education home visits, monthly health education newsletters for children and for parents/guardians, and a series of community-based health education events for families. Main Outcome Measure Body mass index trajectory over the three-year study. Secondary outcome measures include waist circumference, triceps skinfold thickness, accelerometer-measured physical activity, 24-hour dietary recalls, screen time and other sedentary behaviors, blood pressure, fasting lipids, glucose, insulin, hemoglobin A1c, C-reactive protein, alanine aminotransferase, and psychosocial measures. Conclusions The Stanford GOALS trial is testing the efficacy of a novel community-based multi-component, multi-level, multi-setting treatment for childhood overweight and obesity in low-income, Latino families. PMID:24028942

  4. Family, community and clinic collaboration to treat overweight and obese children: Stanford GOALS-A randomized controlled trial of a three-year, multi-component, multi-level, multi-setting intervention.

    PubMed

    Robinson, Thomas N; Matheson, Donna; Desai, Manisha; Wilson, Darrell M; Weintraub, Dana L; Haskell, William L; McClain, Arianna; McClure, Samuel; Banda, Jorge A; Sanders, Lee M; Haydel, K Farish; Killen, Joel D

    2013-11-01

    To test the effects of a three-year, community-based, multi-component, multi-level, multi-setting (MMM) approach for treating overweight and obese children. Two-arm, parallel group, randomized controlled trial with measures at baseline, 12, 24, and 36 months after randomization. Seven through eleven year old, overweight and obese children (BMI ≥ 85th percentile) and their parents/caregivers recruited from community locations in low-income, primarily Latino neighborhoods in Northern California. Families are randomized to the MMM intervention versus a community health education active-placebo comparison intervention. Interventions last for three years for each participant. The MMM intervention includes a community-based after school team sports program designed specifically for overweight and obese children, a home-based family intervention to reduce screen time, alter the home food/eating environment, and promote self-regulatory skills for eating and activity behavior change, and a primary care behavioral counseling intervention linked to the community and home interventions. The active-placebo comparison intervention includes semi-annual health education home visits, monthly health education newsletters for children and for parents/guardians, and a series of community-based health education events for families. Body mass index trajectory over the three-year study. Secondary outcome measures include waist circumference, triceps skinfold thickness, accelerometer-measured physical activity, 24-hour dietary recalls, screen time and other sedentary behaviors, blood pressure, fasting lipids, glucose, insulin, hemoglobin A1c, C-reactive protein, alanine aminotransferase, and psychosocial measures. The Stanford GOALS trial is testing the efficacy of a novel community-based multi-component, multi-level, multi-setting treatment for childhood overweight and obesity in low-income, Latino families. © 2013 Elsevier Inc. All rights reserved.

  5. Treatment of Middle East Respiratory Syndrome with a combination of lopinavir-ritonavir and interferon-β1b (MIRACLE trial): study protocol for a randomized controlled trial.

    PubMed

    Arabi, Yaseen M; Alothman, Adel; Balkhy, Hanan H; Al-Dawood, Abdulaziz; AlJohani, Sameera; Al Harbi, Shmeylan; Kojan, Suleiman; Al Jeraisy, Majed; Deeb, Ahmad M; Assiri, Abdullah M; Al-Hameed, Fahad; AlSaedi, Asim; Mandourah, Yasser; Almekhlafi, Ghaleb A; Sherbeeni, Nisreen Murad; Elzein, Fatehi Elnour; Memon, Javed; Taha, Yusri; Almotairi, Abdullah; Maghrabi, Khalid A; Qushmaq, Ismael; Al Bshabshe, Ali; Kharaba, Ayman; Shalhoub, Sarah; Jose, Jesna; Fowler, Robert A; Hayden, Frederick G; Hussein, Mohamed A

    2018-01-30

    It had been more than 5 years since the first case of Middle East Respiratory Syndrome coronavirus infection (MERS-CoV) was recorded, but no specific treatment has been investigated in randomized clinical trials. Results from in vitro and animal studies suggest that a combination of lopinavir/ritonavir and interferon-β1b (IFN-β1b) may be effective against MERS-CoV. The aim of this study is to investigate the efficacy of treatment with a combination of lopinavir/ritonavir and recombinant IFN-β1b provided with standard supportive care, compared to treatment with placebo provided with standard supportive care in patients with laboratory-confirmed MERS requiring hospital admission. The protocol is prepared in accordance with the SPIRIT (Standard Protocol Items: Recommendations for Interventional Trials) guidelines. Hospitalized adult patients with laboratory-confirmed MERS will be enrolled in this recursive, two-stage, group sequential, multicenter, placebo-controlled, double-blind randomized controlled trial. The trial is initially designed to include 2 two-stage components. The first two-stage component is designed to adjust sample size and determine futility stopping, but not efficacy stopping. The second two-stage component is designed to determine efficacy stopping and possibly readjustment of sample size. The primary outcome is 90-day mortality. This will be the first randomized controlled trial of a potential treatment for MERS. The study is sponsored by King Abdullah International Medical Research Center, Riyadh, Saudi Arabia. Enrollment for this study began in November 2016, and has enrolled thirteen patients as of Jan 24-2018. ClinicalTrials.gov, ID: NCT02845843 . Registered on 27 July 2016.

  6. A school-based comprehensive lifestyle intervention among chinese kids against obesity (CLICK-Obesity): rationale, design and methodology of a randomized controlled trial in Nanjing city, China.

    PubMed

    Xu, Fei; Ware, Robert S; Tse, Lap Ah; Wang, Zhiyong; Hong, Xin; Song, Aiju; Li, Jiequan; Wang, Youfa

    2012-06-15

    The prevalence of childhood obesity among adolescents has been rapidly rising in Mainland China in recent decades, especially in urban and rich areas. There is an urgent need to develop effective interventions to prevent childhood obesity. Limited data regarding adolescent overweight prevention in China are available. Thus, we developed a school-based intervention with the aim of reducing excess body weight in children. This report described the study design. We designed a cluster randomized controlled trial in 8 randomly selected urban primary schools between May 2010 and December 2013. Each school was randomly assigned to either the intervention or control group (four schools in each group). Participants were the 4th graders in each participating school. The multi-component program was implemented within the intervention group, while students in the control group followed their usual health and physical education curriculum with no additional intervention program. The intervention consisted of four components: a) classroom curriculum, (including physical education and healthy diet education), b) school environment support, c) family involvement, and d) fun programs/events. The primary study outcome was body composition, and secondary outcomes were behaviour and behavioural determinants. The intervention was designed with due consideration of Chinese cultural and familial tradition, social convention, and current primary education and exam system in Mainland China. We did our best to gain good support from educational authorities, school administrators, teachers and parents, and to integrate intervention components into schools' regular academic programs. The results of and lesson learned from this study will help guide future school-based childhood obesity prevention programs in Mainland China. ChiCTR-ERC-11001819.

  7. Cluster pattern analysis of energy deposition sites for the brachytherapy sources 103Pd, 125I, 192Ir, 137Cs, and 60Co

    NASA Astrophysics Data System (ADS)

    Villegas, Fernanda; Tilly, Nina; Bäckström, Gloria; Ahnesjö, Anders

    2014-09-01

    Analysing the pattern of energy depositions may help elucidate differences in the severity of radiation-induced DNA strand breakage for different radiation qualities. It is often claimed that energy deposition (ED) sites from photon radiation form a uniform random pattern, but there is indication of differences in RBE values among different photon sources used in brachytherapy. The aim of this work is to analyse the spatial patterns of EDs from 103Pd, 125I, 192Ir, 137Cs sources commonly used in brachytherapy and a 60Co source as a reference radiation. The results suggest that there is both a non-uniform and a uniform random component to the frequency distribution of distances to the nearest neighbour ED. The closest neighbouring EDs show high spatial correlation for all investigated radiation qualities, whilst the uniform random component dominates for neighbours with longer distances for the three higher mean photon energy sources (192Ir, 137Cs, and 60Co). The two lower energy photon emitters (103Pd and 125I) present a very small uniform random component. The ratio of frequencies of clusters with respect to 60Co differs up to 15% for the lower energy sources and less than 2% for the higher energy sources when the maximum distance between each pair of EDs is 2 nm. At distances relevant to DNA damage, cluster patterns can be differentiated between the lower and higher energy sources. This may be part of the explanation to the reported difference in RBE values with initial DSB yields as an endpoint for these brachytherapy sources.

  8. Photocurrent polarization anisotropy of randomly oriented nanowire networks.

    PubMed

    Yu, Yanghai; Protasenko, Vladimir; Jena, Debdeep; Xing, Huili Grace; Kuno, Masaru

    2008-05-01

    While the polarization sensitivity of single or aligned nanowire (NW) ensembles is well known, this article reports on the existence of residual photocurrent polarization sensitivities in random NW networks. In these studies, CdSe and CdTe NWs were deposited onto glass substrates and contacted with Au electrodes separated by 30–110 μm gaps. SEM and AFM images of the resulting devices show isotropically distributed NWs between the electrodes. Complementary high-resolution TEM micrographs reveal the component NWs to be highly crystalline, with diameters between 10 and 20 nm and lengths ranging from 1 to 10 μm. When illuminated with visible (linearly polarized) light, such random NW networks exhibit significant photocurrent anisotropies: ρ = 0.25 (σ = 0.04) for CdSe NWs and ρ = 0.22 (σ = 0.04) for CdTe NWs. Corresponding bandwidth measurements yield device polarization sensitivities up to 100 Hz. Additional studies have investigated the effects of varying the electrode potential, gap width, and spatial excitation profile. These experiments suggest electrode orientation as the determining factor behind the polarization sensitivity of NW devices. A simple geometric model has been developed to qualitatively explain the phenomenon. The main conclusion from these studies, however, is that polarization-sensitive devices can be made from random NW networks without the need to align the component wires.

  9. Fast and secure encryption-decryption method based on chaotic dynamics

    DOEpatents

    Protopopescu, Vladimir A.; Santoro, Robert T.; Tolliver, Johnny S.

    1995-01-01

    A method and system for the secure encryption of information. The method comprises the steps of dividing a message of length L into its character components; generating m chaotic iterates from m independent chaotic maps; producing an "initial" value based upon the m chaotic iterates; transforming the "initial" value to create a pseudo-random integer; repeating the steps of generating, producing and transforming until a pseudo-random integer sequence of length L is created; and encrypting the message as ciphertext based upon the pseudo random integer sequence. A system for accomplishing the invention is also provided.
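
    A hedged sketch of the general idea, not the patented algorithm's exact steps and not a vetted cipher: several independent logistic maps are iterated, their states are mixed into one pseudo-random byte per symbol, and the message is XORed with the resulting keystream. The map parameters and the mixing rule are assumptions for illustration only.

```python
# Illustrative chaotic-map keystream cipher (NOT cryptographically secure).
import math

def keystream(length, seeds, r=3.99):
    """One pseudo-random byte per symbol from m independent logistic maps."""
    xs = list(seeds)                                  # m initial conditions = the secret key
    out = []
    for _ in range(length):
        xs = [r * x * (1.0 - x) for x in xs]          # m chaotic iterates
        mixed = sum(xs) / len(xs)                     # an "initial" value from the m iterates
        out.append(int(math.fmod(mixed * 1e6, 256)))  # transform to a pseudo-random integer
    return out

def xor_cipher(message: bytes, seeds):
    ks = keystream(len(message), seeds)
    return bytes(b ^ k for b, k in zip(message, ks))

secret = (0.123456, 0.654321, 0.777001)               # m = 3 chaotic maps
ct = xor_cipher(b"attack at dawn", secret)
print(ct.hex())
print(xor_cipher(ct, secret))                         # decrypts back to the plaintext
```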

  10. Identification and quantification of homologous series of compound in complex mixtures: autocovariance study of GC/MS chromatograms.

    PubMed

    Pietrogrande, Maria Chiara; Zampolli, Maria Grazia; Dondi, Francesco

    2006-04-15

    The paper describes a method for determining homologous classes of compounds in a multicomponent complex chromatogram obtained under programmed-elution conditions. The method is based on the computation of the autocovariance function of the experimental chromatogram (EACVF). The EACVF plot, if properly interpreted, can be regarded as a "class chromatogram", i.e., a virtual chromatogram formed by peaks whose positions and heights allow identification and quantification of the different homologous series, even if they are embedded in a random complex chromatogram. Theoretical models were developed to describe complex chromatograms displaying a random retention pattern, ordered sequences, or a combination of the two. On the basis of the theoretical autocovariance function, the properties of the chromatogram can be experimentally evaluated under well-defined conditions: in particular, the two components of the chromatogram, ordered and random, can be identified. Moreover, the total number of single components (SCs) and the separate numbers of SCs belonging to the random and ordered components can be determined, when the two components display the same concentration. If the mixture contains several homologous series with a common frequency and different phase values, the number and identity of the different homologous series as well as the number of SCs belonging to each of them can be evaluated. Moreover, the power of the EACVF method can be magnified by applying it to the single ion monitoring (SIM) signals to selectively detect specific compound classes in order to identify the different homologous series. In this way, a full "decoding" of the complex multicomponent chromatogram is achieved. The method was validated on synthetic mixtures containing known amounts of SCs belonging to homologous series of hydrocarbons, alcohols, ketones, and aromatic compounds, in addition to other, structurally unrelated SCs. The method was applied to both the total ion current (TIC) and the SIM signals, to describe step by step the essence of the procedure. Moreover, the systematic use of both SIM and TIC can simplify the decoding procedure of complex chromatograms by singling out only specific compound classes or by confirming the identification of the different homologous series. The method was further applied to a sample containing an unknown number of compounds and homologous series (a petroleum benzin, bp 140–160 °C): the results were meaningful in terms of both the number of components and the homologous series identified.
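
    A sketch of the autocovariance idea on synthetic data with assumed peak shapes: a chromatogram containing one homologous series with constant retention spacing plus randomly placed peaks, whose experimental autocovariance function (EACVF) shows a maximum at the series spacing.

```python
# Synthetic chromatogram: ordered series (spacing 4 min) + random single components.
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(0, 60.0, 0.01)                       # retention-time axis (min)

def peak(center, height=1.0, width=0.05):
    return height * np.exp(-0.5 * ((t - center) / width) ** 2)

signal = np.zeros_like(t)
for k in range(10):                                # ordered homologous series
    signal += peak(8.0 + 4.0 * k, height=1.0)
for _ in range(40):                                # random single components
    signal += peak(rng.uniform(5, 55), height=rng.uniform(0.2, 1.0))

y = signal - signal.mean()
eacvf = np.correlate(y, y, mode="full")[len(y) - 1:] / len(y)   # non-negative lags
lags = t - t[0]

# lag of the largest EACVF maximum beyond zero (should sit near the 4 min spacing)
search = slice(int(1.0 / 0.01), int(10.0 / 0.01))
print("dominant inter-peak spacing ~", lags[search][np.argmax(eacvf[search])], "min")
```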

  11. Design of a school-based randomized trial to reduce smoking among 13 to 15-year olds, the X:IT study.

    PubMed

    Andersen, Anette; Bast, Lotus Sofie; Ringgaard, Lene Winther; Wohllebe, Louise; Jensen, Poul Dengsøe; Svendsen, Maria; Dalum, Peter; Due, Pernille

    2014-05-28

    Adolescent smoking is still highly prevalent in Denmark. One in four 13-year olds indicates that they have tried to smoke, and one in four 15-year olds answer that they smoke regularly. Smoking is more prevalent in socioeconomically disadvantaged populations in Denmark as well as in most Western countries. Previous school-based programs to prevent smoking have shown contrasting results internationally. In Denmark, previous programs have shown limited or no effect. This indicates a need for developing a well-designed, comprehensive, and multi-component intervention aimed at Danish schools with careful implementation and thorough evaluation.This paper describes X:IT, a study including 1) the development of a 3-year school-based multi-component intervention and 2) the randomized trial investigating the effect of the intervention. The study aims at reducing the prevalence of smoking among 13 to 15-year olds by 25%. The X:IT study is based on the Theory of Triadic Influences. The theory organizes factors influencing adolescent smoking into three streams: cultural environment, social situation, and personal factors. We added a fourth stream, the community aspects. The X:IT program comprises three main components: 1) smoke-free school premises, 2) parental involvement including smoke-free dialogues and smoke-free contracts between students and parents, and 3) a curricular component. The study encompasses process- and effect-evaluations as well as health economic analyses. Ninety-four schools in 17 municipalities were randomly allocated to the intervention (51 schools) or control (43 schools) group. At baseline in September 2010, 4,468 year 7 students were eligible of which 4,167 answered the baseline questionnaire (response rate = 93.3%). The X:IT study is a large, randomized controlled trial evaluating the effect of an intervention, based on components proven to be efficient in other Nordic settings. The X:IT study directs students, their parents, and smoking prevention policies at the schools. These elements have proven to be effective tools in preventing smoking among adolescents. Program implementation is thoroughly evaluated to be able to add to the current knowledge of the importance of implementation. X:IT creates the basis for thorough effect and process evaluation, focusing on various social groups. Current Controlled Trials ISRCTN77415416.

  12. Design of a school-based randomized trial to reduce smoking among 13 to 15-year olds, the X:IT study

    PubMed Central

    2014-01-01

    Background Adolescent smoking is still highly prevalent in Denmark. One in four 13-year olds indicates that they have tried to smoke, and one in four 15-year olds answer that they smoke regularly. Smoking is more prevalent in socioeconomically disadvantaged populations in Denmark as well as in most Western countries. Previous school-based programs to prevent smoking have shown contrasting results internationally. In Denmark, previous programs have shown limited or no effect. This indicates a need for developing a well-designed, comprehensive, and multi-component intervention aimed at Danish schools with careful implementation and thorough evaluation. This paper describes X:IT, a study including 1) the development of a 3-year school-based multi-component intervention and 2) the randomized trial investigating the effect of the intervention. The study aims at reducing the prevalence of smoking among 13 to 15-year olds by 25%. Methods/Design The X:IT study is based on the Theory of Triadic Influences. The theory organizes factors influencing adolescent smoking into three streams: cultural environment, social situation, and personal factors. We added a fourth stream, the community aspects. The X:IT program comprises three main components: 1) smoke-free school premises, 2) parental involvement including smoke-free dialogues and smoke-free contracts between students and parents, and 3) a curricular component. The study encompasses process- and effect-evaluations as well as health economic analyses. Ninety-four schools in 17 municipalities were randomly allocated to the intervention (51 schools) or control (43 schools) group. At baseline in September 2010, 4,468 year 7 students were eligible of which 4,167 answered the baseline questionnaire (response rate = 93.3%). Discussion The X:IT study is a large, randomized controlled trial evaluating the effect of an intervention, based on components proven to be efficient in other Nordic settings. The X:IT study directs students, their parents, and smoking prevention policies at the schools. These elements have proven to be effective tools in preventing smoking among adolescents. Program implementation is thoroughly evaluated to be able to add to the current knowledge of the importance of implementation. X:IT creates the basis for thorough effect and process evaluation, focusing on various social groups. Trial registration Current Controlled Trials ISRCTN77415416. PMID:24886206

  13. Compact quantum random number generator based on superluminescent light-emitting diodes

    NASA Astrophysics Data System (ADS)

    Wei, Shihai; Yang, Jie; Fan, Fan; Huang, Wei; Li, Dashuang; Xu, Bingjie

    2017-12-01

    By measuring the amplified spontaneous emission (ASE) noise of superluminescent light-emitting diodes, we propose and realize a practical quantum random number generator (QRNG). In the QRNG, after detection and amplification of the ASE noise, data acquisition and randomness extraction are both implemented in real time in a field-programmable gate array (FPGA), and the final random bit sequences are delivered to a host computer at a real-time generation rate of 1.2 Gbps. Further, to achieve compactness, all components of the QRNG are integrated on three printed circuit boards with a compact design, and the QRNG is packed in a small enclosure measuring 140 mm × 120 mm × 25 mm. The final random bit sequences pass all of the NIST-STS and DIEHARD tests.

  14. Online neural monitoring of statistical learning.

    PubMed

    Batterink, Laura J; Paller, Ken A

    2017-05-01

    The extraction of patterns in the environment plays a critical role in many types of human learning, from motor skills to language acquisition. This process is known as statistical learning. Here we propose that statistical learning has two dissociable components: (1) perceptual binding of individual stimulus units into integrated composites and (2) storing those integrated representations for later use. Statistical learning is typically assessed using post-learning tasks, such that the two components are conflated. Our goal was to characterize the online perceptual component of statistical learning. Participants were exposed to a structured stream of repeating trisyllabic nonsense words and a random syllable stream. Online learning was indexed by an EEG-based measure that quantified neural entrainment at the frequency of the repeating words relative to that of individual syllables. Statistical learning was subsequently assessed using conventional measures in an explicit rating task and a reaction-time task. In the structured stream, neural entrainment to trisyllabic words was higher than in the random stream, increased as a function of exposure to track the progression of learning, and predicted performance on the reaction time (RT) task. These results demonstrate that monitoring this critical component of learning via rhythmic EEG entrainment reveals a gradual acquisition of knowledge whereby novel stimulus sequences are transformed into familiar composites. This online perceptual transformation is a critical component of learning. Copyright © 2017 Elsevier Ltd. All rights reserved.
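
    An illustrative computation of the entrainment index described above on a synthetic signal (the sampling rate, syllable and word rates, and noise level are assumptions): spectral power at the trisyllabic-word frequency relative to power at the syllable frequency.

```python
# Word-rate vs. syllable-rate spectral power from a synthetic "EEG" trace.
import numpy as np

fs = 250.0                                 # sampling rate (Hz), an assumption
syll_rate, word_rate = 3.3, 1.1            # ~300 ms syllables, 3-syllable words
t = np.arange(0, 120.0, 1.0 / fs)          # two minutes of signal

rng = np.random.default_rng(7)
# a learner whose response contains a word-rate component plus syllable tracking and noise
eeg = (0.8 * np.sin(2 * np.pi * word_rate * t)
       + 1.0 * np.sin(2 * np.pi * syll_rate * t)
       + rng.normal(0, 2.0, t.size))

spectrum = np.abs(np.fft.rfft(eeg - eeg.mean())) ** 2
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)

def power_at(f, half_width=0.05):
    band = (freqs > f - half_width) & (freqs < f + half_width)
    return spectrum[band].mean()

print("word/syllable power ratio:", power_at(word_rate) / power_at(syll_rate))
```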

  15. Linear response to nonstationary random excitation.

    NASA Technical Reports Server (NTRS)

    Hasselman, T.

    1972-01-01

    Development of a method for computing the mean-square response of linear systems to nonstationary random excitation of the form y(t) = f(t) x(t), in which x(t) is a stationary process and f(t) is a deterministic modulating function. The method is suitable for application to multidegree-of-freedom systems when the mean-square response at a point due to excitation applied at another point is desired. Both the stationary process, x(t), and the modulating function, f(t), may be arbitrary. The method utilizes a fundamental component of the transient response, dependent only on x(t) and the system and independent of f(t), to synthesize the total response. The role played by this component is analogous to that played by the Green's function or impulse response function in the convolution integral.
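
    For context, a textbook expression consistent with the setup above (not necessarily Hasselman's exact synthesis formula): for a linear system with impulse response h(t), initially at rest, and a zero-mean stationary x(t) with autocorrelation R_x, the mean-square response z(t) to the modulated excitation y(t) = f(t) x(t) is

```latex
E\!\left[z^{2}(t)\right]
  = \int_{0}^{t}\!\!\int_{0}^{t}
      h(t-\tau_{1})\,h(t-\tau_{2})\,
      f(\tau_{1})\,f(\tau_{2})\,
      R_{x}(\tau_{1}-\tau_{2})\,
    d\tau_{1}\,d\tau_{2}.
```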

  16. Finding the right match: Mindfulness training may potentiate the therapeutic effect of non-judgment of inner experience on smoking cessation

    PubMed Central

    Schuman-Olivier, Zev D.; Hoeppner, Bettina B.; Evins, A. Eden; Brewer, Judson

    2014-01-01

    Mindfulness Training (MT) is an emerging therapeutic modality for addictive disorders. Non-judgment of inner experience, a component of mindfulness, may influence addiction treatment response. To test whether this component influences smoking cessation, tobacco smokers (n=85) in a randomized controlled trial of MT vs. Freedom from Smoking (FFS), a standard cognitive-behaviorally oriented treatment, were divided into split-half subgroups based on their baseline Five Facet Mindfulness Questionnaire non-judgment subscale scores. Smokers who rarely judge inner experience (non-judgment > 30.5) smoked less during follow-up when randomized to MT (3.9 cigs/d) vs. FFS (11.1 cigs/d), p < 0.01. Measuring trait non-judgment may help personalize treatment assignments, improving outcomes. PMID:24611853

  17. Probabilistic evaluation of SSME structural components

    NASA Astrophysics Data System (ADS)

    Rajagopal, K. R.; Newell, J. F.; Ho, H.

    1991-05-01

    The application is described of Composite Load Spectra (CLS) and Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) family of computer codes to the probabilistic structural analysis of four Space Shuttle Main Engine (SSME) space propulsion system components. These components are subjected to environments that are influenced by many random variables. The applications consider a wide breadth of uncertainties encountered in practice, while simultaneously covering a wide area of structural mechanics. This has been done consistent with the primary design requirement for each component. The probabilistic application studies are discussed using finite element models that have been typically used in the past in deterministic analysis studies.

  18. A Randomized Clinical Trial to Dismantle Components of Cognitive Processing Therapy for Posttraumatic Stress Disorder in Female Victims of Interpersonal Violence

    ERIC Educational Resources Information Center

    Resick, Patricia A.; Galovski, Tara E.; Uhlmansiek, Mary O'Brien; Scher, Christine D.; Clum, Gretchen A.; Young-Xu, Yinong

    2008-01-01

    The purpose of this experiment was to conduct a dismantling study of cognitive processing therapy in which the full protocol was compared with its constituent components--cognitive therapy only (CPT-C) and written accounts (WA)--for the treatment of posttraumatic stress disorder (PTSD) and comorbid symptoms. The intent-to-treat (ITT) sample…

  19. Effects of a Signaled Delay to Reinforcement in the Previous and Upcoming Ratios on Between-Ratio Pausing in Fixed-Ratio Schedules

    ERIC Educational Resources Information Center

    Harris, Aimee; Foster, T. Mary; Levine, Joshua; Temple, William

    2012-01-01

    Domestic hens responded under multiple fixed-ratio fixed-ratio schedules with equal fixed ratios. One component provided immediate reinforcement and the other provided reinforcement after a delay, signaled by the offset of the key light. The components were presented quasi-randomly so that all four possible transitions occurred in each session.…

  20. The effects of a multi-component dyadic intervention on the psychological distress of family caregivers providing care to people with dementia: a randomized controlled trial.

    PubMed

    Prick, Anna-Eva; de Lange, Jacomine; Twisk, Jos; Pot, Anne Margriet

    2015-12-01

    Earlier research showed that multi-component dyadic interventions - including a combination of intervention strategies and addressing both the person with dementia and caregiver - have a beneficial impact on the mental and physical health of people with dementia and their family caregivers. A randomized controlled trial (RCT) of a multi-component dyadic intervention, which is a translated and adapted version of an intervention that has been shown to be effective in the US by Teri et al. (2003), was performed. The effects on caregivers' mood (primary outcome), burden, general health, and salivary cortisol levels (secondary outcomes) were studied. Community-dwelling people with dementia and their family caregivers (N = 111 dyads) were randomly assigned. The experimental group received eight home visits during three months, combining physical exercise and support (psycho-education, communication skills training, and planning of pleasant activities). Both the physical exercise and support component were directed at both the person with dementia and the caregiver. The comparison group received monthly information bulletins and phone calls. There were three measurements at baseline (prior to the intervention), at three months, and at six months into the intervention. Data were analyzed with Generalized Estimating Equations (GEE) based on an intention-to-treat analysis of all available data. All analyses showed no benefits of the intervention over time on any of the outcomes. The negative results might be explained by the translation and adaptation of the intervention that has been shown to be effective in the US: the intervention was shortened and did not include cognitive reframing. However, only the health effects on people with dementia and not on caregivers were studied in the US. Several other factors might also have played a role, which are important for future studies to take into account. These are: the usual health care in the country or region of implementation; the wishes and needs of participants for specific intervention components; the room for improvement regarding these components; the inclusion of positive outcome measures, such as pleasure, and the quality of the relationship.

  1. Optical encrypted holographic memory using triple random phase-encoded multiplexing in photorefractive LiNbO3:Fe crystal

    NASA Astrophysics Data System (ADS)

    Tang, Li-Chuan; Hu, Guang W.; Russell, Kendra L.; Chang, Chen S.; Chang, Chi Ching

    2000-10-01

    We propose a new holographic memory scheme based on random phase-encoded multiplexing in a photorefractive LiNbO3:Fe crystal. Experimental results show that rotating a diffuser placed as a random phase modulator in the path of the reference beam provides a simple yet effective method of increasing the holographic storage capabilities of the crystal. Combining this rotational multiplexing with angular multiplexing offers further advantages. Storage capabilities can be optimized by using a post-image random phase plate in the path of the object beam. The technique is applied to a triple phase-encoded optical security system that takes advantage of the high angular selectivity of the angular-rotational multiplexing components.

  2. Breaking through the bandwidth barrier in distributed fiber vibration sensing by sub-Nyquist randomized sampling

    NASA Astrophysics Data System (ADS)

    Zhang, Jingdong; Zhu, Tao; Zheng, Hua; Kuang, Yang; Liu, Min; Huang, Wei

    2017-04-01

    The round-trip time of the light pulse limits the maximum detectable frequency response range of vibration in phase-sensitive optical time-domain reflectometry (φ-OTDR). We propose a method to break the frequency-response-range restriction of the φ-OTDR system by randomly modulating the light-pulse interval, which enables random sampling at every vibration point along a long sensing fiber. This sub-Nyquist randomized sampling method is suited to detecting vibration signals that are sparse in the wideband frequency domain. A resonance vibration signal reaching the MHz range with dozens of frequency components, as well as a 1.153 MHz single-frequency vibration signal, are clearly identified over a sensing range of 9.6 km with a maximum sampling rate of 10 kHz.
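
    A toy illustration of the sub-Nyquist idea on synthetic data (not the φ-OTDR processing chain): a vibration tone far above the mean sampling rate is recovered from randomly timed samples with a Lomb–Scargle periodogram; the sample count, search band, and noise level are assumptions.

```python
# Recover a 1.153 MHz tone from samples taken at a ~10 kHz mean (randomized) rate.
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(2)
f_vib = 1.153e6                                   # vibration frequency (Hz)
mean_fs = 1.0e4                                   # mean randomized sampling rate (Hz)

# random sampling instants: exponential inter-pulse intervals with mean 1/mean_fs
t = np.cumsum(rng.exponential(1.0 / mean_fs, size=1_000))
y = np.sin(2 * np.pi * f_vib * t) + 0.2 * rng.normal(size=t.size)

f_grid = np.linspace(1.10e6, 1.20e6, 40_001)      # search band, 2.5 Hz spacing
pgram = lombscargle(t, y - y.mean(), 2 * np.pi * f_grid)
print("estimated vibration frequency: %.0f Hz" % f_grid[np.argmax(pgram)])
```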

  3. A Hybrid Color Space for Skin Detection Using Genetic Algorithm Heuristic Search and Principal Component Analysis Technique

    PubMed Central

    2015-01-01

    Color is one of the most prominent features of an image and used in many skin and face detection applications. Color space transformation is widely used by researchers to improve face and skin detection performance. Despite the substantial research efforts in this area, choosing a proper color space in terms of skin and face classification performance which can address issues like illumination variations, various camera characteristics and diversity in skin color tones has remained an open issue. This research proposes a new three-dimensional hybrid color space termed SKN by employing the Genetic Algorithm heuristic and Principal Component Analysis to find the optimal representation of human skin color in over seventeen existing color spaces. Genetic Algorithm heuristic is used to find the optimal color component combination setup in terms of skin detection accuracy while the Principal Component Analysis projects the optimal Genetic Algorithm solution to a less complex dimension. Pixel wise skin detection was used to evaluate the performance of the proposed color space. We have employed four classifiers including Random Forest, Naïve Bayes, Support Vector Machine and Multilayer Perceptron in order to generate the human skin color predictive model. The proposed color space was compared to some existing color spaces and shows superior results in terms of pixel-wise skin detection accuracy. Experimental results show that by using Random Forest classifier, the proposed SKN color space obtained an average F-score and True Positive Rate of 0.953 and False Positive Rate of 0.0482 which outperformed the existing color spaces in terms of pixel wise skin detection accuracy. The results also indicate that among the classifiers used in this study, Random Forest is the most suitable classifier for pixel wise skin detection applications. PMID:26267377
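
    A minimal sketch of the pipeline family described above on synthetic labelled pixels (no genetic-algorithm search, and the stacked "colour" features are stand-ins): project pixel features onto three principal components and train a Random Forest for pixel-wise skin / non-skin classification.

```python
# PCA to a 3-D feature space + Random Forest for pixel-wise classification.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

rng = np.random.default_rng(11)

# Synthetic stand-in for labelled pixels: columns could be R,G,B,H,S,V,...
n = 20_000
X_skin = rng.normal(loc=[0.7, 0.5, 0.4, 0.05, 0.4, 0.7], scale=0.08, size=(n // 2, 6))
X_bg = rng.uniform(0, 1, size=(n // 2, 6))
X = np.vstack([X_skin, X_bg])
y = np.array([1] * (n // 2) + [0] * (n // 2))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

pca = PCA(n_components=3).fit(X_tr)               # 3-D "hybrid" colour space
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(pca.transform(X_tr), y_tr)

pred = clf.predict(pca.transform(X_te))
print("pixel-wise F-score:", round(f1_score(y_te, pred), 3))
```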

  4. Effects of coarse-graining on fluctuations in gene expression

    NASA Astrophysics Data System (ADS)

    Pedraza, Juan; Paulsson, Johan

    2008-03-01

    Many cellular components are present in such low numbers per cell that random births and deaths of individual molecules can cause significant `noise' in concentrations. But biochemical events do not necessarily occur in steps of individual molecules. Some processes are greatly randomized when synthesis or degradation occurs in large bursts of many molecules in a short time interval. Conversely, each birth or death of a macromolecule could involve several small steps, creating a memory between individual events. Here we present generalized theory for stochastic gene expression, formulating the variance in protein abundance in terms of the randomness of the individual events, and discuss the effective coarse-graining of the molecular hardware. We show that common molecular mechanisms produce gestation and senescence periods that can reduce noise without changing average abundances, lifetimes, or any concentration-dependent control loops. We also show that single-cell experimental methods that are now commonplace in cell biology do not discriminate between qualitatively different stochastic principles, but that this in turn makes them better suited for identifying which components introduce fluctuations.
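
    A minimal stochastic simulation in the spirit of this argument (illustrative parameters, not the authors' model): births arriving in geometric bursts are compared with one-molecule births at the same mean abundance, and the burstier process shows a Fano factor (variance/mean) well above the Poisson value of one.

```python
# Sketch: Gillespie-style simulation of protein copy number with first-order
# decay, comparing single-molecule births with geometric bursts of mean size 10
# at the same mean synthesis flux. Burst-like synthesis inflates the noise.
import numpy as np

rng = np.random.default_rng(1)

def simulate(burst_mean, k_on, gamma=1.0, t_end=1000.0, t_burn_in=50.0):
    n, t, samples = 0, 0.0, []
    while t < t_end:
        birth_rate, death_rate = k_on, gamma * n
        total = birth_rate + death_rate
        t += rng.exponential(1.0 / total)
        if rng.random() * total < birth_rate:
            n += rng.geometric(1.0 / burst_mean)   # geometric burst, >= 1 molecule
        else:
            n -= 1                                 # single-molecule degradation
        if t > t_burn_in:
            samples.append(n)                      # event-time samples; fine for a sketch
    return np.array(samples)

for b in (1, 10):                                  # burst size 1 vs 10
    x = simulate(burst_mean=b, k_on=50.0 / b)      # same mean abundance of ~50
    print(f"burst={b:2d}  mean={x.mean():6.1f}  Fano={x.var() / x.mean():5.2f}")
```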

  5. Evaluating the Impact of Feedback on Elementary Aged Students’ Fluency Growth in Written Expression: A Randomized Controlled Trial

    PubMed Central

    Truckenmiller, Adrea J.; Eckert, Tanya L.; Codding, Robin S.; Petscher, Yaacov

    2016-01-01

    The purpose of this randomized controlled trial was to evaluate elementary-aged students’ writing fluency growth in response to (a) instructional practices, (b) sex differences, and (c) student’s initial level of writing fluency. Third-grade students (n=133) in three urban elementary schools were randomly assigned to either an individualized performance feedback condition (n=46), a practice-only condition (i.e., weekly writing practice; n = 39), or an instructional control condition (n = 48) for 8 weeks. Findings included support for use of performance feedback as an instructional component in general education classrooms (Hedges’ g = 0.66), whereas simple practice with curriculum-based measurement in written expression did not produce growth significantly greater than standard instructional practices. The hypothesis that girls write significantly more than boys was supported. However, girls and boys did not differ in their rate of growth. Finally, students’ initial risk status in writing fluency did not differentially predict growth in writing fluency over the course of the study. Implications for incorporating feedback as a basic component of intervention in writing are discussed. PMID:25432270

  6. Landscape-scale spatial abundance distributions discriminate core from random components of boreal lake bacterioplankton.

    PubMed

    Niño-García, Juan Pablo; Ruiz-González, Clara; Del Giorgio, Paul A

    2016-12-01

    Aquatic bacterial communities harbour thousands of coexisting taxa. To meet the challenge of discriminating between a 'core' and a sporadically occurring 'random' component of these communities, we explored the spatial abundance distribution of individual bacterioplankton taxa across 198 boreal lakes and their associated fluvial networks (188 rivers). We found that all taxa could be grouped into four distinct categories based on model statistical distributions (normal like, bimodal, logistic and lognormal). The distribution patterns across lakes and their associated river networks showed that lake communities are composed of a core of taxa whose distribution appears to be linked to in-lake environmental sorting (normal-like and bimodal categories), and a large fraction of mostly rare bacteria (94% of all taxa) whose presence appears to be largely random and linked to downstream transport in aquatic networks (logistic and lognormal categories). These rare taxa are thus likely to reflect species sorting at upstream locations, providing a perspective of the conditions prevailing in entire aquatic networks rather than only in lakes. © 2016 John Wiley & Sons Ltd/CNRS.

  7. Draco, Version 6.x.x

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, Kelly; Budge, Kent; Lowrie, Rob

    2016-03-03

    Draco is an object-oriented component library geared towards numerically intensive, radiation (particle) transport applications built for parallel computing hardware. It consists of semi-independent packages and a robust build system. The packages in Draco provide a set of components that can be used by multiple clients to build transport codes. The build system can also be extracted for use in clients. Software includes smart pointers, Design-by-Contract assertions, unit test framework, wrapped MPI functions, a file parser, unstructured mesh data structures, a random number generator, root finders and an angular quadrature component.

  8. Rationale and design of a stepped-wedge cluster randomized trial evaluating a quality improvement initiative for reducing cardiovascular events among patients with acute coronary syndromes in resource-constrained hospitals in China.

    PubMed

    Li, Shenshen; Wu, Yangfeng; Du, Xin; Li, Xian; Patel, Anushka; Peterson, Eric D; Turnbull, Fiona; Lo, Serigne; Billot, Laurent; Laba, Tracey; Gao, Runlin

    2015-03-01

    Acute coronary syndromes (ACSs) are a major cause of morbidity and mortality, yet effective ACS treatments are frequently underused in clinical practice. Randomized trials including the CPACS-2 study suggest that quality improvement initiatives can increase the use of effective treatments, but whether such programs can impact hard clinical outcomes has never been demonstrated in a well-powered randomized controlled trial. The CPACS-3 study is a stepped-wedge cluster-randomized trial conducted in 104 remote level 2 hospitals without PCI facilities in China. All hospitalized ACS patients will be recruited consecutively over a 30-month period to an anticipated total study population of more than 25,000 patients. After a 6-month baseline period, hospitals will be randomized to 1 of 4 groups, and a 6-component quality improvement intervention will be implemented sequentially in each group every 6 months. These components include the following: establishment of a quality improvement team, implementation of a clinical pathway, training of physicians and nurses, hospital performance audit and feedback, online technical support, and patient education. All patients will be followed up for 6 months postdischarge. The primary outcome will be the incidence of in-hospital major adverse cardiovascular events comprising all-cause mortality, myocardial infarction or reinfarction, and nonfatal stroke. The CPACS-3 study will be the first large randomized trial with sufficient power to assess the effects of a multifaceted quality-of-care improvement initiative on hard clinical outcomes in patients with ACS. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. Mediating Effects of Home-Related Factors on Fat Intake from Snacks in a School-Based Nutrition Intervention among Adolescents

    ERIC Educational Resources Information Center

    Van Lippevelde, Wendy; van Stralen, Maartje; Verloigne, Maite; De Bourdeaudhuij, Ilse; Deforche, Benedicte; Brug, Johannes; Maes, Lea; Haerens, Leen

    2012-01-01

    The purpose of the present study was to investigate if the effects of the parental component of a school-based intervention on dietary fat intake from snacking were mediated by changes in home-related factors. A random sample of 10 schools with 2232 pupils aged 11-15 years was randomly assigned to one of two intervention groups [one with (n =…

  10. Verifying Digital Components of Physical Systems: Experimental Evaluation of Test Quality

    NASA Astrophysics Data System (ADS)

    Laputenko, A. V.; López, J. E.; Yevtushenko, N. V.

    2018-03-01

    This paper continues the study of high-quality test derivation for verifying digital components used in various physical systems, such as sensors, data transfer components, etc. For the experimental evaluation we used the logic circuits b01-b10 of the ITC'99 benchmark package (Second Release), which, as stated before, describe digital components of physical systems designed for various applications. Test sequences are derived for detecting the best-known faults of the reference logic circuit using three different approaches to test derivation. Three widely used fault types, namely stuck-at faults, bridges, and faults that slightly modify the behavior of one gate, are considered as possible faults of the reference behavior. The most interesting test sequences are short ones that can still provide appropriate guarantees after testing, and thus we experimentally study various approaches to deriving so-called complete test suites, which detect all fault types. In the first series of experiments, we compare two approaches for deriving complete test suites. In the first approach, a shortest test sequence is derived for testing each fault. In the second approach, a test sequence is pseudo-randomly generated using appropriate software for logic synthesis and verification (the ABC system in our study) and thus can be longer. However, after deleting sequences that detect the same set of faults, the test suite returned by the second approach is shorter. The latter underlines the fact that in many cases it is not worth spending time and effort on deriving a shortest distinguishing sequence; it is better to apply test minimization afterwards. The experiments also show that using only randomly generated test sequences is not very efficient, since such sequences do not detect all faults of every type: after the fault coverage reaches around 70%, saturation is observed and the coverage cannot be increased further. For deriving high-quality short test suites, the approach that combines randomly generated sequences with sequences aimed at faults not detected by the random tests reaches good fault coverage with the shortest test sequences.
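
    A toy sketch of the combined strategy, with entirely hypothetical fault and test names: random sequences are pruned by a greedy covering pass, and targeted sequences are added only for the faults that the random tests never detect.

```python
# Sketch: prune redundant (pseudo-random) test sequences, then add targeted
# sequences for the faults left uncovered. Fault sets and test names are made up.
def minimize_suite(coverage):
    """Greedy pruning: keep a sequence only if it detects a not-yet-covered fault."""
    kept, covered = [], set()
    # Largest coverage first makes the greedy pass drop more redundant sequences.
    for name, faults in sorted(coverage.items(), key=lambda kv: -len(kv[1])):
        if not faults <= covered:
            kept.append(name)
            covered |= faults
    return kept, covered

random_tests = {"r1": {"f1", "f2"}, "r2": {"f2", "f3"}, "r3": {"f1", "f3"}}
all_faults = {"f1", "f2", "f3", "f4", "f5", "f6"}

suite, covered = minimize_suite(random_tests)
missing = all_faults - covered                        # faults random tests never hit
targeted = {f"t_{f}": {f} for f in sorted(missing)}   # one targeted sequence per fault
combined, covered_all = minimize_suite({**random_tests, **targeted})

print("pruned random suite:", suite, "covers", sorted(covered))
print("combined suite:", combined, "covers all faults:", covered_all == all_faults)
```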

  11. Coordinative structuring of gait kinematics during adaptation to variable and asymmetric split-belt treadmill walking - A principal component analysis approach.

    PubMed

    Hinkel-Lipsker, Jacob W; Hahn, Michael E

    2018-06-01

    Gait adaptation is a task that requires fine-tuned coordination of all degrees of freedom in the lower limbs by the central nervous system. However, when individuals change their gait it is unknown how this coordination is organized, and how it can be influenced by contextual interference during practice. Such knowledge could provide information about measurement of gait adaptation during rehabilitation. Able-bodied individuals completed an acute bout of asymmetric split-belt treadmill walking, where one limb was driven at a constant velocity and the other according to one of three designed practice paradigms: serial practice, where the variable limb belt velocity increased over time; random blocked practice, where every 20 strides the variable limb belt velocity changed randomly; and random practice, where every stride the variable limb belt velocity changed randomly. On the second day, subjects completed one of two different transfer tests: one with a belt asymmetry close to that experienced on the acquisition day (transfer 1; 1.5:1), and one with a greater asymmetry (transfer 2; 2:1). To reduce this inherently high-dimensional dataset, principal component analyses were used for kinematic data collected throughout the acquisition and transfer phases, resulting in extraction of the first two principal components (PCs). For acquisition, PC1 and PC2 were related to sagittal and frontal plane control. For transfer 1, PC1 and PC2 were related to frontal plane control of the base of support and whole-body center of mass. For transfer 2, PC1 did not have any variables with coefficients high enough to be deemed relevant, and PC2 was related to sagittal plane control. Observations of principal component scores indicate that variance structuring differs among practice groups during acquisition and transfer 1, but not transfer 2. These results demonstrate the main kinematic coordinative structures that exist during gait adaptation, and that control of sagittal plane and frontal plane motion are perhaps a trade-off during acquisition of a novel asymmetric gait pattern. Copyright © 2018 Elsevier B.V. All rights reserved.

  12. Patient-specific targeting guides compared with traditional instrumentation for glenoid component placement in shoulder arthroplasty: a multi-surgeon study in 70 arthritic cadaver specimens.

    PubMed

    Throckmorton, Thomas W; Gulotta, Lawrence V; Bonnarens, Frank O; Wright, Stephen A; Hartzell, Jeffrey L; Rozzi, William B; Hurst, Jason M; Frostick, Simon P; Sperling, John W

    2015-06-01

    The purpose of this study was to compare the accuracy of patient-specific guides for total shoulder arthroplasty (TSA) with traditional instrumentation in arthritic cadaver shoulders. We hypothesized that the patient-specific guides would place components more accurately than standard instrumentation. Seventy cadaver shoulders with radiographically confirmed arthritis were randomized in equal groups to 5 surgeons of varying experience levels who were not involved in development of the patient-specific guidance system. Specimens were then randomized to patient-specific guides based off of computed tomography scanning, standard instrumentation, and anatomic TSA or reverse TSA. Variances in version or inclination of more than 10° and more than 4 mm in starting point were considered indications of significant component malposition. TSA glenoid components placed with patient-specific guides averaged 5° of deviation from the intended position in version and 3° in inclination; those with standard instrumentation averaged 8° of deviation in version and 7° in inclination. These differences were significant for version (P = .04) and inclination (P = .01). Multivariate analysis of variance to compare the overall accuracy for the entire cohort (TSA and reverse TSA) revealed patient-specific guides to be significantly more accurate (P = .01) for the combined vectors of version and inclination. Patient-specific guides also had fewer instances of significant component malposition than standard instrumentation did. Patient-specific targeting guides were more accurate than traditional instrumentation and had fewer instances of component malposition for glenoid component placement in this multi-surgeon cadaver study of arthritic shoulders. Long-term clinical studies are needed to determine if these improvements produce improved functional outcomes. Copyright © 2015 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.

  13. Dynamic fractals in spatial evolutionary games

    NASA Astrophysics Data System (ADS)

    Kolotev, Sergei; Malyutin, Aleksandr; Burovski, Evgeni; Krashakov, Sergei; Shchur, Lev

    2018-06-01

    We investigate critical properties of a spatial evolutionary game based on the Prisoner's Dilemma. Simulations demonstrate a jump in the component densities accompanied by drastic changes in average sizes of the component clusters. We argue that the cluster boundary is a random fractal. Our simulations are consistent with the fractal dimension of the boundary being equal to 2, and the cluster boundaries are hence asymptotically space filling as the system size increases.

  14. Feasibility study of the design of Bi Ra Systems, Incorporated model 5301, 5101, and 3222 CAMAC modules for space use

    NASA Technical Reports Server (NTRS)

    Biswell, L.; Mcelderry, R.

    1976-01-01

    Cost estimates are determined for the redesigned modules. Consideration is given to incorporation of NASA-approved components, component screening and documentation, as well as reduced power consumption. Results show that the redesigned modules will function reliably in a space environment of 50 C and withstand greater than 15 g of random vibration between 40 Hz and 400 Hz.

  15. Note: Fully integrated 3.2 Gbps quantum random number generator with real-time extraction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xiao-Guang; Nie, You-Qi; Liang, Hao

    2016-07-15

    We present a real-time and fully integrated quantum random number generator (QRNG) based on measuring laser phase fluctuations. The QRNG scheme based on laser phase fluctuations is notable for its capability of generating ultra-high-speed random numbers. However, the speed bottleneck of a practical QRNG lies in the limited speed of randomness extraction. To close the gap between fast randomness generation and slow post-processing, we propose a pipeline extraction algorithm based on Toeplitz matrix hashing and implement it in a high-speed field-programmable gate array. Furthermore, all the QRNG components are integrated into a module, including a compact and actively stabilized interferometer, high-speed data acquisition, and real-time data post-processing and transmission. The final generation rate of the QRNG module with real-time extraction reaches 3.2 Gbps.
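
    A minimal sketch of Toeplitz-matrix hashing, the extraction step named above; block sizes and the seed are illustrative, and a real implementation streams blocks through an FPGA rather than forming the matrix explicitly.

```python
# Sketch: extract nearly uniform bits from a raw bit block with a seeded
# Toeplitz matrix over GF(2). Sizes are illustrative only.
import numpy as np
from scipy.linalg import toeplitz

rng = np.random.default_rng(2)
n_in, n_out = 1024, 512                        # raw bits in, extracted bits out

# A Toeplitz matrix over GF(2) is fixed by n_in + n_out - 1 seed bits.
seed = rng.integers(0, 2, n_in + n_out - 1)
T = toeplitz(seed[:n_out], seed[n_out - 1:])   # shape (n_out, n_in)

raw = rng.integers(0, 2, n_in)                 # stand-in for digitized phase noise
extracted = T.dot(raw) % 2                     # matrix-vector product modulo 2

print(extracted[:32])                          # first 32 extracted bits
```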

  16. Polynomial chaos expansion with random and fuzzy variables

    NASA Astrophysics Data System (ADS)

    Jacquelin, E.; Friswell, M. I.; Adhikari, S.; Dessombz, O.; Sinou, J.-J.

    2016-06-01

    A dynamical uncertain system is studied in this paper. Two kinds of uncertainties are addressed, where the uncertain parameters are described through random variables and/or fuzzy variables. A general framework is proposed to deal with both kinds of uncertainty using a polynomial chaos expansion (PCE). It is shown that fuzzy variables may be expanded in terms of polynomial chaos when Legendre polynomials are used. The components of the PCE are a solution of an equation that does not depend on the nature of uncertainty. Once this equation is solved, the post-processing of the data gives the moments of the random response when the uncertainties are random or gives the response interval when the variables are fuzzy. With the PCE approach, it is also possible to deal with mixed uncertainty, when some parameters are random and others are fuzzy. The results provide a fuzzy description of the response statistical moments.
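
    A minimal sketch of a Legendre PCE for a single uniform random variable, with an illustrative response function rather than the dynamical system studied in the paper: coefficients are obtained by Gauss-Legendre quadrature, and the mean and variance are read directly from them.

```python
# Sketch: project y(xi), xi ~ U(-1, 1), onto Legendre polynomial chaos and
# recover the response moments from the coefficients. Model is illustrative.
import numpy as np
from numpy.polynomial import legendre as L

def pce_coeffs(model, order, n_quad=64):
    """Coefficients a_k of y(xi) = sum_k a_k P_k(xi) for xi uniform on [-1, 1]."""
    x, w = L.leggauss(n_quad)                        # Gauss-Legendre nodes/weights
    y = model(x)
    a = []
    for k in range(order + 1):
        Pk = L.legval(x, [0] * k + [1])              # k-th Legendre polynomial
        a.append(np.sum(w * y * Pk) / np.sum(w * Pk * Pk))   # <y,P_k>/<P_k,P_k>
    return np.array(a)

model = lambda xi: np.exp(0.5 * xi)                  # illustrative response
a = pce_coeffs(model, order=6)

norms = 1.0 / (2 * np.arange(a.size) + 1)            # E[P_k^2] under U(-1, 1)
mean, var = a[0], np.sum(a[1:] ** 2 * norms[1:])
print(f"PCE  mean={mean:.6f}  var={var:.6f}")

xi = np.random.default_rng(3).uniform(-1, 1, 200_000)   # Monte Carlo check
print(f"MC   mean={model(xi).mean():.6f}  var={model(xi).var():.6f}")
```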

  17. Secure self-calibrating quantum random-bit generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fiorentino, M.; Santori, C.; Spillane, S. M.

    2007-03-15

    Random-bit generators (RBGs) are key components of a variety of information processing applications ranging from simulations to cryptography. In particular, cryptographic systems require 'strong' RBGs that produce high-entropy bit sequences, but traditional software pseudo-RBGs have very low entropy content and therefore are relatively weak for cryptography. Hardware RBGs yield entropy from chaotic or quantum physical systems and therefore are expected to exhibit high entropy, but in current implementations their exact entropy content is unknown. Here we report a quantum random-bit generator (QRBG) that harvests entropy by measuring single-photon and entangled two-photon polarization states. We introduce and implement a quantum tomographic method to measure a lower bound on the 'min-entropy' of the system, and we employ this value to distill a truly random-bit sequence. This approach is secure: even if an attacker takes control of the source of optical states, a secure random sequence can be distilled.
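
    For reference, the min-entropy bound mentioned above is simply H_min = -log2(max_i p_i); the sketch below applies it to made-up outcome probabilities, whereas in the paper the bound is obtained from quantum state tomography.

```python
# Sketch: min-entropy of a discrete outcome distribution and the number of
# nearly uniform bits it certifies. Probabilities are hypothetical.
import math

def min_entropy(probs):
    return -math.log2(max(probs))

p = [0.55, 0.45]                       # hypothetical single-measurement outcomes
h_min = min_entropy(p)
n_samples = 1_000_000
print(f"H_min = {h_min:.3f} bits/sample -> "
      f"~{int(n_samples * h_min):,} extractable bits from {n_samples:,} samples")
```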

  18. Is killer whale dialect evolution random?

    PubMed

    Filatova, Olga A; Burdin, Alexandr M; Hoyt, Erich

    2013-10-01

    The killer whale is among the few species in which cultural change accumulates over many generations, leading to cumulative cultural evolution. Killer whales have group-specific vocal repertoires which are thought to be learned rather than being genetically coded. It is supposed that divergence between vocal repertoires of sister groups increases gradually over time due to random learning mistakes and innovations. In this case, the similarity of calls across groups must be correlated with pod relatedness and, consequently, with each other. In this study we tested this prediction by comparing the patterns of call similarity between matrilines of resident killer whales from Eastern Kamchatka. We calculated the similarity of seven components from three call types across 14 matrilines. In contrast to the theoretical predictions, matrilines formed different clusters on the dendrograms made by different calls and even by different components of the same call. We suggest three possible explanations for this phenomenon. First, the lack of agreement between similarity patterns of different components may be the result of constraints in the call structure. Second, it is possible that call components change in time with different speed and/or in different directions. Third, horizontal cultural transmission of call features may occur between matrilines. Copyright © 2013 Elsevier B.V. All rights reserved.

  19. [Gene method for inconsistent hydrological frequency calculation. I: Inheritance, variability and evolution principles of hydrological genes].

    PubMed

    Xie, Ping; Wu, Zi Yi; Zhao, Jiang Yan; Sang, Yan Fang; Chen, Jie

    2018-04-01

    A stochastic hydrological process is influenced by both stochastic and deterministic factors. A hydrological time series contains not only pure random components reflecting its inheritance characteristics, but also deterministic components reflecting variability characteristics, such as jump, trend, period, and stochastic dependence. As a result, the stochastic hydrological process presents complicated evolution phenomena and rules. To better understand these complicated phenomena and rules, this study described the inheritance and variability characteristics of an inconsistent hydrological series from two aspects: stochastic process simulation and time series analysis. In addition, several frequency analysis approaches for inconsistent time series were compared to reveal the main problems in inconsistency study. We then proposed a new concept of hydrological genes, originating from biological genes, to describe inconsistent hydrological processes. The hydrological genes were constructed using moment methods, such as general moments, weight function moments, probability weighted moments and L-moments. Meanwhile, the five components of a stochastic hydrological process, namely the jump, trend, periodic, dependence and pure random components, were defined as five hydrological bases. With this method, the inheritance and variability of inconsistent hydrological time series were considered jointly and the inheritance, variability and evolution principles were fully described. Our study contributes to revealing the inheritance, variability and evolution principles in the probability distribution of hydrological elements.
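
    A small sketch of sample L-moments computed from unbiased probability weighted moments, two of the moment families listed above. The formulas are the standard textbook estimators and the flow series is synthetic; this illustrates the ingredients, not the hydrological-gene construction itself.

```python
# Sketch: sample L-moments via unbiased probability weighted moments (PWMs).
import numpy as np

def l_moments(x):
    """Return (l1, l2, t3, t4): L-location, L-scale, L-skewness, L-kurtosis."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    b3 = np.sum((i - 1) * (i - 2) * (i - 3) / ((n - 1) * (n - 2) * (n - 3)) * x) / n
    l1, l2 = b0, 2 * b1 - b0
    l3, l4 = 6 * b2 - 6 * b1 + b0, 20 * b3 - 30 * b2 + 12 * b1 - b0
    return l1, l2, l3 / l2, l4 / l2

rng = np.random.default_rng(4)
flows = rng.gumbel(loc=100.0, scale=25.0, size=5000)   # synthetic annual maxima
l1, l2, t3, t4 = l_moments(flows)
print(f"l1={l1:.1f}  l2={l2:.1f}  L-skew={t3:.3f}  L-kurt={t4:.3f}")
# For a Gumbel distribution the theoretical L-skewness is ~0.170, L-kurtosis ~0.150.
```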

  20. Results of a faith-based weight loss intervention for black women.

    PubMed

    Fitzgibbon, Marian L; Stolley, Melinda R; Ganschow, Pamela; Schiffer, Linda; Wells, Anita; Simon, Nolanna; Dyer, Alan

    2005-10-01

    Obesity is a risk factor for a variety of chronic diseases. Although weight loss may reduce these risks, weight loss programs designed for black women have yielded mixed results. Studies suggest that religion/spirituality is a prominent component of black culture. Given this, the inclusion of religion/spirituality as an active component of a weight loss program may enhance the benefits of the program. The role of religion/spirituality, however, has not been specifically tested as a mechanism that enhances the weight loss process. This paper presents the results of "Faith on the Move," a randomized pilot study of a faith-based weight loss program for black women. The goals of the study were to estimate the effects of a 12-week culturally tailored, faith-based weight loss intervention on weight loss, dietary fat consumption and physical activity. The culturally tailored, faith-based weight loss intervention was compared to a culturally tailored weight loss intervention with no active faith component. Fifty-nine overweight/obese black women were randomized to one of the two interventions. Although the results were not statistically significant, the effect size suggests that the addition of the faith component improved results. These promising preliminary results will need to be tested in an adequately powered trial.

  1. JT15D simulated flight data evaluation

    NASA Technical Reports Server (NTRS)

    Holm, R. G.

    1984-01-01

    The noise characteristics of the JT15D turbofan engine were analyzed with the objectives of: (1) assessing the state-of-the-art ability to simulate flight acoustic data using test results acquired in wind tunnel and outdoor (turbulence-controlled) environments; and (2) predicting the farfield noise directivity of the blade passage frequency (BPF) tonal components using results from rotor-blade-mounted dynamic pressure instrumentation. Engine rotor tip speeds at subsonic, transonic, and supersonic conditions were evaluated. The ability to simulate flight results was generally within 2-3 dB for both outdoor and wind tunnel acoustic results. Some differences did occur in the broadband noise level and in the multiple-pure-tone harmonics at supersonic tip speeds. The prediction of blade passage frequency tone directivity from dynamic pressure measurements was accomplished for the three tip speed conditions. Predictions were made of the random and periodic components of the tone directivity. The technique for estimating the random tone component used hot-wire data to establish a correlation between dynamic pressure and turbulence intensity. This prediction overestimated the tone level by typically 10 dB, with the greatest overestimates occurring at supersonic conditions.

  2. Spatial Distribution of Phase Singularities in Optical Random Vector Waves.

    PubMed

    De Angelis, L; Alpeggiani, F; Di Falco, A; Kuipers, L

    2016-08-26

    Phase singularities are dislocations widely studied in optical fields as well as in other areas of physics. With experiment and theory we show that the vectorial nature of light affects the spatial distribution of phase singularities in random light fields. While in scalar random waves phase singularities exhibit spatial distributions reminiscent of particles in isotropic liquids, in vector fields their distribution for the different vector components becomes anisotropic due to the direct relation between propagation and field direction. By incorporating this relation in the theory for scalar fields by Berry and Dennis [Proc. R. Soc. A 456, 2059 (2000)], we quantitatively describe our experiments.

  3. Relaxation dynamics of maximally clustered networks

    NASA Astrophysics Data System (ADS)

    Klaise, Janis; Johnson, Samuel

    2018-01-01

    We study the relaxation dynamics of fully clustered networks (maximal number of triangles) to an unclustered state under two different edge dynamics—the double-edge swap, corresponding to degree-preserving randomization of the configuration model, and single edge replacement, corresponding to full randomization of the Erdős-Rényi random graph. We derive expressions for the time evolution of the degree distribution, edge multiplicity distribution and clustering coefficient. We show that under both dynamics networks undergo a continuous phase transition in which a giant connected component is formed. We calculate the position of the phase transition analytically using the Erdős-Rényi phenomenology.
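
    A quick numerical sketch of the first of the two dynamics (degree-preserving double-edge swaps) using networkx, starting from a highly clustered toy graph rather than a maximally clustered one: the clustering coefficient decays under swaps while the size of the largest connected component is monitored.

```python
# Sketch: relax a clustered graph by degree-preserving double-edge swaps and
# track clustering and the largest connected component. Toy graph, not the
# maximally clustered ensemble of the paper.
import networkx as nx

G = nx.connected_caveman_graph(50, 6)          # many triangles, one component
print(f"initial clustering: {nx.average_clustering(G):.3f}")

for step in range(5):
    nx.double_edge_swap(G, nswap=200, max_tries=20000, seed=step)
    giant = max(nx.connected_components(G), key=len)
    print(f"after {200 * (step + 1):4d} swaps: "
          f"clustering={nx.average_clustering(G):.3f}  "
          f"largest component={len(giant)}/{G.number_of_nodes()}")
```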

  4. Approximating SIR-B response characteristics and estimating wave height and wavelength for ocean imagery

    NASA Technical Reports Server (NTRS)

    Tilley, David G.

    1987-01-01

    NASA Space Shuttle Challenger SIR-B ocean scenes are used to derive directional wave spectra for which speckle noise is modeled as a function of Rayleigh random phase coherence downrange and Poisson random amplitude errors inherent in the Doppler measurement of along-track position. A Fourier filter that preserves SIR-B image phase relations is used to correct the stationary and dynamic response characteristics of the remote sensor and scene correlator, as well as to subtract an estimate of the speckle noise component. A two-dimensional map of sea surface elevation is obtained after the filtered image is corrected for both random and deterministic motions.

  5. The PX-EM algorithm for fast stable fitting of Henderson's mixed model

    PubMed Central

    Foulley, Jean-Louis; Van Dyk, David A

    2000-01-01

    This paper presents procedures for implementing the PX-EM algorithm of Liu, Rubin and Wu to compute REML estimates of variance-covariance components in Henderson's linear mixed models. The class of models considered encompasses several correlated random factors having the same vector length, e.g., as in random regression models for longitudinal data analysis and in sire-maternal grandsire models for genetic evaluation. Numerical examples are presented to illustrate the procedures. Much better results in terms of convergence characteristics (number of iterations and time required for convergence) are obtained for PX-EM relative to the basic EM algorithm in the random regression models. PMID:14736399

  6. Carbon atom clusters in random covalent networks: PAHs as an integral component of interstellar HAC

    NASA Astrophysics Data System (ADS)

    Jones, A. P.

    1990-11-01

    Using a random covalent network (RCN) model for the structure of hydrogenated amorphous carbon (HAC) and the available laboratory data, it is shown that aromatic species are a natural consequence of the structure of amorphous carbons formed in the laboratory. Amorphous carbons in the interstellar medium are therefore likely to contain a significant fraction of polycyclic aromatic hydrocarbon (PAH) species within the 'amorphous' matrix making up these materials. This aromatic component can be produced in situ during the accretion of gas-phase carbon species onto grains in the interstellar medium under hydrogen-poor conditions, or subsequent to deposition as a result of photolysis (photodarkening). The fraction of interstellar carbon present in HAC in the form of PAHs, based upon a RCN model, is consistent with the observed unidentified infrared (UIR) emission features.

  7. Efficient Ab initio Modeling of Random Multicomponent Alloys

    DOE PAGES

    Jiang, Chao; Uberuaga, Blas P.

    2016-03-08

    In this Letter we present a novel small set of ordered structures (SSOS) method that allows extremely efficient ab initio modeling of random multi-component alloys. Using inverse II-III spinel oxides and equiatomic quinary bcc (so-called high entropy) alloys as examples, we demonstrate that a SSOS can achieve the same accuracy as a large supercell or a well-converged cluster expansion, but with significantly reduced computational cost. In particular, because of this efficiency, a large number of quinary alloy compositions can be quickly screened, leading to the identification of several new possible high entropy alloy chemistries. Furthermore, the SSOS method developed here can be broadly useful for the rapid computational design of multi-component materials, especially those with a large number of alloying elements, a challenging problem for other approaches.

  8. Secure content objects

    DOEpatents

    Evans, William D [Cupertino, CA

    2009-02-24

    A secure content object protects electronic documents from unauthorized use. The secure content object includes an encrypted electronic document, a multi-key encryption table having at least one multi-key component, an encrypted header and a user interface device. The encrypted document is encrypted using a document encryption key associated with a multi-key encryption method. The encrypted header includes an encryption marker formed by a random number followed by a derivable variation of the same random number. The user interface device enables a user to input a user authorization. The user authorization is combined with each of the multi-key components in the multi-key encryption key table and used to try to decrypt the encrypted header. If the encryption marker is successfully decrypted, the electronic document may be decrypted. Multiple electronic documents or a document and annotations may be protected by the secure content object.
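
    A minimal sketch of the encryption-marker idea, under the assumption (not stated in the abstract) that the "derivable variation" is a truncated SHA-256 digest of the random number; recovering a well-formed marker after decryption then signals that the correct key was used.

```python
# Sketch: an encryption marker built as nonce || H(nonce). The hash-based
# "derivable variation" is an assumption for illustration.
import hashlib
import os

def make_marker(nonce_len=16):
    nonce = os.urandom(nonce_len)
    return nonce + hashlib.sha256(nonce).digest()[:nonce_len]

def marker_is_valid(marker, nonce_len=16):
    nonce, variation = marker[:nonce_len], marker[nonce_len:]
    return variation == hashlib.sha256(nonce).digest()[:nonce_len]

m = make_marker()
print(marker_is_valid(m))                       # True: header decrypted correctly
print(marker_is_valid(os.urandom(len(m))))      # almost surely False: wrong key
```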

  9. [Working memory and executive control: inhibitory processes in updating and random generation tasks].

    PubMed

    Macizo, Pedro; Bajo, Teresa; Soriano, Maria Felipa

    2006-02-01

    Working Memory (WM) span predicts subjects' performance in control executive tasks and, in addition, it has been related to the capacity to inhibit irrelevant information. In this paper we investigate the role of WM span in two executive tasks focusing our attention on inhibitory components of both tasks. High and low span participants recalled targets words rejecting irrelevant items at the same time (Experiment 1) and they generated random numbers (Experiment 2). Results showed a clear relation between WM span and performance in both tasks. In addition, analyses of intrusion errors (Experiment 1) and stereotyped responses (Experiment 2) indicated that high span individuals were able to efficiently use the inhibitory component implied in both tasks. The pattern of data provides support to the relation between WM span and control executive tasks through an inhibitory mechanism.

  10. Genetic parameters of Legendre polynomials for first parity lactation curves.

    PubMed

    Pool, M H; Janss, L L; Meuwissen, T H

    2000-11-01

    Variance components of the covariance function coefficients in a random regression test-day model were estimated by Legendre polynomials up to a fifth order for first-parity records of Dutch dairy cows using Gibbs sampling. Two Legendre polynomials of equal order were used to model the random part of the lactation curve, one for the genetic component and one for permanent environment. Test-day records from cows registered between 1990 to 1996 and collected by regular milk recording were available. For the data set, 23,700 complete lactations were selected from 475 herds sired by 262 sires. Because the application of a random regression model is limited by computing capacity, we investigated the minimum order needed to fit the variance structure in the data sufficiently. Predictions of genetic and permanent environmental variance structures were compared with bivariate estimates on 30-d intervals. A third-order or higher polynomial modeled the shape of variance curves over DIM with sufficient accuracy for the genetic and permanent environment part. Also, the genetic correlation structure was fitted with sufficient accuracy by a third-order polynomial, but, for the permanent environmental component, a fourth order was needed. Because equal orders are suggested in the literature, a fourth-order Legendre polynomial is recommended in this study. However, a rank of three for the genetic covariance matrix and of four for permanent environment allows a simpler covariance function with a reduced number of parameters based on the eigenvalues and eigenvectors.
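
    A small sketch of how a random-regression covariance function is evaluated once a coefficient covariance matrix is available: G(t_i, t_j) = phi(t_i)' K phi(t_j), with phi a normalized Legendre basis on standardized days in milk. The matrix K below is made up and the basis is truncated to three coefficients for brevity; in the study K is estimated by Gibbs sampling at higher orders.

```python
# Sketch: Legendre covariance function over days in milk from a hypothetical
# coefficient covariance matrix K.
import numpy as np
from numpy.polynomial.legendre import legvander

def legendre_basis(dim, order, dim_min=5, dim_max=305):
    """Normalized Legendre basis at days in milk standardized to [-1, 1]."""
    t = 2.0 * (np.asarray(dim, float) - dim_min) / (dim_max - dim_min) - 1.0
    phi = legvander(t, order)                          # columns P_0 ... P_order
    return phi * np.sqrt((2 * np.arange(order + 1) + 1) / 2.0)

K = np.array([[3.0, 0.5, -0.2],                        # hypothetical (co)variances
              [0.5, 1.0,  0.1],                        # of the regression coefficients
              [-0.2, 0.1, 0.4]])

dim = np.array([5, 65, 155, 245, 305])
Phi = legendre_basis(dim, order=2)
G = Phi @ K @ Phi.T                                    # (co)variances across DIM

with np.printoptions(precision=2, suppress=True):
    print(G)                                           # diagonal: variance along lactation
```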

  11. The assisted prediction modelling frame with hybridisation and ensemble for business risk forecasting and an implementation

    NASA Astrophysics Data System (ADS)

    Li, Hui; Hong, Lu-Yao; Zhou, Qing; Yu, Hai-Jie

    2015-08-01

    The business failure of numerous companies results in financial crises. The high social costs associated with such crises have led people to search for effective tools for business risk prediction, among which the support vector machine is very effective. Several modelling means, including single-technique modelling, hybrid modelling, and ensemble modelling, have been suggested for forecasting business risk with support vector machines. However, the existing literature seldom focuses on a general modelling frame for business risk prediction, and seldom investigates performance differences among different modelling means. We reviewed research on forecasting business risk with support vector machines, proposed the general assisted prediction modelling frame with hybridisation and ensemble (APMF-WHAE), and finally investigated the use of principal components analysis, support vector machines, random sampling, and group decision under the general frame in forecasting business risk. Under the APMF-WHAE frame with the support vector machine as the base predictive model, four specific predictive models were produced, namely, a pure support vector machine, a hybrid support vector machine involving principal components analysis, a support vector machine ensemble involving random sampling and group decision, and an ensemble of hybrid support vector machines using group decision to integrate various hybrid support vector machines built on variables produced from principal components analysis and samples from random sampling. The experimental results indicate that the hybrid support vector machine and the ensemble of hybrid support vector machines produced better performance than the pure support vector machine and the support vector machine ensemble.
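
    A compact scikit-learn sketch, on synthetic data, of two of the modelling means compared above: a hybrid model (principal components feeding a support vector machine) and an ensemble of such hybrids built by random sampling with majority vote. The data set, dimensions and hyperparameters are illustrative, not those of the paper.

```python
# Sketch: hybrid PCA+SVM versus an ensemble of hybrid PCA+SVM models.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=30, n_informative=8,
                           random_state=0)          # stand-in for financial ratios

hybrid = make_pipeline(StandardScaler(), PCA(n_components=8), SVC(kernel="rbf"))
ensemble = BaggingClassifier(estimator=hybrid, n_estimators=25,   # sklearn >= 1.2;
                             max_samples=0.8, random_state=0)     # older: base_estimator=

for name, model in [("hybrid PCA+SVM", hybrid), ("ensemble of hybrids", ensemble)]:
    print(f"{name:22s} CV accuracy ~ {cross_val_score(model, X, y, cv=5).mean():.3f}")
```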

  12. Public perceptions of key performance indicators of healthcare in Alberta, Canada.

    PubMed

    Northcott, Herbert C; Harvey, Michael D

    2012-06-01

    To examine the relationship between public perceptions of key performance indicators assessing various aspects of the health-care system. Cross-sequential survey research. Annual telephone surveys of random samples of adult Albertans selected by random digit dialing and stratified according to age, sex and region (n = 4000 for each survey year). The survey questionnaires included single-item measures of key performance indicators to assess public perceptions of availability, accessibility, quality, outcome and satisfaction with healthcare. Cronbach's α and factor analysis were used to assess the relationship between key performance indicators focusing on the health-care system overall and on a recent interaction with the health-care system. The province of Alberta, Canada during the years 1996-2004. Four thousand adults randomly selected each survey year. Survey questions measuring public perceptions of healthcare availability, accessibility, quality, outcome and satisfaction with healthcare. Factor analysis identified two principal components with key performance indicators focusing on the health system overall loading most strongly on the first component and key performance indicators focusing on the most recent health-care encounter loading most strongly on the second component. Assessments of the quality of care most recently received, accessibility of that care and perceived outcome of care tended to be higher than the more general assessments of overall health system quality and accessibility. Assessments of specific health-care encounters and more general assessments of the overall health-care system, while related, nevertheless comprise separate dimensions for health-care evaluation.

  13. On Digital Simulation of Multicorrelated Random Processes and Its Applications. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Sinha, A. K.

    1973-01-01

    Two methods are described to simulate, on a digital computer, a set of correlated, stationary, and Gaussian time series with zero mean from the given matrix of power spectral densities and cross spectral densities. The first method is based upon trigonometric series with random amplitudes and deterministic phase angles. The random amplitudes are generated by using a standard random number generator subroutine. An example is given which corresponds to three components of wind velocities at two different spatial locations for a total of six correlated time series. In the second method, the whole process is carried out using the Fast Fourier Transform approach. This method gives more accurate results and works about twenty times faster for a set of six correlated time series.
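
    A minimal sketch of the second (FFT-based) approach for two correlated series, assuming an illustrative 2 x 2 cross-spectral density matrix: at each frequency the matrix is Cholesky-factorized, applied to complex Gaussian variates, and the result is inverse-FFT-ed to the time domain.

```python
# Sketch: simulate two correlated, stationary Gaussian series from a prescribed
# cross-spectral density (CSD) matrix. Target spectra are illustrative.
import numpy as np

rng = np.random.default_rng(5)
n, dt = 4096, 0.01                        # samples and sampling interval (s)
freqs = np.fft.rfftfreq(n, dt)

# Illustrative one-sided CSD: two low-pass spectra with coherence 0.6.
s11 = 1.0 / (1.0 + (freqs / 2.0) ** 2)
s22 = 0.5 / (1.0 + (freqs / 5.0) ** 2)
s12 = 0.6 * np.sqrt(s11 * s22)

x_f = np.zeros((2, freqs.size), dtype=complex)
for k in range(1, freqs.size - 1):        # skip DC and Nyquist bins (left at zero)
    S = np.array([[s11[k], s12[k]], [s12[k], s22[k]]])
    H = np.linalg.cholesky(S)             # S = H H^T for this real, positive-definite S
    z = (rng.standard_normal(2) + 1j * rng.standard_normal(2)) / np.sqrt(2)
    x_f[:, k] = H @ z

scale = np.sqrt(n / (2 * dt))             # makes irfft output match the target spectra
x = np.fft.irfft(scale * x_f, n=n, axis=1)

print(f"sample correlation between the two series: {np.corrcoef(x)[0, 1]:.2f}")
```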

  14. Stochastic stability of parametrically excited random systems

    NASA Astrophysics Data System (ADS)

    Labou, M.

    2004-01-01

    Multidegree-of-freedom dynamic systems subjected to parametric excitation are analyzed for stochastic stability. The variation of excitation intensity with time is described by the sum of a harmonic function and a stationary random process. The stability boundaries are determined by the stochastic averaging method. The effect of random parametric excitation on the stability of trivial solutions of systems of differential equations for the moments of phase variables is studied. It is assumed that the frequency of harmonic component falls within the region of combination resonances. Stability conditions for the first and second moments are obtained. It turns out that additional parametric excitation may have a stabilizing or destabilizing effect, depending on the values of certain parameters of random excitation. As an example, the stability of a beam in plane bending is analyzed.

  15. Medium scale carbon nanotube thin film integrated circuits on flexible plastic substrates

    DOEpatents

    Rogers, John A; Cao, Qing; Alam, Muhammad; Pimparkar, Ninad

    2015-02-03

    The present invention provides device component geometries and fabrication strategies for enhancing the electronic performance of electronic devices based on thin films of randomly oriented or partially aligned semiconducting nanotubes. In certain aspects, devices and methods of the present invention incorporate a patterned layer of randomly oriented or partially aligned carbon nanotubes, such as one or more interconnected SWNT networks, providing a semiconductor channel exhibiting improved electronic properties relative to conventional nanotube-based electronic systems.

  16. A randomized clinical trial of theory-based activities for the behavioral symptoms of dementia in nursing home residents.

    PubMed

    Kolanowski, Ann; Litaker, Mark; Buettner, Lin; Moeller, Joyel; Costa, Paul T

    2011-06-01

    To test the main and interactive effects of activities derived from the Need-Driven Dementia-Compromised Behavior model for responding to behavioral symptoms in nursing home residents. Randomized double-blind clinical trial. Nine community-based nursing homes. One hundred twenty-eight cognitively impaired residents randomly assigned to activities adjusted to functional level (FL) (n=32), personality style of interest (PSI) (n=33), functional level and personality style of interest (FL+PSI) (n=31), or active control (AC) (n=32). Three weeks of activities provided twice daily. Agitation, passivity, engagement, affect, and mood assessed from video recordings and real-time observations during baseline, intervention, random times outside of intervention, and 1 week after intervention. All treatments improved outcomes during intervention except mood, which worsened under AC. During intervention the PSI group demonstrated greater engagement, alertness, and attention than the other groups; the FL+PSI group demonstrated greater pleasure. During random times, engagement returned to baseline levels except in the FL group in which it decreased. There was also less agitation and passivity in groups with a component adjusted to PSI. One week after the intervention, mood, anxiety, and passivity improved over baseline; significantly less pleasure was displayed after withdrawal of treatment. The hypothesis that activities adjusted to FL+PSI would improve behavioral outcomes to a greater extent than partially adjusted or nonadjusted activities was partially supported. PSI is a critical component of individualized activity prescription. © 2011, Copyright the Authors. Journal compilation © 2011, The American Geriatrics Society.

  17. Random Regression Models Using Legendre Polynomials to Estimate Genetic Parameters for Test-day Milk Protein Yields in Iranian Holstein Dairy Cattle.

    PubMed

    Naserkheil, Masoumeh; Miraie-Ashtiani, Seyed Reza; Nejati-Javaremi, Ardeshir; Son, Jihyun; Lee, Deukhwan

    2016-12-01

    The objective of this study was to estimate the genetic parameters of milk protein yields in Iranian Holstein dairy cattle. A total of 1,112,082 test-day milk protein yield records of 167,269 first lactation Holstein cows, calved from 1990 to 2010, were analyzed. Estimates of the variance components, heritability, and genetic correlations for milk protein yields were obtained using a random regression test-day model. Milking times, herd, age of recording, year, and month of recording were included as fixed effects in the model. Additive genetic and permanent environmental random effects for the lactation curve were taken into account by applying orthogonal Legendre polynomials of the fourth order in the model. The lowest and highest additive genetic variances were estimated at the beginning and end of lactation, respectively. Permanent environmental variance was higher at both extremes. Residual variance was lowest at the middle of the lactation and contrarily, heritability increased during this period. Maximum heritability was found during the 12th lactation stage (0.213±0.007). Genetic, permanent, and phenotypic correlations among test-days decreased as the interval between consecutive test-days increased. A relatively large data set was used in this study; therefore, the estimated (co)variance components for random regression coefficients could be used for national genetic evaluation of dairy cattle in Iran.

  18. Random Regression Models Using Legendre Polynomials to Estimate Genetic Parameters for Test-day Milk Protein Yields in Iranian Holstein Dairy Cattle

    PubMed Central

    Naserkheil, Masoumeh; Miraie-Ashtiani, Seyed Reza; Nejati-Javaremi, Ardeshir; Son, Jihyun; Lee, Deukhwan

    2016-01-01

    The objective of this study was to estimate the genetic parameters of milk protein yields in Iranian Holstein dairy cattle. A total of 1,112,082 test-day milk protein yield records of 167,269 first lactation Holstein cows, calved from 1990 to 2010, were analyzed. Estimates of the variance components, heritability, and genetic correlations for milk protein yields were obtained using a random regression test-day model. Milking times, herd, age of recording, year, and month of recording were included as fixed effects in the model. Additive genetic and permanent environmental random effects for the lactation curve were taken into account by applying orthogonal Legendre polynomials of the fourth order in the model. The lowest and highest additive genetic variances were estimated at the beginning and end of lactation, respectively. Permanent environmental variance was higher at both extremes. Residual variance was lowest at the middle of the lactation and contrarily, heritability increased during this period. Maximum heritability was found during the 12th lactation stage (0.213±0.007). Genetic, permanent, and phenotypic correlations among test-days decreased as the interval between consecutive test-days increased. A relatively large data set was used in this study; therefore, the estimated (co)variance components for random regression coefficients could be used for national genetic evaluation of dairy cattle in Iran. PMID:26954192

  19. Methods to Limit Attrition in Longitudinal Comparative Effectiveness Trials: Lessons from the Lithium Use for Bipolar Disorder (LiTMUS) Study

    PubMed Central

    Sylvia, Louisa G.; Reilly-Harrington, Noreen A.; Leon, Andrew C.; Kansky, Christine I.; Ketter, Terence A.; Calabrese, Joseph R.; Thase, Michael E.; Bowden, Charles L.; Friedman, Edward S.; Ostacher, Michael J.; Iosifescu, Dan V.; Severe, Joanne; Nierenberg, Andrew A.

    2013-01-01

    Background: High attrition rates, which occur frequently in longitudinal clinical trials of interventions for bipolar disorder, limit the interpretation of results. Purpose: The aim of this article is to present design approaches that limited attrition in the Lithium Use for Bipolar Disorder (LiTMUS) Study. Methods: LiTMUS was a 6-month randomized, longitudinal, multi-site comparative effectiveness trial that examined bipolar participants who were at least mildly ill. Participants were randomized to either low to moderate doses of lithium or no lithium, in addition to other treatments needed for mood stabilization administered in a guideline-informed, empirically supported, and personalized fashion (N=283). Results: Components of the study design that may have contributed to the low attrition rate of the study included the use of: (1) an intent-to-treat design; (2) a randomized adjunctive single-blind design; (3) participant reimbursement; (4) intent-to-attend the next study visit (including a discussion of attendance obstacles when intention is low); (5) quality care with limited participant burden; and (6) target windows for study visits. Limitations: Site differences and the effectiveness and tolerability data have not yet been analyzed. Conclusions: These components of the LiTMUS study design may have reduced the probability of attrition, which would inform the design of future randomized clinical effectiveness trials. PMID:22076437

  20. Characterization of cancer and normal tissue fluorescence through wavelet transform and singular value decomposition

    NASA Astrophysics Data System (ADS)

    Gharekhan, Anita H.; Biswal, Nrusingh C.; Gupta, Sharad; Pradhan, Asima; Sureshkumar, M. B.; Panigrahi, Prasanta K.

    2008-02-01

    The statistical and characteristic features of the polarized fluorescence spectra from cancer, normal and benign human breast tissues are studied through wavelet transform and singular value decomposition. The discrete wavelets enabled one to isolate high- and low-frequency spectral fluctuations, which revealed substantial randomization in the cancerous tissues, not present in the normal cases. In particular, the fluctuations fitted well with a Gaussian distribution for the cancerous tissues in the perpendicular component. One finds non-Gaussian behavior for normal and benign tissues' spectral variations. The study of the difference of intensities in parallel and perpendicular channels, which is free from the diffusive component, revealed weak fluorescence activity in the 630 nm domain for the cancerous tissues. This may be ascribable to porphyrin emission. The role of both scatterers and fluorophores in the observed minor intensity peak for the cancer case is experimentally confirmed through tissue-phantom experiments. The continuous Morlet wavelet also highlighted this domain for the cancerous tissue fluorescence spectra. Correlation in the spectral fluctuation is further studied in different tissue types through singular value decomposition. Apart from identifying different domains of spectral activity for diseased and non-diseased tissues, we found random matrix support for the spectral fluctuations. The small eigenvalues of the perpendicular polarized fluorescence spectra of cancerous tissues fitted remarkably well with the random matrix prediction for Gaussian random variables, confirming our observations about spectral fluctuations in the wavelet domain.

  1. Design of the Women's Health Initiative clinical trial and observational study. The Women's Health Initiative Study Group.

    PubMed

    1998-02-01

    The Women's Health Initiative (WHI) is a large and complex clinical investigation of strategies for the prevention and control of some of the most common causes of morbidity and mortality among postmenopausal women, including cancer, cardiovascular disease, and osteoporotic fractures. The WHI was initiated in 1992, with a planned completion date of 2007. Postmenopausal women ranging in age from 50 to 79 are enrolled at one of 40 WHI clinical centers nationwide into either a clinical trial (CT) that will include about 64,500 women or an observational study (OS) that will include about 100,000 women. The CT is designed to allow randomized controlled evaluation of three distinct interventions: a low-fat eating pattern, hypothesized to prevent breast cancer and colorectal cancer and, secondarily, coronary heart disease; hormone replacement therapy, hypothesized to reduce the risk of coronary heart disease and other cardiovascular diseases and, secondarily, to reduce the risk of hip and other fractures, with increased breast cancer risk as a possible adverse outcome; and calcium and vitamin D supplementation, hypothesized to prevent hip fractures and, secondarily, other fractures and colorectal cancer. Overall benefit-versus-risk assessment is a central focus in each of the three CT components. Women are screened for participation in one or both of the components--dietary modification (DM) or hormone replacement therapy (HRT)--of the CT, which will randomize 48,000 and 27,500 women, respectively. Women who prove to be ineligible for, or who are unwilling to enroll in, these CT components are invited to enroll in the OS. At their 1-year anniversary of randomization, CT women are invited to be further randomized into the calcium and vitamin D (CaD) trial component, which is projected to include 45,000 women. The average follow-up for women in either CT or OS is approximately 9 years. Concerted efforts are made to enroll women of racial and ethnic minority groups, with a target of 20% of overall enrollment in both the CT and OS. This article gives a brief description of the rationale for the interventions being studied in each of the CT components and for the inclusion of the OS component. Some detail is provided on specific study design choices, including eligibility criteria, recruitment strategy, and sample size, with attention to the partial factorial design of the CT. Some aspects of the CT monitoring approach are also outlined. The scientific and logistic complexity of the WHI implies particular leadership and management challenges. The WHI organization and committee structure employed to respond to these challenges is also briefly described.

  2. A randomized controlled pilot trial of modified whole blood versus component therapy in severely injured patients requiring large volume transfusions.

    PubMed

    Cotton, Bryan A; Podbielski, Jeanette; Camp, Elizabeth; Welch, Timothy; del Junco, Deborah; Bai, Yu; Hobbs, Rhonda; Scroggins, Jamie; Hartwell, Beth; Kozar, Rosemary A; Wade, Charles E; Holcomb, John B

    2013-10-01

    To determine whether resuscitation of severely injured patients with modified whole blood (mWB) resulted in fewer overall transfusions compared with component (COMP) therapy. For decades, whole blood (WB) was the primary product for resuscitating patients in hemorrhagic shock. After dramatic advances in blood banking in the 1970s, blood donor centers began supplying hospitals with individual components [red blood cell (RBC), plasma, platelets] and removed WB as an available product. However, no studies of efficacy or hemostatic potential in trauma patients were performed before doing so. Single-center, randomized trial of severely injured patients predicted to require large-volume transfusion. Pregnant patients, prisoners, those younger than 18 years, or those with more than 20% total body surface area (TBSA) burns were excluded. Patients were randomized to mWB (1 U mWB) or COMP therapy (1 U RBC + 1 U plasma) immediately on arrival. Each group also received 1 U platelets (apheresis or prepooled random donor) for every 6 U of mWB or 6 U of RBC + 6 U plasma. The study was performed under the Exception From Informed Consent (Food and Drug Administration, 21 CFR 50.24). The primary outcome was 24-hour transfusion volume. A total of 107 patients were randomized (55 mWB, 52 COMP therapy) over 14 months. There were no differences in demographics, arrival vitals or laboratory values, injury severity, or mechanism. Transfusions were similar between groups (intent-to-treat analysis). However, when excluding patients with severe brain injury (sensitivity analysis), the mWB group received less 24-hour RBC (median 3 vs 6, P = 0.02), plasma (4 vs 6, P = 0.02), platelets (0 vs 3, P = 0.09), and total products (11 vs 16, P = 0.02). Compared with COMP therapy, WB did not reduce transfusion volumes in severely injured patients predicted to receive massive transfusion. However, in the sensitivity analysis (patients without severe brain injuries), use of mWB significantly reduced transfusion volumes, achieving the prespecified endpoint of this initial pilot study.

  3. CERAMENT treatment of fracture defects (CERTiFy): protocol for a prospective, multicenter, randomized study investigating the use of CERAMENT™ BONE VOID FILLER in tibial plateau fractures

    PubMed Central

    2014-01-01

    Background Bone graft substitutes are widely used for reconstruction of posttraumatic bone defects. However, their clinical significance in comparison to autologous bone grafting, the gold standard in the reconstruction of larger bone defects, still remains under debate. This prospective, randomized, controlled clinical study investigates the differences in pain, quality of life, and cost of care in the treatment of tibial plateau fracture-associated bone defects using either autologous bone grafting or bioresorbable hydroxyapatite/calcium sulphate cement (CERAMENT™|BONE VOID FILLER (CBVF)). Methods/Design CERTiFy (CERAMENT™ Treatment of Fracture Defects) is a prospective, multicenter, controlled, randomized trial. We plan to enroll 136 patients with fresh traumatic depression fractures of the proximal tibia (types AO 41-B2 and AO 41-B3) in 13 participating centers in Germany. Patients will be randomized to receive either autologous iliac crest bone graft or CBVF after reduction and osteosynthesis of the fracture to reconstruct the subchondral bone defect and prevent the subsidence of the articular surface. The primary outcome is the SF-12 Physical Component Summary at week 26. The co-primary endpoint is the pain level 26 weeks after surgery measured by a visual analog scale. The SF-12 Mental Component Summary after 26 weeks and costs of care will serve as key secondary endpoints. The study is designed to show non-inferiority of the CBVF treatment to the autologous iliac crest bone graft with respect to the physical component of quality of life. The pain level at 26 weeks after surgery is expected to be lower in the CERAMENT bone void filler treatment group. Discussion CERTiFy is the first randomized multicenter clinical trial designed to compare quality of life, pain, and cost of care in the use of the CBVF and the autologous iliac crest bone graft in the treatment of tibial plateau fractures. The results are expected to influence future treatment recommendations. Trial registration number ClinicalTrials.gov: NCT01828905 PMID:24606670

  4. Turbulence and fire-spotting effects into wild-land fire simulators

    NASA Astrophysics Data System (ADS)

    Kaur, Inderpreet; Mentrelli, Andrea; Bosseur, Frédéric; Filippi, Jean-Baptiste; Pagnini, Gianni

    2016-10-01

    This paper presents a mathematical approach to model the effects and the role of phenomena with a random nature, such as turbulence and fire-spotting, within existing wildfire simulators. The formulation proposes that the propagation of the fire-front is the sum of a drifting component (obtained from an existing wildfire simulator without turbulence and fire-spotting) and a random fluctuating component. The modelling of the random effects is embodied in a probability density function accounting for the fluctuations around the fire perimeter given by the drifting component. In the past, this formulation has been applied to include these random effects in a wildfire simulator based on an Eulerian moving interface method, namely the Level Set Method (LSM); in this paper the same formulation is adapted for a wildfire simulator based on a Lagrangian front-tracking technique, namely the Discrete Event System Specification (DEVS). The main highlight of the present study is the comparison of the performance of a Lagrangian and an Eulerian moving interface method when applied to wild-land fire propagation. Simple idealised numerical experiments are used to investigate the potential applicability of the proposed formulation to DEVS and to compare its behaviour with respect to the LSM. The results show that the DEVS-based wildfire propagation model qualitatively improves its performance (e.g., reproducing flank and back fire, increased fire spread due to pre-heating of the fuel by hot air and firebrands, fire propagation across no-fuel zones, secondary fire generation, ...) when random effects are included according to the present formulation. The performance of the DEVS- and LSM-based wildfire models is comparable, and the only differences between the two arise from the geometrical construction of the direction of propagation. Though the results presented here lack a validation exercise and provide only a proof of concept, they point towards an intended operational use. The existing LSM- and DEVS-based operational simulators, WRF-SFIRE and ForeFire respectively, can serve as an ideal basis for such an extension.
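
    As a rough illustration of the drift-plus-fluctuation formulation summarized above, the sketch below grows a purely deterministic circular front and then smears it with an isotropic Gaussian probability density standing in for the turbulent fluctuations. The grid, the constant rate of spread, the diffusivity value, and the omission of fire-spotting are all assumptions made for illustration; they are not taken from the paper or from LSM/DEVS codes such as WRF-SFIRE or ForeFire.

```python
# Minimal sketch of the drift + random-fluctuation idea described above.
# Assumptions (not from the paper): a square grid, a circular drifting front,
# and an isotropic Gaussian PDF for the turbulent fluctuations whose standard
# deviation grows diffusively as sqrt(2*D*t). Fire-spotting is omitted.
import numpy as np
from scipy.ndimage import gaussian_filter

nx, dx = 200, 5.0                          # grid size and spacing [m]
x = (np.arange(nx) - nx // 2) * dx
X, Y = np.meshgrid(x, x)

def drifting_front(t, ros=0.5):
    """Burned indicator from a hypothetical deterministic simulator:
    a circle growing at a constant rate of spread `ros` [m/s]."""
    return (np.hypot(X, Y) <= ros * t).astype(float)

def effective_burned_fraction(t, D=2.0):
    """Smear the sharp drifting front with the Gaussian fluctuation PDF.
    The result is an 'effective fire' probability field in [0, 1]."""
    sigma_m = np.sqrt(2.0 * D * t)         # turbulent spread [m]
    return gaussian_filter(drifting_front(t), sigma=sigma_m / dx)

phi_eff = effective_burned_fraction(t=600.0)
print("cells with burn probability > 0.5:", int((phi_eff > 0.5).sum()))
```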

  5. Web-Based Smoking-Cessation Program

    PubMed Central

    Strecher, Victor J.; McClure, Jennifer B.; Alexander, Gwen L.; Chakraborty, Bibhas; Nair, Vijay N.; Konkel, Janine M.; Greene, Sarah M.; Collins, Linda M.; Carlier, Carola C.; Wiese, Cheryl J.; Little, Roderick J.; Pomerleau, Cynthia S.; Pomerleau, Ovide F.

    2009-01-01

    Background Initial trials of web-based smoking-cessation programs have generally been promising. The active components of these programs, however, are not well understood. This study aimed to (1) identify active psychosocial and communication components of a web-based smoking-cessation intervention and (2) examine the impact of increasing the tailoring depth on smoking cessation. Design Randomized fractional factorial design. Setting Two HMOs: Group Health in Washington State and Henry Ford Health System in Michigan. Participants 1866 smokers. Intervention A web-based smoking-cessation program plus nicotine patch. Five components of the intervention were randomized using a fractional factorial design: high- versus low-depth tailored success story, outcome expectation, and efficacy expectation messages; high- versus low-personalized source; and multiple versus single exposure to the intervention components. Measurements Primary outcome was 7 day point-prevalence abstinence at the 6-month follow-up. Findings Abstinence was most influenced by high-depth tailored success stories and a high-personalized message source. The cumulative assignment of the three tailoring depth factors also resulted in increasing the rates of 6-month cessation, demonstrating an effect of tailoring depth. Conclusions The study identified relevant components of smoking-cessation interventions that should be generalizable to other cessation interventions. The study also demonstrated the importance of higher-depth tailoring in smoking-cessation programs. Finally, the use of a novel fractional factorial design allowed efficient examination of the study aims. The rapidly changing interfaces, software, and capabilities of eHealth are likely to require such dynamic experimental approaches to intervention discovery. PMID:18407003
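
    The fractional factorial randomization described above can be illustrated with a small sketch. It builds the 16 cells of a half-fraction 2^(5-1) design by generating the fifth factor as the product of the other four (defining relation I = ABCDE) and assigns a participant to a random cell; the generator choice, factor names, and assignment routine are assumptions for illustration, not the study's actual allocation procedure.

```python
# Minimal sketch of a half-fraction (2^(5-1)) factorial assignment, assuming
# the defining relation I = ABCDE; the study's actual generators may differ.
# Factor names follow the abstract: story depth, outcome message depth,
# efficacy message depth, source personalization, exposure schedule.
from itertools import product
import random

factors = ["story_depth", "outcome_depth", "efficacy_depth",
           "personalized_source", "multiple_exposure"]

# Full factorial on the first four factors; the fifth is the product of the
# others (coded -1/+1), which yields the 16-run half fraction.
design = []
for a, b, c, d in product((-1, 1), repeat=4):
    design.append((a, b, c, d, a * b * c * d))

rng = random.Random(42)                     # reproducible assignment stream

def assign(participant_id):
    """Randomly assign a participant to one of the 16 design cells."""
    cell = rng.choice(design)
    levels = {f: ("high" if v == 1 else "low") for f, v in zip(factors, cell)}
    return participant_id, levels

print(len(design), "design cells")
print(assign("P0001"))
```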

  6. Integrating virtual reality with activity management for the treatment of fibromyalgia: acceptability and preliminary efficacy.

    PubMed

    Garcia-Palacios, Azucena; Herrero, Rocio; Vizcaíno, Yolanda; Belmonte, Miguel A; Castilla, Diana; Molinari, Guadalupe; Baños, Rosa Maria; Botella, Cristina

    2015-06-01

    Cognitive-behavioral therapies (CBT) for fibromyalgia syndrome (FMS) are important interventions in the management of this condition. Empirical evidence reports that although the results are promising, further research is needed to respond more appropriately to these patients. This study focuses on exploring the use of Virtual Reality (VR) as an adjunct to the activity management component. The aim of this study is to present the results of a small-sized randomized controlled trial to test the preliminary efficacy and acceptability of this component. The final sample was composed of 61 women diagnosed with FMS according to the American College of Rheumatology. The sample was randomly allocated to 2 conditions: VR treatment and treatment as usual. Participants in the VR condition achieved significant improvements in the primary outcome: disability measured with the FIQ. The improvement was also significant in secondary outcomes, such as perceived quality of life and some of the coping strategies included in the Chronic Pain Coping Inventory: task persistence and exercise. There were no differences in other secondary outcome measures like pain intensity and interference and depression. Participants reported high satisfaction with the VR component. The effects were related to the psychological aspects targeted in the treatment. The component was well accepted by FMS patients referred from a public hospital. These findings show that the VR component could be useful in the CBT treatment of FMS and encourage us to continue exploring the use of integrating VR with CBT interventions for the treatment of FMS.

  7. Comprehension of Randomization and Uncertainty in Cancer Clinical Trials Decision Making Among Rural, Appalachian Patients.

    PubMed

    Krieger, Janice L; Palmer-Wackerly, Angela; Dailey, Phokeng M; Krok-Schoen, Jessica L; Schoenberg, Nancy E; Paskett, Electra D

    2015-12-01

    Comprehension of randomization is a vital, but understudied, component of informed consent to participate in cancer randomized clinical trials (RCTs). This study examines patient comprehension of the randomization process as well as sources of ongoing uncertainty that may inhibit a patient's ability to provide informed consent to participate in RCTs. Cancer patients living in rural Appalachia who were offered an opportunity to participate in a cancer treatment RCT completed in-depth interviews and a brief survey. No systematic differences in randomization comprehension between patients who consented and those who declined participation in a cancer RCT were detected. Comprehension is conceptually distinct from uncertainty, with patients who had both high and low comprehension experiencing randomization-related uncertainty. Uncertainty about randomization was found to have cognitive and affective dimensions. Not all patients enrolling in RCTs have a sufficient understanding of the randomization process to provide informed consent. Healthcare providers need to be aware of the different types of randomization-related uncertainty. Efforts to improve informed consent to participate in RCTs should focus on having patients teach back their understanding of randomization. This practice could yield valuable information about the patient's cognitive and affective understanding of randomization as well as opportunities to correct misperceptions. Education about RCTs should reflect patient expectations of individualized care by explaining how all treatments being compared are appropriate to the specifics of a patient's disease.

  8. Probabilistic Finite Elements (PFEM) structural dynamics and fracture mechanics

    NASA Technical Reports Server (NTRS)

    Liu, Wing-Kam; Belytschko, Ted; Mani, A.; Besterfield, G.

    1989-01-01

    The purpose of this work is to develop computationally efficient methodologies for assessing the effects of randomness in loads, material properties, and other aspects of a problem by a finite element analysis. The resulting group of methods is called probabilistic finite elements (PFEM). The overall objective of this work is to develop methodologies whereby the lifetime of a component can be predicted, accounting for the variability in the material and geometry of the component, the loads, and other aspects of the environment; and the range of response expected in a particular scenario can be presented to the analyst in addition to the response itself. Emphasis has been placed on methods which are not statistical in character; that is, they do not involve Monte Carlo simulations. The reason for this choice of direction is that Monte Carlo simulations of complex nonlinear response require a tremendous amount of computation. The focus of efforts so far has been on nonlinear structural dynamics. However, in the continuation of this project, emphasis will be shifted to probabilistic fracture mechanics so that the effect of randomness in crack geometry and material properties can be studied interactively with the effect of random load and environment.
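
    In the non-sampling spirit described above, the sketch below propagates input randomness through a response function with a first-order (perturbation) expansion rather than Monte Carlo simulation. The example problem, an axial bar with random load and modulus, and all numerical values are assumptions for illustration; this is not the PFEM formulation itself.

```python
# Minimal sketch of a non-sampling (first-order second-moment) treatment of
# randomness, in the spirit of PFEM but not the authors' code.
# Example problem (assumed): tip displacement of an axial bar, u = F*L/(E*A),
# with random load F and random modulus E.
import numpy as np

L, A = 2.0, 1e-4                    # length [m], area [m^2] (deterministic)
mu = {"F": 1.0e4, "E": 200e9}       # mean load [N], mean modulus [Pa]
sd = {"F": 1.5e3, "E": 20e9}        # standard deviations

def u(F, E):
    return F * L / (E * A)

# First-order Taylor expansion about the means (F and E assumed independent):
# Var[u] ~ (du/dF)^2 Var[F] + (du/dE)^2 Var[E]
dudF = L / (mu["E"] * A)
dudE = -mu["F"] * L / (mu["E"] ** 2 * A)
mean_u = u(mu["F"], mu["E"])
var_u = dudF**2 * sd["F"]**2 + dudE**2 * sd["E"]**2

print(f"mean displacement {mean_u:.4e} m, c.o.v. {np.sqrt(var_u)/mean_u:.2%}")
```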

  9. Automation of vibroacoustic data bank for random vibration criteria development. [for the space shuttle and launch vehicles

    NASA Technical Reports Server (NTRS)

    Ferebee, R. C.

    1982-01-01

    A computerized data bank system was developed for utilization of large amounts of vibration and acoustic data to formulate component random vibration design and test criteria. This system consists of a computer, graphics tablet, and a dry-silver hard copier which are all desk-top type hardware and occupy minimal space. The data bank contains data from the Saturn V and Titan III flight and static test programs. The vibration and acoustic data are stored in the form of power spectral density and one-third octave band plots over the frequency range from 20 to 2000 Hz. The data was stored by digitizing each spectral plot by tracing with the graphics tablet. The digitized data was statistically analyzed and the resulting 97.5% probability levels were stored on tape along with the appropriate structural parameters. Standard extrapolation procedures were programmed for prediction of component random vibration test criteria for new launch vehicle and payload configurations. This automated vibroacoustic data bank system greatly enhances the speed and accuracy of formulating vibration test criteria. In the future, the data bank will be expanded to include all data acquired from the space shuttle flight test program.
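
    A minimal sketch of the enveloping step described above: the component test spectrum is taken as the frequency-by-frequency maximum of the applicable zonal PSDs, with an added qualification margin. The frequency grid, zone levels, and +3 dB margin are assumptions for illustration, not values from the data bank.

```python
# Minimal sketch of enveloping zonal PSDs into a component test spectrum.
# The frequency grid and example levels are assumptions for illustration;
# the data bank described above stores measured 97.5% probability levels.
import numpy as np

freqs = np.array([20, 50, 100, 300, 600, 1000, 2000], dtype=float)  # Hz
zone_psds = {                                            # g^2/Hz per zone
    "zone_A": np.array([0.01, 0.03, 0.06, 0.06, 0.04, 0.02, 0.01]),
    "zone_B": np.array([0.02, 0.02, 0.05, 0.08, 0.05, 0.03, 0.01]),
}

# The test criterion envelopes (takes the maximum of) all applicable zones
# at each frequency, often with an added qualification margin (assumed +3 dB).
envelope = np.max(np.vstack(list(zone_psds.values())), axis=0)
qual = envelope * 10 ** (3.0 / 10.0)

# Integrate the PSD (trapezoidal rule) to get the overall level of the spectrum.
area = np.sum(0.5 * (qual[1:] + qual[:-1]) * np.diff(freqs))
grms = np.sqrt(area)
print("qual PSD [g^2/Hz]:", np.round(qual, 3), " Grms:", round(grms, 2))
```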

  10. Diffusion of an Evidence-Based Smoking Cessation Intervention Through Facebook: A Randomized Controlled Trial.

    PubMed

    Cobb, Nathan K; Jacobs, Megan A; Wileyto, Paul; Valente, Thomas; Graham, Amanda L

    2016-06-01

    To examine the diffusion of an evidence-based smoking cessation application ("app") through Facebook social networks and identify specific intervention components that accelerate diffusion. Between December 2012 and October 2013, we recruited adult US smokers ("seeds") via Facebook advertising and randomized them to 1 of 12 app variants using a factorial design. App variants targeted components of diffusion: duration of use (t), "contagiousness" (β), and number of contacts (Z). The primary outcome was the reproductive ratio (R), defined as the number of individuals installing the app ("descendants") divided by the number of a seed participant's Facebook friends. We randomized 9042 smokers. App utilization metrics demonstrated between-variant differences in expected directions. The highest level of diffusion (R = 0.087) occurred when we combined active contagion strategies with strategies to increase duration of use (incidence rate ratio = 9.99; 95% confidence interval = 5.58, 17.91; P < .001). Involving nonsmokers did not affect diffusion. The maximal R value (0.087) is sufficient to increase the numbers of individuals receiving treatment if applied on a large scale. Online interventions can be designed a priori to spread through social networks.

  11. Multiaxis Rainflow Fatigue Methods for Nonstationary Vibration

    NASA Technical Reports Server (NTRS)

    Irvine, T.

    2016-01-01

    Mechanical structures and components may be subjected to cyclical loading conditions, including sine and random vibration. Such systems must be designed and tested accordingly. Rainflow cycle counting is the standard method for reducing a stress time history to a table of amplitude-cycle pairings prior to the Palmgren-Miner cumulative damage calculation. The damage calculation is straightforward for sinusoidal stress but very complicated for random stress, particularly for nonstationary vibration. This paper evaluates candidate methods and makes a recommendation for further study of a hybrid technique.
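
    A minimal sketch of the damage step described above: given a rainflow amplitude-cycle table, the Palmgren-Miner sum accumulates n_i/N_i over the bins using a Basquin-type S-N curve. The table entries and the S-N constants are assumptions for illustration.

```python
# Minimal sketch of the Palmgren-Miner damage sum applied to a rainflow
# amplitude-cycle table. The S-N curve constants are assumptions chosen
# only for illustration, not values from the paper.

# (stress amplitude [MPa], counted cycles) pairs, e.g. from rainflow counting
rainflow_table = [(40.0, 1.2e5), (80.0, 2.0e4), (120.0, 3.5e3), (160.0, 4.0e2)]

# Basquin-type S-N curve: N_f(S) = A * S**(-b)
A, b = 1.0e12, 3.0

def miner_damage(table):
    """Cumulative damage D = sum(n_i / N_i); failure is predicted at D >= 1."""
    return sum(n / (A * s ** (-b)) for s, n in table)

D = miner_damage(rainflow_table)
print(f"damage index D = {D:.3f}  (fraction of life consumed)")
```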

  12. Adaptive box filters for removal of random noise from digital images

    USGS Publications Warehouse

    Eliason, E.M.; McEwen, A.S.

    1990-01-01

    We have developed adaptive box-filtering algorithms to (1) remove random bit errors (pixel values with no relation to the image scene) and (2) smooth noisy data (pixels related to the image scene but with an additive or multiplicative component of noise). For both procedures, we use the standard deviation (??) of those pixels within a local box surrounding each pixel, hence they are adaptive filters. This technique effectively reduces speckle in radar images without eliminating fine details. -from Authors
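
    A minimal sketch of an adaptive box filter of the kind described above, assuming a simple rule: pixels deviating from the local box mean by more than k local standard deviations are treated as bit errors and replaced by the local mean. The box size, threshold, and test image are assumptions for illustration, not the authors' exact algorithm.

```python
# Minimal sketch of an adaptive box filter: flag pixels that deviate from the
# local mean by more than k local standard deviations, then replace them.
import numpy as np
from scipy.ndimage import uniform_filter

def adaptive_box_filter(img, box=7, k=4.0):
    img = img.astype(float)
    local_mean = uniform_filter(img, size=box)
    local_sqmean = uniform_filter(img * img, size=box)
    local_std = np.sqrt(np.maximum(local_sqmean - local_mean**2, 0.0))
    bad = np.abs(img - local_mean) > k * local_std
    out = img.copy()
    out[bad] = local_mean[bad]               # replace flagged pixels
    return out, bad

# Example: a smooth ramp with isolated random bit errors
rng = np.random.default_rng(0)
image = np.tile(np.linspace(0, 255, 128), (128, 1))
hits = rng.integers(0, 128, size=(50, 2))
image[hits[:, 0], hits[:, 1]] = 255.0        # speckle-like bit errors
cleaned, flagged = adaptive_box_filter(image)
print("pixels replaced:", int(flagged.sum()))
```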

  13. Probabilistic structural mechanics research for parallel processing computers

    NASA Technical Reports Server (NTRS)

    Sues, Robert H.; Chen, Heh-Chyun; Twisdale, Lawrence A.; Martin, William R.

    1991-01-01

    Aerospace structures and spacecraft are a complex assemblage of structural components that are subjected to a variety of complex, cyclic, and transient loading conditions. Significant modeling uncertainties are present in these structures, in addition to the inherent randomness of material properties and loads. To properly account for these uncertainties in evaluating and assessing the reliability of these components and structures, probabilistic structural mechanics (PSM) procedures must be used. Much research has focused on basic theory development and the development of approximate analytic solution methods in random vibrations and structural reliability. Practical application of PSM methods was hampered by their computationally intense nature. Solution of PSM problems requires repeated analyses of structures that are often large, and exhibit nonlinear and/or dynamic response behavior. These methods are all inherently parallel and ideally suited to implementation on parallel processing computers. New hardware architectures and innovative control software and solution methodologies are needed to make solution of large scale PSM problems practical.

  14. Random sex determination: When developmental noise tips the sex balance.

    PubMed

    Perrin, Nicolas

    2016-12-01

    Sex-determining factors are usually assumed to be either genetic or environmental. The present paper aims at drawing attention to the potential contribution of developmental noise, an important but often-neglected component of phenotypic variance. Mutual inhibitions between male and female pathways make sex a bistable equilibrium, such that random fluctuations in the expression of genes at the top of the cascade are sufficient to drive individual development toward one or the other stable state. Evolutionary modeling shows that stochastic sex determinants should resist elimination by genetic or environmental sex determinants under ecologically meaningful settings. On the empirical side, many sex-determination systems traditionally considered as environmental or polygenic actually provide evidence for large components of stochasticity. In reviewing the field, I argue that sex-determination systems should be considered within a three-ends continuum, rather than the classical two-ends continuum. © 2016 WILEY Periodicals, Inc.
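
    The bistable picture described above can be illustrated with a toy model (an assumption, not the paper's): two mutually inhibiting pathway activities are integrated with Euler-Maruyama noise from a symmetric initial state, and the random fluctuations alone decide which stable state, labeled here male or female, each simulated individual reaches.

```python
# Toy bistable switch resolved by developmental noise (illustrative only).
import numpy as np

rng = np.random.default_rng(1)

def develop(noise=0.2, a=2.0, n=4, dt=0.01, steps=3000):
    """Integrate two mutually inhibiting pathways from a symmetric state."""
    m = f = 1.0                                  # unstable symmetric point
    for _ in range(steps):
        dm = a / (1.0 + f**n) - m
        df = a / (1.0 + m**n) - f
        m += dm * dt + noise * np.sqrt(dt) * rng.normal()
        f += df * dt + noise * np.sqrt(dt) * rng.normal()
        m, f = max(m, 0.0), max(f, 0.0)          # keep activities non-negative
    return "male" if m > f else "female"

outcomes = [develop() for _ in range(200)]
print("fraction male:", outcomes.count("male") / len(outcomes))
```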

  15. Electromagnetic wave extinction within a forested canopy

    NASA Technical Reports Server (NTRS)

    Karam, M. A.; Fung, A. K.

    1989-01-01

    A forested canopy is modeled by a collection of randomly oriented finite-length cylinders shaded by randomly oriented and distributed disk- or needle-shaped leaves. For a plane wave exciting the forested canopy, the extinction coefficient is formulated in terms of the extinction cross sections (ECSs) in the local frame of each forest component and the Eulerian angles of orientation (used to describe the orientation of each component). The ECSs in the local frame for the finite-length cylinders used to model the branches are obtained by using the forward-scattering theorem. ECSs in the local frame for the disk- and needle-shaped leaves are obtained by the summation of the absorption and scattering cross-sections. The behavior of the extinction coefficients with the incidence angle is investigated numerically for both deciduous and coniferous forest. The dependencies of the extinction coefficients on the orientation of the leaves are illustrated numerically.

  16. Cooperation evolution in random multiplicative environments

    NASA Astrophysics Data System (ADS)

    Yaari, G.; Solomon, S.

    2010-02-01

    Most real-life systems have a random component: the multitude of endogenous and exogenous factors influencing them results in stochastic fluctuations of the parameters determining their dynamics. In many cases these empirical systems are subject to noise of a multiplicative nature. The special properties of multiplicative noise, as opposed to additive noise, have long been recognized. Although formally the difference between free additive and multiplicative random walks amounts to a move from normal to log-normal distributions, in practice the implications are far more far-reaching. While in an additive context the emergence and survival of cooperation require special conditions (especially some level of reward, punishment, or reciprocity), we find that in the multiplicative random context the emergence of cooperation is much more natural and effective. We study the various implications of this observation and its applications in several contexts.
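
    A minimal sketch of the multiplicative-noise argument above, under illustrative assumptions: each agent's wealth is multiplied by a random factor whose mean exceeds one but whose log-mean is negative, so lone agents typically decay, while agents that pool and share their wealth every step grow.

```python
# Solo vs. cooperative growth under multiplicative noise (illustrative only).
import numpy as np

rng = np.random.default_rng(7)
steps, n_agents = 2000, 50

def factors(size):
    """Random per-step growth factors: mean 1.05, but E[log r] < 0."""
    return rng.choice([0.6, 1.5], size=size)

solo = np.ones(n_agents)
coop = np.ones(n_agents)
for _ in range(steps):
    solo *= factors(n_agents)
    coop *= factors(n_agents)
    coop[:] = coop.mean()                 # cooperation: pool and redistribute

print("median solo wealth:", np.median(solo))
print("shared coop wealth:", coop[0])
```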

  17. Evaluating Personalized Feedback Intervention Framing with a Randomized Controlled Trial to Reduce Young Adult Alcohol-Related Sexual Risk Taking.

    PubMed

    Lewis, Melissa A; Rhew, Isaac C; Fairlie, Anne M; Swanson, Alex; Anderson, Judyth; Kaysen, Debra

    2018-03-06

    The purpose of this study was to evaluate personalized feedback intervention (PFI) framing with two web-delivered PFIs aimed to reduce young adult alcohol-related risky sexual behavior (RSB). Combined PFIs typically use an additive approach whereby independent components on drinking and components on RSB are presented without the discussion of the influence of alcohol on RSB. In contrast, an integrated PFI highlights the RSB-alcohol connection by presenting integrated alcohol and RSB components that focus on the role of intoxication as a barrier to risk reduction in sexual situations. In a randomized controlled trial, 402 (53.98% female) sexually active young adults aged 18-25 were randomly assigned to a combined PFI, an integrated PFI, or attention control. All assessment and intervention procedures were web-based. At the 1-month follow-up, those randomly assigned to the integrated condition had a lower likelihood of having any casual sex partners compared to those in the control group. At the 6-month follow-up, the combined condition had a lower likelihood of having any casual sex partners compared to those in the control group. When examining alcohol-related RSB, at the 1-month follow-up, both interventions showed a lower likelihood of any drinking prior to sex compared to the control group. When examining alcohol-related sexual consequences, results showed a reduction in the non-zero count of consequences in the integrated condition compared to the control at the 1-month follow-up. For typical drinks per week, those in the combined condition showed a greater reduction in the non-zero count of drinks than those in the control condition at the 1-month follow-up. While there were no significant differences between the two interventions, the current findings highlight the utility of two efficacious web-based alcohol and RSB interventions among a national sample of at-risk young adults.

  18. Increasing students' physical activity during school physical education: rationale and protocol for the SELF-FIT cluster randomized controlled trial.

    PubMed

    Ha, Amy S; Lonsdale, Chris; Lubans, David R; Ng, Johan Y Y

    2017-07-11

    The Self-determined Exercise and Learning For FITness (SELF-FIT) is a multi-component school-based intervention based on tenets of self-determination theory. SELF-FIT aims to increase students' moderate-to-vigorous physical activity (MVPA) during physical education lessons, and enhance their autonomous motivation towards fitness activities. Using a cluster randomized controlled trial, we aim to examine the effects of the intervention on students' MVPA during school physical education. Secondary 2 students (approximately aged 14 years) from 26 classes in 26 different schools will be recruited. After baseline assessments, students will be randomized into either the experimental group or wait-list control group using a matched-pair randomization. Teachers allocated to the experimental group will attend two half-day workshops and deliver the SELF-FIT intervention for 8 weeks. The main intervention components include training teachers to teach in more need supportive ways, and conducting fitness exercises using a fitness dice with interchangeable faces. Other motivational components, such as playing music during classes, are also included. The primary outcome of the trial is students' MVPA during PE lessons. Secondary outcomes include students' leisure-time MVPA, perceived need support from teachers, need satisfaction, autonomous motivation towards physical education, intention to engage in physical activity, psychological well-being, and health-related fitness (cardiorespiratory and muscular fitness). Quantitative data will be analyzed using multilevel modeling approaches. Focus group interviews will also be conducted to assess students' perceptions of the intervention. The SELF-FIT intervention has been designed to improve students' health and well-being by using high-intensity activities in classes delivered by teachers who have been trained to be autonomy needs supportive. If successful, scalable interventions based on SELF-FIT could be applied in physical education at large. The trial is registered at the Australia New Zealand Clinical Trial Registry (Trial ID: ACTRN12615000633583 ; date of registration: 18 June 2015).

  19. Polarization of gamma-ray burst afterglows in the synchrotron self-Compton process from a highly relativistic jet

    NASA Astrophysics Data System (ADS)

    Lin, Hai-Nan; Li, Xin; Chang, Zhe

    2017-04-01

    Linear polarization has been observed in both the prompt phase and afterglow of some bright gamma-ray bursts (GRBs). Polarization in the prompt phase spans a wide range, and may be as high as ≳ 50%. In the afterglow phase, however, it is usually below 10%. According to the standard fireball model, GRBs are produced by synchrotron radiation and Compton scattering process in a highly relativistic jet ejected from the central engine. It is widely accepted that prompt emissions occur in the internal shock when shells with different velocities collide with each other, and the magnetic field advected by the jet from the central engine can be ordered on a large scale. On the other hand, afterglows are often assumed to occur in the external shock when the jet collides with interstellar medium, and the magnetic field produced by the shock through, for example, Weibel instability, is possibly random. In this paper, we calculate the polarization properties of the synchrotron self-Compton process from a highly relativistic jet, in which the magnetic field is randomly distributed in the shock plane. We also consider the generalized situation where a uniform magnetic component perpendicular to the shock plane is superposed on the random magnetic component. We show that it is difficult for the polarization to be larger than 10% if the seed electrons are isotropic in the jet frame. This may account for the observed upper limit of polarization in the afterglow phase of GRBs. In addition, if the random and uniform magnetic components decay with time at different speeds, then the polarization angle may change 90° during the temporal evolution. Supported by Fundamental Research Funds for the Central Universities (106112016CDJCR301206), National Natural Science Fund of China (11375203, 11603005), and Open Project Program of State Key Laboratory of Theoretical Physics, Institute of Theoretical Physics, Chinese Academy of Sciences, China (Y5KF181CJ1)

  20. Use of regularized principal component analysis to model anatomical changes during head and neck radiation therapy for treatment adaptation and response assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chetvertkov, Mikhail A., E-mail: chetvertkov@wayne

    2016-10-15

    Purpose: To develop standard (SPCA) and regularized (RPCA) principal component analysis models of anatomical changes from daily cone beam CTs (CBCTs) of head and neck (H&N) patients and assess their potential use in adaptive radiation therapy, and for extracting quantitative information for treatment response assessment. Methods: Planning CT images of ten H&N patients were artificially deformed to create “digital phantom” images, which modeled systematic anatomical changes during radiation therapy. Artificial deformations closely mirrored patients’ actual deformations and were interpolated to generate 35 synthetic CBCTs, representing evolving anatomy over 35 fractions. Deformation vector fields (DVFs) were acquired between pCT and synthetic CBCTs (i.e., digital phantoms) and between pCT and clinical CBCTs. Patient-specific SPCA and RPCA models were built from these synthetic and clinical DVF sets. EigenDVFs (EDVFs) having the largest eigenvalues were hypothesized to capture the major anatomical deformations during treatment. Results: Principal component analysis (PCA) models achieve variable results, depending on the size and location of anatomical change. Random changes prevent or degrade PCA’s ability to detect underlying systematic change. RPCA is able to detect smaller systematic changes against the background of random fraction-to-fraction changes and is therefore more successful than SPCA at capturing systematic changes early in treatment. SPCA models were less successful at modeling systematic changes in clinical patient images, which contain a wider range of random motion than synthetic CBCTs, while the regularized approach was able to extract major modes of motion. Conclusions: Leading EDVFs from both PCA approaches have the potential to capture systematic anatomical change during H&N radiotherapy when systematic changes are large enough with respect to random fraction-to-fraction changes. In all cases the RPCA approach appears to be more reliable at capturing systematic changes, enabling dosimetric consequences to be projected once trends are established early in a treatment course, or based on population models.
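
    A minimal sketch of extracting leading eigenDVFs with standard PCA, on synthetic data that mixes a systematic trend with random fraction-to-fraction motion (all values are assumptions for illustration). A regularized or robust PCA, such as principal component pursuit, would replace the PCA step to reproduce the RPCA behaviour discussed above.

```python
# Minimal sketch: standard PCA on a set of per-fraction deformation vector
# fields, each flattened to one row. Synthetic data only; not the study's DVFs.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
n_fractions, n_voxels = 35, 5000

systematic = np.outer(np.linspace(0, 1, n_fractions), rng.normal(size=n_voxels))
random_motion = 0.5 * rng.normal(size=(n_fractions, n_voxels))
dvfs = systematic + random_motion            # synthetic "evolving anatomy"

pca = PCA(n_components=3)
scores = pca.fit_transform(dvfs)             # per-fraction weights of each EDVF
eigen_dvfs = pca.components_                 # leading modes of deformation
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
```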

  1. mActive: A Randomized Clinical Trial of an Automated mHealth Intervention for Physical Activity Promotion.

    PubMed

    Martin, Seth S; Feldman, David I; Blumenthal, Roger S; Jones, Steven R; Post, Wendy S; McKibben, Rebeccah A; Michos, Erin D; Ndumele, Chiadi E; Ratchford, Elizabeth V; Coresh, Josef; Blaha, Michael J

    2015-11-09

    We hypothesized that a fully automated mobile health (mHealth) intervention with tracking and texting components would increase physical activity. mActive enrolled smartphone users aged 18 to 69 years at an ambulatory cardiology center in Baltimore, Maryland. We used sequential randomization to evaluate the intervention's 2 core components. After establishing baseline activity during a blinded run-in (week 1), in phase I (weeks 2 to 3), we randomized 2:1 to unblinded versus blinded tracking. Unblinding allowed continuous access to activity data through a smartphone interface. In phase II (weeks 4 to 5), we randomized unblinded participants 1:1 to smart texts versus no texts. Smart texts provided smartphone-delivered coaching 3 times/day aimed at individual encouragement and fostering feedback loops by a fully automated, physician-written, theory-based algorithm using real-time activity data and 16 personal factors with a 10 000 steps/day goal. Forty-eight outpatients (46% women, 21% nonwhite) enrolled with a mean±SD age of 58±8 years, body mass index of 31±6 kg/m(2), and baseline activity of 9670±4350 steps/day. Daily activity data capture was 97.4%. The phase I change in activity was nonsignificantly higher in unblinded participants versus blinded controls by 1024 daily steps (95% confidence interval [CI], -580 to 2628; P=0.21). In phase II, participants receiving texts increased their daily steps over those not receiving texts by 2534 (95% CI, 1318 to 3750; P<0.001) and over blinded controls by 3376 (95% CI, 1951 to 4801; P<0.001). An automated tracking-texting intervention increased physical activity with, but not without, the texting component. These results support new mHealth tracking technologies as facilitators in need of behavior change drivers. URL: http://ClinicalTrials.gov/. Unique identifier: NCT01917812. © 2015 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.

  2. Mediation of Short and Longer Term Effects of an Intervention Program to Enhance Resilience in Immigrants from Mainland China to Hong Kong

    PubMed Central

    Yu, Nancy X.; Lam, T. H.; Liu, Iris K. F.; Stewart, Sunita M.

    2015-01-01

    Few clinical trials report on the active intervention components that result in outcome changes, although this is relevant to further improving efficacy and adapting effective programs to other populations. This paper presents follow-up analyses of a randomized controlled trial to enhance adaptation by increasing knowledge and personal resilience in two separate brief interventions with immigrants from Mainland China to Hong Kong (Yu et al., 2014b). The present paper extends our previous one by reporting on the longer-term effect of the interventions on personal resilience, and examining whether the Resilience intervention worked as designed to enhance personal resilience. The four-session intervention targeted self-efficacy, positive thinking, altruism, and goal setting. In this randomized controlled trial, 220 immigrants were randomly allocated to three arms: Resilience, Information (an active control arm), and Control arms. Participants completed measures of the four active components (self-efficacy, positive thinking, altruism, and goal setting) at baseline and immediately after the intervention. Personal resilience was assessed at baseline, post-intervention, and 3- and 6-month follow-ups. The results showed that the Resilience arm had greater increases in the four active components post-intervention. Changes in each of the four active components at the post-intervention assessment mediated enhanced personal resilience at the 3-month follow-up in the Resilience arm. Changes in self-efficacy and goal setting showed the largest effect size, and altruism showed the smallest. The arm effects of the Resilience intervention on enhanced personal resilience at the 6-month follow-up were mediated by increases of personal resilience post-intervention (Resilience vs. Control) and at the 3-month follow-up (Resilience vs. Information). These findings showed that these four active components were all mediators in this Resilience intervention. Our results, showing effects of short-term increases in personal resilience on longer-term increases in personal resilience in some models, suggest how changes in intervention outcomes might persist over time. PMID:26640446

  3. Hierarchical Regularity in Multi-Basin Dynamics on Protein Landscapes

    NASA Astrophysics Data System (ADS)

    Matsunaga, Yasuhiro; Kostov, Konstatin S.; Komatsuzaki, Tamiki

    2004-04-01

    We analyze time series of potential energy fluctuations and principal components at several temperatures for two kinds of off-lattice 46-bead models that have two distinctive energy landscapes. The less-frustrated "funnel" energy landscape brings about stronger nonstationary behavior of the potential energy fluctuations at the folding temperature than the other, rather frustrated energy landscape at the collapse temperature. By combining principal component analysis with an embedding nonlinear time-series analysis, it is shown that the fast fluctuations with small amplitudes of 70-80% of the principal components cause the time series to become almost "random" in only 100 simulation steps. However, the stochastic feature of the principal components tends to be suppressed through a wide range of degrees of freedom at the transition temperature.

  4. Short communication: Principal components and factor analytic models for test-day milk yield in Brazilian Holstein cattle.

    PubMed

    Bignardi, A B; El Faro, L; Rosa, G J M; Cardoso, V L; Machado, P F; Albuquerque, L G

    2012-04-01

    A total of 46,089 individual monthly test-day (TD) milk yields (10 test-days), from 7,331 complete first lactations of Holstein cattle were analyzed. A standard multivariate analysis (MV), reduced rank analyses fitting the first 2, 3, and 4 genetic principal components (PC2, PC3, PC4), and analyses that fitted a factor analytic structure considering 2, 3, and 4 factors (FAS2, FAS3, FAS4), were carried out. The models included the random animal genetic effect and fixed effects of the contemporary groups (herd-year-month of test-day), age of cow (linear and quadratic effects), and days in milk (linear effect). The residual covariance matrix was assumed to have full rank. Moreover, 2 random regression models were applied. Variance components were estimated by restricted maximum likelihood method. The heritability estimates ranged from 0.11 to 0.24. The genetic correlation estimates between TD obtained with the PC2 model were higher than those obtained with the MV model, especially on adjacent test-days at the end of lactation close to unity. The results indicate that for the data considered in this study, only 2 principal components are required to summarize the bulk of genetic variation among the 10 traits. Copyright © 2012 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  5. Scale-free models for the structure of business firm networks.

    PubMed

    Kitsak, Maksim; Riccaboni, Massimo; Havlin, Shlomo; Pammolli, Fabio; Stanley, H Eugene

    2010-03-01

    We study firm collaborations in the life sciences and the information and communication technology sectors. We propose an approach to characterize industrial leadership using k-shell decomposition, with top-ranking firms in terms of market value in higher k-shell layers. We find that the life sciences industry network consists of three distinct components: a "nucleus," which is a small well-connected subgraph, "tendrils," which are small subgraphs consisting of small degree nodes connected exclusively to the nucleus, and a "bulk body," which consists of the majority of nodes. Industrial leaders, i.e., the largest companies in terms of market value, are in the highest k-shells of both networks. The nucleus of the life sciences sector is very stable: once a firm enters the nucleus, it is likely to stay there for a long time. At the same time we do not observe the above three components in the information and communication technology sector. We also conduct a systematic study of these three components in random scale-free networks. Our results suggest that the sizes of the nucleus and the tendrils in scale-free networks decrease as the exponent of the power-law degree distribution λ increases, and disappear for λ ≥ 3. We compare the k-shell structure of random scale-free model networks with two real-world business firm networks in the life sciences and in the information and communication technology sectors. We argue that the observed behavior of the k-shell structure in the two industries is consistent with the coexistence of both preferential and random agreements in the evolution of industrial networks.
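
    A minimal sketch of the k-shell decomposition used above, applied to a Barabási-Albert scale-free model network rather than the firm networks (the graph size and parameters are assumptions for illustration).

```python
# Minimal sketch of a k-shell (k-core) decomposition on a scale-free graph.
import networkx as nx
from collections import Counter

G = nx.barabasi_albert_graph(n=2000, m=3, seed=0)

core_number = nx.core_number(G)            # k-shell index of every node
shells = Counter(core_number.values())     # size of each k-shell

k_max = max(core_number.values())
nucleus = [v for v, k in core_number.items() if k == k_max]
print("highest shell k =", k_max, "with", len(nucleus), "nodes")
print("shell sizes:", dict(sorted(shells.items())))
```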

  6. Preoperative exercise training prevents functional decline after lung resection surgery: a randomized, single-blind controlled trial.

    PubMed

    Sebio García, Raquel; Yáñez-Brage, Maria Isabel; Giménez Moolhuyzen, Esther; Salorio Riobo, Marta; Lista Paz, Ana; Borro Mate, Jose María

    2017-08-01

    To investigate the effects of a preoperative pulmonary rehabilitation programme in patients with lung cancer undergoing video-assisted thoracic surgery. Randomized, single-blind controlled trial. Teaching hospital. Patients with suspected or confirmed lung cancer undergoing video-assisted thoracic surgery. Participants were randomized to either a prehabilitation group or a control group. Participants in the prehabilitation group underwent a combination of moderate endurance and resistance training plus breathing exercises three to five times per week. The primary outcome of the study was exercise capacity. Secondary outcomes were muscle strength (Senior Fitness Test), health-related quality of life (Short-Form 36) and the postoperative outcomes. Patients were evaluated at baseline (before randomization), presurgery (only the prehabilitation group), after surgery and three months post-operatively. A total of 40 patients were randomized and 22 finished the study (10 in the prehabilitation group and 12 in the control group). Three patients were lost to follow-up at three months. After the training, there was a statistically significant improvement in exercise tolerance (+397 seconds, p = 0.0001), the physical summary component of the SF-36 (+4.4 points, p = 0.008) and muscle strength ( p < 0.01). There were no significant differences between groups after surgery. However, three months postoperatively, significant differences were found in the mean change of exercise capacity ( p = 0.005), physical summary component ( p = 0.001) and upper and lower body strength ( p = 0.045 and p = 0.002). A pulmonary rehabilitation programme before video-assisted thoracic surgery seems to improve patients' preoperative condition and may prevent functional decline after surgery. Clinical Registration Number: NCT01963923 (Registration date 10/10/2013).

  7. Randomized controlled trial of a comprehensive stroke education program for patients and caregivers.

    PubMed

    Rodgers, H; Atkinson, C; Bond, S; Suddes, M; Dobson, R; Curless, R

    1999-12-01

    We report the findings of a randomized controlled trial to determine the effectiveness of a multidisciplinary Stroke Education Program (SEP) for patients and their informal carers. Two hundred four patients admitted with acute stroke and their 176 informal carers were randomized to receive an invitation to the SEP or to receive conventional stroke unit care. The SEP consisted of one 1-hour small group educational session for inpatients followed by six 1-hour sessions after discharge. The primary outcome measure was patient- and carer-perceived health status (SF-36) at 6 months after stroke. Knowledge of stroke, satisfaction with services, emotional outcome, disability, and handicap were secondary outcome measures. Only 51 of 108 (47%) surviving patients randomized to the SEP completed the program, as did 20 of 93 (22%) informal carers of surviving patients. Perceived health status (Short Form 36 [SF-36] health survey) scores were similar for SEP patients and controls. Informal carers in the control group scored better on the social functioning component of the SF-36 than the SEP group (P=0.04). Patients and informal carers in the SEP group scored higher on the stroke knowledge scale than controls (patients, P=0.02; carers, P=0.01). Patients in the SEP group were more satisfied with the information that they had received about stroke (P=0.004). There were no differences in emotional or functional outcomes between groups. Although the SEP improved patient and informal carer knowledge about stroke and patient satisfaction with some components of stroke services, this was not associated with an improvement in their perceived health status. Indeed, the social functioning of informal carers randomized to the SEP was lower than in the control group.

  8. The gust-mitigating potential of flapping wings.

    PubMed

    Fisher, Alex; Ravi, Sridhar; Watkins, Simon; Watmuff, Jon; Wang, Chun; Liu, Hao; Petersen, Phred

    2016-08-02

    Nature's flapping-wing flyers are adept at negotiating highly turbulent flows across a wide range of scales. This is in part due to their ability to quickly detect and counteract disturbances to their flight path, but may also be assisted by an inherent aerodynamic property of flapping wings. In this study, we subject a mechanical flapping wing to replicated atmospheric turbulence across a range of flapping frequencies and turbulence intensities. By means of flow visualization and surface pressure measurements, we determine the salient effects of large-scale freestream turbulence on the flow field, and on the phase-averaged and fluctuating components of pressure and lift. It is shown that at lower flapping frequencies, turbulence dominates the instantaneous flow field, and the random fluctuating component of lift contributes significantly to the total lift. At higher flapping frequencies, kinematic forcing begins to dominate and the flow field becomes more consistent from cycle to cycle. Turbulence still modulates the flapping-induced flow field, as evidenced in particular by a variation in the timing and extent of leading edge vortex formation during the early downstroke. The random fluctuating component of lift contributes less to the total lift at these frequencies, providing evidence that flapping wings do indeed provide some inherent gust mitigation.

  9. Scaled Particle Theory for Multicomponent Hard Sphere Fluids Confined in Random Porous Media.

    PubMed

    Chen, W; Zhao, S L; Holovko, M; Chen, X S; Dong, W

    2016-06-23

    The formulation of scaled particle theory (SPT) is presented for a quite general model of fluids confined in a random porous medium, i.e., a multicomponent hard sphere (HS) fluid in a multicomponent hard sphere or a multicomponent overlapping hard sphere (OHS) matrix. The analytical expressions for pressure, Helmholtz free energy, and chemical potential are derived. The thermodynamic consistency of the proposed theory is established. Moreover, we show that there is an isomorphism between the SPT for a multicomponent system and that for a one-component system. Results from grand canonical ensemble Monte Carlo simulations are also presented for a binary HS mixture in a one-component HS or a one-component OHS matrix. The accuracy of various variants derived from the basic SPT formulation is appraised against the simulation results. Scaled particle theory, initially formulated for a bulk HS fluid, has not only provided an analytical tool for calculating thermodynamic properties of HS fluids but has also helped to gain very useful insight for elaborating other theoretical approaches such as the fundamental measure theory (FMT). We expect that the general SPT for multicomponent systems developed in this work can contribute to the study of confined fluids in a similar way.

  10. Covariance analyses of satellite-derived mesoscale wind fields

    NASA Technical Reports Server (NTRS)

    Maddox, R. A.; Vonder Haar, T. H.

    1979-01-01

    Statistical structure functions have been computed independently for nine satellite-derived mesoscale wind fields that were obtained on two different days. Small cumulus clouds were tracked at 5 min intervals, but since these clouds occurred primarily in the warm sectors of midlatitude cyclones the results cannot be considered representative of the circulations within cyclones in general. The field structure varied considerably with time and was especially affected if mesoscale features were observed. The wind fields on the 2 days studied were highly anisotropic with large gradients in structure occurring approximately normal to the mean flow. Structure function calculations for the combined set of satellite winds were used to estimate random error present in the fields. It is concluded for these data that the random error in vector winds derived from cumulus cloud tracking using high-frequency satellite data is less than 1.75 m/s. Spatial correlation functions were also computed for the nine data sets. Normalized correlation functions were considerably different for u and v components and decreased rapidly as data point separation increased for both components. The correlation functions for transverse and longitudinal components decreased less rapidly as data point separation increased.
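
    A minimal sketch of a second-order structure function, D(r) = <[u(x+r) - u(x)]^2>, computed for one wind component on a regular one-dimensional transect. The synthetic series and uniform spacing are assumptions for illustration; the study's cloud-track winds are irregularly spaced and two-dimensional.

```python
# Minimal sketch of a second-order structure function on a 1-D transect.
import numpy as np

rng = np.random.default_rng(5)
n, dx = 512, 10.0                               # samples and spacing [km]
u = np.cumsum(rng.normal(scale=0.3, size=n))    # correlated toy wind component

def structure_function(u, max_lag=50):
    """D(r) = mean squared increment of u at separation r = lag * dx."""
    lags = np.arange(1, max_lag + 1)
    D = np.array([np.mean((u[lag:] - u[:-lag]) ** 2) for lag in lags])
    return lags * dx, D

r, D = structure_function(u)
print("D(r) at 10, 100, 500 km:", D[0], D[9], D[49])
```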

  11. Spatiotemporal hurdle models for zero-inflated count data: Exploring trends in emergency department visits.

    PubMed

    Neelon, Brian; Chang, Howard H; Ling, Qiang; Hastings, Nicole S

    2016-12-01

    Motivated by a study exploring spatiotemporal trends in emergency department use, we develop a class of two-part hurdle models for the analysis of zero-inflated areal count data. The models consist of two components-one for the probability of any emergency department use and one for the number of emergency department visits given use. Through a hierarchical structure, the models incorporate both patient- and region-level predictors, as well as spatially and temporally correlated random effects for each model component. The random effects are assigned multivariate conditionally autoregressive priors, which induce dependence between the components and provide spatial and temporal smoothing across adjacent spatial units and time periods, resulting in improved inferences. To accommodate potential overdispersion, we consider a range of parametric specifications for the positive counts, including truncated negative binomial and generalized Poisson distributions. We adopt a Bayesian inferential approach, and posterior computation is handled conveniently within standard Bayesian software. Our results indicate that the negative binomial and generalized Poisson hurdle models vastly outperform the Poisson hurdle model, demonstrating that overdispersed hurdle models provide a useful approach to analyzing zero-inflated spatiotemporal data. © The Author(s) 2014.
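
    A minimal sketch of the two-part hurdle likelihood described above: a Bernoulli part for any emergency department use and a zero-truncated Poisson part for the number of visits given use, fitted by maximum likelihood on synthetic data. The spatial and temporal random effects, the Bayesian machinery, and the negative binomial and generalized Poisson variants of the actual model are omitted; covariates and coefficients are assumptions for illustration.

```python
# Minimal non-spatial hurdle model: logistic "any use" part plus a
# zero-truncated Poisson part for positive counts, fitted by maximum likelihood.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit, gammaln

rng = np.random.default_rng(11)
n = 2000
x = rng.normal(size=n)                       # one covariate for both parts
p_use = expit(-0.5 + 0.8 * x)                # true hurdle probabilities
lam = np.exp(0.3 + 0.4 * x)                  # true Poisson means
y = np.zeros(n, dtype=int)
users = rng.random(n) < p_use
for i in np.where(users)[0]:                 # zero-truncated draws by rejection
    while y[i] == 0:
        y[i] = rng.poisson(lam[i])

def negloglik(theta):
    a0, a1, b0, b1 = theta
    p = expit(a0 + a1 * x)
    mu = np.exp(b0 + b1 * x)
    ll = np.where(y == 0, np.log1p(-p), np.log(p))        # hurdle part
    pos = y > 0
    # zero-truncated Poisson: y*log(mu) - mu - log(y!) - log(1 - exp(-mu))
    ll_pos = (y[pos] * np.log(mu[pos]) - mu[pos] - gammaln(y[pos] + 1)
              - np.log1p(-np.exp(-mu[pos])))
    return -(ll.sum() + ll_pos.sum())

fit = minimize(negloglik, x0=np.zeros(4), method="BFGS")
print("estimates (a0, a1, b0, b1):", np.round(fit.x, 2))
```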

  12. Neutron monitor generated data distributions in quantum variational Monte Carlo

    NASA Astrophysics Data System (ADS)

    Kussainov, A. S.; Pya, N.

    2016-08-01

    We have assessed the potential application of neutron monitor hardware as a random number generator for normal and uniform distributions. The data tables from the acquisition channels with no extreme changes in the signal level were chosen as the retrospective model. The stochastic component was extracted by fitting the raw data with splines and then subtracting the fit. Scaling the extracted data to zero mean and unit variance is sufficient to obtain a stable standard normal random variate. The distributions under consideration pass all available normality tests. Inverse transform sampling is suggested as a source of uniform random numbers. A variational Monte Carlo method for the quantum harmonic oscillator was used to test the quality of our random numbers. If the data delivery rate is important and the conventional one-minute-resolution neutron count is insufficient, the hardware could instead serve as an efficient seed generator feeding a faster algorithmic random number generator, or a buffer could be created.
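
    A minimal sketch of the detrend-standardize-transform recipe described above, applied to a synthetic one-minute count series (an assumption; real data would come from the acquisition channels): a smoothing spline estimates the trend, the residual is scaled to zero mean and unit variance, and the standard normal CDF maps it to uniform variates.

```python
# Minimal sketch: spline detrending, standardization, and inverse-transform
# mapping to uniform variates, on synthetic one-minute counts.
import numpy as np
from scipy.interpolate import UnivariateSpline
from scipy.stats import norm

rng = np.random.default_rng(2)
t = np.arange(1440.0)                              # one day of 1-min counts
counts = 6000 + 30 * np.sin(2 * np.pi * t / 1440) + rng.normal(0, 8, t.size)

trend = UnivariateSpline(t, counts, s=len(t) * 64)(t)   # smooth spline fit
resid = counts - trend                                   # stochastic component

z = (resid - resid.mean()) / resid.std()   # ~ standard normal variates
u = norm.cdf(z)                            # ~ Uniform(0, 1) variates

print("sample normal variates:", np.round(z[:5], 2))
print("sample uniform variates:", np.round(u[:5], 2))
```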

  13. Exposure assessment models for elemental components of particulate matter in an urban environment: A comparison of regression and random forest approaches

    NASA Astrophysics Data System (ADS)

    Brokamp, Cole; Jandarov, Roman; Rao, M. B.; LeMasters, Grace; Ryan, Patrick

    2017-02-01

    Exposure assessment for elemental components of particulate matter (PM) using land use modeling is a complex problem due to the high spatial and temporal variations in pollutant concentrations at the local scale. Land use regression (LUR) models may fail to capture complex interactions and non-linear relationships between pollutant concentrations and land use variables. The increasing availability of big spatial data and machine learning methods presents an opportunity for improvement in PM exposure assessment models. In this manuscript, our objective was to develop a novel land use random forest (LURF) model and compare its accuracy and precision to a LUR model for elemental components of PM in the urban city of Cincinnati, Ohio. PM smaller than 2.5 μm (PM2.5) and eleven elemental components were measured at 24 sampling stations from the Cincinnati Childhood Allergy and Air Pollution Study (CCAAPS). Over 50 different predictors associated with transportation, physical features, community socioeconomic characteristics, greenspace, land cover, and emission point sources were used to construct LUR and LURF models. Cross validation was used to quantify and compare model performance. LURF and LUR models were created for aluminum (Al), copper (Cu), iron (Fe), potassium (K), manganese (Mn), nickel (Ni), lead (Pb), sulfur (S), silicon (Si), vanadium (V), zinc (Zn), and total PM2.5 in the CCAAPS study area. LURF utilized a more diverse and greater number of predictors than LUR, and the LURF models for Al, K, Mn, Pb, Si, Zn, TRAP, and PM2.5 all showed a decrease in fractional predictive error of at least 5% compared to their LUR models. LURF models for Al, Cu, Fe, K, Mn, Pb, Si, Zn, TRAP, and PM2.5 all had a cross validated fractional predictive error less than 30%. Furthermore, LUR models showed a differential exposure assessment bias and had a higher prediction error variance. Random forest and other machine learning methods may provide more accurate exposure assessment.
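
    A minimal sketch comparing a linear land use regression with a land use random forest under leave-one-out cross validation, on synthetic data with a non-linear, interacting exposure surface. The 24 stations, ten predictors, and error metric are assumptions for illustration, not the CCAAPS data or the authors' exact fractional predictive error definition.

```python
# Minimal sketch: linear regression vs. random forest with leave-one-out CV.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(8)
n_sites, n_predictors = 24, 10
X = rng.normal(size=(n_sites, n_predictors))          # land-use variables
# a non-linear, interacting "true" exposure surface plus noise
y = 2 * X[:, 0] + np.maximum(X[:, 1], 0) * X[:, 2] + 0.3 * rng.normal(size=n_sites)

loo = LeaveOneOut()
for name, model in [("LUR ", LinearRegression()),
                    ("LURF", RandomForestRegressor(n_estimators=500, random_state=0))]:
    pred = cross_val_predict(model, X, y, cv=loo)
    frac_err = np.mean(np.abs(pred - y)) / np.mean(np.abs(y))
    print(f"{name} cross-validated fractional error: {frac_err:.2f}")
```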

  14. Exposure assessment models for elemental components of particulate matter in an urban environment: A comparison of regression and random forest approaches.

    PubMed

    Brokamp, Cole; Jandarov, Roman; Rao, M B; LeMasters, Grace; Ryan, Patrick

    2017-02-01

    Exposure assessment for elemental components of particulate matter (PM) using land use modeling is a complex problem due to the high spatial and temporal variations in pollutant concentrations at the local scale. Land use regression (LUR) models may fail to capture complex interactions and non-linear relationships between pollutant concentrations and land use variables. The increasing availability of big spatial data and machine learning methods presents an opportunity for improvement in PM exposure assessment models. In this manuscript, our objective was to develop a novel land use random forest (LURF) model and compare its accuracy and precision to a LUR model for elemental components of PM in the urban city of Cincinnati, Ohio. PM smaller than 2.5 μm (PM2.5) and eleven elemental components were measured at 24 sampling stations from the Cincinnati Childhood Allergy and Air Pollution Study (CCAAPS). Over 50 different predictors associated with transportation, physical features, community socioeconomic characteristics, greenspace, land cover, and emission point sources were used to construct LUR and LURF models. Cross validation was used to quantify and compare model performance. LURF and LUR models were created for aluminum (Al), copper (Cu), iron (Fe), potassium (K), manganese (Mn), nickel (Ni), lead (Pb), sulfur (S), silicon (Si), vanadium (V), zinc (Zn), and total PM2.5 in the CCAAPS study area. LURF utilized a more diverse and greater number of predictors than LUR, and the LURF models for Al, K, Mn, Pb, Si, Zn, TRAP, and PM2.5 all showed a decrease in fractional predictive error of at least 5% compared to their LUR models. LURF models for Al, Cu, Fe, K, Mn, Pb, Si, Zn, TRAP, and PM2.5 all had a cross validated fractional predictive error less than 30%. Furthermore, LUR models showed a differential exposure assessment bias and had a higher prediction error variance. Random forest and other machine learning methods may provide more accurate exposure assessment.

  15. Design and methodology of a community-based cluster-randomized controlled trial for dietary behaviour change in rural Kerala.

    PubMed

    Daivadanam, Meena; Wahlstrom, Rolf; Sundari Ravindran, T K; Sarma, P S; Sivasankaran, S; Thankappan, K R

    2013-07-17

    Interventions targeting lifestyle-related risk factors and non-communicable diseases have contributed to the mainstream knowledge necessary for action. However, there are gaps in how this knowledge can be translated for practical day-to-day use in complex multicultural settings like that in India. Here, we describe the design of the Behavioural Intervention for Diet study, which was developed as a community-based intervention to change dietary behaviour among middle-income households in rural Kerala. This was a cluster-randomized controlled trial to assess the effectiveness of a sequential stage-matched intervention to bring about dietary behaviour change by targeting the procurement and consumption of five dietary components: fruits, vegetables, salt, sugar, and oil. Following a step-wise process of pairing and exclusion of outliers, six out of 22 administrative units in the northern part of Trivandrum district, Kerala state were randomly selected and allocated to intervention or control arms. Trained community volunteers carried out the data collection and intervention delivery. An innovative tool was developed to assess household readiness-to-change, and a household measurement kit and easy formulas were introduced to facilitate the practical side of behaviour change. The 1-year intervention included a household component with sequential stage-matched intervention strategies at 0, 6, and 12 months along with counselling sessions, telephonic reminders, and home visits and a community component with general awareness sessions in the intervention arm. Households in the control arm received information on recommended levels of intake of the five dietary components and general dietary information leaflets. Formative research provided the knowledge to contextualise the design of the study in accordance with socio-cultural aspects, felt needs of the community, and the ground realities associated with existing dietary procurement, preparation, and consumption patterns. The study also addressed two key issues, namely the central role of the household as the decision unit and the long-term sustainability through the use of existing local and administrative networks and community volunteers.

  16. Exposure assessment models for elemental components of particulate matter in an urban environment: A comparison of regression and random forest approaches

    PubMed Central

    Brokamp, Cole; Jandarov, Roman; Rao, M.B.; LeMasters, Grace; Ryan, Patrick

    2017-01-01

    Exposure assessment for elemental components of particulate matter (PM) using land use modeling is a complex problem due to the high spatial and temporal variations in pollutant concentrations at the local scale. Land use regression (LUR) models may fail to capture complex interactions and non-linear relationships between pollutant concentrations and land use variables. The increasing availability of big spatial data and machine learning methods presents an opportunity for improvement in PM exposure assessment models. In this manuscript, our objective was to develop a novel land use random forest (LURF) model and compare its accuracy and precision to a LUR model for elemental components of PM in the urban city of Cincinnati, Ohio. PM smaller than 2.5 μm (PM2.5) and eleven elemental components were measured at 24 sampling stations from the Cincinnati Childhood Allergy and Air Pollution Study (CCAAPS). Over 50 different predictors associated with transportation, physical features, community socioeconomic characteristics, greenspace, land cover, and emission point sources were used to construct LUR and LURF models. Cross-validation was used to quantify and compare model performance. LURF and LUR models were created for aluminum (Al), copper (Cu), iron (Fe), potassium (K), manganese (Mn), nickel (Ni), lead (Pb), sulfur (S), silicon (Si), vanadium (V), zinc (Zn), and total PM2.5 in the CCAAPS study area. LURF utilized a more diverse and greater number of predictors than LUR, and LURF models for Al, K, Mn, Pb, Si, Zn, TRAP, and PM2.5 all showed a decrease in fractional predictive error of at least 5% compared to their LUR counterparts. LURF models for Al, Cu, Fe, K, Mn, Pb, Si, Zn, TRAP, and PM2.5 all had a cross-validated fractional predictive error of less than 30%. Furthermore, LUR models showed a differential exposure assessment bias and had a higher prediction error variance. Random forest and other machine learning methods may provide more accurate exposure assessment. PMID:28959135
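
    To illustrate the comparison described above, the following sketch (not the authors' code) fits a linear land use regression and a random forest on the same site-level predictor matrix and scores both with leave-one-out cross-validation, which is a natural choice when only 24 monitoring sites are available. The predictor matrix X, the measured concentrations y, and the fractional-error definition are placeholder assumptions.

      # Illustrative sketch: land use regression (LUR) vs. land use random forest (LURF),
      # scored by leave-one-out cross-validation. All data below are random placeholders.
      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.model_selection import LeaveOneOut, cross_val_predict

      rng = np.random.default_rng(0)
      X = rng.random((24, 50))                          # 24 sampling stations x 50 land use predictors
      y = rng.lognormal(mean=0.0, sigma=0.5, size=24)   # e.g. one elemental PM2.5 component

      loo = LeaveOneOut()
      pred_lur = cross_val_predict(LinearRegression(), X, y, cv=loo)
      pred_lurf = cross_val_predict(RandomForestRegressor(n_estimators=500, random_state=0),
                                    X, y, cv=loo)

      def fractional_predictive_error(y_true, y_pred):
          # root-mean-square cross-validated error expressed as a fraction of the mean observation
          return np.sqrt(np.mean((y_true - y_pred) ** 2)) / np.mean(y_true)

      print("LUR  fractional error:", fractional_predictive_error(y, pred_lur))
      print("LURF fractional error:", fractional_predictive_error(y, pred_lurf))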

  17. Imageless navigation system does not improve component rotational alignment in total knee arthroplasty.

    PubMed

    Cheng, Tao; Zhang, Guoyou; Zhang, Xianlong

    2011-12-01

    The aim of computer-assisted surgery is to improve accuracy and limit the range of surgical variability. However, a worldwide debate exists regarding the importance and usefulness of computer-assisted navigation for total knee arthroplasty (TKA). The main purpose of this study is to summarize and compare the radiographic outcomes of TKA performed using imageless computer-assisted navigation compared with conventional techniques. An electronic search of PubMed, EMBASE, Web of Science, and Cochrane library databases was made, in addition to manual search of major orthopedic journals. A meta-analysis of 29 quasi-randomized/randomized controlled trials (quasi-RCTs/RCTs) and 11 prospective comparative studies was conducted through a random effects model. Additional a priori sources of clinical heterogeneity were evaluated by subgroup analysis with regard to radiographic methods. When the outlier cut-off value of lower limb axis was defined as ±2° or ±3° from the neutral, the postoperative full-length radiographs demonstrated that the risk ratio was 0.54 or 0.39, respectively, which were in favor of the navigated group. When the cut-off value used for the alignment in the coronal and sagittal plane was 2° or 3°, imageless navigation significantly reduced the outlier rate of the femoral and tibial components compared with the conventional group. Notably, computed tomography scans demonstrated no statistically significant differences between the two groups regarding the outliers in the rotational alignment of the femoral and tibial components; however, there was strong statistical heterogeneity. Our results indicated that imageless computer-assisted navigation systems improve lower limb axis and component orientation in the coronal and sagittal planes, but not the rotational alignment in TKA. Further multiple-center clinical trials with long-term follow-up are needed to determine differences in the clinical and functional outcomes of knee arthroplasties performed using computer-assisted techniques. Copyright © 2011 Elsevier Inc. All rights reserved.

  18. Effects of 5 Weeks of Bench Press Training on Muscle Synergies: A Randomized Controlled Study.

    PubMed

    Kristiansen, Mathias; Samani, Afshin; Madeleine, Pascal; Hansen, Ernst A

    2016-07-01

    Kristiansen, M, Samani, A, Madeleine, P, and Hansen, EA. Effects of 5 weeks of bench press training on muscle synergies: A randomized controlled study. J Strength Cond Res 30(7): 1948-1959, 2016. The ability to perform forceful muscle contractions has important implications in sports performance and in activities of daily living. However, there is a lack of knowledge on adaptations in intermuscular coordination after strength training. The purpose of this study was therefore to assess muscle synergies before and after 5 weeks of bench press training. Thirty untrained male subjects were randomly allocated to a training group (TRA) or a control group (CON). After the pretest, TRA completed 5 weeks of bench press training, before completing a posttest, whereas subjects in CON continued their normal life. During test sessions, surface electromyography (EMG) was recorded from 13 different muscles. Muscle synergies were extracted from EMG data using nonnegative matrix factorization. To evaluate differences between pretest and posttest, we performed a cross-correlation analysis and a cross-validation analysis, in which the synergy components extracted in the pretest session were recomputed, using the fixed synergy components from the posttest session. Two muscle synergies accounted for 90% of the total variance and reflected the concentric and eccentric phase, respectively. TRA significantly increased 3 repetition maximum in bench press by 19.0% (25th; 75th percentile, 10.3%; 21.7%) (p < 0.001), whereas no change occurred in CON. No significant differences were observed in synergy components between groups. However, decreases in correlation values for intragroup comparisons in TRA may suggest that the synergy components changed, whereas this was not the case in CON. Strength and conditioning professionals may consider monitoring changes in muscle synergies in training and rehabilitation programs as a way to benchmark changes in intermuscular coordination.
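
    As a rough illustration of the synergy extraction step (not the study's actual pipeline), the sketch below factorizes a placeholder EMG envelope matrix with nonnegative matrix factorization and reports the variance accounted for by a two-synergy reconstruction; the matrix dimensions and preprocessing are assumptions.

      # Illustrative sketch: extracting two muscle synergies from rectified, low-pass-filtered
      # EMG envelopes with nonnegative matrix factorization. The EMG matrix is placeholder data.
      import numpy as np
      from sklearn.decomposition import NMF

      rng = np.random.default_rng(1)
      emg = rng.random((13, 1000))          # 13 muscles x 1000 time samples, non-negative envelopes

      nmf = NMF(n_components=2, init="nndsvda", max_iter=1000)
      W = nmf.fit_transform(emg)            # 13 x 2: muscle weightings (synergy vectors)
      H = nmf.components_                   # 2 x 1000: synergy activation coefficients

      # Variance accounted for (VAF) by the two-synergy reconstruction
      vaf = 1.0 - np.sum((emg - W @ H) ** 2) / np.sum(emg ** 2)
      print("VAF:", vaf)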

  19. Secure communications system

    NASA Technical Reports Server (NTRS)

    Doland, G. D.

    1977-01-01

    System employs electronically randomized variant of quadraphase modulation and demodulation between two synchronized transceivers. System uses off-the-shelf components. It may be used with digital data, command signals, delta-modulated voice signals, digital television signals, or other data converted to digital form.

  20. Effective components of feedback from Routine Outcome Monitoring (ROM) in youth mental health care: study protocol of a three-arm parallel-group randomized controlled trial

    PubMed Central

    2014-01-01

    Background Routine Outcome Monitoring refers to regular measurements of clients’ progress in clinical practice, aiming to evaluate and, if necessary, adapt treatment. Clients fill out questionnaires and clinicians receive feedback about the results. Studies concerning feedback in youth mental health care are rare. The effects of feedback, the importance of specific aspects of feedback, and the mechanisms underlying the effects of feedback are unknown. In the present study, several potentially effective components of feedback from Routine Outcome Monitoring in youth mental health care in the Netherlands are investigated. Methods/Design We will examine three different forms of feedback through a three-arm parallel-group randomized controlled trial. 432 children and adolescents (aged 4 to 17 years) and their parents, who have been referred to mental health care institution Pro Persona, will be randomly assigned to one of three feedback conditions (144 participants per condition). Randomization will be stratified by age of the child or adolescent and by department. All participants fill out questionnaires at the start of treatment, one and a half months after the start of treatment, every three months during treatment, and at the end of treatment. Participants in the second and third feedback conditions fill out an additional questionnaire. In condition 1, clinicians receive basic feedback regarding clients’ symptoms and quality of life. In condition 2, the feedback of condition 1 is extended with feedback regarding possible obstacles to a good outcome and with practical suggestions. In condition 3, the feedback of condition 2 is discussed with a colleague while following a standardized format for case consultation. The primary outcome measure is symptom severity and secondary outcome measures are quality of life, satisfaction with treatment, number of sessions, length of treatment, and rates of dropout. We will also examine the role of being not on track (not responding to treatment). Discussion This study contributes to the identification of effective components of feedback and a better understanding of how feedback functions in real-world clinical practice. If the different feedback components prove to be effective, this can help to support and improve the care for youth. Trial registration Dutch Trial Register NTR4234 PMID:24393491

  1. Interventions to improve hemodialysis adherence: a systematic review of randomized-controlled trials.

    PubMed

    Matteson, Michelle L; Russell, Cynthia

    2010-10-01

    Over 485,000 people in the United States have chronic kidney disease, a progressive kidney disease that may lead to hemodialysis. Hemodialysis involves a complex regimen of treatment, medication, fluid, and diet management. In 2005, over 312,000 patients were undergoing hemodialysis in the United States. Dialysis nonadherence rates range from 8.5% to 86%. Dialysis therapy treatment nonadherence, including treatment, medication, fluid, and diet nonadherence, significantly increases the risk of morbidity and mortality. The purpose of this paper is to systematically review randomized-controlled trial intervention studies designed to increase treatment, medication, fluid, and diet adherence in adult hemodialysis patients. A search of Cumulative Index of Nursing and Allied Health Literature (CINAHL) (1982 to May 2008), MEDLINE (1950 to May 2008), PsycINFO (1806 to May 2008), and all Evidence-Based Medicine (EBM) Reviews (Cochrane DSR, ACP Journal Club, DARE, and CCTR) was conducted to identify randomized-controlled studies that tested the efficacy of interventions to improve adherence in adult hemodialysis patients. Eight randomized-controlled trials met criteria for inclusion. Six of the 8 studies found statistically significant improvement in adherence with the intervention. Of these 6 intervention studies, all studies had a cognitive component, with 3 studies utilizing cognitive/behavioral intervention strategies. Based on this systematic review, interventions utilizing a cognitive or cognitive/behavioral component appear to show the most promise for future study. © 2010 The Authors. Hemodialysis International © 2010 International Society for Hemodialysis.

  2. Engineering controllable architecture in matrigel for 3D cell alignment.

    PubMed

    Jang, Jae Myung; Tran, Si-Hoai-Trung; Na, Sang Cheol; Jeon, Noo Li

    2015-02-04

    We report a microfluidic approach to impart alignment in ECM components in 3D hydrogels by continuously applying fluid flow across the bulk gel during the gelation process. The microfluidic device, in which each channel can be independently filled, was tilted at 90° to generate continuous flow across the Matrigel as it gelled. The presence of flow resulted in more than 70% of ECM components being oriented along the direction of flow, compared with randomly cross-linked Matrigel. Following the oriented ECM components, primary rat cortical neurons and mouse neural stem cells showed oriented outgrowth of neuronal processes within the 3D Matrigel matrix.

  3. Check-Standard Testing Across Multiple Transonic Wind Tunnels with the Modern Design of Experiments

    NASA Technical Reports Server (NTRS)

    Deloach, Richard

    2012-01-01

    This paper reports the result of an analysis of wind tunnel data acquired in support of the Facility Analysis Verification & Operational Reliability (FAVOR) project. The analysis uses methods referred to collectively at Langley Research Center as the Modern Design of Experiments (MDOE). These methods quantify the total variance in a sample of wind tunnel data and partition it into explained and unexplained components. The unexplained component is further partitioned into random and systematic components. This analysis was performed on data acquired in similar wind tunnel tests executed in four different U.S. transonic facilities. The measurement environment of each facility was quantified and compared.

  4. An employee total health management-based survey of Iowa employers.

    PubMed

    Merchant, James A; Lind, David P; Kelly, Kevin M; Hall, Jennifer L

    2013-12-01

    To implement an Employee Total Health Management (ETHM) model-based questionnaire and provide estimates of model program elements among a statewide sample of Iowa employers. Survey a stratified random sample of Iowa employers, and characterize and estimate employer participation in ETHM program elements. Iowa employers are implementing less than 30% of all 12 components of ETHM, with the exception of occupational safety and health (46.6%) and workers' compensation insurance coverage (89.2%), but intend modest expansion of all components in the coming year. The ETHM questionnaire-based survey provides estimates of progress Iowa employers are making toward implementing components of Total Worker Health programs.

  5. Effectiveness of a mood management component as an adjunct to a telephone counselling smoking cessation intervention for smokers with a past major depression: a pragmatic randomized controlled trial.

    PubMed

    van der Meer, Regina M; Willemsen, Marc C; Smit, Filip; Cuijpers, Pim; Schippers, Gerard M

    2010-11-01

    To assess whether the addition of a mood management component to telephone counselling produces higher abstinence rates in smokers with past major depression and helps to prevent recurrence of depressive symptoms. Pragmatic randomized controlled trial with two conditions, with follow-up at 6 and 12 months. The control intervention consisted of eight sessions of proactive telephone counselling. The mood management intervention was an integration of the control intervention with a mood management component. This component consisted of a self-help mood management manual, two more preparatory proactive telephone counselling sessions and supplementary homework assignments and advice. Dutch national smoking cessation quitline. A total of 485 daily smokers with past major depression, according to the DSM-IV. The primary outcome measure was prolonged abstinence and secondary outcome measures were 7-day point prevalence abstinence and depressive symptoms. The mood management intervention resulted in significantly higher prolonged abstinence rates at 6- and 12-month follow-up (30.5% and 23.9% in experimental condition, 22.3% and 14.0% in the control condition). The odds ratios were 1.60 (95% CI 1.06-2.42) and 1.96 (95% CI 1.22-3.14) for both follow-ups. The mood management intervention did not seem to prevent recurrence of depressive symptoms. Adding a mood management component to telephone counselling for smoking cessation in smokers with a past major depression increases cessation rates without necessarily reducing depressive symptoms. © 2010 The Authors, Addiction © 2010 Society for the Study of Addiction.

  6. Processing of Fear and Anger Facial Expressions: The Role of Spatial Frequency

    PubMed Central

    Comfort, William E.; Wang, Meng; Benton, Christopher P.; Zana, Yossi

    2013-01-01

    Spatial frequency (SF) components encode a portion of the affective value expressed in face images. The aim of this study was to estimate the relative weight of specific frequency spectrum bandwidth on the discrimination of anger and fear facial expressions. The general paradigm was a classification of the expression of faces morphed at varying proportions between anger and fear images in which SF adaptation and SF subtraction are expected to shift classification of facial emotion. A series of three experiments was conducted. In Experiment 1 subjects classified morphed face images that were unfiltered or filtered to remove either low (<8 cycles/face), middle (12–28 cycles/face), or high (>32 cycles/face) SF components. In Experiment 2 subjects were adapted to unfiltered or filtered prototypical (non-morphed) fear face images and subsequently classified morphed face images. In Experiment 3 subjects were adapted to unfiltered or filtered prototypical fear face images with the phase component randomized before classifying morphed face images. Removing mid frequency components from the target images shifted classification toward fear. The same shift was observed under adaptation condition to unfiltered and low- and middle-range filtered fear images. However, when the phase spectrum of the same adaptation stimuli was randomized, no adaptation effect was observed. These results suggest that medium SF components support the perception of fear more than anger at both low and high level of processing. They also suggest that the effect at high-level processing stage is related more to high-level featural and/or configural information than to the low-level frequency spectrum. PMID:23637687

  7. Estimating overall exposure effects for the clustered and censored outcome using random effect Tobit regression models.

    PubMed

    Wang, Wei; Griswold, Michael E

    2016-11-30

    The random effect Tobit model is a regression model that accommodates both left- and/or right-censoring and within-cluster dependence of the outcome variable. Regression coefficients of random effect Tobit models have conditional interpretations on a constructed latent dependent variable and do not provide inference of overall exposure effects on the original outcome scale. The marginalized random effects model (MREM) permits likelihood-based estimation of marginal mean parameters for clustered data. For random effect Tobit models, we extend the MREM to marginalize over both the random effects and the normal space and boundary components of the censored response to estimate overall exposure effects at the population level. We also extend the 'Average Predicted Value' method to estimate the model-predicted marginal means for each person under different exposure status in a designated reference group by integrating over the random effects, and then use the calculated difference to assess the overall exposure effect. Maximum likelihood estimation is carried out using a quasi-Newton optimization algorithm, with Gauss-Hermite quadrature to approximate the integration over the random effects. We use these methods to carefully analyze two real datasets. Copyright © 2016 John Wiley & Sons, Ltd.
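
    The quadrature step mentioned above can be illustrated with a minimal numerical sketch: for a hypothetical left-censored-at-zero outcome with a normal random intercept, the population-average (marginal) mean is obtained by integrating the conditional censored mean over the random-effect distribution with Gauss-Hermite quadrature. All parameter values below are invented for illustration, and the model is deliberately simpler than the one analyzed in the paper.

      # Illustrative sketch: marginal mean of a left-censored-at-zero outcome with a normal
      # random intercept, via Gauss-Hermite quadrature. Parameter values are hypothetical.
      import numpy as np
      from numpy.polynomial.hermite import hermgauss
      from scipy.stats import norm

      eta = 1.0          # fixed-effects linear predictor for a given exposure level
      sigma_b = 0.8      # random-intercept standard deviation
      sigma_e = 1.2      # residual standard deviation

      def censored_mean_given_b(b):
          # E[max(0, eta + b + eps)] for eps ~ N(0, sigma_e^2), with b held fixed
          mu = eta + b
          return mu * norm.cdf(mu / sigma_e) + sigma_e * norm.pdf(mu / sigma_e)

      # int f(b) N(b; 0, sigma_b^2) db  ~=  (1/sqrt(pi)) * sum_i w_i f(sqrt(2)*sigma_b*x_i)
      nodes, weights = hermgauss(30)
      marginal_mean = np.sum(weights * censored_mean_given_b(np.sqrt(2.0) * sigma_b * nodes)) / np.sqrt(np.pi)
      print("marginal (population-average) mean:", marginal_mean)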

  8. Dynamic speckle - Interferometry of micro-displacements

    NASA Astrophysics Data System (ADS)

    Vladimirov, A. P.

    2012-06-01

    The problem of the dynamics of speckles in the image plane of an object, caused by random movements of scattering centers, is solved. We consider three cases: 1) during the observation the points move at random but constant velocities; 2) the relative displacement of any pair of points is a continuous random process; and 3) the motion of the centers is the sum of a deterministic movement and a random displacement. For cases 1) and 2), the characteristics of the temporal and spectral autocorrelation functions of the radiation intensity can be used to determine the individual and average relative displacements of the centers, their dispersion, and the relaxation time. For case 3), it is shown that under certain conditions the optical signal contains a periodic component, the number of periods of which is proportional to the derivations of the deterministic displacements. The results of experiments conducted to test and apply the theory are given.

  9. A qualitative assessment of a random process proposed as an atmospheric turbulence model

    NASA Technical Reports Server (NTRS)

    Sidwell, K.

    1977-01-01

    A random process is formed by the product of two Gaussian processes and the sum of that product with a third Gaussian process. The resulting total random process is interpreted as the sum of an amplitude modulated process and a slowly varying, random mean value. The properties of the process are examined, including an interpretation of the process in terms of the physical structure of atmospheric motions. The inclusion of the mean value variation gives an improved representation of the properties of atmospheric motions, since the resulting process can account for the differences in the statistical properties of atmospheric velocity components and their gradients. The application of the process to atmospheric turbulence problems, including the response of aircraft dynamic systems, is examined. The effects of the mean value variation upon aircraft loads are small in most cases, but can be important in the measurement and interpretation of atmospheric turbulence data.
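
    A minimal simulation sketch of the proposed construction is given below: two Gaussian processes are multiplied to produce an amplitude-modulated component, and a third, slowly varying Gaussian process is added as the random mean value. The correlation times and the AR(1) discretization are assumptions made for illustration only.

      # Illustrative sketch: sample path of a process formed as the product of two Gaussian
      # processes plus a third, slowly varying Gaussian "random mean". Parameters are hypothetical.
      import numpy as np

      def gaussian_ar1(n, tau, rng):
          # Discrete Gaussian process with exponential autocorrelation (AR(1)), unit variance
          phi = np.exp(-1.0 / tau)
          x = np.empty(n)
          x[0] = rng.standard_normal()
          for k in range(1, n):
              x[k] = phi * x[k - 1] + np.sqrt(1.0 - phi ** 2) * rng.standard_normal()
          return x

      rng = np.random.default_rng(2)
      n = 5000
      carrier   = gaussian_ar1(n, tau=20.0, rng=rng)    # rapidly varying Gaussian component
      modulator = gaussian_ar1(n, tau=500.0, rng=rng)   # slowly varying amplitude modulation
      mean_var  = gaussian_ar1(n, tau=1000.0, rng=rng)  # slowly varying random mean value

      total = carrier * modulator + mean_var            # non-Gaussian total process
      kurtosis = np.mean((total - total.mean()) ** 4) / total.var() ** 2
      print("sample kurtosis (Gaussian reference is 3):", kurtosis)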

  10. Designing Studies That Would Address the Multilayered Nature of Health Care

    PubMed Central

    Pennell, Michael; Rhoda, Dale; Hade, Erinn M.; Paskett, Electra D.

    2010-01-01

    We review design and analytic methods available for multilevel interventions in cancer research with particular attention to study design, sample size requirements, and potential to provide statistical evidence for causal inference. The most appropriate methods will depend on the stage of development of the research and whether randomization is possible. Early on, fractional factorial designs may be used to screen intervention components, particularly when randomization of individuals is possible. Quasi-experimental designs, including time-series and multiple baseline designs, can be useful once the intervention is designed because they require few sites and can provide the preliminary evidence to plan efficacy studies. In efficacy and effectiveness studies, group-randomized trials are preferred when randomization is possible and regression discontinuity designs are preferred otherwise if assignment based on a quantitative score is possible. Quasi-experimental designs may be used, especially when combined with recent developments in analytic methods to reduce bias in effect estimates. PMID:20386057

  11. A classification of the galaxy groups

    NASA Technical Reports Server (NTRS)

    Anosova, Joanna P.

    1990-01-01

    A statistical criterion has been proposed to reveal the random and physical clusterings among stars, galaxies and other objects. This criterion has been applied to the galaxy triples of the list by Karachentseva, Karaschentsev and Scherbanovsky, and the double galaxies of the list by Dahari where the primary components are the Seyfert galaxies. The confident physical, probable physical, probable optical and confident optical groups have been identified. The limit difference of radial velocities of components for the confident physical multiple galaxies has also been estimated.

  12. Space shuttle solid rocket booster recovery system definition, volume 1

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The performance requirements, preliminary designs, and development program plans for an airborne recovery system for the space shuttle solid rocket booster are discussed. The analyses performed during the study phase of the program are presented. The basic considerations which established the system configuration are defined. A Monte Carlo statistical technique using random sampling of the probability distribution for the critical water impact parameters was used to determine the failure probability of each solid rocket booster component as functions of impact velocity and component strength capability.
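
    The Monte Carlo technique described above can be sketched in a few lines: sample the water-impact velocity and the component strength capability from assumed probability distributions, and estimate the failure probability as the fraction of sampled impacts that exceed the sampled capability. The distributions and numerical values below are placeholders, not the study's data.

      # Illustrative sketch of the Monte Carlo failure-probability estimate described above.
      # Impact-velocity and strength distributions are hypothetical placeholders.
      import numpy as np

      rng = np.random.default_rng(3)
      n_trials = 100_000

      impact_velocity = rng.normal(loc=23.0, scale=3.0, size=n_trials)       # m/s, assumed
      strength_capability = rng.normal(loc=27.0, scale=2.5, size=n_trials)   # allowable impact velocity, m/s, assumed

      failures = impact_velocity > strength_capability
      print("estimated component failure probability:", failures.mean())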

  13. Effectiveness of workplace weight management interventions: a systematic review

    USDA-ARS?s Scientific Manuscript database

    Background: A systematic review was conducted of randomized trials of workplace weight management interventions, including trials with dietary, physical activity, environmental, behavioral and incentive based components. Main outcomes were defined as change in weight-related measures. Methods: Key w...

  14. Statistical controversies in clinical research: an initial evaluation of a surrogate end point using a single randomized clinical trial and the Prentice criteria

    PubMed Central

    Heller, G.

    2015-01-01

    Surrogate end point research has grown in recent years with the increasing development and usage of biomarkers in clinical research. Surrogacy analysis is derived through randomized clinical trial data and it is carried out at the individual level and at the trial level. A common surrogate analysis at the individual level is the application of the Prentice criteria. An approach for the evaluation of the Prentice criteria is discussed, with a focus on its most difficult component, the determination of whether the treatment effect is captured by the surrogate. An interpretation of this criterion is illustrated using data from a randomized clinical trial in prostate cancer. PMID:26254442

  15. Local Neighbourhoods for First-Passage Percolation on the Configuration Model

    NASA Astrophysics Data System (ADS)

    Dereich, Steffen; Ortgiese, Marcel

    2018-04-01

    We consider first-passage percolation on the configuration model. Once the network has been generated each edge is assigned an i.i.d. weight modeling the passage time of a message along this edge. Then independently two vertices are chosen uniformly at random, a sender and a recipient, and all edges along the geodesic connecting the two vertices are coloured in red (in the case that both vertices are in the same component). In this article we prove local limit theorems for the coloured graph around the recipient in the spirit of Benjamini and Schramm. We consider the explosive regime, in which case the random distances are of finite order, and the Malthusian regime, in which case the random distances are of logarithmic order.
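
    The setup can be illustrated with a short sketch built on networkx: generate a configuration-model graph from a degree sequence, attach i.i.d. edge weights as passage times, draw a sender and a recipient uniformly at random, and extract the weighted geodesic between them when both lie in the same component. The degree sequence and the exponential weight law are placeholder choices, not those of the paper.

      # Illustrative sketch: first-passage percolation on a configuration-model graph.
      # Degree sequence and edge-weight distribution are placeholder choices.
      import random
      import networkx as nx

      random.seed(4)
      n = 1000
      degrees = [random.choice([2, 3, 4]) for _ in range(n)]
      if sum(degrees) % 2:                       # degree sum must be even
          degrees[0] += 1

      G = nx.configuration_model(degrees, seed=4)
      G = nx.Graph(G)                            # collapse parallel edges
      G.remove_edges_from(list(nx.selfloop_edges(G)))

      for u, v in G.edges():
          G[u][v]["weight"] = random.expovariate(1.0)   # i.i.d. passage time on each edge

      sender, recipient = random.sample(list(G.nodes()), 2)
      if nx.has_path(G, sender, recipient):
          geodesic = nx.shortest_path(G, sender, recipient, weight="weight")
          red_edges = list(zip(geodesic[:-1], geodesic[1:]))       # edges coloured red
          print("edges on geodesic:", len(red_edges),
                "passage time:", nx.path_weight(G, geodesic, weight="weight"))
      else:
          print("sender and recipient lie in different components")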

  16. Simultaneous multi-component seismic denoising and reconstruction via K-SVD

    NASA Astrophysics Data System (ADS)

    Hou, Sian; Zhang, Feng; Li, Xiangyang; Zhao, Qiang; Dai, Hengchang

    2018-06-01

    Data denoising and reconstruction play an increasingly significant role in seismic prospecting for their value in enhancing effective signals, dealing with surface obstacles, and reducing acquisition costs. In this paper, we propose a novel method to denoise and reconstruct multicomponent seismic data simultaneously. The method lies within the framework of machine learning, and its key points are the definition of a suitable weight function and a modified inner product operator. The purpose of these two processes is to perform missing-data learning when the random noise deviation is unknown and to build a mathematical relationship for each component that incorporates all the information in the multi-component data. Two examples, using synthetic and real multicomponent data, demonstrate that the new method is a feasible alternative for multi-component seismic data processing.

  17. System for loading executable code into volatile memory in a downhole tool

    DOEpatents

    Hall, David R.; Bartholomew, David B.; Johnson, Monte L.

    2007-09-25

    A system for loading an executable code into volatile memory in a downhole tool string component comprises a surface control unit comprising executable code. An integrated downhole network comprises data transmission elements in communication with the surface control unit and the volatile memory. The executable code, stored in the surface control unit, is not permanently stored in the downhole tool string component. In a preferred embodiment of the present invention, the downhole tool string component comprises boot memory. In another embodiment, the executable code is an operating system executable code. Preferably, the volatile memory comprises random access memory (RAM). A method for loading executable code to volatile memory in a downhole tool string component comprises sending the code from the surface control unit to a processor in the downhole tool string component over the network. A central processing unit writes the executable code in the volatile memory.

  18. Quantitative analysis of random ameboid motion

    NASA Astrophysics Data System (ADS)

    Bödeker, H. U.; Beta, C.; Frank, T. D.; Bodenschatz, E.

    2010-04-01

    We quantify random migration of the social ameba Dictyostelium discoideum. We demonstrate that the statistics of cell motion can be described by an underlying Langevin-type stochastic differential equation. An analytic expression for the velocity distribution function is derived. The separation into deterministic and stochastic parts of the movement shows that the cells undergo a damped motion with multiplicative noise. Both contributions to the dynamics display a distinct response to external physiological stimuli. The deterministic component depends on the developmental state and ambient levels of signaling substances, while the stochastic part does not.
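
    A minimal sketch of the kind of model described, a damped Langevin equation with multiplicative noise, is given below using an Euler-Maruyama discretization. The damping coefficient and the speed-dependent noise amplitude are hypothetical and are not the values fitted in the study.

      # Illustrative sketch: Euler-Maruyama simulation of a damped Langevin equation with
      # multiplicative (speed-dependent) noise, dv = -gamma*v*dt + sigma(|v|)*dW.
      # Coefficients and the form of sigma are hypothetical.
      import numpy as np

      rng = np.random.default_rng(5)
      dt, n_steps = 0.01, 20000
      gamma = 1.0                                   # damping coefficient (assumed)

      def noise_amp(speed):
          return 0.3 + 0.5 * speed                  # multiplicative noise amplitude (assumed)

      v = np.zeros((n_steps, 2))                    # 2D cell velocity
      for k in range(1, n_steps):
          speed = np.linalg.norm(v[k - 1])
          dW = rng.standard_normal(2) * np.sqrt(dt)
          v[k] = v[k - 1] - gamma * v[k - 1] * dt + noise_amp(speed) * dW

      speeds = np.linalg.norm(v, axis=1)
      print("mean simulated cell speed:", speeds.mean())   # compare against an empirical speed distribution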

  19. The influence of the directional energy distribution on the nonlinear dispersion relation in a random gravity wave field

    NASA Technical Reports Server (NTRS)

    Huang, N. E.; Tung, C.-C.

    1977-01-01

    The influence of the directional distribution of wave energy on the dispersion relation is calculated numerically using various directional wave spectrum models. The results indicate that the dispersion relation varies both as a function of the directional energy distribution and the direction of propagation of the wave component under consideration. Furthermore, both the mean deviation and the random scatter from the linear approximation increase as the energy spreading decreases. Limited observational data are compared with the theoretical results. The agreement is favorable.

  20. A randomized evaluation of smoking cessation interventions for pregnant women at a WIC clinic.

    PubMed Central

    Mayer, J P; Hawkins, B; Todd, R

    1990-01-01

    Pregnant smokers attending a local health department WIC clinic were randomly assigned to one of two self-help smoking cessation programs or usual care. The multiple component program resulted in higher quit rates than usual care during the last month of pregnancy (11 percent vs 3 percent) and postpartum (7 percent vs 0 percent). Achieving quit rates in WIC similar to those in studies conducted at prenatal care settings suggests that smoking cessation programs for low-income pregnant WIC clients are feasible. PMID:2293809

  1. Effect of Atomic Layer Depositions (ALD)-Deposited Titanium Oxide (TiO2) Thickness on the Performance of Zr40Cu35Al15Ni10 (ZCAN)/TiO2/Indium (In)-Based Resistive Random Access Memory (RRAM) Structures

    DTIC Science & Technology

    2015-08-01

    Keywords: metal structures, memristors, resistive random access memory (RRAM), titanium dioxide, Zr40Cu35Al15Ni10 (ZCAN), resistive memory, tunnel junction. From the introduction: Resistive-switching memory elements based on metal-insulator-metal (MIM) diodes have attracted great interest due to their potential as components for simple, inexpensive, and high-density non-volatile storage devices.

  2. A unified development of several techniques for the representation of random vectors and data sets

    NASA Technical Reports Server (NTRS)

    Bundick, W. T.

    1973-01-01

    Linear vector space theory is used to develop a general representation of a set of data vectors or random vectors by linear combinations of orthonormal vectors such that the mean squared error of the representation is minimized. The orthonormal vectors are shown to be the eigenvectors of an operator. The general representation is applied to several specific problems involving the use of the Karhunen-Loeve expansion, principal component analysis, and empirical orthogonal functions; and the common properties of these representations are developed.
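
    The common core of these representations can be shown in a short numerical sketch: the orthonormal basis minimizing the mean squared representation error consists of the eigenvectors of the data covariance matrix, and the residual error equals the average of the discarded eigenvalues. The data matrix below is a random placeholder.

      # Illustrative sketch: minimum-mean-squared-error orthonormal representation of a set of
      # data vectors via eigenvectors of their covariance matrix (PCA / discrete Karhunen-Loeve).
      import numpy as np

      rng = np.random.default_rng(6)
      X = rng.standard_normal((200, 10))             # 200 data vectors of dimension 10 (placeholder)
      Xc = X - X.mean(axis=0)                        # remove the sample mean

      cov = Xc.T @ Xc / len(Xc)                      # (1/n) covariance so the identity below is exact
      eigvals, eigvecs = np.linalg.eigh(cov)
      order = np.argsort(eigvals)[::-1]              # sort eigenpairs by decreasing eigenvalue
      eigvals, eigvecs = eigvals[order], eigvecs[:, order]

      k = 3                                          # keep the k leading eigenvectors
      coeffs = Xc @ eigvecs[:, :k]                   # expansion coefficients
      X_hat = coeffs @ eigvecs[:, :k].T              # rank-k reconstruction

      mse = np.mean((Xc - X_hat) ** 2)
      print("mean squared representation error:", mse)
      print("sum of discarded eigenvalues / dimension:", eigvals[k:].sum() / X.shape[1])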

  3. Diffusion of an Evidence-Based Smoking Cessation Intervention Through Facebook: A Randomized Controlled Trial

    PubMed Central

    Cobb, Nathan K.; Jacobs, Megan A.; Wileyto, Paul; Valente, Thomas

    2016-01-01

    Objectives. To examine the diffusion of an evidence-based smoking cessation application (“app”) through Facebook social networks and identify specific intervention components that accelerate diffusion. Methods. Between December 2012 and October 2013, we recruited adult US smokers (“seeds”) via Facebook advertising and randomized them to 1 of 12 app variants using a factorial design. App variants targeted components of diffusion: duration of use (t), “contagiousness” (β), and number of contacts (Z). The primary outcome was the reproductive ratio (R), defined as the number of individuals installing the app (“descendants”) divided by the number of a seed participant’s Facebook friends. Results. We randomized 9042 smokers. App utilization metrics demonstrated between-variant differences in expected directions. The highest level of diffusion (R = 0.087) occurred when we combined active contagion strategies with strategies to increase duration of use (incidence rate ratio = 9.99; 95% confidence interval = 5.58, 17.91; P < .001). Involving nonsmokers did not affect diffusion. Conclusions. The maximal R value (0.087) is sufficient to increase the numbers of individuals receiving treatment if applied on a large scale. Online interventions can be designed a priori to spread through social networks. PMID:27077358

  4. A multilevel model to estimate the within- and the between-center components of the exposure/disease association in the EPIC study.

    PubMed

    Sera, Francesco; Ferrari, Pietro

    2015-01-01

    In a multicenter study, the overall relationship between exposure and the risk of cancer can be broken down into a within-center component, which reflects the individual level association, and a between-center relationship, which captures the association at the aggregate level. A piecewise exponential proportional hazards model with random effects was used to evaluate the association between dietary fiber intake and colorectal cancer (CRC) risk in the EPIC study. During an average follow-up of 11.0 years, 4,517 CRC events occurred among study participants recruited in 28 centers from ten European countries. Models were adjusted by relevant confounding factors. Heterogeneity among centers was modelled with random effects. Linear regression calibration was used to account for errors in dietary questionnaire (DQ) measurements. Risk ratio estimates for a 10 g/day increment in dietary fiber were equal to 0.90 (95%CI: 0.85, 0.96) and 0.85 (0.64, 1.14), at the individual and aggregate levels, respectively, while calibrated estimates were 0.85 (0.76, 0.94), and 0.87 (0.65, 1.15), respectively. In multicenter studies, over a straightforward ecological analysis, random effects models allow information at the individual and ecologic levels to be captured, while controlling for confounding at both levels of evidence.

  5. The effect of a telephone-based cognitive behavioral therapy on quality of life: a randomized controlled trial.

    PubMed

    Ngai, Fei-Wan; Wong, Paul Wai-Ching; Chung, Ka-Fai; Leung, Kwok-Yin

    2017-06-01

    Health-related quality of life (HRQoL) has emerged as a major public health concern in perinatal care. The purpose of this study was to examine the effect of telephone-based cognitive behavioral therapy (T-CBT) on HRQoL among Chinese mothers at risk of postnatal depression at 6 weeks and 6 months postpartum. A multi-center randomized controlled trial was conducted at the postnatal units of three regional hospitals. Three hundred and ninety-seven women at risk of postnatal depression were recruited and were randomly assigned to the T-CBT (n = 197) or usual care (n = 200). Assessment was conducted at baseline, 6 weeks and 6 months postpartum for HRQoL. Women in the T-CBT experienced greater improvement in the physical component of HRQoL from baseline to 6 weeks and 6 months postpartum than the usual care group. At 6 months postpartum, the T-CBT group also experienced better HRQoL in the mental component of HRQoL than the usual care group. The T-CBT appears to be feasible and effective in improving HRQoL in women at risk of postnatal depression in the primary care practice.

  6. Long-lasting changes in brain activation induced by a single REAC technology pulse in Wi-Fi bands. Randomized double-blind fMRI qualitative study.

    PubMed

    Rinaldi, Salvatore; Mura, Marco; Castagna, Alessandro; Fontani, Vania

    2014-07-11

    The aim of this randomized double-blind study was to evaluate in healthy adult subjects, with functional magnetic resonance imaging (fMRI), long-lasting changes in brain activation patterns following administration of a single 250-millisecond pulse emitted with radio-electric asymmetric conveyer (REAC) technology in the Wi-Fi bands. The REAC impulse was not administered during the scan, but after it, according to a protocol that has previously been demonstrated to be effective in improving motor control and postural balance in healthy subjects and patients. The study was conducted on 33 healthy volunteers; imaging was performed with a 1.5 T unit while subjects performed a motor block task involving cyclical and alternating flexion and extension of one leg. Subsequently, subjects were randomly divided into a treatment and a sham treatment control group. Repeated fMRI examinations were performed following the administration of the REAC pulse or sham treatment. The treated group showed cerebellar and ponto-mesencephalic activation components that disappeared in the second scan, while these activation components persisted in the sham group. This study shows that a very weak signal, such as a 250-millisecond Wi-Fi pulse, administered with REAC technology could lead to lasting modification of brain activity.

  7. Varying levels of difficulty index of skills-test items randomly selected by examinees on the Korean emergency medical technician licensing examination.

    PubMed

    Koh, Bongyeun; Hong, Sunggi; Kim, Soon-Sim; Hyun, Jin-Sook; Baek, Milye; Moon, Jundong; Kwon, Hayran; Kim, Gyoungyong; Min, Seonggi; Kang, Gu-Hyun

    2016-01-01

    The goal of this study was to characterize the difficulty index of the items in the skills test components of the class I and II Korean emergency medical technician licensing examination (KEMTLE), which requires examinees to select items randomly. The results of 1,309 class I KEMTLE examinations and 1,801 class II KEMTLE examinations in 2013 were subjected to analysis. Items from the basic and advanced skills test sections of the KEMTLE were compared to determine whether some were significantly more difficult than others. In the class I KEMTLE, all 4 of the items on the basic skills test showed significant variation in difficulty index (P<0.01), as well as 4 of the 5 items on the advanced skills test (P<0.05). In the class II KEMTLE, 4 of the 5 items on the basic skills test showed significantly different difficulty index (P<0.01), as well as all 3 of the advanced skills test items (P<0.01). In the skills test components of the class I and II KEMTLE, the procedure in which examinees randomly select questions should be revised to require examinees to respond to a set of fixed items in order to improve the reliability of the national licensing examination.

  8. Interfaces. Working Papers in Linguistics No. 32.

    ERIC Educational Resources Information Center

    Zwicky, Arnold M.

    The papers collected here concern the interfaces between various components of grammar (semantics, syntax, morphology, and phonology) and between grammar itself and various extragrammatical domains. They include: "The OSU Random, Unorganized Collection of Speech Act Examples"; "In and Out in Phonology"; "Forestress and…

  9. Multilevel Modeling with Correlated Effects

    ERIC Educational Resources Information Center

    Kim, Jee-Seon; Frees, Edward W.

    2007-01-01

    When there exist omitted effects, measurement error, and/or simultaneity in multilevel models, explanatory variables may be correlated with random components, and standard estimation methods do not provide consistent estimates of model parameters. This paper introduces estimators that are consistent under such conditions. By employing generalized…

  10. Certain dietary patterns are beneficial for the metabolic syndrome: reviewing the evidence.

    PubMed

    Calton, Emily K; James, Anthony P; Pannu, Poonam K; Soares, Mario J

    2014-07-01

    The metabolic syndrome (MetS) is a global public health issue of increasing magnitude. The Asia-Pacific region is expected to be hardest hit due to large population numbers, rising obesity, and insulin resistance (IR). This review assessed the protective effects of dietary patterns and their components on MetS. A literature search was conducted using prominent electronic databases and search terms that included in combination: diet, dietary components, dietary patterns, and metabolic syndrome. Articles were restricted to prospective studies and high quality randomized controlled trials that were conducted on humans, reported in the English language, and within the time period of 2000 to 2012. Traditional factors such as age, gender, physical activity, and obesity were associated with risk of MetS; however, these potential confounders were not always accounted for in study outcomes. Three dietary patterns emerged from the review; a Mediterranean dietary pattern, dietary approaches to stop hypertension diet, and the Nordic Diet. Potential contributors to their beneficial effects on prevalence of MetS or reduction in MetS components included increases in fruits, vegetables, whole grains, dairy and dairy components, calcium, vitamin D, and whey protein, as well as monounsaturated fatty acids, and omega-3 fatty acids. Additional prospective and high quality randomized controlled trial studies that investigate Mediterranean dietary pattern, the dietary approaches to stop hypertension diet, and the Nordic Diet would cement the protective benefits of these diets against the MetS. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. 2GETHER - The Dual Protection Project: Design and rationale of a randomized controlled trial to increase dual protection strategy selection and adherence among African American adolescent females

    PubMed Central

    Ewing, Alexander C.; Kottke, Melissa J.; Kraft, Joan Marie; Sales, Jessica M.; Brown, Jennifer L.; Goedken, Peggy; Wiener, Jeffrey; Kourtis, Athena P.

    2018-01-01

    Background African American adolescent females are at elevated risk for unintended pregnancy and sexually transmitted infections (STIs). Dual protection (DP) is defined as concurrent prevention of pregnancy and STIs. This can be achieved by abstinence, consistent condom use, or the dual methods of condoms plus an effective non-barrier contraceptive. Previous clinic-based interventions showed short-term effects on increasing dual method use, but evidence of sustained effects on dual method use and decreased incident pregnancies and STIs are lacking. Methods/Design This manuscript describes the 2GETHER Project. 2GETHER is a randomized controlled trial of a multi-component intervention to increase dual protection use among sexually active African American females aged 14–19 years not desiring pregnancy at a Title X clinic in Atlanta, GA. The intervention is clinic-based and includes a culturally tailored interactive multimedia component and counseling sessions, both to assist in selection of a DP method and to reinforce use of the DP method. The participants are randomized to the study intervention or the standard of care, and followed for 12 months to evaluate how the intervention influences DP method selection and adherence, pregnancy and STI incidence, and participants’ DP knowledge, intentions, and self-efficacy. Discussion The 2GETHER Project is a novel trial to reduce unintended pregnancies and STIs among African American adolescents. The intervention is unique in the comprehensive and complementary nature of its components and its individual tailoring of provider-patient interaction. If the trial interventions are shown to be effective, then it will be reasonable to assess their scalability and applicability in other populations. PMID:28007634

  12. Bioequivalence of saxagliptin/dapagliflozin fixed-dose combination tablets compared with coadministration of the individual tablets to healthy subjects.

    PubMed

    Vakkalagadda, Blisse; Vetter, Marion L; Rana, Jignasa; Smith, Charles H; Huang, Jian; Karkas, Jennifer; Boulton, David W; LaCreta, Frank

    2015-12-01

    Saxagliptin and dapagliflozin are individually indicated as an adjunct to diet and exercise to improve glycemic control in adults with type 2 diabetes mellitus. The bioequivalence of saxagliptin/dapagliflozin 2.5/5 mg and 5/10 mg fixed-dose combination (FDC) tablets compared with coadministration of the individual tablets and the food effect on both strengths of saxagliptin/dapagliflozin FDCs were evaluated in this open-label, randomized, single-dose crossover study. Healthy subjects were randomized to saxagliptin 2.5 mg + dapagliflozin 5 mg fasted, 2.5/5 mg FDC fasted, 2.5/5 mg FDC fed (Cohort 1) or saxagliptin 5 mg + dapagliflozin 10 mg fasted, 5/10 mg FDC fasted, 5/10 mg FDC fed (Cohort 2). Serial blood samples for pharmacokinetics of saxagliptin and dapagliflozin were obtained predose and up to 60 h postdose. Bioequivalence of FDC tablets versus individual components was concluded if the 90% CIs for FDC to individual component geometric mean ratios of Cmax, AUC(0-T), and AUC(inf) of both analytes were between 0.80 and 1.25. Seventy-two subjects were randomized; 71 (98.6%) completed the study. Saxagliptin/dapagliflozin 2.5/5 mg and 5/10 mg FDC tablets were bioequivalent to the individual tablets administered concomitantly. Food had no clinically meaningful effect on saxagliptin or dapagliflozin overall systemic exposure. Saxagliptin/dapagliflozin FDC tablets were bioequivalent to coadministration of the individual components in healthy subjects under fasted conditions and food had no clinically meaningful effect on bioavailability.
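
    The bioequivalence criterion quoted above can be illustrated with a simplified sketch (a paired analysis rather than the study's full crossover model): log-transform the pharmacokinetic parameter, form within-subject differences between the FDC and the coadministered tablets, and back-transform the 90% confidence interval of the mean difference; bioequivalence is declared if that interval lies entirely within 0.80-1.25. The AUC values are simulated placeholders.

      # Illustrative sketch (simplified paired analysis, not the study's model): 90% CI for the
      # geometric mean ratio of AUC between an FDC tablet and coadministered individual tablets.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      n = 36
      auc_individual = rng.lognormal(mean=5.0, sigma=0.3, size=n)                 # coadministered tablets
      auc_fdc = auc_individual * rng.lognormal(mean=0.01, sigma=0.12, size=n)     # same subjects, FDC

      diff_log = np.log(auc_fdc) - np.log(auc_individual)      # within-subject log differences
      mean_d = diff_log.mean()
      se_d = diff_log.std(ddof=1) / np.sqrt(n)
      t_crit = stats.t.ppf(0.95, df=n - 1)                     # two-sided 90% CI

      gmr = np.exp(mean_d)
      ci_low, ci_high = np.exp(mean_d - t_crit * se_d), np.exp(mean_d + t_crit * se_d)
      bioequivalent = (ci_low >= 0.80) and (ci_high <= 1.25)
      print(f"GMR = {gmr:.3f}, 90% CI = ({ci_low:.3f}, {ci_high:.3f}), bioequivalent: {bioequivalent}")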

  13. The probability of false positives in zero-dimensional analyses of one-dimensional kinematic, force and EMG trajectories.

    PubMed

    Pataky, Todd C; Vanrenterghem, Jos; Robinson, Mark A

    2016-06-14

    A false positive is the mistake of inferring an effect when none exists, and although α controls the false positive (Type I error) rate in classical hypothesis testing, a given α value is accurate only if the underlying model of randomness appropriately reflects experimentally observed variance. Hypotheses pertaining to one-dimensional (1D) (e.g. time-varying) biomechanical trajectories are most often tested using a traditional zero-dimensional (0D) Gaussian model of randomness, but variance in these datasets is clearly 1D. The purpose of this study was to determine the likelihood that analyzing smooth 1D data with a 0D model of variance will produce false positives. We first used random field theory (RFT) to predict the probability of false positives in 0D analyses. We then validated RFT predictions via numerical simulations of smooth Gaussian 1D trajectories. Results showed that, across a range of public kinematic, force/moment and EMG datasets, the median false positive rate was 0.382 and not the assumed α=0.05, even for a simple two-sample t test involving N=10 trajectories per group. The median false positive rate for experiments involving three-component vector trajectories was p=0.764. This rate increased to p=0.945 for two three-component vector trajectories, and to p=0.999 for six three-component vectors. This implies that experiments involving vector trajectories have a high probability of yielding 0D statistical significance when there is, in fact, no 1D effect. Either (a) explicit a priori identification of 0D variables or (b) adoption of 1D methods can more tightly control α. Copyright © 2016 Elsevier Ltd. All rights reserved.
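
    The inflation described above is easy to reproduce numerically: simulate smooth Gaussian 1D trajectories for two groups with no true difference, run a pointwise (0D) two-sample t test, and count an experiment as a false positive if any node reaches p < 0.05. The smoothness, trajectory length, and group size below are illustrative values, not those of the cited datasets.

      # Illustrative sketch: false positive rate when smooth 1D Gaussian trajectories with no
      # true group difference are tested pointwise with a 0D two-sample t test.
      import numpy as np
      from scipy import stats
      from scipy.ndimage import gaussian_filter1d

      rng = np.random.default_rng(8)
      n_sim, n_per_group, n_nodes, fwhm = 1000, 10, 101, 20.0
      sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))        # Gaussian kernel width from FWHM

      false_positives = 0
      for _ in range(n_sim):
          a = gaussian_filter1d(rng.standard_normal((n_per_group, n_nodes)), sigma, axis=1)
          b = gaussian_filter1d(rng.standard_normal((n_per_group, n_nodes)), sigma, axis=1)
          t, p = stats.ttest_ind(a, b, axis=0)                 # pointwise two-sample t tests
          if np.any(p < 0.05):                                 # "significant" anywhere along the trajectory
              false_positives += 1

      print("empirical false positive rate:", false_positives / n_sim)   # well above the nominal 0.05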

  14. Outcomes from a randomized controlled trial of a multi-component alcohol use preventive intervention for urban youth: project northland Chicago.

    PubMed

    Komro, Kelli A; Perry, Cheryl L; Veblen-Mortenson, Sara; Farbakhsh, Kian; Toomey, Traci L; Stigler, Melissa H; Jones-Webb, Rhonda; Kugler, Kari C; Pasch, Keryn E; Williams, Carolyn L

    2008-04-01

    The goal of this group-randomized trial was to test the effectiveness of an adapted alcohol use preventive intervention for urban, low-income and multi-ethnic settings. Sixty-one public schools in Chicago were recruited to participate, were grouped into neighborhood study units and assigned randomly to intervention or 'delayed program' control condition. The study sample (n = 5812 students) was primarily African American, Hispanic and low-income. Students, beginning in sixth grade (age 12 years), received 3 years of intervention strategies (curricula, family interventions, youth-led community service projects, community organizing). Students participated in yearly classroom-based surveys to measure their alcohol use and related risk and protective factors. Additional evaluation components included a parent survey, a community leader survey and alcohol purchase attempts. Overall, the intervention, compared with a control condition receiving 'prevention as usual', was not effective in reducing alcohol use, drug use or any hypothesized mediating variables (i.e. related risk and protective factors). There was a non-significant trend (P = 0.066) that suggested the ability to purchase alcohol by young-appearing buyers was reduced in the intervention communities compared to the control communities, but this could be due to chance. Secondary outcome analyses to assess the effects of each intervention component indicated that the home-based programs were associated with reduced alcohol, marijuana and tobacco use combined (P = 0.01), with alcohol use alone approaching statistical significance (P = 0.06). Study results indicate the importance of conducting evaluations of previously validated programs in contexts that differ from the original study sample. Also, the findings highlight the need for further research with urban, low-income adolescents from different ethnic backgrounds to identify effective methods to prevent and reduce alcohol use.

  15. Bias Correction and Random Error Characterization for the Assimilation of HRDI Line-of-Sight Wind Measurements

    NASA Technical Reports Server (NTRS)

    Tangborn, Andrew; Menard, Richard; Ortland, David; Einaudi, Franco (Technical Monitor)

    2001-01-01

    A new approach to the analysis of systematic and random observation errors is presented in which the error statistics are obtained using forecast data rather than observations from a different instrument type. The analysis is carried out at an intermediate retrieval level, instead of the more typical state variable space. This method is carried out on measurements made by the High Resolution Doppler Imager (HRDI) on board the Upper Atmosphere Research Satellite (UARS). HRDI, a limb sounder, is the only satellite instrument measuring winds in the stratosphere, and the only instrument of any kind making global wind measurements in the upper atmosphere. HRDI measures Doppler shifts in two different O2 absorption bands (gamma and B) and the retrieved products are tangent point Line-of-Sight wind component (level 2 retrieval) and UV winds (level 3 retrieval). This analysis is carried out on a level 1.9 retrieval, in which the contributions from different points along the line-of-sight have not been removed. Biases are calculated from O-F (observed minus forecast) LOS wind components and are separated into a measurement parameter space consisting of 16 different values. The bias dependence on these parameters (plus an altitude dependence) is used to create a bias correction scheme carried out on the level 1.9 retrieval. The random error component is analyzed by separating the gamma and B band observations and locating observation pairs where both bands are very nearly looking at the same location at the same time. It is shown that the two observation streams are uncorrelated and that this allows the forecast error variance to be estimated. The bias correction is found to cut the effective observation error variance in half.

  16. Quality of Life From Canadian Cancer Trials Group MA.17R: A Randomized Trial of Extending Adjuvant Letrozole to 10 Years.

    PubMed

    Lemieux, Julie; Brundage, Michael D; Parulekar, Wendy R; Goss, Paul E; Ingle, James N; Pritchard, Kathleen I; Celano, Paul; Muss, Hyman; Gralow, Julie; Strasser-Weippl, Kathrin; Whelan, Kate; Tu, Dongsheng; Whelan, Timothy J

    2018-02-20

    Purpose: MA.17R was a Canadian Cancer Trials Group-led phase III randomized controlled trial comparing letrozole to placebo after 5 years of aromatase inhibitor as adjuvant therapy for hormone receptor-positive breast cancer. Quality of life (QOL) was a secondary outcome measure of the study, and here, we report the results of these analyses. Methods: QOL was measured using the Short Form-36 (SF-36; two summary scores and eight domains) and menopause-specific QOL (MENQOL; four symptom domains) at baseline and every 12 months up to 60 months. QOL assessment was mandatory for Canadian Cancer Trials Group centers but optional for centers in other groups. Mean change scores from baseline were calculated. Results: One thousand nine hundred eighteen women were randomly assigned, and 1,428 women completed the baseline QOL assessment. Compliance with QOL measures was > 85%. Baseline summary scores for the SF-36 physical component summary (47.5 for letrozole and 47.9 for placebo) and mental component summary (55.5 for letrozole and 54.8 for placebo) were close to the population norms of 50. No differences were seen between groups in mean change scores for the SF-36 physical and mental component summaries and the other eight QOL domains except for the role-physical subscale. No difference was found in any of the four domains of the MENQOL. Conclusion: No clinically significant differences were seen in overall QOL measured by the SF-36 summary measures and MENQOL between the letrozole and placebo groups. The data indicate that continuation of aromatase inhibitor therapy after 5 years of prior treatment in the trial population was not associated with a deterioration of overall QOL.

  17. Quality of Life From Canadian Cancer Trials Group MA.17R: A Randomized Trial of Extending Adjuvant Letrozole to 10 Years

    PubMed Central

    Brundage, Michael D.; Parulekar, Wendy R.; Goss, Paul E.; Ingle, James N.; Pritchard, Kathleen I.; Celano, Paul; Muss, Hyman; Gralow, Julie; Strasser-Weippl, Kathrin; Whelan, Kate; Tu, Dongsheng; Whelan, Timothy J.

    2018-01-01

    Purpose: MA.17R was a Canadian Cancer Trials Group–led phase III randomized controlled trial comparing letrozole to placebo after 5 years of aromatase inhibitor as adjuvant therapy for hormone receptor–positive breast cancer. Quality of life (QOL) was a secondary outcome measure of the study, and here, we report the results of these analyses. Methods: QOL was measured using the Short Form-36 (SF-36; two summary scores and eight domains) and menopause-specific QOL (MENQOL; four symptom domains) at baseline and every 12 months up to 60 months. QOL assessment was mandatory for Canadian Cancer Trials Group centers but optional for centers in other groups. Mean change scores from baseline were calculated. Results: One thousand nine hundred eighteen women were randomly assigned, and 1,428 women completed the baseline QOL assessment. Compliance with QOL measures was > 85%. Baseline summary scores for the SF-36 physical component summary (47.5 for letrozole and 47.9 for placebo) and mental component summary (55.5 for letrozole and 54.8 for placebo) were close to the population norms of 50. No differences were seen between groups in mean change scores for the SF-36 physical and mental component summaries and the other eight QOL domains except for the role-physical subscale. No difference was found in any of the four domains of the MENQOL. Conclusion: No clinically significant differences were seen in overall QOL measured by the SF-36 summary measures and MENQOL between the letrozole and placebo groups. The data indicate that continuation of aromatase inhibitor therapy after 5 years of prior treatment in the trial population was not associated with a deterioration of overall QOL. PMID:29328860

  18. Scale-free models for the structure of business firm networks

    NASA Astrophysics Data System (ADS)

    Kitsak, Maksim; Riccaboni, Massimo; Havlin, Shlomo; Pammolli, Fabio; Stanley, H. Eugene

    2010-03-01

    We study firm collaborations in the life sciences and the information and communication technology sectors. We propose an approach to characterize industrial leadership using k-shell decomposition, with top-ranking firms in terms of market value in higher k-shell layers. We find that the life sciences industry network consists of three distinct components: a “nucleus,” which is a small well-connected subgraph, “tendrils,” which are small subgraphs consisting of small degree nodes connected exclusively to the nucleus, and a “bulk body,” which consists of the majority of nodes. Industrial leaders, i.e., the largest companies in terms of market value, are in the highest k-shells of both networks. The nucleus of the life sciences sector is very stable: once a firm enters the nucleus, it is likely to stay there for a long time. At the same time we do not observe the above three components in the information and communication technology sector. We also conduct a systematic study of these three components in random scale-free networks. Our results suggest that the sizes of the nucleus and the tendrils in scale-free networks decrease as the exponent of the power-law degree distribution λ increases, and disappear for λ ≥ 3. We compare the k-shell structure of random scale-free model networks with two real-world business firm networks in the life sciences and in the information and communication technology sectors. We argue that the observed behavior of the k-shell structure in the two industries is consistent with the coexistence of both preferential and random agreements in the evolution of industrial networks.
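
    A minimal sketch of the k-shell decomposition step on a synthetic scale-free graph, assuming Python with networkx; the graph and its parameters are illustrative stand-ins, not the firm networks studied in the paper:

      import networkx as nx

      # Illustrative stand-in for a business-firm collaboration network.
      G = nx.barabasi_albert_graph(n=1000, m=2, seed=1)

      core = nx.core_number(G)                                # k-shell index of every node
      k_max = max(core.values())
      nucleus = [v for v, k in core.items() if k == k_max]    # highest shell, the "nucleus" analogue
      periphery = [v for v, k in core.items() if k == 1]      # 1-shell, degree-one periphery ("tendril"-like)
      print(k_max, len(nucleus), len(periphery))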

  19. 2GETHER - The Dual Protection Project: Design and rationale of a randomized controlled trial to increase dual protection strategy selection and adherence among African American adolescent females.

    PubMed

    Ewing, Alexander C; Kottke, Melissa J; Kraft, Joan Marie; Sales, Jessica M; Brown, Jennifer L; Goedken, Peggy; Wiener, Jeffrey; Kourtis, Athena P

    2017-03-01

    African American adolescent females are at elevated risk for unintended pregnancy and sexually transmitted infections (STIs). Dual protection (DP) is defined as concurrent prevention of pregnancy and STIs. This can be achieved by abstinence, consistent condom use, or the dual methods of condoms plus an effective non-barrier contraceptive. Previous clinic-based interventions showed short-term effects on increasing dual method use, but evidence of sustained effects on dual method use and decreased incident pregnancies and STIs is lacking. This manuscript describes the 2GETHER Project. 2GETHER is a randomized controlled trial of a multi-component intervention to increase dual protection use among sexually active African American females aged 14-19 years not desiring pregnancy at a Title X clinic in Atlanta, GA. The intervention is clinic-based and includes a culturally tailored interactive multimedia component and counseling sessions, both to assist in selection of a DP method and to reinforce use of the DP method. The participants are randomized to the study intervention or the standard of care, and followed for 12 months to evaluate how the intervention influences DP method selection and adherence, pregnancy and STI incidence, and participants' DP knowledge, intentions, and self-efficacy. The 2GETHER Project is a novel trial to reduce unintended pregnancies and STIs among African American adolescents. The intervention is unique in the comprehensive and complementary nature of its components and its individual tailoring of provider-patient interaction. If the trial interventions are shown to be effective, then it will be reasonable to assess their scalability and applicability in other populations. Published by Elsevier Inc.

  20. Effects of the scatter in sunspot group tilt angles on the large-scale magnetic field at the solar surface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, J.; Cameron, R. H.; Schüssler, M., E-mail: jiejiang@nao.cas.cn

    The tilt angles of sunspot groups represent the poloidal field source in Babcock-Leighton-type models of the solar dynamo and are crucial for the build-up and reversals of the polar fields in surface flux transport (SFT) simulations. The evolution of the polar field is a consequence of Hale's polarity rules, together with the tilt angle distribution which has a systematic component (Joy's law) and a random component (tilt-angle scatter). We determine the scatter using the observed tilt angle data and study the effects of this scatter on the evolution of the solar surface field using SFT simulations with flux input based upon the recorded sunspot groups. The tilt angle scatter is described in our simulations by a random component according to the observed distributions for different ranges of sunspot group size (total umbral area). By performing simulations with a number of different realizations of the scatter we study the effect of the tilt angle scatter on the global magnetic field, especially on the evolution of the axial dipole moment. The average axial dipole moment at the end of cycle 17 (a medium-amplitude cycle) from our simulations was 2.73 G. The tilt angle scatter leads to an uncertainty of 0.78 G (standard deviation). We also considered cycle 14 (a weak cycle) and cycle 19 (a strong cycle) and show that the standard deviation of the axial dipole moment is similar for all three cycles. The uncertainty mainly results from the big sunspot groups which emerge near the equator. In the framework of Babcock-Leighton dynamo models, the tilt angle scatter therefore constitutes a significant random factor in the cycle-to-cycle amplitude variability, which strongly limits the predictability of solar activity.
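
    A toy Monte Carlo sketch of how tilt-angle scatter translates into a spread of a dipole-like quantity across realizations, assuming Python with NumPy; the Joy's law slope, scatter width, and flux distribution are placeholders, not the values used in the SFT simulations:

      import numpy as np

      rng = np.random.default_rng(0)
      n_real, n_groups = 200, 1500                  # realizations and sunspot groups per cycle (illustrative)

      lat = rng.uniform(2, 30, size=(n_real, n_groups))                   # emergence latitudes, degrees
      flux = rng.lognormal(0.0, 1.0, size=(n_real, n_groups))             # group fluxes, arbitrary units
      tilt = 0.5 * lat + rng.normal(0.0, 15.0, size=(n_real, n_groups))   # Joy's law plus random scatter

      # Crude proxy for each realization's contribution to the axial dipole moment.
      dipole = (flux * np.sin(np.radians(tilt)) * np.cos(np.radians(lat))).sum(axis=1)
      print(dipole.mean(), dipole.std())            # the spread across realizations mimics the quoted uncertainty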

  1. An Employee Total Health Management–Based Survey of Iowa Employers

    PubMed Central

    Merchant, James A.; Lind, David P.; Kelly, Kevin M.; Hall, Jennifer L.

    2015-01-01

    Objective To implement an Employee Total Health Management (ETHM) model-based questionnaire and provide estimates of model program elements among a statewide sample of Iowa employers. Methods Survey a stratified random sample of Iowa employers, characterize and estimate employer participation in ETHM program elements. Results Iowa employers are implementing under 30% of all 12 components of ETHM, with the exception of occupational safety and health (46.6%) and worker compensation insurance coverage (89.2%), but intend modest expansion of all components in the coming year. Conclusions The Employee Total Health Management questionnaire-based survey provides estimates of progress Iowa employers are making toward implementing components of total worker health programs. PMID:24284757

  2. Effects of vibration and shock on the performance of gas-bearing space-power Brayton cycle turbomachinery. Part 3: Sinusoidal and random vibration data reduction and evaluation, and random vibration probability analysis

    NASA Technical Reports Server (NTRS)

    Tessarzik, J. M.; Chiang, T.; Badgley, R. H.

    1973-01-01

    The random vibration response of a gas bearing rotor support system has been experimentally and analytically investigated in the amplitude and frequency domains. The NASA Brayton Rotating Unit (BRU), a 36,000 rpm, 10 KWe turbogenerator, had previously been subjected in the laboratory to external random vibrations, and the response data recorded on magnetic tape. These data have now been experimentally analyzed for amplitude distribution and frequency content. The results of the power spectral density analysis indicate strong vibration responses for the major rotor-bearing system components at frequencies which correspond closely to their resonant frequencies obtained under periodic vibration testing. The results of the amplitude analysis indicate an increasing shift towards non-Gaussian distributions as the input level of external vibrations is raised. Analysis of the axial random vibration response of the BRU was performed by using a linear three-mass model. Power spectral densities and the root-mean-square value of the thrust bearing surface contact were calculated for specified input random excitation.
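
    A short sketch of the kind of power-spectral-density and RMS calculation described above, assuming Python with NumPy and SciPy; the signal, sampling rate, and resonance frequency are synthetic placeholders:

      import numpy as np
      from scipy import signal

      fs = 2048.0                                   # sampling rate, Hz (illustrative)
      t = np.arange(0, 10, 1 / fs)
      rng = np.random.default_rng(1)
      # Stand-in for a recorded response: broadband noise plus a resonance near 120 Hz.
      x = rng.normal(size=t.size) + 0.5 * np.sin(2 * np.pi * 120 * t)

      f, pxx = signal.welch(x, fs=fs, nperseg=4096)          # power spectral density estimate
      rms = np.sqrt(np.sum(pxx) * (f[1] - f[0]))             # RMS level recovered from the PSD
      print(f[np.argmax(pxx)], rms)                          # dominant frequency and overall RMS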

  3. Solar Cycle Variability and Surface Differential Rotation from Ca II K-line Time Series Data

    NASA Astrophysics Data System (ADS)

    Scargle, Jeffrey D.; Keil, Stephen L.; Worden, Simon P.

    2013-07-01

    Analysis of over 36 yr of time series data from the NSO/AFRL/Sac Peak K-line monitoring program elucidates 5 components of the variation of the 7 measured chromospheric parameters: (a) the solar cycle (period ~ 11 yr), (b) quasi-periodic variations (periods ~ 100 days), (c) a broadband stochastic process (wide range of periods), (d) rotational modulation, and (e) random observational errors, independent of (a)-(d). Correlation and power spectrum analyses elucidate periodic and aperiodic variation of these parameters. Time-frequency analysis illuminates periodic and quasi-periodic signals, details of frequency modulation due to differential rotation, and in particular elucidates the rather complex harmonic structure (a) and (b) at timescales in the range ~0.1-10 yr. These results using only full-disk data suggest that similar analyses will be useful for detecting and characterizing differential rotation in stars from stellar light curves such as those being produced by NASA's Kepler observatory. Component (c) consists of variations over a range of timescales, in the manner of a 1/f random process with a power-law slope index that varies in a systematic way. A time-dependent Wilson-Bappu effect appears to be present in the solar cycle variations (a), but not in the more rapid variations of the stochastic process (c). Component (d) characterizes differential rotation of the active regions. Component (e) is of course not characteristic of solar variability, but the fact that the observational errors are quite small greatly facilitates the analysis of the other components. The data analyzed in this paper can be found at the National Solar Observatory Web site http://nsosp.nso.edu/cak_mon/, or by file transfer protocol at ftp://ftp.nso.edu/idl/cak.parameters.

  4. SOLAR CYCLE VARIABILITY AND SURFACE DIFFERENTIAL ROTATION FROM Ca II K-LINE TIME SERIES DATA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scargle, Jeffrey D.; Worden, Simon P.; Keil, Stephen L.

    Analysis of over 36 yr of time series data from the NSO/AFRL/Sac Peak K-line monitoring program elucidates 5 components of the variation of the 7 measured chromospheric parameters: (a) the solar cycle (period ~ 11 yr), (b) quasi-periodic variations (periods ~ 100 days), (c) a broadband stochastic process (wide range of periods), (d) rotational modulation, and (e) random observational errors, independent of (a)-(d). Correlation and power spectrum analyses elucidate periodic and aperiodic variation of these parameters. Time-frequency analysis illuminates periodic and quasi-periodic signals, details of frequency modulation due to differential rotation, and in particular elucidates the rather complex harmonic structure (a) and (b) at timescales in the range ~0.1-10 yr. These results using only full-disk data suggest that similar analyses will be useful for detecting and characterizing differential rotation in stars from stellar light curves such as those being produced by NASA's Kepler observatory. Component (c) consists of variations over a range of timescales, in the manner of a 1/f random process with a power-law slope index that varies in a systematic way. A time-dependent Wilson-Bappu effect appears to be present in the solar cycle variations (a), but not in the more rapid variations of the stochastic process (c). Component (d) characterizes differential rotation of the active regions. Component (e) is of course not characteristic of solar variability, but the fact that the observational errors are quite small greatly facilitates the analysis of the other components. The data analyzed in this paper can be found at the National Solar Observatory Web site http://nsosp.nso.edu/cak_mon/, or by file transfer protocol at ftp://ftp.nso.edu/idl/cak.parameters.

  5. Treatment dismantling pilot study to identify the active ingredients in personalized feedback interventions for hazardous alcohol use: randomized controlled trial.

    PubMed

    Cunningham, John A; Murphy, Michelle; Hendershot, Christian S

    2014-12-10

    There is a considerable body of evidence supporting the effectiveness of personalized feedback interventions for hazardous alcohol use-whether delivered face-to-face, by postal mail, or over the Internet (probably now the primary mode of delivery). The Check Your Drinking Screener (CYD; see www.CheckYourDrinking.net) is one such intervention. The current treatment dismantling study assessed which components of personalized feedback interventions were effective in motivating change in drinking. Specifically, the major objective of this project was to conduct a randomized controlled trial (RCT) comparing the impact of the normative feedback and other personalized feedback components of the CYD intervention in the general population. Participants were recruited to take part in an RCT and received either the complete CYD final report, just the normative feedback sections of the CYD, just the personalized feedback components of the CYD, or were assigned to a no-intervention control group. Participants were followed-up at 3 months to assess changes in alcohol consumption. A total of 741 hazardous drinking participants were recruited for the trial, of which 73 percent provided follow-up data. Analyses using an intent-to-treat approach found some evidence for the impact of the personalized feedback components of the CYD in reducing alcohol consumption on the variables, number of drinks in a week and AUDIT-C (p = .028 and .047 respectively; no impact on highest number of drinks on one occasion; p = .594). However, there was no significant evidence of the impact of the normative feedback components (all p > .3). Personalized feedback elements alone could provide an active intervention for hazardous drinkers, particularly in situations where normative feedback information was not available. ClinicalTrials.gov NCT01608763.

  6. A randomized controlled trial of gabapentin for chronic low back pain with and without a radiating component.

    PubMed

    Atkinson, J Hampton; Slater, Mark A; Capparelli, Edmund V; Patel, Shetal M; Wolfson, Tanya; Gamst, Anthony; Abramson, Ian S; Wallace, Mark S; Funk, Stephen D; Rutledge, Thomas R; Wetherell, Julie L; Matthews, Scott C; Zisook, Sidney; Garfin, Steven R

    2016-07-01

    Gabapentin is prescribed for analgesia in chronic low back pain, yet there are no controlled trials supporting this practice. This randomized, 2-arm, 12-week, parallel group study compared gabapentin (forced titration up to 3600 mg daily) with inert placebo. The primary efficacy measure was change in pain intensity from baseline to the last week on treatment measured by the Descriptor Differential Scale; the secondary outcome was disability (Oswestry Disability Index). The intention-to-treat analysis comprised 108 randomized patients with chronic back pain (daily pain for ≥6 months) whose pain did (43%) or did not radiate into the lower extremity. Random effects regression models which did not impute missing scores were used to analyze outcome data. Pain intensity decreased significantly over time (P < 0.0001) with subjects on gabapentin or placebo, reporting reductions of about 30% from baseline, but did not differ significantly between groups (P = 0.423). The same results pertained for disability scores. In responder analyses of those who completed 12 weeks (N = 72), the proportion reporting at least 30% or 50% reduction in pain intensity, or at least "Minimal Improvement" on the Physician Clinical Global Impression of Change did not differ significantly between groups. There were no significant differences in analgesia between participants with radiating (n = 46) and nonradiating (n = 62) pain either within or between treatment arms. There was no significant correlation between gabapentin plasma concentration and pain intensity. Gabapentin appears to be ineffective for analgesia in chronic low back pain with or without a radiating component.

  7. Bayesian Genomic Prediction with Genotype × Environment Interaction Kernel Models

    PubMed Central

    Cuevas, Jaime; Crossa, José; Montesinos-López, Osval A.; Burgueño, Juan; Pérez-Rodríguez, Paulino; de los Campos, Gustavo

    2016-01-01

    The phenomenon of genotype × environment (G × E) interaction in plant breeding decreases selection accuracy, thereby negatively affecting genetic gains. Several genomic prediction models incorporating G × E have been recently developed and used in genomic selection of plant breeding programs. Genomic prediction models for assessing multi-environment G × E interaction are extensions of a single-environment model, and have advantages and limitations. In this study, we propose two multi-environment Bayesian genomic models: the first model considers genetic effects (u) that can be assessed by the Kronecker product of variance–covariance matrices of genetic correlations between environments and genomic kernels through markers under two linear kernel methods, linear (genomic best linear unbiased predictors, GBLUP) and Gaussian (Gaussian kernel, GK). The other model has the same genetic component as the first model (u) plus an extra component, f, that captures random effects between environments that were not captured by the random effects u. We used five CIMMYT data sets (one maize and four wheat) that were previously used in different studies. Results show that models with G × E always have superior prediction ability than single-environment models, and the higher prediction ability of multi-environment models with u and f over the multi-environment model with only u occurred 85% of the time with GBLUP and 45% of the time with GK across the five data sets. The latter result indicated that including the random effect f is still beneficial for increasing prediction ability after adjusting by the random effect u. PMID:27793970

  8. Bayesian Genomic Prediction with Genotype × Environment Interaction Kernel Models.

    PubMed

    Cuevas, Jaime; Crossa, José; Montesinos-López, Osval A; Burgueño, Juan; Pérez-Rodríguez, Paulino; de Los Campos, Gustavo

    2017-01-05

    The phenomenon of genotype × environment (G × E) interaction in plant breeding decreases selection accuracy, thereby negatively affecting genetic gains. Several genomic prediction models incorporating G × E have been recently developed and used in genomic selection of plant breeding programs. Genomic prediction models for assessing multi-environment G × E interaction are extensions of a single-environment model, and have advantages and limitations. In this study, we propose two multi-environment Bayesian genomic models: the first model considers genetic effects (u) that can be assessed by the Kronecker product of variance-covariance matrices of genetic correlations between environments and genomic kernels through markers under two linear kernel methods, linear (genomic best linear unbiased predictors, GBLUP) and Gaussian (Gaussian kernel, GK). The other model has the same genetic component as the first model (u) plus an extra component, f, that captures random effects between environments that were not captured by the random effects u. We used five CIMMYT data sets (one maize and four wheat) that were previously used in different studies. Results show that models with G × E always have superior prediction ability than single-environment models, and the higher prediction ability of multi-environment models with u and f over the multi-environment model with only u occurred 85% of the time with GBLUP and 45% of the time with GK across the five data sets. The latter result indicated that including the random effect f is still beneficial for increasing prediction ability after adjusting by the random effect u. Copyright © 2017 Cuevas et al.
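
    A minimal sketch of the two kernels named in the abstract (a linear GBLUP-type genomic relationship matrix and a Gaussian kernel), assuming Python with NumPy; genotype coding, sizes, and the bandwidth heuristic are illustrative choices, not those of the paper:

      import numpy as np

      rng = np.random.default_rng(2)
      n, p = 100, 500                                          # lines and markers (illustrative sizes)
      X = rng.integers(0, 3, size=(n, p)).astype(float)        # SNP genotypes coded 0/1/2
      X -= X.mean(axis=0)                                      # center the marker matrix

      K_gblup = X @ X.T / p                                    # linear (GBLUP-type) kernel

      d2 = np.square(X[:, None, :] - X[None, :, :]).sum(axis=-1)   # pairwise squared distances
      h = 1.0 / np.median(d2[d2 > 0])                          # bandwidth heuristic (an assumption, not the paper's rule)
      K_gk = np.exp(-h * d2)                                   # Gaussian kernel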

  9. Transportability of an Evidence-Based Early Childhood Intervention in a Low-Income African Country: Results of a Cluster Randomized Controlled Study.

    PubMed

    Huang, Keng-Yen; Nakigudde, Janet; Rhule, Dana; Gumikiriza-Onoria, Joy Louise; Abura, Gloria; Kolawole, Bukky; Ndyanabangi, Sheila; Kim, Sharon; Seidman, Edward; Ogedegbe, Gbenga; Brotman, Laurie Miller

    2017-11-01

    Children in Sub-Saharan Africa (SSA) are burdened by significant unmet mental health needs. Despite the successes of numerous school-based interventions for promoting child mental health, most evidence-based interventions (EBIs) are not available in SSA. This study investigated the implementation quality and effectiveness of one component of an EBI from a developed country (USA) in a SSA country (Uganda). The EBI component, Professional Development, was provided by trained Ugandan mental health professionals to Ugandan primary school teachers. It included large-group experiential training and small-group coaching to introduce and support a range of evidence-based practices (EBPs) to create nurturing and predictable classroom experiences. The study was guided by the Consolidated Framework for Implementation Research, the Teacher Training Implementation Model, and the RE-AIM evaluation framework. Effectiveness outcomes were studied using a cluster randomized design, in which 10 schools were randomized to intervention and wait-list control conditions. A total of 79 early childhood teachers participated. Teacher knowledge and the use of EBPs were assessed at baseline and immediately post-intervention (4-5 months later). A sample of 154 parents was randomly selected to report on child behavior at baseline and post-intervention. Linear mixed effect modeling was applied to examine effectiveness outcomes. Findings support the feasibility of training Ugandan mental health professionals to provide Professional Development for Ugandan teachers. Professional Development was delivered with high levels of fidelity and resulted in improved teacher EBP knowledge and the use of EBPs in the classroom, and child social competence.

  10. A randomized controlled trial of gabapentin for chronic low back pain with and without a radiating component

    PubMed Central

    Atkinson, J. Hampton; Slater, Mark A.; Capparelli, Edmund V.; Patel, Shetal M.; Wolfson, Tanya; Gamst, Anthony; Abramson, Ian S.; Wallace, Mark S.; Funk, Stephen D.; Rutledge, Thomas R.; Wetherell, Julie Loebach; Matthews, Scott C.; Zisook, Sidney; Garfin, Steven R.

    2016-01-01

    Gabapentin is prescribed for analgesia in chronic low back pain, yet there are no controlled trials supporting this practice. This randomized, two-arm, 12-week, parallel group study compared gabapentin (forced titration up to 3600 mg daily) to inert placebo. The primary efficacy measure was change in pain intensity from baseline to the last week on treatment measured by the Descriptor Differential Scale; the secondary outcome was disability (Oswestry Disability Index). The intention-to-treat analysis comprised 108 randomized chronic back pain patients (daily pain for ≥ 6 months) whose pain did (43%) or did not radiate into the lower extremity. Random effects regression models which did not impute missing scores were used to analyze outcome data. Pain intensity decreased significantly over time (p < .0001) with subjects on gabapentin or placebo reporting reductions of about 30% from baseline, but did not differ significantly between groups (p = .423). The same results pertained for disability scores. In responder analyses of those who completed 12 weeks (N=72), the proportion reporting at least 30% or 50% reduction in pain intensity, or at least “Minimal Improvement” on the Physician Clinical Global Impression of Change did not differ significantly between groups. There were no significant differences in analgesia between participants with radiating (n = 46) and non-radiating (n = 62) pain either within or between treatment arms. There was no significant correlation between gabapentin plasma concentration and pain intensity. Gabapentin appears to be ineffective for analgesia in chronic low back pain with or without a radiating component. PMID:26963844

  11. A Randomized Control Trial of a Community Mental Health Intervention for Military Personnel

    DTIC Science & Technology

    2013-10-01

    Subject terms: mental health literacy, Mental Health First Aid (MHFA), curriculum adaptation. The program complements Stress First Aid and suicide prevention gatekeeper training by providing a mental health literacy component that is currently not addressed.

  12. K-Fold Crossvalidation in Canonical Analysis.

    ERIC Educational Resources Information Center

    Liang, Kun-Hsia; And Others

    1995-01-01

    A computer-assisted, K-fold cross-validation technique is discussed in the framework of canonical correlation analysis of randomly generated data sets. Analysis results suggest that this technique can effectively reduce the contamination of canonical variates and canonical correlations by sample-specific variance components. (Author/SLD)
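
    A compact sketch of K-fold cross-validation wrapped around a canonical correlation analysis, assuming Python with scikit-learn; the random data and the choice of 5 folds and 2 components are illustrative:

      import numpy as np
      from sklearn.cross_decomposition import CCA
      from sklearn.model_selection import KFold

      rng = np.random.default_rng(3)
      X = rng.normal(size=(200, 5))                 # two illustrative variable sets
      Y = rng.normal(size=(200, 4))

      for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
          cca = CCA(n_components=2).fit(X[train], Y[train])
          U, V = cca.transform(X[test], Y[test])
          r = [np.corrcoef(U[:, k], V[:, k])[0, 1] for k in range(U.shape[1])]
          print(np.round(r, 3))                     # held-out canonical correlations shrink toward 0 for random data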

  13. Repeated and random components in Oklahoma's monthly precipitation record

    USDA-ARS?s Scientific Manuscript database

    Precipitation across Oklahoma exhibits a high degree of spatial and temporal variability and creates numerous water resources management challenges. The monthly precipitation record of the Central Oklahoma climate division was evaluated in a proof-of-concept to establish whether a simple monthly pre...

  14. Aging Theories for Establishing Safe Life Spans of Airborne Critical Structural Components

    NASA Technical Reports Server (NTRS)

    Ko, William L.

    2003-01-01

    New aging theories have been developed to establish the safe life span of airborne critical structural components such as B-52B aircraft pylon hooks for carrying air-launch drop-test vehicles. The new aging theories use the equivalent-constant-amplitude loading spectrum to represent the actual random loading spectrum with the same damaging effect. The crack growth due to random loading cycling of the first flight is calculated using the half-cycle theory, and then extrapolated to all the crack growths of the subsequent flights. The predictions of the new aging theories (finite difference aging theory and closed-form aging theory) are compared with the classical flight-test life theory and the previously developed Ko first- and Ko second-order aging theories. The new aging theories predict the number of safe flights as considerably lower than that predicted by the classical aging theory, and slightly lower than those predicted by the Ko first- and Ko second-order aging theories due to the inclusion of all the higher order terms.

  15. Non-stationary least-squares complex decomposition for microseismic noise attenuation

    NASA Astrophysics Data System (ADS)

    Chen, Yangkang

    2018-06-01

    Microseismic data processing and imaging are crucial for subsurface real-time monitoring during the hydraulic fracturing process. Unlike the active-source seismic events or large-scale earthquake events, the microseismic event is usually of very small magnitude, which makes its detection challenging. The biggest difficulty with microseismic data is the low signal-to-noise ratio. Because of the small energy difference between effective microseismic signal and ambient noise, the effective signals are usually buried in strong random noise. I propose a useful microseismic denoising algorithm that is based on decomposing a microseismic trace into an ensemble of components using least-squares inversion. Based on the predictive property of the useful microseismic event along the time direction, the random noise can be filtered out via least-squares fitting of multiple damped exponential components. The method is flexible and almost automated since the only parameter that needs to be defined is the decomposition number. I use some synthetic and real data examples to demonstrate the potential of the algorithm in processing complicated microseismic data sets.
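
    A rough sketch of denoising by regularized least-squares fitting of damped exponential (damped cosine/sine) components, assuming Python with NumPy; the dictionary grid, damping rates, and regularization weight are assumptions for illustration, not the paper's algorithm settings:

      import numpy as np

      rng = np.random.default_rng(4)
      n, dt = 400, 0.002
      t = np.arange(n) * dt
      clean = np.exp(-8 * t) * np.cos(2 * np.pi * 40 * t)    # toy microseismic arrival
      trace = clean + 0.3 * rng.normal(size=n)               # arrival buried in random noise

      # Dictionary of damped cosines and sines over a coarse frequency/damping grid.
      cols = [np.exp(-a * t) * f(2 * np.pi * fq * t)
              for a in (2.0, 8.0, 20.0) for fq in np.arange(5, 120, 5) for f in (np.cos, np.sin)]
      A = np.stack(cols, axis=1)

      lam = 0.5                                              # damped (regularized) least squares
      coef = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ trace)
      denoised = A @ coef                                    # noise-attenuated reconstruction of the trace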

  16. PCA-LBG-based algorithms for VQ codebook generation

    NASA Astrophysics Data System (ADS)

    Tsai, Jinn-Tsong; Yang, Po-Yuan

    2015-04-01

    Vector quantisation (VQ) codebooks are generated by combining principal component analysis (PCA) algorithms with Linde-Buzo-Gray (LBG) algorithms. All training vectors are grouped according to the projected values of the principal components. The PCA-LBG-based algorithms include (1) PCA-LBG-Median, which selects the median vector of each group, (2) PCA-LBG-Centroid, which adopts the centroid vector of each group, and (3) PCA-LBG-Random, which randomly selects a vector of each group. The LBG algorithm finds a codebook based on the better vectors sent to an initial codebook by the PCA. The PCA performs an orthogonal transformation to convert a set of potentially correlated variables into a set of variables that are not linearly correlated. Because the orthogonal transformation efficiently distinguishes test image vectors, the proposed PCA-LBG-based algorithm is expected to outperform conventional algorithms in designing VQ codebooks. The experimental results confirm that the proposed PCA-LBG-based algorithms indeed obtain better results compared to existing methods reported in the literature.
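
    A small sketch of the PCA-LBG-Median idea (group training vectors by their first-principal-component projection, seed the codebook with group medians, then refine with LBG/k-means-style iterations), assuming Python with NumPy; the data, codebook size, and iteration count are illustrative:

      import numpy as np

      def pca_lbg_median(train, n_codes=16, n_iter=20):
          x = train - train.mean(axis=0)
          _, _, vt = np.linalg.svd(x, full_matrices=False)
          proj = x @ vt[0]                              # projection on the first principal component
          groups = np.array_split(np.argsort(proj), n_codes)
          book = np.array([np.median(train[g], axis=0) for g in groups])  # median-seeded codebook

          for _ in range(n_iter):                       # LBG refinement
              d = ((train[:, None, :] - book[None, :, :]) ** 2).sum(-1)
              near = d.argmin(axis=1)
              for k in range(n_codes):
                  if np.any(near == k):
                      book[k] = train[near == k].mean(axis=0)
          return book

      codebook = pca_lbg_median(np.random.default_rng(5).normal(size=(2000, 8)))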

  17. Known plaintext attack on double random phase encoding using fingerprint as key and a method for avoiding the attack.

    PubMed

    Tashima, Hideaki; Takeda, Masafumi; Suzuki, Hiroyuki; Obi, Takashi; Yamaguchi, Masahiro; Ohyama, Nagaaki

    2010-06-21

    We have shown that the application of double random phase encoding (DRPE) to biometrics enables the use of biometrics as cipher keys for binary data encryption. However, DRPE is reported to be vulnerable to known-plaintext attacks (KPAs) using a phase recovery algorithm. In this study, we investigated the vulnerability of DRPE using fingerprints as cipher keys to the KPAs. By means of computational experiments, we estimated the encryption key and restored the fingerprint image using the estimated key. Further, we propose a method for avoiding the KPA on the DRPE that employs the phase retrieval algorithm. The proposed method makes the amplitude component of the encrypted image constant in order to prevent the amplitude component of the encrypted image from being used as a clue for phase retrieval. Computational experiments showed that the proposed method not only avoids revealing the cipher key and the fingerprint but also serves as a sufficiently accurate verification system.
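
    A bare-bones sketch of double random phase encoding and its decryption with the correct Fourier-plane key, assuming Python with NumPy; the image is a random placeholder rather than a fingerprint, and the proposed constant-amplitude countermeasure is not implemented here:

      import numpy as np

      rng = np.random.default_rng(6)
      img = rng.random((64, 64))                          # stand-in for the plaintext image

      mask1 = np.exp(2j * np.pi * rng.random(img.shape))  # input-plane random phase mask
      mask2 = np.exp(2j * np.pi * rng.random(img.shape))  # Fourier-plane random phase mask (the key)

      encrypted = np.fft.ifft2(np.fft.fft2(img * mask1) * mask2)
      decrypted = np.abs(np.fft.ifft2(np.fft.fft2(encrypted) * np.conj(mask2)))
      print(np.allclose(decrypted, img))                  # True: the correct key recovers the image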

  18. [The reentrant binomial model of nuclear anomaly growth in rhabdomyosarcoma RA-23 cell populations under an increasing dose of sparsely ionizing radiation].

    PubMed

    Alekseeva, N P; Alekseev, A O; Vakhtin, Iu B; Kravtsov, V Iu; Kuzovatov, S N; Skorikova, T I

    2008-01-01

    Distributions of nuclear morphology anomalies in transplantable rhabdomyosarcoma RA-23 cell populations were investigated under the effect of ionizing radiation at doses from 0 to 45 Gy. Internuclear bridges, nuclear protrusions and dumbbell-shaped nuclei were counted as morphological anomalies. Empirical distributions of the number of anomalies per 100 nuclei were used. An adequate model, the reentrant binomial distribution, was found: the sum of binomial random variables whose number of summands is itself binomial has such a distribution. The averages of these random variables were named the internal and external average reentrant components, respectively. Their maximum likelihood estimates were obtained, and the statistical properties of these estimates were investigated by means of statistical modeling. Although the correlation between the radiation dose and the average number of nuclear anomalies is equally significant in both cases, in cell populations two to three cell cycles after irradiation in vivo the dose correlates significantly with the internal average reentrant component, whereas in remote descendants of cell transplants irradiated in vitro it correlates with the external one.
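
    A small sampling sketch of the reentrant binomial construction (a binomial number of summands, each summand itself binomial), assuming Python with NumPy; the parameter values are arbitrary and only serve to check the empirical mean against the product of the two average components, m·q·k·p:

      import numpy as np

      rng = np.random.default_rng(7)

      def reentrant_binomial(m, q, k, p, size):
          n = rng.binomial(m, q, size=size)             # "external" number of summands
          return np.array([rng.binomial(k, p, size=nn).sum() for nn in n])

      y = reentrant_binomial(m=20, q=0.3, k=10, p=0.2, size=5000)
      print(y.mean(), 20 * 0.3 * 10 * 0.2)              # empirical mean vs. the product of the two averages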

  19. Exercise to Enhance Smoking Cessation: the Getting Physical on Cigarette Randomized Control Trial.

    PubMed

    Prapavessis, Harry; De Jesus, Stefanie; Fitzgeorge, Lindsay; Faulkner, Guy; Maddison, Ralph; Batten, Sandra

    2016-06-01

    Exercise has been proposed as a useful smoking cessation aid. The purpose of the present study is to determine the effect of an exercise-aided smoking cessation intervention program, with built-in maintenance components, on post-intervention 14-, 26- and 56-week cessation rates. Female cigarette smokers (n = 413) participating in a supervised exercise and nicotine replacement therapy (NRT) smoking cessation program were randomized to one of four conditions: exercise + smoking cessation maintenance, exercise maintenance + contact control, smoking cessation maintenance + contact control or contact control. The primary outcome was continuous smoking abstinence. Abstinence differences were found between the exercise and equal contact non-exercise maintenance groups at weeks 14 (57 vs 43 %), 26 (27 vs 21 %) and 56 (26 vs 23.5 %), respectively. Only the week 14 difference approached significance, p = 0.08. An exercise-aided NRT smoking cessation program with built-in maintenance components enhances post-intervention cessation rates at week 14 but not at weeks 26 and 56.

  20. Intraoperative Comparison of Measured Resection and Gap Balancing Using a Force Sensor: A Prospective, Randomized Controlled Trial.

    PubMed

    Cidambi, Krishna R; Robertson, Nicholas; Borges, Camille; Nassif, Nader A; Barnett, Steven L

    2018-07-01

    For establishing femoral component position, gap-balancing (GB) and measured resection (MR) techniques were compared using a force sensor. Ninety-one patients were randomized to undergo primary total knee arthroplasty using either MR (n = 43) or GB (n = 48) technique using a single total knee arthroplasty design. GB was performed with an instrumented tensioner. Force sensor data were obtained before the final implantation. GB resulted in greater range of femoral component rotation vs MR (1.5° ± 2.9° vs 3.1° ± 0.5°, P < .05) and posterior condylar cut thickness medially (10.2 ± 2.0 mm vs 9.0 ± 1.3 mm) and laterally (8.5 ± 1.9 mm vs 6.4 ± 1.0 mm). Force sensor data showed a decreased intercompartmental force difference at full flexion in GB (.8 ± 2.3 vs 2.0 ± 3.3u, 1u ≈ 15 N, P < .05). GB resulted in a greater range of femoral component rotation and thicker posterior condylar cuts resulting in an increased flexion space relative to MR. Intercompartmental force difference trended toward a more uniform distribution between full extension and full flexion in the GB vs MR group. Copyright © 2018 Elsevier Inc. All rights reserved.

  1. Statistics of partially-polarized fields: beyond the Stokes vector and coherence matrix

    NASA Astrophysics Data System (ADS)

    Charnotskii, Mikhail

    2017-08-01

    Traditionally, partially polarized light is characterized by the four Stokes parameters. An equivalent description is also provided by the correlation tensor of the optical field. These statistics specify only the second moments of the complex amplitudes of the narrow-band two-dimensional electric field of the optical wave. The electric field vector of a random quasi-monochromatic wave is a nonstationary, oscillating, two-dimensional real random variable. We introduce a novel statistical description of these partially polarized waves: the Period-Averaged Probability Density Function (PA-PDF) of the field. The PA-PDF contains more information on the polarization state of the field than the Stokes vector. In particular, in addition to the conventional distinction between the polarized and depolarized components of the field, the PA-PDF allows the coherent and fluctuating components of the field to be separated. We present several model examples of fields with identical Stokes vectors and very distinct shapes of the PA-PDF. In the simplest case of a nonstationary, oscillating normal 2-D probability distribution of the real electric field and a stationary 4-D probability distribution of the complex amplitudes, the newly introduced PA-PDF is determined by 13 parameters that include the first moments and the covariance matrix of the quadrature components of the oscillating vector field.

  2. Diamond, aromatic, aliphatic components of interstellar dust grains: Random covalent networks in carbonaceous grains

    NASA Astrophysics Data System (ADS)

    Duley, W. W.

    1995-05-01

    A formalism based on the theory of random covalent networks (RCNs) in amorphous solids is developed for carbonaceous dust grains. RCN solutions provide optimized structures and relative compositions for amorphous materials. By inclusion of aliphatic, aromatic, and diamond clusters, solutions specific to interstellar materials can be obtained and compared with infrared spectral data. It is found that distinct RCN solutions corresponding to diffuse cloud and molecular cloud materials are possible. Specific solutions are derived for three representative objects: VI Cyg No. 12, NGC 7538 (IRS 9), and GC IRS 7. While diffuse cloud conditions with a preponderance of sp2 and sp3 bonded aliphatic CH species can be reproduced under a variety of RCN conditions, the presence of an abundant tertiary CH or diamond component is highly constrained. These solutions are related quantitatively to carbon depletions and can be used to provide a quantitative estimate of carbon in these various dust components. Despite the abundance of C6 aromatic rings in many RCN solutions, the infrared absorption due to the aromatic stretch at approximately 3.3 micrometers is weak under all conditions. The RCN formalism is shown to provide a useful method for tracing the evolutionary properties of interstellar carbonaceous grains.

  3. A stochastic approach to noise modeling for barometric altimeters.

    PubMed

    Sabatini, Angelo Maria; Genovese, Vincenzo

    2013-11-18

    The question of whether barometric altimeters can be applied to accurately track human motions is still debated, since their measurement performance is rather poor due to either coarse resolution or drifting behavior problems. As a step toward accurate short-time tracking of changes in height (up to a few minutes), we develop a stochastic model that attempts to capture some statistical properties of the barometric altimeter noise. The barometric altimeter noise is decomposed into three components with different physical origin and properties: a deterministic time-varying mean, mainly correlated with global environment changes, and a first-order Gauss-Markov (GM) random process, mainly accounting for short-term, local environment changes, the effects of which are prominent, respectively, for long-time and short-time motion tracking; an uncorrelated random process, mainly due to wideband electronic noise, including quantization noise. Autoregressive-moving average (ARMA) system identification techniques are used to capture the correlation structure of the piecewise stationary GM component, and to estimate its standard deviation, together with the standard deviation of the uncorrelated component. M-point moving average filters used alone or in combination with whitening filters learnt from ARMA model parameters are further tested in a few dynamic motion experiments and discussed for their capability of short-time tracking small-amplitude, low-frequency motions.
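
    A minimal simulation sketch of the noise decomposition described above (slow deterministic trend + first-order Gauss-Markov process + uncorrelated wideband noise), assuming Python with NumPy; the correlation time, standard deviations, and drift rate are placeholder values, not the identified parameters:

      import numpy as np

      rng = np.random.default_rng(8)
      fs, seconds = 25.0, 300                        # sampling rate and record length (illustrative)
      n = int(fs * seconds)
      dt, tau, sig_gm, sig_wn = 1 / fs, 30.0, 0.15, 0.05   # placeholder noise parameters, metres

      phi = np.exp(-dt / tau)                        # first-order Gauss-Markov transition coefficient
      gm = np.zeros(n)
      for k in range(1, n):
          gm[k] = phi * gm[k - 1] + rng.normal(0, sig_gm * np.sqrt(1 - phi ** 2))

      trend = 0.002 * np.arange(n) * dt              # slowly varying mean (environmental drift)
      noise = trend + gm + rng.normal(0, sig_wn, size=n)   # GM component plus uncorrelated component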

  4. Strategic allocation of attention reduces temporally predictable stimulus conflict

    PubMed Central

    Appelbaum, L. Gregory; Boehler, Carsten N.; Won, Robert; Davis, Lauren; Woldorff, Marty G.

    2013-01-01

    Humans are able to continuously monitor environmental situations and adjust their behavioral strategies to optimize performance. Here we investigate the behavioral and brain adjustments that occur when conflicting stimulus elements are, or are not, temporally predictable. Event-related potentials (ERPs) were collected while manual-response variants of the Stroop task were performed in which the stimulus onset asynchronies (SOAs) between the relevant-color and irrelevant-word stimulus components were either randomly intermixed, or held constant, within each experimental run. Results indicated that the size of both the neural and behavioral effects of stimulus incongruency varied with the temporal arrangement of the stimulus components, such that the random-SOA arrangements produced the greatest incongruency effects at the earliest irrelevant-first SOA (−200 ms) and the constant-SOA arrangements produced the greatest effects with simultaneous presentation. These differences in conflict processing were accompanied by rapid (~150 ms) modulations of the sensory ERPs to the irrelevant distracter components when they occurred consistently first. These effects suggest that individuals are able to strategically allocate attention in time to mitigate the influence of a temporally predictable distracter. As these adjustments are instantiated by the subjects without instruction, they reveal a form of rapid strategic learning for dealing with temporally predictable stimulus incongruency. PMID:22360623

  5. An exercise trial for wheelchair users: Project Workout on Wheels

    PubMed Central

    Froehlich-Grobe, Katherine; Aaronson, Lauren S.; Washburn, Richard A.; Little, Todd D.; Lee, Jaehoon; Nary, Dorothy E.; VanSciver, Angela; Nesbitt, Jill; Norman, Sarah E.

    2011-01-01

    There is growing interest in promoting health for people with disabilities, yet evidence regarding community-based interventions is sparse. This paper describes the design details of a randomized controlled trial (RCT) that will test the effectiveness of a multi-component behaviorally-based, intervention to promote exercise adoption (over 6 months) and maintenance (up to one year) among wheelchair users and includes descriptive data on participant characteristics at baseline. Participants were randomly assigned to either a staff-supported intervention group or a self-guided comparison group. The primary study aim is to assess the effectiveness of the multi-component behaviorally-based intervention for promoting physical activity adoption and maintenance. The RCT will also assess the physical and psychosocial effects of the intervention and the complex interplay of factors that influence the effectiveness of the intervention. Therefore, the primary outcome derives from participant reports of weekly exercise (type, frequency, duration) over 52 weeks. Secondary outcomes collected on four occasions (baseline, 3 months, 6 months, 12 months) included physiological outcomes (VO2 peak, strength), disability-related outcomes (pain, fatigue, participation), and psychosocial outcomes (exercise self-efficacy, exercise barriers, quality of life, depression, mood). This study will provide evidence regarding the effectiveness of a multi-component behaviorally-based intervention for promoting exercise adoption among people with mobility impairments that necessitate wheelchair use. PMID:22101206

  6. Three-part joint modeling methods for complex functional data mixed with zero-and-one-inflated proportions and zero-inflated continuous outcomes with skewness.

    PubMed

    Li, Haocheng; Staudenmayer, John; Wang, Tianying; Keadle, Sarah Kozey; Carroll, Raymond J

    2018-02-20

    We take a functional data approach to longitudinal studies with complex bivariate outcomes. This work is motivated by data from a physical activity study that measured 2 responses over time in 5-minute intervals. One response is the proportion of time active in each interval, a continuous proportion with excess zeros and ones. The other response, energy expenditure rate in the interval, is a continuous variable with excess zeros and skewness. This outcome is complex because there are 3 possible activity patterns in each interval (inactive, partially active, and completely active), and those patterns, which are observed, induce both nonrandom and random associations between the responses. More specifically, the inactive pattern requires a zero value in both the proportion for active behavior and the energy expenditure rate; a partially active pattern means that the proportion of activity is strictly between zero and one and that the energy expenditure rate is greater than zero and likely to be moderate, and the completely active pattern means that the proportion of activity is exactly one, and the energy expenditure rate is greater than zero and likely to be higher. To address these challenges, we propose a 3-part functional data joint modeling approach. The first part is a continuation-ratio model to reorder the 3 ordinal-valued activity patterns. The second part models the proportions when they are in the interval (0,1). The last component specifies the skewed continuous energy expenditure rate with Box-Cox transformations when they are greater than zero. In this 3-part model, the regression structures are specified as smooth curves measured at various time points with random effects that have a correlation structure. The smoothed random curves for each variable are summarized using a few important principal components, and the association of the 3 longitudinal components is modeled through the association of the principal component scores. The difficulties in handling the ordinal and proportional variables are addressed using a quasi-likelihood type approximation. We develop an efficient algorithm to fit the model that also involves the selection of the number of principal components. The method is applied to physical activity data and is evaluated empirically by a simulation study. Copyright © 2017 John Wiley & Sons, Ltd.

  7. Forensic firearm identification of semiautomatic handguns using laser formed microstamping elements

    NASA Astrophysics Data System (ADS)

    Lizotte, Todd E.; Ohar, Orest

    2008-08-01

    For well over one hundred years the science of Firearm and Tool Mark Identification has relied on the theory that unintentional random tooling marks generated during the manufacture of a firearm onto its interior surfaces are unique to each individual firearm.[1][2] Forensic Firearm and Tool Mark Examiners have had to rely on the analysis of these randomly formed unintentional striations, or scratches and dings, transferred onto ammunition components from firearms used to commit crimes, as a way of developing clues and evidence. Such transfers take place during the cycle of fire and ejection of the cartridge from the firearm during the commission of a crime. The typical striations on the cartridge casings are caused by tooling marks that are randomly formed during the machining of interior surfaces of the manufactured firearm and by other firearm components that come in contact with the cycling ammunition. Components like the firing pin, extractor and ejector, impact the surfaces of the cartridges as they are fed, fired and ejected from the firearm. When found at a crime scene, these striae constitute ballistic evidence when effectively analyzed by a Forensic Firearm and Tool Mark Examiner. Examiners categorize these striations looking for matches to be made between the components that created the marks and the recovered firearm. Reality is that nearly 50% of firearms used in violent crimes are not recovered at a crime scene, requiring the analysis to be processed and logged into evidence files or imaged into reference image databases for future comparison whenever a firearm might be recovered. This paper will present a unique law enforcement technology, embedded into firearms for tracking the sources of illegally trafficked firearms, called Microstamping. Microstamping is a laser based micromachining process that forms microscopic "intentional structures and marks" on components within a firearm. Thus when the firearm is fired, these microstamp structures transfer an identifying tracking code onto the expended cartridge ejected from the firearm. Microstamped structures are laser micromachined alpha numeric and encoded geometric tracking numbers, linked to the serial number of the firearm. Ballistic testing data will be presented covering microstamp transfer quality, transfer rates and survivability/durability. Further information will provide an overview on how microstamping information can be utilized by law enforcement to combat illegal firearm trafficking.

  8. Probabilistic Structural Analysis Methods (PSAM) for Select Space Propulsion System Components

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Probabilistic Structural Analysis Methods (PSAM) are described for the probabilistic structural analysis of engine components for current and future space propulsion systems. Components for these systems are subjected to stochastic thermomechanical launch loads. Uncertainties or randomness also occurs in material properties, structural geometry, and boundary conditions. Material property stochasticity, such as in modulus of elasticity or yield strength, exists in every structure and is a consequence of variations in material composition and manufacturing processes. Procedures are outlined for computing the probabilistic structural response or reliability of the structural components. The response variables include static or dynamic deflections, strains, and stresses at one or several locations, natural frequencies, fatigue or creep life, etc. Sample cases illustrate how the PSAM methods and codes simulate input uncertainties and compute probabilistic response or reliability using a finite element model with probabilistic methods.

  9. Probabilistic Structures Analysis Methods (PSAM) for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The basic formulation for probabilistic finite element analysis is described and demonstrated on a few sample problems. This formulation is based on iterative perturbation that uses the factorized stiffness of the unperturbed system as the iteration preconditioner for obtaining the solution to the perturbed problem. This approach eliminates the need to compute, store and manipulate explicit partial derivatives of the element matrices and force vector, which not only reduces memory usage considerably, but also greatly simplifies the coding and validation tasks. All aspects of the proposed formulation were combined in a demonstration problem using a simplified model of a curved turbine blade discretized with 48 shell elements, and having random pressure and temperature fields with partial correlation, random uniform thickness, and random stiffness at the root.
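
    A toy sketch of the iterative-perturbation idea (reuse the factorization of the unperturbed stiffness as a preconditioner to solve the perturbed system), assuming Python with NumPy and SciPy; the matrices are random stand-ins rather than finite element matrices:

      import numpy as np
      from scipy.linalg import cho_factor, cho_solve

      rng = np.random.default_rng(9)
      n = 50
      A = rng.normal(size=(n, n))
      K0 = A @ A.T + n * np.eye(n)                  # unperturbed (SPD) stiffness matrix
      dK = 0.05 * (A + A.T)                         # small random perturbation of the stiffness
      f = rng.normal(size=n)

      factor = cho_factor(K0)                       # factorize the unperturbed stiffness once
      x = cho_solve(factor, f)                      # zeroth-order solution
      for _ in range(20):                           # fixed-point iteration on the perturbed system
          x = cho_solve(factor, f - dK @ x)

      print(np.linalg.norm((K0 + dK) @ x - f))      # residual of the perturbed problem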

  10. Anderson transition in a multiply-twisted helix.

    PubMed

    Ugajin, R

    2001-06-01

    We investigated the Anderson transition in a multiply-twisted helix in which a helical chain of components, i.e., atoms or nanoclusters, is twisted to produce a doubly-twisted helix, which itself can be twisted to produce a triply-twisted helix, and so on, in which there are couplings between adjacent rounds of helices. As the strength of the on-site random potentials increases, an Anderson transition occurs, suggesting that the number of dimensions is 3 for electrons running along the multiply-twisted helix when the couplings between adjacent rounds are strong enough. If the couplings are weakened, the dimensionality becomes less, resulting in localization of electrons. The effect of random connections between adjacent rounds of helices and random magnetic fields that thread the structure is analyzed using the spectral statistics of a quantum particle.

  11. Three-dimensional direct laser written graphitic electrical contacts to randomly distributed components

    NASA Astrophysics Data System (ADS)

    Dorin, Bryce; Parkinson, Patrick; Scully, Patricia

    2018-04-01

    The development of cost-effective electrical packaging for randomly distributed micro/nano-scale devices is a widely recognized challenge for fabrication technologies. Three-dimensional direct laser writing (DLW) has been proposed as a solution to this challenge, and has enabled the creation of rapid and low resistance graphitic wires within commercial polyimide substrates. In this work, we utilize the DLW technique to electrically contact three fully encapsulated and randomly positioned light-emitting diodes (LEDs) in a one-step process. The resolution of the contacts is on the order of 20 μm, with an average circuit resistance of 29 ± 18 kΩ per LED contacted. The speed and simplicity of this technique make it promising for meeting the needs of future microelectronics and device packaging.

  12. Perturbed effects at radiation physics

    NASA Astrophysics Data System (ADS)

    Külahcı, Fatih; Şen, Zekâi

    2013-09-01

    Perturbation methodology is applied in order to assess the linear attenuation coefficient, mass attenuation coefficient and cross-section behavior with random components in the basic variables, such as the radiation amounts frequently used in radiation physics and chemistry. Additionally, the layer attenuation coefficient (LAC) and perturbed LAC (PLAC) are proposed for different contact materials. Perturbation methodology provides the opportunity to obtain results with random deviations from the average behavior of each variable that enters the whole mathematical expression. The basic photon intensity variation expression, the inverse exponential power law (the Beer-Lambert law), is adopted to expound the perturbation method. Perturbed results are presented not only in terms of the mean but also the standard deviation and the correlation coefficients. Such perturbation expressions allow one to assess small random variability in the basic variables.
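
    A simple Monte Carlo sketch of perturbing the Beer-Lambert law I = I0·exp(-μx) with a random component in the attenuation coefficient, assuming Python with NumPy; the 5% scatter, thickness, and coefficient value are illustrative:

      import numpy as np

      rng = np.random.default_rng(10)
      I0, x = 1.0, 2.0                                        # incident intensity and thickness (illustrative)
      mu = 0.2 * (1 + 0.05 * rng.normal(size=100_000))        # attenuation coefficient with 5% random scatter

      I = I0 * np.exp(-mu * x)                                # Beer-Lambert law per realization
      print(I.mean(), I.std())                                # perturbed mean and standard deviation
      print(np.corrcoef(mu, I)[0, 1])                         # correlation between coefficient and intensity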

  13. Digital versus analogue preoperative planning of total hip arthroplasties: a randomized clinical trial of 210 total hip arthroplasties.

    PubMed

    The, Bertram; Verdonschot, Nico; van Horn, Jim R; van Ooijen, Peter M A; Diercks, Ron L

    2007-09-01

    The objective of this randomized clinical trial was to compare the clinical and technical results of digital preoperative planning for primary total hip arthroplasties with analogue planning. Two hundred and ten total hip arthroplasties were randomized. All plans were constructed on standardized radiographs by the surgeon who performed the arthroplasty the next day. The main outcome was accuracy of the preoperative plan. Secondary outcomes were operation time and a radiographic assessment of the arthroplasty. Digital preoperative plans were more accurate in planning the cup (P < .05) and scored higher on the postoperative radiologic assessment of cemented cup (P = .03) and stem (P < .01) components. None of the other comparisons reached statistical significance. We conclude that digital plans slightly outperform analogue plans.

  14. An investigation into the probabilistic combination of quasi-static and random accelerations

    NASA Technical Reports Server (NTRS)

    Schock, R. W.; Tuell, L. P.

    1984-01-01

    The development of design load factors for aerospace and aircraft components and experiment support structures, which are subject to simultaneous vehicle dynamic vibration (quasi-static) and acoustically generated random vibration, requires the selection of a combination methodology. Typically, the procedure is to define the quasi-static and the randomly generated response separately, and to arithmetically add them or take their root sum square to obtain combined accelerations. Since the combination of a probabilistic and a deterministic function yields a probabilistic function, a viable alternate approach would be to determine the characteristics of the combined acceleration probability density function and select an appropriate percentile level for the combined acceleration. The following paper develops this mechanism and provides graphical data to select combined accelerations for the most popular percentile levels.
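
    A short numeric sketch contrasting the arithmetic and root-sum-square combinations with a percentile read directly from the combined probability distribution, assuming Python with SciPy and a zero-mean Gaussian random component added to a deterministic quasi-static level; the acceleration values are illustrative:

      import numpy as np
      from scipy import stats

      a_qs = 5.0                                     # quasi-static (deterministic) acceleration, g
      grms = 3.0                                     # random-vibration RMS acceleration, g

      arithmetic = a_qs + 3 * grms                   # arithmetic addition of the 3-sigma random level
      rss = np.sqrt(a_qs ** 2 + (3 * grms) ** 2)     # root-sum-square combination
      # If the random part is zero-mean Gaussian, the combined acceleration is Gaussian
      # with mean a_qs and standard deviation grms, so a percentile can be read directly.
      p3sigma = stats.norm.ppf(0.9987, loc=a_qs, scale=grms)
      print(arithmetic, rss, p3sigma)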

  15. Activity Begins in Childhood (ABC) - inspiring healthy active behaviour in preschoolers: study protocol for a cluster randomized controlled trial.

    PubMed

    Adamo, Kristi B; Barrowman, Nick; Naylor, Patti Jean; Yaya, Sanni; Harvey, Alysha; Grattan, Kimberly P; Goldfield, Gary S

    2014-07-29

    Today's children are more overweight than previous generations and physical inactivity is a contributing factor. Modelling and promoting positive behaviour in the early years is imperative for the development of lifelong health habits. The social and physical environments where children spend their time have a powerful influence on behaviour. Since the majority of preschool children spend time in care outside of the home, this provides an ideal setting to examine the ability of an intervention to enhance movement skills and modify physical activity behaviour. This study aims to evaluate the efficacy of the Activity Begins in Childhood (ABC) intervention delivered in licensed daycare settings alone or in combination with a parent-driven home physical activity-promotion component to increase preschoolers' overall physical activity levels and, specifically, the time spent in moderate to vigorous physical activity. This study is a single site, three-arm, cluster-randomized controlled trial design with a daycare centre as the unit of measurement (clusters). All daycare centres in the National Capital region that serve children between the ages of 3 and 5, expressing an interest in receiving the ABC intervention will be invited to participate. Those who agree will be randomly assigned to one of three groups: i) ABC program delivered at a daycare centre only, ii) ABC program delivered at daycare with a home/parental education component, or iii) regular daycare curriculum. This study will recruit 18 daycare centres, 6 in each of the three groups. The intervention will last approximately 6 months, with baseline assessment prior to ABC implementation and follow-up assessments at 3 and 6 months. Physical activity is an acknowledged component of a healthy lifestyle and childhood experiences as it has an important impact on lifelong behaviour and health. Opportunities for physical activity and motor development in early childhood may, over the lifespan, influence the maintenance of a healthy body weight and reduce cardiovascular disease risk. If successful, the ABC program may be implemented in daycare centres as an effective way of increasing healthy activity behaviours of preschoolers. Current Controlled Trials: ISRCTN94022291. Registered in December 2012, first cluster randomized in April 2013.

  16. Exploring the use of random regression models with legendre polynomials to analyze measures of volume of ejaculate in Holstein bulls.

    PubMed

    Carabaño, M J; Díaz, C; Ugarte, C; Serrano, M

    2007-02-01

    Artificial insemination centers routinely collect records of quantity and quality of semen of bulls throughout the animals' productive period. The goal of this paper was to explore the use of random regression models with orthogonal polynomials to analyze repeated measures of semen production of Spanish Holstein bulls. A total of 8,773 records of volume of first ejaculate (VFE) collected between 12 and 30 mo of age from 213 Spanish Holstein bulls was analyzed under alternative random regression models. Legendre polynomial functions of increasing order (0 to 6) were fitted to the average trajectory, additive genetic and permanent environmental effects. Age at collection and days in production were used as time variables. Heterogeneous and homogeneous residual variances were alternatively assumed. Analyses were carried out within a Bayesian framework. The logarithm of the marginal density and the cross-validation predictive ability of the data were used as model comparison criteria. Based on both criteria, age at collection as a time variable and heterogeneous residuals models are recommended to analyze changes of VFE over time. Both criteria indicated that fitting random curves for genetic and permanent environmental components as well as for the average trajectory improved the quality of the models. Furthermore, models with a higher order polynomial for the permanent environmental (5 to 6) than for the genetic components (4 to 5) and the average trajectory (2 to 3) tended to perform best. High-order polynomials were needed to accommodate the highly oscillating nature of the phenotypic values. Heritability and repeatability estimates, disregarding the extremes of the studied period, ranged from 0.15 to 0.35 and from 0.20 to 0.50, respectively, indicating that selection for VFE may be effective at any stage. Small differences among models were observed. Apart from the extremes, estimated correlations between ages decreased steadily from 0.9 and 0.4 for measures 1 mo apart to 0.4 and 0.2 for most distant measures for additive genetic and phenotypic components, respectively. Further investigation to account for environmental factors that may be responsible for the oscillating observations of VFE is needed.
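
    As a minimal illustration of the Legendre-polynomial machinery used in such random regression models, the sketch below builds a Legendre basis over a normalized age range and fits only the average trajectory by ordinary least squares. The data are simulated, only numpy is assumed, and the genetic and permanent environmental random curves, heterogeneous residuals and Bayesian estimation of the study are deliberately omitted.

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(2)

# Hypothetical repeated records: ages (months) and a noisy volume-like trait.
age = rng.uniform(12, 30, size=400)
y = 3.0 + 0.8 * np.sin((age - 12) / 18 * np.pi) + 0.5 * rng.standard_normal(age.size)

# Map age onto [-1, 1], the interval on which Legendre polynomials are orthogonal.
t = 2.0 * (age - age.min()) / (age.max() - age.min()) - 1.0

# Design matrix with Legendre polynomials P0..P3 as columns.
X = legendre.legvander(t, 3)

# Least-squares fit of the average-trajectory coefficients.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted Legendre coefficients:", np.round(coef, 3))
```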

  17. Application of random coherence order selection in gradient-enhanced multidimensional NMR

    NASA Astrophysics Data System (ADS)

    Bostock, Mark J.; Nietlispach, Daniel

    2016-03-01

    Development of multidimensional NMR is essential to many applications, for example in high resolution structural studies of biomolecules. Multidimensional techniques enable separation of NMR signals over several dimensions, improving signal resolution, whilst also allowing identification of new connectivities. However, these advantages come at a significant cost. The Fourier transform theorem requires acquisition of a grid of regularly spaced points to satisfy the Nyquist criterion, while frequency discrimination and acquisition of a pure phase spectrum require acquisition of both quadrature components for each time point in every indirect (non-acquisition) dimension, adding a factor of 2^(N-1) to the number of free-induction decays which must be acquired, where N is the number of dimensions. Compressed sensing (CS) ℓ1-norm minimisation in combination with non-uniform sampling (NUS) has been shown to be extremely successful in overcoming the Nyquist criterion. Previously, maximum entropy reconstruction has also been used to overcome the limitation of frequency discrimination, processing data acquired with only one quadrature component at a given time interval, known as random phase detection (RPD), allowing a factor of two reduction in the number of points for each indirect dimension (Maciejewski et al. 2011 PNAS 108 16640). However, whilst this approach can be easily applied in situations where the quadrature components are acquired as amplitude modulated data, the same principle is not easily extended to phase modulated (P-/N-type) experiments where data is acquired in the form exp(iωt) or exp(-iωt), and which make up many of the multidimensional experiments used in modern NMR. Here we demonstrate a modification of the CS ℓ1-norm approach to allow random coherence order selection (RCS) for phase modulated experiments; we generalise the nomenclature for RCS and RPD as random quadrature detection (RQD). With this method, the power of RQD can be extended to the full suite of experiments available to modern NMR spectroscopy, allowing resolution enhancements for all indirect dimensions; alone or in combination with NUS, RQD can be used to improve experimental resolution, or shorten experiment times, of considerable benefit to the challenging applications undertaken by modern NMR.

  18. Multicolumn spinal cord stimulation for significant low back pain in failed back surgery syndrome: design of a national, multicentre, randomized, controlled health economics trial (ESTIMET Study).

    PubMed

    Roulaud, M; Durand-Zaleski, I; Ingrand, P; Serrie, A; Diallo, B; Peruzzi, P; Hieu, P D; Voirin, J; Raoul, S; Page, P; Fontaine, D; Lantéri-Minet, M; Blond, S; Buisset, N; Cuny, E; Cadenne, M; Caire, F; Ranoux, D; Mertens, P; Naous, H; Simon, E; Emery, E; Gadan, B; Regis, J; Sol, J-C; Béraud, G; Debiais, F; Durand, G; Guetarni Ging, F; Prévost, A; Brandet, C; Monlezun, O; Delmotte, A; d'Houtaud, S; Bataille, B; Rigoard, P

    2015-03-01

    Many studies have demonstrated the efficacy of spinal cord stimulation (SCS) for chronic neuropathic radicular pain over recent decades, but despite global favourable outcomes in failed back surgery syndrome (FBSS) with leg pain, the back pain component remains poorly controlled by neurostimulation. Technological and scientific progress has led to the development of new SCS leads, comprising a multicolumn design and a greater number of contacts. The efficacy of multicolumn SCS lead configurations for the treatment of the back pain component of FBSS has recently been suggested by pilot studies. However, a randomized controlled trial must be conducted to confirm the efficacy of new generation multicolumn SCS. Évaluation médico-économique de la STImulation MEdullaire mulTi-colonnes (ESTIMET) is a multicentre, randomized study designed to compare the clinical efficacy and health economics aspects of mono- vs. multicolumn SCS lead programming in FBSS patients with radicular pain and significant back pain. FBSS patients with a radicular pain VAS score ≥ 50 mm, associated with a significant back pain component, were recruited in 14 centres in France and implanted with multicolumn SCS. Before the lead implantation procedure, they were 1:1 randomized to monocolumn SCS (group 1) or multicolumn SCS (group 2). Programming was performed using only one column for group 1 and full use of the 3 columns for group 2. Outcome assessment was performed at baseline (pre-implantation), and 1, 3, 6 and 12 months post-implantation. The primary outcome measure was a reduction of the severity of low back pain (bVAS reduction ≥ 50%) at the 6-month visit. Additional outcome measures were changes in global pain, leg pain, paraesthesia coverage mapping, functional capacities, quality of life, neuropsychological aspects, patient satisfaction and healthcare resource consumption. Trial recruitment started in May 2012. As of September 2013, all 14 study centres have been initiated and 112/115 patients have been enrolled. Preliminary results are expected to be published in 2015. Clinical trial registration information-URL: www.clinicaltrials.gov. Unique identifier NCT01628237. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  19. Reliability-Based Design Optimization of a Composite Airframe Component

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Coroneos, Rula; Patnaik, Surya N.

    2011-01-01

    A stochastic design optimization (SDO) methodology has been developed to design airframe structural components made of metallic and composite materials. The design method accommodates uncertainties in load, strength, and material properties that are defined by distribution functions with mean values and standard deviations. A response parameter, such as a failure mode, becomes a function of reliability. The primitive variables, such as thermomechanical loads, material properties, and failure theories, as well as variables like depth of a beam or thickness of a membrane, are considered random parameters with specified distribution functions defined by mean values and standard deviations.
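
    The sketch below illustrates the basic probabilistic ingredient of reliability-based design in the crudest possible way: load and strength are treated as normal random variables with assumed means and standard deviations, and the failure probability is estimated by Monte Carlo sampling. The numbers are hypothetical, only numpy is assumed, and the actual SDO methodology couples such reliability estimates to a formal structural optimization rather than this brute-force check.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000

# Hypothetical normal distributions (mean, std) for applied stress and strength.
load = rng.normal(100.0, 15.0, n)       # e.g. stress in MPa
strength = rng.normal(160.0, 20.0, n)

p_fail = np.mean(load > strength)       # probability that load exceeds strength
print(f"estimated failure probability: {p_fail:.2e}")
print(f"reliability                  : {1.0 - p_fail:.6f}")
```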

  20. Drop Calibration of Accelerometers for Shock Measurement

    DTIC Science & Technology

    2011-08-01

    [Garbled excerpt from an uncertainty-budget table and equation.] The record notes that the DSO screen must be clear, with crisp traces and easily read values, and lists uncertainty contributions including a capacitor tolerance of ≤ ±0.01%, a drop mass reading of ≤ ±0.083% (0.1 g over 120 g, typically), and a reference mass reading of ≤ ±0.1%. The mass therefore has uncertainty components due to the reference mass, drop mass, and capacitor terms; its random component is given by Eq. (6.8), which is garbled in this extraction.

  1. All-fiber pyroelectric nanogenerator

    NASA Astrophysics Data System (ADS)

    Ghosh, Sujoy Kumar; Xie, Mengying; Bowen, Christopher Rhys; Mandal, Dipankar

    2018-04-01

    An all-fiber pyroelectric nanogenerator (PyNG) is fabricated where both the active pyroelectric component and the electrodes are composed of fiber. The pyroelectric component was made with randomly organized electrospun PVDF nano-fibers possessing ferroelectric β- and γ-phases. The PyNG possesses a high level of sensitivity and can detect temperature fluctuations as low as 2 K. In addition, the thermal energy harvesting ability of the PyNG under several temperature variations and cycling frequencies paves the way for next-generation thermal sensors and self-powered flexible micro-electronics.

  2. Cognitive Processes that Underlie Mathematical Precociousness in Young Children

    ERIC Educational Resources Information Center

    Swanson, H. Lee

    2006-01-01

    The working memory (WM) processes that underlie young children's (ages 6-8 years) mathematical precociousness were examined. A battery of tests that assessed components of WM (phonological loop, visual-spatial sketchpad, and central executive), naming speed, random generation, and fluency was administered to mathematically precocious and…

  3. A randomized controlled trial of a commercially available weight loss program

    USDA-ARS?s Scientific Manuscript database

    The U.S. Preventive Services Task Force (USPSTF) recommends that clinicians refer obese adults for intensive, multi-component behavioral counseling, yet most obese Americans choose a self-help approach to lose weight. The current study examined weight loss between a community-based, intensive behavi...

  4. Effects of Psychoeducation for Offenders in a Community Correctional Facility

    ERIC Educational Resources Information Center

    Liau, Albert K.; Shively, Randy; Horn, Mary; Landau, Jennifer; Barriga, Alvaro; Gibbs, John C.

    2004-01-01

    The present study provided a randomized outcome evaluation of the psychoeducational component of the EQUIP program. The psychoeducational curriculum was implemented in a community correctional facility for adult felony offenders. The psychoeducational curriculum is designed to remedy offenders' delays in moral judgment maturity, social cognitive…

  5. CTEPP STANDARD OPERATING PROCEDURE FOR TELEPHONE SAMPLE SUBJECTS RECRUITMENT (SOP-1.12)

    EPA Science Inventory

    The subject recruitment procedures for the telephone sample component are described in the SOP. A random telephone sample list is ordered from a commercial survey sampling firm. Using this list, introductory letters are sent to targeted homes prior to making initial telephone c...

  6. Prediction of X-33 Engine Dynamic Environments

    NASA Technical Reports Server (NTRS)

    Shi, John J.

    1999-01-01

    Rocket engines normally have two primary sources of dynamic excitation. The first source is the injector and the combustion chambers that generate wide band random vibration. The second source is the turbopumps, which produce lower levels of wide band random vibration as well as sinusoidal vibration at frequencies related to the rotating speed and multiples thereof. Additionally, the pressure fluctuations due to flow turbulence and acoustics represent secondary sources of excitation. During the development stage, in order to design/size the rocket engine components, the local dynamic environments as well as dynamic interface loads have to be defined.

  7. Reproduction of exact solutions of Lipkin model by nonlinear higher random-phase approximation

    NASA Astrophysics Data System (ADS)

    Terasaki, J.; Smetana, A.; Šimkovic, F.; Krivoruchenko, M. I.

    2017-10-01

    It is shown that the random-phase approximation (RPA) method with its nonlinear higher generalization, which was previously considered an approximation except in a very limited case, reproduces the exact solutions of the Lipkin model. The nonlinear higher RPA is based on an equation that is nonlinear in the eigenvectors and includes many-particle-many-hole components in the creation operator of the excited states. We demonstrate the exact character of the solutions analytically for the particle number N = 2 and numerically for N = 8. This finding indicates that the nonlinear higher RPA is equivalent to the exact Schrödinger equation.

  8. Random matrix theory and portfolio optimization in Moroccan stock exchange

    NASA Astrophysics Data System (ADS)

    El Alaoui, Marwane

    2015-09-01

    In this work, we use random matrix theory to analyze eigenvalues and determine whether pertinent information is present, using the Marčenko-Pastur distribution. Thus, we study cross-correlations among stocks of the Casablanca Stock Exchange. Moreover, we clean the correlation matrix of noisy elements to see whether the gap between predicted risk and realized risk would be reduced. We also analyze the distributions of eigenvector components and their degree of deviation by computing the inverse participation ratio. This analysis is a way to understand the correlation structure among stocks of the Casablanca Stock Exchange portfolio.
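
    A compact way to see the Marčenko-Pastur filter at work is sketched below, assuming only numpy: eigenvalues of the sample correlation matrix of simulated independent returns are compared with the theoretical support [(1 - √(N/T))², (1 + √(N/T))²]; in real market data, eigenvalues above the upper edge are the candidates for genuine (non-noise) correlation structure. The dimensions are arbitrary and no Casablanca Stock Exchange data are used.

```python
import numpy as np

rng = np.random.default_rng(4)

N, T = 50, 500                      # number of stocks, number of return observations
Q = T / N

# Simulated i.i.d. returns, so the correlation spectrum should be pure noise.
returns = rng.standard_normal((T, N))
corr = np.corrcoef(returns, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)

lam_minus = (1.0 - np.sqrt(1.0 / Q)) ** 2
lam_plus = (1.0 + np.sqrt(1.0 / Q)) ** 2

print(f"Marchenko-Pastur support  : [{lam_minus:.3f}, {lam_plus:.3f}]")
print(f"empirical eigenvalue range: [{eigvals.min():.3f}, {eigvals.max():.3f}]")
print(f"eigenvalues above lambda_+: {(eigvals > lam_plus).sum()}")
```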

  9. Statistical simulations of the dust foreground to cosmic microwave background polarization

    NASA Astrophysics Data System (ADS)

    Vansyngel, F.; Boulanger, F.; Ghosh, T.; Wandelt, B.; Aumont, J.; Bracco, A.; Levrier, F.; Martin, P. G.; Montier, L.

    2017-07-01

    The characterization of the dust polarization foreground to the cosmic microwave background (CMB) is a necessary step toward the detection of the B-mode signal associated with primordial gravitational waves. We present a method to simulate maps of polarized dust emission on the sphere that is similar to the approach used for CMB anisotropies. This method builds on the understanding of Galactic polarization stemming from the analysis of Planck data. It relates the dust polarization sky to the structure of the Galactic magnetic field and its coupling with interstellar matter and turbulence. The Galactic magnetic field is modeled as a superposition of a mean uniform field and a Gaussian random (turbulent) component with a power-law power spectrum of exponent αM. The integration along the line of sight carried out to compute Stokes maps is approximated by a sum over a small number of emitting layers with different realizations of the random component of the magnetic field. The model parameters are constrained to fit the power spectra of dust polarization EE, BB, and TE measured using Planck data. We find that the slopes of the E and B power spectra of dust polarization are matched for αM = -2.5, an exponent close to that measured for total dust intensity but larger than the Kolmogorov exponent - 11/3. The model allows us to compute multiple realizations of the Stokes Q and U maps for different realizations of the random component of the magnetic field, and to quantify the variance of dust polarization spectra for any given sky area outside of the Galactic plane. The simulations reproduce the scaling relation between the dust polarization power and the mean total dust intensity including the observed dispersion around the mean relation. We also propose a method to carry out multifrequency simulations, including the decorrelation measured recently by Planck, using a given covariance matrix of the polarization maps. These simulations are well suited to optimize component separation methods and to quantify the confidence with which the dust and CMB B-modes can be separated in present and future experiments. We also provide an astrophysical perspective on our phenomenological modeling of the dust polarization spectra.
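
    One building block of such simulations is a Gaussian random (turbulent) field with a power-law power spectrum. The sketch below generates a small 2D flat-sky example by filtering white noise in Fourier space with P(k) ∝ k^αM, using αM = -2.5 as quoted in the abstract; the grid size and normalization are arbitrary, only numpy is assumed, and the full model (mean field, line-of-sight layers, Stokes Q/U maps, Planck constraints) is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(5)

n = 256                  # pixels per side of the (flat-sky) map
alpha_M = -2.5           # power-spectrum exponent quoted in the abstract

# Wavenumber magnitude on the FFT grid.
kx = np.fft.fftfreq(n)
ky = np.fft.fftfreq(n)
k = np.sqrt(kx[None, :] ** 2 + ky[:, None] ** 2)
k[0, 0] = np.inf         # suppress the zero mode (k ** negative power -> 0)

# White noise filtered by sqrt(P(k)) with P(k) ~ k ** alpha_M.
white = np.fft.fft2(rng.standard_normal((n, n)))
field = np.fft.ifft2(white * k ** (alpha_M / 2.0)).real
field /= field.std()     # arbitrary normalization

print("turbulent field: mean %.3e, std %.3f" % (field.mean(), field.std()))
```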

  10. Design and methodology of a community-based cluster-randomized controlled trial for dietary behaviour change in rural Kerala

    PubMed Central

    Daivadanam, Meena; Wahlstrom, Rolf; Ravindran, T.K. Sundari; Sarma, P.S.; Sivasankaran, S.; Thankappan, K.R.

    2013-01-01

    Background Interventions targeting lifestyle-related risk factors and non-communicable diseases have contributed to the mainstream knowledge necessary for action. However, there are gaps in how this knowledge can be translated for practical day-to-day use in complex multicultural settings like that in India. Here, we describe the design of the Behavioural Intervention for Diet study, which was developed as a community-based intervention to change dietary behaviour among middle-income households in rural Kerala. Methods This was a cluster-randomized controlled trial to assess the effectiveness of a sequential stage-matched intervention to bring about dietary behaviour change by targeting the procurement and consumption of five dietary components: fruits, vegetables, salt, sugar, and oil. Following a step-wise process of pairing and exclusion of outliers, six out of 22 administrative units in the northern part of Trivandrum district, Kerala state were randomly selected and allocated to intervention or control arms. Trained community volunteers carried out the data collection and intervention delivery. An innovative tool was developed to assess household readiness-to-change, and a household measurement kit and easy formulas were introduced to facilitate the practical side of behaviour change. The 1-year intervention included a household component with sequential stage-matched intervention strategies at 0, 6, and 12 months along with counselling sessions, telephonic reminders, and home visits and a community component with general awareness sessions in the intervention arm. Households in the control arm received information on recommended levels of intake of the five dietary components and general dietary information leaflets. Discussion Formative research provided the knowledge to contextualise the design of the study in accordance with socio-cultural aspects, felt needs of the community, and the ground realities associated with existing dietary procurement, preparation, and consumption patterns. The study also addressed two key issues, namely the central role of the household as the decision unit and the long-term sustainability through the use of existing local and administrative networks and community volunteers. PMID:23866917

  11. Random Matrix Theory in molecular dynamics analysis.

    PubMed

    Palese, Luigi Leonardo

    2015-01-01

    It is well known that, in some situations, principal component analysis (PCA) carried out on molecular dynamics data results in the appearance of cosine-shaped low-index projections. Because this is reminiscent of the results obtained by performing PCA on a multidimensional Brownian dynamics, it has been suggested that short-time protein dynamics is essentially nothing more than a noisy signal. Here we use Random Matrix Theory to analyze a series of short-time molecular dynamics experiments which are specifically designed to be simulations with high cosine content. We use as a model system the protein apoCox17, a mitochondrial copper chaperone. Spectral analysis of correlation matrices allows one to easily differentiate random correlations, deriving simply from the finite length of the process, from non-random signals reflecting the intrinsic system properties. Our results clearly show that protein dynamics is not really Brownian, even in the presence of the cosine-shaped low-index projections on the principal axes. Copyright © 2014 Elsevier B.V. All rights reserved.
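
    The "cosine-shaped low-index projections" referred to above can be quantified with the cosine content c_k = (2/T) (Σ cos(kπt/T) p_k(t))² / Σ p_k(t)², which approaches 1 for PCA of a free random walk. The sketch below, assuming only numpy, generates a multidimensional Brownian trajectory, performs PCA, and evaluates the cosine content of the leading projections; it illustrates the diagnostic only and does not reproduce the paper's Random Matrix Theory analysis of apoCox17 trajectories.

```python
import numpy as np

rng = np.random.default_rng(6)

# Multidimensional Brownian dynamics: cumulative sum of random steps.
n_steps, n_dof = 5000, 30
traj = np.cumsum(rng.standard_normal((n_steps, n_dof)), axis=0)

# PCA via eigendecomposition of the covariance of the mean-free trajectory.
x = traj - traj.mean(axis=0)
cov = x.T @ x / n_steps
w, v = np.linalg.eigh(cov)
proj = x @ v[:, ::-1]            # projections, largest eigenvalue first

def cosine_content(p, k):
    """Discrete cosine content of the k-th principal-component projection."""
    t = np.arange(p.size)
    c = np.cos(np.pi * k * t / p.size)
    return 2.0 / p.size * np.sum(c * p) ** 2 / np.sum(p * p)

for k in range(1, 4):
    print(f"PC{k} cosine content: {cosine_content(proj[:, k - 1], k):.2f}")
```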

  12. Hanbury Brown and Twiss interferometry with twisted light

    PubMed Central

    Magaña-Loaiza, Omar S.; Mirhosseini, Mohammad; Cross, Robert M.; Rafsanjani, Seyed Mohammad Hashemi; Boyd, Robert W.

    2016-01-01

    The rich physics exhibited by random optical wave fields permitted Hanbury Brown and Twiss to unveil fundamental aspects of light. Furthermore, it has been recognized that optical vortices are ubiquitous in random light and that the phase distribution around these optical singularities imprints a spectrum of orbital angular momentum onto a light field. We demonstrate that random fluctuations of intensity give rise to the formation of correlations in the orbital angular momentum components and angular positions of pseudothermal light. The presence of these correlations is manifested through distinct interference structures in the orbital angular momentum–mode distribution of random light. These novel forms of interference correspond to the azimuthal analog of the Hanbury Brown and Twiss effect. This family of effects can be of fundamental importance in applications where entanglement is not required and where correlations in angular position and orbital angular momentum suffice. We also suggest that the azimuthal Hanbury Brown and Twiss effect can be useful in the exploration of novel phenomena in other branches of physics and astrophysics. PMID:27152334

  13. Random Feedback Makes Listeners Tone-Deaf.

    PubMed

    Vuvan, Dominique T; Zendel, Benjamin Rich; Peretz, Isabelle

    2018-05-08

    The mental representation of pitch structure (tonal knowledge) is a core component of musical experience and is learned implicitly through exposure to music. One theory of congenital amusia (tone deafness) posits that conscious access to tonal knowledge is disrupted, leading to a severe deficit of music cognition. We tested this idea by providing random performance feedback to neurotypical listeners while they listened to melodies for tonal incongruities and had their electrical brain activity monitored. The introduction of random feedback was associated with a reduction of accuracy and confidence, and a suppression of the late positive brain response usually elicited by conscious detection of a tonal violation. These effects mirror the behavioural and neurophysiological profile of amusia. In contrast, random feedback was associated with an increase in the amplitude of the early right anterior negativity, possibly due to heightened attention to the experimental task. This successful simulation of amusia in a normal brain highlights the key role of feedback in learning, and thereby provides a new avenue for the rehabilitation of learning disorders.

  14. Hanbury Brown and Twiss interferometry with twisted light.

    PubMed

    Magaña-Loaiza, Omar S; Mirhosseini, Mohammad; Cross, Robert M; Rafsanjani, Seyed Mohammad Hashemi; Boyd, Robert W

    2016-04-01

    The rich physics exhibited by random optical wave fields permitted Hanbury Brown and Twiss to unveil fundamental aspects of light. Furthermore, it has been recognized that optical vortices are ubiquitous in random light and that the phase distribution around these optical singularities imprints a spectrum of orbital angular momentum onto a light field. We demonstrate that random fluctuations of intensity give rise to the formation of correlations in the orbital angular momentum components and angular positions of pseudothermal light. The presence of these correlations is manifested through distinct interference structures in the orbital angular momentum-mode distribution of random light. These novel forms of interference correspond to the azimuthal analog of the Hanbury Brown and Twiss effect. This family of effects can be of fundamental importance in applications where entanglement is not required and where correlations in angular position and orbital angular momentum suffice. We also suggest that the azimuthal Hanbury Brown and Twiss effect can be useful in the exploration of novel phenomena in other branches of physics and astrophysics.

  15. Multi-Agent Methods for the Configuration of Random Nanocomputers

    NASA Technical Reports Server (NTRS)

    Lawson, John W.

    2004-01-01

    As computational devices continue to shrink, the cost of manufacturing such devices is expected to grow exponentially. One alternative to the costly, detailed design and assembly of conventional computers is to place the nano-electronic components randomly on a chip. The price for such a trivial assembly process is that the resulting chip would not be programmable by conventional means. In this work, we show that such random nanocomputers can be adaptively programmed using multi-agent methods. This is accomplished through the optimization of an associated high dimensional error function. By representing each of the independent variables as a reinforcement learning agent, we are able to achieve convergence much faster than with other methods, including simulated annealing. Standard combinational logic circuits such as adders and multipliers are implemented in a straightforward manner. In addition, we show that the intrinsic flexibility of these adaptive methods allows the random computers to be reconfigured easily, making them reusable. Recovery from faults is also demonstrated.

  16. Severe traumatic brain injury management and clinical outcome using the Lund concept.

    PubMed

    Koskinen, L-O D; Olivecrona, M; Grände, P O

    2014-12-26

    This review covers the main principles of the Lund concept for treatment of severe traumatic brain injury. This is followed by a description of results of clinical studies in which this therapy or a modified version of the therapy has been used. Unlike other guidelines, which are based on meta-analytical approaches, important components of the Lund concept are based on physiological mechanisms for regulation of brain volume and brain perfusion and to reduce transcapillary plasma leakage and the need for plasma volume expanders. There have been nine non-randomized and two randomized outcome studies with the Lund concept or modified versions of the concept. The non-randomized studies indicated that the Lund concept is beneficial for outcome. The two randomized studies were small but showed better outcome in the groups of patients treated according to the modified principles of the Lund concept than in the groups given a more conventional treatment. Copyright © 2014 IBRO. Published by Elsevier Ltd. All rights reserved.

  17. Probing the stochastic, motor-driven properties of the cytoplasm using force spectrum microscopy

    PubMed Central

    Guo, Ming; Ehrlicher, Allen J.; Jensen, Mikkel H.; Renz, Malte; Moore, Jeffrey R.; Goldman, Robert D.; Lippincott-Schwartz, Jennifer; Mackintosh, Frederick C.; Weitz, David A.

    2014-01-01

    SUMMARY Molecular motors in cells typically produce highly directed motion; however, the aggregate, incoherent effect of all active processes also creates randomly fluctuating forces, which drive diffusive-like, non-thermal motion. Here we introduce force-spectrum-microscopy (FSM) to directly quantify random forces within the cytoplasm of cells and thereby probe stochastic motor activity. This technique combines measurements of the random motion of probe particles with independent micromechanical measurements of the cytoplasm to quantify the spectrum of force fluctuations. Using FSM, we show that force fluctuations substantially enhance intracellular movement of small and large components. The fluctuations are three times larger in malignant cells than in their benign counterparts. We further demonstrate that vimentin acts globally to anchor organelles against randomly fluctuating forces in the cytoplasm, with no effect on their magnitude. Thus, FSM has broad applications for understanding the cytoplasm and its intracellular processes in relation to cell physiology in healthy and diseased states. PMID:25126787

  18. No difference in terms of radiostereometric analysis between fixed- and mobile-bearing total knee arthroplasty: a randomized, single-blind, controlled trial.

    PubMed

    Schotanus, M G M; Pilot, P; Kaptein, B L; Draijer, W F; Tilman, P B J; Vos, R; Kort, N P

    2017-09-01

    A concern that arises with any new prosthesis is whether it will achieve satisfactory long-term implant stability. The gold standard of assessing the quality of fixation in a new or relatively new implant is to undertake a randomized controlled trial using radiostereometric analysis. It was hypothesized that both mobile-bearing total knee arthroplasty and fixed-bearing total knee arthroplasty have comparable migration patterns at 2-year follow-up. This study investigated two types of cemented total knee arthroplasty, the mobile- or fixed-bearing variant from the same family, with use of radiostereometric analysis. This prospective, patient-blinded, randomized, controlled trial was designed to investigate early migration of the tibia component after two years of follow-up with use of radiostereometric analysis. A total of 50 patients were randomized to receive a mobile- or fixed-bearing TKA from the same family. Patients were evaluated during 2-year follow-up, including radiostereometric analysis, physical and clinical examination and patient reported outcome measures (PROMs). At two-year follow-up, the mean (±SD) maximum total point motion (MTPM) in the fixed-bearing group was 0.82 (±1.16) versus 0.92 mm (±0.64) in the mobile-bearing group (p = n.s.), with the largest migration seen during the first 6 weeks (0.45 ± 0.32 vs. 0.54 ± 0.30). The clinical outcomes and PROMs improved significantly within each group, with no difference between the groups. Measuring early micromotion is useful for predicting clinical loosening that can lead to revision. The results of this study demonstrate that early migration of the mobile-bearing component is similar to that of the fixed-bearing component at two years and was mainly seen in the first weeks after implantation. Randomized, single-blind, controlled trial, Level I.

  19. Structural Analysis of Pressurized Small Diameter Lines in a Random Vibration Environment

    NASA Technical Reports Server (NTRS)

    Davis, Mark; Ridnour, Andrew; Brethen, Mark

    2011-01-01

    The pressurization and propellant feed lines for the Ares 1 Upper Stage Reaction and Roll Control Systems (ReCS and RoCS) were required to operate in a high g-load, random vibration flight environment. The lines connected the system components and were filled with both liquid hydrazine and gaseous helium. They are considered small and varied between one fourth and one inch in diameter. The random vibration of the lines was considered to be base excitation through the mating components and mounting hardware. It was found that reducing the amount of support structure for the lines added flexibility to the system and improved the line stresses from random vibration, but caused higher stresses from the static g-loads. The locations and number of brackets were optimized by analyzing the mode shapes of the lines causing high stresses. The use of brackets that only constrain motion in the direction of concern further reduced the stresses in the lines. Finite element analysis was used to perform the analysis. The lines were pre-stressed by temperature and internal pressure with fluid and insulation included as non-structural mass. Base excitation was added to the model using Power Spectral Density (PSD) data for the expected flight loads. The random vibration and static g-load cases were combined to obtain the total stress in the lines. This approach advances the state of the art in line analysis by using FEA to predict the stresses in the lines and to optimize the entire system based on the expected flight environment. Adding flexibility to lines has been used in piping systems for temperature loads, but in flight environments flexibility has been limited for the static stresses. Adding flexibility to the system in a flight environment by reducing brackets has the benefit of reducing stresses and weight.

  20. Structuring Communication Relationships for Interprofessional Teamwork (SCRIPT): a cluster randomized controlled trial.

    PubMed

    Zwarenstein, Merrick; Reeves, Scott; Russell, Ann; Kenaszchuk, Chris; Conn, Lesley Gotlib; Miller, Karen-Lee; Lingard, Lorelei; Thorpe, Kevin E

    2007-09-18

    Despite a burgeoning interest in using interprofessional approaches to promote effective collaboration in health care, systematic reviews find scant evidence of benefit. This protocol describes the first cluster randomized controlled trial (RCT) to design and evaluate an intervention intended to improve interprofessional collaborative communication and patient-centred care. The objective is to evaluate the effects of a four-component, hospital-based staff communication protocol designed to promote collaborative communication between healthcare professionals and enhance patient-centred care. The study is a multi-centre mixed-methods cluster randomized controlled trial involving twenty clinical teaching teams (CTTs) in general internal medicine (GIM) divisions of five Toronto tertiary-care hospitals. CTTs will be randomly assigned either to receive an intervention designed to improve interprofessional collaborative communication, or to continue usual communication practices. Non-participant naturalistic observation, shadowing, and semi-structured, qualitative interviews were conducted to explore existing patterns of interprofessional collaboration in the CTTs, and to support intervention development. Interviews and shadowing will continue during intervention delivery in order to document interactions between the intervention settings and adopters, and changes in interprofessional communication. The primary outcome is the rate of unplanned hospital readmission. Secondary outcomes are length of stay (LOS); adherence to evidence-based prescription drug therapy; patients' satisfaction with care; self-report surveys of CTT staff perceptions of interprofessional collaboration; and frequency of calls to paging devices. Outcomes will be compared on an intention-to-treat basis using adjustment methods appropriate for data from a cluster randomized design. Pre-intervention qualitative analysis revealed that a substantial amount of interprofessional interaction lacks key core elements of collaborative communication such as self-introduction, description of professional role, and solicitation of other professional perspectives. Incorporating these findings, a four-component intervention was designed with a goal of creating a culture of communication in which the fundamentals of collaboration become a routine part of interprofessional interactions during unstructured work periods on GIM wards. Registered with National Institutes of Health as NCT00466297.

  1. Self-organized network of fractal-shaped components coupled through statistical interaction.

    PubMed

    Ugajin, R

    2001-09-01

    A dissipative dynamics is introduced to generate self-organized networks of interacting objects, which we call coupled-fractal networks. The growth model is constructed based on a growth hypothesis in which the growth rate of each object is a product of the probability of receiving source materials from far away and the probability of receiving adhesives from other grown objects, where each object grows to be a random fractal if isolated, but connects with others if glued. The network is governed by the statistical interaction between fractal-shaped components, which can only be identified in a statistical manner over ensembles. This interaction is investigated using the degree of correlation between fractal-shaped components, enabling us to determine whether it is attractive or repulsive.

  2. System-Level Radiation Hardening

    NASA Technical Reports Server (NTRS)

    Ladbury, Ray

    2014-01-01

    Although system-level radiation hardening can enable the use of high-performance components and enhance the capabilities of a spacecraft, hardening techniques can be costly and can compromise the very performance designers sought from the high-performance components. Moreover, such techniques often result in a complicated design, especially if several complex commercial microcircuits are used, each posing its own hardening challenges. The latter risk is particularly acute for Commercial-Off-The-Shelf components since high-performance parts (e.g. double-data-rate synchronous dynamic random access memories - DDR SDRAMs) may require other high-performance commercial parts (e.g. processors) to support their operation. For these reasons, it is essential that system-level radiation hardening be a coordinated effort, from setting requirements through testing up to and including validation.

  3. Mixed model approaches for diallel analysis based on a bio-model.

    PubMed

    Zhu, J; Weir, B S

    1996-12-01

    A MINQUE(1) procedure, which is the minimum norm quadratic unbiased estimation (MINQUE) method with 1 for all the prior values, is suggested for estimating variance and covariance components in a bio-model for diallel crosses. Unbiasedness and efficiency of estimation were compared for MINQUE(1), restricted maximum likelihood (REML) and MINQUE(θ), which uses parameter values for the prior values. MINQUE(1) is almost as efficient as MINQUE(θ) for unbiased estimation of genetic variance and covariance components. The bio-model is efficient and robust for estimating variance and covariance components for maternal and paternal effects as well as for nuclear effects. A procedure of adjusted unbiased prediction (AUP) is proposed for predicting random genetic effects in the bio-model. The jack-knife procedure is suggested for estimation of sampling variances of estimated variance and covariance components and of predicted genetic effects. Worked examples are given for estimation of variance and covariance components and for prediction of genetic merits.
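
    The diallel bio-model and MINQUE(1) estimators are specialized, but the underlying idea of separating variance components can be shown in a much simpler, hedged setting: the sketch below simulates a balanced one-way random-effects layout and recovers the between-group and residual variance components with the classical ANOVA (method-of-moments) estimators. Only numpy is assumed, and this is not the MINQUE(1), REML or AUP machinery of the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated balanced one-way random-effects data: g groups, n records each.
g, n = 40, 10
sigma_a2_true, sigma_e2_true = 2.0, 5.0
group_effects = rng.normal(0.0, np.sqrt(sigma_a2_true), g)
y = group_effects[:, None] + rng.normal(0.0, np.sqrt(sigma_e2_true), (g, n))

# Classical ANOVA (method-of-moments) variance-component estimators.
group_means = y.mean(axis=1)
grand_mean = y.mean()
ms_within = ((y - group_means[:, None]) ** 2).sum() / (g * (n - 1))
ms_between = n * ((group_means - grand_mean) ** 2).sum() / (g - 1)

sigma_e2_hat = ms_within
sigma_a2_hat = (ms_between - ms_within) / n

print(f"residual variance estimate     : {sigma_e2_hat:.2f} (true {sigma_e2_true})")
print(f"between-group variance estimate: {sigma_a2_hat:.2f} (true {sigma_a2_true})")
```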

  4. Financial management and job social skills training components in a summer business institute: a controlled evaluation in high achieving predominantly ethnic minority youth.

    PubMed

    Donohue, Brad; Conway, Debbie; Beisecker, Monica; Murphy, Heather; Farley, Alisha; Waite, Melissa; Gugino, Kristin; Knatz, Danielle; Lopez-Frank, Carolina; Burns, Jack; Madison, Suzanne; Shorty, Carrie

    2005-07-01

    Ninety-two adolescents, predominantly ethnic minority high school students, participated in a structured Summer Business Institute (SBI). Participating youth were randomly assigned to receive either job social skills or financial management skills training components. Students who additionally received the job social skills training component were more likely to recommend their employment agency to others than were youth who received the financial management component, rated their overall on-the-job work experience more favorably, and demonstrated higher scores in areas that were relevant to the skills that were taught in the job social skills workshops. The financial management component also appeared to be relatively effective, as youth who received this intervention improved their knowledge of financial management issues more than youth who received job social skills, and rated their workshops as more helpful in financial management, as well as insurance management. Future directions are discussed in light of these results.

  5. Computer analysis of the leaf movements of pinto beans.

    PubMed

    Hoshizaki, T; Hamner, K C

    1969-07-01

    Computer analysis was used for the detection of rhythmic components and the estimation of period length in leaf movement records. The results of this study indicated that spectral analysis can be profitably used to determine rhythmic components in leaf movements. In Pinto bean plants (Phaseolus vulgaris L.) grown for 28 days under continuous light of 750 ft-c and at a constant temperature of 28 degrees, there was only one highly significant rhythmic component in the leaf movements. The period of this rhythm was 27.3 hr. In plants grown at 20 degrees, there were two highly significant rhythmic components: one of 13.8 hr and a much stronger one of 27.3 hr. At 15 degrees, the highly significant rhythmic components were also 27.3 and 13.8 hr in length but were of equal intensity. Random movements less than 9 hr in length became very pronounced at this temperature. At 10 degrees, no significant rhythm was found in the leaf movements. At 5 degrees, the leaf movements ceased within 1 day.
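
    The kind of spectral analysis described above can be sketched with a synthetic record: a signal containing 27.3 h and 13.8 h rhythmic components plus noise is analyzed with an FFT periodogram and the dominant period is read from the largest peak. The sampling interval, amplitudes and noise level are invented for illustration and only numpy is assumed; the original leaf-movement records are not used.

```python
import numpy as np

rng = np.random.default_rng(8)

dt = 0.25                                   # sampling interval in hours
t = np.arange(0.0, 28 * 24, dt)             # 28 days of simulated records

# Synthetic leaf-movement record with 27.3 h and 13.8 h rhythmic components.
signal = (1.0 * np.sin(2 * np.pi * t / 27.3)
          + 0.5 * np.sin(2 * np.pi * t / 13.8)
          + 0.7 * rng.standard_normal(t.size))

# Periodogram: power at each positive FFT frequency (cycles per hour).
freqs = np.fft.rfftfreq(t.size, d=dt)
power = np.abs(np.fft.rfft(signal - signal.mean())) ** 2

peak = np.argmax(power[1:]) + 1             # skip the zero-frequency bin
print(f"dominant period ~ {1.0 / freqs[peak]:.1f} h")
for period in (27.3, 13.8):
    j = np.argmin(np.abs(freqs[1:] - 1.0 / period)) + 1
    print(f"relative power near {period} h: {power[j] / power.max():.2f}")
```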

  6. Inferring random component distributions from environmental measurements for quality assurance

    USDA-ARS?s Scientific Manuscript database

    Environmental measurement programs can add value by providing not just accurate data, but also a measure of that accuracy. While quality assurance (QA) has been recognized as necessary since almost the beginning of automated weather measurement, it has received less attention than the data proper. M...

  7. Science Laboratory Environment and Academic Performance

    ERIC Educational Resources Information Center

    Aladejana, Francisca; Aderibigbe, Oluyemisi

    2007-01-01

    The study determined how students assess the various components of their science laboratory environment. It also identified how the laboratory environment affects students' learning outcomes. The modified ex-post facto design was used. A sample of 328 randomly selected students was taken from a population of all Senior Secondary School chemistry…

  8. Differential effects of estrogen and progestin on apolipoprotein B100 and B48 kinetics in postmenopausal women

    USDA-ARS?s Scientific Manuscript database

    The distinct effects of the estrogen and progestin components of hormonal therapy on the metabolism of apolipoprotein (apo) B-containing lipoproteins have not been studied. We enrolled eight healthy postmenopausal women in a placebo-controlled, randomized, double-blind, crossover study. Each subject...

  9. Modeling Heterogeneous Variance-Covariance Components in Two-Level Models

    ERIC Educational Resources Information Center

    Leckie, George; French, Robert; Charlton, Chris; Browne, William

    2014-01-01

    Applications of multilevel models to continuous outcomes nearly always assume constant residual variance and constant random effects variances and covariances. However, modeling heterogeneity of variance can prove a useful indicator of model misspecification, and in some educational and behavioral studies, it may even be of direct substantive…

  10. PREDICTING PARTICULATE (PM-10) FREQUENCY DISTRIBUTIONS FOR URBAN POPULATIONS USING A RANDOM COMPONENT SUPERPOSITION MODEL (RCS) MODEL

    EPA Science Inventory

    Health risk evaluations usually require the frequency distribution of personal exposures of a given population. For particles, personal exposure field studies have been conducted in only a few urban areas, such as Riverside, CA; Philipsburg, NJ; and Toronto, Ontario. This paper...

  11. Quantitative organic vapor-particle sampler

    DOEpatents

    Gundel, Lara; Daisey, Joan M.; Stevens, Robert K.

    1998-01-01

    A quantitative organic vapor-particle sampler for sampling semi-volatile organic gases and particulate components. The sampler uses a semi-volatile organic reversible gas sorbent: macroreticular resin agglomerates of randomly packed microspheres with a continuous porous structure and particles ranging in size between 0.05 and 10 μm, for use in an integrated diffusion vapor-particle sampler.

  12. Sequential Effects in Deduction: Cost of Inference Switch

    ERIC Educational Resources Information Center

    Milan, Emilio G.; Moreno-Rios, Sergio; Espino, Orlando; Santamaria, Carlos; Gonzalez-Hernandez, Antonio

    2010-01-01

    The task-switch paradigm has helped psychologists gain insight into the processes involved in changing from one activity to another. The literature has yielded consistent results about switch cost reconfiguration (abrupt offset in regular task-switch vs. gradual reduction in random task-switch; endogenous and exogenous components of switch cost;…

  13. Smooth empirical Bayes estimation of observation error variances in linear systems

    NASA Technical Reports Server (NTRS)

    Martz, H. F., Jr.; Lian, M. W.

    1972-01-01

    A smooth empirical Bayes estimator was developed for estimating the unknown random scale component of each of a set of observation error variances. It is shown that the estimator possesses a smaller average squared error loss than other estimators for a discrete time linear system.

  14. Revealing the microstructure of the giant component in random graph ensembles

    NASA Astrophysics Data System (ADS)

    Tishby, Ido; Biham, Ofer; Katzav, Eytan; Kühn, Reimer

    2018-04-01

    The microstructure of the giant component of the Erdős-Rényi network and other configuration model networks is analyzed using generating function methods. While configuration model networks are uncorrelated, the giant component exhibits a degree distribution which is different from the overall degree distribution of the network and includes degree-degree correlations of all orders. We present exact analytical results for the degree distributions as well as higher-order degree-degree correlations on the giant components of configuration model networks. We show that the degree-degree correlations are essential for the integrity of the giant component, in the sense that the degree distribution alone cannot guarantee that it will consist of a single connected component. To demonstrate the importance and broad applicability of these results, we apply them to the study of the distribution of shortest path lengths on the giant component, percolation on the giant component, and spectra of sparse matrices defined on the giant component. We show that by using the degree distribution on the giant component one obtains high quality results for these properties, which can be further improved by taking the degree-degree correlations into account. This suggests that many existing methods, currently used for the analysis of the whole network, can be adapted in a straightforward fashion to yield results conditioned on the giant component.
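
    The central observation of this abstract is easy to verify numerically. The sketch below (assuming numpy and networkx are available) builds a supercritical Erdős-Rényi graph, extracts its largest connected component, and compares the degree distribution on that component with the overall degree distribution; the difference is most visible at low degrees, where isolated and weakly connected nodes are excluded from the giant component. It is a simulation check only, not the generating-function calculation of the paper.

```python
import numpy as np
import networkx as nx

n, c = 20_000, 1.5                       # nodes and mean degree (supercritical)
G = nx.fast_gnp_random_graph(n, c / n, seed=9)

# Extract the giant (largest connected) component.
giant_nodes = max(nx.connected_components(G), key=len)
giant = G.subgraph(giant_nodes)

deg_all = np.array([d for _, d in G.degree()])
deg_giant = np.array([d for _, d in giant.degree()])

print(f"giant component size fraction: {len(giant_nodes) / n:.3f}")
for k in range(4):
    p_all = np.mean(deg_all == k)
    p_giant = np.mean(deg_giant == k)
    print(f"P(k={k}): overall {p_all:.3f}  vs  giant component {p_giant:.3f}")
```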

  15. Multistate Lempel-Ziv (MLZ) index interpretation as a measure of amplitude and complexity changes.

    PubMed

    Sarlabous, Leonardo; Torres, Abel; Fiz, Jose A; Gea, Joaquim; Galdiz, Juan B; Jane, Raimon

    2009-01-01

    The Lempel-Ziv (LZ) complexity has been widely used to evaluate the randomness of finite sequences. In general, the LZ complexity has been used to determine the grade of complexity present in biomedical signals. The LZ complexity is not able to distinguish between signals with different amplitude variations and similar random components. On the other hand, amplitude parameters, such as the root mean square (RMS), are not able to distinguish between signals with similar power distributions and different random components. In this work, we present a novel method to quantify amplitude and complexity variations in biomedical signals by means of the computation of the LZ coefficient using more than two quantification states, with thresholds that are fixed and independent of the dynamic range or standard deviation of the analyzed signal: the Multistate Lempel-Ziv (MLZ) index. Our results indicate that the MLZ index with few quantification levels evaluates only the complexity changes of the signal, with a high number of levels the amplitude variations, and with an intermediate number of levels both amplitude and complexity variations. The study performed on diaphragmatic mechanomyographic signals shows that the amplitude variations of this signal are more correlated with the respiratory effort than the complexity variations. Furthermore, it has been observed that the MLZ index with a high number of levels is practically unaffected by impulsive, sinusoidal, constant and Gaussian noise, in contrast to the RMS amplitude parameter.
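
    A hedged sketch of the idea behind a multistate LZ measure is given below: the signal is quantized onto a fixed, signal-independent set of thresholds with q states and the number of LZ76-style phrases is counted. With q = 2 the count is essentially insensitive to amplitude, whereas with q = 8 a larger-amplitude version of the same random signal visits more states and yields a higher count. The threshold placement and the absence of normalization are illustrative choices, not the exact MLZ definition of the paper; only numpy is assumed.

```python
import numpy as np

def lz76_complexity(symbols):
    """Count the phrases of an LZ76-style parsing of a symbol sequence."""
    s = ''.join(map(str, symbols))
    i, count = 0, 0
    while i < len(s):
        length = 1
        # Extend the phrase while it already occurs earlier in the sequence.
        while i + length <= len(s) and s[i:i + length] in s[:i + length - 1]:
            length += 1
        count += 1
        i += length
    return count

def multistate_lz(x, levels):
    """Quantize x onto fixed thresholds (illustrative choice) and apply LZ76."""
    edges = np.linspace(-1.0, 1.0, levels + 1)[1:-1]   # signal-independent bins
    return lz76_complexity(np.digitize(x, edges))

rng = np.random.default_rng(10)
small = 0.2 * rng.standard_normal(2000)    # low-amplitude random signal
large = 0.8 * rng.standard_normal(2000)    # same randomness, larger amplitude

for q in (2, 8):
    print(f"levels={q}: small-amplitude {multistate_lz(small, q)}, "
          f"large-amplitude {multistate_lz(large, q)}")
```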

  16. Vibration Isolation for Launch of a Space Station Orbital Replacement Unit

    NASA Technical Reports Server (NTRS)

    Maly, Joseph R.; Sills, Joel W., Jr.; Pendleton, Scott C.; James, George H., III; Mimovich, Mark

    2004-01-01

    Delivery of Orbital Replacement Units (ORUs) to on-orbit destinations such as the International Space Station (ISS) and the Hubble Space Telescope is an important component of the space program. ORUs are integrated on orbit with space assets to maintain and upgrade functionality. For ORUs comprised of sensitive equipment, the dynamic launch environment drives design and testing requirements, and high frequency random vibrations are generally the cause of failure. Vibration isolation can mitigate the structure-borne vibration environment during launch, and hardware has been developed that can provide a reduced environment for current and future launch environments. Random vibration testing of one ORU to equivalent Space Shuttle launch levels revealed that its qualification and acceptance requirements were exceeded. An isolation system was designed to mitigate the structure-borne launch vibration environment. To protect this ORU, the random vibration levels at 50 Hz must be attenuated by a factor of two and those at higher frequencies even more. Design load factors for Shuttle launch are high, so a metallic load path is needed to maintain strength margins. Isolation system design was performed using a finite element model of the ORU on its carrier with representative disturbance inputs. Iterations on the model led to an optimized design based on flight-proven SoftRide MultiFlex isolators. Component testing has been performed on prototype isolators to validate analytical predictions.

  17. A Multilevel Model to Estimate the Within- and the Between-Center Components of the Exposure/Disease Association in the EPIC Study

    PubMed Central

    2015-01-01

    In a multicenter study, the overall relationship between exposure and the risk of cancer can be broken down into a within-center component, which reflects the individual level association, and a between-center relationship, which captures the association at the aggregate level. A piecewise exponential proportional hazards model with random effects was used to evaluate the association between dietary fiber intake and colorectal cancer (CRC) risk in the EPIC study. During an average follow-up of 11.0 years, 4,517 CRC events occurred among study participants recruited in 28 centers from ten European countries. Models were adjusted by relevant confounding factors. Heterogeneity among centers was modelled with random effects. Linear regression calibration was used to account for errors in dietary questionnaire (DQ) measurements. Risk ratio estimates for a 10 g/day increment in dietary fiber were equal to 0.90 (95%CI: 0.85, 0.96) and 0.85 (0.64, 1.14), at the individual and aggregate levels, respectively, while calibrated estimates were 0.85 (0.76, 0.94), and 0.87 (0.65, 1.15), respectively. In multicenter studies, over a straightforward ecological analysis, random effects models allow information at the individual and ecologic levels to be captured, while controlling for confounding at both levels of evidence. PMID:25785729

  18. [Toward exploration of morphological diversity of measurable traits of mammalian skull. 2. Scalar and vector parameters of the forms of group variation].

    PubMed

    Lisovskiĭ, A A; Pavlinov, I Ia

    2008-01-01

    Any morphospace is partitioned by the forms of group variation; its structure is described by a set of scalar (range, overlap) and vector (direction) characteristics. They are analyzed quantitatively for the sex and age variations in the sample of 200 skulls of the pine marten described by 14 measurable traits. Standard dispersion and variance components analyses are employed, accompanied by several resampling methods (randomization and bootstrap); effects of changes in the analysis design on results of the above methods are also considered. The maximum likelihood algorithm of variance components analysis is shown to give adequate estimates of the portions of particular forms of group variation within the overall disparity. It is quite stable with respect to changes of the analysis design and therefore could be used in explorations of real data with variously unbalanced designs. A new algorithm of estimation of co-directionality of particular forms of group variation within the overall disparity is elaborated, which includes angle measures between eigenvectors of covariation matrices of effects of group variations calculated by dispersion analysis. A null hypothesis of a random portion of a given group variation could be tested by means of randomization of the respective grouping variable. A null hypothesis of equality of both portions and directionalities of different forms of group variation could be tested by means of the bootstrap procedure.

  19. Varying levels of difficulty index of skills-test items randomly selected by examinees on the Korean emergency medical technician licensing examination

    PubMed Central

    2016-01-01

    Purpose: The goal of this study was to characterize the difficulty index of the items in the skills test components of the class I and II Korean emergency medical technician licensing examination (KEMTLE), which requires examinees to select items randomly. Methods: The results of 1,309 class I KEMTLE examinations and 1,801 class II KEMTLE examinations in 2013 were subjected to analysis. Items from the basic and advanced skills test sections of the KEMTLE were compared to determine whether some were significantly more difficult than others. Results: In the class I KEMTLE, all 4 of the items on the basic skills test showed significant variation in difficulty index (P<0.01), as well as 4 of the 5 items on the advanced skills test (P<0.05). In the class II KEMTLE, 4 of the 5 items on the basic skills test showed significantly different difficulty index (P<0.01), as well as all 3 of the advanced skills test items (P<0.01). Conclusion: In the skills test components of the class I and II KEMTLE, the procedure in which examinees randomly select questions should be revised to require examinees to respond to a set of fixed items in order to improve the reliability of the national licensing examination. PMID:26883810

  20. Effect of ambient light on the time needed to complete a fetal biophysical profile: A randomized controlled trial.

    PubMed

    Said, Heather M; Gupta, Shweta; Vricella, Laura K; Wand, Katy; Nguyen, Thinh; Gross, Gilad

    2017-10-01

    The objective of this study is to determine whether ambient light serves as a fetal stimulus to decrease the amount of time needed to complete a biophysical profile. This is a randomized controlled trial of singleton gestations undergoing a biophysical profile. Patients were randomized to either ambient light or a darkened room. The primary outcome was the time needed to complete the biophysical profile. Secondary outcomes included total and individual component biophysical profile scores and scores less than 8. A subgroup analysis of different maternal body mass indices was also performed. 357 biophysical profile studies were analyzed. 182 studies were performed with ambient light and 175 were performed in a darkened room. There was no difference in the median time needed to complete the biophysical profile based on exposure to ambient light (6.1min in darkened room versus 6.6min with ambient light; P=0.73). No difference was found in total or individual component biophysical profile scores. Subgroup analysis by maternal body mass index did not demonstrate shorter study times with ambient light exposure in women who were normal weight, overweight or obese. Ambient light exposure did not decrease the time needed to complete the biophysical profile. There was no evidence that ambient light altered fetal behavior observed during the biophysical profile. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Complex Atheromatosis of the Aortic Arch in Cerebral Infarction

    PubMed Central

    Capmany, Ramón Pujadas; Ibañez, Montserrat Oliveras; Pesquer, Xavier Jané

    2010-01-01

    In many stroke patients it is not possible to establish the etiology of stroke. However, in the last two decades, the use of transesophageal echocardiography in patients with stroke of uncertain etiology has revealed atherosclerotic plaques in the aortic arch, which often protrude into the lumen and have mobile components in a high percentage of cases. Several autopsy series and retrospective case-control studies have shown an association between aortic arch atheroma and arterial embolism, which was later confirmed by prospectively designed studies. The association with ischemic stroke was particularly strong when atheromas were located proximal to the ostium of the left subclavian artery, when the plaque was ≥ 4 mm thick and particularly when mobile components were present. In these cases, aspirin might not adequately prevent new arterial ischemic events, especially stroke. Here we review the evidence for aortic arch atheroma as an independent risk factor for stroke and arterial embolism, including clinical and pathological data on atherosclerosis of the thoracic aorta as an embolic source. In addition, the impact of complex plaques (≥ 4 mm thick, or with mobile components) on increasing the risk of stroke is also reviewed. In non-randomized retrospective studies, anticoagulation was superior to antiplatelet therapy in patients with stroke and aortic arch plaques with mobile components. In a retrospective case-control study, statins significantly reduced the relative risk of new vascular events. However, given the limited data available and their retrospective nature, randomized prospective studies are needed to establish the optimal secondary prevention therapeutic regimens in these high-risk patients. PMID:21804777

  2. Complex atheromatosis of the aortic arch in cerebral infarction.

    PubMed

    Capmany, Ramón Pujadas; Ibañez, Montserrat Oliveras; Pesquer, Xavier Jané

    2010-08-01

    In many stroke patients it is not possible to establish the etiology of stroke. However, in the last two decades, the use of transesophageal echocardiography in patients with stroke of uncertain etiology has revealed atherosclerotic plaques in the aortic arch, which often protrude into the lumen and have mobile components in a high percentage of cases. Several autopsy series and retrospective case-control studies have shown an association between aortic arch atheroma and arterial embolism, which was later confirmed by prospectively designed studies. The association with ischemic stroke was particularly strong when atheromas were located proximal to the ostium of the left subclavian artery, when the plaque was ≥ 4 mm thick and particularly when mobile components were present. In these cases, aspirin might not adequately prevent new arterial ischemic events, especially stroke. Here we review the evidence for aortic arch atheroma as an independent risk factor for stroke and arterial embolism, including clinical and pathological data on atherosclerosis of the thoracic aorta as an embolic source. In addition, the impact of complex plaques (≥ 4 mm thick, or with mobile components) on increasing the risk of stroke is also reviewed. In non-randomized retrospective studies, anticoagulation was superior to antiplatelet therapy in patients with stroke and aortic arch plaques with mobile components. In a retrospective case-control study, statins significantly reduced the relative risk of new vascular events. However, given the limited data available and their retrospective nature, randomized prospective studies are needed to establish the optimal secondary prevention therapeutic regimens in these high-risk patients.

  3. Histogram contrast analysis and the visual segregation of IID textures.

    PubMed

    Chubb, C; Econopouly, J; Landy, M S

    1994-09-01

    A new psychophysical methodology is introduced, histogram contrast analysis, that allows one to measure stimulus transformations, f, used by the visual system to draw distinctions between different image regions. The method involves the discrimination of images constructed by selecting texture micropatterns randomly and independently (across locations) on the basis of a given micropattern histogram. Different components of f are measured by use of different component functions to modulate the micropattern histogram until the resulting textures are discriminable. When no discrimination threshold can be obtained for a given modulating component function, a second titration technique may be used to measure the contribution of that component to f. The method includes several strong tests of its own assumptions. An example is given of the method applied to visual textures composed of small, uniform squares with randomly chosen gray levels. In particular, for a fixed mean gray level mu and a fixed gray-level variance sigma^2, histogram contrast analysis is used to establish that the class S of all textures composed of small squares with jointly independent, identically distributed gray levels with mean mu and variance sigma^2 is perceptually elementary in the following sense: there exists a single, real-valued function f_S of gray level, such that two textures I and J in S are discriminable only if the average value of f_S applied to the gray levels in I is significantly different from the average value of f_S applied to the gray levels in J. Finally, histogram contrast analysis is used to obtain a seventh-order polynomial approximation of f_S.

  4. Radiological outcomes of pinless navigation in total knee arthroplasty: a randomized controlled trial.

    PubMed

    Chen, Jerry Yongqiang; Chin, Pak Lin; Li, Zongxian; Yew, Andy Khye Soon; Tay, Darren Keng Jin; Chia, Shi-Lu; Lo, Ngai Nung; Yeo, Seng Jin

    2015-12-01

    This study aimed to investigate the accuracy of pinless navigation (BrainLAB® VectorVision® Knee 2.5 Navigation System) as an intra-operative alignment guide in total knee arthroplasty (TKA). The authors hypothesized that pinless navigation would reduce the proportion of outliers in conventional TKA, without a significant increase in the duration of surgery. Between 2011 and 2012, 100 patients scheduled for a unilateral primary TKA were randomized into two groups: pinless navigation and conventional surgery. All TKAs were performed with the surgical aim of achieving neutral coronal alignment with a 180° mechanical axis. The primary outcomes of this study were post-operative radiographic assessment of lower limb alignment using the hip-knee-ankle angle (HKA) and component placement using the coronal femoral-component angle (CFA) and coronal tibial-component angle (CTA). There was a smaller proportion of outliers for HKA, CFA and CTA at 10%, 2% and 2%, respectively, in the pinless navigation group, compared to 32%, 16% and 16%, respectively, in the conventional group (p = 0.013, p = 0.032 and p = 0.032, respectively). The mean CFA was also more accurate at 90° in the pinless navigation group compared to 91° in the conventional group (p = 0.002). There was no difference in the duration of surgery between the two groups (n.s.). Pinless navigation improves lower limb alignment and component placement without a significant increase in the duration of surgery. The authors recommend the use of pinless navigation to verify the coronal alignments of conventional cutting blocks in TKA before the bone cuts are made. Level of evidence: I.

  5. Determination of the optimal number of components in independent components analysis.

    PubMed

    Kassouf, Amine; Jouan-Rimbaud Bouveresse, Delphine; Rutledge, Douglas N

    2018-03-01

    Independent components analysis (ICA) may be considered as one of the most established blind source separation techniques for the treatment of complex data sets in analytical chemistry. Like other similar methods, the determination of the optimal number of latent variables, in this case, independent components (ICs), is a crucial step before any modeling. Therefore, validation methods are required in order to decide about the optimal number of ICs to be used in the computation of the final model. In this paper, three new validation methods are formally presented. The first one, called Random_ICA, is a generalization of the ICA_by_blocks method. Its specificity resides in the random way of splitting the initial data matrix into two blocks, and then repeating this procedure several times, giving a broader perspective for the selection of the optimal number of ICs. The second method, called KMO_ICA_Residuals is based on the computation of the Kaiser-Meyer-Olkin (KMO) index of the transposed residual matrices obtained after progressive extraction of ICs. The third method, called ICA_corr_y, helps to select the optimal number of ICs by computing the correlations between calculated proportions and known physico-chemical information about samples, generally concentrations, or between a source signal known to be present in the mixture and the signals extracted by ICA. These three methods were tested using varied simulated and experimental data sets and compared, when necessary, to ICA_by_blocks. Results were relevant and in line with expected ones, proving the reliability of the three proposed methods. Copyright © 2017 Elsevier B.V. All rights reserved.
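
    A rough sketch of the Random_ICA idea under stated assumptions (this is not the authors' implementation, and scikit-learn's FastICA stands in for their ICA algorithm): the samples are split at random into two blocks, ICA with k components is run on each block, and the stability of k is judged by how well the ICs extracted from the two blocks can be matched by correlation.

```python
# Sketch of a Random_ICA-style check: stable ICs should reappear in both halves
# of a random split of the samples, repeated over several random splits.
import numpy as np
from sklearn.decomposition import FastICA

def split_half_ic_match(X, k, n_repeats=10, seed=0):
    """Return the average worst-case |correlation| between matched ICs."""
    rng = np.random.default_rng(seed)
    scores = []
    for _ in range(n_repeats):
        idx = rng.permutation(X.shape[0])
        half = X.shape[0] // 2
        A = FastICA(n_components=k, random_state=0).fit(X[idx[:half]])
        B = FastICA(n_components=k, random_state=0).fit(X[idx[half:]])
        # correlate the unmixing vectors extracted from the two blocks
        C = np.corrcoef(A.components_, B.components_)[:k, k:]
        best = np.abs(C).max(axis=1)       # best partner for each IC of block A
        scores.append(best.min())          # the weakest match drives the score
    return float(np.mean(scores))

# Example: simulated data; retain the largest k that still matches strongly
X = np.random.default_rng(2).normal(size=(200, 50))
for k in range(1, 6):
    print(k, round(split_half_ic_match(X, k), 3))
```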

  6. Reconstituted fresh whole blood improves clinical outcomes compared with stored component blood therapy for neonates undergoing cardiopulmonary bypass for cardiac surgery: a randomized controlled trial.

    PubMed

    Gruenwald, Colleen E; McCrindle, Brian W; Crawford-Lean, Lynn; Holtby, Helen; Parshuram, Christopher; Massicotte, Patricia; Van Arsdell, Glen

    2008-12-01

    This study compared the effects of reconstituted fresh whole blood against standard blood component therapy in neonates undergoing cardiac surgery. Patients less than 1 month of age were randomized to receive either reconstituted fresh whole blood (n = 31) or standard blood component therapy (n = 33) to prime the bypass circuit and for transfusion during the 24 hours after cardiopulmonary bypass. Primary outcome was chest tube drainage; secondary outcomes included transfusion needs, inotrope score, ventilation time, and hospital length of stay. Patients who received reconstituted fresh whole blood had significantly less postoperative chest tube volume loss per kilogram of body weight (7.7 mL/kg vs 11.8 mL/kg; P = .03). Standard blood component therapy was associated with higher inotropic score (6.6 vs 3.3; P = .002), longer ventilation times (164 hours vs 119 hours; P = .04), as well as longer hospital stays (18 days vs 12 days; P = .006) than patients receiving reconstituted fresh whole blood. Of the different factors associated with the use of reconstituted fresh whole blood, lower platelet counts at 10 minutes and at the end of cardiopulmonary bypass, older age of cells used in the prime and throughout bypass, and exposures to higher number of allogeneic donors were found to be independent predictors of poor clinical outcomes. Reconstituted fresh whole blood used for the prime, throughout cardiopulmonary bypass, and for all transfusion requirements within the first 24 hours postoperatively results in reduced chest tube volume loss and improved clinical outcomes in neonatal patients undergoing cardiac surgery.

  7. Extraction of fault component from abnormal sound in diesel engines using acoustic signals

    NASA Astrophysics Data System (ADS)

    Dayong, Ning; Changle, Sun; Yongjun, Gong; Zengmeng, Zhang; Jiaoyi, Hou

    2016-06-01

    In this paper, a method for extracting fault components from abnormal acoustic signals and automatically diagnosing diesel engine faults is presented. The method, named the dislocation superimposed method (DSM), is based on the improved random decrement technique (IRDT), a differential function (DF) and correlation analysis (CA). The aim of DSM is to linearly superpose multiple segments of the abnormal acoustic signal, exploiting the waveform similarity of the fault components. The method uses the sample points at which the abnormal sound first appears as the starting position for each segment. In this study, the abnormal sound belonged to the shock-type fault class; thus, a starting-position search method based on gradient variance was adopted. A coefficient of the degree of similarity between two equally sized signals is presented. By comparing against this similarity measure, the extracted fault component can be judged automatically. The results show that this method is capable of accurately extracting the fault component from abnormal acoustic signals induced by shock-type faults, and that the extracted component can be used to identify the fault type.
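
    A simplified sketch of the superposition-and-similarity step, with assumed segment onsets rather than the authors' IRDT-based onset detection: segments starting at the detected shock onsets are averaged to reinforce the repeating fault component, and a normalized correlation coefficient scores the similarity of two equally sized signals.

```python
# Sketch: average equally long segments starting at detected shock onsets, then
# score similarity of two equal-length signals by normalized correlation.
import numpy as np

def superpose_segments(signal, onsets, length):
    """Linearly superpose (average) segments that start at each onset index."""
    segs = [signal[i:i + length] for i in onsets if i + length <= len(signal)]
    return np.mean(segs, axis=0)

def similarity(a, b):
    """Normalized correlation between two equally sized signals, in [-1, 1]."""
    a = a - a.mean()
    b = b - b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Toy abnormal sound: a decaying shock repeated at known onsets plus noise
rng = np.random.default_rng(3)
t = np.arange(400)
shock = np.exp(-t[:80] / 10.0) * np.sin(0.6 * t[:80])
x = rng.normal(0, 0.5, t.size)
onsets = [20, 140, 260]
for i in onsets:
    x[i:i + 80] += shock

fault_component = superpose_segments(x, onsets, 80)
print("similarity to true shock:", round(similarity(fault_component, shock), 2))
```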

  8. Random Number Generation and Executive Functions in Parkinson's Disease: An Event-Related Brain Potential Study.

    PubMed

    Münte, Thomas F; Joppich, Gregor; Däuper, Jan; Schrader, Christoph; Dengler, Reinhard; Heldmann, Marcus

    2015-01-01

    The generation of random sequences is considered to tax executive functions and has previously been reported to be impaired in Parkinson's disease (PD). Our aim was to assess the neurophysiological markers of random number generation in PD. Event-related potentials (ERP) were recorded in 12 PD patients and 12 age-matched normal controls (NC) while they either engaged in random number generation (RNG), pressing the number keys on a computer keyboard in a random sequence, or in ordered number generation (ONG), necessitating key presses in the canonical order. Key presses were paced by an external auditory stimulus at a rate of 1 tone every 1800 ms. As a secondary task, subjects had to monitor the tone sequence for a particular target tone to which the number "0" key had to be pressed. This target tone occurred randomly and infrequently, thus creating a secondary oddball task. Behaviorally, PD patients showed an increased tendency to count in steps of one as well as a tendency towards repetition avoidance. Electrophysiologically, the amplitude of the P3 component of the ERP to the target tone of the secondary task was reduced during RNG in PD but not in NC. The behavioral findings indicate less random behavior in PD, while the ERP findings suggest that this impairment comes about because attentional resources are depleted in PD.

  9. Depression Care Management: Can Employers Purchase Improved Outcomes?

    PubMed Central

    Rost, Kathryn; Marshall, Donna; Shearer, Benjamin; Dietrich, Allen J.

    2011-01-01

    Fourteen vendors are currently selling depression care management products to US employers after randomized trials demonstrate improved work outcomes. The research team interviewed 10 (71.4%) of these vendors to compare their products to four key components of interventions demonstrated to improve work outcomes. Five of 10 depression products incorporate all four key components, three of which are sold by health maintenance organizations (HMOs); however, HMOs did not deliver these components at the recommended intensity and/or duration. Only one product delivered by a disease management company delivered all four components of care at the recommended intensity and duration. This “voltage drop,” which we anticipate will increase with product implementation, suggests that every delivery system should carefully evaluate the design of its depression product before implementation for its capacity to deliver evidence-based care, repeating these evaluations as new evidence emerges. PMID:21738872

  10. Application of stochastic processes in random growth and evolutionary dynamics

    NASA Astrophysics Data System (ADS)

    Oikonomou, Panagiotis

    We study the effect of power-law distributed randomness on the dynamical behavior of processes such as stochastic growth patterns and evolution. First, we examine the geometrical properties of random shapes produced by a generalized stochastic Loewner Evolution driven by a superposition of a Brownian motion and a stable Levy process. The situation is defined by the usual stochastic Loewner Evolution parameter, kappa, as well as alpha, which defines the power-law tail of the stable Levy distribution. We show that the properties of these patterns change qualitatively and singularly at critical values of kappa and alpha. It is reasonable to call such changes "phase transitions". These transitions occur as kappa passes through four and as alpha passes through one. Numerical simulations are used to explore the global scaling behavior of these patterns in each "phase". We show both analytically and numerically that the growth continues indefinitely in the vertical direction for alpha greater than 1, grows logarithmically with time for alpha equal to 1, and saturates for alpha smaller than 1. The probability density has two different scales corresponding to directions along and perpendicular to the boundary. Scaling functions for the probability density are given for various limiting cases. Second, we study the effect of the architecture of biological networks on their evolutionary dynamics. In recent years, studies of the architecture of large networks have unveiled a common topology, called scale-free, in which a majority of the elements are poorly connected except for a small fraction of highly connected components. We ask how networks with distinct topologies can evolve towards a pre-established target phenotype through a process of random mutations and selection. We use networks of Boolean components as a framework to model a large class of phenotypes. Within this approach, we find that homogeneous random networks and scale-free networks exhibit drastically different evolutionary paths. While homogeneous random networks accumulate neutral mutations and evolve by sparse punctuated steps, scale-free networks evolve rapidly and continuously towards the target phenotype. Moreover, we show that scale-free networks always evolve faster than homogeneous random networks; remarkably, this property does not depend on the precise value of the topological parameter. By contrast, homogeneous random networks require a specific tuning of their topological parameter in order to optimize their fitness. This model suggests that the evolutionary paths of biological networks, punctuated or continuous, may solely be determined by the network topology.
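
    A small sketch of the driving process described above, with illustrative parameters only: the Loewner driving function is generated as a Brownian motion scaled by sqrt(kappa) plus a symmetric alpha-stable Levy process, whose increments over a step dt scale as dt^(1/alpha).

```python
# Driving function xi(t) = sqrt(kappa) * B_t + c * L_t, where L_t is a
# symmetric alpha-stable Levy process. All parameters here are illustrative.
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(6)
n_steps, dt = 5000, 1e-3
kappa, alpha, c = 2.0, 0.8, 1.0

brownian_increments = rng.normal(0.0, np.sqrt(dt), n_steps)
# Stable increments over a step of length dt scale like dt**(1/alpha)
levy_increments = levy_stable.rvs(alpha, 0.0, scale=dt ** (1.0 / alpha),
                                  size=n_steps, random_state=0)
xi = np.cumsum(np.sqrt(kappa) * brownian_increments + c * levy_increments)

print("driving function range:", float(xi.min()), float(xi.max()))
```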

  11. Enhancing physical and social environments to reduce obesity among public housing residents: rationale, trial design, and baseline data for the Healthy Families study.

    PubMed

    Quintiliani, Lisa M; DeBiasse, Michele A; Branco, Jamie M; Bhosrekar, Sarah Gees; Rorie, Jo-Anna L; Bowen, Deborah J

    2014-11-01

    Intervention programs that change environments have the potential for greater population impact on obesity compared to individual-level programs. We began a cluster randomized, multi-component multi-level intervention to improve weight, diet, and physical activity among low-socioeconomic status public housing residents. Here we describe the rationale, intervention design, and baseline survey data. After approaching 12 developments, ten were randomized to intervention (n=5) or assessment-only control (n=5). All residents in intervention developments are welcome to attend any intervention component: health screenings, mobile food bus, walking groups, cooking demonstrations, and a social media campaign; all of which are facilitated by community health workers who are residents trained in health outreach. To evaluate weight and behavioral outcomes, a subgroup of female residents and their daughters age 8-15 were recruited into an evaluation cohort. In total, 211 households completed the survey (RR=46.44%). Respondents were Latino (63%), Black (24%), and had ≤ high school education (64%). Respondents reported ≤2 servings of fruits & vegetables/day (62%), visiting fast food restaurants 1+ times/week (32%), and drinking soft drinks daily or more (27%). The only difference between randomized groups was race/ethnicity, with more Black residents in the intervention vs. control group (28% vs. 19%, p=0.0146). Among low-socioeconomic status urban public housing residents, we successfully recruited and randomized families into a multi-level intervention targeting obesity. If successful, this intervention model could be adopted in other public housing developments or entities that also employ community health workers, such as food assistance programs or hospitals. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. Intervening on spontaneous physical activity to prevent weight regain in older adults: design of a randomized, clinical trial.

    PubMed

    Nicklas, Barbara J; Gaukstern, Jill E; Legault, Claudine; Leng, Iris; Rejeski, W Jack

    2012-03-01

    There is a need to identify evidenced-based obesity treatments that are effective in maintaining lost weight. Weight loss results in reductions in energy expenditure, including spontaneous physical activity (SPA) which is defined as energy expenditure resulting primarily from unstructured mobility-related activities that occur during daily life. To date, there is little research, especially randomized, controlled trials, testing strategies that can be adopted and sustained to prevent declines in SPA that occur with weight loss. Self-monitoring is a successful behavioral strategy to facilitate behavior change, so a provocative question is whether monitoring SPA-related energy expenditure would override these reductions in SPA, and slow weight regain. This study is a randomized trial in older, obese men and women designed to test the hypothesis that adding a self-regulatory intervention (SRI), focused around self-monitoring of SPA, to a weight loss intervention will result in less weight and fat mass regain following weight loss than a comparable intervention that lacks this self-regulatory behavioral strategy. Participants (n=72) are randomized to a 5-month weight loss intervention with or without the addition of a behavioral component that includes an innovative approach to promoting increased SPA. Both groups then transition to self-selected diet and exercise behavior for a 5-month follow-up. Throughout the 10-month period, the SRI group is provided with an intervention designed to promote a SPA level that is equal to or greater than each individual's baseline SPA level, allowing us to isolate the effects of the SPA self-regulatory intervention component on weight and fat mass regain. Copyright © 2011 Elsevier Inc. All rights reserved.

  13. A practical method to test the validity of the standard Gumbel distribution in logit-based multinomial choice models of travel behavior

    DOE PAGES

    Ye, Xin; Garikapati, Venu M.; You, Daehyun; ...

    2017-11-08

    Most multinomial choice models (e.g., the multinomial logit model) adopted in practice assume an extreme-value Gumbel distribution for the random components (error terms) of utility functions. This distributional assumption offers a closed-form likelihood expression when the utility maximization principle is applied to model choice behaviors. As a result, model coefficients can be easily estimated using the standard maximum likelihood estimation method. However, maximum likelihood estimators are consistent and efficient only if distributional assumptions on the random error terms are valid. It is therefore critical to test the validity of underlying distributional assumptions on the error terms that form the basis of parameter estimation and policy evaluation. In this paper, a practical yet statistically rigorous method is proposed to test the validity of the distributional assumption on the random components of utility functions in both the multinomial logit (MNL) model and multiple discrete-continuous extreme value (MDCEV) model. Based on a semi-nonparametric approach, a closed-form likelihood function that nests the MNL or MDCEV model being tested is derived. The proposed method allows traditional likelihood ratio tests to be used to test violations of the standard Gumbel distribution assumption. Simulation experiments are conducted to demonstrate that the proposed test yields acceptable Type-I and Type-II error probabilities at commonly available sample sizes. The test is then applied to three real-world discrete and discrete-continuous choice models. For all three models, the proposed test rejects the validity of the standard Gumbel distribution in most utility functions, calling for the development of robust choice models that overcome adverse effects of violations of distributional assumptions on the error terms in random utility functions.
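
    The assumption being tested can be made concrete: if the utility error terms are i.i.d. standard Gumbel, the choice probabilities take the familiar logit (softmax) form. The sketch below uses purely illustrative utilities; the paper's semi-nonparametric generalization and likelihood ratio test are not reproduced here.

```python
# Under i.i.d. standard Gumbel errors, P(choose j) = exp(V_j) / sum_k exp(V_k).
# The paper's test asks whether a semi-nonparametric generalization of this
# error distribution fits significantly better (via a likelihood ratio test).
import numpy as np

def mnl_probabilities(V):
    """Multinomial logit choice probabilities from systematic utilities V."""
    V = np.asarray(V, dtype=float)
    e = np.exp(V - V.max())            # subtract the max for numerical stability
    return e / e.sum()

def mnl_loglik(V_matrix, choices):
    """Log-likelihood of observed choices given per-observation utilities."""
    return sum(np.log(mnl_probabilities(v)[c]) for v, c in zip(V_matrix, choices))

# Toy example: 3 alternatives, 4 observations with hypothetical utilities
V = np.array([[1.0, 0.2, -0.5],
              [0.3, 0.8,  0.1],
              [0.0, 0.0,  0.0],
              [1.5, 1.4,  1.3]])
choices = [0, 1, 2, 0]
print(mnl_probabilities(V[0]), round(mnl_loglik(V, choices), 3))
```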

  14. A practical method to test the validity of the standard Gumbel distribution in logit-based multinomial choice models of travel behavior

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ye, Xin; Garikapati, Venu M.; You, Daehyun

    Most multinomial choice models (e.g., the multinomial logit model) adopted in practice assume an extreme-value Gumbel distribution for the random components (error terms) of utility functions. This distributional assumption offers a closed-form likelihood expression when the utility maximization principle is applied to model choice behaviors. As a result, model coefficients can be easily estimated using the standard maximum likelihood estimation method. However, maximum likelihood estimators are consistent and efficient only if distributional assumptions on the random error terms are valid. It is therefore critical to test the validity of underlying distributional assumptions on the error terms that form the basis of parameter estimation and policy evaluation. In this paper, a practical yet statistically rigorous method is proposed to test the validity of the distributional assumption on the random components of utility functions in both the multinomial logit (MNL) model and multiple discrete-continuous extreme value (MDCEV) model. Based on a semi-nonparametric approach, a closed-form likelihood function that nests the MNL or MDCEV model being tested is derived. The proposed method allows traditional likelihood ratio tests to be used to test violations of the standard Gumbel distribution assumption. Simulation experiments are conducted to demonstrate that the proposed test yields acceptable Type-I and Type-II error probabilities at commonly available sample sizes. The test is then applied to three real-world discrete and discrete-continuous choice models. For all three models, the proposed test rejects the validity of the standard Gumbel distribution in most utility functions, calling for the development of robust choice models that overcome adverse effects of violations of distributional assumptions on the error terms in random utility functions.

  15. Improving Adherence to Smoking Cessation Treatment: Smoking Outcomes in a Web-based Randomized Trial.

    PubMed

    Graham, Amanda L; Papandonatos, George D; Cha, Sarah; Erar, Bahar; Amato, Michael S

    2018-03-15

    Partial adherence in Internet smoking cessation interventions presents treatment and evaluation challenges. Increasing adherence may improve outcomes. To present smoking outcomes from an Internet randomized trial of two strategies to encourage adherence to tobacco dependence treatment components: (i) a social network (SN) strategy to integrate smokers into an online community and (ii) free nicotine replacement therapy (NRT). In addition to intent-to-treat analyses, we used novel statistical methods to distinguish the impact of treatment assignment from treatment utilization. A total of 5,290 current smokers on a cessation website (WEB) were randomized to WEB, WEB + SN, WEB + NRT, or WEB + SN + NRT. The main outcome was 30-day point prevalence abstinence at 3 and 9 months post-randomization. Adherence measures included self-reported medication use (meds), and website metrics of skills training (sk) and community use (comm). Inverse Probability of Retention Weighting and Inverse Probability of Treatment Weighting jointly addressed dropout and treatment selection. Propensity weights were used to calculate Average Treatment effects on the Treated. Treatment assignment analyses showed no effects on abstinence for either adherence strategy. Abstinence rates were 25.7%-32.2% among participants that used all three treatment components (sk+comm +meds).Treatment utilization analyses revealed that among such participants, sk+comm+meds yielded large percentage point increases in 3-month abstinence rates over sk alone across arms: WEB = 20.6 (95% CI = 10.8, 30.4), WEB + SN = 19.2 (95% CI = 11.1, 27.3), WEB + NRT = 13.1 (95% CI = 4.1, 22.0), and WEB + SN + NRT = 20.0 (95% CI = 12.2, 27.7). Novel propensity weighting approaches can serve as a model for establishing efficacy of Internet interventions and yield important insights about mechanisms. NCT01544153.
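
    A hedged sketch of the propensity-weighting idea used in the utilization analyses, on toy data and simplified relative to the paper's joint retention/treatment weighting: the probability of using a treatment component is estimated from covariates, and non-users are weighted by p/(1-p) so that they resemble users, yielding an average-treatment-effect-on-the-treated comparison.

```python
# Sketch of ATT weighting: treated units get weight 1, untreated units get
# weight p/(1-p), where p is the estimated propensity of treatment use.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n = 2000
x = rng.normal(size=(n, 3))                          # baseline covariates (toy)
used = rng.binomial(1, 1 / (1 + np.exp(-x[:, 0])))   # treatment utilization
abstinent = rng.binomial(1, 0.2 + 0.1 * used)        # 30-day abstinence (toy)

p = LogisticRegression().fit(x, used).predict_proba(x)[:, 1]
w = np.where(used == 1, 1.0, p / (1 - p))            # ATT weights

att = (abstinent[used == 1].mean()
       - np.average(abstinent[used == 0], weights=w[used == 0]))
print(f"abstinence difference, users minus reweighted non-users: {att:.3f}")
```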

  16. Self-administration of intranasal influenza vaccine: Immunogenicity and volunteer acceptance

    PubMed Central

    Burgess, Timothy H.; Murray, Clinton K.; Bavaro, Mary F.; Landrum, Michael L.; O’Bryan, Thomas A.; Rosas, Jessica G.; Cammarata, Stephanie M.; Martin, Nicholas J.; Ewing, Daniel; Raviprakash, Kanakatte; Mor, Deepika; Zell, Elizabeth R.; Wilkins, Kenneth J.; Millar, Eugene V.

    2018-01-01

    Background In outbreak settings, mass vaccination strategies could maximize health protection of military personnel. Self-administration of live attenuated influenza vaccine (LAIV) may be a means to vaccinate large numbers of people and achieve deployment readiness while sparing the use of human resources. Methods A phase IV, open-label, randomized controlled trial evaluating the immunogenicity and acceptance of self-administered (SA) LAIV was conducted from 2012 to 2014. SA subjects were randomized to either individual self-administration or self-administration in a group setting. Control randomized subjects received healthcare worker-administered (HCWA) LAIV. Anti-hemagglutinin (HAI) antibody concentrations were measured pre- and post-vaccination. The primary endpoint was immunogenicity non-inferiority between SA and HCWA groups. Subjects were surveyed on preferred administration method. Results A total of 1077 subjects consented and were randomized (529 SA, 548 HCWA). Subject characteristics were very similar between groups, though SA subjects were younger, more likely to be white and on active duty. The per-protocol analysis included 1024 subjects (501 SA, 523 HCWA). Post-vaccination geometric mean titers by vaccine strain and by study group (HCWA vs. SA) were: A/H1N1 (45.8 vs. 48.7, respectively; p = 0.43), A/H3N2 (45.5 vs. 46.4; p = 0.80), B/Yamagata (17.2 vs. 17.8; p = 0.55). Seroresponses to A components were high (∼67%), while seroresponses to B components were lower (∼25%). Seroresponse did not differ by administration method. Baseline preference for administration method was similar between groups, with the majority in each group expressing no preference. At follow-up, the majority (64%) of SA subjects preferred SA vaccine. Conclusions LAIV immunogenicity was similar for HCWA and SA vaccines. SA was well-tolerated and preferred to HCWA among those who performed SA. PMID:26117150

  17. Temporal variation in the mating structure of Sanday, Orkney Islands.

    PubMed

    Brennan, E R; Relethford, J H

    1983-01-01

    Pedigree and vital statistics data from the population of Sanday, Orkney Islands, Scotland, were used to assess temporal changes in population structure. Secular trends in patterns of mate choice were analysed for three separate birth cohorts of spouses: 1855-1884, 1885-1924 and 1925-1964. The degree to which mating was random or assortative with respect to both genealogical and geographic distance was determined by comparing average characteristics of all potential mates of married males with those of actual wives. We integrated this procedure, originally developed by Dyke (1971), into a three-fold investigation of population structure: (1) comparison of random and non-random components of relatedness as measured from pedigree data; (2) an analysis of marital distance distributions for actual and potential mates of married males; and (3) the relationship between genealogical relatedness and geographic distance. As population size decreased from 1881 to the present, total kinship and spatial distances between spouses increased. Whereas the random component of relatedness increased over time, consanguinity avoidance was sufficient to decrease the total coefficient of kinship over time. Part of the increase in consanguinity avoidance was associated with isolate breakdown, as distances between island-born spouses, as well as the total amount of off-island migration, increased from the mid-nineteenth century to the present. Mate choice was influenced by geographic distance for all time periods, although this effect diminished over time. Since decreases in population size, concomitant with increases in consanguinity avoidance and community exogamy, have probably occurred quite frequently in small human populations, as well as in rural Western communities in the past century, observed secular trends illustrate the potential for change in population structure characteristic of isolate breakdown.

  18. Cemented all-polyethylene and metal-backed polyethylene tibial components used for primary total knee arthroplasty: a systematic review of the literature and meta-analysis of randomized controlled trials involving 1798 primary total knee implants.

    PubMed

    Voigt, Jeffrey; Mosier, Michael

    2011-10-05

    The cost of the implant as part of a total knee arthroplasty accounts for a substantial portion of the costs for the overall procedure: all-polyethylene tibial components cost considerably less than cemented metal-backed tibial components. We performed a systematic review of the literature to determine whether the clinical results of lower-cost all-polyethylene tibial components were comparable with the results of a more expensive metal-backed tibial component. We searched The Cochrane Library, MEDLINE, EMBASE, EBSCO CINAHL, the bibliographies of identified articles, orthopaedic meeting abstracts, health technology assessment web sites, and important orthopaedic journals. This search was performed for the years 1990 to the present. No language restriction was applied. We restricted our search to Level-I studies involving participants who received either an all-polyethylene or a metal-backed tibial implant. The primary outcome measures were durability, function, and adverse events. Two reviewers independently screened the papers for inclusion, assessed trial quality, and extracted data. Effects estimates were pooled with use of fixed and random-effects models of risk ratios, calculated with 95% confidence intervals. Heterogeneity was assessed with the I2 statistic. Forest plots were also generated. Data on 1798 primary total knee implants from twelve studies were analyzed. In all studies, the median or mean age of the participants was greater than sixty-seven years, with a majority of the patients being female. There was no difference between patients managed with an all-polyethylene tibial component and those managed with a metal-backed tibial component in terms of adverse events. There was no significant difference between the two groups in terms of the durability of the implants at two, ten, and fifteen years postoperatively, regardless of the year or how durability was defined (revision or radiographic failure). Finally, with use of a variety of validated measures, there was no difference between the two groups in terms of functional status at two, eight, and ten years, regardless of the measure used. A less expensive all-polyethylene component as part of a total knee arthroplasty has results equivalent to those obtained with a cemented metal-backed tibial component. Using a total knee implant with a cemented all-polyethylene tibial component could save the healthcare system substantial money while obtaining equivalent results to more expensive cemented designs and materials.
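
    A minimal sketch of the pooling described above, using hypothetical study-level numbers rather than the review's data: inverse-variance pooling of log risk ratios under fixed and random effects, with the DerSimonian-Laird estimate of between-study variance and the I² heterogeneity statistic.

```python
# Inverse-variance pooling of log risk ratios; random-effects weights use the
# DerSimonian-Laird estimate of between-study variance tau^2. Numbers are toy.
import numpy as np

log_rr = np.log(np.array([0.9, 1.1, 0.8, 1.0, 1.2]))   # hypothetical studies
var = np.array([0.04, 0.06, 0.09, 0.05, 0.08])          # their variances

w_fixed = 1 / var
pooled_fixed = np.sum(w_fixed * log_rr) / np.sum(w_fixed)

Q = np.sum(w_fixed * (log_rr - pooled_fixed) ** 2)       # Cochran's Q
df = len(log_rr) - 1
C = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (Q - df) / C)                            # DerSimonian-Laird
I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0

w_rand = 1 / (var + tau2)
pooled_rand = np.sum(w_rand * log_rr) / np.sum(w_rand)
se_rand = np.sqrt(1 / np.sum(w_rand))
ci = np.exp(pooled_rand + np.array([-1.96, 1.96]) * se_rand)

print(f"fixed RR={np.exp(pooled_fixed):.2f}, random RR={np.exp(pooled_rand):.2f}, "
      f"95% CI {ci[0]:.2f}-{ci[1]:.2f}, I^2={I2:.0f}%")
```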

  19. Constraint-induced aphasia therapy in post-stroke aphasia rehabilitation: A systematic review and meta-analysis of randomized controlled trials

    PubMed Central

    Bao, Yong; Xie, Qing; Xu, Yang; Zhang, Junmei

    2017-01-01

    Background Constraint-induced aphasia therapy (CIAT) has been widely used in post-stroke aphasia rehabilitation. An increasing number of controlled clinical trials have investigated the efficacy of CIAT for post-stroke aphasia. Purpose To systematically review the randomized controlled trials (RCTs) concerning the effect of CIAT in post-stroke patients with aphasia, and to identify the useful components of CIAT in post-stroke aphasia rehabilitation. Methods A computerized database search was performed across five databases (Pubmed, EMbase, Medline, ScienceDirect and Cochrane library). Cochrane handbook domains were used to evaluate the methodological quality of the included RCTs. Results Eight RCTs met the inclusion criteria. Based on three of these RCTs, inconsistent results were found when comparing CIAT with conventional therapies containing no CIAT component. Five RCTs showed that CIAT performed as well as other intensive aphasia therapies in terms of improving language performance. One RCT showed that therapy embedded with social interaction was likely to enhance the efficacy of CIAT. Conclusion CIAT may be useful for improving chronic post-stroke aphasia; however, there is limited evidence to support its superiority over other aphasia therapies. Massed practice is likely to be a useful component of CIAT, while the role of "constraint" needs to be explored further. CIAT embedded with social interaction may yield additional benefits. PMID:28846724

  20. Systematic review and meta-analysis of interventions targeting sleep and their impact on child body mass index, diet, and physical activity.

    PubMed

    Yoong, Sze Lin; Chai, Li Kheng; Williams, Christopher M; Wiggers, John; Finch, Meghan; Wolfenden, Luke

    2016-05-01

    This review aimed to examine the impact of interventions involving an explicit sleep component on child body mass index (BMI), diet, and physical activity. A systematic search was undertaken in six databases to identify randomized controlled trials examining the impact of interventions with a sleep component on child BMI, dietary intake, and/or physical activity. A random effects meta-analysis was conducted assessing the impact of included interventions on child BMI. Of the eight included trials, three enforced a sleep protocol and five targeted sleep as part of multicomponent behavioral interventions, either exclusively or together with nutrition and physical activity. Meta-analysis of three studies found that multicomponent behavioral interventions involving a sleep component were not significantly effective in changing child BMI (n = 360; -0.04 kg/m² [-0.18, 0.11]; I² = 0%); however, only one study included in the meta-analysis successfully changed sleep duration in children. There were some reported improvements to adolescent diet, and only one trial examined the impact on child physical activity, where a significant effect was observed. Findings from the included studies suggest that where improvements in child sleep duration were achieved, a positive impact on child BMI, nutrition, and physical activity was also observed. © 2016 The Obesity Society.

  1. Effects of a progressive resistance exercise program with high-speed component on the physical function of older women with sarcopenic obesity: a randomized controlled trial.

    PubMed

    Vasconcelos, Karina S S; Dias, João M D; Araújo, Marília C; Pinheiro, Ana C; Moreira, Bruno S; Dias, Rosângela C

    2016-07-11

    Sarcopenic obesity is associated with disability in older people, especially in women. Resistance exercises are recommended for this population, but their efficacy is not clear. To evaluate the effects of a progressive resistance exercise program with high-speed component on the physical function of older women with sarcopenic obesity. Twenty-eight women 65 to 80 years old, with a body mass index ≥30kg/m2 and handgrip strength ≤21kg were randomly allocated to two groups. The experimental group underwent a 10-week resistance exercise program designed to improve strength, power, and endurance of lower-limb muscles, with open chain and closed chain exercises. The control group had their health status monitored through telephone calls. The primary outcomes were lower limb muscle performance measured by knee extensor strength, power and fatigue by isokinetic dynamometry, and mobility measured by the Short Physical Performance Battery and by gait velocity. The secondary outcome was health-related quality of life assessed by the SF-36 Questionnaire. The average rate of adherence was 85%, with few mild adverse effects. There were no significant between-group differences for any of the outcomes. In this study, a progressive resistance exercise program with high-speed component was not effective for improving the physical function of older women with sarcopenic obesity.

  2. Positive Psychology Interventions Addressing Pleasure, Engagement, Meaning, Positive Relationships, and Accomplishment Increase Well-Being and Ameliorate Depressive Symptoms: A Randomized, Placebo-Controlled Online Study.

    PubMed

    Gander, Fabian; Proyer, René T; Ruch, Willibald

    2016-01-01

    Seligman (2002) suggested three paths to well-being, the pursuit of pleasure, the pursuit of meaning, and the pursuit of engagement, later adding two more, positive relationships and accomplishment, in his 2011 version. The contribution of these new components to well-being has yet to be addressed. In an online positive psychology intervention study, we randomly assigned 1624 adults aged 18-78 (M = 46.13; 79.2% women) to seven conditions. Participants wrote down three things they related to either one of the five components of Seligman's Well-Being theory (Conditions 1-5), all of the five components (Condition 6) or early childhood memories (placebo control condition). We assessed happiness (AHI) and depression (CES-D) before and after the intervention, and 1-, 3-, and 6 months afterwards. Additionally, we considered moderation effects of well-being levels at baseline. Results confirmed that all interventions were effective in increasing happiness and most ameliorated depressive symptoms. The interventions worked best for those in the middle-range of the well-being continuum. We conclude that interventions based on pleasure, engagement, meaning, positive relationships, and accomplishment are effective strategies for increasing well-being and ameliorating depressive symptoms and that positive psychology interventions are most effective for those people in the middle range of the well-being continuum.

  3. Constraint-induced aphasia therapy in post-stroke aphasia rehabilitation: A systematic review and meta-analysis of randomized controlled trials.

    PubMed

    Zhang, Jiaqi; Yu, Jiadan; Bao, Yong; Xie, Qing; Xu, Yang; Zhang, Junmei; Wang, Pu

    2017-01-01

    Constraint-induced aphasia therapy (CIAT) has been widely used in post-stroke aphasia rehabilitation. An increasing number of controlled clinical trials have investigated the efficacy of CIAT for post-stroke aphasia. The aim was to systematically review the randomized controlled trials (RCTs) concerning the effect of CIAT in post-stroke patients with aphasia, and to identify the useful components of CIAT in post-stroke aphasia rehabilitation. A computerized database search was performed across five databases (Pubmed, EMbase, Medline, ScienceDirect and Cochrane library). Cochrane handbook domains were used to evaluate the methodological quality of the included RCTs. Eight RCTs met the inclusion criteria. Based on three of these RCTs, inconsistent results were found when comparing CIAT with conventional therapies containing no CIAT component. Five RCTs showed that CIAT performed as well as other intensive aphasia therapies in terms of improving language performance. One RCT showed that therapy embedded with social interaction was likely to enhance the efficacy of CIAT. CIAT may be useful for improving chronic post-stroke aphasia; however, there is limited evidence to support its superiority over other aphasia therapies. Massed practice is likely to be a useful component of CIAT, while the role of "constraint" needs to be explored further. CIAT embedded with social interaction may yield additional benefits.

  4. Length and pressure of the reconstructed lower esophageal sphincter is determined by both crural closure and Nissen fundoplication.

    PubMed

    Louie, Brian E; Kapur, Seema; Blitz, Maurice; Farivar, Alexander S; Vallières, Eric; Aye, Ralph W

    2013-02-01

    Laparoscopic Nissen fundoplication comprises two components: a wrap, thought to be responsible for lower esophageal sphincter function, and crural closure, performed to prevent herniation. We hypothesized that the gastroesophageal junction competence effected by Nissen fundoplication results from both closure of the crural diaphragm and creation of the fundoplication. Patients with uncomplicated reflux undergoing Nissen fundoplication were prospectively enrolled. After hiatal dissection, patients were randomized to crural closure followed by fundoplication (group 1) or fundoplication followed by crural closure (group 2). Intra-operative high-resolution manometry collected sphincter pressure and length data after complete dissection and after each component of the repair. Eighteen patients were randomized. When compared to the completely dissected hiatus, the mean sphincter length increased by 1.3 cm (p < 0.001), and the mean sphincter pressure increased by 13.7 mmHg (p < 0.001). Groups 1 and 2 had similar sphincter length and pressure changes. Crural closure and the fundal wrap contribute equally to sphincter length, although crural closure appears to contribute more to sphincter pressure. The Nissen fundoplication restores the function of the gastroesophageal junction, and thus the reflux barrier, by means of two main components: the crural closure and the construction of a 360° fundal wrap. Each of these components is equally important in establishing both increased sphincter length and pressure.

  5. Chemical composition separation of a propylene-ethylene random copolymer by high temperature solvent gradient interaction chromatography.

    PubMed

    Liu, Yonggang; Phiri, Mohau Justice; Ndiripo, Anthony; Pasch, Harald

    2017-11-03

    A propylene-ethylene random copolymer was fractionated by preparative temperature rising elution fractionation (TREF). The structural heterogeneity of the bulk sample and its TREF fractions was studied by high temperature liquid chromatography with a solvent gradient elution from 1-decanol to 1,2,4-trichlorobenzene. HPLC alone cannot resolve those propylene-ethylene copolymers with high ethylene content in the bulk sample, due to their low weight fractions in the bulk sample and a small response factor of these components in the ELSD detector, as well as their broad chemical composition distribution. These components can only be detected after being separated and enriched by TREF followed by HPLC analysis. Chemical composition separations were achieved for TREF fractions with average ethylene contents between 2.1 and 22.0mol%, showing that copolymers with higher ethylene contents were adsorbed stronger in the Hypercarb column and eluted later. All TREF fractions, except the 40°C fraction, were relatively homogeneous in both molar mass and chemical composition. The 40°C fraction was rather broad in both molar mass and chemical composition distributions. 2D HPLC showed that the molar masses of the components containing more ethylene units were getting lower for the 40°C fraction. HPLC revealed and confirmed that co-crystallization influences the separation in TREF of the studied propylene-ethylene copolymer. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Differentiating Motivational from Affective Influence of Performance-contingent Reward on Cognitive Control: The Wanting Component Enhances Both Proactive and Reactive Control.

    PubMed

    Chaillou, Anne-Clémence; Giersch, Anne; Hoonakker, Marc; Capa, Rémi L; Bonnefond, Anne

    2017-04-01

    Positive affect strongly modulates goal-directed behaviors and cognitive control mechanisms. It often results from the presence of a pleasant stimulus in the environment, whether that stimulus appears unpredictably or as a consequence of a particular behavior. The influence of positive affect linked to a random pleasant stimulus differs from the influence of positive affect resulting from performance-contingent pleasant stimuli. However, the mechanisms by which the performance contingency of pleasant stimuli modulates the influence of positive affect on cognitive control mechanisms have not been elucidated. Here, we tested the hypothesis that these differentiated effects are the consequence of the activation of the motivational "wanting" component specifically under performance contingency conditions. To that end, we directly compared the effects on cognitive control of pleasant stimuli (a monetary reward) attributed in a performance contingent manner, and of random pleasant stimuli (positive picture) not related to performance, during an AX-CPT task. Both proactive and reactive modes of control were increased specifically by performance contingency, as reflected by faster reaction times and larger amplitude of the CNV and P3a components. Our findings advance our understanding of the respective effects of affect and motivation, which is of special interest regarding alterations of emotion-motivation interaction found in several psychopathological disorders. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Convergence of sampling in protein simulations

    NASA Astrophysics Data System (ADS)

    Hess, Berk

    2002-03-01

    With molecular dynamics, protein dynamics can be simulated in atomic detail. Current computers are not fast enough to probe all available conformations, but fluctuations around one conformation can be sampled to a reasonable extent. The motions with the largest fluctuations can be filtered out of a simulation using covariance or principal component analysis. A problem with this analysis is that random diffusion can appear as correlated motion. An analysis is presented of how long a simulation should be to obtain relevant results for global motions. The analysis reveals that the cosine content of the principal components is a good indicator of poor sampling.
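
    The cosine-content diagnostic mentioned here has a simple form: the projection p(t) onto principal component i is compared with a half-cosine spanning the simulation length, c_i = (2/T) (∫ cos(iπt/T) p(t) dt)² / ∫ p(t)² dt, and values near 1 indicate diffusion-like, poorly sampled motion. A small numeric sketch under this definition:

```python
# Cosine content of a principal-component projection p(t):
# c_i = (2/T) * (integral cos(i*pi*t/T) p(t) dt)^2 / integral p(t)^2 dt.
# Values close to 1 suggest the PC resembles random diffusion (poor sampling).
import numpy as np

def cosine_content(p, i=1):
    T = len(p)
    t = np.arange(T)
    cos = np.cos(i * np.pi * (t + 0.5) / T)
    return 2.0 / T * (cos @ p) ** 2 / (p @ p)

# Toy example: a random walk (diffusive) versus a well-sampled oscillation
rng = np.random.default_rng(5)
walk = np.cumsum(rng.normal(size=5000))
oscillation = np.sin(np.linspace(0, 40 * np.pi, 5000)) + rng.normal(0, 0.1, 5000)
print("random walk:", round(cosine_content(walk), 2),
      "oscillation:", round(cosine_content(oscillation), 2))
```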

  8. Within-Tunnel Variations in Pressure Data for Three Transonic Wind Tunnels

    NASA Technical Reports Server (NTRS)

    DeLoach, Richard

    2014-01-01

    This paper compares the results of pressure measurements made on the same test article with the same test matrix in three transonic wind tunnels. A comparison is presented of the unexplained variance associated with polar replicates acquired in each tunnel. The impact of a significant component of systematic (not random) unexplained variance is reviewed, and the results of analyses of variance are presented to assess the degree of significant systematic error in these representative wind tunnel tests. Total uncertainty estimates are reported for 140 samples of pressure data, quantifying the effects of within-polar random errors and between-polar systematic bias errors.

  9. A distributed scheduling algorithm for heterogeneous real-time systems

    NASA Technical Reports Server (NTRS)

    Zeineldine, Osman; El-Toweissy, Mohamed; Mukkamala, Ravi

    1991-01-01

    Much of the previous work on load balancing and scheduling in distributed environments was concerned with homogeneous systems and homogeneous loads. Several of the results indicated that random policies are as effective as other, more complex load allocation policies. The effects of heterogeneity on scheduling algorithms for hard real-time systems are examined. A distributed scheduler designed specifically to handle heterogeneities in both nodes and node traffic is proposed. The performance of the algorithm is measured in terms of the percentage of jobs discarded. While a random task allocation is very sensitive to heterogeneities, the algorithm is shown to be robust to such non-uniformities in system components and load.

  10. Evaluation of random errors in Williams’ series coefficients obtained with digital image correlation

    NASA Astrophysics Data System (ADS)

    Lychak, Oleh V.; Holyns'kiy, Ivan S.

    2016-03-01

    The use of the Williams' series parameters for fracture analysis requires valid information about their error values. The aim of this investigation is the development of a method for estimating the standard deviation of the random errors of the Williams' series parameters obtained from the measured components of the stress field. In addition, a criterion for choosing the optimal number of terms in the truncated Williams' series, such that their parameters are derived with minimal errors, is proposed. The method was used to evaluate the Williams' parameters obtained from data measured by the digital image correlation technique on a three-point bending specimen.

  11. Scattering Models and Basic Experiments in the Microwave Regime

    NASA Technical Reports Server (NTRS)

    Fung, A. K.; Blanchard, A. J. (Principal Investigator)

    1985-01-01

    The objectives of research over the next three years are: (1) to develop a randomly rough surface scattering model which is applicable over the entire frequency band; (2) to develop a computer simulation method and algorithm to simulate scattering from known randomly rough surfaces, Z(x,y); (3) to design and perform laboratory experiments to study geometric and physical target parameters of an inhomogeneous layer; (4) to develop scattering models for an inhomogeneous layer which accounts for near field interaction and multiple scattering in both the coherent and the incoherent scattering components; and (5) a comparison between theoretical models and measurements or numerical simulation.

  12. Amplitude- and rise-time-compensated filters

    DOEpatents

    Nowlin, Charles H.

    1984-01-01

    An amplitude-compensated rise-time-compensated filter for a pulse time-of-occurrence (TOOC) measurement system is disclosed. The filter converts an input pulse, having the characteristics of random amplitudes and random, non-zero rise times, to a bipolar output pulse wherein the output pulse has a zero-crossing time that is independent of the rise time and amplitude of the input pulse. The filter differentiates the input pulse, along the linear leading edge of the input pulse, and subtracts therefrom a pulse fractionally proportional to the input pulse. The filter of the present invention can use discrete circuit components and avoids the use of delay lines.
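
    A numeric sketch of the compensation idea on idealized linear-ramp pulses (not the patented circuit): the output is the derivative of the input minus a fraction k of the input, so on the linear leading edge y(t) = A/t_r - kAt/t_r and the zero crossing falls at t = 1/k, independent of amplitude A and rise time t_r, provided 1/k is shorter than t_r.

```python
# Idealized demo: y(t) = dx/dt - k*x(t). On a linear leading edge the zero
# crossing of y falls at t = 1/k regardless of pulse amplitude and rise time
# (as long as 1/k is shorter than the rise time).
import numpy as np

def zero_crossing_time(amplitude, rise_time, k=2.0, dt=1e-4):
    t = np.arange(0.0, rise_time, dt)
    x = amplitude * t / rise_time            # linear leading edge of the pulse
    y = np.gradient(x, dt) - k * x           # differentiate and subtract k*x
    i = np.argmax(y < 0)                     # first sample past the crossing
    return t[i]

for amplitude, rise_time in [(1.0, 1.0), (5.0, 1.0), (1.0, 3.0), (7.0, 2.5)]:
    print(amplitude, rise_time, round(zero_crossing_time(amplitude, rise_time), 3))
# All pulses cross zero near t = 1/k = 0.5, so the timing is amplitude- and
# rise-time-independent.
```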

  13. Coherent Doppler lidar signal covariance including wind shear and wind turbulence

    NASA Technical Reports Server (NTRS)

    Frehlich, R. G.

    1993-01-01

    The performance of coherent Doppler lidar is determined by the statistics of the coherent Doppler signal. The derivation and calculation of the covariance of the Doppler lidar signal is presented for random atmospheric wind fields with wind shear. The random component is described by a Kolmogorov turbulence spectrum. The signal parameters are clarified for a general coherent Doppler lidar system. There are two distinct physical regimes: one where the transmitted pulse determines the signal statistics and the other where the wind field dominates the signal statistics. The Doppler shift of the signal is identified in terms of the wind field and system parameters.

  14. Reduction in Acute Gastroenteritis among Military Trainees: Secondary Effects of a Hygiene-based Cluster-Randomized Trial for Skin and Soft Tissue Infection Prevention

    PubMed Central

    D’Onofrio, Michael J.; Schlett, Carey D.; Millar, Eugene V.; Cui, Tianyuan; Lanier, Jeffrey B.; Law, Natasha N.; Tribble, David R.; Ellis, Michael W.

    2018-01-01

    Military personnel in congregate settings are at increased risk for acute gastroenteritis.1,2 Personal hygiene (e.g., frequent hand washing and use of hand sanitizers) remains a central strategy. A skin and soft tissue infection (SSTI) prevention trial was conducted among military trainees.3 Trainees were randomized to 1 of 3 groups with incrementally increasing education- and hygiene-based measures. The principal components were promotion of hand washing in addition to a once-weekly application of a chlorhexidine-based body wash. Herein, we report the trial’s impact on acute gastroenteritis. PMID:25695181

  15. The Influence of Mathematics Vocabulary Instruction Embedded within Addition Tutoring for First-Grade Students with Mathematics Difficulty

    ERIC Educational Resources Information Center

    Powell, Sarah R.; Driver, Melissa K.

    2015-01-01

    Researchers and practitioners indicate students require explicit instruction on mathematics vocabulary terms, yet no study has examined the effects of an embedded vocabulary component within mathematics tutoring for early elementary students. First-grade students with mathematics difficulty (MD; n = 98) were randomly assigned to addition tutoring…

  16. Power Analysis for Models of Change in Cluster Randomized Designs

    ERIC Educational Resources Information Center

    Li, Wei; Konstantopoulos, Spyros

    2017-01-01

    Field experiments in education frequently assign entire groups such as schools to treatment or control conditions. These experiments sometimes incorporate a longitudinal component where, for example, students are followed over time to assess differences in the average rate of linear change or the rate of acceleration. In this study, we provide methods…

  17. Attributions for Success and Failure in Smoking Cessation.

    ERIC Educational Resources Information Center

    Epstein, Jennifer A.; And Others

    This study examined the determinants of attributions for success or failure in stopping smoking in a self-help treatment program with and without a drug component. Subjects (N=137) were randomly assigned to one of three experimental conditions: (1) nicotine gum and a self-help manual with an intrinsic motivational orientation; (2) self-help manual…

  18. Effects of a Professional Development Package to Prepare Special Education Paraprofessionals to Implement Evidence-Based Practice

    ERIC Educational Resources Information Center

    Brock, Matthew E.; Carter, Erik W.

    2015-01-01

    Although paraprofessionals have become an increasingly integral part of special education services, most paraprofessionals lack training in evidence-based instructional strategies. We used a randomized controlled experimental design to examine the efficacy of a professional development training package and its individual components to equip 25…

  19. Impact of the Fit and Strong Intervention on Older Adults with Osteoarthritis

    ERIC Educational Resources Information Center

    Hughes, Susan L.; Seymour, Rachel B.; Campbell, Richard; Pollak, Naomi; Huber, Gail; Sharma, Leena

    2004-01-01

    Purpose: This study assessed the impact of a low cost, multicomponent physical activity intervention for older adults with lower extremity osteoarthritis. Design and Methods: A randomized controlled trial compared the effects of a facility-based multiple-component training program followed by home-based adherence (n = 80) to a wait list control…

  20. Teaching Students with Emotional and Behavioral Disorders to Self-Advocate through Persuasive Writing

    ERIC Educational Resources Information Center

    Cuenca-Sanchez, Yojanna; Mastropieri, Margo A.; Scruggs, Thomas E.; Kidd, Julie K.

    2012-01-01

    We examined the effectiveness of the Self-Regulated Strategy Development (SRSD) model of writing instruction with a self-determination training component for middle school-age students with emotional and behavioral disorders. We randomly assigned students to experimental or comparison treatments during which special education teachers provided the…

  1. Modeling species distribution and change using random forest [Chapter 8

    Treesearch

    Jeffrey S. Evans; Melanie A. Murphy; Zachary A. Holden; Samuel A. Cushman

    2011-01-01

    Although inference is a critical component in ecological modeling, the balance between accurate predictions and inference is the ultimate goal in ecological studies (Peters 1991; De’ath 2007). Practical applications of ecology in conservation planning, ecosystem assessment, and biodiversity are highly dependent on very accurate spatial predictions of...

  2. Physical Training Improves Insulin Resistance Syndrome Markers in Obese Adolescents.

    ERIC Educational Resources Information Center

    Kang, Hyun-Sik; Gutin, Bernard; Barbeau, Paule; Owens, Scott; Lemmon, Christian R.; Allison, Jerry; Litaker, Mark S.; Le, Ngoc-Anh

    2002-01-01

    Tested the hypothesis that physical training (PT), especially high-intensity PT, would favorably affect components of the insulin resistance syndrome (IRS) in obese adolescents. Data on teens randomized into lifestyle education (LSE) alone, LSE plus moderate-intensity PT, and LSE plus high-intensity PT indicated that PT, especially high-intensity…

  3. Multi-component access to a community-based weight loss program: 12 week results

    USDA-ARS?s Scientific Manuscript database

    The current study examined weight loss between a comprehensive lifestyle modification program (Weight Watchers PointsPlus program) that included three ways to access the program and a self-help (SH) condition. A total of 293 participants were randomized to either a Weight Watchers condition (WW) (n=148) or a SH...

  4. A Group Contingency Program to Improve the Behavior of Elementary School Students in a Cafeteria

    ERIC Educational Resources Information Center

    Fabiano, Gregory A.; Pelham, William E., Jr.; Karmazin, Karen; Kreher, Joanne; Panahon, Carlos J.; Carlson, Carl

    2008-01-01

    Studies of behavior modification interventions for disruptive behavior in schools have generally focused on classroom behavior with less research directed toward child behavior in other school settings (e.g., cafeterias). The present report documents the effect of a group contingency intervention with a random reward component, targeting…

  5. The Impact of Sample Size and Other Factors When Estimating Multilevel Logistic Models

    ERIC Educational Resources Information Center

    Schoeneberger, Jason A.

    2016-01-01

    The design of research studies utilizing binary multilevel models must necessarily incorporate knowledge of multiple factors, including estimation method, variance component size, and number of predictors, in addition to sample sizes. This Monte Carlo study examined the performance of random effect binary outcome multilevel models under varying…

  6. Generalizability of Scaling Gradients on Direct Behavior Ratings

    ERIC Educational Resources Information Center

    Chafouleas, Sandra M.; Christ, Theodore J.; Riley-Tillman, T. Chris

    2009-01-01

    Generalizability theory is used to examine the impact of scaling gradients on a single-item Direct Behavior Rating (DBR). A DBR refers to a type of rating scale used to efficiently record target behavior(s) following an observation occasion. Variance components associated with scale gradients are estimated using a random effects design for persons…

  7. Morpho-Phonemic Analysis Boosts Word Reading for Adult Struggling Readers

    ERIC Educational Resources Information Center

    Gray, Susan H.; Ehri, Linnea C.; Locke, John L.

    2018-01-01

    A randomized control trial compared the effects of two kinds of vocabulary instruction on component reading skills of adult struggling readers. Participants seeking alternative high school diplomas received 8 h of scripted tutoring to learn forty academic vocabulary words embedded within a civics curriculum. They were matched for language…

  8. Virtual Reality Cognitive Behavior Therapy for Public Speaking Anxiety: A Randomized Clinical Trial

    ERIC Educational Resources Information Center

    Wallach, Helene S.; Safir, Marilyn P.; Bar-Zvi, Margalit

    2009-01-01

    Public speaking anxiety (PSA) is a common phobia. Although cognitive behavior therapy (CBT) is preferred, difficulties arise with the exposure component (lack of therapist control, patient's inability to imagine, self-flooding, loss of confidentiality resulting from public exposure). Virtual reality CBT (VRCBT) enables a high degree of therapist…

  9. Cognitive Behavioral Therapy for Anxiety in Children with Autism Spectrum Disorders: A Randomized, Controlled Trial

    ERIC Educational Resources Information Center

    Wood, Jeffrey J.; Drahota, Amy; Sze, Karen; Har, Kim; Chiu, Angela; Langer, David A.

    2009-01-01

    Background: Children with autism spectrum disorders often present with comorbid anxiety disorders that cause significant functional impairment. This study tested a modular cognitive behavioral therapy (CBT) program for children with this profile. A standard CBT program was augmented with multiple treatment components designed to accommodate or…

  10. Onion consumption and bone density in laying hens

    USDA-ARS?s Scientific Manuscript database

    Onion and its flavonoid component, quercetin, are associated with increased bone density in humans, rabbits, and rodents. The purpose of this study was to determine whether there is a similar effect of onion on laying hens. Thirty-two Hy-line W36 White Leghorn hens at 30 weeks of age were randomly d...

  11. Increasing Word Recognition Skills in High School Remedial Readers through Systematic Intersensory Transfer.

    ERIC Educational Resources Information Center

    Silverston, Randall A.; Deichmann, John W.

    The purpose of this study was to design and test a remedial reading instructional strategy for word recognition skills utilizing specific intersensory transfer components. The subjects were 56 high school sophomores and juniors enrolled in special education classes. Eight subjects were randomly selected from each of seven special education…

  12. Evaluating Parent Satisfaction of School Nursing Services

    ERIC Educational Resources Information Center

    Read, Mary; Small, Patricia; Donaher, Kathleen; Gilsanz, Paola; Sheetz, Anne

    2009-01-01

    The Conceptual Model of Nursing Health Policy (CMNHP) was used to guide this study of client satisfaction as one component of an ongoing assessment of the Essential School Health Service (ESHS) Programs conducted by the Massachusetts Department of Public Health. Random samples of parents/guardians of students who use the school nursing services…

  13. Component-based control of oil-gas-water mixture composition in pipelines

    NASA Astrophysics Data System (ADS)

    Voytyuk, I. N.

    2018-03-01

    The article provides a theoretical justification of a method for measuring changes in the content of oil, gas, and water in pipelines, and discusses the design of a measurement system for its implementation. An assessment of random and systematic errors for the proposed system is presented, together with recommendations for its optimization.

  14. SimEngine v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Le, Hai D.

    2017-03-02

    SimEngine provides the core functionalities and components that are key to the development of discrete event simulation tools. These include events, activities, event queues, random number generators, and basic result tracking classes. SimEngine was designed for high performance, integrates seamlessly into any Microsoft .Net development environment, and provides a flexible API for simulation developers.
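
    SimEngine itself is a .NET library and its API is not reproduced here; the sketch below is an independent Python illustration of the core pieces the abstract lists: events, an event queue, a random number generator, and simple result tracking.

```python
import heapq
import random

# Minimal discrete-event simulation core (independent illustration, not
# SimEngine's actual API): events, event queue, RNG, and result tracking.
class Simulator:
    def __init__(self, seed=0):
        self.now = 0.0
        self._queue = []                 # (time, sequence, callback)
        self._seq = 0
        self.rng = random.Random(seed)
        self.stats = []                  # simple result tracking

    def schedule(self, delay, callback):
        heapq.heappush(self._queue, (self.now + delay, self._seq, callback))
        self._seq += 1

    def run(self, until):
        while self._queue and self._queue[0][0] <= until:
            self.now, _, callback = heapq.heappop(self._queue)
            callback(self)

# Example activity: a Poisson arrival process whose arrival times are logged.
def arrival(sim):
    sim.stats.append(sim.now)
    sim.schedule(sim.rng.expovariate(1.0), arrival)   # schedule next arrival

sim = Simulator(seed=42)
sim.schedule(0.0, arrival)
sim.run(until=100.0)
print(f"{len(sim.stats)} arrivals by t=100 (expected ~100)")
```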

  15. 78 FR 66008 - Proposed Data Collections Submitted for Public Comment and Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-04

    ... developing recommendations, future research and program needs concerning home-testing for MSM. Specific Aims... Manhunt and Adam4Adam. This study also has a qualitative component that aims to examine the experiences of participants in the randomized control trial (RCT). Participants for the qualitative data collection will be...

  16. Male lifetime mating success in relation to body size in Diabrotica barberi

    USDA-ARS?s Scientific Manuscript database

    Body size is often an important component of male lifetime mating success in insects, especially when males are capable of mating several times over their lifespan. We paired either a large or small male northern corn rootworm with a female of random size and noted copulation success. We observed co...

  17. Respiratory-deficient mutants of the unicellular green alga Chlamydomonas: a review.

    PubMed

    Salinas, Thalia; Larosa, Véronique; Cardol, Pierre; Maréchal-Drouard, Laurence; Remacle, Claire

    2014-05-01

    Genetic manipulation of the unicellular green alga Chlamydomonas reinhardtii is straightforward. Nuclear genes can be interrupted by insertional mutagenesis or targeted by RNA interference, whereas random or site-directed mutagenesis allows the introduction of mutations into the mitochondrial genome. This, combined with a screen that easily allows discrimination of respiratory-deficient mutants, makes Chlamydomonas a model system of choice for studying mitochondrial biology in photosynthetic organisms. Since the first description of Chlamydomonas respiratory-deficient mutants in 1977 by random mutagenesis, many other mutants affected in mitochondrial components have been characterized. These respiratory-deficient mutants increased our knowledge of the function and assembly of the respiratory enzyme complexes. More recently, some of these mutants allowed the study of mitochondrial gene expression processes poorly understood in Chlamydomonas. In this review, we update the data concerning the respiratory components, with a special focus on the assembly factors identified in other organisms. In addition, we make an inventory of the different mitochondrial respiratory mutants that are inactivated in either mitochondrial or nuclear genes. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  18. Volume and methodological quality of randomized controlled trials in laparoscopic surgery: assessment over a 10-year period.

    PubMed

    Antoniou, Stavros A; Andreou, Alexandros; Antoniou, George A; Koch, Oliver O; Köhler, Gernot; Luketina, Ruzica-R; Bertsias, Antonios; Pointner, Rudolph; Granderath, Frank-Alexander

    2015-11-01

    Measures have been taken to improve methodological quality of randomized controlled trials (RCTs). This review systematically assessed the trends in volume and methodological quality of RCTs on minimally invasive surgery within a 10-year period. RCTs on minimally invasive surgery were searched in the 10 most cited general surgical journals and the 5 most cited journals of laparoscopic interest for the years 2002 and 2012. Bibliometric and methodological quality components were abstracted using the Scottish Intercollegiate Guidelines Network. The pooled number of RCTs from low-contribution regions demonstrated an increasing proportion of the total published RCTs, compensating for a concomitant decrease of the respective contributions from Europe and North America. International collaborations were more frequent in 2012. Acceptable or high quality RCTs accounted for 37.9% and 54.4% of RCTs published in 2002 and 2012, respectively. Components of external validity were poorly reported. Both the volume and the reporting quality of laparoscopic RCTs have increased from 2002 to 2012, but there seems to be ample room for improvement of methodological quality. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Computational simulation of coupled material degradation processes for probabilistic lifetime strength of aerospace materials

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Bast, Callie C.

    1992-01-01

    The research included ongoing development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic material strength degradation model, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subjected to a number of effects or primitive variables. These primitive variables may include high temperature, fatigue, or creep. In most cases, strength is reduced as a result of the action of a variable. This multifactor interaction strength degradation equation has been randomized and is included in the computer program PROMISS. Also included in the research is the development of methodology to calibrate the above-described constitutive equation using actual experimental materials data together with linear regression of that data, thereby predicting values for the empirical material constants for each effect or primitive variable. This regression methodology is included in the computer program PROMISC. Actual experimental materials data were obtained from the open literature for materials typically of interest to those studying aerospace propulsion system components. Material data for Inconel 718 were analyzed using the developed methodology.
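
    A Monte Carlo sketch of a randomized multiplicative strength-degradation model is shown below. The product form S/S0 = Π_i [(A_iu − A_i)/(A_iu − A_i0)]^{a_i} is assumed here purely to illustrate the approach; the exponents, variable ranges, and distributions are hypothetical and are not the calibrated PROMISS values for Inconel 718.

```python
import numpy as np

# Monte Carlo sketch of a randomized multifactor strength-degradation model.
# Form, exponents, and distributions are illustrative assumptions only.
rng = np.random.default_rng(3)
n_samples = 100_000
S0 = 1200.0                                   # reference strength, MPa (illustrative)

# One "primitive variable": temperature T with reference T0 and ultimate Tu.
T0, Tu = 20.0, 1100.0
a_T = rng.normal(0.5, 0.05, n_samples)        # randomized empirical exponent
T = rng.normal(650.0, 25.0, n_samples)        # randomized service temperature

# Second primitive variable: fatigue cycles N on a log scale, same form.
logN0, logNu = 0.0, 7.0
a_N = rng.normal(0.25, 0.03, n_samples)
logN = rng.normal(5.0, 0.3, n_samples)

S = S0 * ((Tu - T) / (Tu - T0)) ** a_T * ((logNu - logN) / (logNu - logN0)) ** a_N
print(f"mean strength {S.mean():.0f} MPa, 1st percentile {np.percentile(S, 1):.0f} MPa")
```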

  20. Random matrix theory and fund of funds portfolio optimisation

    NASA Astrophysics Data System (ADS)

    Conlon, T.; Ruskin, H. J.; Crane, M.

    2007-08-01

    The proprietary nature of hedge fund investing means that it is common practice for managers to release minimal information about their returns. The construction of a fund of hedge funds portfolio requires a correlation matrix which often has to be estimated using a relatively small sample of monthly returns data, which induces noise. In this paper, random matrix theory (RMT) is applied to a cross-correlation matrix C, constructed using hedge fund returns data. The analysis reveals a number of eigenvalues that deviate from the spectrum suggested by RMT. The components of the deviating eigenvectors are found to correspond to distinct groups of strategies that are applied by hedge fund managers. The inverse participation ratio is used to quantify the number of components that participate in each eigenvector. Finally, the correlation matrix is cleaned by separating the noisy part from the non-noisy part of C. This technique is found to greatly reduce the difference between the predicted and realised risk of a portfolio, leading to an improved risk profile for a fund of hedge funds.
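
    The eigenvalue-filtering step can be sketched on synthetic data (the hedge-fund returns themselves are proprietary): compare the correlation-matrix spectrum against the Marchenko-Pastur upper edge, compute inverse participation ratios, and replace the noisy bulk of eigenvalues by their mean. Series counts, sample length, and the injected "strategy" group below are invented; the exact cleaning recipe in the paper may differ.

```python
import numpy as np

# RMT-style cleaning of a correlation matrix on synthetic return data.
rng = np.random.default_rng(4)
N, T = 60, 120
returns = rng.standard_normal((T, N))
returns[:, :15] += 0.5 * rng.standard_normal((T, 1))    # one correlated "strategy" group

returns = (returns - returns.mean(0)) / returns.std(0)
C = returns.T @ returns / T                              # correlation matrix

q = N / T
lam_max = (1 + np.sqrt(q)) ** 2                          # upper Marchenko-Pastur edge
vals, vecs = np.linalg.eigh(C)

# Inverse participation ratio of each eigenvector: small IPR <-> many components.
ipr = (vecs ** 4).sum(axis=0)

# "Clean" the matrix: keep deviating eigenvalues, replace the noisy bulk by
# their average so the trace is preserved.
noise = vals < lam_max
vals_clean = vals.copy()
vals_clean[noise] = vals[noise].mean()
C_clean = (vecs * vals_clean) @ vecs.T
np.fill_diagonal(C_clean, 1.0)

print(f"{(~noise).sum()} eigenvalues deviate above the MP edge {lam_max:.2f}")
print(f"IPR of largest eigenvector: {ipr[-1]:.3f} (1/N = {1/N:.3f})")
```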

  1. Temporal processing and long-latency auditory evoked potential in stutterers.

    PubMed

    Prestes, Raquel; de Andrade, Adriana Neves; Santos, Renata Beatriz Fernandes; Marangoni, Andrea Tortosa; Schiefer, Ana Maria; Gil, Daniela

    Stuttering is a speech fluency disorder, and may be associated with neuroaudiological factors linked to central auditory processing, including changes in auditory processing skills and temporal resolution. To characterize temporal processing and the long-latency auditory evoked potential in stutterers and to compare them with non-stutterers. The study included 41 right-handed subjects, aged 18-46 years, divided into two groups: stutterers (n=20) and non-stutterers (n=21), comparable in age, education, and sex. All subjects were submitted to the duration pattern test, random gap detection test, and long-latency auditory evoked potential. Individuals who stutter showed poorer performance on the Duration Pattern and Random Gap Detection tests when compared with fluent individuals. In the long-latency auditory evoked potential, there was a difference in the latency of the N2 and P3 components; stutterers had higher latency values. Stutterers show poor performance in temporal processing and higher latency values for the N2 and P3 components. Copyright © 2017 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.

  2. Fracture mechanics concepts in reliability analysis of monolithic ceramics

    NASA Technical Reports Server (NTRS)

    Manderscheid, Jane M.; Gyekenyesi, John P.

    1987-01-01

    Basic design concepts for high-performance, monolithic ceramic structural components are addressed. The design of brittle ceramics differs from that of ductile metals because of the inability of ceramic materials to redistribute high local stresses caused by inherent flaws. Random flaw size and orientation requires that a probabilistic analysis be performed in order to determine component reliability. The current trend in probabilistic analysis is to combine linear elastic fracture mechanics concepts with the two parameter Weibull distribution function to predict component reliability under multiaxial stress states. Nondestructive evaluation supports this analytical effort by supplying data during verification testing. It can also help to determine statistical parameters which describe the material strength variation, in particular the material threshold strength (the third Weibull parameter), which in the past was often taken as zero for simplicity.
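
    The reliability step of this approach reduces, in its simplest uniaxial form, to the two-parameter Weibull failure probability P_f = 1 − exp[−(σ/σ0)^m]. The modulus, characteristic strength, and stress levels below are illustrative values, not material data from the report, and the multiaxial fracture-mechanics treatment is omitted.

```python
import numpy as np

# Two-parameter Weibull reliability sketch for a brittle component under a
# uniform uniaxial stress (illustrative parameters only).
m = 10.0            # Weibull modulus (scatter of flaw strengths)
sigma_0 = 400.0     # characteristic strength, MPa
sigma = np.array([200.0, 250.0, 300.0, 350.0])   # applied stress levels, MPa

# Probability of failure and reliability for a unit stressed volume/area.
P_f = 1.0 - np.exp(-(sigma / sigma_0) ** m)
R = 1.0 - P_f
for s, r in zip(sigma, R):
    print(f"sigma = {s:5.1f} MPa -> reliability = {r:.4f}")
# A nonzero threshold strength (the third Weibull parameter, sigma_u) would
# replace sigma with max(sigma - sigma_u, 0) in the exponent.
```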

  3. Moisture Forecast Bias Correction in GEOS DAS

    NASA Technical Reports Server (NTRS)

    Dee, D.

    1999-01-01

    Data assimilation methods rely on numerous assumptions about the errors involved in measuring and forecasting atmospheric fields. One of the more disturbing of these is that short-term model forecasts are assumed to be unbiased. In the case of atmospheric moisture, for example, observational evidence shows that the systematic component of errors in forecasts and analyses is often of the same order of magnitude as the random component. We have implemented a sequential algorithm for estimating forecast moisture bias from rawinsonde data in the Goddard Earth Observing System Data Assimilation System (GEOS DAS). The algorithm is designed to remove the systematic component of analysis errors and can be easily incorporated in an existing statistical data assimilation system. We will present results of initial experiments that show a significant reduction of bias in the GEOS DAS moisture analyses.
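
    The idea of sequentially estimating and removing a systematic forecast error from observation-minus-forecast residuals can be sketched with a simple exponential-smoothing estimator. The gain, noise levels, and the "true" bias below are assumptions for illustration; the actual GEOS DAS bias-correction algorithm is more elaborate.

```python
import numpy as np

# Minimal sketch: sequentially estimate a forecast bias from O-F residuals.
rng = np.random.default_rng(5)
n_cycles = 200
true_bias = 0.8            # systematic forecast error (e.g., g/kg of moisture)
random_sd = 1.0            # random forecast + observation error
gamma = 0.05               # bias-update gain (assumed)

bias_hat = 0.0
history = []
for k in range(n_cycles):
    omf = true_bias + rng.normal(0.0, random_sd)   # observation minus raw forecast
    corrected_omf = omf - bias_hat                 # residual after bias removal
    bias_hat += gamma * corrected_omf              # sequential bias update
    history.append(bias_hat)

print(f"estimated bias after {n_cycles} cycles: {history[-1]:.2f} (true {true_bias})")
```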

  4. Statistics of Shared Components in Complex Component Systems

    NASA Astrophysics Data System (ADS)

    Mazzolini, Andrea; Gherardi, Marco; Caselle, Michele; Cosentino Lagomarsino, Marco; Osella, Matteo

    2018-04-01

    Many complex systems are modular. Such systems can be represented as "component systems," i.e., sets of elementary components, such as LEGO bricks in LEGO sets. The bricks found in a LEGO set reflect a target architecture, which can be built following a set-specific list of instructions. In other component systems, instead, the underlying functional design and constraints are not obvious a priori, and their detection is often a challenge of both scientific and practical importance, requiring a clear understanding of component statistics. Importantly, some quantitative invariants appear to be common to many component systems, most notably a common broad distribution of component abundances, which often resembles the well-known Zipf's law. Such "laws" affect in a general and nontrivial way the component statistics, potentially hindering the identification of system-specific functional constraints or generative processes. Here, we specifically focus on the statistics of shared components, i.e., the distribution of the number of components shared by different system realizations, such as the common bricks found in different LEGO sets. To account for the effects of component heterogeneity, we consider a simple null model, which builds system realizations by random draws from a universe of possible components. Under general assumptions on abundance heterogeneity, we provide analytical estimates of component occurrence, which quantify exhaustively the statistics of shared components. Surprisingly, this simple null model can positively explain important features of empirical component-occurrence distributions obtained from large-scale data on bacterial genomes, LEGO sets, and book chapters. Specific architectural features and functional constraints can be detected from occurrence patterns as deviations from these null predictions, as we show for the illustrative case of the "core" genome in bacteria.
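
    The null model can be sketched directly: draw "sets" of components from a universe whose abundances follow a Zipf-like law and tabulate how many sets share each component. Universe size, set sizes, and the abundance exponent below are illustrative choices, not values fitted to the genomes, LEGO sets, or book chapters analysed in the paper.

```python
import numpy as np

# Null model: sets built by weighted random draws from a heterogeneous universe.
rng = np.random.default_rng(6)
n_components = 5000
n_sets = 200
set_size = 150

ranks = np.arange(1, n_components + 1)
p = 1.0 / ranks            # Zipf-like abundance ~ 1/rank
p /= p.sum()

occurrence = np.zeros(n_components, dtype=int)
for _ in range(n_sets):
    members = rng.choice(n_components, size=set_size, replace=False, p=p)
    occurrence[members] += 1

# Distribution of the number of sets sharing a component: abundance
# heterogeneity alone already produces a broad occurrence distribution.
shared_by_all = (occurrence == n_sets).sum()
print(f"components present in every set: {shared_by_all}")
print("occurrence histogram (first 10 bins):", np.bincount(occurrence)[:10])
```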

  5. Effectiveness of disease-management programs for improving diabetes care: a meta-analysis.

    PubMed

    Pimouguet, Clément; Le Goff, Mélanie; Thiébaut, Rodolphe; Dartigues, Jean François; Helmer, Catherine

    2011-02-08

    We conducted a meta-analysis of randomized controlled trials to assess the effectiveness of disease-management programs for improving glycemic control in adults with diabetes mellitus and to study which components of programs are associated with their effectiveness. We searched several databases for studies published up to December 2009. We included randomized controlled trials involving adults with type 1 or 2 diabetes that evaluated the effect of disease-management programs on glycated hemoglobin (hemoglobin A1C) concentrations. We performed a meta-regression analysis to determine the effective components of the programs. We included 41 randomized controlled trials in our review. Across these trials, disease-management programs resulted in a significant reduction in hemoglobin A1C levels (pooled standardized mean difference between intervention and control groups -0.38 [95% confidence interval -0.47 to -0.29], which corresponds to an absolute mean difference of 0.51%). The finding was robust in the sensitivity analyses based on quality assessment. Programs in which the disease manager was able to start or modify treatment with or without prior approval from the primary care physician resulted in a greater improvement in hemoglobin A1C levels (standardized mean difference -0.60 v. -0.28 in trials with no approval to do so; p < 0.001). Programs with a moderate or high frequency of contact reported a significant reduction in hemoglobin A1C levels compared with usual care; nevertheless, only programs with a high frequency of contact led to a significantly greater reduction compared with low-frequency contact programs (standardized mean difference -0.56 v. -0.30, p = 0.03). Disease-management programs had a clinically moderate but significant impact on hemoglobin A1C levels among adults with diabetes. Effective components of programs were a high frequency of patient contact and the ability for disease managers to adjust treatment with or without prior physician approval.

  6. Quality of community basic medical service utilization in urban and suburban areas in Shanghai from 2009 to 2014.

    PubMed

    Guo, Lijun; Bao, Yong; Ma, Jun; Li, Shujun; Cai, Yuyang; Sun, Wei; Liu, Qiaohong

    2018-01-01

    Urban areas usually display better health care services than rural areas, but data about suburban areas in China are lacking. Hence, this cross-sectional study compared the utilization of community basic medical services in Shanghai urban and suburban areas between 2009 and 2014. These data were used to improve the efficiency of community health service utilization and to provide a reference for solving the main health problems of the residents in urban and suburban areas of Shanghai. Using a two-stage random sampling method, questionnaires were completed by 73 community health service centers that were randomly selected from six districts that were also randomly selected from 17 counties in Shanghai. Descriptive statistics, principal component analysis, and forecast analysis were used to complete a gap analysis of basic health services utilization quality between urban and suburban areas. During the 6-year study period, there was an increasing trend toward greater efficiency of basic medical service provision, benefits of basic medical service provision, effectiveness of common chronic disease management, overall satisfaction of community residents, and two-way referral effects. In addition to the implementation effect of hypertension management and two-way referral, the remaining indicators showed a superior effect in urban areas compared with the suburbs (P<0.001). In addition, among the seven principal components, four principal component scores were better in urban areas than in suburban areas (P < 0.001, P = 0.004, P = 0.036, and P = 0.022). The urban comprehensive score also exceeded that of the suburbs (P<0.001). In summary, over the 6-year period, there was a rapidly increasing trend in basic medical service utilization. Comprehensive satisfaction clearly improved as well. Nevertheless, there was an imbalance in health service utilization between urban and suburban areas. There is a need for the health administrative department to address this imbalance between urban and suburban institutions and to provide the required support to underdeveloped areas to improve resident satisfaction.

  7. Implementation-effectiveness trial of an ecological intervention for physical activity in ethnically diverse low income senior centers.

    PubMed

    Rich, Porchia; Aarons, Gregory A; Takemoto, Michelle; Cardenas, Veronica; Crist, Katie; Bolling, Khalisa; Lewars, Brittany; Sweet, Cynthia Castro; Natarajan, Loki; Shi, Yuyan; Full, Kelsie M; Johnson, Eileen; Rosenberg, Dori E; Whitt-Glover, Melicia; Marcus, Bess; Kerr, Jacqueline

    2017-07-18

    As the US population ages, there is an increasing need for evidence-based, peer-led physical activity programs, particularly in ethnically diverse, low-income senior centers where access is limited. The 'Peer Empowerment Program 4 Physical Activity' (PEP4PA) is a hybrid Type II implementation-effectiveness trial of a peer-led physical activity (PA) intervention based on the ecological model of behavior change. The initial phase is a cluster randomized controlled trial in which sites are randomized to either the peer-led PA intervention or usual center programming. After 18 months, the intervention sites are further randomized to continued support or no support for another 6 months. This study will be conducted at twelve senior centers in San Diego County in low-income, diverse communities. In the intervention sites, 24 peer health coaches and 408 adults, aged 50 years and older, are invited to participate. Peer health coaches receive training and support and utilize a tablet computer for delivery and tracking. There are several levels of intervention. Individual components include pedometers, step goals, counseling, and feedback charts. Interpersonal components include group walks, group sharing and health tips, and monthly celebrations. Community components include review of PA resources, walkability audit, sustainability plan, and streetscape improvements. The primary outcome of interest is intensity and location of PA minutes per day, measured every 6 months by wrist and hip accelerometers and GPS devices. Secondary outcomes include blood pressure and physical, cognitive, and emotional functioning. Implementation measures include appropriateness & acceptability (perceived and actual fit), adoption & penetration (reach), fidelity (quantity & quality of intervention delivered), acceptability (satisfaction), costs, and sustainability. Using a peer-led implementation strategy to deliver a multi-level, community-based PA program can enhance program adoption, implementation, and sustainment. ClinicalTrials.gov, USA (NCT02405325). Date of registration, March 20, 2015. This website also contains all items from the World Health Organization Trial Registration Data Set.

  8. Quality of community basic medical service utilization in urban and suburban areas in Shanghai from 2009 to 2014

    PubMed Central

    Ma, Jun; Li, Shujun; Cai, Yuyang; Sun, Wei; Liu, Qiaohong

    2018-01-01

    Urban areas usually display better health care services than rural areas, but data about suburban areas in China are lacking. Hence, this cross-sectional study compared the utilization of community basic medical services in Shanghai urban and suburban areas between 2009 and 2014. These data were used to improve the efficiency of community health service utilization and to provide a reference for solving the main health problems of the residents in urban and suburban areas of Shanghai. Using a two-stage random sampling method, questionnaires were completed by 73 community health service centers that were randomly selected from six districts that were also randomly selected from 17 counties in Shanghai. Descriptive statistics, principal component analysis, and forecast analysis were used to complete a gap analysis of basic health services utilization quality between urban and suburban areas. During the 6-year study period, there was an increasing trend toward greater efficiency of basic medical service provision, benefits of basic medical service provision, effectiveness of common chronic disease management, overall satisfaction of community residents, and two-way referral effects. In addition to the implementation effect of hypertension management and two-way referral, the remaining indicators showed a superior effect in urban areas compared with the suburbs (P<0.001). In addition, among the seven principal components, four principal component scores were better in urban areas than in suburban areas (P < 0.001, P = 0.004, P = 0.036, and P = 0.022). The urban comprehensive score also exceeded that of the suburbs (P<0.001). In summary, over the 6-year period, there was a rapidly increasing trend in basic medical service utilization. Comprehensive satisfaction clearly improved as well. Nevertheless, there was an imbalance in health service utilization between urban and suburban areas. There is a need for the health administrative department to address this imbalance between urban and suburban institutions and to provide the required support to underdeveloped areas to improve resident satisfaction. PMID:29791470

  9. Effectiveness of disease-management programs for improving diabetes care: a meta-analysis

    PubMed Central

    Pimouguet, Clément; Le Goff, Mélanie; Thiébaut, Rodolphe; Dartigues, Jean François; Helmer, Catherine

    2011-01-01

    Background We conducted a meta-analysis of randomized controlled trials to assess the effectiveness of disease-management programs for improving glycemic control in adults with diabetes mellitus and to study which components of programs are associated with their effectiveness. Methods We searched several databases for studies published up to December 2009. We included randomized controlled trials involving adults with type 1 or 2 diabetes that evaluated the effect of disease-management programs on glycated hemoglobin (hemoglobin A1C) concentrations. We performed a meta-regression analysis to determine the effective components of the programs. Results We included 41 randomized controlled trials in our review. Across these trials, disease-management programs resulted in a significant reduction in hemoglobin A1C levels (pooled standardized mean difference between intervention and control groups −0.38 [95% confidence interval −0.47 to −0.29], which corresponds to an absolute mean difference of 0.51%). The finding was robust in the sensitivity analyses based on quality assessment. Programs in which the disease manager was able to start or modify treatment with or without prior approval from the primary care physician resulted in a greater improvement in hemoglobin A1C levels (standardized mean difference −0.60 v. −0.28 in trials with no approval to do so; p < 0.001). Programs with a moderate or high frequency of contact reported a significant reduction in hemoglobin A1C levels compared with usual care; nevertheless, only programs with a high frequency of contact led to a significantly greater reduction compared with low-frequency contact programs (standardized mean difference −0.56 v. −0.30, p = 0.03). Interpretation Disease-management programs had a clinically moderate but significant impact on hemoglobin A1C levels among adults with diabetes. Effective components of programs were a high frequency of patient contact and the ability for disease managers to adjust treatment with or without prior physician approval. PMID:21149524

  10. rTMS in fibromyalgia: a randomized trial evaluating QoL and its brain metabolic substrate.

    PubMed

    Boyer, Laurent; Dousset, Alix; Roussel, Philippe; Dossetto, Nathalie; Cammilleri, Serge; Piano, Virginie; Khalfa, Stéphanie; Mundler, Olivier; Donnet, Anne; Guedj, Eric

    2014-04-08

    This double-blind, randomized, placebo-controlled study investigated the impact of repetitive transcranial magnetic stimulation (rTMS) on quality of life (QoL) of patients with fibromyalgia, and its possible brain metabolic substrate. Thirty-eight patients were randomly assigned to receive high-frequency rTMS (n = 19) or sham stimulation (n = 19), applied to left primary motor cortex in 14 sessions over 10 weeks. Primary clinical outcomes were QoL changes at the end of week 11, measured using the Fibromyalgia Impact Questionnaire (FIQ). Secondary clinical outcomes were mental and physical QoL component measured using the 36-Item Short Form Health Survey (SF-36), but also pain, mood, and anxiety. Resting-state [(18)F]-fluorodeoxyglucose-PET metabolism was assessed at baseline, week 2, and week 11. Whole-brain voxel-based analysis was performed to study between-group metabolic changes over time. At week 11, patients of the active rTMS group had greater QoL improvement in the FIQ (p = 0.032) and in the mental component of the SF-36 (p = 0.019) than the sham stimulation group. No significant impact was found for other clinical outcomes. Compared with the sham stimulation group, patients of the active rTMS group presented an increase in right medial temporal metabolism between baseline and week 11 (p < 0.001), which was correlated with FIQ and mental component SF-36 concomitant changes (r = -0.38, p = 0.043; r = 0.51, p = 0.009, respectively). QoL improvement involved mainly affective, emotional, and social dimensions. Our study shows that rTMS improves QoL of patients with fibromyalgia. This improvement is associated with a concomitant increase in right limbic metabolism, arguing for a neural substrate to the impact of rTMS on emotional dimensions involved in QoL. This study provides Class II evidence that rTMS compared with sham rTMS improves QoL in patients with fibromyalgia.

  11. Perspectives: Nanofibers and nanowires for disordered photonics

    NASA Astrophysics Data System (ADS)

    Pisignano, Dario; Persano, Luana; Camposeo, Andrea

    2017-03-01

    As building blocks of microscopically non-homogeneous materials, semiconductor nanowires and polymer nanofibers are emerging component materials for disordered photonics, with unique properties of light emission and scattering. Effects found in assemblies of nanowires and nanofibers include broadband reflection, significant localization of light, strong and collective multiple scattering, enhanced absorption of incident photons, synergistic effects with plasmonic particles, and random lasing. We highlight recent related discoveries, with a focus on material aspects. The control of spatial correlations in complex assemblies during deposition, the coupling of modes with efficient transmission channels provided by nanofiber waveguides, and the embedment of random architectures into individually coded nanowires will allow the potential of these photonic materials to be fully exploited, unconventional physics to be highlighted, and next-generation optical devices to be achieved. The prospects opened by this technology include enhanced random lasing and mode-locking, multi-directionally guided coupling to sensors and receivers, and low-cost encrypting miniatures for encoders and labels.

  12. Worksite Environmental Interventions for Obesity Prevention and Control: Evidence from Group Randomized Trials.

    PubMed

    Fernandez, Isabel Diana; Becerra, Adan; Chin, Nancy P

    2014-06-01

    Worksites provide multiple advantages to prevent and treat obesity and to test environmental interventions to tackle its multiple causal factors. We present a literature review of group-randomized and non-randomized trials that tested worksite environmental, multiple-component interventions for obesity prevention and control, paying particular attention to the conduct of formative research prior to intervention development. The evidence on environmental interventions on measures of obesity appears to be strong, since most of the studies have a low (4/8) or unclear (2/8) risk of bias. Among the studies reviewed whose potential risk of bias was low, the magnitude of the effect was modest and sometimes in the unexpected direction. None of the four studies describing an explicit formative research stage with clear integration of findings into the intervention was able to demonstrate an effect on the main outcome of interest. We present alternative explanations for the findings and recommendations for future research.

  13. Random Interchange of Magnetic Connectivity

    NASA Astrophysics Data System (ADS)

    Matthaeus, W. H.; Ruffolo, D. J.; Servidio, S.; Wan, M.; Rappazzo, A. F.

    2015-12-01

    Magnetic connectivity, the connection between two points along a magnetic field line, has a stochastic character associated with field lines random walking in space due to magnetic fluctuations, but connectivity can also change in time due to dynamical activity [1]. For fluctuations transverse to a strong mean field, this connectivity change can be caused by stochastic interchange due to component reconnection. The process may be understood approximately by formulating a diffusion-like Fokker-Planck coefficient [2] that is asymptotically related to the standard field line random walk. Quantitative estimates are provided for transverse magnetic field models and anisotropic models such as reduced magnetohydrodynamics. In heliospheric applications, these estimates may be useful for understanding mixing between open and closed field line regions near coronal hole boundaries, and large latitude excursions of connectivity associated with turbulence. [1] A. F. Rappazzo, W. H. Matthaeus, D. Ruffolo, S. Servidio & M. Velli, ApJL, 758, L14 (2012) [2] D. Ruffolo & W. Matthaeus, ApJ, 806, 233 (2015)
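
    The underlying field line random walk can be illustrated with a toy Monte Carlo: step field lines along the mean field and give them uncorrelated transverse kicks, then estimate the field line diffusion coefficient D = ⟨Δx²⟩/(2Δz). The fluctuation amplitude, step size, and the resulting "expected" value apply only to this toy discretization, not to the models or estimates in the abstract.

```python
import numpy as np

# Toy field line random walk transverse to a strong mean field B0 z-hat.
rng = np.random.default_rng(7)
n_lines = 5000
n_steps = 2000
dz = 0.05                  # step along the mean field (in correlation lengths)
db_over_b0 = 0.2           # rms transverse fluctuation relative to B0

x = np.zeros(n_lines)
for _ in range(n_steps):
    # dx/dz = b_x / B0, with b_x drawn independently each correlation step
    x += db_over_b0 * rng.standard_normal(n_lines) * dz

z_total = n_steps * dz
D_est = (x ** 2).mean() / (2.0 * z_total)
D_toy = 0.5 * db_over_b0 ** 2 * dz      # expectation for this toy discretization
print(f"estimated D = {D_est:.4e}, toy-model expectation = {D_toy:.4e}")
```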

  14. A randomized controlled trial of the effects of hypnosis with 3-D virtual reality animation on tiredness, mood, and salivary cortisol.

    PubMed

    Thompson, Trevor; Steffert, Tony; Steed, Anthony; Gruzelier, John

    2011-01-01

    Case studies suggest hypnosis with a virtual reality (VR) component may be an effective intervention; although few follow-up randomized, controlled trials have been performed comparing such interventions with standard hypnotic treatments. Thirty-five healthy participants were randomized to self-hypnosis with VR imagery, standard self-hypnosis, or relaxation interventions. Changes in sleep, cortisol levels, and mood were examined. Self-hypnosis involved 10- to 20-min. sessions visualizing a healthy immune scenario. Trait absorption was also recorded as a possible moderator. Moderated regression indicated that both hypnosis interventions produced significantly lower tiredness ratings than relaxation when trait absorption was high. When trait absorption was low, VR resulted in significantly higher engagement ratings, although this did not translate to demonstrable improvement in outcome. Results suggest that VR imagery may increase engagement relative to traditional methods, but further investigation into its potential to enhance therapeutic efficacy is required.

  15. Components of Brief Alcohol Interventions for Youth in the Emergency Department.

    PubMed

    Walton, Maureen A; Chermack, Stephen T; Blow, Frederic C; Ehrlich, Peter F; Barry, Kristen L; Booth, Brenda M; Cunningham, Rebecca M

    2015-01-01

    Alcohol brief interventions (BIs) delivered by therapists are promising among underage drinkers in the emergency department (ED); however, integration into routine ED care is lacking. Harnessing technology for identification of at-risk drinkers and delivery of interventions could have tremendous public health impact by addressing practical barriers to implementation. The paper presents baseline, within BI session, and posttest data from an ongoing randomized controlled trial (RCT) of youth in the ED. Patients (ages 14-20) who screened positive for risky drinking were randomized to computer BI (CBI), therapist BI (TBI), or control. Measures included demographics, alcohol consumption (Alcohol Use Disorders Identification Test--Consumption [AUDIT-C]), process questions, BI components (e.g., strengths, tools), and psychological constructs (i.e., importance of cutting down, likelihood of cutting down, readiness to stop, and wanting help). Among 4389 youth surveyed (13.7% refused), 24.0% (n = 1053) screened positive for risky drinking and 80.3% (n = 836) were enrolled in the RCT; 93.7% (n = 783) completed the posttest. Although similar in content, the TBI included a tailored, computerized workbook to structure the session, whereas the CBI was a stand-alone, offline, Facebook-styled program. As compared with controls, significant increases were found at posttest for the TBI in "importance to cut down" and "readiness to stop" and for the CBI in "importance and likelihood to cut down." BI components positively associated with outcomes at posttest included greater identification of personal strengths, protective behavioral strategies, benefits of change, and alternative activities involving sports. In contrast, providing information during the TBI was negatively related to outcomes at posttest. Initial data suggest that therapist and computer BIs are promising, increasing perceived importance of reducing drinking. In addition, findings provide clues to potentially beneficial components of BIs. Future studies are needed to identify BI components that have the greatest influence on reducing risky drinking behaviors among adolescents and emerging adults.

  16. Fabrication and characterization of powder metallurgy tantalum components prepared by high compaction pressure technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Youngmoo; Lee, Dongju

    2016-04-15

    The present study has investigated the consolidation behaviors of tantalum powders during compaction and sintering, and the characteristics of the sintered components. For die compaction, the densification behaviors of the powders are simulated by finite element analyses based on the yield function proposed by Shima and Oyane. Accordingly, the green density distribution for coarser particles is predicted to be more uniform because they exhibit higher initial relative tap density owing to lower interparticle friction. It is also found that cold isostatic pressing is capable of producing denser compacts than die pressing. However, unlike the compaction behavior, the sintered density of smaller particles is found to be higher than that of coarser ones owing to their higher specific surface area. The maximum sintered density was found to be 0.96 of the theoretical density, obtained when smaller particles were pressed isostatically at 400 MPa and then sintered at 2000 °C. Moreover, the effects of processing conditions on grain size and texture were also investigated. The average grain size of the sintered specimen is 30.29 μm and its texture is less than 2 times random intensity. Consequently, it is concluded that the higher-pressure compaction technique is beneficial for producing dense, texture-free tantalum components compared with hot pressing and spark plasma sintering. - Highlights: • Higher Ta density is obtained from higher pressure and sintering temperature. • High compaction method enables P/M Ta to achieve the density of 16.00 g·cm⁻³. • A P/M Ta component with fine microstructure and random orientation is developed.

  17. Supplemental protein from dairy products increases body weight and vitamin D improves physical performance in older adults: a systematic review and meta-analysis.

    PubMed

    Dewansingh, Priya; Melse-Boonstra, Alida; Krijnen, Wim P; van der Schans, Cees P; Jager-Wittenaar, Harriët; van den Heuvel, Ellen G H M

    2018-01-01

    The purpose of this systematic review and meta-analysis was to assess the effectiveness of dairy components on nutritional status and physical fitness in older adults, as evidence for the efficacy of supplementation with these components is inconclusive. Scopus and MEDLINE were searched. Main inclusion criteria for articles were as follows: double-blind, randomized, placebo-controlled trials including participants aged ≥55 years who received dairy components or a placebo. Outcome measures were nutritional status (body weight and body mass index) and physical fitness (body composition, muscle strength, and physical performance). Thirty-six trials with 4947 participants were included. Most trials investigated protein and vitamin D supplementation and showed no effect on the outcomes. Meta-analysis of the effect of protein on body weight showed a significant increase in mean difference of 1.13 kg (95% confidence interval, 0.59-1.67). This effect increased when selecting trials with a study duration of 6 months in which less nourished and physically fit participants were included. Trials in which the participants were (pre-)frail, inactive older adults, or in which ≥20 g of protein per day was supplemented, tended to show an increase in lean body mass. Only small significant effects of vitamin D supplementation on the Timed Up and Go test (mean difference -0.75 seconds; 95% confidence interval -1.44 to -0.07) were found. This effect increased when vitamin D doses ranged between 400 and 1000 IU. Additional large randomized controlled trials of ≥6 months are needed regarding the effect of dairy components containing an adequate amount of vitamin D (400-1000 IU) and/or protein (≥20 g) on nutritional status and physical fitness in malnourished or frail older adults. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Rationale, design and methods of the HEALTHY study nutrition intervention component.

    PubMed

    Gillis, B; Mobley, C; Stadler, D D; Hartstein, J; Virus, A; Volpe, S L; El ghormli, L; Staten, M A; Bridgman, J; McCormick, S

    2009-08-01

    The HEALTHY study was a randomized, controlled, multicenter and middle school-based, multifaceted intervention designed to reduce risk factors for the development of type 2 diabetes. The study randomized 42 middle schools to intervention or control, and followed students from the sixth to the eighth grades. Here we describe the design of the HEALTHY nutrition intervention component that was developed to modify the total school food environment, defined to include the following: federal breakfast, lunch, after school snack and supper programs; a la carte venues, including snack bars and school stores; vending machines; fundraisers; and classroom parties and celebrations. Study staff implemented the intervention using core and toolbox strategies to achieve and maintain the following five intervention goals: (1) lower the average fat content of foods, (2) increase the availability and variety of fruits and vegetables, (3) limit the portion sizes and energy content of dessert and snack foods, (4) eliminate whole and 2% milk and all added sugar beverages, with the exception of low fat or nonfat flavored milk, and limit 100% fruit juice to breakfast in small portions and (5) increase the availability of higher fiber grain-based foods and legumes. Other nutrition intervention component elements were taste tests, cafeteria enhancements, cafeteria line messages and other messages about healthy eating, cafeteria learning laboratory (CLL) activities, twice-yearly training of food service staff, weekly meetings with food service managers, incentives for food service departments, and twice yearly local meetings and three national summits with district food service directors. Strengths of the intervention design were the integration of nutrition with the other HEALTHY intervention components (physical education, behavior change and communications), and the collaboration and rapport between the nutrition intervention study staff members and food service personnel at both school and district levels.

  19. Chromospheric Variability: Analysis of 36 years of Time Series from the National Solar Observatory/Sacramento Peak Ca II K-line Monitoring Program

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey D.; Keil, Stephen L.; Worden, Simon P.

    2014-01-01

    Analysis of more than 36 years of time series of seven parameters measured in the NSO/AFRL/Sac Peak K-line monitoring program elucidates five components of the variation: (1) the solar cycle (period approx. 11 years), (2) quasi-periodic variations (periods approx. 100 days), (3) a broadband stochastic process (wide range of periods), (4) rotational modulation, and (5) random observational errors. Correlation and power spectrum analyses elucidate periodic and aperiodic variation of the chromospheric parameters. Time-frequency analysis illuminates periodic and quasi-periodic signals, details of frequency modulation due to differential rotation, and in particular elucidates the rather complex harmonic structure of components (1) and (2) at time scales in the range of approx. 0.1 - 10 years. These results, using only full-disk data, further suggest that similar analyses will be useful for detecting and characterizing differential rotation in stars from stellar light curves such as those being produced by NASA's Kepler observatory. Component (3) consists of variations over a range of timescales, in the manner of a 1/f random noise process. A time-dependent Wilson-Bappu effect appears to be present in the solar cycle variations (1), but not in the stochastic process (3). Component (4) characterizes differential rotation of the active regions, and (5) is of course not characteristic of solar variability, but the fact that the observational errors are quite small greatly facilitates the analysis of the other components. The recent data suggest that the current cycle is starting late and may be relatively weak. The data analyzed in this paper can be found at the National Solar Observatory web site http://nsosp.nso.edu/cak_mon/, or by file transfer protocol at ftp://ftp.nso.edu/idl/cak.parameters.
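
    One elementary step of such an analysis, a periodogram of an unevenly sampled index containing a solar-cycle term, a rotational modulation, and stochastic scatter, can be sketched with a Lomb-Scargle power spectrum. The synthetic "K-line index", its amplitudes, and noise level below are invented and do not reproduce the Sac Peak data or the full time-frequency analysis of the paper.

```python
import numpy as np
from scipy.signal import lombscargle

# Lomb-Scargle periodogram of a synthetic, unevenly sampled chromospheric index.
rng = np.random.default_rng(8)
t = np.sort(rng.uniform(0.0, 36.0 * 365.25, 4000))        # observation days over ~36 yr
y = (1.0 * np.sin(2 * np.pi * t / (11.0 * 365.25))         # solar-cycle component
     + 0.2 * np.sin(2 * np.pi * t / 27.0)                  # rotational modulation
     + 0.3 * rng.standard_normal(t.size))                  # stochastic component
y -= y.mean()

periods = np.concatenate([np.linspace(20.0, 40.0, 400),            # around rotation
                          np.linspace(5.0, 15.0, 400) * 365.25])   # around the cycle
freqs = 2 * np.pi / periods                                        # angular frequencies
power = lombscargle(t, y, freqs)

# Report the strongest period (in days) in each band.
for label, mask in [("rotation band", periods < 100), ("cycle band", periods > 100)]:
    best = periods[mask][np.argmax(power[mask])]
    print(f"{label}: peak near {best:.1f} days")
```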

  20. Decision strategies of hearing-impaired listeners in spectral shape discrimination

    NASA Astrophysics Data System (ADS)

    Lentz, Jennifer J.; Leek, Marjorie R.

    2002-03-01

    The ability to discriminate between sounds with different spectral shapes was evaluated for normal-hearing and hearing-impaired listeners. Listeners detected a 920-Hz tone added in phase to a single component of a standard consisting of the sum of five tones spaced equally on a logarithmic frequency scale ranging from 200 to 4200 Hz. An overall level randomization of 10 dB was either present or absent. In one subset of conditions, the no-perturbation conditions, the standard stimulus was the sum of equal-amplitude tones. In the perturbation conditions, the amplitudes of the components within a stimulus were randomly altered on every presentation. For both perturbation and no-perturbation conditions, thresholds for the detection of the 920-Hz tone were measured to compare sensitivity to changes in spectral shape between normal-hearing and hearing-impaired listeners. To assess whether hearing-impaired listeners relied on different regions of the spectrum to discriminate between sounds, spectral weights were estimated from the perturbed standards by correlating the listener's responses with the level differences per component across two intervals of a two-alternative forced-choice task. Results showed that hearing-impaired and normal-hearing listeners had similar sensitivity to changes in spectral shape. On average, across-frequency correlation functions also were similar for both groups of listeners, suggesting that as long as all components are audible and well separated in frequency, hearing-impaired listeners can use information across frequency as well as normal-hearing listeners. Analysis of the individual data revealed, however, that normal-hearing listeners may be better able to adopt optimal weighting schemes. This conclusion is only tentative, as differences in internal noise may need to be considered to interpret the results obtained from weighting studies between normal-hearing and hearing-impaired listeners.
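
    The weight-estimation logic described here, correlating trial-by-trial responses with the per-component level differences between the two intervals, can be sketched on a simulated observer. The "listener", its internal weights, and the perturbation level below are invented; real data would substitute the recorded responses and the actual component-level perturbations.

```python
import numpy as np

# Sketch of correlation-based spectral weight estimation for a simulated
# two-interval task with per-component level perturbations.
rng = np.random.default_rng(9)
n_trials, n_components = 2000, 5
true_weights = np.array([0.1, 0.2, 0.4, 0.2, 0.1])   # hypothetical internal weights
internal_noise = 1.0

# Level difference (dB) of each component, interval 1 minus interval 2.
delta_level = rng.normal(0.0, 3.0, size=(n_trials, n_components))
decision_var = delta_level @ true_weights + internal_noise * rng.standard_normal(n_trials)
response = (decision_var > 0).astype(float)           # 1 = "chose interval 1"

# Point-biserial correlation between the response and each component's level
# difference; normalizing recovers the relative weighting pattern.
w = np.array([np.corrcoef(delta_level[:, k], response)[0, 1]
              for k in range(n_components)])
w /= w.sum()
print("estimated relative weights:", np.round(w, 2))
print("true relative weights     :", true_weights / true_weights.sum())
```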
