Science.gov

Sample records for adequate randomization methods

  1. Are shear force methods adequately reported?

    PubMed

    Holman, Benjamin W B; Fowler, Stephanie M; Hopkins, David L

    2016-09-01

    This study aimed to determine the detail to which shear force (SF) protocols and methods have been reported in the scientific literature between 2009 and 2015. Articles (n=734) published in peer-reviewed animal and food science journals and limited to only those testing the SF of unprocessed and non-fabricated mammal meats were evaluated. It was found that most of these SF articles originated in Europe (35.3%), investigated bovine species (49.0%), measured m. longissimus samples (55.2%), and used tenderometers manufactured by Instron (31.2%) equipped with Warner-Bratzler blades (68.8%). SF samples were also predominantly thawed prior to cooking (37.1%) and cooked sous vide, using a water bath (50.5%). Information pertaining to blade crosshead speed (47.5%), recorded SF resistance (56.7%), muscle fibre orientation when tested (49.2%), sub-section or core dimension (21.8%), end-point temperature (29.3%), and other factors contributing to SF variation was often omitted. This widespread failure of reporting diminishes repeatability and accurate SF interpretation, and must therefore be rectified. PMID:27107727

  2. AREA OVERLAP METHOD FOR DETERMINING ADEQUATE CHROMATOGRAPHIC RESOLUTION

    EPA Science Inventory

    The Area Overlap method for evaluating analytical chromatograms is evaluated and compared with the Depth-of-the-Valley, IUPAC and Purnell criteria. The method is a resolution criterion based on the fraction of area contributed by an adjacent, overlapping peak. It accounts for bot...

  3. Improved ASTM G72 Test Method for Ensuring Adequate Fuel-to-Oxidizer Ratios

    NASA Technical Reports Server (NTRS)

    Juarez, Alfredo; Harper, Susana A.

    2016-01-01

    The ASTM G72/G72M-15 Standard Test Method for Autogenous Ignition Temperature of Liquids and Solids in a High-Pressure Oxygen-Enriched Environment is currently used to evaluate materials for ignition susceptibility driven by exposure to external heat in an enriched oxygen environment. Testing performed on highly volatile liquids such as cleaning solvents has proven problematic due to inconsistent test results (non-ignitions). Non-ignition results can be misinterpreted as favorable oxygen compatibility, although they are more likely associated with inadequate fuel-to-oxidizer ratios. Forced evaporation during purging and inadequate sample size were identified as two potential causes of inadequate available sample material during testing. In an effort to maintain adequate fuel-to-oxidizer ratios within the reaction vessel during testing, several parameters were considered, including sample size, pretest sample chilling, pretest purging, and test pressure. Tests on a variety of solvents exhibiting a range of volatilities are presented in this paper. A proposed improvement to the standard test protocol as a result of this evaluation is also presented. The final proposed test protocol outlines an incremental-step method for determining optimal conditions, using increased sample sizes while respecting test-system safety limits. The proposed improved test method increases confidence in results obtained using the ASTM G72 autogenous ignition temperature test method and can aid in the oxygen compatibility assessment of highly volatile liquids and other conditions that may lead to false non-ignition results.

  4. A method for determining adequate resistance form of complete cast crown preparations.

    PubMed

    Weed, R M; Baez, R J

    1984-09-01

    A diagram with various degrees of occlusal convergence, which takes into consideration the length and diameter of complete crown preparations, was designed as a guide to assist the dentist to obtain adequate resistance form. To test the validity of the diagram, five groups of complete cast crown stainless steel dies were prepared (3.5 mm long, occlusal convergence 10, 13, 16, 19, and 22 degrees). Gold copings were cast for each of the 50 preparations. Displacement force was applied to the casting perpendicularly to a simulated 30-degree cuspal incline until the casting was displaced. Castings were deformed at margins except for the 22-degree group. Castings from this group were displaced without deformation, and it was concluded that there was a lack of adequate resistance form as predicted by the diagram. The hypothesis that the diagram could be used to predict adequate or inadequate resistance form was confirmed by this study. PMID:6384470

  5. Elementary Science Methods Courses and the "National Science Education Standards": Are We Adequately Preparing Teachers?

    ERIC Educational Resources Information Center

    Smith, Leigh K.; Gess-Newsome, Julie

    2004-01-01

    Despite the apparent lack of universally accepted goals or objectives for elementary science methods courses, teacher educators nationally are autonomously designing these classes to prepare prospective teachers to teach science. It is unclear, however, whether science methods courses are preparing teachers to teach science effectively or to…

  6. Are adequate methods available to detect protist parasites on fresh produce?

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Human parasitic protists such as Cryptosporidium, Giardia and microsporidia contaminate a variety of fresh produce worldwide. Existing detection methods lack sensitivity and specificity for most foodborne parasites. Furthermore, detection has been problematic because these parasites adhere tenacious...

  7. Quasi-Isotropic Approximation of Geometrical Optics Method as Adequate Electrodynamical Basis for Tokamak Plasma Polarimetry

    NASA Astrophysics Data System (ADS)

    Bieg, Bohdan; Chrzanowski, Janusz; Kravtsov, Yury A.; Orsitto, Francesco

    Basic principles and recent findings of the quasi-isotropic approximation (QIA) of the geometrical optics method are presented in a compact manner. QIA was developed in 1969 to describe electromagnetic waves in weakly anisotropic media. QIA represents the wave field as a power series in two small parameters, one of which is the traditional geometrical optics parameter, equal to the ratio of the wavelength to the plasma characteristic scale, and the other of which is the largest component of the anisotropy tensor. As a result, QIA is ideally suited to tokamak polarimetry/interferometry systems in the submillimeter range, where the plasma manifests the properties of a weakly anisotropic medium.

  8. Random Walk Method for Potential Problems

    NASA Technical Reports Server (NTRS)

    Krishnamurthy, T.; Raju, I. S.

    2002-01-01

    A local Random Walk Method (RWM) for potential problems governed by Laplace's and Poisson's equations is developed for two- and three-dimensional problems. The RWM is implemented and demonstrated in a multiprocessor parallel environment on a Beowulf cluster of computers. A speed gain of 16 is achieved as the number of processors is increased from 1 to 23.
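
    The core idea behind such methods can be sketched compactly: the value of a harmonic function at an interior point equals the expected boundary value hit by an unbiased nearest-neighbour random walk started there. Below is a minimal serial illustration for Laplace's equation on a square grid, not the authors' parallel implementation; the grid size, boundary data, and walk count are illustrative assumptions.

```python
import random

# Estimate the solution of Laplace's equation on a square grid at one
# interior point: average the boundary values where unbiased
# nearest-neighbour random walks first exit the interior.
# Grid size, boundary data, and walk count are illustrative assumptions.

N = 20  # grid spans 0..N in each direction; interior is 1..N-1

def boundary(i, j):
    # hypothetical Dirichlet data: u = 1 on the top edge, 0 elsewhere
    return 1.0 if j == N else 0.0

def walk_estimate(i0, j0, n_walks=5000, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_walks):
        i, j = i0, j0
        while 0 < i < N and 0 < j < N:
            di, dj = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            i, j = i + di, j + dj
        total += boundary(i, j)  # walk has exited; record boundary value
    return total / n_walks

u_mid = walk_estimate(N // 2, N // 2)
```

    By symmetry the exact solution at the centre is 0.25 (one of the four edges carries the value 1), so the Monte Carlo estimate can be checked directly; the estimator's error shrinks as the square root of the number of walks.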

  9. Individual Differences Methods for Randomized Experiments

    PubMed Central

    Tucker-Drob, Elliot M.

    2011-01-01

    Experiments allow researchers to randomly vary the key manipulation, the instruments of measurement, and the sequences of the measurements and manipulations across participants. To date, however, the advantages of randomized experiments to manipulate both the aspects of interest and the aspects that threaten internal validity have been primarily used to make inferences about the average causal effect of the experimental manipulation. This paper introduces a general framework for analyzing experimental data in order to make inferences about individual differences in causal effects. Approaches to analyzing the data produced by a number of classical designs, and two more novel designs, are discussed. Simulations highlight the strengths and weaknesses of the data produced by each design with respect to internal validity. Results indicate that, although the data produced by standard designs can be used to produce accurate estimates of average causal effects of experimental manipulations, more elaborate designs are often necessary for accurate inferences with respect to individual differences in causal effects. The methods described here can be diversely applied by researchers interested in determining the extent to which individuals respond differentially to an experimental manipulation or treatment, and how differential responsiveness relates to individual participant characteristics. PMID:21744970

  10. Are the defined substrate-based methods adequate to determine the microbiological quality of natural recreational waters?

    PubMed

    Valente, Marta Sofia; Pedro, Paulo; Alonso, M Carmen; Borrego, Juan J; Dionísio, Lídia

    2010-03-01

    Monitoring the microbiological quality of water used for recreational activities is very important to human public health. Although the sanitary quality of recreational marine waters could be evaluated by standard methods, they are time-consuming and need confirmation. For these reasons, faster and more sensitive methods, such as the defined substrate-based technology, have been developed. In the present work, we have compared the standard method of membrane filtration using Tergitol-TTC agar for total coliforms and Escherichia coli, and Slanetz and Bartley agar for enterococci, and the IDEXX defined substrate technology for these faecal pollution indicators to determine the microbiological quality of natural recreational waters. ISO 17994:2004 standard was used to compare these methods. The IDEXX for total coliforms and E. coli, Colilert, showed higher values than those obtained by the standard method. Enterolert test, for the enumeration of enterococci, showed lower values when compared with the standard method. It may be concluded that more studies to evaluate the precision and accuracy of the rapid tests are required in order to apply them for routine monitoring of marine and freshwater recreational bathing areas. The main advantages of these methods are that they are more specific, feasible and simpler than the standard methodology. PMID:20009243

  11. Individual Differences Methods for Randomized Experiments

    ERIC Educational Resources Information Center

    Tucker-Drob, Elliot M.

    2011-01-01

    Experiments allow researchers to randomly vary the key manipulation, the instruments of measurement, and the sequences of the measurements and manipulations across participants. To date, however, the advantages of randomized experiments to manipulate both the aspects of interest and the aspects that threaten internal validity have been primarily…

  12. Convergence of a random walk method for the Burgers equation

    SciTech Connect

    Roberts, S.

    1985-10-01

    In this paper we consider a random walk algorithm for the solution of Burgers' equation. The algorithm uses the method of fractional steps. The non-linear advection term of the equation is solved by advecting ''fluid'' particles in a velocity field induced by the particles. The diffusion term of the equation is approximated by adding an appropriate random perturbation to the positions of the particles. Though the algorithm is inefficient as a method for solving Burgers' equation, it does model a similar method, the random vortex method, which has been used extensively to solve the incompressible Navier-Stokes equations. The purpose of this paper is to demonstrate the strong convergence of our random walk method and so provide a model for the proof of convergence for more complex random walk algorithms; for instance, the random vortex method without boundaries.
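
    The fractional-step idea described above — advect particles, then model diffusion with a random perturbation — can be sketched for a simpler linear problem. Here the velocity field is a fixed u(x) = -x rather than the self-induced Burgers field, an illustrative assumption; each diffusion step adds Gaussian noise of variance 2νΔt.

```python
import random
import statistics

# Fractional-step random walk for advection-diffusion of particles:
# step 1 advects each particle in a velocity field (here the fixed
# field u(x) = -x, an illustrative stand-in for the particle-induced
# Burgers velocity); step 2 models diffusion by adding a Gaussian
# random perturbation of variance 2*nu*dt to each position.

rng = random.Random(0)
nu, dt, steps = 0.1, 0.01, 200
particles = [0.0] * 2000

for _ in range(steps):
    particles = [x + dt * (-x) for x in particles]              # advection step
    sigma = (2 * nu * dt) ** 0.5
    particles = [x + rng.gauss(0.0, sigma) for x in particles]  # diffusion step

var = statistics.variance(particles)
```

    For this linear drift the particle positions approach an Ornstein-Uhlenbeck stationary distribution with variance ν = 0.1, which gives a direct check that the random perturbation reproduces the intended diffusion.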

  13. Effect of packing method on the randomness of disc packings

    NASA Astrophysics Data System (ADS)

    Zhang, Z. P.; Yu, A. B.; Oakeshott, R. B. S.

    1996-06-01

    The randomness of disc packings generated by random sequential adsorption (RSA), random packing under gravity (RPG) and Mason packing (MP), which gives a packing density close to that of the RSA packing, has been analysed based on the Delaunay tessellation, and is evaluated at two levels: the randomness at the individual-subunit level, which relates to the construction of a triangle from a given edge length distribution, and the randomness at the network level, which relates to the connection between triangles from a given triangle frequency distribution. The Delaunay tessellation itself is also analysed and its almost perfect randomness at the two levels is demonstrated, which verifies the proposed approach and provides a random reference system for the present analysis. It is found that (i) the construction of a triangle subunit is not random for the RSA, MP and RPG packings, with the degree of randomness decreasing from the RSA to the MP and then to the RPG packing; and (ii) the connection of triangular subunits in the network is almost perfectly random for the RSA packing, acceptable for the MP packing and not good for the RPG packing. The packing method is therefore an important factor governing the randomness of disc packings.
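
    Of the three generators compared, random sequential adsorption is the simplest to sketch: candidate disc centres are proposed uniformly at random and accepted only if they overlap no previously placed disc. The radius and attempt budget below are illustrative assumptions.

```python
import math
import random

# Random sequential adsorption (RSA) of equal discs in the unit square:
# propose centres uniformly at random and accept only those that do not
# overlap any previously placed disc. The disc radius and the attempt
# budget are illustrative assumptions.

def rsa_pack(radius=0.03, attempts=20000, seed=1):
    rng = random.Random(seed)
    centres = []
    min_sq = (2 * radius) ** 2          # squared minimum centre separation
    for _ in range(attempts):
        x = rng.uniform(radius, 1 - radius)   # keep discs fully inside
        y = rng.uniform(radius, 1 - radius)
        if all((x - cx) ** 2 + (y - cy) ** 2 >= min_sq for cx, cy in centres):
            centres.append((x, y))
    return centres

discs = rsa_pack()
density = len(discs) * math.pi * 0.03 ** 2   # packing fraction of the square
```

    The resulting packing fraction stays below the RSA jamming limit of roughly 0.547 for equal discs, and no two accepted centres are closer than one disc diameter by construction.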

  14. Methods for sample size determination in cluster randomized trials

    PubMed Central

    Rutterford, Clare; Copas, Andrew; Eldridge, Sandra

    2015-01-01

    Background: The use of cluster randomized trials (CRTs) is increasing, along with the variety in their design and analysis. The simplest approach for their sample size calculation is to calculate the sample size assuming individual randomization and inflate this by a design effect to account for randomization by cluster. The assumptions of a simple design effect may not always be met; alternative or more complicated approaches are required. Methods: We summarise a wide range of sample size methods available for cluster randomized trials. For those familiar with sample size calculations for individually randomized trials but with less experience in the clustered case, this manuscript provides formulae for a wide range of scenarios with associated explanation and recommendations. For those with more experience, comprehensive summaries are provided that allow quick identification of methods for a given design, outcome and analysis method. Results: We present first those methods applicable to the simplest two-arm, parallel group, completely randomized design followed by methods that incorporate deviations from this design such as: variability in cluster sizes; attrition; non-compliance; or the inclusion of baseline covariates or repeated measures. The paper concludes with methods for alternative designs. Conclusions: There is a large amount of methodology available for sample size calculations in CRTs. This paper gives the most comprehensive description of published methodology for sample size calculation and provides an important resource for those designing these trials. PMID:26174515
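
    The simplest approach the review describes — inflating an individually randomized sample size by the design effect 1 + (m - 1)·ICC, where m is an assumed common cluster size and ICC the intracluster correlation — is easy to sketch. The numbers below are illustrative assumptions.

```python
import math

# Inflate an individually randomized sample size by the simple design
# effect for cluster randomization: 1 + (m - 1) * ICC. Assumes equal
# cluster sizes and a two-arm parallel-group trial; all inputs here are
# illustrative assumptions.

def cluster_sample_size(n_individual, cluster_size, icc):
    design_effect = 1 + (cluster_size - 1) * icc
    # round before ceiling to guard against floating-point fuzz
    n_total = math.ceil(round(n_individual * design_effect, 6))
    n_clusters_per_arm = math.ceil(n_total / (2 * cluster_size))
    return n_total, n_clusters_per_arm

# e.g. 200 individuals needed under individual randomization,
# clusters of 20, ICC = 0.05 -> design effect 1.95
n_total, n_clusters_per_arm = cluster_sample_size(200, 20, 0.05)
```

    With these inputs the design effect is 1.95, so 390 participants (10 clusters of 20 per arm) are needed; when cluster sizes vary or attrition is expected, the more elaborate formulae summarised in the review apply instead.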

  15. Tabu search method with random moves for globally optimal design

    NASA Astrophysics Data System (ADS)

    Hu, Nanfang

    1992-09-01

    Optimum engineering design problems are usually formulated as non-convex optimization problems of continuous variables. Because of the absence of convexity structure, they can have multiple minima, and global optimization becomes difficult. Traditional methods of optimization, such as penalty methods, can often be trapped at a local optimum. The tabu search method with random moves is introduced to solve these problems approximately. Its reliability and efficiency are examined with the help of standard test functions. Analysis of the implementations shows that this method is easy to use and requires no derivative information. It outperforms the random search method and a composite genetic algorithm. In particular, it is applied to minimum weight design examples of a three-bar truss, coil springs, a Z-section and a channel section. For the channel section, the optimal design using the tabu search method with random moves saved 26.14 percent over the weight of the SUMT method.
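
    The tabu-search-with-random-moves scheme can be sketched on a standard test function (the 2-D Rosenbrock function): each iteration samples random neighbours of the current point, discards recently visited (tabu) ones unless they beat the best-so-far (aspiration), and moves to the best remaining neighbour. A minimal sketch; the step size, tabu tenure, and budgets are illustrative assumptions, not the paper's settings.

```python
import random

# Tabu search with random moves, sketched on the 2-D Rosenbrock test
# function. Tabu memory stores coarsely rounded visited cells; a tabu
# neighbour is still allowed if it improves on the best-so-far
# (aspiration criterion). All tuning values are illustrative assumptions.

def rosenbrock(p):
    x, y = p
    return (1 - x) ** 2 + 100 * (y - x * x) ** 2

def tabu_search(f, start, n_iter=3000, n_neigh=10, step=0.1, tenure=50, seed=0):
    rng = random.Random(seed)
    current = best = start
    tabu = []  # recently visited cells, rounded to a 0.1 grid
    for _ in range(n_iter):
        neighbours = [tuple(c + rng.uniform(-step, step) for c in current)
                      for _ in range(n_neigh)]
        allowed = [p for p in neighbours
                   if tuple(round(c, 1) for c in p) not in tabu or f(p) < f(best)]
        if not allowed:
            continue  # every neighbour tabu: draw fresh random moves next iteration
        current = min(allowed, key=f)          # best admissible random move
        tabu.append(tuple(round(c, 1) for c in current))
        if len(tabu) > tenure:
            tabu.pop(0)                        # expire the oldest tabu entry
        if f(current) < f(best):
            best = current
    return best

best = tabu_search(rosenbrock, (-1.0, 1.0))
```

    Note that no derivative information is used anywhere, which is the property the abstract highlights; the tabu memory is what lets the search escape local basins that would trap a pure descent method.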

  16. Genetic algorithms as global random search methods

    NASA Technical Reports Server (NTRS)

    Peck, Charles C.; Dhawan, Atam P.

    1995-01-01

    Genetic algorithm behavior is described in terms of the construction and evolution of the sampling distributions over the space of candidate solutions. This novel perspective is motivated by analysis indicating that schema theory is inadequate for completely and properly explaining genetic algorithm behavior. Based on the proposed theory, it is argued that the similarities of candidate solutions should be exploited directly, rather than encoding candidate solutions and then exploiting their similarities. Proportional selection is characterized as a global search operator, and recombination is characterized as the search process that exploits similarities. Sequential algorithms and many deletion methods are also analyzed. It is shown that by properly constraining the search breadth of recombination operators, convergence of genetic algorithms to a global optimum can be ensured.

  17. Genetic algorithms as global random search methods

    NASA Technical Reports Server (NTRS)

    Peck, Charles C.; Dhawan, Atam P.

    1995-01-01

    Genetic algorithm behavior is described in terms of the construction and evolution of the sampling distributions over the space of candidate solutions. This novel perspective is motivated by analysis indicating that the schema theory is inadequate for completely and properly explaining genetic algorithm behavior. Based on the proposed theory, it is argued that the similarities of candidate solutions should be exploited directly, rather than encoding candidate solutions and then exploiting their similarities. Proportional selection is characterized as a global search operator, and recombination is characterized as the search process that exploits similarities. Sequential algorithms and many deletion methods are also analyzed. It is shown that by properly constraining the search breadth of recombination operators, convergence of genetic algorithms to a global optimum can be ensured.
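
    The two operators the abstract characterizes — proportional (fitness-roulette) selection and recombination — can be sketched in a minimal genetic algorithm maximizing the number of ones in a bit string (the OneMax toy problem). The population size, mutation rate, and generation count are illustrative assumptions.

```python
import random

# Minimal genetic algorithm with proportional (fitness-roulette)
# selection and one-point recombination on the OneMax problem.
# All parameter values are illustrative assumptions.

def one_max(bits):
    return sum(bits)

def evolve(n_bits=30, pop_size=40, generations=60, p_mut=0.01, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        fits = [one_max(ind) + 1e-9 for ind in pop]  # roulette weights (avoid all-zero)
        new_pop = []
        for _ in range(pop_size):
            # proportional selection of two parents
            parent_a, parent_b = rng.choices(pop, weights=fits, k=2)
            cut = rng.randrange(1, n_bits)           # one-point crossover
            child = parent_a[:cut] + parent_b[cut:]
            child = [bit ^ (rng.random() < p_mut) for bit in child]  # mutation
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=one_max)

best = evolve()
```

    Viewed through the abstract's lens, each generation reshapes the sampling distribution over bit strings: selection concentrates probability on fitter regions, while recombination samples new points that share substrings with existing good solutions.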

  18. Random errors in interferometry with the least-squares method

    SciTech Connect

    Wang, Qi

    2011-01-20

    This investigation analyzes random errors in interferometric surface profilers using the least-squares method when random noises are present. Two types of random noise are considered here: intensity noise and position noise. Two formulas have been derived for estimating the standard deviations of the surface height measurements: one is for estimating the standard deviation when only intensity noise is present, and the other is for estimating the standard deviation when only position noise is present. Measurements on simulated noisy interferometric data have been performed, and standard deviations of the simulated measurements have been compared with those theoretically derived. The relationships have also been discussed between random error and the wavelength of the light source and between random error and the amplitude of the interference fringe.

  19. Random and systematic measurement errors in acoustic impedance as determined by the transmission line method

    NASA Technical Reports Server (NTRS)

    Parrott, T. L.; Smith, C. D.

    1977-01-01

    The effect of random and systematic errors associated with the measurement of normal incidence acoustic impedance in a zero-mean-flow environment was investigated by the transmission line method. The influence of random measurement errors in the reflection coefficients and pressure minima positions was investigated by computing fractional standard deviations of the normalized impedance. Both the standard techniques of random process theory and a simplified technique were used. Over a wavelength range of 68 to 10 cm, random measurement errors in the reflection coefficients and pressure minima positions could be described adequately by normal probability distributions with standard deviations of 0.001 and 0.0098 cm, respectively. An error propagation technique based on the observed concentration of the probability density functions was found to give essentially the same results, but with a computation time of about 1 percent of that required for the standard technique. The results suggest that careful experimental design reduces the effect of random measurement errors to insignificant levels for moderate ranges of test specimen impedance component magnitudes. Most of the observed random scatter can be attributed to lack of control by the mounting arrangement over the mechanical boundary conditions of the test sample.

  20. How adequate are the current methods of lead extraction? A review of the efficiency and safety of transvenous lead extraction methods.

    PubMed

    Buiten, Maurits S; van der Heijden, Aafke C; Schalij, Martin J; van Erven, Lieselot

    2015-05-01

    Currently, several extraction tools are available to allow safe and successful transvenous lead extraction (TLE) of pacemaker and ICD leads; however, no directives exist to guide physicians in their choice of extraction tools and approaches. The aim of the current review is to provide an overview of the success and complication rates of the different extraction methods and tools available. A comprehensive search of all published literature was conducted in the databases of PubMed, Embase, Web of Science, and Central. Included papers were original articles describing a specific method of TLE and the corresponding success rates in at least 50 patients. Fifty-three studies were included; the majority (56%) utilized 2 (1-4) different venous extraction approaches (subclavian and femoral), and the median number of extraction tools used was 3 (1-6). A stepwise approach was utilized in the majority of the studies, starting with simple traction, which resulted in successful TLE for 7-85% of the leads. When applicable, the procedure was continued with non-powered tools, resulting in successful extraction of 34-87% of leads. Subsequently, powered tools were applied, whereby success rates further increased to 74-100%. The final step in TLE usually involved a femoral snare, leading to an overall TLE success rate of 96-100%. The median procedure-related mortality and major complication rates described were, respectively, 0% (0-3%) and 1% (0-7%) per patient. In conclusion, a stepwise extraction approach can result in clinically successful TLE in up to 100% of leads with a relatively low risk of procedure-related mortality and complications. PMID:25687745

  1. Randomized methods in lossless compression of hyperspectral data

    NASA Astrophysics Data System (ADS)

    Zhang, Qiang; Pauca, V. Paúl; Plemmons, Robert

    2013-01-01

    We evaluate recently developed randomized matrix decomposition methods for fast lossless compression and reconstruction of hyperspectral imaging (HSI) data. Simple random projection methods have been shown to be effective for lossy compression without severely affecting the performance of object identification and classification. We build upon these methods to develop a new double-random projection method that may enable security in data transmission of compressed data. For HSI data, the distribution of elements in the resulting residual matrix, i.e., the original data minus its low-rank representation, exhibits low entropy relative to the original data, which favors a high compression ratio. We show both theoretically and empirically that randomized methods combined with residual-coding algorithms can lead to effective lossless compression of HSI data. Numerical tests on real large-scale HSI data show promising results. In addition, we show that randomized techniques can be applicable for encoding on resource-constrained on-board sensor systems, where the core matrix-vector multiplications can be easily implemented on computing platforms such as graphic processing units or field-programmable gate arrays.
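
    The low-rank-plus-residual idea can be sketched with a plain (single) random projection: sketch the column space with a random matrix, orthonormalize, form the low-rank representation, and keep the residual so reconstruction is exact. The matrix sizes and rank below are illustrative assumptions, not HSI data, and the double-random-projection and entropy-coding stages are omitted.

```python
import numpy as np

# Randomized low-rank compression with residual coding, sketched on a
# synthetic near-low-rank matrix (an illustrative assumption, not HSI
# data). Keeping the residual makes reconstruction lossless; its low
# entropy is what favors a high compression ratio.

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20)) @ rng.standard_normal((20, 200))  # ~rank-20 data
X += 0.01 * rng.standard_normal(X.shape)                             # small noise

k = 25                                           # sketch rank, slightly above 20
Omega = rng.standard_normal((X.shape[1], k))     # random projection matrix
Y = X @ Omega                                    # sketch of the column space
Q, _ = np.linalg.qr(Y)                           # orthonormal basis for the sketch
low_rank = Q @ (Q.T @ X)                         # low-rank representation
residual = X - low_rank                          # stored alongside for losslessness

X_reconstructed = low_rank + residual            # exact by construction
```

    The residual's entries are orders of magnitude smaller than the original data here, which is the property a residual-coding stage would exploit for compression.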

  2. A random spatial sampling method in a rural developing nation

    PubMed Central

    2014-01-01

    Background Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. Methods We describe a stratified random sampling method using geographical information system (GIS) software and global positioning system (GPS) technology for application in a health survey in a rural region of Guatemala, as well as a qualitative study of the enumeration process. Results This method offers an alternative sampling technique that could reduce opportunities for bias in household selection compared to cluster methods. However, its use is subject to issues surrounding survey preparation, technological limitations and in-the-field household selection. Application of this method in remote areas will raise challenges surrounding the boundary delineation process, use and translation of satellite imagery between GIS and GPS, and household selection at each survey point in varying field conditions. This method favors household selection in denser urban areas and in new residential developments. Conclusions Random spatial sampling methodology can be used to survey a random sample of population in a remote region of a developing nation. Although this method should be further validated and compared with more established methods to determine its utility in social survey applications, it shows promise for use in developing nations with resource-challenged environments where detailed geographic and human census data are less available. PMID:24716473
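
    The point-selection step — drawing uniform random survey points inside a study-area polygon — can be sketched without GIS software by rejection sampling from the polygon's bounding box with a ray-casting point-in-polygon test. The polygon below is an illustrative assumption, not the Guatemalan study region.

```python
import random

# Draw n uniform random survey points inside a polygon by rejection
# sampling from its bounding box. The polygon coordinates here are an
# illustrative assumption; real use would take boundary coordinates
# delineated in GIS software.

def point_in_polygon(x, y, poly):
    # standard ray-casting test: count crossings of a ray cast to the left
    inside = False
    for i in range(len(poly)):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % len(poly)]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def sample_points(poly, n, seed=0):
    rng = random.Random(seed)
    xs = [p[0] for p in poly]
    ys = [p[1] for p in poly]
    points = []
    while len(points) < n:
        x = rng.uniform(min(xs), max(xs))
        y = rng.uniform(min(ys), max(ys))
        if point_in_polygon(x, y, poly):   # reject points outside the polygon
            points.append((x, y))
    return points

triangle = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
points = sample_points(triangle, 50)
```

    Stratification, as used in the survey above, would amount to running this sampler separately within each stratum polygon; navigating to each sampled coordinate in the field is then a GPS task.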

  3. Accelerated Mini-batch Randomized Block Coordinate Descent Method

    PubMed Central

    Zhao, Tuo; Yu, Mo; Wang, Yiming; Arora, Raman; Liu, Han

    2014-01-01

    We consider regularized empirical risk minimization problems. In particular, we minimize the sum of a smooth empirical risk function and a nonsmooth regularization function. When the regularization function is block separable, we can solve the minimization problems in a randomized block coordinate descent (RBCD) manner. Existing RBCD methods usually decrease the objective value by exploiting the partial gradient of a randomly selected block of coordinates in each iteration. Thus they need all data to be accessible so that the partial gradient of the selected block can be obtained exactly. However, such a “batch” setting may be computationally expensive in practice. In this paper, we propose a mini-batch randomized block coordinate descent (MRBCD) method, which estimates the partial gradient of the selected block based on a mini-batch of randomly sampled data in each iteration. We further accelerate the MRBCD method by exploiting the semi-stochastic optimization scheme, which effectively reduces the variance of the partial gradient estimators. Theoretically, we show that for strongly convex functions, the MRBCD method attains lower overall iteration complexity than existing RBCD methods. As an application, we further apply the MRBCD method to solve regularized sparse learning problems. Our numerical experiments show that the MRBCD method naturally exploits the sparsity structure and achieves better computational performance than existing methods. PMID:25620860
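
    The core loop — pick a random block of coordinates, estimate its partial gradient on a random mini-batch of data, update only that block — can be sketched for plain least squares. Problem size, step size, and batch size are illustrative assumptions, and the paper's semi-stochastic variance-reduction scheme is omitted.

```python
import numpy as np

# Mini-batch randomized block coordinate descent, sketched on a plain
# least-squares problem (no regularizer, no variance reduction). Each
# iteration updates one random coordinate block using a partial gradient
# estimated from a random mini-batch of rows. All sizes and step values
# are illustrative assumptions.

rng = np.random.default_rng(0)
n, d = 500, 20
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = X @ w_true + 0.01 * rng.standard_normal(n)

w = np.zeros(d)
blocks = np.array_split(np.arange(d), 4)   # 4 coordinate blocks of 5
step, batch = 0.05, 50
for _ in range(4000):
    j = rng.integers(len(blocks))                       # random block
    idx = rng.choice(n, size=batch, replace=False)      # random mini-batch of rows
    Xb, yb = X[idx], y[idx]
    # partial gradient of (1/2m)||Xw - y||^2 w.r.t. the chosen block
    grad_block = Xb[:, blocks[j]].T @ (Xb @ w - yb) / batch
    w[blocks[j]] -= step * grad_block
```

    The mini-batch keeps the per-iteration cost independent of the full dataset size, which is the “batch setting” bottleneck the abstract describes; the paper's semi-stochastic scheme would additionally shrink the gradient-estimator variance.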

  4. Methods for analyzing cost effectiveness data from cluster randomized trials

    PubMed Central

    Bachmann, Max O; Fairall, Lara; Clark, Allan; Mugford, Miranda

    2007-01-01

    Background Measurement of individuals' costs and outcomes in randomized trials allows uncertainty about cost effectiveness to be quantified. Uncertainty is expressed as probabilities that an intervention is cost effective, and confidence intervals of incremental cost effectiveness ratios. Randomizing clusters instead of individuals tends to increase uncertainty but such data are often analysed incorrectly in published studies. Methods We used data from a cluster randomized trial to demonstrate five appropriate analytic methods: 1) joint modeling of costs and effects with two-stage non-parametric bootstrap sampling of clusters then individuals, 2) joint modeling of costs and effects with Bayesian hierarchical models and 3) linear regression of net benefits at different willingness to pay levels using a) least squares regression with Huber-White robust adjustment of errors, b) a least squares hierarchical model and c) a Bayesian hierarchical model. Results All five methods produced similar results, with greater uncertainty than if cluster randomization was not accounted for. Conclusion Cost effectiveness analyses alongside cluster randomized trials need to account for study design. Several theoretically coherent methods can be implemented with common statistical software. PMID:17822546
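
    The first of the five methods — two-stage non-parametric bootstrap sampling of clusters, then individuals — can be sketched directly. The trial data below are simulated illustrative assumptions; the quantity resampled is the incremental cost-effectiveness ratio (incremental cost divided by incremental effect).

```python
import random
import statistics

# Two-stage non-parametric bootstrap for cluster randomized
# cost-effectiveness data: resample clusters with replacement, then
# resample individuals within each sampled cluster, and recompute the
# incremental cost-effectiveness ratio. The trial data are simulated
# illustrative assumptions.

rng = random.Random(0)

def make_arm(mean_cost, mean_effect):
    # hypothetical arm: 6 clusters of 30 individuals, (cost, effect) pairs
    return [[(rng.gauss(mean_cost, 20), rng.gauss(mean_effect, 0.5))
             for _ in range(30)] for _ in range(6)]

control, treated = make_arm(100, 2.0), make_arm(130, 2.6)

def arm_means(arm):
    people = [p for cluster in arm for p in cluster]
    return (statistics.mean(c for c, _ in people),
            statistics.mean(e for _, e in people))

def two_stage_resample(arm):
    clusters = [rng.choice(arm) for _ in arm]                 # stage 1: clusters
    return [[rng.choice(cl) for _ in cl] for cl in clusters]  # stage 2: individuals

icers = []
for _ in range(500):
    c0, e0 = arm_means(two_stage_resample(control))
    c1, e1 = arm_means(two_stage_resample(treated))
    icers.append((c1 - c0) / (e1 - e0))   # incremental cost per unit effect

icers.sort()
ci_low, ci_high = icers[12], icers[487]   # ~95% percentile interval
```

    Resampling clusters first is what propagates the between-cluster variability into the interval; ignoring that stage (resampling individuals only) would understate the uncertainty, which is the error the paper warns about.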

  5. Multi-Agent Methods for the Configuration of Random Nanocomputers

    NASA Technical Reports Server (NTRS)

    Lawson, John W.

    2004-01-01

    As computational devices continue to shrink, the cost of manufacturing such devices is expected to grow exponentially. One alternative to the costly, detailed design and assembly of conventional computers is to place the nano-electronic components randomly on a chip. The price for such a trivial assembly process is that the resulting chip would not be programmable by conventional means. In this work, we show that such random nanocomputers can be adaptively programmed using multi-agent methods. This is accomplished through the optimization of an associated high-dimensional error function. By representing each of the independent variables as a reinforcement learning agent, we are able to achieve convergence much faster than with other methods, including simulated annealing. Standard combinational logic circuits such as adders and multipliers are implemented in a straightforward manner. In addition, we show that the intrinsic flexibility of these adaptive methods allows the random computers to be reconfigured easily, making them reusable. Recovery from faults is also demonstrated.

  6. Pseudo Random Classification of Circulation Patterns - Comparison to Deliberate Methods

    NASA Astrophysics Data System (ADS)

    Philipp, Andreas

    2010-05-01

    Classification of circulation patterns, e.g. of sea level pressure patterns, can be done by many different methods, e.g. by cluster analysis, methods based on eigenvalues, or those based on the leader algorithm like the Lund classification. However, none of these methods gives clear advice on the appropriate number of classes, and even when the number is decided, different methods lead to different results. High efforts are made to find methods leading to indisputable results. However, doubts about the classifiability of tropospheric circulation states have been raised recently, and the existence of natural groups of similar patterns within the circulation data, which might be caused by circulation regimes, is questionable. If those groups or clusters exist, methods designed to find them, in particular cluster analysis, should be superior to classification schemes based on a pseudo-random definition of classes. In order to test this assumption, a classification method called "random centroids" has been designed: for each class one single circulation pattern is chosen using a random number generator, and all remaining patterns are assigned to these centroids according to the minimum Euclidean distance. Evaluation metrics like the "explained cluster variance" for pressure, temperature and precipitation are calculated in order to compare those pseudo-random classifications to the classifications provided by the cost733cat dataset, which includes classification catalogs for many different methods (COST Action 733 "Harmonisation and Applications of Weather Type Classifications for European regions"). By running the random centroid method 1000 times, the empirical probability density function of the evaluation metrics can be established, providing information about the probability that the established deliberate methods are better than random classifications. 
The results show that most of the classifications fail to exceed the 95th percentile of the empirical probability
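The random-centroids scheme described above is straightforward to prototype. The sketch below is a minimal illustration, not the cost733 implementation: k patterns are drawn as fixed centroids, all remaining patterns are assigned by minimum Euclidean distance, and repeated runs yield an empirical distribution of the explained cluster variance. The toy data and all parameter choices are invented for the example.

```python
import numpy as np

def random_centroids(patterns, k, rng):
    """Pseudo-random classification: pick k patterns as fixed centroids
    and assign every pattern to its nearest centroid (Euclidean)."""
    idx = rng.choice(len(patterns), size=k, replace=False)
    centroids = patterns[idx]
    # distance of each pattern to each centroid
    d = np.linalg.norm(patterns[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

def explained_cluster_variance(patterns, labels):
    """ECV = 1 - (within-class sum of squares / total sum of squares)."""
    tss = ((patterns - patterns.mean(axis=0)) ** 2).sum()
    wss = sum(((patterns[labels == c] - patterns[labels == c].mean(axis=0)) ** 2).sum()
              for c in np.unique(labels))
    return 1.0 - wss / tss

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 20))            # toy stand-in for pressure patterns
ecv = [explained_cluster_variance(X, random_centroids(X, 9, rng))
       for _ in range(100)]                   # empirical ECV distribution
threshold = np.percentile(ecv, 95)            # 95th-percentile benchmark
```

A deliberate classification would then be judged against `threshold`: if its ECV does not exceed the 95th percentile of the random-centroid distribution, it performs no better than chance.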

  7. Efficient stochastic Galerkin methods for random diffusion equations

    SciTech Connect

    Xiu, Dongbin; Shen, Jie

    2009-02-01

    We discuss in this paper efficient solvers for stochastic diffusion equations in random media. We employ generalized polynomial chaos (gPC) expansion to express the solution in a convergent series and obtain a set of deterministic equations for the expansion coefficients by Galerkin projection. Although the resulting system of diffusion equations is coupled, we show that one can construct fast numerical methods to solve them in a decoupled fashion. The methods are based on separation of the diagonal and off-diagonal terms in the matrix of the Galerkin system. We examine properties of this matrix and show that the proposed method is unconditionally stable for unsteady problems and convergent for steady problems, with a convergence rate independent of discretization parameters. Numerical examples are provided, for both steady and unsteady random diffusions, to support the analysis.

  8. An Evaluation of the Effectiveness of Recruitment Methods: The Staying Well after Depression Randomized Controlled Trial

    PubMed Central

    Krusche, Adele; Rudolf von Rohr, Isabelle; Muse, Kate; Duggan, Danielle; Crane, Catherine; Williams, J. Mark G.

    2014-01-01

    Background Randomized controlled trials (RCTs) are widely accepted as being the most efficient way of investigating the efficacy of psychological therapies. However, researchers conducting RCTs commonly report difficulties recruiting an adequate sample within planned timescales. In an effort to overcome recruitment difficulties, researchers often are forced to expand their recruitment criteria or extend the recruitment phase, thus increasing costs and delaying publication of results. Research investigating the effectiveness of recruitment strategies is limited and trials often fail to report sufficient details about the recruitment sources and resources utilised. Purpose We examined the efficacy of strategies implemented during the Staying Well after Depression RCT in Oxford to recruit participants with a history of recurrent depression. Methods We describe eight recruitment methods utilised and two further sources not initiated by the research team and examine their efficacy in terms of (i) the return, including the number of potential participants who contacted the trial and the number who were randomized into the trial, (ii) cost-effectiveness, comprising direct financial cost and manpower for initial contacts and randomized participants, and (iii) comparison of sociodemographic characteristics of individuals recruited from different sources. Results Poster advertising, web-based advertising and mental health worker referrals were the cheapest methods per randomized participant; however, the ratio of randomized participants to initial contacts differed markedly per source. Advertising online, via posters and on a local radio station were the most cost-effective recruitment methods for soliciting participants who subsequently were randomized into the trial. Advertising across many sources (saturation) was found to be important. Limitations It may not be feasible to employ all the recruitment methods used in this trial to obtain participation from other

  9. Elongation method for electronic structure calculations of random DNA sequences.

    PubMed

    Orimoto, Yuuichi; Liu, Kai; Aoki, Yuriko

    2015-10-30

    We applied the ab initio order-N elongation (ELG) method to calculate the electronic structures of various deoxyribonucleic acid (DNA) models, with the aim of testing the method's potential for building a database of DNA electronic structures. The ELG method mimics polymerization reactions on a computer and meets the requirements of linear-scaling computational efficiency and high accuracy, even for huge systems. As a benchmark test, we applied the method to various types of randomly sequenced A- and B-type DNA models with and without counterions. In each case, the ELG method maintained high accuracy, with energy errors on the order of 10^-8 hartree/atom compared with conventional calculations. We demonstrate that the ELG method can provide valuable information such as stabilization energies and local densities of states for each DNA sequence. In addition, we discuss the "restarting" feature of the ELG method for constructing a database that exhaustively covers DNA species. PMID:26337429

  10. Random Sampling of Quantum States: a Survey of Methods. And Some Issues Regarding the Overparametrized Method

    NASA Astrophysics Data System (ADS)

    Maziero, Jonas

    2015-12-01

    The numerical generation of random quantum states (RQS) is an important procedure for investigations in quantum information science. Here, we review some methods that may be used for performing that task. We start by presenting a simple procedure for generating random state vectors, for which the main tool is the random sampling of unbiased discrete probability distributions (DPD). Afterwards, the creation of random density matrices is addressed. In this context, we first present the standard method, which consists in using the spectral decomposition of a quantum state to obtain RQS from random DPDs and random unitary matrices. Next, the Bloch vector parametrization method is described. This approach, despite being useful in several instances, is not in general convenient for RQS generation. In the last part of the article, we consider the overparametrized method (OPM) and the related Ginibre and Bures techniques. The OPM can be used to create random positive-semidefinite matrices with unit trace from randomly produced general complex matrices, in a simple way that is friendly for numerical implementations. We consider a physically relevant issue related to the possible domains that may be used for the real and imaginary parts of the elements of such general complex matrices. Subsequently, we note an excessively fast concentration of measure in the quantum state space that appears in this parametrization.
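In its simplest form, the OPM/Ginibre construction the abstract refers to reduces to normalizing G G† for a matrix G of i.i.d. complex Gaussians. A minimal sketch (the dimension and seed are arbitrary choices for the example):

```python
import numpy as np

def random_density_matrix(d, rng):
    """Ginibre/overparametrized method: rho = G G† / Tr(G G†),
    with G a d x d matrix of i.i.d. complex standard normals.
    rho is Hermitian, positive semidefinite, and unit-trace by construction."""
    G = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
    rho = G @ G.conj().T
    return rho / np.trace(rho).real

rng = np.random.default_rng(42)
rho = random_density_matrix(4, rng)
eigs = np.linalg.eigvalsh(rho)   # spectrum of the sampled state
```

Positivity and unit trace are automatic here, which is exactly what makes the overparametrized route friendly for numerical work.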

  11. Random-breakage mapping method applied to human DNA sequences

    NASA Technical Reports Server (NTRS)

    Lobrich, M.; Rydberg, B.; Cooper, P. K.; Chatterjee, A. (Principal Investigator)

    1996-01-01

    The random-breakage mapping method [Game et al. (1990) Nucleic Acids Res., 18, 4453-4461] was applied to DNA sequences in human fibroblasts. The methodology involves NotI restriction endonuclease digestion of DNA from irradiated cells, followed by pulsed-field gel electrophoresis, Southern blotting and hybridization with DNA probes recognizing the single-copy sequences of interest. The Southern blots show a band for the unbroken restriction fragments and a smear below this band due to radiation-induced random breaks. This smear pattern contains two discontinuities in intensity at positions that correspond to the distance of the hybridization site to each end of the restriction fragment. By analyzing the positions of those discontinuities we confirmed the previously mapped position of the probe DXS1327 within a NotI fragment on the X chromosome, thus demonstrating the validity of the technique. We were also able to position the probes D21S1 and D21S15 with respect to the ends of their corresponding NotI fragments on chromosome 21. A third chromosome 21 probe, D21S11, has previously been reported to be close to D21S1, although an uncertainty about a second possible location existed. Since both probes D21S1 and D21S11 hybridized to a single NotI fragment and yielded a similar smear pattern, this uncertainty is removed by the random-breakage mapping method.

  12. Theory of optimum radio reception methods in random noise

    NASA Astrophysics Data System (ADS)

    Gutkin, L. S.

    1982-09-01

    The theory of optimum methods for receiving signals against a background of random noise, widely used in the development of radioelectronic systems and devices based on the reception and transmission of information (radar, radio control, radio communications, radio telemetry, radio astronomy, television, and other systems), as well as electroacoustical and wire communications systems, is presented. Optimum linear and nonlinear filtration, binary and complex signal detection and discrimination, estimation of signal parameters, receiver synthesis for incomplete a priori data, special features of synthesis with respect to certain quality indicators, and other problems are examined.

  13. Finite amplitude method for the quasiparticle random-phase approximation

    SciTech Connect

    Avogadro, Paolo; Nakatsukasa, Takashi

    2011-07-15

    We present the finite amplitude method (FAM), originally proposed in Ref. [17], for superfluid systems. A Hartree-Fock-Bogoliubov code may be transformed into a code of the quasiparticle-random-phase approximation (QRPA) with simple modifications. This technique has advantages over the conventional QRPA calculations, such as coding feasibility and computational cost. We perform the fully self-consistent linear-response calculation for the spherical neutron-rich nucleus {sup 174}Sn, modifying the hfbrad code, to demonstrate the accuracy, feasibility, and usefulness of the FAM.

  14. Searching method through biased random walks on complex networks.

    PubMed

    Lee, Sungmin; Yook, Soon-Hyung; Kim, Yup

    2009-07-01

    Information search is closely related to the first-passage property of a diffusing particle. The physical properties of a diffusing particle are affected by the topological structure of the underlying network. Thus, the interplay between dynamical processes and network topology is important for studying information search on complex networks. Designing an efficient method has been one of the main interests in information search, with reducing network traffic and decreasing search time being two essential design factors. Here we propose an efficient method based on biased random walks. Numerical simulations show that the average search time of the suggested model is shorter than that of other well-known models. For practical interest, we demonstrate how the suggested model can be applied to a peer-to-peer system. PMID:19658839
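A search by biased random walk of the general kind described can be sketched in a few lines. The degree-biased transition rule (next node chosen with probability proportional to its degree raised to a bias exponent) and the toy hub-plus-ring network below are illustrative assumptions, not the authors' model:

```python
import random

def biased_walk_search(adj, start, target, alpha=1.0, max_steps=10000, rng=random):
    """Walk until target is found (or max_steps is hit); the next node is
    chosen among neighbors with probability proportional to degree**alpha
    (alpha = 0 recovers the unbiased random walk)."""
    node, steps = start, 0
    while node != target and steps < max_steps:
        nbrs = adj[node]
        weights = [len(adj[n]) ** alpha for n in nbrs]
        node = rng.choices(nbrs, weights=weights)[0]
        steps += 1
    return steps

# toy network: one hub (node 0) connected to all nodes, plus a ring among 1..n-1
n = 50
adj = {i: [] for i in range(n)}
for i in range(1, n):
    adj[0].append(i); adj[i].append(0)
    j = 1 + i % (n - 1)
    if j not in adj[i]:
        adj[i].append(j); adj[j].append(i)

random.seed(1)
steps = biased_walk_search(adj, start=1, target=25, alpha=1.0)
```

On such a hub-dominated topology the degree bias pulls the walker toward the hub, from which every node is one hop away, which is the intuition behind shorter average search times.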

  15. Random projection and SVD methods in hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Zhang, Jiani

    Hyperspectral imaging provides researchers with abundant information with which to study the characteristics of objects in a scene. Processing the massive hyperspectral imagery datasets in a way that efficiently provides useful information becomes an important issue. In this thesis, we consider methods which reduce the dimension of hyperspectral data while retaining as much useful information as possible. Traditional deterministic methods for low-rank approximation are not always adaptable to process huge datasets in an effective way, and therefore probabilistic methods are useful in dimension reduction of hyperspectral images. In this thesis, we begin by generally introducing the background and motivations of this work. Next, we summarize the preliminary knowledge and the applications of SVD and PCA. After these descriptions, we present a probabilistic method, randomized Singular Value Decomposition (rSVD), for the purposes of dimension reduction, compression, reconstruction, and classification of hyperspectral data. We discuss some variations of this method. These variations offer the opportunity to obtain a more accurate reconstruction of the matrix whose singular values decay gradually, to process matrices without target rank, and to obtain the rSVD with only one single pass over the original data. Moreover, we compare the method with Compressive-Projection Principle Component Analysis (CPPCA). From the numerical results, we can see that rSVD has better performance in compression and reconstruction than truncated SVD and CPPCA. We also apply rSVD to classification methods for the hyperspectral data provided by the National Geospatial-Intelligence Agency (NGA).
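The randomized SVD outlined in the abstract follows a well-known pattern (random test matrix, orthonormal basis for the range, SVD of the small projected matrix). A minimal sketch with optional power iterations, run on synthetic low-rank data rather than hyperspectral imagery:

```python
import numpy as np

def rsvd(A, rank, n_oversample=10, n_iter=2, rng=None):
    """Randomized SVD sketch: project A onto a random subspace,
    orthonormalize, then take the SVD of the small projected matrix."""
    if rng is None:
        rng = np.random.default_rng(0)
    m, n = A.shape
    Omega = rng.standard_normal((n, rank + n_oversample))  # random test matrix
    Y = A @ Omega
    for _ in range(n_iter):          # power iterations sharpen the range basis
        Y = A @ (A.T @ Y)
    Q, _ = np.linalg.qr(Y)           # orthonormal basis for the range of A
    B = Q.T @ A                      # small (rank + oversample) x n matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :rank], s[:rank], Vt[:rank]

# a rank-5 matrix should be recovered almost exactly
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 100))
U, s, Vt = rsvd(A, rank=5, rng=rng)
err = np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A)
```

The oversampling and power-iteration parameters are the usual knobs for matrices whose singular values decay gradually, one of the variations the thesis discusses.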

  16. A random walk method for computing genetic location scores.

    PubMed Central

    Lange, K; Sobel, E

    1991-01-01

    Calculation of location scores is one of the most computationally intensive tasks in modern genetics. Since these scores are crucial in placing disease loci on marker maps, there is ample incentive to pursue such calculations with large numbers of markers. However, in contrast to the simple, standardized pedigrees used in making marker maps, disease pedigrees are often graphically complex and sparsely phenotyped. These complications can present insuperable barriers to exact likelihood calculations with more than a few markers simultaneously. To overcome these barriers we introduce in the present paper a random walk method for computing approximate location scores with large numbers of biallelic markers. Sufficient mathematical theory is developed to explain the method. Feasibility is checked by small-scale simulations for two applications permitting exact calculation of location scores. PMID:1746559

  17. PROSPECTIVE RANDOMIZED STUDY COMPARING TWO ANESTHETIC METHODS FOR SHOULDER SURGERY

    PubMed Central

    Ikemoto, Roberto Yukio; Murachovsky, Joel; Prata Nascimento, Luis Gustavo; Bueno, Rogerio Serpone; Oliveira Almeida, Luiz Henrique; Strose, Eric; de Mello, Sérgio Cabral; Saletti, Deise

    2015-01-01

    Objective: To evaluate the efficacy of suprascapular nerve block in combination with infusion of anesthetic into the subacromial space, compared with interscalene block. Methods: Forty-five patients with small or medium-sized isolated supraspinatus tendon lesions who underwent arthroscopic repair were prospectively and comparatively evaluated through random assignment to three groups of 15, each with a different combination of anesthetic methods. The efficacy of postoperative analgesia was measured using the visual analogue scale for pain and the consumption of analgesic, anti-inflammatory and opioid drugs. Inhalation anesthetic consumption during surgery was also compared between the groups. Results: The statistical analysis did not find any statistically significant differences among the groups regarding anesthetic consumption during surgery or postoperative analgesic efficacy during the first 48 hours. Conclusion: Suprascapular nerve block with infusion of anesthetic into the subacromial space is an excellent alternative to interscalene block, particularly in hospitals in which an electrical nerve stimulating device is unavailable. PMID:27022569

  18. Sequential methods for random-effects meta-analysis

    PubMed Central

    Higgins, Julian P T; Whitehead, Anne; Simmonds, Mark

    2011-01-01

    Although meta-analyses are typically viewed as retrospective activities, they are increasingly being applied prospectively to provide up-to-date evidence on specific research questions. When meta-analyses are updated, account should be taken of the possibility of false-positive findings due to repeated significance tests. We discuss the use of sequential methods for meta-analyses that incorporate random effects to allow for heterogeneity across studies. We propose a method that uses an approximate semi-Bayes procedure to update evidence on the among-study variance, starting with an informative prior distribution that might be based on findings from previous meta-analyses. We compare our method with other approaches, including the traditional method of cumulative meta-analysis, in a simulation study and observe that it has Type I and Type II error rates close to the nominal level. We illustrate the method using an example in the treatment of bleeding peptic ulcers. Copyright © 2010 John Wiley & Sons, Ltd. PMID:21472757

  19. Asbestos/NESHAP adequately wet guidance

    SciTech Connect

    Shafer, R.; Throwe, S.; Salgado, O.; Garlow, C.; Hoerath, E.

    1990-12-01

    The Asbestos NESHAP requires facility owners and/or operators involved in demolition and renovation activities to control emissions of particulate asbestos to the outside air because no safe concentration of airborne asbestos has ever been established. The primary method used to control asbestos emissions is to adequately wet the Asbestos Containing Material (ACM) with a wetting agent prior to, during and after demolition/renovation activities. The purpose of the document is to provide guidance to asbestos inspectors and the regulated community on how to determine if friable ACM is adequately wet as required by the Asbestos NESHAP.

  20. A new method for direction finding based on Markov random field model

    NASA Astrophysics Data System (ADS)

    Ota, Mamoru; Kasahara, Yoshiya; Goto, Yoshitaka

    2015-07-01

    Investigating the characteristics of plasma waves observed by scientific satellites in the Earth's plasmasphere/magnetosphere is effective for understanding the mechanisms for generating waves and the plasma environment that influences wave generation and propagation. In particular, finding the propagation directions of waves is important for understanding mechanisms of VLF/ELF waves. To find these directions, the wave distribution function (WDF) method has been proposed. This method is based on the idea that observed signals consist of a number of elementary plane waves that define wave energy density distribution. However, the resulting equations constitute an ill-posed problem in which a solution is not determined uniquely; hence, an adequate model must be assumed for a solution. Although many models have been proposed, we have to select the most optimum model for the given situation because each model has its own advantages and disadvantages. In the present study, we propose a new method for direction finding of the plasma waves measured by plasma wave receivers. Our method is based on the assumption that the WDF can be represented by a Markov random field model with inference of model parameters performed using a variational Bayesian learning algorithm. Using computer-generated spectral matrices, we evaluated the performance of the model and compared the results with those obtained from two conventional methods.

  1. Yoga for veterans with chronic low back pain: Design and methods of a randomized clinical trial.

    PubMed

    Groessl, Erik J; Schmalzl, Laura; Maiya, Meghan; Liu, Lin; Goodman, Debora; Chang, Douglas G; Wetherell, Julie L; Bormann, Jill E; Atkinson, J Hamp; Baxi, Sunita

    2016-05-01

    Chronic low back pain (CLBP) afflicts millions of people worldwide, with particularly high prevalence in military veterans. Many treatment options exist for CLBP, but most have limited effectiveness and some have significant side effects. In general populations with CLBP, yoga has been shown to improve health outcomes with few side effects. However, yoga has not been adequately studied in military veteran populations. In the current paper we describe the design and methods of a randomized clinical trial aimed at examining whether yoga can effectively reduce disability and pain in US military veterans with CLBP. A total of 144 US military veterans with CLBP will be randomized to either yoga or a delayed-treatment comparison group. The yoga intervention will consist of twice-weekly yoga classes for 12 weeks, complemented by regular home practice guided by a manual. The delayed-treatment group will receive the same intervention after six months. The primary outcome is the change in back pain-related disability measured with the Roland-Morris Disability Questionnaire at baseline and 12 weeks. Secondary outcomes include pain intensity, pain interference, depression, anxiety, fatigue/energy, quality of life, self-efficacy, sleep quality, and medication usage. Additional process and/or mediational factors will be measured to examine dose response and effect mechanisms. Assessments will be conducted at baseline, 6 weeks, 12 weeks, and 6 months. All randomized participants will be included in intention-to-treat analyses. Study results will provide much needed evidence on the feasibility and effectiveness of yoga as a therapeutic modality for the treatment of CLBP in US military veterans. PMID:27103548

  2. Multilevel Analysis Methods for Partially Nested Cluster Randomized Trials

    ERIC Educational Resources Information Center

    Sanders, Elizabeth A.

    2011-01-01

    This paper explores multilevel modeling approaches for 2-group randomized experiments in which a treatment condition involving clusters of individuals is compared to a control condition involving only ungrouped individuals, otherwise known as partially nested cluster randomized designs (PNCRTs). Strategies for comparing groups from a PNCRT in the…

  3. Random particle methods applied to broadband fan interaction noise

    NASA Astrophysics Data System (ADS)

    Dieste, M.; Gabard, G.

    2012-10-01

    Predicting broadband fan noise is key to reducing noise emissions from aircraft and wind turbines. Complete CFD simulations of broadband fan noise generation remain too expensive to be used routinely for engineering design. A more efficient approach consists in synthesizing a turbulent velocity field that captures the main features of the exact solution. This synthetic turbulence is then used in a noise source model. This paper concentrates on predicting broadband fan interaction noise (also called leading-edge noise) and demonstrates that a random particle mesh method (RPM) is well suited for simulating this source mechanism. The linearized Euler equations are used to describe sound generation and propagation. In this work, the definition of the filter kernel is generalized to include non-Gaussian filters that can directly follow more realistic energy spectra such as the ones developed by Liepmann and von Kármán. The velocity correlation and energy spectrum of the turbulence are found to be well captured by the RPM. The acoustic predictions are successfully validated against Amiet's analytical solution for a flat plate in a turbulent stream. A standard Langevin equation is used to model temporal decorrelation, but the presence of numerical issues leads to the introduction and validation of a second-order Langevin model.

  4. A comparison of methods for representing sparsely sampled random quantities.

    SciTech Connect

    Romero, Vicente Jose; Swiler, Laura Painton; Urbina, Angel; Mullins, Joshua

    2013-09-01

    This report discusses the treatment of uncertainties stemming from relatively few samples of random quantities. The importance of this topic extends beyond experimental data uncertainty to situations involving uncertainty in model calibration, validation, and prediction. With very sparse data samples it is not practical to have a goal of accurately estimating the underlying probability density function (PDF). Rather, a pragmatic goal is that the uncertainty representation should be conservative so as to bound a specified percentile range of the actual PDF, say the range between the 0.025 and 0.975 percentiles, with reasonable reliability. A second, opposing objective is that the representation not be overly conservative; that it minimally over-estimate the desired percentile range of the actual PDF. The presence of the two opposing objectives makes the sparse-data uncertainty representation problem interesting and difficult. In this report, five uncertainty representation techniques are characterized for their performance on twenty-one test problems (over thousands of trials for each problem) according to these two opposing objectives and other performance measures. Two of the methods, statistical Tolerance Intervals and a kernel density approach specifically developed for handling sparse data, exhibit significantly better overall performance than the others.

  5. Investigation of stochastic radiation transport methods in random heterogeneous mixtures

    NASA Astrophysics Data System (ADS)

    Reinert, Dustin Ray

    Among the most formidable challenges facing our world is the need for safe, clean, affordable energy sources. Growing concerns over global warming induced climate change and the rising costs of fossil fuels threaten conventional means of electricity production and are driving the current nuclear renaissance. One concept at the forefront of international development efforts is the High Temperature Gas-Cooled Reactor (HTGR). With numerous passive safety features and a meltdown-proof design capable of attaining high thermodynamic efficiencies for electricity generation as well as high temperatures useful for the burgeoning hydrogen economy, the HTGR is an extremely promising technology. Unfortunately, the fundamental understanding of neutron behavior within HTGR fuels lags far behind that of more conventional water-cooled reactors. HTGRs utilize a unique heterogeneous fuel element design consisting of thousands of tiny fissile fuel kernels randomly mixed with a non-fissile graphite matrix. Monte Carlo neutron transport simulations of the HTGR fuel element geometry in its full complexity are infeasible and this has motivated the development of more approximate computational techniques. A series of MATLAB codes was written to perform Monte Carlo simulations within HTGR fuel pebbles to establish a comprehensive understanding of the parameters under which the accuracy of the approximate techniques diminishes. This research identified the accuracy of the chord length sampling method to be a function of the matrix scattering optical thickness, the kernel optical thickness, and the kernel packing density. Two new Monte Carlo methods designed to focus the computational effort upon the parameter conditions shown to contribute most strongly to the overall computational error were implemented and evaluated. 
An extended memory chord length sampling routine that recalls a neutron's prior material traversals was demonstrated to be effective in fixed source calculations containing

  6. A likelihood reformulation method in non-normal random effects models.

    PubMed

    Liu, Lei; Yu, Zhangsheng

    2008-07-20

    In this paper, we propose a practical computational method to obtain the maximum likelihood estimates (MLE) for mixed models with non-normal random effects. By simply multiplying and dividing a standard normal density, we reformulate the likelihood conditional on the non-normal random effects to that conditional on the normal random effects. Gaussian quadrature technique, conveniently implemented in SAS Proc NLMIXED, can then be used to carry out the estimation process. Our method substantially reduces computational time, while yielding similar estimates to the probability integral transformation method (J. Comput. Graphical Stat. 2006; 15:39-57). Furthermore, our method can be applied to more general situations, e.g. finite mixture random effects or correlated random effects from Clayton copula. Simulations and applications are presented to illustrate our method. PMID:18038445
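The "multiply and divide by a standard normal density" trick is easy to demonstrate outside SAS: rewrite the marginal likelihood ∫ f(y|b) g(b) db as ∫ [f(y|b) g(b)/φ(b)] φ(b) db and apply Gauss-Hermite quadrature to the integral against φ. The normal mixture chosen for the non-normal random effect below is a purely illustrative stand-in, not an example from the paper:

```python
import numpy as np

def phi(x):
    """Standard normal density."""
    return np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)

def g(b):
    """A non-normal random-effect density (illustrative two-component
    normal mixture: 0.7*N(-1, 0.5^2) + 0.3*N(2, 1))."""
    return 0.7 * phi((b + 1.0) / 0.5) / 0.5 + 0.3 * phi(b - 2.0)

def marginal_likelihood(y, sigma=1.0, n_nodes=40):
    """Compute the marginal density of y, with y | b ~ N(b, sigma^2) and
    b ~ g, via the reformulation against the standard normal density:
    int f(y|b) g(b) db = int [f(y|b) g(b)/phi(b)] phi(b) db."""
    x, w = np.polynomial.hermite.hermgauss(n_nodes)
    b = np.sqrt(2.0) * x                  # change of variables for phi
    weights = w / np.sqrt(np.pi)
    f_y_given_b = phi((y - b) / sigma) / sigma
    integrand = f_y_given_b * g(b) / phi(b)
    return float(np.sum(weights * integrand))

L = marginal_likelihood(y=0.5)
```

The ratio g(b)/φ(b) is exactly the reweighting the paper exploits: the quadrature nodes and weights are those of the standard normal, so any tool built for normal random effects (such as adaptive Gaussian quadrature in NLMIXED) can be reused unchanged.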

  7. 21 CFR 1404.900 - Adequate evidence.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Adequate evidence. 1404.900 Section 1404.900 Food and Drugs OFFICE OF NATIONAL DRUG CONTROL POLICY GOVERNMENTWIDE DEBARMENT AND SUSPENSION (NONPROCUREMENT) Definitions § 1404.900 Adequate evidence. Adequate evidence means information sufficient to support the reasonable belief that a particular...

  8. 29 CFR 98.900 - Adequate evidence.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 1 2010-07-01 2010-07-01 true Adequate evidence. 98.900 Section 98.900 Labor Office of the Secretary of Labor GOVERNMENTWIDE DEBARMENT AND SUSPENSION (NONPROCUREMENT) Definitions § 98.900 Adequate evidence. Adequate evidence means information sufficient to support the reasonable belief that a...

  9. Combining randomized and non-randomized evidence in clinical research: a review of methods and applications.

    PubMed

    Verde, Pablo E; Ohmann, Christian

    2015-03-01

    Researchers may have multiple motivations for combining disparate pieces of evidence in a meta-analysis, such as generalizing experimental results or increasing the power to detect an effect that a single study is not able to detect. However, while in meta-analysis, the main question may be simple, the structure of evidence available to answer it may be complex. As a consequence, combining disparate pieces of evidence becomes a challenge. In this review, we cover statistical methods that have been used for the evidence-synthesis of different study types with the same outcome and similar interventions. For the methodological review, a literature retrieval in the area of generalized evidence-synthesis was performed, and publications were identified, assessed, grouped and classified. Furthermore real applications of these methods in medicine were identified and described. For these approaches, 39 real clinical applications could be identified. A new classification of methods is provided, which takes into account: the inferential approach, the bias modeling, the hierarchical structure, and the use of graphical modeling. We conclude with a discussion of pros and cons of our approach and give some practical advice. PMID:26035469

  10. A Comparison of Single Sample and Bootstrap Methods to Assess Mediation in Cluster Randomized Trials

    ERIC Educational Resources Information Center

    Pituch, Keenan A.; Stapleton, Laura M.; Kang, Joo Youn

    2006-01-01

    A Monte Carlo study examined the statistical performance of single sample and bootstrap methods that can be used to test and form confidence interval estimates of indirect effects in two cluster randomized experimental designs. The designs were similar in that they featured random assignment of clusters to one of two treatment conditions and…

  11. Note on coefficient matrices from stochastic Galerkin methods for random diffusion equations

    SciTech Connect

    Zhou Tao; Tang Tao

    2010-11-01

    In a recent work by Xiu and Shen [D. Xiu, J. Shen, Efficient stochastic Galerkin methods for random diffusion equations, J. Comput. Phys. 228 (2009) 266-281], the Galerkin methods are used to solve stochastic diffusion equations in random media, where some properties for the coefficient matrix of the resulting system are provided. They also posed an open question on the properties of the coefficient matrix. In this work, we will provide some results related to the open question.

  12. Which Ab Initio Wave Function Methods Are Adequate for Quantitative Calculations of the Energies of Biradicals? The Performance of Coupled-Cluster and Multi-Reference Methods Along a Single-Bond Dissociation Coordinate

    SciTech Connect

    Yang, Ke; Jalan, Amrit; Green, William H.; Truhlar, Donald G.

    2013-01-08

    We examine the accuracy of single-reference and multireference correlated wave function methods for predicting accurate energies and potential energy curves of biradicals. The biradicals considered are intermediate species along the bond dissociation coordinates for breaking the F-F bond in F2, the O-O bond in H2O2, and the C-C bond in CH3CH3. We apply a host of single-reference and multireference approximations in a consistent way to the same cases to provide a better assessment of their relative accuracies than was previously possible. The most accurate method studied is coupled cluster theory with all connected excitations through quadruples, CCSDTQ. Without explicit quadruple excitations, the most accurate potential energy curves are obtained by the single-reference RCCSDt method, followed, in order of decreasing accuracy, by UCCSDT, RCCSDT, UCCSDt, seven multireference methods, including perturbation theory, configuration interaction, and coupled-cluster methods (with MRCI+Q being the best and Mk-MR-CCSD the least accurate), four CCSD(T) methods, and then CCSD.

  13. 34 CFR 85.900 - Adequate evidence.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) Definitions § 85.900 Adequate evidence. Adequate evidence means information sufficient to support the reasonable belief that a particular act or omission has occurred. Authority: E.O. 12549 (3 CFR, 1986 Comp., p. 189); E.O 12689 (3 CFR, 1989 Comp., p. 235); 20 U.S.C. 1082, 1094, 1221e-3 and 3474; and Sec....

  14. 29 CFR 452.110 - Adequate safeguards.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 2 2010-07-01 2010-07-01 false Adequate safeguards. 452.110 Section 452.110 Labor... DISCLOSURE ACT OF 1959 Election Procedures; Rights of Members § 452.110 Adequate safeguards. (a) In addition to the election safeguards discussed in this part, the Act contains a general mandate in section...

  15. 29 CFR 452.110 - Adequate safeguards.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 2 2011-07-01 2011-07-01 false Adequate safeguards. 452.110 Section 452.110 Labor... DISCLOSURE ACT OF 1959 Election Procedures; Rights of Members § 452.110 Adequate safeguards. (a) In addition to the election safeguards discussed in this part, the Act contains a general mandate in section...

  16. Random dynamic load identification based on error analysis and weighted total least squares method

    NASA Astrophysics Data System (ADS)

    Jia, You; Yang, Zhichun; Guo, Ning; Wang, Le

    2015-12-01

Random dynamic load identification problems in structural dynamics are generally ill-posed. A common approach is to reformulate them into well-posed problems by numerical regularization. In a previous paper by the authors, a random dynamic load identification model was built, and a weighted regularization approach based on proper orthogonal decomposition (POD) was proposed to identify the random dynamic loads. In this paper, the upper bound of the relative load identification error in the frequency domain is derived. The selection condition and the specific form of the weighting matrix are also proposed and validated analytically and experimentally. To improve the accuracy of random dynamic load identification, a weighted total least squares method is proposed to reduce the impact of these errors. To further validate the feasibility and effectiveness of the proposed method, a comparative experimental study of the proposed method and other methods was conducted. The experimental results demonstrate that the weighted total least squares method is more effective than the other methods for random dynamic load identification.
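The ill-posedness this abstract refers to can be illustrated with a minimal sketch, using plain Tikhonov regularization in place of the paper's weighted total least squares; the FRF matrix, noise level, and parameter `lam` below are invented for illustration:

```python
import numpy as np

# Toy frequency-domain load identification: y = H f + noise, with an
# ill-conditioned FRF stand-in H, recovered via Tikhonov regularization.
rng = np.random.default_rng(7)
H = rng.normal(size=(20, 5)) * np.array([1.0, 0.5, 0.2, 0.01, 0.001])
f_true = rng.normal(size=5)                        # "true" load amplitudes
y = H @ f_true + rng.normal(scale=0.01, size=20)   # noisy measured responses

lam = 1e-4                                         # regularization parameter (assumed)
f_hat = np.linalg.solve(H.T @ H + lam * np.eye(5), H.T @ y)
```

The small singular values of `H` are what amplify measurement noise in a naive inverse; regularization trades a little bias for stability on those components.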

  17. Reduction method for intrinsic random coincidence events from (176)Lu in low activity PET imaging.

    PubMed

    Yoshida, Eiji; Tashima, Hideaki; Nishikido, Fumihiko; Murayama, Hideo; Yamaya, Taiga

    2014-07-01

For clinical studies, the effects of the intrinsic radioactivity of lutetium-based scintillators such as LSO used in PET imaging can be ignored within a narrow energy window. However, the intrinsic radioactivity becomes problematic in low-count-rate situations such as gene expression imaging or in-beam PET imaging. Time-of-flight (TOF) measurement capability promises not only to improve PET image quality but also to reduce intrinsic random coincidences. In addition, we have developed a new reduction method for intrinsic random coincidence events based on multiple-coincidence information. Without the energy window, an intrinsic random coincidence is detected simultaneously with an intrinsic true coincidence as a multiple coincidence, so multiple-coincidence events can serve as a guide for identifying the intrinsic coincidences. After rejection of multiple-coincidence events detected with a wide energy window, the data included few intrinsic random and many intrinsic true coincidence events. We analyzed the effect of intrinsic radioactivity and used Monte Carlo simulation to test both the TOF-based method and the developed multiple-coincidence-based (MC-based) method for a whole-body LSO-PET scanner. Using the TOF- and MC-based reduction methods separately, we could reduce the intrinsic random coincidence rate by 77% and 30%, respectively; combining the TOF and MC methods reduced it by 84%. The developed MC-based method reduced the number of intrinsic random coincidence events, but its reduction performance was limited compared with that of the TOF-based method. PMID:24496884

  18. Americans Getting Adequate Water Daily, CDC Finds

    MedlinePlus

    ... medlineplus/news/fullstory_158510.html Americans Getting Adequate Water Daily, CDC Finds Men take in an average ... new government report finds most are getting enough water each day. The data, from the U.S. National ...

  19. Americans Getting Adequate Water Daily, CDC Finds

    MedlinePlus

    ... gov/news/fullstory_158510.html Americans Getting Adequate Water Daily, CDC Finds Men take in an average ... new government report finds most are getting enough water each day. The data, from the U.S. National ...

  20. Safety assessment of a shallow foundation using the random finite element method

    NASA Astrophysics Data System (ADS)

    Zaskórski, Łukasz; Puła, Wojciech

    2015-04-01

The complex structure of soil and its random character make soil modeling a cumbersome task. The heterogeneity of soil has to be considered even within a homogeneous layer, so estimating shear strength parameters for a geotechnical analysis causes many problems. The applicable standard (Eurocode 7) presents no explicit method for evaluating characteristic values of soil parameters, only general guidelines on how these values should be estimated. Hence many approaches to assessing characteristic values of soil parameters are presented in the literature and can be applied in practice. In this paper, the reliability assessment of a shallow strip footing was conducted using a reliability index β. Several approaches to estimating characteristic values of soil properties were compared by evaluating the values of reliability index β that each of them achieves: the method of Orr and Breysse, Duncan's method, Schneider's method, Schneider's method accounting for fluctuation scales, and the method included in Eurocode 7. Design values of the bearing capacity based on these approaches were compared with the stochastic bearing capacity estimated by the random finite element method (RFEM), for various widths and depths of the foundation in conjunction with the design approaches (DA) defined in Eurocode. RFEM, introduced by Griffiths and Fenton (1993), combines the deterministic finite element method, random field theory, and Monte Carlo simulation. Random field theory makes it possible to account for the random character of soil parameters within a homogeneous layer: a soil property is treated as a separate random variable in every element of the finite element mesh, with a proper correlation structure between points of a given area. RFEM was applied to estimate which theoretical
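The reliability index β that this abstract works with can be sketched with a schematic Monte Carlo calculation; the soil strength distribution, the surrogate capacity model, and the design load below are invented stand-ins for the full RFEM (random field plus finite element) computation:

```python
import numpy as np
from statistics import NormalDist

# Sample a random soil parameter, push it through a toy capacity model,
# and convert the Monte Carlo failure probability into a reliability index.
rng = np.random.default_rng(11)
n_sim = 100_000
phi = rng.normal(30.0, 3.0, n_sim)            # friction angle samples, degrees (assumed)
capacity = 50.0 * np.exp(0.1 * (phi - 30.0))  # toy bearing-capacity surrogate
load = 40.0                                   # assumed design load

p_f = float(np.mean(capacity < load))         # Monte Carlo probability of failure
beta = -NormalDist().inv_cdf(p_f)             # reliability index beta
```

In RFEM proper, each sample would be a spatially correlated random field fed through a finite element bearing-capacity analysis rather than a closed-form surrogate.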

  1. Alternative exact method for random walks on finite and periodic lattices with traps

    NASA Astrophysics Data System (ADS)

    Soler, Jose M.

    1982-07-01

    An alternative general method for random walks in finite or periodic lattices with traps is presented. The method gives, in a straightforward manner and in very little computing time, the exact probability that a random walker, starting from a given site, will undergo n steps before trapping. Another version gives the probability that the walker is at any other given position after n steps. The expected walk lengths calculated for simple lattices agree exactly with those given by a previous exact method by Walsh and Kozak.

  2. An overhang-based DNA block shuffling method for creating a customized random library

    PubMed Central

    Fujishima, Kosuke; Venter, Chris; Wang, Kendrick; Ferreira, Raphael; Rothschild, Lynn J.

    2015-01-01

    We present an overhang-based DNA block shuffling method to create a customized random DNA library with flexible sequence design and length. Our method enables the efficient and seamless assembly of short DNA blocks with dinucleotide overhangs through a simple ligation process. Next generation sequencing analysis of the assembled DNA library revealed that ligation was accurate, directional and unbiased. This straightforward DNA assembly method should fulfill the versatile needs of both in vivo and in vitro functional screening of random peptides and RNA created with a desired amino acid and nucleotide composition, as well as making highly repetitive gene constructs that are difficult to synthesize de novo. PMID:26010273
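The directional, seamless assembly described above can be mimicked in a toy model (the block representation and sequences are invented): a junction forms only when the upstream block's right overhang is the reverse complement of the next block's left overhang.

```python
# Dinucleotide-overhang ligation, modeled on strings.
COMP = str.maketrans("ACGT", "TGCA")

def revcomp(seq):
    """Reverse complement of a DNA sequence."""
    return seq.translate(COMP)[::-1]

# blocks as (left_overhang, body, right_overhang)
blocks = [("AT", "GGGCCC", "CA"),
          ("TG", "AAATTT", "GC"),   # "TG" == revcomp("CA"): ligatable
          ("GC", "CCCGGG", "AT")]   # "GC" == revcomp("GC")

def can_ligate(upstream, downstream):
    return downstream[0] == revcomp(upstream[2])

assembled = blocks[0][1]
for prev, nxt in zip(blocks, blocks[1:]):
    if can_ligate(prev, nxt):
        assembled += nxt[1]
```

Because each overhang only pairs with its reverse complement, blocks can only join in one order and orientation, which is what makes the assembly directional.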

  3. Applying a weighted random forests method to extract karst sinkholes from LiDAR data

    NASA Astrophysics Data System (ADS)

    Zhu, Junfeng; Pierskalla, William P.

    2016-02-01

Detailed mapping of sinkholes provides critical information for mitigating sinkhole hazards and understanding groundwater and surface water interactions in karst terrains. LiDAR (Light Detection and Ranging) measures the earth's surface at high resolution and high density and has shown great potential to drastically improve the location and delineation of sinkholes. However, processing LiDAR data to extract sinkholes requires separating sinkholes from other depressions, which can be laborious because of the sheer number of depressions commonly generated from LiDAR data. In this study, we applied random forests, a machine learning method, to automatically separate sinkholes from other depressions in a karst region in central Kentucky. The sinkhole-extraction random forest was grown on a training dataset built from an area where LiDAR-derived depressions were manually classified through visual inspection and field verification. Based on the geometry of depressions, as well as natural and human factors related to sinkholes, 11 parameters were selected as predictive variables to form the dataset. Because the training dataset was imbalanced, with the majority of depressions being non-sinkholes, a weighted random forests method was used to improve the accuracy of predicting sinkholes. The weighted random forest achieved an average accuracy of 89.95% for the training dataset, demonstrating that the random forest can be an effective sinkhole classifier. Testing of the random forest in another area, however, resulted in moderate success, with an average accuracy of 73.96%. This study suggests that an automatic sinkhole extraction procedure such as the random forest classifier can significantly reduce time and labor costs and makes it more tractable to map sinkholes from LiDAR data over large areas. However, the random forests method cannot totally replace manual procedures such as visual inspection and field verification.
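A class-weighted random forest of the kind this abstract describes can be sketched with scikit-learn on synthetic data (all features, labels, and sizes below are invented; the paper's 11 real predictors are replaced by random stand-ins):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Imbalanced depression / sinkhole classification with class weighting.
rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 5))   # stand-ins for geometric/contextual predictors
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 1.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)

# class_weight="balanced" re-weights samples inversely to class frequency
# when growing each tree, one common realization of a "weighted" forest.
clf = RandomForestClassifier(n_estimators=200, class_weight="balanced",
                             random_state=0)
clf.fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```

Plain accuracy flatters imbalanced problems, so in practice one would also check per-class recall, as the abstract's separate training/testing-area accuracies hint.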

  4. Deterministic replica-exchange method without pseudo random numbers for simulations of complex systems

    NASA Astrophysics Data System (ADS)

    Urano, Ryo; Okamoto, Yuko

    2015-12-01

We propose a replica-exchange method (REM) that does not use pseudo-random numbers. For this purpose, we first give a conditional probability for the Gibbs sampling replica-exchange method (GSREM) based on the heat bath method. In GSREM, replica exchange is performed with a conditional probability based on the weight of states, using pseudo-random numbers. From this conditional probability, we propose a new method, called the deterministic replica-exchange method (DETREM), that produces the thermal equilibrium distribution from a differential equation instead of pseudo-random numbers. The method satisfies the detailed balance condition through the conditional probability of the Gibbs heat bath method, and thus its results reproduce the Boltzmann distribution within the condition of the probability. We confirmed that REM and DETREM give equivalent results for the two-dimensional Ising model. DETREM avoids the problem of choosing pseudo-random-number seeds in parallel REM computations and gives an analytic formulation of REM via a differential equation.
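The heat-bath (Gibbs sampling) exchange probability that GSREM builds on can be sketched as below; this is a hedged reconstruction from the standard replica-exchange weight ratio, and the paper's DETREM step, which replaces the random draw against this probability with a deterministic differential-equation update, is not reproduced.

```python
import math

def heat_bath_exchange_prob(beta1, E1, beta2, E2):
    """Heat-bath probability of swapping two replicas' configurations.

    From the ratio of Boltzmann weights before and after the swap:
    r = exp((beta1 - beta2) * (E1 - E2)), p = r / (1 + r).
    """
    delta = (beta1 - beta2) * (E1 - E2)
    return 1.0 / (1.0 + math.exp(-delta))

# equal temperatures (or equal energies) make the swap a fair coin flip
p_even = heat_bath_exchange_prob(1.0, 2.0, 1.0, 5.0)
```

Unlike the Metropolis rule min(1, r), the heat-bath form never accepts with probability 1, but it satisfies detailed balance for the same swap move.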

  5. Random projection-based dimensionality reduction method for hyperspectral target detection

    NASA Astrophysics Data System (ADS)

    Feng, Weiyi; Chen, Qian; He, Weiji; Arce, Gonzalo R.; Gu, Guohua; Zhuang, Jiayan

    2015-09-01

Dimensionality reduction is a frequent preprocessing step in hyperspectral image analysis, since high-dimensional data raise the "curse of dimensionality" in applications of hyperspectral imagery. In this paper, a dimensionality reduction method for hyperspectral images based on random projection (RP) was investigated for target detection, where the high dimensionality of the hyperspectral data would otherwise lead to burdensome computations. Random projection is attractive in this area because it is data independent and computationally more efficient than other widely used hyperspectral dimensionality-reduction methods, such as Principal Component Analysis (PCA) or the maximum-noise-fraction (MNF) transform. In RP, the original high-dimensional data are projected onto a low-dimensional subspace using a very simple random matrix. Theoretical and experimental results indicated that random projections preserve the structure of the original high-dimensional data quite well without introducing significant distortion. In the experiments, Constrained Energy Minimization (CEM) was adopted as the target detector, and an RP-based CEM method for hyperspectral target detection was implemented, revealing that random projection can be a good dimensionality-reduction tool for hyperspectral images, yielding improved target detection with higher detection accuracy and lower computation time than other methods.

  6. Research Methods in Healthcare Epidemiology and Antimicrobial Stewardship: Randomized Controlled Trials.

    PubMed

    Anderson, Deverick J; Juthani-Mehta, Manisha; Morgan, Daniel J

    2016-06-01

    Randomized controlled trials (RCT) produce the strongest level of clinical evidence when comparing interventions. RCTs are technically difficult, costly, and require specific considerations including the use of patient- and cluster-level randomization and outcome selection. In this methods paper, we focus on key considerations for RCT methods in healthcare epidemiology and antimicrobial stewardship (HE&AS) research, including the need for cluster randomization, conduct at multiple sites, behavior modification interventions, and difficulty with identifying appropriate outcomes. We review key RCTs in HE&AS with a focus on advantages and disadvantages of methods used. A checklist is provided to aid in the development of RCTs in HE&AS. Infect Control Hosp Epidemiol 2016;37:629-634. PMID:27108848

  7. Methods and optical fibers that decrease pulse degradation resulting from random chromatic dispersion

    DOEpatents

    Chertkov, Michael; Gabitov, Ildar

    2004-03-02

The present invention provides methods and optical fibers for periodically pinning an actual (random) accumulated chromatic dispersion of an optical fiber to a predicted accumulated dispersion of the fiber through relatively simple modifications of fiber-optic manufacturing methods or retrofitting of existing fibers. If the pinning occurs with sufficient frequency (at a distance less than or equal to a correlation scale), pulse degradation resulting from random chromatic dispersion is minimized. Alternatively, pinning may occur quasi-periodically, i.e., the pinning distance is distributed between approximately zero and approximately two to three times the correlation scale.

8. Plackett-Burman randomization method for Bacterial Ghosts preparation from E. coli JM109.

    PubMed

    Amro, Amara A; Salem-Bekhit, Mounir M; Alanazi, Fars K

    2014-07-01

The Plackett-Burman randomization method is a conventional tool for randomizing variables with the aim of optimization. Bacterial Ghosts (BGs) have recently been prepared using methods other than the E lysis gene. That protocol is based mainly on using critical concentrations of chemical compounds able to convert viable cells to BGs, with the Minimum Inhibition Concentration (MIC) and the Minimum Growth Concentration (MGC) as the main guide for BG preparation. In this study, Escherichia coli JM109 DEC was used to produce BGs following the original protocol. The study contains a detailed protocol for BG preparation that could be used as a guide. PMID:25061413
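For context, the classic 8-run Plackett-Burman design for 7 two-level factors can be generated from its cyclic generator row; this is the textbook construction, not code from the paper:

```python
import numpy as np

# Cyclically shift the generator to get 7 runs, then append an
# all-minus run; rows are runs, columns are factors.
gen = np.array([1, 1, 1, -1, 1, -1, -1])
runs = [np.roll(gen, i) for i in range(7)]
design = np.vstack(runs + [-np.ones(7, dtype=int)])   # shape (8, 7)

# defining property: adding a constant column yields a Hadamard matrix,
# i.e. all columns are mutually orthogonal
H = np.hstack([np.ones((8, 1), dtype=int), design])
orthogonal = bool(np.array_equal(H.T @ H, 8 * np.eye(8, dtype=int)))
```

The column orthogonality is what lets each factor's main effect be estimated independently from only 8 runs.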

  9. A novel model and estimation method for the individual random component of earthquake ground-motion relations

    NASA Astrophysics Data System (ADS)

    Raschke, Mathias

    2016-06-01

In this paper, I introduce a novel approach to modelling the individual random component (also called the intra-event uncertainty) of a ground-motion relation (GMR), as well as a novel approach to estimating the corresponding parameters. In essence, I contend that the individual random component is reproduced adequately by a simple stochastic mechanism of random impulses acting in the horizontal plane with random directions, the number of impulses being Poisson distributed. The parameters of the model were estimated according to a proposal by Raschke (J Seismol 17(4):1157-1182, 2013a), using the sample of random differences ξ = ln(Y1) − ln(Y2), in which Y1 and Y2 are the horizontal components of local ground-motion intensity. Every element of the GMR is eliminated by this subtraction except the individual random components. In the estimation procedure, the distribution of the difference ξ was approximated by combining a large Monte Carlo sample with kernel smoothing. The estimated model satisfactorily fitted the differences ξ for a sample of peak ground accelerations, and the variance of the individual random components was considerably smaller than that of conventional GMRs. In addition, the dependence of the variance on epicentre distance was considered; a dependence of the variance on magnitude was not detected. Finally, the influence of the novel model and the corresponding approximations on PSHA was investigated, and the applied approximations of the distribution of the individual random component proved satisfactory for the researched example of PSHA.
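The modelling idea can be illustrated with a small Monte Carlo (all distributional choices below, such as exponential impulse amplitudes and the mean impulse count, are assumptions for illustration, not the paper's calibrated values):

```python
import numpy as np

# A Poisson number of impulses with uniformly random horizontal
# directions; Y1 and Y2 are the two horizontal component intensities and
# xi = ln(Y1) - ln(Y2) is the difference the estimation procedure uses.
rng = np.random.default_rng(42)

def simulate_xi(n_events, mean_impulses=20.0):
    xi = np.empty(n_events)
    for i in range(n_events):
        k = max(1, rng.poisson(mean_impulses))   # guard against empty events
        angles = rng.uniform(0.0, 2.0 * np.pi, size=k)
        amps = rng.exponential(1.0, size=k)      # assumed impulse amplitudes
        y1 = np.sqrt(np.sum((amps * np.cos(angles)) ** 2))
        y2 = np.sqrt(np.sum((amps * np.sin(angles)) ** 2))
        xi[i] = np.log(y1) - np.log(y2)
    return xi

xi_sample = simulate_xi(5000)
mean_xi = float(xi_sample.mean())   # symmetric in the two components, so ~0
```

Because the construction is symmetric in the two horizontal components, ξ is centred on zero, and its spread reflects only the individual random component, exactly the cancellation property the abstract exploits.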

  10. Characterization of a random anisotropic conductivity field with Karhunen-Loeve methods

    SciTech Connect

    Cherry, Matthew R.; Sabbagh, Harold S.; Pilchak, Adam L.; Knopp, Jeremy S.

    2014-02-18

While parametric uncertainty quantification for NDE models has been addressed in recent years, the problem of stochastic field parameters, such as spatially distributed electrical conductivity, has only been investigated minimally in the last year. In that work, the authors treated the field as a one-dimensional random process, and Karhunen-Loeve methods were used to discretize it to make it amenable to UQ methods such as ANOVA expansions. In the present work, we treat the field as a two-dimensional random process, and the eigenvalues and eigenfunctions of the integral operator are determined via Galerkin methods. The Karhunen-Loeve method is extended to two dimensions and implemented to represent this process. Several choices of basis functions are discussed, as well as convergence criteria for each. The methods are applied to correlation functions collected over electron backscatter data from highly micro-textured Ti-7Al.
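A hedged one-dimensional sketch of the Karhunen-Loeve discretization (the covariance model and all parameters are assumed; the paper's two-dimensional Galerkin extension is not shown):

```python
import numpy as np

# Discretize a zero-mean random field with exponential covariance on a
# 1-D grid; the matrix eigenproblem is the discrete analogue of the
# KL integral-operator eigenproblem.
n, corr_len = 100, 0.2
x = np.linspace(0.0, 1.0, n)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)   # covariance matrix

eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# truncate: keep enough modes to capture 95% of the total variance
m = int(np.searchsorted(np.cumsum(eigvals) / eigvals.sum(), 0.95)) + 1

# one realization of the field from m independent standard normals
rng = np.random.default_rng(0)
field = eigvecs[:, :m] @ (np.sqrt(eigvals[:m]) * rng.standard_normal(m))
```

The truncation is what makes the random field "amenable to UQ methods": the infinite-dimensional field is replaced by a small vector of independent random coefficients.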

  11. Analysis of random structure-acoustic interaction problems using coupled boundary element and finite element methods

    NASA Technical Reports Server (NTRS)

    Mei, Chuh; Pates, Carl S., III

    1994-01-01

A coupled boundary element (BEM)-finite element (FEM) approach is presented to accurately model structure-acoustic interaction systems. The boundary element method is first applied to interior two- and three-dimensional acoustic domains with complex geometry configurations. Boundary element results are very accurate when compared with the limited exact solutions available. Structure-acoustic interaction problems are then analyzed with the coupled FEM-BEM method, where the finite element method models the structure and the boundary element method models the interior acoustic domain. The coupled analysis is compared with exact and experimental results for a simple model. Composite panels are analyzed and compared with isotropic results. The coupled method is then extended to random excitation, and random excitation results are compared with uncoupled results for isotropic and composite panels.

  12. Adequate supervision for children and adolescents.

    PubMed

    Anderst, James; Moffatt, Mary

    2014-11-01

    Primary care providers (PCPs) have the opportunity to improve child health and well-being by addressing supervision issues before an injury or exposure has occurred and/or after an injury or exposure has occurred. Appropriate anticipatory guidance on supervision at well-child visits can improve supervision of children, and may prevent future harm. Adequate supervision varies based on the child's development and maturity, and the risks in the child's environment. Consideration should be given to issues as wide ranging as swimming pools, falls, dating violence, and social media. By considering the likelihood of harm and the severity of the potential harm, caregivers may provide adequate supervision by minimizing risks to the child while still allowing the child to take "small" risks as needed for healthy development. Caregivers should initially focus on direct (visual, auditory, and proximity) supervision of the young child. Gradually, supervision needs to be adjusted as the child develops, emphasizing a safe environment and safe social interactions, with graduated independence. PCPs may foster adequate supervision by providing concrete guidance to caregivers. In addition to preventing injury, supervision includes fostering a safe, stable, and nurturing relationship with every child. PCPs should be familiar with age/developmentally based supervision risks, adequate supervision based on those risks, characteristics of neglectful supervision based on age/development, and ways to encourage appropriate supervision throughout childhood. PMID:25369578

  13. Small Rural Schools CAN Have Adequate Curriculums.

    ERIC Educational Resources Information Center

    Loustaunau, Martha

    The small rural school's foremost and largest problem is providing an adequate curriculum for students in a changing world. Often the small district cannot or is not willing to pay the per-pupil cost of curriculum specialists, specialized courses using expensive equipment no more than one period a day, and remodeled rooms to accommodate new…

  14. Funding the Formula Adequately in Oklahoma

    ERIC Educational Resources Information Center

    Hancock, Kenneth

    2015-01-01

This report is a longitudinal simulation study that looks at how the ratio of state support to local support affects the number of school districts that break the common schools' funding formula, which in turn affects the equity of distribution to the common schools. After nearly two decades of adequately supporting the funding formula, Oklahoma…

  15. Random Qualitative Validation: A Mixed-Methods Approach to Survey Validation

    ERIC Educational Resources Information Center

    Van Duzer, Eric

    2012-01-01

    The purpose of this paper is to introduce the process and value of Random Qualitative Validation (RQV) in the development and interpretation of survey data. RQV is a method of gathering clarifying qualitative data that improves the validity of the quantitative analysis. This paper is concerned with validity in relation to the participants'…

  16. Randomized Controlled Trial of Teaching Methods: Do Classroom Experiments Improve Economic Education in High Schools?

    ERIC Educational Resources Information Center

    Eisenkopf, Gerald; Sulser, Pascal A.

    2016-01-01

    The authors present results from a comprehensive field experiment at Swiss high schools in which they compare the effectiveness of teaching methods in economics. They randomly assigned classes into an experimental and a conventional teaching group, or a control group that received no specific instruction. Both teaching treatments improve economic…

  17. Analytic Methods for Individually Randomized Group Treatment Trials and Group-Randomized Trials When Subjects Belong to Multiple Groups

    PubMed Central

    Andridge, Rebecca. R.; Shoben, Abigail B.; Muller, Keith E.; Murray, David M.

    2014-01-01

    Participants in trials may be randomized either individually or in groups, and may receive their treatment either entirely individually, entirely in groups, or partially individually and partially in groups. This paper concerns cases in which participants receive their treatment either entirely or partially in groups, regardless of how they were randomized. Participants in Group-Randomized Trials (GRTs) are randomized in groups and participants in Individually Randomized Group Treatment (IRGT) trials are individually randomized, but participants in both types of trials receive part or all of their treatment in groups or through common change agents. Participants who receive part or all of their treatment in a group are expected to have positively correlated outcome measurements. This paper addresses a situation that occurs in GRTs and IRGT trials – participants receive treatment through more than one group. As motivation, we consider trials in The Childhood Obesity Prevention and Treatment Research Consortium (COPTR), in which each child participant receives treatment in at least two groups. In simulation studies we considered several possible analytic approaches over a variety of possible group structures. A mixed model with random effects for both groups provided the only consistent protection against inflated type I error rates and did so at the cost of only moderate loss of power when intraclass correlations were not large. We recommend constraining variance estimates to be positive and using the Kenward-Roger adjustment for degrees of freedom; this combination provided additional power but maintained type I error rates at the nominal level. PMID:24399701

  18. Auto Regressive Moving Average (ARMA) Modeling Method for Gyro Random Noise Using a Robust Kalman Filter.

    PubMed

    Huang, Lei

    2015-01-01

To solve the problem that conventional ARMA modeling methods for gyro random noise require a large number of samples and converge slowly, an ARMA modeling method using robust Kalman filtering is developed. The ARMA model parameters are employed as state arguments, and unknown time-varying estimators of the observation noise are used to obtain the estimated mean and variance of the observation noise. Using robust Kalman filtering, the ARMA model parameters are estimated accurately. The developed ARMA modeling method converges rapidly and is highly accurate, so the required sample size is reduced. It can be applied in gyro random noise modeling applications where a fast and accurate ARMA modeling method is required. PMID:26437409
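The idea of treating model parameters as the Kalman state can be illustrated on a simplified AR(1) case (this is a plain scalar Kalman/recursive-least-squares sketch with invented parameters, not the paper's robust filter):

```python
import numpy as np

# Treat the AR(1) coefficient as the Kalman state; y[t-1] acts as the
# time-varying observation matrix, so the filter estimates the model
# parameter recursively from the gyro-noise-like series.
rng = np.random.default_rng(3)
true_a, n = 0.8, 2000
y = np.zeros(n)
for t in range(1, n):                 # synthetic AR(1) "gyro noise"
    y[t] = true_a * y[t - 1] + rng.normal(scale=0.1)

a_hat, P, R = 0.0, 1.0, 0.1 ** 2      # state estimate, its variance, obs noise
for t in range(1, n):
    H = y[t - 1]                      # observation model: y[t] = H * a + e
    S = H * P * H + R                 # innovation variance
    K = P * H / S                     # Kalman gain
    a_hat += K * (y[t] - H * a_hat)   # state (parameter) update
    P *= (1.0 - K * H)                # covariance update
```

A robust variant would additionally re-estimate the observation-noise statistics online, which is the extension the abstract describes.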

  19. Auto Regressive Moving Average (ARMA) Modeling Method for Gyro Random Noise Using a Robust Kalman Filter

    PubMed Central

    Huang, Lei

    2015-01-01

To solve the problem that conventional ARMA modeling methods for gyro random noise require a large number of samples and converge slowly, an ARMA modeling method using robust Kalman filtering is developed. The ARMA model parameters are employed as state arguments, and unknown time-varying estimators of the observation noise are used to obtain the estimated mean and variance of the observation noise. Using robust Kalman filtering, the ARMA model parameters are estimated accurately. The developed ARMA modeling method converges rapidly and is highly accurate, so the required sample size is reduced. It can be applied in gyro random noise modeling applications where a fast and accurate ARMA modeling method is required. PMID:26437409

  20. A Ground State Method for Continuum Systems Using Random Walks in the Space of Slater Determinants

    NASA Astrophysics Data System (ADS)

    Zhang, Shiwei; Krakauer, Henry

    2001-03-01

We study a ground state quantum Monte Carlo method for electronic systems. The method is based on the constrained path Monte Carlo approach (S. Zhang, J. Carlson, and J. E. Gubernatis, Phys. Rev. B 55, 7464 (1997)) developed for lattice models of correlated electrons. It works in second-quantized form and uses random walks involving full Slater determinants rather than individual real-space configurations. The method allows easy calculation of expectation values and also makes it straightforward to import standard techniques (e.g., pseudopotentials) used in density functional and quantum chemistry calculations. In general, Slater determinants will acquire overall complex phases, due to the Hubbard-Stratonovich transformation of the two-body potential. In order to control the sign decay, an approximation is developed for the propagation of complex Slater determinants by random walks. We test the method in a homogeneous 3-D electron gas (jellium) using a planewave basis. Supported by NSF, ONR and Research Corporation.

  1. A shortcut through the Coulomb gas method for spectral linear statistics on random matrices

    NASA Astrophysics Data System (ADS)

    Deelan Cunden, Fabio; Facchi, Paolo; Vivo, Pierpaolo

    2016-04-01

    In the last decade, spectral linear statistics on large dimensional random matrices have attracted significant attention. Within the physics community, a privileged role has been played by invariant matrix ensembles for which a two-dimensional Coulomb gas analogy is available. We present a critical revision of the Coulomb gas method in random matrix theory (RMT) borrowing language and tools from large deviations theory. This allows us to formalize an equivalent, but more effective and quicker route toward RMT free energy calculations. Moreover, we argue that this more modern viewpoint is likely to shed further light on the interesting issues of weak phase transitions and evaporation phenomena recently observed in RMT.

  2. An efficient method for calculating RMS von Mises stress in a random vibration environment

    SciTech Connect

    Segalman, D.J.; Fulcher, C.W.G.; Reese, G.M.; Field, R.V. Jr.

    1998-02-01

    An efficient method is presented for calculation of RMS von Mises stresses from stress component transfer functions and the Fourier representation of random input forces. An efficient implementation of the method calculates the RMS stresses directly from the linear stress and displacement modes. The key relation presented is one suggested in past literature, but does not appear to have been previously exploited in this manner.

  3. An efficient method for calculating RMS von Mises stress in a random vibration environment

    SciTech Connect

    Segalman, D.J.; Fulcher, C.W.G.; Reese, G.M.; Field, R.V. Jr.

    1997-12-01

    An efficient method is presented for calculation of RMS von Mises stresses from stress component transfer functions and the Fourier representation of random input forces. An efficient implementation of the method calculates the RMS stresses directly from the linear stress and displacement modes. The key relation presented is one suggested in past literature, but does not appear to have been previously exploited in this manner.
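The key relation behind this abstract, that the mean-square von Mises stress of a Gaussian random response equals trace(A Σ), where A is the von Mises quadratic form and Σ the covariance of the six stress components, can be checked numerically (the covariance below is an arbitrary invented example, not data from the paper):

```python
import numpy as np

# von Mises quadratic form: sigma_vm^2 = s^T A s for the stress vector
# s = [sxx, syy, szz, sxy, syz, szx]
A = np.array([
    [ 1.0, -0.5, -0.5, 0.0, 0.0, 0.0],
    [-0.5,  1.0, -0.5, 0.0, 0.0, 0.0],
    [-0.5, -0.5,  1.0, 0.0, 0.0, 0.0],
    [ 0.0,  0.0,  0.0, 3.0, 0.0, 0.0],
    [ 0.0,  0.0,  0.0, 0.0, 3.0, 0.0],
    [ 0.0,  0.0,  0.0, 0.0, 0.0, 3.0],
])

rng = np.random.default_rng(0)
L = rng.normal(size=(6, 6))
Sigma = L @ L.T                       # an arbitrary valid stress covariance

# closed form: E[s^T A s] = trace(A @ Sigma)
rms_vm = np.sqrt(np.trace(A @ Sigma))

# Monte Carlo check of the identity
samples = rng.multivariate_normal(np.zeros(6), Sigma, size=200_000)
mc_rms = np.sqrt(np.mean(np.einsum('ij,jk,ik->i', samples, A, samples)))
```

The efficiency of the method in the abstract comes from evaluating this trace directly from modal stress quantities, instead of sampling the response as the check above does.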

  4. Methods for testing theory and evaluating impact in randomized field trials

    PubMed Central

    Brown, C. Hendricks; Wang, Wei; Kellam, Sheppard G.; Muthén, Bengt O.; Petras, Hanno; Toyinbo, Peter; Poduska, Jeanne; Ialongo, Nicholas; Wyman, Peter A.; Chamberlain, Patricia; Sloboda, Zili; MacKinnon, David P.; Windham, Amy

    2008-01-01

    Randomized field trials provide unique opportunities to examine the effectiveness of an intervention in real world settings and to test and extend both theory of etiology and theory of intervention. These trials are designed not only to test for overall intervention impact but also to examine how impact varies as a function of individual level characteristics, context, and across time. Examination of such variation in impact requires analytical methods that take into account the trial’s multiple nested structure and the evolving changes in outcomes over time. The models that we describe here merge multilevel modeling with growth modeling, allowing for variation in impact to be represented through discrete mixtures—growth mixture models—and nonparametric smooth functions—generalized additive mixed models. These methods are part of an emerging class of multilevel growth mixture models, and we illustrate these with models that examine overall impact and variation in impact. In this paper, we define intent-to-treat analyses in group-randomized multilevel field trials and discuss appropriate ways to identify, examine, and test for variation in impact without inflating the Type I error rate. We describe how to make causal inferences more robust to misspecification of covariates in such analyses and how to summarize and present these interactive intervention effects clearly. Practical strategies for reducing model complexity, checking model fit, and handling missing data are discussed using six randomized field trials to show how these methods may be used across trials randomized at different levels. PMID:18215473

  5. Multi-objective optimization by a new hybridized method: applications to random mechanical systems

    NASA Astrophysics Data System (ADS)

    Zidani, H.; Pagnacco, E.; Sampaio, R.; Ellaia, R.; Souza de Cursi, J. E.

    2013-08-01

    In this article two linear problems with random Gaussian loading are transformed into multi-objective optimization problems. The first problem is the design of a pillar geometry with respect to a compressive random load process. The second problem is the design of a truss structure with respect to a vertical random load process for several frequency bands. A new algorithm, motivated by the Pincus representation formula and hybridized with the Nelder-Mead algorithm, is proposed to solve the two multi-objective optimization problems. To generate the Pareto curve, the normal boundary intersection method is used to produce a series of constrained single-objective optimizations. In the second problem, depending on the frequency band of excitation, the Pareto curve can be a single point, a standard Pareto curve, or a discontinuous Pareto curve; to the best of the authors' knowledge, the discontinuous case is reported here for the first time in the literature.
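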

  6. A method for determining the weak statistical stationarity of a random process

    NASA Technical Reports Server (NTRS)

    Sadeh, W. Z.; Koper, C. A., Jr.

    1978-01-01

    A method for determining the weak statistical stationarity of a random process is presented. The core of this testing procedure consists of generating an equivalent ensemble which approximates a true ensemble. Formation of an equivalent ensemble is accomplished through segmenting a sufficiently long time history of a random process into equal, finite, and statistically independent sample records. The weak statistical stationarity is ascertained based on the time invariance of the equivalent-ensemble averages. Comparison of these averages with their corresponding time averages over a single sample record leads to a heuristic estimate of the ergodicity of a random process. Specific variance tests are introduced for evaluating the statistical independence of the sample records, the time invariance of the equivalent-ensemble autocorrelations, and the ergodicity. Examination and substantiation of these procedures were conducted utilizing turbulent velocity signals.
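
    The equivalent-ensemble idea can be sketched as a simple check: segment one long record into equal pieces and verify that the segment averages do not drift. The z-score criterion below is a crude stand-in for the paper's specific variance tests, and it ignores serial correlation within the record:

```python
import numpy as np

def weak_stationarity_check(x, n_records=10, tol=3.0):
    """Heuristic weak-stationarity test via an 'equivalent ensemble':
    split one long record into n_records segments and require every
    segment mean to lie within tol standard errors of the overall mean.
    Returns (passed, segment_means)."""
    x = np.asarray(x, dtype=float)
    segments = np.array_split(x, n_records)
    means = np.array([s.mean() for s in segments])
    n_seg = min(len(s) for s in segments)
    # Under stationarity (and weak correlation), segment means scatter
    # roughly like sigma / sqrt(segment length)
    stderr = x.std(ddof=1) / np.sqrt(n_seg)
    z = np.abs(means - x.mean()) / stderr
    return bool(np.all(z < tol)), means
```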

  7. Local search methods based on variable focusing for random K-satisfiability.

    PubMed

    Lemoy, Rémi; Alava, Mikko; Aurell, Erik

    2015-01-01

    We introduce variable-focused local search algorithms for satisfiability problems. Usual approaches focus uniformly on unsatisfied clauses. The methods described here work by focusing on random variables in unsatisfied clauses. Variants are considered in which variables are selected uniformly at random or with a bias towards variables participating in several unsatisfied clauses. These are studied in the case of the random 3-SAT problem, together with an alternative energy definition, the number of variables in unsatisfied constraints. The variable-based focused Metropolis search (V-FMS) is found to be quite close in performance to the standard clause-based FMS at optimal noise. At infinite noise, instead, the threshold for the linearity of solution times with instance size is improved by preferentially picking variables in several UNSAT clauses. Consequences for algorithmic design are discussed. PMID:25679737
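
    A toy sketch of variable-focused search in this spirit (not the authors' exact V-FMS; the clause representation, noise parameter, and acceptance rule are illustrative):

```python
import random

def v_fms(clauses, n_vars, noise=0.3, max_flips=50_000, rng=None):
    """Variable-focused Metropolis search sketch for SAT.
    clauses: tuples of signed ints (positive literal = var is True)."""
    rng = rng or random.Random(0)
    assign = [False] + [rng.random() < 0.5 for _ in range(n_vars)]  # 1-indexed
    sat = lambda c: any(assign[abs(l)] == (l > 0) for l in c)
    n_unsat = lambda: sum(not sat(c) for c in clauses)
    for _ in range(max_flips):
        unsat = [c for c in clauses if not sat(c)]
        if not unsat:
            return assign
        # Focus on a variable drawn from unsatisfied clauses; drawing with
        # multiplicity biases toward variables in several UNSAT clauses.
        pool = [abs(l) for c in unsat for l in c]
        v = rng.choice(pool)
        before = n_unsat()
        assign[v] = not assign[v]
        delta = n_unsat() - before
        # Metropolis rule: accept downhill moves, uphill with prob noise**delta
        if delta > 0 and rng.random() >= noise ** delta:
            assign[v] = not assign[v]  # reject the flip
    return None
```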

  8. Local search methods based on variable focusing for random K -satisfiability

    NASA Astrophysics Data System (ADS)

    Lemoy, Rémi; Alava, Mikko; Aurell, Erik

    2015-01-01

    We introduce variable-focused local search algorithms for satisfiability problems. Usual approaches focus uniformly on unsatisfied clauses. The methods described here work by focusing on random variables in unsatisfied clauses. Variants are considered in which variables are selected uniformly at random or with a bias towards variables participating in several unsatisfied clauses. These are studied in the case of the random 3-SAT problem, together with an alternative energy definition, the number of variables in unsatisfied constraints. The variable-based focused Metropolis search (V-FMS) is found to be quite close in performance to the standard clause-based FMS at optimal noise. At infinite noise, instead, the threshold for the linearity of solution times with instance size is improved by preferentially picking variables in several UNSAT clauses. Consequences for algorithmic design are discussed.

  9. A Bloch decomposition-based stochastic Galerkin method for quantum dynamics with a random external potential

    NASA Astrophysics Data System (ADS)

    Wu, Zhizhang; Huang, Zhongyi

    2016-07-01

    In this paper, we consider the numerical solution of the one-dimensional Schrödinger equation with a periodic lattice potential and a random external potential. This is an important model in solid state physics where the randomness results from complicated phenomena that are not exactly known. Here we generalize the Bloch decomposition-based time-splitting pseudospectral method to the stochastic setting using the generalized polynomial chaos with a Galerkin procedure so that the main effects of dispersion and periodic potential are still computed together. We prove that our method is unconditionally stable and numerical examples show that it has other nice properties and is more efficient than the traditional method. Finally, we give some numerical evidence for the well-known phenomenon of Anderson localization.

  10. On analysis-based two-step interpolation methods for randomly sampled seismic data

    NASA Astrophysics Data System (ADS)

    Yang, Pengliang; Gao, Jinghuai; Chen, Wenchao

    2013-02-01

    Interpolating the missing traces of regularly or irregularly sampled seismic records is an exceedingly important issue in the geophysical community. Many modern acquisition and reconstruction methods are designed to exploit the transform-domain sparsity of the few randomly recorded but informative seismic data using thresholding techniques. In this paper, to regularize randomly sampled seismic data, we introduce two accelerated, analysis-based two-step interpolation algorithms: an analysis-based FISTA (fast iterative shrinkage-thresholding algorithm) and an FPOCS (fast projection onto convex sets) algorithm, derived from the IST (iterative shrinkage-thresholding) and POCS (projection onto convex sets) algorithms, respectively. A MATLAB package is developed for the implementation of these thresholding-related interpolation methods. Based on this package, we compare the reconstruction performance of these algorithms using synthetic and real seismic data. Combined with several thresholding strategies, the accelerated convergence of the proposed methods is also highlighted.
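
    A minimal POCS-style sketch of the two-step idea (threshold in a transform domain, then reinsert the observed traces). The 2-D FFT sparsifying transform and the linear threshold schedule are illustrative assumptions; the authors' accelerated FISTA/FPOCS variants add a momentum step not shown here:

```python
import numpy as np

def pocs_interpolate(data, mask, n_iter=50, p_max=0.99, p_min=0.01):
    """POCS interpolation sketch: alternate between a sparsity projection
    (hard-thresholding small 2-D Fourier coefficients) and a data-consistency
    projection (reinserting the observed traces). mask is 1 on observed
    samples, 0 on missing ones."""
    x = data * mask
    for k in range(n_iter):
        coef = np.fft.fft2(x)
        # Linearly decreasing threshold, relative to the largest coefficient
        tau = np.abs(coef).max() * (p_max + (p_min - p_max) * k / (n_iter - 1))
        coef[np.abs(coef) < tau] = 0.0
        x = np.real(np.fft.ifft2(coef))
        x = data * mask + x * (1.0 - mask)  # keep observed samples exact
    return x
```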

  11. Implementation of a random displacement method (RDM) in the ADPIC model framework

    SciTech Connect

    Ermak, D.L.; Nasstrom, J.S.; Taylor, A.G.

    1995-06-01

    The objective of this work was to implement a 3-D Lagrangian stochastic (also called random walk or Monte Carlo) diffusion method in the framework of the operational ADPIC (Atmospheric Diffusion Particle-In-Cell) code. The Random Displacement Method, RDM, presented here and implemented in the ADPIC code, calculates atmospheric dispersion in a purely Lagrangian, grid-independent manner. Some of the benefits of this approach compared to the previously used "particle-in-cell, gradient diffusion" method are (a) a sub-grid diffusion approximation is no longer needed, (b) numerical accuracy of the diffusion calculation is improved because particle displacement does not depend on the resolution of the Eulerian grid used to calculate species concentration, and (c) adaptation to other grid structures for the input wind field does not affect the diffusion calculation. In addition, the RDM incorporates a unique and accurate treatment of particle interaction with the surface.
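
    A single RDM update follows the standard Lagrangian stochastic form; the drift term shown is the textbook correction for spatially varying diffusivity, filling in details the abstract does not spell out:

```python
import numpy as np

def rdm_step(z, K, dKdz, dt, rng):
    """One Random Displacement Method step in the vertical (illustrative):
        dz = K'(z) dt + sqrt(2 K(z) dt) * xi,   xi ~ N(0, 1).
    The drift term K'(z) keeps an initially well-mixed particle cloud
    well-mixed when the eddy diffusivity K varies with height."""
    xi = rng.standard_normal(np.shape(z))
    return z + dKdz(z) * dt + np.sqrt(2.0 * K(z) * dt) * xi
```

    For constant diffusivity the ensemble variance should grow as 2Kt, which gives a quick sanity check of an implementation.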

  12. Randomized gradient-free method for multiagent optimization over time-varying networks.

    PubMed

    Yuan, Deming; Ho, Daniel W C

    2015-06-01

    In this brief, we consider the multiagent optimization over a network where multiple agents try to minimize a sum of nonsmooth but Lipschitz continuous functions, subject to a convex state constraint set. The underlying network topology is modeled as time varying. We propose a randomized derivative-free method, where in each update, the random gradient-free oracles are utilized instead of the subgradients (SGs). In contrast to the existing work, we do not require that agents are able to compute the SGs of their objective functions. We establish the convergence of the method to an approximate solution of the multiagent optimization problem within the error level depending on the smoothing parameter and the Lipschitz constant of each agent's objective function. Finally, a numerical example is provided to demonstrate the effectiveness of the method. PMID:25099738
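
    The oracle idea can be sketched with a common two-point construction (the brief's exact oracle and smoothing scheme may differ): perturb the point randomly, difference the function values, and use that scalar to scale the random direction, so no subgradient is ever computed.

```python
import numpy as np

def gradient_free_oracle(f, x, mu, rng):
    """Two-point random gradient-free oracle (illustrative):
        g = (f(x + mu * u) - f(x)) / mu * u,   u ~ N(0, I).
    Its expectation approximates the gradient of a smoothed version of f,
    with the smoothing parameter mu controlling the bias."""
    u = rng.standard_normal(np.shape(x))
    return (f(x + mu * u) - f(x)) / mu * u
```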

  13. A systematic review of randomized controlled trials on sterilization methods of extracted human teeth

    PubMed Central

    Western, J. Sylvia; Dicksit, Daniel Devaprakash

    2016-01-01

    Aim of this Study: The aim was to evaluate the efficiency of different sterilization methods on extracted human teeth (EHT) by a systematic review of in vitro randomized controlled trials. Methodology: An extensive electronic database literature search concerning the sterilization of EHT was conducted. The search terms used were “human teeth, sterilization, disinfection, randomized controlled trials, and infection control.” Randomized controlled trials comparing the efficiency of different methods of sterilization of EHT were all included in this systematic review. Results: Out of 1618 articles obtained, eight articles were selected for this systematic review. The sterilization methods reviewed were autoclaving, 10% formalin, 5.25% sodium hypochlorite, 3% hydrogen peroxide, 2% glutaraldehyde, 0.1% thymol, and boiling at 100°C. Data were extracted from the selected individual studies and their findings were summarized. Conclusion: Autoclaving and 10% formalin can be considered 100% efficient and reliable methods, whereas 5.25% sodium hypochlorite, 3% hydrogen peroxide, 2% glutaraldehyde, 0.1% thymol, and boiling at 100°C were inefficient and unreliable methods for the sterilization of EHT. PMID:27563183

  14. A Simulation-Based Comparison of Covariate Adjustment Methods for the Analysis of Randomized Controlled Trials

    PubMed Central

    Chaussé, Pierre; Liu, Jin; Luta, George

    2016-01-01

    Covariate adjustment methods are frequently used when baseline covariate information is available for randomized controlled trials. Using a simulation study, we compared the analysis of covariance (ANCOVA) with three nonparametric covariate adjustment methods with respect to point and interval estimation for the difference between means. The three alternative methods were based on important members of the generalized empirical likelihood (GEL) family, specifically on the empirical likelihood (EL) method, the exponential tilting (ET) method, and the continuously updated estimator (CUE) method. Two criteria were considered for the comparison of the four statistical methods: the root mean squared error and the empirical coverage of the nominal 95% confidence intervals for the difference between means. Based on the results of the simulation study, for sensitivity analysis purposes, we recommend the use of ANCOVA (with robust standard errors when heteroscedasticity is present) together with the CUE-based covariate adjustment method. PMID:27077870

  15. A new Lagrangian random choice method for steady two-dimensional supersonic/hypersonic flow

    NASA Technical Reports Server (NTRS)

    Loh, C. Y.; Hui, W. H.

    1991-01-01

    Glimm's (1965) random choice method has been successfully applied to compute steady two-dimensional supersonic/hypersonic flow using a new Lagrangian formulation. The method is easy to program, fast to execute, yet it is very accurate and robust. It requires no grid generation, resolves slipline and shock discontinuities crisply, can handle boundary conditions most easily, and is applicable to hypersonic as well as supersonic flow. It represents an accurate and fast alternative to the existing Eulerian methods. Many computed examples are given.

  16. Wave propagation through random media: A local method of small perturbations based on the Helmholtz equation

    NASA Technical Reports Server (NTRS)

    Grosse, Ralf

    1990-01-01

    Propagation of sound through the turbulent atmosphere is a statistical problem. The randomness of the refractive index field causes sound pressure fluctuations. Although no general theory to predict sound pressure statistics from given refractive index statistics exists, there are several approximate solutions to the problem. The most common approximation is the parabolic equation method. Results obtained by this method are restricted to small refractive index fluctuations and to small wave lengths. While the first condition is generally met in the atmosphere, it is desirable to overcome the second. A generalization of the parabolic equation method with respect to the small wave length restriction is presented.

  17. Reduced basis ANOVA methods for partial differential equations with high-dimensional random inputs

    NASA Astrophysics Data System (ADS)

    Liao, Qifeng; Lin, Guang

    2016-07-01

    In this paper we present a reduced basis ANOVA approach for partial differential equations (PDEs) with random inputs. The ANOVA method combined with stochastic collocation methods provides model reduction in high-dimensional parameter space through decomposing high-dimensional inputs into unions of low-dimensional inputs. In this work, to further reduce the computational cost, we investigate spatial low-rank structures in the ANOVA-collocation method, and develop efficient spatial model reduction techniques using hierarchically generated reduced bases. We present a general mathematical framework of the methodology, validate its accuracy and demonstrate its efficiency with numerical experiments.

  18. Random vs. Combinatorial Methods for Discrete Event Simulation of a Grid Computer Network

    NASA Technical Reports Server (NTRS)

    Kuhn, D. Richard; Kacker, Raghu; Lei, Yu

    2010-01-01

    This study compared random and t-way combinatorial inputs to a network simulator, to determine if these two approaches produce significantly different deadlock detection for varying network configurations. Modeling deadlock detection is important for analyzing configuration changes that could inadvertently degrade network operations, or for determining modifications that could be made by attackers to deliberately induce deadlock. Discrete event simulation of a network may be conducted using random generation of inputs. In this study, we compare random with combinatorial generation of inputs. Combinatorial (or t-way) testing requires every combination of any t parameter values to be covered by at least one test. Combinatorial methods can be highly effective because empirical data suggest that nearly all failures involve the interaction of a small number of parameters (1 to 6). Thus, for example, if all deadlocks involve at most 5-way interactions between n parameters, then exhaustive testing of all n-way interactions adds no additional information that would not be obtained by testing all 5-way interactions. While the maximum degree of interaction between parameters involved in the deadlocks clearly cannot be known in advance, covering all t-way interactions may be more efficient than using random generation of inputs. In this study we tested this hypothesis for t = 2, 3, and 4 for deadlock detection in a network simulation. Achieving the same degree of coverage provided by 4-way tests would have required approximately 3.2 times as many random tests; thus combinatorial methods were more efficient for detecting deadlocks involving a higher degree of interaction. The paper reviews explanations for these results and implications for modeling and simulation.
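
    The coverage comparison at the heart of the study can be made concrete with a small checker; the two-value domains below are illustrative, not the simulator's actual parameters:

```python
from itertools import combinations, product

def t_way_coverage(tests, domains, t):
    """Fraction of all t-way parameter-value combinations covered by a test
    set. Each test is a tuple giving one value per parameter; domains lists
    the allowed values of each parameter."""
    # All (parameter-index tuple, value tuple) pairs that must be covered
    needed = {(idx, vals)
              for idx in combinations(range(len(domains)), t)
              for vals in product(*(domains[i] for i in idx))}
    # The pairs actually exercised by the given tests
    covered = {(idx, tuple(row[i] for i in idx))
               for row in tests
               for idx in combinations(range(len(domains)), t)}
    return len(covered & needed) / len(needed)
```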

  19. Novel copyright information hiding method based on random phase matrix of Fresnel diffraction transforms

    NASA Astrophysics Data System (ADS)

    Cao, Chao; Chen, Ru-jun

    2009-10-01

    In this paper, we present a new copyright information hiding method for digital images in Moiré fringe format. The copyright information is embedded into the protected image and the detecting image based on a Fresnel phase matrix. First, using the Fresnel diffraction transform, the random phase matrix of the copyright information is generated. Then, according to the Moiré fringe principle, the protected image and the detecting image are modulated respectively based on the random phase matrix, and the copyright information is embedded into them. When the protected image and the detecting image are overlapped, the copyright information reappears. Experimental results show that our method has good concealment performance and offers a new way to protect copyright.

  20. Research on text encryption and hiding method with double-random phase-encoding

    NASA Astrophysics Data System (ADS)

    Xu, Hongsheng; Sang, Nong

    2013-10-01

    Using optical image processing techniques, a novel text encryption and hiding method based on the double-random phase-encoding technique is proposed in this paper. First, the secret message is transformed into a two-dimensional array. The higher bits of the elements in the array are filled with the bit stream of the secret text, while the lower bits store specific values. Then, the transformed array is encoded by the double random phase encoding technique. Finally, the encoded array is embedded in a public host image to obtain the image embedded with hidden text. The performance of the proposed technique is tested via analytical modeling and a test data stream. Experimental results show that the secret text can be recovered accurately or almost accurately, while maintaining the quality of the host image embedded with hidden data, by properly selecting the method of transforming the secret text into an array and the superimposition coefficient.

  1. Chosen-plaintext attack on double-random-phase-encoding-based image hiding method

    NASA Astrophysics Data System (ADS)

    Xu, Hongsheng; Li, Guirong; Zhu, Xianchen

    2015-12-01

    Using optical image processing techniques, a novel text encryption and hiding method based on the double-random phase-encoding technique is proposed in this paper. First, the secret message is transformed into a two-dimensional array. The higher bits of the elements in the array are filled with the bit stream of the secret text, while the lower bits store specific values. Then, the transformed array is encoded by the double random phase encoding technique. Finally, the encoded array is embedded in a public host image to obtain the image embedded with hidden text. The performance of the proposed technique is tested via analytical modeling and a test data stream. Experimental results show that the secret text can be recovered accurately or almost accurately, while maintaining the quality of the host image embedded with hidden data, by properly selecting the method of transforming the secret text into an array and the superimposition coefficient.

  2. A numerical method for reducing the random noise in a two-dimensional waveform

    SciTech Connect

    Levy, A.J.

    1991-01-23

    This invention comprises a method for reducing random noise in a two-dimensional waveform having an irregular curvature. The method includes the steps of selecting a plurality of points initially positioned at preselected locations on the waveform. For each selected point, the straight line is found which connects it to the midpoint between its neighboring points. A new location for the point is calculated to lie on this straight line, a fraction of the distance between the initial location of the point and the midpoint. This process is repeated for each point positioned on the waveform. After a single iteration of the method is completed, the entire process is repeated a predetermined number of times to identify final calculated locations for the plurality of points selected. The final calculated locations of the points are then connected to form a waveform relatively free of random noise and having a substantially smooth curvature.
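
    The described smoothing iteration translates almost directly into code; the fraction and iteration count are the method's free parameters:

```python
def smooth_waveform(y, fraction=0.5, n_iter=10):
    """Noise-reduction sketch of the described method: move each interior
    point a chosen fraction of the way along the straight line toward the
    midpoint of its two neighbours, then repeat for a fixed number of
    iterations. Endpoints are left fixed."""
    y = list(y)
    for _ in range(n_iter):
        new = y[:]
        for i in range(1, len(y) - 1):
            midpoint = 0.5 * (y[i - 1] + y[i + 1])
            new[i] = y[i] + fraction * (midpoint - y[i])
        y = new
    return y
```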

  3. Key management of the double random-phase-encoding method using public-key encryption

    NASA Astrophysics Data System (ADS)

    Saini, Nirmala; Sinha, Aloka

    2010-03-01

    Public-key encryption has been used to encode the key of the encryption process. In the proposed technique, an input image has been encrypted by using the double random-phase-encoding method using extended fractional Fourier transform. The key of the encryption process have been encoded by using the Rivest-Shamir-Adelman (RSA) public-key encryption algorithm. The encoded key has then been transmitted to the receiver side along with the encrypted image. In the decryption process, first the encoded key has been decrypted using the secret key and then the encrypted image has been decrypted by using the retrieved key parameters. The proposed technique has advantage over double random-phase-encoding method because the problem associated with the transmission of the key has been eliminated by using public-key encryption. Computer simulation has been carried out to validate the proposed technique.
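
    The double random-phase-encoding core of the scheme (omitting the RSA key wrapping, and using the ordinary Fourier transform rather than the extended fractional one) can be sketched as:

```python
import numpy as np

def drpe_encrypt(img, phase1, phase2):
    """Double random phase encoding (classical Fourier-domain sketch):
    multiply by a random phase mask in the spatial domain, then by a second
    random phase mask in the Fourier domain."""
    u = np.fft.fft2(img * np.exp(1j * phase1))
    return np.fft.ifft2(u * np.exp(1j * phase2))

def drpe_decrypt(enc, phase1, phase2):
    """Invert the two phase modulations; phase1 and phase2 act as the keys."""
    u = np.fft.fft2(enc) * np.exp(-1j * phase2)
    return np.real(np.fft.ifft2(u) * np.exp(-1j * phase1))
```

    In the paper's setting the two phase masks would themselves be encrypted with RSA before transmission; here they are simply passed to the decrypt function.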

  4. Modulation transfer function of a lens measured with a random target method.

    PubMed

    Levy, E; Peles, D; Opher-Lipson, M; Lipson, S G

    1999-02-01

    We measured the modulation transfer function (MTF) of a lens in the visible region using a random test target generated on a computer screen. This is a simple method to determine the entire MTF curve in one measurement. The lens was obscured by several masks so that the measurements could be compared with the theoretically calculated MTF. Excellent agreement was obtained. Measurement noise was reduced by use of a large number of targets generated on the screen. PMID:18305663
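
    The principle, comparing the output and input power spectra of a known random target, can be sketched in one dimension; the Gaussian MTF below is a synthetic stand-in for a real lens:

```python
import numpy as np

def mtf_from_random_target(target_rows, image_rows):
    """Estimate the MTF as the square root of the ratio of the ensemble-
    averaged output power spectrum to the input power spectrum, using many
    1-D rows of a random target and of its image through the system."""
    s_in = np.mean(np.abs(np.fft.rfft(target_rows, axis=1)) ** 2, axis=0)
    s_out = np.mean(np.abs(np.fft.rfft(image_rows, axis=1)) ** 2, axis=0)
    return np.sqrt(s_out / s_in)
```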

  5. Thermodynamic method for generating random stress distributions on an earthquake fault

    USGS Publications Warehouse

    Barall, Michael; Harris, Ruth A.

    2012-01-01

    This report presents a new method for generating random stress distributions on an earthquake fault, suitable for use as initial conditions in a dynamic rupture simulation. The method employs concepts from thermodynamics and statistical mechanics. A pattern of fault slip is considered to be analogous to a micro-state of a thermodynamic system. The energy of the micro-state is taken to be the elastic energy stored in the surrounding medium. Then, the Boltzmann distribution gives the probability of a given pattern of fault slip and stress. We show how to decompose the system into independent degrees of freedom, which makes it computationally feasible to select a random state. However, due to the equipartition theorem, straightforward application of the Boltzmann distribution leads to a divergence which predicts infinite stress. To avoid equipartition, we show that the finite strength of the fault acts to restrict the possible states of the system. By analyzing a set of earthquake scaling relations, we derive a new formula for the expected power spectral density of the stress distribution, which allows us to construct a computer algorithm free of infinities. We then present a new technique for controlling the extent of the rupture by generating a random stress distribution thousands of times larger than the fault surface, and selecting a portion which, by chance, has a positive stress perturbation of the desired size. Finally, we present a new two-stage nucleation method that combines a small zone of forced rupture with a larger zone of reduced fracture energy.
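
    The last step of the construction, generating a random field with a prescribed power spectral density, can be sketched with Fourier synthesis; the power-law spectrum below is an illustrative placeholder for the formula the report derives from earthquake scaling relations:

```python
import numpy as np

def random_stress_field(shape, spectral_exponent, rng):
    """Random stress perturbation with power spectral density ~ |k|^(-p)
    (p = spectral_exponent), built by assigning random phases to spectral
    amplitudes and inverse transforming. Normalized to unit variance."""
    ky = np.fft.fftfreq(shape[0])[:, None]
    kx = np.fft.fftfreq(shape[1])[None, :]
    k = np.hypot(kx, ky)
    k[0, 0] = np.inf                       # zero the DC term (zero-mean field)
    amp = k ** (-spectral_exponent / 2.0)  # amplitude = sqrt(PSD)
    phase = np.exp(2j * np.pi * rng.random(shape))
    field = np.real(np.fft.ifft2(amp * phase))
    return field / field.std()
```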

  6. The Method of Randomization for Cluster-Randomized Trials: Challenges of Including Patients with Multiple Chronic Conditions

    PubMed Central

    Esserman, Denise; Allore, Heather G.; Travison, Thomas G.

    2016-01-01

    Cluster-randomized clinical trials (CRT) are trials in which the unit of randomization is not a participant but a group (e.g. healthcare systems or community centers). They are suitable when the intervention applies naturally to the cluster (e.g. healthcare policy); when lack of independence among participants may occur (e.g. nursing home hygiene); or when it is most ethical to apply an intervention to all within a group (e.g. school-level immunization). Because participants in the same cluster receive the same intervention, CRT may approximate clinical practice, and may produce generalizable findings. However, when not properly designed or interpreted, CRT may produce biased results. CRT designs have features that add complexity to statistical estimation and inference. Chief among these is the cluster-level correlation in response measurements induced by the randomization. A critical consideration is the experimental unit of inference; often it is desirable to consider intervention effects at the level of the individual rather than the cluster. Finally, given that the number of clusters available may be limited, simple forms of randomization may not achieve balance between intervention and control arms at either the cluster or participant level. In non-clustered clinical trials, balance of key factors may be easier to achieve because the sample can be made homogeneous by excluding participants with multiple chronic conditions (MCC). CRTs, which are often pragmatic, may eschew such restrictions. Failure to account for imbalance may induce bias and reduce validity. This article focuses on the complexities of randomization in the design of CRTs, such as the inclusion of patients with MCC, and imbalances in covariate factors across clusters.
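
    One standard remedy for the balance problem described, covariate-constrained randomization (not necessarily the authors' proposal), can be sketched for a small trial with an illustrative cluster-level covariate:

```python
import itertools
import random

def constrained_cluster_randomization(cluster_means, n_treat, tol, rng):
    """Covariate-constrained randomization sketch: enumerate all allocations
    of clusters to the treatment arm, keep those whose between-arm difference
    in a cluster-level covariate mean is at most tol, and draw one acceptable
    allocation at random. Returns (chosen treatment indices, #acceptable)."""
    n = len(cluster_means)
    acceptable = []
    for treat in itertools.combinations(range(n), n_treat):
        control = [i for i in range(n) if i not in treat]
        diff = abs(sum(cluster_means[i] for i in treat) / n_treat
                   - sum(cluster_means[i] for i in control) / len(control))
        if diff <= tol:
            acceptable.append(treat)
    return rng.choice(acceptable), len(acceptable)
```

    Full enumeration is feasible only for small numbers of clusters, which is exactly the regime where simple randomization is most likely to leave the arms imbalanced.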

  7. Discharge simulation using downscaled spatial rainfall field by introducing correlation effect in random cascade method

    NASA Astrophysics Data System (ADS)

    Shrestha, R. K.; Tachikawa, Y.; Takara, K.

    2003-04-01

    The simulation of a spatial rainfall field based on the non-homogeneous random cascade method disaggregates a regionally averaged rainfall such as the GCM output. The cascade generators are used to disaggregate and produce spatial patterns across the region (Over and Gupta, 1996; Chatchai et al. 2000; Tachikawa et al. 2003). However, the disaggregated data are rarely used to produce discharge with a distributed hydrological model. The hesitation to use disaggregated GCM data in discharge simulation is mainly due to the low reliability of reproducing the spatial pattern and the high chance of magnitude fluctuation in a few trials of disaggregation. Long-term disaggregation results, which are expected to reproduce the true spatial pattern, may not be convenient for practical discharge simulation. A modified method is tested by keeping the volume balanced and forcing the location of the cascade generators on the basis of the spatial correlation of the rainfall field with respect to surrounding regions. In this method, a reference matrix is prepared, calculated for every target grid by summing the products of rainfall magnitude and the spatial correlation coefficients of the respective reference grids. The reference matrix is used to adjust the location of the random generator in two ways -- hierarchically and statistically -- so the method is designated the Hierarchical and Statistical Adjustment (HSA) method. The HSA method preserves the magnitude of the random cascade generators but modifies their location. Unlike the previous non-homogeneous random cascade method, this method produced spatial patterns similar to the ground truth in every realization, a clear indication of the improved reliability of the disaggregation from coarse GCM output to the finer resolution demanded by the hydrological model. The forced volume balance may be justified from the engineering aspect to maintain the same input quantity of rainfall in a watershed for hydrologic simulation purposes.

  8. Development of pseudo-random diamond turning method for fabricating freeform optics with scattering homogenization.

    PubMed

    Zhu, Zhiwei; Zhou, Xiaoqin; Luo, Dan; Liu, Qiang

    2013-11-18

    In this paper, a novel pseudo-random diamond turning (PRDT) method is proposed for the fabrication of freeform optics with scattering homogenization by actively eliminating the inherent periodicity of the residual tool marks. The strategy for accurately determining the spiral toolpath with pseudo-random vibration modulation is explained in detail. A spatial geometric calculation method is employed to determine the toolpath in consideration of cutting tool geometries, and an iteration algorithm is further introduced to enhance the computation efficiency. Moreover, a novel two-degree-of-freedom fast tool servo (2-DOF FTS) system with decoupled motions is developed to implement the PRDT method. Taking advantage of a novel surface topography generation algorithm, theoretical surfaces generated using the calculated toolpaths are obtained, and the accuracy of the toolpath generation and the efficiency of the PRDT method in breaking up the inherent periodicity of tool marks are examined. A series of preliminary cutting experiments is carried out to verify the efficiency of the proposed PRDT method; the experimental results are in good agreement with the numerical simulation. In addition, the results of scattering experiments indicate that the proposed PRDT method is a very promising technique for achieving the scattering homogenization of machined surfaces with complicated shapes. PMID:24514359

  9. Determination of Slope Safety Factor with Analytical Solution and Searching Critical Slip Surface with Genetic-Traversal Random Method

    PubMed Central

    2014-01-01

    In current practice, to determine the safety factor of a slope with a two-dimensional circular potential failure surface, one of the search methods for the critical slip surface is the Genetic Algorithm (GA), while the slope safety factor is commonly calculated with Fellenius' slices method. However, GA needs to be validated with more numerical tests, and Fellenius' slices method, like the finite element method, is only approximate. This paper proposes a new approach to determining the minimum slope safety factor: the safety factor is computed from an analytical solution, and the critical slip surface is found with the Genetic-Traversal Random Method. The analytical solution is more accurate than Fellenius' slices method, and the Genetic-Traversal Random Method uses random picks to implement mutation. A computer program that searches automatically was developed for the Genetic-Traversal Random Method. Comparison with other methods, such as the Slope/W software, indicates that the Genetic-Traversal Random Method can give a very low safety factor, about half that of the other methods; however, the minimum safety factor it obtains is very close to the lower-bound solutions of the slope safety factor given by the Ansys software. PMID:24782679
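For reference, the approximate Fellenius (ordinary) method of slices that the paper compares against can be sketched as follows; the slice geometry values below are made-up illustration data:

```python
import math

def fellenius_safety_factor(slices, cohesion, phi_deg):
    """Ordinary (Fellenius) method of slices for a circular slip surface.
    `slices` is a list of (W, alpha_deg, l): slice weight per unit length,
    base inclination in degrees, and base length.
    FS = sum of resisting terms / sum of driving terms."""
    tan_phi = math.tan(math.radians(phi_deg))
    resisting = 0.0
    driving = 0.0
    for W, alpha_deg, l in slices:
        a = math.radians(alpha_deg)
        resisting += cohesion * l + W * math.cos(a) * tan_phi
        driving += W * math.sin(a)
    return resisting / driving

# Made-up five-slice example: weights in kN/m, angles in deg, lengths in m
demo = [(50, -10, 1.2), (90, 5, 1.1), (120, 20, 1.1), (90, 35, 1.2), (40, 50, 1.5)]
fs = fellenius_safety_factor(demo, cohesion=10.0, phi_deg=25.0)
```

A critical-surface search (GA or the paper's Genetic-Traversal Random Method) would wrap such an evaluation, varying the circle's centre and radius to minimise FS.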

  10. A stochastic simulation method for the assessment of resistive random access memory retention reliability

    SciTech Connect

    Berco, Dan; Tseng, Tseung-Yuen

    2015-12-21

    This study presents an evaluation method for resistive random access memory retention reliability based on the Metropolis Monte Carlo algorithm and Gibbs free energy. The method, which does not rely on a time evolution, provides an extremely efficient way to compare the relative retention properties of metal-insulator-metal structures. It requires a small number of iterations and may be used for statistical analysis. The presented approach is used to compare the relative robustness of a single layer ZrO2 device with a double layer ZnO/ZrO2 one, and obtain results which are in good agreement with experimental data.
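The time-free Metropolis acceptance rule underlying such a scheme can be sketched as follows; the barrier values, temperature, and escape-fraction proxy are placeholders, not the authors' energy model:

```python
import math
import random

def metropolis_accept(dG, kT, rng):
    """Metropolis rule: always accept a move that lowers the Gibbs free
    energy; accept an uphill move with probability exp(-dG / kT)."""
    if dG <= 0.0:
        return True
    return rng.random() < math.exp(-dG / kT)

def escape_fraction(barrier_eV, kT_eV=0.0259, n=100_000, seed=0):
    """Fraction of accepted uphill attempts over a fixed barrier: a crude
    relative-stability proxy (a deeper barrier accepts fewer escapes)."""
    rng = random.Random(seed)
    accepted = sum(metropolis_accept(barrier_eV, kT_eV, rng) for _ in range(n))
    return accepted / n

low = escape_fraction(0.1)   # shallow barrier: escapes often
high = escape_fraction(0.3)  # deeper barrier: escapes rarely
```

Comparing two structures then reduces to comparing acceptance statistics over their respective free-energy landscapes, with no physical time axis involved.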

  11. A stochastic simulation method for the assessment of resistive random access memory retention reliability

    NASA Astrophysics Data System (ADS)

    Berco, Dan; Tseng, Tseung-Yuen

    2015-12-01

    This study presents an evaluation method for resistive random access memory retention reliability based on the Metropolis Monte Carlo algorithm and Gibbs free energy. The method, which does not rely on a time evolution, provides an extremely efficient way to compare the relative retention properties of metal-insulator-metal structures. It requires a small number of iterations and may be used for statistical analysis. The presented approach is used to compare the relative robustness of a single layer ZrO2 device with a double layer ZnO/ZrO2 one, and obtain results which are in good agreement with experimental data.

  12. Quantum Monte Carlo method using phase-free random walks with Slater determinants.

    PubMed

    Zhang, Shiwei; Krakauer, Henry

    2003-04-01

    We develop a quantum Monte Carlo method for many fermions using random walks in the space of Slater determinants. An approximate approach is formulated with a trial wave function |Psi(T)> to control the phase problem. Using a plane-wave basis and nonlocal pseudopotentials, we apply the method to Be, Si, and P atoms and dimers, and to bulk Si supercells. Single-determinant wave functions from density functional theory calculations were used as |Psi(T)> with no additional optimization. The calculated binding energies of dimers and cohesive energy of bulk Si are in excellent agreement with experiments and are comparable to the best existing theoretical results. PMID:12689312

  13. Low-noise multiple watermarks technology based on complex double random phase encoding method

    NASA Astrophysics Data System (ADS)

    Zheng, Jihong; Lu, Rongwen; Sun, Liujie; Zhuang, Songlin

    2010-11-01

    Based on the double random phase encoding (DRPE) method, watermarking technology can provide a stable and robust way to protect the copyright of printed images. However, owing to its linear character, DRPE carries a serious security risk when attacked. In this paper, a complex coding method, which adds chaotic encryption based on a logistic map before the DRPE coding, is proposed and simulated. The results confirm that the complex method provides better security protection for the watermark. Furthermore, low-noise multiple watermarking is studied, in which multiple watermarks are embedded into one host image and decrypted individually with the corresponding phase keys. Digital simulation and mathematical analysis show that, for the same total embedding weight factor, multiple watermarking improves the signal-to-noise ratio (SNR) of the output image significantly. The complex multiple-watermark method may provide robust, stable, and reliable copyright protection with a higher quality printed image.
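A minimal numerical sketch of the combination, logistic-map chaotic masking followed by classic double random phase encoding, is given below; the array sizes, map parameters, and additive masking step are illustrative assumptions:

```python
import numpy as np

def logistic_mask(x0, r, shape):
    """Chaotic mask from the logistic map x_{k+1} = r * x_k * (1 - x_k)."""
    n = int(np.prod(shape))
    xs = np.empty(n)
    x = x0
    for k in range(n):
        x = r * x * (1.0 - x)
        xs[k] = x
    return xs.reshape(shape)

def drpe(field, key_in, key_fourier):
    """Double random phase encoding: one random phase mask in the input
    plane, a second in the Fourier plane."""
    p1 = np.exp(2j * np.pi * key_in)
    p2 = np.exp(2j * np.pi * key_fourier)
    return np.fft.ifft2(np.fft.fft2(field * p1) * p2)

def drpe_decrypt(cipher, key_in, key_fourier):
    """Invert DRPE by dividing out the two unit-modulus phase masks."""
    p1 = np.exp(2j * np.pi * key_in)
    p2 = np.exp(2j * np.pi * key_fourier)
    return np.fft.ifft2(np.fft.fft2(cipher) / p2) / p1

rng = np.random.default_rng(1)
img = rng.random((16, 16))
scrambled = (img + logistic_mask(0.3, 3.99, img.shape)) % 1.0  # chaotic pre-step
k1, k2 = rng.random((16, 16)), rng.random((16, 16))
cipher = drpe(scrambled, k1, k2)
recovered = drpe_decrypt(cipher, k1, k2).real
```

Only a holder of both phase keys and the logistic seed recovers the original; the chaotic pre-step is what removes the exploitable linearity of plain DRPE.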

  14. A finite element method for the statistics of non-linear random vibration

    NASA Astrophysics Data System (ADS)

    Langley, R. S.

    1985-07-01

    The transitional probability density function for the random response of a certain class of non-linear system satisfies the Fokker-Planck-Kolmogorov equation. This paper concerns the numerical solution of the stationary form of this equation, yielding the stationary probability density function of response. The weighted residual statement for the problem is integrated by parts to yield the weak form of the equations, which are then solved by the finite element method. The method is applied to a Duffing oscillator and good agreement is found with the exact result, and the method is compared favourably with a Galerkin solution method given by Bhandari and Sherrer [1]. Also, the method is applied to the ship rolling problem and good agreement is found with an approximate analytical result due to Roberts [2].
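The exact stationary solution the finite element result is checked against is available in closed form for the Duffing oscillator; a numerical sketch follows, with arbitrary grid and parameters and the standard white-noise normalisation assumed here:

```python
import numpy as np

def duffing_stationary_pdf(c, k, eps, D, x, v):
    """Exact stationary Fokker-Planck-Kolmogorov solution for
    x'' + c x' + k x + eps x^3 = w(t), <w(t) w(t+s)> = D delta(s):
    p(x, v) ~ exp(-(2 c / D) * (v^2/2 + k x^2/2 + eps x^4/4)),
    normalised over the supplied grid."""
    X, V = np.meshgrid(x, v, indexing="ij")
    H = 0.5 * V**2 + 0.5 * k * X**2 + 0.25 * eps * X**4
    p = np.exp(-(2.0 * c / D) * H)
    dx = x[1] - x[0]
    dv = v[1] - v[0]
    return p / (p.sum() * dx * dv)

x = np.linspace(-4.0, 4.0, 201)
v = np.linspace(-4.0, 4.0, 201)
p = duffing_stationary_pdf(c=0.2, k=1.0, eps=0.5, D=0.4, x=x, v=v)
```

A finite element FPK solver of the kind described would be validated by comparing its discrete stationary density against this closed form node by node.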

  15. Random fields generation on the GPU with the spectral turning bands method

    NASA Astrophysics Data System (ADS)

    Hunger, L.; Cosenza, B.; Kimeswenger, S.; Fahringer, T.

    2014-08-01

    Random field (RF) generation algorithms are of paramount importance for many scientific domains, such as astrophysics, geostatistics, and computer graphics. Examples include the generation of initial conditions for cosmological simulations and hydrodynamical turbulence driving; in the latter, a new random field is needed every time-step. Current approaches commonly make use of the 3D FFT (Fast Fourier Transform) and require the whole generated field to be stored in memory. Moreover, they are limited to regular rectilinear meshes and need an extra processing step to support non-regular meshes. In this paper, we introduce TBARF (Turning BAnd Random Fields), an RF generation algorithm based on the turning band method that is optimized for massively parallel hardware such as GPUs. Our algorithm replaces the 3D FFT with a lower order, one-dimensional FFT followed by a projection step, and is further optimized with loop unrolling and blocking. We show that TBARF can easily generate RFs on non-regular (non-uniform) meshes and can handle mesh sizes larger than the available GPU memory by using a streaming, out-of-core approach. TBARF is 2 to 5 times faster than the traditional methods when generating RFs with more than 16M cells, and it has been successfully applied to two real case scenarios: planetary nebulae and cosmological simulations.
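The turning-band idea, replacing a 3D synthesis by many 1D simulations along random directions, can be sketched as below. The Gaussian spectral sampling is a generic choice for illustration, not TBARF's optimized kernel or its GPU layout:

```python
import numpy as np

def turning_bands(points, n_lines, rng, sigma=1.0):
    """Spectral turning-bands sketch: evaluate 1-D cosine processes along
    random unit directions and take their normalised sum as the 3-D field.
    Works on any point set, so irregular meshes need no special casing."""
    field = np.zeros(len(points))
    for _ in range(n_lines):
        u = rng.normal(size=3)
        u /= np.linalg.norm(u)            # random direction on the sphere
        omega = rng.normal()              # frequency from an assumed spectrum
        phi = rng.uniform(0.0, 2.0 * np.pi)  # random phase
        s = points @ u                    # project each point onto the line
        field += np.sqrt(2.0) * sigma * np.cos(omega * s + phi)
    return field / np.sqrt(n_lines)

rng = np.random.default_rng(2)
pts = rng.uniform(0.0, 10.0, size=(1000, 3))
f = turning_bands(pts, n_lines=64, rng=rng)
```

Because each point is processed independently, the loop parallelises trivially over points, which is what makes the method attractive on GPUs and for streaming point sets larger than device memory.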

  16. An FRF bounding method for randomly uncertain structures with or without coupling to an acoustic cavity

    NASA Astrophysics Data System (ADS)

    Dunne, L. W.; Dunne, J. F.

    2009-04-01

    An efficient frequency response function (FRF) bounding method is proposed using asymptotic extreme-value theory. The method exploits a small random sample of realised FRFs obtained from nominally identical structures to predict corresponding FRF bounds for a substantially larger batch. This is useful for predicting forced-vibration levels in automotive vehicle bodies when parameters are assumed to vary statistically. Small samples are assumed to come either from Monte Carlo simulation using a vibration model or via measurements on real structures. The basis of the method is to undertake a hypothesis test and, if justified, repeatedly fit inverted Type I asymptotic threshold exceedance models at discrete frequencies, for which the models are not locked to a block size (as in classical extreme-value models). The chosen FRF 'bound' is predicted from the inverse model in the form of the 'm-observational return level', namely the level that will be exceeded on average once in every m structures realised. The method is tested on simulated linear structures, initially to establish its scope and limitations. Initial testing is performed on an sdof system, followed by small and medium-sized uncoupled mdof grillages. Testing then continues to: (i) a random acoustically coupled grillage structure; and (ii) a partially random industrial-scale box structure which exhibits similar dynamic characteristics to a small vehicle structure and is analysed in NASTRAN. In both cases, structural and acoustic responses to a single deterministic load are examined. The paper shows that the method is not suitable for very small uncoupled systems but rapidly becomes very appropriate for both uncoupled and coupled mdof structures.
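The m-observational return level from a fitted Type I (Gumbel) model can be sketched as follows. A method-of-moments fit is used here for brevity, and the peak values are made-up; the paper's threshold-exceedance fitting is more elaborate:

```python
import math
import statistics

def gumbel_fit_moments(sample):
    """Method-of-moments fit of a Type I (Gumbel) extreme-value model:
    beta from the standard deviation, mu via the Euler-Mascheroni shift."""
    mean = statistics.fmean(sample)
    std = statistics.stdev(sample)
    beta = std * math.sqrt(6.0) / math.pi
    mu = mean - 0.5772156649 * beta
    return mu, beta

def m_observational_return_level(mu, beta, m):
    """Level exceeded on average once in every m structures realised."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / m))

# Made-up sample of peak FRF magnitudes from nominally identical structures
peaks = [2.1, 2.4, 1.9, 2.8, 2.2, 2.6, 2.0, 2.5, 2.3, 2.7]
mu, beta = gumbel_fit_moments(peaks)
bound10 = m_observational_return_level(mu, beta, 10)
bound100 = m_observational_return_level(mu, beta, 100)
```

Repeating the fit at each discrete frequency yields a bounding curve over the whole FRF, with the batch size m chosen from the production run of interest.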

  17. Illegitimate recombination: An efficient method for random mutagenesis in Mycobacterium avium subsp. hominissuis

    PubMed Central

    2012-01-01

    Background The genus Mycobacterium (M.) comprises highly pathogenic bacteria such as M. tuberculosis as well as environmental opportunistic bacteria called non-tuberculous mycobacteria (NTM). While the incidence of tuberculosis is declining in the developed world, infection rates by NTM are increasing. NTM are ubiquitous and have been isolated from soil, natural water sources, tap water, biofilms, aerosols, dust and sawdust. Lung infections as well as lymphadenitis are most often caused by M. avium subsp. hominissuis (MAH), which is considered to be among the clinically most important NTM. Only a few virulence genes from M. avium have been defined, partly owing to difficulties in generating M. avium mutants. More effort in developing new methods for mutagenesis of M. avium and identification of virulence-associated genes is therefore needed. Results We developed a random mutagenesis method based on illegitimate recombination and integration of a Hygromycin-resistance marker. Screening for mutations possibly affecting virulence was performed by monitoring pH resistance, colony morphology, cytokine induction in infected macrophages and intracellular persistence. Of 50 randomly chosen Hygromycin-resistant colonies, four proved to be affected in virulence-related traits. The mutated genes were MAV_4334 (nitroreductase family protein), MAV_5106 (phosphoenolpyruvate carboxykinase), MAV_1778 (GTP-binding protein LepA) and MAV_3128 (lysyl-tRNA synthetase LysS). Conclusions We established a random mutagenesis method for MAH that can be easily carried out and combined it with a set of phenotypic screening methods for the identification of virulence-associated mutants. By this method, four new MAH genes were identified that may be involved in virulence. PMID:22966811

  18. Hyperspectral image clustering method based on artificial bee colony algorithm and Markov random fields

    NASA Astrophysics Data System (ADS)

    Sun, Xu; Yang, Lina; Gao, Lianru; Zhang, Bing; Li, Shanshan; Li, Jun

    2015-01-01

    Center-oriented hyperspectral image clustering methods have been widely applied to hyperspectral remote sensing image processing; however, the drawbacks are obvious, including over-simple computing models and underutilized spatial information. In recent years, some studies have tried to improve this situation. We introduce the artificial bee colony (ABC) and Markov random field (MRF) algorithms to propose an ABC-MRF-cluster model to solve the problems mentioned above. In this model, a typical ABC algorithm framework is adopted, in which cluster centers and the results of the iterated conditional modes (ICM) algorithm are treated as feasible solutions and objective functions, respectively, and MRF is modified to handle the clustering problem. Finally, four datasets and two indices are used to show that the ABC-cluster and ABC-MRF-cluster methods achieve better accuracy than conventional methods. Specifically, the ABC-cluster method is superior in terms of spectral discrimination power, whereas the ABC-MRF-cluster method provides better results in terms of the adjusted Rand index. In experiments on simulated images with different signal-to-noise ratios, both ABC-cluster and ABC-MRF-cluster showed good stability.

  19. Risk Prediction Modeling of Sequencing Data Using a Forward Random Field Method

    PubMed Central

    Wen, Yalu; He, Zihuai; Li, Ming; Lu, Qing

    2016-01-01

    With the advance in high-throughput sequencing technology, it is feasible to investigate the role of common and rare variants in disease risk prediction. While the new technology holds great promise to improve disease prediction, the massive amount of data and low frequency of rare variants pose great analytical challenges on risk prediction modeling. In this paper, we develop a forward random field method (FRF) for risk prediction modeling using sequencing data. In FRF, subjects’ phenotypes are treated as stochastic realizations of a random field on a genetic space formed by subjects’ genotypes, and an individual’s phenotype can be predicted by adjacent subjects with similar genotypes. The FRF method allows for multiple similarity measures and candidate genes in the model, and adaptively chooses the optimal similarity measure and disease-associated genes to reflect the underlying disease model. It also avoids the specification of the threshold of rare variants and allows for different directions and magnitudes of genetic effects. Through simulations, we demonstrate the FRF method attains higher or comparable accuracy over commonly used support vector machine based methods under various disease models. We further illustrate the FRF method with an application to the sequencing data obtained from the Dallas Heart Study. PMID:26892725

  20. A Two-Stage Random Forest-Based Pathway Analysis Method

    PubMed Central

    Chung, Ren-Hua; Chen, Ying-Erh

    2012-01-01

    Pathway analysis provides a powerful approach for identifying the joint effect of genes grouped into biologically-based pathways on disease. Pathway analysis is also an attractive approach for a secondary analysis of genome-wide association study (GWAS) data that may still yield new results from these valuable datasets. Most of the current pathway analysis methods focused on testing the cumulative main effects of genes in a pathway. However, for complex diseases, gene-gene interactions are expected to play a critical role in disease etiology. We extended a random forest-based method for pathway analysis by incorporating a two-stage design. We used simulations to verify that the proposed method has the correct type I error rates. We also used simulations to show that the method is more powerful than the original random forest-based pathway approach and the set-based test implemented in PLINK in the presence of gene-gene interactions. Finally, we applied the method to a breast cancer GWAS dataset and a lung cancer GWAS dataset and interesting pathways were identified that have implications for breast and lung cancers. PMID:22586488

  1. Echocardiographic Methods, Quality Review, and Measurement Accuracy in a Randomized Multicenter Clinical Trial of Marfan Syndrome

    PubMed Central

    Selamet Tierney, Elif Seda; Levine, Jami C.; Chen, Shan; Bradley, Timothy J.; Pearson, Gail D.; Colan, Steven D.; Sleeper, Lynn A.; Campbell, M. Jay; Cohen, Meryl S.; Backer, Julie De; Guey, Lin T.; Heydarian, Haleh; Lai, Wyman W.; Lewin, Mark B.; Marcus, Edward; Mart, Christopher R.; Pignatelli, Ricardo H.; Printz, Beth F.; Sharkey, Angela M.; Shirali, Girish S.; Srivastava, Shubhika; Lacro, Ronald V.

    2013-01-01

    Background The Pediatric Heart Network is conducting a large international randomized trial to compare aortic root growth and other cardiovascular outcomes in 608 subjects with Marfan syndrome randomized to receive atenolol or losartan for 3 years. The authors report here the echocardiographic methods and baseline echocardiographic characteristics of the randomized subjects, describe the interobserver agreement of aortic measurements, and identify factors influencing agreement. Methods Individuals aged 6 months to 25 years who met the original Ghent criteria and had body surface area–adjusted maximum aortic root diameter (ROOTmax) Z scores > 3 were eligible for inclusion. The primary outcome measure for the trial is the change over time in ROOTmax Z score. A detailed echocardiographic protocol was established and implemented across 22 centers, with an extensive training and quality review process. Results Interobserver agreement for the aortic measurements was excellent, with intraclass correlation coefficients ranging from 0.921 to 0.989. Lower interobserver percentage error in ROOTmax measurements was independently associated (model R2 = 0.15) with better image quality (P = .002) and later study reading date (P < .001). Echocardiographic characteristics of the randomized subjects did not differ by treatment arm. Subjects with ROOTmax Z scores ≥ 4.5 (36%) were more likely to have mitral valve prolapse and dilation of the main pulmonary artery and left ventricle, but there were no differences in aortic regurgitation, aortic stiffness indices, mitral regurgitation, or left ventricular function compared with subjects with ROOTmax Z scores < 4.5. Conclusions The echocardiographic methodology, training, and quality review process resulted in a robust evaluation of aortic root dimensions, with excellent reproducibility. PMID:23582510

  2. A new method to model x-ray scattering from random rough surfaces

    NASA Astrophysics Data System (ADS)

    Zhao, Ping; Van Speybroeck, Leon P.

    2003-03-01

    This paper presents a method for modeling the X-ray scattering from random rough surfaces. An actual rough surface is (incompletely) described by its Power Spectral Density (PSD). For a given PSD, model surfaces with the same roughness as the actual surface are constructed by preserving the PSD amplitudes and assigning a random phase to each spectral component. Rays representing the incident wave are reflected from the model surface and projected onto a flat plane, which approximates the model surface, as outgoing rays and corrected for phase delays. The projected outgoing rays are then corrected for wave densities and redistributed onto a uniform grid where the model surface is constructed. The scattering is then calculated by taking the Fast Fourier Transform (FFT) of the resulting distribution. This method is generally applicable and is not limited to small scattering angles. It provides the correct asymmetrical scattering profile for grazing incident radiation. We apply this method to the mirrors of the Chandra X-ray Observatory and show the results. We also expect this method to be useful for other X-ray telescope missions.
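The construction described, keeping the PSD amplitudes and randomizing the phases, can be sketched in one dimension as below (1D for brevity; the same idea extends to 2D mirror surfaces, and the amplitude profile here is an assumed placeholder):

```python
import numpy as np

def surface_from_psd(amplitudes, rng):
    """One model-surface realisation (1D): keep the spectral amplitudes,
    draw a uniformly random phase for each component, invert the FFT."""
    n = len(amplitudes)
    phases = rng.uniform(0.0, 2.0 * np.pi, n)
    half = amplitudes * np.exp(1j * phases)
    # Hermitian-symmetric spectrum (zero DC term) so the surface is real
    spectrum = np.concatenate([[0.0], half, np.conj(half[::-1])])
    return np.fft.ifft(spectrum).real

amps = 1.0 / (1.0 + np.arange(1, 65) ** 2)  # assumed power-law-like PSD
s1 = surface_from_psd(amps, np.random.default_rng(0))
s2 = surface_from_psd(amps, np.random.default_rng(1))
```

Every realisation reproduces the prescribed roughness spectrum exactly, while the random phases give statistically independent surface profiles for the ray-trace step.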

  3. The Wire-Grasping Method as a New Technique for Forceps Biopsy of Biliary Strictures: A Prospective Randomized Controlled Study of Effectiveness

    PubMed Central

    Yamashita, Yasunobu; Ueda, Kazuki; Kawaji, Yuki; Tamura, Takashi; Itonaga, Masahiro; Yoshida, Takeichi; Maeda, Hiroki; Magari, Hirohito; Maekita, Takao; Iguchi, Mikitaka; Tamai, Hideyuki; Ichinose, Masao; Kato, Jun

    2016-01-01

    Background/Aims Transpapillary forceps biopsy is an effective diagnostic technique in patients with biliary stricture. This prospective study aimed to determine the usefulness of the wire-grasping method as a new technique for forceps biopsy. Methods Consecutive patients with biliary stricture or irregularities of the bile duct wall were randomly allocated to either the direct or the wire-grasping method group. In the wire-grasping method, the forceps in the duodenum grasp a guide-wire placed into the bile duct beforehand, and then the forceps are pushed through the papilla without endoscopic sphincterotomy. In the direct method, the forceps are pushed directly into the bile duct alongside a guide-wire. The primary endpoint was the success rate of obtaining specimens suitable for adequate pathological examination. Results In total, 32 patients were enrolled, and 28 (14 in each group) were eligible for analysis. The success rate was significantly higher with the wire-grasping method than with the direct method (100% vs 50%, p=0.016). Sensitivity and accuracy for the diagnosis of cancer were comparable between the two methods in patients with successful procurement of biopsy specimens (91% vs 83% and 93% vs 86%, respectively). Conclusions The wire-grasping method is useful for diagnosing patients with biliary stricture or irregularities of the bile duct wall. PMID:27021502

  4. Evaluation of Strip Footing Bearing Capacity Built on the Anthropogenic Embankment by Random Finite Element Method

    NASA Astrophysics Data System (ADS)

    Pieczynska-Kozlowska, Joanna

    2014-05-01

    One geotechnical problem in the area of Wroclaw is an anthropogenic embankment layer extending to a depth of 4-5 m, the result of historical activity. In such a case, estimating the bearing capacity of a strip footing can be difficult. The standard solution is to use a deep foundation or to replace the foundation soil; however, both methods generate significant costs. In the present paper the authors focus on the influence of the variability of the anthropogenic embankment on bearing capacity. Soil parameters were defined on the basis of CPT tests and modeled as 2D anisotropic random fields, and the bearing capacity was evaluated with a deterministic finite element method. Many repetitions over different realizations of the random fields lead to a stable expected value of the bearing capacity. The algorithm used to estimate the bearing capacity of the strip footing was the random finite element method (e.g. [1]). In the traditional approach to bearing capacity, the formula proposed by [2] is used: qf = c'Nc + qNq + 0.5γBNγ (1) where qf is the ultimate bearing stress, c' is the cohesion, q is the overburden load due to foundation embedment, γ is the soil unit weight, B is the footing width, and Nc, Nq and Nγ are the bearing capacity factors. The finite element evaluation of the bearing capacity of a strip footing involves five parameters: Young's modulus (E), Poisson's ratio (ν), dilation angle (ψ), cohesion (c), and friction angle (φ). In the present study E, ν and ψ are held constant while c and φ are randomized. Although Young's modulus does not affect the bearing capacity, it governs the initial elastic response of the soil. Plastic stress redistribution is accomplished using a viscoplastic algorithm merged with an elastic, perfectly plastic (Mohr-Coulomb) failure criterion. In this paper a typical finite element mesh was assumed, with 8-node elements in 50 columns and 20 rows. Footings width B
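Equation (1) with standard closed-form bearing capacity factors, and a crude Monte Carlo expectation over randomized c and φ, can be sketched as below. The Reissner/Prandtl/Vesic factor forms and the distribution parameters are illustrative assumptions standing in for the paper's spatially correlated random fields:

```python
import math
import random

def bearing_factors(phi_deg):
    """Closed-form factors: Reissner Nq, Prandtl Nc, Vesic N_gamma."""
    phi = math.radians(phi_deg)
    Nq = math.exp(math.pi * math.tan(phi)) * math.tan(math.pi / 4 + phi / 2) ** 2
    Nc = (Nq - 1.0) / math.tan(phi)
    Ngamma = 2.0 * (Nq + 1.0) * math.tan(phi)
    return Nc, Nq, Ngamma

def qf(c, phi_deg, q, gamma, B):
    """Ultimate bearing stress, Eq. (1): qf = c Nc + q Nq + 0.5 gamma B Ngamma."""
    Nc, Nq, Ngamma = bearing_factors(phi_deg)
    return c * Nc + q * Nq + 0.5 * gamma * B * Ngamma

# Monte Carlo expectation with randomized c (lognormal) and phi (normal),
# a point-statistics stand-in for the 2D anisotropic random fields in RFEM
rng = random.Random(0)
vals = [qf(c=rng.lognormvariate(math.log(10.0), 0.3),
           phi_deg=rng.gauss(28.0, 2.0),
           q=18.0, gamma=18.0, B=1.5) for _ in range(5000)]
mean_qf = sum(vals) / len(vals)
```

In the actual random finite element method the closed-form factors are replaced by a full elasto-plastic FE analysis per realization, but the outer averaging loop has the same shape.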

  5. A simple combinatorial method to describe particle retention time in random media with applications in chromatography

    NASA Astrophysics Data System (ADS)

    da Silva, Roberto; Lamb, Luis C.; Lima, Eder C.; Dupont, Jairton

    2012-01-01

    We propose a foundational model to explain properties of the retention time distribution of particle transport in a random medium. These particles are captured and released by distributed theoretical plates in a random medium, as in standard chromatography. Our approach differs from current models since it is not based on simple random walks but on a directed and coordinated movement of the particles, whose retention time dispersion in the column is due to the time the particles spend imprisoned in the theoretical plates. Given a pair of fundamental parameters (λc, λe), the capture and release probabilities, we use simple combinatorial methods to predict the probability distribution of the retention times. We have analyzed several distributions typically used in chromatographic peak fits. We show that a log-normal distribution with only two parameters describes typical experimental chromatographic distributions with high accuracy. This distribution shows a better fit than distributions with a larger number of parameters, possibly allowing for better control of experimental data.
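The two-parameter log-normal peak model can be fitted by maximum likelihood in a few lines; the synthetic retention times below stand in for a measured chromatographic peak:

```python
import math
import random
import statistics

def lognormal_fit(times):
    """MLE of a two-parameter log-normal distribution:
    the mean and standard deviation of log(t)."""
    logs = [math.log(t) for t in times]
    return statistics.fmean(logs), statistics.pstdev(logs)

# Synthetic retention times drawn from a known log-normal for illustration
rng = random.Random(3)
times = [rng.lognormvariate(2.0, 0.25) for _ in range(20_000)]
mu, sigma = lognormal_fit(times)
```

Fitting real peak data the same way and inspecting the residuals is the practical test of the paper's claim that two parameters suffice.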

  6. Implementation of the finite amplitude method for the relativistic quasiparticle random-phase approximation

    NASA Astrophysics Data System (ADS)

    Nikšić, T.; Kralj, N.; Tutiš, T.; Vretenar, D.; Ring, P.

    2013-10-01

    A new implementation of the finite amplitude method (FAM) for the solution of the relativistic quasiparticle random-phase approximation (RQRPA) is presented, based on the relativistic Hartree-Bogoliubov (RHB) model for deformed nuclei. The numerical accuracy and stability of the FAM-RQRPA is tested in a calculation of the monopole response of 22O. As an illustrative example, the model is applied to a study of the evolution of monopole strength in the chain of Sm isotopes, including the splitting of the giant monopole resonance in axially deformed systems.

  7. New high resolution Random Telegraph Noise (RTN) characterization method for resistive RAM

    NASA Astrophysics Data System (ADS)

    Maestro, M.; Diaz, J.; Crespo-Yepes, A.; Gonzalez, M. B.; Martin-Martinez, J.; Rodriguez, R.; Nafria, M.; Campabadal, F.; Aymerich, X.

    2016-01-01

    Random Telegraph Noise (RTN) is one of the main reliability problems of resistive switching-based memories. To understand the physics behind RTN, a complete and accurate RTN characterization is required. The standard equipment used to analyse RTN has a typical time resolution of ∼2 ms which prevents evaluating fast phenomena. In this work, a new RTN measurement procedure, which increases the measurement time resolution to 2 μs, is proposed. The experimental set-up, together with the recently proposed Weighted Time Lag (W-LT) method for the analysis of RTN signals, allows obtaining a more detailed and precise information about the RTN phenomenon.

  8. Recommended Minimum Test Requirements and Test Methods for Assessing Durability of Random-Glass-Fiber Composites

    SciTech Connect

    Battiste, R.L.; Corum, J.M.; Ren, W.; Ruggles, M.B.

    1999-06-01

    This report provides recommended minimum test requirements and suggested test methods for establishing the durability properties and characteristics of candidate random-glass-fiber polymeric composites for automotive structural applications. The recommendations and suggestions are based on experience and results developed at Oak Ridge National Laboratory (ORNL) under a US Department of Energy Advanced Automotive Materials project entitled ''Durability of Lightweight Composite Structures,'' which is closely coordinated with the Automotive Composites Consortium. The report is intended as an aid to suppliers offering new structural composites for automotive applications and to testing organizations that are called on to characterize the composites.

  9. Statistical Evaluation and Improvement of Methods for Combining Random and Harmonic Loads

    NASA Technical Reports Server (NTRS)

    Brown, A. M.; McGhee, D. S.

    2003-01-01

    Structures in many environments experience both random and harmonic excitation. A variety of closed-form techniques has been used in the aerospace industry to combine the loads resulting from the two sources. The resulting combined loads are then used to design for both yield/ultimate strength and high-cycle fatigue capability. This Technical Publication examines the cumulative distribution percentiles obtained using each method by integrating the joint probability density function of the sine and random components. A new Microsoft Excel spreadsheet macro that links with the software program Mathematica to calculate the combined value corresponding to any desired percentile is then presented along with a curve fit to this value. Another Excel macro that calculates the combination using Monte Carlo simulation is shown. Unlike the traditional techniques, these methods quantify the calculated load value with a consistent percentile. Using either of the presented methods can be extremely valuable in probabilistic design, which requires a statistical characterization of the loading. Additionally, since the CDF at high probability levels is very flat, the design value is extremely sensitive to the predetermined percentile; therefore, applying the new techniques can substantially lower the design loading without losing any of the identified structural reliability.
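A Monte Carlo combination of a harmonic load at uniformly random phase with a zero-mean Gaussian random load can be sketched as follows; the percentile choices and sample count are illustrative, not the publication's macro:

```python
import math
import random

def combined_load_percentile(sine_amp, rand_sigma, pct, n=200_000, seed=0):
    """Percentile of X = A sin(theta) + G, with theta ~ U(0, 2 pi) and
    G ~ N(0, sigma): the Monte Carlo route to a combined design load
    carrying a consistent, known percentile."""
    rng = random.Random(seed)
    samples = sorted(
        sine_amp * math.sin(rng.uniform(0.0, 2.0 * math.pi))
        + rng.gauss(0.0, rand_sigma)
        for _ in range(n)
    )
    return samples[min(n - 1, int(pct / 100.0 * n))]

p9987 = combined_load_percentile(sine_amp=1.0, rand_sigma=1.0, pct=99.87)
gauss_only = combined_load_percentile(sine_amp=0.0, rand_sigma=1.0, pct=97.72)
```

Because every returned value carries an explicit percentile, two combination rules can be compared at the same reliability level rather than by their nominal "peak plus 3-sigma" labels.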

  10. Statistical Comparison and Improvement of Methods for Combining Random and Harmonic Loads

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; McGhee, David S.

    2004-01-01

    Structures in many environments experience both random and harmonic excitation. A variety of closed-form techniques has been used in the aerospace industry to combine the loads resulting from the two sources. The resulting combined loads are then used to design for both yield/ultimate strength and high-cycle fatigue capability. This paper examines the cumulative distribution function (CDF) percentiles obtained using each method by integrating the joint probability density function of the sine and random components. A new Microsoft Excel spreadsheet macro that links with the software program Mathematica is then used to calculate the combined value corresponding to any desired percentile, along with a curve fit to this value. Another Excel macro is used to calculate the combination using a Monte Carlo simulation. Unlike the traditional techniques, these methods quantify the calculated load value with a consistent percentile. Using either of the presented methods can be extremely valuable in probabilistic design, which requires a statistical characterization of the loading. Also, since the CDF at high probability levels is very flat, the design value is extremely sensitive to the predetermined percentile; therefore, applying the new techniques can lower the design loading substantially without losing any of the identified structural reliability.

  11. Statistical Evaluation and Improvement of Methods for Combining Random and Harmonic Loads

    NASA Astrophysics Data System (ADS)

    Brown, A. M.; McGhee, D. S.

    2003-02-01

    Structures in many environments experience both random and harmonic excitation. A variety of closed-form techniques has been used in the aerospace industry to combine the loads resulting from the two sources. The resulting combined loads are then used to design for both yield/ultimate strength and high-cycle fatigue capability. This Technical Publication examines the cumulative distribution percentiles obtained using each method by integrating the joint probability density function of the sine and random components. A new Microsoft Excel spreadsheet macro that links with the software program Mathematica to calculate the combined value corresponding to any desired percentile is then presented along with a curve fit to this value. Another Excel macro that calculates the combination using Monte Carlo simulation is shown. Unlike the traditional techniques, these methods quantify the calculated load value with a consistent percentile. Using either of the presented methods can be extremely valuable in probabilistic design, which requires a statistical characterization of the loading. Additionally, since the CDF at high probability levels is very flat, the design value is extremely sensitive to the predetermined percentile; therefore, applying the new techniques can substantially lower the design loading without losing any of the identified structural reliability.

  12. A Novel Hepatocellular Carcinoma Image Classification Method Based on Voting Ranking Random Forests

    PubMed Central

    Xia, Bingbing; Jiang, Huiyan; Liu, Huiling; Yi, Dehui

    2016-01-01

    This paper proposed a novel voting ranking random forests (VRRF) method for solving the hepatocellular carcinoma (HCC) image classification problem. Firstly, in the preprocessing stage, this paper used bilateral filtering for hematoxylin-eosin (HE) pathological images. Next, this paper segmented the bilateral filtering processed image and got three different kinds of images, which include single binary cell image, single minimum exterior rectangle cell image, and single cell image with a size of n⁎n. After that, this paper defined atypia features which include auxiliary circularity, amendment circularity, and cell symmetry. Besides, this paper extracted some shape features, fractal dimension features, and several gray features like the Local Binary Patterns (LBP) feature, Gray Level Cooccurrence Matrix (GLCM) feature, and Tamura features. Finally, this paper proposed an HCC image classification model based on random forests and further optimized the model by the voting ranking method. The experiment results showed that the proposed features combined with the VRRF method have good performance in the HCC image classification problem. PMID:27293477

  13. Is a vegetarian diet adequate for children?

    PubMed

    Hackett, A; Nathan, I; Burgess, L

    1998-01-01

    The number of people who avoid eating meat is growing, especially among young people. Benefits to health from a vegetarian diet have been reported in adults, but it is not clear to what extent these benefits are due to diet or to other aspects of lifestyle. In children, concern has been expressed about the adequacy of vegetarian diets, especially with regard to growth. The risks/benefits seem to be related to the degree of restriction of the diet; anaemia is probably both the main and the most serious risk, but this also applies to omnivores. Vegan diets are more likely to be associated with malnutrition, especially if the diets are the result of authoritarian dogma. Overall, lacto-ovo-vegetarian children consume diets closer to recommendations than omnivores, and their pre-pubertal growth is at least as good. The simplest strategy when becoming vegetarian may involve reliance on vegetarian convenience foods, which are not necessarily superior in nutritional composition. The vegetarian sector of the food industry could do more to produce foods closer to recommendations. Vegetarian diets can be, but are not necessarily, adequate for children, providing vigilance is maintained, particularly to ensure variety. Identical comments apply to omnivorous diets. Three threats to the diets of children are too much reliance on convenience foods, lack of variety and lack of exercise. PMID:9670174

  14. Color computer-generated hologram generation using the random phase-free method and color space conversion.

    PubMed

    Shimobaba, Tomoyoshi; Makowski, Michał; Nagahama, Yuki; Endo, Yutaka; Hirayama, Ryuji; Hiyama, Daisuke; Hasegawa, Satoki; Sano, Marie; Kakue, Takashi; Oikawa, Minoru; Sugie, Takashige; Takada, Naoki; Ito, Tomoyoshi

    2016-05-20

    We propose two calculation methods for generating color computer-generated holograms (CGHs) with the random phase-free method and color space conversion in order to improve the image quality and accelerate the calculation. The random phase-free method improves the image quality in monochrome CGH, but it has not previously been applied to color CGH. We first aimed to improve the image quality of color CGH using the random phase-free method and then to accelerate the color CGH generation with a combination of the random phase-free method and color space conversion method, which accelerates the color CGH calculation due to down-sampling of the color components converted by color space conversion. To overcome the problem of image quality degradation that occurs due to the down-sampling of random phases, the combination of the random phase-free method and color space conversion method improves the quality of reconstructed images and accelerates the color CGH calculation. We demonstrated the effectiveness of the proposed method in simulation, and in this paper discuss its application to lensless zoomable holographic projection. PMID:27411145
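The down-sampling idea can be illustrated with a standard luma/chroma transform. A sketch assuming BT.601-style RGB-to-YCbCr analog weights (the abstract does not name the exact color space used), where the chroma planes are kept at half resolution:

```python
import numpy as np

rng = np.random.default_rng(5)
rgb = rng.random((4, 4, 3))   # tiny hypothetical RGB target image

# RGB -> YCbCr (BT.601 analog weights); the chroma planes carry less
# perceptual detail, so they can be computed at reduced resolution.
y  = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
cb = -0.168736 * rgb[..., 0] - 0.331264 * rgb[..., 1] + 0.5 * rgb[..., 2]
cr = 0.5 * rgb[..., 0] - 0.418688 * rgb[..., 1] - 0.081312 * rgb[..., 2]

cb_ds, cr_ds = cb[::2, ::2], cr[::2, ::2]   # 4:2:0-style chroma down-sampling
print(y.shape, cb_ds.shape, cr_ds.shape)
```

Computing a hologram for the two quarter-size chroma planes instead of two full RGB planes is the source of the speed-up the abstract mentions.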

  15. Methods of learning in statistical education: Design and analysis of a randomized trial

    NASA Astrophysics Data System (ADS)

    Boyd, Felicity Turner

    Background. Recent psychological and technological advances suggest that active learning may enhance understanding and retention of statistical principles. A randomized trial was designed to evaluate the addition of innovative instructional methods within didactic biostatistics courses for public health professionals. Aims. The primary objectives were to evaluate and compare the addition of two active learning methods (cooperative and internet) on students' performance; assess their impact on performance after adjusting for differences in students' learning style; and examine the influence of learning style on trial participation. Methods. Consenting students enrolled in a graduate introductory biostatistics course were randomized to cooperative learning, internet learning, or control after completing a pretest survey. The cooperative learning group participated in eight small group active learning sessions on key statistical concepts, while the internet learning group accessed interactive mini-applications on the same concepts. Controls received no intervention. Students completed evaluations after each session and a post-test survey. Study outcome was performance quantified by examination scores. Intervention effects were analyzed by generalized linear models using intent-to-treat analysis and marginal structural models accounting for reported participation. Results. Of 376 enrolled students, 265 (70%) consented to randomization; 69, 100, and 96 students were randomized to the cooperative, internet, and control groups, respectively. Intent-to-treat analysis showed no differences between study groups; however, 51% of students in the intervention groups had dropped out after the second session. After accounting for reported participation, expected examination scores were 2.6 points higher (of 100 points) after completing one cooperative learning session (95% CI: 0.3, 4.9) and 2.4 points higher after one internet learning session (95% CI: 0.0, 4.7), versus

  16. A blind image detection method for information hiding with double random-phase encoding

    NASA Astrophysics Data System (ADS)

    Sheng, Yuan; Xin, Zhou; Jian-guo, Chen; Yong-liang, Xiao; Qiang, Liu

    2009-07-01

    In this paper, a blind image detection method based on a statistical hypothesis test for information hiding with double random-phase encoding (DRPE) is proposed. This method aims to establish a quantitative criterion which is used to judge whether there is secret information embedded in the detected image. The main process can be described as follows: at the beginning, we decompose the detected gray-scale image into 8 bit planes considering it has 256 gray levels, and suppose that a secret image has been hidden in the detected image after it was encrypted by DRPE, thus the lower bit planes of the detected image exhibit strong randomness. Then, we divide the bit plane to be tested into many windows, and establish a statistical variable to measure the relativity between pixels in every window. Finally, we judge whether the secret image exists in the detected image by applying the t-test to all statistical variables. Numerical simulation shows that the accuracy is quite satisfactory when we need to distinguish the images carrying secret information from a large amount of images.
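A toy version of this pipeline can be sketched in a few lines. The window statistic below (the agreement rate between horizontally adjacent least-significant-bit pixels) and the image constructions are illustrative assumptions, not the paper's exact statistic:

```python
import numpy as np
from scipy import stats

def lsb_window_stats(img, win=32):
    """Per-window fraction of horizontally adjacent LSB pairs that agree."""
    lsb = img & 1
    h, w = lsb.shape
    vals = []
    for i in range(0, h - win + 1, win):
        for j in range(0, w - win + 1, win):
            blk = lsb[i:i + win, j:j + win]
            vals.append(np.mean(blk[:, :-1] == blk[:, 1:]))
    return np.array(vals)

rng = np.random.default_rng(1)
# A DRPE-encrypted embed makes the low bit plane look random (stand-in below);
# an ordinary image has a structured, correlated LSB plane.
noisy = rng.integers(0, 256, (256, 256), dtype=np.uint8)
smooth = (np.add.outer(np.arange(256), np.arange(256)) // 8).astype(np.uint8)

s_noisy = lsb_window_stats(noisy)
s_smooth = lsb_window_stats(smooth)

# Under the "random LSB" hypothesis the agreement rate is 0.5; a t-test
# against 0.5 flags bit planes whose statistics are consistent with randomness.
t, p = stats.ttest_1samp(s_noisy, 0.5)
print(round(float(s_noisy.mean()), 2), round(float(s_smooth.mean()), 2))
```

The structured image scores far from 0.5 while the randomized plane sits right at it, which is the separation the hypothesis test exploits.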

  17. The simulation of the viscous flow around a cylinder by the random vortex method

    NASA Astrophysics Data System (ADS)

    Tiemroth, Erik Charles

    The ability to calculate the loads on cylindrical members is of great importance in offshore engineering. The random vortex method (RVM) is applied to simulate steady, uniform incident flow over a cylinder and the flow about a cylinder in a free surface wave field. Tent functions are used to model the vortex sheets at the boundary and a Rankine core function is used for the vortex blobs. Second-order accurate time integration and random walk diffusion algorithms are used. One of the most difficult problems with the RVM is the computational requirements of the convection calculations. The exact solution to this problem involves O(N^2) calculations, where N is the number of vortex blobs. A highly accurate method based on series expansion is introduced that is capable of reducing the computational requirements to O(N^1.4). Simulations for the case of steady, uniform incident flow were made for Reynolds numbers of 4,000 to 95,000. Comparison was made with experiments and finite difference calculations for the case of Reynolds number 9,500. The observed flow structures agreed well with the simulated flow. Simulations of the wave flows were also successful. The predicted forces agreed with experimental results within 10 to 20 percent for a range of incident wave lengths and amplitudes. The simulated vorticity fields provided an interpretation of the typical shapes seen in experimentally derived force curves, in which the various regions could be associated with the degree of vorticity formation and shedding.
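The O(N^2) convection step that motivates the series-expansion acceleration is just a direct pairwise induction sum. Here is a sketch assuming a complex-plane representation of the blobs and a Rankine core of hypothetical radius `delta` (the thesis's exact discretization will differ):

```python
import numpy as np

def velocities(z, g, delta=0.05):
    """Direct O(N^2) convection: Biot-Savart sum with a Rankine core."""
    dz = z[:, None] - z[None, :]
    r_eff = np.maximum(np.abs(dz), delta)   # solid-body rotation inside the core
    M = np.conj(dz) / (2j * np.pi * r_eff**2)
    np.fill_diagonal(M, 0.0)                # a blob induces no velocity on itself
    return np.conj(M @ g)                   # complex velocity u + i*v at each blob

# Sanity check: two unit-circulation blobs one unit apart co-rotate
# with the classical point-vortex speed of 1/(2*pi).
z = np.array([0.0 + 0.0j, 1.0 + 0.0j])
g = np.array([1.0, 1.0])
v = velocities(z, g)
print(np.allclose(np.abs(v), 1.0 / (2.0 * np.pi)))
```

Every blob interacts with every other blob in `M`, which is exactly the N-squared cost a fast-summation scheme amortizes.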

  18. Impedance measurement using a two-microphone, random-excitation method

    NASA Technical Reports Server (NTRS)

    Seybert, A. F.; Parrott, T. L.

    1978-01-01

    The feasibility of using a two-microphone, random-excitation technique for the measurement of acoustic impedance was studied. Equations were developed, including the effect of mean flow, which show that acoustic impedance is related to the pressure ratio and phase difference between two points in a duct carrying plane waves only. The impedances of a honeycomb ceramic specimen and a Helmholtz resonator were measured and compared with impedances obtained using the conventional standing-wave method. Agreement between the two methods was generally good. A sensitivity analysis was performed to pinpoint possible error sources and recommendations were made for future study. The two-microphone approach evaluated in this study appears to have some advantages over other impedance measuring techniques.
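The underlying plane-wave relation (impedance from the complex pressure ratio between two duct positions) can be checked on synthetic data. The frequency, microphone spacing, and reflection coefficient below are illustrative values, not those of the report:

```python
import numpy as np

k = 2.0 * np.pi * 500.0 / 343.0   # wavenumber at 500 Hz in air (illustrative)
s = 0.05                          # microphone spacing in metres (illustrative)
R_true = 0.6 * np.exp(0.3j)       # assumed complex reflection coefficient

# Plane-wave field p(x) = exp(-jkx) + R*exp(+jkx); mics at x = 0 and x = s.
p1 = 1.0 + R_true
p2 = np.exp(-1j * k * s) + R_true * np.exp(1j * k * s)
H = p2 / p1                       # the measured two-microphone transfer function

# Invert the plane-wave relation for R, then form the normalized impedance.
R = (H - np.exp(-1j * k * s)) / (np.exp(1j * k * s) - H)
Z = (1.0 + R) / (1.0 - R)
print(np.allclose(R, R_true))
```

In practice `H` would be estimated from cross- and auto-spectra of the two microphone signals under random excitation rather than constructed analytically.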

  19. Identifying protein interaction subnetworks by a bagging Markov random field-based method

    PubMed Central

    Chen, Li; Xuan, Jianhua; Riggins, Rebecca B.; Wang, Yue; Clarke, Robert

    2013-01-01

    Identification of differentially expressed subnetworks from protein–protein interaction (PPI) networks has become increasingly important to our global understanding of the molecular mechanisms that drive cancer. Several methods have been proposed for PPI subnetwork identification, but the dependency among network member genes is not explicitly considered, leaving many important hub genes largely unidentified. We present a new method, based on a bagging Markov random field (BMRF) framework, to improve subnetwork identification for mechanistic studies of breast cancer. The method follows a maximum a posteriori principle to form a novel network score that explicitly considers pairwise gene interactions in PPI networks, and it searches for subnetworks with maximal network scores. To improve their robustness across data sets, a bagging scheme based on bootstrapping samples is implemented to statistically select high confidence subnetworks. We first compared the BMRF-based method with existing methods on simulation data to demonstrate its improved performance. We then applied our method to breast cancer data to identify PPI subnetworks associated with breast cancer progression and/or tamoxifen resistance. The experimental results show that not only an improved prediction performance can be achieved by the BMRF approach when tested on independent data sets, but biologically meaningful subnetworks can also be revealed that are relevant to breast cancer and tamoxifen resistance. PMID:23161673

  20. Multi-level Monte Carlo finite volume methods for uncertainty quantification of acoustic wave propagation in random heterogeneous layered medium

    NASA Astrophysics Data System (ADS)

    Mishra, S.; Schwab, Ch.; Šukys, J.

    2016-05-01

    We consider the very challenging problem of efficient uncertainty quantification for acoustic wave propagation in a highly heterogeneous, possibly layered, random medium, characterized by possibly anisotropic, piecewise log-exponentially distributed Gaussian random fields. A multi-level Monte Carlo finite volume method is proposed, along with a novel, bias-free upscaling technique that allows to represent the input random fields, generated using spectral FFT methods, efficiently. Combined together with a recently developed dynamic load balancing algorithm that scales to massively parallel computing architectures, the proposed method is able to robustly compute uncertainty for highly realistic random subsurface formations that can contain a very high number (millions) of sources of uncertainty. Numerical experiments, in both two and three space dimensions, illustrating the efficiency of the method are presented.

  1. Propensity score methods for estimating relative risks in cluster randomized trials with low-incidence binary outcomes and selection bias.

    PubMed

    Leyrat, Clémence; Caille, Agnès; Donner, Allan; Giraudeau, Bruno

    2014-09-10

    Despite randomization, selection bias may occur in cluster randomized trials. Classical multivariable regression usually allows for adjusting treatment effect estimates with unbalanced covariates. However, for binary outcomes with low incidence, such a method may fail because of separation problems. This simulation study focused on the performance of propensity score (PS)-based methods to estimate relative risks from cluster randomized trials with binary outcomes with low incidence. The results suggested that among the different approaches used (multivariable regression, direct adjustment on PS, inverse weighting on PS, and stratification on PS), only direct adjustment on the PS fully corrected the bias and moreover had the best statistical properties. PMID:24771662
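One of the compared PS approaches (inverse probability weighting) is easy to sketch on simulated data with selection bias. The data-generating model and coefficients below are invented for illustration, and the cluster structure is omitted for brevity:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n = 5000
x = rng.normal(size=n)                                 # baseline covariate
t = rng.binomial(1, 1.0 / (1.0 + np.exp(-0.5 * x)))    # treatment depends on x
p_y = np.clip(0.05 * np.exp(0.7 * t + 0.4 * x), 0, 1)  # true RR = exp(0.7) ~ 2.0
y = rng.binomial(1, p_y)

# Fit the propensity score e(x) = P(T=1|x) by maximum likelihood (logistic).
def nll(b):
    z = b[0] + b[1] * x
    return np.sum(np.log1p(np.exp(z)) - t * z)

b = minimize(nll, np.zeros(2), method="Nelder-Mead").x
e = 1.0 / (1.0 + np.exp(-(b[0] + b[1] * x)))

# Inverse-probability weighting: reweight each arm to the full population,
# then take the ratio of weighted event rates as the relative risk.
w = t / e + (1 - t) / (1 - e)
rr = (np.sum(w * t * y) / np.sum(w * t)) / (np.sum(w * (1 - t) * y) / np.sum(w * (1 - t)))
print(round(float(rr), 2))
```

Note the study found direct adjustment on the PS, not weighting, to have the best properties in the low-incidence setting; this sketch only shows the general PS machinery.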

  2. Statistical methods for efficient design of community surveys of response to noise: Random coefficients regression models

    NASA Technical Reports Server (NTRS)

    Tomberlin, T. J.

    1985-01-01

    Research studies of residents' responses to noise consist of interviews with samples of individuals who are drawn from a number of different compact study areas. The statistical techniques developed provide a basis for those sample design decisions. These techniques are suitable for a wide range of sample survey applications. A sample may consist of a random sample of residents selected from a sample of compact study areas, or in a more complex design, of a sample of residents selected from a sample of larger areas (e.g., cities). The techniques may be applied to estimates of the effects on annoyance of noise level, numbers of noise events, the time-of-day of the events, ambient noise levels, or other factors. Methods are provided for determining, in advance, how accurately these effects can be estimated for different sample sizes and study designs. Using a simple cost function, they also provide for optimum allocation of the sample across the stages of the design for estimating these effects. These techniques are developed via a regression model in which the regression coefficients are assumed to be random, with components of variance associated with the various stages of a multi-stage sample design.
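The key quantity behind such sample-design decisions is how the variance of an area-level mean splits into between-area and within-area components. A small simulation sketch of a random-intercept model, with invented variance components:

```python
import numpy as np

rng = np.random.default_rng(7)
n_areas, n_resid = 200, 25   # study areas and residents per area (invented)
sigma_b, sigma_e = 2.0, 5.0  # between-area and within-area SDs (invented)

# Random-intercept model: each area's mean is shifted by its own random effect.
b = rng.normal(0.0, sigma_b, n_areas)
y = b[:, None] + rng.normal(0.0, sigma_e, (n_areas, n_resid))

# Variance of area means = sigma_b**2 + sigma_e**2 / n_resid (theory: 4 + 1 = 5);
# this decomposition is what drives the optimal allocation across stages.
var_means = float(y.mean(axis=1).var(ddof=1))
print(round(var_means, 1))
```

Adding residents within an area only shrinks the second term, so past a point the budget is better spent on more areas, which is the flavor of trade-off the techniques formalize.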

  3. Skeletonization of the internal thoracic artery: a randomized comparison of harvesting methods.

    PubMed

    Urso, Stefano; Alvarez, Luis; Sádaba, Rafael; Greco, Ernesto

    2008-02-01

    We performed a randomized study to compare internal thoracic artery (ITA) flow response to two harvesting methods used in the skeletonization procedure: ultrasonic scalpel and bipolar electrocautery. Sixty patients scheduled for CABG were randomized to receive either ultrasonically (n=30 patients) or electrocautery (n=30 patients) skeletonized ITAs. Intraoperative ITA graft mean flows were obtained with a transit-time flowmeter. ITA flows were evaluated at the beginning (Time 1) and at the end (Time 2) of the harvesting procedure. Post-cardiopulmonary bypass (CPB) flow measurement (Time 3) was obtained in the ITA grafts anastomosed to the left anterior descending artery. Intraoperative mean flow decreased significantly within ultrasonic group (Group U) and electrocautery group (Group E) at the end of the harvesting procedure (P<0.0001 in both cases). Within both groups the final mean flow measured on anastomosed ITAs (Time 3) was significantly higher than the beginning ITA flow value (Time 1). No statistical difference was noted comparing ITA flows between the two groups at any time of evaluation. Skeletonization harvesting of the ITA produces a modification of the mean flow. The quantity and the reversibility of this phenomenon, probably related to vasospasm, are independent from the energy source used in the skeletonization procedure. PMID:17998305

  4. A recursive model-reduction method for approximate inference in Gaussian Markov random fields.

    PubMed

    Johnson, Jason K; Willsky, Alan S

    2008-01-01

    This paper presents recursive cavity modeling--a principled, tractable approach to approximate, near-optimal inference for large Gauss-Markov random fields. The main idea is to subdivide the random field into smaller subfields, constructing cavity models which approximate these subfields. Each cavity model is a concise, yet faithful, model for the surface of one subfield sufficient for near-optimal inference in adjacent subfields. This basic idea leads to a tree-structured algorithm which recursively builds a hierarchy of cavity models during an "upward pass" and then builds a complementary set of blanket models during a reverse "downward pass." The marginal statistics of individual variables can then be approximated using their blanket models. Model thinning plays an important role, allowing us to develop thinned cavity and blanket models thereby providing tractable approximate inference. We develop a maximum-entropy approach that exploits certain tractable representations of Fisher information on thin chordal graphs. Given the resulting set of thinned cavity models, we also develop a fast preconditioner, which provides a simple iterative method to compute optimal estimates. Thus, our overall approach combines recursive inference, variational learning and iterative estimation. We demonstrate the accuracy and scalability of this approach in several challenging, large-scale remote sensing problems. PMID:18229805

  5. A Novel Microaneurysms Detection Method Based on Local Applying of Markov Random Field.

    PubMed

    Ganjee, Razieh; Azmi, Reza; Moghadam, Mohsen Ebrahimi

    2016-03-01

    Diabetic Retinopathy (DR) is one of the most common complications of long-term diabetes. It is a progressive disease and by damaging retina, it finally results in blindness of patients. Since Microaneurysms (MAs) appear as a first sign of DR in retina, early detection of this lesion is an essential step in automatic detection of DR. In this paper, a new MAs detection method is presented. The proposed approach consists of two main steps. In the first step, the MA candidates are detected based on local applying of Markov random field model (MRF). In the second step, these candidate regions are categorized to identify the correct MAs using 23 features based on shape, intensity and Gaussian distribution of MAs intensity. The proposed method is evaluated on DIARETDB1 which is a standard and publicly available database in this field. Evaluation of the proposed method on this database resulted in an average sensitivity of 0.82 for a confidence level of 75 as a ground truth. The results show that our method is able to detect low-contrast MAs against the background while its performance is still comparable to other state of the art approaches. PMID:26779642

  6. THE LOSS OF ACCURACY OF STOCHASTIC COLLOCATION METHOD IN SOLVING NONLINEAR DIFFERENTIAL EQUATIONS WITH RANDOM INPUT DATA

    SciTech Connect

    Webster, Clayton G; Tran, Hoang A; Trenchea, Catalin S

    2013-01-01

    In this paper we show how the stochastic collocation method (SCM) could fail to converge for nonlinear differential equations with random coefficients. First, we consider the Navier-Stokes equation with uncertain viscosity and derive error estimates for stochastic collocation discretization. Our analysis gives some indicators on how the nonlinearity negatively affects the accuracy of the method. The stochastic collocation method is then applied to the noisy Lorenz system. Simulation results demonstrate that the solution of a nonlinear equation can be highly irregular with respect to the random data and in such cases, the stochastic collocation method cannot capture the correct solution.

  7. A generalized genetic random field method for the genetic association analysis of sequencing data.

    PubMed

    Li, Ming; He, Zihuai; Zhang, Min; Zhan, Xiaowei; Wei, Changshuai; Elston, Robert C; Lu, Qing

    2014-04-01

    With the advance of high-throughput sequencing technologies, it has become feasible to investigate the influence of the entire spectrum of sequencing variations on complex human diseases. Although association studies utilizing the new sequencing technologies hold great promise to unravel novel genetic variants, especially rare genetic variants that contribute to human diseases, the statistical analysis of high-dimensional sequencing data remains a challenge. Advanced analytical methods are in great need to facilitate high-dimensional sequencing data analyses. In this article, we propose a generalized genetic random field (GGRF) method for association analyses of sequencing data. Like other similarity-based methods (e.g., SIMreg and SKAT), the new method has the advantages of avoiding the need to specify thresholds for rare variants and allowing for testing multiple variants acting in different directions and magnitude of effects. The method is built on the generalized estimating equation framework and thus accommodates a variety of disease phenotypes (e.g., quantitative and binary phenotypes). Moreover, it has a nice asymptotic property, and can be applied to small-scale sequencing data without need for small-sample adjustment. Through simulations, we demonstrate that the proposed GGRF attains an improved or comparable power over a commonly used method, SKAT, under various disease scenarios, especially when rare variants play a significant role in disease etiology. We further illustrate GGRF with an application to a real dataset from the Dallas Heart Study. By using GGRF, we were able to detect the association of two candidate genes, ANGPTL3 and ANGPTL4, with serum triglyceride. PMID:24482034

  8. A new hierarchical method for inter-patient heartbeat classification using random projections and RR intervals

    PubMed Central

    2014-01-01

    Background The inter-patient classification schema and the Association for the Advancement of Medical Instrumentation (AAMI) standards are important to the construction and evaluation of automated heartbeat classification systems. The majority of previously proposed methods that take the above two aspects into consideration use the same features and classification method to classify different classes of heartbeats. The performance of the classification system is often unsatisfactory with respect to the ventricular ectopic beat (VEB) and supraventricular ectopic beat (SVEB). Methods Based on the different characteristics of VEB and SVEB, a novel hierarchical heartbeat classification system was constructed. This was done in order to improve the classification performance of these two classes of heartbeats by using different features and classification methods. First, random projection and support vector machine (SVM) ensemble were used to detect VEB. Then, the ratio of the RR interval was compared to a predetermined threshold to detect SVEB. The optimal parameters for the classification models were selected on the training set and used in the independent testing set to assess the final performance of the classification system. Meanwhile, the effect of different lead configurations on the classification results was evaluated. Results Results showed that the performance of this classification system was notably superior to that of other methods. The VEB detection sensitivity was 93.9% with a positive predictive value of 90.9%, and the SVEB detection sensitivity was 91.1% with a positive predictive value of 42.2%. In addition, this classification process was relatively fast. Conclusions A hierarchical heartbeat classification system was proposed based on the inter-patient data division to detect VEB and SVEB. It demonstrated better classification performance than existing methods. 
It can be regarded as a promising system for detecting VEB and SVEB of unknown patients in
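The random-projection stage used before the SVM ensemble relies on pairwise distances between feature vectors being approximately preserved. A numpy sketch of that property, with feature dimensions invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 512))             # hypothetical per-beat feature vectors
k = 64                                      # reduced dimension
R = rng.normal(size=(512, k)) / np.sqrt(k)  # Gaussian random projection matrix
Xp = X @ R

# Johnson-Lindenstrauss: pairwise distances survive the projection, so a
# classifier trained in the k-dimensional space sees similar geometry.
d_orig = np.linalg.norm(X[:-1] - X[1:], axis=1)
d_proj = np.linalg.norm(Xp[:-1] - Xp[1:], axis=1)
mean_ratio = float(np.mean(d_proj / d_orig))
print(round(mean_ratio, 2))
```

Each ensemble member can draw its own `R`, giving cheap, diverse views of the same beats, which is one common motivation for pairing random projections with an SVM ensemble.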

  9. Adequate histologic sectioning of prostate needle biopsies.

    PubMed

    Bostwick, David G; Kahane, Hillel

    2013-08-01

    No standard method exists for sampling prostate needle biopsies, although most reports claim to embed 3 cores per block and obtain 3 slices from each block. This study was undertaken to determine the extent of histologic sectioning necessary for optimal examination of prostate biopsies. We prospectively compared the impact on cancer yield of submitting 1 biopsy core per cassette (biopsies from January 2010) with 3 cores per cassette (biopsies from August 2010) from a large national reference laboratory. Between 6 and 12 slices were obtained with the former 1-core method, resulting in 3 to 6 slices being placed on each of 2 slides; for the latter 3-core method, a limit of 6 slices was obtained, resulting in 3 slices being placed on each of 2 slides. A total of 6708 sets of 12 to 18 core biopsies were studied, including 3509 biopsy sets from the 1-biopsy-core-per-cassette group (January 2010) and 3199 biopsy sets from the 3-biopsy-cores-per-cassette group (August 2010). The yield of diagnoses was classified as benign, atypical small acinar proliferation, high-grade prostatic intraepithelial neoplasia, and cancer and was similar with the 2 methods: 46.2%, 8.2%, 4.5%, and 41.1% and 46.7%, 6.3%, 4.4%, and 42.6%, respectively (P = .02). Submission of 1 core or 3 cores per cassette had no effect on the yield of atypical small acinar proliferation, prostatic intraepithelial neoplasia, or cancer in prostate needle biopsies. Consequently, we recommend submission of 3 cores per cassette to minimize labor and cost of processing. PMID:23764163

  10. Non-stationary random vibration analysis of a 3D train-bridge system using the probability density evolution method

    NASA Astrophysics Data System (ADS)

    Yu, Zhi-wu; Mao, Jian-feng; Guo, Feng-qi; Guo, Wei

    2016-03-01

    Rail irregularity is one of the main sources causing train-bridge random vibration. A new random vibration theory for the coupled train-bridge systems is proposed in this paper. First, the number theory method (NTM) with 2N-dimensional vectors for the stochastic harmonic function (SHF) of rail irregularity power spectrum density was adopted to determine the representative points of spatial frequencies and phases to generate the random rail irregularity samples, and the non-stationary rail irregularity samples were modulated with the slowly varying function. Second, the probability density evolution method (PDEM) was employed to calculate the random dynamic vibration of the three-dimensional (3D) train-bridge system by a program compiled on the MATLAB® software platform. Finally, the Newmark-β integration method and double edge difference method of total variation diminishing (TVD) format were adopted to obtain the mean value curve, the standard deviation curve and the time-history probability density information of responses. A case study was presented in which the ICE-3 train travels on a three-span simply-supported high-speed railway bridge with excitation of random rail irregularity. The results showed that compared to the Monte Carlo simulation, the PDEM has higher computational efficiency for the same accuracy, i.e., an improvement by 1-2 orders of magnitude. Additionally, the influences of rail irregularity and train speed on the random vibration of the coupled train-bridge system were discussed.

  11. A Method to Estimate Temporal Interaction in a Conditional Random Field Based Approach for Crop Recognition

    NASA Astrophysics Data System (ADS)

    Diaz, P. M. A.; Feitosa, R. Q.; Sanches, I. D.; Costa, G. A. O. P.

    2016-06-01

    This paper presents a method to estimate the temporal interaction in a Conditional Random Field (CRF) based approach for crop recognition from multitemporal remote sensing image sequences. This approach models the phenology of different crop types as a CRF. Interaction potentials are assumed to depend only on the class labels of an image site at two consecutive epochs. In the proposed method, the estimation of temporal interaction parameters is considered as an optimization problem, whose goal is to find the transition matrix that maximizes the CRF performance, upon a set of labelled data. The objective functions underlying the optimization procedure can be formulated in terms of different accuracy metrics, such as overall and average class accuracy per crop or phenological stages. To validate the proposed approach, experiments were carried out on a dataset consisting of 12 co-registered LANDSAT images of a region in the southeast of Brazil. Pattern Search was used as the optimization algorithm. The experimental results demonstrated that the proposed method was able to substantially outperform estimates related to joint or conditional class transition probabilities, which rely on training samples.

  12. Calculation of the entropy of random coil polymers with the hypothetical scanning Monte Carlo method

    NASA Astrophysics Data System (ADS)

    White, Ronald P.; Meirovitch, Hagai

    2005-12-01

    Hypothetical scanning Monte Carlo (HSMC) is a method for calculating the absolute entropy S and free energy F from a given MC trajectory developed recently and applied to liquid argon, TIP3P water, and peptides. In this paper HSMC is extended to random coil polymers by applying it to self-avoiding walks on a square lattice—a simple but difficult model due to strong excluded volume interactions. With HSMC the probability of a given chain is obtained as a product of transition probabilities calculated for each bond by MC simulations and a counting formula. This probability is exact in the sense that it is based on all the interactions of the system and the only approximation is due to finite sampling. The method provides rigorous upper and lower bounds for F, which can be obtained from a very small sample and even from a single chain conformation. HSMC is independent of existing techniques and thus constitutes an independent research tool. The HSMC results are compared to those obtained by other methods, and its application to complex lattice chain models is discussed; we emphasize its ability to treat any type of boundary conditions for which a reference state (with known free energy) might be difficult to define for a thermodynamic integration process. Finally, we stress that the capability of HSMC to extract the absolute entropy from a given sample is important for studying relaxation processes, such as protein folding.
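To get a feel for transition-probability constructions on self-avoiding walks, here is a simpler cousin of HSMC, classic Rosenbluth sampling, in which the product of per-step continuation counts is an unbiased estimator of the number of SAWs. This is an illustrative stand-in, not the HSMC reconstruction itself, which works backwards from a given MC sample:

```python
import numpy as np

rng = np.random.default_rng(4)

def rosenbluth_weight(n_steps):
    """Grow one self-avoiding walk on the square lattice; return prod(k_i)."""
    pos, visited, w = (0, 0), {(0, 0)}, 1
    for _ in range(n_steps):
        nbrs = [(pos[0] + dx, pos[1] + dy)
                for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if (pos[0] + dx, pos[1] + dy) not in visited]
        if not nbrs:
            return 0                     # walk trapped itself in a dead end
        w *= len(nbrs)                   # k_i free continuations at this step
        pos = nbrs[rng.integers(len(nbrs))]
        visited.add(pos)
    return w

# The mean weight is an unbiased estimate of the number of n-step SAWs
# (exactly 100 for n = 4 on the square lattice), from which the entropy
# of the chain ensemble follows as log of the count.
est = float(np.mean([rosenbluth_weight(4) for _ in range(20000)]))
print(round(est, 1))
```

The excluded-volume interactions that make this model "simple but difficult" show up as the dead ends and shrinking neighbor counts in `nbrs`.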

  13. Effects of Pilates method in elderly people: Systematic review of randomized controlled trials.

    PubMed

    de Oliveira Francisco, Cristina; de Almeida Fagundes, Alessandra; Gorges, Bruna

    2015-07-01

    The Pilates method has been widely used in physical training and rehabilitation. Evidence regarding the effectiveness of this method in elderly people is limited. Six randomized controlled trials involving the use of the Pilates method for elderly people, published prior to December 2013, were selected from the databases PubMed, MEDLINE, Embase, Cochrane, Scielo and PEDro. Three articles suggested that Pilates produced improvements in balance. Two studies evaluated adherence to Pilates programs. One study assessed Pilates' influence on cardio-metabolic parameters and another study evaluated changes in body composition. Strong evidence was found regarding the beneficial effects of Pilates on static and dynamic balance in women. Nevertheless, evidence of balance improvement in both genders, changes in body composition in women and adherence to Pilates programs was limited. Effects on cardio-metabolic parameters due to Pilates training presented inconclusive results. Pilates may be a useful tool in rehabilitation and prevention programs, but more high-quality studies are necessary to establish all the effects on elderly populations. PMID:26118523

  14. Calculation of the entropy of random coil polymers with the hypothetical scanning Monte Carlo method.

    PubMed

    White, Ronald P; Meirovitch, Hagai

    2005-12-01

    Hypothetical scanning Monte Carlo (HSMC) is a method for calculating the absolute entropy S and free energy F from a given MC trajectory developed recently and applied to liquid argon, TIP3P water, and peptides. In this paper HSMC is extended to random coil polymers by applying it to self-avoiding walks on a square lattice--a simple but difficult model due to strong excluded volume interactions. With HSMC the probability of a given chain is obtained as a product of transition probabilities calculated for each bond by MC simulations and a counting formula. This probability is exact in the sense that it is based on all the interactions of the system and the only approximation is due to finite sampling. The method provides rigorous upper and lower bounds for F, which can be obtained from a very small sample and even from a single chain conformation. HSMC is independent of existing techniques and thus constitutes an independent research tool. The HSMC results are compared to those obtained by other methods, and its application to complex lattice chain models is discussed; we emphasize its ability to treat any type of boundary conditions for which a reference state (with known free energy) might be difficult to define for a thermodynamic integration process. Finally, we stress that the capability of HSMC to extract the absolute entropy from a given sample is important for studying relaxation processes, such as protein folding. PMID:16356071

  15. Genetically controlled random search: a global optimization method for continuous multidimensional functions

    NASA Astrophysics Data System (ADS)

    Tsoulos, Ioannis G.; Lagaris, Isaac E.

    2006-01-01

    A new stochastic method for locating the global minimum of a multidimensional function inside a rectangular hyperbox is presented. A sampling technique is employed that makes use of the procedure known as grammatical evolution. The method can be considered as a "genetic" modification of the Controlled Random Search procedure due to Price. The user may code the objective function either in C++ or in Fortran 77. We offer a comparison of the new method with others of similar structure, by presenting results of computational experiments on a set of test functions. Program summary: Title of program: GenPrice Catalogue identifier: ADWP Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADWP Program available from: CPC Program Library, Queen's University of Belfast, N. Ireland Computer for which the program is designed and others on which it has been tested: the tool is designed to be portable to all systems running the GNU C++ compiler Installation: University of Ioannina, Greece Programming language used: GNU-C++, GNU-C, GNU Fortran-77 Memory required to execute with typical data: 200 KB No. of bits in a word: 32 No. of processors used: 1 Has the code been vectorized or parallelized?: no No. of lines in distributed program, including test data, etc.: 13 135 No. of bytes in distributed program, including test data, etc.: 78 512 Distribution format: tar.gz Nature of physical problem: A multitude of problems in science and engineering are often reduced to minimizing a function of many variables. There are instances where a local optimum does not correspond to the desired physical solution and hence the search for a better solution is required. Local optimization techniques are frequently trapped in local minima. Global optimization is hence the appropriate tool. For example, when solving a nonlinear system of equations via optimization, employing a "least squares" type of objective, one may encounter many local minima that do not correspond to solutions, i.e. minima with values greater than zero.
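    Price's Controlled Random Search, which GenPrice modifies, can be sketched as follows. This is plain CRS only, without the grammatical-evolution sampling; the population size and iteration budget are arbitrary choices.

    ```python
    import random

    def crs_minimize(f, bounds, pop_size=50, max_iter=5000, seed=0):
        """Price's Controlled Random Search (plain version): keep a
        population of points, reflect a random simplex point through the
        centroid of the others, and replace the current population worst
        whenever the trial point improves on it."""
        rng = random.Random(seed)
        dim = len(bounds)
        pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
        vals = [f(p) for p in pop]
        for _ in range(max_iter):
            idx = rng.sample(range(pop_size), dim + 1)
            centroid = [sum(pop[i][d] for i in idx[:-1]) / dim for d in range(dim)]
            trial = [2 * centroid[d] - pop[idx[-1]][d] for d in range(dim)]
            if all(lo <= t <= hi for t, (lo, hi) in zip(trial, bounds)):
                worst = max(range(pop_size), key=vals.__getitem__)
                ft = f(trial)
                if ft < vals[worst]:
                    pop[worst], vals[worst] = trial, ft
        best = min(range(pop_size), key=vals.__getitem__)
        return pop[best], vals[best]
    ```

    On a smooth convex test function the population contracts around the global minimum; multimodal functions are where the population-based search earns its keep over a single local descent.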

  16. Study of Electromagnetic Scattering From Material Object Doped Randomly With Thin Metallic Wires Using Finite Element Method

    NASA Technical Reports Server (NTRS)

    Deshpande, Manohar D.

    2005-01-01

    A new numerical simulation method using the finite element methodology (FEM) is presented to study electromagnetic scattering due to an arbitrarily shaped material body doped randomly with thin and short metallic wires. The FEM approach described in many standard text books is appropriately modified to account for the presence of thin and short metallic wires distributed randomly inside an arbitrarily shaped material body. Using this modified FEM approach, the electromagnetic scattering due to cylindrical and spherical material bodies doped randomly with thin metallic wires is studied.

  17. Method for high-volume sequencing of nucleic acids: random and directed priming with libraries of oligonucleotides

    DOEpatents

    Studier, F.W.

    1995-04-18

    Random and directed priming methods for determining nucleotide sequences by enzymatic sequencing techniques, using libraries of primers of lengths 8, 9 or 10 bases, are disclosed. These methods permit direct sequencing of nucleic acids as large as 45,000 base pairs or larger without the necessity for subcloning. Individual primers are used repeatedly to prime sequence reactions in many different nucleic acid molecules. Libraries containing as few as 10,000 octamers, 14,200 nonamers, or 44,000 decamers would have the capacity to determine the sequence of almost any cosmid DNA. Random priming with a fixed set of primers from a smaller library can also be used to initiate the sequencing of individual nucleic acid molecules, with the sequence being completed by directed priming with primers from the library. In contrast to random cloning techniques, a combined random and directed priming strategy is far more efficient. 2 figs.

  18. Method for high-volume sequencing of nucleic acids: random and directed priming with libraries of oligonucleotides

    DOEpatents

    Studier, F. William

    1995-04-18

    Random and directed priming methods for determining nucleotide sequences by enzymatic sequencing techniques, using libraries of primers of lengths 8, 9 or 10 bases, are disclosed. These methods permit direct sequencing of nucleic acids as large as 45,000 base pairs or larger without the necessity for subcloning. Individual primers are used repeatedly to prime sequence reactions in many different nucleic acid molecules. Libraries containing as few as 10,000 octamers, 14,200 nonamers, or 44,000 decamers would have the capacity to determine the sequence of almost any cosmid DNA. Random priming with a fixed set of primers from a smaller library can also be used to initiate the sequencing of individual nucleic acid molecules, with the sequence being completed by directed priming with primers from the library. In contrast to random cloning techniques, a combined random and directed priming strategy is far more efficient.

  19. 21 CFR 314.126 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... conducting clinical investigations of a drug is to distinguish the effect of a drug from other influences... recognized by the scientific community as the essentials of an adequate and well-controlled clinical... randomization and blinding of patients or investigators, or both. If the intent of the trial is to...

  20. A Randomized Clinical Trial of the Health Evaluation and Referral Assistant (HERA): Research Methods

    PubMed Central

    Boudreaux, Edwin D.; Abar, Beau; Baumann, Brigitte M.; Grissom, Grant

    2013-01-01

    The Health Evaluation and Referral Assistant (HERA) is a web-based program designed to facilitate screening, brief intervention, and referral to treatment (SBIRT) for tobacco, alcohol, and drug abuse. After the patient completes a computerized substance abuse assessment, the HERA produces a summary report with evidence-based recommended clinical actions for the healthcare provider (the Healthcare Provider Report) and a report for the patient (the Patient Feedback Report) that provides education regarding the consequences of use, personally tailored motivational messages, and a tailored substance abuse treatment referral list. For those who provide authorization, the HERA faxes the individual’s contact information to a substance abuse treatment provider matched to the individual’s substance use severity and personal characteristics, like insurance and location of residence (dynamic referral). This paper summarizes the methods used for a randomized controlled trial to evaluate the HERA’s efficacy in leading to increased treatment initiation and reduced substance use. The study was performed in four emergency departments. Individual patients were randomized into one of two conditions: the HERA or assessment only. A total of 4,269 patients were screened and 1,006 participants enrolled. The sample was comprised of 427 tobacco users, 212 risky alcohol users, and 367 illicit drug users. Forty-two percent used more than one substance class. The enrolled sample was similar to the eligible patient population. The study should enhance understanding of whether computer-facilitated SBIRT can impact process of care variables, such as promoting substance abuse treatment initiation, as well as its effect on subsequent substance abuse and related outcomes. PMID:23665335

  1. Factors associated with adequate weekly reporting for disease surveillance data among health facilities in Nairobi County, Kenya, 2013

    PubMed Central

    Mwatondo, Athman Juma; Ng'ang'a, Zipporah; Maina, Caroline; Makayotto, Lyndah; Mwangi, Moses; Njeru, Ian; Arvelo, Wences

    2016-01-01

    Introduction Kenya adopted the Integrated Disease Surveillance and Response (IDSR) strategy in 1998 to strengthen disease surveillance and epidemic response. However, the goal of weekly surveillance reporting among health facilities has not been achieved. We conducted a cross-sectional study to determine the prevalence of adequate reporting and factors associated with IDSR reporting among health facilities in one Kenyan County. Methods Health facilities (public and private) were enrolled using stratified random sampling from 348 facilities prioritized for routine surveillance reporting. Adequately reporting facilities were defined as those which submitted >10 weekly reports during a twelve-week period, and poorly reporting facilities were those which submitted <10 weekly reports. Multivariate logistic regression with backward selection was used to identify risk factors associated with adequate reporting. Results From September 2 through November 30, 2013, we enrolled 175 health facilities; 130 (74%) were private and 45 (26%) were public. Of the 175 health facilities, 77 (44%) were classified as adequately reporting and 98 (56%) were reporting poorly. Multivariate analysis identified three factors to be independently associated with adequate weekly reporting: having weekly reporting forms at the visit (AOR 19, 95% CI: 6-65), having posters showing IDSR functions (AOR 8, 95% CI: 2-12), and having a designated surveillance focal person (AOR 7, 95% CI: 2-20). Conclusion The majority of health facilities in Nairobi County were reporting poorly to IDSR, and we recommend that the Ministry of Health provide all health facilities in Nairobi County with weekly reporting tools and offer specific trainings on IDSR, which will help designate a focal surveillance person. PMID:27303581

  2. Upscaling solute transport in naturally fractured porous media with the continuous time random walk method

    NASA Astrophysics Data System (ADS)

    Geiger, S.; Cortis, A.; Birkholzer, J. T.

    2010-12-01

    Solute transport in fractured porous media is typically "non-Fickian"; that is, it is characterized by early breakthrough and long tailing and by nonlinear growth of the Green function-centered second moment. This behavior is due to the effects of (1) multirate diffusion occurring between the highly permeable fracture network and the low-permeability rock matrix, (2) a wide range of advection rates in the fractures and, possibly, the matrix as well, and (3) a range of path lengths. As a consequence, prediction of solute transport processes at the macroscale represents a formidable challenge. Classical dual-porosity (or mobile-immobile) approaches in conjunction with an advection-dispersion equation and macroscopic dispersivity commonly fail to predict breakthrough of fractured porous media accurately. It was recently demonstrated that the continuous time random walk (CTRW) method can be used as a generalized upscaling approach. Here we extend this work and use results from high-resolution finite element-finite volume-based simulations of solute transport in an outcrop analogue of a naturally fractured reservoir to calibrate the CTRW method by extracting a distribution of retention times. This procedure allows us to predict breakthrough at other model locations accurately and to gain significant insight into the nature of the fracture-matrix interaction in naturally fractured porous reservoirs with geologically realistic fracture geometries.
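    The non-Fickian behavior that motivates the CTRW approach can be reproduced with a minimal sketch: unit downstream jumps separated by heavy-tailed (Pareto) waiting times. This illustrative model is not the calibrated CTRW of the study, and the exponent `beta` is an arbitrary choice.

    ```python
    import random

    def ctrw_positions(n_particles=2000, t_max=1000.0, beta=0.6, seed=1):
        """Minimal continuous time random walk: unit jumps downstream
        separated by Pareto-distributed waiting times psi(t) ~ t^(-1-beta),
        0 < beta < 1. Heavy-tailed waits trap particles in place and yield
        the early-breakthrough / long-tail behavior described above."""
        rng = random.Random(seed)
        positions = []
        for _ in range(n_particles):
            t, x = 0.0, 0
            while True:
                wait = rng.paretovariate(beta)   # waiting time >= 1
                if t + wait > t_max:
                    break
                t += wait
                x += 1                           # advection-dominated unit jump
            positions.append(x)
        return positions
    ```

    Because the mean waiting time diverges for beta < 1, the number of jumps grows only like t^beta: the plume's leading edge advances while a long tail of trapped particles lags far behind the mean.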

  3. Upscaling solute transport in naturally fractured porous media with the continuous time random walk method

    SciTech Connect

    Geiger, S.; Cortis, A.; Birkholzer, J.T.

    2010-04-01

    Solute transport in fractured porous media is typically 'non-Fickian'; that is, it is characterized by early breakthrough and long tailing and by nonlinear growth of the Green function-centered second moment. This behavior is due to the effects of (1) multirate diffusion occurring between the highly permeable fracture network and the low-permeability rock matrix, (2) a wide range of advection rates in the fractures and, possibly, the matrix as well, and (3) a range of path lengths. As a consequence, prediction of solute transport processes at the macroscale represents a formidable challenge. Classical dual-porosity (or mobile-immobile) approaches in conjunction with an advection-dispersion equation and macroscopic dispersivity commonly fail to predict breakthrough of fractured porous media accurately. It was recently demonstrated that the continuous time random walk (CTRW) method can be used as a generalized upscaling approach. Here we extend this work and use results from high-resolution finite element-finite volume-based simulations of solute transport in an outcrop analogue of a naturally fractured reservoir to calibrate the CTRW method by extracting a distribution of retention times. This procedure allows us to predict breakthrough at other model locations accurately and to gain significant insight into the nature of the fracture-matrix interaction in naturally fractured porous reservoirs with geologically realistic fracture geometries.

  4. Switching methods in magnetic random access memory for low power applications

    NASA Astrophysics Data System (ADS)

    Guchang, Han; Jiancheng, Huang; Cheow Hin, Sim; Tran, Michael; Sze Ter, Lim

    2015-06-01

    The effect of the saturation magnetization (Ms) of the free layer (FL) on the switching current is analyzed for spin transfer torque (STT) magnetic random access memory (MRAM). For an in-plane FL, the critical switching current (Ic0) decreases as Ms decreases. However, reduction in Ms also results in a low thermal stability factor (Δ), which must be compensated through increasing shape anisotropy, thus limiting scalability. For a perpendicular FL, Ic0 reduction by using low-Ms materials is actually at the expense of data retention. To save energy consumed by STT current, two electric field (EF) controlled switching methods are proposed. Our simulation results show that an elliptical FL can be switched by an EF pulse with a suitable width. However, it is difficult to implement this type of switching in real MRAM devices due to the distribution of the required switching pulse widths. A reliable switching method is to use an Oersted field guided switching. Our simulation and experimental results show that bi-directional magnetization switching could be realized by an EF with an external field as low as ±5 Oe if the offset field could be removed.
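    The trade-off between thermal stability and switching current can be made concrete with common macrospin estimates: Δ = K_eff·V/(k_B·T), and for a perpendicular free layer Ic0 ≈ 4·e·α·k_B·T·Δ/(ħ·η). These textbook-style formulas and the material parameters in the example are illustrative assumptions, not values from the paper.

    ```python
    import math

    E = 1.602e-19      # elementary charge (C)
    HBAR = 1.055e-34   # reduced Planck constant (J s)
    KB = 1.381e-23     # Boltzmann constant (J/K)

    def thermal_stability(k_eff, diameter, thickness, temp=300.0):
        """Delta = K_eff * V / (k_B * T) for a circular free layer."""
        volume = math.pi * (diameter / 2) ** 2 * thickness
        return k_eff * volume / (KB * temp)

    def critical_current(delta, alpha=0.01, eta=0.6, temp=300.0):
        """Macrospin estimate for a perpendicular free layer:
        Ic0 = 4 e alpha k_B T Delta / (hbar eta). Damping alpha and spin
        polarization efficiency eta are illustrative values."""
        return 4 * E * alpha * KB * temp * delta / (HBAR * eta)
    ```

    For a 40 nm diameter, 1.5 nm thick free layer with an assumed K_eff of 2×10^5 J/m^3, this gives Δ ≈ 91 and an Ic0 of a few tens of microamps, showing directly how lowering the anisotropy energy (e.g., via Ms) cuts Ic0 and Δ together.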

  5. Adipose Tissue - Adequate, Accessible Regenerative Material.

    PubMed

    Kolaparthy, Lakshmi Kanth; Sanivarapu, Sahitya; Moogla, Srinivas; Kutcham, Rupa Sruthi

    2015-11-01

    The potential use of stem cell based therapies for the repair and regeneration of various tissues offers a paradigm shift that may provide alternative therapeutic solutions for a number of diseases. The use of either embryonic stem cells (ESCs) or induced pluripotent stem cells in clinical situations is limited due to cell regulations and to technical and ethical considerations involved in genetic manipulation of human ESCs, even though these cells are highly beneficial. Mesenchymal stem cells seem to be an ideal population of stem cells; in particular, adipose-derived stem cells (ASCs) can be obtained in large numbers and are easily harvested from adipose tissue. Adipose tissue is ubiquitously available and has several advantages over other sources: it is easily accessible in large quantities with a minimally invasive harvesting procedure, and isolation of adipose-derived mesenchymal stem cells yields a high number of stem cells, which is essential for stem cell based therapies and tissue engineering. Recently, periodontal tissue regeneration using ASCs has been examined in some animal models. This method has potential in the regeneration of functional periodontal tissues because various secreted growth factors from ASCs might not only promote the regeneration of periodontal tissues but also encourage neovascularization of the damaged tissues. This review summarizes the sources, isolation and characteristics of adipose-derived stem cells and discusses their potential role in periodontal regeneration. PMID:26634060

  6. Adipose Tissue - Adequate, Accessible Regenerative Material

    PubMed Central

    Kolaparthy, Lakshmi Kanth.; Sanivarapu, Sahitya; Moogla, Srinivas; Kutcham, Rupa Sruthi

    2015-01-01

    The potential use of stem cell based therapies for the repair and regeneration of various tissues offers a paradigm shift that may provide alternative therapeutic solutions for a number of diseases. The use of either embryonic stem cells (ESCs) or induced pluripotent stem cells in clinical situations is limited due to cell regulations and to technical and ethical considerations involved in genetic manipulation of human ESCs, even though these cells are highly beneficial. Mesenchymal stem cells seem to be an ideal population of stem cells; in particular, adipose-derived stem cells (ASCs) can be obtained in large numbers and are easily harvested from adipose tissue. Adipose tissue is ubiquitously available and has several advantages over other sources: it is easily accessible in large quantities with a minimally invasive harvesting procedure, and isolation of adipose-derived mesenchymal stem cells yields a high number of stem cells, which is essential for stem cell based therapies and tissue engineering. Recently, periodontal tissue regeneration using ASCs has been examined in some animal models. This method has potential in the regeneration of functional periodontal tissues because various secreted growth factors from ASCs might not only promote the regeneration of periodontal tissues but also encourage neovascularization of the damaged tissues. This review summarizes the sources, isolation and characteristics of adipose-derived stem cells and discusses their potential role in periodontal regeneration. PMID:26634060

  7. 21 CFR 201.5 - Drugs; adequate directions for use.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Drugs; adequate directions for use. 201.5 Section...) DRUGS: GENERAL LABELING General Labeling Provisions § 201.5 Drugs; adequate directions for use. Adequate directions for use means directions under which the layman can use a drug safely and for the purposes...

  8. 21 CFR 201.5 - Drugs; adequate directions for use.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 4 2011-04-01 2011-04-01 false Drugs; adequate directions for use. 201.5 Section...) DRUGS: GENERAL LABELING General Labeling Provisions § 201.5 Drugs; adequate directions for use. Adequate directions for use means directions under which the layman can use a drug safely and for the purposes...

  9. 4 CFR 200.14 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 4 Accounts 1 2010-01-01 2010-01-01 false Responsibility for maintaining adequate safeguards. 200.14 Section 200.14 Accounts RECOVERY ACCOUNTABILITY AND TRANSPARENCY BOARD PRIVACY ACT OF 1974 § 200.14 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining adequate technical, physical, and...

  10. 10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining adequate technical, physical, and security...

  11. 10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 4 2012-01-01 2012-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining adequate technical, physical, and security...

  12. 4 CFR 200.14 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 4 Accounts 1 2011-01-01 2011-01-01 false Responsibility for maintaining adequate safeguards. 200....14 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining adequate technical, physical, and security safeguards to prevent unauthorized disclosure...

  13. 21 CFR 314.126 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...-evident (general anesthetics, drug metabolism). (3) The method of selection of subjects provides adequate... respect to pertinent variables such as age, sex, severity of disease, duration of disease, and use of... 21 Food and Drugs 5 2011-04-01 2011-04-01 false Adequate and well-controlled studies....

  14. Sample Selection in Randomized Experiments: A New Method Using Propensity Score Stratified Sampling

    ERIC Educational Resources Information Center

    Tipton, Elizabeth; Hedges, Larry; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Caverly, Sarah

    2014-01-01

    Randomized experiments are often seen as the "gold standard" for causal research. Despite the fact that experiments use random assignment to treatment conditions, units are seldom selected into the experiment using probability sampling. Very little research on experimental design has focused on how to make generalizations to well-defined…
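    The stratified-sampling idea can be sketched minimally, assuming the propensity scores for selection into the experiment have already been estimated; the stratum count and per-stratum sample size below are arbitrary choices, not the authors' design.

    ```python
    import random

    def stratified_sample(units, score, n_strata=5, per_stratum=10, seed=0):
        """Stratify a population on a precomputed propensity-style score
        and draw an equal random sample from each stratum, so the
        recruited sample spans the score distribution of the target
        population rather than clustering at convenient values."""
        rng = random.Random(seed)
        ranked = sorted(units, key=score)
        size = len(ranked) // n_strata
        sample = []
        for k in range(n_strata):
            # last stratum absorbs the remainder
            stratum = ranked[k * size:(k + 1) * size] if k < n_strata - 1 else ranked[k * size:]
            sample.extend(rng.sample(stratum, per_stratum))
        return sample
    ```

    Sampling within score strata, rather than from the whole list, guarantees that units from every region of the propensity distribution are represented in the experiment.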

  15. Development of Methods for Cross-Sectional HIV Incidence Estimation in a Large, Community Randomized Trial

    PubMed Central

    Donnell, Deborah; Komárek, Arnošt; Omelka, Marek; Mullis, Caroline E.; Szekeres, Greg; Piwowar-Manning, Estelle; Fiamma, Agnes; Gray, Ronald H.; Lutalo, Tom; Morrison, Charles S.; Salata, Robert A.; Chipato, Tsungai; Celum, Connie; Kahle, Erin M.; Taha, Taha E.; Kumwenda, Newton I.; Karim, Quarraisha Abdool; Naranbhai, Vivek; Lingappa, Jairam R.; Sweat, Michael D.; Coates, Thomas; Eshleman, Susan H.

    2013-01-01

    Background Accurate methods of HIV incidence determination are critically needed to monitor the epidemic and determine the population level impact of prevention trials. One such trial, Project Accept, a Phase III, community-randomized trial, evaluated the impact of enhanced, community-based voluntary counseling and testing on population-level HIV incidence. The primary endpoint of the trial was based on a single, cross-sectional, post-intervention HIV incidence assessment. Methods and Findings Test performance of HIV incidence determination was evaluated for 403 multi-assay algorithms [MAAs] that included the BED capture immunoassay [BED-CEIA] alone, an avidity assay alone, and combinations of these assays at different cutoff values with and without CD4 and viral load testing on samples from seven African cohorts (5,325 samples from 3,436 individuals with known duration of HIV infection [1 month to >10 years]). The mean window period (average time individuals appear positive for a given algorithm) and performance in estimating an incidence estimate (in terms of bias and variance) of these MAAs were evaluated in three simulated epidemic scenarios (stable, emerging and waning). The power of different test methods to detect a 35% reduction in incidence in the matched communities of Project Accept was also assessed. A MAA was identified that included BED-CEIA, the avidity assay, CD4 cell count, and viral load that had a window period of 259 days, accurately estimated HIV incidence in all three epidemic settings and provided sufficient power to detect an intervention effect in Project Accept. Conclusions In a Southern African setting, HIV incidence estimates and intervention effects can be accurately estimated from cross-sectional surveys using a MAA. The improved accuracy in cross-sectional incidence testing that a MAA provides is a powerful tool for HIV surveillance and program evaluation. PMID:24236054
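    The core of cross-sectional incidence estimation is the "snapshot" estimator I = R/(N_neg·Ω), where R is the number of infections the algorithm classifies as recent and Ω is the mean window period. The sketch below is this simplified form only; real MAA analyses also adjust for assay false recency, and the counts in the example are made-up numbers (only the 259-day window comes from the abstract).

    ```python
    def incidence_estimate(n_recent, n_negative, window_days):
        """Simple cross-sectional ('snapshot') HIV incidence estimator:
        I = R / (N_neg * Omega), with Omega the mean window period
        converted to years. Returns incidence per person-year."""
        omega_years = window_days / 365.25
        return n_recent / (n_negative * omega_years)
    ```

    With the 259-day window, 30 assay-recent cases among 4,000 HIV-negative subjects gives roughly 1.06 infections per 100 person-years.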

  16. Purchasing a cycle helmet: are retailers providing adequate advice?

    PubMed Central

    Plumridge, E.; McCool, J.; Chetwynd, J.; Langley, J. D.

    1996-01-01

    OBJECTIVES: The aim of this study was to examine the selling of cycle helmets in retail stores with particular reference to the adequacy of advice offered about the fit and securing of helmets. METHODS: All 55 retail outlets selling cycle helmets in Christchurch, New Zealand were studied by participant observation. A researcher entered each store as a prospective customer and requested assistance to purchase a helmet. She took detailed field notes of the ensuing encounter and these were subsequently transcribed, coded, and analysed. RESULTS: Adequate advice for helmet purchase was given in less than half of the stores. In general the sales assistants in specialist cycle shops were better informed and gave more adequate advice than those in department stores. Those who gave good advice also tended to be more active in helping with fitting the helmet. Knowledge about safety standards was apparent in one third of sales assistants. Few stores displayed information for customers about the correct fit of cycle helmets. CONCLUSIONS: These findings suggest that the advice and assistance being given to ensure that cycle helmets fit properly is often inadequate and thus the helmets may fail to fulfil their purpose in preventing injury. Consultation between retailers and policy makers is a necessary first step to improving this situation. PMID:9346053

  17. MEGAWHOP cloning: a method of creating random mutagenesis libraries via megaprimer PCR of whole plasmids.

    PubMed

    Miyazaki, Kentaro

    2011-01-01

    MEGAWHOP allows for the cloning of DNA fragments into a vector in place of conventional restriction digestion/ligation-based procedures. In MEGAWHOP, the DNA fragment to be cloned is used as a set of complementary primers that replace a homologous region in a template vector through whole-plasmid PCR. After synthesis of a nicked circular plasmid, the mixture is treated with DpnI, a dam-methylated DNA-specific restriction enzyme, to digest the template plasmid. The DpnI-treated mixture is then introduced into competent Escherichia coli cells to yield plasmids carrying replaced insert fragments. Plasmids produced by the MEGAWHOP method are virtually free of contamination by species without any inserts or with multiple inserts, as well as by the parent plasmid. Because the fragment is usually long enough to not interfere with hybridization to the template, various types of fragments can be used with mutations at any site (either known or unknown, random, or specific). By using fragments having homologous sequences at the ends (e.g., adaptor sequence), MEGAWHOP can also be used to recombine nonhomologous sequences mediated by the adaptors, allowing rapid creation of novel constructs and chimeric genes. PMID:21601687

  18. Proposal of the Methodology for Analysing the Structural Relationship in the System of Random Process Using the Data Mining Methods

    NASA Astrophysics Data System (ADS)

    Michaľčonok, German; Kalinová, Michaela Horalová; Németh, Martin

    2014-12-01

    The aim of this paper is to present the possibilities of applying data mining techniques to the problem of analysing structural relationships in a system of stationary random processes. In this paper, we introduce the area of random processes, present the process of structural analysis, and select a set of data mining methods suitable for structural analysis. We then propose a methodology for structural analysis in a system of stationary stochastic processes using data mining methods with an active experimental approach, built on this theoretical basis.

  19. MOMENT-BASED METHOD FOR RANDOM EFFECTS SELECTION IN LINEAR MIXED MODELS

    PubMed Central

    Ahn, Mihye; Lu, Wenbin

    2012-01-01

    The selection of random effects in linear mixed models is an important yet challenging problem in practice. We propose a robust and unified framework for automatically selecting random effects and estimating covariance components in linear mixed models. A moment-based loss function is first constructed for estimating the covariance matrix of random effects. Two types of shrinkage penalties, a hard thresholding operator and a new sandwich-type soft-thresholding penalty, are then imposed for sparse estimation and random effects selection. Compared with existing approaches, the new procedure does not require any distributional assumption on the random effects and error terms. We establish the asymptotic properties of the resulting estimator in terms of its consistency in both random effects selection and variance component estimation. Optimization strategies are suggested to tackle the computational challenges involved in estimating the sparse variance-covariance matrix. Furthermore, we extend the procedure to incorporate the selection of fixed effects as well. Numerical results show promising performance of the new approach in selecting both random and fixed effects and, consequently, improving the efficiency of estimating model parameters. Finally, we apply the approach to a data set from the Amsterdam Growth and Health study. PMID:23105913
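    The shrinkage step can be illustrated with plain entrywise soft-thresholding of a moment-estimated covariance matrix. This mimics the spirit of the sandwich-type penalty rather than its exact form, and the input matrix and threshold in the example are made up.

    ```python
    import numpy as np

    def soft_threshold_cov(sigma, lam):
        """Entrywise soft-thresholding of an estimated random-effects
        covariance matrix: entries are shrunk toward zero by lam, and a
        random effect is deselected when its diagonal variance is shrunk
        to zero. Returns the sparse estimate and the kept indices."""
        s = np.sign(sigma) * np.maximum(np.abs(sigma) - lam, 0.0)
        keep = np.flatnonzero(np.diag(s) > 0)
        return s, keep
    ```

    A variance well above the threshold survives (shrunk slightly), while a near-zero variance component, and any covariances tied to it, are zeroed out, removing that random effect from the model.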

  20. Turbulence parameterizations for the random displacement method (RDM) version of ADPIC

    SciTech Connect

    Nasstrom, J.S.

    1995-05-01

    This document describes the algorithms that are used in the new random displacement method (RDM) option in the ADPIC model to parameterize atmospheric boundary layer turbulence through an eddy diffusivity, K. Both the new RDM version and the previous Gradient version of ADPIC use eddy diffusivities, and, as before, several parameterization options are available. The options used in the RDM are similar to those for the existing Gradient method in ADPIC, but with some changes. Preferred parameterizations are based on boundary layer turbulence scaling parameters and measured turbulent velocity statistics. Simpler parameterizations, based solely on Pasquill stability class, are also available. When eddy diffusivities are based on boundary layer turbulence scaling parameters (i.e., u, h, z and L), "turbulence parameterization" is an appropriate term. In other cases, this term is used loosely to describe "sigma curves". These are semi-empirical relationships between the standard deviations, σ_z(x) and σ_y(x), of concentration from a point source and downwind distance. Separate sigma curves are used for each of six Pasquill stability classes, which are used to categorize the diffusive properties of the atmospheric surface layer. Consequently, sigma curves are more than parameterizations of turbulence, since they also prescribe the final concentration distribution (for a point source) given a Pasquill stability class. In the ADPIC model, sigma curves can be used to calculate the eddy diffusivities, K_Z and K_H. Thus, they can be used to "back out" parameterizations for K which are consistent with the dispersion associated with the particular sigma curve. This results in eddy diffusivities which are spatially homogeneous but travel-time dependent.
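    The random displacement method itself advances each particle by a deterministic drift plus a random increment scaled by the eddy diffusivity, dz = (dK/dz)dt + (2K dt)^(1/2) ξ with ξ ~ N(0,1). Below is a minimal sketch under the simplifying assumption of a constant K (so the drift term vanishes), checked against the analytic plume variance σ² = 2Kt; this is the core update rule only, not any of ADPIC's parameterizations.

```python
import numpy as np

rng = np.random.default_rng(1)
K = 10.0            # eddy diffusivity, m^2/s (constant here, so dK/dz = 0)
dt, n_steps = 1.0, 600
n_particles = 50_000

z = np.zeros(n_particles)   # all particles released at z = 0
for _ in range(n_steps):
    # Random displacement step: dz = (dK/dz) dt + sqrt(2 K dt) * N(0,1).
    # With constant K the drift term is zero.
    z += np.sqrt(2.0 * K * dt) * rng.standard_normal(n_particles)

t = n_steps * dt
print(z.var(), 2.0 * K * t)   # simulated vs analytic plume variance
```

    With spatially varying K(z), the drift term (dK/dz)dt must be retained or particles artificially accumulate in low-diffusivity regions.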

  1. Ulam's method to estimate invariant measures and Lyapunov exponents for one-dimensional discretely randomized photonic structures

    NASA Astrophysics Data System (ADS)

    Kissel, Glen J.

    2009-08-01

    In the one-dimensional optical analog to Anderson localization, a periodically layered medium has one or more parameters randomly disordered. Such a medium can be modeled by an infinite product of 2x2 random transfer matrices with the upper Lyapunov exponent of the matrix product identified as the localization factor (inverse localization length). Furstenberg's integral formula for the Lyapunov exponent requires integration with respect to both the probability measure of the random matrices and the invariant probability measure of the direction of the vector propagated by the random matrix product. This invariant measure is difficult to find analytically, so one of several numerical techniques must be used in its calculation. Here, we focus on one of those techniques, Ulam's method, which sets up a sparse matrix of the probabilities that an entire interval of possible directions will be transferred to some other interval of directions. The left eigenvector of this sparse matrix forms the estimated invariant measure. While Ulam's method is shown to produce results as accurate as others, it suffers from long computation times. The Ulam method, along with other approaches, is demonstrated on a random Fibonacci sequence having a known answer, and on a quarter-wave stack model with discrete disorder in layer thickness.
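    Ulam's method can be illustrated on a simpler system than the paper's random transfer matrices. The sketch below applies it to the logistic map x → 4x(1−x), whose invariant density 1/(π√(x(1−x))) is known in closed form: partition [0,1] into cells, estimate cell-to-cell transition probabilities from sample points, and take the left eigenvector at eigenvalue 1 as the estimated invariant measure. The partition size and sampling density are illustrative choices.

```python
import numpy as np

n_cells = 200
edges = np.linspace(0.0, 1.0, n_cells + 1)
f = lambda x: 4.0 * x * (1.0 - x)       # logistic map

# Ulam: P[i, j] = fraction of cell i that maps into cell j,
# estimated by sampling test points inside each cell.
samples_per_cell = 1000
P = np.zeros((n_cells, n_cells))
for i in range(n_cells):
    xs = np.linspace(edges[i], edges[i + 1], samples_per_cell + 2)[1:-1]
    js = np.searchsorted(edges, f(xs), side="right") - 1
    js = np.clip(js, 0, n_cells - 1)
    P[i] = np.bincount(js, minlength=n_cells) / samples_per_cell

# Invariant measure = left eigenvector of P at eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
mu = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
mu /= mu.sum()

# Compare with the known invariant density 1/(pi*sqrt(x*(1-x))).
centers = 0.5 * (edges[:-1] + edges[1:])
exact = 1.0 / (np.pi * np.sqrt(centers * (1.0 - centers)))
exact /= exact.sum()
print(np.abs(mu - exact).sum())   # L1 discrepancy between cell masses
```

    In the localization setting, the same construction is applied to the direction angle propagated by the random matrices, and the resulting measure is fed into Furstenberg's integral for the Lyapunov exponent; the sparse transition matrix is the computational bottleneck the abstract alludes to.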

  2. A modified hybrid uncertain analysis method for dynamic response field of the LSOAAC with random and interval parameters

    NASA Astrophysics Data System (ADS)

    Zi, Bin; Zhou, Bin

    2016-07-01

    For the prediction of the dynamic response field of the luffing system of an automobile crane (LSOAAC) with random and interval parameters, a hybrid uncertain model is introduced. In the hybrid uncertain model, parameters with known probability distributions are modeled as random variables, whereas parameters with only lower and upper bounds are modeled as interval variables instead of precise values. Based on the hybrid uncertain model, the hybrid uncertain dynamic response equilibrium equation, in which different random and interval parameters are simultaneously included in the input and output terms, is constructed. Then a modified hybrid uncertain analysis method (MHUAM) is proposed. In the MHUAM, the dynamic response expression of the LSOAAC is developed based on the random interval perturbation method, the first-order Taylor series expansion and the first-order Neumann series. Moreover, the mathematical characteristics of the extrema of the bounds of the dynamic response are determined by the random interval moment method and a monotonic analysis technique. Compared with the hybrid Monte Carlo method (HMCM) and the interval perturbation method (IPM), numerical results show the feasibility and efficiency of the MHUAM for solving hybrid LSOAAC problems. The effects of different uncertain models and parameters on the LSOAAC response field are also investigated in depth, and numerical results indicate that the impact made by the randomness in the thrust of the luffing cylinder F is larger than that made by the gravity of the weight in suspension Q. In addition, the impact made by the uncertainty in the displacement between the lower end of the lifting arm and the luffing cylinder a is larger than that made by the length of the lifting arm L.

  3. Research on Parameter Estimation Methods for Alpha Stable Noise in a Laser Gyroscope’s Random Error

    PubMed Central

    Wang, Xueyun; Li, Kui; Gao, Pengyu; Meng, Suxia

    2015-01-01

    Alpha stable noise, determined by four parameters, has been found in the random error of a laser gyroscope. Accurate estimation of the four parameters is the key process for analyzing the properties of alpha stable noise. Three widely used estimation methods—quantile, empirical characteristic function (ECF) and logarithmic moment method—are analyzed in contrast with Monte Carlo simulation in this paper. The estimation accuracy and the application conditions of all methods, as well as the causes of poor estimation accuracy, are illustrated. Finally, the highest precision method, ECF, is applied to 27 groups of experimental data to estimate the parameters of alpha stable noise in a laser gyroscope’s random error. The cumulative probability density curve of the experimental data fitted by an alpha stable distribution is better than that by a Gaussian distribution, which verifies the existence of alpha stable noise in a laser gyroscope’s random error. PMID:26230698
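    A hedged sketch of the ECF idea for the symmetric case: simulate a standard symmetric alpha-stable sample via the Chambers-Mallows-Stuck transform, then recover α from the empirical characteristic function using |φ(t)| = exp(−(γ|t|)^α) evaluated at two frequencies. The full ECF method also estimates β, γ and δ; the frequencies and sample size here are illustrative choices, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
alpha_true, n = 1.5, 200_000

# Chambers-Mallows-Stuck transform: symmetric (beta = 0) alpha-stable sample.
V = rng.uniform(-np.pi / 2, np.pi / 2, n)
W = rng.exponential(1.0, n)
X = (np.sin(alpha_true * V) / np.cos(V) ** (1 / alpha_true)
     * (np.cos((1 - alpha_true) * V) / W) ** ((1 - alpha_true) / alpha_true))

# For a standard symmetric stable law, |phi(t)| = exp(-|t|^alpha), so
# log(-log|phi|) is linear in log|t| with slope alpha.
def neg_log_ecf(t):
    return -np.log(np.abs(np.mean(np.exp(1j * t * X))))

t1, t2 = 0.2, 1.0
alpha_hat = np.log(neg_log_ecf(t1) / neg_log_ecf(t2)) / np.log(t1 / t2)
print(alpha_hat)   # close to alpha_true = 1.5
```

    The choice of evaluation frequencies matters in practice: too small and sampling noise dominates the tiny value of −log|φ|, too large and |φ| itself is near zero; this trade-off is one source of the accuracy differences among the quantile, ECF and logarithmic moment methods that the paper analyzes.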

  4. Are PPS payments adequate? Issues for updating and assessing rates

    PubMed Central

    Sheingold, Steven H.; Richter, Elizabeth

    1992-01-01

    Declining operating margins under Medicare's prospective payment system (PPS) have focused attention on the adequacy of payment rates. The question of whether annual updates to the rates have been too low or cost increases too high has become important. In this article we discuss issues relevant to updating PPS rates and judging their adequacy. We describe a modification to the current framework for recommending annual update factors. This framework is then used to retrospectively assess PPS payment and cost growth since 1985. The preliminary results suggest that current rates are more than adequate to support the cost of efficient care. Also discussed are why using financial margins to evaluate rates is problematic and alternative methods that might be employed. PMID:10127450

  5. A Mixed-Methods Randomized Controlled Trial of Financial Incentives and Peer Networks to Promote Walking among Older Adults

    ERIC Educational Resources Information Center

    Kullgren, Jeffrey T.; Harkins, Kristin A.; Bellamy, Scarlett L.; Gonzales, Amy; Tao, Yuanyuan; Zhu, Jingsan; Volpp, Kevin G.; Asch, David A.; Heisler, Michele; Karlawish, Jason

    2014-01-01

    Background: Financial incentives and peer networks could be delivered through eHealth technologies to encourage older adults to walk more. Methods: We conducted a 24-week randomized trial in which 92 older adults with a computer and Internet access received a pedometer, daily walking goals, and weekly feedback on goal achievement. Participants…

  6. Single Particle Electron Microscopy Reconstruction of the Exosome Complex Using the Random Conical Tilt Method

    PubMed Central

    Liu, Xueqi; Wang, Hong-Wei

    2011-01-01

    of each single particle. There are several methods to assign the view for each particle, including angular reconstitution [1] and the random conical tilt (RCT) method [2]. In this protocol, we describe our practice for obtaining a 3D reconstruction of the yeast exosome complex using negative-staining EM and RCT. It should be noted that our protocol for electron microscopy and image processing follows the basic principle of RCT but is not the only way to perform the method. We first describe how to embed the protein sample in a layer of uranyl formate with a thickness comparable to the protein size, using a holey carbon grid covered with a layer of continuous thin carbon film. The specimen is then inserted into a transmission electron microscope to collect untilted (0-degree) and tilted (55-degree) pairs of micrographs that are later used for processing and obtaining an initial 3D model of the yeast exosome. To this end, we perform RCT and then refine the initial 3D model using the projection matching refinement method [3]. PMID:21490573

  7. The analysis of a sparse grid stochastic collocation method for partial differential equations with high-dimensional random input data.

    SciTech Connect

    Webster, Clayton; Tempone, Raul; Nobile, Fabio

    2007-12-01

    This work describes the convergence analysis of a Smolyak-type sparse grid stochastic collocation method for the approximation of statistical quantities related to the solution of partial differential equations with random coefficients and forcing terms (input data of the model). To compute solution statistics, the sparse grid stochastic collocation method uses approximate solutions, produced here by finite elements, corresponding to a deterministic set of points in the random input space. This naturally requires solving uncoupled deterministic problems and, as such, the derived strong error estimates for the fully discrete solution are used to compare the computational efficiency of the proposed method with the Monte Carlo method. Numerical examples illustrate the theoretical results and are used to compare this approach with several others, including the standard Monte Carlo.
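    In one random dimension, stochastic collocation reduces to evaluating a deterministic solver at Gauss quadrature nodes of the random input and combining the results with quadrature weights. A minimal sketch on a scalar test problem with a closed-form mean, compared against crude Monte Carlo; the test problem and sample sizes are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def solve(y, T=1.0):
    # Stand-in "deterministic solver": exact solution of
    # u' = -(1 + 0.5 y) u, u(0) = 1, evaluated at time T.
    return np.exp(-(1.0 + 0.5 * y) * T)

# Exact mean over y ~ Uniform(-1, 1): E[u(1)] = e^-1 * (e^0.5 - e^-0.5).
exact = np.exp(-1.0) * (np.exp(0.5) - np.exp(-0.5))

# Stochastic collocation: Gauss-Legendre nodes/weights in the random input.
nodes, weights = np.polynomial.legendre.leggauss(5)
colloc = np.sum(weights * solve(nodes)) / 2.0   # weights sum to 2 on [-1, 1]

# Crude Monte Carlo for comparison.
rng = np.random.default_rng(3)
mc = solve(rng.uniform(-1.0, 1.0, 10_000)).mean()

# Collocation error is ~1e-12 with 5 nodes; MC error is typically ~1e-3.
print(colloc - exact, mc - exact)
```

    The uncoupled-solves structure the abstract emphasizes is visible here: each node is an independent deterministic evaluation, which is what makes the comparison against Monte Carlo cost meaningful; in many random dimensions the full tensor grid is replaced by a Smolyak sparse grid to keep the node count tractable.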

  8. Active video games as a tool to prevent excessive weight gain in adolescents: rationale, design and methods of a randomized controlled trial

    PubMed Central

    2014-01-01

    Background Excessive body weight, low physical activity and excessive sedentary time in youth are major public health concerns. A new generation of video games that require physical activity to play (i.e., active games) may be a promising alternative to traditional non-active games for promoting physical activity and reducing sedentary behavior in youth. The aim of this manuscript is to describe the design of a study evaluating the effects of a family-oriented active game intervention, incorporating several motivational elements, on anthropometrics and health behaviors in adolescents. Methods/Design The study is a randomized controlled trial (RCT) in which non-active gaming adolescents aged 12-16 years are randomly allocated to a ten-month intervention (receiving active games, as well as encouragement to play) or a waiting-list control group (receiving active games after the intervention period). Primary outcomes are adolescents' measured BMI-SDS (SDS = standard deviation score), waist circumference-SDS, hip circumference and sum of skinfolds. Secondary outcomes are adolescents' self-reported time spent playing active and non-active games, other sedentary activities and consumption of sugar-sweetened beverages. In addition, a process evaluation is conducted, assessing the sustainability of the active games, enjoyment, perceived competence, perceived barriers to active game play, game context, injuries from active game play, activity replacement and intention to continue playing the active games. Discussion This is the first adequately powered RCT including normal-weight adolescents and evaluating a reasonably long period of provision of and exposure to active games. Further strengths are the incorporation of motivational elements for active game play and a comprehensive process evaluation. This trial will provide evidence regarding the potential contribution of active games in prevention of excessive weight gain in

  9. Acupuncture as a treatment for functional dyspepsia: design and methods of a randomized controlled trial

    PubMed Central

    Zheng, Hui; Tian, Xiao-ping; Li, Ying; Liang, Fan-rong; Yu, Shu-guang; Liu, Xu-guang; Tang, Yong; Yang, Xu-guang; Yan, Jie; Sun, Guo-jie; Chang, Xiao-rong; Zhang, Hong-xing; Ma, Ting-ting; Yu, Shu-yuan

    2009-01-01

    Background Acupuncture is widely used in China to treat functional dyspepsia (FD). However, its effectiveness in the treatment of FD, and whether FD-specific acupoints exist, are controversial. This study therefore aims to determine whether acupuncture is an effective treatment for FD and whether acupoint specificity exists according to traditional acupuncture meridian and acupoint theories. Design This multicenter randomized controlled trial will include four acupoint treatment groups, one non-acupoint control group and one drug (positive control) group. The four acupoint treatment groups will focus on: (1) specific acupoints of the stomach meridian; (2) non-specific acupoints of the stomach meridian; (3) specific acupoints of alarm and transport points; and (4) acupoints of the gallbladder meridian. These four groups of acupoints are thought to differ in clinical efficacy, according to traditional acupuncture meridian and acupoint theories. A total of 120 FD patients will be included in each group. Each patient will receive 20 sessions of acupuncture treatment over 4 weeks. The trial will be conducted in eight hospitals located in three centers of China. The primary outcomes in this trial will be differences in Nepean Dyspepsia Index scores and differences in the Symptom Index of Dyspepsia before randomization, 2 weeks and 4 weeks after randomization, and 1 month and 3 months after completing treatment. Discussion The important features of this trial include the randomization procedures (controlled by a central randomization system), a standardized protocol of acupuncture manipulation, and the fact that this is the first multicenter randomized trial of acupuncture for FD to be performed in China. The results of this trial will determine whether acupuncture is an effective treatment for FD and whether using different acupoints or different meridians leads to differences in clinical efficacy. Trial registration number ClinicalTrials.gov Identifier: NCT00599677

  10. Nonaqueous Dispersion Formed by an Emulsion Solvent Evaporation Method Using Block-Random Copolymer Surfactant Synthesized by RAFT Polymerization.

    PubMed

    Ezaki, Naofumi; Watanabe, Yoshifumi; Mori, Hideharu

    2015-10-27

    As surfactants for preparation of nonaqueous microcapsule dispersions by the emulsion solvent evaporation method, three copolymers composed of stearyl methacrylate (SMA) and glycidyl methacrylate (GMA) with different monomer sequences (i.e., random, block, and block-random) were synthesized by reversible addition-fragmentation chain transfer (RAFT) polymerization. Despite having the same comonomer composition, the copolymers exhibited different functionality as surfactants for creating emulsions with respective dispersed and continuous phases consisting of methanol and isoparaffin solvent. The optimal monomer sequence for the surfactant was determined based on the droplet sizes and the stabilities of the emulsions created using these copolymers. The block-random copolymer led to an emulsion with better stability than obtained using the random copolymer and a smaller droplet size than achieved with the block copolymer. Modification of the epoxy group of the GMA unit by diethanolamine (DEA) further decreased the droplet size, leading to higher stability of the emulsion. The DEA-modified block-random copolymer gave rise to nonaqueous microcapsule dispersions after evaporation of methanol from the emulsions containing colored dyes in their dispersed phases. These dispersions exhibited high stability, and the particle sizes were small enough for application to the inkjet printing process. PMID:26421355

  11. 7 CFR 4290.200 - Adequate capital for RBICs.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 15 2011-01-01 2011-01-01 false Adequate capital for RBICs. 4290.200 Section 4290.200 Agriculture Regulations of the Department of Agriculture (Continued) RURAL BUSINESS-COOPERATIVE SERVICE AND... Qualifications for the RBIC Program Capitalizing A Rbic § 4290.200 Adequate capital for RBICs. You must meet...

  12. 13 CFR 107.200 - Adequate capital for Licensees.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false Adequate capital for Licensees... INVESTMENT COMPANIES Qualifying for an SBIC License Capitalizing An Sbic § 107.200 Adequate capital for... Licensee, and to receive Leverage. (a) You must have enough Regulatory Capital to provide...

  13. 13 CFR 107.200 - Adequate capital for Licensees.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Adequate capital for Licensees... INVESTMENT COMPANIES Qualifying for an SBIC License Capitalizing An Sbic § 107.200 Adequate capital for... Licensee, and to receive Leverage. (a) You must have enough Regulatory Capital to provide...

  14. 7 CFR 4290.200 - Adequate capital for RBICs.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 15 2010-01-01 2010-01-01 false Adequate capital for RBICs. 4290.200 Section 4290.200 Agriculture Regulations of the Department of Agriculture (Continued) RURAL BUSINESS-COOPERATIVE SERVICE AND... Qualifications for the RBIC Program Capitalizing A Rbic § 4290.200 Adequate capital for RBICs. You must meet...

  15. 10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining...

  16. 40 CFR 716.25 - Adequate file search.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Adequate file search. 716.25 Section 716.25 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of...

  17. 40 CFR 51.354 - Adequate tools and resources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 2 2011-07-01 2011-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...

  18. 40 CFR 51.354 - Adequate tools and resources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 2 2012-07-01 2012-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...

  19. 40 CFR 51.354 - Adequate tools and resources.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 2 2014-07-01 2014-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...

  20. 40 CFR 51.354 - Adequate tools and resources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 2 2013-07-01 2013-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...

  1. 40 CFR 716.25 - Adequate file search.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 31 2011-07-01 2011-07-01 false Adequate file search. 716.25 Section... ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of a person's responsibility to search records is limited to records in the location(s) where the...

  2. 40 CFR 716.25 - Adequate file search.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 32 2013-07-01 2013-07-01 false Adequate file search. 716.25 Section... ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of a person's responsibility to search records is limited to records in the location(s) where the...

  3. 40 CFR 716.25 - Adequate file search.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 31 2014-07-01 2014-07-01 false Adequate file search. 716.25 Section... ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of a person's responsibility to search records is limited to records in the location(s) where the...

  4. 40 CFR 716.25 - Adequate file search.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 32 2012-07-01 2012-07-01 false Adequate file search. 716.25 Section... ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of a person's responsibility to search records is limited to records in the location(s) where the...

  5. 10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 4 2014-01-01 2014-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining...

  6. 10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 4 2013-01-01 2013-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining...

  7. 10 CFR 503.35 - Inability to obtain adequate capital.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Inability to obtain adequate capital. 503.35 Section 503.35 Energy DEPARTMENT OF ENERGY (CONTINUED) ALTERNATE FUELS NEW FACILITIES Permanent Exemptions for New Facilities § 503.35 Inability to obtain adequate capital. (a) Eligibility. Section 212(a)(1)(D)...

  8. 10 CFR 503.35 - Inability to obtain adequate capital.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Inability to obtain adequate capital. 503.35 Section 503.35 Energy DEPARTMENT OF ENERGY (CONTINUED) ALTERNATE FUELS NEW FACILITIES Permanent Exemptions for New Facilities § 503.35 Inability to obtain adequate capital. (a) Eligibility. Section 212(a)(1)(D)...

  9. 15 CFR 970.404 - Adequate exploration plan.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES Certification of Applications § 970.404 Adequate exploration plan. Before he may certify an application, the Administrator must find... 15 Commerce and Foreign Trade 3 2011-01-01 2011-01-01 false Adequate exploration plan....

  10. 15 CFR 970.404 - Adequate exploration plan.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES Certification of Applications § 970.404 Adequate exploration plan. Before he may certify an application, the Administrator must find... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Adequate exploration plan....

  11. "Something Adequate"? In Memoriam Seamus Heaney, Sister Quinlan, Nirbhaya

    ERIC Educational Resources Information Center

    Parker, Jan

    2014-01-01

    Seamus Heaney talked of poetry's responsibility to represent the "bloody miracle", the "terrible beauty" of atrocity; to create "something adequate". This article asks, what is adequate to the burning and eating of a nun and the murderous gang rape and evisceration of a medical student? It considers Njabulo…

  12. Oxide Defect Engineering Methods for Valence Change (VCM) Resistive Random Access Memories

    NASA Astrophysics Data System (ADS)

    Capulong, Jihan O.

    Electrical switching requirements for resistive random access memory (ReRAM) devices are multifaceted, depending on the device application. It is therefore important to understand these switching properties and how they relate to oxygen vacancy concentration and oxygen vacancy defects. Oxygen vacancy defects in the switching oxide of valence-change-based ReRAM (VCM ReRAM) play a significant role in device switching properties: oxygen vacancies facilitate resistive switching, as they form the conductive filament that changes the resistance state of the device. This dissertation presents two methods of modulating the defect concentration in VCM ReRAM composed of a Pt/HfOx/Ti stack: 1) rapid thermal annealing (RTA) in Ar at different temperatures, and 2) doping using ion implantation at different dose levels. Metrology techniques such as x-ray diffractometry (XRD), x-ray photoelectron spectroscopy (XPS), and photoluminescence (PL) spectroscopy were utilized to characterize the HfOx switching oxide, providing insight into the material properties and oxygen vacancy concentration in the oxide that was used to explain the changes in the electrical properties of the ReRAM devices. The resulting impact on the resistive switching characteristics of the devices, such as the forming voltage, set and reset threshold voltages, ON and OFF resistances, resistance ratio, and switching dispersion or uniformity, is explored and summarized. Annealing in Ar showed a significant impact on the forming voltage, with as much as 45% improvement (from 22 V to 12 V) as the annealing temperature was increased. However, drawbacks of higher oxide leakage and worse switching uniformity were seen with increasing annealing temperature. Meanwhile, doping the oxide by ion implantation showed significant effects on the resistive switching characteristics. Ta doping modulated the following switching properties with increasing dose: a) the reduction of the forming voltage, and Vset

  13. Variability in DNA polymerase efficiency: effects of random error, DNA extraction method, and isolate type

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Using computer-generated data calculated with known amounts of random error (E = 1, 5 & 10%) associated with the calculated qPCR cycle number (C) at the jth of four 1:10 dilutions, we found that the “efficiency” (eff) associated with each population distribution of n = 10,000 measurements varied from 0.95 to ...

  14. Increasing the Degrees of Freedom in Future Group Randomized Trials: The "df*" Method Revisited

    ERIC Educational Resources Information Center

    Murray, David M.; Blitstein, Jonathan L.; Hannan, Peter J.; Shadish, William R.

    2012-01-01

    Background: This article revisits an article published in Evaluation Review in 2005 on sample size estimation and power analysis for group-randomized trials. With help from a careful reader, we learned of an important error in the spreadsheet used to perform the calculations and generate the results presented in that article. As we studied the…

  15. Spatial cross modulation method using a random diffuser and phase-only spatial light modulator for constructing arbitrary complex fields.

    PubMed

    Shibukawa, Atsushi; Okamoto, Atsushi; Takabayashi, Masanori; Tomita, Akihisa

    2014-02-24

    We propose a spatial cross modulation method using a random diffuser and a phase-only spatial light modulator (SLM), by which arbitrary complex-amplitude fields can be generated with higher spatial resolution and diffraction efficiency than off-axis and double-phase computer-generated holograms. Our method encodes the original complex object as a phase-only diffusion image by scattering the complex object using a random diffuser. In addition, all incoming light to the SLM is consumed for a single diffraction order, making a diffraction efficiency of more than 90% possible. This method can be applied for holographic data storage, three-dimensional displays, and other such applications. PMID:24663718

  16. Adequate peritoneal dialysis: theoretical model and patient treatment.

    PubMed

    Tast, C

    1998-01-01

    The objective of this study was to evaluate the relationship between adequate PD, with a sufficient weekly Kt/V (2.0) and creatinine clearance (CCr) (60 L), and the necessary daily dialysate volume. These recommended parameters were the result of a recent multi-centre study (CANUSA). For this purpose, 40 patients in our hospital who had carried out PD for at least 8 weeks and up to 6 years were examined and compared in 1996. The CANUSA goals are easily attainable in the early treatment of many individuals with a low body surface area (BSA). With a higher BSA or missing residual renal function (RRF), the daily dose of dialysis must be adjusted. We found it difficult to obtain the recommended parameters and tried to find a solution to this problem. The simplest method is to increase the volume or exchange rate. The most expensive method is to change from CAPD to APD, with the possibility of higher volumes or exchange rates. Selection of therapy must take into consideration: 1. patient preference, 2. body mass, 3. peritoneal transport rates, 4. ability to perform therapy, 5. cost of therapy and 6. risk of peritonitis. With this information in mind, an individual prescription can be formulated and matched to the appropriate modality of PD. PMID:10392062

  17. Comparison of methods for estimating the intraclass correlation coefficient for binary responses in cancer prevention cluster randomized trials.

    PubMed

    Wu, Sheng; Crespi, Catherine M; Wong, Weng Kee

    2012-09-01

    The intraclass correlation coefficient (ICC) is a fundamental parameter of interest in cluster randomized trials as it can greatly affect statistical power. We compare common methods of estimating the ICC in cluster randomized trials with binary outcomes, with a specific focus on their application to community-based cancer prevention trials with primary outcome of self-reported cancer screening. Using three real data sets from cancer screening intervention trials with different numbers and types of clusters and cluster sizes, we obtained point estimates and 95% confidence intervals for the ICC using five methods: the analysis of variance estimator, the Fleiss-Cuzick estimator, the Pearson estimator, an estimator based on generalized estimating equations and an estimator from a random intercept logistic regression model. We compared estimates of the ICC for the overall sample and by study condition. Our results show that ICC estimates from different methods can be quite different, although confidence intervals generally overlap. The ICC varied substantially by study condition in two studies, suggesting that the common practice of assuming a common ICC across all clusters in the trial is questionable. A simulation study confirmed pitfalls of erroneously assuming a common ICC. Investigators should consider using sample size and analysis methods that allow the ICC to vary by study condition. PMID:22627076
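    Of the five estimators compared, the ANOVA estimator is the easiest to sketch: for balanced clusters, ICC = (MSB − MSW) / (MSB + (m − 1)·MSW), applied directly to the 0/1 outcomes. The example below runs it on simulated beta-binomial data, where the true ICC is 1/(a + b + 1); the cluster counts and Beta parameters are illustrative assumptions, not values from the trials in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
k, m = 200, 50       # clusters, cluster size (balanced for simplicity)
a = b = 4.5          # Beta(4.5, 4.5): mean 0.5, true ICC = 1/(a+b+1) = 0.1

# Beta-binomial data: cluster-level probabilities, then binary outcomes.
p = rng.beta(a, b, k)
y = (rng.random((k, m)) < p[:, None]).astype(float)

# One-way ANOVA estimator of the ICC for balanced clusters.
grand = y.mean()
cluster_means = y.mean(axis=1)
msb = m * np.sum((cluster_means - grand) ** 2) / (k - 1)
msw = np.sum((y - cluster_means[:, None]) ** 2) / (k * (m - 1))
icc = (msb - msw) / (msb + (m - 1) * msw)
print(icc)   # near the true ICC of 0.1
```

    Unbalanced cluster sizes require replacing m with an adjusted average cluster size, and, as the abstract cautions, a single pooled ICC can mask real differences between study conditions.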

  18. Asymptotic-preserving methods for hyperbolic and transport equations with random inputs and diffusive scalings

    SciTech Connect

    Jin, Shi; Xiu, Dongbin; Zhu, Xueyu

    2015-05-15

    In this paper we develop a set of stochastic numerical schemes for hyperbolic and transport equations with diffusive scalings and subject to random inputs. The schemes are asymptotic preserving (AP), in the sense that they preserve the diffusive limits of the equations in the discrete setting, without requiring excessive refinement of the discretization. Our stochastic AP schemes are extensions of the well-developed deterministic AP schemes. To handle the random inputs, we employ generalized polynomial chaos (gPC) expansion and combine it with the stochastic Galerkin procedure. We apply the gPC Galerkin scheme to a set of representative hyperbolic and transport equations and establish the AP property in the stochastic setting. We then provide several numerical examples to illustrate the accuracy and effectiveness of the stochastic AP schemes.
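
    As a much-simplified illustration of the gPC machinery (not of the authors' AP Galerkin schemes), the sketch below expands a scalar random quantity u(Z) = exp(Z), Z ~ N(0,1), in probabilists' Hermite polynomials and reads off its mean from the zeroth coefficient; the model function and truncation order are assumptions.

```python
# Minimal gPC sketch: expand u(Z) = exp(Z), Z ~ N(0,1), in probabilists'
# Hermite polynomials He_k and recover E[u(Z)] from the 0th coefficient.
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial, exp

order = 8
nodes, weights = hermegauss(32)          # quadrature for weight exp(-x^2/2)
weights = weights / np.sqrt(2 * np.pi)   # normalize to the N(0,1) density

def he_k(k, x):
    c = np.zeros(k + 1)
    c[k] = 1.0
    return hermeval(x, c)

u = np.exp(nodes)                        # the random quantity u(Z) = exp(Z)
# gPC coefficients: c_k = E[u(Z) He_k(Z)] / k!   (He_k has norm k!)
coeffs = [float(np.sum(weights * u * he_k(k, nodes))) / factorial(k)
          for k in range(order + 1)]

mean_gpc = coeffs[0]                     # E[u(Z)] is the zeroth coefficient
print("gPC mean:", mean_gpc, " exact:", exp(0.5))
```

In a stochastic Galerkin scheme the same expansion is applied to the PDE solution itself, producing a coupled deterministic system for the coefficient fields.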

  19. Method for removal of random noise in eddy-current testing system

    DOEpatents

    Levy, Arthur J.

    1995-01-01

    Eddy-current response voltages generated during inspection of metallic structures for anomalies are often replete with noise. Analysis of the inspection data is therefore difficult or nearly impossible, resulting in inconsistent or unreliable evaluation of the structure. This invention processes the eddy-current response voltage, removing the effect of random noise, to allow proper identification of anomalies within and associated with the structure.

  20. Method for Evaluation of Outage Probability on Random Access Channel in Mobile Communication Systems

    NASA Astrophysics Data System (ADS)

    Kollár, Martin

    2012-05-01

    In order to access the cell, all mobile communication technologies use a so-called random-access procedure. In GSM, for example, this is represented by sending the CHANNEL REQUEST message from the Mobile Station (MS) to the Base Transceiver Station (BTS), which is consequently forwarded as a CHANNEL REQUIRED message to the Base Station Controller (BSC). If the BTS mistakenly decodes noise on the Random Access Channel (RACH) as a random access (a so-called 'phantom RACH'), then it is pure coincidence which 'establishment cause' the BTS believes it has recognized. A typical invalid channel access request, or phantom RACH, is characterized by an IMMEDIATE ASSIGNMENT procedure (assignment of an SDCCH or TCH) that is not followed by an ESTABLISH INDICATION from MS to BTS. In this paper, a mathematical model for evaluating the Power RACH Busy Threshold (RACHBT) so as to guarantee a predetermined outage probability on the RACH is described and discussed. It focuses on the Global System for Mobile Communications (GSM); however, the obtained results can be generalized to other mobile technologies (i.e., WCDMA and LTE).
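
    The thresholding idea can be sketched under a simple assumed noise model (complex Gaussian noise, hence exponentially distributed instantaneous power); this is an illustrative stand-in for the paper's model, not a reproduction of it.

```python
# Sketch: choose a power threshold so that random noise is mistaken for a
# channel access (a "phantom RACH") with at most a target probability.
# The exponential noise-power model is an assumption, not the paper's model.
import math

noise_power = 1.0            # mean noise power (linear scale), assumed
target_prob = 1e-3           # tolerated phantom-RACH probability

# For complex Gaussian noise the instantaneous power is exponential:
#   P(power > T) = exp(-T / noise_power)   =>   T = -noise_power * ln(p)
threshold = -noise_power * math.log(target_prob)
print("threshold:", round(threshold, 3), "linear =",
      round(10 * math.log10(threshold), 2), "dB above mean noise power")
```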

  1. A Novel Compressed Sensing Method for Magnetic Resonance Imaging: Exponential Wavelet Iterative Shrinkage-Thresholding Algorithm with Random Shift.

    PubMed

    Zhang, Yudong; Yang, Jiquan; Yang, Jianfei; Liu, Aijun; Sun, Ping

    2016-01-01

    Aim. Accelerating magnetic resonance imaging (MRI) scanning can help improve hospital throughput, and patients benefit from less waiting time. Task. In the last decade, various rapid MRI techniques based on compressed sensing (CS) were proposed. However, neither the computation time nor the reconstruction quality of traditional CS-MRI met the requirements of clinical use. Method. In this study, a novel method was proposed under the name exponential wavelet iterative shrinkage-thresholding algorithm with random shift (abbreviated as EWISTARS). It is composed of three successful components: (i) exponential wavelet transform, (ii) iterative shrinkage-thresholding algorithm, and (iii) random shift. Results. Experimental results validated that, compared to state-of-the-art approaches, EWISTARS obtained the least mean absolute error, the least mean-squared error, and the highest peak signal-to-noise ratio. Conclusion. EWISTARS is superior to state-of-the-art approaches. PMID:27066068
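
    Component (ii), the iterative shrinkage-thresholding algorithm (ISTA), can be sketched on a generic sparse recovery problem; the Gaussian sensing matrix below is an illustrative stand-in for an undersampled MRI measurement operator, not the EWISTARS pipeline.

```python
# Minimal ISTA sketch for sparse recovery: minimize
#   0.5 * ||A x - y||^2 + lam * ||x||_1
# by gradient steps followed by soft-thresholding ("shrinkage").
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 200, 80, 5                    # signal length, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(0, 1, k)
A = rng.normal(0, 1 / np.sqrt(m), (m, n))   # stand-in sensing operator
y = A @ x_true                              # noiseless measurements

lam = 0.01                              # l1 regularization weight
L = np.linalg.norm(A, 2) ** 2           # Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(2000):
    z = x - A.T @ (A @ x - y) / L       # gradient step
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # shrinkage

err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print("relative recovery error:", round(err, 4))
```

EWISTARS additionally applies the shrinkage in an exponential wavelet domain and randomly shifts the image between iterations; only the plain ISTA core is shown here.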

  2. A Novel Compressed Sensing Method for Magnetic Resonance Imaging: Exponential Wavelet Iterative Shrinkage-Thresholding Algorithm with Random Shift

    PubMed Central

    Zhang, Yudong; Yang, Jiquan; Yang, Jianfei; Liu, Aijun; Sun, Ping

    2016-01-01

    Aim. Accelerating magnetic resonance imaging (MRI) scanning can help improve hospital throughput, and patients benefit from less waiting time. Task. In the last decade, various rapid MRI techniques based on compressed sensing (CS) were proposed. However, neither the computation time nor the reconstruction quality of traditional CS-MRI met the requirements of clinical use. Method. In this study, a novel method was proposed under the name exponential wavelet iterative shrinkage-thresholding algorithm with random shift (abbreviated as EWISTARS). It is composed of three successful components: (i) exponential wavelet transform, (ii) iterative shrinkage-thresholding algorithm, and (iii) random shift. Results. Experimental results validated that, compared to state-of-the-art approaches, EWISTARS obtained the least mean absolute error, the least mean-squared error, and the highest peak signal-to-noise ratio. Conclusion. EWISTARS is superior to state-of-the-art approaches. PMID:27066068

  3. Inappropriate statistical method in a parallel-group randomized controlled trial results in unsubstantiated conclusions.

    PubMed

    Dimova, Rositsa B; Allison, David B

    2016-01-01

    The conclusions of Cassani et al. in the January 2015 issue of Nutrition Journal (doi: 10.1186/1475-2891-14-5 ) cannot be substantiated by the analysis reported nor by the data themselves. The authors ascribed the observed decrease in inflammatory markers to the components of flaxseed and based their conclusions on within-group comparisons made between the final and the baseline measurements separately in each arm of the randomized controlled trial. However, this is an improper approach and the conclusions of the paper are invalid. A correct analysis of the data shows no such effects. PMID:27265269
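
    The pitfall described, reading a significant within-arm pre/post change as a treatment effect, can be demonstrated by simulation; all parameters (sample size, drift, variances) are illustrative assumptions.

```python
# Under a null treatment effect but a shared time trend, within-arm
# pre/post tests "find" an effect, while the correct between-arm
# comparison of changes rejects at roughly the nominal 5% rate.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, reps, drift = 60, 500, -0.5        # arm size, simulations, shared trend
within_rej = between_rej = 0
for _ in range(reps):
    # no true treatment effect: both arms share the same drift over time
    base_t = rng.normal(10, 1, n); final_t = base_t + drift + rng.normal(0, 1, n)
    base_c = rng.normal(10, 1, n); final_c = base_c + drift + rng.normal(0, 1, n)
    # improper: within-arm pre/post test in the "treatment" arm
    if stats.ttest_rel(final_t, base_t).pvalue < 0.05:
        within_rej += 1
    # proper: compare the pre/post changes between the two arms
    if stats.ttest_ind(final_t - base_t, final_c - base_c).pvalue < 0.05:
        between_rej += 1

print("within-arm rejection rate: ", within_rej / reps)   # far above 0.05
print("between-arm rejection rate:", between_rej / reps)  # near 0.05
```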

  4. Comparison of Parametric and Nonparametric Bootstrap Methods for Estimating Random Error in Equipercentile Equating

    ERIC Educational Resources Information Center

    Cui, Zhongmin; Kolen, Michael J.

    2008-01-01

    This article considers two methods of estimating standard errors of equipercentile equating: the parametric bootstrap method and the nonparametric bootstrap method. Using a simulation study, these two methods are compared under three sample sizes (300, 1,000, and 3,000), for two test content areas (the Iowa Tests of Basic Skills Maps and Diagrams…
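
    The two bootstrap flavours can be sketched for a simpler statistic than a full equating function, here the standard error of a 75th percentile; the normal score model and sample size are illustrative assumptions.

```python
# Parametric vs nonparametric bootstrap standard errors for the 75th
# percentile of a score distribution (a stand-in for equipercentile
# equating). The normal score model is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(3)
scores = rng.normal(50, 10, 300)           # observed sample, n = 300
B = 2000                                   # bootstrap replications

# Nonparametric: resample the observed scores with replacement
np_boot = np.array([
    np.percentile(rng.choice(scores, scores.size, replace=True), 75)
    for _ in range(B)])

# Parametric: fit the assumed model, then simulate fresh samples from it
mu, sigma = scores.mean(), scores.std(ddof=1)
p_boot = np.array([
    np.percentile(rng.normal(mu, sigma, scores.size), 75)
    for _ in range(B)])

print("nonparametric SE:", round(np_boot.std(ddof=1), 3))
print("parametric SE:   ", round(p_boot.std(ddof=1), 3))
```

When the parametric model is correct (as here), the two standard errors agree closely; the article's comparison concerns how they diverge for real equating data.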

  5. Performance of analytical methods for overdispersed counts in cluster randomized trials: sample size, degree of clustering and imbalance.

    PubMed

    Durán Pacheco, Gonzalo; Hattendorf, Jan; Colford, John M; Mäusezahl, Daniel; Smith, Thomas

    2009-10-30

    Many different methods have been proposed for the analysis of cluster randomized trials (CRTs) over the last 30 years. However, the evaluation of methods on overdispersed count data has been based mostly on the comparison of results using empiric data; i.e. when the true model parameters are not known. In this study, we assess via simulation the performance of five methods for the analysis of counts in situations similar to real community-intervention trials. We used the negative binomial distribution to simulate overdispersed counts of CRTs with two study arms, allowing the period of time under observation to vary among individuals. We assessed different sample sizes, degrees of clustering and degrees of cluster-size imbalance. The compared methods are: (i) the two-sample t-test of cluster-level rates, (ii) generalized estimating equations (GEE) with empirical covariance estimators, (iii) GEE with model-based covariance estimators, (iv) generalized linear mixed models (GLMM) and (v) Bayesian hierarchical models (Bayes-HM). Variation in sample size and clustering led to differences between the methods in terms of coverage, significance, power and random-effects estimation. GLMM and Bayes-HM performed better in general with Bayes-HM producing less dispersed results for random-effects estimates although upward biased when clustering was low. GEE showed higher power but anticonservative coverage and elevated type I error rates. Imbalance affected the overall performance of the cluster-level t-test and the GEE's coverage in small samples. Important effects arising from accounting for overdispersion are illustrated through the analysis of a community-intervention trial on Solar Water Disinfection in rural Bolivia. PMID:19672840
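
    The data-generating model and method (i), the cluster-level t-test on rates, can be sketched as follows; all parameters (cluster counts, rates, dispersion, observation times) are illustrative assumptions, not the study's values.

```python
# Overdispersed cluster counts with varying person-time, analyzed by the
# cluster-level t-test on rates (method (i) above). Parameters assumed.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

def simulate_arm(n_clusters, cluster_size, rate, dispersion):
    """Cluster-level event rates with gamma-induced overdispersion."""
    cluster_rates = []
    for _ in range(n_clusters):
        # gamma(shape=dispersion, mean=1) cluster effect -> negative
        # binomial marginal counts (overdispersed Poisson)
        mu = rate * rng.gamma(dispersion, 1.0 / dispersion)
        time = rng.uniform(0.5, 1.0, cluster_size)   # person-time varies
        counts = rng.poisson(mu * time)
        cluster_rates.append(counts.sum() / time.sum())
    return np.array(cluster_rates)

control = simulate_arm(50, 30, rate=2.0, dispersion=4.0)
treated = simulate_arm(50, 30, rate=1.4, dispersion=4.0)   # 30% reduction
t_stat, p = stats.ttest_ind(treated, control)
print("cluster-level t-test p =", round(p, 4))
```

Repeating this over many simulated trials, with GEE or GLMM fits in place of the t-test, reproduces the kind of coverage and power comparison the abstract describes.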

  6. On Adequate Comparisons of Antenna Phase Center Variations

    NASA Astrophysics Data System (ADS)

    Schoen, S.; Kersten, T.

    2013-12-01

    One important part of ensuring the high quality of the International GNSS Service's (IGS) products is the collection and publication of receiver- and satellite-antenna phase center variations (PCV). The PCV are crucial for global and regional networks, since they introduce a global scale factor of up to 16 ppb or changes in the height component of up to 10 cm, respectively. Furthermore, antenna phase center variations are also important for precise orbit determination, navigation and positioning of mobile platforms, like e.g. the GOCE and GRACE gravity missions, or for accurate Precise Point Positioning (PPP) processing. Using the EUREF Permanent Network (EPN), Baire et al. (2012) showed that individual PCV values have a significant impact on geodetic positioning. These statements are further supported by studies of Steigenberger et al. (2013), where the impact of PCV on local ties is analysed. Currently, there are five calibration institutions, including the Institut für Erdmessung (IfE), contributing to the IGS PCV file. Different approaches, like field calibrations and anechoic chamber measurements, are in use. Additionally, the computation and parameterization of the PCV differ completely among the methods. Therefore, every new approach has to pass a benchmark test in order to ensure that variations of PCV values of an identical antenna obtained from different methods are as consistent as possible. Since the number of approaches to obtain these PCV values rises with the number of calibration institutions, there is a necessity for an adequate comparison concept, taking into account not only the numerical values but also stochastic information and computational issues of the determined PCVs. This is of special importance, since the majority of calibrated receiver antennas published by the IGS originate from absolute field calibrations based on the Hannover Concept, Wübbena et al. (2000). In this contribution, a concept for the adequate

  7. Strain analysis from objects with a random distribution: A generalized center-to-center method

    NASA Astrophysics Data System (ADS)

    Shan, Yehua; Liang, Xinquan

    2014-03-01

    Existing methods of strain analysis, such as the center-to-center method and the Fry method, estimate strain from the spatial relationship between point objects in the deformed state. They assume a truncated Poisson distribution of point objects in the pre-deformed state. Significant deviations occur in nature and diffuse the central vacancy in a Fry plot, limiting its effectiveness as a strain gauge. Therefore, a generalized center-to-center method is proposed to deal with point objects following the more general Poisson distribution, in which the outcome does not depend on an analysis of a graphical central vacancy. This new method relies upon the probability mass function of the Poisson distribution and adopts the maximum likelihood method to solve for strain. The feasibility of the method is demonstrated by applying it to artificial data sets generated for known strains. Further analysis of these sets by use of the bootstrap method shows that the accuracy of the strain estimate has a strong tendency to increase either with point number or with the inclusion of more pre-deformation nearest neighbors. A poorly sorted, well packed, deformed conglomerate is analyzed, yielding a strain estimate similar to the vector mean of the major axis directions of pebbles and the harmonic mean of their axial ratios from a shape-based strain determination method. These outcomes support the applicability of the new method to the analysis of deformed rocks with appropriate strain markers.

  8. Smoke alarm tests may not adequately indicate smoke alarm function.

    PubMed

    Peek-Asa, Corinne; Yang, Jingzhen; Hamann, Cara; Young, Tracy

    2011-01-01

    Smoke alarms are one of the most promoted prevention strategies to reduce residential fire deaths, and they can reduce residential fire deaths by half. Smoke alarm function can be measured by two tests: the smoke alarm button test and the chemical smoke test. Using results from a randomized trial of smoke alarms, we compared smoke alarm response to the button test and the smoke test. The smoke alarms found in the study homes at baseline were tested, as well as study alarms placed into homes as part of the randomized trial. Study alarms were tested at 12 and 42 months postinstallation. The proportion of alarms that passed the button test but not the smoke test ranged from 0.5 to 5.8% of alarms; this result was found most frequently among ionization alarms with zinc or alkaline batteries. These alarms would indicate to the owner (through the button test) that the smoke alarm was working, but the alarm would not actually respond in the case of a fire (as demonstrated by failing the smoke test). The proportion of alarms that passed the smoke test but not the button test ranged from 1.0 to 3.0%. These alarms would appear nonfunctional to the owner (because the button test failed), even though the alarm would operate in response to a fire (as demonstrated by passing the smoke test). The general public is not aware of the potential for inaccuracy in smoke alarm tests, and burn professionals can advocate for enhanced testing methods. The optimal test to determine smoke alarm function is the chemical smoke test. PMID:21747329

  9. Hybrid random walk-linear discriminant analysis method for unwrapping quantitative phase microscopy images of biological samples

    PubMed Central

    Kim, Diane N. H.; Teitell, Michael A.; Reed, Jason; Zangle, Thomas A.

    2015-01-01

    Abstract. Standard algorithms for phase unwrapping often fail for interferometric quantitative phase imaging (QPI) of biological samples due to the variable morphology of these samples and the requirement to image at low light intensities to avoid phototoxicity. We describe a new algorithm combining random walk-based image segmentation with linear discriminant analysis (LDA)-based feature detection, using assumptions about the morphology of biological samples to account for phase ambiguities when standard methods have failed. We present three versions of our method: first, a method for LDA image segmentation based on a manually compiled training dataset; second, a method using a random walker (RW) algorithm informed by the assumed properties of a biological phase image; and third, an algorithm which combines LDA-based edge detection with an efficient RW algorithm. We show that the combination of LDA plus the RW algorithm gives the best overall performance with little speed penalty compared to LDA alone, and that this algorithm can be further optimized using a genetic algorithm to yield superior performance for phase unwrapping of QPI data from biological samples. PMID:26305212

  10. Hybrid random walk-linear discriminant analysis method for unwrapping quantitative phase microscopy images of biological samples.

    PubMed

    Kim, Diane N H; Teitell, Michael A; Reed, Jason; Zangle, Thomas A

    2015-01-01

    Standard algorithms for phase unwrapping often fail for interferometric quantitative phase imaging (QPI) of biological samples due to the variable morphology of these samples and the requirement to image at low light intensities to avoid phototoxicity. We describe a new algorithm combining random walk-based image segmentation with linear discriminant analysis (LDA)-based feature detection, using assumptions about the morphology of biological samples to account for phase ambiguities when standard methods have failed. We present three versions of our method: first, a method for LDA image segmentation based on a manually compiled training dataset; second, a method using a random walker (RW) algorithm informed by the assumed properties of a biological phase image; and third, an algorithm which combines LDA-based edge detection with an efficient RW algorithm. We show that the combination of LDA plus the RW algorithm gives the best overall performance with little speed penalty compared to LDA alone, and that this algorithm can be further optimized using a genetic algorithm to yield superior performance for phase unwrapping of QPI data from biological samples. PMID:26305212

  11. Hybrid random walk-linear discriminant analysis method for unwrapping quantitative phase microscopy images of biological samples

    NASA Astrophysics Data System (ADS)

    Kim, Diane N. H.; Teitell, Michael A.; Reed, Jason; Zangle, Thomas A.

    2015-11-01

    Standard algorithms for phase unwrapping often fail for interferometric quantitative phase imaging (QPI) of biological samples due to the variable morphology of these samples and the requirement to image at low light intensities to avoid phototoxicity. We describe a new algorithm combining random walk-based image segmentation with linear discriminant analysis (LDA)-based feature detection, using assumptions about the morphology of biological samples to account for phase ambiguities when standard methods have failed. We present three versions of our method: first, a method for LDA image segmentation based on a manually compiled training dataset; second, a method using a random walker (RW) algorithm informed by the assumed properties of a biological phase image; and third, an algorithm which combines LDA-based edge detection with an efficient RW algorithm. We show that the combination of LDA plus the RW algorithm gives the best overall performance with little speed penalty compared to LDA alone, and that this algorithm can be further optimized using a genetic algorithm to yield superior performance for phase unwrapping of QPI data from biological samples.
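
    The underlying ambiguity, phase measured only modulo 2π, can be illustrated in 1D with numpy's cumulative unwrapping; real QPI data are 2D and noisy, which is where such simple approaches fail and the authors' method applies.

```python
# The core ambiguity: an interferometer measures phase only modulo 2*pi.
# A 1D sketch; numpy's unwrap succeeds here because the true phase is
# smooth and noise-free, unlike real QPI images.
import numpy as np

true_phase = np.linspace(0, 12.0, 200)          # smooth phase exceeding 2*pi
wrapped = np.angle(np.exp(1j * true_phase))     # measured: folded into (-pi, pi]
unwrapped = np.unwrap(wrapped)                  # cumulative 1D unwrapping

print("max |wrapped - true|:  ", round(float(np.max(np.abs(wrapped - true_phase))), 2))
print("max |unwrapped - true|:", float(np.max(np.abs(unwrapped - true_phase))))
```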

  12. Predictors for Reporting of Dietary Assessment Methods in Food-based Randomized Controlled Trials over a Ten-year Period.

    PubMed

    Probst, Yasmine; Zammit, Gail

    2016-09-01

    The importance of monitoring dietary intake within a randomized controlled trial becomes vital to justification of the study outcomes when the study is food-based. A systematic literature review was conducted to determine how dietary assessment methods used to monitor dietary intake are reported and whether assisted technologies are used in conducting such assessments. OVID and ScienceDirect databases 2000-2010 were searched for food-based, parallel, randomized controlled trials conducted with humans using the search terms "clinical trial," "diet$ intervention" AND "diet$ assessment," "diet$ method$," "intake," "diet history," "food record," "food frequency questionnaire," "FFQ," "food diary," "24-hour recall." A total of 1364 abstracts were reviewed and 243 studies identified. The size of the study and country of origin appear to be the two most common predictors of reporting both the dietary assessment method and details of the form of assessment. The journal in which the study is published has no impact. Information technology use may increase in the future allowing other methods and forms of dietary assessment to be used efficiently. PMID:26212597

  13. Correcting treatment effect for treatment switching in randomized oncology trials with a modified iterative parametric estimation method.

    PubMed

    Zhang, Jin; Chen, Cong

    2016-09-20

    In randomized oncology trials, patients in the control arm are sometimes permitted to switch to receive experimental drug after disease progression. This is mainly due to ethical reasons or to reduce the patient dropout rate. While progression-free survival is not usually impacted by crossover, the treatment effect on overall survival can be highly confounded. The rank-preserving structural failure time (RPSFT) model and iterative parametric estimation (IPE) are the main randomization-based methods used to adjust for confounding in the analysis of overall survival. While the RPSFT has been extensively studied, the properties of the IPE have not been thoroughly examined and its application is not common. In this manuscript, we clarify the re-censoring algorithm needed for IPE estimation and incorporate it into a method we propose as modified IPE (MIPE). We compared the MIPE and RPSFT via extensive simulations and then walked through the analysis using the modified IPE in a real clinical trial. We provided practical guidance on bootstrap by examining the performance in estimating the variance and confidence interval for the MIPE. Our results indicate that the MIPE method with the proposed re-censoring rule is an attractive alternative to the RPSFT method. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26919271

  14. Estimates of Adequate School Spending by State Based on National Average Service Levels.

    ERIC Educational Resources Information Center

    Miner, Jerry

    1983-01-01

    Proposes a method for estimating expenditures per student needed to provide educational adequacy in each state. Illustrates the method using U.S., Arkansas, New York, Texas, and Washington State data, covering instruction, special needs, operations and maintenance, administration, and other costs. Estimates ratios of "adequate" to actual spending…

  15. Arabidopsis: An Adequate Model for Dicot Root Systems?

    PubMed

    Zobel, Richard W

    2016-01-01

    The Arabidopsis root system is frequently considered to have only three classes of root: primary, lateral, and adventitious. Research with other plant species has suggested up to eight different developmental/functional classes of root for a given plant root system. If Arabidopsis has only three classes of root, it may not be an adequate model for eudicot plant root systems. Recent research, however, can be interpreted to suggest that pre-flowering Arabidopsis does have at least five of these classes of root. This then suggests that Arabidopsis root research can be considered an adequate model for dicot plant root systems. PMID:26904040

  16. Evaluating the Bookmark Standard Setting Method: The Impact of Random Item Ordering

    ERIC Educational Resources Information Center

    Davis-Becker, Susan L.; Buckendahl, Chad W.; Gerrow, Jack

    2011-01-01

    Throughout the world, cut scores are an important aspect of a high-stakes testing program because they are a key operational component of the interpretation of test scores. One method for setting standards that is prevalent in educational testing programs--the Bookmark method--is intended to be a less cognitively complex alternative to methods…

  17. MULTILEVEL ACCELERATION OF STOCHASTIC COLLOCATION METHODS FOR PDE WITH RANDOM INPUT DATA

    SciTech Connect

    Webster, Clayton G; Jantsch, Peter A; Teckentrup, Aretha L; Gunzburger, Max D

    2013-01-01

    Stochastic Collocation (SC) methods for stochastic partial differential equations (SPDEs) suffer from the curse of dimensionality, whereby increases in the stochastic dimension cause an explosion of computational effort. To combat these challenges, multilevel approximation methods seek to decrease computational complexity by balancing spatial and stochastic discretization errors. As a form of variance reduction, multilevel techniques have been successfully applied to Monte Carlo (MC) methods, but may be extended to accelerate other methods for SPDEs in which the stochastic and spatial degrees of freedom are decoupled. This article presents a general convergence and computational complexity analysis of a multilevel method for SPDEs, demonstrating its advantages with regard to standard, single-level approximation. The numerical results highlight conditions under which multilevel sparse grid SC is preferable to the more traditional MC and SC approaches.
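
    The multilevel Monte Carlo idea that the article extends can be sketched on a toy problem, estimating E[S(1)] for geometric Brownian motion with coupled Euler-Maruyama levels; all model parameters and per-level sample counts are illustrative assumptions.

```python
# Multilevel Monte Carlo sketch: telescope E[Y_L] = E[Y_0] + sum E[Y_l - Y_{l-1}]
# over discretization levels, coupling fine and coarse paths through shared
# Brownian increments. Toy problem: E[S(1)] for GBM, exact answer exp(r).
import numpy as np

rng = np.random.default_rng(5)
r, sigma, T = 0.05, 0.2, 1.0

def level_estimator(l, n_samples):
    """Mean of Y_l - Y_{l-1} using coupled fine/coarse Euler paths."""
    nf = 2 ** l
    dt = T / nf
    dW = rng.normal(0.0, np.sqrt(dt), (n_samples, nf))
    Sf = np.ones(n_samples)
    for i in range(nf):                      # fine Euler path
        Sf = Sf + r * Sf * dt + sigma * Sf * dW[:, i]
    if l == 0:
        return Sf.mean()
    Sc = np.ones(n_samples)
    dWc = dW[:, 0::2] + dW[:, 1::2]          # coarse increments from fine ones
    for i in range(nf // 2):                 # coarse Euler path
        Sc = Sc + r * Sc * (2 * dt) + sigma * Sc * dWc[:, i]
    return (Sf - Sc).mean()

# Fewer samples are needed on the finer (costlier) levels, since the
# coupled differences Y_l - Y_{l-1} have small variance.
samples_per_level = [200000, 50000, 12000, 3000, 800]
estimate = sum(level_estimator(l, n) for l, n in enumerate(samples_per_level))
print("MLMC estimate:", round(float(estimate), 4), "exact:", round(np.exp(r), 4))
```

The abstract's point is that the same telescoping trick can accelerate sparse grid SC whenever spatial and stochastic degrees of freedom decouple.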

  18. An analytical method for disentangling the roles of adhesion and crowding for random walk models on a crowded lattice.

    PubMed

    Ellery, Adam J; Baker, Ruth E; Simpson, Matthew J

    2016-01-01

    Migration of cells and molecules in vivo is affected by interactions with obstacles. These interactions can include crowding effects, as well as adhesion/repulsion between the motile cell/molecule and the obstacles. Here we present an analytical framework that can be used to separately quantify the roles of crowding and adhesion/repulsion using a lattice-based random walk model. Our method leads to an exact calculation of the long time Fickian diffusivity, and avoids the need for computationally expensive stochastic simulations. PMID:27597573
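
    The crowding effect alone (without adhesion) can be reproduced by the brute-force simulation that the paper's analytical framework avoids; lattice size, obstacle density and walker count below are illustrative assumptions.

```python
# Crowding on a lattice random walk: obstacles reduce the mean squared
# displacement (MSD), i.e. the effective diffusivity. A brute-force sketch
# of the effect the paper quantifies analytically.
import numpy as np

rng = np.random.default_rng(6)
L = 50                                       # periodic L x L lattice
obstacle_density = 0.3
steps, walkers = 400, 500

obstacles = rng.random((L, L)) < obstacle_density
moves = np.array([(1, 0), (-1, 0), (0, 1), (0, -1)])

def mean_squared_displacement(blocked):
    free = np.argwhere(~blocked)             # walkers start on free sites
    pos = free[rng.choice(len(free), walkers)]
    unwrapped = pos.astype(float)            # positions without wrapping
    start = unwrapped.copy()
    for _ in range(steps):
        step = moves[rng.integers(0, 4, walkers)]
        new = (pos + step) % L               # periodic boundaries
        ok = ~blocked[new[:, 0], new[:, 1]]  # reject moves onto obstacles
        pos[ok] = new[ok]
        unwrapped[ok] += step[ok]
    return float(np.mean(np.sum((unwrapped - start) ** 2, axis=1)))

msd_free = mean_squared_displacement(np.zeros((L, L), dtype=bool))
msd_crowded = mean_squared_displacement(obstacles)
print("MSD, no obstacles:  ", round(msd_free, 1))    # ~ steps, since D = 1/4
print("MSD, 30% obstacles: ", round(msd_crowded, 1))
```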

  19. Known plaintext attack on double random phase encoding using fingerprint as key and a method for avoiding the attack.

    PubMed

    Tashima, Hideaki; Takeda, Masafumi; Suzuki, Hiroyuki; Obi, Takashi; Yamaguchi, Masahiro; Ohyama, Nagaaki

    2010-06-21

    We have shown that the application of double random phase encoding (DRPE) to biometrics enables the use of biometrics as cipher keys for binary data encryption. However, DRPE is reported to be vulnerable to known-plaintext attacks (KPAs) using a phase recovery algorithm. In this study, we investigated the vulnerability of DRPE using fingerprints as cipher keys to the KPAs. By means of computational experiments, we estimated the encryption key and restored the fingerprint image using the estimated key. Further, we propose a method for avoiding the KPA on the DRPE that employs the phase retrieval algorithm. The proposed method makes the amplitude component of the encrypted image constant in order to prevent the amplitude component of the encrypted image from being used as a clue for phase retrieval. Computational experiments showed that the proposed method not only avoids revealing the cipher key and the fingerprint but also serves as a sufficiently accurate verification system. PMID:20588510
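
    DRPE itself can be sketched with numpy FFTs: two random phase masks, one in the input plane and one in the Fourier plane, encrypt the image, and the conjugate masks decrypt it. Image and mask sizes are illustrative assumptions, and neither the fingerprint keys nor the KPA countermeasure is modelled here.

```python
# Double random phase encoding (DRPE) sketch: encrypt with random phase
# masks in the input and Fourier planes, decrypt with the conjugate keys.
import numpy as np

rng = np.random.default_rng(7)
N = 64
img = rng.random((N, N))                           # stand-in plaintext image

phase1 = np.exp(2j * np.pi * rng.random((N, N)))   # input-plane mask (key 1)
phase2 = np.exp(2j * np.pi * rng.random((N, N)))   # Fourier-plane mask (key 2)

# Encryption: image -> mask 1 -> FFT -> mask 2 -> inverse FFT
cipher = np.fft.ifft2(np.fft.fft2(img * phase1) * phase2)

# Decryption with the conjugate keys reverses each step exactly
decrypted = np.fft.ifft2(np.fft.fft2(cipher) * np.conj(phase2)) * np.conj(phase1)

err = float(np.max(np.abs(decrypted.real - img)))
print("max reconstruction error:", err)            # floating-point level only
```

The vulnerability the abstract discusses stems from the amplitude of `cipher` leaking information; the proposed countermeasure forces that amplitude to be constant.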

  20. Minkowski-Voronoi diagrams as a method to generate random packings of spheropolygons for the simulation of soils

    NASA Astrophysics Data System (ADS)

    Galindo-Torres, S. A.; Muñoz, J. D.; Alonso-Marroquín, F.

    2010-11-01

    Minkowski operators (dilation and erosion of sets in vector spaces) have been extensively used in computer graphics, image processing to analyze the structure of materials, and more recently in molecular dynamics. Here, we apply those mathematical concepts to extend the discrete element method to simulate granular materials with complex-shaped particles. The Voronoi-Minkowski diagrams are introduced to generate random packings of complex-shaped particles with tunable particle roundness. Contact forces and potentials are calculated in terms of distances instead of overlaps. By using the Verlet method to detect neighborhood, we achieve CPU times that grow linearly with the body’s number of sides. Simulations of dissipative granular materials under shear demonstrate that the method maintains conservation of energy in accord with the first law of thermodynamics. A series of simulations for biaxial test, shear band formation, hysteretic behavior, and ratcheting show that the model can reproduce the main features of real granular-soil behavior.

  1. Is the Marketing Concept Adequate for Continuing Education?

    ERIC Educational Resources Information Center

    Rittenburg, Terri L.

    1984-01-01

    Because educators have a social responsibility to those they teach, the marketing concept may not be adequate as a philosophy for continuing education. In attempting to broaden the audience for continuing education, educators should consider a societal marketing concept to meet the needs of the educationally disadvantaged. (SK)

  2. Comparability and Reliability Considerations of Adequate Yearly Progress

    ERIC Educational Resources Information Center

    Maier, Kimberly S.; Maiti, Tapabrata; Dass, Sarat C.; Lim, Chae Young

    2012-01-01

    The purpose of this study is to develop an estimate of Adequate Yearly Progress (AYP) that will allow for reliable and valid comparisons among student subgroups, schools, and districts. A shrinkage-type estimator of AYP using the Bayesian framework is described. Using simulated data, the performance of the Bayes estimator will be compared to…

  3. 9 CFR 305.3 - Sanitation and adequate facilities.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Sanitation and adequate facilities. 305.3 Section 305.3 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE AGENCY ORGANIZATION AND TERMINOLOGY; MANDATORY MEAT AND POULTRY PRODUCTS INSPECTION AND VOLUNTARY INSPECTION AND CERTIFICATION...

  4. Understanding Your Adequate Yearly Progress (AYP), 2011-2012

    ERIC Educational Resources Information Center

    Missouri Department of Elementary and Secondary Education, 2011

    2011-01-01

    The "No Child Left Behind Act (NCLB) of 2001" requires all schools, districts/local education agencies (LEAs) and states to show that students are making Adequate Yearly Progress (AYP). NCLB requires states to establish targets in the following ways: (1) Annual Proficiency Target; (2) Attendance/Graduation Rates; and (3) Participation Rates.…

  5. Assessing Juvenile Sex Offenders to Determine Adequate Levels of Supervision.

    ERIC Educational Resources Information Center

    Gerdes, Karen E.; And Others

    1995-01-01

    This study analyzed the internal consistency of four inventories used by Utah probation officers to determine adequate and efficacious supervision levels and placement for juvenile sex offenders. Three factors accounted for 41.2 percent of variance (custodian's and juvenile's attitude toward intervention, offense characteristics, and historical…

  6. 34 CFR 200.13 - Adequate yearly progress in general.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 34 Education 1 2011-07-01 2011-07-01 false Adequate yearly progress in general. 200.13 Section 200.13 Education Regulations of the Offices of the Department of Education OFFICE OF ELEMENTARY AND SECONDARY EDUCATION, DEPARTMENT OF EDUCATION TITLE I-IMPROVING THE ACADEMIC ACHIEVEMENT OF THE...

  7. 34 CFR 200.20 - Making adequate yearly progress.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 34 Education 1 2011-07-01 2011-07-01 false Making adequate yearly progress. 200.20 Section 200.20 Education Regulations of the Offices of the Department of Education OFFICE OF ELEMENTARY AND SECONDARY EDUCATION, DEPARTMENT OF EDUCATION TITLE I-IMPROVING THE ACADEMIC ACHIEVEMENT OF THE DISADVANTAGED...

  8. Do Beginning Teachers Receive Adequate Support from Their Headteachers?

    ERIC Educational Resources Information Center

    Menon, Maria Eliophotou

    2012-01-01

    The article examines the problems faced by beginning teachers in Cyprus and the extent to which headteachers are considered to provide adequate guidance and support to them. Data were collected through interviews with 25 school teachers in Cyprus, who had recently entered teaching (within 1-5 years) in public primary schools. According to the…

  9. Are Substance Use Prevention Programs More Effective in Schools Making Adequate Yearly Progress? A Study of Project ALERT

    ERIC Educational Resources Information Center

    Clark, Heddy Kovach; Ringwalt, Chris L.; Shamblen, Stephen R.; Hanley, Sean M.; Flewelling, Robert L.

    2011-01-01

    This exploratory study sought to determine if a popular school-based drug prevention program might be effective in schools that are making adequate yearly progress (AYP). Thirty-four schools with grades 6 through 8 in 11 states were randomly assigned either to receive Project ALERT (n = 17) or to a control group (n = 17); of these, 10 intervention…

  10. Comparison of cast materials for the treatment of congenital idiopathic clubfoot using the Ponseti method: a prospective randomized controlled trial

    PubMed Central

    Hui, Catherine; Joughin, Elaine; Nettel-Aguirre, Alberto; Goldstein, Simon; Harder, James; Kiefer, Gerhard; Parsons, David; Brauer, Carmen; Howard, Jason

    2014-01-01

    Background: The Ponseti method of congenital idiopathic clubfoot correction has traditionally specified plaster of Paris (POP) as the cast material of choice; however, there are negative aspects to using POP. We sought to determine the influence of cast material (POP v. semirigid fibreglass [SRF]) on clubfoot correction using the Ponseti method. Methods: Patients were randomized to POP or SRF before undergoing the Ponseti method. The primary outcome measure was the number of casts required for clubfoot correction. Secondary outcome measures included the number of casts by severity, ease of cast removal, need for Achilles tenotomy, brace compliance, deformity relapse, need for repeat casting and need for ancillary surgical procedures. Results: We enrolled 30 patients: 12 randomized to POP and 18 to SRF. There was no difference in the number of casts required for clubfoot correction between the groups (p = 0.13). According to parents, removal of POP was more difficult (p < 0.001), more time consuming (p < 0.001) and required more than 1 method (p < 0.001). At a final follow-up of 30.8 months, the mean times to deformity relapse requiring repeat casting, surgery or both were 18.7 and 16.4 months for the SRF and POP groups, respectively. Conclusion: There was no significant difference in the number of casts required for correction of clubfoot between the 2 materials, but SRF resulted in a more favourable parental experience, which cannot be ignored as it may have a positive impact on psychological well-being despite the increased cost associated with SRF. PMID:25078929

  11. A Meta-Analysis of Randomized Controlled Trials of Yiqi Yangyin Huoxue Method in Treating Diabetic Nephropathy.

    PubMed

    Ou, Jiao Ying; Huang, Di; Wu, Yan Sheng; Xu, Lin; He, Fei; Wang, Hui Ling; Shi, Li Qiang; Wan, Qiang; He, Li Qun; Dong Gao, Jian

    2016-01-01

    Objective. The purpose of this systematic review is to evaluate the evidence of Yiqi Yangyin Huoxue Method for diabetic nephropathy. Methods. 11 electronic databases, through September 2015, were searched to identify randomized controlled trials of Yiqi Yangyin Huoxue Method for diabetic nephropathy. The quality of the included trials was assessed using the Jadad scale. Results. 26 randomized controlled trials were included in our review. Most of the included trials were considered high quality. The aggregated results suggested that Yiqi Yangyin Huoxue Method is beneficial to diabetic nephropathy in bringing down the microalbuminuria (SMD = -0.98, 95% CI -1.22 to -0.74), serum creatinine (SMD = -0.56, 95% CI -0.93 to -0.20), beta-2 microglobulin (MD = 0.06, 95% CI 0.01 to 0.12), fasting plasma glucose (MD = -0.35, 95% CI -0.62 to -0.08), and 2-hour postprandial blood glucose (MD = 1.13, 95% CI 0.07 to 2.20), but not in decreasing blood urea nitrogen (SMD = -0.72, 95% CI -1.47 to 0.02) or 2-hour postprandial blood glucose (SMD = -0.48, 95% CI -1.01 to 0.04). Conclusions. Yiqi Yangyin Huoxue Method should be a valid complementary and alternative therapy in the management of diabetic nephropathy, especially in improving UAER, serum creatinine, fasting blood glucose, and beta-2 microglobulin. However, more studies with long follow-up are warranted to confirm the current findings. PMID:27313643
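    The aggregation step behind summary figures such as "SMD = -0.98, 95% CI -1.22 to -0.74" is inverse-variance pooling of per-trial effects. Below is a minimal fixed-effect sketch of that step; the review itself may have used a random-effects model, and the function name and inputs are illustrative only.

```python
import numpy as np

def pooled_smd(effects, variances):
    # Fixed-effect inverse-variance pooling of standardized mean
    # differences (SMDs): weight each trial by 1/variance, then report
    # the pooled estimate with a normal-approximation 95% CI.
    w = 1.0 / np.asarray(variances, dtype=float)
    est = float(np.sum(w * np.asarray(effects, dtype=float)) / np.sum(w))
    se = float(np.sqrt(1.0 / np.sum(w)))
    return est, (est - 1.96 * se, est + 1.96 * se)
```

    Trials with smaller variances (typically larger samples) dominate the pooled estimate, which is why a few large trials can drive results like those quoted above.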

  13. Comparison Between Two Methods for Estimating the Vertical Scale of Fluctuation for Modeling Random Geotechnical Problems

    NASA Astrophysics Data System (ADS)

    Pieczyńska-Kozłowska, Joanna M.

    2015-12-01

    The design process in geotechnical engineering requires the most accurate possible mapping of the soil. The difficulty lies in the spatial variability of soil parameters, which has been a subject of investigation by many researchers for many years. This study analyses the soil-modeling problem by suggesting two effective methods of acquiring the variability information needed for modeling from cone penetration test (CPT) data. The first method has been used in geotechnical engineering before, but the second has not previously been associated with geotechnics. Both methods are applied to a case study in which the parameters of variability are estimated. Knowledge of the variability of parameters allows, in the long term, more effective estimation of, for example, the probability of bearing-capacity failure.
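    The abstract does not spell out the two estimation methods it compares, but a common baseline for the vertical scale of fluctuation is to fit Vanmarcke's exponential autocorrelation model rho(tau) = exp(-2*tau/theta) to the sample correlogram of a detrended CPT profile. A hedged sketch, with the function name and fitting choices mine:

```python
import numpy as np

def scale_of_fluctuation(profile, dz):
    # Estimate the vertical scale of fluctuation theta from a CPT
    # profile sampled at spacing dz: compute the sample autocorrelation,
    # keep its initial reliably-positive part, and fit the exponential
    # model rho(tau) = exp(-2*tau/theta) by least squares in log space.
    q = np.asarray(profile, dtype=float)
    q = q - q.mean()                      # remove the trend (here: mean)
    n = len(q)
    max_lag = min(n // 4, 500)
    lags = np.arange(max_lag)
    rho = np.array([np.dot(q[: n - k], q[k:]) / np.dot(q, q) for k in lags])
    below = rho <= 0.05                   # truncate where the correlogram dies
    cut = int(np.argmax(below)) if below.any() else max_lag
    tau = lags[:cut] * dz
    slope = np.polyfit(tau, np.log(rho[:cut]), 1)[0]
    return -2.0 / slope                   # model slope is -2/theta
```

    For an AR(1)-type profile with correlation exp(-tau/ell), the model above recovers theta close to 2*ell.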

  14. Methods of and apparatus for recording images occurring just prior to a rapid, random event

    DOEpatents

    Kelley, Edward F.

    1994-01-01

    An apparatus and a method are disclosed for recording images of events in a medium wherein the images that are recorded are of conditions existing just prior to and during the occurrence of an event that triggers recording of these images. The apparatus and method use an optical delay path that employs a spherical focusing mirror facing a circular array of flat return mirrors around a central flat mirror. The image is reflected in a symmetric pattern which balances astigmatism which is created by the spherical mirror. Delays on the order of hundreds of nanoseconds are possible.
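    The scale of such a delay line follows from simple arithmetic: light covers roughly 0.3 m per nanosecond, so buying hundreds of nanoseconds of pre-trigger delay requires tens of metres of optical path, folded between the mirrors. A sketch of that arithmetic (illustrative only; the mirror separation is a made-up parameter, not a figure from the patent):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def delay_path(delay_ns, mirror_separation_m):
    # Total optical path length needed for a given delay, and the number
    # of mirror-to-mirror passes that folds that path into the cavity.
    path_m = C * delay_ns * 1e-9
    passes = path_m / mirror_separation_m
    return path_m, passes
```

    A 100 ns delay therefore needs roughly 30 m of path; with mirrors 1 m apart that is about 30 passes, which is why a symmetric multi-mirror arrangement that balances astigmatism matters.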

  15. Methods of and apparatus for recording images occurring just prior to a rapid, random event

    SciTech Connect

    Kelley, E.F.

    1991-12-31

    An apparatus and a method are disclosed for recording images of events in a medium wherein the images that are recorded are of conditions existing just prior to and during the occurrence of an event that triggers recording of these images. The apparatus and method use an optical delay path that employs a spherical focusing mirror facing a circular array of flat return mirrors around a central flat mirror. The image is reflected in a symmetric pattern which balances astigmatism which is created by the spherical mirror. Delays on the order of hundreds of nanoseconds are possible.

  16. A Robust and Versatile Method of Combinatorial Chemical Synthesis of Gene Libraries via Hierarchical Assembly of Partially Randomized Modules

    PubMed Central

    Popova, Blagovesta; Schubert, Steffen; Bulla, Ingo; Buchwald, Daniela; Kramer, Wilfried

    2015-01-01

    A major challenge in gene library generation is to guarantee a large functional size and diversity that significantly increases the chances of selecting different functional protein variants. The use of trinucleotide mixtures for controlled randomization results in superior library diversity and offers the ability to specify the type and distribution of the amino acids at each position. Here we describe the generation of a high-diversity gene library using tHisF of the hyperthermophile Thermotoga maritima as a scaffold. Combining various rational criteria with contingency, we targeted 26 selected codons of the thisF gene sequence for randomization at a controlled level. We have developed a novel method of creating full-length gene libraries by combinatorial assembly of smaller sub-libraries. Full-length libraries of high diversity can easily be assembled on demand from smaller and much less diverse sub-libraries, which circumvents the notoriously troublesome long-term archiving and repeated propagation of high-diversity ensembles of phages or plasmids. We developed a generally applicable software tool for sequence analysis of mutated gene sequences that provides efficient assistance for analysis of library diversity. Finally, the practical utility of the library was demonstrated in principle by assessing the conformational stability of library members and isolating protein variants with HisF activity from it. Our approach integrates a number of features of nucleic acid synthetic chemistry, biochemistry and molecular genetics into a coherent, flexible and robust method of combinatorial gene synthesis. PMID:26355961

  17. A Functional Networks Estimation Method of Resting-State fMRI Using a Hierarchical Markov Random Field

    PubMed Central

    Liu, Wei; Awate, Suyash P.; Anderson, Jeffrey S.; Fletcher, P. Thomas

    2014-01-01

    We propose a hierarchical Markov random field model that estimates both group and subject functional networks simultaneously. The model takes into account the within-subject spatial coherence as well as the between-subject consistency of the network label maps. The statistical dependency between group and subject networks acts as a regularization, which helps the network estimation on both layers. We use Gibbs sampling to approximate the posterior density of the network labels and Monte Carlo expectation maximization to estimate the model parameters. We compare our method with two alternative segmentation methods based on K-Means and normalized cuts, using synthetic and real fMRI data. The experimental results show our proposed model is able to identify both group and subject functional networks with higher accuracy, more robustness, and inter-session consistency. PMID:24954282
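    The mechanics of Gibbs sampling over a spatially coherent label map can be conveyed with a toy, single-level stand-in for the paper's model: each site's network label is resampled from its full conditional under a Gaussian data term and a Potts smoothness prior over its four neighbours. The actual model is hierarchical (coupled group and subject layers, MCEM for parameters); the unit-variance likelihood and the `beta` weight here are my simplifications.

```python
import numpy as np

def gibbs_sweep(labels, data, means, beta, rng):
    # One Gibbs sweep over a 2-D label map: at each site, combine the
    # Gaussian log-likelihood (fixed class means, unit variance) with a
    # Potts bonus of `beta` per agreeing neighbour, then resample the
    # label from the normalized conditional distribution.
    H, W = labels.shape
    K = len(means)
    for i in range(H):
        for j in range(W):
            logp = -0.5 * (data[i, j] - means) ** 2
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < H and 0 <= nj < W:
                    logp = logp + beta * (np.arange(K) == labels[ni, nj])
            p = np.exp(logp - logp.max())   # stabilize before normalizing
            labels[i, j] = rng.choice(K, p=p / p.sum())
    return labels
```

    On synthetic two-region data a few sweeps are enough for the labels to lock onto the true partition, illustrating how the spatial prior regularizes noisy per-voxel evidence.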

  18. A new method for predicting response in complex linear systems. II. [under random or deterministic steady state excitation

    NASA Technical Reports Server (NTRS)

    Bogdanoff, J. L.; Kayser, K.; Krieger, W.

    1977-01-01

    The paper describes convergence and response studies in the low frequency range of complex systems, particularly with low values of damping of different distributions, and reports on the modification of the relaxation procedure required under these conditions. A new method is presented for response estimation in complex lumped parameter linear systems under random or deterministic steady state excitation. The essence of the method is the use of relaxation procedures with a suitable error function to find the estimated response; natural frequencies and normal modes are not computed. For a 45 degree of freedom system, and two relaxation procedures, convergence studies and frequency response estimates were performed. The low frequency studies are considered in the framework of earlier studies (Kayser and Bogdanoff, 1975) involving the mid to high frequency range.

  19. RANDOM LASSO.

    PubMed

    Wang, Sijian; Nan, Bin; Rosset, Saharon; Zhu, Ji

    2011-03-01

    We propose a computationally intensive method, the random lasso method, for variable selection in linear models. The method consists of two major steps. In step 1, the lasso method is applied to many bootstrap samples, each using a set of randomly selected covariates. A measure of importance is yielded from this step for each covariate. In step 2, a similar procedure to the first step is implemented with the exception that for each bootstrap sample, a subset of covariates is randomly selected with unequal selection probabilities determined by the covariates' importance. Adaptive lasso may be used in the second step with weights determined by the importance measures. The final set of covariates and their coefficients are determined by averaging bootstrap results obtained from step 2. The proposed method alleviates some of the limitations of lasso, elastic-net and related methods noted especially in the context of microarray data analysis: it tends to remove highly correlated variables altogether or select them all, and maintains maximal flexibility in estimating their coefficients, particularly with different signs; the number of selected variables is no longer limited by the sample size; and the resulting prediction accuracy is competitive or superior compared to the alternatives. We illustrate the proposed method by extensive simulation studies. The proposed method is also applied to a Glioblastoma microarray data analysis. PMID:22997542
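    The two steps described above can be sketched directly. This is a simplified reading of the procedure, not the authors' reference implementation: parameter names (`q1`, `q2`, `n_boot`, `alpha`) are mine, a plain lasso is used in step 2 rather than the adaptive-lasso variant the abstract mentions as an option, and the penalty is fixed rather than tuned per bootstrap.

```python
import numpy as np
from sklearn.linear_model import Lasso

def random_lasso(X, y, n_boot=100, q1=None, q2=None, alpha=0.1, seed=0):
    # Step 1: lasso on bootstrap samples, each restricted to q1 randomly
    # chosen covariates; averaging the coefficients gives an importance
    # score per covariate. Step 2: repeat, now drawing covariates with
    # probability proportional to importance, and average again.
    rng = np.random.default_rng(seed)
    n, p = X.shape
    q1 = q1 if q1 is not None else max(1, p // 2)
    q2 = q2 if q2 is not None else q1

    def averaged_coefs(probs, q):
        total = np.zeros(p)
        for _ in range(n_boot):
            rows = rng.integers(0, n, n)                      # bootstrap rows
            cols = rng.choice(p, size=q, replace=False, p=probs)
            fit = Lasso(alpha=alpha).fit(X[np.ix_(rows, cols)], y[rows])
            total[cols] += fit.coef_
        return total / n_boot

    beta1 = averaged_coefs(np.full(p, 1.0 / p), q1)           # step 1
    imp = np.abs(beta1)
    if imp.sum() == 0.0:                                      # nothing selected
        return beta1
    q2 = min(q2, int(np.count_nonzero(imp)))
    return averaged_coefs(imp / imp.sum(), q2)                # step 2
```

    Because each bootstrap fit sees only a random subset of covariates, correlated variables are not forced to compete in every fit, which is the intuition behind the method's behaviour on correlated designs.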

  20. Unsteady Fast Random Particle Mesh method for efficient prediction of tonal and broadband noises of a centrifugal fan unit

    NASA Astrophysics Data System (ADS)

    Heo, Seung; Cheong, Cheolung; Kim, Taehoon

    2015-09-01

    In this study, an efficient numerical method is proposed for predicting the tonal and broadband noises of a centrifugal fan unit. The proposed method is based on Hybrid Computational Aero-Acoustic (H-CAA) techniques combined with the Unsteady Fast Random Particle Mesh (U-FRPM) method. The U-FRPM method is developed by extending the FRPM method proposed by Ewert et al. and is utilized to synthesize the turbulence flow field from unsteady RANS solutions. The H-CAA technique combined with the U-FRPM method is applied to predict broadband as well as tonal noises of a centrifugal fan unit in a household refrigerator. First, the unsteady flow field driven by the rotating fan is computed by solving the RANS equations with Computational Fluid Dynamics (CFD) techniques. Main source regions around the rotating fan are identified by examining the computed flow fields. Then, turbulence flow fields in the main source regions are synthesized by applying the U-FRPM method. The acoustic analogy is applied to model acoustic sources in the main source regions. Finally, the centrifugal fan noise is predicted by feeding the modeled acoustic sources into an acoustic solver based on the Boundary Element Method (BEM). The sound spectral levels predicted using the current numerical method show good agreement with the measured spectra at the Blade Pass Frequencies (BPFs) as well as in the high frequency range. Moreover, the present method enables quantitative assessment of the relative contributions of the identified source regions to the sound field by comparing the predicted sound pressure spectra due to the modeled sources.

  1. Detection and imaging in random scattering media with the D.O.R.T. method

    NASA Astrophysics Data System (ADS)

    Kerbrat, E.; Prada, C.; Cassereau, D.; Fink, M.

    2001-04-01

    Some media, such as titanium alloys, contain a grain structure that makes detection of defects very difficult. In this case, the D.O.R.T. method is a good solution to separate the echo of the defect from the microstructure contribution. Results illustrating the detection of flat bottom holes (FBH) of different diameters located at 100 mm and 140 mm depth in a titanium alloy sample will be presented.

  2. Accelerating Particle Filter Using Randomized Multiscale and Fast Multipole Type Methods.

    PubMed

    Shabat, Gil; Shmueli, Yaniv; Bermanis, Amit; Averbuch, Amir

    2015-07-01

    The particle filter is a powerful tool for state tracking using non-linear observations. We present a multiscale-based method that accelerates the tracking computation by particle filters. Unlike the conventional way, which calculates weights over all particles in each cycle of the algorithm, we sample a small subset from the source particles using matrix decomposition methods. Then, we apply a function extension algorithm that uses the particle subset to recover the density function for all the remaining particles not included in the chosen subset. The computational effort is substantial, especially when multiple objects are tracked concurrently; the proposed algorithm significantly reduces this load. By using the Fast Gaussian Transform, the complexity of the particle selection step is reduced to linear time in n and k, where n is the number of particles and k is the number of particles in the selected subset. We demonstrate our method on both simulated and real data, such as object tracking in video sequences. PMID:26352448
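    The "function extension" idea, evaluating the weight function exactly on a subset and extending it smoothly to all particles, can be mimicked with plain Gaussian-kernel (Nadaraya-Watson) regression over 1-D particle states. This is only an illustration of the concept; the paper's actual scheme selects the subset via matrix decompositions and accelerates the kernel sums with the Fast Gaussian Transform.

```python
import numpy as np

def extend_log_weights(particles, subset_idx, subset_logw, bandwidth=0.5):
    # Approximate log-weights for ALL particles from exact values
    # computed only on a subset: Gaussian-kernel weighted average of the
    # subset values (Nadaraya-Watson regression in particle space).
    d2 = (particles[:, None] - particles[subset_idx][None, :]) ** 2
    K = np.exp(-d2 / (2.0 * bandwidth ** 2))
    return (K @ subset_logw) / K.sum(axis=1)
```

    The dense kernel matrix here costs O(n*k); the Fast Gaussian Transform mentioned in the abstract is what brings the equivalent sums down to linear time for large n.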

  3. Prediction of broadband ground-motion time histories: Hybrid low/high-frequency method with correlated random source parameters

    USGS Publications Warehouse

    Liu, P.; Archuleta, R.J.; Hartzell, S.H.

    2006-01-01

    We present a new method for calculating broadband time histories of ground motion based on a hybrid low-frequency/high-frequency approach with correlated source parameters. Using a finite-difference method we calculate low- frequency synthetics (< ∼1 Hz) in a 3D velocity structure. We also compute broadband synthetics in a 1D velocity model using a frequency-wavenumber method. The low frequencies from the 3D calculation are combined with the high frequencies from the 1D calculation by using matched filtering at a crossover frequency of 1 Hz. The source description, common to both the 1D and 3D synthetics, is based on correlated random distributions for the slip amplitude, rupture velocity, and rise time on the fault. This source description allows for the specification of source parameters independent of any a priori inversion results. In our broadband modeling we include correlation between slip amplitude, rupture velocity, and rise time, as suggested by dynamic fault modeling. The method of using correlated random source parameters is flexible and can be easily modified to adjust to our changing understanding of earthquake ruptures. A realistic attenuation model is common to both the 3D and 1D calculations that form the low- and high-frequency components of the broadband synthetics. The value of Q is a function of the local shear-wave velocity. To produce more accurate high-frequency amplitudes and durations, the 1D synthetics are corrected with a randomized, frequency-dependent radiation pattern. The 1D synthetics are further corrected for local site and nonlinear soil effects by using a 1D nonlinear propagation code and generic velocity structure appropriate for the site’s National Earthquake Hazards Reduction Program (NEHRP) site classification. The entire procedure is validated by comparison with the 1994 Northridge, California, strong ground motion data set. The bias and error found here for response spectral acceleration are similar to the best results
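    The crossover step, low frequencies taken from the 3-D calculation and high frequencies from the 1-D calculation, merged near 1 Hz, can be sketched with complementary spectral weights. This is a generic cosine-taper version of the matched-filter idea, not the authors' exact filters; `fc` and `width` are assumed parameters.

```python
import numpy as np

def combine_hybrid(lf, hf, dt, fc=1.0, width=0.4):
    # Merge a low-frequency synthetic `lf` (e.g. 3-D finite-difference)
    # with a broadband synthetic `hf` (e.g. 1-D frequency-wavenumber) at
    # crossover frequency fc. w(f) tapers from 1 to 0 across a band of
    # relative width `width` around fc; weights are complementary, so
    # the two contributions sum to a flat response.
    n = len(lf)
    f = np.fft.rfftfreq(n, dt)
    x = np.clip((f - fc * (1.0 - width / 2.0)) / (fc * width), 0.0, 1.0)
    w = 0.5 * (1.0 + np.cos(np.pi * x))   # 1 below the band, 0 above it
    return np.fft.irfft(w * np.fft.rfft(lf) + (1.0 - w) * np.fft.rfft(hf), n)
```

    Below the crossover the output follows the 3-D synthetic; above it, the 1-D synthetic, which is exactly the behaviour the hybrid approach relies on.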

  4. Special or Not so Special: Special Education Background Experiences of Principals and Adequate Yearly Progress

    ERIC Educational Resources Information Center

    Wilcox, Jennifer E.

    2011-01-01

    This mixed-methods study researched the special education background experience of principals and the effect on students in the subgroup of Students with Disabilities in making Adequate Yearly Progress (AYP). In the state of Ohio, schools and districts are expected to make AYP as a whole and additionally make AYP for each subgroup (various…

  5. Maintaining adequate hydration and nutrition in adult enteral tube feeding.

    PubMed

    Dunn, Sasha

    2015-01-01

    Predicting the nutritional and fluid requirements of enterally-fed patients can be challenging and the practicalities of ensuring adequate delivery must be taken into consideration. Patients who are enterally fed can be more reliant on clinicians, family members and carers to meet their nutrition and hydration needs and identify any deficiencies, excesses or problems with delivery. Estimating a patient's requirements can be challenging due to the limitations of using predictive equations in the clinical setting. Close monitoring by all those involved in the patient's care, as well as regular review by a dietitian, is therefore required to balance the delivery of adequate feed and fluids to meet each patient's individual needs and prevent the complications of malnutrition and dehydration. Increasing the awareness of the signs of malnutrition and dehydration in patients receiving enteral tube feeding among those involved in a patient's care will help any deficiencies to be detected early on and rectified before complications occur. PMID:26087203

  6. Assessing juvenile sex offenders to determine adequate levels of supervision.

    PubMed

    Gerdes, K E; Gourley, M M; Cash, M C

    1995-08-01

    The present study analyzed the internal consistency of four inventories currently being used by probation officers in the state of Utah to determine adequate and efficacious supervision levels and placement for juvenile sex offenders. The internal consistency or reliability of the inventories ranged from moderate to good. Factor analysis was utilized to significantly increase the reliability of the four inventories by collapsing them into the following three factors: (a) Custodian's and Juvenile's Attitude Toward Intervention; (b) Offense Characteristics; and (c) Historical Risk Factors. These three inventories/factors explained 41.2% of the variance in the combined inventories' scores. Suggestions are made regarding the creation of an additional inventory. "Characteristics of the Victim" to account for more of the variance. In addition, suggestions as to how these inventories can be used by probation officers to make objective and consistent decisions about adequate supervision levels and placement for juvenile sex offenders are discussed. PMID:7583754

  7. Random Estimate the values of seed oil of Cucurbita maxima by refractive index method.

    PubMed

    Saxena, R B

    2010-01-01

    The crude oil, having lower iodine and free fatty acid values, has Aamdosha properties. These properties are present due to toxic and anti-toxic compounds. These compounds can be harmful in particular diseases and may be unsaturated, saturated, open-chain, etc. Adulteration can act as a catalyst for the toxic effect in these diseases. Toxic properties of oils are removed by different ingredients and methods. C. maxima seed tail (mst) is used in food and medicine. The present paper deals with the study of the oil by refractive index and equations. PMID:22131677

  8. The California Tri-pull Taping Method in the Treatment of Shoulder Subluxation After Stroke: A Randomized Clinical Trial

    PubMed Central

    Chatterjee, Subhasish; Hayner, Kate A; Arumugam, Narkeesh; Goyal, Manu; Midha, Divya; Arora, Ashima; Sharma, Sorabh; Kumar, Senthil P

    2016-01-01

    Background: Shoulder subluxation is a frequent occurrence in individuals following a stroke. Although various methods of treatment are available, none of them address all possible consequences of the subluxation: pain, limited range of motion, the subluxation itself, and decreased functional use of the arm. Aims: The purpose of this study was to evaluate the effectiveness of the California tri-pull taping (CTPT) method on shoulder subluxation, pain, active shoulder flexion, and upper limb functional recovery after stroke. Materials and Methods: This was a randomized controlled study of 30 participants. All participants received conventional neurorehabilitation 5 days a week over 6 weeks. Half of the participants also received the CTPT. Pre- and post-assessment scores were taken on all participants for the amount of shoulder subluxation, pain, active shoulder flexion, and functional recovery. Results: The CTPT method demonstrated a significant reduction of pain from baseline in the treatment group, a significant improvement in active shoulder flexion, and a significant improvement in proximal arm function as measured on the proximal subscale of the Fugl-Meyer upper extremity functional scale, but not on the distal or total Fugl-Meyer subscales. The change in shoulder subluxation was not statistically significant. Conclusions: The CTPT method is an effective treatment for the hemiplegic subluxed shoulder. PMID:27213141

  9. Testing Allele Transmission of an SNP Set Using a Family-Based Generalized Genetic Random Field Method.

    PubMed

    Li, Ming; Li, Jingyun; He, Zihuai; Lu, Qing; Witte, John S; Macleod, Stewart L; Hobbs, Charlotte A; Cleves, Mario A

    2016-05-01

    Family-based association studies are commonly used in genetic research because they can be robust to population stratification (PS). Recent advances in high-throughput genotyping technologies have produced a massive amount of genomic data in family-based studies. However, current family-based association tests are mainly focused on evaluating individual variants one at a time. In this article, we introduce a family-based generalized genetic random field (FB-GGRF) method to test the joint association between a set of autosomal SNPs (i.e., single-nucleotide polymorphisms) and disease phenotypes. The proposed method is a natural extension of a recently developed GGRF method for population-based case-control studies. It models offspring genotypes conditional on parental genotypes, and, thus, is robust to PS. Through simulations, we show that, under various disease scenarios, the FB-GGRF has improved power over a commonly used family-based sequence kernel association test (FB-SKAT). Further, similar to GGRF, the proposed FB-GGRF method is asymptotically well-behaved, and does not require empirical adjustment of the type I error rates. We illustrate the proposed method using a study of congenital heart defects with family trios from the National Birth Defects Prevention Study (NBDPS). PMID:27061818

  10. Diagnostic and treatment methods used by chiropractors: A random sample survey of Canada’s English-speaking provinces

    PubMed Central

    Puhl, Aaron A.; Reinhart, Christine J; Injeyan, H. Stephen

    2015-01-01

    Objective: It is important to understand how chiropractors practice beyond their formal education. The objective of this analysis was to assess the diagnostic and treatment methods used by chiropractors in English-speaking Canadian provinces. Methods: A questionnaire was created that examined practice patterns amongst chiropractors. This was sent by mail to 749 chiropractors, randomly selected and stratified proportionally across the nine English-speaking Canadian provinces. Participation was voluntary and anonymous. Data were entered into an Excel spreadsheet, and descriptive statistics were calculated. Results: The response rate was 68.0%. Almost all (95.1%) of respondents reported performing differential diagnosis procedures with their new patients; most commonly orthopaedic testing, palpation, history taking, range of motion testing and neurological examination. Palpation and painful joint findings were the most commonly used methods to determine the appropriate joint to apply manipulation. The most common treatment methods were manual joint manipulation/mobilization, stretching and exercise, posture/ergonomic advice and soft-tissue therapies. Conclusions: Differential diagnosis is a standard part of the assessment of new chiropractic patients in English-speaking Canadian provinces and the most common methods used to determine the site to apply manipulation are consistent with current scientific literature. Patients are treated with a combination of manual and/or manipulative interventions directed towards the joints and/or soft-tissues, as well as exercise instruction and postural/ergonomic advice. PMID:26500362

  11. Preventing cognitive decline in older African Americans with mild cognitive impairment: design and methods of a randomized clinical trial.

    PubMed

    Rovner, Barry W; Casten, Robin J; Hegel, Mark T; Leiby, Benjamin E

    2012-07-01

    Mild Cognitive Impairment (MCI) affects 25% of older African Americans and predicts progression to Alzheimer's disease. An extensive epidemiologic literature suggests that cognitive, physical, and/or social activities may prevent cognitive decline. We describe the methods of a randomized clinical trial to test the efficacy of Behavior Activation to prevent cognitive decline in older African Americans with the amnestic multiple domain subtype of MCI. Community Health Workers deliver 6 initial in-home treatment sessions over 2-3 months and then 6 subsequent in-home booster sessions using language, materials, and concepts that are culturally relevant to older African Americans during this 24 month clinical trial. We are randomizing 200 subjects who are recruited from churches, senior centers, and medical clinics to Behavior Activation or Supportive Therapy, which controls for attention. The primary outcome is episodic memory as measured by the Hopkins Verbal Learning Test-Revised at baseline and at months 3, 12, 18, and 24. The secondary outcomes are general and domain-specific neuropsychological function, activities of daily living, depression, and quality-of-life. The negative results of recent clinical trials of drug treatments for MCI and Alzheimer's disease suggest that behavioral interventions may provide an alternative treatment approach to preserve cognition in an aging society. PMID:22406101

  12. Using Sexually Transmitted Infection Biomarkers to Validate Reporting of Sexual Behavior within a Randomized, Experimental Evaluation of Interviewing Methods

    PubMed Central

    Mensch, Barbara S.; de A. Ribeiro, Manoel Carlos S.; Jones, Heidi E.; Lippman, Sheri A.; Montgomery, Mark R.; van de Wijgert, Janneke H. H. M.

    2008-01-01

    This paper examines the reporting of sexual and other risk behaviors within a randomized experiment using a computerized versus face-to-face interview mode. Biomarkers for sexually transmitted infection (STI) were used to validate self-reported behavior by interview mode. As part of a parent study evaluating home versus clinic screening and diagnosis for STIs, 818 women aged 18-40 years were recruited in 2004 at or near a primary care clinic in São Paulo, Brazil, and were randomized to a face-to-face interview or audio computer-assisted self-interviewing. Ninety-six percent of participants were tested for chlamydia, gonorrhea, and trichomoniasis. Reporting of STI risk behavior was consistently higher with the computerized mode of interview. Stronger associations between risk behaviors and STI were found with the computerized interview after controlling for sociodemographic factors. These results were obtained by using logistic regression approaches, as well as statistical methods that address potential residual confounding and covariate endogeneity. Furthermore, STI-positive participants were more likely than STI-negative participants to underreport risk behavior in the face-to-face interview. Results strongly suggest that computerized interviewing provides more accurate and reliable behavioral data. The analyses also confirm the benefits of using data on prevalent STIs for externally validating behavioral reporting. PMID:18525081

  13. Dynamic analysis method of offshore jack-up platforms in regular and random waves

    NASA Astrophysics Data System (ADS)

    Yu, Hao; Li, Xiaoyu; Yang, Shuguang

    2012-03-01

    A jack-up platform, with its particular structure, exhibits pronounced dynamic behavior under the complex environmental loads of extreme conditions. In this paper, taking a simplified 3-D finite element dynamic model under extreme storm conditions as the research object, a transient dynamic analysis method was proposed for both regular and irregular wave loads. The steps of dynamic analysis under extreme conditions were illustrated with an applied case, and the dynamic amplification factor (DAF) was calculated for each response parameter: base shear, overturning moment, and hull sway. Finally, the dynamic and static structural response results were compared and analyzed. The results indicated that static strength analysis of jack-up platforms is not sufficient under dynamic loads, including wave and current loads; further dynamic response analysis that considers both computational efficiency and accuracy is necessary.
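
    The DAF mentioned above is simply the ratio of the peak dynamic response to the peak static response of a given parameter. A minimal sketch of this calculation (the function name and the numbers are illustrative, not taken from the paper):

```python
def dynamic_amplification_factor(dynamic_response, static_response):
    """DAF = max |dynamic response| / max |static response|.
    Applied per response parameter (base shear, overturning moment, hull sway)."""
    return max(abs(r) for r in dynamic_response) / max(abs(r) for r in static_response)

# Illustrative base-shear time histories (arbitrary units)
static_shear = [10.0, 12.5, 11.0, 9.5]
dynamic_shear = [11.0, 15.0, 13.2, 10.1]
print(dynamic_amplification_factor(dynamic_shear, static_shear))  # 15.0 / 12.5 = 1.2
```

    A DAF above 1 indicates that a static analysis alone underestimates the load effect, which is the paper's argument for a full dynamic response analysis.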

  14. Comparing of goal setting strategy with group education method to increase physical activity level: A randomized trial

    PubMed Central

    Jiryaee, Nasrin; Siadat, Zahra Dana; Zamani, Ahmadreza; Taleban, Roya

    2015-01-01

    Background: An intervention to increase physical activity should be designed around the resources of the health care setting and be acceptable to the subject group. This study was designed to assess and compare the effect of a goal-setting strategy with a group education method on increasing the physical activity of mothers of children aged 1 to 5. Materials and Methods: Mothers who had at least one child of 1-5 years were randomized into two groups. The effects of 1) a goal-setting strategy and 2) a group education method on increasing physical activity were assessed and compared 1 month and 3 months after the intervention. Also, the weight, height, body mass index (BMI), waist and hip circumference, and well-being were compared between the two groups before and after the intervention. Results: Physical activity level increased significantly after the intervention in the goal-setting group, and it was significantly different between the two groups after the intervention (P < 0.05). BMI, waist circumference, hip circumference, and well-being score were significantly different in the goal-setting group after the intervention. In the group education method, only the well-being score improved significantly (P < 0.05). Conclusion: Our study demonstrated the effects of using the goal-setting strategy to boost physical activity, improve the state of well-being, and decrease BMI and waist and hip circumference. PMID:26929765

  15. Synthesis of carbon-supported PtRh random alloy nanoparticles using electron beam irradiation reduction method

    NASA Astrophysics Data System (ADS)

    Matsuura, Yoshiyuki; Seino, Satoshi; Okazaki, Tomohisa; Akita, Tomoki; Nakagawa, Takashi; Yamamoto, Takao A.

    2016-05-01

    Bimetallic nanoparticle catalysts of PtRh supported on carbon were synthesized using an electron beam irradiation reduction method. The PtRh nanoparticle catalysts were composed of particles 2-3 nm in size, which were well dispersed on the surface of the carbon support nanoparticles. Analyses of X-ray diffraction and scanning transmission electron microscopy-energy-dispersive X-ray spectroscopy revealed that the PtRh nanoparticles have a randomly alloyed structure. The lattice constant of the PtRh nanoparticles showed good correlation with Vegard's law. These results are explained by the radiochemical formation process of the PtRh nanoparticles. Catalytic activities of PtRh/C nanoparticles for ethanol oxidation reaction were found to be higher than those obtained with Pt/C.
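
    Vegard's law, referenced above, approximates an alloy's lattice constant as a linear, composition-weighted average of the pure-metal lattice constants. A minimal sketch, using approximate literature values for fcc Pt and Rh (the composition value is hypothetical):

```python
A_PT = 3.924  # approximate lattice constant of fcc Pt, in angstroms
A_RH = 3.803  # approximate lattice constant of fcc Rh, in angstroms

def vegard_lattice_constant(x_pt):
    """Expected lattice constant of a Pt(x)Rh(1-x) random alloy under Vegard's law."""
    return x_pt * A_PT + (1.0 - x_pt) * A_RH

print(round(vegard_lattice_constant(0.5), 4))  # equiatomic PtRh -> 3.8635
```

    Agreement between the measured lattice constant and this linear interpolation is the usual X-ray diffraction evidence that the two metals form a random alloy rather than segregated phases.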

  16. A randomized controlled trial of venlafaxine XR for major depressive disorder after spinal cord injury: Methods and lessons learned

    PubMed Central

    Bombardier, Charles H.; Fann, Jesse R.; Wilson, Catherine S.; Heinemann, Allen W.; Richards, J. Scott; Warren, Ann Marie; Brooks, Larry; Warms, Catherine A.; Temkin, Nancy R.; Tate, Denise G.

    2014-01-01

    Context/objective We describe the rationale, design, methods, and lessons learned conducting a treatment trial for major depressive disorder (MDD) or dysthymia in people with spinal cord injury (SCI). Design A multi-site, double-blind, randomized (1:1) placebo controlled trial of venlafaxine XR for MDD or dysthymia. Subjects were block randomized and stratified by site, lifetime history of substance dependence, and prior history of MDD. Setting Six SCI centers throughout the United States. Participants Across participating centers, 2536 subjects were screened and 133 were enrolled into the trial. Subjects were 18–64 years old and at least 1 month post-SCI. Interventions Twelve-week trial of venlafaxine XR versus placebo using a flexible titration schedule. Outcome measures The primary outcome was improvement in depression severity at 12 weeks. The secondary outcome was improvement in pain. Results This article includes study methods, modifications prompted by a formative review process, preliminary data on the study sample and lessons learned. We describe common methodological and operational challenges conducting multi-site trials and how we addressed them. Challenges included study organization and decision making, staff training, obtaining human subjects approval, standardization of measurement and treatment, data and safety monitoring, subject screening and recruitment, unblinding and continuity of care, database management, and data analysis. Conclusions The methodological and operational challenges we faced and the lessons we learned may provide useful information for researchers who aim to conduct clinical trials, especially in the area of medical treatment of depression in people with SCI. PMID:24090228

  17. Multicomponent Interdisciplinary Group Intervention for Self-Management of Fibromyalgia: A Mixed-Methods Randomized Controlled Trial

    PubMed Central

    Bourgault, Patricia; Lacasse, Anaïs; Marchand, Serge; Courtemanche-Harel, Roxanne; Charest, Jacques; Gaumond, Isabelle; Barcellos de Souza, Juliana; Choinière, Manon

    2015-01-01

    Background This study evaluated the efficacy of the PASSAGE Program, a structured multicomponent interdisciplinary group intervention for the self-management of fibromyalgia syndrome (FMS). Methods A mixed-methods randomized controlled trial (intervention (INT) vs. waitlist (WL)) was conducted with patients suffering from FMS. Data were collected at baseline (T0), at the end of the intervention (T1), and 3 months later (T2). The primary outcome was change in pain intensity (0-10). Secondary outcomes were fibromyalgia severity, pain interference, sleep quality, pain coping strategies, depression, health-related quality of life, patient global impression of change (PGIC), and perceived pain relief. Qualitative group interviews with a subset of patients were also conducted. Complete data from T0 to T2 were available for 43 patients. Results The intervention had a statistically significant impact on the three PGIC measures. At the end of the PASSAGE Program, the percentages of patients who perceived overall improvement in their pain levels, functioning, and quality of life were significantly higher in the INT Group (73%, 55%, and 77%, respectively) than in the WL Group (8%, 12%, and 20%). The same differences were observed 3 months post-intervention (INT Group: 62%, 43%, 38% vs. WL Group: 13%, 13%, 9%). The proportion of patients who reported ≥50% pain relief was also significantly higher in the INT Group at the end of the intervention (36% vs. 12%) and 3 months post-intervention (33% vs. 4%). Results of the qualitative analysis were in line with the quantitative findings regarding the efficacy of the intervention. The improvement, however, was not reflected in the primary outcome or the other secondary outcome measures. Conclusion The PASSAGE Program was effective in helping FMS patients gain a sense of control over their symptoms. We suggest including PGIC in future clinical trials on FMS, as it appears to capture important aspects of the patients’ experience. Trial registration

  18. Wellness Coaching for People With Prediabetes: A Randomized Encouragement Trial to Evaluate Outreach Methods at Kaiser Permanente, Northern California, 2013

    PubMed Central

    Xiao, Hong; Adams, Sara R.; Goler, Nancy; Sanna, Rashel S.; Boccio, Mindy; Bellamy, David J.; Brown, Susan D.; Neugebauer, Romain S.; Ferrara, Assiamira

    2015-01-01

    Introduction Health coaching can improve lifestyle behaviors known to prevent or manage chronic conditions. Little is known about effective ways to encourage health and wellness coaching among people who might benefit. The purpose of this randomized encouragement trial was to assess the relative success of 3 outreach methods (secured email message, telephone message, and mailed letter) on the use of wellness coaching by people with prediabetes. Methods A total of 14,584 Kaiser Permanente Northern California (KPNC) patients with diagnosed prediabetes (fasting plasma glucose, 110–125 mg/dL) were randomly assigned to be contacted via 1 of 4 intervention arms from January through May 2013. The uptake rate (making an appointment at the Wellness Coaching Center [WCC]) was assessed, and the association between uptake rate and patient characteristics was examined via multivariable logistic regression. Results The overall uptake rate across intervention arms was 1.9%. Secured email message had the highest uptake rate (3.0%), followed by letters and telephone messages (P < .05 for all pairwise comparisons). No participants in the usual-care arm (ie, no outreach) made an appointment with the WCC. For each year of increased age, the estimated odds of uptake increased by a factor of 1.02 (odds ratio [OR] = 1.02; 95% CI, 1.01–1.04). Women were nearly twice as likely to make an appointment at the WCC as men (OR = 1.87; 95% CI, 1.40–2.51). Conclusion Our results suggest that these outreach methods can recruit and encourage KPNC members with prediabetes to participate in the WCC. Future research should focus on increasing participation rates in health coaching among patients who may benefit. PMID:26605707

  19. Adequation of mini satellites to oceanic altimetry missions

    NASA Astrophysics Data System (ADS)

    Bellaieche, G.; Aguttes, J. P.

    1993-01-01

    The suitability of the mini satellite concept for oceanic altimetry missions is discussed. The mission definition and the most constraining requirements (mesoscale observation, for example) demonstrate that mini satellites are quite adequate for such missions. Progress in altimeter characteristics, orbit determination, and position reporting allows consideration of oceanic altimetry missions using low Earth orbit satellites. The satellite constellation, trace keeping and orbital period, and required payload characteristics are presented. The mission requirements covering Sun-synchronous orbit, service area, ground system, and launcher characteristics, as well as the constellation maintenance strategy, are specified. Two options for the satellite, orbital mechanics, propulsion, onboard power and stabilizing subsystems, onboard management, satellite-ground links, mechanical and thermal subsystems, budgets, and planning are discussed.

  20. Potential of Three-Way Randomly Amplified Polymorphic DNA Analysis as a Typing Method for Twelve Salmonella Serotypes

    PubMed Central

    Soto, S. M.; Guerra, B.; González-Hevia, M. A.; Mendoza, M. C.

    1999-01-01

    The potential of a three-way randomly amplified polymorphic DNA (RAPD) procedure (RAPD typing) for typing Salmonella enterica strains assigned to 12 serotypes was analyzed. The series of organisms used included 235 strains (326 isolates) collected mainly from clinical samples in the Principality of Asturias and 9 reference strains. RAPD typing was performed directly with broth cultures of bacteria by using three selected primers and optimized PCR conditions. The profiles obtained with the three primers were used to define RAPD types and to evaluate the procedure as a typing method at the species and serotype levels. The typeability was 100%; the reproducibility and in vitro stability could be considered good. The concordance of RAPD typing with serotyping was 100%, but some profiles obtained with two of the three primers were shared by strains assigned to different serotypes. The discrimination index (DI) within the series of organisms was 0.94, and the DIs within serotypes Typhimurium, Enteritidis, and Virchow were 0.72, 0.52, and 0.66, respectively. Within these serotypes the most common RAPD types were differentiated into phage types and vice versa; combining the types identified by the two procedures (RAPD typing and phage typing) resulted in further discrimination (DIs of 0.96, 0.74, and 0.87, respectively). The efficiency, rapidity, and flexibility of the RAPD typing method support the conclusion that it can be used as a tool for identifying Salmonella organisms and as a typing method that is complementary to serotyping and phage typing methods. PMID:10543793
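
    The discrimination index (DI) reported in typing studies such as this is conventionally the Hunter-Gaston discriminatory index, an application of Simpson's index of diversity to typing results. A sketch with hypothetical type counts (not the study's data):

```python
def discrimination_index(type_counts):
    """Hunter-Gaston discriminatory index:
    D = 1 - (1 / (N * (N - 1))) * sum over types of n_j * (n_j - 1),
    where N is the total number of strains and n_j is the number of strains
    assigned to the j-th type. D is the probability that two randomly
    chosen strains fall into different types."""
    n = sum(type_counts)
    return 1.0 - sum(c * (c - 1) for c in type_counts) / (n * (n - 1))

# Hypothetical example: 10 strains resolved into types of sizes 4, 3, 2, 1
print(round(discrimination_index([4, 3, 2, 1]), 4))  # 0.7778
```

    A DI of 1.0 means every strain gets its own type; combining two typing schemes, as done above for RAPD and phage typing, subdivides the types and can only raise the index.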

  1. Quantifying dose to the reconstructed breast: Can we adequately treat?

    SciTech Connect

    Chung, Eugene; Marsh, Robin B.; Griffith, Kent A.; Moran, Jean M.; Pierce, Lori J.

    2013-04-01

    To evaluate how immediate reconstruction (IR) impacts postmastectomy radiotherapy (PMRT) dose distributions to the reconstructed breast (RB), internal mammary nodes (IMN), heart, and lungs using quantifiable dosimetric end points. 3D conformal plans were developed for 20 IR patients: 10 autologous reconstructions (AR) and 10 expander-implant (EI) reconstructions. For each reconstruction type, 5 right- and 5 left-sided reconstructions were selected. Two plans were created for each patient, 1 with RB coverage alone and 1 with RB + IMN coverage. Left-sided EI plans without IMN coverage had a higher heart Dmean than left-sided AR plans (2.97 vs. 0.84 Gy, p = 0.03). Otherwise, results did not vary by reconstruction type, and all remaining metrics were evaluated using a combined AR and EI dataset. RB coverage was adequate regardless of laterality or IMN coverage (Dmean 50.61 Gy, D95 45.76 Gy). When included, IMN Dmean and D95 were 49.57 and 40.96 Gy, respectively. Mean heart doses increased with left-sided treatment plans and IMN inclusion. Right-sided treatment plans and IMN inclusion increased mean lung V20. Using standard field arrangements and 3D planning, we observed excellent coverage of the RB and IMN, regardless of laterality or reconstruction type. Our results demonstrate that adequate doses can be delivered to the RB with or without IMN coverage.
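
    The dosimetric end points quoted above can be computed from a structure's voxel dose list: Dmean is the average dose, D95 is the dose received by at least 95% of the volume (the 5th percentile of the dose distribution), and V20 is the fractional volume receiving at least 20 Gy. A hedged sketch using one simple nearest-rank percentile convention (planning systems differ in the exact convention; the dose values are illustrative):

```python
import math

def d_mean(doses):
    """Mean dose over the structure's voxels (Gy)."""
    return sum(doses) / len(doses)

def d95(doses):
    """Dose received by at least 95% of the volume: the 5th percentile
    of the voxel doses, using a nearest-rank convention."""
    s = sorted(doses)
    idx = max(0, math.ceil(0.05 * len(s)) - 1)
    return s[idx]

def v20(doses):
    """Fraction of the volume receiving at least 20 Gy."""
    return sum(1 for d in doses if d >= 20.0) / len(doses)

# Hypothetical voxel doses: 100 voxels at 1..100 Gy (illustrative only)
doses = [float(d) for d in range(1, 101)]
print(d_mean(doses), d95(doses), v20(doses))  # 50.5 5.0 0.81
```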

  2. Adequate drainage system design for heap leaching structures.

    PubMed

    Majdi, Abbas; Amini, Mehdi; Nasab, Saeed Karimi

    2007-08-17

    The paper describes an optimum design of a drainage system for a heap leaching structure, which has positive impacts on both the mine environment and mine economics. To properly design a drainage system, the causes of an increase in the acid level of the heap, which in turn produces severe problems in the hydrometallurgical processes, must be evaluated. One of the most significant negative impacts induced by an increase in the acid level within a heap structure is the increase of pore acid pressure, which in turn increases the potential of a heap slide that may endanger the mine environment. In this paper, the thickness of the gravelly drainage layer is first determined via existing empirical equations. Then, assuming that the calculated thickness is constant throughout the heap structure, an approach is proposed to calculate the required internal diameter of the slotted polyethylene pipes used for auxiliary drainage purposes. To adequately design this diameter, the pipe's cross-sectional deformation due to the overburden pressure of the stepped heap structure is taken into account. Finally, a design of an adequate drainage system for heap structure 2 at the Sarcheshmeh copper mine is presented, and the results are compared with those calculated by existing equations. PMID:17321044

  3. Combining information on multiple instrumental variables in Mendelian randomization: comparison of allele score and summarized data methods.

    PubMed

    Burgess, Stephen; Dudbridge, Frank; Thompson, Simon G

    2016-05-20

    Mendelian randomization is the use of genetic instrumental variables to obtain causal inferences from observational data. Two recent developments for combining information on multiple uncorrelated instrumental variables (IVs) into a single causal estimate are as follows: (i) allele scores, in which individual-level data on the IVs are aggregated into a univariate score, which is used as a single IV, and (ii) a summary statistic method, in which causal estimates calculated from each IV using summarized data are combined in an inverse-variance weighted meta-analysis. To avoid bias from weak instruments, unweighted and externally weighted allele scores have been recommended. Here, we propose equivalent approaches using summarized data and also provide extensions of the methods for use with correlated IVs. We investigate the impact of different choices of weights on the bias and precision of estimates in simulation studies. We show that allele score estimates can be reproduced using summarized data on genetic associations with the risk factor and the outcome. Estimates from the summary statistic method using external weights are biased towards the null when the weights are imprecisely estimated; in contrast, allele score estimates are unbiased. With equal or external weights, both methods provide appropriate tests of the null hypothesis of no causal effect even with large numbers of potentially weak instruments. We illustrate these methods using summarized data on the causal effect of low-density lipoprotein cholesterol on coronary heart disease risk. It is shown that a more precise causal estimate can be obtained using multiple genetic variants from a single gene region, even if the variants are correlated. © 2015 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:26661904
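
    The summary statistic method described above combines per-variant Wald ratio estimates in an inverse-variance weighted (IVW) fixed-effect meta-analysis. A minimal sketch for uncorrelated variants with first-order standard errors; all numbers are illustrative, not from the paper:

```python
import math

def ivw_estimate(beta_x, beta_y, se_y):
    """Inverse-variance weighted causal estimate from summarized data.
    For variant j, the Wald ratio is beta_y[j] / beta_x[j] with approximate
    (first-order) standard error se_y[j] / |beta_x[j]|; the ratios are then
    combined in a fixed-effect, inverse-variance weighted meta-analysis."""
    ratios = [by / bx for bx, by in zip(beta_x, beta_y)]
    weights = [(bx / se) ** 2 for bx, se in zip(beta_x, se_y)]
    estimate = sum(w * r for w, r in zip(weights, ratios)) / sum(weights)
    std_error = math.sqrt(1.0 / sum(weights))
    return estimate, std_error

# Illustrative variant-risk factor (bx) and variant-outcome (by) associations,
# with standard errors of by, for three uncorrelated variants
bx = [0.10, 0.20, 0.15]
by = [0.05, 0.09, 0.08]
sy = [0.02, 0.03, 0.025]
est, se = ivw_estimate(bx, by, sy)
```

    Extending this to correlated variants, as the paper does, requires weighting by the inverse of the full variance-covariance matrix of the ratio estimates rather than by scalar precisions.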

  4. Prevention of gestational diabetes through lifestyle intervention: study design and methods of a Finnish randomized controlled multicenter trial (RADIEL)

    PubMed Central

    2014-01-01

    Background Maternal overweight, obesity and consequently the incidence of gestational diabetes are increasing rapidly worldwide. The objective of the study was to assess the efficacy and cost-effectiveness of a combined diet and physical activity intervention implemented before, during and after pregnancy in a primary health care setting for preventing gestational diabetes, later type 2 diabetes and other metabolic consequences. Methods RADIEL is a randomized controlled multi-center intervention trial in women at high risk for diabetes (a previous history of gestational diabetes or prepregnancy BMI ≥30 kg/m2). Participants planning pregnancy or in the first half of pregnancy were parallel-group randomized into an intervention arm which received lifestyle counseling and a control arm which received usual care given at their local antenatal clinics. All participants visited a study nurse every three months before and during pregnancy, and at 6 weeks, 6 and 12 months postpartum. Measurements and laboratory tests were performed on all participants with special focus on dietary and exercise habits and metabolic markers. Of the 728 women [mean age 32.5 years (SD 4.7); median parity 1 (range 0-9)] considered to be eligible for the study 235 were non-pregnant and 493 pregnant [mean gestational age 13 (range 6 to 18) weeks] at the time of enrollment. The proportion of nulliparous women was 29.8% (n = 217). Out of all participants, 79.6% of the non-pregnant and 40.4% of the pregnant women had previous gestational diabetes and 20.4% of the non-pregnant and 59.6% of the pregnant women were recruited because of a prepregnancy BMI ≥30 kg/m2. Mean BMI at first visit was 30.1 kg/m2 (SD 6.2) in the non-pregnant and 32.7 kg/m2 (SD 5.6) in the pregnant group. Discussion To our knowledge, this is the first randomized lifestyle intervention trial, which includes, besides the pregnancy period, both the prepregnancy and the postpartum period. This study design also

  5. A comparative in-vivo evaluation of the alignment efficiency of 5 ligation methods: A prospective randomized clinical trial

    PubMed Central

    Reddy, Vijaya Bhaskara; Kumar, Talapaneni Ashok; Prasad, Mandava; Nuvvula, Sivakumar; Patil, Rajedra Goud; Reddy, Praveen Kumar

    2014-01-01

    Objectives: To conduct a prospective randomized study comparing the efficiency of 5 different ligation systems (ELL: elastomeric ligature; SSL: stainless steel ligature; LL: Leone slide ligature; PSL: passive self-ligation; and ASL: active self-ligation) over the duration of mandibular crowding alleviation. Materials and Methods: Fifty consecutive patients (54.2% male, 45.8% female; mean age: 16.69 years) satisfying the inclusion criteria were randomly allocated to 5 ligation groups with an equal sample size of 10 per group. The 5 groups received treatment with the 0.022-inch MBT pre-adjusted edgewise technique (ELL: Gemini 3M Unitek; SSL: Gemini 3M Unitek; LL: Gemini 3M Unitek; PSL: SmartClip 3M Unitek; and ASL: In-Ovation R Euro GAC International). The models and cephalograms were evaluated for anterior arch alignment, extraction space closure, and lower incisor inclinations at pre-treatment (T1) and at the end of initial alignment (T2). Analysis of variance (ANOVA) and post-hoc tests were used for data analysis. Results: Forty-eight participants completed the study. The self-ligation (SL) systems showed a significant advantage over the conventional ligation (CL) groups in time to alignment, passive space closure, and incisor inclination. Multiple regression showed a reduction of 5.28 days in time to alignment for each change of ligation group in the order ELL to ASL, while a 1 mm increase in initial irregularity index increased time to alignment by 11.68 days. Conclusion: Self-ligation brackets were more efficient than conventional ligation brackets during initial leveling and alignment. PMID:24966742

  6. A feasible method to improve adherence of Hawley retainer in adolescent orthodontic patients: a randomized controlled trial

    PubMed Central

    Lin, Feiou; Sun, Hao; Ni, Zhenyu; Zheng, Minling; Yao, Linjie

    2015-01-01

    Background Retention is an important component of orthodontic treatment; however, poor compliance with retainer use is often encountered, especially in adolescents. The purpose of this study was to test the hypothesis that verbal instructions combined with images showing the severe consequences of poor compliance can increase retainer use. Methods This study was a randomized controlled trial. The sample was recruited from Wenzhou, People’s Republic of China, between February 2013 and May 2014, and 326 participants were randomized into three groups. Patients and parents in Group A (n=106) were given routine retainer wear instructions only; in Group B (n=111), images illustrating the severe consequences of poor compliance with Hawley retainer use were shown to patients, combined with routine instructions; and in Group C (n=109), images illustrating the severe consequences of poor compliance with Hawley retainer use were shown to patients and parents, combined with routine instructions. Three months after debonding, questionnaires were used to investigate daily wear time and the reasons for poor compliance. Differences in means between the groups were tested by one-way analysis of variance. Results The mean daily wear time in Group C (15.09±4.13 hours) was significantly greater than in Group A (12.37±4.58 hours, P<0.01) or Group B (13.50±4.22 hours, P<0.05); the mean daily wear time in Group B was greater than in Group A, but the difference was not significant (P=0.67). Reasons for nonusage were forgetting to wear the retainer (51%) and finding the retainer bothersome to frequently insert and remove (42%). Conclusion Verbal instructions combined with images showing the severe consequences of poor compliance can increase retainer use. Parents play an important role in compliance with retainer use in adolescent patients. PMID:26604705

  7. Fundamental Vibration Frequency and Damping Estimation: A Comparison Using the Random Decrement Method, the Empirical Mode Decomposition, and the HV Spectral Ratio Method for Local Site Characterization

    NASA Astrophysics Data System (ADS)

    Huerta-Lopez, C. I.; Upegui Botero, F. M.; Pulliam, J.; Willemann, R. J.; Pasyanos, M.; Schmitz, M.; Rojas Mercedes, N.; Louie, J. N.; Moschetti, M. P.; Martinez-Cruzado, J. A.; Suárez, L.; Huerfano Moreno, V.; Polanco, E.

    2013-12-01

    Site characterization in civil engineering requires knowledge of at least two dynamic properties of soil systems: (i) the dominant vibration frequency and (ii) the damping. As part of an effort to develop understanding of the principles of earthquake hazard analysis, particularly site characterization techniques using non-invasive/non-destructive seismic methods, a workshop (Pan-American Advanced Studies Institute: New Frontiers in Geophysical Research: Bringing New Tools and Techniques to Bear on Earthquake Hazard Analysis and Mitigation) was conducted during July 15-25, 2013, in Santo Domingo, Dominican Republic, by the alliance of the Pan-American Advanced Studies Institute (PASI) and the Incorporated Research Institutions for Seismology (IRIS), jointly supported by the Department of Energy (DOE) and the National Science Foundation (NSF). Preliminary results of the site characterization in terms of fundamental vibration frequency and damping are presented here from data collected during the workshop. Three different methods were used for these estimations and then compared in order to assess the stability of the estimates as well as the advantages and disadvantages of each methodology. The methods used were: (i) the Random Decrement Method (RDM), to estimate the fundamental vibration frequency and damping simultaneously; (ii) the Empirical Mode Decomposition (EMD), to estimate the vibration modes; and (iii) the Horizontal-to-Vertical Spectral Ratio (HVSR), to estimate the fundamental vibration frequency. In all cases, both ambient and induced vibration records were used.
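
    Of the three methods, the Random Decrement Method lends itself to a compact illustration: segments of the vibration record that begin at a common trigger condition are ensemble-averaged, the random excitation averages out, and the result approximates the free-decay response, from which damping follows by the logarithmic decrement. A minimal sketch (the trigger convention and helper names are assumptions for illustration, not from the workshop):

```python
import math

def random_decrement_signature(x, trigger_level, seg_len):
    """Average all segments of x that begin where x crosses trigger_level
    from below; the ensemble average approximates the free-decay response."""
    segments = [x[i:i + seg_len]
                for i in range(1, len(x) - seg_len)
                if x[i - 1] < trigger_level <= x[i]]
    count = len(segments)
    return [sum(seg[k] for seg in segments) / count for k in range(seg_len)]

def log_decrement_damping(peak1, peak2):
    """Damping ratio from two successive positive peaks of the signature:
    delta = ln(peak1 / peak2), zeta = delta / sqrt(4*pi^2 + delta^2)."""
    delta = math.log(peak1 / peak2)
    return delta / math.sqrt(4 * math.pi ** 2 + delta ** 2)
```

    The fundamental frequency can then be read from the zero-crossing interval of the signature, while the HVSR estimate comes independently from the ratio of horizontal to vertical Fourier amplitude spectra.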

  8. Randomized Comparison of Two Vaginal Self-Sampling Methods for Human Papillomavirus Detection: Dry Swab versus FTA Cartridge

    PubMed Central

    Catarino, Rosa; Vassilakos, Pierre; Bilancioni, Aline; Vanden Eynde, Mathieu; Meyer-Hamme, Ulrike; Menoud, Pierre-Alain; Guerry, Frédéric; Petignat, Patrick

    2015-01-01

    Background Human papillomavirus (HPV) self-sampling (self-HPV) is valuable in cervical cancer screening. HPV testing is usually performed on physician-collected cervical smears stored in liquid-based medium. Dry filters and swabs are an alternative. We evaluated the adequacy of self-HPV using two dry storage and transport devices, the FTA cartridge and swab. Methods A total of 130 women performed two consecutive self-HPV samples. Randomization determined which of the two tests was performed first: self-HPV using dry swabs (s-DRY) or vaginal specimen collection using a cytobrush applied to an FTA cartridge (s-FTA). After self-HPV, a physician collected a cervical sample using liquid-based medium (Dr-WET). HPV types were identified by real-time PCR. Agreement between collection methods was measured using the kappa statistic. Results HPV prevalence for high-risk types was 62.3% (95%CI: 53.7–70.2) detected by s-DRY, 56.2% (95%CI: 47.6–64.4) by Dr-WET, and 54.6% (95%CI: 46.1–62.9) by s-FTA. There was overall agreement of 70.8% between s-FTA and s-DRY samples (kappa = 0.34), and of 82.3% between self-HPV and Dr-WET samples (kappa = 0.56). Detection sensitivities for low-grade squamous intraepithelial lesion or worse (LSIL+) were: 64.0% (95%CI: 44.5–79.8) for s-FTA, 84.6% (95%CI: 66.5–93.9) for s-DRY, and 76.9% (95%CI: 58.0–89.0) for Dr-WET. The preferred self-collection method among patients was s-DRY (40.8% vs. 15.4%). Regarding costs, the FTA card was five times more expensive than the swab (~5 US dollars (USD) per card vs. ~1 USD per swab). Conclusion Self-HPV using dry swabs is sensitive for detecting LSIL+ and less expensive than s-FTA. Trial Registration International Standard Randomized Controlled Trial Number (ISRCTN): 43310942 PMID:26630353
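
    The kappa statistic used above corrects raw percent agreement for the agreement expected by chance given each method's positivity rate. A sketch for a 2x2 agreement table (the counts are illustrative, not the study's):

```python
def cohens_kappa(a, b, c, d):
    """Cohen's kappa for a 2x2 agreement table between two methods:
        a = both positive, b = only method 1 positive,
        c = only method 2 positive, d = both negative.
    kappa = (p_o - p_e) / (1 - p_e), with p_o the observed agreement and
    p_e the agreement expected by chance from the marginal rates."""
    n = a + b + c + d
    p_o = (a + d) / n
    p_e = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical counts: 40 double positives, 10 + 10 discordant, 40 double negatives
print(round(cohens_kappa(40, 10, 10, 40), 2))  # 0.6
```

    By the usual rule of thumb, the observed kappa of 0.34 between the two dry methods indicates only fair agreement, while 0.56 between self-HPV and Dr-WET indicates moderate agreement.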

  9. Identification of measles virus epitopes using an ultra-fast method of panning phage-displayed random peptide libraries

    PubMed Central

    Yu, Xiaoli; Barmina, Olga; Burgoon, Mark; Gilden, Don

    2010-01-01

    Phage-displayed random peptide libraries, in which high affinity phage peptides are enriched by repetitive selection (panning) on target antibody, provide a unique tool for identifying antigen specificity. This paper describes a new panning method that enables selection of peptides in 1 day as compared to about 6 days required in traditional panning to identify virus-specific epitopes. The method, termed ultra-fast selection of peptide (UFSP), utilizes phage produced by bacterial infection (phage amplification) directly for subsequent panning. Phage amplified in less than 1 h of infection in Escherichia coli are used for binding to target antibody pre-coated in the same wells of an ELISA plate, obviating the need for traditional large-scale amplification and purification. Importantly, phage elution at 37 °C was superior to that at room temperature, and phage amplification in a 150-μl volume of E. coli cells was superior to that in 250-μl volume. Application of UFSP to two monoclonal antibodies generated from clonally expanded plasma cells in subacute sclerosing panencephalitis (SSPE) brain identified high-affinity measles virus-specific-peptide epitopes. The UFSP panning methodology will expedite identification of peptides reacting with antibodies generated in other diseases of unknown antigenic specificity such as multiple sclerosis (MS), sarcoidosis and Behcet’s disease. PMID:19095007

  11. Development of a novel efficient method to construct an adenovirus library displaying random peptides on the fiber knob.

    PubMed

    Yamamoto, Yuki; Goto, Naoko; Miura, Kazuki; Narumi, Kenta; Ohnami, Shumpei; Uchida, Hiroaki; Miura, Yoshiaki; Yamamoto, Masato; Aoki, Kazunori

    2014-03-01

    Redirection of adenovirus vectors by engineering the capsid-coding region has shown limited success because proper targeting ligands are generally unknown. To overcome this limitation, we constructed an adenovirus library displaying random peptides on the fiber knob, and its screening led to successful selections of several particular targeted vectors. In the previous library construction method, the full length of an adenoviral genome was generated by a Cre-lox mediated in vitro recombination between a fiber-modified plasmid library and the enzyme-digested adenoviral DNA/terminal protein complex (DNA-TPC) before transfection to the producer cells. In this system, the procedures were complicated and time-consuming, and approximately 30% of the vectors in the library were defective with no displaying peptide. These may hinder further extensive exploration of cancer-targeting vectors. To resolve these problems, in this study, we developed a novel method with the transfection of a fiber-modified plasmid library and a fiberless adenoviral DNA-TPC in Cre-expressing 293 cells. The use of in-cell Cre recombination and fiberless adenovirus greatly simplified the library-making steps. The fiberless adenovirus was useful in suppressing the expansion of unnecessary adenovirus vectors. In addition, the complexity of the library was more than a 10^4 level in one well in a 6-well dish, which was 10-fold higher than that of the original method. The results demonstrated that this novel method is useful in producing a high-quality live adenovirus library, which could facilitate the development of targeted adenovirus vectors for a variety of applications in medicine. PMID:24380399

  12. Development of a novel efficient method to construct an adenovirus library displaying random peptides on the fiber knob

    PubMed Central

    Yamamoto, Yuki; Goto, Naoko; Miura, Kazuki; Narumi, Kenta; Ohnami, Shumpei; Uchida, Hiroaki; Miura, Yoshiaki; Yamamoto, Masato; Aoki, Kazunori

    2014-01-01

    Redirection of adenovirus vectors by engineering the capsid-coding region has shown limited success because proper targeting ligands are generally unknown. To overcome this limitation, we constructed an adenovirus library displaying random peptides on the fiber knob, and its screening led to successful selections of several particular targeted vectors. In the previous library construction method, the full length of an adenoviral genome was generated by a Cre-lox mediated in vitro recombination between a fiber-modified plasmid library and the enzyme-digested adenoviral DNA/terminal protein complex (DNA-TPC) before transfection to the producer cells. In this system, the procedures were complicated and time-consuming, and approximately 30% of the vectors in the library were defective with no displaying peptide. These may hinder further extensive exploration of cancer-targeting vectors. To resolve these problems, in this study, we developed a novel method with the transfection of a fiber-modified plasmid library and a fiberless adenoviral DNA-TPC in Cre-expressing 293 cells. The use of in-cell Cre recombination and fiberless adenovirus greatly simplified the library-making steps. The fiberless adenovirus was useful in suppressing the expansion of unnecessary adenovirus vectors. In addition, the complexity of the library was more than a 10^4 level in one well in a 6-well dish, which was 10-fold higher than that of the original method. The results demonstrated that this novel method is useful in producing a high-quality live adenovirus library, which could facilitate the development of targeted adenovirus vectors for a variety of applications in medicine. PMID:24380399

  13. Comparison of non-surgical treatment methods for patients with lumbar spinal stenosis: protocol for a randomized controlled trial

    PubMed Central

    2014-01-01

    Background Lumbar spinal stenosis is the most common reason for spinal surgery in older adults. Previous studies have shown that surgery is effective for severe cases of stenosis, but many patients with mild to moderate symptoms are not surgical candidates. These patients and their providers are seeking effective non-surgical treatment methods to manage their symptoms; yet there is a paucity of comparative effectiveness research in this area. This knowledge gap has hindered the development of clinical practice guidelines for non-surgical treatment approaches for lumbar spinal stenosis. Methods/design This study is a prospective randomized controlled clinical trial that will be conducted from November 2013 through October 2016. The sample will consist of 180 older adults (>60 years) who have both an anatomic diagnosis of stenosis confirmed by diagnostic imaging, and signs/symptoms consistent with a clinical diagnosis of lumbar spinal stenosis confirmed by clinical examination. Eligible subjects will be randomized into one of three pragmatic treatment groups: 1) usual medical care; 2) individualized manual therapy and rehabilitative exercise; or 3) community-based group exercise. All subjects will be treated for a 6-week course of care. The primary subjective outcome is the Swiss Spinal Stenosis Questionnaire, a self-reported measure of pain/function. The primary objective outcome is the Self-Paced Walking Test, a measure of walking capacity. The secondary objective outcome will be a measurement of physical activity during activities of daily living, using the SenseWear Armband, a portable device to be worn on the upper arm for one week. The primary analysis will use linear mixed models to compare the main effects of each treatment group on the changes in each outcome measure. Secondary analyses will include a responder analysis by group and an exploratory analysis of potential baseline predictors of treatment outcome. Discussion Our study should provide evidence

  14. Leveraging Random Number Generation for Mastery of Learning in Teaching Quantitative Research Courses via an E-Learning Method

    ERIC Educational Resources Information Center

    Boonsathorn, Wasita; Charoen, Danuvasin; Dryver, Arthur L.

    2014-01-01

    E-Learning brings access to a powerful but often overlooked teaching tool: random number generation. Using random number generation, a practically infinite number of quantitative problem-solution sets can be created. In addition, within the e-learning context, in the spirit of the mastery of learning, it is possible to assign online quantitative…
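
    The idea described above can be sketched in a few lines. The following is a hypothetical illustration (the function name, problem type, and seeding scheme are assumptions, not part of the cited article): a pseudo-random generator seeded with, say, a student ID produces a problem-solution pair that is unique per student yet fully reproducible for grading.

```python
import random

def make_problem(seed):
    """Generate one randomized descriptive-statistics problem and its solution.

    Seeding the generator (e.g., with a student ID) makes each student's
    problem distinct yet exactly reproducible for automated grading.
    """
    rng = random.Random(seed)
    data = [rng.randint(10, 99) for _ in range(5)]
    answer = round(sum(data) / len(data), 2)
    question = f"Compute the mean of the sample {data}."
    return question, answer

# The same seed always regenerates the identical problem-solution set.
q, a = make_problem(seed=1001)
```

    Because the seed fully determines the problem, a practically infinite number of problem-solution sets can be generated and later verified without storing them.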

  15. Early identification of and proactive palliative care for patients in general practice, incentive and methods of a randomized controlled trial

    PubMed Central

    2011-01-01

    Background According to the World Health Organization, patients who can benefit from palliative care should be identified earlier to enable proactive palliative care. Up to now, this has not been common practice and has hardly been addressed in the scientific literature. Instead, palliative care remains limited to the terminal phase and restricted to patients with cancer. Therefore, we trained general practitioners (GPs) in identifying palliative patients in an earlier phase of their disease trajectory and in delivering structured proactive palliative care. The aim of our study is to determine if this training, in combination with consulting an expert in palliative care regarding each palliative patient's tailored care plan, can improve different aspects of the quality of the remaining life of patients with severe chronic diseases such as chronic obstructive pulmonary disease, congestive heart failure and cancer. Methods/Design A two-armed randomized controlled trial was performed. As outcome variables we studied: place of death, number of hospital admissions and number of out-of-hours GP contacts. Discussion We expect that this study will increase the number of identified palliative care patients and improve different aspects of the quality of palliative care. This is of importance to improve palliative care for patients with COPD, CHF and cancer and their informal caregivers, and to empower the GP. The study protocol is described, and possible strengths, weaknesses and consequences have been outlined. Trial Registration The Netherlands National Trial Register: NTR2815 PMID:22050863

  16. A Randomized, Controlled Trial of Levonorgestrel Vs. The Yuzpe Regimen as Emergency Contraception Method among Iranian Women

    PubMed Central

    Hoseini, Fatemeh Sadat; Eslami, Mohammad; Abbasi, Mohammed; Noroozi Fashkhami, Fatemeh; Besharati, Soheila

    2013-01-01

    Abstract Background We aimed to compare acceptability of Levonorgestrel with the Yuzpe regimen among Iranian women based on their side-effects and resulting changes in the amount and pattern of menses. Methods Five hundred twenty-nine participants aged 15-49, having regular menses and one act of unprotected intercourse within 72 h, were included in this double-blind, controlled trial in 2006-2007 and randomly assigned into LNG (n=263) and HD (n=266) groups, receiving levonorgestrel 0.75 mg in two doses 12 h apart, or ethinyl estradiol 100 μg plus levonorgestrel 0.5 mg repeated after 12 h, respectively. Results The participants receiving levonorgestrel experienced significantly fewer side-effects with respect to nausea, vomiting, and dizziness (P<0.05). The changes in the amount and pattern of menses were the same for both groups (P>0.05). No significant difference was observed between the efficacies of the treatments. Conclusion The significantly lower side-effect profile of levonorgestrel can be considered as greater acceptability, which may translate into higher effectiveness. PMID:26060625

  17. An open-label, randomized bioavailability study with alternative methods of administration of crushed ticagrelor tablets in healthy volunteers

    PubMed Central

    Teng, Renli; Carlson, Glenn; Hsia, Judith

    2015-01-01

    Objective: To compare the bioavailability and safety profile of crushed ticagrelor tablets suspended in water and administered orally or via nasogastric tube, with that of whole tablets administered orally. Methods: In this single-center, open-label, randomized, three-treatment crossover study, 36 healthy volunteers were randomized to receive a single 90-mg dose of ticagrelor administered orally as a whole tablet or as crushed tablets suspended in water and given orally or via a nasogastric tube into the stomach, with a minimum 7-day wash-out between treatments. Plasma concentrations of ticagrelor and AR-C124910XX were assessed at 0, 0.5, 1, 2, 3, 4, 6, 8, 10, 12, 16, 24, 36, and 48 hours post-ticagrelor dose for pharmacokinetic analyses. Safety and tolerability were assessed throughout the study. Results: At 0.5 hours postdose, plasma concentrations of ticagrelor and AR-C124910XX were higher with crushed tablets administered orally (148.6 ng/mL and 13.0 ng/mL, respectively) or via nasogastric tube (264.6 ng/mL and 28.6 ng/mL, respectively) compared with whole-tablet administration (33.3 ng/mL and 5.2 ng/mL, respectively). A similar trend was observed at 1 hour postdose. Ticagrelor tmax was shorter following crushed vs. whole-tablet administration (1 vs. 2 hours, respectively). Geometric mean ratios between treatments for AUC and Cmax were contained within the bioequivalence limits of 80 – 125% for ticagrelor and AR-C124910XX. All treatments were generally well tolerated. Conclusions: Ticagrelor administered as a crushed tablet is bioequivalent to whole-tablet administration, independent of mode of administration (oral or via nasogastric tube), and resulted in increased plasma concentrations of ticagrelor and AR-C124910XX at early timepoints. PMID:25500486

  18. Random Decrement Method and Modeling H/V Spectral Ratios: An Application for Soft Shallow Layers Characterization

    NASA Astrophysics Data System (ADS)

    Song, H.; Huerta-Lopez, C. I.; Martinez-Cruzado, J. A.; Rodriguez-Lozoya, H. E.; Espinoza-Barreras, F.

    2009-05-01

    Results of an ongoing study to estimate the ground response upon weak and moderate earthquake excitations are presented. A reliable site characterization in terms of soil properties and sub-soil layer configuration is required for a trustworthy estimation of the ground response upon dynamic loads. This study can be described by the following four steps: (1) Ambient noise measurements were collected at the study site, where a bridge was under construction between the cities of Tijuana and Ensenada in Mexico. The time series were collected using a six-channel recorder with a 16-bit ADC converter within a maximum voltage range of ±2.5 V; the recorder has optional settings for Butterworth/Bessel filters, gain, and sampling rate. The sensors were three-orthogonal-component (X, Y, Z) accelerometers with a sensitivity of 20 V/g, a flat frequency response from DC to 200 Hz, and a full range of ±0.25 g. (2) Experimental H/V spectral ratios were computed to estimate the fundamental vibration frequency at the site. (3) Using the time-domain experimental H/V spectral ratios as well as the original recorded time series, the random decrement method was applied to estimate the fundamental frequency and damping of the site (system). (4) Finally, the theoretical H/V spectral ratios were obtained by means of the stiffness-matrix wave propagation method. The results were then compared with a geotechnical study available for the site.
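
    Step (3), the random decrement method, can be illustrated with a minimal sketch. All signal parameters below (2 Hz resonance, 100 Hz sampling, trigger level, segment length) are hypothetical stand-ins, not the study's actual data: segments of an ambient-vibration record are extracted wherever the signal up-crosses a trigger level and then averaged; the resulting signature approximates the free-decay response, from which natural frequency (and, from its envelope, damping) can be estimated.

```python
import numpy as np

def random_decrement(x, threshold, seg_len):
    """Average the signal segments that begin wherever x up-crosses threshold.

    The averaged segment (the random decrement signature) approximates the
    system's free-decay response: its dominant period gives the natural
    frequency, and its envelope decay gives the damping.
    """
    starts = np.where((x[:-1] < threshold) & (x[1:] >= threshold))[0]
    starts = starts[starts + seg_len <= x.size]  # drop segments running past the end
    return np.mean([x[s:s + seg_len] for s in starts], axis=0)

# Synthetic ambient record: a 2 Hz resonance buried in noise (hypothetical values).
fs = 100.0
t = np.arange(0.0, 60.0, 1.0 / fs)
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 2.0 * t) + 0.3 * rng.standard_normal(t.size)

sig = random_decrement(x, threshold=0.8, seg_len=200)

# The spectral peak of the signature recovers the dominant (natural) frequency.
freqs = np.fft.rfftfreq(sig.size, d=1.0 / fs)
f_peak = freqs[np.argmax(np.abs(np.fft.rfft(sig - sig.mean())))]
```

    Because the segments are phase-aligned by the trigger condition, the periodic response averages coherently while the random noise averages toward zero.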

  19. Comparison of training methods to improve walking in persons with chronic spinal cord injury: a randomized clinical trial

    PubMed Central

    Alexeeva, Natalia; Sames, Carol; Jacobs, Patrick L.; Hobday, Lori; DiStasio, Marcello M.; Mitchell, Sarah A.; Calancie, Blair

    2011-01-01

    Objective To compare two forms of device-specific training – body-weight-supported (BWS) ambulation on a fixed track (TRK) and BWS ambulation on a treadmill (TM) – to comprehensive physical therapy (PT) for improving walking speed in persons with chronic, motor-incomplete spinal cord injury (SCI). Methods Thirty-five adult subjects with a history of chronic SCI (>1 year; AIS ‘C’ or ‘D’) participated in a 13-week (1 hour/day; 3 days per week) training program. Subjects were randomized into one of the three training groups. Subjects in the two BWS groups trained without the benefit of additional input from a physical therapist or gait expert. For each training session, performance values and heart rate were monitored. Pre- and post-training maximal 10-m walking speed, balance, muscle strength, fitness, and quality of life were assessed in each subject. Results All three training groups showed significant improvement in maximal walking speed, muscle strength, and psychological well-being. A significant improvement in balance was seen for PT and TRK groups but not for subjects in the TM group. In all groups, post-training measures of fitness, functional independence, and perceived health and vitality were unchanged. Conclusions Our results demonstrate that persons with chronic, motor-incomplete SCI can improve walking ability and psychological well-being following a concentrated period of ambulation therapy, regardless of training method. Improvement in walking speed was associated with improved balance and muscle strength. In spite of the fact that we withheld any formal input of a physical therapist or gait expert from subjects in the device-specific training groups, these subjects did just as well as subjects receiving comprehensive PT for improving walking speed and strength. It is likely that further modest benefits would accrue to those subjects receiving a combination of device-specific training with input from a physical therapist or gait expert to

  20. Dose Limits for Man do not Adequately Protect the Ecosystem

    SciTech Connect

    Higley, Kathryn A.; Alexakhin, Rudolf M.; McDonald, Joseph C.

    2004-08-01

    It has been known for quite some time that different organisms display differing degrees of sensitivity to the effects of ionizing radiations. Some microorganisms such as the bacterium Micrococcus radiodurans, along with many species of invertebrates, are extremely radio-resistant. Humans might be categorized as being relatively sensitive to radiation, and are a bit more resistant than some pine trees. Therefore, it could be argued that maintaining the dose limits necessary to protect humans will also result in the protection of most other species of flora and fauna. This concept is usually referred to as the anthropocentric approach: in other words, if man is protected then the environment is also adequately protected. The ecocentric approach might be stated as: the health of humans is effectively protected only when the environment is not unduly exposed to radiation. The ICRP is working on new recommendations dealing with the protection of the environment, and this debate should help to highlight a number of relevant issues concerning that topic.

  1. ENSURING ADEQUATE SAFETY WHEN USING HYDROGEN AS A FUEL

    SciTech Connect

    Coutts, D

    2007-01-22

    Demonstration projects using hydrogen as a fuel are becoming very common. Often these projects rely on project-specific risk evaluations to support project safety decisions. This is necessary because regulations, codes, and standards (hereafter referred to as standards) are just being developed. This paper will review some of the approaches being used in these evolving standards, and techniques which demonstration projects can implement to bridge the gap between current requirements and stakeholder desires. Many of the evolving standards for hydrogen fuel use performance-based language, which establishes minimum performance and safety objectives, as compared with prescriptive-based language that prescribes specific design solutions. This is being done for several reasons, including: (1) concern that establishing specific design solutions too early will stifle invention, (2) the sparse performance data available to support selection of design approaches, and (3) a risk-averse public which is unwilling to accept losses that were incurred in developing previous prescriptive design standards. The evolving standards often contain words such as: "The manufacturer shall implement the measures and provide the information necessary to minimize the risk of endangering a person's safety or health." This typically implies that the manufacturer or project manager must produce and document an acceptable level of risk. If accomplished using a comprehensive and systematic process, the demonstration project risk assessment can ease the transition to widespread commercialization. An approach to adequately evaluate and document the safety risk will be presented.

  2. DARHT - an `adequate` EIS: A NEPA case study

    SciTech Connect

    Webb, M.D.

    1997-08-01

    The Dual Axis Radiographic Hydrodynamic Test (DARHT) Facility Environmental Impact Statement (EIS) provides a case study that is interesting for many reasons. The EIS was prepared quickly, in the face of a lawsuit, for a project with unforeseen environmental impacts, for a facility that was deemed urgently essential to national security. Following judicial review the EIS was deemed to be "adequate." DARHT is a facility now being built at Los Alamos National Laboratory (LANL) as part of the Department of Energy (DOE) nuclear weapons stockpile stewardship program. DARHT will be used to evaluate the safety and reliability of nuclear weapons, evaluate conventional munitions and study high-velocity impact phenomena. DARHT will be equipped with two accelerator-driven, high-intensity X-ray machines to record images of materials driven by high explosives. DARHT will be used for a variety of hydrodynamic tests, and DOE plans to conduct some dynamic experiments using plutonium at DARHT as well.

  3. An efficient, high-order probabilistic collocation method on sparse grids for three-dimensional flow and solute transport in randomly heterogeneous porous media

    SciTech Connect

    Lin, Guang; Tartakovsky, Alexandre M.

    2009-05-01

    In this study, a probabilistic collocation method (PCM) on sparse grids was used to solve stochastic equations describing flow and transport in three-dimensional, saturated, randomly heterogeneous porous media. Karhunen-Loève (KL) decomposition was used to represent the three-dimensional log hydraulic conductivity Y = ln Ks. The hydraulic head h and average pore velocity v were obtained by solving the three-dimensional continuity equation coupled with Darcy's law with a random hydraulic conductivity field. The concentration was computed by solving a three-dimensional stochastic advection-dispersion equation with the stochastic average pore velocity v computed from Darcy's law. PCM is an extension of generalized polynomial chaos (gPC) that couples gPC with probabilistic collocation. By using sparse grid points, PCM can handle a random process with a large number of random dimensions at relatively low computational cost compared to full tensor products. Monte Carlo (MC) simulations were also conducted to verify the accuracy of the PCM. Comparing the MC and PCM results for the mean and standard deviation of concentration shows that the PCM approach is computationally more efficient than Monte Carlo simulation. Unlike the conventional moment-equation approach, PCM places no limitation on the amplitude of the random perturbation. Furthermore, PCM on sparse grids can efficiently simulate solute transport in randomly heterogeneous porous media with large variances.
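
    The KL step can be made concrete with a minimal 1-D sketch (hypothetical grid size, correlation length, and variance; the paper's actual setup is 3-D): the covariance of Y = ln Ks is eigendecomposed and truncated, so each field realization is driven by only a handful of independent standard normal variables — the reduced random dimension that the collocation method then samples at sparse-grid points.

```python
import numpy as np

# 1-D Karhunen-Loeve sketch (hypothetical parameters): represent the random
# log-conductivity field Y = ln Ks with an exponential covariance model.
n, corr_len, variance = 100, 0.2, 1.0
x = np.linspace(0.0, 1.0, n)
cov = variance * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

# KL modes are the eigenpairs of the covariance matrix, sorted by eigenvalue.
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Truncate to the leading modes capturing ~95% of the variance: this shrinks
# the random dimension from n down to k, which is what makes sparse-grid
# collocation tractable.
k = int(np.searchsorted(np.cumsum(eigvals) / eigvals.sum(), 0.95)) + 1

# One realization of Y is a combination of the k retained modes weighted by
# independent standard normal variables (the variables the PCM collocates on).
rng = np.random.default_rng(42)
xi = rng.standard_normal(k)
Y = eigvecs[:, :k] @ (np.sqrt(eigvals[:k]) * xi)
```

    In the full method, a deterministic flow solve is run at each sparse-grid collocation point in the k-dimensional space of the KL weights, and statistics of head, velocity, and concentration are assembled from those solves.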

  4. Improving access to adequate pain management in Taiwan.

    PubMed

    Scholten, Willem

    2015-06-01

    There is a global crisis in access to pain management. WHO estimates that 4.65 billion people live in countries where medical opioid consumption is near zero. For 2010, WHO considered a per capita consumption of 216.7 mg morphine equivalents adequate, while Taiwan had a per capita consumption of 0.05 mg morphine equivalents in 2007. In Asia, the use of opioids is sensitive because of the Opium Wars in the 19th century, and for this reason the focus of controlled-substances policies has been on the prevention of diversion and dependence. However, an optimal public health outcome requires that the beneficial aspects of these substances also be acknowledged. Therefore, WHO recommends a policy based on the Principle of Balance: ensuring access for medical and scientific purposes while preventing diversion, harmful use and dependence. Furthermore, international law requires that countries ensure access to opioid analgesics for medical and scientific purposes. There is evidence that opioid analgesics for chronic pain are not associated with a major risk of developing dependence. Barriers to access can be classified in the categories of overly restrictive laws and regulations; insufficient medical training on pain management and problems related to assessment of medical needs; attitudes such as an excessive fear of dependence or diversion; and economic and logistical problems. The GOPI project found many examples of such barriers in Asia. Access to opioid medicines in Taiwan can be improved by analysing the national situation and drafting a plan. The WHO policy guidelines Ensuring Balance in National Policies on Controlled Substances can be helpful for achieving this purpose, as can international guidelines for pain treatment. PMID:26068436

  5. A therapeutic application of the experience sampling method in the treatment of depression: a randomized controlled trial.

    PubMed

    Kramer, Ingrid; Simons, Claudia J P; Hartmann, Jessica A; Menne-Lothmann, Claudia; Viechtbauer, Wolfgang; Peeters, Frenk; Schruers, Koen; van Bemmel, Alex L; Myin-Germeys, Inez; Delespaul, Philippe; van Os, Jim; Wichers, Marieke

    2014-02-01

    In depression, the ability to experience daily life positive affect predicts recovery and reduces relapse rates. Interventions based on the experience sampling method (ESM-I) are ideally suited to provide insight in personal, contextualized patterns of positive affect. The aim of this study was to examine whether add-on ESM-derived feedback on personalized patterns of positive affect is feasible and useful to patients, and results in a reduction of depressive symptomatology. Depressed outpatients (n=102) receiving pharmacological treatment participated in a randomized controlled trial with three arms: an experimental group receiving add-on ESM-derived feedback, a pseudo-experimental group participating in ESM but receiving no feedback, and a control group. The experimental group participated in an ESM procedure (three days per week over a 6-week period) using a palmtop. This group received weekly standardized feedback on personalized patterns of positive affect. Hamilton Depression Rating Scale - 17 (HDRS) and Inventory of Depressive Symptoms (IDS) scores were obtained before and after the intervention. During a 6-month follow-up period, five HDRS and IDS assessments were completed. Add-on ESM-derived feedback resulted in a significant and clinically relevant stronger decrease in HDRS score relative to the control group (p<0.01; -5.5 point reduction in HDRS at 6 months). Compared to the pseudo-experimental group, a clinically relevant decrease in HDRS score was apparent at 6 months (B=-3.6, p=0.053). Self-reported depressive complaints (IDS) yielded the same pattern over time. The use of ESM-I was deemed acceptable and the provided feedback easy to understand. Patients attempted to apply suggestions from ESM-derived feedback to daily life. These data suggest that the efficacy of the traditional, passive pharmacological approach to treatment of major depression can be enhanced by using person-tailored daily life information regarding positive affect. PMID:24497255

  6. The COPE healthy lifestyles TEEN randomized controlled trial with culturally diverse high school adolescents: baseline characteristics and methods.

    PubMed

    Melnyk, Bernadette Mazurek; Kelly, Stephanie; Jacobson, Diana; Belyea, Michael; Shaibi, Gabriel; Small, Leigh; O'Haver, Judith; Marsiglia, Flavio Francisco

    2013-09-01

    Obesity and mental health disorders remain significant public health problems in adolescents. Substantial health disparities exist with minority youth experiencing higher rates of these problems. Schools are an outstanding venue to provide teens with skills needed to improve their physical and mental health, and academic performance. In this paper, the authors describe the design, intervention, methods and baseline data for a randomized controlled trial with 779 culturally diverse high-school adolescents in the southwest United States. Aims for this prevention study include testing the efficacy of the COPE TEEN program versus an attention control program on the adolescents' healthy lifestyle behaviors, Body Mass Index (BMI) and BMI%, mental health, social skills and academic performance immediately following the intervention programs, and at six and 12 months post interventions. Baseline findings indicate that greater than 40% of the sample is either overweight (n = 148, 19.00%) or obese (n = 182, 23.36%). The predominant ethnicity represented is Hispanic (n = 526, 67.52%). At baseline, 15.79% (n = 123) of the students had above average scores on the Beck Youth Inventory Depression subscale indicating mildly (n = 52, 6.68%), moderately (n = 47, 6.03%), or extremely (n = 24, 3.08%) elevated scores (see Table 1). Anxiety scores were slightly higher with 21.56% (n = 168) reporting responses suggesting mildly (n = 81, 10.40%), moderately (n = 58, 7.45%) or extremely (n = 29, 3.72%) elevated scores. If the efficacy of the COPE TEEN program is supported, it will offer schools a curriculum that can be easily incorporated into high school health courses to improve adolescent healthy lifestyle behaviors, psychosocial outcomes and academic performance. PMID:23748156

  7. A Randomized Exploratory Study to Evaluate Two Acupuncture Methods for the Treatment of Headaches Associated with Traumatic Brain Injury

    PubMed Central

    Bellanti, Dawn M.; Paat, Charmagne F.; Boyd, Courtney C.; Duncan, Alaine; Price, Ashley; Zhang, Weimin; French, Louis M.; Chae, Heechin

    2016-01-01

    Abstract Background: Headaches are prevalent among Service members with traumatic brain injury (TBI); 80% report chronic or recurrent headache. Evidence for nonpharmacologic treatments, such as acupuncture, are needed. Objective: The aim of this research was to determine if two types of acupuncture (auricular acupuncture [AA] and traditional Chinese acupuncture [TCA]) were feasible and more effective than usual care (UC) alone for TBI–related headache. Materials and Methods: Design: This was a three-armed, parallel, randomized exploratory study. Setting: The research took place at three military treatment facilities in the Washington, DC, metropolitan area. Patients: The subjects were previously deployed Service members (18–69 years old) with mild-to-moderate TBI and headaches. Intervention: The interventions explored were UC alone or with the addition of AA or TCA. Outcome Measures: The primary outcome was the Headache Impact Test (HIT). Secondary outcomes were the Numerical Rating Scale (NRS), Pittsburgh Sleep Quality Index, Post-Traumatic Stress Checklist, Symptom Checklist-90-R, Medical Outcome Study Quality of Life (QoL), Beck Depression Inventory, State-Trait Anxiety Inventory, the Automated Neuropsychological Assessment Metrics, and expectancy of outcome and acupuncture efficacy. Results: Mean HIT scores decreased in the AA and TCA groups but increased slightly in the UC-only group from baseline to week 6 [AA, −10.2% (−6.4 points); TCA, −4.6% (−2.9 points); UC, +0.8% (+0.6 points)]. Both acupuncture groups had sizable decreases in NRS (Pain Best), compared to UC (TCA versus UC: P = 0.0008, d = 1.70; AA versus UC: P = 0.0127, d = 1.6). No statistically significant results were found for any other secondary outcome measures. Conclusions: Both AA and TCA improved headache-related QoL more than UC did in Service members with TBI. PMID:27458496

  8. Impact of Violation of the Missing-at-Random Assumption on Full-Information Maximum Likelihood Method in Multidimensional Adaptive Testing

    ERIC Educational Resources Information Center

    Han, Kyung T.; Guo, Fanmin

    2014-01-01

    The full-information maximum likelihood (FIML) method makes it possible to estimate and analyze structural equation models (SEM) even when data are partially missing, enabling incomplete data to contribute to model estimation. The cornerstone of FIML is the missing-at-random (MAR) assumption. In (unidimensional) computerized adaptive testing…
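
    The missing-at-random assumption at the heart of FIML can be made concrete with a small simulation (an illustrative sketch, not material from the cited article): missingness in a response y depends only on an observed covariate x, never on the unobserved value of y itself, which is exactly the condition under which FIML-style estimation remains unbiased.

```python
import numpy as np

# Simulate missing-at-random (MAR) data: whether y is missing depends only on
# the observed covariate x, not on the unobserved value of y itself.
rng = np.random.default_rng(1)
n = 1000
x = rng.standard_normal(n)
y = 0.5 * x + rng.standard_normal(n)  # true slope 0.5 (hypothetical model)

# Logistic missingness model driven by x alone: the larger x is, the more
# likely y is to be missing. Since the mechanism ignores y, the data are MAR
# (as opposed to MCAR, where missingness is uniform, or MNAR, where it
# depends on the unobserved y).
p_missing = 1.0 / (1.0 + np.exp(-x))
missing = rng.random(n) < p_missing
y_observed = np.where(missing, np.nan, y)
```

    Under MAR, a naive complete-case mean of y is biased (high-x cases are preferentially deleted), but likelihood-based methods that condition on x, such as FIML, can still recover the model parameters from the incomplete data.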

  9. Validation of Orthopedic Postoperative Pain Assessment Methods for Dogs: A Prospective, Blinded, Randomized, Placebo-Controlled Study

    PubMed Central

    Rialland, Pascale; Authier, Simon; Guillot, Martin; del Castillo, Jérôme R. E.; Veilleux-Lemieux, Daphnée; Frank, Diane; Gauvin, Dominique; Troncy, Eric

    2012-01-01

    In the context of translational research, there is growing interest in studying surgical orthopedic pain management approaches that are common to humans and dogs. The validity of postoperative pain assessment methods is uncertain with regards to responsiveness and the potential interference of analgesia. The hypothesis was that video analysis (as a reference), electrodermal activity, and two subjective pain scales (VAS and 4A-VET) would detect different levels of pain intensity in dogs after a standardized trochleoplasty procedure. In this prospective, blinded, randomized study, postoperative pain was assessed in 25 healthy dogs during a 48-hour time frame (T). Pain was managed with placebo (Group 1, n = 10), preemptive and multimodal analgesia (Group 2, n = 5), or preemptive analgesia consisting of oral tramadol (Group 3, n = 10). Changes over time among groups were analyzed using generalized estimating equations. Multivariate regression tested the significance of relationships between pain scales and video analysis. Video analysis identified that one orthopedic behavior, namely ‘Walking with full weight bearing’ of the operated leg, decreased more in Group 1 at T24 (indicative of pain), whereas three behaviors indicative of sedation decreased in Group 2 at T24 (all p<0.004). Electrodermal activity was higher in Group 1 than in Groups 2 and 3 until T1 (p<0.0003). The VAS was not responsive. 4A-VET showed divergent results, as its orthopedic component (4A-VETleg) detected lower pain in Group 2 until T12 (p<0.0009), but its interactive component (4A-VETbeh) was increased in Group 2 from T12 to T48 (p<0.001). Concurrent validity established that 4A-VETleg scores the painful orthopedic condition accurately and that pain assessment through 4A-VETbeh and VAS was severely biased by the sedative side-effect of the analgesics. Finally, the video analysis offered a concise template for assessment in dogs with acute orthopedic pain. However, subjective pain

  10. A cluster randomized controlled trial comparing three methods of disseminating practice guidelines for children with croup [ISRCTN73394937

    PubMed Central

    Johnson, David W; Craig, William; Brant, Rollin; Mitton, Craig; Svenson, Larry; Klassen, Terry P

    2006-01-01

    Background The optimal management of croup – a common respiratory illness in young children – is well established. In particular, treatment with corticosteroids has been shown to significantly reduce the rate and duration of intubation, hospitalization, and return to care for on-going croup symptoms. Furthermore, treatment with a single dose of corticosteroids does not appear to result in any significant adverse outcomes, and yields overall cost-savings for both families and the health care system. However, as has been shown with many other diseases, there is a significant gap between what we know and what we do. The overall aim of this study is to identify, from a societal perspective, the costs and associated benefits of three strategies for implementing a practice guideline that addresses the management of croup. Methods/designs We propose to use a matched pair cluster trial in 24 Alberta hospitals randomized into three intervention groups. We will use mixed methods to assess outcomes including linkage and analysis of administrative databases obtained from Alberta Health and Wellness, retrospective medical chart audit, and prospective telephone surveys of the parents of children diagnosed to have croup. The intervention strategies to be compared will be mailing of printed educational materials (low intensity intervention), mailing plus a combination of interactive educational meetings, educational outreach visits, and reminders (intermediate intensity intervention), and a combination of mailing, interactive sessions, outreach visits, reminders plus identification of local opinion leaders and establishment of local consensus processes (high intensity intervention). The primary objective is to determine which of the three intervention strategies is most effective at lowering the rate of hospital days per 1,000 disease episodes. Secondary objectives are to determine which of the three dissemination strategies is most effective at increasing the use of

  11. Methods for testing theory and evaluating impact in randomized field trials: intent-to-treat analyses for integrating the perspectives of person, place, and time.

    PubMed

    Brown, C Hendricks; Wang, Wei; Kellam, Sheppard G; Muthén, Bengt O; Petras, Hanno; Toyinbo, Peter; Poduska, Jeanne; Ialongo, Nicholas; Wyman, Peter A; Chamberlain, Patricia; Sloboda, Zili; MacKinnon, David P; Windham, Amy

    2008-06-01

    Randomized field trials provide unique opportunities to examine the effectiveness of an intervention in real world settings and to test and extend both theory of etiology and theory of intervention. These trials are designed not only to test for overall intervention impact but also to examine how impact varies as a function of individual level characteristics, context, and across time. Examination of such variation in impact requires analytical methods that take into account the trial's multiple nested structure and the evolving changes in outcomes over time. The models that we describe here merge multilevel modeling with growth modeling, allowing for variation in impact to be represented through discrete mixtures--growth mixture models--and nonparametric smooth functions--generalized additive mixed models. These methods are part of an emerging class of multilevel growth mixture models, and we illustrate these with models that examine overall impact and variation in impact. In this paper, we define intent-to-treat analyses in group-randomized multilevel field trials and discuss appropriate ways to identify, examine, and test for variation in impact without inflating the Type I error rate. We describe how to make causal inferences more robust to misspecification of covariates in such analyses and how to summarize and present these interactive intervention effects clearly. Practical strategies for reducing model complexity, checking model fit, and handling missing data are discussed using six randomized field trials to show how these methods may be used across trials randomized at different levels. PMID:18215473

  12. Effect of random structure on permeability and heat transfer characteristics for flow in 2D porous medium based on MRT lattice Boltzmann method

    NASA Astrophysics Data System (ADS)

    Yang, PeiPei; Wen, Zhi; Dou, RuiFeng; Liu, Xunliang

    2016-08-01

    Flow and heat transfer through a 2D random porous medium are studied by using the lattice Boltzmann method (LBM). For the random porous medium, the influence of disordered cylinder arrangement on permeability and Nusselt number is investigated. Results indicate that the permeability and Nusselt number for different cylinder locations are unequal even with the same number and size of cylinders. New correlations for the permeability and the coefficient b′ of the Forchheimer equation are proposed for a random porous medium composed of Gaussian-distributed circular cylinders. Furthermore, a general set of heat transfer correlations is proposed and compared with existing experimental data and empirical correlations. Our results show that the Nusselt number increases with porosity; hence, heat transfer predictions are more accurate when the effect of porosity is taken into account.
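    For reference, the Forchheimer equation whose inertial coefficient is fitted above can be written in one common form (the symbols here are the standard ones, not notation taken from the paper):

```latex
-\frac{\mathrm{d}P}{\mathrm{d}x} \;=\; \frac{\mu}{K}\,u \;+\; b\,\rho\,u^{2}
```

    where \(\mu\) is the dynamic viscosity, \(K\) the permeability, \(\rho\) the fluid density, \(u\) the Darcy velocity, and \(b\) the inertial (Forchheimer) coefficient; the quadratic term captures the departure from Darcy's law at higher flow rates.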

  13. Random Walks on Random Graphs

    NASA Astrophysics Data System (ADS)

    Cooper, Colin; Frieze, Alan

    The aim of this article is to discuss some of the notions and applications of random walks on finite graphs, especially as they apply to random graphs. In this section we give some basic definitions; in Section 2 we review applications of random walks in computer science; and in Section 3 we focus on walks in random graphs.
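    As a minimal illustration of the setting discussed above, the following sketch builds an Erdős–Rényi random graph G(n, p) and measures the cover time of a simple random walk on it; the parameters and function names are illustrative assumptions, not taken from the article.

```python
import random

def random_graph(n, p, seed=0):
    """Erdos-Renyi G(n, p): include each possible edge independently with probability p."""
    rng = random.Random(seed)
    adj = {v: [] for v in range(n)}
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                adj[u].append(v)
                adj[v].append(u)
    return adj

def cover_time(adj, start=0, seed=1):
    """Steps until a simple random walk has visited every vertex.
    Assumes the graph is connected, which holds with high probability
    when p is well above log(n)/n."""
    rng = random.Random(seed)
    current, visited, steps = start, {start}, 0
    while len(visited) < len(adj):
        current = rng.choice(adj[current])  # move to a uniformly random neighbor
        visited.add(current)
        steps += 1
    return steps

g = random_graph(20, 0.5)
print(cover_time(g))  # at least n - 1 steps are always needed
```

    For G(n, p) well above the connectivity threshold, the expected cover time is Θ(n log n); running the sketch with several walk seeds gives a feel for the variance around that order of growth.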

  14. Analysis of entropy extraction efficiencies in random number generation systems

    NASA Astrophysics Data System (ADS)

    Wang, Chao; Wang, Shuang; Chen, Wei; Yin, Zhen-Qiang; Han, Zheng-Fu

    2016-05-01

    Random numbers (RNs) have applications in many areas: lottery games, gambling, computer simulation, and, most importantly, cryptography [N. Gisin et al., Rev. Mod. Phys. 74 (2002) 145]. In cryptography theory, the theoretical security of the system calls for high quality RNs. Therefore, developing methods for producing unpredictable RNs with adequate speed is an attractive topic. Early on, despite the lack of theoretical support, pseudo RNs generated by algorithmic methods performed well and satisfied reasonable statistical requirements. However, as implemented, those pseudorandom sequences were completely determined by mathematical formulas and initial seeds, which cannot introduce extra entropy or information. In these cases, “random” bits are generated that are not at all random. Physical random number generators (RNGs), which, in contrast to algorithmic methods, are based on unpredictable physical random phenomena, have attracted considerable research interest. However, the way that we extract random bits from those physical entropy sources has a large influence on the efficiency and performance of the system. In this manuscript, we will review and discuss several randomness extraction schemes that are based on radiation or photon arrival times. We analyze the robustness, post-processing requirements and, in particular, the extraction efficiency of those methods to aid in the construction of efficient, compact and robust physical RNG systems.
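    The classical von Neumann procedure is the simplest concrete example of the kind of randomness extraction discussed above; it is a textbook scheme, not one of the photon-arrival methods reviewed in the article, and its low extraction efficiency is exactly the sort of cost the authors analyze.

```python
def von_neumann_extract(bits):
    """Von Neumann extractor: scan i.i.d. (possibly biased) bits in pairs,
    output 0 for the pair (0, 1) and 1 for (1, 0), and discard (0, 0) and (1, 1).
    The output is unbiased, but the efficiency is only p(1 - p) output bits
    per input pair for input bias p (at most 1/4)."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:          # keep only discordant pairs
            out.append(a)
    return out

print(von_neumann_extract([0, 1, 1, 0, 0, 0, 1, 1]))  # [0, 1]
```

    Because so many raw bits are discarded, practical physical RNGs usually prefer stronger post-processing (e.g. hashing-based extractors), which is the efficiency trade-off the manuscript reviews.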

  15. Using Logistic Regression and Random Forests multivariate statistical methods for landslide spatial probability assessment in North-Est Sicily, Italy

    NASA Astrophysics Data System (ADS)

    Trigila, Alessandro; Iadanza, Carla; Esposito, Carlo; Scarascia-Mugnozza, Gabriele

    2015-04-01

    The first phase of the work aimed to identify the spatial relationships between landslide locations and the 13 related factors by using the Frequency Ratio bivariate statistical method. The analysis was then carried out by adopting a multivariate statistical approach, according to the Logistic Regression technique and the Random Forests technique, which gave the best results in terms of AUC. The models were performed and evaluated with different sample sizes and also taking into account the temporal variation of input variables such as areas burned by wildfire. The most significant outcomes of this work are the relevant influence of the sample size on the model results and the strong importance of some environmental factors (e.g. land use and wildfires) for the identification of the depletion zones of extremely rapid shallow landslides.

  16. Bayesian response adaptive randomization using longitudinal outcomes.

    PubMed

    Hatayama, Tomoyoshi; Morita, Satoshi; Sakamaki, Kentaro

    2015-01-01

    The response adaptive randomization (RAR) method is used to increase the number of patients assigned to more efficacious treatment arms in clinical trials. In many trials evaluating longitudinal patient outcomes, RAR methods based only on the final measurement may not benefit significantly from RAR because of its delayed initiation. We propose a Bayesian RAR method to improve RAR performance by accounting for longitudinal patient outcomes (longitudinal RAR). We use a Bayesian linear mixed effects model to analyze longitudinal continuous patient outcomes for calculating a patient allocation probability. In addition, we aim to mitigate the loss of statistical power because of large patient allocation imbalances by embedding adjusters into the patient allocation probability calculation. Using extensive simulation we compared the operating characteristics of our proposed longitudinal RAR method with those of the RAR method based only on the final measurement and with an equal randomization method. Simulation results showed that our proposed longitudinal RAR method assigned more patients to the presumably superior treatment arm compared with the other two methods. In addition, the embedded adjuster effectively worked to prevent extreme patient allocation imbalances. However, our proposed method may not function adequately when the treatment effect difference is moderate or less, and still needs to be modified to deal with unexpectedly large departures from the presumed longitudinal data model. PMID:26099995
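    A generic tempered allocation rule of the kind used in RAR can be sketched as follows; the tempering exponent and the clamp standing in for the paper's "adjuster" are illustrative assumptions, not the authors' actual model.

```python
def rar_allocation_prob(p_superior, c=0.5, floor=0.1):
    """Probability of assigning the next patient to arm A.
    p_superior: current posterior probability that arm A is the better arm
    (in the paper this would come from a Bayesian linear mixed effects model
    fitted to the accumulating longitudinal outcomes).
    c: tempering exponent (c = 0 reduces to equal randomization).
    floor: clamp that, like the paper's embedded adjuster, keeps the
    allocation probability away from 0 and 1 to limit imbalance."""
    num = p_superior ** c
    prob_a = num / (num + (1 - p_superior) ** c)
    return min(max(prob_a, floor), 1 - floor)

print(rar_allocation_prob(0.5))   # 0.5: no evidence either way, fair coin
print(rar_allocation_prob(0.99))  # clamped at 0.9 despite strong evidence
```

    Recomputing this probability as each longitudinal measurement arrives, rather than only at the final measurement, is what allows the adaptation to start earlier in the trial.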

  17. Determining Adequate Margins in Head and Neck Cancers: Practice and Continued Challenges.

    PubMed

    Williams, Michelle D

    2016-09-01

    Margin assessment remains a critical component of oncologic care for head and neck cancer patients. As an integrated team, surgeons and pathologists work together to assess margins in these complex patients. Differences in the method of margin sampling can affect the information obtainable and thereby affect outcomes. Additionally, what distance constitutes an "adequate or clear" margin for patient care continues to be debated. Ultimately, future studies and potentially secondary modalities to augment pathologic margin assessment (i.e., in situ imaging or molecular assessment) may enhance local control in head and neck cancer patients. PMID:27469263

  18. The Alchemy of "Costing Out" an Adequate Education

    ERIC Educational Resources Information Center

    Hanushek, Eric A.

    2006-01-01

    In response to the rapid rise in court cases related to the adequacy of school funding, a variety of alternative methods have been developed to provide an analytical base about the necessary expenditure on schools. These approaches have been titled to give an aura of a thoughtful and solid scientific basis: the professional judgment model, the…

  19. Structured pharmaceutical analysis of the Systematic Tool to Reduce Inappropriate Prescribing is an effective method for final-year medical students to improve polypharmacy skills: a randomized controlled trial.

    PubMed

    Keijsers, Carolina J P W; van Doorn, Adriaan B D; van Kalles, Anouk; de Wildt, Dick J; Brouwers, Jacobus R B J; van de Kamp, Henrieke J; Jansen, Paul A F

    2014-07-01

    Medical students may not be adequately trained to prescribe appropriately to older adults with polypharmacy. This study addressed how to teach students to minimize inappropriate polypharmacy. Final-year medical students (N = 106) from two Dutch schools of medicine participated in this randomized controlled trial with a pre/posttest design. The Systematic Tool to Reduce Inappropriate Prescribing (STRIP) was used as the intervention. This medication review tool consists of five steps and is part of the Dutch multidisciplinary guideline on polypharmacy. Step two is a structured pharmaceutical analysis of drug use, assessed using six questions regarding undertreatment, ineffective treatment, overtreatment, potential adverse effects, contraindications or interactions, and dose adjustments. It is used in combination with the Screening Tool to Alert doctors to Right Treatment and the Screening Tool of Older Person's Prescriptions checklists. Students were asked to optimize the medication lists of real people, with or without the use of the STRIP. The number of correct or potentially harmful decisions that the students made when revising the lists was determined by comparison with expert consensus. Students who used the STRIP had better scores than control students; they made more correct decisions (9.3 vs 7.0, 34%; P < .001, correlation coefficient (r) = 0.365) and fewer potentially harmful decisions (3.9 vs 5.6, -30%; P < .001, r = 0.386). E-learning did not have a different effect from that of non-E-learning methods. Students were satisfied with the method. The STRIP method is effective in helping final-year medical students improve their prescribing skills. PMID:24916615

  20. Decision Tree, Bagging and Random Forest methods detect TEC seismo-ionospheric anomalies around the time of the Chile, (Mw = 8.8) earthquake of 27 February 2010

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, Mehdi

    2016-06-01

    In this paper, for the first time, ensemble methods including Decision Tree, Bagging and Random Forest have been proposed in the field of earthquake precursors to detect GPS-TEC (Total Electron Content) seismo-ionospheric anomalies around the time and location of the Chile earthquake of 27 February 2010. All of the implemented ensemble methods detected a striking anomaly in the time series of TEC data, 1 day after the earthquake at 14:00 UTC. The results indicate that the proposed methods, owing to their performance, speed and simplicity, are quite promising and deserve serious attention as new predictive tools for the detection of seismo-ionospheric anomalies.

  1. Effectiveness of the Dader Method for pharmaceutical care in patients with bipolar I disorder: EMDADER-TAB: study protocol for a randomized controlled trial

    PubMed Central

    2014-01-01

    Background Bipolar I disorder (BD-I) is a chronic mental illness characterized by the presence of one or more manic episodes, or both depressive and manic episodes, usually separated by asymptomatic intervals. Pharmacists can contribute to the management of BD-I, mainly with the use of effective and safe drugs, and can improve patients’ quality of life through pharmaceutical care. Some studies have shown the effect of pharmaceutical care on the achievement of therapeutic goals in different illnesses; however, to our knowledge, there is a lack of randomized controlled trials designed to assess the effect of pharmacist intervention in patients with BD. The aim of this study is to assess the effectiveness of the Dader Method for pharmaceutical care in patients with BD-I. Methods/design A randomized, controlled, prospective, single-center clinical trial with a duration of 12 months will be performed to compare the effect of the Dader Method of pharmaceutical care with the usual care process of patients in a psychiatric clinic. Patients diagnosed with BD-I aged between 18 and 65 years who have been discharged or referred from the outpatient service of the San Juan de Dios Clinic (Antioquia, Colombia) will be included. Patients will be randomized into the intervention group, who will receive pharmaceutical care provided by pharmacists working in collaboration with psychiatrists, or into the control group, who will receive usual care and verbal-written counseling regarding BD. Study outcomes will be assessed at baseline and at 3, 6, 9, and 12 months after randomization. The primary outcome will be to measure the number of hospitalizations, emergency service consultations, and unscheduled outpatient visits. Effectiveness, safety, adherence, and quality of life will be assessed as secondary outcomes. Statistical analyses will be performed using two-tailed McNemar tests, Pearson chi-square tests, and Student’s t-tests; a P value <0.05 will be considered statistically significant.

  2. Comparing MTI randomization procedures to blocked randomization.

    PubMed

    Berger, Vance W; Bejleri, Klejda; Agnor, Rebecca

    2016-02-28

    Randomization is one of the cornerstones of the randomized clinical trial, and there is no shortage of methods one can use to randomize patients to treatment groups. When deciding which one to use, researchers must bear in mind that not all randomization procedures are equally adept at achieving the objective of randomization, namely, balanced treatment groups. One threat is chronological bias, and permuted blocks randomization does such a good job at controlling chronological bias that it has become the standard randomization procedure in clinical trials. But permuted blocks randomization is especially vulnerable to selection bias, and as a result the maximum tolerated imbalance (MTI) procedures were proposed as better alternatives. In comparing the procedures, we have somewhat of a false controversy, in that actual practice goes uniformly one way (permuted blocks), whereas scientific arguments go uniformly the other way (MTI procedures). There is no argument in the literature to suggest that the permuted block design is better than or even as good as the MTI procedures, but this dearth is matched by an equivalent one regarding actual trials using the MTI procedures. So the 'controversy', if we are to call it that, pits misguided precedent against sound advice that tends to be ignored in practice. We shall review the issues to determine scientifically which of the procedures is better and, therefore, should be used. PMID:26337607
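    The simplest MTI procedure, the big stick design, can be sketched in a few lines: assign by fair coin unless the running imbalance has reached the tolerated maximum, in which case the next assignment deterministically restores balance. The parameters below are illustrative, not taken from the article.

```python
import random

def big_stick_sequence(n, mti=3, seed=42):
    """Big stick design (an MTI procedure): fair-coin assignment to arms
    'A'/'B', except that when |n_A - n_B| reaches the maximum tolerated
    imbalance the assignment is forced toward the under-allocated arm."""
    rng = random.Random(seed)
    seq, imbalance = [], 0  # imbalance = n_A - n_B
    for _ in range(n):
        if imbalance >= mti:
            arm = 'B'            # forced: restore balance
        elif imbalance <= -mti:
            arm = 'A'            # forced: restore balance
        else:
            arm = rng.choice('AB')
        imbalance += 1 if arm == 'A' else -1
        seq.append(arm)
    return seq

print(''.join(big_stick_sequence(20)))
```

    Unlike permuted blocks, only the boundary assignments are deterministic, so far fewer allocations are predictable to an investigator trying to guess the next assignment, which is the selection-bias argument made above.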

  3. Identification by random forest method of HLA class I amino acid substitutions associated with lower survival at day 100 in unrelated donor hematopoietic cell transplantation

    PubMed Central

    Marino, Susana R.; Lin, Shang; Maiers, Martin; Haagenson, Michael; Spellman, Stephen; Klein, John P.; Binkowski, T. Andrew; Lee, Stephanie J.; van Besien, Koen

    2011-01-01

    The identification of important amino acid substitutions associated with low survival in hematopoietic cell transplantation (HCT) is hampered by the large number of observed substitutions compared to the small number of patients available for analysis. Random forest analysis is designed to address these limitations. We studied 2,107 HCT recipients with good or intermediate risk hematologic malignancies to identify HLA class I amino acid substitutions associated with reduced survival at day 100 post-transplant. Random forest analysis and traditional univariate and multivariate analyses were used. Random forest analysis identified amino acid substitutions in 33 positions that were associated with reduced 100 day survival, including HLA-A 9, 43, 62, 63, 76, 77, 95, 97, 114, 116, 152, 156, 166, and 167; HLA-B 97, 109, 116, and 156; and HLA-C 6, 9, 11, 14, 21, 66, 77, 80, 95, 97, 99, 116, 156, 163, and 173. Thirteen had been previously reported by other investigators using classical biostatistical approaches. Using the same dataset, traditional multivariate logistic regression identified only 5 amino acid substitutions associated with lower day 100 survival. Random forest analysis is a novel statistical methodology for analysis of HLA-mismatching and outcome studies, capable of identifying important amino acid substitutions missed by other methods. PMID:21441965

  4. Patients with Celiac Disease Are Not Followed Adequately

    PubMed Central

    Herman, Margot L.; Rubio-Tapia, Alberto; Lahr, Brian D.; Larson, Joseph J.; Van Dyke, Carol T.; Murray, Joseph A.

    2012-01-01

    Background & Aims Adherence to a gluten-free diet is the only effective treatment for celiac disease. It has been recommended that patients be followed, make regular visits to the clinic, and undergo serologic analysis for markers of celiac disease, although a follow-up procedure has not been standardized. We determined how many patients with celiac disease are actually followed. Methods We collected data on 122 patients with biopsy-proven celiac disease, diagnosed between 1996 and 2006 in Olmsted County, Minnesota (70% women, median age of 42 years) for whom complete medical records and verification of residency were available. We determined the frequency at which patients received follow-up examinations, from 6 months to 5 years after diagnosis. The Kaplan-Meier method was used to estimate event rates at 1 and 5 year(s). Patients were classified according to categories of follow-up procedures recommended by the American Gastroenterology Association (AGA). Results We estimated that by 1 and 5 year(s) after diagnosis with celiac disease, 41.0% and 88.7% of the patients had follow-up visits, 33.6% and 79.8% were assessed for compliance with a gluten-free diet, 3.3% and 15.8% met with a registered dietitian, 2.5% and 18.1% had an additional intestinal biopsy, and 22.1% and 65.6% received serologic testing for markers of celiac disease. Among 113 patients (93%) who were followed for more than 4 years, only 35% received follow-up analyses that were consistent with AGA recommendations. Conclusions Patients with celiac disease are not followed consistently. Follow-up examinations are often inadequate and do not follow AGA recommendations. Improving follow-up strategies for patients with celiac disease could improve management of this disease. PMID:22610009
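    The Kaplan-Meier method used above has a compact product-limit form; a stdlib-only sketch of the generic estimator (illustrative, not the study's code, and with made-up numbers in the demo):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.
    times: follow-up time per patient; events: 1 if the event occurred,
    0 if the patient was censored at that time.
    Returns [(time, estimated survival probability)] at each event time."""
    data = sorted(zip(times, events))
    n_risk = len(data)            # patients still at risk
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = removed = 0
        while i < len(data) and data[i][0] == t:
            d += data[i][1]       # events at time t
            removed += 1          # events + censorings leaving the risk set
            i += 1
        if d:
            surv *= 1 - d / n_risk
            curve.append((t, surv))
        n_risk -= removed
    return curve

print(kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0]))
```

    In the study's usage the "event" is a follow-up milestone (e.g. a clinic visit or serologic test) rather than death, which is why the estimated rates at 1 and 5 years can be read directly off the curve.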

  5. Percentage of Adults with High Blood Pressure Whose Hypertension Is Adequately Controlled

    MedlinePlus


  6. Are family medicine residents adequately trained to deliver palliative care?

    PubMed Central

    Mahtani, Ramona; Kurahashi, Allison M.; Buchman, Sandy; Webster, Fiona; Husain, Amna; Goldman, Russell

    2015-01-01

    Objective To explore educational factors that influence family medicine residents’ (FMRs’) intentions to offer palliative care and palliative care home visits to patients. Design Qualitative descriptive study. Setting A Canadian, urban, specialized palliative care centre. Participants First-year (n = 9) and second-year (n = 6) FMRs. Methods Semistructured interviews were conducted with FMRs following a 4-week palliative care rotation. Questions focused on participant experiences during the rotation and perceptions about their roles as family physicians in the delivery of palliative care and home visits. Participant responses were analyzed to summarize and interpret patterns related to their educational experience during their rotation. Main findings Four interrelated themes were identified that described this experience: foundational skill development owing to training in a specialized setting; additional need for education and support; unaddressed gaps in pragmatic skills; and uncertainty about family physicians’ role in palliative care. Conclusion Residents described experiences that both supported and inadvertently discouraged them from considering future engagement in palliative care. Reassuringly, residents were also able to underscore opportunities for improvement in palliative care education. PMID:27035008

  7. 21 CFR 514.117 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... production performance, or biased observation. One or more adequate and well-controlled studies are required... 21 Food and Drugs 6 2013-04-01 2013-04-01 false Adequate and well-controlled studies. 514.117... Applications § 514.117 Adequate and well-controlled studies. (a) Purpose. The primary purpose of...

  8. 21 CFR 514.117 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... production performance, or biased observation. One or more adequate and well-controlled studies are required... 21 Food and Drugs 6 2012-04-01 2012-04-01 false Adequate and well-controlled studies. 514.117... Applications § 514.117 Adequate and well-controlled studies. (a) Purpose. The primary purpose of...

  9. 21 CFR 514.117 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... production performance, or biased observation. One or more adequate and well-controlled studies are required... 21 Food and Drugs 6 2011-04-01 2011-04-01 false Adequate and well-controlled studies. 514.117... Applications § 514.117 Adequate and well-controlled studies. (a) Purpose. The primary purpose of...

  10. 21 CFR 514.117 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... production performance, or biased observation. One or more adequate and well-controlled studies are required... 21 Food and Drugs 6 2010-04-01 2010-04-01 false Adequate and well-controlled studies. 514.117... Applications § 514.117 Adequate and well-controlled studies. (a) Purpose. The primary purpose of...

  11. 21 CFR 514.117 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... production performance, or biased observation. One or more adequate and well-controlled studies are required... 21 Food and Drugs 6 2014-04-01 2014-04-01 false Adequate and well-controlled studies. 514.117... Applications § 514.117 Adequate and well-controlled studies. (a) Purpose. The primary purpose of...

  12. 21 CFR 801.5 - Medical devices; adequate directions for use.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Medical devices; adequate directions for use. 801... (CONTINUED) MEDICAL DEVICES LABELING General Labeling Provisions § 801.5 Medical devices; adequate directions for use. Adequate directions for use means directions under which the layman can use a device...

  13. 21 CFR 801.5 - Medical devices; adequate directions for use.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Medical devices; adequate directions for use. 801... (CONTINUED) MEDICAL DEVICES LABELING General Labeling Provisions § 801.5 Medical devices; adequate directions for use. Adequate directions for use means directions under which the layman can use a device...

  14. 21 CFR 801.5 - Medical devices; adequate directions for use.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Medical devices; adequate directions for use. 801... (CONTINUED) MEDICAL DEVICES LABELING General Labeling Provisions § 801.5 Medical devices; adequate directions for use. Adequate directions for use means directions under which the layman can use a device...

  15. 21 CFR 801.5 - Medical devices; adequate directions for use.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Medical devices; adequate directions for use. 801... (CONTINUED) MEDICAL DEVICES LABELING General Labeling Provisions § 801.5 Medical devices; adequate directions for use. Adequate directions for use means directions under which the layman can use a device...

  16. 21 CFR 801.5 - Medical devices; adequate directions for use.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Medical devices; adequate directions for use. 801... (CONTINUED) MEDICAL DEVICES LABELING General Labeling Provisions § 801.5 Medical devices; adequate directions for use. Adequate directions for use means directions under which the layman can use a device...

  17. 76 FR 51041 - Hemoglobin Standards and Maintaining Adequate Iron Stores in Blood Donors; Public Workshop

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-17

    ... HUMAN SERVICES Food and Drug Administration Hemoglobin Standards and Maintaining Adequate Iron Stores in... Standards and Maintaining Adequate Iron Stores in Blood Donors.'' The purpose of this public workshop is to... donor safety and blood availability, and potential measures to maintain adequate iron stores in...

  18. 36 CFR 13.960 - Who determines when there is adequate snow cover?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE... Preserve Snowmachine (snowmobile) Operations § 13.960 Who determines when there is adequate snow cover? The superintendent will determine when snow cover is adequate for snowmachine use. The superintendent will follow...

  19. 36 CFR 13.960 - Who determines when there is adequate snow cover?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE... Preserve Snowmachine (snowmobile) Operations § 13.960 Who determines when there is adequate snow cover? The superintendent will determine when snow cover is adequate for snowmachine use. The superintendent will follow...

  20. 36 CFR 13.960 - Who determines when there is adequate snow cover?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE... Preserve Snowmachine (snowmobile) Operations § 13.960 Who determines when there is adequate snow cover? The superintendent will determine when snow cover is adequate for snowmachine use. The superintendent will follow...

  1. 36 CFR 13.960 - Who determines when there is adequate snow cover?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE... Preserve Snowmachine (snowmobile) Operations § 13.960 Who determines when there is adequate snow cover? The superintendent will determine when snow cover is adequate for snowmachine use. The superintendent will follow...

  2. 36 CFR 13.960 - Who determines when there is adequate snow cover?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE... Preserve Snowmachine (snowmobile) Operations § 13.960 Who determines when there is adequate snow cover? The superintendent will determine when snow cover is adequate for snowmachine use. The superintendent will follow...

  3. Personal child and mother carbon monoxide exposures and kitchen levels: methods and results from a randomized trial of woodfired chimney cookstoves in Guatemala (RESPIRE).

    PubMed

    Smith, Kirk R; McCracken, John P; Thompson, Lisa; Edwards, Rufus; Shields, Kyra N; Canuz, Eduardo; Bruce, Nigel

    2010-07-01

    During the first randomized intervention trial (RESPIRE: Randomized Exposure Study of Pollution Indoors and Respiratory Effects) in air pollution epidemiology, we pioneered application of passive carbon monoxide (CO) diffusion tubes to measure long-term personal exposures to woodsmoke. Here we report on the protocols and validations of the method, trends in personal exposure for mothers and their young children, and the efficacy of the introduced improved chimney stove in reducing personal exposures and kitchen concentrations. Passive diffusion tubes originally developed for industrial hygiene applications were deployed on a quarterly basis to measure 48-hour integrated personal carbon monoxide exposures among 515 children 0-18 months of age and 532 mothers aged 15-55 years and area samples in a subsample of 77 kitchens, in households randomized into control and intervention groups. Instrument comparisons among types of passive diffusion tubes and against a continuous electrochemical CO monitor indicated that tubes responded nonlinearly to CO, and regression calibration was used to reduce this bias. Before stove introduction, the baseline arithmetic (geometric) mean 48-h child (n=270), mother (n=529) and kitchen (n=65) levels were, respectively, 3.4 (2.8), 3.4 (2.8) and 10.2 (8.4) p.p.m. The between-group analysis of the 3355 post-baseline measurements found CO levels to be significantly lower among the intervention group during the trial period: kitchen levels: -90%; mothers: -61%; and children: -52% in geometric means. No significant deterioration in stove effect was observed over the 18 months of surveillance. The reliability of these findings is strengthened by the large sample size made feasible by these unobtrusive and inexpensive tubes, measurement error reduction through instrument calibration, and a randomized, longitudinal study design. These results from the first randomized trial of improved household energy technology in a developing country and
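    The regression calibration described, mapping passive-tube readings onto a co-located reference monitor, reduces to ordinary least squares in its simplest linear form; a stdlib sketch with made-up numbers (not the study's data or its actual nonlinear calibration model):

```python
def fit_calibration(tube, reference):
    """Ordinary least-squares line mapping passive-tube CO readings onto a
    reference (e.g. continuous electrochemical) monitor. Returns (slope,
    intercept) of the calibration line."""
    n = len(tube)
    mx = sum(tube) / n
    my = sum(reference) / n
    sxx = sum((x - mx) ** 2 for x in tube)
    sxy = sum((x - mx) * (y - my) for x, y in zip(tube, reference))
    slope = sxy / sxx
    intercept = my - slope * mx
    return slope, intercept

def calibrate(reading, slope, intercept):
    """Apply the fitted line to correct a raw tube reading."""
    return slope * reading + intercept

slope, intercept = fit_calibration([1, 2, 3, 4], [2.1, 4.0, 5.9, 8.2])
print(calibrate(5.0, slope, intercept))
```

    Because the tubes responded nonlinearly to CO, the study's actual calibration would need a nonlinear term; the linear sketch above only illustrates the error-reduction idea of anchoring an inexpensive sampler to a reference instrument.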

  4. Randomized clinical trial of multimodal physiotherapy treatment compared to overnight lidocaine ointment in women with provoked vestibulodynia: Design and methods.

    PubMed

    Morin, Mélanie; Dumoulin, Chantale; Bergeron, Sophie; Mayrand, Marie-Hélène; Khalifé, Samir; Waddell, Guy; Dubois, Marie-France

    2016-01-01

Provoked vestibulodynia (PVD) is a highly prevalent and debilitating condition, yet its management relies mainly on non-empirically validated interventions. Among the many causes of PVD, there is growing evidence that pelvic floor muscle (PFM) dysfunctions play an important role in its pathophysiology. Multimodal physiotherapy, which addresses these dysfunctions, is judged by experts to be highly effective and is recommended as a first-line treatment. However, the effectiveness of this promising intervention has been evaluated through only two small uncontrolled trials. The proposed bi-center, single-blind, parallel-group, randomized controlled trial (RCT) aims to evaluate the efficacy of multimodal physiotherapy and compare it to a frequently used first-line treatment, topical overnight application of lidocaine, in women with PVD. A total of 212 women diagnosed with PVD according to a standardized protocol were eligible for the study and were randomly assigned to either multimodal physiotherapy or lidocaine treatment for 10 weeks. The primary outcome measure is pain during intercourse (assessed with a numerical rating scale). Secondary measures include sexual function, pain quality, psychological factors (including pain catastrophizing, anxiety, depression and fear of pain), PFM morphology and function, and patients' global impression of change. Assessments are made at baseline, post-treatment and at the 6-month follow-up. This manuscript presents and discusses the rationale, design and methodology of the first RCT investigating physiotherapy in comparison to a commonly prescribed first-line treatment, overnight topical lidocaine, for women with PVD. PMID:26600287

  5. A Novel Method for Assessment of Polyethylene Liner Wear in Radiopaque Tantalum Acetabular Cups: Clinical Validation in Patients Enrolled in a Randomized Controlled Trial.

    PubMed

    Troelsen, Anders; Greene, Meridith E; Ayers, David C; Bragdon, Charles R; Malchau, Henrik

    2015-12-01

    Conventional radiostereometric analysis (RSA) for wear is not possible in patients with tantalum cups. We propose a novel method for wear analysis in tantalum cups. Wear was assessed by gold standard RSA and the novel method in total hip arthroplasty patients enrolled in a randomized controlled trial receiving either titanium or tantalum cups (n=46). The novel method estimated the center of the head using a model based on identification of two proximal markers on the stem and knowledge of the stem/head configuration. The novel method was able to demonstrate a pattern of wear that was similar to the gold standard in titanium cups. The novel method offered accurate assessment and is a viable solution for assessment of wear in studies with tantalum cups. PMID:26216229

  6. Using Fuzzy Logic to Identify Schools Which May Be Misclassified by the No Child Left Behind Adequate Yearly Progress Policy

    ERIC Educational Resources Information Center

    Yates, Donald W.

    2009-01-01

    This investigation developed, tested, and prototyped a Fuzzy Inference System (FIS) that would assist decision makers in identifying schools that may have been misclassified by existing Adequate Yearly Progress (AYP) methods. This prototype was then used to evaluate Louisiana elementary schools using published school data for Academic Year 2004. …
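The abstract does not specify the membership functions or rules of the FIS; as a rough, entirely hypothetical illustration of the idea (a school near the AYP cutoff gets a graded "misclassification risk" rather than a hard pass/fail), a single Mamdani-style rule with triangular memberships might look like this sketch. The variable names, membership shapes, and the rule itself are all invented for illustration:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def misclassification_risk(proficiency_pct, margin_pct):
    """Fuzzy degree to which a school near the AYP cutoff may be misclassified.

    proficiency_pct : percent of students scoring proficient (hypothetical input)
    margin_pct      : signed distance from the AYP target (hypothetical input)
    """
    near_cutoff = tri(margin_pct, -10.0, 0.0, 10.0)          # close to the target
    low_prof = max(0.0, min(1.0, (60.0 - proficiency_pct) / 30.0))
    # One Mamdani-style rule: IF near cutoff AND proficiency low THEN risk high
    return min(near_cutoff, low_prof)
```

A real FIS would aggregate many such rules and defuzzify the result; this only shows how a crisp cutoff becomes a graded membership.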

  7. Adequate iodine levels in healthy pregnant women. A cross-sectional survey of dietary intake in Turkey

    PubMed Central

    Kasap, Burcu; Akbaba, Gülhan; Yeniçeri, Emine N.; Akın, Melike N.; Akbaba, Eren; Öner, Gökalp; Turhan, Nilgün Ö.; Duru, Mehmet E.

    2016-01-01

Objectives: To assess current iodine levels and related factors among healthy pregnant women. Methods: In this cross-sectional, hospital-based study, healthy pregnant women (n=135) were scanned for thyroid volume, provided urine samples for urinary iodine concentration, and completed a questionnaire including sociodemographic characteristics and dietary habits targeted for iodine consumption at the Department of Obstetrics and Gynecology, School of Medicine, Muğla Sıtkı Koçman University, Muğla, Turkey, between August 2014 and February 2015. Sociodemographic data were analyzed by simple descriptive statistics. Results: Median urinary iodine concentration was 222.0 µg/L, indicating adequate iodine intake during pregnancy. According to World Health Organization (WHO) criteria, 28.1% of subjects had iodine deficiency, 34.1% had adequate iodine intake, 34.8% had more than adequate iodine intake, and 3.0% had excessive iodine intake during pregnancy. Education level, higher monthly income, current employment, consuming iodized salt, and adding salt to food during or after cooking were associated with higher urinary iodine concentration. Conclusion: Iodine status of healthy pregnant women was adequate, although the percentage of women with more than adequate iodine intake was higher than previously reported in the literature. PMID:27279519
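The four categories in the abstract map onto the commonly cited WHO cutoffs for median urinary iodine concentration in pregnancy. A minimal classifier, assuming those widely quoted thresholds (verify against the current WHO reference before any real use):

```python
def who_iodine_category(uic_ug_per_l):
    """Classify urinary iodine concentration (µg/L) for pregnant women.

    Cutoffs follow the commonly cited WHO criteria (assumed here, not
    taken from this study): <150 insufficient, 150-249 adequate,
    250-499 above requirements, >=500 excessive.
    """
    if uic_ug_per_l < 150:
        return "insufficient"
    if uic_ug_per_l < 250:
        return "adequate"
    if uic_ug_per_l < 500:
        return "above requirements"
    return "excessive"
```

Applied to the study's median of 222.0 µg/L, this returns "adequate", matching the reported conclusion.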

  8. Methods from the theory of random heterogeneous media for quantifying myocardial morphology in normal and dilated hearts.

    PubMed

    Karch, Rudolf; Neumann, Friederike; Ullrich, Robert; Heinze, Georg; Neumüller, Josef; Podesser, Bruno K; Neumann, Martin

    2010-02-01

In the present study, descriptors from the theory of random heterogeneous media were used to characterize the morphology of the myocardial interstitial space in histological sections from hearts of healthy subjects and of patients with idiopathic dilated cardiomyopathy (DCM). Histological sections from resected DCM hearts (n = 9) were compared with donor hearts showing no signs of cardiac disease (n = 6). From control to DCM, the area fraction φ1 of the interstitial space increased from 0.13 ± 0.05 to 0.27 ± 0.08, the chord-length z from 1.67 ± 0.61 to 5.56 ± 1.78 µm, the pore-size δ from 0.72 ± 0.13 to 1.73 ± 0.40 µm, and the distance r_min of the first local minimum in the two-point correlation function from 10.99 ± 1.09 to 18.57 ± 4.36 µm, whereas the specific interface length s and the decay-rate γ of the lineal-path function decreased from 0.20 ± 0.07 to 0.16 ± 0.04 µm^-1 and from 0.39 ± 0.09 to 0.16 ± 0.05 µm^-1, respectively. All descriptors (except for s) were significantly different (p < 0.05) between control and DCM, reflecting an increasingly heterogeneous morphology in DCM hearts. Our results suggest that (1) descriptors originally developed to characterize the morphology of random heterogeneous media are well suited for histomorphometry of DCM, and (2) among the descriptors studied, either the pore-size δ or the chord-length z qualify best to discriminate between control and DCM hearts. PMID:19937468
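Two of the simplest descriptors above are easy to compute directly on a segmented (binary) section. A minimal sketch, assuming 1-pixels mark the interstitial phase and chords are measured along horizontal sampling lines (the paper's exact estimators and units are not reproduced):

```python
import numpy as np

def area_fraction(binary):
    """Volume (area) fraction of the phase of interest: mean of the indicator."""
    return float(binary.mean())

def mean_chord_length(binary, pixel_size=1.0):
    """Mean length of uninterrupted runs of the phase along image rows.

    A chord is a maximal run of 1-pixels on a horizontal sampling line;
    its mean length is a basic descriptor of a two-phase random medium.
    """
    chords = []
    for row in binary:
        padded = np.concatenate(([0], row, [0]))
        starts = np.flatnonzero(np.diff(padded) == 1)   # 0 -> 1 transitions
        ends = np.flatnonzero(np.diff(padded) == -1)    # 1 -> 0 transitions
        chords.extend(ends - starts)
    return pixel_size * float(np.mean(chords)) if chords else 0.0
```

Multiplying by the physical pixel size converts the chord length to µm, as reported in the abstract.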

  9. Zinc content of selected tissues and taste perception in rats fed zinc deficient and zinc adequate rations

    SciTech Connect

    Boeckner, L.S.; Kies, C.

    1986-03-05

The objective of the study was to determine the effects of feeding zinc sufficient and zinc deficient rations on taste sensitivity and zinc contents of selected organs in rats. The 36 Sprague-Dawley male weanling rats were divided into 2 groups and fed zinc deficient or zinc adequate rations. The animals were subjected to 4 trial periods in which a choice of deionized distilled water or a solution of quinine sulfate at 1.28 × 10^-6 was given. A randomized schedule for rat sacrifice was used. No differences were found between zinc deficient and zinc adequate rats in taste preference aversion scores for quinine sulfate in the first three trial periods; however, in the last trial period rats in the zinc sufficient group drank somewhat less water containing quinine sulfate as a percentage of total water consumption than did rats fed the zinc deficient ration. Significantly higher zinc contents of kidney, brain and parotid salivary glands were seen in zinc adequate rats compared to zinc deficient rats at the end of the study. However, liver and tongue zinc levels were lower for both groups at the close of the study than were those of rats sacrificed at the beginning of the study.

  10. Salt sales survey: a simplified, cost-effective method to evaluate population salt reduction programs--a cluster-randomized trial.

    PubMed

    Ma, Yuan; He, Feng J; Li, Nicole; Hao, Jesse; Zhang, Jing; Yan, Lijing L; Wu, Yangfeng

    2016-04-01

Twenty-four-hour urine collection, as a gold standard method of measuring salt intake, is costly and resource consuming, which limits its use in monitoring population salt reduction programs. Our study aimed to determine whether a salt sales survey could serve as an alternative method. This was a substudy of the China Rural Health Initiative-Sodium Reduction Study (CRHI-SRS), in which 120 villages were randomly allocated (1:1:2) into a price subsidy+health education (PS+HE) group, an HE-only group, or a control group. Salt substitutes (SS) were supplied to shops in the intervention groups; 24-h urine was collected from 2567 randomly selected adults at the end of the trial to evaluate the effects of the intervention. Ten villages were randomly selected from each group (that is, 30 villages in total), and 166 shops from these villages were invited to participate in the monthly salt sales survey. The results showed that during the intervention period, mean monthly sales of SS per shop were 38.0 kg for the PS+HE group, 19.2 kg for the HE-only group and 2.2 kg for the control group (P<0.05), which was consistent with the results from the 24-h urine sodium and potassium data. The intervention effects of CRHI-SRS on sodium and potassium intake estimated from SS sales were 101% and 114%, respectively, of those observed from the 24-h urine data. Furthermore, the salt sales survey cost only 14% of the cost of the 24-h urine method and had greater statistical power. The results indicate that a salt sales survey could serve as a simple, sensitive and cost-effective method to evaluate community-based salt reduction programs in which salt is mainly added by the consumers. PMID:26657005

  11. An Automated Three-Dimensional Detection and Segmentation Method for Touching Cells by Integrating Concave Points Clustering and Random Walker Algorithm

    PubMed Central

    Gong, Hui; Chen, Shangbin; Zhang, Bin; Ding, Wenxiang; Luo, Qingming; Li, Anan

    2014-01-01

Characterizing cytoarchitecture is crucial for understanding brain functions and neural diseases. In neuroanatomy, it is an important task to accurately extract cell populations' centroids and contours. Recent advances have permitted imaging at single cell resolution for an entire mouse brain using the Nissl staining method. However, it is difficult to precisely segment numerous cells, especially those cells touching each other. As presented herein, we have developed an automated three-dimensional detection and segmentation method applied to the Nissl staining data, with the following two key steps: 1) concave points clustering to determine the seed points of touching cells; and 2) random walker segmentation to obtain cell contours. Also, we have evaluated the performance of our proposed method with several mouse brain datasets, which were captured with the micro-optical sectioning tomography imaging system, and the datasets include closely touching cells. Compared with traditional detection and segmentation methods, our approach shows promising detection accuracy and high robustness. PMID:25111442
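The random-walker step assigns each unlabeled pixel to whichever seed it is most likely to reach first, by solving a Dirichlet problem on the image graph (Grady's formulation). A minimal two-label, 4-connected 2D sketch with a dense Laplacian follows; it is a toy illustration of the technique, not the paper's 3D pipeline or parameters:

```python
import numpy as np

def random_walker_2label(img, seeds, beta=50.0):
    """Minimal random-walker segmentation for two seed labels.

    img   : 2D float array of intensities
    seeds : 2D int array; 0 = unlabeled, 1 or 2 = seed label
    Returns an int array assigning label 1 or 2 to every pixel.
    """
    h, w = img.shape
    n = h * w
    idx = np.arange(n).reshape(h, w)

    # 4-connected edges with Gaussian weights w_ij = exp(-beta * (g_i - g_j)^2)
    pairs = [
        (idx[:, :-1].ravel(), idx[:, 1:].ravel(),
         img[:, :-1].ravel(), img[:, 1:].ravel()),      # horizontal neighbours
        (idx[:-1, :].ravel(), idx[1:, :].ravel(),
         img[:-1, :].ravel(), img[1:, :].ravel()),      # vertical neighbours
    ]

    # Assemble the combinatorial graph Laplacian (dense; fine for toy sizes)
    L = np.zeros((n, n))
    for a, b, ga, gb in pairs:
        wgt = np.exp(-beta * (ga - gb) ** 2)
        L[a, b] -= wgt
        L[b, a] -= wgt
        np.add.at(L, (a, a), wgt)
        np.add.at(L, (b, b), wgt)

    seeded = seeds.ravel() > 0
    free = ~seeded
    # Potentials: 1 at label-1 seeds, 0 at label-2 seeds; solve the
    # Dirichlet problem L_ff x_f = -L_fs x_s for the free (unseeded) nodes
    x_s = (seeds.ravel()[seeded] == 1).astype(float)
    x_f = np.linalg.solve(L[np.ix_(free, free)], -L[np.ix_(free, seeded)] @ x_s)

    prob1 = np.empty(n)
    prob1[seeded] = x_s
    prob1[free] = x_f
    return np.where(prob1 >= 0.5, 1, 2).reshape(h, w)
```

In the paper's setting, the seeds would come from the concave-points clustering step; here they are placed by hand.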

  12. A Self-Administered Method of Acute Pressure Block of Sciatic Nerves for Short-Term Relief of Dental Pain: A Randomized Study

    PubMed Central

    Wang, Xiaolin; Zhao, Wanghong; Wang, Ye; Hu, Jiao; Chen, Qiu; Yu, Juncai; Wu, Bin; Huang, Rong; Gao, Jie; He, Jiman

    2014-01-01

Objectives While stimulation of the peripheral nerves increases the pain threshold, chronic pressure stimulation of the sciatic nerve is associated with sciatica. We recently found that acute pressure block of the sciatic nerve inhibits pain. Therefore, we propose that the pressure causing the pain pathology is chronic, not acute. Here, we report a novel self-administered method: acute pressure block of the sciatic nerves is applied by the patients themselves for short-term relief of pain from dental diseases. Design This was a randomized, single-blind study. Setting Hospital patients. Patients Patients aged 16–60 years with acute pulpitis, acute apical periodontitis, or pericoronitis of the third molar of the mandible experiencing pain ≥3 on the 11-point numerical pain rating scale. Interventions Three-minute pressure to sciatic nerves was applied by using the hands (hand pressure method) or by having the patients squat to force the thigh and shin as tightly as possible on the sandwiched sciatic nerve bundles (self-administered method). Outcomes The primary efficacy variable was the mean difference in pain scores from the baseline. Results One hundred seventy-two dental patients were randomized. The self-administered method produced significant relief from pain associated with dental diseases (P ≤ 0.001). The analgesic effect of the self-administered method was similar to that of the hand pressure method. Conclusions The self-administered method is easy to learn and can be applied at any time for pain relief. We believe that patients will benefit from this method. PMID:24400593

  13. Guidance and examination by ultrasound versus landmark and radiographic method for placement of subclavian central venous catheters: study protocol for a randomized controlled trial

    PubMed Central

    2014-01-01

    Background Central venous catheters play an important role in patient care. Real-time ultrasound-guided subclavian central venous (SCV) cannulation may reduce the incidence of complications and the time between skin penetration and the aspiration of venous blood into the syringe. Ultrasonic diagnosis of catheter misplacement and pneumothorax related to central venous catheterization is rapid and accurate. It is unclear, however, whether ultrasound real-time guidance and examination can reduce procedure times and complication rates when compared with landmark guidance and radiographic examination for SCV catheterization. Methods/Design The Subclavian Central Venous Catheters Guidance and Examination by UltraSound (SUBGEUS) study is an investigator-initiated single center, randomized, controlled two-arm trial. Three hundred patients undergoing SCV catheter placement will be randomized to ultrasound real-time guidance and examination or landmark guidance and radiographic examination. The primary outcome is the time between the beginning of the procedure and control of the catheter. Secondary outcomes include the times required for the six components of the total procedure, the occurrence of complications (pneumothorax, hemothorax, or misplacement), failure of the technique and occurrence of central venous catheter infections. Discussion The SUBGEUS trial is the first randomized controlled study to investigate whether ultrasound real-time guidance and examination for SCV catheter placement reduces all procedure times and the rate of complications. Trial registration ClinicalTrials.gov Identifier: NCT01888094 PMID:24885789

  14. Adequate bases of phase space master integrals for gg → h at NNLO and beyond

    NASA Astrophysics Data System (ADS)

    Höschele, Maik; Hoff, Jens; Ueda, Takahiro

    2014-09-01

We study master integrals needed to compute the Higgs boson production cross section via gluon fusion in the infinite top quark mass limit, using a canonical form of differential equations for master integrals, recently identified by Henn, which makes their solution possible in a straightforward algebraic way. We apply the known criteria to derive such a suitable basis for all the phase space master integrals in the aforementioned process at next-to-next-to-leading order in QCD and demonstrate that the method is applicable to next-to-next-to-next-to-leading order as well by solving a non-planar topology. Furthermore, we discuss in great detail how to find an adequate basis using practical examples. Special emphasis is devoted to master integrals which are coupled by their differential equations.
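In Henn's canonical form referred to here, the dependence on the dimensional regulator ε factorizes out of the differential equations. Schematically (a standard statement of the form, not an equation taken from this paper):

```latex
% Canonical form: for a suitable basis \vec{f} of master integrals,
% the differential equations in the kinematic variables x read
\mathrm{d}\,\vec{f}(x;\epsilon) \;=\; \epsilon\,\bigl(\mathrm{d}\tilde{A}(x)\bigr)\,\vec{f}(x;\epsilon),
\qquad
\tilde{A}(x) \;=\; \sum_{k} A_{k}\,\log \alpha_{k}(x),
% with constant matrices A_k, so the solution follows order by order
% in \epsilon as a path-ordered exponential (Chen iterated integrals):
\vec{f}(x;\epsilon) \;=\; \mathbb{P}\exp\!\Bigl(\epsilon \int_{\gamma} \mathrm{d}\tilde{A}\Bigr)\,\vec{f}(x_{0};\epsilon).
```

Finding a basis that brings the system into this form is exactly the "adequate basis" question the abstract addresses.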

  15. A practical method for ergonomic and usability evaluation of hand tools: a comparison of three random orbital sander configurations.

    PubMed

    Spielholz, P; Bao, S; Howard, N

    2001-11-01

    Tool and equipment purchasing decisions are constantly made by companies and workers, often with little objective information beyond word of mouth and marketing information. This study presents a pilot investigation of random orbital sanders using ergonomic and usability assessment techniques which can easily be applied in any industry. Three subjects performed a sanding task using three different tool configurations: 1) the current model sander, 2) current model with hose-swivel attachment, and 3) the new "ergonomically designed" model. Physical measurements were taken of muscle activity and wrist motion to complement think-aloud testing and a usability questionnaire. No significant differences were found in physical measurements between the three configurations. Participants strongly preferred the current model over the new model, reporting less perceived discomfort and vibration, despite what appeared to be improvements in the new design. Workplace changes intended to reduce the risk of injury sometimes may have no effect after significant capital investment, or in some cases even increase the risk of injury. Practical assessment of new tools or modifications can quickly determine whether the outcome is indeed an improvement. The results of this tool assessment highlight the need for objective information in tool and equipment design decisions. PMID:11757900

  16. Switching characteristics in Cu:SiO2 by chemical soak methods for resistive random access memory (ReRAM)

    NASA Astrophysics Data System (ADS)

    Chin, Fun-Tat; Lin, Yu-Hsien; Yang, Wen-Luh; Liao, Chin-Hsuan; Lin, Li-Min; Hsiao, Yu-Ping; Chao, Tien-Sheng

    2015-01-01

    A limited copper (Cu)-source Cu:SiO2 switching layer composed of various Cu concentrations was fabricated using a chemical soaking (CS) technique. The switching layer was then studied for developing applications in resistive random access memory (ReRAM) devices. Observing the resistive switching mechanism exhibited by all the samples suggested that Cu conductive filaments formed and ruptured during the set/reset process. The experimental results indicated that the endurance property failure that occurred was related to the joule heating effect. Moreover, the endurance switching cycle increased as the Cu concentration decreased. In high-temperature tests, the samples demonstrated that the operating (set/reset) voltages decreased as the temperature increased, and an Arrhenius plot was used to calculate the activation energy of the set/reset process. In addition, the samples demonstrated stable data retention properties when baked at 85 °C, but the samples with low Cu concentrations exhibited short retention times in the low-resistance state (LRS) during 125 °C tests. Therefore, Cu concentration is a crucial factor in the trade-off between the endurance and retention properties; furthermore, the Cu concentration can be easily modulated using this CS technique.
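The Arrhenius analysis mentioned above extracts an activation energy from the temperature dependence of a thermally activated rate: a straight-line fit of ln(rate) against 1/T has slope -Ea/kB. A generic sketch (illustrative only; the paper's measured quantities are not reproduced here):

```python
import numpy as np

K_B_EV = 8.617e-5  # Boltzmann constant in eV/K

def activation_energy_ev(temps_k, rates):
    """Fit the Arrhenius law rate = A * exp(-Ea / (kB * T)).

    ln(rate) is linear in 1/T with slope -Ea/kB, so the activation
    energy (in eV) is recovered from the fitted slope.
    """
    slope, _intercept = np.polyfit(1.0 / np.asarray(temps_k, dtype=float),
                                   np.log(rates), 1)
    return -slope * K_B_EV
```

In practice each (T, rate) point would come from set/reset measurements at one test temperature.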

  17. A Randomized Crossover Trial of the Effect of a Novel Method of Pressure Control (SensAwake) in Automatic Continuous Positive Airway Pressure Therapy to Treat Sleep Disordered Breathing

    PubMed Central

    Dungan, George C.; Marshall, Nathaniel S.; Hoyos, Camilla M.; Yee, Brendon J.; Grunstein, Ronald R.

    2011-01-01

    Objectives: To study the acute effect of the new SensAwake CPAP modality (reducing pressure on awakenings) on wake after sleep onset (WASO) and other polysomnographic measures in patients with obstructive sleep apnea (OSA). Study Design: Randomized crossover trial comparing an automatic continuous positive airway pressure device (AutoCPAP) with and without SensAwake on sleep architecture. CPAP naive patients received each therapy for a single night in the laboratory with at least 1-week washout. Both patients' and technicians' subjective satisfaction was assessed. Pressure data measured and stored by the AutoCPAP device were also analyzed. Results: OSA was controlled adequately by both modes (SensAwake ON apnea hypopnea index ± SD, AHI = 5.3 ± 5.6/h vs. SensAwake OFF = 5.4 ± 5.8, p = 0.9) in the 42 patients who completed the protocol. Mean and 90% pressures were significantly lower with SensAwake (mean ON = 6.9 ± 1.9 vs. OFF = 7.7 ± 2.5 cm H2O, p < 0.05; 90% pressure ON = 9.6 ± 2.7 vs. OFF = 10.6 ± 2.7 cm H2O, p < 0.02). SensAwake did not improve WASO (ON = 74 ± 54 min vs. OFF = 78 ± 51 min, p = 0.6). There were no differences in other sleep architecture measures or patient satisfaction between the 2 modalities. AutoCPAP-measured AHI closely approximated PSG-derived (ROC AUC = 0.81 [95% CI 0.71-0.92], p = 0.0001). Conclusions: SensAwake provides similar control of the AHI to the standard AutoCPAP mode but does so at lower mean and 90% pressures. However, no measure of sleep architecture was significantly improved by the SensAwake mode during this initial acute exposure. The internal AutoCPAP AHI detection and calculation was similar to PSG-derived AHI measures. Longer term studies are needed to evaluate any long-term influence of SensAwake on WASO. Citation: Dungan GC; Marshall NS; Hoyos CM; Yee BJ; Grunstein RR. A randomized crossover trial of the effect of a novel method of pressure control (SensAwake) in automatic continuous positive airway pressure

  18. Free variable selection QSPR study to predict 19F chemical shifts of some fluorinated organic compounds using Random Forest and RBF-PLS methods

    NASA Astrophysics Data System (ADS)

    Goudarzi, Nasser

    2016-04-01

In this work, two new and powerful chemometrics methods are applied for the modeling and prediction of the 19F chemical shift values of some fluorinated organic compounds. The radial basis function-partial least square (RBF-PLS) and random forest (RF) methods are employed to construct the models to predict the 19F chemical shifts. In this study, no separate variable selection method was applied, since the RF method can itself serve as both a variable selection and a modeling technique. Effects of the important parameters affecting the RF prediction power, such as the number of trees (nt) and the number of randomly selected variables used to split each node (m), were investigated. The root-mean-square errors of prediction (RMSEP) for the training set and the prediction set for the RBF-PLS and RF models were 44.70, 23.86, 29.77, and 23.69, respectively. Also, the correlation coefficients of the prediction set for the RBF-PLS and RF models were 0.8684 and 0.9313, respectively. The results obtained reveal that the RF model can be used as a powerful chemometrics tool for quantitative structure-property relationship (QSPR) studies.
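The two tuning parameters named in the abstract, nt and m, correspond to `n_estimators` and `max_features` in scikit-learn's random forest. A hedged sketch on synthetic data (standing in for descriptor/shift pairs; this is not the paper's 19F dataset or model):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Synthetic regression data: only the first two of ten descriptors matter
rng = np.random.RandomState(0)
X = rng.uniform(size=(300, 10))
y = 5.0 * X[:, 0] - 3.0 * X[:, 1] + 0.1 * rng.normal(size=300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# nt -> n_estimators (number of trees),
# m  -> max_features (variables tried at each split)
rf = RandomForestRegressor(n_estimators=200, max_features=3, random_state=0)
rf.fit(X_tr, y_tr)

rmsep = float(np.sqrt(np.mean((rf.predict(X_te) - y_te) ** 2)))
```

The built-in `feature_importances_` attribute is what lets RF double as a variable-selection step, as the abstract notes.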

  19. Autonomous Byte Stream Randomizer

    NASA Technical Reports Server (NTRS)

    Paloulian, George K.; Woo, Simon S.; Chow, Edward T.

    2013-01-01

Net-centric networking environments are often faced with limited resources and must utilize bandwidth as efficiently as possible. In networking environments that span wide areas, the data transmission has to be efficient without any redundant or exuberant metadata. The Autonomous Byte Stream Randomizer software provides an extra level of security on top of existing data encryption methods. Randomizing the data's byte stream adds an extra layer to existing data protection methods, thus making it harder for an attacker to decrypt protected data. Based on a generated cryptographically secure random seed, a random sequence of numbers is used to intelligently and efficiently swap the organization of bytes in data using the unbiased and memory-efficient in-place Fisher-Yates shuffle method. Swapping bytes and reorganizing the crucial structure of the byte data renders the data file unreadable and leaves the data in a deconstructed state. This deconstruction adds an extra level of security requiring the byte stream to be reconstructed with the random seed in order to be readable. Once the data byte stream has been randomized, the software enables the data to be distributed to N nodes in an environment. Each piece of the data in randomized and distributed form is a separate entity unreadable in its own right, but when combined with all N pieces, is able to be reconstructed back to one. Reconstruction requires possession of the key used for randomizing the bytes, leading to the generation of the same cryptographically secure random sequence of numbers used to randomize the data. This software is a cornerstone capability possessing the ability to generate the same cryptographically secure sequence on different machines and time intervals, thus allowing this software to be used more heavily in net-centric environments where data transfer bandwidth is limited.
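The seeded shuffle/unshuffle round trip can be sketched as follows. Note this demo uses Python's `random.Random`, which is deterministic across machines for a given seed but is *not* cryptographically secure; the design described above calls for a cryptographically secure seed and sequence, so a real implementation would swap in a CSPRNG:

```python
import random

def shuffle_bytes(data: bytes, seed: int) -> bytes:
    """Scramble data with an in-place Fisher-Yates shuffle driven by a seed."""
    buf = bytearray(data)
    rng = random.Random(seed)
    # Unbiased Fisher-Yates: swap position i with a random j in [0, i]
    for i in range(len(buf) - 1, 0, -1):
        j = rng.randrange(i + 1)
        buf[i], buf[j] = buf[j], buf[i]
    return bytes(buf)

def unshuffle_bytes(data: bytes, seed: int) -> bytes:
    """Invert shuffle_bytes by replaying the swap sequence in reverse order."""
    buf = bytearray(data)
    rng = random.Random(seed)
    # Regenerate the same swap sequence from the same seed...
    swaps = [(i, rng.randrange(i + 1)) for i in range(len(buf) - 1, 0, -1)]
    # ...then undo it: each swap is its own inverse, applied in reverse order
    for i, j in reversed(swaps):
        buf[i], buf[j] = buf[j], buf[i]
    return bytes(buf)
```

Without the seed, an attacker would have to guess the permutation; with it, any node holding all the pieces can reconstruct the original stream.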

  20. Effects of smartphone diaries and personal dosimeters on behavior in a randomized study of methods to document sunlight exposure.

    PubMed

    Køster, Brian; Søndergaard, Jens; Nielsen, Jesper Bo; Allen, Martin; Bjerregaard, Mette; Olsen, Anja; Bentzen, Joan

    2016-06-01

    Dosimeters and diaries have previously been used to evaluate sun-related behavior and UV exposure in local samples. However, wearing a dosimeter or filling in a diary may cause a behavioral change. The aim of this study was to examine possible confounding factors for a questionnaire validation study. We examined the effects of wearing dosimeters and filling out diaries, measurement period and recall effect on the sun-related behavior in Denmark in 2012. Our sample included 240 participants eligible by smartphone status and who took a vacation during weeks 26-32 in 2012, randomized by gender, age, education and skin type to six groups: 1) Control + diary, 2) Control, 3) 1-week dosimetry measurement, 4) 1-week dosimetry measurement + diary, 5) 3-week dosimetry measurement and 6) 1-week dosimetry measurement with 4 week delayed questionnaire. Correlation coefficients between reported outdoor time and registered outdoor time for groups 3-6 were 0.39, 0.45, 0.43 and 0.09, respectively. Group 6 was the only group not significantly correlated. Questionnaire reported outdoor exposure time was shorter in the dosimeter measurement groups (3-6) than in their respective controls. We showed that using a dosimeter or keeping a diary seems to increase attention towards the behavior examined and therefore may influence this behavior. Receiving the questionnaire with 4 week delay had a significant negative influence on correlation and recall of sunburn. When planning future UV behavior questionnaire validations, we suggest to use a 1-week interval for dosimetry measurements, no diary, and to minimize the time from end of measurement to filling out questionnaires. PMID:27419038

  1. Comparison of the genetic diversity of wild and captive groups of Microcebus murinus using the random amplified polymorphic DNA method.

    PubMed

    Neveu, H; Hafen, T; Zimmermann, E; Rumpler, Y

    1998-01-01

Continued survival of most animal species depends on population management and active protection. It is generally agreed that, in order to avoid extinction of endangered species, ex situ and in situ conservation must be developed in tandem. However, even though many recommendations have been put forward to promote the survival of captive populations, some rapidly become extinct due to loss of genetic diversity (drift effect). Genetic markers, such as random amplified polymorphic DNA (RAPD) markers, can be applied to rapid testing of many individuals. They also permit analysis of very small amounts of DNA when small species such as mouse lemurs (Microcebus) are to be tested. Using RAPD markers, we compare genetic diversity in four captive groups of Microcebus murinus to that in a sample of 70 wild mouse lemurs. Following the principles of Mendelian inheritance, each amplified fragment of DNA may be considered as a 'locus' (or an amplifying site). The series of bands amplified by a particular primer in any individual is referred to as the individual's 'profile'. We tested 5 primers, or, in the above terms, we studied 98 different 'loci'. Results showed that the captive groups had lost genetic information with respect to the wild sample. Among the four captive groups, the loss of genetic diversity varied according to their number of founders and/or the management of their captive reproduction. Our study of polymorphism permitted us to establish tools for the genetic management of captive breeding and for the determination of paternity, which frequently gives better results than behavioural studies. Simulation of introductions or departures of individuals in one very monomorphic group permitted estimation of future increases in its genetic diversity. PMID:9595690

  2. Prediction of gestational diabetes mellitus in the first trimester, comparison of fasting plasma glucose, two-step and one-step methods: a prospective randomized controlled trial.

    PubMed

    Yeral, M Ilkin; Ozgu-Erdinc, A Seval; Uygur, Dilek; Seckin, K Doga; Karsli, M Fatih; Danisman, A Nuri

    2014-08-01

Our aim was to evaluate and compare the diagnostic performance of three methods commonly used for GDM screening: fasting plasma glucose (FPG), two-step 50 g glucose challenge test (GCT), and 75 g glucose tolerance test (GTT) in a randomized study design to predict GDM in the first trimester and determine the best approach in predicting GDM. In a non-blind, parallel-group prospective randomized controlled study, 736 singleton pregnant women underwent FPG testing in the first trimester and were randomly assigned to two groups: two-step 50 g GCT and 75 g GTT. GDM diagnosis was made according to Carpenter-Coustan or ADA (American Diabetes Association) criteria in the two-step 50 g GCT and 75 g GTT groups, respectively. Subsequent testing was performed by two-step 50 g GCT at 24-28 weeks for screen negatives. After excluding the women who were lost to follow-up or withdrawn as a result of pregnancy loss, 486 pregnant women were recruited in the study. The FPG, two-step GCT, and one-step GTT methods identified GDM in 25/486 (5.1%), 15/248 (6.0%), and 27/238 (11.3%) women, respectively. Areas under the ROC curves were 0.623, 0.708, and 0.792, respectively. Sensitivities were 47.17, 68.18, and 87.1%, respectively. Specificities were 77.37, 100, and 100%, respectively. Positive predictive values were 20.33, 100, and 100%, respectively. Negative predictive values were 92.29, 97, and 98.1%, respectively. Until superior screening alternatives become available, the 75 g GTT may be preferred for GDM screening in the first trimester. PMID:24282036
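The accuracy measures listed above follow from the standard 2×2 confusion-table definitions. A minimal helper (the counts in the usage below are illustrative, not the trial's data):

```python
def screening_metrics(tp, fp, tn, fn):
    """Standard diagnostic-accuracy measures from a 2x2 table of results."""
    return {
        "sensitivity": tp / (tp + fn),   # true positives among the diseased
        "specificity": tn / (tn + fp),   # true negatives among the healthy
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }
```

For example, `screening_metrics(tp=8, fp=2, tn=85, fn=5)` gives a sensitivity of 8/13 and a PPV of 0.8; a test with no false positives, like the GCT and GTT arms here, has a specificity and PPV of 100%.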

  3. Living Well with Stroke: Design and Methods for a Randomized-Controlled Trial of a Psychosocial-Behavioral Intervention for Post-Stroke Depression

    PubMed Central

    Mitchell, Pamela H.; Teri, Linda; Veith, Richard; Buzaitis, Ann; Tirschwell, David; Becker, Kyra; Fruin, Michael; Kohen, Ruth; Cain, Kevin C.

    2008-01-01

Background Depression is a sufficiently common sequela of a completed stroke to warrant intervention to improve mood, social and functional outcome. Pharmacologic trials suggest short-term mood improvement from antidepressant treatment, but no studies to date have determined whether these short-term gains can be enhanced and extended by a brief psychosocial/behavioral intervention delivered by advanced practice nurses. Nor have drug trials reported on functional outcomes such as limitations in ability, limitations in participation and overall quality of survival. This randomized controlled trial is designed to evaluate the short- and long-term efficacy of a new brief psychosocial/behavioral intervention adjunctive to antidepressant treatment in reducing post-stroke depression (PSD) and improving functional outcomes. Methods 101 ischemic stroke survivors with PSD are randomly assigned to receive a brief psychosocial/behavioral intervention plus antidepressant or usual care, including antidepressants. Outcome measures The primary outcome is reduction in depressive symptom severity (Hamilton Depression Rating Scale) at 12 months following stroke. Secondary outcomes are reductions in limitations in activity (Barthel Index), reduction in limitation in participation and overall stroke impact (Stroke Impact Scale) at 6, 12, and 24 months post-stroke. Factors influencing best response to psychosocial intervention will also be explored. Discussion This paper provides detail on the design and treatment methods of this randomized trial in progress. Findings from this study will provide important information regarding the long-term efficacy of such a behavioral intervention in reducing PSD and subsequent impaired aspects of psychosocial and physical recovery. PMID:18436150

  4. Personal child and mother carbon monoxide exposures and kitchen levels: Methods and results from a randomized trial of woodfired chimney cookstoves in Guatemala (RESPIRE)

    PubMed Central

    SMITH, KIRK R.; McCRACKEN, JOHN P.; THOMPSON, LISA; EDWARDS, RUFUS; SHIELDS, KYRA N.; CANUZ, EDUARDO; BRUCE, NIGEL

    2015-01-01

    During the first randomized intervention trial (RESPIRE: Randomized Exposure Study of Pollution Indoors and Respiratory Effects) in air pollution epidemiology, we pioneered application of passive carbon monoxide (CO) diffusion tubes to measure long-term personal exposures to woodsmoke. Here we report on the protocols and validations of the method, trends in personal exposure for mothers and their young children, and the efficacy of the introduced improved chimney stove in reducing personal exposures and kitchen concentrations. Passive diffusion tubes originally developed for industrial hygiene applications were deployed on a quarterly basis to measure 48-hour integrated personal carbon monoxide exposures among 515 children 0–18 months of age and 532 mothers aged 15–55 years and area samples in a subsample of 77 kitchens, in households randomized into control and intervention groups. Instrument comparisons among types of passive diffusion tubes and against a continuous electrochemical CO monitor indicated that tubes responded nonlinearly to CO, and regression calibration was used to reduce this bias. Before stove introduction, the baseline arithmetic (geometric) mean 48-h child (n=270), mother (n=529) and kitchen (n=65) levels were, respectively, 3.4 (2.8), 3.4 (2.8) and 10.2 (8.4) p.p.m. The between-group analysis of the 3355 post-baseline measurements found CO levels to be significantly lower among the intervention group during the trial period: kitchen levels: −90%; mothers: −61%; and children: −52% in geometric means. No significant deterioration in stove effect was observed over the 18 months of surveillance. The reliability of these findings is strengthened by the large sample size made feasible by these unobtrusive and inexpensive tubes, measurement error reduction through instrument calibration, and a randomized, longitudinal study design. These results from the first randomized trial of improved household energy technology in a developing country

  5. Children's behavioral pain reactions during local anesthetic injection using cotton-roll vibration method compared with routine topical anesthesia: A randomized controlled trial

    PubMed Central

    Bagherian, Ali; Sheikhfathollahi, Mahmood

    2016-01-01

    Background: Topical anesthesia has been widely advocated as an important component of atraumatic administration of intraoral local anesthesia. The aim of this study was to use direct observation to compare children's behavioral pain reactions during local anesthetic injection with the cotton-roll vibration method versus routine topical anesthesia. Materials and Methods: Forty-eight children participated in this randomized controlled clinical trial. They received two separate inferior alveolar nerve block or primary maxillary molar infiltration injections on contralateral sides of the jaws by both cotton-roll vibration (a combination of topical anesthesia gel, cotton roll, and vibration for physical distraction) and control (routine topical anesthesia) methods. Behavioral pain reactions of children were measured according to the author-developed face, head, foot, hand, trunk, and cry (FHFHTC) scale, resulting in total scores between 0 and 18. Results: The total scores on the FHFHTC scale ranged from 0 to 5 in the cotton-roll vibration method and from 0 to 10 in the control method. The mean ± standard deviation values of total scores on the FHFHTC scale were lower in the cotton-roll vibration method (1.21 ± 1.38) than in the control method (2.44 ± 2.18), and this difference was statistically significant (P < 0.001). Conclusion: It may be concluded that the cotton-roll vibration method can be more helpful than routine topical anesthesia in reducing behavioral pain reactions in children during local anesthesia administration. PMID:27274349

  6. On Convergent Probability of a Random Walk

    ERIC Educational Resources Information Center

    Lee, Y.-F.; Ching, W.-K.

    2006-01-01

    This note introduces an interesting random walk on a straight path with cards of random numbers. The method of recurrence relations is used to obtain the convergent probability of the random walk with different initial positions.
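The note's specific card-based walk is not reproduced in the abstract, but the recurrence-relation technique it mentions can be illustrated with the classic symmetric walk on {0, ..., N} absorbed at both ends. This is a sketch under that assumed setup, not the note's own example:

```python
def absorption_probs(N, sweeps=20000):
    """Probability p[i] that a symmetric random walk started at position i
    on {0, ..., N} is absorbed at N before 0, obtained by iterating the
    recurrence p[i] = 0.5*p[i-1] + 0.5*p[i+1] with boundary values
    p[0] = 0 and p[N] = 1 until it converges."""
    p = [0.0] * (N + 1)
    p[N] = 1.0
    for _ in range(sweeps):
        for i in range(1, N):
            p[i] = 0.5 * (p[i - 1] + p[i + 1])
    return p

# Different initial positions simply read off different entries of p.
probs = absorption_probs(10)
```

For this symmetric case the iteration reproduces the closed form p[i] = i/N; asymmetric step probabilities change only the coefficients in the recurrence.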

  7. Random thoughts

    NASA Astrophysics Data System (ADS)

    ajansen; kwhitefoot; panteltje1; edprochak; sudhakar, the

    2014-07-01

    In reply to the physicsworld.com news story “How to make a quantum random-number generator from a mobile phone” (16 May, http://ow.ly/xFiYc, see also p5), which describes a way of delivering random numbers by counting the number of photons that impinge on each of the individual pixels in the camera of a Nokia N9 smartphone.

  8. Are the Psychological Needs of Adolescent Survivors of Pediatric Cancer Adequately Identified and Treated?

    PubMed Central

    Kahalley, Lisa S.; Wilson, Stephanie J.; Tyc, Vida L.; Conklin, Heather M.; Hudson, Melissa M.; Wu, Shengjie; Xiong, Xiaoping; Stancel, Heather H.; Hinds, Pamela S.

    2012-01-01

    Objectives To describe the psychological needs of adolescent survivors of acute lymphoblastic leukemia (ALL) or brain tumor (BT), we examined: (a) the occurrence of cognitive, behavioral, and emotional concerns identified during a comprehensive psychological evaluation, and (b) the frequency of referrals for psychological follow-up services to address identified concerns. Methods Psychological concerns were identified on measures according to predetermined criteria for 100 adolescent survivors. Referrals for psychological follow-up services were made for concerns previously unidentified in formal assessment or not adequately addressed by current services. Results Most survivors (82%) exhibited at least one concern across domains: behavioral (76%), cognitive (47%), and emotional (19%). Behavioral concerns emerged most often on scales associated with executive dysfunction, inattention, learning, and peer difficulties. CRT was associated with cognitive concerns, χ2(1,N=100)=5.63, p<0.05. Lower income was associated with more cognitive concerns for ALL survivors, t(47)=3.28, p<0.01, and more behavioral concerns for BT survivors, t(48)=2.93, p<0.01. Of survivors with concerns, 38% were referred for psychological follow-up services. Lower-income ALL survivors received more referrals for follow-up, χ2(1,N=41)=8.05, p<0.01. Referred survivors had more concerns across domains than non-referred survivors, ALL: t(39)=2.96, p<0.01, BT: t(39)=3.52, p<0.01. Trends suggest ALL survivors may be at risk for experiencing unaddressed cognitive needs. Conclusions Many adolescent survivors of cancer experience psychological difficulties that are not adequately managed by current services, underscoring the need for long-term surveillance. In addition to prescribing regular psychological evaluations, clinicians should closely monitor whether current support services appropriately meet survivors’ needs, particularly for lower-income survivors and those treated with CRT. PMID:22278930

  9. Measurement and comparison of one- and two-dimensional modulation transfer function of optical imaging systems based on the random target method

    NASA Astrophysics Data System (ADS)

    Kang, Jiqiang; Hao, Qun; Cheng, Xuemin

    2014-10-01

    One-dimensional modulation transfer function (1-D MTF) has been generally calculated to evaluate the image quality of optical imaging systems, such as the horizontal MTF and vertical MTF. These MTFs can be measured by the use of some mature ways. However, the information of 1-D MTF for performance evaluation may not enough for the systems handling two-dimensional (2-D) targets of high resolution, thus discussing 2-D MTF will be necessary. We investigate the measurement method for the 1-D and 2-D MTF of optical imaging systems based on the random target method, and the characteristics of 2-D MTF and 1-D MTF in terms of MTF values and cutoff frequency are also noted.

  10. Randomized Response Analysis in Mplus

    ERIC Educational Resources Information Center

    Hox, Joop; Lensvelt-Mulders, Gerty

    2004-01-01

    This article describes a technique to analyze randomized response data using available structural equation modeling (SEM) software. The randomized response technique was developed to obtain estimates that are more valid when studying sensitive topics. The basic feature of all randomized response methods is that the data are deliberately…
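Independent of the Mplus/SEM machinery the article describes, the basic randomized response idea can be illustrated with Warner's original estimator, which recovers the population proportion of a sensitive trait from the observed "yes" rate (an illustrative sketch; the function name and numbers are hypothetical):

```python
def warner_estimate(prop_yes, p):
    """Warner-model randomized response estimator: each respondent is
    directed (with known probability p != 0.5) to answer the sensitive
    question truthfully, and otherwise to answer its complement.  The
    observed 'yes' proportion then satisfies
        prop_yes = p*pi + (1 - p)*(1 - pi),
    which solves to the expression below for the sensitive proportion pi."""
    if p == 0.5:
        raise ValueError("p = 0.5 makes the design uninformative")
    return (prop_yes + p - 1) / (2 * p - 1)

# Example: 40% 'yes' answers under a p = 0.7 design gives pi ~ 0.25.
pi_hat = warner_estimate(0.40, 0.7)
```

The deliberate noise protects individual respondents while still permitting a valid population-level estimate, which is the property the SEM analysis builds on.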

  11. Parity among the randomly amplified polymorphic DNA method, multilocus enzyme electrophoresis, and Southern blot hybridization with the moderately repetitive DNA probe Ca3 for fingerprinting Candida albicans.

    PubMed Central

    Pujol, C; Joly, S; Lockhart, S R; Noel, S; Tibayrenc, M; Soll, D R

    1997-01-01

    Randomly amplified polymorphic DNA (RAPD) analysis, multilocus enzyme electrophoresis (MLEE), and Southern blot hybridization with moderately repetitive DNA probes have emerged as effective fingerprinting methods for the infectious fungus Candida albicans. The three methods have been compared for their capacities to identify identical or highly related isolates, to cluster weakly related isolates, to discriminate between unrelated isolates, and to assess microevolution within a strain. By computing similarity coefficients between 29 isolates from three cities within the continental United States, strong concordance of the results is demonstrated for RAPD analysis, MLEE, and Southern blot hybridization with the moderately repetitive probe Ca3, and weaker concordance of the results is demonstrated for these three fingerprinting methods and Southern blot hybridization with the moderately repetitive probe CARE2. All methods were also demonstrated to be able to resolve microevolution within a strain, with the Ca3 probe exhibiting the greatest resolving power. The strong correlations demonstrated between polymorphic markers assessed by the four independent fingerprinting methods and the nonrandom association between loci demonstrated by RAPD analysis and MLEE provide evidence for strong linkage disequilibrium and a clonal population structure for C. albicans. In addition, a synapomorphic allele, Pep-3A, was found to be present in all members of one of the three clusters discriminated by RAPD analysis, MLEE, and Ca3 fingerprinting, supporting the concordance of the clustering capacities of the three methods, the robustness of the clusters, and the clonal nature of the clusters. PMID:9276415

  12. A GPU accelerated, discrete time random walk model for simulating reactive transport in porous media using colocation probability function based reaction methods

    NASA Astrophysics Data System (ADS)

    Barnard, J. M.; Augarde, C. E.

    2012-12-01

    The simulation of reactions in flow through unsaturated porous media is a more complicated process when using particle tracking based models than in continuum based models. In the former, particles are reacted on an individual particle-to-particle basis using either deterministic or probabilistic methods. This means that particle tracking methods, especially when simulations of reactions are included, are computationally intensive, as the reaction simulations require tens of thousands of nearest neighbour searches per time step. Despite this, particle tracking methods merit further study due to their ability to eliminate numerical dispersion and to simulate anomalous transport and incomplete mixing of reactive solutes. A new model has been developed using discrete time random walk particle tracking methods to simulate reactive mass transport in porous media, which includes a variation on the colocation probability function based methods of reaction simulation presented by Benson & Meerschaert (2008). Model development has also included code acceleration via graphics processing units (GPUs). The nature of particle tracking methods means that they are well suited to parallelization using GPUs. The architecture of GPUs is single instruction - multiple data (SIMD): only one operation can be performed at any one time, but it can be performed on multiple data simultaneously. This allows for significant speed gains where long loops of independent operations are performed. Computationally expensive code elements, such as the nearest neighbour searches required by the reaction simulation, are therefore prime targets for GPU acceleration.
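The model itself is GPU code, but the core discrete-time random-walk transport step that such models parallelize can be sketched in plain Python. This is a minimal 1-D sketch with assumed names; the actual model additionally handles reactions via nearest-neighbour searches and colocation probabilities:

```python
import math
import random

def random_walk_step(positions, velocity, D, dt, rng=random):
    """One discrete-time random-walk step for tracked solute particles:
    deterministic advection by the local velocity plus an independent
    Gaussian diffusive displacement with standard deviation sqrt(2*D*dt).
    Each particle updates independently, which is what makes the loop a
    natural SIMD/GPU target."""
    sigma = math.sqrt(2.0 * D * dt)
    return [x + velocity * dt + sigma * rng.gauss(0.0, 1.0)
            for x in positions]
```

With D = 0 the step reduces to pure advection, which gives a quick correctness check before adding diffusion and reactions.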

  13. Rectal cancer delivery of radiotherapy in adequate time and with adequate dose is influenced by treatment center, treatment schedule, and gender and is prognostic parameter for local control: Results of study CAO/ARO/AIO-94

    SciTech Connect

    Fietkau, Rainer . E-mail: rainer.fietkau@med.uni-rostock.de; Roedel, Claus; Hohenberger, Werner; Raab, Rudolf; Hess, Clemens; Liersch, Torsten; Becker, Heinz; Wittekind, Christian; Hutter, Matthias; Hager, Eva; Karstens, Johann; Ewald, Hermann; Christen, Norbert; Jagoditsch, Michael; Martus, Peter; Sauer, Rolf

    2007-03-15

    Purpose: The impact of the delivery of radiotherapy (RT) on treatment results in rectal cancer patients is unknown. Methods and Materials: The data from 788 patients with rectal cancer treated within the German CAO/ARO/AIO-94 phase III trial were analyzed concerning the impact of the delivery of RT (adequate RT: minimal RT dose delivered, 4300 cGy for neoadjuvant RT or 4700 cGy for adjuvant RT; completion of RT in <44 days for neoadjuvant RT or <49 days for adjuvant RT) in different centers on the locoregional recurrence rate (LRR) and disease-free survival (DFS) at 5 years. The LRR, DFS, and delivery of RT were analyzed as endpoints in multivariate analysis. Results: A significant difference was found between the centers and the delivery of RT. The overall delivery of RT was a prognostic factor for the LRR (no RT, 29.6% ± 7.8%; inadequate RT, 21.2% ± 5.6%; adequate RT, 6.8% ± 1.4%; p = 0.0001) and DFS (no RT, 55.1% ± 9.1%; inadequate RT, 57.4% ± 6.3%; adequate RT, 69.1% ± 2.3%; p = 0.02). Postoperatively, delivery of RT was a prognostic factor for LRR on multivariate analysis (together with pathologic stage) but not for DFS (independent parameters, pathologic stage and age). Preoperatively, on multivariate analysis, pathologic stage, but not delivery of RT, was an independent prognostic parameter for LRR and DFS (together with adequate chemotherapy). On multivariate analysis, the treatment center, treatment schedule (neoadjuvant vs. adjuvant RT), and gender were prognostic parameters for adequate RT. Conclusion: Delivery of RT should be regarded as a prognostic factor for LRR in rectal cancer and is influenced by the treatment center, treatment schedule, and patient gender.

  14. Subspace inverse power method and polynomial chaos representation for the modal frequency responses of random mechanical systems

    NASA Astrophysics Data System (ADS)

    Pagnacco, E.; de Cursi, E. Souza; Sampaio, R.

    2016-04-01

    This study concerns the computation of frequency responses of linear stochastic mechanical systems through a modal analysis. A new strategy, based on transposing standard deterministic deflated and subspace inverse power methods into a stochastic framework, is introduced via polynomial chaos representation. The applicability and effectiveness of the proposed schemes are demonstrated through three simple application examples and one realistic application example. It is shown that null and repeated-eigenvalue situations are addressed successfully.

  15. 40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 24 2011-07-01 2011-07-01 false Exemptions for pesticides adequately... PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Exemptions § 152.20 Exemptions for pesticides adequately regulated by another Federal agency. The...

  16. 40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 24 2014-07-01 2014-07-01 false Exemptions for pesticides adequately... PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Exemptions § 152.20 Exemptions for pesticides adequately regulated by another Federal agency. The...

  17. Calculation of the Cost of an Adequate Education in Kentucky: A Professional Judgment Approach

    ERIC Educational Resources Information Center

    Verstegen, Deborah A.

    2004-01-01

    What is an adequate education and how much does it cost? In 1989, Kentucky's State Supreme Court found the entire system of education unconstitutional--"all of its parts and parcels". The Court called for all children to have access to an adequate education, one that is uniform and has as its goal the development of seven capacities, including:…

  18. 9 CFR 2.33 - Attending veterinarian and adequate veterinary care.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 9 Animals and Animal Products 1 2012-01-01 2012-01-01 false Attending veterinarian and adequate veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Research Facilities § 2.33 Attending veterinarian and adequate veterinary care. (a)...

  19. 9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 9 Animals and Animal Products 1 2012-01-01 2012-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Attending Veterinarian and Adequate Veterinary Care §...

  20. 75 FR 69648 - Safety Analysis Requirements for Defining Adequate Protection for the Public and the Workers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-15

    ... SAFETY BOARD Safety Analysis Requirements for Defining Adequate Protection for the Public and the Workers... TO THE SECRETARY OF ENERGY Safety Analysis Requirements for Defining Adequate Protection for the... safety analysis, or DSA, is to be prepared for every DOE nuclear facility. This DSA, once approved by...

  1. 42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 3 2012-10-01 2012-10-01 false Adequate financial records, statistical data, and....568 Adequate financial records, statistical data, and cost finding. (a) Maintenance of records. (1) An HMO or CMP must maintain sufficient financial records and statistical data for proper determination...

  2. 42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 3 2013-10-01 2013-10-01 false Adequate financial records, statistical data, and....568 Adequate financial records, statistical data, and cost finding. (a) Maintenance of records. (1) An HMO or CMP must maintain sufficient financial records and statistical data for proper determination...

  3. 42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 3 2014-10-01 2014-10-01 false Adequate financial records, statistical data, and....568 Adequate financial records, statistical data, and cost finding. (a) Maintenance of records. (1) An HMO or CMP must maintain sufficient financial records and statistical data for proper determination...

  4. Assessing the Stability and Safety of Procedure during Endoscopic Submucosal Dissection According to Sedation Methods: A Randomized Trial

    PubMed Central

    Lee, Sang Kil; Lee, Hyuk; Lee, Yong Chan; Park, Jun Chul; Yoo, Young Chul

    2015-01-01

    Background Although endoscopic submucosal dissection (ESD) is routinely performed under sedation, the difference in ESD performance according to sedation method is not well known. This study attempted to prospectively assess and compare the satisfaction of the endoscopists and patient stability during ESD between two sedation methods. Methods One hundred and fifty-four adult patients scheduled for ESD were sedated by either the IMIE (intermittent midazolam/propofol injection by endoscopist) or CPIA (continuous propofol infusion by anesthesiologist) method. The primary endpoint of this study was to compare the level of satisfaction of the endoscopists between the two groups. The secondary endpoints included level of satisfaction of the patients, patient’s pain scores, events interfering with the procedure, incidence of unintended deep sedation, hemodynamic and respiratory events, and ESD outcomes and complications. Results Level of satisfaction of the endoscopists was significantly higher in the CPIA group compared to the IMIE group (IMIE vs. CPIA; high satisfaction score; 63.2% vs. 87.2%, P=0.001). The incidence of unintended deep sedation was significantly higher in the IMIE group compared to the CPIA group (IMIE vs. CPIA; 17.1% vs. 5.1%, P=0.018), as was the number of patients showing spontaneous movement or those requiring physical restraint (IMIE vs. CPIA; spontaneous movement; 60.5% vs. 42.3%, P=0.024, physical restraint; 27.6% vs. 10.3%, P=0.006, respectively). In contrast, the level of satisfaction of the patients was significantly higher in the IMIE group (IMIE vs. CPIA; high satisfaction score; 85.5% vs. 67.9%, P=0.027). Pain scores of the patients, hemodynamic and respiratory events, and ESD outcomes and complications were not different between the two groups. Conclusion Continuous propofol and remifentanil infusion by an anesthesiologist during ESD can increase the satisfaction levels of the endoscopists by providing a more stable state of

  5. Bovine hemoglobin as the sole source of dietary iron does not support adequate iron status in copper-adequate or copper-deficient rats

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This experiment was designed to determine whether hemoglobin as the sole source of dietary iron (Fe) could sustain normal Fe status in growing rats. Because adequate copper (Cu) status is required for efficient Fe absorption in the rat, we also determined the effects of Cu deficiency on Fe status of...

  6. Analyzing indirect effects in cluster randomized trials. The effect of estimation method, number of groups and group sizes on accuracy and power

    PubMed Central

    Hox, Joop J.; Moerbeek, Mirjam; Kluytmans, Anouck; van de Schoot, Rens

    2013-01-01

    Cluster randomized trials assess the effect of an intervention that is carried out at the group or cluster level. Ajzen's theory of planned behavior is often used to model the effect of the intervention as an indirect effect mediated in turn by attitude, norms and behavioral intention. Structural equation modeling (SEM) is the technique of choice to estimate indirect effects and their significance. However, this is a large sample technique, and its application in a cluster randomized trial assumes a relatively large number of clusters. In practice, the number of clusters in these studies tends to be relatively small, e.g., much less than fifty. This study uses simulation methods to find the lowest number of clusters needed when multilevel SEM is used to estimate the indirect effect. Maximum likelihood estimation is compared to Bayesian analysis, with the central quality criteria being accuracy of the point estimate and the confidence interval. We also investigate the power of the test for the indirect effect. We conclude that Bayes estimation works well with much smaller cluster-level sample sizes (such as 20 clusters) than maximum likelihood estimation; although the bias is larger, the coverage is much better. When only 5–10 clusters are available per treatment condition, problems occur even with Bayesian estimation. PMID:24550881

  7. Bivariate random effects models for meta-analysis of comparative studies with binary outcomes: methods for the absolute risk difference and relative risk.

    PubMed

    Chu, Haitao; Nie, Lei; Chen, Yong; Huang, Yi; Sun, Wei

    2012-12-01

    Multivariate meta-analysis is increasingly utilised in biomedical research to combine data of multiple comparative clinical studies for evaluating drug efficacy and safety profile. When the probability of the event of interest is rare, or when the individual study sample sizes are small, a substantial proportion of studies may not have any event of interest. Conventional meta-analysis methods either exclude such studies or include them through ad hoc continuity correction by adding an arbitrary positive value to each cell of the corresponding 2 × 2 tables, which may result in less accurate conclusions. Furthermore, different continuity corrections may result in inconsistent conclusions. In this article, we discuss a bivariate Beta-binomial model derived from the Sarmanov family of bivariate distributions and a bivariate generalised linear mixed effects model for binary clustered data to make valid inferences. These bivariate random effects models use all available data without ad hoc continuity corrections, and naturally account for the potential correlation between treatment (or exposure) and control groups within studies. We then utilise the bivariate random effects models to reanalyse two recent meta-analysis data sets. PMID:21177306
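To make the continuity-correction issue concrete: the conventional ad hoc fix adds a constant (typically 0.5) to every cell of a study's 2 × 2 table before computing an effect measure such as the relative risk, and the resulting estimate depends on that arbitrary constant. This is a minimal sketch with hypothetical counts, not the bivariate models the article proposes:

```python
def relative_risk(events1, n1, events2, n2, cc=0.0):
    """Relative risk for a two-arm study with events1/n1 and events2/n2
    events, optionally applying a continuity correction cc to each cell
    of the 2x2 table (events and non-events in both arms)."""
    risk1 = (events1 + cc) / (n1 + 2 * cc)
    risk2 = (events2 + cc) / (n2 + 2 * cc)
    return risk1 / risk2

# A zero-event arm is undefined without correction, and the corrected
# estimate shifts with the (arbitrary) choice of constant:
rr_half = relative_risk(0, 100, 5, 100, cc=0.5)  # ~0.091
rr_one = relative_risk(0, 100, 5, 100, cc=1.0)   # ~0.167
```

This sensitivity to the correction constant is the inconsistency the abstract warns about, and the motivation for exact likelihood-based models that need no correction.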

  8. Mindfulness-Based Stress Reduction for Overweight/Obese Women With and Without Polycystic Ovary Syndrome: Design and Methods of a Pilot Randomized Controlled Trial

    PubMed Central

    Raja-Khan, Nazia; Agito, Katrina; Shah, Julie; Stetter, Christy M.; Gustafson, Theresa S.; Socolow, Holly; Kunselman, Allen R.; Reibel, Diane K.; Legro, Richard S.

    2015-01-01

    Mindfulness-based stress reduction (MBSR) may be beneficial for overweight/obese women, including women with polycystic ovary syndrome (PCOS), as it has been shown to reduce psychological distress and improve quality of life in other patient populations. Preliminary studies suggest that MBSR may also have salutary effects on blood pressure and blood glucose. This paper describes the design and methods of an ongoing pilot randomized controlled trial evaluating the feasibility and effects of MBSR in PCOS and non-PCOS women who are overweight or obese. Eighty six (86) women with body mass index ≥25 kg/m2, including 31 women with PCOS, have been randomized to 8 weeks of MBSR or health education control, and followed for 16 weeks. The primary outcome is mindfulness assessed with the Toronto Mindfulness Scale. Secondary outcomes include measures of blood pressure, blood glucose, quality of life, anxiety and depression. Our overall hypothesis is that MBSR will increase mindfulness and ultimately lead to favorable changes in blood pressure, blood glucose, psychological distress and quality of life in PCOS and non-PCOS women. This would support the integration of MBSR with conventional medical treatments to reduce psychological distress, cardiovascular disease and diabetes in PCOS and non-PCOS women who are overweight or obese. PMID:25662105

  9. Analyzing indirect effects in cluster randomized trials. The effect of estimation method, number of groups and group sizes on accuracy and power.

    PubMed

    Hox, Joop J; Moerbeek, Mirjam; Kluytmans, Anouck; van de Schoot, Rens

    2014-01-01

    Cluster randomized trials assess the effect of an intervention that is carried out at the group or cluster level. Ajzen's theory of planned behavior is often used to model the effect of the intervention as an indirect effect mediated in turn by attitude, norms and behavioral intention. Structural equation modeling (SEM) is the technique of choice to estimate indirect effects and their significance. However, this is a large sample technique, and its application in a cluster randomized trial assumes a relatively large number of clusters. In practice, the number of clusters in these studies tends to be relatively small, e.g., much less than fifty. This study uses simulation methods to find the lowest number of clusters needed when multilevel SEM is used to estimate the indirect effect. Maximum likelihood estimation is compared to Bayesian analysis, with the central quality criteria being accuracy of the point estimate and the confidence interval. We also investigate the power of the test for the indirect effect. We conclude that Bayes estimation works well with much smaller cluster-level sample sizes (such as 20 clusters) than maximum likelihood estimation; although the bias is larger, the coverage is much better. When only 5-10 clusters are available per treatment condition, problems occur even with Bayesian estimation. PMID:24550881

  10. Blocked randomization with randomly selected block sizes.

    PubMed

    Efird, Jimmy

    2011-01-01

    When planning a randomized clinical trial, careful consideration must be given to how participants are selected for various arms of a study. Selection and accidental bias may occur when participants are not assigned to study groups with equal probability. A simple random allocation scheme is a process by which each participant has equal likelihood of being assigned to treatment versus referent groups. However, by chance an unequal number of individuals may be assigned to each arm of the study and thus decrease the power to detect statistically significant differences between groups. Block randomization is a commonly used technique in clinical trial design to reduce bias and achieve balance in the allocation of participants to treatment arms, especially when the sample size is small. This method increases the probability that each arm will contain an equal number of individuals by sequencing participant assignments by block. Yet still, the allocation process may be predictable, for example, when the investigator is not blind and the block size is fixed. This paper provides an overview of blocked randomization and illustrates how to avoid selection bias by using random block sizes. PMID:21318011
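The scheme described above can be sketched in a few lines (illustrative names and defaults, not the paper's code): each block contains an equal number of slots per arm and is shuffled internally, while the block size itself is drawn at random so an unblinded investigator cannot predict where a block ends.

```python
import random

def blocked_randomization(n, block_sizes=(4, 6), arms=("A", "B"), seed=None):
    """Blocked randomization with randomly selected block sizes.

    Each block allocates an equal number of slots to every arm and is
    shuffled, keeping the running allocation balanced; drawing the block
    size at random removes the predictability of a fixed block size.
    Block sizes must be multiples of the number of arms."""
    rng = random.Random(seed)
    assignments = []
    while len(assignments) < n:
        size = rng.choice(block_sizes)
        block = list(arms) * (size // len(arms))
        rng.shuffle(block)
        assignments.extend(block)
    return assignments[:n]

allocation = blocked_randomization(100, seed=42)
```

Truncating the final block can leave an imbalance of at most half a block, which is the price of not knowing the block boundaries in advance.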

  11. A cluster-randomized, placebo-controlled, maternal vitamin a or beta-carotene supplementation trial in bangladesh: design and methods

    PubMed Central

    2011-01-01

    Background We present the design, methods and population characteristics of a large community trial that assessed the efficacy of a weekly supplement containing vitamin A or beta-carotene, at recommended dietary levels, in reducing maternal mortality from early gestation through 12 weeks postpartum. We identify challenges faced and report solutions in implementing an intervention trial under low-resource, rural conditions, including the importance of population choice in promoting generalizability, maintaining rigorous data quality control to reduce inter- and intra- worker variation, and optimizing efficiencies in information and resources flow from and to the field. Methods This trial was a double-masked, cluster-randomized, dual intervention, placebo-controlled trial in a contiguous rural area of ~435 sq km with a population of ~650,000 in Gaibandha and Rangpur Districts of Northwestern Bangladesh. Approximately 120,000 married women of reproductive age underwent 5-weekly home surveillance, of whom ~60,000 were detected as pregnant, enrolled into the trial and gave birth to ~44,000 live-born infants. Upon enrollment, at ~ 9 weeks' gestation, pregnant women received a weekly oral supplement containing vitamin A (7000 ug retinol equivalents (RE)), beta-carotene (42 mg, or ~7000 ug RE) or a placebo through 12 weeks postpartum, according to prior randomized allocation of their cluster of residence. Systems described include enlistment and 5-weekly home surveillance for pregnancy based on menstrual history and urine testing, weekly supervised supplementation, periodic risk factor interviews, maternal and infant vital outcome monitoring, birth defect surveillance and clinical/biochemical substudies. Results The primary outcome was pregnancy-related mortality assessed for 3 months following parturition. Secondary outcomes included fetal loss due to miscarriage or stillbirth, infant mortality under three months of age, maternal obstetric and infectious morbidity, infant

  12. 45 CFR 1159.15 - Who has the responsibility for maintaining adequate technical, physical, and security safeguards...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... adequate technical, physical, and security safeguards to prevent unauthorized disclosure or destruction of... adequate technical, physical, and security safeguards to prevent unauthorized disclosure or destruction of... of maintaining adequate technical, physical, and security safeguards to prevent...

  13. Is reimbursement for childhood immunizations adequate? Evidence from two rural areas in Colorado.

    PubMed Central

    Glazner, J. E.; Steiner, J. F.; Haas, K. J.; Renfrew, B.; Deutchman, M.; Berman, S.

    2001-01-01

    OBJECTIVE: To assess adequacy of reimbursement for childhood vaccinations in two rural regions in Colorado, the authors measured medical practice costs of providing childhood vaccinations and compared them with reimbursement. METHODS: A "time-motion" method was used to measure labor costs of providing vaccinations in 13 private and public practices. Practices reported non-labor costs. The authors determined reimbursement by record review. RESULTS: The average vaccine delivery cost per dose (excluding vaccine cost) ranged from $4.69 for community health centers to $5.60 for private practices. Average reimbursement exceeded average delivery costs for all vaccines and contributed to overhead in private practices. Average reimbursement was less than total cost (vaccine-delivery costs + overhead) in private practices for most vaccines in one region with significant managed care penetration. Reimbursement to public providers was less than the average vaccine delivery costs. CONCLUSIONS: Current reimbursement may not be adequate to induce private practices to provide childhood vaccinations, particularly in areas with substantial managed care penetration. PMID:12034911

  14. Effects of a computerized feedback intervention on safety performance by junior doctors: results from a randomized mixed method study

    PubMed Central

    2013-01-01

    Background The behaviour of doctors and their responses to warnings can inform the effective design of Clinical Decision Support Systems. We used data from a University hospital electronic prescribing and laboratory reporting system with hierarchical warnings and alerts to explore junior doctors’ behaviour. The objective of this trial was to establish whether a Junior Doctor Dashboard providing feedback on prescription warning information and laboratory alerting acceptance rates was effective in changing junior doctors’ behaviour. Methods A mixed methods approach was employed which included a parallel group randomised controlled trial, and individual and focus group interviews. Junior doctors below the specialty trainee level 3 grade were recruited and randomised to two groups. Every doctor (N = 42) in the intervention group was e-mailed a link to a personal dashboard every week for 4 months. Nineteen participated in interviews. The 44 control doctors did not receive any automated feedback. The outcome measures were the difference in responses to prescribing warnings (of two severities) and laboratory alerting (of two severities) between the months before and the months during the intervention, analysed as the difference in performance between the intervention and the control groups. Results No significant differences were observed in the rates of generating prescription warnings, or in the acceptance of laboratory alarms. However, responses to laboratory alerts differed between the pre-intervention and intervention periods. For the doctors of Foundation Year 1 grade, this improvement was significantly (p = 0.002) greater in the group with access to the dashboard (53.6% ignored pre-intervention compared to 29.2% post intervention) than in the control group (47.9% ignored pre-intervention compared to 47.0% post intervention). Qualitative interview data indicated that while junior doctors were positive about the electronic prescribing functions, they

  15. Mimicking the quasi-random assembly of protein fibers in the dermis by freeze-drying method.

    PubMed

    Ghaleh, Hakimeh; Abbasi, Farhang; Alizadeh, Mina; Khoshfetrat, Ali Baradar

    2015-04-01

    Freeze-drying is extensively used to fabricate porous materials for tissue engineering and biomedical applications because of its versatility and avoidance of toxic solvents. However, it has some significant drawbacks: the conventional freeze-drying technique produces heterogeneous porous structures with side-oriented columnar pores. Because the top and bottom surfaces of the sample are not in contact with similar environments, the different rates of heat transfer at the surfaces and the temperature gradient across the sample establish a preferential direction of heat transfer. To achieve a scaffold with a microstructure suitable for skin tissue engineering, the freeze-drying method was modified by controlling the cooling rate and regulating heat transfer across the sample during the freezing step. This created a homogeneous porous structure with more equiaxed, non-oriented pores. Freezing the polymeric solution in an aluminum mold enhanced pore interconnectivity relative to a polystyrene mold. The recrystallization process was discussed in terms of how it influences the mean pore size of the scaffold as the final freezing temperature is varied: a higher final freezing temperature more readily provides the energy required for recrystallization, which leads to enlarged ice crystals and, consequently, larger pores. PMID:25687012

  16. An Itô-based general approximation method for random vibration of hysteretic systems, part I: Gaussian analysis

    NASA Astrophysics Data System (ADS)

    Noori, M.; Davoodi, H.; Saffar, A.

    1988-12-01

    The cumulant-neglect closure scheme independently developed by Ibrahim and Lin is extended to determine the stationary and non-stationary response of non-linear systems with hysteretic restoring force characteristics. The method is applied to the analysis of a hysteresis model with strength and/or stiffness degradation capabilities. This model has been studied in the past by Baber and Wen for the analysis of hysteretically degrading systems using equivalent linearization. The same model has also been used for stochastic seismic performance evaluation of reinforced concrete buildings. Response statistics obtained for the model by using this closure scheme are compared with results of equivalent linearization and with Monte Carlo simulation. The study, performed for a wide range of degradation parameters and input power spectral density levels, shows that the Gaussian responses obtained by this approach are identical to the linearized results. This general approximation technique, however, can provide information on higher-order statistics for hysteretic systems; these non-Gaussian statistics have not been made available so far by the existing approximation techniques. In this paper the Gaussian statistics are presented.
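    The hysteresis model referred to above is the Baber-Wen degrading system. As an illustration of the kind of restoring-force model involved, the following is a minimal sketch of its non-degrading Bouc-Wen core, integrated with explicit Euler; all parameter values and names are illustrative assumptions, not taken from the paper.

```python
import math

def bouc_wen_step(x, v, z, f_ext, dt, *, m=1.0, c=0.1, k=1.0,
                  alpha=0.5, A=1.0, beta=0.5, gamma=0.5, n=1):
    """One explicit-Euler step of a single-degree-of-freedom oscillator with a
    (non-degrading) Bouc-Wen hysteretic restoring force.  Parameter values are
    illustrative only."""
    # restoring force: elastic share alpha*k*x plus hysteretic share (1-alpha)*k*z
    accel = (f_ext - c * v - alpha * k * x - (1.0 - alpha) * k * z) / m
    # hysteretic evolution: dz/dt = A*v - beta*|v|*|z|**(n-1)*z - gamma*v*|z|**n
    dz = A * v - beta * abs(v) * abs(z) ** (n - 1) * z - gamma * v * abs(z) ** n
    return x + v * dt, v + accel * dt, z + dz * dt

# Short driven simulation: with these parameters the hysteretic variable z
# saturates at (A/(beta+gamma))**(1/n) = 1 while the response stays bounded.
x, v, z = 0.0, 0.0, 0.0
dt = 0.01
for i in range(5000):
    x, v, z = bouc_wen_step(x, v, z, math.sin(i * dt), dt)
```

Equivalent linearization and Monte Carlo studies of such models integrate exactly this kind of state equation, either after replacing the nonlinear dz term with an equivalent linear one or by averaging over many random excitations.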

  17. Uniform random number generators

    NASA Technical Reports Server (NTRS)

    Farr, W. R.

    1971-01-01

    Methods are presented for the generation of random numbers with uniform and normal distributions. Subprogram listings of Fortran generators for the Univac 1108, SDS 930, and CDC 3200 digital computers are also included. The generators are of the mixed multiplicative type, and the mathematical method employed is that of Marsaglia and Bray.
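    The abstract does not give the machine-specific constants of the original Fortran generators. As a sketch of the multiplicative congruential form such generators share, here is one using the well-known Park-Miller "minimal standard" constants (an assumption; these are not the constants of Marsaglia and Bray):

```python
def make_lehmer(seed, a=16807, m=2**31 - 1):
    """Multiplicative congruential generator: x_{n+1} = (a * x_n) mod m.
    The constants a and m are the Park-Miller minimal standard, chosen only
    to illustrate the form; seed must be in 1..m-1."""
    state = seed

    def next_uniform():
        nonlocal state
        state = (a * state) % m
        return state / m  # uniform in (0, 1)

    return next_uniform

rng = make_lehmer(seed=42)
samples = [rng() for _ in range(10000)]
```

Normal variates can then be obtained from such a uniform stream by, e.g., the Box-Muller transform, which is one common way "mixed" generators of this era paired uniform and normal output.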

  18. Quantifying data retention of perpendicular spin-transfer-torque magnetic random access memory chips using an effective thermal stability factor method

    SciTech Connect

    Thomas, Luc; Jan, Guenole; Le, Son; Wang, Po-Kang

    2015-04-20

    The thermal stability of perpendicular Spin-Transfer-Torque Magnetic Random Access Memory (STT-MRAM) devices is investigated at chip level. Experimental data are analyzed in the framework of the Néel-Brown model including distributions of the thermal stability factor Δ. We show that in the low error rate regime important for applications, the effect of distributions of Δ can be described by a single quantity, the effective thermal stability factor Δ_eff, which encompasses both the median and the standard deviation of the distributions. Data retention of memory chips can be assessed accurately by measuring Δ_eff as a function of device diameter and temperature. We apply this method to show that 54 nm devices based on our perpendicular STT-MRAM design meet our 10 year data retention target up to 120 °C.
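    Under the Néel-Brown model cited in the abstract, a single effective thermal stability factor maps directly to a chip-level retention estimate. The following sketch shows that mapping; the 1 ns attempt time and the function names are assumptions for illustration, not values from the abstract.

```python
import math

TAU0 = 1e-9  # attempt time in seconds (~1 ns is a typical assumed value)

def bit_failure_prob(delta, t_seconds):
    """Néel-Brown probability that a single bit thermally switches within
    t_seconds, given thermal stability factor delta: 1 - exp(-t/tau), with
    switching rate 1/tau = exp(-delta)/TAU0."""
    rate = math.exp(-delta) / TAU0
    return -math.expm1(-rate * t_seconds)  # expm1 keeps accuracy for tiny rates

def expected_chip_failures(delta_eff, n_bits, years=10.0):
    """Expected number of retention failures on an n_bits chip over `years`,
    collapsing the distribution of delta into the single effective value
    delta_eff, as the abstract describes."""
    t = years * 365.25 * 24 * 3600
    return n_bits * bit_failure_prob(delta_eff, t)
```

The exponential dependence on delta_eff is the point of the method: a modest increase in the effective stability factor reduces the expected chip failure count by many orders of magnitude.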

  19. Telotristat etiprate, a novel serotonin synthesis inhibitor, in patients with carcinoid syndrome and diarrhea not adequately controlled by octreotide.

    PubMed

    Kulke, Matthew H; O'Dorisio, Thomas; Phan, Alexandria; Bergsland, Emily; Law, Linda; Banks, Phillip; Freiman, Joel; Frazier, Kenny; Jackson, Jessica; Yao, James C; Kvols, Larry; Lapuerta, Pablo; Zambrowicz, Brian; Fleming, Douglas; Sands, Arthur

    2014-10-01

    Serotonin produced by neuroendocrine tumors is believed to be a principal cause of the diarrhea in carcinoid syndrome. We assessed the safety and efficacy of telotristat etiprate, an oral serotonin synthesis inhibitor, in patients with diarrhea associated with carcinoid syndrome. In this prospective, randomized study, patients with evidence of carcinoid tumor and ≥4 bowel movements (BMs)/day despite stable-dose octreotide LAR depot therapy were enrolled in sequential, escalating, cohorts of four patients per cohort. In each cohort, one patient was randomly assigned to placebo and three patients to telotristat etiprate, at 150, 250, 350, or 500 mg three times a day (tid). In a subsequent cohort, one patient was assigned to placebo and six patients to telotristat etiprate 500 mg tid. Patients were assessed for safety, BM frequency (daily diary), 24 h urinary 5-hydroxyindoleacetic acid (u5-HIAA), and adequate relief of carcinoid gastrointestinal symptoms (using a weekly questionnaire). Twenty-three patients were treated: 18 received telotristat etiprate and five received placebo. Adverse events were generally mild. Among evaluable telotristat etiprate-treated patients, 5/18 (28%) experienced a ≥30% reduction in BM frequency for ≥2 weeks, 9/16 (56%) experienced biochemical response (≥50% reduction or normalization in 24-h u5-HIAA) at week 2 or 4, and 10/18 (56%) reported adequate relief during at least 1 of the first 4 weeks of treatment. Similar activity was not observed in placebo-treated patients. Telotristat etiprate was well tolerated. Our observations suggest that telotristat etiprate has activity in controlling diarrhea associated with carcinoid syndrome. Further studies confirming these findings are warranted. PMID:25012985

  20. Telotristat Etiprate, a Novel Serotonin Synthesis Inhibitor, in Patients with Carcinoid Syndrome and Diarrhea Not Adequately Controlled by Octreotide

    PubMed Central

    Kulke, Matthew H.; O’Dorisio, Thomas; Phan, Alexandria; Bergsland, Emily; Law, Linda; Banks, Phillip; Freiman, Joel; Frazier, Kenny; Jackson, Jessica; Yao, James C.; Kvols, Larry; Lapuerta, Pablo; Zambrowicz, Brian; Fleming, Douglas; Sands, Arthur

    2014-01-01

    Serotonin produced by neuroendocrine tumors is believed to be a principal cause of the diarrhea in carcinoid syndrome. We assessed the safety and efficacy of telotristat etiprate, an oral serotonin synthesis inhibitor, in patients with diarrhea associated with carcinoid syndrome. In this prospective, randomized study, patients with evidence of carcinoid tumor and ≥4 bowel movements (BMs)/day despite stable-dose octreotide LAR depot therapy were enrolled in sequential, escalating, cohorts of 4 patients/cohort. In each cohort, 1 patient was randomly assigned to placebo and 3 patients to telotristat etiprate, at 150, 250, 350, or 500 mg 3x/day (tid). In a subsequent cohort, 1 patient was assigned to placebo and 6 patients to telotristat etiprate 500 mg tid. Patients were assessed for safety, BM frequency (daily diary), 24-hour urinary 5-hydroxyindoleacetic acid (u5-HIAA), and adequate relief of carcinoid gastrointestinal symptoms (using a weekly questionnaire). Twenty-three patients were treated; 18 received telotristat etiprate and 5 received placebo. Adverse events were generally mild. Among evaluable telotristat etiprate-treated patients, 5/18 (28%) experienced a ≥30% reduction in BM frequency for ≥2 weeks, 9/16 (56%) experienced biochemical response (≥50% reduction or normalization in 24-hour u5-HIAA) at Week 2 or 4, and 10/18 (56%) reported adequate relief during at least 1 of the first 4 weeks of treatment. Similar activity was not observed in placebo-treated patients. Telotristat etiprate was well tolerated. Our observations suggest that telotristat etiprate has activity in controlling diarrhea associated with carcinoid syndrome. Further studies confirming these findings are warranted. PMID:25012985

  1. Inferential Processing among Adequate and Struggling Adolescent Comprehenders and Relations to Reading Comprehension

    PubMed Central

    Barth, Amy E.; Barnes, Marcia; Francis, David J.; Vaughn, Sharon; York, Mary

    2015-01-01

    Separate mixed model analyses of variance (ANOVA) were conducted to examine the effect of textual distance on the accuracy and speed of text consistency judgments among adequate and struggling comprehenders across grades 6–12 (n = 1203). Multiple regressions examined whether accuracy in text consistency judgments uniquely accounted for variance in comprehension. Results suggest that there is considerable growth across the middle and high school years, particularly for adequate comprehenders in those text integration processes that maintain local coherence. Accuracy in text consistency judgments accounted for significant unique variance for passage-level, but not sentence-level comprehension, particularly for adequate comprehenders. PMID:26166946

  2. Random Vibrations

    NASA Technical Reports Server (NTRS)

    Messaro, Semma; Harrison, Phillip

    2010-01-01

    Ares I zonal random vibration environments due to acoustic impingement and combustion processes are developed for liftoff, ascent, and reentry. Random vibration test criteria for Ares I Upper Stage pyrotechnic components are developed by enveloping the applicable zonal environments where each component is located. Random vibration tests will be conducted to assure that these components will survive and function appropriately after exposure to the expected vibration environments. Methodology: Random vibration test criteria for Ares I Upper Stage pyrotechnic components were desired that would envelope all the applicable environments where each component was located. Applicable Ares I vehicle drawings and design information needed to be assessed to determine the location(s) of each component on the Ares I Upper Stage. Design and test criteria needed to be developed by plotting and enveloping the applicable environments using Microsoft Excel and documenting them in a report using Microsoft Word. Conclusion: Random vibration liftoff, ascent, and green-run design and test criteria for the Upper Stage pyrotechnic components were developed by using Microsoft Excel to envelope the zonal environments applicable to each component. Results were transferred from Excel into a report using Microsoft Word. After the report is reviewed and edited by my mentor, it will be submitted for publication as an attachment to a memorandum. Pyrotechnic component designers will extract criteria from my report for incorporation into the design and test specifications for components. Eventually the hardware will be tested to the environments I developed to assure that the components will survive and function appropriately after exposure to the expected vibration environments.

  3. Knowledge and Informed Decision-Making about Population-Based Colorectal Cancer Screening Participation in Groups with Low and Adequate Health Literacy

    PubMed Central

    Essink-Bot, M. L.; Dekker, E.; Timmermans, D. R. M.; Uiters, E.; Fransen, M. P.

    2016-01-01

    Objective. To analyze and compare decision-relevant knowledge, decisional conflict, and informed decision-making about colorectal cancer (CRC) screening participation between potential screening participants with low and adequate health literacy (HL), defined as the skills to access, understand, and apply information to make informed decisions about health. Methods. Survey including 71 individuals with low HL and 70 with adequate HL, all eligible for the Dutch organized CRC screening program. Knowledge, attitude, intention to participate, and decisional conflict were assessed after reading the standard information materials. HL was assessed using the Short Assessment of Health Literacy in Dutch. Informed decision-making was analyzed by the multidimensional measure of informed choice. Results. 64% of the study population had adequate knowledge of CRC and CRC screening (low HL 43/71 (61%), adequate HL 47/70 (67%), p > 0.05). 57% were informed decision-makers (low HL 34/71 (55%), adequate HL 39/70 (58%), p > 0.05). Intention to participate was 89% (low HL 63/71 (89%), adequate HL 63/70 (90%)). Respondents with low HL experienced significantly more decisional conflict (25.8 versus 16.1; p = 0.00). Conclusion. Informed decision-making about CRC screening participation was suboptimal among both individuals with low HL and individuals with adequate HL. Further research is required to develop and implement effective strategies to convey decision-relevant knowledge about CRC screening to all screening invitees. PMID:27200089

  4. Knowledge and Informed Decision-Making about Population-Based Colorectal Cancer Screening Participation in Groups with Low and Adequate Health Literacy.

    PubMed

    Essink-Bot, M L; Dekker, E; Timmermans, D R M; Uiters, E; Fransen, M P

    2016-01-01

    Objective. To analyze and compare decision-relevant knowledge, decisional conflict, and informed decision-making about colorectal cancer (CRC) screening participation between potential screening participants with low and adequate health literacy (HL), defined as the skills to access, understand, and apply information to make informed decisions about health. Methods. Survey including 71 individuals with low HL and 70 with adequate HL, all eligible for the Dutch organized CRC screening program. Knowledge, attitude, intention to participate, and decisional conflict were assessed after reading the standard information materials. HL was assessed using the Short Assessment of Health Literacy in Dutch. Informed decision-making was analyzed by the multidimensional measure of informed choice. Results. 64% of the study population had adequate knowledge of CRC and CRC screening (low HL 43/71 (61%), adequate HL 47/70 (67%), p > 0.05). 57% were informed decision-makers (low HL 34/71 (55%), adequate HL 39/70 (58%), p > 0.05). Intention to participate was 89% (low HL 63/71 (89%), adequate HL 63/70 (90%)). Respondents with low HL experienced significantly more decisional conflict (25.8 versus 16.1; p = 0.00). Conclusion. Informed decision-making about CRC screening participation was suboptimal among both individuals with low HL and individuals with adequate HL. Further research is required to develop and implement effective strategies to convey decision-relevant knowledge about CRC screening to all screening invitees. PMID:27200089
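    The multidimensional measure of informed choice used in the two records above classifies a decision as "informed" when knowledge is adequate and the screening intention is consistent with the attitude. A minimal sketch of that classification, assuming the thresholding of knowledge and attitude scores has already been applied upstream:

```python
def informed_choice(adequate_knowledge: bool, positive_attitude: bool,
                    intends_to_participate: bool) -> bool:
    """Multidimensional measure of informed choice: a decision is informed
    when knowledge is adequate AND the intention matches the attitude
    (positive attitude -> participate, negative attitude -> decline).
    The boolean inputs are assumed to come from upstream score thresholds."""
    consistent = (positive_attitude == intends_to_participate)
    return adequate_knowledge and consistent
```

For example, a respondent with adequate knowledge, a negative attitude, and no intention to participate still counts as an informed decision-maker, since the choice is value-consistent.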

  5. Site Characterization in the Urban Area of Tijuana, B. C., Mexico by Means of: H/V Spectral Ratios, Spectral Analysis of Surface Waves, and Random Decrement Method

    NASA Astrophysics Data System (ADS)

    Tapia-Herrera, R.; Huerta-Lopez, C. I.; Martinez-Cruzado, J. A.

    2009-05-01

    Results of site characterization for an experimental site in the metropolitan area of Tijuana, B. C., Mexico are presented as part of ongoing research in which time series of earthquakes, ambient noise, and induced vibrations were processed with three different methods: H/V spectral ratios, Spectral Analysis of Surface Waves (SASW), and the Random Decrement Method (RDM). Forward modeling using the wave propagation stiffness matrix method (Roësset and Kausel, 1981) was used to compute the theoretical SH/P and SV/P spectral ratios, and the experimental H/V spectral ratios were computed following the conventional concepts of Fourier analysis. The theoretical and experimental H/V spectral ratios were then compared. For the SASW method the theoretical dispersion curves were also computed and compared with the experimental ones, and finally the theoretical free-vibration decay curve was compared with the experimental one obtained with the RDM. All three methods were tested with ambient noise, induced vibrations, and earthquake signals. The experimental spectral ratios obtained with both ambient noise and earthquake signals agree quite well with the theoretical spectral ratios, particularly at the fundamental vibration frequency of the recording site. Differences between the fundamental vibration frequencies are evident for sites located on alluvial fill (~0.6 Hz) and sites located on conglomerate/sandstone fill (0.75 Hz). Shear wave velocities for the soft soil layers of the 4-layer discrete soil model range from as low as 100 m/s up to 280 m/s. The results with the SASW provided information that allows identification of low-velocity layers not seen before with the traditional seismic methods. The damping estimates obtained with the RDM are within the expected values, and the dominant frequency of the system, also obtained with the RDM, correlates within ±20% with the one obtained by means of the H/V spectral
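    The conventional H/V computation referred to in the abstract divides the averaged horizontal Fourier amplitude spectrum by the vertical one and reads the site's fundamental frequency off the peak of the resulting curve. A minimal sketch of that idea (naive DFT, synthetic signals, no smoothing; real processing smooths the spectra, e.g. with a Konno-Ohmachi window, and the small `eps` floor is an assumption to avoid division by zero):

```python
import cmath
import math

def amplitude_spectrum(x):
    """Naive DFT amplitude spectrum (O(n^2); fine for short illustrative signals)."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n)))
            for k in range(n // 2)]

def hv_ratio(north, east, vertical, eps=1e-6):
    """Conventional H/V spectral ratio: quadratic mean of the two horizontal
    amplitude spectra divided by the vertical amplitude spectrum."""
    sn, se, sv = map(amplitude_spectrum, (north, east, vertical))
    return [math.sqrt((a * a + b * b) / 2) / max(v, eps)
            for a, b, v in zip(sn, se, sv)]

# Synthetic check: the horizontals resonate at DFT bin 8 while the vertical is
# dominated by bin 20, so the H/V curve should peak at bin 8.
n = 128
north = [math.sin(2 * math.pi * 8 * t / n) for t in range(n)]
east = list(north)
vertical = [0.1 * math.sin(2 * math.pi * 8 * t / n)
            + math.sin(2 * math.pi * 20 * t / n) for t in range(n)]
hv = hv_ratio(north, east, vertical)
```

On field data the same peak-picking step, applied to smoothed spectra of many noise or earthquake windows, yields the ~0.6 Hz and 0.75 Hz fundamental frequencies the abstract reports.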

  6. Prioritising pharmaceuticals for environmental risk assessment: Towards adequate and feasible first-tier selection.

    PubMed

    Roos, V; Gunnarsson, L; Fick, J; Larsson, D G J; Rudén, C

    2012-04-01

    The presence of pharmaceuticals in the aquatic environment, and concern about negative effects on aquatic organisms, has gained increasing attention in recent years. As ecotoxicity data are lacking for most active pharmaceutical ingredients (APIs), it is important to identify strategies to prioritise APIs for ecotoxicity testing and environmental monitoring. We have used nine previously proposed prioritisation schemes, both risk- and hazard-based, to rank 582 APIs. The similarities and differences in overall ranking results and input data were compared. Moreover, we analysed how well the methods ranked seven relatively well-studied APIs. It is concluded that the hazard-based methods were more successful in correctly ranking the well-studied APIs, but the fish plasma model, which includes human pharmacological data, also showed a high success rate. The results of the analyses show that input data availability varies significantly; some data, such as logP, are available for most APIs, while information about environmental concentrations and bioconcentration is still scarce. The results also suggest that the exposure estimates in risk-based methods need to be improved and that the inclusion of effect measures at first-tier prioritisation might underestimate risks. It is proposed that in order to develop an adequate prioritisation scheme, improved data on exposure, such as degradation and sewage treatment removal, and on bioconcentration ability should be further considered. The use of ATC codes may also be useful for the development of a prioritisation scheme that includes the mode of action of pharmaceuticals and, to some extent, mixture effects. PMID:22361586

  7. Investigating bang for your training buck: a randomized controlled trial comparing three methods of training clinicians in two core strategies of dialectical behavior therapy.

    PubMed

    Dimeff, Linda A; Harned, Melanie S; Woodcock, Eric A; Skutch, Julie M; Koerner, Kelly; Linehan, Marsha M

    2015-05-01

    The present study examined the efficacy of online training (OLT), instructor-led training (ILT), and a treatment manual (TM) in training mental health clinicians in two core strategies of Dialectical Behavior Therapy (DBT): chain analysis and validation. A randomized controlled trial compared OLT, ILT, and TM among clinicians naïve to DBT (N=172) who were assessed at baseline, post-training, and 30, 60, and 90 days following training. Primary outcomes included satisfaction, self-efficacy, motivation, knowledge, clinical proficiency, and clinical use. Overall, ILT outperformed OLT and TM in satisfaction, self-efficacy, and motivation, whereas OLT was the most effective method for increasing knowledge. The conditions did not differ in observer-rated clinical proficiency or self-reported clinical use, which both increased to moderate levels after training. In addition, ILT was particularly effective at improving motivation to use chain analysis, whereas OLT was particularly effective at increasing knowledge of validation strategies. These findings suggest that these types of brief, didactic trainings may be effective methods of increasing knowledge of new treatment strategies, but may not be sufficient to enable clinicians to achieve a high level of clinical use or proficiency. Additional research examining the possible advantages of matching training methods to types of treatment strategies may help to determine a tailored, more effective approach to training clinicians in empirically supported treatments. PMID:25892165

  8. Evaluation of catheter-manometer systems for adequate intravascular blood pressure measurements in small animals.

    PubMed

    Idvall, J; Aronsen, K F; Lindström, K; Ulmsten, U

    1977-09-30

    Various catheter-manometer systems suitable for intravascular blood pressure measurements in rats have been elaborated and tested in vitro and in vivo. Using a pressure-step calibrator, it was observed from in vitro studies that microtransducers had superior frequency response compared to conventional transducers. Of the catheters tested, Pe-90 tapered to a 40 mm tip with an inner diameter of 0.3 mm had the best frequency response as judged from fall and settling times. Because of the damping effect, tapering increased fall time to 1.8 ms, which was still quite acceptable. By the same token, settling time was minimized to 22.4 ms. With a special calculation method the theoretical percentile fault of the recordings was estimated to be 9.66%. When the measurement error was calculated from the actual in vivo recordings, it was found to be no more than 2.7%. These results show that the technique described is adequate for continuous intravascular blood pressure recordings in small animals. Finally, it is emphasized that careful handling of the catheters and avoidance of stopcocks and air bubbles are essential for obtaining accurate and reproducible values. PMID:928971

  9. A Randomized, Single-Blind, Placebo-Controlled Study on the Efficacy of the Arthrokinematic Approach-Hakata Method in Patients with Chronic Nonspecific Low Back Pain

    PubMed Central

    Kogure, Akira; Kotani, Kazuhiko; Katada, Shigehiko; Takagi, Hiroshi; Kamikozuru, Masahiro; Isaji, Takashi; Hakata, Setsuo

    2015-01-01

    Study Design Randomized, single-blind, controlled trial. Objective To investigate the efficacy of the Arthrokinematic approach (AKA)-Hakata (H) method for chronic low back pain. Summary of Background Data The AKA-H method is used to manually treat abnormalities of intra-articular movement. Methods One hundred eighty-six patients with chronic nonspecific low back pain randomly received either the AKA-H method (AKA-H group) or the sham technique (S group) monthly for 6 months. Data were collected at baseline and once a month. Outcome measures were pain intensity (visual analogue scale [VAS]) and quality of life (the Roland-Morris Disability Questionnaire [RDQ] and Short Form SF-36 questionnaire [SF-36]). Results At baseline, the VAS, RDQ, and SF-36 scores showed similar levels between the groups. After 6 months, the AKA-H group had more improvement in the VAS (42.8% improvement) and RDQ score (31.1% improvement) than the sham group (VAS: 10.4% improvement; RDQ: 9.8% improvement; both, P < 0.001). The respective scores for the SF-36 subscales (physical functioning, role physical, bodily pain, social functioning, general health perception, role emotional, and mental health) were also significantly more improved in the AKA-H group than in the sham group (all, P < 0.001). The scores for the physical, psychological, and social aspects of the SF-36 subscales showed similar improvement in the AKA-H group. Conclusion The AKA-H method can be effective in managing chronic low back pain. Trial Registration UMIN Clinical Trials Registry (UMIN-CTR) UMIN000006250. PMID:26646534

  10. Randomized controlled trial to evaluate the effects of combined progressive exercise on metabolic syndrome in breast cancer survivors: rationale, design, and methods

    PubMed Central

    2014-01-01

    Background Metabolic syndrome (MetS) is increasingly present in breast cancer survivors, possibly worsened by cancer-related treatments, such as chemotherapy. MetS greatly increases risk of cardiovascular disease and diabetes, co-morbidities that could impair the survivorship experience, and possibly lead to cancer recurrence. Exercise has been shown to positively influence quality of life (QOL), physical function, muscular strength and endurance, reduce fatigue, and improve emotional well-being; however, the impact on MetS components (visceral adiposity, hyperglycemia, low serum high-density lipoprotein cholesterol, hypertriglyceridemia, and hypertension) remains largely unknown. In this trial, we aim to assess the effects of combined (aerobic and resistance) exercise on components of MetS, as well as on physical fitness and QOL, in breast cancer survivors soon after completing cancer-related treatments. Methods/Design This study is a prospective randomized controlled trial (RCT) investigating the effects of a 16-week supervised progressive aerobic and resistance exercise training intervention on MetS in 100 breast cancer survivors. Main inclusion criteria are histologically-confirmed breast cancer stage I-III, completion of chemotherapy and/or radiation within 6 months prior to initiation of the study, sedentary, and free from musculoskeletal disorders. The primary endpoint is MetS; secondary endpoints include: muscle strength, shoulder function, cardiorespiratory fitness, body composition, bone mineral density, and QOL. Participants randomized to the Exercise group participate in 3 supervised weekly exercise sessions for 16 weeks. Participants randomized to the Control group are offered the same intervention after the 16-week period of observation. Discussion This is one of the few RCTs examining the effects of exercise on MetS in breast cancer survivors. Results will contribute a better understanding of metabolic disease-related effects of resistance and

  11. Estimating efficacy in a randomized trial with product nonadherence: application of multiple methods to a trial of preexposure prophylaxis for HIV prevention.

    PubMed

    Murnane, Pamela M; Brown, Elizabeth R; Donnell, Deborah; Coley, R Yates; Mugo, Nelly; Mujugira, Andrew; Celum, Connie; Baeten, Jared M

    2015-11-15

    Antiretroviral preexposure prophylaxis (PrEP) for persons at high risk of human immunodeficiency virus infection is a promising new prevention strategy. Six randomized trials of oral PrEP were recently conducted and demonstrated efficacy estimates ranging from 75% to no effect, with nonadherence likely resulting in attenuated estimates of the protective effect of PrEP. In 1 of these trials, the Partners PrEP Study (Kenya and Uganda, 2008-2011), participants (4,747 serodiscordant heterosexual couples) were randomized to receipt of tenofovir (TDF), coformulated TDF/emtricitabine (FTC), or placebo. Intention-to-treat analyses found efficacy estimates of 67% for TDF and 75% for TDF/FTC. We applied multiple methods to data from that trial to estimate the efficacy of PrEP with high adherence, including principal stratification and inverse-probability-of-censoring (IPC) weights. Results were further from the null when correcting for nonadherence: 1) among the strata with an estimated 100% probability of high adherence (TDF hazard ratio (HR) = 0.19, 95% confidence interval (CI): 0.07, 0.56; TDF/FTC HR = 0.12, 95% CI: 0.03, 0.52); 2) with IPC weights used to approximate a continuously adherent population (TDF HR = 0.18, 95% CI: 0.06, 0.53; TDF/FTC HR = 0.15, 95% CI: 0.04, 0.52); and 3) in per-protocol analysis (TDF HR = 0.18, 95% CI: 0.06, 0.53; TDF/FTC HR = 0.16, 95% CI: 0.05, 0.53). Our results suggest that the efficacy of PrEP with high adherence is over 80%. PMID:26487343
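    The inverse-probability-of-censoring idea used in the trial reweights each adherent participant by 1/P(adherent | covariates), so the weighted sample approximates a continuously adherent population. A toy sketch with hypothetical adherence probabilities follows; the actual analysis fit IPC-weighted Cox models, not the crude weighted event rate shown here.

```python
def ipc_weights(adherence_probs):
    """Inverse-probability-of-censoring weights: participants who remain in
    the analysis (here, those who stay adherent) are up-weighted by the
    reciprocal of their estimated probability of doing so."""
    return [1.0 / p for p in adherence_probs]

def weighted_event_rate(events, person_time, weights):
    """Weighted incidence rate: weighted events per weighted person-time."""
    num = sum(e * w for e, w in zip(events, weights))
    den = sum(t * w for t, w in zip(person_time, weights))
    return num / den

# Hypothetical example: three adherent participants, one of whom had only a
# 50% estimated probability of remaining adherent and so counts double.
events = [1, 0, 0]            # 1 = HIV seroconversion during follow-up
person_time = [1.0, 1.0, 1.0] # person-years at risk
weights = ipc_weights([0.5, 1.0, 1.0])
rate = weighted_event_rate(events, person_time, weights)
```

Computing this weighted rate in each treatment arm and taking the ratio gives the IPC-weighted analogue of the rate ratio; the hazard-ratio versions in the abstract come from the same weights applied inside a Cox model.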

  12. Testing the effects of brief intervention in primary care for problem drug use in a randomized controlled trial: rationale, design, and methods

    PubMed Central

    2012-01-01

    Background A substantial body of research has established the effectiveness of brief interventions for problem alcohol use. Following these studies, national dissemination projects of screening, brief intervention (BI), and referral to treatment (SBIRT) for alcohol and drugs have been implemented on a widespread scale in multiple states despite little existing evidence for the impact of BI on drug use for non-treatment seekers. This article describes the design of a study testing the impact of SBIRT on individuals with drug problems, its contributions to the existing literature, and its potential to inform drug policy. Methods/design The study is a randomized controlled trial of an SBIRT intervention carried out in a primary care setting within a safety net system of care. Approximately 1,000 individuals presenting for scheduled medical care at one of seven designated primary care clinics who endorse problematic drug use when screened are randomized in a 1:1 ratio to BI versus enhanced care as usual (ECAU). Individuals in both groups are reassessed at 3, 6, 9, and 12 months after baseline. Self-reported drug use and other psychosocial measures collected at each data point are supplemented by urine analysis and public health-related data from administrative databases. Discussion This study will contribute to the existing literature by providing evidence for the impact of BI on problem drug use based on a broad range of measures including self-reported drug use, urine analysis, admission to drug abuse treatment, and changes in utilization and costs of health care services, arrests, and death with the intent of informing policy and program planning for problem drug use at the local, state, and national levels. Trial registration ClinicalTrials.gov NCT00877331 PMID:23237456

  13. Digital servo control of random sound fields

    NASA Technical Reports Server (NTRS)

    Nakich, R. B.

    1973-01-01

    It is necessary to place a number of sensors at different positions in the sound field to determine the actual sound intensities to which the test object is subjected. It is then possible to determine whether the specification is being met adequately or exceeded. Since the excitation is of a random nature, the signals are essentially coherent and it is impossible to obtain a true average.

  14. Treatment of reducible unstable fractures of the distal radius: randomized clinical study comparing the locked volar plate and external fixator methods: study protocol

    PubMed Central

    2014-01-01

    Background Various treatments are available for reducible unstable fractures of the distal radius, such as closed reduction combined with fixation by external fixator (EF), and rigid internal fixation using a locked volar plate (VP). Although there are studies comparing these methods, there is no conclusive evidence indicating which treatment is best. The hypothesis of this study is that surgical treatment with a VP is more effective than EF from the standpoint of functional outcome (patient-reported). Methods/Design The study is a randomized clinical trial with parallel groups and a blinded evaluator, involving the surgical interventions EF and VP. Patients will be randomly assigned (assignment ratio 1:1) using sealed opaque envelopes. This trial will include consecutive adult patients with an acute (up to 15 days) displaced, unstable fracture of the distal end of the radius of type A2, A3, C1, C2 or C3 by the Arbeitsgemeinschaft für Osteosynthesefragen–Association for the Study of Internal Fixation classification and type II or type III by the IDEAL classification, without previous surgical treatment of the wrist. The surgical intervention assigned will be performed by three surgical specialists familiar with the techniques described. Evaluations will be performed at 2 and 8 weeks and at 3, 6 and 12 months, with the primary outcomes being measured by the Disabilities of the Arm, Shoulder and Hand (DASH) questionnaire and measurement of pain (Visual Analog Pain Scale and digital algometer). Secondary outcomes will include radiographic parameters, objective functional evaluation (goniometry and dynamometry), and the rate of complications and method failure according to the intention-to-treat principle. Final postoperative evaluations (6 and 12 months) will be performed by independent blinded evaluators. For the Student’s t-test, a difference of 10 points in the DASH score, with a 95% confidence interval, a statistical power of 80%, and 20% sampling error
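    The sample-size statement at the end of the abstract is cut off, but the standard normal-approximation formula behind a two-group comparison of DASH scores can be sketched as follows. The standard deviation used here (24 points) is a hypothetical placeholder, since the abstract does not report the assumed variability:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(delta, sd, alpha=0.05, power=0.80):
    """Two-sample comparison of means, normal approximation:

        n >= 2 * sd^2 * (z_{1-alpha/2} + z_{1-beta})^2 / delta^2
    """
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)   # two-sided significance
    z_b = z.inv_cdf(power)           # desired power
    return ceil(2 * (sd / delta) ** 2 * (z_a + z_b) ** 2)

# Detecting a 10-point DASH difference with 80% power at alpha = 5%,
# under a hypothetical SD of 24 points (not stated in the abstract):
print(n_per_group(delta=10, sd=24))  # 91 per arm under these assumptions
```

    Inflating such a figure for the anticipated 20% loss to follow-up is the usual final step in a protocol's sample-size paragraph.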

  15. Efficient numerical methods for the random-field Ising model: Finite-size scaling, reweighting extrapolation, and computation of response functions.

    PubMed

    Fytas, Nikolaos G; Martín-Mayor, Víctor

    2016-06-01

    It was recently shown [Phys. Rev. Lett. 110, 227201 (2013)PRLTAO0031-900710.1103/PhysRevLett.110.227201] that the critical behavior of the random-field Ising model in three dimensions is ruled by a single universality class. This conclusion was reached only after a proper taming of the large scaling corrections of the model by applying a combined approach of various techniques, coming from the zero- and positive-temperature toolboxes of statistical physics. In the present contribution we provide a detailed description of this combined scheme, explaining in detail the zero-temperature numerical scheme and developing the generalized fluctuation-dissipation formula that allowed us to compute connected and disconnected correlation functions of the model. We discuss the error evolution of our method and we illustrate the infinite limit-size extrapolation of several observables within phenomenological renormalization. We present an extension of the quotients method that allows us to obtain estimates of the critical exponent α of the specific heat of the model via the scaling of the bond energy and we discuss the self-averaging properties of the system and the algorithmic aspects of the maximum-flow algorithm used. PMID:27415388

  16. Efficient numerical methods for the random-field Ising model: Finite-size scaling, reweighting extrapolation, and computation of response functions

    NASA Astrophysics Data System (ADS)

    Fytas, Nikolaos G.; Martín-Mayor, Víctor

    2016-06-01

    It was recently shown [Phys. Rev. Lett. 110, 227201 (2013), 10.1103/PhysRevLett.110.227201] that the critical behavior of the random-field Ising model in three dimensions is ruled by a single universality class. This conclusion was reached only after a proper taming of the large scaling corrections of the model by applying a combined approach of various techniques, coming from the zero- and positive-temperature toolboxes of statistical physics. In the present contribution we provide a detailed description of this combined scheme, explaining in detail the zero-temperature numerical scheme and developing the generalized fluctuation-dissipation formula that allowed us to compute connected and disconnected correlation functions of the model. We discuss the error evolution of our method and we illustrate the infinite limit-size extrapolation of several observables within phenomenological renormalization. We present an extension of the quotients method that allows us to obtain estimates of the critical exponent α of the specific heat of the model via the scaling of the bond energy and we discuss the self-averaging properties of the system and the algorithmic aspects of the maximum-flow algorithm used.

  17. Generation of pseudo-random numbers

    NASA Technical Reports Server (NTRS)

    Howell, L. W.; Rheinfurth, M. H.

    1982-01-01

    Practical methods for generating acceptable random numbers from a variety of probability distributions which are frequently encountered in engineering applications are described. The speed, accuracy, and guarantee of statistical randomness of the various methods are discussed.
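    One of the practical methods alluded to above is inverse-transform sampling: draw a uniform variate and push it through the inverse CDF of the target distribution. A minimal sketch for two distributions with closed-form inverse CDFs (the report's own algorithms are not reproduced here):

```python
import math
import random

def sample_exponential(rate, rng=random):
    """Inverse-transform sampling: if U ~ Uniform(0,1), then
    X = -ln(1-U)/rate is Exponential(rate), because the CDF
    F(x) = 1 - exp(-rate*x) is invertible in closed form."""
    u = rng.random()
    return -math.log(1.0 - u) / rate


def sample_weibull(shape, scale, rng=random):
    """Same idea for a Weibull(shape, scale):
    F^{-1}(u) = scale * (-ln(1-u))**(1/shape)."""
    u = rng.random()
    return scale * (-math.log(1.0 - u)) ** (1.0 / shape)
```

    Distributions without a tractable inverse CDF (e.g. the normal) are usually handled by other methods such as rejection sampling or Box-Muller, which is why speed and accuracy trade-offs between methods matter.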

  18. Impact of Denture Cleaning Method and Overnight Storage Condition on Denture Biofilm Mass and Composition: A Cross-Over Randomized Clinical Trial

    PubMed Central

    Duyck, Joke; Vandamme, Katleen; Krausch-Hofmann, Stefanie; Boon, Lies; De Keersmaecker, Katrien; Jalon, Eline; Teughels, Wim

    2016-01-01

    Background Appropriate oral hygiene is required to maintain oral health in denture wearers. This study aims to compare the role of denture cleaning methods in combination with overnight storage conditions on biofilm mass and composition on acrylic removable dentures. Methods In a cross-over randomized controlled trial in 13 older people, 4 conditions combining 2 mechanical cleaning methods and 2 overnight storage conditions were considered: (i) brushing and immersion in water without a cleansing tablet, (ii) brushing and immersion in water with a cleansing tablet, (iii) ultrasonic cleaning and immersion in water without a cleansing tablet, and (iv) ultrasonic cleaning and immersion in water with a cleansing tablet. Each test condition was performed for 5 consecutive days, preceded by a 2-day wash-out period. Biofilm samples were taken at baseline (control) and at the end of each test period from a standardized region. Total and individual levels of selected oral bacteria (n = 20) and of Candida albicans were identified using the polymerase chain reaction (PCR) technique. Denture biofilm coverage was scored using an analogue denture plaque score. Paired t-tests and Wilcoxon signed-rank tests were used to compare the test conditions. The level of significance was set at α < 5%. Results Overnight denture storage in water with a cleansing tablet significantly reduced the total bacterial count (p<0.01). The difference in total bacterial level between the two mechanical cleaning methods was not statistically significant. No significant effect was observed on the amount of Candida albicans or on the analogue plaque scores. Conclusions The use of cleansing tablets during overnight denture storage in addition to mechanical denture cleaning did not affect the Candida albicans count, but reduced the total bacterial count on acrylic removable dentures compared to overnight storage in water. This effect was more pronounced when combined with ultrasonic cleaning compared to

  19. Development of a new method for detection and identification of Oenococcus oeni bacteriophages based on endolysin gene sequence and randomly amplified polymorphic DNA.

    PubMed

    Doria, Francesca; Napoli, Chiara; Costantini, Antonella; Berta, Graziella; Saiz, Juan-Carlos; Garcia-Moruno, Emilia

    2013-08-01

    Malolactic fermentation (MLF) is a biochemical transformation conducted by lactic acid bacteria (LAB) that occurs in wine at the end of alcoholic fermentation. Oenococcus oeni is the main species responsible for MLF in most wines. As in other fermented foods, where bacteriophages represent a potential risk for the fermentative process, O. oeni bacteriophages have been reported to be a possible cause of unsuccessful MLF in wine. Thus, preparation of commercial starters that take into account the different sensitivities of O. oeni strains to different phages would be advisable. However, currently, no methods have been described to identify phages infecting O. oeni. In this study, two factors are addressed: detection and typing of bacteriophages. First, a simple PCR method was devised targeting a conserved region of the endolysin (lys) gene to detect temperate O. oeni bacteriophages. For this purpose, 37 O. oeni strains isolated from Italian wines during different phases of the vinification process were analyzed by PCR for the presence of the lys gene, and 25 strains gave a band of the expected size (1,160 bp). This is the first method to be developed that allows identification of lysogenic O. oeni strains without the need for time-consuming phage bacterial-lysis induction methods. Moreover, a phylogenetic analysis was conducted to type bacteriophages. After the treatment of bacteria with UV light, lysis was obtained for 15 strains, and the 15 phage DNAs isolated were subjected to two randomly amplified polymorphic DNA (RAPD)-PCRs. By combining the RAPD profiles and lys sequences, 12 different O. oeni phages were clearly distinguished. PMID:23728816

  20. Development of a New Method for Detection and Identification of Oenococcus oeni Bacteriophages Based on Endolysin Gene Sequence and Randomly Amplified Polymorphic DNA

    PubMed Central

    Doria, Francesca; Napoli, Chiara; Costantini, Antonella; Berta, Graziella; Saiz, Juan-Carlos

    2013-01-01

    Malolactic fermentation (MLF) is a biochemical transformation conducted by lactic acid bacteria (LAB) that occurs in wine at the end of alcoholic fermentation. Oenococcus oeni is the main species responsible for MLF in most wines. As in other fermented foods, where bacteriophages represent a potential risk for the fermentative process, O. oeni bacteriophages have been reported to be a possible cause of unsuccessful MLF in wine. Thus, preparation of commercial starters that take into account the different sensitivities of O. oeni strains to different phages would be advisable. However, currently, no methods have been described to identify phages infecting O. oeni. In this study, two factors are addressed: detection and typing of bacteriophages. First, a simple PCR method was devised targeting a conserved region of the endolysin (lys) gene to detect temperate O. oeni bacteriophages. For this purpose, 37 O. oeni strains isolated from Italian wines during different phases of the vinification process were analyzed by PCR for the presence of the lys gene, and 25 strains gave a band of the expected size (1,160 bp). This is the first method to be developed that allows identification of lysogenic O. oeni strains without the need for time-consuming phage bacterial-lysis induction methods. Moreover, a phylogenetic analysis was conducted to type bacteriophages. After the treatment of bacteria with UV light, lysis was obtained for 15 strains, and the 15 phage DNAs isolated were subjected to two randomly amplified polymorphic DNA (RAPD)-PCRs. By combining the RAPD profiles and lys sequences, 12 different O. oeni phages were clearly distinguished. PMID:23728816

  1. A Mobile Telehealth Intervention for Adults With Insulin-Requiring Diabetes: Early Results of a Mixed-Methods Randomized Controlled Trial

    PubMed Central

    Baron, Justine; Hirani, Shashivadan

    2015-01-01

    Background The role of technology in health care delivery has grown rapidly in the last decade. The potential of mobile telehealth (MTH) to support patient self-management is a key area of research. Providing patients with technological tools that allow for the recording and transmission of health parameters to health care professionals (HCPs) may promote behavior changes that result in improved health outcomes. Although for some conditions the evidence of the effectiveness of MTH is clear, to date the findings on the effects of MTH on diabetes management remain inconsistent. Objective This study aims to evaluate an MTH intervention among insulin-requiring adults with diabetes to establish whether supplementing standard care with MTH results in improved health outcomes—glycated hemoglobin (HbA1c), blood pressure (BP), health-related quality of life (HRQoL), diabetes self-management behaviors, diabetes health care utilization, and diabetes self-efficacy and illness beliefs. An additional objective was to explore the acceptability of MTH and patients’ perceptions of, and experiences of, using it. Methods A mixed-methods design consisting of a 9-month, two-arm, parallel randomized controlled trial (RCT) was used in combination with exit qualitative interviews. Quantitative data were collected at baseline, 3 months, and 9 months. Additional intervention fidelity data, such as participants’ MTH transmissions and contacts with the MTH nurse during the study, were also recorded. Results Data collection for both the quantitative and qualitative components of this study has ended and data analysis is ongoing. A total of 86 participants were enrolled into the study. Of the 86 participants, 45 (52%) were randomized to the intervention group and 36 (42%) to the control group. Preliminary data on MTH training sessions and MTH usage by intervention participants are presented in this paper. We expect to publish complete study results in 2015. Conclusions The range of data

  2. Does the effect of weight lifting on lymphedema following breast cancer differ by diagnostic method: results from a randomized controlled trial.

    PubMed

    Hayes, Sandra C; Speck, Rebecca M; Reimet, Elizabeth; Stark, Azadeh; Schmitz, Kathryn H

    2011-11-01

    The lymphedema diagnostic method used in descriptive or intervention studies may influence the results found. The purposes of this work were to compare baseline lymphedema prevalence in the physical activity and lymphedema (PAL) trial cohort and to subsequently compare the effect of the weight-lifting intervention on lymphedema, according to four standard diagnostic methods. The PAL trial was a randomized controlled intervention study, involving 295 women who had previously been treated for breast cancer, and evaluated the effect of 12 months of weight lifting on lymphedema status. Four diagnostic methods were used to evaluate lymphedema outcomes: (i) interlimb volume difference through water displacement, (ii) interlimb size difference through sum of arm circumferences, (iii) interlimb impedance ratio using bioimpedance spectroscopy, and (iv) a validated self-report survey. Of the 295 women who participated in the PAL trial, between 22 and 52% were considered to have lymphedema at baseline according to the four diagnostic criteria used. No between-group differences were noted in the proportion of women who had a change in interlimb volume, interlimb size, interlimb ratio, or survey score of ≥5%, ≥5%, ≥10%, and 1 unit, respectively (cumulative incidence ratio at study end for each measure ranged between 0.6 and 0.8, with confidence intervals spanning 1.0). The variation in proportions of women within the PAL trial considered to have lymphedema at baseline highlights the potential impact of the diagnostic criteria on population surveillance regarding prevalence of this common morbidity of treatment. Importantly though, progressive weight lifting was shown to be safe for women following breast cancer, even for those at risk of or with lymphedema, irrespective of the diagnostic criteria used. PMID:21562712
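    The four change criteria quoted above can be applied side by side to show how the same woman may be classified differently by different methods. A small illustrative sketch (function and key names are hypothetical, not from the trial's code):

```python
def lymphedema_flags(vol_change_pct, size_change_pct,
                     impedance_change_pct, survey_change_units):
    """Apply the four PAL-style change criteria quoted in the abstract.
    Returns one boolean per diagnostic method so the classifications
    can be compared case by case."""
    return {
        "water displacement (volume >= 5%)": vol_change_pct >= 5,
        "circumference sum (size >= 5%)": size_change_pct >= 5,
        "bioimpedance (ratio >= 10%)": impedance_change_pct >= 10,
        "self-report survey (>= 1 unit)": survey_change_units >= 1,
    }

# A case flagged by two of the four methods illustrates why prevalence
# estimates can range widely (22-52% at baseline in the PAL cohort):
flags = lymphedema_flags(6.0, 3.0, 12.0, 0)
```

    Disagreement between such flags, rather than any error in measurement, is the likely driver of the wide baseline prevalence range the authors report.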

  3. Using the random forest method to detect a response shift in the quality of life of multiple sclerosis patients: a cohort study

    PubMed Central

    2013-01-01

    Background Multiple sclerosis (MS), a common neurodegenerative disease, has well-described associations with quality of life (QoL) impairment. QoL changes found in longitudinal studies are difficult to interpret due to the potential response shift (RS) corresponding to respondents’ changing standards, values, and conceptualization of QoL. This study proposes to test the capacity of Random Forest (RF) for detecting RS reprioritization as the relative importance of QoL domains’ changes over time. Methods This was a longitudinal observational study. The main inclusion criteria were patients 18 years old or more with relapsing-remitting multiple sclerosis. Every 6 months up to month 24, QoL was recorded using generic and MS-specific questionnaires (MusiQoL and SF-36). At 24 months, individuals were divided into two ‘disability change’ groups: worsened and not-worsened patients. The RF method was performed based on Breiman’s description. Analyses were performed to determine which QoL scores of SF-36 predicted the MusiQoL index. The average variable importance (AVI) was estimated. Results A total of 417 (79.6%) patients were defined as not-worsened and 107 (20.4%) as worsened. A clear RS was identified in worsened patients. While the mental score AVI was almost one third higher than the physical score AVI at 12 months, it was 1.5 times lower at 24 months. Conclusion This work confirms that the RF method offers a useful statistical approach for RS detection. How to integrate the RS in the interpretation of QoL scores remains a challenge for future research. Trial registration ClinicalTrials.gov identifier: NCT00702065 PMID:23414459
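    The "average variable importance" (AVI) idea above can be illustrated with model-agnostic permutation importance, the notion of importance that random forests popularized: shuffle one predictor at a time and measure how much prediction error grows. A sketch assuming a generic fitted `predict` function (numpy only; this is not the study's actual RF implementation):

```python
import numpy as np

def permutation_importance(predict, X, y, rng=None, n_repeats=10):
    """Model-agnostic permutation importance: break the link between
    predictor j and the outcome by shuffling column j, and record the
    increase in mean squared error relative to the unshuffled baseline."""
    rng = np.random.default_rng(rng)
    base_err = np.mean((predict(X) - y) ** 2)
    importances = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        errs = []
        for _ in range(n_repeats):
            Xp = X.copy()
            rng.shuffle(Xp[:, j])   # destroy the x_j <-> y association
            errs.append(np.mean((predict(Xp) - y) ** 2))
        importances[j] = np.mean(errs) - base_err
    return importances
```

    Computing such importances for the SF-36 physical and mental scores at 12 and 24 months, as the authors did with RF-based AVI, is what reveals the reprioritization: the relative ranking of the two domains flips over time in the worsened group.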

  4. Do we really need a large number of particles to simulate bimolecular reactive transport with random walk methods? A kernel density estimation approach

    NASA Astrophysics Data System (ADS)

    Rahbaralam, Maryam; Fernàndez-Garcia, Daniel; Sanchez-Vila, Xavier

    2015-12-01

    Random walk particle tracking methods are a computationally efficient family of methods to solve reactive transport problems. While the number of particles in most realistic applications is on the order of 10^6-10^9, the number of reactive molecules even in diluted systems might be on the order of fractions of the Avogadro number. Thus, each particle actually represents a group of potentially reactive molecules. The use of a low number of particles may result not only in loss of accuracy, but also may lead to an improper reproduction of the mixing process, limited by diffusion. Recent works have used this effect as a proxy to model incomplete mixing in porous media. In this work, we propose using a Kernel Density Estimation (KDE) of the concentrations that allows getting the expected results for a well-mixed solution with a limited number of particles. The idea consists of treating each particle as a sample drawn from the pool of molecules that it represents; this way, the actual location of a tracked particle is seen as a sample drawn from the density function of the location of molecules represented by that given particle, rigorously represented by a kernel density function. The probability of reaction can be obtained by combining the kernels associated with two potentially reactive particles. We demonstrate that the observed deviation in the reaction vs. time curves in numerical experiments reported in the literature could be attributed to the statistical method used to reconstruct concentrations (fixed particle support) from discrete particle distributions, and not to the occurrence of true incomplete mixing. We further explore the evolution of the kernel size with time, linking it to the diffusion process. Our results show that KDEs are powerful tools to improve computational efficiency and robustness in reactive transport simulations, and indicate that incomplete mixing in diluted systems should be modeled based on alternative mechanistic models and not on a
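    The kernel reconstruction step described above can be sketched in one dimension: instead of binning particles into boxes (fixed particle support), each particle's mass is spread over a Gaussian kernel, so the recovered concentration field is smooth even with few particles. A minimal numpy sketch, with the bandwidth (tied to diffusion in the paper) left as a free parameter:

```python
import numpy as np

def kde_concentration(particles, x_grid, bandwidth, mass_per_particle=1.0):
    """Reconstruct a 1-D concentration field from particle positions
    with a Gaussian kernel. Each particle contributes a normalized
    Gaussian of width `bandwidth` centered at its location."""
    x = np.asarray(particles, dtype=float)[:, None]   # shape (N, 1)
    g = np.asarray(x_grid, dtype=float)[None, :]      # shape (1, M)
    kernels = np.exp(-0.5 * ((g - x) / bandwidth) ** 2)
    kernels /= bandwidth * np.sqrt(2.0 * np.pi)
    return mass_per_particle * kernels.sum(axis=0)    # concentration on grid
```

    Because each kernel integrates to one, the reconstructed field conserves total mass regardless of how few particles are tracked, which is the property that restores well-mixed behavior in the paper's experiments.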

  5. A simple method for analyzing actives in random RNAi screens: introducing the “H Score” for hit nomination & gene prioritization

    PubMed Central

    Bhinder, Bhavneet; Djaballah, Hakim

    2013-01-01

    Due to the numerous challenges in hit identification from random RNAi screening, we have examined current practices and found a variety of methodologies employed and published in many reports; the majority of them, unfortunately, do not address the minimum associated criteria for hit nomination, which may well explain the lack of confirmation and follow-up studies currently facing the RNAi field. Overall, we find that these criteria or parameters are not well defined and in most cases arbitrary in nature, rendering it extremely difficult to judge the quality of and confidence in nominated hits across published studies. For this purpose, we have developed a simple method to score actives independent of assay readout, and provide, for the first time, a homogenous platform enabling cross-comparison of active gene lists resulting from different RNAi screening technologies. Here, we report on our recently developed method dedicated to RNAi data output analysis, referred to as the BDA method, applicable to both arrayed and pooled RNAi technologies; it addresses the concerns pertaining to inconsistent hit nomination and off-target silencing in conjunction with minimal activity criteria to identify a high-value target. In this report, a combined hit rate per gene, called the “H score”, is introduced and defined. The H score provides a very useful tool for stringent active gene nomination, gene list comparison across multiple studies, prioritization of hits, and evaluation of the quality of the nominated gene hits. PMID:22934950

  6. Emotional Experiences of Obese Women with Adequate Gestational Weight Variation: A Qualitative Study

    PubMed Central

    Faria-Schützer, Débora Bicudo; Surita, Fernanda Garanhani de Castro; Alves, Vera Lucia Pereira; Vieira, Carla Maria; Turato, Egberto Ribeiro

    2015-01-01

    Background As a result of the growth of the obese population, the number of obese women of fertile age has increased in the last few years. Obesity in pregnancy is related to greater levels of anxiety, depression and physical harm. However, pregnancy is an opportune moment for the intervention of health care professionals to address obesity. The objective of this study was to describe how obese pregnant women emotionally experience success in adequate weight control. Methods and Findings Using a qualitative design that seeks to understand content in the field of health, the sample of subjects was chosen deliberately, with thirteen obese pregnant women selected to participate in an individual interview. Data were analysed by inductive content analysis, which includes complete transcription of the interviews, re-readings using suspended attention, categorization in discussion topics, and the qualitative and inductive analysis of the content. The analysis revealed four categories, three of which show the trajectory of body care that obese women experience during pregnancy: 1) The obese pregnant woman starts to think about her body; 2) The challenge of the diet for the obese pregnant woman; 3) The relation of the obese pregnant woman with the team of antenatal professionals. The fourth category reveals the origin of the motivation for the change: 4) The potentializing factors for change: the motivation of the obese woman while pregnant. Conclusions During pregnancy, obese women are more in touch with themselves and with their emotional conflicts. Through the transformations of their bodies, women can start a more refined self-care process and experience of the body-mind unit. The fear for their own and their baby's life, due to the risks posed by obesity, appears to be a great potentializing factor for change. The relationship with the professionals of the health care team plays an important role in the motivational support of the obese pregnant woman. PMID:26529600

  7. Gauge cooling for the singular-drift problem in the complex Langevin method — a test in Random Matrix Theory for finite density QCD

    NASA Astrophysics Data System (ADS)

    Nagata, Keitaro; Nishimura, Jun; Shimasaki, Shinji

    2016-07-01

    Recently, the complex Langevin method has been applied successfully to finite density QCD either in the deconfinement phase or in the heavy dense limit with the aid of a new technique called the gauge cooling. In the confinement phase with light quarks, however, convergence to wrong limits occurs due to the singularity in the drift term caused by small eigenvalues of the Dirac operator including the mass term. We propose that this singular-drift problem should also be overcome by the gauge cooling with different criteria for choosing the complexified gauge transformation. The idea is tested in chiral Random Matrix Theory for finite density QCD, where exact results are reproduced at zero temperature with light quarks. It is shown that the gauge cooling indeed changes drastically the eigenvalue distribution of the Dirac operator measured during the Langevin process. Despite its non-holomorphic nature, this eigenvalue distribution has a universal diverging behavior at the origin in the chiral limit due to a generalized Banks-Casher relation as we confirm explicitly.

  8. A minimalistic approach to static and dynamic electron correlations: Amending generalized valence bond method with extended random phase approximation correlation correction

    NASA Astrophysics Data System (ADS)

    Chatterjee, Koushik; Pastorczak, Ewa; Jawulski, Konrad; Pernal, Katarzyna

    2016-06-01

    A perfect-pairing generalized valence bond (GVB) approximation is known to be one of the simplest approximations, which allows one to capture the essence of static correlation in molecular systems. In spite of its attractive feature of being relatively computationally efficient, this approximation misses a large portion of dynamic correlation and does not offer sufficient accuracy to be generally useful for studying electronic structure of molecules. We propose to correct the GVB model and alleviate some of its deficiencies by amending it with the correlation energy correction derived from the recently formulated extended random phase approximation (ERPA). On the examples of systems of diverse electronic structures, we show that the resulting ERPA-GVB method greatly improves upon the GVB model. ERPA-GVB recovers most of the electron correlation and it yields energy barrier heights of excellent accuracy. Thanks to a balanced treatment of static and dynamic correlation, ERPA-GVB stays reliable when one moves from systems dominated by dynamic electron correlation to those for which the static correlation comes into play.

  9. Study design and methods for a randomized crossover trial substituting brown rice for white rice on diabetes risk factors in India

    PubMed Central

    Wedick, Nicole M.; Vasudevan, Sudha; Spiegelman, Donna; Bai, Ramya; Malik, Vasanti; Venkatachalam, Siva Sankari; Parthasarathy, Vijayalaksmi; Vaidya, Ruchi; Nagarajan, Lakshmipriya; Arumugam, Kokila; Jones, Clara; Campos, Hannia; Krishnaswamy, Kamala; Willett, Walter; Hu, Frank B.; Mohan, Anjana Ranjit; Viswanathan, Mohan

    2016-01-01

    India has the second largest number of people with diabetes in the world following China. Evidence indicates that consumption of whole grains can reduce risk of type 2 diabetes. This manuscript describes the study design and methods of a trial in progress evaluating the effects of substituting whole grain brown rice for polished (refined) white rice on biomarkers of diabetes risk (glucose metabolism, dyslipidemia, inflammation). This is a randomized controlled clinical trial with a crossover design conducted in Chennai, India among overweight but otherwise healthy volunteers aged 25–65y with a body mass index ≥23kg/m2 and habitual rice consumption ≥200grams/day. The feasibility and cultural appropriateness of this type of intervention in the local environment will also be examined. If the intervention is efficacious, the findings can be incorporated into national-level policies which could include the provision of brown rice as an option or replacement for white rice in government institutions and food programs. This relatively simple dietary intervention has the potential to substantially diminish the burden of diabetes in Asia and elsewhere. PMID:26017321

  10. Online self-administered training for post-traumatic stress disorder treatment providers: design and methods for a randomized, prospective intervention study.

    PubMed

    Ruzek, Josef I; Rosen, Raymond C; Marceau, Lisa; Larson, Mary Jo; Garvert, Donn W; Smith, Lauren; Stoddard, Anne

    2012-01-01

    This paper presents the rationale and methods for a randomized controlled evaluation of web-based training in motivational interviewing, goal setting, and behavioral task assignment. Web-based training may be a practical and cost-effective way to address the need for large-scale mental health training in evidence-based practice; however, there is a dearth of well-controlled outcome studies of these approaches. For the current trial, 168 mental health providers treating post-traumatic stress disorder (PTSD) were assigned to web-based training plus supervision, web-based training, or training-as-usual (control). A novel standardized patient (SP) assessment was developed and implemented for objective measurement of changes in clinical skills, while on-line self-report measures were used for assessing changes in knowledge, perceived self-efficacy, and practice related to cognitive behavioral therapy (CBT) techniques. Eligible participants were all actively involved in mental health treatment of veterans with PTSD. Study methodology illustrates ways of developing training content, recruiting participants, and assessing knowledge, perceived self-efficacy, and competency-based outcomes, and demonstrates the feasibility of conducting prospective studies of training efficacy or effectiveness in large healthcare systems. PMID:22583520

  11. Study design and methods for a randomized crossover trial substituting brown rice for white rice on diabetes risk factors in India.

    PubMed

    Wedick, Nicole M; Sudha, Vasudevan; Spiegelman, Donna; Bai, Mookambika Ramya; Malik, Vasanti S; Venkatachalam, Siva Sankari; Parthasarathy, Vijayalaksmi; Vaidya, Ruchi; Nagarajan, Lakshmipriya; Arumugam, Kokila; Jones, Clara; Campos, Hannia; Krishnaswamy, Kamala; Willett, Walter; Hu, Frank B; Anjana, Ranjit Mohan; Mohan, Viswanathan

    2015-01-01

    India has the second largest number of people with diabetes in the world following China. Evidence indicates that consumption of whole grains can reduce the risk of type 2 diabetes. This article describes the study design and methods of a trial in progress evaluating the effects of substituting whole grain brown rice for polished (refined) white rice on biomarkers of diabetes risk (glucose metabolism, dyslipidemia, inflammation). This is a randomized controlled clinical trial with a crossover design conducted in Chennai, India among overweight but otherwise healthy volunteers aged 25-65 y with a body mass index ≥23 kg/m² and habitual rice consumption ≥200 g/day. The feasibility and cultural appropriateness of this type of intervention in the local environment will also be examined. If the intervention is efficacious, the findings can be incorporated into national-level policies which could include the provision of brown rice as an option or replacement for white rice in government institutions and food programs. This relatively simple dietary intervention has the potential to substantially diminish the burden of diabetes in Asia and elsewhere. PMID:26017321

  12. Research staff training in a multisite randomized clinical trial: Methods and recommendations from the Stimulant Reduction Intervention using Dosed Exercise (STRIDE) trial

    PubMed Central

    Walker, Robrina; Morris, David W; Greer, Tracy L; Trivedi, Madhukar H

    2014-01-01

    Background Descriptions of and recommendations for meeting the challenges of training research staff for multisite studies are limited despite the recognized importance of training on trial outcomes. The STRIDE (STimulant Reduction Intervention using Dosed Exercise) study is a multisite randomized clinical trial that was conducted at nine addiction treatment programs across the United States within the National Drug Abuse Treatment Clinical Trials Network (CTN) and evaluated the addition of exercise to addiction treatment as usual (TAU), compared to health education added to TAU, for individuals with stimulant abuse or dependence. Research staff administered a variety of measures that required a range of interviewing, technical, and clinical skills. Purpose In order to address the absence of information on how research staff are trained for multisite clinical studies, the current manuscript describes the conceptual process of training and certifying research assistants for STRIDE. Methods Training was conducted using a three-stage process to allow staff sufficient time for distributive learning, practice, and calibration leading up to implementation of this complex study. Results Training was successfully implemented with staff across nine sites. Staff demonstrated evidence of study and procedural knowledge via quizzes and skill demonstration on six measures requiring certification. Overall, while the majority of staff had little to no experience in the six measures, all research assistants demonstrated ability to correctly and reliably administer the measures throughout the study. Conclusions Practical recommendations are provided for training research staff and are particularly applicable to the challenges encountered with large, multisite trials. PMID:25379036

  13. Online self-administered training for post-traumatic stress disorder treatment providers: design and methods for a randomized, prospective intervention study

    PubMed Central

    2012-01-01

    This paper presents the rationale and methods for a randomized controlled evaluation of web-based training in motivational interviewing, goal setting, and behavioral task assignment. Web-based training may be a practical and cost-effective way to address the need for large-scale mental health training in evidence-based practice; however, there is a dearth of well-controlled outcome studies of these approaches. For the current trial, 168 mental health providers treating post-traumatic stress disorder (PTSD) were assigned to web-based training plus supervision, web-based training, or training-as-usual (control). A novel standardized patient (SP) assessment was developed and implemented for objective measurement of changes in clinical skills, while on-line self-report measures were used for assessing changes in knowledge, perceived self-efficacy, and practice related to cognitive behavioral therapy (CBT) techniques. Eligible participants were all actively involved in mental health treatment of veterans with PTSD. Study methodology illustrates ways of developing training content, recruiting participants, and assessing knowledge, perceived self-efficacy, and competency-based outcomes, and demonstrates the feasibility of conducting prospective studies of training efficacy or effectiveness in large healthcare systems. PMID:22583520

  14. Feasibility, acceptability, and effects of gentle Hatha yoga for women with major depression: findings from a randomized controlled mixed-methods study.

    PubMed

    Kinser, Patricia Anne; Bourguignon, Cheryl; Whaley, Diane; Hauenstein, Emily; Taylor, Ann Gill

    2013-06-01

    Major depressive disorder (MDD) is a common, debilitating chronic condition in the United States and worldwide. Particularly in women, depressive symptoms are often accompanied by high levels of stress and ruminations, or repetitive self-critical negative thinking. There is a research and clinical imperative to evaluate complementary therapies that are acceptable and feasible for women with depression and that target specific aspects of depression in women, such as ruminations. To begin to address this need, we conducted a randomized, controlled, mixed-methods community-based study comparing an 8-week yoga intervention with an attention-control activity in 27 women with MDD. After controlling for baseline stress, there was a decrease in depression over time in both the yoga group and the attention-control group, with the yoga group having a unique trend in decreased ruminations. Participants in the yoga group reported experiencing increased connectedness and gaining a coping strategy through yoga. The findings provide support for future large-scale research to explore the effects of yoga for depressed women and the unique role of yoga in decreasing rumination. PMID:23706890

  15. Comparison of four standards for determining adequate water intake of nursing home residents.

    PubMed

    Gaspar, Phyllis M

    2011-01-01

    Adequate hydration for nursing home residents is problematic. The purpose of this study was to compare four standards used to determine a recommended water intake among nursing home residents. Inconsistencies in the amount of water intake recommended based on the standards compared were identified. The standard based on height and weight provides the most individualized recommendation. An individualized recommendation would facilitate goal setting for the care plan of each older person and assist in the prevention of dehydration. It is essential that a cost-effective and clinically feasible approach to determine adequate water intake be determined for this population to prevent the adverse outcomes associated with dehydration. PMID:21469538

  16. Is random access memory random?

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1986-01-01

    Most software is constructed on the assumption that the programs and data are stored in random access memory (RAM). Physical limitations on the relative speeds of processor and memory elements lead to a variety of memory organizations that match processor addressing rate with memory service rate. These include interleaved and cached memory. A very high fraction of a processor's address requests can be satisfied from the cache without reference to the main memory. The cache requests information from main memory in blocks that can be transferred at the full memory speed. Programmers who organize algorithms for locality can realize the highest performance from these computers.
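The locality effect described in this abstract can be made concrete: a 2-D array lives in RAM as one flat row-major buffer, so scanning it row by row touches consecutive addresses and reuses each fetched cache block, while scanning column by column strides across the buffer and tends to touch a new block on every access. An illustrative sketch (not from the article; in CPython the interpreter overhead hides most of the timing difference, whereas in compiled languages the row-major loop is typically several times faster):

```python
from array import array

N = 512
# One flat buffer in row-major order, as RAM actually stores a 2-D array.
a = array('d', (float(i) for i in range(N * N)))

def sum_row_major(a, n):
    # Strides through memory sequentially: consecutive elements share
    # cache blocks, so most accesses are satisfied from the cache.
    total = 0.0
    for i in range(n):
        for j in range(n):
            total += a[i * n + j]
    return total

def sum_col_major(a, n):
    # Jumps n * 8 bytes between accesses: each access tends to touch a
    # different cache block, defeating the cache for large n.
    total = 0.0
    for j in range(n):
        for i in range(n):
            total += a[i * n + j]
    return total

# Both orders compute the same sum; only the memory access pattern differs.
assert sum_row_major(a, N) == sum_col_major(a, N)
```

Organizing the loop nest to match the storage order is exactly the kind of "organizing algorithms for locality" the abstract refers to.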

  17. Application of bimodal distribution to the detection of changes in uranium concentration in drinking water collected by random daytime sampling method from a large water supply zone.

    PubMed

    Garboś, Sławomir; Święcicka, Dorota

    2015-11-01

    The random daytime (RDT) sampling method was used for the first time in the assessment of average weekly exposure to uranium through drinking water in a large water supply zone. The data set of uranium concentrations determined in 106 RDT samples, collected in three runs from the water supply zone in Wroclaw (Poland), cannot be adequately described by a normal or log-normal distribution. Therefore, a numerical method designed for the detection and calculation of bimodal distributions was applied. The two extracted distributions, containing data from the summer season of 2011 and the winter season of 2012 (n_I = 72) and from the summer season of 2013 (n_II = 34), allowed estimation of the mean U concentrations in drinking water: 0.947 μg/L and 1.23 μg/L, respectively. As the removal efficiency of uranium during the applied treatment process is negligible, the increase in uranium concentration can be explained by a higher U concentration in the surface-infiltration water used for the production of drinking water. During the summer season of 2013, heavy rains were observed in the Lower Silesia region, causing floods over the territory of the entire region. Fluctuations in uranium concentrations in surface-infiltration water can be attributed to releases of uranium from specific sources: migration from phosphate fertilizers and leaching from mineral deposits. Thus, exposure to uranium through drinking water may increase during extreme rainfall events. The average chronic weekly intakes of uranium through drinking water, estimated on the basis of the central values of the extracted normal distributions, accounted for 3.2% and 4.1% of the tolerable weekly intake. PMID:26143355
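The abstract does not specify the numerical method used to separate the two components; one standard approach is expectation-maximization (EM) on a two-component Gaussian mixture. A minimal pure-Python sketch, with my own initialization and simulated data standing in for the Wroclaw measurements:

```python
import math
import random

def fit_two_gaussians(xs, iters=200):
    """EM for a 1-D two-component Gaussian mixture (an illustrative
    sketch, not the authors' implementation)."""
    xs = sorted(xs)
    n = len(xs)
    # Initialize the component means at the quartiles and use a broad
    # common standard deviation so EM starts from a plausible split.
    mu1, mu2 = xs[n // 4], xs[3 * n // 4]
    s1 = s2 = (max(xs) - min(xs)) / 4 or 1.0
    w = 0.5  # mixing weight of component 1
    for _ in range(iters):
        # E-step: responsibility of component 1 for each point
        # (the 1/sqrt(2*pi) normalizer cancels in the ratio).
        r = [
            (w * math.exp(-0.5 * ((x - mu1) / s1) ** 2) / s1)
            / (w * math.exp(-0.5 * ((x - mu1) / s1) ** 2) / s1
               + (1 - w) * math.exp(-0.5 * ((x - mu2) / s2) ** 2) / s2)
            for x in xs
        ]
        # M-step: re-estimate weight, means, and standard deviations.
        n1 = sum(r)
        n2 = n - n1
        w = n1 / n
        mu1 = sum(ri * x for ri, x in zip(r, xs)) / n1
        mu2 = sum((1 - ri) * x for ri, x in zip(r, xs)) / n2
        s1 = math.sqrt(sum(ri * (x - mu1) ** 2 for ri, x in zip(r, xs)) / n1) or 1e-6
        s2 = math.sqrt(sum((1 - ri) * (x - mu2) ** 2 for ri, x in zip(r, xs)) / n2) or 1e-6
    return (mu1, s1), (mu2, s2), w

# Simulated bimodal data shaped like the reported means (0.947 and 1.23 ug/L).
random.seed(42)
data = ([random.gauss(0.95, 0.05) for _ in range(72)]
        + [random.gauss(1.23, 0.05) for _ in range(34)])
(m1, _), (m2, _), w = fit_two_gaussians(data)
```

With well-separated modes, the recovered means land close to the simulated 0.95 and 1.23 and the mixing weight near 72/106, which is how central values for each seasonal sub-population can be read off a pooled data set.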

  18. Methods and baseline characteristics of a randomized trial treating early childhood obesity: The Positive Lifestyles for Active Youngsters (Team PLAY) trial

    PubMed Central

    Hare, Marion; Coday, Mace; Williams, Natalie A.; Richey, Phyllis; Tylavsky, Frances; Bush, Andrew

    2012-01-01

    There are few effective obesity interventions directed towards younger children, particularly young minority children. This paper describes the design, intervention, recruitment methods, and baseline data of the ongoing Positive Lifestyles for Active Youngsters (Team PLAY) study. This randomized controlled trial is designed to test the efficacy of a 6-month, moderately intense, primary care feasible, family-based behavioral intervention, targeting both young children and their parent, in promoting healthy weight change. Participants are 270 overweight and obese children (ages 4 to 7 years) and their parent, who were recruited from a primarily African American urban population. Parents and children were instructed in proven cognitive behavioral techniques (e.g. goal setting, self-talk, stimulus control and reinforcement) designed to encourage healthier food choices (more whole grains, fruits and vegetables, and less concentrated fats and sugar), reduce portion sizes, decrease sweetened beverages and increase moderate to vigorous physical activity engagement. The main outcome of this study is change in BMI at two years post enrollment. Recruitment using reactive methods (mailings, TV ads, pamphlets) was found to be more successful than using only a proactive approach (referral through physicians). At baseline, most children were very obese with an average BMI z-score of 2.6. Reported intake of fruits and vegetables and minutes of moderate to vigorous physical activity engagement did not meet national recommendations. If efficacious, Team PLAY would offer a model for obesity treatment directed at families with young children that could be tested and translated to both community and primary care settings. PMID:22342450

  19. Game-Based E-Learning Is More Effective than a Conventional Instructional Method: A Randomized Controlled Trial with Third-Year Medical Students

    PubMed Central

    Boeker, Martin; Andel, Peter; Vach, Werner; Frankenschmidt, Alexander

    2013-01-01

    Background When compared with more traditional instructional methods, Game-based e-learning (GbEl) promises a higher motivation of learners by presenting contents in an interactive, rule-based and competitive way. Most recent systematic reviews and meta-analyses of studies on Game-based learning and GbEl in the medical professions have shown limited effects of these instructional methods. Objectives To compare the effectiveness on the learning outcome of a Game-based e-learning (GbEl) instruction with a conventional script-based instruction in the teaching of phase contrast microscopy urinalysis under routine training conditions of undergraduate medical students. Methods A randomized controlled trial was conducted with 145 medical students in their third year of training in the Department of Urology at the University Medical Center Freiburg, Germany. 82 subjects were allocated for training with an educational adventure game (GbEl group) and 69 subjects for conventional training with a written script-based approach (script group). Learning outcome was measured with a 34-item single-choice test. Students' attitudes were collected by a questionnaire regarding fun with the training, motivation to continue the training and self-assessment of acquired knowledge. Results The students in the GbEl group achieved significantly better results in the cognitive knowledge test than the students in the script group: the mean score was 28.6 for the GbEl group and 26.0 for the script group of a total of 34.0 points, with a Cohen's d effect size of 0.71 (ITT analysis). Attitudes towards the recent learning experience were significantly more positive with GbEl. Students reported having more fun while learning with the game when compared to the script-based approach. Conclusions Game-based e-learning is more effective than a script-based approach for the training of urinalysis in regard to cognitive learning outcome and has a high positive motivational impact on learning. Game

  20. Shoulder Arthroscopy Does Not Adequately Visualize Pathology of the Long Head of Biceps Tendon

    PubMed Central

    Saithna, Adnan; Longo, Alison; Leiter, Jeff; Old, Jason; MacDonald, Peter M.

    2016-01-01

    Background: Pulling the long head of the biceps tendon into the joint at arthroscopy is a common method for evaluation of tendinopathic lesions. However, the rate of missed diagnoses when using this technique is reported to be as high as 30% to 50%. Hypothesis: Tendon excursion achieved using a standard arthroscopic probe does not allow adequate visualization of extra-articular sites of predilection of tendinopathy. Study Design: Descriptive laboratory study. Methods: Seven forequarter amputation cadaveric specimens were evaluated. The biceps tendon was tagged to mark the intra-articular length and the maximum excursions achieved using a probe and a grasper in both beach-chair and lateral positions. Statistical analyses were performed using analysis of variance to compare means. Results: The mean intra-articular and extra-articular lengths of the tendons were 23.9 and 82.3 mm, respectively. The length of tendon that could be visualized by pulling it into the joint with a probe through the anterior midglenoid portal was not significantly different when using either lateral decubitus (mean ± SD, 29.9 ± 3.89 mm; 95% CI, 25.7-34 mm) or beach-chair positions (32.7 ± 4.23 mm; 95% CI, 28.6-36.8 mm). The maximum length of the overall tendon visualized in any specimen using a standard technique was 37 mm. Although there was a trend to greater excursion using a grasper through the same portal, this was not statistically significant. However, using a grasper through the anterosuperior portal gave a significantly greater mean excursion than any other technique (46.7 ± 4.31 mm; 95% CI, 42.6-50.8 mm), but this still failed to allow evaluation of Denard zone C. Conclusion: Pulling the tendon into the joint with a probe via an anterior portal does not allow visualization of distal sites of predilection of pathology. Surgeons should be aware that this technique is inadequate and can result in missed diagnoses. Clinical Relevance: This study demonstrates that glenohumeral

  1. How random are random numbers generated using photons?

    NASA Astrophysics Data System (ADS)

    Solis, Aldo; Angulo Martínez, Alí M.; Ramírez Alarcón, Roberto; Cruz Ramírez, Hector; U'Ren, Alfred B.; Hirsch, Jorge G.

    2015-06-01

    Randomness is fundamental in quantum theory, with many philosophical and practical implications. In this paper we discuss the concept of algorithmic randomness, which provides a quantitative method to assess the Borel normality of a given sequence of numbers, a necessary condition for it to be considered random. We use Borel normality as a tool to investigate the randomness of ten sequences of bits generated from the differences between detection times of photon pairs generated by spontaneous parametric downconversion. These sequences are shown to fulfil the randomness criteria without difficulties. As deviations from Borel normality for photon-generated random number sequences have been reported in previous work, a strategy to understand these diverging findings is outlined.
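The Borel-normality check the paper uses is mechanical: for each block length k up to some bound, every one of the 2^k possible k-bit patterns must occur among the non-overlapping k-bit blocks with frequency close to 2^-k. A sketch of such a test (the tolerance sqrt(log2(n)/n) follows Calude's formulation and is an assumption here, since the paper's exact bound is not given in the abstract):

```python
import math
import random

def borel_normal(bits, max_k=3):
    """Necessary-condition test for randomness of a '0'/'1' string:
    every k-bit pattern must appear with frequency near 2**-k."""
    n = len(bits)
    bound = math.sqrt(math.log2(n) / n)  # assumed tolerance
    for k in range(1, max_k + 1):
        m = n // k  # number of non-overlapping k-bit blocks
        counts = {}
        for i in range(m):
            block = bits[i * k:(i + 1) * k]
            counts[block] = counts.get(block, 0) + 1
        for pattern in range(2 ** k):
            freq = counts.get(format(pattern, f"0{k}b"), 0) / m
            if abs(freq - 2 ** -k) > bound:
                return False  # some pattern is over- or under-represented
    return True

# A pseudorandom string passes; a periodic string fails at k = 2,
# even though its single-bit frequencies are perfectly balanced.
random.seed(0)
sample = "".join(random.choice("01") for _ in range(10000))
```

Passing such a check is necessary but not sufficient for randomness, which is why the paper treats Borel normality as one tool for investigating the photon-generated sequences rather than a proof of their randomness.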

  2. The Relationship between Parental Involvement and Adequate Yearly Progress among Urban, Suburban, and Rural Schools

    ERIC Educational Resources Information Center

    Ma, Xin; Shen, Jianping; Krenn, Huilan Y.

    2014-01-01

    Using national data from the 2007-08 School and Staffing Survey, we compared the relationships between parental involvement and school outcomes related to adequate yearly progress (AYP) in urban, suburban, and rural schools. Parent-initiated parental involvement demonstrated significantly positive relationships with both making AYP and staying off…

  3. Influenza 2005-2006: vaccine supplies adequate, but bird flu looms.

    PubMed

    Mossad, Sherif B

    2005-11-01

    Influenza vaccine supplies appear to be adequate for the 2005-2006 season, though delivery has been somewhat delayed. However, in the event of a pandemic of avian flu (considered inevitable by most experts, although no one knows when it will happen), the United States would be woefully unprepared. PMID:16315443

  4. Calculating and Reducing Errors Associated with the Evaluation of Adequate Yearly Progress.

    ERIC Educational Resources Information Center

    Hill, Richard

    In the Spring, 1996, issue of "CRESST Line," E. Baker and R. Linn commented that, in efforts to measure the progress of schools, "the fluctuations due to differences in the students themselves could conceal differences in instructional effects." This is particularly true in the context of the evaluation of adequate yearly progress required by…

  5. How Much and What Kind? Identifying an Adequate Technology Infrastructure for Early Childhood Education. Policy Brief

    ERIC Educational Resources Information Center

    Daugherty, Lindsay; Dossani, Rafiq; Johnson, Erin-Elizabeth; Wright, Cameron

    2014-01-01

    To realize the potential benefits of technology use in early childhood education (ECE), and to ensure that technology can help to address the digital divide, providers, families of young children, and young children themselves must have access to an adequate technology infrastructure. The goals for technology use in ECE that a technology…

  6. Prenatal zinc supplementation of zinc-adequate rats adversely affects immunity in offspring

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We previously showed that zinc (Zn) supplementation of Zn-adequate dams induced immunosuppressive effects that persist in the offspring after weaning. We investigated whether the immunosuppressive effects were due to in utero exposure and/or mediated via milk using a cross-fostering design. Pregnant...

  7. 75 FR 5893 - Suspension of Community Eligibility for Failure To Maintain Adequate Floodplain Management...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-05

    ... FR 51735. Executive Order 13132, Federalism. This rule involves no policies that have federalism....C. 4001 et seq., Reorganization Plan No. 3 of 1978, 3 CFR, 1978 Comp., p. 329; E.O. 12127, 44 FR... To Maintain Adequate Floodplain Management Regulations AGENCY: Federal Emergency Management...

  8. 26 CFR 1.467-2 - Rent accrual for section 467 rental agreements without adequate interest.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... provide for a variable rate of interest. For purposes of the adequate interest test under paragraph (b)(1) of this section, if a section 467 rental agreement provides for variable interest, the rental... date as the issue date) for the variable rates called for by the rental agreement. For purposes of...

  9. The Unequal Effect of Adequate Yearly Progress: Evidence from School Visits

    ERIC Educational Resources Information Center

    Brown, Abigail B.; Clift, Jack W.

    2010-01-01

    The authors report insights, based on annual site visits to elementary and middle schools in three states from 2004 to 2006, into the incentive effect of the No Child Left Behind Act's requirement that increasing percentages of students make Adequate Yearly Progress (AYP) in every public school. They develop a framework, drawing on the physics…

  10. 9 CFR 2.33 - Attending veterinarian and adequate veterinary care.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 9 Animals and Animal Products 1 2011-01-01 2011-01-01 false Attending veterinarian and adequate veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Research Facilities § 2.33 Attending veterinarian...

  11. 9 CFR 2.33 - Attending veterinarian and adequate veterinary care.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 9 Animals and Animal Products 1 2010-01-01 2010-01-01 false Attending veterinarian and adequate veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Research Facilities § 2.33 Attending veterinarian...

  12. Perceptions of Teachers in Their First Year of School Restructuring: Failure to Make Adequate Yearly Progress

    ERIC Educational Resources Information Center

    Moser, Sharon

    2010-01-01

    The 2007-2008 school year marked the first year that Florida's Title I schools that had not made Adequate Yearly Progress (AYP) for five consecutive years entered into restructuring as mandated by the "No Child Left Behind Act" of 2001. My study examines the perceptions of teachers entering their first year of school restructuring due to failure to…

  13. A Model for Touch Technique and Computation of Adequate Cane Length.

    ERIC Educational Resources Information Center

    Plain-Switzer, Karen

    1993-01-01

    This article presents a model for the motion of a long-cane executing the touch technique and presents formulas for the projected length of a cane adequate to protect an individual with blindness against wall-type and pole-type hazards. The paper concludes that the long-cane should reach from the floor to the user's armpit. (JDD)

  14. Towards Defining Adequate Lithium Trials for Individuals with Mental Retardation and Mental Illness.

    ERIC Educational Resources Information Center

    Pary, Robert J.

    1991-01-01

    Use of lithium with mentally retarded individuals with psychiatric conditions and/or behavior disturbances is discussed. The paper describes components of an adequate clinical trial and reviews case studies and double-blind cases. The paper concludes that aggression is the best indicator for lithium use, and reviews treatment parameters and…

  15. 9 CFR 2.33 - Attending veterinarian and adequate veterinary care.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... animal health, behavior, and well-being is conveyed to the attending veterinarian; (4) Guidance to... 9 Animals and Animal Products 1 2014-01-01 2014-01-01 false Attending veterinarian and adequate veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION...

  16. 9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... on problems of animal health, behavior, and well-being is conveyed to the attending veterinarian; (4... 9 Animals and Animal Products 1 2013-01-01 2013-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND...

  17. 9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... on problems of animal health, behavior, and well-being is conveyed to the attending veterinarian; (4... 9 Animals and Animal Products 1 2010-01-01 2010-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND...

  18. 9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... on problems of animal health, behavior, and well-being is conveyed to the attending veterinarian; (4... 9 Animals and Animal Products 1 2011-01-01 2011-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND...

  19. 9 CFR 2.33 - Attending veterinarian and adequate veterinary care.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... animal health, behavior, and well-being is conveyed to the attending veterinarian; (4) Guidance to... 9 Animals and Animal Products 1 2013-01-01 2013-01-01 false Attending veterinarian and adequate veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION...

  20. 9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... on problems of animal health, behavior, and well-being is conveyed to the attending veterinarian; (4... 9 Animals and Animal Products 1 2014-01-01 2014-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND...

  1. Inferential Processing among Adequate and Struggling Adolescent Comprehenders and Relations to Reading Comprehension

    ERIC Educational Resources Information Center

    Barth, Amy E.; Barnes, Marcia; Francis, David; Vaughn, Sharon; York, Mary

    2015-01-01

    Separate mixed model analyses of variance were conducted to examine the effect of textual distance on the accuracy and speed of text consistency judgments among adequate and struggling comprehenders across grades 6-12 (n = 1,203). Multiple regressions examined whether accuracy in text consistency judgments uniquely accounted for variance in…

  2. 42 CFR 438.207 - Assurances of adequate capacity and services.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... with the State's requirements for availability of services, as set forth in § 438.206. (e) CMS' right... HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS MANAGED CARE Quality Assessment and Performance... 42 Public Health 4 2010-10-01 2010-10-01 false Assurances of adequate capacity and services....

  3. 42 CFR 438.207 - Assurances of adequate capacity and services.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... requirements: (1) Offers an appropriate range of preventive, primary care, and specialty services that is adequate for the anticipated number of enrollees for the service area. (2) Maintains a network of providers... enrollment in its service area in accordance with the State's standards for access to care under this...

  4. 42 CFR 438.207 - Assurances of adequate capacity and services.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... requirements: (1) Offers an appropriate range of preventive, primary care, and specialty services that is adequate for the anticipated number of enrollees for the service area. (2) Maintains a network of providers... enrollment in its service area in accordance with the State's standards for access to care under this...

  5. 42 CFR 438.207 - Assurances of adequate capacity and services.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... requirements: (1) Offers an appropriate range of preventive, primary care, and specialty services that is adequate for the anticipated number of enrollees for the service area. (2) Maintains a network of providers... enrollment in its service area in accordance with the State's standards for access to care under this...

  6. Effect of tranquilizers on animal resistance to the adequate stimuli of the vestibular apparatus

    NASA Technical Reports Server (NTRS)

    Maksimovich, Y. B.; Khinchikashvili, N. V.

    1980-01-01

    The effect of tranquilizers on vestibulospinal reflexes and motor activity was studied in 900 centrifuged albino mice. Actometric studies have shown that tranquilizers, as a group, increase animal resistance to the action of adequate stimuli to the vestibular apparatus.

  7. Final 2004 Report on Adequate Yearly Progress in the Montgomery County Public Schools

    ERIC Educational Resources Information Center

    Stevenson, Jose W.

    2005-01-01

    The vast majority of Montgomery County public schools made sufficient progress on state testing and accountability standards in 2004 to comply with the adequate yearly progress (AYP) requirements under the "No Child Left Behind (NCLB) Act of 2001." Information released by the Maryland State Department of Education (MSDE) in October 2004 shows that…

  8. 42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 3 2011-10-01 2011-10-01 false Adequate financial records, statistical data, and... financial records, statistical data, and cost finding. (a) Maintenance of records. (1) An HMO or CMP must maintain sufficient financial records and statistical data for proper determination of costs payable by...

  9. 42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Adequate financial records, statistical data, and... financial records, statistical data, and cost finding. (a) Maintenance of records. (1) An HMO or CMP must maintain sufficient financial records and statistical data for proper determination of costs payable by...

  10. Leadership Style and Adequate Yearly Progress: A Correlational Study of Effective Principal Leadership

    ERIC Educational Resources Information Center

    Leapley-Portscheller, Claudia Iris

    2008-01-01

    Principals are responsible for leading efforts to reach increasingly higher levels of student academic proficiency in schools associated with adequate yearly progress (AYP) requirements. The purpose of this quantitative, correlational study was to identify the degree to which perceptions of principal transformational, transactional, and…

  11. Percentage of Adults with High Cholesterol Whose LDL Cholesterol Levels Are Adequately Controlled

    MedlinePlus

    ... of Adults with High Cholesterol Whose LDL Cholesterol Levels are Adequately Controlled High cholesterol can double a ... with High Cholesterol that is Controlled by Education Level ...

  12. 42 CFR 413.24 - Adequate cost data and cost finding.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false Adequate cost data and cost finding. 413.24 Section 413.24 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES MEDICARE PROGRAM PRINCIPLES OF REASONABLE COST REIMBURSEMENT; PAYMENT FOR END-STAGE RENAL DISEASE SERVICES; OPTIONAL PROSPECTIVELY...

  13. Principals' Perceptions of Effective Strategies in Meeting Adequate Yearly Progress in Special Education

    ERIC Educational Resources Information Center

    Meyer, Jadie K.

    2012-01-01

    The purpose of this study was to examine the perceptions of principals who have met Adequate Yearly Progress (AYP) with the special education subgroup. This was a qualitative study, utilizing interviews to answer the research questions. The first three research questions analyzed the areas of assessment, building-level leadership, and curriculum…

  14. Human milk feeding supports adequate growth in infants

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Despite current nutritional strategies, premature infants remain at high risk for extrauterine growth restriction. The use of an exclusive human milk-based diet is associated with decreased incidence of necrotizing enterocolitis (NEC), but concerns exist about infants achieving adequate growth. The ...

  15. 75 FR 74022 - Safety Analysis Requirements for Defining Adequate Protection for the Public and the Workers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-30

    ... November 15, 2010 (75 FR 69648). The corrected text of the recommendation approved by the Board is below... or telephone number (202) 694-7000. Correction: In the Federal Register of November 15, 2010 (75 FR... SAFETY BOARD Safety Analysis Requirements for Defining Adequate Protection for the Public and the...

  16. Evaluating Rural Progress in Mathematics Achievement: Threats to the Validity of "Adequate Yearly Progress"

    ERIC Educational Resources Information Center

    Lee, Jaekyung

    2003-01-01

    This article examines major threats to the validity of Adequate Yearly Progress (AYP) in the context of rural schools. Although rural students and their schools made significant academic progress in the past on national and state assessments, the current goal of AYP turns out to be highly unrealistic for them unless states set far lower…

  17. 40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Exemptions for pesticides adequately regulated by another Federal agency. 152.20 Section 152.20 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION...

  18. 40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 25 2013-07-01 2013-07-01 false Exemptions for pesticides adequately regulated by another Federal agency. 152.20 Section 152.20 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION...

  19. 40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 25 2012-07-01 2012-07-01 false Exemptions for pesticides adequately regulated by another Federal agency. 152.20 Section 152.20 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION...

  20. What Is the Cost of an Adequate Vermont High School Education?

    ERIC Educational Resources Information Center

    Rucker, Frank D.

    2010-01-01

    Access to an adequate education has been widely considered an undeniable right since Chief Justice Warren stated in his landmark decision that "Today, education is perhaps the most important function of state and local governments...it is doubtful that any child may reasonably be expected to succeed in life if he is denied the opportunity of an…

  1. Landslide Susceptibility Analysis by the comparison and integration of Random Forest and Logistic Regression methods; application to the disaster of Nova Friburgo - Rio de Janeiro, Brasil (January 2011)

    NASA Astrophysics Data System (ADS)

    Esposito, Carlo; Barra, Anna; Evans, Stephen G.; Scarascia Mugnozza, Gabriele; Delaney, Keith

    2014-05-01

    The study of landslide susceptibility by multivariate statistical methods is based on finding a quantitative relationship between controlling factors and landslide occurrence. Such studies have become popular in the last few decades thanks to the development of geographic information systems (GIS) software and the related improved data management. In this work we applied a statistical approach to an area of high landslide susceptibility mainly due to its tropical climate and geological-geomorphological setting. The study area is located in the south-east region of Brazil, which has frequently been affected by flood and landslide hazard, especially because of heavy rainfall events during the summer season. In this work we studied a disastrous event that occurred on January 11th and 12th of 2011, which involved Região Serrana (the mountainous region of Rio de Janeiro State) and caused more than 5000 landslides and at least 904 deaths. In order to produce susceptibility maps, we focused our attention on an area of 93.6 km² that includes Nova Friburgo city. We utilized two different multivariate statistical methods: Logistic Regression (LR), already widely used in applied geosciences, and Random Forest (RF), which has only recently been applied to landslide susceptibility analysis. With reference to each mapping unit, the first method (LR) results in a probability of landslide occurrence, while the second one (RF) gives a prediction in terms of % of area susceptible to slope failure. With this aim in mind, a landslide inventory map (related to the studied event) has been drawn up through analyses of high-resolution GeoEye satellite images, in a GIS environment. Data layers of 11 causative factors have been created and processed in order to be used as continuous numerical or discrete categorical variables in statistical analysis. In particular, the logistic regression method has frequent difficulties in managing numerical continuous and discrete categorical variables
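As a rough illustration of the two classifiers named in this abstract (not the authors' GIS pipeline: the factor names, coefficients, and data below are invented, and scikit-learn availability is assumed), both methods turn causative-factor layers into a per-cell susceptibility score:

```python
# Illustrative sketch only: synthetic "causative factors" and labels,
# not the Nova Friburgo data. Logistic regression and random forest each
# produce a per-mapping-unit probability of landslide occurrence.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
slope = rng.uniform(0, 60, n)      # hypothetical causative factor (degrees)
curvature = rng.normal(0, 1, n)    # hypothetical causative factor
# Synthetic ground truth: steeper, more concave cells fail more often.
p_true = 1 / (1 + np.exp(-(0.08 * slope - 0.5 * curvature - 3)))
landslide = rng.random(n) < p_true
X = np.column_stack([slope, curvature])

lr = LogisticRegression(max_iter=1000).fit(X, landslide)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, landslide)

# Both models yield a probability per mapping unit that can be binned
# into susceptibility classes for mapping.
cell = np.array([[45.0, -1.0]])    # one steep, concave cell
print("LR susceptibility:", lr.predict_proba(cell)[0, 1])
print("RF susceptibility:", rf.predict_proba(cell)[0, 1])
```

In practice the discrete categorical factors the abstract mentions (e.g. lithology classes) would be one-hot encoded for LR, while RF handles them with less preprocessing, which is one reason the two methods are often compared.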

  2. Treatment Outcomes of Corticosteroid Injection and Extracorporeal Shock Wave Therapy as Two Primary Therapeutic Methods for Acute Plantar Fasciitis: A Prospective Randomized Clinical Trial.

    PubMed

    Mardani-Kivi, Mohsen; Karimi Mobarakeh, Mahmoud; Hassanzadeh, Zabihallah; Mirbolook, Ahmadreza; Asadi, Kamran; Ettehad, Hossein; Hashemi-Motlagh, Keyvan; Saheb-Ekhtiari, Khashayar; Fallah-Alipour, Keyvan

    2015-01-01

    The outcome of corticosteroid injection (CSI) and extracorporeal shock wave therapy (ESWT) as primary treatment of acute plantar fasciitis has been debated. The purpose of the present study was to evaluate and compare the therapeutic effects of CSI and ESWT in patients with acute (<6-week duration) symptomatic plantar fasciitis. Of the 116 eligible patients, 68 were randomized to 2 equal groups of 34 patients, each undergoing either ESWT or CSI. The ESWT method included 2000 impulses with energy of 0.15 mJ/mm² and a total energy flux density of 900 mJ/mm² for 3 consecutive sessions at 1-week intervals. In the CSI group, 40 mg of methylprednisolone acetate plus 1 mL of lidocaine 2% was injected into the maximal tenderness point at the inframedial calcaneal tuberosity. The success and recurrence rates and the pain intensity, measured using the visual analog scale, were recorded and compared at the 3-month follow-up visit. The pain intensity had decreased significantly in all patients undergoing either technique. However, the value and trend of pain reduction in the CSI group were significantly greater than those in the ESWT group (p < .0001). In the ESWT and CSI groups, 19 (55.9%) and 5 (14.7%) patients experienced treatment failure, respectively. Age, gender, body mass index, and recurrence rate were similar between the 2 groups (p > .05). Both ESWT and CSI can be used as the primary and/or initial treatment option for treating patients with acute plantar fasciitis; however, the CSI technique had better therapeutic outcomes. PMID:26215551

  3. Investigating the roles of dimensionality in the helix-coil transition in random walk protein models using the method of Lee-Yang zeros

    NASA Astrophysics Data System (ADS)

    Linhananta, Apichart

    2004-03-01

    Recent computer simulations of coarse-grained (J.P. Kemp and Z.Y. Chen, 1998, Phys. Rev. Lett. 81, 3880) and all-atom protein models (M. Takano et al., 2002, J. Chem. Phys. 116, 2219) have demonstrated that the helix-coil transition is a first-order-like transition. This contradicts the classical Ising-based one-dimensional Zimm-Bragg theory in which no phase transition occurs. It was conjectured that the discrepancy is due to long-range interactions and the fact that real protein systems are not one dimensional. The effects of long-range interactions have been investigated by minimalist models (N.A. Alves and U.H.E. Hansmann, 2000, Phys. Rev. Lett. 84, 1836; J.P. Kemp, U.H.E. Hansmann, and Z.Y. Chen, 2000, Eur. Phys. J. B, 15, 371), which suggest the universality of the helix-coil transition. This paper examines how the conformational entropy of proteins in two or three dimensions drives the helix-coil transition. This is done by constructing two- or three-dimensional self-avoiding random-walk models of proteins in which it is energetically favorable for the protein to assume linear (helical) configurations. The partition functions were determined exactly. Using the method of finite-size scaling and Lee-Yang zeros, it was confirmed that the phase transition is first-order-like. The model is then extended to examine proteins that can form tertiary structures and non-native interactions. It was found that the partition-function zeros of these models differ significantly from those of the helix-coil models. The characterization of phase transitions in models of proteins and biopolymers by Lee-Yang zeros is discussed.
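The Lee-Yang idea the abstract relies on can be shown on a toy two-phase model (this is an assumed illustration, not the paper's self-avoiding-walk model): a chain of N residues that is either fully helical (one state, weight u**N in the fugacity u = exp(beta*J)) or fully coil (3**N states, weight 1), giving Z(u) = u**N + 3**N. The complex zeros of Z lie on the circle |u| = 3 and pinch the real transition point u = 3 at a distance ~ 1/N, the finite-size-scaling signature of a first-order-like transition:

```python
# Locate the Lee-Yang zeros of the toy partition function Z(u) = u**N + 3**N.
import numpy as np

N = 40
# Work with the rescaled polynomial v**N + 1 (v = u/3) so numpy.roots
# stays well conditioned, then map the zeros back to u = 3*v.
coeffs = np.zeros(N + 1)
coeffs[0] = 1.0    # v**N term (np.roots expects highest degree first)
coeffs[-1] = 1.0   # constant term
zeros = 3.0 * np.roots(coeffs)

# The zero nearest the positive real axis sits at u = 3*exp(i*pi/N),
# a distance 6*sin(pi/(2N)) from u = 3, shrinking like 1/N.
closest = zeros[np.argmin(np.abs(zeros - 3.0))]
print("closest zero to u = 3:", closest)
print("distance from u = 3:", abs(closest - 3.0))
```

Repeating this for growing N and watching how fast the closest zero approaches the real axis is exactly the finite-size-scaling diagnostic the abstract describes.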

  4. Smartphone self-monitoring to support self-management among people living with HIV: Perceived benefits and theory of change from a mixed-methods, randomized pilot study

    PubMed Central

    Swendeman, Dallas; Ramanathan, Nithya; Baetscher, Laura; Medich, Melissa; Scheffler, Aaron; Comulada, W. Scott; Estrin, Deborah

    2015-01-01

    BACKGROUND Self-monitoring by mobile phone applications offers new opportunities to engage patients in self-management. Self-monitoring has not been examined thoroughly as a self-directed intervention strategy for self-management of multiple behaviors and states by people living with HIV (PLH). METHODS PLH (n=50), primarily African-American and Latino, were recruited from two AIDS services organizations and randomly assigned to daily smartphone (n=34) or bi-weekly web-survey only (n=16) self-monitoring for six weeks. Smartphone self-monitoring included responding to brief surveys on medication adherence, mental health, substance use, and sexual risk behaviors, and brief text diaries on stressful events. Qualitative analyses examined bi-weekly, open-ended user-experience interviews regarding perceived benefits and barriers of self-monitoring and were used to elaborate a theoretical model for the potential efficacy of self-monitoring to support self-management across multiple domains. RESULTS Self-monitoring functions include reflection for self-awareness, cues to action (reminders), reinforcements from self-tracking, and their potential effects on risk perceptions, motivations, skills, and behavioral activation states. Participants also reported therapeutic benefits related to self-expression for catharsis, non-judgmental disclosure, and in-the-moment support. About one-third of participants reported that surveys were too long, frequent, or tedious. Some smartphone group participants suggested that daily self-monitoring was more beneficial than bi-weekly due to frequency and in-the-moment availability. About twice as many daily self-monitoring group participants reported increased awareness and behavior change support from self-monitoring compared to bi-weekly web-survey only participants. CONCLUSION Self-monitoring is a potentially efficacious disruptive innovation for supporting self-management by PLH and for complementing other interventions, but more research is needed to

  5. Three-dimensional evaluation of postoperative swelling in treatment of zygomatic bone fractures using two different cooling therapy methods: a randomized, observer-blind, prospective study

    PubMed Central

    2013-01-01

    Background Surgical treatment and complications in patients with zygomatic bone fractures can lead to a significant degree of tissue trauma resulting in common postoperative symptoms and types of pain, facial swelling and functional impairment. Beneficial effects of local cold treatment on postoperative swelling, edema, pain, inflammation, and hemorrhage, as well as the reduction of metabolism, bleeding and hematomas, have been described. The aim of this study was to compare postoperative cooling therapy applied through the use of cooling compresses with the water-circulating cooling face mask manufactured by Hilotherm in terms of beneficial impact on postoperative facial swelling, pain, eye motility, diplopia, neurological complaints and patient satisfaction. Methods Forty-two patients were selected for treatment of unilateral zygomatic bone fractures and were divided randomly to one of two treatments: either a Hilotherm cooling face mask or conventional cooling compresses. Cooling was initiated as soon as possible after surgery until postoperative day 3 and was applied continuously for 12 hours daily. Facial swelling was quantified through a three-dimensional optical scanning technique. Furthermore, pain, neurological complaints, eye motility, diplopia and patient satisfaction were observed for each patient. Results Patients receiving a cooling therapy by Hilotherm demonstrated significantly less facial swelling, less pain, reduced limitation of eye motility and diplopia, fewer neurological complaints and were more satisfied compared to patients receiving conventional cooling therapy. Conclusions Hilotherapy is more efficient in managing postoperative swelling and pain after treatment of unilateral zygomatic bone fractures than conventional cooling. Trial registration German Clinical Trials Register ID: DRKS00004846 PMID:23895539

  6. Planning 4-Dimensional Computed Tomography (4DCT) Cannot Adequately Represent Daily Intrafractional Motion of Abdominal Tumors

    SciTech Connect

    Ge, Jiajia; Santanam, Lakshmi; Noel, Camille; Parikh, Parag J.

    2013-03-15

    Purpose: To evaluate whether planning 4-dimensional computed tomography (4DCT) can adequately represent daily motion of abdominal tumors in regularly fractionated and stereotactic body radiation therapy (SBRT) patients. Methods and Materials: Intrafractional tumor motion of 10 patients with abdominal tumors (4 pancreas-fractionated and 6 liver-stereotactic patients) with implanted fiducials was measured based on daily orthogonal fluoroscopic movies over 38 treatment fractions. The needed internal margin for at least 90% of tumor coverage was calculated based on the 95th and 5th percentiles of daily 3-dimensional tumor motion. The planning internal margin was generated by fusing 4DCT motion from all phase bins. The disagreement between needed and planning internal margin was analyzed fraction by fraction in 3 motion axes (superior-inferior [SI], anterior-posterior [AP], and left-right [LR]). The 4DCT margin was considered as an overestimation/underestimation of daily motion when disagreement exceeded at least 3 mm in the SI axis and/or 1.2 mm in the AP and LR axes (4DCT image resolution). The underlying reasons for this disagreement were evaluated based on interfractional and intrafractional breathing variation. Results: The 4DCT overestimated daily 3-dimensional motion in 39% of the fractions in 7 of 10 patients and underestimated it in 53% of the fractions in 8 of 10 patients. Median underestimation was 3.9 mm, 3.0 mm, and 1.7 mm in the SI axis, AP axis, and LR axis, respectively. The 4DCT was found to capture irregular deep breaths in 3 of 10 patients, with 4DCT motion larger than mean daily amplitude by 18 to 21 mm. The breathing pattern varied from breath to breath and day to day. The interfractional variation of amplitude was significantly larger than the intrafractional variation (2.7 mm vs 1.3 mm) in the primary motion axis (ie, SI axis). The SBRT patients showed significantly larger intrafractional amplitude variation than fractionated patients (3.0 mm vs 2
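The percentile-based margin in the Methods can be sketched as follows (a hedged illustration on a synthetic motion trace, not the study's fluoroscopy data): the 5th and 95th percentiles of daily displacement bound roughly 90% of the observed tumor positions in a given axis.

```python
# Compute an asymmetric internal margin from the 5th/95th percentiles of
# a synthetic, breathing-like SI-axis displacement trace (hypothetical).
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 60, 600)                      # one minute of tracking
# Hypothetical SI displacement (mm): ~4-second breathing cycle plus noise.
si = 6.0 * np.sin(2 * np.pi * t / 4.0) + rng.normal(0, 0.8, t.size)

lo, hi = np.percentile(si, [5, 95])              # asymmetric internal margin
inside = np.mean((si >= lo) & (si <= hi))        # fraction of positions covered
print(f"SI margin: [{lo:.1f}, {hi:.1f}] mm, coverage: {inside:.0%}")
```

Comparing such a per-fraction margin against the margin fused from all 4DCT phase bins is what lets the study call a given fraction an over- or underestimation.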

  7. Is serum or sputum eosinophil cationic protein level adequate for diagnosis of mild asthma?

    PubMed

    Khakzad, Mohammad Reza; Mirsadraee, Majid; Sankian, Mojtaba; Varasteh, Abdolreza; Meshkat, Mojtaba

    2009-09-01

    Spirometry has been used as a common diagnostic test in asthma. Most patients with mild asthma have an FEV1 within the normal range; hence, other diagnostic methods are usually used. The aim of this study was to evaluate whether eosinophil cationic protein (ECP) could be an accurate diagnostic marker of mild asthma. In this study diagnosis of asthma was made according to internationally accepted criteria. Asthma severity was evaluated according to frequency of symptoms and FEV1. Adequate sputum samples were obtained in 50 untreated subjects. A control group of 12 normal subjects that showed PC20 more than 8 mg/dl was also examined. Sputum was induced by inhalation of hypertonic saline. Inflammatory cells in sputum smears were assessed semi-quantitatively. ECP and IgE concentrations, eosinophil (EO) percentage and the ECP/EO ratio in serum and sputum were also determined. The results revealed that cough and dyspnea were the most frequent clinical findings. Dyspnea and wheezing were the symptoms that correlated with staging of asthma. FEV1 was within the normal range (more than 80% of predicted) in 22 (44%) subjects. Asthmatic patients showed significantly higher numbers of blood eosinophils (4.5+/-3.1% vs. 1.2+/-0.2%, P=0.009) and higher levels of serum ECP than the control group (3.1+/-2.6% and 22.6+/-15.8 ng/ml, respectively). Sputum ECP level in asthmatics was significantly higher than in non-asthmatics (55.3+/-29.8 ng/mL vs. 25.0+/-24.7 ng/mL, P=0.045). Regression analysis showed no significant correlation between spirometric parameters and biomarkers; the only exception was a significant correlation between FEF(25-75) and serum ECP (r=0.28, P=0.041). Regarding clinical symptoms, wheezing was significantly correlated with elevation of most of the biomarkers. Since serum and sputum ECP levels are elevated in untreated asthmatics, the ECP level could be used for accurate diagnosis of the mild form of asthma, in which spirometry is unremarkable. PMID:20124607

  8. Human milk feeding supports adequate growth in infants ≤ 1250 grams birth weight

    PubMed Central

    2013-01-01

    Background Despite current nutritional strategies, premature infants remain at high risk for extrauterine growth restriction. The use of an exclusive human milk-based diet is associated with decreased incidence of necrotizing enterocolitis (NEC), but concerns exist about infants achieving adequate growth. The objective of this study was to evaluate growth velocities and incidence of extrauterine growth restriction in infants ≤ 1250 grams (g) birth weight (BW) receiving an exclusive human milk-based diet with early and rapid advancement of fortification using a donor human milk derived fortifier. Methods In a single center, prospective observational cohort study, preterm infants weighing ≤ 1250 g BW were fed an exclusive human milk-based diet until 34 weeks postmenstrual age. Human milk fortification with donor human milk derived fortifier was started at 60 mL/kg/d and advanced to provide 6 to 8 additional kilocalories per ounce (or 0.21 to 0.28 kilocalories per gram). Data for growth were compared to historical growth standards and previous human milk-fed cohorts. Results We consecutively evaluated 104 infants with mean gestational age of 27.6 ± 2.0 weeks and BW of 913 ± 181 g (mean ± standard deviation). Weight gain was 24.8 ± 5.4 g/kg/day with length 0.99 ± 0.23 cm/week and head circumference 0.72 ± 0.14 cm/week. There were 3 medical NEC cases and 1 surgical NEC case. 22 infants (21%) were small for gestational age at birth. Overall, 45 infants (43%) had extrauterine growth restriction. Weight velocity was affected by day of fortification (p = 0.005) and day of full feeds (p = 0.02). Our cohort had significantly greater growth in weight and length compared to previous entirely human milk-fed cohorts. Conclusions A feeding protocol for infants ≤ 1250 g BW providing an exclusive human milk-based diet with early and rapid advancement of fortification leads to growth meeting targeted standards with a low rate of extrauterine growth restriction. Consistent

  9. The concept of adequate causation and Max Weber's comparative sociology of religion.

    PubMed

    Buss, A

    1999-06-01

    Max Weber's The Protestant Ethic and the Spirit of Capitalism, studied in isolation, shows mainly an elective affinity or an adequacy on the level of meaning between the Protestant ethic and the 'spirit' of capitalism. Here it is suggested that Weber's subsequent essays on 'The Economic Ethics of World Religions' are the result of his opinion that adequacy on the level of meaning needs and can be verified by causal adequacy. After some introductory remarks, particularly on elective affinity, the paper tries to develop the concept of adequate causation and the related concept of objective possibility on the basis of the work of v. Kries on whom Weber heavily relied. In the second part, this concept is used to show how the study of the economic ethics of India, China, Rome and orthodox Russia can support the thesis that the 'spirit' of capitalism, although it may not have been caused by the Protestant ethic, was perhaps adequately caused by it. PMID:15260028

  10. A novel strategy to overcome resistance in stent placement at lesion site after adequate predilatation.

    PubMed

    Jain, D; Tolg, R; Katus, H A; Richardt, G

    2000-12-01

    Resistance was encountered in passing a 3 x 18 mm stent across a lesion in the proximal left anterior descending coronary artery. Successive changes in stent with repeated balloon dilatations did not succeed. Finally, a 9 mm stent was passed across the lesion and deployed at the site of maximal resistance. The 18 mm stent was then placed through this stent. A novel strategy to overcome resistance in the stent passage through the lesion after an adequate balloon predilatation is reported. PMID:11103034

  11. Myth 19: Is Advanced Placement an Adequate Program for Gifted Students?

    ERIC Educational Resources Information Center

    Gallagher, Shelagh A.

    2009-01-01

    Is it a myth that Advanced Placement (AP) is an adequate program for gifted students? AP is so covered with myths and assumptions that it is hard to get a clear view of the issues. In this article, the author finds the answer about AP by looking at current realties. First, AP is hard for gifted students to avoid. Second, AP never was a program…

  12. Global risk assessment of aflatoxins in maize and peanuts: are regulatory standards adequately protective?

    PubMed

    Wu, Felicia; Stacy, Shaina L; Kensler, Thomas W

    2013-09-01

    The aflatoxins are a group of fungal metabolites that contaminate a variety of staple crops, including maize and peanuts, and cause an array of acute and chronic human health effects. Aflatoxin B1 in particular is a potent liver carcinogen, and hepatocellular carcinoma (HCC) risk is multiplicatively higher for individuals exposed to both aflatoxin and chronic infection with hepatitis B virus (HBV). In this work, we sought to answer the question: do current aflatoxin regulatory standards around the world adequately protect human health? Depending upon the level of protection desired, the answer to this question varies. Currently, most nations have a maximum tolerable level of total aflatoxins in maize and peanuts ranging from 4 to 20 ng/g. If the level of protection desired is that aflatoxin exposures would not increase lifetime HCC risk by more than 1 in 100,000 cases in the population, then most current regulatory standards are not adequately protective even if enforced, especially in low-income countries where large amounts of maize and peanuts are consumed and HBV prevalence is high. At the protection level of 1 in 10,000 lifetime HCC cases in the population, however, almost all aflatoxin regulations worldwide are adequately protective, with the exception of several nations in Africa and Latin America. PMID:23761295
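A back-of-envelope version of the comparison described in this abstract can be sketched as follows. Every number below is an ASSUMED illustrative value (the potency slopes, intake figures, and HBV prevalences are not taken from this article): risk scales with dose at the regulatory limit times a carcinogenic potency that is much higher for chronic HBV carriers.

```python
# Illustrative lifetime-risk arithmetic under a maximum-tolerable-level
# standard, assuming all food is contaminated at the limit. The potency
# slopes below (cancers/yr per 100,000 people per ng AFB1/kg bw/day) are
# assumed for illustration, not quoted from the article.
POTENCY_HBV_NEG = 0.01   # assumed, HBV-negative individuals
POTENCY_HBV_POS = 0.30   # assumed, chronic HBV carriers

def lifetime_hcc_risk(standard_ng_g, intake_g_day, bw_kg, hbv_prev, years=70):
    """Lifetime aflatoxin-attributable HCC risk per person."""
    dose = standard_ng_g * intake_g_day / bw_kg          # ng/kg bw/day
    potency = hbv_prev * POTENCY_HBV_POS + (1 - hbv_prev) * POTENCY_HBV_NEG
    return dose * potency * years / 100_000              # lifetime probability

# Hypothetical high-exposure setting: 20 ng/g limit, heavy maize intake,
# 15% HBV prevalence; and a hypothetical low-exposure setting.
r_high = lifetime_hcc_risk(20, 400, 60, 0.15)
r_low = lifetime_hcc_risk(4, 50, 70, 0.01)
print(f"lifetime risk, high-exposure setting: {r_high:.1e}")
print(f"lifetime risk, low-exposure setting:  {r_low:.1e}")
print("exceeds 1-in-100,000?", r_high > 1e-5, r_low > 1e-5)
```

Under these assumed inputs both settings exceed the 1-in-100,000 benchmark while the low-exposure setting stays under 1-in-10,000, mirroring the abstract's qualitative conclusion that the stricter protection level is rarely met.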

  13. Global Risk Assessment of Aflatoxins in Maize and Peanuts: Are Regulatory Standards Adequately Protective?

    PubMed Central

    Wu, Felicia

    2013-01-01

    The aflatoxins are a group of fungal metabolites that contaminate a variety of staple crops, including maize and peanuts, and cause an array of acute and chronic human health effects. Aflatoxin B1 in particular is a potent liver carcinogen, and hepatocellular carcinoma (HCC) risk is multiplicatively higher for individuals exposed to both aflatoxin and chronic infection with hepatitis B virus (HBV). In this work, we sought to answer the question: do current aflatoxin regulatory standards around the world adequately protect human health? Depending upon the level of protection desired, the answer to this question varies. Currently, most nations have a maximum tolerable level of total aflatoxins in maize and peanuts ranging from 4 to 20 ng/g. If the level of protection desired is that aflatoxin exposures would not increase lifetime HCC risk by more than 1 in 100,000 cases in the population, then most current regulatory standards are not adequately protective even if enforced, especially in low-income countries where large amounts of maize and peanuts are consumed and HBV prevalence is high. At the protection level of 1 in 10,000 lifetime HCC cases in the population, however, almost all aflatoxin regulations worldwide are adequately protective, with the exception of several nations in Africa and Latin America. PMID:23761295

  14. Self-esteem, social support, and satisfaction differences in women with adequate and inadequate prenatal care.

    PubMed

    Higgins, P; Murray, M L; Williams, E M

    1994-03-01

    This descriptive, retrospective study examined levels of self-esteem, social support, and satisfaction with prenatal care in 193 low-risk postpartal women who obtained adequate and inadequate care. The participants were drawn from a regional medical center and university teaching hospital in New Mexico. A demographic questionnaire, the Coopersmith self-esteem inventory, the personal resource questionnaire part 2, and the prenatal care satisfaction inventory were used for data collection. Significant differences were found in the level of education, income, insurance, and ethnicity between women who received adequate prenatal care and those who received inadequate care. Women in both the adequate and inadequate prenatal care groups were typically high school graduates with total family incomes of $10,000 to $19,999 per year. Statistically significant differences were found in self-esteem, social support, and satisfaction between the two groups of women. Strategies to enhance self-esteem and social support have to be developed to reach women at risk for receiving inadequate prenatal care. PMID:8155221

  15. Three not adequately understood lunar phenomena investigated by the wave planetology

    NASA Astrophysics Data System (ADS)

    Kochemasov, G. G.

    2009-04-01

    Lunar science, notwithstanding rather numerous studies over the last 50 years, still debates some important issues. Three of them concern the origin of mascons, the deepest but low-ferruginous South Pole-Aitken depression, and the strange character of the frequency-crater size curve. Prevailing approaches are mainly based on impacts having shaped the present geomorphology of the Moon. However, they practically ignore the antipodality of basins and maria and the complex character of the frequency-crater size curve, which obviously implies the involvement of different sources and processes in crater formation. Attempts to find impactor sources in various, sometimes very remote, parts of the Solar system are too artificial; besides, they do not explain the very intensive, Moon-like cratering of Mercury. Saturation of the lunar surface by craters of ~70-km diameter is very strange for random impacts from any source; finding a time interval for this saturation is difficult if not impossible because it affects formations of various ages. Lunar basins and maria completely contradict a classical frequency-crater size curve. Their presumed (and measured) different ages make the existence of one specialized impactor source dubious. So, if one accepts the impact process as the only process responsible for cratering (the development of ring forms), then the real mess in crater statistics and timing will never be overcome. The wave planetology [1-3 & others], tested on many planets and satellites of the Solar system, has proved to be real. In the case of the Moon it can help in answering the above questions. First of all, it should be admitted that the complex lunar crater (ring form) statistics are due to a superposition and mixing of two main processes (a minor involvement of volcanic features is also present): impacts and wave

  16. Is fine-needle aspiration diagnosis of malignancy adequate prior to major lung resections including pneumonectomy?

    PubMed

    Khorsandi, Maziar; Shaikhrezai, Kasra; Wallace, William; Brackenbury, Edward

    2012-08-01

    A best evidence topic in thoracic surgery was written according to a structured protocol. The question addressed was whether a fine-needle aspiration (FNA) diagnosis is of sufficient reliability for the diagnosis of lung cancer prior to a major lung resection. Altogether, 112 papers were found using the reported search, of which 13 papers presented the best evidence to answer the clinical question. The author, journal, date and country of publication, patient group studied, study type, relevant outcomes, results and study weaknesses of these papers are tabulated. The tabulated studies include two meta-analyses, one systematic review, one randomized controlled trial (RCT) and nine cohort studies. The specificity reported for FNA in the diagnosis and staging of lung cancer ranged from 96.2 to 100%. One meta-analysis reported a specificity of 97%. Another meta-analysis reported a specificity of 98.8%. A systematic review reported a specificity of 97%. An RCT reported a specificity of 96.2-100%. We conclude that the FNA for lung cancer is reported to be highly specific prior to major lung resection with a very low false positive rate. However, although a false positive may occasionally be acceptable in lobectomies, where the lobes are often removed without histology, all steps should be taken to avoid a false positive result in pneumonectomy considering the serious consequences of embarking upon such an operation in the small number of patients with a false positive result, and we recommend that a positive FNA result should be confirmed by means of alternative sampling methods. We also acknowledge that obtaining an additional biopsy specimen would add to the risk of morbidity and costs; therefore, any benefits should be weighed against risks and additional costs. PMID:22611184

  17. A Brief, Low-Cost, Theory-Based Intervention to Promote Dual Method Use by Black and Latina Female Adolescents: A Randomized Clinical Trial

    ERIC Educational Resources Information Center

    Roye, Carol; Perlmutter Silverman, Paula; Krauss, Beatrice

    2007-01-01

    HIV/AIDS disproportionately affects young women of color. Young women who use hormonal contraception are less likely to use condoms. Brief, inexpensive HIV-prevention interventions are needed for high-volume clinics. This study was a randomized clinical trial of two interventions: (a) a video made for this study and (b) an adaptation of Project…

  18. The Totally Extraperitoneal Method versus Lichtenstein's Technique for Inguinal Hernia Repair: A Systematic Review with Meta-Analyses and Trial Sequential Analyses of Randomized Clinical Trials

    PubMed Central

    Koning, G. G.; Wetterslev, J.; van Laarhoven, C. J. H. M.; Keus, F.

    2013-01-01

    Background Lichtenstein's technique is considered the reference technique for inguinal hernia repair. Recent trials suggest that the totally extraperitoneal (TEP) technique may lead to reduced proportions of chronic pain. A systematic review evaluating the benefits and harms of the TEP compared with Lichtenstein's technique is needed. Methodology/Principal Findings The review was performed according to the ‘Cochrane Handbook for Systematic Reviews’. Searches were conducted until January 2012. Patients with primary uni- or bilateral inguinal hernias were included. Only trials randomising patients to TEP and Lichtenstein were included. Bias evaluation and trial sequential analysis (TSA) were performed. The error matrix was constructed to minimise the risk of systematic and random errors. Thirteen trials randomised 5404 patients. There was no significant effect of the TEP compared with the Lichtenstein technique on the number of patients with chronic pain (random-effects model risk ratio (RR) 0.80; 95% confidence interval (CI) 0.61 to 1.04; p = 0.09). There was also no significant effect on the number of patients with recurrences (random-effects model RR 1.41; 95% CI 0.72 to 2.78; p = 0.32), and the TEP technique may or may not be associated with fewer severe adverse events (random-effects model RR 0.91; 95% CI 0.73 to 1.12; p = 0.37). TSA showed that the required information size was far from being reached for patient-important outcomes. Conclusions/Significance TEP versus Lichtenstein for inguinal hernia repair has been evaluated by 13 trials with high risk of bias. The review with meta-analyses, TSA and the error matrix approach shows no conclusive evidence of a difference between TEP and Lichtenstein on the primary outcomes of chronic pain, recurrences, and severe adverse events. PMID:23349689

  19. Adequate Iodine Status in New Zealand School Children Post-Fortification of Bread with Iodised Salt

    PubMed Central

    Jones, Emma; McLean, Rachael; Davies, Briar; Hawkins, Rochelle; Meiklejohn, Eva; Ma, Zheng Feei; Skeaff, Sheila

    2016-01-01

    Iodine deficiency re-emerged in New Zealand in the 1990s, prompting the mandatory fortification of bread with iodised salt from 2009. This study aimed to determine the iodine status of New Zealand children when the fortification of bread was well established. A cross-sectional survey of children aged 8–10 years was conducted in the cities of Auckland and Christchurch, New Zealand, from March to May 2015. Children provided a spot urine sample for the determination of urinary iodine concentration (UIC), a fingerprick blood sample for thyroglobulin (Tg) concentration, and completed a questionnaire ascertaining socio-demographic information that also included an iodine-specific food frequency questionnaire (FFQ). The FFQ was used to estimate iodine intake from all main food sources including bread and iodised salt. The median UIC for all children (n = 415) was 116 μg/L (females 106 μg/L, males 131 μg/L), indicative of adequate iodine status according to the World Health Organisation (WHO, i.e., median UIC of 100–199 μg/L). The median Tg concentration was 8.7 μg/L, which was <10 μg/L, confirming adequate iodine status. There was a significant difference in UIC by sex (p = 0.001) and ethnicity (p = 0.006). The mean iodine intake from the food-only model was 65 μg/day. Bread contributed 51% of total iodine intake in the food-only model, providing a mean iodine intake of 35 μg/day. The mean iodine intake from the food-plus-iodised salt model was 101 μg/day. In conclusion, the results of this study confirm that the iodine status in New Zealand school children is now adequate. PMID:27196925
