Science.gov

Sample records for adequate randomization methods

  1. Are shear force methods adequately reported?

    PubMed

    Holman, Benjamin W B; Fowler, Stephanie M; Hopkins, David L

    2016-09-01

    This study aimed to determine the detail to which shear force (SF) protocols and methods have been reported in the scientific literature between 2009 and 2015. Articles (n=734) published in peer-reviewed animal and food science journals and limited to only those testing the SF of unprocessed and non-fabricated mammal meats were evaluated. It was found that most of these SF articles originated in Europe (35.3%), investigated bovine species (49.0%), measured m. longissimus samples (55.2%), used tenderometers manufactured by Instron (31.2%), and equipped with Warner-Bratzler blades (68.8%). SF samples were also predominantly thawed prior to cooking (37.1%) and cooked sous vide, using a water bath (50.5%). Information pertaining to blade crosshead speed (47.5%), recorded SF resistance (56.7%), muscle fibre orientation when tested (49.2%), sub-section or core dimension (21.8%), end-point temperature (29.3%), and other factors contributing to SF variation was often omitted. This failure to report basic methodology diminishes repeatability and accurate SF interpretation, and must therefore be rectified. PMID:27107727

  3. Improved ASTM G72 Test Method for Ensuring Adequate Fuel-to-Oxidizer Ratios

    NASA Technical Reports Server (NTRS)

    Juarez, Alfredo; Harper, Susana Tapia

    2016-01-01

    The ASTM G72/G72M-15 Standard Test Method for Autogenous Ignition Temperature of Liquids and Solids in a High-Pressure Oxygen-Enriched Environment is currently used to evaluate materials for ignition susceptibility driven by exposure to external heat in an enriched oxygen environment. Testing performed on highly volatile liquids such as cleaning solvents has proven problematic due to inconsistent test results (non-ignitions). Non-ignition results can be misinterpreted as favorable oxygen compatibility, although they are more likely associated with inadequate fuel-to-oxidizer ratios. Forced evaporation during purging and inadequate sample size were identified as two potential causes of insufficient sample material being available during testing. In an effort to maintain adequate fuel-to-oxidizer ratios within the reaction vessel during testing, several parameters were considered, including sample size, pretest sample chilling, pretest purging, and test pressure. Tests on a variety of solvents exhibiting a range of volatilities are presented in this paper. A proposed improvement to the standard test protocol resulting from this evaluation is also presented. The improved protocol outlines an incremental-step method for determining optimal conditions, using increased sample sizes while respecting test-system safety limits. The proposed improved test method increases confidence in results obtained with the ASTM G72 autogenous ignition temperature test method and can aid in the oxygen compatibility assessment of highly volatile liquids and other conditions that may lead to false non-ignition results.

  4. Ray methods in random media

    NASA Technical Reports Server (NTRS)

    Hornstein, J.; Fainberg, J.

    1981-01-01

    We review ray-optical methods of analyzing short-wavelength propagation in random media. The advantages and limitations of ray methods are discussed, and results of the statistical theory of ray segment fluctuations pertinent to ray tracing are summarized. The standard method of Monte Carlo ray tracing is compared to a new method which takes into account recent results on the statistics of ray segment fluctuations.

  5. Randomization methods in emergency setting trials: a descriptive review

    PubMed Central

    Moe‐Byrne, Thirimon; Oddie, Sam; McGuire, William

    2015-01-01

    Background Quasi‐randomization might expedite recruitment into trials in emergency care settings but may also introduce selection bias. Methods We searched the Cochrane Library and other databases for systematic reviews of interventions in emergency medicine or urgent care settings. We assessed selection bias (baseline imbalances) in prognostic indicators between treatment groups in trials using true randomization versus trials using quasi‐randomization. Results Seven reviews contained 16 trials that used true randomization and 11 that used quasi‐randomization. Baseline group imbalance was identified in four trials using true randomization (25%) and in two quasi‐randomized trials (18%). Of the four truly randomized trials with imbalance, three concealed treatment allocation adequately. Clinical heterogeneity and poor reporting limited the assessment of trial recruitment outcomes. Conclusions We did not find strong or consistent evidence that quasi‐randomization is associated with selection bias more often than true randomization. High risk of bias judgements for quasi‐randomized emergency studies should therefore not be assumed in systematic reviews. Clinical heterogeneity across trials within reviews, coupled with limited availability of relevant trial accrual data, meant it was not possible to adequately explore the possibility that true randomization might result in slower trial recruitment rates, or the recruitment of less representative populations. © 2015 The Authors. Research Synthesis Methods published by John Wiley & Sons, Ltd. PMID:26333419

  6. [Adequate application of quantitative and qualitative statistic analytic methods in acupuncture clinical trials].

    PubMed

    Tan, Ming T; Liu, Jian-ping; Lao, Lixing

    2012-08-01

    Recently, proper use of the statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis method has been widely used in acupuncture and TCM clinical trials, while the between-group quantitative analysis methods on clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences were discussed based on the data of RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis while qualitative analysis can be a secondary criterion for analysis. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.

  7. Elementary Science Methods Courses and the "National Science Education Standards": Are We Adequately Preparing Teachers?

    ERIC Educational Resources Information Center

    Smith, Leigh K.; Gess-Newsome, Julie

    2004-01-01

    Despite the apparent lack of universally accepted goals or objectives for elementary science methods courses, teacher educators nationally are autonomously designing these classes to prepare prospective teachers to teach science. It is unclear, however, whether science methods courses are preparing teachers to teach science effectively or to…

  8. Are adequate methods available to detect protist parasites on fresh produce?

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Human parasitic protists such as Cryptosporidium, Giardia and microsporidia contaminate a variety of fresh produce worldwide. Existing detection methods lack sensitivity and specificity for most foodborne parasites. Furthermore, detection has been problematic because these parasites adhere tenacious...

  9. Functional methods for waves in random media

    NASA Technical Reports Server (NTRS)

    Chow, P. L.

    1981-01-01

    Some basic ideas in functional methods for waves in random media are illustrated through a simple random differential equation. These methods are then generalized to solve certain random parabolic equations via an exponential representation given by the Feynman-Kac formula. It is shown that these functional methods are applicable to a number of problems in random wave propagation. They include the forward-scattering approximation in Gaussian white-noise media; the solution of the optical beam propagation problem by a phase-integral method; the high-frequency scattering by bounded random media; and a derivation of approximate moment equations from the functional integral representation.
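
The potential-free case of the Feynman-Kac representation invoked above can be sketched in a few lines of Monte Carlo: the heat-equation solution is the expectation of the initial data evaluated at the endpoint of a Brownian path. This is a minimal illustration of the functional-integral idea; the function names and parameter values are ours, not the paper's.

```python
import math
import random

def heat_feynman_kac(f, x, t, kappa, n_paths, rng):
    """Estimate u(t, x) solving u_t = kappa * u_xx with u(0, .) = f via
    the zero-potential Feynman-Kac formula: u(t, x) = E[f(x + W)], where
    W is Gaussian with variance 2*kappa*t."""
    sigma = math.sqrt(2.0 * kappa * t)
    total = sum(f(x + sigma * rng.gauss(0.0, 1.0)) for _ in range(n_paths))
    return total / n_paths

# For f = cos, the exact solution is exp(-kappa * t) * cos(x),
# so the sampled estimate can be checked against it directly.
rng = random.Random(0)
estimate = heat_feynman_kac(math.cos, 0.0, 1.0, 0.5, 20000, rng)
```

With 20,000 paths the Monte Carlo error is of order 0.003, so the estimate sits close to the exact value exp(-0.5) at x = 0.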

  11. Quasi-Isotropic Approximation of Geometrical Optics Method as Adequate Electrodynamical Basis for Tokamak Plasma Polarimetry

    NASA Astrophysics Data System (ADS)

    Bieg, Bohdan; Chrzanowski, Janusz; Kravtsov, Yury A.; Orsitto, Francesco

    Basic principles and recent findings of the quasi-isotropic approximation (QIA) of the geometrical optics method are presented in a compact manner. QIA was developed in 1969 to describe electromagnetic waves in weakly anisotropic media. QIA represents the wave field as a power series in two small parameters, one of which is the traditional geometrical optics parameter, equal to the ratio of the wavelength to the plasma characteristic scale, and the other of which is the largest component of the anisotropy tensor. As a result, QIA is ideally suited to tokamak polarimetry/interferometry systems in the submillimeter range, where plasma manifests the properties of a weakly anisotropic medium.

  12. Estimating the benefits of maintaining adequate lake levels to homeowners using the hedonic property method

    NASA Astrophysics Data System (ADS)

    Loomis, John; Feldman, Marvin

    2003-09-01

    The hedonic property method was used to estimate residents' economic benefits from maintaining high and stable lake levels at Lake Almanor, California. Nearly a thousand property transactions over a 14-year period from 1987 to 2001 were analyzed. The linear hedonic property regression explained more than 60% of the variation in house prices. Property prices were negatively and significantly related to the number of linear feet of exposed lake shoreline. Each additional foot of exposed shoreline reduces the property price by $108 to $119. A view of the lake added nearly $31,000 to house prices, while lakefront properties sold for $209,000 more than non-lakefront properties.
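
A hedonic property regression of this kind is, at bottom, ordinary least squares of sale price on property attributes. The sketch below runs on synthetic data; the attribute names and coefficient values are invented for illustration, not taken from the Lake Almanor analysis.

```python
import numpy as np

# Synthetic hedonic-style data: price depends on exposed shoreline
# footage and a lake-view indicator, plus noise (all values invented).
rng = np.random.default_rng(0)
n = 500
exposed_ft = rng.uniform(0.0, 50.0, n)      # feet of exposed shoreline
lake_view = rng.integers(0, 2, n)           # 1 if the lot has a lake view
price = (250_000.0 - 110.0 * exposed_ft + 31_000.0 * lake_view
         + rng.normal(0.0, 5_000.0, n))

# Ordinary least squares: design matrix [intercept, shoreline, view].
X = np.column_stack([np.ones(n), exposed_ft, lake_view])
coef, *_ = np.linalg.lstsq(X, price, rcond=None)
# coef[1] estimates the marginal price effect of one extra foot of
# exposed shoreline; coef[2] the premium for a lake view.
```

The fitted coefficients recover the (synthetic) marginal effects, mirroring how the study reads off the per-foot shoreline effect and view premium from its regression.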

  13. Random Walk Method for Potential Problems

    NASA Technical Reports Server (NTRS)

    Krishnamurthy, T.; Raju, I. S.

    2002-01-01

    A local Random Walk Method (RWM) for potential problems governed by Laplace's and Poisson's equations is developed for two- and three-dimensional problems. The RWM is implemented and demonstrated in a multiprocessor parallel environment on a Beowulf cluster of computers. A speed gain of 16 is achieved as the number of processors is increased from 1 to 23.
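
The principle behind a random walk method for a potential problem can be shown for Laplace's equation on a grid: the discrete harmonic solution at an interior node equals the expected boundary value reached by a symmetric random walk started there. The toy serial sketch below illustrates the principle only; it is not the parallel RWM of the paper.

```python
import random

def laplace_walk(nx, ny, start, boundary, n_walks, rng):
    """Estimate the solution of Laplace's equation at a grid node by
    averaging the boundary values hit by symmetric random walks."""
    total = 0.0
    for _ in range(n_walks):
        x, y = start
        while 0 < x < nx - 1 and 0 < y < ny - 1:
            dx, dy = rng.choice(((1, 0), (-1, 0), (0, 1), (0, -1)))
            x, y = x + dx, y + dy
        total += boundary(x, y)          # walk has exited: record value
    return total / n_walks

# 11 x 11 grid: u = 1 on the top edge, 0 on the other three sides.
# By symmetry the exact value at the centre node is 0.25.
n = 11
rng = random.Random(0)
estimate = laplace_walk(n, n, (n // 2, n // 2),
                        lambda x, y: 1.0 if y == n - 1 else 0.0,
                        5000, rng)
```

Because each walk is independent, this estimator parallelizes trivially, which is what the Beowulf-cluster implementation above exploits.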

  14. Adverse prognostic value of peritumoral vascular invasion: is it abrogated by adequate endocrine adjuvant therapy? Results from two International Breast Cancer Study Group randomized trials of chemoendocrine adjuvant therapy for early breast cancer

    PubMed Central

    Viale, G.; Giobbie-Hurder, A.; Gusterson, B. A.; Maiorano, E.; Mastropasqua, M. G.; Sonzogni, A.; Mallon, E.; Colleoni, M.; Castiglione-Gertsch, M.; Regan, M. M.; Brown, R. W.; Golouh, R.; Crivellari, D.; Karlsson, P.; Öhlschlegel, C.; Gelber, R. D.; Goldhirsch, A.; Coates, A. S.

    2010-01-01

    Background: Peritumoral vascular invasion (PVI) may assist in assigning optimal adjuvant systemic therapy for women with early breast cancer. Patients and methods: Patients participated in two International Breast Cancer Study Group randomized trials testing chemoendocrine adjuvant therapies in premenopausal (trial VIII) or postmenopausal (trial IX) node-negative breast cancer. PVI was assessed by institutional pathologists and/or central review on hematoxylin–eosin-stained slides in 99% of patients (analysis cohort 2754 patients, median follow-up >9 years). Results: PVI, present in 23% of the tumors, was associated with higher grade tumors and larger tumor size (trial IX only). Presence of PVI increased locoregional and distant recurrence and was significantly associated with poorer disease-free survival. The adverse prognostic impact of PVI in trial VIII was limited to premenopausal patients with endocrine-responsive tumors randomized to therapies not containing goserelin, and conversely the beneficial effect of goserelin was limited to patients whose tumors showed PVI. In trial IX, all patients received tamoxifen: the adverse prognostic impact of PVI was limited to patients with receptor-negative tumors regardless of chemotherapy. Conclusion: Adequate endocrine adjuvant therapy appears to abrogate the adverse impact of PVI in node-negative disease, while PVI may identify patients who will benefit particularly from adjuvant therapy. PMID:19633051

  15. Are Power Analyses Reported with Adequate Detail? Evidence from the First Wave of Group Randomized Trials Funded by the Institute of Education Sciences

    ERIC Educational Resources Information Center

    Spybrook, Jessaca

    2008-01-01

    This study examines the reporting of power analyses in the group randomized trials funded by the Institute of Education Sciences from 2002 to 2006. A detailed power analysis provides critical information that allows reviewers to (a) replicate the power analysis and (b) assess whether the parameters used in the power analysis are reasonable.…

  16. Individual Differences Methods for Randomized Experiments

    ERIC Educational Resources Information Center

    Tucker-Drob, Elliot M.

    2011-01-01

    Experiments allow researchers to randomly vary the key manipulation, the instruments of measurement, and the sequences of the measurements and manipulations across participants. To date, however, the advantages of randomized experiments to manipulate both the aspects of interest and the aspects that threaten internal validity have been primarily…

  17. Are the most distressing concerns of patients with inoperable lung cancer adequately assessed? A mixed-methods analysis.

    PubMed

    Tishelman, Carol; Lövgren, Malin; Broberger, Eva; Hamberg, Katarina; Sprangers, Mirjam A G

    2010-04-10

    PURPOSE Standardized questionnaires for patient-reported outcomes are generally composed of specified predetermined items, although other areas may also cause patients distress. We therefore studied reports of what was most distressing for 343 patients with inoperable lung cancer (LC) at six time points during the first year postdiagnosis and how these concerns were assessed by three quality-of-life and symptom questionnaires. PATIENTS AND METHODS Qualitative analysis of patients' responses to the question "What do you find most distressing at present?" generated 20 categories, with 17 under the dimensions of "bodily distress," "life situation with LC," and "iatrogenic distress." Descriptive and inferential statistical analyses were conducted. RESULTS The majority of statements reported as most distressing related to somatic and psychosocial problems, with 26% of patients reporting an overarching form of distress instead of specific problems at some time point. Twenty-seven percent reported some facet of their contact with the health care system as causing them most distress. While 55% to 59% of concerns reported as most distressing were clearly assessed by the European Organisation for Research and Treatment for Cancer Quality of Life Questionnaire Core-30 and Lung Cancer Module instruments, the Memorial Symptom Assessment Scale, and the modified Distress Screening Tool, iatrogenic distress is not specifically targeted by any of the three instruments examined. CONCLUSION Using this approach, several distressing issues were found to be commonly reported by this patient group but were not assessed by standardized questionnaires. This highlights the need to carefully consider choice of instrument in relation to study objectives and characteristics of the sample investigated and to consider complementary means of assessment in clinical practice.

  18. Convergence of a random walk method for the Burgers equation

    SciTech Connect

    Roberts, S.

    1985-10-01

    In this paper we consider a random walk algorithm for the solution of Burgers' equation. The algorithm uses the method of fractional steps. The non-linear advection term of the equation is solved by advecting "fluid" particles in a velocity field induced by the particles. The diffusion term of the equation is approximated by adding an appropriate random perturbation to the positions of the particles. Though the algorithm is inefficient as a method for solving Burgers' equation, it does model a similar method, the random vortex method, which has been used extensively to solve the incompressible Navier-Stokes equations. The purpose of this paper is to demonstrate the strong convergence of our random walk method and so provide a model for the proof of convergence for more complex random walk algorithms; for instance, the random vortex method without boundaries.
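
The fractional-step idea described above can be sketched in a few lines: an advection sub-step moves each particle with its velocity, and a diffusion sub-step adds a Gaussian displacement whose variance matches the viscous term. This is a schematic of the idea only, not the algorithm Roberts analyzes.

```python
import math
import random

def fractional_step(positions, velocities, dt, nu, rng):
    """One fractional step of a random-walk scheme for Burgers' equation:
    advect each particle with its own velocity, then model the viscosity
    nu by a Gaussian random increment with variance 2*nu*dt."""
    out = []
    for x, u in zip(positions, velocities):
        x = x + u * dt                                     # advection
        x = x + rng.gauss(0.0, math.sqrt(2.0 * nu * dt))   # diffusion
        out.append(x)
    return out
```

With all velocities zero the scheme reduces to pure diffusion, so after time T the particle positions should have variance close to 2*nu*T, which gives a simple consistency check.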

  19. Replica methods for loopy sparse random graphs

    NASA Astrophysics Data System (ADS)

    Coolen, A. C. C.

    2016-03-01

    I report on the development of a novel statistical mechanical formalism for the analysis of random graphs with many short loops, and processes on such graphs. The graphs are defined via maximum entropy ensembles, in which both the degrees (via hard constraints) and the adjacency matrix spectrum (via a soft constraint) are prescribed. The sum over graphs can be done analytically, using a replica formalism with complex replica dimensions. All known results for tree-like graphs are recovered in a suitable limit. For loopy graphs, the emerging theory has an appealing and intuitive structure, suggests how message passing algorithms should be adapted, and what is the structure of theories describing spin systems on loopy architectures. However, the formalism is still largely untested, and may require further adjustment and refinement. This paper is dedicated to the memory of our colleague and friend Jun-Ichi Inoue, with whom the author has had the great pleasure and privilege of collaborating.

  20. A simplified method for random vibration analysis of structures with random parameters

    NASA Astrophysics Data System (ADS)

    Ghienne, Martin; Blanzé, Claude

    2016-09-01

    Piezoelectric patches with adapted electrical circuits or viscoelastic dissipative materials are two solutions particularly adapted to reduce vibration of light structures. To accurately design these solutions, it is necessary to describe precisely the dynamical behaviour of the structure. It may quickly become computationally intensive to describe robustly this behaviour for a structure with nonlinear phenomena, such as contact or friction for bolted structures, and uncertain variations of its parameters. The aim of this work is to propose a non-intrusive reduced stochastic method to characterize robustly the vibrational response of a structure with random parameters. Our goal is to characterize the eigenspace of linear systems with dynamic properties considered as random variables. This method is based on a separation of random aspects from deterministic aspects and allows us to estimate the first central moments of each random eigenfrequency with a single deterministic finite elements computation. The method is applied to a frame with several Young's moduli modeled as random variables. This example could be expanded to a bolted structure including piezoelectric devices. The method needs to be enhanced when random eigenvalues are closely spaced. An indicator with no additional computational cost is proposed to characterize the "proximity" of two random eigenvalues.
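
The baseline that such non-intrusive methods aim to beat is brute-force Monte Carlo over the random parameters: sample the modulus, solve the deterministic eigenproblem, and accumulate moments of the eigenfrequencies. The 2-DOF spring-mass sketch below is purely illustrative (our matrices and distributions, not the paper's frame model), and it is exactly the sampling loop the paper's single-computation method is designed to avoid.

```python
import numpy as np

# Monte Carlo moments of random eigenfrequencies for a 2-DOF chain with
# unit masses and stiffness proportional to a random Young's modulus.
rng = np.random.default_rng(0)
freqs = []
for _ in range(2000):
    E = rng.normal(1.0, 0.05)            # random modulus, 5% spread
    k = 100.0 * E                        # stiffness scales with E
    K = np.array([[2.0 * k, -k],
                  [-k, k]])
    w2 = np.linalg.eigvalsh(K)           # unit masses, so M = identity
    freqs.append(np.sqrt(w2))            # natural frequencies (rad/s)
freqs = np.array(freqs)
mean_f, std_f = freqs.mean(axis=0), freqs.std(axis=0)
```

For this chain the deterministic eigenvalues are k(3 ± sqrt(5))/2, so the mean frequencies land near 6.18 and 16.18 rad/s, with a spread of roughly half the 5% modulus spread (the square-root effect).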

  1. Genetic algorithms as global random search methods

    NASA Technical Reports Server (NTRS)

    Peck, Charles C.; Dhawan, Atam P.

    1995-01-01

    Genetic algorithm behavior is described in terms of the construction and evolution of the sampling distributions over the space of candidate solutions. This novel perspective is motivated by analysis indicating that the schema theory is inadequate for completely and properly explaining genetic algorithm behavior. Based on the proposed theory, it is argued that the similarities of candidate solutions should be exploited directly, rather than encoding candidate solutions and then exploiting their similarities. Proportional selection is characterized as a global search operator, and recombination is characterized as the search process that exploits similarities. Sequential algorithms and many deletion methods are also analyzed. It is shown that by properly constraining the search breadth of recombination operators, convergence of genetic algorithms to a global optimum can be ensured.
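
Proportional selection acting as a global search operator, with recombination exploiting similarities, can be illustrated by a minimal genetic algorithm on the classic one-max problem. This is an illustrative toy, not the sampling-distribution analysis of the paper; the elitism step is our addition to stabilize the small example.

```python
import random

def one_max(bits):
    """Fitness: number of 1-bits (maximum = chromosome length)."""
    return sum(bits)

def proportional_select(pop, fits, rng):
    """Roulette-wheel (proportional) selection of one parent."""
    r = rng.uniform(0.0, sum(fits))
    acc = 0.0
    for ind, f in zip(pop, fits):
        acc += f
        if acc >= r:
            return ind
    return pop[-1]

def ga(length=20, pop_size=40, generations=60, p_mut=0.05, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    best = max(pop, key=one_max)
    for _ in range(generations):
        fits = [one_max(ind) for ind in pop]
        new_pop = [best[:]]                        # elitism: keep the best
        while len(new_pop) < pop_size:
            a = proportional_select(pop, fits, rng)
            b = proportional_select(pop, fits, rng)
            cut = rng.randrange(1, length)         # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if rng.random() < p_mut else g for g in child]
            new_pop.append(child)
        pop = new_pop
        best = max(pop + [best], key=one_max)
    return best
```

On this problem the population converges toward the all-ones optimum within a few dozen generations, consistent with the convergence claim above when recombination's search breadth is constrained.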

  3. Performance of statistical methods for analysing survival data in the presence of non-random compliance.

    PubMed

    Odondi, Lang'o; McNamee, Roseanne

    2010-12-20

    Noncompliance often complicates estimation of treatment efficacy from randomized trials. Under random noncompliance, per protocol analyses or even simple regression adjustments for noncompliance could be adequate for causal inference, but special methods are needed when noncompliance is related to risk. For survival data, Robins and Tsiatis introduced the semi-parametric structural Causal Accelerated Life Model (CALM) which allows time-dependent departures from randomized treatment in either arm and relates each observed event time to a potential event time that would have been observed if the control treatment had been given throughout the trial. Alternatively, Loeys and Goetghebeur developed a structural Proportional Hazards (C-Prophet) model for when there is all-or-nothing noncompliance in the treatment arm only. White et al. proposed a 'complier average causal effect' method for Proportional Hazards estimation which allows time-dependent departures from randomized treatment in the active arm. A time-invariant version of this estimator (CHARM) consists of a simple adjustment to the Intention-to-Treat hazard ratio estimate. We used simulation studies mimicking a randomized controlled trial of active treatment versus control with censored time-to-event data, and under both random and non-random time-dependent noncompliance, to evaluate performance of these methods in terms of 95 per cent confidence interval coverage, bias and root mean square errors (RMSE). All methods performed well in terms of bias, even the C-Prophet used after treating time-varying compliance as all-or-nothing. Coverage of the latter method, as implemented in Stata, was too low. The CALM method performed best in terms of bias and coverage but had the largest RMSE. PMID:20963732

  4. Random errors in interferometry with the least-squares method

    SciTech Connect

    Wang Qi

    2011-01-20

    This investigation analyzes random errors in interferometric surface profilers using the least-squares method when random noises are present. Two types of random noise are considered here: intensity noise and position noise. Two formulas have been derived for estimating the standard deviations of the surface height measurements: one is for estimating the standard deviation when only intensity noise is present, and the other is for estimating the standard deviation when only position noise is present. Measurements on simulated noisy interferometric data have been performed, and standard deviations of the simulated measurements have been compared with those theoretically derived. The relationships have also been discussed between random error and the wavelength of the light source and between random error and the amplitude of the interference fringe.
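
The least-squares estimator referred to here fits phase-shifted intensity samples of the form I_i = a + b*cos(phi + d_i). Writing c = b*cos(phi) and s = b*sin(phi) makes the model linear in (a, c, s), so the phase follows from an ordinary least-squares solve. This is the standard phase-shifting formulation, shown as a noise-free sketch, not the paper's exact experimental setup.

```python
import numpy as np

def phase_by_least_squares(intensities, shifts):
    """Least-squares phase estimate from samples I_i = a + b*cos(phi + d_i).
    Linearization: I_i = a + c*cos(d_i) - s*sin(d_i), with c = b*cos(phi)
    and s = b*sin(phi); then phi = atan2(s, c)."""
    d = np.asarray(shifts, dtype=float)
    A = np.column_stack([np.ones_like(d), np.cos(d), -np.sin(d)])
    (a, c, s), *_ = np.linalg.lstsq(A, np.asarray(intensities, dtype=float),
                                    rcond=None)
    return np.arctan2(s, c)

# Noise-free check: recover a known phase of 0.7 rad from 8 samples.
deltas = np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)
samples = 2.0 + 1.5 * np.cos(0.7 + deltas)
phi_hat = phase_by_least_squares(samples, deltas)
```

Adding intensity noise to `samples`, or jitter to `deltas`, and repeating the solve many times reproduces the two random-error scenarios the paper derives standard deviations for.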

  5. Random and systematic measurement errors in acoustic impedance as determined by the transmission line method

    NASA Technical Reports Server (NTRS)

    Parrott, T. L.; Smith, C. D.

    1977-01-01

    The effect of random and systematic errors associated with the measurement of normal incidence acoustic impedance in a zero-mean-flow environment was investigated by the transmission line method. The influence of random measurement errors in the reflection coefficients and pressure minima positions was investigated by computing fractional standard deviations of the normalized impedance. Both the standard techniques of random process theory and a simplified technique were used. Over a wavelength range of 68 to 10 cm random measurement errors in the reflection coefficients and pressure minima positions could be described adequately by normal probability distributions with standard deviations of 0.001 and 0.0098 cm, respectively. An error propagation technique based on the observed concentration of the probability density functions was found to give essentially the same results but with a computation time of about 1 percent of that required for the standard technique. The results suggest that careful experimental design reduces the effect of random measurement errors to insignificant levels for moderate ranges of test specimen impedance component magnitudes. Most of the observed random scatter can be attributed to lack of control by the mounting arrangement over mechanical boundary conditions of the test sample.
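
The error-propagation question studied here can be mimicked numerically: perturb a measured reflection coefficient with random noise and observe the scatter it induces in the normalized impedance z = (1 + R)/(1 - R). The sketch below is generic, with an assumed noise model on the magnitude only, and is not the paper's apparatus or its simplified propagation technique.

```python
import cmath
import random

def impedance_scatter(r_mag, r_phase, sigma_mag, n_trials, rng):
    """Monte Carlo propagation of random error in the reflection
    coefficient magnitude to the normalized impedance z = (1+R)/(1-R).
    Returns the sample mean and standard deviation of z."""
    zs = []
    for _ in range(n_trials):
        mag = r_mag + rng.gauss(0.0, sigma_mag)   # noisy |R|
        R = mag * cmath.exp(1j * r_phase)
        zs.append((1 + R) / (1 - R))
    mean = sum(zs) / n_trials
    sd = (sum(abs(z - mean) ** 2 for z in zs) / (n_trials - 1)) ** 0.5
    return mean, sd
```

With a magnitude noise of 0.001 (comparable to the reflection-coefficient standard deviation reported above), the induced impedance scatter stays small, illustrating why careful experimental design can render random measurement errors insignificant.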

  6. A random spatial sampling method in a rural developing nation

    PubMed Central

    2014-01-01

    Background Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. Methods We describe a stratified random sampling method using geographical information system (GIS) software and global positioning system (GPS) technology for application in a health survey in a rural region of Guatemala, as well as a qualitative study of the enumeration process. Results This method offers an alternative sampling technique that could reduce opportunities for bias in household selection compared to cluster methods. However, its use is subject to issues surrounding survey preparation, technological limitations and in-the-field household selection. Application of this method in remote areas will raise challenges surrounding the boundary delineation process, use and translation of satellite imagery between GIS and GPS, and household selection at each survey point in varying field conditions. This method favors household selection in denser urban areas and in new residential developments. Conclusions Random spatial sampling methodology can be used to survey a random sample of population in a remote region of a developing nation. Although this method should be further validated and compared with more established methods to determine its utility in social survey applications, it shows promise for use in developing nations with resource-challenged environments where detailed geographic and human census data are less available. PMID:24716473
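
The core of the sampling scheme, stripped of its GIS and GPS machinery, is drawing uniform random points within geographic strata. The sketch below is illustrative only: the bounding box is a made-up region, and the real method pairs these points with boundary layers, satellite imagery, and in-the-field household selection.

```python
import random

def stratified_spatial_sample(bbox, n_rows, n_cols, per_stratum, rng):
    """Split a bounding box (lon_min, lat_min, lon_max, lat_max) into an
    n_rows x n_cols grid of strata and draw uniform random points in
    each, so every stratum is represented in the sample."""
    lon0, lat0, lon1, lat1 = bbox
    dlon = (lon1 - lon0) / n_cols
    dlat = (lat1 - lat0) / n_rows
    points = []
    for r in range(n_rows):
        for c in range(n_cols):
            for _ in range(per_stratum):
                lon = rng.uniform(lon0 + c * dlon, lon0 + (c + 1) * dlon)
                lat = rng.uniform(lat0 + r * dlat, lat0 + (r + 1) * dlat)
                points.append((lon, lat))
    return points
```

In practice each generated point would be loaded into a GPS unit and the nearest household to the point surveyed, which is where the denser-area selection bias noted above enters.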

  7. Multi-Agent Methods for the Configuration of Random Nanocomputers

    NASA Technical Reports Server (NTRS)

    Lawson, John W.

    2004-01-01

    As computational devices continue to shrink, the cost of manufacturing such devices is expected to grow exponentially. One alternative to the costly, detailed design and assembly of conventional computers is to place the nano-electronic components randomly on a chip. The price for such a trivial assembly process is that the resulting chip would not be programmable by conventional means. In this work, we show that such random nanocomputers can be adaptively programmed using multi-agent methods. This is accomplished through the optimization of an associated high dimensional error function. By representing each of the independent variables as a reinforcement learning agent, we are able to achieve convergence much faster than with other methods, including simulated annealing. Standard combinational logic circuits such as adders and multipliers are implemented in a straightforward manner. In addition, we show that the intrinsic flexibility of these adaptive methods allows the random computers to be reconfigured easily, making them reusable. Recovery from faults is also demonstrated.
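
The flavor of "one agent per independent variable" optimization can be conveyed with a toy scheme: each binary configuration variable holds a probability of choosing 1 and is nudged toward settings that lower a shared error function. This is our own simplified sketch of the multi-agent idea, not the paper's algorithm, and the error function below is a stand-in for a real circuit-error measure.

```python
import random

def multiagent_configure(error, n_vars, n_steps, lr, rng):
    """Each variable acts as a simple learning agent with a probability
    of emitting 1; whenever a sampled configuration matches or beats the
    best error seen, every agent reinforces its sampled choice."""
    p = [0.5] * n_vars
    best_x, best_err = None, float("inf")
    for _ in range(n_steps):
        x = [1 if rng.random() < pi else 0 for pi in p]
        e = error(x)
        if e <= best_err:
            best_x, best_err = x, e
            # nudge each agent's probability toward its successful choice
            p = [pi + lr * (xi - pi) for pi, xi in zip(p, x)]
    return best_x, best_err
```

Here the shared scalar error plays the role of the high-dimensional error function above; in the paper's setting it would come from comparing the random chip's outputs against the target logic function.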

  8. Randomization Methods in Emergency Setting Trials: A Descriptive Review

    ERIC Educational Resources Information Center

    Corbett, Mark Stephen; Moe-Byrne, Thirimon; Oddie, Sam; McGuire, William

    2016-01-01

    Background: Quasi-randomization might expedite recruitment into trials in emergency care settings but may also introduce selection bias. Methods: We searched the Cochrane Library and other databases for systematic reviews of interventions in emergency medicine or urgent care settings. We assessed selection bias (baseline imbalances) in prognostic…

  9. Random element method for numerical modeling of diffusional processes

    NASA Technical Reports Server (NTRS)

    Ghoniem, A. F.; Oppenheim, A. K.

    1982-01-01

    The random element method is a generalization of the random vortex method that was developed for the numerical modeling of momentum transport processes as expressed in terms of the Navier-Stokes equations. The method is based on the concept that random walk, as exemplified by Brownian motion, is the stochastic manifestation of diffusional processes. The algorithm based on this method is grid-free and does not require the diffusion equation to be discretized over a mesh; it is thus devoid of the numerical diffusion associated with finite difference methods. Moreover, the algorithm is self-adaptive in space and explicit in time, resulting in an improved numerical resolution of gradients as well as a simple and efficient computational procedure. The method is applied here to an assortment of problems of diffusion of momentum and energy in one dimension, as well as heat conduction in two dimensions, in order to assess its validity and accuracy. The numerical solutions obtained are found to be in good agreement with exact solutions, except for a statistical error introduced by using a finite number of elements; this error can be reduced by increasing the number of elements or by ensemble averaging over a number of solutions.
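The core premise, that an ensemble of random walkers reproduces the diffusion equation, can be checked in a few lines: for a point release in one dimension the walker variance should grow as 2Dt. The parameters below are illustrative, not taken from the paper:

```python
import random

random.seed(42)

D, dt, steps, n = 0.5, 0.01, 100, 5000
sigma_step = (2 * D * dt) ** 0.5   # random-walk step size implied by diffusivity D

# All elements start at the origin: a point release of the diffusing quantity.
x = [0.0] * n
for _ in range(steps):
    x = [xi + random.gauss(0.0, sigma_step) for xi in x]

t = steps * dt
var = sum(xi * xi for xi in x) / n   # sample variance about the origin
print(var, 2 * D * t)                # diffusion theory predicts variance = 2 D t
```

The residual mismatch is exactly the statistical error the abstract mentions; it shrinks as the number of elements n grows.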

  10. Pseudo Random Classification of Circulation Patterns - Comparison to Deliberate Methods

    NASA Astrophysics Data System (ADS)

    Philipp, Andreas

    2010-05-01

    Classification of circulation patterns, e.g. of sea level pressure patterns, can be done by many different methods, e.g. by cluster analysis, eigenvalue-based methods, or those based on the leader algorithm like the Lund classification. However, none of these methods gives clear guidance on the appropriate number of classes, and even when the number is decided, different methods lead to different results. Considerable effort has gone into finding methods that lead to indisputable results. However, doubts about the classifiability of tropospheric circulation states have been raised recently, and the existence of natural groups of similar patterns within the circulation data, which might be caused by circulation regimes, is questionable. If such groups or clusters exist, methods designed to find them, in particular cluster analysis, should be superior to classification schemes based on a pseudo-random definition of classes. To test this assumption, a classification method called "random centroids" has been designed: for each class, a single circulation pattern is chosen using a random number generator, and all remaining patterns are assigned to these centroids according to minimum Euclidean distance. Evaluation metrics like the "explained cluster variance" for pressure, temperature and precipitation are calculated in order to compare these pseudo-random classifications to the classifications provided by the cost733cat dataset, which includes classification catalogs for many different methods (COST Action 733 "Harmonisation and Applications of Weather Type Classifications for European regions"). By running the randomcent method 1000 times, the empirical probability density function of the evaluation metrics can be established, providing information about the probability that the established deliberate methods are better than random classifications.
The results show that most of the classifications fail to exceed the 95th percentile of the empirical probability
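As we read it, the random-centroids scheme draws k patterns at random as class centroids and assigns all remaining patterns by minimum Euclidean distance. A sketch on synthetic data follows; the data, the choice of k, and the explained-cluster-variance formula (1 − WSS/TSS) are our assumptions:

```python
import random

random.seed(1)

# Synthetic "circulation patterns": 200 pressure fields of 16 grid points each.
patterns = [[random.gauss(0, 1) for _ in range(16)] for _ in range(200)]

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def random_centroids(patterns, k):
    """Pick k patterns at random as centroids; assign every pattern to the
    nearest centroid by Euclidean distance."""
    centroids = random.sample(patterns, k)
    labels = [min(range(k), key=lambda j: dist2(p, centroids[j])) for p in patterns]
    return labels, centroids

def explained_cluster_variance(patterns, labels, k):
    """1 - (within-cluster sum of squares) / (total sum of squares)."""
    mean = [sum(col) / len(patterns) for col in zip(*patterns)]
    tss = sum(dist2(p, mean) for p in patterns)
    wss = 0.0
    for j in range(k):
        members = [p for p, l in zip(patterns, labels) if l == j]
        cmean = [sum(col) / len(members) for col in zip(*members)]
        wss += sum(dist2(p, cmean) for p in members)
    return 1.0 - wss / tss

labels, _ = random_centroids(patterns, 9)
ecv = explained_cluster_variance(patterns, labels, 9)
print(ecv)
```

Repeating this 1000 times and collecting `ecv` yields the empirical benchmark distribution against which deliberate classifications are scored.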

  11. Efficient stochastic Galerkin methods for random diffusion equations

    SciTech Connect

    Xiu, Dongbin; Shen, Jie

    2009-02-01

    We discuss in this paper efficient solvers for stochastic diffusion equations in random media. We employ generalized polynomial chaos (gPC) expansion to express the solution in a convergent series and obtain a set of deterministic equations for the expansion coefficients by Galerkin projection. Although the resulting system of diffusion equations is coupled, we show that one can construct fast numerical methods to solve them in a decoupled fashion. The methods are based on separation of the diagonal and off-diagonal terms in the matrix of the Galerkin system. We examine properties of this matrix and show that the proposed method is unconditionally stable for unsteady problems and convergent for steady problems, with a convergence rate independent of the discretization parameters. Numerical examples are provided, for both steady and unsteady random diffusions, to support the analysis.

  12. An Evaluation of the Effectiveness of Recruitment Methods: The Staying Well after Depression Randomized Controlled Trial

    PubMed Central

    Krusche, Adele; Rudolf von Rohr, Isabelle; Muse, Kate; Duggan, Danielle; Crane, Catherine; Williams, J. Mark G.

    2014-01-01

    Background Randomized controlled trials (RCTs) are widely accepted as being the most efficient way of investigating the efficacy of psychological therapies. However, researchers conducting RCTs commonly report difficulties recruiting an adequate sample within planned timescales. In an effort to overcome recruitment difficulties, researchers often are forced to expand their recruitment criteria or extend the recruitment phase, thus increasing costs and delaying publication of results. Research investigating the effectiveness of recruitment strategies is limited and trials often fail to report sufficient details about the recruitment sources and resources utilised. Purpose We examined the efficacy of strategies implemented during the Staying Well after Depression RCT in Oxford to recruit participants with a history of recurrent depression. Methods We describe eight recruitment methods utilised and two further sources not initiated by the research team and examine their efficacy in terms of (i) the return, including the number of potential participants who contacted the trial and the number who were randomized into the trial, (ii) cost-effectiveness, comprising direct financial cost and manpower for initial contacts and randomized participants, and (iii) comparison of sociodemographic characteristics of individuals recruited from different sources. Results Poster advertising, web-based advertising and mental health worker referrals were the cheapest methods per randomized participant; however, the ratio of randomized participants to initial contacts differed markedly per source. Advertising online, via posters and on a local radio station were the most cost-effective recruitment methods for soliciting participants who subsequently were randomized into the trial. Advertising across many sources (saturation) was found to be important. Limitations It may not be feasible to employ all the recruitment methods used in this trial to obtain participation from other

  13. Randomization in clinical trials in orthodontics: its significance in research design and methods to achieve it.

    PubMed

    Pandis, Nikolaos; Polychronopoulou, Argy; Eliades, Theodore

    2011-12-01

    Randomization is a key step in reducing selection bias during the treatment allocation phase in randomized clinical trials. The process of randomization follows specific steps: generation of the randomization list, allocation concealment, and implementation of randomization. Treatment allocation is frequently characterized as random in the dental and orthodontic literature; however, the randomization procedures followed are often not appropriate. Randomization methods assign treatment to the trial arms at random, without foreknowledge of allocation by either the participants or the investigators, thus reducing selection bias. The most popular randomization methods include some form of restricted and/or stratified randomization. This article explains why randomization is an integral part of solid clinical trial methodology and presents the main randomization schemes applicable to clinical trials in orthodontics.
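One common form of restricted randomization mentioned here, permuted-block randomization, can be sketched as follows; the block size and arm labels are illustrative:

```python
import random

random.seed(7)

def permuted_block_list(n_blocks, block_size=4, arms=("A", "B")):
    """Generate a restricted randomization list: within every block each arm
    appears equally often, so group sizes never drift far apart."""
    per_arm = block_size // len(arms)
    allocation = []
    for _ in range(n_blocks):
        block = [arm for arm in arms for _ in range(per_arm)]
        random.shuffle(block)          # random order *within* the block
        allocation.extend(block)
    return allocation

seq = permuted_block_list(n_blocks=10)
print(seq.count("A"), seq.count("B"))   # → 20 20
```

In practice the list is generated in advance and concealed from recruiters, which is the allocation-concealment step the abstract describes.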

  14. Extremely Randomized Machine Learning Methods for Compound Activity Prediction.

    PubMed

    Czarnecki, Wojciech M; Podlewska, Sabina; Bojarski, Andrzej J

    2015-11-09

    Speed, relatively low computational resource requirements and high effectiveness in evaluating the bioactivity of compounds have caused a rapid growth of interest in the application of machine learning methods to virtual screening tasks. However, with the growing amount of data in cheminformatics and related fields, the aim of research has shifted not only towards the development of algorithms of high predictive power but also towards the simplification of existing methods to obtain results more quickly. In this study, we tested two approaches belonging to the group of so-called 'extremely randomized methods'-Extreme Entropy Machine and Extremely Randomized Trees-for their ability to properly identify compounds that have activity towards particular protein targets. These methods were compared with their 'non-extreme' competitors, i.e., Support Vector Machine and Random Forest. The extreme approaches not only improved the efficiency of the classification of bioactive compounds, but also proved less computationally complex, requiring fewer steps to perform an optimization procedure.
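The core idea of 'extremely randomized' splitting, drawing cut-points at random instead of searching for the best one, can be illustrated with a one-feature decision stump. The toy activity data below are our own construction:

```python
import random

random.seed(3)

# Toy 1-D "activity" data (hypothetical): actives have high feature values.
X = [random.random() for _ in range(400)]
y = [1 if x > 0.6 else 0 for x in X]

def accuracy(threshold):
    """Accuracy of the stump 'predict active iff feature > threshold'."""
    return sum((x > threshold) == bool(label) for x, label in zip(X, y)) / len(X)

# Classic stump: exhaustively search every observed value for the best cut.
best_cut = max(X, key=accuracy)

# Extremely randomized stump: draw the cut uniformly at random, no search at all.
random_cut = random.uniform(min(X), max(X))

print(accuracy(best_cut), accuracy(random_cut))
```

A single random cut is usually worse than the searched one, which is exactly the trade-off: extremely randomized methods give up per-split optimality for speed and recover accuracy by averaging many such randomized learners.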

  15. Analytic method for calculating properties of random walks on networks

    NASA Technical Reports Server (NTRS)

    Goldhirsch, I.; Gefen, Y.

    1986-01-01

    A method for calculating the properties of discrete random walks on networks is presented. The method divides complex networks into simpler units whose contribution to the mean first-passage time is calculated. The simplified network is then further iterated. The method is demonstrated by calculating mean first-passage times on a segment, a segment with a single dangling bond, a segment with many dangling bonds, and a looplike structure. The results are analyzed and related to the applicability of the Einstein relation between conductance and diffusion.
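For the simplest network considered, a segment with a reflecting start and an absorbing end, the mean first-passage times satisfy T[i] = 1 + (T[i-1] + T[i+1])/2, a tridiagonal linear system with the analytic solution T[i] = N² − i². The sketch below solves it directly with the Thomas algorithm (our choice of solver, not the paper's iterative network reduction):

```python
def mfpt_segment(N):
    """Mean first-passage time to node N for an unbiased walk on the segment
    0..N (reflecting at 0, absorbing at N), solved as a tridiagonal system."""
    # a: sub-diagonal, b: diagonal, c: super-diagonal, d: right-hand side
    b = [1.0] * N
    a = [0.0] + [-0.5] * (N - 1)
    c = [-1.0] + [-0.5] * (N - 2) + [0.0]
    d = [1.0] * N
    for i in range(1, N):              # forward elimination
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    T = [0.0] * N
    T[-1] = d[-1] / b[-1]
    for i in range(N - 2, -1, -1):     # back substitution
        T[i] = (d[i] - c[i] * T[i + 1]) / b[i]
    return T

T = mfpt_segment(10)
print(T[0])   # ≈ 100, matching the analytic result N**2 - i**2 at i = 0
```

Dangling bonds and loops modify individual rows of this system, which is what the iterative reduction in the paper exploits.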

  16. Random-breakage mapping method applied to human DNA sequences

    NASA Technical Reports Server (NTRS)

    Lobrich, M.; Rydberg, B.; Cooper, P. K.; Chatterjee, A. (Principal Investigator)

    1996-01-01

    The random-breakage mapping method [Game et al. (1990) Nucleic Acids Res., 18, 4453-4461] was applied to DNA sequences in human fibroblasts. The methodology involves NotI restriction endonuclease digestion of DNA from irradiated cells, followed by pulsed-field gel electrophoresis, Southern blotting and hybridization with DNA probes recognizing the single-copy sequences of interest. The Southern blots show a band for the unbroken restriction fragments and a smear below this band due to radiation-induced random breaks. This smear pattern contains two discontinuities in intensity at positions that correspond to the distances of the hybridization site from each end of the restriction fragment. By analyzing the positions of these discontinuities we confirmed the previously mapped position of the probe DXS1327 within a NotI fragment on the X chromosome, thus demonstrating the validity of the technique. We were also able to position the probes D21S1 and D21S15 with respect to the ends of their corresponding NotI fragments on chromosome 21. A third chromosome 21 probe, D21S11, has previously been reported to be close to D21S1, although an uncertainty about a second possible location existed. Since both probes D21S1 and D21S11 hybridized to a single NotI fragment and yielded a similar smear pattern, this uncertainty is removed by the random-breakage mapping method.

  17. Random Sampling of Quantum States: a Survey of Methods. And Some Issues Regarding the Overparametrized Method

    NASA Astrophysics Data System (ADS)

    Maziero, Jonas

    2015-12-01

    The numerical generation of random quantum states (RQS) is an important procedure for investigations in quantum information science. Here, we review some methods that may be used for performing that task. We start by presenting a simple procedure for generating random state vectors, for which the main tool is the random sampling of unbiased discrete probability distributions (DPD). Afterwards, the creation of random density matrices is addressed. In this context, we first present the standard method, which consists in using the spectral decomposition of a quantum state to obtain RQS from random DPDs and random unitary matrices. Next, the Bloch vector parametrization method is described. This approach, despite being useful in several instances, is not in general convenient for RQS generation. In the last part of the article, we consider the overparametrized method (OPM) and the related Ginibre and Bures techniques. The OPM can be used to create random positive semidefinite matrices with unit trace from randomly produced general complex matrices, in a simple way that is friendly for numerical implementations. We consider a physically relevant issue related to the possible domains that may be used for the real and imaginary parts of the elements of such general complex matrices, and we note an overly rapid concentration of measure in quantum state space that arises in this parametrization.
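The simple procedure for generating random state vectors is commonly implemented by normalizing i.i.d. complex Gaussian amplitudes; a minimal sketch under that assumption (this is the standard construction, not code from the article):

```python
import math
import random

random.seed(5)

def random_state_vector(d):
    """Haar-distributed random pure state: draw i.i.d. complex Gaussian
    amplitudes, then normalize to unit norm."""
    amps = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(d)]
    norm = math.sqrt(sum(abs(a) ** 2 for a in amps))
    return [a / norm for a in amps]

psi = random_state_vector(8)
probs = [abs(a) ** 2 for a in psi]   # measurement probabilities: a random DPD
print(sum(probs))                    # ≈ 1.0
```

The related Ginibre construction for density matrices follows the same pattern: form a random complex Gaussian matrix G and normalize GG†/Tr(GG†) to unit trace.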

  18. Cox regression methods for two-stage randomization designs.

    PubMed

    Lokhnygina, Yuliya; Helterbrand, Jeffrey D

    2007-06-01

    Two-stage randomization designs (TSRD) are becoming increasingly common in oncology and AIDS clinical trials as they make more efficient use of study participants to examine therapeutic regimens. In these designs patients are initially randomized to an induction treatment, followed by randomization to a maintenance treatment conditional on their induction response and consent to further study treatment. Broader acceptance of TSRDs in drug development may hinge on the ability to make appropriate intent-to-treat type inference within this design framework as to whether an experimental induction regimen is better than a standard induction regimen when maintenance treatment is fixed. Recently Lunceford, Davidian, and Tsiatis (2002, Biometrics 58, 48-57) introduced an inverse probability weighting based analytical framework for estimating survival distributions and mean restricted survival times, as well as for comparing treatment policies at landmarks in the TSRD setting. In practice Cox regression is widely used and in this article we extend the analytical framework of Lunceford et al. (2002) to derive a consistent estimator for the log hazard in the Cox model and a robust score test to compare treatment policies. Large sample properties of these methods are derived, illustrated via a simulation study, and applied to a TSRD clinical trial. PMID:17425633

  19. Finite amplitude method for the quasiparticle random-phase approximation

    NASA Astrophysics Data System (ADS)

    Avogadro, Paolo; Nakatsukasa, Takashi

    2011-07-01

    We present the finite amplitude method (FAM), originally proposed in Ref. , for superfluid systems. A Hartree-Fock-Bogoliubov code may be transformed into a code of the quasiparticle-random-phase approximation (QRPA) with simple modifications. This technique has advantages over the conventional QRPA calculations, such as coding feasibility and computational cost. We perform the fully self-consistent linear-response calculation for the spherical neutron-rich nucleus 174Sn, modifying the hfbrad code, to demonstrate the accuracy, feasibility, and usefulness of the FAM.

  20. Statistical comparison of random allocation methods in cancer clinical trials.

    PubMed

    Hagino, Atsushi; Hamada, Chikuma; Yoshimura, Isao; Ohashi, Yasuo; Sakamoto, Junichi; Nakazato, Hiroaki

    2004-12-01

    The selection of a trial design is an important issue in the planning of clinical trials. One of the most important considerations in trial design is the method of treatment allocation and the corresponding analysis plan. In this article, we conducted computer simulations using actual data from 2158 rectal cancer patients enrolled in the surgery-alone group of seven randomized controlled trials in Japan to compare the performance of three allocation methods, simple randomization, stratified randomization and minimization, in relatively small-scale trials (total number of patients across the two groups of 50, 100, 150 or 200). The degree of imbalance in prognostic factors between groups was evaluated by varying the allocation probability of minimization from 1.00 to 0.70 in steps of 0.05. The simulation demonstrated that minimization provides the best performance in ensuring balance in both the number of patients and the prognostic factors between groups. Moreover, to keep the 1st percentile of the chi-square test p-value for balance in prognostic factors around 0.50, the allocation probability of minimization had to be set to 0.95 for 50, 0.80 for 100, 0.75 for 150 and 0.70 for 200 patients. When the sample size was larger, sufficient balance could be achieved even with a reduced allocation probability. The simulation using actual data also demonstrated that tests unadjusted for the allocation factors produced conservative type I errors when dynamic allocation, such as minimization, was used. In contrast, tests adjusted for the allocation factors as covariates brought type I errors closer to the nominal significance level and provided slightly higher power. In conclusion, our study demonstrated both the statistical and the clinical validity of minimization.
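As a rough illustration of how minimization with an allocation probability works, here is a Pocock-Simon-style sketch; the imbalance measure, the factors, and all parameters are our own simplifications, not the exact procedure simulated in the paper:

```python
import random

random.seed(11)

def minimization_assign(patient, counts, p=0.8, arms=(0, 1)):
    """Assign a patient to the arm minimizing total imbalance over the
    stratification factors, with probability p (biased-coin minimization).
    counts[arm][factor][level] tracks patients already allocated."""
    def imbalance_if(arm):
        total = 0
        for f, level in enumerate(patient):
            hypothetical = [counts[a][f][level] + (1 if a == arm else 0) for a in arms]
            total += max(hypothetical) - min(hypothetical)
        return total
    scores = {a: imbalance_if(a) for a in arms}
    preferred = min(arms, key=lambda a: scores[a])
    if scores[arms[0]] == scores[arms[1]]:
        chosen = random.choice(arms)                 # tie: fair coin
    else:
        chosen = preferred if random.random() < p else [a for a in arms if a != preferred][0]
    for f, level in enumerate(patient):
        counts[chosen][f][level] += 1
    return chosen

# Two binary prognostic factors (e.g. sex and stage), 100 patients.
counts = [[[0, 0], [0, 0]] for _ in (0, 1)]
patients = [(random.randint(0, 1), random.randint(0, 1)) for _ in range(100)]
groups = [minimization_assign(pt, counts) for pt in patients]
c0, c1 = groups.count(0), groups.count(1)
print(c0, c1)   # close to 50/50 by construction
```

Lowering p trades some balance for more unpredictability, which is the dial the simulations in the paper vary from 1.00 down to 0.70.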

  1. Research on the methods of optical image hiding based on double random phase encoding and digital holography

    NASA Astrophysics Data System (ADS)

    Xu, Hongsheng; Sang, Nong

    2011-12-01

    Optical information hiding systems offer features such as high processing speed, high parallelism, high encryption dimensionality, and fast optical transforms and related operations, giving them advantages over digital methods in some respects. However, they lack adequate security and are insufficiently integrated with digital image processing techniques. On the basis of an analysis of existing image hiding and analysis techniques, we therefore propose adopting the idea of virtual optics: using all-digital simulation to study optical image hiding and analysis methods based on optical image processing techniques, in particular double random phase encoding and digital holography.
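Double random phase encoding is easy to simulate all-digitally. Below is a minimal one-dimensional sketch with a naive unitary DFT; the 1-D setting and the unitary normalization are our simplifications of the usual 2-D 4f optical scheme:

```python
import cmath
import random

random.seed(13)

def dft(x, sign=-1):
    """Naive unitary DFT; sign=+1 gives the inverse transform."""
    N = len(x)
    return [sum(x[n] * cmath.exp(sign * 2j * cmath.pi * k * n / N) for n in range(N))
            / (N ** 0.5) for k in range(N)]

N = 16
image = [random.random() for _ in range(N)]                              # a 1-D "image"
phase1 = [cmath.exp(2j * cmath.pi * random.random()) for _ in range(N)]  # input-plane mask
phase2 = [cmath.exp(2j * cmath.pi * random.random()) for _ in range(N)]  # Fourier-plane mask

# Encrypt: mask, transform, mask in the Fourier domain, transform back.
stage1 = dft([p * m for p, m in zip(image, phase1)])
cipher = dft([s * m for s, m in zip(stage1, phase2)], sign=+1)

# Decrypt by running the steps in reverse with the conjugate masks (the keys).
stage1_rec = dft(cipher, sign=-1)
plain = dft([s * m.conjugate() for s, m in zip(stage1_rec, phase2)], sign=+1)
recovered = [(p * m.conjugate()).real for p, m in zip(plain, phase1)]

err = max(abs(r - i) for r, i in zip(recovered, image))
print(err)   # ≈ 0: exact recovery with the correct phase keys
```

Without both phase masks the cipher is a stationary white field, which is the security premise the abstract says must be strengthened by combining it with digital techniques.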

  2. PROSPECTIVE RANDOMIZED STUDY COMPARING TWO ANESTHETIC METHODS FOR SHOULDER SURGERY

    PubMed Central

    Ikemoto, Roberto Yukio; Murachovsky, Joel; Prata Nascimento, Luis Gustavo; Bueno, Rogerio Serpone; Oliveira Almeida, Luiz Henrique; Strose, Eric; de Mello, Sérgio Cabral; Saletti, Deise

    2015-01-01

    Objective: To evaluate the efficacy of suprascapular nerve block in combination with infusion of anesthetic into the subacromial space, compared with interscalene block. Methods: Forty-five patients with small or medium-sized isolated supraspinatus tendon lesions who underwent arthroscopic repair were prospectively and comparatively evaluated through random assignment to three groups of 15, each with a different combination of anesthetic methods. The efficacy of postoperative analgesia was measured using the visual analogue scale for pain and the consumption of analgesic, anti-inflammatory and opioid drugs. Inhalation anesthetic consumption during surgery was also compared between the groups. Results: The statistical analysis did not find any statistically significant differences among the groups regarding anesthetic consumption during surgery or postoperative analgesic efficacy during the first 48 hours. Conclusion: Suprascapular nerve block with infusion of anesthetic into the subacromial space is an excellent alternative to interscalene block, particularly in hospitals in which an electrical nerve stimulating device is unavailable. PMID:27022569

  3. A New GP Recombination Method Using Random Tree Sampling

    NASA Astrophysics Data System (ADS)

    Tanji, Makoto; Iba, Hitoshi

    We propose a new program evolution method named PORTS (Program Optimization by Random Tree Sampling), motivated by the idea of preserving and controlling tree fragments in GP (Genetic Programming). We assume that, to recombine genetic material efficiently, tree fragments of any size should be preserved into the next generation. PORTS samples tree fragments and concatenates them by traversing and transitioning between promising trees, instead of using subtree crossover and mutation. Because the size of a fragment preserved during a generation update follows a geometric distribution, merits of the method are that it is relatively easy to predict the behavior of tree fragments over time and to control the sampling size by changing a single parameter. From experimental results on the RoyalTree, Symbolic Regression and 6-Multiplexer problems, we observed that the performance of PORTS is competitive with Simple GP. Furthermore, the average node size of the optimal solutions obtained by PORTS was smaller than that of Simple GP.

  4. Asbestos/NESHAP adequately wet guidance

    SciTech Connect

    Shafer, R.; Throwe, S.; Salgado, O.; Garlow, C.; Hoerath, E.

    1990-12-01

    The Asbestos NESHAP requires facility owners and/or operators involved in demolition and renovation activities to control emissions of particulate asbestos to the outside air because no safe concentration of airborne asbestos has ever been established. The primary method used to control asbestos emissions is to adequately wet the Asbestos Containing Material (ACM) with a wetting agent prior to, during and after demolition/renovation activities. The purpose of the document is to provide guidance to asbestos inspectors and the regulated community on how to determine if friable ACM is adequately wet as required by the Asbestos NESHAP.

  5. A new method for direction finding based on Markov random field model

    NASA Astrophysics Data System (ADS)

    Ota, Mamoru; Kasahara, Yoshiya; Goto, Yoshitaka

    2015-07-01

    Investigating the characteristics of plasma waves observed by scientific satellites in the Earth's plasmasphere/magnetosphere is effective for understanding the mechanisms that generate waves and the plasma environment that influences wave generation and propagation. In particular, finding the propagation directions of waves is important for understanding the mechanisms of VLF/ELF waves. To find these directions, the wave distribution function (WDF) method has been proposed. This method is based on the idea that observed signals consist of a number of elementary plane waves that define the wave energy density distribution. However, the resulting equations constitute an ill-posed problem in which a solution is not determined uniquely; hence, an adequate model must be assumed for a solution. Although many models have been proposed, we have to select the optimal model for the given situation because each model has its own advantages and disadvantages. In the present study, we propose a new method for direction finding of the plasma waves measured by plasma wave receivers. Our method is based on the assumption that the WDF can be represented by a Markov random field model, with inference of the model parameters performed using a variational Bayesian learning algorithm. Using computer-generated spectral matrices, we evaluated the performance of the model and compared the results with those obtained from two conventional methods.

  6. Randomly and Non-Randomly Missing Renal Function Data in the Strong Heart Study: A Comparison of Imputation Methods.

    PubMed

    Shara, Nawar; Yassin, Sayf A; Valaitis, Eduardas; Wang, Hong; Howard, Barbara V; Wang, Wenyu; Lee, Elisa T; Umans, Jason G

    2015-01-01

    Kidney and cardiovascular disease are widespread among populations with high prevalence of diabetes, such as American Indians participating in the Strong Heart Study (SHS). Studying these conditions simultaneously in longitudinal studies is challenging, because the morbidity and mortality associated with these diseases result in missing data, and these data are likely not missing at random. When such data are merely excluded, study findings may be compromised. In this article, a subset of 2264 participants with complete renal function data from Strong Heart Exams 1 (1989-1991), 2 (1993-1995), and 3 (1998-1999) was used to examine the performance of five methods used to impute missing data: listwise deletion, mean of serial measures, adjacent value, multiple imputation, and pattern-mixture. Three missing at random models and one non-missing at random model were used to compare the performance of the imputation techniques on randomly and non-randomly missing data. The pattern-mixture method was found to perform best for imputing renal function data that were not missing at random. Determining whether data are missing at random or not can help in choosing the imputation method that will provide the most accurate results.
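Two of the simpler imputation methods compared, adjacent value and mean of serial measures, can be sketched on a toy longitudinal series. The data and the carry-forward reading of "adjacent value" are our assumptions:

```python
# Toy longitudinal series with a missing value at exam 2 (None).
series = [1.2, 1.5, None, 2.1]

def impute_adjacent(xs):
    """Carry the last observed value forward (adjacent-value imputation)."""
    out, last = [], None
    for x in xs:
        last = x if x is not None else last
        out.append(last)
    return out

def impute_mean(xs):
    """Replace each gap with the mean of the observed serial measures."""
    observed = [x for x in xs if x is not None]
    m = sum(observed) / len(observed)
    return [x if x is not None else m for x in xs]

print(impute_adjacent(series))   # → [1.2, 1.5, 1.5, 2.1]
print(impute_mean(series))       # gap filled with the series mean, ≈ 1.6
```

Both methods implicitly assume the data are missing at random, which is exactly what fails for mortality-driven dropout; that is why the pattern-mixture approach performed best in the study.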

  7. Multilevel Analysis Methods for Partially Nested Cluster Randomized Trials

    ERIC Educational Resources Information Center

    Sanders, Elizabeth A.

    2011-01-01

    This paper explores multilevel modeling approaches for 2-group randomized experiments in which a treatment condition involving clusters of individuals is compared to a control condition involving only ungrouped individuals, otherwise known as partially nested cluster randomized designs (PNCRTs). Strategies for comparing groups from a PNCRT in the…

  8. Randomized BioBrick assembly: a novel DNA assembly method for randomizing and optimizing genetic circuits and metabolic pathways.

    PubMed

    Sleight, Sean C; Sauro, Herbert M

    2013-09-20

    The optimization of genetic circuits and metabolic pathways often involves constructing various iterations of the same construct or using directed evolution to achieve the desired function. Alternatively, a method that randomizes individual parts in the same assembly reaction could be used for optimization by allowing for the ability to screen large numbers of individual clones expressing randomized circuits or pathways for optimal function. Here we describe a new assembly method to randomize genetic circuits and metabolic pathways from modular DNA fragments derived from PCR-amplified BioBricks. As a proof-of-principle for this method, we successfully assembled CMY (Cyan-Magenta-Yellow) three-gene circuits using Gibson Assembly that express CFP, RFP, and YFP with independently randomized promoters, ribosome binding sites, transcriptional terminators, and all parts randomized simultaneously. Sequencing results from 24 CMY circuits with various parts randomized show that 20/24 circuits are distinct and expression varies over a 200-fold range above background levels. We then adapted this method to randomize the same parts with enzyme coding sequences from the lycopene biosynthesis pathway instead of fluorescent proteins, designed to independently express each enzyme in the pathway from a different promoter. Lycopene production is improved using this randomization method by about 30% relative to the highest polycistronic-expressing pathway. These results demonstrate the potential of generating nearly 20,000 unique circuit or pathway combinations when three parts are permutated at each position in a three-gene circuit or pathway, and the methodology can likely be adapted to other circuits and pathways to maximize products of interest.

  9. A comparison of methods for representing sparsely sampled random quantities.

    SciTech Connect

    Romero, Vicente Jose; Swiler, Laura Painton; Urbina, Angel; Mullins, Joshua

    2013-09-01

    This report discusses the treatment of uncertainties stemming from relatively few samples of random quantities. The importance of this topic extends beyond experimental data uncertainty to situations involving uncertainty in model calibration, validation, and prediction. With very sparse data samples it is not practical to have a goal of accurately estimating the underlying probability density function (PDF). Rather, a pragmatic goal is that the uncertainty representation should be conservative so as to bound a specified percentile range of the actual PDF, say the range between the 0.025 and 0.975 percentiles, with reasonable reliability. A second, opposing objective is that the representation not be overly conservative, i.e. that it only minimally over-estimate the desired percentile range of the actual PDF. The presence of the two opposing objectives makes the sparse-data uncertainty representation problem interesting and difficult. In this report, five uncertainty representation techniques are characterized for their performance on twenty-one test problems (over thousands of trials for each problem) according to these two opposing objectives and other performance measures. Two of the methods, statistical tolerance intervals and a kernel density approach specifically developed for handling sparse data, exhibit significantly better overall performance than the others.
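A quick simulation shows why bounding a central percentile range from sparse samples is hard: even the full range [min, max] of n draws rarely covers the true 0.025-0.975 range for small n. Standard-normal data is our assumption here, chosen only to make the true percentiles known:

```python
import random

random.seed(17)

def range_coverage(n, trials=20000):
    """Fraction of trials in which [min, max] of n standard-normal samples
    bounds the true central 95% range (the 0.025..0.975 percentiles)."""
    z = 1.959963984540054            # 0.975 quantile of the standard normal
    hits = 0
    for _ in range(trials):
        xs = [random.gauss(0.0, 1.0) for _ in range(n)]
        if min(xs) <= -z and max(xs) >= z:
            hits += 1
    return hits / trials

c5, c30 = range_coverage(5), range_coverage(30)
print(c5, c30)   # small n almost never bounds the range; n = 30 still misses often
```

This is the gap that conservative constructions such as statistical tolerance intervals are designed to close, at the cost of the over-conservatism the report also penalizes.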

  10. Investigation of stochastic radiation transport methods in random heterogeneous mixtures

    NASA Astrophysics Data System (ADS)

    Reinert, Dustin Ray

    Among the most formidable challenges facing our world is the need for safe, clean, affordable energy sources. Growing concerns over global warming induced climate change and the rising costs of fossil fuels threaten conventional means of electricity production and are driving the current nuclear renaissance. One concept at the forefront of international development efforts is the High Temperature Gas-Cooled Reactor (HTGR). With numerous passive safety features and a meltdown-proof design capable of attaining high thermodynamic efficiencies for electricity generation as well as high temperatures useful for the burgeoning hydrogen economy, the HTGR is an extremely promising technology. Unfortunately, the fundamental understanding of neutron behavior within HTGR fuels lags far behind that of more conventional water-cooled reactors. HTGRs utilize a unique heterogeneous fuel element design consisting of thousands of tiny fissile fuel kernels randomly mixed with a non-fissile graphite matrix. Monte Carlo neutron transport simulations of the HTGR fuel element geometry in its full complexity are infeasible and this has motivated the development of more approximate computational techniques. A series of MATLAB codes was written to perform Monte Carlo simulations within HTGR fuel pebbles to establish a comprehensive understanding of the parameters under which the accuracy of the approximate techniques diminishes. This research identified the accuracy of the chord length sampling method to be a function of the matrix scattering optical thickness, the kernel optical thickness, and the kernel packing density. Two new Monte Carlo methods designed to focus the computational effort upon the parameter conditions shown to contribute most strongly to the overall computational error were implemented and evaluated. 
An extended memory chord length sampling routine that recalls a neutron's prior material traversals was demonstrated to be effective in fixed source calculations containing

  11. A Random Forest-based ensemble method for activity recognition.

    PubMed

    Feng, Zengtao; Mo, Lingfei; Li, Meng

    2015-01-01

    This paper presents a multi-sensor ensemble approach to human physical activity (PA) recognition, using random forest. We designed an ensemble learning algorithm, which integrates several independent Random Forest classifiers based on different sensor feature sets to build a more stable, more accurate and faster classifier for human activity recognition. To evaluate the algorithm, PA data collected from PAMAP (Physical Activity Monitoring for Aging People), a standard, publicly available database, were used for training and testing. The experimental results show that the algorithm is able to correctly recognize 19 PA types with an accuracy of 93.44%, while training is faster than that of comparable methods. The ensemble classifier system based on the RF (Random Forest) algorithm can achieve high recognition accuracy and fast calculation. PMID:26737432
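A minimal sketch of the ensemble idea this record describes (not the authors' code): one Random Forest per sensor feature set, combined by soft voting over class probabilities. The feature-set split and data below are synthetic placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n, n_classes = 300, 3
X = rng.normal(size=(n, 12))
y = rng.integers(0, n_classes, size=n)
# pretend columns 0-5 come from an accelerometer, 6-11 from a gyroscope
feature_sets = [slice(0, 6), slice(6, 12)]

forests = []
for fs in feature_sets:
    rf = RandomForestClassifier(n_estimators=50, random_state=0)
    rf.fit(X[:, fs], y)
    forests.append((fs, rf))

def predict(X_new):
    # soft vote: average the per-forest class-probability estimates
    probs = np.mean([rf.predict_proba(X_new[:, fs]) for fs, rf in forests],
                    axis=0)
    return probs.argmax(axis=1)

print(predict(X[:5]))
```

Training independent forests on disjoint feature sets is also what makes the ensemble parallelizable, which is one plausible source of the speedup the abstract reports.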

  12. Grid-free simulation of diffusion using random walk methods

    NASA Technical Reports Server (NTRS)

    Ghoniem, A. F.; Sherman, F. S.

    1985-01-01

    The simulation of the diffusion of a continuum field by the random walk (RW) displacement of a set of particles is considered. Elements of the gradients of the diffusive concentration are transported by computational particles. It is demonstrated that, by the use of concentration gradients in the RW process, statistical errors are reduced and each realization of the numerical solution is a representation of the exact solution. The algorithm is grid-free, and the computational elements move to follow the gradients; hence, the algorithm is self-adaptive, and uniform resolution is achieved for all times.
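A hedged sketch of the plain random-walk principle underlying this record (the paper's gradient-transport refinement is not reproduced): each particle takes Gaussian steps of variance 2*D*dt, so the ensemble spreads like the heat kernel. All parameter values are illustrative.

```python
import numpy as np

D, dt, n_steps, n_particles = 0.1, 0.01, 500, 20000
rng = np.random.default_rng(1)
x = np.zeros(n_particles)            # all particles start at the origin
for _ in range(n_steps):
    x += rng.normal(0.0, np.sqrt(2 * D * dt), n_particles)

t = n_steps * dt
print(x.var(), 2 * D * t)            # sample variance vs. analytic 2*D*t
```

The statistical error visible in the sample variance is exactly what the gradient-carrying formulation in the abstract is designed to reduce.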

  13. Statistics-based reconstruction method with high random-error tolerance for integral imaging.

    PubMed

    Zhang, Juan; Zhou, Liqiu; Jiao, Xiaoxue; Zhang, Lei; Song, Lipei; Zhang, Bo; Zheng, Yi; Zhang, Zan; Zhao, Xing

    2015-10-01

    A three-dimensional (3D) digital reconstruction method for integral imaging with high random-error tolerance based on statistics is proposed. By statistically analyzing the points reconstructed by triangulation from all corresponding image points in an elemental image array, 3D reconstruction with high random-error tolerance can be realized. To simulate the impacts of random errors, random offsets with different error levels were added to different numbers of elemental images in simulation and optical experiments. The results of both showed that the proposed statistics-based reconstruction method achieves more stable and more accurate reconstruction than the conventional method. This verifies that the proposed method can effectively reduce the impact of random errors on 3D reconstruction in integral imaging. The method is simple and very helpful to the development of integral imaging technology.

  14. 34 CFR 85.900 - Adequate evidence.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 1 2010-07-01 2010-07-01 false Adequate evidence. 85.900 Section 85.900 Education Office of the Secretary, Department of Education GOVERNMENTWIDE DEBARMENT AND SUSPENSION (NONPROCUREMENT) Definitions § 85.900 Adequate evidence. Adequate evidence means information sufficient to support...

  15. 12 CFR 380.52 - Adequate protection.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 5 2012-01-01 2012-01-01 false Adequate protection. 380.52 Section 380.52... ORDERLY LIQUIDATION AUTHORITY Receivership Administrative Claims Process § 380.52 Adequate protection. (a... interest of a claimant, the receiver shall provide adequate protection by any of the following means:...

  16. 12 CFR 380.52 - Adequate protection.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 5 2013-01-01 2013-01-01 false Adequate protection. 380.52 Section 380.52... ORDERLY LIQUIDATION AUTHORITY Receivership Administrative Claims Process § 380.52 Adequate protection. (a... interest of a claimant, the receiver shall provide adequate protection by any of the following means:...

  17. 12 CFR 380.52 - Adequate protection.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 5 2014-01-01 2014-01-01 false Adequate protection. 380.52 Section 380.52... ORDERLY LIQUIDATION AUTHORITY Receivership Administrative Claims Process § 380.52 Adequate protection. (a... interest of a claimant, the receiver shall provide adequate protection by any of the following means:...

  18. 21 CFR 1404.900 - Adequate evidence.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Adequate evidence. 1404.900 Section 1404.900 Food and Drugs OFFICE OF NATIONAL DRUG CONTROL POLICY GOVERNMENTWIDE DEBARMENT AND SUSPENSION (NONPROCUREMENT) Definitions § 1404.900 Adequate evidence. Adequate evidence means information sufficient...

  19. The simulation of groundwater flow velocity random fields by the method of partitioning and randomization of the spectrum

    NASA Astrophysics Data System (ADS)

    Konecny, Franz; Fürst, Josef

    2007-02-01

    Due to the heterogeneity of aquifers, groundwater flow velocity fields can be viewed as vector random fields (v.r.f.). For the application of Monte Carlo methods to investigate problems of pollutant transport, the efficient generation of v.r.f. with a prescribed covariance structure is an important task. The subject of this paper is the simulation of v.r.f. with a given spectral tensor. We adopt a method that combines two principles: spectral domain partitioning and spectrum randomization (SDP/SR). The SR principle makes it possible to reproduce the covariance structure of the v.r.f. exactly, which is of particular importance for Monte Carlo simulation, such as random walk particle tracking. Following this methodology, replicates of the v.r.f. can be generated using a cosine series. Once the coefficients of the series have been determined, the v.r.f. can be computed at any point of its domain by mere evaluation of the cosine terms. The method does not require a computational grid and is computationally more efficient than, e.g., Gaussian conditioning.
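An illustrative sketch of the spectrum-randomization principle in the simplest scalar, one-dimensional case (the paper treats vector fields with a spectral tensor): sample frequencies from the spectral density and random phases, then evaluate the field as a cosine series at arbitrary points, grid-free. The Gaussian target covariance is an assumption of ours.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 2000
# target covariance C(h) = exp(-h^2 / 2) -> spectral density = standard normal
omega = rng.normal(size=N)
phi = rng.uniform(0, 2 * np.pi, size=N)

def field(x):
    # one realization of a zero-mean, unit-variance random field,
    # evaluated at arbitrary points with no computational grid
    x = np.asarray(x, dtype=float)
    return np.sqrt(2.0 / N) * np.cos(np.outer(x, omega) + phi).sum(axis=1)

# averaging field(x) * field(x + h) over many (omega, phi) realizations
# would reproduce exp(-h^2 / 2)
print(field([0.0, 0.5, 1.0]))
```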

  20. Combining randomized and non-randomized evidence in clinical research: a review of methods and applications.

    PubMed

    Verde, Pablo E; Ohmann, Christian

    2015-03-01

    Researchers may have multiple motivations for combining disparate pieces of evidence in a meta-analysis, such as generalizing experimental results or increasing the power to detect an effect that a single study is not able to detect. However, while the main question in a meta-analysis may be simple, the structure of evidence available to answer it may be complex. As a consequence, combining disparate pieces of evidence becomes a challenge. In this review, we cover statistical methods that have been used for the evidence synthesis of different study types with the same outcome and similar interventions. For the methodological review, a literature retrieval in the area of generalized evidence synthesis was performed, and publications were identified, assessed, grouped and classified. Furthermore, real applications of these methods in medicine were identified and described; 39 real clinical applications of these approaches could be identified. A new classification of methods is provided, which takes into account the inferential approach, the bias modeling, the hierarchical structure, and the use of graphical modeling. We conclude with a discussion of the pros and cons of our approach and give some practical advice. PMID:26035469

  1. A Comparison of Single Sample and Bootstrap Methods to Assess Mediation in Cluster Randomized Trials

    ERIC Educational Resources Information Center

    Pituch, Keenan A.; Stapleton, Laura M.; Kang, Joo Youn

    2006-01-01

    A Monte Carlo study examined the statistical performance of single sample and bootstrap methods that can be used to test and form confidence interval estimates of indirect effects in two cluster randomized experimental designs. The designs were similar in that they featured random assignment of clusters to one of two treatment conditions and…

  2. Which Ab Initio Wave Function Methods Are Adequate for Quantitative Calculations of the Energies of Biradicals? The Performance of Coupled-Cluster and Multi-Reference Methods Along a Single-Bond Dissociation Coordinate

    SciTech Connect

    Yang, Ke; Jalan, Amrit; Green, William H.; Truhlar, Donald G.

    2013-01-08

    We examine the accuracy of single-reference and multireference correlated wave function methods for predicting accurate energies and potential energy curves of biradicals. The biradicals considered are intermediate species along the bond dissociation coordinates for breaking the F-F bond in F2, the O-O bond in H2O2, and the C-C bond in CH3CH3. We apply a host of single-reference and multireference approximations in a consistent way to the same cases to provide a better assessment of their relative accuracies than was previously possible. The most accurate method studied is coupled cluster theory with all connected excitations through quadruples, CCSDTQ. Without explicit quadruple excitations, the most accurate potential energy curves are obtained by the single-reference RCCSDt method, followed, in order of decreasing accuracy, by UCCSDT, RCCSDT, UCCSDt, seven multireference methods, including perturbation theory, configuration interaction, and coupled-cluster methods (with MRCI+Q being the best and Mk-MR-CCSD the least accurate), four CCSD(T) methods, and then CCSD.

  3. Note on coefficient matrices from stochastic Galerkin methods for random diffusion equations

    SciTech Connect

    Zhou Tao; Tang Tao

    2010-11-01

    In a recent work by Xiu and Shen [D. Xiu, J. Shen, Efficient stochastic Galerkin methods for random diffusion equations, J. Comput. Phys. 228 (2009) 266-281], the Galerkin methods are used to solve stochastic diffusion equations in random media, where some properties for the coefficient matrix of the resulting system are provided. They also posed an open question on the properties of the coefficient matrix. In this work, we will provide some results related to the open question.

  4. An easily customized, random allocation system using the minimization method for multi-institutional clinical trials.

    PubMed

    Kenjo, Y; Antoku, Y; Akazawa, K; Hanada, E; Kinukawa, N; Nose, Y

    2000-05-01

    In a randomized clinical trial, random allocation of patients to treatment groups should balance the distribution of prognostic factors. Random allocation in a multi-institutional randomized clinical trial is conducted by a coordinating center, independent of the medical institution where the attending doctor practices. This study provides a sophisticated system for performing exact random allocation of patients to treatment groups. The minimization method proposed by Pocock was applied to this system to balance the distribution of prognostic factors between two treatment groups, even when the number of registered patients is relatively small (S.J. Pocock, Allocation of patients to treatment in clinical trials, Biometrics 35 (1979) 183-197). Furthermore, Zelen's method is used to balance the number of patients allocated to the two groups within each institution (M. Zelen, The randomization and stratification of patients to clinical trials, J. Chron. Dis. 27 (1974) 365-375). The system was written in the Perl language as a common gateway interface (CGI) script and can therefore be easily extended to include a data entry function for attending doctors as well as the random allocation function. This system is being used effectively in thirteen multi-institutional randomized clinical trials for stomach, colon-rectum and breast cancers in Japan.
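A minimal sketch of Pocock-style minimization (illustrative only, not the authors' CGI system): each new patient is assigned to the arm that minimizes the total imbalance of prognostic-factor counts, with ties broken at random. The factors and levels below are hypothetical.

```python
import random

random.seed(0)
arms = ("A", "B")
# counts[arm][factor][level] = number of patients already allocated
counts = {a: {"sex": {"M": 0, "F": 0}, "stage": {"I": 0, "II": 0}}
          for a in arms}

def allocate(patient):
    # score each arm by the imbalance that would result from adding
    # this patient to it, summed over the patient's factor levels
    scores = {}
    for a in arms:
        s = 0
        for f, level in patient.items():
            hypo = {b: counts[b][f][level] + (1 if b == a else 0)
                    for b in arms}
            s += max(hypo.values()) - min(hypo.values())
        scores[a] = s
    best = min(scores.values())
    arm = random.choice([a for a in arms if scores[a] == best])
    for f, level in patient.items():
        counts[arm][f][level] += 1
    return arm

for p in [{"sex": "M", "stage": "I"}, {"sex": "M", "stage": "I"},
          {"sex": "F", "stage": "II"}]:
    print(allocate(p))
```

Note how the second identical patient is forced deterministically into the opposite arm: that is the balancing behavior minimization provides even with very few registered patients.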

  5. A multiple step random walk Monte Carlo method for heat conduction involving distributed heat sources

    NASA Astrophysics Data System (ADS)

    Naraghi, M. H. N.; Chung, B. T. F.

    1982-06-01

    A multiple step fixed random walk Monte Carlo method for solving heat conduction in solids with distributed internal heat sources is developed. In this method, the probability that a walker reaches a point a few steps away is calculated analytically and is stored in the computer. Instead of moving to the immediate neighboring point the walker is allowed to jump several steps further. The present multiple step random walk technique can be applied to both conventional Monte Carlo and the Exodus methods. Numerical results indicate that the present method compares well with finite difference solutions while the computation speed is much faster than that of single step Exodus and conventional Monte Carlo methods.
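A hedged illustration of the single-step fixed random walk that the paper accelerates: the steady temperature at a node of a source-free 1-D rod with fixed end temperatures is estimated by averaging the boundary value reached by many walkers (the exact answer is linear in position). The geometry and temperatures are placeholders.

```python
import random

random.seed(3)
n_nodes, T_left, T_right = 10, 0.0, 100.0

def walk(start):
    # single-step fixed random walk: move one node left or right
    # until an end of the rod (a boundary) is reached
    i = start
    while 0 < i < n_nodes:
        i += random.choice((-1, 1))
    return T_left if i == 0 else T_right

start = 3
n_walks = 20000
estimate = sum(walk(start) for _ in range(n_walks)) / n_walks
print(estimate, 100.0 * start / n_nodes)   # Monte Carlo vs. exact 30.0
```

The multiple-step idea in the abstract replaces the one-node moves above with analytically precomputed several-node jumps, cutting the number of iterations per walk.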

  6. The GEA method for light scattering by dielectric spheroids and ellipsoids with fixed and random orientations

    NASA Astrophysics Data System (ADS)

    Yang, Lei Ming

    The GEA method is employed to study light scattering by dielectric spheroids and ellipsoids with either fixed or random orientations. A simple formula is obtained for a dielectric ellipsoid. The results for a dielectric spheroid with fixed or random orientations are compared numerically and found to agree well with the T-matrix method for small-angle scattering. Numerical results for an ellipsoid are also obtained. The validity regions of the equal-volume-sphere method for a dielectric spheroid and the equal-volume-spheroid method for a dielectric ellipsoid are also discussed.

  7. A comparison of confidence interval methods for the intraclass correlation coefficient in cluster randomized trials.

    PubMed

    Ukoumunne, Obioha C

    2002-12-30

    This study compared different methods for assigning confidence intervals to the analysis of variance estimator of the intraclass correlation coefficient (rho). The context of the comparison was the use of rho to estimate the variance inflation factor when planning cluster randomized trials. The methods were compared using Monte Carlo simulations of unbalanced clustered data and data from a cluster randomized trial of an intervention to improve the management of asthma in a general practice setting. The coverage and precision of the intervals were compared for data with different numbers of clusters, mean numbers of subjects per cluster and underlying values of rho. The performance of the methods was also compared for data with Normally and non-Normally distributed cluster-specific effects. Results of the simulations showed that methods based upon the variance ratio statistic provided greater coverage levels than those based upon large sample approximations to the standard error of rho. Searle's method provided close to nominal coverage for data with Normally distributed random effects. Adjusted versions of Searle's method to allow for lack of balance in the data generally did not improve upon it either in terms of coverage or precision. Analyses of the trial data, however, showed that limits provided by Thomas and Hultquist's method may differ from those of the other variance ratio statistic methods when the arithmetic mean differs markedly from the harmonic mean cluster size. The simulation results demonstrated that marked non-Normality in the cluster level random effects compromised the performance of all methods. Confidence intervals for the methods were generally wide relative to the underlying size of rho suggesting that there may be great uncertainty associated with sample size calculations for cluster trials where large clusters are randomized. Data from cluster based studies with sample sizes much larger than those typical of cluster randomized trials are

  8. Research on Parameter Estimation Methods for Alpha Stable Noise in a Laser Gyroscope's Random Error.

    PubMed

    Wang, Xueyun; Li, Kui; Gao, Pengyu; Meng, Suxia

    2015-01-01

    Alpha stable noise, determined by four parameters, has been found in the random error of a laser gyroscope. Accurate estimation of the four parameters is the key process for analyzing the properties of alpha stable noise. Three widely used estimation methods, namely the quantile, empirical characteristic function (ECF) and logarithmic moment methods, are analyzed in contrast with Monte Carlo simulation in this paper. The estimation accuracy and the application conditions of all methods, as well as the causes of poor estimation accuracy, are illustrated. Finally, the highest precision method, ECF, is applied to 27 groups of experimental data to estimate the parameters of alpha stable noise in a laser gyroscope's random error. The cumulative probability density curve of the experimental data fitted by an alpha stable distribution is better than that by a Gaussian distribution, which verifies the existence of alpha stable noise in a laser gyroscope's random error.

  9. Safety assessment of a shallow foundation using the random finite element method

    NASA Astrophysics Data System (ADS)

    Zaskórski, Łukasz; Puła, Wojciech

    2015-04-01

    A complex structure of soil and its random character make soil modeling a cumbersome task. The heterogeneity of soil has to be considered even within a homogeneous layer, so estimating the shear strength parameters of soil for the purposes of a geotechnical analysis causes many problems. The applicable standards (Eurocode 7) do not present any explicit method for evaluating characteristic values of soil parameters; only general guidelines on how these values should be estimated can be found. Hence many approaches to assessing characteristic values of soil parameters are presented in the literature and can be applied in practice. In this paper, the reliability assessment of a shallow strip footing was conducted using a reliability index β, and several approaches to estimating characteristic values of soil properties were compared by evaluating the values of β that each of them achieves. The method of Orr and Breysse, Duncan's method, Schneider's method, Schneider's method accounting for the influence of fluctuation scales, and the method included in Eurocode 7 were examined. Design values of the bearing capacity based on these approaches were referred to the stochastic bearing capacity estimated by the random finite element method (RFEM). Design values of the bearing capacity were computed for various widths and depths of a foundation in conjunction with the design approaches (DA) defined in Eurocode. RFEM was presented by Griffiths and Fenton (1993). It combines the deterministic finite element method, random field theory and Monte Carlo simulations. Random field theory allows a random character of soil parameters to be considered within a homogeneous layer of soil. For this purpose a soil property is treated as a separate random variable at every element of the finite element mesh, with a proper correlation structure between points of the given area. RFEM was applied to estimate which theoretical

  10. Alternative exact method for random walks on finite and periodic lattices with traps

    NASA Astrophysics Data System (ADS)

    Soler, Jose M.

    1982-07-01

    An alternative general method for random walks in finite or periodic lattices with traps is presented. The method gives, in a straightforward manner and in very little computing time, the exact probability that a random walker, starting from a given site, will undergo n steps before trapping. Another version gives the probability that the walker is at any other given position after n steps. The expected walk lengths calculated for simple lattices agree exactly with those given by a previous exact method by Walsh and Kozak.
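An illustrative sketch of the exact bookkeeping idea behind such methods: instead of simulating individual walks, propagate the full probability vector of walker positions with a transition matrix, so trapping probabilities after n steps are exact rather than sampled. The 1-D ring with a single trap below is our own toy setup, not the paper's lattices.

```python
import numpy as np

n_sites = 5
P = np.zeros((n_sites, n_sites))
for i in range(1, n_sites):           # site 0 is an absorbing trap
    P[i, (i - 1) % n_sites] = 0.5
    P[i, (i + 1) % n_sites] = 0.5
P[0, 0] = 1.0                         # a trapped walker stays trapped

p = np.zeros(n_sites)
p[2] = 1.0                            # walker starts at site 2
for n in range(1, 11):
    p = p @ P
    # p[0] is the exact probability of having been trapped within n steps
print(round(p[0], 4))
```

One matrix-vector product per step replaces an arbitrarily large number of sampled walks, which is why this style of method needs "very little computing time", as the abstract puts it.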

  11. An overhang-based DNA block shuffling method for creating a customized random library

    PubMed Central

    Fujishima, Kosuke; Venter, Chris; Wang, Kendrick; Ferreira, Raphael; Rothschild, Lynn J.

    2015-01-01

    We present an overhang-based DNA block shuffling method to create a customized random DNA library with flexible sequence design and length. Our method enables the efficient and seamless assembly of short DNA blocks with dinucleotide overhangs through a simple ligation process. Next generation sequencing analysis of the assembled DNA library revealed that ligation was accurate, directional and unbiased. This straightforward DNA assembly method should fulfill the versatile needs of both in vivo and in vitro functional screening of random peptides and RNA created with a desired amino acid and nucleotide composition, as well as making highly repetitive gene constructs that are difficult to synthesize de novo. PMID:26010273

  12. Applying a weighted random forests method to extract karst sinkholes from LiDAR data

    NASA Astrophysics Data System (ADS)

    Zhu, Junfeng; Pierskalla, William P.

    2016-02-01

    Detailed mapping of sinkholes provides critical information for mitigating sinkhole hazards and understanding groundwater and surface water interactions in karst terrains. LiDAR (Light Detection and Ranging) measures the earth's surface at high resolution and high density and has shown great potential to drastically improve the location and delineation of sinkholes. However, processing LiDAR data to extract sinkholes requires separating sinkholes from other depressions, which can be laborious because of the sheer number of depressions commonly generated from LiDAR data. In this study, we applied random forests, a machine learning method, to automatically separate sinkholes from other depressions in a karst region in central Kentucky. The sinkhole-extraction random forest was grown on a training dataset built from an area where LiDAR-derived depressions were manually classified through a visual inspection and field verification process. Based on the geometry of depressions, as well as natural and human factors related to sinkholes, 11 parameters were selected as predictive variables to form the dataset. Because the training dataset was imbalanced, with the majority of depressions being non-sinkholes, a weighted random forests method was used to improve the accuracy of predicting sinkholes. The weighted random forest achieved an average accuracy of 89.95% for the training dataset, demonstrating that the random forest can be an effective sinkhole classifier. Testing of the random forest in another area, however, resulted in moderate success, with an average accuracy rate of 73.96%. This study suggests that an automatic sinkhole extraction procedure like the random forest classifier can significantly reduce time and labor costs and make it more tractable to map sinkholes from LiDAR data over large areas. However, the random forests method cannot totally replace manual procedures such as visual inspection and field verification.
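A minimal sketch of handling class imbalance with a weighted random forest, as in this study (synthetic stand-in data; the 11 LiDAR-derived predictors of the paper are not reproduced here). In scikit-learn the weighting is expressed through `class_weight`.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(4)
n_majority, n_minority = 950, 50      # mostly non-sinkhole depressions
X = np.vstack([rng.normal(0, 1, (n_majority, 5)),
               rng.normal(2, 1, (n_minority, 5))])
y = np.array([0] * n_majority + [1] * n_minority)

# class_weight="balanced" up-weights the rare (sinkhole) class so the
# forest is not dominated by the majority depressions
rf = RandomForestClassifier(n_estimators=100, class_weight="balanced",
                            random_state=0)
rf.fit(X, y)
print(rf.score(X, y))
```

Without the weighting, a classifier can reach high overall accuracy while missing most of the rare class, which is exactly the failure mode the study's weighting addresses.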

  13. Random projection-based dimensionality reduction method for hyperspectral target detection

    NASA Astrophysics Data System (ADS)

    Feng, Weiyi; Chen, Qian; He, Weiji; Arce, Gonzalo R.; Gu, Guohua; Zhuang, Jiayan

    2015-09-01

    Dimensionality reduction is a frequent preprocessing step in hyperspectral image analysis. High-dimensional data cause the issue of the "curse of dimensionality" in applications of hyperspectral imagery. In this paper, a dimensionality reduction method for hyperspectral images based on random projection (RP) for target detection was investigated. In application areas of hyperspectral imagery such as target detection, the high dimensionality of the data leads to burdensome computations. Random projection is attractive in this area because it is data independent and computationally more efficient than other widely used hyperspectral dimensionality-reduction methods, such as Principal Component Analysis (PCA) or the maximum-noise-fraction (MNF) transform. In RP, the original high-dimensional data are projected onto a low-dimensional subspace using a random matrix, which is very simple. Theoretical and experimental results indicated that random projections preserve the structure of the original high-dimensional data quite well without introducing significant distortion. In the experiments, Constrained Energy Minimization (CEM) was adopted as the target detector, and an RP-based CEM method for hyperspectral target detection was implemented, revealing that random projection may be a good alternative dimensionality reduction tool for hyperspectral images, yielding improved target detection with higher detection accuracy and lower computation time than other methods.
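A hedged sketch of the random projection step itself: a Gaussian random matrix maps the spectral dimension down while roughly preserving pairwise distances (the Johnson-Lindenstrauss property). The pixel and band counts are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(5)
n_pixels, n_bands, k = 500, 200, 40
X = rng.normal(size=(n_pixels, n_bands))   # stand-in hyperspectral pixels

# data-independent projection matrix, scaled so lengths are preserved
# in expectation
R = rng.normal(0.0, 1.0 / np.sqrt(k), size=(n_bands, k))
Y = X @ R                                  # reduced to k dimensions

# distance between two pixels before and after projection
d_orig = np.linalg.norm(X[0] - X[1])
d_proj = np.linalg.norm(Y[0] - Y[1])
print(d_orig, d_proj)
```

Because `R` is drawn without looking at the data, the projection costs only one matrix product, which is the data-independence and speed advantage the abstract contrasts against PCA and MNF.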

  14. Estimating Super Heavy Element Event Random Probabilities Using Monte Carlo Methods

    NASA Astrophysics Data System (ADS)

    Stoyer, Mark; Henderson, Roger; Kenneally, Jacqueline; Moody, Kenton; Nelson, Sarah; Shaughnessy, Dawn; Wilk, Philip

    2009-10-01

    Because superheavy element (SHE) experiments involve very low event rates and low statistics, estimating the probability that a given event sequence is due to random events is extremely important in judging the validity of the data. A Monte Carlo method developed at LLNL [1] is used on recent SHE experimental data to calculate random event probabilities. Current SHE experimental activities in collaboration with scientists at Dubna, Russia will be discussed. [1] N.J. Stoyer, et al., Nucl. Instrum. Methods Phys. Res. A 455 (2000) 433.

  15. Methods and optical fibers that decrease pulse degradation resulting from random chromatic dispersion

    DOEpatents

    Chertkov, Michael; Gabitov, Ildar

    2004-03-02

    The present invention provides methods and optical fibers for periodically pinning an actual (random) accumulated chromatic dispersion of an optical fiber to a predicted accumulated dispersion of the fiber through relatively simple modifications of fiber-optic manufacturing methods or retrofitting of existing fibers. If the pinning occurs with sufficient frequency (at a distance less than or are equal to a correlation scale), pulse degradation resulting from random chromatic dispersion is minimized. Alternatively, pinning may occur quasi-periodically, i.e., the pinning distance is distributed between approximately zero and approximately two to three times the correlation scale.

  16. A novel model and estimation method for the individual random component of earthquake ground-motion relations

    NASA Astrophysics Data System (ADS)

    Raschke, Mathias

    2016-06-01

    In this paper, I introduce a novel approach to modelling the individual random component (also called the intra-event uncertainty) of a ground-motion relation (GMR), as well as a novel approach to estimating the corresponding parameters. In essence, I contend that the individual random component is reproduced adequately by a simple stochastic mechanism of random impulses acting in the horizontal plane, with random directions. The random number of impulses is Poisson distributed. The parameters of the model were estimated according to a proposal by Raschke (2013a, J. Seismol. 17(4):1157-1182), with the sample of random differences ξ = ln(Y1) - ln(Y2), in which Y1 and Y2 are the horizontal components of local ground-motion intensity. Every GMR element is eliminated by the subtraction, except the individual random components. In the estimation procedure, the distribution of the difference ξ was approximated by combining a large Monte Carlo simulated sample and kernel smoothing. The estimated model satisfactorily fitted the differences ξ of a sample of peak ground accelerations, and the variance of the individual random components was considerably smaller than that of conventional GMRs. In addition, the dependence of the variance on the epicentre distance was considered; however, a dependence of the variance on the magnitude was not detected. Finally, the influence of the novel model and the corresponding approximations on PSHA was investigated. The applied approximations of the distribution of the individual random component were satisfactory for the researched example of PSHA.

  17. Fast egg collection method greatly improves randomness of egg sampling in Drosophila melanogaster.

    PubMed

    Schou, Mads Fristrup

    2013-01-01

    When obtaining samples for population genetic studies, it is essential that the sampling is random. For Drosophila, one of the crucial steps in sampling experimental flies is the collection of eggs. Here an egg collection method is presented, which randomizes the eggs in a water column and diminishes environmental variance. This method was compared with a traditional egg collection method where eggs are collected directly from the medium. Within each method the observed and expected standard deviations of egg-to-adult viability were compared, whereby the difference in the randomness of the samples between the two methods was assessed. The method presented here was superior to the traditional method. Only 14% of the samples had a standard deviation higher than expected, as compared with 58% in the traditional method. To reduce bias in the estimation of the variance and the mean of a trait and to obtain a representative collection of genotypes, the method presented here is strongly recommended when collecting eggs from Drosophila.

  18. Analysis of random structure-acoustic interaction problems using coupled boundary element and finite element methods

    NASA Technical Reports Server (NTRS)

    Mei, Chuh; Pates, Carl S., III

    1994-01-01

    A coupled boundary element (BEM)-finite element (FEM) approach is presented to accurately model structure-acoustic interaction systems. The boundary element method is first applied to interior, two and three-dimensional acoustic domains with complex geometry configurations. Boundary element results are very accurate when compared with limited exact solutions. Structure-interaction problems are then analyzed with the coupled FEM-BEM method, where the finite element method models the structure and the boundary element method models the interior acoustic domain. The coupled analysis is compared with exact and experimental results for a simplistic model. Composite panels are analyzed and compared with isotropic results. The coupled method is then extended for random excitation. Random excitation results are compared with uncoupled results for isotropic and composite panels.

  19. Characterization of a random anisotropic conductivity field with Karhunen-Loeve methods

    SciTech Connect

    Cherry, Matthew R.; Sabbagh, Harold S.; Pilchak, Adam L.; Knopp, Jeremy S.

    2014-02-18

    While parametric uncertainty quantification for NDE models has been addressed in recent years, the problem of stochastic field parameters such as spatially distributed electrical conductivity has only been investigated minimally in the last year. In that work, the authors treated the field as a one-dimensional random process, and Karhunen-Loeve methods were used to discretize this process to make it amenable to UQ methods such as ANOVA expansions. In the present work, we treat the field as a two-dimensional random process, and the eigenvalues and eigenfunctions of the integral operator are determined via Galerkin methods. The Karhunen-Loeve method is extended to two dimensions and implemented to represent this process. Several different choices of basis functions are discussed, as well as convergence criteria for each. The methods are applied to correlation functions collected over electron backscatter data from highly micro-textured Ti-7Al.
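An illustrative discrete Karhunen-Loeve sketch in one dimension (the paper's two-dimensional Galerkin treatment is beyond this snippet): eigendecompose a covariance matrix on a grid and synthesize random-field realizations from the leading modes. The exponential covariance and correlation length are assumptions of ours.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 100)
ell = 0.2                                 # correlation length (assumed)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / ell)

vals, vecs = np.linalg.eigh(C)            # eigh returns ascending eigenvalues
vals, vecs = vals[::-1], vecs[:, ::-1]    # sort descending

m = 20                                    # number of retained KL modes
rng = np.random.default_rng(6)
xi = rng.normal(size=m)                   # independent standard normal weights
realization = vecs[:, :m] @ (np.sqrt(vals[:m]) * xi)

# fraction of the total variance captured by the retained modes
print(vals[:m].sum() / vals.sum())
```

Truncating to a handful of modes is what makes the field amenable to downstream UQ methods such as the ANOVA expansions the abstract mentions: the random field is reduced to a small vector of independent weights.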

  20. Supervision of Student Teachers: How Adequate?

    ERIC Educational Resources Information Center

    Dean, Ken

    This study attempted to ascertain how adequately student teachers are supervised by college supervisors and supervising teachers. Questions to be answered were as follows: a) How do student teachers rate the adequacy of supervision given them by college supervisors and supervising teachers? and b) Are there significant differences between ratings…

  1. Small Rural Schools CAN Have Adequate Curriculums.

    ERIC Educational Resources Information Center

    Loustaunau, Martha

    The small rural school's foremost and largest problem is providing an adequate curriculum for students in a changing world. Often the small district cannot or is not willing to pay the per-pupil cost of curriculum specialists, specialized courses using expensive equipment no more than one period a day, and remodeled rooms to accommodate new…

  2. Toward More Adequate Quantitative Instructional Research.

    ERIC Educational Resources Information Center

    VanSickle, Ronald L.

    1986-01-01

    Sets an agenda for improving instructional research conducted with classical quantitative experimental or quasi-experimental methodology. Includes guidelines regarding the role of a social perspective, adequate conceptual and operational definition, quality instrumentation, control of threats to internal and external validity, and the use of…

  3. An Adequate Education Defined. Fastback 476.

    ERIC Educational Resources Information Center

    Thomas, M. Donald; Davis, E. E. (Gene)

    Court decisions historically have dealt with educational equity; now they are helping to establish "adequacy" as a standard in education. Legislatures, however, have been slow to enact remedies. One debate over education adequacy, though, is settled: Schools are not financed at an adequate level. This fastback is divided into three sections.…

  4. Funding the Formula Adequately in Oklahoma

    ERIC Educational Resources Information Center

    Hancock, Kenneth

    2015-01-01

    This report is a longitudinal simulation study of how the ratio of state support to local support affects the number of school districts that break the common school funding formula, which in turn affects the equity of distribution to the common schools. After nearly two decades of adequately supporting the funding formula, Oklahoma…

  5. Randomized Controlled Trial of Teaching Methods: Do Classroom Experiments Improve Economic Education in High Schools?

    ERIC Educational Resources Information Center

    Eisenkopf, Gerald; Sulser, Pascal A.

    2016-01-01

    The authors present results from a comprehensive field experiment at Swiss high schools in which they compare the effectiveness of teaching methods in economics. They randomly assigned classes into an experimental and a conventional teaching group, or a control group that received no specific instruction. Both teaching treatments improve economic…

  6. Application of the coherent anomaly method to the branching annihilation random walk

    NASA Astrophysics Data System (ADS)

    Inui, N.

    1993-12-01

    The branching annihilation random walk (in short BARW), exhibiting an extinction-survival phase transition in one dimension, is studied by the coherent anomaly method. This is the first theoretical evidence that the BARW belongs to the universality class of directed percolation.

  7. Auto Regressive Moving Average (ARMA) Modeling Method for Gyro Random Noise Using a Robust Kalman Filter.

    PubMed

    Huang, Lei

    2015-09-30

    To solve the problem in which conventional ARMA modeling methods for gyro random noise require a large number of samples and converge slowly, an ARMA modeling method using robust Kalman filtering is developed. The ARMA model parameters are employed as state arguments. Time-varying estimators are used to obtain the mean and variance of the unknown observation noise. Using robust Kalman filtering, the ARMA model parameters are estimated accurately. The developed ARMA modeling method has the advantages of rapid convergence and high accuracy; thus, the required sample size is reduced. It can be applied to modeling applications for gyro random noise in which a fast and accurate ARMA modeling method is required.
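
    As a rough illustration of the parameters-as-state idea, the sketch below estimates AR(2) coefficients of a synthetic noise sequence with a Kalman filter whose state is the parameter vector. The coefficients, noise level, and sample size are illustrative assumptions; the moving-average part and the paper's robust time-varying noise estimation are omitted.

```python
import numpy as np

# Simulate an assumed AR(2) "gyro noise" sequence
rng = np.random.default_rng(1)
a_true = np.array([0.6, -0.2])           # assumed AR coefficients
y = np.zeros(500)
for t in range(2, 500):
    y[t] = a_true @ y[t-2:t][::-1] + 0.1 * rng.standard_normal()

# Kalman filter with the parameter vector as the (static) state
theta = np.zeros(2)                      # parameter estimate
P = np.eye(2)                            # state covariance
R = 0.1**2                               # observation noise variance (assumed known here)
for t in range(2, 500):
    H = y[t-2:t][::-1]                   # regressor of past samples
    S = H @ P @ H + R                    # innovation variance
    K = P @ H / S                        # Kalman gain
    theta = theta + K * (y[t] - H @ theta)
    P = P - np.outer(K, H) @ P
```

    With a static state and no process noise this reduces to recursive least squares, so theta converges toward a_true as more samples are processed.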

  8. A shortcut through the Coulomb gas method for spectral linear statistics on random matrices

    NASA Astrophysics Data System (ADS)

    Deelan Cunden, Fabio; Facchi, Paolo; Vivo, Pierpaolo

    2016-04-01

    In the last decade, spectral linear statistics on large dimensional random matrices have attracted significant attention. Within the physics community, a privileged role has been played by invariant matrix ensembles for which a two-dimensional Coulomb gas analogy is available. We present a critical revision of the Coulomb gas method in random matrix theory (RMT) borrowing language and tools from large deviations theory. This allows us to formalize an equivalent, but more effective and quicker route toward RMT free energy calculations. Moreover, we argue that this more modern viewpoint is likely to shed further light on the interesting issues of weak phase transitions and evaporation phenomena recently observed in RMT.

  9. An efficient method for calculating RMS von Mises stress in a random vibration environment

    SciTech Connect

    Segalman, D.J.; Fulcher, C.W.G.; Reese, G.M.; Field, R.V. Jr.

    1998-02-01

    An efficient method is presented for calculation of RMS von Mises stresses from stress component transfer functions and the Fourier representation of random input forces. An efficient implementation of the method calculates the RMS stresses directly from the linear stress and displacement modes. The key relation presented is one suggested in past literature, but does not appear to have been previously exploited in this manner.
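
    The core identity behind such calculations can be illustrated directly: for zero-mean random stress components s with covariance Sigma, and A the quadratic form of the von Mises criterion, the mean-square von Mises stress is E[s' A s] = trace(A Sigma). The sketch below verifies this against Monte Carlo sampling with an arbitrary covariance; the paper's actual implementation, which builds the covariance from transfer functions and modal quantities, is not reproduced here.

```python
import numpy as np

# Von Mises quadratic form for s = (sxx, syy, szz, sxy, syz, sxz):
# s'As = sxx^2+syy^2+szz^2 - sxx*syy - syy*szz - sxx*szz + 3*(shears^2)
A = np.array([
    [ 1.0, -0.5, -0.5, 0.0, 0.0, 0.0],
    [-0.5,  1.0, -0.5, 0.0, 0.0, 0.0],
    [-0.5, -0.5,  1.0, 0.0, 0.0, 0.0],
    [ 0.0,  0.0,  0.0, 3.0, 0.0, 0.0],
    [ 0.0,  0.0,  0.0, 0.0, 3.0, 0.0],
    [ 0.0,  0.0,  0.0, 0.0, 0.0, 3.0],
])

rng = np.random.default_rng(2)
B = rng.standard_normal((6, 6))
Sigma = B @ B.T                          # illustrative SPD covariance

rms_vm = np.sqrt(np.trace(A @ Sigma))    # RMS von Mises stress

# Monte Carlo check of the identity
s = rng.multivariate_normal(np.zeros(6), Sigma, size=200000)
mc = np.sqrt(np.mean(np.einsum('ij,jk,ik->i', s, A, s)))
```

    Because the identity needs only the covariance of the stress components, no time-domain simulation of the random vibration is required.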

  10. An efficient method for calculating RMS von Mises stress in a random vibration environment

    SciTech Connect

    Segalman, D.J.; Fulcher, C.W.G.; Reese, G.M.; Field, R.V. Jr.

    1997-12-01

    An efficient method is presented for calculation of RMS von Mises stresses from stress component transfer functions and the Fourier representation of random input forces. An efficient implementation of the method calculates the RMS stresses directly from the linear stress and displacement modes. The key relation presented is one suggested in past literature, but does not appear to have been previously exploited in this manner.

  11. Methods for testing theory and evaluating impact in randomized field trials

    PubMed Central

    Brown, C. Hendricks; Wang, Wei; Kellam, Sheppard G.; Muthén, Bengt O.; Petras, Hanno; Toyinbo, Peter; Poduska, Jeanne; Ialongo, Nicholas; Wyman, Peter A.; Chamberlain, Patricia; Sloboda, Zili; MacKinnon, David P.; Windham, Amy

    2008-01-01

    Randomized field trials provide unique opportunities to examine the effectiveness of an intervention in real world settings and to test and extend both theory of etiology and theory of intervention. These trials are designed not only to test for overall intervention impact but also to examine how impact varies as a function of individual level characteristics, context, and across time. Examination of such variation in impact requires analytical methods that take into account the trial’s multiple nested structure and the evolving changes in outcomes over time. The models that we describe here merge multilevel modeling with growth modeling, allowing for variation in impact to be represented through discrete mixtures—growth mixture models—and nonparametric smooth functions—generalized additive mixed models. These methods are part of an emerging class of multilevel growth mixture models, and we illustrate these with models that examine overall impact and variation in impact. In this paper, we define intent-to-treat analyses in group-randomized multilevel field trials and discuss appropriate ways to identify, examine, and test for variation in impact without inflating the Type I error rate. We describe how to make causal inferences more robust to misspecification of covariates in such analyses and how to summarize and present these interactive intervention effects clearly. Practical strategies for reducing model complexity, checking model fit, and handling missing data are discussed using six randomized field trials to show how these methods may be used across trials randomized at different levels. PMID:18215473

  12. Local search methods based on variable focusing for random K-satisfiability.

    PubMed

    Lemoy, Rémi; Alava, Mikko; Aurell, Erik

    2015-01-01

    We introduce variable-focused local search algorithms for satisfiability problems. Usual approaches focus uniformly on unsatisfied clauses. The methods described here work by focusing on random variables in unsatisfied clauses. Variants are considered where variables are selected uniformly and randomly or by introducing a bias towards picking variables participating in several unsatisfied clauses. These are studied in the case of the random 3-SAT problem, together with an alternative energy definition, the number of variables in unsatisfied constraints. The variable-based focused Metropolis search (V-FMS) is found to be quite close in performance to the standard clause-based FMS at optimal noise. At infinite noise, instead, the threshold for the linearity of solution times with instance size is improved by preferentially picking variables in several UNSAT clauses. Consequences for algorithmic design are discussed. PMID:25679737

  13. Local search methods based on variable focusing for random K -satisfiability

    NASA Astrophysics Data System (ADS)

    Lemoy, Rémi; Alava, Mikko; Aurell, Erik

    2015-01-01

    We introduce variable-focused local search algorithms for satisfiability problems. Usual approaches focus uniformly on unsatisfied clauses. The methods described here work by focusing on random variables in unsatisfied clauses. Variants are considered where variables are selected uniformly and randomly or by introducing a bias towards picking variables participating in several unsatisfied clauses. These are studied in the case of the random 3-SAT problem, together with an alternative energy definition, the number of variables in unsatisfied constraints. The variable-based focused Metropolis search (V-FMS) is found to be quite close in performance to the standard clause-based FMS at optimal noise. At infinite noise, instead, the threshold for the linearity of solution times with instance size is improved by preferentially picking variables in several UNSAT clauses. Consequences for algorithmic design are discussed.
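
    The variable-focusing idea can be sketched in a few lines: moves act on variables occurring in unsatisfied clauses rather than on clauses directly. The flip rule below (noisy greedy over focused variables) is a simplified assumption for illustration, not the exact V-FMS rule from the paper.

```python
import random

def solve(clauses, n_vars, noise=0.3, max_flips=100000, seed=0):
    """Variable-focused local search for SAT (simplified sketch)."""
    rng = random.Random(seed)
    assign = [rng.choice([True, False]) for _ in range(n_vars)]

    def n_sat():
        return sum(any(assign[abs(l) - 1] == (l > 0) for l in cl)
                   for cl in clauses)

    for _ in range(max_flips):
        unsat = [cl for cl in clauses
                 if not any(assign[abs(l) - 1] == (l > 0) for l in cl)]
        if not unsat:
            return assign
        # focus: variables occurring in at least one unsatisfied clause
        focus = sorted({abs(l) - 1 for cl in unsat for l in cl})
        if rng.random() < noise:
            v = rng.choice(focus)            # random-walk move
        else:
            def gain(var):                   # clauses satisfied after flipping var
                assign[var] = not assign[var]
                g = n_sat()
                assign[var] = not assign[var]
                return g
            v = max(focus, key=gain)         # greedy move
        assign[v] = not assign[v]
    return None

# Tiny satisfiable instance (clauses are tuples of signed literals)
clauses = [(1, 2, 3), (-1, -2, 3), (1, -3, 2)]
model = solve(clauses, 3)
```

    A production solver would track satisfied-clause counts incrementally instead of recomputing them, but the structure of the search is the same.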

  14. Pixel partition method using Markov random field for measurements of closely spaced objects by optical sensors

    NASA Astrophysics Data System (ADS)

    Wang, Xueying; Li, Jun; Sheng, Weidong; An, Wei; Du, Qinfeng

    2015-10-01

    In space-based optical systems, tracking closely spaced objects (CSOs) with traditional constant false alarm rate (CFAR) detection yields either more clutter measurements or a loss of target information. CSOs can be tracked as extended targets because of their features on the optical sensor's pixel plane. A pixel partition method under the framework of Markov random fields (MRF) is proposed. Simulation results indicate that the proposed method provides higher pixel partition performance than the traditional method, especially when the signal-to-noise ratio is poor.

  15. A method to dynamic stochastic multicriteria decision making with log-normally distributed random variables.

    PubMed

    Wang, Xin-Fan; Wang, Jian-Qiang; Deng, Sheng-Yue

    2013-01-01

    We investigate dynamic stochastic multicriteria decision making (SMCDM) problems in which the criterion values take the form of log-normally distributed random variables and the argument information is collected from different periods. We propose two new geometric aggregation operators, the log-normal distribution weighted geometric (LNDWG) operator and the dynamic log-normal distribution weighted geometric (DLNDWG) operator, and develop a method for dynamic SMCDM with log-normally distributed random variables. This method uses the DLNDWG and LNDWG operators to aggregate the log-normally distributed criterion values, Shannon's entropy model to generate the time weight vector, and the expectations and variances of the log-normal distributions to rank the alternatives and select the best one. Finally, an example is given to illustrate the feasibility and effectiveness of the developed method.
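
    A hedged sketch of the underlying aggregation step: a weighted geometric mean of independent log-normal variables LN(mu_i, sigma_i^2) is itself log-normal with parameters (sum w_i*mu_i, sum w_i^2*sigma_i^2), and alternatives can then be ranked by expectation. The function names, the example parameters, and the expectation-based ranking rule are illustrative assumptions, not the paper's exact operators.

```python
import math

def lndwg(params, weights):
    """Weighted geometric aggregation of log-normal (mu, sigma^2) pairs."""
    mu = sum(w * m for (m, _), w in zip(params, weights))
    var = sum(w * w * s2 for (_, s2), w in zip(params, weights))
    return mu, var

def expectation(mu, var):
    """Mean of LN(mu, var): exp(mu + var/2)."""
    return math.exp(mu + var / 2.0)

# Assumed criterion values (mu, sigma^2) and weights for one alternative
criteria = [(0.2, 0.04), (0.5, 0.09), (0.1, 0.01)]
weights = [0.5, 0.3, 0.2]
mu, var = lndwg(criteria, weights)
score = expectation(mu, var)
```

    Repeating this for each alternative and sorting by score gives the ranking step described in the abstract.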

  16. On analysis-based two-step interpolation methods for randomly sampled seismic data

    NASA Astrophysics Data System (ADS)

    Yang, Pengliang; Gao, Jinghuai; Chen, Wenchao

    2013-02-01

    Interpolating the missing traces of regularly or irregularly sampled seismic record is an exceedingly important issue in the geophysical community. Many modern acquisition and reconstruction methods are designed to exploit the transform domain sparsity of the few randomly recorded but informative seismic data using thresholding techniques. In this paper, to regularize randomly sampled seismic data, we introduce two accelerated, analysis-based two-step interpolation algorithms, the analysis-based FISTA (fast iterative shrinkage-thresholding algorithm) and the FPOCS (fast projection onto convex sets) algorithm from the IST (iterative shrinkage-thresholding) algorithm and the POCS (projection onto convex sets) algorithm. A MATLAB package is developed for the implementation of these thresholding-related interpolation methods. Based on this package, we compare the reconstruction performance of these algorithms, using synthetic and real seismic data. Combined with several thresholding strategies, the accelerated convergence of the proposed methods is also highlighted.
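
    The iterative shrinkage-thresholding idea can be illustrated on a 1-D stand-in for the trace-interpolation problem: recover a signal that is sparse in the DFT domain from randomly kept samples using FISTA with a soft threshold. The operators, threshold, and signal are assumed for illustration; the paper's seismic-data transforms and its FPOCS variant are not reproduced.

```python
import numpy as np

def fista(mask, y, lam=0.02, iters=200):
    """FISTA for min 0.5*||M F^{-1} x - y||^2 + lam*||x||_1 (unitary DFT)."""
    n = len(mask)
    x = np.zeros(n, dtype=complex)       # DFT-domain coefficients
    z, t = x.copy(), 1.0
    for _ in range(iters):
        resid = mask * np.fft.ifft(z) * np.sqrt(n) - y
        g = np.fft.fft(mask * resid) / np.sqrt(n)   # gradient (Lipschitz 1)
        w = z - g
        # complex soft threshold: shrink magnitude, keep phase
        x_new = np.exp(1j * np.angle(w)) * np.maximum(np.abs(w) - lam, 0)
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        z = x_new + ((t - 1) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return np.fft.ifft(x) * np.sqrt(n)

rng = np.random.default_rng(3)
n = 128
coeffs = np.zeros(n, dtype=complex)
coeffs[[3, 10, 40]] = [5, 3, 4]          # sparse spectrum
signal = np.fft.ifft(coeffs) * np.sqrt(n)
mask = rng.random(n) < 0.5               # keep ~50% of samples at random
rec = fista(mask, mask * signal)
```

    The random sampling is what makes the missing-data artifacts incoherent in the transform domain, so thresholding can separate signal from aliasing.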

  17. A Bloch decomposition-based stochastic Galerkin method for quantum dynamics with a random external potential

    NASA Astrophysics Data System (ADS)

    Wu, Zhizhang; Huang, Zhongyi

    2016-07-01

    In this paper, we consider the numerical solution of the one-dimensional Schrödinger equation with a periodic lattice potential and a random external potential. This is an important model in solid state physics where the randomness results from complicated phenomena that are not exactly known. Here we generalize the Bloch decomposition-based time-splitting pseudospectral method to the stochastic setting using the generalized polynomial chaos with a Galerkin procedure so that the main effects of dispersion and periodic potential are still computed together. We prove that our method is unconditionally stable and numerical examples show that it has other nice properties and is more efficient than the traditional method. Finally, we give some numerical evidence for the well-known phenomenon of Anderson localization.

  18. Application of random forests method to predict the retention indices of some polycyclic aromatic hydrocarbons.

    PubMed

    Goudarzi, N; Shahsavani, D; Emadi-Gandaghi, F; Chamjangali, M Arab

    2014-03-14

    In this work, a quantitative structure-retention relationship (QSRR) investigation was carried out based on the random forests (RF) method for prediction of the retention indices (RIs) of some polycyclic aromatic hydrocarbon (PAH) compounds. The RIs of these compounds were calculated using theoretical descriptors generated from their molecular structures. The effects of the important parameters governing the predictive ability of the RF, namely the number of trees (nt) and the number of randomly selected variables used to split each node (m), were investigated. Optimization of these parameters showed that the RF method gives the best results at m=70 and nt=460. The performance of the RF model was also compared with that of the artificial neural network (ANN) and multiple linear regression (MLR) techniques. The results obtained show the relative superiority of the RF method over the MLR and ANN ones.
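
    The tuning setup maps directly onto standard random forest hyperparameters: nt is `n_estimators` and m is `max_features` in scikit-learn's implementation. The sketch below uses synthetic stand-in descriptors, so the values chosen here are illustrative; the paper's optimum (nt=460, m=70) refers to its own descriptor set.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Synthetic stand-in for (descriptors, retention indices)
rng = np.random.default_rng(4)
X = rng.standard_normal((200, 30))               # 30 hypothetical descriptors
y = 2 * X[:, 0] - X[:, 1] + 0.1 * rng.standard_normal(200)

# nt -> n_estimators, m -> max_features; OOB score gives a built-in
# validation estimate without a separate test set
model = RandomForestRegressor(n_estimators=100, max_features=10,
                              oob_score=True, random_state=0)
model.fit(X, y)
print(model.oob_score_)                          # out-of-bag R^2
```

    In practice one would grid-search over (n_estimators, max_features) using the OOB score or cross-validation, which is the optimization the abstract describes.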

  19. Fourier mode analysis of multigrid methods for partial differential equations with random coefficients

    SciTech Connect

    Seynaeve, Bert; Rosseel, Eveline; Nicolai, Bart; Vandewalle, Stefan

    2007-05-20

    Partial differential equations with random coefficients appear for example in reliability problems and uncertainty propagation models. Various approaches exist for computing the stochastic characteristics of the solution of such a differential equation. In this paper, we consider the spectral expansion approach. This method transforms the continuous model into a large discrete algebraic system. We study the convergence properties of iterative methods for solving this discretized system. We consider one-level and multi-level methods. The classical Fourier mode analysis technique is extended towards the stochastic case. This is done by taking the eigenstructure into account of a certain matrix that depends on the random structure of the problem. We show how the convergence properties depend on the particulars of the algorithm, on the discretization parameters and on the stochastic characteristics of the model. Numerical results are added to illustrate some of our theoretical findings.

  20. A systematic review of randomized controlled trials on sterilization methods of extracted human teeth

    PubMed Central

    Western, J. Sylvia; Dicksit, Daniel Devaprakash

    2016-01-01

    Aim of this Study: The aim was to evaluate the efficiency of different sterilization methods on extracted human teeth (EHT) by a systematic review of in vitro randomized controlled trials. Methodology: An extensive electronic database literature search concerning the sterilization of EHT was conducted. The search terms used were “human teeth, sterilization, disinfection, randomized controlled trials, and infection control.” Randomized controlled trials aimed at comparing the efficiency of different methods of sterilization of EHT were all included in this systematic review. Results: Out of 1618 articles obtained, eight articles were selected for this systematic review. The sterilization methods reviewed were autoclaving, 10% formalin, 5.25% sodium hypochlorite, 3% hydrogen peroxide, 2% glutaraldehyde, 0.1% thymol, and boiling at 100°C. Data were extracted from the selected individual studies and their findings were summarized. Conclusion: Autoclaving and 10% formalin can be considered 100% efficient and reliable methods, whereas 5.25% sodium hypochlorite, 3% hydrogen peroxide, 2% glutaraldehyde, 0.1% thymol, and boiling at 100°C were inefficient and unreliable methods of sterilizing EHT. PMID:27563183

  1. A self-adaptive method for creating high efficiency communication channels through random scattering media.

    PubMed

    Hao, Xiang; Martin-Rouault, Laure; Cui, Meng

    2014-07-29

    Controlling the propagation of electromagnetic waves is important to a broad range of applications. Recent advances in controlling wave propagation in random scattering media have enabled optical focusing and imaging inside random scattering media. In this work, we propose and demonstrate a new method to deliver optical power more efficiently through scattering media. Drastically different from the random matrix characterization approach, our method can rapidly establish high efficiency communication channels using just a few measurements, regardless of the number of optical modes, and provides a practical and robust solution to boost the signal levels in optical or short wave communications. We experimentally demonstrated analog and digital signal transmission through highly scattering media with greatly improved performance. Besides scattering, our method can also reduce the loss of signal due to absorption. Experimentally, we observed that our method forced light to go around absorbers, leading to even higher signal improvement than in the case of purely scattering media. Interestingly, the resulting signal improvement is highly directional, which provides a new means against eavesdropping.

  2. Wave propagation through random media: A local method of small perturbations based on the Helmholtz equation

    NASA Technical Reports Server (NTRS)

    Grosse, Ralf

    1990-01-01

    Propagation of sound through the turbulent atmosphere is a statistical problem. The randomness of the refractive index field causes sound pressure fluctuations. Although no general theory to predict sound pressure statistics from given refractive index statistics exists, there are several approximate solutions to the problem. The most common approximation is the parabolic equation method. Results obtained by this method are restricted to small refractive index fluctuations and to small wave lengths. While the first condition is generally met in the atmosphere, it is desirable to overcome the second. A generalization of the parabolic equation method with respect to the small wave length restriction is presented.

  3. Reduced basis ANOVA methods for partial differential equations with high-dimensional random inputs

    NASA Astrophysics Data System (ADS)

    Liao, Qifeng; Lin, Guang

    2016-07-01

    In this paper we present a reduced basis ANOVA approach for partial differential equations (PDEs) with random inputs. The ANOVA method combined with stochastic collocation methods provides model reduction in high-dimensional parameter space through decomposing high-dimensional inputs into unions of low-dimensional inputs. In this work, to further reduce the computational cost, we investigate spatial low-rank structures in the ANOVA-collocation method, and develop efficient spatial model reduction techniques using hierarchically generated reduced bases. We present a general mathematical framework of the methodology, validate its accuracy and demonstrate its efficiency with numerical experiments.

  4. Improved random-starting method for the EM algorithm for finite mixtures of regressions.

    PubMed

    Schepers, Jan

    2015-03-01

    Two methods for generating random starting values for the expectation maximization (EM) algorithm are compared in terms of yielding maximum likelihood parameter estimates in finite mixtures of regressions. One of these methods is ubiquitous in applications of finite mixture regression, whereas the other method is an alternative that appears not to have been used so far. The two methods are compared in two simulation studies and on an illustrative data set. The results show that the alternative method yields solutions with likelihood values at least as high as, and often higher than, those returned by the standard method. Moreover, analyses of the illustrative data set show that the results obtained by the two methods may differ considerably with regard to some of the substantive conclusions. The results reported in this article indicate that in applications of finite mixture regression, consideration should be given to the type of mechanism chosen to generate random starting values for the EM algorithm. In order to facilitate the use of the proposed alternative method, an R function implementing the approach is provided in the Appendix of the article.
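
    One common random-starting device for EM in mixture regression is to initialize from random responsibilities. The sketch below implements that scheme for a two-component mixture of linear regressions on synthetic data; it illustrates the setting of the study, not either of the two specific starting methods it compares, and all numbers are assumptions.

```python
import numpy as np

def em_mixreg(x, y, iters=200, seed=0):
    """EM for a two-component mixture of linear regressions,
    started from random responsibilities."""
    rng = np.random.default_rng(seed)
    X = np.column_stack([np.ones_like(x), x])
    z = rng.random(len(y))                       # random responsibilities
    resp = np.column_stack([z, 1 - z])
    for _ in range(iters):
        pi = resp.mean(0)                        # M-step: mixing weights
        betas, sig2 = [], []
        for k in range(2):                       # weighted LS per component
            w = resp[:, k]
            beta = np.linalg.solve(X.T @ (X * w[:, None]), X.T @ (w * y))
            r = y - X @ beta
            betas.append(beta)
            sig2.append(max((w * r * r).sum() / w.sum(), 1e-6))
        dens = np.column_stack([                 # E-step: posteriors
            pi[k] / np.sqrt(2 * np.pi * sig2[k])
            * np.exp(-(y - X @ betas[k]) ** 2 / (2 * sig2[k]))
            for k in range(2)])
        resp = dens / dens.sum(1, keepdims=True)
    return betas, pi

# Well-separated synthetic mixture: y = 3x + 1 vs y = -3x
rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 300)
comp = rng.random(300) < 0.5
y = np.where(comp, 3 * x + 1, -3 * x) + 0.1 * rng.standard_normal(300)
betas, pi = em_mixreg(x, y)
slopes = sorted(b[1] for b in betas)
```

    Because EM only finds local maxima, the quality of the final likelihood depends on this initialization, which is exactly the sensitivity the article investigates.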

  5. A Bayesian hierarchical method to account for random effects in cytogenetic dosimetry based on calibration curves.

    PubMed

    Mano, Shuhei; Suto, Yumiko

    2014-11-01

    The dicentric chromosome assay (DCA) is one of the most sensitive and reliable methods of inferring doses of radiation exposure in patients. In DCA, a calibration curve is prepared in advance by in vitro irradiation of blood samples from one, or sometimes multiple, healthy donors to account for possible inter-individual variability. Although the standard method has been demonstrated to be quite accurate for actual dose estimates, it cannot account for random effects, which arise from sources such as the blood donor used to prepare the calibration curve, the radiation-exposed patient, and the examiners. To date, it is unknown how these random effects impact the standard method of dose estimation. We propose a novel Bayesian hierarchical method that incorporates random effects into the dose estimation. To demonstrate dose estimation by the proposed method and to assess the impact of inter-individual variability in samples from multiple donors, peripheral blood samples from 13 occupationally non-exposed, non-smoking, healthy individuals were collected and irradiated with gamma rays. The results clearly showed significant inter-individual variability, and the standard method using a sample from a single donor gave an anti-conservative confidence interval for the irradiated dose. In contrast, the Bayesian credible interval calculated by the proposed method using samples from multiple donors properly covered the actual doses. Although the classical confidence interval of the calibration curve accounting for inter-individual variability in samples from multiple donors roughly coincided with the Bayesian credible interval, the proposed method is better grounded and allows for extensions.

  6. Novel copyright information hiding method based on random phase matrix of Fresnel diffraction transforms

    NASA Astrophysics Data System (ADS)

    Cao, Chao; Chen, Ru-jun

    2009-10-01

    In this paper, we present a new copyright information hiding method for digital images in Moiré fringe formats. The copyright information is embedded into the protected image and the detecting image based on a Fresnel phase matrix. First, using the Fresnel diffraction transform, the random phase matrix of the copyright information is generated. Then, according to the Moiré fringe principle, the protected image and the detecting image are modulated respectively based on the random phase matrix, and the copyright information is embedded into them. When the protected image and the detecting image are overlapped, the copyright information reappears. Experimental results show that our method has good concealment performance and offers a new approach to copyright protection.

  7. Chosen-plaintext attack on double-random-phase-encoding-based image hiding method

    NASA Astrophysics Data System (ADS)

    Xu, Hongsheng; Li, Guirong; Zhu, Xianchen

    2015-12-01

    By using optical image processing techniques, a novel text encryption and hiding method based on the double-random-phase-encoding technique is proposed in this paper. First, the secret message is transformed into a 2-dimensional array. The higher bits of the elements in the array are filled with the bit stream of the secret text, while the lower bits store specific values. Then, the transformed array is encoded by the double random phase encoding technique. Last, the encoded array is embedded in a public host image to obtain the image with embedded hidden text. The performance of the proposed technique is tested via analytical modeling and test data streams. Experimental results show that the secret text can be recovered accurately or almost accurately, while maintaining the quality of the host image, by properly selecting the method of transforming the secret text into an array and the superimposition coefficient.

  8. Effective conductivity of particulate polymer composite electrolytes using random resistor network method

    SciTech Connect

    Kalnaus, Sergiy; Sabau, Adrian S; Newman, Sarah M; Tenhaeff, Wyatt E; Daniel, Claus; Dudney, Nancy J

    2011-01-01

    The effective DC conductivity of particulate composite electrolytes was obtained by solving the electrostatics equations using the random resistor network method in three dimensions. The composite structure was considered to consist of three phases: matrix, particulate filler, and a conductive shell surrounding each particle, each phase possessing a different conductivity. Different particle size distributions were generated using Monte Carlo simulations. Unlike effective medium formulations, the random resistor network method was shown to predict percolation thresholds for the effective composite conductivity. The mean particle radius was found to have a greater influence on the effective composite conductivity than the type of particle size distribution considered. The effect of shell thickness on the composite conductivity was also investigated: the conductivity enhancement due to the conductive shell phase becomes less evident as the shell thickness increases.
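
    The resistor-network idea can be sketched on a much-reduced problem: a 2-D lattice with two bond conductivities instead of the paper's 3-D three-phase microstructure (all sizes and conductivities below are assumptions). Bond conductances are assigned from the random phase map, a unit voltage drop is imposed across the sample, and Kirchhoff's equations are solved for the node potentials; the total current then gives the effective conductance.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 10                                   # nodes per side
# 30% of bonds are "filler" (conductance 10), the rest "matrix" (1)
sigh = np.where(rng.random((n, n - 1)) < 0.3, 10.0, 1.0)  # horizontal bonds
sigv = np.where(rng.random((n - 1, n)) < 0.3, 10.0, 1.0)  # vertical bonds

# Assemble the network Laplacian G
N = n * n
G = np.zeros((N, N))
def add(i, j, g):
    G[i, i] += g; G[j, j] += g; G[i, j] -= g; G[j, i] -= g
for r in range(n):
    for c in range(n - 1):
        add(r * n + c, r * n + c + 1, sigh[r, c])
for r in range(n - 1):
    for c in range(n):
        add(r * n + c, (r + 1) * n + c, sigv[r, c])

# Electrodes: left column at V=1, right column at V=0
V = np.zeros(N)
fixed = np.zeros(N, dtype=bool)
for r in range(n):
    fixed[r * n] = True; V[r * n] = 1.0
    fixed[r * n + n - 1] = True
free = ~fixed
V[free] = np.linalg.solve(G[np.ix_(free, free)], -G[np.ix_(free, fixed)] @ V[fixed])

# Total current leaving the V=1 electrode = effective conductance
I = sum(sigh[r, 0] * (V[r * n] - V[r * n + 1]) for r in range(n))
```

    Repeating this over many random phase maps and filler fractions traces out the percolation behavior of the effective conductivity that the abstract describes.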

  9. Random vs. Combinatorial Methods for Discrete Event Simulation of a Grid Computer Network

    NASA Technical Reports Server (NTRS)

    Kuhn, D. Richard; Kacker, Raghu; Lei, Yu

    2010-01-01

    This study compared random and t-way combinatorial inputs of a network simulator, to determine if these two approaches produce significantly different deadlock detection for varying network configurations. Modeling deadlock detection is important for analyzing configuration changes that could inadvertently degrade network operations, or for determining modifications that could be made by attackers to deliberately induce deadlock. Discrete event simulation of a network may be conducted using random generation of inputs. In this study, we compare random with combinatorial generation of inputs. Combinatorial (or t-way) testing requires every combination of any t parameter values to be covered by at least one test. Combinatorial methods can be highly effective because empirical data suggest that nearly all failures involve the interaction of a small number of parameters (1 to 6). Thus, for example, if all deadlocks involve at most 5-way interactions between n parameters, then exhaustive testing of all n-way interactions adds no additional information that would not be obtained by testing all 5-way interactions. While the maximum degree of interaction between parameters involved in the deadlocks clearly cannot be known in advance, covering all t-way interactions may be more efficient than random generation of inputs. In this study we tested this hypothesis for t = 2, 3, and 4 for deadlock detection in a network simulation. Achieving the same degree of coverage provided by 4-way tests would have required approximately 3.2 times as many random tests; thus combinatorial methods were more efficient for detecting deadlocks involving a higher degree of interaction. The paper reviews explanations for these results and implications for modeling and simulation.
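
    The t-way coverage requirement can be illustrated with a small greedy generator for t = 2: repeatedly pick the candidate test that covers the most still-uncovered parameter-value pairs. This is a naive sketch of the idea, not the covering-array tool used in the study.

```python
from itertools import combinations, product

def pairwise_tests(domains):
    """Greedy 2-way covering suite over the given parameter domains."""
    idx = range(len(domains))
    uncovered = {(i, a, j, b)
                 for i, j in combinations(idx, 2)
                 for a in domains[i] for b in domains[j]}
    candidates = list(product(*domains))
    tests = []
    while uncovered:
        # choose the candidate covering the most uncovered pairs
        best = max(candidates,
                   key=lambda t: sum((i, t[i], j, t[j]) in uncovered
                                     for i, j in combinations(idx, 2)))
        tests.append(best)
        for i, j in combinations(idx, 2):
            uncovered.discard((i, best[i], j, best[j]))
    return tests

# Four binary parameters: exhaustive testing needs 2^4 = 16 tests,
# but pairwise coverage needs far fewer
suite = pairwise_tests([[0, 1], [0, 1], [0, 1], [0, 1]])
```

    The gap between the suite size and the exhaustive 2^4 grows rapidly with the number of parameters, which is the efficiency argument made in the abstract.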

  10. Percolation of the site random-cluster model by Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Wang, Songsong; Zhang, Wanzhou; Ding, Chengxiang

    2015-08-01

    We propose a site random-cluster model by introducing an additional cluster weight in the partition function of traditional site percolation. To simulate the model on a square lattice, we combine the color-assignation and the Swendsen-Wang methods to design a highly efficient cluster algorithm with a small critical slowing-down phenomenon. To verify whether or not it is consistent with the bond random-cluster model, we measure several quantities, such as the wrapping probability Re, the percolating cluster density P∞, and the magnetic susceptibility per site χp, as well as two exponents, the thermal exponent yt and the fractal dimension yh of the percolating cluster. We find that for different values of the cluster weight q = 1.5, 2, 2.5, 3, 3.5, and 4, the numerical estimates of the exponents yt and yh are consistent with the theoretical values. The universalities of the site random-cluster model and the bond random-cluster model are completely identical. For larger values of q, we find obvious signatures of a first-order percolation transition in the histograms and the hysteresis loops of the percolating cluster density and the energy per site. Our results are helpful for understanding the percolation of traditional statistical models.

  11. Thermodynamic method for generating random stress distributions on an earthquake fault

    USGS Publications Warehouse

    Barall, Michael; Harris, Ruth A.

    2012-01-01

    This report presents a new method for generating random stress distributions on an earthquake fault, suitable for use as initial conditions in a dynamic rupture simulation. The method employs concepts from thermodynamics and statistical mechanics. A pattern of fault slip is considered to be analogous to a micro-state of a thermodynamic system. The energy of the micro-state is taken to be the elastic energy stored in the surrounding medium. Then, the Boltzmann distribution gives the probability of a given pattern of fault slip and stress. We show how to decompose the system into independent degrees of freedom, which makes it computationally feasible to select a random state. However, due to the equipartition theorem, straightforward application of the Boltzmann distribution leads to a divergence which predicts infinite stress. To avoid equipartition, we show that the finite strength of the fault acts to restrict the possible states of the system. By analyzing a set of earthquake scaling relations, we derive a new formula for the expected power spectral density of the stress distribution, which allows us to construct a computer algorithm free of infinities. We then present a new technique for controlling the extent of the rupture by generating a random stress distribution thousands of times larger than the fault surface, and selecting a portion which, by chance, has a positive stress perturbation of the desired size. Finally, we present a new two-stage nucleation method that combines a small zone of forced rupture with a larger zone of reduced fracture energy.
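The spectral step of such a construction can be illustrated with random-phase synthesis: draw spectral amplitudes from a prescribed power spectral density, randomize the phases, and invert the transform. The 1D profile and power-law exponent below are assumptions for illustration, not the paper's derived spectrum for fault stress:

```python
import cmath
import math
import random

def random_profile(n, exponent, rng):
    # random-phase synthesis: spectral amplitude ~ k**(-exponent/2),
    # uniformly random phase, Hermitian symmetry so the signal is real
    spec = [0j] * n
    for k in range(1, n // 2):
        amp = k ** (-exponent / 2)
        phase = rng.uniform(0.0, 2.0 * math.pi)
        spec[k] = amp * cmath.exp(1j * phase)
        spec[n - k] = spec[k].conjugate()
    # naive inverse DFT, fine for a sketch (use numpy.fft.ifft in practice)
    return [sum(spec[k] * cmath.exp(2j * math.pi * k * x / n)
                for k in range(n)).real / n
            for x in range(n)]

rng = random.Random(2)
profile = random_profile(128, 2.0, rng)   # zero-mean random "stress" profile
```

Because the k = 0 component is left at zero, the synthesized profile has zero mean; truncating the spectrum at low k is what keeps such a construction free of the divergence discussed in the abstract.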

  12. Methods of Blinding in Reports of Randomized Controlled Trials Assessing Pharmacologic Treatments: A Systematic Review

    PubMed Central

    Boutron, Isabelle; Estellat, Candice; Guittet, Lydia; Dechartres, Agnes; Sackett, David L; Hróbjartsson, Asbjørn; Ravaud, Philippe

    2006-01-01

    Background Blinding is a cornerstone of therapeutic evaluation because lack of blinding can bias treatment effect estimates. An inventory of the blinding methods would help trialists conduct high-quality clinical trials and readers appraise the quality of results of published trials. We aimed to systematically classify and describe methods to establish and maintain blinding of patients and health care providers and methods to obtain blinding of outcome assessors in randomized controlled trials of pharmacologic treatments. Methods and Findings We undertook a systematic review of all reports of randomized controlled trials assessing pharmacologic treatments with blinding published in 2004 in high impact-factor journals from Medline and the Cochrane Methodology Register. We used a standardized data collection form to extract data. The blinding methods were classified according to whether they primarily (1) established blinding of patients or health care providers, (2) maintained the blinding of patients or health care providers, and (3) obtained blinding of assessors of the main outcomes. We identified 819 articles, with 472 (58%) describing the method of blinding. Methods to establish blinding of patients and/or health care providers concerned mainly treatments provided in identical form, specific methods to mask some characteristics of the treatments (e.g., added flavor or opaque coverage), or use of double dummy procedures or simulation of an injection. Methods to avoid unblinding of patients and/or health care providers involved use of active placebo, centralized assessment of side effects, patients informed only in part about the potential side effects of each treatment, centralized adapted dosage, or provision of sham results of complementary investigations. The methods reported for blinding outcome assessors mainly relied on a centralized assessment of complementary investigations, clinical examination (i.e., use of video, audiotape, or photography), or

  13. A stochastic simulation method for the assessment of resistive random access memory retention reliability

    SciTech Connect

    Berco, Dan; Tseng, Tseung-Yuen

    2015-12-21

    This study presents an evaluation method for resistive random access memory retention reliability based on the Metropolis Monte Carlo algorithm and Gibbs free energy. The method, which does not rely on a time evolution, provides an extremely efficient way to compare the relative retention properties of metal-insulator-metal structures. It requires a small number of iterations and may be used for statistical analysis. The presented approach is used to compare the relative robustness of a single layer ZrO{sub 2} device with a double layer ZnO/ZrO{sub 2} one, and obtain results which are in good agreement with experimental data.
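The core Metropolis acceptance rule the abstract refers to can be sketched as follows. The quadratic free-energy landscape is a hypothetical stand-in, not the paper's Gibbs free-energy model of a metal-insulator-metal stack:

```python
import math
import random

def metropolis(free_energy, state, steps, kT, rng):
    # Metropolis Monte Carlo: propose a unit move, accept with
    # probability min(1, exp(-dG / kT))
    for _ in range(steps):
        proposal = state + rng.choice((-1, 1))
        dG = free_energy(proposal) - free_energy(state)
        if dG <= 0.0 or rng.random() < math.exp(-dG / kT):
            state = proposal
    return state

# toy quadratic free-energy landscape with its minimum at 0, standing in for
# the Gibbs free energy of a filament configuration (not the paper's model)
G = lambda x: 0.05 * x * x
rng = random.Random(3)
samples = [metropolis(G, 10, 500, kT=1.0, rng=rng) for _ in range(200)]
mean_state = sum(samples) / len(samples)
```

After relaxation the samples scatter around the free-energy minimum with a Boltzmann spread set by kT, which is the sense in which such a scheme compares relative stability without simulating time evolution.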

  14. Low-noise multiple watermarks technology based on complex double random phase encoding method

    NASA Astrophysics Data System (ADS)

    Zheng, Jihong; Lu, Rongwen; Sun, Liujie; Zhuang, Songlin

    2010-11-01

    Based on the double random phase encoding method (DRPE), watermarking technology can provide a stable and robust way to protect the copyright of printed material. However, due to its linear character, DRPE carries a serious security risk when attacked. In this paper, a complex coding method, which adds chaotic encryption based on logistic mapping before the DRPE coding, is presented and simulated. The results show that the complex method provides better security protection for the watermark. Furthermore, low-noise multiple watermarking is studied, in which multiple watermarks are embedded into one host image and decrypted individually with the corresponding phase keys. Digital simulation and mathematical analysis show that, for the same total embedding weight factor, multiple watermarking significantly improves the signal-to-noise ratio (SNR) of the output printed image. The complex multiple watermark method may provide robust, stable, and reliable copyright protection with higher-quality printed images.
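The logistic-map stage can be illustrated as a key-dependent permutation applied before the (not shown) DRPE step. The scrambling scheme below is a generic illustration of chaotic scrambling, not necessarily the authors' construction; the key values are arbitrary:

```python
def logistic_sequence(x0, r, n, burn=100):
    # iterate x <- r * x * (1 - x), discarding a transient of `burn` steps
    x = x0
    for _ in range(burn):
        x = r * x * (1 - x)
    seq = []
    for _ in range(n):
        x = r * x * (1 - x)
        seq.append(x)
    return seq

def scramble(pixels, key=(0.3, 3.99)):
    # key-dependent permutation: sort indices by the chaotic sequence
    seq = logistic_sequence(key[0], key[1], len(pixels))
    order = sorted(range(len(pixels)), key=seq.__getitem__)
    return [pixels[i] for i in order], order

def unscramble(scrambled, order):
    out = [None] * len(scrambled)
    for pos, i in enumerate(order):
        out[i] = scrambled[pos]
    return out

pixels = list(range(16))                 # toy "image" row
scrambled, order = scramble(pixels)
restored = unscramble(scrambled, order)
```

Only a holder of the key (x0, r) can regenerate the permutation, which is what breaks the linearity that makes plain DRPE attackable.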

  15. An efficient hybrid reliability analysis method with random and interval variables

    NASA Astrophysics Data System (ADS)

    Xie, Shaojun; Pan, Baisong; Du, Xiaoping

    2016-09-01

    Random and interval variables often coexist. Interval variables make reliability analysis much more computationally intensive. This work develops a new hybrid reliability analysis method so that the probability analysis (PA) loop and interval analysis (IA) loop are decomposed into two separate loops. An efficient PA algorithm is employed, and a new efficient IA method is developed. The new IA method consists of two stages. The first stage is for monotonic limit-state functions. If the limit-state function is not monotonic, the second stage is triggered. In the second stage, the limit-state function is sequentially approximated with a second order form, and the gradient projection method is applied to solve the extreme responses of the limit-state function with respect to the interval variables. The efficiency and accuracy of the proposed method are demonstrated by three examples.
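The double-loop structure described above can be sketched as follows, with a Monte Carlo probability-analysis loop outside and a crude grid search standing in for the paper's two-stage interval analysis; the limit-state function and distributions are hypothetical:

```python
import random

def worst_case_pf(g, sample_x, y_lo, y_hi, n, rng, grid=11):
    # outer loop: Monte Carlo over the random variable X
    # inner loop: worst-case (minimum) limit state over the interval
    # variable Y, found here by a coarse grid search rather than the
    # paper's monotonicity check / gradient projection scheme
    ys = [y_lo + (y_hi - y_lo) * k / (grid - 1) for k in range(grid)]
    fails = 0
    for _ in range(n):
        x = sample_x(rng)
        if min(g(x, y) for y in ys) < 0.0:
            fails += 1
    return fails / n

g = lambda x, y: 3.0 - x - y              # failure when g(x, y) < 0
rng = random.Random(4)
pf = worst_case_pf(g, lambda r: r.gauss(0.0, 1.0), 0.0, 0.5, 20000, rng)
```

For this monotonic limit state the worst case always sits at the interval endpoint y = 0.5, so the estimate approaches P(X > 2.5) ≈ 0.006; the point of the paper's first stage is to exploit exactly this monotonicity and skip the inner search.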

  16. Determination of Slope Safety Factor with Analytical Solution and Searching Critical Slip Surface with Genetic-Traversal Random Method

    PubMed Central

    2014-01-01

    In current practice, to determine the safety factor of a slope with a two-dimensional circular potential failure surface, one of the search methods for the critical slip surface is the Genetic Algorithm (GA), while the slope safety factor is usually calculated with Fellenius' slices method. However, GA needs to be validated with more numerical tests, and Fellenius' slices method, like the finite element method, is only approximate. This paper proposes a new way to determine the minimum slope safety factor: the safety factor is computed from an analytical solution, and the critical slip surface is searched with the Genetic-Traversal Random Method. The analytical solution is more accurate than Fellenius' slices method. The Genetic-Traversal Random Method uses random picking to realize mutation. A computer program automating the search was developed for the Genetic-Traversal Random Method. Comparison with other methods, such as the Slope/W software, indicates that the Genetic-Traversal Random Search Method can give a very low safety factor, about half that of the other methods. However, the obtained minimum safety factor is very close to the lower-bound solutions of the slope safety factor given by the Ansys software. PMID:24782679

  17. Random fields generation on the GPU with the spectral turning bands method

    NASA Astrophysics Data System (ADS)

    Hunger, L.; Cosenza, B.; Kimeswenger, S.; Fahringer, T.

    2014-08-01

    Random field (RF) generation algorithms are of paramount importance for many scientific domains, such as astrophysics, geostatistics, computer graphics, and many others. Some examples are the generation of initial conditions for cosmological simulations or hydrodynamical turbulence driving. In the latter, a new random field is needed at every time step. Current approaches commonly make use of 3D FFT (Fast Fourier Transform) and require the whole generated field to be stored in memory. Moreover, they are limited to regular rectilinear meshes and need an extra processing step to support non-regular meshes. In this paper, we introduce TBARF (Turning BAnd Random Fields), a RF generation algorithm based on the turning band method that is optimized for massively parallel hardware such as GPUs. Our algorithm replaces the 3D FFT with a lower order, one-dimensional FFT followed by a projection step, and is further optimized with loop unrolling and blocking. We show that TBARF can easily generate RFs on non-regular (non-uniform) meshes and can handle mesh sizes larger than the available GPU memory by using a streaming, out-of-core approach. TBARF is 2 to 5 times faster than the traditional methods when generating RFs with more than 16M cells. It can also generate RFs on non-regular meshes, and has been successfully applied to two real case scenarios: planetary nebulae and cosmological simulations.

  18. An FRF bounding method for randomly uncertain structures with or without coupling to an acoustic cavity

    NASA Astrophysics Data System (ADS)

    Dunne, L. W.; Dunne, J. F.

    2009-04-01

    An efficient frequency response function (FRF) bounding method is proposed using asymptotic extreme-value theory. The method exploits a small random sample of realised FRFs obtained from nominally identical structures to predict corresponding FRF bounds for a substantially larger batch. This is useful for predicting forced-vibration levels in automotive vehicle bodies when parameters are assumed to vary statistically. Small samples are assumed to come either from Monte Carlo simulation using a vibration model, or from measurements on real structures. The basis of the method is to undertake a hypothesis test and, if justified, repeatedly fit inverted Type I asymptotic threshold exceedance models at discrete frequencies, for which the models are not locked to a block size (as in classical extreme-value models). The chosen FRF 'bound' is predicted from the inverse model in the form of the 'm-observational return level', namely the level that will be exceeded on average once in every m structures realised. The method is tested on simulated linear structures, initially to establish its scope and limitations. Initial testing is performed on an sdof system, followed by small and medium-sized uncoupled mdof grillages. Testing then continues to: (i) a random acoustically coupled grillage structure; and (ii) a partially random industrial-scale box structure which exhibits similar dynamic characteristics to a small vehicle structure and is analysed in NASTRAN. In both cases, structural and acoustic responses to a single deterministic load are examined. The paper shows that the method is not suitable for very small uncoupled systems but rapidly becomes very appropriate for both uncoupled and coupled mdof structures.
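The return-level idea can be sketched with a plain Type I (Gumbel) fit by the method of moments; the sample of peak values is synthetic, and the fit is a simplification of the paper's inverted threshold-exceedance models:

```python
import math
import random

def gumbel_return_level(sample, m):
    # method-of-moments Gumbel (Type I) fit, then the level exceeded on
    # average once in every m realisations: solve F(z) = 1 - 1/m
    n = len(sample)
    mean = sum(sample) / n
    var = sum((v - mean) ** 2 for v in sample) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi
    mu = mean - 0.5772 * beta                 # Euler-Mascheroni constant
    return mu - beta * math.log(-math.log(1.0 - 1.0 / m))

rng = random.Random(5)
# stand-in for peak FRF magnitudes from a small batch of nominally
# identical structures (the paper's realised FRFs)
peaks = [10.0 + rng.gauss(0.0, 1.0) + abs(rng.gauss(0.0, 2.0)) for _ in range(30)]
bound_100 = gumbel_return_level(peaks, 100)
```

The m-observational return level grows with m, so a bound chosen for a batch of 1000 structures sits above the one chosen for 100, which is how the small sample is extrapolated to the larger batch.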

  19. A New Markov Random Field Segmentation Method for Breast Lesion Segmentation in MR images.

    PubMed

    Azmi, Reza; Norozi, Narges

    2011-07-01

    Breast cancer is a major public health problem for women in Iran and many other parts of the world. Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) plays a pivotal role in breast cancer care, including detection, diagnosis, and treatment monitoring. But segmentation of these images, which are seriously affected by intensity inhomogeneities created by radio-frequency coils, is a challenging task. Markov Random Field (MRF) is widely used in medical image segmentation, especially for MR images, because it can model the intensity inhomogeneities occurring in them. But the method has two critical weaknesses: computational complexity and sensitivity of the results to the model parameters. To overcome these problems, in this paper we present the Improved Markov Random Field (I-MRF) method for breast lesion segmentation in MR images. Unlike conventional MRF, the proposed approach does not use the Iterative Conditional Mode (ICM) method or Simulated Annealing (SA) to estimate the class membership of each pixel (lesion and non-lesion). The prior distribution of the class membership is modeled as a ratio of two conditional probability distributions in a neighborhood defined for each pixel: that of similar pixels and that of non-similar ones. Since the proposed approach does not use an iterative method to maximize the posterior probability, the above-mentioned problems are avoided. Experimental results show that this approach outperforms conventional MRF in terms of accuracy, precision, and computational cost.

  20. Hyperspectral image clustering method based on artificial bee colony algorithm and Markov random fields

    NASA Astrophysics Data System (ADS)

    Sun, Xu; Yang, Lina; Gao, Lianru; Zhang, Bing; Li, Shanshan; Li, Jun

    2015-01-01

    Center-oriented hyperspectral image clustering methods have been widely applied to hyperspectral remote sensing image processing; however, the drawbacks are obvious, including over-simple computing models and underutilized spatial information. In recent years, some studies have tried to improve this situation. We introduce the artificial bee colony (ABC) and Markov random field (MRF) algorithms to propose an ABC-MRF-cluster model to solve the problems mentioned above. In this model, a typical ABC algorithm framework is adopted in which cluster centers and the results of an iterated conditional modes algorithm are treated as the feasible solutions and objective functions, respectively, and MRF is modified to be capable of dealing with the clustering problem. Finally, four datasets and two indices are used to show that the ABC-cluster and ABC-MRF-cluster methods can obtain better image accuracy than conventional methods. Specifically, the ABC-cluster method is superior in terms of spectral discrimination power, whereas the ABC-MRF-cluster method provides better results in terms of the adjusted Rand index. In experiments on simulated images with different signal-to-noise ratios, ABC-cluster and ABC-MRF-cluster showed good stability.

  1. Numerical solution of the problem of flame propagation by the use of the random element method

    NASA Technical Reports Server (NTRS)

    Ghoniem, A. F.; Oppenheim, A. K.

    1983-01-01

    A numerical, grid-free algorithm is presented for a one-dimensional reaction-diffusion model of laminar flame propagation in premixed gases. It is based on the random element method we developed for the analysis of diffusional processes. The effect of combustion is taken into account by applying the principle of fractional steps to separate the process of diffusion, modeled by the random walk of computational elements, from the exothermic effects of chemical reaction, which govern their strength. The validity of the algorithm is demonstrated by application to flame propagation problems for which exact solutions exist. The flame speed evaluated by its use oscillates around the exact value with a relatively small amplitude, while the temperature and species concentration profiles are self-correcting in their convergence to the exact solution. A satisfactory resolution is obtained with quite a small number of computational elements, which automatically adjust their distribution to fit sharp gradients.
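The diffusion fractional step, modeled as a random walk of computational elements, can be sketched as follows. The check that the sample variance grows as 2Dt is a standard property of the random-walk model of diffusion, not the paper's flame test case; element count and step sizes are illustrative:

```python
import random

def diffusion_step(positions, dt, D, rng):
    # diffusion fractional step: each computational element takes a Gaussian
    # random-walk step with variance 2*D*dt
    sigma = (2.0 * D * dt) ** 0.5
    return [x + rng.gauss(0.0, sigma) for x in positions]

rng = random.Random(6)
pos = [0.0] * 2000                  # grid-free: elements, not mesh points
for _ in range(50):
    pos = diffusion_step(pos, dt=0.01, D=1.0, rng=rng)
    # in the full scheme, a reaction fractional step (the exothermic
    # chemical source term) would be applied here, alternating with diffusion
var = sum(x * x for x in pos) / len(pos)    # should approach 2*D*t = 1.0
```

After 50 steps (t = 0.5, D = 1) the spread of the elements matches the analytical diffusion variance 2Dt = 1.0 to within sampling error, which is the sense in which the random walk "is" the diffusion operator.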

  2. A comparative analysis of recruitment methods used in a randomized trial of diabetes education interventions.

    PubMed

    Beaton, Sarah J; Sperl-Hillen, JoAnn M; Worley, Ann Von; Fernandes, Omar D; Baumer, Dorothy; Hanson, Ann M; Parker, Emily D; Busch, Maureen E; Davis, Herbert T; Spain, C Victor

    2010-11-01

    Recruitment methods heavily impact budget and outcomes in clinical trials. We conducted a post-hoc examination of the efficiency and cost of three different recruitment methods used in Journey for Control of Diabetes: the IDEA Study, a randomized controlled trial evaluating outcomes of group and individual diabetes education in New Mexico and Minnesota. Electronic databases were used to identify health plan members with diabetes and then one of the following three methods was used to recruit study participants: 1. Minnesota Method 1--Mail only (first half of recruitment period). Mailed invitations with return-response forms. 2. Minnesota Method 2--Mail and selective phone calls (second half of recruitment period). Mailed invitations with return-response forms and subsequent phone calls to nonresponders. 3. New Mexico Method 3--Mail and non-selective phone calls (full recruitment period): Mailed invitations with subsequent phone calls to all. The combined methods succeeded in meeting the recruitment goal of 623 subjects. There were 147 subjects recruited using Minnesota's Method 1, 190 using Minnesota's Method 2, and 286 using New Mexico's Method 3. Efficiency rates (percentage of invited patients who enrolled) were 4.2% for Method 1, 8.4% for Method 2, and 7.9% for Method 3. Calculated costs per enrolled subject were $71.58 (Method 1), $85.47 (Method 2), and $92.09 (Method 3). A mail-only method to assess study interest was relatively inexpensive but not efficient enough to sustain recruitment targets. Phone call follow-up after mailed invitations added to recruitment efficiency. Use of return-response forms with selective phone follow-up to non-responders was cost effective.

  3. Rapid DNA extraction methods and new primers for randomly amplified polymorphic DNA analysis of Giardia duodenalis.

    PubMed

    Deng, M Q; Cliver, D O

    1999-08-01

    A randomly amplified polymorphic DNA (RAPD) procedure using simple genomic DNA preparation methods and newly designed primers was optimized for analyzing Giardia duodenalis strains. Genomic DNA was extracted from in vitro cultivated trophozoites by five freezing-thawing cycles or by sonic treatment. Compared to a conventional method involving proteinase K digestion and phenol extraction, both freezing-thawing and sonication were equally efficient, yet with the advantage of being much less time- and labor-intensive. Five of the 10 tested RAPD primers produced reproducible polymorphisms among five human origin G. duodenalis strains, and grouping of these strains based on RAPD profiles was in agreement among these primers. The consistent classification of two standard laboratory reference strains, Portland-1 and WB, in the same group confirmed previous results using other fingerprinting methods, indicating that the reported simple DNA extraction methods and the selected primers are useful in RAPD for molecular characterization of G. duodenalis strains.

  4. Efficient and robust method for comparing the immunogenicity of candidate vaccines in randomized clinical trials.

    PubMed

    Gilbert, Peter B; Sato, Alicia; Sun, Xiao; Mehrotra, Devan V

    2009-01-14

    In randomized clinical trials designed to compare the magnitude of vaccine-induced immune responses between vaccination regimens, the statistical method used for the analysis typically does not account for baseline participant characteristics. This article shows that incorporating baseline variables predictive of the immunogenicity study endpoint can provide large gains in precision and power for estimation and testing of the group mean difference (requiring fewer subjects for the same scientific output) compared to conventional methods, and recommends the "semiparametric efficient" method described in Tsiatis et al. [Tsiatis AA, Davidian M, Zhang M, Lu X. Covariate adjustment for two-sample treatment comparisons in randomized clinical trials: a principled yet flexible approach. Stat Med 2007. doi:10.1002/sim.3113] for practical use. As such, vaccine clinical trial programs can be improved (1) by investigating baseline predictors (e.g., readouts from laboratory assays) of vaccine-induced immune responses, and (2) by implementing the proposed semiparametric efficient method in trials where baseline predictors are available. PMID:19022314
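The gain from baseline adjustment can be illustrated with a simple ANCOVA-style estimator; this is a stand-in for the semiparametric efficient method recommended in the abstract, applied to synthetic data with a known group difference of 2:

```python
import random

def adjusted_difference(x0, y0, x1, y1):
    # ANCOVA-style adjustment: regress the endpoint on the baseline covariate
    # (pooled), subtract the predicted part, then difference the arm means
    xs, ys = x0 + x1, y0 + y1
    mx = sum(xs) / len(xs)
    b = (sum((x - mx) * y for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    mean = lambda v: sum(v) / len(v)
    adj0 = [y - b * (x - mx) for x, y in zip(x0, y0)]
    adj1 = [y - b * (x - mx) for x, y in zip(x1, y1)]
    return mean(adj1) - mean(adj0)

rng = random.Random(7)
x0 = [rng.gauss(0.0, 1.0) for _ in range(500)]      # baseline readout, control arm
x1 = [rng.gauss(0.0, 1.0) for _ in range(500)]      # baseline readout, vaccine arm
y0 = [3.0 * x + rng.gauss(0.0, 1.0) for x in x0]    # endpoint; true difference = 2
y1 = [2.0 + 3.0 * x + rng.gauss(0.0, 1.0) for x in x1]
diff_raw = sum(y1) / len(y1) - sum(y0) / len(y0)
diff_adj = adjusted_difference(x0, y0, x1, y1)
```

Because the baseline strongly predicts the endpoint here, the adjusted estimate removes the baseline-driven noise and has a much smaller standard error than the raw difference of means, which is the precision gain the abstract describes.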

  5. Evaluation of Strip Footing Bearing Capacity Built on the Anthropogenic Embankment by Random Finite Element Method

    NASA Astrophysics Data System (ADS)

    Pieczynska-Kozlowska, Joanna

    2014-05-01

    One of the geotechnical problems in the area of Wroclaw is an anthropogenic embankment layer reaching a depth of 4-5 m, which arose as a result of historical events. In such conditions, estimating the bearing capacity of a strip footing may be difficult. The standard solution is to use a deep foundation or to replace the foundation soil; however, both methods generate significant costs. In the present paper the authors focus on the influence of the variability of the anthropogenic embankment on bearing capacity. Soil parameters were defined on the basis of CPT tests and modeled as 2D anisotropic random fields, and the bearing capacity was evaluated with a deterministic finite element method. Repeating the analysis over many different realizations of the random fields leads to a stable expected value of the bearing capacity. The algorithm used to estimate the bearing capacity of the strip footing was the random finite element method (e.g. [1]). In the traditional approach, the bearing capacity formula proposed by [2] is used: qf = c'Nc + qNq + 0.5γBNγ (1) where qf is the ultimate bearing stress, c' is the cohesion, q is the overburden load due to foundation embedment, γ is the soil unit weight, B is the footing width, and Nc, Nq, and Nγ are the bearing capacity factors. The evaluation of the bearing capacity of the strip footing based on the finite element method incorporates five parameters: Young's modulus (E), Poisson's ratio (ν), dilation angle (ψ), cohesion (c), and friction angle (φ). In the present study E, ν, and ψ are held constant while c and φ are randomized. Although the Young's modulus does not affect the bearing capacity, it governs the initial elastic response of the soil. Plastic stress redistribution is accomplished using a viscoplastic algorithm merged with an elastic perfectly plastic (Mohr-Coulomb) failure criterion. In this paper a typical finite element mesh was assumed, with 8-node elements arranged in 50 columns and 20 rows. Footings width B
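Equation (1) can be evaluated directly once the bearing capacity factors are chosen. The sketch below uses the classical closed forms for Nq and Nc and Vesic's expression for Nγ (one of several published options, assumed here); the input values are illustrative, not the Wroclaw site data:

```python
import math

def bearing_capacity_factors(phi_deg):
    # closed-form factors: Reissner's Nq, Prandtl's Nc; for Ngamma several
    # published expressions exist, and Vesic's form is assumed here
    phi = math.radians(phi_deg)
    Nq = math.exp(math.pi * math.tan(phi)) * math.tan(math.pi / 4.0 + phi / 2.0) ** 2
    Nc = (Nq - 1.0) / math.tan(phi)
    Ng = 2.0 * (Nq + 1.0) * math.tan(phi)
    return Nc, Nq, Ng

def ultimate_bearing(c, q, gamma, B, phi_deg):
    # eq. (1): qf = c'Nc + qNq + 0.5*gamma*B*Ngamma
    Nc, Nq, Ng = bearing_capacity_factors(phi_deg)
    return c * Nc + q * Nq + 0.5 * gamma * B * Ng

Nc, Nq, Ng = bearing_capacity_factors(30.0)
qf = ultimate_bearing(c=10.0, q=18.0, gamma=18.0, B=1.0, phi_deg=30.0)  # kPa, illustrative
```

In a random finite element setting, c and φ would be drawn from their random fields and this evaluation (or its FEM counterpart) repeated over many realizations to build up the distribution of qf.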

  6. A simple combinatorial method to describe particle retention time in random media with applications in chromatography

    NASA Astrophysics Data System (ADS)

    da Silva, Roberto; Lamb, Luis C.; Lima, Eder C.; Dupont, Jairton

    2012-01-01

    We propose a foundational model to explain properties of the retention time distribution of particle transport in a random medium. These particles are captured and released by distributed theoretical plates in a random medium, as in standard chromatography. Our approach differs from current models, since it is not based on simple random walks but on a directed and coordinated movement of the particles, whose retention time dispersion in the column is due to the imprisonment time the particles spend in the theoretical plates. Given a pair of fundamental parameters (λc, λe), the capture and release probabilities, we use simple combinatorial methods to predict the probability distribution of the retention times. We have analyzed several distributions typically used in chromatographic peak fits. We show that a log-normal distribution with only two parameters describes with high accuracy the chromatographic distributions typically used in experiments. This distribution shows a better fit than distributions with a larger number of parameters, possibly allowing for better control of experimental data.
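The capture-and-release picture with parameters (λc, λe) can be simulated directly; the plate count, probabilities, and discrete-tick timing below are illustrative assumptions, not the paper's combinatorial derivation:

```python
import random

def retention_time(plates, p_capture, p_release, rng):
    # a particle crosses `plates` theoretical plates; at each plate it is
    # captured with probability p_capture and then waits a geometric number
    # of time ticks (release probability p_release per tick)
    t = 0
    for _ in range(plates):
        t += 1                                    # tick to traverse the plate
        if rng.random() < p_capture:
            t += 1                                # first tick of imprisonment
            while rng.random() >= p_release:
                t += 1
    return t

rng = random.Random(8)
times = [retention_time(100, 0.3, 0.5, rng) for _ in range(2000)]
mean_t = sum(times) / len(times)    # expected 100 * (1 + 0.3 / 0.5) = 160
```

The resulting histogram is right-skewed, as in chromatographic peaks; the paper's claim is that its shape is well captured by a two-parameter log-normal fit.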

  7. Implementation of the finite amplitude method for the relativistic quasiparticle random-phase approximation

    NASA Astrophysics Data System (ADS)

    Nikšić, T.; Kralj, N.; Tutiš, T.; Vretenar, D.; Ring, P.

    2013-10-01

    A new implementation of the finite amplitude method (FAM) for the solution of the relativistic quasiparticle random-phase approximation (RQRPA) is presented, based on the relativistic Hartree-Bogoliubov (RHB) model for deformed nuclei. The numerical accuracy and stability of the FAM-RQRPA is tested in a calculation of the monopole response of 22O. As an illustrative example, the model is applied to a study of the evolution of monopole strength in the chain of Sm isotopes, including the splitting of the giant monopole resonance in axially deformed systems.

  8. Recommended Minimum Test Requirements and Test Methods for Assessing Durability of Random-Glass-Fiber Composites

    SciTech Connect

    Battiste, R.L.; Corum, J.M.; Ren, W.; Ruggles, M.B.

    1999-06-01

    This report provides recommended minimum test requirements and suggested test methods for establishing the durability properties and characteristics of candidate random-glass-fiber polymeric composites for automotive structural applications. The recommendations and suggestions are based on experience and results developed at Oak Ridge National Laboratory (ORNL) under a US Department of Energy Advanced Automotive Materials project entitled ''Durability of Lightweight Composite Structures,'' which is closely coordinated with the Automotive Composites Consortium. The report is intended as an aid to suppliers offering new structural composites for automotive applications and to testing organizations that are called on to characterize the composites.

  9. New high resolution Random Telegraph Noise (RTN) characterization method for resistive RAM

    NASA Astrophysics Data System (ADS)

    Maestro, M.; Diaz, J.; Crespo-Yepes, A.; Gonzalez, M. B.; Martin-Martinez, J.; Rodriguez, R.; Nafria, M.; Campabadal, F.; Aymerich, X.

    2016-01-01

    Random Telegraph Noise (RTN) is one of the main reliability problems of resistive switching-based memories. To understand the physics behind RTN, a complete and accurate RTN characterization is required. The standard equipment used to analyse RTN has a typical time resolution of ∼2 ms, which prevents evaluating fast phenomena. In this work, a new RTN measurement procedure, which increases the measurement time resolution to 2 μs, is proposed. The experimental set-up, together with the recently proposed Weighted Time Lag (W-LT) method for the analysis of RTN signals, allows obtaining more detailed and precise information about the RTN phenomenon.
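A two-level RTN trace and its time-lag representation, the raw material for the W-LT analysis mentioned above, can be sketched as follows; the transition probabilities and noise level are arbitrary illustration values, not measured device parameters:

```python
import random

def simulate_rtn(n, p_capture, p_emit, levels=(0.0, 1.0), noise=0.05, rng=None):
    # two-level RTN: a single trap toggles between states with per-sample
    # capture/emission probabilities; Gaussian read noise is added on top
    state, out = 0, []
    for _ in range(n):
        if rng.random() < (p_capture if state == 0 else p_emit):
            state = 1 - state
        out.append(levels[state] + rng.gauss(0.0, noise))
    return out

sig = simulate_rtn(5000, 0.02, 0.05, rng=random.Random(9))
# time-lag data: points (x_i, x_{i+1}); RTN concentrates on the diagonal,
# with off-diagonal points marking transitions (what the W-LT method weights)
lag_pairs = list(zip(sig, sig[1:]))
on_diagonal = sum(abs(a - b) < 0.5 for a, b in lag_pairs) / len(lag_pairs)
```

Most consecutive samples sit in the same state, so the time-lag cloud clusters at the two diagonal corners; finer time resolution, as in the paper's 2 μs set-up, resolves transitions that slower sampling would average away.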

  10. Statistical Evaluation and Improvement of Methods for Combining Random and Harmonic Loads

    NASA Technical Reports Server (NTRS)

    Brown, A. M.; McGhee, D. S.

    2003-01-01

    Structures in many environments experience both random and harmonic excitation. A variety of closed-form techniques has been used in the aerospace industry to combine the loads resulting from the two sources. The resulting combined loads are then used to design for both yield/ultimate strength and high-cycle fatigue capability. This Technical Publication examines the cumulative distribution percentiles obtained using each method by integrating the joint probability density function of the sine and random components. A new Microsoft Excel spreadsheet macro that links with the software program Mathematica to calculate the combined value corresponding to any desired percentile is then presented, along with a curve fit to this value. Another Excel macro that calculates the combination using Monte Carlo simulation is shown. Unlike the traditional techniques, these methods quantify the calculated load value with a consistent percentile. Using either of the presented methods can be extremely valuable in probabilistic design, which requires a statistical characterization of the loading. Additionally, since the CDF at high probability levels is very flat, the design value is extremely sensitive to the predetermined percentile; therefore, applying the new techniques can substantially lower the design loading without losing any of the identified structural reliability.
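The Monte Carlo combination idea can be sketched directly: sample the sine phase uniformly and the random part from a Gaussian, then read the desired percentile off the sorted samples. The amplitudes are illustrative; consistent with the abstract's observation, the percentile-consistent value comes out below the naive "sine peak plus 2σ" sum:

```python
import math
import random

def combined_percentile(A, sigma, percentile, n, rng):
    # Monte Carlo combination: uniformly random sine phase plus a Gaussian
    # random part; read the desired percentile off the sorted samples
    vals = sorted(A * math.sin(rng.uniform(0.0, 2.0 * math.pi)) + rng.gauss(0.0, sigma)
                  for _ in range(n))
    return vals[int(percentile * n)]

rng = random.Random(10)
# 2-sigma-style percentile of a unit-amplitude sine combined with N(0, 0.5) noise
p977 = combined_percentile(A=1.0, sigma=0.5, percentile=0.9772, n=100000, rng=rng)
```

Adding the sine peak to 2σ of the random part would give 2.0, but the sine rarely sits at its peak when the random part is large, so the percentile-consistent design value is lower.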

  11. Statistical Comparison and Improvement of Methods for Combining Random and Harmonic Loads

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; McGhee, David S.

    2004-01-01

    Structures in many environments experience both random and harmonic excitation. A variety of closed-form techniques has been used in the aerospace industry to combine the loads resulting from the two sources. The resulting combined loads are then used to design for both yield/ultimate strength and high-cycle fatigue capability. This paper examines the cumulative distribution function (CDF) percentiles obtained using each method by integrating the joint probability density function of the sine and random components. A new Microsoft Excel spreadsheet macro that links with the software program Mathematica is then used to calculate the combined value corresponding to any desired percentile, along with a curve fit to this value. Another Excel macro is used to calculate the combination using a Monte Carlo simulation. Unlike the traditional techniques, these methods quantify the calculated load value with a consistent percentile. Using either of the presented methods can be extremely valuable in probabilistic design, which requires a statistical characterization of the loading. Also, since the CDF at high probability levels is very flat, the design value is extremely sensitive to the predetermined percentile; therefore, applying the new techniques can lower the design loading substantially without losing any of the identified structural reliability.

  12. A Novel Hepatocellular Carcinoma Image Classification Method Based on Voting Ranking Random Forests.

    PubMed

    Xia, Bingbing; Jiang, Huiyan; Liu, Huiling; Yi, Dehui

    2015-01-01

    This paper proposed a novel voting ranking random forests (VRRF) method for solving the hepatocellular carcinoma (HCC) image classification problem. Firstly, in the preprocessing stage, bilateral filtering was applied to the hematoxylin-eosin (HE) pathological images. Next, the filtered image was segmented into three different kinds of images: the single binary cell image, the single minimum exterior rectangle cell image, and the single cell image of size n⁎n. After that, atypia features were defined, including auxiliary circularity, amendment circularity, and cell symmetry. In addition, shape features, fractal dimension features, and several gray features were extracted, such as the Local Binary Patterns (LBP) feature, the Gray Level Co-occurrence Matrix (GLCM) feature, and Tamura features. Finally, an HCC image classification model based on random forests was proposed and further optimized by the voting ranking method. The experimental results showed that the proposed features combined with the VRRF method perform well on the HCC image classification problem.

  13. A Novel Hepatocellular Carcinoma Image Classification Method Based on Voting Ranking Random Forests

    PubMed Central

    Xia, Bingbing; Jiang, Huiyan; Liu, Huiling; Yi, Dehui

    2016-01-01

    This paper proposed a novel voting ranking random forests (VRRF) method for solving the hepatocellular carcinoma (HCC) image classification problem. First, in the preprocessing stage, bilateral filtering was applied to the hematoxylin-eosin (HE) pathological images. Next, the filtered image was segmented to obtain three different kinds of images: the single binary cell image, the single minimum exterior rectangle cell image, and the single cell image of size n⁎n. After that, atypia features were defined, including auxiliary circularity, amendment circularity, and cell symmetry. In addition, shape features, fractal dimension features, and several gray features were extracted, such as the Local Binary Patterns (LBP) feature, the Gray Level Co-occurrence Matrix (GLCM) feature, and Tamura features. Finally, an HCC image classification model based on random forests was proposed and further optimized by the voting ranking method. The experimental results showed that the proposed features combined with the VRRF method perform well on the HCC image classification problem. PMID:27293477

  14. Is a vegetarian diet adequate for children?

    PubMed

    Hackett, A; Nathan, I; Burgess, L

    1998-01-01

    The number of people who avoid eating meat is growing, especially among young people. Benefits to health from a vegetarian diet have been reported in adults, but it is not clear to what extent these benefits are due to diet or to other aspects of lifestyle. In children, concern has been expressed about the adequacy of vegetarian diets, especially with regard to growth. The risks/benefits seem to be related to the degree of restriction of the diet; anaemia is probably both the main and the most serious risk, but this also applies to omnivores. Vegan diets are more likely to be associated with malnutrition, especially if the diets are the result of authoritarian dogma. Overall, lacto-ovo-vegetarian children consume diets closer to recommendations than omnivores, and their pre-pubertal growth is at least as good. The simplest strategy when becoming vegetarian may involve reliance on vegetarian convenience foods, which are not necessarily superior in nutritional composition. The vegetarian sector of the food industry could do more to produce foods closer to recommendations. Vegetarian diets can be, but are not necessarily, adequate for children, provided vigilance is maintained, particularly to ensure variety. Identical comments apply to omnivorous diets. Three threats to the diets of children are too much reliance on convenience foods, lack of variety and lack of exercise.

  15. Methods of learning in statistical education: Design and analysis of a randomized trial

    NASA Astrophysics Data System (ADS)

    Boyd, Felicity Turner

    Background. Recent psychological and technological advances suggest that active learning may enhance understanding and retention of statistical principles. A randomized trial was designed to evaluate the addition of innovative instructional methods within didactic biostatistics courses for public health professionals. Aims. The primary objectives were to evaluate and compare the addition of two active learning methods (cooperative and internet) on students' performance; assess their impact on performance after adjusting for differences in students' learning style; and examine the influence of learning style on trial participation. Methods. Consenting students enrolled in a graduate introductory biostatistics course were randomized to cooperative learning, internet learning, or control after completing a pretest survey. The cooperative learning group participated in eight small group active learning sessions on key statistical concepts, while the internet learning group accessed interactive mini-applications on the same concepts. Controls received no intervention. Students completed evaluations after each session and a post-test survey. Study outcome was performance quantified by examination scores. Intervention effects were analyzed by generalized linear models using intent-to-treat analysis and marginal structural models accounting for reported participation. Results. Of 376 enrolled students, 265 (70%) consented to randomization; 69, 100, and 96 students were randomized to the cooperative, internet, and control groups, respectively. Intent-to-treat analysis showed no differences between study groups; however, 51% of students in the intervention groups had dropped out after the second session. After accounting for reported participation, expected examination scores were 2.6 points higher (of 100 points) after completing one cooperative learning session (95% CI: 0.3, 4.9) and 2.4 points higher after one internet learning session (95% CI: 0.0, 4.7), versus

  16. Color computer-generated hologram generation using the random phase-free method and color space conversion.

    PubMed

    Shimobaba, Tomoyoshi; Makowski, Michał; Nagahama, Yuki; Endo, Yutaka; Hirayama, Ryuji; Hiyama, Daisuke; Hasegawa, Satoki; Sano, Marie; Kakue, Takashi; Oikawa, Minoru; Sugie, Takashige; Takada, Naoki; Ito, Tomoyoshi

    2016-05-20

    We propose two calculation methods for generating color computer-generated holograms (CGHs) with the random phase-free method and color space conversion, in order to improve the image quality and accelerate the calculation. The random phase-free method improves the image quality of monochrome CGH, but it has not been applied to color CGH. We first aimed to improve the image quality of color CGH using the random phase-free method, and then to accelerate color CGH generation by combining the random phase-free method with a color space conversion method, which speeds up the calculation through down-sampling of the color components obtained from the conversion. By overcoming the image quality degradation that otherwise occurs when random phases are down-sampled, this combination both improves the quality of the reconstructed images and accelerates the color CGH calculation. We demonstrated the effectiveness of the proposed method in simulation, and in this paper discuss its application to lensless zoomable holographic projection.
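
    The chroma down-sampling that drives the acceleration can be illustrated independently of holography. The sketch below uses hypothetical image data and the standard BT.601 conversion matrix: it converts RGB to YCbCr, halves the chroma resolution, restores it by nearest-neighbour up-sampling, and measures the error; the luminance channel is left untouched.

```python
import numpy as np

rng = np.random.default_rng(1)

# A hypothetical 64x64 RGB target image (values in [0, 1]).
rgb = rng.random((64, 64, 3))

# ITU-R BT.601 full-range RGB -> YCbCr conversion matrix.
M = np.array([[ 0.299,     0.587,     0.114    ],
              [-0.168736, -0.331264,  0.5      ],
              [ 0.5,      -0.418688, -0.081312]])

ycbcr = rgb @ M.T

# Down-sample the chroma channels (Cb, Cr) by 2 in each axis, as the
# CGH acceleration scheme does, then restore them by nearest-neighbour
# up-sampling before converting back to RGB.
chroma_small = ycbcr[::2, ::2, 1:]
chroma_up = np.repeat(np.repeat(chroma_small, 2, axis=0), 2, axis=1)
ycbcr_ds = np.concatenate([ycbcr[..., :1], chroma_up], axis=2)

rgb_back = ycbcr_ds @ np.linalg.inv(M).T
err = np.abs(rgb_back - rgb).mean()
print(f"mean absolute error after chroma down-sampling: {err:.3f}")
```

    On natural images, whose chroma varies slowly, this error is far smaller than on the random test image used here, which is what makes the down-sampling acceptable in practice.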

  17. Color computer-generated hologram generation using the random phase-free method and color space conversion.

    PubMed

    Shimobaba, Tomoyoshi; Makowski, Michał; Nagahama, Yuki; Endo, Yutaka; Hirayama, Ryuji; Hiyama, Daisuke; Hasegawa, Satoki; Sano, Marie; Kakue, Takashi; Oikawa, Minoru; Sugie, Takashige; Takada, Naoki; Ito, Tomoyoshi

    2016-05-20

    We propose two calculation methods for generating color computer-generated holograms (CGHs) with the random phase-free method and color space conversion, in order to improve the image quality and accelerate the calculation. The random phase-free method improves the image quality of monochrome CGH, but it has not been applied to color CGH. We first aimed to improve the image quality of color CGH using the random phase-free method, and then to accelerate color CGH generation by combining the random phase-free method with a color space conversion method, which speeds up the calculation through down-sampling of the color components obtained from the conversion. By overcoming the image quality degradation that otherwise occurs when random phases are down-sampled, this combination both improves the quality of the reconstructed images and accelerates the color CGH calculation. We demonstrated the effectiveness of the proposed method in simulation, and in this paper discuss its application to lensless zoomable holographic projection. PMID:27411145

  18. Simulation of water splitting reaction in porous media using Random Walk particle tracking method

    NASA Astrophysics Data System (ADS)

    Rahmatian, Nima; Petrasch, Jörg; Mei, Renwei; Klausner, James

    2013-11-01

    Water splitting using an iron-based looping process is a well-known method of producing high-purity hydrogen. A stable porous structure is best suited for the reaction over many cycles because of its high surface area. The Random Walk method is used to simulate the reacting flow in the porous structure, owing to its ability to handle stiff reaction kinetics and a varying hydrodynamic dispersion tensor caused by pore-level velocity fluctuations. Because the bulk density varies significantly during the conversion of steam to hydrogen, the Random Walk formulation must be modified to account for bulk density variations and for the source term due to chemical reaction. The species transport equation is recast in the form of a Fokker-Planck equation, and the trajectories of fluid particles are obtained by solving an appropriate Langevin equation that has additional drift terms due to spatial variations in bulk density and dispersion tensor. The source term is accounted for by changing the number or the composition of fluid particles based on the reaction kinetics. The treatment of each new term is validated against a highly resolved finite difference solution. A bench-scale reactor for hydrogen production is simulated, and excellent agreement with the measured hydrogen production rate is obtained.
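
    The drift term arising from a spatially varying dispersion coefficient can be illustrated with a minimal 1-D walk. The dispersion profile, time step, and periodic domain below are invented for the sketch; it shows only the Ito drift correction D'(x) dt, which keeps the stationary particle density uniform even though D varies (without it, particles would artificially accumulate where D is small).

```python
import numpy as np

rng = np.random.default_rng(7)

# Random-walk particle tracking in 1-D with spatially varying
# dispersion D(x). Step: dx = D'(x) dt + sqrt(2 D(x) dt) * xi.
def D(x):
    return 0.5 + 0.4 * np.sin(x)

def dD(x):
    return 0.4 * np.cos(x)

n, dt, steps = 20_000, 1e-3, 2000
x = rng.uniform(0.0, 2.0 * np.pi, n)

for _ in range(steps):
    xi = rng.standard_normal(n)
    x += dD(x) * dt + np.sqrt(2.0 * D(x) * dt) * xi
    x %= 2.0 * np.pi          # periodic domain

# With the drift correction, the particle density on the periodic
# domain stays uniform despite the varying dispersion.
frac = np.histogram(x, bins=8, range=(0.0, 2.0 * np.pi))[0] / n
print(frac)
```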

  19. Compensation method for random drifts of laser beams based on moving average feedback control

    NASA Astrophysics Data System (ADS)

    Zhang, Lixia; Wang, Ruilin; Lin, Wumei; Liao, Zhijie

    2012-10-01

    In order to eliminate the measurement errors caused by the instability of laser beams, a real-time compensation algorithm for the random drifts of laser beams, based on a moving average (MA) correction mechanism, is presented. By establishing a correction model with two fast steering mirrors in the beam delivery path and analyzing the pulse-to-pulse beam fluctuation, real-time beam drift correction is implemented with closed-loop feedback control, focusing especially on reducing pulse-to-pulse drifts and ground-vibration fluctuations. The simulation results show that this algorithm can control beam drifts effectively. With the optimal MA, drifts can be reduced to roughly 3n^(-1/2) of their original level (n being the number of pulses in a window) in the absence of ground vibrations. The moving standard deviation (MSD) also shows a series of improvements, with a sudden decline at the window length. Meanwhile, drifts can still be restrained when ground vibrations are applied, without any large jumps, and the reduction is even greater than without ground vibration. The MSD drops while the whole system is controlled by this compensation method, and the results are stable. The key to this compensation method for random drifts of laser beams based on moving average feedback control lies in an appropriate correction formula. Moreover, the algorithm is practical and can achieve high-precision control of direction drifts.
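
    A toy version of MA feedback on pulse-to-pulse drift can be sketched as follows. All noise levels, the window size, and the loop gain are illustrative, not taken from the paper: a random-walk drift plus white jitter is measured each pulse, and a steering offset is updated by a small gain times the moving average of recent errors.

```python
import numpy as np

rng = np.random.default_rng(3)

# Pulse-to-pulse beam pointing: slow random-walk drift plus white jitter.
pulses, n_window, gain = 5000, 20, 0.05
drift = np.cumsum(rng.normal(0.0, 0.05, pulses))
jitter = rng.normal(0.0, 0.2, pulses)
uncorrected = drift + jitter

# The mirror offset is updated each pulse by a small gain times the
# moving average of the last n_window measured errors; the gain is kept
# small because the MA introduces delay into the loop.
corrected = np.empty(pulses)
offset = 0.0
history = []
for k in range(pulses):
    err = drift[k] + jitter[k] - offset
    corrected[k] = err
    history.append(err)
    if len(history) > n_window:
        history.pop(0)
    offset += gain * np.mean(history)

print(f"std uncorrected: {uncorrected.std():.2f}, corrected: {corrected.std():.2f}")
```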

  20. Identifying protein interaction subnetworks by a bagging Markov random field-based method.

    PubMed

    Chen, Li; Xuan, Jianhua; Riggins, Rebecca B; Wang, Yue; Clarke, Robert

    2013-01-01

    Identification of differentially expressed subnetworks from protein-protein interaction (PPI) networks has become increasingly important to our global understanding of the molecular mechanisms that drive cancer. Several methods have been proposed for PPI subnetwork identification, but the dependency among network member genes is not explicitly considered, leaving many important hub genes largely unidentified. We present a new method, based on a bagging Markov random field (BMRF) framework, to improve subnetwork identification for mechanistic studies of breast cancer. The method follows a maximum a posteriori principle to form a novel network score that explicitly considers pairwise gene interactions in PPI networks, and it searches for subnetworks with maximal network scores. To improve their robustness across data sets, a bagging scheme based on bootstrapping samples is implemented to statistically select high confidence subnetworks. We first compared the BMRF-based method with existing methods on simulation data to demonstrate its improved performance. We then applied our method to breast cancer data to identify PPI subnetworks associated with breast cancer progression and/or tamoxifen resistance. The experimental results show that not only an improved prediction performance can be achieved by the BMRF approach when tested on independent data sets, but biologically meaningful subnetworks can also be revealed that are relevant to breast cancer and tamoxifen resistance.
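
    The bootstrap selection step, independent of the MRF scoring itself, can be sketched as follows; the "gene scores", the selection threshold, and the 80% confidence cutoff are all hypothetical.

```python
import random

random.seed(5)

# Hypothetical noisy gene scores: genes 0-4 are truly "active" (high
# mean score), genes 5-19 are background. Bagging: on each bootstrap
# sample of the observations, re-score genes and record which pass a
# threshold; keep only genes selected in most replicates.
n_obs, n_genes = 60, 20
true_effect = [1.5 if g < 5 else 0.0 for g in range(n_genes)]
observations = [[true_effect[g] + random.gauss(0.0, 1.0) for g in range(n_genes)]
                for _ in range(n_obs)]

def selected_genes(sample):
    """Genes whose mean score in this sample exceeds the threshold."""
    means = [sum(row[g] for row in sample) / len(sample) for g in range(n_genes)]
    return {g for g in range(n_genes) if means[g] > 0.75}

n_boot = 100
counts = [0] * n_genes
for _ in range(n_boot):
    sample = [random.choice(observations) for _ in range(n_obs)]
    for g in selected_genes(sample):
        counts[g] += 1

high_confidence = [g for g in range(n_genes) if counts[g] / n_boot >= 0.8]
print("high-confidence genes:", high_confidence)
```

    The point of the scheme is robustness: genes that pass the threshold only by chance in one resampling rarely pass in 80% of them.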

  1. Finite-element method for calculation of the effective permittivity of random inhomogeneous media.

    PubMed

    Myroshnychenko, Viktor; Brosseau, Christian

    2005-01-01

    The challenge of designing new solid-state materials from calculations performed with the help of computers applied to models of spatial randomness has attracted an increasing amount of interest in recent years. In particular, dispersions of particles in a host matrix are scientifically and technologically important for a variety of reasons. Herein, we report our development of an efficient computer code to calculate the effective (bulk) permittivity of two-phase disordered composite media consisting of hard circular disks made of a lossless dielectric (permittivity epsilon2) randomly placed in a plane made of a lossless homogeneous dielectric (permittivity epsilon1) at different surface fractions. Specifically, the method is based on (i) a finite-element description of composites in which both the host and the randomly distributed inclusions are isotropic phases, and (ii) an ordinary Monte Carlo sampling. Periodic boundary conditions are employed throughout the simulation and various numbers of disks have been considered in the calculations. From this systematic study, we show how the number of Monte Carlo steps needed to achieve equilibrated distributions of disks increases monotonically with the surface fraction. Furthermore, a detailed study is made of the dependence of the results on a minimum separation distance between disks. Numerical examples are presented to connect the macroscopic property such as the effective permittivity to microstructural characteristics such as the mean coordination number and radial distribution function. In addition, several approximate effective medium theories, exact bounds, exact results for two-dimensional regular arrays, and the exact dilute limit are used to test and validate the finite-element algorithm. Numerical results indicate that the fourth-order bounds provide an excellent estimate of the effective permittivity for a wide range of surface fractions, in accordance with the fact that the bounds become progressively
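
    Generating an admissible random disk configuration with periodic boundaries and a minimum separation distance, the kind of microstructure such calculations start from, might look like this (radius, separation, and counts are illustrative, and simple rejection sampling stands in for the Monte Carlo equilibration described in the abstract):

```python
import math
import random

random.seed(11)

# Random placement of hard disks in a periodic unit square with a
# minimum centre-to-centre separation.
L, radius, min_sep, n_disks = 1.0, 0.05, 0.12, 30

def periodic_dist(a, b):
    """Distance under periodic boundary conditions (minimum image)."""
    dx = abs(a[0] - b[0]); dx = min(dx, L - dx)
    dy = abs(a[1] - b[1]); dy = min(dy, L - dy)
    return math.hypot(dx, dy)

disks = []
attempts = 0
while len(disks) < n_disks and attempts < 100_000:
    attempts += 1
    candidate = (random.random() * L, random.random() * L)
    if all(periodic_dist(candidate, d) >= min_sep for d in disks):
        disks.append(candidate)

surface_fraction = n_disks * math.pi * radius ** 2 / L ** 2
print(f"placed {len(disks)} disks, surface fraction {surface_fraction:.3f}")
```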

  2. Adequate mathematical modelling of environmental processes

    NASA Astrophysics Data System (ADS)

    Chashechkin, Yu. D.

    2012-04-01

    In environmental observations and laboratory visualization both large scale flow components like currents, jets, vortices, waves and a fine structure are registered (different examples are given). The conventional mathematical modeling both analytical and numerical is directed mostly on description of energetically important flow components. The role of a fine structures is still remains obscured. A variety of existing models makes it difficult to choose the most adequate and to estimate mutual assessment of their degree of correspondence. The goal of the talk is to give scrutiny analysis of kinematics and dynamics of flows. A difference between the concept of "motion" as transformation of vector space into itself with a distance conservation and the concept of "flow" as displacement and rotation of deformable "fluid particles" is underlined. Basic physical quantities of the flow that are density, momentum, energy (entropy) and admixture concentration are selected as physical parameters defined by the fundamental set which includes differential D'Alembert, Navier-Stokes, Fourier's and/or Fick's equations and closing equation of state. All of them are observable and independent. Calculations of continuous Lie groups shown that only the fundamental set is characterized by the ten-parametric Galilelian groups reflecting based principles of mechanics. Presented analysis demonstrates that conventionally used approximations dramatically change the symmetries of the governing equations sets which leads to their incompatibility or even degeneration. The fundamental set is analyzed taking into account condition of compatibility. A high order of the set indicated on complex structure of complete solutions corresponding to physical structure of real flows. 
Analytical solutions of a number problems including flows induced by diffusion on topography, generation of the periodic internal waves a compact sources in week-dissipative media as well as numerical solutions of the same

  3. Multi-level Monte Carlo finite volume methods for uncertainty quantification of acoustic wave propagation in random heterogeneous layered medium

    NASA Astrophysics Data System (ADS)

    Mishra, S.; Schwab, Ch.; Šukys, J.

    2016-05-01

    We consider the very challenging problem of efficient uncertainty quantification for acoustic wave propagation in a highly heterogeneous, possibly layered, random medium, characterized by possibly anisotropic, piecewise log-exponentially distributed Gaussian random fields. A multi-level Monte Carlo finite volume method is proposed, along with a novel, bias-free upscaling technique that allows the input random fields, generated using spectral FFT methods, to be represented efficiently. Combined with a recently developed dynamic load balancing algorithm that scales to massively parallel computing architectures, the proposed method is able to robustly compute uncertainty for highly realistic random subsurface formations that can contain a very large number (millions) of sources of uncertainty. Numerical experiments, in both two and three space dimensions, illustrate the efficiency of the method.
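
    The multi-level telescoping idea can be shown on a standard toy problem: estimating E[S_T] for geometric Brownian motion with Euler steps halved at each level. This is a generic MLMC sketch, not the authors' finite volume solver; parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# MLMC for E[S_T] of GBM dS = mu*S dt + sigma*S dW, Euler-discretized
# with step T/2^l at level l. Fine and coarse paths share Brownian
# increments: the coupling that makes the level corrections small.
S0, mu, sigma, T = 1.0, 0.05, 0.2, 1.0

def level_estimator(l, n_samples):
    """Sample mean of P_l - P_{l-1} (or P_0 at the coarsest level)."""
    nf = 2 ** l
    dt = T / nf
    dW = rng.normal(0.0, np.sqrt(dt), (n_samples, nf))
    Sf = np.full(n_samples, S0)
    for k in range(nf):
        Sf = Sf * (1.0 + mu * dt + sigma * dW[:, k])
    if l == 0:
        return Sf.mean()
    Sc = np.full(n_samples, S0)
    dtc = 2.0 * dt
    for k in range(nf // 2):   # coarse path reuses summed increments
        Sc = Sc * (1.0 + mu * dtc + sigma * (dW[:, 2 * k] + dW[:, 2 * k + 1]))
    return (Sf - Sc).mean()

# Telescoping sum: many samples on cheap coarse levels, few on fine.
samples = [200_000, 100_000, 50_000, 25_000, 12_500]
estimate = sum(level_estimator(l, n) for l, n in enumerate(samples))

exact = S0 * np.exp(mu * T)
print(f"MLMC estimate: {estimate:.4f}, exact: {exact:.4f}")
```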

  4. Combining Monte Carlo and mean-field-like methods for inference in hidden Markov random fields.

    PubMed

    Forbes, Florence; Fort, Gersende

    2007-03-01

    Issues involving missing data are typical settings where exact inference is not tractable as soon as nontrivial interactions occur between the missing variables. Approximations are required, and most of them are based either on simulation methods or on deterministic variational methods. While variational methods provide fast and reasonable approximate estimates in many scenarios, simulation methods offer more consideration of important theoretical issues such as accuracy of the approximation and convergence of the algorithms but at a much higher computational cost. In this work, we propose a new class of algorithms that combine the main features and advantages of both simulation and deterministic methods and consider applications to inference in hidden Markov random fields (HMRFs). These algorithms can be viewed as stochastic perturbations of variational expectation maximization (VEM) algorithms, which are not tractable for HMRF. We focus more specifically on one of these perturbations and we prove their (almost sure) convergence to the same limit set as the limit set of VEM. In addition, experiments on synthetic and real-world images show that the algorithm performance is very close and sometimes better than that of other existing simulation-based and variational EM-like algorithms.

  5. Classification method for disease risk mapping based on discrete hidden Markov random fields.

    PubMed

    Charras-Garrido, Myriam; Abrial, David; Goër, Jocelyn De; Dachian, Sergueï; Peyrard, Nathalie

    2012-04-01

    Risk mapping in epidemiology enables areas with a low or high risk of disease contamination to be localized and provides a measure of risk differences between these regions. Risk mapping models for pooled data currently used by epidemiologists focus on the estimated risk for each geographical unit. They are based on a Poisson log-linear mixed model with a latent intrinsic continuous hidden Markov random field (HMRF) generally corresponding to a Gaussian autoregressive spatial smoothing. Risk classification, which is necessary to draw clearly delimited risk zones (in which protection measures may be applied), generally must be performed separately. We propose a method for direct classified risk mapping based on a Poisson log-linear mixed model with a latent discrete HMRF. The discrete hidden field (HF) corresponds to the assignment of each spatial unit to a risk class. The risk values attached to the classes are parameters and are estimated. When mapping risk using HMRFs, the conditional distribution of the observed field is modeled with a Poisson rather than a Gaussian distribution as in image segmentation. Moreover, abrupt changes in risk levels are rare in disease maps. The spatial hidden model should favor smoothed out risks, but conventional discrete Markov random fields (e.g. the Potts model) do not impose this. We therefore propose new potential functions for the HF that take into account class ordering. We use a Monte Carlo version of the expectation-maximization algorithm to estimate parameters and determine risk classes. We illustrate the method's behavior on simulated and real data sets. Our method appears particularly well adapted to localize high-risk regions and estimate the corresponding risk levels.

  6. Statistical methods for efficient design of community surveys of response to noise: Random coefficients regression models

    NASA Technical Reports Server (NTRS)

    Tomberlin, T. J.

    1985-01-01

    Research studies of residents' responses to noise consist of interviews with samples of individuals who are drawn from a number of different compact study areas. The statistical techniques developed here provide a basis for the associated sample design decisions. These techniques are suitable for a wide range of sample survey applications. A sample may consist of a random sample of residents selected from a sample of compact study areas, or, in a more complex design, of a sample of residents selected from a sample of larger areas (e.g., cities). The techniques may be applied to estimates of the effects on annoyance of noise level, numbers of noise events, the time-of-day of the events, ambient noise levels, or other factors. Methods are provided for determining, in advance, how accurately these effects can be estimated for different sample sizes and study designs. Using a simple cost function, they also provide for optimum allocation of the sample across the stages of the design for estimating these effects. These techniques are developed via a regression model in which the regression coefficients are assumed to be random, with components of variance associated with the various stages of a multi-stage sample design.
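
    A random-coefficients model of this kind is easy to simulate. In the sketch below, each study area j has its own slope beta + u_j relating annoyance to noise level, and the between-area variance component is estimated from per-area OLS slopes; all parameters are invented for illustration, not taken from the report.

```python
import numpy as np

rng = np.random.default_rng(9)

# Random-coefficients regression: annoyance_ij = (beta + u_j) * noise_ij
# + e_ij, with u_j a between-area random effect.
beta, sd_between, sd_within = 0.8, 0.1, 0.5
n_areas, n_residents = 40, 50

slopes = []
for _ in range(n_areas):
    beta_j = beta + rng.normal(0.0, sd_between)
    noise = rng.uniform(50.0, 80.0, n_residents)            # dB levels
    annoyance = beta_j * noise + rng.normal(0.0, sd_within, n_residents)
    slopes.append(np.polyfit(noise, annoyance, 1)[0])       # per-area OLS

beta_hat = np.mean(slopes)
between_var = np.var(slopes, ddof=1)
print(f"estimated slope {beta_hat:.3f} (true {beta}), "
      f"between-area variance {between_var:.4f}")
```

    The spread of the per-area slopes is what drives the sample-allocation trade-off: adding areas reduces the between-area component, while adding residents per area only reduces the within-area one.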

  7. Skeletonization of the internal thoracic artery: a randomized comparison of harvesting methods.

    PubMed

    Urso, Stefano; Alvarez, Luis; Sádaba, Rafael; Greco, Ernesto

    2008-02-01

    We performed a randomized study to compare internal thoracic artery (ITA) flow response to two harvesting methods used in the skeletonization procedure: ultrasonic scalpel and bipolar electrocautery. Sixty patients scheduled for CABG were randomized to receive either ultrasonically (n=30 patients) or electrocautery (n=30 patients) skeletonized ITAs. Intraoperative ITA graft mean flows were obtained with a transit-time flowmeter. ITA flows were evaluated at the beginning (Time 1) and at the end (Time 2) of the harvesting procedure. Post-cardiopulmonary bypass (CPB) flow measurement (Time 3) was obtained in the ITA grafts anastomosed to the left anterior descending artery. Intraoperative mean flow decreased significantly within ultrasonic group (Group U) and electrocautery group (Group E) at the end of the harvesting procedure (P<0.0001 in both cases). Within both groups the final mean flow measured on anastomosed ITAs (Time 3) was significantly higher than the beginning ITA flow value (Time 1). No statistical difference was noted comparing ITA flows between the two groups at any time of evaluation. Skeletonization harvesting of the ITA produces a modification of the mean flow. The quantity and the reversibility of this phenomenon, probably related to vasospasm, are independent from the energy source used in the skeletonization procedure. PMID:17998305

  8. A Novel Microaneurysms Detection Method Based on Local Applying of Markov Random Field.

    PubMed

    Ganjee, Razieh; Azmi, Reza; Moghadam, Mohsen Ebrahimi

    2016-03-01

    Diabetic Retinopathy (DR) is one of the most common complications of long-term diabetes. It is a progressive disease and, by damaging the retina, it finally results in blindness. Since microaneurysms (MAs) appear as a first sign of DR in the retina, early detection of this lesion is an essential step in automatic detection of DR. In this paper, a new MAs detection method is presented. The proposed approach consists of two main steps. In the first step, the MA candidates are detected based on local application of a Markov random field model (MRF). In the second step, these candidate regions are categorized to identify the correct MAs using 23 features based on shape, intensity and the Gaussian distribution of MA intensity. The proposed method is evaluated on DIARETDB1, a standard and publicly available database in this field. Evaluation on this database resulted in an average sensitivity of 0.82 for a confidence level of 75 as ground truth. The results show that our method is able to detect low-contrast MAs against the background while its performance remains comparable to other state-of-the-art approaches.

  9. Reduction of claustrophobia during magnetic resonance imaging: methods and design of the "CLAUSTRO" randomized controlled trial

    PubMed Central

    2011-01-01

    Background Magnetic resonance (MR) imaging has been described as the most important medical innovation in the last 25 years. Over 80 million MR procedures are now performed each year, and on average 2.3% (95% confidence interval: 2.0 to 2.5%) of all patients scheduled for MR imaging suffer from claustrophobia. Thus, claustrophobia preventing completion of MR imaging is a common problem, and approximately 2,000,000 MR procedures worldwide cannot be completed each year for this reason. Patients with claustrophobic anxiety are more likely to be frightened and to experience a feeling of confinement or being closed in during MR imaging. In these patients, conscious sedation and additional sequences (after sedation) may be necessary to complete the examinations. Further improvements in MR design appear to be essential to alleviate this situation and broaden the applicability of MR imaging. A more open scanner configuration might help reduce claustrophobic reactions while maintaining image quality and diagnostic accuracy. Methods/Design We propose to analyze the rate of claustrophobic reactions, clinical utility, image quality, patient acceptance, and cost-effectiveness of an open MR scanner in a randomized comparison with a recently designed short-bore but closed scanner with 97% noise reduction. The primary aim of this study is thus to determine whether an open MR scanner can reduce claustrophobic reactions, thereby enabling more examinations of claustrophobic patients without incurring the safety issues associated with conscious sedation. In this manuscript we detail the methods and design of the prospective "CLAUSTRO" trial. Discussion This randomized controlled trial will be the first direct comparison of open vertical and closed short-bore MR systems with regard to claustrophobia, image quality, and diagnostic utility. Trial Registration ClinicalTrials.gov: NCT00715806 PMID:21310075

  10. A Method to Estimate Temporal Interaction in a Conditional Random Field Based Approach for Crop Recognition

    NASA Astrophysics Data System (ADS)

    Diaz, P. M. A.; Feitosa, R. Q.; Sanches, I. D.; Costa, G. A. O. P.

    2016-06-01

    This paper presents a method to estimate the temporal interaction in a Conditional Random Field (CRF) based approach for crop recognition from multitemporal remote sensing image sequences. This approach models the phenology of different crop types as a CRF. Interaction potentials are assumed to depend only on the class labels of an image site at two consecutive epochs. In the proposed method, the estimation of temporal interaction parameters is considered as an optimization problem, whose goal is to find the transition matrix that maximizes the CRF performance, upon a set of labelled data. The objective functions underlying the optimization procedure can be formulated in terms of different accuracy metrics, such as overall and average class accuracy per crop or phenological stages. To validate the proposed approach, experiments were carried out upon a dataset consisting of 12 co-registered LANDSAT images of a region in southeast of Brazil. Pattern Search was used as the optimization algorithm. The experimental results demonstrated that the proposed method was able to substantially outperform estimates related to joint or conditional class transition probabilities, which rely on training samples.

  11. A new acupuncture method for management of irritable bowel syndrome: A randomized double blind clinical trial

    PubMed Central

    Rafiei, Rahmatollah; Ataie, Mehdi; Ramezani, Mohammad Arash; Etemadi, Ali; Ataei, Behrooz; Nikyar, Hamidreza; Abdoli, Saman

    2014-01-01

    Background: Irritable bowel syndrome (IBS) is a multifactorial functional gastrointestinal disorder of unknown etiology. There are several modalities for its treatment. Acupuncture is increasingly used in numerous diseases, including gastrointestinal disorders like IBS. The purpose of the study was to assess the effects of catgut-embedding acupuncture in improving IBS. Materials and Methods: A randomized double-blind sham-controlled clinical trial was designed. A total of 60 IBS patients were assigned to three separate groups. The first group received clofac as the drug-only group (DO), the second received catgut-embedding acupuncture at specific points (AP), and the last group received sham acupuncture (SA). Symptoms, pain, depression and anxiety were assessed at baseline and after two weeks, at the end of the study. Results: There was a statistically significant difference between the AP group and the SA and DO groups in constipation and bloating. Statistically significant differences also favored acupuncture for pain (F = 6.409, P = 0.003) and depression (F = 6.735, P = 0.002) as the other outcomes. The average (standard deviation, SD) weight loss was 2 kg (0.88) in the acupuncture group. Conclusion: Our findings showed a significant positive association between acupuncture and improvement of IBS. Catgut-embedding acupuncture is a new method that can alleviate IBS symptoms and can be used as an alternative therapeutic method for improvement of IBS. PMID:25538771

  12. Effects of Pilates method in elderly people: Systematic review of randomized controlled trials.

    PubMed

    de Oliveira Francisco, Cristina; de Almeida Fagundes, Alessandra; Gorges, Bruna

    2015-07-01

    The Pilates method has been widely used in physical training and rehabilitation, but evidence regarding its effectiveness in elderly people is limited. Six randomized controlled trials involving the use of the Pilates method in elderly people, published prior to December 2013, were selected from the databases PubMed, MEDLINE, Embase, Cochrane, Scielo and PEDro. Three articles suggested that Pilates produced improvements in balance. Two studies evaluated adherence to Pilates programs. One study assessed Pilates' influence on cardio-metabolic parameters, and another evaluated changes in body composition. Strong evidence was found for beneficial effects of Pilates on static and dynamic balance in women. Nevertheless, evidence of balance improvement in both genders, of changes in body composition in women, and of adherence to Pilates programs was limited. Effects of Pilates training on cardio-metabolic parameters were inconclusive. Pilates may be a useful tool in rehabilitation and prevention programs, but more high-quality studies are necessary to establish all its effects on elderly populations. PMID:26118523

  13. THE LOSS OF ACCURACY OF STOCHASTIC COLLOCATION METHOD IN SOLVING NONLINEAR DIFFERENTIAL EQUATIONS WITH RANDOM INPUT DATA

    SciTech Connect

    Webster, Clayton G; Tran, Hoang A; Trenchea, Catalin S

    2013-01-01

    In this paper we show how the stochastic collocation method (SCM) can fail to converge for nonlinear differential equations with random coefficients. First, we consider the Navier-Stokes equation with uncertain viscosity and derive error estimates for the stochastic collocation discretization. Our analysis gives some indicators of how the nonlinearity negatively affects the accuracy of the method. The stochastic collocation method is then applied to a noisy Lorenz system. Simulation results demonstrate that the solution of a nonlinear equation can be highly irregular with respect to the random data, and in such cases the stochastic collocation method cannot capture the correct solution.
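The mechanics of stochastic collocation can be seen on a minimal, well-behaved toy problem (an illustrative sketch, not the authors' Navier-Stokes or Lorenz analysis): the equation is solved deterministically at quadrature nodes of the random parameter, and the node solutions are combined with the quadrature weights to estimate the mean solution.

```python
import math

# 3-point Gauss-Legendre rule on [-1, 1] (exact for polynomials up to degree 5)
NODES = (-math.sqrt(3.0 / 5.0), 0.0, math.sqrt(3.0 / 5.0))
WEIGHTS = (5.0 / 9.0, 8.0 / 9.0, 5.0 / 9.0)

def scm_mean_decay(T=1.0, k_lo=0.5, k_hi=1.5):
    """Estimate E[y(T)] for y' = -k*y, y(0) = 1, with k ~ Uniform(k_lo, k_hi):
    solve deterministically at each collocation node, then average with
    the quadrature weights (which sum to 2 on [-1, 1])."""
    total = 0.0
    for x, w in zip(NODES, WEIGHTS):
        k = 0.5 * (k_hi - k_lo) * x + 0.5 * (k_hi + k_lo)
        total += w * math.exp(-k * T)  # exp(-k*T) stands in for an ODE solve
    return total / 2.0

# exact mean for comparison: integral of exp(-k) over [0.5, 1.5] at T = 1
exact = math.exp(-0.5) - math.exp(-1.5)
```

For this smooth, linear problem a handful of nodes already matches the exact mean closely; the paper's point is that this rapid convergence can be lost when the solution is an irregular function of the random input.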

  14. Genetically controlled random search: a global optimization method for continuous multidimensional functions

    NASA Astrophysics Data System (ADS)

    Tsoulos, Ioannis G.; Lagaris, Isaac E.

    2006-01-01

    A new stochastic method for locating the global minimum of a multidimensional function inside a rectangular hyperbox is presented. A sampling technique is employed that makes use of the procedure known as grammatical evolution. The method can be considered as a "genetic" modification of the Controlled Random Search procedure due to Price. The user may code the objective function either in C++ or in Fortran 77. We offer a comparison of the new method with others of similar structure, by presenting results of computational experiments on a set of test functions.
    Program summary
    Title of program: GenPrice
    Catalogue identifier: ADWP
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADWP
    Program available from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Computer for which the program is designed and others on which it has been tested: the tool is designed to be portable to all systems running the GNU C++ compiler
    Installation: University of Ioannina, Greece
    Programming language used: GNU-C++, GNU-C, GNU Fortran-77
    Memory required to execute with typical data: 200 KB
    No. of bits in a word: 32
    No. of processors used: 1
    Has the code been vectorized or parallelized?: no
    No. of lines in distributed program, including test data, etc.: 13 135
    No. of bytes in distributed program, including test data, etc.: 78 512
    Distribution format: tar.gz
    Nature of physical problem: A multitude of problems in science and engineering are often reduced to minimizing a function of many variables. There are instances where a local optimum does not correspond to the desired physical solution and hence the search for a better solution is required. Local optimization techniques are frequently trapped in local minima. Global optimization is hence the appropriate tool. For example, solving a nonlinear system of equations via optimization, employing a "least squares" type of objective, one may encounter many local minima that do not correspond to solutions, i.e. minima with values
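Price's Controlled Random Search, which GenPrice modifies, can be sketched in a few lines (an illustrative reimplementation, not the GenPrice code itself): maintain a population of random points, repeatedly reflect a randomly chosen simplex through its centroid, and replace the current worst member whenever the trial point improves on it.

```python
import random

def crs_minimize(f, bounds, n_pop=50, iters=3000, seed=0):
    """Controlled Random Search (Price): population-based global minimization
    inside the rectangular hyperbox described by `bounds`."""
    rng = random.Random(seed)
    d = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_pop)]
    vals = [f(p) for p in pop]
    for _ in range(iters):
        worst = max(range(n_pop), key=vals.__getitem__)
        idx = rng.sample(range(n_pop), d + 1)  # random simplex of d+1 points
        centroid = [sum(pop[i][j] for i in idx[:-1]) / d for j in range(d)]
        # reflect the (d+1)-th point through the centroid of the others
        trial = [2.0 * centroid[j] - pop[idx[-1]][j] for j in range(d)]
        if all(lo <= t <= hi for t, (lo, hi) in zip(trial, bounds)):
            ft = f(trial)
            if ft < vals[worst]:
                pop[worst], vals[worst] = trial, ft
    best = min(range(n_pop), key=vals.__getitem__)
    return pop[best], vals[best]
```

On a convex test function the population contracts steadily around the minimum; GenPrice additionally evolves the sampling procedure via grammatical evolution, which this sketch omits.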

  15. Adequate histologic sectioning of prostate needle biopsies.

    PubMed

    Bostwick, David G; Kahane, Hillel

    2013-08-01

    No standard method exists for sampling prostate needle biopsies, although most reports claim to embed 3 cores per block and obtain 3 slices from each block. This study was undertaken to determine the extent of histologic sectioning necessary for optimal examination of prostate biopsies. We prospectively compared the impact on cancer yield of submitting 1 biopsy core per cassette (biopsies from January 2010) with 3 cores per cassette (biopsies from August 2010) from a large national reference laboratory. Between 6 and 12 slices were obtained with the former 1-core method, resulting in 3 to 6 slices being placed on each of 2 slides; for the latter 3-core method, a limit of 6 slices was obtained, resulting in 3 slices being placed on each of 2 slides. A total of 6708 sets of 12 to 18 core biopsies were studied, including 3509 biopsy sets from the 1-biopsy-core-per-cassette group (January 2010) and 3199 biopsy sets from the 3-biopsy-cores-per-cassette group (August 2010). The yield of diagnoses was classified as benign, atypical small acinar proliferation, high-grade prostatic intraepithelial neoplasia, and cancer and was similar with the 2 methods: 46.2%, 8.2%, 4.5%, and 41.1% and 46.7%, 6.3%, 4.4%, and 42.6%, respectively (P = .02). Submission of 1 core or 3 cores per cassette had no effect on the yield of atypical small acinar proliferation, prostatic intraepithelial neoplasia, or cancer in prostate needle biopsies. Consequently, we recommend submission of 3 cores per cassette to minimize the labor and cost of processing. PMID:23764163

  16. Non-stationary random vibration analysis of a 3D train-bridge system using the probability density evolution method

    NASA Astrophysics Data System (ADS)

    Yu, Zhi-wu; Mao, Jian-feng; Guo, Feng-qi; Guo, Wei

    2016-03-01

    Rail irregularity is one of the main sources causing train-bridge random vibration. A new random vibration theory for the coupled train-bridge systems is proposed in this paper. First, number theory method (NTM) with 2N-dimensional vectors for the stochastic harmonic function (SHF) of rail irregularity power spectrum density was adopted to determine the representative points of spatial frequencies and phases to generate the random rail irregularity samples, and the non-stationary rail irregularity samples were modulated with the slowly varying function. Second, the probability density evolution method (PDEM) was employed to calculate the random dynamic vibration of the three-dimensional (3D) train-bridge system by a program compiled on the MATLAB® software platform. Eventually, the Newmark-β integration method and double edge difference method of total variation diminishing (TVD) format were adopted to obtain the mean value curve, the standard deviation curve and the time-history probability density information of responses. A case study was presented in which the ICE-3 train travels on a three-span simply-supported high-speed railway bridge with excitation of random rail irregularity. The results showed that compared to the Monte Carlo simulation, the PDEM has higher computational efficiency for the same accuracy, i.e., an improvement by 1-2 orders of magnitude. Additionally, the influences of rail irregularity and train speed on the random vibration of the coupled train-bridge system were discussed.

  17. Comparison of four different pain relief methods during hysterosalpingography: A randomized controlled study

    PubMed Central

    Unlu, Bekir Serdar; Yilmazer, Mehmet; Koken, Gulengul; Arioz, Dagistan Tolga; Unlu, Ebru; Baki, Elif Dogan; Kurttay, Cemile; Karacin, Osman

    2015-01-01

    BACKGROUND: Hysterosalpingography (HSG) is the most commonly used method for evaluating the anatomy and patency of the uterine cavity and fallopian tubes, and is an important tool in the evaluation of infertility. The most frequent side effect is the pain associated with the procedure. OBJECTIVES: To evaluate four analgesic methods to determine the most useful method for reducing discomfort associated with HSG. METHODS: In the present prospective study, 75 patients undergoing HSG for evaluation of infertility were randomly assigned to four groups: 550 mg of a nonsteroidal anti-inflammatory drug (NSAID) (group 1); 550 mg NSAID + paracervical block (group 2); 550 mg NSAID + paracervical analgesic cream (group 3); or 550 mg NSAID + intrauterine analgesic instillation (group 4). A visual analogue scale was used to assess the pain perception at five predefined steps. RESULTS: Instillation of the liquids used for HSG was found to be the most painful step of HSG, and this step was where the only significant difference among groups was observed. When comparing visual analogue scale scores, group 2 and group 3 reported significantly less pain than the other groups. Group 1 reported significantly higher mean (± SD) scores (7.2±1.6) compared with groups 2 and 3 (4.7±2.5 and 3.8±2.4, respectively) (P<0.001). In addition, group 2 reported significantly less pain than group 4 (4.7±2.5 versus 6.7±1.8, respectively) (P<0.02). CONCLUSIONS: For effective pain relief during HSG, in addition to 550 mg NSAID, local application of lidocaine cream to the posterior fornix of the cervix uteri and paracervical lidocaine injection into the cervix uteri appear to be the most effective methods. PMID:25848848

  18. A comparison between orthogonal and parallel plating methods for distal humerus fractures: a prospective randomized trial.

    PubMed

    Lee, Sang Ki; Kim, Kap Jung; Park, Kyung Hoon; Choy, Won Sik

    2014-10-01

    With the continuing improvements in implants for distal humerus fractures, it is expected that newer types of plates, which are anatomically precontoured, thinner, and less irritating to soft tissue, would have comparable outcomes when used in a clinical study. The purpose of this study was to compare the clinical and radiographic outcomes in patients with distal humerus fractures who were treated with orthogonal and parallel plating methods using precontoured distal humerus plates. Sixty-seven patients with a mean age of 55.4 years (range 22-90 years) were included in this prospective study. The subjects were randomly assigned to receive 1 of 2 treatments: orthogonal or parallel plating. The following outcomes were assessed: operating time, time to fracture union, presence of a step or gap at the articular margin, varus-valgus angulation, functional recovery, and complications. No significant differences were found between the orthogonal and parallel plating methods in terms of radiological or clinical results, mean operation time, union time, or complication rates. There were no cases of fracture nonunion in either group; heterotopic ossification was found in 3 patients in the orthogonal plating group and in 2 patients in the parallel plating group. However, the orthogonal plating method may be preferred in cases of coronal shear fractures, where posterior-to-anterior fixation may provide additional stability to the intraarticular fracture. The parallel plating method may be the preferred technique for fractures that occur at the most distal end of the humerus.

  19. Study of Electromagnetic Scattering From Material Object Doped Randomly With Thin Metallic Wires Using Finite Element Method

    NASA Technical Reports Server (NTRS)

    Deshpande, Manohar D.

    2005-01-01

    A new numerical simulation method using the finite element methodology (FEM) is presented to study electromagnetic scattering due to an arbitrarily shaped material body doped randomly with thin and short metallic wires. The FEM approach described in many standard textbooks is appropriately modified to account for the presence of thin and short metallic wires distributed randomly inside an arbitrarily shaped material body. Using this modified FEM approach, the electromagnetic scattering due to cylindrical and spherical material bodies doped randomly with thin metallic wires is studied.

  20. Method for high-volume sequencing of nucleic acids: random and directed priming with libraries of oligonucleotides

    DOEpatents

    Studier, F.W.

    1995-04-18

    Random and directed priming methods for determining nucleotide sequences by enzymatic sequencing techniques, using libraries of primers of lengths 8, 9 or 10 bases, are disclosed. These methods permit direct sequencing of nucleic acids as large as 45,000 base pairs or larger without the necessity for subcloning. Individual primers are used repeatedly to prime sequence reactions in many different nucleic acid molecules. Libraries containing as few as 10,000 octamers, 14,200 nonamers, or 44,000 decamers would have the capacity to determine the sequence of almost any cosmid DNA. Random priming with a fixed set of primers from a smaller library can also be used to initiate the sequencing of individual nucleic acid molecules, with the sequence being completed by directed priming with primers from the library. In contrast to random cloning techniques, a combined random and directed priming strategy is far more efficient. 2 figs.

  1. Method for high-volume sequencing of nucleic acids: random and directed priming with libraries of oligonucleotides

    DOEpatents

    Studier, F. William

    1995-04-18

    Random and directed priming methods for determining nucleotide sequences by enzymatic sequencing techniques, using libraries of primers of lengths 8, 9 or 10 bases, are disclosed. These methods permit direct sequencing of nucleic acids as large as 45,000 base pairs or larger without the necessity for subcloning. Individual primers are used repeatedly to prime sequence reactions in many different nucleic acid molecules. Libraries containing as few as 10,000 octamers, 14,200 nonamers, or 44,000 decamers would have the capacity to determine the sequence of almost any cosmid DNA. Random priming with a fixed set of primers from a smaller library can also be used to initiate the sequencing of individual nucleic acid molecules, with the sequence being completed by directed priming with primers from the library. In contrast to random cloning techniques, a combined random and directed priming strategy is far more efficient.

  2. Upscaling solute transport in naturally fractured porous media with the continuous time random walk method

    NASA Astrophysics Data System (ADS)

    Geiger, S.; Cortis, A.; Birkholzer, J. T.

    2010-12-01

    Solute transport in fractured porous media is typically "non-Fickian"; that is, it is characterized by early breakthrough and long tailing and by nonlinear growth of the Green function-centered second moment. This behavior is due to the effects of (1) multirate diffusion occurring between the highly permeable fracture network and the low-permeability rock matrix, (2) a wide range of advection rates in the fractures and, possibly, the matrix as well, and (3) a range of path lengths. As a consequence, prediction of solute transport processes at the macroscale represents a formidable challenge. Classical dual-porosity (or mobile-immobile) approaches in conjunction with an advection-dispersion equation and macroscopic dispersivity commonly fail to predict breakthrough of fractured porous media accurately. It was recently demonstrated that the continuous time random walk (CTRW) method can be used as a generalized upscaling approach. Here we extend this work and use results from high-resolution finite element-finite volume-based simulations of solute transport in an outcrop analogue of a naturally fractured reservoir to calibrate the CTRW method by extracting a distribution of retention times. This procedure allows us to predict breakthrough at other model locations accurately and to gain significant insight into the nature of the fracture-matrix interaction in naturally fractured porous reservoirs with geologically realistic fracture geometries.

  3. Upscaling solute transport in naturally fractured porous media with the continuous time random walk method

    SciTech Connect

    Geiger, S.; Cortis, A.; Birkholzer, J.T.

    2010-04-01

    Solute transport in fractured porous media is typically 'non-Fickian'; that is, it is characterized by early breakthrough and long tailing and by nonlinear growth of the Green function-centered second moment. This behavior is due to the effects of (1) multirate diffusion occurring between the highly permeable fracture network and the low-permeability rock matrix, (2) a wide range of advection rates in the fractures and, possibly, the matrix as well, and (3) a range of path lengths. As a consequence, prediction of solute transport processes at the macroscale represents a formidable challenge. Classical dual-porosity (or mobile-immobile) approaches in conjunction with an advection-dispersion equation and macroscopic dispersivity commonly fail to predict breakthrough of fractured porous media accurately. It was recently demonstrated that the continuous time random walk (CTRW) method can be used as a generalized upscaling approach. Here we extend this work and use results from high-resolution finite element-finite volume-based simulations of solute transport in an outcrop analogue of a naturally fractured reservoir to calibrate the CTRW method by extracting a distribution of retention times. This procedure allows us to predict breakthrough at other model locations accurately and to gain significant insight into the nature of the fracture-matrix interaction in naturally fractured porous reservoirs with geologically realistic fracture geometries.
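The core idea of CTRW upscaling, transport governed by a distribution of retention times rather than a single rate, can be illustrated with a minimal sketch (an illustrative toy model, not the calibrated CTRW of the paper): each particle makes unit jumps separated by heavy-tailed waiting times, which reproduces the early breakthrough and long tailing described above.

```python
import random

def ctrw_breakthrough(n_steps=200, alpha=1.5, n_particles=2000, seed=1):
    """Arrival times of particles that each make n_steps unit jumps,
    every jump preceded by a Pareto(alpha) waiting (retention) time."""
    rng = random.Random(seed)
    times = []
    for _ in range(n_particles):
        t = 0.0
        for _ in range(n_steps):
            t += rng.paretovariate(alpha)  # heavy-tailed retention time
        times.append(t)
    return sorted(times)
```

With a heavy-tailed waiting-time distribution the sample mean arrival time sits well above the median: the bulk of the plume breaks through early, while a few strongly retained particles produce the long tail.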

  4. Causal inference methods to assess safety upper bounds in randomized trials with noncompliance

    PubMed Central

    Berlin, Jesse A; Pinheiro, José; Wilcox, Marsha A

    2015-01-01

    Background Premature discontinuation and other forms of noncompliance with treatment assignment can complicate causal inference of treatment effects in randomized trials. The intent-to-treat analysis gives unbiased estimates for causal effects of treatment assignment on outcome, but may understate the potential benefit or harm of actual treatment. The corresponding upper confidence limit can also be underestimated. Purpose To compare estimates of the hazard ratio and upper bound of the two-sided 95% confidence interval from causal inference methods that account for noncompliance with those from the intent-to-treat analysis. Methods We used simulations with parameters chosen to reflect cardiovascular safety trials of diabetes drugs, with a focus on upper bound estimates relative to 1.3, based on regulatory guidelines. A total of 1000 simulations were run under each parameter combination for a hypothetical trial of 10,000 total subjects randomly assigned to active treatment or control at a 1:1 ratio. Noncompliance was considered in the form of treatment discontinuation and cross-over at specified proportions, with assumed true hazard ratios of 0.9, 1, and 1.3. Various levels of risk associated with being a non-complier (independent of treatment status) were evaluated. Hazard ratio and upper bound estimates from causal survival analysis and intent-to-treat were obtained from each simulation and summarized under each parameter setting. Results Causal analysis estimated the true hazard ratio with little bias in almost all settings examined. Intent-to-treat was unbiased only when the true hazard ratio = 1; otherwise it underestimated both benefit and harm. When upper bound estimates from intent-to-treat were ≥1.3, corresponding estimates from causal analysis were also ≥1.3 in almost 100% of the simulations, regardless of the true hazard ratio. When upper bound estimates from intent-to-treat were <1.3 and the true hazard ratio = 1, corresponding
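The attenuation of intent-to-treat estimates under crossover can be seen in a toy simulation (hypothetical rates and proportions, far simpler than the trial simulations in the paper): with exponential event times the events/person-time rate ratio estimates the hazard ratio, and analysing crossover subjects as assigned pulls the ITT estimate toward 1.

```python
import random

def itt_rate_ratio(n=50000, lam_c=0.1, true_hr=1.3, crossover=0.2, seed=2):
    """Intent-to-treat rate ratio under treatment-arm crossover, with
    exponential event times and administrative censoring at 5 years."""
    rng = random.Random(seed)

    def arm_rate(assigned_treated):
        events, exposure = 0, 0.0
        for _ in range(n):
            # a crossover subject is analysed as assigned but exposed as control
            on_treatment = assigned_treated and rng.random() > crossover
            lam = lam_c * (true_hr if on_treatment else 1.0)
            t = rng.expovariate(lam)
            events += t <= 5.0          # event observed before censoring
            exposure += min(t, 5.0)     # person-time at risk
        return events / exposure

    return arm_rate(True) / arm_rate(False)
```

With a true hazard ratio of 1.3 and 20% crossover, the ITT rate ratio lands at roughly 1.24, attenuated toward the null, which is why the paper's upper-bound comparison matters for safety margins pinned at 1.3.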

  5. Adipose Tissue - Adequate, Accessible Regenerative Material

    PubMed Central

    Kolaparthy, Lakshmi Kanth.; Sanivarapu, Sahitya; Moogla, Srinivas; Kutcham, Rupa Sruthi

    2015-01-01

    The potential use of stem cell based therapies for the repair and regeneration of various tissues offers a paradigm shift that may provide alternative therapeutic solutions for a number of diseases. The use of either embryonic stem cells (ESCs) or induced pluripotent stem cells in clinical situations is limited due to cell regulations and to technical and ethical considerations involved in the genetic manipulation of human ESCs, even though these cells are highly beneficial. Mesenchymal stem cells seem to be an ideal population of stem cells; in particular, adipose-derived stem cells (ASCs), which can be obtained in large numbers and easily harvested from adipose tissue. Adipose tissue is ubiquitously available and has several advantages over other sources: it is easily accessible in large quantities with a minimally invasive harvesting procedure, and isolation of adipose-derived mesenchymal stem cells yields the high number of stem cells essential for stem cell based therapies and tissue engineering. Recently, periodontal tissue regeneration using ASCs has been examined in some animal models. This method has potential in the regeneration of functional periodontal tissues because the various growth factors secreted by ASCs might not only promote the regeneration of periodontal tissues but also encourage neovascularization of the damaged tissues. This review summarizes the sources, isolation and characteristics of adipose-derived stem cells, and their potential role in periodontal regeneration is discussed. PMID:26634060

  6. Adipose Tissue - Adequate, Accessible Regenerative Material.

    PubMed

    Kolaparthy, Lakshmi Kanth; Sanivarapu, Sahitya; Moogla, Srinivas; Kutcham, Rupa Sruthi

    2015-11-01

    The potential use of stem cell based therapies for the repair and regeneration of various tissues offers a paradigm shift that may provide alternative therapeutic solutions for a number of diseases. The use of either embryonic stem cells (ESCs) or induced pluripotent stem cells in clinical situations is limited due to cell regulations and to technical and ethical considerations involved in the genetic manipulation of human ESCs, even though these cells are highly beneficial. Mesenchymal stem cells seem to be an ideal population of stem cells; in particular, adipose-derived stem cells (ASCs), which can be obtained in large numbers and easily harvested from adipose tissue. Adipose tissue is ubiquitously available and has several advantages over other sources: it is easily accessible in large quantities with a minimally invasive harvesting procedure, and isolation of adipose-derived mesenchymal stem cells yields the high number of stem cells essential for stem cell based therapies and tissue engineering. Recently, periodontal tissue regeneration using ASCs has been examined in some animal models. This method has potential in the regeneration of functional periodontal tissues because the various growth factors secreted by ASCs might not only promote the regeneration of periodontal tissues but also encourage neovascularization of the damaged tissues. This review summarizes the sources, isolation and characteristics of adipose-derived stem cells, and their potential role in periodontal regeneration is discussed. PMID:26634060

  7. A Zero-One Programming Approach to Gulliksen's Matched Random Subtests Method. Research Report 86-4.

    ERIC Educational Resources Information Center

    van der Linden, Wim J.; Boekkooi-Timminga, Ellen

    In order to estimate the classical coefficient of test reliability, parallel measurements are needed. H. Gulliksen's matched random subtests method, which is a graphical method for splitting a test into parallel test halves, has practical relevance because it maximizes the alpha coefficient as a lower bound of the classical test reliability…

  8. Sample Selection in Randomized Experiments: A New Method Using Propensity Score Stratified Sampling

    ERIC Educational Resources Information Center

    Tipton, Elizabeth; Hedges, Larry; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Caverly, Sarah

    2014-01-01

    Randomized experiments are often seen as the "gold standard" for causal research. Despite the fact that experiments use random assignment to treatment conditions, units are seldom selected into the experiment using probability sampling. Very little research on experimental design has focused on how to make generalizations to well-defined…

  9. Improving function in age-related macular degeneration: design and methods of a randomized clinical trial.

    PubMed

    Rovner, Barry W; Casten, Robin J; Hegel, Mark T; Massof, Robert W; Leiby, Benjamin E; Tasman, William S

    2011-03-01

    Age-Related Macular Degeneration (AMD) is the leading cause of severe vision loss in older adults; it impairs the ability to read, drive, and live independently, and increases the risk of depression, falls, and earlier mortality. Although new medical treatments have improved AMD's prognosis, vision-related disability remains a major public health problem. Improving Function in AMD (IF-AMD) is a two-group randomized, parallel-design, controlled clinical trial that compares the efficacy of Problem-Solving Therapy (PST) with Supportive Therapy (ST) (an attention control treatment) in improving vision function in 240 patients with AMD. PST and ST therapists deliver six one-hour sessions of their respective treatments to subjects in their homes over 2 months. Outcomes are assessed masked to treatment assignment at 3 months (main trial endpoint) and 6 months (maintenance effects). The primary outcome is targeted vision function (TVF), which refers to specific vision-dependent functional goals that subjects highly value but find difficult to achieve. TVF is an innovative outcome measure in that it is targeted and tailored to individual subjects yet measured in a standardized way. This paper describes the research methods, the theoretical and clinical aspects of the study treatments, and the measures used to evaluate functional and psychiatric outcomes in this population.

  10. MEGAWHOP cloning: a method of creating random mutagenesis libraries via megaprimer PCR of whole plasmids.

    PubMed

    Miyazaki, Kentaro

    2011-01-01

    MEGAWHOP allows the cloning of DNA fragments into a vector and can be used in place of conventional restriction digestion/ligation-based procedures. In MEGAWHOP, the DNA fragment to be cloned is used as a set of complementary primers that replace a homologous region in a template vector through whole-plasmid PCR. After synthesis of a nicked circular plasmid, the mixture is treated with DpnI, a dam-methylated DNA-specific restriction enzyme, to digest the template plasmid. The DpnI-treated mixture is then introduced into competent Escherichia coli cells to yield plasmids carrying the replaced insert fragments. Plasmids produced by the MEGAWHOP method are virtually free of contamination by species without any insert or with multiple inserts, as well as by the parent plasmid. Because the fragment is usually long enough not to interfere with hybridization to the template, various types of fragments can be used, with mutations at any site (either known or unknown, random, or specific). By using fragments having homologous sequences at the ends (e.g., adaptor sequences), MEGAWHOP can also be used to recombine nonhomologous sequences mediated by the adaptors, allowing rapid creation of novel constructs and chimeric genes. PMID:21601687

  11. Study of random sequential adsorption by means of the gradient method

    NASA Astrophysics Data System (ADS)

    Loscar, E. S.; Guisoni, N.; Albano, E. V.

    2012-02-01

    By using the gradient method (GM) we study random sequential adsorption (RSA) processes in two dimensions under a gradient constraint that is imposed on the adsorption probability along one axis of the sample. The GM has previously been applied successfully to absorbing phase transitions (both first and second order), and also to the percolation transition. Now, we show that by using the GM the two transitions involved in RSA processes, namely percolation and jamming, can be studied simultaneously by means of the same set of simulations and by using the same theoretical background. For this purpose we theoretically derive the relevant scaling relationships for the RSA of monomers and test our analytical results by means of numerical simulations performed on the RSA of both monomers and dimers. We also show that two differently defined interfaces, which run in the direction perpendicular to the axis where the adsorption probability gradient is applied and separate the high-density (large adsorption probability) and low-density (low adsorption probability) regimes, capture the main features of the jamming and percolation transitions, respectively. According to the GM, the scaling behaviour of those interfaces is governed by the roughness exponent α = 1/(1 + ν), where ν is the suitable correlation length exponent. In addition, we present and discuss in a brief overview some achievements of the GM as applied to different physical situations, including a comparison of the critical exponents determined in the present paper with those already published in the literature.
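A minimal version of the gradient constraint can be sketched as follows (an illustrative monomer-on-lattice toy, not the authors' simulation code): adsorption attempts on an L x L lattice are accepted with a probability that grows linearly along one axis, so a fixed attempt budget leaves a coverage profile that crosses from a low-density to a high-density regime along that axis.

```python
import random

def gradient_rsa(L=100, attempts_per_site=2, seed=3):
    """RSA of monomers on an L x L lattice where an attempt in column x
    is accepted with probability p(x) = (x + 1) / L (the gradient constraint)."""
    rng = random.Random(seed)
    lattice = [[False] * L for _ in range(L)]
    for _ in range(attempts_per_site * L * L):
        x, y = rng.randrange(L), rng.randrange(L)
        if not lattice[y][x] and rng.random() < (x + 1) / L:
            lattice[y][x] = True  # irreversible adsorption
    # coverage profile along the gradient axis
    return [sum(lattice[y][x] for y in range(L)) / L for x in range(L)]
```

The interfaces studied in the paper separate exactly these two regimes; monomer RSA keeps the toy simple, whereas dimers would add the excluded-volume effects responsible for the nontrivial jamming transition.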

  12. MOMENT-BASED METHOD FOR RANDOM EFFECTS SELECTION IN LINEAR MIXED MODELS

    PubMed Central

    Ahn, Mihye; Lu, Wenbin

    2012-01-01

    The selection of random effects in linear mixed models is an important yet challenging problem in practice. We propose a robust and unified framework for automatically selecting random effects and estimating covariance components in linear mixed models. A moment-based loss function is first constructed for estimating the covariance matrix of random effects. Two types of shrinkage penalties, a hard thresholding operator and a new sandwich-type soft-thresholding penalty, are then imposed for sparse estimation and random effects selection. Compared with existing approaches, the new procedure does not require any distributional assumption on the random effects and error terms. We establish the asymptotic properties of the resulting estimator in terms of its consistency in both random effects selection and variance component estimation. Optimization strategies are suggested to tackle the computational challenges involved in estimating the sparse variance-covariance matrix. Furthermore, we extend the procedure to incorporate the selection of fixed effects as well. Numerical results show promising performance of the new approach in selecting both random and fixed effects and, consequently, improving the efficiency of estimating model parameters. Finally, we apply the approach to a data set from the Amsterdam Growth and Health study. PMID:23105913

  13. Proposal of the Methodology for Analysing the Structural Relationship in the System of Random Process Using the Data Mining Methods

    NASA Astrophysics Data System (ADS)

    Michaľčonok, German; Kalinová, Michaela Horalová; Németh, Martin

    2014-12-01

    The aim of this paper is to present the possibilities of applying data mining techniques to the problem of analysing structural relationships in a system of stationary random processes. We introduce the area of random processes, present the process of structural analysis, and select suitable data mining methods applicable to structural analysis. We then propose a methodology for structural analysis in a system of stationary stochastic processes that uses data mining methods with an active experimental approach, built on this theoretical basis.

  14. Beyond the Randomized Controlled Trial: A Review of Alternatives in mHealth Clinical Trial Methods

    PubMed Central

    Wiljer, David; Cafazzo, Joseph A

    2016-01-01

    Background Randomized controlled trials (RCTs) have long been considered the primary research study design capable of eliciting causal relationships between health interventions and consequent outcomes. However, with a prolonged duration from recruitment to publication, high-cost trial implementation, and a rigid trial protocol, RCTs are perceived as an impractical evaluation methodology for most mHealth apps. Objective Given the recent development of alternative evaluation methodologies and tools to automate mHealth research, we sought to determine the breadth of these methods and the extent that they were being used in clinical trials. Methods We conducted a review of the ClinicalTrials.gov registry to identify and examine current clinical trials involving mHealth apps and retrieved relevant trials registered between November 2014 and November 2015. Results Of the 137 trials identified, 71 were found to meet inclusion criteria. The majority used a randomized controlled trial design (80%, 57/71). Study designs included 36 two-group pretest-posttest control group comparisons (51%, 36/71), 16 posttest-only control group comparisons (23%, 16/71), 7 one-group pretest-posttest designs (10%, 7/71), 2 one-shot case study designs (3%, 2/71), and 2 static-group comparisons (3%, 2/71). A total of 17 trials included a qualitative component to their methodology (24%, 17/71). Complete trial data collection required 20 months on average to complete (mean 21, SD 12). For trials with a total duration of 2 years or more (31%, 22/71), the average time from recruitment to complete data collection (mean 35 months, SD 10) was 2 years longer than the average time required to collect primary data (mean 11, SD 8). Trials had a moderate sample size of 112 participants. Two trials were conducted online (3%, 2/71) and 7 trials collected data continuously (10%, 7/68). Onsite study implementation was heavily favored (97%, 69/71). Trials with four data collection points had a longer study

  15. A modified hybrid uncertain analysis method for dynamic response field of the LSOAAC with random and interval parameters

    NASA Astrophysics Data System (ADS)

    Zi, Bin; Zhou, Bin

    2016-07-01

    For the prediction of the dynamic response field of the luffing system of an automobile crane (LSOAAC) with random and interval parameters, a hybrid uncertain model is introduced. In the hybrid uncertain model, parameters with a known probability distribution are modeled as random variables, whereas parameters with only lower and upper bounds are modeled as interval variables rather than given precise values. Based on the hybrid uncertain model, the hybrid uncertain dynamic response equilibrium equation, in which random and interval parameters appear simultaneously in both input and output terms, is constructed. A modified hybrid uncertain analysis method (MHUAM) is then proposed. In the MHUAM, the dynamic response expression of the LSOAAC is developed based on the random interval perturbation method, the first-order Taylor series expansion, and the first-order Neumann series. Moreover, the extrema of the bounds of the dynamic response are determined by the random interval moment method and a monotonic analysis technique. Compared with the hybrid Monte Carlo method (HMCM) and the interval perturbation method (IPM), numerical results show the feasibility and efficiency of the MHUAM for solving hybrid LSOAAC problems. The effects of different uncertain models and parameters on the LSOAAC response field are also investigated in depth. Numerical results indicate that the impact of randomness in the thrust of the luffing cylinder F is larger than that of the gravity of the weight in suspension Q, and that the impact of uncertainty in the displacement between the lower end of the lifting arm and the luffing cylinder a is larger than that of the length of the lifting arm L.
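    The hybrid Monte Carlo baseline (HMCM) against which the MHUAM is compared can be sketched in a few lines: sample the random parameters from their distributions while sweeping each interval parameter over its endpoints, which is justified when the response is monotonic in that parameter, as the paper's monotonic analysis technique assumes. The response function and all constants below are hypothetical stand-ins, not the crane model from the paper.

    ```python
    import random

    def response(F, a):
        # toy monotonic response surrogate (hypothetical; stands in for the
        # crane's dynamic response, which the paper expands via Taylor and
        # Neumann series)
        return 0.8 * F / a

    def hybrid_mc_bounds(n=10000, seed=1):
        """Bounds on the mean response: Monte Carlo over the random parameter,
        endpoint sweep over the interval parameter."""
        rng = random.Random(seed)
        a_lo, a_hi = 1.9, 2.1          # interval parameter: bounds only, no distribution
        lo_sum = hi_sum = 0.0
        for _ in range(n):
            F = rng.gauss(100.0, 5.0)  # random parameter: known distribution
            vals = (response(F, a_lo), response(F, a_hi))
            lo_sum += min(vals)
            hi_sum += max(vals)
        return lo_sum / n, hi_sum / n
    ```

    The endpoint sweep is cheap here because only one interval parameter is present; with several, every corner of the interval box must be visited, which is the cost the MHUAM's perturbation expansion avoids.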

  16. Anterior cruciate ligament reconstruction. A prospective randomized study of three surgical methods.

    PubMed

    Anderson, A F; Snyder, R B; Lipscomb, A B

    2001-01-01

    A prospective randomized study was performed to determine the differences in results between three methods of anterior cruciate ligament reconstruction: autogenous bone-patellar tendon-bone graft (group 1), semitendinosus and gracilis tendon graft reconstruction combined with an extraarticular procedure (group 2), and semitendinosus and gracilis tendon graft reconstruction alone (group 3). Preoperatively, there were no significant differences between groups. At a mean of 35.4 +/- 11.6 months postoperatively, 102 patients returned for evaluation. International Knee Documentation Committee knee evaluation revealed no significant differences in symptoms, function, return to pre-injury activity, harvest site abnormalities, or limitation of motion between groups 1 and 3. Patients in group 2 had a higher incidence of patellofemoral crepitation and loss of motion than did patients in group 3. The mean manual maximum KT-1000 arthrometer side-to-side difference was 2.1 +/- 2.0 mm in group 1, which was statistically significantly better than the difference in group 3 (3.1 +/- 2.3 mm). Final knee rating showed that 34 of 35 patients in group 1, 23 of 34 patients in group 2, and 26 of 33 patients in group 3 had a normal or nearly normal overall knee rating. Anterior cruciate ligament reconstruction with a semitendinosus and gracilis or a patellar tendon autograft may yield similar subjective results; however, the patellar tendon autograft may provide better objective stability in the long term. In addition, there appears to be no benefit to combining an intraarticular anterior cruciate ligament reconstruction with an extraarticular procedure.

  17. Three randomized trials of maternal influenza immunization in Mali, Nepal, and South Africa: Methods and expectations.

    PubMed

    Omer, Saad B; Richards, Jennifer L; Madhi, Shabir A; Tapia, Milagritos D; Steinhoff, Mark C; Aqil, Anushka R; Wairagkar, Niteen

    2015-07-31

    Influenza infection in pregnancy can have adverse impacts on maternal, fetal, and infant outcomes. Influenza vaccination in pregnancy is an appealing strategy to protect pregnant women and their infants. The Bill & Melinda Gates Foundation is supporting three large, randomized trials in Nepal, Mali, and South Africa evaluating the efficacy and safety of maternal immunization to prevent influenza disease in pregnant women and their infants <6 months of age. Results from these individual studies are expected in 2014 and 2015. While the results from the three maternal immunization trials are likely to strengthen the evidence base regarding the impact of influenza immunization in pregnancy, expectations for these results should be realistic. For example, evidence from previous influenza vaccine studies - conducted in general, non-pregnant populations - suggests substantial geographic and year-to-year variability in influenza incidence and vaccine efficacy/effectiveness. Since the evidence generated from the three maternal influenza immunization trials will be complementary, in this paper we present a side-by-side description of the three studies as well as the similarities and differences between these trials in terms of study location, design, outcome evaluation, and laboratory and epidemiological methods. We also describe the likely remaining knowledge gap after the results from these trials become available along with a description of the analyses that will be conducted when the results from these individual data are pooled. Moreover, we highlight that additional research on logistics of seasonal influenza vaccine supply, surveillance and strain matching, and optimal delivery strategies for pregnant women will be important for informing global policy related to maternal influenza immunization.

  18. Wave reflection from randomly inhomogeneous ionospheric layer: 1. The method of describing the wavefield in a reflecting layer with random irregularities

    NASA Astrophysics Data System (ADS)

    Tinin, Mikhail

    2016-08-01

    It has been previously proposed to describe wave propagation in inhomogeneous media in a small-angle approximation with the aid of a double weighted Fourier transform (DWFT) method. This method agrees with the methods of geometrical optics, smooth perturbations, and phase screen in the domains of their applicability; it can therefore be employed to solve direct and inverse problems of radio wave propagation in multiscale inhomogeneous ionospheric plasma. In this paper, to obtain a wide-angle generalization of the DWFT, the wave equation is first reduced to a parabolic equation using the Fock proper-time method; the parabolic equation is then solved by the DWFT method. The resulting solution is analyzed for the case of wave reflection and scattering by a layer with random irregularities and a linear profile of average permittivity. We show that this solution reduces to known exact results in the absence of irregularities and, for weak phase fluctuations, in the single-scatter approximation, including backscattering. Under certain conditions, the solution takes the form of the small-angle DWFT with respect to refraction in the layer and backscatter effects. Spatial processing in source and observer coordinates collapses a beam of received waves into a single wave without amplitude fluctuations, which allows an increase in the resolution of vertical ionospheric sounding systems.

  19. Novel image fusion method based on adaptive pulse coupled neural network and discrete multi-parameter fractional random transform

    NASA Astrophysics Data System (ADS)

    Lang, Jun; Hao, Zhengchao

    2014-01-01

    In this paper, we first propose the discrete multi-parameter fractional random transform (DMPFRNT), which distributes the spectrum randomly and uniformly. We then introduce this new spectral transform into the image fusion field and present a new approach to remote sensing image fusion that combines an adaptive pulse coupled neural network (PCNN) with the discrete multi-parameter fractional random transform, in order to meet the requirements of both high spatial resolution and low spectral distortion. In the proposed scheme, the multi-spectral (MS) and panchromatic (Pan) images are each converted into the discrete multi-parameter fractional random transform domain. In the DMPFRNT spectral domain, the high amplitude spectrum (HAS) and low amplitude spectrum (LAS) components carry different information from the original images. We take full advantage of the synchronous pulse issuance characteristics of the PCNN to extract the HAS and LAS components properly, yielding PCNN ignition mapping images that are used to determine the fusion parameters. In the fusion process, the local standard deviation of the amplitude spectrum is chosen as the link strength of the pulse coupled neural network. Numerical simulations demonstrate that the proposed method is more reliable than, and superior to, several existing methods based on Hue Saturation Intensity representation, Principal Component Analysis, the discrete fractional random transform, etc.

  20. A Mixed-Methods Randomized Controlled Trial of Financial Incentives and Peer Networks to Promote Walking among Older Adults

    ERIC Educational Resources Information Center

    Kullgren, Jeffrey T.; Harkins, Kristin A.; Bellamy, Scarlett L.; Gonzales, Amy; Tao, Yuanyuan; Zhu, Jingsan; Volpp, Kevin G.; Asch, David A.; Heisler, Michele; Karlawish, Jason

    2014-01-01

    Background: Financial incentives and peer networks could be delivered through eHealth technologies to encourage older adults to walk more. Methods: We conducted a 24-week randomized trial in which 92 older adults with a computer and Internet access received a pedometer, daily walking goals, and weekly feedback on goal achievement. Participants…

  1. Single particle electron microscopy reconstruction of the exosome complex using the random conical tilt method.

    PubMed

    Liu, Xueqi; Wang, Hong-Wei

    2011-03-28

    of each single particle. There are several methods to assign the view for each particle, including angular reconstitution(1) and the random conical tilt (RCT) method(2). In this protocol, we describe our procedure for obtaining a 3D reconstruction of the yeast exosome complex using negative-stain EM and RCT. It should be noted that our protocol for electron microscopy and image processing follows the basic principle of RCT but is not the only way to perform the method. We first describe how to embed the protein sample in a layer of uranyl formate with a thickness comparable to the protein size, using a holey carbon grid covered with a layer of continuous thin carbon film. The specimen is then inserted into a transmission electron microscope to collect untilted (0-degree) and tilted (55-degree) pairs of micrographs that are later used for processing and obtaining an initial 3D model of the yeast exosome. To this end, we perform RCT and then refine the initial 3D model using the projection matching refinement method(3).

  2. A dose optimization method for electron radiotherapy using randomized aperture beams.

    PubMed

    Engel, Konrad; Gauer, Tobias

    2009-09-01

    The present paper describes the entire optimization process of creating a radiotherapy treatment plan for advanced electron irradiation. Special emphasis is devoted to the selection of beam incidence angles and beam energies, as well as to the choice of appropriate subfields generated by a refined version of intensity segmentation and a novel random aperture approach. The algorithms have been implemented in a stand-alone programme using dose calculations from a commercial treatment planning system. For this study, the treatment planning system Pinnacle from Philips was used and connected to the optimization programme via an ASCII interface. Dose calculations in Pinnacle were performed by Monte Carlo simulations for a remote-controlled electron multileaf collimator (MLC) from Euromechanics. As a result, treatment plans for breast cancer patients could be significantly improved when using randomly generated aperture beams. The combination of beams generated through segmentation and randomization achieved the best results in terms of target coverage and sparing of critical organs. The treatment plans could be further improved by use of a field reduction algorithm. Without a relevant loss in dose distribution, the total number of MLC fields and monitor units could be reduced by up to 20%. In conclusion, using randomized aperture beams is a promising new approach in radiotherapy and exhibits potential for further improvements in dose optimization through a combination of randomized electron and photon aperture beams.
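    The "random aperture" ingredient, drawing candidate MLC shapes at random rather than deriving them all from intensity segmentation, can be sketched as follows. The leaf-pair geometry and counts here are hypothetical; in the paper, candidate apertures are scored via Monte Carlo dose calculations in Pinnacle, which this sketch does not attempt.

    ```python
    import random

    def random_aperture(n_leaf_pairs=10, n_positions=20, rng=None):
        # one candidate MLC aperture: each leaf pair opens over a random
        # contiguous span of positions (hypothetical 10-pair, 20-position geometry)
        rng = rng or random.Random()
        aperture = []
        for _ in range(n_leaf_pairs):
            left = rng.randrange(n_positions)
            right = rng.randrange(left, n_positions)  # right edge >= left edge
            aperture.append((left, right))
        return aperture

    def sample_apertures(n=50, seed=0):
        # a pool of random candidate shapes for the optimizer to choose among,
        # alongside the segmentation-derived subfields
        rng = random.Random(seed)
        return [random_aperture(rng=rng) for _ in range(n)]
    ```

    In the paper's workflow each such candidate would be dose-scored and the optimizer would keep only the subfields that improve target coverage and organ sparing.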

  3. Are women with psychosis receiving adequate cervical cancer screening?

    PubMed Central

    Tilbrook, Devon; Polsky, Jane; Lofters, Aisha

    2010-01-01

    ABSTRACT OBJECTIVE To investigate the rates of cervical cancer screening among female patients with psychosis compared with similar patients without psychosis, as an indicator of the quality of primary preventive health care. DESIGN A retrospective cohort study using medical records between November 1, 2004, and November 1, 2007. SETTING Two urban family medicine clinics associated with an academic hospital in Toronto, Ont. PARTICIPANTS A random sample of female patients with and without psychosis between the ages of 20 and 69 years. MAIN OUTCOME MEASURES Number of Papanicolaou tests in a 3-year period. RESULTS Charts for 51 female patients with psychosis and 118 female patients without psychosis were reviewed. Of those women with psychosis, 62.7% were diagnosed with schizophrenia, 19.6% with bipolar disorder, 17.6% with schizoaffective disorder, and 29.4% with other psychotic disorders. Women in both groups were similar in age, rate of comorbidities, and number of full physical examinations. Women with psychosis were significantly more likely to smoke (P < .0001), to have more primary care appointments (P = .035), and to miss appointments (P = .0002) than women without psychosis. After adjustment for age, other psychiatric illnesses, number of physical examinations, number of missed appointments, and having a gynecologist, women with psychosis were significantly less likely to have had a Pap test in the previous 3 years compared with women without psychosis (47.1% vs 73.7%, respectively; odds ratio 0.19, 95% confidence interval 0.06 to 0.58). CONCLUSION Women with psychosis are more than 5 times less likely to receive adequate Pap screening compared with the general population despite their increased rates of smoking and increased number of primary care visits. PMID:20393098

  4. The analysis of a sparse grid stochastic collocation method for partial differential equations with high-dimensional random input data.

    SciTech Connect

    Webster, Clayton; Tempone, Raul; Nobile, Fabio

    2007-12-01

    This work describes the convergence analysis of a Smolyak-type sparse grid stochastic collocation method for the approximation of statistical quantities related to the solution of partial differential equations with random coefficients and forcing terms (input data of the model). To compute solution statistics, the sparse grid stochastic collocation method uses approximate solutions, produced here by finite elements, corresponding to a deterministic set of points in the random input space. This naturally requires solving uncoupled deterministic problems and, as such, the derived strong error estimates for the fully discrete solution are used to compare the computational efficiency of the proposed method with the Monte Carlo method. Numerical examples illustrate the theoretical results and are used to compare this approach with several others, including the standard Monte Carlo.
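    The key idea, evaluating deterministic solves at a small structured set of points in the random input space rather than at random samples, can be illustrated in one dimension. A minimal sketch, assuming a standard normal input and a smooth scalar response (the toy function below stands in for the PDE solution functional): a 3-node Gauss-Hermite collocation rule recovers the mean exactly for low-degree polynomial responses, where Monte Carlo needs many samples.

    ```python
    import math
    import random

    def collocation_mean(u):
        # 3-point Gauss-Hermite rule for a standard normal input (probabilists'
        # form): nodes -sqrt(3), 0, +sqrt(3) with weights 1/6, 2/3, 1/6 --
        # exact for polynomial responses up to degree 5
        nodes = (-math.sqrt(3.0), 0.0, math.sqrt(3.0))
        weights = (1 / 6, 2 / 3, 1 / 6)
        return sum(w * u(y) for w, y in zip(weights, nodes))

    def monte_carlo_mean(u, n=100000, seed=7):
        # baseline: plain Monte Carlo over the same random input
        rng = random.Random(seed)
        return sum(u(rng.gauss(0.0, 1.0)) for _ in range(n)) / n

    # toy "solution functional" of the random input; exact mean is 1.5
    u = lambda y: 1.0 + y + 0.5 * y * y
    ```

    Three deterministic solves versus a hundred thousand random ones is the efficiency gap the paper quantifies; Smolyak sparse grids extend the same quadrature idea to many random dimensions without the full tensor-product cost.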

  5. Acupuncture as a treatment for functional dyspepsia: design and methods of a randomized controlled trial

    PubMed Central

    Zheng, Hui; Tian, Xiao-ping; Li, Ying; Liang, Fan-rong; Yu, Shu-guang; Liu, Xu-guang; Tang, Yong; Yang, Xu-guang; Yan, Jie; Sun, Guo-jie; Chang, Xiao-rong; Zhang, Hong-xing; Ma, Ting-ting; Yu, Shu-yuan

    2009-01-01

    Background Acupuncture is widely used in China to treat functional dyspepsia (FD). However, its effectiveness in the treatment of FD, and whether FD-specific acupoints exist, are controversial. This study therefore aims to determine whether acupuncture is an effective treatment for FD and whether acupoint specificity exists according to traditional acupuncture meridian and acupoint theories. Design This multicenter randomized controlled trial will include four acupoint treatment groups, one non-acupoint control group, and one drug (positive control) group. The four acupoint treatment groups will focus on: (1) specific acupoints of the stomach meridian; (2) non-specific acupoints of the stomach meridian; (3) specific acupoints of alarm and transport points; and (4) acupoints of the gallbladder meridian. These four groups of acupoints are thought to differ in clinical efficacy, according to traditional acupuncture meridian and acupoint theories. A total of 120 FD patients will be included in each group. Each patient will receive 20 sessions of acupuncture treatment over 4 weeks. The trial will be conducted in eight hospitals located in three centers of China. The primary outcomes in this trial will be differences in Nepean Dyspepsia Index scores and in the Symptom Index of Dyspepsia before randomization, 2 weeks and 4 weeks after randomization, and 1 month and 3 months after completing treatment. Discussion The important features of this trial include the randomization procedures (controlled by a central randomization system), a standardized protocol of acupuncture manipulation, and the fact that this is the first multicenter randomized trial of acupuncture for FD to be performed in China. The results of this trial will determine whether acupuncture is an effective treatment for FD and whether using different acupoints or different meridians leads to differences in clinical efficacy. Trial registration number ClinicalTrials.gov identifier: NCT00599677
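    Central randomization systems for multi-arm trials like this one commonly use permuted-block allocation to keep the arms balanced as patients accrue. The sketch below is a generic illustration of that idea, not the trial's actual system; the group labels are shorthand invented here for the six arms described in the abstract.

    ```python
    import random

    GROUPS = ["stomach-specific", "stomach-nonspecific", "alarm/transport",
              "gallbladder", "non-acupoint", "drug-control"]

    def block_randomize(n_per_group, block_size=6, seed=42):
        # permuted-block allocation: every block contains each arm equally
        # often, so group sizes never drift apart by more than one block
        assert block_size % len(GROUPS) == 0
        rng = random.Random(seed)
        sequence = []
        while len(sequence) < n_per_group * len(GROUPS):
            block = GROUPS * (block_size // len(GROUPS))
            rng.shuffle(block)
            sequence.extend(block)
        return sequence
    ```

    With `n_per_group=120` this yields a 720-entry allocation list in which any six consecutive assignments cover all six arms, a property a central server can exploit to conceal upcoming assignments while guaranteeing balance.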

  6. Active video games as a tool to prevent excessive weight gain in adolescents: rationale, design and methods of a randomized controlled trial

    PubMed Central

    2014-01-01

    Background Excessive body weight, low physical activity, and excessive sedentary time in youth are major public health concerns. A new generation of video games that require physical activity to play (i.e., active games) may be a promising alternative to traditional non-active games for promoting physical activity and reducing sedentary behaviors in youth. The aim of this manuscript is to describe the design of a study evaluating the effects of a family-oriented active game intervention, incorporating several motivational elements, on anthropometrics and health behaviors in adolescents. Methods/Design The study is a randomized controlled trial (RCT) in which non-active gaming adolescents aged 12-16 years are randomly allocated to a ten-month intervention (receiving active games, as well as encouragement to play) or a waiting-list control group (receiving active games after the intervention period). Primary outcomes are adolescents' measured BMI-SDS (SDS = standard deviation score), waist circumference-SDS, hip circumference, and sum of skinfolds. Secondary outcomes are adolescents' self-reported time spent playing active and non-active games, other sedentary activities, and consumption of sugar-sweetened beverages. In addition, a process evaluation is conducted, assessing the sustainability of the active games, enjoyment, perceived competence, perceived barriers to active game play, game context, injuries from active game play, activity replacement, and intention to continue playing the active games. Discussion This is the first adequately powered RCT including normal-weight adolescents and evaluating a reasonably long period of provision of and exposure to active games. Further strengths are the incorporation of motivational elements for active game play and a comprehensive process evaluation. This trial will provide evidence regarding the potential contribution of active games in prevention of excessive weight gain in

  7. Oxide Defect Engineering Methods for Valence Change (VCM) Resistive Random Access Memories

    NASA Astrophysics Data System (ADS)

    Capulong, Jihan O.

    Electrical switching requirements for resistive random access memory (ReRAM) devices are multifaceted, based on device application. Thus, it is important to obtain an understanding of these switching properties and how they relate to the oxygen vacancy concentration and oxygen vacancy defects. Oxygen vacancy defects in the switching oxide of valence-change-based ReRAM (VCM ReRAM) play a significant role in device switching properties. Oxygen vacancies facilitate resistive switching as they form the conductive filament that changes the resistance state of the device. This dissertation will present two methods of modulating the defect concentration in VCM ReRAM composed of a Pt/HfOx/Ti stack: 1) rapid thermal annealing (RTA) in Ar at different temperatures, and 2) doping by ion implantation at different dose levels. Metrology techniques such as x-ray diffractometry (XRD), x-ray photoelectron spectroscopy (XPS), and photoluminescence (PL) spectroscopy were utilized to characterize the HfOx switching oxide, which provided insight into the material properties and oxygen vacancy concentration in the oxide that was used to explain the changes in the electrical properties of the ReRAM devices. The resulting impact on the resistive switching characteristics of the devices, such as the forming voltage, set and reset threshold voltages, ON and OFF resistances, resistance ratio, and switching dispersion or uniformity, was explored and summarized. Annealing in Ar showed significant impact on the forming voltage, with as much as a 45% improvement (from 22 V to 12 V) as the annealing temperature was increased. However, drawbacks of higher oxide leakage and worse switching uniformity were seen with increasing annealing temperature. Meanwhile, doping the oxide by ion implantation showed significant effects on the resistive switching characteristics.
Ta doping modulated the following switching properties with increasing dose: a) the reduction of the forming voltage, and Vset

  8. Family nurture intervention (FNI): methods and treatment protocol of a randomized controlled trial in the NICU

    PubMed Central

    2012-01-01

    Background The stress that results from preterm birth, requisite acute care and prolonged physical separation in the Neonatal Intensive Care Unit (NICU) can have adverse physiological/psychological effects on both the infant and the mother. In particular, the experience compromises the establishment and maintenance of optimal mother-infant relationship, the subsequent development of the infant, and the mother's emotional well-being. These findings highlight the importance of investigating early interventions that are designed to overcome or reduce the effects of these environmental insults and challenges. Methods This study is a randomized controlled trial (RCT) with blinded assessment comparing Standard Care (SC) with a novel Family Nurture Intervention (FNI). FNI targets preterm infants born 26-34 weeks postmenstrual age (PMA) and their mothers in the NICU. The intervention incorporates elements of mother-infant interventions with known efficacy and organizes them under a new theoretical context referred to collectively as calming activities. This intervention is facilitated by specially trained Nurture Specialists in three ways: 1) In the isolette through calming interactions between mother and infant via odor exchange, firm sustained touch and vocal soothing, and eye contact; 2) Outside the isolette during holding and feeding via the Calming Cycle; and 3) through family sessions designed to engage help and support the mother. In concert with infant neurobehavioral and physiological assessments from birth through 24 months corrected age (CA), maternal assessments are made using standard tools including anxiety, depression, attachment, support systems, temperament as well as physiological stress parameters. Quality of mother-infant interaction is also assessed. Our projected enrolment is 260 families (130 per group). 
Discussion The FNI is designed to increase biologically important activities and behaviors that enhance maternally-mediated sensory experiences of

  9. Variability in DNA polymerase efficiency: effects of random error, DNA extraction method, and isolate type

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Using computer-generated data calculated with known amounts of random error (E = 1, 5 & 10%) associated with the calculated qPCR cycle number (Cq) at the jth of four 1:10 dilutions, we found that the "efficiency" (eff) associated with each population distribution of n = 10,000 measurements varied from 0.95 to ...
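    The simulation idea, injecting known random error into dilution-series Cq values and observing how the efficiency estimate spreads, can be reproduced in miniature. This sketch assumes the usual standard-curve definition eff = 10^(-1/slope) - 1, with the slope fit by least squares across four 1:10 dilutions; all constants are illustrative, not the study's.

    ```python
    import math
    import random

    def simulate_efficiency(true_eff=1.0, noise_sd=0.05, seed=3):
        # Cq values for four 1:10 dilutions with Gaussian error added; the
        # slope of Cq versus log10(template amount) yields the estimated
        # efficiency via eff = 10**(-1/slope) - 1
        rng = random.Random(seed)
        xs = [0.0, -1.0, -2.0, -3.0]  # log10 relative template amount
        slope_true = -1.0 / math.log10(1.0 + true_eff)
        ys = [20.0 + slope_true * x + rng.gauss(0.0, noise_sd) for x in xs]
        n = len(xs)
        xbar, ybar = sum(xs) / n, sum(ys) / n
        slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
                 / sum((x - xbar) ** 2 for x in xs))
        return 10.0 ** (-1.0 / slope) - 1.0
    ```

    Repeating the call over many seeds and several `noise_sd` levels gives the population distribution of efficiency estimates whose spread the abstract reports.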

  10. Mixing Methods in Randomized Controlled Trials (RCTs): Validation, Contextualization, Triangulation, and Control

    ERIC Educational Resources Information Center

    Spillane, James P.; Pareja, Amber Stitziel; Dorner, Lisa; Barnes, Carol; May, Henry; Huff, Jason; Camburn, Eric

    2010-01-01

    In this paper we describe how we mixed research approaches in a Randomized Controlled Trial (RCT) of a school principal professional development program. Using examples from our study, we illustrate how combining qualitative and quantitative data can address key challenges, from validating instruments and measures of mediator variables to…

  11. Methods of Learning in Statistical Education: A Randomized Trial of Public Health Graduate Students

    ERIC Educational Resources Information Center

    Enders, Felicity Boyd; Diener-West, Marie

    2006-01-01

    A randomized trial of 265 consenting students was conducted within an introductory biostatistics course: 69 received eight small group cooperative learning sessions; 97 accessed internet learning sessions; 96 received no intervention. Effect on examination score (95% CI) was assessed by intent-to-treat analysis and by incorporating reported…

  12. Increasing the Degrees of Freedom in Future Group Randomized Trials: The "df*" Method Revisited

    ERIC Educational Resources Information Center

    Murray, David M.; Blitstein, Jonathan L.; Hannan, Peter J.; Shadish, William R.

    2012-01-01

    Background: This article revisits an article published in Evaluation Review in 2005 on sample size estimation and power analysis for group-randomized trials. With help from a careful reader, we learned of an important error in the spreadsheet used to perform the calculations and generate the results presented in that article. As we studied the…

  13. 40 CFR 51.354 - Adequate tools and resources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 2 2013-07-01 2013-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...

  14. 40 CFR 51.354 - Adequate tools and resources.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 2 2014-07-01 2014-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...

  15. 40 CFR 51.354 - Adequate tools and resources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 2 2012-07-01 2012-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...

  16. 10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining...

  17. 13 CFR 108.200 - Adequate capital for NMVC Companies.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... VENTURE CAPITAL ("NMVC") PROGRAM Qualifications for the NMVC Program Capitalizing A Nmvc Company § 108.200 Adequate capital for NMVC Companies. You must meet the requirements of §§ 108.200-108.230 in order to... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Adequate capital for...

  18. 34 CFR 200.20 - Making adequate yearly progress.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 34 Education 1 2012-07-01 2012-07-01 false Making adequate yearly progress. 200.20 Section 200.20... Basic Programs Operated by Local Educational Agencies Adequate Yearly Progress (ayp) § 200.20 Making... State data system; (vi) Include, as separate factors in determining whether schools are making AYP for...

  19. 34 CFR 200.20 - Making adequate yearly progress.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 34 Education 1 2013-07-01 2013-07-01 false Making adequate yearly progress. 200.20 Section 200.20... Basic Programs Operated by Local Educational Agencies Adequate Yearly Progress (ayp) § 200.20 Making... State data system; (vi) Include, as separate factors in determining whether schools are making AYP for...

  20. 34 CFR 200.20 - Making adequate yearly progress.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 1 2010-07-01 2010-07-01 false Making adequate yearly progress. 200.20 Section 200.20... Basic Programs Operated by Local Educational Agencies Adequate Yearly Progress (ayp) § 200.20 Making... State data system; (vi) Include, as separate factors in determining whether schools are making AYP for...

  1. 34 CFR 200.20 - Making adequate yearly progress.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 34 Education 1 2014-07-01 2014-07-01 false Making adequate yearly progress. 200.20 Section 200.20... Basic Programs Operated by Local Educational Agencies Adequate Yearly Progress (ayp) § 200.20 Making... State data system; (vi) Include, as separate factors in determining whether schools are making AYP for...

  2. 34 CFR 200.20 - Making adequate yearly progress.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 34 Education 1 2011-07-01 2011-07-01 false Making adequate yearly progress. 200.20 Section 200.20... Basic Programs Operated by Local Educational Agencies Adequate Yearly Progress (ayp) § 200.20 Making... State data system; (vi) Include, as separate factors in determining whether schools are making AYP for...

  3. 40 CFR 716.25 - Adequate file search.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 31 2011-07-01 2011-07-01 false Adequate file search. 716.25 Section... ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of a person's responsibility to search records is limited to records in the location(s) where the...

  4. 40 CFR 716.25 - Adequate file search.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 32 2013-07-01 2013-07-01 false Adequate file search. 716.25 Section... ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of a person's responsibility to search records is limited to records in the location(s) where the...

  5. 40 CFR 716.25 - Adequate file search.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 31 2014-07-01 2014-07-01 false Adequate file search. 716.25 Section... ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of a person's responsibility to search records is limited to records in the location(s) where the...

  6. 40 CFR 716.25 - Adequate file search.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 32 2012-07-01 2012-07-01 false Adequate file search. 716.25 Section... ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of a person's responsibility to search records is limited to records in the location(s) where the...

  7. 40 CFR 716.25 - Adequate file search.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Adequate file search. 716.25 Section... ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of a person's responsibility to search records is limited to records in the location(s) where the...

  8. 9 CFR 305.3 - Sanitation and adequate facilities.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Sanitation and adequate facilities. 305.3 Section 305.3 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... OF VIOLATION § 305.3 Sanitation and adequate facilities. Inspection shall not be inaugurated if...

  9. 9 CFR 305.3 - Sanitation and adequate facilities.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Sanitation and adequate facilities. 305.3 Section 305.3 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... OF VIOLATION § 305.3 Sanitation and adequate facilities. Inspection shall not be inaugurated if...

  10. 40 CFR 51.354 - Adequate tools and resources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 2 2011-07-01 2011-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...

  11. 40 CFR 51.354 - Adequate tools and resources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 2 2010-07-01 2010-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...

  12. 10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining...

  13. 10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 4 2014-01-01 2014-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining...

  14. 10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 4 2013-01-01 2013-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining...

  15. 10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 4 2012-01-01 2012-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining...

  16. 13 CFR 107.200 - Adequate capital for Licensees.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Adequate capital for Licensees. 107.200 Section 107.200 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION SMALL BUSINESS INVESTMENT COMPANIES Qualifying for an SBIC License Capitalizing An Sbic § 107.200 Adequate capital...

  17. 21 CFR 201.5 - Drugs; adequate directions for use.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Drugs; adequate directions for use. 201.5 Section 201.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) DRUGS: GENERAL LABELING General Labeling Provisions § 201.5 Drugs; adequate directions for use....

  18. 21 CFR 201.5 - Drugs; adequate directions for use.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 4 2011-04-01 2011-04-01 false Drugs; adequate directions for use. 201.5 Section 201.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) DRUGS: GENERAL LABELING General Labeling Provisions § 201.5 Drugs; adequate directions for use....

  19. 7 CFR 4290.200 - Adequate capital for RBICs.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 15 2010-01-01 2010-01-01 false Adequate capital for RBICs. 4290.200 Section 4290.200 Agriculture Regulations of the Department of Agriculture (Continued) RURAL BUSINESS-COOPERATIVE SERVICE AND... Qualifications for the RBIC Program Capitalizing A Rbic § 4290.200 Adequate capital for RBICs. You must meet...

  20. "Something Adequate"? In Memoriam Seamus Heaney, Sister Quinlan, Nirbhaya

    ERIC Educational Resources Information Center

    Parker, Jan

    2014-01-01

    Seamus Heaney talked of poetry's responsibility to represent the "bloody miracle", the "terrible beauty" of atrocity; to create "something adequate". This article asks, what is adequate to the burning and eating of a nun and the murderous gang rape and evisceration of a medical student? It considers Njabulo…

  1. The random card sort method and respondent certainty in contingent valuation: an exploratory investigation of range bias.

    PubMed

    Shackley, Phil; Dixon, Simon

    2014-10-01

    Willingness to pay (WTP) values derived from contingent valuation surveys are prone to a number of biases. Range bias occurs when the range of money values presented to respondents in a payment card affects their stated WTP values. This paper reports the results of an exploratory study whose aim was to investigate whether the effects of range bias can be reduced through the use of an alternative to the standard payment card method, namely, a random card sort method. The results suggest that the random card sort method is prone to range bias but that this bias may be mitigated by restricting the analysis to the WTP values of those respondents who indicate they are 'definitely sure' they would pay their stated WTP.

  2. Are population pharmacokinetic and/or pharmacodynamic models adequately evaluated? A survey of the literature from 2002 to 2004

    PubMed Central

    Brendel, Karl; Dartois, Céline; Comets, Emmanuelle; Lemenuel-Diot, Annabelle; Laveille, Christian; Tranchand, Brigitte; Girard, Pascal; Laffont, Céline M.; Mentré, France

    2007-01-01

    Purpose Model evaluation is an important issue in population analyses. We aimed to perform a systematic review of all population PK and/or PD analyses published between 2002 and 2004 to survey the current methods used to evaluate a model and to assess whether those models were adequately evaluated. Methods We selected 324 papers in MEDLINE using defined keywords and built a data abstraction form (DAF) composed of a checklist of items to extract the relevant information from these articles with respect to model evaluation. In the DAF, evaluation methods were divided into 3 subsections: basic internal methods (goodness-of-fit plots [GOF], uncertainty in parameter estimates and model sensitivity), advanced internal methods (data splitting, resampling techniques and Monte Carlo simulations) and external model evaluation. Results Basic internal evaluation was the most frequently described method in the reports: 65% of the models involved GOF evaluation. Standard errors or confidence intervals were reported for 50% of fixed effects but only 22% of random effects. Advanced internal methods were used in approximately 25% of models: data splitting was more often used than bootstrap and cross-validation; simulations were used in 6% of models to evaluate models by visual predictive check or by posterior predictive check. External evaluation was performed in only 7% of models. Conclusions Using the subjective synthesis of model evaluation for each paper, we judged models to be adequately evaluated in 28% of PK models and 26% of PD models. Basic internal evaluation was preferred to more advanced methods, probably because the former are performed easily with most software. We also noticed that when the aim of modelling was predictive, advanced internal methods or more stringent methods were more often used. PMID:17328581

  3. Asymptotic-preserving methods for hyperbolic and transport equations with random inputs and diffusive scalings

    SciTech Connect

    Jin, Shi; Xiu, Dongbin; Zhu, Xueyu

    2015-05-15

    In this paper we develop a set of stochastic numerical schemes for hyperbolic and transport equations with diffusive scalings and subject to random inputs. The schemes are asymptotic preserving (AP), in the sense that they preserve the diffusive limits of the equations in discrete setting, without requiring excessive refinement of the discretization. Our stochastic AP schemes are extensions of the well-developed deterministic AP schemes. To handle the random inputs, we employ generalized polynomial chaos (gPC) expansion and combine it with stochastic Galerkin procedure. We apply the gPC Galerkin scheme to a set of representative hyperbolic and transport equations and establish the AP property in the stochastic setting. We then provide several numerical examples to illustrate the accuracy and effectiveness of the stochastic AP schemes.

  4. Method for removal of random noise in eddy-current testing system

    DOEpatents

    Levy, Arthur J.

    1995-01-01

    Eddy-current response voltages generated during inspection of metallic structures for anomalies are often replete with noise, making analysis of the inspection data difficult or nearly impossible and resulting in inconsistent or unreliable evaluation of the structure. This invention processes the eddy-current response voltage, removing the effect of random noise, to allow proper identification of anomalies within and associated with the structure.

  5. A Novel Compressed Sensing Method for Magnetic Resonance Imaging: Exponential Wavelet Iterative Shrinkage-Thresholding Algorithm with Random Shift.

    PubMed

    Zhang, Yudong; Yang, Jiquan; Yang, Jianfei; Liu, Aijun; Sun, Ping

    2016-01-01

    Aim. Accelerating magnetic resonance imaging (MRI) scanning can help improve hospital throughput, and patients benefit from shorter waiting times. Task. In the last decade, various rapid MRI techniques based on compressed sensing (CS) were proposed; however, neither the computation time nor the reconstruction quality of traditional CS-MRI met the requirements of clinical use. Method. In this study, a novel method named exponential wavelet iterative shrinkage-thresholding algorithm with random shift (abbreviated as EWISTARS) was proposed. It is composed of three components: (i) exponential wavelet transform, (ii) iterative shrinkage-thresholding algorithm, and (iii) random shift. Results. Experimental results validated that, compared to state-of-the-art approaches, EWISTARS obtained the least mean absolute error, the least mean-squared error, and the highest peak signal-to-noise ratio. Conclusion. EWISTARS is superior to state-of-the-art approaches. PMID:27066068

  6. A Novel Compressed Sensing Method for Magnetic Resonance Imaging: Exponential Wavelet Iterative Shrinkage-Thresholding Algorithm with Random Shift

    PubMed Central

    Zhang, Yudong; Yang, Jiquan; Yang, Jianfei; Liu, Aijun; Sun, Ping

    2016-01-01

    Aim. Accelerating magnetic resonance imaging (MRI) scanning can help improve hospital throughput, and patients benefit from shorter waiting times. Task. In the last decade, various rapid MRI techniques based on compressed sensing (CS) were proposed; however, neither the computation time nor the reconstruction quality of traditional CS-MRI met the requirements of clinical use. Method. In this study, a novel method named exponential wavelet iterative shrinkage-thresholding algorithm with random shift (abbreviated as EWISTARS) was proposed. It is composed of three components: (i) exponential wavelet transform, (ii) iterative shrinkage-thresholding algorithm, and (iii) random shift. Results. Experimental results validated that, compared to state-of-the-art approaches, EWISTARS obtained the least mean absolute error, the least mean-squared error, and the highest peak signal-to-noise ratio. Conclusion. EWISTARS is superior to state-of-the-art approaches. PMID:27066068
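The shrinkage-thresholding core of algorithms like EWISTARS can be illustrated in isolation. The sketch below is not the authors' implementation: it applies plain ISTA to a generic sparse-recovery problem with a random sensing matrix, omitting the exponential wavelet transform and random shift entirely; all names and parameter values are illustrative.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of the L1 norm: shrink each entry toward zero by t.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, y, lam, n_iter=300):
    # Minimise 0.5*||Ax - y||^2 + lam*||x||_1 by a gradient step on the
    # quadratic term followed by soft-thresholding (the shrinkage step).
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)
        x = soft_threshold(x - grad / L, lam / L)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100)) / np.sqrt(40)   # random sensing matrix
x_true = np.zeros(100)
x_true[[5, 40, 77]] = [1.5, -2.0, 1.0]             # sparse ground truth
y = A @ x_true                                     # compressed measurements
x_hat = ista(A, y, lam=0.01)
```

With only 40 measurements of a 100-dimensional 3-sparse signal, the iteration recovers the support and approximate magnitudes of the nonzero coefficients.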

  7. Perception of spatiotemporal random fractals: an extension of colorimetric methods to the study of dynamic texture.

    PubMed

    Billock, V A; Cunningham, D W; Havig, P R; Tsou, B H

    2001-10-01

    Recent work establishes that static and dynamic natural images have fractal-like 1/f^α spatiotemporal spectra. Artificial textures with randomized phase spectra and 1/f^α amplitude spectra are also used in studies of texture and noise perception. Influenced by colorimetric principles and motivated by the ubiquity of 1/f^α spatial and temporal image spectra, we treat the spatial and temporal frequency exponents as the dimensions characterizing a dynamic texture space, and we characterize two key attributes of this space: the spatiotemporal appearance map and the spatiotemporal discrimination function (a map of MacAdam-like just-noticeable-difference contours).
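A texture with a 1/f^α amplitude spectrum and randomized phases, of the kind described above, is commonly synthesized by shaping the Fourier spectrum of noise. The following is a minimal sketch of that standard construction, not the authors' stimulus-generation code; note that taking the real part of the inverse FFT only approximately preserves the target amplitude spectrum.

```python
import numpy as np

def fractal_texture(n, alpha, seed=0):
    """Random n-by-n texture with a ~1/f^alpha amplitude spectrum and
    uniformly random phases (a 'pink-noise' image when alpha = 1)."""
    rng = np.random.default_rng(seed)
    fy = np.fft.fftfreq(n)[:, None]
    fx = np.fft.fftfreq(n)[None, :]
    f = np.hypot(fx, fy)
    f[0, 0] = np.inf                        # kill the DC term (1/inf -> 0)
    amplitude = 1.0 / f ** alpha
    phase = rng.uniform(0, 2 * np.pi, (n, n))
    spectrum = amplitude * np.exp(1j * phase)
    img = np.real(np.fft.ifft2(spectrum))
    return (img - img.mean()) / img.std()   # normalise to zero mean, unit std

tex = fractal_texture(128, alpha=1.2)
```

Varying `alpha` independently along the spatial and temporal axes is what would sweep out the dynamic texture space the abstract describes.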

  8. Comparison of Parametric and Nonparametric Bootstrap Methods for Estimating Random Error in Equipercentile Equating

    ERIC Educational Resources Information Center

    Cui, Zhongmin; Kolen, Michael J.

    2008-01-01

    This article considers two methods of estimating standard errors of equipercentile equating: the parametric bootstrap method and the nonparametric bootstrap method. Using a simulation study, these two methods are compared under three sample sizes (300, 1,000, and 3,000), for two test content areas (the Iowa Tests of Basic Skills Maps and Diagrams…
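The contrast between the two bootstrap flavours can be sketched for a simple statistic. The code below is an illustrative toy, not the equipercentile-equating procedure of the article: it estimates the standard error of a sample median nonparametrically (resampling the data) and parametrically (resampling from a fitted normal model); all parameter values are assumptions.

```python
import numpy as np

def bootstrap_se(sample, statistic, n_boot=2000, parametric=False, seed=0):
    """Bootstrap standard error of `statistic`. Nonparametric: resample the
    data with replacement. Parametric (toy normal model): fit a normal
    distribution and draw fresh samples from it."""
    rng = np.random.default_rng(seed)
    n = len(sample)
    mu, sigma = sample.mean(), sample.std(ddof=1)
    reps = np.empty(n_boot)
    for b in range(n_boot):
        if parametric:
            resample = rng.normal(mu, sigma, n)
        else:
            resample = rng.choice(sample, n, replace=True)
        reps[b] = statistic(resample)
    return reps.std(ddof=1)

rng = np.random.default_rng(42)
scores = rng.normal(50, 10, 300)                          # simulated test scores
median_stat = lambda s: np.percentile(s, 50)
se_np = bootstrap_se(scores, median_stat)                 # nonparametric
se_p = bootstrap_se(scores, median_stat, parametric=True) # parametric
```

For a normal sample of n = 300 with sigma = 10, both estimates land near the theoretical standard error of the median, about 1.25 * sigma / sqrt(n) ≈ 0.72.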

  9. A novel whole genome amplification method using type IIS restriction enzymes to create overhangs with random sequences.

    PubMed

    Pan, Xiaoming; Wan, Baihui; Li, Chunchuan; Liu, Yu; Wang, Jing; Mou, Haijin; Liang, Xingguo

    2014-08-20

    Ligation-mediated polymerase chain reaction (LM-PCR) is a whole genome amplification (WGA) method in which genomic DNA is cleaved into numerous fragments and all of the fragments are amplified by PCR after attaching a universal end sequence. However, these fragments can self-ligate, which may bias the amplification and restrict the method's application. To decrease the probability of self-ligation, here we use type IIS restriction enzymes to digest genomic DNA into fragments with 4-5 nt overhangs of random sequence. After ligating an adapter with random end sequences to these fragments, PCR is carried out and almost all of the DNA sequences present are amplified. In this study, the whole genome of Vibrio parahaemolyticus was amplified and the amplification efficiency was evaluated by quantitative PCR. The results suggest that our approach can provide sufficient genomic DNA of good quality to meet the requirements of various genetic analyses.

  10. Two methods of random seed generation to avoid over-segmentation with stochastic watershed: application to nuclear fuel micrographs.

    PubMed

    Tolosa, S Cativa; Blacher, S; Denis, A; Marajofsky, A; Pirard, J-P; Gommes, C J

    2009-10-01

    A stochastic version of the watershed algorithm is obtained by choosing randomly in the image the seeds from which the watershed regions are grown. The output of the procedure is a probability density function corresponding to the probability that each pixel belongs to a boundary. In the present paper, two stochastic seed-generation processes are explored to avoid over-segmentation. The first is a non-uniform Poisson process, the density of which is optimized on the basis of opening granulometry. The second process positions the seeds randomly within disks centred on the maxima of a distance map. The two methods are applied to characterize the grain structure of nuclear fuel pellets. Estimators are proposed for the total edge length and grain number per unit area, L(A) and N(A), which take advantage of the probabilistic nature of the probability density function and do not require segmentation.
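The first seed-generation process described above, a non-uniform Poisson process, is commonly simulated by thinning: uniformly drawn candidate pixels are accepted with probability proportional to the local density. A minimal sketch under that assumption follows; it is not the authors' implementation, and the seed count is fixed here for simplicity, whereas a true Poisson process would also draw the count at random.

```python
import numpy as np

def nonuniform_poisson_seeds(density, n_target, seed=0):
    """Sample seed pixels from a 2-D intensity map by rejection ('thinning'):
    keep a uniformly drawn pixel with probability density/density.max()."""
    rng = np.random.default_rng(seed)
    h, w = density.shape
    dmax = density.max()
    seeds = []
    while len(seeds) < n_target:
        y, x = rng.integers(h), rng.integers(w)
        if rng.uniform() < density[y, x] / dmax:
            seeds.append((y, x))
    return np.array(seeds)

# Toy density map: seeds are more likely near the image centre.
yy, xx = np.mgrid[0:64, 0:64]
density = np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / (2 * 12.0 ** 2))
seeds = nonuniform_poisson_seeds(density, n_target=50)
```

In the stochastic watershed, each such seed set would start one realization of region growing, and boundary probabilities are accumulated over many realizations.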

  11. Low-coherence interferometry as a method for assessing the transport parameters in randomly inhomogeneous media

    SciTech Connect

    Zimnyakov, D A; Sina, J S; Yuvchenko, S A; Isaeva, E A; Chekmasov, S P; Ushakova, O V

    2014-01-31

    The specific features of low-coherence interferometric probing of layers in randomly inhomogeneous media, used to determine the radiation propagation transport length both in the diffuse regime and in the case of optically thin media, are discussed. The transport length is determined from the rate of exponential decay of the interference signal as the path length difference increases between the light beams in the reference arm of the low-coherence interferometer and in the object arm, which contains the probed layer as a diffuse reflector. Experimental tests of the discussed approach are presented, using layers of densely packed titanium dioxide nanoparticles and polytetrafluoroethylene. (radiation scattering)

  12. Control Capacity and A Random Sampling Method in Exploring Controllability of Complex Networks

    PubMed Central

    Jia, Tao; Barabási, Albert-László

    2013-01-01

    Controlling complex systems is a fundamental challenge of network science. Recent advances indicate that control over the system can be achieved through a minimum driver node set (MDS). The existence of multiple MDS's suggests that nodes do not participate in control equally, prompting us to quantify their participations. Here we introduce control capacity quantifying the likelihood that a node is a driver node. To efficiently measure this quantity, we develop a random sampling algorithm. This algorithm not only provides a statistical estimate of the control capacity, but also bridges the gap between multiple microscopic control configurations and macroscopic properties of the network under control. We demonstrate that the possibility of being a driver node decreases with a node's in-degree and is independent of its out-degree. Given the inherent multiplicity of MDS's, our findings offer tools to explore control in various complex systems. PMID:23912679

  13. A Real-time Auto-detection Method for Random Telegraph Signal (RTS) Noise Detection in CMOS Active pixel sensors

    NASA Astrophysics Data System (ADS)

    Zheng, R.; Zhao, R.; Ma, Y.; Li, B.; Wei, X.; Wang, J.; Gao, W.; Wei, T.; Gao, D.; Hu, Y.

    2015-07-01

    CMOS active pixel sensors (CMOS APS) are attractive for use in the innermost layers of charged-particle trackers due to their good trade-offs among key performance metrics. However, CMOS APS can be strongly affected by random telegraph signal (RTS) noise, which can cause failures in particle tracking or energy calculation. In-depth research into pixels' RTS behaviour has stimulated interest in methods for RTS noise detection, reconstruction and parameter extraction. In this paper, a real-time auto-detection method is proposed that uses the real-time Gaussian noise standard deviation as the detection threshold. Experimental results show that, compared with current methods that use the signal standard deviation as the threshold, the proposed method is more sensitive in multi-level RTS detection and more effective in the case of RTS noise degradation.
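The idea of thresholding on an estimate of the Gaussian noise level, rather than on the raw signal standard deviation, can be sketched on synthetic data. The code below is a simplified illustration, not the authors' method: it estimates the noise level robustly from first differences (so that rare RTS jumps do not inflate it) and flags samples whose jump exceeds a multiple of that estimate; all amplitudes and rates are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
noise_sigma = 1.0
rts_amp = 6.0

# Two-level random telegraph signal: the state flips with small probability.
flips = rng.uniform(size=n) < 0.005
state = np.cumsum(flips) % 2
signal = rts_amp * state + rng.normal(0, noise_sigma, n)

# Robust noise estimate from first differences: the median absolute
# deviation (scaled for normality) is insensitive to the rare RTS jumps,
# and the sqrt(2) accounts for differencing two independent noise samples.
d = np.diff(signal)
sigma_est = 1.4826 * np.median(np.abs(d - np.median(d))) / np.sqrt(2)

# Flag samples whose jump exceeds a multiple of the estimated noise level.
jumps = np.abs(d) > 4 * sigma_est
has_rts = bool(jumps.any())
```

Thresholding on `sigma_est` rather than `signal.std()` matters because the RTS itself inflates the overall standard deviation, which would desensitize the detector.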

  14. On Adequate Comparisons of Antenna Phase Center Variations

    NASA Astrophysics Data System (ADS)

    Schoen, S.; Kersten, T.

    2013-12-01

    One important part of ensuring the high quality of the International GNSS Service's (IGS) products is the collection and publication of receiver and satellite antenna phase center variations (PCV). The PCV are crucial for global and regional networks, since they introduce a global scale factor of up to 16 ppb or changes in the height component of up to 10 cm, respectively. Furthermore, antenna phase center variations are also important for precise orbit determination, navigation and positioning of mobile platforms, such as the GOCE and GRACE gravity missions, or for accurate Precise Point Positioning (PPP) processing. Using the EUREF Permanent Network (EPN), Baire et al. (2012) showed that individual PCV values have a significant impact on geodetic positioning. These statements are further supported by studies of Steigenberger et al. (2013), where the impact of PCV on local ties is analysed. Currently, five calibration institutions, including the Institut für Erdmessung (IfE), contribute to the IGS PCV file. Different approaches, such as field calibrations and anechoic chamber measurements, are in use. Additionally, the computation and parameterization of the PCV differ completely between the methods. Therefore, every new approach has to pass a benchmark test to ensure that PCV values of an identical antenna obtained from different methods are as consistent as possible. Since the number of approaches to obtain these PCV values rises with the number of calibration institutions, there is a need for an adequate comparison concept that takes into account not only the numerical values but also stochastic information and computational issues of the determined PCVs. This is of special importance, since the majority of calibrated receiver antennas published by the IGS originate from absolute field calibrations based on the Hannover Concept, Wübbena et al. (2000). In this contribution, a concept for the adequate

  15. Predictors for Reporting of Dietary Assessment Methods in Food-based Randomized Controlled Trials over a Ten-year Period.

    PubMed

    Probst, Yasmine; Zammit, Gail

    2016-09-01

    When a study is food-based, monitoring dietary intake within the randomized controlled trial becomes vital to justifying the study outcomes. A systematic literature review was conducted to determine how the dietary assessment methods used to monitor dietary intake are reported, and whether assisted technologies are used in conducting such assessments. The OVID and ScienceDirect databases for 2000-2010 were searched for food-based, parallel, randomized controlled trials conducted with humans, using the search terms "clinical trial," "diet$ intervention" AND "diet$ assessment," "diet$ method$," "intake," "diet history," "food record," "food frequency questionnaire," "FFQ," "food diary," "24-hour recall." A total of 1364 abstracts were reviewed and 243 studies identified. The size of the study and the country of origin appear to be the two most common predictors of reporting both the dietary assessment method and the details of the form of assessment; the journal in which the study is published has no impact. Information technology use may increase in the future, allowing other methods and forms of dietary assessment to be used efficiently.

  16. Hybrid random walk-linear discriminant analysis method for unwrapping quantitative phase microscopy images of biological samples

    PubMed Central

    Kim, Diane N. H.; Teitell, Michael A.; Reed, Jason; Zangle, Thomas A.

    2015-01-01

    Abstract. Standard algorithms for phase unwrapping often fail for interferometric quantitative phase imaging (QPI) of biological samples due to the variable morphology of these samples and the requirement to image at low light intensities to avoid phototoxicity. We describe a new algorithm combining random walk-based image segmentation with linear discriminant analysis (LDA)-based feature detection, using assumptions about the morphology of biological samples to account for phase ambiguities when standard methods have failed. We present three versions of our method: first, a method for LDA image segmentation based on a manually compiled training dataset; second, a method using a random walker (RW) algorithm informed by the assumed properties of a biological phase image; and third, an algorithm which combines LDA-based edge detection with an efficient RW algorithm. We show that the combination of LDA plus the RW algorithm gives the best overall performance with little speed penalty compared to LDA alone, and that this algorithm can be further optimized using a genetic algorithm to yield superior performance for phase unwrapping of QPI data from biological samples. PMID:26305212

  17. MULTILEVEL ACCELERATION OF STOCHASTIC COLLOCATION METHODS FOR PDE WITH RANDOM INPUT DATA

    SciTech Connect

    Webster, Clayton G; Jantsch, Peter A; Teckentrup, Aretha L; Gunzburger, Max D

    2013-01-01

    Stochastic Collocation (SC) methods for stochastic partial differential equations (SPDEs) suffer from the curse of dimensionality, whereby increases in the stochastic dimension cause an explosion of computational effort. To combat these challenges, multilevel approximation methods seek to decrease computational complexity by balancing spatial and stochastic discretization errors. As a form of variance reduction, multilevel techniques have been successfully applied to Monte Carlo (MC) methods, but may be extended to accelerate other methods for SPDEs in which the stochastic and spatial degrees of freedom are decoupled. This article presents a general convergence and computational complexity analysis of a multilevel method for SPDEs, demonstrating its advantages with regard to standard, single-level approximation. The numerical results highlight conditions under which multilevel sparse grid SC is preferable to the more traditional MC and SC approaches.
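The multilevel idea, many cheap samples of a coarse approximation plus progressively fewer samples of fine-level corrections, can be sketched on a toy problem. The code below is illustrative only and unrelated to the article's SPDE setting: the "levels" are truncated Taylor approximations of exp, and the target is E[exp(ξ)] for ξ ~ U(0,1); all sample-size choices are assumptions.

```python
import numpy as np

def f_level(xi, level):
    # Level-l approximation of exp(xi): Taylor series with 2**l + 1 terms,
    # so accuracy (and, in a real application, cost) grows with level.
    terms = 2 ** level + 1
    fact = np.cumprod(np.concatenate(([1.0], np.arange(1.0, terms))))
    return (xi[:, None] ** np.arange(terms) / fact).sum(axis=1)

def mlmc_estimate(max_level, n0=20000, seed=0):
    """Multilevel Monte Carlo estimate of E[exp(xi)], xi ~ U(0,1): the
    coarse level uses many samples; the fine-level corrections, whose
    variance is small, use geometrically fewer."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for l in range(max_level + 1):
        n_l = max(n0 // 4 ** l, 100)        # geometric sample decay per level
        xi = rng.uniform(0, 1, n_l)
        if l == 0:
            total += f_level(xi, 0).mean()
        else:
            total += (f_level(xi, l) - f_level(xi, l - 1)).mean()
    return total

est = mlmc_estimate(max_level=4)
exact = np.e - 1.0                          # E[exp(xi)] for xi ~ U(0,1)
```

The telescoping sum E[f_L] = E[f_0] + Σ E[f_l − f_{l−1}] is exactly the decomposition the multilevel methods in the article exploit, with SPDE discretizations in place of Taylor truncations.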

  18. Smoke alarm tests may not adequately indicate smoke alarm function.

    PubMed

    Peek-Asa, Corinne; Yang, Jingzhen; Hamann, Cara; Young, Tracy

    2011-01-01

    Smoke alarms are one of the most promoted prevention strategies to reduce residential fire deaths, and they can reduce residential fire deaths by half. Smoke alarm function can be measured by two tests: the smoke alarm button test and the chemical smoke test. Using results from a randomized trial of smoke alarms, we compared smoke alarm response to the button test and the smoke test. The smoke alarms found in the study homes at baseline were tested, as well as study alarms placed into homes as part of the randomized trial. Study alarms were tested at 12 and 42 months postinstallation. The proportion of alarms that passed the button test but not the smoke test ranged from 0.5 to 5.8% of alarms; this result was found most frequently among ionization alarms with zinc or alkaline batteries. These alarms would indicate to the owner (through the button test) that the smoke alarm was working, but the alarm would not actually respond in the case of a fire (as demonstrated by failing the smoke test). The proportion of alarms that passed the smoke test but not the button test ranged from 1.0 to 3.0%. These alarms would appear nonfunctional to the owner (because the button test failed), even though the alarm would operate in response to a fire (as demonstrated by passing the smoke test). The general public is not aware of the potential for inaccuracy in smoke alarm tests, and burn professionals can advocate for enhanced testing methods. The optimal test to determine smoke alarm function is the chemical smoke test. PMID:21747329

  19. An analytical method for disentangling the roles of adhesion and crowding for random walk models on a crowded lattice.

    PubMed

    Ellery, Adam J; Baker, Ruth E; Simpson, Matthew J

    2016-01-01

    Migration of cells and molecules in vivo is affected by interactions with obstacles. These interactions can include crowding effects, as well as adhesion/repulsion between the motile cell/molecule and the obstacles. Here we present an analytical framework that can be used to separately quantify the roles of crowding and adhesion/repulsion using a lattice-based random walk model. Our method leads to an exact calculation of the long time Fickian diffusivity, and avoids the need for computationally expensive stochastic simulations. PMID:27597573

  20. An analytical method for disentangling the roles of adhesion and crowding for random walk models on a crowded lattice

    NASA Astrophysics Data System (ADS)

    Ellery, Adam J.; Baker, Ruth E.; Simpson, Matthew J.

    2016-10-01

    Migration of cells and molecules in vivo is affected by interactions with obstacles. These interactions can include crowding effects, as well as adhesion/repulsion between the motile cell/molecule and the obstacles. Here we present an analytical framework that can be used to separately quantify the roles of crowding and adhesion/repulsion using a lattice-based random walk model. Our method leads to an exact calculation of the long time Fickian diffusivity, and avoids the need for computationally expensive stochastic simulations.

  1. An analytical method for disentangling the roles of adhesion and crowding for random walk models on a crowded lattice.

    PubMed

    Ellery, Adam J; Baker, Ruth E; Simpson, Matthew J

    2016-09-06

    Migration of cells and molecules in vivo is affected by interactions with obstacles. These interactions can include crowding effects, as well as adhesion/repulsion between the motile cell/molecule and the obstacles. Here we present an analytical framework that can be used to separately quantify the roles of crowding and adhesion/repulsion using a lattice-based random walk model. Our method leads to an exact calculation of the long time Fickian diffusivity, and avoids the need for computationally expensive stochastic simulations.
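The crowding half of the model described above can be sketched with a direct lattice simulation (the article's point is precisely that its analytical framework avoids such simulations). The code below is an illustrative toy, not the authors' model: walkers on a periodic lattice reject moves into immobile obstacles, and the mean-squared displacement shows the reduced effective diffusivity; all sizes and obstacle fractions are assumptions.

```python
import numpy as np

def crowded_walk_msd(obstacle_frac, n_steps=400, n_walkers=300, size=64, seed=0):
    """Mean-squared displacement of random walkers on a 2-D periodic lattice
    with immobile obstacles; moves into occupied sites are rejected
    (pure crowding, no adhesion/repulsion)."""
    rng = np.random.default_rng(seed)
    occupied = rng.uniform(size=(size, size)) < obstacle_frac
    free = np.argwhere(~occupied)
    pos = free[rng.choice(len(free), n_walkers)]     # start on free sites
    disp = np.zeros((n_walkers, 2))                  # unwrapped displacement
    moves = np.array([[0, 1], [0, -1], [1, 0], [-1, 0]])
    for _ in range(n_steps):
        step = moves[rng.integers(4, size=n_walkers)]
        new = (pos + step) % size
        ok = ~occupied[new[:, 0], new[:, 1]]         # reject blocked moves
        pos[ok] = new[ok]
        disp[ok] += step[ok]
    return (disp ** 2).sum(axis=1).mean()

msd_free = crowded_walk_msd(0.0)       # obstacle-free: MSD ~ n_steps
msd_crowded = crowded_walk_msd(0.3)    # 30% obstacles: MSD clearly reduced
# Effective diffusivity D = MSD / (4 t) drops as crowding increases.
```

Adding an acceptance probability for moves adjacent to obstacles would introduce the adhesion/repulsion effect that the article's analytical method disentangles from crowding.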

  2. Perturbation methods to track wireless optical wave propagation in a random medium.

    PubMed

    Bosu, Rahul; Prince, Shanthi

    2016-02-01

    We consider certain problems in optical wave propagation in linear and nonlinear media in a regime of low-level atmospheric refractive-index fluctuations. The perturbation theory hinges on identifying such perturbing parameters and studying their effect on the characteristics of the propagating free-space optical beam. Here, we illustrate the application of a few perturbation methods used to track the distorted free-space optical fields. Furthermore, the tracking error is computed for the various approximate solutions, in contrast to the numerical solution, to compare the capturing efficiency of the different perturbative analytical techniques. We found that the coordinate-straining method reduces the deviation in the approximate solutions more efficiently than the regular perturbation expansion method. Moreover, the analysis shows that the tracking reliability of the various approximate solutions at a particular perturbation level depends on the proper choice of the approximation method and the order of the solution.

  3. Maternal Side-Effects of Continuous vs. Intermittent Method of Entonox During Labor: A Randomized Clinical Trial.

    PubMed

    Agah, Jila; Baghani, Roya; Tabaraei, Yaser; Rad, Abolfazl

    2016-01-01

    Labor pain is one of the most tiresome types of pain, and humans have long sought means to allay it. Administration of a suitable agent such as Entonox during labor is very beneficial for childbirth outcomes. Entonox can be administered in two ways: intermittently and continuously. The aim of this study was to determine whether the continuous method is as safe as the intermittent method. This randomized clinical trial was performed in Mobini Hospital, Sabzevar, Iran. One hundred women admitted for vaginal delivery were included in this study. Eligible patients were randomly divided into two equal groups. After thorough training, the patients used Entonox during the active phase of labor: 50 parturients used it intermittently and 50 used it continuously. Maternal adverse effects, satisfaction and labor progression were then recorded and compared between the two groups. Statistical analysis was performed with SPSS 17 software, using the t-test and chi-square test. The maternal side effects of Entonox showed no significant difference between the two groups (p>0.05). Mothers' satisfaction was significantly higher in the continuous group than in the intermittent group (p<0.001). The mean duration of the active phase of labor did not differ significantly between the two groups (p=0.2). It appears that, with further investigation, mothers could be offered a choice of their preferred method of Entonox use, intermittent or continuous. This approach could reduce difficult labor and cesarean sections and consequently help improve maternal health, both physically and psychologically. PMID:27642337

  4. Known plaintext attack on double random phase encoding using fingerprint as key and a method for avoiding the attack.

    PubMed

    Tashima, Hideaki; Takeda, Masafumi; Suzuki, Hiroyuki; Obi, Takashi; Yamaguchi, Masahiro; Ohyama, Nagaaki

    2010-06-21

    We have shown that the application of double random phase encoding (DRPE) to biometrics enables the use of biometrics as cipher keys for binary data encryption. However, DRPE is reported to be vulnerable to known-plaintext attacks (KPAs) using a phase recovery algorithm. In this study, we investigated the vulnerability of DRPE using fingerprints as cipher keys to the KPAs. By means of computational experiments, we estimated the encryption key and restored the fingerprint image using the estimated key. Further, we propose a method for avoiding the KPA on the DRPE that employs the phase retrieval algorithm. The proposed method makes the amplitude component of the encrypted image constant in order to prevent the amplitude component of the encrypted image from being used as a clue for phase retrieval. Computational experiments showed that the proposed method not only avoids revealing the cipher key and the fingerprint but also serves as a sufficiently accurate verification system. PMID:20588510
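The DRPE operation under discussion can be sketched in its standard two-mask Fourier form; the image and keys below are random stand-ins (in the paper, the input-plane key is derived from a fingerprint), and this sketch omits the proposed constant-amplitude countermeasure.

```python
import numpy as np

def drpe_encrypt(image, key1, key2):
    """Double random phase encoding: multiply by a random phase mask in the
    input plane, Fourier transform, multiply by a second mask in the Fourier
    plane, and inverse transform."""
    masked = image * np.exp(2j * np.pi * key1)
    spectrum = np.fft.fft2(masked) * np.exp(2j * np.pi * key2)
    return np.fft.ifft2(spectrum)

def drpe_decrypt(cipher, key1, key2):
    """Invert the two phase modulations in reverse order."""
    spectrum = np.fft.fft2(cipher) * np.exp(-2j * np.pi * key2)
    return np.fft.ifft2(spectrum) * np.exp(-2j * np.pi * key1)

rng = np.random.default_rng(42)
img = rng.random((64, 64))   # stand-in for the plaintext data
k1 = rng.random((64, 64))    # input-plane key (fingerprint-derived in the paper)
k2 = rng.random((64, 64))    # Fourier-plane key
cipher = drpe_encrypt(img, k1, k2)
recovered = drpe_decrypt(cipher, k1, k2).real
print(np.allclose(recovered, img))  # → True
```

The KPA vulnerability arises because the complex-valued `cipher` carries amplitude information a phase-retrieval algorithm can exploit; the paper's countermeasure forces that amplitude to be constant.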

  5. Minkowski-Voronoi diagrams as a method to generate random packings of spheropolygons for the simulation of soils.

    PubMed

    Galindo-Torres, S A; Muñoz, J D; Alonso-Marroquín, F

    2010-11-01

    Minkowski operators (dilation and erosion of sets in vector spaces) have been extensively used in computer graphics, image processing to analyze the structure of materials, and more recently in molecular dynamics. Here, we apply those mathematical concepts to extend the discrete element method to simulate granular materials with complex-shaped particles. The Voronoi-Minkowski diagrams are introduced to generate random packings of complex-shaped particles with tunable particle roundness. Contact forces and potentials are calculated in terms of distances instead of overlaps. By using the Verlet method to detect neighborhood, we achieve CPU times that grow linearly with the body's number of sides. Simulations of dissipative granular materials under shear demonstrate that the method maintains conservation of energy in accord with the first law of thermodynamics. A series of simulations for biaxial test, shear band formation, hysteretic behavior, and ratcheting show that the model can reproduce the main features of real granular-soil behavior.
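For readers unfamiliar with the operators named above: the dilation of a set A by a structuring set B is the Minkowski sum A ⊕ B = {a + b : a ∈ A, b ∈ B}, and a spheropolygon is precisely a polygon dilated by a disk. A minimal discrete sketch:

```python
def minkowski_sum(a_pts, b_pts):
    """Dilation of point set A by structuring set B (Minkowski sum)."""
    return {(ax + bx, ay + by) for ax, ay in a_pts for bx, by in b_pts}

square = {(0, 0), (1, 0), (0, 1), (1, 1)}
cross = {(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)}  # discrete stand-in for a disk
dilated = minkowski_sum(square, cross)
print(len(dilated))  # → 12: the square grown by one cell along each axis
```

In the continuous setting of the paper the same operation rounds a polygon's corners, which is what gives the particles tunable roundness and lets contact forces be written in terms of distances.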

  6. Genotype-phenotype matching analysis of 38 Lactococcus lactis strains using random forest methods

    PubMed Central

    2013-01-01

    Background Lactococcus lactis is used in dairy food fermentation and for the efficient production of industrially relevant enzymes. The genome content and different phenotypes have been determined for multiple L. lactis strains in order to understand intra-species genotype and phenotype diversity and annotate gene functions. In this study, we identified relations between gene presence and a collection of 207 phenotypes across 38 L. lactis strains of dairy and plant origin. Gene occurrence and phenotype data were used in an iterative gene selection procedure, based on the Random Forest algorithm, to identify genotype-phenotype relations. Results A total of 1388 gene-phenotype relations were found, of which some confirmed known gene-phenotype relations, such as the importance of arabinose utilization genes only for strains of plant origin. We also identified a gene cluster related to growth on melibiose, a plant disaccharide; this cluster is present only in melibiose-positive strains and can be used as a genetic marker in trait improvement. Additionally, several novel gene-phenotype relations were uncovered, for instance, genes related to arsenite resistance or arginine metabolism. Conclusions Our results indicate that genotype-phenotype matching by integrating large data sets makes it possible to identify gene-phenotype relations and potentially improve gene function annotation; the identified relations can be used for screening bacterial culture collections for desired phenotypes. In addition to all gene-phenotype relations, we also provide coherent phenotype data for 38 Lactococcus strains assessed in 207 different phenotyping experiments, which to our knowledge is the largest such data set to date for the Lactococcus lactis species. PMID:23530958
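As a toy illustration of Random-Forest-style gene selection, the sketch below ranks binary gene-presence features with an ensemble of bootstrapped decision stumps; it is a simplified stand-in for the full Random Forest algorithm used in the study, and the strain/phenotype data are synthetic.

```python
import numpy as np

def stump_importance(X, y, n_trees=200, seed=0):
    """Rank binary gene-presence features by how often a bootstrapped
    decision stump, restricted to a random feature subset, picks them as
    its best Gini split. Mimics the spirit of Random-Forest importance;
    a real analysis would use a full Random Forest implementation."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    counts = np.zeros(p)
    k = max(1, int(np.sqrt(p)))  # features considered per stump
    for _ in range(n_trees):
        rows = rng.integers(0, n, size=n)            # bootstrap sample
        feats = rng.choice(p, size=k, replace=False)
        Xb, yb = X[rows], y[rows]
        best, best_gain = None, -1.0
        base = yb.mean() * (1 - yb.mean())           # Gini impurity at the node
        for f in feats:
            left, right = yb[Xb[:, f] == 1], yb[Xb[:, f] == 0]
            if len(left) == 0 or len(right) == 0:
                continue
            gini = (len(left) * left.mean() * (1 - left.mean())
                    + len(right) * right.mean() * (1 - right.mean())) / n
            gain = base - gini
            if gain > best_gain:
                best, best_gain = f, gain
        if best is not None:
            counts[best] += 1
    return counts / n_trees

# Synthetic example: the phenotype is driven entirely by gene 3
# (compare the melibiose gene cluster in the abstract)
rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(38, 20))   # 38 strains, 20 gene clusters
y = X[:, 3].copy()                      # phenotype equals presence of gene 3
scores = stump_importance(X, y)
print(scores.argmax())                  # gene 3 should rank first
```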

  7. Understanding Your Adequate Yearly Progress (AYP), 2011-2012

    ERIC Educational Resources Information Center

    Missouri Department of Elementary and Secondary Education, 2011

    2011-01-01

    The "No Child Left Behind Act (NCLB) of 2001" requires all schools, districts/local education agencies (LEAs) and states to show that students are making Adequate Yearly Progress (AYP). NCLB requires states to establish targets in the following ways: (1) Annual Proficiency Target; (2) Attendance/Graduation Rates; and (3) Participation Rates.…

  8. 15 CFR 970.404 - Adequate exploration plan.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 15 Commerce and Foreign Trade 3 2014-01-01 2014-01-01 false Adequate exploration plan. 970.404...) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE GENERAL REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES Certification of...

  9. 15 CFR 970.404 - Adequate exploration plan.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 15 Commerce and Foreign Trade 3 2012-01-01 2012-01-01 false Adequate exploration plan. 970.404...) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE GENERAL REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES Certification of...

  10. 15 CFR 970.404 - Adequate exploration plan.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 15 Commerce and Foreign Trade 3 2013-01-01 2013-01-01 false Adequate exploration plan. 970.404...) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE GENERAL REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES Certification of...

  11. 15 CFR 970.404 - Adequate exploration plan.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 15 Commerce and Foreign Trade 3 2011-01-01 2011-01-01 false Adequate exploration plan. 970.404...) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE GENERAL REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES Certification of...

  12. 15 CFR 970.404 - Adequate exploration plan.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Adequate exploration plan. 970.404...) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE GENERAL REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES Certification of...

  13. Adequate Schools and Inadequate Education: An Anthropological Perspective.

    ERIC Educational Resources Information Center

    Wolcott, Harry F.

    To illustrate his claim that schools generally do a remarkably good job of schooling while the society makes inadequate use of other means to educate young people, the author presents a case history of a young American (identified pseudonymously as "Brad") whose schooling was adequate but whose education was not. Brad, jobless and homeless,…

  14. Comparability and Reliability Considerations of Adequate Yearly Progress

    ERIC Educational Resources Information Center

    Maier, Kimberly S.; Maiti, Tapabrata; Dass, Sarat C.; Lim, Chae Young

    2012-01-01

    The purpose of this study is to develop an estimate of Adequate Yearly Progress (AYP) that will allow for reliable and valid comparisons among student subgroups, schools, and districts. A shrinkage-type estimator of AYP using the Bayesian framework is described. Using simulated data, the performance of the Bayes estimator will be compared to…

  15. 13 CFR 107.200 - Adequate capital for Licensees.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 13 Business Credit and Assistance 1 2012-01-01 2012-01-01 false Adequate capital for Licensees. 107.200 Section 107.200 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION SMALL BUSINESS... operate actively in accordance with your Articles and within the context of your business plan,...

  16. Assessing Juvenile Sex Offenders to Determine Adequate Levels of Supervision.

    ERIC Educational Resources Information Center

    Gerdes, Karen E.; And Others

    1995-01-01

    This study analyzed the internal consistency of four inventories used by Utah probation officers to determine adequate and efficacious supervision levels and placement for juvenile sex offenders. Three factors accounted for 41.2 percent of variance (custodian's and juvenile's attitude toward intervention, offense characteristics, and historical…

  17. 4 CFR 200.14 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... identifiable personal data and automated systems shall be adequately trained in the security and privacy of... records in which identifiable personal data are processed or maintained, including all reports and output... personal records or data; must minimize, to the extent practicable, the risk that skilled technicians...

  18. Do Beginning Teachers Receive Adequate Support from Their Headteachers?

    ERIC Educational Resources Information Center

    Menon, Maria Eliophotou

    2012-01-01

    The article examines the problems faced by beginning teachers in Cyprus and the extent to which headteachers are considered to provide adequate guidance and support to them. Data were collected through interviews with 25 school teachers in Cyprus, who had recently entered teaching (within 1-5 years) in public primary schools. According to the…

  19. Comparison Between Two Methods for Estimating the Vertical Scale of Fluctuation for Modeling Random Geotechnical Problems

    NASA Astrophysics Data System (ADS)

    Pieczyńska-Kozłowska, Joanna M.

    2015-12-01

    The design process in geotechnical engineering requires the most accurate mapping of soil. The difficulty lies in the spatial variability of soil parameters, which has been the subject of investigation by many researchers for many years. This study analyses the soil-modeling problem by suggesting two effective methods of acquiring information on soil variability from cone penetration test (CPT) data. The first method has been used in geotechnical engineering before, but the second has not so far been associated with geotechnics. Both methods are applied to a case study in which the parameters of variability are estimated. Knowledge of the variability of parameters allows, in the long term, a more effective estimation of, for example, the probability of bearing-capacity failure.
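A common route to the vertical scale of fluctuation from detrended CPT data is Vanmarcke's definition, twice the integral of the autocorrelation function. The sketch below integrates the sample autocorrelation up to its first zero crossing on a synthetic AR(1) profile; it is one plausible estimator under stated assumptions, not necessarily either of the two methods compared in the paper.

```python
import numpy as np

def scale_of_fluctuation(profile, dz):
    """Estimate the vertical scale of fluctuation (SOF) of a detrended
    profile by integrating the sample autocorrelation up to its first
    zero crossing: theta ~= 2 * sum(rho(tau)) * dz."""
    x = profile - profile.mean()
    n = len(x)
    acf = np.correlate(x, x, mode="full")[n - 1:] / (x @ x)
    cut = int(np.argmax(acf <= 0)) if np.any(acf <= 0) else n
    return 2.0 * dz * acf[:cut].sum()

# Synthetic AR(1) residual with exponential autocorrelation exp(-tau/corr_len),
# for which the theoretical SOF is 2 * corr_len
dz, corr_len = 0.02, 0.5              # 2 cm sampling, 0.5 m correlation length
rho = np.exp(-dz / corr_len)
rng = np.random.default_rng(7)
noise = rng.normal(size=10000)
x = np.empty_like(noise)
x[0] = noise[0]
for i in range(1, len(noise)):
    x[i] = rho * x[i - 1] + np.sqrt(1 - rho**2) * noise[i]
theta = scale_of_fluctuation(x, dz)
print(theta)  # should fall near the theoretical value 2 * corr_len = 1.0 m
```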

  20. Methods of and apparatus for recording images occurring just prior to a rapid, random event

    DOEpatents

    Kelley, Edward F.

    1994-01-01

    An apparatus and a method are disclosed for recording images of events in a medium wherein the images that are recorded are of conditions existing just prior to and during the occurrence of an event that triggers recording of these images. The apparatus and method use an optical delay path that employs a spherical focusing mirror facing a circular array of flat return mirrors around a central flat mirror. The image is reflected in a symmetric pattern which balances astigmatism which is created by the spherical mirror. Delays on the order of hundreds of nanoseconds are possible.

  1. General method for inspection based on artificial vision of flat products with random or organized texture

    NASA Astrophysics Data System (ADS)

    Alexief, Jean-Louis; Kerkeni, Naceur; Angue, Jean-Claude

    1996-02-01

    Visual inspection is a non-destructive control technique that analyzes, from pixel images, the conformity of a product that may present manufacturing defects. The diversity of products to inspect calls for specific methods and algorithms, which leads to dedicated designs of inspection systems. Our research on the design of inspection systems based on artificial vision addresses the definition of a methodological framework for the design. We also address computer-aided design to determine the sequences of image processing that allow defect detection. We propose a design framework based on the different phases leading to the conceptual model of the IAV system (inspection based on artificial vision). The aided design is envisaged under two aspects. The first concerns products with finite dimensions and a determined form, for which an image-processing sequence planner can provide solutions. The second concerns products with a characteristic texture, for which a defect-detection method must be defined; this method must be applicable in many cases. Thus, for the inspection of flat textured products, we propose a method based on spectral analysis. It exploits the fact that defects produce significant modifications of the energy spectrum, and rests on the construction of an optimal spatial filter that enhances defect contrast. This filter can be built automatically and/or interactively by a human operator. Several defects can be detected simultaneously by a combination of filters. Principal component analysis can then be used to determine the most discriminant global attributes of the filtered image.

  2. A Meta-Analysis of Randomized Controlled Trials of Yiqi Yangyin Huoxue Method in Treating Diabetic Nephropathy

    PubMed Central

    Ou, Jiao Ying; Huang, Di; Wu, Yan Sheng; Xu, Lin; He, Fei; Wang, Hui Ling; Shi, Li Qiang; Wan, Qiang; He, Li Qun; Dong Gao, Jian

    2016-01-01

    Objective. The purpose of this systematic review is to evaluate the evidence of Yiqi Yangyin Huoxue Method for diabetic nephropathy. Methods. 11 electronic databases, through September 2015, were searched to identify randomized controlled trials of Yiqi Yangyin Huoxue Method for diabetic nephropathy. The quality of the included trials was assessed using the Jadad scale. Results. 26 randomized controlled trials were included in our review. Of the included trials, most were considered to be of high quality. The aggregated results suggested that Yiqi Yangyin Huoxue Method is beneficial to diabetic nephropathy in bringing down the microalbuminuria (SMD = −0.98, 95% CI −1.22 to −0.74), serum creatinine (SMD = −0.56, 95% CI −0.93 to −0.20), beta-2 microglobulin (MD = 0.06, 95% CI 0.01 to 0.12), fasting plasma glucose (MD = −0.35, 95% CI −0.62 to −0.08), and 2-hour postprandial blood glucose (MD = 1.13, 95% CI 0.07 to 2.20), but not in decreasing blood urea nitrogen (SMD = −0.72, 95% CI −1.47 to 0.02) or 2-hour postprandial blood glucose (SMD = −0.48, 95% CI −1.01 to 0.04). Conclusions. Yiqi Yangyin Huoxue Method should be a valid complementary and alternative therapy in the management of diabetic nephropathy, especially in improving UAER, serum creatinine, fasting blood glucose, and beta-2 microglobulin. However, more studies with long follow-up are warranted to confirm the current findings. PMID:27313643

  3. A Meta-Analysis of Randomized Controlled Trials of Yiqi Yangyin Huoxue Method in Treating Diabetic Nephropathy.

    PubMed

    Ou, Jiao Ying; Huang, Di; Wu, Yan Sheng; Xu, Lin; He, Fei; Wang, Hui Ling; Shi, Li Qiang; Wan, Qiang; He, Li Qun; Dong Gao, Jian

    2016-01-01

    Objective. The purpose of this systematic review is to evaluate the evidence of Yiqi Yangyin Huoxue Method for diabetic nephropathy. Methods. 11 electronic databases, through September 2015, were searched to identify randomized controlled trials of Yiqi Yangyin Huoxue Method for diabetic nephropathy. The quality of the included trials was assessed using the Jadad scale. Results. 26 randomized controlled trials were included in our review. Of the included trials, most were considered to be of high quality. The aggregated results suggested that Yiqi Yangyin Huoxue Method is beneficial to diabetic nephropathy in bringing down the microalbuminuria (SMD = -0.98, 95% CI -1.22 to -0.74), serum creatinine (SMD = -0.56, 95% CI -0.93 to -0.20), beta-2 microglobulin (MD = 0.06, 95% CI 0.01 to 0.12), fasting plasma glucose (MD = -0.35, 95% CI -0.62 to -0.08), and 2-hour postprandial blood glucose (MD = 1.13, 95% CI 0.07 to 2.20), but not in decreasing blood urea nitrogen (SMD = -0.72, 95% CI -1.47 to 0.02) or 2-hour postprandial blood glucose (SMD = -0.48, 95% CI -1.01 to 0.04). Conclusions. Yiqi Yangyin Huoxue Method should be a valid complementary and alternative therapy in the management of diabetic nephropathy, especially in improving UAER, serum creatinine, fasting blood glucose, and beta-2 microglobulin. However, more studies with long follow-up are warranted to confirm the current findings. PMID:27313643
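Pooled effects like the SMDs above are conventionally computed with an inverse-variance model. Below is a minimal DerSimonian-Laird random-effects sketch; the study effects and variances are made up for illustration, not data from this meta-analysis.

```python
import math

def dersimonian_laird(effects, variances):
    """Pool per-study effect sizes (e.g. SMDs) under the DerSimonian-Laird
    random-effects model. Returns the pooled effect and its 95% CI."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
    # Cochran's Q and the between-study variance tau^2
    q = sum(wi * (ei - fixed) ** 2 for wi, ei in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * ei for wi, ei in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical SMDs and variances from five trials
effects = [-1.1, -0.8, -0.9, -1.3, -0.7]
variances = [0.04, 0.06, 0.05, 0.08, 0.07]
pooled, (ci_lo, ci_hi) = dersimonian_laird(effects, variances)
print(round(pooled, 2))  # → -0.96 for these illustrative inputs
```

Here Q < df, so tau² is truncated to zero and the pooled value coincides with the fixed-effect estimate; heterogeneous studies would widen the interval instead.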

  4. Are Substance Use Prevention Programs More Effective in Schools Making Adequate Yearly Progress? A Study of Project ALERT

    ERIC Educational Resources Information Center

    Clark, Heddy Kovach; Ringwalt, Chris L.; Shamblen, Stephen R.; Hanley, Sean M.; Flewelling, Robert L.

    2011-01-01

    This exploratory study sought to determine if a popular school-based drug prevention program might be effective in schools that are making adequate yearly progress (AYP). Thirty-four schools with grades 6 through 8 in 11 states were randomly assigned either to receive Project ALERT (n = 17) or to a control group (n = 17); of these, 10 intervention…

  5. A Robust and Versatile Method of Combinatorial Chemical Synthesis of Gene Libraries via Hierarchical Assembly of Partially Randomized Modules.

    PubMed

    Popova, Blagovesta; Schubert, Steffen; Bulla, Ingo; Buchwald, Daniela; Kramer, Wilfried

    2015-01-01

    A major challenge in gene library generation is to guarantee a large functional size and diversity that significantly increases the chances of selecting different functional protein variants. The use of trinucleotide mixtures for controlled randomization results in superior library diversity and offers the ability to specify the type and distribution of the amino acids at each position. Here we describe the generation of a high diversity gene library using tHisF of the hyperthermophile Thermotoga maritima as a scaffold. Combining various rational criteria with contingency, we targeted 26 selected codons of the thisF gene sequence for randomization at a controlled level. We have developed a novel method of creating full-length gene libraries by combinatorial assembly of smaller sub-libraries. Full-length libraries of high diversity can easily be assembled on demand from smaller and much less diverse sub-libraries, which circumvents the notoriously troublesome long-term archiving and repeated propagation of high diversity ensembles of phages or plasmids. We developed a generally applicable software tool for sequence analysis of mutated gene sequences that provides efficient assistance for analysis of library diversity. Finally, the practical utility of the library was demonstrated in principle by assessing the conformational stability of library members and isolating protein variants with HisF activity from it. Our approach integrates a number of features of nucleic acid synthetic chemistry, biochemistry and molecular genetics into a coherent, flexible and robust method of combinatorial gene synthesis.
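The multiplicative payoff of hierarchical assembly can be illustrated abstractly: a full-length library is the Cartesian product of small sub-libraries, so diversity multiplies while each archived pool stays small. The module sequences below are invented for illustration.

```python
from itertools import product

# Three hypothetical partially randomized modules (sub-libraries)
module_a = ["ATGGCA", "ATGGGA", "ATGTCA"]
module_b = ["CCTGAA", "CCTCAA"]
module_c = ["GGATAG", "GGCTAG", "GGTTAG", "GGGTAG"]

# Combinatorial assembly: every combination of module variants yields a
# distinct full-length gene; diversity is the product of the module sizes.
library = ["".join(parts) for parts in product(module_a, module_b, module_c)]
print(len(library))  # → 24 full-length variants from only 3 + 2 + 4 archived sequences
```

Scaling the same arithmetic up, a handful of sub-libraries with a few thousand members each yields a full-length diversity far beyond what could be stored or propagated directly, which is the archiving advantage the abstract describes.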

  6. A Robust and Versatile Method of Combinatorial Chemical Synthesis of Gene Libraries via Hierarchical Assembly of Partially Randomized Modules

    PubMed Central

    Popova, Blagovesta; Schubert, Steffen; Bulla, Ingo; Buchwald, Daniela; Kramer, Wilfried

    2015-01-01

    A major challenge in gene library generation is to guarantee a large functional size and diversity that significantly increases the chances of selecting different functional protein variants. The use of trinucleotide mixtures for controlled randomization results in superior library diversity and offers the ability to specify the type and distribution of the amino acids at each position. Here we describe the generation of a high diversity gene library using tHisF of the hyperthermophile Thermotoga maritima as a scaffold. Combining various rational criteria with contingency, we targeted 26 selected codons of the thisF gene sequence for randomization at a controlled level. We have developed a novel method of creating full-length gene libraries by combinatorial assembly of smaller sub-libraries. Full-length libraries of high diversity can easily be assembled on demand from smaller and much less diverse sub-libraries, which circumvents the notoriously troublesome long-term archiving and repeated propagation of high diversity ensembles of phages or plasmids. We developed a generally applicable software tool for sequence analysis of mutated gene sequences that provides efficient assistance for analysis of library diversity. Finally, the practical utility of the library was demonstrated in principle by assessing the conformational stability of library members and isolating protein variants with HisF activity from it. Our approach integrates a number of features of nucleic acid synthetic chemistry, biochemistry and molecular genetics into a coherent, flexible and robust method of combinatorial gene synthesis. PMID:26355961

  7. A functional network estimation method of resting-state fMRI using a hierarchical Markov random field.

    PubMed

    Liu, Wei; Awate, Suyash P; Anderson, Jeffrey S; Fletcher, P Thomas

    2014-10-15

    We propose a hierarchical Markov random field model for estimating both group and subject functional networks simultaneously. The model takes into account the within-subject spatial coherence as well as the between-subject consistency of the network label maps. The statistical dependency between group and subject networks acts as a regularization, which helps the network estimation on both layers. We use Gibbs sampling to approximate the posterior density of the network labels and Monte Carlo expectation maximization to estimate the model parameters. We compare our method with two alternative segmentation methods based on K-Means and normalized cuts, using synthetic and real fMRI data. The experimental results show that our proposed model is able to identify both group and subject functional networks with higher accuracy on synthetic data, more robustness, and inter-session consistency on the real data.
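The Gibbs-sampling ingredient of such models can be illustrated on a flat (non-hierarchical) label map: each site's label is resampled from its conditional under a Potts smoothness prior and a Gaussian data term. The grid, parameters, and two-label setup below are illustrative, not the paper's hierarchical group/subject model.

```python
import numpy as np

def gibbs_sweep(labels, data, means, beta=1.0, sigma=0.5, rng=None):
    """One Gibbs sweep over a 2D label map: each site's label is resampled
    from its conditional given its 4 neighbors (Potts prior, strength beta)
    and its observation (Gaussian likelihood with per-label means)."""
    rng = rng or np.random.default_rng(0)
    h, w = labels.shape
    k = len(means)
    for i in range(h):
        for j in range(w):
            log_p = np.empty(k)
            for lab in range(k):
                same = sum(labels[i2, j2] == lab
                           for i2, j2 in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                           if 0 <= i2 < h and 0 <= j2 < w)
                log_p[lab] = (beta * same
                              - (data[i, j] - means[lab]) ** 2 / (2 * sigma**2))
            p = np.exp(log_p - log_p.max())
            labels[i, j] = rng.choice(k, p=p / p.sum())
    return labels

# Noisy two-region image: left half near 0, right half near 1
rng = np.random.default_rng(3)
truth = np.zeros((12, 12), dtype=int)
truth[:, 6:] = 1
data = truth + rng.normal(0, 0.3, truth.shape)
labels = rng.integers(0, 2, truth.shape)
for _ in range(10):
    labels = gibbs_sweep(labels, data, means=[0.0, 1.0], rng=rng)
print((labels == truth).mean())  # high agreement after a few sweeps
```

In the paper this conditional also couples each subject's label map to the group map, so the same sweep mechanics regularize both layers at once.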

  8. A Functional Networks Estimation Method of Resting-State fMRI Using a Hierarchical Markov Random Field

    PubMed Central

    Liu, Wei; Awate, Suyash P.; Anderson, Jeffrey S.; Fletcher, P. Thomas

    2014-01-01

    We propose a hierarchical Markov random field model that estimates both group and subject functional networks simultaneously. The model takes into account the within-subject spatial coherence as well as the between-subject consistency of the network label maps. The statistical dependency between group and subject networks acts as a regularization, which helps the network estimation on both layers. We use Gibbs sampling to approximate the posterior density of the network labels and Monte Carlo expectation maximization to estimate the model parameters. We compare our method with two alternative segmentation methods based on K-Means and normalized cuts, using synthetic and real fMRI data. The experimental results show our proposed model is able to identify both group and subject functional networks with higher accuracy, more robustness, and inter-session consistency. PMID:24954282

  9. A Two-Stage Estimation Method for Random Coefficient Differential Equation Models with Application to Longitudinal HIV Dynamic Data

    PubMed Central

    Fang, Yun; Zhu, Li-Xing

    2011-01-01

    We propose a two-stage estimation method for random coefficient ordinary differential equation (ODE) models. A maximum pseudo-likelihood estimator (MPLE) is derived based on a mixed-effects modeling approach and its asymptotic properties for population parameters are established. The proposed method does not require repeatedly solving ODEs, and is computationally efficient although it does pay a price with the loss of some estimation efficiency. However, the method does offer an alternative approach when the exact likelihood approach fails due to model complexity and high-dimensional parameter space, and it can also serve as a method to obtain the starting estimates for more accurate estimation methods. In addition, the proposed method does not need to specify the initial values of state variables and preserves all the advantages of the mixed-effects modeling approach. The finite sample properties of the proposed estimator are studied via Monte Carlo simulations and the methodology is also illustrated with application to an AIDS clinical data set. PMID:22171150

  10. Fast method to compute scattering by a buried object under a randomly rough surface: PILE combined with FB-SA.

    PubMed

    Bourlier, Christophe; Kubické, Gildas; Déchamps, Nicolas

    2008-04-01

    A fast, exact numerical method based on the method of moments (MM) is developed to calculate the scattering from an object below a randomly rough surface. Déchamps et al. [J. Opt. Soc. Am. A23, 359 (2006)] have recently developed the PILE (propagation-inside-layer expansion) method for a stack of two one-dimensional rough interfaces separating homogeneous media. From the inversion of the impedance matrix by block (in which two impedance matrices of each interface and two coupling matrices are involved), this method allows one to calculate separately and exactly the multiple-scattering contributions inside the layer, in which the inverses of the impedance matrices of each interface are involved. Our purpose here is to apply this method to an object below a rough surface. In addition, to invert a matrix of large size, the forward-backward spectral acceleration (FB-SA) approach of complexity O(N) (N is the number of unknowns on the interface) proposed by Chou and Johnson [Radio Sci.33, 1277 (1998)] is applied. The new method, PILE combined with FB-SA, is tested on perfectly conducting circular and elliptic cylinders located below a dielectric rough interface obeying a Gaussian process with Gaussian and exponential height autocorrelation functions. PMID:18382488

  11. Unsteady Fast Random Particle Mesh method for efficient prediction of tonal and broadband noises of a centrifugal fan unit

    NASA Astrophysics Data System (ADS)

    Heo, Seung; Cheong, Cheolung; Kim, Taehoon

    2015-09-01

    In this study, an efficient numerical method is proposed for predicting the tonal and broadband noises of a centrifugal fan unit. The proposed method is based on Hybrid Computational Aero-Acoustic (H-CAA) techniques combined with the Unsteady Fast Random Particle Mesh (U-FRPM) method. The U-FRPM method is developed by extending the FRPM method proposed by Ewert et al. and is utilized to synthesize the turbulence flow field from unsteady RANS solutions. The H-CAA technique combined with the U-FRPM method is applied to predict broadband as well as tonal noises of a centrifugal fan unit in a household refrigerator. First, the unsteady flow field driven by the rotating fan is computed by solving the RANS equations with Computational Fluid Dynamics (CFD) techniques. Main source regions around the rotating fan are identified by examining the computed flow fields. Then, turbulence flow fields in the main source regions are synthesized by applying the U-FRPM method. The acoustic analogy is applied to model acoustic sources in the main source regions. Finally, the centrifugal fan noise is predicted by feeding the modeled acoustic sources into an acoustic solver based on the Boundary Element Method (BEM). The sound spectral levels predicted using the current numerical method show good agreement with the measured spectra at the Blade Pass Frequencies (BPFs) as well as in the high-frequency range. Moreover, the present method enables a quantitative assessment of the relative contributions of the identified source regions to the sound field by comparing the predicted sound pressure spectra due to the modeled sources.

  12. Random estimation of the values of seed oil of Cucurbita maxima by the refractive index method

    PubMed Central

    Saxena, R. B.

    2010-01-01

    Crude oil having lower iodine and free fatty acid values has Aamdosha properties. These properties are due to toxic and anti-toxic compounds, which can be harmful in particular diseases and may be unsaturated, saturated, open-chain, etc. Adulteration can act as a catalyst for the toxic effect in such diseases. The toxic properties of oils are removed by different ingredients and methods. C. maxima seed-tail (mst) oil is used in food and medicine. The present paper deals with the study of the oil by refractive index and equations. PMID:22131677

  13. Prediction of broadband ground-motion time histories: Hybrid low/high-frequency method with correlated random source parameters

    USGS Publications Warehouse

    Liu, P.; Archuleta, R.J.; Hartzell, S.H.

    2006-01-01

    We present a new method for calculating broadband time histories of ground motion based on a hybrid low-frequency/high-frequency approach with correlated source parameters. Using a finite-difference method we calculate low-frequency synthetics (< ∼1 Hz) in a 3D velocity structure. We also compute broadband synthetics in a 1D velocity model using a frequency-wavenumber method. The low frequencies from the 3D calculation are combined with the high frequencies from the 1D calculation by using matched filtering at a crossover frequency of 1 Hz. The source description, common to both the 1D and 3D synthetics, is based on correlated random distributions for the slip amplitude, rupture velocity, and rise time on the fault. This source description allows for the specification of source parameters independent of any a priori inversion results. In our broadband modeling we include correlation between slip amplitude, rupture velocity, and rise time, as suggested by dynamic fault modeling. The method of using correlated random source parameters is flexible and can be easily modified to adjust to our changing understanding of earthquake ruptures. A realistic attenuation model is common to both the 3D and 1D calculations that form the low- and high-frequency components of the broadband synthetics. The value of Q is a function of the local shear-wave velocity. To produce more accurate high-frequency amplitudes and durations, the 1D synthetics are corrected with a randomized, frequency-dependent radiation pattern. The 1D synthetics are further corrected for local site and nonlinear soil effects by using a 1D nonlinear propagation code and generic velocity structure appropriate for the site’s National Earthquake Hazards Reduction Program (NEHRP) site classification. The entire procedure is validated by comparison with the 1994 Northridge, California, strong ground motion data set. The bias and error found here for response spectral acceleration are similar to the best results
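The crossover step described above (low frequencies from the 3D run, high frequencies from the 1D run, joined at 1 Hz) can be illustrated with complementary spectral filters; the brick-wall split and test signals below are simplified stand-ins for the paper's matched filtering.

```python
import numpy as np

def combine_broadband(low_syn, high_syn, dt, fc=1.0):
    """Combine a low-frequency synthetic (valid below fc) with a broadband
    synthetic (used above fc) via complementary spectral filters."""
    n = len(low_syn)
    freqs = np.fft.rfftfreq(n, d=dt)
    lp = (freqs <= fc).astype(float)   # brick-wall split at the crossover
    return (np.fft.irfft(np.fft.rfft(low_syn) * lp, n)
            + np.fft.irfft(np.fft.rfft(high_syn) * (1 - lp), n))

dt = 0.01
t = np.arange(0, 20, dt)
low = np.sin(2 * np.pi * 0.5 * t)       # 0.5 Hz content, standing in for the 3D run
high = 0.3 * np.sin(2 * np.pi * 5 * t)  # 5 Hz content, standing in for the 1D run
out = combine_broadband(low, high + low, dt, fc=1.0)
# out keeps the 0.5 Hz content of `low` and the 5 Hz content of `high`,
# discarding the low-frequency part of the second input
```

A production implementation would use matched (phase-aligned, gently tapered) filters rather than this hard spectral cut, which is the point of the matched-filtering step in the paper.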

  14. A randomized and blinded comparison of three bleeding time techniques: the Ivy method, and the Simplate II method in two directions.

    PubMed

    Srámek, R; Srámek, A; Koster, T; Briët, E; Rosendaal, F R

    1992-05-01

    We compared the Ivy bleeding time method and two alternatives of the Simplate II method (incisions in horizontal and vertical direction) with each other, with regard to sensitivity, specificity, cost and the burden for the patient. In the aspirin study an aspirin-induced bleeding defect was used. Seventy-two healthy volunteers were randomized to receive either 500 mg acetylsalicylic acid (ASA) or a placebo. Double blinding was maintained throughout the study. In the anticoagulation study 62 patients participated, who received oral anticoagulants (OAC) for various reasons. All participants received two bleeding time methods. The burden of each method on the participants was assessed with a short standard questionnaire. The differences in sensitivity and specificity between the three methods proved minimal. The Ivy method was more often preferred by the participants than the Simplate methods. Since a choice cannot be made on the basis of sensitivity and specificity, we prefer the Ivy method because of its lower cost and lesser burden.

  15. Design-based mask metrology hot spot classification and recipe making through random pattern recognition method

    NASA Astrophysics Data System (ADS)

    Cui, Ying; Baik, Kiho; Gleason, Bob; Tavassoli, Malahat

    2006-10-01

    Design Based Metrology (DBM) requires an integrated process from design to metrology, and the first and key step of this integration is translating design CD lists into metrology measurement recipes. Design CD lists can come from different sources, such as design rule check, OPC validation, or yield analysis. These design CD lists cannot be used directly to create metrology tool recipes, since tool recipe makers usually require specific information about each CD site, or a measurement matrix. The manual process of identifying the measurement matrix for each design CD site can be very difficult, especially when the list runs to hundreds of sites or more. This paper addresses this issue and proposes a method to automate Design CD Identification (DCDI), using a new CD Pattern Vector (CDPV) library.
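
    The automation idea — reduce each CD site to a coarse pattern descriptor and look up measurement settings in a CDPV-style library — might be sketched as below. Every field name, descriptor choice, and parameter here is hypothetical; the paper does not publish its vector definition:

```python
def pattern_vector(site):
    """Reduce a CD site to a coarse descriptor tuple (fields hypothetical);
    CD values are bucketed to the nearest 10 nm so similar sites match."""
    return (site["layer"], site["orientation"], round(site["cd_nm"], -1))

def build_recipe(sites, library):
    """Attach measurement settings from the library to each design CD site;
    sites with no matching pattern are returned for manual review."""
    recipe, unmatched = [], []
    for site in sites:
        params = library.get(pattern_vector(site))
        if params:
            recipe.append({**site, **params})
        else:
            unmatched.append(site)
    return recipe, unmatched
```

    The point of the lookup is that a list of hundreds of CD sites collapses to a much smaller set of recurring pattern classes, each measured with one pre-validated setting.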

  16. Dynamic analysis method of offshore jack-up platforms in regular and random waves

    NASA Astrophysics Data System (ADS)

    Yu, Hao; Li, Xiaoyu; Yang, Shuguang

    2012-03-01

    A jack-up platform, owing to its particular structure, exhibits pronounced dynamic behavior under complex environmental loads in extreme conditions. In this paper, taking a simplified 3-D finite element dynamic model under extreme storm conditions as the research object, a transient dynamic analysis method was proposed that covers both regular and irregular wave loads. The steps of dynamic analysis under extreme conditions were illustrated with an applied case, and the dynamic amplification factor (DAF) was calculated for each response parameter: base shear, overturning moment and hull sway. Finally, the dynamic and static structural response results were compared and analyzed. The results indicated that static strength analysis of jack-up platforms is not sufficient under dynamic loads, including wave and current; further dynamic response analysis considering both computational efficiency and accuracy is necessary.
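
    The DAF reported for each response parameter is conventionally the ratio of peak dynamic to peak static response; a minimal sketch of that bookkeeping (the finite element responses themselves are assumed given):

```python
def dynamic_amplification_factor(dynamic_response, static_response):
    """DAF for one response parameter (e.g. base shear, overturning moment,
    hull sway): peak absolute dynamic response over peak static response."""
    peak_dyn = max(abs(v) for v in dynamic_response)
    peak_sta = max(abs(v) for v in static_response)
    return peak_dyn / peak_sta
```

    A DAF well above 1 for a parameter is exactly the situation the abstract flags, where static strength analysis alone understates the loading.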

  17. Methods for identifying SNP interactions: a review on variations of Logic Regression, Random Forest and Bayesian logistic regression.

    PubMed

    Chen, Carla Chia-Ming; Schwender, Holger; Keith, Jonathan; Nunkesser, Robin; Mengersen, Kerrie; Macrossan, Paula

    2011-01-01

    Due to advancements in computational ability, enhanced technology and a reduction in the price of genotyping, more data are being generated for understanding genetic associations with diseases and disorders. However, with the availability of large data sets comes the inherent challenges of new methods of statistical analysis and modeling. Considering a complex phenotype may be the effect of a combination of multiple loci, various statistical methods have been developed for identifying genetic epistasis effects. Among these methods, logic regression (LR) is an intriguing approach incorporating tree-like structures. Various methods have built on the original LR to improve different aspects of the model. In this study, we review four variations of LR, namely Logic Feature Selection, Monte Carlo Logic Regression, Genetic Programming for Association Studies, and Modified Logic Regression-Gene Expression Programming, and investigate the performance of each method using simulated and real genotype data. We contrast these with another tree-like approach, namely Random Forests, and a Bayesian logistic regression with stochastic search variable selection.

  18. The California Tri-pull Taping Method in the Treatment of Shoulder Subluxation After Stroke: A Randomized Clinical Trial

    PubMed Central

    Chatterjee, Subhasish; Hayner, Kate A; Arumugam, Narkeesh; Goyal, Manu; Midha, Divya; Arora, Ashima; Sharma, Sorabh; Kumar, Senthil P

    2016-01-01

    Background: Shoulder subluxation is a frequent occurrence in individuals following a stroke. Although various methods of treatment are available, none of them address all possible consequences of the subluxation: pain, limited range of motion, the subluxation itself, and decreased functional use of the arm. Aims: The purpose of this study was to evaluate the effectiveness of the California tri-pull taping (CTPT) method on shoulder subluxation, pain, active shoulder flexion, and upper limb functional recovery after stroke. Materials and Methods: This was a randomized controlled study of 30 participants. All participants received conventional neurorehabilitation 5 days a week over 6 weeks. Half of the participants also received the CTPT. Pre- and post-assessment scores were taken on all participants for the amount of shoulder subluxation, pain, active shoulder flexion, and functional recovery. Results: The CTPT method demonstrated a significant reduction of pain in the treatment group from baseline, a significant improvement in active shoulder flexion, and a significant improvement in proximal arm function as measured on the proximal subscale of the Fugl-Meyer upper extremity scale, but not on the distal or total Fugl-Meyer subscales. The change in shoulder subluxation was not statistically significant. Conclusions: The CTPT method is an effective treatment for the hemiplegic subluxed shoulder. PMID:27213141

  19. Estimation of resting-state functional connectivity using random subspace based partial correlation: a novel method for reducing global artifacts.

    PubMed

    Chen, Tianwen; Ryali, Srikanth; Qin, Shaozheng; Menon, Vinod

    2013-11-15

    Intrinsic functional connectivity analysis using resting-state functional magnetic resonance imaging (rsfMRI) has become a powerful tool for examining brain functional organization. Global artifacts such as physiological noise pose a significant problem in estimation of intrinsic functional connectivity. Here we develop and test a novel random subspace method for functional connectivity (RSMFC) that effectively removes global artifacts in rsfMRI data. RSMFC estimates the partial correlation between a seed region and each target brain voxel using multiple subsets of voxels sampled randomly across the whole brain. We evaluated RSMFC on both simulated and experimental rsfMRI data and compared its performance with standard methods that rely on global mean regression (GSReg) which are widely used to remove global artifacts. Using extensive simulations we demonstrate that RSMFC is effective in removing global artifacts in rsfMRI data. Critically, using a novel simulated dataset we demonstrate that, unlike GSReg, RSMFC does not artificially introduce anti-correlations between inherently uncorrelated networks, a result of paramount importance for reliably estimating functional connectivity. Furthermore, we show that the overall sensitivity, specificity and accuracy of RSMFC are superior to GSReg. Analysis of posterior cingulate cortex connectivity in experimental rsfMRI data from 22 healthy adults revealed strong functional connectivity in the default mode network, including more reliable identification of connectivity with left and right medial temporal lobe regions that were missed by GSReg. Notably, compared to GSReg, negative correlations with lateral fronto-parietal regions were significantly weaker in RSMFC. Our results suggest that RSMFC is an effective method for minimizing the effects of global artifacts and artificial negative correlations, while accurately recovering intrinsic functional brain networks.
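
    The core RSMFC idea — partial correlation between seed and target, averaged over randomly sampled control subsets — can be illustrated in miniature. This toy uses single-voxel control subsets and the closed-form first-order partial correlation, whereas the actual method regresses out multi-voxel subsets; everything here is a simplified sketch:

```python
import random

def pearson(x, y):
    """Plain Pearson correlation of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def partial_corr(x, y, z):
    """Correlation of x and y with one control series z partialled out."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / ((1 - rxz ** 2) * (1 - ryz ** 2)) ** 0.5

def rsm_connectivity(seed, target, voxels, n_subsets=50, rng=None):
    """Average partial correlations over randomly drawn control voxels
    (RSMFC proper draws multi-voxel subsets across the whole brain)."""
    rng = rng or random.Random(0)
    vals = [partial_corr(seed, target, rng.choice(voxels))
            for _ in range(n_subsets)]
    return sum(vals) / len(vals)
```

    Averaging over many random subsets is what stabilizes the estimate without forcing a single global regressor, which is how the method avoids the artificial anti-correlations that global mean regression introduces.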

  20. An efficient computational method for characterizing the effects of random surface errors on the average power pattern of reflectors

    NASA Technical Reports Server (NTRS)

    Rahmat-Samii, Y.

    1983-01-01

    Based on the works of Ruze (1966) and Vu (1969), a novel mathematical model has been developed to determine efficiently the average power pattern degradations caused by random surface errors. In this model, both nonuniform root mean square (rms) surface errors and nonuniform illumination functions are employed. In addition, the model incorporates the dependence on F/D in the construction of the solution. The mathematical foundation of the model rests on the assumption that in each prescribed annular region of the antenna, the geometrical rms surface value is known. It is shown that closed-form expressions can then be derived, which result in a very efficient computational method for the average power pattern. Detailed parametric studies are performed with these expressions to determine the effects of different random errors and illumination tapers on parameters such as gain loss and sidelobe levels. The results clearly demonstrate that as sidelobe levels decrease, their dependence on the surface rms/wavelength becomes much stronger and, for a specified tolerance level, a considerably smaller rms/wavelength is required to maintain the low sidelobes within the required bounds.
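
    The classical Ruze relation that this model builds on links the rms surface error ε and wavelength λ to the axial gain via G/G0 = exp(−(4πε/λ)²); the paper generalizes it to nonuniform errors and illumination, but the uniform case is a one-liner:

```python
import math

def ruze_gain_loss_db(rms_error, wavelength):
    """Axial gain loss in dB from a uniform random surface error,
    per Ruze: G/G0 = exp(-(4*pi*eps/lambda)**2)."""
    ratio = math.exp(-(4.0 * math.pi * rms_error / wavelength) ** 2)
    return -10.0 * math.log10(ratio)
```

    For example, an rms error of λ/16 already costs about 2.7 dB of gain, which is why the abstract stresses how quickly sidelobe and gain budgets tighten as rms/wavelength grows.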

  1. Preventing cognitive decline in older African Americans with mild cognitive impairment: design and methods of a randomized clinical trial.

    PubMed

    Rovner, Barry W; Casten, Robin J; Hegel, Mark T; Leiby, Benjamin E

    2012-07-01

    Mild Cognitive Impairment (MCI) affects 25% of older African Americans and predicts progression to Alzheimer's disease. An extensive epidemiologic literature suggests that cognitive, physical, and/or social activities may prevent cognitive decline. We describe the methods of a randomized clinical trial to test the efficacy of Behavior Activation to prevent cognitive decline in older African Americans with the amnestic multiple domain subtype of MCI. Community Health Workers deliver 6 initial in-home treatment sessions over 2-3 months and then 6 subsequent in-home booster sessions using language, materials, and concepts that are culturally relevant to older African Americans during this 24-month clinical trial. We are randomizing 200 subjects who are recruited from churches, senior centers, and medical clinics to Behavior Activation or Supportive Therapy, which controls for attention. The primary outcome is episodic memory as measured by the Hopkins Verbal Learning Test-Revised at baseline and at months 3, 12, 18, and 24. The secondary outcomes are general and domain-specific neuropsychological function, activities of daily living, depression, and quality of life. The negative results of recent clinical trials of drug treatments for MCI and Alzheimer's disease suggest that behavioral interventions may provide an alternative treatment approach to preserve cognition in an aging society.

  2. Using sexually transmitted infection biomarkers to validate reporting of sexual behavior within a randomized, experimental evaluation of interviewing methods.

    PubMed

    Hewett, Paul C; Mensch, Barbara S; Ribeiro, Manoel Carlos S de A; Jones, Heidi E; Lippman, Sheri A; Montgomery, Mark R; van de Wijgert, Janneke H H M

    2008-07-15

    This paper examines the reporting of sexual and other risk behaviors within a randomized experiment using a computerized versus face-to-face interview mode. Biomarkers for sexually transmitted infection (STI) were used to validate self-reported behavior by interview mode. As part of a parent study evaluating home versus clinic screening and diagnosis for STIs, 818 women aged 18-40 years were recruited in 2004 at or near a primary care clinic in São Paulo, Brazil, and were randomized to a face-to-face interview or audio computer-assisted self-interviewing. Ninety-six percent of participants were tested for chlamydia, gonorrhea, and trichomoniasis. Reporting of STI risk behavior was consistently higher with the computerized mode of interview. Stronger associations between risk behaviors and STI were found with the computerized interview after controlling for sociodemographic factors. These results were obtained by using logistic regression approaches, as well as statistical methods that address potential residual confounding and covariate endogeneity. Furthermore, STI-positive participants were more likely than STI-negative participants to underreport risk behavior in the face-to-face interview. Results strongly suggest that computerized interviewing provides more accurate and reliable behavioral data. The analyses also confirm the benefits of using data on prevalent STIs for externally validating behavioral reporting.

  3. Using Sexually Transmitted Infection Biomarkers to Validate Reporting of Sexual Behavior within a Randomized, Experimental Evaluation of Interviewing Methods

    PubMed Central

    Mensch, Barbara S.; de A. Ribeiro, Manoel Carlos S.; Jones, Heidi E.; Lippman, Sheri A.; Montgomery, Mark R.; van de Wijgert, Janneke H. H. M.

    2008-01-01

    This paper examines the reporting of sexual and other risk behaviors within a randomized experiment using a computerized versus face-to-face interview mode. Biomarkers for sexually transmitted infection (STI) were used to validate self-reported behavior by interview mode. As part of a parent study evaluating home versus clinic screening and diagnosis for STIs, 818 women aged 18-40 years were recruited in 2004 at or near a primary care clinic in São Paulo, Brazil, and were randomized to a face-to-face interview or audio computer-assisted self-interviewing. Ninety-six percent of participants were tested for chlamydia, gonorrhea, and trichomoniasis. Reporting of STI risk behavior was consistently higher with the computerized mode of interview. Stronger associations between risk behaviors and STI were found with the computerized interview after controlling for sociodemographic factors. These results were obtained by using logistic regression approaches, as well as statistical methods that address potential residual confounding and covariate endogeneity. Furthermore, STI-positive participants were more likely than STI-negative participants to underreport risk behavior in the face-to-face interview. Results strongly suggest that computerized interviewing provides more accurate and reliable behavioral data. The analyses also confirm the benefits of using data on prevalent STIs for externally validating behavioral reporting. PMID:18525081

  4. A randomized clinical trial of negative pressure ventilation in severe chronic obstructive pulmonary disease: design and methods.

    PubMed

    Shapiro, S H; Macklem, P T; Gray-Donald, K; Martin, J G; Ernst, P P; Wood-Dauphinee, S; Hutchinson, T A; Spitzer, W O

    1991-01-01

    This report documents the design and methods of a randomized clinical trial designed to test the effectiveness of home negative pressure ventilation in patients with severe chronic obstructive pulmonary disease. Active negative pressure ventilation was compared with a sham version of the treatment after a pre-trial assessment had indicated the feasibility of the latter. Over 1200 patients in the metropolitan Montreal area were screened. Of these, 348 patients were recruited to enter a 4-week stabilization period, and 184 were subsequently randomized to receive either active or sham negative pressure ventilation. A 5-day in-hospital period was used to train patients in ventilator use and obtain baseline measures of exercise capacity, lung function, respiratory symptoms, and quality of life. Home ventilation treatment took place during a following 12-week period. Respirator use was recorded both from patient logs and from concealed meters installed in the units. Patients received four home visits by physiotherapists during the 12-week period and returned for follow-up to the hospital 4 and 12 weeks post-discharge for reassessment.

  5. Preventing cognitive decline in older African Americans with mild cognitive impairment: design and methods of a randomized clinical trial.

    PubMed

    Rovner, Barry W; Casten, Robin J; Hegel, Mark T; Leiby, Benjamin E

    2012-07-01

    Mild Cognitive Impairment (MCI) affects 25% of older African Americans and predicts progression to Alzheimer's disease. An extensive epidemiologic literature suggests that cognitive, physical, and/or social activities may prevent cognitive decline. We describe the methods of a randomized clinical trial to test the efficacy of Behavior Activation to prevent cognitive decline in older African Americans with the amnestic multiple domain subtype of MCI. Community Health Workers deliver 6 initial in-home treatment sessions over 2-3 months and then 6 subsequent in-home booster sessions using language, materials, and concepts that are culturally relevant to older African Americans during this 24-month clinical trial. We are randomizing 200 subjects who are recruited from churches, senior centers, and medical clinics to Behavior Activation or Supportive Therapy, which controls for attention. The primary outcome is episodic memory as measured by the Hopkins Verbal Learning Test-Revised at baseline and at months 3, 12, 18, and 24. The secondary outcomes are general and domain-specific neuropsychological function, activities of daily living, depression, and quality of life. The negative results of recent clinical trials of drug treatments for MCI and Alzheimer's disease suggest that behavioral interventions may provide an alternative treatment approach to preserve cognition in an aging society. PMID:22406101

  6. Random walk of magnetic field lines in dynamical turbulence: A field line tracing method. I. Slab turbulence

    SciTech Connect

    Shalchi, A.

    2010-08-15

    The wandering of magnetic field lines is an important subject in theoretical physics. Results of field line random walk theories can be applied in plasma physics as well as astrophysics. Previous investigations are based on magnetostatic models. These models have been used in analytical work as well as in computer simulations to warrant mathematical and numerical tractability. Replacing the magnetostatic model with a dynamical turbulence model is a difficult task. In the present article, a field line tracing method is used to describe field line wandering in dynamical magnetic turbulence. As examples, different models are employed, namely, the plasma wave model, the damping model of dynamical turbulence, and the random sweeping model. It is demonstrated that the choice of turbulence model has a very strong influence on the field line structure. It seems that, once dynamical turbulence effects are included, Markovian diffusion can be found for forms of the wave spectrum other than those of the magnetostatic model. Therefore, the results of the present paper are useful for specifying turbulence models. As a further application we consider charged particle transport at early times.
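
    The field line tracing idea in its simplest slab form integrates dx/dz = δBx/B0 along the mean field; Markovian diffusion then shows up as a mean squared displacement growing linearly with z. The sketch below decorrelates the fluctuation at every step, which is a deliberate simplification and not the article's turbulence spectra:

```python
import random

def trace_field_lines(n_lines=200, n_steps=400, dz=1.0,
                      db_over_b0=0.1, seed=0):
    """Trace an ensemble of slab-model field lines, stepping
    dx = (delta_Bx / B0) * dz with a fresh random fluctuation each step.
    Returns the mean squared displacement after each step."""
    rng = random.Random(seed)
    xs = [0.0] * n_lines
    msd = []
    for _ in range(n_steps):
        xs = [x + db_over_b0 * rng.gauss(0.0, 1.0) * dz for x in xs]
        msd.append(sum(x * x for x in xs) / n_lines)
    return msd
```

    With fully decorrelated steps the walk is Markovian by construction, so msd(z) ≈ (δB/B0)² z; the interesting physics in the article is precisely how the different dynamical turbulence models bend this curve away from the straight line.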

  7. How to Do Random Allocation (Randomization)

    PubMed Central

    Shin, Wonshik

    2014-01-01

    Purpose To explain the concept and procedure of random allocation as used in a randomized controlled study. Methods We explain the general concept of random allocation and demonstrate how to perform the procedure easily and how to report it in a paper. PMID:24605197
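
    One standard way to perform the procedure the paper explains is permuted-block randomization, which guarantees the arms stay balanced throughout enrollment. The details below (block size, arm labels, seed) are generic illustration, not the paper's exact protocol:

```python
import random

def block_randomize(n_subjects, block_size=4, arms=("A", "B"), seed=42):
    """Permuted-block random allocation: each block contains equal numbers
    of every arm in shuffled order, so group sizes never drift far apart."""
    assert block_size % len(arms) == 0, "block must divide evenly among arms"
    rng = random.Random(seed)
    allocation = []
    while len(allocation) < n_subjects:
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)
        allocation.extend(block)
    return allocation[:n_subjects]
```

    In a real trial the seed and the allocation list are held by someone independent of recruitment, so that concealment is preserved while the sequence stays reproducible for reporting.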

  8. Broadband inversion of 1J(CC) responses in 1,n-ADEQUATE spectra.

    PubMed

    Reibarkh, Mikhail; Williamson, R Thomas; Martin, Gary E; Bermel, Wolfgang

    2013-11-01

    Establishing the carbon skeleton of a molecule greatly facilitates the process of structure elucidation, both manual and computer-assisted. Recent advances in the family of ADEQUATE experiments demonstrated their potential in this regard. 1,1-ADEQUATE, which provides direct (13)C-(13)C correlation via (1)J(CC), and 1,n-ADEQUATE, which typically yields (3)J(CC) and (1)J(CC) correlations, are more sensitive and more widely applicable experiments than INADEQUATE and PANACEA. A recently reported modified pulse sequence that semi-selectively inverts (1)J(CC) correlations in 1,n-ADEQUATE spectra provided a significant improvement, allowing (1)J(CC) and (n)J(CC) correlations to be discerned in the same spectrum. However, the reported experiment requires a careful matching of the amplitude transfer function with (1)J(CC) coupling constants in order to achieve the inversion, and even then some (1)J(CC) correlations could still have positive intensity due to the oscillatory nature of the transfer function. Both shortcomings limit the practicality of the method. We now report a new, dual-optimized inverted (1)J(CC) 1,n-ADEQUATE experiment, which provides more uniform inversion of (1)J(CC) correlations across the range of 29-82 Hz. Unlike the original method, the dual optimization experiment does not require fine-tuning for the molecule's (1)J(CC) coupling constant values. Even more usefully, the dual-optimized version provides up to two-fold improvement in signal-to-noise for some long-range correlations. Using modern, cryogenically-cooled probes, the experiment can be successfully applied to samples of ~1 mg under favorable circumstances. The improvements afforded by dual optimization inverted (1)J(CC) 1,n-ADEQUATE experiment make it a useful and practical tool for NMR structure elucidation and should facilitate the implementation and utilization of the experiment.

  9. Proteomics-based, multivariate random forest method for prediction of protein separation behavior during cation-exchange chromatography.

    PubMed

    Swanson, Ryan K; Xu, Ruo; Nettleton, Dan; Glatz, Charles E

    2012-08-01

    The most significant cost of recombinant protein production lies in the optimization of the downstream purification methods, mainly due to a lack of knowledge of the separation behavior of the host cell proteins (HCP). To reduce the effort required for purification process development, this work was aimed at modeling the separation behavior of a complex mixture of proteins in cation-exchange chromatography (CEX). With the emergence of molecular pharming as a viable option for the production of recombinant pharmaceutical proteins, the HCP mixture chosen was an extract of corn germ. Aqueous two phase system (ATPS) partitioning followed by two-dimensional electrophoresis (2DE) provided data on isoelectric point, molecular weight and surface hydrophobicity of the extract and step-elution fractions. A multivariate random forest (MVRF) method was then developed using the three characterization variables to predict the elution pattern of individual corn HCP. The MVRF method achieved an average root mean squared error (RMSE) value of 0.0406 (fraction of protein eluted in each CEX elution step) for all the proteins that were characterized, providing evidence for the effectiveness of both the characterization method and the analysis approach for protein purification applications.
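
    The reported figure of merit — RMSE over the fraction of protein eluted in each CEX step — is the standard definition below; this is a generic formula, not the authors' code:

```python
def rmse(predicted, observed):
    """Root mean squared error between predicted and observed
    elution fractions (each value is a fraction of protein eluted)."""
    n = len(predicted)
    return (sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n) ** 0.5
```

    An RMSE of 0.0406 on a quantity bounded between 0 and 1 means the model's predicted elution fraction is off by about four percentage points on average.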

  10. Proteomics-based, multivariate random forest method for prediction of protein separation behavior during cation-exchange chromatography.

    PubMed

    Swanson, Ryan K; Xu, Ruo; Nettleton, Dan; Glatz, Charles E

    2012-08-01

    The most significant cost of recombinant protein production lies in the optimization of the downstream purification methods, mainly due to a lack of knowledge of the separation behavior of the host cell proteins (HCP). To reduce the effort required for purification process development, this work was aimed at modeling the separation behavior of a complex mixture of proteins in cation-exchange chromatography (CEX). With the emergence of molecular pharming as a viable option for the production of recombinant pharmaceutical proteins, the HCP mixture chosen was an extract of corn germ. Aqueous two phase system (ATPS) partitioning followed by two-dimensional electrophoresis (2DE) provided data on isoelectric point, molecular weight and surface hydrophobicity of the extract and step-elution fractions. A multivariate random forest (MVRF) method was then developed using the three characterization variables to predict the elution pattern of individual corn HCP. The MVRF method achieved an average root mean squared error (RMSE) value of 0.0406 (fraction of protein eluted in each CEX elution step) for all the proteins that were characterized, providing evidence for the effectiveness of both the characterization method and the analysis approach for protein purification applications. PMID:22748375

  11. Effects of Continuous Use of Entonox in Comparison with Intermittent Method on Obstetric Outcomes: A Randomized Clinical Trial

    PubMed Central

    Baghani, Roya; Safiabadi Tali, Seid Hossein; Tabarraei, Yaser

    2014-01-01

    Background. Entonox (a 50:50 nitrous oxide/oxygen mixture), an inhalational gas for relieving labor pain, is commonly used intermittently; however, some women prefer to breathe it continuously through a face mask. We therefore compared the complications induced by the two methods to find out whether it is safe to permit mothers to use Entonox continuously. Patients and Methods. This randomized clinical trial was performed in Mobini Hospital, Sabzevar, Iran. 50 parturients used Entonox intermittently and 50 used it continuously during labor. Obstetrical outcomes were then analyzed in the two groups with SPSS 17 software, using t and chi-square tests, with P < 0.05 considered significant. Results. The mean duration of the second stage of labor showed no significant difference (P = 0.3). Perineal laceration was significantly less frequent in the continuous group (P = 0.04). Assisted vaginal birth was not significantly different (P = 0.4). Uterine atony showed no significant difference between the two groups (P = 0.2). Maternal collaboration in pushing and maternal satisfaction were significantly higher in the continuous group (P = 0.03 and P < 0.0001, respectively). Apgar scores of neonates at the first and fifth minutes were acceptable and not significantly different between the two groups (P = 0.3). Conclusions. Our study demonstrated that the continuous method is also safe, so it seems reasonable to let mothers choose their preferred method of Entonox usage. PMID:25525519

  12. A method for calculating spectral statistics based on random-matrix universality with an application to the three-point correlations of the Riemann zeros

    NASA Astrophysics Data System (ADS)

    Bogomolny, E.; Keating, J. P.

    2013-08-01

    We illustrate a general method for calculating spectral statistics that combines the universal (random matrix theory limit) and the non-universal (trace-formula-related) contributions by giving a heuristic derivation of the three-point correlation function for the zeros of the Riemann zeta function. The main idea is to construct a generalized Hermitian random matrix ensemble whose mean eigenvalue density coincides with a large but finite portion of the actual density of the spectrum or the Riemann zeros. Averaging the random matrix result over remaining oscillatory terms related, in the case of the zeta function, to small primes leads to a formula for the three-point correlation function that is in agreement with results from other heuristic methods. This provides support for these different methods. The advantage of the approach we set out here is that it incorporates the determinantal structure of the random matrix limit.
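
    The determinantal structure invoked at the end is a textbook fact of random matrix theory rather than this paper's specific construction: in the GUE universality class the n-point correlation functions of the (unfolded) eigenvalues are determinants of the sine kernel,

```latex
K(x,y) = \frac{\sin \pi (x-y)}{\pi (x-y)}, \qquad
R_n(x_1,\dots,x_n) = \det\bigl[\,K(x_i,x_j)\,\bigr]_{i,j=1}^{n},
```

    so the universal part of the three-point function is the n = 3 determinant, onto which the non-universal, trace-formula-related oscillatory corrections are grafted.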

  13. Wellness Coaching for People With Prediabetes: A Randomized Encouragement Trial to Evaluate Outreach Methods at Kaiser Permanente, Northern California, 2013

    PubMed Central

    Xiao, Hong; Adams, Sara R.; Goler, Nancy; Sanna, Rashel S.; Boccio, Mindy; Bellamy, David J.; Brown, Susan D.; Neugebauer, Romain S.; Ferrara, Assiamira

    2015-01-01

    Introduction Health coaching can improve lifestyle behaviors known to prevent or manage chronic conditions. Little is known about effective ways to encourage health and wellness coaching among people who might benefit. The purpose of this randomized encouragement trial was to assess the relative success of 3 outreach methods (secured email message, telephone message, and mailed letter) on the use of wellness coaching by people with prediabetes. Methods A total of 14,584 Kaiser Permanente Northern California (KPNC) patients with diagnosed prediabetes (fasting plasma glucose, 110–125 mg/dL) were randomly assigned to be contacted via 1 of 4 intervention arms from January through May 2013. The uptake rate (making an appointment at the Wellness Coaching Center [WCC]) was assessed, and the association between uptake rate and patient characteristics was examined via multivariable logistic regression. Results The overall uptake rate across intervention arms was 1.9%. Secured email message had the highest uptake rate (3.0%), followed by letters and telephone messages (P < .05 for all pairwise comparisons). No participants in the usual-care arm (ie, no outreach) made an appointment with the WCC. For each year of increased age, the estimated odds of the uptake increased by 1.02 (odds ratio [OR] = 1.02; 95% CI, 1.01–1.04). Women were nearly twice as likely to make an appointment at the WCC as men (OR = 1.87; 95% CI, 1.40–2.51). Conclusion Our results suggest that outreach can encourage KPNC members with prediabetes to participate in the WCC. Future research should focus on increasing participation rates in health coaching among patients who may benefit. PMID:26605707

  14. A randomized controlled trial of venlafaxine XR for major depressive disorder after spinal cord injury: Methods and lessons learned

    PubMed Central

    Bombardier, Charles H.; Fann, Jesse R.; Wilson, Catherine S.; Heinemann, Allen W.; Richards, J. Scott; Warren, Ann Marie; Brooks, Larry; Warms, Catherine A.; Temkin, Nancy R.; Tate, Denise G.

    2014-01-01

    Context/objective We describe the rationale, design, methods, and lessons learned conducting a treatment trial for major depressive disorder (MDD) or dysthymia in people with spinal cord injury (SCI). Design A multi-site, double-blind, randomized (1:1) placebo controlled trial of venlafaxine XR for MDD or dysthymia. Subjects were block randomized and stratified by site, lifetime history of substance dependence, and prior history of MDD. Setting Six SCI centers throughout the United States. Participants Across participating centers, 2536 subjects were screened and 133 were enrolled into the trial. Subjects were 18–64 years old and at least 1 month post-SCI. Interventions Twelve-week trial of venlafaxine XR versus placebo using a flexible titration schedule. Outcome measures The primary outcome was improvement in depression severity at 12 weeks. The secondary outcome was improvement in pain. Results This article includes study methods, modifications prompted by a formative review process, preliminary data on the study sample and lessons learned. We describe common methodological and operational challenges conducting multi-site trials and how we addressed them. Challenges included study organization and decision making, staff training, obtaining human subjects approval, standardization of measurement and treatment, data and safety monitoring, subject screening and recruitment, unblinding and continuity of care, database management, and data analysis. Conclusions The methodological and operational challenges we faced and the lessons we learned may provide useful information for researchers who aim to conduct clinical trials, especially in the area of medical treatment of depression in people with SCI. PMID:24090228

  15. Adequate iron stores and the 'Nil nocere' principle.

    PubMed

    Hollán, S; Johansen, K S

    1993-01-01

    There is a need to change the policy of unselective iron supplementation during periods of life with physiologically increased cell proliferation. Levels of iron stores to be regarded as adequate during infancy and pregnancy are still not well established. Recent data support the view that it is not justified to interfere with physiological adaptations developed through millions of years by sophisticated and precisely coordinated regulation of iron absorption, utilization and storage. Recent data suggest that the chelatable intracellular iron pool regulates the expression of proteins with central importance in cellular iron metabolism (TfR, ferritin, and erythroid 5-aminolevulinic synthetase) in a coordinately controlled way through an iron dependent cytosolic mRNA binding protein, the iron regulating factor (IRF). This factor is simultaneously a sensor and a regulator of iron levels. The reduction of ferritin levels during highly increased cell proliferation is a mirror of the increased density of TfRs. An abundance of data support the vigorous competition for growth-essential iron between microbial pathogens and their vertebrate hosts. The highly coordinated regulation of iron metabolism is probably crucial in achieving a balance between the blockade of readily accessible iron to invading organisms and yet providing sufficient iron for the immune system of the host. The most evident adverse clinical effects of excess iron have been observed in immunodeficient patients in tropical countries and in AIDS patients. Excess iron also increases the risk of initiation and promotion of malignant processes by iron binding to DNA and by the iron-catalysed release of free radicals. Oxygen radicals were shown to damage critical biomolecules leading, apart from cancer, to a variety of human disease states, including inflammation and atherosclerosis. They are also involved in processes of aging and thrombosis. Recent clinical trials have suggested that the use of iron

  16. [Abdominal cure procedures. Adequate use of Nobecutan Spray].

    PubMed

    López Soto, Rosa María

    2009-12-01

    Open abdominal wounds complicated by infection and/or at risk of eventration tend to become chronic and usually require frequent, prolonged care. Habitual changing of bandages develops into one of the clearest risk factors for deterioration of perilesional cutaneous integrity. This brings with it new complications which draw out the evolution of the process, provoking an important deterioration in quality of life for the person who suffers it and a considerable increase in health costs. What is needed is a product and a procedure which control the risk of irritation, protect the skin, favor the patient's comfort, and shorten treatment requirements while lowering health care expenses. This report invites medical personnel to think seriously about the scientific rationale and treatment practice behind why and how to apply Nobecutan adequately, and concludes by stating the benefits of its adequate use. The objective of this report is to guarantee the adequate use of this product in the treatment of complicated abdominal wounds, where it responds to the needs present in these clinical cases, favoring apt skin care, isolation, and protection while facilitating the placement and stability of the dressings and bandages used to cure the wounds. For this to happen, correct use of the product is essential: medical personnel must pay attention to the precautions and recommendations for proper application. The author's experience in the habitual handling of this product over several years, as part of standardized cure procedures for these wounds, corroborates its usefulness; the author considers it highly effective and simple to apply, and its use provides quality care while optimizing the resources employed.

  17. A tilt-pair based method for assigning the projection directions of randomly oriented single-particle molecules.

    PubMed

    Ueno, Yutaka; Mine, Shouhei; Kawasaki, Kazunori

    2015-04-01

    In this article, we describe an improved method to assign the projection angle for averaged images using tilt-pair images for three-dimensional reconstructions from randomly oriented single-particle molecular images. Our study addressed the so-called 'initial volume problem' in single-particle reconstruction, which involves estimation of the projection angles of the particle images. The projected images of the particles in different tilt observations were mixed and averaged for the characteristic views. After ranking these group average images in terms of reliable tilt angle information, mutual tilt angles between images are assigned from the constituent tilt-pair information. Then, multiple conical tilt series are made and merged to construct a network graph of the particle images in terms of projection angles, which are optimized for the three-dimensional reconstruction. We developed the method with images of a synthetic object and applied it to a single-particle image data set of the purified deacetylase from archaea. With the introduction of low-angle tilt observations to minimize unfavorable imaging conditions due to tilting, the results demonstrated reasonable reconstruction models without imposing symmetry on the structure. This method also guides its users to discriminate particle images of different conformational states of the molecule.

  18. Prevention of gestational diabetes through lifestyle intervention: study design and methods of a Finnish randomized controlled multicenter trial (RADIEL)

    PubMed Central

    2014-01-01

    Background Maternal overweight, obesity and consequently the incidence of gestational diabetes are increasing rapidly worldwide. The objective of the study was to assess the efficacy and cost-effectiveness of a combined diet and physical activity intervention implemented before, during and after pregnancy in a primary health care setting for preventing gestational diabetes, later type 2 diabetes and other metabolic consequences. Methods RADIEL is a randomized controlled multi-center intervention trial in women at high risk for diabetes (a previous history of gestational diabetes or prepregnancy BMI ≥30 kg/m2). Participants planning pregnancy or in the first half of pregnancy were parallel-group randomized into an intervention arm which received lifestyle counseling and a control arm which received usual care given at their local antenatal clinics. All participants visited a study nurse every three months before and during pregnancy, and at 6 weeks, 6 and 12 months postpartum. Measurements and laboratory tests were performed on all participants with special focus on dietary and exercise habits and metabolic markers. Of the 728 women [mean age 32.5 years (SD 4.7); median parity 1 (range 0-9)] considered to be eligible for the study 235 were non-pregnant and 493 pregnant [mean gestational age 13 (range 6 to 18) weeks] at the time of enrollment. The proportion of nulliparous women was 29.8% (n = 217). Out of all participants, 79.6% of the non-pregnant and 40.4% of the pregnant women had previous gestational diabetes and 20.4% of the non-pregnant and 59.6% of the pregnant women were recruited because of a prepregnancy BMI ≥30 kg/m2. Mean BMI at first visit was 30.1 kg/m2 (SD 6.2) in the non-pregnant and 32.7 kg/m2 (SD 5.6) in the pregnant group. Discussion To our knowledge, this is the first randomized lifestyle intervention trial, which includes, besides the pregnancy period, both the prepregnancy and the postpartum period. This study design also

  19. Quantifying dose to the reconstructed breast: Can we adequately treat?

    SciTech Connect

    Chung, Eugene; Marsh, Robin B.; Griffith, Kent A.; Moran, Jean M.; Pierce, Lori J.

    2013-04-01

    To evaluate how immediate reconstruction (IR) impacts postmastectomy radiotherapy (PMRT) dose distributions to the reconstructed breast (RB), internal mammary nodes (IMN), heart, and lungs using quantifiable dosimetric end points. 3D conformal plans were developed for 20 IR patients, 10 autologous reconstruction (AR), and 10 expander-implant (EI) reconstruction. For each reconstruction type, 5 right- and 5 left-sided reconstructions were selected. Two plans were created for each patient, 1 with RB coverage alone and 1 with RB + IMN coverage. Left-sided EI plans without IMN coverage had higher heart Dmean than left-sided AR plans (2.97 and 0.84 Gy, p = 0.03). Otherwise, results did not vary by reconstruction type and all remaining metrics were evaluated using a combined AR and EI dataset. RB coverage was adequate regardless of laterality or IMN coverage (Dmean 50.61 Gy, D95 45.76 Gy). When included, IMN Dmean and D95 were 49.57 and 40.96 Gy, respectively. Mean heart doses increased with left-sided treatment plans and IMN inclusion. Right-sided treatment plans and IMN inclusion increased mean lung V20. Using standard field arrangements and 3D planning, we observed excellent coverage of the RB and IMN, regardless of laterality or reconstruction type. Our results demonstrate that adequate doses can be delivered to the RB with or without IMN coverage.

  20. A comparative in-vivo evaluation of the alignment efficiency of 5 ligation methods: A prospective randomized clinical trial

    PubMed Central

    Reddy, Vijaya Bhaskara; Kumar, Talapaneni Ashok; Prasad, Mandava; Nuvvula, Sivakumar; Patil, Rajedra Goud; Reddy, Praveen Kumar

    2014-01-01

    Objectives: To conduct a prospective randomized study comparing the efficiency of 5 different ligation systems (ELL: elastomeric ligature; SSL: stainless steel ligature; LL: Leone slide ligature; PSL: passive self-ligation; ASL: active self-ligation) over the duration of mandibular crowding alleviation. Materials and Methods: Fifty consecutive patients (54.2% male, 45.8% female; mean age: 16.69 years) satisfying the inclusion criteria were randomly allocated to 5 ligation groups with an equal sample size of 10 per group. The 5 groups received treatment with the 0.022-inch MBT pre-adjusted edgewise technique (ELL: Gemini 3M Unitek, SSL: Gemini 3M Unitek, LL: Gemini 3M Unitek, PSL: SmartClip 3M Unitek and ASL: In-Ovation R Euro GAC International). The models and cephalograms were evaluated for anterior arch alignment, extraction space closure, and lower incisal inclinations at pre-treatment (T1) and at the end of initial alignment (T2). Analysis of variance (ANOVA) and post-hoc tests were used for data analysis. Results: Forty-eight participants completed the study, and self-ligation (SL) systems showed a significant difference over conventional ligation (CL) groups in time to alignment, passive space closure, and incisal inclination. Multiple regression showed a reduction of 5.28 days in time to alignment by changing the ligation group in the order of the ELL to the ASL group, and a 1 mm increase in initial irregularity index increases time to alignment by 11.68 days. Conclusion: Self-ligation brackets were more efficient than conventional ligation brackets during initial leveling and alignment. PMID:24966742

  1. Fundamental Vibration Frequency and Damping Estimation: A Comparison Using the Random Decrement Method, the Empirical Mode Decomposition, and the HV Spectral Ratio Method for Local Site Characterization

    NASA Astrophysics Data System (ADS)

    Huerta-Lopez, C. I.; Upegui Botero, F. M.; Pulliam, J.; Willemann, R. J.; Pasyanos, M.; Schmitz, M.; Rojas Mercedes, N.; Louie, J. N.; Moschetti, M. P.; Martinez-Cruzado, J. A.; Suárez, L.; Huerfano Moreno, V.; Polanco, E.

    2013-12-01

    Site characterization in civil engineering requires knowledge of at least two dynamic properties of soil systems: (i) the dominant vibration frequency, and (ii) damping. As part of an effort to develop understanding of the principles of earthquake hazard analysis, particularly site characterization techniques using non-invasive/non-destructive seismic methods, a workshop (Pan-American Advanced Studies Institute: New Frontiers in Geophysical Research: Bringing New Tools and Techniques to Bear on Earthquake Hazard Analysis and Mitigation) was conducted during July 15-25, 2013, in Santo Domingo, Dominican Republic, by the alliance of the Pan-American Advanced Studies Institute (PASI) and the Incorporated Research Institutions for Seismology (IRIS), jointly supported by the Department of Energy (DOE) and the National Science Foundation (NSF). Preliminary results of the site characterization, in terms of fundamental vibration frequency and damping, are presented here from data collected during the workshop. Three different methods were used for these estimations and then compared, to assess the stability of the estimates as well as the relative advantages and disadvantages of the methodologies. The methods used were: (i) the Random Decrement Method (RDM), to estimate fundamental vibration frequency and damping simultaneously; (ii) Empirical Mode Decomposition (EMD), to estimate the vibration modes; and (iii) the Horizontal-to-Vertical Spectral Ratio (HVSR), to estimate the fundamental vibration frequency. In all cases both ambient vibration and induced vibration were used.
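
    The Random Decrement Method named in (i) can be sketched in a few lines: segments of the vibration record that start at a common trigger level are averaged, so the random excitation cancels and an estimate of the free-decay response remains, from which the vibration frequency (and, via the decay envelope, damping) can be read. The following Python sketch uses a synthetic single-degree-of-freedom response; the trigger level, record length, and AR(2) surrogate are illustrative assumptions, not the workshop's processing chain.

```python
import numpy as np

def random_decrement_signature(x, trigger, seg_len):
    """Average all segments of x that begin where x crosses `trigger`
    upward; the random part of the excitation averages out, leaving an
    estimate of the free-decay response of the system."""
    starts = np.where((x[:-1] < trigger) & (x[1:] >= trigger))[0] + 1
    starts = starts[starts + seg_len <= len(x)]
    return np.stack([x[s:s + seg_len] for s in starts]).mean(axis=0)

# Synthetic "ambient vibration": an AR(2) process whose impulse response
# is a damped sinusoid (f0 = 5 Hz, damping ratio zeta = 2%), noise-driven.
rng = np.random.default_rng(0)
fs, f0, zeta = 200.0, 5.0, 0.02
dt, wn = 1.0 / fs, 2.0 * np.pi * f0
wd = wn * np.sqrt(1.0 - zeta ** 2)
a1 = 2.0 * np.exp(-zeta * wn * dt) * np.cos(wd * dt)
a2 = -np.exp(-2.0 * zeta * wn * dt)
x = np.zeros(120_000)
noise = rng.standard_normal(x.size)
for n in range(2, x.size):
    x[n] = a1 * x[n - 1] + a2 * x[n - 2] + noise[n]

# 2-second signature; count upward zero crossings to estimate frequency
sig = random_decrement_signature(x, trigger=x.std(), seg_len=400)
upward = np.sum((sig[:-1] < 0.0) & (sig[1:] >= 0.0))
f_est = upward / (len(sig) / fs)  # close to f0 for this synthetic record
```

    Damping would follow from the decay of the signature's envelope (for instance via the logarithmic decrement), which is what makes the RDM attractive for estimating both quantities simultaneously.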

  2. A new analgesic method, two-minute sciatic nerve press, for immediate pain relief: a randomized trial

    PubMed Central

    He, Jiman; Wu, Bin; Jiang, Xianrong; Zhang, Fenglin; Zhao, Tao; Zhang, Wenlon

    2008-01-01

    Background Current analgesics have drawbacks such as delays in acquisition, lag-times for effect, and side effects. We recently presented a preliminary report of a new analgesic method involving a two-minute sciatic nerve press, which resulted in immediate short-term relief of pain associated with dental and renal diseases. The present study investigated whether this technique was effective for pain associated with other disease types, and whether the relief was effective for up to one hour. Methods This randomized, placebo-controlled, parallel-group trial was conducted in four hospitals in Anhui Province, China. Patients with pain were sequentially recruited by participating physicians during clinic visits, and 135 patients aged 15–80 years were enrolled. Dental disease patients included those with acute pulpitis and periapical abscesses. Renal disease patients included those with kidney infections and/or stones. Tumor patients included those with nose, breast, stomach and liver cancers, while Emergency Room patients had various pathologies. Patients were randomly assigned to receive a "sciatic nerve press" in which pressure was applied simultaneously to the sciatic nerves at the back of both thighs, or a "placebo press" in which pressure was applied to a parallel region on the front of the thighs. Each fist applied a pressure of 11–20 kg for 2 minutes. Patients rated their level of pain before and after the procedure. Results The "sciatic nerve press" produced immediate relief of pain in all patient groups. Emergency patients reported a 43.5% reduction in pain (p < 0.001). Significant pain relief for dental, renal and tumor patients lasted for 60 minutes (p < 0.001). Peak pain relief occurred at the 10th–20th minutes, and the relief had decreased by 47% at the 60th minute. Conclusion Two minutes of pressure on both sciatic nerves produced immediate significant short-term conduction analgesia. This technique is a convenient, safe and powerful method for

  3. Randomized Comparison of Two Vaginal Self-Sampling Methods for Human Papillomavirus Detection: Dry Swab versus FTA Cartridge

    PubMed Central

    Catarino, Rosa; Vassilakos, Pierre; Bilancioni, Aline; Vanden Eynde, Mathieu; Meyer-Hamme, Ulrike; Menoud, Pierre-Alain; Guerry, Frédéric; Petignat, Patrick

    2015-01-01

    Background Human papillomavirus (HPV) self-sampling (self-HPV) is valuable in cervical cancer screening. HPV testing is usually performed on physician-collected cervical smears stored in liquid-based medium. Dry filters and swabs are an alternative. We evaluated the adequacy of self-HPV using two dry storage and transport devices, the FTA cartridge and swab. Methods A total of 130 women performed two consecutive self-HPV samples. Randomization determined which of the two tests was performed first: self-HPV using dry swabs (s-DRY) or vaginal specimen collection using a cytobrush applied to an FTA cartridge (s-FTA). After self-HPV, a physician collected a cervical sample using liquid-based medium (Dr-WET). HPV types were identified by real-time PCR. Agreement between collection methods was measured using the kappa statistic. Results HPV prevalence for high-risk types was 62.3% (95%CI: 53.7–70.2) detected by s-DRY, 56.2% (95%CI: 47.6–64.4) by Dr-WET, and 54.6% (95%CI: 46.1–62.9) by s-FTA. There was overall agreement of 70.8% between s-FTA and s-DRY samples (kappa = 0.34), and of 82.3% between self-HPV and Dr-WET samples (kappa = 0.56). Detection sensitivities for low-grade squamous intraepithelial lesion or worse (LSIL+) were: 64.0% (95%CI: 44.5–79.8) for s-FTA, 84.6% (95%CI: 66.5–93.9) for s-DRY, and 76.9% (95%CI: 58.0–89.0) for Dr-WET. The preferred self-collection method among patients was s-DRY (40.8% vs. 15.4%). Regarding costs, the FTA card was five times more expensive than the swab (~5 US dollars (USD) per card vs. ~1 USD per swab). Conclusion Self-HPV using dry swabs is sensitive for detecting LSIL+ and less expensive than s-FTA. Trial Registration International Standard Randomized Controlled Trial Number (ISRCTN): 43310942 PMID:26630353
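
    The kappa statistic used above to measure agreement between collection methods corrects the observed proportion of agreement for the agreement expected by chance. A minimal sketch in Python, using hypothetical paired binary HPV results rather than the study's data:

```python
def cohens_kappa(pairs):
    """Cohen's kappa for paired binary results (1 = HPV-positive):
    observed agreement corrected for chance agreement."""
    n = len(pairs)
    p_obs = sum(a == b for a, b in pairs) / n
    pos_a = sum(a for a, _ in pairs) / n   # positivity rate, method A
    pos_b = sum(b for _, b in pairs) / n   # positivity rate, method B
    p_chance = pos_a * pos_b + (1 - pos_a) * (1 - pos_b)
    return (p_obs - p_chance) / (1 - p_chance)

# Hypothetical paired results for 10 women: (s-DRY, s-FTA)
pairs = [(1, 1), (1, 0), (0, 0), (1, 1), (0, 0),
         (1, 1), (0, 1), (1, 1), (0, 0), (1, 0)]
kappa = cohens_kappa(pairs)   # about 0.40: moderate agreement
```

    Here 7 of 10 pairs agree (p_obs = 0.70) while chance alone would produce agreement half the time (p_chance = 0.50), giving kappa = 0.40, in the same range as the s-FTA vs. s-DRY figure reported above.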

  4. Development of a novel efficient method to construct an adenovirus library displaying random peptides on the fiber knob

    PubMed Central

    Yamamoto, Yuki; Goto, Naoko; Miura, Kazuki; Narumi, Kenta; Ohnami, Shumpei; Uchida, Hiroaki; Miura, Yoshiaki; Yamamoto, Masato; Aoki, Kazunori

    2014-01-01

    Redirection of adenovirus vectors by engineering the capsid-coding region has shown limited success because proper targeting ligands are generally unknown. To overcome this limitation, we constructed an adenovirus library displaying random peptides on the fiber knob, and its screening led to successful selections of several particular targeted vectors. In the previous library construction method, the full length of an adenoviral genome was generated by a Cre-lox mediated in vitro recombination between a fiber-modified plasmid library and the enzyme-digested adenoviral DNA/terminal protein complex (DNA-TPC) before transfection to the producer cells. In this system, the procedures were complicated and time-consuming, and approximately 30% of the vectors in the library were defective with no displaying peptide. These may hinder further extensive exploration of cancer-targeting vectors. To resolve these problems, in this study, we developed a novel method with the transfection of a fiber-modified plasmid library and a fiberless adenoviral DNA-TPC in Cre-expressing 293 cells. The use of in-cell Cre recombination and fiberless adenovirus greatly simplified the library-making steps. The fiberless adenovirus was useful in suppressing the expansion of unnecessary adenovirus vectors. In addition, the complexity of the library was more than a 10^4 level in one well in a 6-well dish, which was 10-fold higher than that of the original method. The results demonstrated that this novel method is useful in producing a high quality live adenovirus library, which could facilitate the development of targeted adenovirus vectors for a variety of applications in medicine. PMID:24380399

  5. Development of a novel efficient method to construct an adenovirus library displaying random peptides on the fiber knob.

    PubMed

    Yamamoto, Yuki; Goto, Naoko; Miura, Kazuki; Narumi, Kenta; Ohnami, Shumpei; Uchida, Hiroaki; Miura, Yoshiaki; Yamamoto, Masato; Aoki, Kazunori

    2014-03-01

    Redirection of adenovirus vectors by engineering the capsid-coding region has shown limited success because proper targeting ligands are generally unknown. To overcome this limitation, we constructed an adenovirus library displaying random peptides on the fiber knob, and its screening led to successful selections of several particular targeted vectors. In the previous library construction method, the full length of an adenoviral genome was generated by a Cre-lox mediated in vitro recombination between a fiber-modified plasmid library and the enzyme-digested adenoviral DNA/terminal protein complex (DNA-TPC) before transfection to the producer cells. In this system, the procedures were complicated and time-consuming, and approximately 30% of the vectors in the library were defective with no displaying peptide. These may hinder further extensive exploration of cancer-targeting vectors. To resolve these problems, in this study, we developed a novel method with the transfection of a fiber-modified plasmid library and a fiberless adenoviral DNA-TPC in Cre-expressing 293 cells. The use of in-cell Cre recombination and fiberless adenovirus greatly simplified the library-making steps. The fiberless adenovirus was useful in suppressing the expansion of unnecessary adenovirus vectors. In addition, the complexity of the library was more than a 10(4) level in one well in a 6-well dish, which was 10-fold higher than that of the original method. The results demonstrated that this novel method is useful in producing a high quality live adenovirus library, which could facilitate the development of targeted adenovirus vectors for a variety of applications in medicine. PMID:24380399

  6. Choices for achieving adequate dietary calcium with a vegetarian diet.

    PubMed

    Weaver, C M; Proulx, W R; Heaney, R

    1999-09-01

    To achieve adequate dietary calcium intake, several choices are available that accommodate a variety of lifestyles and tastes. Liberal consumption of dairy products in the diet is the approach of most Americans. Some plants provide absorbable calcium, but the quantity of vegetables required to reach sufficient calcium intake makes an exclusively plant-based diet impractical for most individuals unless fortified foods or supplements are included. Also, dietary constituents that decrease calcium retention, such as salt, protein, and caffeine, can be high in the vegetarian diet. Although it is possible to obtain calcium balance from a plant-based diet in a Western lifestyle, it may be more convenient to achieve calcium balance by increasing calcium consumption than by limiting other dietary factors.

  7. Genetic Modification of Preimplantation Embryos: Toward Adequate Human Research Policies

    PubMed Central

    Dresser, Rebecca

    2004-01-01

    Citing advances in transgenic animal research and setbacks in human trials of somatic cell genetic interventions, some scientists and others want to begin planning for research involving the genetic modification of human embryos. Because this form of genetic modification could affect later-born children and their offspring, the protection of human subjects should be a priority in decisions about whether to proceed with such research. Yet because of gaps in existing federal policies, embryo modification proposals might not receive adequate scientific and ethical scrutiny. This article describes current policy shortcomings and recommends policy actions designed to ensure that the investigational genetic modification of embryos meets accepted standards for research on human subjects. PMID:15016248

  8. Leveraging Random Number Generation for Mastery of Learning in Teaching Quantitative Research Courses via an E-Learning Method

    ERIC Educational Resources Information Center

    Boonsathorn, Wasita; Charoen, Danuvasin; Dryver, Arthur L.

    2014-01-01

    E-Learning brings access to a powerful but often overlooked teaching tool: random number generation. Using random number generation, a practically infinite number of quantitative problem-solution sets can be created. In addition, within the e-learning context, in the spirit of the mastery of learning, it is possible to assign online quantitative…
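
    The idea of generating a practically infinite number of problem-solution sets from random numbers can be sketched in a few lines of Python. This is a hypothetical exercise generator, not the authors' system; seeding the generator makes each student's problem distinct yet reproducible for grading.

```python
import random

def make_problem(seed):
    """One randomized problem-solution pair: a small two-group
    mean-comparison exercise together with its worked answer."""
    rng = random.Random(seed)
    group_a = [rng.randint(50, 100) for _ in range(5)]
    group_b = [rng.randint(50, 100) for _ in range(5)]
    mean_a, mean_b = sum(group_a) / 5, sum(group_b) / 5
    question = (f"Group A scored {group_a} and Group B scored {group_b}. "
                f"Which group has the higher mean?")
    answer = "A" if mean_a > mean_b else "B"  # ties resolved as "B" for brevity
    return question, answer

# A distinct but statistically equivalent problem per student or attempt
problems = [make_problem(seed) for seed in range(3)]
```

    Because each seed deterministically reproduces its problem, a mastery-of-learning course can offer unlimited retakes while the grader regenerates the correct answer on demand.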

  9. Eliminating bias in randomized controlled trials: importance of allocation concealment and masking.

    PubMed

    Viera, Anthony J; Bangdiwala, Shrikant I

    2007-02-01

    Randomization in randomized controlled trials involves more than generation of a random sequence by which to assign subjects. For randomization to be successfully implemented, the randomization sequence must be adequately protected (concealed) so that investigators, involved health care providers, and subjects are not aware of the upcoming assignment. The absence of adequate allocation concealment can lead to selection bias, one of the very problems that randomization was supposed to eliminate. Authors of reports of randomized trials should provide enough details on how allocation concealment was achieved so the reader can determine the likelihood of success. Fortunately, a plan of allocation concealment can always be incorporated into the design of a randomized trial. Certain methods minimize the risk of concealment failing more than others. Keeping knowledge of subjects' assignment after allocation from subjects, investigators/health care providers, or those assessing outcomes is referred to as masking (also known as blinding). The goal of masking is to prevent ascertainment bias. In contrast to allocation concealment, masking cannot always be incorporated into a randomized controlled trial. Both allocation concealment and masking add to the elimination of bias in randomized controlled trials.

  10. Comparison of the Roll Plate Method to the Sonication Method To Diagnose Catheter Colonization and Bacteremia in Patients with Long-Term Tunnelled Catheters: a Randomized Prospective Study

    PubMed Central

    Slobbe, Lennert; el Barzouhi, Abdelilah; Boersma, Eric; Rijnders, Bart J. A.

    2009-01-01

    Diagnosing catheter-related bloodstream infection (CRBSI) still often involves tip culture. The conventional method is the semiquantitative roll plate method. However, the use of a quantitative sonication technique could have additional value, as it may detect endoluminal microorganisms more easily. Because endoluminal infection tends to occur in long-term central venous catheters, we compared both techniques for patients with long-term tunnelled catheters. For 313 consecutive Hickman catheter tips from 279 hematological patients, colonization detection rates were compared by performing both techniques in a random order, using conventional detection cutoffs. Additionally, for the subgroup of patients with clinical suspicion of CRBSI (n = 89), the diagnostic values of both techniques were compared. The overall tip colonization rate was 25%. For each technique, the detection rate tended to be better if that technique was performed first. The diagnostic performance for the subgroup of patients with clinical suspicion of CRBSI was limited and not different for both methods. Sensitivity and specificity were 45% and 84%, respectively, for sonication versus 35% and 90%, respectively, for the roll plate technique. The fact that 35 of 40 patients with CRBSI received antimicrobial therapy before catheter removal and tip culture, in an attempt to salvage the catheter, may partly explain this poor performance. No differences were observed when catheters were stratified according to in situ time below or above the median of 4 weeks. The sonication culture technique was not better than the roll plate method to diagnose tip colonization or CRBSI in patients with long-term tunnelled catheters. PMID:19171682

  11. Intracranial Pressure Monitoring in Severe Traumatic Brain Injury in Latin America: Process and Methods for a Multi-Center Randomized Controlled Trial

    PubMed Central

    Lujan, Silvia; Dikmen, Sureyya; Temkin, Nancy; Petroni, Gustavo; Pridgeon, Jim; Barber, Jason; Machamer, Joan; Cherner, Mariana; Chaddock, Kelley; Hendrix, Terence; Rondina, Carlos; Videtta, Walter; Celix, Juanita M.; Chesnut, Randall

    2012-01-01

    Abstract In patients with severe traumatic brain injury (TBI), the influence on important outcomes of the use of information from intracranial pressure (ICP) monitoring to direct treatment has never been tested in a randomized controlled trial (RCT). We are conducting an RCT in six trauma centers in Latin America to test this question. We hypothesize that patients randomized to ICP monitoring will have lower mortality and better outcomes at 6-months post-trauma than patients treated without ICP monitoring. We selected three centers in Bolivia to participate in the trial, based on (1) the absence of ICP monitoring, (2) adequate patient accession and data collection during the pilot phase, (3) preliminary institutional review board approval, and (4) the presence of equipoise about the value of ICP monitoring. We conducted extensive training of site personnel, and initiated the trial on September 1, 2008. Subsequently, we included three additional centers. A total of 176 patients were entered into the trial as of August 31, 2010. Current enrollment is 81% of that expected. The trial is expected to reach its enrollment goal of 324 patients by September of 2011. We are conducting a high-quality RCT to answer a question that is important globally. In addition, we are establishing the capacity to conduct strong research in Latin America, where TBI is a serious epidemic. Finally, we are demonstrating the feasibility and utility of international collaborations that share resources and unique patient populations to conduct strong research about global public health concerns. PMID:22435793

  12. Dose Limits for Man do not Adequately Protect the Ecosystem

    SciTech Connect

    Higley, Kathryn A.; Alexakhin, Rudolf M.; McDonald, Joseph C.

    2004-08-01

    It has been known for quite some time that different organisms display differing degrees of sensitivity to the effects of ionizing radiation. Some microorganisms, such as the bacterium Micrococcus radiodurans, along with many species of invertebrates, are extremely radio-resistant. Humans might be categorized as relatively sensitive to radiation, though a bit more resistant than some pine trees. Therefore, it could be argued that maintaining the dose limits necessary to protect humans will also result in the protection of most other species of flora and fauna. This concept is usually referred to as the anthropocentric approach. In other words, if man is protected, then the environment is also adequately protected. The ecocentric approach might be stated as: the health of humans is effectively protected only when the environment is not unduly exposed to radiation. The ICRP is working on new recommendations dealing with the protection of the environment, and this debate should help to highlight a number of relevant issues concerning that topic.

  13. DARHT - an 'adequate' EIS: A NEPA case study

    SciTech Connect

    Webb, M.D.

    1997-08-01

    The Dual Axis Radiographic Hydrodynamic Test (DARHT) Facility Environmental Impact Statement (EIS) provides a case study that is interesting for many reasons. The EIS was prepared quickly, in the face of a lawsuit, for a project with unforeseen environmental impacts, for a facility that was deemed urgently essential to national security. Following judicial review the EIS was deemed to be "adequate." DARHT is a facility now being built at Los Alamos National Laboratory (LANL) as part of the Department of Energy (DOE) nuclear weapons stockpile stewardship program. DARHT will be used to evaluate the safety and reliability of nuclear weapons, evaluate conventional munitions and study high-velocity impact phenomena. DARHT will be equipped with two accelerator-driven, high-intensity X-ray machines to record images of materials driven by high explosives. DARHT will be used for a variety of hydrodynamic tests, and DOE plans to conduct some dynamic experiments using plutonium at DARHT as well.

  14. ENSURING ADEQUATE SAFETY WHEN USING HYDROGEN AS A FUEL

    SciTech Connect

    Coutts, D

    2007-01-22

    Demonstration projects using hydrogen as a fuel are becoming very common. Often these projects rely on project-specific risk evaluations to support project safety decisions. This is necessary because regulations, codes, and standards (hereafter referred to as standards) are just being developed. This paper will review some of the approaches being used in these evolving standards, and techniques which demonstration projects can implement to bridge the gap between current requirements and stakeholder desires. Many of the evolving standards for hydrogen fuel use performance-based language, which establishes minimum performance and safety objectives, as compared with prescriptive-based language that prescribes specific design solutions. This is being done for several reasons, including: (1) concern that establishing specific design solutions too early will stifle invention, (2) the sparse performance data available to support selection of design approaches, and (3) a risk-averse public which is unwilling to accept losses that were incurred in developing previous prescriptive design standards. The evolving standards often contain words such as: "The manufacturer shall implement the measures and provide the information necessary to minimize the risk of endangering a person's safety or health." This typically implies that the manufacturer or project manager must produce and document an acceptable level of risk. If accomplished using a comprehensive and systematic process, the demonstration project risk assessment can ease the transition to widespread commercialization. An approach to adequately evaluate and document the safety risk will be presented.

  15. Quantifying variability within water samples: the need for adequate subsampling.

    PubMed

    Donohue, Ian; Irvine, Kenneth

    2008-01-01

    Accurate and precise determination of the concentration of nutrients and other substances in waterbodies is an essential requirement for supporting effective management and legislation. Owing primarily to logistic and financial constraints, however, national and regional agencies responsible for monitoring surface waters tend to quantify chemical indicators of water quality using a single sample from each waterbody, thus largely ignoring spatial variability. We show here that total sample variability, which comprises both analytical variability and within-sample heterogeneity, of a number of important chemical indicators of water quality (chlorophyll a, total phosphorus, total nitrogen, soluble molybdate-reactive phosphorus and dissolved inorganic nitrogen) varies significantly both over time and among determinands, and can be extremely high. Within-sample heterogeneity, whose mean contribution to total sample variability ranged between 62% and 100%, was significantly higher in samples taken from rivers compared with those from lakes, and was shown to be reduced by filtration. Our results show clearly that neither a single sample, nor even two sub-samples from that sample is adequate for the reliable, and statistically robust, detection of changes in the quality of surface waters. We recommend strongly that, in situations where it is practicable to take only a single sample from a waterbody, a minimum of three sub-samples are analysed from that sample for robust quantification of both the concentrations of determinands and total sample variability. PMID:17706740
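
    The variance partition described above can be sketched numerically. The following snippet (hypothetical readings, not the study's data) splits total sample variability into an analytical component, estimated from replicate analyses of each sub-sample, and a within-sample heterogeneity remainder:

```python
from statistics import mean, pvariance

# Hypothetical total-phosphorus readings (ug/L):
# 3 sub-samples per water sample, 2 analytical replicates per sub-sample
readings = [[52.1, 51.8], [60.4, 59.9], [55.0, 55.6]]

# Analytical variability: average variance among replicates of each sub-sample
analytical_var = mean(pvariance(reps) for reps in readings)

# Total sample variability: variance of all readings pooled together
pooled = [r for reps in readings for r in reps]
total_var = pvariance(pooled)

# Within-sample heterogeneity is what analytical variability cannot explain
heterogeneity = max(total_var - analytical_var, 0.0)
print(f"analytical: {analytical_var:.3f}, "
      f"heterogeneity: {100 * heterogeneity / total_var:.0f}% of total")
```

    With heterogeneous sub-samples like these, nearly all of the total variability comes from within-sample heterogeneity, which is why a single sub-sample cannot characterize the water sample reliably.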

  16. An efficient, high-order probabilistic collocation method on sparse grids for three-dimensional flow and solute transport in randomly heterogeneous porous media

    SciTech Connect

    Lin, Guang; Tartakovsky, Alexandre M.

    2009-05-01

    In this study, a probabilistic collocation method (PCM) on sparse grids was used to solve stochastic equations describing flow and transport in three-dimensional, saturated, randomly heterogeneous porous media. Karhunen-Loève (KL) decomposition was used to represent the three-dimensional log hydraulic conductivity $Y=\ln K_s$. The hydraulic head $h$ and average pore-velocity $\bf v$ were obtained by solving the three-dimensional continuity equation coupled with Darcy's law for the random hydraulic conductivity field. The concentration was computed by solving a three-dimensional stochastic advection-dispersion equation with the stochastic average pore-velocity $\bf v$ computed from Darcy's law. PCM is an extension of generalized polynomial chaos (gPC) that couples gPC with probabilistic collocation. By using sparse grid points, PCM can handle a random process with a large number of random dimensions at relatively low computational cost compared to full tensor products. Monte Carlo (MC) simulations were also conducted to verify the accuracy of the PCM. Comparing the MC and PCM results for the mean and standard deviation of concentration shows that the PCM approach is computationally more efficient than Monte Carlo simulation. Unlike the conventional moment-equation approach, there is no limitation on the amplitude of random perturbations in PCM. Furthermore, PCM on sparse grids can efficiently simulate solute transport in randomly heterogeneous porous media with large variances.
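
    As a rough illustration of the KL representation underlying the method, the sketch below builds a truncated Karhunen-Loève-style expansion of a one-dimensional log-conductivity field from the numerically computed eigenpairs of an exponential covariance. The grid size, variance and correlation length are illustrative assumptions; the paper's actual three-dimensional sparse-grid collocation is not reproduced here.

```python
import numpy as np

n, s2, ell = 200, 1.0, 0.3            # grid size, variance, correlation length (assumed)
x = np.linspace(0.0, 1.0, n)
C = s2 * np.exp(-np.abs(x[:, None] - x[None, :]) / ell)   # exponential covariance

# Discrete eigen-decomposition approximates the KL eigenpairs of the
# covariance operator; w is the quadrature weight of the uniform grid.
w = 1.0 / n
vals, vecs = np.linalg.eigh(C * w)
order = np.argsort(vals)[::-1]        # sort modes by decreasing eigenvalue
vals, vecs = vals[order], vecs[:, order] / np.sqrt(w)

m = 20                                # truncation order
rng = np.random.default_rng(0)
xi = rng.standard_normal(m)           # independent N(0,1) KL coefficients
Y = vecs[:, :m] @ (np.sqrt(vals[:m]) * xi)   # one realization of Y = ln K
K = np.exp(Y)                         # corresponding conductivity field
captured = float(vals[:m].sum() / vals.sum())   # fraction of variance retained
print(Y.shape, round(captured, 3))
```

    Because the eigenvalues of a smooth covariance decay quickly, a modest truncation order captures most of the field variance, which is what makes the finite-dimensional collocation tractable.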

  17. The COPE healthy lifestyles TEEN randomized controlled trial with culturally diverse high school adolescents: Baseline characteristics and methods

    PubMed Central

    Melnyk, Bernadette Mazurek; Kelly, Stephanie; Jacobson, Diana; Belyea, Michael; Shaibi, Gabriel; Small, Leigh; O’Haver, Judith; Marsiglia, Flavio Francisco

    2014-01-01

    Obesity and mental health disorders remain significant public health problems in adolescents. Substantial health disparities exist with minority youth experiencing higher rates of these problems. Schools are an outstanding venue to provide teens with skills needed to improve their physical and mental health, and academic performance. In this paper, the authors describe the design, intervention, methods and baseline data for a randomized controlled trial with 779 culturally diverse high-school adolescents in the southwest United States. Aims for this prevention study include testing the efficacy of the COPE TEEN program versus an attention control program on the adolescents’ healthy lifestyle behaviors, Body Mass Index (BMI) and BMI%, mental health, social skills and academic performance immediately following the intervention programs, and at six and 12 months post interventions. Baseline findings indicate that greater than 40% of the sample is either overweight (n = 148, 19.00%) or obese (n = 182, 23.36%). The predominant ethnicity represented is Hispanic (n = 526, 67.52%). At baseline, 15.79% (n = 123) of the students had above average scores on the Beck Youth Inventory Depression subscale indicating mildly (n = 52, 6.68%), moderately (n = 47, 6.03%), or extremely (n = 24, 3.08%) elevated scores (see Table 1). Anxiety scores were slightly higher with 21.56% (n = 168) reporting responses suggesting mildly (n = 81, 10.40%), moderately (n = 58, 7.45%) or extremely (n = 29, 3.72%) elevated scores. If the efficacy of the COPE TEEN program is supported, it will offer schools a curriculum that can be easily incorporated into high school health courses to improve adolescent healthy lifestyle behaviors, psychosocial outcomes and academic performance. PMID:23748156

  18. A Randomized Exploratory Study to Evaluate Two Acupuncture Methods for the Treatment of Headaches Associated with Traumatic Brain Injury

    PubMed Central

    Bellanti, Dawn M.; Paat, Charmagne F.; Boyd, Courtney C.; Duncan, Alaine; Price, Ashley; Zhang, Weimin; French, Louis M.; Chae, Heechin

    2016-01-01

    Abstract Background: Headaches are prevalent among Service members with traumatic brain injury (TBI); 80% report chronic or recurrent headache. Evidence for nonpharmacologic treatments, such as acupuncture, is needed. Objective: The aim of this research was to determine if two types of acupuncture (auricular acupuncture [AA] and traditional Chinese acupuncture [TCA]) were feasible and more effective than usual care (UC) alone for TBI-related headache. Materials and Methods: Design: This was a three-armed, parallel, randomized exploratory study. Setting: The research took place at three military treatment facilities in the Washington, DC, metropolitan area. Patients: The subjects were previously deployed Service members (18-69 years old) with mild-to-moderate TBI and headaches. Intervention: The interventions explored were UC alone or with the addition of AA or TCA. Outcome Measures: The primary outcome was the Headache Impact Test (HIT). Secondary outcomes were the Numerical Rating Scale (NRS), Pittsburgh Sleep Quality Index, Post-Traumatic Stress Checklist, Symptom Checklist-90-R, Medical Outcome Study Quality of Life (QoL), Beck Depression Inventory, State-Trait Anxiety Inventory, the Automated Neuropsychological Assessment Metrics, and expectancy of outcome and acupuncture efficacy. Results: Mean HIT scores decreased in the AA and TCA groups but increased slightly in the UC-only group from baseline to week 6 [AA, −10.2% (−6.4 points); TCA, −4.6% (−2.9 points); UC, +0.8% (+0.6 points)]. Both acupuncture groups had sizable decreases in NRS (Pain Best), compared to UC (TCA versus UC: P = 0.0008, d = 1.70; AA versus UC: P = 0.0127, d = 1.6). No statistically significant results were found for any other secondary outcome measures. Conclusions: Both AA and TCA improved headache-related QoL more than UC did in Service members with TBI. PMID:27458496

  19. A therapeutic application of the experience sampling method in the treatment of depression: a randomized controlled trial.

    PubMed

    Kramer, Ingrid; Simons, Claudia J P; Hartmann, Jessica A; Menne-Lothmann, Claudia; Viechtbauer, Wolfgang; Peeters, Frenk; Schruers, Koen; van Bemmel, Alex L; Myin-Germeys, Inez; Delespaul, Philippe; van Os, Jim; Wichers, Marieke

    2014-02-01

    In depression, the ability to experience positive affect in daily life predicts recovery and reduces relapse rates. Interventions based on the experience sampling method (ESM-I) are ideally suited to provide insight into personal, contextualized patterns of positive affect. The aim of this study was to examine whether add-on ESM-derived feedback on personalized patterns of positive affect is feasible and useful to patients, and results in a reduction of depressive symptomatology. Depressed outpatients (n=102) receiving pharmacological treatment participated in a randomized controlled trial with three arms: an experimental group receiving add-on ESM-derived feedback, a pseudo-experimental group participating in ESM but receiving no feedback, and a control group. The experimental group participated in an ESM procedure (three days per week over a 6-week period) using a palmtop computer. This group received weekly standardized feedback on personalized patterns of positive affect. Hamilton Depression Rating Scale - 17 (HDRS) and Inventory of Depressive Symptoms (IDS) scores were obtained before and after the intervention. During a 6-month follow-up period, five HDRS and IDS assessments were completed. Add-on ESM-derived feedback resulted in a significant and clinically relevant stronger decrease in HDRS score relative to the control group (p<0.01; -5.5 point reduction in HDRS at 6 months). Compared to the pseudo-experimental group, a clinically relevant decrease in HDRS score was apparent at 6 months (B=-3.6, p=0.053). Self-reported depressive complaints (IDS) showed the same pattern over time. The use of ESM-I was deemed acceptable and the provided feedback easy to understand. Patients attempted to apply suggestions from the ESM-derived feedback in daily life. These data suggest that the efficacy of the traditional, passive pharmacological approach to the treatment of major depression can be enhanced with person-tailored daily life information regarding positive affect. PMID:24497255

  20. The COPE healthy lifestyles TEEN randomized controlled trial with culturally diverse high school adolescents: baseline characteristics and methods.

    PubMed

    Melnyk, Bernadette Mazurek; Kelly, Stephanie; Jacobson, Diana; Belyea, Michael; Shaibi, Gabriel; Small, Leigh; O'Haver, Judith; Marsiglia, Flavio Francisco

    2013-09-01

    Obesity and mental health disorders remain significant public health problems in adolescents. Substantial health disparities exist with minority youth experiencing higher rates of these problems. Schools are an outstanding venue to provide teens with skills needed to improve their physical and mental health, and academic performance. In this paper, the authors describe the design, intervention, methods and baseline data for a randomized controlled trial with 779 culturally diverse high-school adolescents in the southwest United States. Aims for this prevention study include testing the efficacy of the COPE TEEN program versus an attention control program on the adolescents' healthy lifestyle behaviors, Body Mass Index (BMI) and BMI%, mental health, social skills and academic performance immediately following the intervention programs, and at six and 12 months post interventions. Baseline findings indicate that greater than 40% of the sample is either overweight (n = 148, 19.00%) or obese (n = 182, 23.36%). The predominant ethnicity represented is Hispanic (n = 526, 67.52%). At baseline, 15.79% (n = 123) of the students had above average scores on the Beck Youth Inventory Depression subscale indicating mildly (n = 52, 6.68%), moderately (n = 47, 6.03%), or extremely (n = 24, 3.08%) elevated scores (see Table 1). Anxiety scores were slightly higher with 21.56% (n = 168) reporting responses suggesting mildly (n = 81, 10.40%), moderately (n = 58, 7.45%) or extremely (n = 29, 3.72%) elevated scores. If the efficacy of the COPE TEEN program is supported, it will offer schools a curriculum that can be easily incorporated into high school health courses to improve adolescent healthy lifestyle behaviors, psychosocial outcomes and academic performance.

  1. Improving access to adequate pain management in Taiwan.

    PubMed

    Scholten, Willem

    2015-06-01

    There is a global crisis in access to pain management. WHO estimates that 4.65 billion people live in countries where medical opioid consumption is near zero. For 2010, WHO considered a per capita consumption of 216.7 mg morphine equivalents adequate, while Taiwan had a per capita consumption of 0.05 mg morphine equivalents in 2007. In Asia, the use of opioids is sensitive because of the Opium Wars in the 19th century, and for this reason the focus of controlled-substances policies has been on the prevention of diversion and dependence. However, an optimal public health outcome requires that the beneficial aspects of these substances also be acknowledged. Therefore, WHO recommends a policy based on the Principle of Balance: ensuring access for medical and scientific purposes while preventing diversion, harmful use and dependence. Furthermore, international law requires that countries ensure access to opioid analgesics for medical and scientific purposes. There is evidence that opioid analgesics for chronic pain are not associated with a major risk of developing dependence. Barriers to access can be classified in the categories of overly restrictive laws and regulations; insufficient medical training on pain management and problems related to the assessment of medical needs; attitudes such as an excessive fear of dependence or diversion; and economic and logistical problems. The GOPI project found many examples of such barriers in Asia. Access to opioid medicines in Taiwan can be improved by analysing the national situation and drafting a plan. The WHO policy guidelines Ensuring Balance in National Policies on Controlled Substances can be helpful for achieving this purpose, as can international guidelines for pain treatment.

  2. Improving access to adequate pain management in Taiwan.

    PubMed

    Scholten, Willem

    2015-06-01

    There is a global crisis in access to pain management. WHO estimates that 4.65 billion people live in countries where medical opioid consumption is near zero. For 2010, WHO considered a per capita consumption of 216.7 mg morphine equivalents adequate, while Taiwan had a per capita consumption of 0.05 mg morphine equivalents in 2007. In Asia, the use of opioids is sensitive because of the Opium Wars in the 19th century, and for this reason the focus of controlled-substances policies has been on the prevention of diversion and dependence. However, an optimal public health outcome requires that the beneficial aspects of these substances also be acknowledged. Therefore, WHO recommends a policy based on the Principle of Balance: ensuring access for medical and scientific purposes while preventing diversion, harmful use and dependence. Furthermore, international law requires that countries ensure access to opioid analgesics for medical and scientific purposes. There is evidence that opioid analgesics for chronic pain are not associated with a major risk of developing dependence. Barriers to access can be classified in the categories of overly restrictive laws and regulations; insufficient medical training on pain management and problems related to the assessment of medical needs; attitudes such as an excessive fear of dependence or diversion; and economic and logistical problems. The GOPI project found many examples of such barriers in Asia. Access to opioid medicines in Taiwan can be improved by analysing the national situation and drafting a plan. The WHO policy guidelines Ensuring Balance in National Policies on Controlled Substances can be helpful for achieving this purpose, as can international guidelines for pain treatment. PMID:26068436

  3. Impact of Violation of the Missing-at-Random Assumption on Full-Information Maximum Likelihood Method in Multidimensional Adaptive Testing

    ERIC Educational Resources Information Center

    Han, Kyung T.; Guo, Fanmin

    2014-01-01

    The full-information maximum likelihood (FIML) method makes it possible to estimate and analyze structural equation models (SEM) even when data are partially missing, enabling incomplete data to contribute to model estimation. The cornerstone of FIML is the missing-at-random (MAR) assumption. In (unidimensional) computerized adaptive testing…

  4. Testing the Validity of the Random Quadrat and Line-Transect Methods for Estimating Field Populations--or 'String and Coverslip' Ecology

    ERIC Educational Resources Information Center

    Hitchon, Arnold

    1975-01-01

    Uses a printed sheet of paper to represent a plant population in a given area. Quadrat sampling is simulated by dropping a square coverslip on the sheet and recording the symbols under the coverslip. Line-transect method is simulated by having a student randomly adjust the sheet under a string. (GS)
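
    The "string and coverslip" exercise translates naturally into a computational analogue. In this sketch (all parameters hypothetical), random quadrat sampling of simulated "plants" on a unit sheet yields a density estimate that can be checked against the known true density:

```python
import random

random.seed(1)
n_plants, side, n_quadrats = 500, 0.1, 50   # hypothetical parameters
plants = [(random.random(), random.random()) for _ in range(n_plants)]

counts = []
for _ in range(n_quadrats):
    # drop a square quadrat whose lower-left corner keeps it on the sheet
    qx = random.uniform(0.0, 1.0 - side)
    qy = random.uniform(0.0, 1.0 - side)
    counts.append(sum(qx <= px < qx + side and qy <= py < qy + side
                      for px, py in plants))

# Density estimate: mean count per quadrat divided by quadrat area
est_density = (sum(counts) / n_quadrats) / side**2
print(f"true density: {n_plants}, estimated: {est_density:.0f}")
```

    Increasing the number of quadrats tightens the estimate, which mirrors the sampling-effort lesson the classroom exercise is designed to teach.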

  5. Bioelement effects on thyroid gland in children living in iodine-adequate territory.

    PubMed

    Gorbachev, Anatoly L; Skalny, Anatoly V; Koubassov, Roman V

    2007-01-01

    Endemic goitre is a primary pathology of the thyroid gland and a critical medico-social problem in many countries. The dominant cause of endemic goitre is iodine deficiency. However, besides primary iodine deficiency, goitre may also develop due to imbalances of other bioelements essential to the maintenance of thyroid function. Here we studied 44 cases of endemic goitre in prepubertal children (7-10 y.o.) living in an iodine-adequate territory. Thyroid volume was estimated by ultrasonometry. The main bioelements (Al, Ca, Cd, Co, Cr, Cu, Fe, Hg, I, Mg, Mn, Pb, Se, Si, Zn) were determined in hair samples by the ICP-OES/ICP-MS method. Relationships between the hair content of bioelements and thyroid gland size were estimated by multiple regression. The regression model revealed significant positive relations between thyroid volume and Cr, Si and Mn contents; however, the actual factor in thyroid gland enlargement was only Si excess in the organism. Significant negative relations of thyroid volume were revealed with I, Mg, Zn, Se, Co and Cd, of which the actual factors in thyroid enlargement were I, Co, Mg and Se deficiency. The total bioelement contribution to thyroid impairment was estimated at 24%. It is therefore suggested that endemic goitre in an iodine-adequate territory can be formed by bioelement imbalances, namely Si excess and Co, Mg and Se shortage, as well as endogenous I deficiency in spite of an iodine-adequate environment.

  6. Effect of random structure on permeability and heat transfer characteristics for flow in 2D porous medium based on MRT lattice Boltzmann method

    NASA Astrophysics Data System (ADS)

    Yang, PeiPei; Wen, Zhi; Dou, RuiFeng; Liu, Xunliang

    2016-08-01

    Flow and heat transfer through a 2D random porous medium are studied using the lattice Boltzmann method (LBM). For the random porous medium, the influence of disordered cylinder arrangement on permeability and Nusselt number is investigated. Results indicate that the permeability and Nusselt number differ between cylinder arrangements even with the same number and size of cylinders. New correlations for the permeability and the coefficient b′ of the Forchheimer equation are proposed for random porous media composed of Gaussian-distributed circular cylinders. Furthermore, a general set of heat transfer correlations is proposed and compared with existing experimental data and empirical correlations. The results show that the Nusselt number increases with porosity; hence heat transfer correlations are accurate only when the effect of porosity is taken into account.

  7. Using Logistic Regression and Random Forests multivariate statistical methods for landslide spatial probability assessment in North-Est Sicily, Italy

    NASA Astrophysics Data System (ADS)

    Trigila, Alessandro; Iadanza, Carla; Esposito, Carlo; Scarascia-Mugnozza, Gabriele

    2015-04-01

    The first phase of the work aimed to identify the spatial relationships between landslide locations and the 13 related factors using the Frequency Ratio bivariate statistical method. The analysis was then carried out with a multivariate statistical approach, using the Logistic Regression and Random Forests techniques, of which Random Forests gave the best results in terms of AUC. The models were performed and evaluated with different sample sizes, also taking into account the temporal variation of input variables such as areas burned by wildfire. The most significant outcomes of this work are the relevant influence of sample size on the model results and the strong importance of some environmental factors (e.g. land use and wildfires) for identifying the depletion zones of extremely rapid shallow landslides.

  8. Description of and users manual for TUBA: A computer code for generating two-dimensional random fields via the turning bands method

    SciTech Connect

    Zimmerman, D.A.; Wilson, J.L.

    1992-01-01

    TUBA is a computer code for the generation of synthetic two-dimensional random fields via the Turning Bands Method. It is primarily used to generate synthetic permeability fields for hydrologic and petroleum engineering applications, but it has applications wherever synthetic random fields are employed. This is version 2.0 of TUBA, a completely redesigned and rewritten code. It generates stationary or non-stationary, isotropic and anisotropic, and point or areal average random fields. Five functional covariance models are available in the code. These are Gaussian, Bessel, Telis, and Generalized Covariance models. The user can supply other forms. The random fields can be generated onto a gridded system (e.g., at the nodes of a point centered finite difference model, or the blocks of a block centered model), or at arbitrary locations in space (e.g., at the Gauss points of a finite element grid). TUBA can be used to generate the field values in local areas at much greater resolution than the original simulated field. The fields can be generated with a normal or a lognormal distribution. The size of the simulation is limited only by the virtual memory capabilities of the computer on which it is run. Random fields with over a million nodes have been generated with TUBA on a 386PC running Xenix. The code has been run on 286 and 386 PC's running DOS, on Sun 3's and 4's using Unix, and on Dec VAX's running VMS.
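
    A minimal sketch of the turning bands idea, not TUBA itself: the 2D field is assembled by superposing independent 1D processes simulated along randomly oriented lines. Here each 1D process is a crude random-phase cosine stand-in rather than a covariance-matched simulation, and all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)
n_lines, nx, ny = 64, 50, 50          # number of lines and grid size (assumed)
xs, ys = np.meshgrid(np.arange(nx) / nx, np.arange(ny) / ny, indexing="ij")

field = np.zeros((nx, ny))
for _ in range(n_lines):
    theta = rng.uniform(0.0, np.pi)   # random line orientation
    t = xs * np.cos(theta) + ys * np.sin(theta)   # project grid onto the line
    # 1D stationary process along the line: a unit-variance random-phase cosine
    freq = rng.exponential(5.0)
    phase = rng.uniform(0.0, 2.0 * np.pi)
    field += np.sqrt(2.0) * np.cos(2.0 * np.pi * freq * t + phase)

field /= np.sqrt(n_lines)             # variance-preserving superposition
print(field.shape, round(float(field.var()), 2))
```

    In a production code like TUBA, the 1D line processes are simulated so that their superposition reproduces a target 2D covariance model, and many more lines are used to suppress directional artifacts.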

  9. Description of and users manual for TUBA: A computer code for generating two-dimensional random fields via the turning bands method

    SciTech Connect

    Zimmerman, D.A.; Wilson, J.L.

    1992-01-01

    TUBA is a computer code for the generation of synthetic two-dimensional random fields via the Turning Bands Method. It is primarily used to generate synthetic permeability fields for hydrologic and petroleum engineering applications, but it has applications wherever synthetic random fields are employed. This is version 2.0 of TUBA, a completely redesigned and rewritten code. It generates stationary or non-stationary, isotropic and anisotropic, and point or areal average random fields. Five functional covariance models are available in the code. These are Gaussian, Bessel, Telis, and Generalized Covariance models. The user can supply other forms. The random fields can be generated onto a gridded system (e.g., at the nodes of a point centered finite difference model, or the blocks of a block centered model), or at arbitrary locations in space (e.g., at the Gauss points of a finite element grid). TUBA can be used to generate the field values in local areas at much greater resolution than the original simulated field. The fields can be generated with a normal or a lognormal distribution. The size of the simulation is limited only by the virtual memory capabilities of the computer on which it is run. Random fields with over a million nodes have been generated with TUBA on a 386PC running Xenix. The code has been run on 286 and 386 PC's running DOS, on Sun 3's and 4's using Unix, and on Dec VAX's running VMS.

  10. Bayesian response adaptive randomization using longitudinal outcomes.

    PubMed

    Hatayama, Tomoyoshi; Morita, Satoshi; Sakamaki, Kentaro

    2015-01-01

    The response adaptive randomization (RAR) method is used to increase the number of patients assigned to more efficacious treatment arms in clinical trials. In many trials evaluating longitudinal patient outcomes, RAR methods based only on the final measurement may not benefit significantly from RAR because of its delayed initiation. We propose a Bayesian RAR method to improve RAR performance by accounting for longitudinal patient outcomes (longitudinal RAR). We use a Bayesian linear mixed effects model to analyze longitudinal continuous patient outcomes for calculating a patient allocation probability. In addition, we aim to mitigate the loss of statistical power because of large patient allocation imbalances by embedding adjusters into the patient allocation probability calculation. Using extensive simulation we compared the operating characteristics of our proposed longitudinal RAR method with those of the RAR method based only on the final measurement and with an equal randomization method. Simulation results showed that our proposed longitudinal RAR method assigned more patients to the presumably superior treatment arm compared with the other two methods. In addition, the embedded adjuster effectively worked to prevent extreme patient allocation imbalances. However, our proposed method may not function adequately when the treatment effect difference is moderate or less, and still needs to be modified to deal with unexpectedly large departures from the presumed longitudinal data model.
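
    The allocation mechanics can be illustrated with a deliberately simplified stand-in: a two-arm Beta-Bernoulli model (binary outcomes rather than the paper's longitudinal continuous model), where the assignment probability is driven by the posterior probability that one arm is superior, tempered by an adjuster exponent c to limit allocation imbalance. All names and parameters here are hypothetical:

```python
import random

random.seed(0)

def prob_a_better(succ_a, n_a, succ_b, n_b, draws=4000):
    """Monte Carlo estimate of P(p_A > p_B) under Beta(1+s, 1+f) posteriors."""
    wins = 0
    for _ in range(draws):
        pa = random.betavariate(1 + succ_a, 1 + n_a - succ_a)
        pb = random.betavariate(1 + succ_b, 1 + n_b - succ_b)
        wins += pa > pb
    return wins / draws

def allocation_prob(succ_a, n_a, succ_b, n_b, c=0.5):
    """RAR probability of assigning the next patient to arm A."""
    p = prob_a_better(succ_a, n_a, succ_b, n_b)
    return p**c / (p**c + (1 - p) ** c)   # exponent c < 1 damps imbalance

# After 10 patients per arm with 7 vs 3 responders, arm A is favoured
p_alloc = allocation_prob(7, 10, 3, 10)
print(round(p_alloc, 2))
```

    With c = 1 the rule would allocate almost all patients to the leading arm; shrinking c toward 0 pulls the allocation back toward equal randomization, which is the role the paper's embedded adjusters play for longitudinal outcomes.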

  11. Analysis of entropy extraction efficiencies in random number generation systems

    NASA Astrophysics Data System (ADS)

    Wang, Chao; Wang, Shuang; Chen, Wei; Yin, Zhen-Qiang; Han, Zheng-Fu

    2016-05-01

    Random numbers (RNs) have applications in many areas: lottery games, gambling, computer simulation, and, most importantly, cryptography [N. Gisin et al., Rev. Mod. Phys. 74 (2002) 145]. In cryptography theory, the theoretical security of the system calls for high quality RNs. Therefore, developing methods for producing unpredictable RNs with adequate speed is an attractive topic. Early on, despite the lack of theoretical support, pseudo RNs generated by algorithmic methods performed well and satisfied reasonable statistical requirements. However, as implemented, those pseudorandom sequences were completely determined by mathematical formulas and initial seeds, which cannot introduce extra entropy or information. In these cases, “random” bits are generated that are not at all random. Physical random number generators (RNGs), which, in contrast to algorithmic methods, are based on unpredictable physical random phenomena, have attracted considerable research interest. However, the way that we extract random bits from those physical entropy sources has a large influence on the efficiency and performance of the system. In this manuscript, we will review and discuss several randomness extraction schemes that are based on radiation or photon arrival times. We analyze the robustness, post-processing requirements and, in particular, the extraction efficiency of those methods to aid in the construction of efficient, compact and robust physical RNG systems.
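
    One simple arrival-time extraction scheme can be sketched as follows (simulated Poisson arrivals stand in for a physical source; the scheme is a generic illustration, not one of the specific methods reviewed): raw bits come from comparing disjoint pairs of inter-arrival intervals, and a von Neumann post-processing step removes residual bias at the cost of discarding pairs:

```python
import random

random.seed(7)
intervals = [random.expovariate(1.0) for _ in range(20000)]  # inter-arrival times

# Raw bits: compare disjoint pairs of successive intervals
raw = [int(intervals[i] > intervals[i + 1])
       for i in range(0, len(intervals) - 1, 2)]

# Von Neumann extractor over pairs of raw bits: 01 -> 0, 10 -> 1, discard 00/11
bits = [a for a, b in zip(raw[::2], raw[1::2]) if a != b]

ones_fraction = sum(bits) / len(bits)
print(len(bits), round(ones_fraction, 3))   # roughly half the pairs survive; ~0.5 ones
```

    The trade-off visible here, unbiased output bought with a reduced bit rate, is exactly the extraction-efficiency question the review analyzes across schemes.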

  12. Analysis of entropy extraction efficiencies in random number generation systems

    NASA Astrophysics Data System (ADS)

    Wang, Chao; Wang, Shuang; Chen, Wei; Yin, Zhen-Qiang; Han, Zheng-Fu

    2016-05-01

    Random numbers (RNs) have applications in many areas: lottery games, gambling, computer simulation, and, most importantly, cryptography [N. Gisin et al., Rev. Mod. Phys. 74 (2002) 145]. In cryptography theory, the theoretical security of the system calls for high quality RNs. Therefore, developing methods for producing unpredictable RNs with adequate speed is an attractive topic. Early on, despite the lack of theoretical support, pseudo RNs generated by algorithmic methods performed well and satisfied reasonable statistical requirements. However, as implemented, those pseudorandom sequences were completely determined by mathematical formulas and initial seeds, which cannot introduce extra entropy or information. In these cases, “random” bits are generated that are not at all random. Physical random number generators (RNGs), which, in contrast to algorithmic methods, are based on unpredictable physical random phenomena, have attracted considerable research interest. However, the way that we extract random bits from those physical entropy sources has a large influence on the efficiency and performance of the system. In this manuscript, we will review and discuss several randomness extraction schemes that are based on radiation or photon arrival times. We analyze the robustness, post-processing requirements and, in particular, the extraction efficiency of those methods to aid in the construction of efficient, compact and robust physical RNG systems.

  13. Determining Adequate Margins in Head and Neck Cancers: Practice and Continued Challenges.

    PubMed

    Williams, Michelle D

    2016-09-01

    Margin assessment remains a critical component of oncologic care for head and neck cancer patients. As an integrated team, surgeons and pathologists work together to assess margins in these complex patients. Differences in the method of margin sampling can affect the information obtainable and, in turn, outcomes. Additionally, what distance constitutes an "adequate" or "clear" margin for patient care continues to be debated. Ultimately, future studies and potentially secondary modalities to augment pathologic margin assessment (i.e., in situ imaging or molecular assessment) may enhance local control in head and neck cancer patients. PMID:27469263

  14. The Alchemy of "Costing Out" an Adequate Education

    ERIC Educational Resources Information Center

    Hanushek, Eric A.

    2006-01-01

    In response to the rapid rise in court cases related to the adequacy of school funding, a variety of alternative methods have been developed to provide an analytical base about the necessary expenditure on schools. These approaches have been titled to give an aura of a thoughtful and solid scientific basis: the professional judgment model, the…

  15. Design and methods for a pilot randomized clinical trial involving exercise and behavioral activation to treat comorbid type 2 diabetes and major depressive disorder.

    PubMed

    Schneider, Kristin L; Pagoto, Sherry L; Handschin, Barbara; Panza, Emily; Bakke, Susan; Liu, Qin; Blendea, Mihaela; Ockene, Ira S; Ma, Yunsheng

    2011-06-01

    BACKGROUND: The comorbidity of type 2 diabetes mellitus (T2DM) and depression is associated with poor glycemic control. Exercise has been shown to improve mood and glycemic control, but individuals with comorbid T2DM and depression are disproportionately sedentary compared to the general population and report more difficulty with exercise. Behavioral activation, an evidence-based depression psychotherapy, was designed to help people with depression make gradual behavior changes, and may be helpful to build exercise adherence in sedentary populations. This pilot randomized clinical trial will test the feasibility of a group exercise program enhanced with behavioral activation strategies among women with comorbid T2DM and depression. METHODS/DESIGN: Sedentary women with inadequately controlled T2DM and depression (N=60) will be randomly assigned to one of two conditions: exercise or usual care. Participants randomized to the exercise condition will attend 38 behavioral activation-enhanced group exercise classes over 24 weeks in addition to usual care. Participants randomized to the usual care condition will receive depression treatment referrals and print information on diabetes management via diet and physical activity. Assessments will occur at baseline and 3-, 6-, and 9-months following randomization. The goals of this pilot study are to demonstrate feasibility and intervention acceptability, estimate the resources and costs required to deliver the intervention and to estimate the standard deviation of continuous outcomes (e.g., depressive symptoms and glycosylated hemoglobin) in preparation for a fully-powered randomized clinical trial. DISCUSSION: A novel intervention that combines exercise and behavioral activation strategies could potentially improve glycemic control and mood in women with comorbid type 2 diabetes and depression. TRIAL REGISTRATION: NCT01024790.
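
    The abstract specifies random assignment to two arms but not the allocation mechanism. A generic permuted-block sketch (block size and seed are hypothetical, not taken from the protocol) shows how 1:1 balance can be maintained throughout a small pilot:

```python
import random

def permuted_block_allocation(n_participants, block_size=4, seed=2011):
    """Allocate participants 1:1 to 'exercise' or 'usual care' in permuted
    blocks, guaranteeing balance after every complete block."""
    assert block_size % 2 == 0
    rng = random.Random(seed)
    allocations = []
    while len(allocations) < n_participants:
        block = ["exercise"] * (block_size // 2) + ["usual care"] * (block_size // 2)
        rng.shuffle(block)  # random order within the block
        allocations.extend(block)
    return allocations[:n_participants]

arms = permuted_block_allocation(60)
```

    With N=60 and block size 4, the arms stay within two participants of each other at every point in recruitment.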

  17. The methodology and operation of a pilot randomized control trial of the effectiveness of the Bug Busting method against a single application insecticide product for head louse treatment.

    PubMed

    Bingham, P; Kirk, S; Hill, N; Figueroa, J

    2000-07-01

    A Department of Health leaflet suggests two treatment methods for head lice: mechanical removal by wet combing; and insecticide lotion/rinses. However, there are no reports in the literature comparing the effectiveness of these two treatment methods and well controlled clinical trials of insecticide treatments are sparse. A pilot randomized control trial of the effectiveness of a specific method of wet combing, 'Bug Busting', against a single application of a proprietary insecticide product is reported. The difficulties of designing a trial are discussed and modifications that would allow a definitive trial to take place are suggested. The pilot study included enzyme analysis of lice for insecticide resistance status assessment.

  18. A comparison of random forest and its Gini importance with standard chemometric methods for the feature selection and classification of spectral data

    PubMed Central

    Menze, Bjoern H; Kelm, B Michael; Masuch, Ralf; Himmelreich, Uwe; Bachert, Peter; Petrich, Wolfgang; Hamprecht, Fred A

    2009-01-01

    Background Regularized regression methods such as principal component or partial least squares regression perform well in learning tasks on high dimensional spectral data, but cannot explicitly eliminate irrelevant features. The random forest classifier with its associated Gini feature importance, on the other hand, allows for an explicit feature elimination, but may not be optimally adapted to spectral data due to the topology of its constituent classification trees which are based on orthogonal splits in feature space. Results We propose to combine the best of both approaches, and evaluated the joint use of a feature selection based on a recursive feature elimination using the Gini importance of random forests' together with regularized classification methods on spectral data sets from medical diagnostics, chemotaxonomy, biomedical analytics, food science, and synthetically modified spectral data. Here, a feature selection using the Gini feature importance with a regularized classification by discriminant partial least squares regression performed as well as or better than a filtering according to different univariate statistical tests, or using regression coefficients in a backward feature elimination. It outperformed the direct application of the random forest classifier, or the direct application of the regularized classifiers on the full set of features. Conclusion The Gini importance of the random forest provided superior means for measuring feature relevance on spectral data, but – on an optimal subset of features – the regularized classifiers might be preferable over the random forest classifier, in spite of their limitation to model linear dependencies only. A feature selection based on Gini importance, however, may precede a regularized linear classification to identify this optimal subset of features, and to earn a double benefit of both dimensionality reduction and the elimination of noise from the classification task. PMID:19591666
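
    The "Gini importance" ranked here is the impurity decrease a feature's splits achieve, accumulated over all trees of the forest. A minimal pure-Python sketch of that building block (one feature, one split; not the paper's full forest/PLS pipeline):

```python
def gini(labels):
    """Gini impurity of a set of class labels: 1 - sum_k p_k^2."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def gini_decrease(feature, labels, threshold):
    """Impurity decrease from splitting at `threshold`, weighted by child
    sizes. A random forest sums these decreases per feature, over all
    nodes and trees, to obtain the Gini feature importance."""
    left = [y for x, y in zip(feature, labels) if x <= threshold]
    right = [y for x, y in zip(feature, labels) if x > threshold]
    n = len(labels)
    return gini(labels) - (len(left) / n) * gini(left) - (len(right) / n) * gini(right)

# A feature that separates the two classes perfectly at x = 0.5
x = [0.1, 0.2, 0.3, 0.7, 0.8, 0.9]
y = [0, 0, 0, 1, 1, 1]
best = gini_decrease(x, y, 0.5)
```

    A perfectly separating split on balanced classes yields the maximal decrease of 0.5; irrelevant features accumulate decreases near zero, which is what makes the score usable for recursive feature elimination.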

  19. An outbreak of viral gastroenteritis associated with adequately prepared oysters.

    PubMed

    Chalmers, J W; McMillan, J H

    1995-08-01

    Over Christmas 1993, an outbreak of food poisoning occurred among guests in a hotel in South West Scotland. Evidence from a cohort study strongly suggested that raw oysters were the vehicle for infection, probably due to a Small Round Structured Virus (SRSV). Detailed enquiry about the source and preparation of the oysters revealed no evidence of any unsafe handling at any stage in the food chain, nor any evidence of bacterial contamination. It is suggested that the present standards of preparation and monitoring are inadequate to protect the consumer, and that bacteriophage monitoring may be a useful method of screening for viral contamination in future.

  20. Choosing an adequate FEM grid for global mantle convection modelling

    NASA Astrophysics Data System (ADS)

    Thieulot, Cedric

    2016-04-01

    Global numerical models of mantle convection are typically run on a grid which represents a hollow sphere. In the context of using the Finite Element method, there are many ways to discretise a hollow sphere by means of cuboids in a regular fashion (adaptive mesh refinement is here not considered). I will here focus on the following two: the cubed sphere [1], which is a quasi-uniform mapping of a cube to a sphere (considering both equidistant and equiangular projections), and the 12-block grid used for instance in CITCOM [2]. By means of simple experiments, I will show that at comparable resolutions (and all other things being equal), the 12-block grid is surprisingly vastly superior to the cubed-sphere grid, when used in combination with trilinear velocity - constant pressure elements, while being more difficult to build/implement. [1] C. Ronchi, R. Iacono, and P. S. Paolucci, The "Cubed Sphere": A New Method for the Solution of Partial Differential Equations in Spherical Geometry, Journal of Computational Physics, 124, p93-114 (1996). [2] S. Zhong and M.T. Zuber and L.N. Moresi and M. Gurnis, Role of temperature-dependent viscosity and surface plates in spherical shell models of mantle convection, Journal of Geophysical Research, 105 (B5), p 11,063-11,082 (2000).
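
    For reference, the cubed-sphere construction maps points on a cube face to the sphere gnomonically. A sketch of the equiangular variant for a single face (coordinate conventions are assumed here, not taken from [1]):

```python
import math

def equiangular_face_to_sphere(xi, eta):
    """Map equiangular face coordinates (xi, eta) in [-pi/4, pi/4] on the
    +z face of a cube to a point on the unit sphere (gnomonic projection)."""
    x, y = math.tan(xi), math.tan(eta)      # position on the cube face
    r = math.sqrt(1.0 + x * x + y * y)      # distance from centre to face point
    return (x / r, y / r, 1.0 / r)          # normalise onto the sphere

p = equiangular_face_to_sphere(math.pi / 8, -math.pi / 8)
```

    The equiangular parametrization spaces grid lines by equal angles rather than equal distances on the face, which makes the projected cell sizes more uniform than in the equidistant mapping.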

  1. Effectiveness of the Dader Method for pharmaceutical care in patients with bipolar I disorder: EMDADER-TAB: study protocol for a randomized controlled trial

    PubMed Central

    2014-01-01

    Background Bipolar I disorder (BD-I) is a chronic mental illness characterized by the presence of one or more manic episodes, or both depressive and manic episodes, usually separated by asymptomatic intervals. Pharmacists can contribute to the management of BD-I, mainly with the use of effective and safe drugs, and improve the patient’s quality of life through pharmaceutical care. Some studies have shown the effect of pharmaceutical care in the achievement of therapeutic goals in different illnesses; however, to our knowledge, there is a lack of randomized controlled trials designed to assess the effect of pharmacist intervention in patients with BD. The aim of this study is to assess the effectiveness of the Dader Method for pharmaceutical care in patients with BD-I. Methods/design A randomized, controlled, prospective, single-center clinical trial with a duration of 12 months will be performed to compare the effect of the Dader Method of pharmaceutical care with the usual care process of patients in a psychiatric clinic. Patients diagnosed with BD-I aged between 18 and 65 years who have been discharged or referred from the outpatient service of the San Juan de Dios Clinic (Antioquia, Colombia) will be included. Patients will be randomized into the intervention group, who will receive pharmaceutical care provided by pharmacists working in collaboration with psychiatrists, or into the control group, who will receive usual care and verbal-written counseling regarding BD. Study outcomes will be assessed at baseline and at 3, 6, 9, and 12 months after randomization. The primary outcome will be the number of hospitalizations, emergency service consultations, and unscheduled outpatient visits. Effectiveness, safety, adherence, and quality of life will be assessed as secondary outcomes. Statistical analyses will be performed using two-tailed McNemar tests, Pearson chi-square tests, and Student’s t-tests; a P value <0.05 will be considered statistically significant.
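
    Of the planned analyses, the McNemar test handles paired binary outcomes. A minimal sketch of the statistic and its chi-square(1) p-value (the discordant counts below are hypothetical, and the continuity correction is omitted):

```python
import math

def mcnemar(b, c):
    """McNemar chi-square statistic for a paired 2x2 table, where b and c
    are the two discordant-pair counts; p-value from chi-square with 1 df."""
    if b + c == 0:
        return 0.0, 1.0
    stat = (b - c) ** 2 / (b + c)
    # Survival function of chi-square(1): P(X >= stat) = erfc(sqrt(stat / 2))
    p_value = math.erfc(math.sqrt(stat / 2.0))
    return stat, p_value

# Hypothetical discordant counts: 20 pairs improved only under intervention,
# 5 pairs improved only under control.
stat, p = mcnemar(20, 5)
```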

  2. Percentage of Adults with High Blood Pressure Whose Hypertension Is Adequately Controlled

    MedlinePlus

    Survey-based data record: percentage of adults with high blood pressure whose hypertension is adequately controlled, reported by age group.

  3. Patients with Celiac Disease Are Not Followed Adequately

    PubMed Central

    Herman, Margot L.; Rubio-Tapia, Alberto; Lahr, Brian D.; Larson, Joseph J.; Van Dyke, Carol T.; Murray, Joseph A.

    2012-01-01

    Background & Aims Adherence to a gluten-free diet is the only effective treatment for celiac disease. It has been recommended that patients be followed, make regular visits to the clinic, and undergo serologic analysis for markers of celiac disease, although a follow-up procedure has not been standardized. We determined how many patients with celiac disease are actually followed. Methods We collected data on 122 patients with biopsy-proven celiac disease, diagnosed between 1996 and 2006 in Olmsted County, Minnesota (70% women, median age of 42 years) for whom complete medical records and verification of residency were available. We determined the frequency at which patients received follow-up examinations, from 6 months to 5 years after diagnosis. The Kaplan-Meier method was used to estimate event rates at 1 and 5 year(s). Patients were classified according to categories of follow-up procedures recommended by the American Gastroenterology Association (AGA). Results We estimated that by 1 and 5 year(s) after diagnosis with celiac disease, 41.0% and 88.7% of the patients had follow-up visits, 33.6% and 79.8% were assessed for compliance with a gluten-free diet, 3.3% and 15.8% met with a registered dietitian, 2.5% and 18.1% had an additional intestinal biopsy, and 22.1% and 65.6% received serologic testing for markers of celiac disease. Among 113 patients (93%) who were followed for more than 4 years, only 35% received follow-up analyses that were consistent with AGA recommendations. Conclusions Patients with celiac disease are not followed consistently. Follow-up examinations are often inadequate and do not follow AGA recommendations. Improving follow-up strategies for patients with celiac disease could improve management of this disease. PMID:22610009
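
    The 1- and 5-year rates above come from Kaplan-Meier estimation. A minimal pure-Python product-limit estimator on hypothetical follow-up data (events at a given time are processed before censorings, as is conventional):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve. `events[i]` is 1 if the event occurred
    at `times[i]`, 0 if the observation was censored at that time."""
    order = sorted(range(len(times)), key=lambda i: (times[i], -events[i]))
    at_risk = len(times)
    survival, curve = 1.0, []
    for i in order:
        if events[i] == 1:
            survival *= (at_risk - 1) / at_risk   # product-limit update
            curve.append((times[i], survival))
        at_risk -= 1                              # leave the risk set either way
    return curve

# Hypothetical follow-up data (years to first follow-up visit; 0 = censored)
times  = [0.5, 1.0, 1.0, 2.0, 3.0, 5.0]
events = [1,   1,   0,   1,   0,   1]
curve = kaplan_meier(times, events)
```

    Censored subjects contribute to the risk set until they drop out but never trigger a step, which is what distinguishes this estimate from a naive event fraction.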

  4. Assessment of analysis-of-variance-based methods to quantify the random variations of observers in medical imaging measurements: guidelines to the investigator.

    PubMed

    Zeggelink, William F A Klein; Hart, Augustinus A M; Gilhuijs, Kenneth G A

    2004-07-01

    The random variations of observers in medical imaging measurements negatively affect the outcome of cancer treatment, and should be taken into account during treatment by the application of safety margins that are derived from estimates of the random variations. Analysis-of-variance- (ANOVA-) based methods are the most preferable techniques to assess the true individual random variations of observers, but the number of observers and the number of cases must be taken into account to achieve meaningful results. Our aim in this study is twofold. First, to evaluate three representative ANOVA-based methods for typical numbers of observers and typical numbers of cases. Second, to establish guidelines to the investigator to determine which method, how many observers, and which number of cases are required to obtain the a priori chosen performance. The ANOVA-based methods evaluated in this study are an established technique (pairwise differences method: PWD), a new approach providing additional statistics (residuals method: RES), and a generic technique that uses restricted maximum likelihood (REML) estimation. Monte Carlo simulations were performed to assess the performance of the ANOVA-based methods, which is expressed by their accuracy (closeness of the estimates to the truth), their precision (standard error of the estimates), and the reliability of their statistical test for the significance of a difference in the random variation of an observer between two groups of cases. The highest accuracy is achieved using REML estimation, but for datasets of at least 50 cases or arrangements with 6 or more observers, the differences between the methods are negligible, with deviations from the truth well below +/-3%. 
For datasets up to 100 cases, it is most beneficial to increase the number of cases to improve the precision of the estimated random variations, whereas for datasets over 100 cases, an improvement in precision is most efficiently achieved by increasing the number of
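
    The core idea of the pairwise differences (PWD) method can be illustrated compactly: if observer errors are independent and zero-mean, then Var(a − b) = Va + Vb, so with three or more observers the individual variances follow in closed form. A simulation sketch (the numbers are hypothetical and this is not the paper's Monte Carlo design):

```python
import random
from statistics import variance

def pairwise_variances(a, b, c):
    """Closed-form pairwise-differences estimate of three observers'
    individual error variances, assuming independent zero-mean errors:
    Var(a-b) = Va + Vb, Var(a-c) = Va + Vc, Var(b-c) = Vb + Vc."""
    vab = variance([x - y for x, y in zip(a, b)])
    vac = variance([x - y for x, y in zip(a, c)])
    vbc = variance([x - y for x, y in zip(b, c)])
    return ((vab + vac - vbc) / 2, (vab + vbc - vac) / 2, (vac + vbc - vab) / 2)

rng = random.Random(7)
truth = [rng.uniform(10.0, 20.0) for _ in range(2000)]   # true value per case
obs_a = [t + rng.gauss(0.0, 1.0) for t in truth]         # observer SD 1
obs_b = [t + rng.gauss(0.0, 2.0) for t in truth]         # observer SD 2
obs_c = [t + rng.gauss(0.0, 3.0) for t in truth]         # observer SD 3
va, vb, vc = pairwise_variances(obs_a, obs_b, obs_c)
```

    The true per-case value cancels in each difference, which is what makes the estimate insensitive to case-to-case variation; the precision of the estimates then depends on the number of cases, as discussed above.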

  5. Are family medicine residents adequately trained to deliver palliative care?

    PubMed Central

    Mahtani, Ramona; Kurahashi, Allison M.; Buchman, Sandy; Webster, Fiona; Husain, Amna; Goldman, Russell

    2015-01-01

    Objective To explore educational factors that influence family medicine residents’ (FMRs’) intentions to offer palliative care and palliative care home visits to patients. Design Qualitative descriptive study. Setting A Canadian, urban, specialized palliative care centre. Participants First-year (n = 9) and second-year (n = 6) FMRs. Methods Semistructured interviews were conducted with FMRs following a 4-week palliative care rotation. Questions focused on participant experiences during the rotation and perceptions about their roles as family physicians in the delivery of palliative care and home visits. Participant responses were analyzed to summarize and interpret patterns related to their educational experience during their rotation. Main findings Four interrelated themes were identified that described this experience: foundational skill development owing to training in a specialized setting; additional need for education and support; unaddressed gaps in pragmatic skills; and uncertainty about family physicians’ role in palliative care. Conclusion Residents described experiences that both supported and inadvertently discouraged them from considering future engagement in palliative care. Reassuringly, residents were also able to underscore opportunities for improvement in palliative care education. PMID:27035008

  6. 76 FR 51041 - Hemoglobin Standards and Maintaining Adequate Iron Stores in Blood Donors; Public Workshop

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-17

    ... HUMAN SERVICES Food and Drug Administration Hemoglobin Standards and Maintaining Adequate Iron Stores in... Standards and Maintaining Adequate Iron Stores in Blood Donors.'' The purpose of this public workshop is to... donor safety and blood availability, and potential measures to maintain adequate iron stores in...

  7. 21 CFR 801.5 - Medical devices; adequate directions for use.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Medical devices; adequate directions for use. 801... (CONTINUED) MEDICAL DEVICES LABELING General Labeling Provisions § 801.5 Medical devices; adequate directions for use. Adequate directions for use means directions under which the layman can use a device...

  8. 36 CFR 13.960 - Who determines when there is adequate snow cover?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE... Preserve Snowmachine (snowmobile) Operations § 13.960 Who determines when there is adequate snow cover? The superintendent will determine when snow cover is adequate for snowmachine use. The superintendent will follow...

  9. 36 CFR 13.960 - Who determines when there is adequate snow cover?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE... Preserve Snowmachine (snowmobile) Operations § 13.960 Who determines when there is adequate snow cover? The superintendent will determine when snow cover is adequate for snowmachine use. The superintendent will follow...

  10. 36 CFR 13.960 - Who determines when there is adequate snow cover?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE... Preserve Snowmachine (snowmobile) Operations § 13.960 Who determines when there is adequate snow cover? The superintendent will determine when snow cover is adequate for snowmachine use. The superintendent will follow...

  11. 36 CFR 13.960 - Who determines when there is adequate snow cover?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE... Preserve Snowmachine (snowmobile) Operations § 13.960 Who determines when there is adequate snow cover? The superintendent will determine when snow cover is adequate for snowmachine use. The superintendent will follow...

  12. Randomized controlled trial to test a computerized psychosocial cancer assessment and referral program: methods and research design.

    PubMed

    O'Hea, Erin L; Cutillo, Alexandra; Dietzen, Laura; Harralson, Tina; Grissom, Grant; Person, Sharina; Boudreaux, Edwin D

    2013-05-01

    The National Cancer Coalition Network, National Cancer Institute, and American College of Surgeons all emphasize the need for oncology providers to identify, address, and monitor psychosocial needs of their patients. The Mental Health Assessment and Dynamic Referral for Oncology (MHADRO) is a patient-driven, computerized, psychosocial assessment that identifies, addresses, and monitors physical, psychological, and social issues faced by oncology patients. This paper presents the methodology of a randomized controlled trial (RCT) that tested the impact of the MHADRO on patient outcomes at 2, 6, and 12 months. Patient outcomes, including overall psychological distress, depression, anxiety, functional disability, and use of psychosocial resources, will be presented in future publications after all follow-up data are gathered. Eight hundred thirty-six cancer patients with heterogeneous diagnoses, across three comprehensive cancer centers in different parts of the United States, were randomized to the MHADRO (intervention) or an assessment-only control group. Patients in the intervention group were provided detailed, personalized reports and, when needed, referrals to mental health services; their oncology provider received detailed reports designed to foster clinical decision making. Those patients who demonstrated high levels of psychosocial problems were given the option to authorize that a copy of their report be sent electronically to a "best match" mental health professional. Demographic and patient cancer-related data as well as comparisons between patients who were enrolled and those who declined enrollment are presented. Challenges encountered during the RCT and strategies used to address them are discussed.

  13. Randomized clinical trial of multimodal physiotherapy treatment compared to overnight lidocaine ointment in women with provoked vestibulodynia: Design and methods.

    PubMed

    Morin, Mélanie; Dumoulin, Chantale; Bergeron, Sophie; Mayrand, Marie-Hélène; Khalifé, Samir; Waddell, Guy; Dubois, Marie-France

    2016-01-01

    Provoked vestibulodynia (PVD) is a highly prevalent and debilitating condition, yet its management relies mainly on non-empirically validated interventions. Among the many causes of PVD, there is growing evidence that pelvic floor muscle (PFM) dysfunctions play an important role in its pathophysiology. Multimodal physiotherapy, which addresses these dysfunctions, is judged by experts to be highly effective and is recommended as a first-line treatment. However, the effectiveness of this promising intervention has been evaluated through only two small uncontrolled trials. The proposed bi-center, single-blind, parallel group, randomized controlled trial (RCT) aims to evaluate the efficacy of multimodal physiotherapy and compare it to a frequently used first-line treatment, topical overnight application of lidocaine, in women with PVD. A total of 212 women diagnosed with PVD according to a standardized protocol were eligible for the study and were randomly assigned to either multimodal physiotherapy or lidocaine treatment for 10 weeks. The primary outcome measure is pain during intercourse (assessed with a numerical rating scale). Secondary measures include sexual function, pain quality, psychological factors (including pain catastrophizing, anxiety, depression and fear of pain), PFM morphology and function, and patients' global impression of change. Assessments are made at baseline, post-treatment and at the 6-month follow-up. This manuscript presents and discusses the rationale, design and methodology of the first RCT investigating physiotherapy in comparison to a commonly prescribed first-line treatment, overnight topical lidocaine, for women with PVD.

  14. Performance of SuSi: a method for generating atomistic models of amorphous polymers based on a random search of energy minima.

    PubMed

    Curcó, David; Alemán, Carlos

    2004-04-30

    The performance of a recently developed method to generate representative atomistic models of amorphous polymers has been investigated. This method, which is denoted SuSi, can be defined as a random generator of energy minima. The effects produced by different parameters used to define the size of the system and the characteristics of the generation algorithm have been examined. Calculations have been performed on poly(L,D-lactic acid) (ρ = 1.25 g/cm³) and nylon 6 (ρ = 1.084 g/cm³), which are important commercial polymers.

  15. A Novel Method for Assessment of Polyethylene Liner Wear in Radiopaque Tantalum Acetabular Cups: Clinical Validation in Patients Enrolled in a Randomized Controlled Trial.

    PubMed

    Troelsen, Anders; Greene, Meridith E; Ayers, David C; Bragdon, Charles R; Malchau, Henrik

    2015-12-01

    Conventional radiostereometric analysis (RSA) for wear is not possible in patients with tantalum cups. We propose a novel method for wear analysis in tantalum cups. Wear was assessed by gold standard RSA and the novel method in total hip arthroplasty patients enrolled in a randomized controlled trial receiving either titanium or tantalum cups (n=46). The novel method estimated the center of the head using a model based on identification of two proximal markers on the stem and knowledge of the stem/head configuration. The novel method was able to demonstrate a pattern of wear that was similar to the gold standard in titanium cups. The novel method offered accurate assessment and is a viable solution for assessment of wear in studies with tantalum cups.
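
    The geometric core of the novel method, extrapolating the head centre along the stem axis from two markers plus the known stem/head configuration, can be sketched as follows (the coordinates and offset are hypothetical, not trial data):

```python
import math

def estimate_head_center(marker_distal, marker_proximal, head_offset):
    """Extrapolate along the stem axis (defined by two stem markers) by a
    known head offset (mm) beyond the proximal marker to estimate the
    femoral head centre."""
    axis = [p - d for p, d in zip(marker_proximal, marker_distal)]
    length = math.sqrt(sum(a * a for a in axis))
    unit = [a / length for a in axis]
    return tuple(p + head_offset * u for p, u in zip(marker_proximal, unit))

# Hypothetical marker positions (mm) and stem/head offset
center = estimate_head_center((0.0, 0.0, 0.0), (0.0, 0.0, 40.0), 15.0)
```

    Because only the stem markers must be visible radiographically, this avoids relying on markers in (or visibility through) the radiopaque tantalum cup.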

  17. Using Fuzzy Logic to Identify Schools Which May Be Misclassified by the No Child Left Behind Adequate Yearly Progress Policy

    ERIC Educational Resources Information Center

    Yates, Donald W.

    2009-01-01

    This investigation developed, tested, and prototyped a Fuzzy Inference System (FIS) that would assist decision makers in identifying schools that may have been misclassified by existing Adequate Yearly Progress (AYP) methods. This prototype was then used to evaluate Louisiana elementary schools using published school data for Academic Year 2004. …

  18. Comparison of 3D-OP-OSEM and 3D-FBP reconstruction algorithms for High-Resolution Research Tomograph studies: effects of randoms estimation methods.

    PubMed

    van Velden, Floris H P; Kloet, Reina W; van Berckel, Bart N M; Wolfensberger, Saskia P A; Lammertsma, Adriaan A; Boellaard, Ronald

    2008-06-21

    The High-Resolution Research Tomograph (HRRT) is a dedicated human brain positron emission tomography (PET) scanner. Recently, a 3D filtered backprojection (3D-FBP) reconstruction method has been implemented to reduce bias in short duration frames, currently observed in 3D ordinary Poisson OSEM (3D-OP-OSEM) reconstructions. Further improvements might be expected using a new method of variance reduction on randoms (VRR) based on coincidence histograms instead of using the delayed window technique (DW) to estimate randoms. The goal of this study was to evaluate VRR in combination with 3D-OP-OSEM and 3D-FBP reconstruction techniques. To this end, several phantom studies and a human brain study were performed. For most phantom studies, 3D-OP-OSEM showed higher accuracy of observed activity concentrations with VRR than with DW. However, both positive and negative deviations in reconstructed activity concentrations and large biases of grey to white matter contrast ratio (up to 88%) were still observed as a function of scan statistics. Moreover 3D-OP-OSEM+VRR also showed bias up to 64% in clinical data, i.e. in some pharmacokinetic parameters as compared with those obtained with 3D-FBP+VRR. In the case of 3D-FBP, VRR showed similar results as DW for both phantom and clinical data, except that VRR showed a better standard deviation of 6-10%. Therefore, VRR should be used to correct for randoms in HRRT PET studies.
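
    For context, both DW and VRR estimate accidental coincidences, whose expected rate for a detector pair follows the standard singles-based formula. A one-line sketch of that formula (not the VRR algorithm itself, which works on coincidence histograms):

```python
def randoms_rate(singles_i, singles_j, tau):
    """Expected accidental-coincidence (randoms) rate for a detector pair:
    R_ij = 2 * tau * S_i * S_j, with singles rates S in counts/s and the
    coincidence timing window tau in seconds."""
    return 2.0 * tau * singles_i * singles_j

# Example: 100 kcps singles on each detector, 6 ns coincidence window
r = randoms_rate(1.0e5, 1.0e5, 6.0e-9)
```

    Because the estimate scales with the product of singles rates, randoms dominate at high activity, which is why the choice of estimation method matters most for short, high-count-rate frames.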

  20. Adequate iodine levels in healthy pregnant women. A cross-sectional survey of dietary intake in Turkey

    PubMed Central

    Kasap, Burcu; Akbaba, Gülhan; Yeniçeri, Emine N.; Akın, Melike N.; Akbaba, Eren; Öner, Gökalp; Turhan, Nilgün Ö.; Duru, Mehmet E.

    2016-01-01

    Objectives: To assess current iodine levels and related factors among healthy pregnant women. Methods: In this cross-sectional, hospital-based study, healthy pregnant women (n=135) were scanned for thyroid volume, provided urine samples for urinary iodine concentration and completed a questionnaire including sociodemographic characteristics and dietary habits targeted for iodine consumption at the Department of Obstetrics and Gynecology, School of Medicine, Muğla Sıtkı Koçman University, Muğla, Turkey, between August 2014 and February 2015. Sociodemographic data were analyzed by simple descriptive statistics. Results: Median urinary iodine concentration was 222.0 µg/L, indicating adequate iodine intake during pregnancy. According to World Health Organization (WHO) criteria, 28.1% of subjects had iodine deficiency, 34.1% had adequate iodine intake, 34.8% had more than adequate iodine intake, and 3.0% had excessive iodine intake during pregnancy. Education level, higher monthly income, current employment, consuming iodized salt, and adding salt to food during or after cooking were associated with higher urinary iodine concentration. Conclusion: Iodine status of healthy pregnant women was adequate, although the percentage of women with more than adequate iodine intake was higher than reported in the literature. PMID:27279519

  1. An Automated Three-Dimensional Detection and Segmentation Method for Touching Cells by Integrating Concave Points Clustering and Random Walker Algorithm

    PubMed Central

    Gong, Hui; Chen, Shangbin; Zhang, Bin; Ding, Wenxiang; Luo, Qingming; Li, Anan

    2014-01-01

    Characterizing cytoarchitecture is crucial for understanding brain functions and neural diseases. In neuroanatomy, it is an important task to accurately extract cell populations' centroids and contours. Recent advances have permitted imaging at single cell resolution for an entire mouse brain using the Nissl staining method. However, it is difficult to precisely segment numerous cells, especially those cells touching each other. As presented herein, we have developed an automated three-dimensional detection and segmentation method applied to the Nissl staining data, with the following two key steps: 1) concave points clustering to determine the seed points of touching cells; and 2) random walker segmentation to obtain cell contours. Also, we have evaluated the performance of our proposed method with several mouse brain datasets, which were captured with the micro-optical sectioning tomography imaging system, and the datasets include closely touching cells. Compared with traditional detection and segmentation methods, our approach shows promising detection accuracy and high robustness. PMID:25111442

  2. Salt sales survey: a simplified, cost-effective method to evaluate population salt reduction programs--a cluster-randomized trial.

    PubMed

    Ma, Yuan; He, Feng J; Li, Nicole; Hao, Jesse; Zhang, Jing; Yan, Lijing L; Wu, Yangfeng

    2016-04-01

    Twenty-four-hour urine collection, as a gold standard method of measuring salt intake, is costly and resource consuming, which limits its use in monitoring population salt reduction programs. Our study aimed to determine whether a salt sales survey could serve as an alternative method. This was a substudy of the China Rural Health Initiative-Sodium Reduction Study (CRHI-SRS), in which 120 villages were randomly allocated (1:1:2) into a price subsidy+health education (PS+HE) group, an HE-only group or a control group. Salt substitutes (SS) were supplied to shops in the intervention groups; 24-h urine was collected from 2567 randomly selected adults at the end of the trial to evaluate the effects of the intervention. Ten villages were randomly selected from each group (that is, 30 villages in total), and 166 shops from these villages were invited to participate in the monthly salt sales survey. The results showed that during the intervention period, mean monthly sales of SS per shop were 38.0 kg for the PS+HE group, 19.2 kg for the HE only and 2.2 kg for the control group (P<0.05), which was consistent with the results from the 24-h urine sodium and potassium data. The intervention effects of CRHI-SRS on sodium and potassium intake estimated from SS sales were 101% and 114%, respectively, of those observed from the 24-h urine data. Furthermore, the salt sales survey cost only 14% of the cost of the 24-h urine method and had greater statistical power. The results indicate that a salt sales survey could serve as a simple, sensitive and cost-effective method to evaluate community-based salt reduction programs in which salt is mainly added by the consumers. PMID:26657005

  3. Zinc content of selected tissues and taste perception in rats fed zinc deficient and zinc adequate rations

    SciTech Connect

    Boeckner, L.S.; Kies, C.

    1986-03-05

    The objective of the study was to determine the effects of feeding zinc sufficient and zinc deficient rations on taste sensitivity and zinc contents of selected organs in rats. The 36 Sprague-Dawley male weanling rats were divided into 2 groups and fed zinc deficient or zinc adequate rations. The animals were subjected to 4 trial periods in which a choice of deionized distilled water or a solution of quinine sulfate at 1.28 × 10^-6 was given. A randomized schedule for rat sacrifice was used. No differences were found between zinc deficient and zinc adequate rats in taste preference aversion scores for quinine sulfate in the first three trial periods; however, in the last trial period rats in the zinc sufficient group drank somewhat less water containing quinine sulfate as a percentage of total water consumption than did rats fed the zinc deficient ration. Significantly higher zinc contents of kidney, brain and parotid salivary glands were seen in zinc adequate rats compared to zinc deficient rats at the end of the study. However, liver and tongue zinc levels were lower for both groups at the close of the study than were those of rats sacrificed at the beginning of the study.

  4. A Self-Administered Method of Acute Pressure Block of Sciatic Nerves for Short-Term Relief of Dental Pain: A Randomized Study

    PubMed Central

    Wang, Xiaolin; Zhao, Wanghong; Wang, Ye; Hu, Jiao; Chen, Qiu; Yu, Juncai; Wu, Bin; Huang, Rong; Gao, Jie; He, Jiman

    2014-01-01

    Objectives While stimulation of the peripheral nerves increases the pain threshold, chronic pressure stimulation of the sciatic nerve is associated with sciatica. We recently found that acute pressure block of the sciatic nerve inhibits pain. Therefore, we propose that, the pain pathology-causing pressure is chronic, not acute. Here, we report a novel self-administered method: acute pressure block of the sciatic nerves is applied by the patients themselves for short-term relief of pain from dental diseases. Design This was a randomized, single-blind study. Setting Hospital patients. Patients Patients aged 16–60 years with acute pulpitis, acute apical periodontitis, or pericoronitis of the third molar of the mandible experiencing pain ≥3 on the 11-point numerical pain rating scale. Interventions Three-minute pressure to sciatic nerves was applied by using the hands (hand pressure method) or by having the patients squat to force the thigh and shin as tightly as possible on the sandwiched sciatic nerve bundles (self-administered method). Outcomes The primary efficacy variable was the mean difference in pain scores from the baseline. Results One hundred seventy-two dental patients were randomized. The self-administered method produced significant relief from pain associated with dental diseases (P ≤ 0.001). The analgesic effect of the self-administered method was similar to that of the hand pressure method. Conclusions The self-administered method is easy to learn and can be applied at any time for pain relief. We believe that patients will benefit from this method. PMID:24400593

  5. Investigation on wide-band scattering of a 2-D target above 1-D randomly rough surface by FDTD method.

    PubMed

    Li, Juan; Guo, Li-Xin; Jiao, Yong-Chang; Li, Ke

    2011-01-17

    Finite-difference time-domain (FDTD) algorithm with a pulse wave excitation is used to investigate the wide-band composite scattering from a two-dimensional (2-D) infinitely long target with arbitrary cross section located above a one-dimensional (1-D) randomly rough surface. The FDTD calculation is performed with a pulse wave incidence, and the 2-D representative time-domain scattered field in the far zone is obtained directly by extrapolating the currently calculated data on the output boundary. Then the 2-D wide-band scattering result is acquired by transforming the representative time-domain field to the frequency domain with a Fourier transform. Taking the composite scattering of an infinitely long cylinder above a rough surface as an example, the wide-band response in the far zone by FDTD with the pulsed excitation is computed, and it shows good agreement with the numerical result by FDTD with sinusoidal illumination. Finally, the normalized radar cross section (NRCS) from a 2-D target above a 1-D rough surface versus the incident frequency, and the representative scattered fields in the far zone versus time, are analyzed in detail.
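The pulse-to-spectrum step described above can be sketched with a discrete Fourier transform. This is an illustrative sketch only: the time step and the Gaussian stand-in for the recorded far-zone field are invented, not values from the paper.

```python
import numpy as np

# A single pulsed FDTD run records the far-zone field as a time series; one FFT
# then yields the scattered response at all frequencies at once. The Gaussian
# below is merely a stand-in for the recorded field.
dt = 1e-12                                            # assumed FDTD time step (s)
t = np.arange(2048) * dt
field_t = np.exp(-((t - 200 * dt) / (40 * dt)) ** 2)  # hypothetical recorded field
spectrum = np.fft.rfft(field_t)                       # wide-band frequency response
freqs = np.fft.rfftfreq(t.size, d=dt)                 # 0 up to the Nyquist frequency
```

Dividing this spectrum by the spectrum of the incident pulse would normalize out the excitation, which is how one time-domain run covers the whole band instead of one sinusoidal run per frequency.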

  6. Parallelization of a spatial random field characterization process using the Method of Anchored Distributions and the HTCondor high throughput computing system

    NASA Astrophysics Data System (ADS)

    Osorio-Murillo, C. A.; Over, M. W.; Frystacky, H.; Ames, D. P.; Rubin, Y.

    2013-12-01

    A new software application called MAD# has been coupled with the HTCondor high throughput computing system to aid scientists and educators with the characterization of spatial random fields and enable understanding of the spatial distribution of parameters used in hydrogeologic and related modeling. MAD# is an open source desktop software application used to characterize spatial random fields using direct and indirect information through a Bayesian inverse modeling technique called the Method of Anchored Distributions (MAD). MAD relates indirect information to a target spatial random field via a forward simulation model. MAD# executes the inverse process by running the forward model multiple times to transfer information from the indirect data to the target variable. MAD# uses two parallelization profiles according to the computational resources available: one computer with multiple cores, or multiple computers with multiple cores through HTCondor. HTCondor is a system that manages a cluster of desktop computers and submits serial or parallel jobs using scheduling policies, resource monitoring, and a job queuing mechanism. This poster will show how MAD# reduces the execution time of the characterization of random fields using these two parallel approaches in different case studies. A test of the approach was conducted using a 1D problem with 400 cells to characterize saturated conductivity, residual water content, and shape parameters of the Mualem-van Genuchten model in four materials via the HYDRUS model. The number of simulations evaluated in the inversion was 10 million. Using the one-computer approach (eight cores), 100,000 simulations were evaluated in 12 hours (10 million would take approximately 1,200 hours). In the evaluation on HTCondor, 32 desktop computers (132 cores) were used, with a non-continuous processing time of 60 hours over five days. HTCondor reduced the processing time for uncertainty characterization by a factor of 20 (1,200 hours reduced to 60 hours).

  7. Adequate bases of phase space master integrals for gg → h at NNLO and beyond

    NASA Astrophysics Data System (ADS)

    Höschele, Maik; Hoff, Jens; Ueda, Takahiro

    2014-09-01

    We study master integrals needed to compute the Higgs boson production cross section via gluon fusion in the infinite top quark mass limit, using a canonical form of differential equations for master integrals, recently identified by Henn, which makes their solution possible in a straightforward algebraic way. We apply the known criteria to derive such a suitable basis for all the phase space master integrals in the aforementioned process at next-to-next-to-leading order in QCD and demonstrate that the method is applicable to next-to-next-to-next-to-leading order as well by solving a non-planar topology. Furthermore, we discuss in great detail how to find an adequate basis using practical examples. Special emphasis is devoted to master integrals which are coupled by their differential equations.

  8. A novel method for diagnosis of smear-negative tuberculosis patients by combining a random unbiased Phi29 amplification with a specific real-time PCR.

    PubMed

    Pang, Yu; Lu, Jie; Yang, Jian; Wang, Yufeng; Cohen, Chad; Ni, Xin; Zhao, Yanlin

    2015-07-01

    In this study, we develop a novel method for diagnosis of smear-negative tuberculosis patients by performing a random unbiased Phi29 amplification prior to the use of a specific real-time PCR. The limit of detection (LOD) of the conventional real-time PCR was 100 colony-forming units (CFU) of MTB genome/reaction, while the REPLI real-time PCR assay could detect 0.4 CFU/reaction. In comparison with the conventional real-time PCR, REPLI real-time PCR shows better sensitivity for the detection of smear-negative tuberculosis (P = 0.015).

  9. Free variable selection QSPR study to predict 19F chemical shifts of some fluorinated organic compounds using Random Forest and RBF-PLS methods

    NASA Astrophysics Data System (ADS)

    Goudarzi, Nasser

    2016-04-01

    In this work, two new and powerful chemometrics methods are applied for the modeling and prediction of the 19F chemical shift values of some fluorinated organic compounds. The radial basis function-partial least square (RBF-PLS) and random forest (RF) methods are employed to construct the models to predict the 19F chemical shifts. In this study, we did not use any separate variable selection method, as the RF method can serve as both a variable selection and a modeling technique. Effects of the important parameters affecting the RF prediction power, such as the number of trees (nt) and the number of randomly selected variables to split each node (m), were investigated. The root-mean-square errors of prediction (RMSEP) for the training set and the prediction set for the RBF-PLS and RF models were 44.70, 23.86, 29.77, and 23.69, respectively. Also, the correlation coefficients of the prediction set for the RBF-PLS and RF models were 0.8684 and 0.9313, respectively. The results obtained reveal that the RF model can be used as a powerful chemometrics tool for quantitative structure-property relationship (QSPR) studies.
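The figure of merit quoted above, the root-mean-square error of prediction, is a one-line computation. A minimal sketch; the chemical-shift arrays are invented examples, not the study's data:

```python
import math

def rmsep(y_true, y_pred):
    """Root-mean-square error of prediction over a held-out prediction set."""
    n = len(y_true)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)

# Invented 19F chemical-shift values (ppm), purely for illustration:
observed  = [-113.2, -63.7, -151.8]
predicted = [-110.0, -65.0, -150.0]
error = rmsep(observed, predicted)
```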

  10. Effects of smartphone diaries and personal dosimeters on behavior in a randomized study of methods to document sunlight exposure.

    PubMed

    Køster, Brian; Søndergaard, Jens; Nielsen, Jesper Bo; Allen, Martin; Bjerregaard, Mette; Olsen, Anja; Bentzen, Joan

    2016-06-01

    Dosimeters and diaries have previously been used to evaluate sun-related behavior and UV exposure in local samples. However, wearing a dosimeter or filling in a diary may cause a behavioral change. The aim of this study was to examine possible confounding factors for a questionnaire validation study. We examined the effects of wearing dosimeters and filling out diaries, measurement period and recall effect on sun-related behavior in Denmark in 2012. Our sample included 240 participants eligible by smartphone status and who took a vacation during weeks 26-32 in 2012, randomized by gender, age, education and skin type to six groups: 1) Control + diary, 2) Control, 3) 1-week dosimetry measurement, 4) 1-week dosimetry measurement + diary, 5) 3-week dosimetry measurement and 6) 1-week dosimetry measurement with a 4-week delayed questionnaire. Correlation coefficients between reported outdoor time and registered outdoor time for groups 3-6 were 0.39, 0.45, 0.43 and 0.09, respectively. Group 6 was the only group not significantly correlated. Questionnaire-reported outdoor exposure time was shorter in the dosimeter measurement groups (3-6) than in their respective controls. We showed that using a dosimeter or keeping a diary seems to increase attention towards the behavior examined and therefore may influence this behavior. Receiving the questionnaire with a 4-week delay had a significant negative influence on correlation and recall of sunburn. When planning future UV behavior questionnaire validations, we suggest using a 1-week interval for dosimetry measurements, no diary, and minimizing the time from the end of measurement to filling out questionnaires. PMID:27419038

  11. The use of propensity score methods with survival or time-to-event outcomes: reporting measures of effect similar to those used in randomized experiments.

    PubMed

    Austin, Peter C

    2014-03-30

    Propensity score methods are increasingly being used to estimate causal treatment effects in observational studies. In medical and epidemiological studies, outcomes are frequently time-to-event in nature. Propensity-score methods are often applied incorrectly when estimating the effect of treatment on time-to-event outcomes. This article describes how two different propensity score methods (matching and inverse probability of treatment weighting) can be used to estimate the measures of effect that are frequently reported in randomized controlled trials: (i) marginal survival curves, which describe survival in the population if all subjects were treated or if all subjects were untreated; and (ii) marginal hazard ratios. The use of these propensity score methods allows one to replicate the measures of effect that are commonly reported in randomized controlled trials with time-to-event outcomes: both absolute and relative reductions in the probability of an event occurring can be determined. We also provide guidance on variable selection for the propensity score model, highlight methods for assessing the balance of baseline covariates between treated and untreated subjects, and describe the implementation of a sensitivity analysis to assess the effect of unmeasured confounding variables on the estimated treatment effect when outcomes are time-to-event in nature. The methods in the paper are illustrated by estimating the effect of discharge statin prescribing on the risk of death in a sample of patients hospitalized with acute myocardial infarction. In this tutorial article, we describe and illustrate all the steps necessary to conduct a comprehensive analysis of the effect of treatment on time-to-event outcomes.
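The inverse-probability-of-treatment weighting described above boils down to reweighting each subject by the inverse of the probability of the treatment actually received. A minimal sketch: in practice the propensity scores come from a fitted logistic model, and the arrays here are invented for illustration.

```python
import numpy as np

def iptw_weights(treated, propensity):
    """ATE weights: 1/e(x) for treated subjects, 1/(1 - e(x)) for controls.
    These weights then feed a weighted Kaplan-Meier curve or a weighted
    Cox model to recover marginal survival curves and hazard ratios."""
    treated = np.asarray(treated, dtype=bool)
    ps = np.asarray(propensity, dtype=float)
    return np.where(treated, 1.0 / ps, 1.0 / (1.0 - ps))

# Invented example: two treated and two control subjects.
w = iptw_weights([1, 1, 0, 0], [0.25, 0.50, 0.25, 0.80])
```

A subject with a low probability of the treatment actually received stands in for many similar subjects, which is what balances measured covariates between the weighted groups.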

  12. Autonomous Byte Stream Randomizer

    NASA Technical Reports Server (NTRS)

    Paloulian, George K.; Woo, Simon S.; Chow, Edward T.

    2013-01-01

    Net-centric networking environments are often faced with limited resources and must utilize bandwidth as efficiently as possible. In networking environments that span wide areas, the data transmission has to be efficient without any redundant or extraneous metadata. The Autonomous Byte Stream Randomizer software provides an extra level of security on top of existing data encryption methods. Randomizing the data's byte stream adds an extra layer to existing data protection methods, thus making it harder for an attacker to decrypt protected data. Based on a generated cryptographically secure random seed, a random sequence of numbers is used to intelligently and efficiently swap the organization of bytes in data using the unbiased and memory-efficient in-place Fisher-Yates shuffle method. Swapping bytes and reorganizing the crucial structure of the byte data renders the data file unreadable and leaves the data in a deconstructed state. This deconstruction adds an extra level of security requiring the byte stream to be reconstructed with the random seed in order to be readable. Once the data byte stream has been randomized, the software enables the data to be distributed to N nodes in an environment. Each piece of the data in randomized and distributed form is a separate entity unreadable in its own right, but when combined with all N pieces, is able to be reconstructed back to one. Reconstruction requires possession of the key used for randomizing the bytes, leading to the generation of the same cryptographically secure random sequence of numbers used to randomize the data. This software is a cornerstone capability possessing the ability to generate the same cryptographically secure sequence on different machines and time intervals, thus allowing this software to be used more heavily in net-centric environments where data transfer bandwidth is limited.
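The seeded in-place Fisher-Yates shuffle named above can be sketched as follows. This is an illustrative sketch, not the NASA software: Python's `random.Random` is *not* cryptographically secure, whereas the actual tool derives its sequence from a cryptographically secure seed; the seed-derivation via SHA-256 is our own assumption.

```python
import hashlib
import random

def _rng_from_seed(seed: bytes) -> random.Random:
    # Illustrative seed derivation (assumption); NOT cryptographically secure.
    return random.Random(int.from_bytes(hashlib.sha256(seed).digest(), "big"))

def randomize_bytes(data: bytes, seed: bytes) -> bytes:
    """In-place Fisher-Yates shuffle of a byte stream, driven by a seeded PRNG."""
    rng = _rng_from_seed(seed)
    buf = bytearray(data)
    for i in range(len(buf) - 1, 0, -1):
        j = rng.randrange(i + 1)        # unbiased index in [0, i]
        buf[i], buf[j] = buf[j], buf[i]
    return bytes(buf)

def restore_bytes(data: bytes, seed: bytes) -> bytes:
    """Regenerate the same swap sequence from the seed and undo it in reverse."""
    rng = _rng_from_seed(seed)
    swaps = [(i, rng.randrange(i + 1)) for i in range(len(data) - 1, 0, -1)]
    buf = bytearray(data)
    for i, j in reversed(swaps):        # each swap is its own inverse
        buf[i], buf[j] = buf[j], buf[i]
    return bytes(buf)
```

Because the same seed reproduces the same swap sequence on any machine, a holder of the seed can reassemble the stream, which mirrors the reconstruction property the description relies on.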

  13. Children's behavioral pain reactions during local anesthetic injection using cotton-roll vibration method compared with routine topical anesthesia: A randomized controlled trial

    PubMed Central

    Bagherian, Ali; Sheikhfathollahi, Mahmood

    2016-01-01

    Background: Topical anesthesia has been widely advocated as an important component of atraumatic administration of intraoral local anesthesia. The aim of this study was to compare, through direct observation, children's behavioral pain reactions during local anesthetic injection using the cotton-roll vibration method versus routine topical anesthesia. Materials and Methods: Forty-eight children participated in this randomized controlled clinical trial. They received two separate inferior alveolar nerve block or primary maxillary molar infiltration injections on contralateral sides of the jaws by both cotton-roll vibration (a combination of topical anesthesia gel, cotton roll, and vibration for physical distraction) and control (routine topical anesthesia) methods. Behavioral pain reactions of children were measured according to the author-developed face, head, foot, hand, trunk, and cry (FHFHTC) scale, resulting in total scores between 0 and 18. Results: The total scores on the FHFHTC scale ranged between 0-5 and 0-10 in the cotton-roll vibration and control methods, respectively. The mean ± standard deviation values of total scores on the FHFHTC scale were lower with the cotton-roll vibration method (1.21 ± 1.38) than with the control method (2.44 ± 2.18), and this difference was statistically significant (P < 0.001). Conclusion: It may be concluded that the cotton-roll vibration method can be more helpful than routine topical anesthesia in reducing behavioral pain reactions in children during local anesthesia administration. PMID:27274349

  14. Utilizing Peer Nominations in Middle School: A Longitudinal Comparison between Complete Classroom-Based and Random List Methods

    ERIC Educational Resources Information Center

    Bellmore, Amy; Jiang, Xiao Lu; Juvonen, Jaana

    2010-01-01

    Although peer nominations provide invaluable data on social status and reputations of classmates, the large size and organizational structure of secondary schools pose a practical challenge to utilizing nomination methods. Particularly problematic is determining the appropriate reference group when students are no longer in self-contained…

  15. Comparison of Address-based Sampling and Random-digit Dialing Methods for Recruiting Young Men as Controls in a Case-Control Study of Testicular Cancer Susceptibility

    PubMed Central

    Clagett, Bartholt; Nathanson, Katherine L.; Ciosek, Stephanie L.; McDermoth, Monique; Vaughn, David J.; Mitra, Nandita; Weiss, Andrew; Martonik, Rachel; Kanetsky, Peter A.

    2013-01-01

    Random-digit dialing (RDD) using landline telephone numbers is the historical gold standard for control recruitment in population-based epidemiologic research. However, increasing cell-phone usage and diminishing response rates suggest that the effectiveness of RDD in recruiting a random sample of the general population, particularly for younger target populations, is decreasing. In this study, we compared landline RDD with alternative methods of control recruitment, including RDD using cell-phone numbers and address-based sampling (ABS), to recruit primarily white men aged 18–55 years into a study of testicular cancer susceptibility conducted in the Philadelphia, Pennsylvania, metropolitan area between 2009 and 2012. With few exceptions, eligible and enrolled controls recruited by means of RDD and ABS were similar with regard to characteristics for which data were collected on the screening survey. While we find ABS to be a comparably effective method of recruiting young males compared with landline RDD, we acknowledge the potential impact that selection bias may have had on our results because of poor overall response rates, which ranged from 11.4% for landline RDD to 1.7% for ABS. PMID:24008901

  16. Generating random density matrices

    NASA Astrophysics Data System (ADS)

    Życzkowski, Karol; Penson, Karol A.; Nechita, Ion; Collins, Benoît

    2011-06-01

    We study various methods to generate ensembles of random density matrices of a fixed size N, obtained by partial trace of pure states on composite systems. Structured ensembles of random pure states, invariant with respect to local unitary transformations, are introduced. To analyze statistical properties of quantum entanglement in bi-partite systems, we analyze the distribution of Schmidt coefficients of random pure states. Such a distribution is derived in the case of a superposition of k random maximally entangled states. For another ensemble, obtained by performing selective measurements in a maximally entangled basis on a multi-partite system, we show that this distribution is given by the Fuss-Catalan law, and we find the average entanglement entropy. A more general class of structured ensembles is proposed, containing also the Bures case; it forms an extension of the standard ensemble of structureless random pure states, described asymptotically, as N → ∞, by the Marchenko-Pastur distribution.
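The basic construction, a random density matrix as the partial trace of a random pure state on a composite system, is short enough to sketch. A NumPy sketch under our own naming: `n` is the system dimension and `k` the dimension of the traced-out environment.

```python
import numpy as np

def random_density_matrix(n, k, seed=None):
    """Induced-measure random density matrix of size n, obtained by partial
    trace over a k-dimensional environment of a random pure state."""
    rng = np.random.default_rng(seed)
    # Ginibre matrix: the n*k complex amplitudes of an unnormalized pure state.
    g = rng.normal(size=(n, k)) + 1j * rng.normal(size=(n, k))
    rho = g @ g.conj().T            # partial trace over the environment
    return rho / np.trace(rho)      # normalize to unit trace
```

The result is Hermitian, positive semidefinite, and of unit trace by construction; for k = n its eigenvalue distribution approaches the Marchenko-Pastur law as n grows, matching the asymptotic description in the abstract.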

  17. Subspace inverse power method and polynomial chaos representation for the modal frequency responses of random mechanical systems

    NASA Astrophysics Data System (ADS)

    Pagnacco, E.; de Cursi, E. Souza; Sampaio, R.

    2016-07-01

    This study concerns the computation of frequency responses of linear stochastic mechanical systems through a modal analysis. A new strategy, based on transposing standard deterministic deflated and subspace inverse power methods into a stochastic framework, is introduced via polynomial chaos representation. The applicability and effectiveness of the proposed schemes are demonstrated through three simple application examples and one realistic application example. It is shown that null and repeated-eigenvalue situations are addressed successfully.

  18. Intersection of race/ethnicity and gender in depression care: screening, access, and minimally adequate treatment

    PubMed Central

    Hahm, Hyeouk Chris; Cook, Benjamin; Ault-Brutus, Andrea; Alegria, Margarita

    2015-01-01

    Objectives This study was conducted to understand the interaction of race/ethnicity and gender in depression screening, any mental health care, and adequate care. Methods 2010–2012 electronic health records data of adult primary care patients from a New England urban health care system was used (n = 65,079). Multivariate logit regression models were used to assess the associations between race/ethnicity, gender, and other covariates with depression screening, any depression care among those screened positive, and adequate depression care among users. Secondly, disparities were evaluated by race/ethnicity and gender and incorporated differences due to insurance, marital status, and area-level SES measures. Findings Black and Asian males and females were less likely to be screened for depression compared to their white counterparts, while Latino males and females were more likely to be screened. Among those that screened PHQ-9>10, black males and females, Latino males, and Asian males and females were less likely to receive any mental health care than their white counterparts. The black-white disparity in screening was greater for females compared to males. The Latino-white disparity for any mental health care and adequacy of care was greater for males compared to females. Conclusions Our approach underscores the importance of identifying disparities at each step of depression care by both race/ethnicity and gender. Targeting certain groups in specific stages of care would be more effective (i.e., screening of black females, any mental health care and adequacy of care for Latino males) than a blanket approach to disparities reduction. PMID:25727113

  19. Are the Psychological Needs of Adolescent Survivors of Pediatric Cancer Adequately Identified and Treated?

    PubMed Central

    Kahalley, Lisa S.; Wilson, Stephanie J.; Tyc, Vida L.; Conklin, Heather M.; Hudson, Melissa M.; Wu, Shengjie; Xiong, Xiaoping; Stancel, Heather H.; Hinds, Pamela S.

    2012-01-01

    Objectives To describe the psychological needs of adolescent survivors of acute lymphoblastic leukemia (ALL) or brain tumor (BT), we examined: (a) the occurrence of cognitive, behavioral, and emotional concerns identified during a comprehensive psychological evaluation, and (b) the frequency of referrals for psychological follow-up services to address identified concerns. Methods Psychological concerns were identified on measures according to predetermined criteria for 100 adolescent survivors. Referrals for psychological follow-up services were made for concerns previously unidentified in formal assessment or not adequately addressed by current services. Results Most survivors (82%) exhibited at least one concern across domains: behavioral (76%), cognitive (47%), and emotional (19%). Behavioral concerns emerged most often on scales associated with executive dysfunction, inattention, learning, and peer difficulties. CRT was associated with cognitive concerns, χ2(1,N=100)=5.63, p<0.05. Lower income was associated with more cognitive concerns for ALL survivors, t(47)=3.28, p<0.01, and more behavioral concerns for BT survivors, t(48)=2.93, p<0.01. Of survivors with concerns, 38% were referred for psychological follow-up services. Lower-income ALL survivors received more referrals for follow-up, χ2(1,N=41)=8.05, p<0.01. Referred survivors had more concerns across domains than non-referred survivors, ALL: t(39)=2.96, p<0.01, BT: t(39)=3.52, p<0.01. Trends suggest ALL survivors may be at risk for experiencing unaddressed cognitive needs. Conclusions Many adolescent survivors of cancer experience psychological difficulties that are not adequately managed by current services, underscoring the need for long-term surveillance. In addition to prescribing regular psychological evaluations, clinicians should closely monitor whether current support services appropriately meet survivors’ needs, particularly for lower-income survivors and those treated with CRT. PMID:22278930

  20. Use of Linear Programming to Develop Cost-Minimized Nutritionally Adequate Health Promoting Food Baskets

    PubMed Central

    Tetens, Inge; Dejgård Jensen, Jørgen; Smed, Sinne; Gabrijelčič Blenkuš, Mojca; Rayner, Mike; Darmon, Nicole; Robertson, Aileen

    2016-01-01

    Background Food-Based Dietary Guidelines (FBDGs) are developed to promote healthier eating patterns, but increasing food prices may make healthy eating less affordable. The aim of this study was to design a range of cost-minimized nutritionally adequate health-promoting food baskets (FBs) that help prevent both micronutrient inadequacy and diet-related non-communicable diseases at lowest cost. Methods Average prices for 312 foods were collected within the Greater Copenhagen area. The cost and nutrient content of five different cost-minimized FBs for a family of four were calculated per day using linear programming. The FBs were defined using five different constraints: cultural acceptability (CA), or dietary guidelines (DG), or nutrient recommendations (N), or cultural acceptability and nutrient recommendations (CAN), or dietary guidelines and nutrient recommendations (DGN). The variety and number of foods in each of the resulting five baskets were increased by limiting the relative share of individual foods. Results The one-day version of N contained only 12 foods at the minimum cost of DKK 27 (€ 3.6). The CA, DG, and DGN cost about twice this, and the CAN cost ~DKK 81 (€ 10.8). The baskets with the greater variety of foods contained from 70 (CAN) to 134 (DGN) foods and cost between DKK 60 (€ 8.1, N) and DKK 125 (€ 16.8, DGN). Ensuring that the food baskets cover both dietary guidelines and nutrient recommendations doubled the cost, while cultural acceptability (CAN) tripled it. Conclusion Use of linear programming facilitates the generation of low-cost food baskets that are nutritionally adequate, health promoting, and culturally acceptable. PMID:27760131
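The linear-programming formulation above is the classic diet problem: minimize price times quantity subject to nutrient minima. A toy sketch with invented prices and nutrient contents (not the study's 312-food Copenhagen data), using `scipy.optimize.linprog`:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical mini diet problem: 3 foods, 2 nutrients (invented numbers).
prices = np.array([2.0, 3.5, 1.0])          # cost per unit of each food
nutrients = np.array([[3.0, 1.0, 0.5],      # nutrient 1 content per unit
                      [1.0, 4.0, 0.2]])     # nutrient 2 content per unit
requirements = np.array([8.0, 6.0])         # daily minimum for each nutrient

# Minimize prices @ x subject to nutrients @ x >= requirements and x >= 0.
# linprog takes upper-bound constraints, so the minima are negated.
res = linprog(c=prices,
              A_ub=-nutrients, b_ub=-requirements,
              bounds=[(0, None)] * len(prices))
basket, daily_cost = res.x, res.fun
```

Constraints such as cultural acceptability or dietary guidelines enter as extra rows of inequalities (e.g. caps on a food's share of the basket), which is how the study's five basket variants differ.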

  1. Selecting rRNA binding sites for the ribosomal proteins L4 and L6 from randomly fragmented rRNA: application of a method called SERF.

    PubMed

    Stelzl, U; Spahn, C M; Nierhaus, K H

    2000-04-25

    Two-thirds of the 54 proteins of the Escherichia coli ribosome interact directly with the rRNAs, but the rRNA binding sites of only a few proteins are known. We present a method (selection of random RNA fragments; SERF) that can identify the minimal binding region for proteins within ribonucleoprotein complexes such as the ribosome. The power of the method is exemplified with the ribosomal proteins L4 and L6. Binding sequences are identified for both proteins and characterized by phosphorothioate footprinting. Surprisingly, the binding region of L4, a 53-nt rRNA fragment of domain I of 23S rRNA, can simultaneously and independently bind L24, one of the two assembly initiator proteins of the large subunit.

  2. A facile method to enhance out-coupling efficiency in organic light-emitting diodes via a random-pyramids textured layer

    NASA Astrophysics Data System (ADS)

    Zhu, Wenqing; Xiao, Teng; Zhai, Guangsheng; Yu, Jingting; Shi, Guanjie; Chen, Guo; Wei, Bin

    2016-09-01

    We demonstrate a facile method to enhance light extraction in organic light-emitting diodes using a polymer layer with a texture consisting of random upright pyramids. The simple fabrication technique of the textured layer is based on silicon alkali-etching and imprint lithography. With the textured layer applied to the external face of the glass substrate, the organic light-emitting diode achieved a 26% enhancement of current efficiency and a 30% enhancement of power efficiency without spectral distortion over wide viewing angles. A ray-tracing optical simulation reveals that the textured layer can alter the traveling path of light and assist in out-coupling a large portion of light delivered into the substrate. The proposed method is a promising route to higher-efficiency organic light-emitting diodes, owing to its simple fabrication process and effective light extraction.
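
The out-coupling problem the textured layer addresses can be seen from a back-of-envelope escape-cone calculation: at a flat glass-air interface, only rays inside the Snell critical angle leave the substrate. The numbers below (n = 1.5 glass, isotropic source, one exit face) are illustrative, not from the paper:

```python
import math

n_glass = 1.5
theta_c = math.asin(1.0 / n_glass)               # critical angle at glass/air
# Solid-angle fraction of an isotropic source inside the escape cone,
# through one face of the substrate.
escape_fraction = (1.0 - math.cos(theta_c)) / 2.0
print(math.degrees(theta_c), escape_fraction)    # ~41.8 deg, ~0.13
```

Roughly 87% of the light reaching the flat face is totally internally reflected; a random-pyramid texture redirects rays that fall outside the cone, which is why the ray-tracing simulation attributes the efficiency gain to out-coupling substrate-trapped light.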

  3. On Convergent Probability of a Random Walk

    ERIC Educational Resources Information Center

    Lee, Y.-F.; Ching, W.-K.

    2006-01-01

    This note introduces an interesting random walk on a straight path with cards of random numbers. The method of recurrent relations is used to obtain the convergent probability of the random walk with different initial positions.
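
The method of recurrent relations the note uses can be illustrated on the simplest such walk, the symmetric gambler's ruin on {0, …, N} (an assumed stand-in for the note's specific card-walk, which is not reproduced here):

```python
def absorption_probability(n_sites, start, p_right=0.5, sweeps=10000):
    """P(a walk started at `start` hits site n_sites before site 0)."""
    p = [0.0] * (n_sites + 1)
    p[n_sites] = 1.0                    # boundary conditions
    # Fixed-point (Gauss-Seidel) iteration on the recurrence
    # p_k = (1 - p_right) * p_{k-1} + p_right * p_{k+1}.
    for _ in range(sweeps):
        for k in range(1, n_sites):
            p[k] = (1 - p_right) * p[k - 1] + p_right * p[k + 1]
    return p[start]

# For the symmetric walk the closed form is start / n_sites.
print(absorption_probability(10, 3))    # converges to 0.3
```

The same recurrence with p_right ≠ 0.5 reproduces the biased-walk closed form (1 − r^k)/(1 − r^N) with r = (1 − p)/p, which is how different initial positions yield different convergent probabilities.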

  4. Calculation of the Cost of an Adequate Education in Kentucky: A Professional Judgment Approach

    ERIC Educational Resources Information Center

    Verstegen, Deborah A.

    2004-01-01

    What is an adequate education and how much does it cost? In 1989, Kentucky's State Supreme Court found the entire system of education unconstitutional--"all of its parts and parcels". The Court called for all children to have access to an adequate education, one that is uniform and has as its goal the development of seven capacities, including:…

  5. 40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Exemptions for pesticides adequately... PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Exemptions § 152.20 Exemptions for pesticides adequately regulated by another Federal agency. The...

  6. 40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 24 2011-07-01 2011-07-01 false Exemptions for pesticides adequately... PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Exemptions § 152.20 Exemptions for pesticides adequately regulated by another Federal agency. The...

  7. 21 CFR 801.5 - Medical devices; adequate directions for use.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Medical devices; adequate directions for use. 801.5 Section 801.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES LABELING General Labeling Provisions § 801.5 Medical devices; adequate...

  8. 21 CFR 801.5 - Medical devices; adequate directions for use.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Medical devices; adequate directions for use. 801.5 Section 801.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES LABELING General Labeling Provisions § 801.5 Medical devices; adequate...

  9. 21 CFR 801.5 - Medical devices; adequate directions for use.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Medical devices; adequate directions for use. 801.5 Section 801.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES LABELING General Labeling Provisions § 801.5 Medical devices; adequate...

  10. 21 CFR 801.5 - Medical devices; adequate directions for use.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Medical devices; adequate directions for use. 801.5 Section 801.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES LABELING General Labeling Provisions § 801.5 Medical devices; adequate...

  11. 40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... regulated by another Federal agency. 152.20 Section 152.20 Protection of Environment ENVIRONMENTAL... Exemptions § 152.20 Exemptions for pesticides adequately regulated by another Federal agency. The pesticides... has determined, in accordance with FIFRA sec. 25(b)(1), that they are adequately regulated by...

  12. 40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... regulated by another Federal agency. 152.20 Section 152.20 Protection of Environment ENVIRONMENTAL... Exemptions § 152.20 Exemptions for pesticides adequately regulated by another Federal agency. The pesticides... has determined, in accordance with FIFRA sec. 25(b)(1), that they are adequately regulated by...

  13. 40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... regulated by another Federal agency. 152.20 Section 152.20 Protection of Environment ENVIRONMENTAL... Exemptions § 152.20 Exemptions for pesticides adequately regulated by another Federal agency. The pesticides... has determined, in accordance with FIFRA sec. 25(b)(1), that they are adequately regulated by...

  14. 42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 3 2014-10-01 2014-10-01 false Adequate financial records, statistical data, and....568 Adequate financial records, statistical data, and cost finding. (a) Maintenance of records. (1) An HMO or CMP must maintain sufficient financial records and statistical data for proper determination...

  15. 42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 3 2013-10-01 2013-10-01 false Adequate financial records, statistical data, and....568 Adequate financial records, statistical data, and cost finding. (a) Maintenance of records. (1) An HMO or CMP must maintain sufficient financial records and statistical data for proper determination...

  16. 42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 3 2012-10-01 2012-10-01 false Adequate financial records, statistical data, and....568 Adequate financial records, statistical data, and cost finding. (a) Maintenance of records. (1) An HMO or CMP must maintain sufficient financial records and statistical data for proper determination...

  17. The Healthy Young Men's Study: Sampling Methods to Recruit a Random Cohort of Young Men Who Have Sex with Men.

    PubMed

    Ford, Wesley L; Weiss, George; Kipke, Michele D; Ritt-Olson, Anamara; Iverson, Ellen; Lopez, Donna

    2009-10-01

    Recruiting a scientifically sound cohort of young men who have sex with men (YMSM) is an enduring research challenge. The few cohort studies that have been conducted to date on YMSM have relied on non-probability sampling methods to construct their cohorts. While these studies have provided valuable information about HIV risk behaviors among YMSM, their generalizability to broader YMSM populations is limited. In this paper the authors describe a venue-based sampling methodology used to recruit a large and diverse cohort of YMSM from public venues in Los Angeles County. Venue-based sampling is a multi-stage, probability sampling design that uses standard outreach techniques and standard survey methods to systematically enumerate, sample, and survey hard-to-reach populations. The study design allowed the authors to estimate individual, familial and interpersonal psychosocial factors associated with HIV risk and health seeking behaviors for a cohort of YMSM with known properties. Study participants completed an extensive baseline survey and over a two-year period will complete four follow-up surveys at six-month intervals. The baseline survey was administered in both English and Spanish.
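
The multi-stage structure of venue-based sampling can be sketched schematically: venues are first sampled with probability proportional to estimated attendance (PPS), then attendees are sampled within each selected venue. All venue names and counts below are invented for illustration, not from the study:

```python
import random

random.seed(7)
# Hypothetical venue frame with estimated attendance counts, built during
# the enumeration stage.
venues = {"club_a": 120, "park_b": 45, "cafe_c": 60, "event_d": 200}

def pps_sample(frame, k):
    """Stage 1: sample venues with probability proportional to size."""
    names = list(frame)
    return random.choices(names, weights=[frame[v] for v in names], k=k)

stage1 = pps_sample(venues, k=3)                 # venues (with replacement)
# Stage 2: a fixed-size random sample of attendees within each venue.
stage2 = {v: random.sample(range(venues[v]), 10) for v in set(stage1)}
print(stage1, sorted(stage2))
```

Because both stages use known selection probabilities, each respondent's overall inclusion probability is computable, which is what gives the resulting cohort its "known properties" in contrast to convenience samples.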

  18. Implementation of low-kurtosis pseudo-random excitations to compensate for the effects of nonlinearity on damping estimation by the half-power method

    NASA Astrophysics Data System (ADS)

    Steinwolf, A.; Schwarzendahl, S. M.; Wallaschek, J.

    2014-02-01

    Pseudo-random excitation with low crest factor is less likely to force a structure under test into nonlinear behavior, which should be avoided, or at least minimized, in the practice of experimental modal analysis. However, simply cutting high peaks and removing them from the excitation time history is not an option because such clipping of the signal introduces frequency distortions of the amplitude spectrum. A better approach is to manipulate the phases of the harmonics before generating the time history instead of clipping it afterwards. To do so, a new parameter, kurtosis, is used in this paper to characterize the high-peak behavior of pseudo-random excitations. An analytical solution is obtained for how the phases should be selected in order to reduce kurtosis and make modal testing excitations smoother, with less extreme peaks. This solution was implemented for evaluation of the damping ratio of a SDOF system by the half-power method in the presence of an additional cubic term in the equation of motion. The system response obtained by numerical integration was treated as modal analysis data; the kurtosis-optimized excitation compensated for the effect of nonlinearity and allowed the damping ratio to be identified with good precision, whereas an ordinary Gaussian excitation with randomized phases caused an error of 75 percent. A comparison with numerical crest factor minimization by time-frequency-domain swapping is made, and experimental results from a modal testing rig with a realistic turbine blade are also presented.
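
The core idea — same amplitude spectrum, different phases, very different peak behavior — is easy to demonstrate. Below, a flat-spectrum multisine is synthesized once with random phases and once with Schroeder phases (a standard low-crest-factor phase rule, used here as a stand-in for the paper's analytical kurtosis solution):

```python
import numpy as np

rng = np.random.default_rng(0)
n_harm, n_samp = 64, 4096
t = np.arange(n_samp) / n_samp          # one period of the periodic signal

def multisine(phases):
    """Flat-spectrum multisine: sum of equal-amplitude harmonics."""
    k = np.arange(1, n_harm + 1)[:, None]
    return np.cos(2 * np.pi * k * t + np.asarray(phases)[:, None]).sum(axis=0)

def kurtosis(x):
    x = x - x.mean()
    return np.mean(x**4) / np.mean(x**2) ** 2

harm = np.arange(1, n_harm + 1)
random_phases = rng.uniform(0, 2 * np.pi, n_harm)
schroeder_phases = -np.pi * harm * (harm - 1) / n_harm

# Random phases: kurtosis near the Gaussian value of 3 (occasional high
# peaks). Schroeder phases: flat envelope, markedly lower kurtosis.
print(kurtosis(multisine(random_phases)), kurtosis(multisine(schroeder_phases)))
```

Both signals excite every harmonic with identical amplitude, so the frequency response measurement is unaffected; only the time-domain peakiness, and hence the tendency to drive the structure nonlinear, changes.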

  19. Randomized Response Analysis in Mplus

    ERIC Educational Resources Information Center

    Hox, Joop; Lensvelt-Mulders, Gerty

    2004-01-01

    This article describes a technique to analyze randomized response data using available structural equation modeling (SEM) software. The randomized response technique was developed to obtain estimates that are more valid when studying sensitive topics. The basic feature of all randomized response methods is that the data are deliberately…
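
The mechanism behind all such designs can be sketched with Warner's original randomized response model: each respondent privately answers either the sensitive question or its negation, chosen by a randomizing device with known probability, and the prevalence is recovered from the aggregate. The numbers below are simulated for illustration:

```python
import random

random.seed(42)
pi_true = 0.30    # true prevalence of the sensitive attribute (simulated)
p = 0.70          # probability the device selects the direct question
n = 100_000

yes = 0
for _ in range(n):
    sensitive = random.random() < pi_true
    direct = random.random() < p
    # Warner's design: answer the question, or its negation, unobserved.
    yes += sensitive if direct else not sensitive

lam = yes / n                          # observed proportion of "yes"
pi_hat = (lam + p - 1) / (2 * p - 1)   # Warner (1965) unbiased estimator
print(round(pi_hat, 3))                # recovers ~0.30 despite masked answers
```

No individual answer reveals the respondent's status, yet the known mixing probability p makes the population proportion identifiable, which is what the SEM analysis in the article builds on.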

  20. Mindfulness-Based Stress Reduction for Overweight/Obese Women With and Without Polycystic Ovary Syndrome: Design and Methods of a Pilot Randomized Controlled Trial

    PubMed Central

    Raja-Khan, Nazia; Agito, Katrina; Shah, Julie; Stetter, Christy M.; Gustafson, Theresa S.; Socolow, Holly; Kunselman, Allen R.; Reibel, Diane K.; Legro, Richard S.

    2015-01-01

    Mindfulness-based stress reduction (MBSR) may be beneficial for overweight/obese women, including women with polycystic ovary syndrome (PCOS), as it has been shown to reduce psychological distress and improve quality of life in other patient populations. Preliminary studies suggest that MBSR may also have salutary effects on blood pressure and blood glucose. This paper describes the design and methods of an ongoing pilot randomized controlled trial evaluating the feasibility and effects of MBSR in PCOS and non-PCOS women who are overweight or obese. Eighty-six (86) women with body mass index ≥25 kg/m2, including 31 women with PCOS, have been randomized to 8 weeks of MBSR or health education control, and followed for 16 weeks. The primary outcome is mindfulness assessed with the Toronto Mindfulness Scale. Secondary outcomes include measures of blood pressure, blood glucose, quality of life, anxiety and depression. Our overall hypothesis is that MBSR will increase mindfulness and ultimately lead to favorable changes in blood pressure, blood glucose, psychological distress and quality of life in PCOS and non-PCOS women. This would support the integration of MBSR with conventional medical treatments to reduce psychological distress, cardiovascular disease and diabetes in PCOS and non-PCOS women who are overweight or obese. PMID:25662105

  1. Analyzing indirect effects in cluster randomized trials. The effect of estimation method, number of groups and group sizes on accuracy and power.

    PubMed

    Hox, Joop J; Moerbeek, Mirjam; Kluytmans, Anouck; van de Schoot, Rens

    2014-01-01

    Cluster randomized trials assess the effect of an intervention that is carried out at the group or cluster level. Ajzen's theory of planned behavior is often used to model the effect of the intervention as an indirect effect mediated in turn by attitude, norms and behavioral intention. Structural equation modeling (SEM) is the technique of choice to estimate indirect effects and their significance. However, this is a large-sample technique, and its application in a cluster randomized trial assumes a relatively large number of clusters. In practice, the number of clusters in these studies tends to be small, often well below fifty. This study uses simulation methods to find the lowest number of clusters needed when multilevel SEM is used to estimate the indirect effect. Maximum likelihood estimation is compared to Bayesian analysis, with the central quality criteria being accuracy of the point estimate and the confidence interval. We also investigate the power of the test for the indirect effect. We conclude that Bayesian estimation works well with much smaller cluster-level sample sizes, such as 20 clusters, than maximum likelihood estimation; although the bias is larger, the coverage is much better. When only 5-10 clusters are available per treatment condition, problems occur even with Bayesian estimation. PMID:24550881
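
The estimation problem can be made concrete with a toy simulation of the design: treatment assigned at the cluster level, a mediator and outcome measured per individual, and the indirect effect estimated naively as the product of two single-level regression slopes. This sketch (invented effect sizes, no multilevel model) only illustrates the data structure and why the naive product-of-coefficients estimate is biased by shared cluster effects:

```python
import random

random.seed(1)
a_true, b_true = 0.5, 0.4     # treatment -> mediator, mediator -> outcome
n_clusters, n_per = 20, 15    # few clusters, as in the paper's scenarios

x, m, y = [], [], []
for c in range(n_clusters):
    treat = c % 2                      # cluster-level assignment
    u = random.gauss(0, 0.3)           # shared cluster random effect
    for _ in range(n_per):
        med = a_true * treat + u + random.gauss(0, 1)
        out = b_true * med + u + random.gauss(0, 1)
        x.append(treat); m.append(med); y.append(out)

def slope(xs, ys):
    """Ordinary least-squares slope of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    return num / sum((a - mx) ** 2 for a in xs)

# Naive single-level product-of-coefficients estimate of the indirect
# effect (true value a_true * b_true = 0.2). The shared cluster effect u
# confounds the mediator-outcome slope, which is exactly why the paper
# estimates the indirect effect with multilevel SEM instead.
print(slope(x, m) * slope(m, y))
```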

  2. The effectiveness of the McKenzie method in addition to first-line care for acute low back pain: a randomized controlled trial

    PubMed Central

    2010-01-01

    Background Low back pain is a highly prevalent and disabling condition worldwide. Clinical guidelines for the management of patients with acute low back pain recommend first-line treatment consisting of advice, reassurance and simple analgesics. Exercise is also commonly prescribed to these patients. The primary aim of this study was to evaluate the short-term effect of adding the McKenzie method to the first-line care of patients with acute low back pain. Methods A multi-centre randomized controlled trial with a 3-month follow-up was conducted between September 2005 and June 2008. Patients seeking care for acute non-specific low back pain from primary care medical practices were screened. Eligible participants were assigned to receive a treatment programme based on the McKenzie method and first-line care (advice, reassurance and time-contingent acetaminophen) or first-line care alone, for 3 weeks. Primary outcome measures included pain (0-10 Numeric Rating Scale) over the first seven days, pain at 1 week, pain at 3 weeks and global perceived effect (-5 to 5 scale) at 3 weeks. Treatment effects were estimated using linear mixed models. Results One hundred and forty-eight participants were randomized into study groups, of whom 138 (93%) completed the last follow-up. The addition of the McKenzie method to first-line care produced statistically significant but small reductions in pain when compared to first-line care alone: mean of -0.4 points (95% confidence interval, -0.8 to -0.1) at 1 week, -0.7 points (95% confidence interval, -1.2 to -0.1) at 3 weeks, and -0.3 points (95% confidence interval, -0.5 to -0.0) over the first 7 days. Patients receiving the McKenzie method did not show additional effects on global perceived effect, disability, function or on the risk of persistent symptoms. These patients sought less additional health care than those receiving only first-line care (P = 0.002). 
Conclusions When added to the currently recommended first-line care of acute

  3. Mimicking the quasi-random assembly of protein fibers in the dermis by freeze-drying method.

    PubMed

    Ghaleh, Hakimeh; Abbasi, Farhang; Alizadeh, Mina; Khoshfetrat, Ali Baradar

    2015-04-01

    Freeze-drying is extensively used for the fabrication of porous materials in tissue engineering and biomedical applications, due to its versatility and avoidance of toxic solvents. However, it has some significant drawbacks: the conventional freeze-drying technique leads to heterogeneous porous structures with side-orientated columnar pores. Because the top and bottom surfaces of the sample are not in contact with similar environments, the different rates of heat transfer at the surfaces and the temperature gradient across the sample establish a preferential direction of heat transfer. To achieve a scaffold with a desirable microstructure for skin tissue engineering, the freeze-drying method was modified by controlling the cooling rate and regulating heat transfer across the sample during the freezing step. This modification could create a homogeneous porous structure with more equiaxed, non-oriented pores. Freezing the polymeric solution in an aluminum mold enhanced pore interconnectivity relative to a polystyrene mold. The influence of the recrystallization process on the mean pore size of the scaffold was discussed in relation to the final freezing temperature: a higher final freezing temperature more readily provides the energy required for recrystallization, which leads to enlarged ice crystals and, consequently, larger pores.

  4. Mimicking the quasi-random assembly of protein fibers in the dermis by freeze-drying method.

    PubMed

    Ghaleh, Hakimeh; Abbasi, Farhang; Alizadeh, Mina; Khoshfetrat, Ali Baradar

    2015-04-01

    Freeze-drying is extensively used for the fabrication of porous materials in tissue engineering and biomedical applications, due to its versatility and avoidance of toxic solvents. However, it has some significant drawbacks: the conventional freeze-drying technique leads to heterogeneous porous structures with side-orientated columnar pores. Because the top and bottom surfaces of the sample are not in contact with similar environments, the different rates of heat transfer at the surfaces and the temperature gradient across the sample establish a preferential direction of heat transfer. To achieve a scaffold with a desirable microstructure for skin tissue engineering, the freeze-drying method was modified by controlling the cooling rate and regulating heat transfer across the sample during the freezing step. This modification could create a homogeneous porous structure with more equiaxed, non-oriented pores. Freezing the polymeric solution in an aluminum mold enhanced pore interconnectivity relative to a polystyrene mold. The influence of the recrystallization process on the mean pore size of the scaffold was discussed in relation to the final freezing temperature: a higher final freezing temperature more readily provides the energy required for recrystallization, which leads to enlarged ice crystals and, consequently, larger pores. PMID:25687012

  5. 45 CFR 1159.15 - Who has the responsibility for maintaining adequate technical, physical, and security safeguards...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... adequate technical, physical, and security safeguards to prevent unauthorized disclosure or destruction of... adequate technical, physical, and security safeguards to prevent unauthorized disclosure or destruction of... of maintaining adequate technical, physical, and security safeguards to prevent...

  6. A cluster-randomized, placebo-controlled, maternal vitamin a or beta-carotene supplementation trial in bangladesh: design and methods

    PubMed Central

    2011-01-01

    Background We present the design, methods and population characteristics of a large community trial that assessed the efficacy of a weekly supplement containing vitamin A or beta-carotene, at recommended dietary levels, in reducing maternal mortality from early gestation through 12 weeks postpartum. We identify challenges faced and report solutions in implementing an intervention trial under low-resource, rural conditions, including the importance of population choice in promoting generalizability, maintaining rigorous data quality control to reduce inter- and intra-worker variation, and optimizing the flow of information and resources to and from the field. Methods This trial was a double-masked, cluster-randomized, dual intervention, placebo-controlled trial in a contiguous rural area of ~435 sq km with a population of ~650,000 in Gaibandha and Rangpur Districts of Northwestern Bangladesh. Approximately 120,000 married women of reproductive age underwent 5-weekly home surveillance, of whom ~60,000 were detected as pregnant, enrolled into the trial and gave birth to ~44,000 live-born infants. Upon enrollment, at ~9 weeks' gestation, pregnant women received a weekly oral supplement containing vitamin A (7000 ug retinol equivalents (RE)), beta-carotene (42 mg, or ~7000 ug RE) or a placebo through 12 weeks postpartum, according to prior randomized allocation of their cluster of residence. Systems described include enlistment and 5-weekly home surveillance for pregnancy based on menstrual history and urine testing, weekly supervised supplementation, periodic risk factor interviews, maternal and infant vital outcome monitoring, birth defect surveillance and clinical/biochemical substudies. Results The primary outcome was pregnancy-related mortality assessed for 3 months following parturition. 
Secondary outcomes included fetal loss due to miscarriage or stillbirth, infant mortality under three months of age, maternal obstetric and infectious morbidity, infant

  7. Mixing a grounded theory approach with a randomized controlled trial related to intimate partner violence: what challenges arise for mixed methods research?

    PubMed

    Catallo, Cristina; Jack, Susan M; Ciliska, Donna; Macmillan, Harriet L

    2013-01-01

    Little is known about how to systematically integrate complex qualitative studies within the context of randomized controlled trials. A two-phase sequential explanatory mixed methods study was conducted in Canada to understand how women decide to disclose intimate partner violence in emergency department settings. Mixing an RCT (with a subanalysis of data) with a grounded theory approach required methodological modifications to maintain the overall rigour of this mixed methods study. Modifications were made to the following areas of the grounded theory approach to support the overall integrity of the mixed methods study design: recruitment of participants, maximum variation and negative case sampling, data collection, and analysis methods. Recommendations for future studies include: (1) planning at the outset to incorporate a qualitative approach with an RCT and to determine logical points during the RCT to integrate the qualitative component and (2) consideration for the time needed to carry out an RCT and a grounded theory approach, especially to support recruitment, data collection, and analysis. Data mixing strategies should be considered during early stages of the study, so that appropriate measures can be developed and used in the RCT to support initial coding structures and data analysis needs of the grounded theory phase.

  8. Comparison of the compact dry TC and 3M petrifilm ACP dry sheet media methods with the spiral plate method for the examination of randomly selected foods for obtaining aerobic colony counts.

    PubMed

    Ellis, P; Meldrum, R

    2002-02-01

    Two hundred thirty-six randomly selected food and milk samples were examined to obtain aerobic colony counts by two dry sheet media methods and a standard Public Health Laboratory Service spiral plate method. Results for 40 samples were outside the limits of detection for one or more of the tested methods and were not considered. (The limits of detection were 200 to 1 x 10(8) CFU/ml for the spiral plate method and 100 to 3 x 10(6) CFU/ml for the dry sheet media methods.) The remaining 196 sets of results were analyzed further. When the results from the three methods were compared, correlation coefficients were all >0.80, and slopes and intercepts were close to 1.0 and 0.0, respectively. Mean log values and standard deviations were very similar for all three methods. The results were evaluated according to published UK guidelines for ready-to-eat foods sampled at the point of sale, which include a quality acceptability assessment based on aerobic colony counts. Eighty-six percent of the comparable results gave the same verdict with regard to acceptability according to the aerobic colony count guidelines. Both dry sheet media methods were comparable to the spiral plate method and can be recommended for the examination of food.

  9. Random Vibrations

    NASA Technical Reports Server (NTRS)

    Messaro, Semma; Harrison, Phillip

    2010-01-01

    Ares I Zonal Random vibration environments due to acoustic impingement and combustion processes are developed for liftoff, ascent and reentry. Random Vibration test criteria for Ares I Upper Stage pyrotechnic components are developed by enveloping the applicable zonal environments where each component is located. Random vibration tests will be conducted to assure that these components will survive and function appropriately after exposure to the expected vibration environments. Methodology: Random Vibration test criteria were desired that would envelop all the applicable environments where each Upper Stage pyrotechnic component is located. Applicable Ares I Vehicle drawings and design information needed to be assessed to determine the location(s) for each component on the Ares I Upper Stage. Design and test criteria needed to be developed by plotting and enveloping the applicable environments using Microsoft Excel and documenting them in a report using Microsoft Word. Conclusion: Random vibration liftoff, ascent, and green run design and test criteria for the Upper Stage pyrotechnic components were developed by using Microsoft Excel to envelop the zonal environments applicable to each component. Results were transferred from Excel into a report using Microsoft Word. After the report is reviewed and edited by my mentor, it will be submitted for publication as an attachment to a memorandum. Pyrotechnic component designers will extract criteria from my report for incorporation into the design and test specifications for components. Eventually the hardware will be tested to the environments I developed, to assure that the components will survive and function appropriately after exposure to the expected vibration environments.

  10. Fully automatic segmentation of multiple sclerosis lesions in brain MR FLAIR images using adaptive mixtures method and Markov random field model.

    PubMed

    Khayati, Rasoul; Vafadust, Mansur; Towhidkhah, Farzad; Nabavi, Massood

    2008-03-01

    In this paper, an approach is proposed for fully automatic segmentation of MS lesions in fluid attenuated inversion recovery (FLAIR) Magnetic Resonance (MR) images. The proposed approach, based on a Bayesian classifier, utilizes the adaptive mixtures method (AMM) and Markov random field (MRF) model to obtain and upgrade the class conditional probability density function (CCPDF) and the a priori probability of each class. To compare the performance of the proposed approach with those of previous approaches including manual segmentation, the similarity criteria of different slices related to 20 MS patients were calculated. Also, volumetric comparison of lesions volume between the fully automated segmentation and the gold standard was performed using correlation coefficient (CC). The results showed a better performance for the proposed approach, compared to those of previous works.
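
The two ingredients named above — a class-conditional (Gaussian) likelihood and an MRF spatial prior — can be combined in a toy segmenter. The sketch below uses a fixed two-class Gaussian model with a Potts prior and ICM label updates on a tiny synthetic "image"; it is a schematic analogue, not the paper's adaptive mixtures method:

```python
import random

random.seed(0)
H = W = 16
# Synthetic "slice": left half background (mean 0), right half lesion
# (mean 2), with Gaussian noise. All numbers are invented.
img = [[random.gauss(0.0 if col < W // 2 else 2.0, 0.7) for col in range(W)]
       for row in range(H)]

means, sigma, beta = [0.0, 2.0], 0.7, 1.5   # class models, MRF coupling

def local_energy(label, r, c, labels):
    """Negative log-likelihood plus Potts penalty for unlike neighbours."""
    e = (img[r][c] - means[label]) ** 2 / (2 * sigma ** 2)
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < H and 0 <= nc < W and labels[nr][nc] != label:
            e += beta
    return e

# Initialize by likelihood alone, then run a few ICM sweeps.
labels = [[min((0, 1), key=lambda k: (img[r][c] - means[k]) ** 2)
           for c in range(W)] for r in range(H)]
for _ in range(5):
    for r in range(H):
        for c in range(W):
            labels[r][c] = min((0, 1), key=lambda k: local_energy(k, r, c, labels))

accuracy = sum(labels[r][c] == (c >= W // 2) for r in range(H)
               for c in range(W)) / (H * W)
print(accuracy)   # MRF smoothing removes most isolated noise errors
```

The Potts term is what makes the prior "Markov random field": each pixel's label depends on the data only through its local neighbourhood, so spatially isolated misclassifications are penalized away.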

  11. Quantifying data retention of perpendicular spin-transfer-torque magnetic random access memory chips using an effective thermal stability factor method

    SciTech Connect

    Thomas, Luc; Jan, Guenole; Le, Son; Wang, Po-Kang

    2015-04-20

    The thermal stability of perpendicular Spin-Transfer-Torque Magnetic Random Access Memory (STT-MRAM) devices is investigated at chip level. Experimental data are analyzed in the framework of the Néel-Brown model including distributions of the thermal stability factor Δ. We show that in the low error rate regime important for applications, the effect of distributions of Δ can be described by a single quantity, the effective thermal stability factor Δ{sub eff}, which encompasses both the median and the standard deviation of the distributions. Data retention of memory chips can be assessed accurately by measuring Δ{sub eff} as a function of device diameter and temperature. We apply this method to show that 54 nm devices based on our perpendicular STT-MRAM design meet our 10 year data retention target up to 120 °C.
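
The idea of collapsing a distribution of Δ into a single effective value can be sketched numerically with the Néel-Brown model. All parameters below (attempt time τ0 = 1 ns, median Δ = 60, s.d. = 4, 10-year target) are generic illustrative values, not the paper's measured ones:

```python
import math

tau0 = 1e-9                       # attempt time in seconds (typical assumption)
t = 10 * 365.25 * 24 * 3600.0     # 10-year retention target in seconds

def fail_prob(delta):
    """Per-bit Neel-Brown retention failure probability at time t."""
    return 1.0 - math.exp(-(t / tau0) * math.exp(-delta))

# Chip-level failure rate: average fail_prob over a normal distribution
# of Delta across bits, by simple quadrature over +/- 6 sigma.
mu, sigma, n = 60.0, 4.0, 2001
grid = [mu + sigma * (-6 + 12 * i / (n - 1)) for i in range(n)]
w = [math.exp(-0.5 * ((d - mu) / sigma) ** 2) for d in grid]
avg_fail = sum(wi * fail_prob(d) for wi, d in zip(w, grid)) / sum(w)

# Delta_eff: the single thermal stability factor reproducing that rate.
delta_eff = math.log((t / tau0) / -math.log(1.0 - avg_fail))
print(avg_fail, delta_eff)   # the low-Delta tail pulls Delta_eff below 60
```

Because the failure rate is dominated by the weak-bit tail, Δ_eff lands well below the median Δ, which is why a single effective quantity can summarize both the median and the spread of the distribution in the low-error-rate regime.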

  12. Uniform random number generators

    NASA Technical Reports Server (NTRS)

    Farr, W. R.

    1971-01-01

    Methods are presented for the generation of random numbers with uniform and normal distributions. Subprogram listings of Fortran generators for the Univac 1108, SDS 930, and CDC 3200 digital computers are also included. The generators are of the mixed multiplicative type, and the mathematical method employed is that of Marsaglia and Bray.
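
A minimal sketch of the two pieces the abstract mentions: a mixed (multiplicative-with-increment) linear congruential generator for the uniform distribution, and a Marsaglia-style polar transform for the normal distribution. The constants are the well-known 32-bit Numerical Recipes values, standing in for the machine-specific constants of the 1971 Univac/SDS/CDC listings:

```python
import math

class MixedLCG:
    """Mixed (multiplicative + additive) congruential generator."""

    def __init__(self, seed=12345):
        self.state = seed
        self.a, self.c, self.m = 1664525, 1013904223, 2 ** 32

    def uniform(self):
        """Next pseudo-random number, uniform on [0, 1)."""
        self.state = (self.a * self.state + self.c) % self.m
        return self.state / self.m

    def normal(self):
        """Standard normal deviate via Marsaglia's polar (rejection) method."""
        while True:
            u = 2.0 * self.uniform() - 1.0
            v = 2.0 * self.uniform() - 1.0
            s = u * u + v * v
            if 0.0 < s < 1.0:
                return u * math.sqrt(-2.0 * math.log(s) / s)

rng = MixedLCG()
mean = sum(rng.uniform() for _ in range(10_000)) / 10_000
print(mean)   # close to 0.5 for a well-behaved uniform generator
```

The "mixed" designation refers to the nonzero increment c: a purely multiplicative generator uses c = 0, while the increment lets the sequence attain full period 2^32 over all states.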

  13. Knowledge and Informed Decision-Making about Population-Based Colorectal Cancer Screening Participation in Groups with Low and Adequate Health Literacy

    PubMed Central

    Essink-Bot, M. L.; Dekker, E.; Timmermans, D. R. M.; Uiters, E.; Fransen, M. P.

    2016-01-01

    Objective. To analyze and compare decision-relevant knowledge, decisional conflict, and informed decision-making about colorectal cancer (CRC) screening participation between potential screening participants with low and adequate health literacy (HL), defined as the skills to access, understand, and apply information to make informed decisions about health. Methods. Survey including 71 individuals with low HL and 70 with adequate HL, all eligible for the Dutch organized CRC screening program. Knowledge, attitude, intention to participate, and decisional conflict were assessed after reading the standard information materials. HL was assessed using the Short Assessment of Health Literacy in Dutch. Informed decision-making was analyzed by the multidimensional measure of informed choice. Results. 64% of the study population had adequate knowledge of CRC and CRC screening (low HL 43/71 (61%), adequate HL 47/70 (67%), p > 0.05). 57% were informed decision-makers (low HL 34/71 (55%), adequate HL 39/70 (58%), p > 0.05). Intention to participate was 89% (low HL 63/71 (89%), adequate HL 63/70 (90%)). Respondents with low HL experienced significantly more decisional conflict (25.8 versus 16.1; p = 0.00). Conclusion. Informed decision-making about CRC screening participation was suboptimal among both individuals with low HL and individuals with adequate HL. Further research is required to develop and implement effective strategies to convey decision-relevant knowledge about CRC screening to all screening invitees. PMID:27200089

  14. Investigating bang for your training buck: a randomized controlled trial comparing three methods of training clinicians in two core strategies of dialectical behavior therapy.

    PubMed

    Dimeff, Linda A; Harned, Melanie S; Woodcock, Eric A; Skutch, Julie M; Koerner, Kelly; Linehan, Marsha M

    2015-05-01

    The present study examined the efficacy of online training (OLT), instructor-led training (ILT), and a treatment manual (TM) in training mental health clinicians in two core strategies of Dialectical Behavior Therapy (DBT): chain analysis and validation. A randomized controlled trial compared OLT, ILT, and TM among clinicians naïve to DBT (N=172) who were assessed at baseline, post-training, and 30, 60, and 90 days following training. Primary outcomes included satisfaction, self-efficacy, motivation, knowledge, clinical proficiency, and clinical use. Overall, ILT outperformed OLT and TM in satisfaction, self-efficacy, and motivation, whereas OLT was the most effective method for increasing knowledge. The conditions did not differ in observer-rated clinical proficiency or self-reported clinical use, which both increased to moderate levels after training. In addition, ILT was particularly effective at improving motivation to use chain analysis, whereas OLT was particularly effective at increasing knowledge of validation strategies. These findings suggest that these types of brief, didactic trainings may be effective methods of increasing knowledge of new treatment strategies, but may not be sufficient to enable clinicians to achieve a high level of clinical use or proficiency. Additional research examining the possible advantages of matching training methods to types of treatment strategies may help to determine a tailored, more effective approach to training clinicians in empirically supported treatments. PMID:25892165

  15. Prioritising pharmaceuticals for environmental risk assessment: Towards adequate and feasible first-tier selection.

    PubMed

    Roos, V; Gunnarsson, L; Fick, J; Larsson, D G J; Rudén, C

    2012-04-01

    The presence of pharmaceuticals in the aquatic environment, and concerns about negative effects on aquatic organisms, have gained increasing attention in recent years. As ecotoxicity data are lacking for most active pharmaceutical ingredients (APIs), it is important to identify strategies to prioritise APIs for ecotoxicity testing and environmental monitoring. We have used nine previously proposed prioritisation schemes, both risk- and hazard-based, to rank 582 APIs. The similarities and differences in overall ranking results and input data were compared. Moreover, we analysed how well the methods ranked seven relatively well-studied APIs. It is concluded that the hazard-based methods were more successful in correctly ranking the well-studied APIs, but the fish plasma model, which includes human pharmacological data, also showed a high success rate. The results of the analyses show that input data availability varies significantly; some data, such as logP, are available for most APIs, while information about environmental concentrations and bioconcentration is still scarce. The results also suggest that the exposure estimates in risk-based methods need to be improved and that the inclusion of effect measures at first-tier prioritisation might underestimate risks. It is proposed that, in order to develop an adequate prioritisation scheme, improved data on exposure (such as degradation and sewage treatment removal) and on bioconcentration ability should be further considered. The use of ATC codes may also be useful for the development of a prioritisation scheme that includes the mode of action of pharmaceuticals and, to some extent, mixture effects. PMID:22361586

  16. The Need for Domestic Violence Laws with Adequate Legal and Social Support Services.

    ERIC Educational Resources Information Center

    Hemmons, Willa M.

    1981-01-01

    Describes the need for comprehensive domestic violence programs that include medical, legal, economic, psychological, and child care services. Although most states have family violence legislation, more work is needed to adequately implement these programs. (Author/JAC)

  17. Evaluation of catheter-manometer systems for adequate intravascular blood pressure measurements in small animals.

    PubMed

    Idvall, J; Aronsen, K F; Lindström, K; Ulmsten, U

    1977-09-30

    Various catheter-manometer systems suitable for intravascular blood pressure measurements on rats have been elaborated and tested in vitro and in vivo. Using a pressure-step calibrator, it was observed from in vitro studies that microtransducers had superior frequency response compared to conventional transducers. Of the catheters tested, PE-90 tapered to a 40 mm tip with an inner diameter of 0.3 mm had the best frequency response as judged from fall and settling times. Because of the damping effect, tapering increased fall time to 1.8 ms, which was still quite acceptable. By the same token, settling time was minimized to 22.4 ms. With a special calculation method, the theoretical percentage error of the recordings was estimated to be 9.66%. When the measurement error was calculated from the actual in vivo recordings, it was found to be no more than 2.7%. These results show that the technique described is adequate for continuous intravascular blood pressure recordings on small animals. Finally, it is emphasized that careful handling of the catheters and avoidance of stopcocks and air bubbles are essential for obtaining accurate and reproducible values. PMID:928971

  18. Fractional randomness

    NASA Astrophysics Data System (ADS)

    Tapiero, Charles S.; Vallois, Pierre

    2016-11-01

    The premise of this paper is that a fractional probability distribution is based on fractional operators and on the fractional (Hurst) index used, which alters the classical setting of random variables. For example, a random variable defined by its density function might not have a fractional density function defined in the conventional sense. Practically, this implies that a distribution's granularity, defined by a fractional kernel, may have properties that differ according to the fractional index used and the fractional calculus applied to define it. The purpose of this paper is to consider an application of fractional calculus to define the fractional density function of a random variable. In addition, we provide and prove a number of results, defining the functional forms of these distributions as well as their existence. In particular, we define fractional probability distributions for increasing and decreasing functions that are right continuous. Examples are used to motivate the usefulness of a statistical approach to fractional calculus and its application to economic and financial problems. In conclusion, this paper is a preliminary attempt to construct statistical fractional models; given the breadth and extent of such problems, it should be considered an initial step in that direction.
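    The construction alluded to in the abstract can be sketched as follows (our notation; the paper's exact definitions may differ). Classically the density is the first derivative of the distribution function; a fractional density of order α replaces that derivative with a fractional one, for instance of Riemann-Liouville type:

```latex
% Classical density: first derivative of the CDF F
f(x) = \frac{d}{dx} F(x)

% A fractional density of order \alpha, 0 < \alpha < 1, via the
% Riemann-Liouville fractional derivative; f recovers its classical
% form in the limit \alpha \to 1
f_{\alpha}(x) = D^{\alpha} F(x)
  = \frac{1}{\Gamma(1-\alpha)} \, \frac{d}{dx}
    \int_{a}^{x} \frac{F(t)}{(x-t)^{\alpha}} \, dt
```

    Because the fractional derivative is nonlocal, f_α need not be nonnegative or integrate to one in the usual way, which is precisely why the abstract stresses existence and functional-form results.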

  19. A Randomized, Single-Blind, Placebo-Controlled Study on the Efficacy of the Arthrokinematic Approach-Hakata Method in Patients with Chronic Nonspecific Low Back Pain

    PubMed Central

    Kogure, Akira; Kotani, Kazuhiko; Katada, Shigehiko; Takagi, Hiroshi; Kamikozuru, Masahiro; Isaji, Takashi; Hakata, Setsuo

    2015-01-01

    Study Design Randomized, single-blind, placebo-controlled trial. Objective To investigate the efficacy of the Arthrokinematic approach (AKA)-Hakata (H) method for chronic low back pain. Summary of Background Data The AKA-H method is used to manually treat abnormalities of intra-articular movement. Methods One hundred eighty-six patients with chronic nonspecific low back pain randomly received either the AKA-H method (AKA-H group) or the sham technique (S group) monthly for 6 months. Data were collected at baseline and once a month. Outcome measures were pain intensity (visual analogue scale [VAS]) and quality of life (the Roland-Morris Disability Questionnaire [RDQ] and Short Form SF-36 questionnaire [SF-36]). Results At baseline, the VAS, RDQ, and SF-36 scores showed similar levels between the groups. After 6 months, the AKA-H group had more improvement in the VAS (42.8% improvement) and RDQ score (31.1% improvement) than the sham group (VAS: 10.4% improvement; RDQ: 9.8% improvement; both, P < 0.001). The respective scores for the SF-36 subscales (physical functioning, role physical, bodily pain, social functioning, general health perception, role emotional, and mental health) were also significantly more improved in the AKA-H group than in the sham group (all, P < 0.001). The scores for the physical, psychological, and social aspects of the SF-36 subscales showed similar improvement in the AKA-H group. Conclusion The AKA-H method can be effective in managing chronic low back pain. Trial Registration UMIN Clinical Trials Registry (UMIN-CTR) UMIN000006250. PMID:26646534

  20. Randomized controlled trial to evaluate the effects of combined progressive exercise on metabolic syndrome in breast cancer survivors: rationale, design, and methods

    PubMed Central

    2014-01-01

    Background Metabolic syndrome (MetS) is increasingly present in breast cancer survivors, possibly worsened by cancer-related treatments, such as chemotherapy. MetS greatly increases risk of cardiovascular disease and diabetes, co-morbidities that could impair the survivorship experience, and possibly lead to cancer recurrence. Exercise has been shown to positively influence quality of life (QOL), physical function, muscular strength and endurance, reduce fatigue, and improve emotional well-being; however, the impact on MetS components (visceral adiposity, hyperglycemia, low serum high-density lipoprotein cholesterol, hypertriglyceridemia, and hypertension) remains largely unknown. In this trial, we aim to assess the effects of combined (aerobic and resistance) exercise on components of MetS, as well as on physical fitness and QOL, in breast cancer survivors soon after completing cancer-related treatments. Methods/Design This study is a prospective randomized controlled trial (RCT) investigating the effects of a 16-week supervised progressive aerobic and resistance exercise training intervention on MetS in 100 breast cancer survivors. Main inclusion criteria are histologically-confirmed breast cancer stage I-III, completion of chemotherapy and/or radiation within 6 months prior to initiation of the study, sedentary, and free from musculoskeletal disorders. The primary endpoint is MetS; secondary endpoints include: muscle strength, shoulder function, cardiorespiratory fitness, body composition, bone mineral density, and QOL. Participants randomized to the Exercise group participate in 3 supervised weekly exercise sessions for 16 weeks. Participants randomized to the Control group are offered the same intervention after the 16-week period of observation. Discussion This is the one of few RCTs examining the effects of exercise on MetS in breast cancer survivors. Results will contribute to a better understanding of metabolic disease-related effects of resistance and

  1. A randomized study comparing three groups of vein harvesting methods for coronary artery bypass grafting: endoscopic harvest versus standard bridging and open techniques

    PubMed Central

    Krishnamoorthy, Bhuvaneswari; Critchley, William R.; Glover, Alex T.; Nair, Janesh; Jones, Mark T.; Waterworth, Paul D.; Fildes, James E.; Yonan, Nizar

    2012-01-01

    OBJECTIVES The use of an open vein harvesting (OVH) technique for saphenous vein harvesting (SVH) is associated with wound complications and delayed patient mobilization. This has led to the development of minimally invasive vein harvesting (MIVH) techniques, such as standard bridging and endoscopic SVH (EVH). This randomized trial was established to assess immediate clinical outcome and patient satisfaction in our centre. METHODS A total of 150 consecutive patients were prospectively randomized into three groups. Group 1 consisted of 50 patients who underwent OVH, Group 2 consisted of 50 patients who underwent a standard bridging technique (SBT) and Group 3 consisted of 50 patients who underwent EVH. Each group was assessed for the incidence of wound infection, postoperative pain and satisfaction and the number of vein repairs using previously validated scoring systems. RESULTS The MIVH techniques reduced the pain at hospital (P < 0.001) and at 6 weeks (P < 0.001), and improved cosmesis (P < 0.001), compared with the OVH group. Patient satisfaction was greatest in the EVH group followed by the SBT and then the OVH group. The clinical markers of inflammation were reduced with an MIVH technique. There were more vein repairs in the EVH compared with the OVH (P < 0.001) and the SBT (P = 0.04) groups. CONCLUSIONS This study demonstrates that MIVH reduces wound morbidity. We believe that each technique has advantages and disadvantages, which should be considered during the selection of a harvesting procedure by both the patient and the surgeon. PMID:22611182

  2. A randomized field trial for the primary prevention of osteoporosis among adolescent females: Comparison of two methods, mother centered and daughter centered

    PubMed Central

    Ansari, Hourieh; Farajzadegan, Ziba; Hajigholami, Ali; Paknahad, Zamzam

    2014-01-01

    Background: Osteoporosis is a serious public health problem. Since the majority of bone mass is acquired during adolescence, primary prevention is important. Mothers' participation in health education interventions probably helps promote health behaviors in their children. Aims: To assess whether a lifestyle modification intervention focused on mothers or on students has an impact on osteoporosis preventive behaviors in adolescent girls. Materials and Methods: This was a randomized field trial in female high schools. 210 girls aged between 11 and 15 were randomly selected. Students in groups A and C and mothers in group B were selected through the sampling frame. The lifestyle modification was based on group education in public girls' high schools. Subjects in the intervention groups participated in three educational sessions. Students' osteoporosis preventive behaviors were measured using a lifestyle questionnaire consisting of items assessing nutrition, physical activity and sun exposure. Repeated-measures ANOVA at baseline, 4 weeks, 2 months and 6 months was used to analyze the data. Results: After 1 month, diet and sun exposure scores increased significantly (P < 0.001), with greater increases in group B than in group A (diet: P < 0.001; sun exposure: P = 0.001). After 6 months, diet and sun exposure status in group A had decreased approximately to baseline, while in group B the diet components remained significantly different from baseline (P < 0.001). There was no change in physical activity. Conclusion: Osteoporosis prevention interventions for adolescents can be effective when parents or girls participate in training sessions, but education is associated with better outcomes when focused on mothers. PMID:25422660

  3. A comparison of spinal manipulation methods and usual medical care for acute and sub-acute low back pain: a randomized clinical trial

    PubMed Central

    Haas, Mitchell; Glick, Ronald; Stevans, Joel; Landsittel, Doug

    2014-01-01

    Study Design Randomized-controlled trial with follow-up to 6 months. Objective This was a comparative effectiveness trial of: manual-thrust manipulation (MTM) versus mechanical-assisted manipulation (MAM); and manipulation versus usual medical care (UMC). Summary of Background Data Low back pain (LBP) is one of the most common conditions seen in primary care and physical medicine practice. MTM is a common treatment for LBP. Claims that MAM is an effective alternative to MTM have yet to be substantiated. There is also question about the effectiveness of manipulation in acute and sub-acute LBP, as compared to UMC. Methods 107 adults with onset of LBP within the past 12 weeks were randomized to 1 of 3 treatment groups: MTM; MAM; or UMC. Outcome measures included the Oswestry LBP disability index (0 to 100 scale) and numeric pain rating (0 to 10 scale). Participants in the manipulation groups were treated twice weekly over 4 weeks; subjects in UMC were seen for 3 visits during this time. Outcome measures were captured at baseline, 4 weeks, 3 months and 6 months. Results Linear regression showed a statistically significant advantage of MTM at 4 weeks compared to MAM (disability = −8.1, p = .009; pain = −1.4, p = .002) and UMC (disability = −6.5, p = .032; pain = −1.7, p < .001). Responder analysis, defined as 30% and 50% reductions in Oswestry scores, revealed a significantly greater proportion of responders at 4 weeks in MTM (76%; 50%) compared to MAM (50%; 16%) and UMC (48%; 39%). Similar between-group results were found for pain: MTM (94%; 76%); MAM (69%; 47%); and UMC (56%; 41%). No statistically significant group differences were found between MAM and UMC, and for any comparison at 3 or 6 months. Conclusions MTM provides greater short-term reductions in self-reported disability and pain scores compared to UMC or MAM. PMID:25423308

  4. Comparison of treatment effect estimates of non-vitamin K antagonist oral anticoagulants versus warfarin between observational studies using propensity score methods and randomized controlled trials.

    PubMed

    Li, Guowei; Holbrook, Anne; Jin, Yanling; Zhang, Yonghong; Levine, Mitchell A H; Mbuagbaw, Lawrence; Witt, Daniel M; Crowther, Mark; Connolly, Stuart; Chai-Adisaksopha, Chatree; Wan, Zhongxiao; Cheng, Ji; Thabane, Lehana

    2016-06-01

    Emerging observational studies using propensity score (PS) methods assessed real-world comparative effectiveness of non-vitamin K antagonist oral anticoagulants (NOACs) versus warfarin in patients with non-valvular atrial fibrillation (AF). We aimed to compare treatment effect estimates of NOACs between PS studies and randomized controlled trials (RCTs). Electronic databases and conference proceedings were searched systematically. Primary outcomes included stroke or systemic embolism (SE) and major bleeding. A random-effects meta-analysis was performed to synthesize the data by pooling the PS- and RCT-derived hazard ratios (HRs) separately. The ratio of HRs (RHR) from the ratio of PS-derived HRs relative to RCT-derived HRs was used to determine whether there was a difference between estimates from PS studies and RCTs. There were 10 PS studies and 5 RCTs included for analysis. No significant difference of treatment effect estimates between the PS studies and RCTs was observed: RHR 1.11, 95 % CI 0.98-1.23 for stroke or SE; RHR 1.07, 95 % CI 0.87-1.34 for major bleeding. A significant association between NOACs and risk of stroke or SE was observed: HR 0.88, 95 % CI 0.83-0.94 for the PS studies; HR 0.79, 95 % CI 0.72-0.87 for the RCTs. However, no relationship between NOACs and risk of major bleeding was found: HR 0.91, 95 % CI 0.79-1.05 for the PS studies; HR 0.85, 95 % CI 0.73-1.00 for the RCTs. In this study, treatment effect estimates of NOACs versus warfarin in patients with non-valvular AF from PS studies are found to be in agreement with those from RCTs. PMID:27370013
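    The pooling-and-comparison procedure described in this abstract can be sketched as follows (an illustrative implementation on the log hazard-ratio scale, not the authors' code): inverse-variance random-effects pooling within each design, then the ratio of the pooled hazard ratios (RHR) with a confidence interval that assumes the two pooled estimates are independent.

```python
import math

def pool_log_hr(log_hrs, variances):
    """DerSimonian-Laird random-effects pooling of log hazard ratios.

    Returns the pooled log HR and its variance."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * y for wi, y in zip(w, log_hrs)) / sum(w)
    # Cochran's Q and the method-of-moments between-study variance tau^2
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_hrs))
    df = len(log_hrs) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_re, log_hrs)) / sum(w_re)
    return pooled, 1.0 / sum(w_re)

def ratio_of_hrs(ps_pool, ps_var, rct_pool, rct_var):
    """RHR = HR_PS / HR_RCT with a 95% CI, assuming independence."""
    diff = ps_pool - rct_pool
    se = math.sqrt(ps_var + rct_var)
    return (math.exp(diff),
            math.exp(diff - 1.96 * se),
            math.exp(diff + 1.96 * se))
```

    An RHR confidence interval that spans 1 (as for both endpoints in the abstract) indicates no detectable disagreement between the PS-study and RCT estimates.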

  5. Efficient numerical methods for the random-field Ising model: Finite-size scaling, reweighting extrapolation, and computation of response functions.

    PubMed

    Fytas, Nikolaos G; Martín-Mayor, Víctor

    2016-06-01

    It was recently shown [Phys. Rev. Lett. 110, 227201 (2013)PRLTAO0031-900710.1103/PhysRevLett.110.227201] that the critical behavior of the random-field Ising model in three dimensions is ruled by a single universality class. This conclusion was reached only after a proper taming of the large scaling corrections of the model by applying a combined approach of various techniques, coming from the zero- and positive-temperature toolboxes of statistical physics. In the present contribution we provide a detailed description of this combined scheme, explaining in detail the zero-temperature numerical scheme and developing the generalized fluctuation-dissipation formula that allowed us to compute connected and disconnected correlation functions of the model. We discuss the error evolution of our method and we illustrate the infinite limit-size extrapolation of several observables within phenomenological renormalization. We present an extension of the quotients method that allows us to obtain estimates of the critical exponent α of the specific heat of the model via the scaling of the bond energy and we discuss the self-averaging properties of the system and the algorithmic aspects of the maximum-flow algorithm used. PMID:27415388
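    The quotients (phenomenological renormalization) method mentioned in the abstract can be stated compactly (standard notation, not necessarily the paper's): for an observable O that scales as L^{x_O/ν} at criticality, one compares lattices of sizes L and 2L at the coupling g* where their correlation-length ratios ξ/L cross:

```latex
\left. \frac{O(2L, g)}{O(L, g)} \right|_{g = g^{*}}
  = 2^{\,x_O/\nu} \left[ 1 + O\!\left( L^{-\omega} \right) \right]
```

    The exponent ratio x_O/ν then follows from the measured quotient, with ω the leading corrections-to-scaling exponent controlling the finite-size extrapolation.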

  6. Development of an efficient fungal DNA extraction method to be used in random amplified polymorphic DNA-PCR analysis to differentiate cyclopiazonic acid mold producers.

    PubMed

    Sánchez, Beatriz; Rodríguez, Mar; Casado, Eva M; Martín, Alberto; Córdoba, Juan J

    2008-12-01

    A variety of previously established mechanical and chemical treatments to achieve fungal cell lysis combined with a semiautomatic system operated by a vacuum pump were tested to obtain DNA extract to be directly used in randomly amplified polymorphic DNA (RAPD)-PCR to differentiate cyclopiazonic acid-producing and -nonproducing mold strains. A DNA extraction method that includes digestion with proteinase K and lyticase prior to using a mortar and pestle grinding and a semiautomatic vacuum system yielded DNA of high quality in all the fungal strains and species tested, at concentrations ranging from 17 to 89 ng/μl in 150 μl of the final DNA extract. Two microliters of DNA extracted with this method was directly used for RAPD-PCR using primer (GACA)4. Reproducible RAPD fingerprints showing high differences between producer and nonproducer strains were observed. These differences in the RAPD patterns did not differentiate all the strains tested in clusters by cyclopiazonic acid production but may be very useful to distinguish cyclopiazonic acid producer strains from nonproducer strains by a simple RAPD analysis. Thus, the DNA extracts obtained could be used directly without previous purification and quantification for RAPD analysis to differentiate cyclopiazonic acid producer from nonproducer mold strains. This combined analysis could be adaptable to other toxigenic fungal species to enable differentiation of toxigenic and non-toxigenic molds, a procedure of great interest in food safety. PMID:19244904

  7. Efficient numerical methods for the random-field Ising model: Finite-size scaling, reweighting extrapolation, and computation of response functions

    NASA Astrophysics Data System (ADS)

    Fytas, Nikolaos G.; Martín-Mayor, Víctor

    2016-06-01

    It was recently shown [Phys. Rev. Lett. 110, 227201 (2013), 10.1103/PhysRevLett.110.227201] that the critical behavior of the random-field Ising model in three dimensions is ruled by a single universality class. This conclusion was reached only after a proper taming of the large scaling corrections of the model by applying a combined approach of various techniques, coming from the zero- and positive-temperature toolboxes of statistical physics. In the present contribution we provide a detailed description of this combined scheme, explaining in detail the zero-temperature numerical scheme and developing the generalized fluctuation-dissipation formula that allowed us to compute connected and disconnected correlation functions of the model. We discuss the error evolution of our method and we illustrate the infinite limit-size extrapolation of several observables within phenomenological renormalization. We present an extension of the quotients method that allows us to obtain estimates of the critical exponent α of the specific heat of the model via the scaling of the bond energy and we discuss the self-averaging properties of the system and the algorithmic aspects of the maximum-flow algorithm used.

  9. Development of a new method for detection and identification of Oenococcus oeni bacteriophages based on endolysin gene sequence and randomly amplified polymorphic DNA.

    PubMed

    Doria, Francesca; Napoli, Chiara; Costantini, Antonella; Berta, Graziella; Saiz, Juan-Carlos; Garcia-Moruno, Emilia

    2013-08-01

    Malolactic fermentation (MLF) is a biochemical transformation conducted by lactic acid bacteria (LAB) that occurs in wine at the end of alcoholic fermentation. Oenococcus oeni is the main species responsible for MLF in most wines. As in other fermented foods, where bacteriophages represent a potential risk for the fermentative process, O. oeni bacteriophages have been reported to be a possible cause of unsuccessful MLF in wine. Thus, preparation of commercial starters that take into account the different sensitivities of O. oeni strains to different phages would be advisable. However, currently, no methods have been described to identify phages infecting O. oeni. In this study, two factors are addressed: detection and typing of bacteriophages. First, a simple PCR method was devised targeting a conserved region of the endolysin (lys) gene to detect temperate O. oeni bacteriophages. For this purpose, 37 O. oeni strains isolated from Italian wines during different phases of the vinification process were analyzed by PCR for the presence of the lys gene, and 25 strains gave a band of the expected size (1,160 bp). This is the first method to be developed that allows identification of lysogenic O. oeni strains without the need for time-consuming phage bacterial-lysis induction methods. Moreover, a phylogenetic analysis was conducted to type bacteriophages. After the treatment of bacteria with UV light, lysis was obtained for 15 strains, and the 15 phage DNAs isolated were subjected to two randomly amplified polymorphic DNA (RAPD)-PCRs. By combining the RAPD profiles and lys sequences, 12 different O. oeni phages were clearly distinguished. PMID:23728816

  11. Impact of Denture Cleaning Method and Overnight Storage Condition on Denture Biofilm Mass and Composition: A Cross-Over Randomized Clinical Trial

    PubMed Central

    Duyck, Joke; Vandamme, Katleen; Krausch-Hofmann, Stefanie; Boon, Lies; De Keersmaecker, Katrien; Jalon, Eline; Teughels, Wim

    2016-01-01

    Background Appropriate oral hygiene is required to maintain oral health in denture wearers. This study aims to compare the role of denture cleaning methods in combination with overnight storage conditions on biofilm mass and composition on acrylic removable dentures. Methods In a cross-over randomized controlled trial in 13 older people, 4 conditions with 2 different mechanical cleaning methods and 2 overnight storage conditions were considered: (i) brushing and immersion in water without a cleansing tablet, (ii) brushing and immersion in water with a cleansing tablet, (iii) ultrasonic cleaning and immersion in water without a cleansing tablet, and (iv) ultrasonic cleaning and immersion in water with a cleansing tablet. Each test condition was performed for 5 consecutive days, preceded by a 2-days wash-out period. Biofilm samples were taken at baseline (control) and at the end of each test period from a standardized region. Total and individual levels of selected oral bacteria (n = 20), and of Candida albicans were identified using the Polymerase Chain Reaction (PCR) technique. Denture biofilm coverage was scored using an analogue denture plaque score. Paired t-tests and Wilcoxon-signed rank tests were used to compare the test conditions. The level of significance was set at α< 5%. Results Overnight denture storage in water with a cleansing tablet significantly reduced the total bacterial count (p<0.01). The difference in total bacterial level between the two mechanical cleaning methods was not statistically significant. No significant effect was observed on the amount of Candida albicans nor on the analogue plaque scores. Conclusions The use of cleansing tablets during overnight denture storage in addition to mechanical denture cleaning did not affect Candida albicans count, but reduced the total bacterial count on acrylic removable dentures compared to overnight storage in water. 
This effect was more pronounced when combined with ultrasonic cleaning compared to
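The paired t-tests mentioned in the Methods above can be sketched in a few lines. The statistic below is the standard paired t formula; the bacterial-count values are invented purely for illustration and are not the study's data.

```python
import math

def paired_t(before, after):
    """Paired t statistic and degrees of freedom for repeated measures on
    the same subjects; p-values would come from a t-table or stats library."""
    diffs = [b - a for b, a in zip(before, after)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n), n - 1

# Invented log-scale bacterial counts, one value per participant, two conditions
water_only = [6.1, 5.8, 6.4, 6.0, 5.9, 6.2]
with_tablet = [5.2, 5.0, 5.9, 5.1, 5.3, 5.4]
t, df = paired_t(water_only, with_tablet)  # t > 0: tablet lowered the counts
```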

  12. Development of a New Method for Detection and Identification of Oenococcus oeni Bacteriophages Based on Endolysin Gene Sequence and Randomly Amplified Polymorphic DNA

    PubMed Central

    Doria, Francesca; Napoli, Chiara; Costantini, Antonella; Berta, Graziella; Saiz, Juan-Carlos

    2013-01-01

    Malolactic fermentation (MLF) is a biochemical transformation conducted by lactic acid bacteria (LAB) that occurs in wine at the end of alcoholic fermentation. Oenococcus oeni is the main species responsible for MLF in most wines. As in other fermented foods, where bacteriophages represent a potential risk for the fermentative process, O. oeni bacteriophages have been reported to be a possible cause of unsuccessful MLF in wine. Thus, preparation of commercial starters that take into account the different sensitivities of O. oeni strains to different phages would be advisable. However, currently, no methods have been described to identify phages infecting O. oeni. In this study, two factors are addressed: detection and typing of bacteriophages. First, a simple PCR method was devised targeting a conserved region of the endolysin (lys) gene to detect temperate O. oeni bacteriophages. For this purpose, 37 O. oeni strains isolated from Italian wines during different phases of the vinification process were analyzed by PCR for the presence of the lys gene, and 25 strains gave a band of the expected size (1,160 bp). This is the first method to be developed that allows identification of lysogenic O. oeni strains without the need for time-consuming phage bacterial-lysis induction methods. Moreover, a phylogenetic analysis was conducted to type bacteriophages. After the treatment of bacteria with UV light, lysis was obtained for 15 strains, and the 15 phage DNAs isolated were subjected to two randomly amplified polymorphic DNA (RAPD)-PCRs. By combining the RAPD profiles and lys sequences, 12 different O. oeni phages were clearly distinguished. PMID:23728816

  13. Patient acceptance of adequately filled breast implants using the tilt test.

    PubMed

    Tebbetts, J B

    2000-07-01

    Adequate fill of any breast implant, regardless of shell characteristics, shape, or filler material, is important to prevent implant shell wrinkling, folding, or collapse that could potentially decrease the life of the implant. Implant shell life is a major factor that affects reoperation rates. The greater the necessity of reoperations, regardless of implant type, the greater the rate of local complications, necessitating additional surgery with additional risks and costs to patients. Palpable shell folding, visible wrinkling or rippling, palpable shifts of filler material, sloshing, and compromised aesthetic results can result from an under-filled implant. Any of these complications can necessitate reoperations with increased risks and costs to patients. This is a study of 609 consecutive patients from January of 1993 to December of 1998 who were given detailed preoperative informed consent and a choice of implant shape and type and who chose the increased firmness associated with an implant that is adequately filled to pass the tilt test. This study addresses two questions: (1) Will patients accept the increased firmness of an implant that is filled to pass the tilt test? and (2) Is adequate fill by the tilt test useful clinically to help reduce the incidence of postoperative rippling, wrinkling, and spontaneous deflation in saline implants? Patients were followed by postoperative examinations and questionnaires. No patient requested implant replacement to a softer implant postoperatively, and no reoperations were performed for visible rippling or wrinkling. The spontaneous deflation rate over this 6-year period was 9 of 1218 implants, or 0.739 percent. If patients will accept more firmness with an adequately filled implant, regardless of the filler material, surgeons might worry less about recommending an adequately filled implant to patients, and manufacturers might feel more comfortable producing adequately filled implants and redefining fill volumes for

  14. Generation of pseudo-random numbers

    NASA Technical Reports Server (NTRS)

    Howell, L. W.; Rheinfurth, M. H.

    1982-01-01

    Practical methods for generating acceptable random numbers from a variety of probability distributions which are frequently encountered in engineering applications are described. The speed, accuracy, and guarantee of statistical randomness of the various methods are discussed.
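One standard technique in this family of generators is inverse-transform sampling; the sketch below applies it to the exponential distribution (the choice of distribution is ours for illustration, not necessarily the report's example).

```python
import math
import random

def sample_exponential(lam, rng=random.random):
    """Inverse-transform sampling: if U ~ Uniform(0, 1), then
    -ln(1 - U) / lam follows an Exponential(lam) distribution."""
    return -math.log(1.0 - rng()) / lam

random.seed(42)
draws = [sample_exponential(2.0) for _ in range(100_000)]
mean = sum(draws) / len(draws)  # should approach 1 / lam = 0.5
```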

  15. Do we really need a large number of particles to simulate bimolecular reactive transport with random walk methods? A kernel density estimation approach

    NASA Astrophysics Data System (ADS)

    Rahbaralam, Maryam; Fernàndez-Garcia, Daniel; Sanchez-Vila, Xavier

    2015-12-01

    Random walk particle tracking methods are a computationally efficient family of methods to solve reactive transport problems. While the number of particles in most realistic applications is on the order of 10^6-10^9, the number of reactive molecules even in diluted systems might be on the order of a fraction of Avogadro's number. Thus, each particle actually represents a group of potentially reactive molecules. The use of a low number of particles may result not only in loss of accuracy, but may also lead to an improper reproduction of the mixing process, limited by diffusion. Recent works have used this effect as a proxy to model incomplete mixing in porous media. In this work, we propose using a Kernel Density Estimation (KDE) of the concentrations that allows getting the expected results for a well-mixed solution with a limited number of particles. The idea consists of treating each particle as a sample drawn from the pool of molecules that it represents; this way, the actual location of a tracked particle is seen as a sample drawn from the density function of the location of molecules represented by that given particle, rigorously represented by a kernel density function. The probability of reaction can be obtained by combining the kernels associated with two potentially reactive particles. We demonstrate that the observed deviation in the reaction vs time curves in numerical experiments reported in the literature could be attributed to the statistical method used to reconstruct concentrations (fixed particle support) from discrete particle distributions, and not to the occurrence of true incomplete mixing. We further explore the evolution of the kernel size with time, linking it to the diffusion process. Our results show that KDEs are powerful tools to improve computational efficiency and robustness in reactive transport simulations, and indicate that incomplete mixing in diluted systems should be modeled based on alternative mechanistic models and not on a
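A minimal 1-D sketch of the kernel-combination idea described above (not the authors' code; the 1-D setting and fixed bandwidth h are simplifying assumptions): convolving two Gaussian kernels of bandwidth h yields a Gaussian of variance 2h^2, whose value at the particle separation d is proportional to the pair's reaction probability.

```python
import math

def collocation_density(d, h):
    """Density for two molecules, each smeared by a 1-D Gaussian kernel of
    bandwidth h, to meet at the same point given particle separation d:
    the convolution of the two kernels, a Gaussian of variance 2*h**2,
    evaluated at d."""
    var = 2.0 * h * h
    return math.exp(-d * d / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

# Wider kernels (more diffusion) give distant particle pairs a higher
# chance to react; nearby pairs always dominate.
near = collocation_density(0.1, h=0.5)
far = collocation_density(2.0, h=0.5)
```

The paper's point about linking kernel size to diffusion would correspond to letting h grow with time rather than keeping it fixed as here.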

  16. A simple method for analyzing actives in random RNAi screens: introducing the “H Score” for hit nomination & gene prioritization

    PubMed Central

    Bhinder, Bhavneet; Djaballah, Hakim

    2013-01-01

    Due to the numerous challenges in hit identification from random RNAi screening, we have examined current practices and found a variety of methodologies employed and published across many reports; the majority of them, unfortunately, do not address the minimum associated criteria for hit nomination, which may well explain the lack of confirmation and follow-up studies currently facing the RNAi field. Overall, we find that these criteria or parameters are not well defined, in most cases arbitrary in nature, rendering it extremely difficult to judge the quality of and confidence in nominated hits across published studies. For this purpose, we have developed a simple method to score actives independent of assay readout, providing, for the first time, a homogenous platform enabling cross-comparison of active gene lists resulting from different RNAi screening technologies. Here, we report on our recently developed method dedicated to RNAi data output analysis, referred to as the BDA method, applicable to both arrayed and pooled RNAi technologies; it addresses the concerns pertaining to inconsistent hit nomination and off-target silencing in conjunction with minimal activity criteria to identify a high-value target. In this report, a combined hit rate per gene, called the “H score”, is introduced and defined. The H score provides a very useful tool for stringent active gene nomination, gene list comparison across multiple studies, prioritization of hits, and evaluation of the quality of the nominated gene hits. PMID:22934950
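The abstract does not reproduce the published H-score formula, so the sketch below only illustrates the underlying idea of a combined per-gene hit rate across independent reagents; the gene names, reagent IDs, and activity calls are invented.

```python
from collections import defaultdict

def combined_hit_rate(records):
    """records: (gene, reagent_id, is_active) triples, one per siRNA/shRNA
    tested. Returns, per gene, the fraction of its reagents scored active --
    an illustrative stand-in for a combined per-gene hit rate, not the
    published H-score formula."""
    tested = defaultdict(int)
    active = defaultdict(int)
    for gene, _reagent, is_active in records:
        tested[gene] += 1
        active[gene] += int(is_active)
    return {g: active[g] / tested[g] for g in tested}

scores = combined_hit_rate([
    ("KIF11", "si-1", True), ("KIF11", "si-2", True), ("KIF11", "si-3", False),
    ("PLK1", "si-1", True), ("PLK1", "si-2", True),
    ("GAPDH", "si-1", False),
])
```

Requiring multiple independent reagents to score before nominating a gene is what guards against single-reagent off-target effects.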

  17. Random Forest as an Imputation Method for Education and Psychology Research: Its Impact on Item Fit and Difficulty of the Rasch Model

    ERIC Educational Resources Information Center

    Golino, Hudson F.; Gomes, Cristiano M. A.

    2016-01-01

    This paper presents a non-parametric imputation technique, named random forest, from the machine learning field. The random forest procedure has two main tuning parameters: the number of trees grown in the prediction and the number of predictors used. Fifty experimental conditions were created in the imputation procedure, with different…
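A hedged miniature of the technique (not the paper's procedure): a "forest" of bagged depth-1 regression trees imputing a missing score from observed predictors. A real random forest grows deeper trees and also subsamples predictors at each split, which are the two tuning parameters the abstract mentions; all data here are invented.

```python
import random

def fit_stump(X, y):
    """Depth-1 regression tree: pick the single feature/threshold split
    that minimizes the sum of squared errors; fall back to the mean."""
    mean_y = sum(y) / len(y)
    best = (float("inf"), None, None, mean_y, mean_y)
    for j in range(len(X[0])):
        vals = sorted(set(row[j] for row in X))
        for a, b in zip(vals, vals[1:]):
            thr = (a + b) / 2.0
            left = [yi for row, yi in zip(X, y) if row[j] <= thr]
            right = [yi for row, yi in zip(X, y) if row[j] > thr]
            ml, mr = sum(left) / len(left), sum(right) / len(right)
            sse = (sum((v - ml) ** 2 for v in left) +
                   sum((v - mr) ** 2 for v in right))
            if sse < best[0]:
                best = (sse, j, thr, ml, mr)
    return best[1:]  # (feature, threshold, left_mean, right_mean)

def forest_impute(X, y, query, n_trees=25, rng=random.Random(0)):
    """Average the predictions of stumps fit on bootstrap resamples."""
    preds = []
    n = len(X)
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]
        j, thr, ml, mr = fit_stump([X[i] for i in idx], [y[i] for i in idx])
        preds.append(ml if j is None or query[j] <= thr else mr)
    return sum(preds) / len(preds)

# Invented data: two observed item scores predict a third, missing one.
X = [[1, 1], [1, 2], [5, 4], [6, 5], [2, 1], [5, 5]]
y = [1.0, 1.5, 4.5, 5.5, 1.2, 5.0]
high = forest_impute(X, y, query=[5, 4])   # near the high-scoring cluster
low = forest_impute(X, y, query=[1, 1])    # near the low-scoring cluster
```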

  18. Do Foley Catheters Adequately Drain the Bladder? Evidence from CT Imaging Studies

    PubMed Central

    Avulova, Svetlana; Li, Valery J.; Khusid, Johnathan A.; Choi, Woo S.; Weiss, Jeffrey P.

    2015-01-01

    ABSTRACT Introduction: The Foley catheter has been widely assumed to be an effective means of draining the bladder. However, recent studies have brought its efficacy into question. The objective of our study is to further assess the adequacy of the Foley catheter for complete drainage of the bladder. Materials and Methods: Consecutive catheterized patients were identified from a retrospective review of contrast-enhanced and non-contrast-enhanced computed tomographic (CT) abdomen and pelvis studies completed from 7/1/2011-6/30/2012. Residual urine volume (RUV) was measured using 5 mm axial CT sections as follows: the length (L) and width (W) of the bladder in the section with the greatest cross-sectional area were combined with bladder height (H), as determined by multiplanar reformatted images, in order to calculate RUV by applying the formula for the volume (V) of a sphere in a cube: V=(π/6)*(L*W*H). Results: RUVs of 167 (mean age 67) consecutively catheterized men (n=72) and women (n=95) identified by CT abdomen and pelvis studies were calculated. The mean RUV was 13.2 mL (range: 0.0 mL-859.1 mL, standard deviation: 75.9 mL, margin of error at 95% confidence: 11.6 mL). Four (2.4%) catheterized patients had RUVs of >50 mL, two of whom had an improperly placed catheter tip noted on their CT reports. Conclusions: Previous studies have shown that up to 43% of catheterized patients had a RUV greater than 50 mL, suggesting inadequacy of bladder drainage via the Foley catheter. Our study indicated that the vast majority of patients with Foley catheters (97.6%) had adequately drained bladders with volumes of <50 mL. PMID:26200550
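The volume formula quoted in the Methods is easy to apply directly; the bladder dimensions below are invented example values, not data from the study.

```python
import math

def residual_urine_volume(length_cm, width_cm, height_cm):
    """The abstract's "sphere in a cube" (ellipsoid) approximation:
    V = (pi/6) * L * W * H; dimensions in cm yield volume in mL."""
    return (math.pi / 6.0) * length_cm * width_cm * height_cm

v = residual_urine_volume(4.0, 3.0, 3.5)  # example bladder, about 22 mL
```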

  19. When are studies adequate for regulatory purposes? View of one regulated.

    PubMed Central

    Bundy, M

    1981-01-01

    The question of adequacy of studies for regulatory purposes has been debated for years. Nine questions need answers to determine adequacy: (1) Does the study deal with a defined problem or a defined segment of it? (2) Do the study data justify the conclusions drawn? (3) Were appropriate statistical analyses used? Is there evidence of bias versus objectivity in the collection or analysis of data? (4) Does the study support, supplement (or complement) or refute information in the literature? Is the study truly new information? (5) Does the study conform to the Interagency Regulatory Liaison Group (IRLG) guidelines for documentation of Epidemiologic Studies? (6) Does the study stand up to peer review? (7) Have other investigators been able to confirm the findings by duplicating the study? (8) Is the study acceptable, or can it be made acceptable, for publication in a reputable scientific journal? (9) Is the problem of such magnitude or significance that regulation is required? Because there is no such thing as a risk-free environment or absolute safety, and there is no definitive "yes" answer to each of the questions, the regulated would hope--yes, insist--that the regulators exercise judgment with great skill in the promulgation of rules or regulations. The application of safety factors and the determination of acceptable levels of risk should be social decisions. A discussion of instances where the "regulated" believes that studies have not been adequate, or have been ignored or misinterpreted for regulatory purposes, is included. A method of settling controversial questions to eliminate the litigation route is proposed. Judgment, which is so often eliminated by regulation, needs to find its way back into the regulatory process. The regulated recognize the need for regulations. However, when these regulations are based on less than good scientific judgment, harm will be done to the regulatory process itself in the long run. PMID:7333262

  20. Emotional Experiences of Obese Women with Adequate Gestational Weight Variation: A Qualitative Study

    PubMed Central

    Faria-Schützer, Débora Bicudo; Surita, Fernanda Garanhani de Castro; Alves, Vera Lucia Pereira; Vieira, Carla Maria; Turato, Egberto Ribeiro

    2015-01-01

    Background As a result of the growth of the obese population, the number of obese women of fertile age has increased in the last few years. Obesity in pregnancy is related to greater levels of anxiety, depression and physical harm. However, pregnancy is an opportune moment for the intervention of health care professionals to address obesity. The objective of this study was to describe how obese pregnant women emotionally experience success in adequate weight control. Methods and Findings Using a qualitative design that seeks to understand content in the field of health, the sample of subjects was deliberated, with thirteen obese pregnant women selected to participate in an individual interview. Data was analysed by inductive content analysis and includes complete transcription of the interviews, re-readings using suspended attention, categorization in discussion topics and the qualitative and inductive analysis of the content. The analysis revealed four categories, three of which show the trajectory of body care that obese women experience during pregnancy: 1) The obese pregnant woman starts to think about her body;2) The challenge of the diet for the obese pregnant woman; 3) The relation of the obese pregnant woman with the team of antenatal professionals. The fourth category reveals the origin of the motivation for the change: 4) The potentializing factors for change: the motivation of the obese woman while pregnant. Conclusions During pregnancy, obese women are more in touch with themselves and with their emotional conflicts. Through the transformations of their bodies, women can start a more refined self-care process and experience of the body-mind unit. The fear for their own and their baby's life, due to the risks posed by obesity, appears to be a great potentializing factor for change. The relationship with the professionals of the health care team plays an important role in the motivational support of the obese pregnant woman. PMID:26529600

  1. Determining median urinary iodine concentration that indicates adequate iodine intake at population level.

    PubMed Central

    Delange, François; de Benoist, Bruno; Burgi, Hans

    2002-01-01

    OBJECTIVE: Urinary iodine concentration is the prime indicator of nutritional iodine status and is used to evaluate population-based iodine supplementation. In 1994, WHO, UNICEF and ICCIDD recommended median urinary iodine concentrations for populations of 100-200 µg/l, assuming the 100 µg/l threshold would limit concentrations <50 µg/l to 20% or less. METHOD: A questionnaire on frequency distribution of urinary iodine in iodine-replete populations was circulated to 29 scientific groups. FINDINGS: Nineteen groups reported data from 48 populations with median urinary iodine concentrations >100 µg/l. The total population was 55,892, including 35,661 (64%) schoolchildren. Median urinary iodine concentrations were 111-540 (median 201) µg/l for all populations, 100-199 µg/l in 23 (48%) populations and ≥200 µg/l in 25 (52%). The frequencies of values <50 µg/l were 0-20.8% (mean 4.8%) overall, and 7.2% and 2.5% in populations with medians of 100-199 µg/l and >200 µg/l, respectively. The frequency reached 20% only in two places where iodine had been supplemented for <2 years. CONCLUSION: The frequency of urinary iodine concentrations <50 µg/l in populations with median urinary iodine concentrations ≥100 µg/l has been overestimated. The threshold of 100 µg/l does not need to be increased. In populations, median urinary iodine concentrations of 100-200 µg/l indicate adequate iodine intake and optimal iodine nutrition. PMID:12219154
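The two population summaries this study works with, the median urinary iodine concentration and the fraction of values below 50 µg/l, can be computed as follows; the sample values are invented for illustration.

```python
def iodine_summary(concentrations_ug_per_l):
    """Median urinary iodine concentration (the primary indicator) and the
    fraction of values below 50 ug/l for a population sample."""
    vals = sorted(concentrations_ug_per_l)
    n = len(vals)
    med = vals[n // 2] if n % 2 else (vals[n // 2 - 1] + vals[n // 2]) / 2
    frac_low = sum(v < 50 for v in vals) / n
    return med, frac_low

# Invented sample: median 140 ug/l (adequate), 10% of values below 50 ug/l
med, frac_low = iodine_summary([45, 80, 95, 110, 130, 150, 170, 210, 240, 300])
```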

  2. Is random access memory random?

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1986-01-01

    Most software is constructed on the assumption that the programs and data are stored in random access memory (RAM). Physical limitations on the relative speeds of processor and memory elements lead to a variety of memory organizations that match processor addressing rate with memory service rate. These include interleaved and cached memory. A very high fraction of a processor's address requests can be satisfied from the cache without reference to the main memory. The cache requests information from main memory in blocks that can be transferred at the full memory speed. Programmers who organize algorithms for locality can realize the highest performance from these computers.

  3. Gauge cooling for the singular-drift problem in the complex Langevin method — a test in Random Matrix Theory for finite density QCD

    NASA Astrophysics Data System (ADS)

    Nagata, Keitaro; Nishimura, Jun; Shimasaki, Shinji

    2016-07-01

    Recently, the complex Langevin method has been applied successfully to finite density QCD either in the deconfinement phase or in the heavy dense limit with the aid of a new technique called the gauge cooling. In the confinement phase with light quarks, however, convergence to wrong limits occurs due to the singularity in the drift term caused by small eigenvalues of the Dirac operator including the mass term. We propose that this singular-drift problem should also be overcome by the gauge cooling with different criteria for choosing the complexified gauge transformation. The idea is tested in chiral Random Matrix Theory for finite density QCD, where exact results are reproduced at zero temperature with light quarks. It is shown that the gauge cooling indeed changes drastically the eigenvalue distribution of the Dirac operator measured during the Langevin process. Despite its non-holomorphic nature, this eigenvalue distribution has a universal diverging behavior at the origin in the chiral limit due to a generalized Banks-Casher relation as we confirm explicitly.

  4. Study design and methods for a randomized crossover trial substituting brown rice for white rice on diabetes risk factors in India.

    PubMed

    Wedick, Nicole M; Sudha, Vasudevan; Spiegelman, Donna; Bai, Mookambika Ramya; Malik, Vasanti S; Venkatachalam, Siva Sankari; Parthasarathy, Vijayalaksmi; Vaidya, Ruchi; Nagarajan, Lakshmipriya; Arumugam, Kokila; Jones, Clara; Campos, Hannia; Krishnaswamy, Kamala; Willett, Walter; Hu, Frank B; Anjana, Ranjit Mohan; Mohan, Viswanathan

    2015-01-01

    India has the second largest number of people with diabetes in the world following China. Evidence indicates that consumption of whole grains can reduce the risk of type 2 diabetes. This article describes the study design and methods of a trial in progress evaluating the effects of substituting whole grain brown rice for polished (refined) white rice on biomarkers of diabetes risk (glucose metabolism, dyslipidemia, inflammation). This is a randomized controlled clinical trial with a crossover design conducted in Chennai, India among overweight but otherwise healthy volunteers aged 25-65 y with a body mass index ≥23 kg/m(2) and habitual rice consumption ≥200 g/day. The feasibility and cultural appropriateness of this type of intervention in the local environment will also be examined. If the intervention is efficacious, the findings can be incorporated into national-level policies which could include the provision of brown rice as an option or replacement for white rice in government institutions and food programs. This relatively simple dietary intervention has the potential to substantially diminish the burden of diabetes in Asia and elsewhere.

  5. Study design and methods for a randomized crossover trial substituting brown rice for white rice on diabetes risk factors in India

    PubMed Central

    Wedick, Nicole M.; Vasudevan, Sudha; Spiegelman, Donna; Bai, Ramya; Malik, Vasanti; Venkatachalam, Siva Sankari; Parthasarathy, Vijayalaksmi; Vaidya, Ruchi; Nagarajan, Lakshmipriya; Arumugam, Kokila; Jones, Clara; Campos, Hannia; Krishnaswamy, Kamala; Willett, Walter; Hu, Frank B.; Mohan, Anjana Ranjit; Viswanathan, Mohan

    2016-01-01

    India has the second largest number of people with diabetes in the world following China. Evidence indicates that consumption of whole grains can reduce risk of type 2 diabetes. This manuscript describes the study design and methods of a trial in progress evaluating the effects of substituting whole grain brown rice for polished (refined) white rice on biomarkers of diabetes risk (glucose metabolism, dyslipidemia, inflammation). This is a randomized controlled clinical trial with a crossover design conducted in Chennai, India among overweight but otherwise healthy volunteers aged 25–65 y with a body mass index ≥23 kg/m2 and habitual rice consumption ≥200 grams/day. The feasibility and cultural appropriateness of this type of intervention in the local environment will also be examined. If the intervention is efficacious, the findings can be incorporated into national-level policies which could include the provision of brown rice as an option or replacement for white rice in government institutions and food programs. This relatively simple dietary intervention has the potential to substantially diminish the burden of diabetes in Asia and elsewhere. PMID:26017321

  6. Effect of a Wheelie Training Method With the Front Wheels on a Ramp in Novice Able-Bodied Participants: A Randomized Controlled Trial.

    PubMed

    Yang, Yu-Sheng; Koontz, Alicia M; Chen, Chyi-Rong; Fang, Wei-Chien; Chang, Jyh-Jong

    2015-01-01

    The objective of this study was to determine if wheelie training that begins with learning how to balance with the front wheels on a ramp would increase the success rate, reduce the training time, and improve retention rates. A randomized controlled trial design was used to evaluate the effectiveness of wheelie training in a ramp setting (ramp group, n = 26) versus conventional training (conventional group, n = 26). The main outcome measures were success rates in achieving wheelie competence, training time, and the retention rate at 7 and 30 days, respectively. The success rate in both training groups reached 100%. The mean training times for the conventional group and the ramp group were 86.0 ± 35.7 and 76.0 ± 25.8 minutes, respectively. Training time was not significantly affected by the training method (p = 0.23), but it was affected by gender, with women requiring an average of 92.0 ± 31.4 minutes in comparison with 70.0 ± 27.5 minutes for men (p = 0.01). The skill retention rate after 7 and 30 days was 100% for both groups. Neither the success rate nor the training time for wheelie skill acquisition was improved by learning wheelie balance on a ramp. However, a high retention rate of wheelie skills was found for both training groups, which implies that success can be achieved by training on the ramp used in this study.

  7. A minimalistic approach to static and dynamic electron correlations: Amending generalized valence bond method with extended random phase approximation correlation correction.

    PubMed

    Chatterjee, Koushik; Pastorczak, Ewa; Jawulski, Konrad; Pernal, Katarzyna

    2016-06-28

    A perfect-pairing generalized valence bond (GVB) approximation is known to be one of the simplest approximations, which allows one to capture the essence of static correlation in molecular systems. In spite of its attractive feature of being relatively computationally efficient, this approximation misses a large portion of dynamic correlation and does not offer sufficient accuracy to be generally useful for studying electronic structure of molecules. We propose to correct the GVB model and alleviate some of its deficiencies by amending it with the correlation energy correction derived from the recently formulated extended random phase approximation (ERPA). On the examples of systems of diverse electronic structures, we show that the resulting ERPA-GVB method greatly improves upon the GVB model. ERPA-GVB recovers most of the electron correlation and it yields energy barrier heights of excellent accuracy. Thanks to a balanced treatment of static and dynamic correlation, ERPA-GVB stays reliable when one moves from systems dominated by dynamic electron correlation to those for which the static correlation comes into play. PMID:27369501

  8. Self-reported segregation experience throughout the life course and its association with adequate health literacy.

    PubMed

    Goodman, Melody S; Gaskin, Darrell J; Si, Xuemei; Stafford, Jewel D; Lachance, Christina; Kaphingst, Kimberly A

    2012-09-01

    Residential segregation has been shown to be associated with health outcomes and health care utilization. We examined the association between racial composition of five physical environments throughout the life course and adequate health literacy among 836 community health center patients in Suffolk County, NY. Respondents who attended a mostly White junior high school or currently lived in a mostly White neighborhood were more likely to have adequate health literacy compared to those educated or living in predominantly minority or diverse environments. This association was independent of the respondent's race, ethnicity, age, education, and country of birth.

  9. A randomized clinical trial of the effectiveness of mechanical traction for sub-groups of patients with low back pain: study methods and rationale

    PubMed Central

    2010-01-01

    Background Patients with signs of nerve root irritation represent a sub-group of those with low back pain who are at increased risk of persistent symptoms and progression to costly and invasive management strategies including surgery. A period of non-surgical management is recommended for most patients, but there is little evidence to guide non-surgical decision-making. We conducted a preliminary study examining the effectiveness of a treatment protocol of mechanical traction with extension-oriented activities for patients with low back pain and signs of nerve root irritation. The results suggested this approach may be effective, particularly in a more specific sub-group of patients. The aim of this study will be to examine the effectiveness of treatment that includes traction for patients with low back pain and signs of nerve root irritation, and within the pre-defined sub-group. Methods/Design The study will recruit 120 patients with low back pain and signs of nerve root irritation. Patients will be randomized to receive an extension-oriented treatment approach, with or without the addition of mechanical traction. Randomization will be stratified based on the presence of the pre-defined sub-grouping criteria. All patients will receive 12 physical therapy treatment sessions over 6 weeks. Follow-up assessments will occur after 6 weeks, 6 months, and 1 year. The primary outcome will be disability measured with a modified Oswestry questionnaire. Secondary outcomes will include self-reports of low back and leg pain intensity, quality of life, global rating of improvement, additional healthcare utilization, and work absence. Statistical analysis will be based on intention to treat principles and will use linear mixed model analysis to compare treatment groups, and examine the interaction between treatment and sub-grouping status. 
Discussion This trial will provide a methodologically rigorous evaluation of the effectiveness of using traction for patients with low back
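The stratified randomization described in the Methods above can be sketched as a generic permuted-blocks scheme; the block size of 2, the arm labels, and the allocation mechanism are our assumptions, since the abstract does not specify them.

```python
import random

def stratified_block_randomize(patients, strata, rng=random.Random(7)):
    """Permuted blocks of size 2 within each stratum keep the two arms
    balanced among patients who do / do not meet the pre-defined
    sub-grouping criteria."""
    by_stratum = {}
    for pid, stratum in zip(patients, strata):
        by_stratum.setdefault(stratum, []).append(pid)
    assignment = {}
    for members in by_stratum.values():
        for i in range(0, len(members), 2):
            block = ["traction", "extension-only"]
            rng.shuffle(block)
            for pid, arm in zip(members[i:i + 2], block):
                assignment[pid] = arm
    return assignment

# 120 patients; stratum = whether the sub-grouping criteria are met (invented)
alloc = stratified_block_randomize(range(120), [pid % 2 for pid in range(120)])
```

Blocking within each stratum guarantees the treatment-by-subgroup interaction analysis has similar arm sizes in both strata.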

  10. Methods and baseline characteristics of a randomized trial treating early childhood obesity: The Positive Lifestyles for Active Youngsters (Team PLAY) trial

    PubMed Central

    Hare, Marion; Coday, Mace; Williams, Natalie A.; Richey, Phyllis; Tylavsky, Frances; Bush, Andrew

    2012-01-01

    There are few effective obesity interventions directed towards younger children, particularly young minority children. This paper describes the design, intervention, recruitment methods, and baseline data of the ongoing Positive Lifestyles for Active Youngsters (Team PLAY) study. This randomized controlled trial is designed to test the efficacy of a 6-month, moderately intense, primary care feasible, family-based behavioral intervention, targeting both young children and their parent, in promoting healthy weight change. Participants are 270 overweight and obese children (ages 4 to 7 years) and their parent, who were recruited from a primarily African American urban population. Parents and children were instructed in proven cognitive behavioral techniques (e.g. goal setting, self-talk, stimulus control and reinforcement) designed to encourage healthier food choices (more whole grains, fruits and vegetables, and less concentrated fats and sugar), reduce portion sizes, decrease sweetened beverages and increase moderate to vigorous physical activity engagement. The main outcome of this study is change in BMI at two years post enrollment. Recruitment using reactive methods (mailings, TV ads, pamphlets) was found to be more successful than using only a proactive approach (referral through physicians). At baseline, most children were very obese with an average BMI z-score of 2.6. Reported intake of fruits and vegetables and minutes of moderate to vigorous physical activity engagement did not meet national recommendations. If efficacious, Team PLAY would offer a model for obesity treatment directed at families with young children that could be tested and translated to both community and primary care settings. PMID:22342450

  11. A new real-time method for detecting the effect of fentanyl using the preoperative pressure pain threshold and Narcotrend index: a randomized study in female surgery patients.

    PubMed

    Duan, Guangyou; Guo, Shanna; Zhan, Huiming; Qi, Dongmei; Zhang, Yuhao; Zhang, Xianwei

    2015-01-01

    Individual variability in the effects of opioid analgesics such as fentanyl remains a major challenge for tailored pharmacological treatment, including postoperative analgesia. This study aimed to establish a new real-time method for detecting the effects of fentanyl and their individual differences in the preoperative period, using the pressure pain threshold (PPT) and Narcotrend index (NTI) tests. Eighty women undergoing elective surgery under general anesthesia were enrolled in this randomized, double-blinded, placebo-controlled study to receive either intravenous fentanyl (Group F) or saline (Group S). Before (T1) and 5 (T2) and 10 min (T3) after intravenous injection, the PPT, NTI, respiratory rate, heart rate, blood pressure, and pulse oxygen saturation were measured. The initial time at which the Narcotrend index showed a decline was also recorded. In total, 40 patients in Group S and 38 patients in Group F were included in the final analysis. At 5 min and 10 min after intravenous fentanyl administration, the analgesic effect was determined by measuring the PPT, which was significantly increased (P < 0.001), and the sedative effect was detected using the NTI, which was significantly decreased (P < 0.001). The distribution of percentage changes of the PPT and NTI showed individual differences. At T2 and T3, the absolute changes in NTI and PPT were positively correlated (r = 0.444 at T2, P = 0.005; r = 0.332 at T3, P = 0.042). Through the PPT and NTI, it was feasible to easily detect the effects of fentanyl and their individual differences in real time before induction of anesthesia in the operating room. This method could potentially be applied to preoperatively determine patients' sensitivity to fentanyl.
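The correlations reported above (r = 0.444 at T2, r = 0.332 at T3) follow the standard Pearson formula; the paired change values below are invented for illustration and are not the study's data.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between paired observations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented per-patient changes after fentanyl: PPT rises while NTI falls,
# so the absolute changes correlate positively.
delta_ppt = [12, 30, 22, 8, 40, 25, 15, 33]
delta_nti = [-5, -14, -9, -3, -18, -12, -6, -15]
r = pearson_r(delta_ppt, [abs(d) for d in delta_nti])
```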

  12. Application of bimodal distribution to the detection of changes in uranium concentration in drinking water collected by random daytime sampling method from a large water supply zone.

    PubMed

    Garboś, Sławomir; Święcicka, Dorota

    2015-11-01

The random daytime (RDT) sampling method was used for the first time in the assessment of average weekly exposure to uranium through drinking water in a large water supply zone. The data set of uranium concentrations determined in 106 RDT samples, collected in three runs from the water supply zone in Wroclaw (Poland), could not be adequately described by a normal or log-normal distribution. Therefore, a numerical method designed for the detection and calculation of bimodal distributions was applied. The two extracted distributions, containing data from the summer season of 2011 and the winter season of 2012 (nI=72) and from the summer season of 2013 (nII=34), allowed the mean U concentrations in drinking water to be estimated: 0.947 μg/L and 1.23 μg/L, respectively. As the removal efficiency for uranium during the applied treatment process is negligible, the increase in uranium concentration can be explained by a higher U concentration in the surface-infiltration water used for the production of drinking water. During the summer season of 2013, heavy rains were observed in the Lower Silesia region, causing floods over the territory of the entire region. Fluctuations in uranium concentrations in surface-infiltration water can be attributed to releases of uranium from specific sources - migration from phosphate fertilizers and leaching from mineral deposits. Thus, exposure to uranium through drinking water may increase during extreme rainfall events. The average chronic weekly intakes of uranium through drinking water, estimated on the basis of the central values of the extracted normal distributions, accounted for 3.2% and 4.1% of the tolerable weekly intake, respectively.
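Detecting and calculating a bimodal distribution, as described above, amounts to fitting a two-component normal mixture. A minimal EM sketch, assuming synthetic concentrations drawn around the reported seasonal means (the sample sizes 72 and 34 are reused purely for illustration):

```python
import numpy as np

def fit_two_normals(x, iters=200):
    """EM for a two-component 1-D Gaussian mixture (bimodal fit).
    Returns component means (sorted ascending) and matching mixing weights."""
    x = np.asarray(x, dtype=float)
    mu = np.percentile(x, [25, 75])          # initial means from quartiles
    var = np.full(2, x.var())                # initial variances
    w = np.array([0.5, 0.5])                 # mixing weights
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        pdf = w * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        resp = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, variances
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    order = np.argsort(mu)
    return mu[order], w[order]

rng = np.random.default_rng(0)
# Synthetic concentrations mimicking the reported seasonal means (illustrative only)
sample = np.concatenate([rng.normal(0.947, 0.05, 72), rng.normal(1.23, 0.05, 34)])
means, weights = fit_two_normals(sample)
```

On well-separated data like this, the recovered means land close to the two cluster centres, mirroring the two seasonal concentration regimes the paper extracts.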

  13. Application of the Random Forest method to analyse epidemiological and phenotypic characteristics of Salmonella 4,[5],12:i:- and Salmonella Typhimurium strains.

    PubMed

    Barco, L; Mancin, M; Ruffa, M; Saccardin, C; Minorello, C; Zavagnin, P; Lettini, A A; Olsen, J E; Ricci, A

    2012-11-01

Salmonella enterica 4,[5],12:i:- is a monophasic variant of S. Typhimurium whose prevalence has risen sharply in the last decade. Although S. 4,[5],12:i:- and S. Typhimurium are known to pose a considerable public health risk, there is no detailed information on the circulation of these serovars in Italy, particularly as far as veterinary isolates are concerned. For this reason, a data set of 877 strains isolated in the north-east of Italy from foodstuffs, animals, and the environment during 2005-2010 was analysed. The Random Forests (RF) method was used to identify the most important epidemiological and phenotypic variables distinguishing the two serovars. Both descriptive analysis and RF revealed that S. 4,[5],12:i:- is less heterogeneous than S. Typhimurium. RF highlighted that phage type was the most important variable for differentiating the two serovars. The most common phage types identified for S. 4,[5],12:i:- were DT20a, U311 and DT193. The same phage types were also found in S. Typhimurium isolates, although with a much lower prevalence. DT7 and DT120 were ascribed to the two serovars at comparable levels. DT104, DT2 and DT99 were ascribed exclusively to S. Typhimurium, and almost all the other phage types identified were more related to the latter serovar. These data confirm that phage typing can provide an indication of the biphasic or monophasic state of the strains investigated and could therefore support serotyping results. However, phage typing cannot be used as the definitive method to differentiate the two serovars, because some phage types were detected for both serovars and, in particular, all phage types found for S. 4,[5],12:i:- were also found for S. Typhimurium.
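Variable-importance ranking of the kind reported above can be sketched with a standard Random Forests implementation. The following assumes scikit-learn is available and uses an invented integer encoding of phage type plus two noise variables (not the study's data):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 600
# Hypothetical encoding: the target serovar strongly determines phage type,
# while region and year carry no signal.
serovar = rng.integers(0, 2, n)                    # 0 = Typhimurium, 1 = monophasic
phage_type = serovar * 3 + rng.integers(0, 2, n)   # strongly informative variable
region = rng.integers(0, 5, n)                     # uninformative noise
year = rng.integers(0, 6, n)                       # uninformative noise
X = np.column_stack([phage_type, region, year])

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, serovar)
importances = rf.feature_importances_  # phage type should rank first
```

The impurity-based `feature_importances_` attribute plays the role of the paper's variable ranking: on this synthetic setup the phage-type column dominates, just as phage type dominated the real analysis.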

  14. Game-Based E-Learning Is More Effective than a Conventional Instructional Method: A Randomized Controlled Trial with Third-Year Medical Students

    PubMed Central

    Boeker, Martin; Andel, Peter; Vach, Werner; Frankenschmidt, Alexander

    2013-01-01

Background When compared with more traditional instructional methods, game-based e-learning (GbEl) promises higher learner motivation by presenting contents in an interactive, rule-based and competitive way. Most recent systematic reviews and meta-analyses of studies on game-based learning and GbEl in the medical professions have shown limited effects of these instructional methods. Objectives To compare the effectiveness, in terms of learning outcome, of game-based e-learning (GbEl) instruction with conventional script-based instruction in the teaching of phase contrast microscopy urinalysis under routine training conditions of undergraduate medical students. Methods A randomized controlled trial was conducted with 145 medical students in their third year of training in the Department of Urology at the University Medical Center Freiburg, Germany. 82 subjects were allocated for training with an educational adventure game (GbEl group) and 69 subjects for conventional training with a written script-based approach (script group). Learning outcome was measured with a 34-item single-choice test. Students' attitudes were collected by a questionnaire regarding fun with the training, motivation to continue the training and self-assessment of acquired knowledge. Results The students in the GbEl group achieved significantly better results in the cognitive knowledge test than the students in the script group: the mean score was 28.6 for the GbEl group and 26.0 for the script group out of a total of 34.0 points, with a Cohen's d effect size of 0.71 (ITT analysis). Attitudes towards the recent learning experience were significantly more positive with GbEl. Students reported having more fun while learning with the game when compared to the script-based approach. Conclusions Game-based e-learning is more effective than a script-based approach for the training of urinalysis with regard to cognitive learning outcome and has a highly positive motivational impact on learning.
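Cohen's d with a pooled standard deviation, as reported above, is computed as follows. The group SD of 3.66 is an assumed value chosen so the result reproduces the reported d of 0.71 (the abstract does not state the SDs):

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Cohen's d for two independent groups, using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Reported means (28.6 vs 26.0) and group sizes (82 vs 69);
# the SD of 3.66 is a hypothetical value for illustration.
d = cohens_d(28.6, 3.66, 82, 26.0, 3.66, 69)
```

With equal group SDs the pooled SD equals that SD, so d is simply the mean difference (2.6 points) divided by 3.66, about 0.71, a medium-to-large effect by conventional benchmarks.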

  15. Shoulder Arthroscopy Does Not Adequately Visualize Pathology of the Long Head of Biceps Tendon

    PubMed Central

    Saithna, Adnan; Longo, Alison; Leiter, Jeff; Old, Jason; MacDonald, Peter M.

    2016-01-01

    Background: Pulling the long head of the biceps tendon into the joint at arthroscopy is a common method for evaluation of tendinopathic lesions. However, the rate of missed diagnoses when using this technique is reported to be as high as 30% to 50%. Hypothesis: Tendon excursion achieved using a standard arthroscopic probe does not allow adequate visualization of extra-articular sites of predilection of tendinopathy. Study Design: Descriptive laboratory study. Methods: Seven forequarter amputation cadaveric specimens were evaluated. The biceps tendon was tagged to mark the intra-articular length and the maximum excursions achieved using a probe and a grasper in both beach-chair and lateral positions. Statistical analyses were performed using analysis of variance to compare means. Results: The mean intra-articular and extra-articular lengths of the tendons were 23.9 and 82.3 mm, respectively. The length of tendon that could be visualized by pulling it into the joint with a probe through the anterior midglenoid portal was not significantly different when using either lateral decubitus (mean ± SD, 29.9 ± 3.89 mm; 95% CI, 25.7-34 mm) or beach-chair positions (32.7 ± 4.23 mm; 95% CI, 28.6-36.8 mm). The maximum length of the overall tendon visualized in any specimen using a standard technique was 37 mm. Although there was a trend to greater excursion using a grasper through the same portal, this was not statistically significant. However, using a grasper through the anterosuperior portal gave a significantly greater mean excursion than any other technique (46.7 ± 4.31 mm; 95% CI, 42.6-50.8 mm), but this still failed to allow evaluation of Denard zone C. Conclusion: Pulling the tendon into the joint with a probe via an anterior portal does not allow visualization of distal sites of predilection of pathology. Surgeons should be aware that this technique is inadequate and can result in missed diagnoses. Clinical Relevance: This study demonstrates that glenohumeral

  16. 75 FR 5893 - Suspension of Community Eligibility for Failure To Maintain Adequate Floodplain Management...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-05

    ... FR 51735. Executive Order 13132, Federalism. This rule involves no policies that have ] federalism....C. 4001 et seq., Reorganization Plan No. 3 of 1978, 3 CFR, 1978 Comp., p. 329; E.O. 12127, 44 FR... To Maintain Adequate Floodplain Management Regulations AGENCY: Federal Emergency Management...

  17. 26 CFR 1.467-2 - Rent accrual for section 467 rental agreements without adequate interest.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... interest (the stated rate of interest) on deferred or prepaid fixed rent at a single fixed rate (as defined in § 1.1273-1(c)(1)(iii)); (B) The stated rate of interest on fixed rent is no lower than 110 percent... provide for a variable rate of interest. For purposes of the adequate interest test under paragraph...

  18. Towards Defining Adequate Lithium Trials for Individuals with Mental Retardation and Mental Illness.

    ERIC Educational Resources Information Center

    Pary, Robert J.

    1991-01-01

    Use of lithium with mentally retarded individuals with psychiatric conditions and/or behavior disturbances is discussed. The paper describes components of an adequate clinical trial and reviews case studies and double-blind cases. The paper concludes that aggression is the best indicator for lithium use, and reviews treatment parameters and…

  19. How Much and What Kind? Identifying an Adequate Technology Infrastructure for Early Childhood Education. Policy Brief

    ERIC Educational Resources Information Center

    Daugherty, Lindsay; Dossani, Rafiq; Johnson, Erin-Elizabeth; Wright, Cameron

    2014-01-01

    To realize the potential benefits of technology use in early childhood education (ECE), and to ensure that technology can help to address the digital divide, providers, families of young children, and young children themselves must have access to an adequate technology infrastructure. The goals for technology use in ECE that a technology…

  20. Evaluating the Reliability of Selected School-Based Indices of Adequate Reading Progress

    ERIC Educational Resources Information Center

    Wheeler, Courtney E.

    2010-01-01

    The present study examined the stability (i.e., 4-month and 12-month test-retest reliability) of six selected school-based indices of adequate reading progress. The total sampling frame included between 3970 and 5655 schools depending on the index and research question. Each school had at least 40 second-grade students that had complete Oral…

  1. Understanding the pelvic pain mechanism is key to find an adequate therapeutic approach.

    PubMed

    Van Kerrebroeck, Philip

    2016-06-25

Pain is a natural response to actual or potential tissue damage and involves both a sensory and an emotional experience. In chronic pelvic pain, the localisation of pain can be widespread and can cause considerable distress. A multidisciplinary approach is needed in order to fully understand the pelvic pain mechanism and to identify an adequate therapeutic approach.

  2. 33 CFR 155.4050 - Ensuring that the salvors and marine firefighters are adequate.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Ensuring that the salvors and marine firefighters are adequate. 155.4050 Section 155.4050 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) POLLUTION OIL OR HAZARDOUS MATERIAL POLLUTION...

  3. Performance Effects of Failure to Make Adequate Yearly Progress (AYP): Evidence from a Regression Discontinuity Framework

    ERIC Educational Resources Information Center

    Hemelt, Steven W.

    2011-01-01

    As the No Child Left Behind (NCLB) law moves through the reauthorization process, it is important to understand the basic performance impacts of its central structure of accountability. In this paper, I examine the effects of failure to make Adequate Yearly Progress (AYP) under NCLB on subsequent student math and reading performance at the school…

  4. Determining Adequate Yearly Progress in a State Performance or Proficiency Index Model

    ERIC Educational Resources Information Center

    Erpenbach, William J.

    2009-01-01

    The purpose of this paper is to present an overview regarding how several states use a performance or proficiency index in their determination of adequate yearly progress (AYP) under the No Child Left Behind Act of 2001 (NCLB). Typically, indexes are based on one of two weighting schemes: (1) either they weight academic performance levels--also…

  5. The Relationship between Parental Involvement and Adequate Yearly Progress among Urban, Suburban, and Rural Schools

    ERIC Educational Resources Information Center

    Ma, Xin; Shen, Jianping; Krenn, Huilan Y.

    2014-01-01

    Using national data from the 2007-08 School and Staffing Survey, we compared the relationships between parental involvement and school outcomes related to adequate yearly progress (AYP) in urban, suburban, and rural schools. Parent-initiated parental involvement demonstrated significantly positive relationships with both making AYP and staying off…

  6. Effect of tranquilizers on animal resistance to the adequate stimuli of the vestibular apparatus

    NASA Technical Reports Server (NTRS)

    Maksimovich, Y. B.; Khinchikashvili, N. V.

    1980-01-01

The effect of tranquilizers on vestibulospinal reflexes and motor activity was studied in 900 centrifuged albino mice. Actometric studies have shown that tranquilizers, as a group, are capable of increasing animal resistance to the action of adequate stimuli to the vestibular apparatus.

  7. Human milk feeding supports adequate growth in infants

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Despite current nutritional strategies, premature infants remain at high risk for extrauterine growth restriction. The use of an exclusive human milk-based diet is associated with decreased incidence of necrotizing enterocolitis (NEC), but concerns exist about infants achieving adequate growth. The ...

  8. [Factors associated with adequate fruit and vegetable intake by schoolchildren in Santa Catarina State, Brazil].

    PubMed

    Costa, Larissa da Cunha Feio; Vasconcelos, Francisco de Assis Guedes de; Corso, Arlete Catarina Tittoni

    2012-06-01

    This study aimed to estimate fruit and vegetable intake and identify associated factors among schoolchildren in Santa Catarina State, Brazil. A cross-sectional study was conducted with 4,964 students from public and private schools in eight districts in the State, analyzing socioeconomic and anthropometric data and dietary intake. Adequate fruit and vegetable intake was defined as five or more servings per day. Poisson regression was performed to test associations between fruit and vegetable intake and independent variables (p < 0.05). Adequate intake was found in 2.7% of children, while 26.6% of the sample did not consume any fruits and vegetables. In the analysis of the association between independent variables and adequate fruit and vegetable intake in the total sample, only geographic region (residents in western Santa Catarina) and consumption of candy were significantly associated. In the stratified analysis by sex, for boys, only geographic region was associated, while among girls, region and candy consumption were significantly associated with adequate fruit and vegetable intake. The findings indicate the need for specific strategies in the school community to improve fruit and vegetable intake by schoolchildren.
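Poisson regression of a binary outcome, as used above, yields prevalence ratios as exponentiated coefficients. A minimal Newton-method sketch on invented intake data (not the study's), assuming the hypothetical helper name `poisson_glm`:

```python
import numpy as np

def poisson_glm(X, y, iters=50):
    """Fit a Poisson regression (log link) by Newton's method; returns coefficients."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = np.exp(X @ beta)
        grad = X.T @ (y - mu)            # score vector
        hess = X.T @ (X * mu[:, None])   # Fisher information
        beta += np.linalg.solve(hess, grad)
    return beta

# Illustrative data: outcome = adequate intake (0/1), exposure = region (0/1).
# Applying a Poisson model to a binary outcome estimates prevalence ratios
# rather than the odds ratios of logistic regression.
exposure = np.array([0] * 50 + [1] * 50)
outcome = np.array([1] * 5 + [0] * 45 + [1] * 15 + [0] * 35)
X = np.column_stack([np.ones(100), exposure])
beta = poisson_glm(X, outcome)
prevalence_ratio = np.exp(beta[1])   # 15/50 vs 5/50 gives PR = 3
```

With a single binary covariate, the MLE of the exposure coefficient is exactly the log ratio of the two group prevalences, which makes the sketch easy to sanity-check.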

  9. 42 CFR 438.207 - Assurances of adequate capacity and services.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 4 2014-10-01 2014-10-01 false Assurances of adequate capacity and services. 438.207 Section 438.207 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS MANAGED CARE Quality Assessment and...

  10. 42 CFR 438.207 - Assurances of adequate capacity and services.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 4 2011-10-01 2011-10-01 false Assurances of adequate capacity and services. 438.207 Section 438.207 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS MANAGED CARE Quality Assessment and...

  11. 42 CFR 438.207 - Assurances of adequate capacity and services.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 4 2013-10-01 2013-10-01 false Assurances of adequate capacity and services. 438.207 Section 438.207 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS MANAGED CARE Quality Assessment and...

  12. 42 CFR 438.207 - Assurances of adequate capacity and services.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 4 2012-10-01 2012-10-01 false Assurances of adequate capacity and services. 438.207 Section 438.207 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS MANAGED CARE Quality Assessment and...

  13. Percentage of Adults with High Cholesterol Whose LDL Cholesterol Levels Are Adequately Controlled

    MedlinePlus

... of Adults with High Cholesterol Whose LDL Cholesterol Levels are Adequately Controlled. High cholesterol can double a ... with High Cholesterol that is Controlled by Education Level ...

  14. Perceptions of Teachers in Their First Year of School Restructuring: Failure to Make Adequate Yearly Progress

    ERIC Educational Resources Information Center

    Moser, Sharon

    2010-01-01

The 2007-2008 school year marked the first year that Florida's Title I schools that had not made Adequate Yearly Progress (AYP) for five consecutive years entered into restructuring as mandated by the "No Child Left Behind Act" of 2001. My study examines the perceptions of teachers entering their first year of school restructuring due to failure to…

  15. The Unequal Effect of Adequate Yearly Progress: Evidence from School Visits

    ERIC Educational Resources Information Center

    Brown, Abigail B.; Clift, Jack W.

    2010-01-01

    The authors report insights, based on annual site visits to elementary and middle schools in three states from 2004 to 2006, into the incentive effect of the No Child Left Behind Act's requirement that increasing percentages of students make Adequate Yearly Progress (AYP) in every public school. They develop a framework, drawing on the physics…

  16. Influenza 2005-2006: vaccine supplies adequate, but bird flu looms.

    PubMed

    Mossad, Sherif B

    2005-11-01

    Influenza vaccine supplies appear to be adequate for the 2005-2006 season, though delivery has been somewhat delayed. However, in the event of a pandemic of avian flu-considered inevitable by most experts, although no one knows when it will happen-the United States would be woefully unprepared. PMID:16315443

  17. Prenatal zinc supplementation of zinc-adequate rats adversely affects immunity in offspring

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We previously showed that zinc (Zn) supplementation of Zn-adequate dams induced immunosuppressive effects that persist in the offspring after weaning. We investigated whether the immunosuppressive effects were due to in utero exposure and/or mediated via milk using a cross-fostering design. Pregnant...

  18. Inferential Processing among Adequate and Struggling Adolescent Comprehenders and Relations to Reading Comprehension

    ERIC Educational Resources Information Center

    Barth, Amy E.; Barnes, Marcia; Francis, David; Vaughn, Sharon; York, Mary

    2015-01-01

    Separate mixed model analyses of variance were conducted to examine the effect of textual distance on the accuracy and speed of text consistency judgments among adequate and struggling comprehenders across grades 6-12 (n = 1,203). Multiple regressions examined whether accuracy in text consistency judgments uniquely accounted for variance in…

  19. What Is the Cost of an Adequate Vermont High School Education?

    ERIC Educational Resources Information Center

    Rucker, Frank D.

    2010-01-01

    Access to an adequate education has been widely considered an undeniable right since Chief Justice Warren stated in his landmark decision that "Today, education is perhaps the most important function of state and local governments...it is doubtful that any child may reasonably be expected to succeed in life if he is denied the opportunity of an…

  20. Calculating and Reducing Errors Associated with the Evaluation of Adequate Yearly Progress.

    ERIC Educational Resources Information Center

    Hill, Richard

    In the Spring, 1996, issue of "CRESST Line," E. Baker and R. Linn commented that, in efforts to measure the progress of schools, "the fluctuations due to differences in the students themselves could conceal differences in instructional effects." This is particularly true in the context of the evaluation of adequate yearly progress required by…

  1. 26 CFR 1.467-2 - Rent accrual for section 467 rental agreements without adequate interest.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... provide for a variable rate of interest. For purposes of the adequate interest test under paragraph (b)(1) of this section, if a section 467 rental agreement provides for variable interest, the rental... date as the issue date) for the variable rates called for by the rental agreement. For purposes of...

  2. 26 CFR 1.467-2 - Rent accrual for section 467 rental agreements without adequate interest.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... provide for a variable rate of interest. For purposes of the adequate interest test under paragraph (b)(1) of this section, if a section 467 rental agreement provides for variable interest, the rental... date as the issue date) for the variable rates called for by the rental agreement. For purposes of...

  3. 9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 9 Animals and Animal Products 1 2012-01-01 2012-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Attending...

  4. 9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 9 Animals and Animal Products 1 2014-01-01 2014-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Attending...

  5. 9 CFR 2.33 - Attending veterinarian and adequate veterinary care.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 9 Animals and Animal Products 1 2011-01-01 2011-01-01 false Attending veterinarian and adequate veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Research Facilities § 2.33 Attending veterinarian...

  6. 9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 9 Animals and Animal Products 1 2013-01-01 2013-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Attending...

  7. 9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 9 Animals and Animal Products 1 2011-01-01 2011-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Attending...

  8. 9 CFR 2.33 - Attending veterinarian and adequate veterinary care.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 9 Animals and Animal Products 1 2014-01-01 2014-01-01 false Attending veterinarian and adequate veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Research Facilities § 2.33 Attending veterinarian...

  9. 9 CFR 2.33 - Attending veterinarian and adequate veterinary care.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 9 Animals and Animal Products 1 2012-01-01 2012-01-01 false Attending veterinarian and adequate veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Research Facilities § 2.33 Attending veterinarian...

  10. 9 CFR 2.33 - Attending veterinarian and adequate veterinary care.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 9 Animals and Animal Products 1 2010-01-01 2010-01-01 false Attending veterinarian and adequate veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Research Facilities § 2.33 Attending veterinarian...

  11. 9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 9 Animals and Animal Products 1 2010-01-01 2010-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Attending...

  12. 9 CFR 2.33 - Attending veterinarian and adequate veterinary care.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 9 Animals and Animal Products 1 2013-01-01 2013-01-01 false Attending veterinarian and adequate veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Research Facilities § 2.33 Attending veterinarian...

  13. 36 CFR 13.960 - Who determines when there is adequate snow cover?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 36 Parks, Forests, and Public Property 1 2010-07-01 2010-07-01 false Who determines when there is adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR NATIONAL PARK SYSTEM UNITS IN ALASKA Special Regulations-Denali National Park...

  14. 42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 3 2011-10-01 2011-10-01 false Adequate financial records, statistical data, and... financial records, statistical data, and cost finding. (a) Maintenance of records. (1) An HMO or CMP must maintain sufficient financial records and statistical data for proper determination of costs payable by...

  15. How random are random numbers generated using photons?

    NASA Astrophysics Data System (ADS)

    Solis, Aldo; Angulo Martínez, Alí M.; Ramírez Alarcón, Roberto; Cruz Ramírez, Hector; U'Ren, Alfred B.; Hirsch, Jorge G.

    2015-06-01

    Randomness is fundamental in quantum theory, with many philosophical and practical implications. In this paper we discuss the concept of algorithmic randomness, which provides a quantitative method to assess the Borel normality of a given sequence of numbers, a necessary condition for it to be considered random. We use Borel normality as a tool to investigate the randomness of ten sequences of bits generated from the differences between detection times of photon pairs generated by spontaneous parametric downconversion. These sequences are shown to fulfil the randomness criteria without difficulties. As deviations from Borel normality for photon-generated random number sequences have been reported in previous work, a strategy to understand these diverging findings is outlined.
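The Borel normality criterion discussed above can be checked directly: for a sequence of n bits, every length-k block should occur among the non-overlapping blocks with frequency within sqrt(log2(n)/n) of 2^-k. The bound follows Calude's finite-sequence formulation; the helper below is an illustrative sketch, not the authors' code:

```python
import math
import random

def borel_normal(bits, max_k=3):
    """Calude-style Borel normality check: for each k up to max_k, every
    length-k block's frequency among non-overlapping blocks must lie within
    sqrt(log2(n)/n) of the ideal frequency 2**-k."""
    n = len(bits)
    bound = math.sqrt(math.log2(n) / n)
    for k in range(1, max_k + 1):
        m = n // k                       # number of non-overlapping blocks
        counts = {}
        for i in range(m):
            block = tuple(bits[i * k:(i + 1) * k])
            counts[block] = counts.get(block, 0) + 1
        for c in counts.values():
            if abs(c / m - 2 ** -k) > bound:
                return False
        # a block that never occurs has frequency 0, which also deviates
        if len(counts) < 2 ** k and 2 ** -k > bound:
            return False
    return True

# A seeded pseudo-random bit sequence as a stand-in for the photon data
random.seed(1)
sample_bits = [random.getrandbits(1) for _ in range(1 << 16)]
```

A constant sequence fails immediately at k = 1, while a well-mixed 2^16-bit sequence deviates from the ideal frequencies by far less than the bound.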

  16. Quantum random walk polynomial and quantum random walk measure

    NASA Astrophysics Data System (ADS)

    Kang, Yuanbao; Wang, Caishi

    2014-05-01

In this paper, we introduce quantum random walk polynomials (QRWP), defined as a polynomial sequence that is orthogonal with respect to a quantum random walk measure (QRWM), with parameters appearing in the recurrence relations that satisfy certain constraints. We first obtain some results on QRWP and QRWM, in which case the correspondence between measures and orthogonal polynomial sequences is one-to-one. It is shown that any measure with respect to which a quantum random walk polynomial sequence is orthogonal is a quantum random walk measure. We next collect some properties of QRWM; moreover, we extend Karlin and McGregor's representation formula for the transition probabilities of a quantum random walk (QRW) in the interacting Fock space, a result parallel to the CGMV method. Using these findings, we finally obtain some applications of QRWM that are of interest in the study of quantum random walks, highlighting the role played by QRWP and QRWM.
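Orthogonal polynomial sequences of this kind satisfy a three-term recurrence; with generic coefficients b_n and c_n (illustrative symbols, not necessarily the paper's notation) it reads:

```latex
P_{-1}(x) = 0, \qquad P_0(x) = 1,
\qquad
P_{n+1}(x) = (x - b_n)\,P_n(x) - c_n\,P_{n-1}(x), \qquad n \ge 0,
```

with b_n real and c_n > 0. Favard's theorem then guarantees the existence of a measure (here, a QRWM) with respect to which the sequence {P_n} is orthogonal, which is the correspondence the abstract refers to.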

  17. Prospective randomized comparison of gastrotomy closure associating tunnel access and over-the-scope clip (OTSC) with two other methods in an experimental ex vivo setting

    PubMed Central

    Gonzalez, Jean-Michel; Saito, Kayoko; Kang, Changdon; Gromski, Mark; Sawhney, Mandeep; Chuttani, Ram; Matthes, Kai

    2015-01-01

Background: Safe transgastric natural orifice transluminal endoscopic surgery (NOTES) procedures require a reliable closure of the gastrotomy. Recently, a novel peritoneal access method via a submucosal tunnel has been described, with encouraging preliminary results. Aim: The aim is to compare a submucosal tunnel access plus over-the-scope clip (OTSC) system for closure with two other closure modalities. Patients and methods: This is a prospective ex vivo study conducted on 42 porcine stomach models, equally randomized into three groups, in an academic medical center. The procedures performed in each group included: (1) Tunnel (6 cm) + endoclips; (2) Knife + balloon dilation access + OTSC; and (3) Tunnel + OTSC. A pressurized air-leak test was performed to evaluate the strength of the closure. Stomach volumes, procedure times, number of clips, and incision sizes were also recorded. Results: The mean air-leak pressure was statistically higher in Group 3 (95.2 ± 19.3 mmHg) than in Group 1 (72.5 ± 35.2 mmHg) or Group 2 (79.0 ± 24.5 mmHg) (P < 0.05). The gastrotomy creation times for Groups 1, 2, and 3 were 28.0 ± 10.1, 4.3 ± 1.4, and 20.1 ± 10.6 minutes, respectively, with a significantly shorter time in Group 2 (P < 0.001). The closure times were 16.1 ± 6.1, 6.5 ± 1.2, and 5.3 ± 3.0 minutes, respectively, and were significantly longer in the endoclip group (P < 0.001). There were no differences in volumes or incision sizes among the three groups. Conclusion: The combination of a submucosal tunnel access and OTSC offers a stronger closure than the other methods studied. PMID:26134780
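The three-group comparison above is a one-way ANOVA; the F statistic is the between-group mean square divided by the within-group mean square. A minimal sketch on invented leak-pressure readings (not the study's measurements):

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA: between-group mean square
    over within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Illustrative air-leak pressures (mmHg) for three closure groups
f_stat = one_way_anova_f([[70, 75, 73], [78, 80, 79], [94, 96, 95]])
```

A large F relative to the F(k-1, n-k) reference distribution is what licenses the "P < 0.05" claim; pairwise conclusions then require post hoc comparisons.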

  18. Simulation of Free Surface Dynamic in a Random Heterogeneous Porous Medium by the Method Based on Mapping the Regular Domain on the Flow Domain.

    NASA Astrophysics Data System (ADS)

    Lazarev, Y.; Petrov, P.; Tartakovsky, D. M.

    2002-12-01

    In this paper, the motion of a vacuum-incompressible fluid interface in a porous medium is considered, treating the conductivity of the medium as a random field with known statistics. The flow is described by a combination of mass conservation and Darcy's law. The use of a coordinate system tied to the moving fluid reduces the problem to the well-explored class of fixed-boundary problems, with an effective conductivity tensor in place of the initial scalar conductivity. The hydraulic head is represented as a series in powers of the effective conductivity fluctuations. The applied procedure is close to perturbation theory in the amplitude of the hydraulic conductivity fluctuations K̃, seeking the solution with accuracy up to K̃². In both cases, the variation of a physical quantity is taken proportional to its source: Ṽ = A · K̃, where A is a linear operator. Yet unlike perturbation theory, where A depends only on the undisturbed flow parameters, A = A(K̄), in the present approach A depends on the averaged flow parameters, A = A(K̄, ⟨K̃²⟩). Equations for the mean hydraulic head and mean flux, as well as expressions for the respective variances, are derived in the 2-D case. For 1-D flow, the derived solution agrees with the exact one to terms of order σ_K² at arbitrary free-surface fluctuations. Within this approach, the free-surface motion and the time evolution of the mean hydraulic head spatial distribution, the mean flux, and the relative correlation functions are described by a set of first-order partial differential equations. The preconditioned conjugate gradient method is proposed as the general numerical method for solving these equations to find the statistical moments of the hydraulic head; the symmetry and positive definiteness of the problem matrix underpin the method's applicability. The RFLOW code has been developed to solve this set of equations numerically. Testing data of the
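
    The preconditioned conjugate gradient step the abstract proposes relies exactly on the symmetry and positive definiteness it cites. A minimal sketch, with an illustrative SPD 1-D Laplacian-like matrix standing in for the hydraulic-head system and a simple Jacobi (diagonal) preconditioner rather than the RFLOW discretization:

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-10, max_iter=200):
    """Preconditioned conjugate gradients for an SPD matrix A.
    M_inv applies the inverse of the preconditioner to a residual."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Illustrative SPD system: a 1-D Laplacian-like stencil.
n = 50
A = (np.diag(np.full(n, 2.0))
     + np.diag(np.full(n - 1, -1.0), 1)
     + np.diag(np.full(n - 1, -1.0), -1))
b = np.ones(n)
d = np.diag(A)
x = pcg(A, b, lambda r: r / d)   # Jacobi preconditioner
assert np.allclose(A @ x, b, atol=1e-6)
```

    The method fails on indefinite or unsymmetric matrices (the curvature term p·Ap can vanish or go negative), which is why the abstract's appeal to symmetry and positive definiteness matters.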

  19. Treatment Outcomes of Corticosteroid Injection and Extracorporeal Shock Wave Therapy as Two Primary Therapeutic Methods for Acute Plantar Fasciitis: A Prospective Randomized Clinical Trial.

    PubMed

    Mardani-Kivi, Mohsen; Karimi Mobarakeh, Mahmoud; Hassanzadeh, Zabihallah; Mirbolook, Ahmadreza; Asadi, Kamran; Ettehad, Hossein; Hashemi-Motlagh, Keyvan; Saheb-Ekhtiari, Khashayar; Fallah-Alipour, Keyvan

    2015-01-01

    The outcome of corticosteroid injection (CSI) and extracorporeal shock wave therapy (ESWT) as primary treatment of acute plantar fasciitis has been debated. The purpose of the present study was to evaluate and compare the therapeutic effects of CSI and ESWT in patients with acute (<6-week duration) symptomatic plantar fasciitis. Of the 116 eligible patients, 68 were randomized to 2 equal groups of 34 patients, each undergoing either ESWT or CSI. The ESWT method included 2000 impulses with energy of 0.15 mJ/mm(2) and a total energy flux density of 900 mJ/mm(2) for 3 consecutive sessions at 1-week intervals. In the CSI group, 40 mg of methyl prednisolone acetate plus 1 mL of lidocaine 2% was injected into the maximal tenderness point at the inframedial calcaneal tuberosity. The success and recurrence rates and pain intensity, measured using a visual analog scale, were recorded and compared at the 3-month follow-up visit. Pain intensity was reduced significantly in all patients undergoing either technique. However, the value and trend of pain reduction in the CSI group were significantly greater than those in the ESWT group (p < .0001). In the ESWT and CSI groups, 19 (55.9%) and 5 (14.7%) patients experienced treatment failure, respectively. Age, gender, body mass index, and recurrence rate were similar between the 2 groups (p > .05). Both ESWT and CSI can be used as the primary and/or initial treatment option for treating patients with acute plantar fasciitis; however, the CSI technique had better therapeutic outcomes. PMID:26215551
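
    The failure-rate gap above (19/34 for ESWT vs 5/34 for CSI) can be checked with a pooled two-proportion z-test; this is a back-of-the-envelope sketch and not necessarily the analysis the authors performed.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z statistic for x1/n1 vs x2/n2."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)          # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Reported treatment failures: 19/34 (ESWT) vs 5/34 (CSI).
z = two_proportion_z(19, 34, 5, 34)
assert abs(z) > 1.96   # exceeds the 5% two-sided critical value
```

    The statistic comes out well above the 1.96 critical value, consistent with the failure rates differing significantly between the two groups.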
