Science.gov

Sample records for adequate randomization methods

  1. Are shear force methods adequately reported?

    PubMed

    Holman, Benjamin W B; Fowler, Stephanie M; Hopkins, David L

    2016-09-01

    This study aimed to determine the detail to which shear force (SF) protocols and methods have been reported in the scientific literature between 2009 and 2015. Articles (n=734) published in peer-reviewed animal and food science journals and limited to only those testing the SF of unprocessed and non-fabricated mammal meats were evaluated. It was found that most of these SF articles originated in Europe (35.3%), investigated bovine species (49.0%), measured m. longissimus samples (55.2%), and used tenderometers manufactured by Instron (31.2%) equipped with Warner-Bratzler blades (68.8%). SF samples were also predominantly thawed prior to cooking (37.1%) and cooked sous vide, using a water bath (50.5%). Information pertaining to blade crosshead speed (47.5%), recorded SF resistance (56.7%), muscle fibre orientation when tested (49.2%), sub-section or core dimension (21.8%), end-point temperature (29.3%), and other factors contributing to SF variation was often omitted. This failure to report basic methodology diminishes repeatability and the accuracy of SF interpretation, and must therefore be rectified.

  2. Improved ASTM G72 Test Method for Ensuring Adequate Fuel-to-Oxidizer Ratios

    NASA Technical Reports Server (NTRS)

    Juarez, Alfredo; Harper, Susana Tapia

    2016-01-01

    The ASTM G72/G72M-15 Standard Test Method for Autogenous Ignition Temperature of Liquids and Solids in a High-Pressure Oxygen-Enriched Environment is currently used to evaluate materials for the ignition susceptibility driven by exposure to external heat in an enriched oxygen environment. Testing performed on highly volatile liquids such as cleaning solvents has proven problematic due to inconsistent test results (non-ignitions). Non-ignition results can be misinterpreted as favorable oxygen compatibility, although they are more likely associated with inadequate fuel-to-oxidizer ratios. Forced evaporation during purging and inadequate sample size were identified as two potential causes for inadequate available sample material during testing. In an effort to maintain adequate fuel-to-oxidizer ratios within the reaction vessel during testing, several parameters were considered, including sample size, pretest sample chilling, pretest purging, and test pressure. Tests on a variety of solvents exhibiting a range of volatilities are presented in this paper. A proposed improvement to the standard test protocol as a result of this evaluation is also presented. The final proposed test protocol outlines an incremental-step method for determining optimal conditions, using increased sample sizes while respecting test-system safety limits. The proposed improved test method increases confidence in results obtained by utilizing the ASTM G72 autogenous ignition temperature test method and can aid in the oxygen compatibility assessment of highly volatile liquids and other conditions that may lead to false non-ignition results.

  3. [Adequate application of quantitative and qualitative statistic analytic methods in acupuncture clinical trials].

    PubMed

    Tan, Ming T; Liu, Jian-ping; Lao, Lixing

    2012-08-01

    Recently, proper use of the statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis methods have been widely used in acupuncture and TCM clinical trials, while between-group quantitative analysis methods on clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences was discussed based on the data of RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis while qualitative analysis can be a secondary criterion for analysis. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.
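
    The contrast drawn in this abstract can be illustrated with a toy calculation: a between-group quantitative analysis compares mean symptom scores directly, while a qualitative (responder-type) analysis dichotomizes each patient against a threshold. The scores and the response cutoff below are invented for illustration only, not trial data.

```python
from statistics import mean

# Illustrative pain scores on a 0-10 scale; invented, not trial data
acupuncture = [3.1, 2.8, 4.0, 3.5, 2.2, 3.9, 2.5, 3.0]
control     = [4.8, 5.2, 4.1, 5.5, 4.9, 4.4, 5.0, 4.6]

# Quantitative between-group analysis: difference in mean symptom scores
diff = mean(acupuncture) - mean(control)

# Qualitative (responder) analysis: proportion of patients below a cutoff
def responders(scores, cutoff=4.0):
    return sum(s < cutoff for s in scores) / len(scores)

print(responders(acupuncture), responders(control))  # 0.875 0.0
```

    The two summaries can disagree in power and interpretation, which is why the authors recommend the quantitative comparison as primary and the responder analysis as secondary.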

  4. Randomization methods in emergency setting trials: a descriptive review

    PubMed Central

    Moe‐Byrne, Thirimon; Oddie, Sam; McGuire, William

    2015-01-01

    Background Quasi‐randomization might expedite recruitment into trials in emergency care settings but may also introduce selection bias. Methods We searched the Cochrane Library and other databases for systematic reviews of interventions in emergency medicine or urgent care settings. We assessed selection bias (baseline imbalances) in prognostic indicators between treatment groups in trials using true randomization versus trials using quasi‐randomization. Results Seven reviews contained 16 trials that used true randomization and 11 that used quasi‐randomization. Baseline group imbalance was identified in four trials using true randomization (25%) and in two quasi‐randomized trials (18%). Of the four truly randomized trials with imbalance, three concealed treatment allocation adequately. Clinical heterogeneity and poor reporting limited the assessment of trial recruitment outcomes. Conclusions We did not find strong or consistent evidence that quasi‐randomization is associated with selection bias more often than true randomization. High risk of bias judgements for quasi‐randomized emergency studies should therefore not be assumed in systematic reviews. Clinical heterogeneity across trials within reviews, coupled with limited availability of relevant trial accrual data, meant it was not possible to adequately explore the possibility that true randomization might result in slower trial recruitment rates, or the recruitment of less representative populations. © 2015 The Authors. Research Synthesis Methods published by John Wiley & Sons, Ltd. PMID:26333419
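
    The distinction this review turns on can be sketched in a few lines: true randomization assigns each participant by an unpredictable draw, whereas quasi-randomization uses a deterministic rule (alternation, in this hypothetical example) that recruiters could foresee, which is the mechanism for selection bias.

```python
import random

def true_randomization(n_patients, seed=0):
    """Allocate each patient by an independent, unpredictable coin flip."""
    rng = random.Random(seed)
    return [rng.choice(["treatment", "control"]) for _ in range(n_patients)]

def quasi_randomization(n_patients):
    """Allocate by alternation (odd/even arrival order): deterministic,
    so the next assignment is foreseeable -- a selection-bias risk."""
    return ["treatment" if i % 2 == 0 else "control" for i in range(n_patients)]

print(quasi_randomization(4))  # ['treatment', 'control', 'treatment', 'control']
```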

  5. Are adequate methods available to detect protist parasites on fresh produce?

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Human parasitic protists such as Cryptosporidium, Giardia and microsporidia contaminate a variety of fresh produce worldwide. Existing detection methods lack sensitivity and specificity for most foodborne parasites. Furthermore, detection has been problematic because these parasites adhere tenacious...

  6. Estimating the benefits of maintaining adequate lake levels to homeowners using the hedonic property method

    NASA Astrophysics Data System (ADS)

    Loomis, John; Feldman, Marvin

    2003-09-01

    The hedonic property method was used to estimate residents' economic benefits from maintaining high and stable lake levels at Lake Almanor, California. Nearly a thousand property transactions over a 14-year period from 1987 to 2001 were analyzed. The linear hedonic property regression explained more than 60% of the variation in house prices. Property prices were negatively and significantly related to the number of linear feet of exposed lake shoreline. Each additional foot of exposed shoreline reduced the property price by $108-$119. A view of the lake added nearly $31,000 to house prices, while lakefront properties sold for $209,000 more than non-lakefront properties.
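
    As a rough illustration of the hedonic property method, the sketch below fits a linear price model by ordinary least squares on synthetic data. The data-generating coefficients are invented, chosen only to echo the magnitudes reported above; this is not the study's model or data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
exposed_ft = rng.uniform(0, 50, n)    # feet of exposed shoreline (hypothetical)
lake_view = rng.integers(0, 2, n)     # 1 if the house has a lake view
# Hypothetical data-generating process; coefficients are illustrative only
price = 300_000 - 110 * exposed_ft + 31_000 * lake_view + rng.normal(0, 5_000, n)

# Hedonic regression: price on house attributes, via least squares
X = np.column_stack([np.ones(n), exposed_ft, lake_view])
beta, *_ = np.linalg.lstsq(X, price, rcond=None)
print(beta)  # ~[intercept, shoreline effect, view effect]
```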

  7. Methods for selection of adequate neural network structures with application to early assessment of chest pain patients by biochemical monitoring.

    PubMed

    Ellenius, J; Groth, T

    2000-07-01

    A methodology for selecting, training and estimating the performance of adequate artificial neural network (ANN) structures and incorporating them with algorithms that are optimized for clinical decision making is presented. The methodology was applied to the problem of early ruling-in/ruling-out of patients with suspected acute myocardial infarction using frequent biochemical monitoring. The selection of adequate ANN structures from a set of candidates was based on criteria for model compatibility, parameter identifiability and diagnostic performance. The candidate ANN structures evaluated were the single-layer perceptron (SLP), the fuzzified SLP, the multiple SLP, the gated multiple SLP, the multi-layer perceptron (MLP) and the discrete-time recursive neural network. The identifiability of the ANNs was assessed in terms of the conditioning of the Hessian of the objective function, and variability of parameter estimates and decision boundaries in the trials of leave-one-out cross-validation. The commonly used MLP was shown to be non-identifiable for the present problem and available amount of data, despite artificially reducing the model complexity with use of regularization methods. The investigation is concluded by recommending a number of guidelines in order to obtain an adequate ANN model.

  8. Random vibration ESS adequacy prediction method

    NASA Astrophysics Data System (ADS)

    Lambert, Ronald G.

    Closed form analytical expressions have been derived and are used as part of the proposed method to quantitatively predict the adequacy of the random vibration portion of an Environmental Stress Screen (ESS) to meet its main objective for screening typical avionics electronic assemblies for workmanship defects without consuming excessive useful life. This method is limited to fatigue related defects (including initial damage/Fracture Mechanics effects) and requires defect fatigue and service environment parameter values. Examples are given to illustrate the method.

  9. Random Walk Method for Potential Problems

    NASA Technical Reports Server (NTRS)

    Krishnamurthy, T.; Raju, I. S.

    2002-01-01

    A local Random Walk Method (RWM) for potential problems governed by Laplace's and Poisson's equations is developed for two- and three-dimensional problems. The RWM is implemented and demonstrated in a multiprocessor parallel environment on a Beowulf cluster of computers. A speed gain of 16 is achieved as the number of processors is increased from 1 to 23.
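
    A minimal serial sketch of a random walk solver for Laplace's equation (not the authors' parallel implementation): on a grid, the solution at an interior point equals the expected boundary value at the point where a symmetric random walk first exits the domain. Grid size and boundary data below are illustrative.

```python
import random

def laplace_rw(grid_n, boundary, start, walks=2000, seed=1):
    """Estimate u(start) for Laplace's equation on a grid_n x grid_n grid:
    u(start) is the mean boundary value where symmetric random walks exit."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(walks):
        x, y = start
        while 0 < x < grid_n and 0 < y < grid_n:
            dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            x, y = x + dx, y + dy
        total += boundary(x, y)
    return total / walks

# Dirichlet data: u = 1 on the top edge, 0 on the other three sides
n = 10
u_center = laplace_rw(n, lambda x, y: 1.0 if y == n else 0.0, (n // 2, n // 2))
print(round(u_center, 2))  # close to 0.25 at the center, by symmetry
```

    Because each walk is independent, the method parallelizes trivially, which is what the Beowulf-cluster speedup above exploits.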

  10. Adverse prognostic value of peritumoral vascular invasion: is it abrogated by adequate endocrine adjuvant therapy? Results from two International Breast Cancer Study Group randomized trials of chemoendocrine adjuvant therapy for early breast cancer

    PubMed Central

    Viale, G.; Giobbie-Hurder, A.; Gusterson, B. A.; Maiorano, E.; Mastropasqua, M. G.; Sonzogni, A.; Mallon, E.; Colleoni, M.; Castiglione-Gertsch, M.; Regan, M. M.; Brown, R. W.; Golouh, R.; Crivellari, D.; Karlsson, P.; Öhlschlegel, C.; Gelber, R. D.; Goldhirsch, A.; Coates, A. S.

    2010-01-01

    Background: Peritumoral vascular invasion (PVI) may assist in assigning optimal adjuvant systemic therapy for women with early breast cancer. Patients and methods: Patients participated in two International Breast Cancer Study Group randomized trials testing chemoendocrine adjuvant therapies in premenopausal (trial VIII) or postmenopausal (trial IX) node-negative breast cancer. PVI was assessed by institutional pathologists and/or central review on hematoxylin–eosin-stained slides in 99% of patients (analysis cohort 2754 patients, median follow-up >9 years). Results: PVI, present in 23% of the tumors, was associated with higher grade tumors and larger tumor size (trial IX only). Presence of PVI increased locoregional and distant recurrence and was significantly associated with poorer disease-free survival. The adverse prognostic impact of PVI in trial VIII was limited to premenopausal patients with endocrine-responsive tumors randomized to therapies not containing goserelin, and conversely the beneficial effect of goserelin was limited to patients whose tumors showed PVI. In trial IX, all patients received tamoxifen: the adverse prognostic impact of PVI was limited to patients with receptor-negative tumors regardless of chemotherapy. Conclusion: Adequate endocrine adjuvant therapy appears to abrogate the adverse impact of PVI in node-negative disease, while PVI may identify patients who will benefit particularly from adjuvant therapy. PMID:19633051

  11. Social Franchising and a Nationwide Mass Media Campaign Increased the Prevalence of Adequate Complementary Feeding in Vietnam: A Cluster-Randomized Program Evaluation.

    PubMed

    Rawat, Rahul; Nguyen, Phuong Hong; Tran, Lan Mai; Hajeebhoy, Nemat; Nguyen, Huan Van; Baker, Jean; Frongillo, Edward A; Ruel, Marie T; Menon, Purnima

    2017-02-08

    Background: Rigorous evaluations of health system-based interventions in large-scale programs to improve complementary feeding (CF) practices are limited. Alive & Thrive applied principles of social franchising within the government health system in Vietnam to improve the quality of interpersonal counseling (IPC) for infant and young child feeding combined with a national mass media (MM) campaign and community mobilization (CM). Objective: We evaluated the impact of enhanced IPC + MM + CM (intensive) compared with standard IPC + less-intensive MM and CM (nonintensive) on CF practices and anthropometric indicators. Methods: A cluster-randomized, nonblinded evaluation design with cross-sectional surveys (n = ∼500 children aged 6-23.9 mo and ∼1000 children aged 24-59.9 mo/group) implemented at baseline (2010) and endline (2014) was used. Difference-in-difference estimates (DDEs) of impact were calculated for intent-to-treat (ITT) analyses and modified per-protocol analyses (MPAs; mothers who attended the social franchising at least once: 62%). Results: Groups were similar at baseline. In ITT analyses, there were no significant differences between groups in changes in CF practices over time. In the MPAs, greater improvements in the intensive than in the nonintensive group were seen for minimum dietary diversity [DDE: 6.4 percentage points (pps); P < 0.05] and minimum acceptable diet (8.0 pps; P < 0.05). Significant stunting declines occurred in both intensive (7.1 pps) and nonintensive (5.4 pps) groups among children aged 24-59.9 mo, with no differential decline. Conclusions: When combined with MM and CM, an at-scale social franchising approach to improve IPC, delivered through the existing health care system, significantly improved CF practices, but not child growth, among mothers who used counseling services at least once. A greater impact may be achieved with strategies designed to increase service utilization. This trial was registered at clinicaltrials.gov as NCT
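
    The difference-in-difference estimates (DDEs) reported here take the usual form: the change over time in the intensive group minus the change in the nonintensive group. The prevalences below are illustrative numbers, not the trial's data.

```python
def diff_in_diff(treat_base, treat_end, control_base, control_end):
    """Difference-in-difference estimate, in the units of the inputs."""
    return (treat_end - treat_base) - (control_end - control_base)

# Illustrative prevalences (%) of a feeding practice; not the trial's data
dde = diff_in_diff(treat_base=60.0, treat_end=80.0,
                   control_base=58.0, control_end=71.6)
print(round(dde, 1))  # 6.4 percentage points
```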

  12. Are Power Analyses Reported with Adequate Detail? Evidence from the First Wave of Group Randomized Trials Funded by the Institute of Education Sciences

    ERIC Educational Resources Information Center

    Spybrook, Jessaca

    2008-01-01

    This study examines the reporting of power analyses in the group randomized trials funded by the Institute of Education Sciences from 2002 to 2006. A detailed power analysis provides critical information that allows reviewers to (a) replicate the power analysis and (b) assess whether the parameters used in the power analysis are reasonable.…
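
    Among the parameters a detailed power analysis for a group randomized trial must report is the intraclass correlation (ICC), since clustering inflates the variance by the design effect 1 + (m - 1)·ICC. The sketch below approximates power for a two-arm trial using that design effect and a normal approximation; all inputs are hypothetical, and this is not the reporting framework the study itself uses.

```python
from math import sqrt
from statistics import NormalDist

def grt_power(n_clusters_per_arm, cluster_size, icc, effect_size, alpha=0.05):
    """Approximate power of a two-arm group (cluster) randomized trial:
    shrink the per-arm sample size by the design effect, then apply a
    two-sided normal-approximation power formula."""
    deff = 1 + (cluster_size - 1) * icc          # design effect
    n_eff = n_clusters_per_arm * cluster_size / deff   # effective n per arm
    se = sqrt(2 / n_eff)                          # SE of standardized mean diff
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    return NormalDist().cdf(effect_size / se - z_alpha)

# Hypothetical design: 20 clusters/arm of 25 students, ICC 0.05, effect 0.25 SD
print(round(grt_power(20, 25, 0.05, 0.25), 2))
```

    Reporting each of these inputs is what lets a reviewer replicate the calculation and judge whether the assumed ICC and effect size are reasonable.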

  13. Are the defined substrate-based methods adequate to determine the microbiological quality of natural recreational waters?

    PubMed

    Valente, Marta Sofia; Pedro, Paulo; Alonso, M Carmen; Borrego, Juan J; Dionísio, Lídia

    2010-03-01

    Monitoring the microbiological quality of water used for recreational activities is very important to human public health. Although the sanitary quality of recreational marine waters could be evaluated by standard methods, they are time-consuming and need confirmation. For these reasons, faster and more sensitive methods, such as the defined substrate-based technology, have been developed. In the present work, we have compared the standard method of membrane filtration using Tergitol-TTC agar for total coliforms and Escherichia coli, and Slanetz and Bartley agar for enterococci, and the IDEXX defined substrate technology for these faecal pollution indicators to determine the microbiological quality of natural recreational waters. The ISO 17994:2004 standard was used to compare these methods. The IDEXX test for total coliforms and E. coli, Colilert, showed higher values than those obtained by the standard method. The Enterolert test, for the enumeration of enterococci, showed lower values when compared with the standard method. It may be concluded that more studies to evaluate the precision and accuracy of the rapid tests are required in order to apply them for routine monitoring of marine and freshwater recreational bathing areas. The main advantages of these methods are that they are more specific, feasible and simpler than the standard methodology.

  14. Sham Electroacupuncture Methods in Randomized Controlled Trials

    PubMed Central

    Chen, Zi-xian; Li, Yan; Zhang, Xiao-guang; Chen, Shuang; Yang, Wen-ting; Zheng, Xia-wei; Zheng, Guo-qing

    2017-01-01

    Sham electroacupuncture (EA) control is commonly used to evaluate the specific effects of EA in randomized-controlled trials (RCTs). However, establishing an inert and concealable sham EA control remains methodologically challenging. Here, we aimed to systematically investigate the sham EA methods. Eight electronic databases were searched from their inception to April 2015. Ten out of the 17 sham EA methods were identified from 94 RCTs involving 6134 participants according to three aspects: needle location, depth of needle insertion and electrical stimulation. The top three most frequently used types were sham EA type A, type L and type O ordinally. Only 24 out of the 94 trials reported credibility tests in six types of sham EA methods and the results were mainly as follows: sham EA type A (10/24), type B (5/24) and type Q (5/24). Compared with sham EA controls, EA therapy in 56.2% trials reported the specific effects, of which the highest positive rate was observed in type N (3/4), type F (5/7), type D (4/6) and type M (2/3). In conclusion, several sham EA types were identified as a promising candidate for further application in RCTs. Nonetheless, more evidence for inert and concealable sham EA control methods is needed. PMID:28106094

  15. Individual Differences Methods for Randomized Experiments

    ERIC Educational Resources Information Center

    Tucker-Drob, Elliot M.

    2011-01-01

    Experiments allow researchers to randomly vary the key manipulation, the instruments of measurement, and the sequences of the measurements and manipulations across participants. To date, however, the advantages of randomized experiments to manipulate both the aspects of interest and the aspects that threaten internal validity have been primarily…

  16. Two random repeat recall methods to assess alcohol use.

    PubMed Central

    Midanik, L T

    1993-01-01

    Two random repeat recall methods were compared with a summary measure to assess alcohol use. Subjects (n = 142) were randomly assigned to one of two groups; they were called either on 14 random days during three 30-day waves and asked about drinking yesterday, or on 2 random days during each wave and asked about drinking in the past week. Follow-up telephone interviews obtained summary measures for each wave. Random repeat methods generally obtained higher estimates. However, the high dropout rate makes questionable the feasibility of using this approach with general population samples. PMID:8498631

  17. Replica methods for loopy sparse random graphs

    NASA Astrophysics Data System (ADS)

    Coolen, ACC

    2016-03-01

    I report on the development of a novel statistical mechanical formalism for the analysis of random graphs with many short loops, and processes on such graphs. The graphs are defined via maximum entropy ensembles, in which both the degrees (via hard constraints) and the adjacency matrix spectrum (via a soft constraint) are prescribed. The sum over graphs can be done analytically, using a replica formalism with complex replica dimensions. All known results for tree-like graphs are recovered in a suitable limit. For loopy graphs, the emerging theory has an appealing and intuitive structure, suggests how message passing algorithms should be adapted, and what is the structure of theories describing spin systems on loopy architectures. However, the formalism is still largely untested, and may require further adjustment and refinement. This paper is dedicated to the memory of our colleague and friend Jun-Ichi Inoue, with whom the author has had the great pleasure and privilege of collaborating.

  18. Novel Random Mutagenesis Method for Directed Evolution.

    PubMed

    Feng, Hong; Wang, Hai-Yan; Zhao, Hong-Yan

    2017-01-01

    Directed evolution is a powerful strategy for gene mutagenesis, and has been used for protein engineering both in scientific research and in the biotechnology industry. The routine method for directed evolution was developed by Stemmer in 1994 (Stemmer, Proc Natl Acad Sci USA 91, 10747-10751, 1994; Stemmer, Nature 370, 389-391, 1994). Since then, various methods have been introduced, each of which has advantages and limitations depending upon the targeted genes and procedure. In this chapter, a novel alternative directed evolution method which combines mutagenesis PCR with dITP and fragmentation by endonuclease V is described. The kanamycin resistance gene is used as a reporter gene to verify the novel method for directed evolution. This method for directed evolution has been demonstrated to be efficient, reproducible, and easy to manipulate in practice.

  19. A simplified method for random vibration analysis of structures with random parameters

    NASA Astrophysics Data System (ADS)

    Ghienne, Martin; Blanzé, Claude

    2016-09-01

    Piezoelectric patches with adapted electrical circuits or viscoelastic dissipative materials are two solutions particularly adapted to reduce vibration of light structures. To accurately design these solutions, it is necessary to describe precisely the dynamical behaviour of the structure. It may quickly become computationally intensive to describe robustly this behaviour for a structure with nonlinear phenomena, such as contact or friction for bolted structures, and uncertain variations of its parameters. The aim of this work is to propose a non-intrusive reduced stochastic method to characterize robustly the vibrational response of a structure with random parameters. Our goal is to characterize the eigenspace of linear systems with dynamic properties considered as random variables. This method is based on a separation of random aspects from deterministic aspects and allows us to estimate the first central moments of each random eigenfrequency with a single deterministic finite element computation. The method is applied to a frame with several Young's moduli modeled as random variables. This example could be expanded to a bolted structure including piezoelectric devices. The method needs to be enhanced when random eigenvalues are closely spaced. An indicator with no additional computational cost is proposed to characterize the "proximity" of two random eigenvalues.
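
    For context, the brute-force alternative the authors' single-computation method avoids is plain Monte Carlo: sample the random parameter, solve the deterministic eigenproblem each time, and average. The 2-DOF spring-mass chain and the stiffness distribution below are illustrative assumptions, not the paper's frame model.

```python
import math, random, statistics

def eigenfreqs_mc(samples=2000, seed=0):
    """Monte Carlo baseline (not the paper's single-computation method):
    sample a random stiffness, compute the two eigenfrequencies of a
    2-DOF spring-mass chain, and estimate their mean and standard deviation."""
    rng = random.Random(seed)
    m = 1.0
    f1, f2 = [], []
    for _ in range(samples):
        k = rng.gauss(100.0, 5.0)   # random stiffness (illustrative distribution)
        # Eigenvalues of the 2-DOF chain stiffness matrix [[2k,-k],[-k,k]]/m
        lam1 = (3 - math.sqrt(5)) / 2 * k / m
        lam2 = (3 + math.sqrt(5)) / 2 * k / m
        f1.append(math.sqrt(lam1))
        f2.append(math.sqrt(lam2))
    return (statistics.mean(f1), statistics.stdev(f1),
            statistics.mean(f2), statistics.stdev(f2))

m1, s1, m2, s2 = eigenfreqs_mc()
print(round(m1, 2), round(m2, 2))
```

    Each Monte Carlo draw costs a full eigensolve, which is exactly the expense the proposed separation of random and deterministic aspects removes.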

  20. Efficient Training Methods for Conditional Random Fields

    DTIC Science & Technology

    2008-02-01


  1. Genetic algorithms as global random search methods

    NASA Technical Reports Server (NTRS)

    Peck, Charles C.; Dhawan, Atam P.

    1995-01-01

    Genetic algorithm behavior is described in terms of the construction and evolution of the sampling distributions over the space of candidate solutions. This novel perspective is motivated by analysis indicating that the schema theory is inadequate for completely and properly explaining genetic algorithm behavior. Based on the proposed theory, it is argued that the similarities of candidate solutions should be exploited directly, rather than encoding candidate solutions and then exploiting their similarities. Proportional selection is characterized as a global search operator, and recombination is characterized as the search process that exploits similarities. Sequential algorithms and many deletion methods are also analyzed. It is shown that by properly constraining the search breadth of recombination operators, convergence of genetic algorithms to a global optimum can be ensured.
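
    A minimal real-coded genetic algorithm with proportional selection and a similarity-exploiting recombination (parent averaging) can be sketched as follows. The operators and parameters are illustrative choices, not the paper's formalism; the test function (the sphere) is a standard global-optimization benchmark.

```python
import random

def ga_minimize(f, dim=2, pop_size=40, gens=60, seed=0):
    """Toy real-coded GA: fitness-proportional selection, averaging
    crossover, Gaussian mutation. A sketch, not the paper's formalism."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        fits = [1.0 / (1.0 + f(x)) for x in pop]   # positive fitness for selection
        def select():
            return rng.choices(pop, weights=fits, k=1)[0]  # proportional selection
        new_pop = []
        for _ in range(pop_size):
            a, b = select(), select()
            # Recombination exploits parental similarity; mutation adds diversity
            child = [(ai + bi) / 2 + rng.gauss(0, 0.1) for ai, bi in zip(a, b)]
            new_pop.append(child)
        pop = new_pop
    return min(pop, key=f)

sphere = lambda x: sum(xi * xi for xi in x)
best = ga_minimize(sphere)
print(sphere(best))  # small residual near the global optimum at the origin
```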

  3. Tabu search method with random moves for globally optimal design

    NASA Astrophysics Data System (ADS)

    Hu, Nanfang

    1992-09-01

    Optimum engineering design problems are usually formulated as non-convex optimization problems of continuous variables. Because of the absence of convexity structure, they can have multiple minima, and global optimization becomes difficult. Traditional methods of optimization, such as penalty methods, can often be trapped at a local optimum. The tabu search method with random moves is introduced to solve these problems approximately. Its reliability and efficiency are examined with the help of standard test functions. Analysis of the implementations shows that this method is easy to use and that no derivative information is necessary. It outperforms the random search method and a composite genetic algorithm. In particular, it is applied to minimum weight design examples of a three-bar truss, coil springs, a Z-section and a channel section. For the channel section, the optimal design using the tabu search method with random moves saved 26.14 percent in weight over the SUMT method.
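
    A compact sketch of tabu search with random moves on a standard test function (Rosenbrock): random neighbors are proposed, the best non-tabu move is accepted even if it worsens the objective (which lets the search escape local minima), and a short-term memory forbids revisiting recent points. All parameters are illustrative, and no derivative information is used.

```python
import random

def tabu_search(f, x0, step=0.5, iters=300, tabu_len=15, seed=0):
    """Tabu search with random moves: propose random neighbors, accept the
    best non-tabu one (even if worse), keep short-term memory of visits."""
    rng = random.Random(seed)
    x, best = list(x0), list(x0)
    tabu = []
    for _ in range(iters):
        moves = [[xi + rng.uniform(-step, step) for xi in x] for _ in range(10)]
        moves = [m for m in moves if tuple(round(v, 1) for v in m) not in tabu]
        if not moves:
            continue
        x = min(moves, key=f)                    # best admissible random move
        tabu.append(tuple(round(v, 1) for v in x))
        if len(tabu) > tabu_len:
            tabu.pop(0)                          # expire the oldest tabu entry
        if f(x) < f(best):
            best = list(x)                       # track best-so-far
    return best

rosen = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
best = tabu_search(rosen, [-1.0, 1.0])
print(rosen(best))  # improved over the starting value of 4.0
```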

  4. Segmentation of stochastic images with a stochastic random walker method.

    PubMed

    Pätz, Torben; Preusser, Tobias

    2012-05-01

    We present an extension of the random walker segmentation to images with uncertain gray values. Such gray-value uncertainty may result from noise or other imaging artifacts or, more generally, from measurement errors in the image acquisition process. The purpose is to quantify the influence of the gray-value uncertainty on the result when using random walker segmentation. In random walker segmentation, a weighted graph is built from the image, where the edge weights depend on the image gradient between the pixels. For given seed regions, the probability is evaluated for a random walk on this graph starting at a pixel to end in one of the seed regions. Here, we extend this method to images with uncertain gray values. To this end, we consider the pixel values to be random variables (RVs), thus introducing the notion of stochastic images. We end up with stochastic weights for the graph in random walker segmentation and a stochastic partial differential equation (PDE) that has to be solved. We discretize the RVs and the stochastic PDE by the method of generalized polynomial chaos, combining the recent developments in numerical methods for the discretization of stochastic PDEs and an interactive segmentation algorithm. The resulting algorithm allows for the detection of regions where the segmentation result is highly influenced by the uncertain pixel values. Thus, it gives a reliability estimate for the resulting segmentation, and it furthermore allows us to determine the probability density function of the segmented object volume.

  5. A random spatial sampling method in a rural developing nation

    PubMed Central

    2014-01-01

    Background Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. Methods We describe a stratified random sampling method using geographical information system (GIS) software and global positioning system (GPS) technology for application in a health survey in a rural region of Guatemala, as well as a qualitative study of the enumeration process. Results This method offers an alternative sampling technique that could reduce opportunities for bias in household selection compared to cluster methods. However, its use is subject to issues surrounding survey preparation, technological limitations and in-the-field household selection. Application of this method in remote areas will raise challenges surrounding the boundary delineation process, use and translation of satellite imagery between GIS and GPS, and household selection at each survey point in varying field conditions. This method favors household selection in denser urban areas and in new residential developments. Conclusions Random spatial sampling methodology can be used to survey a random sample of population in a remote region of a developing nation. Although this method should be further validated and compared with more established methods to determine its utility in social survey applications, it shows promise for use in developing nations with resource-challenged environments where detailed geographic and human census data are less available. PMID:24716473
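
    The core of such a GIS/GPS sampling design can be sketched as stratified random point generation over a bounding box, with one stratum per grid cell so that no region is missed even without a household enumeration. The coordinates and grid dimensions below are hypothetical, not the study area's.

```python
import random

def stratified_spatial_sample(lat_min, lat_max, lon_min, lon_max,
                              rows=3, cols=3, per_cell=2, seed=0):
    """Draw per_cell uniform random coordinates from each cell of a
    rows x cols grid over the study-area bounding box (one stratum per cell)."""
    rng = random.Random(seed)
    dlat = (lat_max - lat_min) / rows
    dlon = (lon_max - lon_min) / cols
    points = []
    for r in range(rows):
        for c in range(cols):
            for _ in range(per_cell):
                points.append(
                    (rng.uniform(lat_min + r * dlat, lat_min + (r + 1) * dlat),
                     rng.uniform(lon_min + c * dlon, lon_min + (c + 1) * dlon)))
    return points

# Hypothetical bounding box; not the study's actual coordinates
pts = stratified_spatial_sample(15.0, 15.3, -91.5, -91.2)
print(len(pts))  # 18 points: 3 x 3 cells x 2 per cell
```

    In the field, surveyors would navigate to each generated point by GPS and select the nearest household, which is where the boundary-delineation and household-selection issues discussed above arise.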

  6. Multi-Agent Methods for the Configuration of Random Nanocomputers

    NASA Technical Reports Server (NTRS)

    Lawson, John W.

    2004-01-01

    As computational devices continue to shrink, the cost of manufacturing such devices is expected to grow exponentially. One alternative to the costly, detailed design and assembly of conventional computers is to place the nano-electronic components randomly on a chip. The price for such a trivial assembly process is that the resulting chip would not be programmable by conventional means. In this work, we show that such random nanocomputers can be adaptively programmed using multi-agent methods. This is accomplished through the optimization of an associated high dimensional error function. By representing each of the independent variables as a reinforcement learning agent, we are able to achieve convergence much faster than with other methods, including simulated annealing. Standard combinational logic circuits such as adders and multipliers are implemented in a straightforward manner. In addition, we show that the intrinsic flexibility of these adaptive methods allows the random computers to be reconfigured easily, making them reusable. Recovery from faults is also demonstrated.

  7. Multi-Agent Methods for the Configuration of Random Nanocomputers

    NASA Astrophysics Data System (ADS)

    Lawson, John

    2004-03-01

    As computational devices continue to shrink, the cost of manufacturing such devices is expected to grow exponentially. One alternative to the costly, detailed design and assembly of conventional computers is to place the nano-electronic components randomly on a chip. The price for such a trivial assembly process is that the resulting chip would not be programmable by conventional means. In this work, we show that such random nanocomputers can be adaptively programmed using multi-agent methods. This is accomplished through the optimization of an associated high dimensional error function. By representing each of the independent variables as a reinforcement learning agent, we are able to achieve convergence much faster than with other methods, including simulated annealing. Standard combinational logic circuits such as adders and multipliers are implemented in a straightforward manner. In addition, we show that the intrinsic flexibility of these adaptive methods allows the random computers to be reconfigured easily, making them reusable. Recovery from faults is also demonstrated.

  8. Randomization Methods in Emergency Setting Trials: A Descriptive Review

    ERIC Educational Resources Information Center

    Corbett, Mark Stephen; Moe-Byrne, Thirimon; Oddie, Sam; McGuire, William

    2016-01-01

    Background: Quasi-randomization might expedite recruitment into trials in emergency care settings but may also introduce selection bias. Methods: We searched the Cochrane Library and other databases for systematic reviews of interventions in emergency medicine or urgent care settings. We assessed selection bias (baseline imbalances) in prognostic…

  9. Chemical Poker: An Innovative Testing Method Using Elements of Randomness.

    ERIC Educational Resources Information Center

    Benvenuto, Mark A.; Ferruzzi, Arthur

    2002-01-01

    Introduces a testing method for general and organic chemistry in which students receive prepared randomized cards with a portion of a question on them to complete and then use them to answer specific test questions. Discusses the effects of this technique on each individual student's learning process. (KHR)

  10. An Evaluation of the Effectiveness of Recruitment Methods: The Staying Well after Depression Randomized Controlled Trial

    PubMed Central

    Krusche, Adele; Rudolf von Rohr, Isabelle; Muse, Kate; Duggan, Danielle; Crane, Catherine; Williams, J. Mark G.

    2014-01-01

    Background Randomized controlled trials (RCTs) are widely accepted as being the most efficient way of investigating the efficacy of psychological therapies. However, researchers conducting RCTs commonly report difficulties recruiting an adequate sample within planned timescales. In an effort to overcome recruitment difficulties, researchers often are forced to expand their recruitment criteria or extend the recruitment phase, thus increasing costs and delaying publication of results. Research investigating the effectiveness of recruitment strategies is limited and trials often fail to report sufficient details about the recruitment sources and resources utilised. Purpose We examined the efficacy of strategies implemented during the Staying Well after Depression RCT in Oxford to recruit participants with a history of recurrent depression. Methods We describe eight recruitment methods utilised and two further sources not initiated by the research team and examine their efficacy in terms of (i) the return, including the number of potential participants who contacted the trial and the number who were randomized into the trial, (ii) cost-effectiveness, comprising direct financial cost and manpower for initial contacts and randomized participants, and (iii) comparison of sociodemographic characteristics of individuals recruited from different sources. Results Poster advertising, web-based advertising and mental health worker referrals were the cheapest methods per randomized participant; however, the ratio of randomized participants to initial contacts differed markedly per source. Advertising online, via posters and on a local radio station were the most cost-effective recruitment methods for soliciting participants who subsequently were randomized into the trial. Advertising across many sources (saturation) was found to be important. Limitations It may not be feasible to employ all the recruitment methods used in this trial to obtain participation from other

  11. Extremely Randomized Machine Learning Methods for Compound Activity Prediction.

    PubMed

    Czarnecki, Wojciech M; Podlewska, Sabina; Bojarski, Andrzej J

    2015-11-09

    Speed, relatively low computational-resource requirements, and high effectiveness in evaluating the bioactivity of compounds have caused a rapid growth of interest in the application of machine learning methods to virtual screening tasks. However, due to the growth of the amount of data also in cheminformatics and related fields, the aim of research has shifted not only towards the development of algorithms of high predictive power but also towards the simplification of previously existing methods to obtain results more quickly. In the study, we tested two approaches belonging to the group of so-called 'extremely randomized methods'-Extreme Entropy Machine and Extremely Randomized Trees-for their ability to properly identify compounds that have activity towards particular protein targets. These methods were compared with their 'non-extreme' competitors, i.e., Support Vector Machine and Random Forest. The extreme approaches not only improved the efficiency of the classification of bioactive compounds but also proved to be less computationally complex, requiring fewer steps to perform an optimization procedure.

  12. Missing data methods in Mendelian randomization studies with multiple instruments.

    PubMed

    Burgess, Stephen; Seaman, Shaun; Lawlor, Debbie A; Casas, Juan P; Thompson, Simon G

    2011-11-01

    Mendelian randomization studies typically have low power. Where there are several valid candidate genetic instruments, precision can be gained by using all the instruments available. However, sporadically missing genetic data can offset this gain. The authors describe 4 Bayesian methods for imputing the missing data based on a missing-at-random assumption: multiple imputations, single nucleotide polymorphism (SNP) imputation, latent variables, and haplotype imputation. These methods are demonstrated in a simulation study and then applied to estimate the causal relation between C-reactive protein and each of fibrinogen and coronary heart disease, based on 3 SNPs in British Women's Heart and Health Study participants assessed at baseline between May 1999 and June 2000. A complete-case analysis based on all 3 SNPs was found to be more precise than analyses using any 1 SNP alone. Precision is further improved by using any of the 4 proposed missing data methods; the improvement is equivalent to about a 25% increase in sample size. All methods gave similar results, which were apparently not overly sensitive to violation of the missing-at-random assumption. Programming code for the analyses presented is available online.
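As a simple frequentist sketch of the missing-data idea (deliberately simpler than the four Bayesian methods the authors propose), sporadically missing genotypes can be imputed by drawing from the observed genotype frequencies and the per-imputation estimates pooled with Rubin's rules. The genotype data below are hypothetical.

```python
import random
import statistics

def rubin_pool(estimates, variances):
    """Pool M imputed-data estimates with Rubin's rules: total variance is
    within-imputation variance plus (1 + 1/M) times between-imputation
    variance."""
    m = len(estimates)
    qbar = statistics.fmean(estimates)
    within = statistics.fmean(variances)
    between = statistics.variance(estimates) if m > 1 else 0.0
    return qbar, within + (1 + 1 / m) * between

def multiple_imputation(genotypes, m=20, seed=1):
    """genotypes: list of SNP values coded 0/1/2, with None for sporadically
    missing entries.  Each imputation fills the gaps by drawing from the
    observed values; the mean genotype is then pooled across imputations."""
    rng = random.Random(seed)
    observed = [g for g in genotypes if g is not None]
    ests, vars_ = [], []
    for _ in range(m):
        filled = [g if g is not None else rng.choice(observed) for g in genotypes]
        n = len(filled)
        ests.append(statistics.fmean(filled))
        vars_.append(statistics.variance(filled) / n)
    return rubin_pool(ests, vars_)
```

The pooled variance is what reflects the roughly 25% effective-sample-size gain the abstract reports: it shrinks relative to a complete-case analysis because partially observed participants contribute information.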

  13. Randomization in clinical trials in orthodontics: its significance in research design and methods to achieve it.

    PubMed

    Pandis, Nikolaos; Polychronopoulou, Argy; Eliades, Theodore

    2011-12-01

    Randomization is a key step in reducing selection bias during the treatment allocation phase in randomized clinical trials. The process of randomization follows specific steps, which include generation of the randomization list, allocation concealment, and implementation of randomization. The phenomenon in the dental and orthodontic literature of characterizing treatment allocation as random is frequent; however, often the randomization procedures followed are not appropriate. Randomization methods assign, at random, treatment to the trial arms without foreknowledge of allocation by either the participants or the investigators thus reducing selection bias. Randomization entails generation of random allocation, allocation concealment, and the actual methodology of implementing treatment allocation randomly and unpredictably. Most popular randomization methods include some form of restricted and/or stratified randomization. This article introduces the reasons, which make randomization an integral part of solid clinical trial methodology, and presents the main randomization schemes applicable to clinical trials in orthodontics.
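One of the restricted schemes mentioned above, permuted-block randomization, can be sketched as follows: the allocation list is built block by block, and within every block each trial arm appears equally often, keeping group sizes balanced throughout recruitment.

```python
import random

def permuted_block_list(n, block_size=4, arms=("A", "B"), seed=2024):
    """Generate a restricted randomization list.  Each block is a shuffled
    multiset containing every arm equally often, so group sizes can never
    drift apart by more than half a block."""
    assert block_size % len(arms) == 0
    rng = random.Random(seed)
    allocations = []
    while len(allocations) < n:
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)
        allocations.extend(block)
    return allocations[:n]

schedule = permuted_block_list(20)
print(schedule)
```

In practice the seed and block size are held by an independent party to preserve allocation concealment; if investigators know the block size, the final allocation in each block becomes predictable, which is why varying block sizes are often used.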

  14. System and Method for Tracking Vehicles Using Random Search Algorithms.

    DTIC Science & Technology

    1997-01-31

    A patent application is available for licensing. Requests for information should be addressed to: OFFICE OF NAVAL RESEARCH, DEPARTMENT OF THE NAVY. ... relates to a system and a method for tracking vehicles using random search algorithm methodologies. (2) Description of the Prior Art: Contact ... algorithm methodologies for finding peaks in non-linear functions. U.S. Patent No. 5,148,513 to Koza et al., for example, relates to a non-linear

  15. Random Sampling of Quantum States: a Survey of Methods. And Some Issues Regarding the Overparametrized Method

    NASA Astrophysics Data System (ADS)

    Maziero, Jonas

    2015-12-01

    The numerical generation of random quantum states (RQS) is an important procedure for investigations in quantum information science. Here, we review some methods that may be used for performing that task. We start by presenting a simple procedure for generating random state vectors, for which the main tool is the random sampling of unbiased discrete probability distributions (DPD). Afterwards, the creation of random density matrices is addressed. In this context, we first present the standard method, which consists in using the spectral decomposition of a quantum state for getting RQS from random DPDs and random unitary matrices. Next, the Bloch vector parametrization method is described. This approach, despite being useful in several instances, is not in general convenient for RQS generation. In the last part of the article, we address the overparametrized method (OPM) and the related Ginibre and Bures techniques. The OPM can be used to create random positive semidefinite matrices with unit trace from randomly produced general complex matrices in a simple way that is friendly for numerical implementations. We consider a physically relevant issue related to the possible domains that may be used for the real and imaginary parts of the elements of such general complex matrices. Finally, we note an excessively fast concentration of measure in the quantum state space that arises in this parametrization.
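The OPM/Ginibre construction the survey describes reduces, in code, to normalizing G G† for a randomly drawn complex matrix G: the product is automatically Hermitian and positive semidefinite, and dividing by the trace gives a valid density matrix. A minimal numpy sketch:

```python
import numpy as np

def random_density_matrix(d, rng=None):
    """Overparametrized/Ginibre construction: for any complex matrix G,
    rho = G G† / tr(G G†) is Hermitian, positive semidefinite, and has
    unit trace, hence is a valid quantum state."""
    rng = np.random.default_rng(rng)
    g = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
    rho = g @ g.conj().T
    return rho / np.trace(rho).real

rho = random_density_matrix(4, rng=7)
print(np.trace(rho).real)   # ≈ 1.0
```

With independent standard-normal real and imaginary parts this samples the Hilbert-Schmidt (Ginibre-induced) measure; the choice of domain for the entries of G is exactly the issue the article examines.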

  16. Random-breakage mapping method applied to human DNA sequences

    NASA Technical Reports Server (NTRS)

    Lobrich, M.; Rydberg, B.; Cooper, P. K.; Chatterjee, A. (Principal Investigator)

    1996-01-01

    The random-breakage mapping method [Game et al. (1990) Nucleic Acids Res., 18, 4453-4461] was applied to DNA sequences in human fibroblasts. The methodology involves NotI restriction endonuclease digestion of DNA from irradiated cells, followed by pulsed-field gel electrophoresis, Southern blotting and hybridization with DNA probes recognizing the single copy sequences of interest. The Southern blots show a band for the unbroken restriction fragments and a smear below this band due to radiation-induced random breaks. This smear pattern contains two discontinuities in intensity at positions that correspond to the distance of the hybridization site to each end of the restriction fragment. By analyzing the positions of those discontinuities we confirmed the previously mapped position of the probe DXS1327 within a NotI fragment on the X chromosome, thus demonstrating the validity of the technique. We were also able to position the probes D21S1 and D21S15 with respect to the ends of their corresponding NotI fragments on chromosome 21. A third chromosome 21 probe, D21S11, has previously been reported to be close to D21S1, although an uncertainty about a second possible location existed. Since both probes D21S1 and D21S11 hybridized to a single NotI fragment and yielded a similar smear pattern, this uncertainty is removed by the random-breakage mapping method.

  17. Finite amplitude method for the quasiparticle random-phase approximation

    SciTech Connect

    Avogadro, Paolo; Nakatsukasa, Takashi

    2011-07-15

    We present the finite amplitude method (FAM), originally proposed in Ref. [17], for superfluid systems. A Hartree-Fock-Bogoliubov code may be transformed into a code of the quasiparticle-random-phase approximation (QRPA) with simple modifications. This technique has advantages over the conventional QRPA calculations, such as coding feasibility and computational cost. We perform the fully self-consistent linear-response calculation for the spherical neutron-rich nucleus {sup 174}Sn, modifying the hfbrad code, to demonstrate the accuracy, feasibility, and usefulness of the FAM.

  18. Theory of optimum radio reception methods in random noise

    NASA Astrophysics Data System (ADS)

    Gutkin, L. S.

    1982-09-01

    The theory of optimum methods for receiving signals against a background of random noise is presented. This theory is widely used in the development of radioelectronic systems and devices based on the reception and transmission of information (radar and radio-controlled systems, radio communications, radio telemetry, radio astronomy, television, and other systems), as well as electroacoustic and wire communications systems. Optimum linear and nonlinear filtration, binary and complex signal detection and discrimination, estimation of signal parameters, receiver synthesis for incomplete a priori data, special features of synthesis with respect to certain quality indicators, and other problems are examined.

  19. PROSPECTIVE RANDOMIZED STUDY COMPARING TWO ANESTHETIC METHODS FOR SHOULDER SURGERY

    PubMed Central

    Ikemoto, Roberto Yukio; Murachovsky, Joel; Prata Nascimento, Luis Gustavo; Bueno, Rogerio Serpone; Oliveira Almeida, Luiz Henrique; Strose, Eric; de Mello, Sérgio Cabral; Saletti, Deise

    2015-01-01

    Objective: To evaluate the efficacy of suprascapular nerve block in combination with infusion of anesthetic into the subacromial space, compared with interscalene block. Methods: Forty-five patients with small or medium-sized isolated supraspinatus tendon lesions who underwent arthroscopic repair were prospectively and comparatively evaluated through random assignation to three groups of 15, each with a different combination of anesthetic methods. The efficacy of postoperative analgesia was measured using the visual analogue scale for pain and the analgesic, anti-inflammatory and opioid drug consumption. Inhalation anesthetic consumption during surgery was also compared between the groups. Results: The statistical analysis did not find any statistically significant differences among the groups regarding anesthetic consumption during surgery or postoperative analgesic efficacy during the first 48 hours. Conclusion: Suprascapular nerve block with infusion of anesthetic into the subacromial space is an excellent alternative to interscalene block, particularly in hospitals in which an electrical nerve stimulating device is unavailable. PMID:27022569

  20. Asbestos/NESHAP adequately wet guidance

    SciTech Connect

    Shafer, R.; Throwe, S.; Salgado, O.; Garlow, C.; Hoerath, E.

    1990-12-01

    The Asbestos NESHAP requires facility owners and/or operators involved in demolition and renovation activities to control emissions of particulate asbestos to the outside air because no safe concentration of airborne asbestos has ever been established. The primary method used to control asbestos emissions is to adequately wet the Asbestos Containing Material (ACM) with a wetting agent prior to, during and after demolition/renovation activities. The purpose of the document is to provide guidance to asbestos inspectors and the regulated community on how to determine if friable ACM is adequately wet as required by the Asbestos NESHAP.

  1. Sequential methods for random-effects meta-analysis

    PubMed Central

    Higgins, Julian P T; Whitehead, Anne; Simmonds, Mark

    2011-01-01

    Although meta-analyses are typically viewed as retrospective activities, they are increasingly being applied prospectively to provide up-to-date evidence on specific research questions. When meta-analyses are updated, account should be taken of the possibility of false-positive findings due to repeated significance tests. We discuss the use of sequential methods for meta-analyses that incorporate random effects to allow for heterogeneity across studies. We propose a method that uses an approximate semi-Bayes procedure to update evidence on the among-study variance, starting with an informative prior distribution that might be based on findings from previous meta-analyses. We compare our methods with other approaches, including the traditional method of cumulative meta-analysis, in a simulation study and observe that it has Type I and Type II error rates close to the nominal level. We illustrate the method using an example in the treatment of bleeding peptic ulcers. Copyright © 2010 John Wiley & Sons, Ltd. PMID:21472757
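For reference, the classical (non-sequential) random-effects pooling step that such sequential procedures build on is the DerSimonian-Laird estimator: a method-of-moments estimate of the among-study variance tau², followed by inverse-variance pooling. A minimal sketch with hypothetical study data:

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects meta-analysis: estimate the among-study variance
    tau^2 by the method of moments from Cochran's Q, then pool the study
    effects with weights 1 / (v_i + tau^2)."""
    k = len(effects)
    w = [1.0 / v for v in variances]
    ybar = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q_stat = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q_stat - (k - 1)) / c)
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2

# Hypothetical log-odds-ratio effects and within-study variances
pooled, se, tau2 = dersimonian_laird([-0.4, -0.2, -0.6, -0.1],
                                     [0.04, 0.09, 0.05, 0.16])
```

The semi-Bayes procedure in the article replaces the moment estimate of tau² with one updated from an informative prior as studies accrue, which stabilizes the early looks of a sequential analysis.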

  2. A new method for direction finding based on Markov random field model

    NASA Astrophysics Data System (ADS)

    Ota, Mamoru; Kasahara, Yoshiya; Goto, Yoshitaka

    2015-07-01

    Investigating the characteristics of plasma waves observed by scientific satellites in the Earth's plasmasphere/magnetosphere is effective for understanding the mechanisms for generating waves and the plasma environment that influences wave generation and propagation. In particular, finding the propagation directions of waves is important for understanding mechanisms of VLF/ELF waves. To find these directions, the wave distribution function (WDF) method has been proposed. This method is based on the idea that observed signals consist of a number of elementary plane waves that define wave energy density distribution. However, the resulting equations constitute an ill-posed problem in which a solution is not determined uniquely; hence, an adequate model must be assumed for a solution. Although many models have been proposed, we have to select the most appropriate model for the given situation because each model has its own advantages and disadvantages. In the present study, we propose a new method for direction finding of the plasma waves measured by plasma wave receivers. Our method is based on the assumption that the WDF can be represented by a Markov random field model with inference of model parameters performed using a variational Bayesian learning algorithm. Using computer-generated spectral matrices, we evaluated the performance of the model and compared the results with those obtained from two conventional methods.

  3. Reanalysis of morphine consumption from two randomized controlled trials of gabapentin using longitudinal statistical methods

    PubMed Central

    Zhang, Shiyuan; Paul, James; Nantha-Aree, Manyat; Buckley, Norman; Shahzad, Uswa; Cheng, Ji; DeBeer, Justin; Winemaker, Mitchell; Wismer, David; Punthakee, Dinshaw; Avram, Victoria; Thabane, Lehana

    2015-01-01

    Background Postoperative pain management in total joint replacement surgery remains ineffective in up to 50% of patients and has an overwhelming impact in terms of patient well-being and health care burden. We present here an empirical analysis of two randomized controlled trials assessing whether addition of gabapentin to a multimodal perioperative analgesia regimen can reduce morphine consumption or improve analgesia for patients following total joint arthroplasty (the MOBILE trials). Methods Morphine consumption, measured for four time periods in patients undergoing total hip or total knee arthroplasty, was analyzed using a linear mixed-effects model to provide a longitudinal estimate of the treatment effect. Repeated-measures analysis of variance and generalized estimating equations were used in a sensitivity analysis to compare the robustness of the methods. Results There was no statistically significant difference in morphine consumption between the treatment group and a control group (mean effect size estimate 1.0, 95% confidence interval −4.7, 6.7, P=0.73). The results remained robust across different longitudinal methods. Conclusion The results of the current reanalysis of morphine consumption align with those of the MOBILE trials. Gabapentin did not significantly reduce morphine consumption in patients undergoing major replacement surgeries. The results remain consistent across longitudinal methods. More work in the area of postoperative pain is required to provide adequate management for this patient population. PMID:25709496

  4. Yoga for veterans with chronic low back pain: Design and methods of a randomized clinical trial.

    PubMed

    Groessl, Erik J; Schmalzl, Laura; Maiya, Meghan; Liu, Lin; Goodman, Debora; Chang, Douglas G; Wetherell, Julie L; Bormann, Jill E; Atkinson, J Hamp; Baxi, Sunita

    2016-05-01

    Chronic low back pain (CLBP) afflicts millions of people worldwide, with particularly high prevalence in military veterans. Many treatment options exist for CLBP, but most have limited effectiveness and some have significant side effects. In general populations with CLBP, yoga has been shown to improve health outcomes with few side effects. However, yoga has not been adequately studied in military veteran populations. In the current paper we will describe the design and methods of a randomized clinical trial aimed at examining whether yoga can effectively reduce disability and pain in US military veterans with CLBP. A total of 144 US military veterans with CLBP will be randomized to either yoga or a delayed treatment comparison group. The yoga intervention will consist of twice-weekly yoga classes for 12 weeks, complemented by regular home practice guided by a manual. The delayed treatment group will receive the same intervention after six months. The primary outcome is the change in back pain-related disability measured with the Roland-Morris Disability Questionnaire at baseline and 12 weeks. Secondary outcomes include pain intensity, pain interference, depression, anxiety, fatigue/energy, quality of life, self-efficacy, sleep quality, and medication usage. Additional process and/or mediational factors will be measured to examine dose response and effect mechanisms. Assessments will be conducted at baseline, 6 weeks, 12 weeks, and 6 months. All randomized participants will be included in intention-to-treat analyses. Study results will provide much needed evidence on the feasibility and effectiveness of yoga as a therapeutic modality for the treatment of CLBP in US military veterans.

  5. Randomly and Non-Randomly Missing Renal Function Data in the Strong Heart Study: A Comparison of Imputation Methods.

    PubMed

    Shara, Nawar; Yassin, Sayf A; Valaitis, Eduardas; Wang, Hong; Howard, Barbara V; Wang, Wenyu; Lee, Elisa T; Umans, Jason G

    2015-01-01

    Kidney and cardiovascular disease are widespread among populations with high prevalence of diabetes, such as American Indians participating in the Strong Heart Study (SHS). Studying these conditions simultaneously in longitudinal studies is challenging, because the morbidity and mortality associated with these diseases result in missing data, and these data are likely not missing at random. When such data are merely excluded, study findings may be compromised. In this article, a subset of 2264 participants with complete renal function data from Strong Heart Exams 1 (1989-1991), 2 (1993-1995), and 3 (1998-1999) was used to examine the performance of five methods used to impute missing data: listwise deletion, mean of serial measures, adjacent value, multiple imputation, and pattern-mixture. Three missing at random models and one non-missing at random model were used to compare the performance of the imputation techniques on randomly and non-randomly missing data. The pattern-mixture method was found to perform best for imputing renal function data that were not missing at random. Determining whether data are missing at random or not can help in choosing the imputation method that will provide the most accurate results.
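Two of the simpler methods compared above, mean of serial measures and adjacent value, can be sketched for a single participant's exam series. The creatinine-style values below are hypothetical, and the real comparison also covers listwise deletion, multiple imputation, and pattern-mixture models.

```python
def mean_of_serial(series):
    """Fill missing exam values with the mean of that participant's
    observed measurements."""
    observed = [v for v in series if v is not None]
    m = sum(observed) / len(observed)
    return [v if v is not None else m for v in series]

def adjacent_value(series):
    """Carry the nearest earlier observation forward; if none exists,
    carry the nearest later observation backward."""
    filled = list(series)
    for i, v in enumerate(filled):
        if v is None:
            prev = next((filled[j] for j in range(i - 1, -1, -1)
                         if filled[j] is not None), None)
            nxt = next((filled[j] for j in range(i + 1, len(filled))
                        if filled[j] is not None), None)
            filled[i] = prev if prev is not None else nxt
    return filled

exams = [1.1, None, 1.5]         # hypothetical values at Exams 1-3
print(mean_of_serial(exams))     # [1.1, 1.3, 1.5]
print(adjacent_value(exams))     # [1.1, 1.1, 1.5]
```

Both methods implicitly assume the data are missing at random, which is exactly what fails when dropout is driven by disease progression; that is why the pattern-mixture approach, which models the missingness mechanism, performed best in the study.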

  6. Multilevel Analysis Methods for Partially Nested Cluster Randomized Trials

    ERIC Educational Resources Information Center

    Sanders, Elizabeth A.

    2011-01-01

    This paper explores multilevel modeling approaches for 2-group randomized experiments in which a treatment condition involving clusters of individuals is compared to a control condition involving only ungrouped individuals, otherwise known as partially nested cluster randomized designs (PNCRTs). Strategies for comparing groups from a PNCRT in the…

  7. A comparison of methods for representing sparsely sampled random quantities.

    SciTech Connect

    Romero, Vicente Jose; Swiler, Laura Painton; Urbina, Angel; Mullins, Joshua

    2013-09-01

    This report discusses the treatment of uncertainties stemming from relatively few samples of random quantities. The importance of this topic extends beyond experimental data uncertainty to situations involving uncertainty in model calibration, validation, and prediction. With very sparse data samples it is not practical to have a goal of accurately estimating the underlying probability density function (PDF). Rather, a pragmatic goal is that the uncertainty representation should be conservative so as to bound a specified percentile range of the actual PDF, say the range between the 0.025 and 0.975 percentiles, with reasonable reliability. A second, opposing objective is that the representation not be overly conservative; that it minimally over-estimate the desired percentile range of the actual PDF. The presence of the two opposing objectives makes the sparse-data uncertainty representation problem interesting and difficult. In this report, five uncertainty representation techniques are characterized for their performance on twenty-one test problems (over thousands of trials for each problem) according to these two opposing objectives and other performance measures. Two of the methods, statistical Tolerance Intervals and a kernel density approach specifically developed for handling sparse data, exhibit significantly better overall performance than the others.

  8. Efficient randomized methods for stability analysis of fluids systems

    NASA Astrophysics Data System (ADS)

    Dawson, Scott; Rowley, Clarence

    2016-11-01

    We show that probabilistic algorithms that have recently been developed for the approximation of large matrices can be utilized to numerically evaluate the properties of linear operators in fluids systems. In particular, we present an algorithm that is well suited for optimal transient growth (i.e., nonmodal stability) analysis. For non-normal systems, such analysis can be important for analyzing local regions of convective instability, and in identifying high-amplitude transients that can trigger nonlinear instabilities. Our proposed algorithms are easy to wrap around pre-existing timesteppers for linearized forward and adjoint equations, are highly parallelizable, and come with known error bounds. Furthermore, they allow for efficient computation of optimal growth modes for numerous time horizons simultaneously. We compare the proposed algorithm to both direct matrix-forming and Krylov subspace approaches on a number of test problems. We will additionally discuss the potential for randomized methods to assist more broadly in the speed-up of algorithms for analyzing both fluids data and operators. Supported by AFOSR Grant FA9550-14-1-0289.
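A sketch of the underlying randomized range-finder idea (in the style of Halko, Martinsson, and Tropp), wrapped around forward and adjoint operator applications as the abstract describes. The explicit low-rank test matrix here is a stand-in for a linearized timestepper, which would never be formed as a matrix in practice.

```python
import numpy as np

def randomized_svd(apply_A, apply_At, n, rank, oversample=10, rng=None):
    """Approximate the leading singular triplets of an n x n linear operator
    given only routines that apply A and its adjoint to blocks of vectors."""
    rng = np.random.default_rng(rng)
    omega = rng.standard_normal((n, rank + oversample))
    q, _ = np.linalg.qr(apply_A(omega))   # orthonormal basis for the range of A
    b = apply_At(q).T                     # B = Q^T A, computed via the adjoint
    u_small, s, vt = np.linalg.svd(b, full_matrices=False)
    return (q @ u_small)[:, :rank], s[:rank], vt[:rank]

# Stand-in operator: an explicit rank-5 matrix.  A real application would
# wrap linearized forward/adjoint timesteppers instead.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 200))
u, s, vt = randomized_svd(lambda x: A @ x, lambda x: A.T @ x, 200, rank=5)
```

For transient-growth analysis, apply_A would propagate perturbations over the time horizon of interest and apply_At would run the adjoint backwards; the leading singular value then gives the optimal gain, and the right singular vector the optimal initial condition.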

  9. Randomized BioBrick assembly: a novel DNA assembly method for randomizing and optimizing genetic circuits and metabolic pathways.

    PubMed

    Sleight, Sean C; Sauro, Herbert M

    2013-09-20

    The optimization of genetic circuits and metabolic pathways often involves constructing various iterations of the same construct or using directed evolution to achieve the desired function. Alternatively, a method that randomizes individual parts in the same assembly reaction could be used for optimization by allowing for the ability to screen large numbers of individual clones expressing randomized circuits or pathways for optimal function. Here we describe a new assembly method to randomize genetic circuits and metabolic pathways from modular DNA fragments derived from PCR-amplified BioBricks. As a proof-of-principle for this method, we successfully assembled CMY (Cyan-Magenta-Yellow) three-gene circuits using Gibson Assembly that express CFP, RFP, and YFP with independently randomized promoters, ribosome binding sites, transcriptional terminators, and all parts randomized simultaneously. Sequencing results from 24 CMY circuits with various parts randomized show that 20/24 circuits are distinct and expression varies over a 200-fold range above background levels. We then adapted this method to randomize the same parts with enzyme coding sequences from the lycopene biosynthesis pathway instead of fluorescent proteins, designed to independently express each enzyme in the pathway from a different promoter. Lycopene production is improved using this randomization method by about 30% relative to the highest polycistronic-expressing pathway. These results demonstrate the potential of generating nearly 20,000 unique circuit or pathway combinations when three parts are permutated at each position in a three-gene circuit or pathway, and the methodology can likely be adapted to other circuits and pathways to maximize products of interest.

  10. Pre-randomization and de-randomization in emergency medical research: new names and rigorous criteria for old methods.

    PubMed

    Hallstrom, Alfred P; Paradis, Norman A

    2005-04-01

    Clinical trials are performed to determine if a therapy is effective in the treatment of a disease. The methods of randomization and blinding are used to assure that the only planned difference between the two groups is the therapy itself, and differences in outcome cannot be attributed to bias. Emergency medical conditions, and in particular therapies that must be administered in an emergency, present challenges to inclusion, exclusion, randomization, and blinding that are at times insurmountable in the context of available resources. Pre-randomization (that is, assigning the therapy to be used before the event occurs) and de-randomization (that is, removing randomized cases that do not meet established inclusion criteria) may address some of the challenges resulting from emergency enrollment but have the potential to create bias. We describe these techniques, and provide criteria that should be employed if pre-randomization and/or de-randomization are being considered. It is possible to use these techniques to successfully complete clinical trials that would not have been possible using only standard methodology and still ensure that results are without bias.

  11. Investigation of stochastic radiation transport methods in random heterogeneous mixtures

    NASA Astrophysics Data System (ADS)

    Reinert, Dustin Ray

    Among the most formidable challenges facing our world is the need for safe, clean, affordable energy sources. Growing concerns over global warming induced climate change and the rising costs of fossil fuels threaten conventional means of electricity production and are driving the current nuclear renaissance. One concept at the forefront of international development efforts is the High Temperature Gas-Cooled Reactor (HTGR). With numerous passive safety features and a meltdown-proof design capable of attaining high thermodynamic efficiencies for electricity generation as well as high temperatures useful for the burgeoning hydrogen economy, the HTGR is an extremely promising technology. Unfortunately, the fundamental understanding of neutron behavior within HTGR fuels lags far behind that of more conventional water-cooled reactors. HTGRs utilize a unique heterogeneous fuel element design consisting of thousands of tiny fissile fuel kernels randomly mixed with a non-fissile graphite matrix. Monte Carlo neutron transport simulations of the HTGR fuel element geometry in its full complexity are infeasible and this has motivated the development of more approximate computational techniques. A series of MATLAB codes was written to perform Monte Carlo simulations within HTGR fuel pebbles to establish a comprehensive understanding of the parameters under which the accuracy of the approximate techniques diminishes. This research identified the accuracy of the chord length sampling method to be a function of the matrix scattering optical thickness, the kernel optical thickness, and the kernel packing density. Two new Monte Carlo methods designed to focus the computational effort upon the parameter conditions shown to contribute most strongly to the overall computational error were implemented and evaluated. An extended memory chord length sampling routine that recalls a neutron's prior material traversals was demonstrated to be effective in fixed source calculations containing

  12. Spline methods for approximating quantile functions and generating random samples

    NASA Technical Reports Server (NTRS)

    Schiess, J. R.; Matthews, C. G.

    1985-01-01

    Two cubic spline formulations are presented for representing the quantile function (inverse cumulative distribution function) of a random sample of data. Both B-spline and rational spline approximations are compared with analytic representations of the quantile function. It is also shown how these representations can be used to generate random samples for use in simulation studies. Comparisons are made on samples generated from known distributions and a sample of experimental data. The spline representations are more accurate for multimodal and skewed samples and require much less time to generate samples than the analytic representation.
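    The inverse-CDF idea behind this abstract is easy to sketch. The Python/NumPy code below builds an approximate quantile function from a data sample and draws new variates from it; linear interpolation between order statistics stands in for the paper's cubic B-spline and rational spline fits, and the plotting positions are a common convention, not the paper's.

```python
import numpy as np

def empirical_quantile(sample):
    """Build an approximate quantile function Q(u) from a data sample.

    The paper fits cubic splines; here linear interpolation between
    sorted order statistics stands in for the spline fit.
    """
    xs = np.sort(np.asarray(sample, dtype=float))
    n = len(xs)
    # Plotting positions: probability levels attached to order statistics.
    ps = (np.arange(1, n + 1) - 0.5) / n
    return lambda u: np.interp(u, ps, xs)

def sample_from_quantile(q, size, rng):
    """Generate random variates by the inverse-CDF method: X = Q(U)."""
    return q(rng.random(size))

rng = np.random.default_rng(0)
data = rng.normal(loc=5.0, scale=2.0, size=5000)   # stand-in "experimental" sample
Q = empirical_quantile(data)
new = sample_from_quantile(Q, 20000, rng)
print(round(new.mean(), 1), round(new.std(), 1))   # close to 5.0 and 2.0
```

    Because the quantile function is tabulated once and then only evaluated, generating further samples costs a single interpolation per variate, which is the speed advantage the abstract reports.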

  13. Flush or blow lines adequately

    SciTech Connect

    Junique, J.C.

    1988-07-01

    During the commissioning of new plants, before initial startup, an important step is to clean debris from pipes and equipment. This is usually done by flushing with water or blowing with steam or air. It is not the intention of this article to give recommendations about how to proceed, but rather to give a general method to estimate the effectiveness of this operation. The method is based on the general theory of particle dynamics and the concept of drag force - the force needed to displace particles and move them along through the system. We want to make sure the degree of cleanliness obtained at the end of flushing or blowing is such that, later, in the most critical case during operation or operational upset, the particles which are left in the pipework or equipment will not move further. Therefore, the notion of drag force is useful for making comparisons between normal operation and the cleaning operation. The concept can also be used to compare the efficiency of different cleaning media; for example, whether to use air blowing or water flushing.
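    A minimal sketch of the comparison the article describes, using the standard drag law F = Cd · ½ρv²A for a sphere; the fluid densities, velocities, particle size, and drag coefficient below are entirely hypothetical illustration values, not figures from the article.

```python
import math

def drag_force(rho, v, d, cd=0.44):
    """Drag on a spherical particle of diameter d [m] in a fluid of
    density rho [kg/m^3] flowing at velocity v [m/s]; cd is an assumed
    drag coefficient for the turbulent (Newton) regime."""
    area = math.pi * d**2 / 4.0
    return cd * 0.5 * rho * v**2 * area

d = 1e-3  # 1 mm particle (hypothetical)
f_operation = drag_force(rho=800.0, v=2.0, d=d)    # process liquid, worst-case flow
f_flush     = drag_force(rho=1000.0, v=4.0, d=d)   # water flushing
f_blow      = drag_force(rho=1.2, v=60.0, d=d)     # air blowing (low density, high speed)

# Cleaning is judged adequate only if it exerts more drag on a lodged
# particle than the worst-case operating flow ever will.
print(f_flush > f_operation, f_blow > f_operation)
```

    Note the velocity gap: because air is so much less dense, it must be blown far faster than a liquid flows to produce comparable drag, which is exactly the kind of media comparison the drag-force criterion supports.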

  14. 21 CFR 1404.900 - Adequate evidence.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 9 2012-04-01 2012-04-01 false Adequate evidence. 1404.900 Section 1404.900 Food and Drugs OFFICE OF NATIONAL DRUG CONTROL POLICY GOVERNMENTWIDE DEBARMENT AND SUSPENSION (NONPROCUREMENT) Definitions § 1404.900 Adequate evidence. Adequate evidence means information sufficient...

  15. 21 CFR 1404.900 - Adequate evidence.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Adequate evidence. 1404.900 Section 1404.900 Food and Drugs OFFICE OF NATIONAL DRUG CONTROL POLICY GOVERNMENTWIDE DEBARMENT AND SUSPENSION (NONPROCUREMENT) Definitions § 1404.900 Adequate evidence. Adequate evidence means information sufficient...

  16. Combining randomized and non-randomized evidence in clinical research: a review of methods and applications.

    PubMed

    Verde, Pablo E; Ohmann, Christian

    2015-03-01

    Researchers may have multiple motivations for combining disparate pieces of evidence in a meta-analysis, such as generalizing experimental results or increasing the power to detect an effect that a single study is not able to detect. However, while the main question in a meta-analysis may be simple, the structure of evidence available to answer it may be complex. As a consequence, combining disparate pieces of evidence becomes a challenge. In this review, we cover statistical methods that have been used for the evidence-synthesis of different study types with the same outcome and similar interventions. For the methodological review, a literature retrieval in the area of generalized evidence-synthesis was performed, and publications were identified, assessed, grouped and classified. Furthermore, real applications of these methods in medicine were identified and described; 39 real clinical applications could be identified for these approaches. A new classification of methods is provided, which takes into account: the inferential approach, the bias modeling, the hierarchical structure, and the use of graphical modeling. We conclude with a discussion of pros and cons of our approach and give some practical advice.

  17. A new method that indicates the peak stress of random vibration response

    NASA Astrophysics Data System (ADS)

    Yan, Yong; Xie, Peng; Xu, Zhen; Jin, Guang

    2012-09-01

    Quantifying the peak stress of the random vibration response is an important assessment target in the mechanical design of space payloads. Based on the equivalence of the destructive effects of the random vibration peak response and a sine vibration response, this paper establishes the link between the two and derives the sine vibration input function whose destructive effect is equivalent to that of the random vibration peak response. Because stress under sine vibration can be studied quantitatively, the stress of the sine vibration was analyzed by the finite element method, giving indirect access to the random vibration response peak stress equivalent to the sine vibration destructive effect. The method worked well for indicating the peak stress of the random vibration response during ground random vibration tests, and it provides an effective means of prediction and validation for mechanical design and testing during ground random vibration test evaluation. Indicating the peak stress of the random vibration response during ground tests can significantly reduce engineering development costs and greatly shorten the development cycle. It also helps improve the safety and reliability of the space payload structure by avoiding failure or fatigue in ground random vibration tests.

  18. The Use of Mixed Methods in Randomized Control Trials

    ERIC Educational Resources Information Center

    White, Howard

    2013-01-01

    Evaluations should be issues driven, not methods driven. The starting point should be priority programs to be evaluated or policies to be tested. From this starting point, a list of evaluation questions is identified. For each evaluation question, the task is to identify the best available method for answering that question. Hence it is likely…

  19. A numerical study of rays in random media. [Monte Carlo method simulation

    NASA Technical Reports Server (NTRS)

    Youakim, M. Y.; Liu, C. H.; Yeh, K. C.

    1973-01-01

    Statistics of electromagnetic rays in a random medium are studied numerically by the Monte Carlo method. Two dimensional random surfaces with prescribed correlation functions are used to simulate the random media. Rays are then traced in these sample media. Statistics of the ray properties such as the ray positions and directions are computed. Histograms showing the distributions of the ray positions and directions at different points along the ray path as well as at given points in space are given. The numerical experiment is repeated for different cases corresponding to weakly and strongly random media with isotropic and anisotropic irregularities. Results are compared with those derived from theoretical investigations whenever possible.

  20. A New Method for Position Location in Random Media

    DTIC Science & Technology

    2000-09-29

    time-difference-of-arrival measures, angle-of-arrival measures and power-of-arrival measures. In this paper we propose a new method, which is based on...TELEC), Santiago de Cuba, Cuba, July 2000. [8] J. R. Reitz, F. J. Milford, R. W. Christy, Fundamentos de la Teoría Electromagnética. Mexico D.F., Mexico: Addison-Wesley Iberoamericana, 1986.

  1. 5 CFR 919.900 - Adequate evidence.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Adequate evidence. 919.900 Section 919.900 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS (CONTINUED) GOVERNMENTWIDE DEBARMENT AND SUSPENSION (NONPROCUREMENT) Definitions § 919.900 Adequate...

  2. A multiple step random walk Monte Carlo method for heat conduction involving distributed heat sources

    NASA Astrophysics Data System (ADS)

    Naraghi, M. H. N.; Chung, B. T. F.

    1982-06-01

    A multiple step fixed random walk Monte Carlo method for solving heat conduction in solids with distributed internal heat sources is developed. In this method, the probability that a walker reaches a point a few steps away is calculated analytically and is stored in the computer. Instead of moving to the immediate neighboring point the walker is allowed to jump several steps further. The present multiple step random walk technique can be applied to both conventional Monte Carlo and the Exodus methods. Numerical results indicate that the present method compares well with finite difference solutions while the computation speed is much faster than that of single step Exodus and conventional Monte Carlo methods.
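    A single-step version of the fixed random walk estimator is easy to sketch. The toy below solves a Laplace problem (no distributed heat sources) with an assumed linear boundary temperature so the answer is known exactly; the paper's contribution, a precomputed table of multi-step jump probabilities, accelerates this same estimator without changing it.

```python
import random

def walk_temperature(x, y, n, n_walkers, rng):
    """Estimate the steady-state temperature at node (x, y) of an
    n-by-n Laplace problem by single-step fixed random walks: each
    walker steps to a uniformly chosen neighbour until it reaches the
    boundary, then scores the boundary temperature.  (The multiple
    step method replaces these unit steps with precomputed multi-step
    jumps; the estimator itself is unchanged.)"""
    total = 0.0
    for _ in range(n_walkers):
        i, j = x, y
        while 0 < i < n and 0 < j < n:
            di, dj = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            i, j = i + di, j + dj
        total += i / n          # boundary condition T = x/n (harmonic)
    return total / n_walkers

rng = random.Random(42)
n = 10
t = walk_temperature(5, 5, n, 2000, rng)
print(round(t, 1))  # exact solution T = x/n gives 0.5 at the centre
```

    Most of the run time is spent on the many unit steps each walker takes, which is precisely the cost that jumping several steps at once is designed to remove.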

  3. A method for computing random chord length distributions in geometrical objects.

    PubMed

    Borak, T B

    1994-03-01

    A method is described that uses a Monte Carlo approach for computing the distribution of random chord lengths in objects traversed by rays originating uniformly in space (mu-randomness). The resulting distributions converge identically to the analytical solutions for a sphere and satisfy the Cauchy relationship for mean chord lengths in circular cylinders. The method can easily be applied to geometrical shapes that are not convex such as the region between nested cylinders to simulate the sensitive volume of a detector. Comparisons with other computational methods are presented.
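    For a sphere, mu-randomness reduces to sampling the ray's impact parameter b with density 2b/R², which makes the Cauchy mean-chord check (mean chord = 4V/S = 4R/3) easy to reproduce. The sketch below assumes that reduction rather than tracing rays in full 3D as a general implementation would.

```python
import numpy as np

def mu_random_chords_sphere(radius, n, rng):
    """Sample chord lengths of a sphere under mu-randomness (rays
    originating uniformly in space).  For a sphere this reduces to an
    impact parameter b with density 2b/R^2, i.e. b = R*sqrt(U), and a
    chord length of 2*sqrt(R^2 - b^2)."""
    b = radius * np.sqrt(rng.random(n))
    return 2.0 * np.sqrt(radius**2 - b**2)

rng = np.random.default_rng(1)
chords = mu_random_chords_sphere(1.0, 200_000, rng)

# Cauchy's relation: mean chord = 4V/S = 4R/3 for a sphere.
print(round(chords.mean(), 2))  # close to 4/3
```

    The same Monte Carlo scoring loop extends to non-convex shapes such as nested cylinders by summing the in-volume segments along each sampled ray, which is the detector-geometry case the abstract highlights.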

  4. Applying a weighted random forests method to extract karst sinkholes from LiDAR data

    NASA Astrophysics Data System (ADS)

    Zhu, Junfeng; Pierskalla, William P.

    2016-02-01

    Detailed mapping of sinkholes provides critical information for mitigating sinkhole hazards and understanding groundwater and surface water interactions in karst terrains. LiDAR (Light Detection and Ranging) measures the earth's surface at high resolution and high density and has shown great potential to drastically improve locating and delineating sinkholes. However, processing LiDAR data to extract sinkholes requires separating sinkholes from other depressions, which can be laborious because of the sheer number of depressions commonly generated from LiDAR data. In this study, we applied random forests, a machine learning method, to automatically separate sinkholes from other depressions in a karst region in central Kentucky. The sinkhole-extraction random forest was grown on a training dataset built from an area where LiDAR-derived depressions were manually classified through a visual inspection and field verification process. Based on the geometry of depressions, as well as natural and human factors related to sinkholes, 11 parameters were selected as predictive variables to form the dataset. Because the training dataset was imbalanced, with the majority of depressions being non-sinkholes, a weighted random forests method was used to improve the accuracy of predicting sinkholes. The weighted random forest achieved an average accuracy of 89.95% on the training dataset, demonstrating that the random forest can be an effective sinkhole classifier. Testing of the random forest in another area, however, resulted in moderate success, with an average accuracy rate of 73.96%. This study suggests that an automatic sinkhole extraction procedure like the random forest classifier can significantly reduce time and labor costs and make it more tractable to map sinkholes from LiDAR data for large areas. However, the random forests method cannot totally replace manual procedures, such as visual inspection and field verification.

  5. A New Method of Random Environmental Walking for Assessing Behavioral Preferences for Different Lighting Applications

    PubMed Central

    Patching, Geoffrey R.; Rahm, Johan; Jansson, Märit; Johansson, Maria

    2017-01-01

    Accurate assessment of people’s preferences for different outdoor lighting applications is increasingly considered important in the development of new urban environments. Here a new method of random environmental walking is proposed to complement current methods of assessing urban lighting applications, such as self-report questionnaires. The procedure involves participants repeatedly walking between different lighting applications by random selection of a lighting application and preferred choice or by random selection of a lighting application alone. In this manner, participants are exposed to all lighting applications of interest more than once and participants’ preferences for the different lighting applications are reflected in the number of times they walk to each lighting application. On the basis of an initial simulation study, to explore the feasibility of this approach, a comprehensive field test was undertaken. The field test included random environmental walking and collection of participants’ subjective ratings of perceived pleasantness (PP), perceived quality, perceived strength, and perceived flicker of four lighting applications. The results indicate that random environmental walking can reveal participants’ preferences for different lighting applications that, in the present study, conformed to participants’ ratings of PP and perceived quality of the lighting applications. As a complement to subjectively stated environmental preferences, random environmental walking has the potential to expose behavioral preferences for different lighting applications. PMID:28337163

  6. A New Method of Random Environmental Walking for Assessing Behavioral Preferences for Different Lighting Applications.

    PubMed

    Patching, Geoffrey R; Rahm, Johan; Jansson, Märit; Johansson, Maria

    2017-01-01

    Accurate assessment of people's preferences for different outdoor lighting applications is increasingly considered important in the development of new urban environments. Here a new method of random environmental walking is proposed to complement current methods of assessing urban lighting applications, such as self-report questionnaires. The procedure involves participants repeatedly walking between different lighting applications by random selection of a lighting application and preferred choice or by random selection of a lighting application alone. In this manner, participants are exposed to all lighting applications of interest more than once and participants' preferences for the different lighting applications are reflected in the number of times they walk to each lighting application. On the basis of an initial simulation study, to explore the feasibility of this approach, a comprehensive field test was undertaken. The field test included random environmental walking and collection of participants' subjective ratings of perceived pleasantness (PP), perceived quality, perceived strength, and perceived flicker of four lighting applications. The results indicate that random environmental walking can reveal participants' preferences for different lighting applications that, in the present study, conformed to participants' ratings of PP and perceived quality of the lighting applications. As a complement to subjectively stated environmental preferences, random environmental walking has the potential to expose behavioral preferences for different lighting applications.

  7. A novel model and estimation method for the individual random component of earthquake ground-motion relations

    NASA Astrophysics Data System (ADS)

    Raschke, Mathias

    2017-01-01

    In this paper, I introduce a novel approach to modelling the individual random component (also called the intra-event uncertainty) of a ground-motion relation (GMR), as well as a novel approach to estimating the corresponding parameters. In essence, I contend that the individual random component is reproduced adequately by a simple stochastic mechanism of random impulses acting in the horizontal plane, with random directions. The random number of impulses is Poisson distributed. The parameters of the model were estimated according to a proposal by Raschke (2013a, J Seismol 17(4):1157-1182), with the sample of random differences ξ = ln(Y1) - ln(Y2), in which Y1 and Y2 are the horizontal components of local ground-motion intensity. Every GMR element was eliminated by subtraction, except the individual random components. In the estimation procedure, the distribution of the difference ξ was approximated by combining a large Monte Carlo simulated sample and kernel smoothing. The estimated model satisfactorily fitted the differences ξ of the sample of peak ground accelerations, and the variance of the individual random components was considerably smaller than that of conventional GMRs. In addition, the dependence of the variance on the epicentre distance was considered; however, a dependence of the variance on the magnitude was not detected. Finally, the influence of the novel model and the corresponding approximations on PSHA was investigated. The applied approximations of the distribution of the individual random component were satisfactory for the example of PSHA considered.
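    A toy simulation of the impulse mechanism illustrates why ξ = ln(Y1) - ln(Y2) isolates the individual random component: the impulse directions are symmetric in the horizontal plane, so ξ is centred on zero. The unit amplitudes, the Poisson rate, the "at least one impulse" floor, and the guard against a zero resultant are all illustrative assumptions, not the paper's calibration.

```python
import numpy as np

def simulate_xi(lam, n, rng):
    """Simulate xi = ln(Y1) - ln(Y2) under a toy impulse model: a
    Poisson number of unit impulses with uniformly random directions
    in the plane; Y1 and Y2 are the absolute horizontal components of
    the resultant.  (The +1 floor and the 1e-12 guard are assumptions
    to keep the logarithms finite.)"""
    xi = np.empty(n)
    for k in range(n):
        m = rng.poisson(lam) + 1          # avoid the zero-impulse case
        theta = rng.uniform(0.0, 2.0 * np.pi, m)
        y1 = abs(np.cos(theta).sum())
        y2 = abs(np.sin(theta).sum())
        xi[k] = np.log(y1 + 1e-12) - np.log(y2 + 1e-12)
    return xi

rng = np.random.default_rng(7)
xi = simulate_xi(lam=5.0, n=20_000, rng=rng)
# By symmetry of the impulse directions, xi is distributed around zero.
print(abs(xi.mean()) < 0.05)
```

    In the paper the shape of this simulated ξ distribution, smoothed with a kernel, is what gets matched against the observed component differences; the toy only reproduces the symmetry argument.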

  8. Methods and optical fibers that decrease pulse degradation resulting from random chromatic dispersion

    DOEpatents

    Chertkov, Michael; Gabitov, Ildar

    2004-03-02

    The present invention provides methods and optical fibers for periodically pinning an actual (random) accumulated chromatic dispersion of an optical fiber to a predicted accumulated dispersion of the fiber through relatively simple modifications of fiber-optic manufacturing methods or retrofitting of existing fibers. If the pinning occurs with sufficient frequency (at a distance less than or equal to a correlation scale), pulse degradation resulting from random chromatic dispersion is minimized. Alternatively, pinning may occur quasi-periodically, i.e., the pinning distance is distributed between approximately zero and approximately two to three times the correlation scale.

  9. A Nonparametric Belief Propagation Method for Uncertainty Quantification with Applications to Flow in Random Porous Media

    DTIC Science & Technology

    2012-12-10

    veloped to address UQ problems. The most widely used method is the Monte Carlo (MC) method. MC’s wide acceptance is due to the fact that it can uncover...of the random space is computationally infeasible. Gaussian approximation is a widely used technique to build the continuous model, however, the...including linear and nonlinear dimension reduction algorithms. For linear dimension reduction, the most famous and the most widely used method is the

  10. Funding the Formula Adequately in Oklahoma

    ERIC Educational Resources Information Center

    Hancock, Kenneth

    2015-01-01

    This report is a longitudinal simulation study that looks at how the ratio of state support to local support affects the number of school districts that break the common schools' funding formula, which in turn affects the equity of distribution to the common schools. After nearly two decades of adequately supporting the funding formula, Oklahoma…

  11. Analysis of random structure-acoustic interaction problems using coupled boundary element and finite element methods

    NASA Technical Reports Server (NTRS)

    Mei, Chuh; Pates, Carl S., III

    1994-01-01

    A coupled boundary element (BEM)-finite element (FEM) approach is presented to accurately model structure-acoustic interaction systems. The boundary element method is first applied to interior, two and three-dimensional acoustic domains with complex geometry configurations. Boundary element results are very accurate when compared with limited exact solutions. Structure-acoustic interaction problems are then analyzed with the coupled FEM-BEM method, where the finite element method models the structure and the boundary element method models the interior acoustic domain. The coupled analysis is compared with exact and experimental results for a simple model. Composite panels are analyzed and compared with isotropic results. The coupled method is then extended for random excitation. Random excitation results are compared with uncoupled results for isotropic and composite panels.

  12. Characterization of a random anisotropic conductivity field with Karhunen-Loeve methods

    SciTech Connect

    Cherry, Matthew R.; Sabbagh, Harold S.; Pilchak, Adam L.; Knopp, Jeremy S.

    2014-02-18

    While parametric uncertainty quantification for NDE models has been addressed in recent years, the problem of stochastic field parameters such as spatially distributed electrical conductivity has only been investigated minimally in the last year. In that work, the authors treated the field as a one-dimensional random process, and Karhunen-Loeve methods were used to discretize this process to make it amenable to UQ methods such as ANOVA expansions. In the present work, we treat the field as a two-dimensional random process, and the eigenvalues and eigenfunctions of the integral operator are determined via Galerkin methods. The Karhunen-Loeve method is extended to two dimensions and implemented to represent this process. Several different choices for basis functions are discussed, as well as convergence criteria for each. The methods are applied to correlation functions collected over electron backscatter data from highly microtextured Ti-7Al.

  13. Search Control Algorithm Based on Random Step Size Hill-Climbing Method for Adaptive PMD Compensation

    NASA Astrophysics Data System (ADS)

    Tanizawa, Ken; Hirose, Akira

    Adaptive polarization mode dispersion (PMD) compensation is required for the speed-up and advancement of present optical communications. The combination of a tunable PMD compensator and its adaptive control method achieves adaptive PMD compensation. In this paper, we report an effective search control algorithm for the feedback control of the PMD compensator. The algorithm is based on the hill-climbing method. However, the step size changes randomly to prevent the convergence from being trapped at a local maximum or on a flat region, unlike in the conventional hill-climbing method. The randomness depends on Gaussian probability density functions. We conducted transmission simulations at 160Gb/s, and the results show that the proposed method provides better compensator control than the conventional hill-climbing method.
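    The random step size idea is independent of the PMD hardware and can be sketched on a toy one-dimensional objective; the objective function, step distribution, and iteration budget below are all illustrative assumptions standing in for the compensator's feedback signal.

```python
import random

def random_step_hill_climb(f, x0, sigma, iters, rng):
    """Hill climbing whose step size is redrawn from a Gaussian at
    every iteration; the varying step size is what keeps the search
    from stalling on flat regions or small local maxima."""
    x, best = x0, f(x0)
    for _ in range(iters):
        step = abs(rng.gauss(0.0, sigma))
        for cand in (x + step, x - step):
            if f(cand) > best:
                x, best = cand, f(cand)
    return x

# Toy objective: a broad peak at x = 3 with a shallow bump near x = 0.
f = lambda x: -(x - 3.0) ** 2 + 0.3 * max(0.0, 1.0 - abs(x))
rng = random.Random(3)
x = random_step_hill_climb(f, x0=0.0, sigma=0.5, iters=500, rng=rng)
print(abs(x - 3.0) < 0.2)
```

    A fixed-step hill climber with a step larger than the bump would also solve this toy; the Gaussian redraw matters when the landscape's feature sizes are unknown or drifting, as in adaptive feedback control.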

  14. Mendelian randomization analysis of a time-varying exposure for binary disease outcomes using functional data analysis methods.

    PubMed

    Cao, Ying; Rajan, Suja S; Wei, Peng

    2016-12-01

    A Mendelian randomization (MR) analysis is performed to analyze the causal effect of an exposure variable on a disease outcome in observational studies, by using genetic variants that affect the disease outcome only through the exposure variable. This method has recently gained popularity among epidemiologists given the success of genetic association studies. Many exposure variables of interest in epidemiological studies are time varying, for example, body mass index (BMI). Although longitudinal data have been collected in many cohort studies, current MR studies only use one measurement of a time-varying exposure variable, which cannot adequately capture the long-term time-varying information. We propose using the functional principal component analysis method to recover the underlying individual trajectory of the time-varying exposure from the sparsely and irregularly observed longitudinal data, and then conduct MR analysis using the recovered curves. We further propose two MR analysis methods. The first assumes a cumulative effect of the time-varying exposure variable on the disease risk, while the second assumes a time-varying genetic effect and employs functional regression models. We focus on statistical testing for a causal effect. Our simulation studies mimicking the real data show that the proposed functional data analysis based methods incorporating longitudinal data have substantial power gains compared to standard MR analysis using only one measurement. We used the Framingham Heart Study data to demonstrate the promising performance of the new methods as well as inconsistent results produced by the standard MR analysis that relies on a single measurement of the exposure at some arbitrary time point.

  15. Randomized Controlled Trial of Teaching Methods: Do Classroom Experiments Improve Economic Education in High Schools?

    ERIC Educational Resources Information Center

    Eisenkopf, Gerald; Sulser, Pascal A.

    2016-01-01

    The authors present results from a comprehensive field experiment at Swiss high schools in which they compare the effectiveness of teaching methods in economics. They randomly assigned classes into an experimental and a conventional teaching group, or a control group that received no specific instruction. Both teaching treatments improve economic…

  16. Random Qualitative Validation: A Mixed-Methods Approach to Survey Validation

    ERIC Educational Resources Information Center

    Van Duzer, Eric

    2012-01-01

    The purpose of this paper is to introduce the process and value of Random Qualitative Validation (RQV) in the development and interpretation of survey data. RQV is a method of gathering clarifying qualitative data that improves the validity of the quantitative analysis. This paper is concerned with validity in relation to the participants'…

  17. Is the Stock of VET Skills Adequate? Assessment Methodologies.

    ERIC Educational Resources Information Center

    Blandy, Richard; Freeland, Brett

    In Australia and elsewhere, four approaches have been used to determine whether stocks of vocational education and training (VET) skills are adequate to meet industry needs. The four methods are as follows: (1) the manpower requirements approach; (2) the international, national, and industry comparisons approach; (3) the labor market analysis…

  18. Auto Regressive Moving Average (ARMA) Modeling Method for Gyro Random Noise Using a Robust Kalman Filter.

    PubMed

    Huang, Lei

    2015-09-30

    To solve the problem in which conventional ARMA modeling methods for gyro random noise require a large number of samples and converge slowly, an ARMA modeling method using robust Kalman filtering is developed. The ARMA model parameters are employed as state arguments. Unknown time-varying estimators of the observation noise are used to obtain the estimated mean and variance of the observation noise. Using robust Kalman filtering, the ARMA model parameters are estimated accurately. The developed ARMA modeling method has the advantages of rapid convergence and high accuracy. Thus, the required sample size is reduced. It can be applied to modeling applications for gyro random noise in which a fast and accurate ARMA modeling method is required.
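    The parameters-as-state idea can be sketched for the simplest case: a single AR(1) coefficient estimated by a scalar Kalman (recursive least squares) filter. The full ARMA parameter vector and the paper's robust, time-varying noise estimators are replaced here by one coefficient and fixed noise variances, so this is a simplified stand-in, not the paper's method.

```python
import numpy as np

def ar1_kalman_estimate(y, q=0.0, r=1.0):
    """Estimate the AR(1) coefficient phi of y_t = phi*y_{t-1} + e_t by
    treating phi as the (static) state of a scalar Kalman filter whose
    observation is y_t and whose 'measurement matrix' is y_{t-1}."""
    phi, p = 0.0, 1.0                  # state estimate and its variance
    for t in range(1, len(y)):
        h = y[t - 1]                   # time-varying measurement matrix
        p += q                         # predict (random-walk drift q on phi)
        k = p * h / (h * p * h + r)    # Kalman gain
        phi += k * (y[t] - h * phi)    # update with the innovation
        p *= (1.0 - k * h)
    return phi

rng = np.random.default_rng(0)
true_phi, n = 0.8, 2000
y = np.zeros(n)
for t in range(1, n):
    y[t] = true_phi * y[t - 1] + rng.normal()
print(round(ar1_kalman_estimate(y), 1))  # close to 0.8
```

    Because the filter refines its estimate with every new observation, a usable coefficient emerges from far fewer samples than batch correlation-based ARMA fitting needs, which is the convergence advantage the abstract claims.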

  19. A shortcut through the Coulomb gas method for spectral linear statistics on random matrices

    NASA Astrophysics Data System (ADS)

    Deelan Cunden, Fabio; Facchi, Paolo; Vivo, Pierpaolo

    2016-04-01

    In the last decade, spectral linear statistics on large dimensional random matrices have attracted significant attention. Within the physics community, a privileged role has been played by invariant matrix ensembles for which a two-dimensional Coulomb gas analogy is available. We present a critical revision of the Coulomb gas method in random matrix theory (RMT) borrowing language and tools from large deviations theory. This allows us to formalize an equivalent, but more effective and quicker route toward RMT free energy calculations. Moreover, we argue that this more modern viewpoint is likely to shed further light on the interesting issues of weak phase transitions and evaporation phenomena recently observed in RMT.

  20. An efficient method for calculating RMS von Mises stress in a random vibration environment

    SciTech Connect

    Segalman, D.J.; Fulcher, C.W.G.; Reese, G.M.; Field, R.V. Jr.

    1998-02-01

    An efficient method is presented for calculation of RMS von Mises stresses from stress component transfer functions and the Fourier representation of random input forces. An efficient implementation of the method calculates the RMS stresses directly from the linear stress and displacement modes. The key relation presented is one suggested in past literature, but does not appear to have been previously exploited in this manner.
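    The von Mises combination itself is simple to state. The sketch below computes a brute-force RMS von Mises stress from sampled component histories, which is the slow sampling baseline that computing RMS directly from stress modes and transfer functions avoids; the Gaussian series and their standard deviations are placeholders for real transfer-function output, and the plane-stress formula is an assumption.

```python
import numpy as np

def von_mises_plane(sx, sy, txy):
    """Plane-stress von Mises stress from component stresses."""
    return np.sqrt(sx**2 - sx * sy + sy**2 + 3.0 * txy**2)

# Brute-force RMS von Mises over a sampled random response.
rng = np.random.default_rng(0)
n = 100_000
sx  = rng.normal(0.0, 10.0, n)   # sigma_x history, MPa (assumed)
sy  = rng.normal(0.0,  5.0, n)   # sigma_y history, MPa (assumed)
txy = rng.normal(0.0,  3.0, n)   # tau_xy history, MPa (assumed)
rms_vm = np.sqrt(np.mean(von_mises_plane(sx, sy, txy) ** 2))

# Sanity check: for independent zero-mean Gaussian components,
# E[vm^2] = var(sx) + var(sy) + 3*var(txy) = 100 + 25 + 27 = 152.
print(round(rms_vm, 1))  # close to sqrt(152)
```

    The sanity check works because the mean-square of the von Mises stress is a quadratic form in the components, which is also why it can be evaluated directly from component statistics without ever sampling time histories.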

  1. Methods for testing theory and evaluating impact in randomized field trials

    PubMed Central

    Brown, C. Hendricks; Wang, Wei; Kellam, Sheppard G.; Muthén, Bengt O.; Petras, Hanno; Toyinbo, Peter; Poduska, Jeanne; Ialongo, Nicholas; Wyman, Peter A.; Chamberlain, Patricia; Sloboda, Zili; MacKinnon, David P.; Windham, Amy

    2008-01-01

    Randomized field trials provide unique opportunities to examine the effectiveness of an intervention in real world settings and to test and extend both theory of etiology and theory of intervention. These trials are designed not only to test for overall intervention impact but also to examine how impact varies as a function of individual level characteristics, context, and across time. Examination of such variation in impact requires analytical methods that take into account the trial’s multiple nested structure and the evolving changes in outcomes over time. The models that we describe here merge multilevel modeling with growth modeling, allowing for variation in impact to be represented through discrete mixtures—growth mixture models—and nonparametric smooth functions—generalized additive mixed models. These methods are part of an emerging class of multilevel growth mixture models, and we illustrate these with models that examine overall impact and variation in impact. In this paper, we define intent-to-treat analyses in group-randomized multilevel field trials and discuss appropriate ways to identify, examine, and test for variation in impact without inflating the Type I error rate. We describe how to make causal inferences more robust to misspecification of covariates in such analyses and how to summarize and present these interactive intervention effects clearly. Practical strategies for reducing model complexity, checking model fit, and handling missing data are discussed using six randomized field trials to show how these methods may be used across trials randomized at different levels. PMID:18215473

  2. An efficient Monte Carlo interior penalty discontinuous Galerkin method for elastic wave scattering in random media

    NASA Astrophysics Data System (ADS)

    Feng, X.; Lorton, C.

    2017-03-01

    This paper develops and analyzes an efficient Monte Carlo interior penalty discontinuous Galerkin (MCIP-DG) method for elastic wave scattering in random media. The method is constructed based on a multi-modes expansion of the solution of the governing random partial differential equations. It is proved that the mode functions satisfy a three-term recurrence system of partial differential equations (PDEs) which are nearly deterministic in the sense that the randomness only appears in the right-hand side source terms, not in the coefficients of the PDEs. Moreover, the same differential operator applies to all mode functions. A proven unconditionally stable and optimally convergent IP-DG method is used to discretize the deterministic PDE operator, and an efficient numerical algorithm is proposed based on combining the Monte Carlo method and the IP-DG method with the LU direct linear solver. It is shown that the algorithm converges optimally with respect to both the mesh size h and the sampling number M, and practically its total computational complexity amounts only to solving very few deterministic elastic Helmholtz equations using the LU direct linear solver. Numerical experiments are also presented to demonstrate the performance and key features of the proposed MCIP-DG method.

  3. Local search methods based on variable focusing for random K -satisfiability

    NASA Astrophysics Data System (ADS)

    Lemoy, Rémi; Alava, Mikko; Aurell, Erik

    2015-01-01

    We introduce variable focused local search algorithms for satisfiability problems. Usual approaches focus uniformly on unsatisfied clauses; the methods described here instead focus on random variables in unsatisfied clauses. Variants are considered where variables are selected uniformly at random or with a bias towards picking variables participating in several unsatisfied clauses. These are studied on the random 3-SAT problem, together with an alternative energy definition: the number of variables in unsatisfied constraints. The variable-based focused Metropolis search (V-FMS) is found to be quite close in performance to the standard clause-based FMS at optimal noise. At infinite noise, instead, the threshold for the linearity of solution times with instance size is improved by preferentially picking variables in several UNSAT clauses. Consequences for algorithmic design are discussed.
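A hedged sketch of the focusing idea: the WalkSAT-style loop below picks a random unsatisfied clause and flips one of its variables, mixing noisy and greedy moves. It illustrates the focusing principle only; the authors' exact V-FMS update rule and parameters are not reproduced here.

```python
import random

def focused_search(clauses, n_vars, max_flips=100000, noise=0.3, seed=0):
    """Focused local search sketch: literals are nonzero ints, sign = polarity.
    Flip variables occurring in unsatisfied clauses until all are satisfied."""
    rng = random.Random(seed)
    assign = [rng.choice([False, True]) for _ in range(n_vars)]

    def lit_true(lit):
        return assign[lit - 1] if lit > 0 else not assign[-lit - 1]

    def unsat():
        return [c for c in clauses if not any(lit_true(l) for l in c)]

    for _ in range(max_flips):
        bad = unsat()
        if not bad:
            return assign                 # satisfying assignment found
        clause = rng.choice(bad)          # focus on an unsatisfied clause
        if rng.random() < noise:
            lit = rng.choice(clause)      # noisy random-walk move
        else:
            def cost(l):                  # greedy: fewest unsat after flip
                v = abs(l) - 1
                assign[v] = not assign[v]
                c = len(unsat())
                assign[v] = not assign[v]
                return c
            lit = min(clause, key=cost)
        v = abs(lit) - 1
        assign[v] = not assign[v]
    return None                           # budget exhausted

# tiny satisfiable instance: (x1 or x2) & (~x1 or x3) & (~x2 or ~x3)
cnf = [(1, 2), (-1, 3), (-2, -3)]
model = focused_search(cnf, 3)
```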

  4. Strength of evidence of noninferiority trials with the two confidence interval method with random margin.

    PubMed

    Wang, So-Young; Kang, Seung-Ho

    2013-03-11

    This article deals with the dependencies among noninferiority tests when the two confidence interval method is employed. There are two different definitions of the two confidence interval method, and one objective of this article is to sort out some of the confusion between them. In the first definition, the two confidence interval method is a fixed margin method that treats the noninferiority margin as a fixed constant after it is determined from historical data; in this article the method is called the two confidence interval method with fixed margin. The issue of dependency does not occur in this case. In the second definition, the two confidence interval method incorporates the uncertainty associated with estimating the noninferiority margin; in this article the method is called the two confidence interval method with random margin. Dependencies occur because two confidence interval methods with random margin share the same historical data. In this article we investigate how these dependencies affect the unconditional and conditional across-trial type I error rates.
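A minimal sketch of the fixed-margin variant (the first definition): the margin is derived once from the historical confidence interval and then treated as a constant. The 50% effect-preservation fraction and all numbers below are illustrative assumptions, not values from the article.

```python
import math

def two_ci_fixed_margin(hist_effect, hist_se, diff, diff_se,
                        preserve=0.5, z=1.96):
    """Fixed-margin ('two confidence interval') noninferiority sketch.
    hist_effect/hist_se: control-vs-placebo effect and SE from historical
    trials; diff/diff_se: test-minus-control difference in the current
    trial.  The preservation fraction is an illustrative assumption."""
    # Step 1: margin = preserved fraction of the lower 95% confidence
    # bound of the historical effect, then treated as a fixed constant.
    margin = preserve * (hist_effect - z * hist_se)
    # Step 2: noninferiority if the lower 95% bound of the current
    # test-minus-control difference stays above -margin.
    lower = diff - z * diff_se
    return margin, lower, lower > -margin

margin, lower_bound, noninferior = two_ci_fixed_margin(
    hist_effect=10.0, hist_se=2.0, diff=-1.0, diff_se=0.9)
```

With the random-margin variant the article discusses, the margin itself would be an estimate sharing the historical data across tests, which is exactly what induces the dependency.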

  5. A Bloch decomposition-based stochastic Galerkin method for quantum dynamics with a random external potential

    SciTech Connect

    Wu, Zhizhang Huang, Zhongyi

    2016-07-15

    In this paper, we consider the numerical solution of the one-dimensional Schrödinger equation with a periodic lattice potential and a random external potential. This is an important model in solid state physics where the randomness results from complicated phenomena that are not exactly known. Here we generalize the Bloch decomposition-based time-splitting pseudospectral method to the stochastic setting using the generalized polynomial chaos with a Galerkin procedure so that the main effects of dispersion and periodic potential are still computed together. We prove that our method is unconditionally stable and numerical examples show that it has other nice properties and is more efficient than the traditional method. Finally, we give some numerical evidence for the well-known phenomenon of Anderson localization.

  6. A method to dynamic stochastic multicriteria decision making with log-normally distributed random variables.

    PubMed

    Wang, Xin-Fan; Wang, Jian-Qiang; Deng, Sheng-Yue

    2013-01-01

    We investigate the dynamic stochastic multicriteria decision making (SMCDM) problems, in which the criterion values take the form of log-normally distributed random variables, and the argument information is collected from different periods. We propose two new geometric aggregation operators, namely the log-normal distribution weighted geometric (LNDWG) operator and the dynamic log-normal distribution weighted geometric (DLNDWG) operator, and develop a method for dynamic SMCDM with log-normally distributed random variables. This method uses the DLNDWG and LNDWG operators to aggregate the log-normally distributed criterion values, uses Shannon's entropy model to generate the time weight vector, and uses the expectation values and variances of the log-normal distributions to rank the alternatives and select the best one. Finally, an example is given to illustrate the feasibility and effectiveness of the developed method.
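The weighted geometric aggregation of log-normal variables has a closed form: if each criterion value is LogNormal(mu_i, sigma_i), their weighted geometric mean is again log-normal with mu* = sum(w_i * mu_i) and sigma*^2 = sum((w_i * sigma_i)^2). A hedged sketch of this step (the paper's exact LNDWG/DLNDWG definitions and entropy-based time weighting are not reproduced; ranking here uses expectations only):

```python
import math

def lnwg_aggregate(params, weights):
    """Aggregate log-normally distributed criterion values by a weighted
    geometric mean.  params is a list of (mu, sigma) pairs; the result
    is log-normal with mu* = sum(w*mu), var* = sum((w*sigma)**2).
    Returns the expectation and variance of the aggregated value."""
    mu = sum(w * m for (m, _), w in zip(params, weights))
    var = sum((w * s) ** 2 for (_, s), w in zip(params, weights))
    mean = math.exp(mu + var / 2)                       # E[X]
    variance = (math.exp(var) - 1) * math.exp(2 * mu + var)  # Var[X]
    return mean, variance

# two alternatives scored on three criteria as (mu, sigma), equal weights
w = [1 / 3] * 3
alt_a = lnwg_aggregate([(0.2, 0.1), (0.4, 0.2), (0.3, 0.1)], w)
alt_b = lnwg_aggregate([(0.1, 0.1), (0.2, 0.1), (0.2, 0.2)], w)
best = "A" if alt_a[0] > alt_b[0] else "B"
```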

  7. A systematic review of randomized controlled trials on sterilization methods of extracted human teeth

    PubMed Central

    Western, J. Sylvia; Dicksit, Daniel Devaprakash

    2016-01-01

    Aim of this Study: The aim was to evaluate the efficiency of different sterilization methods on extracted human teeth (EHT) by a systematic review of in vitro randomized controlled trials. Methodology: An extensive electronic database literature search concerning the sterilization of EHT was conducted. The search terms used were “human teeth, sterilization, disinfection, randomized controlled trials, and infection control.” Randomized controlled trials comparing the efficiency of different methods of sterilization of EHT were all included in this systematic review. Results: Out of 1618 articles obtained, eight articles were selected for this systematic review. The sterilization methods reviewed were autoclaving, 10% formalin, 5.25% sodium hypochlorite, 3% hydrogen peroxide, 2% glutaraldehyde, 0.1% thymol, and boiling at 100°C. Data were extracted from the selected individual studies and their findings were summarized. Conclusion: Autoclaving and 10% formalin can be considered 100% efficient and reliable methods, while 5.25% sodium hypochlorite, 3% hydrogen peroxide, 2% glutaraldehyde, 0.1% thymol, and boiling at 100°C were inefficient and unreliable methods of sterilization of EHT. PMID:27563183

  8. A self-adaptive method for creating high efficiency communication channels through random scattering media

    PubMed Central

    Hao, Xiang; Martin-Rouault, Laure; Cui, Meng

    2014-01-01

    Controlling the propagation of electromagnetic waves is important to a broad range of applications. Recent advances in controlling wave propagation in random scattering media have enabled optical focusing and imaging inside random scattering media. In this work, we propose and demonstrate a new method to deliver optical power more efficiently through scattering media. Drastically different from the random matrix characterization approach, our method can rapidly establish high efficiency communication channels using just a few measurements, regardless of the number of optical modes, and provides a practical and robust solution to boost the signal levels in optical or short wave communications. We experimentally demonstrated analog and digital signal transmission through highly scattering media with greatly improved performance. Besides scattering, our method can also reduce the loss of signal due to absorption. Experimentally, we observed that our method forced light to go around absorbers, leading to even higher signal improvement than in the case of purely scattering media. Interestingly, the resulting signal improvement is highly directional, which provides a new means against eavesdropping. PMID:25070592

  9. Wave propagation through random media: A local method of small perturbations based on the Helmholtz equation

    NASA Technical Reports Server (NTRS)

    Grosse, Ralf

    1990-01-01

    Propagation of sound through the turbulent atmosphere is a statistical problem. The randomness of the refractive index field causes sound pressure fluctuations. Although no general theory to predict sound pressure statistics from given refractive index statistics exists, there are several approximate solutions to the problem. The most common approximation is the parabolic equation method. Results obtained by this method are restricted to small refractive index fluctuations and to small wavelengths. While the first condition is generally met in the atmosphere, it is desirable to overcome the second. A generalization of the parabolic equation method with respect to the small-wavelength restriction is presented.

  10. Reduced basis ANOVA methods for partial differential equations with high-dimensional random inputs

    NASA Astrophysics Data System (ADS)

    Liao, Qifeng; Lin, Guang

    2016-07-01

    In this paper we present a reduced basis ANOVA approach for partial differential equations (PDEs) with random inputs. The ANOVA method combined with stochastic collocation methods provides model reduction in high-dimensional parameter space through decomposing high-dimensional inputs into unions of low-dimensional inputs. In this work, to further reduce the computational cost, we investigate spatial low-rank structures in the ANOVA-collocation method, and develop efficient spatial model reduction techniques using hierarchically generated reduced bases. We present a general mathematical framework of the methodology, validate its accuracy and demonstrate its efficiency with numerical experiments.

  11. Reduced basis ANOVA methods for partial differential equations with high-dimensional random inputs

    SciTech Connect

    Liao, Qifeng; Lin, Guang

    2016-07-15

    In this paper we present a reduced basis ANOVA approach for partial differential equations (PDEs) with random inputs. The ANOVA method combined with stochastic collocation methods provides model reduction in high-dimensional parameter space through decomposing high-dimensional inputs into unions of low-dimensional inputs. In this work, to further reduce the computational cost, we investigate spatial low-rank structures in the ANOVA-collocation method, and develop efficient spatial model reduction techniques using hierarchically generated reduced bases. We present a general mathematical framework of the methodology, validate its accuracy and demonstrate its efficiency with numerical experiments.

  12. Effective conductivity of particulate polymer composite electrolytes using random resistor network method

    SciTech Connect

    Kalnaus, Sergiy; Sabau, Adrian S; Newman, Sarah M; Tenhaeff, Wyatt E; Daniel, Claus; Dudney, Nancy J

    2011-01-01

    The effective DC conductivity of particulate composite electrolytes was obtained by solving the electrostatics equations in three dimensions using the random resistor network method. The composite structure was considered to consist of three phases: matrix, particulate filler, and a conductive shell surrounding each particle, each phase possessing a different conductivity. Different particle size distributions were generated using Monte Carlo simulations. Unlike effective medium formulations, the random resistor network method was shown to predict percolation thresholds for the effective composite conductivity. It was found that the mean particle radius has a greater influence on the effective composite conductivity than the type of particle size distribution considered. The effect of the shell thickness on the composite conductivity was also investigated: the conductivity enhancement due to the conductive shell phase becomes less evident as the shell thickness increases.
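A minimal 2D sketch of the random resistor network idea (the paper's model is three-dimensional and three-phase; the two-phase bond model, grid size, and conductance values below are illustrative): bonds receive random conductances, Kirchhoff's current law is solved with fixed face voltages, and the total current through a face gives the effective conductance.

```python
import numpy as np

def effective_conductance(n=8, p_filler=0.3, g_matrix=1e-3, g_filler=1.0, seed=1):
    """Effective conductance of an n x n random resistor network with the
    left face held at 1 V and the right face grounded (illustrative
    two-phase stand-in for the paper's three-phase 3D model)."""
    rng = np.random.default_rng(seed)
    idx = lambda i, j: i * n + j
    N = n * n
    G = np.zeros((N, N))                  # network Laplacian
    bonds = []
    for i in range(n):
        for j in range(n):
            for di, dj in ((0, 1), (1, 0)):
                ii, jj = i + di, j + dj
                if ii < n and jj < n:
                    g = g_filler if rng.random() < p_filler else g_matrix
                    a, b = idx(i, j), idx(ii, jj)
                    G[a, a] += g
                    G[b, b] += g
                    G[a, b] -= g
                    G[b, a] -= g
                    bonds.append((a, b, g))
    v = np.zeros(N)
    fixed = {idx(i, 0): 1.0 for i in range(n)}            # left face: 1 V
    fixed.update({idx(i, n - 1): 0.0 for i in range(n)})  # right face: 0 V
    free = [k for k in range(N) if k not in fixed]
    for k, val in fixed.items():
        v[k] = val
    # Kirchhoff's current law on free nodes: G_ff v_f = -G_fc v_c
    rhs = -G[np.ix_(free, list(fixed))] @ np.array(list(fixed.values()))
    v[free] = np.linalg.solve(G[np.ix_(free, free)], rhs)
    # effective conductance = total current leaving the 1 V face
    current = sum(g * (v[a] - v[b]) for a, b, g in bonds
                  if a % n == 0 and b % n != 0)
    return float(current)

g_eff = effective_conductance()
```

Percolation shows up naturally: sweeping `p_filler` across the bond-percolation threshold makes `g_eff` jump from matrix-like to filler-like values, which effective-medium formulas cannot capture.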

  13. A method for assessing the effect of water quality changes on plumbosolvency using random daytime sampling.

    PubMed

    Cardew, P T

    2003-07-01

    The Mann-Whitney U-test is used to demonstrate the impact of phosphate on lead concentrations measured at customer properties. This test is statistically robust and particularly efficient for the type of distributions encountered in lead random daytime sampling. This non-parametric technique is developed to provide a best estimate of the lead reduction that results from a change in plumbosolvency conditions. The method is illustrated with compliance data collected before and after the introduction of phosphate at customer properties in the north west of England. Limitations due to operational factors are highlighted. Using a Monte Carlo simulation of the variability of lead random daytime samples it is shown that the method might be practical when assessing the impact of incremental changes in phosphate concentration on plumbosolvency.
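The Mann-Whitney U statistic, together with a Hodges-Lehmann-style best estimate of the lead reduction (the median of all pairwise differences), can be sketched directly. The samples below are made up for illustration; the paper's compliance data are not reproduced.

```python
def mann_whitney_u(x, y):
    """Mann-Whitney U for sample x versus y, counting ties as 1/2.
    A direct O(n*m) sketch, adequate for compliance-sized samples."""
    u = 0.0
    for a in x:
        for b in y:
            if a > b:
                u += 1.0
            elif a == b:
                u += 0.5
    return u

def median_shift_estimate(before, after):
    """Hodges-Lehmann-style estimate of the reduction: median of all
    pairwise differences before[i] - after[j]."""
    diffs = sorted(a - b for a in before for b in after)
    n = len(diffs)
    mid = n // 2
    return diffs[mid] if n % 2 else 0.5 * (diffs[mid - 1] + diffs[mid])

# illustrative random-daytime lead samples (ug/l), before/after phosphate
before = [22, 35, 18, 40, 27, 31, 15, 29]
after = [12, 9, 14, 20, 11, 16, 8, 13]
u = mann_whitney_u(before, after)
shift = median_shift_estimate(before, after)
```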

  14. Random vs. Combinatorial Methods for Discrete Event Simulation of a Grid Computer Network

    NASA Technical Reports Server (NTRS)

    Kuhn, D. Richard; Kacker, Raghu; Lei, Yu

    2010-01-01

    This study compared random and t-way combinatorial inputs of a network simulator, to determine if these two approaches produce significantly different deadlock detection for varying network configurations. Modeling deadlock detection is important for analyzing configuration changes that could inadvertently degrade network operations, or to determine modifications that could be made by attackers to deliberately induce deadlock. Discrete event simulation of a network may be conducted using random generation of inputs. In this study, we compare random with combinatorial generation of inputs. Combinatorial (or t-way) testing requires every combination of any t parameter values to be covered by at least one test. Combinatorial methods can be highly effective because empirical data suggest that nearly all failures involve the interaction of a small number of parameters (1 to 6). Thus, for example, if all deadlocks involve at most 5-way interactions between n parameters, then exhaustive testing of all n-way interactions adds no additional information that would not be obtained by testing all 5-way interactions. While the maximum degree of interaction between parameters involved in the deadlocks clearly cannot be known in advance, covering all t-way interactions may be more efficient than using random generation of inputs. In this study we tested this hypothesis for t = 2, 3, and 4 for deadlock detection in a network simulation. Achieving the same degree of coverage provided by 4-way tests would have required approximately 3.2 times as many random tests; thus combinatorial methods were more efficient for detecting deadlocks involving a higher degree of interactions. The paper reviews explanations for these results and implications for modeling and simulation.
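The notion of t-way combination coverage can be made concrete by measuring what fraction of all 2-way parameter-value pairs a random test set covers. Parameter counts, value sets, and test-set size below are illustrative, not the study's configuration.

```python
import itertools
import random

def t_way_coverage(tests, n_params, values, t=2):
    """Fraction of all t-way parameter-value combinations covered by a
    test set, where each test is a tuple of one value per parameter."""
    col_sets = list(itertools.combinations(range(n_params), t))
    needed = {(cols, vals) for cols in col_sets
              for vals in itertools.product(values, repeat=t)}
    hit = {(cols, tuple(test[c] for c in cols))
           for test in tests for cols in col_sets}
    return len(hit & needed) / len(needed)

rng = random.Random(0)
n_params, values = 5, (0, 1, 2)
random_tests = [tuple(rng.choice(values) for _ in range(n_params))
                for _ in range(20)]
cov = t_way_coverage(random_tests, n_params, values, t=2)
exhaustive = list(itertools.product(values, repeat=n_params))
full_cov = t_way_coverage(exhaustive, n_params, values, t=2)
```

A covering array reaches `full_cov == 1.0` with far fewer tests than the exhaustive set; random tests typically leave some pairs uncovered, which is the gap the study quantifies.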

  15. Robust audio watermark method using sinusoid patterns based on pseudo-random sequences

    NASA Astrophysics Data System (ADS)

    Liu, Zheng; Kobayashi, Yoshiyuki; Sawato, Shusaku; Inoue, Akira

    2003-06-01

    In recent years, spread spectrum watermarking has become the most promising technique, widely used not only for still image and video watermarking but also for audio watermarking. However, technical problems such as the psycho-acoustic shaping required to reduce audible noise have greatly limited its utility in audio watermarking. In this paper, we propose a novel audio watermarking method based on spread spectrum technology by which we can embed watermarks into audio signals inaudibly, with robustness to a wide range of unintended and intended attacks. In the proposed method, the watermark is represented by sinusoidal patterns consisting of sinusoids whose phases are modulated by the elements of a pseudo-random sequence. We confirmed theoretically and experimentally that such sinusoidal patterns retain the correlation properties of the underlying pseudo-random sequences and offer high robustness with low noise, are easy to manipulate, and require no psycho-acoustic shaping. Watermark detection is blind, and the effectiveness of the proposed method was verified by the STEP2001 test.
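A hedged sketch of the embedding and detection idea: each pseudo-random bit selects the phase (0 or pi) of one sinusoid segment, and blind detection correlates the received signal with the pattern. All parameters are illustrative assumptions; the paper's exact construction is not reproduced.

```python
import numpy as np

def sinusoid_pattern(prn_bits, freq, chip_len, fs):
    """Watermark pattern: one sinusoid segment per pseudo-random bit,
    the bit selecting a phase of 0 or pi (sketch of the idea only)."""
    t = np.arange(chip_len) / fs
    return np.concatenate(
        [np.sin(2 * np.pi * freq * t + (0.0 if b else np.pi))
         for b in prn_bits])

rng = np.random.default_rng(3)
fs, freq, chip_len = 8000, 1000.0, 80
prn = rng.integers(0, 2, 63)                    # embedded PRN sequence
pattern = sinusoid_pattern(prn, freq, chip_len, fs)
host = 0.5 * rng.standard_normal(pattern.size)  # stand-in for host audio
watermarked = host + 0.1 * pattern              # inaudibly weak embedding

# blind detection: correlation with the correct pattern stands out,
# while a pattern built from the wrong PRN sequence does not
match = float(watermarked @ pattern)
wrong = sinusoid_pattern(rng.integers(0, 2, 63), freq, chip_len, fs)
mismatch = float(watermarked @ wrong)
```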

  16. Seismic coherent and random noise attenuation using the undecimated discrete wavelet transform method with WDGA technique

    NASA Astrophysics Data System (ADS)

    Goudarzi, Alireza; Riahi, Mohammad Ali

    2012-12-01

    One of the most crucial challenges in seismic data processing is reducing the noise in the data, i.e., improving the signal-to-noise ratio. In this study, the 1D undecimated discrete wavelet transform (UDWT) is applied to attenuate random noise and ground roll. Wavelet domain ground roll analysis (WDGA) is applied to find the ground roll energy in the wavelet domain, serving as a substitute for thresholding in seismic data processing. To assess the effectiveness of the WDGA method, we apply the 1D double-density discrete wavelet transform (DDDWT) with soft thresholding to the random noise reduction and ground roll attenuation processes. Seismic signals intersect with ground roll in the time and frequency domains. Random noise and ground roll have many undesirable effects on pre-stack seismic data and result in an inaccurate velocity analysis for NMO correction. In this paper, the UDWT (using the WDGA technique), the DDDWT (using soft thresholding), and the regular Fourier-based f-k transform are used and compared for seismic denoising.
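A one-level, shift-invariant Haar version of wavelet soft-threshold denoising, as a minimal stand-in for the multi-level UDWT/DDDWT pipelines compared in the paper (signal and threshold are illustrative):

```python
import numpy as np

def soft(x, t):
    """Soft thresholding: shrink magnitudes by t, zeroing small values."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def udwt_haar_denoise(x, thresh):
    """One-level undecimated (shift-invariant) Haar denoising sketch:
    split into average/detail bands, soft-threshold the detail band,
    then average the two redundant reconstructions."""
    x1 = np.roll(x, -1)
    approx = (x + x1) / 2.0
    detail = soft((x - x1) / 2.0, thresh)
    # exact inverse when thresh == 0
    return ((approx + detail) + np.roll(approx - detail, 1)) / 2.0

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 512, endpoint=False)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + 0.3 * rng.standard_normal(512)
denoised = udwt_haar_denoise(noisy, thresh=0.2)
err_noisy = float(np.mean((noisy - clean) ** 2))
err_denoised = float(np.mean((denoised - clean) ** 2))
identity_err = float(np.max(np.abs(udwt_haar_denoise(noisy, 0.0) - noisy)))
```

The redundancy of the undecimated transform is what removes the shift sensitivity of ordinary decimated wavelet denoising; the WDGA step in the paper replaces the fixed threshold with a data-driven estimate of ground roll energy.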

  17. Thermodynamic method for generating random stress distributions on an earthquake fault

    USGS Publications Warehouse

    Barall, Michael; Harris, Ruth A.

    2012-01-01

    This report presents a new method for generating random stress distributions on an earthquake fault, suitable for use as initial conditions in a dynamic rupture simulation. The method employs concepts from thermodynamics and statistical mechanics. A pattern of fault slip is considered to be analogous to a micro-state of a thermodynamic system. The energy of the micro-state is taken to be the elastic energy stored in the surrounding medium. Then, the Boltzmann distribution gives the probability of a given pattern of fault slip and stress. We show how to decompose the system into independent degrees of freedom, which makes it computationally feasible to select a random state. However, due to the equipartition theorem, straightforward application of the Boltzmann distribution leads to a divergence which predicts infinite stress. To avoid equipartition, we show that the finite strength of the fault acts to restrict the possible states of the system. By analyzing a set of earthquake scaling relations, we derive a new formula for the expected power spectral density of the stress distribution, which allows us to construct a computer algorithm free of infinities. We then present a new technique for controlling the extent of the rupture by generating a random stress distribution thousands of times larger than the fault surface, and selecting a portion which, by chance, has a positive stress perturbation of the desired size. Finally, we present a new two-stage nucleation method that combines a small zone of forced rupture with a larger zone of reduced fracture energy.
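The core construction, a random field synthesized by attaching random phases to a prescribed power spectral density in Fourier space, can be sketched in 1D. The spectral exponent here is an illustrative placeholder, not the paper's derived spectrum.

```python
import numpy as np

def random_stress(n, spectral_exponent=-1.0, seed=7):
    """Generate a zero-mean 1D random stress profile whose power
    spectral density falls off as k**spectral_exponent, by filtering
    white random phases in Fourier space (illustrative stand-in for
    the report's derived spectrum)."""
    rng = np.random.default_rng(seed)
    k = np.fft.rfftfreq(n)
    amp = np.zeros_like(k)
    amp[1:] = k[1:] ** (spectral_exponent / 2.0)   # amplitude = sqrt(PSD)
    phases = np.exp(2j * np.pi * rng.random(len(k)))
    field = np.fft.irfft(amp * phases, n)          # zero DC -> zero mean
    return field / field.std()                     # normalize variance

stress = random_stress(1024)
```

The report's oversampling trick corresponds to generating a field much longer than needed and selecting a window whose mean perturbation happens to be positive and of the desired size.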

  18. A stochastic simulation method for the assessment of resistive random access memory retention reliability

    SciTech Connect

    Berco, Dan Tseng, Tseung-Yuen

    2015-12-21

    This study presents an evaluation method for resistive random access memory retention reliability based on the Metropolis Monte Carlo algorithm and Gibbs free energy. The method, which does not rely on a time evolution, provides an extremely efficient way to compare the relative retention properties of metal-insulator-metal structures. It requires a small number of iterations and may be used for statistical analysis. The presented approach is used to compare the relative robustness of a single layer ZrO{sub 2} device with a double layer ZnO/ZrO{sub 2} one, and obtain results which are in good agreement with experimental data.
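A toy illustration of Metropolis acceptance as a retention proxy (a one-dimensional stand-in, not the paper's device model): proposed perturbations costing dE are accepted with probability min(1, exp(-dE/kT)), so a structure with a higher Gibbs-energy barrier accepts fewer degrading moves.

```python
import math
import random

def metropolis_escape_fraction(barrier, kT=0.3, steps=20000, seed=5):
    """Fraction of accepted barrier-crossing moves in a Metropolis run:
    each proposed move costs dE ~ U(0, barrier) and is accepted with
    probability exp(-dE/kT).  A lower fraction proxies better retention.
    (Illustrative 1D stand-in; no time evolution, as in the paper.)"""
    rng = random.Random(seed)
    accepted = 0
    for _ in range(steps):
        dE = barrier * rng.random()            # proposed energy cost
        if rng.random() < math.exp(-dE / kT):  # Metropolis acceptance
            accepted += 1
    return accepted / steps

single_layer = metropolis_escape_fraction(barrier=0.8)   # ZrO2-like toy
double_layer = metropolis_escape_fraction(barrier=1.4)   # ZnO/ZrO2-like toy
```

Comparing the two fractions ranks relative robustness without simulating retention time, which mirrors the efficiency argument of the abstract.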

  19. An efficient hybrid reliability analysis method with random and interval variables

    NASA Astrophysics Data System (ADS)

    Xie, Shaojun; Pan, Baisong; Du, Xiaoping

    2016-09-01

    Random and interval variables often coexist. Interval variables make reliability analysis much more computationally intensive. This work develops a new hybrid reliability analysis method so that the probability analysis (PA) loop and interval analysis (IA) loop are decomposed into two separate loops. An efficient PA algorithm is employed, and a new efficient IA method is developed. The new IA method consists of two stages. The first stage is for monotonic limit-state functions. If the limit-state function is not monotonic, the second stage is triggered. In the second stage, the limit-state function is sequentially approximated with a second order form, and the gradient projection method is applied to solve the extreme responses of the limit-state function with respect to the interval variables. The efficiency and accuracy of the proposed method are demonstrated by three examples.

  20. Determination of Slope Safety Factor with Analytical Solution and Searching Critical Slip Surface with Genetic-Traversal Random Method

    PubMed Central

    2014-01-01

    In current practice, to determine the safety factor of a slope with a two-dimensional circular potential failure surface, one of the searching methods for the critical slip surface is the Genetic Algorithm (GA), while the method used to calculate the slope safety factor is Fellenius' slices method. However, GA needs to be validated with more numerical tests, and Fellenius' slices method is only an approximate method, like the finite element method. This paper proposes a new way to determine the minimum slope safety factor: the safety factor is determined with an analytical solution, and the critical slip surface is searched with a Genetic-Traversal Random Method. The analytical solution is more accurate than Fellenius' slices method, and the Genetic-Traversal Random Method uses random picking to implement mutation. A computer program automating the Genetic-Traversal Random Method search was developed. Comparison with other methods, such as the slope/w software, indicates that the Genetic-Traversal Random Search Method can give a very low safety factor, about half that of the other methods. However, the minimum safety factor obtained with the Genetic-Traversal Random Search Method is very close to the lower-bound solutions of the slope safety factor given by the Ansys software. PMID:24782679

  1. Reflective Random Indexing and indirect inference: a scalable method for discovery of implicit connections.

    PubMed

    Cohen, Trevor; Schvaneveldt, Roger; Widdows, Dominic

    2010-04-01

    The discovery of implicit connections between terms that do not occur together in any scientific document underlies the model of literature-based knowledge discovery first proposed by Swanson. Corpus-derived statistical models of semantic distance such as Latent Semantic Analysis (LSA) have been evaluated previously as methods for the discovery of such implicit connections. However, LSA in particular is dependent on a computationally demanding method of dimension reduction as a means to obtain meaningful indirect inference, limiting its ability to scale to large text corpora. In this paper, we evaluate the ability of Random Indexing (RI), a scalable distributional model of word associations, to draw meaningful implicit relationships between terms in general and biomedical language. Proponents of this method have achieved comparable performance to LSA on several cognitive tasks while using a simpler and less computationally demanding method of dimension reduction than LSA employs. In this paper, we demonstrate that the original implementation of RI is ineffective at inferring meaningful indirect connections, and evaluate Reflective Random Indexing (RRI), an iterative variant of the method that is better able to perform indirect inference. RRI is shown to lead to more clearly related indirect connections and to outperform existing RI implementations in the prediction of future direct co-occurrence in the MEDLINE corpus.
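The reflective idea can be sketched compactly: start from sparse random document vectors, derive term vectors, then cycle document and term vectors so that terms sharing a bridge term (but never co-occurring) acquire similar vectors. Dimensions, sparsity, the iteration count, and the toy corpus are illustrative assumptions, not the authors' configuration.

```python
import numpy as np

def sparse_vec(rng, dim=256, nnz=8):
    """Sparse ternary random index vector: +1/-1 at a few positions."""
    v = np.zeros(dim)
    pos = rng.choice(dim, nnz, replace=False)
    v[pos] = rng.choice([-1.0, 1.0], nnz)
    return v

def reflective_ri(docs, iterations=2, dim=256, seed=0):
    """Reflective Random Indexing sketch: term vectors are sums of the
    vectors of documents containing them; document vectors are then
    rebuilt from term vectors, propagating indirect similarity."""
    rng = np.random.default_rng(seed)
    doc_vecs = [sparse_vec(rng, dim) for _ in docs]
    terms = sorted({t for d in docs for t in d})
    term_vecs = {}
    for _ in range(iterations + 1):
        term_vecs = {t: np.zeros(dim) for t in terms}
        for dv, d in zip(doc_vecs, docs):
            for t in d:
                term_vecs[t] += dv
        for t in terms:
            norm = np.linalg.norm(term_vecs[t])
            if norm:
                term_vecs[t] /= norm
        # reflective step: rebuild document vectors from term vectors
        doc_vecs = [sum(term_vecs[t] for t in d) for d in docs]
    return term_vecs

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# "migraine" and "magnesium" never co-occur but share "calcium-channel"
docs = [["migraine", "calcium-channel"],
        ["calcium-channel", "magnesium"],
        ["unrelated", "terms"]]
tv = reflective_ri(docs)
indirect = cos(tv["migraine"], tv["magnesium"])
baseline = cos(tv["migraine"], tv["unrelated"])
```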

  2. Subtraction method in the second random-phase approximation: First applications with a Skyrme energy functional

    NASA Astrophysics Data System (ADS)

    Gambacurta, D.; Grasso, M.; Engel, J.

    2015-09-01

    We make use of a subtraction procedure, introduced to overcome double-counting problems in beyond-mean-field theories, in the second random-phase-approximation (SRPA) for the first time. This procedure guarantees the stability of the SRPA (so that all excitation energies are real). We show that the method fits perfectly into nuclear density-functional theory. We illustrate applications to the monopole and quadrupole response and to low-lying 0+ and 2+ states in the nucleus 16O . We show that the subtraction procedure leads to (i) results that are weakly cutoff dependent and (ii) a considerable reduction of the SRPA downwards shift with respect to the random-phase approximation (RPA) spectra (systematically found in all previous applications). This implementation of the SRPA model will allow a reliable analysis of the effects of two particle-two hole configurations (2p2h) on the excitation spectra of medium-mass and heavy nuclei.

  3. RecRWR: a recursive random walk method for improved identification of diseases.

    PubMed

    Arrais, Joel Perdiz; Oliveira, José Luís

    2015-01-01

    High-throughput methods such as next-generation sequencing or DNA microarrays lack precision, as they return hundreds of genes for a single disease profile. Several computational methods applied to physical interaction of protein networks have been successfully used in identification of the best disease candidates for each expression profile. An open problem for these methods is the ability to combine and take advantage of the wealth of biomedical data publicly available. We propose an enhanced method to improve selection of the best disease targets for a multilayer biomedical network that integrates PPI data annotated with stable knowledge from OMIM diseases and GO biological processes. We present a comprehensive validation that demonstrates the advantage of the proposed approach, Recursive Random Walk with Restarts (RecRWR). The obtained results outline the superiority of the proposed approach, RecRWR, in identifying disease candidates, especially with high levels of biological noise and benefiting from all data available.
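The underlying random walk with restarts can be sketched on a toy network; the multilayer integration with OMIM/GO annotations is not reproduced, and the restart probability is an illustrative choice.

```python
import numpy as np

def random_walk_with_restart(adj, seeds, restart=0.3, tol=1e-10):
    """Random walk with restart: iterate p <- (1-r) W p + r p0 on a
    column-normalized adjacency matrix until convergence.  Nodes are
    then ranked by steady-state probability (candidate prioritization)."""
    adj = np.asarray(adj, dtype=float)
    W = adj / adj.sum(axis=0, keepdims=True)   # column-stochastic
    p0 = np.zeros(len(adj))
    p0[seeds] = 1.0 / len(seeds)               # restart at the seed genes
    p = p0.copy()
    while True:
        nxt = (1 - restart) * W @ p + restart * p0
        if np.abs(nxt - p).sum() < tol:
            return nxt
        p = nxt

# toy chain network: seed node 0 - 1 - 2 - 3 - 4
A = np.array([[0, 1, 0, 0, 0],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]])
p = random_walk_with_restart(A, seeds=[0])
```

Nodes closer to the seed score higher, which is the ranking principle the recursive variant builds on.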

  4. Mixture model and Markov random field-based remote sensing image unsupervised clustering method

    NASA Astrophysics Data System (ADS)

    Hou, Y.; Yang, Y.; Rao, N.; Lun, X.; Lan, J.

    2011-03-01

    In this paper, a novel method for remote sensing image clustering based on a mixture model and a Markov random field (MRF) is proposed. A remote sensing image can be modeled as a Gaussian mixture, and the image clustering result, corresponding to the image label field, is an MRF. The image clustering procedure is therefore transformed into a maximum a posteriori (MAP) problem by Bayes' theorem. The intensity difference and the spatial distance between the two pixels in the same clique are introduced into the traditional MRF potential function. Iterated conditional modes (ICM) is employed to find the solution of the MAP problem, and the maximum entropy criterion is used to choose the optimal number of clusters. In the experiments, the method is compared with traditional MRF clustering using ICM and simulated annealing (SA). The results show that this method is better than the traditional MRF model in both noise filtering and misclassification ratio.
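A minimal sketch of MAP estimation by ICM under a Gaussian likelihood with a Potts-style neighborhood penalty. The paper's extended potential (intensity differences and spatial distances within cliques) and its entropy-based model selection are not reproduced; image size, means, and beta are illustrative.

```python
import numpy as np

def icm_segment(img, means, beta=1.0, iters=5):
    """ICM for MAP clustering: each pixel takes the label k minimizing
    (intensity - mean_k)^2 / 2 + beta * (number of disagreeing
    4-neighbors), sweeping the image repeatedly."""
    labels = np.argmin([(img - m) ** 2 for m in means], axis=0)
    h, w = img.shape
    for _ in range(iters):
        for i in range(h):
            for j in range(w):
                best, best_cost = labels[i, j], np.inf
                for k, m in enumerate(means):
                    cost = 0.5 * (img[i, j] - m) ** 2
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < h and 0 <= nj < w and labels[ni, nj] != k:
                            cost += beta          # Potts smoothness prior
                    if cost < best_cost:
                        best, best_cost = k, cost
                labels[i, j] = best
    return labels

rng = np.random.default_rng(2)
truth = np.zeros((16, 16), dtype=int)
truth[:, 8:] = 1                                  # two-region ground truth
img = truth * 2.0 + 0.6 * rng.standard_normal((16, 16))
seg = icm_segment(img, means=[0.0, 2.0], beta=1.0)
accuracy = float((seg == truth).mean())
```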

  5. Image reconstruction in EIT with unreliable electrode data using random sample consensus method

    NASA Astrophysics Data System (ADS)

    Jeon, Min Ho; Khambampati, Anil Kumar; Kim, Bong Seok; In Kang, Suk; Kim, Kyung Youn

    2017-04-01

    In electrical impedance tomography (EIT), it is important to acquire reliable measurement data through the EIT system to achieve a good reconstructed image. To obtain reliable data, various methods for checking and optimizing the EIT measurement system have been studied. However, most of these methods involve additional cost for testing, and the measurement setup is often evaluated only before the experiment. It is useful to have a method that can detect faulty electrode data during the experiment without any additional cost. This paper presents a method based on random sample consensus (RANSAC) to find incorrect data from a faulty electrode in EIT measurements. RANSAC is a curve fitting method that removes outliers from measurement data. Here, RANSAC is applied together with the Gauss-Newton (GN) method for image reconstruction of a human thorax with faulty data. Numerical and phantom experiments are performed, and the reconstruction performance of the proposed RANSAC method with GN is compared with the conventional GN method. The results show that RANSAC with GN achieves better reconstruction performance than the conventional GN method with faulty electrode data.
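A hedged sketch of the RANSAC principle on a line-fitting toy problem, analogous to screening out faulty-electrode measurements before reconstruction (points, tolerance, and iteration count are illustrative):

```python
import random

def ransac_line(points, n_iter=200, tol=0.5, seed=4):
    """RANSAC sketch: repeatedly fit a line through 2 random points and
    keep the model with the most inliers; everything outside `tol` of
    the best line is flagged as an outlier (a gross 'faulty' sample)."""
    rng = random.Random(seed)
    best_inliers = []
    for _ in range(n_iter):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue                      # skip degenerate vertical sample
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        inliers = [p for p in points if abs(p[1] - (a * p[0] + b)) < tol]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    return best_inliers

# data on y = 2x + 1 with two gross "faulty electrode" outliers
pts = [(x, 2 * x + 1) for x in range(10)] + [(3.0, 30.0), (7.0, -5.0)]
inliers = ransac_line(pts)
outliers = [p for p in pts if p not in inliers]
```

In the EIT setting the "model" is the forward-predicted voltage pattern rather than a line, but the consensus logic, keeping the largest mutually consistent subset, is the same.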

  6. Klapp method effect on idiopathic scoliosis in adolescents: blind randomized controlled clinical trial

    PubMed Central

    Dantas, Diego De Sousa; De Assis, Sanderson José Costa; Baroni, Marina Pegoraro; Lopes, Johnnatas Mikael; Cacho, Enio Walker Azevedo; Cacho, Roberta De Oliveira; Pereira, Silvana Alves

    2017-01-01

    [Purpose] To estimate the effect of the Klapp method on idiopathic scoliosis in school students. [Subjects and Methods] A single-blind randomized clinical trial with 22 students randomly divided into an intervention group (n=12) and an inactive control group (n=10). The exercise protocol consisted of the Klapp method, 20 sessions, three times a week, for the intervention group, and inactivity for the control group. Dorsal muscle strength was measured by dynamometer; body asymmetries and gibbosity angles were measured by biophotogrammetry. Data were analyzed by Generalized Estimating Equations, with a 5% significance level, and the clinical impact for dependent variables was estimated by Cohen’s “d”. [Results] There was no intragroup or intergroup change in any postural symmetry variable. However, an intergroup difference was detected in extensor muscle strength, along with a marginally significant intergroup difference in gibbosity angles. The intervention group showed an average improvement in extensor muscle strength of 7.0 kgf compared to the control group, and gibbosity angles progressed less in the intervention group, with a 5.71° average delay compared to the control group. [Conclusion] The Klapp method was effective for gibbosity stabilization and improved spine extensor muscle strength. PMID:28210027

  7. Klapp method effect on idiopathic scoliosis in adolescents: blind randomized controlled clinical trial.

    PubMed

    Dantas, Diego De Sousa; De Assis, Sanderson José Costa; Baroni, Marina Pegoraro; Lopes, Johnnatas Mikael; Cacho, Enio Walker Azevedo; Cacho, Roberta De Oliveira; Pereira, Silvana Alves

    2017-01-01

    [Purpose] To estimate the effect of the Klapp method on idiopathic scoliosis in school students. [Subjects and Methods] A single-blind randomized clinical trial with 22 students randomly divided into an intervention group (n=12) and an inactive control group (n=10). The exercise protocol consisted of the Klapp method, 20 sessions, three times a week, for the intervention group, and inactivity for the control group. Dorsal muscle strength was measured by dynamometer; body asymmetries and gibbosity angles were measured by biophotogrammetry. Data were analyzed by Generalized Estimating Equations, with a 5% significance level, and the clinical impact for dependent variables was estimated by Cohen's "d". [Results] There was no intragroup or intergroup change in any postural symmetry variable. However, an intergroup difference was detected in extensor muscle strength, along with a marginally significant intergroup difference in gibbosity angles. The intervention group showed an average improvement in extensor muscle strength of 7.0 kgf compared to the control group, and gibbosity angles progressed less in the intervention group, with a 5.71° average delay compared to the control group. [Conclusion] The Klapp method was effective for gibbosity stabilization and improved spine extensor muscle strength.

  8. Serum thyroglobulin reference intervals in regions with adequate and more than adequate iodine intake.

    PubMed

    Wang, Zhaojun; Zhang, Hanyi; Zhang, Xiaowen; Sun, Jie; Han, Cheng; Li, Chenyan; Li, Yongze; Teng, Xiaochun; Fan, Chenling; Liu, Aihua; Shan, Zhongyan; Liu, Chao; Weng, Jianping; Teng, Weiping

    2016-11-01

    The purpose of this study was to establish normal thyroglobulin (Tg) reference intervals (RIs) in regions with adequate and more than adequate iodine intake according to the National Academy of Clinical Biochemistry (NACB) guidelines and to investigate the relationships between Tg and other factors. A total of 1317 thyroid disease-free adult subjects (578 men, 739 nonpregnant women) from 2 cities (Guangzhou and Nanjing) were enrolled in this retrospective, observational study. Each subject completed a questionnaire and underwent physical and ultrasonic examination. Serum Tg, thyroid-stimulating hormone (TSH), thyroid peroxidase antibody (TPOAb), Tg antibody (TgAb), and urinary iodine concentration (UIC) were measured. Reference groups were established on the basis of TSH levels: 0.5 to 2.0 and 0.27 to 4.2 mIU/L. The Tg RIs for Guangzhou and Nanjing were 1.6 to 30.0 and 1.9 to 25.8 ng/mL, respectively. No significant differences in Tg were found between genders or among different reference groups. Stepwise linear regression analyses showed that TgAb, thyroid volume, goiter, gender, age, and TSH levels were correlated with Tg. In adults from regions with adequate and more than adequate iodine intake, we found that Tg may be a suitable marker of iodine status; gender-specific Tg RI was unnecessary; there was no difference between Tg RIs in regions with adequate and more than adequate iodine intake; and the TSH criterion for selecting the Tg reference population could follow the local TSH reference rather than 0.5 to 2.0 mIU/L.
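    For illustration, an NACB-style nonparametric reference interval is simply the central 95% of a screened reference sample (2.5th to 97.5th percentiles). A minimal sketch on simulated, hypothetical Tg values (the real RIs above come from screened subjects, not simulation):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical serum Tg values (ng/mL) for a disease-free reference group;
# lognormal shape is an assumption for illustration only.
tg = rng.lognormal(mean=2.2, sigma=0.6, size=1000)

# Nonparametric central 95% reference interval: 2.5th and 97.5th percentiles.
lower, upper = np.percentile(tg, [2.5, 97.5])
print(f"Tg RI: {lower:.1f} to {upper:.1f} ng/mL")
```

In practice the reference group would first be filtered by the TSH criterion discussed in the abstract.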

  9. Serum thyroglobulin reference intervals in regions with adequate and more than adequate iodine intake

    PubMed Central

    Wang, Zhaojun; Zhang, Hanyi; Zhang, Xiaowen; Sun, Jie; Han, Cheng; Li, Chenyan; Li, Yongze; Teng, Xiaochun; Fan, Chenling; Liu, Aihua; Shan, Zhongyan; Liu, Chao; Weng, Jianping; Teng, Weiping

    2016-01-01

    Abstract The purpose of this study was to establish normal thyroglobulin (Tg) reference intervals (RIs) in regions with adequate and more than adequate iodine intake according to the National Academy of Clinical Biochemistry (NACB) guidelines and to investigate the relationships between Tg and other factors. A total of 1317 thyroid disease-free adult subjects (578 men, 739 nonpregnant women) from 2 cities (Guangzhou and Nanjing) were enrolled in this retrospective, observational study. Each subject completed a questionnaire and underwent physical and ultrasonic examination. Serum Tg, thyroid-stimulating hormone (TSH), thyroid peroxidase antibody (TPOAb), Tg antibody (TgAb), and urinary iodine concentration (UIC) were measured. Reference groups were established on the basis of TSH levels: 0.5 to 2.0 and 0.27 to 4.2 mIU/L. The Tg RIs for Guangzhou and Nanjing were 1.6 to 30.0 and 1.9 to 25.8 ng/mL, respectively. No significant differences in Tg were found between genders or among different reference groups. Stepwise linear regression analyses showed that TgAb, thyroid volume, goiter, gender, age, and TSH levels were correlated with Tg. In adults from regions with adequate and more than adequate iodine intake, we found that Tg may be a suitable marker of iodine status; gender-specific Tg RI was unnecessary; there was no difference between Tg RIs in regions with adequate and more than adequate iodine intake; and the TSH criterion for selecting the Tg reference population could follow the local TSH reference rather than 0.5 to 2.0 mIU/L. PMID:27902589

  10. Evaluation of Strip Footing Bearing Capacity Built on the Anthropogenic Embankment by Random Finite Element Method

    NASA Astrophysics Data System (ADS)

    Pieczynska-Kozlowska, Joanna

    2014-05-01

    One geotechnical problem in the area of Wroclaw is an anthropogenic embankment layer extending to a depth of 4-5 m, deposited as a result of historical events. In such a case, estimating the bearing capacity of a strip footing can be difficult. The standard solution is to use a deep foundation or foundation soil replacement; however, both methods generate significant costs. In the present paper the author focused on the influence of the anthropogenic embankment's variability on bearing capacity. Soil parameters were defined on the basis of CPT tests and modeled as 2D anisotropic random fields, and the bearing capacity was evaluated by the deterministic finite element method. Many repetitions over different realizations of the random fields lead to a stable expected value of the bearing capacity. The algorithm used to estimate the bearing capacity of the strip footing was the random finite element method (e.g. [1]). In the traditional approach to bearing capacity, the formula proposed by [2] is taken into account: qf = c'Nc + qNq + 0.5γBNγ (1), where qf is the ultimate bearing stress, c' is the cohesion, q is the overburden load due to foundation embedment, γ is the soil unit weight, B is the footing width, and Nc, Nq and Nγ are the bearing capacity factors. The method of evaluating the bearing capacity of a strip footing based on the finite element method incorporates five parameters: Young's modulus (E), Poisson's ratio (ν), dilation angle (ψ), cohesion (c), and friction angle (φ). In the present study E, ν and ψ are held constant while c and φ are randomized. Although the Young's modulus does not affect the bearing capacity, it governs the initial elastic response of the soil. Plastic stress redistribution is accomplished using a viscoplastic algorithm merged with an elastic perfectly plastic (Mohr-Coulomb) failure criterion. In this paper a typical finite element mesh was assumed, with 8-node elements in 50 columns and 20 rows. Footing width B
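    The bearing capacity factors in formula (1) depend only on the friction angle. A minimal sketch using the classical closed-form expressions (Reissner's Nq, Prandtl's Nc, Vesic's Nγ; the paper's reference [2] may define Nγ differently), with hypothetical soil parameters:

```python
import math

def bearing_capacity_factors(phi_deg):
    """Nc, Nq, Ngamma for friction angle phi in degrees (Ngamma per Vesic)."""
    phi = math.radians(phi_deg)
    Nq = math.exp(math.pi * math.tan(phi)) * math.tan(math.pi / 4 + phi / 2) ** 2
    Nc = (Nq - 1.0) / math.tan(phi)
    Ng = 2.0 * (Nq + 1.0) * math.tan(phi)
    return Nc, Nq, Ng

def qf(c, q, gamma, B, phi_deg):
    """Ultimate bearing stress per formula (1): qf = c*Nc + q*Nq + 0.5*gamma*B*Ngamma."""
    Nc, Nq, Ng = bearing_capacity_factors(phi_deg)
    return c * Nc + q * Nq + 0.5 * gamma * B * Ng

# Hypothetical values: c = 10 kPa, no surcharge, gamma = 18 kN/m3, B = 1 m, phi = 30 deg
print(round(qf(10.0, 0.0, 18.0, 1.0, 30.0), 1))
```

In the random finite element method these deterministic factors are replaced by repeated FE analyses over realizations of the c and φ random fields.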

  11. Twin-image reduction method for in-line digital holography using periphery and random reference phase-shifting techniques

    NASA Astrophysics Data System (ADS)

    Oshima, Teppei; Matsudo, Yusuke; Kakue, Takashi; Arai, Daisuke; Shimobaba, Tomoyoshi; Ito, Tomoyoshi

    2015-09-01

    Digital holography suffers from the twin-image problem: unwanted light (the conjugate and direct components) overlaps the object light in the reconstruction process. As a method for extracting only the object light, phase-shifting digital holography is widely used; however, it is not applicable to the observation of moving objects, because it requires recording plural holograms. In this study, we propose a twin-image reduction method combining the "periphery" method with the "random phase-shifting" method. The proposed method succeeded in improving the reconstruction quality compared to other one-shot recording methods ("parallel phase-shifting digital holography" and "random phase-shifting").

  12. Recommended Minimum Test Requirements and Test Methods for Assessing Durability of Random-Glass-Fiber Composites

    SciTech Connect

    Battiste, R.L.; Corum, J.M.; Ren, W.; Ruggles, M.B.

    1999-06-01

    This report provides recommended minimum test requirements and suggested test methods for establishing the durability properties and characteristics of candidate random-glass-fiber polymeric composites for automotive structural applications. The recommendations and suggestions are based on experience and results developed at Oak Ridge National Laboratory (ORNL) under a US Department of Energy Advanced Automotive Materials project entitled ''Durability of Lightweight Composite Structures,'' which is closely coordinated with the Automotive Composites Consortium. The report is intended as an aid to suppliers offering new structural composites for automotive applications and to testing organizations that are called on to characterize the composites.

  13. Method to modify random matrix theory using short-time behavior in chaotic systems.

    PubMed

    Smith, A Matthew; Kaplan, Lev

    2009-09-01

    We discuss a modification to random matrix theory (RMT) eigenstate statistics that systematically takes into account the nonuniversal short-time behavior of chaotic systems. The method avoids diagonalization of the Hamiltonian, instead requiring only knowledge of short-time dynamics for a chaotic system or ensemble of similar systems. Standard RMT and semiclassical predictions are recovered in the limits of zero Ehrenfest time and infinite Heisenberg time, respectively. As examples, we discuss wave-function autocorrelations and cross correlations and show how the approach leads to a significant improvement in the accuracy for simple chaotic systems where comparison can be made with brute-force diagonalization.

  14. Statistical Comparison and Improvement of Methods for Combining Random and Harmonic Loads

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; McGhee, David S.

    2004-01-01

    Structures in many environments experience both random and harmonic excitation. A variety of closed-form techniques has been used in the aerospace industry to combine the loads resulting from the two sources. The resulting combined loads are then used to design for both yield/ultimate strength and high-cycle fatigue capability. This paper examines the cumulative distribution function (CDF) percentiles obtained using each method by integrating the joint probability density function of the sine and random components. A new Microsoft Excel spreadsheet macro that links with the software program Mathematica is then used to calculate the combined value corresponding to any desired percentile, along with a curve fit to this value. Another Excel macro is used to calculate the combination using a Monte Carlo simulation. Unlike the traditional techniques, these methods quantify the calculated load value with a consistent percentile. Using either of the presented methods can be extremely valuable in probabilistic design, which requires a statistical characterization of the loading. Also, since the CDF at high probability levels is very flat, the design value is extremely sensitive to the predetermined percentile; therefore, applying the new techniques can lower the design loading substantially without losing any of the identified structural reliability.
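    A minimal Python sketch of the Monte Carlo combination described above: the harmonic load is sampled at a uniformly random phase, added to a zero-mean Gaussian random load, and the design value is read off at a chosen CDF percentile. The amplitude, RMS level, and percentile below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
A, sigma = 3.0, 1.0          # harmonic amplitude and random-load RMS (assumed units)
n = 200_000

# Harmonic component sampled at a uniformly random phase, plus a
# zero-mean Gaussian random component.
combined = A * np.sin(rng.uniform(0, 2 * np.pi, n)) + rng.normal(0, sigma, n)

# Design load at a chosen CDF percentile (99.87%, the "3-sigma" level).
design_load = np.quantile(combined, 0.9987)
print(round(design_load, 2))
```

Note that this percentile-consistent value is lower than the deterministic "peak plus 3σ" combination (A + 3σ = 6.0 here), illustrating the paper's point about conservative traditional techniques.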

  15. Statistical Evaluation and Improvement of Methods for Combining Random and Harmonic Loads

    NASA Technical Reports Server (NTRS)

    Brown, A. M.; McGhee, D. S.

    2003-01-01

    Structures in many environments experience both random and harmonic excitation. A variety of closed-form techniques has been used in the aerospace industry to combine the loads resulting from the two sources. The resulting combined loads are then used to design for both yield/ultimate strength and high-cycle fatigue capability. This Technical Publication examines the cumulative distribution function (CDF) percentiles obtained using each method by integrating the joint probability density function of the sine and random components. A new Microsoft Excel spreadsheet macro that links with the software program Mathematica to calculate the combined value corresponding to any desired percentile is then presented, along with a curve fit to this value. Another Excel macro that calculates the combination using Monte Carlo simulation is shown. Unlike the traditional techniques, these methods quantify the calculated load value with a consistent percentile. Using either of the presented methods can be extremely valuable in probabilistic design, which requires a statistical characterization of the loading. Additionally, since the CDF at high probability levels is very flat, the design value is extremely sensitive to the predetermined percentile; therefore, applying the new techniques can substantially lower the design loading without losing any of the identified structural reliability.

  16. A Novel Hepatocellular Carcinoma Image Classification Method Based on Voting Ranking Random Forests

    PubMed Central

    Xia, Bingbing; Jiang, Huiyan; Liu, Huiling; Yi, Dehui

    2016-01-01

    This paper proposed a novel voting ranking random forests (VRRF) method for solving the hepatocellular carcinoma (HCC) image classification problem. First, in the preprocessing stage, bilateral filtering was applied to hematoxylin-eosin (HE) pathological images. Next, the filtered image was segmented to obtain three kinds of images: a single binary cell image, a single minimum exterior rectangle cell image, and a single cell image of size n⁎n. After that, atypia features were defined, including auxiliary circularity, amendment circularity, and cell symmetry. In addition, shape features, fractal dimension features, and several gray-level features were extracted, such as the Local Binary Patterns (LBP) feature, the Gray Level Co-occurrence Matrix (GLCM) feature, and Tamura features. Finally, an HCC image classification model based on random forests was proposed and further optimized by the voting ranking method. The experimental results showed that the proposed features combined with the VRRF method perform well on the HCC image classification problem. PMID:27293477
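    The paper's auxiliary and amendment circularity features are its own variants, defined in the paper itself; the standard isoperimetric circularity they build on, 4πA/P² (1.0 for a circle, smaller for elongated cells), can be sketched on a cell contour as follows:

```python
import math

def circularity(points):
    """Isoperimetric circularity 4*pi*A/P^2 of a closed polygon (1.0 = circle)."""
    n = len(points)
    area = 0.0
    perim = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1          # shoelace formula
        perim += math.hypot(x2 - x1, y2 - y1)
    area = abs(area) / 2.0
    return 4.0 * math.pi * area / perim ** 2

# A regular 64-gon approximates a circle: circularity close to 1.
poly = [(math.cos(2 * math.pi * k / 64), math.sin(2 * math.pi * k / 64)) for k in range(64)]
print(round(circularity(poly), 3))
# A 1x4 rectangle is elongated: circularity well below 1.
rect = [(0, 0), (4, 0), (4, 1), (0, 1)]
print(round(circularity(rect), 3))
```

In a real pipeline the contour points would come from the segmented single-cell images described in the abstract.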

  17. A Novel Hepatocellular Carcinoma Image Classification Method Based on Voting Ranking Random Forests.

    PubMed

    Xia, Bingbing; Jiang, Huiyan; Liu, Huiling; Yi, Dehui

    2015-01-01

    This paper proposed a novel voting ranking random forests (VRRF) method for solving the hepatocellular carcinoma (HCC) image classification problem. First, in the preprocessing stage, bilateral filtering was applied to hematoxylin-eosin (HE) pathological images. Next, the filtered image was segmented to obtain three kinds of images: a single binary cell image, a single minimum exterior rectangle cell image, and a single cell image of size n⁎n. After that, atypia features were defined, including auxiliary circularity, amendment circularity, and cell symmetry. In addition, shape features, fractal dimension features, and several gray-level features were extracted, such as the Local Binary Patterns (LBP) feature, the Gray Level Co-occurrence Matrix (GLCM) feature, and Tamura features. Finally, an HCC image classification model based on random forests was proposed and further optimized by the voting ranking method. The experimental results showed that the proposed features combined with the VRRF method perform well on the HCC image classification problem.

  18. Methods of learning in statistical education: Design and analysis of a randomized trial

    NASA Astrophysics Data System (ADS)

    Boyd, Felicity Turner

    Background. Recent psychological and technological advances suggest that active learning may enhance understanding and retention of statistical principles. A randomized trial was designed to evaluate the addition of innovative instructional methods within didactic biostatistics courses for public health professionals. Aims. The primary objectives were to evaluate and compare the addition of two active learning methods (cooperative and internet) on students' performance; assess their impact on performance after adjusting for differences in students' learning style; and examine the influence of learning style on trial participation. Methods. Consenting students enrolled in a graduate introductory biostatistics course were randomized to cooperative learning, internet learning, or control after completing a pretest survey. The cooperative learning group participated in eight small group active learning sessions on key statistical concepts, while the internet learning group accessed interactive mini-applications on the same concepts. Controls received no intervention. Students completed evaluations after each session and a post-test survey. Study outcome was performance quantified by examination scores. Intervention effects were analyzed by generalized linear models using intent-to-treat analysis and marginal structural models accounting for reported participation. Results. Of 376 enrolled students, 265 (70%) consented to randomization; 69, 100, and 96 students were randomized to the cooperative, internet, and control groups, respectively. Intent-to-treat analysis showed no differences between study groups; however, 51% of students in the intervention groups had dropped out after the second session. After accounting for reported participation, expected examination scores were 2.6 points higher (of 100 points) after completing one cooperative learning session (95% CI: 0.3, 4.9) and 2.4 points higher after one internet learning session (95% CI: 0.0, 4.7), versus

  19. An alternative pseudolikelihood method for multivariate random-effects meta-analysis

    PubMed Central

    Chen, Yong; Hong, Chuan; Riley, Richard D

    2015-01-01

    Recently, multivariate random-effects meta-analysis models have received a great deal of attention, despite their greater complexity compared to univariate meta-analyses. One of their advantages is the ability to account for within-study and between-study correlations. However, the standard inference procedures, such as maximum likelihood or maximum restricted likelihood inference, require the within-study correlations, which are usually unavailable. In addition, the standard procedures suffer from the problem of a singular estimated covariance matrix. In this paper, we propose a pseudolikelihood method to overcome these problems. The pseudolikelihood method does not require within-study correlations and is not prone to the singular covariance matrix problem. In addition, it can properly estimate the covariance between pooled estimates for different outcomes, which enables valid inference on functions of pooled estimates, and it can be applied to meta-analyses where some studies have outcomes missing completely at random. Simulation studies show that the pseudolikelihood method provides unbiased estimates for functions of pooled estimates, well-estimated standard errors, and confidence intervals with good coverage probability. Furthermore, the pseudolikelihood method is found to maintain high relative efficiency compared to that of the standard inferences with known within-study correlations. We illustrate the proposed method through three meta-analyses: for comparison of prostate cancer treatments, for the association between paraoxonase 1 activities and coronary heart disease, and for the association between homocysteine level and coronary heart disease. © 2014 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd. PMID:25363629
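    The univariate random-effects pooling that a working-independence pseudolikelihood applies to each outcome can be sketched with the standard DerSimonian-Laird estimator (hypothetical effect sizes and within-study variances; the paper's full method additionally estimates the covariance between the pooled estimates, which this sketch omits):

```python
import numpy as np

def dersimonian_laird(y, v):
    """Univariate random-effects pooling with the DerSimonian-Laird tau^2."""
    y, v = np.asarray(y, float), np.asarray(v, float)
    w = 1.0 / v
    yfix = np.sum(w * y) / np.sum(w)                 # fixed-effect estimate
    Q = np.sum(w * (y - yfix) ** 2)                  # Cochran's Q
    k = len(y)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (k - 1)) / c)               # between-study variance
    wstar = 1.0 / (v + tau2)
    pooled = np.sum(wstar * y) / np.sum(wstar)
    se = np.sqrt(1.0 / np.sum(wstar))
    return pooled, se, tau2

# Hypothetical log odds ratios and within-study variances from 5 studies.
pooled, se, tau2 = dersimonian_laird([0.2, 0.5, -0.1, 0.4, 0.3],
                                     [0.04, 0.09, 0.05, 0.16, 0.02])
print(round(pooled, 3), round(se, 3))
```

Repeating this per outcome and then correcting the joint covariance via a sandwich-type estimator is the spirit of the pseudolikelihood approach.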

  20. Adequate mathematical modelling of environmental processes

    NASA Astrophysics Data System (ADS)

    Chashechkin, Yu. D.

    2012-04-01

    In environmental observations and laboratory visualization, both large-scale flow components (currents, jets, vortices, waves) and a fine structure are registered (different examples are given). Conventional mathematical modeling, both analytical and numerical, is directed mostly at the description of energetically important flow components; the role of fine structures still remains obscure. The variety of existing models makes it difficult to choose the most adequate one and to assess their degree of correspondence. The goal of the talk is to give a scrutinizing analysis of the kinematics and dynamics of flows. A difference between the concept of "motion" as a transformation of a vector space into itself with distance conservation and the concept of "flow" as displacement and rotation of deformable "fluid particles" is underlined. Basic physical quantities of the flow, namely density, momentum, energy (entropy), and admixture concentration, are selected as the physical parameters defined by the fundamental set, which includes the differential D'Alembert, Navier-Stokes, Fourier and/or Fick equations and a closing equation of state. All of them are observable and independent. Calculations of continuous Lie groups show that only the fundamental set is characterized by the ten-parameter Galilean group reflecting the basic principles of mechanics. The presented analysis demonstrates that conventionally used approximations dramatically change the symmetries of the governing equation sets, which leads to their incompatibility or even degeneration. The fundamental set is analyzed taking into account the condition of compatibility. The high order of the set indicates a complex structure of complete solutions corresponding to the physical structure of real flows.
Analytical solutions of a number of problems, including flows induced by diffusion on topography and generation of periodic internal waves by compact sources in weakly dissipative media, as well as numerical solutions of the same

  1. An improved random walk algorithm for the implicit Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Keady, Kendra P.; Cleveland, Mathew A.

    2017-01-01

    In this work, we introduce a modified Implicit Monte Carlo (IMC) Random Walk (RW) algorithm, which increases simulation efficiency for multigroup radiative transfer problems with strongly frequency-dependent opacities. To date, the RW method has only been implemented in "fully-gray" form; that is, the multigroup IMC opacities are group-collapsed over the full frequency domain of the problem to obtain a gray diffusion problem for RW. This formulation works well for problems with large spatial cells and/or opacities that are weakly dependent on frequency; however, the efficiency of the RW method degrades when the spatial cells are thin or the opacities are a strong function of frequency. To address this inefficiency, we introduce a RW frequency group cutoff in each spatial cell, which divides the frequency domain into optically thick and optically thin components. In the modified algorithm, opacities for the RW diffusion problem are obtained by group-collapsing IMC opacities below the frequency group cutoff. Particles with frequencies above the cutoff are transported via standard IMC, while particles below the cutoff are eligible for RW. This greatly increases the total number of RW steps taken per IMC time-step, which in turn improves the efficiency of the simulation. We refer to this new method as Partially-Gray Random Walk (PGRW). We present numerical results for several multigroup radiative transfer problems, which show that the PGRW method is significantly more efficient than standard RW for several problems of interest. In general, PGRW decreases runtimes by a factor of ∼2-4 compared to standard RW, and a factor of ∼3-6 compared to standard IMC. While PGRW is slower than frequency-dependent Discrete Diffusion Monte Carlo (DDMC), it is also easier to adapt to unstructured meshes and can be used in spatial cells where DDMC is not applicable. This suggests that it may be optimal to employ both DDMC and PGRW in a single simulation.
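    The group-collapse step can be sketched as a weighted average of the multigroup opacities below the cutoff. The abstract does not specify the weighting, so Planck-style emission weights are assumed here, with hypothetical values:

```python
import numpy as np

def collapse_below_cutoff(sigma_g, weight_g, cutoff):
    """Weighted group-collapse of multigroup opacities for groups below cutoff.

    sigma_g  : per-group opacities
    weight_g : per-group spectral weights (Planck emission weights assumed)
    cutoff   : first group index kept for standard frequency-dependent IMC
    """
    s = np.asarray(sigma_g[:cutoff], float)
    w = np.asarray(weight_g[:cutoff], float)
    return np.sum(w * s) / np.sum(w)

sigma = [50.0, 20.0, 5.0, 0.5]     # opacities, optically thick -> thin (hypothetical)
weight = [0.4, 0.3, 0.2, 0.1]      # hypothetical emission weights
collapsed = collapse_below_cutoff(sigma, weight, cutoff=2)
print(round(collapsed, 2))
```

Particles in groups 0 and 1 would then take random-walk steps using this single collapsed opacity, while particles in groups 2 and 3 are transported with standard IMC.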

  2. Finite-element method for calculation of the effective permittivity of random inhomogeneous media

    NASA Astrophysics Data System (ADS)

    Myroshnychenko, Viktor; Brosseau, Christian

    2005-01-01

    The challenge of designing new solid-state materials from calculations performed with the help of computers applied to models of spatial randomness has attracted an increasing amount of interest in recent years. In particular, dispersions of particles in a host matrix are scientifically and technologically important for a variety of reasons. Herein, we report our development of an efficient computer code to calculate the effective (bulk) permittivity of two-phase disordered composite media consisting of hard circular disks made of a lossless dielectric (permittivity ɛ2 ) randomly placed in a plane made of a lossless homogeneous dielectric (permittivity ɛ1 ) at different surface fractions. Specifically, the method is based on (i) a finite-element description of composites in which both the host and the randomly distributed inclusions are isotropic phases, and (ii) an ordinary Monte Carlo sampling. Periodic boundary conditions are employed throughout the simulation and various numbers of disks have been considered in the calculations. From this systematic study, we show how the number of Monte Carlo steps needed to achieve equilibrated distributions of disks increases monotonically with the surface fraction. Furthermore, a detailed study is made of the dependence of the results on a minimum separation distance between disks. Numerical examples are presented to connect the macroscopic property such as the effective permittivity to microstructural characteristics such as the mean coordination number and radial distribution function. In addition, several approximate effective medium theories, exact bounds, exact results for two-dimensional regular arrays, and the exact dilute limit are used to test and validate the finite-element algorithm. 
Numerical results indicate that the fourth-order bounds provide an excellent estimate of the effective permittivity for a wide range of surface fractions, in accordance with the fact that the bounds become progressively narrower as
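    The random placement of hard disks with a minimum separation distance under periodic boundary conditions can be sketched with simple rejection sampling (box size, radius, and separation below are hypothetical):

```python
import random

def place_disks(n, radius, min_sep, box=1.0, seed=0, max_tries=100000):
    """Randomly place n non-overlapping hard disks in a periodic unit box.

    min_sep is an extra center-to-center clearance beyond 2*radius,
    mirroring the minimum separation distance studied in the paper.
    """
    rng = random.Random(seed)
    centers = []
    d2 = (2 * radius + min_sep) ** 2
    for _ in range(max_tries):
        if len(centers) == n:
            break
        x, y = rng.random() * box, rng.random() * box
        ok = True
        for cx, cy in centers:
            # Minimum-image distance under periodic boundary conditions.
            dx = min(abs(x - cx), box - abs(x - cx))
            dy = min(abs(y - cy), box - abs(y - cy))
            if dx * dx + dy * dy < d2:
                ok = False
                break
        if ok:
            centers.append((x, y))
    return centers

disks = place_disks(n=50, radius=0.03, min_sep=0.005)
print(len(disks))
```

A finite-element mesh would then resolve each realization to compute the effective permittivity, with Monte Carlo moves equilibrating the disk configuration between samples.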

  3. Method of successive approximations in the theory of stimulated Raman scattering of a randomly modulated pump

    NASA Astrophysics Data System (ADS)

    Krochik, G. M.

    1980-02-01

    Stimulated Raman scattering of a randomly modulated pump is investigated by the method of successive approximations. This involves expanding solutions in terms of small parameters, which are ratios of the correlation scales of random effects to other characteristic dynamic scales of the problem. Systems of closed equations are obtained for the moments of the amplitudes of the Stokes and pump waves and of the molecular vibrations. These describe the dynamics of the process allowing for changes in the pump intensity and statistics due to a three-wave interaction. By analyzing equations in higher-order approximations, it is possible to establish the conditions of validity of the first (Markov) and second approximations. In particular, it is found that these are valid for pump intensities JL both above and below the critical value Jcr near which the gain begins to increase rapidly and reproduction of the pump spectrum by the Stokes wave is initiated. Solutions are obtained for average intensities of the Stokes wave and molecular vibrations in the first approximation in a constant pump field. It is established that, for JL ≳ Jcr, the Stokes wave undergoes rapid nonsteady-state amplification which is associated with an increase in the amplitude of the molecular vibrations. The results of the calculations show good agreement with known experimental data.

  4. A comparative study of energy minimization methods for Markov random fields with smoothness-based priors.

    PubMed

    Szeliski, Richard; Zabih, Ramin; Scharstein, Daniel; Veksler, Olga; Kolmogorov, Vladimir; Agarwala, Aseem; Tappen, Marshall; Rother, Carsten

    2008-06-01

    Among the most exciting advances in early vision has been the development of efficient energy minimization algorithms for pixel-labeling tasks such as depth or texture computation. It has been known for decades that such problems can be elegantly expressed as Markov random fields, yet the resulting energy minimization problems have been widely viewed as intractable. Recently, algorithms such as graph cuts and loopy belief propagation (LBP) have proven to be very powerful: for example, such methods form the basis for almost all the top-performing stereo methods. However, the tradeoffs among different energy minimization algorithms are still not well understood. In this paper we describe a set of energy minimization benchmarks and use them to compare the solution quality and running time of several common energy minimization algorithms. We investigate three promising recent methods (graph cuts, LBP, and tree-reweighted message passing) in addition to the well-known older iterated conditional modes (ICM) algorithm. Our benchmark problems are drawn from published energy functions used for stereo, image stitching, interactive segmentation, and denoising. We also provide a general-purpose software interface that allows vision researchers to easily switch between optimization methods. Benchmarks, code, images, and results are available at http://vision.middlebury.edu/MRF/.
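    Of the benchmarked algorithms, ICM is the simplest to sketch: each pixel is greedily set to the label minimizing its local energy under a data term plus a pairwise smoothness prior (the energy weights below are hypothetical, for a toy binary denoising problem):

```python
def icm_denoise(obs, beta=1.5, iters=5):
    """Iterated conditional modes for a binary MRF with a smoothness prior.

    Energy: sum_i (x_i - y_i)^2 + beta * sum_{4-neighbors} [x_i != x_j].
    Greedily sets each pixel to the label minimizing its local energy.
    """
    h, w = len(obs), len(obs[0])
    x = [row[:] for row in obs]
    for _ in range(iters):
        for i in range(h):
            for j in range(w):
                best_label, best_e = None, float("inf")
                for lab in (0, 1):
                    e = (lab - obs[i][j]) ** 2          # data term
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < h and 0 <= nj < w and x[ni][nj] != lab:
                            e += beta                    # smoothness penalty
                    if e < best_e:
                        best_label, best_e = lab, e
                x[i][j] = best_label
    return x

# A 5x5 all-zero image with one flipped "noise" pixel in the middle.
noisy = [[0] * 5 for _ in range(5)]
noisy[2][2] = 1
clean = icm_denoise(noisy)
print(clean[2][2])   # → 0: the smoothness prior outweighs the data term
```

ICM converges to a local minimum only; graph cuts and message-passing methods find much lower-energy solutions on the paper's benchmarks, which is the point of the comparison.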

  5. Critical Appraisal of Methods Used in Randomized Controlled Trials of Treatments for Temporomandibular Disorders

    PubMed Central

    Fricton, James R.; Ouyang, Wei; Nixdorf, Donald R.; Schiffman, Eric L.; Velly, Ana Miriam; Look, John O.

    2015-01-01

    Aims To evaluate the quality of methods used in randomized controlled trials (RCTs) of treatments for management of pain and dysfunction associated with temporomandibular muscle and joint disorders (TMJD) and to discuss the implications for future RCTs. Methods A systematic review was made of RCTs that were implemented from 1966 through March 2006, to evaluate six types of treatments for TMJD: orthopedic appliances, occlusal therapy, physical medicine modalities, pharmacologic therapy, cognitive-behavioral and psychological therapy, and temporomandibular joint surgery. A quality assessment of 210 published RCTs assessing the internal and external validity of these RCTs was conducted using the Consolidated Standards of Reporting Trials (CONSORT) criteria adapted to the methods of the studies. Results Independent assessments by raters demonstrated consistency with a mean intraclass correlation coefficient of 0.63 (95% confidence interval). The mean percent of criteria met was 58%, with only 10% of the RCTs meeting the four most important criteria. Conclusions Much of the evidence base for TMJD treatments may be susceptible to systematic bias and most past studies should be interpreted with caution. However, a scatter plot of RCT quality versus year of publication shows improvement in RCT quality over time, suggesting that future studies may continue to improve methods that minimize bias. PMID:20401352

  6. Systematic method for electrical characterization of random telegraph noise in MOSFETs

    NASA Astrophysics Data System (ADS)

    Marquez, Carlos; Rodriguez, Noel; Gamiz, Francisco; Ohata, Akiko

    2017-02-01

    This work introduces a new protocol which aims to facilitate massive on-wafer characterization of Random Telegraph Noise (RTN) in MOS transistors. The methodology combines noise spectral density scanning by gate bias with a modified Weighted Time Lag Plot algorithm to identify unequivocally the single-trap RTN signals in optimum bias conditions for their electrical characterization. The strength of the method is demonstrated by applying it to monitor the distribution of traps across the transistors of an SOI wafer. The influence of the back-gate bias on the RTN characteristics of the SOI devices with coupled front and back interfaces has revealed unusual characteristics compatible with carrier emission to the gate metal contact.
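    The basic (unweighted) time-lag plot underlying the paper's modified Weighted Time Lag Plot algorithm pairs each sample with its successor; a two-level RTN signal then appears as two clusters on the diagonal. A sketch on a simulated telegraph signal (dwell probability and noise level are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate a two-level random telegraph signal: geometrically distributed
# dwell times between levels 0.0 and 1.0, plus measurement noise.
n, p_switch = 20_000, 0.01
state = np.cumsum(rng.random(n) < p_switch) % 2
signal = state.astype(float) + rng.normal(0, 0.05, n)

# Time-lag plot: pairs (x_t, x_{t+1}). A clean two-level RTN signal
# concentrates in two clusters on the diagonal, one per current level.
hist, xedges, yedges = np.histogram2d(signal[:-1], signal[1:], bins=40)
i, j = np.unravel_index(np.argmax(hist), hist.shape)
peak_x = 0.5 * (xedges[i] + xedges[i + 1])
peak_y = 0.5 * (yedges[j] + yedges[j + 1])
print(round(abs(peak_x - peak_y), 2))   # strongest cluster lies on the diagonal
```

The weighted variant in the paper refines this histogram-based picture so that single-trap signals can be identified automatically across a wafer.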

  7. A new hierarchical method for inter-patient heartbeat classification using random projections and RR intervals

    PubMed Central

    2014-01-01

    Background The inter-patient classification schema and the Association for the Advancement of Medical Instrumentation (AAMI) standards are important to the construction and evaluation of automated heartbeat classification systems. The majority of previously proposed methods that take the above two aspects into consideration use the same features and classification method to classify different classes of heartbeats. The performance of the classification system is often unsatisfactory with respect to the ventricular ectopic beat (VEB) and supraventricular ectopic beat (SVEB). Methods Based on the different characteristics of VEB and SVEB, a novel hierarchical heartbeat classification system was constructed. This was done in order to improve the classification performance of these two classes of heartbeats by using different features and classification methods. First, random projection and support vector machine (SVM) ensemble were used to detect VEB. Then, the ratio of the RR interval was compared to a predetermined threshold to detect SVEB. The optimal parameters for the classification models were selected on the training set and used in the independent testing set to assess the final performance of the classification system. Meanwhile, the effect of different lead configurations on the classification results was evaluated. Results Results showed that the performance of this classification system was notably superior to that of other methods. The VEB detection sensitivity was 93.9% with a positive predictive value of 90.9%, and the SVEB detection sensitivity was 91.1% with a positive predictive value of 42.2%. In addition, this classification process was relatively fast. Conclusions A hierarchical heartbeat classification system was proposed based on the inter-patient data division to detect VEB and SVEB. It demonstrated better classification performance than existing methods. 
It can be regarded as a promising system for detecting VEB and SVEB of unknown patients in
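The two-stage decision logic described in this record can be sketched as follows. This is an illustrative sketch, not the authors' code: the projection dimension, the RR-ratio threshold of 1.5, and the stand-in `veb_detector` callable (replacing the paper's SVM ensemble) are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_projection(features, k=12):
    # Reduce the beat's feature vector to k dimensions with a
    # Gaussian random projection matrix (dimension k is illustrative)
    proj = rng.normal(size=(features.shape[0], k)) / np.sqrt(k)
    return features @ proj

def classify_beat(features, rr_ratio, veb_detector, rr_threshold=1.5):
    # Stage 1: detect ventricular ectopic beats; the paper uses an SVM
    # ensemble on randomly projected features, stubbed here as a callable
    if veb_detector(random_projection(features)):
        return "VEB"
    # Stage 2: flag supraventricular ectopic beats when the RR-interval
    # ratio exceeds a predetermined threshold
    if rr_ratio > rr_threshold:
        return "SVEB"
    return "N"
```

In the paper, stage 1 would be a trained SVM ensemble; here any boolean-returning callable can be plugged in, which keeps the hierarchical structure visible.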

  8. a Method to Estimate Temporal Interaction in a Conditional Random Field Based Approach for Crop Recognition

    NASA Astrophysics Data System (ADS)

    Diaz, P. M. A.; Feitosa, R. Q.; Sanches, I. D.; Costa, G. A. O. P.

    2016-06-01

    This paper presents a method to estimate the temporal interaction in a Conditional Random Field (CRF) based approach for crop recognition from multitemporal remote sensing image sequences. This approach models the phenology of different crop types as a CRF. Interaction potentials are assumed to depend only on the class labels of an image site at two consecutive epochs. In the proposed method, the estimation of temporal interaction parameters is considered as an optimization problem, whose goal is to find the transition matrix that maximizes the CRF performance on a set of labelled data. The objective functions underlying the optimization procedure can be formulated in terms of different accuracy metrics, such as overall and average class accuracy per crop or phenological stages. To validate the proposed approach, experiments were carried out on a dataset consisting of 12 co-registered LANDSAT images of a region in the southeast of Brazil. Pattern Search was used as the optimization algorithm. The experimental results demonstrated that the proposed method was able to substantially outperform estimates related to joint or conditional class transition probabilities, which rely on training samples.
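As a sketch of the optimization view taken here (find the transition matrix that maximizes labelling accuracy on training data), the following uses plain random search as a stand-in for Pattern Search and a greedy temporal decoder in place of full CRF inference; all names and sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def decode(unary, T):
    # Greedy temporal decoding: combine per-epoch class scores with the
    # transition weights from the previous epoch's label
    labels = [int(np.argmax(unary[0]))]
    for t in range(1, len(unary)):
        labels.append(int(np.argmax(unary[t] + T[labels[-1]])))
    return labels

def estimate_transitions(unary, truth, n_classes, iters=200):
    # Random search over transition matrices, keeping the one that
    # maximizes labelling accuracy (a crude stand-in for Pattern Search)
    best_T, best_acc = np.zeros((n_classes, n_classes)), -1.0
    for _ in range(iters):
        T = rng.normal(size=(n_classes, n_classes))
        acc = float(np.mean([l == t for l, t in zip(decode(unary, T), truth)]))
        if acc > best_acc:
            best_T, best_acc = T, acc
    return best_T, best_acc
```

The objective could equally be average class accuracy per crop or phenological stage, as the abstract notes; only the scoring line changes.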

  9. Effects of Pilates method in elderly people: Systematic review of randomized controlled trials.

    PubMed

    de Oliveira Francisco, Cristina; de Almeida Fagundes, Alessandra; Gorges, Bruna

    2015-07-01

    The Pilates method has been widely used in physical training and rehabilitation. Evidence regarding the effectiveness of this method in elderly people is limited. Six randomized controlled trials involving the use of the Pilates method for elderly people, published prior to December 2013, were selected from the databases PubMed, MEDLINE, Embase, Cochrane, Scielo and PEDro. Three articles suggested that Pilates produced improvements in balance. Two studies evaluated adherence to Pilates programs. One study assessed Pilates' influence on cardio-metabolic parameters and another study evaluated changes in body composition. Strong evidence was found regarding beneficial effects of Pilates on static and dynamic balance in women. Nevertheless, evidence of balance improvement in both genders, of changes in body composition in women, and of adherence to Pilates programs was limited. Effects of Pilates training on cardio-metabolic parameters presented inconclusive results. Pilates may be a useful tool in rehabilitation and prevention programs, but more high quality studies are necessary to establish all its effects on elderly populations.

  10. Factors associated with adequate weekly reporting for disease surveillance data among health facilities in Nairobi County, Kenya, 2013

    PubMed Central

    Mwatondo, Athman Juma; Ng'ang'a, Zipporah; Maina, Caroline; Makayotto, Lyndah; Mwangi, Moses; Njeru, Ian; Arvelo, Wences

    2016-01-01

    Introduction Kenya adopted the Integrated Disease Surveillance and Response (IDSR) strategy in 1998 to strengthen disease surveillance and epidemic response. However, the goal of weekly surveillance reporting among health facilities has not been achieved. We conducted a cross-sectional study to determine the prevalence of adequate reporting and factors associated with IDSR reporting among health facilities in one Kenyan County. Methods Health facilities (public and private) were enrolled using stratified random sampling from 348 facilities prioritized for routine surveillance reporting. Adequately reporting facilities were defined as those which submitted >10 weekly reports during a twelve-week period, and poorly reporting facilities were those which submitted <10 weekly reports. Multivariate logistic regression with backward selection was used to identify risk factors associated with adequate reporting. Results From September 2 through November 30, 2013, we enrolled 175 health facilities; 130 (74%) were private and 45 (26%) were public. Of the 175 health facilities, 77 (44%) were classified as reporting adequately and 98 (56%) as reporting poorly. Multivariate analysis identified three factors independently associated with adequate weekly reporting: having weekly reporting forms at the visit (AOR 19, 95% CI: 6-65), having posters showing IDSR functions (AOR 8, 95% CI: 2-12), and having a designated surveillance focal person (AOR 7, 95% CI: 2-20). Conclusion The majority of health facilities in Nairobi County were reporting poorly to IDSR. We recommend that the Ministry of Health provide all health facilities in Nairobi County with weekly reporting tools and offer specific training on IDSR, which will help each facility designate a surveillance focal person. PMID:27303581

  11. Genetically controlled random search: a global optimization method for continuous multidimensional functions

    NASA Astrophysics Data System (ADS)

    Tsoulos, Ioannis G.; Lagaris, Isaac E.

    2006-01-01

    A new stochastic method for locating the global minimum of a multidimensional function inside a rectangular hyperbox is presented. A sampling technique is employed that makes use of the procedure known as grammatical evolution. The method can be considered as a "genetic" modification of the Controlled Random Search procedure due to Price. The user may code the objective function either in C++ or in Fortran 77. We offer a comparison of the new method with others of similar structure, by presenting results of computational experiments on a set of test functions.
    Program summary
    Title of program: GenPrice
    Catalogue identifier: ADWP
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADWP
    Program available from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Computer for which the program is designed and others on which it has been tested: the tool is designed to be portable to all systems running the GNU C++ compiler
    Installation: University of Ioannina, Greece
    Programming language used: GNU-C++, GNU-C, GNU Fortran-77
    Memory required to execute with typical data: 200 KB
    No. of bits in a word: 32
    No. of processors used: 1
    Has the code been vectorized or parallelized?: no
    No. of lines in distributed program, including test data, etc.: 13 135
    No. of bytes in distributed program, including test data, etc.: 78 512
    Distribution format: tar.gz
    Nature of physical problem: A multitude of problems in science and engineering are often reduced to minimizing a function of many variables. There are instances where a local optimum does not correspond to the desired physical solution and hence the search for a better solution is required. Local optimization techniques are frequently trapped in local minima. Global optimization is hence the appropriate tool. For example, solving a nonlinear system of equations via optimization, employing a "least squares" type of objective, one may encounter many local minima that do not correspond to solutions, i.e. minima with values
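For context, the Controlled Random Search procedure of Price that GenPrice modifies can be sketched as follows. This is a minimal sketch assuming box constraints and a simple reflection step; the population size and iteration count are arbitrary choices, not values from the program.

```python
import numpy as np

rng = np.random.default_rng(2)

def crs_minimize(f, bounds, pop_size=50, iters=2000):
    # Controlled Random Search (Price): evolve a population of points,
    # repeatedly replacing the worst point by a reflected trial point
    dim = len(bounds)
    lo = np.array([b[0] for b in bounds], dtype=float)
    hi = np.array([b[1] for b in bounds], dtype=float)
    pop = lo + rng.random((pop_size, dim)) * (hi - lo)
    vals = np.array([f(x) for x in pop])
    for _ in range(iters):
        # Pick dim+1 distinct points; reflect the last one through the
        # centroid of the others to form a trial point
        idx = rng.choice(pop_size, dim + 1, replace=False)
        centroid = pop[idx[:-1]].mean(axis=0)
        trial = 2.0 * centroid - pop[idx[-1]]
        if np.any(trial < lo) or np.any(trial > hi):
            continue  # discard trials outside the hyperbox
        worst = int(np.argmax(vals))
        fv = f(trial)
        if fv < vals[worst]:
            pop[worst], vals[worst] = trial, fv
    best = int(np.argmin(vals))
    return pop[best], vals[best]
```

GenPrice's "genetic" modification replaces this fixed reflection rule with trial points produced by grammatical evolution; the population bookkeeping is the part shared with plain CRS.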

  12. Study of Electromagnetic Scattering From Material Object Doped Randomly With Thin Metallic Wires Using Finite Element Method

    NASA Technical Reports Server (NTRS)

    Deshpande, Manohar D.

    2005-01-01

    A new numerical simulation method using the finite element methodology (FEM) is presented to study electromagnetic scattering due to an arbitrarily shaped material body doped randomly with thin and short metallic wires. The FEM approach described in many standard textbooks is appropriately modified to account for the presence of thin and short metallic wires distributed randomly inside an arbitrarily shaped material body. Using this modified FEM approach, the electromagnetic scattering due to cylindrical and spherical material bodies doped randomly with thin metallic wires is studied.

  13. Method for high-volume sequencing of nucleic acids: random and directed priming with libraries of oligonucleotides

    DOEpatents

    Studier, F.W.

    1995-04-18

    Random and directed priming methods for determining nucleotide sequences by enzymatic sequencing techniques, using libraries of primers of lengths 8, 9 or 10 bases, are disclosed. These methods permit direct sequencing of nucleic acids as large as 45,000 base pairs or larger without the necessity for subcloning. Individual primers are used repeatedly to prime sequence reactions in many different nucleic acid molecules. Libraries containing as few as 10,000 octamers, 14,200 nonamers, or 44,000 decamers would have the capacity to determine the sequence of almost any cosmid DNA. Random priming with a fixed set of primers from a smaller library can also be used to initiate the sequencing of individual nucleic acid molecules, with the sequence being completed by directed priming with primers from the library. In contrast to random cloning techniques, a combined random and directed priming strategy is far more efficient. 2 figs.

  15. Subdiffusive random walk in a membrane system: the generalized method of images approach

    NASA Astrophysics Data System (ADS)

    Kosztołowicz, Tadeusz

    2015-10-01

    Using two random walk models in a system with a thin membrane we find the Green’s functions describing various kinds of diffusion in this system; the membrane is treated here as a thin, partially permeable wall. The models differ in the assumptions concerning how the particle is stopped or reflected by the membrane when the particle’s attempts to pass through it fail. We show that the Green’s functions obtained for both models are equivalent with the exception of the values of these functions at the membrane’s surfaces. As examples we present the Green’s functions for a membrane system in which subdiffusion or slow subdiffusion occurs and we briefly discuss the properties of the functions. We also show that the Green’s functions can be obtained by means of the generalized method of images. Within this method, the Green’s functions appear to be a combination of the Green’s functions derived for a homogeneous system without a membrane by means of the rules presented in this paper. Additionally, the obtained Green’s functions are used to derive a boundary condition at the membrane. It is shown that the condition contains a specific term which can be interpreted as a ‘memory term’ depending on the kind of diffusion occurring in the system which is generated by the membrane.
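For reference, the classical method of images that this work generalizes gives, for normal diffusion on a half-line with a wall at x = 0, the Green's function as the free Green's function G_0 plus an image term (notation assumed, not taken from the paper):

```latex
% Normal diffusion with a fully reflecting (+) or fully absorbing (-)
% wall at x = 0; G_0 is the free-space Green's function
G(x,t \mid x_0) = G_0(x - x_0,\, t) \pm G_0(x + x_0,\, t),
\qquad
G_0(x,t) = \frac{1}{\sqrt{4\pi D t}}\; e^{-x^2/(4Dt)}
```

The generalized method described in the abstract replaces these fixed ± weights with combination rules suited to a partially permeable membrane and to subdiffusion, which is where the 'memory term' in the derived boundary condition originates.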

  16. Upscaling solute transport in naturally fractured porous media with the continuous time random walk method

    SciTech Connect

    Geiger, S.; Cortis, A.; Birkholzer, J.T.

    2010-04-01

    Solute transport in fractured porous media is typically 'non-Fickian'; that is, it is characterized by early breakthrough and long tailing and by nonlinear growth of the centered second moment of the Green function. This behavior is due to the effects of (1) multirate diffusion occurring between the highly permeable fracture network and the low-permeability rock matrix, (2) a wide range of advection rates in the fractures and, possibly, the matrix as well, and (3) a range of path lengths. As a consequence, prediction of solute transport processes at the macroscale represents a formidable challenge. Classical dual-porosity (or mobile-immobile) approaches in conjunction with an advection-dispersion equation and macroscopic dispersivity commonly fail to predict breakthrough in fractured porous media accurately. It was recently demonstrated that the continuous time random walk (CTRW) method can be used as a generalized upscaling approach. Here we extend this work and use results from high-resolution finite element-finite volume-based simulations of solute transport in an outcrop analogue of a naturally fractured reservoir to calibrate the CTRW method by extracting a distribution of retention times. This procedure allows us to predict breakthrough at other model locations accurately and to gain significant insight into the nature of the fracture-matrix interaction in naturally fractured porous reservoirs with geologically realistic fracture geometries.

  17. An Efficient Method to Calculate the Failure Rate of Dynamic Systems with Random Parameters using the Total Probability Theorem

    DTIC Science & Technology

    2015-05-12

    We present an efficient method to calculate the failure rate of dynamic systems with random parameters using the total probability theorem, illustrated using a linear vibratory system with random parameters excited by stationary Gaussian processes. Our approach can be easily extended to non-stationary Gaussian input processes. The response of such a system is non
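The use of the total probability theorem named in the title can be sketched as follows: condition the failure probability on the random parameters, then average over their distribution. The Gaussian threshold-crossing surrogate and all numerical values below are illustrative assumptions, not the report's model.

```python
import math

import numpy as np

rng = np.random.default_rng(3)

def failure_rate(cond_fail, sample_params, n=50_000):
    # Total probability theorem: P(fail) = E_theta[ P(fail | theta) ],
    # approximated by averaging the conditional failure probability
    # over Monte Carlo draws of the random parameters theta
    thetas = sample_params(n)
    return float(np.mean([cond_fail(t) for t in thetas]))

def cond_fail(sigma, threshold=3.0):
    # Probability that a zero-mean Gaussian response with standard
    # deviation sigma exceeds the threshold (a threshold-crossing
    # surrogate for system failure; values are illustrative)
    return 0.5 * math.erfc(threshold / (sigma * math.sqrt(2.0)))
```

Splitting the problem this way is what makes the method efficient: the expensive conditional failure probability is computed per parameter draw, and the outer average handles the parameter randomness.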

  18. Adipose Tissue - Adequate, Accessible Regenerative Material

    PubMed Central

    Kolaparthy, Lakshmi Kanth.; Sanivarapu, Sahitya; Moogla, Srinivas; Kutcham, Rupa Sruthi

    2015-01-01

    The potential use of stem cell based therapies for the repair and regeneration of various tissues offers a paradigm shift that may provide alternative therapeutic solutions for a number of diseases. The use of either embryonic stem cells (ESCs) or induced pluripotent stem cells in clinical situations is limited due to cell regulations and to technical and ethical considerations involved in genetic manipulation of human ESCs, even though these cells are highly beneficial. Mesenchymal stem cells seem to be an ideal population of stem cells; in particular, adipose-derived stem cells (ASCs) can be obtained in large numbers and easily harvested from adipose tissue. Adipose tissue is ubiquitously available and has several advantages compared to other sources: it is easily accessible in large quantities with a minimally invasive harvesting procedure, and isolation of adipose-derived mesenchymal stem cells yields a high number of stem cells, which is essential for stem cell based therapies and tissue engineering. Recently, periodontal tissue regeneration using ASCs has been examined in some animal models. This method has potential in the regeneration of functional periodontal tissues because the various growth factors secreted by ASCs might not only promote the regeneration of periodontal tissues but also encourage neovascularization of the damaged tissues. This review summarizes the sources, isolation, and characteristics of adipose-derived stem cells, and discusses their potential role in periodontal regeneration. PMID:26634060

  19. Causal inference methods to assess safety upper bounds in randomized trials with noncompliance

    PubMed Central

    Berlin, Jesse A; Pinheiro, José; Wilcox, Marsha A

    2015-01-01

    Background Premature discontinuation and other forms of noncompliance with treatment assignment can complicate causal inference of treatment effects in randomized trials. The intent-to-treat analysis gives unbiased estimates for causal effects of treatment assignment on outcome, but may understate potential benefit or harm of actual treatment. The corresponding upper confidence limit can also be underestimated. Purpose To compare estimates of the hazard ratio and upper bound of the two-sided 95% confidence interval from causal inference methods that account for noncompliance with those from the intent-to-treat analysis. Methods We used simulations with parameters chosen to reflect cardiovascular safety trials of diabetes drugs, with a focus on upper bound estimates relative to 1.3, based on regulatory guidelines. A total of 1000 simulations were run under each parameter combination for a hypothetical trial of 10,000 total subjects randomly assigned to active treatment or control at a 1:1 ratio. Noncompliance was considered in the form of treatment discontinuation and cross-over at specified proportions, with assumed true hazard ratios of 0.9, 1, and 1.3. Various levels of risk associated with being a non-complier (independent of treatment status) were evaluated. Hazard ratio and upper bound estimates from causal survival analysis and intent-to-treat were obtained from each simulation and summarized under each parameter setting. Results Causal analysis estimated the true hazard ratio with little bias in almost all settings examined. Intent-to-treat was unbiased only when the true hazard ratio = 1; otherwise it underestimated both benefit and harm. When upper bound estimates from intent-to-treat were ≥1.3, corresponding estimates from causal analysis were also ≥1.3 in almost 100% of the simulations, regardless of the true hazard ratio. When upper bound estimates from intent-to-treat were <1.3 and the true hazard ratio = 1, corresponding

  20. Spatial cross modulation method using a random diffuser and phase-only spatial light modulator for constructing arbitrary complex fields.

    PubMed

    Shibukawa, Atsushi; Okamoto, Atsushi; Takabayashi, Masanori; Tomita, Akihisa

    2014-02-24

    We propose a spatial cross modulation method using a random diffuser and a phase-only spatial light modulator (SLM), by which arbitrary complex-amplitude fields can be generated with higher spatial resolution and diffraction efficiency than off-axis and double-phase computer-generated holograms. Our method encodes the original complex object as a phase-only diffusion image by scattering the complex object using a random diffuser. In addition, all incoming light to the SLM is consumed for a single diffraction order, making a diffraction efficiency of more than 90% possible. This method can be applied for holographic data storage, three-dimensional displays, and other such applications.
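A highly simplified numerical sketch of the encoding idea follows: scatter the complex object with a known random diffuser so that, after propagation, keeping only the phase of the resulting speckle-like field loses little information. An FFT stands in for optical propagation here; the paper's actual optical setup differs, and all sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

def encode_phase_only(obj, diffuser_phase):
    # Scatter the complex object with the random diffuser, propagate
    # (FFT as a toy propagation model), and keep only the phase; the
    # scattered field is close to fully developed speckle, so its
    # phase carries most of the information
    field = np.fft.fft2(obj * np.exp(1j * diffuser_phase))
    return np.angle(field)

def decode(phase_image, diffuser_phase):
    # Invert the propagation and remove the known diffuser phase
    field = np.fft.ifft2(np.exp(1j * phase_image))
    return field * np.exp(-1j * diffuser_phase)
```

For a diffuse field, the phase-only copy retains a normalized correlation of roughly sqrt(pi)/2 ≈ 0.89 with the full field, which is why the diffusion image can be displayed on a phase-only SLM with modest loss.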

  1. Comparison of three cooling methods for burn patients: A randomized clinical trial.

    PubMed

    Cho, Young Soon; Choi, Young Hwan

    2016-10-01

    Tap water may not be readily available in numerous places as a first aid for burns and, therefore, tea tree oil products are recommended alternatives. Our aim in this study was to compare the cooling effects of three burn-cooling methodologies, running tap water, Burnshield(®), and Burn Cool Spray(®), and suggest indications for each cooling method. This randomized, controlled, study enrolled patients with burns who used the emergency service of Seoul Bestian Hospital from June 2015 to October 2015. The allocation of the cooling methods was randomly generated using a computer. We cooled the burn wounds by applying one of the three methods and measured the skin surface temperature and pain level using a visual analog scale (VAS) scoring. Ninety-six patients were enrolled in this study. The variability in the median(IQR) skin temperatures of the three groups was from 33.5°C (31.5-35.0) to 28.7°C (25.9-30.9), 33.8°C (32.0-35.4) to 33.2°C (30.5-35.0), and 34.0°C (32.0-35.1) to 34.4°C (32.7-35.6) for the tap water, Burn Cool Spray(®), and Burnshield(®), respectively. The variability of the mean VAS pain scores was 6.9 to 4.8 (tap water), 5.6 to 4.5 (Burn Cool Spray(®)), and 5.5 to 3.3 (Burnshield(®)). The reduction of skin surface temperature by tap water was significantly greater than that by the other two methods. All three methods reduced the VAS pain score after 20min of treatment (p<0.001). The tap water had a similar effect to that of the Burn Cool Spray(®) but significantly better than that of Burnshield(®). There was a significant difference in the skin surface temperature and VAS pain score reduction (p=0.014 and p=0.007, respectively) between the groups cooled by tap water below and above 24°C. The patients who visited the center within 30min showed a significantly higher skin temperature than those who came after 30min did (p=0.033). Tap water and Burn Cool Spray(®) reduced the skin surface temperature, but the Burnshield(®) slightly

  2. Sham Control Methods Used in Ear-Acupuncture/Ear-Acupressure Randomized Controlled Trials: A Systematic Review

    PubMed Central

    Zhang, Claire Shuiqing; Yang, Angela Weihong; Zhang, Anthony Lin; May, Brian H.

    2014-01-01

    Abstract Ear-acupuncture/ear-acupressure (EAP) has been used for a range of health conditions with numerous randomized controlled trials (RCTs) investigating its efficacy and safety. However, the design of sham interventions in these RCTs varied significantly. This study systematically reviewed RCTs on EAP for all clinical conditions involving a number of sham EAPs as a control intervention. The review is guided by the Cochrane Handbook for Systematic Reviews of Interventions 5.1.0 and investigated the types and differences of sham EAP interventions. Four electronic English databases (The Cochrane Library, PubMed, Embase, CINAHL®) and two Chinese databases (CQVIP, CNKI) were searched in December 2012 and 55 published RCTs comparing real and sham EAP for any clinical condition were included. Characteristics of participants, real and sham interventions, and outcomes were extracted. Four types of sham methods were identified. Among the 55 RCTs, 25 studies involved treatment on nonspecific ear acupoints as the sham method; seven studies used nonacupoints on the ear; nine studies selected placebo needles or placebo ear-acupressure on the same ear acupoints for the real treatment; 10 studies employed pseudo-intervention; and five studies combined two of the above methods to be the sham control. Other factors of treatment such as number of points, treatment duration, and frequency also varied greatly. Risk of bias assessment suggests that 32 RCTs were “high risk” in terms of participants blinding, and 45 RCTs were “high risk” in terms of personnel blinding. Meta-analysis was not conducted due to the high clinical heterogeneity across included studies. No relationship was found between the sham designs and efficacy outcomes, or between the sham types and dropout rate. No solid conclusion of which design is the most appropriate sham control of EAP could be drawn in this review. PMID:24138333

  3. Sample Selection in Randomized Experiments: A New Method Using Propensity Score Stratified Sampling

    ERIC Educational Resources Information Center

    Tipton, Elizabeth; Hedges, Larry; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Caverly, Sarah

    2014-01-01

    Randomized experiments are often seen as the "gold standard" for causal research. Despite the fact that experiments use random assignment to treatment conditions, units are seldom selected into the experiment using probability sampling. Very little research on experimental design has focused on how to make generalizations to well-defined…
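The sampling approach named in the title (propensity score stratified sampling) can be sketched as follows: stratify the population by propensity-score quantiles, then draw units from every stratum so the experimental sample spans the population's distribution. The stratum count and per-stratum sample size are arbitrary choices for illustration, not values from the article.

```python
import numpy as np

rng = np.random.default_rng(6)

def stratified_sample(scores, n_strata, n_per_stratum):
    # Divide the population into strata by propensity-score quantiles,
    # then draw units from each stratum so that no part of the score
    # distribution is left unrepresented in the experiment
    edges = np.quantile(scores, np.linspace(0.0, 1.0, n_strata + 1))
    strata = np.clip(np.searchsorted(edges, scores, side="right") - 1,
                     0, n_strata - 1)
    picks = []
    for s in range(n_strata):
        idx = np.flatnonzero(strata == s)
        picks.extend(rng.choice(idx, n_per_stratum, replace=False))
    return np.array(picks)
```

Sampling within score strata, rather than by convenience, is what supports generalizing the experimental estimate to the well-defined population the scores were computed for.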

  4. Improving function in age-related macular degeneration: design and methods of a randomized clinical trial.

    PubMed

    Rovner, Barry W; Casten, Robin J; Hegel, Mark T; Massof, Robert W; Leiby, Benjamin E; Tasman, William S

    2011-03-01

    Age-Related Macular Degeneration (AMD) is the leading cause of severe vision loss in older adults and impairs the ability to read, drive, and live independently and increases the risk for depression, falls, and earlier mortality. Although new medical treatments have improved AMD's prognosis, vision-related disability remains a major public health problem. Improving Function in AMD (IF-AMD) is a two-group randomized, parallel design, controlled clinical trial that compares the efficacy of Problem-Solving Therapy (PST) with Supportive Therapy (ST) (an attention control treatment) to improve vision function in 240 patients with AMD. PST and ST therapists deliver 6 one-hour respective treatment sessions to subjects in their homes over 2 months. Outcomes are assessed masked to treatment assignment at 3 months (main trial endpoint) and 6 months (maintenance effects). The primary outcome is targeted vision function (TVF), which refers to specific vision-dependent functional goals that subjects highly value but find difficult to achieve. TVF is an innovative outcome measure in that it is targeted and tailored to individual subjects yet is measured in a standardized way. This paper describes the research methods, theoretical and clinical aspects of the study treatments, and the measures used to evaluate functional and psychiatric outcomes in this population.

  5. Improving Function in Age-Related Macular Degeneration: Design and Methods of a Randomized Clinical Trial

    PubMed Central

    Rovner, Barry W.; Casten, Robin J.; Hegel, Mark T.; Massof, Robert W.; Leiby, Benjamin E.; Tasman, William S.

    2010-01-01

    Age-Related Macular Degeneration (AMD) is the leading cause of severe vision loss in older adults and impairs the ability to read, drive, and live independently and increases the risk for depression, falls, and earlier mortality. Although new medical treatments have improved AMD’s prognosis, vision-related disability remains a major public health problem. Improving Function in AMD (IF-AMD) is a two-group randomized, parallel design, controlled clinical trial that compares the efficacy of Problem-Solving Therapy (PST) with Supportive Therapy (ST) (an attention control treatment) to improve vision function in 240 patients with AMD. PST and ST therapists deliver 6 one-hour respective treatment sessions to subjects in their homes over 2 months. Outcomes are assessed masked to treatment assignment at 3 months (main trial endpoint) and 6 months (maintenance effects). The primary outcome is targeted vision function (TVF), which refers to specific vision-dependent functional goals that subjects highly value but find difficult to achieve. TVF is an innovative outcome measure in that it is targeted and tailored to individual subjects yet is measured in a standardized way. This paper describes the research methods, theoretical and clinical aspects of the study treatments, and the measures used to evaluate functional and psychiatric outcomes in this population. PMID:20974293

  6. A general parallelization strategy for random path based geostatistical simulation methods

    NASA Astrophysics Data System (ADS)

    Mariethoz, Grégoire

    2010-07-01

    The size of simulation grids used for numerical models has increased by many orders of magnitude in the past years, and this trend is likely to continue. Efficient pixel-based geostatistical simulation algorithms have been developed, but for very large grids and complex spatial models, the computational burden remains heavy. As cluster computers become widely available, using parallel strategies is a natural step for increasing the usable grid size and the complexity of the models. These strategies must profit from the possibilities offered by machines with a large number of processors. On such machines, the bottleneck is often the communication time between processors. We present a strategy distributing grid nodes among all available processors while minimizing communication and latency times. It consists of centralizing the simulation on a master processor that calls the other slave processors, as if they were functions, to simulate one node at a time. The key is to decouple the sending and the receiving operations to avoid synchronization. Centralization allows having a conflict management system ensuring that nodes being simulated simultaneously do not interfere in terms of neighborhood. The strategy is computationally efficient and is versatile enough to be applicable to all random path based simulation methods.
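A thread-based sketch of the master/slave strategy described here: the master dispatches nodes, keeps submission decoupled from result collection, and never lets two in-flight nodes fall within each other's neighborhood. The 1-D nodes, the toy `simulate_node` kernel, and the use of threads instead of cluster processes are all illustrative assumptions.

```python
from concurrent.futures import FIRST_COMPLETED, ThreadPoolExecutor, wait

def simulate_node(node):
    # Stand-in for simulating one grid node from its already-simulated
    # neighbourhood (the real kernel would run on a slave processor)
    return node, node * 2

def master(nodes, radius=1, workers=4):
    # Centralized dispatch: submit every node whose neighbourhood does
    # not overlap an in-flight node (conflict management), and collect
    # results separately so sending and receiving stay decoupled
    pending, in_flight, results = list(nodes), {}, {}
    with ThreadPoolExecutor(max_workers=workers) as pool:
        while pending or in_flight:
            for node in list(pending):
                if all(abs(node - m) > radius for m in in_flight.values()):
                    fut = pool.submit(simulate_node, node)
                    in_flight[fut] = node
                    pending.remove(node)
            if in_flight:
                done, _ = wait(in_flight, return_when=FIRST_COMPLETED)
                for fut in done:
                    node, value = fut.result()
                    results[node] = value
                    del in_flight[fut]
    return results
```

On a real cluster the same loop would drive MPI-style message passing; the structural points carried over from the abstract are the master-side centralization, the decoupled send/receive, and the neighborhood conflict check.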

  7. Should multiple imputation be the method of choice for handling missing data in randomized trials?

    PubMed

    Sullivan, Thomas R; White, Ian R; Salter, Amy B; Ryan, Philip; Lee, Katherine J

    2016-01-01

    The use of multiple imputation has increased markedly in recent years, and journal reviewers may expect to see multiple imputation used to handle missing data. However in randomized trials, where treatment group is always observed and independent of baseline covariates, other approaches may be preferable. Using data simulation we evaluated multiple imputation, performed both overall and separately by randomized group, across a range of commonly encountered scenarios. We considered both missing outcome and missing baseline data, with missing outcome data induced under missing at random mechanisms. Provided the analysis model was correctly specified, multiple imputation produced unbiased treatment effect estimates, but alternative unbiased approaches were often more efficient. When the analysis model overlooked an interaction effect involving randomized group, multiple imputation produced biased estimates of the average treatment effect when applied to missing outcome data, unless imputation was performed separately by randomized group. Based on these results, we conclude that multiple imputation should not be seen as the only acceptable way to handle missing data in randomized trials. In settings where multiple imputation is adopted, we recommend that imputation is carried out separately by randomized group.
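A deliberately simplified sketch of imputing separately by randomized group follows (proper multiple imputation would also propagate parameter uncertainty, for example by drawing each group's mean and variance from their posterior). It illustrates why group-wise imputation preserves a treatment effect that pooled imputation could dilute; the normal model and all sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

def impute_by_group(y, group, m=20):
    # Imputation performed separately by randomized group: missing
    # outcomes are drawn from a normal distribution fitted to the
    # observed outcomes of the SAME group, so a between-group outcome
    # difference is not averaged away by the imputation model
    completed = []
    for _ in range(m):
        yi = y.copy()
        for g in np.unique(group):
            missing = (group == g) & np.isnan(y)
            observed = y[(group == g) & ~np.isnan(y)]
            yi[missing] = rng.normal(observed.mean(),
                                     observed.std(ddof=1), missing.sum())
        completed.append(yi)
    return completed
```

Pooling both groups into one imputation model would shrink imputed values toward the overall mean, which is the mechanism behind the biased average treatment effect the abstract describes for misspecified imputation models.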

  8. Proposal of the Methodology for Analysing the Structural Relationship in the System of Random Process Using the Data Mining Methods

    NASA Astrophysics Data System (ADS)

    Michaľčonok, German; Kalinová, Michaela Horalová; Németh, Martin

    2014-12-01

    The aim of this paper is to present the possibilities of applying data mining techniques to the problem of analysing structural relationships in a system of stationary random processes. We introduce the area of random processes, present the process of structural analysis, and select suitable data mining methods applicable to structural analysis. Building on this theoretical basis, we then propose a methodology for structural analysis in a system of stationary stochastic processes using data mining methods with an active experimental approach.

  9. Beyond the Randomized Controlled Trial: A Review of Alternatives in mHealth Clinical Trial Methods

    PubMed Central

    Wiljer, David; Cafazzo, Joseph A

    2016-01-01

    Background Randomized controlled trials (RCTs) have long been considered the primary research study design capable of eliciting causal relationships between health interventions and consequent outcomes. However, with a prolonged duration from recruitment to publication, high-cost trial implementation, and a rigid trial protocol, RCTs are perceived as an impractical evaluation methodology for most mHealth apps. Objective Given the recent development of alternative evaluation methodologies and tools to automate mHealth research, we sought to determine the breadth of these methods and the extent that they were being used in clinical trials. Methods We conducted a review of the ClinicalTrials.gov registry to identify and examine current clinical trials involving mHealth apps and retrieved relevant trials registered between November 2014 and November 2015. Results Of the 137 trials identified, 71 were found to meet inclusion criteria. The majority used a randomized controlled trial design (80%, 57/71). Study designs included 36 two-group pretest-posttest control group comparisons (51%, 36/71), 16 posttest-only control group comparisons (23%, 16/71), 7 one-group pretest-posttest designs (10%, 7/71), 2 one-shot case study designs (3%, 2/71), and 2 static-group comparisons (3%, 2/71). A total of 17 trials included a qualitative component to their methodology (24%, 17/71). Complete trial data collection required 20 months on average to complete (mean 21, SD 12). For trials with a total duration of 2 years or more (31%, 22/71), the average time from recruitment to complete data collection (mean 35 months, SD 10) was 2 years longer than the average time required to collect primary data (mean 11, SD 8). Trials had a moderate sample size of 112 participants. Two trials were conducted online (3%, 2/71) and 7 trials collected data continuously (10%, 7/68). Onsite study implementation was heavily favored (97%, 69/71). Trials with four data collection points had a longer study

  10. Single-cluster-update Monte Carlo method for the random anisotropy model

    NASA Astrophysics Data System (ADS)

    Rößler, U. K.

    1999-06-01

    A Wolff-type cluster Monte Carlo algorithm for random magnetic models is presented. The algorithm is demonstrated to significantly reduce the critical slowing down for planar random anisotropy models with weak anisotropy strength. Dynamic exponents z ≲ 1.0, comparable to the best cluster algorithms, are estimated for models with a ratio of anisotropy to exchange constant D/J = 1.0 on cubic lattices in three dimensions. For these models, critical exponents are derived from a finite-size scaling analysis.
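    The cluster-update idea behind such algorithms can be illustrated with a minimal Wolff step for the plain ferromagnetic Ising model (a simplified sketch; the paper's algorithm handles continuous spins with random anisotropy, which needs an embedding step not shown here):

```python
import math
import random

def wolff_step(spins, L, beta, rng):
    """Grow and flip one Wolff cluster on an L x L periodic Ising lattice.

    spins: dict (i, j) -> +1/-1. Equal-spin bonds are activated with
    probability p = 1 - exp(-2*beta*J), J = 1; flipping whole clusters in
    one move is what suppresses critical slowing down near beta_c.
    """
    p_add = 1.0 - math.exp(-2.0 * beta)
    seed = (rng.randrange(L), rng.randrange(L))
    s0 = spins[seed]
    cluster = {seed}
    frontier = [seed]
    while frontier:
        i, j = frontier.pop()
        for nb in (((i + 1) % L, j), ((i - 1) % L, j),
                   (i, (j + 1) % L), (i, (j - 1) % L)):
            if nb not in cluster and spins[nb] == s0 and rng.random() < p_add:
                cluster.add(nb)
                frontier.append(nb)
    for site in cluster:  # flip the entire cluster at once
        spins[site] = -spins[site]
    return len(cluster)

rng = random.Random(0)
L = 16
spins = {(i, j): rng.choice((-1, 1)) for i in range(L) for j in range(L)}
beta_c = math.log(1 + math.sqrt(2)) / 2  # exact critical coupling, 2D Ising
sizes = [wolff_step(spins, L, beta_c, rng) for _ in range(200)]
print(f"mean cluster size: {sum(sizes) / len(sizes):.1f}")
```

    At the critical point, single-spin-flip dynamics decorrelate slowly (large dynamic exponent z), while cluster moves like the one above turn over large correlated regions per update.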

  11. An asymptotic-preserving stochastic Galerkin method for the radiative heat transfer equations with random inputs and diffusive scalings

    NASA Astrophysics Data System (ADS)

    Jin, Shi; Lu, Hanqing

    2017-04-01

    In this paper, we develop an Asymptotic-Preserving (AP) stochastic Galerkin scheme for the radiative heat transfer equations with random inputs and diffusive scalings. In this problem the random inputs arise from uncertainties in the cross section, initial data, or boundary data. We use the generalized polynomial chaos based stochastic Galerkin (gPC-SG) method, combined with the micro-macro decomposition based deterministic AP framework, in order to handle the diffusive regime efficiently. For the linearized problem we prove the regularity of the solution in the random space and, consequently, the spectral accuracy of the gPC-SG method. We also prove the uniform (in the mean free path) linear stability of the space-time discretizations. Several numerical tests are presented to show the efficiency and accuracy of the proposed scheme, especially in the diffusive regime.
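    The gPC-SG construction described above can be sketched in generic notation (an illustrative summary, not the paper's exact system):

```latex
% Expand the random solution in an orthonormal gPC basis \{\Phi_k\} in the
% random variable z:
u(t,x,z) \;\approx\; \sum_{k=1}^{K} \hat{u}_k(t,x)\,\Phi_k(z),
\qquad \mathbb{E}\big[\Phi_j(z)\,\Phi_k(z)\big] = \delta_{jk}.
% Galerkin projection of \partial_t u = \mathcal{L}(z)\,u onto each basis
% function yields a coupled deterministic system for the coefficients,
\partial_t \hat{u}_j \;=\; \sum_{k=1}^{K}
\big\langle \Phi_j,\ \mathcal{L}(z)\,\Phi_k \big\rangle\, \hat{u}_k,
\qquad j = 1,\dots,K,
% to which the micro-macro decomposition based deterministic AP solver
% is then applied.
```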

  12. Three randomized trials of maternal influenza immunization in Mali, Nepal, and South Africa: Methods and expectations.

    PubMed

    Omer, Saad B; Richards, Jennifer L; Madhi, Shabir A; Tapia, Milagritos D; Steinhoff, Mark C; Aqil, Anushka R; Wairagkar, Niteen

    2015-07-31

    Influenza infection in pregnancy can have adverse impacts on maternal, fetal, and infant outcomes. Influenza vaccination in pregnancy is an appealing strategy to protect pregnant women and their infants. The Bill & Melinda Gates Foundation is supporting three large, randomized trials in Nepal, Mali, and South Africa evaluating the efficacy and safety of maternal immunization to prevent influenza disease in pregnant women and their infants <6 months of age. Results from these individual studies are expected in 2014 and 2015. While the results from the three maternal immunization trials are likely to strengthen the evidence base regarding the impact of influenza immunization in pregnancy, expectations for these results should be realistic. For example, evidence from previous influenza vaccine studies - conducted in general, non-pregnant populations - suggests substantial geographic and year-to-year variability in influenza incidence and vaccine efficacy/effectiveness. Since the evidence generated from the three maternal influenza immunization trials will be complementary, in this paper we present a side-by-side description of the three studies as well as the similarities and differences between these trials in terms of study location, design, outcome evaluation, and laboratory and epidemiological methods. We also describe the likely remaining knowledge gap after the results from these trials become available along with a description of the analyses that will be conducted when the results from these individual data are pooled. Moreover, we highlight that additional research on logistics of seasonal influenza vaccine supply, surveillance and strain matching, and optimal delivery strategies for pregnant women will be important for informing global policy related to maternal influenza immunization.

  13. Prevention of seroma formation after axillary dissection--a comparative randomized clinical trial of three methods.

    PubMed

    Kottayasamy Seenivasagam, Rajkumar; Gupta, Vikas; Singh, Gurpreet

    2013-01-01

    Seroma is a frequent complication after breast cancer surgery. Closed suction drainage for several days is the standard procedure to reduce seroma formation. The aim of this study was to compare the efficacy of external compression dressing, suture flap fixation, and the conventional method of closed suction drains in the prevention of seroma formation. A total of 161 patients were prospectively randomized in a three groups × two subgroups design into control (n = 48), compression dressing (n = 53) and suturing groups (n = 49), and two subgroups, conventional drain removal (n = 75) and early drain removal (n = 75). All patients underwent ALND as part of MRM or BCT. The primary end point was the incidence of seroma. Suture flap fixation significantly reduced the incidence of seroma (p = 0.003), total drain output (p = 0.005), and duration of drainage (p = 0.001) without increase in wound complications. Compression dressing reduced duration of drainage significantly (p = 0.03), but not the total drain output (p = 0.15) or seromas (p = 0.58). Early drain removal on postoperative day 7 irrespective of drain output does not significantly increase seroma formation (p = 0.34) or wound complications. On multivariate analysis, BMI ≥ 30 (p = 0.02) and longer duration of drainage (p = 0.04) were identified as independent predictors for seroma formation. Obliteration of the dead space after breast cancer surgery by suture flap fixation is a safe and easy procedure, which significantly reduces postoperative seroma formation and duration of drainage. Compression dressing offers no advantage over normal dressing. Drains can be removed safely on postoperative day 7 irrespective of output without significant increase in complications.

  14. Novel image fusion method based on adaptive pulse coupled neural network and discrete multi-parameter fractional random transform

    NASA Astrophysics Data System (ADS)

    Lang, Jun; Hao, Zhengchao

    2014-01-01

    In this paper, we first propose the discrete multi-parameter fractional random transform (DMPFRNT), which can make the spectrum distributed randomly and uniformly. We then introduce this new spectral transform into the image fusion field and present a new approach to remote sensing image fusion that utilizes both an adaptive pulse coupled neural network (PCNN) and the discrete multi-parameter fractional random transform, in order to meet the requirements of both high spatial resolution and low spectral distortion. In the proposed scheme, the multi-spectral (MS) and panchromatic (Pan) images are converted into the discrete multi-parameter fractional random transform domains, respectively. In the DMPFRNT spectrum domain, the high amplitude spectrum (HAS) and low amplitude spectrum (LAS) components carry different information from the original images. We take full advantage of the synchronous pulse issuance characteristics of the PCNN to extract the HAS and LAS components properly, yielding PCNN ignition mapping images that can be used to determine the fusion parameters. In the fusion process, the local standard deviation of the amplitude spectrum is chosen as the link strength of the pulse coupled neural network. Numerical simulations demonstrate that the proposed method is more reliable than several existing methods based on Hue-Saturation-Intensity representation, Principal Component Analysis, the discrete fractional random transform, etc.

  15. Single particle electron microscopy reconstruction of the exosome complex using the random conical tilt method.

    PubMed

    Liu, Xueqi; Wang, Hong-Wei

    2011-03-28

    of each single particle. There are several methods to assign the view for each particle, including angular reconstitution(1) and the random conical tilt (RCT) method(2). In this protocol, we describe our procedure for obtaining the 3D reconstruction of the yeast exosome complex using negative staining EM and RCT. It should be noted that our protocol for electron microscopy and image processing follows the basic principle of RCT but is not the only way to perform the method. We first describe how to embed the protein sample in a layer of uranyl formate with a thickness comparable to the protein size, using a holey carbon grid covered with a layer of continuous thin carbon film. The specimen is then inserted into a transmission electron microscope to collect untilted (0-degree) and tilted (55-degree) pairs of micrographs that are later used for processing and obtaining an initial 3D model of the yeast exosome. To this end, we perform RCT and then refine the initial 3D model using the projection matching refinement method(3).

  16. Barriers to adequate prenatal care utilization in American Samoa

    PubMed Central

    Hawley, Nicola L; Brown, Carolyn; Nu’usolia, Ofeira; Ah-Ching, John; Muasau-Howard, Bethel; McGarvey, Stephen T

    2013-01-01

    Objective To describe the utilization of prenatal care in American Samoan women and to identify socio-demographic predictors of inadequate prenatal care utilization. Methods Using data from prenatal clinic records, women (n=692) were categorized according to the Adequacy of Prenatal Care Utilization Index as having received adequate plus, adequate, intermediate, or inadequate prenatal care during their pregnancy. Categorical socio-demographic predictors of the timing of initiation of prenatal care (week of gestation) and the adequacy of received services were identified using one-way analysis of variance (ANOVA) and independent samples t-tests. Results Between 2001 and 2008, 85.4% of women received inadequate prenatal care. Parity (P=0.02), maternal unemployment (P=0.03), and both parents being unemployed (P=0.03) were negatively associated with the timing of prenatal care initiation. Giving birth in 2007–2008, after a prenatal care incentive scheme had been introduced in the major hospital, was associated with earlier initiation of prenatal care (20.75 versus 25.12 weeks; P<0.01) and improved adequacy of received services (95.04% versus 83.8%; P=0.02). Conclusion The poor prenatal care utilization in American Samoa is a major concern. Improving healthcare accessibility will be key to encouraging women to attend prenatal care. The significant improvements in the adequacy of prenatal care seen in 2007–2008 suggest that the prenatal care incentive program implemented in 2006 may be a very positive step toward addressing issues of prenatal care utilization in this population. PMID:24045912
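    The index categorization can be sketched as follows. The cut-points used are the commonly cited Kotelchuck thresholds, which are an assumption here since the abstract does not restate them; the function name is illustrative:

```python
def apncu_category(initiation_month, observed_visits, expected_visits):
    """Classify prenatal care with Adequacy of Prenatal Care Utilization logic.

    Thresholds below are the commonly cited Kotelchuck cut-points (an
    assumption -- the abstract does not restate them): care must begin by
    the 4th month, and received visits are compared with those expected
    for the gestational age at delivery.
    """
    if initiation_month > 4 or expected_visits == 0:
        return "inadequate"
    ratio = observed_visits / expected_visits
    if ratio >= 1.10:
        return "adequate plus"
    if ratio >= 0.80:
        return "adequate"
    if ratio >= 0.50:
        return "intermediate"
    return "inadequate"

print(apncu_category(3, 14, 12))  # early start, >110% of expected visits -> adequate plus
print(apncu_category(6, 10, 12))  # late initiation dominates -> inadequate
```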

  17. A Mixed-Methods Randomized Controlled Trial of Financial Incentives and Peer Networks to Promote Walking among Older Adults

    ERIC Educational Resources Information Center

    Kullgren, Jeffrey T.; Harkins, Kristin A.; Bellamy, Scarlett L.; Gonzales, Amy; Tao, Yuanyuan; Zhu, Jingsan; Volpp, Kevin G.; Asch, David A.; Heisler, Michele; Karlawish, Jason

    2014-01-01

    Background: Financial incentives and peer networks could be delivered through eHealth technologies to encourage older adults to walk more. Methods: We conducted a 24-week randomized trial in which 92 older adults with a computer and Internet access received a pedometer, daily walking goals, and weekly feedback on goal achievement. Participants…

  18. Active video games as a tool to prevent excessive weight gain in adolescents: rationale, design and methods of a randomized controlled trial

    PubMed Central

    2014-01-01

    Background Excessive body weight, low physical activity, and excessive sedentary time in youth are major public health concerns. A new generation of video games, those that require physical activity to play (i.e., active games), may be a promising alternative to traditional non-active games for promoting physical activity and reducing sedentary behaviors in youth. The aim of this manuscript is to describe the design of a study evaluating the effects of a family-oriented active game intervention, incorporating several motivational elements, on anthropometrics and health behaviors in adolescents. Methods/Design The study is a randomized controlled trial (RCT), with non-active gaming adolescents aged 12-16 years randomly allocated to a ten-month intervention (receiving active games, as well as encouragement to play) or a waiting-list control group (receiving active games after the intervention period). Primary outcomes are adolescents’ measured BMI-SDS (SDS = adjusted for mean standard deviation score), waist circumference-SDS, hip circumference and sum of skinfolds. Secondary outcomes are adolescents’ self-reported time spent playing active and non-active games, other sedentary activities and consumption of sugar-sweetened beverages. In addition, a process evaluation is conducted, assessing the sustainability of the active games, enjoyment, perceived competence, perceived barriers for active game play, game context, injuries from active game play, activity replacement and intention to continue playing the active games. Discussion This is the first adequately powered RCT including normal weight adolescents, evaluating a reasonably long period of provision of and exposure to active games. Other strengths are the incorporation of motivational elements for active game play and a comprehensive process evaluation. This trial will provide evidence regarding the potential contribution of active games in prevention of excessive weight gain in

  19. 40 CFR 716.25 - Adequate file search.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 31 2011-07-01 2011-07-01 false Adequate file search. 716.25 Section 716.25 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of...

  20. 40 CFR 716.25 - Adequate file search.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 31 2014-07-01 2014-07-01 false Adequate file search. 716.25 Section 716.25 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of...

  1. "Something Adequate"? In Memoriam Seamus Heaney, Sister Quinlan, Nirbhaya

    ERIC Educational Resources Information Center

    Parker, Jan

    2014-01-01

    Seamus Heaney talked of poetry's responsibility to represent the "bloody miracle", the "terrible beauty" of atrocity; to create "something adequate". This article asks, what is adequate to the burning and eating of a nun and the murderous gang rape and evisceration of a medical student? It considers Njabulo Ndebele's…

  2. 40 CFR 716.25 - Adequate file search.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Adequate file search. 716.25 Section... ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of a person's responsibility to search records is limited to records in the location(s) where the...

  3. 40 CFR 51.354 - Adequate tools and resources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 2 2013-07-01 2013-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...

  4. 40 CFR 51.354 - Adequate tools and resources.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 2 2014-07-01 2014-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...

  5. 40 CFR 51.354 - Adequate tools and resources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 2 2010-07-01 2010-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...

  6. 40 CFR 51.354 - Adequate tools and resources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 2 2011-07-01 2011-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...

  7. 40 CFR 51.354 - Adequate tools and resources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 2 2012-07-01 2012-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...

  8. 9 CFR 305.3 - Sanitation and adequate facilities.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 9 Animals and Animal Products 2 2012-01-01 2012-01-01 false Sanitation and adequate facilities. 305.3 Section 305.3 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... OF VIOLATION § 305.3 Sanitation and adequate facilities. Inspection shall not be inaugurated if...

  9. 9 CFR 305.3 - Sanitation and adequate facilities.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Sanitation and adequate facilities. 305.3 Section 305.3 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... OF VIOLATION § 305.3 Sanitation and adequate facilities. Inspection shall not be inaugurated if...

  10. 9 CFR 305.3 - Sanitation and adequate facilities.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 9 Animals and Animal Products 2 2014-01-01 2014-01-01 false Sanitation and adequate facilities. 305.3 Section 305.3 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... OF VIOLATION § 305.3 Sanitation and adequate facilities. Inspection shall not be inaugurated if...

  11. 9 CFR 305.3 - Sanitation and adequate facilities.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Sanitation and adequate facilities. 305.3 Section 305.3 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... OF VIOLATION § 305.3 Sanitation and adequate facilities. Inspection shall not be inaugurated if...

  12. 9 CFR 305.3 - Sanitation and adequate facilities.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 9 Animals and Animal Products 2 2013-01-01 2013-01-01 false Sanitation and adequate facilities. 305.3 Section 305.3 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... OF VIOLATION § 305.3 Sanitation and adequate facilities. Inspection shall not be inaugurated if...

  13. 21 CFR 201.5 - Drugs; adequate directions for use.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 4 2011-04-01 2011-04-01 false Drugs; adequate directions for use. 201.5 Section 201.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) DRUGS: GENERAL LABELING General Labeling Provisions § 201.5 Drugs; adequate directions for use....

  14. Method for Generating a Randomized Flight-by-Flight Loading Sequence for an Aircraft

    DTIC Science & Technology

    1980-07-01

    (Garbled extract from the report's symbol list. Recoverable definitions: n_abm - the selection candidates in the random sampling process; M_abc - the number of different stresses in the flight stress spectrum (negative-g and one-g) for the a-th control point of an aircraft flying the c-th mission.)

  15. Increasing the Degrees of Freedom in Future Group Randomized Trials: The "df*" Method Revisited

    ERIC Educational Resources Information Center

    Murray, David M.; Blitstein, Jonathan L.; Hannan, Peter J.; Shadish, William R.

    2012-01-01

    Background: This article revisits an article published in Evaluation Review in 2005 on sample size estimation and power analysis for group-randomized trials. With help from a careful reader, we learned of an important error in the spreadsheet used to perform the calculations and generate the results presented in that article. As we studied the…

  16. Methods of Learning in Statistical Education: A Randomized Trial of Public Health Graduate Students

    ERIC Educational Resources Information Center

    Enders, Felicity Boyd; Diener-West, Marie

    2006-01-01

    A randomized trial of 265 consenting students was conducted within an introductory biostatistics course: 69 received eight small group cooperative learning sessions; 97 accessed internet learning sessions; 96 received no intervention. Effect on examination score (95% CI) was assessed by intent-to-treat analysis and by incorporating reported…

  17. Mixing Methods in Randomized Controlled Trials (RCTs): Validation, Contextualization, Triangulation, and Control

    ERIC Educational Resources Information Center

    Spillane, James P.; Pareja, Amber Stitziel; Dorner, Lisa; Barnes, Carol; May, Henry; Huff, Jason; Camburn, Eric

    2010-01-01

    In this paper we described how we mixed research approaches in a Randomized Control Trial (RCT) of a school principal professional development program. Using examples from our study we illustrate how combining qualitative and quantitative data can address some key challenges from validating instruments and measures of mediator variables to…

  18. Variability in DNA polymerase efficiency: effects of random error, DNA extraction method, and isolate type

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Using computer-generated data calculated with known amounts of random error (E = 1, 5, and 10%) associated with the calculated qPCR cycle number (C) at each of four (jth) 1:10 dilutions, we found that the “efficiency” (eff) associated with each population distribution of n = 10,000 measurements varied from 0.95 to ...
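    The standard dilution-series efficiency calculation that underlies such simulations can be sketched as follows (illustrative data, not the study's):

```python
import math

def qpcr_efficiency(log10_quantity, cq):
    """Estimate amplification efficiency from a 1:10 dilution series.

    Fits Cq = a + b*log10(quantity) by least squares; the standard
    efficiency estimate is eff = 10**(-1/b) - 1 (1.0 == perfect doubling
    each cycle). Data below are illustrative, not from the cited study.
    """
    n = len(cq)
    mx = sum(log10_quantity) / n
    my = sum(cq) / n
    b = sum((x - mx) * (y - my) for x, y in zip(log10_quantity, cq)) / \
        sum((x - mx) ** 2 for x in log10_quantity)
    return 10.0 ** (-1.0 / b) - 1.0

# four 1:10 dilutions; a perfectly efficient reaction shifts Cq by
# log2(10) ~ 3.32 cycles per dilution step
dilution_logs = [0.0, -1.0, -2.0, -3.0]
cq_values = [18.0, 21.32, 24.64, 27.96]
print(round(qpcr_efficiency(dilution_logs, cq_values), 3))  # -> 1.001
```

    Random error in the measured Cq values propagates into the fitted slope, which is why simulated error levels shift the distribution of eff away from its nominal value.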

  19. A Comparison of Three Methods of Analyzing Dichotomous Data in a Randomized Block Design.

    ERIC Educational Resources Information Center

    Mandeville, Garrett K.

    Results of a comparative study of F and Q tests, in a randomized block design with one replication per cell, are presented. In addition to these two procedures, a multivariate test was also considered. The model and test statistics, data generation and parameter selection, results, summary and conclusions are presented. Ten tables contain the…

  20. The random card sort method and respondent certainty in contingent valuation: an exploratory investigation of range bias.

    PubMed

    Shackley, Phil; Dixon, Simon

    2014-10-01

    Willingness to pay (WTP) values derived from contingent valuation surveys are prone to a number of biases. Range bias occurs when the range of money values presented to respondents in a payment card affects their stated WTP values. This paper reports the results of an exploratory study whose aim was to investigate whether the effects of range bias can be reduced through the use of an alternative to the standard payment card method, namely, a random card sort method. The results suggest that the random card sort method is prone to range bias but that this bias may be mitigated by restricting the analysis to the WTP values of those respondents who indicate they are 'definitely sure' they would pay their stated WTP.

  1. Asymptotic-preserving methods for hyperbolic and transport equations with random inputs and diffusive scalings

    SciTech Connect

    Jin, Shi; Xiu, Dongbin; Zhu, Xueyu

    2015-05-15

    In this paper we develop a set of stochastic numerical schemes for hyperbolic and transport equations with diffusive scalings and subject to random inputs. The schemes are asymptotic preserving (AP), in the sense that they preserve the diffusive limits of the equations in discrete setting, without requiring excessive refinement of the discretization. Our stochastic AP schemes are extensions of the well-developed deterministic AP schemes. To handle the random inputs, we employ generalized polynomial chaos (gPC) expansion and combine it with stochastic Galerkin procedure. We apply the gPC Galerkin scheme to a set of representative hyperbolic and transport equations and establish the AP property in the stochastic setting. We then provide several numerical examples to illustrate the accuracy and effectiveness of the stochastic AP schemes.

  2. Research on Time-series Modeling and Filtering Methods for MEMS Gyroscope Random Drift Error

    NASA Astrophysics Data System (ADS)

    Wang, Xiao Yi; Meng, Xiu Yun

    2017-03-01

    The precision of MEMS gyroscopes is reduced by random drift error. This paper applies time-series analysis to model the random drift error of a MEMS gyroscope. Based on the established model, a Kalman filter is employed to compensate for the error. To overcome the disadvantages of the conventional Kalman filter, the Sage-Husa adaptive filtering algorithm is utilized to improve the accuracy of the filtering results, and the orthogonality property of the innovation sequence is exploited to deal with outliers. The results show that, compared with the conventional Kalman filter, the modified filter not only enhances filtering accuracy but also resists outliers, ensuring the stability of filtering and thus improving the performance of gyroscopes.
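    The model-then-filter idea can be illustrated with a first-order AR drift model and a scalar Kalman filter (a minimal sketch; the Sage-Husa adaptation and innovation-based outlier handling described in the paper are not shown):

```python
import random

def kalman_ar1(measurements, phi, q, r):
    """Scalar Kalman filter for an AR(1) drift model x_k = phi*x_{k-1} + w_k.

    phi: AR coefficient fitted to the drift series (the time-series step);
    q, r: process and measurement noise variances. Returns filtered states.
    """
    x, p = 0.0, 1.0
    out = []
    for z in measurements:
        # predict
        x = phi * x
        p = phi * p * phi + q
        # update with measurement z
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1.0 - k) * p
        out.append(x)
    return out

# simulate an AR(1) drift and noisy measurements of it
rng = random.Random(1)
phi, q, r = 0.95, 0.01, 0.25
truth, x = [], 0.0
for _ in range(500):
    x = phi * x + rng.gauss(0.0, q ** 0.5)
    truth.append(x)
measured = [t + rng.gauss(0.0, r ** 0.5) for t in truth]
filtered = kalman_ar1(measured, phi, q, r)
mse_raw = sum((m - t) ** 2 for m, t in zip(measured, truth)) / len(truth)
mse_flt = sum((f - t) ** 2 for f, t in zip(filtered, truth)) / len(truth)
print(mse_flt < mse_raw)  # filtering reduces the drift-estimation error
```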

  3. Method for removal of random noise in eddy-current testing system

    DOEpatents

    Levy, Arthur J.

    1995-01-01

    Eddy-current response voltages, generated during inspection of metallic structures for anomalies, are often replete with noise. Analysis of the inspection data and results is therefore difficult or nearly impossible, resulting in inconsistent or unreliable evaluation of the structure. This invention processes the eddy-current response voltage, removing the effect of random noise, to allow proper identification of anomalies within and associated with the structure.

  4. Extended proton-neutron quasiparticle random-phase approximation in a boson expansion method

    NASA Astrophysics Data System (ADS)

    Civitarese, O.; Montani, F.; Reboiro, M.

    1999-08-01

    The proton-neutron quasiparticle random phase approximation (pn-QRPA) is extended to include next to leading order terms of the QRPA harmonic expansion. The procedure is tested for the case of a separable Hamiltonian in the SO(5) symmetry representation. The pn-QRPA equation of motion is solved by using a boson expansion technique adapted to the treatment of proton-neutron correlations. The resulting wave functions are used to calculate the matrix elements of double-Fermi transitions.

  5. Method for Evaluation of Outage Probability on Random Access Channel in Mobile Communication Systems

    NASA Astrophysics Data System (ADS)

    Kollár, Martin

    2012-05-01

    In order to access the cell, all mobile communication technologies use a so-called random-access procedure. In GSM, for example, this is represented by the CHANNEL REQUEST message sent from the Mobile Station (MS) to the Base Transceiver Station (BTS), which is then forwarded as a CHANNEL REQUIRED message to the Base Station Controller (BSC). If the BTS mistakenly decodes noise on the Random Access Channel (RACH) as a random access (a so-called 'phantom RACH'), then it is pure coincidence which 'establishment cause' the BTS believes it has recognized. A typical invalid channel access request, or phantom RACH, is characterized by an IMMEDIATE ASSIGNMENT procedure (assignment of an SDCCH or TCH) that is not followed by an ESTABLISH INDICATION from MS to BTS. In this paper, a mathematical model for evaluating the Power RACH Busy Threshold (RACHBT) is described and discussed, with the goal of guaranteeing a predetermined outage probability on the RACH. It focuses on the Global System for Mobile Communications (GSM); however, the obtained results can be generalized to other mobile technologies (i.e., WCDMA and LTE).
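    As an illustration of the thresholding idea, a busy threshold can be set as a quantile of the noise-power distribution so that noise alone rarely registers as an access attempt. The Gaussian assumption and the parameter values here are illustrative, not the paper's model:

```python
from statistics import NormalDist

def rach_busy_threshold(noise_mean_dbm, noise_std_db, outage_prob):
    """Power threshold exceeded by noise alone with probability outage_prob.

    Assumes the measured RACH noise power (in dB) is approximately
    Gaussian -- a simplifying assumption; the paper derives its threshold
    from its own statistical model of the receiver.
    """
    # P(noise > T) = outage_prob  =>  T is the (1 - outage_prob) quantile
    return NormalDist(noise_mean_dbm, noise_std_db).inv_cdf(1.0 - outage_prob)

t = rach_busy_threshold(-110.0, 3.0, 1e-3)
print(round(t, 1))  # roughly mean + 3.09 sigma
```

    Raising the threshold lowers the phantom-RACH rate but also risks missing genuine low-power access bursts, which is the trade-off the outage-probability target makes explicit.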

  6. Comparison of methods for estimating the intraclass correlation coefficient for binary responses in cancer prevention cluster randomized trials.

    PubMed

    Wu, Sheng; Crespi, Catherine M; Wong, Weng Kee

    2012-09-01

    The intraclass correlation coefficient (ICC) is a fundamental parameter of interest in cluster randomized trials as it can greatly affect statistical power. We compare common methods of estimating the ICC in cluster randomized trials with binary outcomes, with a specific focus on their application to community-based cancer prevention trials with primary outcome of self-reported cancer screening. Using three real data sets from cancer screening intervention trials with different numbers and types of clusters and cluster sizes, we obtained point estimates and 95% confidence intervals for the ICC using five methods: the analysis of variance estimator, the Fleiss-Cuzick estimator, the Pearson estimator, an estimator based on generalized estimating equations and an estimator from a random intercept logistic regression model. We compared estimates of the ICC for the overall sample and by study condition. Our results show that ICC estimates from different methods can be quite different, although confidence intervals generally overlap. The ICC varied substantially by study condition in two studies, suggesting that the common practice of assuming a common ICC across all clusters in the trial is questionable. A simulation study confirmed pitfalls of erroneously assuming a common ICC. Investigators should consider using sample size and analysis methods that allow the ICC to vary by study condition.
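    Of the five estimators compared, the ANOVA moment estimator is simple enough to sketch directly (toy data; the Fleiss-Cuzick, Pearson, GEE, and random-intercept logistic estimators require model fitting and are not shown):

```python
def anova_icc(clusters):
    """ANOVA (one-way) moment estimator of the ICC for binary outcomes.

    clusters: list of lists of 0/1 responses, one inner list per cluster.
    ICC = (MSB - MSW) / (MSB + (n0 - 1) * MSW), where n0 is the standard
    adjusted average cluster size for unequal clusters.
    """
    k = len(clusters)
    n = sum(len(c) for c in clusters)
    grand = sum(sum(c) for c in clusters) / n
    ssb = sum(len(c) * (sum(c) / len(c) - grand) ** 2 for c in clusters)
    ssw = sum(sum((y - sum(c) / len(c)) ** 2 for y in c) for c in clusters)
    msb = ssb / (k - 1)
    msw = ssw / (n - k)
    n0 = (n - sum(len(c) ** 2 for c in clusters) / n) / (k - 1)
    return (msb - msw) / (msb + (n0 - 1) * msw)

# toy data: screening indicator for four clinics (clusters)
clinics = [[1, 1, 1, 0], [0, 0, 1, 0], [1, 1, 0, 1], [0, 0, 0, 1]]
icc = anova_icc(clinics)
print(round(icc, 3))  # -> 0.077
```

    Note that this moment estimator can return slightly negative values when between-cluster variation is smaller than expected by chance, one reason different estimators can disagree on real data.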

  7. Region 9: Arizona Adequate Letter (10/14/2003)

    EPA Pesticide Factsheets

    This is a letter from Jack P. Broadbent, Director, to Nancy Wrona and Dennis Smith informing them that Maricopa County's motor vehicle emissions budgets in the 2003 MAGCO Maintenance Plan are adequate for transportation conformity purposes.

  8. Region 6: Texas Adequate Letter (4/16/2010)

    EPA Pesticide Factsheets

    This letter from EPA to the Texas Commission on Environmental Quality determined the 2021 motor vehicle emission budgets for nitrogen oxides (NOx) and volatile organic compounds (VOCs) for the Beaumont/Port Arthur area adequate for transportation conformity purposes.

  9. Region 2: New Jersey Adequate Letter (5/23/2002)

    EPA Pesticide Factsheets

    This April 22, 2002 letter from EPA to the New Jersey Department of Environmental Protection determined 2007 and 2014 Carbon Monoxide (CO) Mobile Source Emissions Budgets adequate for transportation conformity purposes and will be announced in the Federal

  10. Region 8: Colorado Adequate Letter (10/29/2001)

    EPA Pesticide Factsheets

    This letter from EPA to the Colorado Department of Public Health and Environment determined the Motor Vehicle Emissions Budgets in Denver's particulate matter (PM10) maintenance plan adequate for transportation conformity purposes.

  11. Region 1: New Hampshire Adequate Letter (8/12/2008)

    EPA Pesticide Factsheets

    This July 9, 2008 letter from EPA to the New Hampshire Department of Environmental Services, determined the 2009 Motor Vehicle Emissions Budgets (MVEBs) are adequate for transportation conformity purposes and will be announced in the Federal Register (FR).

  12. Region 8: Colorado Adequate Letter (1/20/2004)

    EPA Pesticide Factsheets

    This letter from EPA to the Colorado Department of Public Health and Environment determined Greeley's Carbon Monoxide (CO) maintenance plan Motor Vehicle Emissions Budgets adequate for transportation conformity purposes, to be announced in the FR.

  13. Region 8: Utah Adequate Letter (6/10/2005)

    EPA Pesticide Factsheets

    This letter from EPA to the Utah Department of Environmental Quality determined Salt Lake City's and Ogden's Carbon Monoxide (CO) maintenance plan Motor Vehicle Emissions Budgets adequate for transportation conformity purposes.

  14. 15 CFR 970.404 - Adequate exploration plan.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES Certification of Applications § 970.404 Adequate exploration plan. Before he may certify an application, the Administrator must...

  15. 15 CFR 970.404 - Adequate exploration plan.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES Certification of Applications § 970.404 Adequate exploration plan. Before he may certify an application, the Administrator must...

  16. 15 CFR 970.404 - Adequate exploration plan.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES Certification of Applications § 970.404 Adequate exploration plan. Before he may certify an application, the Administrator must...

  17. 15 CFR 970.404 - Adequate exploration plan.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES Certification of Applications § 970.404 Adequate exploration plan. Before he may certify an application, the Administrator must...

  18. 15 CFR 970.404 - Adequate exploration plan.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES Certification of Applications § 970.404 Adequate exploration plan. Before he may certify an application, the Administrator must...

  19. Region 6: New Mexico Adequate Letter (8/21/2003)

    EPA Pesticide Factsheets

    This is a letter from Carl Edlund, Director, to Alfredo Santistevan stating that the MVEBs contained in the latest revision to the Albuquerque Carbon Monoxide State Implementation Plan (SIP) are adequate for transportation conformity purposes.

  20. 10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... contained in a system of records are adequately trained to protect the security and privacy of such records..., by degaussing or by overwriting with the appropriate security software, in accordance...

  1. 4 CFR 200.14 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... require access to and use of records contained in a system of records are adequately trained to protect... with the appropriate security software, in accordance with regulations of the Archivist of the...

  2. 10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... contained in a system of records are adequately trained to protect the security and privacy of such records..., by degaussing or by overwriting with the appropriate security software, in accordance...

  3. 4 CFR 200.14 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... require access to and use of records contained in a system of records are adequately trained to protect... with the appropriate security software, in accordance with regulations of the Archivist of the...

  4. 10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... contained in a system of records are adequately trained to protect the security and privacy of such records..., by degaussing or by overwriting with the appropriate security software, in accordance...

  5. 10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... contained in a system of records are adequately trained to protect the security and privacy of such records..., by degaussing or by overwriting with the appropriate security software, in accordance...

  6. 4 CFR 200.14 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... require access to and use of records contained in a system of records are adequately trained to protect... with the appropriate security software, in accordance with regulations of the Archivist of the...

  7. Region 9: Nevada Adequate Letter (3/30/2006)

    EPA Pesticide Factsheets

    This is a letter from Deborah Jordan, Director, to Leo M. Drozdoff regarding Nevada's motor vehicle emissions budgets in the 2005 Truckee Meadows CO Redesignation Request and Maintenance Plan are adequate for transportation conformity decisions.

  8. A Novel Compressed Sensing Method for Magnetic Resonance Imaging: Exponential Wavelet Iterative Shrinkage-Thresholding Algorithm with Random Shift

    PubMed Central

    Zhang, Yudong; Yang, Jiquan; Yang, Jianfei; Liu, Aijun; Sun, Ping

    2016-01-01

    Aim. Accelerating magnetic resonance imaging (MRI) scanning can help improve hospital throughput, and patients benefit from less waiting time. Task. In the last decade, various rapid MRI techniques based on compressed sensing (CS) were proposed. However, neither the computation time nor the reconstruction quality of traditional CS-MRI met the requirements of clinical use. Method. In this study, a novel method was proposed with the name of exponential wavelet iterative shrinkage-thresholding algorithm with random shift (abbreviated as EWISTARS). It is composed of three successful components: (i) exponential wavelet transform, (ii) iterative shrinkage-thresholding algorithm, and (iii) random shift. Results. Experimental results validated that, compared to state-of-the-art approaches, EWISTARS obtained the least mean absolute error, the least mean-squared error, and the highest peak signal-to-noise ratio. Conclusion. EWISTARS is superior to state-of-the-art approaches. PMID:27066068
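    For readers unfamiliar with component (ii), a generic iterative shrinkage-thresholding algorithm (ISTA) can be sketched on a tiny dense l1-regularized least-squares problem. The matrix, step size, and penalty below are illustrative stand-ins, not the paper's MRI operators.

```python
# Minimal ISTA sketch: minimize 0.5*||Ax - b||^2 + lam*||x||_1
# on a toy dense problem (hypothetical data, not the MRI setting).

def soft(v, t):
    """Soft-thresholding: the proximal operator of the l1 norm."""
    return [max(abs(x) - t, 0.0) * (1 if x > 0 else -1) for x in v]

def ista(A, b, lam=0.1, step=0.1, iters=500):
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # residual r = Ax - b, gradient g = A^T r
        r = [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(m)]
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        # gradient step followed by shrinkage
        x = soft([x[j] - step * g[j] for j in range(n)], step * lam)
    return x
```

For an identity A, the iteration converges to the soft-thresholded data, which makes the fixed point easy to check by hand.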

  9. Capabilities and limitations of a current FORTRAN implementation of the T-matrix method for randomly oriented, rotationally symmetric scatterers.

    NASA Astrophysics Data System (ADS)

    Mishchenko, M.; Travis, L. D.

    1998-09-01

    The authors describe in detail a software implementation of a current version of the T-matrix method for computing light scattering by polydisperse, randomly oriented, rotationally symmetric particles. The authors give all necessary formulas, describe input and output parameters, discuss numerical aspects of T-matrix computations, demonstrate the capabilities and limitations of the codes, and discuss the performance of the codes in comparison with other available numerical approaches.

  10. Asthma randomized trial of indoor wood smoke (ARTIS): Rationale and Methods

    PubMed Central

    Noonan, Curtis W.; Ward, Tony J.

    2012-01-01

    Background Particulate matter (PM) exposures have been linked with poor respiratory health outcomes, especially among susceptible populations such as asthmatic children. Smoke from biomass combustion for residential home heating is an important source of PM in many rural or peri-urban areas in the United States. Aim To assess the efficacy of residential interventions that reduce indoor PM exposure from wood stoves and to quantify the corresponding improvements in quality of life and health outcomes for asthmatic children. Design The Asthma Randomized Trial of Indoor wood Smoke (ARTIS) study is an in-home intervention study of susceptible children exposed to biomass combustion smoke. Children, ages 7 to 17, with persistent asthma and living in homes that heat with wood stoves were recruited for this three-arm randomized placebo-controlled trial. Two household-level intervention strategies, wood stove replacement and air filters, were compared to a sham air filter placebo. Improvement in quality of life of asthmatic children was the primary outcome. Secondary asthma-related health outcomes included peak expiratory flow (PEF) and forced expiratory volume in the first second (FEV1), biomarkers in exhaled breath condensate, and frequency of asthma symptoms, medication usage, and healthcare utilization. Exposure outcomes included indoor and outdoor PM2.5 mass, particle counts of several size fractions, and carbon monoxide. Discussion To our knowledge, this was the first randomized trial in the US to utilize interventions targeting residential wood stoves to assess the impact on indoor PM and health outcomes in a susceptible population. Trial registration ClinicalTrials.gov NCT00807183. PMID:22735495

  11. Perception of spatiotemporal random fractals: an extension of colorimetric methods to the study of dynamic texture.

    PubMed

    Billock, V A; Cunningham, D W; Havig, P R; Tsou, B H

    2001-10-01

    Recent work establishes that static and dynamic natural images have fractal-like 1/f^α spatiotemporal spectra. Artificial textures with randomized phase spectra and 1/f^α amplitude spectra are also used in studies of texture and noise perception. Influenced by colorimetric principles and motivated by the ubiquity of 1/f^α spatial and temporal image spectra, we treat the spatial and temporal frequency exponents as the dimensions characterizing a dynamic texture space, and we characterize two key attributes of this space, the spatiotemporal appearance map and the spatiotemporal discrimination function (a map of MacAdam-like just-noticeable-difference contours).
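    The artificial textures described above can be sketched in one dimension: a signal with a 1/f^α amplitude spectrum and randomized phases, built as an explicit sum of sinusoids (no FFT needed). The parameters are hypothetical; the paper's stimuli are 2-D spatiotemporal extensions of the same idea.

```python
# Hypothetical 1-D analogue of a randomized-phase 1/f^alpha texture.

import math, random

def one_over_f_signal(n=128, alpha=1.0, seed=0):
    rng = random.Random(seed)
    signal = [0.0] * n
    for f in range(1, n // 2):
        amp = 1.0 / f ** alpha               # 1/f^alpha amplitude spectrum
        phase = rng.uniform(0, 2 * math.pi)  # randomized phase spectrum
        for t in range(n):
            signal[t] += amp * math.cos(2 * math.pi * f * t / n + phase)
    return signal
```

By construction, the discrete Fourier magnitude at frequency f scales as 1/f^α regardless of the random phases.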

  12. An adequate design for regression analysis of yield trials.

    PubMed

    Gusmão, L

    1985-12-01

    Based on theoretical demonstrations and illustrated with a numerical example from triticale yield trials in Portugal, the Completely Randomized Design is proposed as the one suited for regression analysis. When trials are designed in Complete Randomized Blocks, the regression of plot production on block mean, instead of the regression of cultivar mean on the overall mean of the trial, is proposed as the correct procedure for regression analysis. These proposed procedures, in addition to providing better agreement with the assumptions of regression and with the philosophy of the method, yield narrower confidence intervals and attenuate the hyperbolic effect. The increase in precision is brought about both by a decrease in Student's t values, owing to an increased number of degrees of freedom, and by a decrease in standard error, owing to non-proportional increases of the residual variance and of the sum of squares of the assumed independent variable. The new procedures seem promising for a better understanding of the mechanism of specific instability.
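    The proposed procedure, regressing each cultivar's plot production on the block mean, reduces to an ordinary least-squares slope. The yield numbers below are hypothetical, for illustration only.

```python
# Sketch of the proposed regression (hypothetical data): slope of one
# cultivar's plot production against the block means, by ordinary least squares.

def ols_slope(x, y):
    """Slope of the least-squares regression of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))

# block_means: mean yield of all plots within each block;
# plot_yield: one cultivar's plot production in those same blocks.
block_means = [3.2, 4.1, 5.0, 5.8]
plot_yield = [3.0, 4.4, 5.6, 6.1]
stability = ols_slope(block_means, plot_yield)  # slope > 1: above-average response
```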

  13. Is clinical measurement of anatomic axis of the femur adequate?

    PubMed

    Wu, Chi-Chuan

    2017-03-23

    Background and purpose - The accuracy of using clinical measurement from the anterior superior iliac spine (ASIS) to the center of the knee to determine an anatomic axis of the femur has rarely been studied. A radiographic technique with a full-length standing scanogram (FLSS) was used to assess the adequacy of the clinical measurement. Patients and methods - 100 consecutive young adult patients (mean age 34 (20-40) years) with chronic unilateral lower extremity injuries were studied. The pelvis and intact contralateral lower extremity images in the FLSS were selected for study. The angles between the tibial axis and the femoral shaft anatomic axis (S-AA), the piriformis anatomic axis (P-AA), the clinical anatomic axis (C-AA), and the mechanical axis (MA) were compared between sexes. Results - Only the S-AA and C-AA angles were statistically significantly different in the 100 patients (3.6° vs. 2.8°; p = 0.03). There was a strong correlation between S-AA, P-AA, and C-AA angles (r > 0.9). The average intersecting angle between MA and S-AA in the femur in the 100 patients was 5.5°, and it was 4.8° between MA and C-AA. Interpretation - Clinical measurement of an anatomic axis from the ASIS to the center of the knee may be an adequate and acceptable method to determine lower extremity alignment. The optimal inlet for antegrade femoral intramedullary nailing may be the lateral edge of the piriformis fossa.

  14. A new experimental method for measuring life time and crack growth of materials under multi-stage and random loadings.

    PubMed

    Stanzl, S

    1981-11-01

    The experimental equipment and method of operation of a special computer-controlled fatigue testing machine is described. This resonance testing machine, operating at ultrasonic frequencies (20 kHz), performs one-step, multistage, and random fatigue tests with the aid of a computerized control system in very short testing times. Differences between this method and testing procedure at conventional frequencies are pointed out. However, it is emphasized that the high frequency tests have practical merit aside from lower energy cost and testing times. Initial results of two-stage fatigue experiments are reported.

  15. Methods for conditioning anisotropic, operator-scaling, fractal random fields, and the effect on solute transport simulations

    NASA Astrophysics Data System (ADS)

    Revielle, J.; Benson, D. A.

    2008-12-01

    The fractal scaling of aquifer materials has been observed in many data sets. Typically, the scaling coefficient is different in different directions. To date, only unconditional realizations with these properties could be generated. We present and analyze two methods of creating conditional operator-scaling fractal random fields (OSFRF), which can condition any number and geometry of measurements into each realization. One method is based on the theory of Orthographic Projection (Feller, 1971) and requires continuous checking of a conditional probability function. The other method uses a best linear unbiased estimate (i.e., a kriged mean surface between known points) and an unconditional realization to create each conditional field. These two methods are analyzed for computational difficulty and for their ability to recreate the desired fractal scaling along different (eigenvector) directions. Finally, these methods are applied to a transport experiment through a slab of Massillon sandstone to show the advantage of using conditional OSFRFs in solute transport modeling.
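    The second (kriging-based) conditioning method can be sketched generically: conditional field = estimate from data + unconditional realization - the same estimator applied to the realization's values at the data locations. In this sketch, plain linear interpolation stands in for the kriged mean surface (both are exact interpolators, the property the identity relies on); the grid and values are hypothetical.

```python
# Generic conditioning identity, with linear interpolation standing in
# for kriging. Assumes every data location lies exactly on the grid.

import bisect

def interp(xs, ys, x):
    """Piecewise-linear estimate through (xs, ys); exact at the data points."""
    i = min(max(bisect.bisect_left(xs, x), 1), len(xs) - 1)
    x0, x1, y0, y1 = xs[i - 1], xs[i], ys[i - 1], ys[i]
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

def condition(grid, data_x, data_y, unconditional):
    """Turn an unconditional realization on `grid` into one honoring the data."""
    u_at_data = [unconditional[grid.index(x)] for x in data_x]
    return [interp(data_x, data_y, x)        # estimate from the measurements
            + unconditional[i]               # unconditional realization
            - interp(data_x, u_at_data, x)   # estimate from the realization
            for i, x in enumerate(grid)]
```

Because the estimator is exact at the data locations, the last two terms cancel there and the conditional field reproduces the measurements.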

  16. Application of the extended boundary condition method to Monte Carlo simulations of scattering of waves by two-dimensional random rough surfaces

    NASA Technical Reports Server (NTRS)

    Tsang, L.; Lou, S. H.; Chan, C. H.

    1991-01-01

    The extended boundary condition method is applied to Monte Carlo simulations of two-dimensional random rough surface scattering. The numerical results are compared with one-dimensional random rough surfaces obtained from the finite-element method. It is found that the mean scattered intensity from two-dimensional rough surfaces differs from that of one dimension for rough surfaces with large slopes.

  17. Single-primer-limited amplification: a method to generate random single-stranded DNA sub-library for aptamer selection.

    PubMed

    He, Chao-Zhu; Zhang, Kun-He; Wang, Ting; Wan, Qin-Si; Hu, Piao-Ping; Hu, Mei-Di; Huang, De-Qiang; Lv, Nong-Hua

    2013-09-01

    The amplification of a random single-stranded DNA (ssDNA) library by polymerase chain reaction (PCR) is a key step in each round of aptamer selection by systematic evolution of ligands by exponential enrichment (SELEX), but it can be impeded by the amplification of by-products due to the severely nonspecific hybridizations among various sequences in the PCR system. To amplify a random ssDNA library free from by-products, we developed a novel method termed single-primer-limited amplification (SPLA), which was initiated from the amplification of minus-stranded DNA (msDNA) of an ssDNA library with reverse primer limited to 5-fold molar quantity of the template, followed by the amplification of plus-stranded DNA (psDNA) of the msDNA with forward primer limited to 10-fold molar quantity of the template and recovery of psDNA by gel excision. We found that the amount of by-products increased with the increase of template amount and thermal cycle number. With the optimized template amount and thermal cycle, SPLA could amplify target ssDNA without detectable by-products and nonspecific products and could produce psDNA 16.1 times as much as that by asymmetric PCR. In conclusion, SPLA is a simple and feasible method to efficiently generate a random ssDNA sub-library for aptamer selection.

  18. Modeling spreading of oil slicks based on random walk methods and Voronoi diagrams.

    PubMed

    Durgut, İsmail; Reed, Mark

    2017-02-19

    We introduce a methodology for representation of a surface oil slick using a Voronoi diagram updated at each time step. The Voronoi cells scale the Gaussian random walk procedure representing the spreading process by individual particle stepping. The step length of stochastically moving particles is based on a theoretical model of the spreading process, establishing a relationship between the step length of diffusive spreading and the thickness of the slick at the particle locations. The Voronoi tessellation provides the areal extent of the slick particles and in turn the thicknesses of the slick and the diffusive-type spreading length for all particles. The algorithm successfully simulates the spreading process and results show very good agreement with the analytical solution. Moreover, the results are robust for a wide range of values for computational time step and total number of particles.
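    The particle-stepping idea can be sketched schematically: each slick particle takes a Gaussian random-walk step whose scale grows with the local slick thickness, so thick regions spread faster. The sketch replaces the paper's Voronoi-derived thicknesses with given per-particle values, and the diffusivity parameter k is hypothetical.

```python
# Schematic Gaussian random-walk spreading step (hypothetical parameters).
# Thicknesses would come from a Voronoi tessellation in the full method;
# here they are supplied directly.

import math, random

def step_particles(positions, thicknesses, dt=1.0, k=0.1, seed=1):
    rng = random.Random(seed)
    out = []
    for (x, y), h in zip(positions, thicknesses):
        sigma = math.sqrt(2.0 * k * h * dt)  # diffusive step scale ~ thickness
        out.append((x + rng.gauss(0, sigma), y + rng.gauss(0, sigma)))
    return out
```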

  19. Low-coherence interferometry as a method for assessing the transport parameters in randomly inhomogeneous media

    SciTech Connect

    Zimnyakov, D A; Sina, J S; Yuvchenko, S A; Isaeva, E A; Chekmasov, S P; Ushakova, O V

    2014-01-31

    The specific features of using low-coherence interferometric probing of layers in randomly inhomogeneous media for determination of the radiation propagation transport length both in diffuse regime and in the case of optically thin media are discussed. The transport length is determined by the rate of exponential decay of the interference signal with the increase in the path length difference between the light beams in the reference arm of the low-coherence interferometer and in the object arm, containing the probed layer as a diffuse reflector. The results are presented of experimental testing of the discussed approach with the use of layers of densely packed titanium dioxide nanoparticles and polytetrafluoroethylene. (radiation scattering)
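    Extracting a transport length from the exponential decay described above amounts to a log-linear least-squares fit of signal amplitude against path-length difference. The sketch below uses synthetic amplitudes; the instrument itself is not modeled.

```python
# Illustrative fit of A(z) = A0 * exp(-z / l_tr) to synthetic decay data.

import math

def transport_length(path_diffs, amplitudes):
    """Return l_tr from a log-linear least-squares fit."""
    xs, ys = path_diffs, [math.log(a) for a in amplitudes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -1.0 / slope   # decay rate is 1 / l_tr
```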

  20. What are the appropriate methods for analyzing patient-reported outcomes in randomized trials when data are missing?

    PubMed

    Hamel, J F; Sebille, V; Le Neel, T; Kubis, G; Boyer, F C; Hardouin, J B

    2015-11-06

    Subjective health measurements using Patient Reported Outcomes (PRO) are increasingly used in randomized trials, particularly for comparisons of patient groups. Two main types of analytical strategies can be used for such data: Classical Test Theory (CTT) and Item Response Theory (IRT) models. These two strategies display very similar characteristics when data are complete, but in the common case when data are missing, whether IRT or CTT would be the most appropriate remains unknown and was investigated using simulations. We simulated PRO data such as quality of life data. Missing responses to items were simulated as being completely random, depending on an observable covariate, or depending on an unobserved latent trait. The CTT-based methods considered allowed comparing scores using complete-case analysis, personal mean imputation, or multiple imputation based on a two-way procedure. The IRT-based method was the Wald test on a Rasch model including a group covariate. The IRT-based method and the multiple-imputation-based CTT method displayed the highest observed power and were the only unbiased methods whatever the kind of missing data. Online software and Stata® modules compatible with the innate mi impute suite are provided for performing such analyses. Traditional procedures (listwise deletion and personal mean imputation) should be avoided, due to inevitable problems of bias and lack of power.
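    One of the CTT-based strategies above, personal mean imputation, is simple enough to sketch: each respondent's missing item responses are replaced by the mean of that respondent's observed items before scoring. `None` marks a missing response; the item values are hypothetical.

```python
# Sketch of personal mean imputation for a single respondent's item scores.

def personal_mean_score(responses):
    """Sum score with missing items replaced by the personal mean."""
    observed = [r for r in responses if r is not None]
    if not observed:
        return None   # nothing observed: score cannot be imputed
    pm = sum(observed) / len(observed)
    return sum(r if r is not None else pm for r in responses)
```

As the abstract notes, this traditional procedure is biased under non-random missingness, which is why the simulation compares it against multiple imputation and the Rasch-based Wald test.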

  1. On Adequate Comparisons of Antenna Phase Center Variations

    NASA Astrophysics Data System (ADS)

    Schoen, S.; Kersten, T.

    2013-12-01

    One important part of ensuring the high quality of the International GNSS Service's (IGS) products is the collection and publication of receiver and satellite antenna phase center variations (PCV). The PCV are crucial for global and regional networks, since they introduce a global scale factor of up to 16 ppb or changes in the height component of up to 10 cm, respectively. Furthermore, antenna phase center variations are also important for precise orbit determination, navigation and positioning of mobile platforms, like e.g. the GOCE and GRACE gravity missions, or for accurate Precise Point Positioning (PPP) processing. Using the EUREF Permanent Network (EPN), Baire et al. (2012) showed that individual PCV values have a significant impact on geodetic positioning. The statements are further supported by studies of Steigenberger et al. (2013), where the impact of PCV on local ties is analysed. Currently, there are five calibration institutions, including the Institut für Erdmessung (IfE), contributing to the IGS PCV file. Different approaches like field calibrations and anechoic chamber measurements are in use. Additionally, the computation and parameterization of the PCV are completely different between the methods. Therefore, every new approach has to pass a benchmark test in order to ensure that variations of PCV values of an identical antenna obtained from different methods are as consistent as possible. Since the number of approaches to obtain these PCV values rises with the number of calibration institutions, there is a need for an adequate comparison concept, taking into account not only the numerical values but also stochastic information and computational issues of the determined PCVs. This is of special importance, since the majority of calibrated receiver antennas published by the IGS originate from absolute field calibrations based on the Hannover Concept, Wübbena et al. (2000). In this contribution, a concept for the adequate

  2. Regulatory requirements for providing adequate veterinary care to research animals.

    PubMed

    Pinson, David M

    2013-09-01

    Provision of adequate veterinary care is a required component of animal care and use programs in the United States. Program participants other than veterinarians, including non-medically trained research personnel and technicians, also provide veterinary care to animals, and administrators are responsible for assuring compliance with federal mandates regarding adequate veterinary care. All program participants therefore should understand the regulatory requirements for providing such care. The author provides a training primer on the US regulatory requirements for the provision of veterinary care to research animals. Understanding the legal basis and conditions of a program of veterinary care will help program participants to meet the requirements advanced in the laws and policies.

  3. Predictors for Reporting of Dietary Assessment Methods in Food-based Randomized Controlled Trials over a Ten-year Period.

    PubMed

    Probst, Yasmine; Zammit, Gail

    2016-09-09

    Monitoring dietary intake within a randomized controlled trial is vital to justifying the study outcomes when the study is food-based. A systematic literature review was conducted to determine how dietary assessment methods used to monitor dietary intake are reported and whether assistive technologies are used in conducting such assessments. OVID and ScienceDirect databases (2000-2010) were searched for food-based, parallel, randomized controlled trials conducted with humans using the search terms "clinical trial," "diet$ intervention" AND "diet$ assessment," "diet$ method$," "intake," "diet history," "food record," "food frequency questionnaire," "FFQ," "food diary," "24-hour recall." A total of 1364 abstracts were reviewed and 243 studies identified. The size of the study and country of origin appear to be the two most common predictors of reporting both the dietary assessment method and details of the form of assessment. The journal in which the study is published has no impact. Information technology use may increase in the future, allowing other methods and forms of dietary assessment to be used efficiently.

  4. Hybrid random walk-linear discriminant analysis method for unwrapping quantitative phase microscopy images of biological samples

    PubMed Central

    Kim, Diane N. H.; Teitell, Michael A.; Reed, Jason; Zangle, Thomas A.

    2015-01-01

    Abstract. Standard algorithms for phase unwrapping often fail for interferometric quantitative phase imaging (QPI) of biological samples due to the variable morphology of these samples and the requirement to image at low light intensities to avoid phototoxicity. We describe a new algorithm combining random walk-based image segmentation with linear discriminant analysis (LDA)-based feature detection, using assumptions about the morphology of biological samples to account for phase ambiguities when standard methods have failed. We present three versions of our method: first, a method for LDA image segmentation based on a manually compiled training dataset; second, a method using a random walker (RW) algorithm informed by the assumed properties of a biological phase image; and third, an algorithm which combines LDA-based edge detection with an efficient RW algorithm. We show that the combination of LDA plus the RW algorithm gives the best overall performance with little speed penalty compared to LDA alone, and that this algorithm can be further optimized using a genetic algorithm to yield superior performance for phase unwrapping of QPI data from biological samples. PMID:26305212

  5. Correcting treatment effect for treatment switching in randomized oncology trials with a modified iterative parametric estimation method.

    PubMed

    Zhang, Jin; Chen, Cong

    2016-09-20

    In randomized oncology trials, patients in the control arm are sometimes permitted to switch to receive experimental drug after disease progression. This is mainly due to ethical reasons or to reduce the patient dropout rate. While progression-free survival is not usually impacted by crossover, the treatment effect on overall survival can be highly confounded. The rank-preserving structural failure time (RPSFT) model and iterative parametric estimation (IPE) are the main randomization-based methods used to adjust for confounding in the analysis of overall survival. While the RPSFT has been extensively studied, the properties of the IPE have not been thoroughly examined and its application is not common. In this manuscript, we clarify the re-censoring algorithm needed for IPE estimation and incorporate it into a method we propose as modified IPE (MIPE). We compared the MIPE and RPSFT via extensive simulations and then walked through the analysis using the modified IPE in a real clinical trial. We provided practical guidance on bootstrap by examining the performance in estimating the variance and confidence interval for the MIPE. Our results indicate that the MIPE method with the proposed re-censoring rule is an attractive alternative to the RPSFT method. Copyright © 2016 John Wiley & Sons, Ltd.

  6. MULTILEVEL ACCELERATION OF STOCHASTIC COLLOCATION METHODS FOR PDE WITH RANDOM INPUT DATA

    SciTech Connect

    Webster, Clayton G; Jantsch, Peter A; Teckentrup, Aretha L; Gunzburger, Max D

    2013-01-01

    Stochastic Collocation (SC) methods for stochastic partial differential equations (SPDEs) suffer from the curse of dimensionality, whereby increases in the stochastic dimension cause an explosion of computational effort. To combat these challenges, multilevel approximation methods seek to decrease computational complexity by balancing spatial and stochastic discretization errors. As a form of variance reduction, multilevel techniques have been successfully applied to Monte Carlo (MC) methods, but may be extended to accelerate other methods for SPDEs in which the stochastic and spatial degrees of freedom are decoupled. This article presents a general convergence and computational complexity analysis of a multilevel method for SPDEs, demonstrating its advantages with regard to standard, single-level approximation. The numerical results highlight conditions under which multilevel sparse grid SC is preferable to the more traditional MC and SC approaches.
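    The multilevel telescoping idea can be illustrated on a toy problem. The sketch below applies plain multilevel Monte Carlo, not the article's sparse-grid SC, to a hypothetical model: a quadrature-discretized random integral, where coarse levels absorb most of the sampling cost.

```python
# Toy multilevel Monte Carlo sketch (hypothetical model problem).

import math, random

def P(omega, level):
    """Level-l approximation: midpoint rule for the integral of exp(omega*x) on [0,1]."""
    n = 2 ** level
    return sum(math.exp(omega * (i + 0.5) / n) for i in range(n)) / n

def mlmc(levels, samples_per_level, seed=0):
    rng = random.Random(seed)
    est = 0.0
    for l, n in zip(range(levels + 1), samples_per_level):
        corr = 0.0
        for _ in range(n):
            w = rng.random()
            # telescoping correction between consecutive discretization levels
            corr += P(w, l) - (P(w, l - 1) if l > 0 else 0.0)
        est += corr / n
    return est
```

Because the level corrections shrink as the discretization refines, far fewer samples are needed at the expensive fine levels than a single-level estimator would require.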

  7. Molecular profiles of Venezuelan isolates of Trypanosoma sp. by random amplified polymorphic DNA method.

    PubMed

    Perrone, T M; Gonzatti, M I; Villamizar, G; Escalante, A; Aso, P M

    2009-05-12

    Nine Trypanosoma sp. Venezuelan isolates, initially presumed to be T. evansi, were collected from three different hosts, capybara (Apure state), horse (Apure state), and donkey (Guarico state), and compared by the random amplified polymorphic DNA (RAPD) technique. Thirty-one to 46 reproducible fragments were obtained with 12 of the 40 primers used. Most of the primers detected molecular profiles with few polymorphisms among the seven horse, capybara, and donkey isolates. Quantitative analyses of the RAPD profiles of these isolates revealed a high degree of genetic conservation, with similarity coefficients between 85.7% and 98.5%. Ten of the primers generated polymorphic RAPD profiles with two of the three Trypanosoma sp. horse isolates, namely TeAp-N/D1 and TeGu-N/D1. The similarity coefficient between these two isolates and the rest ranged from 57.9% to 68.4%, and the corresponding dendrogram clustered TeAp-N/D1 and TeGu-N/D1 in a genetically distinct group.
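    Similarity coefficients of the kind reported above are commonly band-sharing (Dice) coefficients computed on 0/1 band-presence profiles; whether this paper used exactly this formula is an assumption, and the profiles below are hypothetical.

```python
# Band-sharing (Dice) similarity on hypothetical 0/1 RAPD band profiles.
# Assumed formula: 2 * shared bands / (bands in a + bands in b).

def dice_similarity(a, b):
    """a, b: 0/1 lists marking presence of each scored RAPD band."""
    shared = sum(x and y for x, y in zip(a, b))
    return 2 * shared / (sum(a) + sum(b))
```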

  8. Sample size and robust marginal methods for cluster-randomized trials with censored event times.

    PubMed

    Zhong, Yujie; Cook, Richard J

    2015-03-15

    In cluster-randomized trials, intervention effects are often formulated by specifying marginal models, fitting them under a working independence assumption, and using robust variance estimates to address the association in the responses within clusters. We develop sample size criteria within this framework, with analyses based on semiparametric Cox regression models fitted with event times subject to right censoring. At the design stage, copula models are specified to enable derivation of the asymptotic variance of estimators from a marginal Cox regression model and to compute the number of clusters necessary to satisfy power requirements. Simulation studies demonstrate the validity of the sample size formula in finite samples for a range of cluster sizes, censoring rates, and degrees of within-cluster association among event times. The power and relative efficiency implications of copula misspecification is studied, as well as the effect of within-cluster dependence in the censoring times. Sample size criteria and other design issues are also addressed for the setting where the event status is only ascertained at periodic assessments and times are interval censored.

  9. Comparability and Reliability Considerations of Adequate Yearly Progress

    ERIC Educational Resources Information Center

    Maier, Kimberly S.; Maiti, Tapabrata; Dass, Sarat C.; Lim, Chae Young

    2012-01-01

    The purpose of this study is to develop an estimate of Adequate Yearly Progress (AYP) that will allow for reliable and valid comparisons among student subgroups, schools, and districts. A shrinkage-type estimator of AYP using the Bayesian framework is described. Using simulated data, the performance of the Bayes estimator will be compared to…
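
    The estimator described above is hierarchical Bayes; a minimal empirical-Bayes sketch conveys the core shrinkage idea, with estimates for small schools pulled hardest toward the overall rate. The school counts and the pseudo-count PRIOR_WEIGHT are hypothetical, not the authors' model:

```python
# Minimal empirical-Bayes shrinkage of school proficiency rates toward
# the overall rate. Small schools are pulled harder toward the mean,
# which stabilizes comparisons among subgroups, schools, and districts.
# Hypothetical data; not the authors' hierarchical model.

schools = {                    # school: (n_proficient, n_tested)
    "A": (18, 20),             # small school, high raw rate
    "B": (300, 500),
    "C": (45, 150),
}

total_prof = sum(p for p, n in schools.values())
total_n = sum(n for p, n in schools.values())
grand_rate = total_prof / total_n

PRIOR_WEIGHT = 50              # pseudo-count controlling shrinkage strength

shrunk = {
    s: (p + PRIOR_WEIGHT * grand_rate) / (n + PRIOR_WEIGHT)
    for s, (p, n) in schools.items()
}
```

    School A's raw rate of 0.90 rests on only 20 students, so its shrunken estimate sits well below 0.90; school B's estimate barely moves because 500 students dominate the prior weight.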

  10. 13 CFR 107.200 - Adequate capital for Licensees.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... operate actively in accordance with your Articles and within the context of your business plan, as... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Adequate capital for Licensees. 107.200 Section 107.200 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION SMALL...

  11. 13 CFR 107.200 - Adequate capital for Licensees.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... operate actively in accordance with your Articles and within the context of your business plan, as... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false Adequate capital for Licensees. 107.200 Section 107.200 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION SMALL...

  12. Do Beginning Teachers Receive Adequate Support from Their Headteachers?

    ERIC Educational Resources Information Center

    Menon, Maria Eliophotou

    2012-01-01

    The article examines the problems faced by beginning teachers in Cyprus and the extent to which headteachers are considered to provide adequate guidance and support to them. Data were collected through interviews with 25 school teachers in Cyprus, who had recently entered teaching (within 1-5 years) in public primary schools. According to the…

  13. 13 CFR 108.200 - Adequate capital for NMVC Companies.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Companies. 108.200 Section 108.200 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION NEW MARKETS VENTURE CAPITAL ("NMVC") PROGRAM Qualifications for the NMVC Program Capitalizing A Nmvc Company § 108.200 Adequate capital for NMVC Companies. You must meet the requirements of §§ 108.200-108.230 in order...

  14. 13 CFR 108.200 - Adequate capital for NMVC Companies.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Companies. 108.200 Section 108.200 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION NEW MARKETS VENTURE CAPITAL ("NMVC") PROGRAM Qualifications for the NMVC Program Capitalizing A Nmvc Company § 108.200 Adequate capital for NMVC Companies. You must meet the requirements of §§ 108.200-108.230 in order...

  15. 13 CFR 108.200 - Adequate capital for NMVC Companies.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Companies. 108.200 Section 108.200 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION NEW MARKETS VENTURE CAPITAL ("NMVC") PROGRAM Qualifications for the NMVC Program Capitalizing A Nmvc Company § 108.200 Adequate capital for NMVC Companies. You must meet the requirements of §§ 108.200-108.230 in order...

  16. 13 CFR 108.200 - Adequate capital for NMVC Companies.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Companies. 108.200 Section 108.200 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION NEW MARKETS VENTURE CAPITAL ("NMVC") PROGRAM Qualifications for the NMVC Program Capitalizing A Nmvc Company § 108.200 Adequate capital for NMVC Companies. You must meet the requirements of §§ 108.200-108.230 in order...

  17. 13 CFR 108.200 - Adequate capital for NMVC Companies.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Companies. 108.200 Section 108.200 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION NEW MARKETS VENTURE CAPITAL ("NMVC") PROGRAM Qualifications for the NMVC Program Capitalizing A Nmvc Company § 108.200 Adequate capital for NMVC Companies. You must meet the requirements of §§ 108.200-108.230 in order...

  18. Understanding Your Adequate Yearly Progress (AYP), 2011-2012

    ERIC Educational Resources Information Center

    Missouri Department of Elementary and Secondary Education, 2011

    2011-01-01

    The "No Child Left Behind Act (NCLB) of 2001" requires all schools, districts/local education agencies (LEAs) and states to show that students are making Adequate Yearly Progress (AYP). NCLB requires states to establish targets in the following ways: (1) Annual Proficiency Target; (2) Attendance/Graduation Rates; and (3) Participation…

  19. 34 CFR 200.13 - Adequate yearly progress in general.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 34 Education 1 2011-07-01 2011-07-01 false Adequate yearly progress in general. 200.13 Section 200.13 Education Regulations of the Offices of the Department of Education OFFICE OF ELEMENTARY AND SECONDARY EDUCATION, DEPARTMENT OF EDUCATION TITLE I-IMPROVING THE ACADEMIC ACHIEVEMENT OF THE...

  20. 34 CFR 200.13 - Adequate yearly progress in general.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 34 Education 1 2013-07-01 2013-07-01 false Adequate yearly progress in general. 200.13 Section 200.13 Education Regulations of the Offices of the Department of Education OFFICE OF ELEMENTARY AND SECONDARY EDUCATION, DEPARTMENT OF EDUCATION TITLE I-IMPROVING THE ACADEMIC ACHIEVEMENT OF THE...

  1. Region 9: Arizona Adequate Letter (11/1/2001)

    EPA Pesticide Factsheets

    This is a letter from Jack P. Broadbent, Director, Air Division, to Nancy Wrona and James Bourney informing them of the adequacy of the Revised MAG 1999 Serious Area Carbon Monoxide Plan and that the MAG CO Plan is adequate for Maricopa County.

  2. An analytical method for disentangling the roles of adhesion and crowding for random walk models on a crowded lattice

    NASA Astrophysics Data System (ADS)

    Ellery, Adam J.; Baker, Ruth E.; Simpson, Matthew J.

    2016-10-01

    Migration of cells and molecules in vivo is affected by interactions with obstacles. These interactions can include crowding effects, as well as adhesion/repulsion between the motile cell/molecule and the obstacles. Here we present an analytical framework that can be used to separately quantify the roles of crowding and adhesion/repulsion using a lattice-based random walk model. Our method leads to an exact calculation of the long time Fickian diffusivity, and avoids the need for computationally expensive stochastic simulations.
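
    For contrast with the exact analytical calculation described above, the same quantity can be estimated by the brute-force stochastic simulation the paper's framework avoids. A sketch of a Monte Carlo estimate of the long-time diffusivity on a periodic 2D lattice with randomly placed blocking obstacles (crowding only, no adhesion; all parameters illustrative):

```python
import random

def estimate_diffusivity(obstacle_density, size=64, walkers=400,
                         steps=300, seed=1):
    """Monte Carlo estimate of the long-time diffusivity D = MSD / (4t)
    for a nearest-neighbour random walk on a periodic 2D lattice with
    randomly placed blocking obstacles (crowding only, no adhesion)."""
    rng = random.Random(seed)
    blocked = {(i, j) for i in range(size) for j in range(size)
               if rng.random() < obstacle_density}
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    msd = 0.0
    for _ in range(walkers):
        while True:                        # start each walker on a free site
            x, y = rng.randrange(size), rng.randrange(size)
            if (x, y) not in blocked:
                break
        ux = uy = 0                        # unwrapped displacement
        for _ in range(steps):
            dx, dy = rng.choice(moves)
            if ((x + dx) % size, (y + dy) % size) not in blocked:
                x, y = (x + dx) % size, (y + dy) % size
                ux, uy = ux + dx, uy + dy  # blocked moves leave the walker put
        msd += ux * ux + uy * uy
    return msd / walkers / (4.0 * steps)

d_free = estimate_diffusivity(0.0)     # no obstacles: D tends to 1/4 (lattice units)
d_crowded = estimate_diffusivity(0.3)  # crowding suppresses transport
```

    Obtaining smooth estimates this way requires many walkers and long runs, which is precisely the computational cost the analytical Fickian-diffusivity calculation sidesteps.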

  3. Statistical Power of Randomization Tests Used with Multiple-Baseline Designs.

    ERIC Educational Resources Information Center

    Ferron, John; Sentovich, Chris

    2002-01-01

    Estimated statistical power for three randomization tests used with multiple-baseline designs using Monte Carlo methods. For an effect size of 0.5, none of the tests provided an adequate level of power, and for an effect size of 1.0, power was adequate for the Koehler-Levin test and the Marascuilo-Busk test only when the series length was long and…
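
    The specific multiple-baseline tests studied (Koehler-Levin, Marascuilo-Busk) are not reproduced here; a simplified sketch of the general Monte Carlo approach, estimating the power of a plain two-group randomization test at two standardized effect sizes:

```python
import random
import statistics

def permutation_pvalue(a, b, n_perm, rng):
    """Two-sided randomization (permutation) test for a mean difference."""
    observed = abs(statistics.fmean(a) - statistics.fmean(b))
    pooled = list(a) + list(b)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = abs(statistics.fmean(pooled[:len(a)])
                   - statistics.fmean(pooled[len(a):]))
        if diff >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

def monte_carlo_power(effect_size, n=20, sims=200, n_perm=199, seed=7):
    """Power = proportion of simulated data sets rejected at alpha = 0.05."""
    rng = random.Random(seed)
    rejections = 0
    for _ in range(sims):
        a = [rng.gauss(0.0, 1.0) for _ in range(n)]
        b = [rng.gauss(effect_size, 1.0) for _ in range(n)]
        if permutation_pvalue(a, b, n_perm, rng) <= 0.05:
            rejections += 1
    return rejections / sims

power_small = monte_carlo_power(0.5)   # modest effect: clearly underpowered
power_large = monte_carlo_power(1.0)   # large effect
```

    The study's design-specific versions replace the two-group shuffle with permutations of intervention start points across baselines, but the power-estimation loop is the same.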

  4. Are Substance Use Prevention Programs More Effective in Schools Making Adequate Yearly Progress? A Study of Project ALERT

    ERIC Educational Resources Information Center

    Clark, Heddy Kovach; Ringwalt, Chris L.; Shamblen, Stephen R.; Hanley, Sean M.; Flewelling, Robert L.

    2011-01-01

    This exploratory study sought to determine if a popular school-based drug prevention program might be effective in schools that are making adequate yearly progress (AYP). Thirty-four schools with grades 6 through 8 in 11 states were randomly assigned either to receive Project ALERT (n = 17) or to a control group (n = 17); of these, 10 intervention…

  5. Methods of and apparatus for recording images occurring just prior to a rapid, random event

    DOEpatents

    Kelley, Edward F.

    1994-01-01

    An apparatus and a method are disclosed for recording images of events in a medium wherein the images that are recorded are of conditions existing just prior to and during the occurrence of an event that triggers recording of these images. The apparatus and method use an optical delay path that employs a spherical focusing mirror facing a circular array of flat return mirrors around a central flat mirror. The image is reflected in a symmetric pattern which balances astigmatism which is created by the spherical mirror. Delays on the order of hundreds of nanoseconds are possible.

  6. A seed-expanding method based on random walks for community detection in networks with ambiguous community structures

    PubMed Central

    Su, Yansen; Wang, Bangju; Zhang, Xingyi

    2017-01-01

    Community detection has received a great deal of attention, since it could help to reveal the useful information hidden in complex networks. Although most previous modularity-based and local modularity-based community detection algorithms could detect strong communities, they may fail to exactly detect several weak communities. In this work, we define a network with clear or ambiguous community structures based on the types of its communities. A seed-expanding method based on random walks is proposed to detect communities, especially for networks with ambiguous community structures. We identify local maximum degree nodes, and detect seed communities in a network. Then, the probability of a node belonging to each community is calculated based on the total probability model and random walks, and each community is expanded by repeatedly adding the node which is most likely to belong to it. Finally, we use the community optimization method to ensure that each node is in a community. Experimental results on both computer-generated and real-world networks demonstrate that the quality of the communities detected by the proposed algorithm is superior to that of state-of-the-art algorithms in networks with ambiguous community structures. PMID:28157183

  7. A seed-expanding method based on random walks for community detection in networks with ambiguous community structures

    NASA Astrophysics Data System (ADS)

    Su, Yansen; Wang, Bangju; Zhang, Xingyi

    2017-02-01

    Community detection has received a great deal of attention, since it could help to reveal the useful information hidden in complex networks. Although most previous modularity-based and local modularity-based community detection algorithms could detect strong communities, they may fail to exactly detect several weak communities. In this work, we define a network with clear or ambiguous community structures based on the types of its communities. A seed-expanding method based on random walks is proposed to detect communities, especially for networks with ambiguous community structures. We identify local maximum degree nodes, and detect seed communities in a network. Then, the probability of a node belonging to each community is calculated based on the total probability model and random walks, and each community is expanded by repeatedly adding the node which is most likely to belong to it. Finally, we use the community optimization method to ensure that each node is in a community. Experimental results on both computer-generated and real-world networks demonstrate that the quality of the communities detected by the proposed algorithm is superior to that of state-of-the-art algorithms in networks with ambiguous community structures.
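
    A stripped-down sketch of the seed-expanding idea on a toy graph: start from a maximum-degree seed, repeatedly add the outside neighbour most likely to step into the community in one random-walk move, and stop when conductance (cut edges over community volume) worsens. This is a simplified illustration, not the authors' total-probability-model algorithm:

```python
# Greedy random-walk seed expansion on a toy two-clique graph.
edges = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3),   # clique A
         (4, 5), (4, 6), (4, 7), (5, 6), (5, 7), (6, 7),   # clique B
         (3, 4)]                                           # weak bridge
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

def conductance(community):
    """Cut edges leaving the community divided by its volume."""
    cut = sum(1 for u in community for v in adj[u] if v not in community)
    vol = sum(len(adj[u]) for u in community)
    return cut / vol

seed = max(adj, key=lambda u: len(adj[u]))   # a maximum-degree node
community = {seed}
while True:
    frontier = {v for u in community for v in adj[u]} - community
    if not frontier:
        break
    # P(one random-walk step from v lands inside the community)
    best = max(frontier, key=lambda v: len(adj[v] & community) / len(adj[v]))
    if conductance(community | {best}) >= conductance(community):
        break                                # adding it worsens the cut
    community.add(best)
```

    On this graph the expansion absorbs all of clique A and refuses the bridge node, mirroring how seed expansion resists leaking across weak inter-community links.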

  8. A Robust and Versatile Method of Combinatorial Chemical Synthesis of Gene Libraries via Hierarchical Assembly of Partially Randomized Modules.

    PubMed

    Popova, Blagovesta; Schubert, Steffen; Bulla, Ingo; Buchwald, Daniela; Kramer, Wilfried

    2015-01-01

    A major challenge in gene library generation is to guarantee a large functional size and diversity that significantly increases the chances of selecting different functional protein variants. The use of trinucleotide mixtures for controlled randomization results in superior library diversity and offers the ability to specify the type and distribution of the amino acids at each position. Here we describe the generation of a high diversity gene library using tHisF of the hyperthermophile Thermotoga maritima as a scaffold. Combining various rational criteria with contingency, we targeted 26 selected codons of the thisF gene sequence for randomization at a controlled level. We have developed a novel method of creating full-length gene libraries by combinatorial assembly of smaller sub-libraries. Full-length libraries of high diversity can easily be assembled on demand from smaller and much less diverse sub-libraries, which circumvents the notoriously troublesome long-term archiving and repeated proliferation of high diversity ensembles of phages or plasmids. We developed a generally applicable software tool for sequence analysis of mutated gene sequences that provides efficient assistance for analysis of library diversity. Finally, practical utility of the library was demonstrated in principle by assessing the conformational stability of library members and by isolating protein variants with HisF activity from it. Our approach integrates a number of features of nucleic acid synthetic chemistry, biochemistry and molecular genetics into a coherent, flexible and robust method of combinatorial gene synthesis.

  9. Adequate Yearly Progress for Students with Emotional and Behavioral Disorders through Research-Based Practices

    ERIC Educational Resources Information Center

    Vannest, Kimberly J.; Temple-Harvey, Kimberly K.; Mason, Benjamin A.

    2009-01-01

    Because schools are held accountable for the academic performance of all students, it is important to focus on academics and the need for effective teaching practices. Adequate yearly progress, a method of accountability that is part of the No Child Left Behind Act (2001), profoundly affects the education of students who have emotional and…

  10. Odds of Getting Adequate Physical Activity by Dog Walking

    PubMed Central

    Soares, Jesus; Epping, Jacqueline N.; Owens, Chantelle J.; Brown, David R.; Lankford, Tina J.; Simoes, Eduardo J.; Caspersen, Carl J.

    2015-01-01

    Background We aimed to determine the likelihood that adult dog owners who walk their dogs will achieve a healthy level of moderate-intensity (MI) physical activity (PA), defined as at least 150 mins/wk. Methods We conducted a systematic search of 6 databases with data from 1990–2012 on dog owners’ PA, to identify those who achieved MIPA. To compare dog-walkers’ performance with non–dog walkers, we used a random effects model to estimate the unadjusted odds ratio (OR) and corresponding 95% confidence interval (CI). Results We retrieved 9 studies that met our inclusion criterion and allowed OR calculations. These yielded data on 6980 dog owners aged 18 to 81 years (41% men). Among them, 4463 (63.9%) walked their dogs. Based on total weekly PA, 2710 (60.7%) dog walkers, and 950 (37.7%) non–dog walkers achieved at least MIPA. The estimated OR was 2.74 (95% CI 2.09–3.60). Conclusion Across 9 published studies, almost 2 in 3 dog owners reported walking their dogs, and the walkers are more than 2.5 times more likely to achieve at least MIPA. These findings suggest that dog walking may be a viable strategy for dog owners to help achieve levels of PA that may enhance their health. PMID:24733365
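
    The review pools study-specific odds ratios with a random-effects model; collapsing the reported totals into a single 2x2 table instead gives a crude odds ratio with a Woolf-type confidence interval, which shows the arithmetic involved but is not the pooled estimate:

```python
import math

# 2x2 table assembled from the totals reported in the abstract:
# 2710 of 4463 dog walkers and 950 of 2517 non-dog walkers achieved
# at least moderate-intensity physical activity.
a, b = 2710, 4463 - 2710          # walkers: achieved / did not
c, d = 950, 2517 - 950            # non-walkers: achieved / did not

crude_or = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # Woolf's method
ci_low = math.exp(math.log(crude_or) - 1.96 * se_log_or)
ci_high = math.exp(math.log(crude_or) + 1.96 * se_log_or)
# Collapsing the tables ignores between-study variation, which is why
# this crude OR differs from the paper's random-effects estimate (2.74).
```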

  11. Method for Detecting a Random Process in a Convex Hull Volume

    DTIC Science & Technology

    2011-10-04

    Wald, A. and J. Wolfowitz. "On a test whether two samples are from the same population." The Annals of ... (1), pp. 66-87 (March 1943). Method A (Wald-Wolfowitz Independent Sample Runs Test Procedure): an initial statistical test on input distributions is performed to evaluate ... a two-valued data sequence, and is well known to those skilled in the art [Wald, A. and J. Wolfowitz, "On a test whether two samples are from the ..."].
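
    The record above cites the Wald-Wolfowitz runs test. A self-contained sketch of the standard normal-approximation form of that test: pool and rank the two samples, count runs of same-sample labels, and compare the count with its expectation under the hypothesis of a common population:

```python
import math

def runs_test(sample_a, sample_b):
    """Wald-Wolfowitz two-sample runs test (normal approximation).
    Returns the observed run count and its z-score; few runs (large
    negative z) indicate the samples separate in the pooled ordering."""
    labels = [lab for _, lab in sorted(
        [(x, "A") for x in sample_a] + [(x, "B") for x in sample_b])]
    runs = 1 + sum(1 for i in range(1, len(labels))
                   if labels[i] != labels[i - 1])
    n, m = len(sample_a), len(sample_b)
    mean = 2.0 * n * m / (n + m) + 1
    var = (2.0 * n * m * (2 * n * m - n - m)
           / ((n + m) ** 2 * (n + m - 1)))
    return runs, (runs - mean) / math.sqrt(var)

# Completely separated samples produce only 2 runs (AAAAABBBBB).
runs, z = runs_test([1, 2, 3, 4, 5], [6, 7, 8, 9, 10])
```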

  12. A Two-Stage Estimation Method for Random Coefficient Differential Equation Models with Application to Longitudinal HIV Dynamic Data.

    PubMed

    Fang, Yun; Wu, Hulin; Zhu, Li-Xing

    2011-07-01

    We propose a two-stage estimation method for random coefficient ordinary differential equation (ODE) models. A maximum pseudo-likelihood estimator (MPLE) is derived based on a mixed-effects modeling approach and its asymptotic properties for population parameters are established. The proposed method does not require repeatedly solving ODEs, and is computationally efficient although it does pay a price with the loss of some estimation efficiency. However, the method does offer an alternative approach when the exact likelihood approach fails due to model complexity and high-dimensional parameter space, and it can also serve as a method to obtain the starting estimates for more accurate estimation methods. In addition, the proposed method does not need to specify the initial values of state variables and preserves all the advantages of the mixed-effects modeling approach. The finite sample properties of the proposed estimator are studied via Monte Carlo simulations and the methodology is also illustrated with application to an AIDS clinical data set.
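
    A naive two-stage analogue (not the paper's maximum pseudo-likelihood estimator) makes the structure concrete for the simplest random-coefficient ODE, exponential decay dy/dt = -theta_i * y: stage 1 fits each subject by least squares using the model's closed form, so no ODE is solved repeatedly; stage 2 summarizes the subject-level estimates into population parameters. All data are simulated:

```python
import math
import random

rng = random.Random(3)

# Simulated subjects with random decay rates theta_i ~ N(0.5, 0.1^2),
# following dy/dt = -theta_i * y with y(0) = 1, observed with noise.
times = [0.5 * k for k in range(1, 9)]
subjects = []
for _ in range(100):
    theta = rng.gauss(0.5, 0.1)
    subjects.append([math.exp(-theta * t + rng.gauss(0.0, 0.05))
                     for t in times])

def slope_through_origin(ts, log_ys):
    """Least-squares slope of log y on t with the intercept fixed at 0."""
    return (sum(t * ly for t, ly in zip(ts, log_ys))
            / sum(t * t for t in ts))

# Stage 1: subject-specific estimates; the closed form log y = -theta * t
# replaces repeated numerical ODE solving.
theta_hat = [-slope_through_origin(times, [math.log(y) for y in obs])
             for obs in subjects]

# Stage 2: population parameters as summaries of the stage-1 estimates.
pop_mean = sum(theta_hat) / len(theta_hat)
pop_sd = (sum((th - pop_mean) ** 2 for th in theta_hat)
          / (len(theta_hat) - 1)) ** 0.5
```

    The trade-off the abstract describes appears even here: the two-stage route is fast and simple but less efficient than a joint likelihood, and its stage-2 spread overstates the true random-effect variance by the stage-1 estimation noise.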

  13. Does the World Health Organization criterion adequately define glaucoma blindness?

    PubMed Central

    Mokhles, P; Schouten, JSAG; Beckers, HJM; Webers, CAB

    2017-01-01

    Purpose Blindness in glaucoma is difficult to assess with merely the current World Health Organization (WHO) definition (a visual field restricted to 10° in a radius around central fixation), as this criterion does not cover other types of visual field loss that are encountered in clinical practice and that also constitute blindness. In this study, a 5-point ordinal scale was developed for the assessment of common visual field defect patterns, with the purpose of comparing blindness as an outcome with the findings obtained using the WHO criterion applied to the same visual fields. The scores with the two methods were compared between two ophthalmologists. In addition, the variability between these assessors in assessing the different visual field types was determined. Methods Two glaucoma specialists randomly assessed a sample of 423 visual fields from 77 glaucoma patients, stripped of all indices and masked for all patient variables. They applied the WHO criterion and a 5-point ordinal scale to all visual fields for the probability of blindness. Results The WHO criterion was mostly found applicable, and in good agreement for both assessors, for visual fields depicting a central island of vision or a temporal crescent. The percentage of blindness scores was higher when using the ordinal scale, 21.7% and 19.6% for assessors A and B, respectively, versus 14.4% and 11.3% for the WHO criterion. However, kappa was lower, 0.71 versus 0.78 for WHO. Conclusions The WHO criterion is strictly applied and shows good agreement between assessors; however, blindness does not always fit this criterion. More visual fields are labeled as blind when a less stringent criterion is used, but this leads to more interobserver variability. A new criterion that describes the extent, location, and depth of visual field defects together with their consequence for the patient’s quality of life is needed for the classification of glaucoma blindness. PMID:28280297
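
    The agreement figures quoted above are Cohen's kappa values. A minimal sketch of the computation for two assessors making a binary blind/not-blind call; the counts below are hypothetical, not the study's data:

```python
# Cohen's kappa: agreement between two raters corrected for the
# agreement expected by chance from their marginal rates.

def cohens_kappa(both_yes, a_only, b_only, both_no):
    total = both_yes + a_only + b_only + both_no
    p_observed = (both_yes + both_no) / total
    p_yes_a = (both_yes + a_only) / total     # marginal "blind" rates
    p_yes_b = (both_yes + b_only) / total
    p_chance = p_yes_a * p_yes_b + (1 - p_yes_a) * (1 - p_yes_b)
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical counts: 80 fields called blind by both assessors,
# 12 by A only, 8 by B only, 400 by neither.
kappa = cohens_kappa(80, 12, 8, 400)
```

    This chance correction explains the study's pattern: the ordinal scale labels more fields blind yet yields a lower kappa, because raw agreement alone does not account for the easier chance agreement of a stricter criterion.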

  14. Unsteady Fast Random Particle Mesh method for efficient prediction of tonal and broadband noises of a centrifugal fan unit

    NASA Astrophysics Data System (ADS)

    Heo, Seung; Cheong, Cheolung; Kim, Taehoon

    2015-09-01

    In this study, an efficient numerical method is proposed for predicting tonal and broadband noises of a centrifugal fan unit. The proposed method is based on Hybrid Computational Aero-Acoustic (H-CAA) techniques combined with the Unsteady Fast Random Particle Mesh (U-FRPM) method. The U-FRPM method is developed by extending the FRPM method proposed by Ewert et al. and is utilized to synthesize the turbulence flow field from unsteady RANS solutions. The H-CAA technique combined with the U-FRPM method is applied to predict broadband as well as tonal noises of a centrifugal fan unit in a household refrigerator. First, the unsteady flow field driven by a rotating fan is computed by solving the RANS equations with Computational Fluid Dynamics (CFD) techniques. The main source regions around the rotating fan are identified by examining the computed flow fields. Then, turbulence flow fields in the main source regions are synthesized by applying the U-FRPM method. The acoustic analogy is applied to model acoustic sources in the main source regions. Finally, the centrifugal fan noise is predicted by feeding the modeled acoustic sources into an acoustic solver based on the Boundary Element Method (BEM). The sound spectral levels predicted using the current numerical method show good agreement with the measured spectra at the Blade Pass Frequencies (BPFs) as well as in the high frequency range. Moreover, the present method enables quantitative assessment of relative contributions of identified source regions to the sound field by comparing predicted sound pressure spectra due to modeled sources.

  15. Army General Fund Adjustments Not Adequately Documented or Supported

    DTIC Science & Technology

    2016-07-26

    statements were unreliable and lacked an adequate audit trail. Furthermore, DoD and Army managers could not rely on the data in their accounting...risk that AGF financial statements will be materially misstated and the Army will not achieve audit readiness by the congressionally mandated...and $6.5 trillion in yearend adjustments made to Army General Fund data during FY 2015 financial statement compilation. We conducted this audit in

  16. Broadband inversion of 1J(CC) responses in 1,n-ADEQUATE spectra.

    PubMed

    Reibarkh, Mikhail; Williamson, R Thomas; Martin, Gary E; Bermel, Wolfgang

    2013-11-01

    Establishing the carbon skeleton of a molecule greatly facilitates the process of structure elucidation, both manual and computer-assisted. Recent advances in the family of ADEQUATE experiments demonstrated their potential in this regard. 1,1-ADEQUATE, which provides direct (13)C-(13)C correlation via (1)J(CC), and 1,n-ADEQUATE, which typically yields (3)J(CC) and (1)J(CC) correlations, are more sensitive and more widely applicable experiments than INADEQUATE and PANACEA. A recently reported modified pulse sequence that semi-selectively inverts (1)J(CC) correlations in 1,n-ADEQUATE spectra provided a significant improvement, allowing (1)J(CC) and (n)J(CC) correlations to be discerned in the same spectrum. However, the reported experiment requires a careful matching of the amplitude transfer function with (1)J(CC) coupling constants in order to achieve the inversion, and even then some (1)J(CC) correlations could still have positive intensity due to the oscillatory nature of the transfer function. Both shortcomings limit the practicality of the method. We now report a new, dual-optimized inverted (1)J(CC) 1,n-ADEQUATE experiment, which provides more uniform inversion of (1)J(CC) correlations across the range of 29-82 Hz. Unlike the original method, the dual optimization experiment does not require fine-tuning for the molecule's (1)J(CC) coupling constant values. Even more usefully, the dual-optimized version provides up to two-fold improvement in signal-to-noise for some long-range correlations. Using modern, cryogenically-cooled probes, the experiment can be successfully applied to samples of ~1 mg under favorable circumstances. The improvements afforded by dual optimization inverted (1)J(CC) 1,n-ADEQUATE experiment make it a useful and practical tool for NMR structure elucidation and should facilitate the implementation and utilization of the experiment.

  17. Prediction of broadband ground-motion time histories: Hybrid low/high-frequency method with correlated random source parameters

    USGS Publications Warehouse

    Liu, P.; Archuleta, R.J.; Hartzell, S.H.

    2006-01-01

    We present a new method for calculating broadband time histories of ground motion based on a hybrid low-frequency/high-frequency approach with correlated source parameters. Using a finite-difference method we calculate low-frequency synthetics (< ∼1 Hz) in a 3D velocity structure. We also compute broadband synthetics in a 1D velocity model using a frequency-wavenumber method. The low frequencies from the 3D calculation are combined with the high frequencies from the 1D calculation by using matched filtering at a crossover frequency of 1 Hz. The source description, common to both the 1D and 3D synthetics, is based on correlated random distributions for the slip amplitude, rupture velocity, and rise time on the fault. This source description allows for the specification of source parameters independent of any a priori inversion results. In our broadband modeling we include correlation between slip amplitude, rupture velocity, and rise time, as suggested by dynamic fault modeling. The method of using correlated random source parameters is flexible and can be easily modified to adjust to our changing understanding of earthquake ruptures. A realistic attenuation model is common to both the 3D and 1D calculations that form the low- and high-frequency components of the broadband synthetics. The value of Q is a function of the local shear-wave velocity. To produce more accurate high-frequency amplitudes and durations, the 1D synthetics are corrected with a randomized, frequency-dependent radiation pattern. The 1D synthetics are further corrected for local site and nonlinear soil effects by using a 1D nonlinear propagation code and generic velocity structure appropriate for the site’s National Earthquake Hazards Reduction Program (NEHRP) site classification. The entire procedure is validated by comparison with the 1994 Northridge, California, strong ground motion data set. The bias and error found here for response spectral acceleration are similar to the best results
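
    The combination step can be illustrated with complementary FFT masks, a crude stand-in for the matched filtering the authors apply at the 1 Hz crossover (the filter design and traces below are illustrative, not the paper's): when both inputs are the same trace, the hybrid reproduces it exactly, because the masks partition the spectrum.

```python
import numpy as np

def hybrid_broadband(low_freq_trace, high_freq_trace, dt, fc=1.0):
    """Combine a low-frequency synthetic (kept below fc) with a
    high-frequency synthetic (kept above fc) using complementary FFT
    masks, a crude stand-in for matched filtering at the crossover
    frequency fc in Hz."""
    n = len(low_freq_trace)
    freqs = np.fft.rfftfreq(n, d=dt)
    low_mask = freqs <= fc
    low = np.fft.irfft(np.fft.rfft(low_freq_trace) * low_mask, n=n)
    high = np.fft.irfft(np.fft.rfft(high_freq_trace) * ~low_mask, n=n)
    return low + high

# Sanity check: because the two masks are exactly complementary, feeding
# the same trace into both slots must return that trace unchanged.
dt = 0.01
t = np.arange(0.0, 10.0, dt)
trace = np.sin(2 * np.pi * 0.4 * t) + 0.3 * np.sin(2 * np.pi * 5.0 * t)
hybrid = hybrid_broadband(trace, trace, dt)
```

    In the paper's setting the two slots hold different signals (3D finite-difference and 1D frequency-wavenumber synthetics), so the choice of crossover filter shape matters; the hard spectral cut here is only the simplest possibility.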

  18. Response moments of dynamic systems under non-Gaussian random excitation by the equivalent non-Gaussian excitation method

    NASA Astrophysics Data System (ADS)

    Tsuchida, Takahiro; Kimura, Koji

    2016-09-01

    Equivalent non-Gaussian excitation method is proposed to obtain the response moments up to the 4th order of dynamic systems under non-Gaussian random excitation. The non-Gaussian excitation is prescribed by the probability density and the power spectrum, and is described by an Ito stochastic differential equation. Generally, moment equations for the response, which are derived from the governing equations for the excitation and the system, are not closed due to the nonlinearity of the diffusion coefficient in the equation for the excitation even though the system is linear. In the equivalent non-Gaussian excitation method, the diffusion coefficient is replaced with the equivalent diffusion coefficient approximately to obtain a closed set of the moment equations. The square of the equivalent diffusion coefficient is expressed by a quadratic polynomial. In numerical examples, a linear system subjected to non-Gaussian excitations with bimodal and Rayleigh distributions is analyzed by using the present method. The results show that the method yields the variance, skewness and kurtosis of the response with high accuracy for non-Gaussian excitations with widely different probability densities and bandwidths. The statistical moments of the equivalent non-Gaussian excitation are also investigated to describe the feature of the method.

  19. Testing Allele Transmission of an SNP Set Using a Family-Based Generalized Genetic Random Field Method.

    PubMed

    Li, Ming; Li, Jingyun; He, Zihuai; Lu, Qing; Witte, John S; Macleod, Stewart L; Hobbs, Charlotte A; Cleves, Mario A

    2016-05-01

    Family-based association studies are commonly used in genetic research because they can be robust to population stratification (PS). Recent advances in high-throughput genotyping technologies have produced a massive amount of genomic data in family-based studies. However, current family-based association tests are mainly focused on evaluating individual variants one at a time. In this article, we introduce a family-based generalized genetic random field (FB-GGRF) method to test the joint association between a set of autosomal SNPs (i.e., single-nucleotide polymorphisms) and disease phenotypes. The proposed method is a natural extension of a recently developed GGRF method for population-based case-control studies. It models offspring genotypes conditional on parental genotypes, and, thus, is robust to PS. Through simulations, we show that under various disease scenarios the FB-GGRF has improved power over a commonly used family-based sequence kernel association test (FB-SKAT). Further, similar to GGRF, the proposed FB-GGRF method is asymptotically well-behaved, and does not require empirical adjustment of the type I error rates. We illustrate the proposed method using a study of congenital heart defects with family trios from the National Birth Defects Prevention Study (NBDPS).

  20. Effects of foot massage applied in 2 different methods on symptom control in colorectal cancer patients: Randomized control trial.

    PubMed

    Uysal, Neşe; Kutlutürkan, Sevinç; Uğur, Işıl

    2017-02-07

    This randomized controlled clinical study aimed to determine the effect of 2 foot massage methods on symptom control in people with colorectal cancer who received chemoradiotherapy. Data were collected between June 16, 2015, and February 10, 2016, in the Department of Radiation Oncology of an oncology training and research hospital. The sample comprised 60 participants. Data were collected using an introductory information form, common terminology criteria for adverse events and European Organization for Research and Treatment of Cancer Quality of Life Questionnaires C30 and CR29. Participants were randomly allocated to 3 groups: classical foot massage, reflexology, and standard care control. The classical massage group received foot massage using classical massage techniques, and the reflexology group received foot reflexology focusing on symptom-oriented reflexes twice a week during a 5-week chemoradiotherapy treatment schedule. The control group received neither classical massage nor reflexology. All patients were provided with the same clinic routine care. The classical massage was effective in reducing pain level and distension incidence while foot reflexology was effective in reducing pain and fatigue level, lowering incidence of distension and urinary frequency and improving life quality.

  1. Preventing cognitive decline in older African Americans with mild cognitive impairment: design and methods of a randomized clinical trial.

    PubMed

    Rovner, Barry W; Casten, Robin J; Hegel, Mark T; Leiby, Benjamin E

    2012-07-01

    Mild Cognitive Impairment (MCI) affects 25% of older African Americans and predicts progression to Alzheimer's disease. An extensive epidemiologic literature suggests that cognitive, physical, and/or social activities may prevent cognitive decline. We describe the methods of a randomized clinical trial to test the efficacy of Behavior Activation to prevent cognitive decline in older African Americans with the amnestic multiple domain subtype of MCI. Community Health Workers deliver 6 initial in-home treatment sessions over 2-3 months and then 6 subsequent in-home booster sessions using language, materials, and concepts that are culturally relevant to older African Americans during this 24 month clinical trial. We are randomizing 200 subjects who are recruited from churches, senior centers, and medical clinics to Behavior Activation or Supportive Therapy, which controls for attention. The primary outcome is episodic memory as measured by the Hopkins Verbal Learning Test-Revised at baseline and at months 3, 12, 18, and 24. The secondary outcomes are general and domain-specific neuropsychological function, activities of daily living, depression, and quality-of-life. The negative results of recent clinical trials of drug treatments for MCI and Alzheimer's disease suggest that behavioral interventions may provide an alternative treatment approach to preserve cognition in an aging society.

  2. Preventing Cognitive Decline in Older African Americans with Mild Cognitive Impairment: Design and Methods of a Randomized Clinical Trial

    PubMed Central

    Rovner, Barry W.; Casten, Robin J.; Hegel, Mark T.; Leiby, Benjamin E.

    2012-01-01

    Mild Cognitive Impairment (MCI) affects 25% of older African Americans and predicts progression to Alzheimer's disease. An extensive epidemiologic literature suggests that cognitive, physical, and/or social activities may prevent cognitive decline. We describe the methods of a randomized clinical trial to test the efficacy of Behavior Activation to prevent cognitive decline in older African Americans with the amnestic multiple domain subtype of MCI. Community Health Workers deliver 6 initial in-home treatment sessions over 2-3 months and then 6 subsequent in-home booster sessions using language, materials, and concepts that are culturally relevant to older African Americans during this 24-month clinical trial. We are randomizing 200 subjects who are recruited from churches, senior centers, and medical clinics to Behavior Activation or Supportive Therapy, which controls for attention. The primary outcome is episodic memory as measured by the Hopkins Verbal Learning Test-Revised at baseline and at months 3, 12, 18, and 24. The secondary outcomes are general and domain-specific neuropsychological function, activities of daily living, depression, and quality of life. The negative results of recent clinical trials of drug treatments for MCI and Alzheimer's disease suggest that behavioral interventions may provide an alternative treatment approach to preserve cognition in an aging society. PMID:22406101

  3. Using sexually transmitted infection biomarkers to validate reporting of sexual behavior within a randomized, experimental evaluation of interviewing methods.

    PubMed

    Hewett, Paul C; Mensch, Barbara S; Ribeiro, Manoel Carlos S de A; Jones, Heidi E; Lippman, Sheri A; Montgomery, Mark R; van de Wijgert, Janneke H H M

    2008-07-15

    This paper examines the reporting of sexual and other risk behaviors within a randomized experiment using a computerized versus face-to-face interview mode. Biomarkers for sexually transmitted infection (STI) were used to validate self-reported behavior by interview mode. As part of a parent study evaluating home versus clinic screening and diagnosis for STIs, 818 women aged 18-40 years were recruited in 2004 at or near a primary care clinic in São Paulo, Brazil, and were randomized to a face-to-face interview or audio computer-assisted self-interviewing. Ninety-six percent of participants were tested for chlamydia, gonorrhea, and trichomoniasis. Reporting of STI risk behavior was consistently higher with the computerized mode of interview. Stronger associations between risk behaviors and STI were found with the computerized interview after controlling for sociodemographic factors. These results were obtained by using logistic regression approaches, as well as statistical methods that address potential residual confounding and covariate endogeneity. Furthermore, STI-positive participants were more likely than STI-negative participants to underreport risk behavior in the face-to-face interview. Results strongly suggest that computerized interviewing provides more accurate and reliable behavioral data. The analyses also confirm the benefits of using data on prevalent STIs for externally validating behavioral reporting.
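
    The mode comparison above rests on measures of association between reported behavior and STI status. A minimal sketch of the odds-ratio calculation on a 2x2 table (illustrative counts only, not the study's data; the paper's logistic-regression and endogeneity adjustments go well beyond this):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table with a Wald 95% confidence interval.
    a = behavior reported & STI+, b = reported & STI-,
    c = not reported & STI+,     d = not reported & STI-."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts illustrating a stronger association under ACASI:
or_ftf = odds_ratio_ci(30, 120, 60, 608)[0]
or_acasi = odds_ratio_ci(55, 150, 35, 578)[0]
```

    A larger odds ratio under the computerized mode, as sketched here, is the pattern the abstract describes: more honest reporting tightens the behavior-infection link.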

  4. FluoMEP: a new genotyping method combining the advantages of randomly amplified polymorphic DNA and amplified fragment length polymorphism.

    PubMed

    Chang, Alex; Liew, Woei Chang; Chuah, Aaron; Lim, Zijie; Lin, Qifeng; Orban, Laszlo

    2007-02-01

    PCR-based identification of differences between two unknown genomes often requires complex manipulation of the templates prior to amplification and/or gel electrophoretic separation of a large number of samples with manual methods. Here, we describe a new genotyping method, called fluorescent motif enhanced polymorphism (fluoMEP). The fluoMEP method is based on the random amplified polymorphic DNA (RAPD) assay, but combines the advantages of the large collection of unlabelled 10-mer primers (ca. 5000) from commercial sources and the power of the automated CE devices used for the detection of amplified fragment length polymorphism (AFLP) patterns. The link between these two components is provided by a fluorescently labeled "common primer" that is used in a two-primer PCR together with an unlabeled RAPD primer. By using the same "common primer" and a series of RAPD primers, DNA templates can be screened quickly and effectively for polymorphisms. Our manuscript describes the optimization of the method and its characterization on different templates. We demonstrate by using several different approaches that the addition of the "common primer" to the PCR changes the profile of amplified fragments, allowing for screening various parts of the genome with the same set of unlabeled primers. We also present an in silico analysis of the genomic localization of fragments amplified by a RAPD primer with two different "common primers" and alone.

  5. Synthesis of carbon-supported PtRh random alloy nanoparticles using electron beam irradiation reduction method

    NASA Astrophysics Data System (ADS)

    Matsuura, Yoshiyuki; Seino, Satoshi; Okazaki, Tomohisa; Akita, Tomoki; Nakagawa, Takashi; Yamamoto, Takao A.

    2016-05-01

    Bimetallic nanoparticle catalysts of PtRh supported on carbon were synthesized using an electron beam irradiation reduction method. The PtRh nanoparticle catalysts were composed of particles 2-3 nm in size, which were well dispersed on the surface of the carbon support nanoparticles. Analyses of X-ray diffraction and scanning transmission electron microscopy-energy-dispersive X-ray spectroscopy revealed that the PtRh nanoparticles have a randomly alloyed structure. The lattice constant of the PtRh nanoparticles showed good correlation with Vegard's law. These results are explained by the radiochemical formation process of the PtRh nanoparticles. Catalytic activities of PtRh/C nanoparticles for ethanol oxidation reaction were found to be higher than those obtained with Pt/C.
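
    Vegard's law, against which the measured lattice constant was checked, is a linear interpolation between the pure-metal values. A small sketch (the bulk fcc lattice constants used as defaults are approximate literature values, an assumption):

```python
def vegard_lattice_constant(x_pt, a_pt=3.924, a_rh=3.803):
    """Vegard's law: the alloy lattice constant varies linearly with composition.
    a(Pt_x Rh_{1-x}) = x * a_Pt + (1 - x) * a_Rh  (values in angstroms)."""
    return x_pt * a_pt + (1 - x_pt) * a_rh
```

    A random (substitutional) alloy whose XRD-derived lattice constant falls on this line, as reported above, is consistent with uniform mixing rather than a core-shell or segregated structure.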

  6. Genetic modification of preimplantation embryos: toward adequate human research policies.

    PubMed

    Dresser, Rebecca

    2004-01-01

    Citing advances in transgenic animal research and setbacks in human trials of somatic cell genetic interventions, some scientists and others want to begin planning for research involving the genetic modification of human embryos. Because this form of genetic modification could affect later-born children and their offspring, the protection of human subjects should be a priority in decisions about whether to proceed with such research. Yet because of gaps in existing federal policies, embryo modification proposals might not receive adequate scientific and ethical scrutiny. This article describes current policy shortcomings and recommends policy actions designed to ensure that the investigational genetic modification of embryos meets accepted standards for research on human subjects.

  7. Elements for adequate informed consent in the surgical context.

    PubMed

    Abaunza, Hernando; Romero, Klaus

    2014-07-01

    Given a history of atrocities and violations of ethical principles, several documents and regulations have been issued by a wide variety of organizations. They aim at ensuring that health care and clinical research adhere to defined ethical principles. A fundamental component was devised to ensure that the individual has been provided the necessary information to make an informed decision regarding health care or participation in clinical research. This article summarizes the history and regulations for informed consent and discusses suggested components for adequate consent forms for daily clinical practice in surgery as well as clinical research.

  8. Multicomponent Interdisciplinary Group Intervention for Self-Management of Fibromyalgia: A Mixed-Methods Randomized Controlled Trial

    PubMed Central

    Bourgault, Patricia; Lacasse, Anaïs; Marchand, Serge; Courtemanche-Harel, Roxanne; Charest, Jacques; Gaumond, Isabelle; Barcellos de Souza, Juliana; Choinière, Manon

    2015-01-01

    Background This study evaluated the efficacy of the PASSAGE Program, a structured multicomponent interdisciplinary group intervention for the self-management of FMS. Methods A mixed-methods randomized controlled trial (intervention (INT) vs. waitlist (WL)) was conducted with patients suffering from FMS. Data were collected at baseline (T0), at the end of the intervention (T1), and 3 months later (T2). The primary outcome was change in pain intensity (0-10). Secondary outcomes were fibromyalgia severity, pain interference, sleep quality, pain coping strategies, depression, health-related quality of life, patient global impression of change (PGIC), and perceived pain relief. Qualitative group interviews with a subset of patients were also conducted. Complete data from T0 to T2 were available for 43 patients. Results The intervention had a statistically significant impact on the three PGIC measures. At the end of the PASSAGE Program, the percentages of patients who perceived overall improvement in their pain levels, functioning and quality of life were significantly higher in the INT Group (73%, 55%, 77% respectively) than in the WL Group (8%, 12%, 20%). The same differences were observed 3 months post-intervention (Intervention group: 62%, 43%, 38% vs Waitlist Group: 13%, 13%, 9%). The proportion of patients who reported ≥50% pain relief was also significantly higher in the INT Group at the end of the intervention (36% vs 12%) and 3 months post-intervention (33% vs 4%). Results of the qualitative analysis were in line with the quantitative findings regarding the efficacy of the intervention. The improvement, however, was not reflected in the primary outcome and other secondary outcome measures. Conclusion The PASSAGE Program was effective in helping FMS patients gain a sense of control over their symptoms. We suggest including PGIC in future clinical trials on FMS as they appear to capture important aspects of the patients’ experience. Trial registration

  9. A randomized controlled trial of venlafaxine XR for major depressive disorder after spinal cord injury: Methods and lessons learned

    PubMed Central

    Bombardier, Charles H.; Fann, Jesse R.; Wilson, Catherine S.; Heinemann, Allen W.; Richards, J. Scott; Warren, Ann Marie; Brooks, Larry; Warms, Catherine A.; Temkin, Nancy R.; Tate, Denise G.

    2014-01-01

    Context/objective We describe the rationale, design, methods, and lessons learned conducting a treatment trial for major depressive disorder (MDD) or dysthymia in people with spinal cord injury (SCI). Design A multi-site, double-blind, randomized (1:1) placebo controlled trial of venlafaxine XR for MDD or dysthymia. Subjects were block randomized and stratified by site, lifetime history of substance dependence, and prior history of MDD. Setting Six SCI centers throughout the United States. Participants Across participating centers, 2536 subjects were screened and 133 were enrolled into the trial. Subjects were 18–64 years old and at least 1 month post-SCI. Interventions Twelve-week trial of venlafaxine XR versus placebo using a flexible titration schedule. Outcome measures The primary outcome was improvement in depression severity at 12 weeks. The secondary outcome was improvement in pain. Results This article includes study methods, modifications prompted by a formative review process, preliminary data on the study sample and lessons learned. We describe common methodological and operational challenges conducting multi-site trials and how we addressed them. Challenges included study organization and decision making, staff training, obtaining human subjects approval, standardization of measurement and treatment, data and safety monitoring, subject screening and recruitment, unblinding and continuity of care, database management, and data analysis. Conclusions The methodological and operational challenges we faced and the lessons we learned may provide useful information for researchers who aim to conduct clinical trials, especially in the area of medical treatment of depression in people with SCI. PMID:24090228
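
    The permuted-block 1:1 allocation described above (run independently within each stratum) can be sketched as follows; this is a minimal illustration, not the trial's actual allocation software, and the block size, arm labels, and seeding are assumptions:

```python
import random

def block_randomize(n_subjects, block_size=4, arms=("venlafaxine XR", "placebo"), seed=0):
    """Permuted-block 1:1 allocation: each block holds equal numbers of each arm,
    so group sizes stay balanced throughout enrollment. One such schedule would
    be generated per stratum (site x substance dependence x prior MDD history)."""
    rng = random.Random(seed)
    schedule = []
    while len(schedule) < n_subjects:
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)  # permute treatment order within the block
        schedule.extend(block)
    return schedule[:n_subjects]

schedule = block_randomize(24, seed=7)
```

    Because every complete block is balanced, an interim look at any multiple of the block size finds exactly equal arm sizes within each stratum.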

  10. Deriving welfare measures from discrete choice experiments: inconsistency between current methods and random utility and welfare theory.

    PubMed

    Lancsar, Emily; Savage, Elizabeth

    2004-09-01

    Discrete choice experiments (DCEs) are being used increasingly in health economics to elicit preferences for products and programs. The results of such experiments have been used to calculate measures of welfare or more specifically, respondents' 'willingness to pay' (WTP) for products and programs and their 'marginal willingness to pay' (MWTP) for the attributes that make up such products and programs. In this note we show that the methods currently used to derive measures of welfare from DCEs in the health economics literature are not consistent with random utility theory (RUT), or with microeconomic welfare theory more generally. The inconsistency with welfare theory is an important limitation on the use of such WTP estimates in cost-benefit analyses. We describe an alternative method of deriving measures of welfare (compensating variation) from DCEs that is consistent with RUT and is derived using welfare theory. We demonstrate its use in an empirical application to derive the WTP for asthma medication and compare it to the results elicited from the method currently used in the health economics literature.
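
    The welfare measure the authors advocate, compensating variation under random utility theory, reduces in the logit case to a difference of "logsums" scaled by the marginal utility of income. A hedged sketch of that calculation (function names and the single-scenario setup are ours, not the paper's):

```python
import math

def logsum(utilities):
    """Expected maximum utility (up to a constant) across alternatives under a
    logit model: ln(sum_j exp(V_j))."""
    return math.log(sum(math.exp(v) for v in utilities))

def compensating_variation(v_before, v_after, alpha):
    """Welfare-consistent measure from random utility theory:
    CV = (1/alpha) * [logsum(after) - logsum(before)],
    where alpha is the marginal utility of income (the negative of the cost
    coefficient in the estimated utility function)."""
    return (logsum(v_after) - logsum(v_before)) / alpha
```

    Unlike attribute-ratio "MWTP" shortcuts, this logsum difference accounts for the full choice set before and after a policy change, which is the consistency point the note makes.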

  11. Construction and optimization of an efficient amplification method of a random ssDNA library by asymmetric emulsion PCR.

    PubMed

    Shao, Keke; Shi, Xinhui; Zhu, Xiangjun; Cui, Leilei; Shao, Qixiang; Ma, Da

    2015-12-16

    Construction of a random ssDNA sublibrary is an important step of the aptamer screening process. The available construction methods include asymmetric PCR, biotin-streptavidin separation, and lambda exonuclease digestion, in which PCR amplification is a key step. The main drawback of PCR amplification is overamplification, which increases nonspecific hybridization among different products and by-products; this may cause the loss of potential high-quality aptamers, inefficient screening, and even screening failure. Cycle number optimization in PCR amplification is the main way to avoid overamplification but does not fundamentally eliminate the nonspecific hybridization, and the decreased cycle number may lead to insufficient product amounts. Here, we developed a new method, "asymmetric emulsion PCR," which could overcome the shortcomings of conventional PCR. In asymmetric emulsion PCR, different templates are separated by emulsion particles, allowing single-molecule PCR, in which each template is amplified separately and nonspecific hybridization is avoided. Overamplification or formation of by-products was not observed. The method is so simple that direct amplification of 40 or more cycles can provide a high-quality ssDNA library. Therefore, the asymmetric emulsion PCR would improve the screening efficiency of systematic evolution of ligands by exponential enrichment (SELEX).

  12. A tilt-pair based method for assigning the projection directions of randomly oriented single-particle molecules.

    PubMed

    Ueno, Yutaka; Mine, Shouhei; Kawasaki, Kazunori

    2015-04-01

    In this article, we describe an improved method to assign the projection angle for averaged images using tilt-pair images for three-dimensional reconstructions from randomly oriented single-particle molecular images. Our study addressed the so-called 'initial volume problem' in the single-particle reconstruction, which involves estimation of projection angles of the particle images. The projected images of the particles in different tilt observations were mixed and averaged for the characteristic views. After the ranking of these group average images in terms of reliable tilt angle information, mutual tilt angles between images are assigned from the constituent tilt-pair information. Then, multiples of the conical tilt series are made and merged to construct a network graph of the particle images in terms of projection angles, which are optimized for the three-dimensional reconstruction. We developed the method with images of a synthetic object and applied it to a single-particle image data set of the purified deacetylase from archaea. With the introduction of low-angle tilt observations to minimize unfavorable imaging conditions due to tilting, the results demonstrated reasonable reconstruction models without imposing symmetry to the structure. This method also guides its users to discriminate particle images of different conformational state of the molecule.

  13. Quantifying dose to the reconstructed breast: Can we adequately treat?

    SciTech Connect

    Chung, Eugene; Marsh, Robin B.; Griffith, Kent A.; Moran, Jean M.; Pierce, Lori J.

    2013-04-01

    To evaluate how immediate reconstruction (IR) impacts postmastectomy radiotherapy (PMRT) dose distributions to the reconstructed breast (RB), internal mammary nodes (IMN), heart, and lungs using quantifiable dosimetric end points. 3D conformal plans were developed for 20 IR patients, 10 autologous reconstruction (AR), and 10 expander-implant (EI) reconstruction. For each reconstruction type, 5 right- and 5 left-sided reconstructions were selected. Two plans were created for each patient, 1 with RB coverage alone and 1 with RB + IMN coverage. Left-sided EI plans without IMN coverage had higher heart Dmean than left-sided AR plans (2.97 and 0.84 Gy, p = 0.03). Otherwise, results did not vary by reconstruction type and all remaining metrics were evaluated using a combined AR and EI dataset. RB coverage was adequate regardless of laterality or IMN coverage (Dmean 50.61 Gy, D95 45.76 Gy). When included, IMN Dmean and D95 were 49.57 and 40.96 Gy, respectively. Mean heart doses increased with left-sided treatment plans and IMN inclusion. Right-sided treatment plans and IMN inclusion increased mean lung V20. Using standard field arrangements and 3D planning, we observed excellent coverage of the RB and IMN, regardless of laterality or reconstruction type. Our results demonstrate that adequate doses can be delivered to the RB with or without IMN coverage.
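
    The dosimetric end points used above (Dmean, D95, V20) can be computed from per-voxel dose samples. A simplified sketch assuming equal voxel volumes (not the planning system's actual DVH code; the percentile handling is approximate):

```python
def dvh_metrics(dose_gy):
    """Dose-volume metrics from a list of per-voxel doses in Gy:
    Dmean = mean dose; D95 = dose received by at least 95% of the volume;
    V20  = fraction of the volume receiving >= 20 Gy."""
    d = sorted(dose_gy, reverse=True)
    n = len(d)
    dmean = sum(d) / n
    d95 = d[int(0.95 * n) - 1]  # approximate percentile on the cumulative DVH
    v20 = sum(1 for x in d if x >= 20.0) / n
    return dmean, d95, v20
```

    Target coverage is judged by Dmean/D95 being close to the prescription, while V20 is an organ-at-risk constraint for lung.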

  14. Fundamental Vibration Frequency and Damping Estimation: A Comparison Using the Random Decrement Method, the Empirical Mode Decomposition, and the HV Spectral Ratio Method for Local Site Characterization

    NASA Astrophysics Data System (ADS)

    Huerta-Lopez, C. I.; Upegui Botero, F. M.; Pulliam, J.; Willemann, R. J.; Pasyanos, M.; Schmitz, M.; Rojas Mercedes, N.; Louie, J. N.; Moschetti, M. P.; Martinez-Cruzado, J. A.; Suárez, L.; Huerfano Moreno, V.; Polanco, E.

    2013-12-01

    Site characterization in civil engineering requires knowledge of at least two dynamic properties of soil systems: (i) the dominant vibration frequency and (ii) damping. As part of an effort to develop understanding of the principles of earthquake hazard analysis, particularly site characterization techniques using non-invasive/non-destructive seismic methods, a workshop (Pan-American Advanced Studies Institute: New Frontiers in Geophysical Research: Bringing New Tools and Techniques to Bear on Earthquake Hazard Analysis and Mitigation) was conducted during July 15-25, 2013, in Santo Domingo, Dominican Republic, by the alliance of the Pan-American Advanced Studies Institute (PASI) and the Incorporated Research Institutions for Seismology (IRIS), jointly supported by the Department of Energy (DOE) and the National Science Foundation (NSF). Preliminary results of the site characterization in terms of fundamental vibration frequency and damping are presented here from data collected during the workshop. Three different methods were used for these estimations and then compared in order to assess the stability of the estimates as well as the advantages and disadvantages of each methodology. The methods used were: (i) the Random Decrement Method (RDM), to estimate fundamental vibration frequency and damping simultaneously; (ii) Empirical Mode Decomposition (EMD), to estimate the vibration modes; and (iii) the Horizontal-to-Vertical Spectral Ratio (HVSR), to estimate the fundamental vibration frequency. In all cases ambient vibration and induced vibration were used.
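
    Of the three methods, the Random Decrement Method is the most direct to sketch: segments triggered at a threshold up-crossing are averaged so the random part of the excitation cancels, leaving a free-decay estimate from which frequency and damping can be read. A minimal illustration (the trigger condition and segment length are our choices):

```python
import math

def random_decrement(signal, threshold, segment_len):
    """Random decrement signature: average every segment that begins where the
    record up-crosses a fixed threshold. Random excitation averages out,
    leaving an estimate of the free-decay response of the system."""
    segments = []
    for i in range(1, len(signal) - segment_len):
        if signal[i - 1] < threshold <= signal[i]:  # up-crossing trigger
            segments.append(signal[i:i + segment_len])
    if not segments:
        return []
    return [sum(col) / len(segments) for col in zip(*segments)]

# Demo on a pure tone: every trigger aligns the same phase of the oscillation
record = [math.sin(0.2 * t) for t in range(2000)]
signature = random_decrement(record, threshold=0.5, segment_len=30)
```

    On real ambient-vibration records the signature is fit with a damped sinusoid to extract the fundamental frequency and damping ratio simultaneously, which is the RDM's advantage noted above.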

  15. Randomized Comparison of Two Vaginal Self-Sampling Methods for Human Papillomavirus Detection: Dry Swab versus FTA Cartridge

    PubMed Central

    Catarino, Rosa; Vassilakos, Pierre; Bilancioni, Aline; Vanden Eynde, Mathieu; Meyer-Hamme, Ulrike; Menoud, Pierre-Alain; Guerry, Frédéric; Petignat, Patrick

    2015-01-01

    Background Human papillomavirus (HPV) self-sampling (self-HPV) is valuable in cervical cancer screening. HPV testing is usually performed on physician-collected cervical smears stored in liquid-based medium. Dry filters and swabs are an alternative. We evaluated the adequacy of self-HPV using two dry storage and transport devices, the FTA cartridge and swab. Methods A total of 130 women performed two consecutive self-HPV samples. Randomization determined which of the two tests was performed first: self-HPV using dry swabs (s-DRY) or vaginal specimen collection using a cytobrush applied to an FTA cartridge (s-FTA). After self-HPV, a physician collected a cervical sample using liquid-based medium (Dr-WET). HPV types were identified by real-time PCR. Agreement between collection methods was measured using the kappa statistic. Results HPV prevalence for high-risk types was 62.3% (95%CI: 53.7–70.2) detected by s-DRY, 56.2% (95%CI: 47.6–64.4) by Dr-WET, and 54.6% (95%CI: 46.1–62.9) by s-FTA. There was overall agreement of 70.8% between s-FTA and s-DRY samples (kappa = 0.34), and of 82.3% between self-HPV and Dr-WET samples (kappa = 0.56). Detection sensitivities for low-grade squamous intraepithelial lesion or worse (LSIL+) were: 64.0% (95%CI: 44.5–79.8) for s-FTA, 84.6% (95%CI: 66.5–93.9) for s-DRY, and 76.9% (95%CI: 58.0–89.0) for Dr-WET. The preferred self-collection method among patients was s-DRY (40.8% vs. 15.4%). Regarding costs, the FTA card was five times more expensive than the swab (~5 US dollars (USD) per card vs. ~1 USD per swab). Conclusion Self-HPV using dry swabs is sensitive for detecting LSIL+ and less expensive than s-FTA. Trial Registration International Standard Randomized Controlled Trial Number (ISRCTN): 43310942 PMID:26630353
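
    Agreement between collection methods was quantified with the kappa statistic. A minimal sketch of unweighted Cohen's kappa for paired binary results (illustrative only; the study's kappas were computed over the full typed panel):

```python
from collections import Counter

def cohens_kappa(pairs):
    """Unweighted Cohen's kappa: chance-corrected agreement between two raters
    (here, two HPV collection methods scoring the same women)."""
    n = len(pairs)
    po = sum(1 for a, b in pairs if a == b) / n  # observed agreement
    left = Counter(a for a, _ in pairs)
    right = Counter(b for _, b in pairs)
    # chance agreement from the two marginal distributions
    pe = sum((left[c] / n) * (right[c] / n) for c in set(left) | set(right))
    return (po - pe) / (1 - pe)
```

    Kappa discounts the agreement expected by chance alone, which is why 70.8% raw agreement can yield only kappa = 0.34 when positivity rates are high.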

  16. Analytical model for random dopant fluctuation in double-gate MOSFET in the subthreshold region using macroscopic modeling method

    NASA Astrophysics Data System (ADS)

    Shin, Yong Hyeon; Yun, Ilgu

    2016-12-01

    An analytical model is proposed for the random dopant fluctuation (RDF) in a symmetric double-gate metal-oxide-semiconductor field-effect-transistor (DG MOSFET) in the subthreshold region. Unintended impurity dopants cannot be absolutely prevented during the device fabrication; hence, it is important to analytically model the fluctuations in the electrical characteristics caused by these impurity dopants. Therefore, a macroscopic modeling method is applied to represent the impurity dopants in DG MOSFETs. With this method, the two-dimensional (2D) Poisson equation is separated into a basic analytical DG MOSFET model with channel doping concentration NA and an impurity-dopant-related term with local doping concentration NRD confined in a specific rectangular area. To solve the second term, the manually solvable 2D Green's function for DG MOSFETs is used. Through calculation of the channel potential (ϕ(x, y)), the variations in the drive current (IDS) and threshold voltage (Vth) are extracted from the analytical model. All results from the analytical model for an impurity dopant in a DG MOSFET are examined by comparisons with the commercially available 2D numerical simulation results, with respect to various oxide thicknesses (tox), channel lengths (L), and location of impurity dopants.

  17. Development of a novel efficient method to construct an adenovirus library displaying random peptides on the fiber knob

    PubMed Central

    Yamamoto, Yuki; Goto, Naoko; Miura, Kazuki; Narumi, Kenta; Ohnami, Shumpei; Uchida, Hiroaki; Miura, Yoshiaki; Yamamoto, Masato; Aoki, Kazunori

    2014-01-01

    Redirection of adenovirus vectors by engineering the capsid-coding region has shown limited success because proper targeting ligands are generally unknown. To overcome this limitation, we constructed an adenovirus library displaying random peptides on the fiber knob, and its screening led to successful selections of several particular targeted vectors. In the previous library construction method, the full length of an adenoviral genome was generated by a Cre-lox mediated in vitro recombination between a fiber-modified plasmid library and the enzyme-digested adenoviral DNA/terminal protein complex (DNA-TPC) before transfection to the producer cells. In this system, the procedures were complicated and time-consuming, and approximately 30% of the vectors in the library were defective with no displaying peptide. These may hinder further extensive exploration of cancer-targeting vectors. To resolve these problems, in this study, we developed a novel method with the transfection of a fiber-modified plasmid library and a fiberless adenoviral DNA-TPC in Cre-expressing 293 cells. The use of in-cell Cre recombination and fiberless adenovirus greatly simplified the library-making steps. The fiberless adenovirus was useful in suppressing the expansion of unnecessary adenovirus vectors. In addition, the complexity of the library was more than a 10^4 level in one well of a 6-well dish, which was 10-fold higher than that of the original method. The results demonstrated that this novel method is useful in producing a high quality live adenovirus library, which could facilitate the development of targeted adenovirus vectors for a variety of applications in medicine. PMID:24380399

  18. Prostate cancer between prognosis and adequate/proper therapy

    PubMed Central

    Grozescu, T; Popa, F

    2017-01-01

    Knowing the indolent, non-invasive nature of most types of prostate cancer, as well as the simple fact that the disease seems more likely to be associated with age rather than with other factors (50% of men at the age of 50 and 80% at the age of 80 have it [1], with or without presenting any symptom), the big challenge of this clinical entity was to determine severity indicators (so far insufficient) to guide the physician towards an adequate attitude in the clinical setting. The risk of over-diagnosing and over-treating many prostate cancer cases (indicated by all the major European and American studies) is real and poses many question marks. The present paper was meant to deliver new research data and to reset the clinical approach in prostate cancer cases. PMID:28255369

  19. The cerebellopontine angle: does the translabyrinthine approach give adequate access?

    PubMed

    Fagan, P A; Sheehy, J P; Chang, P; Doust, B D; Coakley, D; Atlas, M D

    1998-05-01

    A long-standing but unfounded criticism of the translabyrinthine approach is the misperception that this approach does not give adequate access to the cerebellopontine angle. Because of what is perceived as limited visualization and operating space within the cerebellopontine angle, some surgeons still believe that the translabyrinthine approach is inappropriate for large acoustic tumors. In this study, the surgical access to the cerebellopontine angle by virtue of the translabyrinthine approach is measured and analyzed. The parameters are compared with those measured for the retrosigmoid approach. This series objectively confirms that the translabyrinthine approach offers the neurotologic surgeon a shorter operative depth to the tumor, via a similar-sized craniotomy. This permits superior visualization by virtue of a wider angle of surgical access. Such access is achieved with the merit of minimal cerebellar retraction.

  20. Comparison of non-surgical treatment methods for patients with lumbar spinal stenosis: protocol for a randomized controlled trial

    PubMed Central

    2014-01-01

    Background Lumbar spinal stenosis is the most common reason for spinal surgery in older adults. Previous studies have shown that surgery is effective for severe cases of stenosis, but many patients with mild to moderate symptoms are not surgical candidates. These patients and their providers are seeking effective non-surgical treatment methods to manage their symptoms; yet there is a paucity of comparative effectiveness research in this area. This knowledge gap has hindered the development of clinical practice guidelines for non-surgical treatment approaches for lumbar spinal stenosis. Methods/design This study is a prospective randomized controlled clinical trial that will be conducted from November 2013 through October 2016. The sample will consist of 180 older adults (>60 years) who have both an anatomic diagnosis of stenosis confirmed by diagnostic imaging, and signs/symptoms consistent with a clinical diagnosis of lumbar spinal stenosis confirmed by clinical examination. Eligible subjects will be randomized into one of three pragmatic treatment groups: 1) usual medical care; 2) individualized manual therapy and rehabilitative exercise; or 3) community-based group exercise. All subjects will be treated for a 6-week course of care. The primary subjective outcome is the Swiss Spinal Stenosis Questionnaire, a self-reported measure of pain/function. The primary objective outcome is the Self-Paced Walking Test, a measure of walking capacity. The secondary objective outcome will be a measurement of physical activity during activities of daily living, using the SenseWear Armband, a portable device to be worn on the upper arm for one week. The primary analysis will use linear mixed models to compare the main effects of each treatment group on the changes in each outcome measure. Secondary analyses will include a responder analysis by group and an exploratory analysis of potential baseline predictors of treatment outcome. Discussion Our study should provide evidence

  1. Leveraging Random Number Generation for Mastery of Learning in Teaching Quantitative Research Courses via an E-Learning Method

    ERIC Educational Resources Information Center

    Boonsathorn, Wasita; Charoen, Danuvasin; Dryver, Arthur L.

    2014-01-01

    E-Learning brings access to a powerful but often overlooked teaching tool: random number generation. Using random number generation, a practically infinite number of quantitative problem-solution sets can be created. In addition, within the e-learning context, in the spirit of the mastery of learning, it is possible to assign online quantitative…
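
    The core idea, reproducible per-student problem-solution sets via seeded random number generation, can be sketched as follows (the seeding scheme and the problem template are hypothetical, for illustration only):

```python
import random

def make_problem(student_id, seed_base=2014):
    """Deterministic per-student problem-solution pair: seeding the generator
    with the student ID makes every quiz unique yet reproducible for grading."""
    rng = random.Random(seed_base * 100003 + student_id)
    xs = [rng.randint(10, 99) for _ in range(8)]
    question = "Compute the sample mean of %s." % xs
    solution = sum(xs) / len(xs)
    return question, solution
```

    Re-running with the same student ID regenerates the identical problem and answer key, so the instructor never needs to store the practically infinite pool of variants.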

  2. A new method of infrared thermography for quantification of brown adipose tissue activation in healthy adults (TACTICAL): a randomized trial.

    PubMed

    Ang, Qi Yan; Goh, Hui Jen; Cao, Yanpeng; Li, Yiqun; Chan, Siew-Pang; Swain, Judith L; Henry, Christiani Jeyakumar; Leow, Melvin Khee-Shing

    2017-05-01

    The ability to alter the amount and activity of brown adipose tissue (BAT) in human adults is a potential strategy to manage obesity and related metabolic disorders, using food, drug, and environmental stimuli with BAT-activating/recruiting capacity. Infrared thermography (IRT) provides a non-invasive and inexpensive alternative to the current methods (e.g. (18)F-FDG PET) used to assess BAT. We have quantified BAT activation in the cervical-supraclavicular (C-SCV) region using IRT video imaging and a novel image computational algorithm by studying C-SCV heat production in healthy young men after cold stimulation and the ingestion of capsinoids in a prospective double-blind placebo-controlled randomized trial. Subjects were divided into low-BAT and high-BAT groups based on changes in IR emissions in the C-SCV region induced by cold. The high-BAT group showed significant increases in energy expenditure, fat oxidation, and heat output in the C-SCV region post-capsinoid ingestion compared to post-placebo ingestion, but the low-BAT group did not. Based on these results, we conclude that IRT is a promising tool for quantifying BAT activity.

  3. Biotype stability of Candida albicans isolates after culture storage determined by randomly amplified polymorphic DNA and phenotypical methods.

    PubMed

    Bacelo, Katia Leston; da Costa, Karen Regina Carim; Ferreira, Joseane Cristina; Candido, Regina Celia

    2010-11-01

Typing methods that evaluate isolates in terms of their phenotypical and molecular characteristics are essential in epidemiological studies. In this study, Candida albicans biotypes were determined before and after storage in order to verify their stability. Twenty C. albicans isolates were typed by Randomly Amplified Polymorphic DNA (RAPD), by production of phospholipase and proteinase exoenzymes (enzymotyping), and by morphotyping, before and after 180 days of storage in Sabouraud dextrose agar (SDA) and sterilised distilled water. Before storage, 19 RAPD patterns, two enzymotypes and eight morphotypes were identified. The fragment patterns obtained by RAPD were not significantly altered after storage; in contrast, the majority of the isolates changed their enzymotype and morphotype. RAPD typing provided the highest discriminatory index (DI) among the isolates (DI = 0.995) and maintained the profiles identified, confirming its utility in epidemiological surveys. Given the low reproducibility observed after storage in SDA and distilled water by morphotyping (DI = 0.853) and enzymotyping (DI = 0.521), these techniques are not recommended for stored isolates.
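
The discriminatory index (DI) cited above is conventionally Simpson's index of diversity as applied to typing methods by Hunter and Gaston. A minimal sketch; note that 19 patterns among 20 isolates implies exactly one pattern shared by two isolates, which reproduces the reported 0.995:

```python
from collections import Counter

def discriminatory_index(type_assignments):
    """Simpson-based discriminatory index (Hunter & Gaston):
    the probability that two isolates drawn at random belong to
    different types. DI = 1 - sum(n_j*(n_j-1)) / (N*(N-1)).
    """
    counts = Counter(type_assignments).values()
    n = sum(counts)
    return 1.0 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

# 20 isolates, 19 RAPD patterns: one pattern (type 0) occurs twice.
types = list(range(19)) + [0]
di = round(discriminatory_index(types), 3)   # -> 0.995
```

A DI of 1.0 means every isolate is distinguishable; a DI of 0.0 means the method cannot separate any two isolates.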

  4. Random Decrement Method and Modeling H/V Spectral Ratios: An Application for Soft Shallow Layers Characterization

    NASA Astrophysics Data System (ADS)

    Song, H.; Huerta-Lopez, C. I.; Martinez-Cruzado, J. A.; Rodriguez-Lozoya, H. E.; Espinoza-Barreras, F.

    2009-05-01

Results of an ongoing study to estimate the ground response under weak and moderate earthquake excitations are presented. A reliable site characterization in terms of soil properties and sub-soil layer configuration is required for a trustworthy estimation of the ground response under dynamic loads. The study comprised four steps: (1) Ambient noise measurements were collected at the study site, where a bridge was under construction between the cities of Tijuana and Ensenada in Mexico. The time series were collected using a six-channel recorder with a 16-bit ADC over a maximum voltage range of ±2.5 V and configurable Butterworth/Bessel filters, gain, and sampling rate. The sensors were three-orthogonal-component (X, Y, Z) accelerometers with a sensitivity of 20 V/g, flat frequency response from DC to 200 Hz, and a full range of ±0.25 g. (2) Experimental H/V spectral ratios were computed to estimate the fundamental vibration frequency at the site. (3) Using the time-domain experimental H/V spectral ratios as well as the original recorded time series, the random decrement method was applied to estimate the fundamental frequency and damping of the site (system). (4) Finally, theoretical H/V spectral ratios were obtained by means of the stiffness-matrix wave propagation method. The results were then compared with a geotechnical study available for the site.
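
The random decrement step (3) works by averaging signal segments that start whenever the response up-crosses a trigger level; the random excitation averages out, leaving an estimate of the free-decay signature from which frequency and damping can be read. A minimal sketch (hypothetical signal and parameters, not the study's data):

```python
import math
import random

def random_decrement(signal, trigger, seg_len):
    """Random decrement signature: average every segment that begins where
    the signal up-crosses the trigger level. The random components cancel
    in the average, approximating the system's free-decay response.
    """
    segments = []
    for i in range(1, len(signal) - seg_len):
        if signal[i - 1] < trigger <= signal[i]:     # up-crossing trigger
            segments.append(signal[i:i + seg_len])
    if not segments:
        raise ValueError("no trigger crossings found")
    n = len(segments)
    return [sum(col) / n for col in zip(*segments)]

# Synthetic stationary response: a sinusoid buried in uniform noise.
rng = random.Random(0)
response = [math.sin(0.3 * t) + 0.2 * (rng.random() - 0.5) for t in range(5000)]
signature = random_decrement(response, trigger=0.5, seg_len=120)
```

In practice the signature's zero crossings give the fundamental period and its amplitude decay gives the damping ratio.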

  5. Comparison of training methods to improve walking in persons with chronic spinal cord injury: a randomized clinical trial

    PubMed Central

    Alexeeva, Natalia; Sames, Carol; Jacobs, Patrick L.; Hobday, Lori; DiStasio, Marcello M.; Mitchell, Sarah A.; Calancie, Blair

    2011-01-01

    Objective To compare two forms of device-specific training – body-weight-supported (BWS) ambulation on a fixed track (TRK) and BWS ambulation on a treadmill (TM) – to comprehensive physical therapy (PT) for improving walking speed in persons with chronic, motor-incomplete spinal cord injury (SCI). Methods Thirty-five adult subjects with a history of chronic SCI (>1 year; AIS ‘C’ or ‘D’) participated in a 13-week (1 hour/day; 3 days per week) training program. Subjects were randomized into one of the three training groups. Subjects in the two BWS groups trained without the benefit of additional input from a physical therapist or gait expert. For each training session, performance values and heart rate were monitored. Pre- and post-training maximal 10-m walking speed, balance, muscle strength, fitness, and quality of life were assessed in each subject. Results All three training groups showed significant improvement in maximal walking speed, muscle strength, and psychological well-being. A significant improvement in balance was seen for PT and TRK groups but not for subjects in the TM group. In all groups, post-training measures of fitness, functional independence, and perceived health and vitality were unchanged. Conclusions Our results demonstrate that persons with chronic, motor-incomplete SCI can improve walking ability and psychological well-being following a concentrated period of ambulation therapy, regardless of training method. Improvement in walking speed was associated with improved balance and muscle strength. In spite of the fact that we withheld any formal input of a physical therapist or gait expert from subjects in the device-specific training groups, these subjects did just as well as subjects receiving comprehensive PT for improving walking speed and strength. It is likely that further modest benefits would accrue to those subjects receiving a combination of device-specific training with input from a physical therapist or gait expert to

  6. Systemic Crisis of Civilization: In Search for Adequate Solution

    NASA Astrophysics Data System (ADS)

    Khozin, Grigori

In December 1972 a jumbo jet crashed in the Florida Everglades with the loss of 101 lives. The pilot, distracted by a minor malfunction, failed to note until too late the warning signal that correctly indicated an impending disaster. His sudden, astonished cry of "Hey, what's happening here?" were his last words [1]. Three decades after this tragic episode, as humankind approaches the threshold of the third millennium, a problem remains crucial to human survival and the future of civilization: reacting adequately to warning signals of different kinds, and distinguishing minor malfunctions in the everyday life of society, in the economy, in technology, and in the evolution of the biosphere from grave threats to the world community and to the phenomenon of life on our planet. Rational use of the knowledge and technology available to the world community remains, in this context, the cornerstone of discussions on the destiny of intelligent life both on planet Earth and in the Universe (whether intelligent life exists elsewhere in the Universe remains for humankind to detect)…

  7. ENSURING ADEQUATE SAFETY WHEN USING HYDROGEN AS A FUEL

    SciTech Connect

    Coutts, D

    2007-01-22

Demonstration projects using hydrogen as a fuel are becoming very common. These projects often rely on project-specific risk evaluations to support safety decisions, because the relevant regulations, codes, and standards (hereafter referred to as standards) are still being developed. This paper reviews some of the approaches being used in these evolving standards, along with techniques demonstration projects can implement to bridge the gap between current requirements and stakeholder desires. Many of the evolving hydrogen-fuel standards use performance-based language, which establishes minimum performance and safety objectives, rather than prescriptive language, which mandates specific design solutions. This is being done for several reasons, including: (1) concern that establishing specific design solutions too early will stifle invention, (2) the sparse performance data available to support the selection of design approaches, and (3) a risk-averse public unwilling to accept the losses that were incurred in developing previous prescriptive design standards. The evolving standards often contain wording such as: "The manufacturer shall implement the measures and provide the information necessary to minimize the risk of endangering a person's safety or health." This typically implies that the manufacturer or project manager must produce and document an acceptable level of risk. If accomplished using a comprehensive and systematic process, the demonstration-project risk assessment can ease the transition to widespread commercialization. An approach to adequately evaluating and documenting the safety risk is presented.

  8. DARHT -- an adequate EIS: A NEPA case study

    SciTech Connect

    Webb, M.D.

    1997-08-01

In April 1996 the US District Court in Albuquerque ruled that the Dual Axis Radiographic Hydrodynamic Test (DARHT) Facility Environmental Impact Statement (EIS), prepared by the Los Alamos Area Office, US Department of Energy (DOE), was adequate. The DARHT EIS had been prepared in the face of a lawsuit in only 10 months, a third of the time usually allotted for a DOE EIS, and for only a small fraction of the cost of a typical DOE EIS. Its subject was the first major facility to be built in decades for the DOE nuclear weapons stockpile stewardship program. It was the first EIS to be prepared for a proposal at DOE's Los Alamos National Laboratory since 1979, and the first ever prepared by the Los Alamos Area Office. Much of the subject matter was classified. The facility had been specially designed to minimize impacts to a nearby prehistoric Native American ruin, and extensive consultation with American Indian Pueblos was required. The week the draft EIS was published, Laboratory biologists identified a previously unknown pair of Mexican spotted owls in the immediate vicinity of the project, bringing into play the consultation requirements of the Endangered Species Act. In spite of these obstacles, the resultant DARHT EIS was reviewed by the court and found to meet all statutory and regulatory requirements; the court praised the treatment of the classified material that served as a basis for the environmental analysis.

  9. Dose Limits for Man do not Adequately Protect the Ecosystem

    SciTech Connect

    Higley, Kathryn A.; Alexakhin, Rudolf M.; McDonald, Joseph C.

    2004-08-01

It has been known for quite some time that different organisms display differing degrees of sensitivity to the effects of ionizing radiation. Some microorganisms, such as the bacterium Micrococcus radiodurans, along with many species of invertebrates, are extremely radio-resistant. Humans might be categorized as relatively sensitive to radiation, though a bit more resistant than some pine trees. Therefore, it could be argued that maintaining the dose limits necessary to protect humans will also result in the protection of most other species of flora and fauna. This concept is usually referred to as the anthropocentric approach: if man is protected, then the environment is also adequately protected. The ecocentric approach might be stated as: the health of humans is effectively protected only when the environment is not unduly exposed to radiation. The ICRP is working on new recommendations dealing with the protection of the environment, and this debate should help to highlight a number of relevant issues concerning that topic.

  10. Intracranial Pressure Monitoring in Severe Traumatic Brain Injury in Latin America: Process and Methods for a Multi-Center Randomized Controlled Trial

    PubMed Central

    Lujan, Silvia; Dikmen, Sureyya; Temkin, Nancy; Petroni, Gustavo; Pridgeon, Jim; Barber, Jason; Machamer, Joan; Cherner, Mariana; Chaddock, Kelley; Hendrix, Terence; Rondina, Carlos; Videtta, Walter; Celix, Juanita M.; Chesnut, Randall

    2012-01-01

In patients with severe traumatic brain injury (TBI), the influence on important outcomes of the use of information from intracranial pressure (ICP) monitoring to direct treatment has never been tested in a randomized controlled trial (RCT). We are conducting an RCT in six trauma centers in Latin America to test this question. We hypothesize that patients randomized to ICP monitoring will have lower mortality and better outcomes at 6-months post-trauma than patients treated without ICP monitoring. We selected three centers in Bolivia to participate in the trial, based on (1) the absence of ICP monitoring, (2) adequate patient accession and data collection during the pilot phase, (3) preliminary institutional review board approval, and (4) the presence of equipoise about the value of ICP monitoring. We conducted extensive training of site personnel, and initiated the trial on September 1, 2008. Subsequently, we included three additional centers. A total of 176 patients were entered into the trial as of August 31, 2010. Current enrollment is 81% of that expected. The trial is expected to reach its enrollment goal of 324 patients by September of 2011. We are conducting a high-quality RCT to answer a question that is important globally. In addition, we are establishing the capacity to conduct strong research in Latin America, where TBI is a serious epidemic. Finally, we are demonstrating the feasibility and utility of international collaborations that share resources and unique patient populations to conduct strong research about global public health concerns. PMID:22435793

  11. Intracranial pressure monitoring in severe traumatic brain injury in latin america: process and methods for a multi-center randomized controlled trial.

    PubMed

    Carney, Nancy; Lujan, Silvia; Dikmen, Sureyya; Temkin, Nancy; Petroni, Gustavo; Pridgeon, Jim; Barber, Jason; Machamer, Joan; Cherner, Mariana; Chaddock, Kelley; Hendrix, Terence; Rondina, Carlos; Videtta, Walter; Celix, Juanita M; Chesnut, Randall

    2012-07-20

    In patients with severe traumatic brain injury (TBI), the influence on important outcomes of the use of information from intracranial pressure (ICP) monitoring to direct treatment has never been tested in a randomized controlled trial (RCT). We are conducting an RCT in six trauma centers in Latin America to test this question. We hypothesize that patients randomized to ICP monitoring will have lower mortality and better outcomes at 6-months post-trauma than patients treated without ICP monitoring. We selected three centers in Bolivia to participate in the trial, based on (1) the absence of ICP monitoring, (2) adequate patient accession and data collection during the pilot phase, (3) preliminary institutional review board approval, and (4) the presence of equipoise about the value of ICP monitoring. We conducted extensive training of site personnel, and initiated the trial on September 1, 2008. Subsequently, we included three additional centers. A total of 176 patients were entered into the trial as of August 31, 2010. Current enrollment is 81% of that expected. The trial is expected to reach its enrollment goal of 324 patients by September of 2011. We are conducting a high-quality RCT to answer a question that is important globally. In addition, we are establishing the capacity to conduct strong research in Latin America, where TBI is a serious epidemic. Finally, we are demonstrating the feasibility and utility of international collaborations that share resources and unique patient populations to conduct strong research about global public health concerns.

  12. Methods for calculating the temperature zero drift initiated in fiber ring interferometers by random inhomogeneities in single-mode optical fibers

    NASA Astrophysics Data System (ADS)

    Malykin, G. B.

    2008-12-01

    The specific features of the existing methods used for calculating the temperature zero drift initiated in fiber ring interferometers (FRIs) by linear coupling between polarization eigenmodes at random inhomogeneities in single-mode optical fibers are analyzed. The ranges of applicability of each method are determined. It is demonstrated that numerical simulation with a variation in the temperature of the single-mode optical fiber of the FRI loop is the most complex and, at the same time, the most universal method.

  13. Are Vancomycin Trough Concentrations Adequate for Optimal Dosing?

    PubMed Central

    Youn, Gilmer; Jones, Brenda; Jelliffe, Roger W.; Drusano, George L.; Rodvold, Keith A.; Lodise, Thomas P.

    2014-01-01

    The current vancomycin therapeutic guidelines recommend the use of only trough concentrations to manage the dosing of adults with Staphylococcus aureus infections. Both vancomycin efficacy and toxicity are likely to be related to the area under the plasma concentration-time curve (AUC). We assembled richly sampled vancomycin pharmacokinetic data from three studies comprising 47 adults with various levels of renal function. With Pmetrics, the nonparametric population modeling package for R, we compared AUCs estimated from models derived from trough-only and peak-trough depleted versions of the full data set and characterized the relationship between the vancomycin trough concentration and AUC. The trough-only and peak-trough depleted data sets underestimated the true AUCs compared to the full model by a mean (95% confidence interval) of 23% (11 to 33%; P = 0.0001) and 14% (7 to 19%; P < 0.0001), respectively. In contrast, using the full model as a Bayesian prior with trough-only data allowed 97% (93 to 102%; P = 0.23) accurate AUC estimation. On the basis of 5,000 profiles simulated from the full model, among adults with normal renal function and a therapeutic AUC of ≥400 mg · h/liter for an organism for which the vancomycin MIC is 1 mg/liter, approximately 60% are expected to have a trough concentration below the suggested minimum target of 15 mg/liter for serious infections, which could result in needlessly increased doses and a risk of toxicity. Our data indicate that adjustment of vancomycin doses on the basis of trough concentrations without a Bayesian tool results in poor achievement of maximally safe and effective drug exposures in plasma and that many adults can have an adequate vancomycin AUC with a trough concentration of <15 mg/liter. PMID:24165176
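
For context, the AUC discussed above is the area under the plasma concentration-time curve, and the simplest estimate from sampled concentrations is the linear trapezoidal rule. This is a sketch only, with hypothetical concentrations; the study itself estimates AUC with a nonparametric population model (Pmetrics) and a Bayesian prior, not raw trapezoids:

```python
def auc_trapezoid(times_h, conc_mg_L):
    """Area under the concentration-time curve (mg*h/L) by the linear
    trapezoidal rule over consecutive sampling points.
    """
    if len(times_h) != len(conc_mg_L) or len(times_h) < 2:
        raise ValueError("need paired time/concentration data")
    return sum((t2 - t1) * (c1 + c2) / 2
               for (t1, c1), (t2, c2) in zip(zip(times_h, conc_mg_L),
                                             zip(times_h[1:], conc_mg_L[1:])))

# Hypothetical sparse profile over one 12-hour dosing interval.
times = [0, 1, 2, 6, 12]
concs = [30.0, 40.0, 35.0, 22.0, 15.0]
auc12 = auc_trapezoid(times, concs)   # mg*h/L for this interval
```

With sparse clinical sampling (often trough-only), this direct approach is infeasible, which is exactly why the study turns to a population model as a Bayesian prior.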

  14. A Randomized Exploratory Study to Evaluate Two Acupuncture Methods for the Treatment of Headaches Associated with Traumatic Brain Injury

    PubMed Central

    Bellanti, Dawn M.; Paat, Charmagne F.; Boyd, Courtney C.; Duncan, Alaine; Price, Ashley; Zhang, Weimin; French, Louis M.; Chae, Heechin

    2016-01-01

Background: Headaches are prevalent among Service members with traumatic brain injury (TBI); 80% report chronic or recurrent headache. Evidence for nonpharmacologic treatments, such as acupuncture, is needed. Objective: The aim of this research was to determine if two types of acupuncture (auricular acupuncture [AA] and traditional Chinese acupuncture [TCA]) were feasible and more effective than usual care (UC) alone for TBI-related headache. Materials and Methods: Design: This was a three-armed, parallel, randomized exploratory study. Setting: The research took place at three military treatment facilities in the Washington, DC, metropolitan area. Patients: The subjects were previously deployed Service members (18–69 years old) with mild-to-moderate TBI and headaches. Intervention: The interventions explored were UC alone or with the addition of AA or TCA. Outcome Measures: The primary outcome was the Headache Impact Test (HIT). Secondary outcomes were the Numerical Rating Scale (NRS), Pittsburgh Sleep Quality Index, Post-Traumatic Stress Checklist, Symptom Checklist-90-R, Medical Outcome Study Quality of Life (QoL), Beck Depression Inventory, State-Trait Anxiety Inventory, the Automated Neuropsychological Assessment Metrics, and expectancy of outcome and acupuncture efficacy. Results: Mean HIT scores decreased in the AA and TCA groups but increased slightly in the UC-only group from baseline to week 6 [AA, −10.2% (−6.4 points); TCA, −4.6% (−2.9 points); UC, +0.8% (+0.6 points)]. Both acupuncture groups had sizable decreases in NRS (Pain Best) compared to UC (TCA versus UC: P = 0.0008, d = 1.70; AA versus UC: P = 0.0127, d = 1.6). No statistically significant results were found for any other secondary outcome measures. Conclusions: Both AA and TCA improved headache-related QoL more than UC did in Service members with TBI. PMID:27458496

  15. Clinical Efficacy of Two Different Methods to Initiate Sensor-Augmented Insulin Pumps: A Randomized Controlled Trial

    PubMed Central

    Gómez, Francisco Javier; Gálvez Moreno, Maria Ángeles; Castaño, Justo P.

    2016-01-01

Aim. To analyze the clinical effect of a novel approach to initiating sensor-augmented insulin pumps in type 1 diabetes mellitus (T1DM) patients through early real-time continuous glucose monitoring (RT-CGM) initiation. Methods. A 26-week pilot study with T1DM subjects randomized (1 : 1) to start RT-CGM three weeks before continuous subcutaneous insulin infusion (CGM pre-CSII) or adding RT-CGM three weeks after continuous subcutaneous insulin infusion (CGM post-CSII). Results. Twenty-two patients were enrolled with a mean age of 36.6 yr. (range 19–59 yr.) and T1DM duration of 16.8 ± 10.6 yr. Higher adherence in CGM pre-CSII patients was confirmed at study end (84.6 ± 11.1% versus 64.0 ± 25.4%; P = 0.01). The two intervention groups had similar HbA1c reduction at study end of −0.6% (P = 0.9). Hypoglycemic event frequency reduction was observed from baseline to study end only in the CGM pre-CSII group (mean difference in change, −6.3%; 95% confidence interval, −12.0 to −0.5; P = 0.04). Moreover, no severe hypoglycemia was detected among CGM pre-CSII subjects during the study follow-up (0.0 ± 0.0 events versus 0.63 ± 1.0 events; P = 0.03). CGM pre-CSII patients showed better satisfaction than CGM post-CSII patients at the end of the study (27.3 ± 9.3 versus 32.9 ± 7.2; P = 0.04). Conclusions. CGM pre-CSII is a novel approach to improve glycemic control and satisfaction in type 1 diabetes sensor-augmented pump treated patients. PMID:28004007

  16. The COPE healthy lifestyles TEEN randomized controlled trial with culturally diverse high school adolescents: baseline characteristics and methods.

    PubMed

    Melnyk, Bernadette Mazurek; Kelly, Stephanie; Jacobson, Diana; Belyea, Michael; Shaibi, Gabriel; Small, Leigh; O'Haver, Judith; Marsiglia, Flavio Francisco

    2013-09-01

    Obesity and mental health disorders remain significant public health problems in adolescents. Substantial health disparities exist with minority youth experiencing higher rates of these problems. Schools are an outstanding venue to provide teens with skills needed to improve their physical and mental health, and academic performance. In this paper, the authors describe the design, intervention, methods and baseline data for a randomized controlled trial with 779 culturally diverse high-school adolescents in the southwest United States. Aims for this prevention study include testing the efficacy of the COPE TEEN program versus an attention control program on the adolescents' healthy lifestyle behaviors, Body Mass Index (BMI) and BMI%, mental health, social skills and academic performance immediately following the intervention programs, and at six and 12 months post interventions. Baseline findings indicate that greater than 40% of the sample is either overweight (n = 148, 19.00%) or obese (n = 182, 23.36%). The predominant ethnicity represented is Hispanic (n = 526, 67.52%). At baseline, 15.79% (n = 123) of the students had above average scores on the Beck Youth Inventory Depression subscale indicating mildly (n = 52, 6.68%), moderately (n = 47, 6.03%), or extremely (n = 24, 3.08%) elevated scores (see Table 1). Anxiety scores were slightly higher with 21.56% (n = 168) reporting responses suggesting mildly (n = 81, 10.40%), moderately (n = 58, 7.45%) or extremely (n = 29, 3.72%) elevated scores. If the efficacy of the COPE TEEN program is supported, it will offer schools a curriculum that can be easily incorporated into high school health courses to improve adolescent healthy lifestyle behaviors, psychosocial outcomes and academic performance.

  17. Impact of the Konstanz method of dilemma discussion on moral judgment in allied health students: a randomized controlled study.

    PubMed

    Lerkiatbundit, Sanguan; Utaipan, Parichat; Laohawiriyanon, Chonlada; Teo, Adisa

    2006-01-01

The objective of the study was to determine the effect of the Konstanz method of moral dilemma discussion (KMDD) on moral judgment in allied health students. The study employed the Moral Judgment Test, translated from English into Thai and validated in 247 students, as a moral judgment instrument. The scale satisfied four validity criteria: preference hierarchy, quasi-simplex structure of stage preference, affective-cognitive parallelism, and positive correlation between education and moral competence score (C-index). Test-retest reliability at a 1-month interval was 0.90. To investigate the impact of the KMDD, 83 pharmacy technician and dental nursing students were asked to participate in the study. The subjects were randomly assigned into control (n = 41) or experimental (n = 42) groups. The experimental group participated in a 90-min KMDD once a week for 6 consecutive weeks. Students in the control group also met once a week for 6 weeks to discuss topics not related to ethics. All subjects completed the Moral Judgment Test before and after the intervention and again 6 months later. Split-plot ANOVA of the C-indexes at the beginning revealed that the experimental and control groups were not different (20.57 +/- 13.45 and 24.98 +/- 16.12). However, the experimental group scored significantly higher than the control group after the intervention (35.18 +/- 10.96 and 24.20 +/- 14.70) and 6 months later (33.00 +/- 11.02 and 23.67 +/- 14.35). The KMDD appears to be a practical and effective intervention for developing moral judgment in allied health students. The effect on moral judgment remains at least 6 months after the intervention.

  18. The COPE healthy lifestyles TEEN randomized controlled trial with culturally diverse high school adolescents: Baseline characteristics and methods

    PubMed Central

    Melnyk, Bernadette Mazurek; Kelly, Stephanie; Jacobson, Diana; Belyea, Michael; Shaibi, Gabriel; Small, Leigh; O’Haver, Judith; Marsiglia, Flavio Francisco

    2014-01-01

Obesity and mental health disorders remain significant public health problems in adolescents. Substantial health disparities exist with minority youth experiencing higher rates of these problems. Schools are an outstanding venue to provide teens with skills needed to improve their physical and mental health, and academic performance. In this paper, the authors describe the design, intervention, methods and baseline data for a randomized controlled trial with 779 culturally diverse high-school adolescents in the southwest United States. Aims for this prevention study include testing the efficacy of the COPE TEEN program versus an attention control program on the adolescents’ healthy lifestyle behaviors, Body Mass Index (BMI) and BMI%, mental health, social skills and academic performance immediately following the intervention programs, and at six and 12 months post interventions. Baseline findings indicate that greater than 40% of the sample is either overweight (n = 148, 19.00%) or obese (n = 182, 23.36%). The predominant ethnicity represented is Hispanic (n = 526, 67.52%). At baseline, 15.79% (n = 123) of the students had above average scores on the Beck Youth Inventory Depression subscale indicating mildly (n = 52, 6.68%), moderately (n = 47, 6.03%), or extremely (n = 24, 3.08%) elevated scores (see Table 1). Anxiety scores were slightly higher with 21.56% (n = 168) reporting responses suggesting mildly (n = 81, 10.40%), moderately (n = 58, 7.45%) or extremely (n = 29, 3.72%) elevated scores. If the efficacy of the COPE TEEN program is supported, it will offer schools a curriculum that can be easily incorporated into high school health courses to improve adolescent healthy lifestyle behaviors, psychosocial outcomes and academic performance. PMID:23748156

  19. Impact of Violation of the Missing-at-Random Assumption on Full-Information Maximum Likelihood Method in Multidimensional Adaptive Testing

    ERIC Educational Resources Information Center

    Han, Kyung T.; Guo, Fanmin

    2014-01-01

    The full-information maximum likelihood (FIML) method makes it possible to estimate and analyze structural equation models (SEM) even when data are partially missing, enabling incomplete data to contribute to model estimation. The cornerstone of FIML is the missing-at-random (MAR) assumption. In (unidimensional) computerized adaptive testing…
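
The MAR assumption means that missingness may depend on observed values but not on the unobserved values themselves. A small simulation sketch of data that satisfy MAR, using hypothetical variables (not the study's testing data):

```python
import random

def mask_mar(rows, rng):
    """Simulate missing-at-random data: whether y is missing depends only
    on the *observed* covariate x, never on the unobserved y itself.
    FIML yields consistent estimates under exactly this mechanism.
    """
    out = []
    for x, y in rows:
        p_missing = 0.6 if x > 0 else 0.1      # driven by observed x only
        out.append((x, None if rng.random() < p_missing else y))
    return out

rng = random.Random(0)
data = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(1000)]
masked = mask_mar(data, rng)
```

If `p_missing` instead depended on `y`, the mechanism would be missing-not-at-random, the violation whose impact on FIML this article investigates.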

  20. Approximation of the Lévy Feller advection dispersion process by random walk and finite difference method

    NASA Astrophysics Data System (ADS)

    Liu, Q.; Liu, F.; Turner, I.; Anh, V.

    2007-03-01

    In this paper we present a random walk model for approximating a Lévy-Feller advection-dispersion process, governed by the Lévy-Feller advection-dispersion differential equation (LFADE). We show that the random walk model converges to LFADE by use of a properly scaled transition to vanishing space and time steps. We propose an explicit finite difference approximation (EFDA) for LFADE, resulting from the Grünwald-Letnikov discretization of fractional derivatives. As a result of the interpretation of the random walk model, the stability and convergence of EFDA for LFADE in a bounded domain are discussed. Finally, some numerical examples are presented to show the application of the present technique.
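
The Grünwald-Letnikov discretization mentioned above replaces the fractional derivative of order α with a weighted sum over past grid values, D^α u(x_i) ≈ h^(-α) Σ_k w_k u_{i-k}, where the weights w_k = (-1)^k C(α, k) follow a simple recurrence. A sketch of the weight computation (illustrative; not the paper's full EFDA scheme):

```python
def gl_weights(alpha, n):
    """Grünwald-Letnikov weights w_k = (-1)^k * C(alpha, k), computed via
    the recurrence w_k = w_{k-1} * (k - 1 - alpha) / k. These weight the
    past grid values in the finite-difference approximation of a
    fractional derivative of order alpha.
    """
    w = [1.0]
    for k in range(1, n + 1):
        w.append(w[-1] * (k - 1 - alpha) / k)
    return w

w = gl_weights(alpha=1.8, n=4)   # e.g. [1.0, -1.8, 0.72, 0.048, 0.0144]
```

For α = 1 the weights reduce to 1, −1, 0, 0, …, recovering the ordinary first-order backward difference, a quick sanity check on the recurrence.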

  1. [Incidence of primary malignant lesions in clinically benign teratoma: on the problem of adequate surgical procedure].

    PubMed

    Kindermann, G; Jung, E M; Maassen, V; Bise, K

    1996-08-01

The Problem of an Adequate Surgical Approach: According to the literature, the frequency of malignant teratomas is 2%-10%. In our own 194 cases (1983-1993) it was 1.5%. We found one squamous cell carcinoma (0.5%) and, additionally, 2 immature teratomas (1%). We point out the different biological behaviour of malignant mature teratomas and immature teratomas. We agree with the majority of authors that the method of choice is the intact removal of all teratomas, without iatrogenic rupture or contamination of the abdominal cavity by the contents of the teratoma. This adequate surgical procedure can and should be performed by laparotomy, or by laparoscopy with an endobag. The often-practised method of cutting open the cyst during laparoscopy, sucking off the contents or cutting the teratoma into pieces, has been shown to lead to implantation and a worsened prognosis in the case of a malignant teratoma. Even the rinsing of the abdominal cavity usually carried out with this method cannot always compensate for the disadvantage of this "dirty" endoscopic method relative to usual oncological standards. This is pointed out by case reports in the literature and by the first analysis of a German survey, with early follow-up, of 192 laparoscopically managed ovarian malignancies [11a]. The principle of intact removal of every teratoma should again be kept in mind.

  2. Using Logistic Regression and Random Forests multivariate statistical methods for landslide spatial probability assessment in North-Est Sicily, Italy

    NASA Astrophysics Data System (ADS)

    Trigila, Alessandro; Iadanza, Carla; Esposito, Carlo; Scarascia-Mugnozza, Gabriele

    2015-04-01

The first phase of the work identified the spatial relationships between landslide locations and 13 related factors using the Frequency Ratio bivariate statistical method. The analysis was then carried out with a multivariate statistical approach, using the Logistic Regression and Random Forests techniques, which gave the best results in terms of AUC. The models were built and evaluated with different sample sizes, also taking into account the temporal variation of input variables such as areas burned by wildfire. The most significant outcomes of this work are the strong influence of sample size on the model results and the strong importance of some environmental factors (e.g. land use and wildfires) for identifying the depletion zones of extremely rapid shallow landslides.
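
The AUC used above to compare the Logistic Regression and Random Forests models is the area under the ROC curve: the probability that a randomly chosen landslide location receives a higher susceptibility score than a randomly chosen non-landslide location. A minimal pure-Python sketch of that computation, via the rank (Mann-Whitney) formulation, with made-up labels and scores:

```python
def roc_auc(labels, scores):
    """AUC of the ROC curve via the Mann-Whitney formulation: the
    probability that a randomly chosen positive case scores higher than
    a randomly chosen negative one; ties count one half.
    """
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    if not pos or not neg:
        raise ValueError("need both classes")
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

auc = roc_auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8])   # -> 0.75
```

An AUC of 0.5 corresponds to random scoring and 1.0 to perfect separation, which is why it serves as the model-selection criterion in the study above.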

  3. Quality assessment of reporting of randomization, allocation concealment, and blinding in traditional chinese medicine RCTs: A review of 3159 RCTs identified from 260 systematic reviews

    PubMed Central

    2011-01-01

    Background Randomized controlled trials (RCTs) of poor quality tend to exaggerate effect estimates and lead to wrong or misleading conclusions. The aim of this study is to assess the quality of randomization methods, allocation concealment and blinding within traditional Chinese medicine (TCM) RCTs, discuss issues identified in current TCM RCTs, and provide suggestions for quality improvement. Methods We searched the Chinese Biomedical Database (CBM, 1978 to July 31, 2009) and the Cochrane Library (Issue 2, 2009) to collect TCM systematic reviews and meta-analyses according to inclusion/exclusion criteria, from which RCTs could be identified. The quality assessment considered whether the randomization methods, allocation concealment and blinding were adequate or not, based on what each study reported. Stratified analyses were conducted by type of disease, by journal of publication (both Chinese and foreign) and by intervention. SPSS 15.0 software was used for statistical analyses. Results A total of 3159 RCTs were included, of which 2580 were published in Chinese journals and 579 in foreign journals. There were 381 (12%) RCTs that used adequate randomization methods, 207 (7%) that used adequate allocation concealment and 601 (19%) that used adequate blinding; 130 (4%) RCTs used both adequate randomization methods and adequate allocation concealment; and only 100 (3%) used adequate randomization methods, allocation concealment, as well as blinding. Among the RCTs published in foreign journals, adequate randomization methods, allocation concealment and blinding accounted for relatively large proportions (25%, 26%, and 60%, respectively) and increased over the years, while among the RCTs published in Chinese journals, only the use of adequate randomization methods improved over time. The quality of non-drug intervention (chiefly acupuncture) RCTs was higher than that of drug intervention RCTs. 
In drug intervention, the
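    As a quick arithmetic check, the percentages reported in this abstract follow directly from the stated counts out of 3159 included RCTs:

    ```python
    # Recomputing the reported quality proportions from the abstract's counts.
    total = 3159
    counts = {
        "adequate_randomization": 381,
        "adequate_concealment": 207,
        "adequate_blinding": 601,
        "randomization_and_concealment": 130,
        "all_three": 100,
    }
    percentages = {k: round(100 * v / total) for k, v in counts.items()}
    print(percentages)  # matches the 12%, 7%, 19%, 4%, 3% in the text
    ```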

  4. Narita Target Heart Rate Equation Underestimates the Predicted Adequate Exercise Level in Sedentary Young Boys

    PubMed Central

    Siahkouhian, Marefat; Khodadadi, Davar

    2013-01-01

    Purpose Optimal training intensity and the adequate exercise level for physical fitness are among the most important interests of coaches and sports physiologists. The aim of this study was to investigate the validity of the Narita et al. target heart rate equation for determining the adequate exercise training level in sedentary young boys. Methods Forty-two sedentary young boys (19.07±1.16 years) undertook a blood lactate transition threshold maximal treadmill test to volitional exhaustion, with continuous respiratory gas measurements according to the Craig method. The anaerobic threshold (AT) of the participants was then calculated using the Narita target heart rate equation. Results Hopkins' spreadsheet, used to obtain the confidence limit and the chance of a true difference between the gas measurements and the Narita equation, revealed that the Narita equation most likely underestimates the measured anaerobic threshold in sedentary young boys (168.76±15 vs. 130.08±14.36) (difference ±90% confidence limit: 38.1±18). The intraclass correlation coefficient (ICC) showed poor agreement between the criterion method and the Narita equation (ICC=0.03). Conclusion According to the results, the Narita equation underestimates the measured AT. It seems that the Narita equation is a good predictor of the aerobic threshold, not the AT, which can be investigated in future studies. PMID:24427475
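    The core of the Hopkins-style comparison above is the mean paired difference with a 90% confidence limit. A minimal sketch, on synthetic heart-rate data generated to resemble the abstract's summary statistics (not the study's measurements):

    ```python
    # Sketch: mean difference between criterion AT and an equation estimate,
    # with a 90% confidence limit, on synthetic paired data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    measured = rng.normal(168.8, 15.0, size=42)  # gas-analysis AT heart rate (bpm)
    narita = rng.normal(130.1, 14.4, size=42)    # equation estimate (bpm), hypothetical

    diff = measured - narita
    mean_diff = diff.mean()
    # Two-sided 90% confidence limit on the mean difference.
    cl = stats.t.ppf(0.95, df=len(diff) - 1) * diff.std(ddof=1) / np.sqrt(len(diff))
    print(round(mean_diff, 1), round(cl, 1))
    ```

    A large positive mean difference, as reported (38.1±18), indicates systematic underestimation by the equation; the ICC would additionally quantify agreement on a per-subject basis.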

  5. When one is not enough: prevalence and characteristics of homes not adequately protected by smoke alarms

    PubMed Central

    Peek-Asa, C; Allareddy, V; Yang, J; Taylor, C; Lundell, J; Zwerling, C

    2005-01-01

    Objective: The National Fire Protection Association (NFPA) has specific recommendations about the number, location, and type of smoke alarms that are needed to provide maximum protection for a household. No previous studies have examined whether or not homes are completely protected according to these guidelines. The authors describe the prevalence and home characteristics associated with compliance to recommendations for smoke alarm installation by the NFPA. Design, setting, and subjects: Data are from the baseline on-site survey of a randomized trial to measure smoke alarm effectiveness. The trial was housed in a longitudinal cohort study in a rural Iowa county. Of 1005 homes invited, 691 (68.8%) participated. Main outcome measures: Information about smoke alarm type, placement, and function, as well as home and occupant characteristics, was collected through an on-site household survey. Results: Although 86.0% of homes had at least one smoke alarm, only 22.3% of homes (approximately one in five) were adequately protected according to NFPA guidelines. Fourteen percent of homes had no functioning smoke alarms. More than half of the homes with smoke alarms did not have enough of them or had installed them incorrectly, and 42.4% of homes with alarms had at least one alarm that did not operate. Homes with at least one high school graduate were nearly four times more likely to be fully protected. Homes that had multiple levels, a basement, or were cluttered or poorly cleaned were significantly less likely to be fully protected. Conclusion: These findings indicate that consumers may not be knowledgeable about the number of alarms they need or how to properly install them. Occupants are also not adequately maintaining the alarms that are installed. PMID:16326772

  6. Analysis of entropy extraction efficiencies in random number generation systems

    NASA Astrophysics Data System (ADS)

    Wang, Chao; Wang, Shuang; Chen, Wei; Yin, Zhen-Qiang; Han, Zheng-Fu

    2016-05-01

    Random numbers (RNs) have applications in many areas: lottery games, gambling, computer simulation, and, most importantly, cryptography [N. Gisin et al., Rev. Mod. Phys. 74 (2002) 145]. In cryptography theory, the theoretical security of the system calls for high quality RNs. Therefore, developing methods for producing unpredictable RNs with adequate speed is an attractive topic. Early on, despite the lack of theoretical support, pseudo RNs generated by algorithmic methods performed well and satisfied reasonable statistical requirements. However, as implemented, those pseudorandom sequences were completely determined by mathematical formulas and initial seeds, which cannot introduce extra entropy or information. In these cases, “random” bits are generated that are not at all random. Physical random number generators (RNGs), which, in contrast to algorithmic methods, are based on unpredictable physical random phenomena, have attracted considerable research interest. However, the way that we extract random bits from those physical entropy sources has a large influence on the efficiency and performance of the system. In this manuscript, we will review and discuss several randomness extraction schemes that are based on radiation or photon arrival times. We analyze the robustness, post-processing requirements and, in particular, the extraction efficiency of those methods to aid in the construction of efficient, compact and robust physical RNG systems.
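    The classic textbook example of such a randomness-extraction scheme is the von Neumann extractor, which removes bias from an i.i.d. physical source at the cost of throughput. A minimal illustration (the entropy source here is simulated, not a physical device):

    ```python
    # Von Neumann extractor: pairs 01 -> 0, 10 -> 1; discard 00 and 11.
    import random

    def von_neumann_extract(bits):
        """Map non-overlapping pairs: 01 -> 0, 10 -> 1, discard equal pairs."""
        out = []
        for i in range(0, len(bits) - 1, 2):
            a, b = bits[i], bits[i + 1]
            if a != b:
                out.append(a)
        return out

    random.seed(0)
    raw = [1 if random.random() < 0.7 else 0 for _ in range(10000)]  # biased source
    extracted = von_neumann_extract(raw)
    # Output is unbiased for an i.i.d. input, but the rate drops: only a
    # fraction 2*p*(1-p) of input pairs yields an output bit.
    print(len(extracted), sum(extracted) / len(extracted))
    ```

    This illustrates the efficiency trade-off discussed in the abstract: the extractor's robustness to source bias is bought with a reduced bit rate, which is why extraction efficiency matters when designing compact physical RNG systems.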

  7. The Alchemy of "Costing Out" an Adequate Education

    ERIC Educational Resources Information Center

    Hanushek, Eric A.

    2006-01-01

    In response to the rapid rise in court cases related to the adequacy of school funding, a variety of alternative methods have been developed to provide an analytical base about the necessary expenditure on schools. These approaches have been titled to give an aura of a thoughtful and solid scientific basis: the professional judgment model, the…

  8. Determining the Availability of Adequate Off-Base Housing.

    DTIC Science & Technology

    1988-04-01

    weakness of relying on the return of a questionnaire. Historically, questionnaires have a low return rate. The second method is the Department of the Army ve...under- etc.); and any other significant features (competition stood by laymen - for example: home post for STRAC with civilians immigrating for

  9. The Nigerian health care system: Need for integrating adequate medical intelligence and surveillance systems

    PubMed Central

    Welcome, Menizibeya Osain

    2011-01-01

    Objectives: As an important element of national security, public health not only functions to provide adequate and timely medical care but also to track, monitor, and control disease outbreaks. Nigerian health care has suffered several infectious disease outbreaks year after year; hence, there is a need to tackle the problem. This study aims to review the state of the Nigerian health care system and to provide possible recommendations for the worsening state of health care in the country. To give up-to-date recommendations for the Nigerian health care system, this study also reviews the dynamics of health care in the United States, Britain, and Europe with regard to methods of medical intelligence/surveillance. Materials and Methods: Databases were searched for relevant literature using the following keywords: Nigerian health care, Nigerian health care system, and Nigerian primary health care system. Additional keywords used in the search were as follows: United States (OR Europe) health care dynamics, Medical Intelligence, Medical Intelligence systems, Public health surveillance systems, Nigerian medical intelligence, Nigerian surveillance systems, and Nigerian health information system. Literature was searched in the scientific databases PubMed and African Journals OnLine. Internet searches were based on Google and Search Nigeria. Results: Medical intelligence and surveillance are very useful components of a health care system, helping to control disease outbreaks, bioattacks, etc. There is an increasing role for automated medical intelligence and surveillance systems, in addition to the traditional manual pattern of document retrieval, in advanced medical settings such as those in western and European countries. Conclusion: The Nigerian health care system is poorly developed. No adequate and functional surveillance systems have been developed. To achieve success in health care in this modern era, a system well grounded in routine surveillance and medical

  10. Design and methods for a pilot randomized clinical trial involving exercise and behavioral activation to treat comorbid type 2 diabetes and major depressive disorder

    PubMed Central

    Schneider, Kristin L.; Pagoto, Sherry L.; Handschin, Barbara; Panza, Emily; Bakke, Susan; Liu, Qin; Blendea, Mihaela; Ockene, Ira S.; Ma, Yunsheng

    2011-01-01

    Background The comorbidity of type 2 diabetes mellitus (T2DM) and depression is associated with poor glycemic control. Exercise has been shown to improve mood and glycemic control, but individuals with comorbid T2DM and depression are disproportionately sedentary compared to the general population and report more difficulty with exercise. Behavioral activation, an evidence-based depression psychotherapy, was designed to help people with depression make gradual behavior changes, and may be helpful to build exercise adherence in sedentary populations. This pilot randomized clinical trial will test the feasibility of a group exercise program enhanced with behavioral activation strategies among women with comorbid T2DM and depression. Methods/Design Sedentary women with inadequately controlled T2DM and depression (N=60) will be randomly assigned to one of two conditions: exercise or usual care. Participants randomized to the exercise condition will attend 38 behavioral activation-enhanced group exercise classes over 24 weeks in addition to usual care. Participants randomized to the usual care condition will receive depression treatment referrals and print information on diabetes management via diet and physical activity. Assessments will occur at baseline and 3-, 6-, and 9-months following randomization. The goals of this pilot study are to demonstrate feasibility and intervention acceptability, estimate the resources and costs required to deliver the intervention and to estimate the standard deviation of continuous outcomes (e.g., depressive symptoms and glycosylated hemoglobin) in preparation for a fully-powered randomized clinical trial. Discussion A novel intervention that combines exercise and behavioral activation strategies could potentially improve glycemic control and mood in women with comorbid type 2 diabetes and depression. Trial registration NCT01024790 PMID:21765864

  11. Design and methods for a pilot randomized clinical trial involving exercise and behavioral activation to treat comorbid type 2 diabetes and major depressive disorder.

    PubMed

    Schneider, Kristin L; Pagoto, Sherry L; Handschin, Barbara; Panza, Emily; Bakke, Susan; Liu, Qin; Blendea, Mihaela; Ockene, Ira S; Ma, Yunsheng

    2011-06-01

    BACKGROUND: The comorbidity of type 2 diabetes mellitus (T2DM) and depression is associated with poor glycemic control. Exercise has been shown to improve mood and glycemic control, but individuals with comorbid T2DM and depression are disproportionately sedentary compared to the general population and report more difficulty with exercise. Behavioral activation, an evidence-based depression psychotherapy, was designed to help people with depression make gradual behavior changes, and may be helpful to build exercise adherence in sedentary populations. This pilot randomized clinical trial will test the feasibility of a group exercise program enhanced with behavioral activation strategies among women with comorbid T2DM and depression. METHODS/DESIGN: Sedentary women with inadequately controlled T2DM and depression (N=60) will be randomly assigned to one of two conditions: exercise or usual care. Participants randomized to the exercise condition will attend 38 behavioral activation-enhanced group exercise classes over 24 weeks in addition to usual care. Participants randomized to the usual care condition will receive depression treatment referrals and print information on diabetes management via diet and physical activity. Assessments will occur at baseline and 3-, 6-, and 9-months following randomization. The goals of this pilot study are to demonstrate feasibility and intervention acceptability, estimate the resources and costs required to deliver the intervention and to estimate the standard deviation of continuous outcomes (e.g., depressive symptoms and glycosylated hemoglobin) in preparation for a fully-powered randomized clinical trial. DISCUSSION: A novel intervention that combines exercise and behavioral activation strategies could potentially improve glycemic control and mood in women with comorbid type 2 diabetes and depression. TRIAL REGISTRATION: NCT01024790.

  12. 21 CFR 801.5 - Medical devices; adequate directions for use.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Medical devices; adequate directions for use. 801... (CONTINUED) MEDICAL DEVICES LABELING General Labeling Provisions § 801.5 Medical devices; adequate directions for use. Adequate directions for use means directions under which the layman can use a device...

  13. 21 CFR 801.5 - Medical devices; adequate directions for use.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Medical devices; adequate directions for use. 801... (CONTINUED) MEDICAL DEVICES LABELING General Labeling Provisions § 801.5 Medical devices; adequate directions for use. Adequate directions for use means directions under which the layman can use a device...

  14. 21 CFR 801.5 - Medical devices; adequate directions for use.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Medical devices; adequate directions for use. 801... (CONTINUED) MEDICAL DEVICES LABELING General Labeling Provisions § 801.5 Medical devices; adequate directions for use. Adequate directions for use means directions under which the layman can use a device...

  15. 21 CFR 801.5 - Medical devices; adequate directions for use.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Medical devices; adequate directions for use. 801... (CONTINUED) MEDICAL DEVICES LABELING General Labeling Provisions § 801.5 Medical devices; adequate directions for use. Adequate directions for use means directions under which the layman can use a device...

  16. 21 CFR 801.5 - Medical devices; adequate directions for use.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Medical devices; adequate directions for use. 801... (CONTINUED) MEDICAL DEVICES LABELING General Labeling Provisions § 801.5 Medical devices; adequate directions for use. Adequate directions for use means directions under which the layman can use a device...

  17. Effectiveness of the Dader Method for pharmaceutical care in patients with bipolar I disorder: EMDADER-TAB: study protocol for a randomized controlled trial

    PubMed Central

    2014-01-01

    Background Bipolar I disorder (BD-I) is a chronic mental illness characterized by the presence of one or more manic episodes, or both depressive and manic episodes, usually separated by asymptomatic intervals. Pharmacists can contribute to the management of BD-I, mainly through the use of effective and safe drugs, and can improve patients’ quality of life through pharmaceutical care. Some studies have shown the effect of pharmaceutical care on the achievement of therapeutic goals in different illnesses; however, to our knowledge, there is a lack of randomized controlled trials designed to assess the effect of pharmacist intervention in patients with BD. The aim of this study is to assess the effectiveness of the Dader Method for pharmaceutical care in patients with BD-I. Methods/design A randomized, controlled, prospective, single-center clinical trial with a duration of 12 months will be performed to compare the effect of the Dader Method of pharmaceutical care with the usual care process of patients in a psychiatric clinic. Patients diagnosed with BD-I aged between 18 and 65 years who have been discharged or referred from the outpatient service of the San Juan de Dios Clinic (Antioquia, Colombia) will be included. Patients will be randomized into the intervention group, who will receive pharmaceutical care provided by pharmacists working in collaboration with psychiatrists, or into the control group, who will receive usual care and verbal-written counseling regarding BD. Study outcomes will be assessed at baseline and at 3, 6, 9, and 12 months after randomization. The primary outcome will be the number of hospitalizations, emergency service consultations, and unscheduled outpatient visits. Effectiveness, safety, adherence, and quality of life will be assessed as secondary outcomes. Statistical analyses will be performed using two-tailed McNemar tests, Pearson chi-square tests, and Student’s t-tests; a P value <0.05 will be considered statistically significant
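    Of the planned analyses, the McNemar test is the one for paired binary outcomes (e.g., hospitalized yes/no before vs. after an intervention). Its exact form reduces to a binomial test on the discordant pairs, as in this sketch with hypothetical counts (not trial data):

    ```python
    # Exact McNemar test via a binomial test on discordant pairs.
    from scipy.stats import binomtest

    # Hypothetical discordant counts: b = positive under condition A only,
    # c = positive under condition B only; concordant pairs are ignored.
    b, c = 18, 7
    p_value = binomtest(b, n=b + c, p=0.5).pvalue
    significant = p_value < 0.05  # the protocol's two-tailed alpha
    print(round(p_value, 4), significant)
    ```

    Under the null hypothesis of no difference, each discordant pair is equally likely to fall either way, which is why the reference proportion is 0.5.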

  18. Using Fuzzy Logic to Identify Schools Which May Be Misclassified by the No Child Left Behind Adequate Yearly Progress Policy

    ERIC Educational Resources Information Center

    Yates, Donald W.

    2009-01-01

    This investigation developed, tested, and prototyped a Fuzzy Inference System (FIS) that would assist decision makers in identifying schools that may have been misclassified by existing Adequate Yearly Progress (AYP) methods. This prototype was then used to evaluate Louisiana elementary schools using published school data for Academic Year 2004. …

  19. Assessment of analysis-of-variance-based methods to quantify the random variations of observers in medical imaging measurements: guidelines to the investigator.

    PubMed

    Zeggelink, William F A Klein; Hart, Augustinus A M; Gilhuijs, Kenneth G A

    2004-07-01

    The random variations of observers in medical imaging measurements negatively affect the outcome of cancer treatment, and should be taken into account during treatment by the application of safety margins that are derived from estimates of the random variations. Analysis-of-variance- (ANOVA-) based methods are the most preferable techniques to assess the true individual random variations of observers, but the number of observers and the number of cases must be taken into account to achieve meaningful results. Our aim in this study is twofold. First, to evaluate three representative ANOVA-based methods for typical numbers of observers and typical numbers of cases. Second, to establish guidelines to the investigator to determine which method, how many observers, and which number of cases are required to obtain the a priori chosen performance. The ANOVA-based methods evaluated in this study are an established technique (pairwise differences method: PWD), a new approach providing additional statistics (residuals method: RES), and a generic technique that uses restricted maximum likelihood (REML) estimation. Monte Carlo simulations were performed to assess the performance of the ANOVA-based methods, which is expressed by their accuracy (closeness of the estimates to the truth), their precision (standard error of the estimates), and the reliability of their statistical test for the significance of a difference in the random variation of an observer between two groups of cases. The highest accuracy is achieved using REML estimation, but for datasets of at least 50 cases or arrangements with 6 or more observers, the differences between the methods are negligible, with deviations from the truth well below +/-3%. 
For datasets up to 100 cases, it is most beneficial to increase the number of cases to improve the precision of the estimated random variations, whereas for datasets over 100 cases, an improvement in precision is most efficiently achieved by increasing the number of
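    A toy Monte Carlo in the spirit of this study (not any of the three methods evaluated) can show how per-observer random variations are recoverable from a cases-by-observers table. Removing each case's mean leaves residuals whose variances mix the observers' true variances, and that mixing can be inverted for more than two observers; all numbers below are synthetic:

    ```python
    # Simulate cases measured by k observers with individual random SDs,
    # then recover the per-observer SDs from case-mean residuals.
    import numpy as np

    rng = np.random.default_rng(2)
    n_cases, k = 500, 3
    true_sd = np.array([1.0, 2.0, 3.0])                 # per-observer random variation
    case_truth = rng.normal(50.0, 10.0, size=(n_cases, 1))
    data = case_truth + rng.normal(0.0, true_sd, size=(n_cases, k))

    # Residual variance of observer j: v_j = sigma_j^2*(k-2)/k + S/k^2,
    # where S = sum of all sigma^2. Invert this mixing (requires k > 2).
    resid_var = (data - data.mean(axis=1, keepdims=True)).var(axis=0, ddof=1)
    total = k * resid_var.sum() / (k - 1)               # estimate of S
    sigma2 = (resid_var - total / k**2) * k / (k - 2)
    print(np.round(np.sqrt(sigma2), 2))                 # close to true_sd
    ```

    With only a handful of cases the same estimator becomes noisy, which mirrors the paper's point that the number of cases and observers must be chosen to reach the desired precision.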

  20. Association Splitting: A randomized controlled trial of a new method to reduce craving among inpatients with alcohol dependence.

    PubMed

    Schneider, Brooke C; Moritz, Steffen; Hottenrott, Birgit; Reimer, Jens; Andreou, Christina; Jelinek, Lena

    2016-04-30

    Association Splitting, a novel cognitive intervention, was tested in patients with alcohol dependence as an add-on intervention in an initial randomized controlled trial. Preliminary support for Association Splitting has been found in patients with obsessive-compulsive disorder, as well as in an online pilot study of patients with alcohol use disorders. The present variant sought to reduce craving by strengthening neutral associations with alcohol-related stimuli, thus altering cognitive networks. Eighty-four inpatients with verified diagnoses of alcohol dependence, who were currently undergoing inpatient treatment, were randomly assigned to Association Splitting or Exercise Therapy. Craving was measured at baseline, 4-week follow-up, and six months later with the Obsessive-Compulsive Drinking Scale (primary outcome) and the Alcohol Craving Questionnaire. There was no advantage for Association Splitting after three treatment sessions relative to Exercise Therapy. Among Association Splitting participants, 51.9% endorsed a subjective decline in craving and 88.9% indicated that they would use Association Splitting in the future. Despite high acceptance, an additional benefit of Association Splitting beyond standard inpatient treatment was not found. Given that participants were concurrently undergoing inpatient treatment and Association Splitting has previously shown moderate effects, modification of the study design may improve the potential to detect significant effects in future trials.

  1. Randomized controlled trial to test a computerized psychosocial cancer assessment and referral program: methods and research design.

    PubMed

    O'Hea, Erin L; Cutillo, Alexandra; Dietzen, Laura; Harralson, Tina; Grissom, Grant; Person, Sharina; Boudreaux, Edwin D

    2013-05-01

    The National Cancer Coalition Network, National Cancer Institute, and American College of Surgeons all emphasize the need for oncology providers to identify, address, and monitor psychosocial needs of their patients. The Mental Health Assessment and Dynamic Referral for Oncology (MHADRO) is a patient-driven, computerized, psychosocial assessment that identifies, addresses, and monitors physical, psychological, and social issues faced by oncology patients. This paper presents the methodology of a randomized controlled trial (RCT) that tested the impact of the MHADRO on patient outcomes at 2, 6, and 12 months. Patient outcomes including overall psychological distress, depression, anxiety, functional disability, and use of psychosocial resources will be presented in future publications after all follow-up data is gathered. Eight hundred and thirty six cancer patients with heterogeneous diagnoses, across three comprehensive cancer centers in different parts of the United States, were randomized to the MHADRO (intervention) or an assessment-only control group. Patients in the intervention group were provided detailed, personalized reports and, when needed, referrals to mental health services; their oncology provider received detailed reports designed to foster clinical decision making. Those patients who demonstrated high levels of psychosocial problems were given the option to authorize that a copy of their report be sent electronically to a "best match" mental health professional. Demographic and patient cancer-related data as well as comparisons between patients who were enrolled and those who declined enrollment are presented. Challenges encountered during the RCT and strategies used to address them are discussed.

  2. Randomized clinical trial of multimodal physiotherapy treatment compared to overnight lidocaine ointment in women with provoked vestibulodynia: Design and methods.

    PubMed

    Morin, Mélanie; Dumoulin, Chantale; Bergeron, Sophie; Mayrand, Marie-Hélène; Khalifé, Samir; Waddell, Guy; Dubois, Marie-France

    2016-01-01

    Provoked vestibulodynia (PVD) is a highly prevalent and debilitating condition yet its management relies mainly on non-empirically validated interventions. Among the many causes of PVD, there is growing evidence that pelvic floor muscle (PFM) dysfunctions play an important role in its pathophysiology. Multimodal physiotherapy, which addresses these dysfunctions, is judged by experts to be highly effective and is recommended as a first-line treatment. However, the effectiveness of this promising intervention has been evaluated through only two small uncontrolled trials. The proposed bi-center, single-blind, parallel group, randomized controlled trial (RCT) aims to evaluate the efficacy of multimodal physiotherapy and compare it to a frequently used first-line treatment, topical overnight application of lidocaine, in women with PVD. A total of 212 women diagnosed with PVD according to a standardized protocol were eligible for the study and were randomly assigned to either multimodal physiotherapy or lidocaine treatment for 10 weeks. The primary outcome measure is pain during intercourse (assessed with a numerical rating scale). Secondary measures include sexual function, pain quality, psychological factors (including pain catastrophizing, anxiety, depression and fear of pain), PFM morphology and function, and patients' global impression of change. Assessments are made at baseline, post-treatment and at the 6-month follow-up. This manuscript presents and discusses the rationale, design and methodology of the first RCT investigating physiotherapy in comparison to a commonly prescribed first-line treatment, overnight topical lidocaine, for women with PVD.

  3. Adequate nutrient intake can reduce cardiovascular disease risk in African Americans.

    PubMed

    Reusser, Molly E; DiRienzo, Douglas B; Miller, Gregory D; McCarron, David A

    2003-03-01

    Cardiovascular disease kills nearly as many Americans each year as the next seven leading causes of death combined. The prevalence of cardiovascular disease and most of its associated risk factors is markedly higher and increasing more rapidly among African Americans than in any other racial or ethnic group. Improving these statistics may be simply a matter of improving diet quality. In recent years, a substantial and growing body of evidence has revealed that dietary patterns complete in all food groups, including nutrient-rich dairy products, are essential for preventing and reducing cardiovascular disease and the conditions that contribute to it. Several cardiovascular risk factors, including hypertension, insulin resistance syndrome, and obesity, have been shown to be positively influenced by dietary patterns that include adequate intake of dairy products. The benefits of nutrient-rich dietary patterns have been specifically tested in randomized, controlled trials emphasizing African American populations. These studies demonstrated proportionally greater benefits for African Americans without evidence of adverse effects such as symptoms of lactose intolerance. As currently promoted for the prevention of certain cancers and osteoporosis, regular consumption of diets that meet recommended nutrient intake levels might also be the most effective approach for reducing cardiovascular disease risk in African Americans.

  4. Duration of Pulmonary Tuberculosis Infectiousness under Adequate Therapy, as Assessed Using Induced Sputum Samples

    PubMed Central

    Ko, Yousang; Shin, Jeong Hwan; Lee, Hyun-Kyung; Lee, Young Seok; Lee, Suh-Young; Park, So Young; Mo, Eun-Kyung; Kim, Changhwan

    2017-01-01

    Background A sputum culture is the most reliable indicator of the infectiousness of pulmonary tuberculosis (PTB); however, a spontaneous sputum specimen may not be suitable. The aim of this study was to evaluate the infectious period in patients with non–drug-resistant (DR) PTB receiving adequate standard chemotherapy, using induced sputum (IS) specimens. Methods We evaluated the duration of infectiousness of PTB using a retrospective cohort design. Results Among the 35 patients with PTB, 22 were smear-positive. The rates of IS culture positivity from baseline to the sixth week of anti-tuberculosis medication in the smear-positive PTB group were 100%, 100%, 91%, 73%, 36%, and 18%, respectively. For smear-positive PTB cases, the median time of conversion to culture negativity was 35.0 days (range, 28.0–42.0 days). In the smear-negative PTB group (n=13), the weekly rates of positive IS culture were 100%, 77%, 39%, 8%, 0%, and 0%, respectively, and the median time to conversion to culture-negative was 21.0 days (range, 17.5–28.0 days). Conclusion The infectiousness of PTB, under adequate therapy, may persist longer than previously reported, even in patients with non-DR PTB. PMID:28119744

  5. A Novel Method for Assessment of Polyethylene Liner Wear in Radiopaque Tantalum Acetabular Cups: Clinical Validation in Patients Enrolled in a Randomized Controlled Trial.

    PubMed

    Troelsen, Anders; Greene, Meridith E; Ayers, David C; Bragdon, Charles R; Malchau, Henrik

    2015-12-01

    Conventional radiostereometric analysis (RSA) for wear is not possible in patients with tantalum cups. We propose a novel method for wear analysis in tantalum cups. Wear was assessed by gold standard RSA and the novel method in total hip arthroplasty patients enrolled in a randomized controlled trial receiving either titanium or tantalum cups (n=46). The novel method estimated the center of the head using a model based on identification of two proximal markers on the stem and knowledge of the stem/head configuration. The novel method was able to demonstrate a pattern of wear that was similar to the gold standard in titanium cups. The novel method offered accurate assessment and is a viable solution for assessment of wear in studies with tantalum cups.

  6. Comparison of 3D-OP-OSEM and 3D-FBP reconstruction algorithms for High-Resolution Research Tomograph studies: effects of randoms estimation methods

    NASA Astrophysics Data System (ADS)

    van Velden, Floris H. P.; Kloet, Reina W.; van Berckel, Bart N. M.; Wolfensberger, Saskia P. A.; Lammertsma, Adriaan A.; Boellaard, Ronald

    2008-06-01

    The High-Resolution Research Tomograph (HRRT) is a dedicated human brain positron emission tomography (PET) scanner. Recently, a 3D filtered backprojection (3D-FBP) reconstruction method has been implemented to reduce bias in short duration frames, currently observed in 3D ordinary Poisson OSEM (3D-OP-OSEM) reconstructions. Further improvements might be expected using a new method of variance reduction on randoms (VRR) based on coincidence histograms instead of using the delayed window technique (DW) to estimate randoms. The goal of this study was to evaluate VRR in combination with 3D-OP-OSEM and 3D-FBP reconstruction techniques. To this end, several phantom studies and a human brain study were performed. For most phantom studies, 3D-OP-OSEM showed higher accuracy of observed activity concentrations with VRR than with DW. However, both positive and negative deviations in reconstructed activity concentrations and large biases of the grey to white matter contrast ratio (up to 88%) were still observed as a function of scan statistics. Moreover, 3D-OP-OSEM+VRR also showed bias of up to 64% in clinical data, i.e., in some pharmacokinetic parameters, compared with those obtained with 3D-FBP+VRR. In the case of 3D-FBP, VRR showed results similar to DW for both phantom and clinical data, except that VRR yielded an improved standard deviation of 6-10%. Therefore, VRR should be used to correct for randoms in HRRT PET studies.
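
    As background for the randoms-estimation comparison: the delayed-window technique measures each line of response separately and is therefore noisy, whereas variance-reduced estimates pool many more counts, e.g. via the standard randoms-from-singles relation r_ij = 2*tau*s_i*s_j. The sketch below illustrates that generic principle only, not the HRRT's VRR implementation:

```python
import numpy as np

def randoms_from_singles(singles, tau):
    """Variance-reduced randoms estimate for every detector pair.

    For detectors with singles rates s_i and s_j and a coincidence
    window of width 2*tau, the expected random-coincidence rate on the
    line of response (i, j) is r_ij = 2 * tau * s_i * s_j. Because the
    singles rates are measured with far more counts than any single
    delayed-window channel, this estimate has much lower variance.
    """
    s = np.asarray(singles, dtype=float)
    return 2.0 * tau * np.outer(s, s)
```

    For example, two detectors with singles rates of 1e5 and 2e5 counts/s and tau = 3 ns give an expected randoms rate of 120 counts/s on their shared line of response.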

  7. Zinc content of selected tissues and taste perception in rats fed zinc deficient and zinc adequate rations

    SciTech Connect

    Boeckner, L.S.; Kies, C.

    1986-03-05

    The objective of the study was to determine the effects of feeding zinc sufficient and zinc deficient rations on taste sensitivity and zinc contents of selected organs in rats. The 36 Sprague-Dawley male weanling rats were divided into 2 groups and fed zinc deficient or zinc adequate rations. The animals were subjected to 4 trial periods in which a choice of deionized distilled water or a solution of quinine sulfate at 1.28 x 10^-6 was given. A randomized schedule for rat sacrifice was used. No differences were found between zinc deficient and zinc adequate rats in taste preference aversion scores for quinine sulfate in the first three trial periods; however, in the last trial period rats in the zinc sufficient group drank somewhat less water containing quinine sulfate as a percentage of total water consumption than did rats fed the zinc deficient ration. Significantly higher zinc contents of kidney, brain and parotid salivary glands were seen in zinc adequate rats compared to zinc deficient rats at the end of the study. However, liver and tongue zinc levels were lower for both groups at the close of the study than were those of rats sacrificed at the beginning of the study.

  8. A small concentration expansion for the effective heat conductivity of a random disperse two-component material; an assessment of Batchelor's renormalization method

    NASA Astrophysics Data System (ADS)

    Vanbeek, P.

    1987-11-01

    The difficulty presented by the divergence of certain integrals that average two-particle approximations, in expanding the effective properties of random disperse media in powers of the volume concentration c of the disperse phase, is considered. The random heat conduction problem analyzed by Jeffrey (1974) is treated using Batchelor's (1974) renormalization method. Batchelor's two-particle equation is extended to a hierarchical set of n-particle equations for arbitrary n. The solution of the hierarchy consists of a sequence of two-, three-, and more-particle terms. The two- and three-particle terms are calculated. It is proved that all i-particle terms (i ≥ 2) can be averaged convergently, showing that the hierarchical approach yields a well-defined expansion of the effective conductivity in integer powers of c. It follows that Jeffrey's expression for the effective conductivity is O(c^2)-accurate.
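
    For orientation, the leading term of such an expansion is Maxwell's classical dilute-suspension result, quoted here for context (a standard result, not taken from the abstract; the O(c^2) coefficient is the quantity Jeffrey computed). Writing alpha for the ratio of particle to matrix conductivity:

```latex
\frac{k_{\mathrm{eff}}}{k} \;=\; 1 + 3\beta c + O(c^{2}),
\qquad
\beta = \frac{\alpha - 1}{\alpha + 2}.
```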

  9. Impact of a mHealth intervention for peer health workers on AIDS care in rural Uganda: a mixed methods evaluation of a cluster-randomized trial.

    PubMed

    Chang, Larry W; Kagaayi, Joseph; Arem, Hannah; Nakigozi, Gertrude; Ssempijja, Victor; Serwadda, David; Quinn, Thomas C; Gray, Ronald H; Bollinger, Robert C; Reynolds, Steven J

    2011-11-01

    Mobile phone access in low- and middle-income countries is rapidly expanding and offers an opportunity to leverage limited human resources for health. We conducted a mixed methods evaluation of an exploratory substudy of a cluster-randomized trial on the impact of an mHealth (mobile phone) support intervention used by community-based peer health workers (PHWs) on AIDS care in rural Uganda. Twenty-nine PHWs at 10 clinics were randomized by clinic to receive the intervention or not. PHWs used phones to call and text higher-level providers with patient-specific clinical information. A total of 970 patients cared for by the PHWs were followed over a 26-month period. No significant differences were found in patients' risk of virologic failure. Qualitative analyses found improvements in patient care and logistics and broad support for the mHealth intervention among patients, clinic staff, and PHWs. Key challenges identified included variable patient phone access, privacy concerns, and phone maintenance.

  10. Certified randomness in quantum physics.

    PubMed

    Acín, Antonio; Masanes, Lluis

    2016-12-07

    The concept of randomness plays an important part in many disciplines. On the one hand, the question of whether random processes exist is fundamental for our understanding of nature. On the other, randomness is a resource for cryptography, algorithms and simulations. Standard methods for generating randomness rely on assumptions about the devices that are often not valid in practice. However, quantum technologies enable new methods for generating certified randomness, based on the violation of Bell inequalities. These methods are referred to as device-independent because they do not rely on any modelling of the devices. Here we review efforts to design device-independent randomness generators and the associated challenges.
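
    The Bell-violation certificate referred to here is usually the CHSH combination of four measured correlators; any local deterministic device is bounded by |S| ≤ 2, while quantum devices can reach 2√2, and a value above 2 certifies that the outcomes contain genuine randomness regardless of the devices' internals. A minimal sketch of the combination itself:

```python
import math

def chsh(E):
    """CHSH value S from the four correlators E[x][y] = <A_x B_y>.

    x, y in {0, 1} index the two measurement settings on each side.
    |S| <= 2 for any local deterministic (pre-programmed) device;
    quantum mechanics allows |S| up to 2*sqrt(2) (Tsirelson's bound).
    """
    return E[0][0] + E[0][1] + E[1][0] - E[1][1]
```

    With the ideal quantum correlators (each equal to ±1/√2), `chsh` returns 2√2, the maximal violation.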

  11. Certified randomness in quantum physics

    NASA Astrophysics Data System (ADS)

    Acín, Antonio; Masanes, Lluis

    2016-12-01

    The concept of randomness plays an important part in many disciplines. On the one hand, the question of whether random processes exist is fundamental for our understanding of nature. On the other, randomness is a resource for cryptography, algorithms and simulations. Standard methods for generating randomness rely on assumptions about the devices that are often not valid in practice. However, quantum technologies enable new methods for generating certified randomness, based on the violation of Bell inequalities. These methods are referred to as device-independent because they do not rely on any modelling of the devices. Here we review efforts to design device-independent randomness generators and the associated challenges.

  12. Randomized SUSAN edge detector

    NASA Astrophysics Data System (ADS)

    Qu, Zhi-Guo; Wang, Ping; Gao, Ying-Hui; Wang, Peng

    2011-11-01

    A speed-up technique for the SUSAN edge detector based on random sampling is proposed. Instead of sliding the mask pixel by pixel over an image as the SUSAN edge detector does, the proposed scheme places the mask on randomly chosen pixels to find edges in the image; we hereby name it the randomized SUSAN edge detector (R-SUSAN). Specifically, the R-SUSAN edge detector adopts three approaches in the framework of random sampling to accelerate the SUSAN edge detector: procedure integration of response computation and nonmaxima suppression, reduction of unnecessary processing for obvious non-edge pixels, and early termination. Experimental results demonstrate the effectiveness of the proposed method.
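
    The random-sampling idea can be sketched as follows. This is a minimal illustration of the principle only, not the authors' implementation: it shows the USAN edge response and the random mask placement, while nonmaxima suppression and early termination are omitted; parameter values are the usual SUSAN defaults:

```python
import numpy as np

def usan_response(img, x, y, radius=3, t=27.0, g_frac=0.75):
    """SUSAN edge response at one pixel.

    The USAN is the set of mask pixels whose brightness is similar
    (within roughly t) to the nucleus; a USAN area below the geometric
    threshold g signals an edge.
    """
    h, w = img.shape
    nucleus = float(img[y, x])
    area, n_mask = 0.0, 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dx * dx + dy * dy > radius * radius:
                continue  # keep the mask circular
            yy, xx = y + dy, x + dx
            if 0 <= yy < h and 0 <= xx < w:
                n_mask += 1
                # smooth similarity measure used by the SUSAN detector
                area += np.exp(-((float(img[yy, xx]) - nucleus) / t) ** 6)
    return max(g_frac * n_mask - area, 0.0)

def randomized_susan(img, n_samples=2000, radius=3, rng=None):
    """R-SUSAN idea: evaluate the mask at randomly sampled pixels
    instead of sliding it over every pixel of the image."""
    if rng is None:
        rng = np.random.default_rng(0)
    h, w = img.shape
    edges = []
    for _ in range(n_samples):
        x = int(rng.integers(radius, w - radius))
        y = int(rng.integers(radius, h - radius))
        if usan_response(img, x, y, radius) > 0.0:
            edges.append((x, y))
    return edges
```

    On a synthetic image with a single vertical step edge, the sampled responses fire only in the one-or-two-pixel band around the step, which is the behaviour the full detector would localize with nonmaxima suppression.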

  13. Salt sales survey: a simplified, cost-effective method to evaluate population salt reduction programs--a cluster-randomized trial.

    PubMed

    Ma, Yuan; He, Feng J; Li, Nicole; Hao, Jesse; Zhang, Jing; Yan, Lijing L; Wu, Yangfeng

    2016-04-01

    Twenty-four-hour urine collection, the gold standard method of measuring salt intake, is costly and resource consuming, which limits its use in monitoring population salt reduction programs. Our study aimed to determine whether a salt sales survey could serve as an alternative method. This was a substudy of the China Rural Health Initiative-Sodium Reduction Study (CRHI-SRS), in which 120 villages were randomly allocated (1:1:2) into a price subsidy+health education (PS+HE) group, an HE-only group, or a control group. Salt substitutes (SS) were supplied to shops in the intervention groups; 24-h urine was collected from 2567 randomly selected adults at the end of the trial to evaluate the effects of the intervention. Ten villages were randomly selected from each group (that is, 30 villages in total), and 166 shops from these villages were invited to participate in the monthly salt sales survey. The results showed that during the intervention period, mean monthly sales of SS per shop were 38.0 kg for the PS+HE group, 19.2 kg for the HE-only group, and 2.2 kg for the control group (P<0.05), which was consistent with the results from the 24-h urine sodium and potassium data. The intervention effects of CRHI-SRS on sodium and potassium intake estimated from SS sales were 101% and 114%, respectively, of those observed from the 24-h urine data. Furthermore, the salt sales survey cost only 14% of the cost of the 24-h urine method and had greater statistical power. The results indicate that a salt sales survey could serve as a simple, sensitive and cost-effective method to evaluate community-based salt reduction programs in which salt is mainly added by the consumers.

  14. Adaptive robust image registration approach based on adequately sampling polar transform and weighted angular projection function

    NASA Astrophysics Data System (ADS)

    Wei, Zhao; Tao, Feng; Jun, Wang

    2013-10-01

    An efficient, robust, and accurate approach is developed for image registration, which is especially suitable for large-scale change and arbitrary rotation. It is named the adequately sampling polar transform and weighted angular projection function (ASPT-WAPF). The proposed ASPT model overcomes the oversampling problem of conventional log-polar transform. Additionally, the WAPF presented as the feature descriptor is robust to the alteration in the fovea area of an image, and reduces the computational cost of the following registration process. The experimental results show two major advantages of the proposed method. First, it can register images with high accuracy even when the scale factor is up to 10 and the rotation angle is arbitrary. However, the maximum scaling estimated by the state-of-the-art algorithms is 6. Second, our algorithm is more robust to the size of the sampling region while not decreasing the accuracy of the registration.
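
    Polar-type resampling underlies this family of registration methods: rotation and scaling of the input become shifts along the transform axes, which can then be recovered by correlation. The sketch below shows the standard nearest-neighbour log-polar transform for illustration; the paper's ASPT modifies the radial sampling to avoid the oversampling near the centre that this plain version exhibits:

```python
import numpy as np

def log_polar(img, n_rho=64, n_theta=64):
    """Resample an image on a log-polar grid about its centre.

    A rotation of the input becomes a circular shift along the theta
    axis, and a scaling becomes a shift along the log-rho axis.
    Nearest-neighbour sampling is used for brevity.
    """
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    max_r = min(cy, cx)
    rho = np.exp(np.linspace(0.0, np.log(max_r), n_rho))
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    rr, tt = np.meshgrid(rho, theta, indexing="ij")
    ys = np.clip(np.round(cy + rr * np.sin(tt)).astype(int), 0, h - 1)
    xs = np.clip(np.round(cx + rr * np.cos(tt)).astype(int), 0, w - 1)
    return img[ys, xs]
```

    Because the radial grid is exponential, rows near the centre resample the same few pixels repeatedly; that redundancy is the oversampling problem the ASPT is designed to remove.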

  15. J-modulated ADEQUATE experiments using different kinds of refocusing pulses.

    PubMed

    Thiele, Christina M; Bermel, Wolfgang

    2007-10-01

    Owing to recent developments concerning residual dipolar couplings (RDCs), interest in methods for the accurate determination of coupling constants is resurging. We intended to use the J-modulated ADEQUATE experiment by Kövér et al. for the measurement of (13)C-(13)C coupling constants at natural abundance. The use of adiabatic composite chirp pulses instead of the conventional 180 degree pulses, which compensate for the offset dependence of (13)C 180 degree pulses, led to irregularities of the line shapes in the indirect dimension, causing deviations in the extracted coupling constants. This behaviour was attributed to coupling evolution during the time of the adiabatic pulse (2 ms) in the J-modulation spin echo. The replacement of this pulse by different kinds of refocusing pulses indicated that a pair of BIPs (broadband inversion pulses), which behave only partially adiabatically, leads to correct line shapes and coupling constants while conserving the good sensitivity obtained with adiabatic pulses.

  16. A Self-Administered Method of Acute Pressure Block of Sciatic Nerves for Short-Term Relief of Dental Pain: A Randomized Study

    PubMed Central

    Wang, Xiaolin; Zhao, Wanghong; Wang, Ye; Hu, Jiao; Chen, Qiu; Yu, Juncai; Wu, Bin; Huang, Rong; Gao, Jie; He, Jiman

    2014-01-01

    Objectives While stimulation of the peripheral nerves increases the pain threshold, chronic pressure stimulation of the sciatic nerve is associated with sciatica. We recently found that acute pressure block of the sciatic nerve inhibits pain. Therefore, we propose that the pressure causing pain pathology is chronic, not acute. Here, we report a novel self-administered method: acute pressure block of the sciatic nerves applied by the patients themselves for short-term relief of pain from dental diseases. Design This was a randomized, single-blind study. Setting Hospital patients. Patients Patients aged 16–60 years with acute pulpitis, acute apical periodontitis, or pericoronitis of the third molar of the mandible experiencing pain ≥3 on the 11-point numerical pain rating scale. Interventions Three minutes of pressure to the sciatic nerves was applied by using the hands (hand pressure method) or by having the patients squat to force the thigh and shin as tightly as possible onto the sandwiched sciatic nerve bundles (self-administered method). Outcomes The primary efficacy variable was the mean difference in pain scores from the baseline. Results One hundred seventy-two dental patients were randomized. The self-administered method produced significant relief from pain associated with dental diseases (P ≤ 0.001). The analgesic effect of the self-administered method was similar to that of the hand pressure method. Conclusions The self-administered method is easy to learn and can be applied at any time for pain relief. We believe that patients will benefit from this method. PMID:24400593

  17. Quantum random number generators

    NASA Astrophysics Data System (ADS)

    Herrero-Collantes, Miguel; Garcia-Escartin, Juan Carlos

    2017-01-01

    Random numbers are a fundamental resource in science and engineering with important applications in simulation and cryptography. The inherent randomness at the core of quantum mechanics makes quantum systems a perfect source of entropy. Quantum random number generation is one of the most mature quantum technologies with many alternative generation methods. This review discusses the different technologies in quantum random number generation from the early devices based on radioactive decay to the multiple ways to use the quantum states of light to gather entropy from a quantum origin. Randomness extraction and amplification and the notable possibility of generating trusted random numbers even with untrusted hardware using device-independent generation protocols are also discussed.
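
    Randomness extraction, mentioned above, turns weakly random raw bits into nearly uniform ones. The classic von Neumann extractor is the simplest example and is shown here as a textbook sketch, not tied to any particular quantum generator:

```python
def von_neumann_extract(bits):
    """Debias a stream of independent but biased coin flips.

    Raw bits are consumed in non-overlapping pairs: (0,1) emits 0,
    (1,0) emits 1, and (0,0)/(1,1) are discarded. If the input bits
    are i.i.d., the output is exactly unbiased, whatever the bias,
    at the cost of throwing away most of the stream.
    """
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out
```

    Modern extractors used with quantum random number generators are far more efficient, but the pairing trick illustrates the core idea: trade raw-bit rate for guaranteed uniformity.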

  18. Translating Research on Myoelectric Control into Clinics-Are the Performance Assessment Methods Adequate?

    PubMed

    Vujaklija, Ivan; Roche, Aidan D; Hasenoehrl, Timothy; Sturma, Agnes; Amsuess, Sebastian; Farina, Dario; Aszmann, Oskar C

    2017-01-01

    Missing an upper limb dramatically impairs daily-life activities. Efforts to overcome the issues arising from this disability have been made in both academia and industry, although their clinical outcome is still limited. Translation of prosthetic research into clinics has been challenging because of the difficulties in meeting the necessary requirements of the market. In this perspective article, we suggest that one relevant factor behind the relatively small clinical impact of myocontrol algorithms for upper limb prostheses is the limitations of commonly used laboratory performance metrics. The laboratory conditions in which the majority of the solutions are evaluated fail to sufficiently replicate real-life challenges. We qualitatively support this argument with representative data from seven transradial amputees. Their ability to control a myoelectric prosthesis was tested by measuring the accuracy of offline EMG signal classification, a typical laboratory performance metric, as well as by clinical scores when performing standard tests of daily living. Despite all subjects reaching relatively high classification accuracy offline, their clinical scores varied greatly and were not strongly predicted by classification accuracy. We therefore support the suggestion to test myocontrol systems using clinical tests on amputees, fully fitted with sockets and prostheses highly resembling the systems they would use in daily living, as an evaluation benchmark. Agreement on this level of testing for systems developed in research laboratories would facilitate clinically relevant progress in this field.

  19. Translating Research on Myoelectric Control into Clinics—Are the Performance Assessment Methods Adequate?

    PubMed Central

    Vujaklija, Ivan; Roche, Aidan D.; Hasenoehrl, Timothy; Sturma, Agnes; Amsuess, Sebastian; Farina, Dario; Aszmann, Oskar C.

    2017-01-01

    Missing an upper limb dramatically impairs daily-life activities. Efforts to overcome the issues arising from this disability have been made in both academia and industry, although their clinical outcome is still limited. Translation of prosthetic research into clinics has been challenging because of the difficulties in meeting the necessary requirements of the market. In this perspective article, we suggest that one relevant factor behind the relatively small clinical impact of myocontrol algorithms for upper limb prostheses is the limitations of commonly used laboratory performance metrics. The laboratory conditions in which the majority of the solutions are evaluated fail to sufficiently replicate real-life challenges. We qualitatively support this argument with representative data from seven transradial amputees. Their ability to control a myoelectric prosthesis was tested by measuring the accuracy of offline EMG signal classification, a typical laboratory performance metric, as well as by clinical scores when performing standard tests of daily living. Despite all subjects reaching relatively high classification accuracy offline, their clinical scores varied greatly and were not strongly predicted by classification accuracy. We therefore support the suggestion to test myocontrol systems using clinical tests on amputees, fully fitted with sockets and prostheses highly resembling the systems they would use in daily living, as an evaluation benchmark. Agreement on this level of testing for systems developed in research laboratories would facilitate clinically relevant progress in this field. PMID:28261085

  20. Investigation on wide-band scattering of a 2-D target above 1-D randomly rough surface by FDTD method.

    PubMed

    Li, Juan; Guo, Li-Xin; Jiao, Yong-Chang; Li, Ke

    2011-01-17

    A finite-difference time-domain (FDTD) algorithm with a pulse wave excitation is used to investigate the wide-band composite scattering from a two-dimensional (2-D) infinitely long target with arbitrary cross section located above a one-dimensional (1-D) randomly rough surface. The FDTD calculation is performed with a pulse wave incidence, and the 2-D representative time-domain scattered field in the far zone is obtained directly by extrapolating the currently calculated data on the output boundary. Then the 2-D wide-band scattering result is acquired by transforming the representative time-domain field to the frequency domain with a Fourier transform. Taking the composite scattering of an infinitely long cylinder above a rough surface as an example, the wide-band response in the far zone by FDTD with the pulsed excitation is computed and shows good agreement with the numerical result by FDTD with sinusoidal illumination. Finally, the normalized radar cross section (NRCS) from a 2-D target above a 1-D rough surface versus the incident frequency, and the representative scattered fields in the far zone versus time, are analyzed in detail.
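
    The pulse-then-transform step works because a single time-domain run contains every frequency in the excitation band: dividing the spectrum of the recorded far-zone field by that of the incident pulse yields the wide-band response in one simulation. The sketch below shows that generic post-processing step (function and parameter names are illustrative; it is not the paper's FDTD extrapolation itself):

```python
import numpy as np

def wideband_response(excitation, recorded, dt, band):
    """Recover the frequency response over a band from one pulsed run.

    excitation and recorded are equal-length real time series sampled
    at interval dt; band = (f_lo, f_hi) restricts the result to the
    frequencies where the pulse actually carries energy, so the
    spectral division stays well conditioned.
    """
    n = len(excitation)
    freqs = np.fft.rfftfreq(n, dt)
    sel = (freqs >= band[0]) & (freqs <= band[1])
    H = np.fft.rfft(recorded)[sel] / np.fft.rfft(excitation)[sel]
    return freqs[sel], H
```

    As a sanity check, a recorded signal that is simply the pulse delayed and halved should give |H| = 0.5 across the band, which is exactly what a sinusoidal (single-frequency) run would measure at each frequency separately.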

  1. Switching characteristics in Cu:SiO2 by chemical soak methods for resistive random access memory (ReRAM)

    NASA Astrophysics Data System (ADS)

    Chin, Fun-Tat; Lin, Yu-Hsien; Yang, Wen-Luh; Liao, Chin-Hsuan; Lin, Li-Min; Hsiao, Yu-Ping; Chao, Tien-Sheng

    2015-01-01

    A limited copper (Cu)-source Cu:SiO2 switching layer composed of various Cu concentrations was fabricated using a chemical soaking (CS) technique. The switching layer was then studied for developing applications in resistive random access memory (ReRAM) devices. Observing the resistive switching mechanism exhibited by all the samples suggested that Cu conductive filaments formed and ruptured during the set/reset process. The experimental results indicated that the endurance property failure that occurred was related to the Joule heating effect. Moreover, the endurance switching cycle increased as the Cu concentration decreased. In high-temperature tests, the samples demonstrated that the operating (set/reset) voltages decreased as the temperature increased, and an Arrhenius plot was used to calculate the activation energy of the set/reset process. In addition, the samples demonstrated stable data retention properties when baked at 85 °C, but the samples with low Cu concentrations exhibited short retention times in the low-resistance state (LRS) during 125 °C tests. Therefore, Cu concentration is a crucial factor in the trade-off between the endurance and retention properties; furthermore, the Cu concentration can be easily modulated using this CS technique.
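
    The Arrhenius analysis mentioned for the set/reset process can be sketched generically: for a thermally activated quantity of the form A*exp(-Ea/(kB*T)), the slope of ln(rate) against 1/T is -Ea/kB. The code below illustrates the fit on synthetic data, not the paper's measurements:

```python
import numpy as np

K_B = 8.617e-5  # Boltzmann constant in eV/K

def activation_energy(temps_K, rates):
    """Extract Ea (eV) from an Arrhenius plot.

    For rate = A * exp(-Ea / (K_B * T)), ln(rate) is linear in 1/T
    with slope -Ea / K_B, so a straight-line fit recovers Ea.
    """
    inv_T = 1.0 / np.asarray(temps_K, dtype=float)
    slope, _ = np.polyfit(inv_T, np.log(rates), 1)
    return -slope * K_B
```

    Feeding in rates generated with a known 0.6 eV barrier returns 0.6 eV, confirming the extraction is just a linear fit in the transformed coordinates.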

  2. Parallelization of a spatial random field characterization process using the Method of Anchored Distributions and the HTCondor high throughput computing system

    NASA Astrophysics Data System (ADS)

    Osorio-Murillo, C. A.; Over, M. W.; Frystacky, H.; Ames, D. P.; Rubin, Y.

    2013-12-01

    A new software application called MAD# has been coupled with the HTCondor high throughput computing system to aid scientists and educators with the characterization of spatial random fields and to enable understanding of the spatial distribution of parameters used in hydrogeologic and related modeling. MAD# is an open-source desktop application for characterizing spatial random fields using direct and indirect information through a Bayesian inverse modeling technique called the Method of Anchored Distributions (MAD). MAD relates indirect information to a target spatial random field via a forward simulation model. MAD# executes the inverse process by running the forward model multiple times to transfer information from the indirect data to the target variable. MAD# uses two parallelization profiles according to the computational resources available: one computer with multiple cores, or multiple computers with multiple cores through HTCondor. HTCondor is a system that manages a cluster of desktop computers, submitting serial or parallel jobs using scheduling policies, resource monitoring, and a job queuing mechanism. This poster shows how MAD# reduces the execution time of random field characterization using these two parallel approaches in different case studies. A test of the approach was conducted using a 1D problem with 400 cells to characterize the saturated conductivity, residual water content, and shape parameters of the Mualem-van Genuchten model in four materials via the HYDRUS model. The number of simulations evaluated in the inversion was 10 million. Using the one-computer approach (eight cores), 100,000 simulations were evaluated in 12 hours (approximately 1200 hours for 10 million). In the HTCondor evaluation, 32 desktop computers (132 cores) were used, with a non-continuous processing time of 60 hours over five days. HTCondor thus reduced the processing time for uncertainty characterization by a factor of 20 (from 1200 hours to 60 hours).
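
    The throughput arithmetic quoted in the abstract checks out directly (all numbers taken from the abstract itself):

```python
# One 8-core machine: 100,000 simulations in 12 h extrapolates
# linearly to the full 10-million-simulation inversion.
single_machine_hours = 12 * (10_000_000 / 100_000)

# The 132-core HTCondor pool finished in 60 (non-continuous) hours.
speedup = single_machine_hours / 60
```

    This gives 1200 hours for the single machine and the factor-of-20 reduction the poster reports.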

  3. Scattering of electromagnetic waves from 3D multilayer random rough surfaces based on the second-order small perturbation method: energy conservation, reflectivity, and emissivity.

    PubMed

    Sanamzadeh, Mohammadreza; Tsang, Leung; Johnson, Joel T; Burkholder, Robert J; Tan, Shurun

    2017-03-01

    A theoretical investigation of energy conservation, reflectivity, and emissivity in the scattering of electromagnetic waves from 3D multilayer media with random rough interfaces using the second-order small perturbation method (SPM2) is presented. The approach is based on the extinction theorem and develops integral equations for surface fields in the spectral domain. Using the SPM2, we calculate the scattered and transmitted coherent fields and incoherent fields. Reflected and transmitted powers are then found in the form of 2D integrations over wavenumber in the spectral domain. In the integrand, there is a summation over the spectral densities of each of the rough interfaces with each weighted by a corresponding kernel function. We show in this paper that there exists a "strong" condition of energy conservation in that the kernel functions multiplying the spectral density of each interface obey energy conservation exactly. This means that energy is conserved independent of the roughness spectral densities of the rough surfaces. Results of this strong condition are illustrated numerically for up to 50 rough interfaces without requiring specification of surface roughness properties. Two examples are illustrated. One is a multilayer configuration having weak contrasts between adjacent layers, random layer thicknesses, and randomly generated permittivity profiles. The second example is a photonic crystal of periodically alternating permittivities of larger dielectric contrast. The methodology is applied to study the effect of roughness on the brightness temperatures of the Antarctic ice sheet, which is characterized by layers of ice with permittivity fluctuations in addition to random rough interfaces. The results show that the influence of roughness can significantly increase horizontally polarized thermal emission while leaving vertically polarized emissions relatively unaffected.

  4. The Human Right to Adequate Housing: A Tool for Promoting and Protecting Individual and Community Health

    PubMed Central

    Thiele, Bret

    2002-01-01

    The human right to adequate housing is enshrined in international law. The right to adequate housing can be traced to the Universal Declaration of Human Rights, which was unanimously adopted by the world community in 1948. Since that time, the right to adequate housing has been reaffirmed on numerous occasions and further defined and elaborated. A key component of this right is habitability of housing, which should comply with health and safety standards. Therefore, the right to adequate housing provides an additional tool for advocates and others interested in promoting healthful housing and living conditions and thereby protecting individual and community health. PMID:11988432

  5. A novel method for diagnosis of smear-negative tuberculosis patients by combining a random unbiased Phi29 amplification with a specific real-time PCR.

    PubMed

    Pang, Yu; Lu, Jie; Yang, Jian; Wang, Yufeng; Cohen, Chad; Ni, Xin; Zhao, Yanlin

    2015-07-01

    In this study, we develop a novel method for diagnosis of smear-negative tuberculosis patients by performing a random unbiased Phi29 amplification prior to the use of a specific real-time PCR. The limit of detection (LOD) of the conventional real-time PCR was 100 colony-forming units (CFU) of MTB genome/reaction, while the REPLI real-time PCR assay could detect 0.4 CFU/reaction. In comparison with the conventional real-time PCR, REPLI real-time PCR shows better sensitivity for the detection of smear-negative tuberculosis (P = 0.015).

  6. Free variable selection QSPR study to predict (19)F chemical shifts of some fluorinated organic compounds using Random Forest and RBF-PLS methods.

    PubMed

    Goudarzi, Nasser

    2016-04-05

    In this work, two new and powerful chemometrics methods are applied for the modeling and prediction of the (19)F chemical shift values of some fluorinated organic compounds. The radial basis function-partial least squares (RBF-PLS) and random forest (RF) methods are employed to construct models to predict the (19)F chemical shifts. No separate variable selection method was used in this study, since the RF method can serve as both a variable selection and a modeling technique. Effects of the important parameters affecting the prediction power of RF, such as the number of trees (nt) and the number of randomly selected variables to split each node (m), were investigated. The root-mean-square errors of prediction (RMSEP) for the training set and the prediction set for the RBF-PLS and RF models were 44.70, 23.86, 29.77, and 23.69, respectively. Also, the correlation coefficients of the prediction set for the RBF-PLS and RF models were 0.8684 and 0.9313, respectively. The results reveal that the RF model can be used as a powerful chemometrics tool for quantitative structure-property relationship (QSPR) studies.
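
    The RF modeling step can be sketched as follows, assuming scikit-learn's RandomForestRegressor as the implementation (the abstract does not name its software) and a synthetic descriptor matrix in place of the real molecular descriptors:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def fit_qspr_rf(X, y, nt=200, m=None, seed=0):
    """Fit an RF model mapping molecular descriptors to 19F shifts.

    nt (number of trees) and m (descriptors tried at each split,
    sklearn's max_features) are the two tuning parameters the abstract
    highlights; m=None tries all descriptors at every split. RF also
    ranks descriptor importance internally, which is why no separate
    variable selection step is needed.
    """
    rf = RandomForestRegressor(n_estimators=nt, max_features=m,
                               random_state=seed)
    return rf.fit(X, y)

def rmsep(y_true, y_pred):
    """Root-mean-square error of prediction, the reported figure of merit."""
    d = np.asarray(y_true, float) - np.asarray(y_pred, float)
    return float(np.sqrt(np.mean(d ** 2)))
```

    On synthetic data with a known descriptor-shift relationship, the fitted forest reproduces the response closely, and RMSEP plus the observed-vs-predicted correlation coefficient are the same two statistics the study reports for model comparison.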

  7. AFLP fragment isolation technique as a method to produce random sequences for single nucleotide polymorphism discovery in the green turtle, Chelonia mydas.

    PubMed

    Roden, Suzanne E; Dutton, Peter H; Morin, Phillip A

    2009-01-01

    The green sea turtle, Chelonia mydas, was used as a case study for single nucleotide polymorphism (SNP) discovery in a species that has little genetic sequence information available. As green turtles have a complex population structure, additional nuclear markers other than microsatellites could add to our understanding of their complex life history. The amplified fragment length polymorphism (AFLP) technique was used to generate sets of random fragments of genomic DNA, which were then electrophoretically separated with precast gels, stained with SYBR green, excised, and directly sequenced. It was possible to perform this method without the use of polyacrylamide gels, radioactive or fluorescent labeled primers, or hybridization methods, reducing the time, expense, and safety hazards of SNP discovery. Within 13 loci, 2547 base pairs were screened, resulting in the discovery of 35 SNPs. Using this method, it was possible to yield a sufficient number of loci to screen for SNP markers without the availability of prior sequence information.

  8. The use of propensity score methods with survival or time-to-event outcomes: reporting measures of effect similar to those used in randomized experiments.

    PubMed

    Austin, Peter C

    2014-03-30

    Propensity score methods are increasingly being used to estimate causal treatment effects in observational studies. In medical and epidemiological studies, outcomes are frequently time-to-event in nature. Propensity-score methods are often applied incorrectly when estimating the effect of treatment on time-to-event outcomes. This article describes how two different propensity score methods (matching and inverse probability of treatment weighting) can be used to estimate the measures of effect that are frequently reported in randomized controlled trials: (i) marginal survival curves, which describe survival in the population if all subjects were treated or if all subjects were untreated; and (ii) marginal hazard ratios. The use of these propensity score methods allows one to replicate the measures of effect that are commonly reported in randomized controlled trials with time-to-event outcomes: both absolute and relative reductions in the probability of an event occurring can be determined. We also provide guidance on variable selection for the propensity score model, highlight methods for assessing the balance of baseline covariates between treated and untreated subjects, and describe the implementation of a sensitivity analysis to assess the effect of unmeasured confounding variables on the estimated treatment effect when outcomes are time-to-event in nature. The methods in the paper are illustrated by estimating the effect of discharge statin prescribing on the risk of death in a sample of patients hospitalized with acute myocardial infarction. In this tutorial article, we describe and illustrate all the steps necessary to conduct a comprehensive analysis of the effect of treatment on time-to-event outcomes.
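
    The inverse-probability-of-treatment weighting step can be sketched as follows, assuming a logistic-regression propensity model fitted with scikit-learn (one common choice; the tutorial's guidance on variable selection for the propensity model still applies, and the data here are synthetic):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def iptw_weights(X, treated):
    """IPTW weights from an estimated propensity score.

    The propensity score e(x) = P(treated | baseline covariates X) is
    estimated by logistic regression. Treated subjects get weight
    1/e(x), untreated 1/(1 - e(x)), so each weighted pseudo-population
    resembles the full cohort; a weighted Kaplan-Meier analysis of the
    time-to-event outcome then estimates the marginal survival curves
    described in the abstract.
    """
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    w = np.where(treated == 1, 1.0 / ps, 1.0 / (1.0 - ps))
    return w, ps
```

    A quick balance check on confounded synthetic data shows the point of the weights: the raw covariate gap between treated and untreated subjects is large, while the weighted gap shrinks toward zero, which is exactly the baseline-balance assessment the tutorial recommends performing.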

  9. A two-stage adaptive stochastic collocation method on nested sparse grids for multiphase flow in randomly heterogeneous porous media

    NASA Astrophysics Data System (ADS)

    Liao, Qinzhuo; Zhang, Dongxiao; Tchelepi, Hamdi

    2017-02-01

    A new computational method is proposed for efficient uncertainty quantification of multiphase flow in porous media with stochastic permeability. For pressure estimation, it combines the dimension-adaptive stochastic collocation method on Smolyak sparse grids and the Kronrod-Patterson-Hermite nested quadrature formulas. For saturation estimation, an additional stage is developed, in which the pressure and velocity samples are first generated by the sparse grid interpolation and then substituted into the transport equation to solve for the saturation samples, to address the low regularity problem of the saturation. Numerical examples are presented for multiphase flow with stochastic permeability fields to demonstrate accuracy and efficiency of the proposed two-stage adaptive stochastic collocation method on nested sparse grids.

  10. Use of Linear Programming to Develop Cost-Minimized Nutritionally Adequate Health Promoting Food Baskets

    PubMed Central

    Tetens, Inge; Dejgård Jensen, Jørgen; Smed, Sinne; Gabrijelčič Blenkuš, Mojca; Rayner, Mike; Darmon, Nicole; Robertson, Aileen

    2016-01-01

    Background Food-Based Dietary Guidelines (FBDGs) are developed to promote healthier eating patterns, but increasing food prices may make healthy eating less affordable. The aim of this study was to design a range of cost-minimized, nutritionally adequate, health-promoting food baskets (FBs) that help prevent both micronutrient inadequacy and diet-related non-communicable diseases at the lowest cost. Methods Average prices for 312 foods were collected within the Greater Copenhagen area. The cost and nutrient content of five different cost-minimized FBs for a family of four were calculated per day using linear programming. The FBs were defined using five different constraints: cultural acceptability (CA), dietary guidelines (DG), nutrient recommendations (N), cultural acceptability plus nutrient recommendations (CAN), or dietary guidelines plus nutrient recommendations (DGN). The variety and number of foods in each of the resulting five baskets were increased by limiting the relative share of individual foods. Results The one-day version of N contained only 12 foods at a minimum cost of DKK 27 (€ 3.6). The CA, DG, and DGN baskets cost about twice this, and the CAN basket cost ~DKK 81 (€ 10.8). The baskets with the greater variety of foods contained from 70 (CAN) to 134 (DGN) foods and cost between DKK 60 (€ 8.1, N) and DKK 125 (€ 16.8, DGN). Ensuring that the food baskets cover dietary guidelines as well as nutrient recommendations doubled the cost, while adding cultural acceptability (CAN) tripled it. Conclusion Use of linear programming facilitates the generation of low-cost food baskets that are nutritionally adequate, health promoting, and culturally acceptable. PMID:27760131
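
    The cost-minimization idea can be sketched with a toy two-food diet problem. Since a linear program attains its optimum at a vertex of the feasible region, a tiny solver can simply enumerate intersections of constraint boundaries; all prices and nutrient values below are hypothetical, not the study's Copenhagen data.

```python
from itertools import combinations

def solve_2var_lp(cost, A, b):
    """Minimize cost.x subject to A x >= b, x >= 0, for two variables.
    The optimum of a linear program lies at a vertex of the feasible
    region, so this toy solver enumerates intersections of constraint
    boundaries and keeps the cheapest feasible one."""
    lines = [(A[i][0], A[i][1], b[i]) for i in range(len(A))]
    lines += [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]  # the axes x = 0, y = 0
    best = None
    for (a1, b1, c1), (a2, b2, c2) in combinations(lines, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-12:
            continue  # parallel boundaries, no vertex
        x = (c1 * b2 - c2 * b1) / det          # Cramer's rule
        y = (a1 * c2 - a2 * c1) / det
        feasible = (x >= -1e-9 and y >= -1e-9 and
                    all(A[i][0] * x + A[i][1] * y >= b[i] - 1e-6
                        for i in range(len(A))))
        if feasible:
            value = cost[0] * x + cost[1] * y
            if best is None or value < best[0]:
                best = (value, x, y)
    return best  # (minimum cost, amount of food 1, amount of food 2)

# Hypothetical two-food basket: cost per 100 g, then kcal and protein
# per 100 g, with daily requirements of 2000 kcal and 60 g protein.
print(solve_2var_lp([0.5, 0.3], [[389, 64], [17, 3]], [2000, 60]))
```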

  11. Individualized chiropractic and integrative care for low back pain: the design of a randomized clinical trial using a mixed-methods approach

    PubMed Central

    2010-01-01

    Background Low back pain (LBP) is a prevalent and costly condition in the United States. Evidence suggests there is no one treatment which is best for all patients, but instead several viable treatment options. Additionally, multidisciplinary management of LBP may be more effective than monodisciplinary care. An integrative model that includes both complementary and alternative medicine (CAM) and conventional therapies, while also incorporating patient choice, has yet to be tested for chronic LBP. The primary aim of this study is to determine the relative clinical effectiveness of 1) monodisciplinary chiropractic care and 2) multidisciplinary integrative care in 200 adults with non-acute LBP, in both the short-term (after 12 weeks) and long-term (after 52 weeks). The primary outcome measure is patient-rated back pain. Secondary aims compare the treatment approaches in terms of frequency of symptoms, low back disability, fear avoidance, self-efficacy, general health status, improvement, satisfaction, work loss, medication use, lumbar dynamic motion, and torso muscle endurance. Patients' and providers' perceptions of treatment will be described using qualitative methods, and cost-effectiveness and cost utility will be assessed. Methods and Design This paper describes the design of a randomized clinical trial (RCT), with cost-effectiveness and qualitative studies conducted alongside the RCT. Two hundred participants ages 18 and older are being recruited and randomized to one of two 12-week treatment interventions. Patient-rated outcome measures are collected via self-report questionnaires at baseline, and at 4, 12, 26, and 52 weeks post-randomization. Objective outcome measures are assessed at baseline and 12 weeks by examiners blinded to treatment assignment. Health care cost data is collected by self-report questionnaires and treatment records during the intervention phase and by monthly phone interviews thereafter. Qualitative interviews, using a semi

  12. Randomization Strategies.

    PubMed

    Kepler, Christopher K

    2017-04-01

    An understanding of randomization is important both for study design and to assist medical professionals in evaluating the medical literature. Simple randomization can be done through a variety of techniques, but carries a risk of unequal distribution of subjects into treatment groups. Block randomization can be used to overcome this limitation by ensuring that small subgroups are distributed evenly between treatment groups. Finally, techniques can be used to distribute subjects evenly between treatment groups while accounting for confounding variables, so as not to skew results when there is a high index of suspicion that a particular variable will influence outcome.
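
    A minimal sketch of the permuted-block randomization described above (block size, arm labels, and seed are illustrative choices):

```python
import random

def block_randomize(n_subjects, block_size=4, arms=("A", "B"), seed=42):
    """Permuted-block randomization: within every block of `block_size`
    subjects, each arm appears equally often, so group sizes stay
    balanced as enrollment proceeds."""
    assert block_size % len(arms) == 0, "block size must be a multiple of the number of arms"
    rng = random.Random(seed)
    assignments = []
    while len(assignments) < n_subjects:
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)      # randomize order within the block
        assignments.extend(block)
    return assignments[:n_subjects]

print(block_randomize(8))
```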

  13. RWCFusion: identifying phenotype-specific cancer driver gene fusions based on fusion pair random walk scoring method

    PubMed Central

    Zhao, Jianmei; Li, Xuecang; Yao, Qianlan; Li, Meng; Zhang, Jian; Ai, Bo; Liu, Wei; Wang, Qiuyu; Feng, Chenchen; Liu, Yuejuan; Bai, Xuefeng; Song, Chao; Li, Shang; Li, Enmin; Xu, Liyan; Li, Chunquan

    2016-01-01

    While gene fusions are increasingly detected by next-generation sequencing (NGS)-based methods in human cancers, these methods have limitations in identifying driver fusions. In addition, existing methods for identifying driver gene fusions either ignore the specificity among different cancers or consider only local rather than global topological features of networks. Here, we propose a novel network-based method, called RWCFusion, to identify phenotype-specific cancer driver gene fusions. To evaluate its performance, we used leave-one-out cross-validation in 35 cancers and achieved a high AUC of 0.925 overall and an average of 0.929 per individual cancer. Furthermore, we classified the 35 cancers into two classes, haematological and solid, of which the haematological class achieved a notably high AUC of 0.968. Finally, we applied RWCFusion to breast cancer and found that the top 13 gene fusions, such as BCAS3-BCAS4, NOTCH-NUP214, MED13-BCAS3 and CARM-SMARCA4, had previously been shown to be drivers of breast cancer. Additionally, 8 of the top 10 remaining candidate gene fusions, such as SULF2-ZNF217, MED1-ACSF2, and ACACA-STAC2, were inferred to be potential driver gene fusions of breast cancer. PMID:27506935
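
    Random-walk scoring on a network is commonly built on random walk with restart, iterating p ← (1 − r)·W·p + r·p0 on a column-normalized adjacency matrix; the sketch below (hypothetical 3-node graph, illustrative restart probability) shows that propagation step, not RWCFusion itself.

```python
def random_walk_with_restart(W, seed_idx, r=0.3, iters=200):
    """Iterate p <- (1 - r) * W.p + r * p0, where W is a column-normalized
    adjacency matrix and p0 puts all mass on the seed node. The final p
    scores every node's proximity to the seed."""
    n = len(W)
    p0 = [1.0 if i == seed_idx else 0.0 for i in range(n)]
    p = p0[:]
    for _ in range(iters):
        p = [(1 - r) * sum(W[i][j] * p[j] for j in range(n)) + r * p0[i]
             for i in range(n)]
    return p

# Path graph 0 - 1 - 2, with each column normalized to sum to 1:
W = [[0.0, 0.5, 0.0],
     [1.0, 0.0, 1.0],
     [0.0, 0.5, 0.0]]
print(random_walk_with_restart(W, seed_idx=0))
```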

  14. RWCFusion: identifying phenotype-specific cancer driver gene fusions based on fusion pair random walk scoring method.

    PubMed

    Zhao, Jianmei; Li, Xuecang; Yao, Qianlan; Li, Meng; Zhang, Jian; Ai, Bo; Liu, Wei; Wang, Qiuyu; Feng, Chenchen; Liu, Yuejuan; Bai, Xuefeng; Song, Chao; Li, Shang; Li, Enmin; Xu, Liyan; Li, Chunquan

    2016-09-20

    While gene fusions are increasingly detected by next-generation sequencing (NGS)-based methods in human cancers, these methods have limitations in identifying driver fusions. In addition, existing methods for identifying driver gene fusions either ignore the specificity among different cancers or consider only local rather than global topological features of networks. Here, we propose a novel network-based method, called RWCFusion, to identify phenotype-specific cancer driver gene fusions. To evaluate its performance, we used leave-one-out cross-validation in 35 cancers and achieved a high AUC of 0.925 overall and an average of 0.929 per individual cancer. Furthermore, we classified the 35 cancers into two classes, haematological and solid, of which the haematological class achieved a notably high AUC of 0.968. Finally, we applied RWCFusion to breast cancer and found that the top 13 gene fusions, such as BCAS3-BCAS4, NOTCH-NUP214, MED13-BCAS3 and CARM-SMARCA4, had previously been shown to be drivers of breast cancer. Additionally, 8 of the top 10 remaining candidate gene fusions, such as SULF2-ZNF217, MED1-ACSF2, and ACACA-STAC2, were inferred to be potential driver gene fusions of breast cancer.

  15. Cost effective analysis of recall methods for cervical cancer screening in Selangor--results from a prospective randomized controlled trial.

    PubMed

    Rashid, Rima Marhayu Abdul; Ramli, Sophia; John, Jennifer; Dahlui, Maznah

    2014-01-01

    Cervical cancer screening in Malaysia is by opportunistic Pap smear, which contributes to the low uptake rate. To overcome this, a pilot project called the SIPPS program (translated as "information system of Pap smear program") was introduced, whereby women aged 20-65 years are invited for a Pap smear and recalled to repeat the test. This study aimed to determine which recall method is most cost-effective in getting women to repeat the Pap smear. A randomised controlled trial was conducted in which one thousand women were recalled for a repeat smear by registered letter, phone message (SMS), phone call, or the usual postal letter. The total cost applied in the cost-effectiveness analysis included the cost of sending the first invitation letter, the cost of the recall method, and the cost of two Pap smears. A cost-effectiveness analysis (CEA) of Pap smear uptake by each recall method was then performed. The uptake of Pap smear by postal letter, registered letter, SMS and phone call was 18.8%, 20.0%, 21.6% and 34.4%, respectively (p<0.05). The cost-effectiveness ratio (CER) was lowest for the phone call: RM 69.18 (SD RM 0.14), compared to RM 106.53 (SD RM 0.13), RM 134.02 (SD RM 0.15) and RM 136.38 (SD RM 0.11) for SMS, registered letter and postal letter, respectively. The incremental cost-effectiveness ratio (ICER) showed that it is most cost saving to change the usual recall method from postal letter to phone call. A recall letter is more likely than an SMS or a phone call to reach the woman, but getting women to actually repeat the Pap smear works better with a phone call, which allows direct communication. Despite its high cost, the phone call is therefore the most cost-effective recall method for repeat Pap smears.
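
    The two summary measures used in such an analysis can be stated directly; the numbers below are hypothetical placeholders, not the study's RM figures.

```python
def cer(cost, effect):
    """Cost-effectiveness ratio: average cost per unit of effect
    (e.g. per repeat smear achieved)."""
    return cost / effect

def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit
    of effect when switching from the old to the new recall method."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical totals per 100 recalled women (illustrative only):
print(cer(3400, 34))             # phone call: cost per repeat smear
print(icer(3400, 34, 2600, 19))  # phone call vs. postal letter
```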

  16. Children's behavioral pain reactions during local anesthetic injection using cotton-roll vibration method compared with routine topical anesthesia: A randomized controlled trial

    PubMed Central

    Bagherian, Ali; Sheikhfathollahi, Mahmood

    2016-01-01

    Background: Topical anesthesia has been widely advocated as an important component of atraumatic administration of intraoral local anesthesia. The aim of this study was to directly observe children's behavioral pain reactions during local anesthetic injection with the cotton-roll vibration method compared with routine topical anesthesia. Materials and Methods: Forty-eight children participated in this randomized controlled clinical trial. They received two separate inferior alveolar nerve block or primary maxillary molar infiltration injections on contralateral sides of the jaws by both the cotton-roll vibration method (a combination of topical anesthesia gel, cotton roll, and vibration for physical distraction) and the control method (routine topical anesthesia). Behavioral pain reactions were measured according to the author-developed face, head, foot, hand, trunk, and cry (FHFHTC) scale, giving total scores between 0 and 18. Results: The total scores on the FHFHTC scale ranged between 0-5 and 0-10 in the cotton-roll vibration and control methods, respectively. The mean ± standard deviation of total scores was lower with the cotton-roll vibration method (1.21 ± 1.38) than with the control method (2.44 ± 2.18), and this difference was statistically significant (P < 0.001). Conclusion: The cotton-roll vibration method may be more helpful than routine topical anesthesia in reducing behavioral pain reactions in children during local anesthesia administration. PMID:27274349

  17. 9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT... and Adequate Veterinary Care § 2.40 Attending veterinarian and adequate veterinary care (dealers and... veterinary care to its animals in compliance with this section. (1) Each dealer and exhibitor shall employ...

  18. 9 CFR 2.33 - Attending veterinarian and adequate veterinary care.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE... adequate veterinary care. (a) Each research facility shall have an attending veterinarian who shall provide adequate veterinary care to its animals in compliance with this section: (1) Each research facility...

  19. 40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 24 2014-07-01 2014-07-01 false Exemptions for pesticides adequately... PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Exemptions § 152.20 Exemptions for pesticides adequately regulated by another Federal agency. The...

  20. 40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 25 2013-07-01 2013-07-01 false Exemptions for pesticides adequately... PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Exemptions § 152.20 Exemptions for pesticides adequately regulated by another Federal agency. The...

  1. 40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 24 2011-07-01 2011-07-01 false Exemptions for pesticides adequately... PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Exemptions § 152.20 Exemptions for pesticides adequately regulated by another Federal agency. The...

  2. 40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 25 2012-07-01 2012-07-01 false Exemptions for pesticides adequately... PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Exemptions § 152.20 Exemptions for pesticides adequately regulated by another Federal agency. The...

  3. 40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Exemptions for pesticides adequately... PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Exemptions § 152.20 Exemptions for pesticides adequately regulated by another Federal agency. The...

  4. 42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 3 2013-10-01 2013-10-01 false Adequate financial records, statistical data, and....568 Adequate financial records, statistical data, and cost finding. (a) Maintenance of records. (1) An HMO or CMP must maintain sufficient financial records and statistical data for proper determination...

  5. 42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 3 2014-10-01 2014-10-01 false Adequate financial records, statistical data, and....568 Adequate financial records, statistical data, and cost finding. (a) Maintenance of records. (1) An HMO or CMP must maintain sufficient financial records and statistical data for proper determination...

  6. 42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 3 2012-10-01 2012-10-01 false Adequate financial records, statistical data, and....568 Adequate financial records, statistical data, and cost finding. (a) Maintenance of records. (1) An HMO or CMP must maintain sufficient financial records and statistical data for proper determination...

  7. 42 CFR 438.207 - Assurances of adequate capacity and services.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 4 2012-10-01 2012-10-01 false Assurances of adequate capacity and services. 438... Improvement Access Standards § 438.207 Assurances of adequate capacity and services. (a) Basic rule. The State... provides supporting documentation that demonstrates that it has the capacity to serve the...

  8. 42 CFR 438.207 - Assurances of adequate capacity and services.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 4 2014-10-01 2014-10-01 false Assurances of adequate capacity and services. 438... Improvement Access Standards § 438.207 Assurances of adequate capacity and services. (a) Basic rule. The State... provides supporting documentation that demonstrates that it has the capacity to serve the...

  9. 42 CFR 438.207 - Assurances of adequate capacity and services.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 4 2011-10-01 2011-10-01 false Assurances of adequate capacity and services. 438... Improvement Access Standards § 438.207 Assurances of adequate capacity and services. (a) Basic rule. The State... provides supporting documentation that demonstrates that it has the capacity to serve the...

  10. 42 CFR 438.207 - Assurances of adequate capacity and services.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 4 2013-10-01 2013-10-01 false Assurances of adequate capacity and services. 438... Improvement Access Standards § 438.207 Assurances of adequate capacity and services. (a) Basic rule. The State... provides supporting documentation that demonstrates that it has the capacity to serve the...

  11. 9 CFR 2.33 - Attending veterinarian and adequate veterinary care.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Research Facilities § 2.33 Attending veterinarian and adequate veterinary care. (a) Each research facility shall have an attending veterinarian who shall provide adequate veterinary care to its animals in compliance with this section: (1) Each research facility...

  12. 9 CFR 2.33 - Attending veterinarian and adequate veterinary care.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Research Facilities § 2.33 Attending veterinarian and adequate veterinary care. (a) Each research facility shall have an attending veterinarian who shall provide adequate veterinary care to its animals in compliance with this section: (1) Each research facility...

  13. Stochastic modeling of short-term exposure close to an air pollution source in a naturally ventilated room: an autocorrelated random walk method.

    PubMed

    Cheng, Kai-Chung; Acevedo-Bolton, Viviana; Jiang, Ruo-Ting; Klepeis, Neil E; Ott, Wayne R; Kitanidis, Peter K; Hildemann, Lynn M

    2014-01-01

    For an actively emitting source such as cooking or smoking, indoor measurements have shown a strong "proximity effect" within 1 m. The significant increase in both the magnitude and variation of concentration near a source is attributable to transient high peaks that occur sporadically, and these "microplumes" cause great uncertainty in estimating personal exposure. Recent field studies in naturally ventilated rooms show that close-proximity concentrations are approximately lognormally distributed. We use the autocorrelated random walk method to represent the time-varying directionality of indoor emissions, thereby predicting the time series and frequency distributions of concentrations close to an actively emitting point source. The predicted 5-min concentrations show good agreement with measurements from a point source of CO in a naturally ventilated house: the measured and predicted frequency distributions at 0.5- and 1-m distances are similar and approximately lognormal over a concentration range spanning three orders of magnitude. By including the transient peak concentrations, this random airflow modeling method offers a way to more accurately assess acute exposure levels for cases where well-defined airflow patterns in an indoor space are not available.
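
    The core of an autocorrelated random walk for a direction variable can be sketched as a first-order autoregressive (AR(1)) process; parameter names and values below are illustrative assumptions, not the paper's model.

```python
import random

def simulate_direction_angles(n_steps, phi=0.9, sigma=0.5, seed=1):
    """Minimal sketch of an autocorrelated (AR(1)) random walk for a
    time-varying emission direction: theta_t = phi * theta_{t-1} + noise.
    Successive directions are correlated rather than independent, which
    is what produces sporadic, persistent 'microplume' excursions."""
    rng = random.Random(seed)
    theta = 0.0
    angles = []
    for _ in range(n_steps):
        theta = phi * theta + rng.gauss(0.0, sigma)
        angles.append(theta)
    return angles

print(simulate_direction_angles(5)[:5])
```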

  14. Autonomous Byte Stream Randomizer

    NASA Technical Reports Server (NTRS)

    Paloulian, George K.; Woo, Simon S.; Chow, Edward T.

    2013-01-01

    Net-centric networking environments are often faced with limited resources and must utilize bandwidth as efficiently as possible. In networking environments that span wide areas, data transmission has to be efficient, without redundant or exuberant metadata. The Autonomous Byte Stream Randomizer software provides an extra level of security on top of existing data encryption methods. Randomizing the data's byte stream adds an extra layer to existing data protection methods, thus making it harder for an attacker to decrypt protected data. Based on a generated cryptographically secure random seed, a random sequence of numbers is used to intelligently and efficiently swap the organization of bytes in data using the unbiased and memory-efficient in-place Fisher-Yates shuffle method. Swapping bytes and reorganizing the crucial structure of the byte data renders the data file unreadable and leaves the data in a deconstructed state. This deconstruction adds an extra level of security, requiring the byte stream to be reconstructed with the random seed in order to be readable. Once the data byte stream has been randomized, the software enables the data to be distributed to N nodes in an environment. Each piece of the data in randomized and distributed form is a separate entity, unreadable in its own right, but when all N pieces are combined, the data can be reconstructed. Reconstruction requires possession of the key used for randomizing the bytes, which leads to the generation of the same cryptographically secure random sequence of numbers used to randomize the data. A cornerstone capability of this software is its ability to generate the same cryptographically secure sequence on different machines and at different times, allowing it to be used in net-centric environments where data transfer bandwidth is limited.
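
    A minimal sketch of the seeded Fisher-Yates idea (using Python's `random` for brevity; it is NOT cryptographically secure, whereas the described software derives its swap sequence from a cryptographically secure seed):

```python
import random

def shuffle_bytes(data: bytes, seed: int) -> bytes:
    """In-place Fisher-Yates shuffle of a byte stream, driven by a seed.
    NOTE: random.Random is a demo PRNG only, not a CSPRNG."""
    buf = bytearray(data)
    rng = random.Random(seed)
    for i in range(len(buf) - 1, 0, -1):
        j = rng.randrange(i + 1)
        buf[i], buf[j] = buf[j], buf[i]
    return bytes(buf)

def unshuffle_bytes(data: bytes, seed: int) -> bytes:
    """Regenerate the same swap sequence from the seed and replay it in
    reverse to reconstruct the original byte stream."""
    rng = random.Random(seed)
    swaps = [(i, rng.randrange(i + 1)) for i in range(len(data) - 1, 0, -1)]
    buf = bytearray(data)
    for i, j in reversed(swaps):
        buf[i], buf[j] = buf[j], buf[i]
    return bytes(buf)
```

Because every swap is its own inverse, replaying the seed-derived swap sequence backwards restores the original stream, which is why possession of the seed is required for reconstruction.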

  15. Subspace inverse power method and polynomial chaos representation for the modal frequency responses of random mechanical systems

    NASA Astrophysics Data System (ADS)

    Pagnacco, E.; de Cursi, E. Souza; Sampaio, R.

    2016-07-01

    This study concerns the computation of frequency responses of linear stochastic mechanical systems through a modal analysis. A new strategy, based on transposing standard deterministic deflated and subspace inverse power methods into a stochastic framework, is introduced via polynomial chaos representation. The applicability and effectiveness of the proposed schemes are demonstrated through three simple application examples and one realistic application example. It is shown that null and repeated-eigenvalue situations are addressed successfully.
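
    The deterministic inverse power method that the approach transposes can be sketched for a toy 2x2 matrix (the paper's stochastic version additionally expands the iterates on a polynomial chaos basis):

```python
def solve2(A, b):
    """Direct 2x2 linear solve via Cramer's rule."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - b[1] * A[0][1]) / det,
            (A[0][0] * b[1] - A[1][0] * b[0]) / det]

def inverse_power(A, iters=100):
    """Inverse power method: iterating x <- A^{-1} x (with normalization)
    converges to the eigenvector of the smallest-magnitude eigenvalue;
    the eigenvalue is then recovered from a Rayleigh quotient."""
    x = [1.0, 1.0]
    for _ in range(iters):
        y = solve2(A, x)
        norm = max(abs(v) for v in y)
        x = [v / norm for v in y]
    Ax = [A[0][0] * x[0] + A[0][1] * x[1],
          A[1][0] * x[0] + A[1][1] * x[1]]
    lam = (Ax[0] * x[0] + Ax[1] * x[1]) / (x[0] * x[0] + x[1] * x[1])
    return lam, x

print(inverse_power([[2.0, 0.0], [0.0, 5.0]]))  # smallest eigenvalue is 2
```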

  16. Random thoughts

    NASA Astrophysics Data System (ADS)

    ajansen; kwhitefoot; panteltje1; edprochak; sudhakar, the

    2014-07-01

    In reply to the physicsworld.com news story “How to make a quantum random-number generator from a mobile phone” (16 May, http://ow.ly/xFiYc, see also p5), which describes a way of delivering random numbers by counting the number of photons that impinge on each of the individual pixels in the camera of a Nokia N9 smartphone.

  17. The need of adequate information to achieve total compliance of mass drug administration in Pekalongan

    NASA Astrophysics Data System (ADS)

    Ginandjar, Praba; Saraswati, Lintang Dian; Taufik, Opik; Nurjazuli; Widjanarko, Bagoes

    2017-02-01

    The World Health Organization (WHO) initiated the Global Program to Eliminate Lymphatic Filariasis (LF) through mass drug administration (MDA). Pekalongan started MDA in 2011, yet the LF prevalence in 2015 still exceeded the threshold (1%). This study aimed to describe the inhibiting factors related to compliance with MDA at the community level. This was a rapid survey with a cross-sectional approach. Two-stage random sampling was used: in the first stage, 25 clusters were randomly selected from 27 villages with probability proportionate to population size (PPS) methods (C-Survey); in the second stage, 10 subjects were randomly selected from each cluster, giving 250 respondents from the 25 selected clusters. Variables consisted of MDA coverage, practice of taking medication during MDA, and enabling and inhibiting factors for MDA at the community level. The results showed that most respondents had poor knowledge of filariasis, which influenced awareness of the disease. Health-illness perception, not receiving the drugs, lactation, side effects, and the size of the drugs were the dominant factors in non-compliance with MDA. MDA information and community empowerment are needed to improve MDA coverage. Further study to explore an appropriate model of socialization would support the success of the MDA program.
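
    The first sampling stage (clusters drawn with probability proportional to size) can be sketched as follows; village sizes and the seed are illustrative, and real survey PPS is usually systematic and without replacement, unlike this simple with-replacement sketch.

```python
import random

def pps_sample(cluster_sizes, n_clusters, seed=0):
    """Draw cluster indices with probability proportional to size (PPS),
    with replacement -- a simplified sketch of a two-stage survey's
    first stage."""
    rng = random.Random(seed)
    return rng.choices(range(len(cluster_sizes)), weights=cluster_sizes,
                       k=n_clusters)

# 5 hypothetical village populations; pick 3 clusters:
print(pps_sample([1200, 800, 400, 2000, 600], 3))
```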

  18. A facile method to enhance out-coupling efficiency in organic light-emitting diodes via a random-pyramids textured layer

    NASA Astrophysics Data System (ADS)

    Zhu, Wenqing; Xiao, Teng; Zhai, Guangsheng; Yu, Jingting; Shi, Guanjie; Chen, Guo; Wei, Bin

    2016-09-01

    We demonstrate a facile method to enhance light extraction in organic light-emitting diodes using a polymer layer with a texture consisting of random upright pyramids. The simple fabrication technique for the textured layer is based on silicon alkali-etching and imprint lithography. With the textured layer applied to the external face of the glass substrate, the organic light-emitting diode achieved a 26% enhancement of current efficiency and a 30% enhancement of power efficiency without spectral distortion over wide viewing angles. A ray-tracing optical simulation reveals that the textured layer alters the traveling path of light and helps out-couple a large portion of the light delivered into the substrate. Given its simple fabrication process and effective light extraction, the proposed method is a promising approach to organic light-emitting diodes with enhanced efficiency.

  19. 45 CFR 1159.15 - Who has the responsibility for maintaining adequate technical, physical, and security safeguards...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... adequate technical, physical, and security safeguards to prevent unauthorized disclosure or destruction of... adequate technical, physical, and security safeguards to prevent unauthorized disclosure or destruction of... of maintaining adequate technical, physical, and security safeguards to prevent...

  20. 45 CFR 1159.15 - Who has the responsibility for maintaining adequate technical, physical, and security safeguards...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... adequate technical, physical, and security safeguards to prevent unauthorized disclosure or destruction of... adequate technical, physical, and security safeguards to prevent unauthorized disclosure or destruction of... of maintaining adequate technical, physical, and security safeguards to prevent...

  1. Mimicking the quasi-random assembly of protein fibers in the dermis by freeze-drying method.

    PubMed

    Ghaleh, Hakimeh; Abbasi, Farhang; Alizadeh, Mina; Khoshfetrat, Ali Baradar

    2015-04-01

    Freeze-drying is extensively used for the fabrication of porous materials in tissue engineering and biomedical applications, owing to its versatility and its avoidance of toxic solvents. However, it has some significant drawbacks: the conventional freeze-drying technique produces heterogeneous porous structures with side-oriented columnar pores. Because the top and bottom surfaces of the sample are not in contact with similar environments, the different rates of heat transfer at the two surfaces and the temperature gradient across the sample establish a preferential direction of heat transfer. To achieve a scaffold with a desirable microstructure for skin tissue engineering, the freeze-drying method was modified by controlling the cooling rate and regulating heat transfer across the sample during the freezing step. The modified method could create a homogeneous porous structure with more equiaxed, non-oriented pores. Freezing the polymeric solution in an aluminum mold enhanced pore interconnectivity relative to a polystyrene mold. The influence of the recrystallization process on the mean pore size of the scaffold was discussed for varying final freezing temperatures: a higher final freezing temperature more readily provides the energy required for recrystallization, leading to enlarged ice crystals and, consequently, larger pores.

  2. On Convergent Probability of a Random Walk

    ERIC Educational Resources Information Center

    Lee, Y.-F.; Ching, W.-K.

    2006-01-01

    This note introduces an interesting random walk on a straight path with cards of random numbers. The method of recurrence relations is used to obtain the convergent probability of the random walk for different initial positions.
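
    The recurrence-relation approach can be illustrated with the classic absorbing walk on {0, ..., n} (a standard result, not the note's specific card setup): solving u_k = p·u_{k+1} + (1 − p)·u_{k−1} with boundary conditions u_0 = 1 and u_n = 0 gives the closed form below.

```python
def absorption_prob(start, n, p=0.5):
    """Probability that a walk on {0, ..., n} starting at `start` hits 0
    before n, where p is the probability of a +1 step. Derived from the
    recurrence u_k = p*u_{k+1} + (1-p)*u_{k-1}, u_0 = 1, u_n = 0
    (the standard gambler's-ruin argument)."""
    if p == 0.5:
        return 1.0 - start / n          # symmetric walk: linear in start
    r = (1.0 - p) / p
    return (r ** start - r ** n) / (1.0 - r ** n)

print(absorption_prob(3, 10))           # symmetric walk from position 3
```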

  3. The effectiveness of the McKenzie method in addition to first-line care for acute low back pain: a randomized controlled trial

    PubMed Central

    2010-01-01

    Background Low back pain is a highly prevalent and disabling condition worldwide. Clinical guidelines for the management of patients with acute low back pain recommend first-line treatment consisting of advice, reassurance and simple analgesics. Exercise is also commonly prescribed to these patients. The primary aim of this study was to evaluate the short-term effect of adding the McKenzie method to the first-line care of patients with acute low back pain. Methods A multi-centre randomized controlled trial with a 3-month follow-up was conducted between September 2005 and June 2008. Patients seeking care for acute non-specific low back pain from primary care medical practices were screened. Eligible participants were assigned to receive a treatment programme based on the McKenzie method and first-line care (advice, reassurance and time-contingent acetaminophen) or first-line care alone, for 3 weeks. Primary outcome measures included pain (0-10 Numeric Rating Scale) over the first seven days, pain at 1 week, pain at 3 weeks and global perceived effect (-5 to 5 scale) at 3 weeks. Treatment effects were estimated using linear mixed models. Results One hundred and forty-eight participants were randomized into study groups, of whom 138 (93%) completed the last follow-up. The addition of the McKenzie method to first-line care produced statistically significant but small reductions in pain when compared to first-line care alone: mean of -0.4 points (95% confidence interval, -0.8 to -0.1) at 1 week, -0.7 points (95% confidence interval, -1.2 to -0.1) at 3 weeks, and -0.3 points (95% confidence interval, -0.5 to -0.0) over the first 7 days. Patients receiving the McKenzie method did not show additional effects on global perceived effect, disability, function or on the risk of persistent symptoms. These patients sought less additional health care than those receiving only first-line care (P = 0.002). 
Conclusions When added to the currently recommended first-line care of acute
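
    The treatment effects above were estimated with linear mixed models. As a hypothetical illustration (simulated data, not the trial's), for a balanced design with a between-subject treatment and a random intercept per participant, the mixed model's fixed-effect estimate reduces to a difference in group means of per-subject means:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 148   # participants, as in the trial

# Simulated data: 3 repeated pain ratings (0-10 scale) per participant with a
# subject-level random intercept; group 1 gets the add-on treatment with a
# true effect of -0.4 points (the size reported at 1 week).
group = rng.integers(0, 2, size=n)                  # 0 = control, 1 = McKenzie
subj = rng.normal(0.0, 1.0, size=n)                 # random intercepts
pain = (5.0 - 0.4 * group + subj)[:, None] + rng.normal(0.0, 1.0, size=(n, 3))

# For this balanced design, the linear mixed model's fixed-effect estimate of
# treatment reduces to the difference in group means of the per-subject means
# (the random intercepts average out of the contrast).
subject_means = pain.mean(axis=1)
effect = subject_means[group == 1].mean() - subject_means[group == 0].mean()
print(effect)   # estimate of the -0.4 treatment effect (plus sampling noise)
```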

  4. Knowledge and Informed Decision-Making about Population-Based Colorectal Cancer Screening Participation in Groups with Low and Adequate Health Literacy

    PubMed Central

    Essink-Bot, M. L.; Dekker, E.; Timmermans, D. R. M.; Uiters, E.; Fransen, M. P.

    2016-01-01

    Objective. To analyze and compare decision-relevant knowledge, decisional conflict, and informed decision-making about colorectal cancer (CRC) screening participation between potential screening participants with low and adequate health literacy (HL), defined as the skills to access, understand, and apply information to make informed decisions about health. Methods. Survey including 71 individuals with low HL and 70 with adequate HL, all eligible for the Dutch organized CRC screening program. Knowledge, attitude, intention to participate, and decisional conflict were assessed after reading the standard information materials. HL was assessed using the Short Assessment of Health Literacy in Dutch. Informed decision-making was analyzed by the multidimensional measure of informed choice. Results. 64% of the study population had adequate knowledge of CRC and CRC screening (low HL 43/71 (61%), adequate HL 47/70 (67%), p > 0.05). 57% were informed decision-makers (low HL 34/71 (55%), adequate HL 39/70 (58%), p > 0.05). Intention to participate was 89% (low HL 63/71 (89%), adequate HL 63/70 (90%)). Respondents with low HL experienced significantly more decisional conflict (25.8 versus 16.1; p = 0.00). Conclusion. Informed decision-making about CRC screening participation was suboptimal among both individuals with low HL and individuals with adequate HL. Further research is required to develop and implement effective strategies to convey decision-relevant knowledge about CRC screening to all screening invitees. PMID:27200089
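
    The multidimensional measure of informed choice used above is commonly operationalized as: a decision is "informed" when knowledge is adequate and the attitude is consistent with the uptake decision. A minimal sketch of that classification rule (function and argument names are ours):

```python
# One common operationalization of the multidimensional measure of informed
# choice: adequate knowledge plus an attitude consistent with the decision.
def informed_choice(adequate_knowledge: bool,
                    positive_attitude: bool,
                    intends_to_participate: bool) -> bool:
    attitude_consistent = positive_attitude == intends_to_participate
    return adequate_knowledge and attitude_consistent

print(informed_choice(True, True, True))    # knowledgeable, consistent -> True
print(informed_choice(True, False, True))   # participates despite negative attitude -> False
print(informed_choice(False, True, True))   # inadequate knowledge -> False
```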

  5. Mindfulness-Based Stress Reduction for Overweight/Obese Women With and Without Polycystic Ovary Syndrome: Design and Methods of a Pilot Randomized Controlled Trial

    PubMed Central

    Raja-Khan, Nazia; Agito, Katrina; Shah, Julie; Stetter, Christy M.; Gustafson, Theresa S.; Socolow, Holly; Kunselman, Allen R.; Reibel, Diane K.; Legro, Richard S.

    2015-01-01

    Mindfulness-based stress reduction (MBSR) may be beneficial for overweight/obese women, including women with polycystic ovary syndrome (PCOS), as it has been shown to reduce psychological distress and improve quality of life in other patient populations. Preliminary studies suggest that MBSR may also have salutary effects on blood pressure and blood glucose. This paper describes the design and methods of an ongoing pilot randomized controlled trial evaluating the feasibility and effects of MBSR in PCOS and non-PCOS women who are overweight or obese. Eighty six (86) women with body mass index ≥25 kg/m2, including 31 women with PCOS, have been randomized to 8 weeks of MBSR or health education control, and followed for 16 weeks. The primary outcome is mindfulness assessed with the Toronto Mindfulness Scale. Secondary outcomes include measures of blood pressure, blood glucose, quality of life, anxiety and depression. Our overall hypothesis is that MBSR will increase mindfulness and ultimately lead to favorable changes in blood pressure, blood glucose, psychological distress and quality of life in PCOS and non-PCOS women. This would support the integration of MBSR with conventional medical treatments to reduce psychological distress, cardiovascular disease and diabetes in PCOS and non-PCOS women who are overweight or obese. PMID:25662105

  6. The Gamma-Poisson model as a statistical method to determine if micro-organisms are randomly distributed in a food matrix.

    PubMed

    Toft, Nils; Innocent, Giles T; Mellor, Dominic J; Reid, Stuart W J

    2006-02-01

    The Gamma-Poisson model, i.e., a Poisson distribution whose parameter lambda is itself Gamma distributed, has been suggested as a statistical method for determining whether or not micro-organisms are randomly distributed in a food matrix. In this study, we explore properties of the Gamma-Poisson model left unexplored by the previous study. Our analysis concludes that the Gamma-Poisson model distinguishes poorly between variation at the Poisson level and at the Gamma level. Estimated parameter values from simulated data sets showed large variation around the true values, even for moderate sample sizes (n=100). Furthermore, at these sample sizes the likelihood ratio is not a good test statistic for discriminating between the Gamma-Poisson distribution and the Poisson distribution. Hence, the Gamma-Poisson distribution is not a good choice for determining whether data are randomly, i.e., Poisson, distributed. However, the ratio between variation at the Poisson level and at the Gamma level does provide a measure of the amount of overdispersion.
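
    The Gamma-Poisson hierarchy described above can be simulated directly; with illustrative parameters (our choice), the Gamma mixing makes counts visibly overdispersed relative to a pure Poisson:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100   # the "moderate sample size" discussed above

# Gamma-Poisson: each sample unit gets its own Poisson rate lambda drawn from
# a Gamma distribution, so observed counts are overdispersed.
shape, scale = 2.0, 3.0                  # illustrative Gamma parameters
lam = rng.gamma(shape, scale, size=n)    # unit-level rates (Gamma level)
counts = rng.poisson(lam)                # observed counts (Poisson level)

mean = counts.mean()
dispersion = counts.var(ddof=1) / mean
# A pure Poisson sample gives a dispersion index near 1; here the Gamma
# mixing inflates it toward 1 + scale = 4.
print(mean, dispersion)
```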

  7. A cluster-randomized, placebo-controlled, maternal vitamin A or beta-carotene supplementation trial in Bangladesh: design and methods

    PubMed Central

    2011-01-01

    Background We present the design, methods and population characteristics of a large community trial that assessed the efficacy of a weekly supplement containing vitamin A or beta-carotene, at recommended dietary levels, in reducing maternal mortality from early gestation through 12 weeks postpartum. We identify challenges faced and report solutions in implementing an intervention trial under low-resource, rural conditions, including the importance of population choice in promoting generalizability, maintaining rigorous data quality control to reduce inter- and intra- worker variation, and optimizing efficiencies in information and resources flow from and to the field. Methods This trial was a double-masked, cluster-randomized, dual intervention, placebo-controlled trial in a contiguous rural area of ~435 sq km with a population of ~650,000 in Gaibandha and Rangpur Districts of Northwestern Bangladesh. Approximately 120,000 married women of reproductive age underwent 5-weekly home surveillance, of whom ~60,000 were detected as pregnant, enrolled into the trial and gave birth to ~44,000 live-born infants. Upon enrollment, at ~ 9 weeks' gestation, pregnant women received a weekly oral supplement containing vitamin A (7000 ug retinol equivalents (RE)), beta-carotene (42 mg, or ~7000 ug RE) or a placebo through 12 weeks postpartum, according to prior randomized allocation of their cluster of residence. Systems described include enlistment and 5-weekly home surveillance for pregnancy based on menstrual history and urine testing, weekly supervised supplementation, periodic risk factor interviews, maternal and infant vital outcome monitoring, birth defect surveillance and clinical/biochemical substudies. Results The primary outcome was pregnancy-related mortality assessed for 3 months following parturition. 
Secondary outcomes included fetal loss due to miscarriage or stillbirth, infant mortality under three months of age, maternal obstetric and infectious morbidity, infant

  8. Wireless Network Security Using Randomness

    DTIC Science & Technology

    2012-06-19

    The present invention provides systems and methods for securing communications in a wireless network by utilizing the inherent randomness of propagation errors to enable legitimate users to dynamically... Subject terms: patent, security, wireless networks, randomness. Sheng Xiao, Weibo Gong

  9. Inferential Processing among Adequate and Struggling Adolescent Comprehenders and Relations to Reading Comprehension

    PubMed Central

    Barth, Amy E.; Barnes, Marcia; Francis, David J.; Vaughn, Sharon; York, Mary

    2015-01-01

    Separate mixed model analyses of variance (ANOVA) were conducted to examine the effect of textual distance on the accuracy and speed of text consistency judgments among adequate and struggling comprehenders across grades 6–12 (n = 1203). Multiple regressions examined whether accuracy in text consistency judgments uniquely accounted for variance in comprehension. Results suggest that there is considerable growth across the middle and high school years, particularly for adequate comprehenders in those text integration processes that maintain local coherence. Accuracy in text consistency judgments accounted for significant unique variance for passage-level, but not sentence-level comprehension, particularly for adequate comprehenders. PMID:26166946

  10. Using Multitheory Model of Health Behavior Change to Predict Adequate Sleep Behavior.

    PubMed

    Knowlden, Adam P; Sharma, Manoj; Nahar, Vinayak K

    The purpose of this article was to use the multitheory model of health behavior change in predicting adequate sleep behavior in college students. A valid and reliable survey was administered in a cross-sectional design (n = 151). For initiation of adequate sleep behavior, the construct of behavioral confidence (P < .001) was found to be significant and accounted for 24.4% of the variance. For sustenance of adequate sleep behavior, changes in social environment (P < .02), emotional transformation (P < .001), and practice for change (P < .001) were significant and accounted for 34.2% of the variance.

  11. Morphological characterization of bicontinuous structures in polymer blends and microemulsions by the inverse-clipping method in the context of the clipped-random-wave model.

    PubMed

    Jinnai, H; Nishikawa, Y; Chen, S H; Koizumi, S; Hashimoto, T

    2000-06-01

    A method is proposed to determine the spectral function of the clipped-random-wave (CRW) model directly from scattering data. The spectral function f(k) (k is a wave number) gives the distribution of the magnitude of wave vectors of the sinusoidal waves that describes the essential features of the two-phase morphology. The proposed method involves "inverse clipping" of a correlation function to obtain f(k) and does not require any a priori assumptions for f(k). A critical test of the applicability of the inverse-clipping method was carried out by using three-component bicontinuous microemulsions. The method was then used to determine f(k) of the bicontinuous structure of a phase-separating polymer blend. f(k) for the polymer blend turned out to be a multipeaked function, while f(k) for the microemulsions exhibits a single broad maximum representing periodicity of the morphology. These results indicate the presence of the long-range regularity in the morphology of the polymer blend. Three-dimensional (3D) morphology corresponding to the scattering data of the polymer blend was generated using the CRW model together with the multipeaked f(k). Interface curvatures of the 3D morphology calculated from f(k) were measured and compared with those experimentally determined directly from the laser scanning confocal microscopy in the same blend.
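
    The CRW construction described above can be sketched numerically (illustrative parameters, ours): superpose sinusoidal waves whose wave-vector magnitudes are drawn from a spectral function f(k), here a single broad peak as found for the microemulsions, then clip the field at zero to obtain the two-phase morphology:

```python
import numpy as np

rng = np.random.default_rng(1)

# Clipped-random-wave sketch: N sinusoidal waves with |k| drawn from a
# single-peaked f(k), random propagation directions and random phases.
N = 200
k_mag = rng.normal(loc=2 * np.pi, scale=0.8, size=N)    # f(k): one broad peak
theta = rng.uniform(0, 2 * np.pi, N)                    # random directions
phase = rng.uniform(0, 2 * np.pi, N)                    # random phases
kx, ky = k_mag * np.cos(theta), k_mag * np.sin(theta)

x = np.linspace(0, 10, 256)
X, Y = np.meshgrid(x, x)
psi = sum(np.cos(kx[i] * X + ky[i] * Y + phase[i]) for i in range(N))
psi /= np.sqrt(N)                                       # normalize the field

morphology = psi > 0      # clip at zero: True = phase A, False = phase B
print(morphology.mean())  # volume fraction, ~0.5 for symmetric clipping
```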

  12. Morphological characterization of bicontinuous structures in polymer blends and microemulsions by the inverse-clipping method in the context of the clipped-random-wave model

    NASA Astrophysics Data System (ADS)

    Jinnai, Hiroshi; Nishikawa, Yukihiro; Chen, Sow-Hsin; Koizumi, Satoshi; Hashimoto, Takeji

    2000-06-01

    A method is proposed to determine the spectral function of the clipped-random-wave (CRW) model directly from scattering data. The spectral function f(k) (k is a wave number) gives the distribution of the magnitude of wave vectors of the sinusoidal waves that describes the essential features of the two-phase morphology. The proposed method involves ``inverse clipping'' of a correlation function to obtain f(k) and does not require any a priori assumptions for f(k). A critical test of the applicability of the inverse-clipping method was carried out by using three-component bicontinuous microemulsions. The method was then used to determine f(k) of the bicontinuous structure of a phase-separating polymer blend. f(k) for the polymer blend turned out to be a multipeaked function, while f(k) for the microemulsions exhibits a single broad maximum representing periodicity of the morphology. These results indicate the presence of the long-range regularity in the morphology of the polymer blend. Three-dimensional (3D) morphology corresponding to the scattering data of the polymer blend was generated using the CRW model together with the multipeaked f(k). Interface curvatures of the 3D morphology calculated from f(k) were measured and compared with those experimentally determined directly from the laser scanning confocal microscopy in the same blend.

  13. Quantifying data retention of perpendicular spin-transfer-torque magnetic random access memory chips using an effective thermal stability factor method

    SciTech Connect

    Thomas, Luc; Jan, Guenole; Le, Son; Wang, Po-Kang

    2015-04-20

    The thermal stability of perpendicular Spin-Transfer-Torque Magnetic Random Access Memory (STT-MRAM) devices is investigated at chip level. Experimental data are analyzed in the framework of the Néel-Brown model, including distributions of the thermal stability factor Δ. We show that in the low-error-rate regime important for applications, the effect of distributions of Δ can be described by a single quantity, the effective thermal stability factor Δ_eff, which encompasses both the median and the standard deviation of the distributions. Data retention of memory chips can be assessed accurately by measuring Δ_eff as a function of device diameter and temperature. We apply this method to show that 54 nm devices based on our perpendicular STT-MRAM design meet our 10-year data retention target up to 120 °C.
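
    The Néel-Brown retention argument can be sketched numerically. In this minimal sketch the attempt frequency f0 = 1 GHz is an assumed textbook value (not taken from the abstract): a bit with thermal stability factor Δ loses its state within time t with probability 1 - exp(-t·f0·exp(-Δ)).

```python
import math

# Néel-Brown single-bit retention sketch (assumed f0 = 1 GHz).
# expm1 keeps precision when the failure probability is tiny.
def retention_failure_prob(delta: float, t_seconds: float, f0: float = 1e9) -> float:
    return -math.expm1(-t_seconds * f0 * math.exp(-delta))

ten_years = 10 * 365.25 * 24 * 3600
for delta in (40, 60, 80):
    # failure probability drops steeply as delta grows past ln(f0 * t) ~ 40
    print(delta, retention_failure_prob(delta, ten_years))
```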

  14. Region 8: Colorado Lamar and Steamboat Springs Adequate Letter (11/12/2002)

    EPA Pesticide Factsheets

    This letter from EPA to the Colorado Department of Public Health and Environment determined the Motor Vehicle Emissions Budgets in the Lamar and Steamboat Springs particulate matter (PM10) maintenance plans adequate for transportation conformity purposes.

  15. 75 FR 5893 - Suspension of Community Eligibility for Failure To Maintain Adequate Floodplain Management...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-05

    ... To Maintain Adequate Floodplain Management Regulations AGENCY: Federal Emergency Management Agency... floodplain management regulations meeting minimum requirements under the National Flood Insurance Program... they have brought their floodplain management regulations into compliance with the NFIP...

  16. Region 9: California Adequate / Inadequate Letter Attachment (5/30/2008)

    EPA Pesticide Factsheets

    This document states that certain 8-hour ozone and PM2.5 motor vehicle emissions budgets in the 2007 South Coast State Implementation Plan have been found adequate for transportation conformity purposes.

  17. Prevention of mother to child transmission lay counsellors: Are they adequately trained?

    PubMed

    Thurling, Catherine H; Harris, Candice

    2012-06-05

    South Africa's high prevalence of human immunodeficiency virus (HIV) infected women requires a comprehensive health care approach to pregnancy because of the added risk of their HIV status. As a result of the shortage of health care workers in South Africa, lay counsellors play important roles in the prevention of mother to child transmission of HIV (PMTCT). There is no standardization of training of lay counsellors in South Africa, and training varies in length depending on the training organisation. The study aimed to investigate the training of lay counsellors by analysing their training curricula and interviewing lay counsellors about their perceptions of their training. A two phase research method was applied. Phase one documented an analysis of the training curricula. Phase two was semi-structured interviews with the participants. Purposive sampling was undertaken for this study. The total sample size was 13 people, with a final sample of 9 participants, determined at the point of data saturation. The research was qualitative, descriptive and contextual in design. The curricula analysed had different styles of delivery, and the approaches to learning and courses varied, resulting in inconsistent training outcomes. A need for supervision and mentorship in the working environment was also noted. The training of lay counsellors needs to be adapted to meet the extended roles that they are playing in PMTCT. The standardization of training programmes, and the incorporation of a system of mentorship in the work environment, would ensure that the lay counsellors are adequately prepared for their role in PMTCT.

  18. Gaussian membership functions are most adequate in representing uncertainty in measurements

    NASA Technical Reports Server (NTRS)

    Kreinovich, V.; Quintana, C.; Reznik, L.

    1992-01-01

    In rare situations, like fundamental physics, we perform experiments without knowing what their results will be. In the majority of real-life measurement situations, we more or less know beforehand what kind of results we will get. Of course, this is not the precise knowledge of the type 'the result will be between alpha - beta and alpha + beta,' because in this case, we would not need any measurements at all. This is usually a knowledge that is best represented in uncertain terms, like 'perhaps (or 'most likely', etc.) the measured value x is between alpha - beta and alpha + beta.' Traditional statistical methods neglect this additional knowledge and process only the measurement results. So it is desirable to be able to process this uncertain knowledge as well. A natural way to process it is by using fuzzy logic. But there is a problem: we can use different membership functions to represent the same uncertain statements, and different functions lead to different results. What membership function do we choose? In the present paper, we show that under some reasonable assumptions, Gaussian functions mu(x) = exp(-beta x^2) are the most adequate choice of membership functions for representing uncertainty in measurements. This representation was efficiently used in testing jet engines for airplanes and spaceships.
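
    The Gaussian membership function above is straightforward to evaluate; a minimal sketch (the abstract's mu(x) = exp(-beta x^2) is the alpha = 0 case of the shifted form used here):

```python
import math

# Gaussian membership function mu(x) = exp(-beta * (x - alpha)**2), centered
# on the anticipated value alpha; beta sets how fast membership decays.
def gaussian_membership(x: float, alpha: float, beta: float) -> float:
    return math.exp(-beta * (x - alpha) ** 2)

# Degree to which measured values belong to "around 5.0" with beta = 2.0
for x in (5.0, 5.5, 6.0):
    print(x, gaussian_membership(x, alpha=5.0, beta=2.0))
```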

  19. A Randomized Controlled Trial Comparing the McKenzie Method to Motor Control Exercises in People With Chronic Low Back Pain and a Directional Preference.

    PubMed

    Halliday, Mark H; Pappas, Evangelos; Hancock, Mark J; Clare, Helen A; Pinto, Rafael Z; Robertson, Gavin; Ferreira, Paulo H

    2016-07-01

    Study Design Randomized clinical trial. Background Motor control exercises are believed to improve coordination of the trunk muscles. It is unclear whether increases in trunk muscle thickness can be facilitated by approaches such as the McKenzie method. Furthermore, it is unclear which approach may have superior clinical outcomes. Objectives The primary aim was to compare the effects of the McKenzie method and motor control exercises on trunk muscle recruitment in people with chronic low back pain classified with a directional preference. The secondary aim was to conduct a between-group comparison of outcomes for pain, function, and global perceived effect. Methods Seventy people with chronic low back pain who demonstrated a directional preference using the McKenzie assessment were randomized to receive 12 treatments over 8 weeks with the McKenzie method or with motor control approaches. All outcomes were collected at baseline and at 8-week follow-up by blinded assessors. Results No significant between-group difference was found for trunk muscle thickness of the transversus abdominis (-5.8%; 95% confidence interval [CI]: -15.2%, 3.7%), obliquus internus (-0.7%; 95% CI: -6.6%, 5.2%), and obliquus externus (1.2%; 95% CI: -4.3%, 6.8%). Perceived recovery was slightly superior in the McKenzie group (-0.8; 95% CI: -1.5, -0.1) on a -5 to +5 scale. No significant between-group differences were found for pain or function (P = .99 and P = .26, respectively). Conclusion We found no significant effect of treatment group for trunk muscle thickness. Participants reported a slightly greater sense of perceived recovery with the McKenzie method than with the motor control approach. Level of Evidence Therapy, level 1b-. Registered September 7, 2011 at www.anzctr.org.au (ACTRN12611000971932). J Orthop Sports Phys Ther 2016;46(7):514-522. Epub 12 May 2016. doi:10.2519/jospt.2016.6379.

  20. Site Characterization in the Urban Area of Tijuana, B. C., Mexico by Means of: H/V Spectral Ratios, Spectral Analysis of Surface Waves, and Random Decrement Method

    NASA Astrophysics Data System (ADS)

    Tapia-Herrera, R.; Huerta-Lopez, C. I.; Martinez-Cruzado, J. A.

    2009-05-01

    Results of site characterization for an experimental site in the metropolitan area of Tijuana, B. C., Mexico are presented as part of on-going research in which time series of earthquakes, ambient noise, and induced vibrations were processed with three different methods: H/V spectral ratios, Spectral Analysis of Surface Waves (SASW), and the Random Decrement Method (RDM). Forward modeling using the wave propagation stiffness matrix method (Roësset and Kausel, 1981) was used to compute the theoretical SH/P and SV/P spectral ratios, and the experimental H/V spectral ratios were computed following the conventional concepts of Fourier analysis. Comparisons between the theoretical and experimental H/V spectral ratios were carried out. For the SASW method the theoretical dispersion curves were also computed and compared with the experimental ones, and finally the theoretical free-vibration decay curve was compared with the experimental one obtained with the RDM. All three methods were tested with ambient noise, induced vibrations, and earthquake signals. The experimental spectral ratios obtained with both ambient noise and earthquake signals agree quite well with the theoretical spectral ratios, particularly at the fundamental vibration frequency of the recording site. Differences between the fundamental vibration frequencies are evident for sites located on alluvial fill (~0.6 Hz) and sites located on conglomerate/sandstone fill (0.75 Hz). Shear wave velocities for the soft soil layers of the 4-layer discrete soil model range from as low as 100 m/s up to 280 m/s. The results with the SASW provided information that allows identification of low-velocity layers not seen before with the traditional seismic methods. The damping estimates obtained with the RDM are within the expected values, and the dominant frequency of the system, also obtained with the RDM, correlates within ±20% with the one obtained by means of the H/V spectral
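
    The H/V spectral ratio computation described above can be sketched on synthetic data. In this toy example (our parameters throughout) the horizontal components carry a resonance at the ~0.6 Hz alluvial-fill site frequency mentioned in the abstract; real workflows add windowing and more careful spectral smoothing:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic three-component ambient noise record, 10 minutes at 50 Hz:
# horizontals = 0.6 Hz resonance + noise, vertical = noise only.
fs = 50.0
t = np.arange(0, 600, 1 / fs)
f0 = 0.6                                    # assumed site frequency
h_ns = np.sin(2 * np.pi * f0 * t) + 0.5 * rng.standard_normal(t.size)
h_ew = np.cos(2 * np.pi * f0 * t) + 0.5 * rng.standard_normal(t.size)
vert = 0.5 * rng.standard_normal(t.size)

def amp_spectrum(a):
    # amplitude spectrum, crudely smoothed with an 11-bin boxcar
    return np.convolve(np.abs(np.fft.rfft(a)), np.ones(11) / 11, mode="same")

freqs = np.fft.rfftfreq(t.size, 1 / fs)
H = np.sqrt(amp_spectrum(h_ns) ** 2 + amp_spectrum(h_ew) ** 2)
V = amp_spectrum(vert) + 1e-12              # guard against division by zero
peak_freq = freqs[np.argmax(H / V)]
print(peak_freq)                            # near the 0.6 Hz site frequency
```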

  1. Modeling Transport in Fractured Porous Media with the Random-Walk Particle Method: The Transient Activity Range and the Particle-Transfer Probability

    SciTech Connect

    Lehua Pan; G.S. Bodvarsson

    2001-10-22

    Multiscale features of transport processes in fractured porous media make numerical modeling a difficult task, both in conceptualization and computation. Modeling the mass transfer through the fracture-matrix interface is one of the critical issues in the simulation of transport in a fractured porous medium. Because conventional dual-continuum-based numerical methods are unable to capture the transient features of the diffusion depth into the matrix (unless they assume a passive matrix medium), such methods will overestimate the transport of tracers through the fractures, especially for the cases with large fracture spacing, resulting in artificial early breakthroughs. We have developed a new method for calculating the particle-transfer probability that can capture the transient features of diffusion depth into the matrix within the framework of the dual-continuum random-walk particle method (RWPM) by introducing a new concept of activity range of a particle within the matrix. Unlike the multiple-continuum approach, the new dual-continuum RWPM does not require using additional grid blocks to represent the matrix. It does not assume a passive matrix medium and can be applied to the cases where global water flow exists in both continua. The new method has been verified against analytical solutions for transport in the fracture-matrix systems with various fracture spacing. The calculations of the breakthrough curves of radionuclides from a potential repository to the water table in Yucca Mountain demonstrate the effectiveness of the new method for simulating 3-D, mountain-scale transport in a heterogeneous, fractured porous medium under variably saturated conditions.
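
    A toy dual-continuum random-walk particle sketch (a deliberate simplification, not the authors' transient-activity-range scheme, which also allows flow in the matrix): particles advect and disperse along a fracture, and at each step a fracture particle may transfer to an immobile matrix with probability p_fm and return with probability p_mf. Time spent in the matrix retards breakthrough, which is the qualitative effect the transfer probability controls:

```python
import numpy as np

rng = np.random.default_rng(3)

def breakthrough_times(p_fm, p_mf, n=500, length=100.0, v=1.0, disp=1.0, dt=0.1):
    """Random-walk particle tracking with stochastic fracture-matrix exchange."""
    x = np.zeros(n)                       # particle positions along the fracture
    in_frac = np.ones(n, dtype=bool)      # True while a particle is mobile
    t = np.zeros(n)
    arrived = np.full(n, np.inf)
    for _ in range(100000):
        active = np.isinf(arrived)
        if not active.any():
            break
        move = active & in_frac
        # advection plus dispersion step for mobile particles
        x[move] += v * dt + np.sqrt(2 * disp * dt) * rng.standard_normal(move.sum())
        u = rng.random(n)
        to_mat = active & in_frac & (u < p_fm)     # fracture -> matrix
        to_frac = active & ~in_frac & (u < p_mf)   # matrix -> fracture
        in_frac[to_mat] = False
        in_frac[to_frac] = True
        t[active] += dt
        done = active & (x >= length)
        arrived[done] = t[done]
    return arrived

fast = np.median(breakthrough_times(p_fm=0.0, p_mf=0.0))
slow = np.median(breakthrough_times(p_fm=0.02, p_mf=0.01))
print(fast, slow)  # matrix exchange delays the median breakthrough
```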

  2. Random Vibrations

    NASA Technical Reports Server (NTRS)

    Messaro, Semma; Harrison, Phillip

    2010-01-01

    Ares I zonal random vibration environments due to acoustic impingement and combustion processes are developed for liftoff, ascent and reentry. Random vibration test criteria for Ares I Upper Stage pyrotechnic components are developed by enveloping the applicable zonal environments where each component is located. Random vibration tests will be conducted to assure that these components will survive and function appropriately after exposure to the expected vibration environments. Methodology: Random vibration test criteria for Ares I Upper Stage pyrotechnic components were desired that would envelope all the applicable environments where each component was located. Applicable Ares I vehicle drawings and design information needed to be assessed to determine the location(s) for each component on the Ares I Upper Stage. Design and test criteria needed to be developed by plotting and enveloping the applicable environments using Microsoft Excel spreadsheet software and documenting them in a report using Microsoft Word processing software. Conclusion: Random vibration liftoff, ascent, and green run design and test criteria for the Upper Stage pyrotechnic components were developed by using Microsoft Excel to envelope zonal environments applicable to each component. Results were transferred from Excel into a report using Microsoft Word. After the report is reviewed and edited by my mentor it will be submitted for publication as an attachment to a memorandum. Pyrotechnic component designers will extract criteria from my report for incorporation into the design and test specifications for components. Eventually the hardware will be tested to the environments I developed to assure that the components will survive and function appropriately after exposure to the expected vibration environments.

  3. Individual and contextual determinants of adequate maternal health care services in Kenya.

    PubMed

    Achia, Thomas N O; Mageto, Lillian E

    2015-01-01

    This study aimed to examine individual and community level factors associated with adequate use of maternal antenatal health services in Kenya. Individual and community level factors associated with adequate use of maternal health care (MHC) services were obtained from the 2008-09 Kenya Demographic and Health Survey data set. Multilevel partial-proportional odds logit models were fitted using STATA 13.0 to quantify the relations of the selected covariates to adequate MHC use, defined as a three-category ordinal variable. The sample consisted of 3,621 women who had at least one live birth in the five-year period preceding this survey. Only 18 percent of the women had adequate use of MHC services. Greater educational attainment by the woman or her partner, higher socioeconomic status, access to medical insurance coverage, and greater media exposure were the individual-level factors associated with adequate use of MHC services. Greater community ethnic diversity, higher community-level socioeconomic status, and greater community-level health facility deliveries were the contextual-level factors associated with adequate use of MHC. To improve the use of MHC services in Kenya, the government needs to design and implement programs that target underlying individual and community level factors, providing focused and sustained health education to promote the use of antenatal, delivery, and postnatal care.

  4. Random Logic Oxide Screening Methods

    DTIC Science & Technology

    1990-12-01

    The probability of having x = 0, i.e., the probability of having no defects present which fail at or below E, is F0(E) = f(x, lambda)|x=0 = e^(-lambda). Also, by... Confirmation of the Fowler-Nordheim Law for Large-Area Field Emitter Arrays. Appl. Phys. Lett., Vol. 23, No. 1, July 1973. 2. Y. Nissan-Cohen, J. Shappir
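
    The Poisson defect statistics in the fragment above imply a simple screening-yield formula; a minimal sketch (parameter names are ours, not from the report):

```python
import math

# Poisson defect model: if lam is the expected number of oxide defects (per
# device) that fail at or below the screen field E, the probability that a
# device contains no such defect is e^(-lam).
def prob_no_defects(lam: float) -> float:
    return math.exp(-lam)

def screen_pass_fraction(defect_density_per_cm2: float, area_cm2: float) -> float:
    # lam scales with oxide area, so larger devices are likelier to contain
    # a weak spot (hypothetical parameterization for illustration)
    return prob_no_defects(defect_density_per_cm2 * area_cm2)

print(screen_pass_fraction(0.5, 0.1))   # exp(-0.05), about 0.951
```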

  5. Seeking Clearer Recommendations for Hand Hygiene in Communities Facing Ebola: A Randomized Trial Investigating the Impact of Six Handwashing Methods on Skin Irritation and Dermatitis

    PubMed Central

    Wells, Emma; Mitro, Brittany; Desmarais, Anne Marie; Scheinman, Pamela; Lantagne, Daniele

    2016-01-01

    To prevent disease transmission, 0.05% chlorine solution is commonly recommended for handwashing in Ebola Treatment Units. In the 2014 West Africa outbreak this recommendation was widely extended to community settings, although many organizations recommend soap and hand sanitizer over chlorine. To evaluate skin irritation caused by frequent handwashing that may increase transmission risk in Ebola-affected communities, we conducted a randomized trial with 91 subjects who washed their hands 10 times a day for 28 days. Subjects used soap and water, sanitizer, or one of four chlorine solutions used by Ebola responders (calcium hypochlorite (HTH), sodium dichloroisocyanurate (NaDCC), and generated or pH-stabilized sodium hypochlorite (NaOCl)). Outcomes were self-reported hand feel, irritation as measured by the Hand Eczema Score Index (HECSI) (range 0-360), signs of transmission risk (e.g., cracking), and dermatitis diagnosis. All groups experienced statistically significant increases in HECSI score. Subjects using sanitizer had the smallest increases, followed by higher pH chlorine solutions (HTH and stabilized NaOCl), and soap and water. The greatest increases were among neutral pH chlorine solutions (NaDCC and generated NaOCl). Signs of irritation related to higher transmission risk were observed most frequently in subjects using soap and least frequently by those using sanitizer or HTH. Despite these irritation increases, all methods represented minor changes in HECSI score. Average HECSI score was only 9.10 at endline (range 1-33) and 4% (4/91) of subjects were diagnosed with dermatitis, one each in four groups. Each handwashing method has benefits and drawbacks: soap is widely available and inexpensive, but requires water and does not inactivate the virus; sanitizer is easy-to-use and effective but expensive and unacceptable to many communities; and chlorine is easy-to-use but difficult to produce properly and distribute. Overall, we recommend Ebola responders

  6. A Randomized, Single-Blind, Placebo-Controlled Study on the Efficacy of the Arthrokinematic Approach-Hakata Method in Patients with Chronic Nonspecific Low Back Pain

    PubMed Central

    Kogure, Akira; Kotani, Kazuhiko; Katada, Shigehiko; Takagi, Hiroshi; Kamikozuru, Masahiro; Isaji, Takashi; Hakata, Setsuo

    2015-01-01

    Study Design Randomized, single-blind, controlled trial. Objective To investigate the efficacy of the Arthrokinematic approach (AKA)-Hakata (H) method for chronic low back pain. Summary of Background Data The AKA-H method is used to manually treat abnormalities of intra-articular movement. Methods One hundred eighty-six patients with chronic nonspecific low back pain randomly received either the AKA-H method (AKA-H group) or the sham technique (S group) monthly for 6 months. Data were collected at baseline and once a month. Outcome measures were pain intensity (visual analogue scale [VAS]) and quality of life (the Roland-Morris Disability Questionnaire [RDQ] and Short Form SF-36 questionnaire [SF-36]). Results At baseline, the VAS, RDQ, and SF-36 scores showed similar levels between the groups. After 6 months, the AKA-H group had more improvement in the VAS (42.8% improvement) and RDQ score (31.1% improvement) than the sham group (VAS: 10.4% improvement; RDQ: 9.8% improvement; both, P < 0.001). The respective scores for the SF-36 subscales (physical functioning, role physical, bodily pain, social functioning, general health perception, role emotional, and mental health) were also significantly more improved in the AKA-H group than in the sham group (all, P < 0.001). The scores for the physical, psychological, and social aspects of the SF-36 subscales showed similar improvement in the AKA-H group. Conclusion The AKA-H method can be effective in managing chronic low back pain. Trial Registration UMIN Clinical Trials Registry (UMIN-CTR) UMIN000006250. PMID:26646534

  7. Randomized controlled trial to evaluate the effects of combined progressive exercise on metabolic syndrome in breast cancer survivors: rationale, design, and methods

    PubMed Central

    2014-01-01

Background Metabolic syndrome (MetS) is increasingly present in breast cancer survivors, possibly worsened by cancer-related treatments, such as chemotherapy. MetS greatly increases risk of cardiovascular disease and diabetes, co-morbidities that could impair the survivorship experience and possibly lead to cancer recurrence. Exercise has been shown to positively influence quality of life (QOL), physical function, muscular strength and endurance, reduce fatigue, and improve emotional well-being; however, the impact on MetS components (visceral adiposity, hyperglycemia, low serum high-density lipoprotein cholesterol, hypertriglyceridemia, and hypertension) remains largely unknown. In this trial, we aim to assess the effects of combined (aerobic and resistance) exercise on components of MetS, as well as on physical fitness and QOL, in breast cancer survivors soon after completing cancer-related treatments. Methods/Design This study is a prospective randomized controlled trial (RCT) investigating the effects of a 16-week supervised progressive aerobic and resistance exercise training intervention on MetS in 100 breast cancer survivors. Main inclusion criteria are histologically-confirmed breast cancer stage I-III, completion of chemotherapy and/or radiation within 6 months prior to initiation of the study, sedentary, and free from musculoskeletal disorders. The primary endpoint is MetS; secondary endpoints include: muscle strength, shoulder function, cardiorespiratory fitness, body composition, bone mineral density, and QOL. Participants randomized to the Exercise group participate in 3 supervised weekly exercise sessions for 16 weeks. Participants randomized to the Control group are offered the same intervention after the 16-week period of observation. Discussion This is one of the few RCTs examining the effects of exercise on MetS in breast cancer survivors. Results will contribute to a better understanding of metabolic disease-related effects of resistance and

  8. Estimating efficacy in a randomized trial with product nonadherence: application of multiple methods to a trial of preexposure prophylaxis for HIV prevention.

    PubMed

    Murnane, Pamela M; Brown, Elizabeth R; Donnell, Deborah; Coley, R Yates; Mugo, Nelly; Mujugira, Andrew; Celum, Connie; Baeten, Jared M

    2015-11-15

    Antiretroviral preexposure prophylaxis (PrEP) for persons at high risk of human immunodeficiency virus infection is a promising new prevention strategy. Six randomized trials of oral PrEP were recently conducted and demonstrated efficacy estimates ranging from 75% to no effect, with nonadherence likely resulting in attenuated estimates of the protective effect of PrEP. In 1 of these trials, the Partners PrEP Study (Kenya and Uganda, 2008-2011), participants (4,747 serodiscordant heterosexual couples) were randomized to receipt of tenofovir (TDF), coformulated TDF/emtricitabine (FTC), or placebo. Intention-to-treat analyses found efficacy estimates of 67% for TDF and 75% for TDF/FTC. We applied multiple methods to data from that trial to estimate the efficacy of PrEP with high adherence, including principal stratification and inverse-probability-of-censoring (IPC) weights. Results were further from the null when correcting for nonadherence: 1) among the strata with an estimated 100% probability of high adherence (TDF hazard ratio (HR) = 0.19, 95% confidence interval (CI): 0.07, 0.56; TDF/FTC HR = 0.12, 95% CI: 0.03, 0.52); 2) with IPC weights used to approximate a continuously adherent population (TDF HR = 0.18, 95% CI: 0.06, 0.53; TDF/FTC HR = 0.15, 95% CI: 0.04, 0.52); and 3) in per-protocol analysis (TDF HR = 0.18, 95% CI: 0.06, 0.53; TDF/FTC HR = 0.16, 95% CI: 0.05, 0.53). Our results suggest that the efficacy of PrEP with high adherence is over 80%.
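The headline efficacy numbers here are simply one minus the hazard ratio, expressed as a percentage. A minimal sketch of that conversion, using the per-protocol hazard-ratio point estimates quoted in the abstract:

```python
def efficacy_pct(hazard_ratio: float) -> float:
    """Protective efficacy implied by a hazard ratio, in percent (1 - HR)."""
    return (1.0 - hazard_ratio) * 100.0

# Per-protocol point estimates quoted in the abstract:
tdf = efficacy_pct(0.18)      # TDF arm
tdf_ftc = efficacy_pct(0.16)  # TDF/FTC arm
print(round(tdf, 1), round(tdf_ftc, 1))
```

Both values land above 80%, consistent with the abstract's conclusion that efficacy with high adherence exceeds 80%.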

  9. A randomized field trial for the primary prevention of osteoporosis among adolescent females: Comparison of two methods, mother centered and daughter centered

    PubMed Central

    Ansari, Hourieh; Farajzadegan, Ziba; Hajigholami, Ali; Paknahad, Zamzam

    2014-01-01

Background: Osteoporosis is a serious public health problem. Since the majority of bone mass is acquired during adolescence, primary prevention is important. Mothers’ participation in health education interventions probably helps promote healthy behaviors in their children. Aims: To assess whether a lifestyle modification intervention focused on mothers or on students has an impact on osteoporosis-preventive behaviors in adolescent girls. Materials and Methods: This was a randomized field trial in female high schools. 210 girls aged between 11 and 15 were randomly selected; students in groups A and C and mothers in group B were selected through the sampling frame. Our lifestyle modification intervention was based on group education in the public girls’ high schools. Subjects in the intervention groups participated in three educational sessions. Students’ osteoporosis-preventive behaviors were measured using a lifestyle questionnaire consisting of items assessing nutrition, physical activity, and sun exposure. Repeated-measures ANOVA with assessments at baseline, 4 weeks, 2 months, and 6 months was used to analyze the data. Results: After 1 month, diet and sun-exposure scores increased significantly (P < 0.001), with greater increases in group B than in group A (diet: P < 0.001; sun exposure: P = 0.001). After 6 months, diet and sun-exposure status in group A had decreased approximately to baseline, while in group B the diet components remained significantly different from baseline (P < 0.001). There was no change in physical activity. Conclusion: Osteoporosis-prevention interventions for adolescents can be effective when mothers or girls participate in training sessions, but education is associated with better outcomes when focused on mothers. PMID:25422660

  10. Fractional randomness

    NASA Astrophysics Data System (ADS)

    Tapiero, Charles S.; Vallois, Pierre

    2016-11-01

The premise of this paper is that a fractional probability distribution is based on fractional operators and on the fractional (Hurst) index used, which together alter the classical setting of random variables. For example, a random variable defined by its density function might not have a fractional density function defined in its conventional sense. Practically, it implies that a distribution's granularity defined by a fractional kernel may have properties that differ due to the fractional index used and the fractional calculus applied to define it. The purpose of this paper is to consider an application of fractional calculus to define the fractional density function of a random variable. In addition, we provide and prove a number of results, defining the functional forms of these distributions as well as their existence. In particular, we define fractional probability distributions for increasing and decreasing functions that are right continuous. Examples are used to motivate the usefulness of a statistical approach to fractional calculus and its application to economic and financial problems. In conclusion, this paper is a preliminary attempt to construct statistical fractional models. Due to the breadth and the extent of such problems, this paper may be considered as an initial attempt to do so.
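As background only (the standard definition, not the paper's own construction), the fractional operators referred to above are typically of Riemann-Liouville type; the fractional integral of order $\alpha > 0$ of a function $f$ is

```latex
(I^{\alpha} f)(x) = \frac{1}{\Gamma(\alpha)} \int_{0}^{x} (x - t)^{\alpha - 1} f(t)\, dt,
```

which reduces to the ordinary integral at $\alpha = 1$; the paper's question is when a density defined through such an operator remains a well-defined distribution.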

  11. Treatment of reducible unstable fractures of the distal radius: randomized clinical study comparing the locked volar plate and external fixator methods: study protocol

    PubMed Central

    2014-01-01

Background Various treatments are available for reducible unstable fractures of the distal radius, such as closed reduction combined with fixation by external fixator (EF), and rigid internal fixation using a locked volar plate (VP). Although there are studies comparing these methods, there is no conclusive evidence indicating which treatment is best. The hypothesis of this study is that surgical treatment with a VP is more effective than EF from the standpoint of functional outcome (patient-reported). Methods/Design The study is a randomized clinical trial with parallel groups and a blinded evaluator and involves the surgical interventions EF and VP. Patients will be randomly assigned (assignment ratio 1:1) using sealed opaque envelopes. This trial will include consecutive adult patients with an acute (up to 15 days) displaced, unstable fracture of the distal end of the radius of type A2, A3, C1, C2 or C3 by the Arbeitsgemeinschaft für Osteosynthesefragen–Association for the Study of Internal Fixation classification and type II or type III by the IDEAL32 classification, without previous surgical treatments of the wrist. The surgical intervention assigned will be performed by three surgical specialists familiar with the techniques described. Evaluations will be performed at 2 and 8 weeks and at 3, 6, and 12 months, with the primary outcomes being measured by the Disabilities of the Arm, Shoulder and Hand (DASH) questionnaire and measurement of pain (Visual Analog Pain Scale and digital algometer). Secondary outcomes will include radiographic parameters, objective functional evaluation (goniometry and dynamometry), and the rate of complications and method failure according to the intention-to-treat principle. Final postoperative evaluations (6 and 12 months) will be performed by independent blinded evaluators. For the Student’s t-test, a difference of 10 points in the DASH score, with a 95% confidence interval, a statistical power of 80%, and 20% sampling error
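The (truncated) power statement above corresponds to a standard two-sample sample-size calculation. A sketch using the normal approximation; note that the 20-point DASH standard deviation below is an assumed value chosen for illustration, not a figure from the protocol:

```python
import math
from statistics import NormalDist

def n_per_group(delta: float, sd: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate per-group n for a two-sided two-sample comparison of means:
    n = 2 * ((z_{1-alpha/2} + z_{power}) * sd / delta)^2, rounded up."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return math.ceil(2 * ((z_a + z_b) * sd / delta) ** 2)

# Detect a 10-point DASH difference at alpha = 0.05 with 80% power,
# assuming (hypothetically) a 20-point standard deviation:
print(n_per_group(delta=10, sd=20))
```

The exact t-based calculation adds a few subjects per group on top of this normal approximation.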

  12. Development of an efficient fungal DNA extraction method to be used in random amplified polymorphic DNA-PCR analysis to differentiate cyclopiazonic acid mold producers.

    PubMed

    Sánchez, Beatriz; Rodríguez, Mar; Casado, Eva M; Martín, Alberto; Córdoba, Juan J

    2008-12-01

A variety of previously established mechanical and chemical treatments to achieve fungal cell lysis, combined with a semiautomatic system operated by a vacuum pump, were tested to obtain DNA extracts to be used directly in randomly amplified polymorphic DNA (RAPD)-PCR to differentiate cyclopiazonic acid-producing and -nonproducing mold strains. A DNA extraction method that includes digestion with proteinase K and lyticase prior to grinding with a mortar and pestle and use of a semiautomatic vacuum system yielded DNA of high quality in all the fungal strains and species tested, at concentrations ranging from 17 to 89 ng/µl in 150 µl of the final DNA extract. Two microliters of DNA extracted with this method was used directly for RAPD-PCR with primer (GACA)4. Reproducible RAPD fingerprints showing clear differences between producer and nonproducer strains were observed. These differences in the RAPD patterns did not cluster all the strains tested by cyclopiazonic acid production, but may be very useful for distinguishing cyclopiazonic acid producer strains from nonproducer strains by a simple RAPD analysis. Thus, the DNA extracts obtained could be used directly, without prior purification and quantification, for RAPD analysis to differentiate cyclopiazonic acid producer from nonproducer mold strains. This combined analysis could be adapted to other toxigenic fungal species to enable differentiation of toxigenic and non-toxigenic molds, a procedure of great interest in food safety.

  13. Efficient numerical methods for the random-field Ising model: Finite-size scaling, reweighting extrapolation, and computation of response functions.

    PubMed

    Fytas, Nikolaos G; Martín-Mayor, Víctor

    2016-06-01

It was recently shown [Phys. Rev. Lett. 110, 227201 (2013)PRLTAO0031-900710.1103/PhysRevLett.110.227201] that the critical behavior of the random-field Ising model in three dimensions is ruled by a single universality class. This conclusion was reached only after a proper taming of the large scaling corrections of the model by applying a combined approach of various techniques, coming from the zero- and positive-temperature toolboxes of statistical physics. In the present contribution we provide a detailed description of this combined scheme, explaining in detail the zero-temperature numerical scheme and developing the generalized fluctuation-dissipation formula that allowed us to compute connected and disconnected correlation functions of the model. We discuss the error evolution of our method and we illustrate the infinite-size-limit extrapolation of several observables within phenomenological renormalization. We present an extension of the quotients method that allows us to obtain estimates of the critical exponent α of the specific heat of the model via the scaling of the bond energy, and we discuss the self-averaging properties of the system and the algorithmic aspects of the maximum-flow algorithm used.
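For context, the quotients method mentioned above is conventionally set up as follows (a standard finite-size-scaling sketch, not the paper's specific extension): for an observable $O$ scaling as $L^{x_O/\nu}$ at criticality, one compares lattice sizes $L$ and $2L$ at the crossing point $g^{*}$ of a dimensionless quantity, so that

```latex
\frac{O(2L, g^{*})}{O(L, g^{*})} = 2^{\,x_{O}/\nu} + \text{(scaling corrections)},
```

from which the exponent ratio $x_{O}/\nu$ is read off; applying this to the bond energy is how an estimate related to the specific-heat exponent $\alpha$ can be extracted.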

  14. Development of a New Method for Detection and Identification of Oenococcus oeni Bacteriophages Based on Endolysin Gene Sequence and Randomly Amplified Polymorphic DNA

    PubMed Central

    Doria, Francesca; Napoli, Chiara; Costantini, Antonella; Berta, Graziella; Saiz, Juan-Carlos

    2013-01-01

    Malolactic fermentation (MLF) is a biochemical transformation conducted by lactic acid bacteria (LAB) that occurs in wine at the end of alcoholic fermentation. Oenococcus oeni is the main species responsible for MLF in most wines. As in other fermented foods, where bacteriophages represent a potential risk for the fermentative process, O. oeni bacteriophages have been reported to be a possible cause of unsuccessful MLF in wine. Thus, preparation of commercial starters that take into account the different sensitivities of O. oeni strains to different phages would be advisable. However, currently, no methods have been described to identify phages infecting O. oeni. In this study, two factors are addressed: detection and typing of bacteriophages. First, a simple PCR method was devised targeting a conserved region of the endolysin (lys) gene to detect temperate O. oeni bacteriophages. For this purpose, 37 O. oeni strains isolated from Italian wines during different phases of the vinification process were analyzed by PCR for the presence of the lys gene, and 25 strains gave a band of the expected size (1,160 bp). This is the first method to be developed that allows identification of lysogenic O. oeni strains without the need for time-consuming phage bacterial-lysis induction methods. Moreover, a phylogenetic analysis was conducted to type bacteriophages. After the treatment of bacteria with UV light, lysis was obtained for 15 strains, and the 15 phage DNAs isolated were subjected to two randomly amplified polymorphic DNA (RAPD)-PCRs. By combining the RAPD profiles and lys sequences, 12 different O. oeni phages were clearly distinguished. PMID:23728816

  15. Impact of Denture Cleaning Method and Overnight Storage Condition on Denture Biofilm Mass and Composition: A Cross-Over Randomized Clinical Trial

    PubMed Central

    Duyck, Joke; Vandamme, Katleen; Krausch-Hofmann, Stefanie; Boon, Lies; De Keersmaecker, Katrien; Jalon, Eline; Teughels, Wim

    2016-01-01

Background Appropriate oral hygiene is required to maintain oral health in denture wearers. This study aims to compare the role of denture cleaning methods in combination with overnight storage conditions on biofilm mass and composition on acrylic removable dentures. Methods In a cross-over randomized controlled trial in 13 older people, 4 conditions with 2 different mechanical cleaning methods and 2 overnight storage conditions were considered: (i) brushing and immersion in water without a cleansing tablet, (ii) brushing and immersion in water with a cleansing tablet, (iii) ultrasonic cleaning and immersion in water without a cleansing tablet, and (iv) ultrasonic cleaning and immersion in water with a cleansing tablet. Each test condition was performed for 5 consecutive days, preceded by a 2-day wash-out period. Biofilm samples were taken at baseline (control) and at the end of each test period from a standardized region. Total and individual levels of selected oral bacteria (n = 20), and of Candida albicans, were identified using the Polymerase Chain Reaction (PCR) technique. Denture biofilm coverage was scored using an analogue denture plaque score. Paired t-tests and Wilcoxon signed-rank tests were used to compare the test conditions. The level of significance was set at α < 5%. Results Overnight denture storage in water with a cleansing tablet significantly reduced the total bacterial count (p<0.01). The difference in total bacterial level between the two mechanical cleaning methods was not statistically significant. No significant effect was observed on the amount of Candida albicans or on the analogue plaque scores. Conclusions The use of cleansing tablets during overnight denture storage in addition to mechanical denture cleaning did not affect Candida albicans count, but reduced the total bacterial count on acrylic removable dentures compared to overnight storage in water.
This effect was more pronounced when combined with ultrasonic cleaning compared to

  16. Development of a new method for detection and identification of Oenococcus oeni bacteriophages based on endolysin gene sequence and randomly amplified polymorphic DNA.

    PubMed

    Doria, Francesca; Napoli, Chiara; Costantini, Antonella; Berta, Graziella; Saiz, Juan-Carlos; Garcia-Moruno, Emilia

    2013-08-01

    Malolactic fermentation (MLF) is a biochemical transformation conducted by lactic acid bacteria (LAB) that occurs in wine at the end of alcoholic fermentation. Oenococcus oeni is the main species responsible for MLF in most wines. As in other fermented foods, where bacteriophages represent a potential risk for the fermentative process, O. oeni bacteriophages have been reported to be a possible cause of unsuccessful MLF in wine. Thus, preparation of commercial starters that take into account the different sensitivities of O. oeni strains to different phages would be advisable. However, currently, no methods have been described to identify phages infecting O. oeni. In this study, two factors are addressed: detection and typing of bacteriophages. First, a simple PCR method was devised targeting a conserved region of the endolysin (lys) gene to detect temperate O. oeni bacteriophages. For this purpose, 37 O. oeni strains isolated from Italian wines during different phases of the vinification process were analyzed by PCR for the presence of the lys gene, and 25 strains gave a band of the expected size (1,160 bp). This is the first method to be developed that allows identification of lysogenic O. oeni strains without the need for time-consuming phage bacterial-lysis induction methods. Moreover, a phylogenetic analysis was conducted to type bacteriophages. After the treatment of bacteria with UV light, lysis was obtained for 15 strains, and the 15 phage DNAs isolated were subjected to two randomly amplified polymorphic DNA (RAPD)-PCRs. By combining the RAPD profiles and lys sequences, 12 different O. oeni phages were clearly distinguished.

  17. Determining median urinary iodine concentration that indicates adequate iodine intake at population level.

    PubMed Central

    Delange, François; de Benoist, Bruno; Burgi, Hans

    2002-01-01

OBJECTIVE: Urinary iodine concentration is the prime indicator of nutritional iodine status and is used to evaluate population-based iodine supplementation. In 1994, WHO, UNICEF and ICCIDD recommended median urinary iodine concentrations for populations of 100-200 µg/l, assuming the 100 µg/l threshold would limit concentrations <50 µg/l to no more than 20% of samples. METHOD: A questionnaire on the frequency distribution of urinary iodine in iodine-replete populations was circulated to 29 scientific groups. FINDINGS: Nineteen groups reported data from 48 populations with median urinary iodine concentrations >100 µg/l. The total population was 55,892, including 35,661 (64%) schoolchildren. Median urinary iodine concentrations were 111-540 (median 201) µg/l for all populations, 100-199 µg/l in 23 (48%) populations and ≥200 µg/l in 25 (52%). The frequencies of values <50 µg/l were 0-20.8% (mean 4.8%) overall, and 7.2% and 2.5% in populations with medians of 100-199 µg/l and >200 µg/l, respectively. The frequency reached 20% only in two places where iodine had been supplemented for <2 years. CONCLUSION: The frequency of urinary iodine concentrations <50 µg/l in populations with median urinary iodine concentrations ≥100 µg/l has been overestimated. The threshold of 100 µg/l does not need to be increased. In populations, median urinary iodine concentrations of 100-200 µg/l indicate adequate iodine intake and optimal iodine nutrition. PMID:12219154
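The two indicators this study relies on, the population median UIC and the fraction of samples below 50 µg/l, are straightforward to compute. A minimal sketch on invented survey values (the 100-200 µg/l adequacy window is the abstract's own criterion):

```python
from statistics import median

def iodine_summary(uic):
    """Return the median urinary iodine concentration (µg/l), the fraction of
    values below 50 µg/l, and whether the median falls in the 100-200 µg/l
    adequacy window used in the abstract."""
    med = median(uic)
    frac_below_50 = sum(1 for v in uic if v < 50) / len(uic)
    return med, frac_below_50, 100 <= med <= 200

# Hypothetical survey data, not from the paper:
samples = [45, 95, 110, 130, 140, 150, 160, 170, 190, 210]
med, frac, adequate = iodine_summary(samples)
print(med, frac, adequate)
```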

  18. The Goal of Adequate Nutrition: Can It Be Made Affordable, Sustainable, and Universal?

    PubMed

    McFarlane, Ian

    2016-11-30

    Until about 1900, large proportions of the world population endured hunger and poverty. The 20th century saw world population increase from 1.6 to 6.1 billion, accompanied and to some extent made possible by rapid improvements in health standards and food supply, with associated advances in agricultural and nutrition sciences. In this paper, I use the application of linear programming (LP) in preparation of rations for farm animals to illustrate a method of calculating the lowest cost of a human diet selected from locally available food items, constrained to provide recommended levels of food energy and nutrients; then, to find a realistic minimum cost, I apply the further constraint that the main sources of food energy in the costed diet are weighted in proportion to the actual reported consumption of food items in that area. Worldwide variations in dietary preferences raise the issue as to the sustainability of popular dietary regimes, and the paper reviews the factors associated with satisfying requirements for adequate nutrition within those regimes. The ultimate physical constraints on food supply are described, together with the ways in which climate change may affect those constraints. During the 20th century, food supply increased sufficiently in most areas to keep pace with the rapid increase in world population. Many challenges will need to be overcome if food supply is to continue to meet demand, and those challenges are made more severe by rising expectations of quality of life in the developing world, as well as by the impacts of climate change on agriculture and aquaculture.

  19. The Goal of Adequate Nutrition: Can It Be Made Affordable, Sustainable, and Universal?

    PubMed Central

    McFarlane, Ian

    2016-01-01

    Until about 1900, large proportions of the world population endured hunger and poverty. The 20th century saw world population increase from 1.6 to 6.1 billion, accompanied and to some extent made possible by rapid improvements in health standards and food supply, with associated advances in agricultural and nutrition sciences. In this paper, I use the application of linear programming (LP) in preparation of rations for farm animals to illustrate a method of calculating the lowest cost of a human diet selected from locally available food items, constrained to provide recommended levels of food energy and nutrients; then, to find a realistic minimum cost, I apply the further constraint that the main sources of food energy in the costed diet are weighted in proportion to the actual reported consumption of food items in that area. Worldwide variations in dietary preferences raise the issue as to the sustainability of popular dietary regimes, and the paper reviews the factors associated with satisfying requirements for adequate nutrition within those regimes. The ultimate physical constraints on food supply are described, together with the ways in which climate change may affect those constraints. During the 20th century, food supply increased sufficiently in most areas to keep pace with the rapid increase in world population. Many challenges will need to be overcome if food supply is to continue to meet demand, and those challenges are made more severe by rising expectations of quality of life in the developing world, as well as by the impacts of climate change on agriculture and aquaculture. PMID:28231177
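The least-cost diet described above is a linear program. As a toy stand-in, a brute-force search over integer serving counts instead of a true LP solver can show the structure; all foods, prices, and nutrient values below are invented for illustration and do not come from the paper:

```python
from itertools import product

# Invented illustration data: (cost per serving, kcal, grams of protein)
foods = {
    "rice":  (0.10, 200, 4),
    "beans": (0.15, 120, 8),
    "oil":   (0.05, 120, 0),
}
NEED_KCAL, NEED_PROTEIN = 2000, 50

best = None  # (cost, servings) of the cheapest feasible diet found
for servings in product(range(21), repeat=len(foods)):
    cost = kcal = protein = 0.0
    for n, (c, k, p) in zip(servings, foods.values()):
        cost += n * c
        kcal += n * k
        protein += n * p
    if kcal >= NEED_KCAL and protein >= NEED_PROTEIN:
        if best is None or cost < best[0]:
            best = (cost, servings)

print(best)  # cheapest (cost, (rice, beans, oil)) meeting both constraints
```

A real formulation would use a proper LP solver with continuous serving sizes and many more nutrient constraints; the brute force here only illustrates the objective (minimize cost) subject to nutrient floors.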

  20. Emotional Experiences of Obese Women with Adequate Gestational Weight Variation: A Qualitative Study

    PubMed Central

    Faria-Schützer, Débora Bicudo; Surita, Fernanda Garanhani de Castro; Alves, Vera Lucia Pereira; Vieira, Carla Maria; Turato, Egberto Ribeiro

    2015-01-01

Background As a result of the growth of the obese population, the number of obese women of fertile age has increased in the last few years. Obesity in pregnancy is related to greater levels of anxiety, depression and physical harm. However, pregnancy is an opportune moment for the intervention of health care professionals to address obesity. The objective of this study was to describe how obese pregnant women emotionally experience success in adequate weight control. Methods and Findings Using a qualitative design that seeks to understand content in the field of health, the sample of subjects was selected deliberately, with thirteen obese pregnant women chosen to participate in an individual interview. Data were analysed by inductive content analysis, which includes complete transcription of the interviews, re-readings using suspended attention, categorization into discussion topics, and qualitative and inductive analysis of the content. The analysis revealed four categories, three of which show the trajectory of body care that obese women experience during pregnancy: 1) the obese pregnant woman starts to think about her body; 2) the challenge of the diet for the obese pregnant woman; 3) the relationship of the obese pregnant woman with the team of antenatal professionals. The fourth category reveals the origin of the motivation for change: 4) the potentializing factors for change: the motivation of the obese woman while pregnant. Conclusions During pregnancy, obese women are more in touch with themselves and with their emotional conflicts. Through the transformations of their bodies, women can start a more refined self-care process and experience of the body-mind unit. The fear for their own and their baby's life, due to the risks posed by obesity, appears to be a great potentializing factor for change. The relationship with the professionals of the health care team plays an important role in the motivational support of the obese pregnant woman. PMID:26529600

  1. Importance of adequate exercise in the detection of coronary heart disease by radionuclide ventriculography

    SciTech Connect

    Brady, T.J.; Thrall, J.H.; Lo, K.; Pitt, B.

    1980-12-01

    Rest and exercise radionuclide ventriculograms were obtained on 77 symptomatic patients without prior documented coronary artery disease (CAD). Coronary artery disease was present by angiograms in 48. Radionuclide ventriculography (RNV) was abnormal in 41 patients (overall sensitivity 85%). In 29 patients with normal coronary arteries, RNV was normal in 24 (specificity 83%). To determine if the exercise level affects sensitivity, the studies were graded for adequacy of exercise. It was considered adequate if patients developed (a) chest pain, or (b) ST segment depression of at least 1 mm, or (c) if they achieved a pressure rate product greater than 250. Among the 48 patients with coronary artery disease, 35 achieved adequate exercise. Thirty-three had an abnormal RNV (sensitivity 94%). In 13 patients who failed to achieve adequate exercise, RNV was abnormal in eight (sensitivity of only 62%). Some patients with coronary artery disease may have a normal ventricular response at inadequate levels of stress.

  2. [Prevention of ocular complications of herpes zoster ophthalmicus by adequate treatment with acyclovir].

    PubMed

    Borruat, F X; Buechi, E R; Piguet, B; Fitting, P; Zografos, L; Herbort, C P

    1991-05-01

    We compared the frequency of severe ocular complications secondary to Herpes Zoster Ophthalmicus (HZO) in 232 patients. They were divided into three groups: 1) patients without treatment (n = 164); 2) patients treated adequately (n = 48) with acyclovir (ACV; 5 x 800 mg/d orally and ophthalmic ointment 5 x /d for a minimum of 7 days, given within three days after skin eruption); and, 3) patients treated inadequately (n = 20) with ACV (only topical treatment, insufficient doses, interrupted treatment, delayed treatment). Patients with no treatment or with inadequate treatments showed the same frequency of severe ocular complications (21% (34/164) and 25% (5/20), respectively). In contrast, when adequate treatment of ACV was given complications occurred in only 4% (2/48) of cases. This study emphasizes the need for prompt (within three days after skin eruption) and adequate (5 x 800 mg/d for at least 7 days) treatment of ACV to prevent the severe complications of HZO.

  3. A simple method for analyzing actives in random RNAi screens: introducing the “H Score” for hit nomination & gene prioritization

    PubMed Central

    Bhinder, Bhavneet; Djaballah, Hakim

    2013-01-01

Due to the numerous challenges in hit identification from random RNAi screening, we have examined current practices and found a variety of methodologies employed and published across many reports; the majority of them, unfortunately, do not address the minimum associated criteria for hit nomination, which may well explain the lack of confirmation and follow-up studies currently facing the RNAi field. Overall, we find that these criteria or parameters are not well defined and are in most cases arbitrary in nature, rendering it extremely difficult to judge the quality of, and confidence in, nominated hits across published studies. For this purpose, we have developed a simple method to score actives independent of assay readout and provide, for the first time, a homogeneous platform enabling cross-comparison of active gene lists resulting from different RNAi screening technologies. Here, we report on our recently developed method dedicated to RNAi data output analysis, referred to as the BDA method, applicable to both arrayed and pooled RNAi technologies; it addresses the concerns pertaining to inconsistent hit nomination and off-target silencing in conjunction with minimal activity criteria to identify high-value targets. In this report, a combined hit rate per gene, called the “H score”, is introduced and defined. The H score provides a very useful tool for stringent active-gene nomination, gene-list comparison across multiple studies, prioritization of hits, and evaluation of the quality of the nominated gene hits. PMID:22934950
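The abstract characterizes the H score only as a "combined hit rate per gene". As a purely illustrative stand-in for that idea (not the published formula), one could tally the fraction of independent RNAi reagents per gene that score active; the gene names and calls below are invented:

```python
from collections import defaultdict

def hit_rate_per_gene(calls):
    """calls: iterable of (gene, active_bool), one entry per RNAi reagent.
    Returns the fraction of reagents scored active for each gene."""
    totals = defaultdict(int)
    actives = defaultdict(int)
    for gene, active in calls:
        totals[gene] += 1
        actives[gene] += int(active)
    return {g: actives[g] / totals[g] for g in totals}

# Hypothetical screen output: three reagents per gene
calls = [("TP53", True), ("TP53", True), ("TP53", False),
         ("KRAS", True), ("KRAS", False), ("KRAS", False)]
rates = hit_rate_per_gene(calls)
print(rates)
```

A rate of this kind makes hit lists comparable across screens with different readouts, which is the motivation the abstract gives for the H score.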

  4. Do we really need a large number of particles to simulate bimolecular reactive transport with random walk methods? A kernel density estimation approach

    NASA Astrophysics Data System (ADS)

    Rahbaralam, Maryam; Fernàndez-Garcia, Daniel; Sanchez-Vila, Xavier

    2015-12-01

Random walk particle tracking methods are a computationally efficient family of methods to solve reactive transport problems. While the number of particles in most realistic applications is on the order of 10^6-10^9, the number of reactive molecules even in diluted systems might be on the order of fractions of the Avogadro number. Thus, each particle actually represents a group of potentially reactive molecules. The use of a low number of particles may result not only in loss of accuracy, but also may lead to an improper reproduction of the mixing process, limited by diffusion. Recent works have used this effect as a proxy to model incomplete mixing in porous media. In this work, we propose using a Kernel Density Estimation (KDE) of the concentrations that allows getting the expected results for a well-mixed solution with a limited number of particles. The idea consists of treating each particle as a sample drawn from the pool of molecules that it represents; this way, the actual location of a tracked particle is seen as a sample drawn from the density function of the location of molecules represented by that given particle, rigorously represented by a kernel density function. The probability of reaction can be obtained by combining the kernels associated to two potentially reactive particles. We demonstrate that the observed deviation in the reaction vs time curves in numerical experiments reported in the literature could be attributed to the statistical method used to reconstruct concentrations (fixed particle support) from discrete particle distributions, and not to the occurrence of true incomplete mixing. We further explore the evolution of the kernel size with time, linking it to the diffusion process. Our results show that KDEs are powerful tools to improve computational efficiency and robustness in reactive transport simulations, and indicates that incomplete mixing in diluted systems should be modeled based on alternative mechanistic models and not on a
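The kernel idea above, treating each particle's position as a sample from a density rather than a point mass, can be sketched in one dimension with a Gaussian kernel. This is a minimal illustration; the bandwidth choice and its link to diffusion are the paper's subject, and the particle positions below are hypothetical:

```python
import math

def kde_concentration(x, particles, mass_per_particle, h):
    """Concentration at x reconstructed from particle positions with a
    Gaussian kernel of bandwidth h, instead of a fixed histogram support."""
    norm = mass_per_particle / (h * math.sqrt(2.0 * math.pi))
    return norm * sum(math.exp(-0.5 * ((x - xp) / h) ** 2) for xp in particles)

# Hypothetical tracked particle positions:
particles = [-0.1, 0.0, 0.05, 0.1]
c0 = kde_concentration(0.0, particles, mass_per_particle=1.0, h=0.2)
c2 = kde_concentration(2.0, particles, mass_per_particle=1.0, h=0.2)
print(c0, c2)
```

The reconstructed field is smooth and strictly positive, decaying away from the particle cloud, which is what lets a modest number of particles reproduce well-mixed concentrations.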

  5. Random grammars

    NASA Astrophysics Data System (ADS)

    Malyshev, V. A.

    1998-04-01

    Contents: § 1. Definitions: 1.1. Grammars; 1.2. Random grammars and L-systems; 1.3. Semigroup representations. § 2. Infinite string dynamics: 2.1. Cluster expansion; 2.2. Cluster dynamics; 2.3. Local observer. § 3. Large time behaviour: small perturbations: 3.1. Invariant measures; 3.2. Classification. § 4. Large time behaviour: context free case: 4.1. Invariant measures for grammars; 4.2. L-systems; 4.3. Fractal correlation functions; 4.4. Measures on languages. Bibliography.

  6. Random Forest as an Imputation Method for Education and Psychology Research: Its Impact on Item Fit and Difficulty of the Rasch Model

    ERIC Educational Resources Information Center

    Golino, Hudson F.; Gomes, Cristiano M. A.

    2016-01-01

    This paper presents a non-parametric imputation technique, named random forest, from the machine learning field. The random forest procedure has two main tuning parameters: the number of trees grown in the prediction and the number of predictors used. Fifty experimental conditions were created in the imputation procedure, with different…

  7. Self-reported segregation experience throughout the life course and its association with adequate health literacy.

    PubMed

    Goodman, Melody S; Gaskin, Darrell J; Si, Xuemei; Stafford, Jewel D; Lachance, Christina; Kaphingst, Kimberly A

    2012-09-01

    Residential segregation has been shown to be associated with health outcomes and health care utilization. We examined the association between racial composition of five physical environments throughout the life course and adequate health literacy among 836 community health center patients in Suffolk County, NY. Respondents who attended a mostly White junior high school or currently lived in a mostly White neighborhood were more likely to have adequate health literacy compared to those educated or living in predominantly minority or diverse environments. This association was independent of the respondent's race, ethnicity, age, education, and country of birth.

  8. Self-reported segregation experience throughout the life course and its association with adequate health literacy

    PubMed Central

    Gaskin, Darrell J.; Si, Xuemei; Stafford, Jewel D.; Lachance, Christina; Kaphingst, Kimberly A.

    2012-01-01

    Residential segregation has been shown to be associated with health outcomes and health care utilization. We examined the association between racial composition of five physical environments throughout the life course and adequate health literacy among 836 community health center patients in Suffolk County, NY. Respondents who attended a mostly White junior high school or currently lived in a mostly White neighborhood were more likely to have adequate health literacy compared to those educated or living in predominantly minority or diverse environments. This association was independent of the respondent’s race, ethnicity, age, education, and country of birth. PMID:22658579

  9. Optimization of pharmacotherapy in chronic heart failure: is heart rate adequately addressed?

    PubMed

    Franke, Jennifer; Wolter, Jan Sebastian; Meme, Lillian; Keppler, Jeannette; Tschierschke, Ramon; Katus, Hugo A; Zugck, Christian

    2013-01-01

    …bpm (p < 0.01). Likewise, comparing the groups ≥75 and <75 bpm, the primary endpoint was significantly increased in the group of patients with heart rates ≥75 bpm (27 vs. 12.2%; p < 0.01). Five-year event-free survival was significantly lower among patients with heart rates ≥70 bpm compared to those with <70 bpm (log-rank test p < 0.05) and among patients in the ≥75 bpm group versus the <75 bpm group (log-rank test p < 0.01). In conclusion, in clinical practice, 53% of CHF patients have inadequate heart rate control (heart rates ≥75 bpm) despite concomitant beta-blocker therapy. In this non-randomized cohort, adequate heart rate control under individually optimized beta-blocker therapy was associated with improved mid- and long-term clinical outcomes up to 5 years. As further up-titration of beta-blockers is not achievable in many patients, the administration of a selective heart-rate-lowering agent such as ivabradine, adjuvant to beta-blockers, may pose an opportunity to further modulate outcome.

  10. Online self-administered training for post-traumatic stress disorder treatment providers: design and methods for a randomized, prospective intervention study

    PubMed Central

    2012-01-01

    This paper presents the rationale and methods for a randomized controlled evaluation of web-based training in motivational interviewing, goal setting, and behavioral task assignment. Web-based training may be a practical and cost-effective way to address the need for large-scale mental health training in evidence-based practice; however, there is a dearth of well-controlled outcome studies of these approaches. For the current trial, 168 mental health providers treating post-traumatic stress disorder (PTSD) were assigned to web-based training plus supervision, web-based training, or training-as-usual (control). A novel standardized patient (SP) assessment was developed and implemented for objective measurement of changes in clinical skills, while on-line self-report measures were used for assessing changes in knowledge, perceived self-efficacy, and practice related to cognitive behavioral therapy (CBT) techniques. Eligible participants were all actively involved in mental health treatment of veterans with PTSD. Study methodology illustrates ways of developing training content, recruiting participants, and assessing knowledge, perceived self-efficacy, and competency-based outcomes, and demonstrates the feasibility of conducting prospective studies of training efficacy or effectiveness in large healthcare systems. PMID:22583520

  11. Feasibility, acceptability, and effects of gentle Hatha yoga for women with major depression: findings from a randomized controlled mixed-methods study.

    PubMed

    Kinser, Patricia Anne; Bourguignon, Cheryl; Whaley, Diane; Hauenstein, Emily; Taylor, Ann Gill

    2013-06-01

    Major depressive disorder (MDD) is a common, debilitating chronic condition in the United States and worldwide. Particularly in women, depressive symptoms are often accompanied by high levels of stress and ruminations, or repetitive self-critical negative thinking. There is a research and clinical imperative to evaluate complementary therapies that are acceptable and feasible for women with depression and that target specific aspects of depression in women, such as ruminations. To begin to address this need, we conducted a randomized, controlled, mixed-methods community-based study comparing an 8-week yoga intervention with an attention-control activity in 27 women with MDD. After controlling for baseline stress, there was a decrease in depression over time in both the yoga group and the attention-control group, with the yoga group having a unique trend in decreased ruminations. Participants in the yoga group reported experiencing increased connectedness and gaining a coping strategy through yoga. The findings provide support for future large scale research to explore the effects of yoga for depressed women and the unique role of yoga in decreasing rumination.

  12. Study design and methods for a randomized crossover trial substituting brown rice for white rice on diabetes risk factors in India.

    PubMed

    Wedick, Nicole M; Sudha, Vasudevan; Spiegelman, Donna; Bai, Mookambika Ramya; Malik, Vasanti S; Venkatachalam, Siva Sankari; Parthasarathy, Vijayalaksmi; Vaidya, Ruchi; Nagarajan, Lakshmipriya; Arumugam, Kokila; Jones, Clara; Campos, Hannia; Krishnaswamy, Kamala; Willett, Walter; Hu, Frank B; Anjana, Ranjit Mohan; Mohan, Viswanathan

    2015-01-01

    India has the second largest number of people with diabetes in the world following China. Evidence indicates that consumption of whole grains can reduce the risk of type 2 diabetes. This article describes the study design and methods of a trial in progress evaluating the effects of substituting whole grain brown rice for polished (refined) white rice on biomarkers of diabetes risk (glucose metabolism, dyslipidemia, inflammation). This is a randomized controlled clinical trial with a crossover design conducted in Chennai, India among overweight but otherwise healthy volunteers aged 25-65 y with a body mass index ≥23 kg/m² and habitual rice consumption ≥200 g/day. The feasibility and cultural appropriateness of this type of intervention in the local environment will also be examined. If the intervention is efficacious, the findings can be incorporated into national-level policies which could include the provision of brown rice as an option or replacement for white rice in government institutions and food programs. This relatively simple dietary intervention has the potential to substantially diminish the burden of diabetes in Asia and elsewhere.

  13. Online self-administered training for post-traumatic stress disorder treatment providers: design and methods for a randomized, prospective intervention study.

    PubMed

    Ruzek, Josef I; Rosen, Raymond C; Marceau, Lisa; Larson, Mary Jo; Garvert, Donn W; Smith, Lauren; Stoddard, Anne

    2012-05-14

    This paper presents the rationale and methods for a randomized controlled evaluation of web-based training in motivational interviewing, goal setting, and behavioral task assignment. Web-based training may be a practical and cost-effective way to address the need for large-scale mental health training in evidence-based practice; however, there is a dearth of well-controlled outcome studies of these approaches. For the current trial, 168 mental health providers treating post-traumatic stress disorder (PTSD) were assigned to web-based training plus supervision, web-based training, or training-as-usual (control). A novel standardized patient (SP) assessment was developed and implemented for objective measurement of changes in clinical skills, while on-line self-report measures were used for assessing changes in knowledge, perceived self-efficacy, and practice related to cognitive behavioral therapy (CBT) techniques. Eligible participants were all actively involved in mental health treatment of veterans with PTSD. Study methodology illustrates ways of developing training content, recruiting participants, and assessing knowledge, perceived self-efficacy, and competency-based outcomes, and demonstrates the feasibility of conducting prospective studies of training efficacy or effectiveness in large healthcare systems.

  14. A minimalistic approach to static and dynamic electron correlations: Amending generalized valence bond method with extended random phase approximation correlation correction

    NASA Astrophysics Data System (ADS)

    Chatterjee, Koushik; Pastorczak, Ewa; Jawulski, Konrad; Pernal, Katarzyna

    2016-06-01

    A perfect-pairing generalized valence bond (GVB) approximation is known to be one of the simplest approximations, which allows one to capture the essence of static correlation in molecular systems. In spite of its attractive feature of being relatively computationally efficient, this approximation misses a large portion of dynamic correlation and does not offer sufficient accuracy to be generally useful for studying electronic structure of molecules. We propose to correct the GVB model and alleviate some of its deficiencies by amending it with the correlation energy correction derived from the recently formulated extended random phase approximation (ERPA). On the examples of systems of diverse electronic structures, we show that the resulting ERPA-GVB method greatly improves upon the GVB model. ERPA-GVB recovers most of the electron correlation and it yields energy barrier heights of excellent accuracy. Thanks to a balanced treatment of static and dynamic correlation, ERPA-GVB stays reliable when one moves from systems dominated by dynamic electron correlation to those for which the static correlation comes into play.

  15. A minimalistic approach to static and dynamic electron correlations: Amending generalized valence bond method with extended random phase approximation correlation correction.

    PubMed

    Chatterjee, Koushik; Pastorczak, Ewa; Jawulski, Konrad; Pernal, Katarzyna

    2016-06-28

    A perfect-pairing generalized valence bond (GVB) approximation is known to be one of the simplest approximations, which allows one to capture the essence of static correlation in molecular systems. In spite of its attractive feature of being relatively computationally efficient, this approximation misses a large portion of dynamic correlation and does not offer sufficient accuracy to be generally useful for studying electronic structure of molecules. We propose to correct the GVB model and alleviate some of its deficiencies by amending it with the correlation energy correction derived from the recently formulated extended random phase approximation (ERPA). On the examples of systems of diverse electronic structures, we show that the resulting ERPA-GVB method greatly improves upon the GVB model. ERPA-GVB recovers most of the electron correlation and it yields energy barrier heights of excellent accuracy. Thanks to a balanced treatment of static and dynamic correlation, ERPA-GVB stays reliable when one moves from systems dominated by dynamic electron correlation to those for which the static correlation comes into play.

  16. Electronic monitoring of symptoms and syndromes associated with cancer: methods of a randomized controlled trial SAKK 95/06 E-MOSAIC

    PubMed Central

    2012-01-01

    Background: In patients with advanced, incurable cancer, anticancer treatment may be used to alleviate cancer-related symptoms, but monitoring of these symptoms in daily practice is rarely done. We aim to test the effectiveness of real-time symptom and syndrome assessment using the E-MOSAIC software, installed on a handheld computer and generating a longitudinal monitoring sheet (LoMoS) provided to the oncologists, in a phase III setting. Methods: In this prospective multicentre cluster-randomized phase III trial, patients with any incurable solid tumor and defined cancer-related symptoms who receive new outpatient chemotherapy with palliative intent (expected tumor-size response rate ≤20%) are eligible. Immediately before the weekly visit to the oncologist, all patients complete, with nurse assistance, the E-MOSAIC assessment: the Edmonton Symptom Assessment Scale, ≤3 additional symptoms, estimated nutritional intake, body weight, Karnofsky performance status, and medications for pain and cachexia. Experienced oncologists will be randomized to receive the LoMoS or not. To minimize contamination, LoMoS are removed from the medical charts after visits. The primary endpoint is the difference in global quality of life (items 29 and 30 of the EORTC QLQ-C30) between baseline and the last study visit at week 6, with a 10-point between-arm difference considered clinically relevant. Twenty clusters (oncologists) per treatment arm, with 4–8 patients each, are targeted to achieve a significance level of 5% and a power of 80% in a mixed-model approach. Selected covariables are included in the model for adjustment. Secondary endpoints include patient-perceived patient-physician communication, symptom burden over time, and oncologists' symptom management performance (predefined symptom thresholds compared to oncologists' pharmacological, diagnostic, or counselling actions [structured chart review]). Discussion: This trial will contribute to the research question of whether structured, longitudinal monitoring of…

  17. Quantitative structure-property relationships of retention indices of some sulfur organic compounds using random forest technique as a variable selection and modeling method.

    PubMed

    Goudarzi, Nasser; Shahsavani, Davood; Emadi-Gandaghi, Fereshteh; Chamjangali, Mansour Arab

    2016-10-01

    In this work, a novel quantitative structure-property relationship technique based on the random forest is proposed for predicting the retention indices of some sulfur organic compounds. To calculate the retention indices of these compounds, theoretical descriptors produced from their molecular structures are employed. The influence of the significant parameters affecting the prediction power of the developed random forest, namely the number of randomly selected variables applied to split each node (m) and the number of trees (nt), is studied to obtain the best model. After optimizing the nt and m parameters, the random forest model conducted with m = 70 and nt = 460 was found to yield the best results. Artificial neural network and multiple linear regression modeling techniques are also used to predict the retention index values of these compounds for comparison with the results of the random forest model. The descriptors selected by stepwise regression and the random forest model are used to build the artificial neural network models. The results showed the superiority of the random forest model over the other models for predicting the retention indices of the studied compounds.
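
    The two tuning parameters described above (the number of trees, nt, and the number of randomly selected candidate features per split, m) can be illustrated with a toy sketch. The code below uses depth-1 trees (stumps) with a single median split per feature instead of full CART trees, so it demonstrates the tuning knobs rather than the actual random forest algorithm used in the paper; all names and data are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_forest(X, y, nt, m):
    """Toy random-forest regressor made of depth-1 trees (stumps).

    nt -- number of trees grown (the abstract's nt)
    m  -- number of randomly selected candidate features per split (the abstract's m)
    """
    n, p = X.shape
    trees = []
    for _ in range(nt):
        idx = rng.integers(0, n, n)                   # bootstrap sample
        feats = rng.choice(p, size=m, replace=False)  # random feature subset
        best = None
        for j in feats:
            t = float(np.median(X[idx, j]))           # one candidate split per feature
            mask = X[idx, j] <= t
            left, right = y[idx][mask], y[idx][~mask]
            if len(left) == 0 or len(right) == 0:
                continue
            sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
            if best is None or sse < best[0]:
                best = (sse, j, t, left.mean(), right.mean())
        trees.append(best[1:])
    return trees

def predict(trees, x):
    """Average the stump predictions, as a random forest does."""
    return float(np.mean([lm if x[j] <= t else rm for j, t, lm, rm in trees]))
```

    Sweeping nt and m over a grid and keeping the pair with the lowest validation error mirrors the optimization the abstract describes.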

  18. Using Mendelian randomization to determine causal effects of maternal pregnancy (intrauterine) exposures on offspring outcomes: Sources of bias and methods for assessing them

    PubMed Central

    2017-01-01

    Mendelian randomization (MR), the use of genetic variants as instrumental variables (IVs) to test causal effects, is increasingly used in aetiological epidemiology. Few of the methodological developments in MR have considered the specific situation of using genetic IVs to test the causal effect of exposures in pregnant women on postnatal offspring outcomes. In this paper, we describe specific ways in which the IV assumptions might be violated when MR is used to test such intrauterine effects. We highlight the importance of considering the extent to which there is overlap between genetic variants in offspring that influence their outcome with genetic variants used as IVs in their mothers. Where there is overlap, and particularly if it generates a strong association of maternal genetic IVs with offspring outcome via the offspring genotype, the exclusion restriction assumption of IV analyses will be violated. We recommend a set of analyses that ought to be considered when MR is used to address research questions concerned with intrauterine effects on post-natal offspring outcomes, and provide details of how these can be undertaken and interpreted. These additional analyses include the use of genetic data from offspring and fathers, examining associations using maternal non-transmitted alleles, and using simulated data in sensitivity analyses (for which we provide code). We explore the extent to which new methods that have been developed for exploring violation of the exclusion restriction assumption in the two-sample setting (MR-Egger and median based methods) might be used when exploring intrauterine effects in one-sample MR. We provide a list of recommendations that researchers should use when applying MR to test the effects of intrauterine exposures on postnatal offspring outcomes and use an illustrative example with real data to demonstrate how our recommendations can be applied and subsequent results appropriately interpreted.
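
    As a hedged illustration of the basic MR logic the paper builds on, the simulation below uses a genetic variant as an instrumental variable and recovers the causal effect with a single-instrument Wald ratio, while a naive regression is confounded. The scenario, effect sizes, and variable names are invented, and the code assumes the exclusion restriction holds, which is precisely the assumption the paper scrutinizes for intrauterine effects.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

# Maternal genetic instrument (an allele count 0/1/2), assumed to affect
# the offspring outcome only through the intrauterine exposure.
G = rng.binomial(2, 0.3, n)
U = rng.normal(size=n)                   # unmeasured confounder
X = 0.5 * G + U + rng.normal(size=n)     # intrauterine exposure
Y = 0.3 * X + U + rng.normal(size=n)     # offspring outcome (true effect 0.3)

def wald_ratio(g, x, y):
    """Single-instrument IV estimate: cov(G, Y) / cov(G, X)."""
    return np.cov(g, y)[0, 1] / np.cov(g, x)[0, 1]

naive = np.cov(X, Y)[0, 1] / np.var(X)   # confounded observational slope
iv = wald_ratio(G, X, Y)
```

    The naive slope is inflated by the confounder U, while the Wald ratio stays close to the true effect of 0.3; violations of the exclusion restriction (e.g., the offspring inheriting the instrument) would bias the IV estimate in the same way U biases the naive one.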

  19. Methods and baseline characteristics of a randomized trial treating early childhood obesity: The Positive Lifestyles for Active Youngsters (Team PLAY) trial

    PubMed Central

    Hare, Marion; Coday, Mace; Williams, Natalie A.; Richey, Phyllis; Tylavsky, Frances; Bush, Andrew

    2012-01-01

    There are few effective obesity interventions directed towards younger children, particularly young minority children. This paper describes the design, intervention, recruitment methods, and baseline data of the ongoing Positive Lifestyles for Active Youngsters (Team PLAY) study. This randomized controlled trial is designed to test the efficacy of a 6-month, moderately intense, primary care feasible, family-based behavioral intervention, targeting both young children and their parent, in promoting healthy weight change. Participants are 270 overweight and obese children (ages 4 to 7 years) and their parent, who were recruited from a primarily African American urban population. Parents and children were instructed in proven cognitive behavioral techniques (e.g. goal setting, self-talk, stimulus control and reinforcement) designed to encourage healthier food choices (more whole grains, fruits and vegetables, and less concentrated fats and sugar), reduce portion sizes, decrease sweetened beverages and increase moderate to vigorous physical activity engagement. The main outcome of this study is change in BMI at two years post enrollment. Recruitment using reactive methods (mailings, TV ads, pamphlets) was found to be more successful than using only a proactive approach (referral through physicians). At baseline, most children were very obese with an average BMI z-score of 2.6. Reported intake of fruits and vegetables and minutes of moderate to vigorous physical activity engagement did not meet national recommendations. If efficacious, Team PLAY would offer a model for obesity treatment directed at families with young children that could be tested and translated to both community and primary care settings. PMID:22342450

  20. Methods and baseline characteristics of a randomized trial treating early childhood obesity: the Positive Lifestyles for Active Youngsters (Team PLAY) trial.

    PubMed

    Hare, Marion E; Coday, Mace; Williams, Natalie A; Richey, Phyllis A; Tylavsky, Frances A; Bush, Andrew J

    2012-05-01

    There are few effective obesity interventions directed towards younger children, particularly young minority children. This paper describes the design, intervention, recruitment methods, and baseline data of the ongoing Positive Lifestyles for Active Youngsters (Team PLAY) study. This randomized controlled trial is designed to test the efficacy of a 6-month, moderately intense, primary care feasible, family-based behavioral intervention, targeting both young children and their parent, in promoting healthy weight change. Participants are 270 overweight and obese children (ages 4 to 7 years) and their parents, who were recruited from a primarily African American urban population. Parents and children were instructed in proven cognitive behavioral techniques (e.g. goal setting, self-talk, stimulus control and reinforcement) designed to encourage healthier food choices (more whole grains, fruits and vegetables, and less concentrated fats and sugar), reduce portion sizes, decrease sweetened beverages and increase moderate to vigorous physical activity engagement. The main outcome of this study is change in BMI at two years post enrollment. Recruitment using reactive methods (mailings, TV ads, pamphlets) was found to be more successful than using only a proactive approach (referral through physicians). At baseline, most children were very obese with an average BMI z-score of 2.6. Reported intake of fruits and vegetables and minutes of moderate to vigorous physical activity engagement did not meet national recommendations. If efficacious, Team PLAY would offer a model for obesity treatment directed at families with young children that could be tested and translated to both community and primary care settings.

  1. Application of bimodal distribution to the detection of changes in uranium concentration in drinking water collected by random daytime sampling method from a large water supply zone.

    PubMed

    Garboś, Sławomir; Święcicka, Dorota

    2015-11-01

    The random daytime (RDT) sampling method was used for the first time in the assessment of average weekly exposure to uranium through drinking water in a large water supply zone. The data set of uranium concentrations determined in 106 RDT samples, collected in three runs from the water supply zone in Wroclaw (Poland), cannot be described by simple normal or log-normal distributions. Therefore, a numerical method designed for the detection and calculation of a bimodal distribution was applied. The two extracted distributions, containing data from the summer season of 2011 and the winter season of 2012 (n_I = 72) and from the summer season of 2013 (n_II = 34), allowed the mean U concentrations in drinking water to be estimated: 0.947 μg/L and 1.23 μg/L, respectively. As the removal efficiency of uranium during the applied treatment process is negligible, the increase in uranium concentration can be explained by a higher U concentration in the surface-infiltration water used for the production of drinking water. During the summer season of 2013, heavy rains were observed in the Lower Silesia region, causing floods over the territory of the entire region. Fluctuations in uranium concentrations in surface-infiltration water can be attributed to releases of uranium from specific sources: migration from phosphate fertilizers and leaching from mineral deposits. Thus, exposure to uranium through drinking water may increase during extreme rainfall events. The average chronic weekly intakes of uranium through drinking water, estimated on the basis of the central values of the extracted normal distributions, accounted for 3.2% and 4.1% of the tolerable weekly intake.
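
    The abstract does not specify the numerical method used to detect and fit the bimodal distribution. One standard way to extract two overlapping normal components from such data is an expectation-maximization (EM) fit of a two-component Gaussian mixture, sketched below; the initialisation and iteration count are arbitrary choices, not the paper's.

```python
import numpy as np

def fit_two_gaussians(x, iters=200):
    """Fit a two-component 1-D Gaussian mixture by EM.
    Returns (weights, means, standard deviations)."""
    x = np.asarray(x, float)
    mu = np.percentile(x, [25, 75]).astype(float)  # crude initialisation
    sd = np.array([x.std(), x.std()])
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each observation
        dens = (w / (sd * np.sqrt(2 * np.pi))
                * np.exp(-(x[:, None] - mu) ** 2 / (2 * sd ** 2)))
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and spreads from responsibilities
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return w, mu, sd
```

    Applied to concentration data with two seasonal regimes, the two fitted means play the role of the central values from which the weekly-intake percentages above were computed.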

  2. Is random access memory random?

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1986-01-01

    Most software is constructed on the assumption that the programs and data are stored in random access memory (RAM). Physical limitations on the relative speeds of processor and memory elements lead to a variety of memory organizations that match the processor addressing rate with the memory service rate. These include interleaved and cached memory. A very high fraction of a processor's address requests can be satisfied from the cache without reference to the main memory. The cache requests information from main memory in blocks that can be transferred at the full memory speed. Programmers who organize algorithms for locality can realize the highest performance from these computers.
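
    The block-transfer behaviour described above can be illustrated with a toy direct-mapped cache simulator; the line count and block size below are arbitrary, not those of any particular machine. A sequential sweep pays one miss per block and hits on the remaining words, while scattered accesses over a large address space almost never hit.

```python
import random

LINES, WORDS = 64, 16          # 64 cache lines of 16 words: a 1024-word cache

def hit_rate(addresses):
    """Fraction of word addresses served by a toy direct-mapped cache."""
    tags = [None] * LINES
    hits = 0
    for a in addresses:
        block = a // WORDS      # memory block holding this word
        line = block % LINES    # the one cache line this block may occupy
        if tags[line] == block:
            hits += 1
        else:
            tags[line] = block  # miss: fetch the whole block from memory
    return hits / len(addresses)

sequential = list(range(4096))                                 # good locality
random.seed(0)
scattered = [random.randrange(1 << 20) for _ in range(4096)]   # poor locality
```

    The sequential sweep hits 15 of every 16 accesses (one compulsory miss per 16-word block), which is why locality-aware algorithms realize so much more of the machine's performance.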

  3. 75 FR 69648 - Safety Analysis Requirements for Defining Adequate Protection for the Public and the Workers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-15

    ... ENERGY Safety Analysis Requirements for Defining Adequate Protection for the Public and the Workers... designed to hold firmly in place. 10 CFR Part 830 imposes a requirement that a documented safety analysis... provide guidance on meeting the requirements imposed by DOE Order 5480.23, Nuclear Safety Analysis...

  4. 42 CFR 413.24 - Adequate cost data and cost finding.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    .... Adequate data capable of being audited is consistent with good business concepts and effective and efficient management of any organization, whether it is operated for profit or on a nonprofit basis. It is a... contract for services (for example, a management contract), directly assigning the costs to the...

  5. Prenatal zinc supplementation of zinc-adequate rats adversely affects immunity in offspring

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We previously showed that zinc (Zn) supplementation of Zn-adequate dams induced immunosuppressive effects that persist in the offspring after weaning. We investigated whether the immunosuppressive effects were due to in utero exposure and/or mediated via milk using a cross-fostering design. Pregnant...

  6. Towards Defining Adequate Lithium Trials for Individuals with Mental Retardation and Mental Illness.

    ERIC Educational Resources Information Center

    Pary, Robert J.

    1991-01-01

    Use of lithium with mentally retarded individuals with psychiatric conditions and/or behavior disturbances is discussed. The paper describes components of an adequate clinical trial and reviews case studies and double-blind cases. The paper concludes that aggression is the best indicator for lithium use, and reviews treatment parameters and…

  7. ADEQUATE SHELTERS AND QUICK REACTIONS TO WARNING: A KEY TO CIVIL DEFENSE.

    PubMed

    LYNCH, F X

    1963-11-08

    Case histories collected by investigators in Japan during 1945 illustrate both the effectiveness of shelters and the dangers inherent in apathy of the population, which suffered needless casualties by ignoring air raid warnings. Adequate shelters and immediate response to warnings are essential to survival in a nuclear attack.

  8. Perceptions of Teachers in Their First Year of School Restructuring: Failure to Make Adequate Yearly Progress

    ERIC Educational Resources Information Center

    Moser, Sharon

    2010-01-01

    The 2007-2008 school year marked the first year that Florida's Title I schools that had not made Adequate Yearly Progress (AYP) for five consecutive years entered restructuring as mandated by the "No Child Left Behind Act" of 2001. My study examines the perceptions of teachers entering their first year of school restructuring due to…

  9. 45 CFR 1182.15 - Institute responsibility for maintaining adequate technical, physical, and security safeguards to...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... record systems. These security safeguards shall apply to all systems in which identifiable personal data... data and automated systems shall be adequately trained in the security and privacy of personal data. (4... technical, physical, and security safeguards to prevent unauthorized disclosure or destruction of manual...

  10. 42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 3 2011-10-01 2011-10-01 false Adequate financial records, statistical data, and... financial records, statistical data, and cost finding. (a) Maintenance of records. (1) An HMO or CMP must maintain sufficient financial records and statistical data for proper determination of costs payable by...

  11. Effect of tranquilizers on animal resistance to the adequate stimuli of the vestibular apparatus

    NASA Technical Reports Server (NTRS)

    Maksimovich, Y. B.; Khinchikashvili, N. V.

    1980-01-01

    The effect of tranquilizers on vestibulospinal reflexes and motor activity was studied in 900 centrifuged albino mice. Actometric studies showed that the tranquilizers, as a group, increase animal resistance to the action of adequate stimuli to the vestibular apparatus.

  12. 9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 9 Animals and Animal Products 1 2012-01-01 2012-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Attending...

  13. 9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 9 Animals and Animal Products 1 2014-01-01 2014-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Attending...

  14. 9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 9 Animals and Animal Products 1 2011-01-01 2011-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Attending...

  15. 9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 9 Animals and Animal Products 1 2013-01-01 2013-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Attending...

  16. The Relationship between Parental Involvement and Adequate Yearly Progress among Urban, Suburban, and Rural Schools

    ERIC Educational Resources Information Center

    Ma, Xin; Shen, Jianping; Krenn, Huilan Y.

    2014-01-01

    Using national data from the 2007-08 School and Staffing Survey, we compared the relationships between parental involvement and school outcomes related to adequate yearly progress (AYP) in urban, suburban, and rural schools. Parent-initiated parental involvement demonstrated significantly positive relationships with both making AYP and staying off…

  17. Which Food Security Determinants Predict Adequate Vegetable Consumption among Rural Western Australian Children?

    PubMed

    Godrich, Stephanie L; Lo, Johnny; Davies, Christina R; Darby, Jill; Devine, Amanda

    2017-01-03

    Improving the suboptimal vegetable consumption among the majority of Australian children is imperative in reducing chronic disease risk. The objective of this research was to determine whether there was a relationship between food security determinants (FSD) (i.e., food availability, access, and utilisation dimensions) and adequate vegetable consumption among children living in regional and remote Western Australia (WA). Caregiver-child dyads (n = 256) living in non-metropolitan/rural WA completed cross-sectional surveys that included questions on FSD, demographics and usual vegetable intake. A total of 187 dyads were included in analyses, which included descriptive and logistic regression analyses via IBM SPSS (version 23). A total of 13.4% of children in this sample had adequate vegetable intake. FSD that met inclusion criteria (p ≤ 0.20) for multivariable regression analyses included price; promotion; quality; location of food outlets; variety of vegetable types; financial resources; and transport to outlets. After adjustment for potential demographic confounders, the FSD that predicted adequate vegetable consumption were variety of vegetable types consumed (p = 0.007), promotion (p = 0.017), location of food outlets (p = 0.027), and price (p = 0.043). Food retail outlets should ensure that adequate varieties of vegetable types (i.e., fresh, frozen, tinned) are available, vegetable messages should be promoted through food retail outlets and in community settings, and towns should offer a range of vegetable purchasing options, increase their reliance on a local food supply, and expand transport options to enable affordable vegetable purchasing.
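    The two-stage modelling approach this abstract describes (univariable screening of candidate determinants at p ≤ 0.20, then a multivariable logistic model) can be sketched as follows. This is an illustrative reconstruction, not the authors' SPSS procedure; the determinant names and 2x2 counts are hypothetical.

```python
import math

def chi2_p_2x2(a, b, c, d):
    """Pearson chi-square p-value (1 df) for the 2x2 table
    [[a, b], [c, d]], e.g. determinant present/absent vs
    adequate/inadequate vegetable intake."""
    n = a + b + c + d
    denom = (a + b) * (c + d) * (a + c) * (b + d)
    if denom == 0:
        return 1.0
    chi2 = n * (a * d - b * c) ** 2 / denom
    # For 1 df, P(X > x) = erfc(sqrt(x / 2))
    return math.erfc(math.sqrt(chi2 / 2))

# Hypothetical counts per food-security determinant.
tables = {
    "variety": (30, 10, 10, 30),    # strongly associated
    "promotion": (25, 15, 15, 25),  # moderately associated
    "distance": (20, 20, 20, 20),   # no association
}

# Stage 1: keep determinants with univariable p <= 0.20; only these
# would enter the multivariable logistic regression in stage 2.
screened = {k: chi2_p_2x2(*t) for k, t in tables.items()}
candidates = [k for k, p in screened.items() if p <= 0.20]
print(candidates)  # -> ['variety', 'promotion']
```

    The deliberately loose screening threshold (0.20 rather than 0.05) is a common choice when the goal is to avoid prematurely discarding variables that only show their effect after adjustment.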

  18. 9 CFR 2.33 - Attending veterinarian and adequate veterinary care.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Research Facilities § 2.33 Attending veterinarian and adequate veterinary care. (a) Each research facility shall have an attending veterinarian who shall...

  19. 9 CFR 2.33 - Attending veterinarian and adequate veterinary care.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Research Facilities § 2.33 Attending veterinarian and adequate veterinary care. (a) Each research facility shall have an attending veterinarian who shall...

  20. Identifying the Factors Impacting the Adequately Yearly Progress Performance in the United States

    ERIC Educational Resources Information Center

    Hsieh, Ju-Shan

    2013-01-01

    The NCLB (No Child Left Behind Act) specifies that states must develop AYP (adequate yearly progress) statewide measurable objectives for improved achievement by all students, including economically disadvantaged students, students from minority races, students with disabilities, and students with limited English proficiency. By the 2013-2014…

  1. 42 CFR 438.207 - Assurances of adequate capacity and services.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Assurances of adequate capacity and services. 438.207 Section 438.207 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS MANAGED CARE Quality Assessment and...

  2. 36 CFR 13.960 - Who determines when there is adequate snow cover?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 36 Parks, Forests, and Public Property 1 2011-07-01 2011-07-01 false Who determines when there is adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR NATIONAL PARK SYSTEM UNITS IN ALASKA Special Regulations-Denali National Park...

  3. 36 CFR 13.960 - Who determines when there is adequate snow cover?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 36 Parks, Forests, and Public Property 1 2010-07-01 2010-07-01 false Who determines when there is adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR NATIONAL PARK SYSTEM UNITS IN ALASKA Special Regulations-Denali National Park...

  4. 76 FR 51041 - Hemoglobin Standards and Maintaining Adequate Iron Stores in Blood Donors; Public Workshop

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-17

    ... HUMAN SERVICES Food and Drug Administration Hemoglobin Standards and Maintaining Adequate Iron Stores in... workshop. The Food and Drug Administration (FDA) is announcing a public workshop entitled: ``Hemoglobin... discuss blood donor hemoglobin and hematocrit qualification standards in the United States, its impact...

  5. 21 CFR 514.117 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... adequate and well-controlled studies of a new animal drug is to distinguish the effect of the new animal... with one or more controls to provide a quantitative evaluation of drug effects. The protocol and the... for special circumstances. Examples include studies in which the effect of the new animal drug is...

  6. 21 CFR 514.117 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... adequate and well-controlled studies of a new animal drug is to distinguish the effect of the new animal... with one or more controls to provide a quantitative evaluation of drug effects. The protocol and the... for special circumstances. Examples include studies in which the effect of the new animal drug is...

  7. 21 CFR 514.117 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... adequate and well-controlled studies of a new animal drug is to distinguish the effect of the new animal... with one or more controls to provide a quantitative evaluation of drug effects. The protocol and the... for special circumstances. Examples include studies in which the effect of the new animal drug is...

  8. 21 CFR 514.117 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... adequate and well-controlled studies of a new animal drug is to distinguish the effect of the new animal... with one or more controls to provide a quantitative evaluation of drug effects. The protocol and the... for special circumstances. Examples include studies in which the effect of the new animal drug is...

  9. 21 CFR 514.117 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... adequate and well-controlled studies of a new animal drug is to distinguish the effect of the new animal... with one or more controls to provide a quantitative evaluation of drug effects. The protocol and the... for special circumstances. Examples include studies in which the effect of the new animal drug is...

  10. 30 CFR 227.801 - What if a State does not adequately perform a delegated function?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... delegated function? 227.801 Section 227.801 Mineral Resources MINERALS MANAGEMENT SERVICE, DEPARTMENT OF THE INTERIOR MINERALS REVENUE MANAGEMENT DELEGATION TO STATES Performance Review § 227.801 What if a State does not adequately perform a delegated function? If your performance of the delegated function does...

  11. Science Education as a Contributor to Adequate Yearly Progress and Accountability Programs

    ERIC Educational Resources Information Center

    Judson, Eugene

    2010-01-01

    The No Child Left Behind (NCLB) Act requires states to measure the adequate yearly progress (AYP) of each public school and local educational agency (LEA) and to hold schools and LEAs accountable for failing to make AYP. Although it is required that science be assessed in at least three grades, the achievement results from science examinations are…

  12. 42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Adequate financial records, statistical data, and... financial records, statistical data, and cost finding. (a) Maintenance of records. (1) An HMO or CMP must maintain sufficient financial records and statistical data for proper determination of costs payable by...

  13. Understanding the pelvic pain mechanism is key to find an adequate therapeutic approach.

    PubMed

    Van Kerrebroeck, Philip

    2016-06-25

    Pain is a natural mechanism to actual or potential tissue damage and involves both a sensory and an emotional experience. In chronic pelvic pain, localisation of pain can be widespread and can cause considerable distress. A multidisciplinary approach is needed in order to fully understand the pelvic pain mechanism and to identify an adequate therapeutic approach.

  14. 36 CFR 13.960 - Who determines when there is adequate snow cover?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 36 Parks, Forests, and Public Property 1 2012-07-01 2012-07-01 false Who determines when there is adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR NATIONAL PARK SYSTEM UNITS IN ALASKA Special Regulations-Denali National Park...

  15. 36 CFR 13.960 - Who determines when there is adequate snow cover?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 36 Parks, Forests, and Public Property 1 2014-07-01 2014-07-01 false Who determines when there is adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR NATIONAL PARK SYSTEM UNITS IN ALASKA Special Regulations-Denali National Park...

  16. 36 CFR 13.960 - Who determines when there is adequate snow cover?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 36 Parks, Forests, and Public Property 1 2013-07-01 2013-07-01 false Who determines when there is adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR NATIONAL PARK SYSTEM UNITS IN ALASKA Special Regulations-Denali National Park...

  17. Human milk feeding supports adequate growth in infants

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Despite current nutritional strategies, premature infants remain at high risk for extrauterine growth restriction. The use of an exclusive human milk-based diet is associated with decreased incidence of necrotizing enterocolitis (NEC), but concerns exist about infants achieving adequate growth. The ...

  18. Influenza 2005-2006: vaccine supplies adequate, but bird flu looms.

    PubMed

    Mossad, Sherif B

    2005-11-01

    Influenza vaccine supplies appear to be adequate for the 2005-2006 season, though delivery has been somewhat delayed. However, in the event of a pandemic of avian flu (considered inevitable by most experts, although no one knows when it will happen), the United States would be woefully unprepared.

  19. How Much and What Kind? Identifying an Adequate Technology Infrastructure for Early Childhood Education. Policy Brief

    ERIC Educational Resources Information Center

    Daugherty, Lindsay; Dossani, Rafiq; Johnson, Erin-Elizabeth; Wright, Cameron

    2014-01-01

    To realize the potential benefits of technology use in early childhood education (ECE), and to ensure that technology can help to address the digital divide, providers, families of young children, and young children themselves must have access to an adequate technology infrastructure. The goals for technology use in ECE that a technology…

  20. Adequate but not supplemental folic acid combined with soy isoflavones during early life improves bone health at adulthood in male mice.

    PubMed

    Kaludjerovic, Jovana; Ward, Wendy E

    2013-10-01

    Previous investigations from our laboratory have demonstrated that neonatal exposure to soy isoflavones (ISO) improves bone outcomes in CD-1 mice at adulthood with greater benefits in females than males. This study determined whether early-life exposure to supplemental folic acid (FA) - which may enhance DNA methylation of target genes - in combination with ISO provides greater benefits to male bone development than ISO alone. CD-1 dams were randomized to a low (0 mg/kg diet), adequate (2 mg/kg diet) or supplemental (8 mg/kg diet) level of FA during pregnancy and lactation. Offspring received corn oil or ISO (7 mg/kg of body weight per day) from postnatal day 1-10. From weaning, males were fed adequate FA and studied to age 4 months. Offspring exposed to adequate FA+ISO had multiple benefits to bone health: higher (P<.05) bone mineral density (BMD) and greater (P<.05) resistance to fracture at the femur and lumbar spine than mice exposed to adequate FA alone. Exposure to supplemental FA+ISO resulted in higher (P<.05) serum osteoprotegerin (OPG), and a higher ratio of OPG to receptor activator of nuclear factor κB ligand (RANKL), but did not result in greater BMD or strength at the femur or lumbar spine than supplemental FA alone. In conclusion, early-life exposure to adequate FA+ISO provided functional benefits to male bone development, while improvements induced by supplemental FA+ISO were limited to a higher level of serum OPG. Mechanistic studies are needed to better understand how FA and ISO improve bone development in male offspring.

  1. Comparison of the methods of fibrinolysis by tube thoracostomy and thoracoscopic decortication in children with stage II and III empyema: a prospective randomized study

    PubMed Central

    Cobanoglu, Ufuk; Sayir, Fuat; Bilici, Salim; Melek, Mehmet

    2011-01-01

    Today, in spite of developments in imaging methods and antibiotherapy, childhood pleural empyema remains a prominent cause of morbidity and mortality. In recent years, it has been shown that the frequency of pleural empyema in children has increased, and antibiotic resistance in the microorganisms causing pleural empyema has made treatment difficult. Despite the many studies investigating thoracoscopic debridement and fibrinolytic treatment separately in the management of this disease, there are not enough studies comparing these two treatments. The aim of this study was to prospectively compare the efficacy of two different treatment methods in stage II and III empyema cases and to present a perspective on treatment options. We excluded from the study cases in which: i) thoracoscopic intervention or fibrinolytic agents were contraindicated; ii) there was immunosuppression or an additional infection focus; iii) there were concomitant diseases; as well as those with bronchopleural fistula diagnosed radiologically, and Stage I cases. This gave a total of 54 cases: 23 (42.6%) in stage II and 31 (57.4%) in stage III. These patients were randomized into two groups of 27 cases each, receiving either fibrinolytic agent application or debridement by video-assisted thoracoscopic decortication (VATS). The continuity of symptoms after the operation, the duration of the thoracic tube in situ, and the length of hospital stay were significantly shorter in the VATS group than with the streptokinase applications (P=0.0001). In 19 of 27 cases (70.37%) in which fibrinolytic treatment was applied and in 21 of 27 cases (77.77%) in which VATS was applied, the lung was fully expanded and the procedure was considered successful. There was no significant difference in success rates between the two groups (P=0.533). The complication rate in our cases was 12.96% and no mortality was observed.
Similar success rates in thoracoscopic drainage and enzymatic debridement, together with the low cost of enzymatic drainage, support fibrinolytic treatment as a reasonable first-line option in stage II and III childhood empyema.

  2. Planning 4-Dimensional Computed Tomography (4DCT) Cannot Adequately Represent Daily Intrafractional Motion of Abdominal Tumors

    SciTech Connect

    Ge, Jiajia; Santanam, Lakshmi; Noel, Camille; Parikh, Parag J.

    2013-03-15

    Purpose: To evaluate whether planning 4-dimensional computed tomography (4DCT) can adequately represent daily motion of abdominal tumors in regularly fractionated and stereotactic body radiation therapy (SBRT) patients. Methods and Materials: Intrafractional tumor motion of 10 patients with abdominal tumors (4 pancreas-fractionated and 6 liver-stereotactic patients) with implanted fiducials was measured based on daily orthogonal fluoroscopic movies over 38 treatment fractions. The needed internal margin for at least 90% of tumor coverage was calculated based on a 95th and fifth percentile of daily 3-dimensional tumor motion. The planning internal margin was generated by fusing 4DCT motion from all phase bins. The disagreement between needed and planning internal margin was analyzed fraction by fraction in 3 motion axes (superior-inferior [SI], anterior-posterior [AP], and left-right [LR]). The 4DCT margin was considered as an overestimation/underestimation of daily motion when disagreement exceeded at least 3 mm in the SI axis and/or 1.2 mm in the AP and LR axes (4DCT image resolution). The underlying reasons for this disagreement were evaluated based on interfractional and intrafractional breathing variation. Results: The 4DCT overestimated daily 3-dimensional motion in 39% of the fractions in 7 of 10 patients and underestimated it in 53% of the fractions in 8 of 10 patients. Median underestimation was 3.9 mm, 3.0 mm, and 1.7 mm in the SI axis, AP axis, and LR axis, respectively. The 4DCT was found to capture irregular deep breaths in 3 of 10 patients, with 4DCT motion larger than mean daily amplitude by 18 to 21 mm. The breathing pattern varied from breath to breath and day to day. The interfractional variation of amplitude was significantly larger than the intrafractional variation (2.7 mm vs 1.3 mm) in the primary motion axis (ie, SI axis). The SBRT patients showed significantly larger intrafractional amplitude variation than fractionated patients (3.0 mm vs 2
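    The margin logic this abstract describes (a needed internal margin from the 5th-95th percentile span of daily 3-D motion, compared against the 4DCT-derived planning margin with per-axis agreement thresholds of 3 mm SI and 1.2 mm AP/LR) can be sketched as below. The motion samples and the planned margin are hypothetical values, not data from the study.

```python
def percentile(samples, q):
    """Linear-interpolation percentile (q in [0, 100]) of a list."""
    xs = sorted(samples)
    k = (len(xs) - 1) * q / 100.0
    lo, hi = int(k), min(int(k) + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (k - lo)

def needed_margin(daily_motion_mm):
    """Internal margin covering the 5th-95th percentile of
    daily tumor displacement along one axis."""
    return percentile(daily_motion_mm, 95) - percentile(daily_motion_mm, 5)

def classify(planned_mm, needed_mm, threshold_mm):
    """Flag 4DCT over-/underestimation only when the disagreement
    exceeds the image-resolution threshold (3 mm SI, 1.2 mm AP/LR)."""
    diff = planned_mm - needed_mm
    if diff > threshold_mm:
        return "overestimate"
    if diff < -threshold_mm:
        return "underestimate"
    return "agreement"

# Hypothetical daily SI displacements (mm) from fiducial tracking.
si_motion = [0.0, 2.0, 4.0, 6.0, 8.0, 10.0, 12.0, 14.0, 16.0, 18.0, 20.0]
m = needed_margin(si_motion)
print(m, classify(12.0, m, 3.0))  # planned 12 mm vs needed 18 mm
```

    With these made-up samples the needed SI margin is 18 mm, so a 12 mm planning margin would be flagged as an underestimate, which is the fraction-by-fraction comparison the study performs.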

  3. A Prospective Randomized Study Comparing Mini-surgical Percutaneous Dilatational Tracheostomy With Surgical and Classical Percutaneous Tracheostomy: A New Method Beyond Contraindications.

    PubMed

    Hashemian, Seyed Mohammad-Reza; Digaleh, Hadi

    2015-11-01

    Although percutaneous dilatational tracheostomy (PDT) is more accessible and less time-demanding compared with surgical tracheostomy (ST), it has its own limitations. We introduced a modified PDT technique and brought some surgical knowledge to the bedside to overcome some relative contraindications of standard percutaneous dilatational tracheostomy. PDT uses a blind route of tracheal access that usually requires perioperational imaging guidance to protect against accidental injuries. Moreover, there are contraindications in certain cases, limiting widespread PDT application. Different PDT modifications and devices have been presented to address the problem; however, these approaches are not generally popular among professionals due to limited accessibility and/or other reasons. We prospectively analyzed this double-blinded trial (with the patient and head nurse evaluating the complications) and collected data from 360 patients who underwent PDT, ST, or our modified mini-surgical PDT (msPDT, the Hashemian method). These patients were divided into two groups (PDT-indicated and PDT-contraindicated); randomization assigned msPDT or PDT within the PDT-indicated group and msPDT or ST within the PDT-contraindicated group. The cases were compared in terms of pre- and postoperational complications. Data analysis demonstrated that the mean procedural time was significantly lower in the msPDT group than in either the standard PDT or the ST group. Paratracheal insertion, intraprocedural hypoxemia, and bleeding were also significantly lower in the msPDT group compared with the standard PDT group. Other complications were not significantly different between msPDT and ST patients. The introduced msPDT uses a semi-open incision, rather than the blinded PDT route of tracheal access, which allowed the proceduralist to withdraw bronchoscopy and reduced the total time of the procedure. Interestingly, the most important improvement was performing msPDT on PDT-contraindicated patients with a complication rate comparable to

  4. Treatment Outcomes of Corticosteroid Injection and Extracorporeal Shock Wave Therapy as Two Primary Therapeutic Methods for Acute Plantar Fasciitis: A Prospective Randomized Clinical Trial.

    PubMed

    Mardani-Kivi, Mohsen; Karimi Mobarakeh, Mahmoud; Hassanzadeh, Zabihallah; Mirbolook, Ahmadreza; Asadi, Kamran; Ettehad, Hossein; Hashemi-Motlagh, Keyvan; Saheb-Ekhtiari, Khashayar; Fallah-Alipour, Keyvan

    2015-01-01

    The outcome of corticosteroid injection (CSI) and extracorporeal shock wave therapy (ESWT) as primary treatment of acute plantar fasciitis has been debated. The purpose of the present study was to evaluate and compare the therapeutic effects of CSI and ESWT in patients with acute (<6-week duration) symptomatic plantar fasciitis. Of the 116 eligible patients, 68 were randomized to 2 equal groups of 34 patients, each undergoing either ESWT or CSI. The ESWT method included 2000 impulses with energy of 0.15 mJ/mm(2) and a total energy flux density of 900 mJ/mm(2) for 3 consecutive sessions at 1-week intervals. In the CSI group, 40 mg of methyl prednisolone acetate plus 1 mL of lidocaine 2% was injected into the maximal tenderness point at the inframedial calcaneal tuberosity. The success and recurrence rates and pain intensity measured using the visual analog scale, were recorded and compared at the 3-month follow-up visit. The pain intensity had reduced significantly in all patients undergoing either technique. However, the value and trend of pain reduction in the CSI group was significantly greater than those in the ESWT group (p < .0001). In the ESWT and CSI groups, 19 (55.9%) and 5 (14.7%) patients experienced treatment failure, respectively. Age, gender, body mass index, and recurrence rate were similar between the 2 groups (p > .05). Both ESWT and CSI can be used as the primary and/or initial treatment option for treating patients with acute plantar fasciitis; however, the CSI technique had better therapeutic outcomes.
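    The ESWT dosing arithmetic in the protocol above checks out: 2000 impulses at 0.15 mJ/mm² per session, over 3 weekly sessions, yields the stated cumulative energy flux density of 900 mJ/mm². A quick verification:

```python
# ESWT protocol parameters as stated in the abstract.
impulses_per_session = 2000
energy_per_impulse_mj_mm2 = 0.15  # energy flux density per impulse
sessions = 3

per_session = impulses_per_session * energy_per_impulse_mj_mm2
total = per_session * sessions
print(per_session, total)  # 300.0 mJ/mm^2 per session, 900.0 cumulative
```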

  5. Assessment of adequate quality and collocation of reference measurements with space borne hyperspectral infrared instruments to validate retrievals of temperature and water vapour

    NASA Astrophysics Data System (ADS)

    Calbet, X.

    2015-06-01

    A method is presented to assess whether a given reference ground-based point observation, typically a radiosonde measurement, is adequately collocated with and sufficiently representative of spaceborne hyperspectral infrared instrument measurements. Once this assessment is made, the ground-based data can be used to validate, and potentially calibrate with a high degree of accuracy, the hyperspectral retrievals of temperature and water vapour.

  6. The concept of adequate causation and Max Weber's comparative sociology of religion.

    PubMed

    Buss, A

    1999-06-01

    Max Weber's The Protestant Ethic and the Spirit of Capitalism, studied in isolation, shows mainly an elective affinity, or an adequacy on the level of meaning, between the Protestant ethic and the 'spirit' of capitalism. Here it is suggested that Weber's subsequent essays on 'The Economic Ethics of World Religions' are the result of his opinion that adequacy on the level of meaning needs to be, and can be, verified by causal adequacy. After some introductory remarks, particularly on elective affinity, the paper tries to develop the concept of adequate causation and the related concept of objective possibility on the basis of the work of von Kries, on whom Weber heavily relied. In the second part, this concept is used to show how the study of the economic ethics of India, China, Rome and orthodox Russia can support the thesis that the 'spirit' of capitalism, although it may not have been caused by the Protestant ethic, was perhaps adequately caused by it.

  7. Ensuring smokers are adequately informed: reflections on consumer rights, manufacturer responsibilities, and policy implications

    PubMed Central

    Chapman, S; Liberman, J

    2005-01-01

    The right to information is a fundamental consumer value. Following the advent of health warnings, the tobacco industry has repeatedly asserted that smokers are fully informed of the risks they take, while evidence demonstrates widespread superficial levels of awareness and understanding. There remains much that tobacco companies could do to fulfil their responsibilities to inform smokers. We explore issues involved in the meaning of "adequately informed" smoking and discuss some of the key policy and regulatory implications. We use the idea of a smoker licensing scheme—under which it would be illegal to sell to smokers who had not demonstrated an adequate level of awareness—as a device to explore some of these issues. We also explore some of the difficulties that addiction poses for the notion that smokers might ever voluntarily assume the risks of smoking. PMID:16046703

  8. Myth 19: Is Advanced Placement an Adequate Program for Gifted Students?

    ERIC Educational Resources Information Center

    Gallagher, Shelagh A.

    2009-01-01

    Is it a myth that Advanced Placement (AP) is an adequate program for gifted students? AP is so covered with myths and assumptions that it is hard to get a clear view of the issues. In this article, the author finds the answer about AP by looking at current realties. First, AP is hard for gifted students to avoid. Second, AP never was a program…

  9. Which Food Security Determinants Predict Adequate Vegetable Consumption among Rural Western Australian Children?

    PubMed Central

    Godrich, Stephanie L.; Lo, Johnny; Davies, Christina R.; Darby, Jill; Devine, Amanda

    2017-01-01

    Improving the suboptimal vegetable consumption among the majority of Australian children is imperative in reducing chronic disease risk. The objective of this research was to determine whether there was a relationship between food security determinants (FSD) (i.e., food availability, access, and utilisation dimensions) and adequate vegetable consumption among children living in regional and remote Western Australia (WA). Caregiver-child dyads (n = 256) living in non-metropolitan/rural WA completed cross-sectional surveys that included questions on FSD, demographics and usual vegetable intake. A total of 187 dyads were included in analyses, which included descriptive and logistic regression analyses via IBM SPSS (version 23). A total of 13.4% of children in this sample had adequate vegetable intake. FSD that met inclusion criteria (p ≤ 0.20) for multivariable regression analyses included price; promotion; quality; location of food outlets; variety of vegetable types; financial resources; and transport to outlets. After adjustment for potential demographic confounders, the FSD that predicted adequate vegetable consumption were variety of vegetable types consumed (p = 0.007), promotion (p = 0.017), location of food outlets (p = 0.027), and price (p = 0.043). Food retail outlets should ensure that adequate varieties of vegetable types (i.e., fresh, frozen, tinned) are available, vegetable messages should be promoted through food retail outlets and in community settings, and towns should offer a range of vegetable purchasing options, increase their reliance on a local food supply, and expand transport options to enable affordable vegetable purchasing. PMID:28054955

  10. Global risk assessment of aflatoxins in maize and peanuts: are regulatory standards adequately protective?

    PubMed

    Wu, Felicia; Stacy, Shaina L; Kensler, Thomas W

    2013-09-01

    The aflatoxins are a group of fungal metabolites that contaminate a variety of staple crops, including maize and peanuts, and cause an array of acute and chronic human health effects. Aflatoxin B1 in particular is a potent liver carcinogen, and hepatocellular carcinoma (HCC) risk is multiplicatively higher for individuals exposed to both aflatoxin and chronic infection with hepatitis B virus (HBV). In this work, we sought to answer the question: do current aflatoxin regulatory standards around the world adequately protect human health? Depending upon the level of protection desired, the answer to this question varies. Currently, most nations have a maximum tolerable level of total aflatoxins in maize and peanuts ranging from 4 to 20 ng/g. If the level of protection desired is that aflatoxin exposures would not increase lifetime HCC risk by more than 1 in 100,000 cases in the population, then most current regulatory standards are not adequately protective even if enforced, especially in low-income countries where large amounts of maize and peanuts are consumed and HBV prevalence is high. At the protection level of 1 in 10,000 lifetime HCC cases in the population, however, almost all aflatoxin regulations worldwide are adequately protective, with the exception of several nations in Africa and Latin America.
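    A back-of-the-envelope version of the risk comparison this abstract describes can be sketched as follows. The cancer-potency figures (roughly 0.3 and 0.01 cancers per year per 100,000 people per ng aflatoxin/kg body weight/day for HBV-positive and HBV-negative individuals) follow commonly cited JECFA-style estimates; the consumption amounts, body weights, HBV prevalences, and 70-year duration are illustrative assumptions, not values taken from the paper.

```python
def lifetime_hcc_risk(aflatoxin_ng_g, intake_g_day, body_wt_kg,
                      hbv_prevalence, years=70):
    """Approximate per-person lifetime HCC risk from dietary aflatoxin.

    Potencies (cancers/year per 100,000 people per ng aflatoxin per kg
    body weight per day) are assumed, JECFA-style values.
    """
    potency_hbv_pos = 0.3
    potency_hbv_neg = 0.01
    exposure = aflatoxin_ng_g * intake_g_day / body_wt_kg  # ng/kg bw/day
    potency = (hbv_prevalence * potency_hbv_pos
               + (1 - hbv_prevalence) * potency_hbv_neg)
    return exposure * potency * years / 100_000

# Maize at the 20 ng/g limit, high consumption (400 g/day, 60 kg),
# 10% HBV prevalence -- a low-income, maize-staple scenario.
risk_high = lifetime_hcc_risk(20, 400, 60, 0.10)
# Low consumption (20 g/day, 70 kg), 2% HBV prevalence.
risk_low = lifetime_hcc_risk(20, 20, 70, 0.02)
print(risk_high, risk_low)
```

    Under these assumptions the high-consumption scenario exceeds even a 1-in-10,000 lifetime-risk benchmark, while the low-consumption scenario passes 1-in-10,000 but still fails 1-in-100,000, mirroring the paper's conclusion that the adequacy of a standard depends on the protection level chosen.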

  11. Global Risk Assessment of Aflatoxins in Maize and Peanuts: Are Regulatory Standards Adequately Protective?

    PubMed Central

    Wu, Felicia

    2013-01-01

    The aflatoxins are a group of fungal metabolites that contaminate a variety of staple crops, including maize and peanuts, and cause an array of acute and chronic human health effects. Aflatoxin B1 in particular is a potent liver carcinogen, and hepatocellular carcinoma (HCC) risk is multiplicatively higher for individuals exposed to both aflatoxin and chronic infection with hepatitis B virus (HBV). In this work, we sought to answer the question: do current aflatoxin regulatory standards around the world adequately protect human health? Depending upon the level of protection desired, the answer to this question varies. Currently, most nations have a maximum tolerable level of total aflatoxins in maize and peanuts ranging from 4 to 20 ng/g. If the level of protection desired is that aflatoxin exposures would not increase lifetime HCC risk by more than 1 in 100,000 cases in the population, then most current regulatory standards are not adequately protective even if enforced, especially in low-income countries where large amounts of maize and peanuts are consumed and HBV prevalence is high. At the protection level of 1 in 10,000 lifetime HCC cases in the population, however, almost all aflatoxin regulations worldwide are adequately protective, with the exception of several nations in Africa and Latin America. PMID:23761295

  12. Current strategies for the restoration of adequate lordosis during lumbar fusion.

    PubMed

    Barrey, Cédric; Darnis, Alice

    2015-01-18

    Failure to restore adequate lumbar lordosis during lumbar fusion surgery may result in mechanical low back pain, sagittal imbalance and adjacent segment degeneration. The objective of this work is to describe the current strategies and concepts for restoration of adequate lordosis during fusion surgery. Theoretical lordosis can be evaluated from the measurement of the pelvic incidence and from the analysis of the spatial organization of the lumbar spine, with 2/3 of the lordosis given by the L4-S1 segment and 85% by the L3-S1 segment. Technical aspects involve patient positioning on the operating table, release maneuvers, the type of instrumentation used (rod, screw-rod connection, interbody cages), the surgical sequence and the overall surgical strategy. Spinal osteotomies may be required in cases of a fixed kyphotic spine. Combined AP surgery is particularly efficient in restoring lordosis at the L5-S1 level and should be recommended. Finally, not one but several strategies may be needed to restore adequate lordosis during fusion surgery.
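    As a rough illustration of the planning arithmetic mentioned in the abstract, the sketch below estimates a theoretical lordosis from pelvic incidence and splits it segmentally using the quoted distribution (2/3 at L4-S1, 85% at L3-S1). The LL ≈ PI + 9° rule is an assumption borrowed from a commonly cited approximation, not necessarily the formula the authors use.

```python
# Hedged sketch of preoperative lordosis planning. The PI + 9 rule is an
# assumed, commonly cited approximation; the segmental split follows the
# distribution quoted in the abstract.

def target_lordosis(pelvic_incidence_deg):
    """Approximate theoretical total lumbar lordosis (degrees)."""
    return pelvic_incidence_deg + 9.0

def segmental_targets(total_lordosis_deg):
    """Distribute total lordosis: 2/3 at L4-S1, 85% at L3-S1."""
    return {
        "L4-S1": round(total_lordosis_deg * (2.0 / 3.0), 1),
        "L3-S1": round(total_lordosis_deg * 0.85, 1),
    }

ll = target_lordosis(52.0)   # hypothetical patient with PI = 52 degrees
print(ll)                    # 61.0
print(segmental_targets(ll))
```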

  13. Oil & gas in the 1990s and beyond: Adequate supplies, growing demand, flat prices

    SciTech Connect

    Kennedy, J.L.

    1995-06-01

    Long-term petroleum market fundamentals are clear: supplies are adequate and world demand will continue to grow steadily. Adequate supplies ensure that prices will not increase significantly, on average, through the end of the 1990s and probably much beyond. Despite plentiful supply and modest price increases, there will be peaks and valleys in the price graph as productive capacity is used up, then expanded. Tens of billions of dollars will be needed over the next decade to expand producing capacity. World oil consumption will increase at about 1.5% per year, at least for the next decade. Demand in Asia and Latin America will grow several times faster than this average world rate. World natural gas demand will grow at more than 2% per year well past 2000. Oil and gas companies around the world have changed the way they operate to survive the market realities of the 1990s. Restructuring, outsourcing, and partnering will continue as increasing costs and flat prices squeeze profits. Energy use patterns will change. Fuel and other product specifications will change. Market shares of oil and gas will shift. But opportunities abound in this new market environment. Growing markets always provide opportunities. Technology has helped operators dramatically lower finding, developing, and producing costs. The petroleum age is far from over. Growing markets, adequate supply, affordable products, and a 60% market share: those are the signs of an industry with a bright future.

  14. Current strategies for the restoration of adequate lordosis during lumbar fusion

    PubMed Central

    Barrey, Cédric; Darnis, Alice

    2015-01-01

    Failure to restore adequate lumbar lordosis during lumbar fusion surgery may result in mechanical low back pain, sagittal imbalance and adjacent segment degeneration. The objective of this work is to describe the current strategies and concepts for restoration of adequate lordosis during fusion surgery. Theoretical lordosis can be evaluated from the measurement of the pelvic incidence and from the analysis of the spatial organization of the lumbar spine, with 2/3 of the lordosis given by the L4-S1 segment and 85% by the L3-S1 segment. Technical aspects involve patient positioning on the operating table, release maneuvers, the type of instrumentation used (rod, screw-rod connection, interbody cages), the surgical sequence and the overall surgical strategy. Spinal osteotomies may be required in cases of a fixed kyphotic spine. Combined AP surgery is particularly efficient in restoring lordosis at the L5-S1 level and should be recommended. Finally, not one but several strategies may be needed to restore adequate lordosis during fusion surgery. PMID:25621216

  15. A test for adequate wastewater treatment based on glutathione S transferase isoenzyme profile.

    PubMed

    Grammou, A; Samaras, P; Papadimitriou, C; Papadopoulos, A I

    2013-04-01

    Discharge of treated or untreated municipal wastewater to the environment imposes several threats on coastal and estuarine ecosystems that are difficult to assess. In our study we evaluate the use of the isoenzyme profile of glutathione S transferase (GST), in combination with the kinetic characteristics of the whole enzyme and of heme peroxidase, as a test of adequate treatment of municipal wastewater. For this purpose, Artemia nauplii were incubated in artificial seawater prepared from wastewater samples, such as secondary municipal effluents produced by a conventional activated sludge unit and advanced treated effluents produced by coagulation, activated carbon adsorption and chlorination, applied as single or combined processes. Characteristic changes of the isoenzyme pattern and the enzymes' kinetic properties were caused by chlorinated secondary municipal effluent or by secondary non-chlorinated effluent. Advanced treatment by a combination of coagulation and/or carbon adsorption resulted in less prominent changes, suggesting more adequate treatment. Our results suggest that the GST isoenzyme profile, in combination with the kinetic properties of the total enzyme family, is a sensitive test for evaluating the adequacy of treatment of reclaimed wastewater and the reduction of potentially harmful compounds. Potentially, it may offer a 'fingerprint' characteristic of a particular effluent, and probably of the treatment level it has been subjected to.

  16. Are the current Australian sun exposure guidelines effective in maintaining adequate levels of 25-hydroxyvitamin D?

    PubMed

    Kimlin, Michael; Sun, Jiandong; Sinclair, Craig; Heward, Sue; Hill, Jane; Dunstone, Kimberley; Brodie, Alison

    2016-01-01

    An adequate vitamin D status, as measured by serum 25-hydroxyvitamin D (25(OH)D) concentration, is important in humans for maintenance of healthy bones and muscle function. Serum 25(OH)D concentration was assessed in participants from Melbourne, Australia (37.81°S, 144.96°E), who were provided with the current Australian guidelines on sun exposure for 25(OH)D adequacy (25(OH)D ≥50 nmol/L). Participants were interviewed in February (summer, n=104) and August (winter, n=99) of 2013. Serum 25(OH)D concentration was examined as a function of measures of sun exposure and sun protection habits, with control of key characteristics such as dietary intake of vitamin D, body mass index (BMI) and skin colour that may modify this relationship. The mean 25(OH)D concentration in participants who complied with the current sun exposure guidelines was 67.3 nmol/L in summer and 41.9 nmol/L in winter. At the end of the study, 69.3% of participants who complied with the summer sun exposure guidelines were 25(OH)D adequate, while only 27.6% of those who complied with the winter guidelines were. The results suggest that the current Australian guidelines for sun exposure for 25(OH)D adequacy are effective for most people in summer and ineffective for most in winter. This article is part of a Special Issue entitled '17th Vitamin D Workshop'.
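    A minimal sketch of the adequacy bookkeeping implied by the abstract, assuming only the stated ≥50 nmol/L cut-off; the cohort values below are invented for illustration and are not the study's data.

```python
# A participant is 25(OH)D "adequate" at or above the abstract's stated
# 50 nmol/L cut-off; compute the share of a cohort meeting it.

ADEQUACY_THRESHOLD_NMOL_L = 50.0

def percent_adequate(serum_25ohd_nmol_l):
    """Share of participants at or above the adequacy threshold, in percent."""
    adequate = sum(1 for v in serum_25ohd_nmol_l
                   if v >= ADEQUACY_THRESHOLD_NMOL_L)
    return 100.0 * adequate / len(serum_25ohd_nmol_l)

summer = [67.3, 55.0, 72.1, 48.9]   # hypothetical summer cohort
winter = [41.9, 52.3, 38.7, 44.0]   # hypothetical winter cohort
print(percent_adequate(summer))     # 75.0
print(percent_adequate(winter))     # 25.0
```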

  17. Adequate Iodine Status in New Zealand School Children Post-Fortification of Bread with Iodised Salt

    PubMed Central

    Jones, Emma; McLean, Rachael; Davies, Briar; Hawkins, Rochelle; Meiklejohn, Eva; Ma, Zheng Feei; Skeaff, Sheila

    2016-01-01

    Iodine deficiency re-emerged in New Zealand in the 1990s, prompting the mandatory fortification of bread with iodised salt from 2009. This study aimed to determine the iodine status of New Zealand children when the fortification of bread was well established. A cross-sectional survey of children aged 8–10 years was conducted in the cities of Auckland and Christchurch, New Zealand, from March to May 2015. Children provided a spot urine sample for the determination of urinary iodine concentration (UIC), a finger-prick blood sample for thyroglobulin (Tg) concentration, and completed a questionnaire ascertaining socio-demographic information that also included an iodine-specific food frequency questionnaire (FFQ). The FFQ was used to estimate iodine intake from all main food sources including bread and iodised salt. The median UIC for all children (n = 415) was 116 μg/L (females 106 μg/L, males 131 μg/L), indicative of adequate iodine status according to World Health Organisation (WHO) criteria (i.e., a median UIC of 100–199 μg/L). The median Tg concentration was 8.7 μg/L, which was <10 μg/L, confirming adequate iodine status. There was a significant difference in UIC by sex (p = 0.001) and ethnicity (p = 0.006). The mean iodine intake from the food-only model was 65 μg/day. Bread contributed 51% of total iodine intake in the food-only model, providing a mean iodine intake of 35 μg/day. The mean iodine intake from the food-plus-iodised salt model was 101 μg/day. In conclusion, the results of this study confirm that the iodine status in New Zealand school children is now adequate. PMID:27196925
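    The WHO median-UIC bands referenced in the abstract can be expressed as a small classifier. The cut-offs below follow the published WHO epidemiological criteria for school-age children; the sample values are illustrative only.

```python
# Classify population iodine status from spot-urine UIC values using the
# WHO median-UIC cut-offs for school-age children (ug/L).
import statistics

def classify_median_uic(uic_samples_ug_l):
    """Return the WHO iodine-status category for a set of UIC samples."""
    median = statistics.median(uic_samples_ug_l)
    if median < 20:
        return "severe deficiency"
    if median < 50:
        return "moderate deficiency"
    if median < 100:
        return "mild deficiency"
    if median < 200:
        return "adequate"
    if median < 300:
        return "above requirements"
    return "excessive"

# The study's median of 116 ug/L falls in the adequate band:
print(classify_median_uic([106, 116, 131]))  # adequate
```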

  18. Nebulized antibiotics. An adequate option for treating ventilator-associated respiratory infection?

    PubMed

    Rodríguez, A; Barcenilla, F

    2015-03-01

    Ventilator-associated tracheobronchitis (VAT) is a frequent complication in critical patients. Ninety percent of those who develop it receive broad-spectrum antibiotic (ATB) treatment, without any strong evidence of a favorable impact. The use of nebulized ATB could be a valid treatment option to reduce the use of systemic ATB and the selection pressure on the local flora. Several studies suggest that an adequate nebulization technique can ensure high ATB levels even in areas of lung consolidation, and can achieve clinical and microbiological cure. New studies are needed to properly assess the impact of treatment with nebulized ATB on the emergence of resistance.

  19. Computer synthesis of human motion as a part of an adequate motion analysis experiment

    NASA Astrophysics Data System (ADS)

    Ivanov, Alexandre A.; Sholukha, Victor A.; Zinkovsky, Anatoly V.

    1999-05-01

    The role of computer synthesis of human motion in the traditional problem of determining generalized control and muscular forces is discussed. The significance of the choice of computer model for adequate analysis of kinematic and dynamic experimental data is emphasized. The influence of the model's parameter values is demonstrated on the basis of an imitation computer model. With the help of non-stationary constraints, human motions can be simulated that satisfy the most significant parameters of the class of motion concerned. Some simulation results are discussed. We conclude that, for correct interpretation of an experiment, the mixed problem of the dynamics of a system of bodies must be solved.

  20. Design and Methods of “diaBEAT-it!”: A Hybrid Preference/Randomized Control Trial Design using the RE-AIM Framework

    PubMed Central

    Almeida, Fabio A.; Pardo, Kimberlee A.; Seidel, Richard W.; Davy, Brenda M.; You, Wen; Wall, Sarah S.; Smith, Erin; Greenawald, Mark H.; Estabrooks, Paul A.

    2014-01-01

    Diabetes prevention is a public health priority that is dependent upon the reach, effectiveness, and cost of intervention strategies. However, understanding each of these outcomes within the context of randomized controlled trials is problematic. This study uses a unique hybrid design that allows an assessment of reach by providing participants choice between interventions and an assessment of effectiveness and cost using a standard randomized controlled trial (RCT). The trial, which was developed using the RE-AIM framework, will contrast the effects of 3 interventions: (1) a standard care, small group, diabetes prevention education class (SG), (2) the small group intervention plus 12 months of interactive voice response telephone follow-up (SG-IVR), and (3) a DVD version of the small group intervention with the same IVR follow-up (DVD-IVR). Each intervention includes personal action planning with a focus on key elements of the lifestyle intervention from the Diabetes Prevention Program (DPP). Adult patients at risk for diabetes will be randomly assigned to either choice or RCT. Those assigned to choice (n=240) will have the opportunity to choose between SG-IVR and DVD-IVR. Those assigned to the RCT group (n=360) will be randomly assigned to SG, SG-IVR, or DVD-IVR. Assessment of primary (weight loss, reach, and cost) and secondary (physical activity and dietary intake) outcomes will occur at baseline, 6, 12, and 18 months. This will be the first diabetes prevention trial that will allow the research team to determine the relationships between reach, effectiveness, and cost of different interventions. PMID:24956325
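    The two-stage hybrid allocation described above (a preference track alongside a standard RCT track) can be sketched as follows. This is purely illustrative, not the trial's actual randomization code; the 240:360 weighting simply mirrors the planned group sizes.

```python
# Two-stage allocation sketch: randomize participants to a "choice" or
# "rct" track, then honor their preference (choice track) or randomize
# across all three arms (RCT track).
import random

ARMS_CHOICE = ["SG-IVR", "DVD-IVR"]
ARMS_RCT = ["SG", "SG-IVR", "DVD-IVR"]

def allocate(preference, rng):
    """Return (track, arm) for one participant; `preference` is used
    only if the participant lands in the choice track."""
    track = rng.choices(["choice", "rct"], weights=[240, 360])[0]
    if track == "choice":
        assert preference in ARMS_CHOICE
        return track, preference
    return track, rng.choice(ARMS_RCT)

rng = random.Random(42)  # seeded for reproducibility
cohort = [allocate("DVD-IVR", rng) for _ in range(10)]
print(cohort)
```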