Sample records for adequate randomization methods

  1. Adequate margins for random setup uncertainties in head-and-neck IMRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Astreinidou, Eleftheria; Bel, Arjan; Raaijmakers, Cornelis P.J.

    2005-03-01

    Purpose: To investigate the effect of random setup uncertainties on the highly conformal dose distributions produced by intensity-modulated radiotherapy (IMRT) for clinical head-and-neck cancer patients and to determine adequate margins to account for those uncertainties. Methods and materials: We have implemented in our clinical treatment planning system the possibility of simulating normally distributed patient setup displacements, translations, and rotations. The planning CT data of 8 patients with Stage T1-T3N0M0 oropharyngeal cancer were used. The clinical target volumes of the primary tumor (CTV_primary) and of the lymph nodes (CTV_elective) were expanded by 0.0, 1.5, 3.0, and 5.0 mm in all directions, creating the planning target volumes (PTVs). We performed IMRT dose calculation using our class solution for each PTV margin, resulting in the conventional static plans. Then, the system recalculated the plan for each positioning displacement derived from a normal distribution with σ = 2 mm and σ = 4 mm (standard deviation) for translational deviations and σ = 1° for rotational deviations. The dose distributions of the 30 fractions were summed, resulting in the actual plan. The CTV dose coverage of the actual plans was compared with that of the static plans. Results: Random translational deviations of σ = 2 mm and rotational deviations of σ = 1° did not affect the CTV_primary volume receiving 95% of the prescribed dose (V95) regardless of the PTV margin used. A V95 reduction of 3% and 1% for a 0.0-mm and 1.5-mm PTV margin, respectively, was observed for σ = 4 mm. The V95 of the contralateral CTV_elective was approximately 1% and 5% lower than that of the static plan for σ = 2 mm and σ = 4 mm, respectively, and for PTV margins < 5.0 mm. An additional reduction of 1% was observed when rotational deviations were included. The same effect was observed for the CTV
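    The per-fraction simulation described in this abstract can be sketched numerically. This is a minimal illustration, not the authors' planning-system implementation: σ = 2 mm and 30 fractions come from the abstract, while the seed and the translation-only model are assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)  # fixed seed for reproducibility (illustrative)

n_fractions = 30   # fractions summed in the study
sigma_mm = 2.0     # per-fraction translational setup SD from the abstract

# One normally distributed 3-D setup displacement per fraction
shifts = rng.normal(0.0, sigma_mm, size=(n_fractions, 3))

# Over a full course, random errors largely average out: the SD of the
# mean displacement is sigma / sqrt(n_fractions), about 0.37 mm here
mean_shift = shifts.mean(axis=0)
print(np.abs(mean_shift).max())
```

    This averaging is why random (as opposed to systematic) setup errors mainly blur the cumulative dose distribution rather than shift it, which is consistent with the finding that σ = 2 mm barely affects CTV coverage.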

  2. On the Choice of Adequate Randomization Ranges for Limiting the Use of Unwanted Cues in Same-Different, Dual-Pair, and Oddity Tasks

    PubMed Central

    Dai, Huanping; Micheyl, Christophe

    2010-01-01

    A major concern when designing a psychophysical experiment is that participants may base their judgments on a stimulus feature (“cue”) other than the one intended by the experimenter. One way to avoid this involves applying random variations to the corresponding feature across stimulus presentations, making the “unwanted” cue unreliable. An important question facing experimenters who use this randomization (“roving”) technique is: how large should the randomization range be to ensure that participants cannot achieve a certain proportion correct (PC) by using the unwanted cue, while at the same time avoiding unnecessary interference of the randomization with task performance? Previous publications have provided formulas for selecting adequate randomization ranges in yes-no and multiple-alternative forced-choice tasks. In this article, we provide figures and tables that can be used to select randomization ranges better suited to experiments involving a same-different, dual-pair, or oddity task. PMID:20139466
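    The trade-off the abstract describes can be illustrated with a small simulation: in a yes-no task with a uniform rove, an observer relying only on the unwanted cue achieves PC = 0.5 + Δ/(2R), so widening the roving range R drives the cue-based PC toward chance. All numbers here (Δ = 1 dB cue, R = 20 dB rove) are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
delta = 1.0   # level cue added by the signal, in dB (illustrative)
rove = 20.0   # roving range, in dB (illustrative)

n = 200_000
signal_present = rng.integers(0, 2, n).astype(bool)
# Every presentation gets a uniform random level offset; signals add delta on top
level = rng.uniform(0.0, rove, n) + np.where(signal_present, delta, 0.0)

# Observer that uses ONLY the unwanted level cue: respond "signal" above midpoint
respond_signal = level > (rove + delta) / 2.0
pc = np.mean(respond_signal == signal_present)
print(pc)  # close to 0.5 + delta / (2 * rove) = 0.525
```

    For same-different, dual-pair, or oddity tasks the exact relationship differs (which is the point of the paper's tables), but the qualitative effect of the roving range is the same.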

  3. Enrichment methods provide a feasible approach to comprehensive and adequately powered investigations of the brain methylome

    PubMed Central

    Chan, Robin F.; Shabalin, Andrey A.; Xie, Lin Y.; Adkins, Daniel E.; Zhao, Min; Turecki, Gustavo; Clark, Shaunna L.; Aberg, Karolina A.

    2017-01-01

    Methylome-wide association studies are typically performed using microarray technologies that assay only a very small fraction of the CG methylome and entirely miss two forms of methylation that are common in brain and likely of particular relevance for neuroscience and psychiatric disorders. The alternative is whole-genome bisulfite (WGB) sequencing, but this approach is not yet practically feasible with the sample sizes required for adequate statistical power. We argue for revisiting methylation enrichment methods that, provided optimal protocols are used, enable comprehensive, adequately powered and cost-effective genome-wide investigations of the brain methylome. To support our claim we use data showing that enrichment methods approximate the sensitivity obtained with WGB methods while achieving slightly better specificity. This performance is achieved at <5% of the reagent costs. Furthermore, because many more samples can be sequenced simultaneously, projects can be completed about 15 times faster. As currently the only viable option for comprehensive brain methylome studies, enrichment methods may be critical for moving the field forward. PMID:28334972

  4. Methods of Reverberation Mapping. I. Time-lag Determination by Measures of Randomness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chelouche, Doron; Pozo-Nuñez, Francisco; Zucker, Shay, E-mail: doron@sci.haifa.ac.il, E-mail: francisco.pozon@gmail.com, E-mail: shayz@post.tau.ac.il

    A class of methods for measuring time delays between astronomical time series, based on measures of randomness or complexity of the data, is introduced in the context of quasar reverberation mapping. Several distinct statistical estimators are considered that do not rely on polynomial interpolations of the light curves nor on their stochastic modeling, and do not require binning in correlation space. Methods based on von Neumann’s mean-square successive-difference estimator are found to be superior to those using other estimators. An optimized von Neumann scheme is formulated, which better handles sparsely sampled data and outperforms current implementations of discrete correlation function methods. This scheme is applied to existing reverberation data of varying quality, and consistency with previously reported time delays is found. In particular, the size–luminosity relation of the broad-line region in quasars is recovered with a scatter comparable to that obtained by other works, yet with fewer assumptions made concerning the process underlying the variability. The proposed method for time-lag determination is particularly relevant for irregularly sampled time series, and in cases where the process underlying the variability cannot be adequately modeled.

  5. Random Numbers and Monte Carlo Methods

    NASA Astrophysics Data System (ADS)

    Scherer, Philipp O. J.

    Many-body problems often involve the calculation of integrals of very high dimension which cannot be treated by standard methods. For the calculation of thermodynamic averages, Monte Carlo methods, which sample the integration volume at randomly chosen points, are very useful. After summarizing some basic statistics, we discuss algorithms for the generation of pseudo-random numbers with a given probability distribution, which are essential for all Monte Carlo methods. We show how the efficiency of Monte Carlo integration can be improved by preferentially sampling the important configurations. Finally, the famous Metropolis algorithm is applied to classical many-particle systems. Computer experiments visualize the central limit theorem and apply the Metropolis method to the traveling salesman problem.
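    As a minimal illustration of the chapter's starting point, plain Monte Carlo integration estimates an integral from randomly sampled points (here the area of the quarter circle, i.e. π/4), with an error that shrinks as 1/√n by the central limit theorem. The sample size and seed are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Estimate pi from the fraction of random points in the unit square
# that fall inside the quarter circle x^2 + y^2 < 1
x, y = rng.random(n), rng.random(n)
pi_est = 4.0 * np.mean(x**2 + y**2 < 1.0)
print(pi_est)  # close to 3.14159; the error shrinks as 1/sqrt(n)
```

    Importance sampling and the Metropolis algorithm refine this idea by concentrating samples where the integrand (or the Boltzmann weight) is large.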

  6. An Evaluation of the Effectiveness of Recruitment Methods: The Staying Well after Depression Randomized Controlled Trial

    PubMed Central

    Krusche, Adele; Rudolf von Rohr, Isabelle; Muse, Kate; Duggan, Danielle; Crane, Catherine; Williams, J. Mark G.

    2014-01-01

    Background: Randomized controlled trials (RCTs) are widely accepted as being the most efficient way of investigating the efficacy of psychological therapies. However, researchers conducting RCTs commonly report difficulties recruiting an adequate sample within planned timescales. In an effort to overcome recruitment difficulties, researchers often are forced to expand their recruitment criteria or extend the recruitment phase, thus increasing costs and delaying publication of results. Research investigating the effectiveness of recruitment strategies is limited and trials often fail to report sufficient details about the recruitment sources and resources utilised. Purpose: We examined the efficacy of strategies implemented during the Staying Well after Depression RCT in Oxford to recruit participants with a history of recurrent depression. Methods: We describe eight recruitment methods utilised and two further sources not initiated by the research team and examine their efficacy in terms of (i) the return, including the number of potential participants who contacted the trial and the number who were randomized into the trial, (ii) cost-effectiveness, comprising direct financial cost and manpower for initial contacts and randomized participants, and (iii) comparison of sociodemographic characteristics of individuals recruited from different sources. Results: Poster advertising, web-based advertising and mental health worker referrals were the cheapest methods per randomized participant; however, the ratio of randomized participants to initial contacts differed markedly per source. Advertising online, via posters and on a local radio station were the most cost-effective recruitment methods for soliciting participants who subsequently were randomized into the trial. Advertising across many sources (saturation) was found to be important. Limitations: It may not be feasible to employ all the recruitment methods used in this trial to obtain participation from other

  7. An evaluation of the effectiveness of recruitment methods: the staying well after depression randomized controlled trial.

    PubMed

    Krusche, Adele; Rudolf von Rohr, Isabelle; Muse, Kate; Duggan, Danielle; Crane, Catherine; Williams, J Mark G

    2014-04-01

    Randomized controlled trials (RCTs) are widely accepted as being the most efficient way of investigating the efficacy of psychological therapies. However, researchers conducting RCTs commonly report difficulties in recruiting an adequate sample within planned timescales. In an effort to overcome recruitment difficulties, researchers often are forced to expand their recruitment criteria or extend the recruitment phase, thus increasing costs and delaying publication of results. Research investigating the effectiveness of recruitment strategies is limited, and trials often fail to report sufficient details about the recruitment sources and resources utilized. We examined the efficacy of strategies implemented during the Staying Well after Depression RCT in Oxford to recruit participants with a history of recurrent depression. We describe eight recruitment methods utilized and two further sources not initiated by the research team and examine their efficacy in terms of (1) the return, including the number of potential participants who contacted the trial and the number who were randomized into the trial; (2) cost-effectiveness, comprising direct financial cost and manpower for initial contacts and randomized participants; and (3) comparison of sociodemographic characteristics of individuals recruited from different sources. Poster advertising, web-based advertising, and mental health worker referrals were the cheapest methods per randomized participant; however, the ratio of randomized participants to initial contacts differed markedly per source. Advertising online, via posters, and on a local radio station were the most cost-effective recruitment methods for soliciting participants who subsequently were randomized into the trial. Advertising across many sources (saturation) was found to be important. It may not be feasible to employ all the recruitment methods used in this trial to obtain participation from other populations, such as those currently unwell, or in

  8. Randomized trials published in some Chinese journals: how many are randomized?

    PubMed Central

    Wu, Taixiang; Li, Youping; Bian, Zhaoxiang; Liu, Guanjian; Moher, David

    2009-01-01

    Background: The approximately 1100 medical journals now active in China are publishing a rapidly increasing number of research reports, including many studies identified by their authors as randomized controlled trials. It has been noticed that these reports mostly present positive results, and their quality and authenticity have consequently been called into question. We investigated the adequacy of randomization of clinical trials published in recent years in China to determine how many of them met acceptable standards for allocating participants to treatment groups. Methods: The China National Knowledge Infrastructure electronic database was searched for reports of randomized controlled trials on 20 common diseases published from January 1994 to June 2005. From this sample, a subset of trials that appeared to have used randomization methods was selected. Twenty-one investigators trained in the relevant knowledge, communication skills and quality control issues interviewed the original authors of these trials about the participant randomization methods and related quality-control features of their trials. Results: From an initial sample of 37,313 articles identified in the China National Knowledge Infrastructure database, we found 3137 apparent randomized controlled trials. Of these, 1452 were studies of conventional medicine (published in 411 journals) and 1685 were studies of traditional Chinese medicine (published in 352 journals). Interviews with the authors of 2235 of these reports revealed that only 207 studies adhered to accepted methodology for randomization and could on those grounds be deemed authentic randomized controlled trials (6.8%, 95% confidence interval 5.9–7.7). There was no statistically significant difference in the rate of authenticity between randomized controlled trials of traditional interventions and those of conventional interventions. Randomized controlled trials conducted at hospitals affiliated to medical universities were more likely

  9. Methods for sample size determination in cluster randomized trials

    PubMed Central

    Rutterford, Clare; Copas, Andrew; Eldridge, Sandra

    2015-01-01

    Background: The use of cluster randomized trials (CRTs) is increasing, along with the variety in their design and analysis. The simplest approach for their sample size calculation is to calculate the sample size assuming individual randomization and inflate this by a design effect to account for randomization by cluster. The assumptions of a simple design effect may not always be met; alternative or more complicated approaches are required. Methods: We summarise a wide range of sample size methods available for cluster randomized trials. For those familiar with sample size calculations for individually randomized trials but with less experience in the clustered case, this manuscript provides formulae for a wide range of scenarios with associated explanation and recommendations. For those with more experience, comprehensive summaries are provided that allow quick identification of methods for a given design, outcome and analysis method. Results: We present first those methods applicable to the simplest two-arm, parallel group, completely randomized design followed by methods that incorporate deviations from this design such as: variability in cluster sizes; attrition; non-compliance; or the inclusion of baseline covariates or repeated measures. The paper concludes with methods for alternative designs. Conclusions: There is a large amount of methodology available for sample size calculations in CRTs. This paper gives the most comprehensive description of published methodology for sample size calculation and provides an important resource for those designing these trials. PMID:26174515
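    The simple design-effect calculation mentioned in the Background can be sketched as follows; the cluster size, intracluster correlation coefficient (ICC), and individually randomized sample size are illustrative numbers, not values from the paper.

```python
import math

def cluster_sample_size(n_individual, cluster_size, icc):
    """Inflate an individually randomized sample size by the simple
    design effect DEFF = 1 + (m - 1) * ICC, assuming equal cluster sizes."""
    deff = 1.0 + (cluster_size - 1) * icc
    # round before ceil to avoid floating-point artifacts (e.g. 390.0000000001)
    n_total = math.ceil(round(n_individual * deff, 9))
    n_clusters = math.ceil(n_total / cluster_size)
    return deff, n_total, n_clusters

# Illustrative: 200 participants per arm needed under individual randomization,
# clusters of m = 20, ICC = 0.05
print(cluster_sample_size(200, 20, 0.05))  # DEFF 1.95 -> 390 per arm in 20 clusters
```

    As the abstract notes, this simple inflation assumes equal cluster sizes and a common ICC; variable cluster sizes, attrition, non-compliance, or repeated measures call for the more detailed methods the review summarizes.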

  10. Improved ASTM G72 Test Method for Ensuring Adequate Fuel-to-Oxidizer Ratios

    NASA Technical Reports Server (NTRS)

    Juarez, Alfredo; Harper, Susana A.

    2016-01-01

    The ASTM G72/G72M-15 Standard Test Method for Autogenous Ignition Temperature of Liquids and Solids in a High-Pressure Oxygen-Enriched Environment is currently used to evaluate materials for the ignition susceptibility driven by exposure to external heat in an enriched oxygen environment. Testing performed on highly volatile liquids such as cleaning solvents has proven problematic due to inconsistent test results (non-ignitions). Non-ignition results can be misinterpreted as favorable oxygen compatibility, although they are more likely associated with inadequate fuel-to-oxidizer ratios. Forced evaporation during purging and inadequate sample size were identified as two potential causes for inadequate available sample material during testing. In an effort to maintain adequate fuel-to-oxidizer ratios within the reaction vessel during test, several parameters were considered, including sample size, pretest sample chilling, pretest purging, and test pressure. Tests on a variety of solvents exhibiting a range of volatilities are presented in this paper. A proposed improvement to the standard test protocol as a result of this evaluation is also presented. Execution of the final proposed improved test protocol outlines an incremental step method of determining optimal conditions using increased sample sizes while considering test system safety limits. The proposed improved test method increases confidence in results obtained by utilizing the ASTM G72 autogenous ignition temperature test method and can aid in the oxygen compatibility assessment of highly volatile liquids and other conditions that may lead to false non-ignition results.

  11. The Hartung-Knapp-Sidik-Jonkman method for random effects meta-analysis is straightforward and considerably outperforms the standard DerSimonian-Laird method

    PubMed Central

    2014-01-01

    Background: The DerSimonian and Laird approach (DL) is widely used for random effects meta-analysis, but this often results in inappropriate type I error rates. The method described by Hartung, Knapp, Sidik and Jonkman (HKSJ) is known to perform better when trials of similar size are combined. However, evidence in realistic situations, where one trial might be much larger than the other trials, is lacking. We aimed to evaluate the relative performance of the DL and HKSJ methods when studies of different sizes are combined and to develop a simple method to convert DL results to HKSJ results. Methods: We evaluated the performance of the HKSJ versus DL approach in simulated meta-analyses of 2–20 trials with varying sample sizes and between-study heterogeneity, allowing trials to have various sizes, e.g. 25% of the trials being 10 times larger than the smaller trials. We also compared the number of “positive” (statistically significant at p < 0.05) findings using empirical data from recent meta-analyses with ≥3 studies of interventions from the Cochrane Database of Systematic Reviews. Results: The simulations showed that the HKSJ method consistently resulted in more adequate error rates than the DL method. When the significance level was 5%, the HKSJ error rates at most doubled, whereas for DL they could be over 30%. DL, and, far less so, HKSJ had more inflated error rates when the combined studies had unequal sizes and between-study heterogeneity. The empirical data from 689 meta-analyses showed that 25.1% of the significant findings for the DL method were non-significant with the HKSJ method. DL results can be easily converted into HKSJ results. Conclusions: Our simulations showed that the HKSJ method consistently results in more adequate error rates than the DL method, especially when the number of studies is small, and can easily be applied routinely in meta-analyses. Even with the HKSJ method, extra caution is needed when there are ≤5 studies
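    A sketch of the two variance estimates being compared, assuming the standard DerSimonian-Laird moment estimator of τ² and the HKSJ weighted-variance adjustment; the toy effect sizes and within-study variances are illustrative.

```python
import numpy as np

def dl_hksj(y, v):
    """DerSimonian-Laird pooled effect with both the standard DL variance
    and the Hartung-Knapp-Sidik-Jonkman (HKSJ) variance. Sketch only."""
    y, v = np.asarray(y, float), np.asarray(v, float)
    k = len(y)
    w = 1.0 / v                                   # fixed-effect weights
    y_fixed = np.sum(w * y) / w.sum()
    q = np.sum(w * (y - y_fixed) ** 2)            # Cochran's Q
    c = w.sum() - np.sum(w ** 2) / w.sum()
    tau2 = max(0.0, (q - (k - 1)) / c)            # DL moment estimator
    ws = 1.0 / (v + tau2)                         # random-effects weights
    mu = np.sum(ws * y) / ws.sum()                # pooled effect
    var_dl = 1.0 / ws.sum()                       # standard (Wald/normal) variance
    var_hksj = np.sum(ws * (y - mu) ** 2) / ((k - 1) * ws.sum())
    return mu, var_dl, var_hksj

# Toy effect sizes and within-study variances (illustrative only)
mu, var_dl, var_hksj = dl_hksj([0.1, 0.3, 0.5], [0.04, 0.04, 0.04])
print(mu, var_dl, var_hksj)
```

    With the HKSJ variance, confidence intervals use a t quantile with k − 1 degrees of freedom instead of a normal quantile, which is the main source of its better error rates when few studies are combined.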

  12. [Adequate application of quantitative and qualitative statistic analytic methods in acupuncture clinical trials].

    PubMed

    Tan, Ming T; Liu, Jian-ping; Lao, Lixing

    2012-08-01

    Recently, proper use of statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors describe the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis methods have been widely used in acupuncture and TCM clinical trials, while between-group quantitative analyses of clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences is discussed based on data from RCTs assessing acupuncture for pain relief. The authors conclude that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis while qualitative analysis can serve as a secondary criterion. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.

  13. Factors associated with adequate weekly reporting for disease surveillance data among health facilities in Nairobi County, Kenya, 2013

    PubMed Central

    Mwatondo, Athman Juma; Ng'ang'a, Zipporah; Maina, Caroline; Makayotto, Lyndah; Mwangi, Moses; Njeru, Ian; Arvelo, Wences

    2016-01-01

    Introduction: Kenya adopted the Integrated Disease Surveillance and Response (IDSR) strategy in 1998 to strengthen disease surveillance and epidemic response. However, the goal of weekly surveillance reporting among health facilities has not been achieved. We conducted a cross-sectional study to determine the prevalence of adequate reporting and factors associated with IDSR reporting among health facilities in one Kenyan county. Methods: Health facilities (public and private) were enrolled using stratified random sampling from 348 facilities prioritized for routine surveillance reporting. Adequately reporting facilities were defined as those which submitted >10 weekly reports during a twelve-week period, and poorly reporting facilities were those which submitted <10 weekly reports. Multivariate logistic regression with backward selection was used to identify factors associated with adequate reporting. Results: From September 2 through November 30, 2013, we enrolled 175 health facilities; 130 (74%) were private and 45 (26%) were public. Of the 175 health facilities, 77 (44%) were classified as reporting adequately and 98 (56%) as reporting poorly. Multivariate analysis identified three factors independently associated with adequate weekly reporting: having weekly reporting forms at the visit (AOR 19, 95% CI: 6-65), having posters showing IDSR functions (AOR 8, 95% CI: 2-12), and having a designated surveillance focal person (AOR 7, 95% CI: 2-20). Conclusion: The majority of health facilities in Nairobi County were reporting poorly to IDSR. We recommend that the Ministry of Health provide all health facilities in Nairobi County with weekly reporting tools and offer specific training on IDSR, which will help in designating a surveillance focal person. PMID:27303581
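    The adjusted odds ratios (AORs) above come from multivariate logistic regression; as a building-block sketch, a crude (unadjusted) odds ratio with a Wald confidence interval can be computed from a 2x2 table as follows. The counts are purely illustrative, not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with a Wald 95% CI from a 2x2 table:
    a/b = outcome present/absent in the exposed group,
    c/d = outcome present/absent in the unexposed group."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = or_ * math.exp(-z * se_log)
    hi = or_ * math.exp(z * se_log)
    return or_, lo, hi

# Illustrative counts only -- not values from the study
print(odds_ratio_ci(40, 10, 37, 88))
```

    Adjustment for the other covariates requires fitting the full logistic model; that adjustment is what distinguishes the reported AORs from this crude estimate.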

  14. Randomized trials published in some Chinese journals: how many are randomized?

    PubMed

    Wu, Taixiang; Li, Youping; Bian, Zhaoxiang; Liu, Guanjian; Moher, David

    2009-07-02

    The approximately 1100 medical journals now active in China are publishing a rapidly increasing number of research reports, including many studies identified by their authors as randomized controlled trials. It has been noticed that these reports mostly present positive results, and their quality and authenticity have consequently been called into question. We investigated the adequacy of randomization of clinical trials published in recent years in China to determine how many of them met acceptable standards for allocating participants to treatment groups. The China National Knowledge Infrastructure electronic database was searched for reports of randomized controlled trials on 20 common diseases published from January 1994 to June 2005. From this sample, a subset of trials that appeared to have used randomization methods was selected. Twenty-one investigators trained in the relevant knowledge, communication skills and quality control issues interviewed the original authors of these trials about the participant randomization methods and related quality-control features of their trials. From an initial sample of 37,313 articles identified in the China National Knowledge Infrastructure database, we found 3137 apparent randomized controlled trials. Of these, 1452 were studies of conventional medicine (published in 411 journals) and 1685 were studies of traditional Chinese medicine (published in 352 journals). Interviews with the authors of 2235 of these reports revealed that only 207 studies adhered to accepted methodology for randomization and could on those grounds be deemed authentic randomized controlled trials (6.8%, 95% confidence interval 5.9-7.7). There was no statistically significant difference in the rate of authenticity between randomized controlled trials of traditional interventions and those of conventional interventions. 
Randomized controlled trials conducted at hospitals affiliated to medical universities were more likely to be authentic than trials

  15. 7 CFR 3017.900 - Adequate evidence.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 15 2010-01-01 2010-01-01 false Adequate evidence. 3017.900 Section 3017.900 Agriculture Regulations of the Department of Agriculture (Continued) OFFICE OF THE CHIEF FINANCIAL OFFICER... Adequate evidence. Adequate evidence means information sufficient to support the reasonable belief that a...

  16. 29 CFR 98.900 - Adequate evidence.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 1 2012-07-01 2012-07-01 false Adequate evidence. 98.900 Section 98.900 Labor Office of the Secretary of Labor GOVERNMENTWIDE DEBARMENT AND SUSPENSION (NONPROCUREMENT) Definitions § 98.900 Adequate evidence. Adequate evidence means information sufficient to support the reasonable belief that a...

  17. 29 CFR 98.900 - Adequate evidence.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 1 2011-07-01 2011-07-01 false Adequate evidence. 98.900 Section 98.900 Labor Office of the Secretary of Labor GOVERNMENTWIDE DEBARMENT AND SUSPENSION (NONPROCUREMENT) Definitions § 98.900 Adequate evidence. Adequate evidence means information sufficient to support the reasonable belief that a...

  18. 29 CFR 98.900 - Adequate evidence.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 1 2013-07-01 2013-07-01 false Adequate evidence. 98.900 Section 98.900 Labor Office of the Secretary of Labor GOVERNMENTWIDE DEBARMENT AND SUSPENSION (NONPROCUREMENT) Definitions § 98.900 Adequate evidence. Adequate evidence means information sufficient to support the reasonable belief that a...

  19. Random-Phase Approximation Methods

    NASA Astrophysics Data System (ADS)

    Chen, Guo P.; Voora, Vamsee K.; Agee, Matthew M.; Balasubramani, Sree Ganesh; Furche, Filipp

    2017-05-01

    Random-phase approximation (RPA) methods are rapidly emerging as cost-effective validation tools for semilocal density functional computations. We present the theoretical background of RPA in an intuitive rather than formal fashion, focusing on the physical picture of screening and simple diagrammatic analysis. A new decomposition of the RPA correlation energy into plasmonic modes leads to an appealing visualization of electron correlation in terms of charge density fluctuations. Recent developments in the areas of beyond-RPA methods, RPA correlation potentials, and efficient algorithms for RPA energy and property calculations are reviewed. The ability of RPA to approximately capture static correlation in molecules is quantified by an analysis of RPA natural occupation numbers. We illustrate the use of RPA methods in applications to small-gap systems such as open-shell d- and f-element compounds, radicals, and weakly bound complexes, where semilocal density functional results exhibit strong functional dependence.

  20. 2 CFR 180.900 - Adequate evidence.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 2 Grants and Agreements 1 2013-01-01 2013-01-01 false Adequate evidence. 180.900 Section 180.900 Grants and Agreements Office of Management and Budget Guidance for Grants and Agreements OFFICE OF... Adequate evidence. Adequate evidence means information sufficient to support the reasonable belief that a...

  21. 2 CFR 180.900 - Adequate evidence.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 2 Grants and Agreements 1 2012-01-01 2012-01-01 false Adequate evidence. 180.900 Section 180.900 Grants and Agreements Office of Management and Budget Guidance for Grants and Agreements OFFICE OF... Adequate evidence. Adequate evidence means information sufficient to support the reasonable belief that a...

  22. Quality of radiotherapy reporting in randomized controlled trials of prostate cancer.

    PubMed

    Soon, Yu Yang; Chen, Desiree; Tan, Teng Hwee; Tey, Jeremy

    2018-06-07

    Good radiotherapy reporting in clinical trials of prostate radiotherapy is important because it allows accurate reproducibility of radiotherapy treatment and minimizes treatment variations that can affect patient outcomes. The aim of our study was to assess the quality of prostate radiotherapy (RT) treatment reporting in randomized controlled trials in prostate cancer. We searched MEDLINE for randomized trials of prostate cancer published from 1996 to 2016 that included prostate RT as one of the intervention arms. We assessed whether the investigators reported the following ten criteria adequately in the trial reports: RT dose prescription method; RT dose-planning procedures; organs at risk (OAR) dose constraints; target volume definition; simulation procedures; treatment verification procedures; total RT dose; fractionation schedule; conduct of quality assurance (QA); and presence or absence of deviations in RT treatment planning and delivery. We performed multivariate logistic regression to determine the factors that may influence the quality of reporting. We found 59 eligible trials. There was significant variability in the quality of reporting. Target volume definition, total RT dose and fractionation schedule were reported adequately in 97% of included trials. OAR constraints, simulation procedures and presence or absence of deviations in RT treatment planning and delivery were reported adequately in 30% of included trials. Twenty-four trials (40%) reported seven or more criteria adequately. Multivariable logistic analysis showed that trials that published their quality assurance results and cooperative group trials were more likely to have adequate reporting quality in at least seven criteria. There is significant variability in the quality of reporting on prostate radiotherapy treatment in randomized trials of prostate cancer. Consensus guidelines are needed to standardize the reporting of radiotherapy treatment in randomized trials.

  23. Subtraction method in the Second Random Phase Approximation

    NASA Astrophysics Data System (ADS)

    Gambacurta, Danilo

    2018-02-01

    We discuss the subtraction method applied to the Second Random Phase Approximation (SRPA). This method has been proposed to overcome double counting and stability issues appearing in beyond mean-field calculations. We show that the subtraction procedure leads to a considerable reduction of the SRPA downwards shift with respect to the random phase approximation (RPA) spectra and to results that are weakly cutoff dependent. Applications to the isoscalar monopole and quadrupole response in 16O and to the low-lying dipole response in 48Ca are shown and discussed.

  4. 10-year trend in quantity and quality of pediatric randomized controlled trials published in mainland China: 2002–2011

    PubMed Central

    2013-01-01

    Background Quality assessment of pediatric randomized controlled trials (RCTs) in China is limited. The aim of this study was to evaluate the quantitative trends and quality indicators of RCTs published in mainland China over a recent 10-year period. Methods We individually searched all 17 available pediatric journals published in China from January 1, 2002 to December 30, 2011 to identify RCTs of drug treatment in participants under the age of 18 years. The quality was evaluated according to the Cochrane quality assessment protocol. Results Of 1287 journal issues containing 44398 articles, a total of 2.4% (1077/44398) of the articles were included in the analysis. The proportion of RCTs increased from 0.28% in 2002 to 0.32% in 2011. Individual sample sizes ranged from 10 to 905 participants (median 81 participants); 2.3% of the RCTs were multiple center trials; 63.9% evaluated Western medicine, 32.5% evaluated traditional Chinese medicine; 15% used an adequate method of random sequence generation; and 10.4% used a quasi-random method for randomization. Only 1% of the RCTs reported adequate allocation concealment and 0.6% reported the method of blinding. The follow-up period was from 7 days to 96 months, with a median of 7.5 months. Incomplete outcome data were reported in 8.3% of trials, of which 4.5% (4/89) used intention-to-treat analysis. Only 0.4% of the included trials used adequate random sequence allocation, concealment and blinding. The articles published from 2007 to 2011 revealed an improvement in the randomization method compared with articles published from 2002 to 2006 (from 2.7% to 23.6%, p < 0.001). Conclusions In mainland China, the quantity of RCTs did not increase in the pediatric population, and the general quality was relatively poor. Quality improvements were suboptimal in the later 5 years. PMID:23914882

  5. Adverse prognostic value of peritumoral vascular invasion: is it abrogated by adequate endocrine adjuvant therapy? Results from two International Breast Cancer Study Group randomized trials of chemoendocrine adjuvant therapy for early breast cancer

    PubMed Central

    Viale, G.; Giobbie-Hurder, A.; Gusterson, B. A.; Maiorano, E.; Mastropasqua, M. G.; Sonzogni, A.; Mallon, E.; Colleoni, M.; Castiglione-Gertsch, M.; Regan, M. M.; Brown, R. W.; Golouh, R.; Crivellari, D.; Karlsson, P.; Öhlschlegel, C.; Gelber, R. D.; Goldhirsch, A.; Coates, A. S.

    2010-01-01

    Background: Peritumoral vascular invasion (PVI) may assist in assigning optimal adjuvant systemic therapy for women with early breast cancer. Patients and methods: Patients participated in two International Breast Cancer Study Group randomized trials testing chemoendocrine adjuvant therapies in premenopausal (trial VIII) or postmenopausal (trial IX) node-negative breast cancer. PVI was assessed by institutional pathologists and/or central review on hematoxylin–eosin-stained slides in 99% of patients (analysis cohort 2754 patients, median follow-up >9 years). Results: PVI, present in 23% of the tumors, was associated with higher grade tumors and larger tumor size (trial IX only). Presence of PVI increased locoregional and distant recurrence and was significantly associated with poorer disease-free survival. The adverse prognostic impact of PVI in trial VIII was limited to premenopausal patients with endocrine-responsive tumors randomized to therapies not containing goserelin, and conversely the beneficial effect of goserelin was limited to patients whose tumors showed PVI. In trial IX, all patients received tamoxifen: the adverse prognostic impact of PVI was limited to patients with receptor-negative tumors regardless of chemotherapy. Conclusion: Adequate endocrine adjuvant therapy appears to abrogate the adverse impact of PVI in node-negative disease, while PVI may identify patients who will benefit particularly from adjuvant therapy. PMID:19633051

  6. Randomly and Non-Randomly Missing Renal Function Data in the Strong Heart Study: A Comparison of Imputation Methods

    PubMed Central

    Shara, Nawar; Yassin, Sayf A.; Valaitis, Eduardas; Wang, Hong; Howard, Barbara V.; Wang, Wenyu; Lee, Elisa T.; Umans, Jason G.

    2015-01-01

    Kidney and cardiovascular disease are widespread among populations with high prevalence of diabetes, such as American Indians participating in the Strong Heart Study (SHS). Studying these conditions simultaneously in longitudinal studies is challenging, because the morbidity and mortality associated with these diseases result in missing data, and these data are likely not missing at random. When such data are merely excluded, study findings may be compromised. In this article, a subset of 2264 participants with complete renal function data from Strong Heart Exams 1 (1989–1991), 2 (1993–1995), and 3 (1998–1999) was used to examine the performance of five methods used to impute missing data: listwise deletion, mean of serial measures, adjacent value, multiple imputation, and pattern-mixture. Three missing at random models and one non-missing at random model were used to compare the performance of the imputation techniques on randomly and non-randomly missing data. The pattern-mixture method was found to perform best for imputing renal function data that were not missing at random. Determining whether data are missing at random or not can help in choosing the imputation method that will provide the most accurate results. PMID:26414328

  7. Randomly and Non-Randomly Missing Renal Function Data in the Strong Heart Study: A Comparison of Imputation Methods.

    PubMed

    Shara, Nawar; Yassin, Sayf A; Valaitis, Eduardas; Wang, Hong; Howard, Barbara V; Wang, Wenyu; Lee, Elisa T; Umans, Jason G

    2015-01-01

    Kidney and cardiovascular disease are widespread among populations with high prevalence of diabetes, such as American Indians participating in the Strong Heart Study (SHS). Studying these conditions simultaneously in longitudinal studies is challenging, because the morbidity and mortality associated with these diseases result in missing data, and these data are likely not missing at random. When such data are merely excluded, study findings may be compromised. In this article, a subset of 2264 participants with complete renal function data from Strong Heart Exams 1 (1989-1991), 2 (1993-1995), and 3 (1998-1999) was used to examine the performance of five methods used to impute missing data: listwise deletion, mean of serial measures, adjacent value, multiple imputation, and pattern-mixture. Three missing at random models and one non-missing at random model were used to compare the performance of the imputation techniques on randomly and non-randomly missing data. The pattern-mixture method was found to perform best for imputing renal function data that were not missing at random. Determining whether data are missing at random or not can help in choosing the imputation method that will provide the most accurate results.
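
    Two of the simpler strategies compared above, "mean of serial measures" and "adjacent value", can be sketched directly. This is a minimal illustration on toy data, not the study's implementation; variable names and values are invented.

```python
# Hedged sketch of two simple single-imputation strategies from the
# comparison above: "mean of serial measures" (fill a missing exam with the
# mean of that participant's observed exams) and "adjacent value" (carry
# the nearest earlier observed exam forward). Toy data, illustrative only.
import numpy as np

def impute_serial_mean(exams):
    """Replace NaNs with the mean of the participant's observed exams."""
    exams = np.asarray(exams, dtype=float)
    filled = exams.copy()
    filled[np.isnan(filled)] = np.nanmean(exams)
    return filled

def impute_adjacent(exams):
    """Replace each NaN with the nearest earlier observed value."""
    filled = np.asarray(exams, dtype=float).copy()
    for i in range(1, len(filled)):
        if np.isnan(filled[i]):
            filled[i] = filled[i - 1]
    return filled

# One participant's (hypothetical) renal marker over Exams 1-3,
# missing at Exam 3:
creatinine = [1.1, 1.3, np.nan]
print(impute_serial_mean(creatinine))  # third entry becomes the serial mean
print(impute_adjacent(creatinine))     # third entry carries Exam 2 forward
```

    Neither strategy models *why* the value is missing, which is exactly why the study found pattern-mixture models preferable when data are not missing at random.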

  8. Interpreting findings from Mendelian randomization using the MR-Egger method.

    PubMed

    Burgess, Stephen; Thompson, Simon G

    2017-05-01

    Mendelian randomization-Egger (MR-Egger) is an analysis method for Mendelian randomization using summarized genetic data. MR-Egger consists of three parts: (1) a test for directional pleiotropy, (2) a test for a causal effect, and (3) an estimate of the causal effect. While conventional analysis methods for Mendelian randomization assume that all genetic variants satisfy the instrumental variable assumptions, the MR-Egger method is able to assess whether genetic variants have pleiotropic effects on the outcome that differ on average from zero (directional pleiotropy), as well as to provide a consistent estimate of the causal effect, under a weaker assumption-the InSIDE (INstrument Strength Independent of Direct Effect) assumption. In this paper, we provide a critical assessment of the MR-Egger method with regard to its implementation and interpretation. While the MR-Egger method is a worthwhile sensitivity analysis for detecting violations of the instrumental variable assumptions, there are several reasons why causal estimates from the MR-Egger method may be biased and have inflated Type 1 error rates in practice, including violations of the InSIDE assumption and the influence of outlying variants. The issues raised in this paper have potentially serious consequences for causal inferences from the MR-Egger approach. We give examples of scenarios in which the estimates from conventional Mendelian randomization methods and MR-Egger differ, and discuss how to interpret findings in such cases.
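
    The core of MR-Egger is a weighted regression of the variant-outcome associations on the variant-exposure associations *with* an intercept: the intercept estimates average directional pleiotropy and the slope estimates the causal effect. The following sketch uses synthetic summary statistics (all numbers invented) to contrast it with the conventional inverse-variance-weighted (IVW) estimate, which forces the regression through the origin and absorbs the pleiotropy into its slope.

```python
# Hedged sketch of MR-Egger vs. IVW on synthetic summarized genetic data.
# beta_x / beta_y are invented per-variant associations, not real data.
import numpy as np

rng = np.random.default_rng(0)
n_variants = 20
beta_x = rng.uniform(0.05, 0.3, n_variants)    # variant-exposure effects
true_causal, pleiotropy = 0.5, 0.02            # assumed ground truth
beta_y = true_causal * beta_x + pleiotropy + rng.normal(0, 0.005, n_variants)
se_y = np.full(n_variants, 0.005)              # outcome-association SEs
w = 1.0 / se_y**2                              # inverse-variance weights

# MR-Egger: weighted least squares WITH an intercept
X = np.column_stack([np.ones(n_variants), beta_x])
W = np.diag(w)
intercept, slope = np.linalg.solve(X.T @ W @ X, X.T @ W @ beta_y)

# IVW: the same regression constrained through the origin
ivw = np.sum(w * beta_x * beta_y) / np.sum(w * beta_x**2)
print(f"Egger intercept={intercept:.3f} slope={slope:.3f}  IVW={ivw:.3f}")
```

    With directional pleiotropy present, the Egger slope stays near the true effect while the IVW slope is biased upward; this is the behavior the paper's cautionary discussion (InSIDE violations, outlying variants) qualifies.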

  9. Random element method for numerical modeling of diffusional processes

    NASA Technical Reports Server (NTRS)

    Ghoniem, A. F.; Oppenheim, A. K.

    1982-01-01

    The random element method is a generalization of the random vortex method that was developed for the numerical modeling of momentum transport processes as expressed in terms of the Navier-Stokes equations. The method is based on the concept that random walk, as exemplified by Brownian motion, is the stochastic manifestation of diffusional processes. The algorithm based on this method is grid-free and does not require the diffusion equation to be discretized over a mesh; it is thus devoid of the numerical diffusion associated with finite difference methods. Moreover, the algorithm is self-adaptive in space and explicit in time, resulting in improved numerical resolution of gradients as well as a simple and efficient computational procedure. The method is applied here to an assortment of problems of diffusion of momentum and energy in one dimension, as well as heat conduction in two dimensions, in order to assess its validity and accuracy. The numerical solutions obtained are found to be in good agreement with exact solutions, except for a statistical error introduced by using a finite number of elements; this error can be reduced by increasing the number of elements or by ensemble averaging over a number of solutions.
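
    The grid-free idea can be illustrated in one dimension: each element takes a Gaussian random step whose variance matches the diffusion operator, so the ensemble spreads exactly as the heat equation predicts, with no mesh anywhere. This is a minimal sketch of the random-walk principle, not the paper's algorithm; the diffusivity and step counts are arbitrary.

```python
# Hedged sketch: grid-free random-walk diffusion of a point release in 1-D.
# After n_steps of Gaussian steps with variance 2*D*dt each, the element
# positions should have variance 2*D*t, matching the exact heat-equation
# spread. Parameters are illustrative.
import numpy as np

D, dt, n_steps, n_elements = 0.1, 0.01, 200, 50_000
rng = np.random.default_rng(1)

x = np.zeros(n_elements)            # all elements released at the origin
for _ in range(n_steps):
    x += rng.normal(0.0, np.sqrt(2 * D * dt), n_elements)

t = n_steps * dt
print(np.var(x), 2 * D * t)         # sample variance vs. exact 2*D*t
```

    The residual mismatch between the two printed numbers is exactly the statistical error the abstract mentions: it shrinks as the number of elements grows.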

  10. Multi-Agent Methods for the Configuration of Random Nanocomputers

    NASA Technical Reports Server (NTRS)

    Lawson, John W.

    2004-01-01

    As computational devices continue to shrink, the cost of manufacturing such devices is expected to grow exponentially. One alternative to the costly, detailed design and assembly of conventional computers is to place the nano-electronic components randomly on a chip. The price for such a trivial assembly process is that the resulting chip would not be programmable by conventional means. In this work, we show that such random nanocomputers can be adaptively programmed using multi-agent methods. This is accomplished through the optimization of an associated high dimensional error function. By representing each of the independent variables as a reinforcement learning agent, we are able to achieve convergence much faster than with other methods, including simulated annealing. Standard combinational logic circuits such as adders and multipliers are implemented in a straightforward manner. In addition, we show that the intrinsic flexibility of these adaptive methods allows the random computers to be reconfigured easily, making them reusable. Recovery from faults is also demonstrated.

  11. Random Walk Method for Potential Problems

    NASA Technical Reports Server (NTRS)

    Krishnamurthy, T.; Raju, I. S.

    2002-01-01

    A local Random Walk Method (RWM) for potential problems governed by Laplace's and Poisson's equations is developed for two- and three-dimensional problems. The RWM is implemented and demonstrated in a multiprocessor parallel environment on a Beowulf cluster of computers. A speed gain of 16 is achieved as the number of processors is increased from 1 to 23.
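
    The probabilistic principle behind such methods is that the solution of Laplace's equation at an interior point equals the expected boundary value reached by a symmetric random walk started there. The sketch below shows this on a square grid; the domain, boundary data and walk counts are invented for illustration and are not from the paper.

```python
# Hedged sketch of the random-walk principle for Laplace problems:
# u(interior point) = E[boundary value at the walk's first exit].
# Toy domain: an n x n grid with u = 1 on the top edge, 0 elsewhere.
import random

def laplace_rw(i, j, n, boundary, walks=10_000, seed=0):
    """Monte Carlo estimate of u(i, j) for Laplace's equation."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(walks):
        x, y = i, j
        while 0 < x < n and 0 < y < n:      # walk until hitting the boundary
            dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            x, y = x + dx, y + dy
        total += boundary(x, y)
    return total / walks

top_hot = lambda x, y, n=20: 1.0 if y == n else 0.0
u_center = laplace_rw(10, 10, 20, top_hot)
print(u_center)    # by 4-fold symmetry the exact value at the center is 0.25
```

    Each walk is independent, which is why the method parallelizes so naturally onto a cluster, as the record reports.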

  12. Developing a model for the adequate description of electronic communication in hospitals.

    PubMed

    Saboor, Samrend; Ammenwerth, Elske

    2011-01-01

    Adequate information and communication technology (ICT) systems can help to improve communication in hospitals. Changes to the ICT infrastructure of hospitals must be planned carefully. In order to support comprehensive planning, we presented a classification of 81 common errors of electronic communication at the MIE 2008 congress. Our objective now was to develop a data model that defines specific requirements for an adequate description of electronic communication processes. We first applied the method of explicating qualitative content analysis to the error categorization in order to determine the essential process details. After this, we applied the method of subsuming qualitative content analysis to the results of the first step. The result is a data model for the adequate description of electronic communication, comprising 61 entities and 91 relationships. The data model comprises and organizes all details that are necessary for the detection of the respective errors. It can either be used to extend the capabilities of existing modeling methods or serve as a basis for the development of a new approach.

  13. Analysis of entropy extraction efficiencies in random number generation systems

    NASA Astrophysics Data System (ADS)

    Wang, Chao; Wang, Shuang; Chen, Wei; Yin, Zhen-Qiang; Han, Zheng-Fu

    2016-05-01

    Random numbers (RNs) have applications in many areas: lottery games, gambling, computer simulation, and, most importantly, cryptography [N. Gisin et al., Rev. Mod. Phys. 74 (2002) 145]. In cryptography theory, the theoretical security of the system calls for high quality RNs. Therefore, developing methods for producing unpredictable RNs with adequate speed is an attractive topic. Early on, despite the lack of theoretical support, pseudo RNs generated by algorithmic methods performed well and satisfied reasonable statistical requirements. However, as implemented, those pseudorandom sequences were completely determined by mathematical formulas and initial seeds, which cannot introduce extra entropy or information. In these cases, “random” bits are generated that are not at all random. Physical random number generators (RNGs), which, in contrast to algorithmic methods, are based on unpredictable physical random phenomena, have attracted considerable research interest. However, the way that we extract random bits from those physical entropy sources has a large influence on the efficiency and performance of the system. In this manuscript, we will review and discuss several randomness extraction schemes that are based on radiation or photon arrival times. We analyze the robustness, post-processing requirements and, in particular, the extraction efficiency of those methods to aid in the construction of efficient, compact and robust physical RNG systems.
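
    A classic example of the post-processing the record discusses is the von Neumann extractor: pair up raw bits, emit the first bit of each unequal pair, and discard equal pairs. For independent bits with bias p it yields perfectly unbiased output at efficiency p(1-p). The sketch below is a generic illustration of the scheme, not one of the photon-arrival-time extractors reviewed in the paper.

```python
# Hedged sketch of the von Neumann extractor: 01 -> 0, 10 -> 1, 00/11
# discarded. The simulated "physical" source here is just a biased
# pseudo-random bit stream standing in for a real entropy source.
import random

def von_neumann(bits):
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)       # emit the first bit of each unequal pair
    return out

random.seed(0)
raw = [1 if random.random() < 0.7 else 0 for _ in range(100_000)]  # 70/30 bias
ext = von_neumann(raw)
print(len(ext) / len(raw))      # extraction efficiency, ~p(1-p) = 0.21
print(sum(ext) / len(ext))      # output mean, ~0.5 (unbiased)
```

    The steep efficiency cost for strongly biased sources is one reason the paper emphasizes extraction efficiency when comparing schemes.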

  14. Pressure ulcer healing promoted by adequate protein intake in rats

    PubMed Central

    Qin, Zhanfen; Wang, Yao; Zhao, Wei; Zhang, Yanan; Tian, Yiqing; Sun, Sujuan; Li, Xian

    2018-01-01

    The effect of protein intake on rat pressure ulcer healing was evaluated. One hundred rats were numbered according to body weight and then they were randomly divided into 4 groups (n=25) using the random number table. After rat models of stage II pressure ulcer were established, they were fed with feed containing different protein levels (10, 15, 20 and 25%). Healing time, pressure ulcer area, body weight, albumin (ALB) and hemoglobin (Hb) levels among groups were compared. Hematoxylin and eosin (H&E) staining was also performed to observe pressure ulcer tissue structure. In the healing process of pressure ulcer, rats with 20% protein intake had the shortest healing time and the smallest pressure ulcer area. Body weight, ALB and Hb levels were much closer to the normal level. H&E staining result also suggested that the pressure ulcer healing degree of rats with 20% protein intake was much better than the others. Adequate protein intake is therefore conducive to pressure ulcer healing, while excessive or insufficient protein intake has negative impact on healing. PMID:29731816

  15. Pressure ulcer healing promoted by adequate protein intake in rats.

    PubMed

    Qin, Zhanfen; Wang, Yao; Zhao, Wei; Zhang, Yanan; Tian, Yiqing; Sun, Sujuan; Li, Xian

    2018-05-01

    The effect of protein intake on rat pressure ulcer healing was evaluated. One hundred rats were numbered according to body weight and then they were randomly divided into 4 groups (n=25) using the random number table. After rat models of stage II pressure ulcer were established, they were fed with feed containing different protein levels (10, 15, 20 and 25%). Healing time, pressure ulcer area, body weight, albumin (ALB) and hemoglobin (Hb) levels among groups were compared. Hematoxylin and eosin (H&E) staining was also performed to observe pressure ulcer tissue structure. In the healing process of pressure ulcer, rats with 20% protein intake had the shortest healing time and the smallest pressure ulcer area. Body weight, ALB and Hb levels were much closer to the normal level. H&E staining result also suggested that the pressure ulcer healing degree of rats with 20% protein intake was much better than the others. Adequate protein intake is therefore conducive to pressure ulcer healing, while excessive or insufficient protein intake has negative impact on healing.

  16. 5 CFR 919.900 - Adequate evidence.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Adequate evidence. 919.900 Section 919.900 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS.... Adequate evidence means information sufficient to support the reasonable belief that a particular act or...

  17. 5 CFR 919.900 - Adequate evidence.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 2 2011-01-01 2011-01-01 false Adequate evidence. 919.900 Section 919.900 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS.... Adequate evidence means information sufficient to support the reasonable belief that a particular act or...

  18. 5 CFR 919.900 - Adequate evidence.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 2 2014-01-01 2014-01-01 false Adequate evidence. 919.900 Section 919.900 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS.... Adequate evidence means information sufficient to support the reasonable belief that a particular act or...

  19. 5 CFR 919.900 - Adequate evidence.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 2 2013-01-01 2013-01-01 false Adequate evidence. 919.900 Section 919.900 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS.... Adequate evidence means information sufficient to support the reasonable belief that a particular act or...

  20. 5 CFR 919.900 - Adequate evidence.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 5 Administrative Personnel 2 2012-01-01 2012-01-01 false Adequate evidence. 919.900 Section 919.900 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS.... Adequate evidence means information sufficient to support the reasonable belief that a particular act or...

  1. Random errors in interferometry with the least-squares method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang Qi

    2011-01-20

    This investigation analyzes random errors in interferometric surface profilers using the least-squares method when random noises are present. Two types of random noise are considered here: intensity noise and position noise. Two formulas have been derived for estimating the standard deviations of the surface height measurements: one is for estimating the standard deviation when only intensity noise is present, and the other is for estimating the standard deviation when only position noise is present. Measurements on simulated noisy interferometric data have been performed, and standard deviations of the simulated measurements have been compared with those theoretically derived. The relationships have also been discussed between random error and the wavelength of the light source and between random error and the amplitude of the interference fringe.

  2. Is the Stock of VET Skills Adequate? Assessment Methodologies.

    ERIC Educational Resources Information Center

    Blandy, Richard; Freeland, Brett

    In Australia and elsewhere, four approaches have been used to determine whether stocks of vocational education and training (VET) skills are adequate to meet industry needs. The four methods are as follows: (1) the manpower requirements approach; (2) the international, national, and industry comparisons approach; (3) the labor market analysis…

  3. 2 CFR 180.900 - Adequate evidence.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 2 Grants and Agreements 1 2014-01-01 2014-01-01 false Adequate evidence. 180.900 Section 180.900 Grants and Agreements Office of Management and Budget Guidance for Grants and Agreements OFFICE OF.... Adequate evidence means information sufficient to support the reasonable belief that a particular act or...

  4. 2 CFR 180.900 - Adequate evidence.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 2 Grants and Agreements 1 2011-01-01 2011-01-01 false Adequate evidence. 180.900 Section 180.900 Grants and Agreements Office of Management and Budget Guidance for Grants and Agreements OFFICE OF.... Adequate evidence means information sufficient to support the reasonable belief that a particular act or...

  5. The study of combining Latin Hypercube Sampling method and LU decomposition method (LULHS method) for constructing spatial random field

    NASA Astrophysics Data System (ADS)

    WANG, P. T.

    2015-12-01

    Groundwater modeling requires assigning hydrogeological properties to every numerical grid cell. Due to the lack of detailed information and the inherent spatial heterogeneity, geological properties can be treated as random variables. A hydrogeological property is assumed to follow a multivariate distribution with spatial correlations. By sampling random numbers from a given statistical distribution and assigning a value to each grid cell, a random field for modeling can be completed. Therefore, statistical sampling plays an important role in the efficiency of the modeling procedure. Latin Hypercube Sampling (LHS) is a stratified random sampling procedure that provides an efficient way to sample variables from their multivariate distributions. This study combines the stratified random procedure from LHS with simulation by LU decomposition to form LULHS. Both conditional and unconditional simulations of LULHS were developed. The simulation efficiency and spatial correlation of LULHS are compared to three other simulation methods. The results show that, for both conditional and unconditional simulation, the LULHS method is more efficient in terms of computational effort: fewer realizations are required to achieve the required statistical accuracy and spatial correlation.
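
    The unconditional case can be sketched in two steps: draw one stratified standard-normal sample per grid cell with LHS, then impose the spatial correlation by multiplying with the lower-triangular factor of the covariance matrix. Everything below is an assumption for illustration: a 1-D grid, an exponential covariance model, a midpoint variant of LHS, and a Cholesky factorization standing in for the LU decomposition of the symmetric covariance.

```python
# Hedged sketch of the LULHS idea for an unconditional 1-D random field.
# Grid size, correlation length and covariance model are illustrative.
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(2)
n = 100                                     # 1-D grid of 100 cells
xs = np.arange(n, dtype=float)
corr_len = 10.0
# Exponential covariance with unit variance and correlation length 10
C = np.exp(-np.abs(xs[:, None] - xs[None, :]) / corr_len)
L = np.linalg.cholesky(C)                   # lower-triangular factor

# Midpoint-stratified LHS of N(0,1): one quantile per equal-probability
# stratum, randomly permuted across the grid cells
u = (rng.permutation(n) + 0.5) / n
z = np.array([NormalDist().inv_cdf(v) for v in u])

field = L @ z           # spatially correlated field with N(0,1) marginals
print(field.shape)
```

    The stratification guarantees every probability decile of the marginal distribution is represented in each realization, which is why fewer realizations suffice to reach a target statistical accuracy.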

  6. Patient Satisfaction with Different Interpreting Methods: A Randomized Controlled Trial

    PubMed Central

    Leng, Jennifer; Shapiro, Ephraim; Abramson, David; Motola, Ivette; Shield, David C.; Changrani, Jyotsna

    2007-01-01

    Background Growth of the foreign-born population in the U.S. has led to increasing numbers of limited-English-proficient (LEP) patients. Innovative medical interpreting strategies, including remote simultaneous medical interpreting (RSMI), have arisen to address the language barrier. This study evaluates the impact of interpreting method on patient satisfaction. Methods 1,276 English-, Spanish-, Mandarin-, and Cantonese-speaking patients attending the primary care clinic and emergency department of a large New York City municipal hospital were screened for enrollment in a randomized controlled trial. Language-discordant patients were randomized to RSMI or usual and customary (U&C) interpreting. Patients with language-concordant providers received usual care. Demographic and patient satisfaction questionnaires were administered to all participants. Results 541 patients were language-concordant with their providers and not randomized; 371 were randomized to RSMI, 167 of whom were exposed to RSMI; and 364 were randomized to U&C, 198 of whom were exposed to U&C. Patients randomized to RSMI were more likely than those with U&C to think doctors treated them with respect (RSMI 71%, U&C 64%, p < 0.05), but they did not differ in other measures of physician communication/care. In a linear regression analysis, exposure to RSMI was significantly associated with an increase in overall satisfaction with physician communication/care (β 0.10, 95% CI 0.02–0.18, scale 0–1.0). Patients randomized to RSMI were more likely to think the interpreting method protected their privacy (RSMI 51%, U&C 38%, p < 0.05). Patients randomized to either arm of interpretation reported less comprehension and satisfaction than patients in language-concordant encounters. Conclusions While not a substitute for language-concordant providers, RSMI can improve patient satisfaction and privacy among LEP patients. Implementing RSMI should be considered an important component of a multipronged

  7. Effect of packing method on the randomness of disc packings

    NASA Astrophysics Data System (ADS)

    Zhang, Z. P.; Yu, A. B.; Oakeshott, R. B. S.

    1996-06-01

    The randomness of disc packings, generated by random sequential adsorption (RSA), random packing under gravity (RPG) and Mason packing (MP) which gives a packing density close to that of the RSA packing, has been analysed, based on the Delaunay tessellation, and is evaluated at two levels, i.e. the randomness at individual subunit level which relates to the construction of a triangle from a given edge length distribution and the randomness at network level which relates to the connection between triangles from a given triangle frequency distribution. The Delaunay tessellation itself is also analysed and its almost perfect randomness at the two levels is demonstrated, which verifies the proposed approach and provides a random reference system for the present analysis. It is found that (i) the construction of a triangle subunit is not random for the RSA, MP and RPG packings, with the degree of randomness decreasing from the RSA to MP and then to RPG packing; (ii) the connection of triangular subunits in the network is almost perfectly random for the RSA packing, acceptable for the MP packing and not good for the RPG packing. Packing method is an important factor governing the randomness of disc packings.

  8. A random spatial sampling method in a rural developing nation

    Treesearch

    Michelle C. Kondo; Kent D.W. Bream; Frances K. Barg; Charles C. Branas

    2014-01-01

    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method...

  9. Safety assessment of a shallow foundation using the random finite element method

    NASA Astrophysics Data System (ADS)

    Zaskórski, Łukasz; Puła, Wojciech

    2015-04-01

    A complex structure of soil and its random character are the reasons why soil modeling is a cumbersome task. The heterogeneity of soil has to be considered even within a homogeneous layer, so estimating the shear strength parameters of soil for the purposes of a geotechnical analysis causes many problems. The applicable standard (Eurocode 7) presents no explicit method for evaluating characteristic values of soil parameters; only general guidelines on how these values should be estimated can be found. Hence many approaches to assessing characteristic values of soil parameters are presented in the literature and can be applied in practice. In this paper, the reliability assessment of a shallow strip footing was conducted using a reliability index β. Several approaches to estimating characteristic values of soil properties were therefore compared by evaluating the values of the reliability index β that can be achieved by applying each of them. The method of Orr and Breysse, Duncan's method, Schneider's method, Schneider's method accounting for the influence of fluctuation scales, and the method included in Eurocode 7 were examined. Design values of the bearing capacity based on these approaches were compared with the stochastic bearing capacity estimated by the random finite element method (RFEM). Design values of the bearing capacity were computed for various widths and depths of the foundation in conjunction with the design approaches (DA) defined in the Eurocode. RFEM was presented by Griffiths and Fenton (1993). It combines the deterministic finite element method, random field theory and Monte Carlo simulations. Random field theory makes it possible to consider the random character of soil parameters within a homogeneous layer. For this purpose a soil property is treated as a separate random variable in every element of the finite element mesh, with a proper correlation structure between points of the given area. RFEM was applied to estimate which theoretical

  10. Efficacy of S-flurbiprofen plaster in knee osteoarthritis treatment: Results from a phase III, randomized, active-controlled, adequate, and well-controlled trial.

    PubMed

    Yataba, Ikuko; Otsuka, Noboru; Matsushita, Isao; Matsumoto, Hideo; Hoshino, Yuichi

    2017-01-01

    S-flurbiprofen plaster (SFPP) is a novel non-steroidal anti-inflammatory drug (NSAID) patch intended for topical treatment of musculoskeletal diseases. This trial was conducted to examine the effectiveness of SFPP against an active comparator, the flurbiprofen (FP) patch, on knee osteoarthritis (OA) symptoms. This was a phase III, multi-center, randomized, adequate, and well-controlled trial; both investigators and patients were blinded to the assigned treatment. The 633 enrolled knee OA patients were treated with either SFPP or FP patch for two weeks. The primary endpoint was improvement in knee pain on rising from a chair, as assessed by visual analogue scale (rVAS). Safety was evaluated through adverse events (AEs). The change in rVAS was 40.9 mm in the SFPP group and 30.6 mm in the FP patch group (p < 0.001). The incidence of drug-related AEs at the application site was 9.5% (32 AEs, 29 mild and 3 moderate) with SFPP and 1.6% with the FP patch (p < 0.001). Withdrawals due to AEs numbered five with SFPP and one with the FP patch. The superiority of SFPP in efficacy was demonstrated. Most AEs were mild and few led to treatment discontinuation. SFPP therefore provides an additional option for knee OA therapy.

  11. Evaluation of active and passive recruitment methods used in randomized controlled trials targeting pediatric obesity.

    PubMed

    Raynor, Hollie A; Osterholt, Kathrin M; Hart, Chantelle N; Jelalian, Elissa; Vivier, Patrick; Wing, Rena R

    2009-01-01

    To evaluate enrollment numbers, randomization rates, costs, and cost-effectiveness of active versus passive recruitment methods for parent-child dyads in two pediatric obesity intervention trials. Recruitment methods were categorized as active (pediatrician referral and targeted mailings, with participants identified by a researcher or health care provider) versus passive (newspaper, bus, internet, television, and earning statements; fairs/community centers/schools; and word of mouth; with participants self-identified). Numbers of enrolled and randomized families and costs per recruitment method were monitored throughout the 22-month recruitment period. Costs (in USD) per recruitment method included staff time, mileage, and the targeted costs of each method. A total of 940 families were referred or made contact, with 164 families randomized (child: 7.2+/-1.6 years, 2.27+/-0.61 standardized body mass index [zBMI], 86.6% obese, 61.7% female, 83.5% Caucasian; parent: 38.0+/-5.8 years, 32.9+/-8.4 BMI, 55.2% obese, 92.7% female, 89.6% Caucasian). Pediatrician referral, followed by targeted mailings, produced the largest numbers of enrolled and randomized families (the two methods combined producing 87.2% of randomized families). Passive recruitment methods yielded better retention from enrollment to randomization (p<0.05) but produced few families (21 in total). Approximately $91,000 was spent on recruitment, with a cost per randomized family of $554.77. Pediatrician referral was the most cost-effective method, at $145.95 per randomized family, but yielded only 91 randomized families over 22 months of continuous recruitment. Pediatrician referral and targeted mailings, both active recruitment methods, were the most successful strategies. However, recruitment demanded significant resources. Successful recruitment for pediatric trials should use several strategies. NCT00259324, NCT00200265.

  12. Patient satisfaction with different interpreting methods: a randomized controlled trial.

    PubMed

    Gany, Francesca; Leng, Jennifer; Shapiro, Ephraim; Abramson, David; Motola, Ivette; Shield, David C; Changrani, Jyotsna

    2007-11-01

    Growth of the foreign-born population in the U.S. has led to increasing numbers of limited-English-proficient (LEP) patients. Innovative medical interpreting strategies, including remote simultaneous medical interpreting (RSMI), have arisen to address the language barrier. This study evaluates the impact of interpreting method on patient satisfaction. 1,276 English-, Spanish-, Mandarin-, and Cantonese-speaking patients attending the primary care clinic and emergency department of a large New York City municipal hospital were screened for enrollment in a randomized controlled trial. Language-discordant patients were randomized to RSMI or usual and customary (U&C) interpreting. Patients with language-concordant providers received usual care. Demographic and patient satisfaction questionnaires were administered to all participants. 541 patients were language-concordant with their providers and not randomized; 371 were randomized to RSMI, 167 of whom were exposed to RSMI; and 364 were randomized to U&C, 198 of whom were exposed to U&C. Patients randomized to RSMI were more likely than those with U&C to think doctors treated them with respect (RSMI 71%, U&C 64%, p < 0.05), but they did not differ in other measures of physician communication/care. In a linear regression analysis, exposure to RSMI was significantly associated with an increase in overall satisfaction with physician communication/care (beta 0.10, 95% CI 0.02-0.18, scale 0-1.0). Patients randomized to RSMI were more likely to think the interpreting method protected their privacy (RSMI 51%, U&C 38%, p < 0.05). Patients randomized to either arm of interpretation reported less comprehension and satisfaction than patients in language-concordant encounters. While not a substitute for language-concordant providers, RSMI can improve patient satisfaction and privacy among LEP patients. 
Implementing RSMI should be considered an important component of a multipronged approach to addressing language barriers in health

  13. Yoga for veterans with chronic low back pain: Design and methods of a randomized clinical trial.

    PubMed

    Groessl, Erik J; Schmalzl, Laura; Maiya, Meghan; Liu, Lin; Goodman, Debora; Chang, Douglas G; Wetherell, Julie L; Bormann, Jill E; Atkinson, J Hamp; Baxi, Sunita

    2016-05-01

    Chronic low back pain (CLBP) afflicts millions of people worldwide, with particularly high prevalence in military veterans. Many treatment options exist for CLBP, but most have limited effectiveness and some have significant side effects. In general populations with CLBP, yoga has been shown to improve health outcomes with few side effects. However, yoga has not been adequately studied in military veteran populations. In the current paper we describe the design and methods of a randomized clinical trial aimed at examining whether yoga can effectively reduce disability and pain in US military veterans with CLBP. A total of 144 US military veterans with CLBP will be randomized to either yoga or a delayed-treatment comparison group. The yoga intervention will consist of twice-weekly yoga classes for 12 weeks, complemented by regular home practice guided by a manual. The delayed-treatment group will receive the same intervention after six months. The primary outcome is the change in back pain-related disability measured with the Roland-Morris Disability Questionnaire at baseline and 12 weeks. Secondary outcomes include pain intensity, pain interference, depression, anxiety, fatigue/energy, quality of life, self-efficacy, sleep quality, and medication usage. Additional process and/or mediational factors will be measured to examine dose response and effect mechanisms. Assessments will be conducted at baseline, 6 weeks, 12 weeks, and 6 months. All randomized participants will be included in intention-to-treat analyses. Study results will provide much-needed evidence on the feasibility and effectiveness of yoga as a therapeutic modality for the treatment of CLBP in US military veterans. Published by Elsevier Inc.

  14. Evaluation of active and passive recruitment methods used in randomized controlled trials targeting pediatric obesity

    PubMed Central

    RAYNOR, HOLLIE A.; OSTERHOLT, KATHRIN M.; HART, CHANTELLE N.; JELALIAN, ELISSA; VIVIER, PATRICK; WING, RENA R.

    2016-01-01

    Objective Evaluate enrollment numbers, randomization rates, costs, and cost-effectiveness of active versus passive recruitment methods for parent-child dyads into two pediatric obesity intervention trials. Methods Recruitment methods were categorized into active (pediatrician referral and targeted mailings, with participants identified by researcher/health care provider) versus passive methods (newspaper, bus, internet, television, and earning statements; fairs/community centers/schools; and word of mouth; with participants self-identified). Numbers of enrolled and randomized families and costs/recruitment method were monitored throughout the 22-month recruitment period. Costs (in USD) per recruitment method included staff time, mileage, and targeted costs of each method. Results A total of 940 families were referred or made contact, with 164 families randomized (child: 7.2±1.6 years, 2.27±0.61 standardized body mass index [zBMI], 86.6% obese, 61.7% female, 83.5% white; parent: 38.0±5.8 years, 32.9±8.4 BMI, 55.2% obese, 92.7% female, 89.6% white). Pediatrician referral, followed by targeted mailings, produced the largest number of enrolled and randomized families (both methods combined producing 87.2% of randomized families). Passive recruitment methods yielded better retention from enrollment to randomization (p < 0.05), but produced few families (21 in total). Approximately $91,000 was spent on recruitment, with cost per randomized family at $554.77. Pediatrician referral was the most cost-effective method, at $145.95/randomized family, but yielded only 91 randomized families over 22 months of continuous recruitment. Conclusion Pediatrician referral and targeted mailings, which are active recruitment methods, were the most successful strategies. However, recruitment demanded significant resources. Successful recruitment for pediatric trials should use several strategies. Clinical Trials Registration: NCT00259324, NCT00200265 PMID:19922036

  15. Are Substance Use Prevention Programs More Effective in Schools Making Adequate Yearly Progress? A Study of Project ALERT

    ERIC Educational Resources Information Center

    Clark, Heddy Kovach; Ringwalt, Chris L.; Shamblen, Stephen R.; Hanley, Sean M.; Flewelling, Robert L.

    2011-01-01

    This exploratory study sought to determine if a popular school-based drug prevention program might be effective in schools that are making adequate yearly progress (AYP). Thirty-four schools with grades 6 through 8 in 11 states were randomly assigned either to receive Project ALERT (n = 17) or to a control group (n = 17); of these, 10 intervention…

  16. 41 CFR 105-68.900 - Adequate evidence.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false Adequate evidence. 105-68.900 Section 105-68.900 Public Contracts and Property Management Federal Property Management... evidence. Adequate evidence means information sufficient to support the reasonable belief that a particular...

  17. 41 CFR 105-68.900 - Adequate evidence.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false Adequate evidence. 105-68.900 Section 105-68.900 Public Contracts and Property Management Federal Property Management... evidence. Adequate evidence means information sufficient to support the reasonable belief that a particular...

  18. 41 CFR 105-68.900 - Adequate evidence.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 41 Public Contracts and Property Management 3 2013-07-01 2013-07-01 false Adequate evidence. 105-68.900 Section 105-68.900 Public Contracts and Property Management Federal Property Management... evidence. Adequate evidence means information sufficient to support the reasonable belief that a particular...

  19. 41 CFR 105-68.900 - Adequate evidence.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 41 Public Contracts and Property Management 3 2014-01-01 2014-01-01 false Adequate evidence. 105-68.900 Section 105-68.900 Public Contracts and Property Management Federal Property Management... evidence. Adequate evidence means information sufficient to support the reasonable belief that a particular...

  20. 41 CFR 105-68.900 - Adequate evidence.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 41 Public Contracts and Property Management 3 2012-01-01 2012-01-01 false Adequate evidence. 105-68.900 Section 105-68.900 Public Contracts and Property Management Federal Property Management... evidence. Adequate evidence means information sufficient to support the reasonable belief that a particular...

  1. Interactive Book Reading to Accelerate Word Learning by Kindergarten Children With Specific Language Impairment: Identifying an Adequate Intensity and Variation in Treatment Response

    PubMed Central

    Voelmle, Krista; Fierro, Veronica; Flake, Kelsey; Fleming, Kandace K.; Romine, Rebecca Swinburne

    2017-01-01

    Purpose This study sought to identify an adequate intensity of interactive book reading for new word learning by children with specific language impairment (SLI) and to examine variability in treatment response. Method An escalation design adapted from nontoxic drug trials (Hunsberger, Rubinstein, Dancey, & Korn, 2005) was used in this Phase I/II preliminary clinical trial. A total of 27 kindergarten children with SLI were randomized to 1 of 4 intensities of interactive book reading: 12, 24, 36, or 48 exposures. Word learning was monitored through a definition task and a naming task. An intensity response curve was examined to identify the adequate intensity. Correlations and classification accuracy were used to examine variation in response to treatment relative to pretreatment and early treatment measures. Results Response to treatment improved as intensity increased from 12 to 24 to 36 exposures, and then no further improvements were observed as intensity increased to 48 exposures. There was variability in treatment response: Children with poor phonological awareness, low vocabulary, and/or poor nonword repetition were less likely to respond to treatment. Conclusion The adequate intensity for this version of interactive book reading was 36 exposures, but further development of the treatment is needed to increase the benefit for children with SLI. PMID:28036410

  2. Recording 2-D Nutation NQR Spectra by Random Sampling Method

    PubMed Central

    Sinyavsky, Nikolaj; Jadzyn, Maciej; Ostafin, Michal; Nogaj, Boleslaw

    2010-01-01

    The method of random sampling was introduced for the first time in nutation nuclear quadrupole resonance (NQR) spectroscopy, where the nutation spectra show characteristic singularities in the form of shoulders. The analytic formulae for complex two-dimensional (2-D) nutation NQR spectra (I = 3/2) were obtained and the condition for resolving the spectral singularities for small values of the asymmetry parameter η was determined. Our results show that the method of random sampling of a nutation interferogram allows significant reduction of the time required to perform a 2-D nutation experiment and does not worsen the spectral resolution. PMID:20949121

  3. Fast physical-random number generation using laser diode's frequency noise: influence of frequency discriminator

    NASA Astrophysics Data System (ADS)

    Matsumoto, Kouhei; Kasuya, Yuki; Yumoto, Mitsuki; Arai, Hideaki; Sato, Takashi; Sakamoto, Shuichi; Ohkawa, Masashi; Ohdaira, Yasuo

    2018-02-01

    Not so long ago, pseudo-random numbers generated by numerical formulae were considered adequate for encrypting important data files, because of the time needed to break them. With today's ultra-high-speed processors, however, this is no longer true. So, in order to thwart ever-more advanced attempts to breach a system's protections, cryptologists have devised an approach that is considered virtually impossible to break, using an effectively limitless supply of physical random numbers. This research describes a method whereby a laser diode's frequency noise generates large quantities of physical random numbers. Using two types of photodetectors (APD and PIN-PD), we tested the abilities of two types of lasers (FP-LD and VCSEL) to generate random numbers. In all instances, an etalon served as the frequency discriminator, the pass rates were determined using the NIST FIPS 140-2 test at each bit rate, and the random number generation (RNG) speed was noted.
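
    The FIPS 140-2 statistical suite mentioned above includes, among others, a simple monobit test: in a 20,000-bit sample, the number of ones must lie strictly between 9,725 and 10,275. A minimal Python sketch of that single test (here a software-generated stream stands in for digitized laser-noise bits):

    ```python
    import secrets

    def fips_monobit_test(bits: list[int]) -> bool:
        """FIPS 140-2 monobit test: in a 20,000-bit sample, the count of
        ones must fall strictly between 9,725 and 10,275."""
        assert len(bits) == 20000
        ones = sum(bits)
        return 9725 < ones < 10275

    # Example: test a software-generated stream (a physical RNG would
    # supply these bits from digitized laser frequency noise instead).
    stream = [secrets.randbits(1) for _ in range(20000)]
    print(fips_monobit_test(stream))
    ```

    The full suite also includes poker, runs, and long-run tests; the monobit test alone only catches gross bias toward zeros or ones.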

  4. 40 CFR 716.25 - Adequate file search.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 32 2013-07-01 2013-07-01 false Adequate file search. 716.25 Section 716.25 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of a...

  5. 40 CFR 716.25 - Adequate file search.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Adequate file search. 716.25 Section 716.25 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of a...

  6. 40 CFR 716.25 - Adequate file search.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 31 2014-07-01 2014-07-01 false Adequate file search. 716.25 Section 716.25 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of a...

  7. 40 CFR 716.25 - Adequate file search.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 32 2012-07-01 2012-07-01 false Adequate file search. 716.25 Section 716.25 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of a...

  8. 40 CFR 716.25 - Adequate file search.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 31 2011-07-01 2011-07-01 false Adequate file search. 716.25 Section 716.25 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of a...

  9. Slow-release L-Cysteine (Acetium®) Lozenge Is an Effective New Method in Smoking Cessation. A Randomized, Double-blind, Placebo-controlled Intervention.

    PubMed

    Syrjänen, Kari; Eronen, Katja; Hendolin, Panu; Paloheimo, Lea; Eklund, Carita; Bäckström, Anna; Suovaniemi, Osmo

    2017-07-01

    Because of the major health problems and annual economic burden caused by cigarette smoking, effective new tools for smoking intervention are urgently needed. Our previous randomized controlled trial (RCT) provided promising results on the efficacy of a slow-release L-cysteine lozenge in smoking intervention, but the study was not adequately powered. To confirm, in an adequately powered study, the previous RCT's finding that effective elimination of acetaldehyde in saliva by slow-release L-cysteine (Acetium® lozenge, Biohit Oyj, Helsinki) would assist smoking cessation by reducing acetaldehyde-enhanced nicotine addiction, we undertook a double-blind, randomized, placebo-controlled trial comparing Acetium® lozenge and placebo in smoking intervention. A cohort of 1,998 cigarette smokers was randomly allocated to intervention (n=996) and placebo (n=1,002) arms. At baseline, smoking history was recorded by questionnaire, with nicotine dependence tested according to the Fagerström scale (FTND). The subjects kept a smoking diary recording the daily numbers of cigarettes and lozenges and subjective sensations of smoking. The data were analysed separately for point prevalence of abstinence (PPA) and prolonged abstinence (PA) endpoints. Altogether, 753 study subjects completed the trial per protocol (PP), 944 with violations (mITT), and the rest (n=301) were lost to follow-up (LTF). During the 6-month intervention, 331 subjects stopped smoking: 181 (18.2%) in the intervention arm and 150 (15.0%) in the placebo arm (OR=1.43; 95%CI=1.09-1.88; p=0.010). In the PP group, 170 (45.3%) quit smoking in the intervention arm compared with 134 (35.4%) in the placebo arm (OR=1.51, 95%CI=1.12-2.02; p=0.006). In a multivariate (Poisson regression) model, a decreased level of smoking pleasure (p=0.010) and "smoking sensations changed" were powerful independent predictors of quit events (IRR=12.01; 95%CI=1.5-95.6). Acetium® lozenge, herein confirmed in an

  10. Adequate supervision for children and adolescents.

    PubMed

    Anderst, James; Moffatt, Mary

    2014-11-01

    Primary care providers (PCPs) have the opportunity to improve child health and well-being by addressing supervision issues before an injury or exposure has occurred and/or after an injury or exposure has occurred. Appropriate anticipatory guidance on supervision at well-child visits can improve supervision of children, and may prevent future harm. Adequate supervision varies based on the child's development and maturity, and the risks in the child's environment. Consideration should be given to issues as wide ranging as swimming pools, falls, dating violence, and social media. By considering the likelihood of harm and the severity of the potential harm, caregivers may provide adequate supervision by minimizing risks to the child while still allowing the child to take "small" risks as needed for healthy development. Caregivers should initially focus on direct (visual, auditory, and proximity) supervision of the young child. Gradually, supervision needs to be adjusted as the child develops, emphasizing a safe environment and safe social interactions, with graduated independence. PCPs may foster adequate supervision by providing concrete guidance to caregivers. In addition to preventing injury, supervision includes fostering a safe, stable, and nurturing relationship with every child. PCPs should be familiar with age/developmentally based supervision risks, adequate supervision based on those risks, characteristics of neglectful supervision based on age/development, and ways to encourage appropriate supervision throughout childhood. Copyright 2014, SLACK Incorporated.

  11. The Wire-Grasping Method as a New Technique for Forceps Biopsy of Biliary Strictures: A Prospective Randomized Controlled Study of Effectiveness

    PubMed Central

    Yamashita, Yasunobu; Ueda, Kazuki; Kawaji, Yuki; Tamura, Takashi; Itonaga, Masahiro; Yoshida, Takeichi; Maeda, Hiroki; Magari, Hirohito; Maekita, Takao; Iguchi, Mikitaka; Tamai, Hideyuki; Ichinose, Masao; Kato, Jun

    2016-01-01

    Background/Aims Transpapillary forceps biopsy is an effective diagnostic technique in patients with biliary stricture. This prospective study aimed to determine the usefulness of the wire-grasping method as a new technique for forceps biopsy. Methods Consecutive patients with biliary stricture or irregularities of the bile duct wall were randomly allocated to either the direct or wire-grasping method group. In the wire-grasping method, forceps in the duodenum grasps a guide-wire placed into the bile duct beforehand, and then, the forceps are pushed through the papilla without endoscopic sphincterotomy. In the direct method, forceps are directly pushed into the bile duct alongside a guide-wire. The primary endpoint was the success rate of obtaining specimens suitable for adequate pathological examination. Results In total, 32 patients were enrolled, and 28 (14 in each group) were eligible for analysis. The success rate was significantly higher using the wire-grasping method than the direct method (100% vs 50%, p=0.016). Sensitivity and accuracy for the diagnosis of cancer were comparable in patients with the successful procurement of biopsy specimens between the two methods (91% vs 83% and 93% vs 86%, respectively). Conclusions The wire-grasping method is useful for diagnosing patients with biliary stricture or irregularities of the bile duct wall. PMID:27021502

  12. Feasibility study of modeling liver thermal damage using minimally invasive optical method adequate for in situ measurement.

    PubMed

    Zhao, Jinzhe; Zhao, Qi; Jiang, Yingxu; Li, Weitao; Yang, Yamin; Qian, Zhiyu; Liu, Jia

    2018-06-01

    Liver thermal ablation techniques have been widely used for the treatment of liver cancer. A kinetic model of damage propagation plays an important role in ablation prediction and real-time efficacy assessment. However, practical methods for modeling liver thermal damage are rare. A minimally invasive optical method especially adequate for in situ liver thermal damage modeling is introduced in this paper. Porcine liver tissue was heated by water bath at different temperatures. During thermal treatment, the diffuse reflectance spectrum of the liver was measured by optical fiber and used to deduce the reduced scattering coefficient (μs′). Arrhenius parameters were obtained through a non-isothermal heating approach with μs′ as the damage marker. The activation energy (Ea) and frequency factor (A) were deduced from these experiments; their averaged values are 1.200 × 10⁵ J·mol⁻¹ and 4.016 × 10¹⁷ s⁻¹, respectively. The results were verified for their reasonableness and practicality. Therefore, it is feasible to model liver thermal damage based on minimally invasive measurement of optical properties and in situ kinetic analysis of damage progress with the Arrhenius model. These parameters and this method are beneficial for preoperative planning and real-time efficacy assessment of liver ablation therapy. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
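
    With the averaged Arrhenius parameters reported in the abstract (Ea = 1.200 × 10⁵ J/mol, A = 4.016 × 10¹⁷ s⁻¹), the standard damage integral Ω(t) = ∫ A·exp(−Ea/RT) dt can be evaluated directly. A brief Python illustration; the 60 °C bath temperature is an assumed example, not a value from the study:

    ```python
    import math

    R = 8.314        # universal gas constant, J mol^-1 K^-1
    E_A = 1.200e5    # activation energy reported in the study, J mol^-1
    A = 4.016e17     # frequency factor reported in the study, s^-1

    def damage_rate(temp_celsius: float) -> float:
        """Arrhenius damage accumulation rate dOmega/dt at a fixed temperature."""
        T = temp_celsius + 273.15
        return A * math.exp(-E_A / (R * T))

    def time_to_omega_one(temp_celsius: float) -> float:
        """Exposure time (s) for the damage integral to reach Omega = 1
        under constant-temperature heating."""
        return 1.0 / damage_rate(temp_celsius)

    # Assumed example: a 60 degC water bath
    print(round(time_to_omega_one(60.0), 1))
    ```

    Under time-varying heating the integral would be accumulated numerically over the measured temperature history rather than inverted in closed form.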

  13. Methods and analysis of realizing randomized grouping.

    PubMed

    Hu, Liang-Ping; Bao, Xiao-Lei; Wang, Qi

    2011-07-01

    Randomization is one of the four basic principles of research design. The meaning of randomization includes two aspects: one is to randomly select samples from the population, which is known as random sampling; the other is to randomly group all the samples, which is called randomized grouping. Randomized grouping can be subdivided into three categories: completely, stratified and dynamically randomized grouping. This article mainly introduces the steps of complete randomization, the definition of dynamic randomization and the realization of random sampling and grouping by SAS software.
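
    The two operations described above, random sampling from a population and completely randomized grouping, can be sketched in a few lines. This is a hypothetical Python equivalent (the article itself works in SAS); the subject labels, sample size, and seed are invented for illustration:

    ```python
    import random

    rng = random.Random(2011)  # seed chosen arbitrarily, for reproducibility

    # Random sampling: draw n subjects from a sampling frame without replacement.
    population = [f"subject_{i:03d}" for i in range(1, 101)]
    sample = rng.sample(population, k=20)

    # Complete randomization: assign each sampled subject independently
    # to one of two arms, then report the resulting group sizes.
    groups = {s: rng.choice(["treatment", "control"]) for s in sample}
    sizes = {arm: sum(1 for g in groups.values() if g == arm)
             for arm in ("treatment", "control")}
    print(sizes)
    ```

    Note that complete randomization does not guarantee equal group sizes; that drift is what the stratified and dynamic schemes mentioned in the abstract are designed to control.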

  14. The Wire-Grasping Method as a New Technique for Forceps Biopsy of Biliary Strictures: A Prospective Randomized Controlled Study of Effectiveness.

    PubMed

    Yamashita, Yasunobu; Ueda, Kazuki; Kawaji, Yuki; Tamura, Takashi; Itonaga, Masahiro; Yoshida, Takeichi; Maeda, Hiroki; Magari, Hirohito; Maekita, Takao; Iguchi, Mikitaka; Tamai, Hideyuki; Ichinose, Masao; Kato, Jun

    2016-07-15

    Transpapillary forceps biopsy is an effective diagnostic technique in patients with biliary stricture. This prospective study aimed to determine the usefulness of the wire-grasping method as a new technique for forceps biopsy. Consecutive patients with biliary stricture or irregularities of the bile duct wall were randomly allocated to either the direct or wire-grasping method group. In the wiregrasping method, forceps in the duodenum grasps a guidewire placed into the bile duct beforehand, and then, the forceps are pushed through the papilla without endoscopic sphincterotomy. In the direct method, forceps are directly pushed into the bile duct alongside a guide-wire. The primary endpoint was the success rate of obtaining specimens suitable for adequate pathological examination. In total, 32 patients were enrolled, and 28 (14 in each group) were eligible for analysis. The success rate was significantly higher using the wire-grasping method than the direct method (100% vs 50%, p=0.016). Sensitivity and accuracy for the diagnosis of cancer were comparable in patients with the successful procurement of biopsy specimens between the two methods (91% vs 83% and 93% vs 86%, respectively). The wire-grasping method is useful for diagnosing patients with biliary stricture or irregularities of the bile duct wall.

  15. A new time domain random walk method for solute transport in 1-D heterogeneous media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Banton, O.; Delay, F.; Porel, G.

    A new method to simulate solute transport in 1-D heterogeneous media is presented. This time domain random walk method (TDRW), similar in concept to the classical random walk method, calculates the arrival time of a particle cloud at a given location (directly providing the solute breakthrough curve). The main advantage of the method is that the restrictions on the space increments and the time steps which exist with the finite differences and random walk methods are avoided. In a homogeneous zone, the breakthrough curve (BTC) can be calculated directly at a given distance using a few hundred particles, or directly at the boundary of the zone. Comparisons with analytical solutions and with the classical random walk method show the reliability of this method. The velocity and dispersivity calculated from the simulated results agree within two percent with the values used as input in the model. For contrasted heterogeneous media, the random walk can generate high numerical dispersion, while the time domain approach does not.
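
    As a hedged illustration of the time-domain idea (sampling each particle's arrival time directly rather than stepping it through space), the travel time over distance x in a homogeneous 1-D zone can be approximated as Gaussian with mean x/v and variance 2Dx/v³, a standard small-dispersion approximation to the advection-dispersion solution. This is not the paper's exact formulation, and the parameter values below are invented:

    ```python
    import random
    import statistics

    def tdrw_breakthrough(x, v, D, n_particles=5000, seed=1):
        """Sample particle arrival times at distance x (time domain random
        walk, Gaussian travel-time approximation): mean x/v, variance
        2*D*x/v**3. A histogram of the returned times is the BTC."""
        rng = random.Random(seed)
        mean_t = x / v
        sigma_t = (2.0 * D * x / v**3) ** 0.5
        return [rng.gauss(mean_t, sigma_t) for _ in range(n_particles)]

    # Invented example parameters: x = 10 m, v = 0.5 m/day, D = 0.05 m^2/day
    times = tdrw_breakthrough(x=10.0, v=0.5, D=0.05)
    print(round(statistics.mean(times), 1))  # close to x/v = 20 days
    ```

    In a layered medium the same draw would be repeated zone by zone and the per-zone times summed, which is where the method's freedom from space-step restrictions pays off.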

  16. Emergency contraceptive pills as a backup for lactational amenorrhea method (LAM) of contraception: a randomized controlled trial.

    PubMed

    Shaaban, Omar M; Hassen, Shaimaa G; Nour, Sanna A; Kames, Mervat A; Yones, Entsar M

    2013-03-01

    The use of breastfeeding as a method of birth spacing occasionally ends in "unplanned pregnancy," owing to the unexpected expiration of one or more of the lactational amenorrhea method (LAM) prerequisites. The current study tests a new concept: that in-advance provision of a single packet of progestogen emergency contraception (EC) pills during postpartum LAM counseling may decrease the incidence of unplanned pregnancy during breastfeeding. This was a registered two-armed randomized controlled trial (NCT 01111929). Women intending to breastfeed and to postpone pregnancy for 1 year or more were approached and received adequate postpartum contraceptive counseling. Women intending to use LAM were randomly assigned to one of two groups. The LAM-only group received proper LAM counseling and no counseling about EC. The LAM-EC group received counseling for both LAM and EC, with in-advance provision of one packet of EC pills, and was advised to use these pills if one of the prerequisites of LAM expired and a sexual relation had occurred before the initiation of another regular contraceptive method. All participants were advised that they would need another regular method upon expiration of any of the LAM prerequisites. A total of 1158 eligible parturients were randomized into two equal groups. Forty-four percent of the women provided with EC used them. Significantly more women in the LAM-EC group initiated regular contraception within or shortly after the first 6 months postpartum compared with the LAM-only group (30.5% vs. 7.3%, respectively; p=.0004). Pregnancy occurred in 5% of the LAM-only group as compared with 0.8% in the LAM-EC group (p=.005). Minimal side effects were reported after EC use. In-advance provision of EC pills can increase the rate of initiation of regular contraception once one or more of the prerequisites of LAM expire.
Consequently, the use of EC pills as a temporary backup of LAM can decrease the incidence

  17. 10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining adequate...

  18. 10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 4 2014-01-01 2014-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining adequate...

  19. 10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 4 2012-01-01 2012-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining adequate...

  20. 10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 4 2013-01-01 2013-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining adequate...

  1. A novel attack method about double-random-phase-encoding-based image hiding method

    NASA Astrophysics Data System (ADS)

    Xu, Hongsheng; Xiao, Zhijun; Zhu, Xianchen

    2018-03-01

    By using optical image processing techniques, a novel text encryption and hiding method based on the double-random-phase-encoding technique is proposed in the paper. First, the secret message is transformed into a 2-dimensional array: the higher bits of the elements in the array are filled with the bit stream of the secret text, while the lower bits store specific values. Then, the transformed array is encoded by the double random phase encoding technique. Last, the encoded array is embedded in a public host image to obtain the image embedded with hidden text. The performance of the proposed technique is tested via analytical modeling and test data streams. Experimental results show that the secret text can be recovered accurately or almost accurately, while maintaining the quality of the host image embedded with hidden data, by properly selecting the method of transforming the secret text into an array and the superimposition coefficient.
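
    The double-random-phase-encoding step itself has a compact expression: multiply the input array by a random phase mask, Fourier transform, multiply by a second random mask in the Fourier plane, and inverse transform; decryption applies the conjugate masks in reverse. A minimal NumPy sketch (array size and mask seeds are arbitrary; the embedding-into-a-host-image step is omitted):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N = 64
    plain = rng.random((N, N))   # stand-in for the 2-D array holding the secret text

    # Two statistically independent random phase masks (unit modulus)
    phi1 = np.exp(2j * np.pi * rng.random((N, N)))
    phi2 = np.exp(2j * np.pi * rng.random((N, N)))

    # Encode: input-plane mask, Fourier transform, Fourier-plane mask, inverse FFT
    cipher = np.fft.ifft2(np.fft.fft2(plain * phi1) * phi2)

    # Decode with the conjugate masks, reversing each step
    decoded = np.fft.ifft2(np.fft.fft2(cipher) * np.conj(phi2)) * np.conj(phi1)
    print(np.allclose(decoded.real, plain))
    ```

    Because both masks have unit modulus, the cipher is a stationary white-noise-like complex field, and decryption is exact up to floating-point error when both masks are known.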

  2. Randomization in clinical trials in orthodontics: its significance in research design and methods to achieve it.

    PubMed

    Pandis, Nikolaos; Polychronopoulou, Argy; Eliades, Theodore

    2011-12-01

    Randomization is a key step in reducing selection bias during the treatment allocation phase of randomized clinical trials. The process of randomization follows specific steps: generation of the randomization list, allocation concealment, and implementation of randomization. In the dental and orthodontic literature, treatment allocation is frequently characterized as random; however, the randomization procedures followed are often not appropriate. Randomization methods assign treatment to the trial arms at random, without foreknowledge of allocation by either the participants or the investigators, thus reducing selection bias. Randomization entails generation of the random allocation sequence, allocation concealment, and the actual methodology of implementing treatment allocation randomly and unpredictably. The most popular randomization methods include some form of restricted and/or stratified randomization. This article introduces the reasons that make randomization an integral part of solid clinical trial methodology and presents the main randomization schemes applicable to clinical trials in orthodontics.
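
    Restricted randomization, named above as the most popular family of schemes, is commonly implemented as permuted blocks. A minimal Python sketch of list generation (block size, arm labels, and seed are arbitrary choices, not taken from the article):

    ```python
    import random

    def permuted_block_list(n_subjects, block_size=4, seed=42):
        """Generate a restricted randomization list: within each block,
        exactly half the allocations go to each of two arms, so group
        sizes never drift far apart during enrollment."""
        assert block_size % 2 == 0
        rng = random.Random(seed)
        allocations = []
        while len(allocations) < n_subjects:
            block = ["A", "B"] * (block_size // 2)
            rng.shuffle(block)          # random order within the block
            allocations.extend(block)
        return allocations[:n_subjects]

    seq = permuted_block_list(20)
    print(seq.count("A"), seq.count("B"))  # balanced: 10 10
    ```

    The generated list covers only the first of the three steps; allocation concealment (e.g., sealed opaque envelopes or a central service) and unpredictable implementation at enrollment remain essential.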

  3. Avoidable waste related to inadequate methods and incomplete reporting of interventions: a systematic review of randomized trials performed in Sub-Saharan Africa.

    PubMed

    Ndounga Diakou, Lee Aymar; Ntoumi, Francine; Ravaud, Philippe; Boutron, Isabelle

    2017-07-05

    Randomized controlled trials (RCTs) are needed to improve health care in Sub-Saharan Africa (SSA). However, inadequate methods and incomplete reporting of interventions can prevent the translation of research into practice, leading to wasted research. The aim of this systematic review was to assess the avoidable waste in research related to inadequate methods and incomplete reporting of interventions in RCTs performed in SSA. We performed a methodological systematic review of RCTs performed in SSA and published between 1 January 2014 and 31 March 2015. We searched PubMed, the Cochrane Library and the African Index Medicus to identify reports. We assessed the risk of bias using the Cochrane Risk of Bias tool and, for each risk-of-bias item, determined whether easy adjustments with no or minor cost could change the domain to low risk of bias. The reporting of interventions was assessed using standardized checklists based on the Consolidated Standards of Reporting Trials and core items of the Template for Intervention Description and Replication. Corresponding authors of reports with incomplete reporting of interventions were contacted to obtain additional information. Data were analyzed descriptively. Among the 121 RCTs selected, 74 (61%) evaluated pharmacological treatments (PTs), including drugs and nutritional supplements, and 47 (39%) nonpharmacological treatments (NPTs) (40 participative interventions, 1 surgical procedure, 3 medical devices and 3 therapeutic strategies). Overall, the randomization sequence was adequately generated in 76 reports (62%) and the intervention allocation concealed in 48 (39%). The primary outcome was described as blinded in 46 reports (38%), and incomplete outcome data were adequately addressed in 78 (64%). Applying easy methodological adjustments with no or minor additional cost to trials with at least one domain at high risk of bias could have reduced the number of domains at high risk for 24 RCTs (19%). Interventions were

  4. Note on coefficient matrices from stochastic Galerkin methods for random diffusion equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou Tao, E-mail: tzhou@lsec.cc.ac.c; Tang Tao, E-mail: ttang@hkbu.edu.h

    2010-11-01

    In a recent work by Xiu and Shen [D. Xiu, J. Shen, Efficient stochastic Galerkin methods for random diffusion equations, J. Comput. Phys. 228 (2009) 266-281], the Galerkin methods are used to solve stochastic diffusion equations in random media, where some properties for the coefficient matrix of the resulting system are provided. They also posed an open question on the properties of the coefficient matrix. In this work, we will provide some results related to the open question.

  5. The supersymmetric method in random matrix theory and applications to QCD

    NASA Astrophysics Data System (ADS)

    Verbaarschot, Jacobus

    2004-12-01

    The supersymmetric method is a powerful method for the nonperturbative evaluation of quenched averages in disordered systems. Among others, it has been applied to the statistical theory of S-matrix fluctuations, the theory of universal conductance fluctuations and the microscopic spectral density of the QCD Dirac operator. We start this series of lectures with a general review of Random Matrix Theory and the statistical theory of spectra. An elementary introduction to the supersymmetric method in Random Matrix Theory is given in the second and third lectures. We will show that a Random Matrix Theory can be rewritten as an integral over a supermanifold. This integral will be worked out in detail for the Gaussian Unitary Ensemble, which describes level correlations in systems with broken time-reversal invariance. We especially emphasize the role of symmetries. As a second example of the application of the supersymmetric method, we discuss the calculation of the microscopic spectral density of the QCD Dirac operator. This is the eigenvalue density near zero on the scale of the average level spacing, which is known to be given by chiral Random Matrix Theory. Also in this case we use symmetry considerations to rewrite the generating function for the resolvent as an integral over a supermanifold. The main topic of the second-to-last lecture is recent developments on the relation between the supersymmetric partition function and integrable hierarchies (in our case the Toda lattice hierarchy). We will show that this relation is an efficient way to calculate superintegrals. Several examples given in previous lectures will be worked out by means of this new method. Finally, we will discuss the quenched QCD Dirac spectrum at nonzero chemical potential. Because of the non-Hermiticity of the Dirac operator, the usual supersymmetric method has not been successful in this case. However, we will show that the supersymmetric partition function can be evaluated by means

  6. Diagnostic games: from adequate formalization of clinical experience to structure discovery.

    PubMed

    Shifrin, Michael A; Kasparova, Eva I

    2008-01-01

    A method of obtaining well-founded and reproducible results in clinical decision making is presented. It is based on "diagnostic games", a procedure for the elicitation and formalization of experts' knowledge and experience. This procedure allows decision rules to be formulated in an adequate language, making them both unambiguous and clinically clear.

  7. Factors Affecting the Presence of Adequately Iodized Salt at Home in Wolaita, Southern Ethiopia: Community Based Study

    PubMed Central

    Abdurahmen, Junayde

    2018-01-01

    Background: Universal use of iodized salt is a simple and inexpensive method to prevent and eliminate iodine deficiency disorders like mental retardation. However, little is known about the level of adequately iodized salt consumption in the study area. Therefore, the study was aimed at assessing the proportion of households having adequately iodized salt and associated factors in Wolaita Sodo town and its peripheries, Southern Ethiopia. Methods: A cross-sectional study was conducted from May 10 to 20, 2016, in 441 households in Sodo town and its peripheries. Samples were selected using the systematic sampling technique. An iodometric titration method (AOAC, 2000) was used to analyze the iodine content of the salt samples. Data entry and analysis were done using Epi Info version 3.5.1 and SPSS version 16, respectively. Results: The female to male ratio of the respondents was 219. The mean age of the respondents was 30.2 (±7.3 SD). The proportion of households having adequately iodized salt was 37.7%, with 95% CI of 33.2% to 42.2%. Not exposing salt to sunlight [OR: 3.75; 95% CI: 2.14, 6.57], higher monthly income [OR: 3.71; 95% CI: 1.97, 7.01], and formal education of respondents [OR: 1.75; 95% CI: 1.14, 2.70] were found to be associated with the presence of adequately iodized salt at home. Conclusion: This study revealed low levels of households having adequately iodized salt in Wolaita Sodo town and its peripheries. The evidence here shows that there is a need to increase the supply of adequately iodized salt to meet the goal for monitoring progress towards sustainable elimination of IDD. PMID:29765978

  8. A two-level stochastic collocation method for semilinear elliptic equations with random coefficients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Luoping; Zheng, Bin; Lin, Guang

    In this work, we propose a novel two-level discretization for solving semilinear elliptic equations with random coefficients. Motivated by the two-grid method for deterministic partial differential equations (PDEs) introduced by Xu, our two-level stochastic collocation method utilizes a two-grid finite element discretization in the physical space and a two-level collocation method in the random domain. In particular, we solve the semilinear equations on a coarse mesh $\mathcal{T}_H$ with a low-level stochastic collocation (corresponding to the polynomial space $\mathcal{P}_P$) and solve the linearized equations on a fine mesh $\mathcal{T}_h$ using a high-level stochastic collocation (corresponding to the polynomial space $\mathcal{P}_p$). We prove that the approximated solution obtained from this method achieves the same order of accuracy as that from solving the original semilinear problem directly by the stochastic collocation method with $\mathcal{T}_h$ and $\mathcal{P}_p$. The two-level method is computationally more efficient, especially for nonlinear problems with high random dimensions. Numerical experiments are also provided to verify the theoretical results.

  9. Randomization Methods in Emergency Setting Trials: A Descriptive Review

    ERIC Educational Resources Information Center

    Corbett, Mark Stephen; Moe-Byrne, Thirimon; Oddie, Sam; McGuire, William

    2016-01-01

    Background: Quasi-randomization might expedite recruitment into trials in emergency care settings but may also introduce selection bias. Methods: We searched the Cochrane Library and other databases for systematic reviews of interventions in emergency medicine or urgent care settings. We assessed selection bias (baseline imbalances) in prognostic…

  10. A review of randomized controlled trials comparing the effectiveness of hand held computers with paper methods for data collection

    PubMed Central

    Lane, Shannon J; Heddle, Nancy M; Arnold, Emmy; Walker, Irwin

    2006-01-01

    Background: Handheld computers are increasingly favoured over paper and pencil methods to capture data in clinical research. Methods: This study systematically identified and reviewed randomized controlled trials (RCTs) that compared the two methods for self-recording and reporting data, and where at least one of the following outcomes was assessed: data accuracy; timeliness of data capture; and adherence to protocols for data collection. Results: A comprehensive keyword search of NLM Gateway's database yielded 9 studies fitting the criteria for inclusion. Data extraction was performed and checked by two of the authors. None of the studies included all outcomes. The results overall favor handheld computers over paper and pencil for data collection among study participants, but the data are not uniform across the different outcomes. Handheld computers appear superior in timeliness of receipt and data handling (four of four studies) and are preferred by most subjects (three of four studies). On the other hand, only one of the trials adequately compared adherence to instructions for recording and submission of data (handheld computers were superior), and comparisons of accuracy were inconsistent between five studies. Conclusion: Handhelds are an effective alternative to paper and pencil modes of data collection; they are faster and were preferred by most users. PMID:16737535

  11. Seismic random noise attenuation method based on empirical mode decomposition of Hausdorff dimension

    NASA Astrophysics Data System (ADS)

    Yan, Z.; Luan, X.

    2017-12-01

    Introduction: Empirical mode decomposition (EMD) is a noise-suppression algorithm based on wave-field separation, exploiting the scale differences between the effective signal and noise. However, because the complexity of the real seismic wave field results in serious mode aliasing, this method alone is not effective for denoising. Building on the multi-scale decomposition characteristics of the EMD algorithm, combined with Hausdorff-dimension constraints, we propose a new method for seismic random noise attenuation. First, we apply the EMD algorithm to adaptively decompose the seismic data, obtaining a series of intrinsic mode functions (IMFs) with different scales. Based on the difference in Hausdorff dimension between the effective signal and random noise, we identify the IMF components mixed with random noise. We then use a threshold correlation filtering process to effectively separate the valid signal from the random noise. Compared with the traditional EMD method, the results show that the new method achieves better suppression of seismic random noise. The implementation process: The EMD algorithm is used to decompose seismic signals into IMF sets and analyze their spectra. Since most of the random noise is high-frequency noise, the IMF sets can be divided into three categories: the first is the effective wave components at larger scales; the second is the noise at smaller scales; the third is the IMF components containing random noise. The third kind of IMF component is then processed by the Hausdorff dimension algorithm, selecting an appropriate time window size, initial step and increment to calculate the instantaneous Hausdorff dimension of each component. The dimension of the random noise lies between 1.0 and 1.05, while the dimension of the effective wave lies between 1.05 and 2.0. On the basis of the previous steps, according to the dimension difference between the random noise and

  12. Mendelian randomization analysis of a time-varying exposure for binary disease outcomes using functional data analysis methods.

    PubMed

    Cao, Ying; Rajan, Suja S; Wei, Peng

    2016-12-01

    A Mendelian randomization (MR) analysis is performed to analyze the causal effect of an exposure variable on a disease outcome in observational studies, by using genetic variants that affect the disease outcome only through the exposure variable. This method has recently gained popularity among epidemiologists given the success of genetic association studies. Many exposure variables of interest in epidemiological studies are time varying, for example, body mass index (BMI). Although longitudinal data have been collected in many cohort studies, current MR studies only use one measurement of a time-varying exposure variable, which cannot adequately capture the long-term time-varying information. We propose using the functional principal component analysis method to recover the underlying individual trajectory of the time-varying exposure from the sparsely and irregularly observed longitudinal data, and then conduct MR analysis using the recovered curves. We further propose two MR analysis methods. The first assumes a cumulative effect of the time-varying exposure variable on the disease risk, while the second assumes a time-varying genetic effect and employs functional regression models. We focus on statistical testing for a causal effect. Our simulation studies mimicking the real data show that the proposed functional data analysis based methods incorporating longitudinal data have substantial power gains compared to standard MR analysis using only one measurement. We used the Framingham Heart Study data to demonstrate the promising performance of the new methods as well as inconsistent results produced by the standard MR analysis that relies on a single measurement of the exposure at some arbitrary time point. © 2016 WILEY PERIODICALS, INC.
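
    The standard single-measurement MR analysis that the authors use as a baseline can be sketched as a Wald-ratio estimate on simulated data. This is a simplified illustration with a continuous outcome; all effect sizes, sample sizes and variable roles below are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 20000
g = rng.binomial(2, 0.3, n)             # genetic instrument (allele count)
u = rng.normal(size=n)                  # unmeasured confounder
x = 0.5 * g + u + rng.normal(size=n)    # exposure measured once (e.g. BMI)
y = 0.8 * x + u + rng.normal(size=n)    # outcome; true causal effect is 0.8

# Wald ratio: instrument-outcome association divided by instrument-exposure
# association; valid because g affects y only through x
beta_gx = np.cov(g, x)[0, 1] / np.var(g)
beta_gy = np.cov(g, y)[0, 1] / np.var(g)
causal_estimate = beta_gy / beta_gx
```

    A naive regression of y on x would be biased upward by the confounder u, while the ratio estimate recovers a value near 0.8 despite it; the paper's functional-data extension replaces the single x with a recovered exposure trajectory.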

  13. 40 CFR 51.354 - Adequate tools and resources.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 2 2014-07-01 2014-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain... assurance, data analysis and reporting, and the holding of hearings and adjudication of cases. A portion of...

  14. 40 CFR 51.354 - Adequate tools and resources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 2 2011-07-01 2011-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain... assurance, data analysis and reporting, and the holding of hearings and adjudication of cases. A portion of...

  15. 40 CFR 51.354 - Adequate tools and resources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 2 2012-07-01 2012-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain... assurance, data analysis and reporting, and the holding of hearings and adjudication of cases. A portion of...

  16. 40 CFR 51.354 - Adequate tools and resources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 2 2013-07-01 2013-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain... assurance, data analysis and reporting, and the holding of hearings and adjudication of cases. A portion of...

  17. HSQC-1,n-ADEQUATE: a new approach to long-range 13C-13C correlation by covariance processing.

    PubMed

    Martin, Gary E; Hilton, Bruce D; Willcott, M Robert; Blinov, Kirill A

    2011-10-01

    Long-range, two-dimensional heteronuclear shift correlation NMR methods play a pivotal role in the assembly of novel molecular structures. The well-established GHMBC method is a high-sensitivity mainstay technique, affording connectivity information via (n)J(CH) coupling pathways. Unfortunately, there is no simple way of determining the value of n and hence no way of differentiating two-bond from three- and occasionally four-bond correlations. Three-bond correlations, however, generally predominate. Recent work has shown that the unsymmetrical indirect covariance or generalized indirect covariance processing of multiplicity edited GHSQC and 1,1-ADEQUATE spectra provides high-sensitivity access to a (13)C-(13)C connectivity map in the form of an HSQC-1,1-ADEQUATE spectrum. Covariance processing of these data allows the 1,1-ADEQUATE connectivity information to be exploited with the inherent sensitivity of the GHSQC spectrum rather than the intrinsically lower sensitivity of the 1,1-ADEQUATE spectrum itself. Data acquisition times and/or sample size can be substantially reduced when covariance processing is to be employed. In an extension of that work, 1,n-ADEQUATE spectra can likewise be subjected to covariance processing to afford high-sensitivity access to the equivalent of (4)J(CH) GHMBC connectivity information. The method is illustrated using strychnine as a model compound. Copyright © 2011 John Wiley & Sons, Ltd.
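
    The core covariance-processing idea, contracting an HSQC matrix and an ADEQUATE matrix over their shared proton dimension to obtain a carbon-carbon connectivity map, can be illustrated on toy stick spectra. The matrices below are idealized 0/1 patterns for a hypothetical three-carbon chain, not real data:

```python
import numpy as np

# rows: 13C shifts C1..C3, columns: 1H shifts H1..H3
hsqc = np.array([[1.0, 0.0, 0.0],      # C1-H1 one-bond correlation
                 [0.0, 1.0, 0.0],      # C2-H2
                 [0.0, 0.0, 1.0]])     # C3-H3
adequate = np.array([[0.0, 1.0, 0.0],  # C1 <-> H2 (via the C1-C2 bond)
                     [1.0, 0.0, 1.0],  # C2 <-> H1 and H3
                     [0.0, 1.0, 0.0]]) # C3 <-> H2

# unsymmetrical indirect covariance: contract over the common 1H axis,
# yielding a 13C-13C map that inherits the HSQC's sensitivity
cc_map = hsqc @ adequate.T
```

    In the result, cc_map[0, 1] is nonzero (the C1-C2 bond appears) while cc_map[0, 2] is zero, reflecting the absence of a direct C1-C3 correlation in the toy input.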

  18. Factors associated with adequate weekly reporting for disease surveillance data among health facilities in Nairobi County, Kenya, 2013.

    PubMed

    Mwatondo, Athman Juma; Ng'ang'a, Zipporah; Maina, Caroline; Makayotto, Lyndah; Mwangi, Moses; Njeru, Ian; Arvelo, Wences

    2016-01-01

    Kenya adopted the Integrated Disease Surveillance and Response (IDSR) strategy in 1998 to strengthen disease surveillance and epidemic response. However, the goal of weekly surveillance reporting among health facilities has not been achieved. We conducted a cross-sectional study to determine the prevalence of adequate reporting and factors associated with IDSR reporting among health facilities in one Kenyan county. Health facilities (public and private) were enrolled using stratified random sampling from 348 facilities prioritized for routine surveillance reporting. Adequately reporting facilities were defined as those that submitted >10 weekly reports during a twelve-week period, and poorly reporting facilities as those that submitted <10 weekly reports. Multivariate logistic regression with backward selection was used to identify risk factors associated with adequate reporting. From September 2 through November 30, 2013, we enrolled 175 health facilities; 130 (74%) were private and 45 (26%) were public. Of the 175 health facilities, 77 (44%) were classified as adequately reporting and 98 (56%) as reporting poorly. Multivariate analysis identified three factors independently associated with adequate weekly reporting: having weekly reporting forms at the visit (AOR 19, 95% CI: 6-65), having posters showing IDSR functions (AOR 8, 95% CI: 2-12) and having a designated surveillance focal person (AOR 7, 95% CI: 2-20). The majority of health facilities in Nairobi County were reporting poorly to IDSR, and we recommend that the Ministry of Health provide all health facilities in Nairobi County with weekly reporting tools and offer specific trainings on IDSR, which will help designate a focal surveillance person.

  19. Determination of Slope Safety Factor with Analytical Solution and Searching Critical Slip Surface with Genetic-Traversal Random Method

    PubMed Central

    2014-01-01

    In current practice, to determine the safety factor of a slope with a two-dimensional circular potential failure surface, one common method for searching the critical slip surface is the Genetic Algorithm (GA), while the method used to calculate the slope safety factor is Fellenius' slices method. However, GA needs to be validated with more numerical tests, and Fellenius' slices method is only an approximate method, like the finite element method. This paper proposes a new approach to determining the minimum slope safety factor: the safety factor is computed with an analytical solution, and the critical slip surface is searched with the Genetic-Traversal Random Method. The analytical solution is more accurate than Fellenius' slices method. The Genetic-Traversal Random Method uses random picks to implement mutation. A computer program that performs the search automatically was developed for the Genetic-Traversal Random Method. Comparison with other methods, such as the Slope/W software, indicates that the Genetic-Traversal Random Search Method can give a very low safety factor, about half that of the other methods. However, the minimum safety factor obtained with the Genetic-Traversal Random Search Method is very close to the lower-bound solutions for the slope safety factor given by the Ansys software. PMID:24782679

  20. Random-breakage mapping method applied to human DNA sequences

    NASA Technical Reports Server (NTRS)

    Lobrich, M.; Rydberg, B.; Cooper, P. K.; Chatterjee, A. (Principal Investigator)

    1996-01-01

    The random-breakage mapping method [Game et al. (1990) Nucleic Acids Res., 18, 4453-4461] was applied to DNA sequences in human fibroblasts. The methodology involves NotI restriction endonuclease digestion of DNA from irradiated cells, followed by pulsed-field gel electrophoresis, Southern blotting and hybridization with DNA probes recognizing the single-copy sequences of interest. The Southern blots show a band for the unbroken restriction fragments and a smear below this band due to radiation-induced random breaks. This smear pattern contains two discontinuities in intensity at positions that correspond to the distance of the hybridization site to each end of the restriction fragment. By analyzing the positions of these discontinuities we confirmed the previously mapped position of the probe DXS1327 within a NotI fragment on the X chromosome, thus demonstrating the validity of the technique. We were also able to position the probes D21S1 and D21S15 with respect to the ends of their corresponding NotI fragments on chromosome 21. A third chromosome 21 probe, D21S11, has previously been reported to be close to D21S1, although an uncertainty about a second possible location existed. Since both probes D21S1 and D21S11 hybridized to a single NotI fragment and yielded a similar smear pattern, this uncertainty is removed by the random-breakage mapping method.

  1. Predictors of adequate depression treatment among Medicaid-enrolled adults.

    PubMed

    Teh, Carrie Farmer; Sorbero, Mark J; Mihalyo, Mark J; Kogan, Jane N; Schuster, James; Reynolds, Charles F; Stein, Bradley D

    2010-02-01

    To determine whether Medicaid-enrolled depressed adults receive adequate treatment for depression and to identify the characteristics of those receiving inadequate treatment. Claims data from a Medicaid-enrolled population in a large mid-Atlantic state between July 2006 and January 2008. We examined rates and predictors of minimally adequate psychotherapy and pharmacotherapy among adults with a new depression treatment episode during the study period (N=1,098). Many depressed adults received either minimally adequate psychotherapy or pharmacotherapy. Black individuals and individuals who began their depression treatment episode with an inpatient psychiatric stay for depression were markedly less likely to receive minimally adequate psychotherapy and more likely to receive inadequate treatment. Racial minorities and individuals discharged from inpatient treatment for depression are at risk for receiving inadequate depression treatment.

  2. A new compound control method for sine-on-random mixed vibration test

    NASA Astrophysics Data System (ADS)

    Zhang, Buyun; Wang, Ruochen; Zeng, Falin

    2017-09-01

    The vibration environmental test (VET) is one of the important and effective methods of supporting the strength design, reliability and durability testing of mechanical products. A new separation control strategy is proposed for the multiple-input multiple-output (MIMO) sine-on-random (SOR) mixed-mode vibration test, an advanced and demanding type of VET. As the key step of the strategy, the correlation integral method is applied to separate the mixed signals into their random and sinusoidal components. The feedback control formula of the MIMO linear random vibration system is systematically deduced in the frequency domain, and a Jacobi control algorithm is proposed in view of elements such as the self-spectrum, coherence, and phase of the power spectral density (PSD) matrix. Because excitation tends to be over-corrected in sine vibration tests, a compression factor is introduced to reduce the excitation correction, avoiding damage to the vibration table or other devices. The two methods are combined and applied in the MIMO SOR vibration test system. Finally, a verification test system with the vibration of a cantilever beam as the control object was established to verify the reliability and effectiveness of the proposed methods. The test results show that deviations can be accurately controlled within the tolerance range of the references, and the method provides theoretical and practical support for mechanical engineering.
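
    The correlation-based separation of the sinusoidal component can be sketched as follows, assuming (as in a SOR test) that the sine-tone frequency is known and the record spans an integer number of periods. The sampling rate, frequency and amplitudes are invented for the example:

```python
import numpy as np

fs, f0 = 1000.0, 50.0                     # sampling rate and known tone (Hz)
t = np.arange(0, 1.0, 1 / fs)             # 1 s record = 50 full periods
rng = np.random.default_rng(3)
mixed = 2.0 * np.sin(2 * np.pi * f0 * t + 0.7) + rng.normal(0.0, 0.5, t.size)

# correlate with quadrature references; over an integer number of periods
# the random part averages toward zero, leaving the tone's Fourier coefficients
a = 2.0 / t.size * np.sum(mixed * np.cos(2 * np.pi * f0 * t))
b = 2.0 / t.size * np.sum(mixed * np.sin(2 * np.pi * f0 * t))
amplitude = np.hypot(a, b)                # close to the true sine amplitude 2.0
sine_part = a * np.cos(2 * np.pi * f0 * t) + b * np.sin(2 * np.pi * f0 * t)
random_part = mixed - sine_part           # residual fed to the random controller
```

    Each separated component can then be driven by its own control loop, which is the premise of the compound control strategy described above.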

  3. Analytic method for calculating properties of random walks on networks

    NASA Technical Reports Server (NTRS)

    Goldhirsch, I.; Gefen, Y.

    1986-01-01

    A method for calculating the properties of discrete random walks on networks is presented. The method divides complex networks into simpler units whose contribution to the mean first-passage time is calculated. The simplified network is then further iterated. The method is demonstrated by calculating mean first-passage times on a segment, a segment with a single dangling bond, a segment with many dangling bonds, and a looplike structure. The results are analyzed and related to the applicability of the Einstein relation between conductance and diffusion.
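
    The network-reduction results can be checked against a direct linear-algebra computation of mean first-passage times. This sketch is a brute-force reference, not the paper's iterative method; it reproduces the known result that on a segment of N+1 nodes with one reflecting end, the MFPT from that end to the other is N^2:

```python
import numpy as np

def mean_first_passage(adj, target):
    """Mean first-passage times to `target` for an unbiased random walk:
    solve (I - Q) t = 1, where Q is the transition matrix restricted
    to the non-target nodes."""
    n = adj.shape[0]
    P = adj / adj.sum(axis=1)[:, None]    # row-stochastic transition matrix
    keep = [i for i in range(n) if i != target]
    t = np.linalg.solve(np.eye(n - 1) - P[np.ix_(keep, keep)],
                        np.ones(n - 1))
    out = np.zeros(n)
    out[keep] = t
    return out

# a segment 0-1-2-3-4: start at the reflecting end 0, target node 4
adj = np.zeros((5, 5))
for i in range(4):
    adj[i, i + 1] = adj[i + 1, i] = 1.0
t = mean_first_passage(adj, target=4)     # t[0] == 4**2 == 16
```

    Dangling bonds or loops are handled by the same linear system, only with a different adjacency matrix.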

  4. Applying a weighted random forests method to extract karst sinkholes from LiDAR data

    NASA Astrophysics Data System (ADS)

    Zhu, Junfeng; Pierskalla, William P.

    2016-02-01

    Detailed mapping of sinkholes provides critical information for mitigating sinkhole hazards and understanding groundwater and surface water interactions in karst terrains. LiDAR (Light Detection and Ranging) measures the earth's surface at high resolution and high density and has shown great potential to drastically improve locating and delineating sinkholes. However, processing LiDAR data to extract sinkholes requires separating sinkholes from other depressions, which can be laborious because of the sheer number of depressions commonly generated from LiDAR data. In this study, we applied random forests, a machine learning method, to automatically separate sinkholes from other depressions in a karst region in central Kentucky. The sinkhole-extraction random forest was grown on a training dataset built from an area where LiDAR-derived depressions were manually classified through a visual inspection and field verification process. Based on the geometry of depressions, as well as natural and human factors related to sinkholes, 11 parameters were selected as predictive variables to form the dataset. Because the training dataset was imbalanced, with the majority of depressions being non-sinkholes, a weighted random forests method was used to improve the accuracy of predicting sinkholes. The weighted random forest achieved an average accuracy of 89.95% on the training dataset, demonstrating that the random forest can be an effective sinkhole classifier. Testing of the random forest in another area, however, resulted in moderate success, with an average accuracy rate of 73.96%. This study suggests that an automatic sinkhole extraction procedure like the random forest classifier can significantly reduce time and labor costs and make it more tractable to map sinkholes from LiDAR data over large areas. However, the random forests method cannot totally replace manual procedures such as visual inspection and field verification.
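
    A weighted random forest for imbalanced classes can be sketched with scikit-learn's class_weight option, one common way to realize the weighting the abstract describes. The dataset below is synthetic; the ~5% positive rate and 11 features merely echo the paper's setup and are not its data:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import recall_score
from sklearn.model_selection import train_test_split

# synthetic stand-in: ~5% "sinkholes" among LiDAR depressions, 11 predictors
X, y = make_classification(n_samples=2000, n_features=11, n_informative=6,
                           weights=[0.95, 0.05], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# class_weight="balanced" up-weights the rare sinkhole class inside each
# tree's split criterion, counteracting the class imbalance
clf = RandomForestClassifier(n_estimators=200, class_weight="balanced",
                             random_state=0).fit(X_tr, y_tr)
minority_recall = recall_score(y_te, clf.predict(X_te))
```

    Without the weighting, a forest on such data tends to predict the majority "non-sinkhole" class almost everywhere, which is exactly the failure mode the weighted variant addresses.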

  5. Tehran Air Pollutants Prediction Based on Random Forest Feature Selection Method

    NASA Astrophysics Data System (ADS)

    Shamsoddini, A.; Aboodi, M. R.; Karami, J.

    2017-09-01

    Air pollution, as one of the most serious forms of environmental pollution, poses a huge threat to human life. Air pollution leads to environmental instability and has harmful and undesirable effects on the environment. Modern methods for predicting pollutant concentrations can improve decision making and provide appropriate solutions. This study examines the performance of Random Forest feature selection in combination with multiple linear regression and multilayer perceptron artificial neural network methods, in order to obtain an efficient model for estimating carbon monoxide, nitrogen dioxide, sulfur dioxide and PM2.5 contents in the air. The results indicated that artificial neural networks fed with the attributes selected by the Random Forest feature selection method performed more accurately than the other models for all pollutants. The estimation accuracy for sulfur dioxide emissions was lower than for the other air contaminants, whereas nitrogen dioxide was predicted more accurately than the other pollutants.

  6. 22 CFR 1006.900 - Adequate evidence.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 22 Foreign Relations 2 2014-04-01 2014-04-01 false Adequate evidence. 1006.900 Section 1006.900 Foreign Relations INTER-AMERICAN FOUNDATION GOVERNMENTWIDE DEBARMENT AND SUSPENSION (NONPROCUREMENT... reasonable belief that a particular act or omission has occurred. ...

  7. 22 CFR 1508.900 - Adequate evidence.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 22 Foreign Relations 2 2014-04-01 2014-04-01 false Adequate evidence. 1508.900 Section 1508.900 Foreign Relations AFRICAN DEVELOPMENT FOUNDATION GOVERNMENTWIDE DEBARMENT AND SUSPENSION (NONPROCUREMENT... reasonable belief that a particular act or omission has occurred. ...

  8. Factors Affecting the Presence of Adequately Iodized Salt at Home in Wolaita, Southern Ethiopia: Community Based Study.

    PubMed

    Kumma, Wondimagegn Paulos; Haji, Yusuf; Abdurahmen, Junayde; Mehretie Adinew, Yohannes

    2018-01-01

    Universal use of iodized salt is a simple and inexpensive method to prevent and eliminate iodine deficiency disorders like mental retardation. However, little is known about the level of adequately iodized salt consumption in the study area. Therefore, the study was aimed at assessing the proportion of households having adequately iodized salt and associated factors in Wolaita Sodo town and its peripheries, Southern Ethiopia. A cross-sectional study was conducted from May 10 to 20, 2016, in 441 households in Sodo town and its peripheries. Samples were selected using the systematic sampling technique. An iodometric titration method (AOAC, 2000) was used to analyze the iodine content of the salt samples. Data entry and analysis were done using Epi Info version 3.5.1 and SPSS version 16, respectively. The female to male ratio of the respondents was 219. The mean age of the respondents was 30.2 (±7.3 SD). The proportion of households having adequately iodized salt was 37.7%, with 95% CI of 33.2% to 42.2%. Not exposing salt to sunlight [OR: 3.75; 95% CI: 2.14, 6.57], higher monthly income [OR: 3.71; 95% CI: 1.97-7.01], and formal education of respondents [OR: 1.75; 95% CI: 1.14, 2.70] were found to be associated with the presence of adequately iodized salt at home. This study revealed low levels of households having adequately iodized salt in Wolaita Sodo town and its peripheries. The evidence here shows that there is a need to increase the supply of adequately iodized salt to meet the goal for monitoring progress towards sustainable elimination of IDD.

  9. 22 CFR 208.900 - Adequate evidence.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 22 Foreign Relations 1 2011-04-01 2011-04-01 false Adequate evidence. 208.900 Section 208.900 Foreign Relations AGENCY FOR INTERNATIONAL DEVELOPMENT GOVERNMENTWIDE DEBARMENT AND SUSPENSION... support the reasonable belief that a particular act or omission has occurred. ...

  10. 22 CFR 208.900 - Adequate evidence.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Adequate evidence. 208.900 Section 208.900 Foreign Relations AGENCY FOR INTERNATIONAL DEVELOPMENT GOVERNMENTWIDE DEBARMENT AND SUSPENSION... support the reasonable belief that a particular act or omission has occurred. ...

  11. Individual Differences Methods for Randomized Experiments

    ERIC Educational Resources Information Center

    Tucker-Drob, Elliot M.

    2011-01-01

    Experiments allow researchers to randomly vary the key manipulation, the instruments of measurement, and the sequences of the measurements and manipulations across participants. To date, however, the advantages of randomized experiments to manipulate both the aspects of interest and the aspects that threaten internal validity have been primarily…

  12. 10 CFR 503.35 - Inability to obtain adequate capital.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... capital investment, through tariffs, without unreasonably adverse economic effect on its service area... 10 Energy 4 2010-01-01 2010-01-01 false Inability to obtain adequate capital. 503.35 Section 503... New Facilities § 503.35 Inability to obtain adequate capital. (a) Eligibility. Section 212(a)(1)(D) of...

  13. "Something Adequate"? In Memoriam Seamus Heaney, Sister Quinlan, Nirbhaya

    ERIC Educational Resources Information Center

    Parker, Jan

    2014-01-01

    Seamus Heaney talked of poetry's responsibility to represent the "bloody miracle", the "terrible beauty" of atrocity; to create "something adequate". This article asks, what is adequate to the burning and eating of a nun and the murderous gang rape and evisceration of a medical student? It considers Njabulo Ndebele's…

  14. 2 CFR 180.900 - Adequate evidence.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 2 Grants and Agreements 1 2010-01-01 2010-01-01 false Adequate evidence. 180.900 Section 180.900 Grants and Agreements OFFICE OF MANAGEMENT AND BUDGET GOVERNMENTWIDE GUIDANCE FOR GRANTS AND AGREEMENTS... belief that a particular act or omission has occurred. ...

  15. Methodological reporting of randomized clinical trials in respiratory research in 2010.

    PubMed

    Lu, Yi; Yao, Qiuju; Gu, Jie; Shen, Ce

    2013-09-01

    Although randomized controlled trials (RCTs) are considered the highest level of evidence, they are also subject to bias, due to a lack of adequately reported randomization, and therefore the reporting should be as explicit as possible for readers to determine the significance of the contents. We evaluated the methodological quality of RCTs in respiratory research in high ranking clinical journals, published in 2010. We assessed the methodological quality, including generation of the allocation sequence, allocation concealment, double-blinding, sample-size calculation, intention-to-treat analysis, flow diagrams, number of medical centers involved, diseases, funding sources, types of interventions, trial registration, number of times the papers have been cited, journal impact factor, journal type, and journal endorsement of the CONSORT (Consolidated Standards of Reporting Trials) rules, in RCTs published in 12 top ranking clinical respiratory journals and 5 top ranking general medical journals. We included 176 trials, of which 93 (53%) reported adequate generation of the allocation sequence, 66 (38%) reported adequate allocation concealment, 79 (45%) were double-blind, 123 (70%) reported adequate sample-size calculation, 88 (50%) reported intention-to-treat analysis, and 122 (69%) included a flow diagram. Multivariate logistic regression analysis revealed that journal impact factor ≥ 5 was the only variable that significantly influenced adequate allocation sequence generation. Trial registration and journal impact factor ≥ 5 significantly influenced adequate allocation concealment. Medical interventions, trial registration, and journal endorsement of the CONSORT statement influenced adequate double-blinding. Publication in one of the general medical journals influenced adequate sample-size calculation. The methodological quality of RCTs in respiratory research needs improvement. Stricter enforcement of the CONSORT statement should enhance the quality of RCTs.

  16. 21 CFR 1404.900 - Adequate evidence.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 9 2011-04-01 2011-04-01 false Adequate evidence. 1404.900 Section 1404.900 Food and Drugs OFFICE OF NATIONAL DRUG CONTROL POLICY GOVERNMENTWIDE DEBARMENT AND SUSPENSION... support the reasonable belief that a particular act or omission has occurred. ...

  17. 29 CFR 1471.900 - Adequate evidence.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 4 2011-07-01 2011-07-01 false Adequate evidence. 1471.900 Section 1471.900 Labor Regulations Relating to Labor (Continued) FEDERAL MEDIATION AND CONCILIATION SERVICE GOVERNMENTWIDE DEBARMENT... information sufficient to support the reasonable belief that a particular act or omission has occurred. ...

  18. 21 CFR 1404.900 - Adequate evidence.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 9 2014-04-01 2014-04-01 false Adequate evidence. 1404.900 Section 1404.900 Food and Drugs OFFICE OF NATIONAL DRUG CONTROL POLICY GOVERNMENTWIDE DEBARMENT AND SUSPENSION... support the reasonable belief that a particular act or omission has occurred. ...

  19. 29 CFR 1471.900 - Adequate evidence.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 4 2014-07-01 2014-07-01 false Adequate evidence. 1471.900 Section 1471.900 Labor Regulations Relating to Labor (Continued) FEDERAL MEDIATION AND CONCILIATION SERVICE GOVERNMENTWIDE DEBARMENT... information sufficient to support the reasonable belief that a particular act or omission has occurred. ...

  20. 29 CFR 1471.900 - Adequate evidence.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 4 2012-07-01 2012-07-01 false Adequate evidence. 1471.900 Section 1471.900 Labor Regulations Relating to Labor (Continued) FEDERAL MEDIATION AND CONCILIATION SERVICE GOVERNMENTWIDE DEBARMENT... information sufficient to support the reasonable belief that a particular act or omission has occurred. ...

  1. 21 CFR 1404.900 - Adequate evidence.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 9 2013-04-01 2013-04-01 false Adequate evidence. 1404.900 Section 1404.900 Food and Drugs OFFICE OF NATIONAL DRUG CONTROL POLICY GOVERNMENTWIDE DEBARMENT AND SUSPENSION... support the reasonable belief that a particular act or omission has occurred. ...

  2. 21 CFR 1404.900 - Adequate evidence.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 9 2012-04-01 2012-04-01 false Adequate evidence. 1404.900 Section 1404.900 Food and Drugs OFFICE OF NATIONAL DRUG CONTROL POLICY GOVERNMENTWIDE DEBARMENT AND SUSPENSION... support the reasonable belief that a particular act or omission has occurred. ...

  3. 29 CFR 1471.900 - Adequate evidence.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 4 2010-07-01 2010-07-01 false Adequate evidence. 1471.900 Section 1471.900 Labor Regulations Relating to Labor (Continued) FEDERAL MEDIATION AND CONCILIATION SERVICE GOVERNMENTWIDE DEBARMENT... information sufficient to support the reasonable belief that a particular act or omission has occurred. ...

  4. 21 CFR 1404.900 - Adequate evidence.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Adequate evidence. 1404.900 Section 1404.900 Food and Drugs OFFICE OF NATIONAL DRUG CONTROL POLICY GOVERNMENTWIDE DEBARMENT AND SUSPENSION... support the reasonable belief that a particular act or omission has occurred. ...

  5. 29 CFR 1471.900 - Adequate evidence.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 4 2013-07-01 2013-07-01 false Adequate evidence. 1471.900 Section 1471.900 Labor Regulations Relating to Labor (Continued) FEDERAL MEDIATION AND CONCILIATION SERVICE GOVERNMENTWIDE DEBARMENT... information sufficient to support the reasonable belief that a particular act or omission has occurred. ...

  6. Methods and optical fibers that decrease pulse degradation resulting from random chromatic dispersion

    DOEpatents

    Chertkov, Michael; Gabitov, Ildar

    2004-03-02

    The present invention provides methods and optical fibers for periodically pinning an actual (random) accumulated chromatic dispersion of an optical fiber to a predicted accumulated dispersion of the fiber through relatively simple modifications of fiber-optic manufacturing methods or retrofitting of existing fibers. If the pinning occurs with sufficient frequency (at a distance less than or equal to a correlation scale), pulse degradation resulting from random chromatic dispersion is minimized. Alternatively, pinning may occur quasi-periodically, i.e., the pinning distance is distributed between approximately zero and approximately two to three times the correlation scale.
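
    A toy numerical sketch of the pinning idea, with all span counts, dispersion values, and error magnitudes hypothetical: accumulated dispersion drifts as a random walk around the design curve, and periodically resetting it to the predicted curve keeps the deviation bounded within each pinning interval.

```python
import numpy as np

rng = np.random.default_rng(0)
n_steps, pin_every = 1000, 50    # hypothetical span count and pinning period

# Per-span dispersion = design value plus random manufacturing error.
design = 0.1 * np.ones(n_steps)
errors = rng.normal(0.0, 0.02, n_steps)

acc_free = np.cumsum(design + errors)   # no pinning: error accumulates freely
predicted = np.cumsum(design)           # predicted accumulated dispersion

acc_pin = acc_free.copy()
for k in range(pin_every, n_steps, pin_every):
    # Pin: shift the remainder so the accumulated value at span k
    # matches the predicted curve exactly.
    acc_pin[k:] -= acc_pin[k] - predicted[k]

dev_free = np.abs(acc_free - predicted).max()
dev_pin = np.abs(acc_pin - predicted).max()
print(dev_free, dev_pin)  # pinned deviation is typically much smaller
```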

  7. Random Qualitative Validation: A Mixed-Methods Approach to Survey Validation

    ERIC Educational Resources Information Center

    Van Duzer, Eric

    2012-01-01

    The purpose of this paper is to introduce the process and value of Random Qualitative Validation (RQV) in the development and interpretation of survey data. RQV is a method of gathering clarifying qualitative data that improves the validity of the quantitative analysis. This paper is concerned with validity in relation to the participants'…

  8. High prevalence of iodine deficiency in pregnant women living in adequate iodine area

    PubMed Central

    Mioto, Verônica Carneiro Borges; Monteiro, Ana Carolina de Castro Nassif Gomes; de Camargo, Rosalinda Yossie Asato; Borel, Andréia Rodrigues; Catarino, Regina Maria; Kobayashi, Sergio; Chammas, Maria Cristina; Marui, Suemi

    2018-01-01

    Objectives Iodine deficiency during pregnancy is associated with obstetric and neonatal adverse outcomes. Serum thyroglobulin (sTg) and thyroid volume (TV) are optional tools, alongside urinary iodine concentration (UIC), for defining iodine status. This cross-sectional study aims to evaluate the iodine status of pregnant women living in an iodine-adequate area by spot UIC and its correlation with sTg, TV and thyroid function. Methods Two hundred and seventy-three pregnant women were evaluated at three trimesters. All had no previous thyroid disease, no iodine supplementation and negative thyroperoxidase and thyroglobulin antibodies. Thyroid function and sTg were measured using electrochemiluminescence immunoassays. TV was determined by ultrasonography; UIC was determined using a modified Sandell–Kolthoff method. Results Median UIC was 146 µg/L, with 52% iodine deficient and only 4% excessive. TSH values were 1.50 ± 0.92, 1.50 ± 0.92 and 1.91 ± 0.96 mIU/L, respectively, in each trimester (P = 0.001). sTg did not change significantly across trimesters, with a median of 11.2 ng/mL, and only 3.3% had values above 40 ng/mL. Mean TV was 9.3 ± 3.4 mL, which positively correlated with body mass index, but not with sTg. Only 4.5% presented with goitre. When pregnant women were categorized as iodine deficient (UIC < 150 µg/L), adequate (≥150 and <250 µg/L) and excessive (≥250 µg/L), sTg, thyroid hormones and TV at each trimester showed no statistical differences. Conclusions Iodine deficiency was detected frequently in pregnant women living in an iodine-adequate area. sTg concentration and TV did not correlate with UIC. Our observation also demonstrated that the Brazilian salt-iodization programme prevents deficiency, but does not maintain iodine status within the adequate and recommended ranges for pregnant women. PMID:29700098

  9. 31 CFR 19.900 - Adequate evidence.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 31 Money and Finance: Treasury 1 2012-07-01 2012-07-01 false Adequate evidence. 19.900 Section 19.900 Money and Finance: Treasury Office of the Secretary of the Treasury GOVERNMENTWIDE DEBARMENT AND... sufficient to support the reasonable belief that a particular act or omission has occurred. ...

  10. 31 CFR 19.900 - Adequate evidence.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 31 Money and Finance: Treasury 1 2011-07-01 2011-07-01 false Adequate evidence. 19.900 Section 19.900 Money and Finance: Treasury Office of the Secretary of the Treasury GOVERNMENTWIDE DEBARMENT AND... sufficient to support the reasonable belief that a particular act or omission has occurred. ...

  11. 34 CFR 85.900 - Adequate evidence.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 34 Education 1 2011-07-01 2011-07-01 false Adequate evidence. 85.900 Section 85.900 Education Office of the Secretary, Department of Education GOVERNMENTWIDE DEBARMENT AND SUSPENSION (NONPROCUREMENT... reasonable belief that a particular act or omission has occurred. (Authority: E.O. 12549 (3 CFR, 1986 Comp...

  12. 31 CFR 19.900 - Adequate evidence.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 31 Money and Finance: Treasury 1 2014-07-01 2014-07-01 false Adequate evidence. 19.900 Section 19.900 Money and Finance: Treasury Office of the Secretary of the Treasury GOVERNMENTWIDE DEBARMENT AND... sufficient to support the reasonable belief that a particular act or omission has occurred. ...

  13. 31 CFR 19.900 - Adequate evidence.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 1 2010-07-01 2010-07-01 false Adequate evidence. 19.900 Section 19.900 Money and Finance: Treasury Office of the Secretary of the Treasury GOVERNMENTWIDE DEBARMENT AND... sufficient to support the reasonable belief that a particular act or omission has occurred. ...

  14. 34 CFR 85.900 - Adequate evidence.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 1 2010-07-01 2010-07-01 false Adequate evidence. 85.900 Section 85.900 Education Office of the Secretary, Department of Education GOVERNMENTWIDE DEBARMENT AND SUSPENSION (NONPROCUREMENT... reasonable belief that a particular act or omission has occurred. Authority: E.O. 12549 (3 CFR, 1986 Comp., p...

  15. 31 CFR 19.900 - Adequate evidence.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 31 Money and Finance: Treasury 1 2013-07-01 2013-07-01 false Adequate evidence. 19.900 Section 19.900 Money and Finance: Treasury Office of the Secretary of the Treasury GOVERNMENTWIDE DEBARMENT AND... sufficient to support the reasonable belief that a particular act or omission has occurred. ...

  16. Thermodynamic method for generating random stress distributions on an earthquake fault

    USGS Publications Warehouse

    Barall, Michael; Harris, Ruth A.

    2012-01-01

    This report presents a new method for generating random stress distributions on an earthquake fault, suitable for use as initial conditions in a dynamic rupture simulation. The method employs concepts from thermodynamics and statistical mechanics. A pattern of fault slip is considered to be analogous to a micro-state of a thermodynamic system. The energy of the micro-state is taken to be the elastic energy stored in the surrounding medium. Then, the Boltzmann distribution gives the probability of a given pattern of fault slip and stress. We show how to decompose the system into independent degrees of freedom, which makes it computationally feasible to select a random state. However, due to the equipartition theorem, straightforward application of the Boltzmann distribution leads to a divergence which predicts infinite stress. To avoid equipartition, we show that the finite strength of the fault acts to restrict the possible states of the system. By analyzing a set of earthquake scaling relations, we derive a new formula for the expected power spectral density of the stress distribution, which allows us to construct a computer algorithm free of infinities. We then present a new technique for controlling the extent of the rupture by generating a random stress distribution thousands of times larger than the fault surface, and selecting a portion which, by chance, has a positive stress perturbation of the desired size. Finally, we present a new two-stage nucleation method that combines a small zone of forced rupture with a larger zone of reduced fracture energy.
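
    The spectral construction can be sketched generically: draw random phases, impose a prescribed amplitude spectrum, and invert the transform to get a random field. The k^-1 amplitude law below is purely illustrative, not the report's derived power spectral density:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1024                         # grid points along the fault (hypothetical)

# Target one-sided amplitude spectrum ~ k^{-1}; zero mean (no DC term),
# independent uniform random phases per wavenumber.
k = np.fft.rfftfreq(n, d=1.0)
amp = np.zeros_like(k)
amp[1:] = k[1:] ** -1.0
phase = rng.uniform(0.0, 2.0 * np.pi, k.size)
spectrum = amp * np.exp(1j * phase)

# Inverse real FFT yields a real-valued random stress perturbation.
stress = np.fft.irfft(spectrum, n)
stress /= stress.std()           # normalize to unit variance
print(stress.size)
```

    A larger-than-fault realization, as in the report's rupture-extent technique, would simply use a grid several times the fault size and select a sub-window with a positive perturbation of the desired extent.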

  17. Logistic Regression with Multiple Random Effects: A Simulation Study of Estimation Methods and Statistical Packages.

    PubMed

    Kim, Yoonsang; Choi, Young-Ku; Emery, Sherry

    2013-08-01

    Several statistical packages are capable of estimating generalized linear mixed models, and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods' performance for the mixed-effects logistic regression model. However, the authors focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performances of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages, SAS GLIMMIX Laplace and SuperMix Gaussian quadrature, perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes.
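
    A minimal sketch of the simplest data-generating process such simulations study, a two-level logistic model with a single random intercept. All parameter values are hypothetical, and the paper's scenarios use at least three correlated random effects rather than this one:

```python
import numpy as np

rng = np.random.default_rng(1)
n_clusters, n_per = 100, 20
beta0, beta1 = -1.0, 0.8        # hypothetical fixed effects
sigma_u = 1.0                   # random-intercept standard deviation

u = rng.normal(0.0, sigma_u, n_clusters)       # cluster-level effects
x = rng.normal(size=(n_clusters, n_per))       # individual covariate
eta = beta0 + beta1 * x + u[:, None]           # linear predictor
p = 1.0 / (1.0 + np.exp(-eta))                 # inverse logit
y = rng.binomial(1, p)                         # binary outcomes

# Cluster-to-cluster variation in outcome rates reflects the random
# intercept; this is what the likelihood must integrate over.
rates = y.mean(axis=1)
print(rates.std())
```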

  18. Shear wave speed estimation by adaptive random sample consensus method.

    PubMed

    Lin, Haoming; Wang, Tianfu; Chen, Siping

    2014-01-01

    This paper describes a new method for shear wave velocity estimation that is capable of excluding outliers automatically without a preset threshold. The proposed method is an adaptive random sample consensus (ARANDSAC), and the metric used here is finding a certain percentage of inliers according to a closest-distance criterion. To evaluate the method, the simulation and phantom experiment results were compared using linear regression with all points (LRWAP) and the radon sum transform (RS) method. The assessment reveals that the relative biases of mean estimation are 20.00%, 4.67% and 5.33% for LRWAP, ARANDSAC and RS respectively for simulation, and 23.53%, 4.08% and 1.08% for the phantom experiment. The results suggest that the proposed ARANDSAC algorithm is accurate in shear wave speed estimation.
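
    A minimal stand-in for the idea using a simple line model: classic RANSAC ranks candidate fits by the residuals of their closest points. This sketch fixes the inlier fraction in advance, unlike the paper's adaptive, threshold-free selection:

```python
import numpy as np

def ransac_line(x, y, n_iter=200, inlier_frac=0.6, seed=0):
    """Fit y = a*x + b, scoring each 2-point candidate model by the
    summed residuals of its closest inlier_frac fraction of points
    (a simplified stand-in for the paper's ARANDSAC)."""
    rng = np.random.default_rng(seed)
    best, best_cost = None, np.inf
    k = int(inlier_frac * x.size)
    for _ in range(n_iter):
        i, j = rng.choice(x.size, 2, replace=False)
        if x[i] == x[j]:
            continue
        a = (y[j] - y[i]) / (x[j] - x[i])
        b = y[i] - a * x[i]
        resid = np.sort(np.abs(y - (a * x + b)))[:k]   # closest-distance metric
        cost = resid.sum()
        if cost < best_cost:
            best, best_cost = (a, b), cost
    return best

# Synthetic arrival-time data: true line y = 2x + 1 plus gross outliers.
rng = np.random.default_rng(3)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.05, 50)
y[::10] += 15.0                  # inject outliers every 10th sample
a, b = ransac_line(x, y)
print(a, b)                      # close to the true slope 2 and intercept 1
```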

  19. Developing appropriate methods for cost-effectiveness analysis of cluster randomized trials.

    PubMed

    Gomes, Manuel; Ng, Edmond S-W; Grieve, Richard; Nixon, Richard; Carpenter, James; Thompson, Simon G

    2012-01-01

    Cost-effectiveness analyses (CEAs) may use data from cluster randomized trials (CRTs), where the unit of randomization is the cluster, not the individual. However, most studies use analytical methods that ignore clustering. This article compares alternative statistical methods for accommodating clustering in CEAs of CRTs. Our simulation study compared the performance of statistical methods for CEAs of CRTs with 2 treatment arms. The study considered a method that ignored clustering--seemingly unrelated regression (SUR) without a robust standard error (SE)--and 4 methods that recognized clustering--SUR and generalized estimating equations (GEEs), both with robust SE, a "2-stage" nonparametric bootstrap (TSB) with shrinkage correction, and a multilevel model (MLM). The base case assumed CRTs with moderate numbers of balanced clusters (20 per arm) and normally distributed costs. Other scenarios included CRTs with few clusters, imbalanced cluster sizes, and skewed costs. Performance was reported as bias, root mean squared error (rMSE), and confidence interval (CI) coverage for estimating incremental net benefits (INBs). We also compared the methods in a case study. Each method reported low levels of bias. Without the robust SE, SUR gave poor CI coverage (base case: 0.89 v. nominal level: 0.95). The MLM and TSB performed well in each scenario (CI coverage, 0.92-0.95). With few clusters, the GEE and SUR (with robust SE) had coverage below 0.90. In the case study, the mean INBs were similar across all methods, but ignoring clustering underestimated statistical uncertainty and the value of further research. MLMs and the TSB are appropriate analytical methods for CEAs of CRTs with the characteristics described. SUR and GEE are not recommended for studies with few clusters.
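
    A plain two-stage cluster bootstrap can be sketched as follows: resample clusters with replacement, then resample individuals within each chosen cluster. The paper's TSB additionally applies a shrinkage correction not shown here, and the trial dimensions and cost distributions below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(7)

def two_stage_boot(arm, rng, n_boot=2000):
    """Bootstrap the arm mean cost: stage 1 resamples clusters,
    stage 2 resamples patients within each resampled cluster."""
    est = np.empty(n_boot)
    n = len(arm)
    for b in range(n_boot):
        picks = rng.integers(0, n, n)                   # stage 1: clusters
        resampled = [rng.choice(arm[i], arm[i].size, replace=True)
                     for i in picks]                    # stage 2: patients
        est[b] = np.concatenate(resampled).mean()
    return est

# Hypothetical CRT: 20 clusters per arm, 25 patients each, normal costs
# with a cluster-level random effect (SD 10) inducing intracluster correlation.
arm0 = [rng.normal(100 + rng.normal(0, 10), 20, 25) for _ in range(20)]
arm1 = [rng.normal(110 + rng.normal(0, 10), 20, 25) for _ in range(20)]

diff = two_stage_boot(arm1, rng) - two_stage_boot(arm0, rng)
lo, hi = np.percentile(diff, [2.5, 97.5])
print(lo, hi)   # percentile bootstrap CI for the incremental mean cost
```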

  20. Randomized trial of anesthetic methods for intravitreal injections.

    PubMed

    Blaha, Gregory R; Tilton, Elisha P; Barouch, Fina C; Marx, Jeffrey L

    2011-03-01

    To compare the effectiveness of four different anesthetic methods for intravitreal injection. Twenty-four patients each received four intravitreal injections using each of four types of anesthesia (proparacaine, tetracaine, lidocaine pledget, and subconjunctival injection of lidocaine) in a prospective, masked, randomized block design. Pain was graded by the patient on a 0 to 10 scale for both the anesthesia and the injection. The average combined pain scores for both the anesthesia and the intravitreal injection were 4.4 for the lidocaine pledget, 3.5 for topical proparacaine, 3.8 for the subconjunctival lidocaine injection, and 4.1 for topical tetracaine. The differences were not significant (P = 0.65). There were also no statistical differences in the individual anesthesia or injection pain scores. Subconjunctival lidocaine injection had the most side effects. Topical anesthesia is an effective method for limiting pain associated with intravitreal injections.

  1. Funding the Formula Adequately in Oklahoma

    ERIC Educational Resources Information Center

    Hancock, Kenneth

    2015-01-01

    This report is a longitudinal simulation study that looks at how the ratio of state support to local support affects the number of school districts that break the common schools' funding formula, which in turn affects the equity of distribution to the common schools. After nearly two decades of adequately supporting the funding formula, Oklahoma…

  2. Impedance measurement using a two-microphone, random-excitation method

    NASA Technical Reports Server (NTRS)

    Seybert, A. F.; Parrott, T. L.

    1978-01-01

    The feasibility of using a two-microphone, random-excitation technique for the measurement of acoustic impedance was studied. Equations were developed, including the effect of mean flow, which show that acoustic impedance is related to the pressure ratio and phase difference between two points in a duct carrying plane waves only. The impedances of a honeycomb ceramic specimen and a Helmholtz resonator were measured and compared with impedances obtained using the conventional standing-wave method. Agreement between the two methods was generally good. A sensitivity analysis was performed to pinpoint possible error sources and recommendations were made for future study. The two-microphone approach evaluated in this study appears to have some advantages over other impedance measuring techniques.
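
    For reference, the no-flow form of the two-microphone transfer-function relation, as later standardized (the paper's development additionally carries mean-flow terms). Here \(H_{12} = p_2/p_1\) is the measured complex pressure ratio between the microphones, \(k\) the wavenumber, \(s\) the microphone spacing, and \(l\) the distance from the specimen to the nearer microphone; these symbol choices are assumptions, not the paper's notation:

```latex
% Reflection coefficient from the measured transfer function H_{12}:
R = \frac{H_{12} - e^{-jks}}{e^{jks} - H_{12}}\, e^{2jk(l+s)},
\qquad
% Normalized specific acoustic impedance of the specimen:
\frac{Z}{\rho c} = \frac{1 + R}{1 - R}
```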

  3. Two new methods to fit models for network meta-analysis with random inconsistency effects.

    PubMed

    Law, Martin; Jackson, Dan; Turner, Rebecca; Rhodes, Kirsty; Viechtbauer, Wolfgang

    2016-07-28

    Meta-analysis is a valuable tool for combining evidence from multiple studies. Network meta-analysis is becoming more widely used as a means to compare multiple treatments in the same analysis. However, a network meta-analysis may exhibit inconsistency, whereby the treatment effect estimates do not agree across all trial designs, even after taking between-study heterogeneity into account. We propose two new estimation methods for network meta-analysis models with random inconsistency effects. The model we consider is an extension of the conventional random-effects model for meta-analysis to the network meta-analysis setting and allows for potential inconsistency using random inconsistency effects. Our first new estimation method uses a Bayesian framework with empirically-based prior distributions for both the heterogeneity and the inconsistency variances. We fit the model using importance sampling and thereby avoid some of the difficulties that might be associated with using Markov Chain Monte Carlo (MCMC). However, we confirm the accuracy of our importance sampling method by comparing the results to those obtained using MCMC as the gold standard. The second new estimation method we describe uses a likelihood-based approach, implemented in the metafor package, which can be used to obtain (restricted) maximum-likelihood estimates of the model parameters and profile likelihood confidence intervals of the variance components. We illustrate the application of the methods using two contrasting examples. The first uses all-cause mortality as an outcome, and shows little evidence of between-study heterogeneity or inconsistency. The second uses "ear discharge" as an outcome, and exhibits substantial between-study heterogeneity and inconsistency. Both new estimation methods give results similar to those obtained using MCMC. The extent of heterogeneity and inconsistency should be assessed and reported in any network meta-analysis. Our two new methods can be used to fit
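
    The conventional random-effects model that the network model extends can be sketched with the DerSimonian-Laird moment estimator, a common non-likelihood alternative to the paper's two estimation methods. The study effects and variances below are hypothetical:

```python
import numpy as np

def dersimonian_laird(y, v):
    """Random-effects pooling of study effect estimates y with
    within-study variances v (DerSimonian-Laird moment estimator)."""
    w = 1.0 / v
    ybar = (w * y).sum() / w.sum()           # fixed-effect pooled mean
    q = (w * (y - ybar) ** 2).sum()          # Cochran's Q
    df = y.size - 1
    c = w.sum() - (w ** 2).sum() / w.sum()
    tau2 = max(0.0, (q - df) / c)            # between-study heterogeneity
    wstar = 1.0 / (v + tau2)                 # random-effects weights
    mu = (wstar * y).sum() / wstar.sum()
    se = np.sqrt(1.0 / wstar.sum())
    return mu, se, tau2

# Hypothetical log-odds-ratio estimates from five trials.
y = np.array([0.1, 0.8, -0.3, 0.6, 0.2])
v = np.array([0.04, 0.09, 0.05, 0.06, 0.08])
print(dersimonian_laird(y, v))
```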

  4. A method for reducing the order of nonlinear dynamic systems

    NASA Astrophysics Data System (ADS)

    Masri, S. F.; Miller, R. K.; Sassi, H.; Caughey, T. K.

    1984-06-01

    An approximate method that uses conventional condensation techniques for linear systems together with the nonparametric identification of the reduced-order model generalized nonlinear restoring forces is presented for reducing the order of discrete multidegree-of-freedom dynamic systems that possess arbitrary nonlinear characteristics. The utility of the proposed method is demonstrated by considering a redundant three-dimensional finite-element model half of whose elements incorporate hysteretic properties. A nonlinear reduced-order model, of one-third the order of the original model, is developed on the basis of wideband stationary random excitation and the validity of the reduced-order model is subsequently demonstrated by its ability to predict with adequate accuracy the transient response of the original nonlinear model under a different nonstationary random excitation.

  5. Developing Appropriate Methods for Cost-Effectiveness Analysis of Cluster Randomized Trials

    PubMed Central

    Gomes, Manuel; Ng, Edmond S.-W.; Nixon, Richard; Carpenter, James; Thompson, Simon G.

    2012-01-01

    Aim. Cost-effectiveness analyses (CEAs) may use data from cluster randomized trials (CRTs), where the unit of randomization is the cluster, not the individual. However, most studies use analytical methods that ignore clustering. This article compares alternative statistical methods for accommodating clustering in CEAs of CRTs. Methods. Our simulation study compared the performance of statistical methods for CEAs of CRTs with 2 treatment arms. The study considered a method that ignored clustering—seemingly unrelated regression (SUR) without a robust standard error (SE)—and 4 methods that recognized clustering—SUR and generalized estimating equations (GEEs), both with robust SE, a “2-stage” nonparametric bootstrap (TSB) with shrinkage correction, and a multilevel model (MLM). The base case assumed CRTs with moderate numbers of balanced clusters (20 per arm) and normally distributed costs. Other scenarios included CRTs with few clusters, imbalanced cluster sizes, and skewed costs. Performance was reported as bias, root mean squared error (rMSE), and confidence interval (CI) coverage for estimating incremental net benefits (INBs). We also compared the methods in a case study. Results. Each method reported low levels of bias. Without the robust SE, SUR gave poor CI coverage (base case: 0.89 v. nominal level: 0.95). The MLM and TSB performed well in each scenario (CI coverage, 0.92–0.95). With few clusters, the GEE and SUR (with robust SE) had coverage below 0.90. In the case study, the mean INBs were similar across all methods, but ignoring clustering underestimated statistical uncertainty and the value of further research. Conclusions. MLMs and the TSB are appropriate analytical methods for CEAs of CRTs with the characteristics described. SUR and GEE are not recommended for studies with few clusters. PMID:22016450

  6. 9 CFR 2.33 - Attending veterinarian and adequate veterinary care.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE... adequate veterinary care. (a) Each research facility shall have an attending veterinarian who shall provide adequate veterinary care to its animals in compliance with this section: (1) Each research facility shall...

  7. 9 CFR 2.33 - Attending veterinarian and adequate veterinary care.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE... adequate veterinary care. (a) Each research facility shall have an attending veterinarian who shall provide adequate veterinary care to its animals in compliance with this section: (1) Each research facility shall...

  8. Analysis of random structure-acoustic interaction problems using coupled boundary element and finite element methods

    NASA Technical Reports Server (NTRS)

    Mei, Chuh; Pates, Carl S., III

    1994-01-01

    A coupled boundary element (BEM)-finite element (FEM) approach is presented to accurately model structure-acoustic interaction systems. The boundary element method is first applied to interior, two and three-dimensional acoustic domains with complex geometry configurations. Boundary element results are very accurate when compared with limited exact solutions. Structure-interaction problems are then analyzed with the coupled FEM-BEM method, where the finite element method models the structure and the boundary element method models the interior acoustic domain. The coupled analysis is compared with exact and experimental results for a simplistic model. Composite panels are analyzed and compared with isotropic results. The coupled method is then extended for random excitation. Random excitation results are compared with uncoupled results for isotropic and composite panels.

  9. Mechanical and Pharmacologic Methods of Labor Induction: A Randomized Controlled Trial

    PubMed Central

    Levine, Lisa D.; Downes, Katheryne L.; Elovitz, Michal A.; Parry, Samuel; Sammel, Mary D.; Srinivas, Sindhu K

    2016-01-01

Objective To evaluate the effectiveness of four commonly used induction methods. Methods This randomized trial compared four induction methods: misoprostol alone, Foley alone, misoprostol–cervical Foley concurrently, and Foley–oxytocin concurrently. Women undergoing labor induction with full-term (≥37 weeks), singleton, vertex-presenting gestations, with no contraindication to vaginal delivery, intact membranes, Bishop score ≤6, and cervical dilation ≤2 cm were included. Women were enrolled only once during the study period. Our primary outcome was time to delivery. Neither patients nor providers were blinded to assigned treatment group, since examinations are required for placement of all methods; however, research personnel were blinded during data abstraction. A sample size of 123 per group (N=492) was planned to compare the four groups pairwise (P≤.008), with a 4-hour reduction in delivery time considered clinically meaningful. Results From May 2013 through June 2015, 997 women were screened and 491 were randomized and analyzed. Demographic and clinical characteristics were similar among the four treatment groups. Comparing all induction methods, the combination methods achieved a faster median time to delivery than the single-agent methods (misoprostol–Foley: 13.1 hours, Foley–oxytocin: 14.5 hours, misoprostol: 17.6 hours, Foley: 17.7 hours, p<0.001). When censored for cesarean and adjusted for parity, women who received misoprostol–Foley were almost twice as likely to deliver before women who received misoprostol alone (hazard ratio [HR] 1.92, 95% CI 1.42–2.59) or Foley alone (HR 1.87, 95% CI 1.39–2.52), whereas Foley–oxytocin was not statistically different from the single-agent methods. Conclusion After censoring for cesarean and adjusting for parity, misoprostol–cervical Foley resulted in twice the chance of delivering before either single-agent method. PMID:27824758
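The censoring-aware comparison described above rests on the Kaplan-Meier estimator. As an illustrative sketch (the toy data and the treatment of cesarean as censoring are assumptions for demonstration, not the trial's dataset), the censored median time to delivery can be computed like this:

```python
def km_median(times, events):
    """Kaplan-Meier estimate of the median time, with censoring
    (e.g. cesarean deliveries treated as censored for vaginal delivery)."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(1 for tt, e in data[i:] if tt == t and e == 1)  # events at t
        c = sum(1 for tt, e in data[i:] if tt == t)             # total leaving at t
        surv *= 1.0 - d / n_at_risk                             # KM product step
        if surv <= 0.5:
            return t                                            # first time S(t) <= 0.5
        n_at_risk -= c
        i += c
    return None  # median not reached

# toy data: hours to delivery; event=1 vaginal delivery, 0 censored
times  = [10, 12, 13, 15, 18, 20]
events = [1,  1,  0,  1,  1,  1]
print(km_median(times, events))  # → 15
```

The censored observation at 13 hours reduces the risk set without dropping the survival curve, which is exactly why a naive median of the raw times would be biased.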

  10. A Semi-Analytical Method for the PDFs of A Ship Rolling in Random Oblique Waves

    NASA Astrophysics Data System (ADS)

    Liu, Li-qin; Liu, Ya-liu; Xu, Wan-hai; Li, Yan; Tang, You-gang

    2018-03-01

    The PDFs (probability density functions) and probability of a ship rolling under the random parametric and forced excitations were studied by a semi-analytical method. The rolling motion equation of the ship in random oblique waves was established. The righting arm obtained by the numerical simulation was approximately fitted by an analytical function. The irregular waves were decomposed into two Gauss stationary random processes, and the CARMA (2, 1) model was used to fit the spectral density function of parametric and forced excitations. The stochastic energy envelope averaging method was used to solve the PDFs and the probability. The validity of the semi-analytical method was verified by the Monte Carlo method. The C11 ship was taken as an example, and the influences of the system parameters on the PDFs and probability were analyzed. The results show that the probability of ship rolling is affected by the characteristic wave height, wave length, and the heading angle. In order to provide proper advice for the ship's manoeuvring, the parametric excitations should be considered appropriately when the ship navigates in the oblique seas.
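The paper's semi-analytical PDFs are validated against Monte Carlo simulation. A minimal Monte Carlo sketch of that validation step, using a generic damped nonlinear roll oscillator under random forcing (the oscillator parameters here are illustrative assumptions, not the C11 ship model), looks like this:

```python
import math, random

def roll_pdf(n_paths=200, steps=2000, dt=0.01, zeta=0.1, eps=0.2,
             sigma=0.5, bins=20, lim=2.0, seed=7):
    """Monte Carlo estimate of the stationary PDF of the roll angle for a
    damped nonlinear roll oscillator driven by random wave forcing."""
    rng = random.Random(seed)
    counts = [0] * bins
    total = 0
    for _ in range(n_paths):
        x, v = 0.0, 0.0
        for k in range(steps):
            a = -2 * zeta * v - x - eps * x**3        # damping + restoring moment
            v += a * dt + sigma * math.sqrt(dt) * rng.gauss(0, 1)
            x += v * dt
            if k > steps // 2 and -lim <= x < lim:    # discard the transient
                counts[int((x + lim) / (2 * lim) * bins)] += 1
                total += 1
    width = 2 * lim / bins
    return [c / (total * width) for c in counts]      # normalized density

pdf = roll_pdf()
print(round(sum(p * (2 * 2.0 / 20) for p in pdf), 3))  # integrates to 1.0
```

Histogramming long post-transient trajectories is the standard brute-force reference against which the stochastic averaging result can be checked.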

  11. Logistic Regression with Multiple Random Effects: A Simulation Study of Estimation Methods and Statistical Packages

    PubMed Central

    Kim, Yoonsang; Emery, Sherry

    2013-01-01

    Several statistical packages are capable of estimating generalized linear mixed models and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods’ performance for the mixed-effects logistic regression model. However, the authors focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performances of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages—SAS GLIMMIX Laplace and SuperMix Gaussian quadrature—perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes. PMID:24288415
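A simulation study of this kind starts from a known data-generating model whose parameters the estimation methods must recover. As a hedged sketch, here is generation from a random-intercept logistic model (one random effect only; the article's models with three or more correlated random effects extend this with a covariance structure):

```python
import math, random

def simulate_two_level_logit(n_groups=50, n_per=20, b0=-0.5, b1=0.8,
                             sd_u=1.0, seed=3):
    """Generate data from a random-intercept logistic model:
    logit P(y=1) = b0 + u_j + b1*x,  with u_j ~ N(0, sd_u^2)."""
    rng = random.Random(seed)
    rows = []
    for j in range(n_groups):
        u = rng.gauss(0, sd_u)                  # group-level random effect
        for _ in range(n_per):
            x = rng.gauss(0, 1)                 # individual-level covariate
            p = 1 / (1 + math.exp(-(b0 + u + b1 * x)))
            rows.append((j, x, int(rng.random() < p)))
    return rows

rows = simulate_two_level_logit()
print(len(rows))  # 1000 observations in 50 clusters
```

Feeding such synthetic data to each package (e.g. SAS GLIMMIX or SuperMix) and comparing the estimates of b0, b1, and sd_u against the known truth is the essence of the comparison the authors report.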

  12. WE-AB-207A-04: Random Undersampled Cone Beam CT: Theoretical Analysis and a Novel Reconstruction Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, C; Chen, L; Jia, X

    2016-06-15

Purpose: Reducing x-ray exposure and speeding up data acquisition have motivated studies on projection data undersampling. An important open question is, for a given undersampling ratio, what the optimal undersampling approach is. In this study, we propose a new undersampling scheme: random-ray undersampling. We mathematically analyze its projection matrix properties and demonstrate its advantages. We also propose a new reconstruction method that simultaneously performs CT image reconstruction and projection domain data restoration. Methods: By representing the projection operator under the basis of singular vectors of the full projection operator, matrix representations for an undersampling case can be generated and numerical singular value decomposition can be performed. We compared matrix properties among three undersampling approaches: regular-view undersampling, regular-ray undersampling, and the proposed random-ray undersampling. To accomplish CT reconstruction for random undersampling, we developed a novel method that iteratively performs CT reconstruction and missing projection data restoration via regularization approaches. Results: For a given undersampling ratio, random-ray undersampling preserved the mathematical properties of the full projection operator better than the other two approaches. This translates to the advantage of reconstructing CT images with lower errors. Different types of image artifacts were observed depending on the undersampling strategy, and were ascribed to the unique singular vectors of the sampling operators in the image domain. We tested the proposed reconstruction algorithm on a FORBILD phantom with only 30% of the projection data randomly acquired. The reconstructed image error was reduced from 9.4% with a TV method to 7.6% with the proposed method. Conclusion: The proposed random-ray undersampling is mathematically advantageous over other typical undersampling approaches. It may permit better image reconstruction at the same
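The three undersampling strategies can be made concrete as masks over a sinogram. This is an illustrative sketch only (the scheme names, sinogram layout, and rounding of the keep-every-k rule are assumptions, not the authors' implementation):

```python
import random

def undersampling_mask(n_views, n_rays, ratio, scheme, seed=0):
    """Boolean sinogram mask (views x rays) keeping roughly `ratio`
    of the data, for the three undersampling schemes in the abstract."""
    rng = random.Random(seed)
    if scheme == "regular-view":       # keep every k-th projection view
        k = round(1 / ratio)
        return [[(v % k == 0) for _ in range(n_rays)] for v in range(n_views)]
    if scheme == "regular-ray":        # keep every k-th detector ray
        k = round(1 / ratio)
        return [[(r % k == 0) for r in range(n_rays)] for _ in range(n_views)]
    if scheme == "random-ray":         # keep each ray independently at random
        return [[rng.random() < ratio for _ in range(n_rays)]
                for _ in range(n_views)]
    raise ValueError(scheme)

m = undersampling_mask(60, 64, 0.3, "random-ray")
kept = sum(sum(row) for row in m) / (60 * 64)
print(round(kept, 2))
```

The masks select which rows of the full projection matrix survive; the paper's SVD analysis then asks which selection best preserves the operator's singular-value structure.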

  13. Conic Sampling: An Efficient Method for Solving Linear and Quadratic Programming by Randomly Linking Constraints within the Interior

    PubMed Central

    Serang, Oliver

    2012-01-01

Linear programming (LP) problems are commonly used in analysis and resource allocation, frequently surfacing as approximations to more difficult problems. Existing approaches to LP have been dominated by a small group of methods, and randomized algorithms have not enjoyed popularity in practice. This paper introduces a novel randomized method of solving LP problems by moving along the facets and within the interior of the polytope along rays randomly sampled from the polyhedral cones defined by the bounding constraints. This conic sampling method is then applied to randomly sampled LPs, and its runtime performance is shown to compare favorably to the simplex and primal affine-scaling algorithms, especially on polytopes with certain characteristics. The conic sampling method is then adapted and applied to solve a certain quadratic program, which computes a projection onto a polytope; the proposed method is shown to outperform the proprietary software Mathematica on large, sparse QP problems constructed from mass spectrometry-based proteomics. PMID:22952741

  14. How to Do Random Allocation (Randomization)

    PubMed Central

    Shin, Wonshik

    2014-01-01

    Purpose To explain the concept and procedure of random allocation as used in a randomized controlled study. Methods We explain the general concept of random allocation and demonstrate how to perform the procedure easily and how to report it in a paper. PMID:24605197
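The procedure the paper demonstrates is commonly implemented as permuted-block randomization, which keeps arm sizes balanced throughout accrual. A minimal sketch (arm labels and block size are illustrative):

```python
import random

def block_randomize(n_participants, arms=("A", "B"), block_size=4, seed=1):
    """Permuted-block randomization: each block contains every arm
    equally often, so group sizes stay balanced as participants accrue."""
    assert block_size % len(arms) == 0
    rng = random.Random(seed)
    allocation = []
    while len(allocation) < n_participants:
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)            # random order within each block
        allocation.extend(block)
    return allocation[:n_participants]

alloc = block_randomize(20)
print(alloc.count("A"), alloc.count("B"))  # → 10 10
```

In a real trial the seed and the allocation list would be generated by an independent statistician and concealed from recruiters, which is the point the paper makes about allocation concealment.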

  15. Perceptions on the right to adequate food after a major landslide disaster: a cross-sectional survey of two districts in Uganda.

    PubMed

    Rukundo, Peter M; Iversen, Per O; Andreassen, Bård A; Oshaug, Arne; Kikafunda, Joyce; Rukooko, Byaruhanga

    2015-04-25

Despite the instruments on the right to adequate food adopted by the United Nations, there exists limited information on how this right is perceived. Following a major 2010 landslide disaster in the Bududa district of Eastern Uganda and the resettlement of some affected households into the Kiryandongo district in Western Uganda, we surveyed both districts to explore perceptions about the right to adequate food among households with different experiences: disaster-affected and controls. We applied qualitative and quantitative techniques in a cross-sectional survey. The index respondent was the head of each randomly selected household from the landslide-affected communities, with controls from a bordering sub-county. Data were collected by interviews and focus group discussions (FGDs). Structured entries were tested statistically to report associations using Pearson's chi-square at the 95% CI. Information from FGDs was transcribed, coded, sequenced and patterned. Findings from both techniques were triangulated to facilitate interpretation. The analysis included 1,078 interview entries and 12 FGDs. Significant differences between the affected and control households (P < 0.05) were observed for: age; education level; religious affiliation; existence of assets that complement the food source; and having received relief food. Analysis between groups showed differences in responses on: whether everyone has a right to adequate food; who was supposed to supply relief food; whether relief food was adequate; and the preferred choice of means to ensure the right to adequate food. FGDs emphasized that access to land was the most important means to food and income. Affected households desired remedial interventions, especially alternative land for livelihood. Despite the provision of adequate relief food being a state's obligation, there was no opportunity to exercise choice and preference. Comprehension and awareness of accountability and transparency issues were also low. Though a

  16. Percolation of the site random-cluster model by Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Wang, Songsong; Zhang, Wanzhou; Ding, Chengxiang

    2015-08-01

We propose a site random-cluster model by introducing an additional cluster weight in the partition function of the traditional site percolation model. To simulate the model on a square lattice, we combine the color-assignation and Swendsen-Wang methods to design a highly efficient cluster algorithm with only a weak critical slowing-down phenomenon. To verify whether or not it is consistent with the bond random-cluster model, we measure several quantities, such as the wrapping probability Re, the percolating cluster density P∞, and the magnetic susceptibility per site χp, as well as two exponents, the thermal exponent yt and the fractal dimension yh of the percolating cluster. We find that for different values of the cluster weight q = 1.5, 2, 2.5, 3, 3.5, and 4, the numerical estimates of the exponents yt and yh are consistent with the theoretical values. The universalities of the site random-cluster model and the bond random-cluster model are completely identical. For larger values of q, we find obvious signatures of a first-order percolation transition in the histograms and the hysteresis loops of the percolating cluster density and the energy per site. Our results are helpful for the understanding of the percolation of traditional statistical models.
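The q = 1 limit of such models is ordinary site percolation, which is easy to probe by Monte Carlo. As a hedged sketch of the kind of percolation-probability measurement the abstract describes (this uses a top-to-bottom spanning criterion with union-find, simpler than the paper's wrapping probability and cluster algorithm):

```python
import random

def percolates(L, p, rng):
    """One site-percolation trial on an L x L square lattice: occupy each
    site with probability p, test for a top-to-bottom connected cluster."""
    occ = [[rng.random() < p for _ in range(L)] for _ in range(L)]
    parent = list(range(L * L + 2))        # two virtual nodes: top, bottom
    TOP, BOT = L * L, L * L + 1

    def find(a):                           # union-find with path halving
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a

    def union(a, b):
        parent[find(a)] = find(b)

    for i in range(L):
        for j in range(L):
            if not occ[i][j]:
                continue
            s = i * L + j
            if i == 0: union(s, TOP)                   # touches top edge
            if i == L - 1: union(s, BOT)               # touches bottom edge
            if i > 0 and occ[i - 1][j]: union(s, s - L)
            if j > 0 and occ[i][j - 1]: union(s, s - 1)
    return find(TOP) == find(BOT)

rng = random.Random(11)
trials = 200
results = {}
for p in (0.3, 0.8):                       # well below / above pc ≈ 0.593
    results[p] = sum(percolates(16, p, rng) for _ in range(trials)) / trials
print(results)
```

Sweeping p through the transition region and tracking quantities such as the percolating-cluster density is the measurement strategy the paper extends to general cluster weight q.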

  17. The iodized salt programme in Bangalore, India provides adequate iodine intakes in pregnant women and more-than-adequate iodine intakes in their children.

    PubMed

    Jaiswal, Nidhi; Melse-Boonstra, Alida; Sharma, Surjeet Kaur; Srinivasan, Krishnamachari; Zimmermann, Michael B

    2015-02-01

To compare the iodine status of pregnant women and their children who were sharing all meals in Bangalore, India. A cross-sectional study evaluating demographic characteristics, household salt iodine concentration and salt usage patterns, urinary iodine concentrations (UIC) in women and children, and maternal thyroid volume (ultrasound). Antenatal clinic of an urban tertiary-care hospital, which serves a low-income population. Healthy pregnant women in all trimesters, aged 18-35 years, who had healthy children aged 3-15 years. Median (range) iodine concentrations of household powdered and crystal salt were 55·9 (17·2-65·9) ppm and 18·9 (2·2-68·2) ppm, respectively. The contribution of iodine-containing supplements and multi-micronutrient powders to iodine intake in the families was negligible. Adequately iodized salt, together with small amounts of iodine in local foods, was providing adequate iodine during pregnancy: (i) the overall median (range) UIC in women was 172 (5-1024) µg/l; (ii) the median UIC was >150 µg/l in all trimesters; and (iii) thyroid size was not significantly different across trimesters. At the same time, the median (range) UIC in children was 220 (10-782) µg/l, indicating more-than-adequate iodine intake at this age. Median UIC was significantly higher in children than in their mothers (P=0·008). In this selected urban population of southern India, the iodized salt programme provides adequate iodine to women throughout pregnancy, at the expense of higher iodine intake in their children. Thus we suggest that the current cut-off for median UIC in children indicating more-than-adequate intake, recommended by the WHO/UNICEF/International Council for the Control of Iodine Deficiency Disorders, may need to be reconsidered.

  18. A method for determining the weak statistical stationarity of a random process

    NASA Technical Reports Server (NTRS)

    Sadeh, W. Z.; Koper, C. A., Jr.

    1978-01-01

    A method for determining the weak statistical stationarity of a random process is presented. The core of this testing procedure consists of generating an equivalent ensemble which approximates a true ensemble. Formation of an equivalent ensemble is accomplished through segmenting a sufficiently long time history of a random process into equal, finite, and statistically independent sample records. The weak statistical stationarity is ascertained based on the time invariance of the equivalent-ensemble averages. Comparison of these averages with their corresponding time averages over a single sample record leads to a heuristic estimate of the ergodicity of a random process. Specific variance tests are introduced for evaluating the statistical independence of the sample records, the time invariance of the equivalent-ensemble autocorrelations, and the ergodicity. Examination and substantiation of these procedures were conducted utilizing turbulent velocity signals.
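The core of the testing procedure, segmenting one long time history into an equivalent ensemble and checking that the ensemble averages are time-invariant, can be sketched directly (the signals and the spread threshold below are illustrative, not the paper's turbulence data or variance tests):

```python
import random, statistics

def segment_averages(signal, n_records):
    """Split one long time history into equal, contiguous sample records
    (an 'equivalent ensemble') and return each record's mean; their
    time-invariance indicates weak statistical stationarity."""
    m = len(signal) // n_records
    return [statistics.fmean(signal[k * m:(k + 1) * m])
            for k in range(n_records)]

rng = random.Random(5)
stationary = [rng.gauss(0, 1) for _ in range(8000)]            # flat mean
drifting   = [rng.gauss(0, 1) + 0.001 * t for t in range(8000)] # mean drifts

spread_stat  = statistics.pstdev(segment_averages(stationary, 8))
spread_drift = statistics.pstdev(segment_averages(drifting, 8))
print(round(spread_stat, 3), round(spread_drift, 3))
```

For the stationary signal the record means scatter only by sampling error, while the drifting signal's record means march upward, exactly the contrast the equivalent-ensemble test is designed to expose.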

  19. Methods for synthesizing findings on moderation effects across multiple randomized trials.

    PubMed

    Brown, C Hendricks; Sloboda, Zili; Faggiano, Fabrizio; Teasdale, Brent; Keller, Ferdinand; Burkhart, Gregor; Vigna-Taglianti, Federica; Howe, George; Masyn, Katherine; Wang, Wei; Muthén, Bengt; Stephens, Peggy; Grey, Scott; Perrino, Tatiana

    2013-04-01

    This paper presents new methods for synthesizing results from subgroup and moderation analyses across different randomized trials. We demonstrate that such a synthesis generally results in additional power to detect significant moderation findings above what one would find in a single trial. Three general methods for conducting synthesis analyses are discussed, with two methods, integrative data analysis and parallel analyses, sharing a large advantage over traditional methods available in meta-analysis. We present a broad class of analytic models to examine moderation effects across trials that can be used to assess their overall effect and explain sources of heterogeneity, and present ways to disentangle differences across trials due to individual differences, contextual level differences, intervention, and trial design.

  20. Adequate antiplatelet regimen in patients on chronic anti-vitamin K treatment undergoing percutaneous coronary intervention

    PubMed Central

    Brugaletta, Salvatore; Martin-Yuste, Victoria; Ferreira-González, Ignacio; Cola, Clarissa; Alvarez-Contreras, Luis; Antonio, Marta De; Garcia-Moll, Xavier; García-Picart, Joan; Martí, Vicens; Balcells-Iranzo, Jordi; Sabaté, Manel

    2011-01-01

    AIM: To investigate the impact of dual antiplatelet therapy (DAT) in patients on anti-vitamin K (AVK) regimen requiring percutaneous coronary intervention (PCI). METHODS: Between February 2006 and February 2008, 138 consecutive patients under chronic AVK treatment were enrolled in this registry. Of them, 122 received bare metal stent implantation and 16 received drug eluting stent implantation. The duration of DAT, on top of AVK treatment, was decided at the discretion of the clinician. Adequate duration of DAT was defined according to type of stent implanted and to its clinical indication. RESULTS: The baseline clinical characteristics of patients reflect their high risk, with high incidence of comorbid conditions (Charlson score ≥ 3 in 89% of the patients). At a mean follow-up of 17 ± 11 mo, 22.9% of patients developed a major adverse cardiac event (MACE): 12.6% died from cardiovascular disease and almost 6% had an acute myocardial infarction. Major hemorrhagic events were observed in 7.4%. Adequate DAT was obtained in only 44% of patients. In the multivariate analysis, no adequate DAT and Charlson score were the only independent predictors of MACE (both P = 0.02). CONCLUSION: Patients on chronic AVK therapy represent a high risk population and suffer from a high MACE rate after PCI. An adequate DAT regimen and absence of comorbid conditions are strongly associated with better clinical outcomes. PMID:22125672

1. Incorporating BIRD-based homodecoupling in the dual-optimized, inverted ¹J(CC) 1,n-ADEQUATE experiment.

    PubMed

    Saurí, Josep; Bermel, Wolfgang; Parella, Teodor; Thomas Williamson, R; Martin, Gary E

    2018-03-13

1,n-ADEQUATE is a powerful NMR technique for elucidating the structure of proton-deficient small molecules that can help establish the carbon skeleton of a given molecule by providing long-range three-bond ¹³C─¹³C correlations. Care must be taken when using the experiment to identify the simultaneous presence of one-bond ¹³C─¹³C correlations that are not filtered out, unlike the HMBC experiment, which has a low-pass J-filter to filter out ¹J(CH) responses. Dual-optimized, inverted ¹J(CC) 1,n-ADEQUATE is an improved variant of the experiment that affords broadband inversion of direct responses, obviating the need to take additional steps to identify these correlations. Even though ADEQUATE experiments can now be acquired in a reasonable amount of experimental time if a cryogenic probe is available, low sensitivity is still the main impediment limiting the application of this elegant experiment. Here, we report a further refinement that incorporates real-time bilinear rotation decoupling (BIRD)-based homodecoupling methodology into the dual-optimized, inverted ¹J(CC) 1,n-ADEQUATE pulse sequence. Improved sensitivity and resolution are achieved by collapsing homonuclear proton-proton couplings in the observed multiplets for most spin systems. The application of the method is illustrated with several model compounds. Copyright © 2018 John Wiley & Sons, Ltd.

  2. A New Method of Random Environmental Walking for Assessing Behavioral Preferences for Different Lighting Applications

    PubMed Central

    Patching, Geoffrey R.; Rahm, Johan; Jansson, Märit; Johansson, Maria

    2017-01-01

    Accurate assessment of people’s preferences for different outdoor lighting applications is increasingly considered important in the development of new urban environments. Here a new method of random environmental walking is proposed to complement current methods of assessing urban lighting applications, such as self-report questionnaires. The procedure involves participants repeatedly walking between different lighting applications by random selection of a lighting application and preferred choice or by random selection of a lighting application alone. In this manner, participants are exposed to all lighting applications of interest more than once and participants’ preferences for the different lighting applications are reflected in the number of times they walk to each lighting application. On the basis of an initial simulation study, to explore the feasibility of this approach, a comprehensive field test was undertaken. The field test included random environmental walking and collection of participants’ subjective ratings of perceived pleasantness (PP), perceived quality, perceived strength, and perceived flicker of four lighting applications. The results indicate that random environmental walking can reveal participants’ preferences for different lighting applications that, in the present study, conformed to participants’ ratings of PP and perceived quality of the lighting applications. As a complement to subjectively stated environmental preferences, random environmental walking has the potential to expose behavioral preferences for different lighting applications. PMID:28337163
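The walking procedure lends itself to a simple behavioral simulation. The sketch below is a loose stand-in under stated assumptions (the 50/50 mix of the two selection rules and the fixed-preference participant are hypothetical, not the paper's protocol): visit counts accumulate toward the preferred lighting application.

```python
import random

def random_walk_session(applications, n_walks, prefers, seed=21):
    """Simulate one participant's random environmental walking session:
    each step draws a lighting application at random, then the participant
    walks either to the drawn application or to their preferred choice."""
    rng = random.Random(seed)
    visits = {a: 0 for a in applications}
    for _ in range(n_walks):
        drawn = rng.choice(applications)
        # half the steps follow the random draw, half follow preference
        target = drawn if rng.random() < 0.5 else prefers(drawn)
        visits[target] += 1
    return visits

apps = ["A", "B", "C", "D"]
# hypothetical participant who always prefers application B
visits = random_walk_session(apps, n_walks=40, prefers=lambda drawn: "B")
print(visits)
```

The count imbalance across applications is the behavioral preference signal the method reads out, complementing the self-report ratings.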

  3. Bystander fatigue and CPR quality by older bystanders: a randomized crossover trial comparing continuous chest compressions and 30:2 compressions to ventilations.

    PubMed

    Liu, Shawn; Vaillancourt, Christian; Kasaboski, Ann; Taljaard, Monica

    2016-11-01

This study sought to measure bystander fatigue and cardiopulmonary resuscitation (CPR) quality after five minutes of CPR using the continuous chest compression (CCC) versus the 30:2 chest compression to ventilation method in older lay persons, the population most likely to perform CPR on cardiac arrest victims. This randomized crossover trial took place at three tertiary care hospitals and a seniors' center. Participants were aged ≥55 years without significant physical limitations (frailty score ≤3/7). They completed two 5-minute CPR sessions (using 30:2 and CCC) on manikins; sessions were separated by a rest period. We used concealed block randomization to determine CPR method order. Metronome feedback maintained a compression rate of 100/minute. We measured heart rate (HR), mean arterial pressure (MAP), and the Borg Exertion Scale. CPR quality measures included the total number of compressions and the number of adequate compressions (depth ≥5 cm). Sixty-three participants were enrolled: mean age 70.8 years, female 66.7%, past CPR training 60.3%. Bystander fatigue was similar between CPR methods: mean difference in HR -0.59 (95% CI -3.51 to 2.33), MAP 1.64 (95% CI -0.23 to 3.50), and Borg 0.46 (95% CI 0.07 to 0.84). Compared to 30:2, participants using CCC performed more chest compressions (480.0 v. 376.3, mean difference 107.7; p<0.0001) and more adequate chest compressions (381.5 v. 324.9, mean difference 62.0; p=0.0001), although good compressions/minute declined significantly faster with the CCC method (p=0.0002). CPR quality decreased significantly faster when performing CCC compared to 30:2. However, performing CCC produced more adequate compressions overall with a similar level of fatigue compared to the 30:2 method.

  4. A Comparison of Single Sample and Bootstrap Methods to Assess Mediation in Cluster Randomized Trials

    ERIC Educational Resources Information Center

    Pituch, Keenan A.; Stapleton, Laura M.; Kang, Joo Youn

    2006-01-01

    A Monte Carlo study examined the statistical performance of single sample and bootstrap methods that can be used to test and form confidence interval estimates of indirect effects in two cluster randomized experimental designs. The designs were similar in that they featured random assignment of clusters to one of two treatment conditions and…
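The bootstrap approach the study evaluates can be sketched for the simplest mediation setting. This is a simplified single-level, bivariate-slope illustration under stated assumptions (a cluster-randomized analysis would resample whole clusters and adjust the regressions accordingly):

```python
import random

def slope(x, y):
    """OLS slope of y on x (single predictor)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

def bootstrap_indirect(x, m, y, n_boot=2000, seed=9):
    """Percentile bootstrap CI for the indirect effect a*b, where
    a = slope of M on X and b = slope of Y on M."""
    rng = random.Random(seed)
    n = len(x)
    est = slope(x, m) * slope(m, y)
    boots = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]     # resample cases
        xs = [x[i] for i in idx]
        ms = [m[i] for i in idx]
        ys = [y[i] for i in idx]
        boots.append(slope(xs, ms) * slope(ms, ys))
    boots.sort()
    return est, boots[int(0.025 * n_boot)], boots[int(0.975 * n_boot)]

# synthetic data with a true indirect effect of 0.6 * 0.5 = 0.3
rng = random.Random(2)
x = [rng.gauss(0, 1) for _ in range(120)]
m = [0.6 * xi + rng.gauss(0, 1) for xi in x]
y = [0.5 * mi + rng.gauss(0, 1) for mi in m]
est, lo, hi = bootstrap_indirect(x, m, y)
print(est, lo, hi)
```

The percentile interval makes no normality assumption about a*b, which is why bootstrap methods are preferred over the single-sample normal-theory test when the indirect effect's sampling distribution is skewed.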

  5. Multi-label spacecraft electrical signal classification method based on DBN and random forest

    PubMed Central

    Li, Ke; Yu, Nan; Li, Pengfei; Song, Shimin; Wu, Yalei; Li, Yang; Liu, Meng

    2017-01-01

Spacecraft electrical signal characteristic data contain a large amount of high-dimensional features, with a high degree of computational complexity and a low identification rate, which causes great difficulty in fault diagnosis of spacecraft electronic load systems. This paper proposes a feature extraction method based on deep belief networks (DBN) and a classification method based on the random forest (RF) algorithm. The proposed algorithm mainly employs a multi-layer neural network to reduce the dimension of the original data before classification is applied. First, wavelet denoising is used to pre-process the data. Second, the deep belief network is used to reduce the feature dimension and improve the classification rate for the electrical characteristics data. Finally, the random forest algorithm is used to classify the data and is compared with other algorithms. The experimental results show that, compared with other algorithms, the proposed method shows excellent performance in terms of accuracy, computational efficiency, and stability in addressing spacecraft electrical signal data. PMID:28486479

  6. Multi-label spacecraft electrical signal classification method based on DBN and random forest.

    PubMed

    Li, Ke; Yu, Nan; Li, Pengfei; Song, Shimin; Wu, Yalei; Li, Yang; Liu, Meng

    2017-01-01

Spacecraft electrical signal characteristic data contain a large amount of high-dimensional features, with a high degree of computational complexity and a low identification rate, which causes great difficulty in fault diagnosis of spacecraft electronic load systems. This paper proposes a feature extraction method based on deep belief networks (DBN) and a classification method based on the random forest (RF) algorithm. The proposed algorithm mainly employs a multi-layer neural network to reduce the dimension of the original data before classification is applied. First, wavelet denoising is used to pre-process the data. Second, the deep belief network is used to reduce the feature dimension and improve the classification rate for the electrical characteristics data. Finally, the random forest algorithm is used to classify the data and is compared with other algorithms. The experimental results show that, compared with other algorithms, the proposed method shows excellent performance in terms of accuracy, computational efficiency, and stability in addressing spacecraft electrical signal data.

  7. Encryption method based on pseudo random spatial light modulation for single-fibre data transmission

    NASA Astrophysics Data System (ADS)

    Kowalski, Marcin; Zyczkowski, Marek

    2017-11-01

Optical cryptosystems can provide encryption and sometimes compression simultaneously. They are increasingly attractive for securing information, especially for image encryption. Our studies have shown that optical cryptosystems can be used to encrypt optical data transmission. We propose and study a new method for securing fibre data communication. The paper presents a method for optical encryption of data transmitted with a single optical fibre. The encryption process relies on pseudo-random spatial light modulation, a combination of two encryption keys, and the Compressed Sensing framework. A linear combination of light pulses with pseudo-random patterns provides the required encryption performance. We propose an architecture to transmit the encrypted data through the optical fibre. The paper describes the method and presents the theoretical analysis, the design of a physical model, and experimental results.
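The key idea, key-seeded pseudo-random patterns producing compressed-sensing-style measurements, can be sketched numerically. This is a toy digital analogue under stated assumptions (binary patterns and a string key stand in for the physical spatial light modulator):

```python
import random

def modulation_patterns(key, n_patterns, n_pixels):
    """Derive pseudo-random binary modulation patterns from an encryption
    key: both ends regenerate identical patterns from the shared key,
    so only the measurements travel over the fibre."""
    rng = random.Random(key)
    return [[rng.randrange(2) for _ in range(n_pixels)]
            for _ in range(n_patterns)]

def measure(signal, patterns):
    """Each transmitted value is the inner product of the data frame with
    one pseudo-random pattern (a compressed-sensing style measurement)."""
    return [sum(p * s for p, s in zip(pat, signal)) for pat in patterns]

frame = [3, 1, 4, 1, 5, 9, 2, 6]
sent  = measure(frame, modulation_patterns("secret-key", 4, 8))
same  = measure(frame, modulation_patterns("secret-key", 4, 8))
other = measure(frame, modulation_patterns("wrong-key", 4, 8))
print(sent == same, sent == other)
```

With fewer patterns than pixels the measurements also compress the frame; recovering it then requires a sparse-reconstruction step on top of knowing the key, which is what couples the encryption to the compressed sensing framework.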

  8. The impact of urban gardens on adequate and healthy food: a systematic review.

    PubMed

    Garcia, Mariana T; Ribeiro, Silvana M; Germani, Ana Claudia Camargo Gonçalves; Bógus, Cláudia M

    2018-02-01

    To examine the impacts on food and nutrition-related outcomes resulting from participation in urban gardens, especially on healthy food practices, healthy food access, and healthy food beliefs, knowledge and attitudes. The systematic review identified studies by searching the PubMed, ERIC, LILACS, Web of Science and Embase databases. An assessment of quality and bias risk of the studies was carried out and a narrative summary was produced. Studies published as original articles in peer-reviewed scientific journals in English, Spanish or Portuguese between 2005 and 2015 were included. The studies included were based on data from adult participants in urban gardens. Twenty-four studies were initially selected based on the eligibility criteria, twelve of which were included. There was important heterogeneity of settings, population and assessment methods. Assessment of quality and bias risk of the studies revealed the need for greater methodological rigour. Most studies investigated community gardens and employed a qualitative approach. The following were reported: greater fruit and vegetable consumption, better access to healthy foods, greater valuing of cooking, harvest sharing with family and friends, enhanced importance of organic production, and valuing of adequate and healthy food. Thematic patterns related to adequate and healthy food associated with participation in urban gardens were identified, revealing a positive impact on practices of adequate and healthy food and mainly on food perceptions.

  9. Random vs. Combinatorial Methods for Discrete Event Simulation of a Grid Computer Network

    NASA Technical Reports Server (NTRS)

    Kuhn, D. Richard; Kacker, Raghu; Lei, Yu

    2010-01-01

This study compared random and t-way combinatorial inputs of a network simulator, to determine if these two approaches produce significantly different deadlock detection for varying network configurations. Modeling deadlock detection is important for analyzing configuration changes that could inadvertently degrade network operations, or for determining modifications that could be made by attackers to deliberately induce deadlock. Discrete event simulation of a network may be conducted using random generation of inputs. In this study, we compare random with combinatorial generation of inputs. Combinatorial (or t-way) testing requires every combination of any t parameter values to be covered by at least one test. Combinatorial methods can be highly effective because empirical data suggest that nearly all failures involve the interaction of a small number of parameters (1 to 6). Thus, for example, if all deadlocks involve at most 5-way interactions between n parameters, then exhaustive testing of all n-way interactions adds no additional information that would not be obtained by testing all 5-way interactions. While the maximum degree of interaction between parameters involved in the deadlocks clearly cannot be known in advance, covering all t-way interactions may be more efficient than using random generation of inputs. In this study we tested this hypothesis for t = 2, 3, and 4 for deadlock detection in a network simulation. Achieving the same degree of coverage provided by 4-way tests would have required approximately 3.2 times as many random tests; thus combinatorial methods were more efficient for detecting deadlocks involving a higher degree of interactions. The paper reviews explanations for these results and implications for modeling and simulation.
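The comparison hinges on measuring how much t-way coverage a random test suite achieves. A minimal sketch for t = 2 (the parameter counts and suite size are illustrative, not the study's simulator configuration):

```python
import itertools, random

def pairwise_coverage(tests, n_values):
    """Fraction of all 2-way parameter-value combinations covered by a
    test suite, for parameters that each take `n_values` values."""
    n_params = len(tests[0])
    covered = {((i, t[i]), (j, t[j]))
               for t in tests
               for i, j in itertools.combinations(range(n_params), 2)}
    n_pairs = len(list(itertools.combinations(range(n_params), 2)))
    return len(covered) / (n_pairs * n_values ** 2)

rng = random.Random(4)
n_params, n_values = 5, 3
random_tests = [tuple(rng.randrange(n_values) for _ in range(n_params))
                for _ in range(30)]
cov = pairwise_coverage(random_tests, n_values)
print(round(cov, 2))
```

Random suites close in on full pairwise coverage only asymptotically, while a combinatorial generator guarantees 100% coverage with far fewer tests; counting coverage this way is how the "3.2 times as many random tests" figure is obtained for higher t.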

  10. Digital double random amplitude image encryption method based on the symmetry property of the parametric discrete Fourier transform

    NASA Astrophysics Data System (ADS)

    Bekkouche, Toufik; Bouguezel, Saad

    2018-03-01

    We propose a real-to-real image encryption method. It is a double random amplitude encryption method based on the parametric discrete Fourier transform coupled with chaotic maps to perform the scrambling. The main idea behind this method is the introduction of a complex-to-real conversion by exploiting the inherent symmetry property of the transform in the case of real-valued sequences. This conversion allows the encrypted image to be real-valued instead of being a complex-valued image as in all existing double random phase encryption methods. The advantage is to store or transmit only one image instead of two images (real and imaginary parts). Computer simulation results and comparisons with the existing double random amplitude encryption methods are provided for peak signal-to-noise ratio, correlation coefficient, histogram analysis, and key sensitivity.
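
    The symmetry being exploited can be illustrated with the ordinary DFT (the paper's parametric discrete Fourier transform is not reproduced here; this sketch only shows the conjugate-symmetry property of real-valued sequences that makes a real-valued ciphertext possible):

```python
import cmath

def dft(x):
    """Naive O(N^2) discrete Fourier transform."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

x = [1.0, 2.0, 0.5, -1.0, 3.0, 0.0]     # real-valued "image row"
X = dft(x)
N = len(x)
# For real input the spectrum is conjugate-symmetric, X[k] == conj(X[N-k]),
# so about half of the complex coefficients carry all of the information --
# the redundancy that allows a complex-to-real conversion of the ciphertext.
for k in range(1, N):
    assert abs(X[k] - X[N - k].conjugate()) < 1e-9
print("conjugate symmetry holds for real input")
```
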

  11. Method for high-volume sequencing of nucleic acids: random and directed priming with libraries of oligonucleotides

    DOEpatents

    Studier, F. William

    1995-04-18

    Random and directed priming methods for determining nucleotide sequences by enzymatic sequencing techniques, using libraries of primers of lengths 8, 9 or 10 bases, are disclosed. These methods permit direct sequencing of nucleic acids as large as 45,000 base pairs or larger without the necessity for subcloning. Individual primers are used repeatedly to prime sequence reactions in many different nucleic acid molecules. Libraries containing as few as 10,000 octamers, 14,200 nonamers, or 44,000 decamers would have the capacity to determine the sequence of almost any cosmid DNA. Random priming with a fixed set of primers from a smaller library can also be used to initiate the sequencing of individual nucleic acid molecules, with the sequence being completed by directed priming with primers from the library. In contrast to random cloning techniques, a combined random and directed priming strategy is far more efficient.

  12. Method for high-volume sequencing of nucleic acids: random and directed priming with libraries of oligonucleotides

    DOEpatents

    Studier, F.W.

    1995-04-18

    Random and directed priming methods for determining nucleotide sequences by enzymatic sequencing techniques, using libraries of primers of lengths 8, 9 or 10 bases, are disclosed. These methods permit direct sequencing of nucleic acids as large as 45,000 base pairs or larger without the necessity for subcloning. Individual primers are used repeatedly to prime sequence reactions in many different nucleic acid molecules. Libraries containing as few as 10,000 octamers, 14,200 nonamers, or 44,000 decamers would have the capacity to determine the sequence of almost any cosmid DNA. Random priming with a fixed set of primers from a smaller library can also be used to initiate the sequencing of individual nucleic acid molecules, with the sequence being completed by directed priming with primers from the library. In contrast to random cloning techniques, a combined random and directed priming strategy is far more efficient. 2 figs.

  13. Digital servo control of random sound fields

    NASA Technical Reports Server (NTRS)

    Nakich, R. B.

    1973-01-01

    It is necessary to place a number of sensors at different positions in the sound field to determine the actual sound intensities to which the test object is subjected, and thus whether the specification is being adequately met or exceeded. Since the excitation is of a random nature, the signals are essentially coherent, and it is impossible to obtain a true average.

  14. Customizing elastic pressure bandages for reuse to a predetermined, sub-bandage pressure: A randomized controlled trial.

    PubMed

    Sermsathanasawadi, Nuttawut; Tarapongpun, Tanakorn; Pianchareonsin, Rattana; Puangpunngam, Nattawut; Wongwanit, Chumpol; Chinsakchai, Khamin; Mutirangura, Pramook; Ruangsetakit, Chanean

    2017-01-01

    Objective A randomized clinical trial was performed to compare the effectiveness of unmarked bandages and customized bandages with visual markers in reproducing the desired sub-bandage pressure during self-bandaging by patients. Method Ninety patients were randomly allocated to two groups ("customized bandages" and "unmarked bandages") and asked to perform self-bandaging three times. The achievement of a pressure between 35 and 45 mmHg in at least two of the three attempts was defined as adequate quality. Results Adequate quality was achieved by 33.0% when applying the unmarked bandages, and 60.0% when applying the customized bandages (p = 0.02). Use of the customized bandage and previous experience of bandaging were independent predictors for the achievement of the predetermined sub-bandage pressure (p = 0.005 and p = 0.021, respectively). Conclusion Customized bandages may achieve predetermined sub-bandage pressures more closely than standard, unmarked, compression bandages. Clinical trials registration ClinicalTrials.gov (NCT02729688). Effectiveness of a Pressure Indicator Guided and a Conventional Bandaging in Treatment of Venous Leg Ulcer. https://clinicaltrials.gov/ct2/show/NCT02729688.

  15. Temporal trends in receipt of adequate lymphadenectomy in bladder cancer 1988 to 2010.

    PubMed

    Cole, Alexander P; Dalela, Deepansh; Hanske, Julian; Mullane, Stephanie A; Choueiri, Toni K; Meyer, Christian P; Nguyen, Paul L; Menon, Mani; Kibel, Adam S; Preston, Mark A; Bellmunt, Joaquim; Trinh, Quoc-Dien

    2015-12-01

    The importance of pelvic lymphadenectomy (LND) for diagnostic and therapeutic purposes at the time of radical cystectomy (RC) for bladder cancer is well documented. Although some debate remains on the optimal number of lymph nodes removed, 10 nodes has been proposed as constituting an adequate LND. We used data from the Surveillance, Epidemiology, and End Results database to examine predictors and temporal trends in the receipt of an adequate LND at the time of RC for bladder cancer. Within the Surveillance, Epidemiology, and End Results database, we extracted data on all patients with nonmetastatic bladder cancer receiving RC in the years 1988 to 2010. First, we assess the proportion of individuals undergoing RC who received an adequate LND (≥10 nodes removed) over time. Second, we calculate odds ratios (ORs) of receiving an adequate LND using logistic regression modeling to compare study periods. Covariates included sex, race, age, region, tumor stage, urban vs. rural location, and insurance status. Among the 5,696 individuals receiving RC during the years 1988 to 2010, 2,576 (45.2%) received an adequate LND. Over the study period, the proportion of individuals receiving an adequate LND increased from 26.4% to 61.3%. The odds of receiving an adequate LND increased over the study period; a patient undergoing RC in 2008 to 2010 was over 4-fold more likely to receive an adequate LND relative to a patient treated in 1988 to 1991 (OR = 4.63, 95% CI: 3.32-6.45). In addition to time of surgery, tumor stage had a positive association with receipt of adequate LND (OR = 1.49 for stage IV [T4 N1 or N0] vs. stage I [T1 or Tis], 95% CI: 1.22-1.82). Age, sex, marital status, and race were not significant predictors of adequate LND. Adequacy of pelvic LND remains an important measure of surgical quality in bladder cancer. Our data show that over the years 1988 to 2010, the likelihood of receiving an adequate LND has increased substantially; however, a substantial minority of
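
    For illustration, an odds ratio with a Wald confidence interval can be computed from a 2 x 2 table as sketched below; the counts are hypothetical, and the study itself used multivariable logistic regression rather than this unadjusted analogue:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio for a 2x2 table with a Wald confidence interval.
    a/b: outcome yes/no in group 1; c/d: outcome yes/no in group 2."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, (lo, hi)

# Hypothetical counts: adequate LND yes/no in a late vs. early study period.
or_, (lo, hi) = odds_ratio_ci(300, 190, 60, 175)
print(round(or_, 2), (round(lo, 2), round(hi, 2)))
```
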

  16. Radiation Therapy Intensification for Solid Tumors: A Systematic Review of Randomized Trials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamoah, Kosj; Showalter, Timothy N.; Ohri, Nitin, E-mail: ohri.nitin@gmail.com

    Purpose: To systematically review the outcomes of randomized trials testing radiation therapy (RT) intensification, including both dose escalation and/or the use of altered fractionation, as a strategy to improve disease control for a number of malignancies. Methods and Materials: We performed a literature search to identify randomized trials testing RT intensification for cancers of the central nervous system, head and neck, breast, lung, esophagus, rectum, and prostate. Findings were described qualitatively. Where adequate data were available, pooled estimates for the effect of RT intensification on local control (LC) or overall survival (OS) were obtained using the inverse variance method. Results: In primary central nervous system tumors, esophageal cancer, and rectal cancer, randomized trials have not demonstrated that RT intensification improves clinical outcomes. In breast cancer and prostate cancer, dose escalation has been shown to improve LC or biochemical disease control but not OS. Radiation therapy intensification may improve LC and OS in head and neck and lung cancers, but these benefits have generally been limited to studies that did not incorporate concurrent chemotherapy. Conclusions: In randomized trials, the benefits of RT intensification have largely been restricted to trials in which concurrent chemotherapy was not used. Novel strategies to optimize the incorporation of RT in the multimodality treatment of solid tumors should be explored.
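
    The inverse variance pooling mentioned in the Methods can be sketched as follows (a fixed-effect pool of hypothetical per-trial log hazard ratios; the effect values are invented for illustration):

```python
import math

def inverse_variance_pool(effects, variances):
    """Fixed-effect pooled estimate: each study is weighted by 1/variance,
    so more precise trials contribute more to the pooled effect."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical log hazard ratios for local control from three trials.
effects = [-0.25, -0.10, -0.40]
variances = [0.04, 0.02, 0.09]
pooled, ci = inverse_variance_pool(effects, variances)
print(round(pooled, 3), tuple(round(c, 3) for c in ci))
```
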

  17. Region 8: Colorado Adequate Letter (10/29/2001)

    EPA Pesticide Factsheets

    This letter from EPA to the Colorado Department of Public Health and Environment determined Denver's particulate matter (PM10) maintenance plan for Motor Vehicle Emissions Budgets adequate for transportation conformity purposes.

  18. Restricted random search method based on taboo search in the multiple minima problem

    NASA Astrophysics Data System (ADS)

    Hong, Seung Do; Jhon, Mu Shik

    1997-03-01

    The restricted random search method is proposed as a simple Monte Carlo sampling method for quickly locating minima in the multiple minima problem. It is based on taboo search, which has recently been applied to continuous test functions. The concept of a taboo region is used instead of a taboo list, so sampling in a region near an old configuration is restricted. Applied to 2-dimensional test functions and argon clusters, the method proves practical and efficient for finding near-global configurations.
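
    A minimal sketch of a taboo-region restricted random search (the acceptance rule, taboo radius, and test function here are illustrative assumptions, not the paper's exact settings):

```python
import math
import random

def restricted_random_search(f, bounds, radius=0.3, iters=4000, seed=1):
    """Random search that rejects samples falling inside the taboo region
    (a ball of the given radius) around any previously accepted point."""
    rng = random.Random(seed)
    visited = []
    best_val, best_x = float("inf"), None
    for _ in range(iters):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        if any(math.dist(x, v) < radius for v in visited):
            continue                    # taboo: too close to an old configuration
        visited.append(x)
        fx = f(x)
        if fx < best_val:
            best_val, best_x = fx, x
    return best_val, best_x

def himmelblau(p):
    """2-D multimodal test function; all four global minima have f = 0."""
    return (p[0] ** 2 + p[1] - 11) ** 2 + (p[0] + p[1] ** 2 - 7) ** 2

best_val, best_x = restricted_random_search(himmelblau, [(-5, 5), (-5, 5)])
print(best_val, best_x)     # best value found outside all taboo balls
```

    The taboo region forces new samples away from already explored configurations, spreading the search across the domain instead of oversampling one basin.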

  19. Methods for Synthesizing Findings on Moderation Effects Across Multiple Randomized Trials

    PubMed Central

    Brown, C Hendricks; Sloboda, Zili; Faggiano, Fabrizio; Teasdale, Brent; Keller, Ferdinand; Burkhart, Gregor; Vigna-Taglianti, Federica; Howe, George; Masyn, Katherine; Wang, Wei; Muthén, Bengt; Stephens, Peggy; Grey, Scott; Perrino, Tatiana

    2011-01-01

    This paper presents new methods for synthesizing results from subgroup and moderation analyses across different randomized trials. We demonstrate that such a synthesis generally results in additional power to detect significant moderation findings above what one would find in a single trial. Three general methods for conducting synthesis analyses are discussed, with two methods, integrative data analysis, and parallel analyses, sharing a large advantage over traditional methods available in meta-analysis. We present a broad class of analytic models to examine moderation effects across trials that can be used to assess their overall effect and explain sources of heterogeneity, and present ways to disentangle differences across trials due to individual differences, contextual level differences, intervention, and trial design. PMID:21360061

  20. Region 1: Connecticut Adequate Letter (6/14/2017)

    EPA Pesticide Factsheets

    Letter from the Office of Ecosystem Protection to the Connecticut Department of Energy & Environmental Protection determined that the submitted 2017 Motor Vehicle Emissions Budgets are adequate for transportation conformity purposes for the Greater Connecticut area. (March 20, 2017)

  1. Region 8: Utah Adequate Letter (6/10/2005)

    EPA Pesticide Factsheets

    This letter from EPA to the Utah Department of Environmental Quality determined Salt Lake City's and Ogden's Carbon Monoxide (CO) maintenance plan for Motor Vehicle Emissions Budgets adequate for transportation conformity purposes.

  2. Random density matrices versus random evolution of open system

    NASA Astrophysics Data System (ADS)

    Pineda, Carlos; Seligman, Thomas H.

    2015-10-01

    We present and compare two families of ensembles of random density matrices. The first, static ensemble is obtained by foliating an unbiased ensemble of density matrices; as the criterion we use fixed purity, the simplest example of a useful convex function. The second, dynamic ensemble is inspired by random matrix models for decoherence, where one evolves a separable pure state with a random Hamiltonian until a given value of purity in the central system is achieved. Several families of Hamiltonians, adequate for different physical situations, are studied. We focus on a two-qubit central system and obtain exact expressions for the static case. The ensemble displays a peak around Werner-like states, modulated by nodes on the degeneracies of the density matrices. For moderate and strong interactions, good agreement between the static and the dynamic ensembles is found. Even in a model where one qubit does not interact with the environment, excellent agreement is found, but only if there is maximal entanglement with the interacting one. The discussion starts by recalling similar considerations for scattering theory. At the end, we comment on the reach of the results for other convex functions of the density matrix, and exemplify the situation with the von Neumann entropy.
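
    The static-ensemble ingredients can be sketched as follows: sampling an unbiased (Hilbert-Schmidt) random density matrix for a two-qubit system and computing its purity. The foliation at fixed purity is not implemented; this sketch only checks the basic properties:

```python
import random

def random_density_matrix(d, rng):
    """Hilbert-Schmidt random density matrix: rho = G G^dagger / Tr(G G^dagger),
    with G a d x d Ginibre matrix of standard complex Gaussian entries."""
    G = [[complex(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(d)]
         for _ in range(d)]
    M = [[sum(G[i][k] * G[j][k].conjugate() for k in range(d)) for j in range(d)]
         for i in range(d)]
    tr = sum(M[i][i].real for i in range(d))
    return [[M[i][j] / tr for j in range(d)] for i in range(d)]

def purity(rho):
    """Tr(rho^2); ranges from 1/d (maximally mixed) to 1 (pure)."""
    d = len(rho)
    return sum((rho[i][j] * rho[j][i]).real for i in range(d) for j in range(d))

rho = random_density_matrix(4, random.Random(7))    # two-qubit system: d = 4
assert abs(sum(rho[i][i].real for i in range(4)) - 1) < 1e-12   # unit trace
print(purity(rho))      # a value between 1/4 and 1
```

    Conditioning such samples on a fixed purity value is what produces the foliated static ensemble described in the abstract.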

  3. Statistical Evaluation and Improvement of Methods for Combining Random and Harmonic Loads

    NASA Technical Reports Server (NTRS)

    Brown, A. M.; McGhee, D. S.

    2003-01-01

    Structures in many environments experience both random and harmonic excitation. A variety of closed-form techniques has been used in the aerospace industry to combine the loads resulting from the two sources. The resulting combined loads are then used to design for both yield/ultimate strength and high-cycle fatigue capability. This Technical Publication examines the cumulative distribution percentiles obtained using each method by integrating the joint probability density function of the sine and random components. A new Microsoft Excel spreadsheet macro that links with the software program Mathematica to calculate the combined value corresponding to any desired percentile is then presented, along with a curve fit to this value. Another Excel macro that calculates the combination using Monte Carlo simulation is shown. Unlike the traditional techniques, these methods quantify the calculated load value with a consistent percentile. Using either of the presented methods can be extremely valuable in probabilistic design, which requires a statistical characterization of the loading. Additionally, since the cumulative distribution function at high probability levels is very flat, the design value is extremely sensitive to the predetermined percentile; therefore, applying the new techniques can substantially lower the design loading without losing any of the identified structural reliability.
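
    A Monte Carlo combination along the lines described can be sketched as follows (the load levels are hypothetical, and the 99.73% percentile, a 3-sigma equivalent, is an assumed convention for illustration):

```python
import math
import random

def combined_load_percentile(sine_amp, random_rms, percentile=0.9973,
                             n=200000, seed=3):
    """Monte Carlo estimate of a percentile of the combined load magnitude:
    a sine of known amplitude but uniformly random phase, plus a zero-mean
    Gaussian random component, sampled at a random instant."""
    rng = random.Random(seed)
    samples = sorted(
        abs(sine_amp * math.sin(rng.uniform(0, 2 * math.pi))
            + rng.gauss(0, random_rms))
        for _ in range(n)
    )
    return samples[int(percentile * n)]

# Hypothetical loads: 1000 N sine amplitude, 400 N RMS random component.
p3sigma = combined_load_percentile(1000.0, 400.0)
peak_sum = 1000.0 + 3 * 400.0     # traditional "sine peak + 3-sigma" rule
print(round(p3sigma, 1), peak_sum)
```

    The Monte Carlo percentile typically comes in below the simple peak-sum rule, which is the abstract's point: a consistent-percentile combination can lower the design load without sacrificing the stated reliability.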

  4. The Efficiency of Random Forest Method for Shoreline Extraction from LANDSAT-8 and GOKTURK-2 Imageries

    NASA Astrophysics Data System (ADS)

    Bayram, B.; Erdem, F.; Akpinar, B.; Ince, A. K.; Bozkurt, S.; Catal Reis, H.; Seker, D. Z.

    2017-11-01

    Coastal monitoring plays a vital role in environmental planning and hazard management. Since shorelines are fundamental data for environmental management, disaster management, coastal erosion studies, modelling of sediment transport and coastal morphodynamics, various techniques have been developed to extract them. Random Forest, a machine learning method based on decision trees, is used in this study for shoreline extraction. Decision trees analyse classes of training data and create rules for classification. The Terkos region was chosen as the study area within the scope of the TUBITAK project (No. 115Y718) titled "Integration of Unmanned Aerial Vehicles for Sustainable Coastal Zone Monitoring Model - Three-Dimensional Automatic Coastline Extraction and Analysis: Istanbul-Terkos Example". The Random Forest algorithm was implemented to extract the shoreline of the Black Sea near the lake from LANDSAT-8 and GOKTURK-2 satellite imagery taken in 2015. The MATLAB environment was used for classification. To obtain land and water-body classes, the Random Forest method was applied to the NIR bands of the LANDSAT-8 (5th band) and GOKTURK-2 (4th band) imagery. Each image was also digitized manually to obtain reference shorelines for accuracy assessment. According to the accuracy assessment results, the Random Forest method is efficient for shoreline extraction from both medium and high resolution images.
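
    The classification idea can be sketched with a toy bagged ensemble of single-split trees on an NIR threshold (a pure-Python stand-in for a Random Forest; the reflectance values and labels below are invented, not taken from the LANDSAT-8/GOKTURK-2 data):

```python
import random

def best_stump(samples):
    """Single-split tree on NIR reflectance: (threshold, label_above)
    minimizing training error over 'land'/'water' labels."""
    best = None
    for thr in sorted({nir for nir, _ in samples}):
        for above in ("land", "water"):
            below = "water" if above == "land" else "land"
            err = sum((above if nir > thr else below) != label
                      for nir, label in samples)
            if best is None or err < best[0]:
                best = (err, thr, above)
    return best[1], best[2]

def classify(stump, nir):
    thr, above = stump
    return above if nir > thr else ("water" if above == "land" else "land")

def train_forest(samples, n_trees=25, seed=0):
    """Bagging: fit each stump to a bootstrap resample of the training set."""
    rng = random.Random(seed)
    return [best_stump([rng.choice(samples) for _ in samples])
            for _ in range(n_trees)]

def predict(stumps, nir):
    votes = [classify(s, nir) for s in stumps]
    return max(set(votes), key=votes.count)     # majority vote

# Invented NIR reflectances: water absorbs NIR (low), land reflects (high).
training = ([(v, "water") for v in (0.02, 0.03, 0.04, 0.05, 0.06, 0.07)]
            + [(v, "land") for v in (0.25, 0.30, 0.35, 0.40, 0.45, 0.50)])
stumps = train_forest(training)
print(predict(stumps, 0.03), predict(stumps, 0.45))
```

    A real Random Forest grows deeper trees on random feature subsets; the bootstrap-and-vote structure shown here is the part that separates land from water-body pixels by their NIR response.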

  5. A simple method for assessing occupational exposure via the one-way random effects model.

    PubMed

    Krishnamoorthy, K; Mathew, Thomas; Peng, Jie

    2016-11-01

    A one-way random effects model is postulated for the log-transformed shift-long personal exposure measurements, where the random effect in the model represents an effect due to the worker. Simple closed-form confidence intervals are proposed for the relevant parameters of interest using the method of variance estimates recovery (MOVER). The performance of the confidence bounds is evaluated and compared with those based on the generalized confidence interval approach. Comparison studies indicate that the proposed MOVER confidence bounds are better than the generalized confidence bounds for the overall mean exposure and an upper percentile of the exposure distribution. The proposed methods are illustrated using a few examples involving industrial hygiene data.
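
    The one-way random effects model itself can be sketched with method-of-moments (ANOVA) variance component estimates on simulated log-exposures; the MOVER confidence bounds of the paper are not implemented here:

```python
import random
import statistics

def one_way_anova_components(groups):
    """Method-of-moments variance components for a balanced one-way random
    effects model y_ij = mu + b_i + e_ij, where b_i is the worker effect."""
    k = len(groups)                 # workers
    n = len(groups[0])              # measurements per worker (balanced)
    means = [statistics.mean(g) for g in groups]
    grand = statistics.mean(means)
    msb = n * sum((m - grand) ** 2 for m in means) / (k - 1)
    msw = sum((y - m) ** 2
              for g, m in zip(groups, means) for y in g) / (k * (n - 1))
    var_between = max((msb - msw) / n, 0.0)
    return grand, var_between, msw   # overall mean, worker var., within-worker var.

# Simulated shift-long log-exposures: 10 workers, 5 shifts each.
rng = random.Random(42)
true_mu, sd_b, sd_e = 1.0, 0.5, 0.3
groups = [[true_mu + b + rng.gauss(0, sd_e) for _ in range(5)]
          for b in (rng.gauss(0, sd_b) for _ in range(10))]
mu_hat, vb, vw = one_way_anova_components(groups)
print(round(mu_hat, 2), round(vb, 2), round(vw, 2))
```
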

  6. Visible and near infrared spectroscopy coupled to random forest to quantify some soil quality parameters

    NASA Astrophysics Data System (ADS)

    de Santana, Felipe Bachion; de Souza, André Marcelo; Poppi, Ronei Jesus

    2018-02-01

    This study evaluates the use of visible and near infrared spectroscopy (Vis-NIRS) combined with multivariate regression based on random forest to quantify some soil quality parameters. The parameters analyzed were soil cation exchange capacity (CEC), sum of exchange bases (SB), organic matter (OM), clay and sand present in the soils of several regions of Brazil. Current methods for evaluating these parameters are laborious, time-consuming and require various wet analytical methods that are not adequate for use in precision agriculture, where faster and automatic responses are required. The random forest regression models were statistically better than PLS regression models for CEC, OM, clay and sand, demonstrating resistance to overfitting, attenuating the effect of outlier samples and indicating the most important variables for the model. The methodology demonstrates the potential of Vis-NIRS as an alternative for the determination of CEC, SB, OM, sand and clay, making it possible to develop a fast and automatic analytical procedure.

  7. Methodological reporting quality of randomized controlled trials: A survey of seven core journals of orthopaedics from Mainland China over 5 years following the CONSORT statement.

    PubMed

    Zhang, J; Chen, X; Zhu, Q; Cui, J; Cao, L; Su, J

    2016-11-01

    In recent years, the number of randomized controlled trials (RCTs) in the field of orthopaedics has been increasing in Mainland China. However, RCTs are prone to bias if they lack methodological quality. Therefore, we performed a survey of RCTs to assess: (1) the quality of RCTs in the field of orthopaedics in Mainland China, and (2) whether there is a difference between the core journals of the Chinese department of orthopaedics and Orthopaedics Traumatology Surgery & Research (OTSR). This research aimed to evaluate the methodological reporting quality, according to the CONSORT statement, of RCTs in seven key orthopaedic journals published in Mainland China over the 5 years from 2010 to 2014. All of the articles were hand-searched in the Chongqing VIP database between 2010 and 2014. Studies were considered eligible if the words "random", "randomly", "randomization" or "randomized" were employed to describe the allocation method. Trials involving animals or cadavers, trials published as abstracts or case reports, trials dealing with subgroup analyses, and trials without outcomes were excluded. In addition, eight articles selected from Orthopaedics Traumatology Surgery & Research (OTSR) between 2010 and 2014 were included in this study for comparison. The identified RCTs were analyzed using a modified version of the Consolidated Standards of Reporting Trials (CONSORT) checklist, including sample size calculation, allocation sequence generation, allocation concealment, blinding and handling of dropouts. A total of 222 RCTs were identified in the seven core orthopaedic journals. No trials reported adequate sample size calculation, 74 (33.4%) reported adequate allocation generation, 8 (3.7%) trials reported adequate allocation concealment, 18 (8.1%) trials reported adequate blinding and 16 (7.2%) trials reported handling of dropouts. In OTSR, 1 (12.5%) trial reported adequate sample size calculation, 4 (50.0%) reported adequate

  8. Rectal cancer delivery of radiotherapy in adequate time and with adequate dose is influenced by treatment center, treatment schedule, and gender and is prognostic parameter for local control: Results of study CAO/ARO/AIO-94

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fietkau, Rainer; Roedel, Claus; Hohenberger, Werner

    2007-03-15

    Purpose: The impact of the delivery of radiotherapy (RT) on treatment results in rectal cancer patients is unknown. Methods and Materials: The data from 788 patients with rectal cancer treated within the German CAO/AIO/ARO-94 phase III trial were analyzed concerning the impact of the delivery of RT (adequate RT: minimal RT dose delivered, 4300 cGy for neoadjuvant RT or 4700 cGy for adjuvant RT; completion of RT in <44 days for neoadjuvant RT or <49 days for adjuvant RT) in different centers on the locoregional recurrence rate (LRR) and disease-free survival (DFS) at 5 years. The LRR, DFS, and delivery of RT were analyzed as endpoints in multivariate analysis. Results: A significant difference was found between the centers and the delivery of RT. The overall delivery of RT was a prognostic factor for the LRR (no RT, 29.6% ± 7.8%; inadequate RT, 21.2% ± 5.6%; adequate RT, 6.8% ± 1.4%; p = 0.0001) and DFS (no RT, 55.1% ± 9.1%; inadequate RT, 57.4% ± 6.3%; adequate RT, 69.1% ± 2.3%; p = 0.02). Postoperatively, delivery of RT was a prognostic factor for LRR on multivariate analysis (together with pathologic stage) but not for DFS (independent parameters, pathologic stage and age). Preoperatively, on multivariate analysis, pathologic stage, but not delivery of RT, was an independent prognostic parameter for LRR and DFS (together with adequate chemotherapy). On multivariate analysis, the treatment center, treatment schedule (neoadjuvant vs. adjuvant RT), and gender were prognostic parameters for adequate RT. Conclusion: Delivery of RT should be regarded as a prognostic factor for LRR in rectal cancer and is influenced by the treatment center, treatment schedule, and patient gender.

  9. Key management of the double random-phase-encoding method using public-key encryption

    NASA Astrophysics Data System (ADS)

    Saini, Nirmala; Sinha, Aloka

    2010-03-01

    Public-key encryption has been used to encode the key of the encryption process. In the proposed technique, an input image has been encrypted by using the double random-phase-encoding method using the extended fractional Fourier transform. The key of the encryption process has been encoded by using the Rivest-Shamir-Adleman (RSA) public-key encryption algorithm. The encoded key has then been transmitted to the receiver side along with the encrypted image. In the decryption process, first the encoded key has been decrypted using the secret key, and then the encrypted image has been decrypted by using the retrieved key parameters. The proposed technique has an advantage over the double random-phase-encoding method because the problem associated with the transmission of the key has been eliminated by using public-key encryption. Computer simulation has been carried out to validate the proposed technique.
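
    The key-management idea can be sketched with textbook RSA on toy primes (the integer key parameters are hypothetical stand-ins; a real system would use properly padded RSA with large keys, and the double random-phase encoding itself is not shown):

```python
# Toy RSA (textbook, tiny primes) illustrating the scheme: the encryption-key
# parameters travel RSA-encrypted alongside the encrypted image.
p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)
e = 17                               # public exponent, coprime with phi
d = pow(e, -1, phi)                  # private exponent (modular inverse)

def rsa_encrypt(m):
    return pow(m, e, n)              # sender uses the public key (e, n)

def rsa_decrypt(c):
    return pow(c, d, n)              # receiver uses the secret key d

# Hypothetical key parameters (e.g. fractional orders scaled to integers).
key_params = [123, 456, 789]
encoded = [rsa_encrypt(k) for k in key_params]    # sent with the image
recovered = [rsa_decrypt(c) for c in encoded]     # recovered at the receiver
print(recovered == key_params)   # True
```

    The asymmetry is what removes the key-transmission problem the abstract mentions: only the public exponent needs to be shared in advance.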

  10. Region 9: Nevada Adequate Letter (3/30/2006)

    EPA Pesticide Factsheets

    This is a letter from Deborah Jordan, Director, to Leo M. Drozdoff stating that Nevada's motor vehicle emissions budgets in the 2005 Truckee Meadows CO Redesignation Request and Maintenance Plan are adequate for transportation conformity decisions.

  11. Region 6: Texas Adequate Letter (4/16/2010)

    EPA Pesticide Factsheets

    This letter from EPA to the Texas Commission on Environmental Quality determined the 2021 motor vehicle emission budgets for nitrogen oxides (NOx) and volatile organic compounds (VOCs) for the Beaumont/Port Arthur area adequate for transportation conformity purposes.

  12. Region 6: Texas Adequate Letter (6/21/17)

    EPA Pesticide Factsheets

    This letter from EPA approves the Motor Vehicle Emissions Budgets contained in the latest revisions to the Houston/Galveston/Brazoria (HGB) 2008 8-hour Ozone State Implementation Plan as adequate for transportation conformity purposes, as announced in the Federal Register.

  13. A modified hybrid uncertain analysis method for dynamic response field of the LSOAAC with random and interval parameters

    NASA Astrophysics Data System (ADS)

    Zi, Bin; Zhou, Bin

    2016-07-01

    For the prediction of the dynamic response field of the luffing system of an automobile crane (LSOAAC) with random and interval parameters, a hybrid uncertain model is introduced. In the hybrid uncertain model, the parameters with a known probability distribution are modeled as random variables, whereas the parameters with lower and upper bounds are modeled as interval variables instead of given precise values. Based on the hybrid uncertain model, the hybrid uncertain dynamic response equilibrium equation, in which different random and interval parameters are simultaneously included in input and output terms, is constructed. Then a modified hybrid uncertain analysis method (MHUAM) is proposed. In the MHUAM, the dynamic response expression of the LSOAAC is developed based on the random interval perturbation method, the first-order Taylor series expansion and the first-order Neumann series. Moreover, the extrema of the bounds of the dynamic response are determined by the random interval moment method and a monotonic analysis technique. Compared with the hybrid Monte Carlo method (HMCM) and the interval perturbation method (IPM), numerical results show the feasibility and efficiency of the MHUAM for solving hybrid LSOAAC problems. The effects of different uncertain models and parameters on the LSOAAC response field are also investigated in depth. Numerical results indicate that the impact made by the randomness in the thrust of the luffing cylinder F is larger than that made by the gravity of the weight in suspension Q. In addition, the impact made by the uncertainty in the displacement between the lower end of the lifting arm and the luffing cylinder, a, is larger than that made by the length of the lifting arm, L.
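
    The hybrid random-plus-interval idea can be sketched on a toy monotone response (the stand-in function and the endpoint scan over the interval parameter are illustrative assumptions, not the paper's perturbation formulation):

```python
import random
import statistics

def response(F, a, L=10.0):
    """Toy monotone stand-in for a crane response (not the paper's model)."""
    return F * a / L

def hybrid_bounds(F_mean, F_sd, a_interval, n=20000, seed=5):
    """Monte Carlo over the random parameter F; because the toy response is
    monotonic in the interval parameter a, the bounds of the mean response
    are attained at the interval endpoints (the monotonic-analysis step)."""
    rng = random.Random(seed)
    Fs = [rng.gauss(F_mean, F_sd) for _ in range(n)]
    lo = statistics.mean(response(F, a_interval[0]) for F in Fs)
    hi = statistics.mean(response(F, a_interval[1]) for F in Fs)
    return lo, hi

# Hypothetical values: random thrust F ~ N(50, 5^2), interval displacement a.
lo, hi = hybrid_bounds(F_mean=50.0, F_sd=5.0, a_interval=(1.8, 2.2))
print(round(lo, 2), round(hi, 2))   # interval bounds on the mean response
```
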

  14. Local spatiotemporal time-frequency peak filtering method for seismic random noise reduction

    NASA Astrophysics Data System (ADS)

    Liu, Yanping; Dang, Bo; Li, Yue; Lin, Hongbo

    2014-12-01

    To achieve a higher level of seismic random noise suppression, the Radon transform was adopted to implement spatiotemporal time-frequency peak filtering (TFPF) in our previous studies, which performed TFPF in the full-aperture Radon domain, both linear and parabolic. Although the superiority of this method over conventional TFPF has been demonstrated on synthetic seismic models and field seismic data, it still has limitations. Both full-aperture linear and parabolic Radon transforms are applicable and effective in relatively simple situations (e.g., curved reflection events with regular geometry) but not in complicated ones, such as reflection events with irregular shapes or interlaced events with quite different slope or curvature parameters. A localized application of the Radon transform is therefore needed, one that adapts the transform to the local character of the data variations. In this article, we propose adopting a local Radon transform, referred to as piecewise full-aperture Radon, to realize spatiotemporal TFPF, called local spatiotemporal TFPF. Experiments on synthetic seismic models and field seismic data demonstrate the advantage of our method in seismic random noise reduction and reflection event recovery for relatively complicated seismic data.

  15. Randomized clinical trials in dentistry: Risks of bias, risks of random errors, reporting quality, and methodologic quality over the years 1955–2013

    PubMed Central

    Armijo-Olivo, Susan; Cummings, Greta G.; Amin, Maryam; Flores-Mir, Carlos

    2017-01-01

    Objectives To examine the risks of bias, risks of random errors, reporting quality, and methodological quality of randomized clinical trials of oral health interventions and the development of these aspects over time. Methods We included 540 randomized clinical trials from 64 selected systematic reviews. We extracted, in duplicate, details from each of the selected randomized clinical trials with respect to publication and trial characteristics, reporting and methodologic characteristics, and Cochrane risk of bias domains. We analyzed data using logistic regression and Chi-square statistics. Results Sequence generation was assessed to be inadequate (at unclear or high risk of bias) in 68% (n = 367) of the trials, while allocation concealment was inadequate in the majority of trials (n = 464; 85.9%). Blinding of participants and blinding of the outcome assessment were judged to be inadequate in 28.5% (n = 154) and 40.5% (n = 219) of the trials, respectively. A sample size calculation before the initiation of the study was not performed/reported in 79.1% (n = 427) of the trials, while the sample size was assessed as adequate in only 17.6% (n = 95) of the trials. Two thirds of the trials were not described as double blinded (n = 358; 66.3%), while the method of blinding was appropriate in 53% (n = 286) of the trials. We identified a significant decrease over time (1955–2013) in the proportion of trials assessed as having inadequately addressed methodological quality items (P < 0.05) in 30 out of the 40 quality criteria, or as being inadequate (at high or unclear risk of bias) in five domains of the Cochrane risk of bias tool: sequence generation, allocation concealment, incomplete outcome data, other sources of bias, and overall risk of bias. Conclusions The risks of bias, risks of random errors, reporting quality, and methodological quality of randomized clinical trials of oral health interventions have improved over time; however, further efforts that contribute

  16. Object-based change detection method using refined Markov random field

    NASA Astrophysics Data System (ADS)

    Peng, Daifeng; Zhang, Yongjun

    2017-01-01

In order to fully consider the local spatial constraints between neighboring objects in object-based change detection (OBCD), an OBCD approach is presented by introducing a refined Markov random field (MRF). First, two periods of images are stacked and segmented to produce image objects. Second, object spectral and textural histogram features are extracted and the G-statistic is implemented to measure the distance among different histogram distributions. Meanwhile, object heterogeneity is calculated by combining the spectral and textural histogram distances using adaptive weights. Third, an expectation-maximization algorithm is applied to determine the change category of each object, and the initial change map is then generated. Finally, a refined change map is produced by employing the proposed refined object-based MRF method. Three experiments were conducted and compared with some state-of-the-art unsupervised OBCD methods to evaluate the effectiveness of the proposed method. Experimental results demonstrate that the proposed method obtains the highest accuracy among the methods used in this paper, which confirms its validity and effectiveness in OBCD.
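The G-statistic distance between object histograms mentioned above is a log-likelihood-ratio measure that can be sketched as follows (a generic illustration; the helper name and the pooled-expectation formulation are assumptions, not the authors' code):

```python
import numpy as np

def g_statistic(h1, h2, eps=1e-12):
    """G-statistic (log-likelihood ratio) distance between two count histograms.

    Expected counts come from pooling both histograms; eps guards log-of-zero.
    """
    h1 = np.asarray(h1, dtype=float)
    h2 = np.asarray(h2, dtype=float)
    n1, n2 = h1.sum(), h2.sum()
    # Expected counts under the hypothesis that both samples share one distribution
    pooled = (h1 + h2) / (n1 + n2)
    e1, e2 = n1 * pooled, n2 * pooled
    g = 0.0
    for obs, exp in ((h1, e1), (h2, e2)):
        mask = obs > 0
        g += 2.0 * np.sum(obs[mask] * np.log(obs[mask] / (exp[mask] + eps)))
    return g

print(g_statistic([10, 20, 30], [10, 20, 30]))  # ≈ 0.0 (identical histograms)
```

Larger G values indicate more dissimilar histogram distributions, which is what drives the object heterogeneity measure.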

  17. Region 8: Colorado Adequate Letter (1/20/2004)

    EPA Pesticide Factsheets

This letter from EPA to the Colorado Department of Public Health and Environment determined that the Motor Vehicle Emissions Budgets in Greeley's Carbon Monoxide (CO) maintenance plan are adequate for transportation conformity purposes; the finding will be announced in the FR.

  18. Region 4: Tennessee Adequate Letter (9/30/2010)

    EPA Pesticide Factsheets

This letter acknowledges that the EPA has reviewed Tennessee's Knoxville Area redesignation request and maintenance plan, as well as the motor vehicle emissions budgets (MVEBs), and has determined that these MVEBs are adequate for transportation conformity purposes.

  19. Region 9: California Adequate Letter (7/14/2017)

    EPA Pesticide Factsheets

EPA approves the California Air Resources Board's Motor Vehicle Emissions Budgets in the San Joaquin Valley Unified Air Pollution Control District's 2016 Plan for the 2008 8-Hour Ozone Standard as adequate for transportation conformity purposes, as announced in the Federal Register.

  20. Region 9: Arizona Adequate Letter (10/14/2003)

    EPA Pesticide Factsheets

This is a letter from Jack P. Broadbent, Director, to Nancy Wrona and Dennis Smith informing them that Maricopa County's motor vehicle emissions budgets in the 2003 MAGCO Maintenance Plan are adequate for transportation conformity purposes.

  1. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology

    EPA Science Inventory

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological datasets there is limited guidance on variable selection methods for RF modeling. Typically, e...

  2. Auto Regressive Moving Average (ARMA) Modeling Method for Gyro Random Noise Using a Robust Kalman Filter

    PubMed Central

    Huang, Lei

    2015-01-01

To solve the problem that conventional ARMA modeling methods for gyro random noise require a large number of samples and converge slowly, an ARMA modeling method using robust Kalman filtering is developed. The ARMA model parameters are employed as state arguments. Unknown time-varying estimators of observation noise are used to achieve the estimated mean and variance of the observation noise. Using robust Kalman filtering, the ARMA model parameters are estimated accurately. The developed ARMA modeling method has the advantages of rapid convergence and high accuracy. Thus, the required sample size is reduced. It can be applied to modeling applications for gyro random noise in which a fast and accurate ARMA modeling method is required. PMID:26437409
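The core idea of treating the model coefficients as Kalman state variables can be sketched in a few lines (a minimal AR(2)-only illustration with a plain Kalman filter, not the authors' robust variant; the coefficients and noise levels are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(2) sequence as a stand-in for gyro random noise
y = np.zeros(2000)
for t in range(2, len(y)):
    y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + 0.1 * rng.standard_normal()

# Kalman filter with the AR coefficients as the (constant) state vector
theta = np.zeros(2)   # estimate of [a1, a2]
P = np.eye(2)         # state covariance
R = 0.01              # observation-noise variance (assumed known here)
for t in range(2, len(y)):
    H = np.array([y[t - 1], y[t - 2]])   # regressor row
    S = H @ P @ H + R                    # innovation variance
    K = P @ H / S                        # Kalman gain
    theta = theta + K * (y[t] - H @ theta)
    P = P - np.outer(K, H @ P)

print(theta)  # close to the true coefficients [0.6, -0.3]
```

Because each new sample refines the estimate recursively, the coefficients stabilize well before a batch least-squares fit would have enough data, which is the sample-size advantage the abstract claims.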

  3. Cluster randomized trials in comparative effectiveness research: randomizing hospitals to test methods for prevention of healthcare-associated infections.

    PubMed

    Platt, Richard; Takvorian, Samuel U; Septimus, Edward; Hickok, Jason; Moody, Julia; Perlin, Jonathan; Jernigan, John A; Kleinman, Ken; Huang, Susan S

    2010-06-01

The need for evidence about the effectiveness of therapeutics and other medical practices has triggered new interest in methods for comparative effectiveness research. We describe an approach to comparative effectiveness research involving cluster randomized trials in networks of hospitals, health plans, or medical practices with centralized administrative and informatics capabilities. We discuss the example of an ongoing cluster randomized trial to prevent methicillin-resistant Staphylococcus aureus (MRSA) infection in intensive care units (ICUs). The trial randomizes 45 hospitals to: (a) screening cultures of ICU admissions, followed by Contact Precautions if MRSA-positive, (b) screening cultures of ICU admissions followed by decolonization if MRSA-positive, or (c) universal decolonization of ICU admissions without screening. All admissions to adult ICUs are included. The primary outcome is MRSA-positive clinical cultures occurring ≥2 days following ICU admission. Secondary outcomes include blood and urine infection caused by MRSA (and, separately, all pathogens), as well as the development of resistance to decolonizing agents. Recruitment of hospitals is complete. Data collection will end in Summer 2011. This trial takes advantage of existing personnel, procedures, infrastructure, and information systems in a large integrated hospital network to conduct a low-cost evaluation of prevention strategies under usual practice conditions. This approach is applicable to many comparative effectiveness topics in both inpatient and ambulatory settings.

  4. A new Lagrangian random choice method for steady two-dimensional supersonic/hypersonic flow

    NASA Technical Reports Server (NTRS)

    Loh, C. Y.; Hui, W. H.

    1991-01-01

    Glimm's (1965) random choice method has been successfully applied to compute steady two-dimensional supersonic/hypersonic flow using a new Lagrangian formulation. The method is easy to program, fast to execute, yet it is very accurate and robust. It requires no grid generation, resolves slipline and shock discontinuities crisply, can handle boundary conditions most easily, and is applicable to hypersonic as well as supersonic flow. It represents an accurate and fast alternative to the existing Eulerian methods. Many computed examples are given.

  5. Design and analysis of group-randomized trials in cancer: A review of current practices.

    PubMed

    Murray, David M; Pals, Sherri L; George, Stephanie M; Kuzmichev, Andrey; Lai, Gabriel Y; Lee, Jocelyn A; Myles, Ranell L; Nelson, Shakira M

    2018-06-01

    The purpose of this paper is to summarize current practices for the design and analysis of group-randomized trials involving cancer-related risk factors or outcomes and to offer recommendations to improve future trials. We searched for group-randomized trials involving cancer-related risk factors or outcomes that were published or online in peer-reviewed journals in 2011-15. During 2016-17, in Bethesda MD, we reviewed 123 articles from 76 journals to characterize their design and their methods for sample size estimation and data analysis. Only 66 (53.7%) of the articles reported appropriate methods for sample size estimation. Only 63 (51.2%) reported exclusively appropriate methods for analysis. These findings suggest that many investigators do not adequately attend to the methodological challenges inherent in group-randomized trials. These practices can lead to underpowered studies, to an inflated type 1 error rate, and to inferences that mislead readers. Investigators should work with biostatisticians or other methodologists familiar with these issues. Funders and editors should ensure careful methodological review of applications and manuscripts. Reviewers should ensure that studies are properly planned and analyzed. These steps are needed to improve the rigor and reproducibility of group-randomized trials. The Office of Disease Prevention (ODP) at the National Institutes of Health (NIH) has taken several steps to address these issues. ODP offers an online course on the design and analysis of group-randomized trials. ODP is working to increase the number of methodologists who serve on grant review panels. ODP has developed standard language for the Application Guide and the Review Criteria to draw investigators' attention to these issues. Finally, ODP has created a new Research Methods Resources website to help investigators, reviewers, and NIH staff better understand these issues. Published by Elsevier Inc.

  6. Long-acting reversible contraceptive acceptability and unintended pregnancy among women presenting for short-acting methods: a randomized patient preference trial.

    PubMed

    Hubacher, David; Spector, Hannah; Monteith, Charles; Chen, Pai-Lien; Hart, Catherine

    2017-02-01

    Measures of contraceptive effectiveness combine technology and user-related factors. Observational studies show higher effectiveness of long-acting reversible contraception compared with short-acting reversible contraception. Women who choose long-acting reversible contraception may differ in key ways from women who choose short-acting reversible contraception, and it may be these differences that are responsible for the high effectiveness of long-acting reversible contraception. Wider use of long-acting reversible contraception is recommended, but scientific evidence of acceptability and successful use is lacking in a population that typically opts for short-acting methods. The objective of the study was to reduce bias in measuring contraceptive effectiveness and better isolate the independent role that long-acting reversible contraception has in preventing unintended pregnancy relative to short-acting reversible contraception. We conducted a partially randomized patient preference trial and recruited women aged 18-29 years who were seeking a short-acting method (pills or injectable). Participants who agreed to randomization were assigned to 1 of 2 categories: long-acting reversible contraception or short-acting reversible contraception. Women who declined randomization but agreed to follow-up in the observational cohort chose their preferred method. Under randomization, participants chose a specific method in the category and received it for free, whereas participants in the preference cohort paid for the contraception in their usual fashion. Participants were followed up prospectively to measure primary outcomes of method continuation and unintended pregnancy at 12 months. Kaplan-Meier techniques were used to estimate method continuation probabilities. Intent-to-treat principles were applied after method initiation for comparing incidence of unintended pregnancy. We also measured acceptability in terms of level of happiness with the products. Of the 916

  7. Continued midazolam versus diphenhydramine in difficult-to-sedate patients: a randomized double-blind trial.

    PubMed

    Sachar, Hamita; Pichetshote, Nipaporn; Nandigam, Kavitha; Vaidya, Keta; Laine, Loren

    2018-05-01

    Current guidelines recommend diphenhydramine in patients undergoing endoscopy who are not adequately sedated with a benzodiazepine and opioid combination. Because this practice has not been adequately assessed, we performed a randomized, double-blind trial comparing diphenhydramine with continued midazolam in such patients. Patients undergoing elective colonoscopy with moderate sedation were eligible. Sedation was measured with the Modified Observer's Assessment of Alertness/Sedation (MOAA/S) score with adequate sedation defined as 3 on a 0- to 5-point scale. Patients not adequately sedated with midazolam 5 mg and fentanyl 100 μg were randomly assigned to diphenhydramine 25 mg versus continued midazolam 1.5 mg. Adequacy of sedation was assessed 3 minutes after each study medication dose. If MOAA/S was 4 to 5, study medication was repeated, to a maximum of 3 doses. The primary endpoint was adequate sedation. The planned enrollment of 200 patients (100 in each study group) was attained. Adequate sedation was achieved less often with diphenhydramine than midazolam (27% vs 65%, difference = -38%; 95% CI, -50% to -24%; P < .0001). After study medications were completed, more patients required additional medication for sedation or analgesia with diphenhydramine versus midazolam (84% vs 68%, P = .008), whereas the time to discharge from the recovery unit was similar (134 vs 129 minutes). Treatment effect was consistent across subgroups including age ≤55, substance abuse, benzodiazepine use, opioid use, and psychiatric medication use. Endoscopists performing moderate sedation should continue midazolam rather than switching to diphenhydramine in patients who do not achieve adequate sedation with usual doses of midazolam and an opioid. (Clinical trial registration number: NCT01769586.). Published by Elsevier Inc.

  8. Locally adaptive methods for KDE-based random walk models of reactive transport in porous media

    NASA Astrophysics Data System (ADS)

    Sole-Mari, G.; Fernandez-Garcia, D.

    2017-12-01

    Random Walk Particle Tracking (RWPT) coupled with Kernel Density Estimation (KDE) has been recently proposed to simulate reactive transport in porous media. KDE provides an optimal estimation of the area of influence of particles which is a key element to simulate nonlinear chemical reactions. However, several important drawbacks can be identified: (1) the optimal KDE method is computationally intensive and thereby cannot be used at each time step of the simulation; (2) it does not take advantage of the prior information about the physical system and the previous history of the solute plume; (3) even if the kernel is optimal, the relative error in RWPT simulations typically increases over time as the particle density diminishes by dilution. To overcome these problems, we propose an adaptive branching random walk methodology that incorporates the physics, the particle history and maintains accuracy with time. The method allows particles to efficiently split and merge when necessary as well as to optimally adapt their local kernel shape without having to recalculate the kernel size. We illustrate the advantage of the method by simulating complex reactive transport problems in randomly heterogeneous porous media.
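The KDE ingredient of such particle-tracking schemes can be illustrated in one dimension (a sketch using Silverman's rule-of-thumb bandwidth rather than the optimal kernel estimation or the branching/merging scheme the authors propose; the helper name and values are hypothetical):

```python
import numpy as np

def kde_concentration(x_particles, x_grid, mass_per_particle=1.0):
    """Estimate a 1-D concentration field from particle positions with a
    Gaussian kernel and Silverman's rule-of-thumb bandwidth."""
    n = len(x_particles)
    h = 1.06 * np.std(x_particles) * n ** (-1 / 5)   # Silverman bandwidth
    # Sum one Gaussian kernel per particle at each grid point
    u = (x_grid[:, None] - x_particles[None, :]) / h
    dens = np.exp(-0.5 * u**2).sum(axis=1) / (n * h * np.sqrt(2 * np.pi))
    return mass_per_particle * n * dens  # total mass times estimated density

rng = np.random.default_rng(1)
particles = rng.normal(0.0, 1.0, size=5000)   # a plume of 5000 particles
grid = np.linspace(-4, 4, 81)
c = kde_concentration(particles, grid)
```

The bandwidth `h` plays the role of the particle's "area of influence": too small and the reconstructed concentration is noisy, too large and sharp fronts (and hence nonlinear reaction rates) are smeared, which is why adapting the kernel as the plume dilutes matters.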

  9. Region 5: Wisconsin Adequate Letter (4/16/2015)

    EPA Pesticide Factsheets

This March 13, 2015 letter from EPA approves Wisconsin's Kenosha and Sheboygan counties' Early Progress Plan for year 2015 Motor Vehicle Emissions Budgets (MVEBs) for VOC and NOx, finding them adequate for transportation conformity purposes, and will be announced.

  10. Statistical Comparison and Improvement of Methods for Combining Random and Harmonic Loads

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; McGhee, David S.

    2004-01-01

Structures in many environments experience both random and harmonic excitation. A variety of closed-form techniques has been used in the aerospace industry to combine the loads resulting from the two sources. The resulting combined loads are then used to design for both yield/ultimate strength and high cycle fatigue capability. This paper examines the cumulative distribution function (CDF) percentiles obtained using each method by integrating the joint probability density function of the sine and random components. A new Microsoft Excel spreadsheet macro that links with the software program Mathematica is then used to calculate the combined value corresponding to any desired percentile along with a curve fit to this value. Another Excel macro is used to calculate the combination using a Monte Carlo simulation. Unlike the traditional techniques, these methods quantify the calculated load value with a consistent percentile. Using either of the presented methods can be extremely valuable in probabilistic design, which requires a statistical characterization of the loading. Also, since the CDF at high probability levels is very flat, the design value is extremely sensitive to the predetermined percentile; therefore, applying the new techniques can lower the design loading substantially without losing any of the identified structural reliability.
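The Monte Carlo combination idea can be sketched directly (illustrative amplitudes and percentile, not the paper's data): a sine term with uniformly random phase is added to a zero-mean Gaussian term, and the design load is read off as a CDF percentile of the combined sample.

```python
import numpy as np

rng = np.random.default_rng(42)

sigma_random = 1.0    # RMS of the Gaussian random component (assumed)
amp_harmonic = 2.0    # amplitude of the sine component (assumed)
n = 1_000_000

# Sine with uniformly random phase plus zero-mean Gaussian random load
phase = rng.uniform(0.0, 2.0 * np.pi, n)
total = amp_harmonic * np.sin(phase) + rng.normal(0.0, sigma_random, n)

# Design value at a chosen CDF percentile (99.865% = the one-sided "3-sigma" level)
design_load = np.quantile(total, 0.99865)
print(design_load)  # noticeably below the naive peak-plus-3-sigma sum of 5.0
```

Because the sine is rarely at its peak exactly when the random load is extreme, the percentile-consistent value is lower than simply adding the sine peak to 3 sigma of the random load, which is the potential design-load reduction the abstract describes.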

  11. Region 8: Colorado Adequate Letter (8/17/2011)

    EPA Pesticide Factsheets

This March 4, 2011 letter from EPA to Christopher E. Urbina M.D., MPH, Colorado Department of Public Health and Environment states that EPA has found the Greeley, CO second 10-year Limited Maintenance Plan (LMP) adequate for transportation conformity

  12. Region 8: Colorado Adequate Letter (6/11/2012)

    EPA Pesticide Factsheets

This August 9, 2011 letter from EPA to Christopher E. Urbina M.D., MPH, Colorado Department of Public Health and Environment states that EPA has found the Fort Collins, CO second 10-year Limited Maintenance Plan (LMP) adequate for transportation conformity purposes.

  13. Using Multitheory Model of Health Behavior Change to Predict Adequate Sleep Behavior.

    PubMed

    Knowlden, Adam P; Sharma, Manoj; Nahar, Vinayak K

    The purpose of this article was to use the multitheory model of health behavior change in predicting adequate sleep behavior in college students. A valid and reliable survey was administered in a cross-sectional design (n = 151). For initiation of adequate sleep behavior, the construct of behavioral confidence (P < .001) was found to be significant and accounted for 24.4% of the variance. For sustenance of adequate sleep behavior, changes in social environment (P < .02), emotional transformation (P < .001), and practice for change (P < .001) were significant and accounted for 34.2% of the variance.

  14. Assessment of health risks resulting from early-life exposures: Are current chemical toxicity testing protocols and risk assessment methods adequate?

    PubMed

    Felter, Susan P; Daston, George P; Euling, Susan Y; Piersma, Aldert H; Tassinari, Melissa S

    2015-03-01

Over the last couple of decades, the awareness of the potential health impacts associated with early-life exposures has increased. Global regulatory approaches to chemical risk assessment are intended to be protective for the diverse human population including all life stages. However, questions persist as to whether the current testing approaches and risk assessment methodologies are adequately protective for infants and children. Here, we review physiological and developmental differences that may result in differential sensitivity associated with early-life exposures. It is clear that sensitivity to chemical exposures during early-life can be similar, higher, or lower than that of adults, and can change quickly within a short developmental timeframe. Moreover, age-related exposure differences provide an important consideration for overall susceptibility. Differential sensitivity associated with a life stage can reflect the toxicokinetic handling of a xenobiotic exposure, the toxicodynamic response, or both. Each of these is illustrated with chemical-specific examples. The adequacy of current testing protocols, proposed new tools, and risk assessment methods for systemic noncancer endpoints are reviewed in light of the potential for differential risk to infants and young children.

  15. Adequate nutrient intake can reduce cardiovascular disease risk in African Americans.

    PubMed

    Reusser, Molly E; DiRienzo, Douglas B; Miller, Gregory D; McCarron, David A

    2003-03-01

    Cardiovascular disease kills nearly as many Americans each year as the next seven leading causes of death combined. The prevalence of cardiovascular disease and most of its associated risk factors is markedly higher and increasing more rapidly among African Americans than in any other racial or ethnic group. Improving these statistics may be simply a matter of improving diet quality. In recent years, a substantial and growing body of evidence has revealed that dietary patterns complete in all food groups, including nutrient-rich dairy products, are essential for preventing and reducing cardiovascular disease and the conditions that contribute to it. Several cardiovascular risk factors, including hypertension, insulin resistance syndrome, and obesity, have been shown to be positively influenced by dietary patterns that include adequate intake of dairy products. The benefits of nutrient-rich dietary patterns have been specifically tested in randomized, controlled trials emphasizing African American populations. These studies demonstrated proportionally greater benefits for African Americans without evidence of adverse effects such as symptoms of lactose intolerance. As currently promoted for the prevention of certain cancers and osteoporosis, regular consumption of diets that meet recommended nutrient intake levels might also be the most effective approach for reducing cardiovascular disease risk in African Americans.

  16. [METHODS OF EVALUATION OF MUSCLE MASS: A SYSTEMATIC REVIEW OF RANDOMIZED CONTROLLED TRIALS].

    PubMed

    Moreira, Osvaldo Costa; de Oliveira, Cláudia Eliza Patrocínio; Candia-Luján, Ramón; Romero-Pérez, Ena Monserrat; de Paz Fernandez, José Antonio

    2015-09-01

In recent years, research on muscle mass has gained popularity because of its relationship to health; precise measurement of muscle mass may therefore have clinical application, since it can influence diagnosis and drug or treatment prescription. Our objective was to conduct a systematic review of the methods most used for evaluation of muscle mass in randomized controlled trials, with their advantages and disadvantages. We conducted a search of the databases PubMed, Web of Science and Scopus with the terms "muscle mass", "measurement", "assessment" and "evaluation", combined in this way: "muscle mass" AND (assessment OR measurement OR evaluation). 23 studies were retrieved and analyzed, all in English. 69.56% used only one method for quantification of muscle mass; 69.57% used dual X-ray absorptiometry (DXA); in 45.46% the type of measure used was body lean mass; and 51.61% chose the whole body as the site of measurement. In the randomized controlled trials analyzed, the majority used just one method of assessment, with DXA being the method most used, body lean mass the measurement type most used, and total body the most common measurement site. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.

  17. High prevalence of iodine deficiency in pregnant women living in adequate iodine area.

    PubMed

    Mioto, Verônica Carneiro Borges; Monteiro, Ana Carolina de Castro Nassif Gomes; de Camargo, Rosalinda Yossie Asato; Borel, Andréia Rodrigues; Catarino, Regina Maria; Kobayashi, Sergio; Chammas, Maria Cristina; Marui, Suemi

    2018-05-01

Iodine deficiency during pregnancy is associated with obstetric and neonatal adverse outcomes. Serum thyroglobulin (sTg) and thyroid volume (TV) are optional tools to urinary iodine concentration (UIC) for defining iodine status. This cross-sectional study aims to evaluate the iodine status of pregnant women living in iodine-adequate area by spot UIC and correlation with sTg, TV and thyroid function. Two hundred and seventy-three pregnant women were evaluated at three trimesters. All had no previous thyroid disease, no iodine supplementation and negative thyroperoxidase and thyroglobulin antibodies. Thyroid function and sTg were measured using electrochemiluminescence immunoassays. TV was determined by ultrasonography; UIC was determined using a modified Sandell-Kolthoff method. Median UIC was 146 µg/L, being 52% iodine deficient and only 4% excessive. TSH values were 1.50 ± 0.92, 1.50 ± 0.92 and 1.91 ± 0.96 mIU/L, respectively, in each trimester (P = 0.001). sTg did not change significantly during trimesters with median 11.2 ng/mL and only 3.3% had above 40 ng/mL. Mean TV was 9.3 ± 3.4 mL, which positively correlated with body mass index, but not with sTg. Only 4.5% presented with goitre. When pregnant women were categorized as iodine deficient (UIC < 150 µg/L), adequate (≥150 and <250 µg/L) and excessive (≥250 µg/L), sTg, thyroid hormones and TV at each trimester showed no statistical differences. Iodine deficiency was detected frequently in pregnant women living in iodine-adequate area. sTg concentration and TV did not correlate to UIC. Our observation also demonstrated that the Brazilian salt-iodization programme prevents deficiency, but does not maintain iodine status within adequate and recommended ranges for pregnant women. © 2018 The authors.

  18. Region 8: Colorado Adequate Letter (6/11/2012)

    EPA Pesticide Factsheets

This August 11, 2011 letter from EPA to Christopher E. Urbina M.D., MPH, Colorado Department of Public Health and Environment states that EPA has found the Aspen PM10 maintenance plan and the 2023 motor vehicle emissions budget (MVEB) adequate

  19. Region 9: Arizona Adequate Letter (11/1/2001)

    EPA Pesticide Factsheets

    This is a letter from Jack P. Broadbent, Director, Air Division to Nancy Wrona and James Bourney informing them of the adequacy of Revised MAG 1999 Serious Area Carbon Monoxide Plan and that the MAG CO Plan is adequate for Maricopa County.

  20. Region 9: California Adequate Letter (1/22/2018)

    EPA Pesticide Factsheets

This December 19, 2017 letter from EPA finds adequate certain motor vehicle emissions budgets for the 2006 fine particulate matter (PM2.5) National Ambient Air Quality Standards in the Final 2016 Air Quality Management Plan for the South Coast area (2016

  1. Ray tracing method for simulation of laser beam interaction with random packings of powders

    NASA Astrophysics Data System (ADS)

    Kovalev, O. B.; Kovaleva, I. O.; Belyaev, V. V.

    2018-03-01

    Selective laser sintering is a technology of rapid manufacturing of a free form that is created as a solid object by selectively fusing successive layers of powder using a laser. The motivation of this study is due to the currently insufficient understanding of the processes and phenomena of selective laser melting of powders whose time scales differ by orders of magnitude. To construct random packings from mono- and polydispersed solid spheres, the algorithm of their generation based on the discrete element method is used. A numerical method of ray tracing is proposed that is used to simulate the interaction of laser radiation with a random bulk packing of spherical particles and to predict the optical properties of the granular layer, the extinction and absorption coefficients, depending on the optical properties of a powder material.
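The geometric core of such a ray tracer is the ray-sphere intersection test against the packed particles. A minimal sketch (hypothetical helper, not the authors' implementation) keeps the nearest positive-distance hit:

```python
import numpy as np

def first_hit(origin, direction, centers, radii):
    """Return (index, distance) of the first sphere a ray hits, or (None, inf).

    Solves |o + t*d - c|^2 = r^2 for each sphere and keeps the smallest t > 0.
    """
    d = direction / np.linalg.norm(direction)
    oc = origin - centers                         # (n, 3) vectors to each center
    b = oc @ d                                    # (n,) half-coefficients
    disc = b**2 - (oc * oc).sum(axis=1) + radii**2
    best, t_best = None, np.inf
    for i in np.where(disc >= 0)[0]:              # spheres the ray's line crosses
        for t in (-b[i] - np.sqrt(disc[i]), -b[i] + np.sqrt(disc[i])):
            if 1e-9 < t < t_best:                 # nearest hit in front of origin
                best, t_best = i, t
    return best, t_best

centers = np.array([[0.0, 0.0, 5.0], [0.0, 0.0, 9.0]])
radii = np.array([1.0, 1.0])
idx, t = first_hit(np.zeros(3), np.array([0.0, 0.0, 1.0]), centers, radii)
print(idx, t)  # 0 4.0
```

In a full simulation each hit would spawn reflected/refracted rays with Fresnel-weighted powers; tallying absorbed power per sphere over many rays yields the layer's extinction and absorption coefficients.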

  2. Multilevel and quasi-Monte Carlo methods for uncertainty quantification in particle travel times through random heterogeneous porous media

    NASA Astrophysics Data System (ADS)

    Crevillén-García, D.; Power, H.

    2017-08-01

In this study, we apply four Monte Carlo simulation methods, namely, Monte Carlo, quasi-Monte Carlo, multilevel Monte Carlo and multilevel quasi-Monte Carlo to the problem of uncertainty quantification in the estimation of the average travel time during the transport of particles through random heterogeneous porous media. We apply the four methodologies to a model problem where the only input parameter, the hydraulic conductivity, is modelled as a log-Gaussian random field by using direct Karhunen-Loève decompositions. The random terms in such expansions represent the coefficients in the equations. Numerical calculations demonstrating the effectiveness of each of the methods are presented. A comparison of the computational cost incurred by each of the methods for three different tolerances is provided. The accuracy of the approaches is quantified via the mean square error.
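The contrast between plain Monte Carlo and quasi-Monte Carlo sampling can be sketched on a toy one-dimensional integrand (everything here is illustrative; a hand-rolled van der Corput sequence stands in for the low-discrepancy point sets used in practice):

```python
import numpy as np

def van_der_corput(n, base=2):
    """First n points of the base-b van der Corput low-discrepancy sequence."""
    seq = np.zeros(n)
    for i in range(n):
        k, f, x = i + 1, 1.0, 0.0
        while k > 0:
            f /= base
            x += f * (k % base)   # next digit of k, reflected about the radix point
            k //= base
        seq[i] = x
    return seq

f = lambda u: np.exp(u)          # toy integrand; exact mean over [0,1] is e - 1
n = 4096
mc = f(np.random.default_rng(0).uniform(size=n)).mean()
qmc = f(van_der_corput(n)).mean()
exact = np.e - 1
print(abs(mc - exact), abs(qmc - exact))  # the QMC error is typically far smaller
```

The multilevel variants add a second trick on top of this: most samples are spent on cheap coarse-grid solves, with only a few fine-grid corrections, which is where the cost savings for the travel-time problem come from.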

  3. Multilevel and quasi-Monte Carlo methods for uncertainty quantification in particle travel times through random heterogeneous porous media.

    PubMed

    Crevillén-García, D; Power, H

    2017-08-01

In this study, we apply four Monte Carlo simulation methods, namely, Monte Carlo, quasi-Monte Carlo, multilevel Monte Carlo and multilevel quasi-Monte Carlo to the problem of uncertainty quantification in the estimation of the average travel time during the transport of particles through random heterogeneous porous media. We apply the four methodologies to a model problem where the only input parameter, the hydraulic conductivity, is modelled as a log-Gaussian random field by using direct Karhunen-Loève decompositions. The random terms in such expansions represent the coefficients in the equations. Numerical calculations demonstrating the effectiveness of each of the methods are presented. A comparison of the computational cost incurred by each of the methods for three different tolerances is provided. The accuracy of the approaches is quantified via the mean square error.

  4. K-Means Algorithm Performance Analysis With Determining The Value Of Starting Centroid With Random And KD-Tree Method

    NASA Astrophysics Data System (ADS)

    Sirait, Kamson; Tulus; Budhiarti Nababan, Erna

    2017-12-01

Clustering methods that have high accuracy and time efficiency are necessary for the filtering process. One method that has been widely known and applied in clustering is K-Means clustering. In its application, the determination of the initial cluster centers greatly affects the results of the K-Means algorithm. This research discusses the results of K-Means clustering with the starting centroids determined by a random method and by a KD-Tree method. Random initial centroid determination on a data set of 1000 student academic records, used to classify students as potential dropouts, gives an SSE value of 952972 for the quality variable and 232.48 for the GPA variable, whereas initial centroid determination by KD-Tree gives an SSE value of 504302 for the quality variable and 214.37 for the GPA variable. The smaller SSE values indicate that K-Means clustering with initial KD-Tree centroid selection has better accuracy than K-Means clustering with random initial centroid selection.
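The SSE comparison underlying this kind of study can be sketched with plain Lloyd's-algorithm K-Means and random initialisation (synthetic data and all names are assumptions; the KD-Tree seeding step is omitted here):

```python
import numpy as np

def kmeans_sse(X, k, init_centroids, iters=50):
    """Plain Lloyd's algorithm; returns final centroids and the SSE."""
    C = init_centroids.copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # Assign each point to its nearest centroid
        d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        # Recompute centroids (keep the old centroid if a cluster empties)
        for j in range(k):
            if np.any(labels == j):
                C[j] = X[labels == j].mean(axis=0)
    sse = ((X - C[labels]) ** 2).sum()
    return C, sse

rng = np.random.default_rng(3)
# Three well-separated 2-D clusters of 100 points each
X = np.vstack([rng.normal(m, 0.3, size=(100, 2)) for m in (0.0, 3.0, 6.0)])
# Random initialisation: pick k data points at random as starting centroids
_, sse_rand = kmeans_sse(X, 3, X[rng.choice(len(X), 3, replace=False)])
print(sse_rand)
```

Running the same routine with deterministically spread seeds (the role the KD-Tree plays in the paper) and comparing the resulting SSE values reproduces the paper's evaluation protocol: lower final SSE indicates a better local optimum.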

  5. 13 CFR 107.200 - Adequate capital for Licensees.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... operate actively in accordance with your Articles and within the context of your business plan, as... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Adequate capital for Licensees. 107.200 Section 107.200 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION SMALL BUSINESS...

  6. Interactive Book Reading to Accelerate Word Learning by Kindergarten Children With Specific Language Impairment: Identifying an Adequate Intensity and Variation in Treatment Response.

    PubMed

    Storkel, Holly L; Voelmle, Krista; Fierro, Veronica; Flake, Kelsey; Fleming, Kandace K; Romine, Rebecca Swinburne

    2017-01-01

    This study sought to identify an adequate intensity of interactive book reading for new word learning by children with specific language impairment (SLI) and to examine variability in treatment response. An escalation design adapted from nontoxic drug trials (Hunsberger, Rubinstein, Dancey, & Korn, 2005) was used in this Phase I/II preliminary clinical trial. A total of 27 kindergarten children with SLI were randomized to 1 of 4 intensities of interactive book reading: 12, 24, 36, or 48 exposures. Word learning was monitored through a definition task and a naming task. An intensity response curve was examined to identify the adequate intensity. Correlations and classification accuracy were used to examine variation in response to treatment relative to pretreatment and early treatment measures. Response to treatment improved as intensity increased from 12 to 24 to 36 exposures, and then no further improvements were observed as intensity increased to 48 exposures. There was variability in treatment response: Children with poor phonological awareness, low vocabulary, and/or poor nonword repetition were less likely to respond to treatment. The adequate intensity for this version of interactive book reading was 36 exposures, but further development of the treatment is needed to increase the benefit for children with SLI.

  7. 42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... ORGANIZATIONS, COMPETITIVE MEDICAL PLANS, AND HEALTH CARE PREPAYMENT PLANS Medicare Payment: Cost Basis § 417... health care industry. (b) Provision of data. (1) The HMO or CMP must provide adequate cost and... 42 Public Health 3 2012-10-01 2012-10-01 false Adequate financial records, statistical data, and...

  8. 42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., COMPETITIVE MEDICAL PLANS, AND HEALTH CARE PREPAYMENT PLANS Medicare Payment: Cost Basis § 417.568 Adequate... definitions and accounting, statistics, and reporting practices that are widely accepted in the health care... 42 Public Health 3 2010-10-01 2010-10-01 false Adequate financial records, statistical data, and...

  9. 42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., COMPETITIVE MEDICAL PLANS, AND HEALTH CARE PREPAYMENT PLANS Medicare Payment: Cost Basis § 417.568 Adequate... definitions and accounting, statistics, and reporting practices that are widely accepted in the health care... 42 Public Health 3 2011-10-01 2011-10-01 false Adequate financial records, statistical data, and...

  10. Statistical parameters of random heterogeneity estimated by analysing coda waves based on finite difference method

    NASA Astrophysics Data System (ADS)

    Emoto, K.; Saito, T.; Shiomi, K.

    2017-12-01

    Short-period (<1 s) seismograms are strongly affected by small-scale (<10 km) heterogeneities in the lithosphere. In general, short-period seismograms are analysed based on the statistical method by considering the interaction between seismic waves and randomly distributed small-scale heterogeneities. Statistical properties of the random heterogeneities have been estimated by analysing short-period seismograms. However, generally, the small-scale random heterogeneity is not taken into account for the modelling of long-period (>2 s) seismograms. We found that the energy of the coda of long-period seismograms shows a spatially flat distribution. This phenomenon is well known in short-period seismograms and results from the scattering by small-scale heterogeneities. We estimate the statistical parameters that characterize the small-scale random heterogeneity by modelling the spatiotemporal energy distribution of long-period seismograms. We analyse three moderate-size earthquakes that occurred in southwest Japan. We calculate the spatial distribution of the energy density recorded by a dense seismograph network in Japan at the period bands of 8-16 s, 4-8 s and 2-4 s and model them by using 3-D finite difference (FD) simulations. Compared to conventional methods based on statistical theories, we can calculate more realistic synthetics by using the FD simulation. It is not necessary to assume a uniform background velocity, body or surface waves and scattering properties considered in general scattering theories. By taking the ratio of the energy of the coda area to that of the entire area, we can separately estimate the scattering and the intrinsic absorption effects. Our result reveals the spectrum of the random inhomogeneity in a wide wavenumber range including the intensity around the corner wavenumber as P(m) = 8πε²a³/(1 + a²m²)², where ε = 0.05 and a = 3.1 km, even though past studies analysing higher-frequency records could not detect the corner. Finally, we
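
    The reported spectrum can be evaluated directly. A minimal sketch (the functional form and the values ε = 0.05 and a = 3.1 km are taken from the abstract; the function name and unit conventions are illustrative):

```python
import math

def power_spectrum(m, eps=0.05, a=3.1):
    """Power spectral density of the random heterogeneity,
    P(m) = 8*pi*eps^2*a^3 / (1 + a^2*m^2)^2, with rms fractional
    fluctuation eps and correlation length a (km); m in km^-1."""
    return 8.0 * math.pi * eps**2 * a**3 / (1.0 + (a * m) ** 2) ** 2

corner = 1.0 / 3.1  # corner wavenumber ~ 1/a (km^-1)
# Flat below the corner, rolling off roughly as m^-4 above it.
print(power_spectrum(0.0), power_spectrum(corner), power_spectrum(10 * corner))
```

    Below the corner wavenumber 1/a the spectrum is essentially flat; well above it, it decays roughly as m⁻⁴, which is why analyses sampling only the high-wavenumber side cannot constrain the corner.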

  11. Introducing two Random Forest based methods for cloud detection in remote sensing images

    NASA Astrophysics Data System (ADS)

    Ghasemian, Nafiseh; Akhoondzadeh, Mehdi

    2018-07-01

    Cloud detection is a necessary phase in satellite images processing to retrieve the atmospheric and lithospheric parameters. Currently, some cloud detection methods based on the Random Forest (RF) model have been proposed, but they do not consider both spectral and textural characteristics of the image. Furthermore, they have not been tested in the presence of snow/ice. In this paper, we introduce two RF based algorithms, Feature Level Fusion Random Forest (FLFRF) and Decision Level Fusion Random Forest (DLFRF) to incorporate visible, infrared (IR) and thermal spectral and textural features (FLFRF) including Gray Level Co-occurrence Matrix (GLCM) and Robust Extended Local Binary Pattern (RELBP_CI) or visible, IR and thermal classifiers (DLFRF) for highly accurate cloud detection on remote sensing images. FLFRF first fuses visible, IR and thermal features. Thereafter, it uses the RF model to classify pixels to cloud, snow/ice and background or thick cloud, thin cloud and background. DLFRF considers visible, IR and thermal features (both spectral and textural) separately and feeds each set of features to the RF model. Then, it retains the vote matrix of each run of the model. Finally, it fuses the classifiers using the majority vote method. To demonstrate the effectiveness of the proposed algorithms, 10 Terra MODIS and 15 Landsat 8 OLI/TIRS images with different spatial resolutions are used in this paper. Quantitative analyses are based on manually selected ground truth data. Results show that after adding RELBP_CI to the input feature set, cloud detection accuracy improves. Also, the average cloud kappa values of FLFRF and DLFRF on MODIS images (1 and 0.99) are higher than other machine learning methods, Linear Discriminate Analysis (LDA), Classification And Regression Tree (CART), K Nearest Neighbor (KNN) and Support Vector Machine (SVM) (0.96). The average snow/ice kappa values of FLFRF and DLFRF on MODIS images (1 and 0.85) are higher than other traditional methods. The
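
    The decision-level fusion step of DLFRF described above amounts to a per-pixel majority vote over the visible, IR and thermal classifiers. A minimal sketch, with toy labels standing in for real classifier output (all names are illustrative):

```python
from collections import Counter

def majority_vote(classifier_labels):
    """Decision-level fusion: fuse per-pixel labels from several
    classifiers by majority vote (ties go to the first label seen)."""
    n_pixels = len(classifier_labels[0])
    fused = []
    for i in range(n_pixels):
        votes = [labels[i] for labels in classifier_labels]
        fused.append(Counter(votes).most_common(1)[0][0])
    return fused

# Toy per-pixel labels from visible, IR and thermal classifiers.
visible = ["cloud", "snow", "cloud", "background"]
ir = ["cloud", "cloud", "snow", "background"]
thermal = ["snow", "snow", "cloud", "cloud"]
print(majority_vote([visible, ir, thermal]))
```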

  12. Multilevel and quasi-Monte Carlo methods for uncertainty quantification in particle travel times through random heterogeneous porous media

    PubMed Central

    Power, H.

    2017-01-01

    In this study, we apply four Monte Carlo simulation methods, namely, Monte Carlo, quasi-Monte Carlo, multilevel Monte Carlo and multilevel quasi-Monte Carlo to the problem of uncertainty quantification in the estimation of the average travel time during the transport of particles through random heterogeneous porous media. We apply the four methodologies to a model problem where the only input parameter, the hydraulic conductivity, is modelled as a log-Gaussian random field by using direct Karhunen–Loève decompositions. The random terms in such expansions represent the coefficients in the equations. Numerical calculations demonstrating the effectiveness of each of the methods are presented. A comparison of the computational cost incurred by each of the methods for three different tolerances is provided. The accuracy of the approaches is quantified via the mean square error. PMID:28878974
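
    For context, the baseline plain Monte Carlo estimator against which the quasi- and multilevel variants are compared can be sketched as follows. The scalar surrogate `travel_time` stands in for the PDE solve, and the log-normal parameters are illustrative, not taken from the paper:

```python
import math
import random
import statistics

def travel_time(conductivity):
    """Toy surrogate for the PDE solve: travel time ~ 1 / conductivity."""
    return 1.0 / conductivity

def plain_mc(n, seed=0):
    """Plain Monte Carlo estimate of the mean travel time when ln K is
    Gaussian (illustrative mu = 0, sigma = 0.5), plus its standard error."""
    rng = random.Random(seed)
    samples = [travel_time(math.exp(rng.gauss(0.0, 0.5))) for _ in range(n)]
    mean = statistics.fmean(samples)
    sem = statistics.stdev(samples) / math.sqrt(n)
    return mean, sem

mc_mean, mc_sem = plain_mc(20000)
print(mc_mean, mc_sem)
```

    For this toy model the exact mean is e^(σ²/2) ≈ 1.133, so the estimator can be checked against its reported standard error; multilevel and quasi-Monte Carlo variants aim to reduce the cost of driving the mean square error below a given tolerance.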

  13. Influence of adequate pelvic floor muscle contraction on the movement of the coccyx during pelvic floor muscle training.

    PubMed

    Fujisaki, Akiko; Shigeta, Miwa; Shimoinaba, Misa; Yoshimura, Yasukuni

    2018-04-01

    [Purpose] Pelvic floor muscle training is a first-line therapy for female stress urinary incontinence. Previous studies have suggested that the coccyx tip moves ventrally and cranially during pelvic floor muscle contraction. The study aimed to elucidate the influence of adequate pelvic floor muscle contraction on coccyx movement. [Subjects and Methods] Sixty-three females (57 patients with stress urinary incontinence and additional 6 healthy volunteers) were enrolled. Using magnetic resonance imaging, coccyx movement was evaluated during pelvic floor muscle contraction and strain. An adequate contraction was defined as a contraction with good Oxford grading scale [≥3] and without inadequate muscle substitution patterns. [Results] Inadequate muscle substitution patterns were observed in 33 participants (52.4%). No significant difference was observed in the movement of the coccyx tip in the ventrodorsal direction between females with and without inadequate muscle substitution patterns. However, a significant increase in the movement of the coccyx tip in the cranial direction was detected in the group without inadequate muscle substitution patterns. Compared to participants with inadequate pelvic floor muscle contraction, those who had adequate pelvic floor muscle contraction exhibited significantly increased cranial movement of the coccyx. [Conclusion] Adequate pelvic floor muscle contraction can produce cranial movement of the coccyx tip.

  14. Child malnutrition and mortality among families not utilizing adequately iodized salt in Indonesia.

    PubMed

    Semba, Richard D; de Pee, Saskia; Hess, Sonja Y; Sun, Kai; Sari, Mayang; Bloem, Martin W

    2008-02-01

    Salt iodization is the main strategy for reducing iodine deficiency disorders worldwide. Characteristics of families not using iodized salt need to be known to expand coverage. The objective was to determine whether families who do not use iodized salt have a higher prevalence of child malnutrition and mortality and to identify factors associated with not using iodized salt. Use of adequately iodized salt (≥30 ppm), measured by rapid test kits, was assessed between January 1999 and September 2003 in 145 522 and 445 546 families in urban slums and rural areas, respectively, in Indonesia. Adequately iodized salt was used by 66.6% and 67.2% of families from urban slums and rural areas, respectively. Among families who used adequately iodized salt, mortality in neonates, infants, and children aged <5 y was 3.3% compared with 4.2%, 5.5% compared with 7.1%, and 6.9% compared with 9.1%, respectively (P < 0.0001 for all), in urban slums; among families who did not use adequately iodized salt, the respective values were 4.2% compared with 6.3%, 7.1% compared with 11.2%, and 8.5% compared with 13.3% (P < 0.0001 for all) in rural areas. Families not using adequately iodized salt were more likely to have children who were stunted, underweight, and wasted. In multivariate analyses that controlled for potential confounders, low maternal education was the strongest factor associated with not using adequately iodized salt. In Indonesia, nonuse of adequately iodized salt is associated with a higher prevalence of child malnutrition and mortality in neonates, infants, and children aged <5 y. Stronger efforts are needed to expand salt iodization in Indonesia.

  15. Comparison of Time-to-First Event and Recurrent Event Methods in Randomized Clinical Trials.

    PubMed

    Claggett, Brian; Pocock, Stuart; Wei, L J; Pfeffer, Marc A; McMurray, John J V; Solomon, Scott D

    2018-03-27

    Background: Most Phase-3 trials feature time-to-first event endpoints for their primary and/or secondary analyses. In chronic diseases where a clinical event can occur more than once, recurrent-event methods have been proposed to more fully capture disease burden and have been assumed to improve statistical precision and power compared to conventional "time-to-first" methods. Methods: To better characterize factors that influence statistical properties of recurrent-events and time-to-first methods in the evaluation of randomized therapy, we repeatedly simulated trials with 1:1 randomization of 4000 patients to active vs control therapy, with true patient-level risk reduction of 20% (i.e. RR=0.80). For patients who discontinued active therapy after a first event, we assumed their risk reverted subsequently to their original placebo-level risk. Through simulation, we varied a) the degree of between-patient heterogeneity of risk and b) the extent of treatment discontinuation. Findings were compared with those from actual randomized clinical trials. Results: As the degree of between-patient heterogeneity of risk was increased, both time-to-first and recurrent-events methods lost statistical power to detect a true risk reduction and confidence intervals widened. The recurrent-events analyses continued to estimate the true RR=0.80 as heterogeneity increased, while the Cox model produced estimates that were attenuated. The power of recurrent-events methods declined as the rate of study drug discontinuation post-event increased. Recurrent-events methods provided greater power than time-to-first methods in scenarios where drug discontinuation was ≤30% following a first event, lesser power with drug discontinuation rates of ≥60%, and comparable power otherwise. We confirmed in several actual trials in chronic heart failure that treatment effect estimates were attenuated when estimated via the Cox model and that increased statistical power from recurrent-events methods
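
    The kind of simulation described above can be sketched with a gamma-frailty Poisson model: each patient's event rate is multiplied by a random frailty (between-patient heterogeneity), and the arms are compared both by a recurrent-events summary (total event-rate ratio) and by a time-to-first-style summary (ratio of proportions with at least one event). The base rate and frailty variance below are illustrative, not the paper's parameters, and treatment discontinuation is omitted:

```python
import math
import random

def poisson(lam, rng):
    """Knuth's method for a Poisson draw (adequate for small rates)."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def trial(n_per_arm, frailty_var, rng, base_rate=0.8, true_rr=0.8):
    """One simulated trial: per-patient event counts are Poisson with a
    gamma frailty multiplier (mean 1, variance frailty_var)."""
    counts = {}
    for arm, rr in (("control", 1.0), ("active", true_rr)):
        arm_counts = []
        for _ in range(n_per_arm):
            z = rng.gammavariate(1.0 / frailty_var, frailty_var)
            arm_counts.append(poisson(base_rate * rr * z, rng))
        counts[arm] = arm_counts
    # Recurrent-events summary: ratio of total event rates.
    rate_ratio = sum(counts["active"]) / sum(counts["control"])
    # Time-to-first-style summary: ratio of proportions with >= 1 event.
    first_ratio = (sum(c > 0 for c in counts["active"]) /
                   sum(c > 0 for c in counts["control"]))
    return rate_ratio, first_ratio

rate_ratio, first_ratio = trial(20000, 0.5, random.Random(7))
print(round(rate_ratio, 3), round(first_ratio, 3))
```

    In this sketch the rate ratio recovers the true per-patient effect, while the any-event comparison is attenuated toward 1, in line with the attenuation of time-to-first estimates reported above.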

  16. Region 6: New Mexico Adequate Letter (8/21/2003)

    EPA Pesticide Factsheets

    This is a letter from Carl Edlund, Director, to Alfredo Santistevan regarding MVEB's contained in the latest revision to the Albuquerque Carbon Monoxide State Implementation Plan (SIP) are adequate for transportation conformity purposes.

  17. Region 10: Oregon Oakridge Adequate Letter (6/21/2017)

    EPA Pesticide Factsheets

    EPA approves motor vehicle emissions budget in the Oakridge-Westfir PM2.5 Attainment State Implementation Plan for the 2006 PM2.5 national ambient air quality standard, adequate for transportation conformity purposes.

  18. Reduced basis ANOVA methods for partial differential equations with high-dimensional random inputs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, Qifeng, E-mail: liaoqf@shanghaitech.edu.cn; Lin, Guang, E-mail: guanglin@purdue.edu

    2016-07-15

    In this paper we present a reduced basis ANOVA approach for partial differential equations (PDEs) with random inputs. The ANOVA method combined with stochastic collocation methods provides model reduction in high-dimensional parameter space through decomposing high-dimensional inputs into unions of low-dimensional inputs. In this work, to further reduce the computational cost, we investigate spatial low-rank structures in the ANOVA-collocation method, and develop efficient spatial model reduction techniques using hierarchically generated reduced bases. We present a general mathematical framework of the methodology, validate its accuracy and demonstrate its efficiency with numerical experiments.

  19. Understanding Your Adequate Yearly Progress (AYP), 2011-2012

    ERIC Educational Resources Information Center

    Missouri Department of Elementary and Secondary Education, 2011

    2011-01-01

    The "No Child Left Behind Act (NCLB) of 2001" requires all schools, districts/local education agencies (LEAs) and states to show that students are making Adequate Yearly Progress (AYP). NCLB requires states to establish targets in the following ways: (1) Annual Proficiency Target; (2) Attendance/Graduation Rates; and (3) Participation…

  20. Selection of adequate site location during early stages of construction project management: A multi-criteria decision analysis approach

    NASA Astrophysics Data System (ADS)

    Marović, Ivan; Hanak, Tomaš

    2017-10-01

    In the management of construction projects, special attention should be given to planning as the most important phase of the decision-making process. Quality decision-making based on adequate and comprehensive collaboration of all involved stakeholders is crucial in a project’s early stages. Fundamental reasons for this problem arise from the specific conditions of the construction industry (final products are inseparable from their location, i.e. the location strongly influences building design and its structural characteristics, as well as the technology used during construction), investors’ desires and attitudes, and socioeconomic and environmental aspects. Considering all these reasons, one can conclude that the selection of an adequate construction site location for a future investment is a complex, poorly structured, multi-criteria problem. To take all these dimensions into account, a model for the selection of an adequate site location is devised. The model is based on AHP (for designing the decision-making hierarchy) and PROMETHEE (for pairwise comparison of investment locations) methods. By combining the core features of both methods, operational synergies can be achieved in multi-criteria decision analysis. This gives the decision-maker a sense of assurance that, if the procedure proposed by the model has been followed, it will lead to a rational, carefully and systematically considered decision.
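
    The AHP step of such a model derives criteria weights from a pairwise comparison matrix. A minimal sketch using the normalized-column-mean approximation to the principal eigenvector (the criteria and the judgments on Saaty's 1-9 scale are illustrative):

```python
def ahp_priorities(matrix):
    """Approximate AHP priority vector: normalize each column of the
    pairwise comparison matrix, then average across each row."""
    n = len(matrix)
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    normalized = [[matrix[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    return [sum(row) / n for row in normalized]

# Toy pairwise comparisons of three site-selection criteria:
# cost vs accessibility vs environmental impact.
comparisons = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]
weights = ahp_priorities(comparisons)
print(weights)
```

    The resulting weights sum to one and would feed into the PROMETHEE pairwise comparison of candidate locations.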

  1. A simple method for semi-random DNA amplicon fragmentation using the methylation-dependent restriction enzyme MspJI.

    PubMed

    Shinozuka, Hiroshi; Cogan, Noel O I; Shinozuka, Maiko; Marshall, Alexis; Kay, Pippa; Lin, Yi-Han; Spangenberg, German C; Forster, John W

    2015-04-11

    Fragmentation at random nucleotide locations is an essential process for preparation of DNA libraries to be used on massively parallel short-read DNA sequencing platforms. Although instruments for physical shearing, such as the Covaris S2 focused-ultrasonicator system, and products for enzymatic shearing, such as the Nextera technology and NEBNext dsDNA Fragmentase kit, are commercially available, a simple and inexpensive method is desirable for high-throughput sequencing library preparation. MspJI is a recently characterised restriction enzyme which recognises the sequence motif CNNR (where R = G or A) when the first base is modified to 5-methylcytosine or 5-hydroxymethylcytosine. A semi-random enzymatic DNA amplicon fragmentation method was developed based on the unique cleavage properties of MspJI. In this method, random incorporation of 5-methyl-2'-deoxycytidine-5'-triphosphate is achieved through DNA amplification with DNA polymerase, followed by DNA digestion with MspJI. Due to the recognition sequence of the enzyme, DNA amplicons are fragmented in a relatively sequence-independent manner. The size range of the resulting fragments was capable of control through optimisation of 5-methyl-2'-deoxycytidine-5'-triphosphate concentration in the reaction mixture. A library suitable for sequencing using the Illumina MiSeq platform was prepared and processed using the proposed method. Alignment of generated short reads to a reference sequence demonstrated a relatively high level of random fragmentation. The proposed method may be performed with standard laboratory equipment. Although the uniformity of coverage was slightly inferior to the Covaris physical shearing procedure, due to efficiencies of cost and labour, the method may be more suitable than existing approaches for implementation in large-scale sequencing activities, such as bacterial artificial chromosome (BAC)-based genome sequence assembly, pan-genomic studies and locus-targeted genotyping-by-sequencing.
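
    The fragmentation principle can be illustrated with a toy simulation: cytosines are methylated at random during "amplification", and cleavage is allowed wherever a methylated C begins a CNNR motif. (Real MspJI cuts at a fixed offset downstream of the recognition site; that detail is simplified away here, and all names are illustrative.)

```python
import random

def simulate_fragmentation(seq, p_5mc, rng):
    """Toy model of MspJI-based semi-random fragmentation: each C is
    5-methylated with probability p_5mc (mimicking random 5-methyl-dCTP
    incorporation), and a cut is placed at the start of every CNNR motif
    (R = A or G) whose leading C is methylated."""
    methylated = [base == "C" and rng.random() < p_5mc for base in seq]
    cut_sites = [
        i for i in range(len(seq) - 3)
        if methylated[i] and seq[i + 3] in "AG"
    ]
    fragments, prev = [], 0
    for site in cut_sites:
        fragments.append(seq[prev:site])
        prev = site
    fragments.append(seq[prev:])
    return [f for f in fragments if f]

rng = random.Random(1)
amplicon = "".join(rng.choice("ACGT") for _ in range(2000))
fragments = simulate_fragmentation(amplicon, 0.3, rng)
print(len(fragments))
```

    Raising `p_5mc` increases the cut density and shortens the fragments, mirroring the paper's control of fragment size via the 5-methyl-dCTP concentration.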

  2. Methods of learning in statistical education: Design and analysis of a randomized trial

    NASA Astrophysics Data System (ADS)

    Boyd, Felicity Turner

    Background. Recent psychological and technological advances suggest that active learning may enhance understanding and retention of statistical principles. A randomized trial was designed to evaluate the addition of innovative instructional methods within didactic biostatistics courses for public health professionals. Aims. The primary objectives were to evaluate and compare the addition of two active learning methods (cooperative and internet) on students' performance; assess their impact on performance after adjusting for differences in students' learning style; and examine the influence of learning style on trial participation. Methods. Consenting students enrolled in a graduate introductory biostatistics course were randomized to cooperative learning, internet learning, or control after completing a pretest survey. The cooperative learning group participated in eight small group active learning sessions on key statistical concepts, while the internet learning group accessed interactive mini-applications on the same concepts. Controls received no intervention. Students completed evaluations after each session and a post-test survey. Study outcome was performance quantified by examination scores. Intervention effects were analyzed by generalized linear models using intent-to-treat analysis and marginal structural models accounting for reported participation. Results. Of 376 enrolled students, 265 (70%) consented to randomization; 69, 100, and 96 students were randomized to the cooperative, internet, and control groups, respectively. Intent-to-treat analysis showed no differences between study groups; however, 51% of students in the intervention groups had dropped out after the second session. After accounting for reported participation, expected examination scores were 2.6 points higher (of 100 points) after completing one cooperative learning session (95% CI: 0.3, 4.9) and 2.4 points higher after one internet learning session (95% CI: 0.0, 4.7), versus

  3. Predictors for achieving adequate protein and energy intake in nursing home rehabilitation patients.

    PubMed

    van Zwienen-Pot, J I; Visser, M; Kruizenga, H M

    2018-07-01

    Adequate energy and protein intake could contribute significantly to the rehabilitation process. Data on the actual nutritional intake of older nursing home rehabilitation patients have not yet been investigated. To investigate the nutritional intake and predictors for achieving protein and energy requirements on the 14th day of admission in nursing home rehabilitation patients. Fifty-nine patients aged 65+ years newly admitted to nursing home rehabilitation wards were included. Data on potential variables were collected on admission. On the fourteenth day nutritional intake was assessed. Intake was considered 'adequate' if patients had achieved ≥ 1.2 g of protein/kg bodyweight and ≥ 85% of their energy needs according to Harris and Benedict + 30%. Multiple logistic regression analyses were performed to select predictors for adequate intake. Protein and energy intake was assessed in 79 patients [67% female, mean age 82 ± (SD) 8 years, BMI 25 ± 6 kg/m²]. Mean energy intake was 1677 kcal (± 433) and mean protein intake was 68 g (± 20). Fourteen patients (18%) achieved an adequate protein and energy intake. Predictors for adequate intake were use of sip/tube feeding (OR = 7.7; 95% CI = 1.35-44.21), BMI (0.68; 0.53-0.87) and nausea (8.59; 1.42-52.01). Only 18% of older nursing home rehabilitation patients had an adequate protein and energy intake at 14 days after admission. Patients with higher BMI were less likely, while those using sip/tube feeding or feeling nauseous were more likely to achieve an adequate protein and energy intake.

  4. Behavioral interventions to promote adequate sleep among women: protocol for a systematic review and meta-analysis.

    PubMed

    Vézina-Im, Lydi-Anne; Moreno, Jennette Palcic; Nicklas, Theresa A; Baranowski, Tom

    2017-05-11

    Short and poor sleep have been associated with adverse health outcomes in adults, such as overweight/obesity and type 2 diabetes, especially among women. Women therefore represent an important target for interventions aimed at improving sleep and such interventions have been advocated to enhance maternal, fetal, and infant health. This systematic review will assess the efficacy or effectiveness of behavioral interventions aimed at promoting adequate sleep among women. The primary outcomes will be changes in sleep duration and/or sleep quality from baseline to post-intervention and to the last available follow-up measured either through self-reports or objectively. Secondary outcomes will be assessing the behavior change techniques that are responsible for the changes in sleep duration and quality among women. Behavioral interventions that are non-pharmacological and target either sleep directly or sleep hygiene behaviors will be included. Randomized controlled trials, quasi-experimental, and one-group pre-post studies will be included, but treated separately in the analyses, given that a limited number of studies on the topic of sleep is expected. MEDLINE/PubMed, PsycINFO, CINAHL, EMBASE, and Proquest Dissertations and Theses will be investigated. There will be no restriction on the year of publication of the articles, but we will include only the ones written in English or French. Two authors will independently assess articles for eligibility and will extract data using a standardized data extraction form that will have been previously pilot-tested. The quality of the studies will be assessed using the Effective Public Health Practice Project tool for quantitative study designs. The intervention procedures will be classified according to the latest validated taxonomy of behavior change techniques. If there is a sufficient number of studies (k > 5), a meta-analysis of the results will be performed with a random-effect model. If the heterogeneity is high (I²

  5. Randomized controlled trial of a computer-based module to improve contraceptive method choice.

    PubMed

    Garbers, Samantha; Meserve, Allison; Kottke, Melissa; Hatcher, Robert; Ventura, Alicia; Chiasson, Mary Ann

    2012-10-01

    Unintended pregnancy is common in the United States, and interventions are needed to improve contraceptive use among women at higher risk of unintended pregnancy, including Latinas and women with low educational attainment. A three-arm randomized controlled trial was conducted at two family planning sites serving low-income, predominantly Latina populations. The trial tested the efficacy of a computer-based contraceptive assessment module in increasing the proportion of patients choosing an effective method of contraception (<10 pregnancies/100 women per year, typical use). Participants were randomized to complete the module and receive tailored health materials, to complete the module and receive generic health materials, or to a control condition. In intent-to-treat analyses adjusted for recruitment site (n=2231), family planning patients who used the module were significantly more likely to choose an effective contraceptive method: 75% among those who received tailored materials [odds ratio (OR)=1.56; 95% confidence interval (CI): 1.23-1.98] and 78% among those who received generic materials (OR=1.74; 95% CI: 1.35-2.25), compared to 65% among control arm participants. The findings support prior research suggesting that patient-centered interventions can positively influence contraceptive method choice.

  6. 34 CFR 200.20 - Making adequate yearly progress.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... adequate yearly progress. A school or LEA makes AYP if it complies with paragraph (c) and with either paragraph (a) or (b) of this section separately in reading/language arts and in mathematics. (a)(1) A school... school or LEA, respectively, meets or exceeds the State's other academic indicators under § 200.19. (2...

  7. 34 CFR 200.20 - Making adequate yearly progress.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... adequate yearly progress. A school or LEA makes AYP if it complies with paragraph (c) and with either paragraph (a) or (b) of this section separately in reading/language arts and in mathematics. (a)(1) A school... school or LEA, respectively, meets or exceeds the State's other academic indicators under § 200.19. (2...

  8. 34 CFR 200.20 - Making adequate yearly progress.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... adequate yearly progress. A school or LEA makes AYP if it complies with paragraph (c) and with either paragraph (a) or (b) of this section separately in reading/language arts and in mathematics. (a)(1) A school... school or LEA, respectively, meets or exceeds the State's other academic indicators under § 200.19. (2...

  9. 34 CFR 200.20 - Making adequate yearly progress.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... adequate yearly progress. A school or LEA makes AYP if it complies with paragraph (c) and with either paragraph (a) or (b) of this section separately in reading/language arts and in mathematics. (a)(1) A school... school or LEA, respectively, meets or exceeds the State's other academic indicators under § 200.19. (2...

  10. Comparability and Reliability Considerations of Adequate Yearly Progress

    ERIC Educational Resources Information Center

    Maier, Kimberly S.; Maiti, Tapabrata; Dass, Sarat C.; Lim, Chae Young

    2012-01-01

    The purpose of this study is to develop an estimate of Adequate Yearly Progress (AYP) that will allow for reliable and valid comparisons among student subgroups, schools, and districts. A shrinkage-type estimator of AYP using the Bayesian framework is described. Using simulated data, the performance of the Bayes estimator will be compared to…

  11. 40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 24 2014-07-01 2014-07-01 false Exemptions for pesticides adequately... PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Exemptions § 152.20 Exemptions for pesticides adequately regulated by another Federal agency. The pesticides...

  12. 40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 25 2013-07-01 2013-07-01 false Exemptions for pesticides adequately... PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Exemptions § 152.20 Exemptions for pesticides adequately regulated by another Federal agency. The pesticides...

  13. 40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 25 2012-07-01 2012-07-01 false Exemptions for pesticides adequately... PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Exemptions § 152.20 Exemptions for pesticides adequately regulated by another Federal agency. The pesticides...

  14. Reflective Random Indexing and indirect inference: a scalable method for discovery of implicit connections.

    PubMed

    Cohen, Trevor; Schvaneveldt, Roger; Widdows, Dominic

    2010-04-01

    The discovery of implicit connections between terms that do not occur together in any scientific document underlies the model of literature-based knowledge discovery first proposed by Swanson. Corpus-derived statistical models of semantic distance such as Latent Semantic Analysis (LSA) have been evaluated previously as methods for the discovery of such implicit connections. However, LSA in particular is dependent on a computationally demanding method of dimension reduction as a means to obtain meaningful indirect inference, limiting its ability to scale to large text corpora. In this paper, we evaluate the ability of Random Indexing (RI), a scalable distributional model of word associations, to draw meaningful implicit relationships between terms in general and biomedical language. Proponents of this method have achieved comparable performance to LSA on several cognitive tasks while using a simpler and less computationally demanding method of dimension reduction than LSA employs. In this paper, we demonstrate that the original implementation of RI is ineffective at inferring meaningful indirect connections, and evaluate Reflective Random Indexing (RRI), an iterative variant of the method that is better able to perform indirect inference. RRI is shown to lead to more clearly related indirect connections and to outperform existing RI implementations in the prediction of future direct co-occurrence in the MEDLINE corpus.
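
    Basic Random Indexing as described can be sketched in a few lines: each term receives a sparse random "index" vector, a term's context vector is the sum of the index vectors of its co-occurring terms, and indirect relatedness is read off as cosine similarity between context vectors. A minimal sketch with toy documents (the reflective step of RRI, retraining index vectors from context vectors, is omitted; all names and parameters are illustrative):

```python
import random
from collections import defaultdict

def build_index_vectors(vocab, dim=64, n_nonzero=4, seed=0):
    """Each term gets a sparse ternary index vector: a few random
    +1/-1 entries, zeros elsewhere."""
    rng = random.Random(seed)
    vectors = {}
    for term in sorted(vocab):
        v = [0] * dim
        for pos in rng.sample(range(dim), n_nonzero):
            v[pos] = rng.choice((-1, 1))
        vectors[term] = v
    return vectors

def train_context_vectors(docs, index_vectors, dim=64):
    """A term's context vector is the sum of the index vectors of the
    terms it co-occurs with (document-level co-occurrence)."""
    ctx = defaultdict(lambda: [0] * dim)
    for doc in docs:
        for term in doc:
            for other in doc:
                if other != term:
                    ctx[term] = [a + b for a, b in zip(ctx[term], index_vectors[other])]
    return ctx

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = sum(a * a for a in u) ** 0.5
    nv = sum(b * b for b in v) ** 0.5
    return dot / (nu * nv) if nu and nv else 0.0

# "migraine" and "calcium" never co-occur directly; any similarity
# between their context vectors is an indirect (bridged) connection.
docs = [["migraine", "magnesium"], ["magnesium", "calcium"], ["migraine", "serotonin"]]
vocab = {t for d in docs for t in d}
cv = train_context_vectors(docs, build_index_vectors(vocab))
print(cosine(cv["migraine"], cv["calcium"]))
```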

  15. Region 1: New Hampshire Adequate Letter (8/12/2008)

    EPA Pesticide Factsheets

    This July 9, 2008 letter from EPA to the New Hampshire Department of Environmental Services determined that the 2009 Motor Vehicle Emissions Budgets (MVEBs) are adequate for transportation conformity purposes and will be announced in the Federal Register (FR).

  16. Region 6: Texas Austin Adequate Letter (11/23/2016)

    EPA Pesticide Factsheets

    EPA letter approves the Motor Vehicle Emissions Budgets contained in the latest revision to Dallas/Fort Worth 2008 8-hour Ozone State Implementation Plan, finding them adequate for transportation conformity purposes to be announced in the Federal Register.

  17. Certified randomness in quantum physics.

    PubMed

    Acín, Antonio; Masanes, Lluis

    2016-12-07

    The concept of randomness plays an important part in many disciplines. On the one hand, the question of whether random processes exist is fundamental for our understanding of nature. On the other, randomness is a resource for cryptography, algorithms and simulations. Standard methods for generating randomness rely on assumptions about the devices that are often not valid in practice. However, quantum technologies enable new methods for generating certified randomness, based on the violation of Bell inequalities. These methods are referred to as device-independent because they do not rely on any modelling of the devices. Here we review efforts to design device-independent randomness generators and the associated challenges.
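The Bell test underlying such certification can be sketched as follows (the sampling scheme and strategies here are invented for illustration): the CHSH value S = E(0,0) + E(0,1) + E(1,0) - E(1,1) cannot exceed 2 for any local hidden-variable strategy, whereas quantum devices can reach 2√2; observing S > 2 is what certifies randomness without modelling the devices.

```python
import random

rng = random.Random(1)

def chsh(a_strategy, b_strategy, shots=2000):
    """Estimate S = E(0,0) + E(0,1) + E(1,0) - E(1,1) from sampled rounds."""
    E = {}
    for x in (0, 1):
        for y in (0, 1):
            total = 0
            for _ in range(shots):
                lam = rng.random()                 # shared hidden variable
                total += a_strategy(x, lam) * b_strategy(y, lam)
            E[(x, y)] = total / shots
    return E[(0, 0)] + E[(0, 1)] + E[(1, 0)] - E[(1, 1)]

# any local deterministic strategy is bounded by |S| <= 2;
# this one (always answer +1) attains the bound exactly
S = chsh(lambda x, lam: 1, lambda y, lam: 1)
```

Swapping in strategies correlated through `lam` changes the individual correlators but never pushes S past 2; only entangled quantum devices can.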

  18. Methods for calculating confidence and credible intervals for the residual between-study variance in random effects meta-regression models

    PubMed Central

    2014-01-01

    Background Meta-regression is becoming increasingly used to model study level covariate effects. However this type of statistical analysis presents many difficulties and challenges. Here two methods for calculating confidence intervals for the magnitude of the residual between-study variance in random effects meta-regression models are developed. A further suggestion for calculating credible intervals using informative prior distributions for the residual between-study variance is presented. Methods Two recently proposed and, under the assumptions of the random effects model, exact methods for constructing confidence intervals for the between-study variance in random effects meta-analyses are extended to the meta-regression setting. The use of Generalised Cochran heterogeneity statistics is extended to the meta-regression setting and a Newton-Raphson procedure is developed to implement the Q profile method for meta-analysis and meta-regression. WinBUGS is used to implement informative priors for the residual between-study variance in the context of Bayesian meta-regressions. Results Results are obtained for two contrasting examples, where the first example involves a binary covariate and the second involves a continuous covariate. Intervals for the residual between-study variance are wide for both examples. Conclusions Statistical methods, and R computer software, are available to compute exact confidence intervals for the residual between-study variance under the random effects model for meta-regression. These frequentist methods are almost as easily implemented as their established counterparts for meta-analysis. Bayesian meta-regressions are also easily performed by analysts who are comfortable using WinBUGS. Estimates of the residual between-study variance in random effects meta-regressions should be routinely reported and accompanied by some measure of their uncertainty. Confidence and/or credible intervals are well-suited to this purpose. PMID:25196829
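The Q-profile idea can be sketched for the plain meta-analysis case (bisection is used here instead of the paper's Newton-Raphson; the effect sizes, within-study variances, and χ² quantiles are illustrative assumptions): the generalized Q statistic is monotone decreasing in τ², so inverting it at the quantiles of its χ² reference distribution yields an exact confidence interval for the between-study variance.

```python
def q_statistic(y, v, tau2):
    """Cochran-type Q evaluated with weights 1 / (v_i + tau^2)."""
    w = [1.0 / (vi + tau2) for vi in v]
    mu = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    return sum(wi * (yi - mu) ** 2 for wi, yi in zip(w, y))

def q_profile_ci(y, v, chi2_lower_q, chi2_upper_q, tau2_max=100.0, tol=1e-9):
    """Invert Q(tau2) = chi-square quantile by bisection (Q is decreasing in tau2)."""
    def solve(target):
        if q_statistic(y, v, 0.0) < target:
            return 0.0                           # CI limit truncated at zero
        lo, hi = 0.0, tau2_max
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if q_statistic(y, v, mid) > target:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)
    # the upper chi-square quantile gives the lower tau^2 limit, and vice versa
    return solve(chi2_upper_q), solve(chi2_lower_q)

# illustrative data: 10 study effects with equal within-study variance
y = [0.1, 0.5, -0.2, 0.8, 0.3, 0.0, 0.6, 0.2, -0.1, 0.4]
v = [0.04] * 10
# chi-square quantiles for df = 9 (0.025 and 0.975, from standard tables)
lower, upper = q_profile_ci(y, v, 2.700, 19.023)
```

For meta-regression, the same profiling is applied to the residual Q statistic after the covariate fit, which is where the Newton-Raphson procedure of the paper comes in.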

  19. Are adequate methods available to detect protist parasites on fresh produce?

    USDA-ARS?s Scientific Manuscript database

    Human parasitic protists such as Cryptosporidium, Giardia and microsporidia contaminate a variety of fresh produce worldwide. Existing detection methods lack sensitivity and specificity for most foodborne parasites. Furthermore, detection has been problematic because these parasites adhere tenacious...

  20. Essential energy space random walk via energy space metadynamics method to accelerate molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Li, Hongzhi; Min, Donghong; Liu, Yusong; Yang, Wei

    2007-09-01

    To overcome possible pseudoergodicity problems, molecular dynamics simulations can be accelerated by realizing a random walk in energy space. To achieve this, a biased free energy function (BFEF) must be obtained a priori. Although the quality of the BFEF is essential for sampling efficiency, its generation is usually tedious and nontrivial. In this work, we present an energy space metadynamics algorithm to obtain BFEFs efficiently and robustly. Moreover, to deal with the diffusive sampling problem caused by the random walk in the total energy space, the idea of the original umbrella sampling method is generalized to a random walk in the essential energy space, which includes only the energy terms that determine the conformation of a region of interest. This essential energy space generalization allows efficient localized enhanced sampling and offers the possibility of further efficiency improvement when high-frequency energy terms irrelevant to the target events are left unactivated. The energy space metadynamics method and its generalization to the essential energy space are demonstrated in simulations of a pentane-like system, the blocked alanine dipeptide model, and the leucine model.
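The energy-space bias construction can be sketched on a toy system (the double-well potential, hill parameters, and Monte Carlo settings are all invented for illustration, not the authors' implementation): Gaussian hills are deposited at the currently visited potential energy, progressively filling low-energy basins so the walker performs a random walk in energy and crosses barriers it would otherwise rarely surmount.

```python
import math
import random

rng = random.Random(2)

def potential(x):
    """Toy double-well potential with a barrier at x = 0."""
    return x ** 4 - 2.0 * x ** 2

HEIGHT, WIDTH = 0.1, 0.2          # Gaussian hill parameters (assumed values)
hills = []                        # energies at which hills were deposited

def bias(e):
    """Metadynamics bias accumulated in energy space."""
    return sum(HEIGHT * math.exp(-(e - c) ** 2 / (2.0 * WIDTH ** 2))
               for c in hills)

def biased(x):
    e = potential(x)
    return e + bias(e)

beta, x = 5.0, -1.0               # inverse temperature; start in the left well
crossings = 0
for step in range(10000):
    xn = x + rng.uniform(-0.3, 0.3)
    de = biased(xn) - biased(x)
    if de <= 0.0 or rng.random() < math.exp(-beta * de):   # Metropolis step
        if (x < 0.0) != (xn < 0.0):
            crossings += 1        # barrier crossing between the two wells
        x = xn
    if step % 50 == 0:
        hills.append(potential(x))   # deposit a hill at the current energy
```

Restricting `potential` to only the terms that set the conformation of a region of interest gives the essential-energy-space variant sketched above.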

  1. Brief Report: A Randomized, Placebo-Controlled Proof-of-Concept Trial of Adjunctive Topiramate for Alcohol Use Disorders in Bipolar Disorder

    PubMed Central

    Sylvia, Louisa G.; Gold, Alexandra K.; Stange, Jonathan P.; Peckham, Andrew D.; Deckersbach, Thilo; Calabrese, Joseph R.; Weiss, Roger D.; Perlis, Roy H.; Nierenberg, Andrew A.; Ostacher, Michael J.

    2016-01-01

    Background and Objectives Topiramate is effective for alcohol use disorders (AUDs) among non-psychiatric patients. We examined topiramate for treating comorbid AUDs in bipolar disorder (BD). Methods Twelve participants were randomized to topiramate or placebo for 12 weeks. Results The topiramate group, with two out of five participants (40%) completing treatment, experienced less improvement in drinking patterns than the placebo group, with five out of seven participants (71%) completing treatment. Discussion and Conclusions Topiramate did not improve drinking behavior and was not well-tolerated. This study failed to recruit adequately. Problems surrounding high attrition, a small study sample, and missing data preclude interpretation of study findings. Scientific Significance This is the first randomized, placebo-controlled trial of topiramate for AUDs in BD. PMID:26894822

  2. A Novel Motion Compensation Method for Random Stepped Frequency Radar with M-sequence

    NASA Astrophysics Data System (ADS)

    Liao, Zhikun; Hu, Jiemin; Lu, Dawei; Zhang, Jun

    2018-01-01

    The random stepped frequency radar is a new kind of synthetic wideband radar. It has been found to possess a thumbtack-like ambiguity function, which is considered ideal; this also means that only precise motion compensation yields a correct high-resolution range profile. In this paper, we first introduce the random stepped frequency radar coded by an M-sequence and briefly analyse the effect of relative motion between target and radar on range imaging, the so-called defocusing problem. We then propose a novel motion compensation method, named complementary code cancellation, to solve this problem. Finally, simulated experiments demonstrate its validity and a computational analysis shows its efficiency.
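An M-sequence such as the one coding the frequency steps is conventionally generated with a linear-feedback shift register. A minimal sketch, with the primitive polynomial x⁴ + x³ + 1 chosen purely for illustration:

```python
def m_sequence(taps, nbits):
    """Fibonacci LFSR over GF(2); primitive taps give period 2**nbits - 1."""
    state = [1] * nbits               # any nonzero seed works
    seq = []
    for _ in range(2 ** nbits - 1):
        seq.append(state[-1])         # output bit
        fb = 0
        for t in taps:                # feedback = XOR of the tapped stages
            fb ^= state[t - 1]
        state = [fb] + state[:-1]     # shift, insert feedback
    return seq

# taps (4, 3) correspond to x^4 + x^3 + 1, a primitive polynomial
seq = m_sequence((4, 3), 4)           # one full period of length 15
```

A length-15 M-sequence is balanced (eight 1s, seven 0s) and has the two-valued periodic autocorrelation that underlies the thumbtack-like ambiguity function mentioned above.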

  3. Region 1: New Hampshire Adequate Letter (5/29/2012)

    EPA Pesticide Factsheets

    This April 25, 2012 letter from EPA to the New Hampshire Department of Environmental Services determined that the 2008 and 2022 Motor Vehicle Emissions Budgets (MVEBs) are adequate for transportation conformity purposes and will be announced in the Federal Reg

  4. Region 5: Ohio Columbus Adequate Letter (8/23/2016)

    EPA Pesticide Factsheets

    Letter from EPA to State of Ohio determined the 2008 8-hour ozone standard plan for years 2020 and 2030 Motor Vehicle Emissions Budgets for volatile organic compounds and nitrogen oxides for Columbus area adequate for transportation conformity purposes.

  5. A multi-level intervention in worksites to increase fruit and vegetable access and intake: Rationale, design and methods of the ‘Good to Go’ cluster randomized trial

    PubMed Central

    Risica, Patricia M.; Gorham, Gemma; Dionne, Laura; Nardi, William; Ng, Doug; Middler, Reese; Mello, Jennifer; Akpolat, Rahmet; Gettens, Katelyn; Gans, Kim M.

    2018-01-01

    Background Fruit and vegetable (F&V) consumption is an important contributor to chronic disease prevention. However, most Americans do not eat adequate amounts. The worksite is an advantageous setting to reach large, diverse segments of the population with interventions to increase F&V intake, but research gaps exist. No studies have evaluated the implementation of mobile F&V markets at worksites or compared the effectiveness of such markets with or without nutrition education. Methods This paper describes the protocol for Good to Go (GTG), a cluster randomized trial to evaluate F&V intake change in employees from worksites randomized into three experimental arms: discounted fresh F&V markets (Access Only arm); markets plus educational components including campaigns, cooking demonstrations, videos, newsletters, and a web site (Access Plus arm); and an attention placebo comparison intervention on physical activity and stress reduction (Comparison). Secondary aims include: 1) Process evaluation to determine costs, reach, fidelity, and dose as well as the relationship of these variables with changes in F&V intake; 2) Applying a mediating variable framework to examine relationships of psychosocial factors/determinants with changes in F&V consumption; and 3) Cost effectiveness analysis of the different intervention arms. Discussion The GTG study will fill important research gaps in the field by implementing a rigorous cluster randomized trial to evaluate the efficacy of an innovative environmental intervention providing access and availability to F&V at the worksite and whether this access intervention is further enhanced by accompanying educational interventions. GTG will provide an important contribution to public health research and practice. Trial registration number NCT02729675, ClinicalTrials.gov PMID:29242108

  6. Hydrolyzed Formula With Reduced Protein Content Supports Adequate Growth: A Randomized Controlled Noninferiority Trial.

    PubMed

    Ahrens, Birgit; Hellmuth, Christian; Haiden, Nadja; Olbertz, Dirk; Hamelmann, Eckard; Vusurovic, Milica; Fleddermann, Manja; Roehle, Robert; Knoll, Anette; Koletzko, Berthold; Wahn, Ulrich; Beyer, Kirsten

    2018-05-01

    A high protein content of nonhydrolyzed infant formula exceeding metabolic requirements can induce rapid weight gain and obesity, while hydrolyzed formula with too low a protein (LP) content may result in inadequate growth. The aim of this study was to investigate the noninferiority of partially and extensively hydrolyzed formulas (pHF, eHF) with a lower hydrolyzed protein content than conventional formulas, with or without synbiotics, for normal growth of healthy term infants. In a European multicenter, parallel, prospective, controlled, double-blind trial, 402 formula-fed infants were randomly assigned to four groups: LP formulas (1.9 g protein/100 kcal) as pHF with or without synbiotics, LP-eHF formula with synbiotics, or regular protein eHF (2.3 g protein/100 kcal). One hundred and one breast-fed infants served as an observational reference group. As the primary endpoint, noninferiority of daily weight gain during the first 4 months of life was investigated, comparing the LP group to the regular protein eHF group. Daily weight gain in infants receiving LPpHF (2.15 g/day, CI -0.18 to infinity) was noninferior to that in infants receiving regular protein eHF (-3.5 g/day margin; per protocol [PP] population). Noninferiority was also confirmed for the other tested LP formulas. Likewise, analysis of metabolic parameters and plasma amino acid concentrations demonstrated a safe and balanced nutritional composition. Energetic efficiency for growth (weight) was slightly higher with LPeHF and synbiotics than with LPpHF and synbiotics. All tested hydrolyzed LP formulas allowed normal weight gain without being inferior to regular protein eHF in the first 4 months of life. This trial was registered at clinicaltrials.gov, NCT01143233.
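The noninferiority logic used in trials like this one can be sketched as follows (the summary statistics are invented for illustration and are not the trial's data): the lower confidence bound of the weight-gain difference must lie above the prespecified -3.5 g/day margin.

```python
from statistics import NormalDist

def noninferiority(mean_t, sd_t, n_t, mean_c, sd_c, n_c, margin, alpha=0.05):
    """One-sided z-based check: lower confidence bound of (test - control)
    must exceed the (negative) noninferiority margin."""
    diff = mean_t - mean_c
    se = (sd_t ** 2 / n_t + sd_c ** 2 / n_c) ** 0.5
    lower = diff - NormalDist().inv_cdf(1.0 - alpha) * se
    return lower, lower > margin

# hypothetical daily weight gain summaries (g/day): test vs control formula
lower, ok = noninferiority(30.0, 5.0, 100, 29.0, 5.0, 100, margin=-3.5)
```

Here the bound falls slightly below zero (the test formula is not demonstrably superior) yet stays well above -3.5 g/day, so noninferiority is concluded: the same shape of result as the trial's 2.15 g/day difference with CI lower limit -0.18.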

  7. An evaluation of random analysis methods for the determination of panel damping

    NASA Technical Reports Server (NTRS)

    Bhat, W. V.; Wilby, J. F.

    1972-01-01

    An analysis is made of steady-state and non-steady-state methods for the measurement of panel damping. Particular emphasis is placed on the use of random process techniques in conjunction with digital data reduction methods. The steady-state methods considered use the response power spectral density, response autocorrelation, excitation-response cross-power spectral density, or single-sided Fourier transform (SSFT) of the response autocorrelation function. Non-steady-state methods are associated mainly with the use of rapid frequency sweep excitation. Problems associated with the practical application of each method are evaluated with specific reference to the case of a panel exposed to a turbulent airflow, and two methods, the power spectral density and the single-sided Fourier transform methods, are selected as being the most suitable. These two methods are demonstrated experimentally, and it is shown that the power spectral density method is satisfactory under most conditions, provided that appropriate corrections are applied to account for filter bandwidth and background noise errors. Thus, the response power spectral density method is recommended for the measurement of the damping of panels exposed to a moving airflow.
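In the power spectral density method, damping is commonly extracted from the half-power (-3 dB) bandwidth of a resonance peak. A minimal sketch on a synthetic single-degree-of-freedom PSD (the modal frequency, damping ratio, and frequency grid are illustrative assumptions):

```python
def sdof_psd(f, fn, zeta):
    """Magnitude-squared frequency response of a unit SDOF oscillator."""
    r = f / fn
    return 1.0 / ((1.0 - r * r) ** 2 + (2.0 * zeta * r) ** 2)

def half_power_damping(freqs, psd):
    """Damping ratio from the -3 dB bandwidth: zeta = (f2 - f1) / (2 f_peak)."""
    ipk = max(range(len(psd)), key=psd.__getitem__)
    half = psd[ipk] / 2.0
    lo = next(i for i in range(ipk, -1, -1) if psd[i] < half)   # left crossing
    hi = next(i for i in range(ipk, len(psd)) if psd[i] < half)  # right crossing
    return (freqs[hi] - freqs[lo]) / (2.0 * freqs[ipk])

fn, zeta = 100.0, 0.02                            # illustrative panel mode
freqs = [90.0 + 0.01 * i for i in range(2001)]    # 90-110 Hz grid
psd = [sdof_psd(f, fn, zeta) for f in freqs]
estimate = half_power_damping(freqs, psd)         # close to the true zeta
```

The filter-bandwidth and background-noise corrections mentioned in the abstract matter precisely because they broaden or bias this measured bandwidth.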

  8. Region 8: Colorado Springs Adequate Letter (8/17/2011)

    EPA Pesticide Factsheets

    This March 3, 2011 letter from EPA to Christopher E. Urbina, M.D., MPH, Colorado Department of Public Health and Environment, states that EPA has found the Colorado Springs, CO second 10-year Limited Maintenance Plan (LMP) adequate for transportation

  9. Minimum requirements for adequate nighttime conspicuity of highway signs

    DOT National Transportation Integrated Search

    1988-02-01

    A laboratory and field study were conducted to assess the minimum luminance levels of signs to ensure that they will be detected and identified at adequate distances under nighttime driving conditions. A total of 30 subjects participated in the field...

  10. Inferential Processing among Adequate and Struggling Adolescent Comprehenders and Relations to Reading Comprehension

    PubMed Central

    Barth, Amy E.; Barnes, Marcia; Francis, David J.; Vaughn, Sharon; York, Mary

    2015-01-01

    Separate mixed model analyses of variance (ANOVA) were conducted to examine the effect of textual distance on the accuracy and speed of text consistency judgments among adequate and struggling comprehenders across grades 6–12 (n = 1203). Multiple regressions examined whether accuracy in text consistency judgments uniquely accounted for variance in comprehension. Results suggest that there is considerable growth across the middle and high school years, particularly for adequate comprehenders in those text integration processes that maintain local coherence. Accuracy in text consistency judgments accounted for significant unique variance for passage-level, but not sentence-level comprehension, particularly for adequate comprehenders. PMID:26166946

  11. Systematic pain assessment in nursing homes: a cluster-randomized trial using mixed-methods approach.

    PubMed

    Mamhidir, Anna-Greta; Sjölund, Britt-Marie; Fläckman, Birgitta; Wimo, Anders; Sköldunger, Anders; Engström, Maria

    2017-02-28

    Chronic pain affects nursing home residents' daily life. Pain assessment is central to adequate pain management. The overall aim was to investigate the effects of a pain management intervention on nursing home residents and to describe staff's experiences of the intervention. A cluster-randomized trial with a mixed-methods approach was used, with nursing homes randomly assigned to the intervention or comparison group. The intervention group, after theoretical and practical training sessions, performed systematic pain assessments using predominantly observational scales, with external and internal facilitators supporting the implementation. No measures were taken in the comparison group; pain management continued as before, but corresponding training was provided after the study. Resident data were collected at baseline and at two follow-ups using validated scales and record reviews. Nurse group interviews were carried out twice. Primary outcome measures were wellbeing and proxy-measured pain. Secondary outcome measures were ADL-dependency and pain documentation. Using both non-parametric statistics on the residential level and generalized estimating equation (GEE) models to take clustering effects into account, the results revealed non-significant interaction effects for the primary outcome measures, while for ADL-dependency measured with the Katz-ADL there was a significant interaction effect. Comparison group (n = 66 residents) Katz-ADL values showed increased dependency over time, while the intervention group (n = 98) demonstrated no significant change over time. In the intervention group, 13/44 residents showed decreased pain scores over the period and 14/44 had no pain score changes ≥ 30% in either direction, measured with Doloplus-2. Furthermore, 17/44 residents showed increased pain scores ≥ 30% over time, indicating pain/risk for pain; 8 were identified at the first assessment and 9 were new, i.e., developed pain over time. No significant changes in the use of drugs were found in any of

  12. Prediction models for clustered data: comparison of a random intercept and standard regression model

    PubMed Central

    2013-01-01

    Background When study data are clustered, standard regression analysis is considered inappropriate and analytical techniques for clustered data need to be used. For prediction research in which the interest of predictor effects is on the patient level, random effect regression models are probably preferred over standard regression analysis. It is well known that the random effect parameter estimates and the standard logistic regression parameter estimates are different. Here, we compared random effect and standard logistic regression models for their ability to provide accurate predictions. Methods Using an empirical study on 1642 surgical patients at risk of postoperative nausea and vomiting, who were treated by one of 19 anesthesiologists (clusters), we developed prognostic models either with standard or random intercept logistic regression. External validity of these models was assessed in new patients from other anesthesiologists. We supported our results with simulation studies using intra-class correlation coefficients (ICC) of 5%, 15%, or 30%. Standard performance measures and measures adapted for the clustered data structure were estimated. Results The model developed with random effect analysis showed better discrimination than the standard approach, if the cluster effects were used for risk prediction (standard c-index of 0.69 versus 0.66). In the external validation set, both models showed similar discrimination (standard c-index 0.68 versus 0.67). The simulation study confirmed these results. For datasets with a high ICC (≥15%), model calibration was only adequate in external subjects if the used performance measure assumed the same data structure as the model development method: standard calibration measures showed good calibration for the standard developed model, calibration measures adapting the clustered data structure showed good calibration for the prediction model with random intercept. Conclusion The models with random intercept discriminate
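The discrimination measure reported here, the c-index, reduces to pairwise concordance for binary outcomes: the probability that a randomly chosen event case receives a higher predicted risk than a randomly chosen non-event case. A minimal sketch (the toy risk scores are invented):

```python
def c_index(scores, labels):
    """Concordance: probability that a random event case outranks a random
    non-event case by predicted risk; tied scores count one half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    pairs = len(pos) * len(neg)
    concordant = sum(1.0 if p > n else 0.5 if p == n else 0.0
                     for p in pos for n in neg)
    return concordant / pairs if pairs else float("nan")

# toy predicted risks and observed outcomes: 3 of 4 pairs are concordant
cidx = c_index([0.9, 0.8, 0.3, 0.2], [1, 0, 1, 0])   # -> 0.75
```

Including cluster (anesthesiologist) effects in the predicted risks is what lifts this statistic from 0.66 to 0.69 in the development data above.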

  13. An improved random walk algorithm for the implicit Monte Carlo method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keady, Kendra P., E-mail: keadyk@lanl.gov; Cleveland, Mathew A.

    In this work, we introduce a modified Implicit Monte Carlo (IMC) Random Walk (RW) algorithm, which increases simulation efficiency for multigroup radiative transfer problems with strongly frequency-dependent opacities. To date, the RW method has only been implemented in “fully-gray” form; that is, the multigroup IMC opacities are group-collapsed over the full frequency domain of the problem to obtain a gray diffusion problem for RW. This formulation works well for problems with large spatial cells and/or opacities that are weakly dependent on frequency; however, the efficiency of the RW method degrades when the spatial cells are thin or the opacities are a strong function of frequency. To address this inefficiency, we introduce a RW frequency group cutoff in each spatial cell, which divides the frequency domain into optically thick and optically thin components. In the modified algorithm, opacities for the RW diffusion problem are obtained by group-collapsing IMC opacities below the frequency group cutoff. Particles with frequencies above the cutoff are transported via standard IMC, while particles below the cutoff are eligible for RW. This greatly increases the total number of RW steps taken per IMC time-step, which in turn improves the efficiency of the simulation. We refer to this new method as Partially-Gray Random Walk (PGRW). We present numerical results for several multigroup radiative transfer problems, which show that the PGRW method is significantly more efficient than standard RW for several problems of interest. In general, PGRW decreases runtimes by a factor of ∼2–4 compared to standard RW, and a factor of ∼3–6 compared to standard IMC. While PGRW is slower than frequency-dependent Discrete Diffusion Monte Carlo (DDMC), it is also easier to adapt to unstructured meshes and can be used in spatial cells where DDMC is not applicable. This suggests that it may be optimal to employ both DDMC and PGRW in a single simulation.

  14. Region 8: Colorado Telluride Adequate Letter (8/17/2011)

    EPA Pesticide Factsheets

    This March 4, 2011 letter from EPA to Christopher E. Urbina, M.D., MPH, Colorado Department of Public Health and Environment, states that EPA has found the Telluride, CO PM10 maintenance plan and the 2021 motor vehicle emissions budget (MVEB) adequate

  15. Generalized self-consistent method for predicting the effective elastic properties of composites with random hybrid structures

    NASA Astrophysics Data System (ADS)

    Pan'kov, A. A.

    1997-05-01

    The feasibility of using a generalized self-consistent method for predicting the effective elastic properties of composites with random hybrid structures has been examined. Using this method, the problem is reduced to the solution of simpler averaged problems for composites with single inclusions and corresponding transition layers in the medium examined. The dimensions of the transition layers are defined by the correlation radii of the random structure of the composite, while the heterogeneous elastic properties of the transition layers account for the probabilities of variation in the size and configuration of the inclusions using averaged special indicator functions. Results are given for a numerical calculation of the averaged indicator functions and for an analysis of the effect of micropores in the matrix-fiber interface region on the effective elastic properties of unidirectional fiberglass-epoxy using the generalized self-consistent method, and these are compared with experimental data and reported solutions.

  16. Multilevel Analysis Methods for Partially Nested Cluster Randomized Trials

    ERIC Educational Resources Information Center

    Sanders, Elizabeth A.

    2011-01-01

    This paper explores multilevel modeling approaches for 2-group randomized experiments in which a treatment condition involving clusters of individuals is compared to a control condition involving only ungrouped individuals, otherwise known as partially nested cluster randomized designs (PNCRTs). Strategies for comparing groups from a PNCRT in the…

  17. Do we assess urethral function adequately in LUTD and NLUTD? ICI-RS 2015.

    PubMed

    Gajewski, Jerzy B; Rosier, Peter F W M; Rahnama'i, Sajjad; Abrams, Paul

    2017-04-01

    Urethral function, as well as anatomy, plays a significant role in the voiding reflex, and abnormalities in one or both contribute to the pathophysiology of Lower Urinary Tract Dysfunction (LUTD). We have several diagnostic tools to assess urethral function or dysfunction, but the question remains: are these adequate? This is a report of the proceedings of Think Tank P1, 'Do we assess urethral function adequately in LUTD and NLUTD?', from the annual International Consultation on Incontinence-Research Society, which took place September 22-24, 2014 in Bristol, UK. As a committee, we collected and discussed the evidence with regard to the urethra and the available relevant methods of testing urethral function, with emphasis on female and male voiding dysfunction. We looked into previous research and clinical studies and compiled summaries of pertinent testing related to urethral function. The discussion focused on clinical applications and the desirability of further development of functional tests and analyses in this field. There are limitations to most urethral function tests. Future research should concentrate on further development of functional testing and imaging techniques, with emphasis on standardization and clinical application of these tests. Neurourol. Urodynam. 36:935-942, 2017. © 2017 Wiley Periodicals, Inc.

  18. Adequate Initial Antidepressant Treatment Among Patients With Chronic Obstructive Pulmonary Disease in a Cohort of Depressed Veterans

    PubMed Central

    Pirraglia, Paul A.; Charbonneau, Andrea; Kader, Boris; Berlowitz, Dan R.

    2006-01-01

    Objective: Depression is common among patients with chronic obstructive pulmonary disease (COPD). Patients with COPD may be more likely to have inadequate treatment with antidepressant medications. We tested the hypothesis that depressed patients with COPD have lower odds of adequate duration of antidepressant therapy in the first 3 months of treatment compared to those without COPD. Method: Using administrative and centralized pharmacy data from 14 northeastern Veterans Affairs Medical Centers, we identified 778 veterans with depression (ICD-9-CM codes 296.2x, 296.3x, and 311.xx) who were in the acute phase of antidepressant treatment from June 1, 1999, through August 31, 1999. Within this group, we identified those patients with COPD (23%). An adequate duration of antidepressant treatment was defined as ≥ 80% of days on an antidepressant. We used multivariable logistic regression models to determine the adjusted odds of adequate acute phase antidepressant treatment duration. Results: Those patients with COPD had markedly lower odds of adequate acute phase treatment duration (odds ratio = 0.67, 95% CI = 0.47 to 0.96); this was not observed with other medical diagnoses such as coronary heart disease, diabetes mellitus, or osteoarthritis. Conclusions: The first few months of treatment appear to be a critical period for depressed patients with COPD who are started on antidepressants. The causes for early antidepressant treatment inadequacy among patients with COPD require further investigation. More intensive efforts may be necessary early in the course of treatment to assure high-quality pharmacologic therapy of depressed patients with COPD. PMID:16862230
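The ≥ 80%-of-days adequacy criterion is essentially a proportion-of-days-covered calculation over pharmacy fill records. A minimal sketch (the fill records and 90-day window are hypothetical):

```python
def proportion_days_covered(fills, window_days):
    """fills: (start_day, days_supply) pairs; overlapping supply is not
    double-counted because coverage is tracked per calendar day."""
    covered = set()
    for start, supply in fills:
        covered.update(range(start, min(start + supply, window_days)))
    return len(covered) / window_days

# three antidepressant fills over a 90-day acute phase
pdc = proportion_days_covered([(0, 30), (30, 30), (60, 20)], 90)
adequate = pdc >= 0.80        # the >= 80%-of-days adequacy threshold
```

Here 80 of 90 days are covered (PDC ≈ 0.89), so the treatment duration would count as adequate; a missed refill shrinking coverage below 72 days would not.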

  19. Do implant overdentures improve dietary intake? A randomized clinical trial.

    PubMed

    Hamdan, N M; Gray-Donald, K; Awad, M A; Johnson-Down, L; Wollin, S; Feine, J S

    2013-12-01

    People wearing mandibular two-implant overdentures (IOD) chew food with less difficulty than those wearing conventional complete dentures (CD). However, there is still controversy over whether or not this results in better dietary intake. In this randomized clinical trial (RCT), the amounts of total dietary fiber (TDF), macronutrients, 9 micronutrients, and energy in diets consumed by persons with IOD and CD were compared. Male and female edentate patients ≥ 65 yrs (n = 255) were randomly divided into 2 groups and assigned to receive a maxillary CD and either a mandibular IOD or a CD. One year following prosthesis delivery, 217 participants (CD = 114, IOD = 103) reported the foods and quantities they consumed to a registered dietician through a standard 24-hour dietary recall method. The mean and median values of TDF, macro- and micronutrients, and energy consumed by both groups were calculated and compared analytically. No significant between-group differences were found (all p > .05). Despite quality-of-life benefits from IODs, this adequately powered study reveals no evidence of a dietary intake advantage at one year following prosthesis delivery for independently living, medically healthy edentate elders wearing two-implant mandibular overdentures over those wearing conventional complete dentures.

  20. 9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT... and Adequate Veterinary Care § 2.40 Attending veterinarian and adequate veterinary care (dealers and... veterinary care to its animals in compliance with this section. (1) Each dealer and exhibitor shall employ an...

  1. 9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT... and Adequate Veterinary Care § 2.40 Attending veterinarian and adequate veterinary care (dealers and... veterinary care to its animals in compliance with this section. (1) Each dealer and exhibitor shall employ an...

  2. Test-treatment RCTs are susceptible to bias: a review of the methodological quality of randomized trials that evaluate diagnostic tests.

    PubMed

    Ferrante di Ruffano, Lavinia; Dinnes, Jacqueline; Sitch, Alice J; Hyde, Chris; Deeks, Jonathan J

    2017-02-24

    There is a growing recognition for the need to expand our evidence base for the clinical effectiveness of diagnostic tests. Many international bodies are calling for diagnostic randomized controlled trials to provide the most rigorous evidence of impact to patient health. Although these so-called test-treatment RCTs are very challenging to undertake due to their methodological complexity, they have not been subjected to a systematic appraisal of their methodological quality. The extent to which these trials may be producing biased results therefore remains unknown. We set out to address this issue by conducting a methodological review of published test-treatment trials to determine how often they implement adequate methods to limit bias and safeguard the validity of results. We ascertained all test-treatment RCTs published 2004-2007, indexed in CENTRAL, including RCTs which randomized patients to diagnostic tests and measured patient outcomes after treatment. Tests used for screening, monitoring or prognosis were excluded. We assessed adequacy of sequence generation, allocation concealment and intention-to-treat, appropriateness of primary analyses, blinding and reporting of power calculations, and extracted study characteristics including the primary outcome. One hundred three trials compared 105 control with 119 experimental interventions, and reported 150 primary outcomes. Randomization and allocation concealment were adequate in 57 and 37% of trials. Blinding was uncommon (patients 5%, clinicians 4%, outcome assessors 21%), as was an adequate intention-to-treat analysis (29%). Overall 101 of 103 trials (98%) were at risk of bias, as judged using standard Cochrane criteria. Test-treatment trials are particularly susceptible to attrition and inadequate primary analyses, lack of blinding and under-powering. These weaknesses pose much greater methodological and practical challenges to conducting reliable RCT evaluations of test-treatment strategies than standard

  3. Current strategies for the restoration of adequate lordosis during lumbar fusion

    PubMed Central

    Barrey, Cédric; Darnis, Alice

    2015-01-01

    Not restoring adequate lumbar lordosis during lumbar fusion surgery may result in mechanical low back pain, sagittal imbalance and adjacent segment degeneration. The objective of this work is to describe the current strategies and concepts for restoration of adequate lordosis during fusion surgery. Theoretical lordosis can be evaluated from the measurement of the pelvic incidence and from the analysis of the spatial organization of the lumbar spine, with 2/3 of the lordosis given by the L4-S1 segment and 85% by the L3-S1 segment. Technical aspects involve patient positioning on the operating table, release maneuvers, the type of instrumentation used (rod, screw-rod connection, interbody cages), the surgical sequence and the overall surgical strategy. Spinal osteotomies may be required in cases of a fixed kyphotic spine. Combined AP surgery is particularly efficient in restoring lordosis at the L5-S1 level and should be recommended. Finally, not one but several strategies may be combined to restore adequate lordosis during fusion surgery. PMID:25621216

  4. Evaluation of a Class of Simple and Effective Uncertainty Methods for Sparse Samples of Random Variables and Functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romero, Vicente; Bonney, Matthew; Schroeder, Benjamin

    When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: the central 95% of response, and the 10^-4 probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depend on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large database and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.
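
    The report's specific methods are not reproduced here, but the basic idea of conservatively bounding a tail quantity from sparse samples can be illustrated with a classical one-sided normal tolerance bound (an illustrative assumption; the report evaluates its own class of methods, not this one):

```python
import numpy as np
from scipy.stats import norm, nct

def upper_tolerance_bound(samples, coverage=0.95, confidence=0.95):
    # One-sided (coverage, confidence) upper tolerance bound, assuming the
    # samples come from a normal distribution; the k-factor uses the
    # noncentral t distribution: k = nct.ppf(conf, n-1, z_p*sqrt(n)) / sqrt(n).
    x = np.asarray(samples, dtype=float)
    n = x.size
    zp = norm.ppf(coverage)
    k = nct.ppf(confidence, df=n - 1, nc=zp * np.sqrt(n)) / np.sqrt(n)
    return x.mean() + k * x.std(ddof=1)

rng = np.random.default_rng(0)
x = rng.normal(loc=10.0, scale=2.0, size=10)   # only 10 samples available
bound = upper_tolerance_bound(x)               # conservative 95/95 upper bound
```

    For n = 10 at 95% coverage / 95% confidence the k-factor is about 2.91, well above z_0.95 ≈ 1.64; that inflation is exactly the intended conservatism when data are sparse.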

  5. Reporting of harm and safety results in randomized controlled trials published in 5 dermatology journals.

    PubMed

    Haddad, Cynthia; Sigha, Odette Berline; Lebrun-Vignes, Bénédicte; Chosidow, Olivier; Fardet, Laurence

    2017-07-01

    Randomized controlled trials (RCTs) are considered the gold standard for assessing the efficacy and short-term harm of medicines. However, several studies have concluded that harm is less well reported than efficacy outcomes. To describe harm reporting in publications on dermatological RCTs and assess parameters that could influence the quality of harm reporting. Methodologic systematic review of dermatologic RCTs published from 2010 to 2014 in 5 dermatological journals. Among 110 assessed publications on RCTs, 80 (73%) adequately reported harm and 52% adequately reported its severity. Overall, 40% of the assessed manuscripts fully reported and discussed harm. Adequate reporting of harm was significantly associated with the type of trial (odds ratio [OR] 4.41, 95% confidence interval [CI] 1.60-12.35 for multicenter compared with monocentric trials) and with having a predefined method for collecting harm data (OR 5.93, 95% CI 2.26-15.56). Reporting of harm severity was better in pharmacologic trials (OR 6.48, 95% CI 2.00-21.0) than in nonpharmacologic trials, and in trials for which a method for collecting harm (OR 5.65, 95% CI 2.00-16.4) and its severity (OR 3.60, 95% CI 1.00-12.8) was defined before study onset. Assessment was restricted to RCTs and 5 dermatological journals. Harm is quite well reported in dermatologic journals. Efforts should be made to improve the reporting of harm severity. Copyright © 2017 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.

  6. Randomized clinical trials in dentistry: Risks of bias, risks of random errors, reporting quality, and methodologic quality over the years 1955-2013.

    PubMed

    Saltaji, Humam; Armijo-Olivo, Susan; Cummings, Greta G; Amin, Maryam; Flores-Mir, Carlos

    2017-01-01

    To examine the risks of bias, risks of random errors, reporting quality, and methodological quality of randomized clinical trials of oral health interventions and the development of these aspects over time. We included 540 randomized clinical trials from 64 selected systematic reviews. We extracted, in duplicate, details from each of the selected randomized clinical trials with respect to publication and trial characteristics, reporting and methodologic characteristics, and Cochrane risk of bias domains. We analyzed data using logistic regression and Chi-square statistics. Sequence generation was assessed to be inadequate (at unclear or high risk of bias) in 68% (n = 367) of the trials, while allocation concealment was inadequate in the majority of trials (n = 464; 85.9%). Blinding of participants and blinding of the outcome assessment were judged to be inadequate in 28.5% (n = 154) and 40.5% (n = 219) of the trials, respectively. A sample size calculation before the initiation of the study was not performed/reported in 79.1% (n = 427) of the trials, while the sample size was assessed as adequate in only 17.6% (n = 95) of the trials. Two thirds of the trials were not described as double blinded (n = 358; 66.3%), while the method of blinding was appropriate in 53% (n = 286) of the trials. We identified a significant decrease over time (1955-2013) in the proportion of trials assessed as having inadequately addressed methodological quality items (P < 0.05) in 30 out of the 40 quality criteria, or as being inadequate (at high or unclear risk of bias) in five domains of the Cochrane risk of bias tool: sequence generation, allocation concealment, incomplete outcome data, other sources of bias, and overall risk of bias. The risks of bias, risks of random errors, reporting quality, and methodological quality of randomized clinical trials of oral health interventions have improved over time; however, further efforts that contribute to the development of more stringent

  7. 4 CFR 200.14 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... identifiable personal data and automated systems shall be adequately trained in the security and privacy of... the security and privacy of such records. (5) The disposal and destruction of identifiable personal....14 Section 200.14 Accounts RECOVERY ACCOUNTABILITY AND TRANSPARENCY BOARD PRIVACY ACT OF 1974 § 200...

  8. 4 CFR 200.14 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... identifiable personal data and automated systems shall be adequately trained in the security and privacy of....14 Section 200.14 Accounts RECOVERY ACCOUNTABILITY AND TRANSPARENCY BOARD PRIVACY ACT OF 1974 § 200... records in which identifiable personal data are processed or maintained, including all reports and output...

  9. An asymptotic-preserving stochastic Galerkin method for the radiative heat transfer equations with random inputs and diffusive scalings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Shi, E-mail: sjin@wisc.edu; Institute of Natural Sciences, Department of Mathematics, MOE-LSEC and SHL-MAC, Shanghai Jiao Tong University, Shanghai 200240; Lu, Hanqing, E-mail: hanqing@math.wisc.edu

    2017-04-01

    In this paper, we develop an Asymptotic-Preserving (AP) stochastic Galerkin scheme for the radiative heat transfer equations with random inputs and diffusive scalings. In this problem the random inputs arise due to uncertainties in the cross section, initial data or boundary data. We use the generalized polynomial chaos based stochastic Galerkin (gPC-SG) method, which is combined with the micro–macro decomposition based deterministic AP framework in order to handle the diffusive regime efficiently. For the linearized problem we prove the regularity of the solution in the random space and consequently the spectral accuracy of the gPC-SG method. We also prove the uniform (in the mean free path) linear stability for the space-time discretizations. Several numerical tests are presented to show the efficiency and accuracy of the proposed scheme, especially in the diffusive regime.
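
    The full AP scheme for radiative heat transfer is beyond the scope of an abstract, but the core gPC-SG projection step can be sketched on a toy scalar problem with one uniform random input (a hypothetical stand-in for the paper's system, not the authors' scheme):

```python
import numpy as np
from numpy.polynomial.legendre import leggauss, legval
from scipy.linalg import expm

# Toy problem: du/dt = -z(xi) u, u(0) = 1, with z = 1 + 0.5*xi, xi ~ U(-1,1).
# gPC-SG: expand u(t, xi) = sum_k u_k(t) phi_k(xi) in orthonormal Legendre
# polynomials and Galerkin-project, giving a coupled ODE system du/dt = -A u.
N = 6                                     # gPC truncation order
nodes, w = leggauss(32)                   # Gauss-Legendre nodes/weights on [-1,1]
w = w / 2.0                               # weights of the uniform density

def phi(k, x):
    # orthonormal Legendre polynomial of degree k w.r.t. the uniform density
    c = np.zeros(k + 1)
    c[k] = 1.0
    return np.sqrt(2 * k + 1) * legval(x, c)

P = np.array([phi(k, nodes) for k in range(N + 1)])   # (N+1, n_quad)
z = 1.0 + 0.5 * nodes
A = (P * (w * z)) @ P.T                   # A_kl = E[z(xi) phi_k(xi) phi_l(xi)]

u0 = np.zeros(N + 1)
u0[0] = 1.0                               # deterministic initial condition
t = 1.0
u = expm(-A * t) @ u0                     # exact-in-time coefficient solution
mean_gpc = u[0]                           # E[u(t, .)] is the 0th gPC coefficient
mean_exact = np.exp(-t) * np.sinh(t / 2) / (t / 2)   # closed-form mean
```

    With order N = 6 the Galerkin mean matches the closed-form mean to spectral accuracy, the behavior the abstract proves for the linearized radiative transfer problem.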

  10. Random Fields

    NASA Astrophysics Data System (ADS)

    Vanmarcke, Erik

    1983-03-01

    Random variation over space and time is one of the few attributes that might safely be predicted as characterizing almost any given complex system. Random fields or "distributed disorder systems" confront astronomers, physicists, geologists, meteorologists, biologists, and other natural scientists. They appear in the artifacts developed by electrical, mechanical, civil, and other engineers. They even underlie the processes of social and economic change. The purpose of this book is to bring together existing and new methodologies of random field theory and indicate how they can be applied to these diverse areas where a "deterministic treatment is inefficient and conventional statistics insufficient." Many new results and methods are included. After outlining the extent and characteristics of the random field approach, the book reviews the classical theory of multidimensional random processes and introduces basic probability concepts and methods in the random field context. It next gives a concise account of the second-order analysis of homogeneous random fields, in both the space-time domain and the wave number-frequency domain. This is followed by a chapter on spectral moments and related measures of disorder and on level excursions and extremes of Gaussian and related random fields. After developing a new framework of analysis based on local averages of one-, two-, and n-dimensional processes, the book concludes with a chapter discussing ramifications in the important areas of estimation, prediction, and control. The mathematical prerequisite has been held to basic college-level calculus.

  11. Cognitive Attributes of Adequate and Inadequate Responders to Reading Intervention in Middle School

    ERIC Educational Resources Information Center

    Miciak, Jeremy; Stuebing, Karla K.; Vaughn, Sharon; Roberts, Greg; Barth, Amy E.; Fletcher, Jack M.

    2014-01-01

    No studies have investigated the cognitive attributes of middle school students who are adequate and inadequate responders to Tier 2 reading intervention. We compared students in Grades 6 and 7 representing groups of adequate responders (n = 77) and inadequate responders who fell below criteria in (a) comprehension (n = 54); (b) fluency (n = 45);…

  12. 34 CFR 200.14 - Components of Adequate Yearly Progress.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 1 2010-07-01 2010-07-01 false Components of Adequate Yearly Progress. 200.14 Section 200.14 Education Regulations of the Offices of the Department of Education OFFICE OF ELEMENTARY AND SECONDARY EDUCATION, DEPARTMENT OF EDUCATION TITLE I-IMPROVING THE ACADEMIC ACHIEVEMENT OF THE DISADVANTAGED...

  13. 13 CFR 108.200 - Adequate capital for NMVC Companies.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Adequate capital for NMVC Companies. 108.200 Section 108.200 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION NEW MARKETS VENTURE CAPITAL ("NMVC") PROGRAM Qualifications for the NMVC Program Capitalizing A Nmvc Company § 108.200...

  14. Active video games as a tool to prevent excessive weight gain in adolescents: rationale, design and methods of a randomized controlled trial

    PubMed Central

    2014-01-01

    Background Excessive body weight, low physical activity and excessive sedentary time in youth are major public health concerns. A new generation of video games that require physical activity to play (active games) may be a promising alternative to traditional non-active games for promoting physical activity and reducing sedentary behaviors in youth. The aim of this manuscript is to describe the design of a study evaluating the effects of a family-oriented active game intervention, incorporating several motivational elements, on anthropometrics and health behaviors in adolescents. Methods/Design The study is a randomized controlled trial (RCT), with non-active gaming adolescents aged 12-16 years randomly allocated to a ten-month intervention (receiving active games, as well as encouragement to play) or a waiting-list control group (receiving active games after the intervention period). Primary outcomes are adolescents' measured BMI-SDS (SDS = adjusted for mean standard deviation score), waist circumference-SDS, hip circumference and sum of skinfolds. Secondary outcomes are adolescents' self-reported time spent playing active and non-active games, other sedentary activities and consumption of sugar-sweetened beverages. In addition, a process evaluation is conducted, assessing the sustainability of the active games, enjoyment, perceived competence, perceived barriers to active game play, game context, injuries from active game play, activity replacement and intention to continue playing the active games. Discussion This is the first adequately powered RCT including normal-weight adolescents, evaluating a reasonably long period of provision of and exposure to active games. Further strong elements are the incorporation of motivational elements for active game play and a comprehensive process evaluation. This trial will provide evidence regarding the potential contribution of active games in prevention of excessive weight gain in

  15. The WOMEN study: what is the optimal method for ischemia evaluation in women? A multi-center, prospective, randomized study to establish the optimal method for detection of coronary artery disease (CAD) risk in women at an intermediate-high pretest likelihood of CAD: study design.

    PubMed

    Mieres, Jennifer H; Shaw, Leslee J; Hendel, Robert C; Heller, Gary V

    2009-01-01

    Coronary artery disease (CAD) remains the leading cause of morbidity and mortality in women. The optimal non-invasive test for the evaluation of ischemic heart disease in women is unknown. Although current guidelines support the choice of the exercise tolerance test (ETT) as a first-line test for women with a normal baseline ECG and adequate exercise capabilities, supportive data for this recommendation are controversial. The "What is the optimal method for ischemia evaluation in women?" (WOMEN) study was designed to determine the optimal non-invasive strategy for CAD risk detection in intermediate-to-high-risk women presenting with chest pain or equivalent symptoms suggestive of ischemic heart disease. The study will enroll women presenting for the evaluation of chest pain or anginal-equivalent symptoms who are capable of performing >5 METs of exercise and are at intermediate-to-high pretest risk for ischemic heart disease; they will be randomized to either ETT alone or ETT with Tc-99m tetrofosmin SPECT myocardial perfusion imaging (MPI). The null hypothesis for this project is that the exercise ECG has the same negative predictive value for risk detection as gated myocardial perfusion SPECT in women. The primary aim is to compare 2-year cardiac event rates in women randomized to SPECT MPI with those randomized to ETT. The WOMEN study seeks to provide objective information for guidelines for the evaluation of symptomatic women with an intermediate-to-high likelihood of CAD.

  16. Region 9: California Adequate / Inadequate Letter Attachment (5/30/2008)

    EPA Pesticide Factsheets

    This document states that EPA has found adequate for transportation conformity purposes certain 8-hour ozone and PM2.5 motor vehicle emissions budgets in the 2007 South Coast State Implementation Plan.

  17. 10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... the security and privacy of personal data. (4) The disposal and disposition of identifiable personal... contained in a system of records are adequately trained to protect the security and privacy of such records....114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114...

  18. A Random Forest-based ensemble method for activity recognition.

    PubMed

    Feng, Zengtao; Mo, Lingfei; Li, Meng

    2015-01-01

    This paper presents a multi-sensor ensemble approach to human physical activity (PA) recognition using random forests. We designed an ensemble learning algorithm that integrates several independent Random Forest classifiers, each based on a different sensor feature set, to build a more stable, more accurate and faster classifier for human activity recognition. To evaluate the algorithm, PA data collected from PAMAP (Physical Activity Monitoring for Aging People), a standard, publicly available database, were used for training and testing. The experimental results show that the algorithm is able to correctly recognize 19 PA types with an accuracy of 93.44%, while training is faster than with other methods. The ensemble classifier system based on the Random Forest (RF) algorithm can achieve high recognition accuracy and fast computation.
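
    A minimal sketch of the per-sensor ensemble idea, using scikit-learn and synthetic data as a stand-in for PAMAP (the feature split, fusion rule and all parameter values are illustrative assumptions, not the paper's implementation):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for multi-sensor activity data: 30 features treated as
# three "sensor" groups of 10 columns each.
X, y = make_classification(n_samples=2000, n_features=30, n_informative=15,
                           n_classes=5, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

sensor_slices = [slice(0, 10), slice(10, 20), slice(20, 30)]
forests = []
for s in sensor_slices:
    rf = RandomForestClassifier(n_estimators=100, random_state=0)
    rf.fit(X_tr[:, s], y_tr)          # one independent forest per sensor group
    forests.append((s, rf))

# Soft-voting fusion: sum the per-forest class probabilities, then argmax.
proba = sum(rf.predict_proba(X_te[:, s]) for s, rf in forests)
y_pred = proba.argmax(axis=1)
accuracy = (y_pred == y_te).mean()
```

    Training the forests on disjoint feature sets keeps each model small (and parallelizable), while the probability-sum fusion recovers much of the accuracy of a single forest on all features.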

  19. A randomized trial comparing digital and live lecture formats [ISRCTN40455708

    PubMed Central

    Solomon, David J; Ferenchick, Gary S; Laird-Fick, Heather S; Kavanaugh, Kevin

    2004-01-01

    Background Medical education is increasingly being conducted at community-based teaching sites in diverse locations, making it difficult to provide a consistent curriculum. We conducted a randomized trial to assess whether students who viewed digital lectures would perform as well on a measure of cognitive knowledge as students who viewed live lectures. Students' perceptions of the digital lecture format and their opinion as to whether a digital lecture format could serve as an adequate replacement for live lectures were also assessed. Methods Students were randomized to either attend a lecture series at our main campus or view digital versions of the same lectures at community-based teaching sites. Both groups completed the same examination based on the lectures, and the group viewing the digital lectures completed a feedback form on the digital format. Results There were no differences in performance as measured by means or average rank. Despite technical problems, the students who viewed the digital lectures overwhelmingly felt that digital lectures could replace live lectures. Conclusions This study provides preliminary evidence that digital lectures can be a viable alternative to live lectures as a means of delivering didactic presentations in a community-based setting. PMID:15569389

  20. Recruitment methods and costs for a randomized, placebo-controlled trial of chiropractic care for lumbar spinal stenosis: a single-site pilot study.

    PubMed

    Cambron, Jerrilyn A; Dexheimer, Jennifer M; Chang, Mabel; Cramer, Gregory D

    2010-01-01

    The purpose of this article is to describe the methods for recruitment in a clinical trial on chiropractic care for lumbar spinal stenosis. This randomized, placebo-controlled pilot study investigated the efficacy of different amounts of total treatment dosage over 6 weeks in 60 volunteer subjects with lumbar spinal stenosis. Subjects were recruited for this study through several media venues, focusing on successful and cost-effective strategies. Included in our efforts were radio advertising, newspaper advertising, direct mail, and various other low-cost initiatives. Of the 1211 telephone screens, 60 responders (5.0%) were randomized into the study. The most successful recruitment method was radio advertising, generating more than 64% of the calls (776 subjects). Newspaper and magazine advertising generated approximately 9% of all calls (108 subjects), and direct mail generated less than 7% (79 subjects). The total direct cost for recruitment was $40,740, or $679 per randomized patient. The costs per randomization were highest for direct mail ($995 per randomization) and lowest for newspaper/magazine advertising ($558 per randomization). Success of recruitment methods may vary based on target population and location. Planning of recruitment efforts is essential to the success of any clinical trial. Copyright 2010 National University of Health Sciences. Published by Mosby, Inc. All rights reserved.

  1. Do Beginning Teachers Receive Adequate Support from Their Headteachers?

    ERIC Educational Resources Information Center

    Menon, Maria Eliophotou

    2012-01-01

    The article examines the problems faced by beginning teachers in Cyprus and the extent to which headteachers are considered to provide adequate guidance and support to them. Data were collected through interviews with 25 school teachers in Cyprus, who had recently entered teaching (within 1-5 years) in public primary schools. According to the…

  2. Prediction models for clustered data: comparison of a random intercept and standard regression model.

    PubMed

    Bouwmeester, Walter; Twisk, Jos W R; Kappen, Teus H; van Klei, Wilton A; Moons, Karel G M; Vergouwe, Yvonne

    2013-02-15

    When study data are clustered, standard regression analysis is considered inappropriate, and analytical techniques for clustered data need to be used. For prediction research in which interest in predictor effects is at the patient level, random effects regression models are probably preferred over standard regression analysis. It is well known that the random effects parameter estimates and the standard logistic regression parameter estimates differ. Here, we compared random effects and standard logistic regression models for their ability to provide accurate predictions. Using an empirical study on 1642 surgical patients at risk of postoperative nausea and vomiting, who were treated by one of 19 anesthesiologists (clusters), we developed prognostic models with either standard or random intercept logistic regression. The external validity of these models was assessed in new patients from other anesthesiologists. We supported our results with simulation studies using intra-class correlation coefficients (ICC) of 5%, 15%, or 30%. Standard performance measures and measures adapted for the clustered data structure were estimated. The model developed with random effects analysis showed better discrimination than the standard approach if the cluster effects were used for risk prediction (standard c-index of 0.69 versus 0.66). In the external validation set, both models showed similar discrimination (standard c-index 0.68 versus 0.67). The simulation study confirmed these results. For datasets with a high ICC (≥15%), model calibration was adequate in external subjects only if the performance measure used assumed the same data structure as the model development method: standard calibration measures showed good calibration for the model developed with standard regression, while calibration measures adapted to the clustered data structure showed good calibration for the prediction model with random intercept. The models with random intercept discriminate better than the standard model only
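
    Why using the cluster effects improves discrimination can be illustrated with a small simulation (all parameters are hypothetical; this is not the authors' model, and for brevity it scores with the true rather than fitted coefficients):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)

# 20 "anesthesiologists" (clusters), 100 patients each, with cluster-level
# random intercepts b_j and one patient-level predictor x.
n_clusters, n_per = 20, 100
b = rng.normal(0.0, 1.5, size=n_clusters)        # random intercepts
cluster = np.repeat(np.arange(n_clusters), n_per)
x = rng.normal(size=n_clusters * n_per)          # patient-level predictor
logit = 0.5 * x + b[cluster]
y = rng.uniform(size=logit.size) < 1.0 / (1.0 + np.exp(-logit))

# c-index (AUC) when ranking patients by the fixed effect alone versus the
# fixed effect plus the cluster intercept:
auc_marginal = roc_auc_score(y, 0.5 * x)
auc_cluster = roc_auc_score(y, 0.5 * x + b[cluster])
```

    When the between-cluster variance is large relative to the predictor effect, adding the cluster intercept raises the c-index substantially, mirroring the paper's 0.69 versus 0.66 finding in spirit.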

  3. Randomized Controlled Trial of Teaching Methods: Do Classroom Experiments Improve Economic Education in High Schools?

    ERIC Educational Resources Information Center

    Eisenkopf, Gerald; Sulser, Pascal A.

    2016-01-01

    The authors present results from a comprehensive field experiment at Swiss high schools in which they compare the effectiveness of teaching methods in economics. They randomly assigned classes into an experimental and a conventional teaching group, or a control group that received no specific instruction. Both teaching treatments improve economic…

  4. Random regression models using different functions to model test-day milk yield of Brazilian Holstein cows.

    PubMed

    Bignardi, A B; El Faro, L; Torres Júnior, R A A; Cardoso, V L; Machado, P F; Albuquerque, L G

    2011-10-31

    We analyzed 152,145 test-day records from 7317 first lactations of Holstein cows recorded from 1995 to 2003. Our objective was to model variation in test-day milk yield during the first lactation of Holstein cows with random regression models (RRM), using various functions in order to obtain adequate and parsimonious models for the estimation of genetic parameters. Test-day milk yields were grouped into weekly classes of days in milk, ranging from 1 to 44 weeks. The contemporary groups were defined as herd-test-day. The analyses were performed using a single-trait RRM, including the direct additive, permanent environmental and residual random effects. In addition, contemporary group and linear and quadratic effects of the age of the cow at calving were included as fixed effects. The mean trend of milk yield was modeled with a fourth-order orthogonal Legendre polynomial. The additive genetic and permanent environmental covariance functions were estimated by random regression on two parametric functions, Ali and Schaeffer and Wilmink, and on B-spline functions of days in milk. The covariance components and the genetic parameters were estimated by the restricted maximum likelihood method. Results from RRM with parametric and B-spline functions were compared to those from RRM on Legendre polynomials and from a multi-trait analysis using the same data set. Heritability estimates presented similar trends during mid-lactation (13 to 31 weeks) and between week 37 and the end of lactation for all RRM. Heritabilities obtained by multi-trait analysis were of lower magnitude than those estimated by RRM. The RRMs with a higher number of parameters were more useful for describing the genetic variation of test-day milk yield throughout the lactation. RRM using B-spline and Legendre polynomials as base functions appear to be the most adequate for describing the covariance structure of the data.
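
    The use of a Legendre polynomial basis for the mean lactation trend can be sketched as follows, with a Wilmink-type curve as a hypothetical data generator (all parameter values are illustrative, not from the study):

```python
import numpy as np
from numpy.polynomial import legendre

weeks = np.arange(1, 45)                       # 44 weekly days-in-milk classes
t = 2.0 * (weeks - weeks.min()) / (weeks.max() - weeks.min()) - 1.0  # [-1, 1]

# Hypothetical Wilmink-type lactation curve plus noise: y = a + b*exp(-k*w) + c*w
y = 30.0 - 15.0 * np.exp(-0.6 * weeks) - 0.25 * weeks
y = y + np.random.default_rng(1).normal(0.0, 0.3, size=y.size)

coef = legendre.legfit(t, y, deg=4)            # fourth-order Legendre fit
fitted = legendre.legval(t, coef)
rmse = np.sqrt(np.mean((y - fitted) ** 2))
```

    Mapping days in milk onto [-1, 1] before fitting is what makes the Legendre basis orthogonal over the lactation, which keeps the regression well conditioned.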

  5. A Mixed-Methods, Randomized, Controlled Feasibility Trial to Inform the Design of a Phase III Trial to Test the Effect of the Handheld Fan on Physical Activity and Carer Anxiety in Patients With Refractory Breathlessness.

    PubMed

    Johnson, Miriam J; Booth, Sara; Currow, David C; Lam, Lawrence T; Phillips, Jane L

    2016-05-01

    The handheld fan is an inexpensive and safe way to provide facial airflow, which may reduce the sensation of chronic refractory breathlessness, a frequently encountered symptom. To test the feasibility of developing an adequately powered, multicenter, multinational randomized controlled trial comparing the efficacy of a handheld fan and exercise advice with advice alone in increasing activity in people with chronic refractory breathlessness from a variety of medical conditions, measuring recruitment rates, data quality, and potential primary outcome measures. This was a Phase II, multisite, international, parallel, nonblinded, mixed-methods randomized controlled trial. Participants were centrally randomized to fan or control. All received breathlessness self-management/exercise advice and were followed up weekly for four weeks. Participants/carers were invited to participate in a semistructured interview at the study's conclusion. Ninety-seven people were screened, 49 were randomized (mean age 68 years; 49% men), and 43 completed the study. Site recruitment varied from 0.25 to 3.3/month and the screening:randomization ratio from 1.1:1 to 8.5:1. There were few missing data except for the Chronic Obstructive Pulmonary Disease Self-Efficacy Scale (two-thirds of data missing). No harms were observed. Three interview themes emerged: 1) a fan is a helpful self-management strategy, 2) a fan aids recovery, and 3) a symptom control trial was welcome. A definitive, multisite trial to study the use of the handheld fan as part of self-management of chronic refractory breathlessness is feasible. Participants found the fan useful. However, the value of information for changing practice or policy is unlikely to justify the expense of such a trial, given the perceived benefits, the minimal costs, and the absence of harms demonstrated in this study. Copyright © 2016 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.

  6. Region 8: Colorado Canon City Adequate Letter (8/17/2011)

    EPA Pesticide Factsheets

    This May 4, 2011 letter from EPA to Christopher E. Urbina, M.D., M.P.H., Colorado Department of Public Health and Environment, states that EPA has found the Canon City PM10 maintenance plan and the 2020 motor vehicle emissions budget (MVEB) adequate.

  7. Evaluation of Strip Footing Bearing Capacity Built on the Anthropogenic Embankment by Random Finite Element Method

    NASA Astrophysics Data System (ADS)

    Pieczynska-Kozlowska, Joanna

    2014-05-01

    One of the geotechnical problems in the area of Wroclaw is an anthropogenic embankment layer extending to a depth of 4-5 m, which arose as a result of historical incidents. In such a case, estimating the bearing capacity of a strip footing can be difficult. The standard solution is to use a deep foundation or foundation soil replacement; however, both methods generate significant costs. In the present paper the authors focus on the influence of the variability of the anthropogenic embankment on bearing capacity. Soil parameters were defined on the basis of CPT tests and modeled as 2D anisotropic random fields, and the bearing capacity was evaluated with deterministic finite element analyses of the individual realizations. Many repetitions over different realizations of the random fields lead to a stable expected value of the bearing capacity. The algorithm used to estimate the bearing capacity of the strip footing is the random finite element method (e.g. [1]). In the traditional approach to bearing capacity, the formula proposed by [2] is used: qf = c'Nc + qNq + 0.5γBNγ (1) where qf is the ultimate bearing stress, c' is the cohesion, q is the overburden load due to foundation embedment, γ is the soil unit weight, B is the footing width, and Nc, Nq and Nγ are the bearing capacity factors. The evaluation of the bearing capacity of a strip footing by the finite element method involves five parameters: Young's modulus (E), Poisson's ratio (ν), dilation angle (ψ), cohesion (c), and friction angle (φ). In the present study E, ν and ψ are held constant while c and φ are randomized. Although the Young's modulus does not affect the bearing capacity, it governs the initial elastic response of the soil. Plastic stress redistribution is accomplished using a viscoplastic algorithm merged with an elastic perfectly plastic (Mohr-Coulomb) failure criterion. In this paper a typical finite element mesh was assumed, with 8-node elements arranged in 50 columns and 20 rows. Footings width B
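
    The random finite element method itself propagates random fields through a finite element mesh; as a much simpler illustration of the randomized-parameter idea, formula (1) can be evaluated by Monte Carlo over c and φ using the classical closed-form bearing capacity factors (Reissner's Nq, Prandtl's Nc, Vesic's Nγ; an assumption, since the paper computes capacity by finite elements, not closed form, and all distribution parameters below are hypothetical):

```python
import numpy as np

def bearing_capacity_factors(phi_rad):
    # Classical closed-form factors: Reissner's Nq, Prandtl's Nc, Vesic's Ngamma
    nq = np.exp(np.pi * np.tan(phi_rad)) * np.tan(np.pi / 4 + phi_rad / 2) ** 2
    nc = (nq - 1.0) / np.tan(phi_rad)
    ng = 2.0 * (nq + 1.0) * np.tan(phi_rad)
    return nc, nq, ng

rng = np.random.default_rng(7)
n = 10_000
c = rng.lognormal(mean=np.log(10.0), sigma=0.3, size=n)   # cohesion c' [kPa]
phi = np.deg2rad(rng.normal(30.0, 3.0, size=n))           # friction angle [rad]
gamma, q, B = 18.0, 0.0, 1.0   # unit weight [kN/m3], surcharge [kPa], width [m]

nc, nq, ng = bearing_capacity_factors(phi)
qf = c * nc + q * nq + 0.5 * gamma * B * ng    # formula (1), one value per sample
qf_mean, qf_p05 = qf.mean(), np.percentile(qf, 5)   # mean and a low fractile
```

    As in the paper, repeated sampling stabilizes the expected bearing capacity, and a low fractile of the Monte Carlo distribution gives a first feel for the design margin that parameter variability consumes.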

  8. A comparison of methods for estimating the random effects distribution of a linear mixed model.

    PubMed

    Ghidey, Wendimagegn; Lesaffre, Emmanuel; Verbeke, Geert

    2010-12-01

    This article reviews various recently suggested approaches to estimate the random effects distribution in a linear mixed model, i.e., (1) the smoothing-by-roughening approach of Shen and Louis (1), (2) the semi-nonparametric approach of Zhang and Davidian (2), (3) the heterogeneity model of Verbeke and Lesaffre (3), and (4) the flexible approach of Ghidey et al. (4) These four approaches are compared via an extensive simulation study. We conclude that, for the cases considered, the approach of Ghidey et al. (4) often has the smallest integrated mean squared error for estimating the random effects distribution. An analysis of a longitudinal dental data set illustrates the performance of the methods in a practical example.

  9. Using Fuzzy Logic to Identify Schools Which May Be Misclassified by the No Child Left Behind Adequate Yearly Progress Policy

    ERIC Educational Resources Information Center

    Yates, Donald W.

    2009-01-01

    This investigation developed, tested, and prototyped a Fuzzy Inference System (FIS) that would assist decision makers in identifying schools that may have been misclassified by existing Adequate Yearly Progress (AYP) methods. This prototype was then used to evaluate Louisiana elementary schools using published school data for Academic Year 2004. …

  10. Do Implant Overdentures Improve Dietary Intake? A Randomized Clinical Trial

    PubMed Central

    Hamdan, N.M.; Gray-Donald, K.; Awad, M.A.; Johnson-Down, L.; Wollin, S.; Feine, J.S.

    2013-01-01

    People wearing mandibular two-implant overdentures (IOD) chew food with less difficulty than those wearing conventional complete dentures (CD). However, there is still controversy over whether or not this results in better dietary intake. In this randomized clinical trial (RCT), the amounts of total dietary fiber (TDF), macronutrients, 9 micronutrients, and energy in the diets consumed by persons with IOD and CD were compared. Male and female edentate patients ≥ 65 yrs (n = 255) were randomly divided into 2 groups and assigned to receive a maxillary CD and either a mandibular IOD or a CD. One year following prosthesis delivery, 217 participants (CD = 114, IOD = 103) reported the foods and quantities they consumed to a registered dietician through a standard 24-hour dietary recall method. The mean and median values of TDF, macro- and micronutrients, and energy consumed by both groups were calculated and compared analytically. No significant between-group differences were found (all p > .05). Despite quality-of-life benefits from IODs, this adequately powered study reveals no evidence of nutritional advantages in dietary intake at one year following prosthesis delivery for independently living, medically healthy edentate elders wearing two-implant mandibular overdentures over those wearing conventional complete dentures (International Clinical Trials ISRCTN24273915). PMID:24158335

  11. Generation of pseudo-random numbers

    NASA Technical Reports Server (NTRS)

    Howell, L. W.; Rheinfurth, M. H.

    1982-01-01

    Practical methods for generating acceptable random numbers from a variety of probability distributions which are frequently encountered in engineering applications are described. The speed, accuracy, and guarantee of statistical randomness of the various methods are discussed.
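    One of the standard practical methods surveyed in reports of this kind is inverse-transform sampling; the sketch below applies it to the exponential distribution. It is illustrative only and not necessarily one of the report's own methods.

```python
import math
import random

def exponential_inverse_transform(lam, n, seed=0):
    """Draw n samples from Exp(lam) by inverting the CDF:
    F(x) = 1 - exp(-lam * x)  =>  x = -ln(1 - u) / lam,  u ~ Uniform(0, 1)."""
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / lam for _ in range(n)]

samples = exponential_inverse_transform(lam=2.0, n=100_000)
mean = sum(samples) / len(samples)  # should approach 1/lam = 0.5
```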

  12. Calculation of the Cost of an Adequate Education in Kentucky: A Professional Judgment Approach

    ERIC Educational Resources Information Center

    Verstegen, Deborah A.

    2004-01-01

    What is an adequate education and how much does it cost? In 1989, Kentucky's State Supreme Court found the entire system of education unconstitutional--"all of its parts and parcels". The Court called for all children to have access to an adequate education, one that is uniform and has as its goal the development of seven capacities,…

  13. The Method of Randomization for Cluster-Randomized Trials: Challenges of Including Patients with Multiple Chronic Conditions

    PubMed Central

    Esserman, Denise; Allore, Heather G.; Travison, Thomas G.

    2016-01-01

    Cluster-randomized clinical trials (CRT) are trials in which the unit of randomization is not a participant but a group (e.g., healthcare systems or community centers). They are suitable when the intervention applies naturally to the cluster (e.g., healthcare policy); when lack of independence among participants may occur (e.g., nursing home hygiene); or when it is most ethical to apply an intervention to all within a group (e.g., school-level immunization). Because participants in the same cluster receive the same intervention, CRT may approximate clinical practice and may produce generalizable findings. However, when not properly designed or interpreted, CRT may produce biased results. CRT designs have features that add complexity to statistical estimation and inference. Chief among these is the cluster-level correlation in response measurements induced by the randomization. A critical consideration is the experimental unit of inference; often it is desirable to consider intervention effects at the level of the individual rather than the cluster. Finally, given that the number of clusters available may be limited, simple forms of randomization may not achieve balance between intervention and control arms at either the cluster or participant level. In non-clustered clinical trials, balance of key factors may be easier to achieve because the sample can be made homogeneous by exclusion of participants with multiple chronic conditions (MCC). CRTs, which are often pragmatic, may eschew such restrictions. Failure to account for imbalance may induce bias and reduce validity. This article focuses on the complexities of randomization in the design of CRTs, such as the inclusion of patients with MCC, and imbalances in covariate factors across clusters. PMID:27478520
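    One common remedy for cluster-level imbalance in this literature is covariate-constrained randomization. The sketch below is a hedged illustration with hypothetical prevalence values and a simple best-of-many-candidates search, not the article's own procedure.

```python
import random

def constrained_cluster_randomization(cluster_covariate, n_candidates=1000, seed=1):
    """Covariate-constrained randomization sketch: among many candidate
    half-and-half splits of clusters into intervention and control arms,
    keep the split minimizing the between-arm difference in a cluster-level
    covariate (here, the fraction of patients with MCC)."""
    rng = random.Random(seed)
    values = list(cluster_covariate)
    n = len(values)
    best_split, best_diff = None, float("inf")
    for _ in range(n_candidates):
        idx = list(range(n))
        rng.shuffle(idx)
        arm_a, arm_b = idx[: n // 2], idx[n // 2 :]
        mean_a = sum(values[i] for i in arm_a) / len(arm_a)
        mean_b = sum(values[i] for i in arm_b) / len(arm_b)
        diff = abs(mean_a - mean_b)
        if diff < best_diff:
            best_split, best_diff = (sorted(arm_a), sorted(arm_b)), diff
    return best_split, best_diff

# 10 clusters with hypothetical MCC prevalences
prev = [0.10, 0.15, 0.20, 0.25, 0.30, 0.35, 0.40, 0.45, 0.50, 0.55]
(arm_a, arm_b), diff = constrained_cluster_randomization(prev)
```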

  14. Randomized Comparison of 3 Methods to Screen for Domestic Violence in Family Practice

    PubMed Central

    Chen, Ping-Hsin; Rovi, Sue; Washington, Judy; Jacobs, Abbie; Vega, Marielos; Pan, Ko-Yu; Johnson, Mark S.

    2007-01-01

    PURPOSE We undertook a study to compare 3 ways of administering brief domestic violence screening questionnaires: self-administered questionnaire, medical staff interview, and physician interview. METHODS We conducted a randomized trial of 3 screening protocols for domestic violence in 4 urban family medicine practices with mostly minority patients. We randomly assigned 523 female patients, aged 18 years or older and currently involved with a partner, to 1 of 3 screening protocols. Each included 2 brief screening tools: HITS and WAST-Short. Outcome measures were domestic violence disclosure, patient and clinician comfort with the screening, and time spent screening. RESULTS Overall prevalence of domestic violence was 14%. Most patients (93.4%) and clinicians (84.5%) were comfortable with the screening questions and method of administering them. Average time spent screening was 4.4 minutes. Disclosure rates, patient and clinician comfort with screening, and time spent screening were similar among the 3 protocols. In addition, WAST-Short was validated in this sample of minority women by comparison with HITS and with the 8-item WAST. CONCLUSIONS Domestic violence is common, and we found that most patients and clinicians are comfortable with domestic violence screening in urban family medicine settings. Patient self-administered domestic violence screening is as effective as clinician interview in terms of disclosure, comfort, and time spent screening. PMID:17893385

  15. A Randomized Phase II Dose-Response Exercise Trial among Colon Cancer Survivors: Purpose, Study Design, Methods, and Recruitment Results

    PubMed Central

    Brown, Justin C.; Troxel, Andrea B.; Ky, Bonnie; Damjanov, Nevena; Zemel, Babette S.; Rickels, Michael R.; Rhim, Andrew D.; Rustgi, Anil K.; Courneya, Kerry S.; Schmitz, Kathryn H.

    2016-01-01

    Background Observational studies indicate that higher volumes of physical activity are associated with improved disease outcomes among colon cancer survivors. The aim of this report is to describe the purpose, study design, methods, and recruitment results of the COURAGE trial, a National Cancer Institute (NCI) sponsored, phase II, randomized, dose-response exercise trial among colon cancer survivors. Methods/Results The primary objective of the COURAGE trial is to quantify the feasibility, safety, and physiologic effects of low-dose (150 min·wk−1) and high-dose (300 min·wk−1) moderate-intensity aerobic exercise compared with a usual-care control group over six months. The exercise groups are provided with in-home treadmills and heart rate monitors. Between January and July 2015, 1,433 letters were mailed using a population-based state cancer registry; 126 colon cancer survivors inquired about participation, and 39 were randomized onto the study protocol. Age was associated with inquiry about study participation (P<0.001) and randomization onto the study protocol (P<0.001). No other demographic, clinical, or geographic characteristics were associated with study inquiry or randomization. The final trial participant was randomized in August 2015. Six-month endpoint data collection was completed in February 2016. Discussion The recruitment of colon cancer survivors into an exercise trial is feasible. The findings from this trial will inform key design aspects for future phase 2 and phase 3 randomized controlled trials to examine the efficacy of exercise to improve clinical outcomes among colon cancer survivors. PMID:26970181

  16. 42 CFR 438.207 - Assurances of adequate capacity and services.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS MANAGED CARE Quality Assessment and Performance Improvement Access Standards § 438.207 Assurances of adequate capacity and services. (a) Basic rule. The State... with the State's requirements for availability of services, as set forth in § 438.206. (e) CMS' right...

  17. Autonomous Byte Stream Randomizer

    NASA Technical Reports Server (NTRS)

    Paloulian, George K.; Woo, Simon S.; Chow, Edward T.

    2013-01-01

    Net-centric networking environments are often faced with limited resources and must utilize bandwidth as efficiently as possible. In networking environments that span wide areas, data transmission has to be efficient, without any redundant or extraneous metadata. The Autonomous Byte Stream Randomizer software provides an extra level of security on top of existing data encryption methods. Randomizing the data's byte stream adds an extra layer to existing data protection methods, thus making it harder for an attacker to decrypt protected data. Based on a generated cryptographically secure random seed, a random sequence of numbers is used to intelligently and efficiently swap the organization of bytes in the data, using the unbiased and memory-efficient in-place Fisher-Yates shuffle method. Swapping bytes and reorganizing the crucial structure of the byte data renders the data file unreadable and leaves the data in a deconstructed state. This deconstruction adds an extra level of security, requiring the byte stream to be reconstructed with the random seed in order to be readable. Once the data byte stream has been randomized, the software enables the data to be distributed to N nodes in an environment. Each piece of the data in randomized and distributed form is a separate entity, unreadable in its own right, but when combined with all N pieces, can be reconstructed into the original. Reconstruction requires possession of the key used for randomizing the bytes, which leads to the generation of the same cryptographically secure random sequence of numbers used to randomize the data. A cornerstone capability of this software is its ability to generate the same cryptographically secure sequence on different machines and at different times, allowing it to be used more heavily in net-centric environments where data transfer bandwidth is limited.
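    The shuffle-and-reconstruct mechanism can be sketched as follows. This illustration uses Python's Mersenne Twister for clarity; the actual software relies on a cryptographically secure generator, so the sketch shows the mechanics only.

```python
import random

def randomize_bytes(data: bytes, seed: int) -> bytes:
    """In-place Fisher-Yates shuffle of the byte stream, driven by a seeded
    pseudo-random sequence (illustrative; not a secure generator)."""
    buf = bytearray(data)
    rng = random.Random(seed)
    for i in range(len(buf) - 1, 0, -1):
        j = rng.randrange(i + 1)
        buf[i], buf[j] = buf[j], buf[i]
    return bytes(buf)

def derandomize_bytes(data: bytes, seed: int) -> bytes:
    """Invert the shuffle: replay the same random sequence to recover the
    swap list, then undo the swaps in reverse order."""
    buf = bytearray(data)
    rng = random.Random(seed)
    swaps = [(i, rng.randrange(i + 1)) for i in range(len(buf) - 1, 0, -1)]
    for i, j in reversed(swaps):
        buf[i], buf[j] = buf[j], buf[i]
    return bytes(buf)

msg = b"net-centric data"
scrambled = randomize_bytes(msg, seed=42)
restored = derandomize_bytes(scrambled, seed=42)
```

    Because each swap is its own inverse, replaying the identical seeded sequence and applying the swaps backwards restores the original byte order, which is exactly why possession of the seed is required for reconstruction.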

  18. The REFLECT statement: methods and processes of creating reporting guidelines for randomized controlled trials for livestock and food safety.

    PubMed

    O'Connor, A M; Sargeant, J M; Gardner, I A; Dickson, J S; Torrence, M E; Dewey, C E; Dohoo, I R; Evans, R B; Gray, J T; Greiner, M; Keefe, G; Lefebvre, S L; Morley, P S; Ramirez, A; Sischo, W; Smith, D R; Snedeker, K; Sofos, J; Ward, M P; Wills, R

    2010-01-01

    The conduct of randomized controlled trials in livestock with production, health, and food-safety outcomes presents unique challenges that may not be adequately reported in trial reports. The objective of this project was to modify the CONSORT (Consolidated Standards of Reporting Trials) statement to reflect the unique aspects of reporting these livestock trials. A two-day consensus meeting was held on November 18-19, 2008 in Chicago, IL, United States of America, to achieve the objective. Prior to the meeting, a Web-based survey was conducted to identify issues for discussion. The 24 attendees were biostatisticians, epidemiologists, food-safety researchers, livestock-production specialists, journal editors, assistant editors, and associate editors. Prior to the meeting, the attendees completed a Web-based survey indicating which CONSORT statement items may need to be modified to address unique issues for livestock trials. The consensus meeting resulted in the production of the REFLECT (Reporting Guidelines For Randomized Control Trials) statement for livestock and food safety (LFS) and 22-item checklist. Fourteen items were modified from the CONSORT checklist, and an additional sub-item was proposed to address challenge trials. The REFLECT statement proposes new terminology, more consistent with common usage in livestock production, to describe study subjects. Evidence was not always available to support modification to or inclusion of an item. The use of the REFLECT statement, which addresses issues unique to livestock trials, should improve the quality of reporting and design for trials reporting production, health, and food-safety outcomes.

  19. The REFLECT statement: methods and processes of creating reporting guidelines for randomized controlled trials for livestock and food safety.

    PubMed

    O'Connor, A M; Sargeant, J M; Gardner, I A; Dickson, J S; Torrence, M E; Dewey, C E; Dohoo, I R; Evans, R B; Gray, J T; Greiner, M; Keefe, G; Lefebvre, S L; Morley, P S; Ramirez, A; Sischo, W; Smith, D R; Snedeker, K; Sofos, J; Ward, M P; Wills, R

    2010-01-01

    The conduct of randomized controlled trials in livestock with production, health, and food-safety outcomes presents unique challenges that might not be adequately reported in trial reports. The objective of this project was to modify the CONSORT (Consolidated Standards of Reporting Trials) statement to reflect the unique aspects of reporting these livestock trials. A 2-day consensus meeting was held on November 18-19, 2008 in Chicago, IL, to achieve the objective. Before the meeting, a Web-based survey was conducted to identify issues for discussion. The 24 attendees were biostatisticians, epidemiologists, food-safety researchers, livestock production specialists, journal editors, assistant editors, and associate editors. Before the meeting, the attendees completed a Web-based survey indicating which CONSORT statement items would need to be modified to address unique issues for livestock trials. The consensus meeting resulted in the production of the REFLECT (Reporting Guidelines for Randomized Control Trials) statement for livestock and food safety and 22-item checklist. Fourteen items were modified from the CONSORT checklist, and an additional subitem was proposed to address challenge trials. The REFLECT statement proposes new terminology, more consistent with common usage in livestock production, to describe study subjects. Evidence was not always available to support modification to or inclusion of an item. The use of the REFLECT statement, which addresses issues unique to livestock trials, should improve the quality of reporting and design for trials reporting production, health, and food-safety outcomes.

  20. The REFLECT statement: methods and processes of creating reporting guidelines for randomized controlled trials for livestock and food safety.

    PubMed

    O'Connor, A M; Sargeant, J M; Gardner, I A; Dickson, J S; Torrence, M E; Dewey, C E; Dohoo, I R; Evans, R B; Gray, J T; Greiner, M; Keefe, G; Lefebvre, S L; Morley, P S; Ramirez, A; Sischo, W; Smith, D R; Snedeker, K; Sofos, J N; Ward, M P; Wills, R

    2010-01-01

    The conduct of randomized controlled trials in livestock with production, health, and food-safety outcomes presents unique challenges that may not be adequately reported in trial reports. The objective of this project was to modify the CONSORT (Consolidated Standards of Reporting Trials) statement to reflect the unique aspects of reporting these livestock trials. A two-day consensus meeting was held on November 18-19, 2008 in Chicago, Ill, United States of America, to achieve the objective. Prior to the meeting, a Web-based survey was conducted to identify issues for discussion. The 24 attendees were biostatisticians, epidemiologists, food-safety researchers, livestock production specialists, journal editors, assistant editors, and associate editors. Prior to the meeting, the attendees completed a Web-based survey indicating which CONSORT statement items may need to be modified to address unique issues for livestock trials. The consensus meeting resulted in the production of the REFLECT (Reporting Guidelines for Randomized Control Trials) statement for livestock and food safety (LFS) and 22-item checklist. Fourteen items were modified from the CONSORT checklist, and an additional sub-item was proposed to address challenge trials. The REFLECT statement proposes new terminology, more consistent with common usage in livestock production, to describe study subjects. Evidence was not always available to support modification to or inclusion of an item. The use of the REFLECT statement, which addresses issues unique to livestock trials, should improve the quality of reporting and design for trials reporting production, health, and food-safety outcomes.

  1. Army General Fund Adjustments Not Adequately Documented or Supported

    DTIC Science & Technology

    2016-07-26

    compilation process. Finding The Office of the Assistant Secretary of the Army (Financial Management & Comptroller) (OASA[FM&C]) and the Defense Finance and...statements were unreliable and lacked an adequate audit trail. Furthermore, DoD and Army managers could not rely on the data in their accounting...systems when making management and resource decisions. Until the Army and DFAS Indianapolis correct these control deficiencies, there is considerable

  2. Echocardiographic Methods, Quality Review, and Measurement Accuracy in a Randomized Multicenter Clinical Trial of Marfan Syndrome

    PubMed Central

    Selamet Tierney, Elif Seda; Levine, Jami C.; Chen, Shan; Bradley, Timothy J.; Pearson, Gail D.; Colan, Steven D.; Sleeper, Lynn A.; Campbell, M. Jay; Cohen, Meryl S.; Backer, Julie De; Guey, Lin T.; Heydarian, Haleh; Lai, Wyman W.; Lewin, Mark B.; Marcus, Edward; Mart, Christopher R.; Pignatelli, Ricardo H.; Printz, Beth F.; Sharkey, Angela M.; Shirali, Girish S.; Srivastava, Shubhika; Lacro, Ronald V.

    2013-01-01

    Background The Pediatric Heart Network is conducting a large international randomized trial to compare aortic root growth and other cardiovascular outcomes in 608 subjects with Marfan syndrome randomized to receive atenolol or losartan for 3 years. The authors report here the echocardiographic methods and baseline echocardiographic characteristics of the randomized subjects, describe the interobserver agreement of aortic measurements, and identify factors influencing agreement. Methods Individuals aged 6 months to 25 years who met the original Ghent criteria and had body surface area–adjusted maximum aortic root diameter (ROOTmax) Z scores > 3 were eligible for inclusion. The primary outcome measure for the trial is the change over time in ROOTmax Z score. A detailed echocardiographic protocol was established and implemented across 22 centers, with an extensive training and quality review process. Results Interobserver agreement for the aortic measurements was excellent, with intraclass correlation coefficients ranging from 0.921 to 0.989. Lower interobserver percentage error in ROOTmax measurements was independently associated (model R2 = 0.15) with better image quality (P = .002) and later study reading date (P < .001). Echocardiographic characteristics of the randomized subjects did not differ by treatment arm. Subjects with ROOTmax Z scores ≥ 4.5 (36%) were more likely to have mitral valve prolapse and dilation of the main pulmonary artery and left ventricle, but there were no differences in aortic regurgitation, aortic stiffness indices, mitral regurgitation, or left ventricular function compared with subjects with ROOTmax Z scores < 4.5. Conclusions The echocardiographic methodology, training, and quality review process resulted in a robust evaluation of aortic root dimensions, with excellent reproducibility. PMID:23582510
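    Interobserver agreement of the kind reported here is quantified with intraclass correlation coefficients. The sketch below computes a one-way random-effects ICC(1,1) on hypothetical paired aortic measurements; the trial may have used a different ICC variant.

```python
def icc_oneway(scores):
    """One-way random-effects ICC(1,1) from a table of measurements:
    scores[i][j] = reading j on subject i, with k readings per subject.
    ICC = (MSB - MSW) / (MSB + (k - 1) * MSW)."""
    n = len(scores)
    k = len(scores[0])
    grand = sum(sum(row) for row in scores) / (n * k)
    row_means = [sum(row) / k for row in scores]
    ssb = k * sum((m - grand) ** 2 for m in row_means)
    ssw = sum((x - m) ** 2 for row, m in zip(scores, row_means) for x in row)
    msb = ssb / (n - 1)
    msw = ssw / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical aortic root diameters (mm), two observers per subject
readings = [[31.0, 31.4], [35.2, 35.0], [28.1, 28.5], [40.3, 40.0], [33.6, 33.9]]
icc = icc_oneway(readings)  # close agreement between observers => ICC near 1
```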

  3. A Method of Reducing Random Drift in the Combined Signal of an Array of Inertial Sensors

    DTIC Science & Technology

    2015-09-30

    stability of the collective output, Bayard et al, US Patent 6,882,964. The prior art methods rely upon the use of Kalman filtering and averaging...including scale-factor errors, quantization effects, temperature effects, random drift, and additive noise. A comprehensive account of all of these
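    As background to why combining an array of sensors helps, straight averaging of N independent sensors reduces the noise standard deviation by roughly sqrt(N). The sketch below illustrates only that baseline effect; it is not the patented Kalman-filter method described in the record.

```python
import random
import statistics

def simulate_sensor_array(n_sensors, n_samples, noise_sd=1.0, seed=7):
    """Each sensor reports the true signal (0.0 here) plus independent
    Gaussian noise; the combined output is the per-sample average."""
    rng = random.Random(seed)
    combined = []
    for _ in range(n_samples):
        readings = [rng.gauss(0.0, noise_sd) for _ in range(n_sensors)]
        combined.append(sum(readings) / n_sensors)
    return combined

single = simulate_sensor_array(1, 20_000)
array16 = simulate_sensor_array(16, 20_000)
ratio = statistics.pstdev(single) / statistics.pstdev(array16)  # ~ sqrt(16) = 4
```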

  4. MendelianRandomization: an R package for performing Mendelian randomization analyses using summarized data.

    PubMed

    Yavorska, Olena O; Burgess, Stephen

    2017-12-01

    MendelianRandomization is a software package for the R open-source software environment that performs Mendelian randomization analyses using summarized data. The core functionality is to implement the inverse-variance weighted, MR-Egger and weighted median methods for multiple genetic variants. Several options are available to the user, such as the use of robust regression, fixed- or random-effects models and the penalization of weights for genetic variants with heterogeneous causal estimates. Extensions to these methods, such as allowing for variants to be correlated, can be chosen if appropriate. Graphical commands allow summarized data to be displayed in an interactive graph, or the plotting of causal estimates from multiple methods, for comparison. Although the main method of data entry is directly by the user, there is also an option for allowing summarized data to be incorporated from the PhenoScanner database of genotype-phenotype associations. We hope to develop this feature in future versions of the package. The R software environment is available for download from [https://www.r-project.org/]. The MendelianRandomization package can be downloaded from the Comprehensive R Archive Network (CRAN) within R, or directly from [https://cran.r-project.org/web/packages/MendelianRandomization/]. Both R and the MendelianRandomization package are released under GNU General Public Licenses (GPL-2|GPL-3). © The Author 2017. Published by Oxford University Press on behalf of the International Epidemiological Association.
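    The package itself is written in R, but the core fixed-effect inverse-variance weighted estimator it implements is simple enough to sketch in Python from per-variant summary statistics. The numbers below are hypothetical, and the sketch omits the package's robust, random-effects, and penalized options.

```python
import math

def ivw_estimate(beta_x, beta_y, se_y):
    """Fixed-effect inverse-variance weighted Mendelian randomization
    estimate from summary statistics: per-variant gene-exposure effects
    beta_x, gene-outcome effects beta_y, and outcome standard errors se_y."""
    w = [1.0 / s ** 2 for s in se_y]
    num = sum(wi * bx * by for wi, bx, by in zip(w, beta_x, beta_y))
    den = sum(wi * bx ** 2 for wi, bx in zip(w, beta_x))
    theta = num / den          # causal effect estimate
    se = math.sqrt(1.0 / den)  # its standard error
    return theta, se

# Hypothetical summary data for 3 variants with a true causal effect of 0.5
bx = [0.20, 0.35, 0.50]
by = [0.10, 0.175, 0.25]
sey = [0.02, 0.03, 0.02]
theta, se = ivw_estimate(bx, by, sey)
```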

  5. Bayesian network meta-analysis for cluster randomized trials with binary outcomes.

    PubMed

    Uhlmann, Lorenz; Jensen, Katrin; Kieser, Meinhard

    2017-06-01

    Network meta-analysis is becoming a common approach to combine direct and indirect comparisons of several treatment arms. In recent research, there have been various developments and extensions of the standard methodology. Simultaneously, cluster randomized trials are experiencing increased popularity, especially in the field of health services research, where, for example, medical practices are the units of randomization but the outcome is measured at the patient level. Combining the results of cluster randomized trials is challenging. In this tutorial, we examine and compare different approaches for the incorporation of cluster randomized trials in a (network) meta-analysis. Furthermore, we provide practical insight on the implementation of the models. In simulation studies, it is shown that some of the examined approaches lead to unsatisfactory results. However, there are alternatives which are suitable for combining cluster randomized trials in a network meta-analysis, as they are unbiased and reach accurate coverage rates. In conclusion, the methodology can be extended in such a way that an adequate inclusion of the results obtained in cluster randomized trials becomes feasible. Copyright © 2016 John Wiley & Sons, Ltd.
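    A common simple approach for incorporating a cluster randomized trial into a meta-analysis, and one baseline against which such tutorials compare, is to deflate the sample size by the design effect. The sketch below uses hypothetical numbers.

```python
def effective_sample_size(n_patients, cluster_size, icc):
    """Approximate adjustment of a cluster randomized trial for inclusion
    alongside individually randomized trials: divide the sample size by the
    design effect DE = 1 + (m - 1) * ICC, where m is the average cluster
    size and ICC the intracluster correlation coefficient."""
    design_effect = 1.0 + (cluster_size - 1.0) * icc
    return n_patients / design_effect

# 1,200 patients in practices of average size 30 with ICC = 0.05:
# DE = 1 + 29 * 0.05 = 2.45, so the trial counts as ~490 patients
n_eff = effective_sample_size(1200, 30, 0.05)
```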

  6. 9 CFR 2.33 - Attending veterinarian and adequate veterinary care.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... animal health, behavior, and well-being is conveyed to the attending veterinarian; (4) Guidance to... 9 Animals and Animal Products 1 2013-01-01 2013-01-01 false Attending veterinarian and adequate veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE...

  7. 9 CFR 2.33 - Attending veterinarian and adequate veterinary care.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... animal health, behavior, and well-being is conveyed to the attending veterinarian; (4) Guidance to... 9 Animals and Animal Products 1 2012-01-01 2012-01-01 false Attending veterinarian and adequate veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE...

  8. 9 CFR 2.33 - Attending veterinarian and adequate veterinary care.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... animal health, behavior, and well-being is conveyed to the attending veterinarian; (4) Guidance to... 9 Animals and Animal Products 1 2014-01-01 2014-01-01 false Attending veterinarian and adequate veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE...

  9. 12 CFR 1229.5 - Capital distributions for adequately capitalized Banks.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... CAPITAL CLASSIFICATIONS AND PROMPT CORRECTIVE ACTION Federal Home Loan Banks § 1229.5 Capital... classification of adequately capitalized. A Bank may not make a capital distribution if such distribution would... redeem its shares of stock if the transaction is made in connection with the issuance of additional Bank...

  10. Region 10: Idaho Northern Ada County Adequate Letter (6/21/2013)

    EPA Pesticide Factsheets

    EPA approves motor vehicle emissions budget in the Northern Ada County PM10 State Implementation Plan, Maintenance Plan: Ten-Year Update for PM10 national ambient air quality standard, adequate for transportation conformity purposes.

  11. Cognitive Attributes of Adequate and Inadequate Responders to Reading Intervention in Middle School

    PubMed Central

    Miciak, Jeremy; Stuebing, Karla K.; Vaughn, Sharon; Roberts, Greg; Barth, Amy Elizabeth; Fletcher, Jack M.

    2016-01-01

    No studies have investigated the cognitive attributes of middle school students who are adequate and inadequate responders to Tier 2 reading intervention. We compared students in Grades 6 and 7 representing groups of adequate responders (n = 77) and inadequate responders who fell below criteria in (a) comprehension (n = 54); (b) fluency (n = 45); and (c) decoding, fluency, and comprehension (DFC; n = 45). These students received measures of phonological awareness, listening comprehension, rapid naming, processing speed, verbal knowledge, and nonverbal reasoning. Multivariate comparisons showed a significant Group-by-Task interaction: the comprehension-impaired group demonstrated primary difficulties with verbal knowledge and listening comprehension, the DFC group with phonological awareness, and the fluency-impaired group with phonological awareness and rapid naming. A series of regression models investigating whether responder status explained unique variation in cognitive skills yielded largely null results consistent with a continuum of severity associated with level of reading impairment, with no evidence for qualitative differences in the cognitive attributes of adequate and inadequate responders. PMID:28579668

  12. Cognitive Attributes of Adequate and Inadequate Responders to Reading Intervention in Middle School.

    PubMed

    Miciak, Jeremy; Stuebing, Karla K; Vaughn, Sharon; Roberts, Greg; Barth, Amy Elizabeth; Fletcher, Jack M

    2014-12-01

    No studies have investigated the cognitive attributes of middle school students who are adequate and inadequate responders to Tier 2 reading intervention. We compared students in Grades 6 and 7 representing groups of adequate responders ( n = 77) and inadequate responders who fell below criteria in (a) comprehension ( n = 54); (b) fluency ( n = 45); and (c) decoding, fluency, and comprehension (DFC; n = 45). These students received measures of phonological awareness, listening comprehension, rapid naming, processing speed, verbal knowledge, and nonverbal reasoning. Multivariate comparisons showed a significant Group-by-Task interaction: the comprehension-impaired group demonstrated primary difficulties with verbal knowledge and listening comprehension, the DFC group with phonological awareness, and the fluency-impaired group with phonological awareness and rapid naming. A series of regression models investigating whether responder status explained unique variation in cognitive skills yielded largely null results consistent with a continuum of severity associated with level of reading impairment, with no evidence for qualitative differences in the cognitive attributes of adequate and inadequate responders.

  13. Calibrating random forests for probability estimation.

    PubMed

    Dankowski, Theresa; Ziegler, Andreas

    2016-09-30

    Probabilities can be consistently estimated using random forests. It is, however, unclear how random forests should be updated to make predictions for other centers or at different time points. In this work, we present two approaches for updating random forests for probability estimation. The first method has been proposed by Elkan and may be used for updating any machine learning approach yielding consistent probabilities, so-called probability machines. The second approach is a new strategy specifically developed for random forests. Using the terminal nodes, which represent conditional probabilities, the random forest is first translated to logistic regression models. These are, in turn, used for re-calibration. The two updating strategies were compared in a simulation study and are illustrated with data from the German Stroke Study Collaboration. In most simulation scenarios, both methods led to similar improvements. In the simulation scenario in which the stricter assumptions of Elkan's method were not met, the logistic regression-based re-calibration approach for random forests outperformed Elkan's method. It also performed better on the stroke data than Elkan's method. The strength of Elkan's method is its general applicability to any probability machine. However, if the strict assumptions underlying this approach are not met, the logistic regression-based approach is preferable for updating random forests for probability estimation. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
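    The terminal-node translation described in the abstract is specific to the paper. As a simplified stand-in, the general idea of re-calibrating forest probabilities with a logistic model can be sketched with scikit-learn: synthetic data and a Platt-style update on logit-transformed probabilities rather than the authors' node-level method.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss
from sklearn.model_selection import train_test_split

# Synthetic stand-in for "old center" training data and "new center" data.
X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
X_old, X_new, y_old, y_new = train_test_split(X, y, test_size=0.5, random_state=0)

forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_old, y_old)

# Re-calibrate: fit a logistic model on the logit of the forest probabilities
# using the new center's outcomes (a generic update, not the paper's method).
p = np.clip(forest.predict_proba(X_new)[:, 1], 1e-6, 1 - 1e-6)
logit = np.log(p / (1 - p)).reshape(-1, 1)
recalibrator = LogisticRegression().fit(logit, y_new)
p_updated = recalibrator.predict_proba(logit)[:, 1]

loss_before = log_loss(y_new, p)
loss_after = log_loss(y_new, p_updated)  # re-calibration cannot do much worse
```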

  14. 42 CFR 413.24 - Adequate cost data and cost finding.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... familiar with the laws and regulations regarding the provision of health care services, and that the... 42 Public Health 2 2013-10-01 2013-10-01 false Adequate cost data and cost finding. 413.24 Section 413.24 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES...

  15. 42 CFR 413.24 - Adequate cost data and cost finding.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... familiar with the laws and regulations regarding the provision of health care services, and that the... 42 Public Health 2 2012-10-01 2012-10-01 false Adequate cost data and cost finding. 413.24 Section 413.24 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES...

  16. 42 CFR 438.207 - Assurances of adequate capacity and services.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... enrollment in its service area in accordance with the State's standards for access to care under this subpart... 42 Public Health 4 2014-10-01 2014-10-01 false Assurances of adequate capacity and services. 438.207 Section 438.207 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND...

  17. 42 CFR 438.207 - Assurances of adequate capacity and services.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... enrollment in its service area in accordance with the State's standards for access to care under this subpart... 42 Public Health 4 2011-10-01 2011-10-01 false Assurances of adequate capacity and services. 438.207 Section 438.207 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND...

  18. 42 CFR 438.207 - Assurances of adequate capacity and services.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... enrollment in its service area in accordance with the State's standards for access to care under this subpart... 42 Public Health 4 2012-10-01 2012-10-01 false Assurances of adequate capacity and services. 438.207 Section 438.207 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND...

  19. A stochastic simulation method for the assessment of resistive random access memory retention reliability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berco, Dan, E-mail: danny.barkan@gmail.com; Tseng, Tseung-Yuen, E-mail: tseng@cc.nctu.edu.tw

    This study presents an evaluation method for resistive random access memory retention reliability based on the Metropolis Monte Carlo algorithm and Gibbs free energy. The method, which does not rely on a time evolution, provides an extremely efficient way to compare the relative retention properties of metal-insulator-metal structures. It requires a small number of iterations and may be used for statistical analysis. The presented approach is used to compare the relative robustness of a single layer ZrO{sub 2} device with a double layer ZnO/ZrO{sub 2} one, and obtain results which are in good agreement with experimental data.
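    The abstract does not specify the energy model, but the Metropolis Monte Carlo core of the described evaluation can be sketched generically. Everything below (the `acceptance_fraction` aggregate, the parameter values) is hypothetical illustration, not the authors' implementation:

```python
import math
import random

def metropolis_accept(delta_g, kT, rng=random):
    """Metropolis rule: always accept a move that lowers the Gibbs free
    energy; otherwise accept with probability exp(-delta_g / kT)."""
    if delta_g <= 0.0:
        return True
    return rng.random() < math.exp(-delta_g / kT)

def acceptance_fraction(delta_gs, kT, n_iter=10_000, seed=0):
    """Fraction of accepted moves over n_iter random proposals drawn
    from a list of candidate free-energy changes. In the spirit of the
    abstract, a lower acceptance fraction for degradation moves
    suggests a more robust (better-retaining) stack; no time evolution
    is simulated."""
    rng = random.Random(seed)
    accepted = sum(
        metropolis_accept(rng.choice(delta_gs), kT, rng)
        for _ in range(n_iter)
    )
    return accepted / n_iter
```

Comparing two stacks then amounts to comparing their acceptance fractions under the same proposal distribution, which needs only a small number of iterations.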

  20. A multi-level intervention in subsidized housing sites to increase fruit and vegetable access and intake: Rationale, design and methods of the 'Live Well, Viva Bien' cluster randomized trial.

    PubMed

    Gans, Kim M; Gorham, Gemma; Risica, Patricia M; Dulin-Keita, Akilah; Dionne, Laura; Gao, Tina; Peters, Sarah; Principato, Ludovica

    2016-06-28

    Adequate fruit and vegetable (F&V) intake is important for disease prevention. Yet, most Americans, especially low-income and racial/ethnic minorities, do not eat adequate amounts. These disparities are partly attributable to food environments in low-income neighborhoods where residents often have limited access to affordable, healthful food and easy access to inexpensive, unhealthful foods. Increasing access to affordable healthful food in underserved neighborhoods through mobile markets is a promising, year-round strategy for improving dietary behaviors and reducing F&V intake disparities. However, to date, there have been no randomized controlled trials studying their effectiveness. The objective of the 'Live Well, Viva Bien' (LWVB) cluster randomized controlled trial is to evaluate the efficacy of a multicomponent mobile market intervention at increasing F&V intake among residents of subsidized housing complexes. One housing complex served as a pilot site for the intervention group and the remaining 14 demographically-matched sites were randomized into either the intervention or control group. The intervention group received bimonthly, discount, mobile, fresh F&V markets in conjunction with a nutrition education intervention (two F&V campaigns, newsletters, DVDs and cooking demonstrations) for 12 months. The control group received physical activity and stress reduction interventions. Outcome measures include F&V intake (measured by two validated F&V screeners at baseline, six-month and twelve-months) along with potential psychosocial mediating variables. Extensive quantitative and qualitative process evaluation was also conducted throughout the study. Modifying neighborhood food environments in ways that increase access to affordable, healthful food is a promising strategy for improving dietary behaviors among low-income, racial and ethnic minority groups at increased risk for obesity and other food-related chronic diseases. Discount, mobile F&V markets

  1. Random Variation in Student Performance by Class Size: Implications of NCLB in Rural Pennsylvania

    ERIC Educational Resources Information Center

    Goetz, Stephan J.

    2005-01-01

    Schools that fail to make "adequate yearly progress" under NCLB face sanctions and may lose students to other schools. In smaller schools, random yearly variation in innate student ability and behavior can cause changes in scores that are beyond the influence of teachers. This study examines changes in reading and math scores across…
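    The mechanism the abstract points to is ordinary sampling error: the year-to-year spread of a cohort's mean score scales as sigma/sqrt(n), so it is larger in small schools. A small simulation (all score parameters hypothetical) makes this concrete:

```python
import random
import statistics

def simulate_cohort_means(class_size, n_years=500, mu=500.0, sigma=100.0, seed=1):
    """Yearly mean test scores for cohorts drawn at random from the same
    student population (scores ~ Normal(mu, sigma)); mu and sigma are
    hypothetical scale values, not data from the study."""
    rng = random.Random(seed)
    return [
        statistics.fmean(rng.gauss(mu, sigma) for _ in range(class_size))
        for _ in range(n_years)
    ]

# The year-to-year spread of cohort means shrinks like sigma / sqrt(n),
# so a small school can cross an AYP threshold by chance alone.
spread_small = statistics.stdev(simulate_cohort_means(25))
spread_large = statistics.stdev(simulate_cohort_means(400))
```

With these parameters the small-cohort spread is roughly four times the large-cohort spread, entirely without any difference in teaching.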

  2. Effect of random phase mask on input plane in photorefractive authentic memory with two-wave encryption method

    NASA Astrophysics Data System (ADS)

    Mita, Akifumi; Okamoto, Atsushi; Funakoshi, Hisatoshi

    2004-06-01

    We have proposed an all-optical authentic memory with the two-wave encryption method. In the recording process, the image data are encrypted into white noise by the random phase masks added to the input beam carrying the image data and to the reference beam. Only a reading beam with the phase-conjugate distribution of the reference beam can decrypt the encrypted data; if the encrypted data are read out with an incorrect phase distribution, the output data are transformed into white noise. Moreover, during readout, reconstructions of the encrypted data interfere destructively, resulting in zero intensity. Our memory therefore has the merit that unauthorized accesses can be detected easily by measuring the output beam intensity. In our encryption method, the random phase mask on the input plane plays two important roles: transforming the input image into white noise and preventing decryption of that white noise back to the input image by the blind deconvolution method. Without this mask, when unauthorized users observe the output beam with a CCD during readout with a plane wave, they obtain exactly the intensity distribution of the Fourier transform of the input image, so the encrypted image can be decrypted easily by blind deconvolution. With the mask, even if unauthorized users observe the output beam in the same way, the encrypted image cannot be decrypted, because the observed intensity distribution is dispersed at random by the mask; the robustness is thus increased. In this report, we compare the correlation coefficients between the output and input images, which quantify how close the output image is to white noise, with and without this mask. We show that the robustness of this encryption method is increased by the mask, as the correlation coefficient improves from 0.3 to 0.1.
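    The paper's two-wave holographic scheme is not reproduced here, but the role of the input-plane mask can be illustrated numerically with classic double random phase encoding, a close textbook relative of the method. All array sizes and function names below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

def drpe_encrypt(img, phi_in, phi_fourier):
    """Double random phase encoding: apply an input-plane phase mask,
    Fourier transform, apply a Fourier-plane phase mask, and inverse
    transform. The output field is white-noise-like."""
    field = img * np.exp(1j * phi_in)
    spectrum = np.fft.fft2(field) * np.exp(1j * phi_fourier)
    return np.fft.ifft2(spectrum)

def drpe_decrypt(cipher, phi_in, phi_fourier):
    """Undo both masks with their phase conjugates (the numerical
    analogue of reading with a phase-conjugated beam)."""
    spectrum = np.fft.fft2(cipher) * np.exp(-1j * phi_fourier)
    field = np.fft.ifft2(spectrum)
    return np.abs(field * np.exp(-1j * phi_in))

img = rng.random((32, 32))
phi_in = rng.uniform(0.0, 2.0 * np.pi, img.shape)
phi_f = rng.uniform(0.0, 2.0 * np.pi, img.shape)

cipher = drpe_encrypt(img, phi_in, phi_f)
recovered = drpe_decrypt(cipher, phi_in, phi_f)

# Without the input-plane mask, the spectrum magnitude of the cipher
# equals that of the plain image -- the leak the abstract describes:
leaky = drpe_encrypt(img, np.zeros_like(phi_in), phi_f)
```

The `leaky` ciphertext reveals the image's Fourier intensity because the Fourier-plane mask is pure phase; with the input mask in place, that intensity is randomly dispersed.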

  3. 48 CFR 52.216-29 - Time-and-Materials/Labor-Hour Proposal Requirements-Non-Commercial Item Acquisition With Adequate...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...-Hour Proposal Requirements-Non-Commercial Item Acquisition With Adequate Price Competition. 52.216-29... Proposal Requirements—Non-Commercial Item Acquisition With Adequate Price Competition (FEB 2007) (a) The... Time-and-Materials/Labor-Hour Proposal Requirements—Non-Commercial Item Acquisition With Adequate Price...

  4. 42 CFR 413.24 - Adequate cost data and cost finding.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... provision of health care services, and that the services identified in this cost report were provided in... 42 Public Health 2 2014-10-01 2014-10-01 false Adequate cost data and cost finding. 413.24 Section 413.24 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES...

  5. True Randomness from Big Data.

    PubMed

    Papakonstantinou, Periklis A; Woodruff, David P; Yang, Guang

    2016-09-26

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.
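    The authors' extractor construction is not given in the abstract. As a toy illustration of seeded extraction in the leftover-hash-lemma style, one can hash a high-entropy source with an independent seed; SHA-256 here is a stand-in, not the paper's universal hash family, and would not scale to multi-gigabyte sources the way the authors' method does:

```python
import hashlib

def extract_bits(source: bytes, seed: bytes, n_bits: int) -> str:
    """Toy seeded randomness extractor: hash the high-entropy source
    together with an independent uniform seed and keep n_bits of the
    digest. By the leftover hash lemma, a suitable universal hash
    family yields nearly uniform output bits whenever the source has
    enough min-entropy, without statistical assumptions about it."""
    digest = hashlib.sha256(seed + source).digest()
    bits = "".join(f"{byte:08b}" for byte in digest)
    return bits[:n_bits]
```

The key design point mirrored here is that the seed, not the data model, carries the uniformity guarantee: no distributional assumption about the source is needed beyond its min-entropy.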

  6. True Randomness from Big Data

    NASA Astrophysics Data System (ADS)

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-09-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.

  7. True Randomness from Big Data

    PubMed Central

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-01-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests. PMID:27666514

  8. Predicting the accuracy of ligand overlay methods with Random Forest models.

    PubMed

    Nandigam, Ravi K; Evans, David A; Erickson, Jon A; Kim, Sangtae; Sutherland, Jeffrey J

    2008-12-01

    The accuracy of binding mode prediction using standard molecular overlay methods (ROCS, FlexS, Phase, and FieldCompare) is studied. Previous work has shown that simple decision tree modeling can be used to improve accuracy by selection of the best overlay template. This concept is extended to the use of Random Forest (RF) modeling for template and algorithm selection. An extensive data set of 815 ligand-bound X-ray structures representing 5 gene families was used for generating ca. 70,000 overlays using four programs. RF models, trained using standard measures of ligand and protein similarity and Lipinski-related descriptors, are used for automatically selecting the reference ligand and overlay method maximizing the probability of reproducing the overlay deduced from X-ray structures (i.e., using RMSD ≤ 2 Å as the criterion for success). RF model scores are highly predictive of overlay accuracy, and their use in template and method selection produces correct overlays in 57% of cases for 349 overlay ligands not used for training RF models. The inclusion in the models of protein sequence similarity enables the use of templates bound to related protein structures, yielding useful results even for proteins having no available X-ray structures.

  9. [The human right to adequate food: an urban vision].

    PubMed

    Casemiro, Juliana Pereira; Valla, Victor Vincent; Guimarães, Maria Beatriz Lisboa

    2010-07-01

    The human right to adequate food comprises two dimensions: freedom from hunger and undernutrition, and access to adequate food. The urban context, in which food is obtained primarily through purchase and consumer appeal is strong, makes this a major challenge to debate in poor districts today. Here we present considerations from a qualitative study carried out in São João de Meriti, Rio de Janeiro State, with leaders from Pastoral da Criança in focus group sessions. Unemployment, sub-employment, and difficulty in accessing the public health system, social assistance, and basic sanitation were presented as the major obstacles to realizing the human right to food. Among the strategies for fighting poverty and hunger, the establishment of mutual-help mechanisms stood out. Social support, generosity, and religiousness were the most important categories in the leaders' discourse. Facing a reality in which poverty and hunger appear as something inherent or become a mechanism of exchange during elections, the issue of clientelism emerges as a major concern and challenge for these leaders.

  10. Which Food Security Determinants Predict Adequate Vegetable Consumption among Rural Western Australian Children?

    PubMed Central

    Godrich, Stephanie L.; Lo, Johnny; Davies, Christina R.; Darby, Jill; Devine, Amanda

    2017-01-01

    Improving the suboptimal vegetable consumption among the majority of Australian children is imperative in reducing chronic disease risk. The objective of this research was to determine whether there was a relationship between food security determinants (FSD) (i.e., food availability, access, and utilisation dimensions) and adequate vegetable consumption among children living in regional and remote Western Australia (WA). Caregiver-child dyads (n = 256) living in non-metropolitan/rural WA completed cross-sectional surveys that included questions on FSD, demographics and usual vegetable intake. A total of 187 dyads were included in analyses, which included descriptive and logistic regression analyses via IBM SPSS (version 23). A total of 13.4% of children in this sample had adequate vegetable intake. FSD that met inclusion criteria (p ≤ 0.20) for multivariable regression analyses included price; promotion; quality; location of food outlets; variety of vegetable types; financial resources; and transport to outlets. After adjustment for potential demographic confounders, the FSD that predicted adequate vegetable consumption were variety of vegetable types consumed (p = 0.007), promotion (p = 0.017), location of food outlets (p = 0.027), and price (p = 0.043). Food retail outlets should ensure that adequate varieties of vegetable types (i.e., fresh, frozen, tinned) are available, vegetable messages should be promoted through food retail outlets and in community settings, towns should include a range of vegetable purchasing options, increase their reliance on a local food supply and increase transport options to enable affordable vegetable purchasing. PMID:28054955

  11. Which Food Security Determinants Predict Adequate Vegetable Consumption among Rural Western Australian Children?

    PubMed

    Godrich, Stephanie L; Lo, Johnny; Davies, Christina R; Darby, Jill; Devine, Amanda

    2017-01-03

    Improving the suboptimal vegetable consumption among the majority of Australian children is imperative in reducing chronic disease risk. The objective of this research was to determine whether there was a relationship between food security determinants (FSD) (i.e., food availability, access, and utilisation dimensions) and adequate vegetable consumption among children living in regional and remote Western Australia (WA). Caregiver-child dyads (n = 256) living in non-metropolitan/rural WA completed cross-sectional surveys that included questions on FSD, demographics and usual vegetable intake. A total of 187 dyads were included in analyses, which included descriptive and logistic regression analyses via IBM SPSS (version 23). A total of 13.4% of children in this sample had adequate vegetable intake. FSD that met inclusion criteria (p ≤ 0.20) for multivariable regression analyses included price; promotion; quality; location of food outlets; variety of vegetable types; financial resources; and transport to outlets. After adjustment for potential demographic confounders, the FSD that predicted adequate vegetable consumption were variety of vegetable types consumed (p = 0.007), promotion (p = 0.017), location of food outlets (p = 0.027), and price (p = 0.043). Food retail outlets should ensure that adequate varieties of vegetable types (i.e., fresh, frozen, tinned) are available, vegetable messages should be promoted through food retail outlets and in community settings, towns should include a range of vegetable purchasing options, increase their reliance on a local food supply and increase transport options to enable affordable vegetable purchasing.

  12. Unravelling changing interspecific interactions across environmental gradients using Markov random fields.

    PubMed

    Clark, Nicholas J; Wells, Konstans; Lindberg, Oscar

    2018-05-16

    Inferring interactions between co-occurring species is key to identify processes governing community assembly. Incorporating interspecific interactions in predictive models is common in ecology, yet most methods do not adequately account for indirect interactions (where an interaction between two species is masked by their shared interactions with a third) and assume interactions do not vary along environmental gradients. Markov random fields (MRF) overcome these limitations by estimating interspecific interactions, while controlling for indirect interactions, from multispecies occurrence data. We illustrate the utility of MRFs for ecologists interested in interspecific interactions, and demonstrate how covariates can be included (a set of models known as Conditional Random Fields, CRF) to infer how interactions vary along environmental gradients. We apply CRFs to two data sets of presence-absence data. The first illustrates how blood parasite (Haemoproteus, Plasmodium, and nematode microfilaria spp.) co-infection probabilities covary with relative abundance of their avian hosts. The second shows that co-occurrences between mosquito larvae and predatory insects vary along water temperature gradients. Other applications are discussed, including the potential to identify replacement or shifting impacts of highly connected species along climate or land-use gradients. We provide tools for building CRFs and plotting/interpreting results as an R package. © 2018 by the Ecological Society of America.
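    Implementations of this approach typically estimate the interaction matrix by node-wise (pseudolikelihood-style) regressions of each species on the others. Below is a bare numpy sketch of that idea on synthetic presence-absence data; it omits the regularization, symmetrization, and covariate interactions a real MRF/CRF package would use, and all data and names are hypothetical:

```python
import numpy as np

def nodewise_logistic(X, lr=0.1, n_iter=2000):
    """Crude pseudolikelihood-style estimate of pairwise associations in
    binary presence-absence data (sites x species): regress each species
    on all others with unpenalized logistic regression fit by gradient
    ascent. coef[j, k] approximates the conditional (direct) association
    of species k with species j, controlling for the other species."""
    n, p = X.shape
    coef = np.zeros((p, p))
    for j in range(p):
        y = X[:, j]
        others = np.delete(np.arange(p), j)
        Z = X[:, others]
        w = np.zeros(len(others))
        b = 0.0
        for _ in range(n_iter):
            pred = 1.0 / (1.0 + np.exp(-(Z @ w + b)))
            w += lr * (Z.T @ (y - pred)) / n
            b += lr * np.mean(y - pred)
        coef[j, others] = w
    return coef

# Demo: species 0 and 1 co-occur strongly, species 2 is independent.
rng = np.random.default_rng(0)
s0 = rng.random(500) < 0.5
s1 = np.where(rng.random(500) < 0.9, s0, ~s0)
s2 = rng.random(500) < 0.5
coef = nodewise_logistic(np.column_stack([s0, s1, s2]).astype(float))
```

The estimated matrix shows a strong positive coefficient between species 0 and 1 and near-zero coefficients involving species 2, the "controlling for indirect interactions" behavior the abstract describes.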

  13. Light-reflection random-target method for measurement of the modulation transfer function of a digital video-camera

    NASA Astrophysics Data System (ADS)

    Pospisil, J.; Jakubik, P.; Machala, L.

    2005-11-01

    This article reports the suggestion, realization, and verification of a newly developed means of measuring the noiseless and locally shift-invariant modulation transfer function (MTF) of a digital video camera in the usual incoherent visible region of optical intensity, especially of its combined imaging, detection, sampling, and digitizing steps, which are influenced by the additive and spatially discrete photodetector, aliasing, and quantization noises. The method relates to the still-camera automatic working regime and uses a static, two-dimensional, spatially continuous light-reflection random target with white-noise properties. The theoretical basis of this random-target method is also presented, exploiting the proposed simulation model of the linear optical intensity response and the possibility of expressing the resultant MTF as a normalized and smoothed ratio of the ascertainable output and input power spectral densities. The random-target and resultant image data were obtained and processed on a PC with computation programs developed in MATLAB 6.5. The presented examples and other results of the performed measurements demonstrate sufficient repeatability and acceptability of the described method for comparative evaluations of the performance of digital video cameras under various conditions.
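    The core relation, MTF estimated from the ratio of output to input power spectral densities, can be sketched in one dimension. The demo transfer function `H` and all parameter values are hypothetical; a real measurement averages many records and smooths the spectra:

```python
import numpy as np

def mtf_from_random_target(target, image, eps=1e-12):
    """Estimate the MTF as the square root of the ratio of the output
    (captured image) to input (random target) power spectral densities,
    the core relation behind random-target MTF measurement.
    1-D illustration without the smoothing a real measurement needs."""
    psd_in = np.abs(np.fft.rfft(target - target.mean())) ** 2
    psd_out = np.abs(np.fft.rfft(image - image.mean())) ** 2
    return np.sqrt(psd_out / (psd_in + eps))

# Demo: image a white-noise target through a hypothetical camera
# transfer function H; the estimator recovers |H| at nonzero frequency.
rng = np.random.default_rng(3)
n = 256
target = rng.random(n)
H = np.exp(-np.linspace(0.0, 3.0, n // 2 + 1))
image = np.fft.irfft(np.fft.rfft(target) * H, n)
mtf = mtf_from_random_target(target, image)
```

Because the target's spectrum is flat on average, dividing the two spectral densities cancels the target's own content and leaves only the camera's transfer characteristic.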

  14. Adequate sleep moderates the prospective association between alcohol use and consequences.

    PubMed

    Miller, Mary Beth; DiBello, Angelo M; Lust, Sarah A; Carey, Michael P; Carey, Kate B

    2016-12-01

    Inadequate sleep and heavy alcohol use have been associated with negative outcomes among college students; however, few studies have examined the interactive effects of sleep and drinking quantity in predicting alcohol-related consequences. This study aimed to determine if adequate sleep moderates the prospective association between weekly drinking quantity and consequences. College students (N=568) who were mandated to an alcohol prevention intervention reported drinks consumed per week, typical sleep quantity (calculated from sleep/wake times), and perceptions of sleep adequacy as part of a larger research trial. Assessments were completed at baseline and one-, three-, and five-month follow-ups. Higher baseline quantities of weekly drinking and inadequate sleep predicted alcohol-related consequences at baseline and one-month follow-up. Significant interactions emerged between baseline weekly drinking quantity and adequate sleep in the prediction of alcohol-related consequences at baseline, one-, three-, and five-month assessments. Simple slopes analyses revealed that weekly drinking quantity was positively associated with alcohol-related consequences for those reporting both adequate and inadequate sleep, but this association was consistently stronger among those who reported inadequate sleep. Subjective evaluation of sleep adequacy moderates both the concurrent and prospective associations between weekly drinking quantity and consequences, such that heavy-drinking college students reporting inadequate sleep experience more consequences as a result of drinking. Research needs to examine the mechanism(s) by which inadequate sleep affects alcohol risk among young adults. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. New high resolution Random Telegraph Noise (RTN) characterization method for resistive RAM

    NASA Astrophysics Data System (ADS)

    Maestro, M.; Diaz, J.; Crespo-Yepes, A.; Gonzalez, M. B.; Martin-Martinez, J.; Rodriguez, R.; Nafria, M.; Campabadal, F.; Aymerich, X.

    2016-01-01

    Random Telegraph Noise (RTN) is one of the main reliability problems of resistive switching-based memories. To understand the physics behind RTN, a complete and accurate RTN characterization is required. The standard equipment used to analyse RTN has a typical time resolution of ∼2 ms, which prevents evaluating fast phenomena. In this work, a new RTN measurement procedure, which increases the measurement time resolution to 2 μs, is proposed. The experimental set-up, together with the recently proposed Weighted Time Lag (W-LT) method for the analysis of RTN signals, allows more detailed and precise information about the RTN phenomenon to be obtained.
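    The Weighted Time Lag method builds on the time-lag plot, a scatter of each sample against the next one. A minimal, unweighted sketch of that analysis for a discrete-level RTN trace (the weighted variant replaces the hard counts below with Gaussian weights) could look like this; the level-classification scheme is an illustrative assumption:

```python
import itertools

def time_lag_counts(signal, levels):
    """Time-lag statistics for an RTN trace: classify each sample to its
    nearest discrete level and count lag-1 pairs (x_i, x_{i+1}).
    Diagonal entries mark dwelling at an RTN level; off-diagonal entries
    mark switching events between levels."""
    k = len(levels)
    nearest = [min(range(k), key=lambda i: abs(s - levels[i])) for s in signal]
    counts = {pair: 0 for pair in itertools.product(range(k), repeat=2)}
    for a, b in zip(nearest, nearest[1:]):
        counts[(a, b)] += 1
    return counts
```

Faster sampling (2 μs rather than 2 ms) matters precisely because short dwell times otherwise collapse onto the diagonal and fast traps become invisible in such a plot.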

  16. A randomized controlled trial of the different impression methods for the complete denture fabrication: Patient reported outcomes.

    PubMed

    Jo, Ayami; Kanazawa, Manabu; Sato, Yusuke; Iwaki, Maiko; Akiba, Norihisa; Minakuchi, Shunsuke

    2015-08-01

    To compare the effect of conventional complete dentures (CDs) fabricated using two different impression methods on patient-reported outcomes in a randomized controlled trial (RCT). A cross-over RCT was performed with edentulous patients who required maxillomandibular CDs. Mandibular CDs were fabricated using two different methods: the conventional method used a custom tray border-moulded with impression compound and a silicone impression material, while the simplified method used a stock tray and an alginate. Participants were randomly divided into two groups: the C-S group received the conventional method first, followed by the simplified; the S-C group received them in the reverse order. Adjustment was performed four times, and a wash-out period of 1 month was set. The primary outcome was general patient satisfaction, measured using visual analogue scales; the secondary outcome was oral health-related quality of life, measured using the Japanese version of the Oral Health Impact Profile for edentulous patients (OHIP-EDENT-J). Twenty-four participants completed the trial. With regard to general patient satisfaction, the conventional method was rated significantly higher than the simplified method. No significant differences were observed between the two methods in the OHIP-EDENT-J scores. This study showed that CDs fabricated with the conventional method, which included a preliminary impression made using alginate in a stock tray and a subsequent final impression made using silicone in a border-moulded custom tray, resulted in higher general patient satisfaction. UMIN000009875. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Evaluation of a newly developed infant chest compression technique: A randomized crossover manikin trial.

    PubMed

    Smereka, Jacek; Bielski, Karol; Ladny, Jerzy R; Ruetzler, Kurt; Szarpak, Lukasz

    2017-04-01

    Providing adequate chest compression is essential during infant cardiopulmonary resuscitation (CPR) but has been reported to be performed poorly. The "new 2-thumb technique" (nTTT), which consists of using 2 thumbs directed at a 90° angle to the chest while closing the fingers of both hands into fists, was recently introduced. The aim of this study was therefore to compare 3 chest compression techniques, namely the 2-finger technique (TFT), the 2-thumb technique (TTHT), and the nTTT, in a randomized infant-CPR manikin setting. A total of 73 paramedics with at least 1 year of clinical experience performed 3 CPR settings with a chest compression:ventilation ratio of 15:2, according to current guidelines. Chest compression was performed with 1 of the 3 chest compression techniques in a randomized sequence. Chest compression rate and depth, chest decompression, and adequate ventilation after chest compression served as outcome parameters. The chest compression depth was 29 (IQR, 28-29) mm in the TFT group, 42 (40-43) mm in the TTHT group, and 40 (39-40) mm in the nTTT group (TFT vs TTHT, P < 0.001; TFT vs nTTT, P < 0.001; TTHT vs nTTT, P < 0.01). The median compression rate was 136 (IQR, 133-144)/min with TFT, 117 (115-121)/min with TTHT, and 111 (109-113)/min with nTTT. There was a statistically significant difference in the compression rate between TFT and TTHT (P < 0.001), TFT and nTTT (P < 0.001), as well as TTHT and nTTT (P < 0.001). Incorrect decompressions after chest compression were significantly more frequent in the TTHT group than in the TFT (P < 0.001) and nTTT (P < 0.001) groups. The nTTT provided adequate chest compression depth and rate and was associated with adequate chest decompression and the possibility to adequately ventilate the infant manikin. Further clinical studies are necessary to confirm these initial findings.

  18. Is ginger effective for the treatment of irritable bowel syndrome? A double blind randomized controlled pilot trial.

    PubMed

    van Tilburg, Miranda A L; Palsson, Olafur S; Ringel, Yehuda; Whitehead, William E

    2014-02-01

    Ginger is one of the most commonly used herbal medicines for irritable bowel syndrome (IBS) but no data exists about its effectiveness. Double blind randomized controlled trial. University of North Carolina, Chapel Hill, North Carolina, USA. Forty-five IBS patients were randomly assigned to three groups: placebo, 1g of ginger, and 2g of ginger daily for 28 days. The IBS severity scale (IBS-SS) was administered, as well as adequate relief of symptoms scale. A responder was defined as having at least 25% reduction in IBS-SS post-treatment. There were 57.1% responders to placebo, 46.7% to 1g and 33.3% to 2g of ginger. Adequate relief was reported by 53.3% on placebo and 53.3% in both ginger groups combined. Side effects were mild and reported by 35.7% in the placebo and 16.7% in the ginger groups. This double blind randomized controlled pilot study suggests ginger is well tolerated but did not perform better than placebo. Larger trials are needed before any definitive conclusions can be drawn. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. What are the appropriate methods for analyzing patient-reported outcomes in randomized trials when data are missing?

    PubMed

    Hamel, J F; Sebille, V; Le Neel, T; Kubis, G; Boyer, F C; Hardouin, J B

    2017-12-01

    Subjective health measurements using Patient-Reported Outcomes (PRO) are increasingly used in randomized trials, particularly for comparisons of patient groups. Two main types of analytical strategies can be used for such data: Classical Test Theory (CTT) and Item Response Theory (IRT) models. These two strategies display very similar characteristics when data are complete, but in the common case when data are missing, whether IRT or CTT is the more appropriate remains unknown and was investigated using simulations. We simulated PRO data such as quality-of-life data. Missing responses to items were simulated as being completely random, depending on an observable covariate, or depending on an unobserved latent trait. The CTT-based methods considered compared scores using complete-case analysis, personal mean imputation, or multiple imputation based on a two-way procedure. The IRT-based method was the Wald test on a Rasch model including a group covariate. The IRT-based method and the multiple-imputation-based CTT method displayed the highest observed power and were the only unbiased methods regardless of the kind of missing data. Online software and Stata® modules compatible with the innate mi impute suite are provided for performing such analyses. Traditional procedures (listwise deletion and personal mean imputation) should be avoided because of inevitable bias and lack of power.
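A minimal sketch of two of the CTT scoring strategies compared above (complete-case analysis versus personal mean imputation) on simulated item data; the number of items, response scale, and missingness rate below are illustrative assumptions, not the study's simulation design.

```python
import random

def score_complete_case(responses):
    """Sum-score only respondents with no missing items (listwise deletion)."""
    return [sum(r) for r in responses if None not in r]

def score_personal_mean(responses, min_answered=1):
    """Replace each missing item by the respondent's own mean over answered items."""
    scores = []
    for r in responses:
        answered = [x for x in r if x is not None]
        if len(answered) < min_answered:
            continue
        pm = sum(answered) / len(answered)
        scores.append(sum(x if x is not None else pm for x in r))
    return scores

random.seed(1)
# 200 respondents x 5 items scored 0-4; 20% of items missing completely at random
data = [[random.randint(0, 4) for _ in range(5)] for _ in range(200)]
observed = [[x if random.random() > 0.2 else None for x in row] for row in data]

cc = score_complete_case(observed)
pm = score_personal_mean(observed)
```

With 20% of items missing, listwise deletion discards most respondents while personal mean imputation retains nearly all of them; the abstract's point is that retention alone does not make either approach unbiased.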

  20. Assessing vitamin D nutritional status: Is capillary blood adequate?

    PubMed

    Jensen, M E; Ducharme, F M; Théorêt, Y; Bélanger, A-S; Delvin, E

    2016-06-01

    Venous blood is the usual sample for measuring various biomarkers, including 25-hydroxyvitamin D (25OHD). However, venipuncture can prove challenging in infants and young children. Finger-prick capillary collection is an alternative, being a relatively simple procedure perceived to be less invasive. We elected to validate the use of capillary blood sampling for 25OHD quantification by liquid chromatography tandem mass spectrometry (LC/MS-MS). Venous and capillary blood samples were simultaneously collected from 15 preschool-aged children with asthma 10 days after receiving 100,000 IU of vitamin D3 or placebo, and from 20 apparently healthy adult volunteers. 25OHD was measured by an in-house LC/MS-MS method. The venous 25OHD values varied between 23 and 255 nmol/l. The venous and capillary blood total 25OHD concentrations were highly correlated (r² = 0.9963). The mean difference (bias) of capillary blood 25OHD compared with venous blood was 2.0 (95% CI: -7.5, 11.5) nmol/l. Our study demonstrates excellent agreement, with no evidence of a clinically important bias, between venous and capillary serum 25OHD concentrations measured by LC/MS-MS over a wide range of values. Under these conditions, capillary blood is therefore adequate for the measurement of 25OHD. Copyright © 2016 Elsevier B.V. All rights reserved.
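The agreement analysis summarized above (a mean difference with a 95% interval around it) follows the usual Bland-Altman recipe, sketched below; the paired values are hypothetical, not the study's data.

```python
from statistics import mean, stdev

def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement between two paired measurement methods."""
    diffs = [b - a for a, b in zip(method_a, method_b)]
    bias = mean(diffs)
    sd = stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# hypothetical paired 25OHD values (nmol/l), spanning a wide range
venous    = [25, 40, 60, 80, 100, 150, 200, 250]
capillary = [27, 38, 63, 82,  99, 153, 204, 247]
bias, (lo, hi) = bland_altman(venous, capillary)
```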

  1. Methods and procedures for: A randomized double-blind study investigating dose-dependent longitudinal effects of vitamin D supplementation on bone health.

    PubMed

    Burt, Lauren A; Gaudet, Sharon; Kan, Michelle; Rose, Marianne S; Billington, Emma O; Boyd, Steven K; Hanley, David A

    2018-04-01

    The optimum dose of vitamin D and the corresponding serum 25-hydroxyvitamin D (25OHD) concentration for bone health are still debated, and some health practitioners recommend doses well above the Canada/USA recommended Dietary Reference Intake (DRI). We designed a three-year randomized double-blind clinical trial investigating whether there are dose-dependent effects of vitamin D supplementation above the DRI on bone health. The primary aims of this study are to assess whether supplementation with vitamin D3 increases (1) volumetric bone mineral density measured by high-resolution peripheral quantitative computed tomography (HR-pQCT); (2) bone strength assessed by finite element analysis; and (3) areal bone mineral density by dual X-ray absorptiometry (DXA). Secondary aims are to understand whether vitamin D3 supplementation improves parameters of bone microarchitecture, balance, physical function and quality of life. Participants are men and women aged 55-70 years, with women at least 5 years post-menopause. The intervention is a daily vitamin D3 supplementation dose of 400, 4000 or 10,000 IU. Participants not achieving adequate dietary calcium intake are provided with calcium supplementation, up to a maximum supplemental dose of 600 mg elemental calcium per day. Results from this three-year study will provide evidence on whether daily vitamin D3 supplementation with adequate calcium intake can affect bone density, bone microarchitecture and bone strength in men and women. Furthermore, the safety of high-dose daily vitamin D3 supplementation will be explored. Copyright © 2018 Elsevier Inc. All rights reserved.

  2. Extrapolating Survival from Randomized Trials Using External Data: A Review of Methods

    PubMed Central

    Jackson, Christopher; Stevens, John; Ren, Shijie; Latimer, Nick; Bojke, Laura; Manca, Andrea; Sharples, Linda

    2016-01-01

    This article describes methods used to estimate parameters governing long-term survival, or times to other events, for health economic models. Specifically, the focus is on methods that combine shorter-term individual-level survival data from randomized trials with longer-term external data, thus using the longer-term data to aid extrapolation of the short-term data. This requires assumptions about how trends in survival for each treatment arm will continue after the follow-up period of the trial. Furthermore, using external data requires assumptions about how survival differs between the populations represented by the trial and external data. Study reports from a national health technology assessment program in the United Kingdom were searched, and the findings were combined with “pearl-growing” searches of the academic literature. We categorized the methods that have been used according to the assumptions they made about how the hazards of death vary between the external and internal data and through time, and we discuss the appropriateness of the assumptions in different circumstances. Modeling choices, parameter estimation, and characterization of uncertainty are discussed, and some suggestions for future research priorities in this area are given. PMID:27005519
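One of the simplest assumptions the review discusses, switching from the trial's estimated hazard to an externally informed hazard at the end of trial follow-up, can be sketched as a piecewise-exponential model; the hazards and times below are illustrative assumptions.

```python
import math

def exp_hazard(event_times, censor_times):
    """MLE hazard for an exponential model: events / total person-time."""
    events = len(event_times)
    person_time = sum(event_times) + sum(censor_times)
    return events / person_time

def extrapolated_survival(t, trial_hazard, external_hazard, t_followup):
    """Piecewise-exponential survival: the trial hazard applies within
    follow-up, the external-data hazard afterwards."""
    if t <= t_followup:
        return math.exp(-trial_hazard * t)
    s_end = math.exp(-trial_hazard * t_followup)
    return s_end * math.exp(-external_hazard * (t - t_followup))

# toy trial data (years): 3 events, 4 censored at end of follow-up
h_trial = exp_hazard(event_times=[1.2, 2.5, 0.8], censor_times=[3.0, 3.0, 2.0, 3.0])
s10 = extrapolated_survival(10.0, h_trial, external_hazard=0.10, t_followup=3.0)
```

The choice of external hazard (here 0.10/year) is exactly the kind of assumption about between-population differences that the review categorizes.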

  3. 45 CFR 1159.15 - Who has the responsibility for maintaining adequate technical, physical, and security safeguards...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... disclosure or destruction of manual and automatic record systems. These security safeguards shall apply to... use of records contained in a system of records are adequately trained to protect the security and... adequate technical, physical, and security safeguards to prevent unauthorized disclosure or destruction of...

  4. An investigation of new toxicity test method performance in validation studies: 1. Toxicity test methods that have predictive capacity no greater than chance.

    PubMed

    Bruner, L H; Carr, G J; Harbell, J W; Curren, R D

    2002-06-01

    An approach commonly used to measure new toxicity test method (NTM) performance in validation studies is to divide toxicity results into positive and negative classifications, and then identify true positive (TP), true negative (TN), false positive (FP) and false negative (FN) results. After this step is completed, the contingent probability statistics (CPS), namely sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV), are calculated. Although these statistics are widely used and often the only statistics used to assess the performance of toxicity test methods, there is little specific guidance in the validation literature on what values for these statistics indicate adequate performance. The purpose of this study was to begin developing data-based answers to this question by characterizing the CPS obtained from an NTM whose data have a completely random association with a reference test method (RTM). Determining the CPS of this worst-case scenario is useful because it provides a lower baseline from which the performance of an NTM can be judged in future validation studies. It also provides an indication of relationships in the CPS that help identify random or near-random relationships in the data. The results from this study of randomly associated tests show that the values obtained for the statistics vary significantly depending on the cut-offs chosen, that high values can be obtained for individual statistics, and that the different measures cannot be considered independently when evaluating the performance of an NTM. When the association between results of an NTM and RTM is random, the sum of the complementary pairs of statistics (sensitivity + specificity, NPV + PPV) is approximately 1, and the prevalence (i.e., the proportion of toxic chemicals in the population of chemicals) and PPV are equal. 
Given that combinations of high sensitivity-low specificity or low sensitivity-high specificity (i.e., the sum of the sensitivity and
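The identities stated above (sensitivity + specificity ≈ 1 and PPV = prevalence under a random association) can be checked with a small sketch; the prevalence and positive-call rate below are arbitrary illustrative choices.

```python
def cps(tp, fn, fp, tn):
    """Contingent probability statistics from a 2x2 classification table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sens, spec, ppv, npv

# A perfectly random association: the NTM calls "positive" with probability q
# independently of the RTM result.  prevalence = 0.3, q = 0.8, n = 1000.
n, prev, q = 1000, 0.3, 0.8
tp = n * prev * q
fn = n * prev * (1 - q)
fp = n * (1 - prev) * q
tn = n * (1 - prev) * (1 - q)

sens, spec, ppv, npv = cps(tp, fn, fp, tn)
# note sens = 0.8 looks "high" even though the NTM carries no information,
# while sens + spec = 1 and ppv = prevalence, as the abstract describes
```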

  5. Observational studies are complementary to randomized controlled trials.

    PubMed

    Grootendorst, Diana C; Jager, Kitty J; Zoccali, Carmine; Dekker, Friedo W

    2010-01-01

    Randomized controlled trials (RCTs) are considered the gold-standard study design for investigating the effect of health interventions, including treatment. However, in some situations it may be unnecessary, inappropriate, impossible, or inadequate to perform an RCT. In these situations, well-designed observational studies, including cohort and case-control studies, may provide an alternative to doing nothing in order to obtain estimates of treatment effect, although such studies must be designed and conducted with care. The aims of this review are (1) to explain why RCTs are considered the optimal study design for evaluating treatment effects, (2) to describe the situations in which an RCT is not possible and observational studies are an adequate alternative, and (3) to explain when randomization is not needed and how it can be approximated in observational studies. Examples from the nephrology literature are used for illustration. Copyright 2009 S. Karger AG, Basel.

  6. Increased calcium absorption from synthetic stable amorphous calcium carbonate: Double-blind randomized crossover clinical trial in post-menopausal women

    USDA-ARS?s Scientific Manuscript database

    Calcium supplementation is a widely recognized strategy for achieving adequate calcium intake. We designed this blinded, randomized, crossover interventional trial to compare the bioavailability of a new stable synthetic amorphous calcium carbonate (ACC) with that of crystalline calcium carbonate (C...

  7. A dose optimization method for electron radiotherapy using randomized aperture beams

    NASA Astrophysics Data System (ADS)

    Engel, Konrad; Gauer, Tobias

    2009-09-01

    The present paper describes the entire optimization process of creating a radiotherapy treatment plan for advanced electron irradiation. Special emphasis is devoted to the selection of beam incidence angles and beam energies as well as to the choice of appropriate subfields generated by a refined version of intensity segmentation and a novel random aperture approach. The algorithms have been implemented in a stand-alone programme using dose calculations from a commercial treatment planning system. For this study, the treatment planning system Pinnacle from Philips has been used and connected to the optimization programme using an ASCII interface. Dose calculations in Pinnacle were performed by Monte Carlo simulations for a remote-controlled electron multileaf collimator (MLC) from Euromechanics. As a result, treatment plans for breast cancer patients could be significantly improved when using randomly generated aperture beams. The combination of beams generated through segmentation and randomization achieved the best results in terms of target coverage and sparing of critical organs. The treatment plans could be further improved by use of a field reduction algorithm. Without a relevant loss in dose distribution, the total number of MLC fields and monitor units could be reduced by up to 20%. In conclusion, using randomized aperture beams is a promising new approach in radiotherapy and exhibits potential for further improvements in dose optimization through a combination of randomized electron and photon aperture beams.

  8. Self-correcting random number generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humble, Travis S.; Pooser, Raphael C.

    2016-09-06

    A system and method for generating random numbers. The system may include a random number generator (RNG), such as a quantum random number generator (QRNG), configured to self-correct or adapt in order to substantially achieve randomness from the output of the RNG. By adapting, the RNG may generate a random number that may be considered random regardless of whether the random number itself is tested as such. As an example, the RNG may include components to monitor one or more characteristics of the RNG during operation, and may use the monitored characteristics as a basis for adapting, or self-correcting, to provide a random number according to one or more performance criteria.
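The summary above does not specify the correction algorithm; as a classical analogue of self-correction (an assumption for illustration, not the patented method), von Neumann debiasing removes a fixed bias from an independent raw bit stream.

```python
import random

def von_neumann_debias(bits):
    """Classical bias correction: map pair 01 -> 0, 10 -> 1, drop 00/11.
    Output is unbiased whenever input bits are independent with fixed bias."""
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out

random.seed(7)
# heavily biased raw source: P(1) = 0.9
raw = [1 if random.random() < 0.9 else 0 for _ in range(20000)]
corrected = von_neumann_debias(raw)
frac_ones = sum(corrected) / len(corrected)  # close to 0.5 despite the 0.9 bias
```

The price of the correction is throughput: only about 2p(1-p) of the input pairs yield an output bit, which is why adaptive schemes monitor the source rather than discard blindly.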

  9. Region 8: Colorado Lamar and Steamboat Springs Adequate Letter (11/12/2002)

    EPA Pesticide Factsheets

    This letter from EPA to the Colorado Department of Public Health and Environment determined the Motor Vehicle Emissions Budgets in the Lamar and Steamboat Springs particulate matter (PM10) maintenance plans to be adequate for transportation conformity purposes.

  10. Design and methods for a randomized clinical trial treating comorbid obesity and major depressive disorder

    PubMed Central

    Schneider, Kristin L; Bodenlos, Jamie S; Ma, Yunsheng; Olendzki, Barbara; Oleski, Jessica; Merriam, Philip; Crawford, Sybil; Ockene, Ira S; Pagoto, Sherry L

    2008-01-01

    Background Obesity is often comorbid with depression, and individuals with this comorbidity fare worse in behavioral weight loss treatment. Treating depression directly prior to behavioral weight loss treatment might bolster weight loss outcomes in this population, but this has not yet been tested in a randomized clinical trial. Methods and design This randomized clinical trial will examine whether behavior therapy for depression administered prior to standard weight loss treatment produces greater weight loss than standard weight loss treatment alone. Obese women with major depressive disorder (N = 174) will be recruited from primary care clinics and the community and randomly assigned to one of the two treatment conditions. Treatment will last 2 years and will include a 6-month intensive treatment phase followed by an 18-month maintenance phase. Follow-up assessments will occur at 6 months and at 1 and 2 years following randomization. The primary outcome is weight loss. The study was designed to provide 90% power for detecting a weight change difference between conditions of 3.1 kg (standard deviation of 5.5 kg) at 1 year, assuming a 25% rate of loss to follow-up. Secondary outcomes include depression, physical activity, dietary intake, psychosocial variables and cardiovascular risk factors. Potential mediators (e.g., adherence, depression, physical activity and caloric intake) of the intervention effect on weight change will also be examined. Discussion Treating depression before administering intensive health behavior interventions could potentially boost the impact on both mental and physical health outcomes. Trial registration NCT00572520 PMID:18793398
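The stated power calculation can be approximated with the standard normal-approximation formula for a two-sample comparison of means; this is a reconstruction, not the protocol's exact method, and it lands near but not exactly on the stated N = 174 (about 178 after attrition inflation), so the protocol presumably used a slightly different procedure or rounding.

```python
import math
from statistics import NormalDist

def two_sample_n(delta, sd, power=0.90, alpha=0.05, attrition=0.0):
    """Per-group sample size for detecting a mean difference `delta` between
    two groups (normal approximation), inflated for anticipated attrition."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)
    z_beta = z.inv_cdf(power)
    n = 2 * (z_alpha + z_beta) ** 2 * sd ** 2 / delta ** 2
    return math.ceil(n / (1 - attrition))

# parameters reported in the abstract: 3.1 kg difference, SD 5.5 kg,
# 90% power, 25% loss to follow-up
n_per_group = two_sample_n(delta=3.1, sd=5.5, power=0.90, attrition=0.25)
```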

  11. Are general surgery residents adequately prepared for hepatopancreatobiliary fellowships? A questionnaire-based study

    PubMed Central

    Osman, Houssam; Parikh, Janak; Patel, Shirali; Jeyarajah, D Rohan

    2015-01-01

    Background The present study was conducted to assess the preparedness of hepatopancreatobiliary (HPB) fellows upon entering fellowship, identify challenges encountered by HPB fellows during the initial part of their HPB training, and identify potential solutions to these challenges that can be applied during residency training. Methods A questionnaire was distributed to all HPB fellows in accredited HPB fellowship programmes in two consecutive academic years (n = 42). Responses were then analysed. Results A total of 19 (45%) fellows responded. Prior to their fellowship, 10 (53%) were in surgical residency and the rest were in other surgical fellowships or surgical practice. Thirteen (68%) were graduates of university-based residency programmes. All fellows felt comfortable in performing basic laparoscopic procedures independently at the completion of residency and less comfortable in performing advanced laparoscopy. Eight (42%) fellows cited a combination of inadequate case volume and lack of autonomy during residency as the reasons for this lack of comfort. Thirteen (68%) identified inadequate preoperative workup and management as their biggest fear upon entering practice after general surgery training. A total of 17 (89%) fellows felt they were adequately prepared to enter HPB fellowship. Extra rotations in transplant, vascular or minimally invasive surgery were believed to be most helpful in preparing general surgery residents pursuing HPB fellowships. Conclusions Overall, HPB fellows felt themselves to be adequately prepared for fellowship. Advanced laparoscopic procedures and the perioperative management of complex patients are two of the challenges facing HPB fellows. General surgery residents who plan to pursue an HPB fellowship may benefit from spending extra rotations on certain subspecialties. Focus on perioperative workup and management should be an integral part of residency and fellowship training. PMID:25387852

  12. Scanning method as an unbiased simulation technique and its application to the study of self-attracting random walks

    NASA Astrophysics Data System (ADS)

    Meirovitch, Hagai

    1985-12-01

    The scanning method proposed by us [J. Phys. A 15, L735 (1982); Macromolecules 18, 563 (1985)] for simulation of polymer chains is further developed and applied, for the first time, to a model with finite interactions. In addition to ``importance sampling,'' we remove the bias introduced by the scanning method with a procedure suggested recently by Schmidt [Phys. Rev. Lett. 51, 2175 (1983)]; this procedure has the advantage of enabling one to estimate the statistical error. We find these two procedures to be equally efficient. The model studied is an N-step random walk on a lattice, in which a random walk i has a statistical weight p^{M_i}, where p < 1 is an attractive energy parameter and M_i is the number of distinct sites visited by walk i. This model, which corresponds to a model of random walks moving in a medium with randomly distributed static traps, has been solved analytically for N → ∞ for any dimension d by Donsker and Varadhan (DV) and by others: ln φ, where φ is the survival probability in the trapping problem, diverges like N^α with α = d/(d+2). Most numerical studies, however, have failed to reach the DV regime in which d/(d+2) becomes a good approximation for α. On the other hand, our results for α (obtained for N ≤ 150) are close to the DV values for p ≤ 0.7 and p ≤ 0.6 for d = 2 and 3, respectively. This suggests that the scanning method is more efficient than both the commonly used direct Monte Carlo technique and the Rosenbluth and Rosenbluth method [J. Chem. Phys. 23, 356 (1954)]. Our results support the conclusion of Havlin et al. [Phys. Rev. Lett. 53, 407 (1984)] that the DV regime exists already for φ ≤ 10^-13 for both d = 2 and 3. We also find that at the percolation threshold p_c the exponents for the end-to-end distance are small, but larger than zero, and that the probability of a walk returning to the origin behaves approximately as N^-1/3 for both d = 2 and 3.
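For contrast with the scanning method, the "commonly used direct Monte Carlo technique" mentioned above can be sketched: estimate the survival probability as the average of p^M over independently generated walks. The lattice, parameters, and walk lengths below are illustrative assumptions.

```python
import random

def survival_probability(n_steps, p, n_walks=2000, seed=0):
    """Direct Monte Carlo estimate of the trapping survival probability
    E[p^M], where M is the number of distinct sites a 2-D lattice random
    walk visits in n_steps steps."""
    rng = random.Random(seed)
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    total = 0.0
    for _ in range(n_walks):
        x = y = 0
        visited = {(0, 0)}
        for _ in range(n_steps):
            dx, dy = rng.choice(moves)
            x, y = x + dx, y + dy
            visited.add((x, y))
        total += p ** len(visited)
    return total / n_walks

phi_20 = survival_probability(20, p=0.9)
phi_80 = survival_probability(80, p=0.9)
# phi decreases with walk length, since longer walks visit more distinct sites
```

The inefficiency the abstract alludes to is visible here: for small p or large N, almost all sampled walks contribute negligible weight p^M, which is why biased-sampling schemes such as the scanning method reach the DV regime sooner.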

  13. State Implementation Plans (SIP): Submissions that EPA has Found Adequate or Inadequate

    EPA Pesticide Factsheets

    EPA/OTAQ’s State and Local Transportation Resources are for air quality and transportation government and community leaders. Information on state implementation plans (SIPs) that EPA has found either adequate or inadequate is provided here.

  14. Effects of Pilates method in elderly people: Systematic review of randomized controlled trials.

    PubMed

    de Oliveira Francisco, Cristina; de Almeida Fagundes, Alessandra; Gorges, Bruna

    2015-07-01

    The Pilates method has been widely used in physical training and rehabilitation. Evidence regarding the effectiveness of this method in elderly people is limited. Six randomized controlled trials involving the use of the Pilates method for elderly people, published prior to December 2013, were selected from the databases PubMed, MEDLINE, Embase, Cochrane, Scielo and PEDro. Three articles suggested that Pilates produced improvements in balance. Two studies evaluated adherence to Pilates programs. One study assessed Pilates' influence on cardio-metabolic parameters and another evaluated changes in body composition. Strong evidence was found for beneficial effects of Pilates on static and dynamic balance in women. Nevertheless, evidence of balance improvement in both genders, of changes in body composition in women, and of adherence to Pilates programs was limited. Effects of Pilates training on cardio-metabolic parameters were inconclusive. Pilates may be a useful tool in rehabilitation and prevention programs, but more high-quality studies are necessary to establish all its effects on elderly populations. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. A smart rotary technique versus conventional pulpectomy for primary teeth: A randomized controlled clinical study

    PubMed Central

    Mokhtari, Negar; Shirazi, Alireza-Sarraf

    2017-01-01

    Background Techniques with adequate accuracy of working length determination along with shorter treatment duration seem essential for pulpectomy procedures in pediatric dentistry. The aim of the present study was to evaluate the accuracy of root canal length measurement with the Root ZX II apex locator and a rotary system in pulpectomy of primary teeth. Material and Methods In this randomized controlled clinical trial, complete pulpectomy was performed on 80 mandibular primary molars in 80 children aged 4-6 years. The study population was randomly divided into case and control groups. In the control group conventional pulpectomy was performed; in the case group working length was determined by the electronic apex locator Root ZX II and canals were instrumented with Mtwo rotary files. Statistical evaluation was performed using Mann-Whitney and Chi-Square tests (P < 0.05). Results There were no significant differences between the electronic apex locator Root ZX II and the conventional method in the accuracy of root canal length determination. However, significantly less time was needed for instrumenting with rotary files (P < 0.001). Conclusions Considering the comparable accuracy of root canal length determination and the considerably shorter instrumentation time with the Root ZX II apex locator and rotary system, this approach may be suggested for pulpectomy in primary molar teeth. Key words: Rotary technique, conventional technique, pulpectomy, primary teeth. PMID:29302280

  16. Characterization of nitride hole lateral transport in a charge trap flash memory by using a random telegraph signal method

    NASA Astrophysics Data System (ADS)

    Liu, Yu-Heng; Jiang, Cheng-Min; Lin, Hsiao-Yi; Wang, Tahui; Tsai, Wen-Jer; Lu, Tao-Cheng; Chen, Kuang-Chao; Lu, Chih-Yuan

    2017-07-01

    We use a random telegraph signal method to investigate nitride trapped hole lateral transport in a charge trap flash memory. The concept of this method is to utilize an interface oxide trap and its associated random telegraph signal as an internal probe to detect a local channel potential change resulting from nitride charge lateral movement. We apply different voltages to the drain of a memory cell and vary a bake temperature in retention to study the electric field and temperature dependence of hole lateral movement in a nitride. Thermal energy absorption by trapped holes in lateral transport is characterized. Mechanisms of hole lateral transport in retention are investigated. From the measured and modeled results, we find that thermally assisted trap-to-band tunneling is a major trapped hole emission mechanism in nitride hole lateral transport.

  17. Spline methods for approximating quantile functions and generating random samples

    NASA Technical Reports Server (NTRS)

    Schiess, J. R.; Matthews, C. G.

    1985-01-01

    Two cubic spline formulations are presented for representing the quantile function (inverse cumulative distribution function) of a random sample of data. Both B-spline and rational spline approximations are compared with analytic representations of the quantile function. It is also shown how these representations can be used to generate random samples for use in simulation studies. Comparisons are made on samples generated from known distributions and on a sample of experimental data. The spline representations are more accurate for multimodal and skewed samples and require much less time to generate samples than the analytic representation.
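The inverse-CDF sampling idea above can be sketched with a piecewise-linear quantile function standing in for the paper's B-spline and rational-spline representations; the plotting positions and data below are illustrative assumptions.

```python
import bisect
import random

def make_quantile_function(sample):
    """Piecewise-linear approximation of the quantile function of a sample
    (a simple stand-in for the paper's cubic-spline representations)."""
    xs = sorted(sample)
    n = len(xs)
    us = [(i + 0.5) / n for i in range(n)]  # plotting positions

    def q(u):
        if u <= us[0]:
            return xs[0]
        if u >= us[-1]:
            return xs[-1]
        j = bisect.bisect_right(us, u)
        t = (u - us[j - 1]) / (us[j] - us[j - 1])
        return xs[j - 1] + t * (xs[j] - xs[j - 1])

    return q

rng = random.Random(42)
data = [rng.gauss(0.0, 1.0) for _ in range(500)]
q = make_quantile_function(data)
new_sample = [q(rng.random()) for _ in range(1000)]  # inverse-CDF sampling
```

Feeding uniform variates through the fitted quantile function reproduces the empirical distribution of the original sample, which is exactly how the spline representations are used for simulation.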

  18. Quantum Random Number Generation Using a Quanta Image Sensor

    PubMed Central

    Amri, Emna; Felk, Yacine; Stucki, Damien; Ma, Jiaju; Fossum, Eric R.

    2016-01-01

    A new quantum random number generation method is proposed. The method is based on the randomness of the photon emission process and the single-photon counting capability of the Quanta Image Sensor (QIS). It has the potential to generate high-quality random numbers with a remarkable data output rate. In this paper, the principle of photon statistics and the theory of entropy are discussed. Sample data were collected with a QIS jot device, and their randomness quality was analyzed. The randomness assessment method and results are discussed. PMID:27367698
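A hedged sketch of the underlying idea, not the QIS implementation: simulate Poisson-distributed photon counts per sensor element ("jot") and take each count's parity as one bit. The mean photon number and bit-extraction rule here are illustrative assumptions.

```python
import random

def photon_parity_bits(mean_photons, n_jots, seed=0):
    """Random bits from simulated single-photon counting: each jot reports a
    Poisson photon count; the count's parity gives one bit."""
    rng = random.Random(seed)
    bits = []
    for _ in range(n_jots):
        # draw a Poisson(mean_photons) count by summing unit-rate
        # exponential inter-arrival times up to "exposure" mean_photons
        t, count = 0.0, 0
        while True:
            t += rng.expovariate(1.0)
            if t > mean_photons:
                break
            count += 1
        bits.append(count & 1)
    return bits

bits = photon_parity_bits(mean_photons=5.0, n_jots=10000)
frac = sum(bits) / len(bits)  # near 0.5 for a sufficiently bright source
```

The parity of a Poisson(λ) count has bias (1 + e^(-2λ))/2 toward even, which vanishes quickly as λ grows; real designs must also handle dark counts and read noise.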

  19. Iteration of ultrasound aberration correction methods

    NASA Astrophysics Data System (ADS)

    Maasoey, Svein-Erik; Angelsen, Bjoern; Varslot, Trond

    2004-05-01

    Aberration in ultrasound medical imaging is usually modeled by time-delay and amplitude variations concentrated on the transmitting/receiving array. This filter process is here denoted a TDA filter. The TDA filter is an approximation to the physical aberration process, which occurs over an extended part of the human body wall. Estimation of the TDA filter, and performing correction on transmit and receive, has proven difficult. It has yet to be shown that this method works adequately for severe aberration. Estimation of the TDA filter can be iterated by retransmitting a corrected signal and re-estimating until a convergence criterion is fulfilled (adaptive imaging). Two methods for estimating time-delay and amplitude variations in receive signals from random scatterers have been developed. One method correlates each element signal with a reference signal. The other uses eigenvalue decomposition of the receive cross-spectrum matrix, based upon a receive energy-maximizing criterion. Simulations of iterating aberration correction with a TDA filter have been investigated to study convergence properties. Aberration was generated by weak and strong body-wall models, both emulating the human abdominal wall. Results after iteration improve aberration correction substantially, and both estimation methods converge, even in the case of strong aberration.
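The first estimation method described above, correlating each element signal with a reference, can be sketched as a discrete cross-correlation search for the time delay (one component of a TDA filter); the synthetic signals and sampling below are illustrative assumptions.

```python
import math

def estimate_delay(signal, reference, max_lag):
    """Estimate the delay (in samples) of `signal` relative to `reference`
    as the lag that maximizes their cross-correlation."""
    best_lag, best_corr = 0, float("-inf")
    n = len(reference)
    for lag in range(-max_lag, max_lag + 1):
        c = sum(signal[i + lag] * reference[i]
                for i in range(n) if 0 <= i + lag < len(signal))
        if c > best_corr:
            best_lag, best_corr = lag, c
    return best_lag

reference = [math.sin(0.2 * i) for i in range(100)]
delayed = [0.0] * 3 + reference[:-3]  # reference shifted by 3 samples
lag = estimate_delay(delayed, reference, max_lag=10)
```

Repeating this per array element yields the time-delay part of a TDA filter; the amplitude part would come from comparing signal energies, and iteration refines both after each corrected retransmission.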

  20. Random regression analyses using B-splines functions to model growth from birth to adult age in Canchim cattle.

    PubMed

    Baldi, F; Alencar, M M; Albuquerque, L G

    2010-12-01

    The objective of this work was to estimate covariance functions using random regression models on B-spline functions of animal age, for weights from birth to adult age in Canchim cattle. Data comprised 49,011 records on 2435 females. The model of analysis included fixed effects of contemporary groups, age of dam as a quadratic covariable, and the population mean trend taken into account by a cubic regression on orthogonal polynomials of animal age. Residual variances were modelled through a step function with four classes. The direct and maternal additive genetic effects, and animal and maternal permanent environmental effects, were included as random effects in the model. A total of seventeen analyses, considering linear, quadratic and cubic B-spline functions and up to seven knots, were carried out. B-spline functions of the same order were considered for all random effects. Random regression models on B-spline functions were compared to a random regression model on Legendre polynomials and to a multitrait model. Results from the different models of analysis were compared using the REML form of the Akaike Information Criterion and Schwarz's Bayesian Information Criterion. In addition, the variance components and genetic parameters estimated for each random regression model were used as criteria to choose the most adequate model to describe the covariance structure of the data. A model fitting quadratic B-splines, with four knots (three segments) for the direct additive genetic and animal permanent environmental effects and two knots for the maternal additive genetic and maternal permanent environmental effects, was the most adequate to describe the covariance structure of the data. Random regression models using B-spline functions as base functions fitted the data better than Legendre polynomials, especially at mature ages, but a larger number of parameters needs to be estimated with B-spline functions. © 2010 Blackwell Verlag GmbH.
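The B-spline base functions used in such random regression models can be evaluated with the Cox-de Boor recursion; the knot vector below is an illustrative clamped quadratic basis on an age range, not the paper's configuration.

```python
def bspline_basis(i, k, t, knots):
    """Cox-de Boor recursion: value of the i-th B-spline of degree k at t."""
    if k == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    out = 0.0
    d1 = knots[i + k] - knots[i]
    if d1 > 0:
        out += (t - knots[i]) / d1 * bspline_basis(i, k - 1, t, knots)
    d2 = knots[i + k + 1] - knots[i + 1]
    if d2 > 0:
        out += (knots[i + k + 1] - t) / d2 * bspline_basis(i + 1, k - 1, t, knots)
    return out

# clamped quadratic basis on ages 0..10 with one interior knot at age 5
knots = [0, 0, 0, 5, 10, 10, 10]
n_basis = len(knots) - 2 - 1  # number of basis functions = #knots - degree - 1
row = [bspline_basis(i, 2, 3.0, knots) for i in range(n_basis)]
# within the knot range the basis functions are nonnegative and sum to 1
```

Each animal's trajectory is then modeled as a weighted sum of these local basis functions, which is why B-splines behave better at the sparse mature ages than global Legendre polynomials.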

  1. A Randomized Study of a Method for Optimizing Adolescent Assent to Biomedical Research

    PubMed Central

    Annett, Robert D.; Brody, Janet L.; Scherer, David G.; Turner, Charles W.; Dalen, Jeanne; Raissy, Hengameh

    2018-01-01

    Purpose Voluntary consent/assent with adolescents invited to participate in research raises challenging problems. No studies to date have attempted to manipulate autonomy in relation to assent/consent processes. This study evaluated the effects of an autonomy-enhanced individualized assent/consent procedure embedded within a randomized pediatric asthma clinical trial. Methods Families were randomly assigned to remain together or be separated during the consent/assent process, the latter of which we characterize as an autonomy-enhanced assent/consent procedure. We hypothesized that separating adolescents from their parents would improve adolescent assent by increasing knowledge and appreciation of the clinical trial and willingness to participate. Results 64 adolescent-parent dyads completed procedures. The together versus separate randomization made no difference in adolescent or parent willingness to participate. However, significant differences were found in both parent and adolescent knowledge of the asthma clinical trial based on the assent/consent procedure and adolescent age. The separate assent/consent procedure improved knowledge of study risks and benefits for older adolescents and their parents but not for the younger youth or their parents. Regardless of the assent/consent process, younger adolescents had lower comprehension of information associated with the study medication and research risks and benefits, but not of study procedures or their research rights and privileges. Conclusions The use of an autonomy-enhanced assent/consent procedure for adolescents may improve their and their parents' informed assent/consent without impacting research participation decisions. Traditional assent/consent procedures may result in a "diffusion of responsibility" effect between parents and older adolescents, specifically in attending to key information associated with study risks and benefits. PMID:28949898

  2. Random walk study of electron motion in helium in crossed electromagnetic fields

    NASA Technical Reports Server (NTRS)

    Englert, G. W.

    1972-01-01

    Random walk theory, previously adapted to electron motion in the presence of an electric field, is extended to include a transverse magnetic field. In principle, the random walk approach avoids mathematical complexity and concomitant simplifying assumptions and permits determination of energy distributions and transport coefficients within the accuracy of available collisional cross section data. Application is made to a weakly ionized helium gas. Time of relaxation of electron energy distribution, determined by the random walk, is described by simple expressions based on energy exchange between the electron and an effective electric field. The restrictive effect of the magnetic field on electron motion, which increases the required number of collisions per walk to reach a terminal steady state condition, as well as the effect of the magnetic field on electron transport coefficients and mean energy can be quite adequately described by expressions involving only the Hall parameter.

  3. Assessing the germplasm of Laminaria (phaeophyceae) with random amplified polymorphic DNA (RAPD) method

    NASA Astrophysics Data System (ADS)

    He, Yingjun; Zou, Yuping; Wang, Xiaodong; Zheng, Zhiguo; Zhang, Daming; Duan, Delin

    2003-06-01

    Eighteen gametophytes of L. japonica, L. ochotensis and L. longissima were verified with the random amplified polymorphic DNA (RAPD) technique. Eighteen ten-base primers were chosen from 100 candidate primers for the final amplification test. Of the 205 bands amplified, 181 (88.3%) were polymorphic. The genetic distance among different strains ranged from 0.072 to 0.391. The dendrogram constructed by the unweighted pair-group method with arithmetic mean (UPGMA) showed that the female and male gametophytes of the same cell lines grouped in pairs. This indicated that RAPD analysis can be used not only to distinguish different strains of Laminaria, but also to distinguish male and female gametophytes within the same cell line. The systematic relationship among the species remains ambiguous if judged merely by the present data; the use of RAPD markers appears limited for elucidating the phylogenetic relationships among the species of Laminaria.
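
    The UPGMA clustering step described above can be sketched with SciPy's average-linkage implementation. The distance matrix and strain labels below are hypothetical, with values chosen inside the 0.072-0.391 range the abstract reports:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Hypothetical pairwise genetic distances among four gametophyte strains
# (two species, one female/male pair each; not the paper's actual data).
labels = ["sp1_female", "sp1_male", "sp2_female", "sp2_male"]
D = np.array([
    [0.000, 0.080, 0.350, 0.360],
    [0.080, 0.000, 0.340, 0.355],
    [0.350, 0.340, 0.000, 0.072],
    [0.360, 0.355, 0.072, 0.000],
])

# UPGMA = average-linkage hierarchical clustering on the condensed matrix.
Z = linkage(squareform(D), method="average")

# Cutting the dendrogram at a low distance groups the female/male
# gametophytes of the same cell line in pairs, as in the abstract.
clusters = fcluster(Z, t=0.2, criterion="distance")
print(clusters)
```

    Cutting at larger distances would instead recover species-level groups, which is where the abstract notes RAPD data alone become ambiguous.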

  4. Benthic macroinvertebrate field sampling effort required to produce a sample adequate for the assessment of rivers and streams of Neuquén Province, Argentina

    EPA Science Inventory

    This multi-year pilot study evaluated a proposed field method for its effectiveness in the collection of a benthic macroinvertebrate sample adequate for use in the condition assessment of streams and rivers in the Neuquén Province, Argentina. A total of 13 sites, distribut...

  5. Prediction of broadband ground-motion time histories: Hybrid low/high-frequency method with correlated random source parameters

    USGS Publications Warehouse

    Liu, P.; Archuleta, R.J.; Hartzell, S.H.

    2006-01-01

    We present a new method for calculating broadband time histories of ground motion based on a hybrid low-frequency/high-frequency approach with correlated source parameters. Using a finite-difference method we calculate low-frequency synthetics (< ∼1 Hz) in a 3D velocity structure. We also compute broadband synthetics in a 1D velocity model using a frequency-wavenumber method. The low frequencies from the 3D calculation are combined with the high frequencies from the 1D calculation by using matched filtering at a crossover frequency of 1 Hz. The source description, common to both the 1D and 3D synthetics, is based on correlated random distributions for the slip amplitude, rupture velocity, and rise time on the fault. This source description allows for the specification of source parameters independent of any a priori inversion results. In our broadband modeling we include correlation between slip amplitude, rupture velocity, and rise time, as suggested by dynamic fault modeling. The method of using correlated random source parameters is flexible and can be easily modified to adjust to our changing understanding of earthquake ruptures. A realistic attenuation model is common to both the 3D and 1D calculations that form the low- and high-frequency components of the broadband synthetics. The value of Q is a function of the local shear-wave velocity. To produce more accurate high-frequency amplitudes and durations, the 1D synthetics are corrected with a randomized, frequency-dependent radiation pattern. The 1D synthetics are further corrected for local site and nonlinear soil effects by using a 1D nonlinear propagation code and generic velocity structure appropriate for the site’s National Earthquake Hazards Reduction Program (NEHRP) site classification. The entire procedure is validated by comparison with the 1994 Northridge, California, strong ground motion data set. The bias and error found here for response spectral acceleration are similar to the best results
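
    The matched-filtering combination at a 1 Hz crossover can be sketched with complementary zero-phase Butterworth filters. The two "synthetics" below are hypothetical test signals (not seismograms), chosen so that both agree below the crossover while only the second carries high-frequency content:

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 100.0   # sampling rate (Hz); illustrative choice
fc = 1.0     # crossover frequency (Hz), as in the abstract
t = np.arange(0, 40, 1 / fs)

# Stand-ins: the "3D" synthetic is reliable below ~1 Hz, the "1D"
# synthetic supplies the high frequencies.
syn_3d = np.sin(2 * np.pi * 0.3 * t)
syn_1d = np.sin(2 * np.pi * 0.3 * t) + 0.4 * np.sin(2 * np.pi * 5.0 * t)

# Complementary 4th-order Butterworth filters, applied zero-phase.
b_lo, a_lo = butter(4, fc / (fs / 2), btype="low")
b_hi, a_hi = butter(4, fc / (fs / 2), btype="high")
broadband = filtfilt(b_lo, a_lo, syn_3d) + filtfilt(b_hi, a_hi, syn_1d)
print(broadband.shape)
```

    Because `filtfilt` applies the squared Butterworth magnitude response with zero phase, the low- and high-pass branches sum to unity across frequency, so the combined trace carries the 3D result below 1 Hz and the 1D result above it without phase distortion.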

  6. Implementation of selective prevention for cardiometabolic diseases; are Dutch general practices adequately prepared?

    PubMed

    Stol, Daphne M; Hollander, Monika; Nielen, Markus M J; Badenbroek, Ilse F; Schellevis, François G; de Wit, Niek J

    2018-03-01

    Current guidelines acknowledge the need for cardiometabolic disease (CMD) prevention and recommend five-yearly screening of a targeted population. In recent years programs for selective CMD prevention have been developed, but implementation is challenging, and the question arises whether general practices are adequately prepared. Therefore, the aim of this study is to assess the organizational preparedness of Dutch general practices and the facilitators of and barriers to performing CMD prevention in practices currently implementing selective CMD prevention. Observational study. Dutch primary care. General practices. Organizational characteristics. General practices implementing selective CMD prevention are more often organized as a group practice (49% vs. 19%, p < .001) and are better organized regarding chronic disease management than reference practices. They are motivated to perform CMD prevention and can be considered 'frontrunners' among Dutch general practices with respect to their practice organization. The most important reported barriers are limited availability of staff (59%) and inadequate funding (41%). The organizational infrastructure of Dutch general practices is considered adequate for performing most steps of selective CMD prevention. Implementation of prevention programs including easily accessible lifestyle interventions needs attention. All stakeholders involved share the responsibility to realize structural funding for programmed CMD prevention. The aforementioned conditions should be taken into account in future implementation of selective CMD prevention. Key Points   • There is need for adequate CMD prevention; little is known about the organization of selective CMD prevention in general practices.   • The organizational infrastructure of Dutch general practices is adequate for performing most steps of selective CMD prevention.   • Implementation of selective CMD prevention programs including easily accessible

  7. Interactive Book Reading to Accelerate Word Learning by Kindergarten Children with Specific Language Impairment: Identifying an Adequate Intensity and Variation in Treatment Response

    ERIC Educational Resources Information Center

    Storkel, Holly L.; Voelmle, Krista; Fierro, Veronica; Flake, Kelsey; Fleming, Kandace K.; Romine, Rebecca Swinburne

    2017-01-01

    Purpose: This study sought to identify an adequate intensity of interactive book reading for new word learning by children with specific language impairment (SLI) and to examine variability in treatment response. Method: An escalation design adapted from nontoxic drug trials (Hunsberger, Rubinstein, Dancey, & Korn, 2005) was used in this Phase…

  8. Random phase encoding for optical security

    NASA Astrophysics Data System (ADS)

    Wang, RuiKang K.; Watson, Ian A.; Chatwin, Christopher R.

    1996-09-01

    A new optical encoding method for security applications is proposed. The encoded image (encrypted into the security products) is merely a random phase image, statistically generated by a computer random number generator, which contains no information about the reference pattern (stored for verification) or the frequency-plane filter (a phase-only function used for decoding). The phase function in the frequency plane is obtained using a modified phase retrieval algorithm. The proposed method uses two phase-only functions (images), at the input and frequency planes of the optical processor, leading to maximum optical efficiency. Computer simulation shows that the proposed method is robust for optical security applications.
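
    A minimal numerical sketch of the idea, using a plain Gerchberg-Saxton-style iteration as a stand-in for the paper's modified phase retrieval algorithm: the input-plane image is a fixed random phase-only function, and the frequency-plane phase filter is retrieved so that the decoded intensity approaches a hypothetical reference pattern. All sizes and the target pattern are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 32

# Hypothetical binary reference pattern to be verified.
target = np.zeros((N, N))
target[8:24, 12:20] = 1.0

# Encoded image: a statistically random phase-only function, carrying
# no information about the target.
phi_in = rng.uniform(0, 2 * np.pi, (N, N))
U = np.fft.fft2(np.exp(1j * phi_in))      # fixed input-plane spectrum

# Iterative retrieval of the frequency-plane phase filter psi.
psi = np.zeros((N, N))
errors = []
for _ in range(50):
    g = np.fft.ifft2(U * np.exp(1j * psi))          # decoded output field
    errors.append(np.linalg.norm(np.abs(g) - target))
    g_proj = target * np.exp(1j * np.angle(g))      # impose target magnitude
    psi = np.angle(np.fft.fft2(g_proj)) - np.angle(U)

print(f"reconstruction error: {errors[0]:.2f} -> {errors[-1]:.2f}")
```

    The retrieved filter `psi` plays the role of the frequency-plane phase-only function; the random input phase alone reveals nothing about the target.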

  9. Human milk feeding supports adequate growth in infants

    USDA-ARS?s Scientific Manuscript database

    Despite current nutritional strategies, premature infants remain at high risk for extrauterine growth restriction. The use of an exclusive human milk-based diet is associated with decreased incidence of necrotizing enterocolitis (NEC), but concerns exist about infants achieving adequate growth. The ...

  10. Extracting the field-effect mobilities of random semiconducting single-walled carbon nanotube networks: A critical comparison of methods

    NASA Astrophysics Data System (ADS)

    Schießl, Stefan P.; Rother, Marcel; Lüttgens, Jan; Zaumseil, Jana

    2017-11-01

    The field-effect mobility is an important figure of merit for semiconductors such as random networks of single-walled carbon nanotubes (SWNTs). However, owing to their network properties and quantum capacitance, the standard models for field-effect transistors cannot be applied without modifications. Several different methods are used to determine the mobility with often very different results. We fabricated and characterized field-effect transistors with different polymer-sorted, semiconducting SWNT network densities ranging from low (≈6 μm⁻¹) to densely packed quasi-monolayers (≈26 μm⁻¹) with a maximum on-conductance of 0.24 μS μm⁻¹ and compared four different techniques to evaluate the field-effect mobility. We demonstrate the limits and requirements for each method with regard to device layout and carrier accumulation. We find that techniques that take into account the measured capacitance on the active device give the most reliable mobility values. Finally, we compare our experimental results to a random-resistor-network model.
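
    The standard linear-regime extraction that such comparisons build on can be sketched as follows; per the abstract, the key point is to use the capacitance measured on the active device rather than a parallel-plate estimate. All device values below are hypothetical:

```python
def linear_mobility(gm, L, W, C_areal, V_ds):
    """Linear-regime field-effect mobility, mu = gm * L / (W * C_areal * V_ds).

    gm      : transconductance dI_d/dV_g (S)
    L, W    : channel length and width (cm)
    C_areal : areal gate capacitance measured on the device (F/cm^2)
    V_ds    : drain-source bias (V)
    Returns mobility in cm^2/(V s).
    """
    return gm * L / (W * C_areal * V_ds)

# Hypothetical device: L = 20 um, W = 1 mm, C = 1e-7 F/cm^2,
# gm = 2 uS at V_ds = 0.1 V.
mu = linear_mobility(gm=2e-6, L=20e-4, W=0.1, C_areal=1e-7, V_ds=0.1)
print(f"{mu:.1f} cm^2/(V s)")
```

    For sparse networks the measured capacitance can be far below the parallel-plate value, so substituting the latter here would overstate C_areal and understate the mobility.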

  11. PREFACE: The random search problem: trends and perspectives The random search problem: trends and perspectives

    NASA Astrophysics Data System (ADS)

    da Luz, Marcos G. E.; Grosberg, Alexander; Raposo, Ernesto P.; Viswanathan, Gandhi M.

    2009-10-01

    aircraft, a given web site). Regarding the nature of the searching drive, in certain instances, it can be guided almost entirely by external cues, either by the cognitive (memory) or detective (olfaction, vision, etc) skills of the searcher. However, in many situations the movement is non-oriented, being in essence a stochastic process. Therefore, in such cases (and even when a small deterministic component in the locomotion exists) a random search effectively defines the final rates of encounters. Hence, one reason underlying the richness of the random search problem relates just to the 'ignorance' of the locations of the randomly located targets. Contrary to conventional wisdom, the lack of complete information does not necessarily lead to greater complexity. As an illustrative example, let us consider the case of complete information. If the positions of all target sites are known in advance, then the question of what sequential order to visit the sites so as to reduce the energy costs of locomotion itself becomes a rather challenging problem: the famous 'travelling salesman' optimization query, belonging to the NP-complete class of problems. The ignorance of the target site locations, however, considerably modifies the problem and renders it not amenable to treatment by purely deterministic computational methods. In fact, as expected, the random search problem is not particularly suited to search algorithms that do not use elements of randomness. So, only a statistical approach to the search problem can adequately deal with the element of ignorance. In other words, the incomplete information renders the search under-determined, i.e., it is not possible to find the 'best' solution to the problem because all the information is not given. Instead, one must guess, and probabilistic or stochastic strategies become unavoidable.
Also, the random search problem bears a relation to reaction-diffusion processes, because the search involves a diffusive aspect, movement, as well as a

  12. Transitioning from adequate to inadequate sleep duration associated with higher smoking rate and greater nicotine dependence in a population sample

    PubMed Central

    Patterson, Freda; Grandner, Michael A.; Lozano, Alicia; Satti, Aditi; Ma, Grace

    2017-01-01

    Introduction Inadequate sleep (≤6 and ≥9 h) is more prevalent in smokers than non-smokers, but the extent to which sleep duration in smokers relates to smoking behaviors and cessation outcomes is not yet clear. To begin to address this knowledge gap, we investigated the extent to which sleep duration predicted smoking behaviors and quitting intention in a population sample. Methods Data from current smokers who completed the baseline (N=635) and 5-year follow-up (N=477) assessment in the United Kingdom Biobank cohort study were analyzed. Multivariable regression models using smoking behavior outcomes (cigarettes per day, time to first cigarette, difficulty not smoking for a day, quitting intention) and sleep duration (adequate (7–8 h) versus inadequate (≤6 and ≥9 h)) as the predictor were generated. All models adjusted for age, sex, race, and education. Results Worsening sleep duration (adequate to inadequate) predicted more than three-fold higher odds of increased cigarettes per day (OR=3.18; 95% CI=1.25–8.06), more than three-fold higher odds that not smoking for a day remained difficult (OR=3.90; 95% CI=1.27–12.01), and more than eight-fold higher odds of higher nicotine dependence (OR=8.98; 95% CI=2.81–28.66). Improving sleep duration (i.e., inadequate to adequate sleep) did not predict reduced cigarette consumption or nicotine dependence in this population sample. Conclusion Transitioning from adequate to inadequate sleep duration may be a risk factor for developing a more “hard-core” smoking profile. The extent to which achieving healthy sleep may promote or optimize smoking cessation treatment response warrants investigation. PMID:28950118

  13. Do observational studies using propensity score methods agree with randomized trials? A systematic comparison of studies on acute coronary syndromes

    PubMed Central

    Dahabreh, Issa J.; Sheldrick, Radley C.; Paulus, Jessica K.; Chung, Mei; Varvarigou, Vasileia; Jafri, Haseeb; Rassen, Jeremy A.; Trikalinos, Thomas A.; Kitsios, Georgios D.

    2012-01-01

    Aims Randomized controlled trials (RCTs) are the gold standard for assessing the efficacy of therapeutic interventions because randomization protects from biases inherent in observational studies. Propensity score (PS) methods, proposed as a potential solution to confounding of the treatment–outcome association, are widely used in observational studies of therapeutic interventions for acute coronary syndromes (ACS). We aimed to systematically assess agreement between observational studies using PS methods and RCTs on therapeutic interventions for ACS. Methods and results We searched for observational studies of interventions for ACS that used PS methods to estimate treatment effects on short- or long-term mortality. Using a standardized algorithm, we matched observational studies to RCTs based on patients’ characteristics, interventions, and outcomes (‘topics’), and we compared estimates of treatment effect between the two designs. When multiple observational studies or RCTs were identified for the same topic, we performed a meta-analysis and used the summary relative risk for comparisons. We matched 21 observational studies investigating 17 distinct clinical topics to 63 RCTs (median = 3 RCTs per observational study) for short-term (7 topics) and long-term (10 topics) mortality. Estimates from PS analyses differed statistically significantly from randomized evidence in two instances; however, observational studies reported more extreme beneficial treatment effects compared with RCTs in 13 of 17 instances (P = 0.049). Sensitivity analyses limited to large RCTs, and using alternative meta-analysis models yielded similar results. Conclusion For the treatment of ACS, observational studies using PS methods produce treatment effect estimates that are of more extreme magnitude compared with those from RCTs, although the differences are rarely statistically significant. PMID:22711757
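
    The two propensity-score steps the abstract refers to (model treatment assignment from covariates, then match on the estimated score) can be sketched on synthetic confounded data; the variable names, effect sizes, and single-covariate setup below are all hypothetical:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 2000

# Synthetic observational data: sicker patients are more likely to
# receive the intervention, confounding the mortality comparison.
severity = rng.normal(size=n)
treated = rng.binomial(1, 1 / (1 + np.exp(-0.8 * severity)))
died = rng.binomial(1, 1 / (1 + np.exp(-(-2.0 + 1.0 * severity - 0.5 * treated))))

# Step 1: propensity score = estimated P(treatment | covariates).
ps_model = LogisticRegression().fit(severity.reshape(-1, 1), treated)
score = ps_model.predict_proba(severity.reshape(-1, 1))[:, 1]

# Step 2: 1:1 nearest-neighbour matching on the propensity score
# (with replacement, for brevity).
t_idx = np.where(treated == 1)[0]
c_idx = np.where(treated == 0)[0]
matches = c_idx[np.argmin(np.abs(score[c_idx][None, :] - score[t_idx][:, None]), axis=1)]

naive = died[treated == 1].mean() - died[treated == 0].mean()
matched = died[t_idx].mean() - died[matches].mean()
print(f"naive risk difference {naive:+.3f}, matched {matched:+.3f}")
```

    The matched contrast compares treated patients with controls of similar treatment propensity, which is how PS methods attempt to mimic the balance that randomization provides by design.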

  14. Estimation of the Nonlinear Random Coefficient Model when Some Random Effects Are Separable

    ERIC Educational Resources Information Center

    du Toit, Stephen H. C.; Cudeck, Robert

    2009-01-01

    A method is presented for marginal maximum likelihood estimation of the nonlinear random coefficient model when the response function has some linear parameters. This is done by writing the marginal distribution of the repeated measures as a conditional distribution of the response given the nonlinear random effects. The resulting distribution…

  15. An observational study showed that explaining randomization using gambling-related metaphors and computer-agency descriptions impeded randomized clinical trial recruitment.

    PubMed

    Jepson, Marcus; Elliott, Daisy; Conefrey, Carmel; Wade, Julia; Rooshenas, Leila; Wilson, Caroline; Beard, David; Blazeby, Jane M; Birtle, Alison; Halliday, Alison; Stein, Rob; Donovan, Jenny L

    2018-07-01

    To explore how the concept of randomization is described by clinicians and understood by patients in randomized controlled trials (RCTs) and how it contributes to patient understanding and recruitment. Qualitative analysis of 73 audio recordings of recruitment consultations from five multicenter, UK-based RCTs with identified or anticipated recruitment difficulties. One in 10 appointments did not include any mention of randomization. Most included a description of the method or process of allocation. Descriptions often made reference to gambling-related metaphors or similes, or referred to allocation by a computer. Where reference was made to a computer, some patients assumed that they would receive the treatment that was "best for them". Descriptions of the rationale for randomization were rarely present and often only came about as a consequence of patients questioning the reason for a random allocation. The methods and processes of randomization were usually described by recruiters, but often without clarity, which could lead to patient misunderstanding. The rationale for randomization was rarely mentioned. Recruiters should avoid problematic gambling metaphors and illusions of agency in their explanations and instead focus on clearer descriptions of the rationale and method of randomization to ensure patients are better informed about randomization and RCT participation. Copyright © 2018 University of Bristol. Published by Elsevier Inc. All rights reserved.

  16. A general method for handling missing binary outcome data in randomized controlled trials.

    PubMed

    Jackson, Dan; White, Ian R; Mason, Dan; Sutton, Stephen

    2014-12-01

    The analysis of randomized controlled trials with incomplete binary outcome data is challenging. We develop a general method for exploring the impact of missing data in such trials, with a focus on abstinence outcomes. We propose a sensitivity analysis where standard analyses, which could include 'missing = smoking' and 'last observation carried forward', are embedded in a wider class of models. We apply our general method to data from two smoking cessation trials. A total of 489 and 1758 participants from two smoking cessation trials. The abstinence outcomes were obtained using telephone interviews. The estimated intervention effects from both trials depend on the sensitivity parameters used. The findings differ considerably in magnitude and statistical significance under quite extreme assumptions about the missing data, but are reasonably consistent under more moderate assumptions. A new method for undertaking sensitivity analyses when handling missing data in trials with binary outcomes allows a wide range of assumptions about the missing data to be assessed. In two smoking cessation trials the results were insensitive to all but extreme assumptions. © 2014 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.
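
    The idea of embedding standard analyses in a wider class indexed by a sensitivity parameter can be sketched as follows. Here a single parameter delta is the assumed abstinence probability among participants with missing outcomes, so delta = 0 reproduces the common 'missing = smoking' convention; the trial counts are hypothetical, not the paper's data:

```python
# Hypothetical trial counts per arm: abstinent / smoking / missing.
arms = {
    "intervention": dict(abstinent=60, smoking=120, missing=30),
    "control":      dict(abstinent=40, smoking=140, missing=25),
}

def abstinence_rate(counts, delta):
    """Estimated abstinence rate under sensitivity parameter delta in [0, 1]:
    delta is the assumed probability that a participant with a missing
    outcome is abstinent (delta = 0 means 'missing = smoking')."""
    n = counts["abstinent"] + counts["smoking"] + counts["missing"]
    return (counts["abstinent"] + delta * counts["missing"]) / n

# Sweep the assumption and watch how the estimated effect moves.
for delta in (0.0, 0.2, 0.5, 1.0):
    rd = (abstinence_rate(arms["intervention"], delta)
          - abstinence_rate(arms["control"], delta))
    print(f"delta={delta:.1f}: risk difference {rd:+.3f}")
```

    An analysis is then reported as robust if the conclusion holds across all plausible delta values, and fragile if it flips under moderate ones; richer versions allow delta to differ by arm.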

  17. A general method for handling missing binary outcome data in randomized controlled trials

    PubMed Central

    Jackson, Dan; White, Ian R; Mason, Dan; Sutton, Stephen

    2014-01-01

    Aims The analysis of randomized controlled trials with incomplete binary outcome data is challenging. We develop a general method for exploring the impact of missing data in such trials, with a focus on abstinence outcomes. Design We propose a sensitivity analysis where standard analyses, which could include ‘missing = smoking’ and ‘last observation carried forward’, are embedded in a wider class of models. Setting We apply our general method to data from two smoking cessation trials. Participants A total of 489 and 1758 participants from two smoking cessation trials. Measurements The abstinence outcomes were obtained using telephone interviews. Findings The estimated intervention effects from both trials depend on the sensitivity parameters used. The findings differ considerably in magnitude and statistical significance under quite extreme assumptions about the missing data, but are reasonably consistent under more moderate assumptions. Conclusions A new method for undertaking sensitivity analyses when handling missing data in trials with binary outcomes allows a wide range of assumptions about the missing data to be assessed. In two smoking cessation trials the results were insensitive to all but extreme assumptions. PMID:25171441

  18. Random forests ensemble classifier trained with data resampling strategy to improve cardiac arrhythmia diagnosis.

    PubMed

    Ozçift, Akin

    2011-05-01

    Supervised classification algorithms are commonly used in the design of computer-aided diagnosis systems. In this study, we present a resampling-strategy-based Random Forests (RF) ensemble classifier to improve diagnosis of cardiac arrhythmia. Random Forests is an ensemble classifier that consists of many decision trees and outputs the class that is the mode of the classes output by the individual trees. In this way, an RF ensemble classifier performs better than a single tree in terms of classification performance. In general, multiclass datasets with an unbalanced distribution of sample sizes are difficult to analyze in terms of class discrimination. Cardiac arrhythmia is such a dataset, with multiple classes of small sample size, and it is therefore a suitable test case for our resampling-based training strategy. The dataset contains 452 samples in fourteen types of arrhythmia, and eleven of these classes have fewer than 15 samples. Our diagnosis strategy consists of two parts: (i) a correlation-based feature selection algorithm is used to select relevant features from the cardiac arrhythmia dataset; (ii) the RF machine learning algorithm is used to evaluate the performance of the selected features with and without simple random sampling, to assess the efficiency of the proposed training strategy. The resulting accuracy of the classifier is 90.0%, which is a high diagnostic performance for cardiac arrhythmia. Furthermore, three case studies, i.e., thyroid, cardiotocography and audiology, are used to benchmark the effectiveness of the proposed method. The results of the experiments demonstrate the efficiency of the random sampling strategy in training the RF ensemble classification algorithm. Copyright © 2011 Elsevier Ltd. All rights reserved.
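
    The resampling-plus-RF training strategy can be sketched with scikit-learn. The data below are synthetic stand-ins for the unbalanced arrhythmia set (one large class, several tiny ones); the over-sampling scheme (simple random over-sampling of each minority class to the majority size) is one plausible reading of "simple random sampling", not necessarily the paper's exact procedure:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.utils import resample

rng = np.random.default_rng(1)

# Hypothetical unbalanced multiclass data: class 0 is common,
# classes 1-3 each have very few samples.
X = rng.normal(size=(300, 8))
y = np.r_[np.zeros(270, int), np.full(10, 1), np.full(10, 2), np.full(10, 3)]
X[y > 0] += y[y > 0][:, None]          # give minority classes some signal

# Random over-sampling of each minority class up to the majority size.
parts_X, parts_y = [X[y == 0]], [y[y == 0]]
for c in (1, 2, 3):
    Xc, yc = resample(X[y == c], y[y == c], replace=True,
                      n_samples=270, random_state=0)
    parts_X.append(Xc)
    parts_y.append(yc)
Xb, yb = np.vstack(parts_X), np.concatenate(parts_y)

# RF trained on the balanced set; predictions are the mode over trees.
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xb, yb)
vals, counts = np.unique(yb, return_counts=True)
print(counts)  # → [270 270 270 270]
```

    Without the resampling step, the forest would be dominated by the majority class and the rare arrhythmia classes would rarely be predicted at all.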

  19. Using published data in Mendelian randomization: a blueprint for efficient identification of causal risk factors.

    PubMed

    Burgess, Stephen; Scott, Robert A; Timpson, Nicholas J; Davey Smith, George; Thompson, Simon G

    2015-07-01

    Finding individual-level data for adequately powered Mendelian randomization analyses may be problematic. As publicly-available summarized data on genetic associations with disease outcomes from large consortia are becoming more abundant, use of published data is an attractive analysis strategy for obtaining precise estimates of the causal effects of risk factors on outcomes. We detail the necessary steps for conducting Mendelian randomization investigations using published data, and present novel statistical methods for combining data on the associations of multiple (correlated or uncorrelated) genetic variants with the risk factor and outcome into a single causal effect estimate. A two-sample analysis strategy may be employed, in which evidence on the gene-risk factor and gene-outcome associations are taken from different data sources. These approaches allow the efficient identification of risk factors that are suitable targets for clinical intervention from published data, although the ability to assess the assumptions necessary for causal inference is diminished. Methods and guidance are illustrated using the example of the causal effect of serum calcium levels on fasting glucose concentrations. The estimated causal effect of a 1 standard deviation (0.13 mmol/L) increase in calcium levels on fasting glucose (mM) using a single lead variant from the CASR gene region is 0.044 (95% credible interval -0.002, 0.100). In contrast, using our method to account for the correlation between variants, the corresponding estimate using 17 genetic variants is 0.022 (95% credible interval 0.009, 0.035), a more clearly positive causal effect.
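
    The core summary-data calculation can be sketched as an inverse-variance weighted (IVW) estimate, with a generalised weight matrix to account for correlation between variants in the spirit the abstract describes; the per-variant summary statistics and correlation matrix below are hypothetical:

```python
import numpy as np

def ivw_estimate(bx, by, se_y, rho=None):
    """Inverse-variance weighted causal estimate from summary statistics.

    bx   : per-variant gene-risk factor associations
    by   : per-variant gene-outcome associations
    se_y : standard errors of by
    rho  : correlation matrix between variants (None = uncorrelated)

    Sketch of the generalised weighted regression through the origin:
    beta = bx' S^-1 by / (bx' S^-1 bx), with S_ij = se_i * se_j * rho_ij.
    """
    bx, by, se_y = map(np.asarray, (bx, by, se_y))
    if rho is None:
        rho = np.eye(len(bx))
    S = np.outer(se_y, se_y) * rho
    w = np.linalg.solve(S, bx)
    return float(w @ by) / float(w @ bx)

# Hypothetical two-sample summary data for three variants in one region.
bx = [0.10, 0.20, 0.15]
by = [0.011, 0.019, 0.016]
se = [0.010, 0.010, 0.012]
rho = np.array([[1.0, 0.3, 0.1],
                [0.3, 1.0, 0.2],
                [0.1, 0.2, 1.0]])

print(round(ivw_estimate(bx, by, se), 3))        # uncorrelated weights
print(round(ivw_estimate(bx, by, se, rho), 3))   # correlation-adjusted
```

    Ignoring the correlation (treating correlated variants as independent) overstates the effective sample size and gives overly narrow intervals, which is why the correlation-adjusted estimate in the abstract differs from the single-variant one.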

  20. Prenatal zinc supplementation of zinc-adequate rats adversely affects immunity in offspring

    USDA-ARS?s Scientific Manuscript database

    We previously showed that zinc (Zn) supplementation of Zn-adequate dams induced immunosuppressive effects that persist in the offspring after weaning. We investigated whether the immunosuppressive effects were due to in utero exposure and/or mediated via milk using a cross-fostering design. Pregnant...

  1. The need of adequate information to achieve total compliance of mass drug administration in Pekalongan

    NASA Astrophysics Data System (ADS)

    Ginandjar, Praba; Saraswati, Lintang Dian; Taufik, Opik; Nurjazuli; Widjanarko, Bagoes

    2017-02-01

    The World Health Organization (WHO) initiated the Global Programme to Eliminate Lymphatic Filariasis (LF) through mass drug administration (MDA). Pekalongan started MDA in 2011, yet LF prevalence in 2015 still exceeded the 1% threshold. This study aimed to describe the inhibiting factors related to compliance with MDA at the community level. This was a rapid survey with a cross-sectional approach. Two-stage random sampling was used. In the first stage, 25 clusters were randomly selected from 27 villages by the probability-proportionate-to-size (PPS) method (C-Survey). In the second stage, 10 subjects were randomly selected from each cluster, yielding 250 respondents from the 25 selected clusters. Variables consisted of MDA coverage, practice of taking medication during MDA, and enabling and inhibiting factors to MDA at the community level. The results showed that most respondents had poor knowledge of filariasis, which influenced awareness of the disease. Health-illness perception, not receiving the drugs, lactation, side effects, and the size of the drugs were the dominant factors of non-compliance with MDA. MDA information and community empowerment are needed to improve MDA coverage. Further study to explore an appropriate model of socialization would support the success of the MDA program.
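
    The two-stage design (25 clusters drawn PPS from 27 villages, then 10 subjects per cluster) can be sketched with the standard library; the village populations and sampling frames below are hypothetical:

```python
import random

random.seed(7)

# Hypothetical populations for the 27 villages in the survey frame.
villages = {f"village_{i:02d}": random.randint(500, 5000) for i in range(27)}

# Stage 1: draw 25 clusters with probability proportionate to size (PPS).
# PPS draws are with replacement, so large villages can appear more than once.
names = list(villages)
weights = [villages[v] for v in names]
clusters = random.choices(names, weights=weights, k=25)

# Stage 2: a simple random sample of 10 subjects in each selected cluster.
sample = {}
for i, v in enumerate(clusters):
    frame = range(villages[v])              # hypothetical listing of residents
    sample[f"cluster_{i:02d}_{v}"] = random.sample(frame, 10)

print(len(clusters), sum(len(s) for s in sample.values()))  # → 25 250
```

    This PPS-then-fixed-take design gives every resident approximately the same overall inclusion probability, which is what makes the 250 respondents roughly self-weighting.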

  2. On Convergent Probability of a Random Walk

    ERIC Educational Resources Information Center

    Lee, Y.-F.; Ching, W.-K.

    2006-01-01

    This note introduces an interesting random walk on a straight path with cards of random numbers. The method of recurrent relations is used to obtain the convergent probability of the random walk with different initial positions.
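
    The recurrence-relation approach can be illustrated on the standard absorption problem for a walk on positions 0..n; the boundary setup here is an assumed example, not necessarily the note's exact game. The recurrence u(k) = p·u(k+1) + (1-p)·u(k-1) with u(0)=0, u(n)=1 has successive differences forming a geometric sequence with ratio q/p, which yields the closed form used below:

```python
from fractions import Fraction

def reach_right_first(n, start, p):
    """P(a random walk on 0..n, started at `start`, hits n before 0).

    Solves u(k) = p*u(k+1) + (1-p)*u(k-1), u(0)=0, u(n)=1.
    With r = (1-p)/p, telescoping the differences gives
    u(k) = (1 - r**k) / (1 - r**n), and u(k) = k/n when p = 1/2.
    """
    p = Fraction(p)
    r = (1 - p) / p
    if r == 1:                      # symmetric walk
        return Fraction(start, n)
    return (1 - r**start) / (1 - r**n)

print(reach_right_first(10, 3, Fraction(1, 2)))   # → 3/10
print(reach_right_first(10, 3, Fraction(2, 3)))   # right-biased walk
```

    Exact rational arithmetic with `Fraction` keeps the convergent probabilities free of floating-point noise, which is convenient when checking a recurrence by hand.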

  3. Video on Diet Before Outpatient Colonoscopy Does Not Improve Quality of Bowel Preparation: A Prospective, Randomized, Controlled Trial.

    PubMed

    Rice, Sean C; Higginbotham, Tina; Dean, Melanie J; Slaughter, James C; Yachimski, Patrick S; Obstein, Keith L

    2016-11-01

    Successful outpatient colonoscopy (CLS) depends on many factors including the quality of a patient's bowel preparation. Although education on consumption of the pre-CLS purgative can improve bowel preparation quality, no study has evaluated dietary education alone. We have created an educational video on pre-CLS dietary instructions to determine whether dietary education would improve outpatient bowel preparation quality. A prospective randomized, blinded, controlled study of patients undergoing outpatient CLS was performed. All patients received a 4 l polyethylene glycol-based split-dose bowel preparation and standard institutional pre-procedure instructions. Patients were then randomly assigned to an intervention arm or to a no intervention arm. A 4-min educational video detailing clear liquid diet restriction was made available to patients in the intervention arm, whereas those randomized to no intervention did not have access to the video. Patients randomized to the video were provided with the YouTube video link 48-72 h before CLS. An attending endoscopist blinded to randomization performed the CLS. Bowel preparation quality was scored using the Boston Bowel Preparation Scale (BBPS). Adequate preparation was defined as a BBPS total score of ≥6 with all segment scores ≥2. Wilcoxon rank-sum and Pearson's χ²-tests were performed to assess differences between groups. Ninety-two patients were randomized (video: n=42; control: n=50) with 47 total video views being tallied. There were no demographic differences between groups. There was no statistically significant difference in adequate preparation between groups (video=74%; control=68%; P=0.54). The availability of a supplementary patient educational video on clear liquid diet alone was insufficient to improve bowel preparation quality when compared with standard pre-procedure instruction at our institution.
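
    The adequacy rule the trial uses (BBPS total ≥6 and every segment ≥2) is simple to express directly; the segment scores below are illustrative examples:

```python
def bbps_adequate(segment_scores):
    """Boston Bowel Preparation Scale adequacy rule from the trial:
    total score >= 6 AND every segment score >= 2.
    segment_scores: the three colon-segment scores, each 0-3."""
    return sum(segment_scores) >= 6 and min(segment_scores) >= 2

print(bbps_adequate([3, 2, 2]))   # → True
print(bbps_adequate([3, 3, 1]))   # → False (one segment below 2)
```

    Note that the two conditions are not redundant: [3, 3, 1] reaches a total of 7 yet fails the per-segment requirement.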

  4. Acupuncture for alcohol withdrawal: a randomized controlled trial.

    PubMed

    Trümpler, François; Oez, Suzan; Stähli, Peter; Brenner, Hans Dieter; Jüni, Peter

    2003-01-01

    Previous trials on acupuncture in alcohol addiction were in outpatients and focused on relapse prevention. Rates of dropout were high and interpretation of results difficult. We compared auricular laser and needle acupuncture with sham laser stimulation in reducing the duration of alcohol withdrawal. Inpatients undergoing alcohol withdrawal were randomly allocated to laser acupuncture (n = 17), needle acupuncture (n = 15) or sham laser stimulation (n = 16). Attempts were made to blind patients, therapists and outcome assessors, but this was not feasible for needle acupuncture. The duration of withdrawal symptoms (as assessed using a nurse-rated scale) was the primary outcome; the duration of sedative prescription was the secondary outcome. Patients randomized to laser and sham laser had identical withdrawal symptom durations (median 4 days). Patients randomized to needle stimulation had a shorter duration of withdrawal symptoms (median 3 days; P = 0.019 versus sham intervention), and tended to have a shorter duration of sedative use, but these differences diminished after adjustment for baseline differences. The data from this pilot trial do not suggest a relevant benefit of auricular laser acupuncture for alcohol withdrawal. A larger trial including adequate sham interventions is needed, however, to reliably determine the effectiveness of any type of auricular acupuncture in this condition.

  5. 46 CFR 16.230 - Random testing requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... programs for the chemical testing for dangerous drugs on a random basis of crewmembers on inspected vessels... establish programs for the chemical testing for dangerous drugs on a random basis of crewmembers on... random drug testing shall be made by a scientifically valid method, such as a random number table or a...
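
The rule's "scientifically valid method, such as a random number table or a computer-based random number generator" can be sketched with a seeded pseudorandom draw; this example is hypothetical and not part of the regulation's text:

```python
import random

def select_for_testing(crew_roster, annual_rate, seed=None):
    """Randomly select crewmembers at the given testing rate so that every
    member has an equal chance of selection -- a computer-based random
    number generator, one of the methods the rule permits."""
    rng = random.Random(seed)
    n = round(len(crew_roster) * annual_rate)
    return rng.sample(crew_roster, n)  # sampling without replacement

crew = [f"crew-{i:03d}" for i in range(200)]
selected = select_for_testing(crew, annual_rate=0.25, seed=42)
print(len(selected))  # 50 crewmembers selected for this period
```

Seeding is shown only to make the example reproducible; an operational program would not fix the seed.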

  6. 46 CFR 16.230 - Random testing requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... programs for the chemical testing for dangerous drugs on a random basis of crewmembers on inspected vessels... establish programs for the chemical testing for dangerous drugs on a random basis of crewmembers on... random drug testing shall be made by a scientifically valid method, such as a random number table or a...

  7. 46 CFR 16.230 - Random testing requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... programs for the chemical testing for dangerous drugs on a random basis of crewmembers on inspected vessels... establish programs for the chemical testing for dangerous drugs on a random basis of crewmembers on... random drug testing shall be made by a scientifically valid method, such as a random number table or a...

  8. 46 CFR 16.230 - Random testing requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... programs for the chemical testing for dangerous drugs on a random basis of crewmembers on inspected vessels... establish programs for the chemical testing for dangerous drugs on a random basis of crewmembers on... random drug testing shall be made by a scientifically valid method, such as a random number table or a...

  9. 46 CFR 16.230 - Random testing requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... programs for the chemical testing for dangerous drugs on a random basis of crewmembers on inspected vessels... establish programs for the chemical testing for dangerous drugs on a random basis of crewmembers on... random drug testing shall be made by a scientifically valid method, such as a random number table or a...

  10. Use of Linear Programming to Develop Cost-Minimized Nutritionally Adequate Health Promoting Food Baskets

    PubMed Central

    Tetens, Inge; Dejgård Jensen, Jørgen; Smed, Sinne; Gabrijelčič Blenkuš, Mojca; Rayner, Mike; Darmon, Nicole; Robertson, Aileen

    2016-01-01

    Background: Food-Based Dietary Guidelines (FBDGs) are developed to promote healthier eating patterns, but increasing food prices may make healthy eating less affordable. The aim of this study was to design a range of cost-minimized, nutritionally adequate, health-promoting food baskets (FBs) that help prevent both micronutrient inadequacy and diet-related non-communicable diseases at the lowest cost. Methods: Average prices for 312 foods were collected within the Greater Copenhagen area. The cost and nutrient content of five different cost-minimized FBs for a family of four were calculated per day using linear programming. The FBs were defined using five different constraints: cultural acceptability (CA), dietary guidelines (DG), nutrient recommendations (N), cultural acceptability and nutrient recommendations (CAN), or dietary guidelines and nutrient recommendations (DGN). The variety and number of foods in each of the resulting five baskets were increased by limiting the relative share of individual foods. Results: The one-day version of N contained only 12 foods at a minimum cost of DKK 27 (€3.6). The CA, DG, and DGN baskets cost about twice this, and the CAN basket cost ~DKK 81 (€10.8). The baskets with the greater variety of foods contained from 70 (CAN) to 134 (DGN) foods and cost between DKK 60 (€8.1, N) and DKK 125 (€16.8, DGN). Ensuring that the food baskets cover both dietary guidelines and nutrient recommendations doubled the cost, while adding cultural acceptability (CAN) tripled it. Conclusion: Use of linear programming facilitates the generation of low-cost food baskets that are nutritionally adequate, health-promoting, and culturally acceptable. PMID:27760131
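
The underlying optimization, minimizing basket cost subject to nutrient minimums, can be illustrated with a toy brute-force search; the foods, prices, and nutrient values below are invented, and the study itself solved a proper linear program over 312 foods:

```python
# Toy stand-in for the paper's linear-programming model: choose unit counts
# of two invented foods to meet energy and protein minimums at least cost.
def cheapest_basket(foods, need, max_units=40):
    """foods: name -> (price, energy, protein) per unit; need: minimums.
    Returns (cost, quantities) for the cheapest feasible integer-unit basket."""
    names = list(foods)
    best = None
    for a in range(max_units + 1):
        for b in range(max_units + 1):
            qty = {names[0]: a, names[1]: b}
            cost = sum(foods[f][0] * q for f, q in qty.items())
            energy = sum(foods[f][1] * q for f, q in qty.items())
            protein = sum(foods[f][2] * q for f, q in qty.items())
            if energy >= need["energy"] and protein >= need["protein"]:
                if best is None or cost < best[0]:
                    best = (cost, qty)
    return best

# Invented per-unit data: (price in DKK, energy in kJ, protein in g)
foods = {"oats": (0.30, 1500, 13.0), "milk": (0.80, 270, 3.4)}
cost, quantities = cheapest_basket(foods, {"energy": 9000, "protein": 60.0})
print(round(cost, 2), quantities)  # cheapest feasible basket and its cost
```

With real data the search space is far too large for enumeration, which is why the study used linear programming; the objective and constraints are the same shape.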

  11. Randomized controlled trial of internal and external targeted temperature management methods in post- cardiac arrest patients.

    PubMed

    Look, Xinqi; Li, Huihua; Ng, Mingwei; Lim, Eric Tien Siang; Pothiawala, Sohil; Tan, Kenneth Boon Kiat; Sewa, Duu Wen; Shahidah, Nur; Pek, Pin Pin; Ong, Marcus Eng Hock

    2018-01-01

    Targeted temperature management post-cardiac arrest is currently implemented using various methods, broadly categorized as internal and external. This study aimed to evaluate survival-to-hospital discharge and neurological outcomes (Glasgow-Pittsburgh Score) of post-cardiac arrest patients undergoing internal cooling versus external cooling. A randomized controlled trial of post-resuscitation cardiac arrest patients was conducted from October 2008-September 2014. Patients were randomized to either internal or external cooling methods. Historical controls were selected, matched by age and gender. Analysis using SPSS version 21.0 presented descriptive statistics and frequencies, while univariate logistic regression was done using R 3.1.3. 23 patients were randomized to internal cooling, 22 patients to external cooling, and 42 matched controls were selected. No significant difference was seen between internal and external cooling in terms of survival, neurological outcomes, and complications. However, in the internal cooling arm, there was a lower risk of developing overcooling (p=0.01) and rebound hyperthermia (p=0.02). Compared to normothermia, internal cooling had higher survival (OR=3.36, 95% CI=1.13-10.41) and a lower risk of developing cardiac arrhythmias (OR=0.18, 95% CI=0.04-0.63). Subgroup analysis showed that those with a cardiac cause of arrest (OR=4.29, 95% CI=1.26-15.80) and sustained ROSC (OR=5.50, 95% CI=1.64-20.39) had better survival with internal cooling compared to normothermia. Cooling curves showed tighter temperature control for internal compared to external cooling. Internal cooling can potentially provide better survival-to-hospital discharge outcomes and reduce cardiac arrhythmia complications in carefully selected patients as compared to normothermia. Copyright © 2017. Published by Elsevier Inc.

  12. Prospective, randomized comparison between pulsatile GnRH therapy and combined gonadotropin (FSH+LH) treatment for ovulation induction in women with hypothalamic amenorrhea and underlying polycystic ovary syndrome.

    PubMed

    Dubourdieu, Sophie; Fréour, Thomas; Dessolle, Lionel; Barrière, Paul

    2013-05-01

    To compare the efficacy of pulsatile GnRH therapy versus combined gonadotropins for ovulation induction in women with both hypothalamic amenorrhoea and polycystic ovarian syndrome (HA/PCOS) according to their current hypothalamic status. This single-centre, prospective, randomized study was conducted in the Nantes University Hospital, France. Thirty consecutive patients were treated for ovulation induction with either pulsatile GnRH therapy or combined gonadotropins (rFSH+rLH). Frequency of adequate ovarian response (mono- or bi-follicular) and clinical pregnancy rate were then compared between both groups. Ovarian response was similar in both groups with comparable frequency of adequate ovarian response (73% vs 60%), but the clinical pregnancy rate was significantly higher in the pulsatile GnRH therapy group than in the combined gonadotropin group (46% vs 0%). HA/PCOS is a specific subgroup of infertile women. Pulsatile GnRH therapy is an effective and safe method of ovulation induction that can be used successfully in these patients. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  13. [Quantitative and qualitative research methods, can they coexist yet?].

    PubMed

    Hunt, Elena; Lavoie, Anne-Marise

    2011-06-01

    Qualitative design is gaining ground in nursing research. In spite of this relative progress, however, the evidence-based practice movement continues to dominate and to underline the exclusive value of quantitative design (particularly that of randomized clinical trials) for clinical decision making. In the current context, with decision makers favoring utilitarian approaches on one hand and nursing criticism of the establishment in favor of qualitative research on the other, it is difficult to choose a practical and ethical path that values the nursing role within the health care system, keeps us committed to quality care, and maintains the researcher's integrity. Both qualitative and quantitative methods have advantages and disadvantages, and clearly, neither of them can, by itself, capture, describe, and explain reality adequately. Therefore, a balance between the two methods is needed. Researchers bear a responsibility to society and science, and they should opt for the design best suited to answering the research question, not promote the design favored by research funding distributors.

  14. A Mixed-Methods Randomized Controlled Trial of Financial Incentives and Peer Networks to Promote Walking among Older Adults

    ERIC Educational Resources Information Center

    Kullgren, Jeffrey T.; Harkins, Kristin A.; Bellamy, Scarlett L.; Gonzales, Amy; Tao, Yuanyuan; Zhu, Jingsan; Volpp, Kevin G.; Asch, David A.; Heisler, Michele; Karlawish, Jason

    2014-01-01

    Background: Financial incentives and peer networks could be delivered through eHealth technologies to encourage older adults to walk more. Methods: We conducted a 24-week randomized trial in which 92 older adults with a computer and Internet access received a pedometer, daily walking goals, and weekly feedback on goal achievement. Participants…

  15. Smoke alarm tests may not adequately indicate smoke alarm function.

    PubMed

    Peek-Asa, Corinne; Yang, Jingzhen; Hamann, Cara; Young, Tracy

    2011-01-01

    Smoke alarms are one of the most promoted prevention strategies to reduce residential fire deaths, and they can reduce residential fire deaths by half. Smoke alarm function can be measured by two tests: the smoke alarm button test and the chemical smoke test. Using results from a randomized trial of smoke alarms, we compared smoke alarm response to the button test and the smoke test. The smoke alarms found in the study homes at baseline were tested, as well as study alarms placed into homes as part of the randomized trial. Study alarms were tested at 12 and 42 months postinstallation. The proportion of alarms that passed the button test but not the smoke test ranged from 0.5 to 5.8% of alarms; this result was found most frequently among ionization alarms with zinc or alkaline batteries. These alarms would indicate to the owner (through the button test) that the smoke alarm was working, but the alarm would not actually respond in the case of a fire (as demonstrated by failing the smoke test). The proportion of alarms that passed the smoke test but not the button test ranged from 1.0 to 3.0%. These alarms would appear nonfunctional to the owner (because the button test failed), even though the alarm would operate in response to a fire (as demonstrated by passing the smoke test). The general public is not aware of the potential for inaccuracy in smoke alarm tests, and burn professionals can advocate for enhanced testing methods. The optimal test to determine smoke alarm function is the chemical smoke test.

  16. A Novel Method of Newborn Chest Compression: A Randomized Crossover Simulation Study.

    PubMed

    Smereka, Jacek; Szarpak, Lukasz; Ladny, Jerzy R; Rodriguez-Nunez, Antonio; Ruetzler, Kurt

    2018-01-01

    Objective: To compare a novel two-thumb chest compression technique with standard techniques during newborn resuscitation performed by novice physicians, in terms of median depth of chest compressions, degree of full chest recoil, and effective compression efficacy. Patients and Methods: A total of 74 novice physicians with less than 1 year of work experience participated in the study. They performed chest compressions using three techniques: (A) the new two-thumb technique (nTTT), a novel method of chest compression in an infant that consists in directing two thumbs at a 90° angle to the chest while closing the fingers of both hands into a fist; (B) TFT, in which the rescuer compresses the sternum with the tips of two fingers; and (C) TTHT, in which two thumbs are placed over the lower third of the sternum, with the fingers encircling the torso and supporting the back. Results: The median depth of chest compressions was 3.8 (IQR, 3.7-3.9) cm for nTTT, 2.1 (IQR, 1.7-2.5) cm for TFT, and 3.6 (IQR, 3.5-3.8) cm for TTHT. There was a significant difference between nTTT and TFT, and between TTHT and TFT (p < 0.001), for each time interval during resuscitation. The degree of full chest recoil was 93% (IQR, 91-97) for nTTT, 99% (IQR, 96-100) for TFT, and 90% (IQR, 74-91) for TTHT. There was a statistically significant difference in the degree of complete chest relaxation between nTTT and TFT (p < 0.001), between nTTT and TTHT (p = 0.016), and between TFT and TTHT (p < 0.001). Conclusion: The median chest compression depth for nTTT and TTHT is significantly higher than that for TFT. The degree of full chest recoil was highest for TFT, followed by nTTT and TTHT. The effective compression efficiency with nTTT was higher than with TTHT and TFT. Our novel newborn chest compression method provided, in this manikin study, adequate chest compression depth and degree of full chest recoil, as well as very good effective compression efficiency. Further clinical studies are necessary to

  17. General physicians do not take adequate travel histories.

    PubMed

    Price, Victoria A; Smith, Rachel A S; Douthwaite, Sam; Thomas, Sherine; Almond, D Solomon; Miller, Alastair R O; Beeching, Nicholas J; Thompson, Gail; Ustianowski, Andrew; Beadsworth, Mike B J

    2011-01-01

    Our aim was to document how often travel histories were taken and the quality of their content. Patients admitted over 2 months to acute medical units of two hospitals in the Northwest of England with a history of fever, rash, diarrhea, vomiting, jaundice, or presenting as "unwell post-travel" were identified. The initial medical clerking was assessed. A total of 132 relevant admissions were identified. A travel history was documented in only 26 patients (19.7%). Of the 16 patients who had traveled, there was no documentation of pretravel advice or of sexual/other activities abroad in 15 (93.8%) and 12 (75.0%) patients, respectively. There needs to be better awareness and education about travel-related illness and the importance of taking an adequate travel history. © 2011 International Society of Travel Medicine.

  18. Region 8: Colorado Denver, Pagosa Springs and Telluride Adequate Letter (8/18/2000)

    EPA Pesticide Factsheets

    This letter from EPA to the Colorado Department of Public Health and Environment determined the Motor Vehicle Emissions Budgets in Denver's Carbon Monoxide (CO) maintenance plan and in Pagosa Springs' and Telluride's Particulate Matter (PM10) maintenance plans to be adequate.

  19. Analysis of training sample selection strategies for regression-based quantitative landslide susceptibility mapping methods

    NASA Astrophysics Data System (ADS)

    Erener, Arzu; Sivas, A. Abdullah; Selcuk-Kestel, A. Sevtap; Düzgün, H. Sebnem

    2017-07-01

    All quantitative landslide susceptibility mapping (QLSM) methods require two basic data types, namely a landslide inventory and the factors that influence landslide occurrence (landslide influencing factors, LIF). Depending on the type of landslides, the nature of triggers, and the LIF, the accuracy of QLSM methods differs. Moreover, how to balance the number of 0's (non-occurrence) and 1's (occurrence) in the training set obtained from the landslide inventory, and how to select which of the 1's and 0's to include in QLSM models, play a critical role in the accuracy of the QLSM. Although the performance of various QLSM methods has been extensively investigated in the literature, the challenge of training set construction has not been adequately investigated for these methods. In order to tackle this challenge, in this study three different training set selection strategies, along with the original data set, are used to test the performance of three different regression methods, namely Logistic Regression (LR), Bayesian Logistic Regression (BLR), and Fuzzy Logistic Regression (FLR). The first sampling strategy is proportional random sampling (PRS), which takes into account a weighted selection of landslide occurrences in the sample set. The second method, non-selective nearby sampling (NNS), includes randomly selected sites and their surrounding neighboring points at certain preselected distances to include the impact of clustering. Selective nearby sampling (SNS) is the third method, which concentrates on the group of 1's and their surrounding neighborhood; a randomly selected group of landslide sites and their neighborhood are considered in the analyses, similar to the NNS parameters. It is found that the LR-PRS, FLR-PRS, and BLR-Whole Data set-ups, in that order, yield the best fits among the alternatives. The results indicate that in QLSM based on regression models, avoidance of spatial correlation in the data set is critical for model performance.
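
The balancing idea behind proportional random sampling, drawing a training set with a controlled share of 1's (occurrences) and 0's (non-occurrences), can be sketched as follows; function and variable names are illustrative, and the paper's weighted selection is more involved than this:

```python
import random

def proportional_random_sample(points, n, ratio_ones=0.5, seed=None):
    """Draw a training set of n points with a fixed share of landslide
    occurrences (label 1) vs. non-occurrences (label 0) -- a minimal
    sketch of the PRS idea. points: iterable of (features, label) pairs."""
    rng = random.Random(seed)
    ones = [p for p in points if p[1] == 1]
    zeros = [p for p in points if p[1] == 0]
    n_ones = round(n * ratio_ones)
    sample = rng.sample(ones, n_ones) + rng.sample(zeros, n - n_ones)
    rng.shuffle(sample)  # avoid label-ordered training data
    return sample

# 100 occurrence cells among 1000 candidate cells (invented data)
cells = [((i,), 1) for i in range(100)] + [((i,), 0) for i in range(900)]
train = proportional_random_sample(cells, 200, ratio_ones=0.5, seed=1)
print(sum(lab for _, lab in train))  # 100 -> balanced 1's and 0's
```

Without such balancing, a random draw from an inventory with ~10% occurrences would leave the regression dominated by 0's.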

  20. Social Franchising and a Nationwide Mass Media Campaign Increased the Prevalence of Adequate Complementary Feeding in Vietnam: A Cluster-Randomized Program Evaluation.

    PubMed

    Rawat, Rahul; Nguyen, Phuong Hong; Tran, Lan Mai; Hajeebhoy, Nemat; Nguyen, Huan Van; Baker, Jean; Frongillo, Edward A; Ruel, Marie T; Menon, Purnima

    2017-04-01

    Background: Rigorous evaluations of health system-based interventions in large-scale programs to improve complementary feeding (CF) practices are limited. Alive & Thrive applied principles of social franchising within the government health system in Vietnam to improve the quality of interpersonal counseling (IPC) for infant and young child feeding combined with a national mass media (MM) campaign and community mobilization (CM). Objective: We evaluated the impact of enhanced IPC + MM + CM (intensive) compared with standard IPC + less-intensive MM and CM (nonintensive) on CF practices and anthropometric indicators. Methods: A cluster-randomized, nonblinded evaluation design with cross-sectional surveys (n = ∼500 children aged 6-23.9 mo and ∼1000 children aged 24-59.9 mo/group) implemented at baseline (2010) and endline (2014) was used. Difference-in-difference estimates (DDEs) of impact were calculated for intent-to-treat (ITT) analyses and modified per-protocol analyses (MPAs; mothers who attended the social franchising at least once: 62%). Results: Groups were similar at baseline. In ITT analyses, there were no significant differences between groups in changes in CF practices over time. In the MPAs, greater improvements in the intensive than in the nonintensive group were seen for minimum dietary diversity [DDE: 6.4 percentage points (pps); P < 0.05] and minimum acceptable diet (8.0 pps; P < 0.05). Significant stunting declines occurred in both intensive (7.1 pps) and nonintensive (5.4 pps) groups among children aged 24-59.9 mo, with no differential decline. Conclusions: When combined with MM and CM, an at-scale social franchising approach to improve IPC, delivered through the existing health care system, significantly improved CF practices, but not child growth, among mothers who used counseling services at least once. A greater impact may be achieved with strategies designed to increase service utilization. This trial was registered at clinicaltrials.gov as
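
The difference-in-difference estimate (DDE) reduces to simple arithmetic: the endline-minus-baseline change in the intensive group minus the same change in the nonintensive group. A sketch with invented prevalences (the evaluation itself used survey-adjusted models):

```python
def did_estimate(treat_base, treat_end, ctrl_base, ctrl_end):
    """Difference-in-difference estimate: the change in the intervention
    group minus the change in the comparison group, in the same units as
    the inputs (here, percentage points)."""
    return (treat_end - treat_base) - (ctrl_end - ctrl_base)

# Invented prevalences (%) chosen only to illustrate the arithmetic:
print(round(did_estimate(60.0, 75.0, 61.0, 69.6), 1))  # 6.4 percentage points
```

Subtracting the comparison group's change nets out secular trends, which is why both groups improving over time need not imply program impact.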

  1. The Recruitment Experience of a Randomized Clinical Trial to Aid Young Adult Smokers to Stop Smoking without Weight Gain with Interactive Technology.

    PubMed

    Coday, Mace; Richey, Phyllis; Thomas, Fridtjof; Tran, Quynh T; Terrell, Sarah B; Tylavsky, Fran; Miro, Danielle; Caufield, Margaret; Johnson, Karen C

    2016-04-15

    Multiple recruitment strategies are often needed to recruit an adequate number of participants, especially from hard-to-reach groups. Technology-based recruitment methods hold promise as a more robust way of reaching and enrolling historically hard-to-reach young adults. The TARGIT study is a randomized, two-arm clinical trial in young adults that uses interactive technology to test an efficacious proactive telephone Quitline versus the Quitline plus a behavioral weight-management intervention, focusing on smoking cessation and weight change. All randomized participants in the TARGIT study were required to be young adult smokers (18-35 years) who reported smoking at least 10 cigarettes per day, had a BMI < 40 kg/m², and were willing to stop smoking and not gain weight. Traditional recruitment methods were compared to technology-based strategies using standard descriptive statistics based on counts and proportions to describe the recruitment process from initial pre-screening (PS) to randomization into TARGIT. Participants at PS were majority Black (59.80%) and female (52.66%), normal weight or overweight (combined 62.42%), 29.5 years old on average, and smoked 18.4 cigarettes per day. There were differences between men and women with respect to reasons for ineligibility during PS (p < 0.001, ignoring gender-specific pregnancy-related ineligibility). TARGIT experienced a disproportionate loss of minorities during recruitment as well as a prolonged recruitment period, due to either study ineligibility or non-completion of screening activities. Recruitment into longer-term behavioral change intervention trials can be challenging, and multiple methods are often required to reach hard-to-reach groups.

  2. Region 8: Colorado Denver 2008 8-hour ozone Adequate Letter (4/2/2018)

    EPA Pesticide Factsheets

    This letter from EPA to the Colorado Department of Public Health and Environment determined the Metro-Denver/North Front Range ozone attainment plan and its 2017 Motor Vehicle Emissions Budgets adequate for transportation conformity; the finding will be announced in the Federal Register.

  3. Gaussian membership functions are most adequate in representing uncertainty in measurements

    NASA Technical Reports Server (NTRS)

    Kreinovich, V.; Quintana, C.; Reznik, L.

    1992-01-01

    In rare situations, like fundamental physics, we perform experiments without knowing what their results will be. In the majority of real-life measurement situations, we more or less know beforehand what kind of results we will get. Of course, this is not precise knowledge of the type "the result will be between alpha - beta and alpha + beta," because in that case we would not need any measurements at all. It is usually knowledge best represented in uncertain terms, like "perhaps (or 'most likely', etc.) the measured value x is between alpha - beta and alpha + beta." Traditional statistical methods neglect this additional knowledge and process only the measurement results, so it is desirable to be able to process this uncertain knowledge as well. A natural way to process it is by using fuzzy logic. But there is a problem: we can use different membership functions to represent the same uncertain statements, and different functions lead to different results. Which membership function do we choose? In the present paper, we show that under some reasonable assumptions, Gaussian functions mu(x) = exp(-beta*x^2) are the most adequate choice of membership function for representing uncertainty in measurements. This representation was efficiently used in testing jet engines for airplanes and spaceships.
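
A minimal numerical sketch of a Gaussian membership function; the variant centered on an expected value alpha is an assumed generalization for illustration, not quoted from the paper:

```python
import math

def gaussian_membership(x, alpha=0.0, beta=1.0):
    """Gaussian membership mu(x) = exp(-beta * (x - alpha)**2): certainty is
    highest at the expected value alpha and falls off at a rate set by beta.
    With alpha = 0 this is the abstract's mu(x) = exp(-beta * x**2)."""
    return math.exp(-beta * (x - alpha) ** 2)

print(gaussian_membership(0.0))  # 1.0: full membership at the center
```

Larger beta encodes a tighter prior interval around the expected value, so the same statement "x is near alpha" can be made sharper or vaguer by one parameter.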

  4. The Relationship between Adequate Yearly Progress and the Quality of Professional Development

    ERIC Educational Resources Information Center

    Wolff, Lori A.; McClelland, Susan S.; Stewart, Stephanie E.

    2010-01-01

    Based on publicly available data, the study examined the relationship between adequate yearly progress status and teachers' perceptions of the quality of their professional development. The sample included responses of 5,558 teachers who completed the questionnaire in the 2005-2006 school year. Results of the statistical analysis show a…

  5. 36 CFR 13.960 - Who determines when there is adequate snow cover?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 36 Parks, Forests, and Public Property 1 2010-07-01 2010-07-01 false Who determines when there is adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR NATIONAL PARK SYSTEM UNITS IN ALASKA Special Regulations-Denali National Park and...

  6. 36 CFR 13.960 - Who determines when there is adequate snow cover?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 36 Parks, Forests, and Public Property 1 2014-07-01 2014-07-01 false Who determines when there is adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR NATIONAL PARK SYSTEM UNITS IN ALASKA Special Regulations-Denali National Park and...

  7. 36 CFR 13.960 - Who determines when there is adequate snow cover?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 36 Parks, Forests, and Public Property 1 2011-07-01 2011-07-01 false Who determines when there is adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR NATIONAL PARK SYSTEM UNITS IN ALASKA Special Regulations-Denali National Park and...

  8. Physical Principle for Generation of Randomness

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    2009-01-01

    A physical principle (more precisely, a principle that incorporates mathematical models used in physics) has been conceived as the basis of a method of generating randomness in Monte Carlo simulations. The principle eliminates the need for conventional random-number generators. The Monte Carlo simulation method is among the most powerful computational methods for solving high-dimensional problems in physics, chemistry, economics, and information processing. The Monte Carlo simulation method is especially effective for solving problems in which computational complexity increases exponentially with dimensionality. The main advantage of the Monte Carlo simulation method over other methods is that the demand on computational resources becomes independent of dimensionality. As augmented by the present principle, the Monte Carlo simulation method becomes an even more powerful computational method that is especially useful for solving problems associated with dynamics of fluids, planning, scheduling, and combinatorial optimization. The present principle is based on coupling of dynamical equations with the corresponding Liouville equation. The randomness is generated by non-Lipschitz instability of dynamics triggered and controlled by feedback from the Liouville equation. (In non-Lipschitz dynamics, the derivatives of solutions of the dynamical equations are not required to be bounded.)

  9. A functional renormalization method for wave propagation in random media

    NASA Astrophysics Data System (ADS)

    Lamagna, Federico; Calzetta, Esteban

    2017-08-01

    We develop the exact renormalization group approach as a way to evaluate the effective speed of propagation of a scalar wave in a medium with random inhomogeneities. We use the Martin-Siggia-Rose formalism to translate the problem into a nonequilibrium field theory one, and then consider a sequence of models with a progressively lower infrared cutoff; in the limit where the cutoff is removed, we recover the problem of interest. As a test of the formalism, we compute the effective dielectric constant of a homogeneous medium interspersed with randomly located, interpenetrating bubbles. A simple approximation to the renormalization group equations turns out to be equivalent to a self-consistent two-loop evaluation of the effective dielectric constant.

  10. 36 CFR 79.7 - Methods to fund curatorial services.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... available for adequate, long-term care and maintenance of collections. Those methods include, but are not..., expanding, operating, and maintaining a repository that has the capability to provide adequate long-term... with a repository that has the capability to provide adequate long-term curatorial services as set...

  11. 36 CFR 79.7 - Methods to fund curatorial services.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... available for adequate, long-term care and maintenance of collections. Those methods include, but are not..., expanding, operating, and maintaining a repository that has the capability to provide adequate long-term... with a repository that has the capability to provide adequate long-term curatorial services as set...

  12. 36 CFR 79.7 - Methods to fund curatorial services.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... available for adequate, long-term care and maintenance of collections. Those methods include, but are not..., expanding, operating, and maintaining a repository that has the capability to provide adequate long-term... with a repository that has the capability to provide adequate long-term curatorial services as set...

  13. 36 CFR 79.7 - Methods to fund curatorial services.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... available for adequate, long-term care and maintenance of collections. Those methods include, but are not..., expanding, operating, and maintaining a repository that has the capability to provide adequate long-term... with a repository that has the capability to provide adequate long-term curatorial services as set...

  14. 36 CFR 79.7 - Methods to fund curatorial services.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... available for adequate, long-term care and maintenance of collections. Those methods include, but are not..., expanding, operating, and maintaining a repository that has the capability to provide adequate long-term... with a repository that has the capability to provide adequate long-term curatorial services as set...

  15. Changes in energy expenditure in preterm infants during weaning: a randomized comparison of two weaning methods from an incubator.

    PubMed

    Weintraub, Valentin; Mimouni, Francis B; Dollberg, Shaul

    2007-03-01

    We aimed to compare the resting energy expenditure (REE) of infants exposed to either of two weaning methods and to confirm the increase in REE during weaning from an incubator. The study was a prospective randomized clinical trial of weaning preterm infants using either of two methods. REE was measured at baseline and at 6, 23, 30, and 47 h, using indirect calorimetry. At measurement, infants were clinically and thermally stable, asleep, and 2 h post-feeding. Forty-two patients were randomized to "open incubator" (n = 23) or "warming bassinet" (n = 19). The groups did not differ in baseline clinical characteristics. REE increased significantly in both groups within 23 h and remained stable at 30 and 47 h. At 6 and 23 h, the incubator group had a significantly higher increase in REE than the warming bassinet group. By 30 h and at 47 h post-weaning, the REE of the two groups became similar. In conclusion, REE increases significantly at weaning from an incubator. The warming bassinet delays the increase in REE observed when infants are weaned using a turned-off incubator. Whether one method is superior to the other in terms of thermal stress cannot be determined from this study.

  16. Are the defined substrate-based methods adequate to determine the microbiological quality of natural recreational waters?

    PubMed

    Valente, Marta Sofia; Pedro, Paulo; Alonso, M Carmen; Borrego, Juan J; Dionísio, Lídia

    2010-03-01

Monitoring the microbiological quality of water used for recreational activities is very important to public health. Although the sanitary quality of recreational marine waters can be evaluated by standard methods, these are time-consuming and need confirmation. For these reasons, faster and more sensitive methods, such as defined substrate-based technology, have been developed. In the present work, we compared the standard membrane-filtration method, using Tergitol-TTC agar for total coliforms and Escherichia coli and Slanetz and Bartley agar for enterococci, with the IDEXX defined substrate technology for these faecal pollution indicators to determine the microbiological quality of natural recreational waters. The ISO 17994:2004 standard was used to compare these methods. The IDEXX test for total coliforms and E. coli (Colilert) showed higher values than those obtained by the standard method. The Enterolert test, for the enumeration of enterococci, showed lower values when compared with the standard method. It may be concluded that more studies evaluating the precision and accuracy of the rapid tests are required before they can be applied to routine monitoring of marine and freshwater recreational bathing areas. The main advantages of these methods are that they are more specific, feasible and simpler than the standard methodology.

  17. Global Risk Assessment of Aflatoxins in Maize and Peanuts: Are Regulatory Standards Adequately Protective?

    PubMed Central

    Wu, Felicia

    2013-01-01

The aflatoxins are a group of fungal metabolites that contaminate a variety of staple crops, including maize and peanuts, and cause an array of acute and chronic human health effects. Aflatoxin B1 in particular is a potent liver carcinogen, and hepatocellular carcinoma (HCC) risk is multiplicatively higher for individuals exposed to both aflatoxin and chronic infection with hepatitis B virus (HBV). In this work, we sought to answer the question: do current aflatoxin regulatory standards around the world adequately protect human health? Depending upon the level of protection desired, the answer to this question varies. Currently, most nations have a maximum tolerable level of total aflatoxins in maize and peanuts ranging from 4 to 20 ng/g. If the level of protection desired is that aflatoxin exposures would not increase lifetime HCC risk by more than 1 in 100,000 cases in the population, then most current regulatory standards are not adequately protective even if enforced, especially in low-income countries where large amounts of maize and peanuts are consumed and HBV prevalence is high. At the protection level of 1 in 10,000 lifetime HCC cases in the population, however, almost all aflatoxin regulations worldwide are adequately protective, with the exception of several nations in Africa and Latin America. PMID:23761295

  18. Analytic methods for questions pertaining to a randomized pretest, posttest, follow-up design.

    PubMed

    Rausch, Joseph R; Maxwell, Scott E; Kelley, Ken

    2003-09-01

    Delineates 5 questions regarding group differences that are likely to be of interest to researchers within the framework of a randomized pretest, posttest, follow-up (PPF) design. These 5 questions are examined from a methodological perspective by comparing and discussing analysis of variance (ANOVA) and analysis of covariance (ANCOVA) methods and briefly discussing hierarchical linear modeling (HLM) for these questions. This article demonstrates that the pretest should be utilized as a covariate in the model rather than as a level of the time factor or as part of the dependent variable within the analysis of group differences. It is also demonstrated that how the posttest and the follow-up are utilized in the analysis of group differences is determined by the specific question asked by the researcher.

  19. An Evaluation of Two Methods for Generating Synthetic HL7 Segments Reflecting Real-World Health Information Exchange Transactions

    PubMed Central

    Mwogi, Thomas S.; Biondich, Paul G.; Grannis, Shaun J.

    2014-01-01

Motivated by the need for readily available data for testing an open-source health information exchange platform, we developed and evaluated two methods for generating synthetic messages. The methods used HL7 version 2 messages obtained from the Indiana Network for Patient Care. Data from both methods were analyzed to assess how effectively the output reflected the original 'real-world' data. The Markov Chain method (MCM) used an algorithm based on a transition probability matrix, while the Music Box model (MBM) randomly selected messages of a particular trigger type from the original data to generate new messages. The MBM was faster, generated shorter messages, and exhibited less variation in message length. The MCM required more computational power and generated longer messages with greater variability in length. Both methods exhibited adequate coverage, producing a high proportion of messages consistent with the original messages. Both methods yielded similar rates of valid messages. PMID:25954458
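The transition-matrix idea behind the MCM described above can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the segment names and toy training sequences are hypothetical stand-ins for real HL7 v2 message data.

```python
import random
from collections import defaultdict

def build_transition_matrix(sequences):
    """Count segment-to-segment transitions and normalize each row to probabilities."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    matrix = {}
    for a, nexts in counts.items():
        total = sum(nexts.values())
        matrix[a] = {b: n / total for b, n in nexts.items()}
    return matrix

def generate_sequence(matrix, start="MSH", stop="END", max_len=20):
    """Walk the chain from the start state until the stop state or max_len."""
    seq = [start]
    while seq[-1] != stop and len(seq) < max_len:
        probs = matrix.get(seq[-1])
        if not probs:
            break
        states, weights = zip(*probs.items())
        seq.append(random.choices(states, weights=weights)[0])
    return seq

# Toy segment-type sequences; the real MCM was trained on observed HL7 messages.
training = [["MSH", "PID", "PV1", "OBX", "END"],
            ["MSH", "PID", "OBX", "OBX", "END"],
            ["MSH", "PID", "PV1", "END"]]
matrix = build_transition_matrix(training)
random.seed(1)
print(generate_sequence(matrix))
```

Because each new segment depends only on the previous one, generated messages can wander to lengths not seen in training, which is consistent with the greater length variability the abstract reports for the MCM.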

  20. 36 CFR 79.9 - Standards to determine when a repository possesses the capability to provide adequate long-term...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... repository possesses the capability to provide adequate long-term curatorial services. 79.9 Section 79.9... FEDERALLY-OWNED AND ADMINISTERED ARCHAEOLOGICAL COLLECTIONS § 79.9 Standards to determine when a repository... shall determine that a repository has the capability to provide adequate long-term curatorial services...

  1. Impact of Violation of the Missing-at-Random Assumption on Full-Information Maximum Likelihood Method in Multidimensional Adaptive Testing

    ERIC Educational Resources Information Center

    Han, Kyung T.; Guo, Fanmin

    2014-01-01

    The full-information maximum likelihood (FIML) method makes it possible to estimate and analyze structural equation models (SEM) even when data are partially missing, enabling incomplete data to contribute to model estimation. The cornerstone of FIML is the missing-at-random (MAR) assumption. In (unidimensional) computerized adaptive testing…

  2. Random forests of interaction trees for estimating individualized treatment effects in randomized trials.

    PubMed

    Su, Xiaogang; Peña, Annette T; Liu, Lei; Levine, Richard A

    2018-04-29

Assessing heterogeneous treatment effects is of growing interest in advancing precision medicine. Individualized treatment effects (ITEs) play a critical role in such an endeavor. For experimental data collected from randomized trials, we put forward a method, termed random forests of interaction trees (RFIT), for estimating ITEs on the basis of interaction trees. To this end, we propose a smooth sigmoid surrogate method, as an alternative to greedy search, to speed up tree construction. RFIT outperforms the "separate regression" approach in estimating ITEs. Furthermore, standard errors for the estimated ITEs via RFIT are obtained with the infinitesimal jackknife method. We assess and illustrate the use of RFIT via both simulation and the analysis of data from an acupuncture headache trial.
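The "separate regression" baseline that RFIT is compared against can be sketched in a few lines: fit an outcome model in each trial arm separately, then take the difference of the two predictions as the ITE estimate. This is an illustrative sketch using simple linear fits and made-up data, not the authors' RFIT algorithm.

```python
def fit_line(xs, ys):
    """Least-squares slope and intercept for a single covariate."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def separate_regression_ite(data, x_new):
    """Fit outcome ~ covariate separately in each arm; the ITE estimate
    at x_new is the difference of the two predicted outcomes."""
    fits = {}
    for arm in (0, 1):
        xs = [x for x, a, _ in data if a == arm]
        ys = [y for x, a, y in data if a == arm]
        fits[arm] = fit_line(xs, ys)
    (b1, a1), (b0, a0) = fits[1], fits[0]
    return (a1 + b1 * x_new) - (a0 + b0 * x_new)

# Toy trial records (covariate, arm, outcome); the true ITE here is 0.5 * x.
data = [(x, 0, 1.0 * x) for x in range(10)] + [(x, 1, 1.5 * x) for x in range(10)]
print(separate_regression_ite(data, 4.0))  # true ITE at x = 4 is 2.0
```

A known weakness of this baseline is that estimation error from two independently fitted models accumulates in their difference, which is part of why tree-based methods that model the treatment interaction directly can do better.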

  3. “Whatever My Mother Wants”: Barriers to Adequate Pain Management

    PubMed Central

    Yennurajalingam, Sriram; Bruera, Eduardo

    2013-01-01

    Abstract Opioids are the preferred medications to treat cancer pain; however, several barriers to cancer pain management exist, including those related to the patient, health care provider, and family caregiver. We describe one such situation in which a family member prevents the patient from receiving adequate pain management at the end of life despite interdepartmental and interdisciplinary efforts. This case highlights the importance of understanding and addressing fears regarding opioid use and implementing an integrated approach including oncologists and palliative care physicians, along with early referrals to palliative care. PMID:22946542

  4. 21 CFR 740.10 - Labeling of cosmetic products for which adequate substantiation of safety has not been obtained.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 7 2013-04-01 2013-04-01 false Labeling of cosmetic products for which adequate..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) COSMETICS COSMETIC PRODUCT WARNING STATEMENTS Warning Statements § 740.10 Labeling of cosmetic products for which adequate substantiation of safety has not been...

  5. 21 CFR 740.10 - Labeling of cosmetic products for which adequate substantiation of safety has not been obtained.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 7 2012-04-01 2012-04-01 false Labeling of cosmetic products for which adequate..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) COSMETICS COSMETIC PRODUCT WARNING STATEMENTS Warning Statements § 740.10 Labeling of cosmetic products for which adequate substantiation of safety has not been...

  6. 21 CFR 740.10 - Labeling of cosmetic products for which adequate substantiation of safety has not been obtained.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 7 2010-04-01 2010-04-01 false Labeling of cosmetic products for which adequate..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) COSMETICS COSMETIC PRODUCT WARNING STATEMENTS Warning Statements § 740.10 Labeling of cosmetic products for which adequate substantiation of safety has not been...

  7. 21 CFR 740.10 - Labeling of cosmetic products for which adequate substantiation of safety has not been obtained.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 7 2011-04-01 2010-04-01 true Labeling of cosmetic products for which adequate..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) COSMETICS COSMETIC PRODUCT WARNING STATEMENTS Warning Statements § 740.10 Labeling of cosmetic products for which adequate substantiation of safety has not been...

  8. 21 CFR 740.10 - Labeling of cosmetic products for which adequate substantiation of safety has not been obtained.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 7 2014-04-01 2014-04-01 false Labeling of cosmetic products for which adequate..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) COSMETICS COSMETIC PRODUCT WARNING STATEMENTS Warning Statements § 740.10 Labeling of cosmetic products for which adequate substantiation of safety has not been...

  9. The content of African diets is adequate to achieve optimal efficacy with fixed-dose artemether-lumefantrine: a review of the evidence

    PubMed Central

    Premji, Zulfiqarali G; Abdulla, Salim; Ogutu, Bernhards; Ndong, Alice; Falade, Catherine O; Sagara, Issaka; Mulure, Nathan; Nwaiwu, Obiyo; Kokwaro, Gilbert

    2008-01-01

A fixed-dose combination of artemether-lumefantrine (AL, Coartem®) has shown high efficacy, good tolerability and cost-effectiveness in adults and children with uncomplicated malaria caused by Plasmodium falciparum. Lumefantrine bioavailability is enhanced by food, particularly fat. As the fat content of sub-Saharan African meals is approximately a third of that in Western countries, the question arises of whether fat consumption by African patients is sufficient for good efficacy. Data from healthy volunteers have indicated that drinking 36 mL of soya milk (containing only 1.2 g of fat) results in 90% of the lumefantrine absorption obtained with 500 mL of milk (16 g of fat). African diets are typically based on a carbohydrate staple (starchy root vegetables, fruit [plantain] or cereals) supplemented by soups, relishes and sauces derived from vegetables, pulses, nuts or fish. The most important sources of dietary fat in African countries are oil crops (e.g. peanuts, soya beans) and cooking oils such as red palm, peanut, coconut and sesame oils. Total fat intake in the majority of sub-Saharan countries is estimated to be in the range of 30–60 g/person/day across the whole population (average 43 g/person/day). Breast-feeding of infants up to two years of age is standard, with one study estimating a fat intake of 15–30 g/day from breast milk up to the age of 18 months. Weaning foods typically contain low levels of fat, and the transition from breast milk to complete weaning is associated with a marked reduction in dietary fat. Nevertheless, fat intake >10 g/day has been reported in young children post-weaning. A randomized trial in Uganda reported no difference in the efficacy of AL between patients receiving supervised meals with a fixed fat content (~23 g of fat) or taking AL unsupervised, suggesting that fat intake at home was sufficient for optimal efficacy. Moreover, randomized trials in African children aged 5–59 months have shown similar high cure rates to those observed

  10. A smart rotary technique versus conventional pulpectomy for primary teeth: A randomized controlled clinical study.

    PubMed

    Mokhtari, Negar; Shirazi, Alireza-Sarraf; Ebrahimi, Masoumeh

    2017-11-01

Techniques with adequate accuracy of working-length determination, along with a shorter duration of treatment, seem to be essential for the pulpectomy procedure in pediatric dentistry. The aim of the present study was to evaluate the accuracy of root canal length measurement with the Root ZX II apex locator and a rotary system in pulpectomy of primary teeth. In this randomized controlled clinical trial, complete pulpectomy was performed on 80 mandibular primary molars in 80 children aged 4-6 years. The study population was randomly divided into case and control groups. In the control group conventional pulpectomy was performed, and in the case group working length was determined by the electronic apex locator Root ZX II and the canals were instrumented with Mtwo rotary files. Statistical evaluation was performed using Mann-Whitney and Chi-square tests (P < 0.05). There were no significant differences between the electronic apex locator Root ZX II and the conventional method in accuracy of root canal length determination. However, significantly less time was needed for instrumenting with rotary files (P = 0.000). Considering the comparable accuracy of root canal length determination and the considerably shorter instrumentation time with the Root ZX II apex locator and rotary system, this approach may be suggested for pulpectomy in primary molar teeth. Key words: Rotary technique, conventional technique, pulpectomy, primary teeth.

  11. Some practical problems in implementing randomization.

    PubMed

    Downs, Matt; Tucker, Kathryn; Christ-Schmidt, Heidi; Wittes, Janet

    2010-06-01

    While often theoretically simple, implementing randomization to treatment in a masked, but confirmable, fashion can prove difficult in practice. At least three categories of problems occur in randomization: (1) bad judgment in the choice of method, (2) design and programming errors in implementing the method, and (3) human error during the conduct of the trial. This article focuses on these latter two types of errors, dealing operationally with what can go wrong after trial designers have selected the allocation method. We offer several case studies and corresponding recommendations for lessening the frequency of problems in allocating treatment or for mitigating the consequences of errors. Recommendations include: (1) reviewing the randomization schedule before starting a trial, (2) being especially cautious of systems that use on-demand random number generators, (3) drafting unambiguous randomization specifications, (4) performing thorough testing before entering a randomization system into production, (5) maintaining a dataset that captures the values investigators used to randomize participants, thereby allowing the process of treatment allocation to be reproduced and verified, (6) resisting the urge to correct errors that occur in individual treatment assignments, (7) preventing inadvertent unmasking to treatment assignments in kit allocations, and (8) checking a sample of study drug kits to allow detection of errors in drug packaging and labeling. Although we performed a literature search of documented randomization errors, the examples that we provide and the resultant recommendations are based largely on our own experience in industry-sponsored clinical trials. We do not know how representative our experience is or how common errors of the type we have seen occur. Our experience underscores the importance of verifying the integrity of the treatment allocation process before and during a trial. Clinical Trials 2010; 7: 235-245. http://ctj.sagepub.com.
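Two of the recommendations above (using a dedicated seeded generator instead of drawing numbers on demand, and keeping enough information to reproduce the allocation) can be illustrated with a minimal permuted-block schedule. This is a sketch with hypothetical block size, arm labels, and seed, not any particular trial's system.

```python
import random

def permuted_block_schedule(n_participants, block_size=4, arms=("A", "B"), seed=20100601):
    """Generate a reproducible permuted-block randomization schedule.

    A fixed seed and a dedicated random.Random instance let the whole
    schedule be regenerated later and checked against the assignments
    actually issued during the trial."""
    assert block_size % len(arms) == 0
    rng = random.Random(seed)  # not the global, on-demand generator
    schedule = []
    while len(schedule) < n_participants:
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)  # each block is a random permutation, balanced across arms
        schedule.extend(block)
    return schedule[:n_participants]

# Review the schedule before the trial starts: every block of 4 is balanced.
schedule = permuted_block_schedule(12)
print(schedule)
# Verification after the trial: regenerating with the same seed must match.
assert schedule == permuted_block_schedule(12)
```

Pre-generating and archiving the schedule this way makes the treatment allocation process auditable, which is exactly the kind of verification the authors argue should happen before and during a trial.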

  12. A matrix-based method of moments for fitting the multivariate random effects model for meta-analysis and meta-regression

    PubMed Central

    Jackson, Dan; White, Ian R; Riley, Richard D

    2013-01-01

    Multivariate meta-analysis is becoming more commonly used. Methods for fitting the multivariate random effects model include maximum likelihood, restricted maximum likelihood, Bayesian estimation and multivariate generalisations of the standard univariate method of moments. Here, we provide a new multivariate method of moments for estimating the between-study covariance matrix with the properties that (1) it allows for either complete or incomplete outcomes and (2) it allows for covariates through meta-regression. Further, for complete data, it is invariant to linear transformations. Our method reduces to the usual univariate method of moments, proposed by DerSimonian and Laird, in a single dimension. We illustrate our method and compare it with some of the alternatives using a simulation study and a real example. PMID:23401213
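In one dimension, the abstract's estimator reduces to the usual DerSimonian-Laird method of moments, which can be written out directly. The effect sizes and within-study variances below are illustrative toy numbers.

```python
def dersimonian_laird_tau2(effects, variances):
    """Univariate DerSimonian-Laird method-of-moments estimate of the
    between-study variance tau^2 (the one-dimensional special case that
    the multivariate estimator in the abstract reduces to)."""
    k = len(effects)
    w = [1.0 / v for v in variances]          # inverse-variance weights
    sw = sum(w)
    ybar = sum(wi * yi for wi, yi in zip(w, effects)) / sw
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, effects))  # Cochran's Q
    c = sw - sum(wi ** 2 for wi in w) / sw
    return max(0.0, (q - (k - 1)) / c)        # truncated at zero

# Five hypothetical study effect sizes with their within-study variances.
tau2 = dersimonian_laird_tau2([0.2, 0.5, 0.1, 0.8, 0.4],
                              [0.04, 0.05, 0.03, 0.06, 0.04])
print(tau2)
```

When the studies are perfectly homogeneous, Q falls below its degrees of freedom and the truncation returns tau^2 = 0, matching the standard behavior of the method.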

  13. Retention of cardiopulmonary resuscitation skills after hands-only training versus conventional training in novices: a randomized controlled trial

    PubMed Central

    Kim, Young Joon; Cho, Youngsuk; Cho, Gyu Chong; Ji, Hyun Kyung; Han, Song Yi; Lee, Jin Hyuck

    2017-01-01

Objective Cardiopulmonary resuscitation (CPR) training can improve performance during simulated cardiac arrest; however, retention of skills after training remains uncertain. Recently, hands-only CPR has been shown to be as effective as conventional CPR. The purpose of this study is to compare the retention rate of CPR skills in laypersons after hands-only or conventional CPR training. Methods Participants were randomly assigned to 1 of 2 CPR training methods: 80 minutes of hands-only CPR training or 180 minutes of conventional CPR training. Each participant’s CPR skills were evaluated at the end of training and 3 months thereafter using the Resusci Anne manikin with skill-reporting software. Results In total, 252 participants completed training; there were 125 in the hands-only CPR group and 127 in the conventional CPR group. After 3 months, 118 participants were randomly selected to complete a post-training test. The hands-only CPR group showed a significant decrease in average compression rate (P=0.015), average compression depth (P=0.031), and proportion of adequate compression depth (P=0.011). In contrast, there was no difference in the skills of the conventional CPR group after 3 months. Conclusion Conventional CPR training appears to be more effective for the retention of chest compression skills than hands-only CPR training; however, the retention of artificial ventilation skills after conventional CPR training is poor. PMID:28717778

  14. Application of random forests methods to diabetic retinopathy classification analyses.

    PubMed

    Casanova, Ramon; Saldana, Santiago; Chew, Emily Y; Danis, Ronald P; Greven, Craig M; Ambrosius, Walter T

    2014-01-01

Diabetic retinopathy (DR) is one of the leading causes of blindness in the United States and worldwide. DR is a silent disease that may go unnoticed until it is too late for effective treatment. Therefore, early detection could improve the chances of therapeutic interventions that would alleviate its effects. Graded fundus photography and systemic data from 3443 ACCORD-Eye Study participants were used to estimate Random Forest (RF) and logistic regression classifiers. We studied the impact of sample size on classifier performance and the possibility of using RF-generated class conditional probabilities as metrics describing DR risk. RF measures of variable importance were used to detect factors that affect classification performance. Both types of data were informative when discriminating participants with or without DR. RF-based models produced much higher classification accuracy than those based on logistic regression. Combining both types of data did not increase accuracy but did increase statistical discrimination of healthy participants who subsequently did or did not have DR events during four years of follow-up. RF variable importance criteria revealed that microaneurysm counts in both eyes seemed to play the most important role in discrimination among the graded fundus variables, while the number of medicines and diabetes duration were the most relevant among the systemic variables. We have introduced RF methods to DR classification analyses based on fundus photography data. In addition, we propose an approach to DR risk assessment based on metrics derived from graded fundus photography and systemic data. Our results suggest that RF methods could be a valuable tool to diagnose DR and evaluate its progression.

  15. Application of Random Forests Methods to Diabetic Retinopathy Classification Analyses

    PubMed Central

    Casanova, Ramon; Saldana, Santiago; Chew, Emily Y.; Danis, Ronald P.; Greven, Craig M.; Ambrosius, Walter T.

    2014-01-01

Background Diabetic retinopathy (DR) is one of the leading causes of blindness in the United States and worldwide. DR is a silent disease that may go unnoticed until it is too late for effective treatment. Therefore, early detection could improve the chances of therapeutic interventions that would alleviate its effects. Methodology Graded fundus photography and systemic data from 3443 ACCORD-Eye Study participants were used to estimate Random Forest (RF) and logistic regression classifiers. We studied the impact of sample size on classifier performance and the possibility of using RF-generated class conditional probabilities as metrics describing DR risk. RF measures of variable importance were used to detect factors that affect classification performance. Principal Findings Both types of data were informative when discriminating participants with or without DR. RF-based models produced much higher classification accuracy than those based on logistic regression. Combining both types of data did not increase accuracy but did increase statistical discrimination of healthy participants who subsequently did or did not have DR events during four years of follow-up. RF variable importance criteria revealed that microaneurysm counts in both eyes seemed to play the most important role in discrimination among the graded fundus variables, while the number of medicines and diabetes duration were the most relevant among the systemic variables. Conclusions and Significance We have introduced RF methods to DR classification analyses based on fundus photography data. In addition, we propose an approach to DR risk assessment based on metrics derived from graded fundus photography and systemic data. Our results suggest that RF methods could be a valuable tool to diagnose DR and evaluate its progression. PMID:24940623

  16. One-step random mutagenesis by error-prone rolling circle amplification

    PubMed Central

    Fujii, Ryota; Kitaoka, Motomitsu; Hayashi, Kiyoshi

    2004-01-01

    In vitro random mutagenesis is a powerful tool for altering properties of enzymes. We describe here a novel random mutagenesis method using rolling circle amplification, named error-prone RCA. This method consists of only one DNA amplification step followed by transformation of the host strain, without treatment with any restriction enzymes or DNA ligases, and results in a randomly mutated plasmid library with 3–4 mutations per kilobase. Specific primers or special equipment, such as a thermal-cycler, are not required. This method permits rapid preparation of randomly mutated plasmid libraries, enabling random mutagenesis to become a more commonly used technique. PMID:15507684
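The reported mutation load (3-4 substitutions per kilobase) can be illustrated with an in silico sketch that scatters point substitutions over a toy plasmid sequence. This simulates only the statistical outcome of error-prone RCA, not the amplification chemistry itself; the sequence, rate, and library size are hypothetical.

```python
import random

def mutate(sequence, rate=0.0035, rng=None):
    """Introduce random point substitutions at the given per-base rate
    (0.0035 is mid-range of the 3-4 mutations per kilobase reported)."""
    rng = rng or random.Random()
    bases = "ACGT"
    out = []
    for base in sequence:
        if rng.random() < rate:
            out.append(rng.choice([b for b in bases if b != base]))  # substitute
        else:
            out.append(base)
    return "".join(out)

rng = random.Random(42)
plasmid = "".join(rng.choice("ACGT") for _ in range(3000))  # toy 3 kb plasmid
library = [mutate(plasmid, rng=rng) for _ in range(10)]     # mutant library
diffs = [sum(a != b for a, b in zip(plasmid, m)) for m in library]
print(diffs)  # roughly 10 substitutions per 3 kb clone on average
```

Because each position mutates independently, the per-clone mutation counts follow a binomial distribution around rate times length, which is the property screened for when validating a mutant library's diversity.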

  17. Methods for a multicenter randomized trial for mixed urinary incontinence: rationale and patient-centeredness of the ESTEEM trial.

    PubMed

    Sung, Vivian W; Borello-France, Diane; Dunivan, Gena; Gantz, Marie; Lukacz, Emily S; Moalli, Pamela; Newman, Diane K; Richter, Holly E; Ridgeway, Beri; Smith, Ariana L; Weidner, Alison C; Meikle, Susan

    2016-10-01

Mixed urinary incontinence (MUI) can be a challenging condition to manage. We describe the protocol design and rationale for the Effects of Surgical Treatment Enhanced with Exercise for Mixed Urinary Incontinence (ESTEEM) trial, designed to compare a combined conservative and surgical treatment approach versus surgery alone for improving patient-centered MUI outcomes at 12 months. ESTEEM is a multisite, prospective, randomized trial of female participants with MUI randomized to a standardized perioperative behavioral/pelvic floor exercise intervention plus midurethral sling versus midurethral sling alone. We describe our methods and four challenges encountered during the design phase: defining the study population, selecting relevant patient-centered outcomes, determining sample size estimates using a patient-reported outcome measure, and designing an analysis plan that accommodates MUI failure rates. A central theme in the design was patient-centeredness, which guided many key decisions. Our primary outcome is patient-reported MUI symptoms measured using the Urogenital Distress Inventory (UDI) score at 12 months. Secondary outcomes include quality of life, sexual function, cost-effectiveness, time to failure, and need for additional treatment. The final study design was implemented in November 2013 across eight clinical sites in the Pelvic Floor Disorders Network. As of 27 February 2016, 433 of 472 targeted participants had been randomized. We describe the ESTEEM protocol and our methods for reaching consensus on methodological challenges in designing a trial for MUI by maintaining the patient perspective at the core of key decisions. This trial will provide information that can directly impact patient care and clinical decision making.

  18. Alabama's Foundation Program: An Adequate and Equitable School Funding Mechanism?

    ERIC Educational Resources Information Center

    Coe, Dennis Randal

    2016-01-01

    The purpose of this study was to determine the extent to which the foundation program was an adequate and equitable funding mechanism for public schools in the state of Alabama. This study analyzed funding and academic data and evaluated adequacy and equity through the lenses of poverty, geographic location, local tax effort, and type of school…

  19. Counterinsurgency and Operational Art: Is the Joint Campaign Planning Model Adequate?

    DTIC Science & Technology

    2003-01-01

ART: IS THE JOINT CAMPAIGN PLANNING MODEL ADEQUATE? by MAJ Thomas Erik Miller, USA, 90 pages. The United States has conducted or supported more than a...increase. Some of the effects of the fall of the Soviet Union were a loosening of internal and external political and social controls in formerly Soviet...order” in the social, economic and political arena through rapid growth in population and urbanization in the underdeveloped world, globalization and

  20. Cluster-randomized Studies in Educational Research: Principles and Methodological Aspects.

    PubMed

    Dreyhaupt, Jens; Mayer, Benjamin; Keis, Oliver; Öchsner, Wolfgang; Muche, Rainer

    2017-01-01

    An increasing number of studies are being performed in educational research to evaluate new teaching methods and approaches. These studies could be performed more efficiently and deliver more convincing results if they more strictly applied and complied with recognized standards of scientific studies. Such an approach could substantially increase the quality in particular of prospective, two-arm (intervention) studies that aim to compare two different teaching methods. A key standard in such studies is randomization, which can minimize systematic bias in study findings; such bias may result if the two study arms are not structurally equivalent. If possible, educational research studies should also achieve this standard, although this is not yet generally the case. Some difficulties and concerns exist, particularly regarding organizational and methodological aspects. An important point to consider in educational research studies is that usually individuals cannot be randomized, because of the teaching situation, and instead whole groups have to be randomized (so-called "cluster randomization"). Compared with studies with individual randomization, studies with cluster randomization normally require (significantly) larger sample sizes and more complex methods for calculating sample size. Furthermore, cluster-randomized studies require more complex methods for statistical analysis. The consequence of the above is that a competent expert with respective special knowledge needs to be involved in all phases of cluster-randomized studies. Studies to evaluate new teaching methods need to make greater use of randomization in order to achieve scientifically convincing results. Therefore, in this article we describe the general principles of cluster randomization and how to implement these principles, and we also outline practical aspects of using cluster randomization in prospective, two-arm comparative educational research studies.
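The sample-size inflation mentioned above is usually captured by the standard design effect, 1 + (m - 1) * ICC, where m is the cluster size and ICC is the intraclass correlation coefficient. A minimal sketch with hypothetical numbers:

```python
import math

def design_effect(cluster_size, icc):
    """Standard variance inflation factor for cluster randomization."""
    return 1 + (cluster_size - 1) * icc

def clusters_needed(n_individual, cluster_size, icc):
    """Clusters required per arm, given the per-arm sample size an
    individually randomized design would need."""
    n_cluster = n_individual * design_effect(cluster_size, icc)
    return math.ceil(n_cluster / cluster_size)

# Example: suppose 64 students per arm would suffice with individual
# randomization; randomizing whole classes of 25 with ICC = 0.05
# roughly doubles the required sample size.
print(design_effect(25, 0.05))        # about 2.2
print(clusters_needed(64, 25, 0.05))  # 6 classes per arm
```

Even a small ICC inflates the required sample size considerably when clusters are large, which is why the article stresses involving a methodological expert in all phases of a cluster-randomized study.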

  1. Effects of a novel method for enteral nutrition infusion involving a viscosity-regulating pectin solution: A multicenter randomized controlled trial.

    PubMed

    Tabei, Isao; Tsuchida, Shigeru; Akashi, Tetsuro; Ookubo, Katsuichiro; Hosoda, Satoru; Furukawa, Yoshiyuki; Tanabe, Yoshiaki; Tamura, Yoshiko

    2018-02-01

The initial complications associated with infusion of enteral nutrition (EN) for clinical and nutritional care are vomiting, aspiration pneumonia, and diarrhea, and there are many recommendations to prevent them. A novel method involving a viscosity-regulating pectin solution has been demonstrated. In Japan, this method, along with other so-called "semi-solid EN" approaches, has been widely used in practice. However, there has been no randomized clinical trial to prove the efficiency and safety of a viscosity-regulating pectin solution in EN management. We therefore planned and initiated a multicenter randomized controlled trial to determine its efficiency and safety. This study enrolled 34 patients from 7 participating medical institutions. Institutional review board (IRB) approval was obtained from all participating institutions. Patients who required EN management were enrolled and randomly assigned to the viscosity regulation of enteral feeding (VREF) group or the control group. The VREF group (n = 15) was managed with the addition of a viscosity-regulating pectin solution. The control group (n = 12) was managed with conventional EN administration, usually in a gradual step-up method. Daily clinical symptoms of pneumonia, fever, vomiting, and diarrhea; defecation frequency; and stool form were observed over the 2-week trial period. The dose of EN and duration of infusion were also examined. A favorable trend in clinical symptoms was noticed in the VREF group. No significant differences were observed in episodes of pneumonia, fever, vomiting, and diarrhea between the 2 groups. An apparent reduction in infusion duration and hardening of stool form were noted in the VREF group. The novel method involving a viscosity-regulating pectin solution with EN administration can be performed clinically as safely and efficiently as the conventional method. Moreover, there were benefits, such as improvement in stool form, a short time for EN infusion

  2. Systematic evaluation of the methodology of randomized controlled trials of anticoagulation in patients with cancer.

    PubMed

    Rada, Gabriel; Schünemann, Holger J; Labedi, Nawman; El-Hachem, Pierre; Kairouz, Victor F; Akl, Elie A

    2013-02-14

    Randomized controlled trials (RCTs) that are inappropriately designed or executed may provide biased findings and mislead clinical practice. In view of recent interest in the treatment and prevention of thrombotic complications in cancer patients, we evaluated the characteristics, risk of bias and their time trends in RCTs of anticoagulation in patients with cancer. We conducted a comprehensive search, including a search of four electronic databases (MEDLINE, EMBASE, ISI Web of Science, and CENTRAL) up to February 2010. We included RCTs in which the intervention and/or comparison consisted of: vitamin K antagonists, unfractionated heparin (UFH), low molecular weight heparin (LMWH), direct thrombin inhibitors or fondaparinux. We performed descriptive analyses and assessed the association between the variables of interest and the year of publication. We included 67 RCTs with 24,071 participants. In twenty-one trials (31%), DVT diagnosis was triggered by clinical suspicion; the remaining trials either screened for DVT or were unclear about their approach. Forty-one (61%), 22 (33%), and 11 (16%) trials reported on major bleeding, minor bleeding, and thrombocytopenia, respectively. The percentages of trials satisfying risk of bias criteria were: adequate sequence generation (85%), adequate allocation concealment (61%), participants' blinding (39%), data collectors' blinding (44%), providers' blinding (41%), outcome assessors' blinding (75%), data analysts' blinding (15%), intention to treat analysis (57%), no selective outcome reporting (12%), no stopping early for benefit (97%). The mean follow-up rate was 96%. Adequate allocation concealment and the reporting of intention to treat analysis were the only two quality criteria that improved over time. Many RCTs of anticoagulation in patients with cancer appear to use insufficiently rigorous outcome assessment methods and to have deficiencies in key methodological features. It is not clear whether this reflects a problem in the

  3. The adequate rocuronium dose required for complete block of the adductor muscles of the thigh.

    PubMed

    Fujimoto, M; Kawano, K; Yamamoto, T

    2018-03-01

    Rocuronium can prevent the obturator jerk during transurethral resection of bladder tumors. We investigated the adequate rocuronium dose required for complete block of the thigh adductor muscles, and its correlation with individual responses of the adductor pollicis muscle to rocuronium. Eleven patients scheduled for transurethral resection of bladder tumors under general anesthesia were investigated. After general anesthesia induction, neuromuscular monitoring of the adductor pollicis muscle and ultrasonography-guided stimulation of the obturator nerve were commenced. Rocuronium, 0.15 mg/kg, was repeatedly administered intravenously. The adequate rocuronium dose required for complete block of the thigh muscles, defined as the cumulative dose of rocuronium administered until that time, and its correlation with the first twitch response of the adductor pollicis muscle on train-of-four stimulation after initial rocuronium administration were analyzed. The rocuronium dose found adequate for complete block of the thigh muscles was 0.30 mg/kg in seven patients and 0.45 mg/kg in the remaining four, and it did not correlate with the first twitch response. At the time of complete block of the thigh muscles, the neuromuscular blockade level of the adductor pollicis muscle varied greatly, although the level was never more profound than a post-tetanic count of 1. Although the response of the adductor pollicis muscle to rocuronium cannot be used to determine the adequate rocuronium dose required for complete block of the thigh muscles, intense blockade, with maintenance of a post-tetanic count of ≤ 1 in the adductor pollicis muscle, is essential to prevent the obturator jerk. © 2017 The Acta Anaesthesiologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  4. Leadership Style and Adequate Yearly Progress: A Correlational Study of Effective Principal Leadership

    ERIC Educational Resources Information Center

    Leapley-Portscheller, Claudia Iris

    2008-01-01

    Principals are responsible for leading efforts to reach increasingly higher levels of student academic proficiency in schools associated with adequate yearly progress (AYP) requirements. The purpose of this quantitative, correlational study was to identify the degree to which perceptions of principal transformational, transactional, and…

  5. Video Coaching as an Efficient Teaching Method for Surgical Residents-A Randomized Controlled Trial.

    PubMed

    Soucisse, Mikael L; Boulva, Kerianne; Sideris, Lucas; Drolet, Pierre; Morin, Michel; Dubé, Pierre

    As surgical training is evolving and operative exposure is decreasing, new, effective, and experiential learning methods are needed to ensure surgical competency and patient safety. Video coaching is an emerging concept in surgery that needs further investigation. In this randomized controlled trial conducted at a single teaching hospital, participating residents were filmed performing a side-to-side intestinal anastomosis on cadaveric dog bowel for baseline assessment. The Surgical Video Coaching (SVC) group then participated in a one-on-one video playback coaching and debriefing session with a surgeon, during which constructive feedback was given. The control group went on with their normal clinical duties without coaching or debriefing. All participants were filmed making a second intestinal anastomosis, which was compared to their first anastomosis using a validated 7-category technical skill global rating scale, the Objective Structured Assessment of Technical Skills. A single independent surgeon who did not take part in the coaching or debriefing of the SVC group reviewed all videos. A satisfaction survey was then sent to the residents in the coaching group. The study took place at the Department of Surgery, Hôpital Maisonneuve-Rosemont, a tertiary teaching hospital affiliated with the University of Montreal, Canada. General surgery residents from the University of Montreal were recruited to take part in this trial. A total of 28 residents were randomized and completed the study. After the intervention, the SVC group (n = 14) significantly increased their Objective Structured Assessment of Technical Skills score (mean of differences 3.36, [1.09-5.63], p = 0.007) when compared to the control group (n = 14) (mean of differences 0.29, p = 0.759). All residents agreed or strongly agreed that video coaching was a time-efficient teaching method. Video coaching is an effective and efficient teaching intervention to improve surgical residents' technical skills. Crown Copyright © 2017. Published by Elsevier

  6. Efficacy and Safety of Fixed-Dose Perindopril Arginine/Amlodipine in Hypertensive Patients Not Adequately Controlled with Amlodipine 5 mg or Perindopril tert-Butylamine 4 mg Monotherapy.

    PubMed

    Hu, Dayi; Sun, Yihong; Liao, Yuhua; Huang, Jing; Zhao, Ruiping; Yang, Kan

    2016-01-01

    To assess the blood pressure-lowering efficacy and tolerability of perindopril/amlodipine fixed-dose combinations in Chinese patients with mild-to-moderate essential hypertension not adequately controlled with monotherapy. In 2 separate double-blind studies, patients received a 4-week run-in monotherapy of amlodipine 5 mg or perindopril 4 mg, respectively. Those whose blood pressure remained uncontrolled were then randomized to receive the fixed-dose combination of perindopril 5 mg/amlodipine 5 mg (Per/Amlo group) or to remain on the monotherapy for 8 weeks. Patients who were still uncontrolled at the week 8 (W8) visit were up-titrated on the Per/Amlo combination, or received additional treatment if on monotherapy, for a further 4 weeks. The main efficacy assessment was at 8 weeks. After 8 weeks, systolic blood pressure (SBP; primary criterion) was statistically significantly lower in the Per/Amlo group (vs. Amlo 5 mg, p = 0.0095; vs. Per 4 mg, p < 0.0001). Uncontrolled patients at W8 who received an up-titration of the Per/Amlo combination showed a further SBP reduction. These changes were mirrored by corresponding reductions in diastolic blood pressure. The fixed-dose combinations were well tolerated. Single-pill combinations of perindopril and amlodipine provide hypertensive patients with a convenient and effective method of reducing blood pressure. © 2016 S. Karger AG, Basel.

  7. Impact of hemostasis methods, electrocoagulation versus suture, in laparoscopic endometriotic cystectomy on the ovarian reserve: a randomized controlled trial.

    PubMed

    Tanprasertkul, Chamnan; Ekarattanawong, Sophapun; Sreshthaputra, Opas; Vutyavanich, Teraporn

    2014-08-01

    To evaluate the impact on ovarian reserve of two different methods of hemostasis after laparoscopic ovarian endometrioma excision. A randomized controlled study was conducted from January to December 2013 at Thammasat University Hospital, Thailand. Women of reproductive age (18-45 years) who underwent laparoscopic ovarian cystectomy were randomized into electrocoagulation and suture groups. Clinical baseline data and an ovarian reserve outcome (anti-Müllerian hormone (AMH)) were evaluated. Fifty participants were recruited and randomized into the two groups, with 25 participants in each of the electrocoagulation and suture groups. Baseline characteristics of the 2 groups (age, weight, BMI, height, cyst diameter, duration and estimated blood loss) were not statistically different. There was no significant difference in AMH between the electrocoagulation and suture groups at pre-operative (2.90±2.26 vs. 2.52±2.37 ng/ml), 1 week (1.78±1.51 vs. 1.99±1.71 ng/ml), 1 month (1.76±1.50 vs. 2.09±1.62 ng/ml), 3 months (2.09±1.66 vs. 1.96±1.68 ng/ml) and 6 months (2.11±1.84 vs. 1.72±1.68 ng/ml), respectively. However, the mean AMH of both groups decreased significantly from the first week after the operation, and the decline was sustained up to 6 months. Laparoscopic cystectomy of ovarian endometrioma has a negative impact on ovarian reserve, and the electrocoagulation and suture methods did not differ in this effect.

  8. Effectiveness of the Dader Method for pharmaceutical care in patients with bipolar I disorder: EMDADER-TAB: study protocol for a randomized controlled trial

    PubMed Central

    2014-01-01

    Background Bipolar I disorder (BD-I) is a chronic mental illness characterized by the presence of one or more manic episodes, or both depressive and manic episodes, usually separated by asymptomatic intervals. Pharmacists can contribute to the management of BD-I, mainly through the use of effective and safe drugs, and can improve patients' quality of life through pharmaceutical care. Some studies have shown the effect of pharmaceutical care on the achievement of therapeutic goals in different illnesses; however, to our knowledge, there is a lack of randomized controlled trials designed to assess the effect of pharmacist intervention in patients with BD. The aim of this study is to assess the effectiveness of the Dader Method for pharmaceutical care in patients with BD-I. Methods/design A randomized, controlled, prospective, single-center clinical trial with a duration of 12 months will be performed to compare the effect of the Dader Method of pharmaceutical care with the usual care process of patients in a psychiatric clinic. Patients diagnosed with BD-I aged between 18 and 65 years who have been discharged or referred from the outpatient service of the San Juan de Dios Clinic (Antioquia, Colombia) will be included. Patients will be randomized into the intervention group, who will receive pharmaceutical care provided by pharmacists working in collaboration with psychiatrists, or into the control group, who will receive usual care and verbal-written counseling regarding BD. Study outcomes will be assessed at baseline and at 3, 6, 9, and 12 months after randomization. The primary outcome will be the number of hospitalizations, emergency service consultations, and unscheduled outpatient visits. Effectiveness, safety, adherence, and quality of life will be assessed as secondary outcomes. Statistical analyses will be performed using two-tailed McNemar tests, Pearson chi-square tests, and Student's t-tests; a P value <0.05 will be considered statistically significant.

  9. Random discrete linear canonical transform.

    PubMed

    Wei, Deyun; Wang, Ruikui; Li, Yuan-Min

    2016-12-01

    Linear canonical transforms (LCTs) are a family of integral transforms with wide applications in optical, acoustical, electromagnetic, and other wave propagation problems. In this paper, we propose the random discrete linear canonical transform (RDLCT) by randomizing the kernel transform matrix of the discrete linear canonical transform (DLCT). The RDLCT inherits excellent mathematical properties from the DLCT along with some notable features of its own. It has a greater degree of randomness because of the randomization in terms of both eigenvectors and eigenvalues. Numerical simulations demonstrate that the RDLCT has the important feature that both the magnitude and the phase of its output are random. As an important application, the RDLCT can be used for image encryption. The simulation results demonstrate that the proposed encryption method is a security-enhanced image encryption scheme.
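    The mechanism behind such transform-based encryption can be illustrated with a generic random unitary transform standing in for the RDLCT kernel (a minimal sketch under that simplifying assumption, not the paper's construction): a secret random transform scrambles both the magnitude and the phase of the input, and its adjoint inverts it exactly.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unitary(n, rng):
    """Random unitary matrix via QR decomposition of a complex Gaussian matrix."""
    z = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    q, r = np.linalg.qr(z)
    d = np.diag(r)
    # Normalize column phases so the distribution is well behaved
    return q * (d / np.abs(d))

n = 64
U = random_unitary(n, rng)               # secret "key" transform
img = rng.random(n)                      # stand-in for one image row/column
cipher = U @ img                         # encryption: random-transform output
recovered = (U.conj().T @ cipher).real   # decryption: adjoint (inverse) transform

print(np.max(np.abs(recovered - img)))   # ≈ 0
```

    Decryption works because a unitary matrix satisfies U†U = I; without the key U, the ciphertext's randomized magnitude and phase reveal little about the input.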

  10. Non-penetrating sham needle, is it an adequate sham control in acupuncture research?

    PubMed

    Lee, Hyangsook; Bang, Heejung; Kim, Youngjin; Park, Jongbae; Lee, Sangjae; Lee, Hyejung; Park, Hi-Joon

    2011-01-01

    This study aimed to determine whether a non-penetrating sham needle can serve as an adequate sham control. We conducted a randomised, subject-blind, sham-controlled trial in both acupuncture-naïve and experienced healthy volunteers. Participants were randomly allocated to receive either real acupuncture (n=39) or non-penetrating sham acupuncture (n=40) on the hand (LI4), abdomen (CV12) and leg (ST36). The procedures were standardised and identical for both groups. Participants rated acupuncture sensations on a 10-point scale. A blinding index was calculated based on the participants' guesses on the type of acupuncture they had received (real, sham or do not know) for each acupuncture point. The association of knowledge about and experience in acupuncture with correct guessing was also examined. The subjects in both groups were similar with respect to age, gender, and experience of and knowledge about acupuncture. The sham needle tended to produce less penetration, pain and soreness only at LI4. Blinding appeared to be successfully achieved for ST36. Although 41% of participants in the real acupuncture group made correct guesses for LI4, 31% guessed incorrectly for CV12, beyond chance level. People with more experience and knowledge about acupuncture were more likely to correctly guess the type of needle they received at ST36 only, compared to that at the other points. A non-penetrating sham needle may successfully blind participants and thus may be a credible sham control. However, the small sample size, the different needle sensations, and the degree and direction of unblinding across acupuncture points warrant further studies in Korea as well as other countries to confirm our finding. Our results also justify the incorporation of formal testing of the use of sham controls in clinical trials of acupuncture. Copyright © 2010 Elsevier Ltd. All rights reserved.
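    The abstract does not spell out which blinding index was used; one common formulation (the Bang blinding index, computed per arm with "don't know" responses kept in the denominator) can be sketched as follows, with hypothetical counts rather than the trial's data:

```python
def bang_blinding_index(correct, incorrect, dont_know):
    """Bang blinding index for one treatment arm.

    BI = 2 * P(correct guess) - 1, with "don't know" responses kept in
    the denominator. BI ≈ 0 is consistent with random guessing (blinding
    intact), BI → 1 suggests unblinding, BI < 0 suggests opposite guessing.
    """
    n = correct + incorrect + dont_know
    return 2 * correct / n - 1

# Hypothetical guess counts for a 40-person arm:
print(bang_blinding_index(20, 20, 0))   # 0.0 -> consistent with chance
print(bang_blinding_index(30, 10, 0))   # 0.5 -> evidence of unblinding
```

    A per-arm, per-point index like this is what allows statements such as "blinding appeared to be successfully achieved for ST36" to be made quantitatively.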

  11. Sample Selection in Randomized Experiments: A New Method Using Propensity Score Stratified Sampling

    ERIC Educational Resources Information Center

    Tipton, Elizabeth; Hedges, Larry; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Caverly, Sarah

    2014-01-01

    Randomized experiments are often seen as the "gold standard" for causal research. Despite the fact that experiments use random assignment to treatment conditions, units are seldom selected into the experiment using probability sampling. Very little research on experimental design has focused on how to make generalizations to well-defined…

  12. Is my bottom-up uncertainty estimation on metal measurement adequate?

    NASA Astrophysics Data System (ADS)

    Marques, J. R.; Faustino, M. G.; Monteiro, L. R.; Ulrich, J. C.; Pires, M. A. F.; Cotrim, M. E. B.

    2018-03-01

    Is the uncertainty estimated under the GUM recommendation and associated with metal measurement adequate? How can one evaluate whether a measurement uncertainty really covers all the uncertainty associated with the analytical procedure? Considering that many laboratories frequently underestimate, or less frequently overestimate, the uncertainties of their results, this paper presents an evaluation of the estimated uncertainties of seven metal measurements in two ICP-OES procedures according to the GUM approach. The Horwitz function and proficiency-test scaled standard uncertainties were used in this evaluation. Our data show that the expanded uncertainties of most elements were underestimated by a factor of two to four. Possible causes and corrections are discussed herein.
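    The Horwitz function used as a benchmark here has a standard closed form, RSD(%) = 2^(1 − 0.5·log₁₀ C), with C the analyte concentration as a mass fraction; a minimal sketch (the function name is ours):

```python
import math

def horwitz_rsd_percent(c):
    """Predicted reproducibility RSD (%) from the Horwitz function.

    c is the analyte concentration as a dimensionless mass fraction
    (e.g. 1e-6 for 1 mg/kg).
    """
    return 2 ** (1 - 0.5 * math.log10(c))

# Pure analyte (c = 1): predicted RSD is 2 %
print(horwitz_rsd_percent(1.0))            # 2.0
# Trace level of 1 mg/kg (c = 1e-6): predicted RSD is 16 %
print(round(horwitz_rsd_percent(1e-6), 6))  # 16.0
```

    Comparing a laboratory's expanded uncertainty against this predicted RSD (and against proficiency-test scaled standard uncertainties) is one way to flag the under-estimation the authors report.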

  13. Working group on the “adequate minimum” volcano observatory

    USGS Publications Warehouse

    Tilling, R.I.

    1982-01-01

    A working group consisting of R. I. Tilling (United States, Chairman), M. Espendola (Mexico), E. Malavassi (Costa Rica), L. Villari (Italy), and J. P. Viode (France) met on the island of Guadeloupe on February 20, 1981, to discuss informally the requirements for a "minimum" volcano observatory, one which would have the essential monitoring equipment and staff to provide reliable information on the state of an active volcano. Given the premise that any monitoring of a volcano is better than none at all, the working group then proceeded to consider the concept of an "adequate minimum" observatory.

  14. [The pregnant employee in anaesthesia and intensive care - An evidence-based approach to designing adequate workplaces].

    PubMed

    Röher, Katharina; Göpfert, Matthias S

    2015-07-01

    In the light of a rising percentage of women among employees in anaesthesia and intensive care, designing adequate workplaces for pregnant employees plays an increasingly important role. It is necessary to align the varied interests of the pregnant employee, fellow employees and the employer, with the legal requirements of the Maternity Protection Act ("Mutterschutzgesetz") forming the statutory framework. This review describes how adequate workplaces for pregnant employees in anaesthesia and intensive care can be established, considering the scientific evidence on the subject. © Georg Thieme Verlag Stuttgart · New York.

  15. Evaluation of variable selection methods for random forests and omics data sets.

    PubMed

    Degenhardt, Frauke; Seifert, Stephan; Szymczak, Silke

    2017-10-16

    Machine learning methods and in particular random forests are promising approaches for prediction based on high dimensional omics data sets. They provide variable importance measures to rank predictors according to their predictive power. If building a prediction model is the main goal of a study, often a minimal set of variables with good prediction performance is selected. However, if the objective is the identification of involved variables to find active networks and pathways, approaches that aim to select all relevant variables should be preferred. We evaluated several variable selection procedures based on simulated data as well as publicly available experimental methylation and gene expression data. Our comparison included the Boruta algorithm, the Vita method, recurrent relative variable importance, a permutation approach and its parametric variant (Altmann) as well as recursive feature elimination (RFE). In our simulation studies, Boruta was the most powerful approach, followed closely by the Vita method. Both approaches demonstrated similar stability in variable selection, while Vita was the most robust approach under a pure null model without any predictor variables related to the outcome. In the analysis of the different experimental data sets, Vita demonstrated slightly better stability in variable selection and was less computationally intensive than Boruta. In conclusion, we recommend the Boruta and Vita approaches for the analysis of high-dimensional data sets. Vita is considerably faster than Boruta and thus more suitable for large data sets, but only Boruta can also be applied in low-dimensional settings. © The Author 2017. Published by Oxford University Press.
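    The shadow-feature idea at the core of Boruta can be sketched on simulated data (a simplified single-pass version, assuming scikit-learn is available; the real algorithm iterates this comparison with statistical testing):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Simulated data: only feature 0 carries signal for the outcome
X = rng.standard_normal((300, 5))
y = (X[:, 0] + 0.1 * rng.standard_normal(300) > 0).astype(int)

# Boruta's core idea: append "shadow" features (column-wise permuted copies
# of the real ones) and keep real features whose importance beats the best
# shadow importance.
shadows = rng.permuted(X, axis=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(np.hstack([X, shadows]), y)

real_imp = rf.feature_importances_[:5]
shadow_max = rf.feature_importances_[5:].max()
selected = [j for j in range(5) if real_imp[j] > shadow_max]
print(selected)  # feature 0 should survive the shadow comparison
```

    Because shadow features are irrelevant by construction, any real feature that cannot beat them is plausibly noise; Boruta repeats this test over many forests before confirming or rejecting each feature.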

  16. The REFLECT statement: methods and processes of creating reporting guidelines for randomized controlled trials for livestock and food safety by modifying the CONSORT statement.

    PubMed

    O'Connor, A M; Sargeant, J M; Gardner, I A; Dickson, J S; Torrence, M E; Dewey, C E; Dohoo, I R; Evans, R B; Gray, J T; Greiner, M; Keefe, G; Lefebvre, S L; Morley, P S; Ramirez, A; Sischo, W; Smith, D R; Snedeker, K; Sofos, J; Ward, M P; Wills, R

    2010-03-01

    The conduct of randomized controlled trials in livestock with production, health and food-safety outcomes presents unique challenges that may not be adequately reported in trial reports. The objective of this project was to modify the CONSORT (Consolidated Standards of Reporting Trials) statement to reflect the unique aspects of reporting these livestock trials. A 2-day consensus meeting was held on 18-19 November 2008 in Chicago, IL, USA, to achieve the objective. The 24 attendees were biostatisticians, epidemiologists, food-safety researchers, livestock-production specialists, journal editors, assistant editors and associate editors. Prior to the meeting, the attendees completed a Web-based survey to identify issues for discussion, indicating which CONSORT statement items might need to be modified to address unique issues for livestock trials. The consensus meeting resulted in the production of the REFLECT (Reporting Guidelines for Randomized Controlled Trials) statement for livestock and food safety and a 22-item checklist. Fourteen items were modified from the CONSORT checklist and an additional sub-item was proposed to address challenge trials. The REFLECT statement proposes new terminology, more consistent with common usage in livestock production, to describe study subjects. Evidence was not always available to support modification to or inclusion of an item. The use of the REFLECT statement, which addresses issues unique to livestock trials, should improve the quality of reporting and design for trials reporting production, health and food-safety outcomes.

  17. Cluster-randomized Studies in Educational Research: Principles and Methodological Aspects

    PubMed Central

    Dreyhaupt, Jens; Mayer, Benjamin; Keis, Oliver; Öchsner, Wolfgang; Muche, Rainer

    2017-01-01

    An increasing number of studies are being performed in educational research to evaluate new teaching methods and approaches. These studies could be performed more efficiently and deliver more convincing results if they more strictly applied and complied with recognized standards of scientific studies. Such an approach could substantially increase the quality in particular of prospective, two-arm (intervention) studies that aim to compare two different teaching methods. A key standard in such studies is randomization, which can minimize systematic bias in study findings; such bias may result if the two study arms are not structurally equivalent. If possible, educational research studies should also achieve this standard, although this is not yet generally the case. Some difficulties and concerns exist, particularly regarding organizational and methodological aspects. An important point to consider in educational research studies is that usually individuals cannot be randomized, because of the teaching situation, and instead whole groups have to be randomized (so-called “cluster randomization”). Compared with studies with individual randomization, studies with cluster randomization normally require (significantly) larger sample sizes and more complex methods for calculating sample size. Furthermore, cluster-randomized studies require more complex methods for statistical analysis. The consequence of the above is that a competent expert with the relevant specialist knowledge needs to be involved in all phases of cluster-randomized studies. Studies to evaluate new teaching methods need to make greater use of randomization in order to achieve scientifically convincing results. Therefore, in this article we describe the general principles of cluster randomization and how to implement these principles, and we also outline practical aspects of using cluster randomization in prospective, two-arm comparative educational research studies. PMID:28584874
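    The sample-size inflation mentioned above is usually computed through the design effect, DEFF = 1 + (m − 1) × ICC for clusters of average size m; a minimal sketch with hypothetical numbers:

```python
import math

def design_effect(cluster_size, icc):
    """Design effect for cluster randomization: DEFF = 1 + (m - 1) * ICC."""
    return 1 + (cluster_size - 1) * icc

def clustered_sample_size(n_individual, cluster_size, icc):
    """Inflate an individually randomized sample size for cluster randomization."""
    return math.ceil(n_individual * design_effect(cluster_size, icc))

# Suppose 128 students per arm would suffice with individual randomization;
# with whole classes of 25 randomized and an intraclass correlation of 0.05,
# considerably more are needed:
print(round(design_effect(25, 0.05), 2))      # 2.2
print(clustered_sample_size(128, 25, 0.05))   # 282
```

    Even a small ICC more than doubles the required sample size here, which is why cluster-randomized educational studies need specialist input at the planning stage.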

  18. Weighted re-randomization tests for minimization with unbalanced allocation.

    PubMed

    Han, Baoguang; Yu, Menggang; McEntegart, Damian

    2013-01-01

    The re-randomization test has been considered a robust alternative to traditional population-model-based methods for analyzing randomized clinical trials. This is especially so when trials are randomized according to minimization, a popular covariate-adaptive randomization method for ensuring balance among prognostic factors. Among various re-randomization tests, fixed-entry-order re-randomization is advocated as an effective strategy when a temporal trend is suspected. Yet when minimization is applied to trials with unequal allocation, the fixed-entry-order re-randomization test is biased and its power thus compromised. We find that the bias is due to non-uniform re-allocation probabilities incurred by the re-randomization in this case. We therefore propose a weighted fixed-entry-order re-randomization test to overcome the bias. The performance of the new test was investigated in simulation studies that mimic the settings of a real clinical trial. The weighted re-randomization test was found to work well in the scenarios investigated, including the presence of a strong temporal trend. Copyright © 2013 John Wiley & Sons, Ltd.
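    The unweighted version of such a test can be sketched for simple 1:1 allocation (a plain re-randomization/permutation test; the paper's weighted variant instead re-runs minimization and corrects for its non-uniform re-allocation probabilities):

```python
import numpy as np

rng = np.random.default_rng(0)

def rerandomization_test(y, assign, n_rerand=2000):
    """Two-sided re-randomization test of a difference in means under
    simple 1:1 re-allocation.  The observed test statistic is referred
    to its distribution over re-randomized assignments."""
    obs = y[assign == 1].mean() - y[assign == 0].mean()
    extreme = 0
    for _ in range(n_rerand):
        perm = rng.permutation(assign)
        stat = y[perm == 1].mean() - y[perm == 0].mean()
        if abs(stat) >= abs(obs):
            extreme += 1
    return (extreme + 1) / (n_rerand + 1)  # add-one to avoid p = 0

# Simulated trial of 100 patients with a true treatment effect of 1.0
assign = np.tile([0, 1], 50)
y = rng.standard_normal(100) + 1.0 * assign
p = rerandomization_test(y, assign)
print(p < 0.05)  # True: the effect is detected
```

    Under minimization with unequal allocation the re-allocations are no longer equiprobable, which is precisely the bias the weighted test addresses.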

  19. A multi-level intervention in worksites to increase fruit and vegetable access and intake: Rationale, design and methods of the 'Good to Go' cluster randomized trial.

    PubMed

    Risica, Patricia M; Gorham, Gemma; Dionne, Laura; Nardi, William; Ng, Doug; Middler, Reese; Mello, Jennifer; Akpolat, Rahmet; Gettens, Katelyn; Gans, Kim M

    2018-02-01

    Fruit and vegetable (F&V) consumption is an important contributor to chronic disease prevention. However, most Americans do not eat adequate amounts. The worksite is an advantageous setting to reach large, diverse segments of the population with interventions to increase F&V intake, but research gaps exist. No studies have evaluated the implementation of mobile F&V markets at worksites nor compared the effectiveness of such markets with or without nutrition education. This paper describes the protocol for Good to Go (GTG), a cluster randomized trial to evaluate F&V intake change in employees from worksites randomized into three experimental arms: discount, fresh F&V markets (Access Only arm); markets plus educational components including campaigns, cooking demonstrations, videos, newsletters, and a web site (Access Plus arm); and an attention placebo comparison intervention on physical activity and stress reduction (Comparison). Secondary aims include: 1) Process evaluation to determine costs, reach, fidelity, and dose as well as the relationship of these variables with changes in F&V intake; 2) Applying a mediating variable framework to examine relationships of psychosocial factors/determinants with changes in F&V consumption; and 3) Cost effectiveness analysis of the different intervention arms. The GTG study will fill important research gaps in the field by implementing a rigorous cluster randomized trial to evaluate the efficacy of an innovative environmental intervention providing access and availability to F&V at the worksite and whether this access intervention is further enhanced by accompanying educational interventions. GTG will provide an important contribution to public health research and practice. Trial registration number NCT02729675, ClinicalTrials.gov. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  20. Randomized comparison of two Internet-supported fertility-awareness-based methods of family planning.

    PubMed

    Fehring, Richard J; Schneider, Mary; Raviele, Kathleen; Rodriguez, Dana; Pruszynski, Jessica

    2013-07-01

    The aim was to compare the efficacy and acceptability of two Internet-supported fertility-awareness-based methods of family planning. Six hundred and sixty-seven women and their male partners were randomized into either an electronic hormonal fertility monitor (EHFM) group or a cervical mucus monitoring (CMM) group. Both groups utilized a Web site with instructions, charts and support. Acceptability was assessed online at 1, 3 and 6 months. Pregnancy rates were determined by survival analysis. The EHFM participants (N=197) had a total pregnancy rate of 7 per 100 users over 12 months of use compared with 18.5 for the CMM group (N=164). The log rank survival test showed a significant difference (p<.01) in survival functions. Mean acceptability for both groups increased significantly over time (p<.0001). Continuation rates at 12 months were 40.6% for the monitor group and 36.6% for the mucus group. In comparison with the CMM, the EHFM method of family planning was more effective. All users had an increase in acceptability over time. Results are tempered by the high dropout rate. Copyright © 2013 Elsevier Inc. All rights reserved.

  1. Should multiple imputation be the method of choice for handling missing data in randomized trials?

    PubMed Central

    Sullivan, Thomas R; White, Ian R; Salter, Amy B; Ryan, Philip; Lee, Katherine J

    2016-01-01

    The use of multiple imputation has increased markedly in recent years, and journal reviewers may expect to see multiple imputation used to handle missing data. However in randomized trials, where treatment group is always observed and independent of baseline covariates, other approaches may be preferable. Using data simulation we evaluated multiple imputation, performed both overall and separately by randomized group, across a range of commonly encountered scenarios. We considered both missing outcome and missing baseline data, with missing outcome data induced under missing at random mechanisms. Provided the analysis model was correctly specified, multiple imputation produced unbiased treatment effect estimates, but alternative unbiased approaches were often more efficient. When the analysis model overlooked an interaction effect involving randomized group, multiple imputation produced biased estimates of the average treatment effect when applied to missing outcome data, unless imputation was performed separately by randomized group. Based on these results, we conclude that multiple imputation should not be seen as the only acceptable way to handle missing data in randomized trials. In settings where multiple imputation is adopted, we recommend that imputation is carried out separately by randomized group. PMID:28034175

  2. Should multiple imputation be the method of choice for handling missing data in randomized trials?

    PubMed

    Sullivan, Thomas R; White, Ian R; Salter, Amy B; Ryan, Philip; Lee, Katherine J

    2016-01-01

    The use of multiple imputation has increased markedly in recent years, and journal reviewers may expect to see multiple imputation used to handle missing data. However in randomized trials, where treatment group is always observed and independent of baseline covariates, other approaches may be preferable. Using data simulation we evaluated multiple imputation, performed both overall and separately by randomized group, across a range of commonly encountered scenarios. We considered both missing outcome and missing baseline data, with missing outcome data induced under missing at random mechanisms. Provided the analysis model was correctly specified, multiple imputation produced unbiased treatment effect estimates, but alternative unbiased approaches were often more efficient. When the analysis model overlooked an interaction effect involving randomized group, multiple imputation produced biased estimates of the average treatment effect when applied to missing outcome data, unless imputation was performed separately by randomized group. Based on these results, we conclude that multiple imputation should not be seen as the only acceptable way to handle missing data in randomized trials. In settings where multiple imputation is adopted, we recommend that imputation is carried out separately by randomized group.
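    The recommendation to impute separately by randomized group can be illustrated with a toy multiple-imputation sketch (normal draws within each arm and simple averaging of point estimates; a deliberate simplification of a full Rubin's-rules analysis):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated trial: treatment raises the outcome by 2.0; 20% of outcomes missing
n = 400
arm = np.repeat([0, 1], n // 2)
y = rng.standard_normal(n) + 2.0 * arm
missing = rng.random(n) < 0.2
y_obs = np.where(missing, np.nan, y)

def impute_by_group(y_obs, arm, rng):
    """One imputation: draw missing outcomes from a normal fitted per arm."""
    y_imp = y_obs.copy()
    for g in (0, 1):
        obs = y_obs[(arm == g) & ~np.isnan(y_obs)]
        miss = np.isnan(y_obs) & (arm == g)
        y_imp[miss] = rng.normal(obs.mean(), obs.std(ddof=1), miss.sum())
    return y_imp

# Multiple imputation: estimate the effect in each completed data set, then pool
estimates = []
for _ in range(20):
    y_imp = impute_by_group(y_obs, arm, rng)
    estimates.append(y_imp[arm == 1].mean() - y_imp[arm == 0].mean())
pooled = np.mean(estimates)  # pooled point estimate (average over imputations)
print(round(pooled, 2))      # close to the true effect of 2.0
```

    Imputing within each arm keeps any treatment-by-covariate structure out of the imputation model's blind spot, which is why a single pooled imputation model can bias the treatment effect when such interactions exist.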

  3. Active classifier selection for RGB-D object categorization using a Markov random field ensemble method

    NASA Astrophysics Data System (ADS)

    Durner, Maximilian; Márton, Zoltán; Hillenbrand, Ulrich; Ali, Haider; Kleinsteuber, Martin

    2017-03-01

    In this work, a new ensemble method for the task of category recognition in different environments is presented. The focus is on service robotic perception in an open environment, where the robot's task is to recognize previously unseen objects of predefined categories, based on training on a public dataset. We propose an ensemble learning approach to be able to flexibly combine complementary sources of information (different state-of-the-art descriptors computed on color and depth images), based on a Markov Random Field (MRF). By exploiting its specific characteristics, the MRF ensemble method can also be executed as a Dynamic Classifier Selection (DCS) system. In the experiments, the committee- and topology-dependent performance boost of our ensemble is shown. Despite reduced computational costs and using less information, our strategy performs on the same level as common ensemble approaches. Finally, the impact of large differences between datasets is analyzed.

  4. Statistical analysis of loopy belief propagation in random fields

    NASA Astrophysics Data System (ADS)

    Yasuda, Muneki; Kataoka, Shun; Tanaka, Kazuyuki

    2015-10-01

    Loopy belief propagation (LBP), which is equivalent to the Bethe approximation in statistical mechanics, is a message-passing-type inference method that is widely used to analyze systems based on Markov random fields (MRFs). In this paper, we propose a message-passing-type method to analytically evaluate the quenched average of LBP in random fields by using the replica cluster variation method. The proposed analytical method is applicable to general pairwise MRFs with random fields whose distributions differ from each other and can give the quenched averages of the Bethe free energies over random fields, which are consistent with numerical results. The order of its computational cost is equivalent to that of standard LBP. In the latter part of this paper, we describe the application of the proposed method to Bayesian image restoration, in which we observed that our theoretical results are in good agreement with the numerical results for natural images.
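    To make the message-passing machinery concrete, here is a minimal sum-product LBP sketch on a three-node chain, checked against brute-force enumeration (on a tree, LBP is exact). This illustrates plain LBP only, not the paper's replica cluster variation analysis, and the field and coupling values are invented.

```python
import itertools
import numpy as np

# Pairwise MRF: x_i in {-1,+1}, p(x) proportional to
# exp( sum_i h_i x_i + sum_{(i,j)} J x_i x_j ) on the chain 0-1-2.
h = np.array([0.3, -0.2, 0.5])
J = 0.8
edges = [(0, 1), (1, 2)]
nbrs = {0: [1], 1: [0, 2], 2: [1]}
states = np.array([-1, 1])
psi = np.exp(J * np.outer(states, states))    # pairwise potential psi[x_i, x_j]
phi = [np.exp(h[i] * states) for i in range(3)]  # unary potentials

def brute_marginals():
    """Exact P(x_i = +1) by enumerating all 2^3 configurations."""
    Z, marg = 0.0, np.zeros(3)
    for x in itertools.product([-1, 1], repeat=3):
        w = np.exp(sum(h[i] * x[i] for i in range(3))
                   + sum(J * x[i] * x[j] for i, j in edges))
        Z += w
        for i in range(3):
            if x[i] == 1:
                marg[i] += w
    return marg / Z

def lbp_marginals(iters=50):
    """Synchronous sum-product message passing; exact on this tree."""
    msgs = {(i, j): np.ones(2) for i, j in edges}
    msgs.update({(j, i): np.ones(2) for i, j in edges})
    for _ in range(iters):
        new = {}
        for (i, j) in msgs:
            prod = phi[i].copy()
            for k in nbrs[i]:
                if k != j:
                    prod = prod * msgs[(k, i)]
            m = psi.T @ prod            # marginalize over x_i
            new[(i, j)] = m / m.sum()
        msgs = new
    out = []
    for i in range(3):
        b = phi[i].copy()
        for k in nbrs[i]:
            b = b * msgs[(k, i)]
        out.append(b[1] / b.sum())      # P(x_i = +1)
    return np.array(out)
```

    On loopy graphs the same updates give the Bethe approximation rather than exact marginals, which is the regime the paper's quenched-average analysis targets.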

  5. A Two-Stage Estimation Method for Random Coefficient Differential Equation Models with Application to Longitudinal HIV Dynamic Data.

    PubMed

    Fang, Yun; Wu, Hulin; Zhu, Li-Xing

    2011-07-01

    We propose a two-stage estimation method for random coefficient ordinary differential equation (ODE) models. A maximum pseudo-likelihood estimator (MPLE) is derived based on a mixed-effects modeling approach and its asymptotic properties for population parameters are established. The proposed method does not require repeatedly solving ODEs, and is computationally efficient although it does pay a price with the loss of some estimation efficiency. However, the method does offer an alternative approach when the exact likelihood approach fails due to model complexity and high-dimensional parameter space, and it can also serve as a method to obtain the starting estimates for more accurate estimation methods. In addition, the proposed method does not need to specify the initial values of state variables and preserves all the advantages of the mixed-effects modeling approach. The finite sample properties of the proposed estimator are studied via Monte Carlo simulations and the methodology is also illustrated with application to an AIDS clinical data set.
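    The two-stage idea (smooth the trajectories first, then fit the ODE relation by least squares, with no ODE solving) can be sketched for the simple random-coefficient model dy/dt = a·y. This is an illustrative stand-in for the paper's maximum pseudo-likelihood estimator, with invented data and a polynomial smoother.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 2.0, 30)
a_true, slopes = 0.8, []
for _ in range(40):                          # 40 hypothetical subjects
    a_i = a_true + rng.normal(0.0, 0.1)      # subject-specific random coefficient
    y = np.exp(a_i * t) + rng.normal(0.0, 0.02, t.size)  # noisy observations
    # Stage 1: smooth the trajectory (no ODE solving) and differentiate it
    p = np.polyfit(t, y, 4)
    y_hat = np.polyval(p, t)
    dy_hat = np.polyval(np.polyder(p), t)
    # Stage 2: least-squares fit of the ODE relation dy/dt = a * y
    slopes.append((dy_hat @ y_hat) / (y_hat @ y_hat))
a_hat = float(np.mean(slopes))               # population-level estimate of a
```

    The per-subject estimates are then pooled for the population parameter; the loss of efficiency relative to a full likelihood approach comes from the smoothing step, but no ODE is ever solved.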

  6. Truly random number generation: an example

    NASA Astrophysics Data System (ADS)

    Frauchiger, Daniela; Renner, Renato

    2013-10-01

    Randomness is crucial for a variety of applications, ranging from gambling to computer simulations, and from cryptography to statistics. However, many of the currently used methods for generating randomness do not meet the criteria that are necessary for these applications to work properly and safely. A common problem is that a sequence of numbers may look random but nevertheless not be truly random. In fact, the sequence may pass all standard statistical tests and yet be perfectly predictable. This renders it useless for many applications. For example, in cryptography, the predictability of a "randomly" chosen password is obviously undesirable. Here, we review a recently developed approach to generating true, and hence unpredictable, randomness.
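    The "looks random but is perfectly predictable" point can be demonstrated with a classic linear congruential generator (here the well-known MINSTD/Lehmer parameters; the seed and sample size are arbitrary). Its normalized outputs have unremarkable summary statistics, yet anyone who knows the parameters and observes a single output can predict the entire future sequence.

```python
# MINSTD / Lehmer linear congruential generator parameters
M, A, C = 2**31 - 1, 48271, 0

def lcg(seed, n):
    """Return the first n outputs of the LCG started from seed."""
    out, x = [], seed
    for _ in range(n):
        x = (A * x + C) % M
        out.append(x)
    return out

seq = lcg(12345, 1000)
# Statistically unremarkable: mean of normalized outputs is near 0.5...
mean = sum(v / M for v in seq) / len(seq)
# ...yet the next output is fully determined by the last observed one.
predicted_next = (A * seq[-1] + C) % M
```

    True randomness in the sense of the reviewed approach is exactly what rules out this kind of prediction, even for an adversary with unlimited computing power.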

  7. 9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... on problems of animal health, behavior, and well-being is conveyed to the attending veterinarian; (4... 9 Animals and Animal Products 1 2014-01-01 2014-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT...

  8. 9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... on problems of animal health, behavior, and well-being is conveyed to the attending veterinarian; (4... 9 Animals and Animal Products 1 2013-01-01 2013-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT...

  9. 9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... on problems of animal health, behavior, and well-being is conveyed to the attending veterinarian; (4... 9 Animals and Animal Products 1 2012-01-01 2012-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT...

  10. Towards Defining Adequate Lithium Trials for Individuals with Mental Retardation and Mental Illness.

    ERIC Educational Resources Information Center

    Pary, Robert J.

    1991-01-01

    Use of lithium with mentally retarded individuals with psychiatric conditions and/or behavior disturbances is discussed. The paper describes components of an adequate clinical trial and reviews case studies and double-blind cases. The paper concludes that aggression is the best indicator for lithium use, and reviews treatment parameters and…

  11. Comparison of Registered and Reported Outcomes in Randomized Clinical Trials Published in Anesthesiology Journals.

    PubMed

    Jones, Philip M; Chow, Jeffrey T Y; Arango, Miguel F; Fridfinnson, Jason A; Gai, Nan; Lam, Kevin; Turkstra, Timothy P

    2017-10-01

    Randomized clinical trials (RCTs) provide high-quality evidence for clinical decision-making. Trial registration is one of the many tools used to improve the reporting of RCTs by reducing publication bias and selective outcome reporting bias. The purpose of our study is to examine whether RCTs published in the top 6 general anesthesiology journals were adequately registered and whether the reported primary and secondary outcomes corresponded to the originally registered outcomes. Following a prespecified protocol, an electronic database was used to systematically screen and extract data from RCTs published in the top 6 general anesthesiology journals by impact factor (Anaesthesia, Anesthesia & Analgesia, Anesthesiology, British Journal of Anaesthesia, Canadian Journal of Anesthesia, and European Journal of Anaesthesiology) during the years 2007, 2010, 2013, and 2015. A manual search of each journal's Table of Contents was performed (in duplicate) to identify eligible RCTs. An adequately registered trial was defined as one registered in a publicly available trials registry, with an unambiguously defined primary outcome, before enrollment of the first patient. For adequately registered trials, the outcomes registered in the trial registry were compared with the outcomes reported in the article, with outcome discrepancies documented and analyzed by type of discrepancy. During the 4 years studied, 860 RCTs were identified, of which 102 (12%) were determined to be adequately registered. The proportion of adequately registered trials increased over time, with 38% of RCTs being adequately registered in 2015. The most common reason for inadequate registration in 2015 was registering the RCT after the first patient had already been enrolled. Among adequately registered trials, 92% had at least 1 primary or secondary outcome discrepancy. In 2015, 42% of RCTs had at least 1 primary outcome discrepancy, while 90% of RCTs had at least 1 secondary outcome discrepancy.

  12. Conditional Monte Carlo randomization tests for regression models.

    PubMed

    Parhat, Parwen; Rosenberger, William F; Diao, Guoqing

    2014-08-15

    We discuss the computation of randomization tests for clinical trials of two treatments when the primary outcome is based on a regression model. We begin by revisiting the seminal paper of Gail, Tan, and Piantadosi (1988), and then describe a method based on Monte Carlo generation of randomization sequences. The tests based on this Monte Carlo procedure are design based, in that they incorporate the particular randomization procedure used. We discuss permuted block designs, complete randomization, and biased coin designs. We also use a new technique by Plamadeala and Rosenberger (2012) for simple computation of conditional randomization tests. Like Gail, Tan, and Piantadosi, we focus on residuals from generalized linear models and martingale residuals from survival models. Such techniques do not apply to longitudinal data analysis, and we introduce a method for computation of randomization tests based on the predicted rate of change from a generalized linear mixed model when outcomes are longitudinal. We show, by simulation, that these randomization tests preserve the size and power well under model misspecification. Copyright © 2014 John Wiley & Sons, Ltd.
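    The Monte Carlo idea can be sketched for the simplest design discussed, complete randomization, using residuals from a covariate-only linear model: re-draw the treatment assignment many times from the same design and compare the observed statistic with its re-randomization distribution. This is a simplified illustration with invented data, not the authors' procedure for blocked or biased-coin designs or for longitudinal outcomes.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100
x = rng.normal(size=n)                        # baseline covariate
treat = rng.integers(0, 2, n)                 # complete randomization
y = 0.5 * x + 1.0 * treat + rng.normal(size=n)  # true treatment effect = 1

# Residuals from a covariate-only linear model (treatment deliberately omitted)
X = np.column_stack([np.ones(n), x])
resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]

def stat(t):
    """Difference in mean residuals between the two arms."""
    return resid[t == 1].mean() - resid[t == 0].mean()

obs = stat(treat)
# Monte Carlo: re-randomize under the same design to build the reference
# distribution, then compute a two-sided design-based p-value.
null = np.array([stat(rng.integers(0, 2, n)) for _ in range(2000)])
p = (np.sum(np.abs(null) >= abs(obs)) + 1) / (2000 + 1)
```

    The test is design based because the re-randomizations are drawn from the same procedure used in the trial; for a permuted block design, each re-draw would instead permute labels within blocks.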

  13. Randomized, controlled pilot study comparing large-volume paracentesis using wall suction and traditional glass vacuum bottle methods.

    PubMed

    Konerman, Monica A; Price, Jennifer; Torres, Dawn; Li, Zhiping

    2014-09-01

    Large-volume paracentesis (LVP) can be time and labor intensive depending on the amount of ascites removed and the method of drainage. Wall suction has been adopted as the preferred method of drainage at many centers, though the safety and benefits of this technique have not been formally evaluated. The primary objective of this study was to define the cost and time savings of wall suction over the traditional glass vacuum bottle method for ascites drainage. The secondary objective was to compare the safety profile and patient satisfaction using these two techniques. We conducted a randomized, controlled pilot study of the wall suction versus vacuum bottle methods for LVP in hospitalized patients. All LVPs were performed under ultrasound guidance by a single proceduralist. Patients with at least 4 liters removed received 25% intravenous albumin, 8 g/liter fluid removed. Demographic, clinical characteristics, and procedure details were recorded. Laboratory and hemodynamic data were recorded for 24 h prior to and 24-48 h post LVP. An electronic chart review was conducted to evaluate procedure-related complications. Data were compared using Fisher's exact test, t test, or Mann-Whitney U test. Thirty-four patients were randomized to wall suction at 200 mmHg (n = 17) or glass vacuum bottle drainage (n = 17). Wall suction was significantly faster and less costly than vacuum bottle drainage (7 versus 15 min, p = 0.002; $4.59 versus $12.73, p < 0.001). There were no differences in outcomes at 24 and 48 h post LVP, or at 60-day follow up. Performing LVP using wall suction resulted in significantly shorter procedure time and supply cost savings. There were no differences in outcomes between the groups, suggesting equivalent safety, though larger studies powered to detect small differences are needed. Given its efficiency, convenience, and cost effectiveness, wall suction may be a superior method of ascites drainage for LVP.

  14. Adequate Education: Issues in Its Definition and Implementation. School Finance Project, Working Papers.

    ERIC Educational Resources Information Center

    Tron, Esther, Ed.

    Section 1203 of the Education Amendments of 1978 mandated the undertaking of studies concerning the adequate financing of elementary and secondary education in the 1980s. Created to carry out this mandate, the School Finance Project established as one of its goals reporting to Congress on issues implicit in funding educational adequacy. Several…

  15. Developing an adequate "pneumatraumatology": understanding the spiritual impacts of traumatic injury.

    PubMed

    Bidwell, Duane R

    2002-01-01

    Psychosocial interventions and systematic theology are primary resources for chaplains and congregational pastors who care for victims of physical trauma. Yet these resources may not be adequate to address the spiritual impacts of trauma. This article proposes a preliminary "pneumatraumatology," drawing on early Christian asceticism and Buddhist mysticism to describe one way of understanding the spiritual impacts of traumatic injury. It also suggests possible responses to these impacts informed by narrative/constructionist perspectives and Brueggemann's understanding of the dimensions of spiritual transformation in the Hebrew Bible.

  16. Comparison of four different pain relief methods during hysterosalpingography: A randomized controlled study

    PubMed Central

    Unlu, Bekir Serdar; Yilmazer, Mehmet; Koken, Gulengul; Arioz, Dagistan Tolga; Unlu, Ebru; Baki, Elif Dogan; Kurttay, Cemile; Karacin, Osman

    2015-01-01

    BACKGROUND: Hysterosalpingography (HSG) is the most commonly used method for evaluating the anatomy and patency of the uterine cavity and fallopian tubes, and is an important tool in the evaluation of infertility. The most frequent side effect is the pain associated with the procedure. OBJECTIVES: To evaluate four analgesic methods to determine the most useful method for reducing discomfort associated with HSG. METHODS: In the present prospective study, 75 patients undergoing HSG for evaluation of infertility were randomly assigned to four groups: 550 mg of a nonsteroidal anti-inflammatory drug (NSAID) (group 1); 550 mg NSAID + paracervical block (group 2); 550 mg NSAID + paracervical analgesic cream (group 3); or 550 mg NSAID + intrauterine analgesic instillation (group 4). A visual analogue scale was used to assess the pain perception at five predefined steps. RESULTS: Instillation of the liquids used for HSG was found to be the most painful step, and the only step at which a significant difference among groups was observed. When comparing visual analogue scale scores, group 2 and group 3 reported significantly less pain than the other groups. Group 1 reported significantly higher mean (± SD) scores (7.2±1.6) compared with groups 2 and 3 (4.7±2.5 and 3.8±2.4, respectively) (P<0.001). In addition, group 2 reported significantly less pain than group 4 (4.7±2.5 versus 6.7±1.8, respectively) (P<0.02). CONCLUSIONS: For effective pain relief during HSG, in addition to 550 mg NSAID, local application of lidocaine cream to the posterior fornix of the cervix uteri and paracervical lidocaine injection into the cervix uteri appear to be the most effective methods. PMID:25848848

  17. What's in a name? The challenge of describing interventions in systematic reviews: analysis of a random sample of reviews of non-pharmacological stroke interventions

    PubMed Central

    Hoffmann, Tammy C; Walker, Marion F; Langhorne, Peter; Eames, Sally; Thomas, Emma; Glasziou, Paul

    2015-01-01

    Objective To assess, in a sample of systematic reviews of non-pharmacological interventions, the completeness of intervention reporting, identify the most frequently missing elements, and assess review authors’ use of and beliefs about providing intervention information. Design Analysis of a random sample of systematic reviews of non-pharmacological stroke interventions; online survey of review authors. Data sources and study selection The Cochrane Library and PubMed were searched for potentially eligible systematic reviews and a random sample of these assessed for eligibility until 60 (30 Cochrane, 30 non-Cochrane) eligible reviews were identified. Data collection In each review, the completeness of the intervention description in each eligible trial (n=568) was assessed by 2 independent raters using the Template for Intervention Description and Replication (TIDieR) checklist. All review authors (n=46) were invited to complete a survey. Results Most reviews were missing intervention information for the majority of items. The most incompletely described items were: modifications, fidelity, materials, procedure and tailoring (missing from all interventions in 97%, 90%, 88%, 83% and 83% of reviews, respectively). Items that scored better, but were still incomplete for the majority of reviews, were: ‘when and how much’ (in 31% of reviews, adequate for all trials; in 57% of reviews, adequate for some trials); intervention mode (in 22% of reviews, adequate for all trials; in 38%, adequate for some trials); and location (in 19% of reviews, adequate for all trials). Of the 33 (71%) authors who responded, 58% reported having further intervention information but not including it, and 70% tried to obtain information. Conclusions Most focus on intervention reporting has been directed at trials. Poor intervention reporting in stroke systematic reviews is prevalent, compounded by poor trial reporting. Without adequate intervention descriptions, the conduct, usability and

  18. Effectiveness of 3 methods of anchorage reinforcement for maximum anchorage in adolescents: A 3-arm multicenter randomized clinical trial.

    PubMed

    Sandler, Jonathan; Murray, Alison; Thiruvenkatachari, Badri; Gutierrez, Rodrigo; Speight, Paul; O'Brien, Kevin

    2014-07-01

    The objective of this 3-arm parallel randomized clinical trial was to compare the effectiveness of temporary anchorage devices (TADs), Nance button palatal arches, and headgear for anchorage supplementation in the treatment of patients with malocclusions that required maximum anchorage. This trial was conducted between August 2008 and February 2013 in 2 orthodontic departments in the United Kingdom. The study included 78 patients (ages, 12-18 years; mean age, 14.2 years) who needed maximum anchorage. Eligibility criteria included no active caries, exemplary oral hygiene, and maximum anchorage required. The primary outcome was mesial molar movement during the period in which anchorage supplementation was required. The secondary outcomes were duration of anchorage reinforcement, number of treatment visits, number of casual and failed appointments, total treatment time, dento-occlusal change, and patients' perceptions of the method of anchorage supplementation. Treatment allocation was implemented by contacting via the Internet the randomization center at the University of Nottingham, Clinical Trials Unit. The randomization was based on a computer-generated pseudo-random code with random permuted blocks of randomly varying size. A research assistant who was blinded to the group allocation recorded all data. The patients were randomly allocated to receive anchorage supplementation with TADs, a Nance button on a palatal arch, or headgear. They were all treated with maxillary and mandibular preadjusted edgewise fixed appliances with 0.022-in slot prescription brackets. They were followed until orthodontic treatment was complete. Seventy-eight patients were randomized in a 1:1:1 ratio among the 3 groups. The baseline characteristics were similar in the groups, and they were treated for an average of 27.4 months (SD, 7.1 months); 71 completed orthodontic treatment. The data were analyzed on a per-protocol basis and showed no differences in the effectiveness of anchorage
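    An allocation sequence like the one described (random permuted blocks of randomly varying size, 1:1:1 across three arms) can be generated as follows. The block sizes of 3 and 6, the arm labels, and the seed are illustrative assumptions, not the trial's actual configuration.

```python
import random

def permuted_blocks(n, arms=("TAD", "Nance", "Headgear"),
                    block_multiples=(1, 2), seed=42):
    """Allocation sequence of length n built from random permuted blocks of
    randomly varying size (block size = multiple x number of arms)."""
    rng = random.Random(seed)
    seq = []
    while len(seq) < n:
        block = list(arms) * rng.choice(block_multiples)  # 3 or 6 slots
        rng.shuffle(block)                                # permute within block
        seq.extend(block)
    return seq[:n]

alloc = permuted_blocks(78)   # the trial enrolled 78 patients
```

    Each completed block is exactly balanced, so arm totals can differ only by what the truncated final block leaves over; randomly varying the block size makes the next assignment harder to guess, which is the point of the design.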

  19. Random demographic household surveys in highly mobile pastoral communities in Chad

    PubMed Central

    Béchir, Mahamat; Hattendorf, Jan; Bonfoh, Bassirou; Zinsstag, Jakob; Schelling, Esther

    2011-01-01

    Abstract Problem Reliable demographic data is a central requirement for health planning and management, and for the implementation of adequate interventions. This study addresses the lack of demographic data on mobile pastoral communities in the Sahel. Approach A total of 1081 Arab, Fulani and Gorane women and 2541 children (1336 boys and 1205 girls) were interviewed and registered by a biometric fingerprint scanner in five repeated random transect demographic and health surveys conducted from March 2007 to January 2008 in the Lake Chad region in Chad. Local setting Important determinants for the planning and implementation of household surveys among mobile pastoral communities include: environmental factors; availability of women for interviews; difficulties in defining “own” children; the need for information-education-communication campaigns; and informed consent of husbands in typically patriarchal societies. Relevant changes Due to their high mobility, only 5% (56/1081) of registered women were encountered twice. Therefore, it was not possible to establish a demographic and health cohort. Lessons learnt Prospective demographic and health cohorts are the most accurate method to assess child mortality and other demographic indices. However, their feasibility in a highly mobile pastoral setting remains to be shown. Future interdisciplinary scientific efforts need to target innovative methods, tools and approaches to include marginalized communities in operational health and demographic surveillance systems. PMID:21556307

  20. Impact of aspirin resistance on outcomes among patients following coronary artery bypass grafting: exploratory analysis from randomized controlled trial (NCT01159639).

    PubMed

    Petricevic, Mate; Kopjar, Tomislav; Gasparovic, Hrvoje; Milicic, Davor; Svetina, Lucija; Zdilar, Boris; Boban, Marko; Mihaljevic, Martina Zrno; Biocina, Bojan

    2015-05-01

    Individual variability in the response to aspirin has been established by various platelet function assays; however, the clinical relevance of aspirin resistance (AR) in patients undergoing coronary artery bypass grafting (CABG) has yet to be evaluated. Our working group conducted a randomized controlled trial (NCT01159639) with the aim of assessing the impact of dual antiplatelet therapy (APT) on outcomes among patients with AR following CABG. Patients who were aspirin resistant on the fourth postoperative day (POD 4) were randomly assigned to receive either dual APT with clopidogrel (75 mg) plus aspirin (300 mg) (intervention arm) or monotherapy with aspirin (300 mg) (control arm). This exploratory analysis compares clinical outcomes between aspirin-resistant patients allocated to the control arm and patients who had an adequate platelet inhibitory response to aspirin at POD 4. Both groups were treated with 300 mg of aspirin per day following surgery. We sought to evaluate the impact of early postoperative AR on outcomes among patients following CABG. The exploratory analysis included a total of 325 patients: 215 patients with an adequate response to aspirin and 110 patients with AR allocated to aspirin monotherapy under the randomization protocol. The primary efficacy end point (MACCEs: major adverse cardiac and cardiovascular events) occurred in 10% and 6% of patients with AR and with an adequate aspirin response, respectively (p = 0.27). No significant differences were observed in the occurrence of bleeding events. Subgroup analysis of the primary end point revealed that aspirin-resistant patients with BMI > 30 kg/m(2) tended to have a higher occurrence of MACCEs (18 versus 5%; relative risk 0.44 [95% CI 0.16-1.16]; p = 0.05). This exploratory analysis did not reveal a significant impact of aspirin resistance on outcomes among patients undergoing CABG. Further sufficiently powered studies are needed to evaluate the clinical relevance of AR in patients undergoing CABG.
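    Relative risks with confidence intervals, as quoted in this abstract, come from a standard 2x2 calculation. The sketch below uses the usual log-normal approximation with hypothetical counts loosely patterned on the abstract's 10% versus 6% MACCE rates; they are not the study's actual data, which are not given in the abstract.

```python
import math

def relative_risk(a, n1, b, n2):
    """Relative risk of an event in group 1 (a events of n1) versus group 2
    (b events of n2), with a 95% CI from the log-normal approximation."""
    rr = (a / n1) / (b / n2)
    se_log = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, lo, hi

# Hypothetical counts: 11/110 (AR arm) versus 13/215 (adequate response)
rr, lo, hi = relative_risk(11, 110, 13, 215)
```

    With event counts this small, the interval spans 1, mirroring the abstract's non-significant primary comparison; larger, adequately powered studies narrow the interval.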