Sample records for small non-random sample

  1. Differences Between Fraternity and Non-Fraternity Members at the University of Maryland. Research Report No. 12-70.

    ERIC Educational Resources Information Center

    Lynch, Robert C.; Sedlacek, William E.

    To ascertain the nature and extent of the differences between fraternity and non-fraternity men at the University of Maryland, a study was conducted in June 1969 with a small random sample (approximately 50 in each group). Their spring 1969 semester grades, ACT (or converted SAT) composite scores, and responses to selected items on the 1969…

  2. Interpreting survival data from clinical trials of surgery versus stereotactic body radiation therapy in operable Stage I non-small cell lung cancer patients.

    PubMed

    Samson, Pamela; Keogan, Kathleen; Crabtree, Traves; Colditz, Graham; Broderick, Stephen; Puri, Varun; Meyers, Bryan

    2017-01-01

    To identify the variability of short- and long-term survival outcomes among closed Phase III randomized controlled trials with small sample sizes comparing SBRT (stereotactic body radiation therapy) and surgical resection in operable clinical Stage I non-small cell lung cancer (NSCLC) patients. Clinical Stage I NSCLC patients who underwent surgery at our institution meeting the inclusion/exclusion criteria for STARS (Randomized Study to Compare CyberKnife to Surgical Resection in Stage I Non-small Cell Lung Cancer), ROSEL (Trial of Either Surgery or Stereotactic Radiotherapy for Early Stage (IA) Lung Cancer), or both were identified. Bootstrapping analysis provided 10,000 iterations to depict 30-day mortality and three-year overall survival (OS) in cohorts of 16 patients (to simulate the STARS surgical arm), 27 patients (to simulate the pooled surgical arms of STARS and ROSEL), and 515 (to simulate the goal accrual for the surgical arm of STARS). From 2000 to 2012, 749/873 (86%) of clinical Stage I NSCLC patients who underwent resection were eligible for STARS only, ROSEL only, or both studies. When patients eligible for STARS only were repeatedly sampled with a cohort size of 16, the 3-year OS rates ranged from 27 to 100%, and 30-day mortality varied from 0 to 25%. When patients eligible for ROSEL or for both STARS and ROSEL underwent bootstrapping with n=27, the 3-year OS ranged from 46 to 100%, while 30-day mortality varied from 0 to 15%. Finally, when patients eligible for STARS were repeatedly sampled in groups of 515, 3-year OS narrowed to 70-85%, with 30-day mortality varying from 0 to 4%. Short- and long-term survival outcomes from trials with small sample sizes are extremely variable and unreliable for extrapolation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
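The bootstrapping exercise in record 2 is easy to reproduce in miniature. The sketch below is illustrative only: the patient-level data are not in the abstract, so a hypothetical population of 749 patients with a true 3-year overall survival of about 78% (584/749) stands in for the real cohort. Resampling cohorts of 16, 27, and 515 shows how the spread of survival estimates collapses as the arm size grows.

```python
import random

random.seed(0)

# Hypothetical stand-in for the 749 resected stage I NSCLC patients: a
# binary 3-year survival indicator with a true OS of ~78% (584/749).
# The real patient-level data are not given in the abstract.
population = [1] * 584 + [0] * 165

def bootstrap_os(population, cohort_size, iterations=10_000):
    """Resample cohorts with replacement; return the 3-year OS estimates."""
    rates = []
    for _ in range(iterations):
        cohort = random.choices(population, k=cohort_size)
        rates.append(sum(cohort) / cohort_size)
    return rates

for n in (16, 27, 515):
    rates = bootstrap_os(population, n)
    print(f"n={n:3d}: 3-year OS estimates span {min(rates):.0%}-{max(rates):.0%}")
```

With n=16 the resampled OS estimates scatter wildly, while n=515 pins the estimate near the population value, which is the abstract's point about extrapolating from small trial arms.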

  3. Space-Time Smoothing of Complex Survey Data: Small Area Estimation for Child Mortality.

    PubMed

    Mercer, Laina D; Wakefield, Jon; Pantazis, Athena; Lutambi, Angelina M; Masanja, Honorati; Clark, Samuel

    2015-12-01

    Many people living in low and middle-income countries are not covered by civil registration and vital statistics systems. Consequently, a wide variety of other types of data including many household sample surveys are used to estimate health and population indicators. In this paper we combine data from sample surveys and demographic surveillance systems to produce small area estimates of child mortality through time. Small area estimates are necessary to understand geographical heterogeneity in health indicators when full-coverage vital statistics are not available. For this endeavor spatio-temporal smoothing is beneficial to alleviate problems of data sparsity. The use of conventional hierarchical models requires careful thought since the survey weights may need to be considered to alleviate bias due to non-random sampling and non-response. The application that motivated this work is estimation of child mortality rates in five-year time intervals in regions of Tanzania. Data come from Demographic and Health Surveys conducted over the period 1991-2010 and two demographic surveillance system sites. We derive a variance estimator of under five years child mortality that accounts for the complex survey weighting. For our application, the hierarchical models we consider include random effects for area, time and survey and we compare models using a variety of measures including the conditional predictive ordinate (CPO). The method we propose is implemented via the fast and accurate integrated nested Laplace approximation (INLA).

  4. Space-Time Smoothing of Complex Survey Data: Small Area Estimation for Child Mortality

    PubMed Central

    Mercer, Laina D; Wakefield, Jon; Pantazis, Athena; Lutambi, Angelina M; Masanja, Honorati; Clark, Samuel

    2016-01-01

    Many people living in low and middle-income countries are not covered by civil registration and vital statistics systems. Consequently, a wide variety of other types of data including many household sample surveys are used to estimate health and population indicators. In this paper we combine data from sample surveys and demographic surveillance systems to produce small area estimates of child mortality through time. Small area estimates are necessary to understand geographical heterogeneity in health indicators when full-coverage vital statistics are not available. For this endeavor spatio-temporal smoothing is beneficial to alleviate problems of data sparsity. The use of conventional hierarchical models requires careful thought since the survey weights may need to be considered to alleviate bias due to non-random sampling and non-response. The application that motivated this work is estimation of child mortality rates in five-year time intervals in regions of Tanzania. Data come from Demographic and Health Surveys conducted over the period 1991–2010 and two demographic surveillance system sites. We derive a variance estimator of under five years child mortality that accounts for the complex survey weighting. For our application, the hierarchical models we consider include random effects for area, time and survey and we compare models using a variety of measures including the conditional predictive ordinate (CPO). The method we propose is implemented via the fast and accurate integrated nested Laplace approximation (INLA). PMID:27468328

  5. Differential expression analysis for RNAseq using Poisson mixed models

    PubMed Central

    Sun, Shiquan; Hood, Michelle; Scott, Laura; Peng, Qinke; Mukherjee, Sayan; Tung, Jenny; Zhou, Xiang

    2017-01-01

    Identifying differentially expressed (DE) genes from RNA sequencing (RNAseq) studies is among the most common analyses in genomics. However, RNAseq DE analysis presents several statistical and computational challenges, including over-dispersed read counts and, in some settings, sample non-independence. Previous count-based methods rely on simple hierarchical Poisson models (e.g. negative binomial) to model independent over-dispersion, but do not account for sample non-independence due to relatedness, population structure and/or hidden confounders. Here, we present a Poisson mixed model with two random effects terms that account for both independent over-dispersion and sample non-independence. We also develop a scalable sampling-based inference algorithm using a latent variable representation of the Poisson distribution. With simulations, we show that our method properly controls for type I error and is generally more powerful than other widely used approaches, except in small samples (n <15) with other unfavorable properties (e.g. small effect sizes). We also apply our method to three real datasets that contain related individuals, population stratification or hidden confounders. Our results show that our method increases power in all three data sets compared to other approaches, though the power gain is smallest in the smallest sample (n = 6). Our method is implemented in MACAU, freely available at www.xzlab.org/software.html. PMID:28369632

  6. Is a 'convenience' sample useful for estimating immunization coverage in a small population?

    PubMed

    Weir, Jean E; Jones, Carrie

    2008-01-01

    Rapid survey methodologies are widely used for assessing immunization coverage in developing countries, approximating true stratified random sampling. Non-random ('convenience') sampling is not considered appropriate for estimating immunization coverage rates but has the advantages of low cost and expediency. We assessed the validity of a convenience sample of children presenting to a travelling clinic by comparing the coverage rate in the convenience sample to the true coverage established by surveying each child in three villages in rural Papua New Guinea. The rate of DTP immunization coverage as estimated by the convenience sample was within 10% of the true coverage when the proportion of children in the sample was two-thirds or when only children over the age of one year were counted, but differed by 11% when the sample included only 53% of the children and when all eligible children were included. The convenience sample may be sufficiently accurate for reporting purposes and is useful for identifying areas of low coverage.
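A toy simulation makes the convenience-versus-census comparison in record 6 concrete. All numbers here are invented (a village of 300 children, 70% true coverage, and an assumed attendance pattern in which immunized children are somewhat more likely to reach a travelling clinic); the point is only that a non-random sample can land close to the true rate while still carrying a selection bias.

```python
import random

random.seed(1)

# Hypothetical village: 300 children, true immunization coverage ~70%.
children = [{"immunized": random.random() < 0.70} for _ in range(300)]
true_coverage = sum(c["immunized"] for c in children) / len(children)

# Convenience sample: immunized children are assumed slightly more likely
# to be brought to a travelling clinic (attendance probabilities made up).
clinic = [c for c in children
          if random.random() < (0.75 if c["immunized"] else 0.60)]
estimate = sum(c["immunized"] for c in clinic) / len(clinic)

print(f"true coverage {true_coverage:.1%}, convenience estimate {estimate:.1%}")
```

Re-running with different attendance assumptions shows how the gap between estimate and truth grows as attendance becomes more strongly linked to immunization status.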

  7. Methods for estimating population density in data-limited areas: evaluating regression and tree-based models in Peru.

    PubMed

    Anderson, Weston; Guikema, Seth; Zaitchik, Ben; Pan, William

    2014-01-01

    Obtaining accurate small area estimates of population is essential for policy and health planning but is often difficult in countries with limited data. In lieu of available population data, small area estimate models draw information from previous time periods or from similar areas. This study focuses on model-based methods for estimating population when no direct samples are available in the area of interest. To explore the efficacy of tree-based models for estimating population density, we compare six different model structures including Random Forest and Bayesian Additive Regression Trees. Results demonstrate that without information from prior time periods, non-parametric tree-based models produced more accurate predictions than did conventional regression methods. Improving estimates of population density in non-sampled areas is important for regions with incomplete census data and has implications for economic, health and development policies.

  8. Methods for Estimating Population Density in Data-Limited Areas: Evaluating Regression and Tree-Based Models in Peru

    PubMed Central

    Anderson, Weston; Guikema, Seth; Zaitchik, Ben; Pan, William

    2014-01-01

    Obtaining accurate small area estimates of population is essential for policy and health planning but is often difficult in countries with limited data. In lieu of available population data, small area estimate models draw information from previous time periods or from similar areas. This study focuses on model-based methods for estimating population when no direct samples are available in the area of interest. To explore the efficacy of tree-based models for estimating population density, we compare six different model structures including Random Forest and Bayesian Additive Regression Trees. Results demonstrate that without information from prior time periods, non-parametric tree-based models produced more accurate predictions than did conventional regression methods. Improving estimates of population density in non-sampled areas is important for regions with incomplete census data and has implications for economic, health and development policies. PMID:24992657

  9. "At Risk of Harm"? An Exploratory Survey of School Counsellors in the UK, Their Perceptions of Confidentiality, Information Sharing and Risk Management

    ERIC Educational Resources Information Center

    Jenkins, Peter; Palmer, Joanne

    2012-01-01

    The primary objective of this study was to explore perceptions of UK school counsellors of confidentiality and information sharing in therapeutic work with children and young people, using qualitative methods. The research design employed a two-stage process, using questionnaires and follow-up interviews, with a small, non-random sample of school…

  10. Inflation with a graceful exit in a random landscape

    NASA Astrophysics Data System (ADS)

    Pedro, F. G.; Westphal, A.

    2017-03-01

    We develop a stochastic description of small-field inflationary histories with a graceful exit in a random potential whose Hessian is a Gaussian random matrix as a model of the unstructured part of the string landscape. The dynamical evolution in such a random potential from a small-field inflation region towards a viable late-time de Sitter (dS) minimum maps to the dynamics of Dyson Brownian motion describing the relaxation of non-equilibrium eigenvalue spectra in random matrix theory. We analytically compute the relaxation probability in a saddle point approximation of the partition function of the eigenvalue distribution of the Wigner ensemble describing the mass matrices of the critical points. When applied to small-field inflation in the landscape, this leads to an exponentially strong bias against small-field ranges and an upper bound N ≪ 10 on the number of light fields N participating during inflation from the non-observation of negative spatial curvature.

  11. Differential expression analysis for RNAseq using Poisson mixed models.

    PubMed

    Sun, Shiquan; Hood, Michelle; Scott, Laura; Peng, Qinke; Mukherjee, Sayan; Tung, Jenny; Zhou, Xiang

    2017-06-20

    Identifying differentially expressed (DE) genes from RNA sequencing (RNAseq) studies is among the most common analyses in genomics. However, RNAseq DE analysis presents several statistical and computational challenges, including over-dispersed read counts and, in some settings, sample non-independence. Previous count-based methods rely on simple hierarchical Poisson models (e.g. negative binomial) to model independent over-dispersion, but do not account for sample non-independence due to relatedness, population structure and/or hidden confounders. Here, we present a Poisson mixed model with two random effects terms that account for both independent over-dispersion and sample non-independence. We also develop a scalable sampling-based inference algorithm using a latent variable representation of the Poisson distribution. With simulations, we show that our method properly controls for type I error and is generally more powerful than other widely used approaches, except in small samples (n <15) with other unfavorable properties (e.g. small effect sizes). We also apply our method to three real datasets that contain related individuals, population stratification or hidden confounders. Our results show that our method increases power in all three data sets compared to other approaches, though the power gain is smallest in the smallest sample (n = 6). Our method is implemented in MACAU, freely available at www.xzlab.org/software.html. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
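The over-dispersion that motivates records 5 and 11 can be demonstrated without any genomics machinery. The sketch below is not the MACAU model; it is a minimal stand-in in which a log-normal random effect multiplies a Poisson rate, which is enough to make the variance exceed the mean the way real RNAseq read counts do.

```python
import math
import random
import statistics

random.seed(2)

def poisson(lam):
    """Draw one Poisson(lam) variate (Knuth's multiplication algorithm)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

# Counts with a log-normal random effect on the rate: a toy stand-in for
# a per-sample random-effects term absorbing extra-Poisson variation.
base_rate = 5.0
overdispersed = [poisson(base_rate * math.exp(random.gauss(0, 0.5)))
                 for _ in range(5000)]
plain = [poisson(base_rate) for _ in range(5000)]

print("plain Poisson mean/var:", statistics.mean(plain), statistics.variance(plain))
print("mixed model   mean/var:", statistics.mean(overdispersed),
      statistics.variance(overdispersed))
```

For the plain Poisson draws the variance tracks the mean; the random effect pushes the variance well above the mean, the signature that simple Poisson likelihoods miss.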

  12. Do Personality Problems Improve During Psychodynamic Supportive-Expressive Psychotherapy? Secondary Outcome Results From a Randomized Controlled Trial for Psychiatric Outpatients with Personality Disorders

    PubMed Central

    Vinnars, Bo; Thormählen, Barbro; Gallop, Robert; Norén, Kristina; Barber, Jacques P.

    2009-01-01

    Studies involving patients with personality disorders (PD) have not focused on improvement of core aspects of the PD. This paper examines changes in quality of object relations, interpersonal problems, psychological mindedness, and personality traits in a sample of 156 patients with DSM-IV PD diagnoses randomized to either manualized or non-manualized dynamic psychotherapy. Effect sizes adjusted for symptomatic change and reliable change indices were calculated. We found that both treatments were equally effective at reducing personality pathology. Only in neuroticism did the non-manualized group do better during the follow-up period. The largest improvement was found in quality of object relations. For the remaining variables, only small and clinically insignificant magnitudes of change were found. PMID:20161588

  13. Resampling methods in Microsoft Excel® for estimating reference intervals

    PubMed Central

    Theodorsson, Elvar

    2015-01-01

    Computer-intensive resampling/bootstrap methods are feasible when calculating reference intervals from non-Gaussian or small reference samples. Microsoft Excel® in version 2010 or later includes native functions, which lend themselves well to this purpose, including recommended interpolation procedures for estimating 2.5 and 97.5 percentiles.
The purpose of this paper is to introduce the reader to resampling estimation techniques in general and in using Microsoft Excel® 2010 for the purpose of estimating reference intervals in particular.
Parametric methods are preferable to resampling methods when the distribution of observations in the reference samples is Gaussian or can be transformed to that distribution, even when the number of reference samples is less than 120. Resampling methods are appropriate when the distribution of data from the reference samples is non-Gaussian and when the number of reference individuals and corresponding samples is on the order of 40. At least 500-1000 random samples with replacement should be taken from the results of measurement of the reference samples. PMID:26527366
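The recipe in records 13 and 14 translates directly into a few lines of code. The sketch below uses plain Python rather than Excel, with a made-up skewed reference sample of 40 values; the percentile routine mimics a linear-interpolation scheme of the kind the paper recommends, and the bootstrap uses 1000 resamples, within the 500-1000 range the abstract advises.

```python
import random
import statistics

random.seed(3)

# Hypothetical skewed reference sample of 40 measurements (values invented).
reference = [random.lognormvariate(1.0, 0.4) for _ in range(40)]

def percentile(sorted_xs, p):
    """Linear-interpolation percentile on a pre-sorted list."""
    idx = p * (len(sorted_xs) - 1)
    lo, frac = int(idx), idx - int(idx)
    if lo + 1 >= len(sorted_xs):
        return sorted_xs[-1]
    return sorted_xs[lo] * (1 - frac) + sorted_xs[lo + 1] * frac

# Bootstrap: resample with replacement, estimate the 2.5th and 97.5th
# percentiles of each resample, then average the bootstrap estimates.
boot_low, boot_high = [], []
for _ in range(1000):
    resample = sorted(random.choices(reference, k=len(reference)))
    boot_low.append(percentile(resample, 0.025))
    boot_high.append(percentile(resample, 0.975))

print(f"reference interval ≈ {statistics.mean(boot_low):.2f}"
      f" - {statistics.mean(boot_high):.2f}")
```

The same bootstrap replicates can also be used to attach confidence limits to each interval endpoint, which is the main practical payoff over a single direct percentile estimate from 40 observations.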

  14. Resampling methods in Microsoft Excel® for estimating reference intervals.

    PubMed

    Theodorsson, Elvar

    2015-01-01

    Computer-intensive resampling/bootstrap methods are feasible when calculating reference intervals from non-Gaussian or small reference samples. Microsoft Excel® in version 2010 or later includes native functions, which lend themselves well to this purpose, including recommended interpolation procedures for estimating 2.5 and 97.5 percentiles.
The purpose of this paper is to introduce the reader to resampling estimation techniques in general and in using Microsoft Excel® 2010 for the purpose of estimating reference intervals in particular.
Parametric methods are preferable to resampling methods when the distribution of observations in the reference samples is Gaussian or can be transformed to that distribution, even when the number of reference samples is less than 120. Resampling methods are appropriate when the distribution of data from the reference samples is non-Gaussian and when the number of reference individuals and corresponding samples is on the order of 40. At least 500-1000 random samples with replacement should be taken from the results of measurement of the reference samples.

  15. A two-stage Monte Carlo approach to the expression of uncertainty with finite sample sizes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crowder, Stephen Vernon; Moyer, Robert D.

    2005-05-01

    Proposed supplement I to the GUM outlines a 'propagation of distributions' approach to deriving the distribution of a measurand for any non-linear function and for any set of random inputs. The supplement's proposed Monte Carlo approach assumes that the distributions of the random inputs are known exactly. This implies that the sample sizes are effectively infinite. In this case, the mean of the measurand can be determined precisely using a large number of Monte Carlo simulations. In practice, however, the distributions of the inputs will rarely be known exactly, but must be estimated using possibly small samples. If these approximated distributions are treated as exact, the uncertainty in estimating the mean is not properly taken into account. In this paper, we propose a two-stage Monte Carlo procedure that explicitly takes into account the finite sample sizes used to estimate parameters of the input distributions. We will illustrate the approach with a case study involving the efficiency of a thermistor mount power sensor. The performance of the proposed approach will be compared to the standard GUM approach for finite samples using simple non-linear measurement equations. We will investigate performance in terms of coverage probabilities of derived confidence intervals.
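The two-stage idea in record 15 can be sketched generically. Nothing below comes from the paper itself: the measurement equation (y = x²), the sample size, and all numbers are placeholders. The single-stage loop treats the estimated input distribution as exact, as the supplement assumes; the two-stage loop adds an outer bootstrap over the small sample so that parameter uncertainty propagates too.

```python
import random
import statistics

random.seed(4)

# A small observed sample (n=10) of the input quantity; in the paper's case
# study this would be thermistor-mount sensor data, here it is simulated.
observed = [random.gauss(10.0, 0.5) for _ in range(10)]

def measurand(x):
    return x * x  # any non-linear measurement equation

# Single-stage: propagate using the sample mean and standard deviation
# as if they were the true input-distribution parameters.
mu, sd = statistics.mean(observed), statistics.stdev(observed)
single = [measurand(random.gauss(mu, sd)) for _ in range(2000)]

# Two-stage: the outer loop resamples the finite sample to get plausible
# parameter values; the inner loop propagates each of them.
two_stage = []
for _ in range(200):                      # outer: parameter uncertainty
    boot = random.choices(observed, k=len(observed))
    b_mu, b_sd = statistics.mean(boot), statistics.stdev(boot)
    two_stage += [measurand(random.gauss(b_mu, b_sd)) for _ in range(10)]

print("single-stage sd:", statistics.stdev(single))
print("two-stage sd:  ", statistics.stdev(two_stage))
```

The two-stage spread typically comes out wider, reflecting the extra uncertainty from estimating the input distribution's parameters from only ten observations.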

  16. Experiments in fault tolerant software reliability

    NASA Technical Reports Server (NTRS)

    Mcallister, David F.; Tai, K. C.; Vouk, Mladen A.

    1987-01-01

    The reliability of voting was evaluated in a fault-tolerant software system for small output spaces. The effectiveness of the back-to-back testing process was investigated. Version 3.0 of the RSDIMU-ATS, a semi-automated test bed for certification testing of RSDIMU software, was prepared and distributed. Software reliability estimation methods based on non-random sampling are being studied. The investigation of existing fault-tolerance models was continued and formulation of new models was initiated.

  17. Acute, Five- and Ten-Day Inhalation Study of Hydroprocessed Esters and Fatty Acids-Mixed Fats (HEFA-F) Jet Fuel

    DTIC Science & Technology

    2012-09-01

    …exposed rats were not consistently significant compared to control groups and were considered random due to small group sizes. In urine samples… Alternative fuels may be developed by synthesis from simpler molecules (e.g., natural gas) or by refining from non-petroleum sources (e.g…

  18. Systematic lymphadenectomy versus sampling of ipsilateral mediastinal lymph-nodes during lobectomy for non-small-cell lung cancer: a systematic review of randomized trials and a meta-analysis.

    PubMed

    Mokhles, Sahar; Macbeth, Fergus; Treasure, Tom; Younes, Riad N; Rintoul, Robert C; Fiorentino, Francesca; Bogers, Ad J J C; Takkenberg, Johanna J M

    2017-06-01

    To re-examine the evidence for recommendations for complete dissection versus sampling of ipsilateral mediastinal lymph nodes during lobectomy for cancer. We searched for randomized trials of systematic mediastinal lymphadenectomy versus mediastinal sampling. We performed a textual analysis of the authors' own starting assumptions and conclusion. We analysed the trial designs and risk of bias. We extracted data on early mortality, perioperative complications, overall survival, local recurrence and distant recurrence for meta-analysis. We found five randomized controlled trials recruiting 1980 patients spanning 1989-2007. The expressed starting position in 3/5 studies was a conviction that systematic dissection was effective. Long-term survival was better with lymphadenectomy compared with sampling (Hazard Ratio 0.78; 95% CI 0.69-0.89) as was perioperative survival (Odds Ratio 0.59; 95% CI 0.25-1.36, non-significant). But there was an overall high risk of bias and a lack of intention to treat analysis. There were higher rates (non-significant) of perioperative complications including bleeding, chylothorax and recurrent nerve palsy with lymphadenectomy. The high risk of bias in these trials makes the overall conclusion insecure. The finding of clinically important surgically related morbidities but lower perioperative mortality with lymphadenectomy seems inconsistent. The multiple variables in patients, cancers and available treatments suggest that large pragmatic multicentre trials, testing currently available strategies, are the best way to find out which are more effective. The number of patients affected with lung cancer makes trials feasible. © The Author 2017. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  19. Detecting spatial structures in throughfall data: The effect of extent, sample size, sampling design, and variogram estimation method

    NASA Astrophysics Data System (ADS)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander

    2016-09-01

    In the last decades, an increasing number of studies analyzed spatial patterns in throughfall by means of variograms. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and a layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation method on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with large outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling) and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments (non-robust and robust estimators) and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the number recommended by studies dealing with Gaussian data by up to 100 %. 
    Given that most previous throughfall studies relied on method-of-moments variogram estimation and sample sizes ≪200, currently available data are prone to large uncertainties.
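For readers unfamiliar with the method-of-moments estimator discussed in record 19, here is a minimal sketch on a synthetic 1-D transect. The field values are generated from an arbitrary autoregressive process (not throughfall data); the Matheron estimator averages half the squared differences over all point pairs at each lag.

```python
import random

random.seed(5)

# Hypothetical 1-D transect: 100 collectors at 1 m spacing carrying a
# short-range correlated signal plus measurement noise (purely synthetic).
n = 100
trend = [0.0] * n
for i in range(1, n):
    trend[i] = 0.8 * trend[i - 1] + random.gauss(0, 1)
z = [10 + t + random.gauss(0, 0.3) for t in trend]

def empirical_variogram(z, max_lag):
    """Method-of-moments (Matheron) estimator: for each lag h, half the
    mean squared increment over all pairs separated by h."""
    gamma = {}
    for h in range(1, max_lag + 1):
        sq = [(z[i] - z[i + h]) ** 2 for i in range(len(z) - h)]
        gamma[h] = 0.5 * sum(sq) / len(sq)
    return gamma

gamma = empirical_variogram(z, 10)
for h, g in gamma.items():
    print(f"lag {h:2d} m: semivariance {g:.2f}")
```

Because each squared increment enters the average directly, a single large outlier inflates the estimate at every lag it touches, which is why the study compares this estimator against robust and likelihood-based alternatives for skewed throughfall data.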

  20. Emergence of superconductivity and magnetic ordering tuned by Fe-vacancy in alkali-metal Fe chalcogenides RbxFe2-ySe2

    NASA Astrophysics Data System (ADS)

    Kobayashi, Yoshiaki; Kototani, Shouhei; Itoh, Masayuki; Sato, Masatoshi

    2014-12-01

    Samples of RbxFe2-ySe2 exhibiting superconductivity [superconducting (SC) samples] undergo a phase separation into two phases, a Fe-vacancy-ordered phase with an antiferromagnetic (AFM) transition at TN1~500 K (AFM1 phase) and a phase with little Fe vacancy and a SC transition at Tc~30 K (SC phase). Samples of RbxFe2-ySe2 exhibiting no SC behaviour (non-SC samples) are phase-separated into three phases: the AFM1 phase, another AFM phase with TN2~150 K (AFM2 phase), and a paramagnetic phase with no SC transition (paramagnetic non-SC phase). In this paper, we present the experimental results of magnetic susceptibility, electrical resistivity, and NMR measurements on single crystals of RbxFe2-ySe2 to reveal the physical properties of these co-existing phases in the SC and non-SC samples. The 87Rb and 77Se NMR spectra show that the Fe-vacancy concentration is very small in the Fe planes of the SC phase, whereas the AFM2 and paramagnetic non-SC phases in non-SC samples contain a larger amount of Fe vacancies. The randomness induced by the Fe vacancies in the non-SC samples makes the AFM2 and paramagnetic non-SC phases insulating/semiconducting and magnetically active, resulting in the absence of superconductivity in RbxFe2-ySe2.

  21. Sampling Strategies for Evaluating the Rate of Adventitious Transgene Presence in Non-Genetically Modified Crop Fields.

    PubMed

    Makowski, David; Bancal, Rémi; Bensadoun, Arnaud; Monod, Hervé; Messéan, Antoine

    2017-09-01

    According to E.U. regulations, the maximum allowable rate of adventitious transgene presence in non-genetically modified (GM) crops is 0.9%. We compared four sampling methods for the detection of transgenic material in agricultural non-GM maize fields: random sampling, stratified sampling, random sampling + ratio reweighting, random sampling + regression reweighting. Random sampling involves simply sampling maize grains from different locations selected at random from the field concerned. The stratified and reweighting sampling methods make use of an auxiliary variable corresponding to the output of a gene-flow model (a zero-inflated Poisson model) simulating cross-pollination as a function of wind speed, wind direction, and distance to the closest GM maize field. With the stratified sampling method, an auxiliary variable is used to define several strata with contrasting transgene presence rates, and grains are then sampled at random from each stratum. With the two methods involving reweighting, grains are first sampled at random from various locations within the field, and the observations are then reweighted according to the auxiliary variable. Data collected from three maize fields were used to compare the four sampling methods, and the results were used to determine the extent to which transgene presence rate estimation was improved by the use of stratified and reweighting sampling methods. We found that transgene rate estimates were more accurate and that substantially smaller samples could be used with sampling strategies based on an auxiliary variable derived from a gene-flow model. © 2017 Society for Risk Analysis.
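A stripped-down version of the comparison in the sampling-strategies record above (stratified versus simple random sampling driven by an auxiliary variable): the field, the gene-flow proxy, and all rates below are invented, and the study's zero-inflated Poisson gene-flow model is replaced by a crude two-level edge/interior score, but the estimator structure (stratum means weighted by stratum sizes) is the same.

```python
import random

random.seed(6)

# Hypothetical non-GM maize field of 10,000 grains. An auxiliary score
# (here: proximity to the GM field edge) drives the true adventitious
# presence rate, mimicking a cross-pollination gradient.
field = []
for _ in range(10_000):
    near_edge = random.random() < 0.2
    p_gm = 0.03 if near_edge else 0.001
    field.append({"near_edge": near_edge, "gm": random.random() < p_gm})

true_rate = sum(g["gm"] for g in field) / len(field)

# Simple random sampling of 400 grains.
srs = random.sample(field, 400)
srs_rate = sum(g["gm"] for g in srs) / len(srs)

# Stratified sampling: form two strata from the auxiliary variable, sample
# each, and weight the stratum estimates by stratum size.
edge = [g for g in field if g["near_edge"]]
interior = [g for g in field if not g["near_edge"]]
est = 0.0
for stratum in (edge, interior):
    sample = random.sample(stratum, 200)
    est += (len(stratum) / len(field)) * (sum(g["gm"] for g in sample) / 200)

print(f"true {true_rate:.3%}, SRS {srs_rate:.3%}, stratified {est:.3%}")
```

Averaging both estimators over many repetitions would show the stratified estimate clustering more tightly around the true rate, which is why an auxiliary gene-flow variable allows smaller samples for the same accuracy.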

  22. Sampling versus systematic full lymphatic dissection in surgical treatment of non-small cell lung cancer.

    PubMed

    Koulaxouzidis, Georgios; Karagkiouzis, Grigorios; Konstantinou, Marios; Gkiozos, Ioannis; Syrigos, Konstantinos

    2013-04-22

    The extent of mediastinal lymph node assessment during surgery for non-small cell lung cancer remains controversial. Different techniques are used, ranging from simple visual inspection of the unopened mediastinum to an extended bilateral lymph node dissection. Furthermore, different terms are used to define these techniques. Sampling is the removal of one or more lymph nodes under the guidance of pre-operative findings. Systematic (full) nodal dissection is the removal of all mediastinal tissue containing the lymph nodes systematically within anatomical landmarks. A Medline search was conducted to identify articles in the English language that addressed the role of mediastinal lymph node resection in the treatment of non-small cell lung cancer. Opinions as to the reasons for favoring full lymphatic dissection include complete resection, improved nodal staging and better local control due to resection of undetected micrometastasis. Arguments against routine full lymphatic dissection are increased morbidity, increase in operative time, and lack of evidence of improved survival. For complete resection of non-small cell lung cancer, many authors recommend a systematic nodal dissection as the standard approach during surgery, and suggest that this provides both adequate nodal staging and guarantees complete resection. Whether extending the lymph node dissection influences survival or recurrence rate is still not known. There are valid arguments in favor in terms not only of an improved local control but also of an improved long-term survival. However, the impact of lymph node dissection on long-term survival should be further assessed by large-scale multicenter randomized trials.

  23. Radiomics-based Prognosis Analysis for Non-Small Cell Lung Cancer

    NASA Astrophysics Data System (ADS)

    Zhang, Yucheng; Oikonomou, Anastasia; Wong, Alexander; Haider, Masoom A.; Khalvati, Farzad

    2017-04-01

    Radiomics characterizes tumor phenotypes by extracting large numbers of quantitative features from radiological images. Radiomic features have been shown to provide prognostic value in predicting clinical outcomes in several studies. However, several challenges including feature redundancy, unbalanced data, and small sample sizes have led to relatively low predictive accuracy. In this study, we explore different strategies for overcoming these challenges and improving predictive performance of radiomics-based prognosis for non-small cell lung cancer (NSCLC). CT images of 112 patients (mean age 75 years) with NSCLC who underwent stereotactic body radiotherapy were used to predict recurrence, death, and recurrence-free survival using a comprehensive radiomics analysis. Different feature selection and predictive modeling techniques were used to determine the optimal configuration of prognosis analysis. To address feature redundancy, comprehensive analysis indicated that Random Forest models and Principal Component Analysis were optimum predictive modeling and feature selection methods, respectively, for achieving high prognosis performance. To address unbalanced data, Synthetic Minority Over-sampling technique was found to significantly increase predictive accuracy. A full analysis of variance showed that data endpoints, feature selection techniques, and classifiers were significant factors in affecting predictive accuracy, suggesting that these factors must be investigated when building radiomics-based predictive models for cancer prognosis.

  4. A randomized, placebo-controlled proof-of-concept trial of adjunctive topiramate for alcohol use disorders in bipolar disorder.

    PubMed

    Sylvia, Louisa G; Gold, Alexandra K; Stange, Jonathan P; Peckham, Andrew D; Deckersbach, Thilo; Calabrese, Joseph R; Weiss, Roger D; Perlis, Roy H; Nierenberg, Andrew A; Ostacher, Michael J

    2016-03-01

    Topiramate is effective for alcohol use disorders (AUDs) among non-psychiatric patients. We examined topiramate for treating comorbid AUDs in bipolar disorder (BD). Twelve participants were randomized to topiramate or placebo for 12 weeks. The topiramate group, with two out of five participants (40%) completing treatment, experienced less improvement in drinking patterns than the placebo group, with five out of seven participants (71%) completing treatment. Topiramate did not improve drinking behavior and was not well-tolerated. This study failed to recruit adequately. Problems surrounding high attrition, a small study sample, and missing data preclude interpretation of study findings. This is the first randomized, placebo-controlled trial of topiramate for AUDs in BD. © American Academy of Addiction Psychiatry.

  5. Post-traumatic stress disorder in older adults: a systematic review of the psychotherapy treatment literature.

    PubMed

    Dinnen, Stephanie; Simiola, Vanessa; Cook, Joan M

    2015-01-01

    Older adults represent the fastest growing segment of the US and industrialized populations. However, older adults have generally not been included in randomized clinical trials of psychotherapy for post-traumatic stress disorder (PTSD). This review examined reports of psychological treatment for trauma-related problems, primarily PTSD, in studies with samples of at least 50% adults aged 55 and older using standardized measures. A systematic review of the literature was conducted on psychotherapy for PTSD with older adults using PubMed, Medline, PsycINFO, CINAHL, PILOTS, and Google Scholar. A total of 42 studies were retrieved for full review; 22 were excluded because they did not provide at least one outcome measure or results were not reported by age in the case of mixed-age samples. Of the 20 studies that met review criteria, there were: 13 case studies or series, three uncontrolled pilot studies, two randomized clinical trials, one non-randomized concurrent control study and one post hoc effectiveness study. Significant methodological limitations in the current older adult PTSD treatment outcome literature were found, reducing its internal validity and generalizability, including non-randomized research designs, lack of comparison conditions and small sample sizes. Select evidence-based interventions validated in younger and middle-aged populations appear acceptable and efficacious with older adults. There are few treatment studies on subsets of the older adult population including cultural and ethnic minorities, women, the oldest old (over 85), and those who are cognitively impaired. Implications for clinical practice and future research directions are discussed.

  6. Signaling protein signature predicts clinical outcome of non-small-cell lung cancer.

    PubMed

    Jin, Bao-Feng; Yang, Fan; Ying, Xiao-Min; Gong, Lin; Hu, Shuo-Feng; Zhao, Qing; Liao, Yi-Da; Chen, Ke-Zhong; Li, Teng; Tai, Yan-Hong; Cao, Yuan; Li, Xiao; Huang, Yan; Zhan, Xiao-Yan; Qin, Xuan-He; Wu, Jin; Chen, Shuai; Guo, Sai-Sai; Zhang, Yu-Cheng; Chen, Jing; Shen, Dan-Hua; Sun, Kun-Kun; Chen, Lu; Li, Wei-Hua; Li, Ai-Ling; Wang, Na; Xia, Qing; Wang, Jun; Zhou, Tao

    2018-03-06

    Non-small-cell lung cancer (NSCLC) is characterized by abnormalities of numerous signaling proteins that play pivotal roles in cancer development and progression. Many of these proteins have been reported to be correlated with clinical outcomes of NSCLC. However, none of them could provide adequate accuracy of prognosis prediction in clinical application. A total of 384 resected NSCLC specimens from two hospitals in Beijing (BJ) and Chongqing (CQ) were collected. Using immunohistochemistry (IHC) staining on stored formalin-fixed paraffin-embedded (FFPE) surgical samples, we examined the expression levels of 75 critical proteins on BJ samples. Random forest algorithm (RFA) and support vector machines (SVM) computation were applied to identify protein signatures on 2/3 randomly assigned BJ samples. The identified signatures were tested on the remaining BJ samples, and were further validated with CQ independent cohort. A 6-protein signature for adenocarcinoma (ADC) and a 5-protein signature for squamous cell carcinoma (SCC) were identified from training sets and tested in testing sets. In independent validation with CQ cohort, patients can also be divided into high- and low-risk groups with significantly different median overall survival by Kaplan-Meier analysis, both in ADC (31 months vs. 87 months, HR 2.81; P < 0.001) and SCC patients (27 months vs. not reached, HR 9.97; P < 0.001). Cox regression analysis showed that both signatures are independent prognostic indicators and outperformed TNM staging (ADC: adjusted HR 3.07 vs. 2.43, SCC: adjusted HR 7.84 vs. 2.24). Particularly, we found that only the ADC patients in high-risk group significantly benefited from adjuvant chemotherapy (P = 0.018). Both ADC and SCC protein signatures could effectively stratify the prognosis of NSCLC patients, and may support patient selection for adjuvant chemotherapy.

  7. Sampling in health geography: reconciling geographical objectives and probabilistic methods. An example of a health survey in Vientiane (Lao PDR)

    PubMed Central

    Vallée, Julie; Souris, Marc; Fournet, Florence; Bochaton, Audrey; Mobillion, Virginie; Peyronnie, Karine; Salem, Gérard

    2007-01-01

    Background Geographical objectives and probabilistic methods are difficult to reconcile in a single health survey. Probabilistic methods focus on individuals to provide estimates of a variable's prevalence with a certain precision, while geographical approaches emphasise the selection of specific areas to study interactions between spatial characteristics and health outcomes. A sample selected from a small number of specific areas creates statistical challenges: the observations are not independent at the local level, and this results in poor statistical validity at the global level. Therefore, it is difficult to construct a sample that is appropriate for both geographical and probability methods. Methods We used a two-stage selection procedure with a first non-random stage of selection of clusters. Instead of randomly selecting clusters, we deliberately chose a group of clusters, which as a whole would contain all the variation in health measures in the population. As there was no health information available before the survey, we selected a priori determinants that can influence the spatial homogeneity of the health characteristics. This method yields a distribution of variables in the sample that closely resembles that in the overall population, something that cannot be guaranteed with randomly-selected clusters, especially if the number of selected clusters is small. In this way, we were able to survey specific areas while minimising design effects and maximising statistical precision. Application We applied this strategy in a health survey carried out in Vientiane, Lao People's Democratic Republic. We selected well-known health determinants with unequal spatial distribution within the city: nationality and literacy. We deliberately selected a combination of clusters whose distribution of nationality and literacy is similar to the distribution in the general population.
Conclusion This paper describes the conceptual reasoning behind the construction of the survey sample and shows that it can be advantageous to choose clusters using reasoned hypotheses, based on both probability and geographical approaches, in contrast to a conventional, random cluster selection strategy. PMID:17543100

  8. Sampling in health geography: reconciling geographical objectives and probabilistic methods. An example of a health survey in Vientiane (Lao PDR).

    PubMed

    Vallée, Julie; Souris, Marc; Fournet, Florence; Bochaton, Audrey; Mobillion, Virginie; Peyronnie, Karine; Salem, Gérard

    2007-06-01

    Geographical objectives and probabilistic methods are difficult to reconcile in a single health survey. Probabilistic methods focus on individuals to provide estimates of a variable's prevalence with a certain precision, while geographical approaches emphasise the selection of specific areas to study interactions between spatial characteristics and health outcomes. A sample selected from a small number of specific areas creates statistical challenges: the observations are not independent at the local level, and this results in poor statistical validity at the global level. Therefore, it is difficult to construct a sample that is appropriate for both geographical and probability methods. We used a two-stage selection procedure with a first non-random stage of selection of clusters. Instead of randomly selecting clusters, we deliberately chose a group of clusters, which as a whole would contain all the variation in health measures in the population. As there was no health information available before the survey, we selected a priori determinants that can influence the spatial homogeneity of the health characteristics. This method yields a distribution of variables in the sample that closely resembles that in the overall population, something that cannot be guaranteed with randomly-selected clusters, especially if the number of selected clusters is small. In this way, we were able to survey specific areas while minimising design effects and maximising statistical precision. We applied this strategy in a health survey carried out in Vientiane, Lao People's Democratic Republic. We selected well-known health determinants with unequal spatial distribution within the city: nationality and literacy. We deliberately selected a combination of clusters whose distribution of nationality and literacy is similar to the distribution in the general population.
This paper describes the conceptual reasoning behind the construction of the survey sample and shows that it can be advantageous to choose clusters using reasoned hypotheses, based on both probability and geographical approaches, in contrast to a conventional, random cluster selection strategy.
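    The deliberate-cluster-selection idea above can be illustrated with a toy search: among all combinations of k candidate clusters, pick the one whose pooled covariate rates best match the city-wide rates. All cluster sizes and covariate values below are invented for illustration; a real survey would use census or registry figures.

```python
# Toy deliberate cluster selection: choose the combination of k clusters
# whose pooled covariate distribution is closest to the whole population's.
import itertools
import numpy as np

rng = np.random.default_rng(1)
n_clusters, k = 12, 4
size = rng.integers(200, 1000, size=n_clusters)        # cluster populations
literacy = rng.uniform(0.4, 0.95, size=n_clusters)     # covariate 1
lao_national = rng.uniform(0.5, 1.0, size=n_clusters)  # covariate 2

def pooled(idx, rate):
    """Population-weighted covariate rate over a set of clusters."""
    idx = list(idx)
    return np.average(rate[idx], weights=size[idx])

# City-wide target rates, pooled over all clusters.
target = (pooled(range(n_clusters), literacy),
          pooled(range(n_clusters), lao_national))

# Exhaustive search over the 495 possible 4-cluster combinations.
best = min(itertools.combinations(range(n_clusters), k),
           key=lambda c: abs(pooled(c, literacy) - target[0])
                       + abs(pooled(c, lao_national) - target[1]))
```

    With only a handful of clusters an exhaustive search is feasible; for larger frames a greedy or stratified shortlist would replace `itertools.combinations`.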

  9. A Systematic Review of Surgical Randomized Controlled Trials: Part 2. Funding Source, Conflict of Interest, and Sample Size in Plastic Surgery.

    PubMed

    Voineskos, Sophocles H; Coroneos, Christopher J; Ziolkowski, Natalia I; Kaur, Manraj N; Banfield, Laura; Meade, Maureen O; Chung, Kevin C; Thoma, Achilleas; Bhandari, Mohit

    2016-02-01

    The authors examined industry support, conflict of interest, and sample size in plastic surgery randomized controlled trials that compared surgical interventions. They hypothesized that industry-funded trials demonstrate statistically significant outcomes more often, and randomized controlled trials with small sample sizes report statistically significant results more frequently. An electronic search identified randomized controlled trials published between 2000 and 2013. Independent reviewers assessed manuscripts and performed data extraction. Funding source, conflict of interest, primary outcome direction, and sample size were examined. Chi-squared and independent-samples t tests were used in the analysis. The search identified 173 randomized controlled trials, of which 100 (58 percent) did not acknowledge funding status. A relationship between funding source and trial outcome direction was not observed. Both funding status and conflict of interest reporting improved over time. Only 24 percent (six of 25) of industry-funded randomized controlled trials reported authors to have independent control of data and manuscript contents. The mean number of patients randomized was 73 per trial (median, 43; minimum, 3; maximum, 936). Small trials were not found to be positive more often than large trials (p = 0.87). Randomized controlled trials with small sample size were common; however, this provides great opportunity for the field to engage in further collaboration and produce larger, more definitive trials. Reporting of trial funding and conflict of interest is historically poor, but it greatly improved over the study period. Underreporting at author and journal levels remains a limitation when assessing the relationship between funding source and trial outcomes. Improved reporting and manuscript control should be goals that both authors and journals can actively achieve.

  10. Methodology Series Module 5: Sampling Strategies.

    PubMed

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'sampling method'. There are essentially two types of sampling methods: 1) probability sampling - based on chance events (such as random numbers, flipping a coin etc.); and 2) non-probability sampling - based on the researcher's choice or on whichever population is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling methods (such as simple random sampling or stratified random sampling) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and to state the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has actually used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of these results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
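    The distinction drawn above can be made concrete with a small sketch on a synthetic sampling frame (all names and sizes below are invented): a simple random sample, a stratified random sample, and a non-probability convenience sample.

```python
# Three of the sampling methods described above, on a synthetic frame.
import random

random.seed(0)
population = [{"id": i, "site": random.choice(["urban", "rural"])}
              for i in range(1000)]

# 1) Simple random sample: every element has an equal chance of selection.
srs = random.sample(population, 50)

# 2) Stratified random sample: draw within each stratum in proportion
#    to its share of the population.
strata = {}
for person in population:
    strata.setdefault(person["site"], []).append(person)
stratified = [p for members in strata.values()
              for p in random.sample(
                  members, max(1, round(50 * len(members) / len(population))))]

# 3) Convenience sample (non-probability): whoever is "accessible", here
#    simply the first 50 records -- note the selection is NOT random.
convenience = population[:50]
```

    Reporting which of these was used, rather than labelling a convenience sample a 'random sample', is exactly the transparency the module calls for.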

  11. A systematic review of treatments for anxiety in youth with autism spectrum disorders.

    PubMed

    Vasa, Roma A; Carroll, Laura M; Nozzolillo, Alixandra A; Mahajan, Rajneesh; Mazurek, Micah O; Bennett, Amanda E; Wink, Logan K; Bernal, Maria Pilar

    2014-12-01

    This study systematically examined the efficacy and safety of psychopharmacological and non-psychopharmacological treatments for anxiety in youth with autism spectrum disorders (ASD). Four psychopharmacological, nine cognitive behavioral therapy (CBT), and two alternative treatment studies met inclusion criteria. Psychopharmacological studies were descriptive or open label, sometimes did not specify the anxiety phenotype, and reported behavioral activation. Citalopram and buspirone yielded some improvement, whereas fluvoxamine did not. Non-psychopharmacological studies were mainly randomized controlled trials (RCTs), with CBT demonstrating moderate efficacy for anxiety disorders in youth with high functioning ASD. Deep pressure and neurofeedback provided some benefit. All studies were short-term and included small sample sizes. Large-scale and long-term RCTs examining psychopharmacological and non-psychopharmacological treatments are sorely needed.

  12. Methodology Series Module 5: Sampling Strategies

    PubMed Central

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the ‘sampling method’. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin etc.); and 2) non-probability sampling – based on the researcher's choice or on whichever population is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling methods (such as simple random sampling or stratified random sampling) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and to state the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term ‘random sample’ when the researcher has actually used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the ‘generalizability’ of these results. In such a scenario, the researcher may want to use ‘purposive sampling’ for the study. PMID:27688438

  13. Mercury exposure in a high fish eating Bolivian Amazonian population with intense small-scale gold-mining activities.

    PubMed

    Barbieri, Flavia Laura; Cournil, Amandine; Gardon, Jacques

    2009-08-01

    Methylmercury exposure in Amazonian communities through fish consumption has been widely documented in Brazil. There is still a lack of data in other Amazonian countries, which is why we conducted this study in the Bolivian Amazon basin. Simple random sampling was used in a small village located on the lower Beni River, where there is intense gold mining and high fish consumption. All participants were interviewed and hair samples were taken to measure total mercury concentrations. The hair mercury geometric mean in the general population was 3.02 μg/g (CI: 2.69-3.37; range: 0.42-15.65). Age and gender were not directly associated with mercury levels. Fish consumption showed a positive relation and so did occupation, especially small-scale gold mining. Hair mercury levels were lower than those found in Brazilian studies, but still higher than in non-exposed populations. It is necessary to assess mercury exposure in the Amazonian regions where data are still lacking, using a standardized indicator.

  14. Brief Report: A Randomized, Placebo-Controlled Proof-of-Concept Trial of Adjunctive Topiramate for Alcohol Use Disorders in Bipolar Disorder

    PubMed Central

    Sylvia, Louisa G.; Gold, Alexandra K.; Stange, Jonathan P.; Peckham, Andrew D.; Deckersbach, Thilo; Calabrese, Joseph R.; Weiss, Roger D.; Perlis, Roy H.; Nierenberg, Andrew A.; Ostacher, Michael J.

    2016-01-01

    Background and Objectives Topiramate is effective for alcohol use disorders (AUDs) among non-psychiatric patients. We examined topiramate for treating comorbid AUDs in bipolar disorder (BD). Methods Twelve participants were randomized to topiramate or placebo for 12 weeks. Results The topiramate group, with two out of five participants (40%) completing treatment, experienced less improvement in drinking patterns than the placebo group, with five out of seven participants (71%) completing treatment. Discussion and Conclusions Topiramate did not improve drinking behavior and was not well-tolerated. This study failed to recruit adequately. Problems surrounding high attrition, a small study sample, and missing data preclude interpretation of study findings. Scientific Significance This is the first randomized, placebo-controlled trial of topiramate for AUDs in BD. PMID:26894822

  15. Experimental Investigations of Non-Stationary Properties In Radiometer Receivers Using Measurements of Multiple Calibration References

    NASA Technical Reports Server (NTRS)

    Racette, Paul; Lang, Roger; Zhang, Zhao-Nan; Zacharias, David; Krebs, Carolyn A. (Technical Monitor)

    2002-01-01

    Radiometers must be periodically calibrated because the receiver response fluctuates. Many techniques exist to correct for the time-varying response of a radiometer receiver. An analytical technique has been developed that uses generalized least squares regression (LSR) to predict the performance of a wide variety of calibration algorithms. The total measurement uncertainty, including the uncertainty of the calibration, can be computed using LSR. The uncertainties of the calibration samples used in the regression are based upon treating the receiver fluctuations as non-stationary processes. Signals originating from the different sources of emission are treated as simultaneously existing random processes. Thus, the radiometer output is a series of samples obtained from these random processes. The samples are treated as random variables, but because the underlying processes are non-stationary, the statistics of the samples are treated as non-stationary. The statistics of the calibration samples depend upon the time for which the samples are to be applied. The statistics of the random variables are equated to the mean statistics of the non-stationary processes over the interval defined by the time of the calibration sample and when it is applied. This analysis opens the opportunity for experimental investigation into the underlying properties of receiver non-stationarity through the use of multiple calibration references. In this presentation we will discuss the application of LSR to the analysis of various calibration algorithms, requirements for experimental verification of the theory, and preliminary results from analyzing experimental measurements.
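    As a much-simplified illustration of reference-based calibration, one might fit a linear receiver response from noisy looks at two known references and invert it. This uses ordinary least squares on synthetic data as a stand-in for the generalized LSR framework the abstract describes; the gain, offset, and reference temperatures are invented.

```python
# Two-reference radiometer calibration sketch: fit gain and offset of a
# linear response (counts = gain*T + offset) by least squares, then invert.
import numpy as np

rng = np.random.default_rng(0)
T_refs = np.array([80.0, 300.0])        # cold / warm reference temps (K)
gain_true, offset_true = 2.5, 100.0     # illustrative receiver parameters

# Fifty noisy calibration looks at each reference.
T = np.repeat(T_refs, 50)
counts = gain_true * T + offset_true + rng.normal(0, 3.0, size=T.size)

# Ordinary least squares fit of the linear response.
A = np.column_stack([T, np.ones_like(T)])
(gain_est, offset_est), *_ = np.linalg.lstsq(A, counts, rcond=None)

def calibrate(c):
    """Invert the fitted response: counts -> brightness temperature (K)."""
    return (c - offset_est) / gain_est
```

    The generalized version replaces the equal-weight fit with a covariance that grows with the time separation between a calibration look and the scene measurement it corrects, which is where the non-stationary statistics enter.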

  16. Why are mixed-race people perceived as more attractive?

    PubMed

    Lewis, Michael B

    2010-01-01

    Previous small-scale studies have suggested that people of mixed race are perceived as being more attractive than non-mixed-race people. Here, it is suggested that the reason for this is the genetic process of heterosis or hybrid vigour (ie cross-bred offspring have greater genetic fitness than pure-bred offspring). A random sample of 1205 black, white, and mixed-race faces was collected. These faces were then rated for their perceived attractiveness. There was a small but highly significant effect, with mixed-race faces, on average, being perceived as more attractive. This result is seen as a perceptual demonstration of heterosis in humans, a biological process that may have implications far beyond just attractiveness.

  17. Canadian Phase III Randomized Trial of Stereotactic Body Radiotherapy Versus Conventionally Hypofractionated Radiotherapy for Stage I, Medically Inoperable Non-Small-Cell Lung Cancer - Rationale and Protocol Design for the Ontario Clinical Oncology Group (OCOG)-LUSTRE Trial.

    PubMed

    Swaminath, Anand; Wierzbicki, Marcin; Parpia, Sameer; Wright, James R; Tsakiridis, Theodoros K; Okawara, Gordon S; Kundapur, Vijayananda; Bujold, Alexis; Ahmed, Naseer; Hirmiz, Khalid; Kurien, Elizabeth; Filion, Edith; Gabos, Zsolt; Faria, Sergio; Louie, Alexander V; Owen, Timothy; Wai, Elaine; Ramchandar, Kevin; Chan, Elisa K; Julian, Jim; Cline, Kathryn; Whelan, Timothy J

    2017-03-01

    We describe a Canadian phase III randomized controlled trial of stereotactic body radiotherapy (SBRT) versus conventionally hypofractionated radiotherapy (CRT) for the treatment of stage I medically inoperable non-small-cell lung cancer (OCOG-LUSTRE Trial). Eligible patients are randomized in a 2:1 fashion to either SBRT (48 Gy in 4 fractions for peripherally located lesions; 60 Gy in 8 fractions for centrally located lesions) or CRT (60 Gy in 15 fractions). The primary outcome of the study is 3-year local control, which we hypothesize will improve from 75% with CRT to 87.5% with SBRT. With 85% power to detect a difference of this magnitude (hazard ratio = 0.46), a 2-sided α = 0.05 and a 2:1 randomization, we require a sample size of 324 patients (216 SBRT, 108 CRT). Important secondary outcomes include overall survival, disease-free survival, toxicity, radiation-related treatment death, quality of life, and cost-effectiveness. A robust radiation therapy quality assurance program has been established to ensure consistent, high-quality SBRT and CRT delivery. Despite widespread interest and adoption of SBRT, there still remains a concern regarding long-term control and risks of toxicity (particularly in patients with centrally located lesions). The OCOG-LUSTRE study is the only randomized phase III trial testing SBRT in a medically inoperable population, and its results will help determine whether the benefits of SBRT outweigh the potential risks. Copyright © 2016 Elsevier Inc. All rights reserved.
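    The design parameters above (hazard ratio 0.46, 85% power, two-sided α = 0.05, 2:1 allocation) let one sketch a back-of-envelope Schoenfeld approximation of the required number of events. This simplified calculation ignores accrual, follow-up, and dropout, so it will not reproduce the protocol's exact figure of 324 patients; it only shows the shape of the computation.

```python
# Schoenfeld approximation for the number of events needed by a two-sided
# log-rank test. Normal quantiles are hard-coded to avoid a scipy dependency.
import math

def schoenfeld_events(hr, alloc=2/3):
    """Events required to detect hazard ratio `hr` with 85% power at
    two-sided alpha = 0.05, with fraction `alloc` in the experimental arm."""
    z_alpha = 1.959964   # z_{1 - 0.05/2}
    z_beta = 1.036433    # z_{0.85}
    return (z_alpha + z_beta) ** 2 / (alloc * (1 - alloc) * math.log(hr) ** 2)

events = schoenfeld_events(hr=0.46)   # roughly 67 local-failure events
```

    Converting events to patients then requires assumptions about the local-failure probability in each arm over the follow-up window, which is where the full protocol calculation diverges from this sketch.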

  18. A Semantic Differential Evaluation of Attitudinal Outcomes of Introductory Physical Science.

    ERIC Educational Resources Information Center

    Hecht, Alfred Roland

    This study was designed to assess the attitudinal outcomes of Introductory Physical Science (IPS) curriculum materials used in schools. Random samples of 240 students receiving IPS instruction and 240 non-science students were assigned to separate Solomon four-group designs with non-equivalent control groups. Random samples of 60 traditional…

  19. Sampling in epidemiological research: issues, hazards and pitfalls.

    PubMed

    Tyrer, Stephen; Heyman, Bob

    2016-04-01

    Surveys of people's opinions are fraught with difficulties. It is easier to obtain information from those who respond to text messages or to emails than to attempt to obtain a representative sample. Samples of the population that are selected non-randomly in this way are termed convenience samples as they are easy to recruit. This introduces a sampling bias. Such non-probability samples have merit in many situations, but an epidemiological enquiry is of little value unless a random sample is obtained. If a sufficient number of those selected actually complete a survey, the results are likely to be representative of the population. This editorial describes probability and non-probability sampling methods and illustrates the difficulties and suggested solutions in performing accurate epidemiological research.

  20. Sampling in epidemiological research: issues, hazards and pitfalls

    PubMed Central

    Tyrer, Stephen; Heyman, Bob

    2016-01-01

    Surveys of people's opinions are fraught with difficulties. It is easier to obtain information from those who respond to text messages or to emails than to attempt to obtain a representative sample. Samples of the population that are selected non-randomly in this way are termed convenience samples as they are easy to recruit. This introduces a sampling bias. Such non-probability samples have merit in many situations, but an epidemiological enquiry is of little value unless a random sample is obtained. If a sufficient number of those selected actually complete a survey, the results are likely to be representative of the population. This editorial describes probability and non-probability sampling methods and illustrates the difficulties and suggested solutions in performing accurate epidemiological research. PMID:27087985

  1. Sampling Methods in Cardiovascular Nursing Research: An Overview.

    PubMed

    Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie

    2014-01-01

    Cardiovascular nursing research covers a wide array of topics from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling including simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion into the research study and include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from cardiovascular research. Understanding the differences in sampling techniques may aid nurses in effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
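    Two of the probability designs named above, systematic and cluster sampling, can be sketched on a synthetic frame; the patient IDs and clinic groupings below are invented for illustration.

```python
# Systematic and cluster sampling on a synthetic frame of 1000 patient IDs.
import random

random.seed(0)
frame = list(range(1, 1001))               # 1000 patient IDs

# Systematic sampling: a random start, then every k-th element (k = N/n).
n = 50
k = len(frame) // n
start = random.randrange(k)
systematic = frame[start::k][:n]

# Cluster sampling: randomly select whole clusters (e.g. clinics) and
# include every element of each chosen cluster.
clusters = [frame[i:i + 100] for i in range(0, len(frame), 100)]  # 10 clinics
chosen = random.sample(clusters, 3)
cluster_sample = [pid for clinic in chosen for pid in clinic]
```

    Multi-stage sampling would simply continue this process, e.g. drawing a random subsample of patients within each chosen clinic rather than taking everyone.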

  2. Psychiatric disorders among the Mapuche in Chile.

    PubMed

    Vicente, Benjamin; Kohn, Robert; Rioseco, Pedro; Saldivia, Sandra; Torres, Silverio

    2005-06-01

    The Mapuche are the largest indigenous group in Chile; yet almost all data on the mental health of indigenous populations are from North America. The study examines the differential DSM-III-R prevalence rates of psychiatric disorders and service utilization among indigenous and non-indigenous community residents. The Composite International Diagnostic Interview (CIDI) was administered to a stratified random sample of 75 Mapuche and 434 non-Mapuche residents of the province of Cautín. Lifetime prevalence and 12-month prevalence rates were estimated. Approximately 28.4% of the Mapuche population had a lifetime psychiatric disorder, and 15.7% a 12-month disorder, compared to 38.0% and 25.7%, respectively, of the non-Mapuche. Few significant differences were noted between the two groups; however, generalized anxiety disorder, simple phobia, and drug dependence were less prevalent among the Mapuche. Service utilization among the Mapuche with mental illness was low. This is a preliminary study based on a small sample size. Further research on the mental health of indigenous populations of South America is needed.

  3. Variation of mutational burden in healthy human tissues suggests non-random strand segregation and allows measuring somatic mutation rates.

    PubMed

    Werner, Benjamin; Sottoriva, Andrea

    2018-06-01

    The immortal strand hypothesis posits that stem cells could produce differentiated progeny while conserving the original template strand, thus avoiding accumulating somatic mutations. However, quantifying the extent of non-random DNA strand segregation in human stem cells remains difficult in vivo. Here we show that the change of the mean and variance of the mutational burden with age in healthy human tissues allows estimating strand segregation probabilities and somatic mutation rates. We analysed deep sequencing data from healthy human colon, small intestine, liver, skin and brain. We found highly effective non-random DNA strand segregation in all adult tissues (mean strand segregation probability: 0.98, standard error bounds (0.97, 0.99)). In contrast, non-random strand segregation efficiency is reduced to 0.87 (0.78, 0.88) in neural tissue during early development, suggesting stem cell pool expansions due to symmetric self-renewal. Healthy somatic mutation rates differed across tissue types, ranging from 3.5 × 10-9/bp/division in small intestine to 1.6 × 10-7/bp/division in skin.
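    The contrast between conserved and random strand segregation can be caricatured in a toy simulation: the more often the template strand is conserved, the more slowly mutational burden accumulates. The segregation probabilities, mutation rate, and genome size below are illustrative only, not the paper's estimates.

```python
# Toy model: burden accumulates only at divisions where the template
# ("immortal") strand is NOT conserved. Parameters are purely illustrative.
import random

def burden_after(divisions, p_keep_template, mu, n_sites, rng):
    """Mutational burden of a lineage after `divisions` divisions, keeping
    the template strand with probability `p_keep_template` per division."""
    burden = 0
    for _ in range(divisions):
        if rng.random() > p_keep_template:   # strands segregated randomly
            burden += sum(rng.random() < mu for _ in range(n_sites))
    return burden

rng = random.Random(0)
conserved = [burden_after(500, 0.98, 0.05, 100, rng) for _ in range(200)]
random_seg = [burden_after(500, 0.60, 0.05, 100, rng) for _ in range(200)]
mean_conserved = sum(conserved) / len(conserved)
mean_random = sum(random_seg) / len(random_seg)
```

    In the same spirit as the paper, comparing the mean and spread of `conserved` versus `random_seg` across lineages is what lets one back out the segregation probability from burden data.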

  4. fixedTimeEvents: An R package for the distribution of distances between discrete events in fixed time

    NASA Astrophysics Data System (ADS)

    Liland, Kristian Hovde; Snipen, Lars

    When a series of Bernoulli trials occurs within a fixed time frame or limited space, it is often interesting to assess whether the successful outcomes have occurred completely at random, or whether they tend to group together. One example, in genetics, is detecting grouping of genes within a genome. Approximations of the distribution of successes are possible, but they become inaccurate for small sample sizes. In this article, we describe the exact distribution of time between random, non-overlapping successes in discrete time of fixed length. A complete description of the probability mass function, the cumulative distribution function, mean, variance and recurrence relation is included. We propose an associated test for the over-representation of short distances and illustrate the methodology through relevant examples. The theory is implemented in an R package including probability mass, cumulative distribution, quantile function, random number generator, simulation functions, and functions for testing.
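    The quantity the package formalises can be approximated numerically: place s successes uniformly at random among n discrete positions and tally the gaps between consecutive successes. This Monte Carlo sketch (in Python rather than R, and not the package's exact formulas) shows the shape of the distribution that the exact probability mass function describes.

```python
# Monte Carlo approximation of the distribution of distances between
# consecutive successes in n discrete trials containing exactly s successes.
import random

def gap_distribution(n, s, reps=20000, seed=0):
    """Estimate P(gap = d) between consecutive successes when all
    placements of s successes among n positions are equally likely."""
    rng = random.Random(seed)
    counts = {}
    for _ in range(reps):
        pos = sorted(rng.sample(range(n), s))
        for a, b in zip(pos, pos[1:]):
            d = b - a
            counts[d] = counts.get(d, 0) + 1
    total = sum(counts.values())
    return {d: c / total for d, c in sorted(counts.items())}

dist = gap_distribution(n=50, s=5)
```

    A test for over-representation of short distances, as in the article, would compare the observed share of small gaps against the tail of this null distribution.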

  5. Testing how voluntary participation requirements in an environmental study affect the planned random sample design outcomes: implications for the predictions of values and their uncertainty.

    NASA Astrophysics Data System (ADS)

    Ander, Louise; Lark, Murray; Smedley, Pauline; Watts, Michael; Hamilton, Elliott; Fletcher, Tony; Crabbe, Helen; Close, Rebecca; Studden, Mike; Leonardi, Giovanni

    2015-04-01

    A random sampling design is optimal for assessing outcomes such as the mean of a given variable across an area. However, this optimal sampling design may be compromised to an unknown extent by unavoidable real-world factors: the extent to which the study design can still be considered random, and the influence this may have on the choice of appropriate statistical data analysis, is examined in this work. We take a study which relied on voluntary participation for the sampling of private water tap chemical composition in England, UK. This study was designed and implemented as a categorical, randomised study. The local geological classes were grouped into 10 types, which were considered to be most important in their likely effects on groundwater chemistry (the source of all the tap waters sampled). Locations of the users of private water supplies were made available to the study group by the Local Authority in the area. These were then assigned, based on location, to geological groups 1 to 10 and randomised within each group. However, permission to collect samples then required active, voluntary participation by householders and thus, unlike many environmental studies, sampling could not always follow the initial design. Impediments to participation ranged from 'willing but not available' during the designated sampling period, to a lack of response to requests to sample (assumed to indicate being wholly unwilling or unable to participate). Additionally, a small number of unplanned samples were collected via new participants making themselves known to the sampling teams during the sampling period. Here we examine the impact this has on the 'random' nature of the resulting data distribution, by comparison with the non-participating known supplies. We consider the implications this has for the choice of statistical analysis methods to predict values and their uncertainty at un-sampled locations.

  6. Chemical classification of iron meteorites. XI. Multi-element studies of 38 new irons and the high abundance of ungrouped irons from Antarctica

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wasson, J.T.; Ouyang, Xinwei; Wang, Jianmin

    1989-03-01

    The authors report concentrations of 14 elements in the metal of 38 iron meteorites and a pallasite. The meteorites are classified based on these data and on structural observations. Three samples are paired with previously classified irons; thus, these additional 35 irons raise the number of well-classified, independent iron meteorites to 598. One Yamato iron contains 342 mg/g Ni, the second highest Ni content in an IAB iron after Oktibbeha County. Two small irons from Western Australia appear to be metal nodules from mesosiderites. Several of the new irons are from Antarctica. Of 24 independent irons from Antarctica, 8 are ungrouped. The fraction, 0.333, is much higher than the fraction 0.161 among all 598 classified irons. Statistical tests show that it is highly improbable (approximately 2.9% probability) that the Antarctic population is a random sample of the larger population. The difference is probably related to the fact that the median mass of Antarctic irons is about two orders of magnitude smaller than that of non-Antarctic irons. It is doubtful that the difference results from fragmentation patterns yielding different size distributions favoring smaller masses among ungrouped irons. More likely is the possibility that smaller meteoroids tend to sample a larger number of asteroidal source regions, perhaps because small meteoroids tend to have higher ejection velocities or because small meteoroids have random-walked a greater increment of orbital semimajor axis away from that of the parent body.
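
    The approximately 2.9% figure can be reproduced with a one-sided exact binomial test (a sketch; the abstract does not name the specific statistical test the authors used):

```python
from math import comb

def binom_tail(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p), computed exactly."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# 8 ungrouped irons among 24 independent Antarctic irons, tested against
# the overall ungrouped fraction of 0.161 reported in the abstract.
p_value = binom_tail(8, 24, 0.161)  # roughly 0.029
```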

  7. Sub-grouping patients with non-specific low back pain based on cluster analysis of discriminatory clinical items.

    PubMed

    Billis, Evdokia; McCarthy, Christopher J; Roberts, Chris; Gliatis, John; Papandreou, Maria; Gioftsos, George; Oldham, Jacqueline A

    2013-02-01

    To identify potential subgroups amongst patients with non-specific low back pain based on a consensus list of potentially discriminatory examination items. Exploratory study. A convenience sample of 106 patients with non-specific low back pain (43 males, 63 females, mean age 36 years, standard deviation 15.9 years) and 7 physiotherapists. Based on 3 focus groups and a two-round Delphi involving 23 health professionals and a random stratified sample of 150 physiotherapists, respectively, a comprehensive examination list comprising the most "discriminatory" items was compiled. Following reliability analysis, the most reliable clinical items were assessed with a sample of patients with non-specific low back pain. K-means cluster analysis was conducted for 2-, 3- and 4-cluster options to explore for meaningful homogenous subgroups. The most clinically meaningful cluster was a two-subgroup option, comprising a small group (n = 24) with more severe clinical presentation (i.e. more widespread pain, functional and sleeping problems, other symptoms, increased investigations undertaken, more severe clinical signs, etc.) and a larger less dysfunctional group (n = 80). A number of potentially discriminatory clinical items were identified by health professionals and sub-classified, based on a sample of patients with non-specific low back pain, into two subgroups. However, further work is needed to validate this classification process.

  8. Estimation of population mean in the presence of measurement error and non-response under stratified random sampling

    PubMed Central

    Shabbir, Javid

    2018-01-01

    In the present paper we propose an improved class of estimators in the presence of measurement error and non-response under stratified random sampling for estimating the finite population mean. The theoretical and numerical studies reveal that the proposed class of estimators performs better than other existing estimators. PMID:29401519
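
    The proposed improved class of estimators is not spelled out in the abstract. As background, here is a minimal sketch of the classical stratified estimator of the mean that such estimators extend, with hypothetical data (the paper's measurement-error and non-response corrections are not reproduced):

```python
def stratified_mean(strata):
    """Classical stratified estimator of the population mean:
    y_st = sum_h W_h * ybar_h, with the textbook variance estimate
    V(y_st) = sum_h W_h^2 * s_h^2 / n_h (finite-population correction
    ignored). `strata` is a list of (W_h, sample) pairs with stratum
    weights W_h summing to 1."""
    y_st = 0.0
    var = 0.0
    for w, sample in strata:
        n = len(sample)
        ybar = sum(sample) / n
        s2 = sum((y - ybar) ** 2 for y in sample) / (n - 1)
        y_st += w * ybar
        var += w * w * s2 / n
    return y_st, var

# Hypothetical data: three strata with weights 0.5, 0.3 and 0.2.
est, var = stratified_mean([
    (0.5, [9, 10, 11]),
    (0.3, [19, 20, 21]),
    (0.2, [29, 30, 31]),
])
```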

  9. Rash rates with egfr inhibitors: meta-analysis

    PubMed Central

    Mittmann, N.; Seung, S.J.

    2011-01-01

    Introduction Currently marketed epidermal growth factor receptor inhibitors (egfris) have been associated with high rates of dermatologic toxicity. Methods We formally reviewed the literature in medline and embase. Additional searches were conducted using Internet search engines. Studies were eligible if they were randomized controlled clinical trials of egfris, specifically cetuximab and panitumumab, in which at least one arm consisted of a non-egfri treatment and rash safety data were reported. The random effects method was used to pool differences in incident rash rates. Results are summarized as differences in incident rash (egfri therapy rate minus the non-egfri therapy rate) with corresponding 95% confidence intervals (cis) for all severity grades of rash and for grades 3 and 4 rash. Results Sixteen studies met the initial inclusion criteria of randomized controlled trials comparing egfri with non-egfri therapy. Seven publications that provided information on all severity grades of rash were found to have an overall difference in incident rash rate of 0.74 (95% ci: 0.68 to 0.81; p < 0.01). Thirteen studies that reported the incidence of grades 3 and 4 rash showed an overall difference in the incident rash rate of 0.12 (95% ci: 0.09 to 0.14; p < 0.01) between egfri and non-egfri therapy. Sensitivity analyses showed that the results were generally robust, but sensitive to small samples. Conclusions Results quantify the difference in rash rates between egfri and non-egfri therapy. PMID:21505590
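
    "The random effects method" for pooling is commonly the DerSimonian-Laird estimator. Since the paper does not give its formulas, the sketch below is an illustrative DL implementation applied to hypothetical per-trial rash-rate differences, not the paper's data:

```python
import math

def dersimonian_laird(effects, variances):
    """DerSimonian-Laird random-effects pooling: estimate between-study
    variance tau^2 from Cochran's Q, then re-weight by 1/(v_i + tau^2)."""
    k = len(effects)
    w = [1 / v for v in variances]
    fe = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fe) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = 1 / math.sqrt(sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), tau2

# Hypothetical per-trial rash-rate differences and their variances.
pooled, ci, tau2 = dersimonian_laird(
    [0.70, 0.78, 0.66, 0.81], [0.004, 0.003, 0.006, 0.002])
```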

  10. Investigating Test Equating Methods in Small Samples through Various Factors

    ERIC Educational Resources Information Center

    Asiret, Semih; Sünbül, Seçil Ömür

    2016-01-01

    In this study, equating methods for the random groups design in small samples were compared across factors such as sample size, difference in difficulty between forms, and guessing parameter. Moreover, which method gives better results under which conditions was also investigated. In this study, 5,000 dichotomous simulated data…

  11. Efficacy and Safety of First-Line Necitumumab Plus Gemcitabine and Cisplatin Versus Gemcitabine and Cisplatin in East Asian Patients with Stage IV Squamous Non-small Cell Lung Cancer: A Subgroup Analysis of the Phase 3, Open-Label, Randomized SQUIRE Study.

    PubMed

    Park, Keunchil; Cho, Eun Kyung; Bello, Maximino; Ahn, Myung-Ju; Thongprasert, Sumitra; Song, Eun-Kee; Soldatenkova, Victoria; Depenbrock, Henrik; Puri, Tarun; Orlando, Mauro

    2017-10-01

    The phase 3 randomized SQUIRE study revealed significantly longer overall survival (OS) and progression-free survival (PFS) for necitumumab plus gemcitabine and cisplatin (neci+GC) than for gemcitabine and cisplatin alone (GC) in 1,093 patients with previously untreated advanced squamous non-small cell lung cancer (NSCLC). This post hoc subgroup analysis assessed the efficacy and safety of neci+GC among East Asian (EA) patients enrolled in the study. All patients received up to six 3-week cycles of gemcitabine (days 1 and 8, 1,250 mg/m²) and cisplatin (day 1, 75 mg/m²). Patients in the neci+GC arm also received necitumumab (days 1 and 8, 800 mg) until disease progression or unacceptable toxicity. Hazard ratios (HRs) and 95% confidence intervals (CIs) were estimated from stratified Cox proportional hazards models. In EA patients, there were improvements for neci+GC (n=43) versus GC (n=41) in OS (HR, 0.805; 95% CI, 0.484 to 1.341) and PFS (HR, 0.720; 95% CI, 0.439 to 1.180), consistent with the results for non-EA patients observed in the present study. The overall safety data were consistent between EA and non-EA patients. A numerically higher proportion of patients experienced serious adverse events (AEs), grade ≥ 3 AEs, and AEs with an outcome of death for neci+GC versus GC in EA patients and EA patients versus non-EA patients for neci+GC. Although limited by the small sample size and post hoc nature of the analysis, these findings are consistent with those of the overall study and suggest that neci+GC offers a survival advantage and favorable benefit/risk for EA patients with advanced squamous NSCLC.

  12. Randomization Does Not Help Much, Comparability Does

    PubMed Central

    Saint-Mont, Uwe

    2015-01-01

    According to R.A. Fisher, randomization “relieves the experimenter from the anxiety of considering innumerable causes by which the data may be disturbed.” Since, in particular, it is said to control for known and unknown nuisance factors that may considerably challenge the validity of a result, it has become very popular. This contribution challenges the received view. First, looking for quantitative support, we study a number of straightforward, mathematically simple models. They all demonstrate that the optimism surrounding randomization is questionable: In small to medium-sized samples, random allocation of units to treatments typically yields a considerable imbalance between the groups, i.e., confounding due to randomization is the rule rather than the exception. In the second part of this contribution, the reasoning is extended to a number of traditional arguments in favour of randomization. This discussion is rather non-technical, and sometimes touches on the rather fundamental Frequentist/Bayesian debate. However, the result of this analysis turns out to be quite similar: While the contribution of randomization remains doubtful, comparability contributes much to a compelling conclusion. Summing up, classical experimentation based on sound background theory and the systematic construction of exchangeable groups seems to be advisable. PMID:26193621
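
    The central quantitative claim, that random allocation in small samples typically leaves a considerable covariate imbalance between the groups, is easy to check by simulation. A minimal sketch with illustrative parameters (not the authors' models):

```python
import random
import statistics

def imbalance_fraction(n_per_arm, reps, threshold, seed=0):
    """Fraction of randomizations in which a single standard-normal
    baseline covariate differs between the two arms by more than
    `threshold` standard deviations of the covariate."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(reps):
        units = [rng.gauss(0, 1) for _ in range(2 * n_per_arm)]
        rng.shuffle(units)  # random allocation of units to treatments
        a, b = units[:n_per_arm], units[n_per_arm:]
        if abs(statistics.mean(a) - statistics.mean(b)) > threshold:
            hits += 1
    return hits / reps

# With 10 units per arm, a 0.2-SD imbalance occurs in most randomizations.
frac = imbalance_fraction(n_per_arm=10, reps=2000, threshold=0.2)
```

    With n = 10 per arm the mean difference is distributed with standard deviation sqrt(2/10) ≈ 0.45, so an imbalance exceeding 0.2 SD occurs in roughly two thirds of allocations, illustrating the "rule rather than the exception" claim.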

  13. Collaborative Indoor Access Point Localization Using Autonomous Mobile Robot Swarm.

    PubMed

    Awad, Fahed; Naserllah, Muhammad; Omar, Ammar; Abu-Hantash, Alaa; Al-Taj, Abrar

    2018-01-31

    Localization of access points has become an important research problem due to the wide range of applications it addresses such as dismantling critical security threats caused by rogue access points or optimizing wireless coverage of access points within a service area. Existing proposed solutions have mostly relied on theoretical hypotheses or computer simulation to demonstrate the efficiency of their methods. The techniques that rely on estimating the distance using samples of the received signal strength usually assume prior knowledge of the signal propagation characteristics of the indoor environment at hand and tend to take a relatively large number of uniformly distributed random samples. This paper presents an efficient and practical collaborative approach to detect the location of an access point in an indoor environment without any prior knowledge of the environment. The proposed approach comprises a swarm of wirelessly connected mobile robots that collaboratively and autonomously collect a relatively small number of non-uniformly distributed random samples of the access point's received signal strength. These samples are used to efficiently and accurately estimate the location of the access point. The experimental testing verified that the proposed approach can identify the location of the access point in an accurate and efficient manner.
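
    A much-simplified version of RSS-based localization illustrates the estimation step: assuming a log-distance path-loss model with known parameters (exactly the prior knowledge the paper's approach deliberately avoids), a grid search recovers the access point from a handful of noise-free samples. All values below are synthetic:

```python
import math

def locate_ap(samples, grid=10.0, step=0.1, p0=-30.0, n=2.5):
    """Grid search for the (x, y) that best explains RSS samples under
    the log-distance model rss = p0 - 10*n*log10(d). p0 (dBm at 1 m)
    and path-loss exponent n are illustrative assumptions."""
    best, best_err = None, float("inf")
    steps = int(grid / step) + 1
    for i in range(steps):
        for j in range(steps):
            x, y = i * step, j * step
            err = 0.0
            for (sx, sy), rss in samples:
                d = max(math.hypot(sx - x, sy - y), 1e-6)
                err += (rss - (p0 - 10 * n * math.log10(d))) ** 2
            if err < best_err:
                best, best_err = (x, y), err
    return best

# Noise-free samples from a hypothetical AP at (3.7, 5.2) in a 10 m room.
ap = (3.7, 5.2)
points = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0), (5.0, 0.0)]
samples = [((sx, sy),
            -30.0 - 25.0 * math.log10(math.hypot(sx - ap[0], sy - ap[1])))
           for sx, sy in points]
est = locate_ap(samples)
```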

  14. Collaborative Indoor Access Point Localization Using Autonomous Mobile Robot Swarm

    PubMed Central

    Awad, Fahed; Naserllah, Muhammad; Omar, Ammar; Abu-Hantash, Alaa; Al-Taj, Abrar

    2018-01-01

    Localization of access points has become an important research problem due to the wide range of applications it addresses such as dismantling critical security threats caused by rogue access points or optimizing wireless coverage of access points within a service area. Existing proposed solutions have mostly relied on theoretical hypotheses or computer simulation to demonstrate the efficiency of their methods. The techniques that rely on estimating the distance using samples of the received signal strength usually assume prior knowledge of the signal propagation characteristics of the indoor environment at hand and tend to take a relatively large number of uniformly distributed random samples. This paper presents an efficient and practical collaborative approach to detect the location of an access point in an indoor environment without any prior knowledge of the environment. The proposed approach comprises a swarm of wirelessly connected mobile robots that collaboratively and autonomously collect a relatively small number of non-uniformly distributed random samples of the access point’s received signal strength. These samples are used to efficiently and accurately estimate the location of the access point. The experimental testing verified that the proposed approach can identify the location of the access point in an accurate and efficient manner. PMID:29385042

  15. A Comparison of Techniques for Scheduling Earth-Observing Satellites

    NASA Technical Reports Server (NTRS)

    Globus, Al; Crawford, James; Lohn, Jason; Pryor, Anna

    2004-01-01

    Scheduling observations by coordinated fleets of Earth Observing Satellites (EOS) involves large search spaces, complex constraints and poorly understood bottlenecks, conditions where evolutionary and related algorithms are often effective. However, there are many such algorithms and the best one to use is not clear. Here we compare multiple variants of the genetic algorithm: stochastic hill climbing, simulated annealing, squeaky wheel optimization and iterated sampling on ten realistically-sized EOS scheduling problems. Schedules are represented by a permutation (non-temporal ordering) of the observation requests. A simple deterministic scheduler assigns times and resources to each observation request in the order indicated by the permutation, discarding those that violate the constraints created by previously scheduled observations. Simulated annealing performs best. Random mutation outperforms a more 'intelligent' mutator. Furthermore, the best mutator, by a small margin, was a novel approach we call temperature-dependent random sampling that makes large changes in the early stages of evolution and smaller changes towards the end of search.
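
    The permutation-plus-deterministic-scheduler representation can be sketched as follows, using a single-resource toy with hypothetical requests (real EOS scheduling involves many more constraints and resources):

```python
def schedule(permutation, requests):
    """Deterministic single-resource scheduler: walk the permutation,
    keep a request if its time window does not overlap anything already
    scheduled, otherwise discard it."""
    taken = []  # list of committed (start, end) intervals
    kept = []
    for name in permutation:
        start, dur = requests[name]
        end = start + dur
        # No overlap iff the new window lies entirely before or after
        # every committed interval.
        if all(end <= s or e <= start for s, e in taken):
            taken.append((start, end))
            kept.append(name)
    return kept

# Hypothetical requests: name -> (start time, duration).
requests = {"A": (0, 3), "B": (2, 2), "C": (4, 1), "D": (3, 2)}
```

    An evolutionary search then mutates only the permutation; the scheduler deterministically maps each permutation to a feasible schedule, so different orderings keep different subsets of requests.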

  16. Factual knowledge about AIDS and dating practices among high school students from selected schools.

    PubMed

    Nyachuru-Sihlangu, R H; Ndlovu, J

    1992-06-01

    Following various educational strategies by governmental and non-governmental organisations to educate youths and school teachers about HIV infection and prevention, this KABP survey was one attempt to evaluate the results. The study sample of 478 high school students was drawn from four randomly selected schools in Mashonaland and Matabeleland, including high- and low-density, government and mission co-educational schools. The sample was randomly selected and stratified to represent sex and grade level. The KABP self-administered questionnaire was used. The paper analyses the relationship between knowledge and dating patterns. Generally, respondents demonstrated 50% to 80% accuracy of factual knowledge. Of the 66% of Forms I through IV pupils who dated, 30% preferred only sexually involved relationships and a small number considered the possibility of HIV/AIDS infection. A theoretically based tripartite coalition involving the school, the family and health care services for education, guidance and support to promote responsible behaviour throughout childhood was suggested.

  17. Generating log-normal mock catalog of galaxies in redshift space

    NASA Astrophysics Data System (ADS)

    Agrawal, Aniket; Makiya, Ryu; Chiang, Chi-Ting; Jeong, Donghui; Saito, Shun; Komatsu, Eiichiro

    2017-10-01

    We present a public code to generate a mock galaxy catalog in redshift space assuming a log-normal probability density function (PDF) of galaxy and matter density fields. We draw galaxies by Poisson-sampling the log-normal field, and calculate the velocity field from the linearised continuity equation of matter fields, assuming zero vorticity. This procedure yields a PDF of the pairwise velocity fields that is qualitatively similar to that of N-body simulations. We check the fidelity of the catalog, showing that the measured two-point correlation function and power spectrum in real space agree with the input precisely. We find that a linear bias relation in the power spectrum does not guarantee a linear bias relation in the density contrasts, leading to a cross-correlation coefficient of matter and galaxies deviating from unity on small scales. We also find that linearising the Jacobian of the real-to-redshift space mapping provides a poor model for the two-point statistics in redshift space. That is, non-linear redshift-space distortion is dominated by non-linearity in the Jacobian. The power spectrum in redshift space shows a damping on small scales that is qualitatively similar to that of the well-known Fingers-of-God (FoG) effect due to random velocities, except that the log-normal mock does not include random velocities. This damping is a consequence of non-linearity in the Jacobian, and thus attributing the damping of the power spectrum solely to FoG, as commonly done in the literature, is misleading.
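
    The core log-normal + Poisson sampling step can be illustrated with a deliberately simplified toy that ignores the spatial correlations and velocity field of the actual code (all parameter values are illustrative):

```python
import math
import random

def poisson(lam, rng):
    """Knuth's Poisson sampler (adequate for small lam)."""
    l, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= l:
            return k
        k += 1

def lognormal_mock(n_cells, nbar, sigma, seed=0):
    """Per-cell galaxy counts drawn by Poisson-sampling a mean-one
    log-normal density field: density = exp(g - sigma^2/2), g ~ N(0,
    sigma^2). Uncorrelated cells only; the public code additionally
    imposes a target power spectrum and velocities."""
    rng = random.Random(seed)
    counts = []
    for _ in range(n_cells):
        g = rng.gauss(0.0, sigma)
        density = math.exp(g - sigma ** 2 / 2)  # E[density] = 1
        counts.append(poisson(nbar * density, rng))
    return counts

counts = lognormal_mock(n_cells=10000, nbar=2.0, sigma=0.5)
```

    By construction the counts are non-negative and their total fluctuates around n_cells × nbar, while the log-normal field guarantees the density contrast never drops below -1.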

  18. The American Entrepreneurial and Small-Business Culture.

    ERIC Educational Resources Information Center

    Jackson, John E.

    A study examined public attitudes toward entrepreneurs and small business owners and people's perceptions of the entrepreneurial character and challenge. Information was gathered from two surveys, composed of three samples, conducted in 1985. Samples included: (1) 1,001 persons contacted through random telephone dialing; (2) 250 telephone…

  19. Analysis of potential influence factors on background urinary benzene concentration among a non-smoking, non-occupationally exposed general population sample.

    PubMed

    Campagna, Marcello; Satta, Giannina; Campo, Laura; Flore, Valeria; Ibba, Antonio; Meloni, Michele; Tocco, Maria Giuseppina; Avataneo, Giuseppe; Flore, Costantino; Fustinoni, Silvia; Cocco, Pierluigi

    2014-01-01

    Analytical difficulties and lack of a biological exposure index and reference values have prevented using unmetabolized urinary benzene (UB) excretion as a biomarker of low-level environmental exposure. To explore what environmental factors beyond active smoking may contribute to environmental exposure to benzene, we monitored UB excretion in a non-smoking, non-occupationally exposed sample of the general population. Two spot urine samples were obtained from 86 non-smoking, non-occupationally exposed subjects, selected among a random sample of the general population of the metropolitan area of Cagliari (Sardinia, Italy), at 8:00 a.m. (UBm) and 8:00 p.m. (UBe). UB was measured by headspace solid-phase microextraction (HS-SPME) followed by gas chromatography-mass spectrometry analysis. Questionnaire information on personal and environmental exposures during the sampling day was gathered with personal interviews. Multivariate analysis of variance and a multiple regression model were applied to investigate the role of such variables on the level of UB. The ninety-fifth percentile of UBe in this population was 311.5 ng/L, which is tentatively proposed as the UB guidance value for unexposed populations. UBm and urban residence were the only predictors of a significant increase in UBe excretion. Self-reported residential vehicular traffic did not account for the excess median value among urban residents; commuting time among urban residents showed a suggestive non-significant linear correlation with UBe, but the small sample size prevented reliable inference from being drawn. Age, environmental tobacco smoking, employment status and body mass index did not affect UB excretion. Our findings support the use of unmetabolized UB as a specific and sensitive biomarker of low-level environmental exposure to benzene.

  20. Propagation of Circularly Polarized Light Through a Two-Dimensional Random Medium

    NASA Astrophysics Data System (ADS)

    Gorodnichev, E. E.

    2017-12-01

    The problem of small-angle multiple scattering of circularly polarized light in a two-dimensional medium with large fiberlike inhomogeneities is studied. The attenuation lengths for elements of the density matrix are calculated. It is found that with increasing sample thickness the intensity of waves polarized along the fibers decays faster than the other density matrix elements. With further increase in the thickness, the off-diagonal element which is responsible for correlation between the cross-polarized waves disappears. In the case of very thick samples the scattered field proves to be polarized perpendicular to the fibers. It is shown that the difference in the attenuation lengths of the density matrix elements results in a non-monotonic depth dependence of the degree of polarization.

  1. The Association between Childhood and Adolescent Sexual Abuse and Proxies for Sexual Risk Behavior: A Random Sample of the General Population of Sweden

    ERIC Educational Resources Information Center

    Steel, Jennifer L.; Herlitz, Claes A.

    2005-01-01

    Objective: Several studies with small, "high-risk" samples have demonstrated that a history of childhood or adolescent sexual abuse (CASA) is associated with sexual risk behaviors (SRBs). However, few studies with large random samples from the general population have specifically examined the relationship between CASA and SRBs with a…

  2. Mediastinal lymph node dissection versus mediastinal lymph node sampling for early stage non-small cell lung cancer: a systematic review and meta-analysis.

    PubMed

    Huang, Xiongfeng; Wang, Jianmin; Chen, Qiao; Jiang, Jielin

    2014-01-01

    This systematic review and meta-analysis aimed to evaluate the overall survival, local recurrence, distant metastasis, and complications of mediastinal lymph node dissection (MLND) versus mediastinal lymph node sampling (MLNS) in stage I-IIIA non-small cell lung cancer (NSCLC) patients. A systematic search of published literature was conducted using the main databases (MEDLINE, PubMed, EMBASE, and Cochrane databases) to identify relevant randomized controlled trials that compared MLND vs. MLNS in NSCLC patients. Methodological quality of included randomized controlled trials was assessed according to the criteria from the Cochrane Handbook for Systematic Review of Interventions (Version 5.1.0). Meta-analysis was performed using The Cochrane Collaboration's Review Manager 5.3. The results of the meta-analysis were expressed as hazard ratio (HR) or risk ratio (RR), with their corresponding 95% confidence interval (CI). We included results reported from six randomized controlled trials, with a total of 1,791 patients included in the primary meta-analysis. Compared to MLNS in NSCLC patients, there was no statistically significant difference in MLND on overall survival (HR = 0.77, 95% CI 0.55 to 1.08; P = 0.13). In addition, the results indicated that local recurrence rate (RR = 0.93, 95% CI 0.68 to 1.28; P = 0.67), distant metastasis rate (RR = 0.88, 95% CI 0.74 to 1.04; P = 0.15), and total complications rate (RR = 1.10, 95% CI 0.67 to 1.79; P = 0.72) were similar, with no significant difference found between the two groups. Results for overall survival, local recurrence rate, and distant metastasis rate were similar between MLND and MLNS in early stage NSCLC patients. There was no evidence that MLND increased complications compared with MLNS. Whether or not MLND is superior to MLNS for stage II-IIIA remains to be determined.

  3. Survey on the treatment of non-small-cell lung cancer in Italy.

    PubMed

    Alexanian, A; Torri, V

    2000-07-01

    The results of the Italian part of an international survey on therapeutic preferences and opinions about the prognosis of patients affected by non-small-cell lung cancer (NSCLC) are shown. The investigation was conducted by means of a postal questionnaire aiming to gather information on preferences about treatment and beliefs about survival for three hypothetical patients affected by NSCLC in different stages (T2N1M0, T2N3M0, M1); three sources of Italian physicians potentially treating patients affected by NSCLC were the target population: participants in the Adjuvant Lung Project Italy (Alpi) trial, a 20% random sample of the Italian Medical Oncology Association (AIOM), and representatives of almost all the pneumology wards in Italy. Overall, there were 287 evaluable responses; 89% of respondents were males, mean age was 46 years, mean time since graduation was 21 years, and the mean caseload was 82 patients per clinician. The most important result is the wide variation of answers both about therapy and prognosis. Expectations about the size of the prognosis improvement with a new chemotherapy seem to be excessive. The results are discussed in relation to the twin surveys of Canada and of England and Wales, and to the meta-analyses on the efficacy of chemotherapy as an adjunct to primary treatment and on postoperative radiotherapy in non-small-cell lung cancer.

  4. Intestinal helminth infestation is associated with increased bronchial responsiveness in children.

    PubMed

    da Silva, Emerson R; Sly, Peter D; de Pereira, Marilyn U; Pinto, Leonardo A; Jones, Marcus H; Pitrez, Paulo M; Stein, Renato T

    2008-07-01

    Non-atopic asthma is the predominant phenotype in non-affluent parts of Latin America. We recently reported that infestation with Ascaris lumbricoides increased the risk of non-atopic asthma in less affluent areas of Brazil, but the mechanism is unclear. The present study was conducted to determine whether helminth infestation is associated with heightened bronchial responsiveness (BHR), a common finding in asthma. A random sample of 50 asthmatic children and 50 non-asthmatic controls (mean age 10.1 years) was selected from a larger cohort (n = 1,011) without knowledge of their helminth infestation status. Three stool samples were collected from each child on different days and each sample was analyzed by the Kato-Katz method for quantitative determination of helminth eggs. Bronchial provocation tests were performed with inhaled 4.5% hypertonic saline using the ISAAC Phase II standardized protocol. There was no difference between the prevalence of positive BHR in the asthmatics (20.4%) compared with the controls (14.6%) (P = 1.0). Helminth infestation was detected in 24.0% of children, with A. lumbricoides being the most common. Children with high-load infestation (≥100 eggs/g) were five times more likely to have BHR than children with low-load or no infestation. Despite the small sample size, the results of the present study suggest that the link between high-load helminth infestation and non-atopic asthma may be mediated via heightened bronchial responsiveness, possibly due to an inflammatory response to the pulmonary phase of the helminth life cycle.

  5. Provider-related barriers to rapid HIV testing in U.S. urban non-profit community clinics, community-based organizations (CBOs) and hospitals.

    PubMed

    Bogart, Laura M; Howerton, Devery; Lange, James; Setodji, Claude Messan; Becker, Kirsten; Klein, David J; Asch, Steven M

    2010-06-01

    We examined provider-reported barriers to rapid HIV testing in U.S. urban non-profit community clinics, community-based organizations (CBOs), and hospitals. 12 primary metropolitan statistical areas (PMSAs; three per region) were sampled randomly, with sampling weights proportional to AIDS case reports. Across PMSAs, all 671 hospitals and a random sample of 738 clinics/CBOs were telephoned for a survey on rapid HIV test availability. Of the 671 hospitals, 172 hospitals were randomly selected for barriers questions, for which 158 laboratory and 136 department staff were eligible and interviewed in 2005. Of the 738 clinics/CBOs, 276 were randomly selected for barriers questions, 206 were reached, and 118 were eligible and interviewed in 2005-2006. In multivariate models, barriers regarding translation of administrative/quality assurance policies into practice were significantly associated with rapid HIV testing availability. For greater rapid testing diffusion, policies are needed to reduce administrative barriers and provide quality assurance training to non-laboratory staff.

  6. Neither fixed nor random: weighted least squares meta-analysis.

    PubMed

    Stanley, T D; Doucouliagos, Hristos

    2015-06-15

    This study challenges two core conventional meta-analysis methods: fixed effect and random effects. We show how and explain why an unrestricted weighted least squares estimator is superior to conventional random-effects meta-analysis when there is publication (or small-sample) bias and better than a fixed-effect weighted average if there is heterogeneity. Statistical theory and simulations of effect sizes, log odds ratios and regression coefficients demonstrate that this unrestricted weighted least squares estimator provides satisfactory estimates and confidence intervals that are comparable to random effects when there is no publication (or small-sample) bias and identical to fixed-effect meta-analysis when there is no heterogeneity. When there is publication selection bias, the unrestricted weighted least squares approach dominates random effects; when there is excess heterogeneity, it is clearly superior to fixed-effect meta-analysis. In practical applications, an unrestricted weighted least squares weighted average will often provide superior estimates to both conventional fixed and random effects. Copyright © 2015 John Wiley & Sons, Ltd.
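
    The unrestricted WLS estimator described can be sketched as a no-intercept regression of standardized effects on precisions: the point estimate coincides with the fixed-effect weighted average, while the standard error is scaled by sqrt(Q/(k-1)) without restricting that multiplier to be at least 1 (hence "unrestricted"). This is an illustrative implementation based on the abstract's description, with hypothetical data:

```python
import math

def uwls(effects, ses):
    """Unrestricted weighted least squares: regress t_i = y_i/se_i on
    x_i = 1/se_i with no intercept. The slope equals the inverse-variance
    (fixed-effect) weighted mean; its SE absorbs the regression MSE
    Q/(k-1) rather than forcing a fixed- or random-effects model."""
    k = len(effects)
    t = [y / s for y, s in zip(effects, ses)]
    x = [1 / s for s in ses]
    sxx = sum(xi * xi for xi in x)
    b = sum(xi * ti for xi, ti in zip(x, t)) / sxx  # fixed-effect mean
    q = sum((ti - b * xi) ** 2 for ti, xi in zip(t, x))  # Cochran's Q
    se_b = math.sqrt(q / (k - 1) / sxx)
    return b, se_b

# Hypothetical two-study example: effects 0.1 and 0.3, both with se 0.1.
b, se_b = uwls([0.1, 0.3], [0.1, 0.1])
```

    With heterogeneity (Q > k-1) the SE widens relative to fixed-effect; with homogeneity it shrinks, which is the behaviour the authors argue dominates both conventional models.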

  7. Detection of Torque teno midi virus/Small anellovirus (TTMDV/SAV) in the sera of domestic village chickens and its vertical transmission from hen to eggs

    PubMed Central

    Bouzari, M; Salmanizadeh, Sh

    2015-01-01

    Although infection of various animals and non-human primates with other members of the Anelloviridae has already been reported, there is no report of infection of animals with Torque teno midi virus/Small anellovirus (TTMDV/SAV). The aim of this study was to detect the virus in domestic village chickens. Blood samples were collected from 79 domestic village chickens in Isfahan. Blood samples from five adult laying hens and one cockerel were collected in three consecutive weeks (days 1, 8 and 14) as experimental chickens. Ten eggs were randomly collected from the eggs laid during days 12 to 17, and thin and thick egg whites and yolk samples were collected aseptically. After DNA extraction, nested PCR was performed using SMAs/SMAr primers. Products of 431 bp and 441 bp were detected; the bands were extracted and sequenced. In total, 26 of 79 (32.9%) blood samples were positive for the virus. The frequency of infection across the different parts of the eggs tested was 76%. For the first time, TTMDV/SAV was detected in domestic village chickens and shown to be vertically transmitted to eggs. PMID:27175162

  9. Finite-sample corrected generalized estimating equation of population average treatment effects in stepped wedge cluster randomized trials.

    PubMed

    Scott, JoAnna M; deCamp, Allan; Juraska, Michal; Fay, Michael P; Gilbert, Peter B

    2017-04-01

    Stepped wedge designs are increasingly commonplace and advantageous for cluster randomized trials when it is both unethical to assign placebo, and it is logistically difficult to allocate an intervention simultaneously to many clusters. We study marginal mean models fit with generalized estimating equations for assessing treatment effectiveness in stepped wedge cluster randomized trials. This approach has advantages over the more commonly used mixed models that (1) the population-average parameters have an important interpretation for public health applications and (2) they avoid untestable assumptions on latent variable distributions and avoid parametric assumptions about error distributions, therefore, providing more robust evidence on treatment effects. However, cluster randomized trials typically have a small number of clusters, rendering the standard generalized estimating equation sandwich variance estimator biased and highly variable and hence yielding incorrect inferences. We study the usual asymptotic generalized estimating equation inferences (i.e., using sandwich variance estimators and asymptotic normality) and four small-sample corrections to generalized estimating equation for stepped wedge cluster randomized trials and for parallel cluster randomized trials as a comparison. We show by simulation that the small-sample corrections provide improvement, with one correction appearing to provide at least nominal coverage even with only 10 clusters per group. These results demonstrate the viability of the marginal mean approach for both stepped wedge and parallel cluster randomized trials. We also study the comparative performance of the corrected methods for stepped wedge and parallel designs, and describe how the methods can accommodate interval censoring of individual failure times and incorporate semiparametric efficient estimators.
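    To make the sandwich variance estimator concrete, here is a minimal sketch for a linear marginal mean model with identity link and independence working correlation (an illustrative reconstruction, not the paper's code). It is this estimator whose downward bias with few clusters motivates the small-sample corrections studied.

    ```python
    import numpy as np

    def gee_sandwich(X_clusters, y_clusters, beta):
        # V = B^-1 M B^-1, with B = sum_i Xi' Xi (identity link, independence
        # working correlation) and M = sum_i Xi' ri ri' Xi built from the
        # cluster residuals ri = yi - Xi beta.
        beta = np.asarray(beta, float)
        p = len(beta)
        B = np.zeros((p, p))
        M = np.zeros((p, p))
        for X, y in zip(X_clusters, y_clusters):
            X = np.asarray(X, float)
            r = np.asarray(y, float) - X @ beta
            B += X.T @ X
            u = X.T @ r
            M += np.outer(u, u)
        Binv = np.linalg.inv(B)
        return Binv @ M @ Binv
    ```

    Because M is a sum over clusters, its variability is driven by the number of clusters rather than the number of subjects, which is why inference degrades when, as here, there may be only 10 clusters per group.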

  10. A systematic review of non-pharmacological interventions for primary Sjögren's syndrome.

    PubMed

    Hackett, Katie L; Deane, Katherine H O; Strassheim, Victoria; Deary, Vincent; Rapley, Tim; Newton, Julia L; Ng, Wan-Fai

    2015-11-01

    To evaluate the effects of non-pharmacological interventions for primary SS (pSS) on outcomes falling within the World Health Organization International Classification of Functioning, Disability and Health domains. We searched the following databases from inception to September 2014: Cochrane Database of Systematic Reviews, Medline, Embase, PsycINFO, CINAHL, and clinical trials registers. We included randomized controlled trials of any non-pharmacological intervention. Two authors independently reviewed titles and abstracts against the inclusion/exclusion criteria, and independently assessed trial quality and extracted data. A total of 1463 studies were identified, from which 17 full-text articles were screened and 5 studies were included in the review; a total of 130 participants were randomized. The included studies investigated the effectiveness of an oral lubricating device for dry mouth, acupuncture for dry mouth, lacrimal punctum plugs for dry eyes, and psychodynamic group therapy for coping with symptoms. Overall, the studies were of low quality and at high risk of bias. Although one study showed punctum plugs to improve dry eyes, the sample size was relatively small. Further high-quality studies to evaluate non-pharmacological interventions for pSS are needed. © The Author 2015. Published by Oxford University Press on behalf of the British Society for Rheumatology.

  11. The Dirichlet-Multinomial Model for Multivariate Randomized Response Data and Small Samples

    ERIC Educational Resources Information Center

    Avetisyan, Marianna; Fox, Jean-Paul

    2012-01-01

    In survey sampling the randomized response (RR) technique can be used to obtain truthful answers to sensitive questions. Although the individual answers are masked due to the RR technique, individual (sensitive) response rates can be estimated when observing multivariate response data. The beta-binomial model for binary RR data will be generalized…
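    For readers unfamiliar with the RR technique, here is a minimal single-question sketch (Warner's classic model, shown purely for illustration; the article's beta-binomial and Dirichlet-multinomial models generalize this to multivariate answers and small samples):

    ```python
    import random

    def warner_estimate(answers, p):
        # Warner's model: each respondent answers the sensitive question with
        # probability p, and its complement with probability 1 - p, so the
        # observed "yes" rate is lam = p*pi + (1 - p)*(1 - pi).  Inverting
        # gives an estimate of the true prevalence pi (requires p != 0.5).
        lam = sum(answers) / len(answers)
        return (lam - (1 - p)) / (2 * p - 1)

    def simulate_answers(pi, p, n, rng):
        # Generate masked answers for n respondents with true prevalence pi.
        answers = []
        for _ in range(n):
            has_trait = rng.random() < pi
            truthful = rng.random() < p
            answers.append(has_trait if truthful else not has_trait)
        return answers
    ```

    The masking means no individual answer reveals the trait, yet the aggregate prevalence is recoverable; the paper's concern is that this inversion becomes unstable for small samples, hence the Dirichlet-multinomial treatment.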

  12. Efficient sampling of complex network with modified random walk strategies

    NASA Astrophysics Data System (ADS)

    Xie, Yunya; Chang, Shuhua; Zhang, Zhipeng; Zhang, Mi; Yang, Lei

    2018-02-01

    We present two novel random walk strategies: choosing seed node (CSN) random walk and no-retracing (NR) random walk. Unlike classical random walk sampling, the CSN and NR strategies focus on the influence of the seed node choice and of path overlap, respectively. The three random walk samplings are applied to the Erdős-Rényi (ER), Barabási-Albert (BA), Watts-Strogatz (WS), and weighted USAir networks, and the major properties of the sampled subnets, such as sampling efficiency, degree distributions, average degree, and average clustering coefficient, are studied. Similar conclusions can be reached with all three random walk strategies. First, networks with small scales and simple structures are conducive to sampling. Second, the average degree and the average clustering coefficient of the sampled subnet tend toward the corresponding values of the original networks within a limited number of steps. Third, all the degree distributions of the subnets are slightly biased toward the high-degree side. However, the NR strategy performs better for the average clustering coefficient of the subnet. In the real weighted USAir network, some obvious characteristics, such as the larger clustering coefficient and the fluctuation of the degree distribution, are reproduced well by these random walk strategies.
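    A minimal sketch of the no-retracing (NR) strategy as described in the abstract (my reconstruction, not the authors' code): at each step the walker picks a random neighbour, excluding the node it just came from whenever possible.

    ```python
    import random

    def nr_random_walk(adj, seed_node, steps, rng):
        # adj maps node -> list of neighbours.  At each step pick a random
        # neighbour, excluding the node we just came from unless it is the
        # only option (dead end), which reduces wasteful path overlap.
        walk = [seed_node]
        prev = None
        for _ in range(steps):
            nbrs = adj[walk[-1]]
            choices = [n for n in nbrs if n != prev] or nbrs
            prev = walk[-1]
            walk.append(rng.choice(choices))
        return walk
    ```

    The classical walk is recovered by dropping the exclusion; the CSN variant would instead vary `seed_node` (e.g. by degree) while keeping the step rule unchanged.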

  13. Can work ability explain the social gradient in sickness absence: a study of a general population in Sweden.

    PubMed

    Löve, Jesper; Holmgren, Kristina; Torén, Kjell; Hensing, Gunnel

    2012-03-07

    Understanding the reasons for the social gradient in sickness absence might provide an opportunity to reduce the general rates of sickness absence. The complete explanation for this social gradient still remains unclear, and there is a need for studies using randomly selected working population samples. The main aim of the present study was to investigate whether self-reported work ability could explain the association between low socioeconomic position and belonging to a sample of new cases of sick-listed employees. The two study samples consisted of a randomly selected working population sample (n = 2,763) and a sample of new cases of sick-listed employees (n = 3,044), 19-64 years old. Both samples were drawn from the same randomly selected general population. Socioeconomic status was measured by occupational position, and physical and mental work ability were measured with two items extracted from the work ability index. There was an association between lower socioeconomic status and belonging to the sick-listed sample among both women and men. In men, the crude odds ratios increased for each downward step in socioeconomic status: OR 1.32 (95% CI 0.98-1.78), OR 1.53 (1.05-2.24), OR 2.80 (2.11-3.72), and OR 2.98 (2.27-3.90). Among women this gradient was not as pronounced. Physical work ability was the strongest explanatory factor, explaining the total association between socioeconomic status and being sick-listed in women. Among men, however, the associations between skilled non-manual, OR 2.07 (1.54-2.78), and non-skilled manual, OR 2.03 (1.53-2.71), positions and being sick-listed remained. The explanatory effect of mental work ability was small. Surprisingly, even in the sick-listed sample most respondents had high mental and physical work ability. These results suggest that physical work ability may be an important key to explaining the social gradient in sickness absence, particularly in women. Hence, it is possible that the factors associated with the social gradient in sickness absence differ, to some extent, between women and men.

  14. Detecting spatial structures in throughfall data: the effect of extent, sample size, sampling design, and variogram estimation method

    NASA Astrophysics Data System (ADS)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander

    2016-04-01

    In the last three decades, an increasing number of studies analyzed spatial patterns in throughfall to investigate the consequences of rainfall redistribution for biogeochemical and hydrological processes in forests. In the majority of cases, variograms were used to characterize the spatial properties of the throughfall data. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and an appropriate layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation methods on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with heavy outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling), and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. 
Third, studies relying on method-of-moments based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the numbers recommended by studies dealing with Gaussian data by up to 100 %. Given that most previous throughfall studies relied on method-of-moments variogram estimation and sample sizes << 200, our current knowledge about throughfall spatial variability stands on shaky ground.
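    The method-of-moments (Matheron) variogram estimator discussed above can be sketched in a few lines for 1-D coordinates (an illustrative reconstruction, not the study's code); its reliance on squared pairwise differences is what makes it sensitive to the heavy outliers typical of throughfall data:

    ```python
    import numpy as np

    def mom_variogram(coords, values, lag_edges):
        # Matheron estimator: gamma(h) = sum (z_i - z_j)^2 / (2 N(h)) over
        # all pairs whose separation falls in the lag bin around h.
        coords = np.asarray(coords, float)
        values = np.asarray(values, float)
        iu = np.triu_indices(len(values), k=1)          # count each pair once
        d = np.abs(coords[:, None] - coords[None, :])[iu]
        sq = ((values[:, None] - values[None, :]) ** 2)[iu]
        gamma = []
        for lo, hi in zip(lag_edges[:-1], lag_edges[1:]):
            m = (d >= lo) & (d < hi)
            gamma.append(sq[m].sum() / (2 * m.sum()) if m.any() else np.nan)
        return np.array(gamma)
    ```

    A single outlier enters every pair it belongs to as a squared difference, inflating several lag bins at once, which is one intuition for why the study finds residual maximum likelihood more sample-efficient on heavy-tailed fields.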

  15. Using random telephone sampling to recruit generalizable samples for family violence studies.

    PubMed

    Slep, Amy M Smith; Heyman, Richard E; Williams, Mathew C; Van Dyke, Cheryl E; O'Leary, Susan G

    2006-12-01

    Convenience sampling methods predominate in recruiting for laboratory-based studies within clinical and family psychology. The authors used random digit dialing (RDD) to determine whether they could feasibly recruit generalizable samples for 2 studies (a parenting study and an intimate partner violence study). RDD screen response rate was 42-45%; demographics matched those in the 2000 U.S. Census, with small- to medium-sized differences on race, age, and income variables. RDD respondents who qualified for, but did not participate in, the laboratory study of parents showed small differences on income, couple conflicts, and corporal punishment. Time and cost are detailed, suggesting that RDD may be a feasible, effective method by which to recruit more generalizable samples for in-laboratory studies of family violence when those studies have sufficient resources. (c) 2006 APA, all rights reserved.

  16. [A multicenter, large-sample, randomized clinical trial on improving the median survival time of advanced non-small cell lung cancer by combination of Ginseng Rg3 and chemotherapy].

    PubMed

    Zhang, Y; Wang, X Q; Liu, H; Liu, J; Hou, W; Lin, H S

    2018-04-23

    Objective: To observe the efficacy of the combination of chemotherapy and Ginseng Rg3 in advanced non-small cell lung cancer (NSCLC). Methods: In this multi-center, large-sample, randomized, double-blind trial, 414 patients with stage Ⅲ-Ⅳ NSCLC were enrolled: 199 in the experimental group and 215 in the control group. Patients in the experimental group were treated with standard first-line chemotherapy combined with Ginseng Rg3; patients in the control group received the same chemotherapy combined with placebo. Median overall survival (OS), Karnofsky performance scale (KPS), Traditional Chinese Medicine (TCM) symptom score, and side effects in the two groups were observed as the main endpoints. Results: The median OS was 12.03 months in the experimental group, significantly better than the 8.46 months in the control group (P<0.05). Hemoglobin and white blood cell counts decreased after the first and second cycles of treatment in both groups, and both decreases were significantly milder in the experimental group (P<0.05). In addition, after two courses of treatment, the KPS was 78.95±9.14 in the experimental group versus 76.77±9.15 in the control group, while the TCM symptom score was 2.45±1.73 versus 2.92±2.06, both significant differences (P<0.05). Conclusions: Combining TCM with Western medicine such as chemotherapy could prolong the survival of patients with advanced NSCLC. The combined therapy improved patients' symptoms and reduced chemotherapy-induced myelosuppression.

  17. Reducing seed dependent variability of non-uniformly sampled multidimensional NMR data

    NASA Astrophysics Data System (ADS)

    Mobli, Mehdi

    2015-07-01

    The application of NMR spectroscopy to study the structure, dynamics and function of macromolecules requires the acquisition of several multidimensional spectra. The one-dimensional NMR time-response from the spectrometer is extended to additional dimensions by introducing incremented delays in the experiment that cause oscillation of the signal along "indirect" dimensions. For a given dimension, the delay is incremented at twice the rate of the maximum frequency (the Nyquist rate). Achieving high resolution requires the acquisition of long data records sampled at the Nyquist rate. This is typically prohibitive due to time constraints, resulting in sub-optimal data records to the detriment of subsequent analyses. The multidimensional NMR spectrum itself is typically sparse, and it has been shown that in such cases it is possible to use non-Fourier methods to reconstruct a high-resolution multidimensional spectrum from a random subset of non-uniformly sampled (NUS) data. For a given acquisition time, NUS has the potential to improve the sensitivity and resolution of a multidimensional spectrum compared to traditional uniform sampling. The improvements in sensitivity and/or resolution achieved by NUS depend heavily on the distribution of points in the random subset acquired. Typically, random points are selected from a probability density function (PDF) weighted according to the NMR signal envelope. In extreme cases as little as 1% of the data is subsampled. The heavy under-sampling can result in poor reproducibility, i.e. when two experiments are carried out in which the same number of random samples is selected from the same PDF but using different random seeds. Here, a jittered sampling approach is introduced that is shown to improve random-seed-dependent reproducibility of multidimensional spectra generated from NUS data, compared to commonly applied NUS methods.
It is shown that this is achieved due to the low variability of the inherent sensitivity of the random subset chosen from a given PDF. Finally, it is demonstrated that metrics used to find optimal NUS distributions are heavily dependent on the inherent sensitivity of the random subset, and such optimisation is therefore less critical when using the proposed sampling scheme.
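    A minimal sketch of the jittered idea for a 1-D indirect dimension (uniform strata shown for simplicity, and the function name is illustrative; the paper's envelope-weighted PDF would correspond to strata of unequal width): the sampling grid is split into equal bins and one point is drawn per bin, so every random seed yields a schedule with even coverage.

    ```python
    import random

    def jittered_schedule(n_total, n_samples, rng):
        # Split the Nyquist grid of n_total increments into n_samples equal
        # strata and draw exactly one grid point per stratum.  Every seed then
        # produces a schedule with even coverage, unlike independent draws,
        # which can cluster and leave gaps depending on the seed.
        edges = [round(i * n_total / n_samples) for i in range(n_samples + 1)]
        return [rng.randrange(lo, hi) for lo, hi in zip(edges[:-1], edges[1:])]
    ```

    Stratification removes the between-seed variance in coverage while keeping the within-stratum randomness needed to suppress coherent sampling artefacts.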

  18. Prevalence and risk factors for Maedi-Visna in sheep farms in Mecklenburg-Western-Pomerania.

    PubMed

    Hüttner, Klim; Seelmann, Matthias; Feldhusen, Frerk

    2010-01-01

    Despite indications of a considerable spread of Maedi-Visna among sheep flocks in Germany, prevalence studies of this important infection are hardly available. Prior to any health schemes and guidelines, knowledge about regional disease distribution is essential. Based on herd size, 70 farms were randomly selected, of which 41 cooperated. A total of 2229 blood samples were taken at random and examined serologically. For assessment of selected farm characteristics, a questionnaire was administered at all participating farms. The average herd prevalence was 51.2%; the within-herd prevalence was 28.8%. In the univariate analysis of risk factors, small (10-100 sheep) and large (>250 sheep) farms were more MVV-affected than medium-sized farms. The average stable and pasture space per sheep was larger at non-infected than at infected farms. Owners' judgement of general herd health was better at non-infected than at infected farms. Among infected farms only, the risk of within-herd prevalence above 20% was significantly higher in crossbred than in purebred flocks.

  19. Multi-class computational evolution: development, benchmark evaluation and application to RNA-Seq biomarker discovery.

    PubMed

    Crabtree, Nathaniel M; Moore, Jason H; Bowyer, John F; George, Nysia I

    2017-01-01

    A computational evolution system (CES) is a knowledge discovery engine that can identify subtle, synergistic relationships in large datasets. Pareto optimization allows CESs to balance accuracy with model complexity when evolving classifiers. Using Pareto optimization, a CES is able to identify a very small number of features while maintaining high classification accuracy. A CES can be designed for various types of data, and the user can exploit expert knowledge about the classification problem in order to improve discrimination between classes. These characteristics give CES an advantage over other classification and feature selection algorithms, particularly when the goal is to identify a small number of highly relevant, non-redundant biomarkers. Previously, CESs have been developed only for binary class datasets. In this study, we developed a multi-class CES. The multi-class CES was compared to three common feature selection and classification algorithms: support vector machine (SVM), random k-nearest neighbor (RKNN), and random forest (RF). The algorithms were evaluated on three distinct multi-class RNA sequencing datasets. The comparison criteria were run-time, classification accuracy, number of selected features, and stability of selected feature set (as measured by the Tanimoto distance). The performance of each algorithm was data-dependent. CES performed best on the dataset with the smallest sample size, indicating that CES has a unique advantage since the accuracy of most classification methods suffer when sample size is small. The multi-class extension of CES increases the appeal of its application to complex, multi-class datasets in order to identify important biomarkers and features.

  20. Generating log-normal mock catalog of galaxies in redshift space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agrawal, Aniket; Makiya, Ryu; Saito, Shun

    We present a public code to generate a mock galaxy catalog in redshift space assuming a log-normal probability density function (PDF) of galaxy and matter density fields. We draw galaxies by Poisson-sampling the log-normal field, and calculate the velocity field from the linearised continuity equation of matter fields, assuming zero vorticity. This procedure yields a PDF of the pairwise velocity fields that is qualitatively similar to that of N-body simulations. We check fidelity of the catalog, showing that the measured two-point correlation function and power spectrum in real space agree with the input precisely. We find that a linear bias relation in the power spectrum does not guarantee a linear bias relation in the density contrasts, leading to a cross-correlation coefficient of matter and galaxies deviating from unity on small scales. We also find that linearising the Jacobian of the real-to-redshift space mapping provides a poor model for the two-point statistics in redshift space. That is, non-linear redshift-space distortion is dominated by non-linearity in the Jacobian. The power spectrum in redshift space shows a damping on small scales that is qualitatively similar to that of the well-known Fingers-of-God (FoG) effect due to random velocities, except that the log-normal mock does not include random velocities. This damping is a consequence of non-linearity in the Jacobian, and thus attributing the damping of the power spectrum solely to FoG, as commonly done in the literature, is misleading.
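    The Poisson-sampling step can be sketched for a single grid of independent cells (correlations and the FFT machinery of the actual code are omitted; this only illustrates the log-normal PDF and the draw, with an illustrative function name):

    ```python
    import numpy as np

    def lognormal_galaxy_counts(nbar, sigma_g, shape, rng):
        # A log-normal density contrast with mean zero and delta >= -1:
        # delta = exp(G - sigma^2/2) - 1, with G ~ N(0, sigma^2) per cell.
        g = rng.normal(0.0, sigma_g, size=shape)
        delta = np.exp(g - 0.5 * sigma_g ** 2) - 1.0
        # Galaxies are a Poisson draw with mean nbar * (1 + delta) per cell,
        # so counts stay non-negative even in deeply underdense cells.
        return rng.poisson(nbar * (1.0 + delta))
    ```

    The log-normal transform is what keeps the sampled density physical (never below minus one), which a Gaussian field cannot guarantee.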

  1. Estimation of reference intervals from small samples: an example using canine plasma creatinine.

    PubMed

    Geffré, A; Braun, J P; Trumel, C; Concordet, D

    2009-12-01

    According to international recommendations, reference intervals should be determined from at least 120 reference individuals, which often are impossible to achieve in veterinary clinical pathology, especially for wild animals. When only a small number of reference subjects is available, the possible bias cannot be known and the normality of the distribution cannot be evaluated. A comparison of reference intervals estimated by different methods could be helpful. The purpose of this study was to compare reference limits determined from a large set of canine plasma creatinine reference values, and large subsets of this data, with estimates obtained from small samples selected randomly. Twenty sets each of 120 and 27 samples were randomly selected from a set of 1439 plasma creatinine results obtained from healthy dogs in another study. Reference intervals for the whole sample and for the large samples were determined by a nonparametric method. The estimated reference limits for the small samples were minimum and maximum, mean +/- 2 SD of native and Box-Cox-transformed values, 2.5th and 97.5th percentiles by a robust method on native and Box-Cox-transformed values, and estimates from diagrams of cumulative distribution functions. The whole sample had a heavily skewed distribution, which approached Gaussian after Box-Cox transformation. The reference limits estimated from small samples were highly variable. The closest estimates to the 1439-result reference interval for 27-result subsamples were obtained by both parametric and robust methods after Box-Cox transformation but were grossly erroneous in some cases. For small samples, it is recommended that all values be reported graphically in a dot plot or histogram and that estimates of the reference limits be compared using different methods.
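    Two of the compared estimators can be sketched as follows (illustrative reconstructions; the function names are not from the paper). The parametric mean ± 2 SD rule is only sensible on approximately Gaussian data, which is why the study applies it after Box-Cox transformation of the skewed creatinine values:

    ```python
    import statistics

    def ri_mean_2sd(values):
        # Parametric reference-interval estimate: mean +/- 2 SD.
        m = statistics.mean(values)
        s = statistics.stdev(values)
        return m - 2 * s, m + 2 * s

    def ri_nonparametric(values, lo=0.025, hi=0.975):
        # Nonparametric 2.5th/97.5th percentiles -- the reference method for
        # large samples (n >= 120), but highly variable for n around 27.
        qs = statistics.quantiles(values, n=1000, method='inclusive')
        return qs[round(lo * 1000) - 1], qs[round(hi * 1000) - 1]
    ```

    With 27 values the empirical 2.5th percentile sits between the lowest observations, so the nonparametric limits inherit the full sampling noise of the extremes, consistent with the high variability the study reports.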

  2. Involved-Field Radiotherapy versus Elective Nodal Irradiation in Combination with Concurrent Chemotherapy for Locally Advanced Non-Small Cell Lung Cancer: A Prospective Randomized Study

    PubMed Central

    Chen, Ming; Bao, Yong; Ma, Hong-Lian; Wang, Jin; Wang, Yan; Peng, Fang; Zhou, Qi-Chao; Xie, Cong-Hua

    2013-01-01

    This prospective randomized study is to evaluate the locoregional failure and its impact on survival by comparing involved field radiotherapy (IFRT) with elective nodal irradiation (ENI) in combination with concurrent chemotherapy for locally advanced non-small cell lung cancer. It appears that higher dose could be delivered in IFRT arm than that in ENI arm, and IFRT did not increase the risk of initially uninvolved or isolated nodal failures. Both a tendency of improved locoregional progression-free survival and a significant increased overall survival rate are in favor of IFRT arm in this study. PMID:23762840

  3. A Meta-analytic Review of Non-specific Effects in Randomized Controlled Trials of Cognitive Remediation for Schizophrenia.

    PubMed

    Radhakrishnan, Rajiv; Kiluk, Brian D; Tsai, Jack

    2016-03-01

    Cognitive remediation (CR) has been found to improve cognitive performance among adults with schizophrenia in randomized controlled trials (RCTs). However, improvements in cognitive performance are often observed in the control groups of RCTs as well. There has been no comprehensive examination of change in control groups for CR, which may inform trial methodology and improve our understanding of measured outcomes for cognitive remediation. In this meta-analysis, we calculated pre-post change in cognitive test performance within control groups of RCTs in 32 CR trials (n = 794 participants) published between 1970 and 2011, and examined the association between pre-post change and sample size, duration of treatment, type of control group, and participants' age, intelligence, duration of illness, and psychiatric symptoms. Results showed that control groups in CR trials showed small effect size changes (Cohen's d = 0.12 ± 0.16) in cognitive test performance over the trial duration. Study characteristics associated with pre-post change included participant age and sample size. These findings suggest attention to change in control groups may help improve detection of cognitive remediation effects for schizophrenia.
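    As a reminder of the effect-size metric involved, a pre-post Cohen's d for a single control group can be computed as mean change over a pooled SD of the two time points (one common convention, shown for illustration; individual trials in the meta-analysis may use a different denominator):

    ```python
    import math
    import statistics

    def prepost_d(pre, post):
        # Mean change divided by the pooled SD of the two time points.
        diff = statistics.mean(post) - statistics.mean(pre)
        pooled_sd = math.sqrt(
            (statistics.variance(pre) + statistics.variance(post)) / 2
        )
        return diff / pooled_sd
    ```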

  4. Quality Assurance of NCI Thesaurus by Mining Structural-Lexical Patterns

    PubMed Central

    Abeysinghe, Rashmie; Brooks, Michael A.; Talbert, Jeffery; Cui, Licong

    2017-01-01

    Quality assurance of biomedical terminologies such as the National Cancer Institute (NCI) Thesaurus is an essential part of the terminology management lifecycle. We investigate a structural-lexical approach based on non-lattice subgraphs to automatically identify missing hierarchical relations and missing concepts in the NCI Thesaurus. We mine six structural-lexical patterns exhibited in non-lattice subgraphs: containment, union, intersection, union-intersection, inference-contradiction, and inference-union. Each pattern indicates a potential specific type of error and suggests a potential type of remediation. We found 809 non-lattice subgraphs with these patterns in the NCI Thesaurus (version 16.12d). Domain experts evaluated a random sample of 50 small non-lattice subgraphs, of which 33 were confirmed to contain errors and make correct suggestions (33/50 = 66%). Of the 25 evaluated subgraphs revealing multiple patterns, 22 were verified correct (22/25 = 88%). This shows the effectiveness of our structural-lexical-pattern-based approach in detecting errors and suggesting remediations in the NCI Thesaurus. PMID:29854100

  5. Neighbourhood non-employment and daily smoking: a population-based study of women and men in Sweden.

    PubMed

    Ohlander, Emma; Vikström, Max; Lindström, Martin; Sundquist, Kristina

    2006-02-01

    To examine whether neighbourhood non-employment is associated with daily smoking after adjustment for individual characteristics, such as employment status. Cross-sectional study of a simple random sample of 31,164 women and men aged 25-64, representative of the entire population of Sweden. Data were collected for the years 1993-2000. The individual variables included age, sex, employment status, occupation and housing tenure. Logistic regression was used in the analysis, with neighbourhood non-employment rates measured at the small area market statistics level. There was a significant association between neighbourhood non-employment rates and daily smoking for both women and men. After adjustment for employment status and housing tenure, the odds ratios of daily smoking were 1.39 (95% CI = 1.22-1.58) for women and 1.41 (95% CI = 1.23-1.61) for men living in neighbourhoods with the highest non-employment rates. The individual variables of unemployment, low occupational level and renting were associated with daily smoking. Neighbourhood non-employment is associated with daily smoking. Smoking prevention in primary health care should address both individuals and neighbourhoods.

  6. The use of single-date MODIS imagery for estimating large-scale urban impervious surface fraction with spectral mixture analysis and machine learning techniques

    NASA Astrophysics Data System (ADS)

    Deng, Chengbin; Wu, Changshan

    2013-12-01

    Urban impervious surface information is essential for urban and environmental applications at the regional/national scales. As a popular image processing technique, spectral mixture analysis (SMA) has rarely been applied to coarse-resolution imagery due to the difficulty of deriving endmember spectra using traditional endmember selection methods, particularly within heterogeneous urban environments. To address this problem, we derived endmember signatures through a least squares solution (LSS) technique with known abundances of sample pixels, and integrated these endmember signatures into SMA for mapping large-scale impervious surface fraction. In addition, with the same sample set, we carried out objective comparative analyses among SMA (i.e. fully constrained and unconstrained SMA) and machine learning (i.e. Cubist regression tree and Random Forests) techniques. Analysis of results suggests three major conclusions. First, with the extrapolated endmember spectra from stratified random training samples, the SMA approaches performed relatively well, as indicated by small MAE values. Second, Random Forests yields more reliable results than Cubist regression tree, and its accuracy is improved with increased sample sizes. Finally, comparative analyses suggest a tentative guide for selecting an optimal approach for large-scale fractional imperviousness estimation: unconstrained SMA might be a favorable option with a small number of samples, while Random Forests might be preferred if a large number of samples are available.
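    The least squares solution (LSS) step for endmember signatures reduces to one matrix least-squares solve when the sample abundances are known; a minimal sketch (illustrative reconstruction, not the authors' code):

    ```python
    import numpy as np

    def estimate_endmembers(abundances, spectra):
        # Solve A @ E ~= X for the endmember matrix E (endmembers x bands),
        # given sample pixels with known abundances A (pixels x endmembers)
        # and observed spectra X (pixels x bands).
        E, *_ = np.linalg.lstsq(abundances, spectra, rcond=None)
        return E

    def unmix(pixel, endmembers):
        # Unconstrained SMA for a new pixel: solve f @ E ~= pixel for the
        # fraction vector f (no sum-to-one or non-negativity constraints).
        f, *_ = np.linalg.lstsq(endmembers.T, pixel, rcond=None)
        return f
    ```

    Deriving endmembers from training samples this way sidesteps the usual difficulty of picking pure-pixel spectra from coarse, heterogeneous MODIS imagery.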

  7. Sample size re-estimation and other midcourse adjustments with sequential parallel comparison design.

    PubMed

    Silverman, Rachel K; Ivanova, Anastasia

    2017-01-01

    Sequential parallel comparison design (SPCD) was proposed to reduce placebo response in a randomized trial with a placebo comparator. Subjects are randomized between placebo and drug in stage 1 of the trial, and placebo non-responders are then re-randomized in stage 2. The efficacy analysis includes all data from stage 1 and all placebo non-responding subjects from stage 2. This article investigates the possibility of re-estimating the sample size and adjusting the design parameters (the allocation proportion to placebo in stage 1 of SPCD and the weight of stage 1 data in the overall efficacy test statistic) during an interim analysis.

  8. Exercise as Treatment for Anxiety: Systematic Review and Analysis

    PubMed Central

    Stonerock, Gregory L.; Hoffman, Benson M.; Smith, Patrick J.; Blumenthal, James A.

    2015-01-01

    Background Exercise has been shown to reduce symptoms of anxiety, but few trials have examined exercise in individuals pre-selected for high anxiety. Purpose To review and critically evaluate studies of exercise training in adults with either high levels of anxiety or an anxiety disorder. Methods We conducted a systematic review of randomized clinical trials (RCTs) in which anxious adults were randomized to an exercise or non-exercise control condition. Data were extracted concerning anxiety outcomes and study design. Existing meta-analyses were also reviewed. Results Evidence from 12 RCTs suggested benefits of exercise, for select groups, similar to established treatments and greater than placebo. However, most studies had significant methodological limitations, including small sample sizes, concurrent therapies, and inadequate assessment of adherence and fitness levels. Conclusions Exercise may be a useful treatment for anxiety, but the lack of data from rigorous, methodologically sound RCTs precludes any definitive conclusions about its effectiveness. PMID:25697132

  9. Compact Quantum Random Number Generator with Silicon Nanocrystals Light Emitting Device Coupled to a Silicon Photomultiplier

    NASA Astrophysics Data System (ADS)

    Bisadi, Zahra; Acerbi, Fabio; Fontana, Giorgio; Zorzi, Nicola; Piemonte, Claudio; Pucker, Georg; Pavesi, Lorenzo

    2018-02-01

    A small-sized photonic quantum random number generator, easy to implement in small electronic devices for secure data encryption and other applications, is in high demand nowadays. Here, we propose a compact configuration with a silicon-nanocrystal large-area light-emitting device (LED) coupled to a silicon photomultiplier to generate random numbers. The random number generation methodology is based on photon arrival times and is robust against the non-idealities of the detector and of the source of quantum entropy. The raw data show a high quality of randomness and pass all the statistical tests of the National Institute of Standards and Technology (NIST) suite without any post-processing algorithm. The highest bit rate is 0.5 Mbps, with an efficiency of 4 bits per detected photon.
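    The abstract does not spell out the authors' exact bit-extraction scheme; as a hedged illustration of the arrival-time idea, one common approach compares consecutive inter-arrival intervals, which stays unbiased even if the detection rate drifts slowly. The rate below is an arbitrary simulated value, not a measured one:

```python
# Hedged illustration (not the authors' scheme): derive bits from simulated
# photon arrival times by comparing consecutive inter-arrival intervals.
import random

def interarrival_bits(mean_rate, n_pairs, seed=1):
    rng = random.Random(seed)
    bits = []
    for _ in range(n_pairs):
        t1 = rng.expovariate(mean_rate)   # simulated inter-arrival times
        t2 = rng.expovariate(mean_rate)
        if t1 != t2:                      # discard (vanishingly rare) ties
            bits.append(0 if t1 < t2 else 1)
    return bits

bits = interarrival_bits(mean_rate=5e5, n_pairs=10000)
ones = sum(bits) / len(bits)   # should sit near 0.5
```

    The comparison discards the rate itself, which is why slow drifts in LED brightness or detector efficiency do not bias the output.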

  10. Iodine and mental development of children 5 years old and under: a systematic review and meta-analysis.

    PubMed

    Bougma, Karim; Aboud, Frances E; Harding, Kimberly B; Marquis, Grace S

    2013-04-22

    Several reviews and meta-analyses have examined the effects of iodine on mental development. None focused on young children, so they were incomplete in summarizing the effects on this important age group. The current systematic review therefore examined the relationship between iodine and the mental development of children 5 years old and under. A systematic review of articles using Medline (1980-November 2011) was carried out. We organized studies according to four designs: (1) randomized controlled trial with iodine supplementation of mothers; (2) non-randomized trial with iodine supplementation of mothers and/or infants; (3) prospective cohort study stratified by pregnant women's iodine status; (4) prospective cohort study stratified by newborn iodine status. Average effect sizes for these four designs were 0.68 (2 RCT studies), 0.46 (8 non-RCT studies), 0.52 (9 cohorts stratified by mothers' iodine status), and 0.54 (4 cohorts stratified by infants' iodine status). This translates into 6.9 to 10.2 IQ points lower in iodine-deficient children compared with iodine-replete children. Thus, regardless of study design, iodine deficiency had a substantial impact on mental development. Methodological concerns included weak study designs, the omission of important confounders, small sample sizes, the lack of cluster analyses, and the lack of separate analyses of verbal and non-verbal subtests. Quantifying more precisely the contribution of iodine deficiency to delayed mental development in young children requires additional well-designed randomized controlled trials, including ones on the role of iodized salt.
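    The abstract's conversion from standardized effect sizes to IQ points follows from the conventional IQ standard deviation of 15 (an assumption made explicit here, since the abstract states only the result):

```python
# Arithmetic behind the reported 6.9-10.2 IQ-point range: a standardized
# effect size d maps to IQ points via d * 15 (conventional IQ SD, assumed).
IQ_SD = 15
effect_sizes = {"RCT": 0.68, "non-RCT": 0.46,
                "cohort (mothers)": 0.52, "cohort (infants)": 0.54}
iq_points = {k: round(d * IQ_SD, 1) for k, d in effect_sizes.items()}
# the smallest and largest effects bracket the reported range
lo, hi = min(iq_points.values()), max(iq_points.values())
```

    The smallest and largest average effect sizes (0.46 and 0.68) reproduce the reported 6.9 to 10.2 point range.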

  11. Urine sampling techniques in symptomatic primary-care patients: a diagnostic accuracy review.

    PubMed

    Holm, Anne; Aabenhus, Rune

    2016-06-08

    Choice of urine sampling technique in urinary tract infection may impact diagnostic accuracy and thus lead to possible over- or undertreatment. Currently no evidence-based consensus exists regarding the correct technique for sampling urine from women with symptoms of urinary tract infection in primary care. The aim of this study was to determine the accuracy of urine culture obtained by different sampling techniques in symptomatic non-pregnant women in primary care. A systematic review was conducted by searching Medline and Embase for clinical studies conducted in primary care using a randomized or paired design to compare the results of urine culture obtained with two or more collection techniques in adult, female, non-pregnant patients with symptoms of urinary tract infection. We evaluated the quality of the studies and compared accuracy based on dichotomized outcomes. We included seven studies investigating urine sampling technique in 1062 symptomatic patients in primary care. Mid-stream clean-catch had a positive predictive value of 0.79 to 0.95 and a negative predictive value close to 1 compared to sterile techniques. Two randomized controlled trials found no difference in infection rate between mid-stream clean-catch, mid-stream urine and random samples. At present, no evidence suggests that sampling technique affects the accuracy of the microbiological diagnosis in non-pregnant women with symptoms of urinary tract infection in primary care. However, the evidence presented is indirect, and the difference between mid-stream clean-catch, mid-stream urine and random samples remains to be investigated in a paired design to verify the present findings.

  12. A cluster randomized trial of strategies to increase uptake amongst young women invited for their first cervical screen: The STRATEGIC trial.

    PubMed

    Kitchener, H; Gittins, M; Cruickshank, M; Moseley, C; Fletcher, S; Albrow, R; Gray, A; Brabin, L; Torgerson, D; Crosbie, E J; Sargent, A; Roberts, C

    2018-06-01

    Objectives To measure the feasibility and effectiveness of interventions to increase cervical screening uptake amongst young women. Methods A two-phase cluster randomized trial conducted in general practices in the NHS Cervical Screening Programme. In Phase 1, women in practices randomized to intervention due for their first invitation to cervical screening received a pre-invitation leaflet and, separately, access to online booking. In Phase 2, non-attenders at six months were randomized to one of: vaginal self-sample kits sent unrequested or offered; timed appointments; nurse navigator; or the choice between nurse navigator or self-sample kits. The primary outcome was the uplift in uptake in intervention vs. control practices at 3 and 12 months post-invitation. Results Phase 1 randomized 20,879 women. Neither the pre-invitation leaflet nor online booking increased screening uptake by three months (18.8% pre-invitation leaflet vs. 19.2% control and 17.8% online booking vs. 17.2% control). Uptake was higher amongst human papillomavirus vaccinees at three months (OR 2.07, 95% CI 1.69-2.53, p < 0.001). Phase 2 randomized 10,126 non-attenders, with 32-34 clusters for each intervention and 100 clusters as controls. Sending self-sample kits increased uptake at 12 months (OR 1.51, 95% CI 1.20-1.91, p = 0.001), as did timed appointments (OR 1.41, 95% CI 1.14-1.74, p = 0.001). The offer of a nurse navigator, self-sample kits on request, and the choice between timed appointments and a nurse navigator were ineffective. Conclusions Amongst non-attenders, self-sample kits sent unrequested and timed appointments achieved an uplift in screening over the short term; the longer term impact is less certain. Prior human papillomavirus vaccination was associated with increased screening uptake.

  13. Linking species richness curves from non-contiguous sampling to contiguous-nested SAR: An empirical study

    NASA Astrophysics Data System (ADS)

    Lazarina, Maria; Kallimanis, Athanasios S.; Pantis, John D.; Sgardelis, Stefanos P.

    2014-11-01

    The species-area relationship (SAR) is one of the few generalizations in ecology. However, many different relationships are denoted as SARs. Here, we empirically evaluated the differences between SARs derived from nested-contiguous and non-contiguous sampling designs, using plant, bird and butterfly datasets from Great Britain, Greece, Massachusetts, New York and San Diego. The shape of the SAR depends on the sampling scheme, but there is little empirical documentation on the magnitude of the deviation between different types of SARs and the factors affecting it. We implemented a strictly nested sampling design to construct the nested-contiguous SAR (SACR), and systematic nested but non-contiguous, and random designs to construct non-contiguous species richness curves (SASRs for systematic and SACs for random designs) per dataset. The SACR lay below any SASR and most of the SACs. The deviation between them was related to the exponent f of the power law relationship between sampled area and extent. The lower the exponent f, the higher the deviation between the curves. We linked SACR to SASR and SAC through the concept of "effective" area (Ae), i.e. the nested-contiguous area containing the same number of species as the accumulated sampled area (AS) of a non-contiguous sampling. The relationship between effective and sampled area was modeled as log(Ae) = klog(AS). A Generalized Linear Model was used to estimate the values of k from sampling design and dataset properties. The parameter k increased with the average distance between samples and with beta diversity, while it decreased with f. For both systematic and random sampling, the model performed well in predicting effective area in both the training set and the test set, which was fully independent of the training set. Through effective area, we can link different types of species richness curves based on sampling design properties, sampling effort, spatial scale and beta diversity patterns.
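    The effective-area link above, log(Ae) = k·log(AS), is equivalent to Ae = AS^k. The values below are hypothetical, since in the study k is estimated from sampling-design and dataset properties:

```python
# Minimal sketch of the effective-area relationship: log(Ae) = k * log(AS),
# i.e. Ae = AS ** k. The k and AS values are hypothetical illustrations.
import math

def effective_area(sampled_area, k):
    # log(Ae) = k * log(AS)  =>  Ae = AS ** k
    return sampled_area ** k

# With k > 1 (non-contiguous samples accumulate species faster than a
# contiguous block), matching their richness takes a larger
# nested-contiguous ("effective") area.
AS = 100.0
Ae = effective_area(AS, k=1.2)
assert math.isclose(math.log(Ae), 1.2 * math.log(AS))
```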

  14. Conic Sampling: An Efficient Method for Solving Linear and Quadratic Programming by Randomly Linking Constraints within the Interior

    PubMed Central

    Serang, Oliver

    2012-01-01

    Linear programming (LP) problems are commonly used in analysis and resource allocation, frequently surfacing as approximations to more difficult problems. Existing approaches to LP have been dominated by a small group of methods, and randomized algorithms have not enjoyed popularity in practice. This paper introduces a novel randomized method of solving LP problems by moving along the facets and within the interior of the polytope along rays randomly sampled from the polyhedral cones defined by the bounding constraints. This conic sampling method is then applied to randomly sampled LPs, and its runtime performance is shown to compare favorably to the simplex and primal affine-scaling algorithms, especially on polytopes with certain characteristics. The conic sampling method is then adapted and applied to solve a certain quadratic program, which computes a projection onto a polytope; the proposed method is shown to outperform the proprietary software Mathematica on large, sparse QP problems constructed from mass spectrometry-based proteomics. PMID:22952741

  15. Prevention and management of non-steroidal anti-inflammatory drugs-induced small intestinal injury

    PubMed Central

    Park, Sung Chul; Chun, Hoon Jai; Kang, Chang Don; Sul, Donggeun

    2011-01-01

    Non-steroidal anti-inflammatory drug (NSAID)-induced small bowel injury is a topic that deserves attention since the advent of capsule endoscopy and balloon enteroscopy. NSAID enteropathy is common and is mostly asymptomatic. However, massive bleeding, stricture, or perforation may occur. The pathogenesis of small intestine injury by NSAIDs is complex and different from that of the upper gastrointestinal tract. No drug has yet been developed that can completely prevent or treat NSAID enteropathy. Therefore, a long-term randomized study in chronic NSAID users is needed. PMID:22180706

  16. Accounting for Sampling Error in Genetic Eigenvalues Using Random Matrix Theory.

    PubMed

    Sztepanacz, Jacqueline L; Blows, Mark W

    2017-07-01

    The distribution of genetic variance in multivariate phenotypes is characterized by the empirical spectral distribution of the eigenvalues of the genetic covariance matrix. Empirical estimates of genetic eigenvalues from random effects linear models are known to be overdispersed by sampling error, where large eigenvalues are biased upward, and small eigenvalues are biased downward. The overdispersion of the leading eigenvalues of sample covariance matrices has been demonstrated to conform to the Tracy-Widom (TW) distribution. Here we show that genetic eigenvalues estimated using restricted maximum likelihood (REML) in a multivariate random effects model with an unconstrained genetic covariance structure will also conform to the TW distribution after empirical scaling and centering. However, where estimation procedures using either REML or MCMC impose boundary constraints, the resulting genetic eigenvalues tend not to be TW distributed. We show how using confidence intervals from sampling distributions of genetic eigenvalues without reference to the TW distribution is insufficient protection against mistaking sampling error for genetic variance, particularly when eigenvalues are small. By scaling such sampling distributions to the appropriate TW distribution, the critical value of the TW statistic can be used to determine if the magnitude of a genetic eigenvalue exceeds the sampling error for each eigenvalue in the spectral distribution of a given genetic covariance matrix. Copyright © 2017 by the Genetics Society of America.
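    The overdispersion described above is easy to reproduce with a toy simulation (a generic sample-covariance illustration, not the REML genetic analysis of the paper): for data whose true covariance is the 2x2 identity, every population eigenvalue equals 1, yet the sample eigenvalues spread around it.

```python
# Toy demonstration of eigenvalue overdispersion from sampling error alone:
# true covariance is the identity (both eigenvalues = 1), but the sample
# covariance's leading eigenvalue is biased upward and the smallest downward.
import random

def sample_cov_eigs_2d(n, rng):
    xs = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(n)]
    mx = sum(x for x, _ in xs) / n
    my = sum(y for _, y in xs) / n
    sxx = sum((x - mx) ** 2 for x, _ in xs) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in xs) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in xs) / (n - 1)
    # closed-form eigenvalues of the 2x2 sample covariance matrix
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    disc = (tr ** 2 / 4 - det) ** 0.5
    return tr / 2 + disc, tr / 2 - disc

rng = random.Random(0)
trials = [sample_cov_eigs_2d(25, rng) for _ in range(300)]
mean_top = sum(t for t, _ in trials) / len(trials)
mean_bot = sum(b for _, b in trials) / len(trials)
```

    Averaged over trials, the leading eigenvalue sits above 1 and the smallest below it, purely from sampling error.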

  17. Recruitment for Occupational Research: Using Injured Workers as the Point of Entry into Workplaces

    PubMed Central

    Koehoorn, Mieke; Trask, Catherine M.; Teschke, Kay

    2013-01-01

    Objective To investigate the feasibility, costs and sample representativeness of a recruitment method that used workers with back injuries as the point of entry into diverse working environments. Methods Workers' compensation claims were used to randomly sample workers from five heavy industries and to recruit their employers for ergonomic assessments of the injured worker and up to 2 co-workers. Results The final study sample included 54 workers from the workers’ compensation registry and 72 co-workers. This sample of 126 workers was based on an initial random sample of 822 workers with a compensation claim, or a ratio of 1 recruited worker to approximately 7 sampled workers. The average recruitment cost was CND$262/injured worker and CND$240/participating worksite including co-workers. The sample was representative of the heavy industry workforce, and was successful in recruiting the self-employed (8.2%), workers from small employers (<20 workers, 38.7%), and workers from diverse working environments (49 worksites, 29 worksite types, and 51 occupations). Conclusions The recruitment rate was low but the cost per participant reasonable and the sample representative of workers in small worksites. Small worksites represent a significant portion of the workforce but are typically underrepresented in occupational research despite having distinct working conditions, exposures and health risks worthy of investigation. PMID:23826387

  18. Recording 2-D Nutation NQR Spectra by Random Sampling Method

    PubMed Central

    Sinyavsky, Nikolaj; Jadzyn, Maciej; Ostafin, Michal; Nogaj, Boleslaw

    2010-01-01

    The method of random sampling was introduced for the first time in the nutation nuclear quadrupole resonance (NQR) spectroscopy where the nutation spectra show characteristic singularities in the form of shoulders. The analytic formulae for complex two-dimensional (2-D) nutation NQR spectra (I = 3/2) were obtained and the condition for resolving the spectral singularities for small values of an asymmetry parameter η was determined. Our results show that the method of random sampling of a nutation interferogram allows significant reduction of time required to perform a 2-D nutation experiment and does not worsen the spectral resolution. PMID:20949121

  19. A weighted ℓ1-minimization approach for sparse polynomial chaos expansions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peng, Ji; Hampton, Jerrad; Doostan, Alireza, E-mail: alireza.doostan@colorado.edu

    2014-06-15

    This work proposes a method for sparse polynomial chaos (PC) approximation of high-dimensional stochastic functions based on non-adapted random sampling. We modify the standard ℓ1-minimization algorithm, originally proposed in the context of compressive sampling, using a priori information about the decay of the PC coefficients, when available, and refer to the resulting algorithm as weighted ℓ1-minimization. We provide conditions under which we may guarantee recovery using this weighted scheme. Numerical tests are used to compare the weighted and non-weighted methods for the recovery of solutions to two differential equations with high-dimensional random inputs: a boundary value problem with a random elliptic operator and a 2-D thermally driven cavity flow with a random boundary condition.
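    As a hedged sketch of the idea (not the paper's algorithm or recovery conditions), weighting the ℓ1 penalty with a priori coefficient-decay information can be illustrated with a weighted LASSO solved by iterative soft-thresholding. The matrix, weights, and penalty below are illustrative assumptions, with small weights marking coefficients believed a priori to be large:

```python
# Weighted-l1 sparse recovery sketch via iterative soft-thresholding (ISTA):
# minimize 0.5*||Ax - b||^2 + lam * sum_j w_j * |x_j|. Small weights on
# indices 0 and 3 encode (assumed) prior knowledge that those coefficients
# decay slowly, i.e. are likely large.
import random

rng = random.Random(3)
m, n = 4, 6
A = [[rng.gauss(0, 1) for _ in range(n)] for _ in range(m)]
x_true = [1.5, 0.0, 0.0, -1.0, 0.0, 0.0]
b = [sum(A[i][j] * x_true[j] for j in range(n)) for i in range(m)]

w = [0.1, 1.0, 1.0, 0.1, 1.0, 1.0]      # a priori weights on |x_j|
lam, steps = 0.02, 20000
eta = 1.0 / sum(A[i][j] ** 2 for i in range(m) for j in range(n))  # safe step

def soft(v, t):
    # soft-thresholding operator, the proximal map of t * |.|
    return (abs(v) - t) * (1 if v > 0 else -1) if abs(v) > t else 0.0

x = [0.0] * n
for _ in range(steps):
    r = [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(m)]
    g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
    x = [soft(x[j] - eta * g[j], eta * lam * w[j]) for j in range(n)]
```

    The low weights on indices 0 and 3 make it cheap for mass to land there, steering the underdetermined recovery toward the anticipated support.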

  20. Parental emotional management benefits family relationships: A randomized controlled trial in Hong Kong, China.

    PubMed

    Fabrizio, Cecilia S; Lam, Tai Hing; Hirschmann, Malia R; Pang, Irene; Yu, Nancy Xiaonan; Wang, Xin; Stewart, Sunita M

    2015-08-01

    There is a shortage of culturally appropriate, brief, preventive interventions designed to be sustainable and acceptable for community participants in non-western cultures. Parents' ability to regulate their emotions is an important factor in the psychological well-being of the family. In Chinese societies, emotional regulation may be more important in light of the cultural desirability of maintaining harmonious family relationships. The objectives of our randomized controlled trial were to test the effectiveness of our Effective Parenting Programme (EPP) in increasing the use of emotional management strategies (primary outcome) and enhancing the parent-child relationship (secondary outcome). We utilized design characteristics that promoted recruitment, retention, and intervention sustainability. We randomized a community sample of 412 Hong Kong middle- and low-income mothers of children aged 6-8 years to the EPP or an attention control group. At 3-, 6- and 12-month follow-up, the EPP group reported greater increases in the use of emotion management strategies during parent-child interactions, as well as lower negative affect and greater positive affect, subjective happiness, satisfaction with the parent-child relationship, and family harmony, compared to the control group, with small to medium effect sizes. Our results provided evidence of effectiveness for a sustainable, preventive, culturally appropriate, cognitive behaviorally-based emotion management program, in a non-clinical setting for Chinese mothers. HKCTR-1190. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. A comprehensive algorithm for determining whether a run-in strategy will be a cost-effective design modification in a randomized clinical trial.

    PubMed

    Schechtman, K B; Gordon, M E

    1993-01-30

    In randomized clinical trials, poor compliance and treatment intolerance lead to reduced between-group differences, increased sample size requirements, and increased cost. A run-in strategy is intended to reduce these problems. In this paper, we develop a comprehensive set of measures specifically sensitive to the effect of a run-in on cost and sample size requirements, both before and after randomization. Using these measures, we describe a step-by-step algorithm through which one can estimate the cost-effectiveness of a potential run-in. Because the cost-effectiveness of a run-in is partly mediated by its effect on sample size, we begin by discussing the likely impact of a planned run-in on the required number of randomized, eligible, and screened subjects. Run-in strategies are most likely to be cost-effective when: (1) per-patient costs during the post-randomization period, as compared to the screening period, are high; (2) poor compliance is associated with a substantial reduction in response to treatment; (3) the number of screened patients needed to identify a single eligible patient is small; (4) the run-in is inexpensive; (5) for most patients, run-in compliance status is maintained following randomization; and, most importantly, (6) many subjects excluded by the run-in are treatment intolerant or non-compliant to the extent that little or no treatment response is expected. Our analysis suggests that the conditions for the cost-effectiveness of run-in strategies are stringent. In particular, if the only purpose of a run-in is to exclude ordinary partial compliers, the run-in will frequently add to the cost of the trial. Often, the cost-effectiveness of a run-in requires that one can identify and exclude a substantial number of treatment intolerant or otherwise unresponsive subjects.

  2. Reaching Asian Americans: sampling strategies and incentives.

    PubMed

    Lee, Soo-Kyung; Cheng, Yu-Yao

    2006-07-01

    Reaching and recruiting representative samples of minority populations is often challenging. This study examined, in Chinese and Korean Americans: 1) whether using two different sampling strategies (random sampling vs. convenience sampling) significantly affected the characteristics of recruited participants and 2) whether providing different incentives in the mail survey produced different response rates. We found statistically significant, though mostly modest, differences between the random and convenience samples. Offering monetary incentives in the mail survey improved response rates among Chinese Americans, while offering a small gift did not improve response rates among either Chinese or Korean Americans. This information will be useful for researchers and practitioners working with Asian Americans.

  3. Analgesic effects of treatments for non-specific low back pain: a meta-analysis of placebo-controlled randomized trials.

    PubMed

    Machado, L A C; Kamper, S J; Herbert, R D; Maher, C G; McAuley, J H

    2009-05-01

    Estimates of treatment effects reported in placebo-controlled randomized trials are less subject to bias than estimates provided by other study designs. The objective of this meta-analysis was to estimate the analgesic effects of treatments for non-specific low back pain reported in placebo-controlled randomized trials. Medline, Embase, Cinahl, PsycINFO and the Cochrane Central Register of Controlled Trials databases were searched for eligible trials from the earliest records to November 2006. Continuous pain outcomes were converted to a common 0-100 scale and pooled using a random effects model. A total of 76 trials reporting on 34 treatments were included. Fifty percent of the investigated treatments had statistically significant effects, but for most the effects were small or moderate: 47% had point estimates of effects of <10 points on the 100-point scale, 38% had point estimates from 10 to 20 points and 15% had point estimates of >20 points. Treatments reported to have large effects (>20 points) had been investigated only in a single trial. This meta-analysis revealed that the analgesic effects of many treatments for non-specific low back pain are small and that they do not differ in populations with acute or chronic symptoms.

  4. Small-Sample Adjustments for Tests of Moderators and Model Fit in Robust Variance Estimation in Meta-Regression

    ERIC Educational Resources Information Center

    Tipton, Elizabeth; Pustejovsky, James E.

    2015-01-01

    Randomized experiments are commonly used to evaluate the effectiveness of educational interventions. The goal of the present investigation is to develop small-sample corrections for multiple contrast hypothesis tests (i.e., F-tests) such as the omnibus test of meta-regression fit or a test for equality of three or more levels of a categorical…

  5. Toward cost-efficient sampling methods

    NASA Astrophysics Data System (ADS)

    Luo, Peng; Li, Yongli; Wu, Chong; Zhang, Guijie

    2015-09-01

    The sampling method has received much attention in the field of complex networks in general and statistical physics in particular. This paper proposes two new sampling methods based on the idea that a small number of vertices with high node degree can carry most of the structural information of a complex network. The two proposed sampling methods are efficient in sampling high-degree nodes, so they remain useful even when the sampling rate is low, which makes them cost-efficient. The first new sampling method builds on the widely used stratified random sampling (SRS) method, and the second improves the well-known snowball sampling (SBS) method. To demonstrate the validity and accuracy of the two new sampling methods, we compare them with existing sampling methods in three commonly used simulated networks (scale-free, random, and small-world) and in two real networks. The experimental results illustrate that the two proposed sampling methods perform much better than the existing sampling methods in recovering the true network structure characteristics reflected by clustering coefficient, Bonacich centrality and average path length, especially when the sampling rate is low.
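    The premise that a few high-degree vertices carry most of the structural information can be illustrated with a toy degree-proportional sampler (a hypothetical simplification, not the paper's SRS- or SBS-based methods). The graph below is an invented star-plus-ring example:

```python
# Toy sketch: sample vertices with probability proportional to degree, so
# hubs (which anchor most edges) are over-represented in the sample.
import random

graph = {0: [1, 2, 3, 4, 5, 6],          # hub
         1: [0, 2], 2: [0, 1, 3], 3: [0, 2, 4],
         4: [0, 3, 5], 5: [0, 4, 6], 6: [0, 5]}
degree = {v: len(nbrs) for v, nbrs in graph.items()}

def degree_biased_sample(k, rng):
    nodes = list(graph)
    weights = [degree[v] for v in nodes]
    return rng.choices(nodes, weights=weights, k=k)  # with replacement

rng = random.Random(7)
sample = degree_biased_sample(1000, rng)
hub_share = sample.count(0) / len(sample)   # hub holds 6 of 22 endpoints
```

    On this 7-node graph the hub holds 6 of the 22 edge endpoints, so degree-biased sampling selects it about 27% of the time, versus about 14% under uniform sampling.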

  6. Small non-coding RNA profiling in human biofluids and surrogate tissues from healthy individuals: description of the diverse and most represented species.

    PubMed

    Ferrero, Giulio; Cordero, Francesca; Tarallo, Sonia; Arigoni, Maddalena; Riccardo, Federica; Gallo, Gaetano; Ronco, Guglielmo; Allasia, Marco; Kulkarni, Neha; Matullo, Giuseppe; Vineis, Paolo; Calogero, Raffaele A; Pardini, Barbara; Naccarati, Alessio

    2018-01-09

    The role of non-coding RNAs in different biological processes and diseases is continuously expanding. Next-generation sequencing, together with the parallel improvement of bioinformatics analyses, allows the accurate detection and quantification of an increasing number of RNA species. With the aim of exploring new potential biomarkers for disease classification, a clear overview of the expression levels of common/unique small RNA species among different biospecimens is necessary. However, except for miRNAs in plasma, there are no substantial indications about the pattern of expression of various small RNAs in multiple specimens among healthy humans. By analysing small RNA-sequencing data from 243 samples, we have identified and compared the most abundantly and uniformly expressed miRNAs and non-miRNA species of a size compatible with the library preparation in four different specimens (plasma exosomes, stool, urine, and cervical scrapes). Eleven miRNAs were commonly detected among all different specimens, while 231 miRNAs were unique to a single specimen type. Classification analysis using these miRNAs provided an accuracy of 99.6% in recognizing the sample types. piRNAs and tRNAs were the most represented non-miRNA small RNAs detected in all specimen types analysed, particularly in urine samples. With the present data, the most uniformly expressed small RNAs in each sample type were also identified. A signature of small RNAs for each specimen could represent a reference gene set in validation studies by RT-qPCR. Overall, the data reported hereby provide an insight into the constitution of the human miRNome and of other small non-coding RNAs in various specimens of healthy individuals.

  7. "HOOF-Print" Genotyping and Haplotype Inference Discriminates among Brucella spp Isolates From a Small Spatial Scale

    USDA-ARS?s Scientific Manuscript database

    We demonstrate that the “HOOF-Print” assay provides high power to discriminate among Brucella isolates collected on a small spatial scale (within Portugal). Additionally, we illustrate how haplotype identification using non-random association among markers allows resolution of B. melitensis biovars ...

  8. A Randomized Trial to Compare Alternative Educational Interventions to Increase Colorectal Cancer Screening in a Hard-to-Reach Urban Minority Population with Health Insurance.

    PubMed

    Basch, Charles E; Zybert, Patricia; Wolf, Randi L; Basch, Corey H; Ullman, Ralph; Shmukler, Celia; King, Fionnuala; Neugut, Alfred I; Shea, Steven

    2015-10-01

    This randomized controlled trial assessed different educational approaches to increasing colorectal cancer screening uptake in a sample of primarily non-US-born urban minority individuals over age 50, with health insurance, and out of compliance with screening guidelines. In one group, participants were mailed printed educational material (n = 180); in a second, participants' primary care physicians received academic detailing to improve screening referral and follow-up practices (n = 185); in a third, physicians received academic detailing and participants received tailored telephone education (n = 199). Overall, 21.5% of participants (n = 121) received appropriate screening within one year of randomization. There were no statistically significant pairwise differences between groups in screening rate. Among those 60 years of age or older, however, the detailing plus telephone education group had a higher screening rate than the print group (27.3 vs. 7.7%, p = .02). Different kinds of interventions will be required to increase colorectal cancer screening among the increasingly small population segment that remains unscreened. ClinicalTrials.gov Identifier: NCT02392143.

  9. Panels of tumor-derived RNA markers in peripheral blood of patients with non-small cell lung cancer: their dependence on age, gender and clinical stages.

    PubMed

    Chian, Chih-Feng; Hwang, Yi-Ting; Terng, Harn-Jing; Lee, Shih-Chun; Chao, Tsui-Yi; Chang, Hung; Ho, Ching-Liang; Wu, Yi-Ying; Perng, Wann-Cherng

    2016-08-02

    Peripheral blood mononuclear cell (PBMC)-derived gene signatures were investigated for their potential use in the early detection of non-small cell lung cancer (NSCLC). In our study, 187 patients with NSCLC and 310 age- and gender-matched controls, and an independent set containing 29 patients for validation, were included. Eight significant NSCLC-associated genes were identified, including DUSP6, EIF2S3, GRB2, MDM2, NF1, POLDIP2, RNF4, and WEE1. The logistic model containing these significant markers was able to distinguish subjects with NSCLC from controls with an excellent performance: 80.7% sensitivity, 90.6% specificity, and an area under the receiver operating characteristic curve (AUC) of 0.924. Repeated random sub-sampling (100 times) was used to validate the performance of the classification training models, with an average AUC of 0.92. Additional cross-validation using the independent set resulted in a sensitivity of 75.86%. Furthermore, six age/gender-dependent genes were identified using an age and gender stratification approach: CPEB4, EIF2S3, GRB2, MCM4, RNF4, and STAT2. STAT2 and WEE1 were identified as stage-dependent using the stage-stratified subpopulation. We conclude that these logistic models using different signatures for total and stratified samples are potential complementary tools for assessing the risk of NSCLC.

  10. A Preliminary Comparison of Motor Learning Across Different Non-invasive Brain Stimulation Paradigms Shows No Consistent Modulations.

    PubMed

    Lopez-Alonso, Virginia; Liew, Sook-Lei; Fernández Del Olmo, Miguel; Cheeran, Binith; Sandrini, Marco; Abe, Mitsunari; Cohen, Leonardo G

    2018-01-01

    Non-invasive brain stimulation (NIBS) has been widely explored as a way to safely modulate brain activity and alter human performance for nearly three decades. Research using NIBS has grown exponentially within the last decade with promising results across a variety of clinical and healthy populations. However, recent work has shown high inter-individual variability and a lack of reproducibility of previous results. Here, we conducted a small preliminary study to explore the effects of three of the most commonly used excitatory NIBS paradigms over the primary motor cortex (M1) on motor learning (Sequential Visuomotor Isometric Pinch Force Tracking Task) and secondarily relate changes in motor learning to changes in cortical excitability (MEP amplitude and SICI). We compared anodal transcranial direct current stimulation (tDCS), paired associative stimulation (PAS25), and intermittent theta burst stimulation (iTBS), along with a sham tDCS control condition. Stimulation was applied prior to motor learning. Participants (n = 28) were randomized into one of the four groups and were trained on a skilled motor task. Motor learning was measured immediately after training (online), 1 day after training (consolidation), and 1 week after training (retention). We did not find consistent differential effects on motor learning or cortical excitability across groups. Within the boundaries of our small sample sizes, we then assessed effect sizes across the NIBS groups that could help power future studies. These results, which require replication with larger samples, are consistent with previous reports of small and variable effect sizes of these interventions on motor learning.

  12. Analysis of correlation factors and pregnancy outcomes of hypertensive disorders of pregnancy - a secondary analysis of a random sampling in Beijing, China.

    PubMed

    Zhu, Yu-Chun; Yang, Hui-Xia; Wei, Yu-Mei; Zhu, Wei-Wei; Meng, Wen-Ying; Wang, Yong-Qing; Shang, Li-Xin; Cai, Zhen-Yu; Ji, Li-Ping; Wang, Yun-Feng; Sun, Ying; Liu, Jia-Xiu; Wei, Li; Sun, Yu-Feng; Zhang, Xue-Ying; Luo, Tian-Xia; Chen, Hai-Xia; Yu, Li-Jun

    2017-03-01

    We aimed to assess the prevalence and risk factors for hypertensive disorders and to study the main pregnancy outcomes in the Beijing area of China. This study randomly sampled 15 hospitals in Beijing from Jun 2013 to Nov 2013 and evaluated 15 194 deliveries. Logistic regression analysis was used to study the association between risk factors and hypertensive disorders. Pregnancy outcomes included preterm birth, cesarean delivery and small for gestational age (SGA). The prevalence of hypertensive disorders, preeclampsia (PE) and severe PE was 4.4, 2.7 and 1.8%, respectively. The risk factors for hypertensive disorders and severe PE were maternal body mass index before pregnancy, gestational weight gain (GWG), gestational diabetes and pre-gestational diabetes, and third trimester cholesterol (CHOL) levels. First trimester high-density lipoprotein was a protective factor for severe PE. The incidence of hypertensive disorders increased with maternal age. Preterm delivery, cesarean delivery and SGA infants were more prevalent in the severe PE group than in the non-hypertensive group. In the Beijing area of China, maternal body mass index before pregnancy, GWG, maternal complications of gestational diabetes and pre-gestational diabetes, and third trimester CHOL levels are risk factors for both hypertensive disorders of pregnancy and severe PE. First trimester high-density lipoprotein is a protective factor for severe PE. Severe preeclampsia leads to a higher incidence of preterm delivery, cesarean delivery and SGA infants.

  13. Mindfulness-based stress reduction and cognitive-behavioral therapy for chronic low back pain: similar effects on mindfulness, catastrophizing, self-efficacy, and acceptance in a randomized controlled trial

    PubMed Central

    Turner, Judith A.; Anderson, Melissa L.; Balderson, Benjamin H.; Cook, Andrea J.; Sherman, Karen J.; Cherkin, Daniel C.

    2016-01-01

    Cognitive-behavioral therapy (CBT) is believed to improve chronic pain problems by decreasing patient catastrophizing and increasing patient self-efficacy for managing pain. Mindfulness-based stress reduction (MBSR) is believed to benefit chronic pain patients by increasing mindfulness and pain acceptance. However, little is known about how these therapeutic mechanism variables relate to each other or whether they are differentially impacted by MBSR versus CBT. In a randomized controlled trial comparing MBSR, CBT, and usual care (UC) for adults aged 20-70 years with chronic low back pain (CLBP) (N = 342), we examined (1) baseline relationships among measures of catastrophizing, self-efficacy, acceptance, and mindfulness; and (2) changes on these measures in the 3 treatment groups. At baseline, catastrophizing was associated negatively with self-efficacy, acceptance, and 3 aspects of mindfulness (non-reactivity, non-judging, and acting with awareness; all P-values <0.01). Acceptance was associated positively with self-efficacy (P < 0.01) and mindfulness (P-values < 0.05) measures. Catastrophizing decreased slightly more post-treatment with MBSR than with CBT or UC (omnibus P = 0.002). Both treatments were effective compared with UC in decreasing catastrophizing at 52 weeks (omnibus P = 0.001). In both the entire randomized sample and the sub-sample of participants who attended ≥6 of the 8 MBSR or CBT sessions, differences between MBSR and CBT at up to 52 weeks were few, small in size, and of questionable clinical meaningfulness. The results indicate overlap across measures of catastrophizing, self-efficacy, acceptance, and mindfulness, and similar effects of MBSR and CBT on these measures among individuals with CLBP. PMID:27257859

  14. Representative Sampling: Follow-up of Spring 1972 and Spring 1973 Students. TEX-SIS FOLLOW-UP SC3.

    ERIC Educational Resources Information Center

    Wilkinson, Larry; And Others

    This report presents the findings of a research study, conducted by the College of the Mainland (COM) as a subcontractor for Project FOLLOW-UP, designed to test the accuracy of random sampling and to measure non-response bias in mail surveys. In 1975, a computer-generated random sample of 500 students was drawn from a population of 1,256 students…

  15. Health indicators: eliminating bias from convenience sampling estimators.

    PubMed

    Hedt, Bethany L; Pagano, Marcello

    2011-02-28

    Public health practitioners are often called upon to make inference about a health indicator for a population at large when the only available information is data gathered from a convenience sample, such as data gathered on visitors to a clinic. These data may be of the highest quality and quite extensive, but the biases inherent in a convenience sample preclude the legitimate use of powerful inferential tools that are usually associated with a random sample. In general, we know nothing about those who do not visit the clinic beyond the fact that they do not visit the clinic. An alternative is to take a random sample of the population. However, we show that this solution would be wasteful if it excluded the use of available information. Hence, we present a simple annealing methodology that combines a relatively small, and presumably far less expensive, random sample with the convenience sample. This allows us not only to take advantage of powerful inferential tools, but also to obtain more accurate information than is available from the random sample alone. Copyright © 2011 John Wiley & Sons, Ltd.
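    The abstract does not specify the annealing estimator itself, but the basic idea of combining the two samples can be illustrated with a simpler stratified estimator: use the small random sample to learn how much of the population the convenience mechanism covers, and the large convenience sample for precision within the covered stratum. The function name and data layout below are hypothetical.

```python
def combined_estimate(conv_positive, conv_n, random_records):
    """Stratified combination of a convenience sample (clinic visitors)
    with a small random sample of the whole population.

    conv_positive / conv_n: counts from the large convenience sample.
    random_records: (visits_clinic, has_indicator) pairs from the random
    sample, used to estimate (a) what share of the population the
    convenience mechanism covers and (b) the indicator prevalence among
    the uncovered remainder.
    """
    n = len(random_records)
    n_visit = sum(1 for visits, _ in random_records if visits)
    p_visit = n_visit / n                  # coverage of the convenience frame
    p_conv = conv_positive / conv_n        # precise estimate among visitors
    non_visitors = [(v, y) for v, y in random_records if not v]
    p_nonvisit = (sum(1 for _, y in non_visitors if y) / len(non_visitors)
                  if non_visitors else 0.0)
    return p_visit * p_conv + (1 - p_visit) * p_nonvisit

# Toy numbers: 60/100 positive among clinic visitors; the random sample
# says half the population visits, and 1 of 5 non-visitors is positive.
records = ([(True, True)] * 3 + [(True, False)] * 2 +
           [(False, True)] * 1 + [(False, False)] * 4)
print(round(combined_estimate(60, 100, records), 3))
```

    The convenience sample contributes its full precision only within the stratum it actually represents; the random sample fills in the rest, which is the intuition behind the paper's combined approach.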

  16. Results of a pilot test of a brief computer-assisted tailored HIV prevention intervention for use with a range of demographic and risk groups.

    PubMed

    Zule, William A; Bobashev, Georgiy V; Reif, Susan M; Poulton, Winona; Coomes, Curtis M; Wechsberg, Wendee M

    2013-11-01

    There is a need for brief HIV prevention interventions that can be disseminated and implemented widely. This article reports the results of a small randomized field experiment that compared the relative effects of a brief two-session counselor-delivered computer-tailored intervention and a control condition. The intervention is designed for use with African-American, non-Hispanic white and Hispanic males and females who may be at risk of HIV through unprotected sex, selling sex, male to male sex, injecting drug use or use of stimulants. Participants (n = 120) were recruited using a quota sampling approach and randomized using block randomization, which resulted in ten male and ten female participants of each racial/ethnic group (i.e. African-American, non-Hispanic white and Hispanic) being assigned to either the intervention or a control arm. In logistic regression analyses using a generalized estimating equations approach, at the 3-month follow-up, participants in the intervention arm were more likely than participants in the control arm to report condom use at last sex (Odds ratio [OR] = 4.75; 95 % Confidence interval [CI] = 1.70-13.26; p = 0.003). The findings suggest that a brief tailored intervention may increase condom use. Larger studies with longer follow-up periods are needed to determine if these results can be replicated.
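    Block randomization within quota strata, as used above, can be sketched as permuted blocks per stratum. This is a generic illustration, not the study's actual procedure; the stratum layout, block size, and function names are assumptions.

```python
import random

def block_randomize(strata, arms=("intervention", "control"),
                    block_size=2, seed=7):
    """Permuted-block randomization within quota strata.

    strata: dict mapping stratum name -> list of participant ids.
    Each block holds `block_size` slots per arm; for exact balance,
    stratum sizes should be a multiple of block_size * len(arms).
    Returns a dict: participant id -> arm.
    """
    rng = random.Random(seed)
    assignment = {}
    span = block_size * len(arms)
    for members in strata.values():
        for start in range(0, len(members), span):
            block = members[start:start + span]
            labels = list(arms) * block_size
            rng.shuffle(labels)          # random order, balanced counts
            for pid, arm in zip(block, labels):
                assignment[pid] = arm
    return assignment

# Two illustrative strata of 8 participants each.
strata = {"group_a": [f"a{i}" for i in range(8)],
          "group_b": [f"b{i}" for i in range(8)]}
assigned = block_randomize(strata)
print(sum(1 for arm in assigned.values() if arm == "intervention"))  # → 8
```

    Because every completed block contains each arm equally often, the design guarantees the per-stratum balance the abstract reports (equal numbers of each sex and ethnicity in each arm).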

  17. Internet-Users and Internet Non-Users Attitude towards Research: A Comparative Study on Post-Graduate Students

    ERIC Educational Resources Information Center

    Noor ul Amin, Syed

    2017-01-01

    The purpose of the present study was to compare the Internet-user and Internet Non-user post-graduate students on their attitude towards research. The sample comprised 600 post graduate students (300 Internet-users and 300 Internet-Non-users) drawn from different faculties of University of Kashmir (J&K), India. Random sampling technique was…

  18. A Model for Predicting Behavioural Sleep Problems in a Random Sample of Australian Pre-Schoolers

    ERIC Educational Resources Information Center

    Hall, Wendy A.; Zubrick, Stephen R.; Silburn, Sven R.; Parsons, Deborah E.; Kurinczuk, Jennifer J.

    2007-01-01

    Behavioural sleep problems (childhood insomnias) can cause distress for both parents and children. This paper reports a model describing predictors of high sleep problem scores in a representative population-based random sample survey of non-Aboriginal singleton children born in 1995 and 1996 (1085 girls and 1129 boys) in Western Australia.…

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Atanov, N.; et al.

    The Mu2e experiment at Fermilab will search for the coherent $\mu \to e$ conversion on aluminum atoms. The detector system consists of a straw tube tracker and a crystal calorimeter. A pre-production batch of 150 Silicon Photomultiplier arrays for the Mu2e calorimeter has been procured. Detailed quality assurance has been carried out on each SiPM to determine its operating voltage, gain, dark current and PDE. The mean time to failure of a small random sample of the pre-production group has also been measured, as has the increase in dark current as a function of ionizing and non-ionizing dose.

  20. [Arf6, RalA and BIRC5 protein expression in non-small cell lung cancer].

    PubMed

    Knizhnik, A V; Kovaleva, O B; Laktionov, K K; Mochal'nikova, V V; Komel'kov, A V; Chevkina, E M; Zborovskaia, I B

    2011-01-01

    Evaluation of tumor marker expression patterns that determine individual progression parameters is one of the major topics in molecular oncopathology research. This work presents an expression analysis of several Ras-Ral-associated signal transduction pathway proteins (Arf6, RalA and BIRC5) in relation to clinical criteria in non-small cell lung cancer patients. Using Western blot analysis and RT-PCR, Arf6, RalA and BIRC5 expression was analyzed in parallel in 53 non-small cell lung cancer samples of different origin. Arf6 protein expression was elevated relative to normal tissue in 55% of non-small cell lung cancer samples, and this elevation was observed more often in the squamous cell lung cancer group. RalA protein expression was decreased relative to normal tissue in 64% of non-small cell lung cancer samples, regardless of morphological structure. For squamous cell lung cancer, a correlation was revealed between decreased RalA protein expression and the absence of regional metastases. BIRC5 protein expression in tumor samples versus corresponding normal tissue was elevated 1.3 times more often in the squamous cell lung cancer group (76% of tumor samples); in adenocarcinomas, elevated BIRC5 expression was observed in only 63% of samples. A statistically significant decrease (p = 0.0158) in RalA protein expression and increase (p = 0.0498) in Arf6 protein expression relative to normal tissue was found for the T1-2N0M0 and T1-2N1-2M0 groups of squamous cell lung cancer, respectively.

  1. Non-Gaussian Multi-resolution Modeling of Magnetosphere-Ionosphere Coupling Processes

    NASA Astrophysics Data System (ADS)

    Fan, M.; Paul, D.; Lee, T. C. M.; Matsuo, T.

    2016-12-01

    The most dynamic coupling between the magnetosphere and ionosphere occurs in the Earth's polar atmosphere. Our objective is to model scale-dependent stochastic characteristics of high-latitude ionospheric electric fields that originate from solar wind magnetosphere-ionosphere interactions. The Earth's high-latitude ionospheric electric field exhibits considerable variability, with increasing non-Gaussian characteristics at decreasing spatio-temporal scales. Accurately representing the underlying stochastic physical process through random field modeling is crucial not only for scientific understanding of the energy, momentum and mass exchanges between the Earth's magnetosphere and ionosphere, but also for modern technological systems including telecommunication, navigation, positioning and satellite tracking. While considerable effort has been made to characterize the large-scale variability of the electric field in the context of Gaussian processes, no attempt has been made so far to model the small-scale non-Gaussian stochastic process observed in the high-latitude ionosphere. We construct a novel random field model using spherical needlets as building blocks. The double localization of spherical needlets in both spatial and frequency domains enables the model to capture the non-Gaussian and multi-resolutional characteristics of the small-scale variability. The estimation procedure is computationally feasible due to the utilization of an adaptive Gibbs sampler. We apply the proposed methodology to the computational simulation output from the Lyon-Fedder-Mobarry (LFM) global magnetohydrodynamics (MHD) magnetosphere model. Our non-Gaussian multi-resolution model characterizes significantly more energy associated with the small-scale ionospheric electric field variability than Gaussian models. By accurately representing unaccounted-for additional energy and momentum sources to the Earth's upper atmosphere, our novel random field modeling approach will provide a viable remedy to the current numerical models' systematic biases resulting from the underestimation of high-latitude energy and momentum sources.

  2. Effects of ignition location models on the burn patterns of simulated wildfires

    USGS Publications Warehouse

    Bar-Massada, A.; Syphard, A.D.; Hawbaker, T.J.; Stewart, S.I.; Radeloff, V.C.

    2011-01-01

    Fire simulation studies that use models such as FARSITE often assume that ignition locations are distributed randomly, because spatially explicit information about actual ignition locations is difficult to obtain. However, many studies show that the spatial distribution of ignition locations, whether human-caused or natural, is non-random. Thus, predictions from fire simulations based on random ignitions may be unrealistic. However, the extent to which the assumption of ignition location affects the predictions of fire simulation models has never been systematically explored. Our goal was to assess the difference in fire simulations that are based on random versus non-random ignition location patterns. We conducted four sets of 6000 FARSITE simulations for the Santa Monica Mountains in California to quantify the influence of random and non-random ignition locations and normal and extreme weather conditions on fire size distributions and spatial patterns of burn probability. Under extreme weather conditions, fires were significantly larger for non-random ignitions compared to random ignitions (mean area of 344.5 ha and 230.1 ha, respectively), but burn probability maps were highly correlated (r = 0.83). Under normal weather, random ignitions produced significantly larger fires than non-random ignitions (17.5 ha and 13.3 ha, respectively), and the spatial correlations between burn probability maps were not high (r = 0.54), though the difference in the average burn probability was small. The results of the study suggest that the location of ignitions used in fire simulation models may substantially influence the spatial predictions of fire spread patterns. However, the spatial bias introduced by using a random ignition location model may be minimized if the fire simulations are conducted under extreme weather conditions when fire spread is greatest. © 2010 Elsevier Ltd.

  3. Customizing chemotherapy in non-small cell lung cancer: the promise is still unmet

    PubMed Central

    2015-01-01

    A combination of cytotoxic agents with cis-platin remains the cornerstone of treatment for the vast majority of patients with non-small cell lung cancer (NSCLC). Molecular analysis of the primary tumor may lead to better prognostication and eventually to more accurate therapeutic approaches. Data from retrospective analyses of randomized trials as well as large patient series have suggested that chemotherapy may be customized based on molecular-genetic analysis of the tumor cells. The Spanish Lung Cancer Group (SLCG), in collaboration with the French Lung Cancer Group (FLCG), conducted a randomized, phase III, biomarker-driven trial and simultaneously supported a randomized phase II trial in collaborating centers in China. Despite the evidence from preclinical data and the results of the retrospective studies, the results of these trials, published recently in Annals of Oncology, favored the standard approach. The present commentary tries to explain the disappointing results, offer potential solutions for future trials, and argue why the vision of customized treatment is still alive. PMID:26629440

  4. Statistical description of non-Gaussian samples in the F2 layer of the ionosphere during heliogeophysical disturbances

    NASA Astrophysics Data System (ADS)

    Sergeenko, N. P.

    2017-11-01

    An adequate statistical method should be developed in order to probabilistically predict the range of ionospheric parameters; that problem is addressed in this paper. Time series of the F2-layer critical frequency, foF2(t), were subjected to statistical processing. For the obtained samples {δfoF2}, statistical distributions and invariants up to the fourth order were calculated. The analysis shows that during disturbances the distributions differ from the Gaussian law: at sufficiently small probability levels there are arbitrarily large deviations from the normal-process model. We therefore attempt to describe the statistical samples {δfoF2} with a Poisson-based model. For the studied samples, an exponential characteristic function is selected under the assumption that the time series are a superposition of deterministic and random processes. Using the Fourier transform, the characteristic function is transformed into a nonholomorphic, excessive-asymmetric probability density function. The statistical distributions of the samples {δfoF2} calculated for the disturbed periods are compared with the resulting model distribution function. According to the Kolmogorov criterion, the probabilities of coincidence of the a posteriori distributions with the theoretical ones are P ≈ 0.7-0.9. This analysis supports the applicability of a model based on a Poisson random process for the statistical description and probabilistic estimation of {δfoF2} variations during heliogeophysical disturbances.
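    The Kolmogorov-criterion comparison of empirical and model distributions can be illustrated with a one-sample Kolmogorov statistic computed in plain Python. The Gaussian model CDF and the synthetic samples below are stand-ins, not the paper's foF2 data or its Poisson-based model.

```python
import math
import random

def ks_statistic(sample, model_cdf):
    """One-sample Kolmogorov statistic: the largest gap between the
    empirical CDF of `sample` and a theoretical CDF."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = model_cdf(x)
        d = max(d, abs((i + 1) / n - f), abs(i / n - f))
    return d

def normal_cdf(x, mu=0.0, sigma=1.0):
    """Gaussian CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

rng = random.Random(0)
gaussian_like = [rng.gauss(0.0, 1.0) for _ in range(500)]
heavy_tailed = [rng.gauss(0.0, 1.0) ** 3 for _ in range(500)]  # non-Gaussian

print(round(ks_statistic(gaussian_like, normal_cdf), 3))  # small gap
print(round(ks_statistic(heavy_tailed, normal_cdf), 3))   # clearly larger gap
```

    Checking the empirical CDF both just before and just after each order statistic captures the supremum exactly, which is what the Kolmogorov criterion compares against its critical values.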

  5. A random urine test can identify patients at risk of mesalamine non-adherence: a prospective study.

    PubMed

    Gifford, Anne E; Berg, Anders H; Lahiff, Conor; Cheifetz, Adam S; Horowitz, Gary; Moss, Alan C

    2013-02-01

    Mesalamine non-adherence is common among patients with ulcerative colitis (UC), and can be difficult to identify in practice. We sought to determine whether a random urine test for salicylates could be used as a marker of 5-aminosalicylic acid (5-ASA) ingestion and identify patients at risk of non-adherence. Our aim is to determine whether measurement of salicylates in a random urine sample correlates with 5-ASA levels, and predicts an individual's risk of mesalamine non-adherence. Prospective observational study. Urinary salicylates (by colorimetry) and 5-ASA (by liquid chromatography and tandem-mass spectrometry) were measured in a random urine sample at baseline in patients and controls. Mesalamine adherence was quantified by patient self-reports at enrollment and pharmacy refills of mesalamine over 6 months. A total of 93 patients with UC taking mesalamine maintenance therapy were prospectively enrolled from the clinic. Random urine salicylate levels (by colorimetry) were highly correlated with urine 5-ASA metabolite levels (by mass spectrometry; R2=0.9). A random urine salicylate level above 15 mg/dl distinguished patients who had recently taken mesalamine from controls (area under the curve value 0.9, sensitivity 95%, specificity 77%). A significant proportion of patients (27%) who self-identified as "high adherers" by an adherence questionnaire (Morisky Medication Adherence Scale-8) had random levels of urine salicylate below this threshold. These patients were at higher risk of objectively measured non-adherence to mesalamine over the subsequent 6 months (RR: 2.7, 95% CI: 1.1-7.0). A random urine salicylate level measured in the clinic can identify patients who have not recently taken mesalamine, and who are at higher risk of longitudinal non-adherence. This test could be used to screen patients who may warrant interventions to improve adherence and prevent disease relapse.
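    Evaluating a cutoff such as the 15 mg/dl urine salicylate threshold reduces to counting true/false positives and negatives. A minimal sketch follows; the example values are hypothetical, not the study's measurements.

```python
def sens_spec(values, labels, threshold):
    """Sensitivity and specificity of the rule `value > threshold`
    for flagging recent ingestion (label True)."""
    tp = sum(1 for v, y in zip(values, labels) if y and v > threshold)
    fn = sum(1 for v, y in zip(values, labels) if y and v <= threshold)
    tn = sum(1 for v, y in zip(values, labels) if not y and v <= threshold)
    fp = sum(1 for v, y in zip(values, labels) if not y and v > threshold)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical urine salicylate levels (mg/dl) and recent-ingestion labels,
# evaluated at the 15 mg/dl cutoff reported in the study.
levels = [20, 30, 10, 14, 16, 18]
recent = [True, True, True, False, False, False]
sensitivity, specificity = sens_spec(levels, recent, 15)
print(sensitivity, specificity)
```

    Sweeping the threshold and plotting sensitivity against 1 − specificity traces the ROC curve whose area (0.9 here, per the abstract) summarizes how well the random urine test separates recent ingestion from non-ingestion.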

  6. Empirical study of seven data mining algorithms on different characteristics of datasets for biomedical classification applications.

    PubMed

    Zhang, Yiyan; Xin, Yi; Li, Qin; Ma, Jianshe; Li, Shuai; Lv, Xiaodan; Lv, Weiqi

    2017-11-02

    Various kinds of data mining algorithms are continually proposed as related disciplines develop, and their applicable scopes and performances differ. Hence, finding a suitable algorithm for a dataset is becoming important for biomedical researchers seeking to solve practical problems promptly. In this paper, seven established algorithms, namely C4.5, support vector machine, AdaBoost, k-nearest neighbor, naïve Bayes, random forest, and logistic regression, were selected as the research objects. The seven algorithms were applied to the 12 top-click UCI public datasets with the task of classification, and their performances were compared through induction and analysis. The sample size, number of attributes, number of missing values, sample size of each class, correlation coefficients between variables, class entropy of the task variable, and the ratio of the sample size of the largest class to that of the smallest class were calculated to characterize the 12 datasets. The two ensemble algorithms reached high classification accuracy on most datasets, and random forest performed better than AdaBoost on the unbalanced multi-class dataset. Simple algorithms, such as naïve Bayes and the logistic regression model, are suitable for small datasets with high correlation between the task variable and the other attribute variables. The k-nearest neighbor and C4.5 decision tree algorithms performed well on binary- and multi-class task datasets, while the support vector machine was more adept on small, balanced binary-class datasets. No algorithm maintained the best performance across all datasets. The applicability of the seven data mining algorithms to datasets with different characteristics is summarized to provide a reference for biomedical researchers and beginners in different fields.
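    Two of the dataset descriptors used above, the class entropy of the task variable and the largest-to-smallest class ratio, are straightforward to compute. A minimal sketch (the function names are our own):

```python
import math
from collections import Counter

def class_entropy(labels):
    """Shannon entropy (in bits) of the class distribution."""
    counts = Counter(labels)
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def imbalance_ratio(labels):
    """Sample size of the largest class over that of the smallest."""
    counts = Counter(labels).values()
    return max(counts) / min(counts)

print(class_entropy(["a", "b"] * 50))            # balanced binary task → 1.0
print(imbalance_ratio(["a"] * 90 + ["b"] * 10))  # → 9.0
```

    Low class entropy or a high imbalance ratio signals the skewed multi-class setting where, per the abstract, random forest held up better than AdaBoost.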

  7. Sampling procedures for throughfall monitoring: A simulation study

    NASA Astrophysics Data System (ADS)

    Zimmermann, Beate; Zimmermann, Alexander; Lark, Richard Murray; Elsenbeer, Helmut

    2010-01-01

    What is the most appropriate sampling scheme to estimate event-based average throughfall? A satisfactory answer to this seemingly simple question has yet to be found, a failure which we attribute to previous efforts' dependence on empirical studies. Here we try to answer this question by simulating stochastic throughfall fields based on parameters for statistical models of large monitoring data sets. We subsequently sampled these fields with different sampling designs and variable sample supports. We evaluated the performance of a particular sampling scheme with respect to the uncertainty of possible estimated means of throughfall volumes. Even for a relative error limit of 20%, an impractically large number of small, funnel-type collectors would be required to estimate mean throughfall, particularly for small events. While stratification of the target area is not superior to simple random sampling, cluster random sampling involves the risk of being less efficient. A larger sample support, e.g., the use of trough-type collectors, considerably reduces the necessary sample sizes and eliminates the sensitivity of the mean to outliers. Since the gain in time associated with the manual handling of troughs versus funnels depends on the local precipitation regime, the employment of automatically recording clusters of long troughs emerges as the most promising sampling scheme. Even so, a relative error of less than 5% appears out of reach for throughfall under heterogeneous canopies. We therefore suspect a considerable uncertainty of input parameters for interception models derived from measured throughfall, in particular, for those requiring data of small throughfall events.
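    The trade-off between collector numbers and the relative error limit described above can be approximated with the standard sample-size formula n ≈ (z·CV/ε)², where CV is the coefficient of variation of throughfall across collectors. The CV value in the example is an assumption for illustration, not a result from the paper.

```python
import math

def required_n(cv, rel_err, z=1.96):
    """Collectors needed so that the half-width of the 95% confidence
    interval for the mean, z * CV / sqrt(n), stays within rel_err."""
    return math.ceil((z * cv / rel_err) ** 2)

# Assumed CV of 1.0 for a small, spatially variable event,
# at the 20% relative error limit discussed above:
print(required_n(cv=1.0, rel_err=0.20))  # → 97
```

    The quadratic dependence on CV/ε is why small events under heterogeneous canopies demand impractically many funnel collectors, and why a larger sample support (troughs), which lowers the effective CV, reduces the required number so sharply.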

  8. [Vis-NIR spectroscopic pattern recognition combined with SG smoothing applied to breed screening of transgenic sugarcane].

    PubMed

    Liu, Gui-Song; Guo, Hao-Song; Pan, Tao; Wang, Ji-Hua; Cao, Gan

    2014-10-01

    Based on Savitzky-Golay (SG) smoothing screening, principal component analysis (PCA) combined with separately supervised linear discriminant analysis (LDA) and unsupervised hierarchical clustering analysis (HCA) were used for non-destructive visible and near-infrared (Vis-NIR) detection for breed screening of transgenic sugarcane. A random and stability-dependent framework of calibration, prediction, and validation was proposed. A total of 456 samples of sugarcane leaves planting in the elongating stage were collected from the field, which was composed of 306 transgenic (positive) samples containing Bt and Bar gene and 150 non-transgenic (negative) samples. A total of 156 samples (negative 50 and positive 106) were randomly selected as the validation set; the remaining samples (negative 100 and positive 200, a total of 300 samples) were used as the modeling set, and then the modeling set was subdivided into calibration (negative 50 and positive 100, a total of 150 samples) and prediction sets (negative 50 and positive 100, a total of 150 samples) for 50 times. The number of SG smoothing points was ex- panded, while some modes of higher derivative were removed because of small absolute value, and a total of 264 smoothing modes were used for screening. The pairwise combinations of first three principal components were used, and then the optimal combination of principal components was selected according to the model effect. Based on all divisions of calibration and prediction sets and all SG smoothing modes, the SG-PCA-LDA and SG-PCA-HCA models were established, the model parameters were optimized based on the average prediction effect for all divisions to produce modeling stability. Finally, the model validation was performed by validation set. With SG smoothing, the modeling accuracy and stability of PCA-LDA, PCA-HCA were signif- icantly improved. 
For the optimal SG-PCA-LDA model, the recognition rate of positive and negative validation samples were 94.3%, 96.0%; and were 92.5%, 98.0% for the optimal SG-PCA-LDA model, respectively. Vis-NIR spectro- scopic pattern recognition combined with SG smoothing could be used for accurate recognition of transgenic sugarcane leaves, and provided a convenient screening method for transgenic sugarcane breeding.

  9. Random noise attenuation of non-uniformly sampled 3D seismic data along two spatial coordinates using non-equispaced curvelet transform

    NASA Astrophysics Data System (ADS)

    Zhang, Hua; Yang, Hui; Li, Hongxing; Huang, Guangnan; Ding, Zheyi

    2018-04-01

    The attenuation of random noise is important for improving the signal-to-noise ratio (SNR). However, most conventional denoising methods presuppose that the noisy data are sampled on a uniform grid, making them unsuitable for non-uniformly sampled data. In this paper, a denoising method capable of regularizing noisy data from a non-uniform grid onto a specified uniform grid is proposed. Firstly, the denoising method is performed for every time slice extracted from the 3D noisy data along the source and receiver directions; then the 2D non-equispaced fast Fourier transform (NFFT) is introduced into the conventional fast discrete curvelet transform (FDCT). The non-equispaced fast discrete curvelet transform (NFDCT) can be achieved based on the regularized inversion of an operator that links the uniformly sampled curvelet coefficients to the non-uniformly sampled noisy data. The uniform curvelet coefficients can be calculated using the spectral projected-gradient inversion algorithm for ℓ1-norm problems. Local threshold factors are then chosen for the uniform curvelet coefficients at each decomposition scale, yielding the effective curvelet coefficients for each scale. Finally, the conventional inverse FDCT is applied to the effective curvelet coefficients. This completes the proposed 3D denoising method using the non-equispaced curvelet transform in the source-receiver domain. Examples with synthetic and real data reveal the effectiveness of the proposed approach for noise attenuation in non-uniformly sampled data, compared with the conventional FDCT method and the wavelet transform.

  10. Treatment Patterns and Differences in Survival of Non-Small Cell Lung Cancer Patients Between Academic and Non-Academic Hospitals in the Netherlands.

    PubMed

    van der Linden, Naomi; Bongers, Mathilda L; Coupé, Veerle M H; Smit, Egbert F; Groen, Harry J M; Welling, Alle; Schramel, Franz M N H; Uyl-de Groot, Carin A

    2017-09-01

    The aims of this study are to analyze differences in survival between academic and non-academic hospitals and to provide insight into treatment patterns for non-small cell lung cancer (NSCLC). Results show the state of NSCLC survival and care in the Netherlands. The Netherlands Cancer Registry provided data on NSCLC survival for all Dutch hospitals. We used the Kaplan-Meier estimate to calculate median survival time by hospital type and a Cox proportional hazards model to estimate the relative risk of mortality (expressed as hazard ratios) for patients diagnosed in academic versus non-academic hospitals, with adjustment for age, gender, and tumor histology, and stratification for disease stage. Data on treatment patterns in Dutch hospitals were obtained from four hospitals (two academic, two non-academic). A random sample of patients diagnosed with NSCLC from January 2009 until January 2011 was identified through hospital databases. Data were obtained on patient characteristics, tumor characteristics, and treatments. The Cox proportional hazards model shows a significantly decreased hazard ratio of mortality for patients diagnosed in academic hospitals, as opposed to patients diagnosed in non-academic hospitals. This is specifically true for primary radiotherapy patients and patients who receive systemic treatment for non-metastasized NSCLC. Patients diagnosed in academic hospitals have better median overall survival than patients diagnosed in non-academic hospitals, especially when treated with radiotherapy, systemic treatment, or combinations thereof. This difference may be caused by residual confounding, since the estimates were not adjusted for performance status. A wide variety of surgical, radiotherapeutic, and systemic treatments is prescribed. Copyright © 2015 Elsevier Inc. All rights reserved.
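The Kaplan-Meier median survival estimate mentioned above can be sketched in plain NumPy; the survival times and censoring indicators below are invented illustration data, not registry data.

```python
# Minimal Kaplan-Meier (product-limit) estimator and median survival time.
# Times/censoring flags are made-up illustration data.
import numpy as np

def kaplan_meier(times, events):
    """Return (unique event times, survival probabilities S(t))."""
    order = np.argsort(times)
    times, events = np.asarray(times)[order], np.asarray(events)[order]
    uniq = np.unique(times[events == 1])
    surv, s = [], 1.0
    for t in uniq:
        at_risk = np.sum(times >= t)                 # still under observation
        deaths = np.sum((times == t) & (events == 1))
        s *= 1.0 - deaths / at_risk                  # product-limit update
        surv.append(s)
    return uniq, np.array(surv)

def median_survival(times, events):
    t, s = kaplan_meier(times, events)
    below = t[s <= 0.5]
    return below[0] if below.size else np.inf        # first t with S(t) <= 0.5

times  = [5, 8, 12, 12, 15, 20, 22, 30, 33, 40]      # months of follow-up
events = [1, 1,  1,  0,  1,  1,  0,  1,  0,  1]      # 1 = death, 0 = censored
med = median_survival(times, events)
```

The Cox model used for the hazard ratios requires iterative partial-likelihood maximization and is omitted here.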

  11. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... generation on any two-dimensional square grid. 761.308 Section 761.308 Protection of Environment... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square grid. (a) Divide the surface area of the non-porous surface into rectangular or square areas having a...
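The grid-based random selection that the regulation describes can be sketched as follows; the surface dimensions, cell size, and number of samples are arbitrary stand-ins, not the values specified in § 761.79/§ 761.308.

```python
# Sketch of sample selection by random number generation on a square grid:
# divide the surface into cells and pick sampling cells uniformly at random.
# Dimensions and sample count below are illustrative, not regulatory values.
import random

def select_cells(width_cm, height_cm, cell_cm, n_samples, seed=0):
    rng = random.Random(seed)
    cols = width_cm // cell_cm
    rows = height_cm // cell_cm
    cells = [(r, c) for r in range(rows) for c in range(cols)]
    return rng.sample(cells, n_samples)   # distinct cells, no replacement

picked = select_cells(width_cm=300, height_cm=200, cell_cm=50, n_samples=5)
```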

  12. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... generation on any two-dimensional square grid. 761.308 Section 761.308 Protection of Environment... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square grid. (a) Divide the surface area of the non-porous surface into rectangular or square areas having a...

  13. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... generation on any two-dimensional square grid. 761.308 Section 761.308 Protection of Environment... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square grid. (a) Divide the surface area of the non-porous surface into rectangular or square areas having a...

  14. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... generation on any two-dimensional square grid. 761.308 Section 761.308 Protection of Environment... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square grid. (a) Divide the surface area of the non-porous surface into rectangular or square areas having a...

  15. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... generation on any two-dimensional square grid. 761.308 Section 761.308 Protection of Environment... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square grid. (a) Divide the surface area of the non-porous surface into rectangular or square areas having a...

  16. The Great, Late Lesbian and Bisexual Women's Discrimination Survey.

    PubMed

    Rankine, J

    2001-01-01

    This 1992 New Zealand survey of discrimination against 261 lesbian and bisexual women found comparable rates of public abuse and workplace discrimination to those reported by surveys in other developed countries. The women reported higher rates of assault in public places than a random sample of New Zealand women. Indigenous Maori women reported higher rates of assault, threats, verbal abuse, and workplace discrimination than the non-Maori women surveyed. Aggression against the women was often in response to public expression of affection for another woman or to rejection of men's public sexual advances. The respondents reported hostile educational environments that coincided with peer harassment of students attracted to their own gender. Around two-thirds of the women had hidden their sexuality on some occasions at work to avoid discrimination. No significant differences between the discrimination experiences of lesbian and bisexual women emerged, although the bisexual sample was too small for statistical analysis.

  17. Chemical classification of iron meteorites. XI - Multi-element studies of 38 new irons and the high abundance of ungrouped irons from Antarctica

    NASA Technical Reports Server (NTRS)

    Wasson, John T.; Ouyang, Xinwei; Wang, Jianmin; Jerde, Eric

    1989-01-01

    Concentrations of 14 elements in the metal of 38 iron meteorites and a pallasite are reported. Three samples are paired with previously classified irons, raising the number of well-classified, independent iron meteorites to 598. Several of the new irons are from Antarctica. Of 24 independent irons from Antarctica, eight are ungrouped, a much higher fraction than that among all classified irons. The difference is probably related to the fact that the median mass of Antarctic irons is about two orders of magnitude smaller than that of non-Antarctic irons. Smaller meteoroids may tend to sample a larger number of asteroidal source regions, perhaps because small meteoroids tend to have higher ejection velocities or because they have random-walked a greater increment of orbital semimajor axis away from that of the parent body.

  18. Determination of the influence of dispersion pattern of pesticide-resistant individuals on the reliability of resistance estimates using different sampling plans.

    PubMed

    Shah, R; Worner, S P; Chapman, R B

    2012-10-01

    Pesticide resistance monitoring includes resistance detection and subsequent documentation/measurement. Resistance detection would require at least one (≥1) resistant individual(s) to be present in a sample to initiate management strategies. Resistance documentation, on the other hand, would attempt to get an estimate of the entire population (≥90%) of the resistant individuals. A computer simulation model was used to compare the efficiency of simple random and systematic sampling plans to detect resistant individuals and to document their frequencies when the resistant individuals were randomly or patchily distributed. A patchy dispersion pattern of resistant individuals influenced the sampling efficiency of systematic sampling plans while the efficiency of random sampling was independent of such patchiness. When resistant individuals were randomly distributed, sample sizes required to detect at least one resistant individual (resistance detection) with a probability of 0.95 were 300 (1%) and 50 (10% and 20%); whereas, when resistant individuals were patchily distributed, using systematic sampling, sample sizes required for such detection were 6000 (1%), 600 (10%) and 300 (20%). Sample sizes of 900 and 400 would be required to detect ≥90% of resistant individuals (resistance documentation) with a probability of 0.95 when resistant individuals were randomly dispersed and present at a frequency of 10% and 20%, respectively; whereas, when resistant individuals were patchily distributed, using systematic sampling, a sample size of 3000 and 1500, respectively, was necessary. Small sample sizes either underestimated or overestimated the resistance frequency. A simple random sampling plan is, therefore, recommended for insecticide resistance detection and subsequent documentation.
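For randomly dispersed resistant individuals, the detection requirement follows a simple binomial argument: a sample of n individuals misses every resistant one with probability (1 - f)^n, so requiring detection with probability 0.95 gives n ≥ log(0.05)/log(1 - f). A minimal sketch (the study's simulation-based figures, which also account for dispersion pattern, can differ from this closed form):

```python
# Sample size needed to include >=1 resistant individual with probability
# `prob`, assuming random dispersion at resistance frequency `freq`.
import math

def detection_sample_size(freq, prob=0.95):
    return math.ceil(math.log(1 - prob) / math.log(1 - freq))

n_1pct = detection_sample_size(0.01)    # 299, close to the ~300 quoted for 1%
n_10pct = detection_sample_size(0.10)
n_20pct = detection_sample_size(0.20)
```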

  19. Pharmacodynamic effects of the fetal estrogen estetrol in postmenopausal women: results from a multiple-rising-dose study.

    PubMed

    Coelingh Bennink, Herjan J T; Verhoeven, Carole; Zimmerman, Yvette; Visser, Monique; Foidart, Jean-Michel; Gemzell-Danielsson, Kristina

    2017-06-01

    Estetrol (E4) is an estrogen produced exclusively by the human fetal liver during pregnancy. In this study the pharmacodynamic effects of escalating doses of E4 in postmenopausal women were investigated. This was a partly randomized, open-label, multiple-rising-dose study in 49 postmenopausal women. Participants were randomized to receive either 2 mg E4 or 2 mg estradiol-valerate (E2 V) for 28 days. Subsequent dose-escalation groups were (non-randomized): 10, 20 and 40 mg E4. Blood samples were collected regularly for measuring endocrine and hemostasis variables, lipids and lipoproteins, fasting glucose and bone turnover markers. Estetrol treatment resulted in a decrease of follicle-stimulating hormone and luteinizing hormone and an increase of sex hormone-binding globulin. Changes in hemostasis variables were small. A lowering effect on low-density lipoprotein cholesterol was accompanied by an increase in high-density lipoprotein cholesterol and no or minimal changes in triglycerides. The considerable decrease in osteocalcin levels in the three highest E4 dose groups and the small decrease in C-telopeptide levels were comparable to the E2 V control group and suggest a preventive effect on bone loss. All changes observed were dose-dependent. In this study, estetrol treatment showed dose-dependent estrogenic effects on endocrine parameters, bone turnover markers, and lipids and lipoproteins. The effect on triglycerides was small, as were the effects on hemostatic variables. These results support the further investigation of estetrol as a candidate for hormone therapy. Quantitatively, the effects of 10 mg estetrol were similar to the study comparator 2 mg estradiol valerate.

  20. The topology of large-scale structure. V - Two-dimensional topology of sky maps

    NASA Astrophysics Data System (ADS)

    Gott, J. R., III; Mao, Shude; Park, Changbom; Lahav, Ofer

    1992-01-01

    A 2D algorithm is applied to observed sky maps and numerical simulations. It is found that when topology is studied on smoothing scales larger than the correlation length, the topology is approximately in agreement with the random-phase formula for the 2D genus-threshold density relation, G2(ν) ∝ ν exp(−ν²/2). Some samples show small 'meatball shifts' similar to those seen in corresponding 3D observational samples and similar to those produced by biasing in cold dark matter simulations. The observational results are thus consistent with the standard model in which the structure in the universe today has grown from small fluctuations caused by random quantum noise in the early universe.

  1. Sequential time interleaved random equivalent sampling for repetitive signal.

    PubMed

    Zhao, Yijiu; Liu, Jingjing

    2016-12-01

    Compressed sensing (CS) based sampling techniques exhibit many advantages over other existing approaches for sparse signal spectrum sensing; they have also been incorporated into non-uniform sampling signal reconstruction, such as random equivalent sampling (RES), to improve efficiency. However, in CS based RES, only one sample of each acquisition is considered in the signal reconstruction stage, resulting in more acquisition runs and longer sampling time. In this paper, a sampling sequence is taken in each RES acquisition run, and the corresponding block measurement matrix is constructed using the Whittaker-Shannon interpolation formula. All the block matrices are combined into an equivalent measurement matrix with respect to all sampling sequences. We implemented the proposed approach with a multi-core analog-to-digital converter (ADC), whose ADC cores are time interleaved. A prototype realization of this proposed CS based sequential random equivalent sampling method has been developed. It is able to capture an analog waveform at an equivalent sampling rate of 40 GHz while physically sampling at 1 GHz. Experiments indicate that, for a sparse signal, the proposed CS based sequential random equivalent sampling exhibits high efficiency.

  2. Ventilatory Function in Relation to Mining Experience and Smoking in a Random Sample of Miners and Non-miners in a Witwatersrand Town1

    PubMed Central

    Sluis-Cremer, G. K.; Walters, L. G.; Sichel, H. S.

    1967-01-01

    The ventilatory capacity of a random sample of men over the age of 35 years in the town of Carletonville was estimated by the forced expiratory volume and the peak expiratory flow rate. Five hundred and sixty-two persons were working or had worked in gold-mines and 265 had never worked in gold-mines. No difference in ventilatory function was found between the miners and non-miners other than that due to the excess of chronic bronchitis in miners. PMID:6017134

  3. Probability of coincidental similarity among the orbits of small bodies - I. Pairing

    NASA Astrophysics Data System (ADS)

    Jopek, Tadeusz Jan; Bronikowska, Małgorzata

    2017-09-01

    The probability of coincidental clustering among the orbits of comets, asteroids and meteoroids depends on many factors, such as the size of the orbital sample searched for clusters and the size of the identified group; it differs for groups of 2, 3, 4, … members. Because the probability of coincidental clustering is assessed by numerical simulation, it also depends on the method used to generate the synthetic orbits. We have tested the impact of some of these factors. For a given size of the orbital sample, we assessed the probability of random pairing among several orbital populations of different sizes and determined how these probabilities vary with sample size. Finally, keeping the size of the orbital sample fixed, we show that the probability of random pairing can differ significantly between orbital samples obtained by different observation techniques. For the user's convenience we also provide several formulae which, for a given sample size, can be used to calculate the similarity threshold corresponding to a small probability of coincidental similarity between two orbits.

  4. The effects of pilates on mental health outcomes: A meta-analysis of controlled trials.

    PubMed

    Fleming, Karl M; Herring, Matthew P

    2018-04-01

    This meta-analysis estimated the population effect size for Pilates effects on mental health outcomes. Articles published prior to August 2017 were located with searches of Pubmed, Medline, Cinahl, SportDiscus, Science Direct, PsychINFO, Web of Science, and Cochrane Controlled Trial Register using combinations of: Pilates, Pilates method, mental health, anxiety, and depression. Eight English-language publications that included allocation to a Pilates intervention or non-active control and a measure of anxiety and/or depressive symptoms at baseline and after the Pilates intervention were selected. Participant and intervention characteristics, anxiety and depressive symptoms and other mental health outcomes, including feelings of energy and fatigue and quality of life, were extracted. Hedges' d effect sizes were computed, study quality was assessed, and random effects models estimated sampling error and population variance. Pilates resulted in significant, large, heterogeneous reductions in depressive (Δ = 1.27, 95%CI: 0.44, 2.09; z = 3.02, p ≤ 0.003; N = 6, n = 261) and anxiety symptoms (Δ = 1.29, 95%CI: 0.24, 2.33; z = 2.40, p ≤ 0.02; N = 5, n = 231) and feelings of fatigue (Δ = 0.93, 95%CI: 0.21, 1.66; z = 2.52, p ≤ 0.012; N = 3, n = 161), and increases in feelings of energy (Δ = 1.49, 95%CI: 0.67, 2.30; z = 3.57, p < 0.001; N = 2, n = 116). Though this review included a small number of controlled trials with small sample sizes and non-active control conditions of variable quality, the available evidence reviewed here supports that Pilates improves mental health outcomes. Rigorously designed randomized controlled trials, including those that compare Pilates to other empirically-supported therapies, are needed to better understand Pilates' clinical effectiveness and plausible mechanisms of effects. Copyright © 2018 Elsevier Ltd. All rights reserved.
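A random-effects pooling of the DerSimonian-Laird type, which underlies Δ estimates like those reported above, can be sketched as follows; the five effect sizes and variances are invented, not the trial data from this review.

```python
# DerSimonian-Laird random-effects pooling of per-study effect sizes.
# Effect sizes d and variances v below are invented illustration data.
import numpy as np

d = np.array([1.1, 0.8, 1.6, 0.4, 1.3])       # per-study effect sizes
v = np.array([0.10, 0.12, 0.20, 0.08, 0.15])  # per-study sampling variances

w = 1.0 / v                                   # fixed-effect weights
d_fe = np.sum(w * d) / np.sum(w)
Q = np.sum(w * (d - d_fe) ** 2)               # Cochran's heterogeneity Q
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - (len(d) - 1)) / c)       # between-study variance

w_re = 1.0 / (v + tau2)                       # random-effects weights
delta = np.sum(w_re * d) / np.sum(w_re)       # pooled estimate
se = np.sqrt(1.0 / np.sum(w_re))
ci = (delta - 1.96 * se, delta + 1.96 * se)   # 95% confidence interval
```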

  5. The Missing Link: Workplace Education in Small Business.

    ERIC Educational Resources Information Center

    BCEL Newsletter for the Business & Literacy Communities, 1992

    1992-01-01

    A study sought to determine how and why small businesses invest or do not invest in basic skills instruction for their workers. Data were gathered through a national mail and telephone survey of a random sampling of 11,000 small (50 or fewer employees) and medium-sized (51-400 employees) firms, a targeted mail survey of 4,317 manufacturers, a…

  6. The impact of group counseling on depression, post-traumatic stress and function outcomes: a prospective comparison study in the Peter C. Alderman trauma clinics in northern Uganda.

    PubMed

    Nakimuli-Mpungu, Etheldreda; Okello, James; Kinyanda, Eugene; Alderman, Stephen; Nakku, Juliet; Alderman, Jeffrey S; Pavia, Alison; Adaku, Alex; Allden, Kathleen; Musisi, Seggane

    2013-10-01

    The effectiveness of group interventions for adults with mental distress in post-conflict settings in sub-Saharan Africa is unclear. This study assessed the impact of a group counseling intervention on depression, post-traumatic stress and function outcomes among adults attending the Peter C. Alderman Foundation (PCAF) trauma clinics in northern Uganda. 631 war-affected adults were enrolled into PCAF trauma clinics. Using a quasi-experimental design, assessments were conducted at baseline and at 3 and 6 months following initiation of care. Multivariate longitudinal regression models were used to determine change in depression, post-traumatic stress and function scores over time among group counseling participants and non-participants. In comparison to non-participants, participants had a faster reduction in depression scores during the 6-month follow-up period [β=-1.84, 95%CI (-3.38 to -0.30), p=0.019] and a faster reduction in post-traumatic stress scores during the 3-month follow-up period [β=-2.14, 95%CI (-4.21 to -0.10), p=0.042]. At 3-month follow-up, participants who attended two or more sessions had a faster increase in function scores [β=3.51, 95%CI (0.61-6.40), p=0.018] than participants who attended only one session. Limitations include selection bias due to the use of non-random samples, and substantial attrition rates and small sample sizes that may have left insufficient statistical power to detect meaningful differences. The group counseling intervention offered in the PCAF clinics may have considerable mental health benefits over time. There is need for more research to structure, standardize and test the efficacy of this intervention using a randomized controlled trial. © 2013 Elsevier B.V. All rights reserved.

  7. Random-anisotropy model: Monotonic dependence of the coercive field on D/J

    NASA Astrophysics Data System (ADS)

    Saslow, W. M.; Koon, N. C.

    1994-02-01

    We present the results of a numerical study of the zero-temperature remanence and coercivity for the random anisotropy model (RAM), showing that, contrary to early calculations for this model, the coercive field increases monotonically with increases in the strength D of the random anisotropy relative to the strength J of the exchange. Local-field adjustments with and without spin flips are considered. Convergence is difficult to obtain for small values of the anisotropy, suggesting that this is the likely source of the nonmonotonic behavior found in earlier studies. For both large and small anisotropy, each spin undergoes about one flip per hysteresis cycle, and about half of the spin flips occur in the vicinity of the coercive field. When only non-spin-flip adjustments are considered, at large anisotropy the coercivity is proportional to the anisotropy. At small anisotropy, the rate of convergence is comparable to that when spin flips are included.

  8. Caries status in 16 year-olds with varying exposure to water fluoridation in Ireland.

    PubMed

    Mullen, J; McGaffin, J; Farvardin, N; Brightman, S; Haire, C; Freeman, R

    2012-12-01

    Most of the Republic of Ireland's public water supplies have been fluoridated since the mid-1960s, while Northern Ireland has never been fluoridated, apart from some small short-lived schemes in east Ulster. This study examines dental caries status in 16-year-olds in a part of Ireland straddling fluoridated and non-fluoridated water supply areas and compares two methods of assessing the effectiveness of water fluoridation. The cross-sectional survey tested differences in caries status by two methods: (1) Estimated Fluoridation Status, as used previously in national and regional studies in the Republic and in the All-Island study of 2002; (2) Percentage Lifetime Exposure, a modification of a system described by Slade in 1995 and used in Australian caries research. Adolescents were selected for the study by a two-part random sampling process: first, schools in each area were stratified into three tiers based on school size and selected randomly from each tier; then 16-year-olds were randomly sampled from these schools, based on a pre-set sampling fraction for each tier of schools. With both systems of measurement, significantly lower caries levels were found in those children with the greatest exposure to fluoridated water compared to those with the least exposure. The survey provides further evidence of the effectiveness of water fluoridation in reducing dental caries experience up to 16 years of age. The extra intricacies involved in using the Percentage Lifetime Exposure method did not provide much more information compared to the simpler Estimated Fluoridation Status method.

  9. Risk factors for wound disruption following cesarean delivery.

    PubMed

    Subramaniam, Akila; Jauk, Victoria C; Figueroa, Dana; Biggio, Joseph R; Owen, John; Tita, Alan T N

    2014-08-01

    Risk factors for post-cesarean wound infection, but not disruption, are well described in the literature. The primary objective of this study was to identify risk factors for non-infectious post-cesarean wound disruption. A secondary analysis was conducted using data from a single-center randomized controlled trial of staple versus suture skin closure in women ≥24 weeks' gestation undergoing cesarean delivery. Wound disruption was defined as subcutaneous skin or fascial dehiscence, excluding primary wound infections. Composite wound morbidity (disruption or infection) was examined as a secondary outcome. Patient demographics, medical co-morbidities, and intrapartum characteristics were evaluated as potential risk factors using multivariable logistic regression. Of the 398 randomized patients, 340, including 26 with disruptions (7.6%), met inclusion criteria and were analyzed. After multivariable adjustment, African-American race (aOR 3.9, 95% CI 1.1-13.8) and staple (as opposed to suture) wound closure (aOR 5.4, 95% CI 1.8-16.1) remained significant risk factors for disruption; a non-significant increase was observed for body mass index ≥30 (aOR 2.1, 95% CI 0.6-7.5), but not for diabetes mellitus (aOR 0.9, 95% CI 0.3-2.9). Results for composite wound morbidity were similar. Skin closure with staples, African-American race, and (given the relatively small sample size) potentially obesity are associated with increased risk of non-infectious post-cesarean wound disruption.

  10. Testing non-inferiority of a new treatment in three-arm clinical trials with binary endpoints.

    PubMed

    Tang, Nian-Sheng; Yu, Bin; Tang, Man-Lai

    2014-12-18

    A two-arm non-inferiority trial without a placebo is usually adopted to demonstrate that an experimental treatment is not worse than a reference treatment by a small pre-specified non-inferiority margin due to ethical concerns. Selection of the non-inferiority margin and establishment of assay sensitivity are two major issues in the design, analysis and interpretation for two-arm non-inferiority trials. Alternatively, a three-arm non-inferiority clinical trial including a placebo is usually conducted to assess the assay sensitivity and internal validity of a trial. Recently, some large-sample approaches have been developed to assess the non-inferiority of a new treatment based on the three-arm trial design. However, these methods behave badly with small sample sizes in the three arms. This manuscript aims to develop some reliable small-sample methods to test three-arm non-inferiority. Saddlepoint approximation, exact and approximate unconditional, and bootstrap-resampling methods are developed to calculate p-values of the Wald-type, score and likelihood ratio tests. Simulation studies are conducted to evaluate their performance in terms of type I error rate and power. Our empirical results show that the saddlepoint approximation method generally behaves better than the asymptotic method based on the Wald-type test statistic. For small sample sizes, approximate unconditional and bootstrap-resampling methods based on the score test statistic perform better in the sense that their corresponding type I error rates are generally closer to the prespecified nominal level than those of other test procedures. Both approximate unconditional and bootstrap-resampling test procedures based on the score test statistic are generally recommended for three-arm non-inferiority trials with binary outcomes.
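A crude sketch of a bootstrap-resampling test for the retention-of-effect hypothesis in a three-arm binary-outcome trial is shown below. The arm counts and retention fraction θ are invented, and the plug-in statistic is simpler than the score and Wald-type statistics developed in the paper.

```python
# Bootstrap sketch for three-arm non-inferiority with binary endpoints:
# the experimental arm should retain at least a fraction theta of the
# reference-vs-placebo effect. Counts and theta are invented illustration
# data; this plug-in statistic is a simplification of the paper's tests.
import numpy as np

rng = np.random.default_rng(42)
theta = 0.5
# (successes, n) for experimental, reference, placebo arms
xE, nE, xR, nR, xP, nP = 22, 30, 24, 30, 12, 30

def stat(pE, pR, pP):
    return (pE - pP) - theta * (pR - pP)   # > 0 favours non-inferiority

obs = stat(xE / nE, xR / nR, xP / nP)

# Bootstrap the statistic by resampling each arm at its observed proportion
boots = np.array([
    stat(rng.binomial(nE, xE / nE) / nE,
         rng.binomial(nR, xR / nR) / nR,
         rng.binomial(nP, xP / nP) / nP)
    for _ in range(5000)
])
se = boots.std(ddof=1)
z = obs / se                               # approximate test statistic
```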

  11. Erlotinib in African Americans with Advanced Non-Small Cell Lung Cancer: A Prospective Randomized Study with Genetic and Pharmacokinetic Analysis

    PubMed Central

    Phelps, Mitch A.; Stinchcombe, Thomas E.; Blachly, James S.; Zhao, Weiqiang; Schaaf, Larry J.; Starrett, Sherri L.; Wei, Lai; Poi, Ming; Wang, Danxin; Papp, Audrey; Aimiuwu, Josephine; Gao, Yue; Li, Junan; Otterson, Gregory A.; Hicks, William J.; Socinski, Mark A.; Villalona-Calero, Miguel A.

    2014-01-01

    Prospective studies focusing on EGFR inhibitors in African Americans with NSCLC have not been previously performed. In this phase II randomized study, 55 African Americans with NSCLC received erlotinib 150mg/day or a body weight adjusted dose with subsequent escalations to the maximum allowable, 200mg/day, to achieve rash. Erlotinib and OSI-420 exposures were lower compared to previous reports, consistent with CYP3A pharmacogenetics implying higher metabolic activity. Tumor genetics revealed only two EGFR mutations, EGFR amplification in 17/47 samples, 8 KRAS mutations and 5 EML4-ALK translocations. Although absence of rash was associated with shorter time to progression (TTP), disease control rate, TTP, and 1-year survival were not different between the two dose groups, indicating the dose-to-rash strategy failed to increase clinical benefit. Observed low incidence of toxicity and low erlotinib exposure suggest standardized and maximum allowable dosing may be suboptimal in African Americans. PMID:24781527

  12. Doxycycline for prevention of erlotinib-induced rash in patients with non-small-cell lung cancer (NSCLC) after failure of first-line chemotherapy: A randomized, open-label trial.

    PubMed

    Deplanque, Gaël; Gervais, Radj; Vergnenegre, Alain; Falchero, Lionel; Souquet, Pierre-Jean; Chavaillon, Jean-Michel; Taviot, Bruno; Fraboulet, Ghislaine; Saal, Hakim; Robert, Caroline; Chosidow, Olivier

    2016-06-01

    Rash is a common epidermal growth factor receptor inhibitor-induced toxicity that can impair quality of life and treatment compliance. We sought to evaluate the efficacy of doxycycline in preventing erlotinib-induced rash (folliculitis) in patients with non-small-cell lung cancer. This open-label, randomized, prospective, phase II trial was conducted in 147 patients with locally advanced or metastatic non-small-cell lung cancer progressing after first-line chemotherapy, randomized for 4 months with erlotinib alone 150 mg/d per os (control arm) or combined with doxycycline 100 mg/d (doxycycline arm). Incidence and severity of rash, compliance, survival, and safety were assessed. Baseline characteristics of the 147 patients were well balanced in the intent-to-treat population. Folliculitis occurred in 71% of patients in the doxycycline arm and 81% in the control arm (P = .175). The severity of folliculitis and other skin lesions was lower in the doxycycline arm compared with the control arm. Other adverse events were reported at a similar frequency across arms. There was no significant difference in survival between treatment arms. The open-label design of the study and the duration of the treatment with doxycycline are limitations. Doxycycline did not reduce the incidence of erlotinib-induced folliculitis, but significantly reduced its severity. Copyright © 2016 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.

  13. Generalizing the Network Scale-Up Method: A New Estimator for the Size of Hidden Populations*

    PubMed Central

    Feehan, Dennis M.; Salganik, Matthew J.

    2018-01-01

    The network scale-up method enables researchers to estimate the size of hidden populations, such as drug injectors and sex workers, using sampled social network data. The basic scale-up estimator offers advantages over other size estimation techniques, but it depends on problematic modeling assumptions. We propose a new generalized scale-up estimator that can be used in settings with non-random social mixing and imperfect awareness about membership in the hidden population. Further, the new estimator can be used when data are collected via complex sample designs and from incomplete sampling frames. However, the generalized scale-up estimator also requires data from two samples: one from the frame population and one from the hidden population. In some situations these data from the hidden population can be collected by adding a small number of questions to already planned studies. For other situations, we develop interpretable adjustment factors that can be applied to the basic scale-up estimator. We conclude with practical recommendations for the design and analysis of future studies. PMID:29375167
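The basic scale-up estimator that the generalized estimator extends can be sketched in a few lines; the frame size, reported degrees, and hidden-population alter counts are invented.

```python
# Basic network scale-up estimator: each respondent reports total network
# size d_i and the number of alters in the hidden population y_i; then
# N_H ~ N * sum(y_i) / sum(d_i). All numbers below are invented.
frame_population = 1_000_000               # size of the frame population N
degrees = [150, 300, 90, 210, 400, 120]    # reported network sizes d_i
hidden_alters = [1, 3, 0, 2, 5, 1]         # alters in the hidden population y_i

n_hidden = frame_population * sum(hidden_alters) / sum(degrees)
```

The generalized estimator described above replaces this ratio with adjustment factors for non-random mixing and imperfect awareness, estimated from the second (hidden-population) sample.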

  14. California’s Shrinking Defense Contractors: Effects on Small Suppliers,

    DTIC Science & Technology

    1996-01-01

    They did not physically segregate any parts of their operations or set up a separate data management system to do business with prime contractors...of them out of business. * Small defense aerospace suppliers are not making cutting-edge products for commercial customers. Small defense aerospace...are considered critical to their primes’ supplier base are concerned. Most Firms Are Still in Business When we checked with a random sample of small

  15. Assessing the Status of Wild Felids in a Highly-Disturbed Commercial Forest Reserve in Borneo and the Implications for Camera Trap Survey Design

    PubMed Central

    Wearn, Oliver R.; Rowcliffe, J. Marcus; Carbone, Chris; Bernard, Henry; Ewers, Robert M.

    2013-01-01

    The proliferation of camera-trapping studies has led to a spate of extensions in the known distributions of many wild cat species, not least in Borneo. However, we still do not have a clear picture of the spatial patterns of felid abundance in Southeast Asia, particularly with respect to the large areas of highly-disturbed habitat. An important obstacle to increasing the usefulness of camera trap data is the widespread practice of setting cameras at non-random locations. Non-random deployment interacts with non-random space-use by animals, causing biases in our inferences about relative abundance from detection frequencies alone. This may be a particular problem if surveys do not adequately sample the full range of habitat features present in a study region. Using camera-trapping records and incidental sightings from the Kalabakan Forest Reserve, Sabah, Malaysian Borneo, we aimed to assess the relative abundance of felid species in highly-disturbed forest, as well as investigate felid space-use and the potential for biases resulting from non-random sampling. Although the area has been intensively logged over three decades, it was found to still retain the full complement of Bornean felids, including the bay cat Pardofelis badia, a poorly known Bornean endemic. Camera-trapping using strictly random locations detected four of the five Bornean felid species and revealed inter- and intra-specific differences in space-use. We compare our results with an extensive dataset of >1,200 felid records from previous camera-trapping studies and show that the relative abundance of the bay cat, in particular, may have previously been underestimated due to the use of non-random survey locations. Further surveys for this species using random locations will be crucial in determining its conservation status. 
We advocate the more wide-spread use of random survey locations in future camera-trapping surveys in order to increase the robustness and generality of inferences that can be made. PMID:24223717

  16. CIDR

    Science.gov Websites

    [Fragment of a genotyping requirements table: minimum number of experimental samples, DNA volume (µl), genomic DNA concentration (ng/µl), and a low-input DNA option at additional cost.] If you do submit WGA samples, you should anticipate a higher non-random missing data rate.

  17. School Readiness in Children Living in Non-Parental Care: Impacts of Head Start

    ERIC Educational Resources Information Center

    Lipscomb, Shannon T.; Pratt, Megan E.; Schmitt, Sara A.; Pears, Katherine C.; Kim, Hyoun K.

    2013-01-01

    The current study examines the effects of Head Start on the development of school readiness outcomes for children living in non-parental care. Data were obtained from the Head Start Impact Study, a randomized controlled trial of Head Start conducted with a nationally representative sample of Head Start programs and families. The sample included…

  18. The Matrix Analogies Test: A Validity Study with the K-ABC.

    ERIC Educational Resources Information Center

    Smith, Douglas K.

    The Matrix Analogies Test-Expanded Form (MAT-EF) and Kaufman Assessment Battery for Children (K-ABC) were administered in counterbalanced order to two randomly selected samples of students in grades 2 through 5. The MAT-EF was recently developed to measure non-verbal reasoning. The samples included 26 non-handicapped second graders in a rural…

  19. Comparing interval estimates for small sample ordinal CFA models

    PubMed Central

    Natesan, Prathiba

    2015-01-01

    Robust maximum likelihood (RML) and asymptotically generalized least squares (AGLS) methods have been recommended for fitting ordinal structural equation models. Studies show that some of these methods underestimate standard errors. However, these studies have not investigated the coverage and bias of interval estimates. An estimate with a reasonable standard error could still be severely biased. This can only be known by systematically investigating the interval estimates. The present study compares Bayesian, RML, and AGLS interval estimates of factor correlations in ordinal confirmatory factor analysis models (CFA) for small sample data. Six sample sizes, 3 factor correlations, and 2 factor score distributions (multivariate normal and multivariate mildly skewed) were studied. Two Bayesian prior specifications, informative and relatively less informative, were studied. Undercoverage of confidence intervals and underestimation of standard errors were common in non-Bayesian methods. Underestimated standard errors may lead to inflated Type-I error rates. Non-Bayesian intervals were more often positively biased than negatively biased; that is, most intervals that did not contain the true value were greater than the true value. Some non-Bayesian methods had non-converging and inadmissible solutions for small samples and non-normal data. Bayesian empirical standard error estimates for informative and relatively less informative priors were closer to the average standard errors of the estimates. The coverage of Bayesian credibility intervals was closer to what was expected, with overcoverage in a few cases. Although some Bayesian credibility intervals were wider, they reflected the nature of statistical uncertainty that comes with the data (e.g., small sample). Bayesian point estimates were also more accurate than non-Bayesian estimates. 
The results illustrate the importance of analyzing coverage and bias of interval estimates, and how ignoring interval estimates can be misleading. Therefore, editors and policymakers should continue to emphasize the inclusion of interval estimates in research. PMID:26579002
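
    The coverage analysis this study emphasizes boils down to counting how often interval estimates contain the true parameter across repeated samples. A generic sketch of that idea, using a simple t-interval for a mean rather than the paper's ordinal CFA models:

```python
import random
import statistics

# Sketch: empirical coverage of a nominal 95% CI under repeated small
# samples (n = 20). This illustrates how interval coverage is assessed
# in simulation; it is not the paper's ordinal CFA setup.
random.seed(1)
true_mean, n, reps, hits = 0.0, 20, 2000, 0

for _ in range(reps):
    x = [random.gauss(true_mean, 1.0) for _ in range(n)]
    m = statistics.fmean(x)
    se = statistics.stdev(x) / n ** 0.5
    lo, hi = m - 2.093 * se, m + 2.093 * se  # t critical value, df = 19
    hits += lo <= true_mean <= hi

print(f"empirical coverage: {hits / reps:.3f}")  # close to the nominal 0.95
```

Undercoverage (an empirical rate well below the nominal 95%) is exactly the symptom the study reports for the non-Bayesian interval methods.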


  1. [Prospect and Current Situation of Immune Checkpoint Inhibitors in First-line Treatment in Advanced Non-small Cell Lung Cancer Patients].

    PubMed

    Wang, Haiyang; Yu, Xiaoqing; Fan, Yun

    2017-06-20

    With the breakthroughs achieved of programmed death-1 (PD-1)/PD-L1 inhibitors monotherapy as first-line and second-line treatment in advanced non-small cell lung cancer (NSCLC), the treatment strategy is gradually evolving and optimizing. Immune combination therapy expands the benefit population and improves the curative effect. A series of randomized phase III trials are ongoing. In this review, we discuss the prospect and current situation of immune checkpoint inhibitors in first-line treatment in advanced NSCLC patients.

  2. Early intervention for adolescents at-risk for bipolar disorder: A pilot randomized trial of Interpersonal and Social Rhythm Therapy (IPSRT).

    PubMed

    Goldstein, Tina R; Merranko, John; Krantz, Megan; Garcia, Matthew; Franzen, Peter; Levenson, Jessica; Axelson, David; Birmaher, Boris; Frank, Ellen

    2018-08-01

    To conduct a pilot randomized trial of Interpersonal and Social Rhythm Therapy plus Data-Informed Referral (IPSRT + DIR) versus DIR-alone for adolescents at-risk for bipolar disorder (BP). Eligible participants included youth (12-18) with a BP parent; youth with BP were excluded. Participants (n = 42) were randomized to receive IPSRT + DIR to treat any psychiatric disorders present at baseline, or DIR-alone. A blind evaluator assessed outcomes at baseline, 3 and 6 months. Participants wore an actigraph to measure sleep/wake patterns for 7 days at baseline and 6 months. Primary outcomes included mood and non-mood symptoms and sleep disturbance. Youth randomized to IPSRT + DIR attended approximately half of scheduled IPSRT sessions. Although 33% of DIR-alone youth were referred for mental health services at intake (another 33% were already engaged in services), none initiated new services over follow-up. No youth developed new-onset mood disorder over follow-up. Self- and parent-reported mood and non-mood psychiatric symptoms did not distinguish the groups, although youth in DIR-alone tended to have higher baseline scores on most measures. Per clinician ratings, 1 youth receiving IPSRT + DIR displayed subthreshold hypo/manic symptoms, versus 2 receiving DIR-alone (OR = 14.7, p = 0.03), possibly signaling fewer subthreshold hypo/manic symptoms, and for fewer weeks (χ² = 11.06, p = 0.0009), over 6 months with IPSRT + DIR. We found a small effect for youth in the IPSRT + DIR group to show more wake after sleep onset (WASO) at pre-treatment, but less at follow-up (Cohen's d = 0.28). Small sample size limits statistical power, and we are unable to definitively attribute group differences to IPSRT versus greater clinical contact. Ability to examine distal/rare (i.e., BP onset) outcomes was limited. Adolescents at-risk for BP present challenges to psychosocial treatment engagement and retention. 
IPSRT merits further study as an acceptable intervention for at-risk youth, though necessary frequency and intensity to affect outcomes should be examined. The potential to delay or prevent subthreshold hypo/manic symptoms via enhanced sleep continuity is an area for further examination. Future studies with larger samples and extended follow-up can help determine whether IPSRT may delay or prevent syndromal hypo/mania in youth at-risk. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. Improved variance estimation of classification performance via reduction of bias caused by small sample size.

    PubMed

    Wickenberg-Bolin, Ulrika; Göransson, Hanna; Fryknäs, Mårten; Gustafsson, Mats G; Isaksson, Anders

    2006-03-13

    Supervised learning for classification of cancer employs a set of design examples to learn how to discriminate between tumors. In practice it is crucial to confirm that the classifier is robust with good generalization performance to new examples, or at least that it performs better than random guessing. A suggested alternative is to obtain a confidence interval of the error rate using repeated design and test sets selected from available examples. However, it is known that even in the ideal situation of repeated designs and tests with completely novel samples in each cycle, a small test set size leads to a large bias in the estimate of the true variance between design sets. Therefore, different methods for small sample performance estimation, such as the recently proposed Repeated Random Sampling (RSS) procedure, are also expected to result in heavily biased estimates, which in turn translates into biased confidence intervals. Here we explore such biases and develop a refined algorithm called Repeated Independent Design and Test (RIDT). Our simulations reveal that repeated designs and tests based on resampling in a fixed bag of samples yield a biased variance estimate. We also demonstrate that it is possible to obtain an improved variance estimate by means of a procedure that explicitly models how this bias depends on the number of samples used for testing. For the special case of repeated designs and tests using new samples for each design and test, we present an exact analytical expression for how the expected value of the bias decreases with the size of the test set. We show that via modeling and subsequent reduction of the small sample bias, it is possible to obtain an improved estimate of the variance of classifier performance between design sets. However, the uncertainty of the variance estimate is large in the simulations performed, indicating that the method in its present form cannot be directly applied to small data sets.

  4. Evaluating Bayesian spatial methods for modelling species distributions with clumped and restricted occurrence data.

    PubMed

    Redding, David W; Lucas, Tim C D; Blackburn, Tim M; Jones, Kate E

    2017-01-01

    Statistical approaches for inferring the spatial distribution of taxa (Species Distribution Models, SDMs) commonly rely on available occurrence data, which is often clumped and geographically restricted. Although available SDM methods address some of these factors, they could be more directly and accurately modelled using a spatially-explicit approach. Software to fit models with spatial autocorrelation parameters in SDMs is now widely available, but whether such approaches for inferring SDMs aid predictions compared to other methodologies is unknown. Here, within a simulated environment using 1000 generated species' ranges, we compared the performance of two commonly used non-spatial SDM methods (Maximum Entropy Modelling, MAXENT, and boosted regression trees, BRT) to a spatial Bayesian SDM method (fitted using R-INLA), when the underlying data exhibit varying combinations of clumping and geographic restriction. Finally, we tested how any recommended methodological settings designed to account for spatially non-random patterns in the data impact inference. The spatial Bayesian SDM method was the most consistently accurate, ranking among the top two methods in 7 out of 8 data sampling scenarios. Within high-coverage sample datasets, all methods performed fairly similarly. When sampling points were randomly spread, BRT had a 1-3% greater accuracy over the other methods, and when samples were clumped, the spatial Bayesian SDM method had a 4-8% better AUC score. Alternatively, when sampling points were restricted to a small section of the true range, all methods were on average 10-12% less accurate, with greater variation among the methods. Model inference under the recommended settings to account for autocorrelation was not impacted by clumping or restriction of data, except for the complexity of the spatial regression term in the spatial Bayesian model. 
Methods, such as those made available by R-INLA, can be successfully used to account for spatial autocorrelation in an SDM context and, by taking account of random effects, produce outputs that can better elucidate the role of covariates in predicting species occurrence. Given that it is often unclear what the drivers are behind data clumping in an empirical occurrence dataset, or indeed how geographically restricted these data are, spatially-explicit Bayesian SDMs may be the better choice when modelling the spatial distribution of target species.

  5. Barriers to Distance Education in Rural Schools

    ERIC Educational Resources Information Center

    Irvin, Matthew J.; Hannum, Wallace H.; Varre, Claire de la; Farmer, Thomas W.

    2010-01-01

    The primary purpose of the current study was to examine barriers to the use of distance education and explore related factors in small and low-income rural schools. Data were collected via a telephone survey with administrators or other qualified personnel. The sample involved 417 randomly selected small and low-income rural school districts…

  6. Distribution of blood types in a sample of 245 New Zealand non-purebred cats.

    PubMed

    Cattin, R P

    2016-05-01

    To determine the distribution of feline blood types in a sample of non-pedigree, domestic cats in New Zealand, whether a difference exists in this distribution between domestic short haired and domestic long haired cats, and between the North and South Islands of New Zealand; and to calculate the risk of a random blood transfusion causing a severe transfusion reaction, and the risk of a random mating producing kittens susceptible to neonatal isoerythrolysis. The results of 245 blood typing tests in non-pedigree cats performed at the New Zealand Veterinary Pathology (NZVP) and Gribbles Veterinary Pathology laboratories between the beginning of 2009 and the end of 2014 were retrospectively collated and analysed. Cats that were identified as domestic short or long haired were included. For the cats tested at Gribbles Veterinary Pathology 62 were from the North Island, and 27 from the South Island. The blood type distribution differed between samples from the two laboratories (p=0.029), but not between domestic short and long haired cats (p=0.50), or between the North and South Islands (p=0.76). Of the 89 cats tested at Gribbles Veterinary Pathology, 70 (79%) were type A, 18 (20%) type B, and 1 (1%) type AB; for NZVP 139/156 (89.1%) cats were type A, 16 (10.3%) type B, and 1 (0.6%) type AB. It was estimated that 18.3-31.9% of random blood transfusions would be at risk of a transfusion reaction, and neonatal isoerythrolysis would be a risk in 9.2-16.1% of random matings between non-pedigree cats. The results from this study suggest that there is a high risk of complications for a random blood transfusion between non-purebred cats in New Zealand. Neonatal isoerythrolysis should be considered an important differential diagnosis in illness or mortality in kittens during the first days of life.
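
    The reported risk figures can be roughly reconstructed from the type frequencies under simple random-pairing assumptions (this is our reconstruction for illustration, not the authors' stated formulas): a transfusion is at risk when donor and recipient are A/B mismatched in either direction, and neonatal isoerythrolysis is a risk when a type B queen mates with a type A or AB tom.

```python
# Rough reconstruction of the reported risks from the type counts.
# Assumptions (hypothetical simplification): independent random pairing,
# A/B mismatch in either direction counts as a transfusion risk.

def transfusion_risk(p_a, p_b):
    # Either direction of an A/B mismatched donor-recipient pair.
    return 2 * p_a * p_b

def neonatal_isoerythrolysis_risk(p_a, p_b, p_ab):
    # Type B queen (anti-A antibodies) mated to a type A or AB tom.
    return p_b * (p_a + p_ab)

# NZVP: 139 A, 16 B, 1 AB of 156; Gribbles: 70 A, 18 B, 1 AB of 89.
for lab, a, b, ab, n in [("NZVP", 139, 16, 1, 156), ("Gribbles", 70, 18, 1, 89)]:
    pa, pb, pab = a / n, b / n, ab / n
    print(f"{lab}: transfusion {transfusion_risk(pa, pb):.1%}, "
          f"NI {neonatal_isoerythrolysis_risk(pa, pb, pab):.1%}")
```

Under these simplifications the two laboratories' frequencies give transfusion risks of roughly 18% and 32% and neonatal isoerythrolysis risks of roughly 9% and 16%, close to the ranges quoted in the abstract.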

  7. Energy Contents of Frequently Ordered Restaurant Meals and Comparison with Human Energy Requirements and U.S. Department of Agriculture Database Information: A Multisite Randomized Study.

    PubMed

    Urban, Lorien E; Weber, Judith L; Heyman, Melvin B; Schichtl, Rachel L; Verstraete, Sofia; Lowery, Nina S; Das, Sai Krupa; Schleicher, Molly M; Rogers, Gail; Economos, Christina; Masters, William A; Roberts, Susan B

    2016-04-01

    Excess energy intake from meals consumed away from home is implicated as a major contributor to obesity, and ∼50% of US restaurants are individual or small-chain (non-chain) establishments that do not provide nutrition information. To measure the energy content of frequently ordered meals in non-chain restaurants in three US locations, and compare with the energy content of meals from large-chain restaurants, energy requirements, and food database information. A multisite random-sampling protocol was used to measure the energy contents of the most frequently ordered meals from the most popular cuisines in non-chain restaurants, together with equivalent meals from large-chain restaurants. Meals were obtained from restaurants in San Francisco, CA; Boston, MA; and Little Rock, AR, between 2011 and 2014. Meal energy content determined by bomb calorimetry. Regional and cuisine differences were assessed using a mixed model with restaurant nested within region×cuisine as the random factor. Paired t tests were used to evaluate differences between non-chain and chain meals, human energy requirements, and food database values. Meals from non-chain restaurants contained 1,205±465 kcal/meal, amounts that were not significantly different from equivalent meals from large-chain restaurants (+5.1%; P=0.41). There was a significant effect of cuisine on non-chain meal energy, and three of the four most popular cuisines (American, Italian, and Chinese) had the highest mean energy (1,495 kcal/meal). Ninety-two percent of meals exceeded typical energy requirements for a single eating occasion. Non-chain restaurants lacking nutrition information serve amounts of energy that are typically far in excess of human energy requirements for single eating occasions, and are equivalent to amounts served by the large-chain restaurants that have previously been criticized for providing excess energy. 
Restaurants in general, rather than specific categories of restaurant, expose patrons to excessive portions that induce overeating through established biological mechanisms. Copyright © 2016 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.

  8. A new approach to evaluate gamma-ray measurements

    NASA Technical Reports Server (NTRS)

    Dejager, O. C.; Swanepoel, J. W. H.; Raubenheimer, B. C.; Vandervalt, D. J.

    1985-01-01

    Misunderstandings about the term "random sample" and its implications may easily arise. Conditions under which the phases, obtained from arrival times, do not form a random sample and the dangers involved are discussed. Watson's U² test for uniformity is recommended for light curves with duty cycles larger than 10%. Under certain conditions, non-parametric density estimation may be used to determine estimates of the true light curve and its parameters.
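
    Watson's U² statistic for testing uniformity of phases on [0, 1) can be computed with the standard formula. A sketch follows; the asymptotic 5% critical value of about 0.187 is an assumption taken from standard tables, not from this abstract:

```python
def watson_u2(phases):
    """Watson's U^2 statistic for uniformity of phases on [0, 1),
    via the standard computational formula (sketch; assumes n not tiny)."""
    u = sorted(phases)
    n = len(u)
    mean_u = sum(u) / n
    # Cramer-von Mises-type sum against the uniform plotting positions,
    # corrected to be invariant to the choice of phase origin.
    s = sum((ui - (2 * i - 1) / (2 * n)) ** 2 for i, ui in enumerate(u, 1))
    return s + 1 / (12 * n) - n * (mean_u - 0.5) ** 2

# Hypothetical data: 100 evenly spread phases look uniform, 100 phases
# piled at 0.1 do not (asymptotic 5% critical value is about 0.187).
uniform = [(i + 0.5) / 100 for i in range(100)]
print(watson_u2(uniform) < 0.187)       # True
print(watson_u2([0.1] * 100) < 0.187)   # False
```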

  9. Surgical versus Non-Operative Treatment for Lumbar Disc Herniation: Eight-Year Results for the Spine Patient Outcomes Research Trial (SPORT)

    PubMed Central

    Lurie, Jon D.; Tosteson, Tor D.; Tosteson, Anna N. A.; Zhao, Wenyan; Morgan, Tamara S.; Abdu, William A.; Herkowitz, Harry; Weinstein, James N.

    2014-01-01

    Study Design Concurrent prospective randomized and observational cohort studies. Objective To assess the 8-year outcomes of surgery vs. non-operative care. Summary of Background Data Although randomized trials have demonstrated small short-term differences in favor of surgery, long-term outcomes comparing surgical to non-operative treatment remain controversial. Methods Surgical candidates with imaging-confirmed lumbar intervertebral disc herniation meeting SPORT eligibility criteria enrolled into prospective randomized (501 participants) and observational cohorts (743 participants) at 13 spine clinics in 11 US states. Interventions were standard open discectomy versus usual non-operative care. Main outcome measures were changes from baseline in the SF-36 Bodily Pain (BP) and Physical Function (PF) scales and the modified Oswestry Disability Index (ODI - AAOS/Modems version) assessed at 6 weeks, 3 and 6 months, and annually thereafter. Results Advantages were seen for surgery in intent-to-treat analyses for the randomized cohort for all primary and secondary outcomes other than work status; however, with extensive non-adherence to treatment assignment (49% patients assigned to non-operative therapy receiving surgery versus 60% of patients assigned to surgery) these observed effects were relatively small and not statistically significant for primary outcomes (BP, PF, ODI). Importantly, the overall comparison of secondary outcomes was significantly greater with surgery in the intent-to-treat analysis (sciatica bothersomeness [p > 0.005], satisfaction with symptoms [p > 0.013], and self-rated improvement [p > 0.013]) in long-term follow-up. An as-treated analysis showed clinically meaningful surgical treatment effects for primary outcome measures (mean change Surgery vs. Non-operative; treatment effect; 95% CI): BP (45.3 vs. 34.4; 10.9; 7.7 to 14); PF (42.2 vs. 31.5; 10.6; 7.7 to 13.5) and ODI (−36.2 vs. −24.8; −11.2; −13.6 to −9.1). 
Conclusion Carefully selected patients who underwent surgery for a lumbar disc herniation achieved greater improvement than non-operatively treated patients; there was little to no degradation of outcomes in either group (operative and non-operative) from 4 to 8 years. PMID:24153171

  10. The coverage of a random sample from a biological community.

    PubMed

    Engen, S

    1975-03-01

    A taxonomic group will frequently have a large number of species with small abundances. When a sample is drawn at random from this group, one is therefore faced with the problem that a large proportion of the species will not be discovered. A general definition of quantitative measures of "sample coverage" is proposed, and the problem of statistical inference is considered for two special cases, (1) the actual total relative abundance of those species that are represented in the sample, and (2) their relative contribution to the information index of diversity. The analysis is based on an extended version of the negative binomial species frequency model. The results are tabulated.
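
    As a simpler point of comparison to the paper's negative binomial model, the classical Good-Turing estimator gauges case (1), the total relative abundance of the observed species, from the number of species seen exactly once:

```python
from collections import Counter

def turing_coverage(sample):
    """Good-Turing estimate of sample coverage (not the paper's negative
    binomial approach): C_hat = 1 - f1/n, where f1 is the number of
    species observed exactly once and n is the sample size."""
    counts = Counter(sample)
    f1 = sum(1 for c in counts.values() if c == 1)
    return 1 - f1 / len(sample)

# Hypothetical sample of 10 individuals from 5 species;
# 2 of the 10 individuals are singletons, so coverage = 0.8.
print(turing_coverage(list("aaabbbccde")))  # 0.8
```

Intuitively, many singletons suggest many more undiscovered rare species, hence lower coverage.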

  11. Methods for sample size determination in cluster randomized trials

    PubMed Central

    Rutterford, Clare; Copas, Andrew; Eldridge, Sandra

    2015-01-01

    Background: The use of cluster randomized trials (CRTs) is increasing, along with the variety in their design and analysis. The simplest approach for their sample size calculation is to calculate the sample size assuming individual randomization and inflate this by a design effect to account for randomization by cluster. The assumptions of a simple design effect may not always be met; alternative or more complicated approaches are required. Methods: We summarise a wide range of sample size methods available for cluster randomized trials. For those familiar with sample size calculations for individually randomized trials but with less experience in the clustered case, this manuscript provides formulae for a wide range of scenarios with associated explanation and recommendations. For those with more experience, comprehensive summaries are provided that allow quick identification of methods for a given design, outcome and analysis method. Results: We present first those methods applicable to the simplest two-arm, parallel group, completely randomized design followed by methods that incorporate deviations from this design such as: variability in cluster sizes; attrition; non-compliance; or the inclusion of baseline covariates or repeated measures. The paper concludes with methods for alternative designs. Conclusions: There is a large amount of methodology available for sample size calculations in CRTs. This paper gives the most comprehensive description of published methodology for sample size calculation and provides an important resource for those designing these trials. PMID:26174515
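
    The "simplest approach" described above, inflating an individually randomized sample size by a design effect, can be sketched directly (the cluster size and intracluster correlation values below are hypothetical):

```python
import math

def cluster_sample_size(n_individual, cluster_size, icc):
    """Inflate an individually randomized sample size by the design
    effect 1 + (m - 1) * ICC for clusters of size m (simplest method;
    assumes equal cluster sizes)."""
    design_effect = 1 + (cluster_size - 1) * icc
    # round() guards against floating-point fuzz before the ceiling.
    return math.ceil(round(n_individual * design_effect, 8))

# Hypothetical: 400 participants under individual randomization,
# clusters of 20, intracluster correlation (ICC) 0.05.
print(cluster_sample_size(400, 20, 0.05))  # 400 * 1.95 = 780
```

The paper's point is that this formula breaks down under unequal cluster sizes, attrition, non-compliance, or covariate adjustment, where the more elaborate methods it catalogues are needed.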

  12. Maintenance treatment for opioid dependence with slow-release oral morphine: a randomized cross-over, non-inferiority study versus methadone

    PubMed Central

    Beck, Thilo; Haasen, Christian; Verthein, Uwe; Walcher, Stephan; Schuler, Christoph; Backmund, Markus; Ruckes, Christian; Reimer, Jens

    2014-01-01

    Aims To compare the efficacy of slow-release oral morphine (SROM) and methadone as maintenance medication for opioid dependence in patients previously treated with methadone. Design Prospective, multiple-dose, open label, randomized, non-inferiority, cross-over study over two 11-week periods. Methadone treatment was switched to SROM with flexible dosing and vice versa according to period and sequence of treatment. Setting Fourteen out-patient addiction treatment centres in Switzerland and Germany. Participants Adults with opioid dependence in methadone maintenance programmes (dose ≥50 mg/day) for ≥26 weeks. Measurements The efficacy end-point was the proportion of heroin-positive urine samples per patient and period of treatment. Each week, two urine samples were collected, randomly selected and analysed for 6-monoacetyl-morphine and 6-acetylcodeine. Non-inferiority was concluded if the two-sided 95% confidence interval (CI) in the difference of proportions of positive urine samples was below the predefined boundary of 10%. Findings One hundred and fifty-seven patients fulfilled criteria to form the per protocol population. The proportion of heroin-positive urine samples under SROM treatment (0.20) was non-inferior to the proportion under methadone treatment (0.15) (least-squares mean difference 0.05; 95% CI = 0.02, 0.08; P > 0.01). The 95% CI fell within the 10% non-inferiority margin, confirming the non-inferiority of SROM to methadone. A dose-dependent effect was shown for SROM (i.e. decreasing proportions of heroin-positive urine samples with increasing SROM doses). Retention in treatment showed no significant differences between treatments (period 1/period 2: SROM: 88.7%/82.1%, methadone: 91.1%/88.0%; period 1: P = 0.50, period 2: P = 0.19). Overall, safety outcomes were similar between the two groups. Conclusions Slow-release oral morphine appears to be at least as effective as methadone in treating people with opioid use disorder. PMID:24304412
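
    The CI-based non-inferiority rule used here (conclude non-inferiority when the upper confidence limit for the difference in proportions of positive urine samples stays below the 10% margin) can be sketched with a simple Wald interval on hypothetical counts; the trial's actual repeated-measures analysis was more elaborate:

```python
import math

def noninferior(x_new, n_new, x_ref, n_ref, margin=0.10):
    """Sketch of CI-based non-inferiority for a difference in proportions:
    non-inferior if the upper 95% Wald confidence limit of
    (p_new - p_ref) is below the margin. Counts are hypothetical."""
    p1, p2 = x_new / n_new, x_ref / n_ref
    se = math.sqrt(p1 * (1 - p1) / n_new + p2 * (1 - p2) / n_ref)
    upper = (p1 - p2) + 1.96 * se
    return upper < margin

# Hypothetical: 20% positive samples under the new treatment vs 15%
# under the reference; the upper limit (~0.097) stays below 0.10.
print(noninferior(100, 500, 75, 500))  # True
```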

  13. Texture and color features for tile classification

    NASA Astrophysics Data System (ADS)

    Baldrich, Ramon; Vanrell, Maria; Villanueva, Juan J.

    1999-09-01

    In this paper we present the results of a preliminary computer vision system to classify the production of a ceramic tile industry. We focus on the classification of a specific type of tile whose production can be affected by external factors, such as humidity, temperature, and the origin of clays and pigments. Variations in these uncontrolled factors provoke small differences in the color and texture of the tiles that force the entire production to be classified. A consistent, non-subjective classification would help avoid returns from customers and unnecessary stock fragmentation. The aim of this work is to simulate human behavior on this classification task by extracting a set of features from tile images. These features are induced by definitions from experts. To compute them we need to combine color and texture information and to define global and local measures. In this work we do not seek a general texture-color representation; we deal only with textures formed by randomly distributed, non-oriented colored blobs. New samples are classified using discriminant analysis functions derived from tile samples of known class. The last part of the paper is devoted to explaining the correction of acquired images in order to compensate for temporal and geometric illumination changes.

  14. Support for liberal development policies among community elites and non-elites in a rural region of Wisconsin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buttel, F.H.; Johnson, D.E.

    Liberal development policies for rural areas, aimed at improving economic conditions and helping corporations find more profitable production sites, are found to fit the traditional role of intervention by government to encourage private investment in underdeveloped areas. Two strategies used in Wisconsin are analyzed to determine the level of community support and compare the social and intellectual support for growth centers of community elites with non-elites. Results indicate the general public does not support the concept of planned growth centers, with primary opposition coming from professional and farm groups rather than the ''traditionalism'' that is often used to characterize the area. Those favoring growth-center policies are primarily elites, who tend to limit their support to development of their own community. Elites also favor consolidating community and county delivery of services. Data for the study consisted of 231 personal interviews with leaders of 32 small- and mid-size communities. Their responses were then compared with a random sampling of non-elites. 27 references. (DCK)

  15. A comparison of confidence/credible interval methods for the area under the ROC curve for continuous diagnostic tests with small sample size.

    PubMed

    Feng, Dai; Cortese, Giuliana; Baumgartner, Richard

    2017-12-01

    The receiver operating characteristic (ROC) curve is frequently used as a measure of accuracy of continuous markers in diagnostic tests. The area under the ROC curve (AUC) is arguably the most widely used summary index for the ROC curve. Although the small sample size scenario is common in medical tests, a comprehensive study of the small sample size properties of various methods for the construction of the confidence/credible interval (CI) for the AUC has been by and large missing in the literature. In this paper, we describe and compare 29 non-parametric and parametric methods for the construction of the CI for the AUC when the number of available observations is small. The methods considered include not only those that have been widely adopted, but also those that have been less frequently mentioned or, to our knowledge, never applied to the AUC context. To compare different methods, we carried out a simulation study with data generated from binormal models with equal and unequal variances and from exponential models with various parameters and with equal and unequal small sample sizes. We found that the larger the true AUC value and the smaller the sample size, the larger the discrepancy among the results of different approaches. When the model is correctly specified, the parametric approaches tend to outperform the non-parametric ones. Moreover, in the non-parametric domain, we found that a method based on the Mann-Whitney statistic is in general superior to the others. We further elucidate potential issues and provide possible solutions, along with general guidance on CI construction for the AUC when the sample size is small. Finally, we illustrate the utility of different methods through real-life examples.
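The non-parametric approach the abstract singles out starts from the identity that the AUC equals the normalized Mann-Whitney U statistic; a common closed-form interval built on it is the Hanley-McNeil CI. A minimal sketch with hypothetical marker values (the paper's 29 methods are not reproduced here):

```python
import math

def auc_mann_whitney(pos, neg):
    """AUC as the normalized Mann-Whitney U statistic (ties count 0.5)."""
    wins = sum(1.0 if p > q else 0.5 if p == q else 0.0
               for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

def hanley_mcneil_ci(pos, neg, z=1.96):
    """Hanley-McNeil normal-approximation CI for the AUC."""
    m, n = len(pos), len(neg)
    a = auc_mann_whitney(pos, neg)
    q1, q2 = a / (2 - a), 2 * a * a / (1 + a)
    se = math.sqrt((a * (1 - a) + (m - 1) * (q1 - a * a)
                    + (n - 1) * (q2 - a * a)) / (m * n))
    return a, (max(0.0, a - z * se), min(1.0, a + z * se))

pos = [2.1, 3.4, 2.9, 4.0, 1.8, 3.1]  # diseased marker values (hypothetical)
neg = [1.2, 2.0, 1.5, 2.5, 0.9, 1.7]  # healthy marker values (hypothetical)
a, (lo, hi) = hanley_mcneil_ci(pos, neg)
print(round(a, 3), (round(lo, 3), round(hi, 3)))
```

With samples this small the normal approximation is exactly the kind of method whose coverage the paper scrutinizes; the clamping to [0, 1] hints at why.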

  16. Phase II randomized trial of carboplatin, paclitaxel, bevacizumab with or without cixutumumab (IMC-A12) in patients with advanced non-squamous, non-small-cell lung cancer: a trial of the ECOG-ACRIN Cancer Research Group (E3508).

    PubMed

    Argiris, A; Lee, J W; Stevenson, J; Sulecki, M G; Hugec, V; Choong, N W; Saltzman, J N; Song, W; Hansen, R M; Evans, T L; Ramalingam, S S; Schiller, J H

    2017-12-01

    Cixutumumab is a fully human IgG1 monoclonal antibody to the insulin-like growth factor type I receptor that can potentially reverse resistance and enhance the efficacy of chemotherapy. Bevacizumab-eligible patients with stage IV or recurrent non-squamous, non-small-cell lung cancer and good performance status were randomized to receive standard doses of paclitaxel, carboplatin, and bevacizumab to a maximum of six cycles followed by bevacizumab maintenance (CPB) until progression (arm A) or CPB plus cixutumumab 6 mg/kg i.v. weekly (arm B). Of 175 patients randomized, 153 were eligible and treated (78 in arm A; 75 in arm B). The median progression-free survival was 5.8 months (95% CI 5.4-7.1) in arm A versus 7 months (95% CI 5.7-7.6) in arm B (P = 0.33); hazard ratio 0.92 (95% CI 0.65-1.31). Objective response was 46.2% versus 58.7% in arm A versus arm B (P = 0.15). The median overall survival was 16.2 months in arm A versus 16.1 months in arm B (P = 0.95). Grade 3/4 neutropenia and febrile neutropenia, thrombocytopenia, fatigue, and hyperglycemia were increased with cixutumumab. The addition of cixutumumab to CPB increased toxicity without improving efficacy and is not recommended for further development in non-small-cell lung cancer. Both treatment groups had longer overall survival than historical controls, which may be attributed to several factors and emphasizes the value of a comparator arm in phase II trials. NCT00955305. © The Author 2017. Published by Oxford University Press on behalf of the European Society for Medical Oncology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  17. MicroRNA array normalization: an evaluation using a randomized dataset as the benchmark.

    PubMed

    Qin, Li-Xuan; Zhou, Qin

    2014-01-01

    MicroRNA arrays possess a number of unique data features that challenge the assumption key to many normalization methods. We assessed the performance of existing normalization methods using two microRNA array datasets derived from the same set of tumor samples: one dataset was generated using a blocked randomization design when assigning arrays to samples and hence was free of confounding array effects; the second dataset was generated without blocking or randomization and exhibited array effects. The randomized dataset was assessed for differential expression between two tumor groups and treated as the benchmark. The non-randomized dataset was assessed for differential expression after normalization and compared against the benchmark. Normalization improved the true positive rate significantly in the non-randomized data but still possessed a false discovery rate as high as 50%. Adding a batch adjustment step before normalization further reduced the number of false positive markers while maintaining a similar number of true positive markers, which resulted in a false discovery rate of 32% to 48%, depending on the specific normalization method. We concluded the paper with some insights on possible causes of false discoveries to shed light on how to improve normalization for microRNA arrays.
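The abstract does not name the individual normalization methods compared, but quantile normalization is a standard member of this family; a minimal sketch of its core idea, forcing every array to share one common distribution, follows (the data are hypothetical two-array toy values, not microRNA intensities):

```python
def quantile_normalize(arrays):
    """Replace the k-th smallest value in every array by the mean of the
    k-th smallest values across arrays, so all arrays end up with the
    same empirical distribution."""
    n = len(arrays[0])
    ranked = [sorted(a) for a in arrays]
    mean_dist = [sum(r[k] for r in ranked) / len(arrays) for k in range(n)]
    out = []
    for a in arrays:
        order = sorted(range(n), key=lambda i: a[i])  # indices by value rank
        b = [0.0] * n
        for rank, i in enumerate(order):
            b[i] = mean_dist[rank]
        out.append(b)
    return out

arrays = [[5.0, 2.0, 3.0], [4.0, 1.0, 6.0]]
print(quantile_normalize(arrays))  # both arrays now share {1.5, 3.5, 5.5}
```

The paper's point is that such distribution-matching assumes most features are unchanged between groups, an assumption microRNA arrays often violate, which is why a batch adjustment step before normalization helped.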

  18. MicroRNA Array Normalization: An Evaluation Using a Randomized Dataset as the Benchmark

    PubMed Central

    Qin, Li-Xuan; Zhou, Qin

    2014-01-01

    MicroRNA arrays possess a number of unique data features that challenge the assumption key to many normalization methods. We assessed the performance of existing normalization methods using two microRNA array datasets derived from the same set of tumor samples: one dataset was generated using a blocked randomization design when assigning arrays to samples and hence was free of confounding array effects; the second dataset was generated without blocking or randomization and exhibited array effects. The randomized dataset was assessed for differential expression between two tumor groups and treated as the benchmark. The non-randomized dataset was assessed for differential expression after normalization and compared against the benchmark. Normalization improved the true positive rate significantly in the non-randomized data but still possessed a false discovery rate as high as 50%. Adding a batch adjustment step before normalization further reduced the number of false positive markers while maintaining a similar number of true positive markers, which resulted in a false discovery rate of 32% to 48%, depending on the specific normalization method. We concluded the paper with some insights on possible causes of false discoveries to shed light on how to improve normalization for microRNA arrays. PMID:24905456

  19. Intervention for First Graders with Limited Number Knowledge: Large-Scale Replication of a Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Gersten, Russell; Rolfhus, Eric; Clarke, Ben; Decker, Lauren E.; Wilkins, Chuck; Dimino, Joseph

    2015-01-01

    Replication studies are extremely rare in education. This randomized controlled trial (RCT) is a scale-up replication of Fuchs et al., which in a sample of 139 found a statistically significant positive impact for Number Rockets, a small-group intervention for at-risk first graders that focused on building understanding of number operations. The…

  20. Using Non-experimental Data to Estimate Treatment Effects

    PubMed Central

    Stuart, Elizabeth A.; Marcus, Sue M.; Horvitz-Lennon, Marcela V.; Gibbons, Robert D.; Normand, Sharon-Lise T.

    2009-01-01

    While much psychiatric research is based on randomized controlled trials (RCTs), where patients are randomly assigned to treatments, sometimes RCTs are not feasible. This paper describes propensity score approaches, which are increasingly used for estimating treatment effects in non-experimental settings. The primary goal of propensity score methods is to create sets of treated and comparison subjects who look as similar as possible, in essence replicating a randomized experiment, at least with respect to observed patient characteristics. A study to estimate the metabolic effects of antipsychotic medication in a sample of Florida Medicaid beneficiaries with schizophrenia illustrates methods. PMID:20563313
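A minimal sketch of the matching step that propensity score methods rely on: assuming scores have already been estimated (typically by logistic regression of treatment assignment on observed covariates), greedily pair each treated subject with the closest-scoring unused comparison subject. Subject IDs and scores here are hypothetical, not from the Medicaid study.

```python
def match_nearest(treated, controls):
    """Greedy 1:1 nearest-neighbor matching without replacement
    on the propensity score."""
    pool = dict(controls)                 # id -> propensity score
    pairs = []
    for tid, ts in sorted(treated.items()):
        if not pool:
            break
        # comparison subject whose score is closest to this treated subject's
        cid = min(pool, key=lambda c: abs(pool[c] - ts))
        pairs.append((tid, cid))
        del pool[cid]                     # each control used at most once
    return pairs

treated  = {"t1": 0.62, "t2": 0.35, "t3": 0.80}   # hypothetical scores
controls = {"c1": 0.30, "c2": 0.60, "c3": 0.78, "c4": 0.50}
print(match_nearest(treated, controls))
```

After matching, outcomes are compared within the matched sets, mimicking the balance a randomized experiment would have produced on the observed covariates only.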

  1. Effects of behavioral stress reduction Transcendental Meditation intervention in Persons with HIV

    PubMed Central

    Chhatre, Sumedha; Metzger, David S.; Frank, Ian; Boyer, Jean; Thompson, Edward; Nidich, Sanford; Montaner, Luis J.; Jayadevappa, Ravishankar

    2013-01-01

    Stress is implicated in the pathogenesis and progression of HIV. The Transcendental Meditation is a behavioral stress reduction program that incorporates mind-body approach, and has demonstrated effectiveness in improving outcomes via stress reduction. We evaluated the feasibility of implementing Transcendental Meditation and its effects on outcomes in persons with HIV. In this community based single blinded Phase-I, randomized controlled trial, outcomes (psychological and physiological stress, immune activation, generic and HIV-specific health related quality of life, depression and quality of well-being) were assessed at baseline and at six months, and were compared using parametric and non-parametric tests. Twenty two persons with HIV were equally randomized to Transcendental Meditation intervention or healthy eating (HE) education control group. Retention was 100% in Transcendental Meditation group and 91% in healthy eating control group. The Transcendental Meditation group exhibited significant improvement in vitality. Significant between group differences were observed for generic and HIV-specific health related quality of life.. Small sample size may possibly limit the ability to observe significant differences in some outcomes. Transcendental Meditation stress reduction intervention in community dwelling adults with HIV is viable and can enhance health related quality of life. Further research with large sample and longer follow-up is needed to validate our results. PMID:23394825

  2. Peptide ligands targeting integrin alpha3beta1 in non-small cell lung cancer.

    PubMed

    Lau, Derick; Guo, Linlang; Liu, Ruiwu; Marik, Jan; Lam, Kit

    2006-06-01

    Lung cancer is one of the most common cancers and is the leading cause of cancer death. We wish to identify peptide ligands for unique cell surface receptors of non-small lung cancer with the hope of developing these ligands as diagnostic and therapeutic agents. Using the method of 'one-bead one-peptide' combinatorial chemistry, a library of random cyclic octapeptides was synthesized on polystyrene beads. This library was used to screen for peptides that promoted attachment of lung adenocarcinoma cells employing a 'cell-growth-on-bead' assay. Consensus peptide sequences of cNGXGXXc were identified. These peptides promoted cell adhesion by targeting integrin alpha3beta1 over-expressed in non-small lung cancer cells. These peptide beads can be applied to capture cancer cells in malignant pleural fluid for purpose of diagnosis of lung cancer.

  3. Analysis of Puumala hantavirus in a bank vole population in northern Finland: evidence for co-circulation of two genetic lineages and frequent reassortment between strains.

    PubMed

    Razzauti, Maria; Plyusnina, Angelina; Sironen, Tarja; Henttonen, Heikki; Plyusnin, Alexander

    2009-08-01

    In this study, for the first time, two distinct genetic lineages of Puumala virus (PUUV) were found within a small sampling area and within a single host genetic lineage (Ural mtDNA) at Pallasjärvi, northern Finland. Lung tissue samples of 171 bank voles (Myodes glareolus) trapped in September 1998 were screened for the presence of PUUV nucleocapsid antigen and 25 were found to be positive. Partial sequences of the PUUV small (S), medium (M) and large (L) genome segments were recovered from these samples using RT-PCR. Phylogenetic analysis revealed two genetic groups of PUUV sequences that belonged to the Finnish and north Scandinavian lineages. This presented a unique opportunity to study inter-lineage reassortment in PUUV; indeed, 32 % of the studied bank voles appeared to carry reassortant virus genomes. Thus, the frequency of inter-lineage reassortment in PUUV was comparable to that of intra-lineage reassortment observed previously (Razzauti, M., Plyusnina, A., Henttonen, H. & Plyusnin, A. (2008). J Gen Virol 89, 1649-1660). Of six possible reassortant S/M/L combinations, only two were found at Pallasjärvi and, notably, in all reassortants, both S and L segments originated from the same genetic lineage, suggesting a non-random pattern for the reassortment. These findings are discussed in connection to PUUV evolution in Fennoscandia.

  4. Observational studies of patients in the emergency department: a comparison of 4 sampling methods.

    PubMed

    Valley, Morgan A; Heard, Kennon J; Ginde, Adit A; Lezotte, Dennis C; Lowenstein, Steven R

    2012-08-01

    We evaluate the ability of 4 sampling methods to generate representative samples of the emergency department (ED) population. We analyzed the electronic records of 21,662 consecutive patient visits at an urban, academic ED. From this population, we simulated different models of study recruitment in the ED by using 2 sample sizes (n=200 and n=400) and 4 sampling methods: true random, random 4-hour time blocks by exact sample size, random 4-hour time blocks by a predetermined number of blocks, and convenience or "business hours." For each method and sample size, we obtained 1,000 samples from the population. Using χ² tests, we measured the number of statistically significant differences between the sample and the population for 8 variables (age, sex, race/ethnicity, language, triage acuity, arrival mode, disposition, and payer source). Then, for each variable, method, and sample size, we compared the proportion of the 1,000 samples that differed from the overall ED population to the expected proportion (5%). Only the true random samples represented the population with respect to sex, race/ethnicity, triage acuity, mode of arrival, language, and payer source in at least 95% of the samples. Patient samples obtained using random 4-hour time blocks and business hours sampling systematically differed from the overall ED patient population for several important demographic and clinical variables. However, the magnitude of these differences was not large. Common sampling strategies selected for ED-based studies may affect parameter estimates for several representative population variables. However, the potential for bias for these variables appears small. Copyright © 2012. Published by Mosby, Inc.
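The contrast between true random and convenience ("business hours") sampling can be illustrated on synthetic visit data: a convenience sample drawn only from daytime arrivals contains no night visits at all, while a true random sample reflects their population share. The visit log below is simulated, not the study's data.

```python
import random

random.seed(7)

# Hypothetical ED visit log: (hour of arrival, triage acuity 1-5)
visits = [(random.randrange(24), random.randrange(1, 6)) for _ in range(21_662)]

def true_random(visits, n):
    """Simple random sample of n visits from the whole log."""
    return random.sample(visits, n)

def business_hours(visits, n):
    """Convenience sample: the first n visits arriving 08:00-17:00."""
    daytime = [v for v in visits if 8 <= v[0] < 17]
    return daytime[:n]

def night_share(sample):
    """Fraction of sampled visits arriving overnight (22:00-06:00)."""
    return sum(1 for h, _ in sample if h < 6 or h >= 22) / len(sample)

rnd = true_random(visits, 400)
biz = business_hours(visits, 400)
print(round(night_share(rnd), 2), night_share(biz))
```

The study's more nuanced finding, that even random 4-hour time blocks can systematically differ from the population, follows the same logic: any scheme that constrains *when* patients can be enrolled constrains *who* can be enrolled.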

  5. Using pilot data to size a two-arm randomized trial to find a nearly optimal personalized treatment strategy.

    PubMed

    Laber, Eric B; Zhao, Ying-Qi; Regh, Todd; Davidian, Marie; Tsiatis, Anastasios; Stanford, Joseph B; Zeng, Donglin; Song, Rui; Kosorok, Michael R

    2016-04-15

    A personalized treatment strategy formalizes evidence-based treatment selection by mapping patient information to a recommended treatment. Personalized treatment strategies can produce better patient outcomes while reducing cost and treatment burden. Thus, among clinical and intervention scientists, there is a growing interest in conducting randomized clinical trials when one of the primary aims is estimation of a personalized treatment strategy. However, at present, there are no appropriate sample size formulae to assist in the design of such a trial. Furthermore, because the sampling distribution of the estimated outcome under an estimated optimal treatment strategy can be highly sensitive to small perturbations in the underlying generative model, sample size calculations based on standard (uncorrected) asymptotic approximations or computer simulations may not be reliable. We offer a simple and robust method for powering a single stage, two-armed randomized clinical trial when the primary aim is estimating the optimal single stage personalized treatment strategy. The proposed method is based on inverting a plugin projection confidence interval and is thereby regular and robust to small perturbations of the underlying generative model. The proposed method requires elicitation of two clinically meaningful parameters from clinical scientists and uses data from a small pilot study to estimate nuisance parameters, which are not easily elicited. The method performs well in simulated experiments and is illustrated using data from a pilot study of time to conception and fertility awareness. Copyright © 2015 John Wiley & Sons, Ltd.

  6. Impact of spatial variability and sampling design on model performance

    NASA Astrophysics Data System (ADS)

    Schrape, Charlotte; Schneider, Anne-Kathrin; Schröder, Boris; van Schaik, Loes

    2017-04-01

    Many environmental physical and chemical parameters, as well as species distributions, display spatial variability at different scales. When measurements are costly in labour time or money, a choice has to be made between a high sampling resolution at small scales with low spatial cover of the study area, and a lower sampling resolution at small scales, which yields local data uncertainties but better spatial cover of the whole area. This dilemma is often faced in the design of field sampling campaigns for large-scale studies. When the gathered field data are subsequently used for modelling purposes, the choice of sampling design and the resulting data quality influence the model performance criteria. We studied this influence in a virtual model study based on a large dataset of field information on the spatial variation of earthworms at different scales. We built a virtual map of anecic earthworm distributions over the Weiherbach catchment (Baden-Württemberg, Germany). First, the field-scale abundance of earthworms was estimated using a catchment-scale model based on 65 field measurements. Subsequently, the high small-scale variability was added using semi-variograms, based on five fields with a total of 430 measurements in a spatially nested sampling design, to estimate the nugget, range and standard deviation of measurements within the fields. With the produced maps, we performed virtual samplings of one up to 50 random points per field. We then used these data to rebuild the catchment-scale models of anecic earthworm abundance with the same model parameters as in the work by Palm et al. (2013). The results show clearly that a large part of the unexplained deviance of the models is due to the very high small-scale variability in earthworm abundance: models based on single virtual sampling points on average obtain an explained deviance of 0.20 and a correlation coefficient of 0.64.
With increasing numbers of sampling points per field, we averaged the measured abundances within each field to obtain a more representative field average. Doubling the samplings per field strongly improved the model performance criteria (explained deviance 0.38, correlation coefficient 0.73). With 50 sampling points per field the criteria reached 0.91 and 0.97 for explained deviance and correlation coefficient, respectively. The relationship between the number of samplings and the performance criteria can be described by a saturation curve; beyond five samples per field the model improvement becomes rather small. With this contribution we wish to discuss the impact of data variability at the sampling scale on model performance and the implications for sampling design, assessment of model results, and ecological inference.
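The core effect reported above, that averaging more point samples per field suppresses small-scale noise in the field mean, follows the familiar 1/sqrt(n) law, which a small Monte Carlo sketch can reproduce (all parameter values are hypothetical, not taken from the Weiherbach data):

```python
import math
import random
import statistics

random.seed(3)

TRUE_FIELD_MEAN = 12.0   # hypothetical anecic earthworm abundance per field
SMALL_SCALE_SD = 8.0     # hypothetical high within-field variability

def field_estimate(n_samples):
    """Average of n point samples from one field."""
    return statistics.mean(random.gauss(TRUE_FIELD_MEAN, SMALL_SCALE_SD)
                           for _ in range(n_samples))

def rmse(n_samples, reps=2000):
    """Root-mean-square error of the field-mean estimate."""
    errs = [(field_estimate(n_samples) - TRUE_FIELD_MEAN) ** 2
            for _ in range(reps)]
    return math.sqrt(statistics.mean(errs))

for n in (1, 2, 5, 50):
    print(n, round(rmse(n), 2))   # error shrinks roughly as SD / sqrt(n)
```

The saturation-curve behaviour the authors describe appears directly: the jump from 1 to 2 samples removes far more error than the jump from 5 to 50.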

  7. Using an Instrumented Drone to Sample Dust Devils

    NASA Astrophysics Data System (ADS)

    Jackson, Brian; Lorenz, Ralph; Davis, Karan; Lipple, Brock

    2017-10-01

    Dust devils are low-pressure, small (many to tens of meters) convective vortices powered by surface heating and rendered visible by lofted dust. Dust devils occur in arid climates on Earth, where they degrade air quality and pose a hazard to small aircraft. They also occur ubiquitously on Mars, where they may dominate the supply of atmospheric dust. Since dust contributes significantly to Mars’ atmospheric heat budget, dust devils probably play an important role in its climate. The dust-lifting capacity of a devil likely depends sensitively on its structure, particularly the wind and pressure profiles, but the exact dependencies are poorly constrained. Thus, the exact contribution to Mars’ atmosphere remains unresolved. Moreover, most previous studies of martian dust devils have relied on passive sampling of the profiles via meteorology packages on landed spacecraft, resulting in random encounter geometries which non-trivially skew the retrieved profiles. Analog studies of terrestrial devils have employed more active sampling (instrumented vehicles or manned aircraft) but have been limited to near-surface (few meters) or relatively high altitude (hundreds of meters) sampling. Unmanned aerial vehicles (UAVs) or drones, combined with miniature, digital instrumentation, promise a novel and uniquely powerful platform from which to sample dust devils via (relatively) controlled geometries at a wide variety of altitudes. In this presentation, we will describe a pilot study using an instrumented quadcopter on an active field site in southeastern Oregon, which (to our knowledge) has not previously been surveyed for dust devils. We will present preliminary results from the resulting encounters, including stereo image analysis and encounter footage collected onboard the drone.

  8. Occupancy Modeling Species-Environment Relationships with Non-ignorable Survey Designs.

    PubMed

    Irvine, Kathryn M; Rodhouse, Thomas J; Wright, Wilson J; Olsen, Anthony R

    2018-05-26

    Statistical models supporting inferences about species occurrence patterns in relation to environmental gradients are fundamental to ecology and conservation biology. A common implicit assumption is that the sampling design is ignorable and does not need to be formally accounted for in analyses. The analyst assumes the data are representative of the desired population, and statistical modeling proceeds. However, if datasets from probability and non-probability surveys are combined, or unequal selection probabilities are used, the design may be non-ignorable. We outline the use of pseudo-maximum likelihood estimation for site-occupancy models to account for such non-ignorable survey designs. This estimation method accounts for the survey design by properly weighting the pseudo-likelihood equation. In our empirical example, legacy and newer randomly selected locations were surveyed for bats to bridge a historic statewide effort with an ongoing nationwide program. We provide a worked example using bat acoustic detection/non-detection data and show how analysts can diagnose whether their design is ignorable. Using simulations, we assessed whether our approach is viable for modeling datasets composed of sites contributed outside of a probability design. Pseudo-maximum likelihood estimates differed from the usual maximum likelihood occupancy estimates for some bat species. The simulations show that the maximum likelihood estimator of species-environment relationships under non-ignorable sampling designs was biased, whereas the pseudo-likelihood estimator was design-unbiased. However, in our simulation study, designs composed of a large proportion of legacy or non-probability sites resulted in estimation issues for standard errors. These issues were likely a result of highly variable weights confounded by small sample sizes (5% or 10% sampling intensity and 4 revisits).
Aggregating datasets from multiple sources logically supports larger sample sizes and potentially increases the spatial extent of statistical inferences. Our results suggest that ignoring the mechanism by which locations were selected for data collection (i.e., the sampling design) can lead to erroneous model-based conclusions. Therefore, to ensure robust and defensible recommendations for evidence-based conservation decision-making, the survey design information, in addition to the data themselves, must be available to analysts. Details for constructing the weights used in estimation, and code for implementation, are provided. This article is protected by copyright. All rights reserved.
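The weighting idea behind the pseudo-likelihood approach can be illustrated with a simpler design-weighted estimator: when "easy" sites are both more likely to be occupied and more heavily sampled, the unweighted occupancy rate is biased, while weighting each observation by its inverse inclusion probability (a Hajek-normalized Horvitz-Thompson estimator) recovers the population value. This toy simulation is not the paper's site-occupancy model; detection is assumed perfect and all numbers are hypothetical.

```python
import random

random.seed(11)

# Hypothetical population of 10,000 sites: occupancy is higher at "easy"
# sites, and easy sites are over-sampled (a non-ignorable design).
population = []
for _ in range(10_000):
    easy = random.random() < 0.5
    occupied = random.random() < (0.8 if easy else 0.2)
    population.append((easy, occupied))
# True population occupancy = 0.5 * 0.8 + 0.5 * 0.2 = 0.5

sample = []
for easy, occ in population:
    p_select = 0.20 if easy else 0.02   # unequal inclusion probabilities
    if random.random() < p_select:
        sample.append((occ, p_select))

naive = sum(occ for occ, _ in sample) / len(sample)
# Hajek (normalized Horvitz-Thompson) estimator: weight by 1 / p_select
ht = (sum(occ / p for occ, p in sample)
      / sum(1 / p for _, p in sample))
print(round(naive, 2), round(ht, 2))   # naive is inflated; ht is near 0.5
```

The paper's pseudo-maximum likelihood method applies the same inverse-probability weights inside the occupancy likelihood rather than to a simple mean.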

  9. Survival Outcome After Stereotactic Body Radiation Therapy and Surgery for Stage I Non-Small Cell Lung Cancer: A Meta-Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Xiangpeng; Schipper, Matthew; Department of Biostatistics, the University of Michigan, Ann Arbor, Michigan

    Purpose: This study compared treatment outcomes of stereotactic body radiation therapy (SBRT) with those of surgery in stage I non-small cell lung cancer (NSCLC). Methods and Materials: Eligible studies of SBRT and surgery were retrieved through extensive searches of the PubMed, Medline, Embase, and Cochrane library databases from 2000 to 2012. Original English publications of stage I NSCLC with adequate sample sizes and adequate SBRT doses were included. A multivariate random effects model was used to perform a meta-analysis to compare survival between treatments while adjusting for differences in patient characteristics. Results: Forty SBRT studies (4850 patients) and 23 surgery studies (7071 patients) published in the same period were eligible. The median age and follow-up duration were 74 years and 28.0 months for SBRT patients and 66 years and 37 months for surgery patients, respectively. The mean unadjusted overall survival rates at 1, 3, and 5 years with SBRT were 83.4%, 56.6%, and 41.2% compared to 92.5%, 77.9%, and 66.1% with lobectomy and 93.2%, 80.7%, and 71.7% with limited lung resections. In SBRT studies, overall survival improved with increasing proportion of operable patients. After we adjusted for proportion of operable patients and age, SBRT and surgery had similar estimated overall and disease-free survival. Conclusions: Patients treated with SBRT differ substantially from patients treated with surgery in age and operability. After adjustment for these differences, overall survival and disease-free survival do not differ significantly between SBRT and surgery in patients with operable stage I NSCLC. A randomized prospective trial is warranted to compare the efficacy of SBRT and surgery.

  10. Molecular predictors of outcome with gefitinib and docetaxel in previously treated non-small-cell lung cancer: data from the randomized phase III INTEREST trial.

    PubMed

    Douillard, Jean-Yves; Shepherd, Frances A; Hirsh, Vera; Mok, Tony; Socinski, Mark A; Gervais, Radj; Liao, Mei-Lin; Bischoff, Helge; Reck, Martin; Sellers, Mark V; Watkins, Claire L; Speake, Georgina; Armour, Alison A; Kim, Edward S

    2010-02-10

    PURPOSE In the phase III INTEREST trial, 1,466 pretreated patients with advanced non-small cell lung cancer (NSCLC) were randomly assigned to receive gefitinib or docetaxel. As a preplanned analysis, we prospectively analyzed available tumor biopsies to investigate the relationship between biomarkers and clinical outcomes. METHODS Biomarkers included epidermal growth factor receptor (EGFR) copy number by fluorescent in situ hybridization (374 assessable samples), EGFR protein expression by immunohistochemistry (n = 380), and EGFR (n = 297) and KRAS (n = 275) mutations. RESULTS For all biomarker subgroups analyzed, survival was similar for gefitinib and docetaxel, with no statistically significant differences between treatments and no significant treatment by biomarker status interaction tests. EGFR mutation-positive patients had longer progression-free survival (PFS; hazard ratio [HR], 0.16; 95% CI, 0.05 to 0.49; P = .001) and higher objective response rate (ORR; 42.1% v 21.1%; P = .04), and patients with high EGFR copy number had higher ORR (13.0% v 7.4%; P = .04) with gefitinib versus docetaxel. CONCLUSION These biomarkers do not appear to be predictive factors for differential survival between gefitinib and docetaxel in this setting of previously treated patients; however, subsequent treatments may have influenced the survival results. For secondary end points of PFS and ORR, some advantages for gefitinib over docetaxel were seen in EGFR mutation-positive and high EGFR copy number patients. There was no statistically significant difference between gefitinib and docetaxel in biomarker-negative patients. This suggests gefitinib can provide similar overall survival to docetaxel in patients across a broad range of clinical subgroups and that EGFR biomarkers such as mutation status may additionally identify which patients are likely to gain greatest PFS and ORR benefit from gefitinib.

  11. A statistical approach to selecting and confirming validation targets in -omics experiments

    PubMed Central

    2012-01-01

    Background Genomic technologies are, by their very nature, designed for hypothesis generation. In some cases, the hypotheses that are generated require that genome scientists confirm findings about specific genes or proteins. But one major advantage of high-throughput technology is that global genetic, genomic, transcriptomic, and proteomic behaviors can be observed. Manual confirmation of every statistically significant genomic result is prohibitively expensive. This has led researchers in genomics to adopt the strategy of confirming only a handful of the most statistically significant results, a small subset chosen for biological interest, or a small random subset. But there is no standard approach for selecting and quantitatively evaluating validation targets. Results Here we present a new statistical method and approach for statistically validating lists of significant results based on confirming only a small random sample. We apply our statistical method to show that the usual practice of confirming only the most statistically significant results does not statistically validate result lists. We analyze an extensively validated RNA-sequencing experiment to show that confirming a random subset can statistically validate entire lists of significant results. Finally, we analyze multiple publicly available microarray experiments to show that statistically validating random samples can both (i) provide evidence to confirm long gene lists and (ii) save thousands of dollars and hundreds of hours of labor over manual validation of each significant result. Conclusions For high-throughput -omics studies, statistical validation is a cost-effective and statistically valid approach to confirming lists of significant results. PMID:22738145
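The strategy of confirming only a small random subset can be sketched as a simple binomial interval on the list's confirmation rate; the Wald interval below is an illustrative stand-in for the authors' actual statistical method, and the counts are hypothetical.

```python
import math

def list_confirmation_ci(confirmed, sampled, z=1.96):
    """Wald CI for the confirmation rate of the full significant-result
    list, based on manually validating a small random subset of it."""
    p = confirmed / sampled
    se = math.sqrt(p * (1 - p) / sampled)
    return p, (max(0.0, p - z * se), min(1.0, p + z * se))

# Hypothetical: a list of 1,200 significant genes; 30 drawn at random
# for manual validation, of which 27 are confirmed.
p, (lo, hi) = list_confirmation_ci(27, 30)
print(round(p, 2), round(lo, 2), round(hi, 2))
```

Because the subset is drawn at random, the interval applies to the whole list, which is exactly what confirming only the most significant (or most interesting) results cannot deliver.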

  12. Comprehensive Analysis of Genome Rearrangements in Eight Human Malignant Tumor Tissues

    PubMed Central

    Wang, Chong

    2016-01-01

    Carcinogenesis is a complex multifactorial, multistage process, but the precise mechanisms are not well understood. In this study, we performed a genome-wide analysis of copy number variation (CNV), breakpoint regions (BPRs) and fragile sites in 2,737 tumor samples from eight tumor entities and in 432 normal samples. CNV detection and BPR identification revealed that BPRs tended to accumulate in specific genomic regions in tumor samples while being dispersed genome-wide in the normal samples. Hotspots were observed at which segments with similar copy number alterations overlapped and BPRs clustered adjacently. Evaluation of BPR occurrence frequency showed that at least one BPR was detected in 15% or more of the samples for each tumor entity, whereas BPRs occurred in at most 12% of the normal samples. 127 of 2,716 tumor-relevant BPRs (termed ‘common BPRs’) also exhibited a noticeable occurrence frequency in the normal samples. Colocalization assessment identified 20,077 CNV-affected genes, 169 of which are known tumor-related genes. The most noteworthy are KIAA0513, important for immunologic, synaptic and apoptotic signaling pathways; the intergenic non-coding RNA RP11-115C21.2, possibly acting as an oncogene or tumor suppressor by changing chromatin structure; ADAM32, likely important in cancer cell proliferation and progression through ectodomain shedding of diverse growth factors; and the well-known tumor suppressor gene p53. The BPR distributions indicate that CNV mutations are likely non-random in tumor genomes. The marked recurrence of BPRs at specific regions supports common progression mechanisms in tumors. The presence of hotspots together with common BPRs, despite the small size of the latter group, implies a relation between fragile sites and cancer-gene alteration. Our data further suggest that both protein-coding and non-coding genes possessing a range of biological functions might play a causative or functional role in tumor biology.
This research enhances our understanding of the mechanisms for tumorigenesis and progression. PMID:27391163

  13. An approach to trial design and analysis in the era of non-proportional hazards of the treatment effect.

    PubMed

    Royston, Patrick; Parmar, Mahesh K B

    2014-08-07

    Most randomized controlled trials with a time-to-event outcome are designed and analysed under the proportional hazards assumption, with a target hazard ratio for the treatment effect in mind. However, the hazards may be non-proportional. We address how to design a trial under such conditions, and how to analyse the results. We propose to extend the usual approach, a logrank test, to also include the Grambsch-Therneau test of proportional hazards. We test the resulting composite null hypothesis using a joint test for the hazard ratio and for time-dependent behaviour of the hazard ratio. We compute the power and sample size for the logrank test under proportional hazards, and from that we compute the power of the joint test. For the estimation of relevant quantities from the trial data, various models could be used; we advocate adopting a pre-specified flexible parametric survival model that supports time-dependent behaviour of the hazard ratio. We present the mathematics for calculating the power and sample size for the joint test. We illustrate the methodology with real data from two randomized trials, one in ovarian cancer and the other in treating cellulitis. We show selected estimates and their uncertainty derived from the advocated flexible parametric model. We demonstrate in a small simulation study that the joint test can outperform the logrank test when a treatment effect either increases or decreases over time. Those designing and analysing trials in the era of non-proportional hazards need to acknowledge that a more complex type of treatment effect is becoming more common. Our method for trial design retains the tools familiar from the standard logrank-based methodology and extends them to incorporate a joint test of the null hypothesis with power against non-proportional hazards.
For the analysis of trial data, we propose the use of a pre-specified flexible parametric model that can represent a time-dependent hazard ratio if one is present.
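    As a back-of-envelope illustration of how such a joint test could behave: if the logrank statistic and a proportional-hazards test statistic were independent 1-df chi-squares under the composite null, their sum would be chi-square with 2 df, whose survival function has the closed form exp(-x/2). This is a simplified sketch under that independence assumption, not Royston and Parmar's exact construction; the statistic values are hypothetical.

```python
import math

def joint_test_pvalue(logrank_chi2: float, ph_chi2: float) -> float:
    """Combine a 1-df logrank chi-square with a 1-df proportional-hazards
    (Grambsch-Therneau) chi-square. Assuming independence under the
    composite null, their sum is chi-square with 2 df, whose survival
    function is exactly exp(-x / 2)."""
    return math.exp(-(logrank_chi2 + ph_chi2) / 2)

# A logrank statistic of 3.2 is not significant alone (1-df p ~ 0.07),
# but combined with clear non-proportionality (chi2 = 4.1) the joint
# test rejects the composite null at the 5% level.
p_joint = joint_test_pvalue(3.2, 4.1)
```

    This mirrors the paper's point: under a time-dependent treatment effect, evidence against proportional hazards contributes power that the logrank test alone discards.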

  14. Estimating the prevalence of 26 health-related indicators at neighbourhood level in the Netherlands using structured additive regression.

    PubMed

    van de Kassteele, Jan; Zwakhals, Laurens; Breugelmans, Oscar; Ameling, Caroline; van den Brink, Carolien

    2017-07-01

    Local policy makers increasingly need information on health-related indicators at smaller geographic levels like districts or neighbourhoods. Although more large data sources have become available, direct estimates of the prevalence of a health-related indicator cannot be produced for neighbourhoods for which only small samples or no samples are available. Small area estimation provides a solution, but unit-level models for binary-valued outcomes that can handle both non-linear effects of the predictors and spatially correlated random effects in a unified framework are rarely encountered. We used data on 26 binary-valued health-related indicators collected on 387,195 persons in the Netherlands. We associated the health-related indicators at the individual level with a set of 12 predictors obtained from national registry data. We formulated a structured additive regression model for small area estimation. The model captured potential non-linear relations between the predictors and the outcome through additive terms in a functional form using penalized splines and included a term that accounted for spatially correlated heterogeneity between neighbourhoods. The registry data were used to predict individual outcomes which in turn are aggregated into higher geographical levels, i.e. neighbourhoods. We validated our method by comparing the estimated prevalences with observed prevalences at the individual level and by comparing the estimated prevalences with direct estimates obtained by weighting methods at municipality level. We estimated the prevalence of the 26 health-related indicators for 415 municipalities, 2599 districts and 11,432 neighbourhoods in the Netherlands. We illustrate our method on overweight data and show that there are distinct geographic patterns in the overweight prevalence. Calibration plots show that the estimated prevalences agree very well with observed prevalences at the individual level. 
The estimated prevalences also agree reasonably well with the direct estimates at the municipal level. Structured additive regression is a useful tool for providing small area estimates in a unified framework. We are able to produce valid nationwide small area estimates of 26 health-related indicators at neighbourhood level in the Netherlands. The results can be used by local policy makers to make appropriate health policy decisions.

  15. Comparative analyses of basal rate of metabolism in mammals: data selection does matter.

    PubMed

    Genoud, Michel; Isler, Karin; Martin, Robert D

    2018-02-01

    Basal rate of metabolism (BMR) is a physiological parameter that should be measured under strictly defined experimental conditions. In comparative analyses among mammals BMR is widely used as an index of the intensity of the metabolic machinery or as a proxy for energy expenditure. Many databases with BMR values for mammals are available, but the criteria used to select metabolic data as BMR estimates have often varied and the potential effect of this variability has rarely been questioned. We provide a new, expanded BMR database reflecting compliance with standard criteria (resting, postabsorptive state; thermal neutrality; adult, non-reproductive status for females) and examine potential effects of differential selectivity on the results of comparative analyses. The database includes 1739 different entries for 817 species of mammals, compiled from the original sources. It provides information permitting assessment of the validity of each estimate and presents the value closest to a proper BMR for each entry. Using different selection criteria, several alternative data sets were extracted and used in comparative analyses of (i) the scaling of BMR to body mass and (ii) the relationship between brain mass and BMR. It was expected that results would be especially dependent on selection criteria with small sample sizes and with relatively weak relationships. Phylogenetically informed regression (phylogenetic generalized least squares, PGLS) was applied to the alternative data sets for several different clades (Mammalia, Eutheria, Metatheria, or individual orders). For Mammalia, a 'subsampling procedure' was also applied, in which random subsamples of different sample sizes were taken from each original data set and successively analysed. In each case, two data sets with identical sample size and species, but comprising BMR data with different degrees of reliability, were compared. 
Selection criteria had minor effects on scaling equations computed for large clades (Mammalia, Eutheria, Metatheria), although less-reliable estimates of BMR were generally about 12-20% larger than more-reliable ones. Larger effects were found with more-limited clades, such as sciuromorph rodents. For the relationship between BMR and brain mass the results of comparative analyses were found to depend strongly on the data set used, especially with more-limited, order-level clades. In fact, with small sample sizes (e.g. <100) results often appeared erratic. Subsampling revealed that sample size has a non-linear effect on the probability of a zero slope for a given relationship. Depending on the species included, results could differ dramatically, especially with small sample sizes. Overall, our findings indicate a need for due diligence when selecting BMR estimates and caution regarding results (even if seemingly significant) with small sample sizes. © 2017 Cambridge Philosophical Society.
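    The scaling analyses above fit relationships of the form BMR = a·M^b on log-transformed data. As a minimal, non-phylogenetic baseline (PGLS additionally models the covariance induced by shared ancestry), the allometric exponent is simply the OLS slope on log-log axes. The sketch below uses invented data placed exactly on a 3/4-power line to show the recovery of b.

```python
import math

def loglog_slope(mass, bmr):
    """OLS slope of log10(BMR) on log10(mass): the allometric exponent b
    in BMR ~ a * mass^b. Ordinary (non-phylogenetic) least squares only."""
    xs = [math.log10(m) for m in mass]
    ys = [math.log10(v) for v in bmr]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

# Invented data lying exactly on a 3/4-power line (masses in grams):
mass = [10, 100, 1000, 10000]
bmr = [m ** 0.75 for m in mass]
b = loglog_slope(mass, bmr)  # recovers 0.75
```

    With real BMR compilations the estimate of b shifts with data selection and clade, which is precisely the sensitivity the study quantifies.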

  16. Statistical Sampling Handbook for Student Aid Programs: A Reference for Non-Statisticians. Winter 1984.

    ERIC Educational Resources Information Center

    Office of Student Financial Assistance (ED), Washington, DC.

    A manual on sampling is presented to assist audit and program reviewers, project officers, managers, and program specialists of the U.S. Office of Student Financial Assistance (OSFA). For each of the following types of samples, definitions and examples are provided, along with information on advantages and disadvantages: simple random sampling,…

  17. Small incision lenticule extraction (SMILE) versus laser in-situ keratomileusis (LASIK): study protocol for a randomized, non-inferiority trial.

    PubMed

    Ang, Marcus; Tan, Donald; Mehta, Jodhbir S

    2012-05-31

    Small incision lenticule extraction or SMILE is a novel form of 'flapless' corneal refractive surgery that was adapted from refractive lenticule extraction (ReLEx). SMILE uses only one femtosecond laser to complete the refractive surgery, potentially reducing surgical time, side effects, and cost. If successful, SMILE could potentially replace the current, widely practiced laser in-situ keratomileusis or LASIK. The aim of this study is to evaluate whether SMILE is non-inferior to LASIK in terms of refractive outcomes at 3 months post-operatively. Single tertiary center, parallel group, single-masked, paired-eye design, non-inferiority, randomized controlled trial. Participants who are eligible for LASIK will be enrolled for study after informed consent. Each participant will be randomized to receive SMILE in one eye and LASIK in the other eye. Our primary hypothesis (stated as null) in this non-inferiority trial is that SMILE differs from LASIK in adults (>21 years old) with myopia (> -3.00 diopter (D)) at a tertiary eye center in terms of refractive predictability at 3 months post-operatively. Our secondary hypothesis (stated as null) is that SMILE differs from LASIK in adults (>21 years old) with myopia (> -3.00 D) at a tertiary eye center in terms of other refractive outcomes (efficacy, safety, higher-order aberrations) at 3 months post-operatively. Our primary outcome is refractive predictability, one of several standard refractive outcomes, defined as the proportion of eyes achieving a postoperative spherical equivalent (SE) within ±0.50 D of the intended target. Randomization will be performed using a random allocation sequence generated by a computer with no blocks or restrictions, and implemented by concealing the number-coded surgery within sealed envelopes until just before the procedure. In this single-masked trial, subjects and their caregivers will be masked to the assigned treatment in each eye.
This novel trial will provide information on whether SMILE has comparable, if not superior, refractive outcomes compared to the established LASIK for myopia, thus providing evidence for translation into clinical practice. Clinicaltrials.gov NCT01216475.

  18. Regularity of random attractors for fractional stochastic reaction-diffusion equations on Rn

    NASA Astrophysics Data System (ADS)

    Gu, Anhui; Li, Dingshi; Wang, Bixiang; Yang, Han

    2018-06-01

    We investigate the regularity of random attractors for the non-autonomous non-local fractional stochastic reaction-diffusion equations in Hs (Rn) with s ∈ (0 , 1). We prove the existence and uniqueness of the tempered random attractor that is compact in Hs (Rn) and attracts all tempered random subsets of L2 (Rn) with respect to the norm of Hs (Rn). The main difficulty is to show the pullback asymptotic compactness of solutions in Hs (Rn) due to the noncompactness of Sobolev embeddings on unbounded domains and the almost sure nondifferentiability of the sample paths of the Wiener process. We establish such compactness by the ideas of uniform tail-estimates and the spectral decomposition of solutions in bounded domains.

  19. Estimating accuracy of land-cover composition from two-stage cluster sampling

    USGS Publications Warehouse

    Stehman, S.V.; Wickham, J.D.; Fattorini, L.; Wade, T.D.; Baffetta, F.; Smith, J.H.

    2009-01-01

    Land-cover maps are often used to compute land-cover composition (i.e., the proportion or percent of area covered by each class) for each unit in a spatial partition of the region mapped. We derive design-based estimators of mean deviation (MD), mean absolute deviation (MAD), root mean square error (RMSE), and correlation (CORR) to quantify accuracy of land-cover composition for a general two-stage cluster sampling design, and for the special case of simple random sampling without replacement (SRSWOR) at each stage. The bias of the estimators for the two-stage SRSWOR design is evaluated via a simulation study. The estimators of RMSE and CORR have small bias except when sample size is small and the land-cover class is rare. The estimator of MAD is biased for both rare and common land-cover classes except when sample size is large. A general recommendation is that rare land-cover classes require large sample sizes to ensure that the accuracy estimators have small bias. © 2009 Elsevier Inc.
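    The four accuracy summaries have simple per-unit forms once the design weights are dropped. The sketch below computes MD, MAD, RMSE and CORR for a toy set of map and reference proportions under equal unit weights, which corresponds to simple random sampling of units rather than the paper's general two-stage estimators; the proportions are hypothetical.

```python
import math

def composition_accuracy(map_p, ref_p):
    """MD, MAD, RMSE and CORR between map-derived and reference land-cover
    proportions across spatial units, with equal unit weights (the
    simple-random-sampling special case, not the two-stage estimators)."""
    devs = [m - r for m, r in zip(map_p, ref_p)]
    n = len(devs)
    md = sum(devs) / n                                  # signed bias
    mad = sum(abs(d) for d in devs) / n                 # typical error size
    rmse = math.sqrt(sum(d * d for d in devs) / n)      # penalizes outliers
    mm, mr = sum(map_p) / n, sum(ref_p) / n
    cov = sum((m - mm) * (r - mr) for m, r in zip(map_p, ref_p))
    scale = math.sqrt(sum((m - mm) ** 2 for m in map_p) *
                      sum((r - mr) ** 2 for r in ref_p))
    return md, mad, rmse, cov / scale

# Hypothetical proportions of one class across three map units:
md, mad, rmse, corr = composition_accuracy([0.2, 0.4, 0.6], [0.25, 0.35, 0.65])
```

    MD captures systematic over- or under-mapping of a class, while MAD and RMSE capture the magnitude of per-unit error; CORR measures how well the map preserves the ranking of units.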

  20. Plasticity-Driven Self-Organization under Topological Constraints Accounts for Non-random Features of Cortical Synaptic Wiring

    PubMed Central

    Miner, Daniel; Triesch, Jochen

    2016-01-01

    Understanding the structure and dynamics of cortical connectivity is vital to understanding cortical function. Experimental data strongly suggest that local recurrent connectivity in the cortex is significantly non-random, exhibiting, for example, above-chance bidirectionality and an overrepresentation of certain triangular motifs. Additional evidence suggests a significant distance dependency to connectivity over a local scale of a few hundred microns, and particular patterns of synaptic turnover dynamics, including a heavy-tailed distribution of synaptic efficacies, a power law distribution of synaptic lifetimes, and a tendency for stronger synapses to be more stable over time. Understanding how many of these non-random features simultaneously arise would provide valuable insights into the development and function of the cortex. While previous work has modeled some of the individual features of local cortical wiring, there is no model that begins to comprehensively account for all of them. We present a spiking network model of a rodent Layer 5 cortical slice which, via the interactions of a few simple biologically motivated intrinsic, synaptic, and structural plasticity mechanisms, qualitatively reproduces these non-random effects when combined with simple topological constraints. Our model suggests that mechanisms of self-organization arising from a small number of plasticity rules provide a parsimonious explanation for numerous experimentally observed non-random features of recurrent cortical wiring. Interestingly, similar mechanisms have been shown to endow recurrent networks with powerful learning abilities, suggesting that these mechanisms are central to understanding both the structure and function of cortical synaptic wiring. PMID:26866369

  1. Icotinib versus gefitinib in previously treated advanced non-small-cell lung cancer (ICOGEN): a randomised, double-blind phase 3 non-inferiority trial.

    PubMed

    Shi, Yuankai; Zhang, Li; Liu, Xiaoqing; Zhou, Caicun; Zhang, Li; Zhang, Shucai; Wang, Dong; Li, Qiang; Qin, Shukui; Hu, Chunhong; Zhang, Yiping; Chen, Jianhua; Cheng, Ying; Feng, Jifeng; Zhang, Helong; Song, Yong; Wu, Yi-Long; Xu, Nong; Zhou, Jianying; Luo, Rongcheng; Bai, Chunxue; Jin, Yening; Liu, Wenchao; Wei, Zhaohui; Tan, Fenlai; Wang, Yinxiang; Ding, Lieming; Dai, Hong; Jiao, Shunchang; Wang, Jie; Liang, Li; Zhang, Weimin; Sun, Yan

    2013-09-01

    Icotinib, an oral EGFR tyrosine kinase inhibitor, had shown antitumour activity and favourable toxicity in early-phase clinical trials. We aimed to investigate whether icotinib is non-inferior to gefitinib in patients with non-small-cell lung cancer. In this randomised, double-blind, phase 3 non-inferiority trial we enrolled patients with advanced non-small-cell lung cancer from 27 sites in China. Eligible patients were those aged 18-75 years who had not responded to one or more platinum-based chemotherapy regimen. Patients were randomly assigned (1:1), using minimisation methods, to receive icotinib (125 mg, three times per day) or gefitinib (250 mg, once per day) until disease progression or unacceptable toxicity. The primary endpoint was progression-free survival, analysed in the full analysis set. We analysed EGFR status if tissue samples were available. All investigators, clinicians, and participants were masked to patient distribution. The non-inferiority margin was 1·14; non-inferiority would be established if the upper limit of the 95% CI for the hazard ratio (HR) of gefitinib versus icotinib was less than this margin. This study is registered with ClinicalTrials.gov, number NCT01040780, and the Chinese Clinical Trial Registry, number ChiCTR-TRC-09000506. 400 eligible patients were enrolled between Feb 26, 2009, and Nov 13, 2009; one patient was enrolled by mistake and removed from the study, 200 were assigned to icotinib and 199 to gefitinib. 395 patients were included in the full analysis set (icotinib, n=199; gefitinib, n=196). Icotinib was non-inferior to gefitinib in terms of progression-free survival (HR 0·84, 95% CI 0·67-1·05; median progression-free survival 4·6 months [95% CI 3·5-6·3] vs 3·4 months [2·3-3·8]; p=0·13). The most common adverse events were rash (81 [41%] of 200 patients in the icotinib group vs 98 [49%] of 199 patients in the gefitinib group) and diarrhoea (43 [22%] vs 58 [29%]). 
Patients given icotinib had fewer drug-related adverse events than did those given gefitinib (121 [61%] vs 140 [70%]; p=0·046), especially drug-related diarrhoea (37 [19%] vs 55 [28%]; p=0·033). Icotinib could be a new treatment option for pretreated patients with advanced non-small-cell lung cancer. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. Decision tree modeling using R.

    PubMed

    Zhang, Zhongheng

    2016-08-01

    In the machine learning field, the decision tree learner is powerful and easy to interpret. It employs a recursive binary partitioning algorithm that splits the sample on the partitioning variable with the strongest association with the response variable. The process continues until some stopping criteria are met. In the example I focus on the conditional inference tree, which incorporates tree-structured regression models into conditional inference procedures. Because a single tree is sensitive to small changes in the training data, the random forests procedure is introduced to address this problem. The sources of diversity for random forests are random sampling of the training data and a restricted set of input variables considered at each split. Finally, I introduce R functions to perform model-based recursive partitioning. This method incorporates recursive partitioning into conventional parametric model building.
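    The recursive binary partitioning step can be sketched in a few lines. This toy version is in Python (the article itself works in R) and chooses the split on one numeric predictor that minimizes the children's summed squared error, CART-style; conditional inference trees instead select splits via permutation-test p-values, which avoids variable-selection bias.

```python
def best_split(x, y):
    """CART-style exhaustive search for the binary split on one numeric
    predictor x that minimizes the children's summed squared error in y.
    (Conditional inference trees choose splits via permutation tests.)"""
    def sse(v):
        m = sum(v) / len(v)
        return sum((u - m) ** 2 for u in v)

    order = sorted(range(len(x)), key=lambda i: x[i])
    xs = [x[i] for i in order]
    ys = [y[i] for i in order]
    best_cost, best_threshold = float("inf"), None
    for k in range(1, len(xs)):
        if xs[k] == xs[k - 1]:
            continue  # no valid threshold between tied predictor values
        cost = sse(ys[:k]) + sse(ys[k:])
        if cost < best_cost:
            best_cost, best_threshold = cost, (xs[k - 1] + xs[k]) / 2
    return best_threshold

# Two well-separated response groups: the split lands between 3 and 10.
threshold = best_split([1, 2, 3, 10, 11, 12], [0, 0, 0, 5, 5, 5])  # 6.5
```

    Applying the search recursively to each child, until a stopping rule fires, grows the full tree; fitting many such trees on bootstrap samples with restricted candidate predictors yields a random forest.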

  3. A polymer, random walk model for the size-distribution of large DNA fragments after high linear energy transfer radiation

    NASA Technical Reports Server (NTRS)

    Ponomarev, A. L.; Brenner, D.; Hlatky, L. R.; Sachs, R. K.

    2000-01-01

    DNA double-strand breaks (DSBs) produced by densely ionizing radiation are not located randomly in the genome: recent data indicate DSB clustering along chromosomes. Stochastic DSB clustering at large scales, from > 100 Mbp down to < 0.01 Mbp, is modeled using computer simulations and analytic equations. A random-walk, coarse-grained polymer model for chromatin is combined with a simple track structure model in Monte Carlo software called DNAbreak and is applied to data on alpha-particle irradiation of V-79 cells. The chromatin model neglects molecular details but systematically incorporates an increase in average spatial separation between two DNA loci as the number of base-pairs between the loci increases. Fragment-size distributions obtained using DNAbreak match data on large fragments about as well as distributions previously obtained with a less mechanistic approach. Dose-response relations, linear at small doses of high linear energy transfer (LET) radiation, are obtained. They are found to be non-linear when the dose becomes so large that there is a significant probability of overlapping or close juxtaposition, along one chromosome, for different DSB clusters from different tracks. The non-linearity is more evident for large fragments than for small. The DNAbreak results furnish an example of the RLC (randomly located clusters) analytic formalism, which generalizes the broken-stick fragment-size distribution of the random-breakage model that is often applied to low-LET data.

  4. Randomized, Multicenter Study of Gefitinib Dose-escalation in Advanced Non-small-cell Lung Cancer Patients Achieved Stable Disease after One-month Gefitinib Treatment

    PubMed Central

    Xue, Cong; Hong, Shaodong; Li, Ning; Feng, Weineng; Jia, Jun; Peng, Jiewen; Lin, Daren; Cao, Xiaolong; Wang, Siyang; Zhang, Weimin; Zhang, Hongyu; Dong, Wei; Zhang, Li

    2015-01-01

    There is no consensus on the optimal treatment for patients with advanced non-small-cell lung cancer (NSCLC) and stable disease (SD) after gefitinib therapy. This randomized, open-label, multicenter study aimed to explore whether dose-escalation of gefitinib would improve response and survival in NSCLC patients who achieved SD after one month of standard-dose gefitinib. Between May 2009 and January 2012, 466 patients were enrolled and 100 eligible patients were randomized (1:1) to receive either a higher dose (500 mg/d; H group) or to continue the standard dose (250 mg/d; S group) of gefitinib. Objective response rate (ORR) was similar between the two groups (12.5% vs 12.5%, p = 1.000). There were no significant differences in progression-free survival (PFS) or overall survival (OS) between the two arms (H group vs S group: median PFS, 5.30 months vs 6.23 months, p = 0.167; median OS, 13.70 months vs 18.87 months, p = 0.156). Therefore, dose-escalation of gefitinib does not confer a response or survival advantage in patients who achieve SD with one month of standard-dose gefitinib treatment. PMID:26216071

  5. Crossover Control Study of the Effect of Personal Care Products Containing Triclosan on the Microbiome

    PubMed Central

    Poole, Angela C.; Pischel, Lauren; Ley, Catherine; Suh, Gina; Goodrich, Julia K.; Haggerty, Thomas D.; Ley, Ruth E.

    2016-01-01

    ABSTRACT Commonly prescribed antibiotics are known to alter human microbiota. We hypothesized that triclosan and triclocarban, components of many household and personal care products (HPCPs), may alter the oral and gut microbiota, with potential consequences for metabolic function and weight. In a double-blind, randomized, crossover study, participants were given triclosan- and triclocarban (TCS)-containing or non-triclosan/triclocarban (nTCS)-containing HPCPs for 4 months and then switched to the other products for an additional 4 months. Blood, stool, gingival plaque, and urine samples and weight data were obtained at baseline and at regular intervals throughout the study period. Blood samples were analyzed for metabolic and endocrine markers and urine samples for triclosan. The microbiome in stool and oral samples was then analyzed. Although there was a significant difference in the amount of triclosan in the urine between the TCS and nTCS phases, no differences were found in microbiome composition, metabolic or endocrine markers, or weight. Though this study was limited by the small sample size and imprecise administration of HPCPs, triclosan at physiologic levels from exposure to HPCPs does not appear to have a significant or important impact on human oral or gut microbiome structure or on a panel of metabolic markers. IMPORTANCE Triclosan and triclocarban are commonly used commercial microbicides found in toothpastes and soaps. It is unknown what effects these chemicals have on the human microbiome or on endocrine function. From this randomized crossover study, it appears that routine personal care use of triclosan and triclocarban neither exerts a major influence on microbial communities in the gut and mouth nor alters markers of endocrine function in humans. PMID:27303746

  6. Comparative particle recoveries by the retracting rotorod, rotoslide and Burkard spore trap sampling in a compact array

    NASA Astrophysics Data System (ADS)

    Solomon, W. R.; Burge, H. A.; Boise, J. R.; Becker, M.

    1980-06-01

    An array comprising 4 intermittent (retracting) rotorods, 3 (“swingshield”) rotoslides and one Burkard (Hirst) automatic volumetric spore trap was operated on an urban rooftop during 70 periods of 9, 15 or 24 hours in late summer. Standard sampling procedures were utilized, and recoveries of pollens as well as spores of Alternaria, Epicoccum, Pithomyces and Ganoderma species were compared. Differences between paired counts from each sampler type showed variances increasing with levels of particle prevalence (and deposition). In addition, minimal, non-random, side-to-side and intersampler differences were noted for both impactor types. Exclusion of particles between operating intervals by rotoslides and rotorods was virtually complete. Spore trap recoveries for all particle categories, per m3, exceeded those by both impactors. The greatest (7-fold) difference was noted for the smallest type examined (Ganoderma). For ragweed pollen, the overall spore trap/impactor ratio approached 1.5. Rain effects were difficult to discern but seemed to influence rotoslides least. Overall differences between impactors were quite small but generally favored the rotoslide in this comparison. Our data confirm the relative advantages of suction traps for small particles. Both impactors and spore traps are suited to pollen and large spore collection, and, with some qualification, data from both may be compared.

  7. A null model for Pearson coexpression networks.

    PubMed

    Gobbi, Andrea; Jurman, Giuseppe

    2015-01-01

    Gene coexpression networks inferred by correlation from high-throughput profiling such as microarray data represent simple but effective structures for discovering and interpreting linear gene relationships. In recent years, several approaches have been proposed to tackle the problem of deciding when the resulting correlation values are statistically significant. This is most crucial when the number of samples is small, yielding a non-negligible chance that even high correlation values are due to random effects. Here we introduce a novel hard thresholding solution based on the assumption that a coexpression network inferred by randomly generated data is expected to be empty. The threshold is theoretically derived by means of an analytic approach and, as a deterministic independent null model, it depends only on the dimensions of the starting data matrix, with assumptions on the skewness of the data distribution compatible with the structure of gene expression levels data. We show, on synthetic and array datasets, that the proposed threshold is effective in eliminating all false positive links, with an offsetting cost in terms of false negative detected edges.
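    For intuition on how sample size drives such a threshold, a classical distribution-based cutoff (not the paper's dimension-only analytic threshold) can be derived from the Fisher z-transform: with few samples, only very large correlations are distinguishable from noise. This is an illustrative approximation under the usual bivariate-normal null, with hypothetical sample sizes.

```python
import math
from statistics import NormalDist

def corr_threshold(n_samples: int, alpha: float = 0.05) -> float:
    """Smallest |r| significant at level alpha under the null r = 0,
    via the Fisher z-transform (a classical approximation; the paper
    derives a different threshold that depends only on the data
    matrix dimensions)."""
    z = NormalDist().inv_cdf(1 - alpha / 2)
    return math.tanh(z / math.sqrt(n_samples - 3))

# With few samples, only very strong correlations clear the bar:
r10 = corr_threshold(10)    # ~0.63
r100 = corr_threshold(100)  # ~0.20
```

    The steep dependence on n is why hard thresholding matters most for small-sample microarray studies: an edge with r = 0.5 is strong evidence at n = 100 but indistinguishable from noise at n = 10.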

  8. Random matrix theory and fund of funds portfolio optimisation

    NASA Astrophysics Data System (ADS)

    Conlon, T.; Ruskin, H. J.; Crane, M.

    2007-08-01

    The proprietary nature of Hedge Fund investing means that it is common practice for managers to release minimal information about their returns. The construction of a fund of hedge funds portfolio requires a correlation matrix which often has to be estimated using a relatively small sample of monthly returns data, which induces noise. In this paper, random matrix theory (RMT) is applied to a cross-correlation matrix C, constructed using hedge fund returns data. The analysis reveals a number of eigenvalues that deviate from the spectrum suggested by RMT. The components of the deviating eigenvectors are found to correspond to distinct groups of strategies that are applied by hedge fund managers. The inverse participation ratio is used to quantify the number of components that participate in each eigenvector. Finally, the correlation matrix is cleaned by separating the noisy part from the non-noisy part of C. This technique is found to greatly reduce the difference between the predicted and realised risk of a portfolio, leading to an improved risk profile for a fund of hedge funds.
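    The RMT comparison rests on the Marchenko-Pastur law: for T observations of N uncorrelated series, the eigenvalues of the sample correlation matrix fall (asymptotically) inside a known band, and only eigenvalues outside it are treated as signal. A minimal sketch of that band, with hypothetical dimensions:

```python
import math

def mp_bounds(n_series: int, n_obs: int, sigma2: float = 1.0):
    """Marchenko-Pastur band [lmin, lmax] for eigenvalues of the sample
    correlation matrix of n_series uncorrelated time series of length
    n_obs. Eigenvalues outside the band suggest genuine structure;
    those inside are compatible with pure noise."""
    q = n_series / n_obs
    lmin = sigma2 * (1 - math.sqrt(q)) ** 2
    lmax = sigma2 * (1 + math.sqrt(q)) ** 2
    return lmin, lmax

# Hypothetical dimensions: 50 funds, 200 monthly observations.
lmin, lmax = mp_bounds(50, 200)  # (0.25, 2.25)
```

    The shorter the return history relative to the number of funds, the wider the noise band, so fewer eigenvalues can be trusted; cleaning then amounts to keeping the eigenvectors whose eigenvalues exceed lmax and replacing the rest.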

  9. Paradoxes in Film Ratings

    ERIC Educational Resources Information Center

    Moore, Thomas L.

    2006-01-01

    The author selected a simple random sample of 100 movies from the "Movie and Video Guide" (1996), by Leonard Maltin. The author's intent was to obtain some basic information on the population of roughly 19,000 movies through a small sample. The "Movie and Video Guide" by Leonard Maltin is an annual ratings guide to movies. While not all films ever…

  10. Can Financial Need Analysis be Simplified?

    ERIC Educational Resources Information Center

    Orwig, M. D.; Jones, Paul K.

    This paper examines the problem of collecting financial data on aid applicants. A 10% sample (12,383) of student records was taken from the 1968-69 alphabetic history file for the ACT Student Need Analysis Service. Random sub-samples were taken in certain phases of the study. A relatively small number of financial variables were found to predict…

  11. Considering the Crossroads of Distance Education: The Experiences of Instructors as They Transitioned to Online or Blended Courses

    ERIC Educational Resources Information Center

    Hoffman, David D.

    2016-01-01

    In the short history of online education research, researchers studying teacher experiences have regularly relied on anecdotal examples or small samples. In this research, we sought to support and enhance previous findings concerning best practices in online education by performing a randomly sampled, nationwide survey of online and blended course…

  12. Motexafin Gadolinium Combined With Prompt Whole Brain Radiotherapy Prolongs Time to Neurologic Progression in Non-Small-Cell Lung Cancer Patients With Brain Metastases: Results of a Phase III Trial

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mehta, Minesh P.; Shapiro, William R.; Phan, See C.

    2009-03-15

    Purpose: To determine the efficacy of motexafin gadolinium (MGd) in combination with whole brain radiotherapy (WBRT) for the treatment of brain metastases from non-small-cell lung cancer. Methods and Materials: In an international, randomized, Phase III study, patients with brain metastases from non-small-cell lung cancer were randomized to WBRT with or without MGd. The primary endpoint was the interval to neurologic progression, determined by a centralized Events Review Committee that was unaware of the treatment the patients had received. Results: Of 554 patients, 275 were randomized to WBRT and 279 to WBRT+MGd. Treatment with MGd was well tolerated, and 92% of the intended doses were administered. The most common MGd-related Grade 3+ adverse events included liver function abnormalities (5.5%), asthenia (4.0%), and hypertension (4%). MGd improved the interval to neurologic progression compared with WBRT alone (15 vs. 10 months; p = 0.12, hazard ratio [HR] = 0.78) and the interval to neurocognitive progression (p = 0.057, HR = 0.78). The WBRT patients required more salvage brain surgery or radiosurgery than did the WBRT+MGd patients (54 vs. 25 salvage procedures, p < 0.001). Statistically significant interactions were found between geographic region and MGd treatment effect (an analysis included in the prespecified analysis plan) and between treatment delay and MGd treatment effect. In North American patients, where treatment was more prompt, a statistically significant prolongation of the interval to neurologic progression, from 8.8 months for WBRT to 24.2 months for WBRT+MGd (p = 0.004, HR = 0.53), and of the interval to neurocognitive progression (p = 0.06, HR = 0.73) were observed. Conclusion: In the intent-to-treat analysis, MGd exhibited a favorable trend in neurologic outcomes. MGd significantly prolonged the interval to neurologic progression in non-small-cell lung cancer patients with brain metastases receiving prompt WBRT. The toxicity was acceptable.

  13. Demonstration of the Attributes of Multi-increment Sampling and Proper Sample Processing Protocols for the Characterization of Metals on DoD Facilities

    DTIC Science & Technology

    2013-06-01

    Fragmentary excerpt (site geology and figure captions): lenses of unconsolidated sand and rounded river gravel, mostly quartz and metamorphic rock, overlain by as much as 5 m of silt; example of multi-increment sampling using a systematic-random sampling design for collecting two separate samples; the small arms firing Range 16 Record berms at Fort Wainwright; location of berms sampled using ISM and grab sampling.
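    The systematic-random design named in the captions places one randomly located increment inside each cell of a regular grid over the decision unit. A minimal sketch (the grid dimensions and berm size below are hypothetical, not from the report):

```python
import random

random.seed(2)

def systematic_random_points(width, height, nx, ny):
    """Systematic-random design: one uniformly random point inside each
    cell of an nx-by-ny grid laid over the decision unit."""
    dx, dy = width / nx, height / ny
    points = []
    for i in range(nx):
        for j in range(ny):
            points.append((i * dx + random.uniform(0, dx),
                           j * dy + random.uniform(0, dy)))
    return points

# Hypothetical 50 m x 20 m berm face sampled with 40 increments.
increments = systematic_random_points(50.0, 20.0, 10, 4)
```

    The grid guarantees even spatial coverage, while the random offset within each cell avoids the alignment bias of a purely systematic pattern.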

  14. Randomized, phase II trial of pemetrexed and carboplatin with or without enzastaurin versus docetaxel and carboplatin as first-line treatment of patients with stage IIIB/IV non-small cell lung cancer.

    PubMed

    Socinski, Mark A; Raju, Robert N; Stinchcombe, Thomas; Kocs, Darren M; Couch, Linda S; Barrera, David; Rousey, Steven R; Choksi, Janak K; Jotte, Robert; Patt, Debra A; Periman, Phillip O; Schlossberg, Howard R; Weissman, Charles H; Wang, Yunfei; Asmar, Lina; Pritchard, Sharon; Bromund, Jane; Peng, Guangbin; Treat, Joseph; Obasaju, Coleman K

    2010-12-01

    Enzastaurin is an oral serine/threonine kinase inhibitor that targets protein kinase C-beta (PKC-β) and the phosphatidylinositol-3-kinase/AKT pathway. This trial compared pemetrexed-carboplatin ± enzastaurin with docetaxel-carboplatin in advanced non-small cell lung cancer. Patients with stage IIIB (with pleural effusion) or IV non-small cell lung cancer and performance status 0 or 1 were randomized to one of three arms: (A) pemetrexed 500 mg/m² and carboplatin area under the curve 6 once every 3 weeks for up to six cycles with a loading dose of enzastaurin 1125 or 1200 mg followed by 500 mg daily until disease progression, (B) the same regimen of pemetrexed-carboplatin without enzastaurin, or (C) docetaxel 75 mg/m² and carboplatin area under the curve 6 once every 3 weeks for up to six cycles. The primary end point was time to disease progression (TTP). Between March 2006 and May 2008, 218 patients were randomized. Median TTP was 4.6 months for pemetrexed-carboplatin-enzastaurin, 6.0 months for pemetrexed-carboplatin, and 4.1 months for docetaxel-carboplatin (differences not significant). Median survival was 7.2 months for pemetrexed-carboplatin-enzastaurin, 12.7 months for pemetrexed-carboplatin, and 9.2 months for docetaxel-carboplatin (log-rank p = 0.05). Compared with the other arms, docetaxel-carboplatin was associated with lower rates of grade 3 thrombocytopenia and anemia but a higher rate of grade 3 or 4 febrile neutropenia. There was no difference in TTP between the three arms, but survival was longer with pemetrexed-carboplatin compared with docetaxel-carboplatin. Enzastaurin did not add to the activity of pemetrexed-carboplatin.

  15. Characterizing the composition of bone formed during fracture healing using scanning electron microscopy techniques.

    PubMed

    Perdikouri, Christina; Tägil, Magnus; Isaksson, Hanna

    2015-01-01

    About 5-10% of all bone fractures suffer from delayed healing, which may lead to non-union. Bone morphogenetic proteins (BMPs) can be used to induce differentiation of osteoblasts and enhance the formation of the bony callus, and bisphosphonates help to retain the newly formed callus. The aim of this study was to investigate whether scanning electron microscopy (SEM) and energy-dispersive X-ray spectroscopy (EDS) can identify differences in the mineral composition of the newly formed bone compared to cortical bone from a non-fractured control. Moreover, we investigated whether the use of BMPs and bisphosphonates, alone or combined, may have an effect on bone mineralization and composition. Twelve male Sprague-Dawley rats at 9 weeks of age were randomly divided into four groups and treated with (A) saline, (B) BMP-7, (C) bisphosphonates (Zoledronate), or (D) BMP-7 + Zoledronate. The rats were sacrificed after 6 weeks. All samples were imaged using SEM and chemically analyzed with EDS to quantify the amount of C, N, Ca, P, O, Na, and Mg. The Ca/P ratio was the primary outcome. In the fractured samples, two areas of interest were chosen for chemical analysis with EDS: the callus and the cortical bone. In the non-fractured samples, only the cortex was analyzed. Our results showed that the element composition varied to a small extent between the callus and the cortical bone in the fractured bones. However, the Ca/P ratio did not differ significantly, suggesting that the mineralization at all sites is similar 6 weeks post-fracture in this rat model.

  16. Generalizability of findings from randomized controlled trials: application to the National Institute of Drug Abuse Clinical Trials Network.

    PubMed

    Susukida, Ryoko; Crum, Rosa M; Ebnesajjad, Cyrus; Stuart, Elizabeth A; Mojtabai, Ramin

    2017-07-01

    To compare randomized controlled trial (RCT) sample treatment effects with the population effects of substance use disorder (SUD) treatment. Statistical weighting was used to re-compute the effects from 10 RCTs such that the participants in the trials had characteristics that resembled those of patients in the target populations. Multi-site RCTs and usual SUD treatment settings in the United States. A total of 3592 patients in 10 RCTs and 1 602 226 patients from usual SUD treatment settings between 2001 and 2009. Three outcomes of SUD treatment were examined: retention, urine toxicology and abstinence. We weighted the RCT sample treatment effects using propensity scores representing the conditional probability of participating in RCTs. Weighting the samples changed the significance of estimated sample treatment effects. Most commonly, positive effects of trials became statistically non-significant after weighting (three trials for retention and urine toxicology and one trial for abstinence); also, non-significant effects became significantly positive (one trial for abstinence) and significantly negative effects became non-significant (two trials for abstinence). There was suggestive evidence of treatment effect heterogeneity in subgroups that are under- or over-represented in the trials, some of which were consistent with the differences in average treatment effects between weighted and unweighted results. The findings of randomized controlled trials (RCTs) for substance use disorder treatment do not appear to be directly generalizable to target populations when the RCT samples do not reflect adequately the target populations and there is treatment effect heterogeneity across patient subgroups. © 2017 Society for the Study of Addiction.
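    The re-weighting idea can be illustrated with a stylized simulation (the covariate, participation probabilities, and effect sizes below are hypothetical, not the NIDA data, and inverse-probability weighting is one common implementation of the propensity-score weighting the abstract describes): participants are weighted by the inverse of their probability of joining the trial, so the weighted trial sample mirrors the covariate mix of the target population.

```python
import random

random.seed(0)

# Hypothetical population: binary covariate x; the treatment works much
# better when x == 1, so effects are heterogeneous across subgroups.
def outcome(treated, x):
    effect = 2.0 if x == 1 else 0.5
    return effect * treated + random.gauss(0, 1)

# The trial over-recruits x == 0 patients.
p_join = {0: 0.9, 1: 0.1}

trial = []
for _ in range(100000):
    x = random.randint(0, 1)
    if random.random() < p_join[x]:
        t = random.randint(0, 1)          # randomization inside the trial
        trial.append((x, t, outcome(t, x)))

def ate(rows, weight):
    """Weighted difference in mean outcome between arms."""
    acc = {0: [0.0, 0.0], 1: [0.0, 0.0]}  # arm -> [sum w*y, sum w]
    for x, t, y in rows:
        acc[t][0] += weight(x) * y
        acc[t][1] += weight(x)
    return acc[1][0] / acc[1][1] - acc[0][0] / acc[0][1]

naive = ate(trial, lambda x: 1.0)                 # reflects the trial's mix
weighted = ate(trial, lambda x: 1.0 / p_join[x])  # recovers population effect
```

    With a 50/50 population the true average effect is 0.5 × 2.0 + 0.5 × 0.5 = 1.25, whereas the unweighted trial estimate is pulled toward the x = 0 subgroup that dominates enrollment, mirroring how weighting changed the significance of the trial effects above.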

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pavlou, A. T.; Betzler, B. R.; Burke, T. P.

    Uncertainties in the composition and fabrication of fuel compacts for the Fort St. Vrain (FSV) high temperature gas reactor have been studied by performing eigenvalue sensitivity studies that represent the key uncertainties for the FSV neutronic analysis. The uncertainties for the TRISO fuel kernels were addressed by developing a suite of models for an 'average' FSV fuel compact that models the fuel as (1) a mixture of two different TRISO fuel particles representing fissile and fertile kernels, (2) a mixture of four different TRISO fuel particles representing small and large fissile kernels and small and large fertile kernels, and (3) a stochastic mixture of the four types of fuel particles where every kernel has its diameter sampled from a continuous probability density function. All of the discrete diameter and continuous diameter fuel models were constrained to have the same fuel loadings and packing fractions. For the non-stochastic discrete diameter cases, the MCNP compact model arranged the TRISO fuel particles on a hexagonal honeycomb lattice. This lattice-based fuel compact was compared to a stochastic compact where the locations (and kernel diameters for the continuous diameter cases) of the fuel particles were randomly sampled. Partial core configurations were modeled by stacking compacts into fuel columns containing graphite. The differences in eigenvalues between the lattice-based and stochastic models were small, but the runtime of the lattice-based fuel model was roughly 20 times shorter than that of the stochastic-based fuel model. (authors)

  18. High-definition imaging of circulating tumor cells and associated cellular events in non-small cell lung cancer patients: a longitudinal analysis.

    PubMed

    Nieva, Jorge; Wendel, Marco; Luttgen, Madelyn S; Marrinucci, Dena; Bazhenova, Lyudmila; Kolatkar, Anand; Santala, Roger; Whittenberger, Brock; Burke, James; Torrey, Melissa; Bethel, Kelly; Kuhn, Peter

    2012-02-01

    Sampling circulating tumor cells (CTCs) from peripheral blood is ideally accomplished using assays that detect high numbers of cells and preserve them for downstream characterization. We sought to evaluate a method using enrichment-free fluorescent labeling of CTCs followed by automated digital microscopy in patients with non-small cell lung cancer. Twenty-eight patients with non-small cell lung cancer and hematogenously seeded metastasis were analyzed with multiple blood draws. We detected CTCs in 68% of analyzed samples and found a propensity for increased CTC detection as the disease progressed in individual patients. CTCs were present at a median concentration of 1.6 CTCs ml⁻¹ of analyzed blood in the patient population. Higher numbers of detected CTCs were associated with an unfavorable prognosis.

  19. Resource Allocation and Time Utilization in IGE and Non-IGE Schools. Technical Paper No. 410.

    ERIC Educational Resources Information Center

    Rossmiller, Richard A.; Geske, Terry G.

    This study addressed two basic questions: (1) Do individually guided education (IGE) schools cost more or exhibit different expenditure patterns than non-IGE schools? (2) Do instructional personnel in IGE schools allocate their time differently than instructional personnel in non-IGE schools? Data were obtained from a random sample of 41 IGE…

  20. The Immune Landscape of Non-Small Cell Lung Cancer: Utility of Cytologic and Histologic Samples Obtained Through Minimally Invasive Pulmonary Procedures.

    PubMed

    Beattie, Jason; Yarmus, Lonny; Wahidi, Momen M; Rivera, M Patricia; Gilbert, Christopher; Maldonado, Fabien; Czarnecka, Kasia; Argento, Angela; Chen, Alexander; Herth, Felix; Sterman, Daniel H

    2018-05-14

    The success of immune checkpoint inhibitors and the discovery of useful biomarkers to predict response to these agents are shifting much of the focus of personalized care for non-small cell lung cancer towards harnessing the immune response. With further advancement, more effective immunotherapy options will emerge along with more useful biomarkers. Paradoxically, minimally invasive small biopsy and cytology specimens have become the primary method for diagnosis of patients with advanced disease, as well as for initial diagnosis and staging in earlier stage disease. For the benefit of these patients, we will continue to learn how to do more with less. In this perspective, we review aspects of immunobiology that underlie the current state of the art of existing and emerging immunologic biomarkers that hold potential to enhance the care of patients with non-small cell lung cancer. We address practical considerations for acquiring patient samples that accurately reflect disease immune status. We also propose a paradigm shift wherein the specimens most often obtained clinically become the most important sample types to be validated in pioneering basic science, translational work, and subsequent clinical trials.

  1. Efficacy and Safety of the Glucagon Receptor Antagonist PF-06291874: A 12-Week, Randomized, Dose-Response Study in Patients With Type 2 Diabetes Mellitus on Background Metformin Therapy.

    PubMed

    Kazierad, David J; Chidsey, Kristin; Somayaji, Veena R; Bergman, Arthur J; Calle, Roberto A

    2018-06-19

    To conduct a dose-response assessment of the efficacy and safety of the glucagon receptor antagonist PF-06291874 in adults with type 2 diabetes (T2DM) receiving stable doses of metformin. This randomized, double-blind, statin-stratified, placebo-controlled, 4-arm, parallel-group study was conducted in patients with T2DM receiving background metformin. After an 8-week, non-metformin oral antidiabetic agent washout period, 206 patients were randomized to placebo or PF-06291874 (30, 60, or 100 mg once daily) for 12 weeks. Glycosylated hemoglobin (HbA1c), fasting plasma glucose (FPG), and safety endpoints were assessed at baseline and postbaseline. Dose-dependent mean reductions from baseline in HbA1c for PF-06291874 ranged from -0.67% (-7.29 mmol/mol) to -0.93% (-10.13 mmol/mol); and in FPG from -16.6 to -33.3 mg/dL after 12 weeks of dosing. The incidence of hypoglycemia was low and similar between groups receiving PF-06291874. Small, non-dose-dependent increases in LDL cholesterol (<10%) and blood pressure (BP; systolic BP >2 mmHg, diastolic BP >1 mmHg) were observed with PF-06291874. Modest non-dose-dependent median increases were observed across PF-06291874 groups at 12 weeks for alanine aminotransferase (range, 37.6-48.7 U/L versus placebo) and aspartate aminotransferase (range, 33.3-36.6 U/L versus placebo) that were not associated with bilirubin changes. Small increases were observed for body weight (<0.5 kg) in each PF-06291874 group versus placebo. In patients with T2DM, PF-06291874 significantly lowered HbA1c and glucose, was well tolerated, and had a low risk of hypoglycemia. Small, non-dose-related increases in BP, lipids, and hepatic transaminases were observed. This article is protected by copyright. All rights reserved.

  2. Small Sample Performance of Bias-corrected Sandwich Estimators for Cluster-Randomized Trials with Binary Outcomes

    PubMed Central

    Li, Peng; Redden, David T.

    2014-01-01

    The sandwich estimator in the generalized estimating equations (GEE) approach underestimates the true variance in small samples and consequently results in inflated type I error rates in hypothesis testing. This fact limits the application of GEE in cluster-randomized trials (CRTs) with few clusters. Under various CRT scenarios with correlated binary outcomes, we evaluate the small sample properties of the GEE Wald tests using bias-corrected sandwich estimators. Our results suggest that the GEE Wald z test should be avoided in the analyses of CRTs with few clusters even when bias-corrected sandwich estimators are used. With t-distribution approximation, the Kauermann and Carroll (KC) correction can keep the test size at nominal levels even when the number of clusters is as low as 10, and is robust to moderate variation of the cluster sizes. However, in cases with large variations in cluster sizes, the Fay and Graubard (FG) correction should be used instead. Furthermore, we derive a formula to calculate the power and the minimum total number of clusters needed using the t test and KC correction for CRTs with binary outcomes. The power levels predicted by the proposed formula agree well with the empirical powers from the simulations. The proposed methods are illustrated using real CRT data. We conclude that, with appropriate control of type I error rates in small samples, the GEE approach can be recommended for CRTs with binary outcomes because of its fewer assumptions and robustness to misspecification of the covariance structure. PMID:25345738
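    A stylized, purely illustrative simulation of the core small-sample problem (the cluster counts and variance components below are hypothetical, and this is a simplified cluster-means analogue rather than a full GEE fit): an uncorrected, sandwich-style variance estimate built from K cluster-level residuals is biased downward by roughly a factor of (K-1)/K, which is the kind of deficit the KC and FG corrections compensate for.

```python
import random

random.seed(5)

def simulate(k_clusters=10, cluster_size=20, re_sd=0.5, n_reps=4000):
    """Compare the average uncorrected variance estimate of a mean of
    cluster means against its true sampling variance."""
    v_hats, grand_means = [], []
    for _ in range(n_reps):
        cluster_means = []
        for _ in range(k_clusters):
            u = random.gauss(0, re_sd)    # shared cluster random effect
            obs = [random.gauss(u, 1.0) for _ in range(cluster_size)]
            cluster_means.append(sum(obs) / cluster_size)
        xbar = sum(cluster_means) / k_clusters
        # Uncorrected estimate: ML variance of cluster means, divided by K.
        v_hats.append(sum((m - xbar) ** 2 for m in cluster_means)
                      / k_clusters ** 2)
        grand_means.append(xbar)
    avg_v_hat = sum(v_hats) / n_reps
    mu = sum(grand_means) / n_reps
    true_var = sum((x - mu) ** 2 for x in grand_means) / n_reps
    return avg_v_hat / true_var           # about (K-1)/K = 0.9 for K = 10

ratio = simulate()
```

    An underestimated variance translates directly into test statistics that are too large, hence the inflated type I error rates the abstract warns about.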

  3. Parameter estimation of multivariate multiple regression model using bayesian with non-informative Jeffreys’ prior distribution

    NASA Astrophysics Data System (ADS)

    Saputro, D. R. S.; Amalia, F.; Widyaningsih, P.; Affan, R. C.

    2018-05-01

    The Bayesian method can be used to estimate the parameters of a multivariate multiple regression model. It involves two distributions: the prior and the posterior. The posterior distribution depends on the choice of prior distribution. Jeffreys' prior is a non-informative prior distribution, used when no prior information about the parameters is available. The non-informative Jeffreys' prior is combined with the sample information to obtain the posterior distribution, which is then used to estimate the parameters. The purpose of this research is to estimate the parameters of a multivariate regression model using the Bayesian method with the non-informative Jeffreys' prior. The parameter estimates of β and Σ are obtained as the expected values of the corresponding marginal posterior distributions, which are multivariate normal for β and inverse Wishart for Σ. However, calculating these expected values involves integrals that are difficult to evaluate in closed form. Therefore, an approximation is needed: random samples are generated according to the posterior distribution of each parameter using the Markov chain Monte Carlo (MCMC) Gibbs sampling algorithm.
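    A minimal sketch of the Gibbs scheme, reduced to a univariate analogue (simple regression through the origin on simulated data) so it runs on the standard library alone: under the Jeffreys prior p(β, σ²) ∝ 1/σ², the full conditionals are normal for β and inverse-gamma for σ², mirroring the multivariate normal / inverse-Wishart pair in the multivariate case described above.

```python
import math
import random

random.seed(1)

# Simulated data from y = 2x + noise (true beta = 2, sigma = 1).
n = 200
xs = [random.uniform(-2, 2) for _ in range(n)]
ys = [2.0 * x + random.gauss(0, 1) for x in xs]

sxx = sum(x * x for x in xs)
beta_hat = sum(x * y for x, y in zip(xs, ys)) / sxx

def gibbs(n_iter=4000, burn=1000):
    beta = 0.0
    draws = []
    for it in range(n_iter):
        # sigma^2 | beta, y ~ Inverse-Gamma(n/2, SSR/2) under Jeffreys' prior
        ssr = sum((y - beta * x) ** 2 for x, y in zip(xs, ys))
        sigma2 = 1.0 / random.gammavariate(n / 2.0, 2.0 / ssr)
        # beta | sigma^2, y ~ Normal(beta_hat, sigma^2 / Sxx)
        beta = random.gauss(beta_hat, math.sqrt(sigma2 / sxx))
        if it >= burn:
            draws.append((beta, sigma2))
    return draws

draws = gibbs()
post_mean_beta = sum(b for b, _ in draws) / len(draws)
```

    The posterior mean of the retained β draws approximates the Bayes estimate; in the full multivariate case the same alternation would draw β from a multivariate normal and Σ from an inverse Wishart.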

  4. Retrospective evaluation of canine and feline maxillomandibular trauma cases. A comparison of signalment with non-maxillomandibular traumatic injuries (2003-2012).

    PubMed

    Mulherin, B L; Snyder, C J; Soukup, J W; Hetzel, S

    2014-01-01

    To determine differences in signalment between maxillomandibular (MM) and non-maxillomandibular (non-MM) trauma patients to help predict the type of injury sustained. A medical records database was searched from December 2003 to September 2012 to identify all MM trauma patients. A random sample of non-MM trauma patients was generated for comparison. Patient species, age, sex, weight, and injury aetiology were recorded for both populations. Sixty-seven MM trauma patients and 129 non-MM trauma patients were identified. Feline patients were almost twice as likely to be presented for MM trauma compared with non-MM trauma. The median weight of canine patients suffering MM injury was significantly less than that of non-MM patients (p = 0.025). A significant association existed between the causes of injuries associated with MM and non-MM trauma populations (p = 0.000023). The MM trauma patients were more likely to sustain injury as a result of an animal altercation (Bonferroni p = 0.001) while non-MM injuries were more likely to result from motor vehicle accidents (Bonferroni p = 0.001). Overall, animals that were less than one year of age with traumatic injuries were overrepresented (65/196) in comparison to the entire patient population. The results of this study may help guide clinicians in the evaluation and screening of trauma patients that are presented as an emergency. Cats, small dogs and animals suffering from animal altercations should all be closely evaluated for MM injury.

  5. The immediate effects of two manual therapy techniques on ankle musculoarticular stiffness and dorsiflexion range of motion in people with chronic ankle rigidity: A randomized clinical trial.

    PubMed

    Hidalgo, Benjamin; Hall, Toby; Berwart, Mathilde; Biernaux, Elinor; Detrembleur, Christine

    2017-12-29

    Ankle rigidity is a common musculoskeletal disorder affecting the talocrural joint, which can impair weight-bearing ankle dorsiflexion (WBADF) and daily life in people with or without a history of ankle injuries. Our objective was to compare the immediate efficacy of Mulligan Mobilization with Movement (MWM) and Osteopathic Mobilization (OM) for improving ankle dorsiflexion range of motion (ROM) and musculoarticular stiffness (MAS) in people with chronic ankle dorsiflexion rigidity. A randomized clinical trial with two arms. Patients were recruited by word of mouth and via social networks as well as posters, and analyzed in the neuro-musculoskeletal laboratory of the Université Catholique de Louvain-la-Neuve, Brussels, Belgium. 67 men (aged 18-40 years) presenting with potential chronic non-specific and unilateral ankle mobility deficit during WBADF were assessed for eligibility; 40 men were finally included and randomly allocated to a single session of either MWM or OM, two manual therapy modalities proposed for immediate effects on chronic ankle dorsiflexion stiffness. Outcomes comprised blinded measures of MAS, obtained with a specific electromechanical device (Lehmann's device) producing passive oscillatory ankle joint dorsiflexion, and clinical measures of WBADF-ROM. A two-way ANOVA revealed a non-significant interaction between technique and time for all outcome measures. For measures of MAS: elastic stiffness (p = 0.37), viscous stiffness (p = 0.83), total stiffness (p = 0.58). For WBADF-ROM: toe-wall distance (p = 0.58) and angular ROM (p = 0.68). Small effect sizes between groups were determined, with Cohen's d ranging from 0.05 to 0.29. One-way ANOVA demonstrated non-significant differences and small to moderate effect sizes (d = 0.003-0.58) on all outcome measures before and after interventions within both groups.
    A second two-way ANOVA analyzed the effect of each intervention on the sample categorized according to injury history status, and demonstrated a significant interaction between groups and time only for viscous stiffness (p = 0.04, d = -0.55). A single session of MWM and OM targeting the talocrural joint failed to immediately improve any of the measured outcomes.

  6. Urn models for response-adaptive randomized designs: a simulation study based on a non-adaptive randomized trial.

    PubMed

    Ghiglietti, Andrea; Scarale, Maria Giovanna; Miceli, Rosalba; Ieva, Francesca; Mariani, Luigi; Gavazzi, Cecilia; Paganoni, Anna Maria; Edefonti, Valeria

    2018-03-22

    Recently, response-adaptive designs have been proposed in randomized clinical trials to achieve ethical and/or cost advantages by using sequential accrual information collected during the trial to dynamically update the probabilities of treatment assignments. In this context, urn models, in which the probability of assigning patients to treatments is interpreted as the proportion of balls of different colors available in a virtual urn, have been used as response-adaptive randomization rules. We propose the use of Randomly Reinforced Urn (RRU) models in a simulation study based on a published randomized clinical trial on the efficacy of home enteral nutrition in cancer patients after major gastrointestinal surgery. We compare results under the RRU design with those previously published with the non-adaptive approach. We also provide code written in the R software to implement the RRU design in practice. In detail, we simulate 10,000 trials based on the RRU model in three set-ups of different total sample sizes. We report information on the number of patients allocated to the inferior treatment and on the empirical power of the t-test for the treatment coefficient in the ANOVA model. We carry out a sensitivity analysis to assess the effect of different urn compositions. For each sample size, in approximately 75% of the simulation runs, the number of patients allocated to the inferior treatment by the RRU design is lower, as compared to the non-adaptive design. The empirical power of the t-test for the treatment effect is similar in the two designs.
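    A minimal sketch of a randomly reinforced urn for a two-arm trial (the success probabilities, initial urn composition, and reinforcement rule below are hypothetical, not those of the cited trial): each patient is assigned by drawing from the urn, and the drawn arm's color is reinforced when the patient responds, so allocation drifts toward the better arm.

```python
import random

random.seed(42)

# Hypothetical two-arm trial: arm A responds with prob 0.7, arm B with 0.3.
success_prob = {"A": 0.7, "B": 0.3}

def rru_trial(n_patients, init=1.0, reward=1.0):
    """Randomly Reinforced Urn: assign by urn composition, add balls of the
    drawn color when the assigned patient responds."""
    urn = {"A": init, "B": init}
    assignments = []
    for _ in range(n_patients):
        total = urn["A"] + urn["B"]
        arm = "A" if random.random() < urn["A"] / total else "B"
        assignments.append(arm)
        if random.random() < success_prob[arm]:
            urn[arm] += reward            # reinforce the responding arm
    return assignments

runs = [rru_trial(200) for _ in range(500)]
share_inferior = sum(r.count("B") for r in runs) / (200 * 500)
```

    Averaged over many simulated trials, well under half of the patients end up on the inferior arm, which is the ethical advantage the design trades against statistical efficiency.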

  7. A double-blind randomized discontinuation phase II study of sorafenib (BAY 43-9006) in previously treated non-small cell lung cancer patients: Eastern Cooperative Oncology Group study E2501

    PubMed Central

    Wakelee, Heather A.; Lee, Ju-Whei; Hanna, Nasser H.; Traynor, Anne M.; Carbone, David P.; Schiller, Joan H.

    2012-01-01

    Introduction: Sorafenib is a raf kinase and angiogenesis inhibitor with activity in multiple cancers. This phase II study in heavily pretreated non-small cell lung cancer (NSCLC) patients (≥ two prior therapies) utilized a randomized discontinuation design. Methods: Patients received 400 mg of sorafenib orally twice daily for two cycles (two months) (Step 1). Responding patients on Step 1 continued on sorafenib; progressing patients went off study, and patients with stable disease were randomized to placebo or sorafenib (Step 2), with crossover from placebo allowed upon progression. The primary endpoint of this study was the proportion of patients having stable or responding disease two months after randomization. Results: There were 299 patients evaluated for Step 1, with 81 eligible patients randomized on Step 2 who received sorafenib (n=50) or placebo (n=31). The two-month disease control rates following randomization were 54% and 23% for patients initially receiving sorafenib and placebo respectively, p=0.005. The hazard ratio for progression on Step 2 was 0.51 (95% CI 0.30, 0.87, p=0.014) favoring sorafenib. A trend in favor of overall survival with sorafenib was also observed (13.7 versus 9.0 months from time of randomization), HR 0.67 (95% CI 0.40-1.11), p=0.117. A dispensing error occurred which resulted in unblinding of some patients, but not before completion of the 8-week initial Step 2 therapy. Toxicities were manageable and as expected. Conclusions: The results of this randomized discontinuation trial suggest that sorafenib has single agent activity in a heavily pretreated, enriched patient population with advanced NSCLC. These results support further investigation with sorafenib as a single agent in larger, randomized studies in NSCLC. PMID:22982658

  8. Narrow log-periodic modulations in non-Markovian random walks

    NASA Astrophysics Data System (ADS)

    Diniz, R. M. B.; Cressoni, J. C.; da Silva, M. A. A.; Mariz, A. M.; de Araújo, J. M.

    2017-12-01

    What are the necessary ingredients for log-periodicity to appear in the dynamics of a random walk model? Can they be subtle enough to be overlooked? Previous studies suggest that long-range damaged memory and negative feedback together are necessary conditions for the emergence of log-periodic oscillations. The role of negative feedback would then be crucial, forcing the system to change direction. In this paper we show that small-amplitude log-periodic oscillations can emerge when the system is driven by positive feedback. Due to their very small amplitude, these oscillations can easily be mistaken for numerical finite-size effects. The models we use consist of discrete-time random walks with strong memory correlations where the decision process is taken from memory profiles based either on a binomial distribution or on a delta distribution. Anomalous superdiffusive behavior and log-periodic modulations are shown to arise in the large time limit for convenient choices of the model's parameters.
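    One standard example of such a strongly memory-correlated walk is the "elephant" random walk, used here purely as an illustration rather than as the authors' exact memory profiles: each new step repeats a uniformly recalled past step with probability p and reverses it otherwise, which yields superdiffusion when the positive feedback is strong enough (p > 3/4).

```python
import random

random.seed(7)

def elephant_walk(n_steps, p):
    """Elephant random walk: each step copies a uniformly chosen past step
    with probability p, or reverses it with probability 1 - p."""
    steps = [1 if random.random() < 0.5 else -1]   # first step is unbiased
    for _ in range(n_steps - 1):
        recalled = random.choice(steps)
        steps.append(recalled if random.random() < p else -recalled)
    return sum(steps)

def msd(p, n_steps=500, n_walks=400):
    """Mean-square displacement after n_steps, averaged over many walks."""
    return sum(elephant_walk(n_steps, p) ** 2 for _ in range(n_walks)) / n_walks

diffusive = msd(0.5)    # p = 1/2: no effective memory, MSD grows like t
superdiff = msd(0.95)   # strong positive feedback: MSD grows like t^(4p-2)
```

    At p = 1/2 each step is an unbiased coin flip, so the walk is ordinary diffusion; at p close to 1 the memory feedback makes the mean-square displacement grow far faster than linearly in time.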

  9. Probabilistic generation of random networks taking into account information on motifs occurrence.

    PubMed

    Bois, Frederic Y; Gayraud, Ghislaine

    2015-01-01

    Because of the huge number of graphs possible even with a small number of nodes, inference on network structure is known to be a challenging problem. Generating large random directed graphs with prescribed probabilities of occurrences of some meaningful patterns (motifs) is also difficult. We show how to generate such random graphs according to a formal probabilistic representation, using fast Markov chain Monte Carlo methods to sample them. As an illustration, we generate realistic graphs with several hundred nodes mimicking a gene transcription interaction network in Escherichia coli.
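    The Metropolis flavor of such a sampler can be sketched as follows, using the reciprocated dyad as a stand-in "motif" and hypothetical targets (the node count, edge count, motif count, and temperature are all illustrative, not the paper's E. coli setup): random edges are toggled, and moves are accepted or rejected according to how far the graph's statistics sit from their prescribed values.

```python
import math
import random

random.seed(3)

N = 30              # nodes (hypothetical)
TARGET_EDGES = 120  # desired number of directed edges
TARGET_MUTUAL = 25  # desired count of reciprocated pairs (the "motif")
BETA = 2.0          # inverse temperature of the sampler

def stats(adj):
    edges = sum(len(s) for s in adj.values())
    mutual = sum(1 for u in adj for v in adj[u] if u < v and u in adj[v])
    return edges, mutual

def energy(edges, mutual):
    """Quadratic penalty for deviating from the prescribed statistics."""
    return (edges - TARGET_EDGES) ** 2 / 10.0 + (mutual - TARGET_MUTUAL) ** 2

def toggle(adj, u, v):
    if v in adj[u]:
        adj[u].discard(v)
    else:
        adj[u].add(v)

def sample_graph(n_steps=20000):
    adj = {u: set() for u in range(N)}
    cur = energy(*stats(adj))
    for _ in range(n_steps):
        u, v = random.sample(range(N), 2)
        toggle(adj, u, v)                  # propose flipping edge u -> v
        new = energy(*stats(adj))
        if new <= cur or random.random() < math.exp(BETA * (cur - new)):
            cur = new                      # accept the move
        else:
            toggle(adj, u, v)              # reject: undo the flip
    return adj

g = sample_graph()
edges, mutual = stats(g)
```

    After enough sweeps the chain hovers near graphs whose edge and motif counts match the targets; richer motifs (e.g. feed-forward loops) only change the statistics counted in `stats`.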

  10. Probabilistic Generation of Random Networks Taking into Account Information on Motifs Occurrence

    PubMed Central

    Bois, Frederic Y.

    2015-01-01

    Because of the huge number of graphs possible even with a small number of nodes, inference on network structure is known to be a challenging problem. Generating large random directed graphs with prescribed probabilities of occurrences of some meaningful patterns (motifs) is also difficult. We show how to generate such random graphs according to a formal probabilistic representation, using fast Markov chain Monte Carlo methods to sample them. As an illustration, we generate realistic graphs with several hundred nodes mimicking a gene transcription interaction network in Escherichia coli. PMID:25493547

  11. Multipartite nonlocality and random measurements

    NASA Astrophysics Data System (ADS)

    de Rosier, Anna; Gruca, Jacek; Parisio, Fernando; Vértesi, Tamás; Laskowski, Wiesław

    2017-07-01

    We present an exhaustive numerical analysis of violations of local realism by families of multipartite quantum states. As an indicator of nonclassicality we employ the probability of violation for randomly sampled observables. Surprisingly, it rapidly increases with the number of parties or settings, and even for relatively small values of these parameters local realism is violated for almost all observables. We have observed this effect to be typical in the sense that it emerged for all investigated states, including some with randomly drawn coefficients. We also present the probability of violation as a witness of genuine multipartite entanglement.

  12. Perceptions of coping with non-disease-related life stress for women with osteoarthritis: a qualitative analysis

    PubMed Central

    Harris, Melissa L; Byles, Julie E; Townsend, Natalie; Loxton, Deborah

    2016-01-01

    Objective Coping with arthritis-related stress has been extensively studied. However, limited evidence exists regarding coping with stress extraneous to the disease (life stress). This study explored life stress and coping in a subset of older women with osteoarthritis from a larger longitudinal study. Setting An Australian regional university. Design This qualitative study involved semistructured telephone interviews. Potential participants were mailed a letter of invitation/participant information statement by the Australian Longitudinal Study on Women's Health (ALSWH). Invitations were sent out in small batches (primarily 10). Interviews were conducted until data saturation was achieved using a systematic process (n=19). Digitally recorded interviews were transcribed verbatim and deidentified. Data were thematically analysed. Participants Women who indicated being diagnosed or treated for arthritis in the previous 3 years in the fifth survey of the ALSWH (conducted in 2007) provided the sampling frame. Potential participants were randomly sampled by a blinded data manager using a random number generator. Results Coping with life stress involved both attitudinal coping processes developed early in life (ie, stoicism) and transient cognitive and support-based responses. Women also described a dualistic process involving a reduction in the ability to cope with ongoing stress over time, coupled with personal growth. Conclusions This is the first study to examine how individuals cope with non-arthritis-related stress. The findings add to the current understanding of stress and coping, and have implications regarding the prevention of arthritis in women. Importantly, this study highlighted the potential detrimental impact of persistent coping patterns developed early in life. Public health campaigns aimed at stress mitigation and facilitation of adaptive coping mechanisms in childhood and adolescence may assist with arthritis prevention. PMID:27188808

  13. Comparison of EGFR signaling pathway somatic DNA mutations derived from peripheral blood and corresponding tumor tissue of patients with advanced non-small-cell lung cancer using liquidchip technology.

    PubMed

    Zhang, Hui; Liu, Deruo; Li, Shanqing; Zheng, Yongqing; Yang, Xinjie; Li, Xi; Zhang, Quan; Qin, Na; Lu, Jialin; Ren-Heidenreich, Lifen; Yang, Huiyi; Wu, Yuhua; Zhang, Xinyong; Nong, Jingying; Sun, Yifen; Zhang, Shucai

    2013-11-01

    Somatic DNA mutations affecting the epidermal growth factor receptor (EGFR) signaling pathway are known to predict responsiveness to EGFR-tyrosine kinase inhibitor drugs in patients with advanced non-small-cell lung cancers. We evaluated a sensitive liquidchip platform for detecting EGFR, KRAS (alias Ki-ras), proto-oncogene B-Raf, and phosphatidylinositol 3-kinase CA mutations in plasma samples, which were highly correlated with matched tumor tissues from 86 patients with advanced non-small-cell lung cancers. EGFR exon 19 or 21 mutations were detected in 36 patients: 23 had identical mutations in both their blood and tissue samples, whereas mutations in the remaining 13 were found only in their tumor samples. These EGFR mutations occurred at a significantly higher frequency in females, never-smokers, and patients with adenocarcinomas (P ≤ 0.001). The EGFR exon 20 T790M mutation was detected in only one of the paired samples [100% (95% CI, 96% to 100%) agreement]. For KRAS, proto-oncogene B-Raf, and phosphatidylinositol 3-kinase CA mutations, the overall agreements were 97% (95% CI, 90% to 99%), 98% (95% CI, 92% to 99%), and 97% (95% CI, 90% to 99%), respectively, and these were not associated with age, sex, smoking history, or histopathologic type. In conclusion, mutations detected in plasma correlated strongly with the mutation profiles in the respective tumor samples, suggesting that this liquidchip platform may offer a rapid and noninvasive method for predicting tumor responsiveness to EGFR-tyrosine kinase inhibitor drugs in patients with advanced non-small-cell lung cancers. Copyright © 2013 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.

  14. Utility of the ACE Inhibitor Captopril in Mitigating Radiation-associated Pulmonary Toxicity in Lung Cancer: Results From NRG Oncology RTOG 0123.

    PubMed

    Small, William; James, Jennifer L; Moore, Timothy D; Fintel, Dan J; Lutz, Stephen T; Movsas, Benjamin; Suntharalingam, Mohan; Garces, Yolanda I; Ivker, Robert; Moulder, John; Pugh, Stephanie; Berk, Lawrence B

    2018-04-01

    The primary objective of NRG Oncology Radiation Therapy Oncology Group 0123 was to test the ability of the angiotensin-converting enzyme inhibitor captopril to alter the incidence of pulmonary damage after radiation therapy for lung cancer; secondary objectives included analyzing pulmonary cytokine expression, quality of life, and the long-term effects of captopril. Eligible patients had stage II-IIIB non-small cell lung cancer, stage I central non-small cell lung cancer, or limited-stage small cell lung cancer. Patients who met eligibility for randomization at the end of radiotherapy received either captopril or standard care for 1 year. Captopril was to be escalated to 50 mg three times a day. The primary endpoint was the incidence of grade 2+ radiation-induced pulmonary toxicity in the first year. Eighty-one patients were accrued between June 2003 and August 2007. Given the low accrual rate, the study was closed early. No significant safety issues were encountered. Eight patients were ineligible for registration or withdrew consent before randomization, and 40 patients were not randomized postradiation. Major reasons for nonrandomization included patient refusal and physician preference. Of the 33 randomized patients, 20 were analyzable (13 observation, 7 captopril). The incidence of grade 2+ pulmonary toxicity attributable to radiation therapy was 23% (3/13) in the observation arm and 14% (1/7) in the captopril arm. Despite significant resources and multiple amendments, NRG Oncology Radiation Therapy Oncology Group 0123 was unable to test the hypothesis that captopril mitigates radiation-induced pulmonary toxicity. It did show the safety of such an approach; the use of newer angiotensin-converting enzyme inhibitors started during radiotherapy may solve the accrual problems.

  15. Anomalous Anticipatory Responses in Networked Random Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, Roger D.; Bancel, Peter A.

    2006-10-16

    We examine an 8-year archive of synchronized, parallel time series of random data from a world-spanning network of physical random event generators (REGs). The archive is a publicly accessible matrix of normally distributed 200-bit sums recorded at 1 Hz, which extends from August 1998 to the present. The primary question is whether these data show non-random structure associated with major events such as natural or man-made disasters, terrible accidents, or grand celebrations. Secondarily, we examine the time course of apparently correlated responses. Statistical analyses of the data reveal consistent evidence that events which strongly affect people engender small but significant effects. These include suggestions of anticipatory responses in some cases, leading to a series of specialized analyses to assess possible non-random structure preceding precisely timed events. A focused examination of data collected around the time of earthquakes with Richter magnitude 6 and greater reveals non-random structure with a number of intriguing, potentially important features. Anomalous effects in the REG data are seen only when the corresponding earthquakes occur in populated areas. No structure is found if they occur in the oceans. We infer that an important contributor to the effect is the relevance of the earthquake to humans. Epoch averaging reveals evidence for changes in the data some hours prior to the main temblor, suggestive of reverse causation.
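
    For intuition about the underlying statistics: each REG trial is a sum of 200 random bits, i.e. Binomial(200, 0.5) with mean 100 and standard deviation sqrt(50), so pooled deviations can be scored with a Stouffer-style combined z. A minimal null-hypothesis sketch, illustrative only and not the project's actual analysis pipeline:

```python
import math
import random

MEAN, SD = 100.0, math.sqrt(50.0)   # a 200-bit sum is Binomial(200, 0.5)

def stouffer_z(trials):
    # combined z-score for a pool of trials across devices and seconds
    zs = [(x - MEAN) / SD for x in trials]
    return sum(zs) / math.sqrt(len(zs))

rng = random.Random(42)
# simulate 500 null trials from an unbiased random event generator
null_data = [sum(rng.getrandbits(1) for _ in range(200)) for _ in range(500)]
z = stouffer_z(null_data)   # under the null this is a standard-normal deviate
```

    A claimed "non-random structure" amounts to such combined scores departing from the standard normal around event epochs.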

  16. Is physical activity of medical personnel a role model for their patients.

    PubMed

    Biernat, Elżbieta; Poznańska, Anna; Gajewski, Antoni K

    2012-01-01

    Sedentary lifestyle and other health behaviors such as smoking or alcohol consumption are well-documented risk factors for several diseases. Numerous works by doctors and other healthcare professionals have been dedicated to the study of smoking and alcohol consumption. In contrast, the prevalence of physical activity among doctors and other medical personnel, who are well positioned to provide physical activity counseling to patients, remains almost unknown. Most studies have focused on male physicians and used a small total sample from one hospital. To study the situation in Warsaw, data on a random sample of medical personnel were analyzed in order to determine the prevalence of sport (both competitive and non-competitive leisure sport activity) and physical activity. The participants were a random sample of Warsaw medical doctors, nurses, and other medical personnel (764 persons). Data were collected face-to-face in November 2008 by well-trained interviewers. The respondents were asked about their participation in competitive sport or non-competitive leisure sport activities during the previous year. The short (last seven days) Polish version of the International Physical Activity Questionnaire (IPAQ) was used for the assessment of physical activity level. In the whole sample, participation in competitive sport was low but significantly higher among men, although there were no significant differences between genders within the different professional groups. Men more often took part in non-competitive leisure sport activities. A high level of physical activity was rare among both men and women (10.9 and 13.5%, respectively), while a low level of physical activity was dominant (44.0 and 49.6%, respectively). Independent risk factors for low physical activity were: not participating in sport or leisure sport activities (OR [95% CI] 3.70; 1.64-8.33 and 2.08; 1.37-.23 for men and women, respectively), being employed in an out-patient clinic (OR 2.86; 1.54-5.28 and 2.03; 1.42-2.90), overweight (only for men: OR 1.91; 1.10-3.31), and working as a doctor (for both men and women: 1.43; 1.05-1.94). All kinds of healthcare workers in Warsaw reported low physical activity, which could influence their physical activity counseling.

  17. Severe traumatic brain injury management and clinical outcome using the Lund concept.

    PubMed

    Koskinen, L-O D; Olivecrona, M; Grände, P O

    2014-12-26

    This review covers the main principles of the Lund concept for the treatment of severe traumatic brain injury, followed by a description of the results of clinical studies in which this therapy or a modified version of it has been used. Unlike other guidelines, which are based on meta-analytical approaches, important components of the Lund concept are based on physiological mechanisms for the regulation of brain volume and brain perfusion, and on reducing transcapillary plasma leakage and the need for plasma volume expanders. There have been nine non-randomized and two randomized outcome studies with the Lund concept or modified versions of it. The non-randomized studies indicated that the Lund concept is beneficial for outcome. The two randomized studies were small but showed better outcome in the groups of patients treated according to the modified principles of the Lund concept than in the groups given more conventional treatment. Copyright © 2014 IBRO. Published by Elsevier Ltd. All rights reserved.

  18. Accounting for randomness in measurement and sampling in studying cancer cell population dynamics.

    PubMed

    Ghavami, Siavash; Wolkenhauer, Olaf; Lahouti, Farshad; Ullah, Mukhtar; Linnebacher, Michael

    2014-10-01

    Knowing the expected temporal evolution of the proportion of different cell types in sample tissues gives an indication of the progression of the disease and its possible response to drugs. Such systems have been modelled using Markov processes. We here consider an experimentally realistic scenario in which transition probabilities are estimated from noisy cell population size measurements. Using aggregated data of FACS measurements, we develop MMSE and ML estimators and formulate two problems to find the minimum number of required samples and measurements to guarantee the accuracy of predicted population sizes. Our numerical results show that the estimated transition probabilities and steady states differ widely from the real values if one uses the standard deterministic approach for noisy measurements. This provides support for our argument that for the analysis of FACS data one should consider the observed state as a random variable. The second problem we address concerns the consequences of estimating the probability of a cell being in a particular state from measurements of small populations of cells. We show how the uncertainty arising from small sample sizes can be captured by a distribution for the state probability.
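
    The core issue can be illustrated with a two-state chain. The sketch below is a simplified stand-in for the paper's MMSE/ML estimators: it fits the transition rates of the mean dynamics by ordinary least squares, which recovers the true rates exactly from noiseless fractions but becomes biased once measurement noise enters the regressors (the paper's motivation for treating the observed state as a random variable). The rate names and values are hypothetical.

```python
import random

def simulate(a0, p_ab, p_ba, steps, noise, rng):
    # mean dynamics of a two-state (A/B) cell population, with
    # Gaussian measurement noise on the observed fraction of A cells
    a, obs = a0, []
    for _ in range(steps):
        obs.append(min(1.0, max(0.0, a + rng.gauss(0.0, noise))))
        a = a * (1 - p_ab) + (1 - a) * p_ba
    return obs

def fit_rates(obs):
    # least-squares fit of a_{t+1} - a_t = -a_t * p_ab + (1 - a_t) * p_ba
    suu = suv = svv = suy = svy = 0.0
    for a, a_next in zip(obs, obs[1:]):
        y, u, v = a_next - a, -a, 1.0 - a
        suu += u * u; suv += u * v; svv += v * v
        suy += u * y; svy += v * y
    det = suu * svv - suv * suv   # 2x2 normal equations, solved by Cramer's rule
    return (svv * suy - suv * svy) / det, (suu * svy - suv * suy) / det

rng = random.Random(3)
clean = simulate(0.9, p_ab=0.2, p_ba=0.05, steps=50, noise=0.0, rng=rng)
p_ab_hat, p_ba_hat = fit_rates(clean)   # recovers the true rates exactly
# with noise > 0 the same fit is biased (errors-in-variables), which is
# why the observed state should itself be modelled as random
```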

  19. Assessment of safety and immunogenicity of two different lots of diphtheria, tetanus, pertussis, hepatitis B and Haemophilus influenzae type b vaccine manufactured using small and large scale manufacturing process.

    PubMed

    Sharma, Hitt J; Patil, Vishwanath D; Lalwani, Sanjay K; Manglani, Mamta V; Ravichandran, Latha; Kapre, Subhash V; Jadhav, Suresh S; Parekh, Sameer S; Ashtagi, Girija; Malshe, Nandini; Palkar, Sonali; Wade, Minal; Arunprasath, T K; Kumar, Dinesh; Shewale, Sunil D

    2012-01-11

    Hib vaccine can be easily incorporated into the EPI vaccination schedule, as the immunization schedule of Hib is similar to that of DTP vaccine. To meet the global demand for Hib vaccine, SIIL scaled up the Hib conjugate manufacturing process. This study was conducted in Indian infants to assess and compare the immunogenicity and safety of DTwP-HB+Hib (Pentavac(®)) vaccine of SIIL manufactured at large scale with the 'same vaccine' manufactured at a smaller scale. 720 infants aged 6-8 weeks were randomized (2:1 ratio) to receive 0.5 ml of Pentavac(®) vaccine from two different lots, one produced with the scaled-up process and the other with the small-scale process. Serum samples obtained before and at one month after the 3rd dose of vaccine from both groups were tested for IgG antibody response by ELISA and compared to assess non-inferiority. Neither immunological interference nor increased reactogenicity was observed in either of the vaccine groups. All infants developed protective antibody titres to diphtheria, tetanus and Hib disease. For the hepatitis B antigen, one child from each group remained sero-negative. The response to pertussis was 88% in the large-scale group vis-à-vis 87% in the small-scale group. Non-inferiority was concluded for all five components of the vaccine. No serious adverse event was reported in the study. The scaled-up vaccine achieved a comparable response in terms of safety and immunogenicity to the small-scale vaccine and can therefore be easily incorporated into the routine childhood vaccination programme. Copyright © 2011 Elsevier Ltd. All rights reserved.
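
    A non-inferiority conclusion of this kind rests on comparing a confidence interval for the response-rate difference against a pre-specified margin. A minimal sketch using a Wald interval; the counts and the -10% margin are hypothetical illustrations, not the trial's actual analysis.

```python
import math

def noninferior(x_test, n_test, x_ref, n_ref, margin=-0.10, z=1.96):
    # Wald 95% CI for the (test - reference) response-rate difference;
    # non-inferiority is declared if the lower bound exceeds the margin
    p_t, p_r = x_test / n_test, x_ref / n_ref
    d = p_t - p_r
    se = math.sqrt(p_t * (1 - p_t) / n_test + p_r * (1 - p_r) / n_ref)
    return d - z * se, d + z * se, (d - z * se) > margin

# hypothetical counts mimicking ~88% vs ~87% pertussis response, 2:1 allocation
lo, hi, ok = noninferior(422, 480, 209, 240)
```

    Here the lower confidence bound stays above the assumed margin, so the larger-scale lot would be declared non-inferior.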

  20. A sub-sampled approach to extremely low-dose STEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, A.; Luzi, L.; Yang, H.

    The inpainting of randomly sub-sampled images acquired by scanning transmission electron microscopy (STEM) is an attractive method for imaging under low-dose conditions (≤ 1 e-/Å2) without changing either the operation of the microscope or the physics of the imaging process. We show that 1) adaptive sub-sampling increases acquisition speed, resolution, and sensitivity; and 2) random (non-adaptive) sub-sampling is equivalent to, but faster than, traditional low-dose techniques. Adaptive sub-sampling opens numerous possibilities for the analysis of beam-sensitive materials and in-situ dynamic processes at the resolution limit of the aberration-corrected microscope, and is demonstrated here for the analysis of the node distribution in metal-organic frameworks (MOFs).
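
    Inpainting from a random sampling mask can be illustrated with the simplest possible reconstructor: iterative neighbour averaging (Laplace-style diffusion). This is a crude stand-in for the sparse-reconstruction methods used in such work, applied to a hypothetical smooth test image with roughly 20% of pixels "dosed".

```python
import random

def inpaint(image, mask, iters=200):
    # fill unsampled pixels by repeated 4-neighbour averaging;
    # sampled pixels (mask True) stay fixed at their measured values
    h, w = len(image), len(image[0])
    est = [[image[i][j] if mask[i][j] else 0.5 for j in range(w)]
           for i in range(h)]
    for _ in range(iters):
        for i in range(h):
            for j in range(w):
                if not mask[i][j]:
                    nb = [est[x][y] for x, y in
                          ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                          if 0 <= x < h and 0 <= y < w]
                    est[i][j] = sum(nb) / len(nb)
    return est

rng = random.Random(0)
truth = [[(i + j) / 18 for j in range(10)] for i in range(10)]   # smooth ramp
mask = [[rng.random() < 0.2 for j in range(10)] for i in range(10)]  # ~20% dose
recon = inpaint(truth, mask)
err = max(abs(recon[i][j] - truth[i][j])
          for i in range(10) for j in range(10))
```

    For smooth images even this naive diffusion recovers most of the structure from a small fraction of the pixels, which is the intuition behind sub-sampled low-dose acquisition.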

  1. Non-random mating and convergence over time for alcohol consumption, smoking, and exercise: the Nord-Trøndelag Health Study.

    PubMed

    Ask, Helga; Rognmo, Kamilla; Torvik, Fartein Ask; Røysamb, Espen; Tambs, Kristian

    2012-05-01

    Spouses tend to have similar lifestyles. We explored the degree to which spouse similarity in alcohol use, smoking, and physical exercise is caused by non-random mating or convergence. We used data collected for the Nord-Trøndelag Health Study from 1984 to 1986 and prospective registry information about when and with whom people entered marriage/cohabitation between 1970 and 2000. Our sample included 19,599 married/cohabitating couples and 1,551 future couples that were to marry/cohabitate in the 14-16 years following data collection. All couples were grouped according to the duration between data collection and entering into marriage/cohabitation. Age-adjusted polychoric spouse correlations were used as the dependent variables in non-linear segmented regression analysis; the independent variable was time. The results indicate that spouse concordance in lifestyle is due to both non-random mating and convergence. Non-random mating appeared to be strongest for smoking. Convergence in alcohol use and smoking was evident during the period prior to marriage/cohabitation, whereas convergence in exercise was evident throughout life. Reduced spouse similarity in smoking with relationship duration may reflect secular trends.

  2. Prevalence of paratuberculosis in the dairy goat and dairy sheep industries in Ontario, Canada.

    PubMed

    Bauman, Cathy A; Jones-Bitton, Andria; Menzies, Paula; Toft, Nils; Jansen, Jocelyn; Kelton, David

    2016-02-01

    A cross-sectional study was undertaken (October 2010 to August 2011) to estimate the prevalence of paratuberculosis in the small ruminant dairy industries in Ontario, Canada. Blood and feces were sampled from 580 goats and 397 sheep (lactating and 2 y of age or older) that were randomly selected from 29 randomly selected dairy goat herds and 21 convenience-selected dairy sheep flocks. Fecal samples were analyzed using bacterial culture (BD BACTEC MGIT 960) and polymerase chain reaction (Tetracore); serum samples were tested with the Prionics Parachek enzyme-linked immunosorbent assay (ELISA). Using 3-test latent class Bayesian models, true farm-level prevalence was estimated to be 83.0% [95% probability interval (PI): 62.6% to 98.1%] for dairy goats and 66.8% (95% PI: 41.6% to 91.4%) for dairy sheep. The within-farm true prevalence for dairy goats was 35.2% (95% PI: 23.0% to 49.8%) and for dairy sheep was 48.3% (95% PI: 27.6% to 74.3%). These data indicate that a paratuberculosis control program for small ruminants is needed in Ontario.
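
    The distinction between apparent (test-positive) and "true" prevalence above comes from correcting for imperfect test accuracy. The study uses a 3-test Bayesian latent class model; as a much simpler single-test analogue, here is the classical Rogan-Gladen adjustment with hypothetical sensitivity and specificity values.

```python
def rogan_gladen(apparent, se, sp):
    # true prevalence from apparent prevalence, sensitivity (se)
    # and specificity (sp): (AP + Sp - 1) / (Se + Sp - 1)
    return (apparent + sp - 1.0) / (se + sp - 1.0)

# hypothetical figures: 25% test-positive, sensitivity 0.60, specificity 0.98
true_prev = rogan_gladen(0.25, se=0.60, sp=0.98)
```

    With an insensitive test, true prevalence can substantially exceed the apparent prevalence, which is why latent class corrections matter for paratuberculosis surveys.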

  3. Depression and aggressive behaviour in adolescents offenders and non-offenders.

    PubMed

    Llorca Mestre, Anna; Malonda, Elisabeth; Samper-García, Paula

    2017-05-01

    Adolescent behaviour is strongly linked to emotions. The aims of this study were 1) to analyse the differences between young offenders and non-offenders in emotional instability, anger, aggressive behaviour, anxiety and depression, as well as the differences according to sex; and 2) to compare the relation between emotional instability and anxiety, depression and aggressive behaviour, mediated or modulated by anger, in both groups. Participants were 440 adolescents, male and female (15-18 years old). 220 were young offenders from four different correctional centres of the Valencia Region. The other 220 participants were randomly chosen from ten public and private schools in the Valencia metropolitan area. In the schools, the instruments were administered collectively in the classroom, with a maximum duration of 50 minutes. In the youth detention centres, the application was carried out in small groups. The structural equation models (SEM) carried out on each group, young offenders and non-offenders, show a relation between the assessed variables. Emotional instability appears strongly related to anger in both samples, but anger predicts depression and aggressive behaviour only in the offender population. The results give relevant information for the treatment and prevention of aggressive behaviour and delinquency in teenagers through emotional regulation.

  4. Individual decision making in relation to participation in cardiovascular screening: a study of revealed and stated preferences.

    PubMed

    Søgaard, Rikke; Lindholt, Jes; Gyrd-Hansen, Dorte

    2013-02-01

    The (cost-)effectiveness of a screening programme may be strongly influenced by the participation rate. The objective of this study was to compare participants' and non-participants' motives for the attendance decision as well as their overall preferences for participation in cardiovascular disease screening. This study sampled 1053 participants and 1006 non-participants from a screening trial and randomly allocated the participants to receive different levels of additional information about the screening programme. An ad hoc survey questionnaire about doubt and arguments in relation to the participation decision was given to participants and non-participants along with a contingent valuation task. Among participants, 5% had doubt about participation, and the most frequent argument was that they did not want the test result. Among non-participants, 40% would reconsider their non-participation decision after having received additional information, while the remaining 60% stood by their decision and provided explicit arguments for it. After having received additional information, the participants still valued the programme significantly higher than non-participants, but the difference was relatively small. Participants and non-participants in cardiovascular screening programmes seem to have different strengths of preferences, which signals that their behavioural choice is founded in rational thinking. Furthermore, it appears that additional information and a second reflection about the participation decision may lead a substantial proportion of non-participants to reverse their decision, a finding that should receive policy interest.

  5. Scalable randomized benchmarking of non-Clifford gates

    NASA Astrophysics Data System (ADS)

    Cross, Andrew; Magesan, Easwar; Bishop, Lev; Smolin, John; Gambetta, Jay

    Randomized benchmarking is a widely used experimental technique to characterize the average error of quantum operations. Benchmarking procedures that scale to enable characterization of n-qubit circuits rely on efficient procedures for manipulating those circuits and, as such, have been limited to subgroups of the Clifford group. However, universal quantum computers require additional, non-Clifford gates to approximate arbitrary unitary transformations. We define a scalable randomized benchmarking procedure over n-qubit unitary matrices that correspond to protected non-Clifford gates for a class of stabilizer codes. We present efficient methods for representing and composing group elements, sampling them uniformly, and synthesizing corresponding poly(n)-sized circuits. The procedure provides experimental access to two independent parameters that together characterize the average gate fidelity of a group element. We acknowledge support from ARO under Contract W911NF-14-1-0124.

  6. Some design issues of strata-matched non-randomized studies with survival outcomes.

    PubMed

    Mazumdar, Madhu; Tu, Donsheng; Zhou, Xi Kathy

    2006-12-15

    Non-randomized studies for the evaluation of a medical intervention are useful for quantitative hypothesis generation before the initiation of a randomized trial and also when randomized clinical trials are difficult to conduct. A strata-matched non-randomized design is often utilized where subjects treated by a test intervention are matched to a fixed number of subjects treated by a standard intervention within covariate based strata. In this paper, we consider the issue of sample size calculation for this design. Based on the asymptotic formula for the power of a stratified log-rank test, we derive a formula to calculate the minimum number of subjects in the test intervention group that is required to detect a given relative risk between the test and standard interventions. When this minimum number of subjects in the test intervention group is available, an equation is also derived to find the multiple that determines the number of subjects in the standard intervention group within each stratum. The methodology developed is applied to two illustrative examples in gastric cancer and sarcoma.
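
    The kind of calculation involved can be illustrated with Schoenfeld's classical approximation for the number of events required by a log-rank test, on which such stratified derivations build; the hazard ratio and the 2:1 matching ratio below are hypothetical, and the z-values correspond to two-sided alpha = 0.05 and 80% power.

```python
import math

def schoenfeld_events(hr, z_alpha=1.959964, z_beta=0.841621, ratio=1.0):
    # total events required by a log-rank test (Schoenfeld's approximation):
    # d = (z_alpha + z_beta)^2 / (p1 * p2 * ln(HR)^2),
    # where ratio = standard-arm subjects per test-arm subject
    p_test = 1.0 / (1.0 + ratio)
    p_std = 1.0 - p_test
    return (z_alpha + z_beta) ** 2 / (p_test * p_std * math.log(hr) ** 2)

# e.g. detect a relative risk of 0.67 with 2 matched standard subjects
# per test subject
d = schoenfeld_events(0.67, ratio=2.0)
```

    Unequal allocation makes the product p1*p2 smaller than the balanced 1/4, so more events are needed than in an equally allocated design; subject counts then follow by dividing by the expected event probability per stratum.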

  7. The Incidence and Management of Conflicts in Secular and Non-Secular Tertiary Institutions in South West Nigeria

    ERIC Educational Resources Information Center

    Ayodele, Joseph Babatola; Adewumi, Joseph Olukayode

    2007-01-01

    This paper compared the incidence and management of conflicts in secular and non-secular tertiary institutions in Nigeria. The sample of this study was made of sixty staff, and two hundred and forty students randomly selected each from two secular and two non-secular tertiary institutions in south western Nigeria. A validated questionnaire was…

  8. Stable and efficient retrospective 4D-MRI using non-uniformly distributed quasi-random numbers

    NASA Astrophysics Data System (ADS)

    Breuer, Kathrin; Meyer, Cord B.; Breuer, Felix A.; Richter, Anne; Exner, Florian; Weng, Andreas M.; Ströhle, Serge; Polat, Bülent; Jakob, Peter M.; Sauer, Otto A.; Flentje, Michael; Weick, Stefan

    2018-04-01

    The purpose of this work is the development of a robust and reliable three-dimensional (3D) Cartesian imaging technique for fast and flexible retrospective 4D abdominal MRI during free breathing. To this end, a non-uniform quasi-random (NU-QR) reordering of the phase encoding (ky-kz) lines was incorporated into 3D Cartesian acquisition. The proposed sampling scheme allocates more phase encoding points near the k-space origin while reducing the sampling density in the outer part of the k-space. Respiratory self-gating in combination with SPIRiT reconstruction is used for the reconstruction of abdominal data sets in different respiratory phases (4D-MRI). Six volunteers and three patients were examined at 1.5 T during free breathing. Additionally, data sets with conventional two-dimensional (2D) linear and 2D quasi-random phase encoding order were acquired for the volunteers for comparison. A quantitative evaluation of image quality versus scan times (from 70 s to 626 s) for the given sampling schemes was obtained by calculating the normalized mutual information (NMI) for all volunteers. Motion estimation was accomplished by calculating the maximum derivative of a signal intensity profile of a transition (e.g. tumor or diaphragm). The 2D non-uniform quasi-random distribution of phase encoding lines in Cartesian 3D MRI yields more efficient undersampling patterns for parallel imaging compared to conventional uniform quasi-random and linear sampling. Median NMI values of NU-QR sampling are the highest for all scan times; therefore, within the same scan time, 4D imaging could be performed with improved image quality. The proposed method allows for the reconstruction of motion-artifact-reduced 4D data sets with an isotropic spatial resolution of 2.1 × 2.1 × 2.1 mm3 in a short scan time, e.g. 10 respiratory phases in only 3 min. Cranio-caudal tumor displacements between 23 and 46 mm could be observed. NU-QR sampling enables stable 4D-MRI with high temporal and spatial resolution within a short scan time for visualization of organ or tumor motion during free breathing. Further studies, e.g. the application of the method for radiotherapy planning, are needed to investigate the clinical applicability and diagnostic value of the approach.
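
    The idea of a centre-weighted quasi-random phase-encoding pattern can be sketched with a Halton sequence warped toward the k-space origin. This is an illustrative construction under assumed parameters (matrix size, density exponent), not the authors' actual reordering scheme.

```python
import math

def halton(i, base):
    # radical-inverse (van der Corput) sequence in [0, 1)
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def nu_qr_points(n, ny=128, nz=64, power=2.0):
    # quasi-random (ky, kz) indices, densified toward the k-space centre:
    # mapping t -> sign(t) * |t|**power for t uniform in [-1, 1)
    # concentrates samples near 0 while keeping low-discrepancy spacing
    pts = []
    for i in range(1, n + 1):
        t = 2.0 * halton(i, 2) - 1.0
        s = 2.0 * halton(i, 3) - 1.0
        y = math.copysign(abs(t) ** power, t)
        z = math.copysign(abs(s) ** power, s)
        pts.append((int((y + 1.0) / 2.0 * (ny - 1)),
                    int((z + 1.0) / 2.0 * (nz - 1))))
    return pts

pts = nu_qr_points(1000)
# count samples in the central quarter of each axis (6.25% of k-space area);
# the warp places far more than 6.25% of the samples there
central = sum(1 for y, z in pts if 48 <= y < 80 and 24 <= z < 40)
```

    Raising `power` sharpens the central density, trading outer-k-space (high-resolution) coverage for more robust self-gating signal near the origin.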

  9. Ear Acupuncture for Acute Sore Throat: A Randomized Controlled Trial

    DTIC Science & Technology

    2014-09-26

    Final report, September 2014: Ear acupuncture for acute sore throat, a randomized controlled trial. Auricular acupuncture is a low-risk option for acute pain control; battlefield acupuncture (BFA) is a specific auricular acupuncture technique. Strengths: prospective RCT. Weaknesses: small sample size, no sham acupuncture performed, patients not blinded to treatment.

  10. Prospective Randomized Phase II Parallel Study of Vinorelbine Maintenance Therapy versus Best Supportive Care in Advanced Non-Small Cell Lung Cancer.

    PubMed

    Khosravi, Adnan; Esfahani-Monfared, Zahra; Seifi, Sharareh; Khodadad, Kian

    2017-01-01

    Maintenance strategy has been used to improve survival in non-small cell lung cancer (NSCLC). We investigated whether switch maintenance therapy with vinorelbine improved progression-free survival (PFS) after first-line chemotherapy with gemcitabine plus carboplatin. In this single-blind, parallel, phase 2, randomized trial, patients with NSCLC pathology, age >18 years, Eastern Cooperative Oncology Group (ECOG) performance status (PS) score of 0-2, and advanced stage (IIIB and IV) were treated with up to 6 cycles of gemcitabine 1250 mg/m2 (days 1 and 8) plus carboplatin AUC 5 (day 1) every 3 weeks. Patients who did not show progression after first-line chemotherapy were randomly assigned to receive switch maintenance with vinorelbine (25 mg/m2, days 1 and 15) or best supportive care until disease progression. A total of 100 patients were registered, of whom 34 had a non-progressive response to first-line chemotherapy and randomly received maintenance vinorelbine (n=19) or best supportive care (n=15). The hazard ratio of PFS in the vinorelbine group relative to the best supportive care group was 1.097 (95% confidence interval = 0.479-2.510; P-value = 0.827). There was no significant difference between the overall survival of the two groups (P=0.068). Switch maintenance strategies are beneficial, but defining the right candidates for treatment is a problem; moreover, trial designs do not always reflect real-world considerations. Switch maintenance therapy with vinorelbine, although it had tolerable toxicity, did not improve PFS in patients with NSCLC. Therefore, other agents should be considered in this setting.

  11. Antimicrobial drugs for persistent diarrhoea of unknown or non-specific cause in children under six in low and middle income countries: systematic review of randomized controlled trials

    PubMed Central

    2009-01-01

    Background A high proportion of children with persistent diarrhoea in middle and low income countries die. The best treatment is not clear. We conducted a systematic review to evaluate the effectiveness of antimicrobial drug treatment for persistent diarrhoea of unknown or non-specific cause. Methods We included randomized comparisons of antimicrobial drugs for the treatment of persistent diarrhoea of unknown or non-specific cause in children under the age of six years in low and middle income countries. We searched the electronic databases MEDLINE, EMBASE, LILACS, WEB OF SCIENCE, and the Cochrane Central Register of Controlled Trials (CENTRAL) to May 2008 for relevant randomized or quasi-randomized controlled trials. We summarised the characteristics of the eligible trials, assessed their quality using standard criteria, and extracted relevant outcomes data. Where appropriate, we combined the results of different trials. Results Three trials from South East Asia and one from Guatemala were included; all were small, and three had adequate allocation concealment. Two were in patients with diarrhoea of unknown cause, and two were in patients in whom known bacterial or parasitological causes of diarrhoea had been excluded. No difference was demonstrated for oral gentamicin compared with placebo (presence of diarrhoea at 6 or 7 days; 2 trials, n = 151), nor for metronidazole compared with placebo (presence of diarrhoea at 3, 5 and 7 days; 1 trial, n = 99). In one small trial, sulphamethoxazole-trimethoprim appeared better than placebo in relation to diarrhoea at seven days and total stool volume (n = 55). Conclusion There is little evidence as to whether or not antimicrobials help treat persistent diarrhoea in young children in low and middle income countries. PMID:19257885

  12. A two-way enriched clinical trial design: combining advantages of placebo lead-in and randomized withdrawal.

    PubMed

    Ivanova, Anastasia; Tamura, Roy N

    2015-12-01

    A new clinical trial design, designated the two-way enriched design (TED), is introduced; it augments the standard randomized placebo-controlled trial with second-stage enrichment in placebo non-responders and drug responders. The trial is run in two stages. In the first stage, patients are randomized between drug and placebo. In the second stage, placebo non-responders and drug responders are each re-randomized between drug and placebo. All first-stage data, together with second-stage data from first-stage placebo non-responders and first-stage drug responders, are utilized in the efficacy analysis. The authors developed one-, two- and three-degree-of-freedom score tests for treatment effect in the TED and give formulae for asymptotic power and for sample size computations. The authors compute the optimal first-stage allocation ratio between drug and placebo for the TED and compare the operating characteristics of the design to those of the standard parallel clinical trial, placebo lead-in, and randomized withdrawal designs. Two motivating examples from different disease areas are presented to illustrate possible design considerations. © The Author(s) 2011.
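    The two-stage flow described above can be sketched in a short simulation. This is an illustrative reading of the TED allocation only: the binary responder definition, the response rates, and the name `ted_allocate` are all assumptions, not the authors' software or notation.

```python
import random

def ted_allocate(n_patients, p_response_drug, p_response_placebo,
                 rng=random.Random(0)):
    """Simulate two-way enriched design (TED) allocation.

    Stage 1: 1:1 randomization between drug and placebo. Stage 2: placebo
    non-responders and drug responders are each re-randomized 1:1 between
    drug and placebo; the other two subgroups are not re-randomized.
    Returns the stage-2 assignments for the two enriched subgroups.
    """
    stage2 = {"placebo_nonresponders": [], "drug_responders": []}
    for _ in range(n_patients):
        arm = rng.choice(["drug", "placebo"])                      # stage 1
        p = p_response_drug if arm == "drug" else p_response_placebo
        responded = rng.random() < p
        if arm == "placebo" and not responded:                     # enrich on placebo failures
            stage2["placebo_nonresponders"].append(rng.choice(["drug", "placebo"]))
        elif arm == "drug" and responded:                          # enrich on drug successes
            stage2["drug_responders"].append(rng.choice(["drug", "placebo"]))
    return stage2

groups = ted_allocate(400, p_response_drug=0.5, p_response_placebo=0.3)
```

    Both enriched subgroups carry information about the drug-placebo contrast, which is why the design can pool first- and second-stage data in the score tests.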

  13. Multiscale Roughness Influencing on Transport Behavior of Passive Solute through a Single Self-affine Fracture

    NASA Astrophysics Data System (ADS)

    Dou, Z.

    2017-12-01

    In this study, the influence of multi-scale roughness on the transport behavior of a passive solute through a self-affine fracture was investigated. The single self-affine fracture was constructed by successive random additions (SRA), and the fracture roughness was decomposed into two scales (large-scale primary roughness and small-scale secondary roughness) by wavelet analysis. The fluid flow in fractures, characterized by Forchheimer's law, showed non-linear flow behaviors such as eddies and tortuous streamlines. The results indicated that the small-scale secondary roughness was primarily responsible for these non-linear flow behaviors. Direct simulations of asymptotic passive solute transport showed non-Fickian transport characteristics (early arrivals and long tails) in breakthrough curves (BTCs) and residence time distributions (RTDs) both with and without consideration of the secondary roughness. Analysis of multiscale BTCs and RTDs showed that the small-scale secondary roughness played a significant role in enhancing the non-Fickian transport characteristics: removing it lengthened the arrival times and shortened the tails. The peak concentration in BTCs decreased as the secondary roughness was removed, implying that the secondary roughness could also enhance solute dilution. BTCs estimated by the Fickian advection-dispersion equation (ADE) yielded errors, which decreased when the small-scale secondary roughness was removed. The mobile-immobile model (MIM) was implemented as an alternative to characterize the non-Fickian transport and proved more capable of estimating non-Fickian BTCs. The small-scale secondary roughness resulted in a smaller mobile domain fraction and a larger mass exchange rate between the immobile and mobile domains. The estimated MIM parameters could provide insight into the inherent mechanism of roughness-induced non-Fickian transport behaviors.
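    As a rough illustration of the construction named above, a one-dimensional self-affine profile can be generated by successive random additions: refine the grid by midpoint interpolation and add Gaussian noise to every point, shrinking the noise amplitude by 2^(-H) per level. The Hurst exponent, level count, and function name below are arbitrary choices for the sketch, not the study's parameters.

```python
import random

def sra_profile(levels, hurst=0.8, sigma=1.0, seed=1):
    """Generate a 1-D self-affine profile by successive random additions (SRA).

    At each level the grid is refined by midpoint interpolation, and ALL
    points receive a Gaussian addition whose std dev shrinks by 2**(-hurst);
    adding noise to every point (not just midpoints) is what distinguishes
    SRA from plain midpoint displacement.
    """
    rng = random.Random(seed)
    z = [0.0, 0.0]                       # endpoints of the profile
    s = sigma
    for _ in range(levels):
        refined = []
        for a, b in zip(z, z[1:]):       # midpoint interpolation doubles resolution
            refined.extend([a, (a + b) / 2.0])
        refined.append(z[-1])
        s *= 2.0 ** (-hurst)             # scale fluctuations for self-affinity
        z = [v + rng.gauss(0.0, s) for v in refined]
    return z

profile = sra_profile(levels=8, hurst=0.8)   # 2**8 + 1 = 257 points
```

    A larger Hurst exponent yields a smoother profile, i.e. weaker small-scale secondary roughness relative to the large-scale primary roughness.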

  14. Latin Hypercube Sampling (LHS) UNIX Library/Standalone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2004-05-13

    The LHS UNIX Library/Standalone software provides the capability to draw random samples from over 30 distribution types. It performs the sampling by a stratified sampling method called Latin Hypercube Sampling (LHS). Multiple distributions can be sampled simultaneously, with user-specified correlations amongst the input distributions; LHS UNIX Library/Standalone thus provides a way to generate multi-variate samples. The LHS samples can be generated either through a callable library (e.g., from within the DAKOTA software framework) or as a standalone capability. LHS is a constrained Monte Carlo sampling scheme. In LHS, the range of each variable is divided into non-overlapping intervals on the basis of equal probability, and a sample is selected at random with respect to the probability density in each interval. If multiple variables are sampled simultaneously, the values obtained for each are paired in a random manner with the values of the other variables. In some cases, the pairing is restricted to obtain specified correlations amongst the input variables. Many simulation codes have input parameters that are uncertain and can be specified by a distribution. To perform uncertainty analysis and sensitivity analysis, random values are drawn from the input parameter distributions and the simulation is run with these values to obtain output values. If this is done repeatedly, with many input samples drawn, one can build up a distribution of the output as well as examine correlations between input and output variables.
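    The stratification-plus-random-pairing scheme described above can be sketched in a few lines. This is a toy illustration of the LHS idea on uniform [0, 1) marginals, not the LHS UNIX Library itself; the function name is invented.

```python
import random

def latin_hypercube(n, dims, rng=random.Random(42)):
    """Draw n Latin-hypercube samples in [0, 1)**dims.

    Each variable's range is split into n equal-probability intervals;
    one point is drawn uniformly inside each interval, and the interval
    order is shuffled independently per dimension to randomize pairing.
    """
    samples = []
    for _ in range(dims):
        # one stratified draw per interval, then random pairing across dims
        column = [(i + rng.random()) / n for i in range(n)]
        rng.shuffle(column)
        samples.append(column)
    return list(zip(*samples))           # n tuples of length dims

pts = latin_hypercube(10, 2)
```

    For non-uniform marginals one would map each stratified uniform draw through the inverse CDF of the target distribution, which is effectively what the library does for its 30+ distribution types.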

  15. A robust and efficient statistical method for genetic association studies using case and control samples from multiple cohorts

    PubMed Central

    2013-01-01

    Background The theoretical basis of genome-wide association studies (GWAS) is statistical inference of linkage disequilibrium (LD) between any polymorphic marker and a putative disease locus. Most methods widely implemented for such analyses are vulnerable to several key demographic factors, delivering poor statistical power for detecting genuine associations and a high false positive rate. Here, we present a likelihood-based statistical approach that properly accounts for the non-random nature of case–control samples with regard to the genotypic distribution at the loci under study, and that confers flexibility to test for genetic association in the presence of different confounding factors such as population structure and non-randomness of samples. Results We implemented this novel method, together with several popular methods from the GWAS literature, to re-analyze recently published Parkinson’s disease (PD) case–control samples. The real data analysis and computer simulation show that the new method confers not only significantly improved statistical power for detecting associations but also robustness to the difficulties stemming from non-random sampling and genetic structure when compared to its rivals. In particular, the new method detected 44 significant SNPs within 25 chromosomal regions of size < 1 Mb, whereas only 6 SNPs in two of these regions were previously detected by trend-test-based methods. It discovered two SNPs located 1.18 Mb and 0.18 Mb from the PD candidate genes FGF20 and PARK8, without incurring false positive risk. Conclusions We developed a novel likelihood-based method that provides adequate estimation of LD and other population model parameters from case and control samples and allows easy integration of samples from multiple genetically divergent populations, thereby conferring statistically robust and powerful GWAS analyses. On the basis of simulation studies and analysis of real datasets, we demonstrated significant improvement of the new method over the non-parametric trend test, the most widely used method in the GWAS literature. PMID:23394771
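    For reference, the non-parametric trend test that the new method is benchmarked against is typically the Cochran-Armitage test on per-genotype case/control counts. A minimal sketch follows; the genotype counts are invented, and the allele-dosage scores (0, 1, 2) are the conventional additive choice.

```python
import math

def cochran_armitage(cases, controls, scores=(0, 1, 2)):
    """Cochran-Armitage trend test for case-control genotype counts.

    cases/controls are per-genotype counts (e.g. for 0, 1, 2 risk alleles).
    Returns the normal-deviate statistic Z; Z**2 is chi-square with 1 df.
    """
    R = sum(cases)                            # number of cases
    N = R + sum(controls)                     # total sample size
    cols = [c + s for c, s in zip(cases, controls)]
    # score sum among cases minus its expectation under no association
    u = sum(w * r for w, r in zip(scores, cases)) \
        - R / N * sum(w * n for w, n in zip(scores, cols))
    # finite-population (hypergeometric) variance of the score sum
    mean_w = sum(w * n for w, n in zip(scores, cols)) / N
    var_w = sum(n * (w - mean_w) ** 2 for w, n in zip(scores, cols)) / N
    var_u = R * (N - R) / (N - 1) * var_w
    return u / math.sqrt(var_u)

z = cochran_armitage(cases=(10, 20, 30), controls=(30, 20, 10))
```

    With binary scores this statistic reduces to the (N-1)-corrected chi-square for a 2x2 table, which is one way to check the variance term.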

  16. Association between circulating irisin and insulin resistance in non-diabetic adults: A meta-analysis.

    PubMed

    Qiu, Shanhu; Cai, Xue; Yin, Han; Zügel, Martina; Sun, Zilin; Steinacker, Jürgen Michael; Schumann, Uwe

    2016-06-01

    Exogenous administration of recombinant irisin improves glucose metabolism. However, the association of endogenous circulating (plasma/serum) irisin with insulin resistance remains poorly delineated. This study aimed to examine this association by meta-analyzing the current evidence, without restriction on study design, in non-diabetic adults. Peer-reviewed studies written in English from 3 databases were searched to December 2015. Studies that reported the association between circulating irisin and insulin resistance (or its reverse, insulin sensitivity) in non-diabetic non-pregnant adults (mean age ≥18 years) were included. The pooled correlation coefficient (r) and 95% confidence intervals (CIs) were calculated using a random-effects model. Subgroup analyses and meta-regression were performed to explore potential sources of heterogeneity. Of the 195 identified publications, 17 studies from 15 articles enrolling 1912 participants reported the association between circulating irisin and insulin resistance. The pooled effect size was 0.15 (95% CI: 0.07 to 0.22) with substantial heterogeneity (I²=55.5%). This association appeared to be modified by glycemic status (fasting blood glucose ≥6.1 mmol/L versus <6.1 mmol/L) and racial-ethnic difference (Asians versus Europeans versus Americans), but not by sex, sampling time-point, blood sample type, ELISA kit used, baseline age, or body mass index. Circulating irisin was inversely associated with insulin sensitivity (6 studies; r=-0.17, 95% CI: -0.25 to -0.09). Circulating irisin is thus positively associated with insulin resistance in non-diabetic adults. However, this association is rather small and requires further clarification, in particular by well-designed large epidemiological studies with overall, race-, and sex-specific analyses. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
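    Random-effects pooling of correlations of the kind used above is commonly done on the Fisher z scale (variance 1/(n-3) per study) with the DerSimonian-Laird between-study variance estimator. A minimal sketch with invented study values; this is the generic method, not the authors' exact analysis.

```python
import math

def pool_correlations(rs, ns):
    """DerSimonian-Laird random-effects pooling of correlation coefficients.

    Correlations are Fisher z-transformed, pooled with inverse-variance
    weights inflated by the between-study variance tau^2, then
    back-transformed. Returns (pooled_r, I_squared_percent).
    """
    zs = [0.5 * math.log((1 + r) / (1 - r)) for r in rs]   # Fisher z
    ws = [n - 3 for n in ns]                               # 1 / within-study variance
    zbar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    q = sum(w * (z - zbar) ** 2 for w, z in zip(ws, zs))   # Cochran's Q
    df = len(rs) - 1
    c = sum(ws) - sum(w * w for w in ws) / sum(ws)
    tau2 = max(0.0, (q - df) / c)                          # DL estimator
    wstar = [1.0 / (1.0 / w + tau2) for w in ws]
    z_re = sum(w * z for w, z in zip(wstar, zs)) / sum(wstar)
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0    # heterogeneity
    return math.tanh(z_re), i2                             # back-transform

r, i2 = pool_correlations([0.10, 0.25, 0.05], [120, 200, 90])
```

    I² is the share of total variability attributed to between-study heterogeneity, the quantity reported as 55.5% in the abstract.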

  17. Prevalence and risk factors for foot and mouth disease infection in small ruminants in Israel.

    PubMed

    Elnekave, Ehud; van Maanen, Kees; Shilo, Hila; Gelman, Boris; Storm, Nick; Berdenstain, Svetlane; Berke, Olaf; Klement, Eyal

    2016-03-01

    During the last decade, 27% of the foot and mouth disease (FMD) outbreaks in Israel affected small ruminant (SR) farms. FMD outbreaks recur in Israel despite vaccination of all livestock and application of control measures. We performed a cross-sectional serological study aimed at estimating the prevalence of FMD infection in SR in Israel and the possible risk factors for infection. Overall, 2305 samples from adult sheep (n=1948) and goats (n=357) were collected during 2011-14 in two separate surveys. One survey was based on random sampling of intensive management system farms; the other was originally aimed at the detection of Brucella melitensis at extensive and semi-intensive management system farms. Sera were tested by NS blocking ELISA (PrioCHECK(®)). The serological prevalence of antibodies against non-structural proteins (NSP) of FMD virus was estimated at 3.7% (95% confidence interval (CI95%)=3.0%-4.5%). Additionally, a significantly lower infection prevalence (p value=0.049) of 1.0% (CI95%=0.1%-3.6%) was found in a small sample (197 sera) of young SR collected during 2012. The positive samples from adult SR were scattered all over Israel, though two significant infection clusters were found by the spatial scan statistic. Occurrence of an outbreak on a non-SR farm within 5 km was associated with a fifteen-fold increase in the risk of FMD infection of SR in the univariable analysis; this variable was, however, not included in the multivariable analysis due to collinearity with the other independent variables. Multivariable logistic regression modeling found significantly negative associations (p value<0.05) of grazing, and of being in a herd larger than 500 animals, with risk of infection. Grazing herds and herds larger than 500 animals both represent farms that are intensively or semi-intensively managed. Stricter bio-safety, fewer introductions of new animals, and higher vaccination compliance on these farms may explain their lower risk of infection by FMD virus. We conclude that despite the wide distribution of infection among SR farms, the low farm-level prevalence indicates that SR play only a limited role in the transmission and dissemination of FMD in Israel. This conclusion may be applicable to other endemic countries in which, as in Israel, all livestock are vaccinated against FMD. Copyright © 2016 Elsevier B.V. All rights reserved.
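    A seroprevalence point estimate and interval like the one reported can be computed with a Wilson score interval for a binomial proportion. The count of 85 positives below is an assumption chosen to match the reported 3.7% of 2305 samples, not a figure taken from the paper.

```python
import math

def wilson_ci(positives, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = positives / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return center - half, center + half

# ~85/2305 positives (assumed) reproduces the reported 3.7% prevalence
lo, hi = wilson_ci(85, 2305)
```

    With these assumed counts the interval comes out close to the reported 3.0%-4.5%; the Wilson form behaves better than the plain Wald interval at small prevalences like the 1.0% seen in young SR.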

  18. Colonic stem cell data are consistent with the immortal model of stem cell division under non-random strand segregation.

    PubMed

    Walters, K

    2009-06-01

    Colonic stem cells are thought to reside towards the base of crypts of the colon, but their numbers and proliferation mechanisms are not well characterized. A defining property of stem cells is that they are able to divide asymmetrically, but it is not known whether they always divide asymmetrically (immortal model) or whether there are occasional symmetrical divisions (stochastic model). By measuring the diversity of methylation patterns in colon crypt samples, a recent study found evidence in favour of the stochastic model, assuming random segregation of stem cell DNA strands during cell division. Here, preferential segregation of the template strand, consistent with the 'immortal strand hypothesis', is considered, and its effect on the conclusions of previously published results is explored. For a sample of crypts, it is shown how, under the immortal model, to calculate the mean and variance of the number of unique methylation patterns allowing for non-random strand segregation, and to compare them with those observed. The calculated mean and variance are consistent with an immortal model that incorporates non-random strand segregation for a range of stem cell numbers and levels of preferential strand segregation. Allowing for preferential strand segregation considerably alters previously published conclusions relating to stem cell numbers and turnover mechanisms. Evidence in favour of the stochastic model may not be as strong as previously thought.

  19. Dacomitinib versus erlotinib in patients with advanced-stage, previously treated non-small-cell lung cancer (ARCHER 1009): a randomised, double-blind, phase 3 trial.

    PubMed

    Ramalingam, Suresh S; Jänne, Pasi A; Mok, Tony; O'Byrne, Kenneth; Boyer, Michael J; Von Pawel, Joachim; Pluzanski, Adam; Shtivelband, Mikhail; Docampo, Lara Iglesias; Bennouna, Jaafar; Zhang, Hui; Liang, Jane Q; Doherty, Jim P; Taylor, Ian; Mather, Cecile B; Goldberg, Zelanna; O'Connell, Joseph; Paz-Ares, Luis

    2014-11-01

    Dacomitinib is an irreversible pan-EGFR family tyrosine kinase inhibitor. Findings from a phase 2 study in non-small cell lung cancer showed favourable efficacy for dacomitinib compared with erlotinib. We aimed to compare dacomitinib with erlotinib in a phase 3 study. In a randomised, multicentre, double-blind phase 3 trial in 134 centres in 23 countries, we enrolled patients who had locally advanced or metastatic non-small-cell lung cancer, progression after one or two previous regimens of chemotherapy, Eastern Cooperative Oncology Group (ECOG) performance status of 0-2, and presence of measurable disease. We randomly assigned patients in a 1:1 ratio to dacomitinib (45 mg/day) or erlotinib (150 mg/day) with matching placebo. Treatment allocation was masked to the investigator, patient, and study funder. Randomisation was stratified by histology (adenocarcinoma vs non-adenocarcinoma), ethnic origin (Asian vs non-Asian and Indian sub-continent), performance status (0-1 vs 2), and smoking status (never-smoker vs ever-smoker). The coprimary endpoints were progression-free survival per independent review for all randomly assigned patients, and for all randomly assigned patients with KRAS wild-type tumours. The study has completed accrual and is registered with ClinicalTrials.gov, number NCT01360554. Between June 22, 2011, and March 12, 2013, we enrolled 878 patients and randomly assigned 439 to dacomitinib (256 KRAS wild type) and 439 (263 KRAS wild type) to erlotinib. Median progression-free survival was 2·6 months (95% CI 1·9-2·8) in both the dacomitinib group and the erlotinib group (stratified hazard ratio [HR] 0·941, 95% CI 0·802-1·104, one-sided log-rank p=0·229). For patients with wild-type KRAS, median progression-free survival was 2·6 months for dacomitinib (95% CI 1·9-2·9) and erlotinib (95% CI 1·9-3·0; stratified HR 1·022, 95% CI 0·834-1·253, one-sided p=0·587). 
In patients who received at least one dose of study drug, the most frequent grade 3-4 adverse events were diarrhoea (47 [11%] patients in the dacomitinib group vs ten [2%] patients in the erlotinib group), rash (29 [7%] vs 12 [3%]), and stomatitis (15 [3%] vs two [<1%]). Serious adverse events were reported in 52 (12%) patients receiving dacomitinib and 40 (9%) patients receiving erlotinib. Irreversible EGFR inhibition with dacomitinib was not superior to erlotinib in an unselected patient population with advanced non-small-cell lung cancer or in patients with KRAS wild-type tumours. Further study of irreversible EGFR inhibitors should be restricted to patients with activating EGFR mutations. Pfizer. Copyright © 2014 Elsevier Ltd. All rights reserved.
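    Median progression-free survival figures like those above are read off Kaplan-Meier curves, which handle censored follow-up. A minimal estimator sketch with toy (invented) data, not the trial's patient-level records:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate from (time, event) data.

    times: follow-up (e.g. months); events: 1 = progression/death observed,
    0 = censored. Returns [(time, S(t))] at each time with >= 1 event.
    """
    at_risk = len(times)
    s = 1.0
    curve = []
    data = sorted(zip(times, events))
    i = 0
    while i < len(data):
        t = data[i][0]
        d = n_t = 0
        while i < len(data) and data[i][0] == t:   # group ties at one time point
            d += data[i][1]
            n_t += 1
            i += 1
        if d:
            s *= 1.0 - d / at_risk                 # multiplicative KM step
            curve.append((t, s))
        at_risk -= n_t                             # events + censored leave risk set
    return curve

km = kaplan_meier([2, 3, 3, 5, 8, 8], [1, 1, 0, 1, 0, 1])
```

    The median PFS is the first time at which S(t) drops to 0.5 or below; the stratified hazard ratio quoted in the abstract additionally requires a Cox or log-rank comparison of two such curves.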

  20. Design of Phase II Non-inferiority Trials.

    PubMed

    Jung, Sin-Ho

    2017-09-01

    With the development of inexpensive treatment regimens and less invasive surgical procedures, we are increasingly confronted with non-inferiority study objectives. A non-inferiority phase III trial requires a sample size roughly four times larger than that of a comparable superiority trial. Because of the large required sample size, we often face feasibility issues in opening a non-inferiority trial. Furthermore, for lack of phase II non-inferiority trial design methods, we have no opportunity to investigate the efficacy of the experimental therapy through a phase II trial. As a result, we often fail to open a non-inferiority phase III trial, and a large number of non-inferiority clinical questions remain unanswered. In this paper, we develop designs for non-inferiority randomized phase II trials with feasible sample sizes. First, we review a design method for non-inferiority phase III trials. We then propose three different designs for non-inferiority phase II trials that can be used under different settings. Each method is demonstrated with examples, and each of the proposed design methods is shown to require a reasonable sample size for non-inferiority phase II trials. The three designs are used under different settings but require similar sample sizes that are typical for phase II trials.
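    The "roughly four times" figure can be seen from the standard normal-approximation sample-size formula: with the conventional one-sided 2.5% non-inferiority test, the z-multipliers match the two-sided 5% superiority test, so the per-arm n scales by (d/δ)², which is 4 when the margin δ is half the superiority effect d. A sketch with illustrative numbers (the effect sizes are assumptions for the example):

```python
from statistics import NormalDist

def n_per_arm(effect, sigma=1.0, alpha=0.05, power=0.8, one_sided=False):
    """Per-arm sample size (normal approximation), two-arm comparison of means."""
    z_a = NormalDist().inv_cdf(1 - (alpha if one_sided else alpha / 2))
    z_b = NormalDist().inv_cdf(power)
    return 2 * (z_a + z_b) ** 2 * (sigma / effect) ** 2

# Superiority: detect a difference of d = 0.5 SD, two-sided 5% alpha.
n_sup = n_per_arm(effect=0.5)
# Non-inferiority: margin d/2 = 0.25 SD, conventional one-sided 2.5% alpha.
n_ni = n_per_arm(effect=0.25, alpha=0.025, one_sided=True)
ratio = n_ni / n_sup   # -> 4.0: the "roughly four times larger" sample size
```

    The factor is exactly (d/δ)² under these conventions; "roughly four" in practice reflects the typical choice of a margin near half the superiority effect.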

  1. Efficacy of educational video game versus traditional educational apps at improving physician decision making in trauma triage: randomized controlled trial.

    PubMed

    Mohan, Deepika; Farris, Coreen; Fischhoff, Baruch; Rosengart, Matthew R; Angus, Derek C; Yealy, Donald M; Wallace, David J; Barnato, Amber E

    2017-12-12

    To determine whether a behavioral intervention delivered through a video game can improve the appropriateness of trauma triage decisions in the emergency department of non-trauma centers. Randomized clinical trial. Online intervention in national sample of emergency medicine physicians who make triage decisions at US hospitals. 368 emergency medicine physicians primarily working at non-trauma centers. A random sample (n=200) of those with primary outcome data was reassessed at six months. Physicians were randomized in a 1:1 ratio to one hour of exposure to an adventure video game (Night Shift) or apps based on traditional didactic education (myATLS and Trauma Life Support MCQ Review), both on iPads. Night Shift was developed to recalibrate the process of using pattern recognition to recognize moderate-severe injuries (representativeness heuristics) through the use of stories to promote behavior change (narrative engagement). Physicians were randomized with a 2×2 factorial design to intervention (game v traditional education apps) and then to the experimental condition under which they completed the outcome assessment tool (low v high cognitive load). Blinding could not be maintained after allocation but group assignment was masked during the analysis phase. Outcomes of a virtual simulation that included 10 cases; in four of these the patients had severe injuries. Participants completed the simulation within four weeks of their intervention. Decisions to admit, discharge, or transfer were measured. The proportion of patients under-triaged (patients with severe injuries not transferred to a trauma center) was calculated then (primary outcome) and again six months later, with a different set of cases (primary outcome of follow-up study). The secondary outcome was effect of cognitive load on under-triage. 149 (81%) physicians in the game arm and 148 (80%) in the traditional education arm completed the trial. 
Of these, 64/100 (64%) and 58/100 (58%), respectively, completed reassessment at six months. The mean age was 40 (SD 8.9), 283 (96%) were trained in emergency medicine, and 207 (70%) were ATLS (advanced trauma life support) certified. Physicians exposed to the game under-triaged fewer severely injured patients than those exposed to didactic education (316/596 (0.53) v 377/592 (0.64), estimated difference 0.11, 95% confidence interval 0.05 to 0.16; P<0.001). Cognitive load did not influence under-triage (161/308 (0.53) v 155/288 (0.54) in the game arm; 197/300 (0.66) v 180/292 (0.62) in the traditional educational apps arm; P=0.66). At six months, physicians exposed to the game remained less likely to under-triage patients (146/256 (0.57) v 172/232 (0.74), estimated difference 0.17, 0.09 to 0.25; P<0.001). No physician reported side effects. The sample might not reflect all emergency medicine physicians, and a small set of cases was used to assess performance. Compared with apps based on traditional didactic education, exposure of physicians to a theoretically grounded video game improved triage decision making in a validated virtual simulation. Though the observed effect was large, the wide confidence intervals include the possibility of a small benefit, and the real world efficacy of this intervention remains uncertain. clinicaltrials.gov; NCT02857348 (initial study)/NCT03138304 (follow-up). Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
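    The trial's primary effect estimate can be reproduced from the raw counts given above with a Wald interval for a difference of two independent proportions; the helper name below is illustrative, not the authors' code.

```python
import math

def prop_diff_ci(x1, n1, x2, n2, z=1.96):
    """Wald 95% CI for the difference of two independent proportions."""
    p1, p2 = x1 / n1, x2 / n2
    d = p2 - p1
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return d, d - z * se, d + z * se

# Under-triage: game arm 316/596 vs traditional apps 377/592 (trial's counts)
diff, lo, hi = prop_diff_ci(316, 596, 377, 592)
```

    This recovers the reported estimated difference of 0.11 with 95% confidence interval 0.05 to 0.16.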

  2. Efficacy of educational video game versus traditional educational apps at improving physician decision making in trauma triage: randomized controlled trial

    PubMed Central

    Farris, Coreen; Fischhoff, Baruch; Rosengart, Matthew R; Angus, Derek C; Yealy, Donald M; Wallace, David J; Barnato, Amber E

    2017-01-01

    Abstract Objective To determine whether a behavioral intervention delivered through a video game can improve the appropriateness of trauma triage decisions in the emergency department of non-trauma centers. Design Randomized clinical trial. Setting Online intervention in national sample of emergency medicine physicians who make triage decisions at US hospitals. Participants 368 emergency medicine physicians primarily working at non-trauma centers. A random sample (n=200) of those with primary outcome data was reassessed at six months. Interventions Physicians were randomized in a 1:1 ratio to one hour of exposure to an adventure video game (Night Shift) or apps based on traditional didactic education (myATLS and Trauma Life Support MCQ Review), both on iPads. Night Shift was developed to recalibrate the process of using pattern recognition to recognize moderate-severe injuries (representativeness heuristics) through the use of stories to promote behavior change (narrative engagement). Physicians were randomized with a 2×2 factorial design to intervention (game v traditional education apps) and then to the experimental condition under which they completed the outcome assessment tool (low v high cognitive load). Blinding could not be maintained after allocation but group assignment was masked during the analysis phase. Main outcome measures Outcomes of a virtual simulation that included 10 cases; in four of these the patients had severe injuries. Participants completed the simulation within four weeks of their intervention. Decisions to admit, discharge, or transfer were measured. The proportion of patients under-triaged (patients with severe injuries not transferred to a trauma center) was calculated then (primary outcome) and again six months later, with a different set of cases (primary outcome of follow-up study). The secondary outcome was effect of cognitive load on under-triage. 
Results 149 (81%) physicians in the game arm and 148 (80%) in the traditional education arm completed the trial. Of these, 64/100 (64%) and 58/100 (58%), respectively, completed reassessment at six months. The mean age was 40 (SD 8.9), 283 (96%) were trained in emergency medicine, and 207 (70%) were ATLS (advanced trauma life support) certified. Physicians exposed to the game under-triaged fewer severely injured patients than those exposed to didactic education (316/596 (0.53) v 377/592 (0.64), estimated difference 0.11, 95% confidence interval 0.05 to 0.16; P<0.001). Cognitive load did not influence under-triage (161/308 (0.53) v 155/288 (0.54) in the game arm; 197/300 (0.66) v 180/292 (0.62) in the traditional educational apps arm; P=0.66). At six months, physicians exposed to the game remained less likely to under-triage patients (146/256 (0.57) v 172/232 (0.74), estimated difference 0.17, 0.09 to 0.25; P<0.001). No physician reported side effects. The sample might not reflect all emergency medicine physicians, and a small set of cases was used to assess performance. Conclusions Compared with apps based on traditional didactic education, exposure of physicians to a theoretically grounded video game improved triage decision making in a validated virtual simulation. Though the observed effect was large, the wide confidence intervals include the possibility of a small benefit, and the real world efficacy of this intervention remains uncertain. Trial registration clinicaltrials.gov; NCT02857348 (initial study)/NCT03138304 (follow-up). PMID:29233854

  3. Effect of Exercise Training on Non-Exercise Physical Activity: A Systematic Review and Meta-Analysis of Randomized Controlled Trials.

    PubMed

    Fedewa, Michael V; Hathaway, Elizabeth D; Williams, Tyler D; Schmidt, Michael D

    2017-06-01

    Many overweight and obese individuals use exercise when attempting to lose weight. However, the improvements in weight and body composition are often far less than expected. Levels of physical activity outside of the structured exercise program are believed to change and may be responsible for the unsuccessful weight loss. The purpose of this meta-analysis was to provide a quantitative estimate of the change in non-exercise physical activity (NEPA) during exercise interventions. All studies included in the meta-analysis were peer-reviewed and published in English. Participants were randomized to a non-exercise comparison group or exercise training group with an intervention lasting ≥2 weeks. NEPA was measured at baseline and at various times during the study. Hedges' d effect size (ES) was used to adjust for small sample bias, and random-effects models were used to calculate the mean ES and explore potential moderators. The cumulative results of 44 effects gathered from ten studies published between 1997 and 2015 indicated that NEPA did not change significantly during exercise training (ES = 0.02, 95% confidence interval [CI] -0.09 to 0.13; p = 0.723). Duration of the exercise session (β = -0.0039), intervention length (β = 0.0543), and an age × sex (β = -0.0005) interaction indicated that the increase in NEPA may be attenuated in older women during exercise training and during shorter exercise interventions with longer sessions (all p < 0.005). On average, no statistically or clinically significant mean change in NEPA occurs during exercise training. However, session duration and intervention length, age, and sex should be accounted for when designing exercise programs to improve long-term sustainability and improve the likelihood of weight loss success, as the initial decrease in NEPA appears to dissipate with continued training.
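    The small-sample adjustment mentioned above (Hedges' d, often called Hedges' g) multiplies Cohen's d by a correction factor J = 1 - 3/(4·df - 1). A minimal sketch with invented group summaries, not values from the meta-analysis:

```python
import math

def hedges_g(mean1, sd1, n1, mean2, sd2, n2):
    """Hedges' g: Cohen's d with the small-sample bias correction J."""
    # pooled standard deviation across the two groups
    sp = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / sp                  # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2 - 2) - 1)       # small-sample correction factor
    return d * j

g = hedges_g(10.5, 2.0, 12, 9.5, 2.0, 12)     # small groups shrink |g| below |d|
```

    The correction matters most for the small trial arms common in exercise studies; as n grows, J approaches 1 and g converges to d.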

  4. Tumor marker analyses from the phase III, placebo-controlled, FASTACT-2 study of intercalated erlotinib with gemcitabine/platinum in the first-line treatment of advanced non-small-cell lung cancer.

    PubMed

    Mok, Tony; Ladrera, Guia; Srimuninnimit, Vichien; Sriuranpong, Virote; Yu, Chong-Jen; Thongprasert, Sumitra; Sandoval-Tan, Jennifer; Lee, Jin Soo; Fuerte, Fatima; Shames, David S; Klughammer, Barbara; Truman, Matt; Perez-Moreno, Pablo; Wu, Yi-Long

    2016-08-01

    The FASTACT-2 study of intercalated erlotinib with chemotherapy in Asian patients found that EGFR mutations were the main driver behind the significant progression-free survival (PFS) benefit noted in the overall population. Further exploratory biomarker analyses were conducted to provide additional insight. This multicenter, randomized, placebo-controlled, double-blind, phase III study investigated intercalated first-line erlotinib or placebo with gemcitabine/platinum, followed by maintenance erlotinib or placebo, for patients with stage IIIB/IV non-small cell lung cancer (NSCLC). Provision of samples for biomarker analysis was encouraged but not mandatory. The following biomarkers were analyzed (in order of priority): EGFR mutation by cobas(®) test, KRAS mutation by cobas(®) KRAS test, HER2 by immunohistochemistry (IHC), HER3 by IHC, ERCC1 by IHC, EGFR gene copy number by fluorescence in-situ hybridization (FISH) and EGFR by IHC. All subgroups were assessed for PFS (primary endpoint), overall survival (OS), non-progression rate and objective response rate. Overall, 256 patients provided samples for analysis. Considerable overlap was noted among biomarkers, except for EGFR and KRAS mutations, which are mutually exclusive. Other than EGFR mutations (p<0.0001), no other biomarkers were significantly predictive of outcomes in a treatment-by-biomarker interaction test, although ERCC1 IHC-positive status was predictive of improved OS for the erlotinib arm versus placebo in EGFR wild-type patients (median 18.4 vs 9.5 months; hazard ratio [HR]=0.32, 95% confidence interval [CI]: 0.14-0.69, p=0.0024). Activating EGFR mutations were predictive of improved treatment outcomes with a first-line intercalated regimen of chemotherapy and erlotinib in NSCLC. ERCC1 status may have some predictive value in EGFR wild-type disease, but requires further investigation. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  5. Deep ensemble learning of virtual endoluminal views for polyp detection in CT colonography

    NASA Astrophysics Data System (ADS)

    Umehara, Kensuke; Näppi, Janne J.; Hironaka, Toru; Regge, Daniele; Ishida, Takayuki; Yoshida, Hiroyuki

    2017-03-01

    Robust training of a deep convolutional neural network (DCNN) requires a very large number of annotated datasets, which are currently not available in CT colonography (CTC). We previously demonstrated that deep transfer learning provides an effective approach for robust application of a DCNN in CTC. However, at high detection accuracy, the differentiation of small polyps from non-polyps was still challenging. In this study, we developed and evaluated a deep ensemble learning (DEL) scheme for the review of virtual endoluminal (VE) images to improve the performance of computer-aided detection (CADe) of polyps in CTC. Nine different types of image renderings were generated from VE images of polyp candidates detected by a conventional CADe system. Eleven DCNNs representing three types of publicly available pre-trained DCNN models were re-trained by transfer learning to identify polyps from the VE images. A DEL scheme that determines the final detected polyps by a review of the nine types of VE images was developed by combining the DCNNs using a random forest classifier as a meta-classifier. For evaluation, we sampled 154 CTC cases from a large CTC screening trial and divided the cases randomly into a training dataset and a test dataset. At 3.9 false-positive (FP) detections per patient on average, the detection sensitivities of the conventional CADe system, the highest-performing single DCNN, and the DEL scheme were 81.3%, 90.7%, and 93.5%, respectively, for polyps ≥6 mm in size. For small polyps, the DEL scheme reduced the number of false positives by up to 83% over that of a single DCNN alone. These preliminary results indicate that the DEL scheme provides an effective approach for improving the polyp detection performance of CADe in CTC, especially for small polyps.

  6. Weekly and every 2 weeks cetuximab maintenance therapy after platinum-based chemotherapy plus cetuximab as first-line treatment for non-small cell lung cancer: randomized non-comparative phase IIIb NEXT trial.

    PubMed

    Heigener, David F; Pereira, José Rodrigues; Felip, Enriqueta; Mazal, Juraj; Manzyuk, Lyudmila; Tan, Eng Huat; Merimsky, Ofer; Sarholz, Barbara; Esser, Regina; Gatzemeier, Ulrich

    2015-06-01

The First-Line Erbitux in Lung Cancer (FLEX) trial showed that the addition of cetuximab to chemotherapy followed by weekly cetuximab maintenance significantly improved survival in the first-line treatment of advanced non-small cell lung cancer (NSCLC). The phase IIIb NSCLC Erbitux Trial (NEXT) trial (NCT00820755) investigated the efficacy and safety of weekly and every-2-weeks cetuximab maintenance therapy in this setting. Patients were treated with platinum-based chemotherapy plus cetuximab, and those progression-free after four to six cycles were randomized to every-2-weeks (500 mg/m²) or weekly (250 mg/m²) cetuximab maintenance. Randomization was stratified by tumor histology and response status. The primary endpoint for a regimen would be reached if the lower boundary of the 95% confidence interval (CI) for the 1-year survival rate exceeded 55%. A planned 480 patients were to be randomized. However, enrollment was curtailed following a negative opinion from the European Medicines Agency with regard to the use of cetuximab in this setting. After combination therapy, 311/583 (53.3%) patients without progression were randomized to maintenance therapy: 157 to every-2-weeks cetuximab and 154 to weekly cetuximab. Baseline characteristics were balanced between these groups, and exposure to cetuximab was similar. The 1-year survival rate was 62.8% (95% CI 54.7-70.0) for every-2-weeks cetuximab and 64.4% (95% CI 56.2-71.4) for weekly cetuximab. Safety profiles were similar, manageable, and in line with expectations. Therefore, in patients with advanced NSCLC who were progression-free after four to six cycles of first-line chemotherapy plus cetuximab, weekly and every-2-weeks cetuximab maintenance therapy were associated with similar survival outcomes.

  7. Speeding Up Non-Parametric Bootstrap Computations for Statistics Based on Sample Moments in Small/Moderate Sample Size Applications

    PubMed Central

    Chaibub Neto, Elias

    2015-01-01

In this paper we propose a vectorized implementation of the non-parametric bootstrap for statistics based on sample moments. Basically, we adopt the multinomial sampling formulation of the non-parametric bootstrap, and compute bootstrap replications of sample moment statistics by simply weighting the observed data according to multinomial counts instead of evaluating the statistic on a resampled version of the observed data. Using this formulation we can generate a matrix of bootstrap weights and compute the entire vector of bootstrap replications with a few matrix multiplications. Vectorization is particularly important for matrix-oriented programming languages such as R, where matrix/vector calculations tend to be faster than scalar operations implemented in a loop. We illustrate the application of the vectorized implementation in real and simulated data sets, when bootstrapping Pearson's sample correlation coefficient, and compare its performance against two state-of-the-art R implementations of the non-parametric bootstrap, as well as a straightforward one based on a for loop. Our investigations spanned varying sample sizes and numbers of bootstrap replications. The vectorized bootstrap compared favorably against the state-of-the-art implementations in all cases tested, and was remarkably faster for small sample sizes and considerably faster for moderate ones. The same results were observed in the comparison with the straightforward implementation, except for large sample sizes, where the vectorized bootstrap was slightly slower due to the increased time spent generating weight matrices via multinomial sampling. PMID:26125965
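The multinomial-weighting trick described above is compact enough to sketch directly. The following is a minimal illustration in Python/NumPy rather than the paper's R code (function and variable names are ours): each row of the weight matrix is a scaled multinomial draw, and every bootstrap replication of Pearson's correlation comes out of a handful of matrix products, with no data physically resampled.

```python
import numpy as np

rng = np.random.default_rng(0)

def vectorized_bootstrap_corr(x, y, B=1000):
    """Bootstrap replications of Pearson's correlation via multinomial
    weights: each bootstrap sample is a row of weights, and all sample
    moments come from matrix products."""
    n = len(x)
    # B rows of multinomial counts, scaled so each row sums to 1
    W = rng.multinomial(n, np.full(n, 1.0 / n), size=B) / n
    mx, my = W @ x, W @ y                                   # first moments
    mxx, myy, mxy = W @ (x * x), W @ (y * y), W @ (x * y)   # second moments
    cov = mxy - mx * my
    sx = np.sqrt(mxx - mx * mx)
    sy = np.sqrt(myy - my * my)
    return cov / (sx * sy)

x = rng.normal(size=50)
y = 0.6 * x + rng.normal(size=50)
reps = vectorized_bootstrap_corr(x, y, B=2000)
```

Since nothing is resampled, the cost per replication is dominated by the matrix multiplications, which is exactly where vectorized languages are fastest.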

  8. A retrospective analysis of RET translocation, gene copy number gain and expression in NSCLC patients treated with vandetanib in four randomized Phase III studies.

    PubMed

    Platt, Adam; Morten, John; Ji, Qunsheng; Elvin, Paul; Womack, Chris; Su, Xinying; Donald, Emma; Gray, Neil; Read, Jessica; Bigley, Graham; Blockley, Laura; Cresswell, Carl; Dale, Angela; Davies, Amanda; Zhang, Tianwei; Fan, Shuqiong; Fu, Haihua; Gladwin, Amanda; Harrod, Grace; Stevens, James; Williams, Victoria; Ye, Qingqing; Zheng, Li; de Boer, Richard; Herbst, Roy S; Lee, Jin-Soo; Vasselli, James

    2015-03-23

To determine the prevalence of RET rearrangements, RET copy number gains and RET expression in tumor samples from four Phase III non-small-cell lung cancer (NSCLC) trials of vandetanib, a selective inhibitor of VEGFR, RET and EGFR signaling, and to determine any association with outcome of vandetanib treatment. Archival tumor samples from the ZODIAC (NCT00312377, vandetanib ± docetaxel), ZEAL (NCT00418886, vandetanib ± pemetrexed), ZEPHYR (NCT00404924, vandetanib vs placebo) and ZEST (NCT00364351, vandetanib vs erlotinib) studies were evaluated by fluorescence in situ hybridization (FISH) and immunohistochemistry (IHC) in 944 and 1102 patients, respectively. The prevalence of RET rearrangements by FISH was 0.7% (95% CI 0.3-1.5%) among patients with a known result. Seven tumor samples were positive for RET rearrangements (vandetanib, n = 3; comparator, n = 4). 2.8% (n = 26) of samples had RET amplification (innumerable RET clusters, or ≥7 copies in >10% of tumor cells), 8.1% (n = 76) had low RET gene copy number gain (4-6 copies in ≥40% of tumor cells) and 8.3% (n = 92) were RET expression positive (signal intensity ++ or +++ in >10% of tumor cells). Of RET-rearrangement-positive patients, none had an objective response in the vandetanib arm and one patient responded in the comparator arm. Radiologic evidence of tumor shrinkage was observed in two patients treated with vandetanib and one treated with the comparator drug. The objective response rate was similar in the vandetanib and comparator arms for patients positive for RET copy number gains or RET protein expression. We have identified the prevalence of three RET biomarkers in a population composed predominantly of non-Asian patients and smokers. RET rearrangement prevalence was lower than previously reported. We found no evidence of a differential efficacy benefit by IHC status or RET gene copy number gains. The low prevalence of RET rearrangements (0.7%) prevents firm conclusions regarding the association of vandetanib treatment with efficacy in the RET-rearrangement NSCLC subpopulation. Randomized Phase III clinical trials: NCT00312377 (ZODIAC); NCT00418886 (ZEAL); NCT00364351 (ZEST); NCT00404924 (ZEPHYR).

  9. A Geology Sampling System for Small Bodies

    NASA Technical Reports Server (NTRS)

    Hood, A. D.; Naids, A. J.; Graff, T.; Abell, P.

    2015-01-01

Human exploration of Small Bodies is being investigated as a precursor to a Mars surface mission. Asteroids, comets, dwarf planets, and the moons of Mars all fall into this Small Bodies category, and some are being discussed as potential mission targets. Obtaining geological samples for return to Earth will be a major objective for any mission to a Small Body. Currently the knowledge base for geology sampling in microgravity is in its infancy. Furthermore, humans interacting with non-engineered surfaces in a microgravity environment poses unique challenges. In preparation for such missions, a team at the National Aeronautics and Space Administration (NASA) Johnson Space Center (JSC) has been working to gain experience on how to safely obtain numerous sample types in such an environment. This abstract briefly summarizes the types of samples the science community is interested in, discusses an integrated geology sampling solution, and highlights some of the unique challenges associated with this type of exploration.

  10. Analysis of spatial patterns informs community assembly and sampling requirements for Collembola in forest soils

    NASA Astrophysics Data System (ADS)

    Dirilgen, Tara; Juceviča, Edite; Melecis, Viesturs; Querner, Pascal; Bolger, Thomas

    2018-01-01

The relative importance of niche separation and of non-equilibrial and neutral models of community assembly has been a theme in community ecology for many decades, with none appearing to be applicable under all circumstances. In this study, Collembola species abundances were recorded over eleven consecutive years in a spatially explicit grid and used to examine (i) whether observed beta diversity differed from that expected under conditions of neutrality, (ii) whether sampling points differed in their relative contributions to overall beta diversity, and (iii) the number of samples required to provide comparable estimates of species richness across three forest sites. Neutrality could not be rejected for 26 of the forest-by-year combinations. However, there is a trend toward greater structure in the oldest forest, where beta diversity was greater than predicted by neutrality on five of the eleven sampling dates. The lack of difference between individual- and sample-based rarefaction curves also suggests randomness in the system at this particular scale of investigation. It seems that Collembola communities are not spatially aggregated and that assembly is driven primarily by neutral processes, particularly in the younger two sites. Whether this finding is due to small sample size or to unaccounted-for environmental variables cannot be determined. Variability between dates and sites illustrates the potential for drawing incorrect conclusions if data are collected at a single site and a single point in time.
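The rarefaction curves referred to above have a standard closed form. As a hedged aside, here is a minimal Python sketch of individual-based rarefaction (Hurlbert's expected-richness formula); the abundance counts are invented for illustration, not taken from the study:

```python
from math import comb

def rarefied_richness(counts, m):
    """Expected species richness in a random subsample of m individuals
    (individual-based rarefaction, Hurlbert's formula). A species is
    counted unless the subsample misses all of its individuals."""
    n = sum(counts)
    return sum(1 - comb(n - c, m) / comb(n, m) for c in counts)

abundances = [50, 30, 15, 4, 1]   # invented counts for 5 species
curve = [rarefied_richness(abundances, m) for m in (1, 10, 50, 100)]
```

Plotting such curves for each site at a common subsample size is what makes richness estimates comparable across sites with unequal sampling effort.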

  11. No evidence for MHC class II-based non-random mating at the gametic haplotype in Atlantic salmon.

    PubMed

    Promerová, M; Alavioon, G; Tusso, S; Burri, R; Immler, S

    2017-06-01

    Genes of the major histocompatibility complex (MHC) are a likely target of mate choice because of their role in inbreeding avoidance and potential benefits for offspring immunocompetence. Evidence for female choice for complementary MHC alleles among competing males exists both for the pre- and the postmating stages. However, it remains unclear whether the latter may involve non-random fusion of gametes depending on gametic haplotypes resulting in transmission ratio distortion or non-random sequence divergence among fused gametes. We tested whether non-random gametic fusion of MHC-II haplotypes occurs in Atlantic salmon Salmo salar. We performed in vitro fertilizations that excluded interindividual sperm competition using a split family design with large clutch sample sizes to test for a possible role of the gametic haplotype in mate choice. We sequenced two MHC-II loci in 50 embryos per clutch to assess allelic frequencies and sequence divergence. We found no evidence for transmission ratio distortion at two linked MHC-II loci, nor for non-random gamete fusion with respect to MHC-II alleles. Our findings suggest that the gametic MHC-II haplotypes play no role in gamete association in Atlantic salmon and that earlier findings of MHC-based mate choice most likely reflect choice among diploid genotypes. We discuss possible explanations for these findings and how they differ from findings in mammals.

  12. Stability of selected serum proteins after long-term storage in the Janus Serum Bank.

    PubMed

    Gislefoss, Randi E; Grimsrud, Tom K; Mørkrid, Lars

    2009-01-01

Human serum from biobanks is frequently used in prospective epidemiological studies. Long-term storage may modify its composition, so a better understanding of the stability of serum components may improve the interpretation of future studies. The concentrations of selected proteins (immunoglobulins, carrier proteins, and enzymes) in samples stored at -25 degrees C for 25 years and for 2 years were compared with those in 1-month-old samples. For each length of storage time, 130 specimens were randomly selected from apparently healthy male blood donors aged 40-49 years. We examined the distribution of values, compared dispersion and localization of central tendency, and established reference intervals for each component. The study demonstrated non-significant or numerically small group differences in the concentrations of albumin, aspartate aminotransferase, cystatin C, immunoglobulin E, immunoglobulin G, and sex hormone binding globulin. Mean values between fresh and 25-year-old samples suggested larger differences during storage for alanine aminotransferase (-73.4%), creatine kinase (-96.1%), insulin C-peptide (-98.7%), ferritin (-18.5%) and transferrin (+8.2%). The findings showed that long-term storage can introduce a considerable bias for vulnerable components.

  13. Circ-UBR5: An exonic circular RNA and novel small nuclear RNA involved in RNA splicing.

    PubMed

    Qin, Meilin; Wei, Gang; Sun, Xiaomeng

    2018-06-24

Circular RNAs (circRNAs) are a class of non-coding RNAs formed as loops by back-splicing events and are found in all types of organisms. They play important and diverse roles in cell development, growth, and tumorigenesis, but the functions of the majority of circRNAs remain enigmatic; in particular, most circRNAs have no obvious functional phenotype. Here we selected circ-UBR5, a circRNA with no obvious functional phenotype in non-small cell lung cancer (NSCLC) cells in our previous research, to explore its potential function. Differential expression of circ-UBR5 was detected in paired samples of tumorous tissue and adjacent non-tumorous tissue from 59 patients with NSCLC by real-time quantitative reverse transcription-polymerase chain reaction (qRT-PCR). The results showed that circ-UBR5 expression was significantly downregulated in NSCLC tissues (p < 0.001) and was correlated with tumor differentiation (p = 0.00126), suggesting circ-UBR5 might serve as an index of NSCLC differentiation. Our findings indicated that circ-UBR5 can bind the splicing regulatory factors QKI (QKI, KH domain containing RNA binding) and NOVA alternative splicing regulator 1 (NOVA1), as well as U1 small nuclear RNA (snRNA), in the nucleus, revealing that circ-UBR5 might be a novel snRNA involved in the RNA splicing regulatory process. Moreover, we present for the first time a highly efficient strategy for finding specific circRNA-binding proteins using a Human Protein Microarray (Huprot™ Protoarray). Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  14. Statin Treatment and Functional Outcome after Ischemic Stroke: Case-control and Meta-analysis

    PubMed Central

    Biffi, A; Devan, WJ; Anderson, CD; Cortellini, L; Furie, KL; Rosand, J; Rost, NS

    2011-01-01

Background and Purpose Multiple studies suggest that statin use prior to acute ischemic stroke (AIS) is associated with improved functional outcome. However, available evidence is conflicting, and several published reports are limited by small sample sizes. We therefore investigated the effect of antecedent use of statins on stroke outcome by performing a meta-analysis of all results from published studies as well as our own unpublished data. Methods We performed a systematic literature search and meta-analysis of studies investigating the association between pre-stroke statin use and clinical outcome, and included additional data from 126 pre-stroke statin users and 767 non-users enrolled at our institution. A total of 12 studies, comprising 2013 statin users and 9682 non-users, were meta-analyzed using a random-effects model. We also meta-analyzed results for individual TOAST stroke subtypes to determine whether the effect of statin use differed across subtypes, using the Breslow-Day (BD) test. Results Meta-analysis of all available data identified an association between pre-stroke statin use and improved functional outcome (odds ratio = 1.62, 95% confidence interval 1.39-1.88), but we uncovered evidence of publication bias. The effect of statin use on functional outcome was found to be larger for small vessel strokes compared to other subtypes (BD p = 0.008). Conclusions Antecedent use of statins is associated with improved outcome in AIS patients. This association appears to be stronger in patients with the small vessel stroke subtype. However, evidence of publication bias in the existing literature suggests these findings should be interpreted with caution. PMID:21415396
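The random-effects pooling used in meta-analyses like this one is most commonly the DerSimonian-Laird estimator; a minimal Python sketch of that standard method follows (the per-study effect sizes and variances below are invented for illustration, not the study's data):

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate via DerSimonian-Laird.
    effects: per-study effect sizes (e.g. log odds ratios);
    variances: their within-study variances."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]   # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se, tau2

# invented per-study log odds ratios and their variances
pooled, se, tau2 = dersimonian_laird([0.5, 0.4, 0.7, 0.2],
                                     [0.04, 0.09, 0.05, 0.02])
```

When tau2 is estimated as positive, the heterogeneous studies are down-weighted less unevenly than under a fixed-effect model, widening the pooled confidence interval.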

  15. SNP selection and classification of genome-wide SNP data using stratified sampling random forests.

    PubMed

    Wu, Qingyao; Ye, Yunming; Liu, Yang; Ng, Michael K

    2012-09-01

For high-dimensional genome-wide association (GWA) case-control data on complex disease, there is usually a large portion of single-nucleotide polymorphisms (SNPs) that are irrelevant to the disease. A simple random sampling method in a random forest, using the default mtry parameter to choose the feature subspace, will select too many subspaces without informative SNPs. An exhaustive search for an optimal mtry is often required in order to include useful and relevant SNPs and discard the vast number of non-informative ones; however, such a search is too time-consuming for high-dimensional GWA data. The main aim of this paper is to propose a stratified sampling method for feature subspace selection when generating decision trees in a random forest for GWA high-dimensional data. Our idea is to design an equal-width discretization scheme for informativeness that divides SNPs into multiple groups. In feature subspace selection, we randomly select the same number of SNPs from each group and combine them to form a subspace from which to generate a decision tree. This stratified sampling procedure ensures that each subspace contains enough useful SNPs, avoids the very high computational cost of an exhaustive search for an optimal mtry, and maintains the randomness of the random forest. We employ two genome-wide SNP data sets (Parkinson case-control data comprising 408 803 SNPs and Alzheimer case-control data comprising 380 157 SNPs) to demonstrate that the proposed stratified sampling method is effective, and that it generates better random forests, with higher accuracy and a lower error bound, than Breiman's random forest generation method. For the Parkinson data, we also show some interesting genes identified by the method, which may be associated with neurological disorders and merit further biological investigation.
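The core subspace-selection step described above can be sketched in a few lines. This is an illustrative Python/NumPy rendering under our own assumptions (mock informativeness scores, invented group and subspace sizes), not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

def stratified_subspace(scores, n_groups=5, per_group=3):
    """Select a feature subspace by stratified sampling: bin SNPs into
    equal-width groups by an informativeness score, then draw the same
    number of SNPs from every group, so each tree's subspace is
    guaranteed some informative features."""
    edges = np.linspace(scores.min(), scores.max(), n_groups + 1)
    groups = np.clip(np.digitize(scores, edges[1:-1]), 0, n_groups - 1)
    picked = []
    for g in range(n_groups):
        idx = np.flatnonzero(groups == g)
        take = min(per_group, idx.size)  # a sparse group may run short
        picked.extend(rng.choice(idx, size=take, replace=False))
    return np.array(picked)

scores = rng.exponential(size=1000)      # mock per-SNP informativeness
subspace = stratified_subspace(scores)
```

Each tree in the forest would call this with a fresh random draw, preserving the between-tree randomness that random forests rely on.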

  16. Reality Check for the Chinese Microblog Space: A Random Sampling Approach

    PubMed Central

    Fu, King-wa; Chau, Michael

    2013-01-01

Chinese microblogs have drawn global attention to this online application's potential impact on the country's social and political environment. However, representative and reliable statistics on Chinese microbloggers are limited. Using a random sampling approach, this study collected Chinese microblog data from the service provider, analyzing the profiles and patterns of usage for 29,998 microblog accounts. From our analysis, 57.4% (95% CI 56.9%-58.0%) of the accounts' timelines were empty. Among the 12,774 samples with non-zero statuses, 86.9% (95% CI 86.2%-87.4%) did not make an original post in a 7-day study period. By contrast, 0.51% (95% CI 0.4%-0.65%) wrote twenty or more original posts, and 0.45% (95% CI 0.35%-0.60%) reposted more than 40 unique messages within the 7-day period. A small group of microbloggers created the majority of content and drew other users' attention. About 4.8% (95% CI 4.4%-5.2%) of the 12,774 users contributed more than 80% (95% CI 78.6%-80.3%) of the original posts, and about 4.8% (95% CI 4.5%-5.2%) managed to create posts that were reposted or received comments at least once. Moreover, a regression analysis revealed that the volume of followers is a key determinant of creating original microblog posts, reposting messages, being reposted, and receiving comments. The volume of friends is found to be linked only with the number of reposts. Gender differences and regional disparities in the use of microblogs in China are also observed. PMID:23520502
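The interval estimates quoted above are standard binomial-proportion confidence intervals for a simple random sample. A minimal sketch using the normal approximation (the success count here is back-computed from the reported 57.4% and is our assumption; the paper may have used a different interval method, so the bounds need not match its exactly):

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Normal-approximation 95% confidence interval for a proportion
    estimated from a simple random sample of size n."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

# 57.4% of the 29,998 sampled timelines were empty; the count 17,219
# is back-computed from that percentage for illustration
p, lo, hi = proportion_ci(17219, 29998)
```

With n near 30,000 the half-width is about half a percentage point, which is why random sampling can characterize a platform of hundreds of millions of accounts this precisely.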

  17. Rare event simulation in radiation transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kollman, Craig

    1993-10-01

This dissertation studies methods for estimating extremely small probabilities by Monte Carlo simulation. Problems in radiation transport typically involve estimating very rare events or the expected value of a random variable which is, with overwhelming probability, equal to zero. These problems often have high-dimensional state spaces and irregular geometries, so that analytic solutions are not possible. Monte Carlo simulation must be used to estimate the radiation dosage being transported to a particular location. If the area is well shielded, the probability of any one particular particle getting through is very small. Because of the large number of particles involved, even a tiny fraction penetrating the shield may represent an unacceptable level of radiation. It therefore becomes critical to be able to accurately estimate this extremely small probability. Importance sampling is a well-known technique for improving the efficiency of rare event calculations. Here, a new set of probabilities is used in the simulation runs. The results are multiplied by the likelihood ratio between the true and simulated probabilities so as to keep the estimator unbiased. The variance of the resulting estimator is very sensitive to which new set of transition probabilities is chosen. It is shown that a zero-variance estimator does exist, but that its computation requires exact knowledge of the solution. A simple random walk with an associated killing model for the scatter of neutrons is introduced. Large deviation results for optimal importance sampling in random walks are extended to the case where killing is present. An adaptive "learning" algorithm for implementing importance sampling is given for more general Markov chain models of neutron scatter. For finite state spaces this algorithm is shown to give, with probability one, a sequence of estimates converging exponentially fast to the true solution.
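The likelihood-ratio reweighting described here can be illustrated on a toy problem: estimating a Gaussian tail probability of order 10⁻⁵ by sampling from a shifted proposal. This is a sketch of the general technique only, not the dissertation's neutron-transport model:

```python
import math
import random

random.seed(0)

def tail_prob_importance(threshold=4.0, n=100_000):
    """Estimate P(Z > threshold) for standard normal Z by importance
    sampling: draw from N(threshold, 1) so exceedances are common, and
    multiply each hit by the likelihood ratio phi(z)/phi(z - threshold)
    to keep the estimator unbiased."""
    s = threshold                  # shift of the sampling distribution
    total = 0.0
    for _ in range(n):
        z = random.gauss(s, 1.0)
        if z > threshold:
            total += math.exp(-s * z + 0.5 * s * s)  # likelihood ratio
    return total / n

est = tail_prob_importance()       # true value is about 3.17e-5
```

Naive sampling would see roughly three exceedances in 100,000 draws; the shifted proposal sees one about half the time, and the likelihood ratio restores unbiasedness, which is the same mechanism the dissertation analyzes for particles penetrating a shield.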

  18. Kolmogorov complexity, statistical regularization of inverse problems, and Birkhoff's formalization of beauty

    NASA Astrophysics Data System (ADS)

    Kreinovich, Vladik; Longpre, Luc; Koshelev, Misha

    1998-09-01

Most practical applications of statistical methods are based on the implicit assumption that if an event has a very small probability, then it cannot occur. For example, the probability that a kettle placed on a cold stove would start boiling by itself is not 0, it is positive, but it is so small that physicists conclude that such an event is simply impossible. This assumption is difficult to formalize in traditional probability theory, because this theory only describes measures on sets and does not allow us to divide functions into 'random' and non-random ones. This distinction was made possible by the idea of algorithmic randomness, introduced by Kolmogorov and his student Martin-Löf in the 1960s. We show that this idea can also be used for inverse problems. In particular, we prove that for every probability measure, the corresponding set of random functions is compact, and, therefore, the corresponding restricted inverse problem is well-defined. The resulting technique turns out to be interestingly related to the qualitative esthetic measure introduced by G. Birkhoff as order/complexity.
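Birkhoff's esthetic measure mentioned at the end is conventionally written as the ratio of order to complexity:

```latex
M = \frac{O}{C}
```

where $O$ quantifies the order of an object and $C$ its complexity, which is the order/complexity phrasing the abstract alludes to.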

  19. Small incision lenticule extraction (SMILE) versus laser in-situ keratomileusis (LASIK): study protocol for a randomized, non-inferiority trial

    PubMed Central

    2012-01-01

Background Small incision lenticule extraction, or SMILE, is a novel form of 'flapless' corneal refractive surgery that was adapted from refractive lenticule extraction (ReLEx). SMILE uses only one femtosecond laser to complete the refractive surgery, potentially reducing surgical time, side effects, and cost. If successful, SMILE could potentially replace the current, widely practiced laser in-situ keratomileusis, or LASIK. The aim of this study is to evaluate whether SMILE is non-inferior to LASIK in terms of refractive outcomes at 3 months post-operatively. Methods/Design Single tertiary center, parallel group, single-masked, paired-eye design, non-inferiority, randomized controlled trial. Participants who are eligible for LASIK will be enrolled for study after informed consent. Each participant will be randomized to receive SMILE in one eye and LASIK in the other. Our primary hypothesis (stated as null) in this non-inferiority trial is that SMILE differs from LASIK in adults (>21 years old) with myopia (> −3.00 diopters (D)) at a tertiary eye center in terms of refractive predictability at 3 months post-operatively. Our secondary hypothesis (stated as null) is that SMILE differs from LASIK in the same population in terms of other refractive outcomes (efficacy, safety, higher-order aberrations) at 3 months post-operatively. Our primary outcome is refractive predictability, one of several standard refractive outcomes, defined as the proportion of eyes achieving a postoperative spherical equivalent (SE) within ±0.50 D of the intended target. Randomization will be performed using a random allocation sequence generated by a computer with no blocks or restrictions, and implemented by concealing the number-coded surgery within sealed envelopes until just before the procedure. In this single-masked trial, subjects and their caregivers will be masked to the assigned treatment in each eye. Discussion This novel trial will provide information on whether SMILE has comparable, if not superior, refractive outcomes compared to the established LASIK for myopia, thus providing evidence for translation into clinical practice. Trial registration Clinicaltrials.gov NCT01216475. PMID:22647480

  20. GPU-based stochastic-gradient optimization for non-rigid medical image registration in time-critical applications

    NASA Astrophysics Data System (ADS)

    Bhosale, Parag; Staring, Marius; Al-Ars, Zaid; Berendsen, Floris F.

    2018-03-01

Currently, non-rigid image registration algorithms are too computationally intensive to use in time-critical applications. Existing implementations that focus on speed typically address this either by parallelization on GPU hardware or by introducing methodically novel techniques into CPU-oriented algorithms. Stochastic gradient descent (SGD) optimization and variations thereof have proven to drastically reduce the computational burden for CPU-based image registration, but have not been successfully applied on GPU hardware due to their stochastic nature. This paper proposes (1) NiftyRegSGD, an SGD optimization for the GPU-based image registration tool NiftyReg, and (2) the random chunk sampler, a new random sampling strategy that better utilizes the memory bandwidth of GPU hardware. Experiments were performed on 3D lung CT data of 19 patients, comparing NiftyRegSGD (with and without the random chunk sampler) with CPU-based elastix Fast Adaptive SGD (FASGD) and NiftyReg. The registration runtime was 21.5 s, 4.4 s, and 2.8 s for elastix-FASGD, NiftyRegSGD without, and NiftyRegSGD with random chunk sampling, respectively, while similar accuracy was obtained. Our method is publicly available at https://github.com/SuperElastix/NiftyRegSGD.
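The idea behind a chunk sampler, as opposed to fully scattered voxel sampling, can be sketched abstractly: draw random starting offsets and take contiguous runs from each, so that reads from GPU global memory coalesce. This is a hypothetical NumPy illustration of that idea under our own assumptions (function and parameter names are ours), not the actual NiftyRegSGD implementation:

```python
import numpy as np

rng = np.random.default_rng(2)

def random_chunk_sample(n_voxels, n_samples, chunk=256):
    """Draw ~n_samples voxel indices as contiguous chunks rather than
    scattered points: random start offsets, each expanded into a run of
    `chunk` consecutive indices, so GPU memory accesses can coalesce."""
    n_chunks = max(1, n_samples // chunk)
    starts = rng.integers(0, n_voxels - chunk, size=n_chunks)
    return (starts[:, None] + np.arange(chunk)).ravel()

idx = random_chunk_sample(1_000_000, 5_000)
```

The trade-off is slightly correlated samples within each chunk in exchange for much better memory bandwidth utilization, which is where GPU kernels usually bottleneck.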

  1. Effects of clutter on information processing deficits in individuals with hoarding disorder.

    PubMed

    Raines, Amanda M; Timpano, Kiara R; Schmidt, Norman B

    2014-09-01

    Current cognitive behavioral models of hoarding view hoarding as a multifaceted problem stemming from various information processing deficits. However, there is also reason to suspect that the consequences of hoarding may in turn impact or modulate deficits in information processing. The current study sought to expand upon the existing literature by manipulating clutter to examine whether the presence of a cluttered environment affects information processing. Participants included 34 individuals with hoarding disorder. Participants were randomized into a clutter or non-clutter condition and asked to complete various neuropsychological tasks of memory and attention. Results revealed that hoarding severity was associated with difficulties in sustained attention. However, individuals in the clutter condition relative to the non-clutter condition did not experience greater deficits in information processing. Limitations include the cross-sectional design and small sample size. The current findings add considerably to a growing body of literature on the relationships between information processing deficits and hoarding behaviors. Research of this type is integral to understanding the etiology and maintenance of hoarding. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. Conflict Resolution Strategies in Non-Government Secondary Schools in Benue State, Nigeria

    ERIC Educational Resources Information Center

    Oboegbulem, Angie; Alfa, Idoko Alphonusu

    2013-01-01

    This study investigated perceived CRSs (conflict resolution strategies) for the resolution of conflicts in non-government secondary schools in Benue State, Nigeria. Three research questions and three hypotheses guided this study. Proportionate stratified random sampling technique was used in drawing 15% of the population which gave a total of 500…
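The sampling technique named above, proportionate stratified random sampling, draws the same fraction from every stratum so that each stratum is represented in proportion to its size. A minimal Python sketch (the strata here are invented; only the 15% fraction comes from the abstract):

```python
import random

random.seed(5)

def proportionate_stratified_sample(strata, fraction=0.15):
    """Draw the same fraction from every stratum without replacement,
    so the sample mirrors the strata's relative sizes."""
    drawn = []
    for members in strata.values():
        k = round(len(members) * fraction)
        drawn.extend(random.sample(members, k))
    return drawn

# invented strata of school staff IDs; sizes chosen so 15% gives 500
strata = {"urban": list(range(2000)), "rural": list(range(2000, 3334))}
drawn = proportionate_stratified_sample(strata)
```

Compared with simple random sampling, this guarantees that no stratum is under- or over-represented by chance.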

  3. Treatment Rationale and Design for J-SONIC: A Randomized Study of Carboplatin Plus Nab-paclitaxel With or Without Nintedanib for Advanced Non-Small-cell Lung Cancer With Idiopathic Pulmonary Fibrosis.

    PubMed

    Otsubo, Kohei; Kishimoto, Junji; Kenmotsu, Hirotsugu; Minegishi, Yuji; Ichihara, Eiki; Shiraki, Akira; Kato, Terufumi; Atagi, Shinji; Horinouchi, Hidehito; Ando, Masahiko; Kondoh, Yasuhiro; Kusumoto, Masahiko; Ichikado, Kazuya; Yamamoto, Nobuyuki; Nakanishi, Yoichi; Okamoto, Isamu

    2018-01-01

    We describe the treatment rationale and procedure for a randomized study (J-SONIC; University Hospital Medical Information Network Clinical Trials Registry identification no., UMIN000026799) of carboplatin plus nanoparticle albumin-bound paclitaxel (nab-paclitaxel) with or without nintedanib for patients with advanced non-small cell lung cancer (NSCLC) and idiopathic pulmonary fibrosis (IPF). The study was designed to examine the efficacy and safety of nintedanib administered with carboplatin plus nab-paclitaxel versus carboplatin plus nab-paclitaxel alone in chemotherapy-naive patients with advanced NSCLC associated with IPF. Eligible patients (enrollment target, n = 170) will be randomized at a 1:1 ratio to receive 4 cycles of carboplatin (area under the curve, 6 on day 1) plus nab-paclitaxel (100 mg/m 2 on days 1, 8, and 15) administered every 3 weeks either without (arm A) or with (arm B) nintedanib (150 mg twice daily), to be followed in arm B by single-agent administration of nintedanib (150 mg twice daily). The present trial is the first randomized controlled study for the treatment of NSCLC associated with IPF. The goal of the study is to demonstrate that nintedanib combined with carboplatin plus nab-paclitaxel prolongs the interval to acute exacerbation of IPF compared with carboplatin plus nab-paclitaxel alone. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Forced oscillations of cracked beam under the stochastic cyclic loading

    NASA Astrophysics Data System (ADS)

    Matsko, I.; Javors'kyj, I.; Yuzefovych, R.; Zakrzewski, Z.

    2018-05-01

An analysis of forced oscillations of a cracked beam using statistical methods for periodically correlated random processes is presented. The oscillation realizations are obtained on the basis of numerical solutions of second-order differential equations for the case when the applied force is described by a sum of a harmonic component and a stationary random process. It is established that, due to the appearance of a crack, the forced oscillations acquire the properties of second-order periodic non-stationarity. It is shown that in a super-resonance regime the covariance and spectral characteristics, which describe the non-stationary structure of the forced oscillations, are more sensitive to crack growth than the characteristics of the oscillations' deterministic part. Diagnostic indicators formed on their basis allow the detection of small cracks.
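Second-order periodic non-stationarity means the signal's covariance structure, not just its mean, varies periodically over the cycle. A toy Python sketch of one such characteristic, the phase-averaged variance (our own simplified construction on synthetic data, not the paper's estimator):

```python
import numpy as np

rng = np.random.default_rng(4)

def phase_averaged_variance(x, period):
    """Estimate a periodically varying variance by slicing the signal
    into periods and taking the variance across periods at each phase;
    for a periodically correlated process this profile is non-constant."""
    n_periods = len(x) // period
    frames = x[: n_periods * period].reshape(n_periods, period)
    return frames.var(axis=0)

# synthetic noise whose variance oscillates over a 64-sample cycle
t = np.arange(200 * 64)
x = (1.0 + 0.5 * np.sin(2 * np.pi * t / 64)) * rng.normal(size=t.size)
v = phase_averaged_variance(x, 64)
```

A crack-free response would give a flat profile v; a periodic modulation of the variance, as here, is the kind of signature such second-order characteristics pick up before the deterministic part changes noticeably.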

  5. Non-fixation for Conservative Stochastic Dynamics on the Line

    NASA Astrophysics Data System (ADS)

    Basu, Riddhipratim; Ganguly, Shirshendu; Hoffman, Christopher

    2018-03-01

We consider activated random walk (ARW), a model which generalizes the stochastic sandpile, one of the canonical examples of self-organized criticality. Informally, ARW is a particle system on Z with mass conservation. One starts with a mass density {μ > 0} of initially active particles, each of which performs a symmetric random walk at rate one and falls asleep at rate {λ > 0}. Sleepy particles become active on coming in contact with other active particles. We investigate the question of fixation/non-fixation of the process and show that for small enough {λ} the critical mass density for fixation is strictly less than one. Moreover, the critical density goes to zero as {λ} tends to zero. This settles a long-standing open question.
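    The dynamics described informally above can be mimicked by a toy discrete-time simulation on a ring (a finite stand-in for Z). This is an illustrative caricature with assumed parameters, not the authors' construction:

```python
import random

def simulate_arw(n_sites=50, density=0.3, lam=0.1, steps=20000, seed=1):
    """Toy activated random walk (ARW) on a ring of n_sites sites.

    Discrete-time caricature: a randomly chosen active particle either
    jumps to a neighbouring site or, with probability lam/(1+lam), tries
    to fall asleep (allowed only when alone on its site).  A sleeping
    particle wakes when an active particle lands on its site.  The total
    particle number (mass) is conserved throughout."""
    rng = random.Random(seed)
    n_particles = max(1, int(density * n_sites))
    pos = [rng.randrange(n_sites) for _ in range(n_particles)]
    active = [True] * n_particles
    for _ in range(steps):
        movers = [i for i in range(n_particles) if active[i]]
        if not movers:
            break  # everyone is asleep: the configuration has fixated
        i = rng.choice(movers)
        if rng.random() < lam / (1.0 + lam):
            if sum(1 for j in range(n_particles) if pos[j] == pos[i]) == 1:
                active[i] = False  # alone on its site, so it may sleep
            continue
        pos[i] = (pos[i] + rng.choice((-1, 1))) % n_sites
        for j in range(n_particles):
            if not active[j] and pos[j] == pos[i]:
                active[j] = True  # landing on a sleeper wakes it
    return pos, active

positions, active = simulate_arw()
print(sum(active), "of", len(active), "particles still active")
```

    On a finite ring the walk may or may not fall entirely asleep within the step budget; varying `lam` and `density` gives a feel for the fixation/non-fixation transition that the paper analyzes rigorously on Z.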

  6. Community mass treatment with azithromycin for trachoma: Factors associated with change in participation of children from the first to the second round

    PubMed Central

    Ssemanda, Elizabeth N.; Mkocha, Harran; Levens, Joshua; Munoz, Beatriz; West, Sheila K.

    2013-01-01

    Background Mass drug administration (MDA) with azithromycin is an important part of trachoma control programs. Maintaining high participation among children is challenging. Aim We assessed factors identifying households with a child who changed participation from the first MDA to the second MDA compared to households where all children participated at both MDAs. Methods Two case-control comparisons were conducted in 11 Tanzanian communities, which underwent MDA in 2008 and 2009. The first case group (n=165) was a random sample of households with a child who changed from a 2008 non-participant to a 2009 participant (delayed participant). The second case group (n=165) was a random sample of households with a child who went from a 2008 participant to a 2009 non-participant (change to non-participant). Controls (n=330) were a random sample of households where all children participated in both rounds. Risk factors were assessed using questionnaires asked of children’s guardians. Logistic models with a random-intercept were used to estimate odds ratios and 95% confidence intervals. Results Households with delayed participation were more likely to be in communities with fewer treatment days (OR=2.98, 95% CI=1.80–4.92) and assigned to Community Treatment Assistants (CTA) with a wide area to cover (OR=1.88, 95% CI=1.09–3.23). Households with change to non-participation were more likely to live further from the distribution site (OR=3.17, 95% CI=1.19–8.46), have the guardian born outside the village with short-term residency (OR=2.64, 95% CI=1.32–5.31), and be assigned to a male CTA (OR=1.75, 95% CI=1.08–2.83). Conclusions Factors related to program accessibility were associated with delayed participation and maintaining participation. PMID:26462290

  7. TNO/Centaurs grouping tested with asteroid data sets

    NASA Astrophysics Data System (ADS)

    Fulchignoni, M.; Birlan, M.; Barucci, M. A.

    2001-11-01

Recently, we discussed the possible subdivision into a few groups of a sample of 22 TNOs and Centaurs for which BVRIJ photometry was available (Barucci et al., 2001, A&A, 371, 1150). We obtained these results using the multivariate statistics adopted to define the current asteroid taxonomy, namely Principal Components Analysis and the G-mode method (Tholen & Barucci, 1989, in ASTEROIDS II). How do these methods perform with a statistical sample as small as the TNO/Centaurs one? Theoretically, the number of degrees of freedom of the sample is adequate: it is 88 in our case, and it must be larger than 50 to meet the requirements of the G-mode. Does a random sample of a small number of members of a large population contain enough information to reveal some structure in the population? We extracted several samples of 22 asteroids from a database of 86 objects of known taxonomic type for which BVRIJ photometry is available from ECAS (Zellner et al., 1985, ICARUS 61, 355), SMASS II (S.W. Bus, 1999, PhD Thesis, MIT), and the Bell et al. atlas of asteroid infrared spectra. The objects constituting the first sample were selected to give a good representation of the major asteroid taxonomic classes (at least three objects per class): C, S, D, A, and G. Both methods were able to distinguish all these groups, confirming the validity of the adopted methods. The S class is hard to isolate as a consequence of the choice of the I and J variables, which imply a lack of information on the absorption band at 1 micron. The other samples were obtained by random selection of objects. Not all the major groups were well represented (fewer than three objects per group), but the general trend of the asteroid taxonomy was always recovered. We conclude that the quoted grouping of TNO/Centaurs is representative of some physico-chemical structure of the small-body population of the outer solar system.

  8. The non-equilibrium allele frequency spectrum in a Poisson random field framework.

    PubMed

    Kaj, Ingemar; Mugal, Carina F

    2016-10-01

    In population genetic studies, the allele frequency spectrum (AFS) efficiently summarizes genome-wide polymorphism data and shapes a variety of allele frequency-based summary statistics. While existing theory typically features equilibrium conditions, emerging methodology requires an analytical understanding of the build-up of the allele frequencies over time. In this work, we use the framework of Poisson random fields to derive new representations of the non-equilibrium AFS for the case of a Wright-Fisher population model with selection. In our approach, the AFS is a scaling-limit of the expectation of a Poisson stochastic integral and the representation of the non-equilibrium AFS arises in terms of a fixation time probability distribution. The known duality between the Wright-Fisher diffusion process and a birth and death process generalizing Kingman's coalescent yields an additional representation. The results carry over to the setting of a random sample drawn from the population and provide the non-equilibrium behavior of sample statistics. Our findings are consistent with and extend a previous approach where the non-equilibrium AFS solves a partial differential forward equation with a non-traditional boundary condition. Moreover, we provide a bridge to previous coalescent-based work, and hence tie several frameworks together. Since frequency-based summary statistics are widely used in population genetics, for example, to identify candidate loci of adaptive evolution, to infer the demographic history of a population, or to improve our understanding of the underlying mechanics of speciation events, the presented results are potentially useful for a broad range of topics. Copyright © 2016 Elsevier Inc. All rights reserved.
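    For orientation, the equilibrium baseline that these non-equilibrium representations extend is the classical Poisson random field density of Sawyer and Hartl. The formula below is a standard result quoted here for context, not taken from the paper itself: with scaled mutation rate θ and scaled selection coefficient γ, the stationary expected density of sites at population frequency x is

```latex
f(x) \;=\; \theta \,
\frac{1 - e^{-2\gamma(1-x)}}{1 - e^{-2\gamma}}
\cdot \frac{1}{x(1-x)},
\qquad 0 < x < 1 ,
```

    which reduces to the neutral spectrum f(x) = θ/x in the limit γ → 0. The non-equilibrium AFS of the abstract replaces this stationary density with a time-dependent one expressed through a fixation-time probability distribution.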

  9. Dynamical traps in Wang-Landau sampling of continuous systems: Mechanism and solution

    NASA Astrophysics Data System (ADS)

    Koh, Yang Wei; Sim, Adelene Y. L.; Lee, Hwee Kuan

    2015-08-01

We study the mechanism behind the dynamical trapping experienced during Wang-Landau sampling of continuous systems, as reported by several authors. Trapping is caused by the random walker coming close to a local energy extremum, although the mechanism differs from that of the critical slowing-down encountered in conventional molecular dynamics or Monte Carlo simulations. When trapped, the random walker misses an entire stage, or even several stages, of Wang-Landau modification-factor reduction, leading to inadequate sampling of the configuration space and a rough density of states, even though the modification factor has been reduced to very small values. Trapping depends on the specific system, the choice of energy bins, and the Monte Carlo step size, making it highly unpredictable. A general, simple, and effective solution is proposed in which the configurations of multiple parallel Wang-Landau trajectories are interswapped to prevent trapping. We also explain why swapping frees the random walker from such traps. The efficacy of the proposed algorithm is demonstrated.
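    A minimal sketch of the Wang-Landau procedure the abstract refers to, applied here to a toy 1D double-well potential (an assumed example, not one of the systems studied in the paper). The per-stage histogram flatness check and the halving of the modification factor are exactly the ingredients whose failure, when the walker lingers near an energy extremum, produces the rough density of states described above:

```python
import math
import random

def wang_landau(energy, x_min=-2.0, x_max=2.0, n_bins=20, step=0.4,
                sweeps=5000, flat=0.7, seed=0):
    """Minimal Wang-Landau sketch for a 1D continuous system.

    A random walker in x accepts moves with probability
    min(1, g(E)/g(E')), so energy bins get visited roughly uniformly;
    ln g is accumulated with modification factor ln f, which is halved
    whenever the energy histogram is roughly flat."""
    rng = random.Random(seed)
    # establish the energy range from a coarse scan of the potential
    grid = [x_min + i * (x_max - x_min) / 400 for i in range(401)]
    e_lo = min(energy(x) for x in grid)
    e_hi = max(energy(x) for x in grid)

    def e_bin(x):
        e = min(max(energy(x), e_lo), e_hi)
        return min(int((e - e_lo) / (e_hi - e_lo) * n_bins), n_bins - 1)

    ln_g = [0.0] * n_bins
    ln_f = 1.0                       # ln f starts at 1, i.e. f = e
    x = 0.5 * (x_min + x_max)
    b = e_bin(x)
    for _stage in range(100):        # hard cap so the sketch terminates
        if ln_f < 1e-4:
            break
        hist = [0] * n_bins
        for _ in range(sweeps):
            x_new = min(max(x + rng.uniform(-step, step), x_min), x_max)
            b_new = e_bin(x_new)
            d = ln_g[b] - ln_g[b_new]
            if d >= 0 or rng.random() < math.exp(d):
                x, b = x_new, b_new
            ln_g[b] += ln_f          # update the current bin every step
            hist[b] += 1
        visited = [h for h in hist if h > 0]
        if min(visited) > flat * sum(visited) / len(visited):
            ln_f *= 0.5              # histogram flat enough: tighten f

    return ln_g

ln_g = wang_landau(lambda x: x ** 4 - 2.0 * x ** 2)
```

    A parallel-trajectory variant along the lines proposed in the paper would run several such walkers and periodically swap their configurations; the single-walker sketch above omits that step.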

  10. Randomized subspace-based robust principal component analysis for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Sun, Weiwei; Yang, Gang; Li, Jialin; Zhang, Dianfa

    2018-01-01

A randomized subspace-based robust principal component analysis (RSRPCA) method for anomaly detection in hyperspectral imagery (HSI) is proposed. The RSRPCA combines the advantages of randomized column subspaces and robust principal component analysis (RPCA). It assumes that the background has low-rank properties and that the anomalies are sparse and do not lie in the column subspace of the background. First, RSRPCA implements random sampling to sketch the original HSI dataset from columns and to construct a randomized column subspace of the background. Structured random projections are also adopted to sketch the HSI dataset from rows. Sketching from columns and rows greatly reduces the computational requirements of RSRPCA. Second, RSRPCA adopts columnwise RPCA (CWRPCA) to eliminate the negative effects of sampled anomaly pixels, which purifies the previously constructed randomized column subspace by removing sampled anomaly columns. The CWRPCA decomposes the submatrix of the HSI data into a low-rank matrix (i.e., background component), a noisy matrix (i.e., noise component), and a sparse anomaly matrix (i.e., anomaly component) with only a small proportion of nonzero columns. The inexact augmented Lagrange multiplier algorithm is utilized to optimize the CWRPCA problem and estimate the sparse matrix. Nonzero columns of the sparse anomaly matrix point to sampled anomaly columns in the submatrix. Third, all pixels are projected onto the complement of the purified randomized column subspace of the background, and the anomaly pixels in the original HSI data are then precisely located. Several experiments on three real hyperspectral images are carefully designed to investigate the detection performance of RSRPCA, and the results are compared with four state-of-the-art methods. Experimental results show that the proposed RSRPCA outperforms the four comparison methods in both detection performance and computational time.

  11. Sensitivity and specificity of normality tests and consequences on reference interval accuracy at small sample size: a computer-simulation study.

    PubMed

    Le Boedec, Kevin

    2016-12-01

According to international guidelines, parametric methods must be chosen for RI construction when the sample size is small and the distribution is Gaussian. However, normality tests may not be accurate at small sample size. The purpose of the study was to evaluate normality test performance in properly identifying samples extracted from a Gaussian population at small sample sizes, and to assess the consequences for RI accuracy of applying parametric methods to samples for which the tests falsely identified the parent population as Gaussian. Samples of n = 60 and n = 30 values were randomly selected 100 times from simulated Gaussian, lognormal, and asymmetric populations of 10,000 values. The sensitivity and specificity of 4 normality tests were compared. Reference intervals were calculated using 6 different statistical methods from samples that falsely identified the parent population as Gaussian, and their accuracy was compared. The Shapiro-Wilk and D'Agostino-Pearson tests were the best-performing normality tests. However, their specificity was poor at sample size n = 30 (specificity for P < .05: .51 and .50, respectively). The best significance levels identified when n = 30 were 0.19 for the Shapiro-Wilk test and 0.18 for the D'Agostino-Pearson test. Using parametric methods on samples extracted from a lognormal population but falsely identified as Gaussian led to clinically relevant inaccuracies. At small sample size, normality tests may lead to erroneous use of parametric methods to build RIs. Using nonparametric methods (or, alternatively, Box-Cox transformation) on all samples regardless of their distribution, or adjusting the significance level of normality tests depending on sample size, would limit the risk of constructing inaccurate RIs. © 2016 American Society for Veterinary Clinical Pathology.
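    The clinical consequence described above, parametric reference intervals applied to skewed data, is easy to reproduce in a small simulation. The sketch below uses only the Python standard library with assumed lognormal parameters; it is not the study's simulation code:

```python
import random
import statistics

def reference_interval(values, parametric=True):
    """Central 95% reference interval from one sample: the parametric
    form assumes a Gaussian (mean ± 1.96 SD); the nonparametric form
    takes the 2.5th and 97.5th percentiles."""
    if parametric:
        m, s = statistics.mean(values), statistics.stdev(values)
        return m - 1.96 * s, m + 1.96 * s
    q = statistics.quantiles(values, n=40)  # cut points 2.5%, ..., 97.5%
    return q[0], q[-1]

rng = random.Random(42)
n, reps = 30, 500
param_lo = nonpar_lo = 0.0
for _ in range(reps):
    # right-skewed (lognormal) sample that a weak normality test may pass
    sample = [rng.lognormvariate(0.0, 0.5) for _ in range(n)]
    param_lo += reference_interval(sample, parametric=True)[0] / reps
    nonpar_lo += reference_interval(sample, parametric=False)[0] / reps

# the true 2.5th percentile of this lognormal is exp(-1.96 * 0.5) ≈ 0.375
print(f"mean lower RI limit  parametric: {param_lo:.3f}"
      f"  nonparametric: {nonpar_lo:.3f}")
```

    On skewed data the Gaussian lower limit lands far below the true 2.5th percentile (often below zero, which is impossible for a strictly positive analyte), while the nonparametric limit stays near the truth, mirroring the "clinically relevant inaccuracies" the study reports.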

  12. The Efficacy of Yoga as a Form of Treatment for Depression

    PubMed Central

    Bridges, Ledetra; Sharma, Manoj

    2017-01-01

The purpose of this article was to systematically review yoga interventions aimed at improving depressive symptoms. A total of 23 interventions published between 2011 and May 2016 were evaluated in this review. Three study designs were used: randomized controlled trials, quasi-experimental designs, and pretest/posttest designs, with the majority being randomized controlled trials. Most of the studies were conducted in the United States. Various yoga schools were used, the most common being Hatha yoga. The number of participants in the studies ranged from 14 to 136, implying that most studies had a small sample. The duration of the intervention period varied greatly, with the majority being 6 weeks or longer. Limitations of the interventions included the small sample sizes used by the majority of the studies, the focus of most studies on the short-term effect of yoga for depression, and the nonutilization of behavioral theories. Despite these limitations, it can be concluded that the yoga interventions were effective in reducing depression. PMID:28664775

  13. Non-destructive controlled single-particle light scattering measurement

    NASA Astrophysics Data System (ADS)

    Maconi, G.; Penttilä, A.; Kassamakov, I.; Gritsevich, M.; Helander, P.; Puranen, T.; Salmi, A.; Hæggström, E.; Muinonen, K.

    2018-01-01

    We present a set of light scattering data measured from a millimeter-sized extraterrestrial rock sample. The data were acquired by our novel scatterometer, which enables accurate multi-wavelength measurements of single-particle samples whose position and orientation are controlled by ultrasonic levitation. The measurements demonstrate a non-destructive approach to derive optical properties of small mineral samples. This enables research on valuable materials, such as those returned from space missions or rare meteorites.

  14. Mutual synchronization of weakly coupled gyrotrons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rozental, R. M.; Glyavin, M. Yu.; Sergeev, A. S.

    2015-09-15

The processes of synchronization of two weakly coupled gyrotrons are studied within the framework of non-stationary equations with a non-fixed longitudinal field structure. Allowing for a small difference between the free oscillation frequencies of the gyrotrons, we found a certain range of parameters in which mutual synchronization is possible while high electronic efficiency is maintained. It is also shown that synchronization regimes can be realized even under random fluctuations of the parameters of the electron beams.

  15. Underwater Intruder Detection Sonar for Harbour Protection: State of the Art Review and Implications

    DTIC Science & Technology

    2006-10-01

intruder would appear as a small moving “blob” of energetic echo in the echograph, and the operator could judge whether the contact is a threat that calls...visually then as a small fluctuating “blob” against a fluctuating background of sound clutter and reverberation, making it difficult to visually...4. Non-random false alarms caused by genuine underwater contacts that happened not to be intruders—by large fish, or schools of fish, or marine

  16. 78 FR 70921 - Takes of Marine Mammals Incidental to Specified Activities; Taking Marine Mammals Incidental to...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-27

    ... permissible methods of taking, other means of effecting the least practicable impact on the species or stock... non-destructive sampling methods to monitor rocky intertidal algal and invertebrate species abundances... and random quadrat are sampled, using methods described by Foster et al. (1991) and Dethier et al...

  17. Balancing selection and genetic drift at major histocompatibility complex class II genes in isolated populations of golden snub-nosed monkey (Rhinopithecus roxellana)

    PubMed Central

    2012-01-01

    Background Small, isolated populations often experience loss of genetic variation due to random genetic drift. Unlike neutral or nearly neutral markers (such as mitochondrial genes or microsatellites), major histocompatibility complex (MHC) genes in these populations may retain high levels of polymorphism due to balancing selection. The relative roles of balancing selection and genetic drift in either small isolated or bottlenecked populations remain controversial. In this study, we examined the mechanisms maintaining polymorphisms of MHC genes in small isolated populations of the endangered golden snub-nosed monkey (Rhinopithecus roxellana) by comparing genetic variation found in MHC and microsatellite loci. There are few studies of this kind conducted on highly endangered primate species. Results Two MHC genes were sequenced and sixteen microsatellite loci were genotyped from samples representing three isolated populations. We isolated nine DQA1 alleles and sixteen DQB1 alleles and validated expression of the alleles. Lowest genetic variation for both MHC and microsatellites was found in the Shennongjia (SNJ) population. Historical balancing selection was revealed at both the DQA1 and DQB1 loci, as revealed by excess non-synonymous substitutions at antigen binding sites (ABS) and maximum-likelihood-based random-site models. Patterns of microsatellite variation revealed population structure. FST outlier analysis showed that population differentiation at the two MHC loci was similar to the microsatellite loci. Conclusions MHC genes and microsatellite loci showed the same allelic richness pattern with the lowest genetic variation occurring in SNJ, suggesting that genetic drift played a prominent role in these isolated populations. As MHC genes are subject to selective pressures, the maintenance of genetic variation is of particular interest in small, long-isolated populations. 
The results of this study may contribute to captive breeding and translocation programs for endangered species. PMID:23083308

  18. On the Structure of Cortical Microcircuits Inferred from Small Sample Sizes.

    PubMed

    Vegué, Marina; Perin, Rodrigo; Roxin, Alex

    2017-08-30

    The structure in cortical microcircuits deviates from what would be expected in a purely random network, which has been seen as evidence of clustering. To address this issue, we sought to reproduce the nonrandom features of cortical circuits by considering several distinct classes of network topology, including clustered networks, networks with distance-dependent connectivity, and those with broad degree distributions. To our surprise, we found that all of these qualitatively distinct topologies could account equally well for all reported nonrandom features despite being easily distinguishable from one another at the network level. This apparent paradox was a consequence of estimating network properties given only small sample sizes. In other words, networks that differ markedly in their global structure can look quite similar locally. This makes inferring network structure from small sample sizes, a necessity given the technical difficulty inherent in simultaneous intracellular recordings, problematic. We found that a network statistic called the sample degree correlation (SDC) overcomes this difficulty. The SDC depends only on parameters that can be estimated reliably given small sample sizes and is an accurate fingerprint of every topological family. We applied the SDC criterion to data from rat visual and somatosensory cortex and discovered that the connectivity was not consistent with any of these main topological classes. However, we were able to fit the experimental data with a more general network class, of which all previous topologies were special cases. The resulting network topology could be interpreted as a combination of physical spatial dependence and nonspatial, hierarchical clustering. SIGNIFICANCE STATEMENT The connectivity of cortical microcircuits exhibits features that are inconsistent with a simple random network. 
Here, we show that several classes of network models can account for this nonrandom structure despite qualitative differences in their global properties. This apparent paradox is a consequence of the small numbers of simultaneously recorded neurons in experiments: when inferred via small sample sizes, many networks may be indistinguishable despite being globally distinct. We develop a connectivity measure that successfully classifies networks even when estimated locally with a few neurons at a time. We show that data from rat cortex are consistent with a network in which the likelihood of a connection between neurons depends on spatial distance and on nonspatial, asymmetric clustering. Copyright © 2017 the authors 0270-6474/17/378498-13$15.00/0.

  19. Digit ratio (2D:4D) and male facial attractiveness: new data and a meta-analysis.

    PubMed

    Hönekopp, Johannes

    2013-10-01

    Digit ratio (2D:4D) appears to correlate negatively with prenatal testosterone (T) effects in humans. As T probably increases facial masculinity, which in turn might be positively related to male facial attractiveness, a number of studies have looked into the relationship between 2D:4D and male facial attractiveness, showing equivocal results. Here, I present the largest and third largest samples so far, which investigate the relationship between 2D:4D and male facial attractiveness in adolescents (n = 115) and young men (n = 80). I then present random-effects meta-analyses of the available data (seven to eight samples, overall n = 362 to 469). These showed small (r ≈ -.09), statistically non-significant relationships between 2D:4D measures and male facial attractiveness. Thus, 2D:4D studies offer no convincing evidence at present that prenatal T has a positive effect on male facial attractiveness. However, a consideration of confidence intervals shows that, at present, a theoretically meaningful relationship between 2D:4D and male facial attractiveness cannot be ruled out either.
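    A random-effects pooling of correlations of the kind reported above can be sketched with the DerSimonian-Laird estimator applied to Fisher-z-transformed effect sizes. The study correlations and sample sizes below are hypothetical placeholders, not the meta-analysis data; they merely illustrate how a small pooled r with a confidence interval spanning zero arises:

```python
import math

def random_effects_meta(rs, ns):
    """DerSimonian-Laird random-effects pooling of correlations via
    Fisher's z transform.  rs: per-study correlations, ns: per-study
    sample sizes.  Returns the pooled r and its 95% confidence interval."""
    zs = [math.atanh(r) for r in rs]              # Fisher z transform
    vs = [1.0 / (n - 3) for n in ns]              # variance of Fisher z
    ws = [1.0 / v for v in vs]
    z_fixed = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    q = sum(w * (z - z_fixed) ** 2 for w, z in zip(ws, zs))
    c = sum(ws) - sum(w * w for w in ws) / sum(ws)
    tau2 = max(0.0, (q - (len(rs) - 1)) / c)      # between-study variance
    ws_re = [1.0 / (v + tau2) for v in vs]
    z_re = sum(w * z for w, z in zip(ws_re, zs)) / sum(ws_re)
    se = math.sqrt(1.0 / sum(ws_re))
    lo, hi = z_re - 1.96 * se, z_re + 1.96 * se
    return math.tanh(z_re), (math.tanh(lo), math.tanh(hi))

# hypothetical per-study correlations and sample sizes (not the paper's data)
r, ci = random_effects_meta([-0.15, -0.05, -0.12, 0.02], [115, 80, 60, 50])
print(f"pooled r = {r:.3f}, 95% CI ({ci[0]:.3f}, {ci[1]:.3f})")
```

    With these illustrative inputs the pooled estimate is close to r ≈ -.09 and the interval crosses zero, the same qualitative pattern (small, non-significant effect) reported in the abstract.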

  20. Social network recruitment for Yo Puedo: an innovative sexual health intervention in an underserved urban neighborhood—sample and design implications.

    PubMed

    Minnis, Alexandra M; vanDommelen-Gonzalez, Evan; Luecke, Ellen; Cheng, Helen; Dow, William; Bautista-Arredondo, Sergio; Padian, Nancy S

    2015-02-01

    Most existing evidence-based sexual health interventions focus on individual-level behavior, even though there is substantial evidence that highlights the influential role of social environments in shaping adolescents' behaviors and reproductive health outcomes. We developed Yo Puedo, a combined conditional cash transfer and life skills intervention for youth to promote educational attainment, job training, and reproductive health wellness that we then evaluated for feasibility among 162 youth aged 16-21 years in a predominantly Latino community in San Francisco, CA. The intervention targeted youth's social networks and involved recruitment and randomization of small social network clusters. In this paper we describe the design of the feasibility study and report participants' baseline characteristics. Furthermore, we examined the sample and design implications of recruiting social network clusters as the unit of randomization. Baseline data provide evidence that we successfully enrolled high risk youth using a social network recruitment approach in community and school-based settings. Nearly all participants (95%) were high risk for adverse educational and reproductive health outcomes based on multiple measures of low socioeconomic status (81%) and/or reported high risk behaviors (e.g., gang affiliation, past pregnancy, recent unprotected sex, frequent substance use; 62%). We achieved variability in the study sample through heterogeneity in recruitment of the index participants, whereas the individuals within the small social networks of close friends demonstrated substantial homogeneity across sociodemographic and risk profile characteristics. Social networks recruitment was feasible and yielded a sample of high risk youth willing to enroll in a randomized study to evaluate a novel sexual health intervention.

  1. A randomized controlled trial in non-responders from Newcastle upon Tyne invited to return a self-sample for Human Papillomavirus testing versus repeat invitation for cervical screening.

    PubMed

    Cadman, Louise; Wilkes, Scott; Mansour, Diana; Austin, Janet; Ashdown-Barr, Lesley; Edwards, Rob; Kleeman, Michelle; Szarewski, Anne

    2015-03-01

Non-attenders for cervical screening are at increased risk of cervical cancer. Studies offering self-sampling for high-risk Human Papillomavirus (hrHPV) testing have shown greater uptake than sending another invitation for cytology. We explored whether uptake would increase in a less diverse, more stable population than that of the previous English study, which demonstrated a lower response rate than other studies. The primary objective was whether non-attenders were more likely to respond to a postal invitation, including a kit, to collect a self-sample compared with a further invitation for cytology screening. The secondary objective was whether women with an abnormal result would attend for follow-up. 6000 non-attenders for screening in this pragmatic, randomized (1:1) controlled trial in Newcastle-upon-Tyne were sent an HPV self-sample kit (intervention) or a further invitation for cytology screening (comparator). 411 (13%) responded to the intervention, returning a self-sample (247; 8%) or attending for cytology (164; 5%), compared with 183 (6%) attending for cytology in the comparator arm, relative risk 2.25 (95% CI 1.90-2.65). Of those testing hrHPV positive (32; 13%), 19 (59%) subsequently attended cytology screening. Of those in the intervention group who attended for cytology screening without returning an hrHPV self-sample (n = 164), 5% (n = 8) were referred for colposcopy; all attended. In the comparator group, eight of the nine women referred for colposcopy attended. Persistent non-responders to invitations for cervical screening are significantly more likely to respond to a postal invitation to return a self-collected sample for HPV testing than to a further invitation for cytology screening. However, just over half followed up on a positive HPV result. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
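    The reported relative risk can be reproduced from the counts in the abstract. Assuming 3000 women per arm (6000 randomized 1:1, an inference from the abstract rather than a stated figure), the point estimate and the log-scale (Katz) 95% confidence interval come out to the published 2.25 (1.90-2.65):

```python
import math

def relative_risk(events_a, n_a, events_b, n_b):
    """Relative risk with a log-scale (Katz) 95% confidence interval."""
    rr = (events_a / n_a) / (events_b / n_b)
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# counts from the abstract; the per-arm denominator of 3000 is an assumption
rr, lo, hi = relative_risk(411, 3000, 183, 3000)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

    That the recomputed interval matches the published one to two decimals supports the assumed 1:1 split of the 6000 invitations.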

  2. 78 FR 42079 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-15

    ... a three-year approval to continue the ERSG project. The ongoing evaluation employs a quasi-experimental/non-randomized design in which a convenience sample of participants in schools receiving universal...

  3. Ultrasonic sensor and method of use

    DOEpatents

    Condreva, Kenneth J.

    2001-01-01

An ultrasonic sensor system and method of use for measuring transit time through a liquid sample, using one ultrasonic transducer coupled to a precision time-interval counter. The timing circuit captures changes in transit time, representing small changes in the velocity of sound transmitted, over necessarily small time intervals (nanoseconds), and uses the transit-time changes to identify the presence of non-conforming constituents in the sample.

  4. Reducing bias in survival under non-random temporary emigration

    USGS Publications Warehouse

    Peñaloza, Claudia L.; Kendall, William L.; Langtimm, Catherine Ann

    2014-01-01

Despite intensive monitoring, temporary emigration from the sampling area can induce bias severe enough for managers to discard life-history parameter estimates toward the terminus of the time series (terminal bias). Under random temporary emigration, unbiased parameters can be estimated with CJS models. However, unmodeled Markovian temporary emigration causes bias in parameter estimates, and an unobservable state is required to model this type of emigration. The robust design is most flexible when modeling temporary emigration, and partial solutions to mitigate bias have been identified; nonetheless, there are conditions where terminal bias prevails. Long-lived species with high adult survival and highly variable non-random temporary emigration present terminal bias in survival estimates, despite being modeled with the robust design and the suggested constraints. Because this bias is due to uncertainty about the fate of individuals that are undetected toward the end of the time series, solutions should involve using additional information on the survival status or location of these individuals at that time. Using simulation, we evaluated the performance of models that jointly analyze robust design data and an additional source of ancillary data (predictive covariate on temporary emigration, telemetry, dead recovery, or auxiliary resightings) in reducing terminal bias in survival estimates. The auxiliary resighting and predictive covariate models reduced terminal bias the most. Additional telemetry data were effective at reducing terminal bias only when individuals were tracked for a minimum of two years. The high adult survival of long-lived species made the joint model with recovery data ineffective at reducing terminal bias because of small-sample bias. The naïve constraint model (last and penultimate temporary emigration parameters made equal) was the least efficient, though still able to reduce terminal bias when compared to an unconstrained model.
Joint analysis of several sources of data improved parameter estimates and reduced terminal bias. Efforts to incorporate or acquire such data should be considered by researchers and wildlife managers, especially in the years leading up to status assessments of species of interest. Simulation modeling is a very cost effective method to explore the potential impacts of using different sources of data to produce high quality demographic data to inform management.

  5. Many-body delocalization with random vector potentials

    NASA Astrophysics Data System (ADS)

    Cheng, Chen; Mondaini, Rubem

In this talk we present the ergodic properties of excited states in a model of interacting fermions in quasi-one-dimensional chains subjected to a random vector potential. In the non-interacting limit, we show that arbitrarily small values of this complex off-diagonal disorder trigger localization for the whole spectrum; the divergence of the localization length in the single-particle basis is characterized by a critical exponent ν which depends on the energy density being investigated. However, when short-ranged interactions are included, the localization is lost and the system is ergodic regardless of the magnitude of disorder in finite chains. Our numerical results suggest delocalization for arbitrarily small interaction strengths. This finding indicates that the standard scenario of many-body localization cannot be obtained in a model with random gauge fields. This research is financially supported by the National Natural Science Foundation of China (NSFC) (Grant Nos. U1530401 and 11674021). RM also acknowledges support from NSFC (Grant No. 11650110441).

  6. Effects of Sex Education and Kegel Exercises on the Sexual Function of Postmenopausal Women: A Randomized Clinical Trial.

    PubMed

    Nazarpour, Soheila; Simbar, Masoumeh; Ramezani Tehrani, Fahimeh; Alavi Majd, Hamid

    2017-07-01

    The sex lives of women are strongly affected by menopause. Non-pharmacologic approaches to improving the sexual function of postmenopausal women might prove effective. To compare two methods of intervention (formal sex education and Kegel exercises) with routine postmenopausal care services in a randomized clinical trial. A randomized clinical trial was conducted of 145 postmenopausal women residing in Chalus and Noshahr, Iran. Their sexual function status was assessed using the Female Sexual Function Index (FSFI) questionnaire. After obtaining written informed consent, they were randomly assigned to one of three groups: (i) formal sex education, (ii) Kegel exercises, or (iii) routine postmenopausal care. After 12 weeks, all participants completed the FSFI again. Analysis of covariance was used to compare the participants' sexual function before and after the interventions, and multiple linear regression analysis was used to determine the predictive factors for variation in FSFI scores in the postintervention stage. Sexual function was assessed using the FSFI. There were no statistically significant differences in demographic and socioeconomic characteristics or FSFI total scores among the three study groups at the outset of the study. After 12 weeks, the arousal scores in the formal sex education and Kegel groups were significantly higher than in the control group (3.38 and 3.15 vs 2.77, respectively). The orgasm and satisfaction scores in the Kegel group were significantly higher than in the control group (4.43 and 4.88 vs 3.95 and 4.39, respectively). Formal sex education and Kegel exercises were used as two non-pharmacologic approaches to improve the sexual function of women after menopause. The main strength of this study was its design: a well-organized randomized trial using precise eligibility criteria and with minimal sample loss.
The second strength was the methods of intervention used, namely non-pharmacologic approaches that are simple, easily accessible, and fairly inexpensive. The main limitation of the study was our inability to objectively assess the participants' commitment to exercise and the sexual function of their partners. Sex education programs and Kegel exercises could cause improvements in some domains of sexual function-specifically arousal, orgasm, and satisfaction-in postmenopausal women. Nazarpour S, Simbar M, Tehrani FR, Majd HA. Effects of Sex Education and Kegel Exercises on the Sexual Function of Postmenopausal Women: A Randomized Clinical Trial. J Sex Med 2017;14:959-967. Copyright © 2017 International Society for Sexual Medicine. Published by Elsevier Inc. All rights reserved.

  7. Impact of gender participation in non-farming activities on household income and poverty levels in Pakistan.

    PubMed

    Ali, Akhter; Erenstein, Olaf; Rahut, Dil Bahadur

    2015-01-01

    In the rural areas of Pakistan, the majority of farm households have small landholdings of less than 2 hectares. Both males and females are engaged in farming and non-farming activities. However, gender-wise participation in farming activities in Pakistan is not well documented. The main objective of the current study is to estimate the impact of male and female participation in non-farming activities on a household's income level and poverty status in Pakistan. The current study is based on a cross-sectional data set collected from 325 households through a purposive random sampling technique. A detailed, comprehensive questionnaire was prepared for data collection. The data were analyzed using the propensity score matching approach. The empirical results indicate that both male and female participation in non-farming activities has a positive impact on household welfare in Pakistan by raising income levels and thus contributing to poverty reduction. However, the impact is greater when the males of a household take part in these activities rather than the females. In the past, only a few studies have focused on gender-based participation in non-farming activities. The non-farming sector is an important one in rural areas, especially in developing countries like Pakistan. More opportunities need to be created for both men and women in rural areas of Pakistan to find off-farm work, in order to increase household income and reduce poverty levels.

  8. Bayesian hierarchical models for smoothing in two-phase studies, with application to small area estimation.

    PubMed

    Ross, Michelle; Wakefield, Jon

    2015-10-01

    Two-phase study designs are appealing since they allow for the oversampling of rare sub-populations, which improves efficiency. In this paper we describe a Bayesian hierarchical model for the analysis of two-phase data. Such a model is particularly appealing in a spatial setting in which random effects are introduced to model between-area variability. In such a situation, one may be interested in estimating regression coefficients or, in the context of small area estimation, in reconstructing the population totals by strata. The efficiency gains of the two-phase sampling scheme are compared to standard approaches using 2011 birth data from the Research Triangle area of North Carolina. We show that the proposed method can overcome small-sample difficulties and improve on existing techniques. We conclude that the two-phase design is an attractive approach for small area estimation.

  9. Multiple-instance ensemble learning for hyperspectral images

    NASA Astrophysics Data System (ADS)

    Ergul, Ugur; Bilgin, Gokhan

    2017-10-01

    An ensemble framework for multiple-instance (MI) learning (MIL) is introduced for use with hyperspectral images (HSIs), inspired by the bagging (bootstrap aggregation) method in ensemble learning. Ensemble-based bagging is performed using a small percentage of the training samples, and MI bags are formed by a local windowing process with variable window sizes on the selected instances. In addition to bootstrap aggregation, random subspace selection is used to further diversify the base classifiers. The proposed method is implemented using four MIL classification algorithms. The classifier model learning phase is carried out with MI bags, and the estimation phase is performed over single test instances. In the experimental part of the study, two different HSIs that have ground-truth information are used, and comparative results against state-of-the-art classification methods are presented. In general, the MI ensemble approach produces more compact results in terms of both diversity and error compared to equivalent non-MIL algorithms.
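The bagging-plus-random-subspace idea described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes a simple nearest-centroid base learner and synthetic two-class data standing in for labeled pixel spectra, and it omits the MI bag construction via local windowing.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_centroids(X, y):
    """Nearest-centroid base learner: one mean vector per class."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict_centroids(model, X):
    classes = sorted(model)
    d = np.stack([np.linalg.norm(X - model[c], axis=1) for c in classes])
    return np.array(classes)[d.argmin(axis=0)]

def bagging_subspace_ensemble(X, y, n_models=25, sample_frac=0.2, feat_frac=0.5):
    """Each base model sees a small bootstrap sample and a random feature subset."""
    ensemble = []
    n, p = X.shape
    for _ in range(n_models):
        rows = rng.choice(n, size=max(2, int(sample_frac * n)), replace=True)
        feats = rng.choice(p, size=max(1, int(feat_frac * p)), replace=False)
        ensemble.append((feats, fit_centroids(X[rows][:, feats], y[rows])))
    return ensemble

def predict_ensemble(ensemble, X):
    votes = np.stack([predict_centroids(m, X[:, f]) for f, m in ensemble])
    # majority vote over the base classifiers
    return np.array([np.bincount(col).argmax() for col in votes.T])

# synthetic two-class data standing in for per-pixel spectra (10 "bands")
X = np.vstack([rng.normal(0.0, 1.0, (100, 10)), rng.normal(3.0, 1.0, (100, 10))])
y = np.array([0] * 100 + [1] * 100)
ens = bagging_subspace_ensemble(X, y)
acc = (predict_ensemble(ens, X) == y).mean()
```

Training each learner on a small bootstrap fraction and a random band subset is what gives the ensemble its diversity; the vote then stabilizes the individual weak decisions.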

  10. ASSORTATIVE MATING CAN IMPEDE OR FACILITATE FIXATION OF UNDERDOMINANT ALLELES

    PubMed Central

    NEWBERRY, MITCHELL G; MCCANDLISH, DAVID M; PLOTKIN, JOSHUA B

    2017-01-01

    Underdominant mutations have fixed between divergent species, yet classical models suggest that rare underdominant alleles are purged quickly except in small or subdivided populations. We predict that underdominant alleles that also influence mate choice, such as those affecting coloration patterns visible to mates and predators alike, can fix more readily. We analyze a mechanistic model of positive assortative mating in which individuals have n chances to sample compatible mates. This one-parameter model naturally spans random mating (n = 1) and complete assortment (n → ∞), yet it produces sexual selection whose strength depends non-monotonically on n. This sexual selection interacts with viability selection to either inhibit or facilitate fixation. As mating opportunities increase, underdominant alleles fix as frequently as neutral mutations, even though sexual selection and underdominance independently each suppress rare alleles. This mechanism allows underdominant alleles to fix in large populations and illustrates how life history can affect evolutionary change. PMID:27497738
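The mate-sampling mechanism can be illustrated with a toy calculation (an assumption-laden sketch, not the paper's full population-genetic model): if a focal individual's compatible type has frequency p and encounters are independent, the chance of securing an assortative mating within n sampling chances is 1 - (1 - p)^n, which recovers random-mating odds at n = 1 and complete assortment as n → ∞.

```python
import numpy as np

rng = np.random.default_rng(1)

def p_assort(p, n):
    """Probability of finding a compatible mate within n sampling chances,
    assuming independent random encounters (toy version of the mechanism)."""
    return 1.0 - (1.0 - p) ** n

def simulate(p, n, trials=200_000):
    """Monte Carlo check: draw n prospective mates, succeed if any is compatible."""
    draws = rng.random((trials, n)) < p
    return draws.any(axis=1).mean()

p, n = 0.1, 5
exact = p_assort(p, n)
approx = simulate(p, n)
```

Rare types (small p) gain disproportionately as n grows, which is the lever through which mating opportunities can interact with viability selection in a model of this kind.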

  11. Analysis of Realized Volatility for Nikkei Stock Average on the Tokyo Stock Exchange

    NASA Astrophysics Data System (ADS)

    Takaishi, Tetsuya; Watanabe, Toshiaki

    2016-04-01

    We calculate the realized volatility of the Nikkei Stock Average (Nikkei225) Index on the Tokyo Stock Exchange and investigate the return dynamics. To avoid bias in the realized volatility arising from non-trading hours, we calculate realized volatility separately for the two trading sessions of the Tokyo Stock Exchange, i.e. morning and afternoon, and find that microstructure noise decreases the realized volatility at small sampling frequencies. Using realized volatility as a proxy for the integrated volatility, we standardize the returns of the morning and afternoon sessions and investigate the normality of the standardized returns by calculating their variance, kurtosis and 6th moment. We find that the variance, kurtosis and 6th moment are consistent with those of the standard normal distribution, which indicates that the return dynamics of the Nikkei Stock Average are well described by a Gaussian random process with time-varying volatility.
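The standardization step can be sketched as follows (a toy reconstruction on synthetic data, not the authors' code): realized volatility is the sum of squared high-frequency returns within a session, and each session return is divided by the square root of its realized volatility before the moments are inspected.

```python
import numpy as np

rng = np.random.default_rng(2)

def realized_volatility(intraday_returns):
    """RV = sum of squared high-frequency returns within one session."""
    return np.sum(intraday_returns ** 2, axis=-1)

# synthetic sessions: Gaussian returns with a volatility that varies by day
n_days, n_intraday = 4000, 60
daily_sigma = np.exp(rng.normal(-4.0, 0.3, size=(n_days, 1)))
r = rng.normal(0.0, daily_sigma / np.sqrt(n_intraday), size=(n_days, n_intraday))

rv = realized_volatility(r)          # proxy for the integrated variance
session_return = r.sum(axis=1)
z = session_return / np.sqrt(rv)     # standardized session returns

var_z = z.var()
kurt_z = ((z - z.mean()) ** 4).mean() / z.var() ** 2
```

With Gaussian returns the standardized series has variance near 1 and kurtosis near 3, which is the signature of normality the abstract tests for; with real data, departures from these values would indicate non-Gaussian dynamics.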

  12. Laparoscopic Complete Mesocolic Excision versus Open Complete Mesocolic Excision for Transverse Colon Cancer: Long-Term Survival Results of a Prospective Single Centre Non-Randomized Study.

    PubMed

    Storli, Kristian Eeg; Eide, Geir Egil

    2016-01-01

    Laparoscopic complete mesocolic excision (CME) used in the treatment of transverse colon cancer has been questioned on the basis of the technical challenges. The aim of this study was to evaluate the medium- and long-term clinical and survival outcomes after laparoscopic and open CME for transverse colon cancer and to compare the 2 approaches. This study was a retrospective non-randomized study of patients with prospectively registered data on open and laparoscopic CME for transverse colon cancer tumour-node-metastasis stages I-III operated on between 2007 and 2014. This was a single-centre study in a community teaching hospital. A total of 56 patients with transverse colon cancer were included, excluding those with tumours in the colonic flexures. The outcome aims were 4-year time to recurrence (TTR) and cancer-specific survival (CSS). Morbidity was also measured. The 4-year TTR was 93.9% in the laparoscopic group and 91.3% in the open group (p = 0.71). The 4-year CSS was 97.0% in the laparoscopic group and 91.3% in the open group (p = 0.42). This was a prospective single-institution study with a small sample size. Results of the study suggest that the laparoscopic CME approach might be the preferred approach for transverse colon cancer, especially regarding its benefits in terms of short-term morbidity, length of stay and oncological outcome. © 2016 S. Karger AG, Basel.

  13. Small Bowel Volvulus in the Adult Populace of the United States: Results From a Population-Based Study

    PubMed Central

    Coe, Taylor M.; Chang, David C.; Sicklick, Jason K.

    2015-01-01

    Background Small bowel volvulus is a rare entity in Western adults. Greater insight into its epidemiology and outcomes may be gained from a national database inquiry. Methods The Nationwide Inpatient Sample (1998–2010), a 20% stratified sample of United States hospitals, was retrospectively reviewed for small bowel volvulus cases (ICD-9 560.2, excluding gastric/colonic procedures) in patients ≥18 years old. Results There were 2,065,599 hospitalizations for bowel obstruction (ICD-9 560.x). Of those, there were 20,680 (1.00%) small bowel volvulus cases; 169 were attributable to intestinal malrotation. Most cases presented emergently (89.24%), and operative management was employed more frequently than non-operative (65.21% vs. 34.79%, P<0.0001). Predictors of mortality included age >50 years, Charlson comorbidity index ≥1, emergent admission, peritonitis, acute vascular insufficiency, coagulopathy, and non-operative management (P<0.0001). Conclusions This first population-based epidemiological study of small bowel volvulus provides a robust representation of this rare cause of small bowel obstruction in American adults. PMID:26002189

  14. At convenience and systematic random sampling: effects on the prognostic value of nuclear area assessments in breast cancer patients.

    PubMed

    Jannink, I; Bennen, J N; Blaauw, J; van Diest, P J; Baak, J P

    1995-01-01

    This study compares the influence of two different nuclear sampling methods on the prognostic value of assessments of the mean and standard deviation of nuclear area (MNA, SDNA) in 191 consecutive invasive breast cancer patients with long-term follow-up. The first sampling method used was 'at convenience' sampling (ACS); the second, systematic random sampling (SRS). Both sampling methods were tested with a sample size of 50 nuclei (ACS-50 and SRS-50). To determine whether, besides the sampling method, sample size also had an impact on prognostic value, the SRS method was additionally tested using a sample size of 100 nuclei (SRS-100). SDNA values were systematically lower for ACS, evidently due to (unconsciously) not including small and large nuclei. Testing the prognostic value of a series of cut-off points, MNA and SDNA values assessed by the SRS method were prognostically significantly stronger than the values obtained by the ACS method. This was confirmed in Cox regression analysis. For the MNA, the Mantel-Cox p-values from SRS-50 and SRS-100 measurements were not significantly different. However, for the SDNA, SRS-100 yielded significantly lower p-values than SRS-50. In conclusion, compared with the 'at convenience' nuclear sampling method, systematic random sampling of nuclei is not only superior with respect to reproducibility of results, but also provides better prognostic value in patients with invasive breast cancer.
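The difference between the two sampling schemes can be illustrated with a small simulation (an illustrative sketch on synthetic lognormal "nuclear areas", not the study's measurement protocol): systematic random sampling takes a random start and then a fixed step through the field, while a convenience-style selection that avoids extreme nuclei shrinks the estimated SDNA.

```python
import numpy as np

rng = np.random.default_rng(3)

def systematic_random_sample(items, k):
    """Random start, then every (N/k)-th item: even coverage of the field."""
    n = len(items)
    step = n / k
    start = rng.uniform(0, step)
    idx = (start + step * np.arange(k)).astype(int)
    return items[idx]

# simulated nuclear areas: mostly mid-sized, with small and large tails
areas = np.sort(rng.lognormal(mean=3.0, sigma=0.5, size=2000))
srs = systematic_random_sample(areas, 50)

# 'at convenience' selection that skips the extremes, as the abstract describes
mid = areas[(areas > np.quantile(areas, 0.1)) & (areas < np.quantile(areas, 0.9))]
acs = rng.choice(mid, 50, replace=False)

sd_srs, sd_acs = srs.std(), acs.std()
```

Because the convenience sample excludes the smallest and largest nuclei, its standard deviation is systematically lower than that of the systematic random sample, mirroring the SDNA pattern reported above.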

  15. Wave Propagation in Non-Stationary Statistical Mantle Models at the Global Scale

    NASA Astrophysics Data System (ADS)

    Meschede, M.; Romanowicz, B. A.

    2014-12-01

    We study the effect of statistically distributed heterogeneities that are smaller than the resolution of current tomographic models on seismic waves that propagate through the Earth's mantle at teleseismic distances. Current global tomographic models are missing small-scale structure as evidenced by the failure of even accurate numerical synthetics to explain enhanced coda in observed body and surface waveforms. One way to characterize small scale heterogeneity is to construct random models and confront observed coda waveforms with predictions from these models. Statistical studies of the coda typically rely on models with simplified isotropic and stationary correlation functions in Cartesian geometries. We show how to construct more complex random models for the mantle that can account for arbitrary non-stationary and anisotropic correlation functions as well as for complex geometries. Although this method is computationally heavy, model characteristics such as translational, cylindrical or spherical symmetries can be used to greatly reduce the complexity such that this method becomes practical. With this approach, we can create 3D models of the full spherical Earth that can be radially anisotropic, i.e. with different horizontal and radial correlation functions, and radially non-stationary, i.e. with radially varying model power and correlation functions. Both of these features are crucial for a statistical description of the mantle in which structure depends to first order on the spherical geometry of the Earth. We combine different random model realizations of S velocity with current global tomographic models that are robust at long wavelengths (e.g. Meschede and Romanowicz, 2014, GJI submitted), and compute the effects of these hybrid models on the wavefield with a spectral element code (SPECFEM3D_GLOBE). We finally analyze the resulting coda waves for our model selection and compare our computations with observations. 
Based on these observations, we make predictions about the strength of unresolved small-scale structure and extrinsic attenuation.

  16. Quality of life analyses from the randomized, open-label, phase III PointBreak study of pemetrexed-carboplatin-bevacizumab followed by maintenance pemetrexed-bevacizumab versus paclitaxel-carboplatin-bevacizumab followed by maintenance bevacizumab in patients with stage IIIB or IV nonsquamous non-small-cell lung cancer.

    PubMed

    Spigel, David R; Patel, Jyoti D; Reynolds, Craig H; Garon, Edward B; Hermann, Robert C; Govindan, Ramaswamy; Olsen, Mark R; Winfree, Katherine B; Chen, Jian; Liu, Jingyi; Guba, Susan C; Socinski, Mark A; Bonomi, Philip

    2015-02-01

    Treatment impact on quality of life (QoL) informs treatment management decisions in advanced nonsquamous non-small-cell lung cancer (NS NSCLC). QoL outcomes from the phase III PointBreak trial are reported. Chemonaive patients (n = 939) with stage IIIB/IV nonsquamous non-small-cell lung cancer and Eastern Cooperative Oncology Group performance status 0 to 1 were randomized (1:1) to pemetrexed-carboplatin-bevacizumab (pemetrexed arm) or paclitaxel-carboplatin-bevacizumab (paclitaxel arm). Patients without progressive disease received maintenance pemetrexed-bevacizumab (pemetrexed arm) or bevacizumab (paclitaxel arm). QoL was assessed using Functional Assessment of Cancer Therapy (FACT)-General (FACT-G), FACT-Lung (FACT-L), and FACT/Gynecologic Oncology Group-Neurotoxicity (FACT-Ntx) instruments. Subscale scores, total scores, and trial outcome indices were analyzed using linear mixed-effects models. Post hoc analyses examined the association between baseline FACT scores and overall survival (OS). Mean score differences in change from baseline significantly favored the pemetrexed arm for the neurotoxicity subscale score, FACT-Ntx total scores, and FACT-Ntx trial outcome index. They occurred at cycle 2 (p < 0.001) and persisted through induction cycles 2 to 4 and six maintenance cycles. Investigator-assessed, qualitative, drug-related differences in grade 2 (1.6% versus 10.6%) and grade 3 (0.0% versus 4.1%) sensory neuropathy and grade 3/4 fatigue (10.9% versus 5.0%, p = 0.0012) were observed between the pemetrexed and paclitaxel arms. Baseline FACT-G, FACT-L, and FACT-Ntx scores were significant prognostic factors for OS (p < 0.001). Randomized patients reported similar changes in QoL, except for less change from baseline in neurotoxicity on the pemetrexed arm; investigators reported greater neurotoxicity on the paclitaxel arm and greater fatigue on the pemetrexed arm. Higher baseline FACT scores were favorable prognostic factors for OS.

  17. R. A. Fisher and his advocacy of randomization.

    PubMed

    Hall, Nancy S

    2007-01-01

    The requirement of randomization in experimental design was first stated by R. A. Fisher, statistician and geneticist, in 1925 in his book Statistical Methods for Research Workers. Earlier designs were systematic and involved the judgment of the experimenter; this led to possible bias and inaccurate interpretation of the data. Fisher's dictum was that randomization eliminates bias and permits a valid test of significance. Randomization in experiments had been used by Charles Sanders Peirce in 1885, but the practice was not continued. Fisher developed his concepts of randomization as he considered the mathematics of small samples, in discussions with "Student," William Sealy Gosset. Fisher published extensively. His principles of experimental design were spread worldwide by the many "voluntary workers" who came from other institutions to Rothamsted Agricultural Station in England to learn Fisher's methods.

  18. Seven common mistakes in population genetics and how to avoid them.

    PubMed

    Meirmans, Patrick G

    2015-07-01

    As the data resulting from modern genotyping tools are astoundingly complex, genotyping studies require great care in sampling design, genotyping, data analysis and interpretation. Such care is necessary because, with data sets containing thousands of loci, small biases can easily become strongly significant patterns. Such biases may already arise in routine tasks that are part of almost every genotyping study. Here, I discuss seven common mistakes that are frequently encountered in the genotyping literature: (i) giving more attention to genotyping than to sampling, (ii) failing to perform or report experimental randomization in the laboratory, (iii) equating geopolitical borders with biological borders, (iv) testing the significance of clustering output, (v) misinterpreting Mantel's r statistic, (vi) only interpreting a single value of k, and (vii) forgetting that only a small portion of the genome will be associated with climate. For each of these issues, I give suggestions on how to avoid the mistake. Overall, I argue that genotyping studies would benefit from establishing a more rigorous experimental design, involving proper sampling design, randomization and a better distinction between a priori hypotheses and exploratory analyses. © 2015 John Wiley & Sons Ltd.

  19. Estimating the duration of geologic intervals from a small number of age determinations: A challenge common to petrology and paleobiology

    NASA Astrophysics Data System (ADS)

    Glazner, Allen F.; Sadler, Peter M.

    2016-12-01

    The duration of a geologic interval, such as the time over which a given volume of magma accumulated to form a pluton, or the lifespan of a large igneous province, is commonly determined from a relatively small number of geochronologic determinations (e.g., 4-10) within that interval. Such sample sets can underestimate the true length of the interval by a significant amount. For example, the average interval determined from a sample of size n = 5, drawn from a uniform random distribution, will underestimate the true interval by 50%. Even for n = 10, the average sample only captures ~80% of the interval. If the underlying distribution is known then a correction factor can be determined from theory or Monte Carlo analysis; for a uniform random distribution, this factor is (n + 1)/(n - 1). Systematic undersampling of interval lengths can have a large effect on calculated magma fluxes in plutonic systems. The problem is analogous to determining the duration of an extinct species from its fossil occurrences. Confidence interval statistics developed for species origination and extinction times are applicable to the onset and cessation of magmatic events.
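A quick Monte Carlo check of the correction factor quoted above (a sketch under the stated uniform-distribution assumption): the expected range of n uniform draws covers (n - 1)/(n + 1) of the true interval, so multiplying the observed range by (n + 1)/(n - 1) removes the bias on average.

```python
import numpy as np

rng = np.random.default_rng(4)

def mean_observed_fraction(n, trials=100_000):
    """Average (max - min) of n uniform draws on [0, 1]: the fraction of the
    true interval captured by a sample of n age determinations."""
    dates = rng.random((trials, n))
    return (dates.max(axis=1) - dates.min(axis=1)).mean()

frac10 = mean_observed_fraction(10)        # theory: 9/11, i.e. ~80% captured
corrected = frac10 * (10 + 1) / (10 - 1)   # correction factor (n + 1)/(n - 1)
```

For n = 10 the simulated fraction lands near 9/11 ≈ 0.82, matching the "~80%" figure in the abstract, and the corrected estimate averages to the full interval length.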

  20. The Effect of Food or Omeprazole on the Pharmacokinetics of Osimertinib in Patients With Non-Small-Cell Lung Cancer and in Healthy Volunteers.

    PubMed

    Vishwanathan, Karthick; Dickinson, Paul A; Bui, Khanh; Cassier, Philippe A; Greystoke, Alastair; Lisbon, Eleanor; Moreno, Victor; So, Karen; Thomas, Karen; Weilert, Doris; Yap, Timothy A; Plummer, Ruth

    2018-04-01

    Two phase 1, open-label studies assessed the impact of food or gastric pH modification (omeprazole) on the exposure and safety/tolerability of osimertinib and its metabolites. The food effect study was an open-label, 2-period crossover study in patients with advanced non-small-cell lung cancer, randomized into 2 treatment sequences: single-dose osimertinib 80 mg in a fed then fasted state or fasted then fed. The gastric pH study was an open-label, 2-period fixed sequence study assessing the effect of omeprazole on osimertinib exposure in healthy male volunteers. In period 1, volunteers received omeprazole 40 mg (days 1-4), then omeprazole 40 mg plus osimertinib 80 mg (day 5). In period 2, volunteers received osimertinib 80 mg alone (single dose). Blood samples were collected at prespecified time points for pharmacokinetic analyses. Safety/tolerability was also assessed. In the food effect study 38 patients were randomized to fed/fasted (n = 18) or fasted/fed (n = 20) sequences with all patients completing treatment. Coadministration with food did not affect osimertinib exposure (geometric least-squares mean ratios [90% confidence intervals]: 106.05% [94.82%, 118.60%] [area under the plasma concentration time curve from zero to 72 hours] and 92.75% [81.40%, 105.68%] [maximum plasma concentration]). In the gastric pH study (n = 68 received treatment, n = 47 completed the study), coadministration with omeprazole did not affect osimertinib exposure (geometric least-squares mean ratios 106.66% [100.26%, 113.46%] [area under the concentration-time curve], 101.65% [94.65%, 109.16%] [peak concentration]). Osimertinib was well tolerated in both studies. Osimertinib may be administered without regard to food. Dose restriction is not required in patients whose gastric pH may be altered by concomitant agents or medical conditions. ClinicalTrials.gov: NCT02224053, NCT02163733. © 2017, The American College of Clinical Pharmacology.

  1. The superiority of antidepressant medication to cognitive behavior therapy in melancholic depressed patients: a 12-week single-blind randomized study.

    PubMed

    Parker, G; Blanch, B; Paterson, A; Hadzi-Pavlovic, D; Sheppard, E; Manicavasagar, V; Synnott, H; Graham, R K; Friend, P; Gilfillan, D; Perich, T

    2013-10-01

    To pursue the previously long-standing but formally untested clinical view that melancholia is preferentially responsive to antidepressant medication in comparison with psychotherapy [specifically Cognitive Behavior Therapy (CBT)]. Second, to determine whether a broader action antidepressant medication sequencing regimen is superior to a Selective Serotonin Reuptake Inhibitor (SSRI) alone. We sought to recruit a large sample of participants with melancholic depression for a 12-week trial but inclusion criteria compromised recruitment and testing the second hypothesis. The first hypothesis was evaluated by comparing 18 participants receiving antidepressant medication to 11 receiving CBT. Primary study measures were the Hamilton Rating Scale for Depression (HAM-D) and the Hamilton Endogenous Subscale (HES), rated blindly, while several secondary measures also evaluated outcome. Participants receiving medication had a superior 12-week outcome to those receiving CBT, with significant differences present on primary measures as early as 4 weeks. At trial conclusion, the percentage improvement in HAM-D scores was 61.1% vs. 34.4%, respectively [Number Needed to Treat (NNT) = 3.7] and with those in receipt of medication returning non-significantly higher HAM-D responder (66.6% vs. 36.4%, NNT = 2.8) and remission (66.7% vs. 45.4%, NNT = 4.7) rates. As the sample size was small and participants evidenced only moderate levels of depression severity, the study risked being underpowered and idiosyncratic. Despite the small sample, the superiority of antidepressant medication to CBT in those with a melancholic depression was distinctive in this pilot study. © 2012 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  2. Violation of the Sphericity Assumption and Its Effect on Type-I Error Rates in Repeated Measures ANOVA and Multi-Level Linear Models (MLM).

    PubMed

    Haverkamp, Nicolas; Beauducel, André

    2017-01-01

    We investigated the effects of violations of the sphericity assumption on Type I error rates for different methodical approaches to repeated measures analysis using a simulation approach. In contrast to previous simulation studies on this topic, up to nine measurement occasions were considered. Effects of the level of inter-correlations between measurement occasions on Type I error rates were considered for the first time. Two populations with no violation of the sphericity assumption, one with uncorrelated measurement occasions and one with moderately correlated measurement occasions, were generated. One population with violation of the sphericity assumption combines uncorrelated with highly correlated measurement occasions. A second population with violation of the sphericity assumption combines moderately correlated and highly correlated measurement occasions. From these four populations, without any between-group or within-subject effect, 5,000 random samples were drawn. Finally, the mean Type I error rates were computed for multilevel linear models (MLM) with an unstructured covariance matrix (MLM-UN), MLM with compound symmetry (MLM-CS), and repeated measures analysis of variance (rANOVA) models (without correction, with the Greenhouse-Geisser correction, and with the Huynh-Feldt correction). To examine the effect of both the sample size and the number of measurement occasions, sample sizes of n = 20, 40, 60, 80, and 100 were considered, as well as measurement occasions of m = 3, 6, and 9. With respect to rANOVA, the results argue for the use of rANOVA with the Huynh-Feldt correction, especially when the sphericity assumption is violated, the sample size is rather small and the number of measurement occasions is large. For MLM-UN, the results illustrate a massive progressive bias for small sample sizes (n = 20) and m = 6 or more measurement occasions. This effect could not be found in previous simulation studies with a smaller number of measurement occasions. The proportionality between bias and the number of measurement occasions should be considered when MLM-UN is used. The good news is that this bias can be compensated for by large sample sizes. Accordingly, MLM-UN can be recommended even for small sample sizes with about three measurement occasions, and for large sample sizes with up to about nine measurement occasions.
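The degree of sphericity violation in designs like those above is commonly summarized by the Greenhouse-Geisser epsilon, on which the correction is based. A minimal sketch using the standard textbook formula (not the authors' simulation code): epsilon equals 1 under compound symmetry and drops toward 1/(m - 1) as the violation worsens.

```python
import numpy as np

def gg_epsilon(S):
    """Greenhouse-Geisser epsilon from an m x m covariance matrix S.
    epsilon = 1 under sphericity; lower values signal stronger violation."""
    m = S.shape[0]
    mean_diag = np.trace(S) / m
    grand_mean = S.mean()
    row_means = S.mean(axis=1)
    num = (m * (mean_diag - grand_mean)) ** 2
    den = (m - 1) * (np.sum(S ** 2) - 2 * m * np.sum(row_means ** 2)
                     + m ** 2 * grand_mean ** 2)
    return num / den

m = 6
# compound symmetry (equal variances, equal correlations): sphericity holds
cs = 0.5 * np.ones((m, m)) + 0.5 * np.eye(m)

# mix of high and low correlations, like the violating populations above
rho = np.full((m, m), 0.9)
rho[:3, 3:] = rho[3:, :3] = 0.1
np.fill_diagonal(rho, 1.0)

eps_cs = gg_epsilon(cs)        # 1.0: no correction needed
eps_violated = gg_epsilon(rho) # well below 1: strong correction needed
```

The corrected rANOVA test multiplies both degrees of freedom of the F-test by this epsilon, which is why a severe violation (epsilon near its floor) costs so much power at small m and n.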

  3. Occupational position and its relation to mental distress in a random sample of Danish residents.

    PubMed

    Rugulies, Reiner; Madsen, Ida E H; Nielsen, Maj Britt D; Olsen, Lis R; Mortensen, Erik L; Bech, Per

    2010-08-01

    To analyze the distribution of depressive, anxiety, and somatization symptoms across different occupational positions in a random sample of Danish residents. The study sample consisted of 591 Danish residents (50% women), aged 20-65, drawn from an age- and gender-stratified random sample of the Danish population. Participants filled out a survey that included the 92 item version of the Hopkins Symptom Checklist (SCL-92). We categorized occupational position into seven groups: high- and low-grade non-manual workers, skilled and unskilled manual workers, high- and low-grade self-employed, and unemployed. Compared to the reference group of high-grade non-manual workers, the depressive symptom score was statistically significantly elevated among unskilled manual workers (P = 0.043) and the unemployed (P < 0.001), after adjustment for age, gender, cohabitation, life events, and low household income. The anxiety symptom score was elevated only among the unemployed (P = 0.004). The somatization symptom score was elevated among unskilled manual workers (P = 0.002), the low-grade self-employed (P = 0.023), and the unemployed (P = 0.001). When we analyzed caseness of severe symptoms, we found that unskilled manual workers (OR = 3.27, 95% CI = 1.06-10.04) and the unemployed (OR = 6.20, 95% CI = 1.98-19.42) had a higher prevalence of severe depressive symptoms, compared to the reference group of high-grade non-manual workers. The unemployed also had a higher prevalence of severe somatization symptoms (OR = 6.28, 95% CI = 1.39-28.46). Unskilled manual workers, the unemployed, and, to a lesser extent, the low-grade self-employed showed an increased level of mental distress. Activities to promote mental health in the Danish population should be directed toward these groups.

  4. DialBetics

    PubMed Central

    Fujita, Hideo; Uchimura, Yuji; Omae, Koji; Aramaki, Eiji; Kato, Shigeko; Lee, Hanae; Kobayashi, Haruka; Kadowaki, Takashi; Ohe, Kazuhiko

    2014-01-01

    Numerous diabetes-management systems and programs for improving glycemic control to meet guideline targets have been proposed, using information technology. But all of them allow only limited—or no—real-time interaction between patients and the system in terms of system response to patient input; few studies have effectively assessed the systems' usability and feasibility to determine how well patients understand and can adopt the technology involved. DialBetics is composed of 4 modules: (1) data transmission module, (2) evaluation module, (3) communication module, and (4) dietary evaluation module. A 3-month randomized study was designed to assess the safety and usability of a remote health-data monitoring system, and especially its impact on modifying patient lifestyles to improve diabetes self-management and, thus, clinical outcomes. Fifty-four type 2 diabetes patients were randomly divided into 2 groups, 27 in the DialBetics group and 27 in the non-DialBetics control group. HbA1c and fasting blood sugar (FBS) values declined significantly in the DialBetics group: HbA1c decreased an average of 0.4% (from 7.1 ± 1.0% to 6.7 ± 0.7%), compared with an average increase of 0.1% in the non-DialBetics group (from 7.0 ± 0.9% to 7.1 ± 1.1%) (P = .015). The DialBetics group FBS decreased an average of 5.5 mg/dl, compared with a non-DialBetics group average increase of 16.9 mg/dl (P = .019). BMI improvement—although not statistically significant because of the small sample size—was greater in the DialBetics group. DialBetics was shown to be a feasible and effective tool for improving HbA1c by providing patients with real-time support based on their measurements and inputs. PMID:24876569

  5. Effect of red yeast rice combined with antioxidants on lipid pattern, hs-CRP level, and endothelial function in moderately hypercholesterolemic subjects.

    PubMed

    Cicero, Arrigo F G; Morbini, Martino; Parini, Angelo; Urso, Riccardo; Rosticci, Martina; Grandi, Elisa; Borghi, Claudio

    2016-01-01

    Our aim was to test, through a crossover, double-blind, placebo-controlled randomized clinical trial, whether a short-term treatment with 10 mg monacolins combined with antioxidants could improve lipid pattern, high-sensitivity C-reactive protein (hs-CRP), and endothelial function in a small cohort of moderately hypercholesterolemic subjects. Thus, 25 healthy, moderately hypercholesterolemic subjects were consecutively enrolled and, after 4 weeks of stabilization diet, were randomized to the sequence placebo, washout, monacolins or monacolins, washout, placebo, with each period being 4 weeks long. At each study step, a complete lipid pattern, safety parameters, hs-CRP, and endothelial function were measured. When compared to the placebo phase, during monacolin treatment patients experienced a more favorable percentage change in total cholesterol (TC) (TC after monacolin treatment, -18.35%; TC after placebo treatment, -5.39%), low-density lipoprotein cholesterol (LDL-C) (LDL after monacolin treatment, -22.36%; LDL after placebo treatment, -1.38%), non-high-density lipoprotein cholesterol (non-HDL-C) (non-HDL after monacolin treatment, -22.83%; non-HDL after placebo treatment, -7.15%), hs-CRP (hs-CRP after monacolin treatment, -2.33%; hs-CRP after placebo treatment, 2.11%), and endothelial function (pulse volume displacement after monacolin treatment, 18.59%; pulse volume displacement after placebo treatment, -6.69%). No significant difference was observed with regard to triglycerides, HDL-cholesterol, or safety parameters. On the basis of our data, a 10 mg monacolin nutraceutical treatment appears to safely reduce cholesterolemia, hs-CRP, and markers of vascular remodeling in moderately hypercholesterolemic subjects. These results need to be confirmed in larger patient samples and in studies of longer duration.

  6. What's in a name? The challenge of describing interventions in systematic reviews: analysis of a random sample of reviews of non-pharmacological stroke interventions

    PubMed Central

    Hoffmann, Tammy C; Walker, Marion F; Langhorne, Peter; Eames, Sally; Thomas, Emma; Glasziou, Paul

    2015-01-01

    Objective To assess, in a sample of systematic reviews of non-pharmacological interventions, the completeness of intervention reporting, identify the most frequently missing elements, and assess review authors’ use of and beliefs about providing intervention information. Design Analysis of a random sample of systematic reviews of non-pharmacological stroke interventions; online survey of review authors. Data sources and study selection The Cochrane Library and PubMed were searched for potentially eligible systematic reviews and a random sample of these assessed for eligibility until 60 (30 Cochrane, 30 non-Cochrane) eligible reviews were identified. Data collection In each review, the completeness of the intervention description in each eligible trial (n=568) was assessed by 2 independent raters using the Template for Intervention Description and Replication (TIDieR) checklist. All review authors (n=46) were invited to complete a survey. Results Most reviews were missing intervention information for the majority of items. The most incompletely described items were: modifications, fidelity, materials, procedure and tailoring (missing from all interventions in 97%, 90%, 88%, 83% and 83% of reviews, respectively). Items that scored better, but were still incomplete for the majority of reviews, were: ‘when and how much’ (in 31% of reviews, adequate for all trials; in 57% of reviews, adequate for some trials); intervention mode (in 22% of reviews, adequate for all trials; in 38%, adequate for some trials); and location (in 19% of reviews, adequate for all trials). Of the 33 (71%) authors who responded, 58% reported having further intervention information but not including it, and 70% tried to obtain information. Conclusions Most focus on intervention reporting has been directed at trials. Poor intervention reporting in stroke systematic reviews is prevalent, compounded by poor trial reporting. 
Without adequate intervention descriptions, the conduct, usability and interpretation of reviews are restricted, and action is therefore required from trialists, systematic reviewers, peer reviewers and editors. PMID:26576811

  7. Trends and educational differences in non-communicable disease risk factors in Pitkäranta, Russia, from 1992 to 2007.

    PubMed

    Vlasoff, Tiina; Laatikainen, Tiina; Korpelainen, Vesa; Uhanov, Mihail; Pokusajeva, Svetlana; Tossavainen, Kerttu; Vartiainen, Erkki; Puska, Pekka

    2015-02-01

    Mortality and morbidity from non-communicable diseases (NCDs) are a major public health problem in Russia. The aim of the study was to examine trends and educational differences from 1992 to 2007 in NCD risk factors in Pitkäranta in the Republic of Karelia, Russia. Four cross-sectional population health surveys were carried out in the Pitkäranta region, Republic of Karelia, Russia, in 1992, 1997, 2002, and 2007. An independent random sample of 1000 persons from the general population aged 25-64 years was studied in each survey round. The total number of respondents in the four surveys was 2672. The surveys included a questionnaire, physical measurements, and blood sampling, and they were carried out following standard protocols. NCD risk factor levels generally increased in Pitkäranta during the study period, with the exception of systolic blood pressure and smoking among men. Particularly marked increases were observed in alcohol consumption in both sexes and in smoking among women. Educational differences and differences in trends were relatively small, with the exception of a significant increase in smoking in the lowest female educational category. Trends showing an increase in some major NCD risk factors and signs of emerging socio-economic differences call for stronger attention to effective health promotion and preventive policies in Russia. © 2014 the Nordic Societies of Public Health.

  8. [The role of meta-analysis in assessing the treatment of advanced non-small cell lung cancer].

    PubMed

    Pérol, M; Pérol, D

    2004-02-01

    Meta-analysis is a statistical method allowing an evaluation of the direction and quantitative importance of a treatment effect observed in randomized trials which have tested the treatment but have not provided a definitive conclusion. In the present review, we discuss the methodology and the contribution of meta-analyses to the treatment of advanced-stage or metastatic non-small-cell lung cancer. In this area of oncology, meta-analyses have provided decisive information demonstrating the impact of chemotherapy on patient survival. They have also helped define a two-drug regimen based on cisplatin as the gold standard treatment for patients with a satisfactory general status. Recently, the meta-analysis method was used to measure the influence of gemcitabine in combination with platinum salts and demonstrated a small but significant benefit in survival, confirming that gemcitabine remains the gold standard treatment in combination with cisplatin.

  9. Predictors of Suicide Ideation in a Random Digit Dial Study: Exposure to Suicide Matters.

    PubMed

    van de Venne, Judy; Cerel, Julie; Moore, Melinda; Maple, Myfanwy

    2017-07-03

    Suicide is an important public health concern requiring ongoing research to understand risk factors for suicide ideation. A dual-frame, random digit dial survey was utilized to identify demographic and suicide-related factors associated with suicide ideation in a statewide sample of 1,736 adults. The PHQ-9 depression scale suicide ideation item was used to assess current suicide ideation in both the full sample and the suicide-exposed sub-sample. Being non-married and having previous suicide exposure were separately associated with higher risks of suicide ideation in the full sample. Among the suicide-exposed, being male, having more suicide exposures, and perceiving greater closeness to the decedent increased risk, while older age decreased risk. Implications for future screening and research are discussed.

  10. [Theory, method and application of method R on estimation of (co)variance components].

    PubMed

    Liu, Wen-Zhong

    2004-07-01

    The theory, method, and application of Method R for estimating (co)variance components were reviewed so that the method can be used appropriately. Estimation requires R values, which are regressions of predicted random effects calculated from the complete dataset on predicted random effects calculated from random subsets of the same data. By using a multivariate iteration algorithm based on a transformation matrix, combined with the preconditioned conjugate gradient method to solve the mixed model equations, the computational efficiency of Method R is much improved. Method R is computationally inexpensive, and the sampling errors and approximate credible intervals of the estimates can be obtained. Its disadvantages include a larger sampling variance than other methods for the same data, and biased estimates in small datasets. Method R can thus serve as an alternative method in larger datasets. It is necessary to study its theoretical properties further and to broaden its range of application.

  11. Evolutionary Trends and the Salience Bias (with Apologies to Oil Tankers, Karl Marx, and Others).

    ERIC Educational Resources Information Center

    McShea, Daniel W.

    1994-01-01

    Examines evolutionary trends, specifically trends in size, complexity, and fitness. Notes that documentation of these trends consists of either long lists of cases, or descriptions of a small number of salient cases. Proposes the use of random samples to avoid this "saliency bias." (SR)

  12. Anti-Depressants, Suicide, and Drug Regulation

    ERIC Educational Resources Information Center

    Ludwig, Jens; Marcotte, Dave E.

    2005-01-01

    Policymakers are increasingly concerned that a relatively new class of anti-depressant drugs, selective serotonin re-uptake inhibitors (SSRI), may increase the risk of suicide for at least some patients, particularly children. Prior randomized trials are not informative on this question because of small sample sizes and other limitations. Using…

  13. Assessing Postgraduate Students' Critical Thinking Ability

    ERIC Educational Resources Information Center

    Javed, Muhammad; Nawaz, Muhammad Atif; Qurat-Ul-Ain, Ansa

    2015-01-01

    This paper assesses the critical thinking ability of postgraduate students. The target population was male and female students at the university level in Pakistan. A small sample of 45 male and 45 female students was selected randomly from The Islamia University of Bahawalpur, Pakistan. Cornell Critical Thinking Test Series, The…

  14. Binary Colloidal Alloy Test-5: Aspheres

    NASA Technical Reports Server (NTRS)

    Chaikin, Paul M.; Hollingsworth, Andrew D.

    2008-01-01

    The Binary Colloidal Alloy Test - 5: Aspheres (BCAT-5-Aspheres) experiment photographs initially randomized colloidal samples (tiny nanoscale spheres suspended in liquid) in microgravity to determine their resulting structure over time. BCAT-5-Aspheres will study the properties of concentrated systems of small particles in microgravity when the particles are identical but not spherical.

  15. Predictor sort sampling and one-sided confidence bounds on quantiles

    Treesearch

    Steve Verrill; Victoria L. Herian; David W. Green

    2002-01-01

    Predictor sort experiments attempt to make use of the correlation between a predictor that can be measured prior to the start of an experiment and the response variable that we are investigating. Properly designed and analyzed, they can reduce necessary sample sizes, increase statistical power, and reduce the lengths of confidence intervals. However, if the non-random...

  16. The Effect of Non-Normal Distributions on the Integrated Moving Average Model of Time-Series Analysis.

    ERIC Educational Resources Information Center

    Doerann-George, Judith

    The Integrated Moving Average (IMA) model of time series, and the analysis of intervention effects based on it, assume random shocks which are normally distributed. To determine the robustness of the analysis to violations of this assumption, empirical sampling methods were employed. Samples were generated from three populations: normal,…

  17. Intelligent Control of a Sensor-Actuator System via Kernelized Least-Squares Policy Iteration

    PubMed Central

    Liu, Bo; Chen, Sanfeng; Li, Shuai; Liang, Yongsheng

    2012-01-01

    In this paper a new framework, called Compressive Kernelized Reinforcement Learning (CKRL), for computing near-optimal policies in sequential decision making under uncertainty is proposed by incorporating non-adaptive, data-independent Random Projections and nonparametric Kernelized Least-Squares Policy Iteration (KLSPI). Random Projections are a fast, non-adaptive dimensionality reduction framework in which high-dimensional data are projected onto a random lower-dimensional subspace via spherically random rotation and coordinate sampling. KLSPI introduces the kernel trick into the LSPI framework for Reinforcement Learning, often achieving faster convergence and providing automatic feature selection via various kernel sparsification approaches. In this approach, policies are computed in a low-dimensional subspace generated by projecting the high-dimensional features onto a set of random bases. We first show how Random Projections constitute an efficient sparsification technique and how our method often converges faster than regular LSPI, at lower computational cost. The theoretical foundation underlying this approach is a fast approximation of the Singular Value Decomposition (SVD). Finally, simulation results are presented on benchmark MDP domains, which confirm gains both in computation time and in performance in large feature spaces. PMID:22736969
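
The Random Projections step described above, projecting high-dimensional data onto a random lower-dimensional subspace, can be sketched in a few lines. This is a minimal illustration of the general technique (a Gaussian projection matrix whose entries have variance 1/k, which preserves squared norms in expectation), not the CKRL implementation; the function names are hypothetical.

```python
import math
import random

def random_projection_matrix(k, d, seed=0):
    """Build a k x d Gaussian random projection matrix with entries
    drawn from N(0, 1/k), so E[||Rx||^2] = ||x||^2 for any vector x."""
    rng = random.Random(seed)
    return [[rng.gauss(0, 1) / math.sqrt(k) for _ in range(d)]
            for _ in range(k)]

def project(matrix, x):
    """Apply the linear projection: one dot product per output row."""
    return [sum(m_i * x_i for m_i, x_i in zip(row, x)) for row in matrix]
```

Because the projection is linear and data-independent, it can be drawn once and reused for every feature vector, which is what makes it "non-adaptive".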

  18. Non-Crop Host Sampling Yields Insights into Small-Scale Population Dynamics of Drosophila suzukii (Matsumura)

    PubMed Central

    Loeb, Gregory M.

    2018-01-01

    Invasive, polyphagous crop pests subsist on a number of crop and non-crop resources. While knowing the full range of host species is important, a seasonal investigation into the use of non-crop plants adjacent to cropping systems provides key insights into some of the factors determining local population dynamics. This study investigated the infestation of non-crop plants by the invasive Drosophila suzukii (Matsumura), a pest of numerous economically important stone and small fruit crops, by sampling fruit-producing non-crop hosts adjacent to commercial plantings weekly from June through November in central New York over a two-year period. We found D. suzukii infestation rates (number of flies emerged/kg fruit) peaked mid-August through early September, with Rubus allegheniensis Porter and Lonicera morrowii Asa Gray showing the highest average infestation in both years. Interannual infestation patterns were similar despite a lower number of adults caught in monitoring traps the second year, suggesting D. suzukii host use may be density independent. PMID:29301358

  19. Adults' Knowledge of Child Development in Alberta, Canada: Comparing the Level of Knowledge of Adults in Two Samples in 2007 and 2013

    ERIC Educational Resources Information Center

    Pujadas Botey, Anna; Vinturache, Angela; Bayrampour, Hamideh; Breitkreuz, Rhonda; Bukutu, Cecilia; Gibbard, Ben; Tough, Suzanne

    2017-01-01

    Parents and non-parental adults who interact with children influence child development. This study evaluates the knowledge of child development in two large and diverse samples of adults from Alberta in 2007 and 2013. Telephone interviews were completed by two random samples (1,443 in 2007; 1,451 in 2013). Participants were asked when specific…

  20. Random forests ensemble classifier trained with data resampling strategy to improve cardiac arrhythmia diagnosis.

    PubMed

    Ozçift, Akin

    2011-05-01

    Supervised classification algorithms are commonly used in the design of computer-aided diagnosis systems. In this study, we present a resampling-strategy-based Random Forests (RF) ensemble classifier to improve diagnosis of cardiac arrhythmia. Random Forests is an ensemble classifier that consists of many decision trees and outputs the class that is the mode of the classes output by the individual trees. In this way, an RF ensemble classifier performs better than a single tree from a classification performance point of view. In general, multiclass datasets having unbalanced distributions of sample sizes are difficult to analyze in terms of class discrimination. Cardiac arrhythmia is such a dataset, with multiple classes of small sample size, and it is therefore well suited to testing our resampling-based training strategy. The dataset contains 452 samples in fourteen types of arrhythmias, and eleven of these classes have sample sizes of less than 15. Our diagnosis strategy consists of two parts: (i) a correlation-based feature selection algorithm is used to select relevant features from the cardiac arrhythmia dataset; (ii) the RF machine learning algorithm is used to evaluate the performance of the selected features with and without simple random sampling, to evaluate the efficiency of the proposed training strategy. The resultant accuracy of the classifier is found to be 90.0%, which is a quite high diagnosis performance for cardiac arrhythmia. Furthermore, three case studies, i.e., thyroid, cardiotocography and audiology, are used to benchmark the effectiveness of the proposed method. The results of experiments demonstrated the efficiency of the random sampling strategy in training the RF ensemble classification algorithm. Copyright © 2011 Elsevier Ltd. All rights reserved.
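
The kind of simple-random-sampling training strategy described above, resampling so that small classes are no longer swamped by large ones before fitting the ensemble, can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the function name `balance_by_oversampling` and its parameters are hypothetical.

```python
import random
from collections import defaultdict

def balance_by_oversampling(samples, labels, seed=0):
    """Simple random sampling with replacement from each minority class
    until every class matches the size of the largest class, yielding a
    balanced training set for an ensemble classifier."""
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for x, y in zip(samples, labels):
        by_class[y].append(x)
    target = max(len(v) for v in by_class.values())
    out_x, out_y = [], []
    for y, xs in by_class.items():
        out_x.extend(xs)                      # keep all original samples
        out_y.extend([y] * len(xs))
        for _ in range(target - len(xs)):     # top up by resampling
            out_x.append(rng.choice(xs))
            out_y.append(y)
    return out_x, out_y
```

The balanced set would then be passed to an off-the-shelf Random Forests trainer; only the resampling step is shown here.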

  1. Meta-analysis and systematic review of the number of non-syndromic congenitally missing permanent teeth per affected individual and its influencing factors

    PubMed Central

    Rakhshan, Hamid

    2016-01-01

    Summary Background and purpose: Dental aplasia (or hypodontia) is a frequent and challenging anomaly and thus of interest to many dental fields. Although the number of missing teeth (NMT) in each person is a major clinical determinant of treatment need, there is no meta-analysis on this subject. Therefore, we aimed to investigate the relevant literature, including epidemiological studies and research on dental/orthodontic patients. Methods: Among 50 reports, the effects of ethnicities, regions, sample sizes/types, subjects’ minimum ages, journals’ scientific credit, publication year, and gender composition of samples on the number of missing permanent teeth (except the third molars) per person were statistically analysed (α = 0.05, 0.025, 0.01). Limitations: The inclusion of small studies and second-hand information might reduce the reliability. Nevertheless, these strategies increased the meta-sample size and favoured the generalisability. Moreover, data weighting was carried out to account for the effect of study sizes/precisions. Results: The NMT per affected person was 1.675 [95% confidence interval (CI) = 1.621–1.728], 1.987 (95% CI = 1.949–2.024), and 1.893 (95% CI = 1.864–1.923), in randomly selected subjects, dental/orthodontic patients, and both groups combined, respectively. The effects of ethnicities (P > 0.9), continents (P > 0.3), and time (adjusting for the population type, P = 0.7) were not significant. Dental/orthodontic patients exhibited a significantly greater NMT compared to randomly selected subjects (P < 0.012). Larger samples (P = 0.000) and enrolling younger individuals (P = 0.000) might inflate the observed NMT per person. Conclusions: Time, ethnic backgrounds, and continents seem unlikely to be influencing factors. Subjects younger than 13 years should be excluded. Larger samples should be investigated by more observers. PMID:25840586

  2. Lensless digital holography with diffuse illumination through a pseudo-random phase mask.

    PubMed

    Bernet, Stefan; Harm, Walter; Jesacher, Alexander; Ritsch-Marte, Monika

    2011-12-05

    Microscopic imaging with a setup consisting of a pseudo-random phase mask and an open CMOS camera, without an imaging objective, is demonstrated. The pseudo-random phase mask acts as a diffuser for an incoming laser beam, scattering a speckle pattern onto a CMOS chip, which is recorded once as a reference. A sample which is afterwards inserted somewhere in the optical beam path changes the speckle pattern. A single (non-iterative) image processing step, comparing the modified speckle pattern with the previously recorded one, generates a sharp image of the sample. After a first calibration the method works in real-time and allows quantitative imaging of complex (amplitude and phase) samples in an extended three-dimensional volume. Since no lenses are used, the method is free from lens aberrations. Compared to standard inline holography, the diffuse sample illumination improves the axial sectioning capability by increasing the effective numerical aperture in the illumination path, and it suppresses the undesired so-called twin images. For demonstration, a high resolution spatial light modulator (SLM) is programmed to act as the pseudo-random phase mask. We show experimental results, imaging microscopic biological samples, e.g. insects, within an extended volume at a distance of 15 cm with a transverse and longitudinal resolution of about 60 μm and 400 μm, respectively.

  3. Randomized controlled trial of Gastrografin in adhesive small bowel obstruction.

    PubMed

    Burge, Jonathan; Abbas, Saleh M; Roadley, Graeme; Donald, Jennifer; Connolly, Andrew; Bissett, Ian P; Hill, Andrew G

    2005-08-01

    Several previous studies have shown that Gastrografin can be utilized to triage patients with adhesive small bowel obstruction (ASBO) to an operative or a non-operative course. Previous studies assessing the therapeutic effect of Gastrografin have been confounded by post-administration radiology alerting the physician to the treatment group of the patient. Therefore the aim of the present paper was to test the hypothesis that Gastrografin hastens the non-operative resolution of ASBO. Patients, diagnosed with ASBO on clinical and radiological grounds, were randomized to receive Gastrografin or placebo in a double-blinded fashion. Patients did not undergo further radiological investigation. If the patient required subsequent radiological intervention or surgical intervention they were excluded from the study. End-points were time to resolution of ASBO (passage of flatus and bowel motion), length of hospital stay and complications. Forty-five patients with ASBO were randomized to receive either Gastrografin or placebo. Two patients were excluded due to protocol violations. Four patients in each group required surgery. Eighteen of the remaining patients received Gastrografin and 17 received placebo. Patients who received Gastrografin had complete resolution of their ASBO significantly earlier than placebo patients (12 vs 21 h, P = 0.009) and this translated into a median of a 1-day saving in time in hospital (3 vs 4 days, P = 0.03). Gastrografin accelerates resolution of ASBO by a specific therapeutic effect.

  4. Mutation patterns in small cell and non-small cell lung cancer patients suggest a different level of heterogeneity between primary and metastatic tumors.

    PubMed

    Saber, Ali; Hiltermann, T Jeroen N; Kok, Klaas; Terpstra, M Martijn; de Lange, Kim; Timens, Wim; Groen, Harry J M; van den Berg, Anke

    2017-02-01

    Several studies have shown heterogeneity in lung cancer, with parallel existence of multiple subclones characterized by their own specific mutational landscape. The extent to which minor clones become dominant in distinct metastases is not clear. The aim of our study was to gain insight into the evolution pattern of lung cancer by investigating genomic heterogeneity between primary tumors and their distant metastases. Whole exome sequencing (WES) was performed on 24 tumor and five normal samples of two small cell lung carcinoma (SCLC) and three non-SCLC (NSCLC) patients. Validation of somatic variants in these 24 samples and screening of 33 additional samples were done by single primer enrichment technology. For each of the three NSCLC patients, about half of the mutations were shared between all tumor samples, whereas for SCLC patients, this percentage was around 95. Independent validation of the non-ubiquitous mutations confirmed the WES data for the vast majority of the variants. Phylogenetic trees indicated more distance between the tumor samples of the NSCLC patients as compared to the SCLC patients. Analysis of 30 independent DNA samples of 16 biopsies used for WES revealed a low degree of intra-tumor heterogeneity of the selected sets of mutations. In the primary tumors of all five patients, variable percentages (19-67%) of the seemingly metastasis-specific mutations were present, albeit at low read frequencies. Patients with advanced NSCLC have a high percentage of non-ubiquitous mutations indicative of branched evolution. In contrast, the low degree of heterogeneity in SCLC suggests a parallel and linear model of evolution. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  5. Statistical power and optimal design in experiments in which samples of participants respond to samples of stimuli.

    PubMed

    Westfall, Jacob; Kenny, David A; Judd, Charles M

    2014-10-01

    Researchers designing experiments in which a sample of participants responds to a sample of stimuli are faced with difficult questions about optimal study design. The conventional procedures of statistical power analysis fail to provide appropriate answers to these questions because they are based on statistical models in which stimuli are not assumed to be a source of random variation in the data, models that are inappropriate for experiments involving crossed random factors of participants and stimuli. In this article, we present new methods of power analysis for designs with crossed random factors, and we give detailed, practical guidance to psychology researchers planning experiments in which a sample of participants responds to a sample of stimuli. We extensively examine 5 commonly used experimental designs, describe how to estimate statistical power in each, and provide power analysis results based on a reasonable set of default parameter values. We then develop general conclusions and formulate rules of thumb concerning the optimal design of experiments in which a sample of participants responds to a sample of stimuli. We show that in crossed designs, statistical power typically does not approach unity as the number of participants goes to infinity but instead approaches a maximum attainable power value that is possibly small, depending on the stimulus sample. We also consider the statistical merits of designs involving multiple stimulus blocks. Finally, we provide a simple and flexible Web-based power application to aid researchers in planning studies with samples of stimuli.
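
The key result above, that power approaches a maximum attainable value below one as the number of participants grows while the stimulus sample stays fixed, can be illustrated with a normal-approximation power calculation for a simple design in which each condition uses its own random sample of stimuli. This is a hedged sketch under assumed variance components, not the authors' method or their Web application; the function `approx_power` and its parameters are hypothetical.

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def approx_power(n_participants, n_stimuli, effect,
                 var_stimulus=1.0, var_error=1.0):
    """Normal-approximation power for a two-condition design where each
    condition uses its own sample of n_stimuli stimuli. The stimulus
    variance term shrinks only with the number of stimuli, so as
    participants increase, the standard error bottoms out and power
    plateaus below one."""
    se = math.sqrt(2 * var_stimulus / n_stimuli
                   + 2 * var_error / (n_participants * n_stimuli))
    z_crit = 1.959963984540054  # two-sided test at alpha = .05
    ncp = effect / se           # noncentrality of the effect estimate
    return (1 - norm_cdf(z_crit - ncp)) + norm_cdf(-z_crit - ncp)
```

With the assumed unit variances, 16 stimuli, and a moderate effect, adding participants raises power only up to the ceiling set by the stimulus term, mirroring the article's conclusion.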

  6. The first liquid biopsy test approved. Is it a new era of mutation testing for non-small cell lung cancer?

    PubMed Central

    2017-01-01

    Specific mutations in the epidermal growth factor receptor (EGFR) gene are predictive of response to EGFR tyrosine kinase inhibitors (TKIs) in non-small cell lung cancer (NSCLC) patients. According to international guidelines, molecular testing is recommended in patients with advanced NSCLC of a non-squamous subtype. However, obtaining a tissue sample can be challenging. Liquid biopsy makes it possible to identify patients suitable for EGFR-targeted therapy by analysis of circulating free tumor DNA (cfDNA) in peripheral blood samples and might replace tissue biopsy. It allows material to be acquired in a convenient, minimally invasive manner, is easily repeatable, and can be used for molecular identification and for monitoring molecular changes. Many studies show a high concordance rate between tissue and plasma sample testing. With the U.S. Food and Drug Administration (FDA) approval of the first liquid biopsy test, analysis of driver gene mutations from cfDNA has become a reality in clinical practice for patients with NSCLC. PMID:28251125

  7. Inadequacy of Conventional Grab Sampling for Remediation Decision-Making for Metal Contamination at Small-Arms Ranges.

    PubMed

    Clausen, J L; Georgian, T; Gardner, K H; Douglas, T A

    2018-01-01

    Research shows grab sampling is inadequate for evaluating military ranges contaminated with energetics because of their highly heterogeneous distribution. Similar studies assessing the heterogeneous distribution of metals at small-arms ranges (SARs) are lacking. To address this, we evaluated whether grab sampling provides appropriate data for performing risk analysis at metal-contaminated SARs characterized with 30-48 grab samples. We evaluated the extractable metal content of Cu, Pb, Sb, and Zn in the field data using a Monte Carlo random resampling with replacement (bootstrapping) simulation approach. Results indicate the 95% confidence interval of the mean for Pb (432 mg/kg) at one site was 200-700 mg/kg, with a data range of 5-4500 mg/kg. Considering that the U.S. Environmental Protection Agency screening level for lead is 400 mg/kg, the necessity of cleanup at this site is unclear. Resampling based on populations of 7 and 15 samples, sample sizes more realistic for the area, yielded high false-negative rates.
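
The percentile-bootstrap procedure described above (resampling the grab-sample measurements with replacement and recomputing the mean many times) can be sketched in a few lines. This is an illustrative reconstruction of the general technique, not the study's code; the function name and defaults are hypothetical.

```python
import random

def bootstrap_ci_mean(data, n_boot=10000, alpha=0.05, seed=42):
    """Percentile bootstrap CI of the mean: draw n_boot resamples of the
    same size as the data, with replacement, record each resample mean,
    and take the empirical alpha/2 and 1-alpha/2 quantiles."""
    rng = random.Random(seed)
    n = len(data)
    means = sorted(
        sum(rng.choice(data) for _ in range(n)) / n
        for _ in range(n_boot)
    )
    lo = means[int((alpha / 2) * n_boot)]
    hi = means[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

Applied to a set of grab-sample concentrations, a wide interval straddling a regulatory threshold would signal exactly the decision ambiguity the study reports.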

  8. An investigation of factors affecting elementary female student teachers' choice of science as a major at college level in Zimbabwe

    NASA Astrophysics Data System (ADS)

    Mlenga, Francis Howard

    The purpose of the study was to determine factors affecting elementary female student teachers' choice of science as a major at college level in Zimbabwe. The study was conducted at one of the Primary School Teachers' Colleges in Zimbabwe. A sample of two hundred and thirty-eight female student teachers was used in the study. Of these, one hundred and forty-two were non-science majors who had been randomly selected, forty-one were science majors, and forty-five were math majors. Both science and math majors were a convenience sample because the total enrollment of the two groups was small. All the subjects completed a survey questionnaire that had sixty-eight items. Ten students from the non-science majors were selected for individual interviews and the same was done for the science majors. A further eighteen were selected from the non-science majors and divided into three groups of six each for focus group interviews. The same was done for the science majors. The interviews were audio taped and transcribed. Data from the survey questionnaires were analyzed using binary logistic regression, which predicted factors that affected students' choice of science as a major. The transcribed interview data were analyzed using domain, taxonomic, and componential analyses. Results of the study indicated that elementary female students' choice of science as a major at college level is affected by students' attitudes toward science, teacher behavior, out-of-school experiences, role models, gender stereotyping, parental influence, peer influence, in-school experiences, and societal expectations, namely cultural and social expectations.

  9. Short-term monitoring of benzene air concentration in an urban area: a preliminary study of application of Kruskal-Wallis non-parametric test to assess pollutant impact on global environment and indoor.

    PubMed

    Mura, Maria Chiara; De Felice, Marco; Morlino, Roberta; Fuselli, Sergio

    2010-01-01

    In step with the need to develop statistical procedures to manage small-size environmental samples, in this work we have used concentration values of benzene (C6H6), concurrently detected by seven outdoor and indoor monitoring stations over 12 000 minutes, in order to assess the representativeness of collected data and the impact of the pollutant on the indoor environment. Clearly, the former issue is strictly connected to sampling-site geometry, which proves critical to correctly retrieving information from analysis of pollutants of sanitary interest. Therefore, according to current criteria for network planning, single stations have been interpreted as nodes of a set of adjoining triangles; then, a) node pairs have been taken into account in order to estimate pollutant stationarity on triangle sides, as well as b) node triplets, to statistically associate data from air monitoring with the corresponding territory area, and c) node sextuplets, to assess the impact probability of the outdoor pollutant on the indoor environment for each area. Distributions from the various node combinations are all non-Gaussian; consequently, the Kruskal-Wallis (KW) non-parametric test has been used to test variability on the continuous density function from each pair, triplet and sextuplet. Results from the above-mentioned statistical analysis have shown randomness of site selection, which has not allowed a reliable generalization of monitoring data to the entire selected territory, except for a single "forced" case (70%); most importantly, they suggest a possible procedure to optimize network design.
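    As an illustration of the test named above, a minimal Kruskal-Wallis H computation; it assumes all observations are distinct (so no tie correction is needed), and the benzene readings are hypothetical, not the study's measurements:

```python
# Kruskal-Wallis H for k independent samples, no tie correction.
def kruskal_wallis_h(*groups):
    """Rank all observations jointly, then compare rank sums per group."""
    pooled = sorted(x for g in groups for x in g)
    rank = {x: i + 1 for i, x in enumerate(pooled)}  # 1-based ranks
    n = len(pooled)
    s = sum(sum(rank[x] for x in g) ** 2 / len(g) for g in groups)
    return 12.0 / (n * (n + 1)) * s - 3 * (n + 1)

# Hypothetical benzene readings at three monitoring nodes of one triangle.
node_a = [1.2, 1.5, 1.9, 2.3, 2.8]
node_b = [1.3, 1.6, 2.0, 2.4, 2.9]
node_c = [4.1, 4.6, 5.2, 5.8, 6.3]

h = kruskal_wallis_h(node_a, node_b, node_c)
print(f"H = {h:.2f}")  # compare with the chi-squared(df=2) critical value 5.99
```

    Under the null hypothesis of identical distributions, H is approximately chi-squared with k-1 degrees of freedom, so H above 5.99 rejects equality of the three nodes at the 5% level.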

  10. Spectral analysis of tissues from patients with cancer using a portable spectroscopic diagnostic ratiometer unit

    NASA Astrophysics Data System (ADS)

    Sordillo, Laura A.; Pu, Yang; Sordillo, Peter P.; Budansky, Yury; Alfano, R. R.

    2014-05-01

    Spectral profiles of tissues from patients with breast carcinoma, malignant carcinoid and non-small cell lung carcinoma were acquired using native fluorescence spectroscopy. A novel spectroscopic ratiometer device (S3-LED) with selective excitation wavelengths at 280 nm and 335 nm was used to produce the emission spectra of the key biomolecules, tryptophan and NADH, in the tissue samples. In each of the samples, analysis of emission intensity peaks from biomolecules showed increased 340 nm/440 nm and 340 nm/460 nm ratios in the malignant samples compared to their paired normal samples. This most likely represented increased tryptophan to NADH ratios in the malignant tissue samples compared to their paired normal samples. Among the non-small cell lung carcinoma and breast carcinomas, it appeared that tumors of very large size or poor differentiation had an even greater increase in the 340 nm/440 nm and 340 nm/460 nm ratios. In the samples of malignant carcinoid, which is known to be a highly metabolically active tumor, a marked increase in these ratios was also seen.

  11. Maintenance or non-maintenance therapy in the treatment of advanced non-small cell lung cancer: that is the question.

    PubMed

    Galetta, D; Rossi, A; Pisconti, S; Millaku, A; Colucci, G

    2010-11-01

    Lung cancer is the most common cancer worldwide, with non-small cell lung cancer (NSCLC), including squamous carcinoma, adenocarcinoma and large cell carcinoma, accounting for about 85% of all lung cancer types, and most patients presenting with advanced disease at the time of diagnosis. In this setting, first-line platinum-based chemotherapy for no more than 4-6 cycles is recommended. After these cycles of treatment, non-progressing patients enter the so-called "watch and wait" period, in which no further therapy is administered until disease progression. In order to improve advanced NSCLC outcomes, the efficacy of further treatment during the "watch and wait" period was investigated: this is "maintenance therapy". Recently, the results of randomized phase III trials investigating two new agents, pemetrexed and erlotinib, in this setting led to their registration for maintenance therapy. Here, we report and discuss these results. Copyright © 2010 Elsevier Ltd. All rights reserved.

  12. The Development of the Control of Adult Instructions Over Non-Verbal Behavior.

    ERIC Educational Resources Information Center

    Van Duyne, H. John

    The purpose of the study was (1) to examine the results from a two-association perceptual-motor task as to their implications for Luria's theory about the development of verbal control of non-verbal behavior; (2) to explore the effects of various learning experiences upon this development. The sample consisted of 20 randomly selected children in…

  13. Effects of Teaching Gardening on Science Students' Attitudes toward Entrepreneurial Skills Acquisition in Jos South, Plateau State, Nigeria

    ERIC Educational Resources Information Center

    Charity, Dimlong; Ozoji, Bernadette Ebele; Osasebor, Florence Osaze; Ibn Umar, Suleiman

    2017-01-01

    This study investigated the effects of teaching gardening on science students' attitudes toward entrepreneurial skills acquisition in Jos South, Plateau State, Nigeria. The study employed the non-randomized pre-test post-test non-equivalent control group design. A sample of 75 senior secondary school students from two intact classes, randomly…

  14. Survey of Students and Non-Students about Continuing Education Market Place. Volume XXIV, Number 16.

    ERIC Educational Resources Information Center

    Lucas, John A.; And Others

    To evaluate the outreach and marketing efforts for its non-credit offerings, William Rainey Harper College in Illinois conducted a study of recent continuing education students in spring 1996. First, a random sample of 200 former students who had enrolled in continuing education courses in the past 5 years was surveyed, receiving 57 completed…

  15. Are Boys Discriminated in Swedish High Schools?

    ERIC Educational Resources Information Center

    Hinnerich, Bjorn Tyrefors; Hoglin, Erik; Johannesson, Magnus

    2011-01-01

    Girls typically have higher grades than boys in school and recent research suggests that part of this gender difference may be due to discrimination of boys in grading. We rigorously test this in a field experiment where a random sample of the same tests in the Swedish language is subject to blind and non-blind grading. The non-blind test score is…

  16. Discrimination against Students with Foreign Backgrounds: Evidence from Grading in Swedish Public High Schools

    ERIC Educational Resources Information Center

    Hinnerich, Bjorn Tyrefors; Höglin, Erik; Johannesson, Magnus

    2015-01-01

    We rigorously test for discrimination against students with foreign backgrounds in high school grading in Sweden. We analyse a random sample of national tests in the Swedish language graded both non-blindly by the student's own teacher and blindly without any identifying information. The increase in the test score due to non-blind grading is…

  17. Randomized Trial of Endobronchial Ultrasound-Guided Transbronchial Needle Aspiration With and Without Rapid On-site Evaluation for Lung Cancer Genotyping.

    PubMed

    Trisolini, Rocco; Cancellieri, Alessandra; Tinelli, Carmine; de Biase, Dario; Valentini, Ilaria; Casadei, Gianpiero; Paioli, Daniela; Ferrari, Franco; Gordini, Giovanni; Patelli, Marco; Tallini, Giovanni

    2015-12-01

    Experts and scientific society guidelines recommend that rapid on-site evaluation (ROSE) be used with endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA) to optimize lung cancer genotyping, but no comparative trial has been carried out to confirm and quantify its usefulness. To assess the influence of ROSE on the yield of EBUS-TBNA for a multigene molecular analysis of lung cancer samples, consecutive patients with suspected or known advanced lung cancer were randomized to undergo EBUS-TBNA without ROSE (EBUS arm) or with ROSE (ROSE arm). The primary end point was the rate of the successful accomplishment of the institution's clinical protocol for molecular profiling of nonsquamous non-small cell lung cancer (EGFR and KRAS testing, followed by ALK testing for tumors with EGFR and KRAS wild-type status). Complete genotyping was achieved in 108 of 126 patients (85.7%) (90.8% in the ROSE arm vs 80.3% in the EBUS arm, P = .09). The patients in the ROSE arm were less likely to have samples that could be used only for pathologic diagnosis because of minimal tumor burden (0 vs 6, P = .05), and were more likely to have the bronchoscopy terminated after a single biopsy site (58.9% vs 44.1%, P = .01). ROSE prevents the need for a repeat invasive diagnostic procedure aimed at molecular profiling in at least one out of 10 patients with advanced lung cancer and significantly reduces the risk of retrieving samples that can be used only for pathologic subtyping because of minimal tumor burden. ClinicalTrials.gov; No.: NCT01799382; URL: www.clinicaltrials.gov.

  18. Detection of Mycobacterium avium subspecies paratuberculosis in tie-stall dairy herds using a standardized environmental sampling technique and targeted pooled samples.

    PubMed

    Arango-Sabogal, Juan C; Côté, Geneviève; Paré, Julie; Labrecque, Olivia; Roy, Jean-Philippe; Buczinski, Sébastien; Doré, Elizabeth; Fairbrother, Julie H; Bissonnette, Nathalie; Wellemans, Vincent; Fecteau, Gilles

    2016-07-01

    Mycobacterium avium ssp. paratuberculosis (MAP) is the etiologic agent of Johne's disease, a chronic contagious enteritis of ruminants that causes major economic losses. Several studies, most involving large free-stall herds, have found environmental sampling to be a suitable method for detecting MAP-infected herds. In eastern Canada, where small tie-stall herds are predominant, certain conditions and management practices may influence the survival and transmission of MAP and its recovery (isolation). Our objective was to estimate the performance of a standardized environmental and targeted pooled sampling technique for the detection of MAP-infected tie-stall dairy herds. Twenty-four farms (19 MAP-infected and 5 non-infected) were enrolled, but only 20 were visited twice in the same year, to collect 7 environmental samples and 2 pooled samples (sick cows and cows with poor body condition). Concurrent individual sampling of all adult cows in the herds was also carried out. Isolation of MAP was achieved using the MGIT Para TB culture media and the BACTEC 960 detection system. Overall, MAP was isolated in 7% of the environmental cultures. The sensitivity of the environmental culture was 44% [95% confidence interval (CI): 20% to 70%] when combining results from 2 different herd visits and 32% (95% CI: 13% to 57%) when results from only 1 random herd visit were used. The best sampling strategy was to combine samples from the manure pit, gutter, sick cows, and cows with poor body condition. The standardized environmental sampling technique and the targeted pooled samples presented in this study are an alternative sampling strategy to costly individual cultures for detecting MAP-infected tie-stall dairies. Repeated samplings may improve the detection of MAP-infected herds.
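    A proportion CI like the herd-level sensitivities above can be sketched with the closed-form Wilson score interval. The paper does not state which interval method it used, and the counts below (8 detected herds out of 18 infected, roughly 44%) are hypothetical, chosen only to be consistent with the reported point estimate:

```python
# Wilson score interval for a binomial proportion (herd-level sensitivity).
# Counts are hypothetical; the interval method is an assumption.
import math

def wilson_ci(x, n, z=1.96):
    """Wilson score ~95% CI for the proportion x/n."""
    p = x / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return centre - half, centre + half

lo, hi = wilson_ci(8, 18)
print(f"sensitivity 8/18 = {8/18:.0%}, 95% CI {lo:.0%}-{hi:.0%}")
```

    With n this small the interval spans tens of percentage points, which is why the abstract's 20-70% CI is so wide and why repeated samplings help.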

  19. Organic cattle products: Authenticating production origin by analysis of serum mineral content.

    PubMed

    Rodríguez-Bermúdez, Ruth; Herrero-Latorre, Carlos; López-Alonso, Marta; Losada, David E; Iglesias, Roberto; Miranda, Marta

    2018-10-30

    An authentication procedure for differentiating between organic and non-organic cattle production on the basis of analysis of serum samples has been developed. For this purpose, the concentrations of fourteen mineral elements (As, Cd, Co, Cr, Cu, Fe, Hg, I, Mn, Mo, Ni, Pb, Se and Zn) in 522 serum samples from cows (341 from organic farms and 181 from non-organic farms), determined by inductively coupled plasma spectrometry, were used. The chemical information provided by serum analysis was employed to construct different pattern recognition classification models that predict the origin of each sample: organic or non-organic class. Among all classification procedures considered, the best results were obtained with the decision tree C5.0, Random Forest and AdaBoost neural networks, with hit levels close to 90% for both production types. The proposed method, involving analysis of serum samples, provided rapid, accurate in vivo classification of cattle according to organic and non-organic production type. Copyright © 2018 Elsevier Ltd. All rights reserved.
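    A minimal sketch of the kind of classifier reported above, assuming scikit-learn is available; the "serum mineral" features (here 4 of the 14 elements) and the class separation are invented for illustration, not the paper's measurements:

```python
# Hedged sketch: Random Forest on synthetic "serum mineral" profiles.
# Feature values and the organic/non-organic separation are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200  # cows per production type

# Two classes differing mainly in the 1st and 3rd element (invented).
organic = rng.normal(loc=[0.60, 1.10, 0.07, 0.90], scale=0.1, size=(n, 4))
non_organic = rng.normal(loc=[0.90, 1.10, 0.12, 0.90], scale=0.1, size=(n, 4))
X = np.vstack([organic, non_organic])
y = np.array([1] * n + [0] * n)  # 1 = organic

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)  # stratified 5-fold CV
print(f"mean CV accuracy: {scores.mean():.2f}")
```

    Cross-validated accuracy (rather than training accuracy) is the relevant analogue of the ~90% hit levels the paper reports per production type.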

  20. EBUS-TBNA Provides Highest RNA Yield for Multiple Biomarker Testing from Routinely Obtained Small Biopsies in Non-Small Cell Lung Cancer Patients - A Comparative Study of Three Different Minimal Invasive Sampling Methods

    PubMed Central

    Schmid-Bindert, Gerald; Wang, Yongsheng; Jiang, Hongbin; Sun, Hui; Henzler, Thomas; Wang, Hao; Pilz, Lothar R.; Ren, Shengxiang; Zhou, Caicun

    2013-01-01

    Background Multiple biomarker testing is necessary to facilitate individualized treatment of lung cancer patients. More than 80% of lung cancers are diagnosed based on very small tumor samples. Often there is not enough tissue for molecular analysis. We compared three minimal invasive sampling methods with respect to RNA quantity for molecular testing. Methods 106 small biopsies were prospectively collected by three different methods: forceps biopsy, endobronchial ultrasound (EBUS) guided transbronchial needle aspiration (TBNA), and CT-guided core biopsy. Samples were split into two halves. One part was formalin fixed and paraffin embedded for standard pathological evaluation. The other part was put in RNAlater for immediate RNA/DNA extraction. If the pathologist confirmed the diagnosis of non-small cell lung cancer (NSCLC), the following molecular markers were tested: EGFR mutation, ERCC1, RRM1 and BRCA1. Results Overall, RNA extraction was possible in 101 out of 106 patients (95.3%). We found 49% adenocarcinomas, 38% squamous carcinomas, and 14% not otherwise specified (NOS). The highest RNA yield came from endobronchial ultrasound guided needle aspiration, which was significantly higher than bronchoscopy (37.74±41.09 vs. 13.74±15.53 ng, respectively, P = 0.005) and numerically higher than CT core biopsy (37.74±41.09 vs. 28.72±44.27 ng, respectively, P = 0.244). EGFR mutation testing was feasible in 100% of evaluable patients, and its incidence was 40.8%, 7.9% and 14.3% in the adenocarcinoma, squamous carcinoma and NSCLC NOS subgroups, respectively. There was no difference in the feasibility of molecular testing between the three sampling methods, with feasibility rates for ERCC1, RRM1 and BRCA1 of 91%, 87% and 81%, respectively. Conclusion All three methods can provide sufficient tumor material for multiple biomarker testing from routinely obtained small biopsies in lung cancer patients. In our study, EBUS guided needle aspiration provided the highest amount of tumor RNA compared to bronchoscopy or CT guided core biopsy. Thus, EBUS should be considered as an acceptable option for tissue acquisition for molecular testing. PMID:24205040

  1. A novel approach to the simultaneous extraction and non-targeted analysis of the small molecules metabolome and lipidome using 96-well solid phase extraction plates with column-switching technology.

    PubMed

    Li, Yubo; Zhang, Zhenzhu; Liu, Xinyu; Li, Aizhu; Hou, Zhiguo; Wang, Yuming; Zhang, Yanjun

    2015-08-28

    This study combines solid phase extraction (SPE) using 96-well plates with column-switching technology to construct a rapid and high-throughput method for the simultaneous extraction and non-targeted analysis of the small-molecule metabolome and lipidome based on ultra-performance liquid chromatography quadrupole time-of-flight mass spectrometry. This study first investigated the columns and analytical conditions for the small-molecule metabolome and lipidome, separated on HSS T3 and BEH C18 columns, respectively. Next, the loading capacity and actuation duration of SPE were further optimized. Subsequently, SPE and column switching were used together to rapidly and comprehensively analyze the biological samples. The experimental results showed that the new analytical procedure had good precision and maintained sample stability (RSD<15%). The method was then satisfactorily applied to a wider analysis of the small-molecule metabolome and lipidome to test its throughput. The resulting method represents a new analytical approach for biological samples, and a highly useful tool for research in metabolomics and lipidomics. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Sparsely-synchronized brain rhythm in a small-world neural network

    NASA Astrophysics Data System (ADS)

    Kim, Sang-Yoon; Lim, Woochang

    2013-07-01

    Sparsely-synchronized cortical rhythms, associated with diverse cognitive functions, have been observed in electric recordings of brain activity. At the population level, cortical rhythms exhibit small-amplitude fast oscillations, while at the cellular level, individual neurons show stochastic firings sparsely at a much lower rate than the population rate. We study the effect of network architecture on sparse synchronization in an inhibitory population of subthreshold Morris-Lecar neurons (which cannot fire spontaneously without noise). Previously, sparse synchronization was found to occur for cases of both global coupling (i.e., regular all-to-all coupling) and random coupling. However, a real neural network is known to be non-regular and non-random. Here, we consider sparse Watts-Strogatz small-world networks which interpolate between a regular lattice and a random graph via rewiring. We start from a regular lattice with only short-range connections and then investigate the emergence of sparse synchronization by increasing the rewiring probability p for the short-range connections. For p = 0, the average synaptic path length between pairs of neurons becomes long; hence, only an unsynchronized population state exists because the global efficiency of information transfer is low. However, as p is increased, long-range connections begin to appear, and global effective communication between distant neurons may be available via shorter synaptic paths. Consequently, as p passes a threshold p_th (≈ 0.044), sparsely-synchronized population rhythms emerge. However, with increasing p, longer axon wirings become expensive because of their material and energy costs. At an optimal value p*_DE (≈ 0.24) of the rewiring probability, the ratio of the synchrony degree to the wiring cost is found to become maximal. In this way, an optimal sparse synchronization is found to occur at a minimal wiring cost in an economic small-world network through a trade-off between synchrony and wiring cost.
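    The rewiring effect described above (average path length dropping sharply as p grows while clustering decays more slowly) can be reproduced with NetworkX's Watts-Strogatz generator, assuming NetworkX is available; this sketch only illustrates the graph property, not the authors' Morris-Lecar simulation:

```python
# Sketch of small-world interpolation via rewiring probability p.
# Uses the connected variant so path lengths are always defined.
import networkx as nx

n, k = 100, 4  # 100 nodes, each wired to its 4 nearest neighbours

for p in (0.0, 0.05, 0.25, 1.0):
    g = nx.connected_watts_strogatz_graph(n, k, p, tries=100, seed=1)
    L = nx.average_shortest_path_length(g)  # mean path length
    C = nx.average_clustering(g)
    print(f"p={p:<4}  L={L:6.2f}  C={C:.2f}")
```

    The intermediate-p regime, with short paths but still-high clustering, is the "small-world" window in which the paper finds sparse synchronization at low wiring cost.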

  3. Substance P expression in the gingival tissue after upper third molar extraction: effect of ketoprofen, a preliminary study.

    PubMed

    Abbate, G M; Mangano, A; Sacerdote, P; Amodeo, G; Moschetti, G; Levrini, L

    2017-01-01

    The aim of this study was to evaluate substance P (SP) levels and the effect of a non-steroidal anti-inflammatory drug (NSAID), ketoprofen, on SP in the pericoronal gingival tissue after extraction of upper third molars. A sample of 20 young non-smoking systemically healthy adults of both sexes, with a healthy upper third molar to be extracted for orthodontic purposes, was selected. After extraction, a sample of the gingival tissue of the pericoronal region was collected with a sterile scalpel, placed into test tubes and kept frozen at -20°C until the SP determination. SP levels were determined by using a commercially available enzyme immunoassay (ELISA) kit. The subjects were randomly divided into two groups: group 1 received a single dose of ketoprofen 30 minutes prior to the experimental procedure, while group 2 did not receive any drug before extraction. The patients were asked to complete a diary on the postoperative pain. A relevant amount of SP was measured in all the gingival samples. No statistically significant difference could be detected in SP expression between the two groups. In group 1, pain appearance was significantly delayed (6.2±0.13 hours) in comparison with group 2 (3.95±0.2 hours). In this small selected group of subjects, and given the limited study design, preventive administration of ketoprofen did not significantly affect the gingival levels of SP; the emerging clinical recommendation is to administer NSAIDs postoperatively but before pain onset in order to optimize the patient's pain management.

  4. Social network recruitment for Yo Puedo - an innovative sexual health intervention in an underserved urban neighborhood: sample and design implications

    PubMed Central

    Minnis, Alexandra M.; vanDommelen-Gonzalez, Evan; Luecke, Ellen; Cheng, Helen; Dow, William; Bautista-Arredondo, Sergio; Padian, Nancy S.

    2016-01-01

    Most existing evidence-based sexual health interventions focus on individual-level behavior, even though there is substantial evidence that highlights the influential role of social environments in shaping adolescents’ behaviors and reproductive health outcomes. We developed Yo Puedo, a combined conditional cash transfer (CCT) and life skills intervention for youth to promote educational attainment, job training, and reproductive health wellness that we then evaluated for feasibility among 162 youth aged 16–21 years in a predominantly Latino community in San Francisco, CA. The intervention targeted youth’s social networks and involved recruitment and randomization of small social network clusters. In this paper we describe the design of the feasibility study and report participants’ baseline characteristics. Furthermore, we examined the sample and design implications of recruiting social network clusters as the unit of randomization. Baseline data provide evidence that we successfully enrolled high risk youth using a social network recruitment approach in community and school-based settings. Nearly all participants (95%) were high risk for adverse educational and reproductive health outcomes based on multiple measures of low socioeconomic status (81%) and/or reported high risk behaviors (e.g., gang affiliation, past pregnancy, recent unprotected sex, frequent substance use) (62%). We achieved variability in the study sample through heterogeneity in recruitment of the index participants, whereas the individuals within the small social networks of close friends demonstrated substantial homogeneity across sociodemographic and risk profile characteristics. Social networks recruitment was feasible and yielded a sample of high risk youth willing to enroll in a randomized study to evaluate a novel sexual health intervention. PMID:25358834

  5. Successful treatment with erlotinib of severe neutropenia induced by gefitinib in a patient with advanced non-small cell lung cancer.

    PubMed

    Araya, Tomoyuki; Kasahara, Kazuo; Demura, Yoshiki; Matsuoka, Hiroki; Nishitsuji, Masaru; Nishi, Koichi

    2013-06-01

    Neutropenia is a rare side effect of gefitinib and was scarcely reported in the many large-scale randomized phase III trials using gefitinib monotherapy as first-line treatment. A 77-year-old female was referred to our institution due to an abnormal shadow of the right lung, diagnosed by CT scan and biopsy histopathology as adenocarcinoma of the lung (cT3N1M1b). Mutation analysis of tumor DNA samples with a PCR-Invader assay revealed a short in-frame deletion in exon 19. Based on the diagnosis, first-line treatment was initiated using oral gefitinib (250 mg, daily). During the initial 27 days of gefitinib therapy, the only side effect was a mild skin rash. After 28 days, there was marked tumor shrinkage, indicative of a partial response to gefitinib; however, grade 4 neutropenia was also detected. The patient was switched to oral erlotinib monotherapy (150 mg/day) as second-line chemotherapy with careful monitoring of neutropenia. Discontinuation of gefitinib, without the need for granulocyte colony-stimulating factor support, allowed the neutrophil and leukocyte counts to recover to normal by day 47. The patient continued oral erlotinib for more than 9 months, and there has been no evidence of neutropenia, leukopenia, or disease progression. Clinicians should be aware that gefitinib-induced neutropenia in patients with non-small cell lung cancer can be treated successfully by switching to erlotinib. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  6. Traditional Chinese Medicine treatment as maintenance therapy in advanced non-small-cell lung cancer: A randomized controlled trial.

    PubMed

    Jiang, Yi; Liu, Ling-Shuang; Shen, Li-Ping; Han, Zhi-Fen; Jian, Hong; Liu, Jia-Xiang; Xu, Ling; Li, He-Gen; Tian, Jian-Hui; Mao, Zhu-Jun

    2016-02-01

    Maintenance therapy for patients with advanced non-small-cell lung cancer (NSCLC) is an increasingly hot topic in the field of clinical NSCLC research. This study aimed to evaluate the effects of Traditional Chinese Medicine (TCM) treatment as maintenance therapy on time to progression (TTP), quality of life (QOL), overall survival (OS) and 1-year survival rate in patients with advanced NSCLC. This study was conducted as a randomized, controlled, open-label trial. 64 non-progressive patients who responded to initial therapy were randomized 1:1 to the TCM arm (treated with herbal injection (Cinobufacini, 20 ml/d, d1-d10), herbal decoction (d1-d21) and Chinese acupoint application (d1-d21), n=32) or to the chemotherapy arm (treated with pemetrexed (non-squamous NSCLC, 500 mg/m(2), d1), docetaxel (75 mg/m(2), d1) or gemcitabine (1250 mg/m(2), d1 and d8), n=32). Each therapy cycle was 21 days. Cycles were repeated until disease progression, unacceptable toxicity, or the patient's request to discontinue therapy. The primary end point was TTP; the secondary end points were QOL, OS and 1-year survival rate. The intention-to-treat analysis included all randomized participants. TCM treatment prolonged median TTP by 0.7 months compared with chemotherapy, but the difference was not statistically significant (3.0 months vs. 2.3 months, P=0.114). Median OS time for TCM treatment did not offer a significant advantage over chemotherapy (21.5 months vs. 18.8 months, P=0.601). The 1-year survival rate of TCM treatment was significantly higher than that of chemotherapy (78.1% vs. 53.1%, P=0.035). TCM treatment significantly improved QOL compared to chemotherapy, as assessed by the EORTC QLQ-C30 and EORTC QLQ-LC13 QOL instruments. TCM maintenance treatment had effects on TTP and OS similar to those of maintenance chemotherapy, but it improved patients' QOL and achieved a higher 1-year survival rate. TCM maintenance treatment is a promising option for advanced NSCLC patients without progression following first-line chemotherapy. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Computerized cognitive training in children and adolescents with attention deficit/hyperactivity disorder as add-on treatment to stimulants: feasibility study and protocol description.

    PubMed

    Rosa, Virginia de Oliveira; Schmitz, Marcelo; Moreira-Maia, Carlos Roberto; Wagner, Flavia; Londero, Igor; Bassotto, Caroline de Fraga; Moritz, Guilherme; de Souza, Caroline Dos Santos; Rohde, Luis Augusto Paim

    2017-01-01

    Cognitive training has received increasing attention as a non-pharmacological approach for the treatment of attention deficit/hyperactivity disorder (ADHD) in children and adolescents. Few studies have assessed cognitive training as add-on treatment to medication in randomized placebo-controlled trials. The purpose of this preliminary study was to explore the feasibility of implementing a computerized cognitive training program for ADHD in our environment, describe its main characteristics and explore its potential efficacy in a small pilot study. Six ADHD patients aged 10-12 years receiving stimulants and presenting residual symptoms were enrolled in a randomized clinical trial of either a standard cognitive training program or a controlled placebo condition for 12 weeks. The primary outcome was core ADHD symptoms measured using the Swanson, Nolan and Pelham Questionnaire (SNAP-IV scale). We faced higher resistance than expected to patient enrollment, due to logistic issues in attending face-to-face sessions in the hospital and to meeting the requirements regarding medication status and absence of some comorbidities. Both groups showed a decrease in parent-reported ADHD symptoms, without statistical difference between them. In addition, improvements on neuropsychological tests were observed in both groups, mainly on trained tasks. This protocol revealed the need for new strategies to better assess the effectiveness of cognitive training, such as implementing the intervention in a school environment to obtain an assessment with more external validity. Given the small sample size of this pilot study, definitive conclusions on the effects of cognitive training as add-on treatment to stimulants would be premature.

  8. Exposure to celebrity-endorsed small cigar promotions and susceptibility to use among young adult cigarette smokers.

    PubMed

    Sterling, Kymberle L; Moore, Roland S; Pitts, Nicole; Duong, Melissa; Ford, Kentya H; Eriksen, Michael P

    2013-01-01

    Small cigar smoking among young adult cigarette smokers may be attributed to their exposure to its advertisements and promotions. We examined the association between exposure to a celebrity music artist's endorsement of a specific brand of small cigars and young adult cigarette smokers' susceptibility to smoking that brand. Venue-based sampling procedures were used to select and survey a random sample of 121 young adult cigarette smokers, aged 18-35. Fourteen percent reported exposure to the artist's endorsement of the small cigar, and 45.4% reported an intention to smoke the product in the future. The odds of small cigar smoking susceptibility increased threefold for those who reported exposure to the endorsement compared to those not exposed (OR = 3.64, 95% CI 1.06 to 12.54). Past 30-day small cigar use (OR = 3.30, 95% CI 1.24 to 8.74) and past 30-day cigar use (OR = 5.08, 95% CI 1.23 to 21.08) were also associated with susceptibility to smoke a small cigar. An association between young adult cigarette smokers' exposure to the music artist's small cigar endorsement and their susceptibility to smoke small cigars was found. This association underscores the importance of monitoring small cigar promotions geared toward young people and their impact on small cigar smoking.

  9. Cluster-Randomized Trial to Increase Hepatitis B Testing among Koreans in Los Angeles

    PubMed Central

    Bastani, Roshan; Glenn, Beth A.; Maxwell, Annette E.; Jo, Angela M.; Herrmann, Alison K.; Crespi, Catherine M.; Wong, Weng K.; Chang, L. Cindy; Stewart, Susan L.; Nguyen, Tung T.; Chen, Moon S.; Taylor, Victoria M.

    2015-01-01

    Background In the United States, Korean immigrants experience a disproportionately high burden of chronic hepatitis B (HBV) viral infection and associated liver cancer compared to the general population. However, despite clear clinical guidelines, HBV serologic testing among Koreans remains persistently sub-optimal. Methods We conducted a cluster-randomized trial to evaluate a church-based small group intervention to improve HBV testing among Koreans in Los Angeles. Fifty-two Korean churches, stratified by size (small, medium, large) and location (Koreatown versus other), were randomized to intervention or control conditions. Intervention church participants attended a single-session small-group discussion on liver cancer and HBV testing and control church participants attended a similar session on physical activity and nutrition. Outcome data consisted of self-reported HBV testing obtained via 6-month telephone follow-up interviews. Results We recruited 1123 individuals, 18-64 years of age, across the 52 churches. Ninety-two percent of the sample attended the assigned intervention session and 86% completed the 6-month follow-up. Sample characteristics included: mean age 46 years, 65% female, 97% born in Korea, 69% completed some college, and 43% insured. In an intent-to-treat analysis, the intervention produced a statistically significant effect (OR = 4.9, p < .001), with 19% of intervention and 6% of control group participants reporting a HBV test. Conclusion Our intervention was successful in achieving a large and robust effect in a population at high risk of HBV infection and sequelae. Impact The intervention was fairly resource efficient and thus has high potential for replication in other high-risk Asian groups. PMID:26104909

  10. Small studies may overestimate the effect sizes in critical care meta-analyses: a meta-epidemiological study

    PubMed Central

    2013-01-01

    Introduction Small-study effects refer to the fact that trials with limited sample sizes are more likely to report larger beneficial effects than large trials. However, this has never been investigated in critical care medicine. Thus, the present study aimed to examine the presence and extent of small-study effects in critical care medicine. Methods Critical care meta-analyses involving randomized controlled trials that reported mortality as an outcome measure were considered eligible for the study. Component trials were classified as large (≥100 patients per arm) or small (<100 patients per arm) according to their sample sizes. The ratio of odds ratios (ROR) was calculated for each meta-analysis, and the RORs were then combined using a meta-analytic approach. An ROR <1 indicated a larger beneficial effect in small trials. Small and large trials were compared on methodological quality, including sequence generation, blinding, allocation concealment, intention to treat and sample size calculation. Results A total of 27 critical care meta-analyses involving 317 trials were included. Of these, five meta-analyses showed statistically significant RORs <1, while the remaining meta-analyses did not reach statistical significance. Overall, the pooled ROR was 0.60 (95% CI: 0.53 to 0.68); the heterogeneity was moderate, with an I2 of 50.3% (chi-squared = 52.30; P = 0.002). Large trials showed significantly better reporting quality than small trials in terms of sequence generation, allocation concealment, blinding, intention to treat, sample size calculation and incomplete follow-up data. Conclusions Small trials are more likely to report larger beneficial effects than large trials in critical care medicine, which could be partly explained by the lower methodological quality of small trials. Caution should be exercised in the interpretation of meta-analyses involving small trials. PMID:23302257
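    The ratio-of-odds-ratios comparison described above can be sketched in a few lines. This is a minimal illustration, not the study's analysis: the 2x2 mortality tables below are invented, and a simple fixed-effect inverse-variance pool stands in for the full meta-analytic approach used in the paper.

    ```python
    import math

    def log_or(a, b, c, d):
        """Log odds ratio and its standard error from a 2x2 table
        (events/non-events in the treatment arm, then the control arm)."""
        lor = math.log((a * d) / (b * c))
        se = math.sqrt(1/a + 1/b + 1/c + 1/d)
        return lor, se

    def pooled(lors_ses):
        """Fixed-effect inverse-variance pooling of log odds ratios."""
        w = [1 / se**2 for _, se in lors_ses]
        est = sum(wi * lor for wi, (lor, _) in zip(w, lors_ses)) / sum(w)
        return est, math.sqrt(1 / sum(w))

    # Illustrative mortality tables (events, non-events) per arm.
    small_trials = [(8, 42, 14, 36), (5, 20, 9, 16)]          # <100 patients/arm
    large_trials = [(60, 440, 72, 428), (55, 445, 60, 440)]   # >=100 patients/arm

    lor_small, se_small = pooled([log_or(*t) for t in small_trials])
    lor_large, se_large = pooled([log_or(*t) for t in large_trials])

    # Ratio of odds ratios: ROR < 1 means the small trials report a
    # larger beneficial (mortality-reducing) effect than the large ones.
    log_ror = lor_small - lor_large
    se_ror = math.sqrt(se_small**2 + se_large**2)
    ror = math.exp(log_ror)
    ```

    With these invented tables the small trials show a stronger mortality reduction, so the ROR comes out below 1, mirroring the direction of the pooled ROR of 0.60 reported above.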

  11. Statistical theory of nucleation in the presence of uncharacterized impurities

    NASA Astrophysics Data System (ADS)

    Sear, Richard P.

    2004-08-01

    First order phase transitions proceed via nucleation. The rate of nucleation varies exponentially with the free-energy barrier to nucleation, and so is highly sensitive to variations in this barrier. In practice, very few systems are absolutely pure; there are typically some impurities present which are rather poorly characterized. These interact with the nucleus, causing the barrier to vary, and so must be taken into account. Here the impurity-nucleus interactions are modelled by random variables. The rate then has the same form as the partition function of Derrida’s random energy model, and as in that model there is a regime in which the behavior is non-self-averaging. Non-self-averaging nucleation is nucleation with a rate that varies significantly from one realization of the random variables to another. In experiment this corresponds to variation in the nucleation rate from one sample to another. General analytic expressions are obtained for the crossover from a self-averaging to a non-self-averaging rate of nucleation.
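    The self-averaging versus non-self-averaging distinction can be illustrated numerically. This is a hedged toy computation, not the paper's model: interaction energies are drawn as standard normals, a rate-like quantity is formed as the average of Boltzmann-style factors exp(beta*E), and the coupling parameter beta is hypothetical.

    ```python
    import math
    import random
    import statistics

    random.seed(0)

    def relative_spread(beta, n_sites=200, n_realizations=200):
        """Sample-to-sample relative fluctuation of a toy rate.

        Each realization draws n_sites random interaction energies and
        forms a rate-like quantity as the average of exp(beta * E); the
        returned value is std/mean of that quantity across realizations."""
        rates = []
        for _ in range(n_realizations):
            energies = [random.gauss(0.0, 1.0) for _ in range(n_sites)]
            rates.append(statistics.fmean(math.exp(beta * e) for e in energies))
        return statistics.stdev(rates) / statistics.fmean(rates)

    weak = relative_spread(beta=0.5)    # weak coupling: self-averaging
    strong = relative_spread(beta=3.0)  # strong coupling: non-self-averaging

    # With weak coupling the rate barely varies between realizations;
    # with strong coupling a few extreme energies dominate the average,
    # so the rate differs markedly from one realization (sample) to the next.
    ```

    The same mechanism drives the Derrida random energy model analogy above: as the effective coupling grows, the sum becomes dominated by its extreme terms and stops self-averaging.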

  12. Process to Selectively Distinguish Viable from Non-Viable Bacterial Cells

    NASA Technical Reports Server (NTRS)

    LaDuc, Myron T.; Bernardini, Jame N.; Stam, Christina N.

    2010-01-01

    The combination of ethidium monoazide (EMA) and post-fragmentation, randomly primed DNA amplification technologies will enhance the analytical capability to discern viable from non-viable bacterial cells in spacecraft-related samples. Intercalating agents have been widely used since the inception of molecular biology to stain and visualize nucleic acids. Only recently have intercalating agents such as EMA been exploited to selectively distinguish viable from dead bacterial cells. Intercalating dyes can only penetrate the membranes of dead cells. Once through the membrane and inside the cell, they intercalate DNA and, upon photolysis with visible light, produce stable DNA monoadducts. Once the DNA is crosslinked, it becomes insoluble and unable to be fragmented for post-fragmentation, randomly primed DNA library formation. The DNA of viable organisms remains unaffected by the intercalating agents, allowing for amplification via post-fragmentation, randomly primed technologies. This results in the ability to carry out downstream nucleic acid-based analyses on viable microbes to the exclusion of all non-viable cells.

  13. ESDA®-Lite collection of DNA from latent fingerprints on documents.

    PubMed

    Plaza, Dane T; Mealy, Jamia L; Lane, J Nicholas; Parsons, M Neal; Bathrick, Abigail S; Slack, Donia P

    2015-05-01

    The ability to detect and non-destructively collect biological samples for DNA processing would benefit the forensic community by preserving the physical integrity of evidentiary items for more thorough evaluations by other forensic disciplines. The Electrostatic Detection Apparatus (ESDA®) was systematically evaluated for its ability to non-destructively collect DNA from latent fingerprints deposited on various paper substrates for short tandem repeat (STR) DNA profiling. Fingerprints were deposited on a variety of paper substrates that included resume paper, cotton paper, magazine paper, currency, copy paper, and newspaper. Three DNA collection techniques were performed: ESDA collection, dry swabbing, and substrate cutting. Efficacy of each collection technique was evaluated by the quantity of DNA present in each sample and the percent profile generated by each sample. Both the ESDA and dry swabbing non-destructive sampling techniques outperformed the destructive methodology of substrate cutting. A greater number of full profiles were generated from samples collected with the non-destructive dry swabbing collection technique than were generated from samples collected with the ESDA; however, the ESDA also allowed the user to visualize the area of interest while non-destructively collecting the biological material. The ability to visualize the biological material made sampling straightforward and eliminated the need for numerous, random swabbings/cuttings. Based on these results, the evaluated non-destructive ESDA collection technique has great potential for real-world forensic implementation. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  14. Introductory Statistics Students' Conceptual Understanding of Study Design and Conclusions

    NASA Astrophysics Data System (ADS)

    Fry, Elizabeth Brondos

    Recommended learning goals for students in introductory statistics courses include the ability to recognize and explain the key role of randomness in designing studies and in drawing conclusions from those studies involving generalizations to a population or causal claims (GAISE College Report ASA Revision Committee, 2016). The purpose of this study was to explore introductory statistics students' understanding of the distinct roles that random sampling and random assignment play in study design and the conclusions that can be made from each. A study design unit lasting two and a half weeks was designed and implemented in four sections of an undergraduate introductory statistics course based on modeling and simulation. The research question that this study attempted to answer is: How does introductory statistics students' conceptual understanding of study design and conclusions (in particular, unbiased estimation and establishing causation) change after participating in a learning intervention designed to promote conceptual change in these areas? In order to answer this research question, a forced-choice assessment called the Inferences from Design Assessment (IDEA) was developed as a pretest and posttest, along with two open-ended assignments, a group quiz and a lab assignment. Quantitative analysis of IDEA results and qualitative analysis of the group quiz and lab assignment revealed that overall, students' mastery of study design concepts significantly increased after the unit, and the great majority of students successfully made the appropriate connections between random sampling and generalization, and between random assignment and causal claims. However, a small, but noticeable portion of students continued to demonstrate misunderstandings, such as confusion between random sampling and random assignment.

  15. Feasibility and Integrity of a Parent-Teacher Consultation Intervention for ADHD Students

    ERIC Educational Resources Information Center

    Murray, Desiree W.; Rabiner, David; Schulte, Ann; Newitt, Kristy

    2008-01-01

    This study examined the feasibility and integrity of a daily report card (DRC) intervention in a small sample of randomly assigned elementary students with previously diagnosed ADHD and classroom impairment. In order to enhance implementation, a conjoint behavioral consultation approach was used in which parents were engaged as active participants…

  16. Implications of Small Samples for Generalization: Adjustments and Rules of Thumb

    ERIC Educational Resources Information Center

    Tipton, Elizabeth; Hallberg, Kelly; Hedges, Larry V.; Chan, Wendy

    2015-01-01

    Policy-makers are frequently interested in understanding how effective a particular intervention may be for a specific (and often broad) population. In many fields, particularly education and social welfare, the ideal form of these evaluations is a large-scale randomized experiment. Recent research has highlighted that sites in these large-scale…

  17. Small area estimation (SAE) model: Case study of poverty in West Java Province

    NASA Astrophysics Data System (ADS)

    Suhartini, Titin; Sadik, Kusman; Indahwati

    2016-02-01

    This paper compared direct estimation with an indirect/Small Area Estimation (SAE) model. Model selection addressed multicollinearity among the auxiliary variables, either by retaining only non-collinear variables or by applying principal components (PC). The parameters of interest were the area-level proportions of agricultural-venture poor households and of agricultural poor households in West Java Province, which can be estimated either directly or via SAE. Direct estimation was problematic: because of small sample sizes, three areas had zero sampled cases and could not be estimated directly. The estimated proportion of agricultural-venture poor households was 19.22% and that of agricultural poor households was 46.79%. The best model for agricultural-venture poor households retained only non-collinear variables, while the best model for agricultural poor households used PC. SAE outperformed direct estimation for both proportions, showing that the small area estimation method overcomes small sample sizes and yields area-level estimates with higher accuracy and better precision than the direct estimator.
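    The idea of borrowing strength in small area estimation can be sketched with a composite, Fay-Herriot-style estimator. This is a generic illustration under assumed variances, not the model fitted in the paper; the numbers below are hypothetical.

    ```python
    def composite_estimate(direct, synthetic, var_sampling, var_model):
        """Shrink a noisy direct survey estimate toward a model-based
        synthetic estimate; the weight gamma on the direct estimate grows
        as it becomes more reliable (smaller sampling variance)."""
        gamma = var_model / (var_model + var_sampling)
        return gamma * direct + (1.0 - gamma) * synthetic

    # An area with a large sample trusts its direct estimate ...
    reliable = composite_estimate(0.19, 0.30, var_sampling=0.0004, var_model=0.01)
    # ... while an area with a tiny sample is pulled toward the model,
    # which is how SAE produces usable estimates even where the direct
    # estimate is unstable or unavailable.
    sparse = composite_estimate(0.19, 0.30, var_sampling=0.04, var_model=0.01)
    ```

    In the zero-sample areas mentioned above, the sampling variance is effectively infinite, so the composite collapses entirely onto the model-based synthetic estimate.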

  18. Perceptions of coping with non-disease-related life stress for women with osteoarthritis: a qualitative analysis.

    PubMed

    Harris, Melissa L; Byles, Julie E; Townsend, Natalie; Loxton, Deborah

    2016-05-17

    Coping with arthritis-related stress has been extensively studied. However, limited evidence exists regarding coping with stress extraneous to the disease (life stress). This study explored life stress and coping in a subset of older women with osteoarthritis from a larger longitudinal study. The setting was an Australian regional university. This qualitative study involved semistructured telephone interviews. Potential participants were mailed a letter of invitation/participant information statement by the Australian Longitudinal Study on Women's Health (ALSWH). Invitations were sent out in small batches (primarily 10). Interviews were conducted until data saturation was achieved using a systematic process (n=19). Digitally recorded interviews were transcribed verbatim and deidentified. Data were thematically analysed. Women who indicated being diagnosed or treated for arthritis in the previous 3 years in the fifth survey of the ALSWH (conducted in 2007) provided the sampling frame. Potential participants were randomly sampled by a blinded data manager using a random number generator. Coping with life stress involved both attitudinal coping processes developed early in life (ie, stoicism) and transient cognitive and support-based responses. Women also described a dualistic process involving a reduction in the ability to cope with ongoing stress over time, coupled with personal growth. This is the first study to examine how individuals cope with non-arthritis-related stress. The findings add to the current understanding of stress and coping, and have implications regarding the prevention of arthritis in women. Importantly, this study highlighted the potential detrimental impact of persistent coping patterns developed early in life. Public health campaigns aimed at stress mitigation and facilitation of adaptive coping mechanisms in childhood and adolescence may assist with arthritis prevention. Published by the BMJ Publishing Group Limited. 
For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  19. A stochastic-geometric model of soil variation in Pleistocene patterned ground

    NASA Astrophysics Data System (ADS)

    Lark, Murray; Meerschman, Eef; Van Meirvenne, Marc

    2013-04-01

    In this paper we examine the spatial variability of soil in parent material with complex spatial structure which arises from complex non-linear geomorphic processes. We show that this variability can be better-modelled by a stochastic-geometric model than by a standard Gaussian random field. The benefits of the new model are seen in the reproduction of features of the target variable which influence processes like water movement and pollutant dispersal. Complex non-linear processes in the soil give rise to properties with non-Gaussian distributions. Even under a transformation to approximate marginal normality, such variables may have a more complex spatial structure than the Gaussian random field model of geostatistics can accommodate. In particular the extent to which extreme values of the variable are connected in spatially coherent regions may be misrepresented. As a result, for example, geostatistical simulation generally fails to reproduce the pathways for preferential flow in an environment where coarse infill of former fluvial channels or coarse alluvium of braided streams creates pathways for rapid movement of water. Multiple point geostatistics has been developed to deal with this problem. Multiple point methods proceed by sampling from a set of training images which can be assumed to reproduce the non-Gaussian behaviour of the target variable. The challenge is to identify appropriate sources of such images. In this paper we consider a mode of soil variation in which the soil varies continuously, exhibiting short-range lateral trends induced by local effects of the factors of soil formation which vary across the region of interest in an unpredictable way. The trends in soil variation are therefore only apparent locally, and the soil variation at regional scale appears random. We propose a stochastic-geometric model for this mode of soil variation called the Continuous Local Trend (CLT) model. 
We consider a case study of soil formed in relict patterned ground with pronounced lateral textural variations arising from the presence of infilled ice-wedges of Pleistocene origin. We show how knowledge of the pedogenetic processes in this environment, along with some simple descriptive statistics, can be used to select and fit a CLT model for the apparent electrical conductivity (ECa) of the soil. We use the model to simulate realizations of the CLT process, and compare these with realizations of a fitted Gaussian random field. We show how statistics that summarize the spatial coherence of regions with small values of ECa, which are expected to have coarse texture and so larger saturated hydraulic conductivity, are better reproduced by the CLT model than by the Gaussian random field. This suggests that the CLT model could be used to generate an unlimited supply of training images to allow multiple point geostatistical simulation or prediction of this or similar variables.

  20. Experiences of being a control group: lessons from a UK-based randomized controlled trial of group singing as a health promotion initiative for older people.

    PubMed

    Skingley, Ann; Bungay, Hilary; Clift, Stephen; Warden, June

    2014-12-01

    Existing randomized controlled trials within the health field suggest that the concept of randomization is not always well understood and that feelings of disappointment may occur when participants are not placed in their preferred arm. This may affect a study's rigour and ethical integrity if not addressed. We aimed to test whether these issues apply to a healthy volunteer sample within a health promotion trial of singing for older people. Written comments from control group participants at two points during the trial were analysed, together with individual semi-structured interviews with a small sample (n = 11) of this group. We found that motivation to participate in the trial was largely due to the appeal of singing and disappointment resulted from allocation to the control group. Understanding of randomization was generally good and feelings of disappointment lessened over time and with a post-research opportunity to sing. Findings suggest that measures should be put in place to minimize the potential negative impacts of randomized controlled trials in health promotion research. © The Author (2013). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  1. Energy contents of frequently ordered restaurant meals and comparison with human energy requirements and US Department of Agriculture database information: a multisite randomized study

    USDA-ARS?s Scientific Manuscript database

    BACKGROUND: Excess energy intake from meals consumed away from home is implicated as a major contributor to obesity, and ~50% of US restaurants are individual or small-chain (non-chain) establishments that do not provide nutrition information. OBJECTIVE: To measure the energy content of frequently o...

  2. Estimation After a Group Sequential Trial.

    PubMed

    Milanzi, Elasma; Molenberghs, Geert; Alonso, Ariel; Kenward, Michael G; Tsiatis, Anastasios A; Davidian, Marie; Verbeke, Geert

    2015-10-01

    Group sequential trials are one important instance of studies for which the sample size is not fixed a priori but rather takes one of a finite set of pre-specified values, dependent on the observed data. Much work has been devoted to the inferential consequences of this design feature. Molenberghs et al (2012) and Milanzi et al (2012) reviewed and extended the existing literature, focusing on a collection of seemingly disparate, but related, settings, namely completely random sample sizes, group sequential studies with deterministic and random stopping rules, incomplete data, and random cluster sizes. They showed that the ordinary sample average is a viable option for estimation following a group sequential trial, for a wide class of stopping rules and for random outcomes with a distribution in the exponential family. Their results are somewhat surprising in the sense that the sample average is not optimal, and further, there does not exist an optimal, or even unbiased, linear estimator. However, the sample average is asymptotically unbiased, both conditionally upon the observed sample size and marginalized over it. By exploiting ignorability they showed that the sample average is the conventional maximum likelihood estimator. They also showed that a conditional maximum likelihood estimator is finite sample unbiased, but is less efficient than the sample average and has a larger mean squared error. Asymptotically, the sample average and the conditional maximum likelihood estimator are equivalent. This previous work is restricted, however, to the situation in which the random sample size can take only two values, N = n or N = 2n. In this paper, we consider the more practically useful setting of sample sizes in the finite set {n1, n2, …, nL}. It is shown that the sample average is then a justifiable estimator, in the sense that it follows from joint likelihood estimation, and it is consistent and asymptotically unbiased. 
We also show why simulations can give the false impression of bias in the sample average when considered conditional upon the sample size. The consequence is that no corrections need to be made to estimators following sequential trials. When small-sample bias is of concern, the conditional likelihood estimator provides a relatively straightforward modification to the sample average. Finally, it is shown that classical likelihood-based standard errors and confidence intervals can be applied, obviating the need for technical corrections.
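    The conditional-versus-marginal bias point can be illustrated with a toy simulation. This is a sketch under invented parameters (a one-sided stopping rule, interim looks at 10 and 20 observations, true mean zero), not the estimators or stopping rules analyzed in the paper.

    ```python
    import random
    import statistics

    random.seed(1)

    def one_trial(mu=0.0, n1=10, n2=20, threshold=0.3):
        """One hypothetical group sequential trial: stop at the interim
        look (n1 observations) if the interim mean exceeds `threshold`,
        otherwise continue to n2. Returns (sample average, stopped early)."""
        x = [random.gauss(mu, 1.0) for _ in range(n2)]
        interim = statistics.fmean(x[:n1])
        if interim > threshold:
            return interim, True
        return statistics.fmean(x), False

    results = [one_trial() for _ in range(20000)]
    marginal = statistics.fmean(m for m, _ in results)
    early = statistics.fmean(m for m, stop in results if stop)

    # Conditional on early stopping, the sample average is badly biased
    # (every early-stop mean exceeds the threshold by construction);
    # the marginal bias, averaged over the stopping rule, is far smaller
    # and shrinks as the sample sizes grow.
    ```

    The gap between `early` and `marginal` is exactly the kind of conditional artifact that can give simulations the false impression of a large bias in the sample average when results are examined per observed sample size.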

  3. Standardized Uptake Decrease on [18F]-Fluorodeoxyglucose Positron Emission Tomography After Neoadjuvant Chemotherapy Is a Prognostic Classifier for Long-Term Outcome After Multimodality Treatment: Secondary Analysis of a Randomized Trial for Resectable Stage IIIA/B Non-Small-Cell Lung Cancer.

    PubMed

    Pöttgen, Christoph; Gauler, Thomas; Bellendorf, Alexander; Guberina, Maja; Bockisch, Andreas; Schwenzer, Nina; Heinzelmann, Frank; Cordes, Sebastian; Schuler, Martin H; Welter, Stefan; Stamatis, Georgios; Friedel, Godehard; Darwiche, Kaid; Jöckel, Karl-Heinz; Eberhardt, Wilfried; Stuschke, Martin

    2016-07-20

    A confirmatory analysis was performed to determine the prognostic value of metabolic response during induction chemotherapy followed by bimodality/trimodality treatment of patients with operable locally advanced non-small-cell lung cancer. Patients with potentially operable stage IIIA(N2) or selected stage IIIB non-small-cell lung cancer received three cycles of cisplatin/paclitaxel (induction chemotherapy) followed by neoadjuvant radiochemotherapy (RCT) to 45 Gy (1.5 Gy twice per day concurrent cisplatin/vinorelbine) within the ESPATUE (Phase III Study of Surgery Versus Definitive Concurrent Chemoradiotherapy Boost in Patients With Resectable Stage IIIA[N2] and Selected IIIB Non-Small-Cell Lung Cancer After Induction Chemotherapy and Concurrent Chemoradiotherapy) trial. Positron emission tomography scans were recommended before (t0) and after (t2) induction chemotherapy. Patients who were eligible for surgery after neoadjuvant RCT were randomly assigned to definitive RCT or surgery. The prognostic value of the percentage of maximum standardized uptake value (%SUVmax) remaining in the primary tumor after induction chemotherapy, defined as %SUVremaining = SUVmax(t2)/SUVmax(t0), was assessed by proportional hazard analysis and receiver operating characteristic analysis. Overall, 161 patients were randomly assigned (155 from the Essen and Tübingen centers), and 124 of these received positron emission tomography scans at t0 and t2. %SUVremaining as a continuous variable was prognostic for the three end points of overall survival, progression-free survival, and freedom from extracerebral progression in univariable and multivariable analysis (P < .016). The respective hazard ratios per 50% increase in %SUVremaining from multivariable analysis were 2.3 (95% CI, 1.5 to 3.4; P < .001), 1.8 (95% CI, 1.3 to 2.5; P < .001), and 1.8 (95% CI, 1.2 to 2.7; P = .006) for the three end points. 
%SUVremaining dichotomized at a cut point of maximum sum of sensitivity and specificity from receiver operating characteristic analysis at 36 months was also prognostic. Exploratory analysis revealed that %SUVremaining was likewise prognostic for overall survival in both treatment arms and was more closely associated with extracerebral distant metastases (P = .016) than with isolated locoregional relapses (P = .97). %SUVremaining is a predictor for survival and other end points after multimodality treatment and can serve as a parameter for treatment stratification after induction chemotherapy or for evaluation of adjuvant new systemic treatment options for high-risk patients. © 2016 by American Society of Clinical Oncology.

  4. Student Characteristics as Compared to the Community Profile, Fall 1987. Volume XVI, No. 13.

    ERIC Educational Resources Information Center

    Flaherty, Toni

    In fall 1987, a study was conducted at William Rainey Harper College (WRHC) to develop a profile of WRHC students and assess the college's market outreach. Surveys were mailed to random samples of 500 degree credit students and 300 non-degree credit students. Response rates of 80% for the degree credit students and 78% for the non-degree credit…

  5. Motivation for and Use of Social Networking Sites: Comparisons among College Students with and without Histories of Non-Suicidal Self-Injury

    ERIC Educational Resources Information Center

    Jarvi, Stephanie M.; Swenson, Lance P.; Batejan, Kristen L.

    2017-01-01

    Objective: This research examines potential differences in social network use and motivation for social network use by non-suicidal self-injury (NSSI) status. Participants: 367 (73% women; M[subscript age] = 20.60) college students were recruited in November-December 2011. Methods: A random sample of 2,500 students was accessed through a…

  6. Annealing of Co-Cr dental alloy: effects on nanostructure and Rockwell hardness.

    PubMed

    Ayyıldız, Simel; Soylu, Elif Hilal; Ide, Semra; Kılıç, Selim; Sipahi, Cumhur; Pişkin, Bulent; Gökçe, Hasan Suat

    2013-11-01

    The aim of the study was to evaluate the effect of annealing on the nanostructure and hardness of Co-Cr metal ceramic samples that were fabricated with a direct metal laser sintering (DMLS) technique. Five groups of Co-Cr dental alloy samples were manufactured in a rectangular form measuring 4 × 2 × 2 mm. Samples fabricated by a conventional casting technique (Group I) and prefabricated milling blanks (Group II) were examined as conventional technique groups. The DMLS samples were randomly divided into three groups: not annealed (Group III), annealed in argon atmosphere (Group IV), or annealed in oxygen atmosphere (Group V). The nanostructure was examined with the small-angle X-ray scattering method. The Rockwell hardness test was used to measure the hardness changes in each group, and the means and standard deviations were statistically analyzed by one-way ANOVA for comparison of continuous variables, with Tukey's HSD test used for post hoc analysis. P values of <.05 were accepted as statistically significant. The general nanostructures of the samples were composed of small spherical entities stacked atop one another in dendritic form. All groups also displayed different hardness values depending on the manufacturing technique. The annealing procedure and environment directly affected both the nanostructure and hardness of the Co-Cr alloy. Group III exhibited a non-homogeneous structure and increased hardness (48.16 ± 3.02 HRC) because the annealing process was incomplete and the inner stress was not relieved. Annealing Group IV in an argon atmosphere not only relieved the inner stresses but also decreased the hardness (27.40 ± 3.98 HRC). The fitting results showed that Group IV was the most homogeneous product, as it had the minimum measured bilayer thickness (7.11 Å). After manufacturing with the DMLS technique, annealing in an argon atmosphere is an essential process for Co-Cr metal ceramic substructures. Dentists should be familiar with the materials used clinically for prosthodontic treatments.

  7. Correlated Observations, the Law of Small Numbers and Bank Runs

    PubMed Central

    2016-01-01

    Empirical descriptions and studies suggest that depositors generally observe a sample of previous decisions before deciding whether to keep their funds deposited or to withdraw them. These observed decisions may exhibit different degrees of correlation across depositors. In our model depositors decide sequentially and are assumed to follow the law of small numbers, in the sense that they believe that a bank run is underway if the number of observed withdrawals in their sample is large. Theoretically, with highly correlated samples and infinitely many depositors, runs occur with certainty, while with random samples this need not be the case, as for many parameter settings the likelihood of bank runs is zero. We investigate the intermediate cases and find that i) decreasing the correlation and ii) increasing the sample size reduces the likelihood of bank runs, ceteris paribus. Interestingly, the multiplicity of equilibria, a feature of the canonical Diamond-Dybvig model that we also use, disappears almost completely in our setup. Our results have relevant policy implications. PMID:27035435

  8. Correlated Observations, the Law of Small Numbers and Bank Runs.

    PubMed

    Horváth, Gergely; Kiss, Hubert János

    2016-01-01

    Empirical descriptions and studies suggest that depositors generally observe a sample of previous decisions before deciding whether to keep their funds deposited or to withdraw them. These observed decisions may exhibit different degrees of correlation across depositors. In our model depositors decide sequentially and are assumed to follow the law of small numbers, in the sense that they believe that a bank run is underway if the number of observed withdrawals in their sample is large. Theoretically, with highly correlated samples and infinitely many depositors, runs occur with certainty, while with random samples this need not be the case, as for many parameter settings the likelihood of bank runs is zero. We investigate the intermediate cases and find that i) decreasing the correlation and ii) increasing the sample size reduces the likelihood of bank runs, ceteris paribus. Interestingly, the multiplicity of equilibria, a feature of the canonical Diamond-Dybvig model that we also use, disappears almost completely in our setup. Our results have relevant policy implications.
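    The threshold decision rule in the model above can be sketched as follows. This is a stylized toy, not the authors' model: the population size, share of impatient depositors, sample size, and panic thresholds are all invented, and it illustrates only how withdrawals observed in a random sample of previous decisions can cascade once a threshold is crossed.

    ```python
    import random

    def simulate(threshold, impatient, samples):
        """Sequential withdrawal decisions: impatient depositors always
        withdraw; patient depositors withdraw when the number of
        withdrawals in their observed sample reaches `threshold`
        (the law-of-small-numbers heuristic). Returns total withdrawals."""
        withdrew = []
        for i, forced in enumerate(impatient):
            observed = sum(withdrew[j] for j in samples[i])
            withdrew.append(forced or observed >= threshold)
        return sum(withdrew)

    random.seed(7)
    n, k, p = 100, 10, 0.15
    impatient = [random.random() < p for _ in range(n)]
    # Each depositor observes a random sample of up to k earlier decisions.
    samples = [random.sample(range(i), min(k, i)) for i in range(n)]

    lenient = simulate(3, impatient, samples)  # panic after 3 observed withdrawals
    strict = simulate(5, impatient, samples)   # panic only after 5
    ```

    Holding the random draws fixed, a lower panic threshold can only produce more withdrawals, since every withdrawal feeds back into later depositors' samples; varying how the samples are drawn (e.g. the most recent k decisions rather than a random k) is the correlation dimension the paper analyzes.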

  9. Identifying key demographic parameters of a small island-associated population of Indo-Pacific bottlenose dolphins (Reunion, Indian Ocean).

    PubMed

    Dulau, Violaine; Estrade, Vanessa; Fayan, Jacques

    2017-01-01

    Photo-identification surveys of Indo-Pacific bottlenose dolphins were conducted from 2009 to 2014 off Reunion Island (55°E33'/21°S07'), in the Indian Ocean. Robust Design models were applied to produce the most reliable estimate of population abundance and survival rate, while accounting for temporary emigration from the survey area (west coast). The sampling scheme consisted of a five-month (June-October) sampling period in each year of the study. The overall population size at Reunion was estimated to be 72 individuals (SE = 6.17, 95%CI = 61-85), based on a random temporary emigration (γ") of 0.096 and a proportion of 0.70 (SE = 0.03) distinct individuals. The annual survival rate was 0.93 (±0.018 SE, 95%CI = 0.886-0.958) and was constant over time and between sexes. Models considering gender groups indicated different movement patterns between males and females. Males showed null or quasi-null temporary emigration (γ" = γ' < 0.01), while females showed a random temporary emigration (γ") of 0.10, suggesting that a small proportion of females was outside the survey area during each primary sampling period. Sex-specific temporary migration patterns were consistent with movement and residency patterns observed in other areas. The Robust Design approach provided an appropriate sampling scheme for deriving island-associated population parameters, while allowing survey effort to be restricted both spatially (i.e. west coast only) and temporally (five months per year). Although abundance and survival were stable over the six years, the small population size of fewer than 100 individuals suggested that this population is highly vulnerable. Priority should be given to reducing any potential impact of human activity on the population and its habitat.

  10. Crossover from impurity-controlled to granular superconductivity in (TMTSF)2ClO4

    NASA Astrophysics Data System (ADS)

    Yonezawa, Shingo; Marrache-Kikuchi, Claire A.; Bechgaard, Klaus; Jérome, Denis

    2018-01-01

    Using a proper cooling procedure, a controllable amount of nonmagnetic structural disorder can be introduced at low temperature in (TMTSF)2ClO4. Here we performed simultaneous measurements of transport and magnetic properties of (TMTSF)2ClO4 in its normal and superconducting states, while finely covering three orders of magnitude of the cooling rate around the anion ordering temperature. Our result reveals, with increasing density of disorder, the existence of a crossover between homogeneous defect-controlled d-wave superconductivity and granular superconductivity. At slow cooling rates, with a small amount of disorder, the evolution of superconducting properties is well described by the Abrikosov-Gorkov theory, providing further confirmation of non-s-wave pairing in this compound. In contrast, at fast cooling rates, zero resistance and diamagnetic shielding are achieved through a randomly distributed network of superconducting puddles embedded in a normal conducting background and interconnected by proximity-effect coupling. The temperature dependence of the ac complex susceptibility reveals features typical of a network of granular superconductors. This makes (TMTSF)2ClO4 a model system for granular superconductivity in which grain size and concentration are tunable within the same sample.

  11. Reactive flow modeling of small scale detonation failure experiments for a baseline non-ideal explosive

    NASA Astrophysics Data System (ADS)

    Kittell, David E.; Cummock, Nick R.; Son, Steven F.

    2016-08-01

    Small scale characterization experiments using only 1-5 g of a baseline ammonium nitrate plus fuel oil (ANFO) explosive are discussed and simulated using an ignition and growth reactive flow model. There exists a strong need for the small scale characterization of non-ideal explosives in order to adequately survey the wide parameter space in sample composition, density, and microstructure of these materials. However, it is largely unknown in the scientific community whether any useful or meaningful result may be obtained from detonation failure, and whether a minimum sample size or level of confinement exists for the experiments. In this work, it is shown that the parameters of an ignition and growth rate law may be calibrated using the small scale data, which is obtained from a 35 GHz microwave interferometer. Calibration is feasible when the samples are heavily confined and overdriven; this conclusion is supported with detailed simulation output, including pressure and reaction contours inside the ANFO samples. The resulting shock wave velocity is most likely a combined chemical-mechanical response, and simulations of these experiments require an accurate unreacted equation of state (EOS) in addition to the calibrated reaction rate. Other experiments are proposed to gain further insight into the detonation failure data, as well as to help discriminate between the role of the EOS and reaction rate in predicting the measured outcome.

  12. Reactive flow modeling of small scale detonation failure experiments for a baseline non-ideal explosive

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kittell, David E.; Cummock, Nick R.; Son, Steven F.

    2016-08-14

    Small scale characterization experiments using only 1–5 g of a baseline ammonium nitrate plus fuel oil (ANFO) explosive are discussed and simulated using an ignition and growth reactive flow model. There exists a strong need for the small scale characterization of non-ideal explosives in order to adequately survey the wide parameter space in sample composition, density, and microstructure of these materials. However, it is largely unknown in the scientific community whether any useful or meaningful result may be obtained from detonation failure, and whether a minimum sample size or level of confinement exists for the experiments. In this work, it is shown that the parameters of an ignition and growth rate law may be calibrated using the small scale data, which is obtained from a 35 GHz microwave interferometer. Calibration is feasible when the samples are heavily confined and overdriven; this conclusion is supported with detailed simulation output, including pressure and reaction contours inside the ANFO samples. The resulting shock wave velocity is most likely a combined chemical-mechanical response, and simulations of these experiments require an accurate unreacted equation of state (EOS) in addition to the calibrated reaction rate. Other experiments are proposed to gain further insight into the detonation failure data, as well as to help discriminate between the role of the EOS and reaction rate in predicting the measured outcome.

  13. Experimental light scattering by ultrasonically controlled small particles - Implications for Planetary Science

    NASA Astrophysics Data System (ADS)

    Gritsevich, M.; Penttilä, A.; Maconi, G.; Kassamakov, I.; Markkanen, J.; Martikainen, J.; Väisänen, T.; Helander, P.; Puranen, T.; Salmi, A.; Hæggström, E.; Muinonen, K.

    2017-09-01

    We present the results obtained with our newly developed 3D scatterometer - a setup for precise multi-angular measurements of light scattered by mm- to µm-sized samples held in place by sound. These measurements are cross-validated against the modeled light-scattering characteristics of the sample, i.e., the intensity and the degree of linear polarization of the reflected light, calculated with state-of-the-art electromagnetic techniques. We demonstrate a unique non-destructive approach to derive the optical properties of small grain samples which facilitates research on highly valuable planetary materials, such as samples returned from space missions or rare meteorites.

  14. Viral metagenomic analysis of feces of wild small carnivores

    PubMed Central

    2014-01-01

    Background Recent studies have clearly demonstrated the enormous virus diversity that exists among wild animals. This exemplifies the required expansion of our knowledge of the virus diversity present in wildlife, as well as the potential transmission of these viruses to domestic animals or humans. Methods In the present study we evaluated the viral diversity of fecal samples (n = 42) collected from 10 different species of wild small carnivores inhabiting the northern part of Spain using random PCR in combination with next-generation sequencing. Samples were collected from American mink (Neovison vison), European mink (Mustela lutreola), European polecat (Mustela putorius), European pine marten (Martes martes), stone marten (Martes foina), Eurasian otter (Lutra lutra) and Eurasian badger (Meles meles) of the family of Mustelidae; common genet (Genetta genetta) of the family of Viverridae; red fox (Vulpes vulpes) of the family of Canidae and European wild cat (Felis silvestris) of the family of Felidae. Results A number of sequences of possible novel viruses or virus variants were detected, including a theilovirus, phleboviruses, an amdovirus, a kobuvirus and picobirnaviruses. Conclusions Using random PCR in combination with next generation sequencing, sequences of various novel viruses or virus variants were detected in fecal samples collected from Spanish carnivores. Detected novel viruses highlight the viral diversity that is present in fecal material of wild carnivores. PMID:24886057

  15. Research Designs for Intervention Research with Small Samples II: Stepped Wedge and Interrupted Time-Series Designs.

    PubMed

    Fok, Carlotta Ching Ting; Henry, David; Allen, James

    2015-10-01

    The stepped wedge design (SWD) and the interrupted time-series design (ITSD) are two alternative research designs that maximize efficiency and statistical power with small samples when contrasted to the operating characteristics of conventional randomized controlled trials (RCT). This paper provides an overview and introduction to previous work with these designs and compares and contrasts them with the dynamic wait-list design (DWLD) and the regression point displacement design (RPDD), which were presented in a previous article (Wyman, Henry, Knoblauch, and Brown, Prevention Science, 2015) in this special section. The SWD and the DWLD are similar in that both are intervention implementation roll-out designs. We discuss similarities and differences between the SWD and DWLD in their historical origin and application, along with differences in the statistical modeling of each design. Next, we describe the main design characteristics of the ITSD, along with some of its strengths and limitations. We provide a critical comparative review of strengths and weaknesses in application of the ITSD, SWD, DWLD, and RPDD as small sample alternatives to application of the RCT, concluding with a discussion of the types of contextual factors that influence selection of an optimal research design by prevention researchers working with small samples.

  16. Research Designs for Intervention Research with Small Samples II: Stepped Wedge and Interrupted Time-Series Designs

    PubMed Central

    Ting Fok, Carlotta Ching; Henry, David; Allen, James

    2015-01-01

    The stepped wedge design (SWD) and the interrupted time-series design (ITSD) are two alternative research designs that maximize efficiency and statistical power with small samples when contrasted to the operating characteristics of conventional randomized controlled trials (RCT). This paper provides an overview and introduction to previous work with these designs, and compares and contrasts them with the dynamic wait-list design (DWLD) and the regression point displacement design (RPDD), which were presented in a previous article (Wyman, Henry, Knoblauch, and Brown, 2015) in this Special Section. The SWD and the DWLD are similar in that both are intervention implementation roll-out designs. We discuss similarities and differences between the SWD and DWLD in their historical origin and application, along with differences in the statistical modeling of each design. Next, we describe the main design characteristics of the ITSD, along with some of its strengths and limitations. We provide a critical comparative review of strengths and weaknesses in application of the ITSD, SWD, DWLD, and RPDD as small sample alternatives to application of the RCT, concluding with a discussion of the types of contextual factors that influence selection of an optimal research design by prevention researchers working with small samples. PMID:26017633

  17. Adapted random sampling patterns for accelerated MRI.

    PubMed

    Knoll, Florian; Clason, Christian; Diwoky, Clemens; Stollberger, Rudolf

    2011-02-01

    Variable density random sampling patterns have recently become increasingly popular for accelerated imaging strategies, as they lead to incoherent aliasing artifacts. However, the design of these sampling patterns is still an open problem. Current strategies use model assumptions like polynomials of different order to generate a probability density function that is then used to generate the sampling pattern. This approach relies on the optimization of design parameters, which is very time consuming and therefore impractical for daily clinical use. This work presents a new approach that generates sampling patterns by making use of power spectra of existing reference data sets and hence requires neither parameter tuning nor an a priori mathematical model of the density of sampling points. The approach is validated with downsampling experiments, as well as with accelerated in vivo measurements. The proposed approach is compared with established sampling patterns, and the generalization potential is tested by using a range of reference images. Quantitative evaluation is performed for the downsampling experiments using RMS differences to the original, fully sampled data set. Our results demonstrate that the image quality of the method presented in this paper is comparable to that of an established model-based strategy when the model parameter is optimized, and superior to that of the same strategy with non-optimized parameters. However, no random sampling pattern outperformed conventional Cartesian subsampling for the considered reconstruction strategy.
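
The central idea — deriving the sampling density directly from the power spectrum of a reference data set rather than from a tuned polynomial model — can be sketched as follows. This is a minimal illustration with an arbitrary smooth reference image and a hypothetical `adapted_sampling_mask` helper, not the authors' implementation:

```python
import numpy as np

def adapted_sampling_mask(reference, acceleration=4, rng=None):
    """Draw a k-space sampling mask whose point density follows the
    power spectrum of a reference image (illustrative sketch only)."""
    rng = np.random.default_rng(rng)
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(reference))) ** 2
    density = spectrum / spectrum.sum()          # probability per k-space point
    n_samples = reference.size // acceleration   # number of points to keep
    flat_idx = rng.choice(reference.size, size=n_samples,
                          replace=False, p=density.ravel())
    mask = np.zeros(reference.size, dtype=bool)
    mask[flat_idx] = True
    return mask.reshape(reference.shape)

# usage: mask a 64x64 synthetic reference at 4x acceleration
ref = np.add.outer(np.hanning(64), np.hanning(64))
mask = adapted_sampling_mask(ref, acceleration=4, rng=0)
```

Because the density is read off the reference spectrum, low spatial frequencies (where most energy sits) are sampled densely and high frequencies sparsely, with no model parameters to tune.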

  18. Influence of item distribution pattern and abundance on efficiency of benthic core sampling

    USGS Publications Warehouse

    Behney, Adam C.; O'Shaughnessy, Ryan; Eichholz, Michael W.; Stafford, Joshua D.

    2014-01-01

    Core sampling is a commonly used method to estimate benthic item density, but little information exists about factors influencing the accuracy and time-efficiency of this method. We simulated core sampling in a Geographic Information System framework by generating points (benthic items) and polygons (core samplers) to assess how sample size (number of core samples), core sampler size (cm2), distribution of benthic items, and item density affected the bias and precision of estimates of density, the detection probability of items, and the time-costs. When items were distributed randomly versus clumped, bias decreased and precision increased with increasing sample size and increased slightly with increasing core sampler size. Bias and precision were only affected by benthic item density at very low values (500–1,000 items/m2). Detection probability (the probability of capturing ≥ 1 item in a core sample if it is available for sampling) was substantially greater when items were distributed randomly as opposed to clumped. Taking more small diameter core samples was always more time-efficient than taking fewer large diameter samples. We are unable to present a single, optimal sample size, but provide information for researchers and managers to derive optimal sample sizes dependent on their research goals and environmental conditions.
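
The simulation logic described above (random versus clumped item distributions sampled with circular cores) can be sketched roughly as follows; the geometry, densities, and the `simulate_core_sampling` helper are illustrative assumptions, not the authors' GIS workflow:

```python
import numpy as np

def simulate_core_sampling(n_items=2000, n_cores=50, core_area=0.001,
                           clumped=False, rng=None):
    """Estimate item density (items per unit area) from simulated
    circular core samples in a unit square (illustrative sketch)."""
    rng = np.random.default_rng(rng)
    if clumped:
        # items scattered tightly around a few parent points
        parents = rng.uniform(0, 1, size=(10, 2))
        idx = rng.integers(0, 10, n_items)
        pts = np.clip(parents[idx] + rng.normal(0, 0.02, (n_items, 2)), 0, 1)
    else:
        pts = rng.uniform(0, 1, size=(n_items, 2))
    r = np.sqrt(core_area / np.pi)                     # core radius
    centers = rng.uniform(r, 1 - r, size=(n_cores, 2)) # cores fully inside
    counts = [np.sum(np.hypot(*(pts - c).T) < r) for c in centers]
    return np.mean(counts) / core_area

est_random = simulate_core_sampling(clumped=False, rng=1)
est_clumped = simulate_core_sampling(clumped=True, rng=1)
```

Repeating such runs over many replicates and varying `n_cores` and `core_area` reproduces the kind of bias/precision comparison the study reports: the clumped case yields the same expected density but much higher variance across cores.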

  19. On the strength of random fiber networks

    NASA Astrophysics Data System (ADS)

    Deogekar, S.; Picu, R. C.

    2018-07-01

    Damage accumulation and failure in random fiber networks is of importance in a variety of applications, from design of synthetic materials, such as paper and non-wovens, to accidental tearing of biological tissues. In this work we study these processes using three-dimensional models of athermal fiber networks, focusing attention on the modes of failure and on the relationship between network strength and network structural parameters. We consider network failure at small and large strains associated with the rupture of inter-fiber bonds. It is observed that the strength increases linearly with the network volume fraction and with the bond strength, while the stretch at peak stress is inversely related to these two parameters. A small fraction of the bonds rupture before peak stress and this fraction increases with increasing failure stretch. Rendering the bond strength stochastic causes a reduction of the network strength. However, heterogeneity retards damage localization and increases the stretch at peak stress, therefore promoting ductility.

  20. Emergence of cooperation in non-scale-free networks

    NASA Astrophysics Data System (ADS)

    Zhang, Yichao; Aziz-Alaoui, M. A.; Bertelle, Cyrille; Zhou, Shi; Wang, Wenting

    2014-06-01

    Evolutionary game theory is one of the key paradigms behind many scientific disciplines from science to engineering. Previous studies proposed a strategy updating mechanism, which successfully demonstrated that the scale-free network can provide a framework for the emergence of cooperation. Instead, individuals in random graphs and small-world networks do not favor cooperation under this updating rule. However, a recent empirical result shows that heterogeneous networks do not promote cooperation when humans play a prisoner’s dilemma. In this paper, we propose a strategy updating rule with payoff memory. We observe that the random graphs and small-world networks can provide even better frameworks for cooperation than the scale-free networks in this scenario. Our observations suggest that degree heterogeneity may be neither a sufficient condition nor a necessary condition for widespread cooperation in complex networks. Moreover, topological structure alone does not suffice to determine the level of cooperation in complex networks.

  1. Dose and Fractionation in Radiation Therapy of Curative Intent for Non-Small Cell Lung Cancer: Meta-Analysis of Randomized Trials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramroth, Johanna; Cutter, David J.; Darby, Sarah C.

    Purpose: The optimum dose and fractionation in radiation therapy of curative intent for non-small cell lung cancer remains uncertain. We undertook a published data meta-analysis of randomized trials to examine whether radiation therapy regimens with higher time-corrected biologically equivalent doses resulted in longer survival, either when given alone or when given with chemotherapy. Methods and Materials: Eligible studies were randomized comparisons of 2 or more radiation therapy regimens, with other treatments identical. Median survival ratios were calculated for each comparison and pooled. Results: 3795 patients in 25 randomized comparisons of radiation therapy dose were studied. The median survival ratio, higher versus lower corrected dose, was 1.13 (95% confidence interval [CI] 1.04-1.22) when radiation therapy was given alone and 0.83 (95% CI 0.71-0.97) when it was given with concurrent chemotherapy (P for difference=.001). In comparisons of radiation therapy given alone, the survival benefit increased with increasing dose difference between randomized treatment arms (P for trend=.004). The benefit increased with increasing dose in the lower-dose arm (P for trend=.01) without reaching a level beyond which no further survival benefit was achieved. The survival benefit did not differ significantly between randomized comparisons where the higher-dose arm was hyperfractionated and those where it was not. There was heterogeneity in the median survival ratio by geographic region (P<.001), average age at randomization (P<.001), and year trial started (P for trend=.004), but not for proportion of patients with squamous cell carcinoma (P=.2). Conclusions: In trials with concurrent chemotherapy, higher radiation therapy doses resulted in poorer survival, possibly caused, at least in part, by high levels of toxicity. Where radiation therapy was given without chemotherapy, progressively higher radiation therapy doses resulted in progressively longer survival, and no upper dose level was found above which there was no further benefit. These findings support the consideration of further radiation therapy dose escalation trials, making use of modern treatment methods to reduce toxicity.
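
As a rough illustration of the pooling step, ratio estimates from individual randomized comparisons can be combined by inverse-variance weighting on the log scale. The numbers and the `pooled_ratio` helper below are made up for illustration; they are not the trial data from this meta-analysis:

```python
import numpy as np

def pooled_ratio(ratios, ci_low, ci_high):
    """Fixed-effect inverse-variance pooling of ratio estimates on the
    log scale, with standard errors recovered from 95% CIs (generic
    meta-analytic sketch with hypothetical inputs)."""
    logr = np.log(ratios)
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
    w = 1.0 / se ** 2                       # inverse-variance weights
    pooled = np.sum(w * logr) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    ci = np.exp([pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])
    return np.exp(pooled), ci

ratio, ci = pooled_ratio(np.array([1.10, 1.20, 0.95]),
                         np.array([0.90, 1.00, 0.80]),
                         np.array([1.34, 1.44, 1.13]))
```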

  2. Efficacy of Vitamin C in Lowering Serum Uric Acid.

    PubMed

    Choudhury, M R; Haq, S M; Saleh, A A; Hakim, F; Azad, A K

    2016-10-01

    The objective of the study was to determine the efficacy of vitamin C in reducing serum uric acid (UA). This study was a double-blind placebo-controlled randomized trial conducted in the Department of Rheumatology, Bangabandhu Sheikh Mujib Medical University (BSMMU) Dhaka, Bangladesh from July 2007 to August 2008. Study participants were recruited from the outpatient department (OPD) of Rheumatology of BSMMU and were suffering from various rheumatological problems other than gouty arthritis. All of the participants were non-smokers, non-alcoholics, and randomized to take either placebo or vitamin C (500 mg/day) for 12 weeks. A total of 98 subjects were enrolled in the study; 71 completed the trial, with 34 in the placebo group and 37 receiving vitamin C. Serum uric acid levels were not significantly reduced in the experimental group, and they increased in the placebo group. In the vitamin C group, the mean change was -0.32 mg/dl [95% confidence interval -0.73, 0.77], whereas in the placebo group, the mean change was +0.12 mg/dl [95% confidence interval -0.22, 0.47]. Subgroups were defined by sex, body mass index, and quartiles of baseline serum uric acid levels. In a subgroup analysis, vitamin C lowered serum uric acid significantly in those who had comparatively higher baseline uric acid levels. Although vitamin C did not lower serum uric acid significantly, participants with higher baseline serum uric acid levels experienced a significant uric acid lowering effect, but as the sample size was very small, it is difficult to draw any definitive conclusion.

  3. Lack of replication of thirteen single-nucleotide polymorphisms implicated in Parkinson’s disease: a large-scale international study

    PubMed Central

    Elbaz, Alexis; Nelson, Lorene M; Payami, Haydeh; Ioannidis, John P A; Fiske, Brian K; Annesi, Grazia; Belin, Andrea Carmine; Factor, Stewart A; Ferrarese, Carlo; Hadjigeorgiou, Georgios M; Higgins, Donald S; Kawakami, Hideshi; Krüger, Rejko; Marder, Karen S; Mayeux, Richard P; Mellick, George D; Nutt, John G; Ritz, Beate; Samii, Ali; Tanner, Caroline M; Van Broeckhoven, Christine; Van Den Eeden, Stephen K; Wirdefeldt, Karin; Zabetian, Cyrus P; Dehem, Marie; Montimurro, Jennifer S; Southwick, Audrey; Myers, Richard M; Trikalinos, Thomas A

    2013-01-01

    Summary Background A genome-wide association study identified 13 single-nucleotide polymorphisms (SNPs) significantly associated with Parkinson’s disease. Small-scale replication studies were largely non-confirmatory, but a meta-analysis that included data from the original study could not exclude all SNP associations, leaving relevance of several markers uncertain. Methods Investigators from three Michael J Fox Foundation for Parkinson’s Research-funded genetics consortia—comprising 14 teams—contributed DNA samples from 5526 patients with Parkinson’s disease and 6682 controls, which were genotyped for the 13 SNPs. Most (88%) participants were of white, non-Hispanic descent. We assessed log-additive genetic effects using fixed and random effects models stratified by team and ethnic origin, and tested for heterogeneity across strata. A meta-analysis was undertaken that incorporated data from the original genome-wide study as well as subsequent replication studies. Findings In fixed and random-effects models no associations with any of the 13 SNPs were identified (odds ratios 0.89 to 1.09). Heterogeneity between studies and between ethnic groups was low for all SNPs. Subgroup analyses by age at study entry, ethnic origin, sex, and family history did not show any consistent associations. In our meta-analysis, no SNP showed significant association (summary odds ratios 0.95 to 1.08); there was little heterogeneity except for SNP rs7520966. Interpretation Our results do not lend support to the finding that the 13 SNPs reported in the original genome-wide association study are genetic susceptibility factors for Parkinson’s disease. PMID:17052658

  4. Linear discriminant analysis with misallocation in training samples

    NASA Technical Reports Server (NTRS)

    Chhikara, R. (Principal Investigator); Mckeon, J.

    1982-01-01

    Linear discriminant analysis for a two-class case is studied in the presence of misallocation in training samples. A general approach to modeling of misallocation is formulated, and the mean vectors and covariance matrices of the mixture distributions are derived. The asymptotic distribution of the discriminant boundary is obtained, and the asymptotic first two moments of the two types of error rate are given. Certain numerical results for the error rates are presented by considering the random and two non-random misallocation models. It is shown that when the allocation procedure for training samples is objectively formulated, the effect of misallocation on the error rates of the Bayes linear discriminant rule can almost be eliminated. If, however, this is not possible, the use of the Fisher rule may be preferred over the Bayes rule.

  5. xMSanalyzer: automated pipeline for improved feature detection and downstream analysis of large-scale, non-targeted metabolomics data.

    PubMed

    Uppal, Karan; Soltow, Quinlyn A; Strobel, Frederick H; Pittard, W Stephen; Gernert, Kim M; Yu, Tianwei; Jones, Dean P

    2013-01-16

    Detection of low abundance metabolites is important for de novo mapping of metabolic pathways related to diet, microbiome or environmental exposures. Multiple algorithms are available to extract m/z features from liquid chromatography-mass spectral data in a conservative manner, which tends to preclude detection of low abundance chemicals and chemicals found in small subsets of samples. The present study provides software to enhance such algorithms for feature detection, quality assessment, and annotation. xMSanalyzer is a set of utilities for automated processing of metabolomics data. The utilities can be classified into four main modules to: 1) improve feature detection for replicate analyses by systematic re-extraction with multiple parameter settings and data merger to optimize the balance between sensitivity and reliability, 2) evaluate sample quality and feature consistency, 3) detect feature overlap between datasets, and 4) characterize high-resolution m/z matches to small molecule metabolites and biological pathways using multiple chemical databases. The package was tested with plasma samples and shown to more than double the number of features extracted while improving quantitative reliability of detection. MS/MS analysis of a random subset of peaks that were exclusively detected using xMSanalyzer confirmed that the optimization scheme improves detection of real metabolites. xMSanalyzer is a package of utilities for data extraction, quality control assessment, detection of overlapping and unique metabolites in multiple datasets, and batch annotation of metabolites. The program was designed to integrate with existing packages such as apLCMS and XCMS, but the framework can also be used to enhance data extraction for other LC/MS data software.

  6. Assessing differential gene expression with small sample sizes in oligonucleotide arrays using a mean-variance model.

    PubMed

    Hu, Jianhua; Wright, Fred A

    2007-03-01

    The identification of the genes that are differentially expressed in two-sample microarray experiments remains a difficult problem when the number of arrays is very small. We discuss the implications of using ordinary t-statistics and examine other commonly used variants. For oligonucleotide arrays with multiple probes per gene, we introduce a simple model relating the mean and variance of expression, possibly with gene-specific random effects. Parameter estimates from the model have natural shrinkage properties that guard against inappropriately small variance estimates, and the model is used to obtain a differential expression statistic. A limiting value to the positive false discovery rate (pFDR) for ordinary t-tests provides motivation for our use of the data structure to improve variance estimates. Our approach performs well compared to other proposed approaches in terms of the false discovery rate.
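
A generic variance-shrinkage t-statistic of the kind discussed here can be sketched as follows. Note this uses a simple common-prior shrinkage toward the average variance across genes rather than the authors' probe-level mean-variance model; the `moderated_t` helper and all parameters are illustrative:

```python
import numpy as np

def moderated_t(x, y, prior_df=4.0):
    """Two-sample t-statistics for each gene (row), with per-gene pooled
    variances shrunk toward the mean variance across genes. Shrinkage
    guards against inappropriately small variance estimates when each
    group has only a few arrays (generic sketch)."""
    nx, ny = x.shape[1], y.shape[1]
    sp2 = (np.var(x, axis=1, ddof=1) * (nx - 1) +
           np.var(y, axis=1, ddof=1) * (ny - 1)) / (nx + ny - 2)
    df = nx + ny - 2
    s2_prior = sp2.mean()                              # common prior variance
    s2_post = (prior_df * s2_prior + df * sp2) / (prior_df + df)
    se = np.sqrt(s2_post * (1 / nx + 1 / ny))
    return (x.mean(axis=1) - y.mean(axis=1)) / se

rng = np.random.default_rng(0)
x = rng.normal(0, 1, size=(100, 3))   # 100 genes, 3 arrays per group
y = rng.normal(0, 1, size=(100, 3))
t = moderated_t(x, y)
```

With only three arrays per group, an ordinary t-statistic can explode when a gene's sample variance is tiny by chance; the shrunken denominator keeps such genes from dominating the top of the ranking.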

  7. A machine learning model with human cognitive biases capable of learning from small and biased datasets.

    PubMed

    Taniguchi, Hidetaka; Sato, Hiroshi; Shirakawa, Tomohiro

    2018-05-09

    Human learners can generalize a new concept from a small number of samples. In contrast, conventional machine learning methods require large amounts of data to address the same types of problems. Humans have cognitive biases that promote fast learning. Here, we developed a method to reduce the gap between human beings and machines in this type of inference by utilizing cognitive biases. We implemented a human cognitive model into machine learning algorithms and compared their performance with the currently most popular methods, naïve Bayes, support vector machine, neural networks, logistic regression and random forests. We focused on the task of spam classification, which has been studied for a long time in the field of machine learning and often requires a large amount of data to obtain high accuracy. Our models achieved superior performance with small and biased samples in comparison with other representative machine learning methods.

  8. Instrumentation of the variable-angle magneto-optic ellipsometer and its application to M-O media and other non-magnetic films

    NASA Technical Reports Server (NTRS)

    Zhou, Andy F.; Erwin, J. Kevin; Mansuripur, M.

    1992-01-01

    A new and comprehensive dielectric tensor characterization instrument is presented for characterization of magneto-optical recording media and non-magnetic thin films. Random and systematic errors of the system are studied. A series of TbFe, TbFeCo, and Co/Pt samples with different composition and thicknesses are characterized for their optical and magneto-optical properties. The optical properties of several non-magnetic films are also measured.

  9. Estimation of population mean under systematic sampling

    NASA Astrophysics Data System (ADS)

    Noor-ul-amin, Muhammad; Javaid, Amjad

    2017-11-01

    In this study we propose a generalized ratio estimator under non-response for systematic random sampling. We also generate a class of estimators through special cases of generalized estimator using different combinations of coefficients of correlation, kurtosis and variation. The mean square errors and mathematical conditions are also derived to prove the efficiency of proposed estimators. Numerical illustration is included using three populations to support the results.
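
For reference, the classical ratio estimator under 1-in-k systematic sampling (the textbook special case that the proposed generalized estimator extends with coefficients of correlation, kurtosis and variation) might look like this; the helpers and data are illustrative, not the paper's populations:

```python
import numpy as np

def systematic_sample(N, n, rng=None):
    """Indices of a 1-in-k systematic sample with a random start."""
    rng = np.random.default_rng(rng)
    k = N // n
    start = rng.integers(0, k)
    return np.arange(start, start + n * k, k)

def ratio_estimate(y, x, idx, x_mean_pop):
    """Classical ratio estimator of the population mean of y, using an
    auxiliary variable x whose population mean is known (textbook form)."""
    return y[idx].mean() / x[idx].mean() * x_mean_pop

rng = np.random.default_rng(42)
x = rng.uniform(10, 20, 1000)          # auxiliary variable, known mean
y = 3 * x + rng.normal(0, 1, 1000)     # study variable correlated with x
idx = systematic_sample(1000, 100, rng=7)
est = ratio_estimate(y, x, idx, x.mean())
```

Because y and x are strongly correlated, the ratio estimator corrects the sample mean by how far the sampled x values drift from their known population mean, which is exactly the leverage the generalized estimators in the paper exploit.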

  10. How Are We Educating Agricultural Students? A National Profile of Leadership Capacities and Involvement in College Compared to Non-Agricultural Peers

    ERIC Educational Resources Information Center

    Rosch, David M.; Coers, Natalie

    2013-01-01

    Given the importance of leadership development within the various agricultural professions, a national sample (n = 461) of students with agriculture-related majors from 55 colleges was compared to a similarly-sized random peer group from the same institutions. The data were analyzed to compare the agricultural student sample to their peers with…

  11. The Functional Profile of Young Adults with Suspected Developmental Coordination Disorder (DCD)

    ERIC Educational Resources Information Center

    Tal-Saban, Miri; Zarka, Salman; Grotto, Itamar; Ornoy, Asher; Parush, Shula

    2012-01-01

    We assessed the non-academic and academic functioning of young adults with DCD, and investigated the emotional influences and the role of strategy use within this population. A random sample of 2379 adolescents and young adults aged 19-25 (1081 males [45.4%]; mean age = 20.68, SD = 3.42) was used to develop the instruments. From this sample, three…

  12. Analysis of small sample size studies using nonparametric bootstrap test with pooled resampling method.

    PubMed

    Dwivedi, Alok Kumar; Mallawaarachchi, Indika; Alvarado, Luis A

    2017-06-30

    Experimental studies in biomedical research frequently pose analytical problems related to small sample size. In such studies, there are conflicting findings regarding the choice of parametric and nonparametric analysis, especially with non-normal data. In such instances, some methodologists questioned the validity of parametric tests and suggested nonparametric tests. In contrast, other methodologists found nonparametric tests to be too conservative and less powerful and thus preferred using parametric tests. Some researchers have recommended using a bootstrap test; however, this method also has limitations with small sample sizes. We used a pooled resampling method in a nonparametric bootstrap test that may overcome the problems related to small samples in hypothesis testing. The present study compared the nonparametric bootstrap test with pooled resampling against parametric, nonparametric, and permutation tests through extensive simulations under various conditions and using real data examples. The nonparametric pooled bootstrap t-test provided equal or greater power for comparing two means as compared with the unpaired t-test, Welch t-test, Wilcoxon rank sum test, and permutation test while maintaining type I error probability under all conditions except Cauchy and extremely variable lognormal distributions. In such cases, we suggest using an exact Wilcoxon rank sum test. The nonparametric bootstrap paired t-test also provided better performance than other alternatives. The nonparametric bootstrap test provided benefit over the exact Kruskal-Wallis test. We suggest using the nonparametric bootstrap test with pooled resampling method for comparing paired or unpaired means and for validating one way analysis of variance test results for non-normal data in small sample size studies. Copyright © 2017 John Wiley & Sons, Ltd.
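
The pooled-resampling idea can be sketched as follows: both bootstrap groups are drawn from the combined sample, so the resampling distribution embodies the null hypothesis of no group difference. This is a minimal sketch of the general approach with made-up data, not the authors' exact procedure:

```python
import numpy as np

def pooled_bootstrap_t_test(a, b, n_boot=2000, rng=None):
    """Two-sample bootstrap t-test with pooled resampling: each bootstrap
    replicate draws both groups from the combined sample, which enforces
    the null of a common distribution (illustrative sketch)."""
    rng = np.random.default_rng(rng)

    def t_stat(u, v):
        return (u.mean() - v.mean()) / np.sqrt(u.var(ddof=1) / len(u)
                                               + v.var(ddof=1) / len(v))

    pooled = np.concatenate([a, b])
    t_obs = t_stat(a, b)
    t_null = np.empty(n_boot)
    for i in range(n_boot):
        u = rng.choice(pooled, size=len(a), replace=True)
        v = rng.choice(pooled, size=len(b), replace=True)
        t_null[i] = t_stat(u, v)
    return float(np.mean(np.abs(t_null) >= abs(t_obs)))  # two-sided p-value

a = np.array([4.1, 5.3, 6.0, 4.8, 5.5])
b = np.array([8.9, 9.7, 10.2, 9.1, 8.5, 9.9])
p = pooled_bootstrap_t_test(a, b, rng=0)
```

Pooling enlarges the effective resampling base for each group, which is the mechanism the authors credit for the improved small-sample behavior relative to resampling each group from itself.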

  13. Recurrence of random walks with long-range steps generated by fractional Laplacian matrices on regular networks and simple cubic lattices

    NASA Astrophysics Data System (ADS)

    Michelitsch, T. M.; Collet, B. A.; Riascos, A. P.; Nowakowski, A. F.; Nicolleau, F. C. G. A.

    2017-12-01

    We analyze a Markovian random walk strategy on undirected regular networks involving power matrix functions of the type L^{α/2}, where L denotes a 'simple' Laplacian matrix. We refer to such walks as 'fractional random walks', with admissible interval 0 < α ≤ 2. We deduce probability-generating functions (network Green's functions) for the fractional random walk. From these analytical results we establish a generalization of Pólya's recurrence theorem for fractional random walks on d-dimensional infinite lattices: the fractional random walk is transient for lattice dimensions d > α (recurrent for d ≤ α). As a consequence, for 0 < α < 1 the fractional random walk is transient for all lattice dimensions d = 1, 2, …, and in the range 1 ≤ α < 2 it is transient for dimensions d ≥ 2. Finally, for α = 2, Pólya's classical recurrence theorem is recovered, namely the walk is transient only for lattice dimensions d ≥ 3. The generalization of Pólya's recurrence theorem remains valid for the class of random walks with Lévy flight asymptotics for long-range steps. We also analyze the mean first passage probabilities, mean residence times, mean first passage times and global mean first passage times (Kemeny constant) for the fractional random walk. For an infinite 1D lattice (infinite ring) we obtain, for the transient regime 0 < α < 1, closed-form expressions for the fractional lattice Green's function matrix containing the escape and ever-passage probabilities. The ever-passage probabilities (fractional lattice Green's functions) in the transient regime exhibit Riesz-potential power-law decay for nodes far from the departure node. The non-locality of the fractional random walk is generated by the non-diagonality of the fractional Laplacian matrix, with Lévy-type heavy-tailed inverse power-law decay for the probability of long-range moves. This non-local and asymptotic behavior of the fractional random walk introduces small-world properties, with the emergence of Lévy flights on large (infinite) lattices.
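    The transition structure described above can be sketched numerically for a ring. This is a generic illustration built from the known spectral decomposition of the cycle-graph Laplacian (eigenvalues 2 − 2cos(2πk/N) with Fourier eigenvectors), not the authors' code; the function names and node count are hypothetical:

```python
import math

def fractional_laplacian_ring(N, alpha):
    """Entries of L^{alpha/2} for the cycle graph C_N via its spectral
    decomposition: (L^{a/2})_{ij} = (1/N) * sum_k lam_k^{a/2} * cos(2*pi*k*(i-j)/N)."""
    lam = [(2 - 2 * math.cos(2 * math.pi * k / N)) ** (alpha / 2) for k in range(N)]
    def entry(i, j):
        return sum(lam[k] * math.cos(2 * math.pi * k * (i - j) / N)
                   for k in range(N)) / N
    return entry

def transition_probs(N, alpha, i=0):
    """One-step probabilities of the fractional random walk from node i:
    p_{ij} = -(L^{a/2})_{ij} / (L^{a/2})_{ii}. Off-diagonal entries of the
    fractional Laplacian are negative for 0 < alpha <= 2, so the p_{ij} are
    non-negative, and since rows of L^{a/2} sum to zero they sum to one."""
    L = fractional_laplacian_ring(N, alpha)
    deg = L(i, i)
    return [-L(i, j) / deg for j in range(N) if j != i]
```

    For α = 2 this reduces to the ordinary nearest-neighbor walk (probability 1/2 to each neighbor), while for α < 2 distant nodes receive small but strictly positive probability — the heavy-tailed long-range moves described above.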

  14. A Meta-Analysis of D-Cycloserine in Exposure-Based Treatment: Moderators of Treatment Efficacy, Response, and Diagnostic Remission

    PubMed Central

    McGuire, Joseph F.; Wu, Monica S.; Piacentini, John; McCracken, James T.; Storch, Eric A.

    2018-01-01

    Objective This meta-analysis examined treatment efficacy, treatment response, and diagnostic remission effect sizes (ES) and moderators of d-cycloserine (DCS) augmented exposure treatment in randomized controlled trials (RCTs) of individuals with anxiety disorders, obsessive-compulsive disorder (OCD), and posttraumatic stress disorder (PTSD). Data Sources and Study Selection Using search terms d-cycloserine AND randomized controlled trial, PubMED (1965-May 2015), PsycInfo, and Scopus were searched for randomized placebo-controlled trials of DCS-augmented exposure therapy for anxiety disorders, OCD, and PTSD. Data Extraction Clinical variables and ES were extracted from 20 RCTs (957 participants). A random effects model calculated the ES for treatment efficacy, treatment response, and diagnostic remission using standardized rating scales. Subgroup analyses and meta-regression examined potential moderators. Results A small non-significant benefit of DCS augmentation compared to placebo augmentation was identified across treatment efficacy (g=0.15), response (RR=1.08), and remission (RR=1.109), with a moderately significant effect for anxiety disorders specifically (g=0.33, p=.03). At initial follow-up assessments, a small non-significant ES of DCS augmentation compared to placebo was found for treatment efficacy (g=0.21), response (RR=1.06), and remission (RR=1.12). Specific treatment moderators (e.g., comorbidity, medication status, gender, publication year) were found across conditions for both acute treatment and initial follow-up assessments. Conclusions DCS does not universally enhance treatment outcomes, but demonstrates promise for anxiety disorders. Distinct treatment moderators may account for discrepant findings across RCTs and disorders. Future trials may be strengthened by accounting for identified moderators in their design, with ongoing research needed on the mechanisms of DCS to tailor treatment protocols and maximize its benefit. PMID:27314661
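    The random effects pooling step described above is commonly implemented with the DerSimonian-Laird estimator; a minimal generic sketch follows, using hypothetical inputs rather than the trial data from this meta-analysis:

```python
def random_effects_pool(effects, variances):
    """DerSimonian-Laird random effects meta-analysis (generic sketch).
    Returns (pooled effect, between-study variance tau^2, standard error)."""
    w = [1 / v for v in variances]                       # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sw
    Q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # heterogeneity
    df = len(effects) - 1
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (Q - df) / c)                        # method-of-moments tau^2
    w_star = [1 / (v + tau2) for v in variances]         # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = (1 / sum(w_star)) ** 0.5
    return pooled, tau2, se
```

    The pooled estimate is a precision-weighted average in which the between-study variance tau^2 inflates each study's variance, so small heterogeneous studies are down-weighted less aggressively than under a fixed-effect model.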

  15. Incorporating Erlotinib or Irinotecan Plus Cisplatin into Chemoradiotherapy for Stage III Non-small Cell Lung Cancer According to EGFR Mutation Status.

    PubMed

    Lee, Youngjoo; Han, Ji-Youn; Moon, Sung Ho; Nam, Byung-Ho; Lim, Kun Young; Lee, Geon Kook; Kim, Heung Tae; Yun, Tak; An, Hye Jin; Lee, Jin Soo

    2017-10-01

    Concurrent chemoradiotherapy (CCRT) is the standard of care for stage III non-small cell lung cancer (NSCLC) patients; however, a more effective regimen is needed to improve the outcome by better controlling occult metastases. We conducted two parallel randomized phase II studies to incorporate erlotinib or irinotecan-cisplatin (IP) into CCRT for stage III NSCLC depending on epidermal growth factor receptor (EGFR) mutation status. Patients with EGFR-mutant tumors were randomized to receive three cycles of erlotinib first and then either CCRT with erlotinib followed by erlotinib (arm A) or CCRT with IP only (arm B). Patients with EGFR-unknown or wild-type tumors were randomized to receive three cycles of IP either before (arm C) or after (arm D) CCRT with IP. Seventy-three patients were screened, and the study was closed early because of slow accrual after 59 patients had been randomized. Overall, there were seven patients in arm A, five in arm B, 22 in arm C, and 25 in arm D. The response rates were 71.4% and 80.0% for arms A and B, and 70.0% and 73.9% for arms C and D. The median overall survival (OS) was 39.3 months versus 31.2 months for arms A and B (p=0.442), and 16.3 months versus 25.3 months for arms C and D (p=0.050). Patients with sensitive EGFR mutations had significantly longer OS than EGFR wild-type patients (74.8 months vs. 25.3 months, p=0.034). There were no unexpected toxicities. Combined-modality treatment guided by molecular diagnostics is feasible in stage III NSCLC, and EGFR-mutant patients appear to be a distinct subset with longer survival.

  16. Total coliform and E. coli in public water systems using undisinfected ground water in the United States.

    PubMed

    Messner, Michael J; Berger, Philip; Javier, Julie

    2017-06-01

    Public water systems (PWSs) in the United States generate total coliform (TC) and Escherichia coli (EC) monitoring data, as required by the Total Coliform Rule (TCR). We analyzed data generated in 2011 by approximately 38,000 small (serving fewer than 4101 individuals) undisinfected public water systems (PWSs). We used statistical modeling to characterize a distribution of TC detection probabilities for each of nine groupings of PWSs based on system type (community, non-transient non-community, and transient non-community) and population served (less than 101, 101-1000 and 1001-4100 people). We found that among PWS types sampled in 2011, on average, undisinfected transient PWSs test positive for TC 4.3% of the time as compared with 3% for undisinfected non-transient PWSs and 2.5% for undisinfected community PWSs. Within each type of PWS, the smaller systems have higher median TC detection than the larger systems. All TC-positive samples were assayed for EC. Among TC-positive samples from small undisinfected PWSs, EC is detected in about 5% of samples, regardless of PWS type or size. We evaluated the upper tail of the TC detection probability distributions and found that significant percentages of some system types have high TC detection probabilities. For example, assuming the systems providing data are nationally-representative, then 5.0% of the ∼50,000 small undisinfected transient PWSs in the U.S. have TC detection probabilities of 20% or more. Communities with such high TC detection probabilities may have elevated risk of acute gastrointestinal (AGI) illness - perhaps as great or greater than the attributable risk to drinking water (6-22%) calculated for 14 Wisconsin community PWSs with much lower TC detection probabilities (about 2.3%, Borchardt et al., 2012). Published by Elsevier GmbH.
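    A distribution of per-system TC detection probabilities of the kind characterized above, together with its upper tail, can be sketched with a Beta model. The parameters below are illustrative only (chosen so the mean is near the 4.3% reported for transient systems) and are not the paper's fitted values:

```python
import random

def tail_fraction(a, b, threshold=0.20, n_systems=50_000, seed=7):
    """Draw a hypothetical Beta(a, b) distribution of per-system TC
    detection probabilities and report (fraction of systems at or above
    `threshold`, mean detection probability). Parameters are illustrative."""
    rng = random.Random(seed)
    p = [rng.betavariate(a, b) for _ in range(n_systems)]
    frac_high = sum(1 for v in p if v >= threshold) / n_systems
    return frac_high, sum(p) / n_systems
```

    A right-skewed Beta with a low mean can still place several percent of systems above a 20% detection probability, which is the qualitative point the upper-tail analysis above makes.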

  17. A Practical Methodology for Quantifying Random and Systematic Components of Unexplained Variance in a Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Deloach, Richard; Obara, Clifford J.; Goodman, Wesley L.

    2012-01-01

    This paper documents a check standard wind tunnel test conducted in the Langley 0.3-Meter Transonic Cryogenic Tunnel (0.3M TCT) that was designed and analyzed using the Modern Design of Experiments (MDOE). The test was designed to partition the unexplained variance of typical wind tunnel data samples into two constituent components: one attributable to ordinary random error, and one attributable to systematic error induced by covariate effects. Covariate effects in wind tunnel testing are discussed, with examples. The impact of systematic (non-random) unexplained variance on the statistical independence of sequential measurements is reviewed. The corresponding correlation among experimental errors is discussed, as is the impact of such correlation on experimental results generally. The specific experiment documented herein was organized as a formal test for the presence of unexplained variance in representative samples of wind tunnel data, in order to quantify the frequency with which such systematic error was detected and its magnitude relative to ordinary random error. Levels of systematic and random error reported here are representative of those quantified in other facilities, as cited in the references.

  18. A Procedure to Detect Item Bias Present Simultaneously in Several Items

    DTIC Science & Technology

    1991-04-25

    exhibit a coherent and major biasing influence at the test level. In particular, this can be true even if each individual item displays only a minor...response functions (IRFs) without the use of item parameter estimation algorithms when the sample size is too small for their use. Thissen, Steinberg...convention). A random sample of examinees is drawn from each group, and a test of N items is administered to them. Typically it is suspected that a

  19. How large are the consequences of covariate imbalance in cluster randomized trials: a simulation study with a continuous outcome and a binary covariate at the cluster level.

    PubMed

    Moerbeek, Mirjam; van Schie, Sander

    2016-07-11

    The number of clusters in a cluster randomized trial is often low, so random assignment of clusters to treatment conditions is likely to result in covariate imbalance. No studies have quantified the consequences of covariate imbalance in cluster randomized trials for parameter and standard error bias and for power to detect treatment effects. The consequences of covariate imbalance in unadjusted and adjusted linear mixed models are investigated by means of a simulation study. The factors in this study are the degree of imbalance, the covariate effect size, the cluster size and the intraclass correlation coefficient. The covariate is binary and measured at the cluster level; the outcome is continuous and measured at the individual level. The results show that covariate imbalance results in negligible parameter bias and small standard error bias in adjusted linear mixed models. Ignoring the possibility of covariate imbalance while calculating the sample size at the cluster level may result in a loss of power of at most 25% in the adjusted linear mixed model. The results are more severe for the unadjusted linear mixed model: parameter biases up to 100% and standard error biases up to 200% may be observed, and power levels based on the unadjusted linear mixed model are often too low. The consequences are most severe for large clusters and/or small intraclass correlation coefficients, since then the required number of clusters to achieve a desired power level is smallest. The possibility of covariate imbalance should be taken into account while calculating the sample size of a cluster randomized trial; otherwise, more sophisticated methods of randomizing clusters to treatments should be used, such as stratification or balance algorithms. All relevant covariates should be carefully identified, actually measured, and included in the statistical model to avoid severe levels of parameter and standard error bias and insufficient power.
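    The conditional bias that imbalance induces can be sketched with a toy simulation: fix an imbalanced assignment of a binary cluster-level covariate and compare an unadjusted difference in arm means with a within-stratum (adjusted) difference. The design, effect sizes, and estimators below are hypothetical simplifications, not the paper's simulation settings or mixed-model analysis:

```python
import random

def _avg(xs):
    return sum(xs) / len(xs)

def simulate(effect=1.0, cov_effect=2.0, n_reps=400, cluster_size=20,
             cluster_sd=0.5, seed=3):
    """Toy cluster randomized trial with a binary cluster-level covariate and
    a deliberately imbalanced FIXED assignment: 6 of 8 treated clusters carry
    the covariate vs. 2 of 8 controls. Returns the average unadjusted and
    covariate-adjusted treatment effect estimates over repeated outcome draws."""
    rng = random.Random(seed)
    design = [(1, 1)] * 6 + [(1, 0)] * 2 + [(0, 1)] * 2 + [(0, 0)] * 6
    unadj_total = adj_total = 0.0
    for _ in range(n_reps):
        means = {(1, 1): [], (1, 0): [], (0, 1): [], (0, 0): []}
        for arm, cov in design:
            u = rng.gauss(0, cluster_sd)  # shared cluster random effect
            ybar = sum(effect * arm + cov_effect * cov + u + rng.gauss(0, 1)
                       for _ in range(cluster_size)) / cluster_size
            means[(arm, cov)].append(ybar)
        # unadjusted: simple difference of arm means (biased under imbalance)
        unadj_total += (_avg(means[(1, 1)] + means[(1, 0)])
                        - _avg(means[(0, 1)] + means[(0, 0)]))
        # adjusted: arm difference within each covariate stratum, then averaged
        adj_total += (_avg(means[(1, 1)]) - _avg(means[(0, 1)])
                      + _avg(means[(1, 0)]) - _avg(means[(0, 0)])) / 2
    return unadj_total / n_reps, adj_total / n_reps
```

    With a true treatment effect of 1.0 and a covariate effect of 2.0, the unadjusted estimate converges to roughly 1.0 + 2.0 × (6/8 − 2/8) = 2.0 under this fixed imbalance, while the stratified estimate stays near 1.0 — the same qualitative contrast the simulation study above reports for unadjusted versus adjusted mixed models.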

  20. Heuristics for Understanding the Concepts of Interaction, Polynomial Trend, and the General Linear Model.

    ERIC Educational Resources Information Center

    Thompson, Bruce

    The relationship between analysis of variance (ANOVA) methods and their analogs (analysis of covariance and multiple analyses of variance and covariance--collectively referred to as OVA methods) and the more general analytic case is explored. A small heuristic data set is used, with a hypothetical sample of 20 subjects, randomly assigned to five…

  1. WRITING SKILLS--ARE LARGE CLASSES CONDUCIVE TO EFFECTIVE LEARNING.

    ERIC Educational Resources Information Center

    HOPPER, HAROLD H.; KELLER, HELEN

    BY A STRATIFIED RANDOM SAMPLING, 274 STUDENTS WERE ASSIGNED TO THREE SECTIONS OF 56 STUDENTS EACH AND FOUR SECTIONS OF 28 STUDENTS. EVALUATION OF THE INSTRUCTION IN THE LARGE AND SMALL GROUPS INVOLVED ANALYSIS OF TWO ESSAYS AND INSTRUCTOR-STUDENT EVALUATIONS. WHILE THERE WAS SOME VARIATION IN STUDENT PREFERENCES, THE RESULTS OF THE PRETEST AND THE…

  2. Substance Use Prevention for Urban American Indian Youth: An Efficacy Trial of the Culturally Adapted Living in 2 Worlds Program.

    PubMed

    Kulis, Stephen S; Ayers, Stephanie L; Harthun, Mary L

    2017-04-01

    This article describes a small efficacy trial of the Living in 2 Worlds (L2W) substance use prevention curriculum, a culturally adapted version of keepin' it REAL (kiR) redesigned for urban American Indian (AI) middle school students. Focused on strengthening resiliency and AI cultural engagement, L2W teaches drug resistance skills, decision making, and culturally grounded prevention messages. Using cluster random assignment, the research team randomized three urban middle schools with enrichment classes for AI students. AI teachers of these classes delivered the L2W curriculum in two schools; the remaining school implemented kiR, unadapted, and became the comparison group. AI students (N = 107) completed a pretest questionnaire before they received the manualized curriculum lessons, and a posttest (85% completion) 1 month after the final lesson. We assessed the adapted L2W intervention, compared to kiR, with paired t tests, baseline adjusted general linear models, and effect size estimates (Cohen's d). Differences between the L2W and kiR groups reached statistically significant thresholds for four outcomes. Youth receiving L2W, compared to kiR, reported less growth in cigarette use from pretest to posttest, less frequent use of the Leave drug resistance strategy, and less loss of connections to AI spirituality and cultural traditions. For other substance use behaviors and antecedents, the direction of the non-significant effects in small sample tests was toward more positive outcomes in L2W and small to medium effect sizes. Results suggest that evidence-based substance use prevention programs that are culturally adapted for urban AI adolescents, like L2W, can be a foundation for prevention approaches to help delay initiation and slow increases in substance use. In addition to study limitations, we discuss implementation challenges in delivering school-based interventions for urban AI populations.

  3. Association between P16INK4a Promoter Methylation and Non-Small Cell Lung Cancer: A Meta-Analysis

    PubMed Central

    Zhu, Siwei; Hua, Feng; Zhao, Hui; Xu, Hongrui; You, Jiacong; Sun, Linlin; Wang, Weiqiang; Chen, Jun; Zhou, Qinghua

    2013-01-01

    Background Aberrant methylation of CpG islands acquired in promoter regions of tumor cells plays an important role in carcinogenesis. Accumulated evidence demonstrates that P16INK4a gene promoter hypermethylation is involved in non-small cell lung carcinoma (NSCLC), indicating it may be a potential biomarker for this disease. The aim of this study was to evaluate the frequency of P16INK4a gene promoter methylation in cancer tissue versus autologous controls by summarizing published studies. Methods By searching the Medline, EMBASE and CNKI databases, open published studies on P16INK4a gene promoter methylation and NSCLC were identified using a systematic search strategy. The pooled odds of P16INK4A promoter methylation in lung cancer tissue versus autologous controls were calculated by meta-analysis. Results Thirty-four studies, including 2,652 NSCLC patients with 5,175 samples, were included in this meta-analysis. Overall, the frequency of P16INK4A promoter methylation ranged from 17% to 80% (median 44%) in lung cancer tissue and from 0 to 80% (median 15%) in the autologous controls, indicating that the methylation frequency in cancer tissue was much higher than in autologous samples. We also found a strong and significant correlation between tumor tissue and autologous controls in P16INK4A promoter methylation frequency across studies (correlation coefficient 0.71, 95% CI: 0.51–0.83, P<0.0001), and the pooled odds ratio of P16INK4A promoter methylation in cancer tissue was 3.45 (95% CI: 2.63–4.54) compared to controls under a random-effect model. Conclusion The frequency of P16INK4a promoter methylation in cancer tissue was much higher than in autologous controls, indicating promoter methylation plays an important role in the carcinogenesis of NSCLC. The strong and significant correlation between tumor tissue and autologous samples in P16INK4A promoter methylation suggests a promising biomarker for NSCLC. PMID:23577085

  4. Temporal patterns of care and outcomes of non-small cell lung cancer patients in the United States diagnosed in 1996, 2005, and 2010.

    PubMed

    Kaniski, Filip; Enewold, Lindsey; Thomas, Anish; Malik, Shakuntala; Stevens, Jennifer L; Harlan, Linda C

    2017-01-01

    Lung cancer remains a common and deadly cancer in the United States. This study evaluated factors associated with stage-specific cancer therapy and survival, focusing on temporal trends and sociodemographic disparities. A random sample (n=3,318) of non-small cell lung cancer (NSCLC) patients diagnosed in 1996, 2005 and 2010, and reported to the National Cancer Institute's Surveillance, Epidemiology, and End Results (SEER) program, was analyzed. Logistic regression was utilized to identify factors associated with receipt of surgery among stage I/II patients and chemotherapy among stage IIIB/IV patients. Cox proportional hazard regression was utilized to assess factors associated with all-cause mortality, stratified by stage. Surgery among stage I/II patients decreased non-significantly over time (1996: 78.8%; 2010: 68.5%; p=0.18), whereas receipt of chemotherapy among stage IIIB/IV patients increased significantly over time (1996: 36.1%; 2010: 51.2%; p<0.01). Receipt of surgery (70-79 and ≥80 vs. <70: Odds Ratio (OR): 0.31; 95% Confidence Interval (CI): 0.16-0.63 and OR: 0.04; 95% CI: 0.02-0.10, respectively) and chemotherapy (≥80 vs. <70: OR: 0.26; 95% CI: 0.15-0.45) was less likely among older patients. Median survival improved non-significantly among stage I/II patients from 51 to 64 months (p=0.75) and significantly among stage IIIB/IV patients from 4 to 5 months (p<0.01). Treatment disparities were observed in both stage groups, notably among older patients. Among stage I/II patients, survival did not change significantly, possibly due to stable surgery utilization. Among stage IIIB/IV patients, although the use of chemotherapy increased and survival improved, the one-month increase in median survival highlights the need for additional research. Published by Elsevier Ireland Ltd.

  5. Shaken, but not stirred: how vortical flow drives small-scale aggregations of gyrotactic phytoplankton

    NASA Astrophysics Data System (ADS)

    Barry, Michael; Durham, William; Climent, Eric; Stocker, Roman

    2011-11-01

    Coastal ocean observations reveal that motile phytoplankton form aggregations at the Kolmogorov scale (mm-cm), whereas non-motile cells do not. We propose a new mechanism for the formation of this small-scale patchiness based on the interplay of turbulence and gyrotactic motility. Counterintuitively, turbulence does not stir a plankton suspension to homogeneity but drives aggregations instead. Through controlled laboratory experiments we show that the alga Heterosigma akashiwo rapidly forms aggregations in a cavity-driven vortical flow that approximates Kolmogorov eddies. Gyrotactic motility is found to be the key ingredient for aggregation, as non-motile cells remain randomly distributed. Observations are in remarkable agreement with a 3D model, and the validity of this mechanism for generating patchiness has been extended to realistic turbulent flows using Direct Numerical Simulations. Because small-scale patchiness influences rates of predation, sexual reproduction, infection, and nutrient competition, this result indicates that gyrotactic motility can profoundly affect phytoplankton ecology.

  6. The Vitamin D Assessment (ViDA) Study: design of a randomized controlled trial of vitamin D supplementation for the prevention of cardiovascular disease, acute respiratory infection, falls and non-vertebral fractures.

    PubMed

    Scragg, Robert; Waayer, Debbie; Stewart, Alistair W; Lawes, Carlene M M; Toop, Les; Murphy, Judy; Khaw, Kay-Tee; Camargo, Carlos A

    2016-11-01

    Observational studies have shown that low vitamin D status is associated with an increased risk of cardiovascular disease, acute respiratory infection, falls and non-vertebral fractures. We recruited 5110 Auckland adults, aged 50-84 years, into a randomized, double-blind, placebo-controlled trial to test whether vitamin D supplementation protects against these four major outcomes. The intervention is a monthly cholecalciferol dose of 100,000 IU (2.5 mg) for an estimated median 3.3 years (range 2.5-4.2) during 2011-2015. Participants were recruited primarily from family practices, plus community groups with a high proportion of Maori, Pacific, or South Asian individuals. The baseline evaluation included medical history, lifestyle, physical measurements (e.g. blood pressure, arterial waveform, lung function, muscle function), and a blood sample (stored at -80°C for later testing). Capsules are being mailed to home addresses with a questionnaire to collect data on non-hospitalized outcomes and to monitor adherence and potential adverse effects. Other data sources include New Zealand Ministry of Health data on mortality, hospitalization, cancer registrations and dispensed pharmaceuticals. A random sample of 438 participants returned for annual collection of blood samples to monitor adherence and safety (hypercalcemia), including repeat physical measurements at 12 months follow-up. The trial will allow testing of a priori hypotheses on several other endpoints including: weight, blood pressure, arterial waveform parameters, heart rate variability, lung function, muscle strength, gait and balance, mood, psoriasis, bone density, and chronic pain. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Clinical investigation of the effect of topical anesthesia on intraocular pressure

    PubMed Central

    Almubrad, Turki M; Ogbuehi, Kelechi C

    2007-01-01

    Background/Aims: Contact tonometry is generally considered more accurate than non-contact tonometry in the assessment of intraocular pressure (IOP). This study was designed to investigate the effect of ocular anesthesia, a pre-requisite for contact tonometry, on the IOP in a sample of visually normal subjects. Method: In a random sample of 120 young visually normal subjects (divided equally among three groups), the Topcon CT80 non-contact tonometer was used to measure IOP before, at the second minute and at the fifth minute following instillation of one drop of one of three eyedrops – carboxymethylcellulose sodium 0.5% (control), oxybuprocaine hydrochloride 0.4% and proparacaine hydrochloride 0.5%. Results: The IOP measured before instilling the ophthalmic drops did not vary significantly among the three groups of subjects (p > 0.05). In the control group, the average IOP of 15.1 ± 2.6 mmHg did not vary significantly (p > 0.05) 2 minutes and 5 minutes following instillation of one drop of Carboxymethylcellulose sodium. There were statistically significant reductions of IOP 2 minutes (p < 0.01) and 5 minutes (p < 0.001) after the instillation of one drop of oxybuprocaine hydrochloride. One drop of proparacaine hydrochloride caused significant reductions in the average IOP after 2 minutes (p < 0.001) and after 5 minutes (p < 0.001). Conclusions: One drop of topical proparacaine or oxybuprocaine may cause a small but a statistically significant reduction in IOP which could lead to lower IOP readings. PMID:19668485

  8. Clinical investigation of the effect of topical anesthesia on intraocular pressure.

    PubMed

    Almubrad, Turki M; Ogbuehi, Kelechi C

    2007-09-01

    Contact tonometry is generally considered more accurate than non-contact tonometry in the assessment of intraocular pressure (IOP). This study was designed to investigate the effect of ocular anesthesia, a pre-requisite for contact tonometry, on the IOP in a sample of visually normal subjects. In a random sample of 120 young visually normal subjects (divided equally among three groups), the Topcon CT80 non-contact tonometer was used to measure IOP before, at the second minute and at the fifth minute following instillation of one drop of one of three eyedrops - carboxymethylcellulose sodium 0.5% (control), oxybuprocaine hydrochloride 0.4% and proparacaine hydrochloride 0.5%. The IOP measured before instilling the ophthalmic drops did not vary significantly among the three groups of subjects (p > 0.05). In the control group, the average IOP of 15.1 +/- 2.6 mmHg did not vary significantly (p > 0.05) 2 minutes and 5 minutes following instillation of one drop of Carboxymethylcellulose sodium. There were statistically significant reductions of IOP 2 minutes (p < 0.01) and 5 minutes (p < 0.001) after the instillation of one drop of oxybuprocaine hydrochloride. One drop of proparacaine hydrochloride caused significant reductions in the average IOP after 2 minutes (p < 0.001) and after 5 minutes (p < 0.001). One drop of topical proparacaine or oxybuprocaine may cause a small but a statistically significant reduction in IOP which could lead to lower IOP readings.

  9. Degradation kinetics and safety evaluation of buprofezin residues in grape (Vitis vinifera L.) and three different soils of India.

    PubMed

    Oulkar, Dasharath P; Banerjee, Kaushik; Patil, Sangram H; Upadhyay, Ajay K; Taware, Praveen B; Deshmukh, Madhukar B; Adsule, Pandurang G

    2009-02-01

    This work was undertaken to determine the preharvest interval (PHI) of buprofezin to minimize its residues in grapes and thereby ensure consumer safety and avoid possible non-compliance in terms of residue violations in export markets. Furthermore, the residue dynamics in three grapevine soils of India was explored to assess its environmental safety. Residues dissipated following non-linear two-compartment first + first-order kinetics. In grapes, the PHI was 31 days at both treatments (312.5 and 625 g a.i. ha(-1)), with the residues below the maximum permissible intake even 1 h after foliar spraying. Random sampling of 5 kg comprising small bunchlets (8-10 berries) collected from a 1 ha area gave satisfactory homogeneity and representation of the population. A survey on the samples harvested after the PHI from supervised vineyards that received treatment at the recommended dose showed residues below the maximum residue limit (MRL) of 0.02 mg kg(-1) applicable for the European Union. In soil, the degradation rate was fastest in clay soil, followed by sandy loam and silty clay, with a half-life within 16 days in all the soils. The recommendation of the PHI proved to be effective in minimizing buprofezin residues in grapes. Thus, this work is of high practical significance to the domestic and export grape industry of India to ensure safety compliance in respect of buprofezin residues, keeping in view the requirements of international trade.
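    The two-compartment first + first-order model used above can be sketched as a sum of two exponential decays, with the preharvest interval (PHI) read off as the first day the modeled residue falls to the MRL. The rate constants and initial deposits below are illustrative, not the fitted values from this study:

```python
import math

def residue(t, c1, k1, c2, k2):
    """Two-compartment first + first-order dissipation model:
    C(t) = c1*exp(-k1*t) + c2*exp(-k2*t) (fast + slow compartments;
    parameters illustrative)."""
    return c1 * math.exp(-k1 * t) + c2 * math.exp(-k2 * t)

def preharvest_interval(mrl, c1, k1, c2, k2, max_days=120):
    """Earliest whole day at which the modeled residue falls to or below
    the maximum residue limit (MRL); None if not reached within max_days."""
    for day in range(max_days + 1):
        if residue(day, c1, k1, c2, k2) <= mrl:
            return day
    return None
```

    With a fast compartment that dominates early dissipation and a slow compartment that controls the tail, the PHI is governed almost entirely by the slow rate constant once the fast fraction has decayed.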

  10. Biased phylodynamic inferences from analysing clusters of viral sequences

    PubMed Central

    Xiang, Fei; Frost, Simon D. W.

    2017-01-01

    Abstract Phylogenetic methods are being increasingly used to help understand the transmission dynamics of measurably evolving viruses, including HIV. Clusters of highly similar sequences are often observed, which appear to follow a ‘power law’ behaviour, with a small number of very large clusters. These clusters may help to identify subpopulations in an epidemic, and inform where intervention strategies should be implemented. However, clustering of samples does not necessarily imply the presence of a subpopulation with high transmission rates, as groups of closely related viruses can also occur due to non-epidemiological effects such as over-sampling. It is important to ensure that observed phylogenetic clustering reflects true heterogeneity in the transmitting population, and is not being driven by non-epidemiological effects. We quantify the effect of using a falsely identified ‘transmission cluster’ of sequences to estimate phylodynamic parameters including the effective population size and exponential growth rate under several demographic scenarios. Our simulation studies show that taking the maximum size cluster to re-estimate parameters from trees simulated under a randomly mixing, constant population size coalescent process systematically underestimates the overall effective population size. In addition, the transmission cluster wrongly resembles an exponential or logistic growth model 99% of the time. We also illustrate the consequences of false clusters in exponentially growing coalescent and birth-death trees, where again, the growth rate is skewed upwards. This has clear implications for identifying clusters in large viral databases, where a false cluster could result in wasted intervention resources. PMID:28852573

  11. Mindfulness-Based Cognitive Therapy Improves Emotional Reactivity to Social Stress: Results from A Randomized Controlled Trial

    PubMed Central

    Britton, Willoughby B.; Shahar, Ben; Szepsenwol, Ohad; Jacobs, W. Jake

    2012-01-01

    Objectives The high likelihood of recurrences in depression is linked to progressive increase in emotional reactivity to stress (stress sensitization). Mindfulness-based therapies teach mindfulness skills designed to decrease emotional reactivity in the face of negative-affect producing stressors. The primary aim of the current study was to assess whether Mindfulness-Based Cognitive Therapy (MBCT) is efficacious in reducing emotional reactivity to social evaluative threat in a clinical sample with recurrent depression. A secondary aim was to assess whether improvement in emotional reactivity mediates improvements in depressive symptoms. Methods Fifty-two individuals with partially-remitted depression were randomized into an 8-week MBCT course or a waitlist control condition. All participants underwent the Trier Social Stress Test (TSST) before and after the 8-week trial period. Emotional reactivity to stress was assessed with the Spielberger State Anxiety Inventory at several time points before, during and after the stressor. Results MBCT was associated with decreased emotional reactivity to social stress, specifically during the recovery (post-stressor) phase of the TSST. Waitlist controls showed an increase in anticipatory (pre-stressor) anxiety, which was absent in the MBCT group. Improvements in emotional reactivity partially mediated improvements in depressive symptoms. Limitations Limitations include small sample size, lack of objective or treatment adherence measures, and non-generalizability to more severely depressed populations. Conclusions Given that emotional reactivity to stress is an important psychopathological process underlying the chronic and recurrent nature of depression, these findings suggest that mindfulness skills are important in adaptive emotion regulation when coping with stress. PMID:22440072

  12. Maximum likelihood estimation of correction for dilution bias in simple linear regression using replicates from subjects with extreme first measurements.

    PubMed

    Berglund, Lars; Garmo, Hans; Lindbäck, Johan; Svärdsudd, Kurt; Zethelius, Björn

    2008-09-30

    The least-squares estimator of the slope in a simple linear regression model is biased towards zero when the predictor is measured with random error. A corrected slope may be estimated by adding data from a reliability study, which comprises a subset of subjects from the main study. The precision of this corrected slope depends on the design of the reliability study and the choice of estimator. Previous work has assumed that the reliability study constitutes a random sample from the main study. A more efficient design is to use subjects with extreme values on their first measurement. Previously, we published a variance formula for the corrected slope, where the correction factor is the slope in the regression of the second measurement on the first. In this paper we show that both designs are improved by maximum likelihood estimation (MLE). The precision gain is explained by the inclusion of data from all subjects for estimation of the predictor's variance and by the use of the second measurement for estimation of the covariance between response and predictor. The gain from MLE increases with a stronger true relationship between response and predictor and with lower precision in the predictor measurements. We present a real data example on the relationship between fasting insulin, a surrogate marker, and true insulin sensitivity measured by a gold-standard euglycaemic insulin clamp, and simulations, in which the behavior of profile-likelihood-based confidence intervals is examined. MLE was shown to be a robust estimator for non-normal distributions and efficient in small-sample situations. Copyright (c) 2008 John Wiley & Sons, Ltd.
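    The attenuation mechanism and the replicate-based correction can be sketched numerically. The following snippet (illustrative parameter values, not the paper's data) contrasts the naive least-squares slope with one corrected by the regression-of-replicates factor the authors describe; the MLE refinement itself is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Illustrative values, not the paper's data.
x_true = rng.normal(0.0, 1.0, n)             # true predictor (e.g. insulin sensitivity)
y = 0.5 * x_true + rng.normal(0.0, 0.5, n)   # response, true slope = 0.5

# Two error-prone measurements of the predictor (replicates).
tau = 1.0                                    # measurement error SD
x1 = x_true + rng.normal(0.0, tau, n)
x2 = x_true + rng.normal(0.0, tau, n)

# Least-squares slope on the first measurement is attenuated towards zero.
naive_slope = np.cov(x1, y)[0, 1] / np.var(x1)

# Correction factor: slope of the second measurement regressed on the first,
# an estimate of the reliability ratio var(x_true) / var(x1).
reliability = np.cov(x1, x2)[0, 1] / np.var(x1)
corrected_slope = naive_slope / reliability
```

With these parameters the reliability ratio is about 0.5, so the naive slope estimates roughly half the true slope until corrected.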

  13. Recent advances in scalable non-Gaussian geostatistics: The generalized sub-Gaussian model

    NASA Astrophysics Data System (ADS)

    Guadagnini, Alberto; Riva, Monica; Neuman, Shlomo P.

    2018-07-01

    Geostatistical analysis was introduced over half a century ago as a way to quantify seemingly random spatial variations in earth quantities such as rock mineral content or permeability. The traditional approach has been to view such quantities as multivariate Gaussian random functions characterized by one or a few well-defined spatial correlation scales. There is, however, mounting evidence that many spatially varying quantities exhibit non-Gaussian behavior over a multiplicity of scales. The purpose of this minireview is not to paint a broad picture of the subject and its treatment in the literature. Instead, we focus on very recent advances in the recognition and analysis of this ubiquitous phenomenon, which transcends hydrology and the Earth sciences, brought about largely by our own work. In particular, we use porosity data from a deep borehole to illustrate typical aspects of such scalable non-Gaussian behavior, describe a very recent theoretical model that (for the first time) captures all these behavioral aspects in a comprehensive manner, show how this allows generating random realizations of the quantity conditional on sampled values, point toward ways of incorporating scalable non-Gaussian behavior in hydrologic analysis, highlight the significance of doing so, and list open questions requiring further research.

  14. Using Empowering Processes to Create Empowered Outcomes through the Family Development Credential Program: An Empirical Study of Change in Human Service Workers

    ERIC Educational Resources Information Center

    Hewitt, Nicole M.

    2010-01-01

    This study employed a quasi-experimental non-equivalent control group design with pretest and posttest. Two waves of data were collected from a non-random sample of 180 human service professionals in Western and Central Pennsylvania using two research instruments: the Social Work Empowerment Scale and the Conditions of Work Effectiveness-II Scale.…

  15. Response of six non-native invasive plant species to wildfires in the northern Rocky Mountains, USA

    Treesearch

    Dennis E. Ferguson; Christine L. Craig

    2010-01-01

    This paper presents early results on the response of six non-native invasive plant species to eight wildfires on six National Forests (NFs) in the northern Rocky Mountains, USA. Stratified random sampling was used to choose 224 stands based on burn severity, habitat type series, slope steepness, stand height, and stand density. Data for this report are from 219 stands...

  16. Estimating bias in causes of death ascertainment in the Finnish Randomized Study of Screening for Prostate Cancer.

    PubMed

    Kilpeläinen, Tuomas P; Mäkinen, Tuukka; Karhunen, Pekka J; Aro, Jussi; Lahtela, Jorma; Taari, Kimmo; Talala, Kirsi; Tammela, Teuvo L J; Auvinen, Anssi

    2016-12-01

    Precise cause of death (CoD) ascertainment is crucial in any cancer screening trial to avoid bias from misclassification due to excessive recording of the diagnosed cancer as a CoD in death certificates instead of the non-cancer disease that actually caused death. We estimated whether there was bias in CoD determination between the screening (SA) and control (CA) arms in a population-based prostate cancer (PCa) screening trial. Our trial is the largest component of the European Randomized Study of Screening for Prostate Cancer, with more than 80,000 men. Randomly selected deaths in men with PCa (N=442/2568 cases, 17.2%) were reviewed by an independent CoD committee. Median follow-up was 16.8 years in both arms. Overdiagnosis of PCa was present in the SA, as the risk ratio for PCa incidence was 1.19 (95% confidence interval (CI) 1.14-1.24). The hazard ratio (HR) for PCa mortality was 0.94 (95%CI 0.82-1.08) in favor of the SA. Agreement with the official CoD registry was 94.6% (κ=0.88) in the SA and 95.4% (κ=0.91) in the CA. Altogether, 14 PCa deaths were judged false-positive in both arms, and excluding these resulted in an HR of 0.92 (95% CI 0.80-1.06). A small differential misclassification bias in ascertainment of CoD was present, most likely due to attribution bias (overdiagnosis in the SA). Maximum precision in CoD ascertainment can only be achieved with independent review of all deaths in the diseased population. However, this is cumbersome and expensive and may provide little benefit compared to random sampling. Copyright © 2016 Elsevier Ltd. All rights reserved.
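    The reported agreement statistics (κ = 0.88 and κ = 0.91) are Cohen's kappa values. A minimal sketch of the computation, using a hypothetical 2x2 registry-versus-committee table rather than the trial's actual counts:

```python
import numpy as np

def cohens_kappa(table):
    """Cohen's kappa for a square agreement table (rows: registry, cols: committee)."""
    table = np.asarray(table, dtype=float)
    total = table.sum()
    p_observed = np.trace(table) / total                         # raw agreement
    p_expected = (table.sum(axis=1) @ table.sum(axis=0)) / total**2  # chance agreement
    return (p_observed - p_expected) / (1.0 - p_expected)

# Hypothetical counts: PCa death vs non-PCa death, registry vs review committee.
kappa = cohens_kappa([[80, 5], [6, 109]])
```

Kappa corrects the raw percent agreement for the agreement expected by chance, which is why it is reported alongside the percentages in the abstract.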

  17. Precipitation and Latent Heating Distributions from Satellite Passive Microwave Radiometry. Part 1: Method and Uncertainties

    NASA Technical Reports Server (NTRS)

    Olson, William S.; Kummerow, Christian D.; Yang, Song; Petty, Grant W.; Tao, Wei-Kuo; Bell, Thomas L.; Braun, Scott A.; Wang, Yansen; Lang, Stephen E.; Johnson, Daniel E.

    2004-01-01

    A revised Bayesian algorithm for estimating surface rain rate, convective rain proportion, and latent heating/drying profiles from satellite-borne passive microwave radiometer observations over ocean backgrounds is described. The algorithm searches a large database of cloud-radiative model simulations to find cloud profiles that are radiatively consistent with a given set of microwave radiance measurements. The properties of these radiatively consistent profiles are then composited to obtain best estimates of the observed properties. The revised algorithm is supported by an expanded and more physically consistent database of cloud-radiative model simulations. The algorithm also features a better quantification of the convective and non-convective contributions to total rainfall, a new geographic database, and an improved representation of background radiances in rain-free regions. Bias and random error estimates are derived from applications of the algorithm to synthetic radiance data, based upon a subset of cloud resolving model simulations, and from the Bayesian formulation itself. Synthetic rain rate and latent heating estimates exhibit a trend of high (low) bias for low (high) retrieved values. The Bayesian estimates of random error are propagated to represent errors at coarser time and space resolutions, based upon applications of the algorithm to TRMM Microwave Imager (TMI) data. Errors in instantaneous rain rate estimates at 0.5 deg resolution range from approximately 50% at 1 mm/h to 20% at 14 mm/h. These errors represent about 70-90% of the mean random deviation between collocated passive microwave and spaceborne radar rain rate estimates. The cumulative algorithm error in TMI estimates at monthly, 2.5 deg resolution is relatively small (less than 6% at 5 mm/day) compared to the random error due to infrequent satellite temporal sampling (8-35% at the same rain rate).
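    The core of such a Bayesian retrieval — weighting database profiles by their radiative consistency with the observation and compositing them — can be sketched as follows. The database, observed radiances, and error covariance below are mock values, not the algorithm's actual inputs.

```python
import numpy as np

rng = np.random.default_rng(1)

# Mock database: simulated profiles with rain rate x and radiances y (4 channels).
n_db, n_chan = 500, 4
x_db = rng.gamma(2.0, 2.0, n_db)                               # rain rates (mm/h)
y_db = np.outer(x_db, np.ones(n_chan)) + rng.normal(0, 1.0, (n_db, n_chan))

# Observed radiance vector and an assumed diagonal error covariance (inverse).
y_obs = np.full(n_chan, 6.0)
s_inv = np.eye(n_chan) / 1.0**2

# Bayesian weights ~ exp(-0.5 * (y_obs - y_i)^T S^-1 (y_obs - y_i)); the estimate
# is the weighted composite of the radiatively consistent profiles.
d = y_db - y_obs
w = np.exp(-0.5 * np.einsum('ij,jk,ik->i', d, s_inv, d))
x_hat = np.sum(w * x_db) / np.sum(w)
```

Profiles whose simulated radiances are far from the observation receive negligible weight, so the composite is dominated by radiatively consistent database entries.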

  18. Exposure to Celebrity-Endorsed Small Cigar Promotions and Susceptibility to Use among Young Adult Cigarette Smokers

    PubMed Central

    Sterling, Kymberle L.; Moore, Roland S.; Pitts, Nicole; Duong, Melissa; Ford, Kentya H.; Eriksen, Michael P.

    2013-01-01

    Small cigar smoking among young adult cigarette smokers may be attributed to their exposure to its advertisements and promotions. We examined the association between exposure to a celebrity music artist's endorsement of a specific brand of small cigars and young adult cigarette smokers' susceptibility to smoking that brand. Venue-based sampling procedures were used to select and survey a random sample of 121 young adult cigarette smokers, aged 18–35. Fourteen percent reported exposure to the artist's endorsement of the small cigar and 45.4% reported an intention to smoke the product in the future. The odds of small cigar smoking susceptibility increased threefold for those who reported exposure to the endorsement compared to those not exposed (OR = 3.64, 95% CI 1.06 to 12.54). Past 30-day small cigar use (OR = 3.30, 95% CI 1.24 to 8.74) and past 30-day cigar use (OR = 5.08, 95% CI 1.23, 21.08) were also associated with susceptibility to smoke a small cigar. An association between young adult cigarette smokers' exposure to the music artist's small cigar endorsement and their susceptibility to smoke small cigars was found. This association underscores the importance of monitoring small cigar promotions geared toward young people and their impact on small cigar product smoking. PMID:24371444
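    The reported odds ratios with 95% confidence intervals follow the standard 2x2-table computation with a Wald interval on the log scale. A short sketch with hypothetical counts (not the survey's raw data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and Wald 95% CI from a 2x2 table [[a, b], [c, d]] of outcome by exposure."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: susceptible/not susceptible by exposed/not exposed.
or_, lo, hi = odds_ratio_ci(10, 7, 25, 79)
```

Note the width of the reported intervals (e.g. 1.06 to 12.54): with n = 121, cell counts are small and the log-scale standard error is correspondingly large.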

  19. SELWAY-BITTERROOT WILDERNESS, IDAHO AND MONTANA.

    USGS Publications Warehouse

    Toth, Margo I.; Zilka, Nicholas T.

    1984-01-01

    Mineral-resource studies of the Selway-Bitterroot Wilderness in Idaho County, Idaho, and Missoula and Ravalli Counties, Montana, were carried out. Four areas of probable mineral-resource potential and one small area of substantiated potential were recognized. The areas of the Running Creek, Painted Rocks, and Whistling Pig plutons of Tertiary age have probable resource potential for molybdenum, although detailed geochemical sampling and surface investigations failed to identify mineralized systems at the surface. Randomly distributed breccia zones along a fault in the vicinity of the Cliff mine have substantiated potential for small silver-copper-lead resources.

  20. A post hoc evaluation of a sample size re-estimation in the Secondary Prevention of Small Subcortical Strokes study.

    PubMed

    McClure, Leslie A; Szychowski, Jeff M; Benavente, Oscar; Hart, Robert G; Coffey, Christopher S

    2016-10-01

    The use of adaptive designs has been increasing in randomized clinical trials. Sample size re-estimation is a type of adaptation in which nuisance parameters are estimated at an interim point in the trial and the sample size re-computed based on these estimates. The Secondary Prevention of Small Subcortical Strokes study was a randomized clinical trial assessing the impact of single- versus dual-antiplatelet therapy and control of systolic blood pressure to a higher (130-149 mmHg) versus lower (<130 mmHg) target on recurrent stroke risk in a two-by-two factorial design. A sample size re-estimation was performed during the Secondary Prevention of Small Subcortical Strokes study, resulting in an increase from the planned sample size of 2500 to 3020, and we sought to determine the impact of the sample size re-estimation on the study results. We assessed the results of the primary efficacy and safety analyses with the full 3020 patients and compared them to the results that would have been observed had randomization ended with 2500 patients. The primary efficacy outcome considered was recurrent stroke, and the primary safety outcomes were major bleeds and death. We computed incidence rates for the efficacy and safety outcomes and used Cox proportional hazards models to examine the hazard ratios for each of the two treatment interventions (i.e. the antiplatelet and blood pressure interventions). In the antiplatelet intervention, the hazard ratio was not materially modified by increasing the sample size, nor did the conclusions regarding the efficacy of mono- versus dual-therapy change: there was no difference between dual- and monotherapy in the risk of recurrent stroke (n = 3020 HR (95% confidence interval): 0.92 (0.72, 1.2), p = 0.48; n = 2500 HR (95% confidence interval): 1.0 (0.78, 1.3), p = 0.85). 
With respect to the blood pressure intervention, increasing the sample size resulted in less certainty in the results, as the hazard ratio for the higher versus lower systolic blood pressure target approached, but did not achieve, statistical significance with the larger sample (n = 3020 HR (95% confidence interval): 0.81 (0.63, 1.0), p = 0.089; n = 2500 HR (95% confidence interval): 0.89 (0.68, 1.17), p = 0.40). The results from the safety analyses were similar with 3020 and with 2500 patients for both study interventions. Other trial-related factors, such as contracts, finances, and study management, were impacted as well. Adaptive designs can have benefits in randomized clinical trials, but do not always result in significant findings. The impact of adaptive designs should be measured in terms of both trial results and practical issues related to trial management. More post hoc analyses of study adaptations will lead to better understanding of the balance between the benefits and the costs. © The Author(s) 2016.
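    The general mechanics of a nuisance-parameter-based re-estimation can be illustrated with the standard two-proportion sample size formula: nuisance quantities (here, event rates) estimated at the interim replace the design assumptions and the size is recomputed. The rates below are hypothetical and do not reproduce the SPS3 calculation.

```python
import math
from statistics import NormalDist

def n_per_arm(p1, p2, alpha=0.05, power=0.8):
    """Approximate per-arm sample size for comparing two proportions."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    var = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_a + z_b) ** 2 * var / (p1 - p2) ** 2)

# Design assumptions vs interim nuisance estimates (both hypothetical).
n_planned = n_per_arm(0.08, 0.06)     # planned event rates
n_revised = n_per_arm(0.07, 0.0525)   # interim: lower event rates, so larger n
```

Lower-than-expected event rates at the interim inflate the required sample size, the same direction of adjustment as the 2500-to-3020 increase described in the abstract.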

  1. A methodology for small scale rural land use mapping in semi-arid developing countries using orbital imagery. Part 5: Experimental and operational techniques of mapping land use

    NASA Technical Reports Server (NTRS)

    Vangenderen, J. L. (Principal Investigator); Lock, B. F.

    1976-01-01

    The author has identified the following significant results. The scope of the preprocessing techniques was restricted to standard material from the EROS Data Center, accompanied by some enlarging procedures and the use of the diazo process. Investigation has shown that the most appropriate sampling strategy for this study is the stratified random technique. A viable sampling procedure is presented, together with a method for determining the minimum number of sample points needed to test the results of any interpretation.
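    A common way to set the minimum number of sample points for testing interpretation accuracy is the binomial precision formula n = z²p(1-p)/E². The accuracy and tolerance values below are assumed for illustration, not taken from the report.

```python
import math

def min_sample_points(p=0.85, half_width=0.05, conf_z=1.96):
    """Minimum points to estimate interpretation accuracy p to within ±half_width."""
    return math.ceil(conf_z**2 * p * (1 - p) / half_width**2)

# Assumed: 85% expected accuracy, ±5 percentage points at 95% confidence.
n_min = min_sample_points()
```

A conservative variant sets p = 0.5 (the worst case), which maximizes p(1-p) and hence the required number of points.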

  2. Cluster-Randomized Trial to Increase Hepatitis B Testing among Koreans in Los Angeles.

    PubMed

    Bastani, Roshan; Glenn, Beth A; Maxwell, Annette E; Jo, Angela M; Herrmann, Alison K; Crespi, Catherine M; Wong, Weng K; Chang, L Cindy; Stewart, Susan L; Nguyen, Tung T; Chen, Moon S; Taylor, Victoria M

    2015-09-01

    In the United States, Korean immigrants experience a disproportionately high burden of chronic hepatitis B (HBV) viral infection and associated liver cancer compared with the general population. However, despite clear clinical guidelines, HBV serologic testing among Koreans remains persistently suboptimal. We conducted a cluster-randomized trial to evaluate a church-based small group intervention to improve HBV testing among Koreans in Los Angeles. Fifty-two Korean churches, stratified by size (small, medium, large) and location (Koreatown versus other), were randomized to intervention or control conditions. Intervention church participants attended a single-session small-group discussion on liver cancer and HBV testing, and control church participants attended a similar session on physical activity and nutrition. Outcome data consisted of self-reported HBV testing obtained via 6-month telephone follow-up interviews. We recruited 1,123 individuals, 18 to 64 years of age, across the 52 churches. Ninety-two percent of the sample attended the assigned intervention session and 86% completed the 6-month follow-up. Sample characteristics were as follows: mean age 46 years, 65% female, 97% born in Korea, 69% completed some college, and 43% insured. In an intent-to-treat analysis, the intervention produced a statistically significant effect (OR = 4.9, P < 0.001), with 19% of intervention and 6% of control group participants reporting a HBV test. Our intervention was successful in achieving a large and robust effect in a population at high risk of HBV infection and sequelae. The intervention was fairly resource efficient and thus has high potential for replication in other high-risk Asian groups. ©2015 American Association for Cancer Research.

  3. Prediction of soil attributes through interpolators in a deglaciated environment with complex landforms

    NASA Astrophysics Data System (ADS)

    Schünemann, Adriano Luis; Inácio Fernandes Filho, Elpídio; Rocha Francelino, Marcio; Rodrigues Santos, Gérson; Thomazini, Andre; Batista Pereira, Antônio; Gonçalves Reynaud Schaefer, Carlos Ernesto

    2017-04-01

    Values of environmental variables at non-sampled sites can be estimated from a minimum data set through interpolation techniques. Kriging and the Random Forest algorithm are examples of predictors used for this purpose. The objective of this work was to compare methods of soil-attribute spatialization in a recently deglaciated environment with complex landforms. Prediction of the selected soil attributes (potassium, calcium and magnesium) from ice-free areas was tested using morphometric covariables, and using geostatistical models without these covariables. For this, 106 soil samples were collected at 0-10 cm depth in Keller Peninsula, King George Island, Maritime Antarctica. Soil chemical analysis was performed by the gravimetric method, determining values of potassium, calcium and magnesium for each sampled point. Digital terrain models (DTMs) were obtained using a Terrestrial Laser Scanner. DTMs were generated from a point cloud at spatial resolutions of 1, 5, 10, 20 and 30 m. Hence, 40 morphometric covariates were generated. Simple kriging was performed using the R software package. The same data set, coupled with the morphometric covariates, was used to predict values of the studied attributes at non-sampled sites through the Random Forest interpolator. Little difference was observed between the maps generated by the simple kriging and Random Forest interpolators. Also, DTMs with better spatial resolution did not improve the quality of soil attribute prediction. Results revealed that simple kriging can be used as an interpolator when morphometric covariates are not available, with little impact on quality. It is necessary to go further in soil chemical attribute prediction techniques, especially in periglacial areas with complex landforms.
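    A minimal simple-kriging sketch (the study used R; this is Python for illustration). The coordinates, covariance parameters, and potassium values below are invented, and no morphometric covariates are used.

```python
import numpy as np

def simple_kriging(coords, values, target, mean, sill=1.0, a=50.0):
    """Simple kriging with known mean and exponential covariance C(h) = sill*exp(-h/a)."""
    # Covariance among the sampled points.
    h = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    C = sill * np.exp(-h / a)
    # Covariance between sampled points and the prediction target.
    c0 = sill * np.exp(-np.linalg.norm(coords - target, axis=1) / a)
    w = np.linalg.solve(C, c0)          # simple kriging weights
    return mean + w @ (values - mean)   # residuals weighted around the known mean

# Illustrative soil potassium values at four sample points (units arbitrary).
coords = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
k_vals = np.array([0.30, 0.34, 0.28, 0.33])
pred = simple_kriging(coords, k_vals, np.array([5.0, 5.0]), mean=0.31)
```

Unlike ordinary kriging, simple kriging assumes the mean is known; predictions are pulled toward it where data are sparse.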

  4. Extending cluster Lot Quality Assurance Sampling designs for surveillance programs

    PubMed Central

    Hund, Lauren; Pagano, Marcello

    2014-01-01

    Lot quality assurance sampling (LQAS) has a long history of applications in industrial quality control. LQAS is frequently used for rapid surveillance in global health settings, with areas classified as poor or acceptable performance based on the binary classification of an indicator. Historically, LQAS surveys have relied on simple random samples from the population; however, implementing two-stage cluster designs for surveillance sampling is often more cost-effective than simple random sampling. By applying survey sampling results to the binary classification procedure, we develop a simple and flexible non-parametric procedure to incorporate clustering effects into the LQAS sample design to appropriately inflate the sample size, accommodating finite numbers of clusters in the population when relevant. We use this framework to then discuss principled selection of survey design parameters in longitudinal surveillance programs. We apply this framework to design surveys to detect rises in malnutrition prevalence in nutrition surveillance programs in Kenya and South Sudan, accounting for clustering within villages. By combining historical information with data from previous surveys, we design surveys to detect spikes in the childhood malnutrition rate. PMID:24633656
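    The simplest way to see how clustering inflates an SRS-based sample size is the classical design effect DEFF = 1 + (m - 1)ρ for clusters of size m with intracluster correlation ρ; the paper's procedure is more general and non-parametric, so the sketch below is only the textbook special case with assumed values.

```python
import math

def clustered_lqas_n(n_srs, cluster_size, icc):
    """Inflate an SRS-based LQAS sample size by the design effect DEFF = 1 + (m-1)*rho."""
    deff = 1 + (cluster_size - 1) * icc
    return math.ceil(n_srs * deff)

# Assumed: SRS design of 50 children, 10 per village, ICC of 0.05.
n_clustered = clustered_lqas_n(n_srs=50, cluster_size=10, icc=0.05)
```

Even a modest ICC raises the required sample noticeably once clusters are large, which is why survey-sampling corrections matter for two-stage LQAS designs.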

  5. Non-wadeable river bioassessment: spatial variation of benthic diatom assemblages in Pacific Northwest rivers, USA

    EPA Science Inventory

    Current bioassessment efforts are focused on small wadeable streams, at least partly because assessing ecological conditions in non-wadeable large rivers poses many additional challenges. In this study, we sampled 20 sites in each of seven large rivers in the Pacific Northwest, U...

  6. Using machine learning to predict radiation pneumonitis in patients with stage I non-small cell lung cancer treated with stereotactic body radiation therapy

    NASA Astrophysics Data System (ADS)

    Valdes, Gilmer; Solberg, Timothy D.; Heskel, Marina; Ungar, Lyle; Simone, Charles B., II

    2016-08-01

    To develop a patient-specific ‘big data’ clinical decision tool to predict pneumonitis in stage I non-small cell lung cancer (NSCLC) patients after stereotactic body radiation therapy (SBRT). 61 features were recorded for 201 consecutive patients with stage I NSCLC treated with SBRT, in whom 8 (4.0%) developed radiation pneumonitis. Pneumonitis thresholds were found for each feature individually using decision stumps. The performance of three different algorithms (Decision Trees, Random Forests, RUSBoost) was evaluated. Learning curves were developed and the training error analyzed and compared to the testing error in order to evaluate the factors needed to obtain a cross-validated error smaller than 0.1. These included the addition of new features, increasing the complexity of the algorithm and enlarging the sample size and number of events. In the univariate analysis, the most important feature selected was the diffusion capacity of the lung for carbon monoxide (DLCO adj%). On multivariate analysis, the three most important features selected were the dose to 15 cc of the heart, dose to 4 cc of the trachea or bronchus, and race. Higher accuracy could be achieved if the RUSBoost algorithm was used with regularization. To predict radiation pneumonitis within an error smaller than 10%, we estimate that a sample size of 800 patients is required. Clinically relevant thresholds that put patients at risk of developing radiation pneumonitis were determined in a cohort of 201 stage I NSCLC patients treated with SBRT. The consistency of these thresholds can provide radiation oncologists with an estimate of their reliability and may inform treatment planning and patient counseling. The accuracy of the classification is limited by the number of patients in the study and not by the features gathered or the complexity of the algorithm.
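    A per-feature threshold search of the kind described can be sketched as a one-level decision stump: try each candidate cut and keep the one minimizing misclassification. The DLCO-like toy data below are invented for illustration.

```python
import numpy as np

def best_stump_threshold(x, y):
    """Threshold on one feature minimizing misclassification of binary labels y."""
    xs = np.sort(x)
    best_err, best_thr, best_sign = np.inf, xs[0], 1
    # Candidate cuts are midpoints between consecutive sorted feature values.
    for thr in (xs[:-1] + xs[1:]) / 2:
        pred = (x > thr).astype(int)
        # Try both orientations: event above the cut, or event below it.
        for sign, p in ((1, pred), (-1, 1 - pred)):
            err = np.mean(p != y)
            if err < best_err:
                best_err, best_thr, best_sign = err, thr, sign
    return best_thr, best_sign, best_err

# Toy data: low DLCO associated with the event (labels and values invented).
x = np.array([45, 50, 55, 58, 62, 70, 80, 90], dtype=float)
y = np.array([1, 1, 1, 1, 0, 0, 0, 0])
thr, sign, err = best_stump_threshold(x, y)
```

Applied feature by feature, this yields the kind of univariate risk thresholds the abstract reports before moving to multivariate ensembles.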

  7. The Bim deletion polymorphism clinical profile and its relation with tyrosine kinase inhibitor resistance in Chinese patients with non-small cell lung cancer.

    PubMed

    Zhao, Mingchuan; Zhang, Yishi; Cai, Weijing; Li, Jiayu; Zhou, Fei; Cheng, Ningning; Ren, Ruixin; Zhao, Chao; Li, Xuefei; Ren, Shengxiang; Zhou, Caicun; Hirsch, Fred R

    2014-08-01

    Epidermal growth factor receptor (EGFR) tyrosine kinase inhibitors (TKIs) are widely used for the treatment of patients with advanced non-small cell lung cancer (NSCLC) who have EGFR mutations. Recent studies have indicated that some patients with positive mutations were refractory to EGFR TKIs if they harbored a B-cell chronic lymphocytic leukemia/lymphoma (Bcl-2)-like 11 (Bim) deletion polymorphism. The objective of the current work was to retrospectively study the Bim deletion polymorphism in Chinese patients with NSCLC and its correlation with the efficacy of EGFR TKIs. Distribution of the Bim polymorphism was detected using polymerase chain reaction analysis and direct sequencing of DNA from peripheral neutrophils in samples from 352 patients with NSCLC. Of the 352 patients, 166 who received TKI therapy and had an activating mutation identified were involved in further analysis. Progression-free survival (PFS) was the primary endpoint of the subsequent analyses, and the incidence of the Bim polymorphism and its relation to clinical benefit from EGFR TKIs also were investigated. In total, 45 of 352 patient samples (12.8%) had the Bim deletion polymorphism, which was distributed randomly with regard to various clinical characteristics. In patients with EGFR mutations who received treatment with TKIs, the median PFS and the median objective response rate were 4.7 months and 25%, respectively, for those with the Bim deletion polymorphism versus 11 months (P = .003) and 66% (P = .001), respectively, for those with wild-type Bim. Cox regression analysis identified Bim status (P = .016) and sex (P = .002) as independent factors predicting clinical benefit from EGFR TKIs in patients with EGFR-mutated NSCLC. The incidence of the Bim deletion polymorphism was approximately 13% in this study, and it was associated with a poor clinical response to EGFR TKIs in patients who had NSCLC with EGFR mutations. © 2014 American Cancer Society.

  8. Non-stationarities in the relationships of heavy precipitation events in the Mediterranean area and the large-scale circulation in the second half of the 20th century

    NASA Astrophysics Data System (ADS)

    Merkenschlager, Christian; Hertig, Elke; Jacobeit, Jucundus

    2017-04-01

    In the context of analyzing temporally varying relationships between heavy precipitation events in the Mediterranean area and associated anomalies of the large-scale circulation, quantile regression models were established. The models were calibrated using different circulation and thermodynamic variables at the 700 hPa and 850 hPa levels as predictors, as well as daily precipitation time series at different stations in the Mediterranean area as predictand. Analyses were done for the second half of the 20th century. For the purpose of assessing non-stationarities in the predictor-predictand relationships, the time series were divided into calibration and validation periods. 100 randomized subsamples were used to calibrate/validate the models under stationary conditions. The highest and lowest skill scores of the 100 random samples were used to determine the range of random variability. The model performance under non-stationary conditions was derived from the skill scores of cross-validated running subintervals. If the skill scores of several consecutive years fell outside the range of random variability, a non-stationarity was declared. Particularly the Iberian Peninsula and the Levant region were affected by non-stationarities, the former with significant positive deviations of the skill scores, the latter with significant negative deviations. By means of a case study for the Levant region we determined three possible reasons for non-stationary behavior in the predictor-predictand relationships. The Mediterranean Oscillation, as a superordinate system, affects the cyclone activity in the Mediterranean basin and the location and intensity of the Cyprus low. Overall, it is demonstrated that non-stationarities have to be taken into account within statistical downscaling model development.
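    Skill scores for quantile regression models are commonly based on the pinball (quantile) loss relative to a reference forecast; the paper's exact definition may differ, so the sketch below uses synthetic data and a climatological reference purely for illustration.

```python
import numpy as np

def pinball_loss(y, q_pred, tau):
    """Average quantile (pinball) loss at level tau."""
    e = y - q_pred
    return np.mean(np.maximum(tau * e, (tau - 1) * e))

def skill_score(y, q_model, q_ref, tau):
    """1 = perfect forecast, 0 = no better than the reference, < 0 = worse."""
    return 1.0 - pinball_loss(y, q_model, tau) / pinball_loss(y, q_ref, tau)

rng = np.random.default_rng(2)
x = rng.normal(0.0, 0.5, 1000)       # synthetic large-scale predictor
base = rng.gamma(2.0, 5.0, 1000)     # precipitation-like noise
y = np.exp(x) * base                 # synthetic station precipitation

q_ref = np.full_like(y, np.quantile(y, 0.95))   # climatological reference
q_model = np.exp(x) * np.quantile(base, 0.95)   # predictor-conditioned quantile
ss = skill_score(y, q_model, q_ref, 0.95)
```

Evaluating this score on running subintervals, and comparing against the spread obtained from randomized calibration/validation splits, is the kind of procedure the abstract uses to flag non-stationarities.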

  9. Designing a national soil erosion monitoring network for England and Wales

    NASA Astrophysics Data System (ADS)

    Lark, Murray; Rawlins, Barry; Anderson, Karen; Evans, Martin; Farrow, Luke; Glendell, Miriam; James, Mike; Rickson, Jane; Quine, Timothy; Quinton, John; Brazier, Richard

    2014-05-01

    Although soil erosion is recognised as a significant threat to sustainable land use and may be a priority for action in any forthcoming EU Soil Framework Directive, those responsible for setting national policy with respect to erosion are constrained by a lack of robust, representative data at large spatial scales. This reflects the process-orientated nature of much soil erosion research. Recognising this limitation, the UK Department for Environment, Food and Rural Affairs (Defra) established a project to pilot a cost-effective framework for monitoring of soil erosion in England and Wales (E&W). The pilot will compare different soil erosion monitoring methods at a site scale and provide statistical information for the final design of the full national monitoring network, which will: (i) provide unbiased estimates of the spatial mean of soil erosion rate (tonnes ha-1 yr-1) across E&W for each of three land-use classes (arable and horticultural; grassland; upland and semi-natural habitats); and (ii) quantify the uncertainty of these estimates with confidence intervals. Probability (design-based) sampling provides the most efficient unbiased estimates of spatial means. In this study, a 16 hectare area (a square of 400 x 400 m) positioned at the centre of a 1-km grid cell, selected at random from mapped land use across E&W, provided the sampling support for measurement of erosion rates, with at least 94% of the support area corresponding to the target land use classes. Very small or zero erosion rates likely to be encountered at many sites reduce the sampling efficiency and make it difficult to compare different methods of soil erosion monitoring. Therefore, to increase the proportion of samples with larger erosion rates without biasing our estimates, we increased the inclusion probability density in areas where the erosion rate is likely to be large by using stratified random sampling. First, each sampling domain (land use class in E&W) was divided into strata; e.g. 
two sub-domains within which, respectively, small or no erosion rates, and moderate or larger erosion rates are expected. Each stratum was then sampled independently and at random. The sample density need not be equal in the two strata, but is known and is accounted for in the estimation of the mean and its standard error. To divide the domains into strata we used information on slope angle, previous interpretation of the erosion susceptibility of the soil associations that correspond to the soil map of E&W at 1:250 000 (Soil Survey of England and Wales, 1983), and visual interpretation of evidence of erosion from aerial photography. While each domain could be stratified on the basis of the first two criteria, air photo interpretation across the whole country was not feasible. For this reason we used a two-phase random sampling for stratification (TPRS) design (de Gruijter et al., 2006). First, we formed an initial random sample of 1-km grid cells from the target domain. Second, each cell was allocated to a stratum on the basis of the three criteria. A subset of the selected cells from each stratum was then selected for field survey at random, with a specified sampling density for each stratum so as to increase the proportion of cells where moderate or larger erosion rates were expected. Once measurements of erosion have been made, an estimate of the spatial mean of the erosion rate over the target domain, its standard error and associated uncertainty can be calculated by an expression which accounts for the estimated proportions of the two strata within the initial random sample. de Gruijter, J.J., Brus, D.J., Bierkens, M.F.P. & Knotters, M. 2006. Sampling for Natural Resource Monitoring. Springer, Berlin. Soil Survey of England and Wales. 1983. National Soil Map NATMAP Vector 1:250,000. National Soil Research Institute, Cranfield University.
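    The stratified estimate of the mean and its standard error described above can be sketched as follows; the stratum weights and erosion rates are invented for illustration, and the two-phase (TPRS) adjustment for estimated stratum proportions is not included.

```python
import numpy as np

def stratified_estimate(samples, weights):
    """Stratified mean and standard error; weights are stratum area proportions."""
    means = np.array([np.mean(s) for s in samples])
    # Variance of each stratum mean: s_h^2 / n_h (with-replacement approximation).
    var_means = np.array([np.var(s, ddof=1) / len(s) for s in samples])
    w = np.asarray(weights, dtype=float)
    mean = np.sum(w * means)                 # sum_h W_h * ybar_h
    se = np.sqrt(np.sum(w**2 * var_means))   # sqrt(sum_h W_h^2 * s_h^2 / n_h)
    return mean, se

# Illustrative erosion rates (t/ha/yr): low-risk vs high-risk strata,
# with the low-risk stratum covering 90% of the domain.
low = [0.0, 0.0, 0.1, 0.0, 0.2, 0.0]
high = [1.2, 0.8, 2.5, 0.4, 1.6]
mean, se = stratified_estimate([low, high], [0.9, 0.1])
```

Because each stratum is weighted by its known area proportion, oversampling the high-erosion stratum sharpens the estimate without biasing the domain mean.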

  10. Causal role for inverse reasoning on obsessive-compulsive symptoms: Preliminary evidence from a cognitive bias modification for interpretation bias study.

    PubMed

    Wong, Shiu F; Grisham, Jessica R

    2017-12-01

    The inference-based approach (IBA) is a cognitive account of the genesis and maintenance of obsessive-compulsive disorder (OCD). According to the IBA, individuals with OCD are prone to using inverse reasoning, in which hypothetical causes form the basis of conclusions about reality. Several studies have provided preliminary support for an association between features of the IBA and OCD symptoms. However, there are currently no studies that have investigated the proposed causal relationship of inverse reasoning in OCD. In a non-clinical sample (N = 187), we used an interpretive cognitive bias procedure to train a bias towards using inverse reasoning (n = 64), healthy sensory-based reasoning (n = 65), or a control condition (n = 58). Participants were randomly allocated to these training conditions. This manipulation allowed us to assess whether, consistent with the IBA, inverse reasoning training increased compulsive-like behaviours and self-reported OCD symptoms. Results indicated that compared to a control condition, participants trained in inverse reasoning reported more OCD symptoms and were more avoidant of potentially contaminated objects. Moreover, change in inverse reasoning bias was a small but significant mediator of the relationship between training condition and behavioural avoidance. Conversely, training in a healthy (non-inverse) reasoning style did not have any effect on symptoms or behaviour relative to the control condition. As this study was conducted in a non-clinical sample, we were unable to generalise our findings to a clinical population. Findings generally support the IBA model by providing preliminary evidence of a causal role for inverse reasoning in OCD. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Short Term Culture of Vitrified Human Ovarian Cortical Tissue to Assess the Cryopreservation Outcome: Molecular and Morphological Analysis.

    PubMed

    Ramezani, Mehdi; Salehnia, Mojdeh; Jafarabadi, Mina

    2017-01-01

    The aim of the present study was to evaluate the effectiveness of a human ovarian vitrification protocol followed by in vitro culture, at the morphological and molecular levels. Ovarian tissues were obtained from 10 normal transsexual women, cut into small pieces, and divided into non-vitrified and vitrified groups; some of the tissue fragments in both groups were randomly selected and cultured for two weeks. Morphology was studied using hematoxylin and eosin and Masson's trichrome staining. Mean follicular density and 17-β estradiol (E2) and anti-Müllerian hormone (AMH) concentrations were analysed, and real-time RT-PCR was done to evaluate the expression of genes related to folliculogenesis. Data were compared by paired-samples and independent-samples t-tests. Values of p<0.05 were considered statistically significant. The proportion of normal follicles did not show a significant difference between the vitrified and non-vitrified groups before and after culture, but this proportion and the mean follicle density decreased significantly in both cultured tissues (p<0.05). The expression of genes was similar in the vitrified and non-vitrified groups, but in cultured tissues the expression of the GDF9 and FSHR genes increased and the expression of the FIGLA and KIT-L genes decreased (p<0.05). An increase in E2 and AMH concentration was observed after 14 days of culture in both groups. In conclusion, the present study indicated that follicular development and gene expression in vitrified ovarian tissue were not altered before or after in vitro culture; thus this method could be useful for fertility preservation. However, additional studies are needed to improve the culture conditions.

  12. Affluence as a predictor of vaccine refusal and underimmunization in California private kindergartens.

    PubMed

    McNutt, Louise-Anne; Desemone, Cristina; DeNicola, Erica; El Chebib, Hassan; Nadeau, Jessica A; Bednarczyk, Robert A; Shaw, Jana

    2016-03-29

    Non-medical vaccine exemption rates in California private schools far exceed those of public schools, but little is known about specific factors which may be associated with high exemption rates in private schools. The percentage of personal-belief exemptions (PBEs) among California public and private kindergartens was computed for the 2000-2001 through 2014-2015 academic years. For the 2014-2015 academic year, a random sample of private schools was selected to investigate associations between kindergarten characteristics (tuition amount, religious affiliation) and vaccine profile (non-medical vaccine exemptions, vaccine coverage). The proportion of private kindergartens reporting 5% or more children with PBEs increased from 9% (2000-2001) to 34% (2013-2014), followed by a small decrease in 2014-2015 (31%). Overall, 93.7% (565/605) of kindergartens sampled in 2014-2015 had data available. Very high PBE levels (>20%) were seen among secular and non-Catholic, Christian kindergartens but not Roman Catholic, Jewish or Islamic kindergartens. However, the majority of schools at all tuition levels had fewer than 5% of children with a PBE. Kindergartens with an annual tuition of $10,000 or more were over twice as likely to have 20% or more children with PBEs than kindergartens with a lower tuition (p<.01). Additionally, the conditional admission proportions for kindergartens with tuitions of $10,000 or more were 39% compared to 22% for less expensive kindergartens (p<.01). Only about half of all private kindergartens had 95% coverage of the MMR (49%) and pertussis-containing vaccines (51%). School-entry vaccination requirements are critical to preventing outbreaks of vaccine preventable diseases in the US. Nonmedical exemptions increased between the 2000-2001 and 2014-2015 academic years and appear to be associated with affluence, raising social justice concerns. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. Near-optimal alternative generation using modified hit-and-run sampling for non-linear, non-convex problems

    NASA Astrophysics Data System (ADS)

    Rosenberg, D. E.; Alafifi, A.

    2016-12-01

    Water resources systems analysis often focuses on finding optimal solutions. Yet an optimal solution is optimal only for the modelled issues, and managers often seek near-optimal alternatives that address un-modelled objectives, preferences, limits, uncertainties, and other issues. Early on, Modelling to Generate Alternatives (MGA) formalized the near-optimal region as the original problem constraints plus a new constraint that allowed performance within a specified tolerance of the optimal objective function value. MGA identified a few maximally-different alternatives from the near-optimal region. Subsequent work applied Markov Chain Monte Carlo (MCMC) sampling to generate a larger number of alternatives that span the near-optimal region of linear problems, or selected portions of it for non-linear problems. We extend the MCMC Hit-And-Run method to generate alternatives that span the full extent of the near-optimal region for non-linear, non-convex problems. First, start at a feasible hit point within the near-optimal region, then run a random distance in a random direction to a new hit point. Next, repeat until the desired number of alternatives is generated. The key step at each iteration is to run a random distance along the line in the specified direction to a new hit point. If linear equality constraints exist, we construct an orthogonal basis and use a null-space transformation to confine hits and runs to a lower-dimensional space. Linear inequality constraints define the convex bounds on the line that runs through the current hit point in the specified direction. We then use slice sampling to identify a new hit point along the line within bounds defined by the non-linear inequality constraints. 
This technique is computationally efficient compared to prior near-optimal alternative generation techniques such as MGA, MCMC Metropolis-Hastings, evolutionary, or firefly algorithms because the search at each iteration is confined to the hit line, the algorithm can move in one step to any point in the near-optimal region, and each iteration generates a new, feasible alternative. We use the method to generate alternatives that span the near-optimal regions of simple and more complicated water management problems and that may be preferred to optimal solutions. We also discuss extensions to handle non-linear equality constraints.
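    The hit-and-run loop described above can be illustrated with a minimal sketch. This is not the authors' implementation: the toy objective, box constraints, and tolerance are hypothetical stand-ins for a water-management model, and for brevity the slice-sampling step along the hit line is replaced by simple rejection of infeasible run lengths.

```python
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    # Toy non-convex objective (hypothetical stand-in for a water-management model)
    return (x[0] - 1) ** 2 + (x[1] + 0.5) ** 2 + 0.3 * np.sin(4 * x[0])

def in_near_optimal(x, f_best, tol=1.0, box=2.0):
    # Near-optimal region: original box constraints plus the objective-tolerance constraint
    return np.all(np.abs(x) <= box) and objective(x) <= f_best + tol

def hit_and_run(x0, f_best, n_alternatives=200):
    """Generate near-optimal alternatives by hit-and-run sampling."""
    x = np.array(x0, dtype=float)
    alternatives = []
    while len(alternatives) < n_alternatives:
        d = rng.normal(size=x.size)
        d /= np.linalg.norm(d)           # random direction on the unit sphere
        t = rng.uniform(-4.0, 4.0)       # random run length along the hit line
        candidate = x + t * d
        if in_near_optimal(candidate, f_best):  # keep only feasible hits
            x = candidate
            alternatives.append(candidate.copy())
    return np.array(alternatives)

x_start = np.array([1.0, -0.5])          # assume a known (near-)optimal point
alts = hit_and_run(x_start, objective(x_start))
```

    Every accepted point is itself a feasible, near-optimal alternative, which is the property that makes the method efficient relative to generate-and-test approaches.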

  14. Changes in Brain Volume and Cognition in a Randomized Trial of Exercise and Social Interaction in a Community-Based Sample of Non-Demented Chinese Elders

    PubMed Central

    Mortimer, James A.; Ding, Ding; Borenstein, Amy R.; DeCarli, Charles; Guo, Qihao; Wu, Yougui; Zhao, Qianhua; Chu, Shugang

    2013-01-01

    Physical exercise has been shown to increase brain volume and improve cognition in randomized trials of non-demented elderly. Although greater social engagement was found to reduce dementia risk in observational studies, randomized trials of social interventions have not been reported. A representative sample of 120 elderly from Shanghai, China was randomized to four groups (Tai Chi, Walking, Social Interaction, No Intervention) for 40 weeks. Two MRIs were obtained, one before the intervention period, the other after. A neuropsychological battery was administered at baseline, 20 weeks, and 40 weeks. Comparisons of changes in brain volume between the intervention groups and the No Intervention group were assessed by t-tests. Time-intervention group interactions for neuropsychological measures were evaluated with repeated-measures mixed models. Compared to the No Intervention group, significant increases in brain volume were seen in the Tai Chi and Social Intervention groups (p < 0.05). Improvements also were observed in several neuropsychological measures in the Tai Chi group, including the Mattis Dementia Rating Scale score (p = 0.004), the Trailmaking Test A (p = 0.002) and B (p = 0.0002), the Auditory Verbal Learning Test (p = 0.009), and verbal fluency for animals (p = 0.01). The Social Interaction group showed improvement on some, but fewer, neuropsychological indices. No differences were observed between the Walking and No Intervention groups. The findings differ from previous clinical trials in showing increases in brain volume and improvements in cognition with a largely non-aerobic exercise (Tai Chi). In addition, intellectual stimulation through social interaction was associated with increases in brain volume as well as with some cognitive improvements. PMID:22451320

  16. Prevalence of alcohol-impaired drivers based on random breath tests in a roadside survey in Catalonia (Spain).

    PubMed

    Alcañiz, Manuela; Guillén, Montserrat; Santolino, Miguel; Sánchez-Moscona, Daniel; Llatje, Oscar; Ramon, Lluís

    2014-04-01

    Sobriety checkpoints are not usually randomly located by traffic authorities. As such, information provided by non-random alcohol tests cannot be used to infer the characteristics of the general driving population. In this paper a case study is presented in which the prevalence of alcohol-impaired driving is estimated for the general population of drivers. A stratified probabilistic sample was designed to represent vehicles circulating in non-urban areas of Catalonia (Spain), a region characterized by its complex transportation network and dense traffic around the metropolis of Barcelona. Random breath alcohol concentration tests were performed during spring 2012 on 7596 drivers. The estimated prevalence of alcohol-impaired drivers was 1.29%, which is roughly a third of the rate obtained in non-random tests. Higher rates were found on weekends (1.90% on Saturdays and 4.29% on Sundays) and especially at night. The rate is higher for men (1.45%) than for women (0.64%) and it shows an increasing pattern with age. In vehicles with two occupants, the proportion of alcohol-impaired drivers is estimated at 2.62%, but when the driver was alone the rate drops to 0.84%, which might reflect the socialization of drinking habits. The results are compared with outcomes in previous surveys, showing a decreasing trend in the prevalence of alcohol-impaired drivers over time. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Decay of random correlation functions for unimodal maps

    NASA Astrophysics Data System (ADS)

    Baladi, Viviane; Benedicks, Michael; Maume-Deschamps, Véronique

    2000-10-01

    Since the pioneering results of Jakobson and subsequent work by Benedicks-Carleson and others, it is known that quadratic maps f_a(x) = a - x^2 admit a unique absolutely continuous invariant measure for a positive measure set of parameters a. For topologically mixing f_a, Young and Keller-Nowicki independently proved exponential decay of correlation functions for this a.c.i.m. and smooth observables. We consider random compositions of small perturbations f + ω_t, with f = f_a or another unimodal map satisfying certain nonuniform hyperbolicity axioms, and ω_t chosen independently and identically in [-ɛ, ɛ]. Baladi-Viana showed exponential mixing of the associated Markov chain, i.e., averaging over all random itineraries. We obtain stretched exponential bounds for the random correlation functions of Lipschitz observables for the sample measure μ_ω of almost every itinerary.

  18. Using near infrared spectroscopy to classify soybean oil according to expiration date.

    PubMed

    da Costa, Gean Bezerra; Fernandes, David Douglas Sousa; Gomes, Adriano A; de Almeida, Valber Elias; Veras, Germano

    2016-04-01

    A rapid and non-destructive methodology is proposed for screening edible vegetable oils according to conservation state (expiration date) employing near infrared (NIR) spectroscopy and chemometric tools. A total of fifty samples of soybean vegetable oil, of different brands and lots, were used in this study; these included thirty expired and twenty non-expired samples. Oil oxidation was measured by the peroxide index. NIR spectra were employed in raw form and preprocessed by offset baseline correction and the Savitzky-Golay derivative procedure, followed by PCA exploratory analysis, which showed that NIR spectra would be suitable for the classification of soybean oil samples. The classification models were based on SPA-LDA (Linear Discriminant Analysis coupled with the Successive Projections Algorithm) and PLS-DA (Discriminant Analysis by Partial Least Squares). The set of 50 samples was partitioned into a training group (35 samples: 15 non-expired and 20 expired) and a test group (15 samples: 5 non-expired and 10 expired) using three sample-selection approaches, (i) Kennard-Stone, (ii) Duplex, and (iii) random, in order to evaluate the robustness of the models. The results obtained for the independent test set (in terms of correct classification rate) were 96% and 98% for SPA-LDA and PLS-DA, respectively, indicating that NIR spectra can be used as an alternative to evaluate the degree of oxidation of soybean oil samples. Copyright © 2015 Elsevier Ltd. All rights reserved.
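    Of the three sample-selection approaches mentioned, Kennard-Stone is the most algorithmic: it seeds the training set with the two most distant samples and then greedily adds the sample farthest from the current selection, so that the training set spans the feature space. A minimal sketch follows; the random feature matrix stands in for NIR spectra and the 35/15 split sizes from the abstract are scaled down.

```python
import numpy as np

def kennard_stone(X, n_select):
    """Kennard-Stone sample selection: pick the two most distant samples,
    then repeatedly add the sample farthest from the selected set."""
    X = np.asarray(X, dtype=float)
    # Pairwise Euclidean distances between all samples
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    i, j = np.unravel_index(np.argmax(dist), dist.shape)
    selected = [int(i), int(j)]
    while len(selected) < n_select:
        remaining = [k for k in range(len(X)) if k not in selected]
        # For each candidate, distance to its nearest already-selected sample
        min_d = dist[np.ix_(remaining, selected)].min(axis=1)
        selected.append(remaining[int(np.argmax(min_d))])
    return selected

# Hypothetical feature vectors (rows = samples), standing in for NIR spectra
rng = np.random.default_rng(1)
X = rng.normal(size=(20, 5))
train_idx = kennard_stone(X, 14)   # e.g. 14 training samples, the rest for testing
```

    Because the selection is deterministic and spread-maximizing, comparing it against the random split (as the authors do) is a useful robustness check on the classifiers.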

  19. Lifetime Paid Work and Mental Health Problems among Poor Urban 9-to-13-Year-Old Children in Brazil

    PubMed Central

    Pires, Ivens H.; Paula, Cristiane S.

    2013-01-01

    Objective. To verify if emotional/behavioral problems are associated with lifetime paid work in poor urban children, when taking into account other potential correlates. Methods. Cross-sectional study focused on 9-to-13-year-old children (n = 212). In a probabilistic sample of clusters of eligible households (women 15–49 years and son/daughter <18 years), one mother-child pair was randomly selected per household (n = 813; response rate = 82.4%). CBCL/6-18 identified child emotional/behavioral problems. Potential correlates include child gender and age, socioeconomic status/SES, maternal education, parental working status, and family social isolation, among others. Multivariate analysis examined the relationship between emotional/behavioral problems and lifetime paid work in the presence of significant correlates. Findings. All work activities were non-harmful (e.g., selling fruits, helping parents at their small business, and baby sitting). Children with lower SES and socially isolated were more involved in paid work than less disadvantaged peers. Children ever exposed to paid work were four times more likely to present anxiety/depression symptoms at a clinical level compared to non-exposed children. Multivariate modeling identified three independent correlates: child pure internalizing problems, social isolation, and low SES. Conclusion. There is an association between lifetime exposure to exclusively non-harmful paid work activities and pure internalizing problems even when considering SES variability and family social isolation. PMID:24302872

  20. Non-every day statin administration--a literature review.

    PubMed

    Elis, Avishay; Lishner, Michael

    2012-07-01

    Statins are the treatment of choice for lowering LDL-C levels and reducing cardiovascular events. They have a remarkable safety profile, although some patients do not tolerate them. The aim of the study was to summarize the existing data on non-every day statin administration regimens. We searched the MEDLINE databases to identify articles on non-every day statin administration, published between 1990 and January 2010. All publications regardless of methodology, design, size, or language were included. Data extracted included study design, duration and aims, type of statin, therapeutic regimen, patient characteristics, effectiveness, tolerability, and costs. The 21 retrieved articles were characterized by small sample size, short follow up period, and a preponderance of males and "primary" prevention cases. Several lacked randomization or a control group. The heterogeneity of the study groups, medications, doses, design and aims precluded a pooled or meta-analysis. The most reported and effective regimens were atorvastatin and rosuvastatin on alternate days. These regimens, with or without other lipid lowering agents, were well tolerated even among subjects with previous statin intolerance, and produced meaningful cost savings. Nevertheless, the effectiveness of these regimens on cardiovascular events was not clarified. Atorvastatin or rosuvastatin on alternate days might be considered for patients who are intolerant to statin therapy. Further studies are needed to evaluate the effect of these regimens on cardiovascular events. Copyright © 2012 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.

  2. Non local means denoising in photoacoustic imaging

    NASA Astrophysics Data System (ADS)

    Siregar, Syahril; Nagaoka, Ryo; Haq, Israr Ul; Saijo, Yoshifumi

    2018-07-01

    Photoacoustic (PA) imaging can visualize human organs with high spatial resolution and high contrast. Like digital images, PA images are contaminated with random noise arising from several factors. A band-pass filter does not effectively remove this noise because the noise is randomly distributed across the bandwidth. We present a noise removal method for PA images using non-local means denoising (NLMD). NLMD can be used when there are similarities or redundancies in the image; PA images contain blood vessels whose structure repeats across small patches. The method was tested on PA images of carbon nanotubes in a micropipe, in vivo mice brain, and in vivo mice ear. We estimated suitable input parameters for NLMD so that it can be applied automatically after scanning the image in a PA imaging system. Our results show that NLMD enhanced the image quality of PA images.
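    The principle behind NLMD can be sketched directly: each pixel is replaced by a weighted average of nearby pixels, with weights determined by the similarity of the patches surrounding them, so repeating structure (such as vessels) reinforces itself while uncorrelated noise averages out. The brute-force implementation below is illustrative only; the patch size, search window, filtering parameter h, and test image are arbitrary choices, not the values the authors estimated.

```python
import numpy as np

def nlm_denoise(img, patch=3, search=7, h=0.6):
    """Non-local means: replace each pixel by a weighted average of pixels in a
    search window, weighting by similarity of the patches around them."""
    half_p, half_s = patch // 2, search // 2
    padded = np.pad(img, half_p + half_s, mode="reflect")
    out = np.zeros_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            ci, cj = i + half_p + half_s, j + half_p + half_s
            ref = padded[ci - half_p:ci + half_p + 1, cj - half_p:cj + half_p + 1]
            weights, values = [], []
            for di in range(-half_s, half_s + 1):
                for dj in range(-half_s, half_s + 1):
                    ni, nj = ci + di, cj + dj
                    patch_n = padded[ni - half_p:ni + half_p + 1,
                                     nj - half_p:nj + half_p + 1]
                    d2 = np.mean((ref - patch_n) ** 2)   # patch dissimilarity
                    weights.append(np.exp(-d2 / h**2))   # similar patches weigh more
                    values.append(padded[ni, nj])
            w = np.array(weights)
            out[i, j] = np.dot(w, values) / w.sum()
    return out

rng = np.random.default_rng(0)
clean = np.zeros((16, 16)); clean[:, 8:] = 1.0        # simple edge image
noisy = clean + rng.normal(scale=0.2, size=clean.shape)
denoised = nlm_denoise(noisy)
```

    Production implementations (e.g. the integral-image variants) avoid the nested loops, but the weighting scheme is the same.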

  3. Rationale, design, methodology and sample characteristics for the family partners for health study: a cluster randomized controlled study

    PubMed Central

    2012-01-01

    Background Young children who are overweight are at increased risk of becoming obese and developing type 2 diabetes and cardiovascular disease later in life. Therefore, early intervention is critical. This paper describes the rationale, design, methodology, and sample characteristics of a 5-year cluster randomized controlled trial being conducted in eight elementary schools in rural North Carolina, United States. Methods/Design The first aim of the trial is to examine the effects of a two-phased intervention on weight status, adiposity, nutrition and exercise health behaviors, and self-efficacy in overweight or obese 2nd, 3rd, and 4th grade children and their overweight or obese parents. The primary outcome in children is stabilization of BMI percentile trajectory from baseline to 18 months. The primary outcome in parents is a decrease in BMI from baseline to 18 months. Secondary outcomes for both children and parents include adiposity, nutrition and exercise health behaviors, and self-efficacy from baseline to 18 months. A secondary aim of the trial is to examine, in the experimental group, the relationships between parents' and children's changes in weight status, adiposity, nutrition and exercise health behaviors, and self-efficacy. An exploratory aim is to determine whether African American, Hispanic, and non-Hispanic white children and parents in the experimental group benefit differently from the intervention in weight status, adiposity, health behaviors, and self-efficacy. A total of 358 African American, non-Hispanic white, and bilingual Hispanic children with a BMI ≥ 85th percentile and 358 parents with a BMI ≥ 25 kg/m2 have been inducted over 3 1/2 years and randomized by cohort to either an experimental or a wait-listed control group. 
The experimental group receives a 12-week intensive intervention of nutrition and exercise education, coping skills training and exercise (Phase I), 9 months of continued monthly contact (Phase II) and then 6 months (follow-up) on their own. Safety endpoints include adverse event reporting. Intention-to-treat analysis will be applied to all data. Discussion Findings from this trial may lead to an effective intervention to assist children and parents to work together to improve nutrition and exercise patterns by making small lifestyle pattern changes. Trial registration NCT01378806. PMID:22463125

  4. The relationship between apical root resorption and orthodontic tooth movement in growing subjects.

    PubMed

    Xu, Tianmin; Baumrind, S

    2002-07-01

    To investigate the relationship between apical root resorption and orthodontic tooth movement in growing subjects. 58 growing subjects were randomly selected for the study sample, and another 40 non-treated cases were used as controls. Apical resorption of the upper central incisors was measured on periapical films, and incisor displacement was measured on lateral cephalograms. Multiple linear regression analysis was used to examine the relationship between root resorption and displacement of the upper incisor apex in each of four directions (retraction, advancement, intrusion, and extrusion). Statistically significant negative associations were found between resorption and both intrusion (P < 0.001) and extrusion (P < 0.05), but no significant association was found between resorption and either retraction or advancement. The regression analysis implied an average of 2.29 mm of resorption in the absence of apical displacement. The likelihood that the magnitude of displacement of the incisor root is positively associated with root resorption in the population of treated growing subjects is very small.

  5. Identifying Gender Minority Patients' Health And Health Care Needs In Administrative Claims Data.

    PubMed

    Progovac, Ana M; Cook, Benjamin Lê; Mullin, Brian O; McDowell, Alex; Sanchez R, Maria Jose; Wang, Ye; Creedon, Timothy B; Schuster, Mark A

    2018-03-01

    Health care utilization patterns for gender minority Medicare beneficiaries (those who are transgender or gender nonbinary people) are largely unknown. We identified gender minority beneficiaries using a diagnosis-code algorithm and compared them to a 5 percent random sample of non-gender minority beneficiaries from the period 2009-14 in terms of mental health and chronic diseases, use of preventive and mental health care, hospitalizations, and emergency department (ED) visits. Gender minority beneficiaries experienced more disability and mental illness. When we adjusted for age and mental health, we found that they used more mental health care. And when we adjusted for age and chronic conditions, we found that they were more likely to be hospitalized and to visit the ED. There were several small but significant differences in preventive care use. Findings were similar for disabled and older cohorts. These findings underscore the need to capture gender identity in health data to better address this population's health needs.

  6. High-precision simulation of the height distribution for the KPZ equation

    NASA Astrophysics Data System (ADS)

    Hartmann, Alexander K.; Le Doussal, Pierre; Majumdar, Satya N.; Rosso, Alberto; Schehr, Gregory

    2018-03-01

    The one-point distribution of the height for the continuum Kardar-Parisi-Zhang (KPZ) equation is determined numerically using the mapping to the directed polymer in a random potential at high temperature. Using an importance sampling approach, the distribution is obtained over a large range of values, down to a probability density as small as 10^-1000 in the tails. Both short and long times are investigated and compared with recent analytical predictions for the large-deviation forms of the probability of rare fluctuations. At short times the agreement with the analytical expression is spectacular. We observe that the far left and right tails, with exponents 5/2 and 3/2, respectively, are preserved also in the region of long times. We present some evidence for the predicted non-trivial crossover in the left tail from the 5/2 tail exponent to the cubic tail of the Tracy-Widom distribution, although the details of the full scaling form remain beyond reach.
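    Reaching probability densities as small as 10^-1000 is only possible because importance sampling concentrates samples in the rare-event region and reweights them by the likelihood ratio. The idea can be illustrated on a much simpler problem than the KPZ height distribution: estimating a Gaussian tail probability far beyond the reach of naive Monte Carlo. This is a standard textbook example, not the authors' biased-sampling scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

def tail_prob_importance(a, n=100_000):
    """Estimate P(X > a) for X ~ N(0,1) by sampling from the shifted proposal
    N(a,1), which concentrates samples in the rare-event region, and then
    reweighting each sample by the likelihood ratio pdf_N(0,1) / pdf_N(a,1)."""
    y = rng.normal(loc=a, size=n)            # proposal samples centred on the event
    lr = np.exp(-a * y + a**2 / 2)           # exact ratio of the two normal pdfs
    return np.mean((y > a) * lr)

# Naive Monte Carlo would need ~10^9 samples to see a single event at a = 6;
# the reweighted estimate uses 10^5 samples. True value: P(Z > 6) ≈ 9.87e-10.
p_hat = tail_prob_importance(6.0)
```

    The same reweighting logic, applied iteratively with increasingly biased sampling distributions, is what lets simulations probe probabilities many hundreds of orders of magnitude below direct sampling.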

  7. Assortative mating can impede or facilitate fixation of underdominant alleles.

    PubMed

    Newberry, Mitchell G; McCandlish, David M; Plotkin, Joshua B

    2016-12-01

    Underdominant mutations have fixed between divergent species, yet classical models suggest that rare underdominant alleles are purged quickly except in small or subdivided populations. We predict that underdominant alleles that also influence mate choice, such as those affecting coloration patterns visible to mates and predators alike, can fix more readily. We analyze a mechanistic model of positive assortative mating in which individuals have n chances to sample compatible mates. This one-parameter model naturally spans random mating (n=1) and complete assortment (n→∞), yet it produces sexual selection whose strength depends non-monotonically on n. This sexual selection interacts with viability selection to either inhibit or facilitate fixation. As mating opportunities increase, underdominant alleles fix as frequently as neutral mutations, even though sexual selection and underdominance independently each suppress rare alleles. This mechanism allows underdominant alleles to fix in large populations and illustrates how life history can affect evolutionary change. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Transport of oxygen ions in Er doped La2Mo2O9 oxide ion conductors: Correlation with microscopic length scales

    NASA Astrophysics Data System (ADS)

    Paul, T.; Ghosh, A.

    2018-01-01

    We report oxygen ion transport in La2-xErxMo2O9 (0.05 ≤ x ≤ 0.25) oxide ion conductors. We have measured conductivity and dielectric spectra at different temperatures in a wide frequency range. The mean square displacement and spatial extent of non-random sub-diffusive regions are estimated from the conductivity spectra and dielectric spectra, respectively, using linear response theory. The composition dependence of the conductivity is observed to be similar to that of the spatial extent of non-random sub-diffusive regions. The behavior of the composition dependence of the mean square displacement of oxygen ions is opposite to that of the conductivity. The attempt frequency estimated from the analysis of the electric modulus agrees well with that obtained from the Raman spectra analysis. The full Rietveld refinement of X-ray diffraction data of the samples is performed to estimate the distance between different oxygen lattice sites. The results obtained from such analysis confirm the ion hopping within the spatial extent of non-random sub-diffusive regions.

  9. Generation of Stationary Non-Gaussian Time Histories with a Specified Cross-spectral Density

    DOE PAGES

    Smallwood, David O.

    1997-01-01

    The paper reviews several methods for the generation of stationary realizations of sampled time histories with non-Gaussian distributions and introduces a new method which can be used to control the cross-spectral density matrix and the probability density functions (pdfs) of the multiple-input problem. Discussed first are two methods for the specialized case of matching the auto (power) spectrum, the skewness, and the kurtosis, using generalized shot noise and polynomial functions. It is then shown that the skewness and kurtosis can also be controlled by the phase of a complex frequency-domain description of the random process. The general case of matching a target probability density function using a zero-memory nonlinear (ZMNL) function is then covered. Next, methods for generating vectors of random variables with a specified covariance matrix for a class of spherically invariant random vectors (SIRV) are discussed. Finally, the general case of matching the cross-spectral density matrix of a vector of inputs with non-Gaussian marginal distributions is presented.
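    The ZMNL step reviewed above can be sketched in a few lines: generate a correlated Gaussian series, map each value through the Gaussian CDF to a uniform variate, then through the inverse CDF of the target marginal. The sketch below is a minimal illustration under our own assumptions (a moving-average filter standing in for synthesis from a target power spectrum, and a unit-mean exponential as the target pdf); it is not Smallwood's implementation.

```python
import math
import random

def gaussian_series(num, rng, smooth=4):
    """White Gaussian noise shaped by a moving-average filter; a crude
    stand-in for synthesis from a target power spectrum. Each output
    value is still marginally N(0, 1)."""
    white = [rng.gauss(0.0, 1.0) for _ in range(num + smooth)]
    return [sum(white[i:i + smooth]) / math.sqrt(smooth)
            for i in range(num)]

def zmnl_exponential(series):
    """Zero-memory nonlinear map: Gaussian CDF, then the inverse CDF of
    the target marginal (here a unit-mean exponential distribution)."""
    out = []
    for g in series:
        u = 0.5 * (1.0 + math.erf(g / math.sqrt(2.0)))
        u = min(max(u, 1e-12), 1.0 - 1e-12)   # guard against log(0)
        out.append(-math.log(1.0 - u))
    return out
```

    The output approximately inherits the Gaussian series' correlation structure while its marginal becomes exponential (mean 1, positive skewness), which is the essence of the ZMNL approach.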

  10. Lessons learned in research: an attempt to study the effects of magnetic therapy.

    PubMed

    Szor, Judy K; Holewinski, Paul

    2002-02-01

    Difficulties related to chronic wound healing research are frequently discussed, but results of less-than-perfect studies commonly are not published. A 16-week, randomized controlled double-blind study attempted to investigate the effect of static magnetic therapy on the healing of diabetic foot ulcers. Of 56 subjects, 37 completed the study. Because of the small sample size, randomization did not control for differences between the two groups, and the data could not be analyzed in any meaningful way. The challenges of performing magnetic therapy research are discussed and considerations for future studies are noted.

  11. Electronic nicotine delivery system landscape in licensed tobacco retailers: results of a county-level survey in Oklahoma

    PubMed Central

    Brame, L S; Mowls, D S; Damphousse, K E; Beebe, L A

    2016-01-01

    Objectives Electronic nicotine delivery systems (ENDS) have recently emerged as a component of the tobacco retail environment. The aims of this study were to describe the availability, types of ENDS and placement of ENDS relative to traditional tobacco products at franchised licensed tobacco retailers and non-franchised licensed tobacco retailers. Design Observational study. Setting Franchised and non-franchised tobacco retailers in Cleveland County, Oklahoma, USA. Primary and secondary outcome measures The number of stores selling ENDS, the variability in brands of ENDS sold, the location of the ENDS within the retailers, the quantity of ENDS sold compared with traditional tobacco products, and the presence of outdoor signage. Results Data from 57 randomly sampled tobacco retailers were used to describe the presence of ENDS at independent non-franchised and franchised tobacco retailers. The overwhelming majority (90%) of licensed tobacco retailers sold ENDS, and differences were observed between franchised and non-franchised stores. 45 of the 51 retailers (88%) selling ENDS had them placed at the point of sale. 2 of the 21 franchised retailers (9.5%) had ENDS placed at ≤3½ feet above floor level compared to none of the 30 non-franchised retailers (0%). Conclusions This small study is the first to characterise ENDS within the tobacco retail environment in a county in Oklahoma, USA. The results from this study demonstrate the complexity of the tobacco retail landscape and generate questions for future studies regarding the incorporation and placement of ENDS in tobacco retail environments. PMID:27266774

  12. [Study of blood sedimentation by photo-thermal radiometry with random excitation].

    PubMed

    Antoniow, J S; Marx, J; Egee, M; Droulle, C; Potron, G

    1994-01-01

    The erythrocyte sedimentation rate is a complex phenomenon involving a large number of parameters. The rate of sedimentation is highly dependent on the haematocrit, the internal viscosity of the red cells, and the viscosity and composition of the suspending medium. The experimental conditions also have a non-negligible effect (geometry and nature of the test tube, temperature, foreign substances in the medium...). In order to respond to the need for more precise and more rapid methods of analyzing the erythrocyte sedimentation rate, we developed new physical methods allowing a real-time evaluation of the phenomena involved. Several of these new photothermal methods have already been applied for non-destructive evaluation of thin or layered materials (such as composite materials or glued structures), both in laboratory situations and in industry. When a material is placed in a modulated laser beam, the absorbed incident radiation heats the sample. The heat then diffuses throughout the material and the surface temperature of the sample increases locally and periodically. The surface thus emits a modulated flow of infrared radiation. The amplitude and phase shift of the photothermal signal generated depend characteristically on the optical and thermal properties of the material for a given modulation frequency. The early photothermal modelling, based on a two-layer model and the physico-mathematical theory of red cell sedimentation proposed by S. Oka, made it possible to simulate the phenomena as they occur over time. We hypothesize that the temperature gradients created within the sample are too small to create a convection current and that all heat transfer occurs by conduction.(ABSTRACT TRUNCATED AT 250 WORDS)

  13. Design of a placebo-controlled, randomized study of the efficacy of repetitive transcranial magnetic stimulation for the treatment of chronic tinnitus.

    PubMed

    Landgrebe, Michael; Binder, Harald; Koller, Michael; Eberl, Yvonne; Kleinjung, Tobias; Eichhammer, Peter; Graf, Erika; Hajak, Goeran; Langguth, Berthold

    2008-04-15

    Chronic tinnitus is a frequent condition which can have an enormous impact on a patient's life and which is very difficult to treat. Accumulating data indicate that chronic tinnitus is related to dysfunctional neuronal activity in the central nervous system. Repetitive transcranial magnetic stimulation (rTMS) is a non-invasive method that allows focal modulation of neuronal activity. An increasing number of studies demonstrate reduction of tinnitus after repeated sessions of low-frequency rTMS and indicate that rTMS might represent a new promising approach for the treatment of tinnitus. However, available studies have been mono-centric and are characterized by small sample sizes. Therefore, this multi-center trial will test the efficacy of rTMS treatment in a large sample of chronic tinnitus patients. This is a randomized, placebo-controlled, double-blind multi-center trial of two weeks of 1-Hz rTMS treatment in chronic tinnitus patients. Eligible patients will be randomized to either 2 weeks of real or sham rTMS treatment. Main eligibility criteria: male or female individuals aged 18-70 years with chronic tinnitus (duration > 6 months), tinnitus-handicap-inventory score ≥ 38, age-adjusted normal sensorineural hearing (i.e. not more than 5 dB below the 10th percentile of the appropriate age and gender group (DIN EN ISO 7029)), and conductive hearing loss ≤ 15 dB. The primary endpoint is the change in tinnitus severity according to the tinnitus questionnaire of Goebel and Hiller (baseline vs. end of treatment period). A total of 138 patients is needed to detect a clinically relevant change of tinnitus severity (i.e. 5 points on the questionnaire of Goebel and Hiller; alpha = 0.05; 1-beta = 0.80). Assuming a drop-out rate of less than 5% until the primary endpoint, 150 patients have to be randomized to guarantee the target number of 138 evaluable patients.
The study will be conducted by otorhinolaryngologists and psychiatrists of 7 university hospitals and 1 municipal hospital in Germany. This study will provide important information about the efficacy of rTMS in the treatment of chronic tinnitus. Current Controlled Trials ISRCTN89848288.
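    The stated target of 138 evaluable patients is consistent with the standard normal-approximation sample-size formula for comparing two means, n per group = 2 (z_{1-α/2} + z_{1-β})² (σ/δ)². The abstract does not report the assumed standard deviation, so σ = 10.5 questionnaire points below is a back-calculated assumption of ours; it reproduces a total close to, but not exactly, 138.

```python
import math

# Standard normal quantiles for two-sided alpha = 0.05 and power = 0.80.
Z_ALPHA = 1.959964   # Phi^-1(0.975)
Z_BETA = 0.841621    # Phi^-1(0.80)

def n_per_group(delta, sigma, z_alpha=Z_ALPHA, z_beta=Z_BETA):
    """Normal-approximation sample size per arm for a two-sample
    comparison of means with effect size delta and common SD sigma."""
    return math.ceil(2.0 * (z_alpha + z_beta) ** 2 * (sigma / delta) ** 2)

# delta = 5 questionnaire points is taken from the abstract; sigma = 10.5
# is our back-calculated assumption, not a value reported by the authors.
total = 2 * n_per_group(delta=5.0, sigma=10.5)   # close to the stated 138
```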

  14. The impact of pharmaceutical company funding on results of randomized trials of nicotine replacement therapy for smoking cessation: a meta-analysis.

    PubMed

    Etter, Jean-François; Burri, Mafalda; Stapleton, John

    2007-05-01

    To assess whether the source of funding affected the results of trials of nicotine replacement therapy (NRT) for smoking cessation, we reviewed all randomized controlled trials included in the Cochrane review. There were insufficient non-industry trials of the newer products for these to be included. We included 90 trials of either the nicotine gum (52) or nicotine patch (38), comprising 18 238 treatment and 16 235 control participants. Forty-nine showed evidence of industry support (18 gum, 31 patch). Industry-supported trials were more often nicotine patch studies (31 of 49, 63%) than non-industry trials (7 of 41, 17%; P < 0.001) and had larger sample sizes (479 versus 268, P = 0.04). Twenty-five (51%) industry trials reported statistically significant (P < 0.05) results, compared with nine (22%) non-industry trials (OR = 3.70, 95% CI = 1.46-9.35). This difference was not explained by trial characteristics. Industry-supported trials had a pooled odds ratio of 1.90 (1.67-2.16), compared with 1.61 (1.43-1.80) for other studies (χ² = 3.6, P = 0.058). There was evidence of funnel-plot asymmetry among industry trials (t = 4.35, P < 0.001), but not among other trials, indicating that several small null-effect industry trials may not have reached publication. After imputation adjustment, the odds ratio for industry trials reduced to 1.64 (1.43-1.89) and the overall NRT odds ratio reduced from 1.73 (1.60-1.90) to 1.62 (1.49-1.77). Compared with independent trials, industry-supported trials were more likely to produce statistically significant results and larger odds ratios. These differences persisted after adjustment for basic trial characteristics. Although we had no data on the amount of funding for each trial, it is possible that more resources led to higher treatment compliance and therefore greater efficacy in industry-supported trials.
Differences can also possibly be explained by publication bias with several small, null-effect industry studies not having reached publication. After adjustment for this possible bias, results for industry trials were lower and similar to non-industry results. Similarly, the overall estimate of the net effect for these products reduces to about 5% attributable 1-year successes. This remains of considerable public health benefit. Registration of clinical trials has become mandatory in many countries since most of the trials considered here were conducted, and this should reduce the potential for publication bias in future.
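    The funnel-plot asymmetry mentioned above is commonly quantified with Egger's regression: the standardized effect z_i = logOR_i/SE_i is regressed on precision 1/SE_i, and a nonzero intercept signals small-study effects. The sketch below uses synthetic trial data of our own construction (a bias term proportional to the standard error) and computes only the intercept and slope, not the t-test the authors report.

```python
def egger_regression(log_ors, ses):
    """Ordinary least squares of z_i = logOR_i / SE_i on x_i = 1 / SE_i.
    Returns (intercept, slope); a nonzero intercept suggests asymmetry."""
    xs = [1.0 / se for se in ses]
    zs = [lo / se for lo, se in zip(log_ors, ses)]
    n = len(xs)
    mx = sum(xs) / n
    mz = sum(zs) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxz = sum((x - mx) * (z - mz) for x, z in zip(xs, zs))
    slope = sxz / sxx
    intercept = mz - slope * mx
    return intercept, slope

# Synthetic trials: true log odds ratio 0.5, plus a small-study bias
# term proportional to the standard error (purely illustrative).
ses = [0.1 + 0.05 * i for i in range(9)]            # 0.10 .. 0.50
log_ors = [0.5 + 1.5 * se for se in ses]
intercept, slope = egger_regression(log_ors, ses)
```

    With this construction z_i = 0.5 · (1/SE_i) + 1.5 exactly, so the regression recovers slope 0.5 (the unbiased effect) and intercept 1.5 (the asymmetry).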

  15. Fluorescent Random Amplified Microsatellites (F-RAMS) analysis of mushrooms as a forensic investigative tool.

    PubMed

    Kallifatidis, Beatrice; Borovička, Jan; Stránská, Jana; Drábek, Jiří; Mills, Deetta K

    2014-03-01

    The capability of Fluorescent Random Amplified Microsatellites (F-RAMS) to profile hallucinogenic mushrooms to species and sub-species level was assessed. Fifteen samples of Amanita rubescens and 22 samples of other hallucinogenic and non-hallucinogenic mushrooms of the genera Amanita and Psilocybe were profiled using two fluorescently labeled, 5'-degenerate primers, 5'-6FAM-SpC3-DD (CCA)5 and 5'-6FAM-SpC3-DHB (CGA)5, which target different microsatellite repeat regions. Of the two primers, 5'-6FAM-SpC3-DHB (CGA)5 provided more reliable data for identification purposes, grouping samples of the same species and clustering closely related species together in a dendrogram based on amplicon similarities. A high degree of intra-specific variation among the 15 A. rubescens samples was shown with both primers, and the amplicons generated for all A. rubescens samples were organized into three classes (discriminant, private, and marker) based on their individualizing potential. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  16. Publication bias in animal research presented at the 2008 Society of Critical Care Medicine Conference.

    PubMed

    Conradi, Una; Joffe, Ari R

    2017-07-07

    To obtain a direct measure of publication bias by determining subsequent full-paper publication (P) of studies reported in animal research abstracts presented at an international conference (A). We randomly selected 100 A (using a random-number generator) from the 2008 Society of Critical Care Medicine Conference. Using a data collection form and study manual, we recorded methodology and result variables from the A. We searched PubMed and EMBASE to June 2015, and DOAJ and Google Scholar to May 2017, to screen for subsequent P. Methodology and result variables were recorded from P to determine changes in reporting from A. Predictors of P were examined using Fisher's exact test. 62% (95% CI 52-71%) of studies described in A were subsequently P, after a median of 19 [IQR 9-33.3] months from conference presentation. Reporting of studies in A was of low quality: randomized 27% (with the method of randomization and allocation concealment not described), blinded 0%, sample-size calculation stated 0%, primary outcome specified 26%, numbers given with denominators 6%, and number of animals used stated 47%. Only oral (vs. poster) presentation of the A predicted P (14/16 vs. 48/84, p = 0.025). Reporting of studies in P was also of poor quality: randomized 39% (with the method of randomization and allocation concealment not described), likely blinded 6%, primary outcome specified 5%, sample-size calculation stated 0%, numbers given with denominators 34%, and number of animals used stated 56%. Changes in reporting from A to P occurred: from non-randomized to randomized 19%, from non-blinded to blinded 6%, from negative to positive outcomes 8%, from having to not having a stated primary outcome 16%, and from non-statistically to statistically significant findings 37%. Post hoc, using publication data, P was predicted by having positive outcomes (published 62/62, unpublished 33/38; p = 0.003) or statistically significant results (published 58/62, unpublished 20/38; p < 0.001). 
Only 62% (95% CI 52-71%) of animal research A are subsequently P; this was predicted by oral presentation of the A, finally having positive outcomes, and finally having statistically significant results. Publication bias is prevalent in critical care animal research.
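    The oral-versus-poster comparison above (14/16 vs. 48/84) can be checked with Fisher's exact test, the test the authors used. Below is a minimal stdlib sketch of the two-sided test built on the hypergeometric pmf, following the common "sum all tables no more probable than the observed one" convention; the small tolerance factor guards against floating-point ties.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of all tables with the same
    margins that are no more probable than the observed table."""
    row1, col1, n = a + b, a + c, a + b + c + d

    def pmf(k):
        # P(k first-column successes in the first row) under fixed margins.
        return comb(col1, k) * comb(n - col1, row1 - k) / comb(n, row1)

    p_obs = pmf(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    return sum(pmf(k) for k in range(lo, hi + 1)
               if pmf(k) <= p_obs * (1.0 + 1e-9))

# Oral: 14 published, 2 not; poster: 48 published, 36 not (from the abstract).
p = fisher_exact_two_sided(14, 2, 48, 36)
```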

  17. Inference for binomial probability based on dependent Bernoulli random variables with applications to meta‐analysis and group level studies

    PubMed Central

    Bakbergenuly, Ilyas; Morgenthaler, Stephan

    2016-01-01

    We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group-level studies or in meta-analysis. The findings are illustrated with the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from the standard log-odds and arcsine transformations of the estimated probability p̂, both for single-group studies and when combining results from several groups or studies in meta-analysis. Our simulations confirm that these biases are linear in ρ, the intracluster correlation coefficient, for small values of ρ. These biases do not depend on the sample sizes or the number of studies K in a meta-analysis and result in abysmal coverage of the combined effect for large K. We also propose a bias correction for the arcsine transformation. Our simulations demonstrate that this bias correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta-analyses of prevalence. PMID:27192062
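    The transformation bias discussed above can be computed exactly, without simulation, by averaging the arcsine transform over a beta-binomial pmf with mean p and intraclass correlation ρ. The sketch below uses illustrative values (n = 50, p = 0.2) of our own choosing, not the paper's simulation settings.

```python
import math

def beta_binomial_pmf(k, n, p, rho):
    """Beta-binomial pmf with mean p and intraclass correlation rho,
    parameterized by a = p(1-rho)/rho and b = (1-p)(1-rho)/rho."""
    a = p * (1.0 - rho) / rho
    b = (1.0 - p) * (1.0 - rho) / rho
    log_pmf = (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
               + math.lgamma(k + a) + math.lgamma(n - k + b)
               - math.lgamma(n + a + b)
               + math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b))
    return math.exp(log_pmf)

def arcsine_bias(n, p, rho):
    """Exact E[asin(sqrt(k/n))] - asin(sqrt(p)) under the beta-binomial."""
    expectation = sum(beta_binomial_pmf(k, n, p, rho)
                      * math.asin(math.sqrt(k / n))
                      for k in range(n + 1))
    return expectation - math.asin(math.sqrt(p))
```

    At p = 0.2 the arcsine transform is in its concave region, so the bias is negative, and its magnitude grows roughly in proportion to the variance inflation 1 + (n - 1)ρ, consistent with the near-linearity in ρ reported above.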

  18. Influence of the Acidic Beverage Cola on the Absorption of Erlotinib in Patients With Non-Small-Cell Lung Cancer.

    PubMed

    van Leeuwen, Roelof W F; Peric, Robert; Hussaarts, Koen G A M; Kienhuis, Emma; IJzerman, Nikki S; de Bruijn, Peter; van der Leest, Cor; Codrington, Henk; Kloover, Jeroen S; van der Holt, Bronno; Aerts, Joachim G; van Gelder, Teun; Mathijssen, Ron H J

    2016-04-20

    Erlotinib depends on stomach pH for its bioavailability. When erlotinib is taken concurrently with a proton pump inhibitor (PPI), stomach pH increases, which results in a clinically relevant decrease of erlotinib bioavailability. We hypothesized that this drug-drug interaction is reversed by taking erlotinib with the acidic beverage cola. The effects of cola on erlotinib bioavailability in patients not treated with a PPI were also studied. In this randomized, cross-over, pharmacokinetic study in patients with non-small-cell lung cancer, we studied intrapatient differences in absorption (area under the plasma concentration time curve [AUC0-12h]) after a 7-day period of concomitant treatment with erlotinib, with or without esomeprazole, with either cola or water. At the 7th and 14th day, patients were hospitalized for 1 day for pharmacokinetic sampling. Twenty-eight evaluable patients were included in the analysis. In patients treated with erlotinib and esomeprazole with cola, the mean AUC0-12h increased 39% (range, -12% to 136%; P = .004), whereas in patients not treated with the PPI, the mean AUC0-12h was only slightly higher (9%; range, -10% to +30%; P = .03) after erlotinib intake with cola. Cola intake led to a clinically relevant and statistically significant increase in the bioavailability of erlotinib during esomeprazole treatment. In patients not treated with the PPI, the effects of cola were marginal. These findings can be used to optimize the management of drug-drug interactions between PPIs and erlotinib. © 2016 by American Society of Clinical Oncology.

  19. Efficacy and safety of iodine-125 radioactive seeds brachytherapy for advanced non-small cell lung cancer-A meta-analysis.

    PubMed

    Zhang, Wenchao; Li, Jiawei; Li, Ran; Zhang, Ying; Han, Mingyong; Ma, Wei

    This meta-analysis was conducted to investigate the efficacy and safety of 125I brachytherapy for locally advanced non-small cell lung cancer (NSCLC). Trials comparing 125I brachytherapy with chemotherapy in NSCLC were identified. Meta-analysis was performed to obtain pooled risk ratios for the overall response rate (ORR), disease control rate (DCR) and complications, and a pooled hazard ratio for overall survival (OS). Fifteen studies including 1188 cases were included. The pooled results indicated significant differences in ORR, DCR, and OS between 125I brachytherapy combined with chemotherapy and chemotherapy alone, but no statistically significant differences in gastrointestinal symptoms, leukopenia, myelosuppression, and hemoglobin reduction. Patients treated with 125I brachytherapy combined with chemotherapy have a higher relative risk of pneumothorax, bloody sputum, and pneumorrhagia compared with chemotherapy alone. Seed migration only occurred in the group treated with 125I brachytherapy. There were significant differences in ORR, DCR, and myelosuppression between 125I brachytherapy alone and chemotherapy. 125I brachytherapy combined with chemotherapy can significantly enhance the clinical efficacy and improve the OS of patients with advanced NSCLC without increasing the incidence of complications of chemotherapy. 125I brachytherapy alone can significantly enhance the clinical efficacy and reduce the incidence of myelosuppression compared with chemotherapy. However, 125I brachytherapy may cause lung injury. Larger and higher-quality randomized controlled trials are needed to confirm the pooled results on complications. Copyright © 2017 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  20. Sample types applied for molecular diagnosis of therapeutic management of advanced non-small cell lung cancer in the precision medicine.

    PubMed

    Han, Yanxi; Li, Jinming

    2017-10-26

    In this era of precision medicine, molecular biology is becoming increasingly significant for the diagnosis and therapeutic management of non-small cell lung cancer. The specimen, as the primary element of the whole testing workflow, is particularly important for maintaining the accuracy of gene alteration testing. Presently, the main sample types applied in routine diagnosis are tissue and cytology biopsies. Liquid biopsies are considered the most promising alternatives when tissue and cytology samples are not available. Each sample type possesses its own strengths and weaknesses, owing to differences in sampling, preparation and preservation procedures, inter- or intratumor heterogeneity, the tumor cellularity (percentage and number of tumor cells) of specimens, etc., and none of them individually is a "one-size-fits-all" solution. Therefore, in this review, we summarized the strengths and weaknesses of different sample types that are widely used in clinical practice, offered solutions to reduce the negative impact of the samples, and proposed an optimized strategy for the choice of samples during the entire diagnostic course. We hope to provide valuable information to laboratories for choosing optimal clinical specimens to achieve comprehensive functional genomic landscapes and formulate individually tailored treatment plans for NSCLC patients in advanced stages.

  1. What is the extent of prokaryotic diversity?

    PubMed Central

    Curtis, Thomas P; Head, Ian M; Lunn, Mary; Woodcock, Stephen; Schloss, Patrick D; Sloan, William T

    2006-01-01

    The extent of microbial diversity is an intrinsically fascinating subject of profound practical importance. The term ‘diversity’ may allude to the number of taxa or species richness as well as their relative abundance. There is uncertainty about both, primarily because sample sizes are too small. Non-parametric diversity estimators make gross underestimates if used with small sample sizes on unevenly distributed communities. One can make richness estimates over many scales using small samples by assuming a species/taxa-abundance distribution. However, no one knows what the underlying taxa-abundance distributions are for bacterial communities. Latterly, diversity has been estimated by fitting data from gene clone libraries and extrapolating from this to taxa-abundance curves to estimate richness. However, since sample sizes are small, we cannot be sure that such samples are representative of the community from which they were drawn. It is however possible to formulate, and calibrate, models that predict the diversity of local communities and of samples drawn from that local community. The calibration of such models suggests that migration rates are small and decrease as the community gets larger. The preliminary predictions of the model are qualitatively consistent with the patterns seen in clone libraries in ‘real life’. The validation of this model is also confounded by small sample sizes. However, if such models were properly validated, they could form invaluable tools for the prediction of microbial diversity and a basis for the systematic exploration of microbial diversity on the planet. PMID:17028084
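    The behaviour of non-parametric estimators on small samples can be illustrated with Chao1, S_est = S_obs + F1²/(2·F2), where F1 and F2 are the numbers of singleton and doubleton taxa. The community below (1000 species with geometric abundances) is our own illustrative construction, not a fitted taxa-abundance distribution; it simply shows the gross underestimate the authors describe.

```python
import random
from collections import Counter

def chao1(counts):
    """Chao1 richness estimate from a list of per-species counts."""
    s_obs = len(counts)
    f1 = sum(1 for c in counts if c == 1)   # singleton species
    f2 = sum(1 for c in counts if c == 2)   # doubleton species
    if f2 == 0:
        return s_obs + f1 * (f1 - 1) / 2.0  # bias-corrected variant
    return s_obs + f1 * f1 / (2.0 * f2)

# A deliberately uneven community: 1000 species with geometric
# abundances (illustrative only, not a fitted taxa-abundance curve).
rng = random.Random(42)
species = list(range(1000))
weights = [0.95 ** i for i in species]
sample = rng.choices(species, weights=weights, k=200)
counts = list(Counter(sample).values())
estimate = chao1(counts)   # far below the true richness of 1000
```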

  2. Improved sampling and analysis of images in corneal confocal microscopy.

    PubMed

    Schaldemose, E L; Fontain, F I; Karlsson, P; Nyengaard, J R

    2017-10-01

    Corneal confocal microscopy (CCM) is a noninvasive clinical method to analyse and quantify corneal nerve fibres in vivo. Although the CCM technique is in constant progress, there are methodological limitations in terms of sampling of images and objectivity of the nerve quantification. The aim of this study was to present a randomized sampling method for the CCM images and to develop an adjusted area-dependent image analysis. Furthermore, a manual nerve fibre analysis method was compared to a fully automated method. Twenty-three idiopathic small-fibre neuropathy patients were investigated using CCM. Corneal nerve fibre length density (CNFL) and corneal nerve fibre branch density (CNBD) were determined both manually and automatically. Differences in CNFL and CNBD between (1) the randomized and the most common sampling method, (2) the adjusted and the unadjusted area and (3) the manual and automated quantification method were investigated. The CNFL values were significantly lower when using the randomized sampling method compared to the most common method (p = 0.01). There was no statistically significant difference in the CNBD values between the randomized and the most common sampling method (p = 0.85). CNFL and CNBD values were increased when using the adjusted area compared to the standard area. Additionally, the study found a significant increase in the CNFL and CNBD values when using the manual method compared to the automatic method (p ≤ 0.001). The study demonstrated a significant difference in the CNFL values between the randomized and common sampling methods, indicating the importance of clear guidelines for image sampling. The increase in CNFL and CNBD values when using the adjusted cornea area is not surprising. The observed increases in both CNFL and CNBD values when using the manual method of nerve quantification compared to the automatic method are consistent with earlier findings. 
This study underlines the importance of improving the analysis of the CCM images in order to obtain more objective corneal nerve fibre measurements. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.

  3. Random walks and diffusion on networks

    NASA Astrophysics Data System (ADS)

    Masuda, Naoki; Porter, Mason A.; Lambiotte, Renaud

    2017-11-01

    Random walks are ubiquitous in the sciences, and they are interesting from both theoretical and practical perspectives. They are one of the most fundamental types of stochastic processes; can be used to model numerous phenomena, including diffusion, interactions, and opinions among humans and animals; and can be used to extract information about important entities or dense groups of entities in a network. Random walks have been studied for many decades on both regular lattices and (especially in the last couple of decades) on networks with a variety of structures. In the present article, we survey the theory and applications of random walks on networks, restricting ourselves to simple cases of single and non-adaptive random walkers. We distinguish three main types of random walks: discrete-time random walks, node-centric continuous-time random walks, and edge-centric continuous-time random walks. We first briefly survey random walks on a line, and then we consider random walks on various types of networks. We extensively discuss applications of random walks, including ranking of nodes (e.g., PageRank), community detection, respondent-driven sampling, and opinion models such as voter models.
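    For a discrete-time random walk on a connected, non-bipartite undirected network, the stationary probability of node i is deg(i)/2m, where m is the number of edges. This can be checked empirically with a short simulation; the five-node "lollipop" graph below is our own toy example, not one from the survey.

```python
import random

def walk_visit_frequencies(adj, steps, seed=0):
    """Simulate a discrete-time random walk on an undirected graph given
    as an adjacency dict; return the empirical visit frequencies."""
    rng = random.Random(seed)
    node = next(iter(adj))          # arbitrary start node
    visits = {v: 0 for v in adj}
    for _ in range(steps):
        node = rng.choice(adj[node])  # move to a uniformly random neighbour
        visits[node] += 1
    return {v: c / steps for v, c in visits.items()}

# Small undirected "lollipop" graph: a triangle {0,1,2} with tail 2-3-4.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4], 4: [3]}
freq = walk_visit_frequencies(adj, steps=200000)
# Theory: stationary probability of node v is deg(v) / (2 * m) = deg(v) / 10,
# so node 2 (degree 3) should be visited three times as often as node 4.
```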

  4. [Study on correction of data bias caused by different missing mechanisms in survey of medical expenditure among students enrolling in Urban Resident Basic Medical Insurance].

    PubMed

    Zhang, Haixia; Zhao, Junkang; Gu, Caijiao; Cui, Yan; Rong, Huiying; Meng, Fanlong; Wang, Tong

    2015-05-01

    A study of the medical expenditure and its influencing factors among students enrolled in the Urban Resident Basic Medical Insurance (URBMI) in Taiyuan indicated that non-response bias and selection bias coexist in the dependent variable of the survey data. Unlike previous studies that focused on only one missing mechanism, this study proposes a two-stage method to deal with the two missing mechanisms simultaneously, combining multiple imputation with a sample selection model. A total of 1 190 questionnaires were returned by the students (or their parents) selected in child care settings, schools and universities in Taiyuan by stratified cluster random sampling in 2012. In the returned questionnaires, 2.52% of the dependent variable was not missing at random (NMAR) and 7.14% was missing at random (MAR). First, multiple imputation was conducted for the MAR values using the complete data; then a sample selection model was used to correct for NMAR in the multiple imputation, and a multi-factor analysis model was established. Based on 1 000 resampling iterations, the best scheme for filling the randomly missing values at this missing proportion was the predictive mean matching (PMM) method. With this optimal scheme, the two-stage analysis was conducted. Finally, it was found that the influencing factors on annual medical expenditure among the students enrolled in URBMI in Taiyuan included population group, annual household gross income, affordability of medical insurance expenditure, chronic disease, seeking medical care in hospital, seeking medical care in a community health center or private clinic, hospitalization, hospitalization canceled for certain reasons, self-medication and acceptable proportion of self-paid medical expenditure. The two-stage method combining multiple imputation with a sample selection model can deal effectively with non-response bias and selection bias in the dependent variable of survey data.
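    The PMM step mentioned above can be sketched for the single-predictor case: fit a regression on the complete cases, then fill each missing value with the observed outcome of a donor whose predicted mean is among the closest to that of the incomplete case. This is a generic textbook sketch with hypothetical variable names, not the authors' implementation (which handled multiple predictors and was combined with a sample selection model).

```python
import random

def pmm_impute(x, y, missing_idx, donors=5, seed=0):
    """Predictive mean matching with one predictor: fit y ~ a + b*x on
    complete cases, then fill each missing y with the observed y of a
    randomly chosen donor among the closest predicted means."""
    rng = random.Random(seed)
    obs = [i for i in range(len(y)) if i not in missing_idx]
    mx = sum(x[i] for i in obs) / len(obs)
    my = sum(y[i] for i in obs) / len(obs)
    sxx = sum((x[i] - mx) ** 2 for i in obs)
    b = sum((x[i] - mx) * (y[i] - my) for i in obs) / sxx
    a = my - b * mx
    filled = list(y)
    for i in missing_idx:
        pred = a + b * x[i]
        ranked = sorted(obs, key=lambda j: abs((a + b * x[j]) - pred))
        donor = rng.choice(ranked[:donors])
        filled[i] = y[donor]   # impute an actually observed value
    return filled
```

    Because every imputed value is an actually observed one, PMM preserves the support of the outcome's marginal distribution, which is one reason it performed well in the authors' resampling comparison.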

  5. Urinary metabolomic study of non-small cell lung carcinoma based on ultra high performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry.

    PubMed

    Wu, Qian; Wang, Yan; Gu, Xue; Zhou, Junyi; Zhang, Huiping; Lv, Wang; Chen, Zhe; Yan, Chao

    2014-07-01

    Metabolic profiles from human urine reveal a significant difference in carnitine and acylcarnitine levels between non-small cell lung carcinoma patients and healthy controls. Urine samples from cancer patients and healthy individuals were assayed in this metabolomic study using ultra high performance liquid chromatography coupled to quadrupole time-of-flight mass spectrometry. The data were normalized by the sum of all intensities and by creatinine calibration, respectively, before orthogonal partial least squares discriminant analysis. Twenty differential metabolites were identified based on standard compounds or tandem mass spectrometry fragments. Among them, some medium-/long-chain acylcarnitines, for example, cis-3,4-methylene heptanoylcarnitine, were found to be downregulated while carnitine was upregulated in urine samples from the cancer group compared to the control group. Receiver operating characteristic analysis of the two groups showed that the area under the curve for the combination of carnitine and 11 selected acylcarnitines was 0.958. This study suggests that the developed carnitine and acylcarnitine profiling method has the potential to be used for screening non-small cell lung carcinoma. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. The male-taller norm: Lack of evidence from a developing country.

    PubMed

    Sohn, K

    2015-08-01

    In general, women prefer men taller than themselves; this is referred to as the male-taller norm. However, since women are shorter than men on average, it is difficult to determine whether the fact that married women are on average shorter than their husbands results from the norm or is a simple artifact generated by the shorter stature of women. This study addresses the question by comparing the rate of adherence to the male-taller norm between actual mating and hypothetical random mating. A total of 7954 actually married couples are drawn from the last follow-up of the Indonesian Family Life Survey, a nationally representative survey. Their heights were measured by trained nurses. About 10,000 individuals are randomly sampled from the actual couples and randomly matched. An alternative random mating of about 100,000 couples is also performed, taking into account an age difference of 5 years within a couple. The rate of adherence to the male-taller norm is 93.4% for actual couples and 88.8% for random couples. The difference between the two figures is statistically significant, but it is emphasized that it is very small. The alternative random mating produces a rate of 91.4%. The male-taller norm exists in Indonesia, but only in a statistical sense. The small difference suggests that the norm is mostly explained by the fact that women are shorter than men on average. Copyright © 2015 Elsevier GmbH. All rights reserved.
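    The comparison between actual and hypothetical random mating can be sketched as a simulation of the random-matching step. The height distributions below are illustrative assumptions of ours (the abstract does not report the measured distributions), so the resulting rate only demonstrates the artifact the authors describe: random pairing alone already produces a high male-taller rate.

```python
import random

def male_taller_rate(n_pairs, male_mu, male_sd, female_mu, female_sd, seed=0):
    """Fraction of randomly matched couples in which the man is taller,
    drawing heights from independent normal distributions."""
    rng = random.Random(seed)
    taller = 0
    for _ in range(n_pairs):
        man = rng.gauss(male_mu, male_sd)
        woman = rng.gauss(female_mu, female_sd)
        taller += man > woman
    return taller / n_pairs

# Illustrative height parameters in cm; these are assumptions, not the
# survey's measured Indonesian distributions.
rate = male_taller_rate(100000, 163.0, 5.5, 151.0, 5.0)
```

    With a 12 cm mean difference the male-taller rate exceeds 0.9 under purely random matching, illustrating why the small actual-versus-random gap (93.4% vs. 88.8%) implies only a weak norm.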

  7. Light scattering by low-density agglomerates of micron-sized grains with the PROGRA2 experiment

    NASA Astrophysics Data System (ADS)

    Hadamcik, E.; Renard, J.-B.; Lasue, J.; Levasseur-Regourd, A. C.; Blum, J.; Schraepler, R.

    2007-07-01

    This work was carried out with the PROGRA2 experiment, specifically developed to measure the angular dependence of the polarization of light scattered by dust particles. The samples are small agglomerates of micron-sized grains and huge, low-number-density agglomerates of the same grains. The constituent grains (spherical or irregularly shaped) are made of different non-absorbing and absorbing materials. The small agglomerates, in a size range of a few microns, are lifted by an air draught. The huge, centimeter-sized agglomerates, produced by random ballistic deposition of the grains, are deposited on a flat surface. The phase curves obtained for monodisperse, micron-sized spheres in agglomerates are obviously not comparable to the ‘smooth’ phase curves obtained by remote observations of cometary dust or asteroidal regoliths, but they are used for comparison with numerical calculations to gain a better understanding of the light-scattering processes. The phase curves obtained for irregular grains in agglomerates are similar to those obtained by remote observations, with a negative branch at phase angles smaller than 20° and a maximum polarization decreasing with increasing albedo. These results, coupled with remote observations in the solar system, should provide a better understanding of the physical properties of solid particles and their variation in cometary comae and asteroidal regoliths.

  8. A new mosaic method for three-dimensional surface

    NASA Astrophysics Data System (ADS)

    Yuan, Yun; Zhu, Zhaokun; Ding, Yongjun

    2011-08-01

    Three-dimensional (3-D) data mosaicking is an indispensable step in surface measurement and digital terrain map generation. To address the problem of mosaicking local unorganized point clouds that are only coarsely registered and contain many mismatched points, a new mosaic method for 3-D surfaces based on RANSAC is proposed. Each iteration of this method proceeds sequentially through random sampling with an additional shape constraint, data normalization of the point cloud, absolute orientation, data denormalization of the point cloud, inlier counting, etc. After N random sample trials, the largest consensus set is selected, and finally the model is re-estimated using all the points in the selected subset. The minimal subset is composed of three non-collinear points, which form a triangle. The shape of the triangle is considered during random sample selection in order to make the selection reasonable. A new coordinate system transformation algorithm presented in this paper is used to avoid singularity. The whole rotation transformation between the two coordinate systems can be solved by two successive rotations expressed by Euler angle vectors, each with an explicit physical meaning. Both simulated and real data are used to prove the correctness and validity of this mosaic method. The method has better noise immunity due to its robust estimation property, and high accuracy because the shape constraint is added to random sampling and data normalization is added to the absolute orientation. It is applicable to high-precision measurement of three-dimensional surfaces and to 3-D terrain mosaicking.
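    The RANSAC loop the abstract describes (sample three non-collinear points, estimate the rigid transform by absolute orientation, count inliers, keep the largest consensus set, re-estimate) can be sketched roughly as follows. This is a generic illustration, not the paper's implementation: it uses the SVD-based Kabsch solution for absolute orientation, omits the paper's data normalization and Euler-angle formulation, and the `tol` and `trials` values are arbitrary.

    ```python
    import numpy as np

    def rigid_transform(P, Q):
        """Least-squares rotation R and translation t with Q ~ P @ R.T + t
        (absolute orientation via the SVD-based Kabsch method)."""
        cP, cQ = P.mean(0), Q.mean(0)
        H = (P - cP).T @ (Q - cQ)
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        return R, cQ - R @ cP

    def ransac_align(P, Q, trials=200, tol=0.05, seed=0):
        """RANSAC over minimal 3-point subsets of corresponding points P, Q."""
        rng = np.random.default_rng(seed)
        best_inliers = np.zeros(len(P), bool)
        for _ in range(trials):
            idx = rng.choice(len(P), 3, replace=False)
            # Shape constraint: reject near-degenerate (almost collinear) triangles.
            a, b, c = P[idx]
            if np.linalg.norm(np.cross(b - a, c - a)) < 1e-6:
                continue
            R, t = rigid_transform(P[idx], Q[idx])
            resid = np.linalg.norm((P @ R.T + t) - Q, axis=1)
            inliers = resid < tol
            if inliers.sum() > best_inliers.sum():
                best_inliers = inliers
        # Re-estimate the model from the largest consensus set.
        return rigid_transform(P[best_inliers], Q[best_inliers])
    ```

    On noiseless correspondences contaminated by a few gross outliers, the recovered transform matches the one used to generate the data, which is the robustness property the abstract credits for the method's noise immunity.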

  9. The Effects of Computer Animated Dissection versus Preserved Animal Dissection on the Student Achievement in a High School Biology Class.

    ERIC Educational Resources Information Center

    Kariuki, Patrick; Paulson, Ronda

    The purpose of this study was to examine the effectiveness of computer-animated dissection techniques versus the effectiveness of traditional dissection techniques as related to student achievement. The sample used was 104 general biology students from a small, rural high school in Northeast Tennessee. Random selection was used to separate the…

  10. Bias of health estimates obtained from chronic disease and risk factor surveillance systems using telephone population surveys in Australia: results from a representative face-to-face survey in Australia from 2010 to 2013.

    PubMed

    Dal Grande, Eleonora; Chittleborough, Catherine R; Campostrini, Stefano; Taylor, Anne W

    2016-04-18

    Emerging communication technologies have had an impact on population-based telephone surveys worldwide. Our objective was to examine the potential biases of health estimates in South Australia, a state of Australia, obtained via current landline telephone survey methodologies, and to report on the impact of mobile-only households on household surveys. Data from an annual multi-stage, systematic, clustered-area, face-to-face population survey, the Health Omnibus Survey (approximately 3000 interviews annually), included questions about telephone ownership to assess the population that was non-contactable by current telephone sampling methods (2006 to 2013). Univariable analyses (2010 to 2013) and trend analyses were conducted for sociodemographic and health indicator variables in relation to telephone status. Relative coverage bias (RCB) analysis of two hypothetical telephone samples was undertaken by examining the prevalence estimates of health status and health risk behaviours (2010 to 2013): a sample of directory-listed numbers, consisting mainly of landline telephone numbers and a small proportion of mobile telephone numbers; and a random digit dialling (RDD) sample of landline telephone numbers, which excludes mobile-only households. Telephone (landline and mobile) coverage in South Australia is very high (97%). Mobile telephone ownership increased slightly (by 7.4%), rising from 89.7% in 2006 to 96.3% in 2013; mobile-only households increased by 431% over the eight-year period, from 5.2% in 2006 to 27.6% in 2013. Only half of households have either a mobile or landline number listed in the telephone directory. There were small differences in the prevalence estimates for current asthma, arthritis, diabetes and obesity between the hypothetical telephone samples and the overall sample. However, the prevalence estimate for diabetes was slightly underestimated (RCB value of -0.077) in 2013. Mixed RCB results were found for having a mental health condition in both telephone samples. Current smoking prevalence was lower for both hypothetical telephone samples in absolute differences and RCB values: -0.136 to -0.191 for the RDD landline samples and -0.129 to -0.313 for the directory-listed samples. These findings suggest that the landline-based sampling frames used in Australia, when appropriately weighted, produce reliable representative estimates for some health indicators but not for all. Researchers need to be aware of their limitations and potentially biased estimates.
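    The relative coverage bias figures quoted above can be computed from two prevalence estimates. A minimal sketch, assuming the usual definition RCB = (p_covered - p_overall) / p_overall; the numbers below are purely illustrative, not the survey's.

    ```python
    def relative_coverage_bias(p_covered, p_overall):
        """RCB: relative difference between the prevalence in the subpopulation
        covered by a sampling frame and the overall prevalence. Negative values
        mean the frame would underestimate the indicator."""
        return (p_covered - p_overall) / p_overall

    # Illustrative only: smoking at 18% overall but 15% among covered households.
    print(round(relative_coverage_bias(0.15, 0.18), 3))  # -0.167
    ```

    A value such as -0.167 would mean the telephone frame understates that indicator's prevalence by about 17% in relative terms, matching the sign convention of the RCB values reported in the abstract.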

  11. Hierarchy of evidence: differences in results between non-randomized studies and randomized trials in patients with femoral neck fractures.

    PubMed

    Bhandari, Mohit; Tornetta, Paul; Ellis, Thomas; Audige, Laurent; Sprague, Sheila; Kuo, Jonathann C; Swiontkowski, Marc F

    2004-01-01

    There have been a number of non-randomized studies comparing arthroplasty with internal fixation in patients with femoral neck fractures. However, there remains considerable debate about whether the results of non-randomized studies are consistent with the results of randomized, controlled trials. Given the economic burden of hip fractures, it remains essential to identify therapies that improve outcomes; however, whether data from non-randomized studies of an intervention should be used to guide patient care remains unclear. We aimed to determine whether the pooled results for mortality and revision surgery among non-randomized studies were similar to those of randomized trials in studies comparing arthroplasty with internal fixation in patients with femoral neck fractures. We conducted a Medline search from 1969 to June 2002, identifying both randomized and non-randomized studies comparing internal fixation with arthroplasty in patients with femoral neck fractures. Additional strategies to identify relevant articles included the Cochrane database, SCISEARCH, textbooks, annual meeting programs, and content experts. We abstracted information on mortality and revision rates in each study and compared the pooled results between non-randomized and randomized studies. In addition, we explored potential reasons for dissimilar results between the two study designs. We identified 140 citations that addressed the general topic of comparison of arthroplasty and internal fixation for hip fracture. Of these, 27 studies met the eligibility criteria, 13 of which were non-randomized studies and 14 of which were randomized trials. Mortality data were available in all 13 non-randomized studies (n = 3108 patients) and in 12 randomized studies (n = 1767 patients). Non-randomized studies overestimated the risk of mortality by 40% when compared with the results of randomized trials (relative risk 1.44 vs 1.04, respectively). Information on revision risk was available in 9 non-randomized studies (n = 2764 patients) and all 14 randomized studies (n = 1901 patients). Both estimates from non-randomized and randomized studies revealed a significant reduction in the risk of revision surgery with arthroplasty compared with internal fixation (relative risk 0.38 vs 0.23, respectively). The reduction in the risk of revision surgery with arthroplasty compared with internal fixation was 62% for non-randomized studies and 77% for randomized trials. Thus, non-randomized studies underestimated the relative benefit of arthroplasty by 19.5%. Non-randomized studies with point estimates of relative risk similar to the pooled estimate for randomized trials all controlled for patient age, gender, and fracture displacement in their comparisons of mortality. We were unable to identify reasons for the differences in revision rate results between the study designs. Similar to other reports in medical subspecialties, non-randomized studies provided results dissimilar to randomized trials of arthroplasty vs internal fixation for mortality and revision rates in patients with femoral neck fractures. Investigators should be aware of these discrepancies when evaluating the merits of alternative surgical interventions, especially when both randomized trials and non-randomized comparative studies are available.

  12. High-throughput analysis using non-depletive SPME: challenges and applications to the determination of free and total concentrations in small sample volumes.

    PubMed

    Boyacı, Ezel; Bojko, Barbara; Reyes-Garcés, Nathaly; Poole, Justen J; Gómez-Ríos, Germán Augusto; Teixeira, Alexandre; Nicol, Beate; Pawliszyn, Janusz

    2018-01-18

    In vitro high-throughput non-depletive quantitation of chemicals in biofluids is of growing interest in many areas. Some of the challenges facing researchers include the limited volume of biofluids, rapid and high-throughput sampling requirements, and the lack of reliable methods. Coupled to the above, growing interest in the monitoring of kinetics and dynamics of miniaturized biosystems has spurred the demand for development of novel and revolutionary methodologies for analysis of biofluids. The applicability of solid-phase microextraction (SPME) is investigated as a potential technology to fulfill the aforementioned requirements. As analytes with sufficient diversity in their physicochemical features, nicotine, N,N-Diethyl-meta-toluamide, and diclofenac were selected as test compounds for the study. The objective was to develop methodologies that would allow repeated non-depletive sampling from 96-well plates, using 100 µL of sample. Initially, thin film-SPME was investigated. Results revealed substantial depletion and consequent disruption in the system. Therefore, new ultra-thin coated fibers were developed. The applicability of this device to the described sampling scenario was tested by determining the protein binding of the analytes. Results showed good agreement with rapid equilibrium dialysis. The presented method allows high-throughput analysis using small volumes, enabling fast reliable free and total concentration determinations without disruption of system equilibrium.

  13. Transcriptome instability as a molecular pan-cancer characteristic of carcinomas.

    PubMed

    Sveen, Anita; Johannessen, Bjarne; Teixeira, Manuel R; Lothe, Ragnhild A; Skotheim, Rolf I

    2014-08-10

    We have previously proposed transcriptome instability as a genome-wide, pre-mRNA splicing-related characteristic of colorectal cancer. Here, we explore the hypothesis of transcriptome instability being a general characteristic of cancer. Exon-level microarray expression data from ten cancer datasets were analyzed, including breast cancer, cervical cancer, colorectal cancer, gastric cancer, lung cancer, neuroblastoma, and prostate cancer (555 samples), as well as paired normal tissue samples from the colon, lung, prostate, and stomach (93 samples). Based on alternative splicing scores across the genomes, we calculated sample-wise relative amounts of aberrant exon skipping and inclusion. Strong and non-random (P < 0.001) correlations between these estimates and the expression levels of splicing factor genes (n = 280) were found in most cancer types analyzed (breast-, cervical-, colorectal-, lung- and prostate cancer). This suggests a biological explanation for the splicing variation. Surprisingly, these associations prevailed in pan-cancer analyses. This is in contrast to the tissue and cancer specific patterns observed in comparisons across healthy tissue samples from the colon, lung, prostate, and stomach, and between paired cancer-normal samples from the same four tissue types. Based on exon-level expression profiling and computational analyses of alternative splicing, we propose transcriptome instability as a molecular pan-cancer characteristic. The affected cancers show strong and non-random associations between low expression levels of splicing factor genes, and high amounts of aberrant exon skipping and inclusion, and vice versa, on a genome-wide scale.

  14. A universal method for the mutational analysis of K-ras and p53 gene in non-small-cell lung cancer using formalin-fixed paraffin-embedded tissue.

    PubMed

    Sarkar, F H; Valdivieso, M; Borders, J; Yao, K L; Raval, M M; Madan, S K; Sreepathi, P; Shimoyama, R; Steiger, Z; Visscher, D W

    1995-12-01

    The p53 tumor suppressor gene has been found to be altered in almost all human solid tumors, whereas K-ras gene mutations have been observed in a limited number of human cancers (adenocarcinoma of the colon, pancreas, and lung). Studies of mutational inactivation of both genes in the same patient's sample of non-small-cell lung cancer have been limited. In an effort to perform such an analysis, we developed and compared methods for the mutational detection of the p53 and K-ras genes that represent a modified and universal protocol, in terms of DNA extraction, polymerase chain reaction (PCR) amplification, and nonradioisotopic PCR-single-strand conformation polymorphism (PCR-SSCP) analysis, which is readily applicable to either formalin-fixed, paraffin-embedded tissues or frozen tumor specimens. We applied this method to the evaluation of p53 (exons 5-8) and K-ras (codons 12 and 13) gene mutations in 55 cases of non-small-cell lung cancer. The mutational status of the p53 gene was evaluated by radioisotopic PCR-SSCP and compared with PCR-SSCP utilizing our standardized nonradioisotopic detection system using a single 6-µm tissue section. The mutational patterns observed by PCR-SSCP were subsequently confirmed by PCR-DNA sequencing. The mutational status of the K-ras gene was similarly evaluated by PCR-SSCP, and the specific mutation was confirmed by Southern slot-blot hybridization using 32P-labeled sequence-specific oligonucleotide probes for codons 12 and 13. Mutational changes in K-ras (codon 12) were found in 10 of 55 (18%) non-small-cell lung cancers. Whereas adenocarcinoma showed K-ras mutation at codon 12 in 33% of cases, only one mutation was found at codon 13. As expected, the squamous cell carcinoma samples (25 cases) did not show K-ras mutations. Mutations in exons 5-8 of the p53 gene were documented in 19 of 55 (34.5%) cases. Ten of the 19 mutations were single-nucleotide point mutations leading to amino acid substitution. 
Six showed insertional mutation, and three showed deletion mutations. Only three samples showed mutations of both K-ras and p53 genes. We conclude that although K-ras and p53 gene mutations are frequent in non-small-cell lung cancer, mutations of both genes in the same patient's samples are not common. We also conclude that this universal nonradioisotopic method is superior to other similar methods and is readily applicable to the rapid screening of large numbers of formalin-fixed, paraffin-embedded or frozen samples for the mutational analysis of multiple genes.

  15. Non-parametric methods for cost-effectiveness analysis: the central limit theorem and the bootstrap compared.

    PubMed

    Nixon, Richard M; Wonderling, David; Grieve, Richard D

    2010-03-01

    Cost-effectiveness analyses (CEA) alongside randomised controlled trials commonly estimate incremental net benefits (INB), with 95% confidence intervals, and compute cost-effectiveness acceptability curves and confidence ellipses. Two alternative non-parametric methods for estimating INB are to apply the central limit theorem (CLT) or to use the non-parametric bootstrap method, although it is unclear which method is preferable. This paper describes the statistical rationale underlying each of these methods and illustrates their application with a trial-based CEA. It compares the sampling uncertainty from using either technique in a Monte Carlo simulation. The experiments are repeated varying the sample size and the skewness of costs in the population. The results showed that, even when data were highly skewed, both methods accurately estimated the true standard errors (SEs) when sample sizes were moderate to large (n>50), and also gave good estimates for small data sets with low skewness. However, when sample sizes were relatively small and the data highly skewed, using the CLT rather than the bootstrap led to slightly more accurate SEs. We conclude that while in general using either method is appropriate, the CLT is easier to implement, and provides SEs that are at least as accurate as the bootstrap. (c) 2009 John Wiley & Sons, Ltd.
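    The comparison between the two non-parametric SE estimators can be reproduced in miniature. A sketch under assumed data: a lognormal "cost" sample stands in for trial costs, and it shows only that the CLT and bootstrap standard errors of the mean agree closely at moderate n, not the paper's full incremental net benefit analysis.

    ```python
    import random
    import statistics

    random.seed(0)
    # Skewed "cost" data: a moderate-sized lognormal sample (placeholder values).
    costs = [random.lognormvariate(8, 1.5) for _ in range(60)]
    n = len(costs)

    # CLT: standard error of the mean from the sample standard deviation.
    se_clt = statistics.stdev(costs) / n ** 0.5

    # Non-parametric bootstrap: SD of the means of resampled datasets.
    boot_means = []
    for _ in range(2000):
        resample = random.choices(costs, k=n)  # sample with replacement
        boot_means.append(sum(resample) / n)
    se_boot = statistics.stdev(boot_means)

    print(se_clt, se_boot)  # the two estimates should be of similar magnitude
    ```

    Even with this degree of skewness, the two estimates typically differ by only a few percent at n = 60, consistent with the paper's finding that both methods are accurate for moderate-to-large samples.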

  16. Composition-dependent nanoelectronics of amido-phenazines: non-volatile RRAM and WORM memory devices.

    PubMed

    Maiti, Dilip K; Debnath, Sudipto; Nawaz, Sk Masum; Dey, Bapi; Dinda, Enakhi; Roy, Dipanwita; Ray, Sudipta; Mallik, Abhijit; Hussain, Syed A

    2017-10-17

    A metal-free three-component cyclization reaction with amidation is devised for the direct synthesis of a DFT-designed amido-phenazine derivative bearing noncovalent gluing interactions, which is used to fabricate organic nanomaterials. Composition-dependent organic nanoelectronics for nonvolatile memory devices are demonstrated using mixed phenazine-stearic acid (SA) nanomaterials. We discovered two different types of nonmagnetic, non-moisture-sensitive switching resistance in devices fabricated from the mixed organic nanomaterials: (a) sample-1 (8:SA = 1:3) is initially off, turning on at a threshold voltage, but does not turn off again under any applied voltage; and (b) sample-2 (8:SA = 3:1) is initially off, turning on at a sharp threshold and off again when the polarity is reversed. No negative differential resistance is observed in either type. The two samples suit different device implementations: sample-1 is attractive for write-once-read-many-times memory devices, such as non-editable databases, archival memory, electronic voting, and radio-frequency identification; sample-2 is useful for resistive-switching random access memory applications.

  17. The analysis of ALK gene rearrangement by fluorescence in situ hybridization in non-small cell lung cancer patients

    PubMed Central

    Krawczyk, Paweł Adam; Ramlau, Rodryg Adam; Szumiło, Justyna; Kozielski, Jerzy; Kalinka-Warzocha, Ewa; Bryl, Maciej; Knopik-Dąbrowicz, Alina; Spychalski, Łukasz; Szczęsna, Aleksandra; Rydzik, Ewelina; Milanowski, Janusz

    2013-01-01

    Introduction ALK gene rearrangement is observed in a small subset (3–7%) of non-small cell lung cancer (NSCLC) patients. The efficacy of crizotinib has been shown in lung cancer patients harbouring ALK rearrangement. Nowadays, analysis of ALK gene rearrangement is added to the molecular examination of predictive factors. Aim of the study The frequency of ALK gene rearrangement, as well as the type of its irregularity, was analysed by fluorescence in situ hybridisation (FISH) in tissue samples from NSCLC patients. Material and methods ALK gene rearrangement was analysed in 71 samples, including 53 histological and 18 cytological samples. The analysis could be performed in 56 cases (78.87%), significantly more frequently in histological than in cytological materials. The problems encountered in ALK rearrangement diagnosis resulted from the scarcity of tumour cells in cytological samples, high background fluorescence noise and fragmentation of cell nuclei. Results The normal ALK copy number without gene rearrangement was observed in 26 (36.62%) patients. ALK gene polysomy without gene rearrangement was observed in 25 (35.21%) samples, while in 3 (4.23%) samples ALK gene amplification was found. ALK gene rearrangement was observed in 2 (2.82%) samples, both from males; in the first case the rearrangement coexisted with ALK amplification. In the second case, signet-ring tumour cells were found during histopathological examination, and this patient was successfully treated with crizotinib, with partial remission lasting 16 months. Conclusions FISH is a useful technique for ALK gene rearrangement analysis which allows the type of gene irregularity to be specified. ALK gene examination can be performed in histological as well as cytological (cell block) samples, but obtaining a reliable result in cytological samples depends on the cellularity of the examined material. PMID:24592134

  18. An intrinsic algorithm for parallel Poisson disk sampling on arbitrary surfaces.

    PubMed

    Ying, Xiang; Xin, Shi-Qing; Sun, Qian; He, Ying

    2013-09-01

    Poisson disk sampling has excellent spatial and spectral properties and plays an important role in a variety of visual computing applications. Although many promising algorithms have been proposed for multidimensional sampling in Euclidean space, very few studies have addressed the problem of generating Poisson disks on surfaces, due to the complicated nature of surfaces. This paper presents an intrinsic algorithm for parallel Poisson disk sampling on arbitrary surfaces. In sharp contrast to conventional parallel approaches, our method neither partitions the given surface into small patches nor uses any spatial data structure to maintain the voids in the sampling domain. Instead, our approach assigns each sample candidate a random and unique priority that is unbiased with regard to the distribution. Hence, multiple threads can process the candidates simultaneously and resolve conflicts by checking the given priority values. Our algorithm guarantees that the generated Poisson disks are uniformly and randomly distributed without bias. It is worth noting that our method is intrinsic and independent of the embedding space; this intrinsic feature allows us to generate Poisson disk patterns on arbitrary surfaces in R^n. To our knowledge, this is the first intrinsic, parallel, and accurate algorithm for surface Poisson disk sampling. Furthermore, by manipulating a spatially varying density function, we can easily obtain adaptive sampling.
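    The disk property that defines this kind of sampling, i.e. no two samples closer than a given radius, can be illustrated with the simplest sequential scheme, naive dart throwing. This is only a sketch of the disk constraint: the paper's algorithm is parallel, priority-based, and operates intrinsically on curved surfaces, none of which this toy version attempts.

    ```python
    import random

    def poisson_disk_2d(radius, width=1.0, height=1.0, max_tries=10_000, seed=0):
        """Naive dart-throwing Poisson disk sampling in a rectangle: accept a
        uniformly random candidate only if it lies at least `radius` away from
        every previously accepted sample."""
        rng = random.Random(seed)
        samples = []
        for _ in range(max_tries):
            p = (rng.uniform(0, width), rng.uniform(0, height))
            ok = all((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 >= radius ** 2
                     for q in samples)
            if ok:
                samples.append(p)
        return samples

    pts = poisson_disk_2d(0.1)
    print(len(pts))
    ```

    Dart throwing is O(n) per candidate and sequential; the parallel conflict-resolution by random priorities described in the abstract exists precisely to remove both limitations.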

  19. Hydration Free Energy from Orthogonal Space Random Walk and Polarizable Force Field.

    PubMed

    Abella, Jayvee R; Cheng, Sara Y; Wang, Qiantao; Yang, Wei; Ren, Pengyu

    2014-07-08

    The orthogonal space random walk (OSRW) method has shown enhanced sampling efficiency in free energy calculations from previous studies. In this study, the implementation of OSRW in accordance with the polarizable AMOEBA force field in TINKER molecular modeling software package is discussed and subsequently applied to the hydration free energy calculation of 20 small organic molecules, among which 15 are positively charged and five are neutral. The calculated hydration free energies of these molecules are compared with the results obtained from the Bennett acceptance ratio method using the same force field, and overall an excellent agreement is obtained. The convergence and the efficiency of the OSRW are also discussed and compared with BAR. Combining enhanced sampling techniques such as OSRW with polarizable force fields is very promising for achieving both accuracy and efficiency in general free energy calculations.

  20. 75 FR 15492 - Paperwork Reduction Act of 1995, as Amended by Public Law 104-13; Proposed Collection, Comment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-29

    .... James W. Sample, Director of CyberSecurity. [FR Doc. 2010-6904 Filed 3-26-10; 8:45 am] BILLING CODE 8120...: Individuals or households, state or local governments, farms, businesses, or other for-profit Federal agencies or employees, non-profit institutions, small businesses or organizations. Small Businesses or...
