Sample records for random representative sample

  1. Observational studies of patients in the emergency department: a comparison of 4 sampling methods.

    PubMed

    Valley, Morgan A; Heard, Kennon J; Ginde, Adit A; Lezotte, Dennis C; Lowenstein, Steven R

    2012-08-01

    We evaluate the ability of 4 sampling methods to generate representative samples of the emergency department (ED) population. We analyzed the electronic records of 21,662 consecutive patient visits at an urban, academic ED. From this population, we simulated different models of study recruitment in the ED by using 2 sample sizes (n=200 and n=400) and 4 sampling methods: true random, random 4-hour time blocks by exact sample size, random 4-hour time blocks by a predetermined number of blocks, and convenience or "business hours." For each method and sample size, we obtained 1,000 samples from the population. Using χ² tests, we measured the number of statistically significant differences between the sample and the population for 8 variables (age, sex, race/ethnicity, language, triage acuity, arrival mode, disposition, and payer source). Then, for each variable, method, and sample size, we compared the proportion of the 1,000 samples that differed from the overall ED population to the expected proportion (5%). Only the true random samples represented the population with respect to sex, race/ethnicity, triage acuity, mode of arrival, language, and payer source in at least 95% of the samples. Patient samples obtained using random 4-hour time blocks and business hours sampling systematically differed from the overall ED patient population for several important demographic and clinical variables. However, the magnitude of these differences was not large. Common sampling strategies selected for ED-based studies may affect parameter estimates for several representative population variables. However, the potential for bias for these variables appears small. Copyright © 2012. Published by Mosby, Inc.
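
    A minimal sketch of the comparison described above, using synthetic visit data and hypothetical triage-acuity proportions rather than the study's records: draw a sample, then test its category counts against the known population proportions with a χ² goodness-of-fit test.

      # Sketch: test whether a drawn sample matches known population proportions
      # with a chi-square goodness-of-fit test (synthetic, illustrative data only).
      import numpy as np
      from scipy.stats import chisquare

      rng = np.random.default_rng(42)

      # Hypothetical population distribution of one variable (triage acuity).
      categories = np.array(["acuity_1", "acuity_2", "acuity_3", "acuity_4"])
      pop_props = np.array([0.05, 0.30, 0.45, 0.20])
      population = rng.choice(categories, size=21_662, p=pop_props)

      def sample_differs(sample, alpha=0.05):
          """True if the sample's acuity counts differ from the population at level alpha."""
          observed = np.array([(sample == c).sum() for c in categories])
          expected = np.array([(population == c).mean() for c in categories]) * len(sample)
          _, p = chisquare(observed, f_exp=expected)
          return p < alpha

      # Repeat true random sampling (n = 400) many times; the flag rate should
      # stay near the nominal 5% if the sampling method is unbiased.
      n_sims, n = 1000, 400
      flags = sum(sample_differs(rng.choice(population, size=n, replace=False))
                  for _ in range(n_sims))
      print(f"proportion of random samples flagged as different: {flags / n_sims:.3f}")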

  2. Point-Sampling and Line-Sampling Probability Theory, Geometric Implications, Synthesis

    Treesearch

    L.R. Grosenbaugh

    1958-01-01

    Foresters concerned with measuring tree populations on definite areas have long employed two well-known methods of representative sampling. In list or enumerative sampling the entire tree population is tallied with a known proportion being randomly selected and measured for volume or other variables. In area sampling all trees on randomly located plots or strips...

  3. Sampling in epidemiological research: issues, hazards and pitfalls.

    PubMed

    Tyrer, Stephen; Heyman, Bob

    2016-04-01

    Surveys of people's opinions are fraught with difficulties. It is easier to obtain information from those who respond to text messages or to emails than to attempt to obtain a representative sample. Samples of the population that are selected non-randomly in this way are termed convenience samples as they are easy to recruit. This introduces a sampling bias. Such non-probability samples have merit in many situations, but an epidemiological enquiry is of little value unless a random sample is obtained. If a sufficient number of those selected actually complete a survey, the results are likely to be representative of the population. This editorial describes probability and non-probability sampling methods and illustrates the difficulties and suggested solutions in performing accurate epidemiological research.

  4. Sampling in epidemiological research: issues, hazards and pitfalls

    PubMed Central

    Tyrer, Stephen; Heyman, Bob

    2016-01-01

    Surveys of people's opinions are fraught with difficulties. It is easier to obtain information from those who respond to text messages or to emails than to attempt to obtain a representative sample. Samples of the population that are selected non-randomly in this way are termed convenience samples as they are easy to recruit. This introduces a sampling bias. Such non-probability samples have merit in many situations, but an epidemiological enquiry is of little value unless a random sample is obtained. If a sufficient number of those selected actually complete a survey, the results are likely to be representative of the population. This editorial describes probability and non-probability sampling methods and illustrates the difficulties and suggested solutions in performing accurate epidemiological research. PMID:27087985

  5. Spline methods for approximating quantile functions and generating random samples

    NASA Technical Reports Server (NTRS)

    Schiess, J. R.; Matthews, C. G.

    1985-01-01

    Two cubic spline formulations are presented for representing the quantile function (inverse cumulative distribution function) of a random sample of data. Both B-spline and rational spline approximations are compared with analytic representations of the quantile function. It is also shown how these representations can be used to generate random samples for use in simulation studies. Comparisons are made on samples generated from known distributions and a sample of experimental data. The spline representations are more accurate for multimodal and skewed samples and require much less time to generate samples than the analytic representation.
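
    A hedged sketch of the general idea, using SciPy's monotone PCHIP interpolator as a stand-in for the B-spline and rational-spline forms developed in the report: fit a smooth quantile function to an ordered sample, then generate new variates by evaluating it at uniform random numbers.

      # Sketch: approximate a quantile function from data with a monotone spline
      # and use it for inverse-transform sampling (PCHIP used for illustration).
      import numpy as np
      from scipy.interpolate import PchipInterpolator

      rng = np.random.default_rng(0)
      data = rng.gamma(shape=2.0, scale=1.5, size=500)   # stand-in skewed sample

      # Empirical quantile function: plotting positions p_i -> order statistics.
      order_stats = np.sort(data)
      probs = (np.arange(1, len(data) + 1) - 0.5) / len(data)
      quantile_fn = PchipInterpolator(probs, order_stats)  # monotone by construction

      # Generate new samples by evaluating the fitted quantile function at U(0,1).
      u = rng.uniform(probs[0], probs[-1], size=10_000)    # stay inside fitted range
      simulated = quantile_fn(u)
      print(simulated.mean(), data.mean())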

  6. Sampling Strategies and Processing of Biobank Tissue Samples from Porcine Biomedical Models.

    PubMed

    Blutke, Andreas; Wanke, Rüdiger

    2018-03-06

    In translational medical research, porcine models have steadily become more popular. Considering the high value of individual animals, particularly of genetically modified pig models, and the often-limited number of available animals of these models, establishment of (biobank) collections of adequately processed tissue samples suited for a broad spectrum of subsequent analysis methods, including analyses not specified at the time point of sampling, represents a meaningful approach to take full advantage of the translational value of the model. With respect to the peculiarities of porcine anatomy, comprehensive guidelines have recently been established for standardized generation of representative, high-quality samples from different porcine organs and tissues. These guidelines are essential prerequisites for the reproducibility of results and their comparability between different studies and investigators. The recording of basic data, such as organ weights and volumes, the determination of the sampling locations and of the numbers of tissue samples to be generated, as well as their orientation, size, processing and trimming directions, are relevant factors determining the generalizability and usability of the specimens for molecular, qualitative, and quantitative morphological analyses. Here, an illustrative, practical, step-by-step demonstration of the most important techniques for generation of representative, multi-purpose biobank specimens from porcine tissues is presented. The methods described here include determination of organ/tissue volumes and densities, the application of a volume-weighted systematic random sampling procedure for parenchymal organs by point-counting, determination of the extent of tissue shrinkage related to histological embedding of samples, and generation of randomly oriented samples for quantitative stereological analyses, such as isotropic uniform random (IUR) sections generated by the "Orientator" and "Isector" methods, and vertical uniform random (VUR) sections.

  7. Optimized probability sampling of study sites to improve generalizability in a multisite intervention trial.

    PubMed

    Kraschnewski, Jennifer L; Keyserling, Thomas C; Bangdiwala, Shrikant I; Gizlice, Ziya; Garcia, Beverly A; Johnston, Larry F; Gustafson, Alison; Petrovic, Lindsay; Glasgow, Russell E; Samuel-Hodge, Carmen D

    2010-01-01

    Studies of type 2 translation, the adaption of evidence-based interventions to real-world settings, should include representative study sites and staff to improve external validity. Sites for such studies are, however, often selected by convenience sampling, which limits generalizability. We used an optimized probability sampling protocol to select an unbiased, representative sample of study sites to prepare for a randomized trial of a weight loss intervention. We invited North Carolina health departments within 200 miles of the research center to participate (N = 81). Of the 43 health departments that were eligible, 30 were interested in participating. To select a representative and feasible sample of 6 health departments that met inclusion criteria, we generated all combinations of 6 from the 30 health departments that were eligible and interested. From the subset of combinations that met inclusion criteria, we selected 1 at random. Of 593,775 possible combinations of 6 counties, 15,177 (3%) met inclusion criteria. Sites in the selected subset were similar to all eligible sites in terms of health department characteristics and county demographics. Optimized probability sampling improved generalizability by ensuring an unbiased and representative sample of study sites.
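
    A small sketch of the combinatorial selection step described above, with made-up site data and a hypothetical inclusion criterion (the study's actual criteria are not given in the abstract): enumerate all 6-site combinations of 30 candidates, keep those meeting the criterion, and draw one at random.

      # Sketch: optimized probability sampling of study sites (hypothetical data).
      import math
      import random
      from itertools import combinations

      random.seed(1)

      # Hypothetical candidate sites: (site id, county population served).
      sites = [(i, random.randint(20_000, 300_000)) for i in range(30)]

      def meets_criteria(combo):
          # Stand-in for the study's feasibility/inclusion rules:
          # require a combined served population of at least 600,000.
          return sum(pop for _, pop in combo) >= 600_000

      eligible_combos = [c for c in combinations(sites, 6) if meets_criteria(c)]
      print(f"{len(eligible_combos)} of {math.comb(30, 6)} combinations meet the criterion")

      selected = random.choice(eligible_combos)
      print("selected site ids:", [site_id for site_id, _ in selected])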

  8. Reaching a Representative Sample of College Students: A Comparative Analysis

    ERIC Educational Resources Information Center

    Giovenco, Daniel P.; Gundersen, Daniel A.; Delnevo, Cristine D.

    2016-01-01

    Objective: To explore the feasibility of a random-digit dial (RDD) cellular phone survey in order to reach a national and representative sample of college students. Methods: Demographic distributions from the 2011 National Young Adult Health Survey (NYAHS) were benchmarked against enrollment numbers from the Integrated Postsecondary Education…

  9. Recruitment for Occupational Research: Using Injured Workers as the Point of Entry into Workplaces

    PubMed Central

    Koehoorn, Mieke; Trask, Catherine M.; Teschke, Kay

    2013-01-01

    Objective To investigate the feasibility, costs and sample representativeness of a recruitment method that used workers with back injuries as the point of entry into diverse working environments. Methods Workers' compensation claims were used to randomly sample workers from five heavy industries and to recruit their employers for ergonomic assessments of the injured worker and up to 2 co-workers. Results The final study sample included 54 workers from the workers’ compensation registry and 72 co-workers. This sample of 126 workers was based on an initial random sample of 822 workers with a compensation claim, or a ratio of 1 recruited worker to approximately 7 sampled workers. The average recruitment cost was CND$262/injured worker and CND$240/participating worksite including co-workers. The sample was representative of the heavy industry workforce, and was successful in recruiting the self-employed (8.2%), workers from small employers (<20 workers, 38.7%), and workers from diverse working environments (49 worksites, 29 worksite types, and 51 occupations). Conclusions The recruitment rate was low but the cost per participant reasonable and the sample representative of workers in small worksites. Small worksites represent a significant portion of the workforce but are typically underrepresented in occupational research despite having distinct working conditions, exposures and health risks worthy of investigation. PMID:23826387

  10. Does self-selection affect samples' representativeness in online surveys? An investigation in online video game research.

    PubMed

    Khazaal, Yasser; van Singer, Mathias; Chatton, Anne; Achab, Sophia; Zullino, Daniele; Rothen, Stephane; Khan, Riaz; Billieux, Joel; Thorens, Gabriel

    2014-07-07

    The number of medical studies performed through online surveys has increased dramatically in recent years. Despite their numerous advantages (eg, sample size, facilitated access to individuals presenting stigmatizing issues), selection bias may exist in online surveys. However, evidence on the representativeness of self-selected samples in online studies is patchy. Our objective was to explore the representativeness of a self-selected sample of online gamers using online players' virtual characters (avatars). All avatars belonged to individuals playing World of Warcraft (WoW), currently the most widely used online game. Avatars' characteristics were defined using various games' scores, reported on the WoW's official website, and two self-selected samples from previous studies were compared with a randomly selected sample of avatars. We used scores linked to 1240 avatars (762 from the self-selected samples and 478 from the random sample). The two self-selected samples of avatars had higher scores on most of the assessed variables (except for guild membership and exploration). Furthermore, some guilds were overrepresented in the self-selected samples. Our results suggest that more proficient players or players more involved in the game may be more likely to participate in online surveys. Caution is needed in the interpretation of studies based on online surveys that used a self-selection recruitment procedure. Epidemiological evidence on the reduced representativeness of samples from online surveys is warranted.

  11. Representative Sampling: Follow-up of Spring 1972 and Spring 1973 Students. TEX-SIS FOLLOW-UP SC3.

    ERIC Educational Resources Information Center

    Wilkinson, Larry; And Others

    This report presents the findings of a research study, conducted by the College of the Mainland (COM) as a subcontractor for Project FOLLOW-UP, designed to test the accuracy of random sampling and to measure non-response bias in mail surveys. In 1975, a computer-generated random sample of 500 students was drawn from a population of 1,256 students…

  12. A Model for Predicting Behavioural Sleep Problems in a Random Sample of Australian Pre-Schoolers

    ERIC Educational Resources Information Center

    Hall, Wendy A.; Zubrick, Stephen R.; Silburn, Sven R.; Parsons, Deborah E.; Kurinczuk, Jennifer J.

    2007-01-01

    Behavioural sleep problems (childhood insomnias) can cause distress for both parents and children. This paper reports a model describing predictors of high sleep problem scores in a representative population-based random sample survey of non-Aboriginal singleton children born in 1995 and 1996 (1085 girls and 1129 boys) in Western Australia.…

  13. Predicting Posttraumatic Stress Symptoms Longitudinally in a Representative Sample of Hospitalized Injured Adolescents

    ERIC Educational Resources Information Center

    Zatzick, Douglas F.; Grossman, David C.; Russo, Joan; Pynoos, Robert; Berliner, Lucy; Jurkovich, Gregory; Sabin, Janice A.; Katon, Wayne; Ghesquiere, Angela; McCauley, Elizabeth; Rivara, Frederick P.

    2006-01-01

    Objective: Adolescents constitute a high-risk population for traumatic physical injury, yet few longitudinal investigations have assessed the development of posttraumatic stress disorder (PTSD) symptoms over time in representative samples. Method: Between July 2002 and August 2003, 108 randomly selected injured adolescent patients ages 12 to 18 and…

  14. Toward a Principled Sampling Theory for Quasi-Orders

    PubMed Central

    Ünlü, Ali; Schrepp, Martin

    2016-01-01

    Quasi-orders, that is, reflexive and transitive binary relations, have numerous applications. In educational theories, the dependencies of mastery among the problems of a test can be modeled by quasi-orders. Methods such as item tree or Boolean analysis that mine for quasi-orders in empirical data are sensitive to the underlying quasi-order structure. These data mining techniques have to be compared based on extensive simulation studies, with unbiased samples of randomly generated quasi-orders at their basis. In this paper, we develop techniques that can provide the required quasi-order samples. We introduce a discrete doubly inductive procedure for incrementally constructing the set of all quasi-orders on a finite item set. A randomization of this deterministic procedure allows us to generate representative samples of random quasi-orders. With an outer level inductive algorithm, we consider the uniform random extensions of the trace quasi-orders to higher dimension. This is combined with an inner level inductive algorithm to correct the extensions that violate the transitivity property. The inner level correction step entails sampling biases. We propose three algorithms for bias correction and investigate them in simulation. It is evident that, on even up to 50 items, the new algorithms create close to representative quasi-order samples within acceptable computing time. Hence, the principled approach is a significant improvement to existing methods that are used to draw quasi-orders uniformly at random but cannot cope with reasonably large item sets. PMID:27965601
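
    A naive sketch, not the paper's doubly inductive construction: on a very small item set one can draw random reflexive relations and keep only those that are transitive, which does sample quasi-orders uniformly but becomes infeasible as the item count grows, which is the scalability problem the paper addresses.

      # Naive rejection sampling of quasi-orders on a tiny item set (illustrative
      # only; infeasible for the 50-item sets handled by the paper's algorithms).
      import itertools
      import random

      random.seed(0)

      def random_reflexive_relation(n):
          """Random binary relation on {0,...,n-1} that always contains the diagonal."""
          rel = {(i, i) for i in range(n)}
          for i, j in itertools.permutations(range(n), 2):
              if random.random() < 0.5:
                  rel.add((i, j))
          return rel

      def is_transitive(rel):
          return all((a, d) in rel for a, b in rel for c, d in rel if b == c)

      def sample_quasi_order(n, max_tries=100_000):
          """Draw reflexive relations until one is transitive (i.e., a quasi-order)."""
          for _ in range(max_tries):
              rel = random_reflexive_relation(n)
              if is_transitive(rel):
                  return rel
          raise RuntimeError("no quasi-order found within the try budget")

      print(sorted(sample_quasi_order(4)))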

  15. Toward a Principled Sampling Theory for Quasi-Orders.

    PubMed

    Ünlü, Ali; Schrepp, Martin

    2016-01-01

    Quasi-orders, that is, reflexive and transitive binary relations, have numerous applications. In educational theories, the dependencies of mastery among the problems of a test can be modeled by quasi-orders. Methods such as item tree or Boolean analysis that mine for quasi-orders in empirical data are sensitive to the underlying quasi-order structure. These data mining techniques have to be compared based on extensive simulation studies, with unbiased samples of randomly generated quasi-orders at their basis. In this paper, we develop techniques that can provide the required quasi-order samples. We introduce a discrete doubly inductive procedure for incrementally constructing the set of all quasi-orders on a finite item set. A randomization of this deterministic procedure allows us to generate representative samples of random quasi-orders. With an outer level inductive algorithm, we consider the uniform random extensions of the trace quasi-orders to higher dimension. This is combined with an inner level inductive algorithm to correct the extensions that violate the transitivity property. The inner level correction step entails sampling biases. We propose three algorithms for bias correction and investigate them in simulation. It is evident that, on even up to 50 items, the new algorithms create close to representative quasi-order samples within acceptable computing time. Hence, the principled approach is a significant improvement to existing methods that are used to draw quasi-orders uniformly at random but cannot cope with reasonably large item sets.

  16. Does Self-Selection Affect Samples’ Representativeness in Online Surveys? An Investigation in Online Video Game Research

    PubMed Central

    van Singer, Mathias; Chatton, Anne; Achab, Sophia; Zullino, Daniele; Rothen, Stephane; Khan, Riaz; Billieux, Joel; Thorens, Gabriel

    2014-01-01

    Background The number of medical studies performed through online surveys has increased dramatically in recent years. Despite their numerous advantages (eg, sample size, facilitated access to individuals presenting stigmatizing issues), selection bias may exist in online surveys. However, evidence on the representativeness of self-selected samples in online studies is patchy. Objective Our objective was to explore the representativeness of a self-selected sample of online gamers using online players’ virtual characters (avatars). Methods All avatars belonged to individuals playing World of Warcraft (WoW), currently the most widely used online game. Avatars’ characteristics were defined using various games’ scores, reported on the WoW’s official website, and two self-selected samples from previous studies were compared with a randomly selected sample of avatars. Results We used scores linked to 1240 avatars (762 from the self-selected samples and 478 from the random sample). The two self-selected samples of avatars had higher scores on most of the assessed variables (except for guild membership and exploration). Furthermore, some guilds were overrepresented in the self-selected samples. Conclusions Our results suggest that more proficient players or players more involved in the game may be more likely to participate in online surveys. Caution is needed in the interpretation of studies based on online surveys that used a self-selection recruitment procedure. Epidemiological evidence on the reduced representativeness of samples from online surveys is warranted. PMID:25001007

  17. Studies in astronomical time series analysis: Modeling random processes in the time domain

    NASA Technical Reports Server (NTRS)

    Scargle, J. D.

    1979-01-01

    Random process models phrased in the time domain are used to analyze astrophysical time series data produced by random processes. A moving average (MA) model represents the data as a sequence of pulses occurring randomly in time, with random amplitudes. An autoregressive (AR) model represents the correlations in the process in terms of a linear function of past values. The best AR model is determined from sampled data and transformed to an MA for interpretation. The randomness of the pulse amplitudes is maximized by a FORTRAN algorithm which is relatively stable numerically. Results of test cases are given to study the effects of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the optical light curve of the quasar 3C 273 is given.
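
    A hedged sketch in the same spirit (not the paper's FORTRAN deconvolution algorithm): fit an AR(p) model to a synthetic series via the Yule-Walker equations and expand it into a truncated moving-average representation.

      # Sketch: fit an AR(p) model by the Yule-Walker equations and expand it
      # into its truncated moving-average (impulse-response) representation.
      import numpy as np

      rng = np.random.default_rng(3)

      # Synthetic series from a known AR(2) process for illustration.
      true_phi = np.array([0.6, -0.3])
      noise = rng.normal(size=2000)
      x = np.zeros(2000)
      for t in range(2, 2000):
          x[t] = true_phi[0] * x[t - 1] + true_phi[1] * x[t - 2] + noise[t]

      def yule_walker(x, p):
          """Estimate AR(p) coefficients from the sample autocovariances."""
          x = x - x.mean()
          acov = np.array([np.dot(x[:len(x) - k], x[k:]) / len(x) for k in range(p + 1)])
          R = np.array([[acov[abs(i - j)] for j in range(p)] for i in range(p)])
          return np.linalg.solve(R, acov[1:])

      phi = yule_walker(x, 2)

      # Truncated MA weights: psi_0 = 1, psi_k = sum_j phi_j * psi_{k-j}.
      psi = np.zeros(20)
      psi[0] = 1.0
      for k in range(1, len(psi)):
          psi[k] = sum(phi[j] * psi[k - 1 - j] for j in range(min(len(phi), k)))

      print("estimated AR coefficients:", np.round(phi, 3))
      print("first MA weights:", np.round(psi[:5], 3))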

  18. Recruitment of healthy participants for studies on risks for alcoholism: effectiveness of random digit dialling.

    PubMed

    Sorocco, Kristen H; Vincent, Andrea S; Collins, Frank L; Johnson, Christine A; Lovallo, William R

    2006-01-01

    To compare the effectiveness of two strategies for recruiting healthy research volunteers. Demographic characteristics and recruitment costs of participants who completed a laboratory study examining risk factors for alcoholism recruited through random digit dialling (N = 11) and community advertisements (N = 102) were compared. Advertisement yielded a more representative sample [76% Caucasian, less well educated (M = 15.2 years, SEM = 0.2; P < 0.05), more equally divided by family history of alcoholism (43% FH- and 57% FH+), and lower in SES (M = 42.8, SEM = 1.3; P < 0.05)] and was more cost effective (72 dollars vs 2272 dollars per participant) than random digit dialling. Findings are relevant to alcohol researchers trying to determine the recruitment strategy that will yield the most representative sample at the lowest cost.

  19. A Practical Methodology for Quantifying Random and Systematic Components of Unexplained Variance in a Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Deloach, Richard; Obara, Clifford J.; Goodman, Wesley L.

    2012-01-01

    This paper documents a check standard wind tunnel test conducted in the Langley 0.3-Meter Transonic Cryogenic Tunnel (0.3M TCT) that was designed and analyzed using the Modern Design of Experiments (MDOE). The test was designed to partition the unexplained variance of typical wind tunnel data samples into two constituent components, one attributable to ordinary random error, and one attributable to systematic error induced by covariate effects. Covariate effects in wind tunnel testing are discussed, with examples. The impact of systematic (non-random) unexplained variance on the statistical independence of sequential measurements is reviewed. The corresponding correlation among experimental errors is discussed, as is the impact of such correlation on experimental results generally. The specific experiment documented herein was organized as a formal test for the presence of unexplained variance in representative samples of wind tunnel data, in order to quantify the frequency with which such systematic error was detected, and its magnitude relative to ordinary random error. Levels of systematic and random error reported here are representative of those quantified in other facilities, as cited in the references.

  20. Reaching Asian Americans: sampling strategies and incentives.

    PubMed

    Lee, Soo-Kyung; Cheng, Yu-Yao

    2006-07-01

    Reaching and recruiting representative samples of minority populations is often challenging. This study examined, in Chinese and Korean Americans: 1) whether using two different sampling strategies (random sampling vs. convenience sampling) significantly affected the characteristics of recruited participants and 2) whether providing different incentives in the mail survey produced different response rates. We found statistically significant, though mostly not remarkable, differences between random and convenience samples. Offering monetary incentives in the mail survey improved response rates among Chinese Americans, while offering a small gift did not improve response rates among either Chinese or Korean Americans. This information will be useful for researchers and practitioners working with Asian Americans.

  1. Sampling pig farms at the abattoir in a cross-sectional study - Evaluation of a sampling method.

    PubMed

    Birkegård, Anna Camilla; Halasa, Tariq; Toft, Nils

    2017-09-15

    A cross-sectional study design is relatively inexpensive, fast and easy to conduct when compared to other study designs. Careful planning is essential to obtaining a representative sample of the population, and the recommended approach is to use simple random sampling from an exhaustive list of units in the target population. This approach is rarely feasible in practice, and other sampling procedures must often be adopted. For example, when slaughter pigs are the target population, sampling the pigs on the slaughter line may be an alternative to on-site sampling at a list of farms. However, it is difficult to sample a large number of farms from an exact predefined list, due to the logistics and workflow of an abattoir. Therefore, it is necessary to have a systematic sampling procedure and to evaluate the obtained sample with respect to the study objective. We propose a method for 1) planning, 2) conducting, and 3) evaluating the representativeness and reproducibility of a cross-sectional study when simple random sampling is not possible. We used an example of a cross-sectional study with the aim of quantifying the association of antimicrobial resistance and antimicrobial consumption in Danish slaughter pigs. It was not possible to visit farms within the designated timeframe. Therefore, it was decided to use convenience sampling at the abattoir. Our approach was carried out in three steps: 1) planning: using data from meat inspection to plan at which abattoirs and how many farms to sample; 2) conducting: sampling was carried out at five abattoirs; 3) evaluation: representativeness was evaluated by comparing sampled and non-sampled farms, and the reproducibility of the study was assessed through simulated sampling based on meat inspection data from the period where the actual data collection was carried out. In the cross-sectional study, samples were taken from 681 Danish pig farms during five weeks from February to March 2015. The evaluation showed that the sampling procedure was reproducible with results comparable to the collected sample. However, the sampling procedure favoured sampling of large farms. Furthermore, both under-sampled and over-sampled areas were found using scan statistics. In conclusion, sampling conducted at abattoirs can provide a spatially representative sample. Hence, it is a possible cost-effective alternative to simple random sampling. However, it is important to assess the properties of the resulting sample so that any potential selection bias can be addressed when reporting the findings. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Geostatistical Sampling Methods for Efficient Uncertainty Analysis in Flow and Transport Problems

    NASA Astrophysics Data System (ADS)

    Liodakis, Stylianos; Kyriakidis, Phaedon; Gaganis, Petros

    2015-04-01

    In hydrogeological applications involving flow and transport in heterogeneous porous media the spatial distribution of hydraulic conductivity is often parameterized in terms of a lognormal random field based on a histogram and variogram model inferred from data and/or synthesized from relevant knowledge. Realizations of simulated conductivity fields are then generated using geostatistical simulation involving simple random (SR) sampling and are subsequently used as inputs to physically-based simulators of flow and transport in a Monte Carlo framework for evaluating the uncertainty in the spatial distribution of solute concentration due to the uncertainty in the spatial distribution of hydraulic conductivity [1]. Realistic uncertainty analysis, however, calls for a large number of simulated concentration fields; hence, it can become expensive in terms of both time and computer resources. A more efficient alternative to SR sampling is Latin hypercube (LH) sampling, a special case of stratified random sampling, which yields a more representative distribution of simulated attribute values with fewer realizations [2]. Here, the term representative implies realizations spanning efficiently the range of possible conductivity values corresponding to the lognormal random field. In this work we investigate the efficiency of alternative methods to classical LH sampling within the context of simulation of flow and transport in a heterogeneous porous medium. More precisely, we consider the stratified likelihood (SL) sampling method of [3], in which attribute realizations are generated using the polar simulation method by exploring the geometrical properties of the multivariate Gaussian distribution function. In addition, we propose a more efficient version of the above method, here termed minimum energy (ME) sampling, whereby a set of N representative conductivity realizations at M locations is constructed by: (i) generating a representative set of N points distributed on the surface of an M-dimensional, unit radius hyper-sphere, (ii) relocating the N points on a representative set of N hyper-spheres of different radii, and (iii) transforming the coordinates of those points to lie on N different hyper-ellipsoids spanning the multivariate Gaussian distribution. The above method is applied in a dimensionality reduction context by defining flow-controlling points over which representative sampling of hydraulic conductivity is performed, thus also accounting for the sensitivity of the flow and transport model to the input hydraulic conductivity field. The performance of the various stratified sampling methods, LH, SL, and ME, is compared to that of SR sampling in terms of reproduction of ensemble statistics of hydraulic conductivity and solute concentration for different sample sizes N (numbers of realizations). The results indicate that ME sampling constitutes an equally if not more efficient simulation method than LH and SL sampling, as it can reproduce to a similar extent statistics of the conductivity and concentration fields, yet with smaller sampling variability than SR sampling. References [1] Gutjahr A.L. and Bras R.L. Spatial variability in subsurface flow and transport: A review. Reliability Engineering & System Safety, 42, 293-316, (1993). [2] Helton J.C. and Davis F.J. Latin hypercube sampling and the propagation of uncertainty in analyses of complex systems. Reliability Engineering & System Safety, 81, 23-69, (2003). [3] Switzer P. Multiple simulation of spatial fields. In: Heuvelink G, Lemmens M (eds) Proceedings of the 4th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, Coronet Books Inc., pp 629-635 (2000).
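
    A one-dimensional illustration of the Latin hypercube idea discussed above (the paper works with multivariate conductivity fields; this sketch shows only the stratification principle for an assumed lognormal marginal): split [0, 1] into N equal-probability strata, draw one uniform value per stratum, and map the values through the inverse CDF.

      # Sketch: Latin hypercube vs. simple random sampling of a lognormal parameter
      # (one-dimensional illustration of the stratification idea only).
      import numpy as np
      from scipy.stats import lognorm

      rng = np.random.default_rng(7)
      N = 20                      # number of realizations
      dist = lognorm(s=1.0)       # assumed lognormal hydraulic-conductivity marginal

      # Simple random sampling: N independent draws.
      sr_samples = dist.rvs(size=N, random_state=rng)

      # Latin hypercube sampling: one uniform draw per probability stratum,
      # then map through the inverse CDF (ppf).
      strata = (np.arange(N) + rng.uniform(size=N)) / N   # one value in each [k/N, (k+1)/N)
      lh_samples = dist.ppf(rng.permutation(strata))

      print("SR  mean/std:", sr_samples.mean(), sr_samples.std())
      print("LHS mean/std:", lh_samples.mean(), lh_samples.std())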

  3. Feline mitochondrial DNA sampling for forensic analysis: when enough is enough!

    PubMed

    Grahn, Robert A; Alhaddad, Hasan; Alves, Paulo C; Randi, Ettore; Waly, Nashwa E; Lyons, Leslie A

    2015-05-01

    Pet hair has a demonstrated value in resolving legal issues. Cat hair is chronically shed and it is difficult to leave a home with cats without some level of secondary transfer. The power of cat hair as an evidentiary resource may be underused because representative genetic databases are not available for exclusionary purposes. Mitochondrial control region databases are highly valuable for hair analyses and have been developed for the cat. In a representative worldwide data set, 83% of domestic cat mitotypes belong to one of twelve major types. Of the remaining 17%, 7.5% are unique within the published 1394 sample database. The current research evaluates the sample size necessary to establish a representative population for forensic comparison of the mitochondrial control region for the domestic cat. For most worldwide populations, randomly sampling 50 unrelated local individuals will achieve saturation at 95%. The 99% saturation is achieved by randomly sampling 60-170 cats, depending on the numbers of mitotypes available in the population at large. Likely due to the recent domestication of the cat and minimal localized population substructure, fewer cats are needed to reach practical saturation of a mitochondrial DNA control region database than are needed for humans or dogs. Coupled with the available worldwide feline control region database of nearly 1400 cats, minimal local sampling will be required to establish an appropriate comparative representative database and achieve significant exclusionary power. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
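
    A hedged sketch of the saturation question examined here, with made-up mitotype frequencies rather than the published database: simulate random samples of increasing size and record the share of the population's mitotype frequency covered by the types seen in each sample.

      # Sketch: how many individuals must be sampled before the observed mitotypes
      # cover ~95% of the population's type frequency (hypothetical frequencies).
      import numpy as np

      rng = np.random.default_rng(11)

      # Hypothetical mitotype frequencies: 12 common types plus a tail of rare ones.
      freqs = np.array([0.20, 0.15, 0.12, 0.08, 0.07, 0.06, 0.05, 0.04,
                        0.03, 0.02, 0.02, 0.01] + [0.15 / 60] * 60)
      freqs = freqs / freqs.sum()
      types = np.arange(len(freqs))

      def expected_coverage(n, n_sims=2000):
          """Mean population frequency covered by the types seen in a sample of size n."""
          cov = 0.0
          for _ in range(n_sims):
              seen = np.unique(rng.choice(types, size=n, p=freqs))
              cov += freqs[seen].sum()
          return cov / n_sims

      for n in (20, 50, 100, 170):
          print(n, round(expected_coverage(n), 3))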

  4. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    PubMed

    L'Engle, Kelly; Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit dialing of mobile phones offers promise for future data collection in Ghana and may be suitable for other developing countries.
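
    A simplified sketch of the outcome-rate arithmetic, in the spirit of the AAPOR formulas cited above (AAPOR defines several variants and adjusts for cases of unknown eligibility; the refusal and non-contact counts below are hypothetical, while the completed and partial counts come from the abstract).

      # Sketch: simplified survey outcome rates (one simple variant; AAPOR's
      # official definitions distinguish several variants and eligibility weights).
      complete = 9_469          # completed interviews (from the abstract)
      partial = 3_547           # partial interviews (from the abstract)
      refusal = 2_000           # hypothetical counts below
      non_contact = 15_000
      unknown_eligibility = 10_000

      denominator = complete + partial + refusal + non_contact + unknown_eligibility

      response_rate = complete / denominator
      cooperation_rate = complete / (complete + partial + refusal)
      refusal_rate = refusal / denominator
      contact_rate = (complete + partial + refusal) / denominator

      for name, rate in [("response", response_rate), ("cooperation", cooperation_rate),
                         ("refusal", refusal_rate), ("contact", contact_rate)]:
          print(f"{name:12s} {rate:.1%}")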

  5. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality

    PubMed Central

    Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Introduction Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. Materials and methods The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. Results The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. Conclusions The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit dialing of mobile phones offers promise for future data collection in Ghana and may be suitable for other developing countries. PMID:29351349

  6. Is Knowledge Random? Introducing Sampling and Bias through Outdoor Inquiry

    ERIC Educational Resources Information Center

    Stier, Sam

    2010-01-01

    Sampling, very generally, is the process of learning about something by selecting and assessing representative parts of that population or object. In the inquiry activity described here, students learned about sampling techniques as they estimated the number of trees greater than 12 cm dbh (diameter at breast height) in a wooded, discrete area…

  7. Random-effects linear modeling and sample size tables for two special crossover designs of average bioequivalence studies: the four-period, two-sequence, two-formulation and six-period, three-sequence, three-formulation designs.

    PubMed

    Diaz, Francisco J; Berg, Michel J; Krebill, Ron; Welty, Timothy; Gidal, Barry E; Alloway, Rita; Privitera, Michael

    2013-12-01

    Due to concern and debate in the epilepsy medical community and to the current interest of the US Food and Drug Administration (FDA) in revising approaches to the approval of generic drugs, the FDA is currently supporting ongoing bioequivalence studies of antiepileptic drugs, the EQUIGEN studies. During the design of these crossover studies, the researchers could not find commercial or non-commercial statistical software that quickly allowed computation of sample sizes for their designs, particularly software implementing the FDA requirement of using random-effects linear models for the analyses of bioequivalence studies. This article presents tables for sample-size evaluations of average bioequivalence studies based on the two crossover designs used in the EQUIGEN studies: the four-period, two-sequence, two-formulation design, and the six-period, three-sequence, three-formulation design. Sample-size computations assume that random-effects linear models are used in bioequivalence analyses with crossover designs. Random-effects linear models have been traditionally viewed by many pharmacologists and clinical researchers as just mathematical devices to analyze repeated-measures data. In contrast, a modern view of these models attributes an important mathematical role in theoretical formulations in personalized medicine to them, because these models not only have parameters that represent average patients, but also have parameters that represent individual patients. Moreover, the notation and language of random-effects linear models have evolved over the years. Thus, another goal of this article is to provide a presentation of the statistical modeling of data from bioequivalence studies that highlights the modern view of these models, with special emphasis on power analyses and sample-size computations.

  8. Remote Sensing, Sampling and Simulation Applications in Analyses of Insect Dispersion and Abundance in Cotton

    Treesearch

    J. L. Willers; J. M. McKinion; J. N. Jenkins

    2006-01-01

    Simulation was employed to create stratified simple random samples of different sample unit sizes to represent tarnished plant bug abundance at different densities within various habitats of simulated cotton fields. These samples were used to investigate dispersion patterns of this cotton insect. It was found that the assessment of spatial pattern varied as a function...

  9. Sampling Methods in Cardiovascular Nursing Research: An Overview.

    PubMed

    Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie

    2014-01-01

    Cardiovascular nursing research covers a wide array of topics from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling, including simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion into the research study and include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from cardiovascular research. Understanding the differences in sampling techniques may aid nurses in effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
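
    A short sketch of three of the probability sampling methods listed above (simple random, systematic, and stratified with proportional allocation), using only Python's standard library on a toy participant roster.

      # Sketch: simple random, systematic, and stratified sampling from a toy roster.
      import random
      from collections import defaultdict

      random.seed(5)
      roster = [{"id": i, "unit": random.choice(["cardiology", "surgery", "icu"])}
                for i in range(200)]

      def simple_random_sample(pop, n):
          return random.sample(pop, n)

      def systematic_sample(pop, n):
          k = len(pop) // n                      # sampling interval
          start = random.randrange(k)            # random start within the first interval
          return pop[start::k][:n]

      def stratified_sample(pop, n, key):
          strata = defaultdict(list)
          for item in pop:
              strata[item[key]].append(item)
          out = []
          for members in strata.values():        # proportional allocation per stratum
              share = max(1, round(n * len(members) / len(pop)))
              out.extend(random.sample(members, min(share, len(members))))
          return out

      print(len(simple_random_sample(roster, 30)),
            len(systematic_sample(roster, 30)),
            len(stratified_sample(roster, 30, "unit")))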

  10. Improvement of Predictive Ability by Uniform Coverage of the Target Genetic Space

    PubMed Central

    Bustos-Korts, Daniela; Malosetti, Marcos; Chapman, Scott; Biddulph, Ben; van Eeuwijk, Fred

    2016-01-01

    Genome-enabled prediction provides breeders with the means to increase the number of genotypes that can be evaluated for selection. One of the major challenges in genome-enabled prediction is how to construct a training set of genotypes from a calibration set that represents the target population of genotypes, where the calibration set is composed of a training and validation set. A random sampling protocol of genotypes from the calibration set will lead to low quality coverage of the total genetic space by the training set when the calibration set contains population structure. As a consequence, predictive ability will be affected negatively, because some parts of the genotypic diversity in the target population will be under-represented in the training set, whereas other parts will be over-represented. Therefore, we propose a training set construction method that uniformly samples the genetic space spanned by the target population of genotypes, thereby increasing predictive ability. To evaluate our method, we constructed training sets, along with the identification of corresponding genomic prediction models, for four genotype panels that differed in the amount of population structure they contained (maize Flint, maize Dent, wheat, and rice). Training sets were constructed using uniform sampling, stratified-uniform sampling, stratified sampling and random sampling. We compared these methods with a method that maximizes the generalized coefficient of determination (CD). Several training set sizes were considered. We investigated four genomic prediction models: multi-locus QTL models, GBLUP models, combinations of QTL and GBLUPs, and Reproducing Kernel Hilbert Space (RKHS) models. For the maize and wheat panels, construction of the training set under uniform sampling led to a larger predictive ability than under stratified and random sampling. The results of our methods were similar to those of the CD method. For the rice panel, all training set construction methods led to similar predictive ability, a reflection of the very strong population structure in this panel. PMID:27672112

  11. Determination of reference values for elevated fasting and random insulin levels and their associations with metabolic risk factors among rural Pakistanis from Sindh Province.

    PubMed

    Ahmadani, Muhammad Yakoob; Hakeem, Rubina; Fawwad, Asher; Basit, Abdul; Shera, A Samad

    2008-06-01

    To assess insulin levels and their association with metabolic risk factors (family history of diabetes, abnormal glucose tolerance, hypertension, overweight and android obesity) among a representative group of Pakistanis. The study data was taken from the database of a population-based survey conducted in Sindh Province, Pakistan, in 1994 to assess the prevalence of diabetes mellitus and impaired glucose tolerance (IGT). Through stratified random sampling, oral glucose tolerance tests were performed in 967 adults; every fifth sample was estimated for fasting and random (2-hour post-75 g glucose load) insulin levels. The total number of metabolic risk factors was counted for each subject, and their association with insulin levels studied. Of the 130 subjects, 56.1% were females and 95.4% were Sindhi. The mean age of males and females was 43.84 and 40.61 years, respectively. Family history for diabetes and frequency of overweight had significant positive associations with both fasting and random insulin levels (P < 0.05). Association between hypertension and insulin levels was significant only for random insulin levels, and between android obesity, abnormal glucose tolerance, or male gender and insulin levels only for fasting insulin levels (P < 0.05). Metabolic risk factors had significant positive associations with both fasting (r = 0.351, P = 0.000) as well as random insulin levels (r = 0.364, P = 0.000). This paper provides baseline pioneering information applicable to the Pakistani population. Furthermore, the observations made in this study about differences in association of fasting or random insulin levels with various metabolic risk factors highlight the possibility of using either of them for risk assessment. This finding needs to be assessed in a larger and nationally representative sample.

  12. Women Secondary Principals in Texas 1998 and 2011: Movement toward Equity

    ERIC Educational Resources Information Center

    Marczynski, Jean C.; Gates, Gordon S.

    2013-01-01

    Purpose: The purpose of this paper is to analyze data gathered in 1998 and 2011 from representative samples of women secondary school principals in Texas to identify differences in personal, professional, leadership, and school characteristics. Design/methodology/approach: Two proportionate, random samples were drawn of women secondary principals…

  13. The coverage of a random sample from a biological community.

    PubMed

    Engen, S

    1975-03-01

    A taxonomic group will frequently have a large number of species with small abundances. When a sample is drawn at random from this group, one is therefore faced with the problem that a large proportion of the species will not be discovered. A general definition of quantitative measures of "sample coverage" is proposed, and the problem of statistical inference is considered for two special cases, (1) the actual total relative abundance of those species that are represented in the sample, and (2) their relative contribution to the information index of diversity. The analysis is based on an extended version of the negative binomial species frequency model. The results are tabulated.
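
    A related, standard estimator of sample coverage is the Good-Turing estimate, which treats the proportion of singleton species as the undiscovered probability mass; the cited paper instead works from a negative binomial species frequency model, so the sketch below is only a simple point of comparison.

      # Sketch: Good-Turing estimate of sample coverage, i.e. the estimated total
      # relative abundance of the species actually seen in the sample.
      from collections import Counter

      # Hypothetical sample of individuals labelled by species.
      sample = ["a"] * 40 + ["b"] * 25 + ["c"] * 10 + ["d"] * 5 + \
               ["e", "f", "g", "h", "i"]            # five singleton species

      counts = Counter(sample)
      n = sum(counts.values())
      singletons = sum(1 for c in counts.values() if c == 1)

      coverage = 1 - singletons / n                 # Good-Turing: C-hat = 1 - f1/n
      print(f"observed species: {len(counts)}, estimated coverage: {coverage:.3f}")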

  14. [Exploration of the concept of genetic drift in genetics teaching of undergraduates].

    PubMed

    Wang, Chun-ming

    2016-01-01

    Genetic drift is one of the difficulties in teaching genetics because its randomness and probabilistic nature can easily cause conceptual misunderstanding. The "sampling error" in its definition is often misunderstood as referring to the research method of "sampling", as if sampling itself disturbed the results and caused the random changes in allele frequency. I analyzed and compared the definitions of genetic drift in domestic and international genetics textbooks and found that definitions containing "sampling error" are widely adopted but are interpreted correctly in only a few textbooks. Here, the history of research on genetic drift, i.e., the contributions of Wright, Fisher and Kimura, is introduced. Moreover, I describe two representative recently published articles on teaching genetic drift to undergraduates, which point out that misconceptions are inevitable for undergraduates during the learning process and also offer a preliminary solution. Drawing on my own teaching practice, I suggest that a definition of genetic drift containing "sampling error" can be adopted with further interpretation: "sampling error" refers to the random sampling of gametes from the gamete pool when the alleles of the next generation are generated, and has no relationship with the artificial sampling used in general genetics studies. This article may provide some help in genetics teaching.

  15. An Analysis of Job Satisfaction Among Public, College or University, and Special Librarians.

    ERIC Educational Resources Information Center

    Miniter, John J.

    Usable data relating to six elements of job satisfaction: work, supervision, people, pay, promotion, and total satisfaction, were collected from 190 of a total sample of 310 librarians, chosen by stratified random sampling techniques from library association membership lists. The librarians, both male and female, represented three types of…

  16. Influence of tree spatial pattern and sample plot type and size on inventory

    Treesearch

    John-Pascall Berrill; Kevin L. O' Hara

    2012-01-01

    Sampling with different plot types and sizes was simulated using tree location maps and data collected in three even-aged coast redwood (Sequoia sempervirens) stands selected to represent uniform, random, and clumped spatial patterns of tree locations. Fixed-radius circular plots, belt transects, and variable-radius plots were installed by...

  17. Sampling maternal care behaviour in domestic dogs: What's the best approach?

    PubMed

    Czerwinski, Veronika H; Smith, Bradley P; Hynd, Philip I; Hazel, Susan J

    2017-07-01

    Our understanding of the frequency and duration of maternal care behaviours in the domestic dog during the first two postnatal weeks is limited, largely due to the inconsistencies in the sampling methodologies that have been employed. In order to develop a more concise picture of maternal care behaviour during this period, and to help establish the sampling method that represents these behaviours best, we compared a variety of time sampling methods. Six litters were continuously observed for a total of 96 h over postnatal days 3, 6, 9 and 12 (24 h per day). Frequent (dam presence, nursing duration, contact duration) and infrequent maternal behaviours (anogenital licking duration and frequency) were coded using five different time sampling methods: 12-h night (1800-0600 h), 12-h day (0600-1800 h), a one-hour period during the night (1800-0600 h), a one-hour period during the day (0600-1800 h), and a one-hour period at any time. Each of the one-hour time sampling methods consisted of four randomly chosen 15-min periods. Two random sets of four 15-min periods were also analysed to ensure reliability. We then determined which of the time sampling methods, averaged over the three 24-h periods, best represented the frequency and duration of behaviours. As might be expected, frequently occurring behaviours were adequately represented by short (one hour) sampling periods; however, this was not the case with the infrequent behaviour. Thus, we argue that the time sampling methodology employed must match the behaviour of interest. This caution applies to maternal behaviour in altricial species, such as canids, as well as all systematic behavioural observations utilising time sampling methodology. Copyright © 2017. Published by Elsevier B.V.
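
    A hedged sketch of the comparison the authors make, with simulated behaviour rather than their observations: generate a 12-hour record at one-minute resolution and estimate the proportion of time spent in a frequent versus an infrequent behaviour from four randomly chosen 15-minute windows.

      # Sketch: continuous observation vs. time sampling with four random
      # 15-min windows drawn from a 12-h period (simulated behaviour records).
      import numpy as np

      rng = np.random.default_rng(2)
      minutes = 12 * 60

      # Simulated records: True = behaviour occurring in that minute.
      frequent = rng.random(minutes) < 0.40      # e.g. dam present (~40% of minutes)
      infrequent = rng.random(minutes) < 0.03    # e.g. anogenital licking (~3%)

      def time_sampled_estimate(record, n_windows=4, window=15):
          """Proportion of observed minutes in the behaviour across random windows."""
          starts = rng.choice(minutes - window, size=n_windows, replace=False)
          observed = np.concatenate([record[s:s + window] for s in starts])
          return observed.mean()

      for name, record in [("frequent", frequent), ("infrequent", infrequent)]:
          estimates = [time_sampled_estimate(record) for _ in range(1000)]
          print(f"{name:10s} true={record.mean():.3f}  "
                f"sampled mean={np.mean(estimates):.3f}  sd={np.std(estimates):.3f}")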

  18. The revised Temperament and Character Inventory: normative data by sex and age from a Spanish normal randomized sample

    PubMed Central

    Labad, Javier; Martorell, Lourdes; Gaviria, Ana; Bayón, Carmen; Vilella, Elisabet; Cloninger, C. Robert

    2015-01-01

    Objectives. The psychometric properties regarding sex and age for the revised version of the Temperament and Character Inventory (TCI-R) and its derived short version, the Temperament and Character Inventory (TCI-140), were evaluated with a randomized sample from the community. Methods. A randomized sample of 367 normal adult subjects from a Spanish municipality, who were representative of the general population based on sex and age, participated in the current study. Descriptive statistics and internal consistency according to α coefficient were obtained for all of the dimensions and facets. T-tests and univariate analyses of variance, followed by Bonferroni tests, were conducted to compare the distributions of the TCI-R dimension scores by age and sex. Results. On both the TCI-R and TCI-140, women had higher scores for Harm Avoidance, Reward Dependence and Cooperativeness than men, whereas men had higher scores for Persistence. Age correlated negatively with Novelty Seeking, Reward Dependence and Cooperativeness and positively with Harm Avoidance and Self-transcendence. Young subjects between 18 and 35 years had higher scores than older subjects in NS and RD. Subjects between 51 and 77 years scored higher in both HA and ST. The alphas for the dimensions were between 0.74 and 0.87 for the TCI-R and between 0.63 and 0.83 for the TCI-140. Conclusion. Results, which were obtained with a randomized sample, suggest that there are specific distributions of personality traits by sex and age. Overall, both the TCI-R and the abbreviated TCI-140 were reliable in the ‘good-to-excellent’ range. A strength of the current study is the representativeness of the sample. PMID:26713237

  19. Ground sample data for the Conterminous U.S. Land Cover Characteristics Database

    Treesearch

    Robert Burgan; Colin Hardy; Donald Ohlen; Gene Fosnight; Robert Treder

    1999-01-01

    Ground sample data were collected for a land cover database and raster map that portray 159 vegetation classes at 1 km² resolution for the conterminous United States. Locations for 3,500 1 km² ground sample plots were selected randomly across the United States. The number of plots representing each vegetation class was weighted by the proportionate coverage of each...

  20. Public Participation Guide: Citizen Juries

    EPA Pesticide Factsheets

    Citizen juries involve creating a “jury”: a representative sample of citizens (usually selected in a random or stratified manner) who are briefed in detail on the background and current thinking relating to a particular issue or project.

  1. Comparison of the efficacy of a hydrogen peroxide dry-mist disinfection system and sodium hypochlorite solution for eradication of Clostridium difficile spores.

    PubMed

    Barbut, F; Menuet, D; Verachten, M; Girou, E

    2009-06-01

    To compare a hydrogen peroxide dry-mist system and a 0.5% hypochlorite solution with respect to their ability to disinfect Clostridium difficile-contaminated surfaces in vitro and in situ. Prospective, randomized, before-after trial. Two French hospitals affected by C. difficile. In situ efficacy of disinfectants was assessed in rooms that had housed patients with C. difficile infection. A prospective study was performed at 2 hospitals that involved randomization of disinfection processes. When a patient with C. difficile infection was discharged, environmental contamination in the patient's room was evaluated before and after disinfection. Environmental surfaces were sampled for C. difficile by use of moistened swabs; swab samples were cultured on selective plates and in broth. Both disinfectants were tested in vitro with a spore-carrier test; in this test, 2 types of material, vinyl polychloride (representative of the room's floor) and laminate (representative of the room's furniture), were experimentally contaminated with spores from 3 C. difficile strains, including the epidemic clone ribotype 027-North American pulsed-field gel electrophoresis type 1. There were 748 surface samples collected (360 from rooms treated with hydrogen peroxide and 388 from rooms treated with hypochlorite). Before disinfection, 46 (24%) of 194 samples obtained in the rooms randomized to hypochlorite treatment and 34 (19%) of 180 samples obtained in the rooms randomized to hydrogen peroxide treatment showed environmental contamination. After disinfection, 23 (12%) of 194 samples from hypochlorite-treated rooms and 4 (2%) of 180 samples from hydrogen peroxide treated rooms showed environmental contamination, a decrease in contamination of 50% after hypochlorite decontamination and 91% after hydrogen peroxide decontamination (P < .005). The in vitro activity of 0.5% hypochlorite was time dependent. The mean (+/-SD) reduction in initial log(10) bacterial count was 4.32 +/- 0.35 log(10) colony-forming units after 10 minutes of exposure to hypochlorite and 4.18 +/- 0.8 log(10) colony-forming units after 1 cycle of hydrogen peroxide decontamination. In situ experiments indicate that the hydrogen peroxide dry-mist disinfection system is significantly more effective than 0.5% sodium hypochlorite solution at eradicating C. difficile spores and might represent a new alternative for disinfecting the rooms of patients with C. difficile infection.

  2. ADAPTIVE MATCHING IN RANDOMIZED TRIALS AND OBSERVATIONAL STUDIES

    PubMed Central

    van der Laan, Mark J.; Balzer, Laura B.; Petersen, Maya L.

    2014-01-01

    SUMMARY In many randomized and observational studies the allocation of treatment among a sample of n independent and identically distributed units is a function of the covariates of all sampled units. As a result, the treatment labels among the units are possibly dependent, complicating estimation and posing challenges for statistical inference. For example, cluster randomized trials frequently sample communities from some target population, construct matched pairs of communities from those included in the sample based on some metric of similarity in baseline community characteristics, and then randomly allocate a treatment and a control intervention within each matched pair. In this case, the observed data can neither be represented as the realization of n independent random variables, nor, contrary to current practice, as the realization of n/2 independent random variables (treating the matched pair as the independent sampling unit). In this paper we study estimation of the average causal effect of a treatment under experimental designs in which treatment allocation potentially depends on the pre-intervention covariates of all units included in the sample. We define efficient targeted minimum loss based estimators for this general design, present a theorem that establishes the desired asymptotic normality of these estimators and allows for asymptotically valid statistical inference, and discuss implementation of these estimators. We further investigate the relative asymptotic efficiency of this design compared with a design in which unit-specific treatment assignment depends only on the units’ covariates. Our findings have practical implications for the optimal design and analysis of pair matched cluster randomized trials, as well as for observational studies in which treatment decisions may depend on characteristics of the entire sample. PMID:25097298
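
    The pair-matched allocation described above can be illustrated with a short sketch: greedily pair units by similarity of baseline covariates, then flip a coin within each pair. This is only an illustration of a design in which treatment assignment depends on the covariates of all sampled units; the Euclidean matching metric, the greedy pairing, and the variable names are assumptions, and the paper's targeted minimum loss based estimators are not reproduced here.

    ```python
    import random
    import numpy as np

    def pair_matched_assignment(covariates, seed=0):
        """Greedily pair units by Euclidean distance on baseline covariates,
        then randomize one treated and one control unit within each pair.
        Illustrative only; returns a 0/1 treatment vector."""
        rng = random.Random(seed)
        X = np.asarray(covariates, dtype=float)
        n = len(X)
        unpaired = list(range(n))
        treatment = np.zeros(n, dtype=int)
        while len(unpaired) >= 2:
            i = unpaired.pop(0)
            # the closest remaining unit becomes i's partner
            j = min(unpaired, key=lambda k: np.linalg.norm(X[i] - X[k]))
            unpaired.remove(j)
            treatment[rng.choice([i, j])] = 1      # coin flip within the pair
        return treatment

    # toy example: 8 communities described by 2 baseline characteristics
    X = np.random.default_rng(1).normal(size=(8, 2))
    print(pair_matched_assignment(X))
    ```

    Because each unit's assignment depends on which other units happen to be in the sample, the treatment labels are not independent across units, which is exactly the complication the abstract highlights.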

  3. Elder Abuse and Black Americans: Incidence, Correlates, Treatment and Prevention.

    ERIC Educational Resources Information Center

    Cazenave, Noel A.

    Existing evidence on family violence rates by age and race as well as the available data on race and physical elder abuse incidence rates suggests that because such data are not based on random or representative samples and tend to reflect a "sampling artifact" of the particular client populations served by the professionals surveyed,…

  4. Inclusive Education in Spain: How Do Skills, Resources, and Supports Affect Regular Education Teachers' Perceptions of Inclusion?

    ERIC Educational Resources Information Center

    Chiner, Esther; Cardona, Maria Cristina

    2013-01-01

    This study examined regular education teachers' perceptions of inclusion in elementary and secondary schools in Spain and how these perceptions may differ depending on teaching experience, skills, and the availability of resources and supports. Stratified random sampling procedures were used to draw a representative sample of 336 general education…

  5. School Readiness in Children Living in Non-Parental Care: Impacts of Head Start

    ERIC Educational Resources Information Center

    Lipscomb, Shannon T.; Pratt, Megan E.; Schmitt, Sara A.; Pears, Katherine C.; Kim, Hyoun K.

    2013-01-01

    The current study examines the effects of Head Start on the development of school readiness outcomes for children living in non-parental care. Data were obtained from the Head Start Impact Study, a randomized controlled trial of Head Start conducted with a nationally representative sample of Head Start programs and families. The sample included…

  6. Use of Nutritional Information in Canada: National Trends between 2004 and 2008

    ERIC Educational Resources Information Center

    Goodman, Samantha; Hammond, David; Pillo-Blocka, Francy; Glanville, Theresa; Jenkins, Richard

    2011-01-01

    Objective: To examine longitudinal trends in use of nutrition information among Canadians. Design: Population-based telephone and Internet surveys. Setting and Participants: Representative samples of Canadian adults recruited with random-digit dialing sampling in 2004 (n = 2,405) and 2006 (n = 2,014) and an online commercial panel in 2008 (n =…

  7. The densest terrestrial vertebrate

    USGS Publications Warehouse

    Rodda, G.H.; Perry, G.; Rondeau, R.J.; Lazell, J.

    2001-01-01

    An understanding of the abundance of organisms is central to understanding ecology, but many population density estimates are unrepresentative because they were obtained from study areas chosen for the high abundance of the target species. For example, from a pool of 1072 lizard density estimates that we compiled from the literature, we sampled 303 estimates and scored each for its assessment of the degree to which the study site was representative. Less than half (45%) indicated that the study area was chosen to be representative of the population or habitat. An additional 15% reported that individual plots or transects were chosen randomly, but this often indicated only that the sample points were located randomly within a study area chosen for its high abundance of the target species. The remainder of the studies either gave no information or specified that the study area was chosen because the focal species was locally abundant.

  8. The Influence of Religious Awareness Program in Scaling down Death Anxiety among Children Sample in Late Childhood Stage; 9-12 Years Old in Al Shobak Province

    ERIC Educational Resources Information Center

    Al-Mohtadi, Reham Mohammad; Al-Msubheen, Moonerh Mheel

    2017-01-01

    This study aims to identify the influence of a religious awareness program in scaling down death anxiety among a sample of (50) students, (30) males and (20) females, at the late childhood stage. The sample was distributed randomly into (25) students representing the main group and (25) students as the experimental group. Religious Awareness…

  9. Demythologizing sex education in Oklahoma: an attitudinal study.

    PubMed

    Turner, N H

    1983-08-01

    A randomized study was conducted to determine the distribution of attitudes among Oklahomans of voting age toward sex education and to analyze the relationship of demographic, sociocultural, and attitudinal factors. The state was stratified into six regions. Forty-five percent of the sample lived in urban areas, and 55% in rural areas. Random digit dialing and random selection within households were utilized to ensure a representative sample of the population. Eighty percent of the sample was found to be favorable toward sex education in the public schools, while 20% was unfavorable. A majority of respondents in all religious groups including "fundamentalists" were favorable. Seventeen variables were found to be significant in the univariate analysis of the data; eight were not significant. In a multivariate analysis, three variables, age, Protestant denominational type and female employment, were shown to have predictive ability in determining favorability and unfavorability. Implications for building community support for sex education also are discussed.

  10. Oscillating-flow regenerator test rig: Woven screen and metal felt results

    NASA Technical Reports Server (NTRS)

    Gedeon, D.; Wood, J. G.

    1992-01-01

    We present correlating expressions, in terms of Reynolds or Peclet numbers, for friction factors, Nusselt numbers, enhanced axial conduction ratios, and overall heat flux ratios in four porous regenerator samples representative of Stirling-cycle regenerators: two woven screen samples and two random wire samples. Error estimates and comparison of data with others suggest our correlations are reliable, but we need to test more samples over a range of porosities before our results will become generally useful.

  11. Strengths and weaknesses of temporal stability analysis for monitoring and estimating grid-mean soil moisture in a high-intensity irrigated agricultural landscape

    NASA Astrophysics Data System (ADS)

    Ran, Youhua; Li, Xin; Jin, Rui; Kang, Jian; Cosh, Michael H.

    2017-01-01

    Monitoring and estimating grid-mean soil moisture is very important for assessing many hydrological, biological, and biogeochemical processes and for validating remotely sensed surface soil moisture products. Temporal stability analysis (TSA) is a valuable tool for identifying a small number of representative sampling points to estimate the grid-mean soil moisture content. This analysis was evaluated and improved using high-quality surface soil moisture data that were acquired by a wireless sensor network in a high-intensity irrigated agricultural landscape in an arid region of northwestern China. The performance of the TSA was limited in areas where the representative error was dominated by random events, such as irrigation events. This shortcoming can be effectively mitigated by using a stratified TSA (STSA) method, proposed in this paper. In addition, the following methods were proposed for rapidly and efficiently identifying representative sampling points when using TSA. (1) Instantaneous measurements can be used to identify representative sampling points to some extent; however, the error resulting from this method is significant when validating remotely sensed soil moisture products. Thus, additional representative sampling points should be considered to reduce this error. (2) The calibration period can be determined from the time span of the full range of the grid-mean soil moisture content during the monitoring period. (3) The representative error is sensitive to the number of calibration sampling points, especially when only a few representative sampling points are used. Multiple sampling points are recommended to reduce data loss and improve the likelihood of representativeness at two scales.
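
    Temporal stability analysis as referenced above is usually operationalized through the mean relative difference (MRD) between each candidate point and the spatial (grid) mean over time; points with MRD near zero and a small standard deviation are taken as representative. A minimal sketch under that standard formulation follows; the input array shape, the toy data, and the ranking rule are assumptions, and the stratified STSA variant proposed in the paper is not implemented.

    ```python
    import numpy as np

    def temporal_stability_rank(theta):
        """theta: array of shape (n_times, n_points) of soil moisture readings.
        Returns per-point mean relative difference (MRD), its standard deviation
        (SDRD), and a ranking; points with MRD near 0 and small SDRD are the
        usual 'representative' candidates."""
        grid_mean = theta.mean(axis=1, keepdims=True)       # spatial mean at each time
        rel_diff = (theta - grid_mean) / grid_mean          # relative difference per point
        mrd = rel_diff.mean(axis=0)
        sdrd = rel_diff.std(axis=0, ddof=1)
        order = np.argsort(np.abs(mrd))                     # closest to grid mean first
        return mrd, sdrd, order

    # toy example: 30 sampling dates, 12 candidate points with fixed offsets
    rng = np.random.default_rng(0)
    theta = 0.25 + 0.05 * rng.standard_normal((30, 12)) + rng.uniform(-0.03, 0.03, size=12)
    mrd, sdrd, order = temporal_stability_rank(theta)
    print("most representative point:", order[0])
    ```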

  12. Effective Schools Programs: Their Extent and Characteristics. Briefing Report to the Chairman, Committee on Education and Labor, House of Representatives.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC. Div. of Human Resources.

    A national study of effective schools programs provides information on the extent and characteristics of these programs in the nation's school districts and schools. A questionnaire was mailed to a random sample of 1,685 school district superintendents. Findings are representative of the approximately 16,000 local school districts in the nation.…

  13. Does sampling using random digit dialling really cost more than sampling from telephone directories: Debunking the myths

    PubMed Central

    Yang, Baohui; Eyeson-Annan, Margo

    2006-01-01

    Background Computer assisted telephone interviewing (CATI) is widely used for health surveys. The advantages of CATI over face-to-face interviewing are timeliness and cost reduction to achieve the same sample size and geographical coverage. Two major CATI sampling procedures are used: sampling directly from the electronic white pages (EWP) telephone directory and list assisted random digit dialling (LA-RDD) sampling. EWP sampling covers telephone numbers of households listed in the printed white pages. LA-RDD sampling has a better coverage of households than EWP sampling but is considered to be more expensive due to interviewers dialling more out-of-scope numbers. Methods This study compared an EWP sample and a LA-RDD sample from the New South Wales Population Health Survey in 2003 on demographic profiles, health estimates, coefficients of variation in weights, design effects on estimates, and cost effectiveness, on the basis of achieving the same level of precision of estimates. Results The LA-RDD sample better represented the population than the EWP sample, with a coefficient of variation of weights of 1.03 for LA-RDD compared with 1.21 for EWP, and average design effects of 2.00 for LA-RDD compared with 2.38 for EWP. Also, a LA-RDD sample can save up to 14.2% in cost compared to an EWP sample to achieve the same precision for health estimates. Conclusion A LA-RDD sample better represents the population, which potentially leads to reduced bias in health estimates, and rather than costing more than EWP actually costs less. PMID:16504117
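
    The reported coefficients of variation of the survey weights line up roughly with the reported average design effects through Kish's approximation deff ≈ 1 + CV(w)^2, which attributes the design effect to unequal weighting alone. A quick check using the figures quoted in the abstract follows; the published design effects may also include clustering and other components, so only rough agreement is expected.

    ```python
    # Kish's approximate design effect due to unequal weighting: deff ~= 1 + CV(w)**2.
    # CVs of weights taken from the abstract above.
    for label, cv in [("LA-RDD", 1.03), ("EWP", 1.21)]:
        deff = 1 + cv ** 2
        print(f"{label}: CV of weights = {cv:.2f} -> approximate deff = {deff:.2f}")
    # prints ~2.06 for LA-RDD and ~2.46 for EWP, close to the reported 2.00 and 2.38
    ```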

  14. An efficient reliability algorithm for locating design point using the combination of importance sampling concepts and response surface method

    NASA Astrophysics Data System (ADS)

    Shayanfar, Mohsen Ali; Barkhordari, Mohammad Ali; Roudak, Mohammad Amin

    2017-06-01

    Monte Carlo simulation (MCS) is a useful tool for computation of probability of failure in reliability analysis. However, the large number of required random samples makes it time-consuming. Response surface method (RSM) is another common method in reliability analysis. Although RSM is widely used for its simplicity, it cannot be trusted in highly nonlinear problems due to its linear nature. In this paper, a new efficient algorithm, employing the combination of importance sampling, as a class of MCS, and RSM is proposed. In the proposed algorithm, analysis starts with importance sampling concepts and using a represented two-step updating rule of design point. This part finishes after a small number of samples are generated. Then RSM starts to work using Bucher experimental design, with the last design point and a represented effective length as the center point and radius of Bucher's approach, respectively. Through illustrative numerical examples, simplicity and efficiency of the proposed algorithm and the effectiveness of the represented rules are shown.
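
    The importance-sampling ingredient of the algorithm above can be sketched as follows: draw samples around a presumed design point rather than from the nominal standard-normal distribution, and correct each sample by the density ratio when estimating the failure probability. The limit-state function, the design point, and the sample size below are made-up illustrations; the paper's two-step design-point update and the Bucher response-surface stage are not reproduced.

    ```python
    import numpy as np
    from scipy import stats

    def importance_sampling_pf(g, design_point, n=20_000, seed=0):
        """Estimate P[g(U) <= 0] for standard-normal U by sampling around
        'design_point' and reweighting by the density ratio."""
        rng = np.random.default_rng(seed)
        d = len(design_point)
        u = rng.normal(loc=design_point, scale=1.0, size=(n, d))   # shifted sampling density
        log_w = (stats.norm.logpdf(u).sum(axis=1)
                 - stats.norm.logpdf(u, loc=design_point).sum(axis=1))
        fails = np.array([g(x) <= 0 for x in u])                   # failure indicator
        return np.mean(fails * np.exp(log_w))

    # illustrative limit state: failure when u1 + u2 > 5 (true pf ~ 2.0e-4)
    g = lambda u: 5.0 - (u[0] + u[1])
    u_star = np.array([2.5, 2.5])      # assumed design point on the limit state
    print(importance_sampling_pf(g, u_star))
    ```

    Shifting the sampling density toward the limit state is what allows a small failure probability to be estimated with far fewer samples than crude Monte Carlo.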

  15. Random sampling of elementary flux modes in large-scale metabolic networks.

    PubMed

    Machado, Daniel; Soons, Zita; Patil, Kiran Raosaheb; Ferreira, Eugénio C; Rocha, Isabel

    2012-09-15

    The description of a metabolic network in terms of elementary (flux) modes (EMs) provides an important framework for metabolic pathway analysis. However, their application to large networks has been hampered by the combinatorial explosion in the number of modes. In this work, we develop a method for generating random samples of EMs without computing the whole set. Our algorithm is an adaptation of the canonical basis approach, where we add an additional filtering step which, at each iteration, selects a random subset of the new combinations of modes. In order to obtain an unbiased sample, all candidates are assigned the same probability of getting selected. This approach avoids the exponential growth of the number of modes during computation, thus generating a random sample of the complete set of EMs within reasonable time. We generated samples of different sizes for a metabolic network of Escherichia coli, and observed that they preserve several properties of the full EM set. It is also shown that EM sampling can be used for rational strain design. A well distributed sample, that is representative of the complete set of EMs, should be suitable to most EM-based methods for analysis and optimization of metabolic networks. Source code for a cross-platform implementation in Python is freely available at http://code.google.com/p/emsampler. dmachado@deb.uminho.pt Supplementary data are available at Bioinformatics online.
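
    The filtering step described above, keeping only a random and equally probable subset of the newly generated candidate modes at each iteration, reduces to uniform sampling without replacement. A minimal sketch is shown below, assuming a cap max_keep on the working set; combine in the comment is a placeholder for the canonical-basis combination step, not a function from the published emsampler code.

    ```python
    import random

    def filter_candidates(candidates, max_keep, rng=random):
        """Keep at most max_keep of the newly generated candidate modes,
        every candidate having the same probability of being retained."""
        if len(candidates) <= max_keep:
            return list(candidates)
        return rng.sample(list(candidates), max_keep)   # uniform, without replacement

    # schematic use inside a canonical-basis style iteration:
    # for constraint in constraints:
    #     new_modes = combine(existing_modes, constraint)          # placeholder step
    #     existing_modes = filter_candidates(new_modes, max_keep=1000)
    ```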

  16. Findings from the 2013 NZCER Primary and Intermediate Schools National Survey

    ERIC Educational Resources Information Center

    New Zealand Council for Educational Research, 2014

    2014-01-01

    The New Zealand Council for Educational Research (NZCER) primary and intermediate schools national survey was carried out in July-August 2013. NZCER questioned principals, teachers and trustees at a representative sample of schools, and sought the views of a random sample of 1 in 4 parents in 36 of these schools. In all, the survey gathered data…

  17. Occupational Stress in Secondary Education in Cyprus: Causes, Symptoms, Consequences and Stress Management

    ERIC Educational Resources Information Center

    Hadjisymeou, Georgia

    2010-01-01

    The survey attempted to look into the causes, symptoms and consequences that occupational stress has on teachers in Secondary Education in Cyprus and to find ways to manage it. Thirty-eight schools with 553 teachers participated in the survey. The sample was chosen by simple random sampling and is representative of the country's…

  18. Generalizability of findings from randomized controlled trials: application to the National Institute of Drug Abuse Clinical Trials Network.

    PubMed

    Susukida, Ryoko; Crum, Rosa M; Ebnesajjad, Cyrus; Stuart, Elizabeth A; Mojtabai, Ramin

    2017-07-01

    To compare randomized controlled trial (RCT) sample treatment effects with the population effects of substance use disorder (SUD) treatment. Statistical weighting was used to re-compute the effects from 10 RCTs such that the participants in the trials had characteristics that resembled those of patients in the target populations. Multi-site RCTs and usual SUD treatment settings in the United States. A total of 3592 patients in 10 RCTs and 1 602 226 patients from usual SUD treatment settings between 2001 and 2009. Three outcomes of SUD treatment were examined: retention, urine toxicology and abstinence. We weighted the RCT sample treatment effects using propensity scores representing the conditional probability of participating in RCTs. Weighting the samples changed the significance of estimated sample treatment effects. Most commonly, positive effects of trials became statistically non-significant after weighting (three trials for retention and urine toxicology and one trial for abstinence); also, non-significant effects became significantly positive (one trial for abstinence) and significantly negative effects became non-significant (two trials for abstinence). There was suggestive evidence of treatment effect heterogeneity in subgroups that are under- or over-represented in the trials, some of which were consistent with the differences in average treatment effects between weighted and unweighted results. The findings of randomized controlled trials (RCTs) for substance use disorder treatment do not appear to be directly generalizable to target populations when the RCT samples do not reflect adequately the target populations and there is treatment effect heterogeneity across patient subgroups. © 2017 Society for the Study of Addiction.
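
    The reweighting idea can be sketched with inverse-odds-of-participation weights: model the probability of being in the trial from covariates observed in both the trial and the target population, then weight trial participants so their covariate mix resembles the target population. The logistic model, the scikit-learn dependency, and the variable names are assumptions for illustration; the paper's exact propensity-score specification is not reproduced.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def generalizability_weights(X_trial, X_target):
        """Weights that re-balance an RCT sample toward a target population.
        X_trial, X_target: covariate matrices on a common set of variables."""
        X = np.vstack([X_trial, X_target])
        s = np.r_[np.ones(len(X_trial)), np.zeros(len(X_target))]   # 1 = in trial
        model = LogisticRegression(max_iter=1000).fit(X, s)
        p = model.predict_proba(X_trial)[:, 1]                      # P(in trial | covariates)
        w = (1 - p) / p                                             # inverse odds of participation
        return w / w.mean()                                         # normalize for stability
    ```

    A weighted contrast of outcomes between the randomized arms, for example np.average(y[t == 1], weights=w[t == 1]) - np.average(y[t == 0], weights=w[t == 0]), then estimates the treatment effect in the target population rather than in the trial sample.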

  19. Ear Acupuncture for Acute Sore Throat: A Randomized Controlled Trial

    DTIC Science & Technology

    2014-09-26

    Ear acupuncture for acute sore throat: a randomized controlled trial (final report, September 2014). Auricular acupuncture is a low-risk option for acute pain control. Battlefield acupuncture (BFA) is a specific auricular acupuncture technique. BFA is… Strengths: prospective RCT. Weaknesses: small sample size, no sham acupuncture performed, patients not blinded to treatment. This study represents an…

  20. PRELIMINARY REPORT ON NATIONWIDE STUDY OF DRINKING WATER AND CARDIOVASCULAR DISEASES

    EPA Science Inventory

    This study was designed to further investigate the association(s) of cardiovascular diseases and drinking water constituents. A sample of 4,200 adults was randomly selected from 35 geographic areas to represent the civilian noninstitutionalized population of the contiguous United...

  1. Sexual Sensation Seeking, Social Stress, and Coping Styles as Predictors of HIV/STD Risk Behaviors in Adolescents

    ERIC Educational Resources Information Center

    Teva, Inmaculada; Bermudez, Maria Paz; Buela-Casal, Gualberto

    2010-01-01

    The aim of this study was to assess whether coping styles, social stress, and sexual sensation seeking were predictors of HIV/STD risk behaviours in adolescents. A representative sample of 4,456 female and male Spanish high school students aged 13 to 18 years participated. A stratified random sampling procedure was used. Self-report questionnaires…

  2. Evaluation of a Class of Simple and Effective Uncertainty Methods for Sparse Samples of Random Variables and Functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romero, Vicente; Bonney, Matthew; Schroeder, Benjamin

    When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: central 95% of response; and 10^-4 probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depends on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large database and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.
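
    A small simulation illustrates the report's starting point, that estimates from very few samples frequently understate spread even when the estimator itself is unbiased; the normal distribution, n = 5, and trial count below are arbitrary choices, not the report's benchmark suite.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    true_sd = 1.0
    n, trials = 5, 10_000

    understate_sd = 0
    for _ in range(trials):
        x = rng.normal(0.0, true_sd, size=n)     # 5 replicate "experiments"
        if x.std(ddof=1) < true_sd:              # naive sample estimate below truth?
            understate_sd += 1

    print(f"sample SD (n=5) fell below the true SD in {understate_sd / trials:.0%} of trials")
    # roughly 6 times out of 10: with sparse data the spread is usually understated
    ```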

  3. Assessing representativeness of sampling methods for reaching men who have sex with men: a direct comparison of results obtained from convenience and probability samples.

    PubMed

    Schwarcz, Sandra; Spindler, Hilary; Scheer, Susan; Valleroy, Linda; Lansky, Amy

    2007-07-01

    Convenience samples are used to determine HIV-related behaviors among men who have sex with men (MSM) without measuring the extent to which the results are representative of the broader MSM population. We compared results from a cross-sectional survey of MSM recruited from gay bars between June and October 2001 to a random digit dial telephone survey conducted between June 2002 and January 2003. The men in the probability sample were older, better educated, and had higher incomes than men in the convenience sample; the convenience sample enrolled more employed men and men of color. Substance use around the time of sex was higher in the convenience sample but other sexual behaviors were similar. HIV testing was common among men in both samples. Periodic validation, through comparison of data collected by different sampling methods, may be useful when relying on survey data for program and policy development.

  4. Bias, Confounding, and Interaction: Lions and Tigers, and Bears, Oh My!

    PubMed

    Vetter, Thomas R; Mascha, Edward J

    2017-09-01

    Epidemiologists seek to make a valid inference about the causal effect between an exposure and a disease in a specific population, using representative sample data from a specific population. Clinical researchers likewise seek to make a valid inference about the association between an intervention and outcome(s) in a specific population, based upon their randomly collected, representative sample data. Both do so by using the available data about the sample variable to make a valid estimate about its corresponding or underlying, but unknown population parameter. Random error in an experiment can be due to the natural, periodic fluctuation or variation in the accuracy or precision of virtually any data sampling technique or health measurement tool or scale. In a clinical research study, random error can be due to not only innate human variability but also purely chance. Systematic error in an experiment arises from an innate flaw in the data sampling technique or measurement instrument. In the clinical research setting, systematic error is more commonly referred to as systematic bias. The most commonly encountered types of bias in anesthesia, perioperative, critical care, and pain medicine research include recall bias, observational bias (Hawthorne effect), attrition bias, misclassification or informational bias, and selection bias. A confounding variable is a factor associated with both the exposure of interest and the outcome of interest. A confounding variable (confounding factor or confounder) is a variable that correlates (positively or negatively) with both the exposure and outcome. Confounding is typically not an issue in a randomized trial because the randomized groups are sufficiently balanced on all potential confounding variables, both observed and nonobserved. However, confounding can be a major problem with any observational (nonrandomized) study. Ignoring confounding in an observational study will often result in a "distorted" or incorrect estimate of the association or treatment effect. Interaction among variables, also known as effect modification, exists when the effect of 1 explanatory variable on the outcome depends on the particular level or value of another explanatory variable. Bias and confounding are common potential explanations for statistically significant associations between exposure and outcome when the true relationship is noncausal. Understanding interactions is vital to proper interpretation of treatment effects. These complex concepts should be consistently and appropriately considered whenever one is not only designing but also analyzing and interpreting data from a randomized trial or observational study.
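
    The contrast drawn above between observational and randomized designs can be made concrete with a toy simulation: a confounder drives both exposure and outcome, so the naive observational comparison is distorted even though the true effect is zero, while randomizing exposure removes the problem. All numbers and variable names are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000
    confounder = rng.normal(size=n)                    # e.g. illness severity

    # observational world: sicker patients are more likely to be exposed
    p_exposed = 1 / (1 + np.exp(-confounder))
    exposed_obs = rng.random(n) < p_exposed
    outcome_obs = 2.0 * confounder + 0.0 * exposed_obs + rng.normal(size=n)   # true effect = 0

    # randomized world: exposure assigned by coin flip, same outcome model
    exposed_rct = rng.random(n) < 0.5
    outcome_rct = 2.0 * confounder + 0.0 * exposed_rct + rng.normal(size=n)

    naive_obs = outcome_obs[exposed_obs].mean() - outcome_obs[~exposed_obs].mean()
    naive_rct = outcome_rct[exposed_rct].mean() - outcome_rct[~exposed_rct].mean()
    print(f"observational estimate: {naive_obs:+.2f}  (distorted away from the true effect of 0)")
    print(f"randomized estimate:    {naive_rct:+.2f}  (close to 0)")
    ```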

  5. Have We Really Been Analyzing Terminating Simulations Incorrectly All These Years?

    DTIC Science & Technology

    2013-12-01

    Paul J. Sánchez, Operations Research, Naval Postgraduate School, Monterey, CA. …measure. If that observation directly represents an end state, such as the number of failed components after a week's operation or the number of patients processed in 24 hours of emergency room operations, there's no problem: the set of values obtained by replication represent a random sample from the…

  6. Urban and Suburban Residents' Perceptions of Farmers and Agriculture.

    ERIC Educational Resources Information Center

    Molnar, Joseph J.; Duffy, Patricia A.

    Attitudes about farming and government agricultural policies differed among residential categories ranging from urban to rural. A mail survey gathered 3,232 completed questionnaires from a national random sample of 9,250 households. Statistical weighting made respondent categories representative of national proportions. Although respondents…

  7. American Healthy Homes Survey: A National Study of Residential Pesticides Measured from Floor Wipes.

    EPA Science Inventory

    The U.S. Department of Housing and Urban Development in collaboration with the United States Environmental Protection Agency conducted a survey measuring lead, allergens, and insecticides in a randomly selected nationally representative sample of residential homes. Multistage sa...

  8. Identification of Teaching Behaviors Which Predict Success for Mainstreamed Students.

    ERIC Educational Resources Information Center

    Larrivee, Barbara; Algina, James

    The final phase of a study investigating effective teaching behaviors for mainstreamed students involved 118 elementary teachers. Teachers provided information on mainstreamed students and a sample of students was randomly selected to represent classification categories (learning disabilities, behavior disorders, speech impairments, and hearing…

  9. An active learning representative subset selection method using net analyte signal.

    PubMed

    He, Zhonghai; Ma, Zhenhe; Luan, Jingmin; Cai, Xi

    2018-05-05

    To guarantee accurate predictions, representative samples are needed when building a calibration model for spectroscopic measurements. However, in general, it is not known whether a sample is representative prior to measuring its concentration, which is both time-consuming and expensive. In this paper, a method to determine whether a sample should be selected into a calibration set is presented. The selection is based on the difference of Euclidean norm of net analyte signal (NAS) vector between the candidate and existing samples. First, the concentrations and spectra of a group of samples are used to compute the projection matrix, NAS vector, and scalar values. Next, the NAS vectors of candidate samples are computed by multiplying projection matrix with spectra of samples. Scalar value of NAS is obtained by norm computation. The distance between the candidate set and the selected set is computed, and samples with the largest distance are added to selected set sequentially. Last, the concentration of the analyte is measured such that the sample can be used as a calibration sample. Using a validation test, it is shown that the presented method is more efficient than random selection. As a result, the amount of time and money spent on reference measurements is greatly reduced. Copyright © 2018 Elsevier B.V. All rights reserved.
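
    A minimal sketch of the selection scheme described above, using the usual rank-annihilation form of the net analyte signal: remove the analyte's contribution from the calibration spectra, project candidate spectra onto the orthogonal complement of what remains, and sequentially add the candidate whose NAS norm is farthest from those already selected. Matrix shapes, the maximin distance rule, and function names are assumptions drawn from the abstract's wording, not the authors' code.

    ```python
    import numpy as np

    def nas_projection(X, y):
        """Projection matrix onto the net analyte signal (NAS) space.
        X: (n_samples, n_wavelengths) calibration spectra; y: analyte concentrations.
        Rank-annihilation construction: remove the analyte's contribution from X,
        then project orthogonally to the row space of what remains."""
        y = np.asarray(y, dtype=float).reshape(-1, 1)
        X_minus = X - (y @ (y.T @ X)) / (y.ravel() @ y.ravel())
        P = np.eye(X.shape[1]) - np.linalg.pinv(X_minus) @ X_minus
        return P

    def select_representative(P, candidates, selected, n_add):
        """Sequentially add the candidate spectrum whose NAS norm is farthest
        (in the maximin sense) from the NAS norms of already-selected samples."""
        nas_norm = lambda s: np.linalg.norm(P @ s)
        selected, pool = list(selected), list(candidates)
        for _ in range(n_add):
            dists = [min(abs(nas_norm(c) - nas_norm(s)) for s in selected) for c in pool]
            selected.append(pool.pop(int(np.argmax(dists))))
        return selected
    ```

    Only the selected samples then need reference concentration measurements, which is where the claimed savings over random selection come from.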

  10. An active learning representative subset selection method using net analyte signal

    NASA Astrophysics Data System (ADS)

    He, Zhonghai; Ma, Zhenhe; Luan, Jingmin; Cai, Xi

    2018-05-01

    To guarantee accurate predictions, representative samples are needed when building a calibration model for spectroscopic measurements. However, in general, it is not known whether a sample is representative prior to measuring its concentration, which is both time-consuming and expensive. In this paper, a method to determine whether a sample should be selected into a calibration set is presented. The selection is based on the difference of Euclidean norm of net analyte signal (NAS) vector between the candidate and existing samples. First, the concentrations and spectra of a group of samples are used to compute the projection matrix, NAS vector, and scalar values. Next, the NAS vectors of candidate samples are computed by multiplying projection matrix with spectra of samples. Scalar value of NAS is obtained by norm computation. The distance between the candidate set and the selected set is computed, and samples with the largest distance are added to selected set sequentially. Last, the concentration of the analyte is measured such that the sample can be used as a calibration sample. Using a validation test, it is shown that the presented method is more efficient than random selection. As a result, the amount of time and money spent on reference measurements is greatly reduced.

  11. Exactly solvable random graph ensemble with extensively many short cycles

    NASA Astrophysics Data System (ADS)

    Aguirre López, Fabián; Barucca, Paolo; Fekom, Mathilde; Coolen, Anthony C. C.

    2018-02-01

    We introduce and analyse ensembles of 2-regular random graphs with a tuneable distribution of short cycles. The phenomenology of these graphs depends critically on the scaling of the ensembles’ control parameters relative to the number of nodes. A phase diagram is presented, showing a second order phase transition from a connected to a disconnected phase. We study both the canonical formulation, where the size is large but fixed, and the grand canonical formulation, where the size is sampled from a discrete distribution, and show their equivalence in the thermodynamical limit. We also compute analytically the spectral density, which consists of a discrete set of isolated eigenvalues, representing short cycles, and a continuous part, representing cycles of diverging size.

  12. The Effect Of Age At Harvest On Bending And Tensile Properties Of Loblolly Pine From The Coastal Plain

    Treesearch

    Robert H. McAlister; Alexander Clark; Joseph R. Saucier

    1997-01-01

    The effect of rotation age on strength and stiffness of lumber produced from unthinned loblolly pine stands in the Coastal Plain of Georgia was examined. Six stands representing 22-, 28-, and 40-year-old rotations were sampled. A stratified random sample of trees 8 to 16 inches in diameter at breast height was selected from each stand and processed into lumber....

  13. Job Stress and Organizational Commitment among Mentoring Coordinators

    ERIC Educational Resources Information Center

    Michael, Orly; Court, Deborah; Petal, Pnina

    2009-01-01

    Purpose: This research aims to examine the impact of job stress on the organizational commitment of a random, representative sample of coordinators in the Israeli educational mentoring organization PMP. Organizational commitment, including affective, continuance and normative commitment, refers to worker relations in the organization, and how…

  14. Variable density randomized stack of spirals (VDR-SoS) for compressive sensing MRI.

    PubMed

    Valvano, Giuseppe; Martini, Nicola; Landini, Luigi; Santarelli, Maria Filomena

    2016-07-01

    To develop a 3D sampling strategy based on a stack of variable density spirals for compressive sensing MRI. A random sampling pattern was obtained by rotating each spiral by a random angle and by delaying the gradient waveforms of the different interleaves for a few time steps. A three-dimensional (3D) variable sampling density was obtained by designing different variable density spirals for each slice encoding. The proposed approach was tested with phantom simulations up to a five-fold undersampling factor. Fully sampled 3D datasets of a human knee and of a human brain were obtained from a healthy volunteer. The proposed approach was tested with off-line reconstructions of the knee dataset up to a four-fold acceleration and compared with other noncoherent trajectories. The proposed approach outperformed the standard stack of spirals for various undersampling factors. The level of coherence and the reconstruction quality of the proposed approach were similar to those of other trajectories that, however, require 3D gridding for the reconstruction. The variable density randomized stack of spirals (VDR-SoS) is an easily implementable trajectory that could represent a valid sampling strategy for 3D compressive sensing MRI. It guarantees low levels of coherence without requiring 3D gridding. Magn Reson Med 76:59-69, 2016. © 2015 Wiley Periodicals, Inc.
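
    A schematic of the randomization described above: each spiral interleave gets a random rotation angle and a small random delay of its gradient waveforms, and the spiral density varies with the slice (kz) encoding. This is only a sketch of the sampling-pattern bookkeeping; real trajectory design must also respect gradient hardware limits, and all parameter names and the density law below are assumptions.

    ```python
    import numpy as np

    def vdr_sos_pattern(n_slices, n_interleaves, max_delay_steps=4, seed=0):
        """Random per-interleave rotations and waveform delays, plus a
        slice-dependent density parameter for the variable-density spirals."""
        rng = np.random.default_rng(seed)
        rotations = rng.uniform(0, 2 * np.pi, size=(n_slices, n_interleaves))        # random rotation
        delays = rng.integers(0, max_delay_steps + 1, size=(n_slices, n_interleaves))  # delay in steps
        kz_offset = np.abs(np.arange(n_slices) - n_slices // 2)
        density = 1.0 / (1.0 + kz_offset / (n_slices / 8))    # denser spirals for central kz encodings
        return rotations, delays, density

    rot, dly, dens = vdr_sos_pattern(n_slices=32, n_interleaves=8)
    ```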

  15. The effects of race and racial priming on self-report of contamination anxiety

    PubMed Central

    Williams, Monnica T.; Turkheimer, Eric; Magee, Emily; Guterbock, Thomas

    2011-01-01

    African Americans show unusually high endorsement rates on self-report measures of contamination anxiety. The purpose of this study was to replicate this finding in a nationally representative sample and conduct a randomized experiment to determine the effect of salience of race as a causal factor. Black and White participants were given contamination items from two popular measures of obsessive-compulsive disorder, half prior to being primed about ethnic identity and half after being primed, via the administration of an ethnic identity measure. The experiment took the form of a 2 (Black and White participant) X 2 (ethnicity salient and ethnicity non-salient) double-blind design, with ethnic saliency assigned at random by computer. Participants consisted of a geographically representative US sample of African Americans supplemented with a similar sample of European Americans (N=258). Black participants scored significantly higher than White participants on contamination scales. Participants from Southern states scored higher than those from other regions. Over-endorsements by Black participants were greater when awareness of ethnic and racial identification was increased. Clinical and research implications were discussed; these measures should be used with caution in African Americans. PMID:22163374

  16. Fast Physically Accurate Rendering of Multimodal Signatures of Distributed Fracture in Heterogeneous Materials.

    PubMed

    Visell, Yon

    2015-04-01

    This paper proposes a fast, physically accurate method for synthesizing multimodal, acoustic and haptic, signatures of distributed fracture in quasi-brittle heterogeneous materials, such as wood, granular media, or other fiber composites. Fracture processes in these materials are challenging to simulate with existing methods, due to the prevalence of large numbers of disordered, quasi-random spatial degrees of freedom, representing the complex physical state of a sample over the geometric volume of interest. Here, I develop an algorithm for simulating such processes, building on a class of statistical lattice models of fracture that have been widely investigated in the physics literature. This algorithm is enabled through a recently published mathematical construction based on the inverse transform method of random number sampling. It yields a purely time domain stochastic jump process representing stress fluctuations in the medium. The latter can be readily extended by a mean field approximation that captures the averaged constitutive (stress-strain) behavior of the material. Numerical simulations and interactive examples demonstrate the ability of these algorithms to generate physically plausible acoustic and haptic signatures of fracture in complex, natural materials interactively at audio sampling rates.
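
    The inverse transform method mentioned above is simple to state: draw U uniform on (0, 1) and return the quantile function (inverse CDF) evaluated at U. A minimal sketch follows, using exponential waiting times as an illustrative ingredient of a time-domain jump process; the paper's actual stress-fluctuation distribution is not reproduced here.

    ```python
    import numpy as np

    def sample_inverse_transform(inv_cdf, n, rng=None):
        """Inverse transform sampling: draw U ~ Uniform(0,1) and return F^{-1}(U).
        inv_cdf: the quantile function of the target distribution."""
        rng = rng or np.random.default_rng()
        u = rng.uniform(0.0, 1.0, size=n)
        return inv_cdf(u)

    # example: exponential waiting times between "fracture" events with rate lam
    lam = 50.0
    waits = sample_inverse_transform(lambda u: -np.log1p(-u) / lam, 10_000)
    event_times = np.cumsum(waits)     # jump times of the stochastic process
    print(waits.mean())                # ~ 1/lam
    ```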

  17. Using variance components to estimate power in a hierarchically nested sampling design improving monitoring of larval Devils Hole pupfish

    USGS Publications Warehouse

    Dzul, Maria C.; Dixon, Philip M.; Quist, Michael C.; Dinsomore, Stephen J.; Bower, Michael R.; Wilson, Kevin P.; Gaines, D. Bailey

    2013-01-01

    We used variance components to assess allocation of sampling effort in a hierarchically nested sampling design for ongoing monitoring of early life history stages of the federally endangered Devils Hole pupfish (DHP) (Cyprinodon diabolis). Sampling design for larval DHP included surveys (5 days each spring 2007–2009), events, and plots. Each survey was comprised of three counting events, where DHP larvae on nine plots were counted plot by plot. Statistical analysis of larval abundance included three components: (1) evaluation of power from various sample size combinations, (2) comparison of power in fixed and random plot designs, and (3) assessment of yearly differences in the power of the survey. Results indicated that increasing the sample size at the lowest level of sampling represented the most realistic option to increase the survey's power, fixed plot designs had greater power than random plot designs, and the power of the larval survey varied by year. This study provides an example of how monitoring efforts may benefit from coupling variance components estimation with power analysis to assess sampling design.

  18. 48 CFR 13.303-6 - Review procedures.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Review procedures. (a) The contracting officer placing orders under a BPA, or the designated representative of the contracting officer, shall review a sufficient random sample of the BPA files at least... into the BPA shall— (1) Ensure that each BPA is reviewed at least annually and, if necessary, updated...

  19. 48 CFR 13.303-6 - Review procedures.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Review procedures. (a) The contracting officer placing orders under a BPA, or the designated representative of the contracting officer, shall review a sufficient random sample of the BPA files at least... into the BPA shall— (1) Ensure that each BPA is reviewed at least annually and, if necessary, updated...

  20. 48 CFR 13.303-6 - Review procedures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Review procedures. (a) The contracting officer placing orders under a BPA, or the designated representative of the contracting officer, shall review a sufficient random sample of the BPA files at least... into the BPA shall— (1) Ensure that each BPA is reviewed at least annually and, if necessary, updated...

  1. 48 CFR 13.303-6 - Review procedures.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Review procedures. (a) The contracting officer placing orders under a BPA, or the designated representative of the contracting officer, shall review a sufficient random sample of the BPA files at least... into the BPA shall— (1) Ensure that each BPA is reviewed at least annually and, if necessary, updated...

  2. 48 CFR 13.303-6 - Review procedures.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Review procedures. (a) The contracting officer placing orders under a BPA, or the designated representative of the contracting officer, shall review a sufficient random sample of the BPA files at least... into the BPA shall— (1) Ensure that each BPA is reviewed at least annually and, if necessary, updated...

  3. ALIEN SPECIES IMPORTANTANCE IN NATIVE VEGETATION ALONG WADEABLE STREAMS, JOHN DAY RIVER BASIN, OREGON, USA

    EPA Science Inventory

    We evaluated the importance of alien species in existing vegetation along wadeable streams of a large, topographically diverse river basin in eastern Oregon, USA; sampling 165 plots (30 × 30 m) across 29 randomly selected 1-km stream reaches. Plots represented eight streamside co...

  4. Estimating Acceptability of Financial Health Incentives

    ERIC Educational Resources Information Center

    Bigsby, Elisabeth; Seitz, Holli H.; Halpern, Scott D.; Volpp, Kevin; Cappella, Joseph N.

    2017-01-01

    A growing body of evidence suggests that financial incentives can influence health behavior change, but research on the public acceptability of these programs and factors that predict public support have been limited. A representative sample of U.S. adults (N = 526) was randomly assigned to receive an incentive program description in which the…

  5. The Vocational Personality of School Psychologists in the United States

    ERIC Educational Resources Information Center

    Toomey, Kristine D.; Levinson, Edward M.; Morrison, Takea J.

    2008-01-01

    This study represents the first empirical test of the vocational personality of US school psychologists. Specifically, we investigated the personality of school psychologists using Holland's (1997) well-researched theory of vocational personalities and work environments. The sample consisted of 241 randomly selected members of the National…

  6. Homophobia in Registered Nurses: Impact on LGB Youth

    ERIC Educational Resources Information Center

    Blackwell, Christopher W.; Kiehl, Ermalynn M.

    2008-01-01

    This study examined registered nurses' overall attitudes and homophobia towards gays and lesbians in the workplace. Homophobia scores, represented by the Attitudes Toward Lesbians and Gay Men (ATLG) Scale, were the dependent variable. Overall homophobia scores were assessed among a randomized stratified sample of registered nurses licensed in the…

  7. The Impact of Electronic Communication Technology on Written Language

    ERIC Educational Resources Information Center

    Hamzah, Mohd. Sahandri Gani B.; Ghorbani, Mohd. Reza; Abdullah, Saifuddin Kumar B.

    2009-01-01

    Communication technology is changing things. Language is no exception. Some language researchers argue that language is deteriorating due to increased use in electronic communication. The present paper investigated 100 randomly selected electronic mails (e-mails) and 50 short messaging system (SMS) messages of a representative sample of…

  8. Employment of College Students.

    ERIC Educational Resources Information Center

    High, Robert V.

    A survey was conducted to determine the effect on academic performance, if any, of employment on undergraduate college students. A questionnaire was sent to professors at 3 four-year colleges on Long Island (New York); various day classes were randomly selected. The final sample of n=257 represented approximately a 30 percent response. The…

  9. Chronic obstructive pulmonary disease self-management activation research trial (COPD-SMART): results of recruitment and baseline patient characteristics.

    PubMed

    Russo, Rennie; Coultas, David; Ashmore, Jamile; Peoples, Jennifer; Sloan, John; Jackson, Bradford E; Uhm, Minyong; Singh, Karan P; Blair, Steven N; Bae, Sejong

    2015-03-01

    To describe the recruitment methods, study participation rate, and baseline characteristics of a representative sample of outpatients with COPD eligible for pulmonary rehabilitation participating in a trial of a lifestyle behavioral intervention to increase physical activity. A patient registry was developed for recruitment using an administrative database from primary care and specialty clinics of an academic medical center in northeast Texas for a parallel group randomized trial. The registry was comprised of 5582 patients and over the course of the 30 month recruitment period 325 patients were enrolled for an overall study participation rate of 35.1%. After a 6-week COPD self-management education period provided to all enrolled patients, 305 patients were randomized into either usual care (UC; n=156) or the physical activity self-management intervention (PASM; n=149). There were no clinically significant differences in demographics, clinical characteristics, or health status indicators between the randomized groups. The results of this recruitment process demonstrate the successful use of a patient registry for enrolling a representative sample of outpatients eligible for pulmonary rehabilitation with COPD from primary and specialty care. Moreover, this approach to patient recruitment provides a model for future studies utilizing administrative databases and electronic health records. Published by Elsevier Inc.

  10. High School Physics Availability: Results from the 2012-13 Nationwide Survey of High School Physics Teachers. Focus On

    ERIC Educational Resources Information Center

    White, Susan; Tesfaye, Casey Langer

    2014-01-01

    In this report, the authors share their analysis of the data from over 3,500 high schools in the U.S. beginning with an examination of the availability of physics in U.S. high schools. The schools in their sample are a nationally-representative random sample of the almost 25,000 high schools in forty-nine of the fifty states. Table 1 shows the…

  11. Statistical auditing and randomness test of lotto k/N-type games

    NASA Astrophysics Data System (ADS)

    Coronel-Brizio, H. F.; Hernández-Montoya, A. R.; Rapallo, F.; Scalas, E.

    2008-11-01

    One of the most popular lottery games worldwide is the so-called “lotto k/N”. It considers N numbers 1,2,…,N from which k are drawn randomly, without replacement. A player selects k or more numbers and the first prize is shared amongst those players whose selected numbers match all of the k randomly drawn. Exact rules may vary in different countries. In this paper, mean values and covariances for the random variables representing the numbers drawn from this kind of game are presented, with the aim of using them to audit statistically the consistency of a given sample of historical results with theoretical values coming from a hypergeometric statistical model. The method can be adapted to test pseudorandom number generators.
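
    For a fair lotto k/N draw without replacement, the indicator variable for "number i was drawn" has mean k/N, variance (k/N)(1 - k/N), and covariance -k(N - k) / (N^2 (N - 1)) with any other number's indicator, since the numbers compete for the k slots. The sketch below checks these moments against simulated fair draws of a 6/49 game; note that the paper tabulates moments of the drawn numbers themselves, a related but different set of quantities, so this is only an illustration of the auditing idea.

    ```python
    import numpy as np

    def lotto_indicator_moments(N, k):
        """Theoretical mean and covariance of the indicators I_i
        ('number i is among the k drawn') for a fair lotto k/N game."""
        p = k / N
        var = p * (1 - p)
        cov = -k * (N - k) / (N ** 2 * (N - 1))   # negative: numbers compete for the k slots
        return p, var, cov

    # compare with empirical moments from simulated fair draws of a 6/49 game
    N, k, n_draws = 49, 6, 50_000
    rng = np.random.default_rng(7)
    ind = np.zeros((n_draws, N))
    for t in range(n_draws):
        ind[t, rng.choice(N, size=k, replace=False)] = 1

    p, var, cov = lotto_indicator_moments(N, k)
    emp_cov = np.cov(ind, rowvar=False)
    print("mean:       theory", p, "empirical", ind.mean())
    print("covariance: theory", cov, "empirical (off-diag avg)",
          (emp_cov.sum() - np.trace(emp_cov)) / (N * (N - 1)))
    ```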

  12. PubMed Central

    King-Shier, Kathryn M; Hemmelgarn, Brenda R; Musto, Richard; Quan, Hude

    2014-01-01

    Background: Francophones who live outside the primarily French-speaking province of Quebec, Canada, risk being excluded from research by lack of a sampling frame. We examined the adequacy of random sampling, advertising, and respondent-driven sampling for recruitment of francophones for survey research. Methods: We recruited francophones residing in the city of Calgary, Alberta, through advertising and respondent-driven sampling. These 2 samples were then compared with a random subsample of Calgary francophones derived from the 2006 Canadian Community Health Survey (CCHS). We assessed the effectiveness of advertising and respondent-driven sampling in relation to the CCHS sample by comparing demographic characteristics and selected items from the CCHS (specifically self-reported general health status, perceived weight, and having a family doctor). Results: We recruited 120 francophones through advertising and 145 through respondent-driven sampling; the random sample from the CCHS consisted of 259 records. The samples derived from advertising and respondent-driven sampling differed from the CCHS in terms of age (mean ages 41.0, 37.6, and 42.5 years, respectively), sex (proportion of males 26.1%, 40.6%, and 56.6%, respectively), education (college or higher 86.7%, 77.9%, and 59.1%, respectively), place of birth (immigrants accounting for 45.8%, 55.2%, and 3.7%, respectively), and not having a regular medical doctor (16.7%, 34.5%, and 16.6%, respectively). Differences were not tested statistically because of limitations on the analysis of CCHS data imposed by Statistics Canada. Interpretation: The samples generated exclusively through advertising and respondent-driven sampling were not representative of the gold standard sample from the CCHS. Use of such biased samples for research studies could generate misleading results. PMID:25426180

  13. Slowdowns in diversification rates from real phylogenies may not be real.

    PubMed

    Cusimano, Natalie; Renner, Susanne S

    2010-07-01

    Studies of diversification patterns often find a slowing in lineage accumulation toward the present. This seemingly pervasive pattern of rate downturns has been taken as evidence for adaptive radiations, density-dependent regulation, and metacommunity species interactions. The significance of rate downturns is evaluated with statistical tests (the gamma statistic and Monte Carlo constant rates (MCCR) test; birth-death likelihood models and Akaike Information Criterion [AIC] scores) that rely on null distributions, which assume that the included species are a random sample of the entire clade. Sampling in real phylogenies, however, often is nonrandom because systematists try to include early-diverging species or representatives of previous intrataxon classifications. We studied the effects of biased sampling, structured sampling, and random sampling by experimentally pruning simulated trees (60 and 150 species) as well as a completely sampled empirical tree (58 species) and then applying the gamma statistic/MCCR test and birth-death likelihood models/AIC scores to assess rate changes. For trees with random species sampling, the true model (i.e., the one fitting the complete phylogenies) could be inferred in most cases. Oversampling deep nodes, however, strongly biases inferences toward downturns, with simulations of structured and biased sampling suggesting that this occurs when sampling percentages drop below 80%. The magnitude of the effect and the sensitivity of diversification rate models is such that a useful rule of thumb may be not to infer rate downturns from real trees unless they have >80% species sampling.

  14. Flexible sampling large-scale social networks by self-adjustable random walk

    NASA Astrophysics Data System (ADS)

    Xu, Xiao-Ke; Zhu, Jonathan J. H.

    2016-12-01

    Online social networks (OSNs) have become an increasingly attractive gold mine for academic and commercial researchers. However, research on OSNs faces a number of difficult challenges. One bottleneck lies in the massive quantity and often unavailability of OSN population data. Sampling perhaps becomes the only feasible solution to the problems. How to draw samples that can represent the underlying OSNs has remained a formidable task because of a number of conceptual and methodological reasons. Especially, most of the empirically-driven studies on network sampling are confined to simulated data or sub-graph data, which are fundamentally different from real and complete-graph OSNs. In the current study, we propose a flexible sampling method, called Self-Adjustable Random Walk (SARW), and test it against the population data of a real large-scale OSN. We evaluate the strengths of the sampling method in comparison with four prevailing methods, including uniform, breadth-first search (BFS), random walk (RW), and revised RW (i.e., MHRW) sampling. We try to mix both induced-edge and external-edge information of sampled nodes together in the same sampling process. Our results show that the SARW sampling method has been able to generate unbiased samples of OSNs with maximal precision and minimal cost. The study is helpful for the practice of OSN research by providing a highly needed sampling tool, for the methodological development of large-scale network sampling by comparative evaluations of existing sampling methods, and for the theoretical understanding of human networks by highlighting discrepancies and contradictions between existing knowledge/assumptions of large-scale real OSN data.
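
    A plain random walk, the baseline that the proposed SARW method adjusts, can be sketched in a few lines; it visits nodes with probability roughly proportional to their degree, which is the bias that re-weighted or self-adjusting walks try to correct. The adjacency-dict representation and toy graph are assumptions, and SARW's self-adjustment step is not implemented here.

    ```python
    import random
    from collections import Counter

    def random_walk_sample(adj, n_steps, start=None, rng=random):
        """Classic random-walk (RW) node sampling on an undirected graph.
        adj: dict mapping node -> list of neighbours. Returns the visited nodes."""
        node = start if start is not None else rng.choice(list(adj))
        visited = [node]
        for _ in range(n_steps):
            node = rng.choice(adj[node])        # move to a uniformly chosen neighbour
            visited.append(node)
        return visited

    # toy graph: a star plus a short cycle; the hub is visited far more often
    adj = {"hub": ["a", "b", "c", "d"], "a": ["hub"], "b": ["hub"],
           "c": ["hub", "d"], "d": ["hub", "c"]}
    print(Counter(random_walk_sample(adj, 1000, start="a", rng=random.Random(1))))
    ```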

  15. Assimilating and Following through with Nutritional Recommendations by Adolescents

    ERIC Educational Resources Information Center

    Pich, Jordi; Ballester, Lluis; Thomas, Monica; Canals, Ramon; Tur, Josep A.

    2011-01-01

    Objective: To investigate the relationship between knowledge about a healthy diet and the actual food consumption habits of adolescents. Design: A survey of several food-related aspects applied to a representative sample of adolescents. Setting: One thousand, six hundred and sixty three individuals aged 11 to 18 from 40 schools randomly selected…

  16. Campus HIV Prevention Strategies: Planning for Success.

    ERIC Educational Resources Information Center

    Hoban, Mary T.; Ottenritter, Nan W.; Gascoigne, Jan L.; Kerr, Dianne L.

    This document presents the results of the National College Health Risk Behavior Survey (NCHRBS) conducted by the U.S. Centers for Disease Control (CDC) that pertain to HIV transmission. These results include sexual assault, alcohol and other drug use, and sexual behaviors. The survey was administered to a nationally representative random sample of…

  17. Job Demand in the Cosmetology Industry.

    ERIC Educational Resources Information Center

    Rudner, Lawrence M.; Deutermann, William V., Jr.

    In order to determine job demand in the cosmetology industry, a survey was made of a nationally representative stratified random sample of 1,454 beauty salons, barber shops, and unisex salons in July 1991. Salon owners were asked about 1990 and their plans for the future. Survey results were supplemented with information about the industry…

  18. Fruit and Vegetable Intake among Urban Community Gardeners

    ERIC Educational Resources Information Center

    Alaimo, Katherine; Packnett, Elizabeth; Miles, Richard A.; Kruger, Daniel J.

    2008-01-01

    Objective: To determine the association between household participation in a community garden and fruit and vegetable consumption among urban adults. Design: Data were analyzed from a cross-sectional random phone survey conducted in 2003. A quota sampling strategy was used to ensure that all census tracts within the city were represented. Setting:…

  19. Dual Use of Cigarettes and Smokeless Tobacco among South African Adolescents

    ERIC Educational Resources Information Center

    Rantao, Masego; Ayo-Yusuf, Olalekan A.

    2012-01-01

    Objectives: To determine factors associated with dual use of tobacco products in a population of black South African adolescents. Methods: Data were obtained from a self-administered questionnaire completed by a representative sample of grade 8 students from 21 randomly selected secondary state schools in the Limpopo Province, South Africa (n =…

  20. Correlates of Sexual Abuse and Smoking among French Adults

    ERIC Educational Resources Information Center

    King, Gary; Guilbert, Philippe; Ward, D. Gant; Arwidson, Pierre; Noubary, Farzad

    2006-01-01

    Objective: The goal of this study was to examine the association between sexual abuse (SA) and initiation, cessation, and current cigarette smoking among a large representative adult population in France. Method: A random sample size of 12,256 adults (18-75 years of age) was interviewed by telephone concerning demographic variables, health…

  1. Maneuvering the Role as a Community College Artist-Educator: Scholarship Assessed

    ERIC Educational Resources Information Center

    Gibson, John R.; Murray, John P.

    2009-01-01

    This study examined how Texas community college artist-educators balance artistic productivity with their teaching responsibilities. The 98 survey respondents represented 76.6% of a stratified random sample of the full-time instructors in visual arts departments within the 50 Texas public community college districts. Access to studio space and…

  2. Home Education: Characteristics of Its Families and Schools.

    ERIC Educational Resources Information Center

    Gladin, Earl Wade

    This study of the characteristics of home schooling is based on returned questionnaires of 37 questions each, mailed to a random sample of 416 drawn from 6,850 families listed in the Bob Jones University Press home school mailing list. The 253 returned questionnaires, representing a 62% response, provided data on the characteristics of these…

  3. Preliminary Findings on Rural Homelessness in Ohio.

    ERIC Educational Resources Information Center

    First, Richard J.; And Others

    This report is designed to present preliminary findings from the first comprehensive study of rural homelessness in the United States. The study was conducted during the first 6 months of 1990, and data were collected from interviews with 921 homeless adults in 21 randomly selected rural counties in Ohio. The sample counties represent 26% of the…

  4. Epidemiology of Attention Problems among Turkish Children and Adolescents: A National Study

    ERIC Educational Resources Information Center

    Erol, Nese; Simsek, Zeynep; Oner, Ozgur; Munir, Kerim

    2008-01-01

    Objective: To evaluate the epidemiology of attention problems using parent, teacher, and youth informants among a nationally representative Turkish sample. Method: The children and adolescents, 4 to 18 years old, were selected from a random household survey. Attention problems derived from the Child Behavior Checklist (CBCL) (N = 4,488), Teacher…

  5. Prime-Time Television: Assessing Violence during the Most Popular Viewing Hours.

    ERIC Educational Resources Information Center

    Smith, Stacy L.; Nathanson, Amy I.; Wilson, Barbara J.

    2002-01-01

    Assesses the prevalence and context of violence in prime-time television programming using a random, representative sample. Shows that, regardless of the time of day, viewers are likely to encounter violence in roughly 2 out of 3 programs. Identifies specific channel types and genres that feature potentially harmful depictions of violence during…

  6. Anemia, micronutrient deficiencies, and malaria in children and women in Sierra Leone prior to the Ebola outbreak

    USDA-ARS?s Scientific Manuscript database

    To identify the factors associated with anemia and to document the severity of micronutrient deficiencies, malaria and inflammation, a nationally representative cross-sectional survey was conducted. A three-stage sampling procedure was used to randomly select children <5 years of age and adult women...

  7. 7 CFR 29.6104 - Rule 18.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... INSPECTION Standards Rules § 29.6104 Rule 18. Burn shall be determined as the average burning time of leaves selected at random from the sample. A minimum of 10 leaves shall be selected as representative regardless... on the same side of the leaf. The leaf shall be punctured to permit quick ignition when placed over a...

  8. 7 CFR 29.6104 - Rule 18.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... INSPECTION Standards Rules § 29.6104 Rule 18. Burn shall be determined as the average burning time of leaves selected at random from the sample. A minimum of 10 leaves shall be selected as representative regardless... on the same side of the leaf. The leaf shall be punctured to permit quick ignition when placed over a...

  9. 7 CFR 29.6104 - Rule 18.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... INSPECTION Standards Rules § 29.6104 Rule 18. Burn shall be determined as the average burning time of leaves selected at random from the sample. A minimum of 10 leaves shall be selected as representative regardless... on the same side of the leaf. The leaf shall be punctured to permit quick ignition when placed over a...

  10. 7 CFR 29.6104 - Rule 18.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... INSPECTION Standards Rules § 29.6104 Rule 18. Burn shall be determined as the average burning time of leaves selected at random from the sample. A minimum of 10 leaves shall be selected as representative regardless... on the same side of the leaf. The leaf shall be punctured to permit quick ignition when placed over a...

  11. 7 CFR 29.6104 - Rule 18.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... INSPECTION Standards Rules § 29.6104 Rule 18. Burn shall be determined as the average burning time of leaves selected at random from the sample. A minimum of 10 leaves shall be selected as representative regardless... on the same side of the leaf. The leaf shall be punctured to permit quick ignition when placed over a...

  12. Congruence of Standard Setting Methods for a Nursing Certification Examination.

    ERIC Educational Resources Information Center

    Fabrey, Lawrence J.; Raymond, Mark R.

    The American Nurses' Association certification provides professional recognition beyond licensure to nurses who pass an examination. To determine the passing score as it would be set by a representative peer group, a survey was mailed to a random sample of 200 recently certified nurses. Three questions were asked: (1) what percentage of examinees…

  13. Parts on Demand: Evaluation of Approaches to Achieve Flexible Manufacturing Systems for Navy Partson Demand. Volume 1

    DTIC Science & Technology

    1984-02-01

    measurable impact if changed. The following items were included in the sample: * Mark Zero Items - Low demand insurance items which represent about three...R&D efforts reviewed. The resulting assessment highlighted the generic enabling technologies and cross-cutting R&D projects required to focus current...supplied by spot buys, and which may generate Navy Inventory Control Numbers (NICN). Random samples of data were extracted from the Master Data File (MDF

  14. Veterinary Medicine and Multi-Omics Research for Future Nutrition Targets: Metabolomics and Transcriptomics of the Common Degenerative Mitral Valve Disease in Dogs.

    PubMed

    Li, Qinghong; Freeman, Lisa M; Rush, John E; Huggins, Gordon S; Kennedy, Adam D; Labuda, Jeffrey A; Laflamme, Dorothy P; Hannah, Steven S

    2015-08-01

    Canine degenerative mitral valve disease (DMVD) is the most common form of heart disease in dogs. The objective of this study was to identify cellular and metabolic pathways that play a role in DMVD by performing metabolomics and transcriptomics analyses on serum and tissue (mitral valve and left ventricle) samples previously collected from dogs with DMVD or healthy hearts. Gas or liquid chromatography followed by mass spectrometry was used to identify metabolites in serum. Transcriptomics analysis of tissue samples was completed using RNA-seq, and selected targets were confirmed by RT-qPCR. Random Forest analysis was used to classify the metabolites that best predicted the presence of DMVD. Results identified 41 known and 13 unknown serum metabolites that were significantly different between healthy and DMVD dogs, representing alterations in fat and glucose energy metabolism, oxidative stress, and other pathways. The three metabolites with the greatest single effect in the Random Forest analysis were γ-glutamylmethionine, oxidized glutathione, and asymmetric dimethylarginine. Transcriptomics analysis identified 812 differentially expressed transcripts in left ventricle samples and 263 in mitral valve samples, representing changes in energy metabolism, antioxidant function, nitric oxide signaling, and extracellular matrix homeostasis pathways. Many of the identified alterations may benefit from nutritional or medical management. Our study provides evidence of the growing importance of integrative approaches in multi-omics research in veterinary and nutritional sciences.

  15. Comparison of sampling methods for hard-to-reach francophone populations: yield and adequacy of advertisement and respondent-driven sampling.

    PubMed

    Ngwakongnwi, Emmanuel; King-Shier, Kathryn M; Hemmelgarn, Brenda R; Musto, Richard; Quan, Hude

    2014-01-01

    Francophones who live outside the primarily French-speaking province of Quebec, Canada, risk being excluded from research by lack of a sampling frame. We examined the adequacy of random sampling, advertising, and respondent-driven sampling for recruitment of francophones for survey research. We recruited francophones residing in the city of Calgary, Alberta, through advertising and respondent-driven sampling. These 2 samples were then compared with a random subsample of Calgary francophones derived from the 2006 Canadian Community Health Survey (CCHS). We assessed the effectiveness of advertising and respondent-driven sampling in relation to the CCHS sample by comparing demographic characteristics and selected items from the CCHS (specifically self-reported general health status, perceived weight, and having a family doctor). We recruited 120 francophones through advertising and 145 through respondent-driven sampling; the random sample from the CCHS consisted of 259 records. The samples derived from advertising and respondent-driven sampling differed from the CCHS in terms of age (mean ages 41.0, 37.6, and 42.5 years, respectively), sex (proportion of males 26.1%, 40.6%, and 56.6%, respectively), education (college or higher 86.7%, 77.9%, and 59.1%, respectively), place of birth (immigrants accounting for 45.8%, 55.2%, and 3.7%, respectively), and not having a regular medical doctor (16.7%, 34.5%, and 16.6%, respectively). Differences were not tested statistically because of limitations on the analysis of CCHS data imposed by Statistics Canada. The samples generated exclusively through advertising and respondent-driven sampling were not representative of the gold standard sample from the CCHS. Use of such biased samples for research studies could generate misleading results.

  16. [Respondent-Driven Sampling: a new sampling method to study visible and hidden populations].

    PubMed

    Mantecón, Alejandro; Juan, Montse; Calafat, Amador; Becoña, Elisardo; Román, Encarna

    2008-01-01

    The paper introduces a variant of chain-referral sampling: respondent-driven sampling (RDS). This sampling method shows that methods based on network analysis can be combined with the statistical validity of standard probability sampling methods. In this sense, RDS appears to be a mathematical improvement of snowball sampling oriented to the study of hidden populations. However, we try to prove its validity with populations that are not within a sampling frame but can nonetheless be contacted without difficulty. The basics of RDS are explained through our research on young people (aged 14 to 25) who go clubbing, consume alcohol and other drugs, and have sex. Fieldwork was carried out between May and July 2007 in three Spanish regions: Baleares, Galicia and Comunidad Valenciana. The presentation of the study shows the utility of this type of sampling when the population is accessible but the lack of a sampling frame poses a difficulty. However, the sample obtained is not, in strict statistical terms, a random representative sample of the target population. It must be acknowledged that the final sample is representative of a 'pseudo-population' that approximates the target population but is not identical to it.
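
    For readers unfamiliar with how RDS data are usually analysed, the following sketch shows the kind of degree-weighted estimator (in the spirit of the Volz-Heckathorn estimator) commonly paired with RDS; it is not necessarily the estimator used in this particular study, and the respondent records are hypothetical.

```python
# A minimal, illustrative sketch of a degree-weighted (Volz-Heckathorn-style)
# estimator often paired with respondent-driven sampling: because recruitment
# favours well-connected respondents, each respondent is down-weighted by his
# or her reported network size. Field names and data are hypothetical.

def rds_weighted_proportion(respondents, flag):
    """Estimate a population proportion from RDS data, weighting by 1/degree."""
    weights = [1.0 / r["degree"] for r in respondents]
    num = sum(w for w, r in zip(weights, respondents) if r[flag])
    return num / sum(weights)

respondents = [
    {"degree": 25, "clubber": True},
    {"degree": 5,  "clubber": False},
    {"degree": 10, "clubber": True},
    {"degree": 2,  "clubber": False},
]

naive = sum(r["clubber"] for r in respondents) / len(respondents)
print("naive sample proportion:  ", naive)                                      # 0.50
print("degree-weighted estimate: ", rds_weighted_proportion(respondents, "clubber"))
```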

  17. Evaluation of the NCPDP Structured and Codified Sig Format for e-prescriptions.

    PubMed

    Liu, Hangsheng; Burkhart, Q; Bell, Douglas S

    2011-01-01

    To evaluate the ability of the structure and code sets specified in the National Council for Prescription Drug Programs Structured and Codified Sig Format to represent ambulatory electronic prescriptions. We parsed the Sig strings from a sample of 20,161 de-identified ambulatory e-prescriptions into variables representing the fields of the Structured and Codified Sig Format. A stratified random sample of these representations was then reviewed by a group of experts. For codified Sig fields, we attempted to map the actual words used by prescribers to the equivalent terms in the designated terminology. Proportion of prescriptions that the Format could fully represent; proportion of terms used that could be mapped to the designated terminology. The fields defined in the Format could fully represent 95% of Sigs (95% CI 93% to 97%), but ambiguities were identified, particularly in representing multiple-step instructions. The terms used by prescribers could be codified for only 60% of dose delivery methods, 84% of dose forms, 82% of vehicles, 95% of routes, 70% of sites, 33% of administration timings, and 93% of indications. The findings are based on a retrospective sample of ambulatory prescriptions derived mostly from primary care physicians. The fields defined in the Format could represent most of the patient instructions in a large prescription sample, but prior to its mandatory adoption, further work is needed to ensure that potential ambiguities are addressed and that a complete set of terms is available for the codified fields.

  18. Learning Bayesian Networks from Correlated Data

    NASA Astrophysics Data System (ADS)

    Bae, Harold; Monti, Stefano; Montano, Monty; Steinberg, Martin H.; Perls, Thomas T.; Sebastiani, Paola

    2016-05-01

    Bayesian networks are probabilistic models that represent complex distributions in a modular way and have become very popular in many fields. There are many methods to build Bayesian networks from a random sample of independent and identically distributed observations. However, many observational studies are designed using some form of clustered sampling that introduces correlations between observations within the same cluster and ignoring this correlation typically inflates the rate of false positive associations. We describe a novel parameterization of Bayesian networks that uses random effects to model the correlation within sample units and can be used for structure and parameter learning from correlated data without inflating the Type I error rate. We compare different learning metrics using simulations and illustrate the method in two real examples: an analysis of genetic and non-genetic factors associated with human longevity from a family-based study, and an example of risk factors for complications of sickle cell anemia from a longitudinal study with repeated measures.
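
    The core problem described above, namely that within-cluster correlation inflates false positive rates when observations are treated as independent, can be reproduced with a few lines of simulation. The sketch below is only a schematic illustration of that inflation, not the authors' random-effects Bayesian network method; all parameters are invented.

```python
# Small simulation: y is clustered (shared cluster random effects) and x is a
# cluster-level covariate unrelated to y. A naive correlation test that treats
# all observations as i.i.d. rejects the true null far more than 5% of the time.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_clusters, per_cluster, n_sims, alpha = 30, 10, 1000, 0.05

false_positives = 0
for _ in range(n_sims):
    u = rng.normal(0, 1, n_clusters)                       # cluster random effects in y
    x_c = rng.normal(0, 1, n_clusters)                     # cluster-level covariate, independent of y
    x = np.repeat(x_c, per_cluster)
    y = np.repeat(u, per_cluster) + rng.normal(0, 1, n_clusters * per_cluster)
    _, p = stats.pearsonr(x, y)                            # ignores the clustering
    false_positives += p < alpha

print("naive Type I error rate:", false_positives / n_sims)   # well above 0.05
```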

  19. Statistical scaling of geometric characteristics in stochastically generated pore microstructures

    DOE PAGES

    Hyman, Jeffrey D.; Guadagnini, Alberto; Winter, C. Larrabee

    2015-05-21

    In this study, we analyze the statistical scaling of structural attributes of virtual porous microstructures that are stochastically generated by thresholding Gaussian random fields. Characterization of the extent to which randomly generated pore spaces can be considered representative of a particular rock sample depends on the metrics employed to compare the virtual sample against its physical counterpart. Typically, comparisons against features and/or patterns of geometric observables, e.g., porosity and specific surface area, flow-related macroscopic parameters, e.g., permeability, or autocorrelation functions are used to assess the representativeness of a virtual sample, and thereby the quality of the generation method. Here, we rely on manifestations of statistical scaling of geometric observables which were recently observed in real millimeter-scale rock samples [13] as additional relevant metrics by which to characterize a virtual sample. We explore the statistical scaling of two geometric observables, namely porosity (Φ) and specific surface area (SSA), of porous microstructures generated using the method of Smolarkiewicz and Winter [42] and Hyman and Winter [22]. Our results suggest that the method can produce virtual pore space samples displaying the symptoms of statistical scaling observed in real rock samples. Order-q sample structure functions (statistical moments of absolute increments) of Φ and SSA scale as a power of the separation distance (lag) over a range of lags, and extended self-similarity (a linear relationship between log structure functions of successive orders) appears to be an intrinsic property of the generated media. The width of the range of lags where power-law scaling is observed and the Hurst coefficient associated with the variables we consider can be controlled by the generation parameters of the method.
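
    The structure-function analysis mentioned above can be stated compactly in code. The sketch below computes order-q structure functions S_q(lag) for a one-dimensional signal and estimates the scaling exponent from a log-log fit; the synthetic Brownian-like signal is a placeholder and is not generated by the cited thresholded-Gaussian-field method.

```python
# Order-q structure functions for a 1-D observable (e.g., porosity along a
# transect): S_q(lag) = mean(|x(i+lag) - x(i)|^q). Power-law scaling
# S_q(lag) ~ lag^xi(q) is checked with a log-log fit. The signal is synthetic.
import numpy as np

def structure_function(x, lags, q):
    return np.array([np.mean(np.abs(x[lag:] - x[:-lag]) ** q) for lag in lags])

rng = np.random.default_rng(1)
x = np.cumsum(rng.normal(size=20000))        # Brownian-like signal, Hurst coefficient ~0.5
lags = np.arange(1, 65)

for q in (1, 2, 3):
    s = structure_function(x, lags, q)
    slope = np.polyfit(np.log(lags), np.log(s), 1)[0]   # scaling exponent xi(q)
    print(f"q = {q}: xi(q) ~ {slope:.2f}  (Hurst estimate {slope / q:.2f})")
```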

  20. Sampling for mercury at subnanogram per litre concentrations for load estimation in rivers

    USGS Publications Warehouse

    Colman, J.A.; Breault, R.F.

    2000-01-01

    Estimation of constituent loads in streams requires collection of stream samples that are representative of constituent concentrations, that is, composites of isokinetic multiple verticals collected along a stream transect. An all-Teflon isokinetic sampler (DH-81) cleaned in 75 °C, 4 N HCl was tested using blank, split, and replicate samples to assess systematic and random sample contamination by mercury species. Mean mercury concentrations in field-equipment blanks were low: 0.135 ng/L for total mercury and 0.0086 ng/L for monomethyl mercury (MeHg). Mean square errors (MSE) for total mercury and MeHg duplicate samples collected at eight sampling stations were not statistically different from the MSE of samples split in the laboratory, which represents the analytical and splitting error. Low field-blank concentrations and statistically equal duplicate- and split-sample MSE values indicate that no measurable contamination was occurring during sampling. Standard deviations associated with example mercury load estimations were four to five times larger, on a relative basis, than standard deviations calculated from duplicate samples, indicating that error of the load determination was primarily a function of the loading model used, not of sampling or analytical methods.

  1. Random Sampling of Squamate Reptiles in Spanish Natural Reserves Reveals the Presence of Novel Adenoviruses in Lacertids (Family Lacertidae) and Worm Lizards (Amphisbaenia)

    PubMed Central

    Szirovicza, Leonóra; López, Pilar; Kopena, Renáta; Benkő, Mária; Martín, José; Pénzes, Judit J.

    2016-01-01

    Here, we report the results of a large-scale PCR survey on the prevalence and diversity of adenoviruses (AdVs) in samples collected randomly from free-living reptiles. On the territories of the Guadarrama Mountains National Park in Central Spain and of the Chafarinas Islands in North Africa, cloacal swabs were taken from 318 specimens of eight native species representing five squamate reptilian families. The healthy-looking animals had been captured temporarily for physiological and ethological examinations, after which they were released. We found 22 AdV-positive samples in representatives of three species, all from Central Spain. Sequence analysis of the PCR products revealed the existence of three hitherto unknown AdVs in 11 Carpetane rock lizards (Iberolacerta cyreni), nine Iberian worm lizards (Blanus cinereus), and two Iberian green lizards (Lacerta schreiberi), respectively. Phylogeny inference showed every novel putative virus to be a member of the genus Atadenovirus. This is the very first description of the occurrence of AdVs in amphisbaenian and lacertid hosts. Unlike all squamate atadenoviruses examined previously, two of the novel putative AdVs had A+T rich DNA, a feature generally deemed to mirror previous host switch events. Our results shed new light on the diversity and evolution of atadenoviruses. PMID:27399970

  2. Random Sampling of Squamate Reptiles in Spanish Natural Reserves Reveals the Presence of Novel Adenoviruses in Lacertids (Family Lacertidae) and Worm Lizards (Amphisbaenia).

    PubMed

    Szirovicza, Leonóra; López, Pilar; Kopena, Renáta; Benkő, Mária; Martín, José; Pénzes, Judit J

    2016-01-01

    Here, we report the results of a large-scale PCR survey on the prevalence and diversity of adenoviruses (AdVs) in samples collected randomly from free-living reptiles. On the territories of the Guadarrama Mountains National Park in Central Spain and of the Chafarinas Islands in North Africa, cloacal swabs were taken from 318 specimens of eight native species representing five squamate reptilian families. The healthy-looking animals had been captured temporarily for physiological and ethological examinations, after which they were released. We found 22 AdV-positive samples in representatives of three species, all from Central Spain. Sequence analysis of the PCR products revealed the existence of three hitherto unknown AdVs in 11 Carpetane rock lizards (Iberolacerta cyreni), nine Iberian worm lizards (Blanus cinereus), and two Iberian green lizards (Lacerta schreiberi), respectively. Phylogeny inference showed every novel putative virus to be a member of the genus Atadenovirus. This is the very first description of the occurrence of AdVs in amphisbaenian and lacertid hosts. Unlike all squamate atadenoviruses examined previously, two of the novel putative AdVs had A+T rich DNA, a feature generally deemed to mirror previous host switch events. Our results shed new light on the diversity and evolution of atadenoviruses.

  3. A Critical Assessment of Bias in Survey Studies Using Location-Based Sampling to Recruit Patrons in Bars

    PubMed Central

    Morrison, Christopher; Lee, Juliet P.; Gruenewald, Paul J.; Marzell, Miesha

    2015-01-01

    Location-based sampling is a method to obtain samples of people within ecological contexts relevant to specific public health outcomes. Random selection increases generalizability; however, in some circumstances (such as surveying bar patrons) recruitment conditions increase risks of sample bias. We attempted to recruit representative samples of bars and patrons in six California cities, but low response rates precluded meaningful analysis. A systematic review of 24 similar studies revealed that none addressed the key shortcomings of our study. We recommend steps to improve studies that use location-based sampling: (i) purposively sample places of interest, (ii) utilize recruitment strategies appropriate to the environment, and (iii) provide full information on response rates at all levels of sampling. PMID:26574657

  4. Facebook and Twitter, communication and shelter, and the 2011 Tuscaloosa tornado.

    PubMed

    Stokes, Courtney; Senkbeil, Jason C

    2017-01-01

    This paper represents one of the first attempts to analyse the many ways in which Facebook and Twitter were used during a tornado disaster. Comparisons between five randomly selected campus samples and a city of Tuscaloosa, Alabama, sample revealed that campus samples used Facebook and Twitter significantly more both before and after the tornado, but Facebook usage was not significantly different after the event. Furthermore, differences in social media usage and other forms of communication before the tornado were found for age, education, and years lived in Tuscaloosa. Generally, age and education were inversely proportionate to social media usage. Influences on shelter-seeking actions varied between social media users and three random samples of non-social media users; however, it appears that social media respondents were likely to be using a smartphone simultaneously to access warning polygon information, to receive text message alerts, and to listen or respond to environmental cues. © 2017 The Author(s). Disasters © Overseas Development Institute, 2017.

  5. Method for analyzing microbial communities

    DOEpatents

    Zhou, Jizhong [Oak Ridge, TN; Wu, Liyou [Oak Ridge, TN

    2010-07-20

    The present invention provides a method for quantitatively analyzing microbial genes, species, or strains in a sample that contains at least two species or strains of microorganisms. The method involves using an isothermal DNA polymerase to randomly and representatively amplify genomic DNA of the microorganisms in the sample, hybridizing the resultant polynucleotide amplification product to a polynucleotide microarray that can differentiate different genes, species, or strains of microorganisms of interest, and measuring hybridization signals on the microarray to quantify the genes, species, or strains of interest.

  6. Modeling and Simulation of Upset-Inducing Disturbances for Digital Systems in an Electromagnetic Reverberation Chamber

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2014-01-01

    This report describes a modeling and simulation approach for disturbance patterns representative of the environment experienced by a digital system in an electromagnetic reverberation chamber. The disturbance is modeled by a multi-variate statistical distribution based on empirical observations. Extended versions of the Rejection Sampling and Inverse Transform Sampling techniques are developed to generate multi-variate random samples of the disturbance. The results show that Inverse Transform Sampling returns samples with higher fidelity relative to the empirical distribution. This work is part of an ongoing effort to develop a resilience assessment methodology for complex safety-critical distributed systems.
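
    Of the two techniques named in the abstract, inverse transform sampling is the simpler to illustrate: draw uniform variates and push them through the (empirical) inverse cumulative distribution function of the observed data. The sketch below assumes a stand-in set of observations; it is not the report's extended multi-variate procedure.

```python
# Minimal sketch of empirical inverse transform sampling: uniform draws are
# mapped through the inverse of a CDF estimated from observed data. The
# "observed" disturbance values here are synthetic placeholders, not
# reverberation chamber measurements.
import numpy as np

def inverse_transform_sample(observations, n, rng):
    """Sample n values from the empirical distribution of `observations`."""
    sorted_obs = np.sort(observations)
    probs = np.linspace(0, 1, len(sorted_obs))
    u = rng.uniform(0, 1, n)
    return np.interp(u, probs, sorted_obs)       # empirical inverse CDF

rng = np.random.default_rng(42)
observed = rng.gamma(shape=2.0, scale=3.0, size=500)    # stand-in empirical data
simulated = inverse_transform_sample(observed, 10000, rng)
print("observed mean:", observed.mean(), " simulated mean:", simulated.mean())
```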

  7. An Association between Bullying Behaviors and Alcohol Use among Middle School Students

    ERIC Educational Resources Information Center

    Peleg-Oren, Neta; Cardenas, Gabriel A.; Comerford, Mary; Galea, Sandro

    2012-01-01

    Although a high prevalence of bullying behaviors among adolescents has been documented, little is known about the association between bullying behaviors and alcohol use among perpetrators or victims. This study used data from a representative two-stage cluster random sample of 44,532 middle school adolescents in Florida. We found a high…

  8. The Williamsburg Charter Survey on Religion and Public Life.

    ERIC Educational Resources Information Center

    Williamsburg Charter Foundation, Washington, DC.

    Findings from a survey designed to gauge how U.S. citizens view the place of religion in public life are discussed. A total of 1,889 adults were interviewed at random by telephone for the cross-sectional sample. Additional interviews were conducted with more than 300 teenagers and with 7 leadership groups representing business, higher education,…

  9. Assessing Changes in Socioemotional Adjustment across Early School Transitions--New National Scales for Children at Risk

    ERIC Educational Resources Information Center

    McDermott, Paul A.; Watkins, Marley W.; Rovine, Michael J.; Rikoon, Samuel H.

    2013-01-01

    This article reports the development and evidence for validity and application of the Adjustment Scales for Early Transition in Schooling (ASETS). Based on primary analyses of data from the Head Start Impact Study, a nationally representative sample (N = 3077) of randomly selected children from low-income households is configured to inform…

  10. New Directions in Apprentice Selection: Self-Perceived "On the Job" Literacy (Reading) Demands of Apprentices.

    ERIC Educational Resources Information Center

    Edwards, Peter; Gould, Warren

    A study investigated the self-perceived, on-the-job literacy tasks of electrical mechanic apprentices in Victoria, Australia. A random sample of 401 apprentices from 19 locations representing all levels of apprenticeship training were questioned about their reading needs and the consequences of making a reading error in their work. Data were…

  11. A Nationwide Epidemiologic Modeling Study of LD: Risk, Protection, and Unintended Impact

    ERIC Educational Resources Information Center

    McDermott, Paul A.; Goldberg, Michelle M.; Watkins, Marley W.; Stanley, Jeanne L.; Glutting, Joseph J.

    2006-01-01

    Through multiple logistic regression modeling, this article explores the relative importance of risk and protective factors associated with learning disabilities (LD). A representative national sample of 6- to 17-year-old students (N = 1,268) was drawn by random stratification and classified by the presence versus absence of LD in reading,…

  12. Mothers' Antenatal Depression and Their Children's Antisocial Outcomes

    ERIC Educational Resources Information Center

    Hay, Dale F.; Pawlby, Susan; Waters, Cerith S.; Perra, Oliver; Sharp, Deborah

    2010-01-01

    Interviews of 120 British adolescents and their parents (80% of a random sample of antenatal patients drawn from a representative urban population and followed longitudinally) revealed that 40 (33%) had been arrested and/or had a diagnosis of "DSM-IV" conduct disorder by 16 years of age; of those, 18 (45%) had committed violent acts.…

  13. Bridging the Rural-Urban Literacy Gap in China: A Mediation Analysis of Family Effects

    ERIC Educational Resources Information Center

    Wang, Jingying; Li, Hui; Wang, Dan

    2018-01-01

    This study examines the effects of family involvement on the literacy gap between rural and urban Chinese primary students via mediation analysis. Altogether, 1080 students in Grades 1, 3, and 5 were randomly sampled from three urban and three rural primary schools from Shandong and Guizhou Provinces, representing eastern and western China,…

  14. Modeling Signal-Noise Processes Supports Student Construction of a Hierarchical Image of Sample

    ERIC Educational Resources Information Center

    Lehrer, Richard

    2017-01-01

    Grade 6 (modal age 11) students invented and revised models of the variability generated as each measured the perimeter of a table in their classroom. To construct models, students represented variability as a linear composite of true measure (signal) and multiple sources of random error. Students revised models by developing sampling…

  15. National Study of Emotional and Perceptional Changes Since September 11

    ERIC Educational Resources Information Center

    Seo, Dong-Chul; Torabi, Mohammad R.

    2004-01-01

    This study examined emotional and perceptional changes American people had experienced 10 to 12 months after the September 11 (9-11) terrorist attacks. A nationally representative sample of 807 U.S. adults ages 18 or older was interviewed using random-digit dialing that included unpublished numbers and new listings. The results indicated that 5 to…

  16. What Affects People's Willingness to Participate in Qualitative Research? An Experimental Comparison of Five Incentives

    ERIC Educational Resources Information Center

    Kelly, Bridget; Margolis, Marjorie; McCormack, Lauren; LeBaron, Patricia A.; Chowdhury, Dhuly

    2017-01-01

    The literature on factors that influence participation in qualitative research is lacking. We conducted an experiment with a nationally representative sample to test the impact of different incentive types and amounts on willingness to participate in a hypothetical qualitative interview. We randomized participants from an online panel to one of…

  17. Accounting for selection bias in association studies with complex survey data.

    PubMed

    Wirth, Kathleen E; Tchetgen Tchetgen, Eric J

    2014-05-01

    Obtaining representative information from hidden and hard-to-reach populations is fundamental to describe the epidemiology of many sexually transmitted diseases, including HIV. Unfortunately, simple random sampling is impractical in these settings, as no registry of names exists from which to sample the population at random. However, complex sampling designs can be used, as members of these populations tend to congregate at known locations, which can be enumerated and sampled at random. For example, female sex workers may be found at brothels and street corners, whereas injection drug users often come together at shooting galleries. Despite the logistical appeal, complex sampling schemes lead to unequal probabilities of selection, and failure to account for this differential selection can result in biased estimates of population averages and relative risks. However, standard techniques to account for selection can lead to substantial losses in efficiency. Consequently, researchers implement a variety of strategies in an effort to balance validity and efficiency. Some researchers fully or partially account for the survey design, whereas others do nothing and treat the sample as a realization of the population of interest. We use directed acyclic graphs to show how certain survey sampling designs, combined with subject-matter considerations unique to individual exposure-outcome associations, can induce selection bias. Finally, we present a novel yet simple maximum likelihood approach for analyzing complex survey data; this approach optimizes statistical efficiency at no cost to validity. We use simulated data to illustrate this method and compare it with other analytic techniques.
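
    The practical consequence of unequal selection probabilities is easy to demonstrate: if one stratum of a hidden population is over-sampled, an unweighted estimate is pulled toward that stratum, while an inverse-probability (Horvitz-Thompson-type) weighted estimate is not. The sketch below uses invented strata and selection probabilities and is not the maximum likelihood approach proposed in the paper.

```python
# Prevalence estimated once ignoring the design and once with sampling weights
# (1 / selection probability). Strata, sizes, and probabilities are invented.
import numpy as np

rng = np.random.default_rng(7)

strata = [
    {"N": 9000, "prevalence": 0.05, "p_select": 0.01},   # rarely sampled venues
    {"N": 1000, "prevalence": 0.40, "p_select": 0.20},   # heavily over-sampled venues
]

y, w = [], []
for s in strata:
    n = rng.binomial(s["N"], s["p_select"])               # number sampled from this stratum
    y.append(rng.binomial(1, s["prevalence"], n))         # outcome indicator per subject
    w.append(np.full(n, 1.0 / s["p_select"]))              # sampling weights
y, w = np.concatenate(y), np.concatenate(w)

true = sum(s["N"] * s["prevalence"] for s in strata) / sum(s["N"] for s in strata)
print("true prevalence:        ", round(true, 3))           # 0.085
print("unweighted estimate:    ", round(y.mean(), 3))        # biased toward over-sampled stratum
print("weighted (HT) estimate: ", round(np.average(y, weights=w), 3))
```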

  18. Hog Charm II tetracycline test screening results compared with a liquid chromatography tandem mass spectrometry 10-μg/kg method.

    PubMed

    Salter, Robert; Holmes, Steven; Legg, David; Coble, Joel; George, Bruce

    2012-02-01

    Pork tissue samples that tested positive and negative by the Charm II tetracycline test screening method in the slaughter plant laboratory were tested with the modified AOAC International liquid chromatography tandem mass spectrometry (LC-MS-MS) method 995.09 to determine the predictive value of the screening method at detecting total tetracyclines at 10 μg/kg of tissue, in compliance with Russian import regulations. There were 218 presumptive-positive tetracycline samples of 4,195 randomly tested hogs. Of these screening test positive samples, 83% (182) were positive, >10 μg/kg by LC-MS-MS; 12.8% (28) were false violative, greater than limit of detection (LOD) but <10 μg/kg; and 4.2% (8) were not detected at the LC-MS-MS LOD. The 36 false-violative and not-detected samples represent 1% of the total samples screened. Twenty-seven of 30 randomly selected tetracycline screening negative samples tested below the LC-MS-MS LOD, and 3 samples tested <3 μg/kg chlortetracycline. Results indicate that the Charm II tetracycline test is effective at predicting hogs containing >10 μg/kg total tetracyclines in compliance with Russian import regulations.

  19. Generalizability of causal inference in observational studies under retrospective convenience sampling.

    PubMed

    Hu, Zonghui; Qin, Jing

    2018-05-20

    Many observational studies adopt what we call retrospective convenience sampling (RCS). With the sample size in each arm prespecified, RCS randomly selects subjects from the treatment-inclined subpopulation into the treatment arm and those from the control-inclined into the control arm. Samples in each arm are representative of the respective subpopulation, but the proportion of the 2 subpopulations is usually not preserved in the sample data. We show in this work that, under RCS, existing causal effect estimators actually estimate the treatment effect over the sample population instead of the underlying study population. We investigate how to correct existing methods for consistent estimation of the treatment effect over the underlying population. Although RCS is adopted in medical studies for ethical and cost-effective purposes, it also has a big advantage for statistical inference: When the tendency to receive treatment is low in a study population, treatment effect estimators under RCS, with proper correction, are more efficient than their parallels under random sampling. These properties are investigated both theoretically and through numerical demonstration. Published 2018. This article is a U.S. Government work and is in the public domain in the USA.
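
    The distinction the abstract draws between the sample population and the underlying study population can be made concrete with a toy calculation. In the sketch below, the treatment effect differs between the treatment-inclined and control-inclined subpopulations; with equal arm sizes the sample mixes them 50/50, so an effect averaged over the sample differs from the effect over the population unless it is re-weighted. The numbers are invented, and this is only a schematic illustration, not the authors' corrected estimator.

```python
# Schematic illustration of the sample-population vs study-population target
# under retrospective convenience sampling. All numbers are invented.

tau_treat_inclined = 4.0      # treatment effect in the treatment-inclined subpopulation
tau_ctrl_inclined = 1.0       # treatment effect in the control-inclined subpopulation

# With equal arm sizes fixed by design, each subpopulation contributes ~half the sample.
sample_effect = 0.5 * tau_treat_inclined + 0.5 * tau_ctrl_inclined

# In the underlying study population only 10% are treatment-inclined.
p_treat_inclined = 0.10
population_effect = (p_treat_inclined * tau_treat_inclined
                     + (1 - p_treat_inclined) * tau_ctrl_inclined)

print("effect over the sample population:    ", sample_effect)      # 2.5
print("effect over the underlying population:", population_effect)  # 1.3
```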

  20. Marketing Norm Perception Among Medical Representatives in Indian Pharmaceutical Industry

    PubMed Central

    Nagashekhara, Molugulu; Agil, Syed Omar Syed; Ramasamy, Ravindran

    2012-01-01

    The study of marketing norm perception among medical representatives is an under-explored topic that deserves further attention in the pharmaceutical industry. The purpose of this study is to determine how medical representatives perceive marketing norms. The research design is a quantitative, cross-sectional study with medical representatives as the unit of analysis. Data were collected from medical representatives (n=300) using simple random and cluster sampling and a structured questionnaire. The results indicate no difference in the perception of marketing norms between male and female medical representatives, but there is a difference in opinion between medical representatives of domestic and multinational companies. Educational background also shows differences in opinion among medical representatives: degree holders and medical representatives of multinational companies perceive marketing norms more strongly than their counterparts. The researchers strongly believe that mandatory training on marketing norms would benefit decision making during dilemmas in the sales field. PMID:24826035

  1. Improvement of the Work Environment and Work-Related Stress: A Cross-Sectional Multilevel Study of a Nationally Representative Sample of Japanese Workers.

    PubMed

    Watanabe, Kazuhiro; Tabuchi, Takahiro; Kawakami, Norito

    2017-03-01

    This cross-sectional multilevel study aimed to investigate the relationship between improvement of the work environment and work-related stress in a nationally representative sample in Japan. The study was based on a national survey that randomly sampled 1745 worksites and 17,500 nested employees. The survey asked the worksites whether improvements of the work environment were conducted; and it asked the employees to report the number of work-related stresses they experienced. Multilevel multinominal logistic and linear regression analyses were conducted. Improvement of the work environment was not significantly associated with any level of work-related stress. Among men, it was significantly and negatively associated with the severe level of work-related stress. The association was not significant among women. Improvements to work environments may be associated with reduced work-related stress among men nationwide in Japan.

  2. The effects of sample scheduling and sample numbers on estimates of the annual fluxes of suspended sediment in fluvial systems

    USGS Publications Warehouse

    Horowitz, Arthur J.; Clarke, Robin T.; Merten, Gustavo Henrique

    2015-01-01

    Since the 1970s, there has been both continuing and growing interest in developing accurate estimates of the annual fluvial transport (fluxes and loads) of suspended sediment and sediment-associated chemical constituents. This study provides an evaluation of the effects of manual sample numbers (from 4 to 12 year⁻¹) and sample scheduling (random-based, calendar-based and hydrology-based) on the precision, bias and accuracy of annual suspended sediment flux estimates. The evaluation is based on data from selected US Geological Survey daily suspended sediment stations in the USA and covers basins ranging in area from just over 900 km² to nearly 2 million km² and annual suspended sediment fluxes ranging from about 4 Kt year⁻¹ to about 200 Mt year⁻¹. The results appear to indicate that there is a scale effect for random-based and calendar-based sampling schemes, with larger sample numbers required as basin size decreases. All the sampling schemes evaluated display some level of positive (overestimates) or negative (underestimates) bias. The study further indicates that hydrology-based sampling schemes are likely to generate the most accurate annual suspended sediment flux estimates with the fewest number of samples, regardless of basin size. This type of scheme seems most appropriate when the determination of suspended sediment concentrations, sediment-associated chemical concentrations, annual suspended sediment and annual suspended sediment-associated chemical fluxes only represent a few of the parameters of interest in multidisciplinary, multiparameter monitoring programmes. The results are just as applicable to the calibration of autosamplers/suspended sediment surrogates currently used to measure/estimate suspended sediment concentrations and ultimately, annual suspended sediment fluxes, because manual samples are required to adjust the sample data/measurements generated by these techniques so that they provide depth-integrated and cross-sectionally representative data.
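
    The influence of sample numbers on flux precision is straightforward to explore by simulation. The sketch below subsamples a synthetic daily flow and concentration record on n randomly scheduled days per year and summarises the spread of the resulting annual flux estimates; it illustrates precision only and does not reproduce the study's calendar- or hydrology-based schemes or its bias findings.

```python
# Illustrative simulation (not the study's procedure): annual suspended
# sediment flux is estimated from n randomly scheduled manual samples of a
# synthetic daily record, and the spread of the estimates shrinks as n grows.
import numpy as np

rng = np.random.default_rng(11)
days = 365
flow = rng.lognormal(mean=3.0, sigma=0.8, size=days)          # synthetic daily discharge
conc = 5.0 * flow ** 0.5 * rng.lognormal(0, 0.3, size=days)   # concentration rises with flow
daily_flux = flow * conc
true_annual_flux = daily_flux.sum()

for n_samples in (4, 6, 12):
    estimates = []
    for _ in range(2000):
        idx = rng.choice(days, size=n_samples, replace=False)  # random-based scheduling
        estimates.append(daily_flux[idx].mean() * days)
    rel_err = (np.array(estimates) - true_annual_flux) / true_annual_flux
    print(f"n = {n_samples:2d}: mean relative error {rel_err.mean():+.2f}, "
          f"spread (SD) {rel_err.std():.2f}")
```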

  3. Self-reference and random sampling approach for label-free identification of DNA composition using plasmonic nanomaterials.

    PubMed

    Freeman, Lindsay M; Pang, Lin; Fainman, Yeshaiahu

    2018-05-09

    The analysis of DNA has led to revolutionary advancements in the fields of medical diagnostics, genomics, prenatal screening, and forensic science, with the global DNA testing market expected to reach revenues of USD 10.04 billion per year by 2020. However, the current methods for DNA analysis remain dependent on the necessity for fluorophores or conjugated proteins, leading to high costs associated with consumable materials and manual labor. Here, we demonstrate a potential label-free DNA composition detection method using surface-enhanced Raman spectroscopy (SERS) in which we identify the composition of cytosine and adenine within single strands of DNA. This approach depends on the fact that there is one phosphate backbone per nucleotide, which we use as a reference to compensate for systematic measurement variations. We utilize plasmonic nanomaterials with random Raman sampling to perform label-free detection of the nucleotide composition within DNA strands, generating a calibration curve from standard samples of DNA and demonstrating the capability of resolving the nucleotide composition. The work represents an innovative way for detection of the DNA composition within DNA strands without the necessity of attached labels, offering a highly sensitive and reproducible method that factors in random sampling to minimize error.

  4. Quantifying errors without random sampling.

    PubMed

    Phillips, Carl V; LaPole, Luwanna M

    2003-06-12

    All quantifications of mortality, morbidity, and other health measures involve numerous sources of error. The routine quantification of random sampling error makes it easy to forget that other sources of error can and should be quantified. When a quantification does not involve sampling, error is almost never quantified and results are often reported in ways that dramatically overstate their precision. We argue that the precision implicit in typical reporting is problematic and sketch methods for quantifying the various sources of error, building up from simple examples that can be solved analytically to more complex cases. There are straightforward ways to partially quantify the uncertainty surrounding a parameter that is not characterized by random sampling, such as limiting reported significant figures. We present simple methods for doing such quantifications, and for incorporating them into calculations. More complicated methods become necessary when multiple sources of uncertainty must be combined. We demonstrate that Monte Carlo simulation, using available software, can estimate the uncertainty resulting from complicated calculations with many sources of uncertainty. We apply the method to the current estimate of the annual incidence of foodborne illness in the United States. Quantifying uncertainty from systematic errors is practical. Reporting this uncertainty would more honestly represent study results, help show the probability that estimated values fall within some critical range, and facilitate better targeting of further research.
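
    The Monte Carlo approach described above amounts to drawing each uncertain input from a distribution that expresses its (non-sampling) uncertainty and propagating the draws through the calculation. The sketch below does this for a hypothetical incidence estimate; the inputs and distributions are placeholders, not the paper's actual foodborne illness inputs.

```python
# Minimal Monte Carlo propagation of non-sampling uncertainty: each input is
# drawn from a distribution expressing its uncertainty and the spread of the
# resulting estimate is reported. All inputs are invented placeholders.
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

reported_cases = rng.normal(50_000, 2_000, n)          # hypothetical registry count
underreporting = rng.triangular(5, 10, 20, n)          # hypothetical expert-judgment multiplier
attributable = rng.uniform(0.2, 0.4, n)                # hypothetical attributable fraction

estimate = reported_cases * underreporting * attributable

lo, mid, hi = np.percentile(estimate, [2.5, 50, 97.5])
print(f"median ~ {mid:,.0f}, 95% uncertainty interval ~ ({lo:,.0f}, {hi:,.0f})")
```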

  5. Relation between modern pollen rain, vegetation and climate in northern China: Implications for quantitative vegetation reconstruction in a steppe environment.

    PubMed

    Ge, Yawen; Li, Yuecong; Bunting, M Jane; Li, Bing; Li, Zetao; Wang, Junting

    2017-05-15

    Vegetation reconstructions from palaeoecological records depend on adequate understanding of relationships between modern pollen, vegetation and climate. A key parameter for quantitative vegetation reconstructions is the Relative Pollen Productivity (RPP). Differences in both environmental and methodological factors are known to alter the estimated RPP significantly, making it difficult to determine whether the underlying pollen productivity does actually vary, and if so, why. In this paper, we present the results of a replication study for the Bashang steppe region, a typical steppe area in northern China, carried out in 2013 and 2014. In each year, 30 surface samples were collected for pollen analysis, with accompanying vegetation survey using the "Crackles Bequest Project" methodology. Sampling designs differed slightly between the two years: in 2013, sites were located completely randomly, whilst in 2014 sampling locations were constrained to be within a few km of roads. There is strong inter-annual variability in both the pollen and vegetation spectra, and therefore in RPPs; annual precipitation may be a key influence on these variations. The pollen assemblages in both years are dominated by herbaceous taxa such as Artemisia, Amaranthaceae, Poaceae, Asteraceae, Cyperaceae, Fabaceae and Allium. Artemisia and Amaranthaceae pollen are significantly over-represented relative to their vegetation abundance. Poaceae, Cyperaceae and Fabaceae seem to be under-represented in pollen relative to vegetation, with correspondingly lower RPPs. Asteraceae seems to be well represented, with moderate RPPs and less annual variation. The estimated Relevant Source Area of Pollen (RSAP) ranges from 2000 to 3000 m. Different sampling designs have an effect on both RSAP and RPPs, and random sample selection may be the best strategy for obtaining robust estimates. Our results have implications for further pollen-vegetation relationship and quantitative vegetation reconstruction research in typical steppe areas and in other open habitats with strong inter-annual variation. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Improving response rate and quality of survey data with a scratch lottery ticket incentive

    PubMed Central

    2012-01-01

    Background The quality of data collected in survey research is usually indicated by the response rate, the representativeness of the sample, and the rate of completed questions (item-response). In attempting to improve a generally declining response rate in surveys, considerable efforts are being made through follow-up mailings and various types of incentives. This study examines the effects of including a scratch lottery ticket in the invitation letter to a survey. Method Questionnaires concerning oral health were mailed to a random sample of 2,400 adults. A systematically selected half of the sample (1,200 adults) received a questionnaire including a scratch lottery ticket. One reminder without the incentive was sent. Results The incentive increased the response rate and improved representativeness by reaching more respondents with lower education. Furthermore, it reduced item nonresponse. The initial incentive had no effect on the propensity to respond after the reminder. Conclusion When attempting to improve survey data, three issues become important: response rate, representativeness, and item-response. This study shows that including a scratch lottery ticket in the invitation letter performs well on all three. PMID:22515335

  7. Evaluation of the NCPDP Structured and Codified Sig Format for e-prescriptions

    PubMed Central

    Burkhart, Q; Bell, Douglas S

    2011-01-01

    Objective To evaluate the ability of the structure and code sets specified in the National Council for Prescription Drug Programs Structured and Codified Sig Format to represent ambulatory electronic prescriptions. Design We parsed the Sig strings from a sample of 20 161 de-identified ambulatory e-prescriptions into variables representing the fields of the Structured and Codified Sig Format. A stratified random sample of these representations was then reviewed by a group of experts. For codified Sig fields, we attempted to map the actual words used by prescribers to the equivalent terms in the designated terminology. Measurements Proportion of prescriptions that the Format could fully represent; proportion of terms used that could be mapped to the designated terminology. Results The fields defined in the Format could fully represent 95% of Sigs (95% CI 93% to 97%), but ambiguities were identified, particularly in representing multiple-step instructions. The terms used by prescribers could be codified for only 60% of dose delivery methods, 84% of dose forms, 82% of vehicles, 95% of routes, 70% of sites, 33% of administration timings, and 93% of indications. Limitations The findings are based on a retrospective sample of ambulatory prescriptions derived mostly from primary care physicians. Conclusion The fields defined in the Format could represent most of the patient instructions in a large prescription sample, but prior to its mandatory adoption, further work is needed to ensure that potential ambiguities are addressed and that a complete set of terms is available for the codified fields. PMID:21613642

  8. Different hunting strategies select for different weights in red deer.

    PubMed

    Martínez, María; Rodríguez-Vigal, Carlos; Jones, Owen R; Coulson, Tim; San Miguel, Alfonso

    2005-09-22

    Much insight can be derived from records of shot animals. Most researchers using such data assume that their data represents a random sample of a particular demographic class. However, hunters typically select a non-random subset of the population and hunting is, therefore, not a random process. Here, with red deer (Cervus elaphus) hunting data from a ranch in Toledo, Spain, we demonstrate that data collection methods have a significant influence upon the apparent relationship between age and weight. We argue that a failure to correct for such methodological bias may have significant consequences for the interpretation of analyses involving weight or correlated traits such as breeding success, and urge researchers to explore methods to identify and correct for such bias in their data.

  9. On the importance of incorporating sampling weights in ...

    EPA Pesticide Factsheets

    Occupancy models are used extensively to assess wildlife-habitat associations and to predict species distributions across large geographic regions. Occupancy models were developed as a tool to properly account for imperfect detection of a species. Current guidelines on survey design requirements for occupancy models focus on the number of sample units and the pattern of revisits to a sample unit within a season. We focus on the sampling design, or how the sample units are selected in geographic space (e.g., stratified, simple random, unequal probability, etc.). In a probability design, each sample unit has a sample weight which quantifies the number of sample units it represents in the finite (oftentimes areal) sampling frame. We demonstrate the importance of including sampling weights in occupancy model estimation when the design is not a simple random sample or equal probability design. We assume a finite areal sampling frame as proposed for a national bat monitoring program. We compare several unequal and equal probability designs and varying sampling intensity within a simulation study. We found the traditional single season occupancy model produced biased estimates of occupancy and lower confidence interval coverage rates compared to occupancy models that accounted for the sampling design. We also discuss how our findings inform the analyses proposed for the nascent North American Bat Monitoring Program and other collaborative synthesis efforts that propose h

  10. Allelic variability in species and stocks of Lake Superior ciscoes (Coregoninae)

    USGS Publications Warehouse

    Todd, Thomas N.

    1981-01-01

    Starch gel electrophoresis was used as a means of recognizing species and stocks in Lake Superior Coregonus. Allelic variability at isocitrate dehydrogenase and glycerol-3-phosphate dehydrogenase loci was recorded for samples of lake herring (Coregonus artedii), bloater (C. hoyi), kiyi (C. kiyi), and shortjaw cisco (C. zenithicus) from five Lake Superior localities. The observed frequencies of genotypes within each subsample did not differ significantly from those expected on the basis of random mating, and suggested that each subsample represented either a random sample from a larger randomly mating population or an independent and isolated subpopulation within which mating was random. Significant contingency χ² values for comparisons between both localities and species suggested that more than one randomly mating population occurred among the Lake Superior ciscoes, but did not reveal how many such populations there were. In contrast to the genetic results of this study, morphology seems to be a better descriptor of cisco stocks, and identification of cisco stocks and species will still have to be based on morphological criteria until more data are forthcoming. Where several species are sympatric, management should strive to preserve the least abundant. Failure to do so could result in the extinction or depletion of the rarer forms.

  11. Using an Informative Missing Data Model to Predict the Ability to Assess Recovery of Balance Control after Spaceflight

    NASA Technical Reports Server (NTRS)

    Feiveson, Alan H.; Wood, Scott J.; Jain, Varsha

    2008-01-01

    Astronauts show degraded balance control immediately after spaceflight. To assess this change, astronauts' ability to maintain a fixed stance under several challenging stimuli on a movable platform is quantified by "equilibrium" scores (EQs) on a scale of 0 to 100, where 100 represents perfect control (sway angle of 0) and 0 represents data loss where no sway angle is observed because the subject has to be restrained from falling. By comparing post- to pre-flight EQs for actual astronauts vs. controls, we built a classifier for deciding when an astronaut has recovered. Future diagnostic performance depends both on the sampling distribution of the classifier as well as the distribution of its input data. Taking this into consideration, we constructed a predictive ROC by simulation after modeling P(EQ = 0) in terms of a latent EQ-like beta-distributed random variable with random effects.

  12. Parent-Client Participation in the Bilingual Education Program in St. Croix, United States Virgin Islands.

    ERIC Educational Resources Information Center

    Padgett, Carmen H. A.

    A study of a St. Croix bilingual education program looked at parent involvement from the program's beginning to the present and at parent recommendations for more meaningful involvement. A random sample of parents representing students at all grade levels was drawn from school records. The parents were surveyed by questionnaire, and interviewed.…

  13. Student Characteristics as Compared to the Community Profile of Fall 1993. Volume XXIII, No. 4.

    ERIC Educational Resources Information Center

    Lucas, John A.; Meltesen, Cal

    A study was conducted at William Rainey Harper College (WRHC) in Palatine, Illinois, to develop a profile of fall 1993 students, compare student and community demographic data, and determine the percentage of various community sub-groups served by the college. A random sample of 500 degree-credit students (representing 3.2% of the 15,518 students…

  14. Analysis of Teachers' Adoption of Technology for Use in Instruction in Seven Career and Technical Education Programs

    ERIC Educational Resources Information Center

    Kotrlik, Joe W.; Redmann, Donna H.

    2009-01-01

    This study addressed utilization of technology in instruction by secondary career and technical education (CTE) teachers in seven program areas in Louisiana. A stratified random sample was utilized, with 539 teachers responding to the survey after three data collection efforts. The data were determined to be representative of all CTE teachers in…

  15. "The Knowledge of" Counselors in Balqa Governorate: Behavior Modification Strategies in Light of Some of the Variables

    ERIC Educational Resources Information Center

    Al-basel, D-Nagham Mohammad Abu

    2013-01-01

    The present study aimed to identify the extent of counselors' knowledge of behavior modification strategies. The study sample consisted of 80 counselors and guides, selected randomly from among all workers enrolled in regular public schools in the Balqa governorate, which represented the study population for the academic year 2012-2013. The study…

  16. An Empirical Study about China: Gender Equity in Science Education.

    ERIC Educational Resources Information Center

    Wang, Jianjun; Staver, John R.

    A data base representing a random sample of more than 10,000 grade 9 students in an SISS (Second IEA Science Study) Extended Study (SES), a key project supported by the China State Commission of Education in the late 1980s, was employed in this study to investigate gender equity in student science achievement in China. This empirical data analysis…

  17. Student Characteristics as Compared to the Community Profile--1976-1977. Vol. IX, No. 1.

    ERIC Educational Resources Information Center

    Lucas, John A.

    In order to provide a fall 1976 student profile and to determine the college's market penetration, William Rainey Harper College drew a random sample of 400 traditional credit and 194 continuing education students, representing respectively 3.2% and 2.4% of the total student population. Traditional credit students averaged 27 years of age; 57.8%…

  18. Correlates to Human Papillomavirus Vaccination Status and Willingness to Vaccinate in Low-Income Philadelphia High School Students

    ERIC Educational Resources Information Center

    Bass, Sarah B.; Leader, Amy; Shwarz, Michelle; Greener, Judith; Patterson, Freda

    2015-01-01

    Background: Little is known about the correlates of human papillomavirus (HPV) vaccination or willingness to be vaccinated in urban, minority adolescents. Methods: Using responses to the 2013 Youth Risk Behavior Survey in Philadelphia, a random sample of high schools provided weighted data representing 20,941 9th to 12th graders. Stratified by…

  19. Rationales of a Shift towards Knowledge Economy in Jordan from the Viewpoint of Educational Experts and Relationship with Some Variables

    ERIC Educational Resources Information Center

    Al Zboon, Mohammad Saleem; Al Ahmad, Suliman Diab Ali; Al Zboon, Saleem Odeh

    2009-01-01

    The purpose of the present study was to identify rationales underlying a shift towards knowledge economy in education as perceived by the educational experts in Jordan and the relationship with some variables. The stratified random sample (n = 90) consisted of educational experts representing faculty members in the Jordanian universities and top leaders…

  20. Trending toward Reform: Teachers Speak on Unions and the Future of the Profession. Education Sector Reports

    ERIC Educational Resources Information Center

    Rosenberg, Sarah; Silva, Elena

    2012-01-01

    Over the past decade, teachers have seen changes in both their conditions of employment--from pay to retirement benefits--and their practice. Far too often, these policies have been made by people who talk "about" teachers, rather than talking "to" them. Last fall, Education Sector surveyed a nationally representative random sample of more than…

  1. Dynamic laser speckle analyzed considering inhomogeneities in the biological sample

    NASA Astrophysics Data System (ADS)

    Braga, Roberto A.; González-Peña, Rolando J.; Viana, Dimitri Campos; Rivera, Fernando Pujaico

    2017-04-01

    Dynamic laser speckle phenomenon allows a contactless and nondestructive way to monitor biological changes that are quantified by second-order statistics applied to the images in time using a secondary matrix known as the time history of the speckle pattern (THSP). To avoid being time consuming, the traditional way to build the THSP restricts the data to a line or column. Our hypothesis is that this spatial restriction of the information could compromise the results, particularly when undesirable and unexpected optical inhomogeneities occur, such as in cell culture media. We tested a spatially random approach to collect the points that form a THSP. Cells in a culture medium and drying paint, representing samples with different levels of homogeneity, were tested, and a comparison with the traditional method was carried out. An alternative random selection based on a Gaussian distribution around a desired position was also presented. The results showed that the traditional protocol presented higher variation than the outcomes of the random method. The higher the inhomogeneity of the activity map, the higher the efficiency of the proposed method using random points. The Gaussian distribution proved to be useful when there was a well-defined area to monitor.
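
    To make the three pixel-selection strategies mentioned above concrete, the following sketch builds a THSP from a stack of speckle images using a fixed column (traditional), uniformly random points, and Gaussian-distributed points around a chosen position. The image stack, its dimensions and the spread parameter are fabricated assumptions for illustration only.

```python
# Illustrative sketch of three ways to pick pixels for a THSP (time history of
# the speckle pattern) from a stack of speckle images. Array shapes and
# parameters are assumptions for demonstration only.
import numpy as np

rng = np.random.default_rng(42)
n_frames, height, width = 128, 256, 256
stack = rng.integers(0, 256, size=(n_frames, height, width))  # fake image stack

n_points = height  # number of tracked pixels (one THSP row per pixel)

def thsp_from_points(stack, rows, cols):
    """THSP: each row follows one pixel's intensity through time (columns)."""
    return stack[:, rows, cols].T  # shape (n_points, n_frames)

# (a) traditional: a single fixed column of the image
rows_a = np.arange(height)
cols_a = np.full(height, width // 2)

# (b) spatially random: uniformly sampled pixel coordinates
rows_b = rng.integers(0, height, n_points)
cols_b = rng.integers(0, width, n_points)

# (c) Gaussian around a desired position (e.g., a region-of-interest centre)
centre, sigma = (height // 2, width // 2), 20.0
rows_c = np.clip(rng.normal(centre[0], sigma, n_points).round(), 0, height - 1).astype(int)
cols_c = np.clip(rng.normal(centre[1], sigma, n_points).round(), 0, width - 1).astype(int)

for name, r, c in [("column", rows_a, cols_a), ("uniform", rows_b, cols_b), ("gaussian", rows_c, cols_c)]:
    thsp = thsp_from_points(stack, r, c)
    print(name, thsp.shape)
```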

  2. Neighborhood sampling: how many streets must an auditor walk?

    PubMed

    McMillan, Tracy E; Cubbin, Catherine; Parmenter, Barbara; Medina, Ashley V; Lee, Rebecca E

    2010-03-12

    This study tested the representativeness of four street segment sampling protocols using the Pedestrian Environment Data Scan (PEDS) in eleven neighborhoods surrounding public housing developments in Houston, TX. The following four street segment sampling protocols were used: (1) all segments, both residential and arterial, contained within the 400 meter radius buffer from the center point of the housing development (the core) were compared with all segments contained between the 400 meter radius buffer and the 800 meter radius buffer (the ring); all residential segments in the core were compared with (2) 75%, (3) 50%, and (4) 25% samples of randomly selected residential street segments in the core. Analyses were conducted on five key variables: sidewalk presence; ratings of attractiveness and safety for walking; connectivity; and number of traffic lanes. Some differences were found when comparing all street segments, both residential and arterial, in the core to the ring. Findings suggested that sampling 25% of residential street segments within the 400 m radius of a residence sufficiently represents the pedestrian built environment. Conclusions support more cost effective environmental data collection for physical activity research.

  3. Neighborhood sampling: how many streets must an auditor walk?

    PubMed Central

    2010-01-01

    This study tested the representativeness of four street segment sampling protocols using the Pedestrian Environment Data Scan (PEDS) in eleven neighborhoods surrounding public housing developments in Houston, TX. The following four street segment sampling protocols were used: (1) all segments, both residential and arterial, contained within the 400 meter radius buffer from the center point of the housing development (the core) were compared with all segments contained between the 400 meter radius buffer and the 800 meter radius buffer (the ring); all residential segments in the core were compared with (2) 75%, (3) 50%, and (4) 25% samples of randomly selected residential street segments in the core. Analyses were conducted on five key variables: sidewalk presence; ratings of attractiveness and safety for walking; connectivity; and number of traffic lanes. Some differences were found when comparing all street segments, both residential and arterial, in the core to the ring. Findings suggested that sampling 25% of residential street segments within the 400 m radius of a residence sufficiently represents the pedestrian built environment. Conclusions support more cost effective environmental data collection for physical activity research. PMID:20226052
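
    A minimal sketch of the kind of check this protocol comparison implies: draw a 25% simple random sample of residential segments and compare it to the full core on a few audit variables. The segment records and field names below are fabricated for illustration; they are not PEDS data.

```python
# Minimal sketch: draw a 25% simple random sample of residential street
# segments and compare it to the full set on a few audit variables.
# The segment data and field names here are fabricated for illustration.
import random
import statistics

random.seed(1)

# Hypothetical audit records: one dict per residential segment in the core.
segments = [
    {"sidewalk": random.random() < 0.7,      # sidewalk present?
     "traffic_lanes": random.choice([1, 2, 2, 4]),
     "attractiveness": random.randint(1, 5)}
    for _ in range(200)
]

sample = random.sample(segments, k=len(segments) // 4)  # 25% sample

def summarize(segs, label):
    print(label,
          "sidewalk %%: %.1f" % (100 * statistics.mean(s["sidewalk"] for s in segs)),
          "lanes: %.2f" % statistics.mean(s["traffic_lanes"] for s in segs),
          "attractiveness: %.2f" % statistics.mean(s["attractiveness"] for s in segs))

summarize(segments, "all core segments:")
summarize(sample, "25% sample:      ")
```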

  4. Physicochemical equivalence of generic antihypertensive medicines (EQUIMEDS): protocol for a quality of medicines assessment

    PubMed Central

    Redfern, Julie; Adedoyin, Rufus Adesoji; Ofori, Sandra; Anchala, Raghupathy; Ajay, Vamadevan S; De Andrade, Luciano; Zelaya, Jose; Kaur, Harparkash; Balabanova, Dina; Sani, Mahmoud U

    2016-01-01

    Background Prevention and optimal management of hypertension in the general population is paramount to the achievement of the World Heart Federation (WHF) goal of reducing premature cardiovascular disease (CVD) mortality by 25% by the year 2025 and widespread access to good quality antihypertensive medicines is a critical component for achieving the goal. Despite research and evidence relating to other medicines such as antimalarials and antibiotics, there is very little known about the quality of generic antihypertensive medicines in low-income and middle-income countries. The aim of this study was to determine the physicochemical equivalence (percentage of active pharmaceutical ingredient, API) of generic antihypertensive medicines available in the retail market of a developing country. Methods An observational design will be adopted, which includes literature search, landscape assessment, collection and analysis of medicine samples. To determine physicochemical equivalence, a multistage sampling process will be used, including (1) identification of the 2 most commonly prescribed classes of antihypertensive medicines prescribed in Nigeria; (2) identification of a random sample of 10 generics from within each of the 2 most commonly prescribed classes; (3) a geographically representative sampling process to identify a random sample of 24 retail outlets in Nigeria; (4) representative sample purchasing, processing to assess the quality of medicines, storage and transport; and (5) assessment of the physical and chemical equivalence of the collected samples compared to the API in the relevant class. In total, 20 samples from each of 24 pharmacies will be tested (total of 480 samples). Discussion Availability of and access to quality antihypertensive medicines globally is therefore a vital strategy needed to achieve the WHF 25×25 targets. However, there is currently a scarcity of knowledge about the quality of antihypertensive medicines available in developing countries. Such information is important for enforcement and for ensuring the quality of antihypertensive medicines. PMID:28588941
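
    The sampling arithmetic in this protocol (2 classes x 10 generics, 24 outlets, 20 purchases per outlet, 480 samples in total) can be sketched as a simple multistage draw. Everything in the sketch below, including class names, brand labels and outlet identifiers, is a placeholder assumption rather than real protocol data.

```python
# Hypothetical sketch of the multistage sampling plan described above:
# 2 drug classes x 10 randomly chosen generics, 24 randomly chosen retail
# outlets, 20 samples purchased per outlet (480 in total). All brand and
# outlet names are placeholders, not real products or pharmacies.
import random

random.seed(7)

# Stages 1-2: two most prescribed classes, 10 generics sampled from each
registered_generics = {
    "ACE_inhibitor": [f"ace_brand_{i}" for i in range(30)],
    "calcium_channel_blocker": [f"ccb_brand_{i}" for i in range(25)],
}
selected_generics = {cls: random.sample(brands, 10)
                     for cls, brands in registered_generics.items()}

# Stage 3: geographically representative random sample of 24 retail outlets
all_outlets = [f"outlet_{i}" for i in range(150)]
selected_outlets = random.sample(all_outlets, 24)

# Stage 4: 20 medicine samples purchased per outlet -> 480 samples in total
purchase_plan = [(outlet, cls, brand)
                 for outlet in selected_outlets
                 for cls, brands in selected_generics.items()
                 for brand in random.sample(brands, 10)]
print(len(purchase_plan))  # 24 outlets x 2 classes x 10 brands = 480
```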

  5. TNO/Centaurs grouping tested with asteroid data sets

    NASA Astrophysics Data System (ADS)

    Fulchignoni, M.; Birlan, M.; Barucci, M. A.

    2001-11-01

    Recently, we have discussed the possible subdivision into a few groups of a sample of 22 TNOs and Centaurs for which BVRIJ photometry was available (Barucci et al., 2001, A&A, 371, 1150). We obtained these results using the multivariate statistics adopted to define the current asteroid taxonomy, namely Principal Components Analysis and the G-mode method (Tholen & Barucci, 1989, in ASTEROIDS II). How do these methods work with a very small statistical sample such as the TNO/Centaurs one? Theoretically, the number of degrees of freedom of the sample is sufficient. In fact it is 88 in our case, and it has to be larger than 50 to cope with the requirements of the G-mode. Does the random sampling of a small number of members of a large population contain enough information to reveal some structure in the population? We extracted several samples of 22 asteroids out of a database of 86 objects of known taxonomic type for which BVRIJ photometry is available from ECAS (Zellner et al. 1985, ICARUS 61, 355), SMASS II (S.W. Bus, 1999, PhD Thesis, MIT), and the Bell et al. Atlas of the asteroid infrared spectra. The objects constituting the first sample were selected in order to give a good representation of the major asteroid taxonomic classes (at least three objects per class): C, S, D, A, and G. Both methods were able to distinguish all these groups, confirming the validity of the adopted methods. The S class is hard to identify as a consequence of the choice of the I and J variables, which implies a lack of information on the absorption band at 1 micron. The other samples were obtained by random choice of the objects. Not all the major groups were well represented (fewer than three objects per group), but the general trend of the asteroid taxonomy was always obtained. We conclude that the quoted grouping of TNO/Centaurs is representative of some physico-chemical structure of the outer solar system small body population.

  6. Cardiovascular risk reduction intervention among school-students in Kolkata, West Bengal - the CRRIS study protocol.

    PubMed

    Kumar, Soumitra; Ray, Saumitra; Mahapatra, Tanmay; Gupta, Kinnari; Mahapatra, Sanchita; Das, Mrinal K; Guha, Santanu; Deb, Pradip K; Banerjee, Amal K

    2015-01-01

    Increasing burden of cardiovascular risk-factors among adolescent school-children is a major concern in India. Dearth of information regarding the burden of these factors and the efficacy of educational intervention in minimizing them among urban school-students of India called for a school-based, educational intervention involving a representative sample of these students and their caregivers. Using a randomized-controlled design with stratified-random sampling, 1000 students (approximately 50/school) of 9th grade from 20 randomly selected schools (representing all socio-economic classes and school-types) and their caregivers (preferably mothers) will be recruited. Objectives of the study will include: estimation of the baseline burden and post-interventional change in cardiovascular risk-factors, related knowledge, perception and practice among participants in Kolkata. After obtaining appropriate consent (assent for adolescents), collection of the questionnaire-based data (regarding cardiovascular disease/risk-factor related knowledge, perception, practice), anthropometric measurements, stress assessment and cardiological check-up (pulse and blood pressure measurement along with auscultation for any abnormal heart sounds) will be conducted for each participating student twice at an interval of six months. In between, six educational sessions will be administered in 10 of the 20 schools randomized to the intervention arm. After the follow-up data collection, the same sessions will be conducted in the non-interventional schools. Descriptive and inferential analyses (using SAS 9.3) will be conducted to determine the distribution of the risk-factors and the efficacy of the intervention in minimizing them so that policy-making can be guided appropriately to keep the adolescents healthy in their future life. Copyright © 2015 Cardiological Society of India. Published by Elsevier B.V. All rights reserved.

  7. Methodological considerations in using complex survey data: an applied example with the Head Start Family and Child Experiences Survey.

    PubMed

    Hahs-Vaughn, Debbie L; McWayne, Christine M; Bulotsky-Shearer, Rebecca J; Wen, Xiaoli; Faria, Ann-Marie

    2011-06-01

    Complex survey data are collected by means other than simple random samples. This creates two analytical issues: nonindependence and unequal selection probability. Failing to address these issues results in underestimated standard errors and biased parameter estimates. Using data from the nationally representative Head Start Family and Child Experiences Survey (FACES; 1997 and 2000 cohorts), three diverse multilevel models are presented that illustrate differences in results depending on addressing or ignoring the complex sampling issues. Limitations of using complex survey data are reported, along with recommendations for reporting complex sample results. © The Author(s) 2011
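
    The two issues named in this abstract, nonindependence and unequal selection probability, can be illustrated with a small simulation: a design-weighted mean whose standard error is computed once naively (as if observations were i.i.d.) and once with a cluster bootstrap. The data, weights and cluster structure below are simulated assumptions, not FACES data or the authors' models.

```python
# Simulated illustration (not FACES data): a design-weighted mean and a
# cluster bootstrap standard error, versus the naive unweighted SE that
# ignores the sampling design.
import numpy as np

rng = np.random.default_rng(3)

n_clusters, per_cluster = 40, 25
cluster_effect = rng.normal(0, 1.0, n_clusters)            # nonindependence
cluster_id = np.repeat(np.arange(n_clusters), per_cluster)
y = 5 + cluster_effect[cluster_id] + rng.normal(0, 1.0, n_clusters * per_cluster)
weights = rng.uniform(0.5, 2.0, y.size)                     # unequal selection prob.

weighted_mean = np.sum(weights * y) / np.sum(weights)

# Naive SE: pretends observations are i.i.d. and equally weighted
naive_se = y.std(ddof=1) / np.sqrt(y.size)

# Cluster bootstrap: resample whole clusters to respect nonindependence
boot_means = []
for _ in range(2000):
    chosen = rng.integers(0, n_clusters, n_clusters)
    idx = np.concatenate([np.where(cluster_id == c)[0] for c in chosen])
    boot_means.append(np.sum(weights[idx] * y[idx]) / np.sum(weights[idx]))
cluster_se = np.std(boot_means, ddof=1)

print(f"weighted mean {weighted_mean:.3f}, naive SE {naive_se:.3f}, cluster-robust SE {cluster_se:.3f}")
```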

  8. A technique for evaluating the influence of spatial sampling on the determination of global mean total columnar ozone

    NASA Technical Reports Server (NTRS)

    Tolson, R. H.

    1981-01-01

    A technique is described for providing a means of evaluating the influence of spatial sampling on the determination of global mean total columnar ozone. A finite number of coefficients in the expansion are determined, and the truncated part of the expansion is shown to contribute an error to the estimate, which depends strongly on the spatial sampling and is relatively insensitive to data noise. First and second order statistics are derived for each term in a spherical harmonic expansion which represents the ozone field, and the statistics are used to estimate systematic and random errors in the estimates of total ozone.

  9. Attitudes towards smoking restrictions and tobacco advertisement bans in Georgia.

    PubMed

    Bakhturidze, George D; Mittelmark, Maurice B; Aarø, Leif E; Peikrishvili, Nana T

    2013-11-25

    This study aims to provide data on the level of public support for restricting smoking in public places and banning tobacco advertisements. A nationally representative multistage sampling design was used, with sampling strata defined by region (sampling quotas proportional to size) and substrata defined by urban/rural and mountainous/lowland settlement, within which census enumeration districts were randomly sampled, within which households were randomly sampled, within which a randomly selected respondent was interviewed. The setting was the country of Georgia, population 4.7 million, located in the Caucasus region of Eurasia. One household member aged between 13 and 70 was selected as interviewee. In households with more than one age-eligible person, selection was carried out at random. Of 1588 persons selected, 14 refused to participate and interviews were conducted with 915 women and 659 men. Respondents were interviewed about their level of agreement with eight possible smoking restrictions/bans, used to calculate a single dichotomous (agree/do not agree) opinion indicator. The level of agreement with restrictions was analysed in bivariate and multivariate analyses by age, gender, education, income and tobacco use status. Overall, 84.9% of respondents indicated support for smoking restrictions and tobacco advertisement bans. In all demographic segments, including tobacco users, the majority of respondents indicated agreement with restrictions, ranging from a low of 51% in the 13-25 age group to a high of 98% in the 56-70 age group. Logistic regression with all demographic variables entered showed that agreement with restrictions was higher with age, and was significantly higher among never smokers as compared to daily smokers. Georgian public opinion is normatively supportive of more stringent tobacco-control measures in the form of smoking restrictions and tobacco advertisement bans.

  10. Prevalence and correlates of bullying victimisation and perpetration in a nationally representative sample of Australian youth.

    PubMed

    Thomas, Hannah J; Connor, Jason P; Lawrence, David M; Hafekost, Jennifer M; Zubrick, Stephen R; Scott, James G

    2017-09-01

    Bullying prevalence studies are limited by varied measurement methods and a lack of representative samples. This study estimated the national prevalence of bullying victimisation, perpetration and combined victim-perpetration experiences in a representative population-based sample of Australian youth. The relationships of the three types of bullying involvement with a range of mental health symptoms and diagnoses were also examined. A randomly selected nationally representative sample aged 11-17 years (N = 2967, mean age = 14.6 years; 51.6% male) completed the youth component of the Second Australian Child and Adolescent Survey of Mental Health and Wellbeing (Young Minds Matter). Parents or carers also completed a structured face-to-face interview that asked questions about a single randomly selected child in the household. The youth survey comprised self-reported bullying victimisation and perpetration (Olweus Bully-Victim Questionnaire-adapted), psychological distress (K10), emotional and behavioural problems (Strengths and Difficulties Questionnaire), as well as self-harm, suicide attempts and substance use. Modules from the Diagnostic Interview Schedule for Children Version IV were administered to all youth and parents to assess for mental disorder diagnoses (major depressive disorder, any anxiety disorder and any externalising disorder [attention-deficit hyperactivity disorder, oppositional defiant disorder and conduct disorder]). The 12-month prevalence of bullying victimisation was 13.3%, perpetration 1.6% and victim-perpetration 1.9%. Logistic regression models showed all forms of involvement in bullying were associated with increased risk of psychological distress, emotional and behavioural problems, substance use, self-harm and attempted suicide. Victimisation and victim-perpetration were associated with youth-reported major depressive disorder. There were also significant associations between bullying involvement and parent-reported diagnoses of major depressive disorder, any anxiety disorder and any externalising disorder. Bullying continues to be frequently experienced by Australian adolescents. The current findings showed that involvement in any bullying behaviour was associated with increased risk of concurrent mental health problems. This evidence can be used to inform decisions concerning the allocation of resources to address this important health issue.

  11. Distributional assumptions in food and feed commodities- development of fit-for-purpose sampling protocols.

    PubMed

    Paoletti, Claudia; Esbensen, Kim H

    2015-01-01

    Material heterogeneity influences the effectiveness of sampling procedures. Most sampling guidelines used for assessment of food and/or feed commodities are based on classical statistical distribution requirements (the normal, binomial, and Poisson distributions) and almost universally rely on the assumption of randomness. However, this is unrealistic. The scientific food and feed community recognizes a strong preponderance of non-random distribution within commodity lots, which should be a more realistic prerequisite for definition of effective sampling protocols. Nevertheless, these heterogeneity issues are overlooked as the prime focus is often placed only on financial, time, equipment, and personnel constraints instead of mandating acquisition of documented representative samples under realistic heterogeneity conditions. This study shows how the principles promulgated in the Theory of Sampling (TOS) and practically tested over 60 years provide an effective framework for dealing with the complete set of adverse aspects of both compositional and distributional heterogeneity (material sampling errors), as well as with the errors incurred by the sampling process itself. The results of an empirical European Union study on genetically modified soybean heterogeneity, the Kernel Lot Distribution Assessment, are summarized, as they have a strong bearing on the issue of proper sampling protocol development. TOS principles apply universally in the food and feed realm and must therefore be considered the only basis for development of valid sampling protocols free from distributional constraints.

  12. “Does Organizational Culture Influence the Ethical Behavior in the Pharmaceutical Industry?”

    PubMed Central

    Nagashekhara, Molugulu; Agil, Syed Omar Syed

    2011-01-01

    Study of ethical behavior among medical representatives in the profession is an under-portrayed component that deserves further perusal in the pharmaceutical industry. The purpose of this study is to find out the influence of organizational culture on the ethical behavior of medical representatives. Medical representatives working for both domestic and multinational companies constitute the sample (n=300). Data were collected using simple random and cluster sampling through a structured questionnaire. The research design is hypothesis testing. It is a cross-sectional and correlational study, conducted under non-contrived settings. Chi-square tests showed that there is an association between organizational culture and the ethical behavior of medical representatives. In addition, the strength of the association was measured, corresponding to a Cramer's V of 63.1% and a Phi value of 2.749. Results indicate that multinational company medical representatives are more ethical compared with domestic company medical representatives, with a marked difference in both variance and t-test results. Through better organizational culture, pharmaceutical companies can create the most desirable behavior among their employees. The authors conclude that apart from organizational culture, the study of additional organizational, individual and external factors is imperative for a better understanding of the ethical behavior of medical representatives in the pharmaceutical industry in India. PMID:24826027

  13. "Does organizational culture influence the ethical behavior in the pharmaceutical industry?".

    PubMed

    Nagashekhara, Molugulu; Agil, Syed Omar Syed

    2011-12-01

    Study of ethical behavior among medical representatives in the profession is an under-portrayed component that deserves further perusal in the pharmaceutical industry. The purpose of this study is to find out the influence of organizational culture on the ethical behavior of medical representatives. Medical representatives working for both domestic and multinational companies constitute the sample (n=300). Data were collected using simple random and cluster sampling through a structured questionnaire. The research design is hypothesis testing. It is a cross-sectional and correlational study, conducted under non-contrived settings. Chi-square tests showed that there is an association between organizational culture and the ethical behavior of medical representatives. In addition, the strength of the association was measured, corresponding to a Cramer's V of 63.1% and a Phi value of 2.749. Results indicate that multinational company medical representatives are more ethical compared with domestic company medical representatives, with a marked difference in both variance and t-test results. Through better organizational culture, pharmaceutical companies can create the most desirable behavior among their employees. The authors conclude that apart from organizational culture, the study of additional organizational, individual and external factors is imperative for a better understanding of the ethical behavior of medical representatives in the pharmaceutical industry in India.

  14. A weighted belief-propagation algorithm for estimating volume-related properties of random polytopes

    NASA Astrophysics Data System (ADS)

    Font-Clos, Francesc; Massucci, Francesco Alessandro; Pérez Castillo, Isaac

    2012-11-01

    In this work we introduce a novel weighted message-passing algorithm based on the cavity method for estimating volume-related properties of random polytopes, properties which are relevant in various research fields ranging from metabolic networks, to neural networks, to compressed sensing. As opposed to the usual approach of approximating the real-valued cavity marginal distributions by a few parameters, we propose an algorithm that faithfully represents the entire marginal distribution. We explain various alternatives for implementing the algorithm and benchmark the theoretical findings by showing concrete applications to random polytopes. The results obtained with our approach are found to be in very good agreement with the estimates produced by the Hit-and-Run algorithm, known to produce uniform sampling.
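
    For readers unfamiliar with the reference method the abstract benchmarks against, the following is a minimal Hit-and-Run sampler for a polytope of the form {x : Ax <= b}. The example polytope (a unit box cut by a half-space), the starting point and the burn-in length are toy assumptions; the message-passing algorithm itself is not reproduced here.

```python
# Minimal Hit-and-Run sampler for a polytope {x : A x <= b}, the reference
# method the abstract benchmarks against. The example polytope (a unit box
# intersected with a half-space) is a toy assumption.
import numpy as np

rng = np.random.default_rng(0)

def hit_and_run(A, b, x0, n_samples, burn_in=1000):
    """Approximately uniform samples from {x : A x <= b}, starting at interior x0."""
    x = np.asarray(x0, dtype=float)
    out = []
    for it in range(burn_in + n_samples):
        d = rng.normal(size=x.size)
        d /= np.linalg.norm(d)                 # random direction
        # Solve A(x + t d) <= b for the feasible interval [t_lo, t_hi]
        Ad, slack = A @ d, b - A @ x
        t_hi = np.min(slack[Ad > 1e-12] / Ad[Ad > 1e-12]) if np.any(Ad > 1e-12) else np.inf
        t_lo = np.max(slack[Ad < -1e-12] / Ad[Ad < -1e-12]) if np.any(Ad < -1e-12) else -np.inf
        x = x + rng.uniform(t_lo, t_hi) * d    # uniform point on the chord
        if it >= burn_in:
            out.append(x.copy())
    return np.array(out)

# Toy polytope: 0 <= x_i <= 1 (dimension 3) and x_0 + x_1 + x_2 <= 1.5
dim = 3
A = np.vstack([np.eye(dim), -np.eye(dim), np.ones((1, dim))])
b = np.concatenate([np.ones(dim), np.zeros(dim), [1.5]])
samples = hit_and_run(A, b, x0=np.full(dim, 0.25), n_samples=5000)
print("estimated centroid:", samples.mean(axis=0))
```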

  15. Different hunting strategies select for different weights in red deer

    PubMed Central

    Martínez, María; Rodríguez-Vigal, Carlos; Jones, Owen R; Coulson, Tim; Miguel, Alfonso San

    2005-01-01

    Much insight can be derived from records of shot animals. Most researchers using such data assume that their data represents a random sample of a particular demographic class. However, hunters typically select a non-random subset of the population and hunting is, therefore, not a random process. Here, with red deer (Cervus elaphus) hunting data from a ranch in Toledo, Spain, we demonstrate that data collection methods have a significant influence upon the apparent relationship between age and weight. We argue that a failure to correct for such methodological bias may have significant consequences for the interpretation of analyses involving weight or correlated traits such as breeding success, and urge researchers to explore methods to identify and correct for such bias in their data. PMID:17148205

  16. Factors associated with physicians' reliance on pharmaceutical sales representatives.

    PubMed

    Anderson, Britta L; Silverman, Gabriel K; Loewenstein, George F; Zinberg, Stanley; Schulkin, Jay

    2009-08-01

    To examine relationships between pharmaceutical representatives and obstetrician-gynecologists and identify factors associated with self-reported reliance on representatives when making prescribing decisions. In 2006-2007, questionnaires were mailed to 515 randomly selected physicians in the American College of Obstetricians and Gynecologists' Collaborative Ambulatory Research Network. Participants were asked about the information sources used when deciding to prescribe a new drug, interactions with sales representatives, views of representatives' value, and guidelines they had read on appropriate industry interactions. Two hundred fifty-one completed questionnaires (49%) were returned. Seventy-six percent of participants see sales representatives' information as at least somewhat valuable. Twenty-nine percent use representatives often or almost always when deciding whether to prescribe a new drug; 44% use them sometimes. Physicians in private practice are more likely than those in university hospitals to interact with, value, and rely on representatives; community hospital physicians tend to fall in the middle. Gender and age are not associated with industry interaction. Dispensing samples is associated with increased reliance on representatives when making prescribing decisions, beyond what is predicted by a physician's own beliefs about the value of representatives' information. Reading guidelines on physician-industry interaction is not associated with less reliance on representatives after controlling for practice setting. Physicians' interactions with industry and their familiarity with guidelines vary by practice setting, perhaps because of more restrictive policies in university settings, professional isolation of private practice, or differences in social norms. Prescribing samples may be associated with physicians' use of information from sales representatives more than is merited by the physicians' own beliefs about the value of pharmaceutical representatives.

  17. The Comparative Power of "Type/Token" and "Hapax Legomena/Type" Ratios: A Corpus-Based Study of Authorial Differentiation

    ERIC Educational Resources Information Center

    Ali, Sundus Muhsin; Hussein, Khalid Shakir

    2014-01-01

    This paper presents an attempt to verify the comparative power of two statistical features: Type/Token, and Hapax legomena/Token ratios (henceforth TTR and HTR). A corpus of ten novels is compiled. Then sixteen samples (each is 5,000 tokens in length) are taken randomly out of these novels as representative blocks. The researchers observe the way…

  18. Gender Differences and Public Sector Managers: Women's Perceptions of Equality in State Government.

    ERIC Educational Resources Information Center

    Whitcraft, Carol; Williams, M. Lee

    A study assessed the equality of women managers in 11 of the largest state agencies in Texas. It also investigated the perceptions of men and women managers concerning a variety of work related issues in Texas state government. A stratified random sample of 25 percent of all managers was drawn, and 1,844 responses, representing a 55.5% response…

  19. Personal Health Risks Behaviour Profile among University Students in the South East Nigeria: Implication for Health Education

    ERIC Educational Resources Information Center

    Ilo, Cajetan I.; Onwunaka, Chinagorom; Nwimo, Ignatius O.

    2015-01-01

    This descriptive survey was carried out in order to determine the personal health risks behaviour profile among university students in the south east of Nigeria. A random sample of 900 students completed the questionnaire designed for the study. Out of this number, 821 questionnaires, representing about a 91.2% return rate, were used for data analysis. Means and…

  20. Preliminary Report of Alcohol and Other Drug Use among Ontario Students in 1983, and Trends since 1977.

    ERIC Educational Resources Information Center

    Smart, Reginald G.; And Others

    Since 1977, alcohol and drug use among Ontario students has been studied every 2 years. To examine the patterns of alcohol and other drug use among Ontario students in 1983, a randomized sample of 5,835 students, representing grades 5, 7, 9, 11, and 13, from four geographical regions, completed an anonymous questionnaire. An analysis of the…

  1. Occupational Safety & Health. Inspectors' Opinions on Improving OSHA Effectiveness. Fact Sheet for Subcommittee on Health and Safety, Committee on Education and Labor, House of Representatives.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC. Div. of Human Resources.

    Questionnaires gathered opinions of all Occupational Safety and Health Administration (OSHA) field supervisors and a randomly selected sample of one-third of the compliance officers about OSHA's approach to improving workplace safety and health. Major topics addressed were enforcement, safety and health standards, education and training, employer…

  2. Evaluation of Child Health Matters: A Web-Based Tutorial to Enhance School Nurses' Communications with Families about Weight-Related Health

    ERIC Educational Resources Information Center

    Steele, Ric G.; Wu, Yelena P.; Cushing, Christopher C.; Jensen, Chad D.

    2013-01-01

    The goal of the current study was to assess the efficacy and acceptability of a web-based tutorial (Child Health Matters, CHM) designed to improve school nurses' communications with families about pediatric weight-related health issues. Using a randomized wait-list control design, a nationally representative sample of school nurses was assigned to…

  3. Pharmaceutical representatives' beliefs and practices about their professional practice: a study in Sudan.

    PubMed

    Idris, K M; Mustafa, A F; Yousif, M A

    2012-08-01

    Pharmaceutical representatives are an important promotional tool for pharmaceutical companies. This cross-sectional, exploratory study aimed to determine pharmaceutical representatives' beliefs and practices about their professional practice in Sudan. A random sample of 160 pharmaceutical representatives were interviewed using a pretested questionnaire. The majority were male (84.4%) and had received training in professional sales skills (86.3%) and about the products being promoted (82.5%). Only 65.6% agreed that they provided full and balanced information about products. Not providing balanced information was attributed by 23.1% to doctors' lack of time. However, 28.1% confessed they sometimes felt like hiding unfavourable information, 21.9% were sometimes or always inclined to give untrue information to make sales and 66.9% considered free gifts as ethically acceptable. More attention is needed to dissemination of ethical codes of conduct and training about the ethics of drug promotion for pharmaceutical representatives in Sudan.

  4. Studies on spectral analysis of randomly sampled signals: Application to laser velocimetry data

    NASA Technical Reports Server (NTRS)

    Sree, David

    1992-01-01

    Spectral analysis is very useful in determining the frequency characteristics of many turbulent flows, for example, vortex flows, tail buffeting, and other pulsating flows. It is also used for obtaining turbulence spectra from which the time and length scales associated with the turbulence structure can be estimated. These estimates, in turn, can be helpful for validation of theoretical/numerical flow turbulence models. Laser velocimetry (LV) is being extensively used in the experimental investigation of different types of flows because of its inherent advantages: nonintrusive probing, high frequency response, no calibration requirements, etc. Typically, the output of an individual realization laser velocimeter is a set of randomly sampled velocity data. Spectral analysis of such data requires special techniques to obtain reliable estimates of the correlation and power spectral density functions that describe the flow characteristics. FORTRAN codes for obtaining the autocorrelation and power spectral density estimates using the correlation-based slotting technique were developed. Extensive studies have been conducted on simulated first-order spectrum and sine signals to improve the spectral estimates. A first-order spectrum was chosen because it represents the characteristics of a typical one-dimensional turbulence spectrum. Digital prefiltering techniques were applied to improve the spectral estimates from randomly sampled data. Studies show that the frequency range of reliable spectral estimates can be extended up to about five times the mean sampling rate.
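
    The correlation-based slotting technique mentioned in this abstract can be sketched briefly: lag products of irregularly (Poisson) sampled velocity data are accumulated into lag "slots" of fixed width to estimate the autocorrelation, from which a spectrum could then be obtained by Fourier transform. The simulated signal, sampling rate and slot parameters below are assumptions for illustration, not the study's FORTRAN implementation.

```python
# Minimal sketch of the correlation-based "slotting" technique for randomly
# (Poisson) sampled velocity data: lag products are accumulated into slots of
# width dt to estimate the autocorrelation function. Signal and parameters are
# simulated assumptions, not the study's data.
import numpy as np

rng = np.random.default_rng(1)

# Simulated randomly sampled signal: Poisson arrival times, sine + noise
mean_rate = 200.0                                  # mean samples per second
t = np.cumsum(rng.exponential(1.0 / mean_rate, size=4000))
u = np.sin(2 * np.pi * 15.0 * t) + 0.3 * rng.normal(size=t.size)
u -= u.mean()

def slotted_autocorrelation(t, u, dt, n_slots):
    """Estimate R(k*dt) by averaging products u_i*u_j whose lag falls in slot k."""
    num = np.zeros(n_slots)
    count = np.zeros(n_slots)
    for i in range(t.size):
        lags = t[i:] - t[i]
        k = np.floor(lags / dt).astype(int)
        valid = k < n_slots
        np.add.at(num, k[valid], u[i] * u[i:][valid])
        np.add.at(count, k[valid], 1)
    return num / np.maximum(count, 1)

dt, n_slots = 0.002, 100
R = slotted_autocorrelation(t, u, dt, n_slots)
R /= R[0]                                          # normalise by the zero-lag slot
# A power spectral density estimate would follow from a discrete cosine/Fourier
# transform of R over the slot lags.
print(R[:5])
```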

  5. Within-Tunnel Variations in Pressure Data for Three Transonic Wind Tunnels

    NASA Technical Reports Server (NTRS)

    DeLoach, Richard

    2014-01-01

    This paper compares the results of pressure measurements made on the same test article with the same test matrix in three transonic wind tunnels. A comparison is presented of the unexplained variance associated with polar replicates acquired in each tunnel. The impact of a significant component of systematic (not random) unexplained variance is reviewed, and the results of analyses of variance are presented to assess the degree of significant systematic error in these representative wind tunnel tests. Total uncertainty estimates are reported for 140 samples of pressure data, quantifying the effects of within-polar random errors and between-polar systematic bias errors.

  6. Prevalence of drug use among drivers based on mandatory, random tests in a roadside survey

    PubMed Central

    Alcañiz, Manuela; Guillen, Montserrat

    2018-01-01

    Background In the context of road safety, this study aims to examine the prevalence of drug use in a random sample of drivers. Methods A stratified probabilistic sample was designed to represent vehicles circulating on non-urban roads. Random drug tests were performed during autumn 2014 on 521 drivers in Catalonia (Spain). Participation was mandatory. The prevalence of drug driving for cannabis, methamphetamines, amphetamines, cocaine, opiates and benzodiazepines was assessed. Results The overall prevalence of drug use is 16.4% (95% CI: 13.9; 18.9) and affects primarily younger male drivers. Drug use is similarly prevalent during weekdays and on weekends, but increases with the number of occupants. The likelihood of being positive for methamphetamines is significantly higher for drivers of vans and lorries. Conclusions Different patterns of use are detected depending on the drug considered. Preventive drug tests should not only be conducted on weekends and at night-time, and need to be reinforced for drivers of commercial vehicles. Active educational campaigns should focus on the youngest age-group of male drivers. PMID:29920542

  7. Towards representative energy data: the Machiguenga study.

    PubMed

    Montgomery, E

    1978-01-01

    Representative energy data for a human population can be produced by combining randomly sampled time allocation observations with activity-specific energy expenditure measurements. Research to produce representative energy data for adults of a population of Machiguenga Indians has recently been conducted in lowland, southeastern Peru. Marked contrast was found between the sexes for average married adults in energy expended on an average day. Men spent about 3,200 kcals and women, about 1,925; ratio: 1.66 to 1. In general, men tended to work at somewhat more energetic activities and for longer periods than did women. In addition to sex-role-related task differences were contrasts in uses of technological items and in respective work settings. These representative behavior data permit direct estimates of population-level energy requirements for average days, seasons, or for 1 year.

  8. Network Structure and the Risk for HIV Transmission Among Rural Drug Users

    PubMed Central

    Young, A. M.; Jonas, A. B.; Mullins, U. L.; Halgin, D. S.

    2012-01-01

    Research suggests that structural properties of drug users’ social networks can have substantial effects on HIV risk. The purpose of this study was to investigate if the structural properties of Appalachian drug users’ risk networks could lend insight into the potential for HIV transmission in this population. Data from 503 drug users recruited through respondent-driven sampling were used to construct a sociometric risk network. Network ties represented relationships in which partners had engaged in unprotected sex and/or shared injection equipment. Compared to 1,000 randomly generated networks, the observed network was found to have a larger main component and exhibit more cohesiveness and centralization than would be expected at random. Thus, the risk network structure in this sample has many structural characteristics shown to be facilitative of HIV transmission. This underscores the importance of primary prevention in this population and prompts further investigation into the epidemiology of HIV in the region. PMID:23184464

  9. The Study on Mental Health at Work: Design and sampling.

    PubMed

    Rose, Uwe; Schiel, Stefan; Schröder, Helmut; Kleudgen, Martin; Tophoven, Silke; Rauch, Angela; Freude, Gabriele; Müller, Grit

    2017-08-01

    The Study on Mental Health at Work (S-MGA) generates the first nationwide representative survey enabling the exploration of the relationship between working conditions, mental health and functioning. This paper describes the study design, sampling procedures and data collection, and presents a summary of the sample characteristics. S-MGA is a representative study of German employees aged 31-60 years subject to social security contributions. The sample was drawn from the employment register based on a two-stage cluster sampling procedure. Firstly, 206 municipalities were randomly selected from a pool of 12,227 municipalities in Germany. Secondly, 13,590 addresses were drawn from the selected municipalities for the purpose of conducting 4500 face-to-face interviews. The questionnaire covers psychosocial working and employment conditions, measures of mental health, work ability and functioning. Data from personal interviews were combined with employment histories from register data. Descriptive statistics of socio-demographic characteristics and logistic regression analyses were used for comparing the population, gross sample and respondents. In total, 4511 face-to-face interviews were conducted. A test for sampling bias revealed that individuals in older cohorts participated more often, while individuals with an unknown educational level, residing in major cities or with a non-German ethnic background were slightly underrepresented. There is no indication of major deviations in characteristics between the basic population and the sample of respondents. Hence, S-MGA provides representative data for research on work and health, designed as a cohort study with plans to rerun the survey 5 years after the first assessment.
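
    The two-stage cluster draw described here (206 municipalities out of 12,227, then 13,590 addresses within them) can be sketched in a few lines. The municipality and address identifiers below are placeholders invented for illustration; the real register and allocation rules are not reproduced.

```python
# Hypothetical sketch of the two-stage cluster sampling described above:
# stage 1 samples municipalities, stage 2 draws addresses within them.
# Municipality and address identifiers are placeholders.
import random

random.seed(2017)

municipalities = [f"municipality_{i}" for i in range(12_227)]
stage1 = random.sample(municipalities, 206)          # stage 1: 206 municipalities

# Placeholder register: each selected municipality holds a list of addresses
register = {m: [f"{m}_addr_{j}" for j in range(500)] for m in stage1}

# Stage 2: draw 13,590 addresses overall, roughly evenly across municipalities
per_municipality = 13_590 // len(stage1) + 1
stage2 = []
for m in stage1:
    stage2.extend(random.sample(register[m], per_municipality))
stage2 = random.sample(stage2, 13_590)               # trim to the target size
print(len(stage1), len(stage2))
```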

  10. Clutch sizes and nests of tailed frogs from the Olympic Peninsula, Washington

    USGS Publications Warehouse

    Bury, R. Bruce; Loafman, P.; Rofkar, D.; Mike, K.

    2001-01-01

    In the summers of 1995-1998, we sampled 168 streams (1,714 m of randomly selected 1-m bands) to determine the distribution and abundance of stream amphibians in Olympic National Park, Washington. We found six nests (two in one stream) of the tailed frog, compared to only two nests with clutch sizes reported earlier for coastal regions. This represents only one nest per 286 m searched and one nest per 34 streams sampled. Tailed frogs occurred in only 94 (60%) of the streams and, for these waters, we found one nest per 171 m searched or one nest per 20 streams sampled. The numbers of eggs for four masses (mean = 48.3, range 40-55) were low, but one single strand in a fifth nest had 96 eggs. One nest with 185 eggs likely represented communal egg deposition. Current evidence indicates a geographic trend with yearly clutches of relatively few eggs in coastal tailed frogs compared to biennial nesting with larger clutches for inland populations in the Rocky Mountains.

  11. Bridging the gap between formal and experience-based knowledge for context-aware laparoscopy.

    PubMed

    Katić, Darko; Schuck, Jürgen; Wekerle, Anna-Laura; Kenngott, Hannes; Müller-Stich, Beat Peter; Dillmann, Rüdiger; Speidel, Stefanie

    2016-06-01

    Computer assistance is increasingly common in surgery. However, the amount of information is bound to overload the processing abilities of surgeons. We propose methods to recognize the current phase of a surgery for context-aware information filtering. The purpose is to select the most suitable subset of information for surgical situations which require special assistance. We combine formal knowledge, represented by an ontology, and experience-based knowledge, represented by training samples, to recognize phases. For this purpose, we have developed two different methods. Firstly, we use formal knowledge about possible phase transitions to create a composition of random forests. Secondly, we propose a method based on cultural optimization to infer formal rules from experience to recognize phases. The proposed methods are compared with a purely formal knowledge-based approach using rules and a purely experience-based one using regular random forests. The comparative evaluation on laparoscopic pancreas resections and adrenalectomies employs a consistent set of quality criteria on clean and noisy input. The rule-based approaches proved best with noise-free data. The random forest-based ones were more robust in the presence of noise. Formal and experience-based knowledge can be successfully combined for robust phase recognition.

  12. New applications of particle accelerators in medicine, materials science, and industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knapp, E.A.

    1981-01-01

    Recently, the application of particle accelerators to medicine, materials science, and other industrial uses has increased dramatically. A random sampling of some of these new programs is discussed, primarily to give the scope of these new applications. The three areas, medicine, materials science or solid-state physics, and industrial applications, are chosen for their diversity and are representative of new accelerator applications for the future.

  13. CLUSTERS OF TASKS PERFORMED BY WASHINGTON STATE FARM OPERATORS ENGAGED IN SEVEN TYPES OF AGRICULTURAL PRODUCTION--GRAIN, DAIRY, FORESTRY, LIVESTOCK, POULTRY, HORTICULTURE, AND GENERAL FARMING. REPORT NO. 27.

    ERIC Educational Resources Information Center

    LONG, GILBERT A.

    THE OBJECTIVE OF THIS STUDY WAS TO OBTAIN UP-TO-DATE FACTS ABOUT CLUSTERS OF TASKS PERFORMED BY WASHINGTON STATE FARM OPERATORS ENGAGED PRIMARILY IN PRODUCING GRAIN, LIVESTOCK, DAIRY COMMODITIES, POULTRY, FOREST PRODUCTS, HORTICULTURAL COMMODITIES, AND GENERAL FARMING COMMODITIES. FROM A RANDOM SAMPLE OF 267 FARMERS REPRESENTING THOSE CATEGORIES…

  14. Determining the effectiveness of the third person interview in the level of insight psychotic patients.

    PubMed

    Mehdizadeh, Mahsa; Rezaei, Omid; Dolatshahi, Behrouz

    2016-11-30

    The goal of this study was to determine the effectiveness of the third person interview in increasing the level of insight and cooperation in psychotic patients. We used a quasi-experimental posttest design with an alternative method group. A total of 40 individuals with a definite diagnosis of psychosis were selected using simple random sampling and were assigned randomly to an experimental group (third person interview) and an alternative control group (clinical interview). The results indicated that with the third person interview, the insight level of the psychotic patients increased in all dimensions of insight except awareness of flat or blunted affect and awareness of unsociability. The results of the independent-samples t-test showed no significant difference in cooperation between the two groups of psychotic patients. It seems that the ability to consider one's own mental viewpoint apart from others' depends on the relative ability of psychotic patients to represent others' mental states (theory of mind). However, psychotic patients have severe impairment in the ability to represent their own mental states, resulting in impaired recognition of their mental disorder, psychotic symptoms, the need for therapy, and the social consequences of their mental disorder. Copyright © 2016. Published by Elsevier Ireland Ltd.

  15. Prevalence and correlates of elder mistreatment in South Carolina: the South Carolina elder mistreatment study.

    PubMed

    Amstadter, Ananda B; Zajac, Kristyn; Strachan, Martha; Hernandez, Melba A; Kilpatrick, Dean G; Acierno, Ron

    2011-10-01

    The purposes of this study were to (a) derive prevalence estimates for elder mistreatment (emotional, physical, sexual, neglectful, and financial mistreatment of older adults [age 60 +]) in a randomly selected sample of South Carolinians; (b) examine correlates (i.e., potential risk factors) of mistreatment; and (c) examine incident characteristics of mistreatment events. Random Digit Dialing (RDD) was used to derive a representative sample in terms of age and gender; computer-assisted telephone interviewing was used to standardize collection of demographic, correlate, and mistreatment data. Prevalence estimates and mistreatment correlates were obtained and subjected to logistic regression. A total of 902 participants provided data. Prevalence estimates for mistreatment types (since age 60) were 12.9% for emotional mistreatment, 2.1% for physical, 0.3% for sexual, 5.4% for potential neglect, and 6.6% for financial exploitation by a family member. The most consistent correlates of mistreatment across abuse types were low social support and needing assistance with daily living activities. One in 10 participants reported either emotional, physical, sexual, or neglectful mistreatment within the past year, and 2 in 10 reported mistreatment since age 60. Across categories, the most consistent correlate of mistreatment was low social support, representing an area toward which preventive intervention may be directed with significant public health implications.

  16. The creation of digital thematic soil maps at the regional level (with the map of soil carbon pools in the Usa River basin as an example)

    NASA Astrophysics Data System (ADS)

    Pastukhov, A. V.; Kaverin, D. A.; Shchanov, V. M.

    2016-09-01

    A digital map of soil carbon pools was created for the forest-tundra ecotone in the Usa River basin with the use of ERDAS Imagine 2014 and ArcGIS 10.2 software. Supervised classification and thematic interpretation of satellite images and digital terrain models with the use of a georeferenced database on soil profiles were applied. Expert assessment of the natural diversity and representativeness of random samples for different soil groups was performed, and the minimal necessary size of the statistical sample was determined.

  17. Sampled-data H∞ filtering for Markovian jump singularly perturbed systems with time-varying delay and missing measurements

    NASA Astrophysics Data System (ADS)

    Yan, Yifang; Yang, Chunyu; Ma, Xiaoping; Zhou, Linna

    2018-02-01

    In this paper, the sampled-data H∞ filtering problem is considered for Markovian jump singularly perturbed systems with time-varying delay and missing measurements. The sampled-data system is represented by a time-delay system, and the missing measurement phenomenon is described by an independent Bernoulli random process. By constructing an ɛ-dependent stochastic Lyapunov-Krasovskii functional, delay-dependent sufficient conditions are derived such that the filter error system satisfies the prescribed H∞ performance for all possible missing measurements. Then, an H∞ filter design method is proposed in terms of linear matrix inequalities. Finally, numerical examples are given to illustrate the feasibility and advantages of the obtained results.

  18. Exploring Sampling in the Detection of Multicategory EEG Signals

    PubMed Central

    Siuly, Siuly; Kabir, Enamul; Wang, Hua; Zhang, Yanchun

    2015-01-01

    The paper presents a framework based on sampling and machine learning techniques for the detection of multicategory EEG signals, where random sampling (RS) and optimal allocation sampling (OS) are explored. In the proposed framework, before applying the RS and OS schemes, the entire EEG signal of each class is partitioned into several groups based on a particular time period. The RS and OS schemes are used in order to obtain representative observations from each group of each category of EEG data. All of the samples selected by RS from the groups of each category are then combined into one set, named the RS set. In a similar way, an OS set is obtained for the OS scheme. Eleven statistical features are then extracted from the RS and OS sets separately. Finally, this study employs three well-known classifiers: k-nearest neighbor (k-NN), multinomial logistic regression with a ridge estimator (MLR), and support vector machine (SVM) to evaluate the performance of the RS and OS feature sets. The experimental outcomes demonstrate that the RS scheme represents the EEG signals well and that k-NN with RS is the optimum choice for detection of multicategory EEG signals. PMID:25977705
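
    A small sketch of the two sampling schemes described above: the signal of one class is split into time groups, then observations are drawn per group either by simple random sampling (RS) or by an optimal (Neyman-style) allocation (OS) in which group allocations are proportional to group size times within-group standard deviation. The simulated signal, group count and sample sizes are assumptions, not the paper's data or exact allocation formula.

```python
# Sketch of the two sampling schemes on simulated EEG-like data: each class's
# signal is split into time groups, then observations are drawn per group by
# simple random sampling (RS) or by Neyman-style optimal allocation (OS),
# where allocations are proportional to N_g * sd_g. Sizes are assumptions.
import numpy as np

rng = np.random.default_rng(5)

signal = rng.normal(0, 1, 12_000) * np.linspace(0.5, 2.0, 12_000)  # fake EEG channel
groups = np.array_split(signal, 6)          # partition by time period
total_n = 600                               # total observations to keep

# RS: equal share from every group
rs_set = np.concatenate([rng.choice(g, total_n // len(groups), replace=False)
                         for g in groups])

# OS: allocation proportional to group size times group standard deviation
weights = np.array([len(g) * g.std(ddof=1) for g in groups])
alloc = np.maximum(1, np.round(total_n * weights / weights.sum()).astype(int))
os_set = np.concatenate([rng.choice(g, min(n, len(g)), replace=False)
                         for g, n in zip(groups, alloc)])

print("RS set size:", rs_set.size, "OS set size:", os_set.size)
# Statistical features (mean, sd, skew, ...) would then be extracted from each
# set and passed to a classifier such as k-NN.
```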

  19. Framing deadly domestic violence: why the media's spin matters in newspaper coverage of femicide.

    PubMed

    Gillespie, Lane Kirkland; Richards, Tara N; Givens, Eugena M; Smith, M Dwayne

    2013-02-01

    The news media play a substantial role in shaping society's perceptions of social issues, including domestic violence. However, minimal research has been conducted to examine whether news media frame stories of femicide within the context of domestic violence. Using frame analysis, the present research compares newspaper articles representing 113 cases of femicide that define the murder as domestic violence to a random sample of 113 cases without coverage defining the femicide as domestic violence. Findings indicate that both groups are represented by multiple frames, including a previously unidentified frame that places the femicide in the context of domestic violence as a social problem.

  20. Scalable randomized benchmarking of non-Clifford gates

    NASA Astrophysics Data System (ADS)

    Cross, Andrew; Magesan, Easwar; Bishop, Lev; Smolin, John; Gambetta, Jay

    Randomized benchmarking is a widely used experimental technique to characterize the average error of quantum operations. Benchmarking procedures that scale to enable characterization of n-qubit circuits rely on efficient procedures for manipulating those circuits and, as such, have been limited to subgroups of the Clifford group. However, universal quantum computers require additional, non-Clifford gates to approximate arbitrary unitary transformations. We define a scalable randomized benchmarking procedure over n-qubit unitary matrices that correspond to protected non-Clifford gates for a class of stabilizer codes. We present efficient methods for representing and composing group elements, sampling them uniformly, and synthesizing corresponding poly(n)-sized circuits. The procedure provides experimental access to two independent parameters that together characterize the average gate fidelity of a group element. We acknowledge support from ARO under Contract W911NF-14-1-0124.

  1. Electrofishing effort required to estimate biotic condition in southern Idaho Rivers

    USGS Publications Warehouse

    Maret, Terry R.; Ott, Douglas S.; Herlihy, Alan T.

    2007-01-01

    An important issue surrounding biomonitoring in large rivers is the minimum sampling effort required to collect an adequate number of fish for accurate and precise determinations of biotic condition. During the summer of 2002, we sampled 15 randomly selected large-river sites in southern Idaho to evaluate the effects of sampling effort on an index of biotic integrity (IBI). Boat electrofishing was used to collect sample populations of fish in river reaches representing 40 and 100 times the mean channel width (MCW; wetted channel) at base flow. Minimum sampling effort was assessed by comparing the relation between reach length sampled and change in IBI score. Thirty-two species of fish in the families Catostomidae, Centrarchidae, Cottidae, Cyprinidae, Ictaluridae, Percidae, and Salmonidae were collected. Of these, 12 alien species were collected at 80% (12 of 15) of the sample sites; alien species represented about 38% of all species (N = 32) collected during the study. A total of 60% (9 of 15) of the sample sites had poor IBI scores. A minimum reach length of about 36 times MCW was determined to be sufficient for collecting an adequate number of fish for estimating biotic condition based on an IBI score. For most sites, this equates to collecting 275 fish at a site. Results may be applicable to other semiarid, fifth-order through seventh-order rivers sampled during summer low-flow conditions.

  2. RANdom SAmple Consensus (RANSAC) algorithm for material-informatics: application to photovoltaic solar cells.

    PubMed

    Kaspi, Omer; Yosipof, Abraham; Senderowitz, Hanoch

    2017-06-06

    An important aspect of chemoinformatics and material-informatics is the usage of machine learning algorithms to build Quantitative Structure Activity Relationship (QSAR) models. The RANdom SAmple Consensus (RANSAC) algorithm is a predictive modeling tool widely used in the image processing field for cleaning datasets from noise. RANSAC could be used as a "one stop shop" algorithm for developing and validating QSAR models, performing outlier removal, descriptor selection, model development and predictions for test set samples using an applicability domain. For "future" predictions (i.e., for samples not included in the original test set) RANSAC provides a statistical estimate for the probability of obtaining reliable predictions, i.e., predictions within a pre-defined number of standard deviations from the true values. In this work we describe the first application of RANSAC in material informatics, focusing on the analysis of solar cells. We demonstrate that for three datasets representing different metal oxide (MO) based solar cell libraries, RANSAC-derived models select descriptors previously shown to correlate with key photovoltaic properties and lead to good predictive statistics for these properties. These models were subsequently used to predict the properties of virtual solar cell libraries, highlighting interesting dependencies of PV properties on MO compositions.
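
    For readers unfamiliar with the algorithm itself, the following is a minimal, self-contained sketch of the core RANSAC loop for a simple linear model; it is not the authors' QSAR implementation, and the data, tolerance and iteration count are purely illustrative.

        import numpy as np

        def ransac_line(x, y, n_iter=200, sample_size=2, inlier_tol=1.0, min_inliers=10, seed=0):
            """Minimal RANSAC for fitting y = a*x + b in the presence of gross outliers."""
            rng = np.random.default_rng(seed)
            best_model, best_inliers = None, np.array([], dtype=int)
            for _ in range(n_iter):
                idx = rng.choice(len(x), size=sample_size, replace=False)   # random minimal subset
                a, b = np.polyfit(x[idx], y[idx], deg=1)                    # candidate model
                residuals = np.abs(y - (a * x + b))
                inliers = np.flatnonzero(residuals < inlier_tol)            # consensus set
                if len(inliers) > len(best_inliers) and len(inliers) >= min_inliers:
                    best_model = np.polyfit(x[inliers], y[inliers], deg=1)  # refit on the consensus set
                    best_inliers = inliers
            return best_model, best_inliers

        # toy data: a clean linear trend plus 20% gross outliers
        rng = np.random.default_rng(1)
        x = np.linspace(0, 10, 100)
        y = 2.0 * x + 1.0 + rng.normal(0, 0.3, size=100)
        y[rng.choice(100, 20, replace=False)] += rng.normal(0, 10, size=20)
        model, inliers = ransac_line(x, y)
        print(model, len(inliers))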

  3. 2008 Niday Perinatal Database quality audit: report of a quality assurance project.

    PubMed

    Dunn, S; Bottomley, J; Ali, A; Walker, M

    2011-12-01

    This quality assurance project was designed to determine the reliability, completeness and comprehensiveness of the data entered into Niday Perinatal Database. Quality of the data was measured by comparing data re-abstracted from the patient record to the original data entered into the Niday Perinatal Database. A representative sample of hospitals in Ontario was selected and a random sample of 100 linked mother and newborn charts were audited for each site. A subset of 33 variables (representing 96 data fields) from the Niday dataset was chosen for re-abstraction. Of the data fields for which Cohen's kappa statistic or intraclass correlation coefficient (ICC) was calculated, 44% showed substantial or almost perfect agreement (beyond chance). However, about 17% showed less than 95% agreement and a kappa or ICC value of less than 60% indicating only slight, fair or moderate agreement (beyond chance). Recommendations to improve the quality of these data fields are presented.
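
    As an illustration of the agreement statistic used in the audit, here is a minimal sketch of Cohen's kappa for one categorical data field; the two hypothetical "raters" stand in for the original database entry and the re-abstracted chart value.

        import numpy as np

        def cohens_kappa(rater_a, rater_b):
            """Cohen's kappa: agreement beyond chance for two raters over one categorical field."""
            cats = sorted(set(rater_a) | set(rater_b))
            n = len(rater_a)
            cm = np.zeros((len(cats), len(cats)))        # confusion matrix of rater A vs rater B
            for a, b in zip(rater_a, rater_b):
                cm[cats.index(a), cats.index(b)] += 1
            p_observed = np.trace(cm) / n
            p_expected = np.sum(cm.sum(axis=0) * cm.sum(axis=1)) / n ** 2
            return (p_observed - p_expected) / (1 - p_expected)

        # hypothetical example: original database entry vs re-abstracted chart value
        original     = ["yes", "yes", "no", "no", "yes", "no", "yes", "no", "no", "yes"]
        reabstracted = ["yes", "no",  "no", "no", "yes", "no", "yes", "yes", "no", "yes"]
        print(round(cohens_kappa(original, reabstracted), 2))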

  4. Rigidity of the far-right? Motivated social cognition in a nationally representative sample of Hungarians on the eve of the far-right breakthrough in the 2010 elections.

    PubMed

    Lönnqvist, Jan-Erik; Szabó, Zsolt Péter; Kelemen, Laszlo

    2018-04-26

    We investigated the "rigidity of the right" hypothesis in the context of the far-right breakthrough in the 2010 Hungarian parliamentary elections. This hypothesis suggests that psychological characteristics having to do with need for security and certainty attract people to a broad-based right-wing ideology. A nationally representative sample (N = 1000) in terms of age, gender and place of residence was collected by means of the random walking method and face-to-face interviews. Voters of JOBBIK (n = 124), the radically nationalist conservative far-right party, scored lower on System Justifying Belief, Belief in a Just World (Global) and higher on Need for Cognition than other voters. Our results contradict the "rigidity of the right" hypothesis: JOBBIK voters scored, on many measures, opposite to what the hypothesis would predict. © 2018 International Union of Psychological Science.

  5. Treatment outcomes in 4 modes of orthodontic practice.

    PubMed

    Poulton, Donald; Vlaskalic, Vicki; Baumrind, Sheldon

    2005-03-01

    This study is a continuation of a previously published report on the outcome of orthodontic treatment provided in offices representing different modes of practice. The sample consisted of duplicate pretreatment (T1) and posttreatment (T2) dental casts of 348 patients from traditional private orthodontic practices (5 offices, 134 patients), company-owned practices (5 offices, 107 patients), offices associated with practice-management organizations (2 offices, 60 patients), and general dental practices (2 offices, 47 patients). Methods were used to obtain random, representative samples from each office, starting with lists of patients who were treated consecutively with full fixed orthodontic appliances. The dental casts were measured by 2 independent judges who used the unweighted PAR score. Good interjudge agreement was shown on the initial casts, but the agreement was not as strong on the final casts. The measurements showed that treatment outcomes were generally satisfactory, although some significant differences between offices and management modes were shown.

  6. Prevalence and correlates of problematic smartphone use in a large random sample of Chinese undergraduates.

    PubMed

    Long, Jiang; Liu, Tie-Qiao; Liao, Yan-Hui; Qi, Chang; He, Hao-Yu; Chen, Shu-Bao; Billieux, Joël

    2016-11-17

    Smartphones are becoming a daily necessity for most undergraduates in Mainland China. Because the present scenario of problematic smartphone use (PSU) is largely unexplored, in the current study we aimed to estimate the prevalence of PSU and to screen suitable predictors for PSU among Chinese undergraduates in the framework of the stress-coping theory. A sample of 1062 undergraduate smartphone users was recruited by means of the stratified cluster random sampling strategy between April and May 2015. The Problematic Cellular Phone Use Questionnaire was used to identify PSU. We evaluated five candidate risk factors for PSU by using logistic regression analysis while controlling for demographic characteristics and specific features of smartphone use. The prevalence of PSU among Chinese undergraduates was estimated to be 21.3%. The risk factors for PSU were majoring in the humanities, high monthly income from the family (≥1500 RMB), serious emotional symptoms, high perceived stress, and perfectionism-related factors (high doubts about actions, high parental expectations). PSU among undergraduates appears to be ubiquitous and thus constitutes a public health issue in Mainland China. Although further longitudinal studies are required to test whether PSU is a transient phenomenon or a chronic and progressive condition, our study successfully identified socio-demographic and psychological risk factors for PSU. These results, obtained from a random and thus representative sample of undergraduates, open up new avenues in terms of prevention and regulation policies.

  7. Methods and analysis of realizing randomized grouping.

    PubMed

    Hu, Liang-Ping; Bao, Xiao-Lei; Wang, Qi

    2011-07-01

    Randomization is one of the four basic principles of research design. The meaning of randomization includes two aspects: one is to randomly select samples from the population, which is known as random sampling; the other is to randomly group all the samples, which is called randomized grouping. Randomized grouping can be subdivided into three categories: completely randomized, stratified randomized and dynamically randomized grouping. This article mainly introduces the steps of complete randomization, the definition of dynamic randomization, and the realization of random sampling and grouping with SAS software.
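
    A minimal sketch of complete randomization, the first of the three grouping schemes mentioned: shuffle the subjects once and deal them into groups. The SAS workflow described in the article is not reproduced here; the subject identifiers and group count are made up.

        import numpy as np

        def complete_randomization(subject_ids, n_groups=2, seed=42):
            """Randomly permute the subjects and deal them into groups of (nearly) equal size."""
            rng = np.random.default_rng(seed)
            shuffled = rng.permutation(subject_ids)
            return {g: list(shuffled[g::n_groups]) for g in range(n_groups)}

        subjects = [f"S{i:03d}" for i in range(1, 21)]    # 20 hypothetical subjects
        groups = complete_randomization(subjects, n_groups=2)
        print(groups[0])
        print(groups[1])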

  8. "Under the radar": nurse practitioner prescribers and pharmaceutical industry promotions.

    PubMed

    Ladd, Elissa C; Mahoney, Diane Feeney; Emani, Srinivas

    2010-12-01

    To assess nurse practitioners' interactions with pharmaceutical industry promotional activities and their perception of information reliability and self-reported prescribing behaviors. Self-administered online survey. A nationally randomized sample of nurse practitioner prescribers was surveyed. Eligibility criteria included current clinical practice and licensure to prescribe medications in their state of practice. A total of 263 responses were analyzed. Almost all respondents (96%) reported regular contact with pharmaceutical sales representatives, and most (71%) reported receiving information on new drugs directly from pharmaceutical sales representatives some or most of the time. A large portion (66%) dispensed drug samples regularly to their patients, and 73% believed that samples were somewhat or very helpful in learning about new drugs. Eighty-one percent of respondents thought that it was ethically acceptable to give out samples to anyone, and 90% believed that it was acceptable to attend lunch and dinner events sponsored by the pharmaceutical industry. Almost half (48%) stated that they were more likely to prescribe a drug that was highlighted during a lunch or dinner event. Most respondents stated that it was ethically acceptable for speakers to be paid by industry. Nurse practitioner prescribers had extensive contact with pharmaceutical industry promotional activities such as pharmaceutical representative contact, receipt of drug samples, and regular attendance at industry-sponsored meal events and continuing education programs. They reported that industry interface with nurse practitioner prescribers in the form of sponsored meals, education events, and paid speakers was ethically acceptable.

  9. Random function representation of stationary stochastic vector processes for probability density evolution analysis of wind-induced structures

    NASA Astrophysics Data System (ADS)

    Liu, Zhangjun; Liu, Zenghui

    2018-06-01

    This paper develops a hybrid approach of spectral representation and random functions for simulating stationary stochastic vector processes. In the proposed approach, the high-dimensional random variables included in the original spectral representation (OSR) formula can be effectively reduced to only two elementary random variables by introducing random functions that serve as random constraints. On this basis, satisfactory simulation accuracy can be guaranteed by selecting a small representative point set of the elementary random variables. The probability information of the stochastic excitations can be fully captured with just several hundred sample functions generated by the proposed approach. Therefore, combined with the probability density evolution method (PDEM), the approach can be used to carry out dynamic response analysis and reliability assessment of engineering structures. For illustrative purposes, a stochastic turbulence wind velocity field acting on a frame-shear-wall structure is simulated by constructing three types of random functions to demonstrate the accuracy and efficiency of the proposed approach. Careful and in-depth studies concerning the probability density evolution analysis of the wind-induced structure have been conducted so as to better illustrate the application prospects of the proposed approach. Numerical examples also show that the proposed approach possesses good robustness.
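
    For orientation, the sketch below simulates a scalar stationary process with the classical spectral-representation (OSR-type) formula, X(t) = Σ_k sqrt(2 G(ω_k) Δω) cos(ω_k t + Φ_k). It does not implement the paper's two-random-variable reduction, and the one-sided spectrum used is an arbitrary example rather than a wind spectrum.

        import numpy as np

        def simulate_stationary_process(one_sided_psd, omega_max, n_freq, t, rng):
            """Spectral representation: X(t) = sum_k sqrt(2*G(w_k)*dw) * cos(w_k*t + phi_k)."""
            d_omega = omega_max / n_freq
            omega = (np.arange(n_freq) + 0.5) * d_omega            # midpoint frequencies
            amplitude = np.sqrt(2.0 * one_sided_psd(omega) * d_omega)
            phase = rng.uniform(0.0, 2.0 * np.pi, size=n_freq)     # independent random phases
            return np.sum(amplitude[:, None] * np.cos(np.outer(omega, t) + phase[:, None]), axis=0)

        # example one-sided PSD (arbitrary illustrative shape, not a wind spectrum from the paper)
        psd = lambda w: 1.0 / (1.0 + w ** 2)
        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 100.0, 2001)
        samples = np.array([simulate_stationary_process(psd, 10.0, 512, t, rng) for _ in range(200)])

        # the ensemble variance should approach the integral of the PSD over [0, omega_max]
        w = np.linspace(0.0, 10.0, 1000)
        print(samples.var(), np.trapz(psd(w), w))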

  10. Radiation Transport in Random Media With Large Fluctuations

    NASA Astrophysics Data System (ADS)

    Olson, Aaron; Prinja, Anil; Franke, Brian

    2017-09-01

    Neutral particle transport in media exhibiting large and complex material property spatial variation is modeled by representing cross sections as lognormal random functions of space, generated through a nonlinear memory-less transformation of a Gaussian process with covariance uniquely determined by the covariance of the cross section. A Karhunen-Loève decomposition of the Gaussian process is implemented to efficiently generate realizations of the random cross sections, and Woodcock Monte Carlo is used to transport particles on each realization and generate benchmark solutions for the mean and variance of the particle flux as well as probability densities of the particle reflectance and transmittance. A computationally efficient stochastic collocation method is implemented to directly compute the statistical moments such as the mean and variance, while a polynomial chaos expansion in conjunction with stochastic collocation provides a convenient surrogate model that also produces probability densities of output quantities of interest. Extensive numerical testing demonstrates that use of stochastic reduced-order modeling provides an accurate and cost-effective alternative to random sampling for particle transport in random media.
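
    A minimal sketch of the field-generation step only: realizations of a lognormal random function of space obtained by exponentiating a truncated Karhunen-Loève expansion of a Gaussian process. The exponential covariance, grid and mode count are illustrative, and no transport solve is included.

        import numpy as np

        # 1-D spatial grid and an exponential covariance for the underlying Gaussian process
        x = np.linspace(0.0, 10.0, 200)
        corr_length, sigma_g, mean_g = 1.0, 0.5, 0.0
        cov = sigma_g ** 2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_length)

        # Karhunen-Loeve: eigen-decompose the covariance and keep the dominant modes
        eigvals, eigvecs = np.linalg.eigh(cov)
        order = np.argsort(eigvals)[::-1]
        n_modes = 20
        lam, phi = eigvals[order][:n_modes], eigvecs[:, order][:, :n_modes]

        def lognormal_realization(rng):
            xi = rng.standard_normal(n_modes)                    # independent standard normals
            gaussian_field = mean_g + phi @ (np.sqrt(lam) * xi)  # truncated KL expansion
            return np.exp(gaussian_field)                        # memory-less lognormal transform

        rng = np.random.default_rng(0)
        realizations = np.array([lognormal_realization(rng) for _ in range(1000)])
        print(realizations.mean(), realizations.var())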

  11. The Degree and Nature to Which Public School Libraries Are Automated: A Survey of Public School Libraries in Ohio.

    ERIC Educational Resources Information Center

    Meckler, Elizabeth M.

    This paper examines the belief that no more than half of the public school libraries in the state of Ohio are automated to any degree. The purpose of the research was to determine the degree and nature of automation at the public school libraries in Ohio. A written survey was mailed to 350 libraries that represented a randomized sample of the…

  12. Confidence regions of planar cardiac vectors

    NASA Technical Reports Server (NTRS)

    Dubin, S.; Herr, A.; Hunt, P.

    1980-01-01

    A method for plotting the confidence regions of vectorial data obtained in electrocardiology is presented. The 90%, 95% and 99% confidence regions of cardiac vectors represented in a plane are obtained in the form of an ellipse centered at coordinates corresponding to the means of a sample selected at random from a bivariate normal distribution. An example of such a plot for the frontal plane QRS mean electrical axis for 80 horses is also presented.
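
    A minimal sketch of one common way to construct such elliptical confidence regions, using the sample mean and covariance together with chi-square quantiles of the squared Mahalanobis distance; the bivariate "cardiac vector" data here are simulated, and the construction is generic rather than the authors' exact method.

        import numpy as np
        from scipy.stats import chi2

        def confidence_ellipse(points, level=0.95, n_segments=200):
            """Ellipse {x : (x-mean)^T S^-1 (x-mean) = chi2_2(level)} for bivariate data."""
            mean = points.mean(axis=0)
            cov = np.cov(points, rowvar=False)
            radius = np.sqrt(chi2.ppf(level, df=2))        # Mahalanobis radius for this level
            eigvals, eigvecs = np.linalg.eigh(cov)
            theta = np.linspace(0.0, 2.0 * np.pi, n_segments)
            circle = np.column_stack([np.cos(theta), np.sin(theta)])
            return mean + radius * circle * np.sqrt(eigvals) @ eigvecs.T

        # hypothetical frontal-plane vector components, drawn from a bivariate normal
        rng = np.random.default_rng(0)
        sample = rng.multivariate_normal([0.4, 0.9], [[0.04, 0.01], [0.01, 0.09]], size=80)
        for level in (0.90, 0.95, 0.99):
            ellipse = confidence_ellipse(sample, level)
            print(level, ellipse.min(axis=0), ellipse.max(axis=0))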

  13. Effect of initial spacing on mechanical properties of lumber sawn from unthinned slash pine at age 40

    Treesearch

    Robert H. McAlister; Alexander Clark; Joseph R. Saucier

    1997-01-01

    The effect of initial planting density on strength and stiffness of slash pine (Pinus elliotti Engelm. var elliotti) from a 40-year-old plantation on the Georgia Coastal Plain was examined. A stratified random sample of trees with diameters at breast height ranging from 8 to 16 inches from replicated stands representing tree spacing of 6 by 8, 8 by 8, 10 by 10, and 15...

  14. Inference from clustering with application to gene-expression microarrays.

    PubMed

    Dougherty, Edward R; Barrera, Junior; Brun, Marcel; Kim, Seungchan; Cesar, Roberto M; Chen, Yidong; Bittner, Michael; Trent, Jeffrey M

    2002-01-01

    There are many algorithms to cluster sample data points based on nearness or a similarity measure. Often the implication is that points in different clusters come from different underlying classes, whereas those in the same cluster come from the same class. Stochastically, the underlying classes represent different random processes. The inference is that clusters represent a partition of the sample points according to which process they belong. This paper discusses a model-based clustering toolbox that evaluates cluster accuracy. Each random process is modeled as its mean plus independent noise, sample points are generated, the points are clustered, and the clustering error is the number of points clustered incorrectly according to the generating random processes. Various clustering algorithms are evaluated based on process variance and the key issue of the rate at which algorithmic performance improves with increasing numbers of experimental replications. The model means can be selected by hand to test the separability of expected types of biological expression patterns. Alternatively, the model can be seeded by real data to test the expected precision of that output or the extent of improvement in precision that replication could provide. In the latter case, a clustering algorithm is used to form clusters, and the model is seeded with the means and variances of these clusters. Other algorithms are then tested relative to the seeding algorithm. Results are averaged over various seeds. Output includes error tables and graphs, confusion matrices, principal-component plots, and validation measures. Five algorithms are studied in detail: K-means, fuzzy C-means, self-organizing maps, hierarchical Euclidean-distance-based and correlation-based clustering. The toolbox is applied to gene-expression clustering based on cDNA microarrays using real data. Expression profile graphics are generated and error analysis is displayed within the context of these profile graphics. A large amount of generated output is available over the web.
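
    A minimal sketch of the evaluation idea described above: generate points from known processes (hand-picked means plus independent noise), cluster them, and report the clustering error as the smallest misclassification count over relabelings of the clusters. K-means stands in for the five algorithms studied, and all parameters are illustrative.

        import itertools
        import numpy as np
        from sklearn.cluster import KMeans

        def clustering_error(true_labels, predicted_labels, n_clusters):
            """Smallest misclassification count over all relabelings of the clusters."""
            best = len(true_labels)
            for perm in itertools.permutations(range(n_clusters)):
                mapped = np.array([perm[p] for p in predicted_labels])
                best = min(best, int(np.sum(mapped != true_labels)))
            return best

        rng = np.random.default_rng(0)
        means = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 3.0]])    # hand-picked "expression" patterns
        n_per_class, noise_sd = 50, 1.0
        points = np.vstack([m + rng.normal(0, noise_sd, size=(n_per_class, 2)) for m in means])
        labels = np.repeat(np.arange(len(means)), n_per_class)

        predicted = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(points)
        print(clustering_error(labels, predicted, 3), "of", len(labels), "points misclustered")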

  15. Comparison of non-landslide sampling strategies to counteract inventory-based biases within national-scale statistical landslide susceptibility models

    NASA Astrophysics Data System (ADS)

    Lima, Pedro; Steger, Stefan; Glade, Thomas

    2017-04-01

    Landslides can represent a significant threat to people and infrastructure in hilly and mountainous landscapes worldwide. The understanding and prediction of these geomorphic processes is crucial to avoid economic losses or even casualties among people and damage to their properties. Statistically based landslide susceptibility models are well known for being highly reliant on the quality, representativeness and availability of input data. In this context, several studies indicate that the landslide inventory represents the most important input data. However, each landslide mapping technique or data collection has its drawbacks. Consequently, biased landslide inventories may commonly be introduced into statistical models, especially at regional or even national scale. It is up to the researcher to be aware of potential limitations and to design strategies that avoid or reduce the propagation of input-data errors and biases into the modelling outcomes. Previous studies have shown that such erroneous landslide inventories may lead to unrealistic landslide susceptibility maps. We assume that one possibility to tackle systematic landslide-inventory biases is to concentrate on sampling strategies that focus on the distribution of non-landslide locations. For this purpose, we test an approach for the Austrian territory that concentrates on a modified non-landslide sampling strategy instead of the traditionally applied random sampling. It is expected that the way non-landslide locations are represented (e.g. equally over the area or only within those areas where mapping campaigns have been conducted) is important for reducing a potential over- or underestimation of landslide susceptibility within specific areas caused by bias, since presumably every landslide inventory is systematically incomplete, especially in areas where no mapping campaign has been conducted. This also applies to the inventory currently available for the Austrian territory, which comprises 14,519 shallow landslides. Within this study, we introduce the following explanatory variables to test the effect of different non-landslide strategies: lithological units grouped by their geotechnical properties, and topographic parameters such as aspect, elevation, slope gradient and topographic position. Landslide susceptibility maps will be derived by applying logistic regression, while systematic comparisons will be carried out based on models created with the different non-landslide sampling strategies. Models generated by conventional random sampling are compared against models based on stratified and clustered sampling strategies. The modelling results will be compared in terms of their prediction performance, measured by the AUROC (Area Under the Receiver Operating Characteristic Curve) obtained by means of a k-fold cross-validation, and also by the spatial pattern of the maps. The outcomes of this study are intended to contribute to the understanding of how landslide-inventory-based biases may be counteracted.
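
    The comparison the authors propose can be illustrated on synthetic data: fit a logistic-regression susceptibility model with non-landslide cells drawn either at random over the whole area or only from a restricted "mapped" subregion, and compare cross-validated AUROC. Everything below (the predictors, the true susceptibility relation, the definition of the mapped subregion) is invented for illustration and is not the Austrian dataset.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)

        # synthetic study area: landslide probability driven by slope only
        n_cells = 20000
        slope = rng.uniform(0, 45, n_cells)
        elevation = rng.uniform(200, 2000, n_cells)
        landslide = rng.random(n_cells) < 1.0 / (1.0 + np.exp(-0.25 * (slope - 40)))

        presence = np.flatnonzero(landslide)
        absence_pool = np.flatnonzero(~landslide)

        def cv_auroc(absence):
            idx = np.concatenate([presence, absence])
            X = np.column_stack([slope[idx], elevation[idx]])
            y = landslide[idx].astype(int)
            model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
            return cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()

        # strategy 1: non-landslide cells sampled at random over the whole area
        random_absence = rng.choice(absence_pool, size=len(presence), replace=False)
        # strategy 2: non-landslide cells restricted to a "mapped" subregion (here: low elevations)
        mapped_pool = absence_pool[elevation[absence_pool] < 800]
        clustered_absence = rng.choice(mapped_pool, size=len(presence), replace=False)

        print("random non-landslide sample, AUROC:   ", round(cv_auroc(random_absence), 3))
        print("clustered non-landslide sample, AUROC:", round(cv_auroc(clustered_absence), 3))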

  16. Sample design effects in landscape genetics

    USGS Publications Warehouse

    Oyler-McCance, Sara J.; Fedy, Bradley C.; Landguth, Erin L.

    2012-01-01

    An important research gap in landscape genetics is the impact of different field sampling designs on the ability to detect the effects of landscape pattern on gene flow. We evaluated how five different sampling regimes (random, linear, systematic, cluster, and single study site) affected the probability of correctly identifying the generating landscape process of population structure. Sampling regimes were chosen to represent a suite of designs common in field studies. We used genetic data generated from a spatially-explicit, individual-based program and simulated gene flow in a continuous population across a landscape with gradual spatial changes in resistance to movement. Additionally, we evaluated the sampling regimes using realistic and obtainable numbers of loci (10 and 20), number of alleles per locus (5 and 10), number of individuals sampled (10-300), and generational time after the landscape was introduced (20 and 400). For a simulated continuously distributed species, we found that random, linear, and systematic sampling regimes performed well with high sample sizes (>200), levels of polymorphism (10 alleles per locus), and number of molecular markers (20). The cluster and single study site sampling regimes were not able to correctly identify the generating process under any conditions and thus, are not advisable strategies for scenarios similar to our simulations. Our research emphasizes the importance of sampling data at ecologically appropriate spatial and temporal scales and suggests careful consideration for sampling near landscape components that are likely to most influence the genetic structure of the species. In addition, simulating sampling designs a priori could help guide field data collection efforts.

  17. Feasibility of collecting 24-h urine to monitor sodium intake in the National Health and Nutrition Examination Survey123

    PubMed Central

    Terry, Ana L; Cogswell, Mary E; Wang, Chia-Yih; Chen, Te-Ching; Loria, Catherine M; Wright, Jacqueline D; Zhang, Xinli; Lacher, David A; Merritt, Robert K; Bowman, Barbara A

    2016-01-01

    Background: Twenty-four–hour urine sodium excretion is recommended for monitoring population sodium intake. Because of concerns about participation and completion, sodium excretion has not been collected previously in US nationally representative surveys. Objective: We assessed the feasibility of implementing 24-h urine collections as part of a nationally representative survey. Design: We selected a random half sample of nonpregnant US adults aged 20–69 y in 3 geographic locations of the 2013 NHANES. Participants received explicit instructions, started and ended the urine collection in a urine study mobile examination center, and answered questions about their collection. Among those with a complete 24-h urine collection, a random one-half were asked to collect a second 24-h urine sample. Sodium, potassium, chloride, and creatinine excretion were analyzed. Results: The final NHANES examination response rate for adults aged 20–69 y in these 3 study locations was 71%. Of those examined (n = 476), 282 (59%) were randomly selected to participate in the 24-h urine collection. Of these, 212 persons [75% of those selected for 24-h urine collection; 53% (equal to 71% × 75% of those selected for the NHANES)] collected a complete initial 24-h specimen and 92 persons (85% of 108 selected) collected a second complete 24-h urine sample. More men than women completed an initial collection (P = 0.04); otherwise, completion did not vary by sociodemographic characteristics, body mass index, education, or employment status for either collection. Mean 24-h urine volume and sodium excretion were 1964 ± 1228 mL and 3657 ± 2003 mg, respectively, for the first 24-h urine sample, and 2048 ± 1288 mL and 3773 ± 1891 mg, respectively, for the second collection. Conclusion: Given the 53% final component response rate and 75% completion rate, 24-h urine collections were deemed feasible and implemented in the NHANES 2014 on a subsample of adults aged 20–69 y to assess population sodium intake. This study was registered at clinicaltrials.gov as NCT02723682. PMID:27413136

  18. Post-traumatic stress disorder in older adults: a systematic review of the psychotherapy treatment literature.

    PubMed

    Dinnen, Stephanie; Simiola, Vanessa; Cook, Joan M

    2015-01-01

    Older adults represent the fastest growing segment of the US and industrialized populations. However, older adults have generally not been included in randomized clinical trials of psychotherapy for post-traumatic stress disorder (PTSD). This review examined reports of psychological treatment for trauma-related problems, primarily PTSD, in studies with samples of at least 50% adults aged 55 and older using standardized measures. A systematic review of the literature was conducted on psychotherapy for PTSD with older adults using PubMed, Medline, PsychInfo, CINAHL, PILOTS, and Google Scholar. A total of 42 studies were retrieved for full review; 22 were excluded because they did not provide at least one outcome measure or results were not reported by age in the case of mixed-age samples. Of the 20 studies that met review criteria, there were: 13 case studies or series, three uncontrolled pilot studies, two randomized clinical trials, one non-randomized concurrent control study and one post hoc effectiveness study. Significant methodological limitations in the current older adult PTSD treatment outcome literature were found reducing its internal validity and generalizability, including non-randomized research designs, lack of comparison conditions and small sample sizes. Select evidence-based interventions validated in younger and middle-aged populations appear acceptable and efficacious with older adults. There are few treatment studies on subsets of the older adult population including cultural and ethnic minorities, women, the oldest old (over 85), and those who are cognitively impaired. Implications for clinical practice and future research directions are discussed.

  19. [Krigle estimation and its simulated sampling of Chilo suppressalis population density].

    PubMed

    Yuan, Zheming; Bai, Lianyang; Wang, Kuiwu; Hu, Xiangyue

    2004-07-01

    In order to draw up a rational sampling plan for the larval population of Chilo suppressalis, an original population and its two derivative populations, a random population and a sequence population, were sampled and compared using random sampling, gap-range-random sampling, and a new systematic sampling scheme that integrates Krigle (kriging) interpolation with a random origin position. For the original population, whose distribution was aggregated and whose dependence range in the line direction was 115 cm (6.9 units), gap-range-random sampling in the line direction was more precise than random sampling. Correctly distinguishing the population pattern is the key to obtaining better precision. Gap-range-random sampling and random sampling are suited to aggregated populations and random populations, respectively, but both are difficult to apply in practice. Therefore, a new systematic sampling scheme, the Krigle sample (n = 441), was developed to estimate the density of a partial sample (partial estimation, n = 441) and of the population (overall estimation, N = 1500). For the original population, the estimation precision of the Krigle sample for both the partial sample and the population was better than that of the investigation sample. As the aggregation intensity of the population increased, the Krigle sample was more effective than the investigation sample in both partial estimation and overall estimation, provided the sampling gap was appropriate to the dependence range.

  20. On the predictivity of pore-scale simulations: Estimating uncertainties with multilevel Monte Carlo

    NASA Astrophysics Data System (ADS)

    Icardi, Matteo; Boccardo, Gianluca; Tempone, Raúl

    2016-09-01

    A fast method with tunable accuracy is proposed to estimate errors and uncertainties in pore-scale and Digital Rock Physics (DRP) problems. The overall predictivity of these studies can be, in fact, hindered by many factors including sample heterogeneity, computational and imaging limitations, model inadequacy and not perfectly known physical parameters. The typical objective of pore-scale studies is the estimation of macroscopic effective parameters such as permeability, effective diffusivity and hydrodynamic dispersion. However, these are often non-deterministic quantities (i.e., results obtained for a specific pore-scale sample and setup are not totally reproducible by another 'equivalent' sample and setup). The stochastic nature can arise due to the multi-scale heterogeneity, the computational and experimental limitations in considering large samples, and the complexity of the physical models. These approximations, in fact, introduce an error that, being dependent on a large number of complex factors, can be modeled as random. We propose a general simulation tool, based on multilevel Monte Carlo, that can drastically reduce the computational cost needed for computing accurate statistics of effective parameters and other quantities of interest, under any of these random errors. This is, to our knowledge, the first attempt to include Uncertainty Quantification (UQ) in pore-scale physics and simulation. The method can also provide estimates of the discretization error, and it is tested on three-dimensional transport problems in heterogeneous materials, where the sampling procedure is done by generation algorithms able to reproduce realistic consolidated and unconsolidated random sphere and ellipsoid packings and arrangements. A totally automatic workflow is developed in an open-source code [1] that includes rigid body physics and random packing algorithms, unstructured mesh discretization, finite volume solvers, extrapolation and post-processing techniques. The proposed method can be efficiently used in many porous media applications for problems such as stochastic homogenization/upscaling, propagation of uncertainty from microscopic fluid and rock properties to macro-scale parameters, robust estimation of Representative Elementary Volume size for arbitrary physics.
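
    To make the multilevel idea concrete, here is a toy multilevel Monte Carlo estimator for a scalar quantity of interest whose level-l approximation is a progressively finer quadrature of a random integrand. The pore-scale solvers, packings and meshes of the paper are not represented, and all numbers are illustrative.

        import numpy as np

        rng = np.random.default_rng(0)

        def quantity(z, level):
            """Level-l approximation of the QoI: midpoint quadrature of exp(z*x) on [0,1] with 2^l cells."""
            n = 2 ** level
            x = (np.arange(n) + 0.5) / n
            return np.mean(np.exp(z * x))

        def mlmc_estimate(samples_per_level):
            """Multilevel MC: E[P_L] ~= E[P_0] + sum_l E[P_l - P_{l-1}], each term with its own samples."""
            estimate = 0.0
            for level, n_samples in enumerate(samples_per_level):
                z = rng.standard_normal(n_samples)          # same random inputs for both members of a pair
                fine = np.array([quantity(zi, level) for zi in z])
                coarse = np.array([quantity(zi, level - 1) for zi in z]) if level > 0 else 0.0
                estimate += np.mean(fine - coarse)
            return estimate

        # more samples on cheap coarse levels, fewer on expensive fine levels
        print("MLMC estimate:", mlmc_estimate([4000, 1000, 250, 60]))

        # brute-force single-level reference at the finest resolution
        z = rng.standard_normal(4000)
        print("plain MC at finest level:", np.mean([quantity(zi, 3) for zi in z]))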

  1. Jello Shot Consumption among Older Adolescents: A Pilot Study of a Newly Identified Public Health Problem

    PubMed Central

    Binakonsky, Jane; Giga, Noreen; Ross, Craig; Siegel, Michael

    2011-01-01

    We investigated the extent of jello shot consumption among underage youth. We conducted a pilot study among a non-random national sample of 108 drinkers, ages 16-20 years, recruited from the Knowledge Networks internet panel in 2010 using consecutive sampling. The prevalence of past 30-day jello shot consumption among the 108 16-20 year-old drinkers in our sample was 21.4% and among those who consumed jello shots, the percentage of alcohol consumption attributable to jello shots averaged 14.5%. We conclude that jello shot use is prevalent among youth, representing a substantial proportion of their alcohol intake. Surveillance of youth alcohol use should include jello shot consumption. PMID:21174500

  2. Focus on Function – a randomized controlled trial comparing two rehabilitation interventions for young children with cerebral palsy

    PubMed Central

    Law, Mary; Darrah, Johanna; Pollock, Nancy; Rosenbaum, Peter; Russell, Dianne; Walter, Stephen D; Petrenchik, Theresa; Wilson, Brenda; Wright, Virginia

    2007-01-01

    Background Children with cerebral palsy receive a variety of long-term physical and occupational therapy interventions to facilitate development and to enhance functional independence in movement, self-care, play, school activities and leisure. Considerable human and financial resources are directed at the "intervention" of the problems of cerebral palsy, although the available evidence supporting current interventions is inconclusive. A considerable degree of uncertainty remains about the appropriate therapeutic approaches to manage the habilitation of children with cerebral palsy. The primary objective of this project is to conduct a multi-site randomized clinical trial to evaluate the efficacy of a task/context-focused approach compared to a child-focused remediation approach in improving performance of functional tasks and mobility, increasing participation in everyday activities, and improving quality of life in children 12 months to 5 years of age who have cerebral palsy. Method/Design A multi-centred randomized controlled trial research design will be used. Children will be recruited from a representative sample of children attending publicly-funded regional children's rehabilitation centers serving children with disabilities in Ontario and Alberta in Canada. Target sample size is 220 children with cerebral palsy aged 12 months to 5 years at recruitment date. Therapists are randomly assigned to deliver either a context-focused approach or a child-focused approach. Children follow their therapist into their treatment arm. Outcomes will be evaluated at baseline, after 6 months of treatment and at a 3-month follow-up period. Outcomes represent the components of the International Classification of Functioning, Disability and Health, including body function and structure (range of motion), activities (performance of functional tasks, motor function), participation (involvement in formal and informal activities), and environment (parent perceptions of care, parental empowerment). Discussion This paper presents the background information, design and protocol for a randomized controlled trial comparing a task/context-focused approach to a child-focused remediation approach in improving functional outcomes for young children with cerebral palsy. Trial registration [clinical trial registration #: NCT00469872] PMID:17900362

  3. Systematic versus random sampling in stereological studies.

    PubMed

    West, Mark J

    2012-12-01

    The sampling that takes place at all levels of an experimental design must be random if the estimate is to be unbiased in a statistical sense. There are two fundamental ways by which one can make a random sample of the sections and positions to be probed on the sections. Using a card-sampling analogy, one can pick any card at all out of a deck of cards. This is referred to as independent random sampling because the sampling of any one card is made without reference to the position of the other cards. The other approach to obtaining a random sample would be to pick a card within a set number of cards and others at equal intervals within the deck. Systematic sampling along one axis of many biological structures is more efficient than random sampling, because most biological structures are not randomly organized. This article discusses the merits of systematic versus random sampling in stereological studies.
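
    The two schemes contrasted in the card analogy can be written down directly; the section counts below are hypothetical.

        import numpy as np

        rng = np.random.default_rng(0)
        n_sections, n_sampled = 120, 12

        # independent random sampling: any 12 sections, chosen without regard to position
        independent = np.sort(rng.choice(n_sections, size=n_sampled, replace=False))

        # systematic uniform random sampling: random start, then every k-th section
        k = n_sections // n_sampled
        start = rng.integers(0, k)
        systematic = np.arange(start, n_sections, k)

        print("independent:", independent)
        print("systematic: ", systematic)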

  4. Baseline Characteristics and Generalizability of Participants in an Internet Smoking Cessation Randomized Trial

    PubMed Central

    Cha, Sarah; Erar, Bahar; Niaura, Raymond S.; Graham, Amanda L.

    2016-01-01

    Background The potential for sampling bias in Internet smoking cessation studies is widely recognized. However, few studies have explicitly addressed the issue of sample representativeness in the context of an Internet smoking cessation treatment trial. Purpose To examine the generalizability of participants enrolled in a randomized controlled trial of an Internet smoking cessation intervention using weighted data from the National Health Interview Survey (NHIS). Methods A total of 5,290 new users on a smoking cessation website enrolled in the trial between March 2012–January 2015. Descriptive statistics summarized baseline characteristics of screened and enrolled participants and multivariate analysis examined predictors of enrollment. Generalizability analyses compared demographic and smoking characteristics of trial participants to current smokers in the 2012–2014 waves of NHIS (n=19,043), and to an NHIS subgroup based on Internet use and cessation behavior (n=3,664). Effect sizes were obtained to evaluate the magnitude of differences across variables. Results Predictors of study enrollment were age, gender, race, education, and motivation to quit. Compared to NHIS smokers, trial participants were more likely to be female, college educated, daily smokers, and to have made a quit attempt in the past year (all effect sizes 0.25–0.60). In comparisons with the NHIS subgroup, differences in gender and education were attenuated while differences in daily smoking and smoking rate were amplified. Conclusions Few differences emerged between Internet trial participants and nationally representative samples of smokers, and all were in expected directions. This study highlights the importance of assessing generalizability in a focused and specific manner. PMID:27283295

  5. Baseline Characteristics and Generalizability of Participants in an Internet Smoking Cessation Randomized Trial.

    PubMed

    Cha, Sarah; Erar, Bahar; Niaura, Raymond S; Graham, Amanda L

    2016-10-01

    The potential for sampling bias in Internet smoking cessation studies is widely recognized. However, few studies have explicitly addressed the issue of sample representativeness in the context of an Internet smoking cessation treatment trial. The purpose of the present study is to examine the generalizability of participants enrolled in a randomized controlled trial of an Internet smoking cessation intervention using weighted data from the National Health Interview Survey (NHIS). A total of 5290 new users on a smoking cessation website enrolled in the trial between March 2012 and January 2015. Descriptive statistics summarized baseline characteristics of screened and enrolled participants, and multivariate analysis examined predictors of enrollment. Generalizability analyses compared demographic and smoking characteristics of trial participants to current smokers in the 2012-2014 waves of NHIS (n = 19,043) and to an NHIS subgroup based on Internet use and cessation behavior (n = 3664). Effect sizes were obtained to evaluate the magnitude of differences across variables. Predictors of study enrollment were age, gender, race, education, and motivation to quit. Compared to NHIS smokers, trial participants were more likely to be female, college educated, and daily smokers and to have made a quit attempt in the past year (all effect sizes 0.25-0.60). In comparisons with the NHIS subgroup, differences in gender and education were attenuated, while differences in daily smoking and smoking rate were amplified. Few differences emerged between Internet trial participants and nationally representative samples of smokers, and all were in expected directions. This study highlights the importance of assessing generalizability in a focused and specific manner. CLINICALTRIALS.GOV: #NCT01544153.

  6. Changes in sample collection and analytical techniques and effects on retrospective comparability of low-level concentrations of trace elements in ground water

    USGS Publications Warehouse

    Ivahnenko, T.; Szabo, Z.; Gibs, J.

    2001-01-01

    Ground-water sampling techniques were modified to reduce random low-level contamination during collection of filtered water samples for determination of trace-element concentrations. The modified sampling techniques were first used in New Jersey by the US Geological Survey in 1994 along with inductively coupled plasma-mass spectrometry (ICP-MS) analysis to determine the concentrations of 18 trace elements at the one microgram-per-liter (μg/L) level in the oxic water of the unconfined sand and gravel Kirkwood-Cohansey aquifer system. The revised technique tested included a combination of the following: collection of samples (1) with flow rates of about 2 L per minute, (2) through acid-washed single-use disposable tubing and (3) a single-use disposable 0.45-μm pore size capsule filter, (4) contained within portable glove boxes, (5) in a dedicated clean sampling van, (6) only after turbidity stabilized at values less than 2 nephelometric turbidity units (NTU), when possible. Quality-assurance data, obtained from equipment blanks and split samples, indicated that trace element concentrations, with the exception of iron, chromium, aluminum, and zinc, measured in the samples collected in 1994 were not subject to random contamination at 1 μg/L. Results from samples collected in 1994 were compared to those from samples collected in 1991 from the same 12 PVC-cased observation wells using the available sampling and analytical techniques at that time. Concentrations of copper, lead, manganese and zinc were statistically significantly lower in samples collected in 1994 than in 1991. Sampling techniques used in 1994 likely provided trace-element data that represented concentrations in the aquifer with less bias than data from 1991 when samples were collected without the same degree of attention to sample handling.

  7. Treating stimuli as a random factor in social psychology: a new and comprehensive solution to a pervasive but largely ignored problem.

    PubMed

    Judd, Charles M; Westfall, Jacob; Kenny, David A

    2012-07-01

    Throughout social and cognitive psychology, participants are routinely asked to respond in some way to experimental stimuli that are thought to represent categories of theoretical interest. For instance, in measures of implicit attitudes, participants are primed with pictures of specific African American and White stimulus persons sampled in some way from possible stimuli that might have been used. Yet seldom is the sampling of stimuli taken into account in the analysis of the resulting data, in spite of numerous warnings about the perils of ignoring stimulus variation (Clark, 1973; Kenny, 1985; Wells & Windschitl, 1999). Part of this failure to attend to stimulus variation is due to the demands imposed by traditional analysis of variance procedures for the analysis of data when both participants and stimuli are treated as random factors. In this article, we present a comprehensive solution using mixed models for the analysis of data with crossed random factors (e.g., participants and stimuli). We show the substantial biases inherent in analyses that ignore one or the other of the random factors, and we illustrate the substantial advantages of the mixed models approach with both hypothetical and actual, well-known data sets in social psychology (Bem, 2011; Blair, Chapleau, & Judd, 2005; Correll, Park, Judd, & Wittenbrink, 2002). PsycINFO Database Record (c) 2012 APA, all rights reserved

  8. A Comparison of Techniques for Scheduling Earth-Observing Satellites

    NASA Technical Reports Server (NTRS)

    Globus, Al; Crawford, James; Lohn, Jason; Pryor, Anna

    2004-01-01

    Scheduling observations by coordinated fleets of Earth Observing Satellites (EOS) involves large search spaces, complex constraints and poorly understood bottlenecks, conditions where evolutionary and related algorithms are often effective. However, there are many such algorithms and the best one to use is not clear. Here we compare multiple variants of the genetic algorithm, stochastic hill climbing, simulated annealing, squeaky wheel optimization and iterated sampling on ten realistically-sized EOS scheduling problems. Schedules are represented by a permutation (non-temporal ordering) of the observation requests. A simple deterministic scheduler assigns times and resources to each observation request in the order indicated by the permutation, discarding those that violate the constraints created by previously scheduled observations. Simulated annealing performs best. Random mutation outperforms a more 'intelligent' mutator. Furthermore, the best mutator, by a small margin, was a novel approach we call temperature-dependent random sampling, which makes large changes in the early stages of evolution and smaller changes towards the end of search.
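
    A minimal sketch of the permutation-plus-greedy-decoder representation described above, optimized here with simulated annealing over random swap mutations on a single notional instrument. The request windows, priorities and cooling schedule are all invented and far simpler than the EOS problems in the paper.

        import math
        import random

        random.seed(0)
        # hypothetical observation requests: (id, earliest start, latest start, duration, priority)
        requests = [(i, random.randint(0, 80), random.randint(80, 120), random.randint(5, 15), random.random())
                    for i in range(40)]

        def decode(permutation):
            """Greedy deterministic decoder: schedule requests in permutation order, skip those that conflict."""
            busy_until, value = 0, 0.0
            for idx in permutation:
                _, earliest, latest, duration, priority = requests[idx]
                start = max(busy_until, earliest)
                if start <= latest:                          # request still fits inside its window
                    busy_until = start + duration
                    value += priority
            return value

        def anneal(n_steps=20000, t0=1.0, t_end=0.01):
            perm = list(range(len(requests)))
            random.shuffle(perm)
            best = current = decode(perm)
            for step in range(n_steps):
                temp = t0 * (t_end / t0) ** (step / n_steps)       # geometric cooling schedule
                i, j = random.sample(range(len(perm)), 2)
                perm[i], perm[j] = perm[j], perm[i]                # random swap mutation
                candidate = decode(perm)
                if candidate >= current or random.random() < math.exp((candidate - current) / temp):
                    current = candidate
                    best = max(best, current)
                else:
                    perm[i], perm[j] = perm[j], perm[i]            # undo rejected move
            return best

        print("best scheduled priority sum:", round(anneal(), 3))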

  9. Factual knowledge about AIDS and dating practices among high school students from selected schools.

    PubMed

    Nyachuru-Sihlangu, R H; Ndlovu, J

    1992-06-01

    Following various educational strategies by governmental and non-governmental organisations to educate youths and school teachers about HIV infection and prevention, this KABP survey was one attempt to evaluate the results. The study sample of 478 high school students was drawn from four randomly selected schools in Mashonaland and Matabeleland, including high and low density, government and mission co-educational schools. The sample was randomly selected and stratified to represent sex and grade level. The KABP self-administered questionnaire was used. The paper analyses the relationship between knowledge and dating patterns. Generally, respondents demonstrated a 50% to 80% accuracy of factual knowledge. Of the 66% of Form I through IV pupils who dated, 30% preferred only sexually involved relationships and a small number considered the possibility of HIV/AIDS infection. A theoretically based tripartite coalition involving the school, the family and health care services, providing education, guidance and support to promote responsible behaviour throughout childhood, was suggested.

  10. Open Inclusion or Shameful Secret: A Comparison of Characters with Fetal Alcohol Spectrum Disorders (FASD) and Characters with Autism Spectrum Disorders (ASD) in a North American Sample of Books for Children and Young Adults

    ERIC Educational Resources Information Center

    Barker, Conor; Kulyk, Juli; Knorr, Lyndsay; Brenna, Beverley

    2011-01-01

    Using a framework of critical literacy, and acknowledging the characteristics of Radical Change, the authors explore 75 North American youth fiction novels which depict characters with disabilities. Books were identified from a variety of sources (i.e., awards lists, book reviews, other research, and word-of-mouth), to represent a random sample…

  11. Identification of compound-protein interactions through the analysis of gene ontology, KEGG enrichment for proteins and molecular fragments of compounds.

    PubMed

    Chen, Lei; Zhang, Yu-Hang; Zheng, Mingyue; Huang, Tao; Cai, Yu-Dong

    2016-12-01

    Compound-protein interactions play important roles in every cell via the recognition and regulation of specific functional proteins. The correct identification of compound-protein interactions can lead to a good comprehension of this complicated system and provide useful input for the investigation of various attributes of compounds and proteins. In this study, we attempted to understand this system by extracting properties from both proteins and compounds, in which proteins were represented by gene ontology and KEGG pathway enrichment scores and compounds were represented by molecular fragments. Advanced feature selection methods, including minimum redundancy maximum relevance, incremental feature selection, and the basic machine learning algorithm random forest, were used to analyze these properties and extract core factors for the determination of actual compound-protein interactions. Compound-protein interactions reported in The Binding Databases were used as positive samples. To improve the reliability of the results, the analytic procedure was executed five times using different negative samples. Simultaneously, five optimal prediction methods based on a random forest and yielding maximum MCCs of approximately 77.55 % were constructed and may be useful tools for the prediction of compound-protein interactions. This work provides new clues to understanding the system of compound-protein interactions by analyzing extracted core features. Our results indicate that compound-protein interactions are related to biological processes involving immune, developmental and hormone-associated pathways.
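
    A minimal sketch of the final classification step only: a random forest trained on feature vectors (standing in for the gene-ontology/KEGG enrichment scores and molecular-fragment features) and evaluated with the Matthews correlation coefficient. The data are synthetic, and the feature-selection stages (mRMR, incremental feature selection) are omitted.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import matthews_corrcoef
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)

        # synthetic feature vectors standing in for [GO/KEGG enrichment scores | fragment counts]
        n_samples, n_features, n_informative = 2000, 100, 10
        X = rng.normal(size=(n_samples, n_features))
        weights = np.zeros(n_features)
        weights[:n_informative] = rng.normal(size=n_informative)   # only a few features carry signal
        y = (X @ weights + rng.normal(scale=0.5, size=n_samples) > 0).astype(int)

        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
        model = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_train, y_train)
        print("MCC:", round(matthews_corrcoef(y_test, model.predict(X_test)), 3))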

  12. Stalking and health – an Austrian prevalence study.

    PubMed

    Freidl, W; Neuberger, I; Schönberger, S; Raml, R

    2011-04-01

    The aim of this study was to estimate the prevalence of stalking and related subjective health impairment, based on concrete definitions of stalking, for a representative random sample of the female population in the Austrian Federal State of Styria. A representative random sample (randomised last digits procedure) of 2000 women selected from the female population of Styria aged 18 years or older underwent a computer-aided phone interview survey (CATI). Questions centred on the occurrence of stalking, the exact period of stalking, the gender of the stalker, the subjective impairment through stalking, addressing the aspects of life-style and the subjectively perceived state of health, and socio-demographic variables. For data analyses, descriptive statistics, χ(2) tests and t-tests were applied. Lifetime prevalence varies between ca. 6% and 18%, depending on definition levels. The annual prevalences reveal a range of 1-4%. 39-43% of the stalked women feel they are impaired in their life-style, and 32-40% feel impaired in their health. Higher age and living in a partnership reduce the likelihood of being stalked. 81% of the stalked women are stalked by a male person. The prevalences found in this study are in line with other international studies, although, in a direct comparison, they are in the lower range. However, these data document the relevance of the phenomenon of stalking for the female Austrian population. © Georg Thieme Verlag KG Stuttgart · New York.

  13. A novel one-class SVM based negative data sampling method for reconstructing proteome-wide HTLV-human protein interaction networks.

    PubMed

    Mei, Suyu; Zhu, Hao

    2015-01-26

    Protein-protein interaction (PPI) prediction is generally treated as a problem of binary classification wherein negative data sampling is still an open problem to be addressed. The commonly used random sampling is prone to yield less representative negative data with considerable false negatives. Meanwhile rational constraints are seldom exerted on model selection to reduce the risk of false positive predictions for most of the existing computational methods. In this work, we propose a novel negative data sampling method based on one-class SVM (support vector machine, SVM) to predict proteome-wide protein interactions between HTLV retrovirus and Homo sapiens, wherein one-class SVM is used to choose reliable and representative negative data, and two-class SVM is used to yield proteome-wide outcomes as predictive feedback for rational model selection. Computational results suggest that one-class SVM is more suited to be used as negative data sampling method than two-class PPI predictor, and the predictive feedback constrained model selection helps to yield a rational predictive model that reduces the risk of false positive predictions. Some predictions have been validated by the recent literature. Lastly, gene ontology based clustering of the predicted PPI networks is conducted to provide valuable cues for the pathogenesis of HTLV retrovirus.
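
    A minimal sketch of the negative-sampling idea on synthetic feature vectors: score unlabeled candidate pairs with a one-class SVM trained on known positives, keep the least positive-like candidates as reliable negatives, and train an ordinary two-class SVM on the result. The features, sizes and kernel settings are illustrative, not those of the HTLV-human study.

        import numpy as np
        from sklearn.svm import OneClassSVM, SVC

        rng = np.random.default_rng(0)

        # synthetic feature vectors for protein pairs: known positives cluster together,
        # unlabeled candidates are a mixture of hidden positives and true negatives
        positives = rng.normal(loc=1.0, scale=0.6, size=(300, 20))
        unlabeled = np.vstack([rng.normal(loc=1.0, scale=0.6, size=(100, 20)),    # hidden positives
                               rng.normal(loc=-1.0, scale=0.8, size=(400, 20))])  # true negatives

        # step 1: one-class SVM trained on positives scores how "positive-like" each candidate is
        occ = OneClassSVM(kernel="rbf", gamma="scale", nu=0.1).fit(positives)
        scores = occ.decision_function(unlabeled)
        reliable_negatives = unlabeled[np.argsort(scores)[:300]]    # least positive-like candidates

        # step 2: ordinary two-class SVM trained on positives vs the selected reliable negatives
        X = np.vstack([positives, reliable_negatives])
        y = np.concatenate([np.ones(len(positives)), np.zeros(len(reliable_negatives))])
        clf = SVC(kernel="rbf").fit(X, y)
        print("predicted positive fraction among unlabeled:", clf.predict(unlabeled).mean().round(2))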

  14. Random sampling causes the low reproducibility of rare eukaryotic OTUs in Illumina COI metabarcoding.

    PubMed

    Leray, Matthieu; Knowlton, Nancy

    2017-01-01

    DNA metabarcoding, the PCR-based profiling of natural communities, is becoming the method of choice for biodiversity monitoring because it circumvents some of the limitations inherent to traditional ecological surveys. However, potential sources of bias that can affect the reproducibility of this method remain to be quantified. The interpretation of differences in patterns of sequence abundance and the ecological relevance of rare sequences remain particularly uncertain. Here we used one artificial mock community to explore the significance of abundance patterns and disentangle the effects of two potential biases on data reproducibility: indexed PCR primers and random sampling during Illumina MiSeq sequencing. We amplified a short fragment of the mitochondrial Cytochrome c Oxidase Subunit I (COI) for a single mock sample containing equimolar amounts of total genomic DNA from 34 marine invertebrates belonging to six phyla. We used seven indexed broad-range primers and sequenced the resulting library on two consecutive Illumina MiSeq runs. The total number of Operational Taxonomic Units (OTUs) was ∼4 times higher than expected based on the composition of the mock sample. Moreover, the total number of reads for the 34 components of the mock sample differed by up to three orders of magnitude. However, 79 out of 86 of the unexpected OTUs were represented by <10 sequences that did not appear consistently across replicates. Our data suggest that random sampling of rare OTUs (e.g., small associated fauna such as parasites) accounted for most of variation in OTU presence-absence, whereas biases associated with indexed PCRs accounted for a larger amount of variation in relative abundance patterns. These results suggest that random sampling during sequencing leads to the low reproducibility of rare OTUs. We suggest that the strategy for handling rare OTUs should depend on the objectives of the study. Systematic removal of rare OTUs may avoid inflating diversity based on common β descriptors but will exclude positive records of taxa that are functionally important. Our results further reinforce the need for technical replicates (parallel PCR and sequencing from the same sample) in metabarcoding experimental designs. Data reproducibility should be determined empirically as it will depend upon the sequencing depth, the type of sample, the sequence analysis pipeline, and the number of replicates. Moreover, estimating relative biomasses or abundances based on read counts remains elusive at the OTU level.

  15. Geographic Information Systems to Assess External Validity in Randomized Trials.

    PubMed

    Savoca, Margaret R; Ludwig, David A; Jones, Stedman T; Jason Clodfelter, K; Sloop, Joseph B; Bollhalter, Linda Y; Bertoni, Alain G

    2017-08-01

    To support claims that RCTs can reduce health disparities (i.e., are translational), it is imperative that methodologies exist to evaluate the tenability of external validity in RCTs when probabilistic sampling of participants is not employed. Typically, attempts at establishing post hoc external validity are limited to a few comparisons across convenience variables, which must be available in both sample and population. A Type 2 diabetes RCT was used as an example of a method that uses a geographic information system to assess external validity in the absence of an a priori probabilistic community-wide diabetes risk sampling strategy. A geographic information system, 2009-2013 county death certificate records, and 2013-2014 electronic medical records were used to identify community-wide diabetes prevalence. Color-coded diabetes density maps provided visual representation of these densities. A chi-square goodness-of-fit analysis tested the degree to which the distribution of RCT participants varied across density classes compared to what would be expected, given simple random sampling of the county population. Analyses were conducted in 2016. Diabetes prevalence areas as represented by death certificate and electronic medical records were distributed similarly. The simple random sample model was not a good fit for death certificate record (chi-square, 17.63; p=0.0001) and electronic medical record data (chi-square, 28.92; p<0.0001). Generally, RCT participants were oversampled in high-diabetes density areas. Location is a highly reliable "principal variable" associated with health disparities. It serves as a directly measurable proxy for high-risk underserved communities, thus offering an effective and practical approach for examining external validity of RCTs. Copyright © 2017 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
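
    A minimal sketch of the goodness-of-fit comparison: observed participant counts across diabetes-density classes are tested against the counts expected if participants had been a simple random sample of the county. The counts and population shares below are hypothetical, not the study's data.

        import numpy as np
        from scipy.stats import chisquare

        # hypothetical counts of RCT participants in low / medium / high diabetes-density areas
        observed = np.array([45, 90, 165])

        # share of the county population living in each density class (hypothetical)
        population_share = np.array([0.45, 0.35, 0.20])
        expected = population_share * observed.sum()     # expectation under simple random sampling

        stat, p_value = chisquare(f_obs=observed, f_exp=expected)
        print(f"chi-square = {stat:.2f}, p = {p_value:.4f}")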

  16. Supercluster simulations: impact of baryons on the matter power spectrum and weak lensing forecasts for Super-CLASS

    NASA Astrophysics Data System (ADS)

    Peters, Aaron; Brown, Michael L.; Kay, Scott T.; Barnes, David J.

    2018-03-01

    We use a combination of full hydrodynamic and dark matter only simulations to investigate the effect that supercluster environments and baryonic physics have on the matter power spectrum, by re-simulating a sample of supercluster sub-volumes. On large scales we find that the matter power spectrum measured from our supercluster sample has at least twice as much power as that measured from our random sample. Our investigation of the effect of baryonic physics on the matter power spectrum is found to be in agreement with previous studies and is weaker than the selection effect over the majority of scales. In addition, we investigate the effect of targeting a cosmologically non-representative, supercluster region of the sky on the weak lensing shear power spectrum. We do this by generating shear and convergence maps using a line-of-sight integration technique, which intercepts our random and supercluster sub-volumes. We find the convergence power spectrum measured from our supercluster sample has a larger amplitude than that measured from the random sample at all scales. We frame our results within the context of the Super-CLuster Assisted Shear Survey (Super-CLASS), which aims to measure the cosmic shear signal in the radio band by targeting a region of the sky that contains five Abell clusters. Assuming the Super-CLASS survey will have a source density of 1.5 galaxies per square arcminute, we forecast a detection significance of 2.7 (+1.5, -1.2), which indicates that in the absence of systematics the Super-CLASS project could make a cosmic shear detection with radio data alone.

  18. Validation of two complementary oral-health related quality of life indicators (OIDP and OSS 0-10) in two qualitatively distinct samples of the Spanish population

    PubMed Central

    Montero, J; Bravo, M; Albaladejo, A

    2008-01-01

    Background Oral health-related quality of life can be assessed positively, by measuring satisfaction with mouth, or negatively, by measuring oral impact on the performance of daily activities. The study objective was to validate two complementary indicators, i.e., the OIDP (Oral Impacts on Daily Performances) and Oral Satisfaction 0–10 Scale (OSS), in two qualitatively different socio-demographic samples of the Spanish adult population, and to analyse the factors affecting both perspectives of well-being. Methods A cross-sectional study was performed, recruiting a Validation Sample from randomly selected Health Centres in Granada (Spain), representing the general population (n = 253), and a Working Sample (n = 561) randomly selected from active Regional Government staff, i.e., representing the more privileged end of the socio-demographic spectrum of this reference population. All participants were examined according to WHO methodology and completed an in-person interview on their oral impacts and oral satisfaction using the OIDP and OSS 0–10 respectively. The reliability and validity of the two indicators were assessed. An alternative method of describing the causes of oral impacts is presented. Results The reliability coefficient (Cronbach's alpha) of the OIDP was above the recommended 0.7 threshold in both Validation and Occupational samples (0.79 and 0.71 respectively). Test-retest analysis confirmed the external reliability of the OSS (Intraclass Correlation Coefficient, 0.89; p < 0.001). Some subjective factors (perceived need for dental treatment, complaints about mouth and intermediate impacts) were strongly associated with both indicators, supporting their construct and criterion validity. The main cause of oral impact was dental pain. Several socio-demographic, behavioural and clinical variables were identified as modulating factors. Conclusion OIDP and OSS are valid and reliable subjective measures of oral impacts and oral satisfaction, respectively, in an adult Spanish population. Exploring simultaneously these issues may provide useful insights into how satisfaction and impact on well-being are constructed. PMID:19019208

  18. Improved Compressive Sensing of Natural Scenes Using Localized Random Sampling

    PubMed Central

    Barranca, Victor J.; Kovačič, Gregor; Zhou, Douglas; Cai, David

    2016-01-01

    Compressive sensing (CS) theory demonstrates that by using uniformly-random sampling, rather than uniformly-spaced sampling, higher quality image reconstructions are often achievable. Considering that the structure of sampling protocols has such a profound impact on the quality of image reconstructions, we formulate a new sampling scheme motivated by physiological receptive field structure, localized random sampling, which yields significantly improved CS image reconstructions. For each set of localized image measurements, our sampling method first randomly selects an image pixel and then measures its nearby pixels with probability depending on their distance from the initially selected pixel. We compare the uniformly-random and localized random sampling methods over a large space of sampling parameters, and show that, for the optimal parameter choices, higher quality image reconstructions can be consistently obtained by using localized random sampling. In addition, we argue that the localized random CS optimal parameter choice is stable with respect to diverse natural images, and scales with the number of samples used for reconstruction. We expect that the localized random sampling protocol helps to explain the evolutionarily advantageous nature of receptive field structure in visual systems and suggests several future research areas in CS theory and its application to brain imaging. PMID:27555464
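
    A minimal sketch of a localized random sampling mask in the spirit of the scheme described above; the function name, the Gaussian fall-off, and all parameter values are our own assumptions rather than the authors' implementation.

      import numpy as np

      def localized_random_mask(height, width, n_centers, radius, seed=None):
          """Boolean sampling mask built from distance-weighted neighborhoods."""
          rng = np.random.default_rng(seed)
          ys, xs = np.mgrid[0:height, 0:width]
          mask = np.zeros((height, width), dtype=bool)
          for _ in range(n_centers):
              cy, cx = rng.integers(0, height), rng.integers(0, width)
              dist = np.sqrt((ys - cy) ** 2 + (xs - cx) ** 2)
              prob = np.exp(-(dist / radius) ** 2)            # inclusion probability decays with distance
              mask |= rng.random((height, width)) < prob
          return mask

      mask = localized_random_mask(64, 64, n_centers=20, radius=3.0, seed=0)
      print(f"sampled {mask.sum()} of {mask.size} pixels")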

  19. Sampling Of SAR Imagery For Wind Resource Assessment

    NASA Astrophysics Data System (ADS)

    Badger, Merete; Badger, Jake; Hasager, Charlotte; Nielsen, Morten

    2010-04-01

    Wind resources over the sea can be assessed from a series of wind fields retrieved from Envisat ASAR imagery, or other SAR data. Previous wind resource maps have been produced through random sampling of 70 or more satellite scenes over a given area of interest followed by fitting of a Weibull function to the data. Here we introduce a more advanced sampling strategy based on wind class methodology that is normally applied in Risø DTU’s numerical modeling of wind resources. The aim is to obtain a more representative data set using fewer satellite SAR scenes. The new sampling strategy has been applied within a wind and solar resource assessment study for the United Arab Emirates (UAE) and also for wind resource mapping over a domain in the North Sea, as part of the EU-NORSEWInD project (2008-2012).
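
    The Weibull-fitting step mentioned above might look like the following sketch, assuming wind speeds retrieved from roughly 70 scenes are available as a one-dimensional array; the data here are synthetic and scipy's weibull_min is used for the fit.

      import numpy as np
      from scipy.stats import weibull_min

      rng = np.random.default_rng(42)
      wind_speeds = rng.weibull(2.0, size=70) * 8.0       # synthetic stand-in for SAR-retrieved winds (m/s)

      # Fix the location parameter at zero so only shape (k) and scale (A) are estimated.
      shape, loc, scale = weibull_min.fit(wind_speeds, floc=0)
      mean_speed = weibull_min.mean(shape, loc=loc, scale=scale)
      print(f"Weibull shape k = {shape:.2f}, scale A = {scale:.2f} m/s, mean wind speed = {mean_speed:.2f} m/s")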

  20. Do we really need a large number of particles to simulate bimolecular reactive transport with random walk methods? A kernel density estimation approach

    NASA Astrophysics Data System (ADS)

    Rahbaralam, Maryam; Fernàndez-Garcia, Daniel; Sanchez-Vila, Xavier

    2015-12-01

    Random walk particle tracking methods are a computationally efficient family of methods to solve reactive transport problems. While the number of particles in most realistic applications is on the order of 10^6-10^9, the number of reactive molecules even in diluted systems might be on the order of fractions of the Avogadro number. Thus, each particle actually represents a group of potentially reactive molecules. The use of a low number of particles may result not only in loss of accuracy, but also may lead to an improper reproduction of the mixing process, limited by diffusion. Recent works have used this effect as a proxy to model incomplete mixing in porous media. In this work, we propose using a Kernel Density Estimation (KDE) of the concentrations that allows getting the expected results for a well-mixed solution with a limited number of particles. The idea consists of treating each particle as a sample drawn from the pool of molecules that it represents; this way, the actual location of a tracked particle is seen as a sample drawn from the density function of the location of molecules represented by that given particle, rigorously represented by a kernel density function. The probability of reaction can be obtained by combining the kernels associated to two potentially reactive particles. We demonstrate that the observed deviation in the reaction vs time curves in numerical experiments reported in the literature could be attributed to the statistical method used to reconstruct concentrations (fixed particle support) from discrete particle distributions, and not to the occurrence of true incomplete mixing. We further explore the evolution of the kernel size with time, linking it to the diffusion process. Our results show that KDEs are powerful tools to improve computational efficiency and robustness in reactive transport simulations, and indicate that incomplete mixing in diluted systems should be modeled based on alternative mechanistic models and not on a limited number of particles.
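
    A conceptual sketch (not the authors' code) of reconstructing a concentration profile from particle positions with a Gaussian kernel density estimate rather than fixed-support binning; the particle count, mass per particle, and default bandwidth rule are illustrative assumptions.

      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(1)
      particle_x = rng.normal(loc=0.0, scale=1.0, size=500)   # 1-D particle positions after a transport step
      mass_per_particle = 1e-6                                # moles represented by each numerical particle

      kde = gaussian_kde(particle_x)                          # bandwidth set by Scott's rule by default
      x_grid = np.linspace(-4.0, 4.0, 200)
      concentration = kde(x_grid) * mass_per_particle * particle_x.size
      print(f"peak concentration ~ {concentration.max():.3e} mol per unit length")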

  1. Frequency position modulation using multi-spectral projections

    NASA Astrophysics Data System (ADS)

    Goodman, Joel; Bertoncini, Crystal; Moore, Michael; Nousain, Bryan; Cowart, Gregory

    2012-10-01

    In this paper we present an approach to harness multi-spectral projections (MSPs) to carefully shape and locate tones in the spectrum, enabling a new and robust modulation in which a signal's discrete frequency support is used to represent symbols. This method, called Frequency Position Modulation (FPM), is an innovative extension to MT-FSK and OFDM and can be non-uniformly spread over many GHz of instantaneous bandwidth (IBW), resulting in a communications system that is difficult to intercept and jam. The FPM symbols are recovered using adaptive projections that in part employ an analog polynomial nonlinearity paired with an analog-to-digital converter (ADC) sampling at a rate that is only a fraction of the IBW of the signal. MSPs also facilitate using commercial off-the-shelf (COTS) ADCs with uniform sampling, standing in sharp contrast to random linear projections by random sampling, which requires a full Nyquist rate sample-and-hold. Our novel communication system concept provides an order of magnitude improvement in processing gain over conventional LPI/LPD communications (e.g., FH- or DS-CDMA) and facilitates the ability to operate in interference laden environments where conventional compressed sensing receivers would fail. We quantitatively analyze the bit error rate (BER) and processing gain (PG) for a maximum likelihood based FPM demodulator and demonstrate its performance in interference laden conditions.

  2. Two-sample discrimination of Poisson means

    NASA Technical Reports Server (NTRS)

    Lampton, M.

    1994-01-01

    This paper presents a statistical test for detecting significant differences between two random count accumulations. The null hypothesis is that the two samples share a common random arrival process with a mean count proportional to each sample's exposure. The model represents the partition of N total events into two counts, A and B, as a sequence of N independent Bernoulli trials whose partition fraction, f, is determined by the ratio of the exposures of A and B. The detection of a significant difference is claimed when the background (null) hypothesis is rejected, which occurs when the observed sample falls in a critical region of (A, B) space. The critical region depends on f and the desired significance level, alpha. The model correctly takes into account the fluctuations in both the signals and the background data, including the important case of small numbers of counts in the signal, the background, or both. The significance can be exactly determined from the cumulative binomial distribution, which in turn can be inverted to determine the critical A(B) or B(A) contour. This paper gives efficient implementations of these tests, based on lookup tables. Applications include the detection of clustering of astronomical objects, the detection of faint emission or absorption lines in photon-limited spectroscopy, the detection of faint emitters or absorbers in photon-limited imaging, and dosimetry.
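
    A worked sketch of the partition test described above, using hypothetical counts and equal exposures; scipy's binomtest stands in for the lookup-table implementation discussed in the paper.

      from scipy.stats import binomtest

      A, B = 34, 18            # observed counts in the two accumulations (illustrative)
      t_A, t_B = 1.0, 1.0      # exposures of samples A and B (equal here)
      f = t_A / (t_A + t_B)    # expected partition fraction under the null hypothesis

      result = binomtest(k=A, n=A + B, p=f, alternative="two-sided")
      print(f"p-value = {result.pvalue:.4f}")
      # Reject the null (a common arrival rate per unit exposure) when the p-value
      # falls below the chosen significance level alpha.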

  3. Comparison of prevalence estimation of Mycobacterium avium subsp. paratuberculosis infection by sampling slaughtered cattle with macroscopic lesions vs. systematic sampling.

    PubMed

    Elze, J; Liebler-Tenorio, E; Ziller, M; Köhler, H

    2013-07-01

    The objective of this study was to identify the most reliable approach for prevalence estimation of Mycobacterium avium ssp. paratuberculosis (MAP) infection in clinically healthy slaughtered cattle. Sampling of macroscopically suspect tissue was compared to systematic sampling. Specimens of ileum, jejunum, mesenteric and caecal lymph nodes were examined for MAP infection using bacterial microscopy, culture, histopathology and immunohistochemistry. MAP was found most frequently in caecal lymph nodes, but sampling more tissues optimized the detection rate. Examination by culture was most efficient while combination with histopathology increased the detection rate slightly. MAP was detected in 49/50 animals with macroscopic lesions representing 1.35% of the slaughtered cattle examined. Of 150 systematically sampled macroscopically non-suspect cows, 28.7% were infected with MAP. This indicates that the majority of MAP-positive cattle are slaughtered without evidence of macroscopic lesions and before clinical signs occur. For reliable prevalence estimation of MAP infection in slaughtered cattle, systematic random sampling is essential.

  4. Development of a sampling strategy and sample size calculation to estimate the distribution of mammographic breast density in Korean women.

    PubMed

    Jun, Jae Kwan; Kim, Mi Jin; Choi, Kui Son; Suh, Mina; Jung, Kyu-Won

    2012-01-01

    Mammographic breast density is a known risk factor for breast cancer. To conduct a survey to estimate the distribution of mammographic breast density in Korean women, appropriate sampling strategies for representative and efficient sampling design were evaluated through simulation. Using the target population from the National Cancer Screening Programme (NCSP) for breast cancer in 2009, we verified the distribution estimate by repeating the simulation 1,000 times using stratified random sampling to investigate the distribution of breast density of 1,340,362 women. According to the simulation results, using a sampling design stratifying the nation into three groups (metropolitan, urban, and rural), with a total sample size of 4,000, we estimated the distribution of breast density in Korean women at a level of 0.01% tolerance. Based on the results of our study, a nationwide survey for estimating the distribution of mammographic breast density among Korean women can be conducted efficiently.
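
    The kind of repeated stratified-sampling simulation described above could be sketched as follows; the stratum sizes, within-stratum proportions of dense breasts, and proportional allocation are all hypothetical stand-ins for the NCSP data.

      import numpy as np

      rng = np.random.default_rng(2024)
      # Hypothetical stratum sizes (women screened) and within-stratum proportions of dense breasts.
      strata_size = {"metropolitan": 700_000, "urban": 450_000, "rural": 190_000}
      p_dense = {"metropolitan": 0.52, "urban": 0.48, "rural": 0.43}
      n_total, total_pop = 4000, sum(strata_size.values())

      estimates = []
      for _ in range(1000):                                   # repeat the survey 1,000 times
          est = 0.0
          for name, size in strata_size.items():
              n_h = round(n_total * size / total_pop)         # proportional allocation to the stratum
              sample = rng.random(n_h) < p_dense[name]        # simulated dense/non-dense outcomes
              est += sample.mean() * size / total_pop         # stratum-weighted estimate
          estimates.append(est)

      print(f"mean estimate = {np.mean(estimates):.4f}, simulation SE = {np.std(estimates):.4f}")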

  5. Assessing the Generalizability of Randomized Trial Results to Target Populations

    PubMed Central

    Stuart, Elizabeth A.; Bradshaw, Catherine P.; Leaf, Philip J.

    2014-01-01

    Recent years have seen increasing interest in and attention to evidence-based practices, where the “evidence” generally comes from well-conducted randomized trials. However, while those trials yield accurate estimates of the effect of the intervention for the participants in the trial (known as “internal validity”), they do not always yield relevant information about the effects in a particular target population (known as “external validity”). This may be due to a lack of specification of a target population when designing the trial, difficulties recruiting a sample that is representative of a pre-specified target population, or to interest in considering a target population somewhat different from the population directly targeted by the trial. This paper first provides an overview of existing design and analysis methods for assessing and enhancing the ability of a randomized trial to estimate treatment effects in a target population. It then provides a case study using one particular method, which weights the subjects in a randomized trial to match the population on a set of observed characteristics. The case study uses data from a randomized trial of School-wide Positive Behavioral Interventions and Supports (PBIS); our interest is in generalizing the results to the state of Maryland. In the case of PBIS, after weighting, estimated effects in the target population were similar to those observed in the randomized trial. The paper illustrates that statistical methods can be used to assess and enhance the external validity of randomized trials, making the results more applicable to policy and clinical questions. However, there are also many open research questions; future research should focus on questions of treatment effect heterogeneity and further developing these methods for enhancing external validity. Researchers should think carefully about the external validity of randomized trials and be cautious about extrapolating results to specific populations unless they are confident of the similarity between the trial sample and that target population. PMID:25307417
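
    One common way to implement the weighting idea described here is sketched below: model selection into the trial from stacked trial and population data, then weight trial subjects by the inverse odds of selection. The covariates and data are synthetic, and this inverse-odds formulation is a generic stand-in rather than the authors' exact procedure.

      import numpy as np
      import pandas as pd
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      # Stacked data: trial subjects (in_trial=1) and a target-population sample (in_trial=0),
      # with two invented covariates used for matching.
      df = pd.DataFrame({
          "pct_free_lunch": np.concatenate([rng.normal(40, 10, 300), rng.normal(55, 15, 2000)]),
          "enrollment":     np.concatenate([rng.normal(450, 80, 300), rng.normal(500, 120, 2000)]),
          "in_trial":       np.concatenate([np.ones(300), np.zeros(2000)]),
      })

      model = LogisticRegression().fit(df[["pct_free_lunch", "enrollment"]], df["in_trial"])
      p_trial = model.predict_proba(df[["pct_free_lunch", "enrollment"]])[:, 1]

      # Inverse-odds weights for trial subjects; population rows keep weight 1 for reference.
      df["weight"] = np.where(df["in_trial"] == 1, (1 - p_trial) / p_trial, 1.0)
      print(df.loc[df["in_trial"] == 1, "weight"].describe())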

  6. Assessing the generalizability of randomized trial results to target populations.

    PubMed

    Stuart, Elizabeth A; Bradshaw, Catherine P; Leaf, Philip J

    2015-04-01

    Recent years have seen increasing interest in and attention to evidence-based practices, where the "evidence" generally comes from well-conducted randomized trials. However, while those trials yield accurate estimates of the effect of the intervention for the participants in the trial (known as "internal validity"), they do not always yield relevant information about the effects in a particular target population (known as "external validity"). This may be due to a lack of specification of a target population when designing the trial, difficulties recruiting a sample that is representative of a prespecified target population, or to interest in considering a target population somewhat different from the population directly targeted by the trial. This paper first provides an overview of existing design and analysis methods for assessing and enhancing the ability of a randomized trial to estimate treatment effects in a target population. It then provides a case study using one particular method, which weights the subjects in a randomized trial to match the population on a set of observed characteristics. The case study uses data from a randomized trial of school-wide positive behavioral interventions and supports (PBIS); our interest is in generalizing the results to the state of Maryland. In the case of PBIS, after weighting, estimated effects in the target population were similar to those observed in the randomized trial. The paper illustrates that statistical methods can be used to assess and enhance the external validity of randomized trials, making the results more applicable to policy and clinical questions. However, there are also many open research questions; future research should focus on questions of treatment effect heterogeneity and further developing these methods for enhancing external validity. Researchers should think carefully about the external validity of randomized trials and be cautious about extrapolating results to specific populations unless they are confident of the similarity between the trial sample and that target population.

  7. The stability of hydrogen ion and specific conductance in filtered wet-deposition samples stored at ambient temperatures

    USGS Publications Warehouse

    Gordon, J.D.; Schroder, L.J.; Morden-Moore, A. L.; Bowersox, V.C.

    1995-01-01

    Separate experiments by the U.S. Geological Survey (USGS) and the Illinois State Water Survey Central Analytical Laboratory (CAL) independently assessed the stability of hydrogen ion and specific conductance in filtered wet-deposition samples stored at ambient temperatures. The USGS experiment represented a test of sample stability under a diverse range of conditions, whereas the CAL experiment was a controlled test of sample stability. In the experiment by the USGS, a statistically significant (α = 0.05) relation between [H+] and time was found for the composited filtered, natural, wet-deposition solution when all reported values are included in the analysis. However, if two outlying pH values most likely representing measurement error are excluded from the analysis, the change in [H+] over time was not statistically significant. In the experiment by the CAL, randomly selected samples were reanalyzed between July 1984 and February 1991. The original analysis and reanalysis pairs revealed that [H+] differences, although very small, were statistically different from zero, whereas specific-conductance differences were not. Nevertheless, the results of the CAL reanalysis project indicate there appears to be no consistent, chemically significant degradation in sample integrity with regard to [H+] and specific conductance while samples are stored at room temperature at the CAL. Based on the results of the CAL and USGS studies, short-term (45-60 day) stability of [H+] and specific conductance in natural filtered wet-deposition samples that are shipped and stored unchilled at ambient temperatures was satisfactory.

  8. Estimating implied rates of discount in healthcare decision-making.

    PubMed

    West, R R; McNabb, R; Thompson, A G H; Sheldon, T A; Grimley Evans, J

    2003-01-01

    To consider whether implied rates of discounting from the perspectives of individual and society differ, and whether implied rates of discounting in health differ from those implied in choices involving finance or "goods". The study comprised first a review of economics, health economics and social science literature and then an empirical estimate of implied rates of discounting in four fields: personal financial, personal health, public financial and public health, in representative samples of the public and of healthcare professionals. Samples were drawn in the former county and health authority district of South Glamorgan, Wales. The public sample was a representative random sample of men and women, aged over 18 years and drawn from electoral registers. The health professional sample was drawn at random with the cooperation of professional leads to include doctors, nurses, professions allied to medicine, public health, planners and administrators. The literature review revealed few empirical studies in representative samples of the population, few direct comparisons of public with private decision-making and few direct comparisons of health with financial discounting. Implied rates of discounting varied widely and studies suggested that discount rates are higher the smaller the value of the outcome and the shorter the period considered. The relationship between implied discount rates and personal attributes was mixed, possibly reflecting the limited nature of the samples. Although there were few direct comparisons, some studies found that individuals apply different rates of discount to social compared with private comparisons and health compared with financial. The present study also found a wide range of implied discount rates, with little systematic effect of age, gender, educational level or long-term illness. There was evidence, in both samples, that people chose a lower rate of discount in comparisons made on behalf of society than in comparisons made for themselves. Both public and health professional samples tended to choose lower discount rates in health-related comparisons than in finance-related comparisons. It was also suggested that implied rates of discount, derived from responses to hypothetical questions, can be influenced by detail of question framing. The study suggested that both the lay public and healthcare professionals consider that the discount rate appropriate for public decisions is lower than that for private decisions. This finding suggests that lay people as well as healthcare professionals, used to making decisions on behalf of others, recognise that society is not simply an aggregate of individuals. It also implies a general appreciation that society is more stable and has a more predictable future than does the individual. There is fairly general support for this view in the theoretical literature and limited support in the few previous direct comparisons. Further research is indicated, possibly involving more in-depth interviewing and drawing inference on real, rather than hypothetical choices.

  9. Optical parametric oscillation in a random poly-crystalline medium: ZnSe ceramic

    NASA Astrophysics Data System (ADS)

    Ru, Qitian; Kawamori, Taiki; Lee, Nathaniel; Chen, Xuan; Zhong, Kai; Mirov, Mike; Vasilyev, Sergey; Mirov, Sergey B.; Vodopyanov, Konstantin L.

    2018-02-01

    We demonstrate an optical parametric oscillator (OPO) based on random phase matching in a polycrystalline χ(2) material, ZnSe. The subharmonic OPO utilized a 1.5-mm-long polished ZnSe ceramic sample placed at the Brewster's angle and was synchronously pumped by a Kerr-lens mode-locked Cr:ZnS laser with a central wavelength of 2.35 μm, a pulse duration of 62 fs, and a repetition frequency of 79 MHz. The OPO had a 90-mW pump threshold, and produced an ultrabroadband spectrum spanning 3-7.5 μm. The observed pump depletion was as high as 79%. The key to success in achieving the OPO action was choosing the average grain size of the ZnSe ceramic to be close to the coherence length (~100 μm) for our 3-wave interaction. This is the first OPO that uses random polycrystalline material with quadratic nonlinearity and the first OPO based on ZnSe. Very likely, random phase matching in ZnSe and similar random polycrystalline materials (ZnS, CdS, CdSe, GaP) represents a viable route for generating few-cycle pulses and multi-octave frequency combs, thanks to a very broadband nonlinear response.

  10. Analyzing social experiments as implemented: A reexamination of the evidence from the HighScope Perry Preschool Program

    PubMed Central

    Heckman, James; Moon, Seong Hyeok; Pinto, Rodrigo; Savelyev, Peter; Yavitz, Adam

    2012-01-01

    Social experiments are powerful sources of information about the effectiveness of interventions. In practice, initial randomization plans are almost always compromised. Multiple hypotheses are frequently tested. “Significant” effects are often reported with p-values that do not account for preliminary screening from a large candidate pool of possible effects. This paper develops tools for analyzing data from experiments as they are actually implemented. We apply these tools to analyze the influential HighScope Perry Preschool Program. The Perry program was a social experiment that provided preschool education and home visits to disadvantaged children during their preschool years. It was evaluated by the method of random assignment. Both treatments and controls have been followed from age 3 through age 40. Previous analyses of the Perry data assume that the planned randomization protocol was implemented. In fact, as in many social experiments, the intended randomization protocol was compromised. Accounting for compromised randomization, multiple-hypothesis testing, and small sample sizes, we find statistically significant and economically important program effects for both males and females. We also examine the representativeness of the Perry study. PMID:23255883

  11. An improved initialization center k-means clustering algorithm based on distance and density

    NASA Astrophysics Data System (ADS)

    Duan, Yanling; Liu, Qun; Xia, Shuyin

    2018-04-01

    To address the problem that the randomly chosen initial cluster centers of the k-means algorithm make the clustering results sensitive to outlier samples and unstable across repeated runs, an initialization method that selects centers with large mutual distance and high density is proposed. The reciprocal of the weighted average distance is used to represent sample density, and the data samples with larger distance and higher density are selected as the initial cluster centers to optimize the clustering results. A clustering evaluation method based on distance and density is then designed to verify the feasibility and practicality of the algorithm; experimental results on UCI data sets show that the algorithm has a certain stability and practicality.
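
    A minimal sketch of a distance-and-density initialization in the spirit of the abstract; the density definition (reciprocal of the mean pairwise distance) follows the description above, while the greedy scoring rule and all names are our own simplifications.

      import numpy as np
      from scipy.spatial.distance import cdist

      def init_centers(X, k):
          d = cdist(X, X)                                  # pairwise distances between samples
          density = 1.0 / (d.mean(axis=1) + 1e-12)         # higher value = denser neighborhood
          centers = [int(np.argmax(density))]              # start from the densest sample
          for _ in range(k - 1):
              dist_to_centers = d[:, centers].min(axis=1)  # distance to the nearest chosen center
              score = density * dist_to_centers            # favor dense samples far from existing centers
              centers.append(int(np.argmax(score)))
          return X[centers]

      X = np.random.default_rng(0).random((200, 2))
      print(init_centers(X, k=3))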

  12. Karhunen-Loève treatment to remove noise and facilitate data analysis in sensing, spectroscopy and other applications.

    PubMed

    Zaharov, V V; Farahi, R H; Snyder, P J; Davison, B H; Passian, A

    2014-11-21

    Resolving weak spectral variations in the dynamic response of materials that are either dominated or excited by stochastic processes remains a challenge. Responses that are thermal in origin are particularly relevant examples due to the delocalized nature of heat. Despite its inherent properties in dealing with stochastic processes, the Karhunen-Loève expansion has not been fully exploited in measurement of systems that are driven solely by random forces or can exhibit large thermally driven random fluctuations. Here, we present experimental results and analysis of the archetypes (a) the resonant excitation and transient response of an atomic force microscope probe by the ambient random fluctuations and nanoscale photothermal sample response, and (b) the photothermally scattered photons in pump-probe spectroscopy. In each case, the dynamic process is represented as an infinite series with random coefficients to obtain pertinent frequency shifts and spectral peaks and demonstrate spectral enhancement for a set of compounds including the spectrally complex biomass. The considered cases find important applications in nanoscale material characterization, biosensing, and spectral identification of biological and chemical agents.
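
    As a generic illustration of discrete Karhunen-Loève (principal-component) truncation for suppressing random fluctuations, the sketch below denoises an ensemble of repeated noisy measurements; the signal, noise level, and number of retained modes are assumptions, not the authors' processing chain.

      import numpy as np

      rng = np.random.default_rng(3)
      t = np.linspace(0, 1, 512)
      signal = np.sin(2 * np.pi * 12 * t)
      ensemble = signal + 0.8 * rng.standard_normal((100, t.size))   # 100 noisy realizations

      mean = ensemble.mean(axis=0)
      centered = ensemble - mean
      # Eigen-modes of the sample covariance obtained via SVD of the centered ensemble.
      U, s, Vt = np.linalg.svd(centered, full_matrices=False)

      k = 3                                                          # retain only the dominant KL modes
      denoised = mean + (U[:, :k] * s[:k]) @ Vt[:k, :]
      rms = np.sqrt(((denoised - signal) ** 2).mean())
      print(f"residual RMS after truncation: {rms:.3f}")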

  13. Improving Riverine Constituent Concentration and Flux Estimation by Accounting for Antecedent Discharge Conditions

    NASA Astrophysics Data System (ADS)

    Zhang, Q.; Ball, W. P.

    2016-12-01

    Regression-based approaches are often employed to estimate riverine constituent concentrations and fluxes based on typically sparse concentration observations. One such approach is the WRTDS ("Weighted Regressions on Time, Discharge, and Season") method, which has been shown to provide more accurate estimates than prior approaches. Centered on WRTDS, this work was aimed at developing improved models for constituent concentration and flux estimation by accounting for antecedent discharge conditions. Twelve modified models were developed and tested, each of which contains one additional variable to represent antecedent conditions. High-resolution (~daily) data at nine monitoring sites were used to evaluate the relative merits of the models for estimation of six constituents - chloride (Cl), nitrate-plus-nitrite (NOx), total Kjeldahl nitrogen (TKN), total phosphorus (TP), soluble reactive phosphorus (SRP), and suspended sediment (SS). For each site-constituent combination, 30 concentration subsets were generated from the original data through Monte Carlo sub-sampling and then used to evaluate model performance. For the sub-sampling, three sampling strategies were adopted: (A) 1 random sample each month (12/year), (B) 12 random monthly samples plus an additional 8 random samples per year (20/year), and (C) 12 regular (non-storm) and 8 storm samples per year (20/year). The modified models show general improvement over the original model under all three sampling strategies. Major improvements were achieved for NOx by the long-term flow-anomaly model and for Cl by the ADF (average discounted flow) model and the short-term flow-anomaly model. Moderate improvements were achieved for SS, TP, and TKN by the ADF model. By contrast, no such improvement was achieved for SRP by any proposed model. In terms of sampling strategy, performance of all models was generally best using strategy C and worst using strategy A, and especially so for SS, TP, and SRP, confirming the value of routinely collecting storm-flow samples. Overall, this work provides a comprehensive set of statistical evidence for supporting the incorporation of antecedent discharge conditions into WRTDS for constituent concentration and flux estimation, thereby combining the advantages of two recent developments in water quality modeling.

  14. The Danish National Health Survey 2010. Study design and respondent characteristics.

    PubMed

    Christensen, Anne Illemann; Ekholm, Ola; Glümer, Charlotte; Andreasen, Anne Helms; Hvidberg, Michael Falk; Kristensen, Peter Lund; Larsen, Finn Breinholt; Ortiz, Britta; Juel, Knud

    2012-06-01

    In 2010 the five Danish regions and the National Institute of Public Health at the University of Southern Denmark conducted a national representative health survey among the adult population in Denmark. This paper describes the study design and the sample and study population as well as the content of the questionnaire. The survey was based on five regional stratified random samples and one national random sample. The samples were mutually exclusive. A total of 298,550 individuals (16 years or older) were invited to participate. Information was collected using a mixed mode approach (paper and web questionnaires). A questionnaire with a minimum of 52 core questions was used in all six subsamples. Calibrated weights were computed in order to take account of the complex survey design and reduce non-response bias. In all, 177,639 individuals completed the questionnaire (59.5%). The response rate varied from 52.3% in the Capital Region of Denmark sample to 65.5% in the North Denmark Region sample. The response rate was particularly low among young men, unmarried people and among individuals with a different ethnic background than Danish. The survey was a result of extensive national cooperation across sectors, which makes it unique in its field of application, e.g. health surveillance, planning and prioritizing public health initiatives and research. However, the low response rate in some subgroups of the study population can pose problems in generalizing data, and efforts to increase the response rate will be important in the forthcoming surveys.

  15. Pilot Test of a Novel Method for Assessing Community Response to Low-Amplitude Sonic Booms

    NASA Technical Reports Server (NTRS)

    Fidell, Sanford; Horonjeff, Richard D.; Harris, Michael

    2012-01-01

    A pilot test of a novel method for assessing residents' annoyance to sonic booms was performed. During a two-week period, residents of the base housing area at Edwards Air Force Base provided data on their reactions to sonic booms using Smartphone-based interviews. Noise measurements were conducted at the same time. The report presents information about data collection methods and about test participants' reactions to low-amplitude sonic booms. The latter information should not be viewed as definitive for several reasons. It may not be reliably generalized to the wider U.S. residential population (because it was not derived from a representative random sample) and the sample itself was not large.

  16. High-speed, random-access fluorescence microscopy: I. High-resolution optical recording with voltage-sensitive dyes and ion indicators.

    PubMed

    Bullen, A; Patel, S S; Saggau, P

    1997-07-01

    The design and implementation of a high-speed, random-access, laser-scanning fluorescence microscope configured to record fast physiological signals from small neuronal structures with high spatiotemporal resolution is presented. The laser-scanning capability of this nonimaging microscope is provided by two orthogonal acousto-optic deflectors under computer control. Each scanning point can be randomly accessed and has a positioning time of 3-5 microseconds. Sampling time is also computer-controlled and can be varied to maximize the signal-to-noise ratio. Acquisition rates up to 200k samples/s at 16-bit digitizing resolution are possible. The spatial resolution of this instrument is determined by the minimal spot size at the level of the preparation (i.e., 2-7 microns). Scanning points are selected interactively from a reference image collected with differential interference contrast optics and a video camera. Frame rates up to 5 kHz are easily attainable. Intrinsic variations in laser light intensity and scanning spot brightness are overcome by an on-line signal-processing scheme. Representative records obtained with this instrument by using voltage-sensitive dyes and calcium indicators demonstrate the ability to make fast, high-fidelity measurements of membrane potential and intracellular calcium at high spatial resolution (2 microns) without any temporal averaging.

  17. High-speed, random-access fluorescence microscopy: I. High-resolution optical recording with voltage-sensitive dyes and ion indicators.

    PubMed Central

    Bullen, A; Patel, S S; Saggau, P

    1997-01-01

    The design and implementation of a high-speed, random-access, laser-scanning fluorescence microscope configured to record fast physiological signals from small neuronal structures with high spatiotemporal resolution is presented. The laser-scanning capability of this nonimaging microscope is provided by two orthogonal acousto-optic deflectors under computer control. Each scanning point can be randomly accessed and has a positioning time of 3-5 microseconds. Sampling time is also computer-controlled and can be varied to maximize the signal-to-noise ratio. Acquisition rates up to 200k samples/s at 16-bit digitizing resolution are possible. The spatial resolution of this instrument is determined by the minimal spot size at the level of the preparation (i.e., 2-7 microns). Scanning points are selected interactively from a reference image collected with differential interference contrast optics and a video camera. Frame rates up to 5 kHz are easily attainable. Intrinsic variations in laser light intensity and scanning spot brightness are overcome by an on-line signal-processing scheme. Representative records obtained with this instrument by using voltage-sensitive dyes and calcium indicators demonstrate the ability to make fast, high-fidelity measurements of membrane potential and intracellular calcium at high spatial resolution (2 microns) without any temporal averaging. PMID:9199810

  18. Sampling intraspecific variability in leaf functional traits: Practical suggestions to maximize collected information.

    PubMed

    Petruzzellis, Francesco; Palandrani, Chiara; Savi, Tadeja; Alberti, Roberto; Nardini, Andrea; Bacaro, Giovanni

    2017-12-01

    The choice of the best sampling strategy to capture mean values of functional traits for a species/population, while maintaining information about traits' variability and minimizing the sampling size and effort, is an open issue in functional trait ecology. Intraspecific variability (ITV) of functional traits strongly influences sampling size and effort. However, while adequate information is available about intraspecific variability between individuals (ITVBI) and among populations (ITVPOP), relatively few studies have analyzed intraspecific variability within individuals (ITVWI). Here, we provide an analysis of ITVWI of two foliar traits, namely specific leaf area (SLA) and osmotic potential (π), in a population of Quercus ilex L. We assessed the baseline ITVWI level of variation between the two traits and provided the minimum and optimal sampling size in order to take into account ITVWI, comparing sampling optimization outputs with those previously proposed in the literature. Different factors accounted for different amounts of variance of the two traits. SLA variance was mostly spread within individuals (43.4% of the total variance), while π variance was mainly spread between individuals (43.2%). Strategies that did not account for all the canopy strata produced mean values not representative of the sampled population. The minimum size to adequately capture the studied functional traits corresponded to 5 leaves taken randomly from 5 individuals, while the most accurate and feasible sampling size was 4 leaves taken randomly from 10 individuals. We demonstrate that the spatial structure of the canopy could significantly affect traits variability. Moreover, different strategies for different traits could be implemented during sampling surveys. We partially confirm sampling sizes previously proposed in the recent literature and encourage future analysis involving different traits.

  19. Soil Sampling Techniques For Alabama Grain Fields

    NASA Technical Reports Server (NTRS)

    Thompson, A. N.; Shaw, J. N.; Mask, P. L.; Touchton, J. T.; Rickman, D.

    2003-01-01

    Characterizing the spatial variability of nutrients facilitates precision soil sampling. Questions exist regarding the best technique for directed soil sampling based on a priori knowledge of soil and crop patterns. The objective of this study was to evaluate zone delineation techniques for Alabama grain fields to determine which method best minimized the soil test variability. Site one (25.8 ha) and site three (20.0 ha) were located in the Tennessee Valley region, and site two (24.2 ha) was located in the Coastal Plain region of Alabama. Tennessee Valley soils ranged from well drained Rhodic and Typic Paleudults to somewhat poorly drained Aquic Paleudults and Fluventic Dystrudepts. Coastal Plain soils ranged from coarse-loamy Rhodic Kandiudults to loamy Arenic Kandiudults. Soils were sampled by grid soil sampling methods (grid sizes of 0.40 ha and 1 ha) consisting of: 1) twenty composited cores collected randomly throughout each grid (grid-cell sampling) and, 2) six composited cores collected randomly from a ~3x3 m area at the center of each grid (grid-point sampling). Zones were established from 1) an Order 1 Soil Survey, 2) corn (Zea mays L.) yield maps, and 3) airborne remote sensing images. All soil properties were moderately to strongly spatially dependent as per semivariogram analyses. Differences in grid-point and grid-cell soil test values suggested grid-point sampling does not accurately represent grid values. Zones created by soil survey, yield data, and remote sensing images displayed lower coefficients of variation (%CV) for soil test values than overall field values, suggesting these techniques group soil test variability. However, few differences were observed between the three zone delineation techniques. Results suggest directed sampling using zone delineation techniques outlined in this paper would result in more efficient soil sampling for these Alabama grain fields.

  20. Sampling design for an integrated socioeconomic and ecological survey by using satellite remote sensing and ordination

    PubMed Central

    Binford, Michael W.; Lee, Tae Jeong; Townsend, Robert M.

    2004-01-01

    Environmental variability is an important risk factor in rural agricultural communities. Testing models requires empirical sampling that generates data that are representative in both economic and ecological domains. Detrended correspondence analysis of satellite remote sensing data was used to design an effective low-cost sampling protocol for a field study to create an integrated socioeconomic and ecological database when no prior information on ecology of the survey area existed. We stratified the sample for the selection of tambons from various preselected provinces in Thailand based on factor analysis of spectral land-cover classes derived from satellite data. We conducted the survey for the sampled villages in the chosen tambons. The resulting data capture interesting variations in soil productivity and in the timing of good and bad years, which a purely random sample would likely have missed. Thus, this database will allow tests of hypotheses concerning the effect of credit on productivity, the sharing of idiosyncratic risks, and the economic influence of environmental variability. PMID:15254298

  1. Design and simulation study of the immunization Data Quality Audit (DQA).

    PubMed

    Woodard, Stacy; Archer, Linda; Zell, Elizabeth; Ronveaux, Olivier; Birmingham, Maureen

    2007-08-01

    The goal of the Data Quality Audit (DQA) is to assess whether the Global Alliance for Vaccines and Immunization-funded countries are adequately reporting the number of diphtheria-tetanus-pertussis immunizations given, on which the "shares" are awarded. Given that this sampling design is a modified two-stage cluster sample (modified because a stratified, rather than a simple, random sample of health facilities is obtained from the selected clusters), the formula for the calculation of the standard error for the estimate is unknown. An approximated standard error has been proposed, and the first goal of this simulation is to assess the accuracy of the standard error. Results from the simulations based on hypothetical populations were found not to be representative of the actual DQAs that were conducted. Additional simulations were then conducted on the actual DQA data to better assess the precision of the DQA with both the original and the increased sample sizes.

  2. Experimental scattershot boson sampling

    PubMed Central

    Bentivegna, Marco; Spagnolo, Nicolò; Vitelli, Chiara; Flamini, Fulvio; Viggianiello, Niko; Latmiral, Ludovico; Mataloni, Paolo; Brod, Daniel J.; Galvão, Ernesto F.; Crespi, Andrea; Ramponi, Roberta; Osellame, Roberto; Sciarrino, Fabio

    2015-01-01

    Boson sampling is a computational task strongly believed to be hard for classical computers, but efficiently solvable by orchestrated bosonic interference in a specialized quantum computer. Current experimental schemes, however, are still insufficient for a convincing demonstration of the advantage of quantum over classical computation. A new variation of this task, scattershot boson sampling, leads to an exponential increase in speed of the quantum device, using a larger number of photon sources based on parametric down-conversion. This is achieved by having multiple heralded single photons being sent, shot by shot, into different random input ports of the interferometer. We report the first scattershot boson sampling experiments, where six different photon-pair sources are coupled to integrated photonic circuits. We use recently proposed statistical tools to analyze our experimental data, providing strong evidence that our photonic quantum simulator works as expected. This approach represents an important leap toward a convincing experimental demonstration of the quantum computational supremacy. PMID:26601164

  3. Experimental scattershot boson sampling.

    PubMed

    Bentivegna, Marco; Spagnolo, Nicolò; Vitelli, Chiara; Flamini, Fulvio; Viggianiello, Niko; Latmiral, Ludovico; Mataloni, Paolo; Brod, Daniel J; Galvão, Ernesto F; Crespi, Andrea; Ramponi, Roberta; Osellame, Roberto; Sciarrino, Fabio

    2015-04-01

    Boson sampling is a computational task strongly believed to be hard for classical computers, but efficiently solvable by orchestrated bosonic interference in a specialized quantum computer. Current experimental schemes, however, are still insufficient for a convincing demonstration of the advantage of quantum over classical computation. A new variation of this task, scattershot boson sampling, leads to an exponential increase in speed of the quantum device, using a larger number of photon sources based on parametric down-conversion. This is achieved by having multiple heralded single photons being sent, shot by shot, into different random input ports of the interferometer. We report the first scattershot boson sampling experiments, where six different photon-pair sources are coupled to integrated photonic circuits. We use recently proposed statistical tools to analyze our experimental data, providing strong evidence that our photonic quantum simulator works as expected. This approach represents an important leap toward a convincing experimental demonstration of the quantum computational supremacy.

  4. Prevalence of Plasmodium falciparum transmission reducing immunity among primary school children in a malaria moderate transmission region in Zimbabwe.

    PubMed

    Paul, Noah H; Vengesai, Arthur; Mduluza, Takafira; Chipeta, James; Midzi, Nicholas; Bansal, Geetha P; Kumar, Nirbhay

    2016-11-01

    Malaria continues to cause alarming morbidity and mortality in more than 100 countries worldwide. Antigens in the various life cycle stages of malaria parasites are presented to the immune system during natural infection and it is widely recognized that after repeated malaria exposure, adults develop partially protective immunity. Specific antigens of natural immunity represent among the most important targets for the development of malaria vaccines. Immunity against the transmission stages of the malaria parasite represents an important approach to reduce malaria transmission and is believed to become an important tool for gradual elimination of malaria. Development of immunity against Plasmodium falciparum sexual stages was evaluated in primary school children aged 6-16 years in Makoni district of Zimbabwe, an area of low to modest malaria transmission. Malaria infection was screened by microscopy, rapid diagnostic tests and finally using nested PCR. Plasma samples were tested for antibodies against recombinant Pfs48/45 and Pfs47 by ELISA. Corresponding serum samples were used to test for P. falciparum transmission reducing activity in Anopheles stephensi and An. gambiae mosquitoes using the membrane feeding assay. The prevalence of malaria diagnosed by rapid diagnostic test kit (Paracheck™) was 1.7%. However, of the randomly tested blood samples, 66% were positive by nested PCR. ELISA revealed prevalence (64% positivity at 1:500 dilution in 66 randomly selected plasma samples) of antibodies against recombinant Pfs48/45 (mean A405nm = 0.53, CI = 0.46-0.60) and Pfs47 (mean A405nm = 0.91, CI = 0.80-1.02); antigens specific to the sexual stages. The mosquito membrane feeding assay demonstrated measurable transmission reducing ability of the samples that were positive for Pfs48/45 antibodies by ELISA. Interestingly, 3 plasma samples revealed enhancement of infectivity of P. falciparum in An. stephensi mosquitoes. These studies revealed the presence of antibodies with transmission reducing immunity in school age children from a moderate transmission area of malaria, and provide further support to exploit target antigens such as Pfs48/45 for further development of a malaria transmission blocking vaccine. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Iterative random vs. Kennard-Stone sampling for IR spectrum-based classification task using PLS2-DA

    NASA Astrophysics Data System (ADS)

    Lee, Loong Chuen; Liong, Choong-Yeun; Jemain, Abdul Aziz

    2018-04-01

    External testing (ET) is preferred over auto-prediction (AP) or k-fold cross-validation in estimating the more realistic predictive ability of a statistical model. With IR spectra, the Kennard-Stone (KS) sampling algorithm is often used to split the data into training and test sets, i.e. respectively for model construction and for model testing. On the other hand, iterative random sampling (IRS) has not been the favored choice though it is theoretically more likely to produce reliable estimation. The aim of this preliminary work is to compare performances of KS and IRS in sampling a representative training set from an attenuated total reflectance - Fourier transform infrared spectral dataset (of four varieties of blue gel pen inks) for PLS2-DA modeling. The 'best' performance achievable from the dataset is estimated with AP on the full dataset (APF,error). Both IRS (n = 200) and KS were used to split the dataset in the ratio of 7:3. The classic decision rule (i.e. maximum value-based) is employed for new sample prediction via partial least squares - discriminant analysis (PLS2-DA). The error rate of each model was estimated repeatedly via: (a) AP on the full data (APF,error); (b) AP on the training set (APS,error); and (c) ET on the respective test set (ETS,error). A good PLS2-DA model is expected to produce APS,error and ETS,error values that are similar to the APF,error. Bearing that in mind, the similarities between (a) APS,error vs. APF,error; (b) ETS,error vs. APF,error; and (c) APS,error vs. ETS,error were evaluated using correlation tests (i.e. Pearson and Spearman's rank tests), using series of PLS2-DA models computed from the KS-set and IRS-set, respectively. Overall, models constructed from the IRS-set exhibit more similarity between the internal and external error rates than the respective KS-set, i.e. less risk of overfitting. In conclusion, IRS is more reliable than KS in sampling a representative training set.
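
    For reference, the Kennard-Stone selection used to build the training set can be sketched as follows (a generic implementation, not the one used in the study): start from the two most mutually distant samples, then repeatedly add the sample whose minimum distance to the already selected set is largest.

      import numpy as np
      from scipy.spatial.distance import cdist

      def kennard_stone(X, n_select):
          d = cdist(X, X)
          i, j = np.unravel_index(np.argmax(d), d.shape)          # the two most distant samples
          selected = [int(i), int(j)]
          remaining = [m for m in range(len(X)) if m not in selected]
          while len(selected) < n_select:
              # For each candidate, the distance to its closest already-selected sample.
              min_d = d[np.ix_(remaining, selected)].min(axis=1)
              selected.append(remaining.pop(int(np.argmax(min_d))))
          return selected

      spectra = np.random.default_rng(7).random((50, 100))        # 50 spectra x 100 wavenumbers (synthetic)
      train_idx = kennard_stone(spectra, n_select=35)             # roughly a 7:3 train/test split
      print(train_idx[:10])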

  6. Probability Distributions for Random Quantum Operations

    NASA Astrophysics Data System (ADS)

    Schultz, Kevin

    Motivated by uncertainty quantification and inference of quantum information systems, in this work we draw connections between the notions of random quantum states and operations in quantum information with probability distributions commonly encountered in the field of orientation statistics. This approach identifies natural sample spaces and probability distributions upon these spaces that can be used in the analysis, simulation, and inference of quantum information systems. The theory of exponential families on Stiefel manifolds provides the appropriate generalization to the classical case. Furthermore, this viewpoint motivates a number of additional questions into the convex geometry of quantum operations relative to both the differential geometry of Stiefel manifolds as well as the information geometry of exponential families defined upon them. In particular, we draw on results from convex geometry to characterize which quantum operations can be represented as the average of a random quantum operation. This project was supported by the Intelligence Advanced Research Projects Activity via Department of Interior National Business Center Contract Number 2012-12050800010.

  7. Nonuniform sampling theorems for random signals in the linear canonical transform domain

    NASA Astrophysics Data System (ADS)

    Shuiqing, Xu; Congmei, Jiang; Yi, Chai; Youqiang, Hu; Lei, Huang

    2018-06-01

    Nonuniform sampling can be encountered in various practical processes because of random events or poor timebase. The analysis and applications of the nonuniform sampling for deterministic signals related to the linear canonical transform (LCT) have been well considered and researched, but up to now no papers have been published regarding the various nonuniform sampling theorems for random signals related to the LCT. The aim of this article is to explore the nonuniform sampling and reconstruction of random signals associated with the LCT. First, some special nonuniform sampling models are briefly introduced. Second, based on these models, some reconstruction theorems for random signals from various nonuniform samples associated with the LCT have been derived. Finally, the simulation results are made to prove the accuracy of the sampling theorems. In addition, the latent real practices of the nonuniform sampling for random signals have been also discussed.

  8. Estimating the breeding population of long-billed curlew in the United States

    USGS Publications Warehouse

    Stanley, T.R.; Skagen, S.K.

    2007-01-01

    Determining population size and long-term trends in population size for species of high concern is a priority of international, national, and regional conservation plans. Long-billed curlews (Numenius americanus) are a species of special concern in North America due to apparent declines in their population. Because long-billed curlews are not adequately monitored by existing programs, we undertook a 2-year study with the goals of 1) determining present long-billed curlew distribution and breeding population size in the United States and 2) providing recommendations for a long-term long-billed curlew monitoring protocol. We selected a stratified random sample of survey routes in 16 western states for sampling in 2004 and 2005, and we analyzed count data from these routes to estimate detection probabilities and abundance. In addition, we evaluated habitat along roadsides to determine how well roadsides represented habitat throughout the sampling units. We estimated there were 164,515 (SE = 42,047) breeding long-billed curlews in 2004, and 109,533 (SE = 31,060) breeding individuals in 2005. These estimates far exceed currently accepted estimates based on expert opinion. We found that habitat along roadsides was representative of long-billed curlew habitat in general. We make recommendations for improving sampling methodology, and we present power curves to provide guidance on minimum sample sizes required to detect trends in abundance.

  9. Health Surveys Using Mobile Phones in Developing Countries: Automated Active Strata Monitoring and Other Statistical Considerations for Improving Precision and Reducing Biases

    PubMed Central

    Blynn, Emily; Ahmed, Saifuddin; Gibson, Dustin; Pariyo, George; Hyder, Adnan A

    2017-01-01

    In low- and middle-income countries (LMICs), historically, household surveys have been carried out by face-to-face interviews to collect survey data related to risk factors for noncommunicable diseases. The proliferation of mobile phone ownership and the access it provides in these countries offers a new opportunity to remotely conduct surveys with increased efficiency and reduced cost. However, the near-ubiquitous ownership of phones, high population mobility, and low cost require a re-examination of statistical recommendations for mobile phone surveys (MPS), especially when surveys are automated. As with landline surveys, random digit dialing remains the most appropriate approach to develop an ideal survey-sampling frame. Once the survey is complete, poststratification weights are generally applied to reduce estimate bias and to adjust for selectivity due to mobile ownership. Since weights increase design effects and reduce sampling efficiency, we introduce the concept of automated active strata monitoring to improve representativeness of the sample distribution to that of the source population. Although some statistical challenges remain, MPS represent a promising emerging means for population-level data collection in LMICs. PMID:28476726

  10. Precipitation and Latent Heating Distributions from Satellite Passive Microwave Radiometry. Part 1; Improved Method and Uncertainties

    NASA Technical Reports Server (NTRS)

    Olson, William S.; Kummerow, Christian D.; Yang, Song; Petty, Grant W.; Tao, Wei-Kuo; Bell, Thomas L.; Braun, Scott A.; Wang, Yansen; Lang, Stephen E.; Johnson, Daniel E.; hide

    2006-01-01

    A revised Bayesian algorithm for estimating surface rain rate, convective rain proportion, and latent heating profiles from satellite-borne passive microwave radiometer observations over ocean backgrounds is described. The algorithm searches a large database of cloud-radiative model simulations to find cloud profiles that are radiatively consistent with a given set of microwave radiance measurements. The properties of these radiatively consistent profiles are then composited to obtain best estimates of the observed properties. The revised algorithm is supported by an expanded and more physically consistent database of cloud-radiative model simulations. The algorithm also features a better quantification of the convective and nonconvective contributions to total rainfall, a new geographic database, and an improved representation of background radiances in rain-free regions. Bias and random error estimates are derived from applications of the algorithm to synthetic radiance data, based upon a subset of cloud-resolving model simulations, and from the Bayesian formulation itself. Synthetic rain-rate and latent heating estimates exhibit a trend of high (low) bias for low (high) retrieved values. The Bayesian estimates of random error are propagated to represent errors at coarser time and space resolutions, based upon applications of the algorithm to TRMM Microwave Imager (TMI) data. Errors in TMI instantaneous rain-rate estimates at 0.5° resolution range from approximately 50% at 1 mm/h to 20% at 14 mm/h. Errors in collocated spaceborne radar rain-rate estimates are roughly 50%-80% of the TMI errors at this resolution. The estimated algorithm random error in TMI rain rates at monthly, 2.5° resolution is relatively small (less than 6% at 5 mm day⁻¹) in comparison with the random error resulting from infrequent satellite temporal sampling (8%-35% at the same rain rate). Percentage errors resulting from sampling decrease with increasing rain rate, and sampling errors in latent heating rates follow the same trend. Averaging over 3 months reduces sampling errors in rain rates to 6%-15% at 5 mm day⁻¹, with proportionate reductions in latent heating sampling errors.

  11. Sampling procedures for inventory of commercial volume tree species in Amazon Forest.

    PubMed

    Netto, Sylvio P; Pelissari, Allan L; Cysneiros, Vinicius C; Bonazza, Marcelo; Sanquetta, Carlos R

    2017-01-01

    The spatial distribution of tropical tree species can affect the consistency of the estimators in commercial forest inventories; therefore, appropriate sampling procedures are required to survey species with different spatial patterns in the Amazon Forest. The present study evaluates conventional sampling procedures and introduces adaptive cluster sampling for volumetric inventories of Amazonian tree species, under the hypotheses that density, spatial distribution and zero-plots affect the consistency of the estimators, and that adaptive cluster sampling yields more accurate volumetric estimates. We use data from a census carried out in Jamari National Forest, Brazil, where trees with diameters equal to or greater than 40 cm were measured in 1,355 plots. Species with different spatial patterns were selected and sampled with simple random sampling, systematic sampling, linear cluster sampling and adaptive cluster sampling, and the accuracy of the volumetric estimation and the presence of zero-plots were evaluated. The sampling procedures were affected by the low density of trees and the large number of zero-plots; the adaptive clusters allowed the sampling effort to be concentrated in plots containing trees and thus aggregated more representative samples for estimating commercial volume.
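
    A minimal sketch of adaptive cluster sampling with the modified Hansen-Hurwitz estimator is given below. The plot grid, volumes, and neighbourhood rule are invented for illustration and do not reproduce the study's design.

```python
import numpy as np
from collections import deque

# Illustrative sketch of adaptive cluster sampling (ACS) on a plot grid, with
# the modified Hansen-Hurwitz estimator of mean volume per plot. The grid,
# volumes, and neighbourhood rule are invented for the example.
rng = np.random.default_rng(1)
R, C = 30, 30                               # 900 plots on a 30 x 30 grid
vol = np.zeros((R, C))
for _ in range(12):                         # a few small clumps of large trees
    r, c = rng.integers(R), rng.integers(C)
    vol[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2] += rng.gamma(2.0, 3.0)

def network(start):
    """All plots connected to `start` through plots meeting the condition (vol > 0)."""
    seen, queue = {start}, deque([start])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < R and 0 <= nc < C and (nr, nc) not in seen and vol[nr, nc] > 0:
                seen.add((nr, nc))
                queue.append((nr, nc))
    return seen

n = 60                                      # initial simple random sample size
flat = rng.choice(R * C, size=n, replace=False)
initial = [(i // C, i % C) for i in flat]

# Modified Hansen-Hurwitz estimator: average, over the initial sample, of the
# mean volume within each sampled plot's network (a non-satisfying plot is
# its own network of size one, contributing zero here).
net_means = []
for plot in initial:
    if vol[plot] > 0:
        net = network(plot)
        net_means.append(np.mean([vol[p] for p in net]))
    else:
        net_means.append(0.0)

print(f"true mean volume/plot: {vol.mean():.3f}")
print(f"ACS (modified HH) estimate: {np.mean(net_means):.3f}")
```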

  12. The big five personality traits and individual job performance growth trajectories in maintenance and transitional job stages.

    PubMed

    Thoresen, Carl J; Bradley, Jill C; Bliese, Paul D; Thoresen, Joseph D

    2004-10-01

    This study extends the literature on personality and job performance through the use of random coefficient modeling to test the validity of the Big Five personality traits in predicting overall sales performance and sales performance trajectories--or systematic patterns of performance growth--in 2 samples of pharmaceutical sales representatives at maintenance and transitional job stages (K. R. Murphy, 1989). In the maintenance sample, conscientiousness and extraversion were positively associated with between-person differences in total sales, whereas only conscientiousness predicted performance growth. In the transitional sample, agreeableness and openness to experience predicted overall performance differences and performance trends. All effects remained significant with job tenure statistically controlled. Possible explanations for these findings are offered, and theoretical and practical implications of findings are discussed. (c) 2004 APA, all rights reserved

  13. The patient safety climate in healthcare organizations (PSCHO) survey: Short-form development.

    PubMed

    Benzer, Justin K; Meterko, Mark; Singer, Sara J

    2017-08-01

    Measures of safety climate are increasingly used to guide safety improvement initiatives. However, cost and respondent burden may limit the use of safety climate surveys. The purpose of this study was to develop a 15- to 20-item safety climate survey based on the Patient Safety Climate in Healthcare Organizations survey, a well-validated 38-item measure of safety climate. The Patient Safety Climate in Healthcare Organizations was administered to all senior managers, all physicians, and a 10% random sample of all other hospital personnel in 69 private sector hospitals and 30 Veterans Health Administration hospitals. Both samples were randomly divided into a derivation sample to identify a short-form subset and a confirmation sample to assess the psychometric properties of the proposed short form. The short form consists of 15 items representing the 3 overarching domains of the long-form scale: organization, work unit, and interpersonal. The proposed short form efficiently captures 3 important sources of variance in safety climate: organizational, work-unit, and interpersonal. The short-form development process was a practical method that can be applied to other safety climate surveys. This safety climate short form may increase response rates in studies that involve busy clinicians or repeated measures. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.

  14. Detection of mastitis in dairy cattle by use of mixture models for repeated somatic cell scores: a Bayesian approach via Gibbs sampling.

    PubMed

    Odegård, J; Jensen, J; Madsen, P; Gianola, D; Klemetsdal, G; Heringstad, B

    2003-11-01

    The distribution of somatic cell scores could be regarded as a mixture of at least two components depending on a cow's udder health status. A heteroscedastic two-component Bayesian normal mixture model with random effects was developed and implemented via Gibbs sampling. The model was evaluated using datasets consisting of simulated somatic cell score records. Somatic cell score was simulated as a mixture representing two alternative udder health statuses ("healthy" or "diseased"). Animals were assigned randomly to the two components according to the probability of group membership (Pm). Random effects (additive genetic and permanent environment), when included, had identical distributions across mixture components. Posterior probabilities of putative mastitis were estimated for all observations, and model adequacy was evaluated using measures of sensitivity, specificity, and posterior probability of misclassification. Fitting different residual variances in the two mixture components caused some bias in estimation of parameters. When the components were difficult to disentangle, so were their residual variances, causing bias in estimation of Pm and of location parameters of the two underlying distributions. When all variance components were identical across mixture components, the mixture model analyses returned parameter estimates essentially without bias and with a high degree of precision. Including random effects in the model increased the probability of correct classification substantially. No sizable differences in probability of correct classification were found between models in which a single cow effect (ignoring relationships) was fitted and models where this effect was split into genetic and permanent environmental components, utilizing relationship information. When genetic and permanent environmental effects were fitted, the between-replicate variance of estimates of posterior means was smaller because the model accounted for random genetic drift.
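
    The two-component mixture idea can be illustrated with a deliberately simplified Gibbs sampler (equal variances, simulated data, and no genetic or permanent-environment random effects); this is a sketch of the general machinery, not the authors' model.

```python
import numpy as np

# Minimal Gibbs sampler for a two-component normal mixture (equal variances),
# as a much-simplified sketch of the mixture idea in the abstract; it omits
# the genetic and permanent-environment random effects of the full model.
rng = np.random.default_rng(2)

# Simulated somatic-cell-score-like data: "healthy" vs "diseased" components.
n, p_true = 2000, 0.2
z_true = rng.random(n) < p_true
y = np.where(z_true, rng.normal(5.0, 1.0, n), rng.normal(2.5, 1.0, n))

# Initial values.
mu = np.array([1.0, 6.0])     # component means (healthy, diseased)
sigma2, p = 1.0, 0.5
draws = []

for it in range(2000):
    # 1) sample component labels given the current parameters
    d_dis = p       * np.exp(-0.5 * (y - mu[1]) ** 2 / sigma2)
    d_hea = (1 - p) * np.exp(-0.5 * (y - mu[0]) ** 2 / sigma2)
    z = rng.random(n) < d_dis / (d_dis + d_hea)     # True = "diseased"

    # 2) sample the mixing proportion (Beta posterior with a uniform prior)
    p = rng.beta(1 + z.sum(), 1 + (~z).sum())

    # 3) sample the component means (flat prior -> normal posterior)
    for k, mask in enumerate([~z, z]):
        m = mask.sum()
        if m > 0:
            mu[k] = rng.normal(y[mask].mean(), np.sqrt(sigma2 / m))

    # 4) sample the common variance (inverse-gamma posterior, Jeffreys-type prior)
    resid = y - np.where(z, mu[1], mu[0])
    sigma2 = 1.0 / rng.gamma(n / 2.0, 2.0 / np.sum(resid ** 2))

    if it >= 500:                                   # discard burn-in
        draws.append((p, mu[0], mu[1], sigma2))

post = np.array(draws).mean(axis=0)
print("posterior means  p=%.3f  mu_healthy=%.2f  mu_diseased=%.2f  var=%.2f" % tuple(post))
```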

  15. Comparison of Random Forest, k-Nearest Neighbor, and Support Vector Machine Classifiers for Land Cover Classification Using Sentinel-2 Imagery

    PubMed Central

    Thanh Noi, Phan; Kappas, Martin

    2017-01-01

    In previous classification studies, three non-parametric classifiers, Random Forest (RF), k-Nearest Neighbor (kNN), and Support Vector Machine (SVM), were reported as the foremost classifiers at producing high accuracies. However, only a few studies have compared the performances of these classifiers with different training sample sizes for the same remote sensing images, particularly the Sentinel-2 Multispectral Imager (MSI). In this study, we examined and compared the performances of the RF, kNN, and SVM classifiers for land use/cover classification using Sentinel-2 image data. An area of 30 × 30 km2 within the Red River Delta of Vietnam with six land use/cover types was classified using 14 different training sample sizes, including balanced and imbalanced, from 50 to over 1250 pixels/class. All classification results showed a high overall accuracy (OA) ranging from 90% to 95%. Among the three classifiers and 14 sub-datasets, SVM produced the highest OA with the least sensitivity to the training sample sizes, followed consecutively by RF and kNN. In relation to the sample size, all three classifiers showed a similar and high OA (over 93.85%) when the training sample size was large enough, i.e., greater than 750 pixels/class or representing an area of approximately 0.25% of the total study area. The high accuracy was achieved with both imbalanced and balanced datasets. PMID:29271909
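
    A sketch of such a comparison using scikit-learn on synthetic "pixels" (rather than Sentinel-2 imagery) is shown below; classifier settings and sample sizes are illustrative only.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Sketch of the classifier comparison on synthetic "pixels" (10 bands, 6
# classes) instead of Sentinel-2 data, varying the training sample size.
X, y = make_classification(n_samples=20000, n_features=10, n_informative=8,
                           n_classes=6, n_clusters_per_class=1, random_state=0)
X_pool, X_test, y_pool, y_test = train_test_split(X, y, test_size=5000,
                                                  stratify=y, random_state=0)

classifiers = {
    "RF":  RandomForestClassifier(n_estimators=200, random_state=0),
    "kNN": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale")),
}

for per_class in (50, 250, 750):
    # draw a balanced training subset of `per_class` pixels per class
    idx = np.concatenate([np.where(y_pool == c)[0][:per_class] for c in range(6)])
    for name, clf in classifiers.items():
        clf.fit(X_pool[idx], y_pool[idx])
        oa = accuracy_score(y_test, clf.predict(X_test))
        print(f"{per_class:>4}/class  {name:<4} OA = {oa:.3f}")
```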

  16. Comparison of Random Forest, k-Nearest Neighbor, and Support Vector Machine Classifiers for Land Cover Classification Using Sentinel-2 Imagery.

    PubMed

    Thanh Noi, Phan; Kappas, Martin

    2017-12-22

    In previous classification studies, three non-parametric classifiers, Random Forest (RF), k-Nearest Neighbor (kNN), and Support Vector Machine (SVM), were reported as the foremost classifiers at producing high accuracies. However, only a few studies have compared the performances of these classifiers with different training sample sizes for the same remote sensing images, particularly the Sentinel-2 Multispectral Imager (MSI). In this study, we examined and compared the performances of the RF, kNN, and SVM classifiers for land use/cover classification using Sentinel-2 image data. An area of 30 × 30 km² within the Red River Delta of Vietnam with six land use/cover types was classified using 14 different training sample sizes, including balanced and imbalanced, from 50 to over 1250 pixels/class. All classification results showed a high overall accuracy (OA) ranging from 90% to 95%. Among the three classifiers and 14 sub-datasets, SVM produced the highest OA with the least sensitivity to the training sample sizes, followed consecutively by RF and kNN. In relation to the sample size, all three classifiers showed a similar and high OA (over 93.85%) when the training sample size was large enough, i.e., greater than 750 pixels/class or representing an area of approximately 0.25% of the total study area. The high accuracy was achieved with both imbalanced and balanced datasets.

  17. Botanical origin, colour, granulation, and sensory properties of the Harenna forest honey, Bale, Ethiopia.

    PubMed

    Belay, Abera; Solomon, W K; Bultossa, Geremew; Adgaba, Nuru; Melaku, Samuel

    2015-01-15

    In this study, Harenna forest honey samples were investigated with respect to their botanical origin, granulation, colour and sensory properties. Sixteen honey samples were collected from two representative sites (Chiri, C, and Wabero, W) using random sampling techniques. Botanical origin was investigated using qualitative pollen analysis by counting 500 pollen grains following harmonised methods of melissopalynology. Granulation, colour and sensory properties of the honey were determined by visual observation, a Pfund grader, and acceptability and preference tests, respectively. Honey samples were also tested for tetracycline. Honey obtained from Wabero originated predominantly from Syzygium guineense, while Chiri honey was multifloral. The colour of the honey ranged from 34 to 85 on the Pfund scale, corresponding to light amber and extra light amber. The honey samples were free from tetracycline residue and formed coarse granules slowly. No significant variation (p > 0.05) in sensory preference or acceptability was observed due to hive type or location. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. The saving and empowering young lives in Europe (SEYLE) randomized controlled trial (RCT): methodological issues and participant characteristics.

    PubMed

    Carli, Vladimir; Wasserman, Camilla; Wasserman, Danuta; Sarchiapone, Marco; Apter, Alan; Balazs, Judit; Bobes, Julio; Brunner, Romuald; Corcoran, Paul; Cosman, Doina; Guillemin, Francis; Haring, Christian; Kaess, Michael; Kahn, Jean Pierre; Keeley, Helen; Keresztény, Agnes; Iosue, Miriam; Mars, Ursa; Musa, George; Nemes, Bogdan; Postuvan, Vita; Reiter-Theil, Stella; Saiz, Pilar; Varnik, Peeter; Varnik, Airi; Hoven, Christina W

    2013-05-16

    Mental health problems and risk behaviours among young people are of great public health concern. Consequently, within the VII Framework Programme, the European Commission funded the Saving and Empowering Young Lives in Europe (SEYLE) project. This Randomized Controlled Trial (RCT) was conducted in eleven European countries, with Sweden as the coordinating centre, and was designed to identify an effective way to promote mental health and reduce suicidality and risk taking behaviours among adolescents. To describe the methodological and field procedures in the SEYLE RCT among adolescents, as well as to present the main characteristics of the recruited sample. Analyses were conducted to determine: 1) representativeness of study sites compared to respective national data; 2) response rate of schools and pupils, drop-out rates from baseline to 3 and 12 month follow-up, 3) comparability of samples among the four Intervention Arms; 4) properties of the standard scales employed: Beck Depression Inventory, Second Edition (BDI-II), Zung Self-Rating Anxiety Scale (Z-SAS), Strengths and Difficulties Questionnaire (SDQ), World Health Organization Well-Being Scale (WHO-5). Participants at baseline comprised 12,395 adolescents (M/F: 5,529/6,799; mean age=14.9±0.9) from Austria, Estonia, France, Germany, Hungary, Ireland, Israel, Italy, Romania, Slovenia and Spain. At the 3 and 12 months follow up, participation rates were 87.3% and 79.4%, respectively. Demographic characteristics of participating sites were found to be reasonably representative of their respective national population. Overall response rate of schools was 67.8%. All scales utilised in the study had good to very good internal reliability, as measured by Cronbach's alpha (BDI-II: 0.864; Z-SAS: 0.805; SDQ: 0.740; WHO-5: 0.799). SEYLE achieved its objective of recruiting a large representative sample of adolescents within participating European countries. Analysis of SEYLE data will shed light on the effectiveness of important interventions aimed at improving adolescent mental health and well-being, reducing risk-taking and self-destructive behaviour and preventing suicidality. US National Institute of Health (NIH) clinical trial registry (NCT00906620) and the German Clinical Trials Register (DRKS00000214).

  19. CURE-SMOTE algorithm and hybrid algorithm for feature selection and parameter optimization based on random forests.

    PubMed

    Ma, Li; Fan, Suohai

    2017-03-14

    The random forests algorithm is a type of classifier with prominent universality, a wide application range, and robustness for avoiding overfitting. But there are still some drawbacks to random forests. Therefore, to improve the performance of random forests, this paper seeks to improve imbalanced data processing, feature selection and parameter optimization. We propose the CURE-SMOTE algorithm for the imbalanced data classification problem. Experiments on imbalanced UCI data reveal that the combination of Clustering Using Representatives (CURE) enhances the original synthetic minority oversampling technique (SMOTE) algorithms effectively compared with the classification results on the original data using random sampling, Borderline-SMOTE1, safe-level SMOTE, C-SMOTE, and k-means-SMOTE. Additionally, the hybrid RF (random forests) algorithm has been proposed for feature selection and parameter optimization, which uses the minimum out of bag (OOB) data error as its objective function. Simulation results on binary and higher-dimensional data indicate that the proposed hybrid RF algorithms, hybrid genetic-random forests algorithm, hybrid particle swarm-random forests algorithm and hybrid fish swarm-random forests algorithm can achieve the minimum OOB error and show the best generalization ability. The training set produced from the proposed CURE-SMOTE algorithm is closer to the original data distribution because it contains minimal noise. Thus, better classification results are produced from this feasible and effective algorithm. Moreover, the hybrid algorithm's F-value, G-mean, AUC and OOB scores demonstrate that they surpass the performance of the original RF algorithm. Hence, this hybrid algorithm provides a new way to perform feature selection and parameter optimization.
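
    The sketch below pairs a plain, hand-rolled SMOTE-style interpolation (not the CURE-SMOTE variant proposed in the paper) with a random forest whose out-of-bag error stands in for the objective minimised by the hybrid algorithms; data and parameters are invented.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import NearestNeighbors

# Sketch only: plain SMOTE-style interpolation (not the CURE-SMOTE variant in
# the paper) followed by a random forest whose out-of-bag (OOB) error plays
# the role of the objective the hybrid algorithms minimise.
rng = np.random.default_rng(3)
X, y = make_classification(n_samples=3000, n_features=12, weights=[0.95, 0.05],
                           random_state=3)

def smote(X_min, n_new, k=5, rng=rng):
    """Create n_new synthetic minority samples by interpolating between neighbours."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X_min)
    _, idx = nn.kneighbors(X_min)
    base = rng.integers(len(X_min), size=n_new)
    neigh = idx[base, rng.integers(1, k + 1, size=n_new)]   # skip self (column 0)
    lam = rng.random((n_new, 1))
    return X_min[base] + lam * (X_min[neigh] - X_min[base])

X_min = X[y == 1]
X_new = smote(X_min, n_new=len(X[y == 0]) - len(X_min))
X_bal = np.vstack([X, X_new])
y_bal = np.concatenate([y, np.ones(len(X_new), dtype=int)])

for label, (Xt, yt) in {"original": (X, y), "SMOTE-balanced": (X_bal, y_bal)}.items():
    rf = RandomForestClassifier(n_estimators=300, oob_score=True, random_state=3)
    rf.fit(Xt, yt)
    print(f"{label:>15}: OOB error = {1 - rf.oob_score_:.3f}")
```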

  20. Translational Genomics Research Institute: Identification of Pathways Enriched with Condition-Specific Statistical Dependencies Across Four Subtypes of Glioblastoma Multiforme | Office of Cancer Genomics

    Cancer.gov

    Evaluation of Differential DependencY (EDDY) is a statistical test for the differential dependency relationship of a set of genes between two given conditions. For each condition, possible dependency network structures are enumerated and their likelihoods are computed to represent a probability distribution of dependency networks. The difference between the probability distributions of dependency networks is computed between conditions, and its statistical significance is evaluated with random permutations of condition labels on the samples.  
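
    The label-permutation idea can be illustrated with a generic permutation test; the statistic below is a simple correlation difference rather than EDDY's network-likelihood divergence, and the data are simulated.

```python
import numpy as np

# Generic permutation-test sketch: assess whether a dependency statistic (here
# simply the Pearson correlation between two genes) differs between two
# conditions by randomly permuting the condition labels, in the spirit of the
# label permutations used by EDDY (this is not the EDDY network statistic).
rng = np.random.default_rng(4)

n_a, n_b = 60, 60
gene1_a = rng.normal(size=n_a)
gene2_a = 0.8 * gene1_a + rng.normal(scale=0.6, size=n_a)   # dependent in condition A
gene1_b = rng.normal(size=n_b)
gene2_b = rng.normal(size=n_b)                              # independent in condition B

g1 = np.concatenate([gene1_a, gene1_b])
g2 = np.concatenate([gene2_a, gene2_b])
labels = np.array([0] * n_a + [1] * n_b)

def stat(lab):
    ca = np.corrcoef(g1[lab == 0], g2[lab == 0])[0, 1]
    cb = np.corrcoef(g1[lab == 1], g2[lab == 1])[0, 1]
    return abs(ca - cb)

observed = stat(labels)
perm_stats = np.array([stat(rng.permutation(labels)) for _ in range(2000)])
p_value = (1 + np.sum(perm_stats >= observed)) / (1 + len(perm_stats))
print(f"observed |delta correlation| = {observed:.3f}, permutation p = {p_value:.4f}")
```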

  1. Translational Genomics Research Institute (TGen): Identification of Pathways Enriched with Condition-Specific Statistical Dependencies Across Four Subtypes of Glioblastoma Multiforme | Office of Cancer Genomics

    Cancer.gov

    Evaluation of Differential DependencY (EDDY) is a statistical test for the differential dependency relationship of a set of genes between two given conditions. For each condition, possible dependency network structures are enumerated and their likelihoods are computed to represent a probability distribution of dependency networks. The difference between the probability distributions of dependency networks is computed between conditions, and its statistical significance is evaluated with random permutations of condition labels on the samples.  

  2. Designing a national soil erosion monitoring network for England and Wales

    NASA Astrophysics Data System (ADS)

    Lark, Murray; Rawlins, Barry; Anderson, Karen; Evans, Martin; Farrow, Luke; Glendell, Miriam; James, Mike; Rickson, Jane; Quine, Timothy; Quinton, John; Brazier, Richard

    2014-05-01

    Although soil erosion is recognised as a significant threat to sustainable land use and may be a priority for action in any forthcoming EU Soil Framework Directive, those responsible for setting national policy with respect to erosion are constrained by a lack of robust, representative data at large spatial scales. This reflects the process-orientated nature of much soil erosion research. Recognising this limitation, the UK Department for Environment, Food and Rural Affairs (Defra) established a project to pilot a cost-effective framework for monitoring of soil erosion in England and Wales (E&W). The pilot will compare different soil erosion monitoring methods at a site scale and provide statistical information for the final design of the full national monitoring network, which will: (i) provide unbiased estimates of the spatial mean of soil erosion rate across E&W (tonnes ha⁻¹ yr⁻¹) for each of three land-use classes (arable and horticultural; grassland; upland and semi-natural habitats); and (ii) quantify the uncertainty of these estimates with confidence intervals. Probability (design-based) sampling provides the most efficient unbiased estimates of spatial means. In this study, a 16 hectare area (a square of 400 x 400 m) positioned at the centre of a 1-km grid cell, selected at random from mapped land use across E&W, provided the sampling support for measurement of erosion rates, with at least 94% of the support area corresponding to the target land use classes. Very small or zero erosion rates likely to be encountered at many sites reduce the sampling efficiency and make it difficult to compare different methods of soil erosion monitoring. Therefore, to increase the proportion of samples with larger erosion rates without biasing our estimates, we increased the inclusion probability density in areas where the erosion rate is likely to be large by using stratified random sampling. First, each sampling domain (land use class in E&W) was divided into strata, e.g. two sub-domains within which, respectively, small or no erosion rates and moderate or larger erosion rates are expected. Each stratum was then sampled independently and at random. The sample density need not be equal in the two strata, but it is known and is accounted for in the estimation of the mean and its standard error. To divide the domains into strata we used information on slope angle, previous interpretation of the erosion susceptibility of the soil associations on the soil map of E&W at 1:250 000 (Soil Survey of England and Wales, 1983), and visual interpretation of evidence of erosion from aerial photography. While each domain could be stratified on the basis of the first two criteria, air photo interpretation across the whole country was not feasible. For this reason we used a two-phase random sampling for stratification (TPRS) design (de Gruijter et al., 2006). First, we formed an initial random sample of 1-km grid cells from the target domain. Second, each cell was then allocated to a stratum on the basis of the three criteria. A subset of the selected cells from each stratum was then selected for field survey at random, with a specified sampling density for each stratum so as to increase the proportion of cells where moderate or larger erosion rates were expected.
Once measurements of erosion have been made, an estimate of the spatial mean of the erosion rate over the target domain, its standard error and associated uncertainty can be calculated by an expression which accounts for the estimated proportions of the two strata within the initial random sample. de Gruijter, J.J., Brus, D.J., Biekens, M.F.P. & Knotters, M. 2006. Sampling for Natural Resource Monitoring. Springer, Berlin. Soil Survey of England and Wales. 1983 National Soil Map NATMAP Vector 1:250,000. National Soil Research Institute, Cranfield University.
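
    A minimal sketch of the design-based stratified estimator of a spatial mean and its standard error (without the two-phase TPRS refinement) is given below; strata sizes, sample sizes, and erosion rates are invented.

```python
import numpy as np

# Sketch of a design-based stratified estimate of a mean erosion rate with its
# standard error. Two strata ("low-risk", "high-risk") are sampled at different
# densities; the numbers are invented and this is not the full two-phase (TPRS)
# design described in the abstract.
rng = np.random.default_rng(5)

# Hypothetical population of 1-km cells in each stratum (t/ha/yr).
pop = {
    "low-risk":  rng.exponential(0.05, size=8000),    # mostly near-zero erosion
    "high-risk": rng.exponential(0.60, size=2000),
}
n_sample = {"low-risk": 40, "high-risk": 60}          # heavier sampling where erosion is likely

N_total = sum(len(v) for v in pop.values())
mean_hat, var_hat = 0.0, 0.0
for stratum, values in pop.items():
    N_h, n_h = len(values), n_sample[stratum]
    sample = rng.choice(values, size=n_h, replace=False)
    W_h = N_h / N_total                                # stratum weight
    mean_hat += W_h * sample.mean()
    # variance of the stratified mean, with finite population correction
    var_hat += W_h ** 2 * (1 - n_h / N_h) * sample.var(ddof=1) / n_h

print(f"estimated mean erosion rate: {mean_hat:.4f} t/ha/yr  (SE {np.sqrt(var_hat):.4f})")
print(f"true population mean:        {np.concatenate(list(pop.values())).mean():.4f}")
```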

  3. Reliability of N-terminal proBNP assay in diagnosis of left ventricular systolic dysfunction within representative and high risk populations.

    PubMed

    Hobbs, F D R; Davis, R C; Roalfe, A K; Hare, R; Davies, M K

    2004-08-01

    To determine the performance of a new NT-proBNP assay in comparison with brain natriuretic peptide (BNP) in identifying left ventricular systolic dysfunction (LVSD) in randomly selected community populations. Blood samples were taken prospectively in the community from 591 randomly sampled individuals over the age of 45 years, stratified for age and socioeconomic status and divided into four cohorts (general population; clinically diagnosed heart failure; patients on diuretics; and patients deemed at high risk of heart failure). Definite heart failure (left ventricular ejection fraction (LVEF) < 40%) was identified in 33 people. Samples were handled as though in routine clinical practice. The laboratories undertaking the assays were blinded. Using NT-proBNP to diagnose LVEF < 40% in the general population, a level of > 40 pmol/l had 80% sensitivity, 73% specificity, 5% positive predictive value (PPV), 100% negative predictive value (NPV), and an area under the receiver-operator characteristic curve (AUC) of 76% (95% confidence interval (CI) 46% to 100%). For BNP to diagnose LVSD, a cut off level of > 33 pmol/l had 80% sensitivity, 88% specificity, 10% PPV, 100% NPV, and AUC of 88% (95% CI 75% to 100%). Similar NPVs were found for patients randomly screened from the three other populations. Both NT-proBNP and BNP have value in diagnosing LVSD in a community setting, with similar sensitivities and specificities. Using a high cut off for positivity will confirm the diagnosis of LVSD but will miss cases. At lower cut off values, positive results will require cardiac imaging to confirm LVSD.

  4. Being "SMART" About Adolescent Conduct Problems Prevention: Executing a SMART Pilot Study in a Juvenile Diversion Agency.

    PubMed

    August, Gerald J; Piehler, Timothy F; Bloomquist, Michael L

    2016-01-01

    The development of adaptive treatment strategies (ATS) represents the next step in innovating conduct problems prevention programs within a juvenile diversion context. Toward this goal, we present the theoretical rationale, associated methods, and anticipated challenges for a feasibility pilot study in preparation for implementing a full-scale SMART (i.e., sequential, multiple assignment, randomized trial) for conduct problems prevention. The role of a SMART design in constructing ATS is presented. The SMART feasibility pilot study includes a sample of 100 youth (13-17 years of age) identified by law enforcement as early stage offenders and referred for precourt juvenile diversion programming. Prior data on the sample population detail a high level of ethnic diversity and approximately equal representations of both genders. Within the SMART, youth and their families are first randomly assigned to one of two different brief-type evidence-based prevention programs, featuring parent-focused behavioral management or youth-focused strengths-building components. Youth who do not respond sufficiently to brief first-stage programming will be randomly assigned a second time to either an extended parent- or youth-focused second-stage programming. Measures of proximal intervention response and measures of potential candidate tailoring variables for developing ATS within this sample are detailed. Results of the described pilot study will include information regarding feasibility and acceptability of the SMART design. This information will be used to refine a subsequent full-scale SMART. The use of a SMART to develop ATS for prevention will increase the efficiency and effectiveness of prevention programing for youth with developing conduct problems.

  5. Rapid Quantification of Mutant Fitness in Diverse Bacteria by Sequencing Randomly Bar-Coded Transposons

    PubMed Central

    Wetmore, Kelly M.; Price, Morgan N.; Waters, Robert J.; Lamson, Jacob S.; He, Jennifer; Hoover, Cindi A.; Blow, Matthew J.; Bristow, James; Butland, Gareth

    2015-01-01

    ABSTRACT Transposon mutagenesis with next-generation sequencing (TnSeq) is a powerful approach to annotate gene function in bacteria, but existing protocols for TnSeq require laborious preparation of every sample before sequencing. Thus, the existing protocols are not amenable to the throughput necessary to identify phenotypes and functions for the majority of genes in diverse bacteria. Here, we present a method, random bar code transposon-site sequencing (RB-TnSeq), which increases the throughput of mutant fitness profiling by incorporating random DNA bar codes into Tn5 and mariner transposons and by using bar code sequencing (BarSeq) to assay mutant fitness. RB-TnSeq can be used with any transposon, and TnSeq is performed once per organism instead of once per sample. Each BarSeq assay requires only a simple PCR, and 48 to 96 samples can be sequenced on one lane of an Illumina HiSeq system. We demonstrate the reproducibility and biological significance of RB-TnSeq with Escherichia coli, Phaeobacter inhibens, Pseudomonas stutzeri, Shewanella amazonensis, and Shewanella oneidensis. To demonstrate the increased throughput of RB-TnSeq, we performed 387 successful genome-wide mutant fitness assays representing 130 different bacterium-carbon source combinations and identified 5,196 genes with significant phenotypes across the five bacteria. In P. inhibens, we used our mutant fitness data to identify genes important for the utilization of diverse carbon substrates, including a putative d-mannose isomerase that is required for mannitol catabolism. RB-TnSeq will enable the cost-effective functional annotation of diverse bacteria using mutant fitness profiling. PMID:25968644

  6. Expedite random structure searching using objects from Wyckoff positions

    NASA Astrophysics Data System (ADS)

    Wang, Shu-Wei; Hsing, Cheng-Rong; Wei, Ching-Ming

    2018-02-01

    Random structure searching has proved to be a powerful approach for finding the global minimum and metastable structures. In principle a truly random sampling is needed, yet for complicated systems it would be highly time-consuming, or practically impossible, to find the global minimum in their high-dimensional configuration space. The implementation of reasonable constraints, such as adopting system symmetries to reduce the independent dimensions of the structural space and/or imposing chemical information to reach and relax into low-energy regions, is therefore the most essential issue in the approach. In this paper, we propose the concept of an "object", which is either an atom or a set of atoms (such as a molecule or carbonate unit) carrying a symmetry defined by one of the Wyckoff positions of a space group; this allows the search for the global minimum of a complicated system to be confined to a greatly reduced structural space and thus to become accessible in practice. We examined several representative materials, including the Cd3As2 crystal, solid methanol, high-pressure carbonates (FeCO3), and the Si(111)-7 × 7 reconstructed surface, to demonstrate the power and advantages of using the "object" concept in random structure searching.

  7. A design of experiments approach to validation sampling for logistic regression modeling with error-prone medical records.

    PubMed

    Ouyang, Liwen; Apley, Daniel W; Mehrotra, Sanjay

    2016-04-01

    Electronic medical record (EMR) databases offer significant potential for developing clinical hypotheses and identifying disease risk associations by fitting statistical models that capture the relationship between a binary response variable and a set of predictor variables that represent clinical, phenotypical, and demographic data for the patient. However, EMR response data may be error prone for a variety of reasons. Performing a manual chart review to validate data accuracy is time consuming, which limits the number of chart reviews in a large database. The authors' objective is to develop a new design-of-experiments-based systematic chart validation and review (DSCVR) approach that is more powerful than the random validation sampling used in existing approaches. The DSCVR approach judiciously and efficiently selects the cases to validate (i.e., validate whether the response values are correct for those cases) for maximum information content, based only on their predictor variable values. The final predictive model will be fit using only the validation sample, ignoring the remainder of the unvalidated and unreliable error-prone data. A Fisher information based D-optimality criterion is used, and an algorithm for optimizing it is developed. The authors' method is tested in a simulation comparison that is based on a sudden cardiac arrest case study with 23 041 patients' records. This DSCVR approach, using the Fisher information based D-optimality criterion, results in a fitted model with much better predictive performance, as measured by the receiver operating characteristic curve and the accuracy in predicting whether a patient will experience the event, than a model fitted using a random validation sample. The simulation comparisons demonstrate that this DSCVR approach can produce predictive models that are significantly better than those produced from random validation sampling, especially when the event rate is low. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
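
    A greedy sketch of Fisher-information (D-optimality) based selection of records to validate for a logistic regression is shown below; the pilot coefficients, data, and greedy search are simplifications and are not the authors' DSCVR algorithm.

```python
import numpy as np

# Greedy D-optimal selection sketch for choosing which records to validate
# before fitting a logistic regression, loosely following the idea of using a
# Fisher-information (D-optimality) criterion. The pilot coefficients, data,
# and greedy search are all illustrative simplifications of the DSCVR approach.
rng = np.random.default_rng(6)

n, p = 5000, 4
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])   # predictors only
beta_pilot = np.array([-2.0, 0.8, -0.5, 0.3])                    # assumed pilot coefficients

probs = 1.0 / (1.0 + np.exp(-X @ beta_pilot))
w = probs * (1 - probs)                        # per-record Fisher weights

budget = 200                                   # number of charts we can validate
selected = list(rng.choice(n, size=p + 1, replace=False))        # small random seed set
info = (X[selected] * w[selected, None]).T @ X[selected]

while len(selected) < budget:
    best_gain, best_i = -np.inf, None
    candidates = rng.choice(n, size=500, replace=False)           # subsample for speed
    for i in candidates:
        if i in selected:
            continue
        xi = X[i]
        # rank-one update of log det via the matrix determinant lemma
        gain = np.log(1.0 + w[i] * xi @ np.linalg.solve(info, xi))
        if gain > best_gain:
            best_gain, best_i = gain, i
    selected.append(best_i)
    info += w[best_i] * np.outer(X[best_i], X[best_i])

sign, logdet = np.linalg.slogdet(info)
print(f"validated {len(selected)} records, log det(information) = {logdet:.2f}")
```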

  8. Unsupervised Bayesian linear unmixing of gene expression microarrays.

    PubMed

    Bazot, Cécile; Dobigeon, Nicolas; Tourneret, Jean-Yves; Zaas, Aimee K; Ginsburg, Geoffrey S; Hero, Alfred O

    2013-03-19

    This paper introduces a new constrained model and the corresponding algorithm, called unsupervised Bayesian linear unmixing (uBLU), to identify biological signatures from high dimensional assays like gene expression microarrays. The basis for uBLU is a Bayesian model for the data samples which are represented as an additive mixture of random positive gene signatures, called factors, with random positive mixing coefficients, called factor scores, that specify the relative contribution of each signature to a specific sample. The particularity of the proposed method is that uBLU constrains the factor loadings to be non-negative and the factor scores to be probability distributions over the factors. Furthermore, it also provides estimates of the number of factors. A Gibbs sampling strategy is adopted here to generate random samples according to the posterior distribution of the factors, factor scores, and number of factors. These samples are then used to estimate all the unknown parameters. Firstly, the proposed uBLU method is applied to several simulated datasets with known ground truth and compared with previous factor decomposition methods, such as principal component analysis (PCA), non negative matrix factorization (NMF), Bayesian factor regression modeling (BFRM), and the gradient-based algorithm for general matrix factorization (GB-GMF). Secondly, we illustrate the application of uBLU on a real time-evolving gene expression dataset from a recent viral challenge study in which individuals have been inoculated with influenza A/H3N2/Wisconsin. We show that the uBLU method significantly outperforms the other methods on the simulated and real data sets considered here. The results obtained on synthetic and real data illustrate the accuracy of the proposed uBLU method when compared to other factor decomposition methods from the literature (PCA, NMF, BFRM, and GB-GMF). The uBLU method identifies an inflammatory component closely associated with clinical symptom scores collected during the study. Using a constrained model allows recovery of all the inflammatory genes in a single factor.

  9. Methodology Series Module 5: Sampling Strategies.

    PubMed

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'sampling method'. There are essentially two types of sampling methods: 1) probability sampling, based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling, based on the researcher's choice of a population that is accessible and available. Some of the non-probability sampling methods are purposive sampling, convenience sampling, and quota sampling. Random sampling methods (such as a simple random sample or a stratified random sample) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and to state the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when a convenience sample was used). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use purposive sampling for the study.
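
    A toy contrast of simple random, stratified random, and convenience sampling, assuming an invented patient population, is sketched below.

```python
import numpy as np
import pandas as pd

# Toy illustration of three of the sampling methods described: simple random,
# stratified random, and convenience sampling. The "population" is invented.
rng = np.random.default_rng(7)
N = 10000
clinic = rng.choice(["urban", "rural"], size=N, p=[0.7, 0.3])
age = np.where(clinic == "urban", rng.normal(42, 15, N), rng.normal(56, 14, N)).clip(18, 90)
pop = pd.DataFrame({"clinic": clinic, "age": age})
n = 400

# 1) simple random sample
srs = pop.sample(n=n, random_state=7)

# 2) stratified random sample: proportional allocation by clinic type
strat = (pop.groupby("clinic", group_keys=False)
            .apply(lambda g: g.sample(frac=n / N, random_state=7)))

# 3) convenience sample: whoever is easiest to reach (here: urban clinics only)
conv = pop[pop["clinic"] == "urban"].head(n)

for name, s in [("population", pop), ("simple random", srs),
                ("stratified", strat), ("convenience", conv)]:
    print(f"{name:>14}: mean age {s['age'].mean():5.1f}, "
          f"% urban {100 * (s['clinic'] == 'urban').mean():5.1f}")
```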

  10. A simple and efficient alternative to implementing systematic random sampling in stereological designs without a motorized microscope stage.

    PubMed

    Melvin, Neal R; Poda, Daniel; Sutherland, Robert J

    2007-10-01

    When properly applied, stereology is a very robust and efficient method to quantify a variety of parameters from biological material. A common sampling strategy in stereology is systematic random sampling, which involves choosing a random start point outside the structure of interest and sampling relevant objects at sites placed at pre-determined, equidistant intervals. This has proven to be a very efficient sampling strategy, and is used widely in stereological designs. At the microscopic level, this is most often achieved through the use of a motorized stage that facilitates the systematic random stepping across the structure of interest. Here, we report a simple, precise and cost-effective software-based alternative to accomplishing systematic random sampling under the microscope. We believe that this approach will facilitate the use of stereological designs that employ systematic random sampling in laboratories that lack the resources to acquire costly, fully automated systems.
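
    The systematic random sampling rule itself (random start, fixed interval) is simple to express; the sketch below uses an invented number of candidate sites and an invented interval.

```python
import random

# Sketch of systematic random sampling: pick a random start in [0, step) and
# then take every `step`-th site. Sites here are just indices along a section.
random.seed(8)

n_sites = 537          # total candidate sampling sites across the section
step = 25              # pre-determined sampling interval
start = random.randrange(step)          # random start point, 0 .. step-1

sampled_sites = list(range(start, n_sites, step))
print(f"random start = {start}, sampled {len(sampled_sites)} of {n_sites} sites")
print(sampled_sites)
```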

  11. The science of ecological economics: a content analysis of Ecological Economics, 1989-2004.

    PubMed

    Luzadis, Valerie A; Castello, Leandro; Choi, Jaewon; Greenfield, Eric; Kim, Sung-kyun; Munsell, John; Nordman, Erik; Franco, Carol; Olowabi, Flavien

    2010-01-01

    The Ecological Economics journal is a primary source for inquiry on ecological economics and sustainability. To explore the scholarly pursuit of ecological economics, we conducted a content analysis of 200 randomly sampled research, survey, and methodological articles published in Ecological Economics during the 15-year period of 1989-2004. Results of the analysis were used to investigate facets of transdisciplinarity within the journal. A robust qualitative approach was used to gather and examine data to identify themes representing substantive content found within the span of sampled journal papers. The extent to which each theme was represented was counted as well as additional data, such as author discipline, year published, etc. Four main categories were revealed: (1) foundations (self-reflexive themes stemming from direct discussions about ecological economics); (2) human systems, represented by the themes of values, social indicators of well-being, intergenerational distribution, and equity; (3) biophysical systems, including themes, such as carrying capacity and scarcity, energy, and resource use, relating directly to the biophysical aspects of systems; and (4) policy and management encompassing themes of development, growth, trade, accounting, and valuation, as well as institutional structures and management. The results provide empirical evidence for discussing the future direction of ecological economic efforts.

  12. [The treatment needs of migrant children according to child and adolescent psychiatrists from medical clinics and in private practice].

    PubMed

    Siefen, Georg; Kirkcaldy, Bruce; Adam, Hubertus; Schepker, Renate

    2015-03-01

    How does the German child and adolescent psychiatry system respond to the increasing number of migrant children and adolescents? Senior doctors from German child and adolescent psychiatric hospitals (Association of Medical Hospital Directors in Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy in Germany, BAG) completed a specially constructed questionnaire about the treatment needs of migrant children, while a «random, representative» sample of child and adolescent psychiatrists in private practice (German Professional Association for Child and Adolescent Psychiatry, Psychosomatic Medicine and Psychotherapy, BKJPP) was administered a slightly modified version. The 100 psychiatrists in private practice represented only about one-eighth of their group, whereas the 55 medical directors comprised a representative sample. One-third of the hospitals offer treatments tailored to the specific needs of migrants. In both settings, however, competent interpreters were rarely available, despite treatment problems arising from the parents' understanding of the illness, language difficulties, and clinical knowledge of the patient. Cultural diversity is perceived as enriching. The migration background and the sex of child and adolescent psychiatrists influence the treatment of migrants. Facilitating the process of «cultural opening» in child and adolescent psychiatry involves enacting concrete steps, such as the funding of interpreter costs.

  13. Host-Associated Metagenomics: A Guide to Generating Infectious RNA Viromes

    PubMed Central

    Robert, Catherine; Pascalis, Hervé; Michelle, Caroline; Jardot, Priscilla; Charrel, Rémi; Raoult, Didier; Desnues, Christelle

    2015-01-01

    Background Metagenomic analyses have been widely used in the last decade to describe viral communities in various environments or to identify the etiology of human, animal, and plant pathologies. Here, we present a simple and standardized protocol that allows for the purification and sequencing of RNA viromes from complex biological samples with an important reduction of host DNA and RNA contaminants, while preserving the infectivity of viral particles. Principal Findings We evaluated different viral purification steps, random reverse transcriptions and sequence-independent amplifications of a pool of representative RNA viruses. Viruses remained infectious after the purification process. We then validated the protocol by sequencing the RNA virome of human body lice engorged in vitro with artificially contaminated human blood. The full genomes of the most abundant viruses absorbed by the lice during the blood meal were successfully sequenced. Interestingly, random amplifications differed in the genome coverage of segmented RNA viruses. Moreover, the majority of reads were taxonomically identified, and only 7–15% of all reads were classified as “unknown”, depending on the random amplification method. Conclusion The protocol reported here could easily be applied to generate RNA viral metagenomes from complex biological samples of different origins. Our protocol allows further virological characterizations of the described viral communities because it preserves the infectivity of viral particles and allows for the isolation of viruses. PMID:26431175

  14. Precipitation and Latent Heating Distributions from Satellite Passive Microwave Radiometry. Part 1; Method and Uncertainties

    NASA Technical Reports Server (NTRS)

    Olson, William S.; Kummerow, Christian D.; Yang, Song; Petty, Grant W.; Tao, Wei-Kuo; Bell, Thomas L.; Braun, Scott A.; Wang, Yansen; Lang, Stephen E.; Johnson, Daniel E.

    2004-01-01

    A revised Bayesian algorithm for estimating surface rain rate, convective rain proportion, and latent heating/drying profiles from satellite-borne passive microwave radiometer observations over ocean backgrounds is described. The algorithm searches a large database of cloud-radiative model simulations to find cloud profiles that are radiatively consistent with a given set of microwave radiance measurements. The properties of these radiatively consistent profiles are then composited to obtain best estimates of the observed properties. The revised algorithm is supported by an expanded and more physically consistent database of cloud-radiative model simulations. The algorithm also features a better quantification of the convective and non-convective contributions to total rainfall, a new geographic database, and an improved representation of background radiances in rain-free regions. Bias and random error estimates are derived from applications of the algorithm to synthetic radiance data, based upon a subset of cloud resolving model simulations, and from the Bayesian formulation itself. Synthetic rain rate and latent heating estimates exhibit a trend of high (low) bias for low (high) retrieved values. The Bayesian estimates of random error are propagated to represent errors at coarser time and space resolutions, based upon applications of the algorithm to TRMM Microwave Imager (TMI) data. Errors in instantaneous rain rate estimates at 0.5 deg resolution range from approximately 50% at 1 mm/h to 20% at 14 mm/h. These errors represent about 70-90% of the mean random deviation between collocated passive microwave and spaceborne radar rain rate estimates. The cumulative algorithm error in TMI estimates at monthly, 2.5 deg resolution is relatively small (less than 6% at 5 mm/day) compared to the random error due to infrequent satellite temporal sampling (8-35% at the same rain rate).

  15. Efficient sampling of complex network with modified random walk strategies

    NASA Astrophysics Data System (ADS)

    Xie, Yunya; Chang, Shuhua; Zhang, Zhipeng; Zhang, Mi; Yang, Lei

    2018-02-01

    We present two novel random walk strategies: the choosing seed node (CSN) random walk and the no-retracing (NR) random walk. Unlike classical random walk sampling, the CSN and NR strategies focus on the influence of the seed node choice and of path overlap, respectively. The three random walk samplings are applied to Erdös-Rényi (ER), Barabási-Albert (BA), Watts-Strogatz (WS), and weighted USAir networks. The major properties of the sampled subnets, such as sampling efficiency, degree distribution, average degree and average clustering coefficient, are then studied. Similar conclusions are reached with all three random walk strategies. First, networks of small scale and simple structure are conducive to sampling. Second, the average degree and the average clustering coefficient of the sampled subnet tend to the corresponding values of the original networks within a limited number of steps. Third, the degree distributions of the subnets are all slightly biased toward the high-degree side. However, the NR strategy performs better for the average clustering coefficient of the subnet. In the real weighted USAir network, salient features such as the larger clustering coefficient and the fluctuation of the degree distribution are reproduced well by these random walk strategies.
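
    The sketch below contrasts a classical random walk with a simple no-retracing walk on a small random graph; it only illustrates the general idea and is not the authors' implementation of the CSN or NR strategies.

```python
import random

# Sketch of classical random-walk sampling versus a simple "no-retracing"
# variant that never steps straight back to the node it just came from. The
# graph is a small Erdos-Renyi-style random graph; this is only an
# illustration of the idea, not the authors' implementation.
random.seed(9)

n, p_edge = 200, 0.05
adj = {v: set() for v in range(n)}
for u in range(n):
    for v in range(u + 1, n):
        if random.random() < p_edge:
            adj[u].add(v)
            adj[v].add(u)

def walk(steps, no_retrace=False):
    current, previous = random.randrange(n), None
    visited = {current}
    for _ in range(steps):
        if no_retrace:
            neighbours = [v for v in adj[current] if v != previous]
        else:
            neighbours = list(adj[current])
        if not neighbours:                      # dead end: allow retracing or stay put
            neighbours = list(adj[current]) or [current]
        nxt = random.choice(neighbours)
        previous, current = current, nxt
        visited.add(current)
    return visited

for label, flag in [("classical RW", False), ("no-retracing RW", True)]:
    sampled = walk(steps=300, no_retrace=flag)
    sub_degrees = [len(adj[v]) for v in sampled]
    print(f"{label:>16}: {len(sampled):3d} nodes sampled, "
          f"mean degree of sampled nodes = {sum(sub_degrees) / len(sub_degrees):.2f}")
print(f"  full-graph mean degree = {sum(len(a) for a in adj.values()) / n:.2f}")
```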

  16. Decompressive Surgery for the Treatment of Malignant Infarction of the Middle Cerebral Artery (DESTINY): a randomized, controlled trial.

    PubMed

    Jüttler, Eric; Schwab, Stefan; Schmiedek, Peter; Unterberg, Andreas; Hennerici, Michael; Woitzik, Johannes; Witte, Steffen; Jenetzky, Ekkehart; Hacke, Werner

    2007-09-01

    Decompressive surgery (hemicraniectomy) for life-threatening massive cerebral infarction represents a controversial issue in neurocritical care medicine. We report here the 30-day mortality and 6- and 12-month functional outcomes from the DESTINY trial. DESTINY (ISRCTN01258591) is a prospective, multicenter, randomized, controlled, clinical trial based on a sequential design that used mortality after 30 days as the first end point. When this end point was reached, patient enrollment was interrupted as per protocol until recalculation of the projected sample size was performed on the basis of the 6-month outcome (primary end point=modified Rankin Scale score, dichotomized to 0 to 3 versus 4 to 6). All analyses were based on intention to treat. A statistically significant reduction in mortality was reached after 32 patients had been included: 15 of 17 (88%) patients randomized to hemicraniectomy versus 7 of 15 (47%) patients randomized to conservative therapy survived after 30 days (P=0.02). After 6 and 12 months, 47% of patients in the surgical arm versus 27% of patients in the conservative treatment arm had a modified Rankin Scale score of 0 to 3 (P=0.23). DESTINY showed that hemicraniectomy reduces mortality in large hemispheric stroke. With 32 patients included, the primary end point failed to demonstrate statistical superiority of hemicraniectomy, and the projected sample size was calculated to 188 patients. Despite this failure to meet the primary end point, the steering committee decided to terminate the trial in light of the results of the joint analysis of the 3 European hemicraniectomy trials.

  17. Methodology Series Module 5: Sampling Strategies

    PubMed Central

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'sampling method'. There are essentially two types of sampling methods: 1) probability sampling, based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling, based on the researcher's choice of a population that is accessible and available. Some of the non-probability sampling methods are purposive sampling, convenience sampling, and quota sampling. Random sampling methods (such as a simple random sample or a stratified random sample) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and to state the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when a convenience sample was used). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use purposive sampling for the study. PMID:27688438

  18. Sample Size Calculations for Population Size Estimation Studies Using Multiplier Methods With Respondent-Driven Sampling Surveys.

    PubMed

    Fearon, Elizabeth; Chabata, Sungai T; Thompson, Jennifer A; Cowan, Frances M; Hargreaves, James R

    2017-09-14

    While guidance exists for obtaining population size estimates using multiplier methods with respondent-driven sampling surveys, we lack specific guidance for making sample size decisions. Our aim is to guide the design of multiplier method population size estimation studies using respondent-driven sampling surveys so as to reduce the random error around the estimate obtained. The population size estimate is obtained by dividing the number of individuals receiving a service or the number of unique objects distributed (M) by the proportion of individuals in a representative survey who report receipt of the service or object (P). We have developed an approach to sample size calculation, interpreting methods to estimate the variance around estimates obtained using multiplier methods in conjunction with research into design effects and respondent-driven sampling. We describe an application to estimate the number of female sex workers in Harare, Zimbabwe. There is high variance in estimates. Random error around the size estimate reflects uncertainty from M and P, particularly when the estimate of P in the respondent-driven sampling survey is low. As expected, sample size requirements are higher when the design effect of the survey is assumed to be greater. We suggest a method for investigating the effects of sample size on the precision of a population size estimate obtained using multiplier methods and respondent-driven sampling. Uncertainty in the size estimate is high, particularly when P is small, so balancing against other potential sources of bias, we advise researchers to consider longer service attendance reference periods and to distribute more unique objects, which is likely to result in a higher estimate of P in the respondent-driven sampling survey. ©Elizabeth Fearon, Sungai T Chabata, Jennifer A Thompson, Frances M Cowan, James R Hargreaves. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 14.09.2017.
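
    The point estimate N = M / P and a delta-method confidence interval inflated by an assumed design effect can be sketched as follows; M, P, n, and the design effect are invented numbers, not the Harare study values.

```python
import math

# Sketch of the multiplier-method population size estimate N = M / P with a
# delta-method confidence interval. M, P, n and the design effect below are
# invented numbers, not the Harare study values.
M = 3000            # unique objects distributed / service visits counted
p_hat = 0.35        # proportion of RDS survey respondents reporting receipt
n = 400             # RDS survey sample size
deff = 2.0          # assumed respondent-driven-sampling design effect

N_hat = M / p_hat
var_p = deff * p_hat * (1 - p_hat) / n          # variance of P, inflated by deff
se_N = (M / p_hat ** 2) * math.sqrt(var_p)      # delta method: dN/dP = -M/P^2
ci = (N_hat - 1.96 * se_N, N_hat + 1.96 * se_N)

print(f"population size estimate: {N_hat:.0f}")
print(f"95% CI (delta method, deff={deff}): {ci[0]:.0f} - {ci[1]:.0f}")
```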

  19. Sampling Strategies for Evaluating the Rate of Adventitious Transgene Presence in Non-Genetically Modified Crop Fields.

    PubMed

    Makowski, David; Bancal, Rémi; Bensadoun, Arnaud; Monod, Hervé; Messéan, Antoine

    2017-09-01

    According to E.U. regulations, the maximum allowable rate of adventitious transgene presence in non-genetically modified (GM) crops is 0.9%. We compared four sampling methods for the detection of transgenic material in agricultural non-GM maize fields: random sampling, stratified sampling, random sampling + ratio reweighting, random sampling + regression reweighting. Random sampling involves simply sampling maize grains from different locations selected at random from the field concerned. The stratified and reweighting sampling methods make use of an auxiliary variable corresponding to the output of a gene-flow model (a zero-inflated Poisson model) simulating cross-pollination as a function of wind speed, wind direction, and distance to the closest GM maize field. With the stratified sampling method, an auxiliary variable is used to define several strata with contrasting transgene presence rates, and grains are then sampled at random from each stratum. With the two methods involving reweighting, grains are first sampled at random from various locations within the field, and the observations are then reweighted according to the auxiliary variable. Data collected from three maize fields were used to compare the four sampling methods, and the results were used to determine the extent to which transgene presence rate estimation was improved by the use of stratified and reweighting sampling methods. We found that transgene rate estimates were more accurate and that substantially smaller samples could be used with sampling strategies based on an auxiliary variable derived from a gene-flow model. © 2017 Society for Risk Analysis.
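
    The "random sampling + ratio reweighting" idea can be sketched with a ratio estimator that uses an auxiliary variable (a stand-in for the gene-flow model output) known at every location; the data below are simulated.

```python
import numpy as np

# Sketch of "random sampling + ratio reweighting": the adventitious-presence
# rate measured on a small random sample of locations is reweighted using an
# auxiliary variable (here a fake gene-flow model output) whose mean is known
# for every location in the field. Data and model are invented.
rng = np.random.default_rng(10)

n_locations = 2000
aux = rng.gamma(shape=1.2, scale=0.004, size=n_locations)     # model-predicted rate
# the true adventitious presence rate correlates with the model output, plus noise
true_rate = np.clip(aux * rng.lognormal(0.0, 0.3, n_locations), 0, 0.05)

n = 40
idx = rng.choice(n_locations, size=n, replace=False)
y_bar, x_bar = true_rate[idx].mean(), aux[idx].mean()

ratio_estimate = (y_bar / x_bar) * aux.mean()     # ratio estimator of the field mean
srs_estimate = y_bar                              # plain random-sampling estimator

print(f"true field-level rate:     {true_rate.mean():.5f}")
print(f"simple random estimate:    {srs_estimate:.5f}")
print(f"ratio-reweighted estimate: {ratio_estimate:.5f}")
```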

  20. Improving riverine constituent concentration and flux estimation by accounting for antecedent discharge conditions

    NASA Astrophysics Data System (ADS)

    Zhang, Qian; Ball, William P.

    2017-04-01

    Regression-based approaches are often employed to estimate riverine constituent concentrations and fluxes based on typically sparse concentration observations. One such approach is the recently developed WRTDS ("Weighted Regressions on Time, Discharge, and Season") method, which has been shown to provide more accurate estimates than prior approaches in a wide range of applications. Centered on WRTDS, this work was aimed at developing improved models for constituent concentration and flux estimation by accounting for antecedent discharge conditions. Twelve modified models were developed and tested, each of which contains one additional flow variable that represents antecedent conditions and can be derived directly from the daily discharge record. High-resolution (∼daily) data at nine diverse monitoring sites were used to evaluate the relative merits of the models for estimation of six constituents - chloride (Cl), nitrate-plus-nitrite (NOx), total Kjeldahl nitrogen (TKN), total phosphorus (TP), soluble reactive phosphorus (SRP), and suspended sediment (SS). For each site-constituent combination, 30 concentration subsets were generated from the original data through Monte Carlo subsampling and then used to evaluate model performance. For the subsampling, three sampling strategies were adopted: (A) 1 random sample each month (12/year), (B) 12 random monthly samples plus an additional 8 random samples per year (20/year), and (C) flow-stratified sampling with 12 regular (non-storm) and 8 storm samples per year (20/year). Results reveal that estimation performance varies with both model choice and sampling strategy. In terms of model choice, the modified models show general improvement over the original model under all three sampling strategies. Major improvements were achieved for NOx by the long-term flow-anomaly model and for Cl by the ADF (average discounted flow) model and the short-term flow-anomaly model. Moderate improvements were achieved for SS, TP, and TKN by the ADF model. By contrast, no improvement was achieved for SRP by any proposed model. In terms of sampling strategy, performance of all models (including the original) was generally best using strategy C and worst using strategy A, especially for SS, TP, and SRP, confirming the value of routinely collecting stormflow samples. Overall, this work provides a comprehensive set of statistical evidence supporting the incorporation of antecedent discharge conditions into the WRTDS model for estimation of constituent concentration and flux, thereby combining the advantages of two recent developments in water quality modeling.

  1. Markov and semi-Markov switching linear mixed models used to identify forest tree growth components.

    PubMed

    Chaubert-Pereira, Florence; Guédon, Yann; Lavergne, Christian; Trottier, Catherine

    2010-09-01

    Tree growth is assumed to be mainly the result of three components: (i) an endogenous component assumed to be structured as a succession of roughly stationary phases separated by marked change points that are asynchronous among individuals, (ii) a time-varying environmental component assumed to take the form of synchronous fluctuations among individuals, and (iii) an individual component corresponding mainly to the local environment of each tree. To identify and characterize these three components, we propose to use semi-Markov switching linear mixed models, i.e., models that combine linear mixed models in a semi-Markovian manner. The underlying semi-Markov chain represents the succession of growth phases and their lengths (endogenous component), whereas the linear mixed models attached to each state of the underlying semi-Markov chain represent, in the corresponding growth phase, both the influence of time-varying climatic covariates (environmental component) as fixed effects, and interindividual heterogeneity (individual component) as random effects. In this article, we address the estimation of Markov and semi-Markov switching linear mixed models in a general framework. We propose a Monte Carlo expectation-maximization-like algorithm whose iterations decompose into three steps: (i) sampling of state sequences given random effects, (ii) prediction of random effects given state sequences, and (iii) maximization. The proposed statistical modeling approach is illustrated by the analysis of successive annual shoots along Corsican pine trunks influenced by climatic covariates. © 2009, The International Biometric Society.

  2. Sampling Large Graphs for Anticipatory Analytics

    DTIC Science & Technology

    2015-05-15

    C. Random Area Sampling: Random area sampling [8] is a "snowball" sampling method in which a set of random seed vertices are selected and areas... Sampling Large Graphs for Anticipatory Analytics. Lauren Edwards, Luke Johnson, Maja Milosavljevic, Vijay Gadepally, Benjamin A. Miller (Lincoln...). ...systems, greater human-in-the-loop involvement, or through complex algorithms. We are investigating the use of sampling to mitigate these challenges.
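
    The snippet above only hints at the method, so the following is a generic, hypothetical sketch of snowball-style "random area" sampling on a graph (random seed vertices grown breadth-first), not the report's actual algorithm.

```python
import random
from collections import deque

def random_area_sample(adj, n_seeds=3, target_size=50, seed=0):
    """Pick random seed vertices, then grow areas around them breadth-first
    until roughly target_size vertices have been collected."""
    rng = random.Random(seed)
    seeds = rng.sample(list(adj), n_seeds)
    sampled, frontier = set(seeds), deque(seeds)
    while frontier and len(sampled) < target_size:
        v = frontier.popleft()
        for u in adj[v]:
            if u not in sampled and len(sampled) < target_size:
                sampled.add(u)
                frontier.append(u)
    return sampled

# Toy graph: a ring of 200 vertices with a few random chords added
adj = {i: {(i - 1) % 200, (i + 1) % 200} for i in range(200)}
chord_rng = random.Random(1)
for _ in range(50):
    a, b = chord_rng.randrange(200), chord_rng.randrange(200)
    if a != b:
        adj[a].add(b)
        adj[b].add(a)

print(sorted(random_area_sample(adj))[:10], "...")
```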

  3. Assessing the representativeness of physician and patient respondents to a primary care survey using administrative data.

    PubMed

    Li, Allanah; Cronin, Shawna; Bai, Yu Qing; Walker, Kevin; Ammi, Mehdi; Hogg, William; Wong, Sabrina T; Wodchis, Walter P

    2018-05-30

    QUALICOPC is an international survey of primary care performance. QUALICOPC data have been used in several studies, yet the representativeness of the Canadian QUALICOPC survey is unknown, potentially limiting the generalizability of findings. This study examined the representativeness of QUALICOPC physician and patient respondents in Ontario using health administrative data. This representativeness study linked QUALICOPC physician and patient respondents in Ontario to health administrative databases at the Institute for Clinical Evaluative Sciences. Physician respondents were compared to other physicians in their practice group and all Ontario primary care physicians on demographic and practice characteristics. Patient respondents were compared to other patients rostered to their primary care physicians, patients rostered to their physicians' practice groups, and a random sample of Ontario residents on sociodemographic characteristics, morbidity, and health care utilization. Standardized differences were calculated to compare the distribution of characteristics across cohorts. QUALICOPC physician respondents included a higher proportion of younger, female physicians and Canadian medical graduates compared to other Ontario primary care physicians. A higher proportion of physician respondents practiced in Family Health Team models, compared to the provincial proportion for primary care physicians. QUALICOPC patient respondents were more likely to be older and female, with significantly higher levels of morbidity and health care utilization, compared with the other patient groups examined. However, when looking at the QUALICOPC physicians' whole rosters, rather than just the patient survey respondents, the practice profiles were similar to those of the other physicians in their practice groups and Ontario patients in general. Comparisons revealed some differences in responding physicians' demographic and practice characteristics, as well as differences in responding patients' characteristics compared to the other patient groups tested, which may have resulted from the visit-based sampling strategy. Ontario QUALICOPC physicians had similar practice profiles as compared to non-participating physicians, providing some evidence that the participating practices are representative of other non-participating practices, and patients selected by visit-based sampling may also be representative of visiting patients in other practices. Those using QUALICOPC data should understand this limited representativeness when generalizing results, and consider the potential for bias in their analyses.
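
    The standardized differences mentioned above are straightforward to compute. The sketch below shows the usual formulas for a binary and a continuous characteristic; the example numbers are hypothetical, not taken from the QUALICOPC data.

```python
import math

def std_diff_continuous(mean1, sd1, mean2, sd2):
    """Standardized difference between two cohorts for a continuous characteristic."""
    return (mean1 - mean2) / math.sqrt((sd1 ** 2 + sd2 ** 2) / 2)

def std_diff_proportion(p1, p2):
    """Standardized difference between two cohorts for a binary characteristic."""
    return (p1 - p2) / math.sqrt((p1 * (1 - p1) + p2 * (1 - p2)) / 2)

# Hypothetical example: 55% female among respondents vs 48% in a comparison cohort
print(f"{std_diff_proportion(0.55, 0.48):.3f}")   # values above ~0.10 are often flagged
print(f"{std_diff_continuous(52.1, 14.0, 49.3, 13.5):.3f}")
```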

  4. Electromagnetic Scattering by Fully Ordered and Quasi-Random Rigid Particulate Samples

    NASA Technical Reports Server (NTRS)

    Mishchenko, Michael I.; Dlugach, Janna M.; Mackowski, Daniel W.

    2016-01-01

    In this paper we have analyzed circumstances under which a rigid particulate sample can behave optically as a true discrete random medium consisting of particles randomly moving relative to each other during measurement. To this end, we applied the numerically exact superposition T-matrix method to model far-field scattering characteristics of fully ordered and quasi-randomly arranged rigid multiparticle groups in fixed and random orientations. We have shown that, in and of itself, averaging optical observables over movements of a rigid sample as a whole is insufficient unless it is combined with a quasi-random arrangement of the constituent particles in the sample. Otherwise, certain scattering effects typical of discrete random media (including some manifestations of coherent backscattering) may not be accurately replicated.

  5. Methodological Challenges in Collecting Social and Behavioural Data Regarding the HIV Epidemic among Gay and Other Men Who Have Sex with Men in Australia

    PubMed Central

    Holt, Martin; de Wit, John; Brown, Graham; Maycock, Bruce; Fairley, Christopher; Prestage, Garrett

    2014-01-01

    Background Behavioural surveillance and research among gay and other men who have sex with men (GMSM) commonly rely on non-random recruitment approaches. Methodological challenges limit their ability to accurately represent the population of adult GMSM. We compared the social and behavioural profiles of GMSM recruited via venue-based, online, and respondent-driven sampling (RDS) and discussed their utility for behavioural surveillance. Methods Data from four studies were selected to reflect each recruitment method. We compared demographic characteristics and the prevalence of key indicators including sexual and HIV testing practices obtained from samples recruited through different methods, and population estimates from respondent-driven sampling partition analysis. Results Overall, the socio-demographic profile of GMSM was similar across samples, with some differences observed in age and sexual identification. Men recruited through time-location sampling appeared more connected to the gay community, reported a greater number of sexual partners, but engaged in less unprotected anal intercourse with regular (UAIR) or casual partners (UAIC). The RDS sample overestimated the proportion of HIV-positive men and appeared to recruit men with an overall higher number of sexual partners. A single-website survey recruited a sample with characteristics which differed considerably from the population estimates with regards to age, ethnic diversity, and behaviour. Data acquired through time-location sampling underestimated the rates of UAIR and UAIC, while RDS and online sampling both generated samples that underestimated UAIR. Simulated composite samples combining recruits from time-location and multi-website online sampling may produce characteristics more consistent with the population estimates, particularly with regards to sexual practices. Conclusion Respondent-driven sampling produced the sample that was most consistent with the population estimates, but this methodology is complex and logistically demanding. Time-location and online recruitment are more cost-effective and easier to implement; using these approaches in combination may offer the potential to recruit a more representative sample of GMSM. PMID:25409440

  6. Methodological challenges in collecting social and behavioural data regarding the HIV epidemic among gay and other men who have sex with men in Australia.

    PubMed

    Zablotska, Iryna B; Frankland, Andrew; Holt, Martin; de Wit, John; Brown, Graham; Maycock, Bruce; Fairley, Christopher; Prestage, Garrett

    2014-01-01

    Behavioural surveillance and research among gay and other men who have sex with men (GMSM) commonly rely on non-random recruitment approaches. Methodological challenges limit their ability to accurately represent the population of adult GMSM. We compared the social and behavioural profiles of GMSM recruited via venue-based, online, and respondent-driven sampling (RDS) and discussed their utility for behavioural surveillance. Data from four studies were selected to reflect each recruitment method. We compared demographic characteristics and the prevalence of key indicators including sexual and HIV testing practices obtained from samples recruited through different methods, and population estimates from respondent-driven sampling partition analysis. Overall, the socio-demographic profile of GMSM was similar across samples, with some differences observed in age and sexual identification. Men recruited through time-location sampling appeared more connected to the gay community, reported a greater number of sexual partners, but engaged in less unprotected anal intercourse with regular (UAIR) or casual partners (UAIC). The RDS sample overestimated the proportion of HIV-positive men and appeared to recruit men with an overall higher number of sexual partners. A single-website survey recruited a sample with characteristics which differed considerably from the population estimates with regards to age, ethnic diversity, and behaviour. Data acquired through time-location sampling underestimated the rates of UAIR and UAIC, while RDS and online sampling both generated samples that underestimated UAIR. Simulated composite samples combining recruits from time-location and multi-website online sampling may produce characteristics more consistent with the population estimates, particularly with regards to sexual practices. Respondent-driven sampling produced the sample that was most consistent with the population estimates, but this methodology is complex and logistically demanding. Time-location and online recruitment are more cost-effective and easier to implement; using these approaches in combination may offer the potential to recruit a more representative sample of GMSM.

  7. How college students conceptualize and practice responsible drinking.

    PubMed

    Barry, Adam E; Goodson, Patricia

    2011-01-01

    This study sought to employ a mixed-methods approach to (a) qualitatively explore responsible drinking beliefs and behaviors among a sample of college students, and (b) quantitatively assess the prevalence of those behaviors. Convenience samples, drawn from currently enrolled students attending a large public university in Texas, comprised 13 participants in the qualitative phase and a random sample of 729 students for the quantitative phase. A partially mixed sequential dominant status design (qual → QUAN) was employed. Participants associated 7 distinct themes with drinking responsibly; however, embedded inside these themes were numerous potentially harmful elements. Quantitative findings supported the qualitative themes, also highlighting gender and ethnic differences. Males believed responsible drinking behaviors should occur with significantly less frequency than females, whereas Whites attached less relative necessity to certain responsible drinking behaviors. This study represents an initial attempt to determine specific, evidence-based characteristics of responsible drinking.

  8. Comparing decision making between cancer patients and the general population: thoughts, emotions, or social influence?

    PubMed

    Yang, Z Janet; McComas, Katherine A; Gay, Geri K; Leonard, John P; Dannenberg, Andrew J; Dillon, Hildy

    2012-01-01

    This study extends a risk information seeking and processing model to explore the relative effect of cognitive processing strategies, positive and negative emotions, and normative beliefs on individuals' decision making about potential health risks. Most previous research based on this theoretical framework has examined environmental risks. Applying this risk communication model to study health decision making presents an opportunity to explore theoretical boundaries of the model, while also bringing this research to bear on a pressing medical issue: low enrollment in clinical trials. Comparative analysis of data gathered from 2 telephone surveys of a representative national sample (n = 500) and a random sample of cancer patients (n = 411) indicated that emotions played a more substantive role in cancer patients' decisions to enroll in a potential trial, whereas cognitive processing strategies and normative beliefs had greater influences on the decisions of respondents from the national sample.

  9. Field size, length, and width distributions based on LACIE ground truth data. [large area crop inventory experiment

    NASA Technical Reports Server (NTRS)

    Pitts, D. E.; Badhwar, G.

    1980-01-01

    The development of agricultural remote sensing systems requires knowledge of agricultural field size distributions so that the sensors, sampling frames, image interpretation schemes, registration systems, and classification systems can be properly designed. Malila et al. (1976) studied the field size distribution for wheat and all other crops in two Kansas LACIE (Large Area Crop Inventory Experiment) intensive test sites using ground observations of the crops and measurements of their field areas based on current year rectified aerial photomaps. The field area and size distributions reported in the present investigation are derived from a representative subset of a stratified random sample of LACIE sample segments. In contrast to previous work, the obtained results indicate that most field-size distributions are not log-normally distributed. The most common field size observed in this study was 10 acres for most crops studied.

  10. Assessment of wadeable stream resources in the driftless area ecoregion in Western Wisconsin using a probabilistic sampling design.

    PubMed

    Miller, Michael A; Colby, Alison C C; Kanehl, Paul D; Blocksom, Karen

    2009-03-01

    The Wisconsin Department of Natural Resources (WDNR), with support from the U.S. EPA, conducted an assessment of wadeable streams in the Driftless Area ecoregion in western Wisconsin using a probabilistic sampling design. This ecoregion encompasses 20% of Wisconsin's land area and contains 8,800 miles of perennial streams. Randomly-selected stream sites (n = 60) equally distributed among stream orders 1-4 were sampled. Watershed land use, riparian and in-stream habitat, water chemistry, macroinvertebrate, and fish assemblage data were collected at each true random site and an associated "modified-random" site on each stream that was accessed via a road crossing nearest to the true random site. Targeted least-disturbed reference sites (n = 22) were also sampled to develop reference conditions for various physical, chemical, and biological measures. Cumulative distribution function plots of various measures collected at the true random sites evaluated with reference condition thresholds, indicate that high proportions of the random sites (and by inference the entire Driftless Area wadeable stream population) show some level of degradation. Study results show no statistically significant differences between the true random and modified-random sample sites for any of the nine physical habitat, 11 water chemistry, seven macroinvertebrate, or eight fish metrics analyzed. In Wisconsin's Driftless Area, 79% of wadeable stream lengths were accessible via road crossings. While further evaluation of the statistical rigor of using a modified-random sampling design is warranted, sampling randomly-selected stream sites accessed via the nearest road crossing may provide a more economical way to apply probabilistic sampling in stream monitoring programs.

  11. Prevalence of major depressive disorder and socio-demographic correlates: Results of a representative household epidemiological survey in Beijing, China.

    PubMed

    Liu, Jing; Yan, Fang; Ma, Xin; Guo, Hong-Li; Tang, Yi-Lang; Rakofsky, Jeffrey J; Wu, Xiao-Mei; Li, Xiao-Qiang; Zhu, Hong; Guo, Xiao-Bing; Yang, Yang; Li, Peng; Cao, Xin-Dong; Li, Hai-Ying; Li, Zhen-Bo; Wang, Ping; Xu, Qiu-Yue

    2015-07-01

    Major depressive disorder (MDD) is the most prevalent mental disorder in the general population and has been associated with socioeconomic factors. Beijing has undergone significant socioeconomic changes in the last decade; however, no large-scale community epidemiological surveys of MDD have been conducted in Beijing since 2003. To determine the prevalence of MDD and its socio-demographic correlates in a representative household sample of the general population in Beijing, China. Data were collected from the 2010 representative household epidemiological survey of mental disorders in Beijing. The multistage cluster random sampling method was used to select qualified subjects in 18 districts and counties, and then face-to-face interviews were administered using the Chinese version of the Structured Clinical Interview for DSM-IV-TR Axis I Disorders-Patient Edition (SCID-I/P) during November 1, 2010 to December 31, 2010. 19,874 registered permanent residents were randomly identified and 16,032 (response rate=80.7%) completed face-to-face interviews. The time-point and life-time prevalence rates of MDD were estimated to be 1.10% (95% CI: 0.94-1.26%) and 3.56% (95% CI: 3.27-3.85%), respectively. Significant differences were found in sex, age, location of residence, marital status, education, employment status, personal/family monthly income, perception of family environment and relationship with others, when comparing residents with MDD to those without MDD. Being female, being aged 45 or above, reporting low family income, and reporting an "average" or "poor" family environment were associated with a higher risk of MDD. The prevalence of MDD reported in this survey is relatively lower than that in other western countries. Female sex, age older than 45, low family income, and poor family environment appear to be independent risk factors for MDD. Copyright © 2015 Elsevier B.V. All rights reserved.
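
    The reported confidence interval for the time-point prevalence can be approximately reproduced with a simple normal-approximation (Wald) interval, as sketched below. This ignores the multistage cluster design (which would normally widen the interval via a design effect), so it is a check of the arithmetic rather than the survey's actual variance estimator.

```python
import math

def wald_ci(p, n, z=1.96):
    """Normal-approximation confidence interval for a prevalence estimate."""
    se = math.sqrt(p * (1 - p) / n)
    return p - z * se, p + z * se

# Time-point MDD prevalence from the abstract: 1.10% among 16,032 respondents
lo, hi = wald_ci(0.0110, 16032)
print(f"95% CI: {lo:.2%} to {hi:.2%}")   # about 0.94% to 1.26%, matching the reported interval
```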

  12. Assessing the performance of multiplexed tandem PCR for the diagnosis of pathogenic genotypes of Theileria orientalis using pooled blood samples from cattle.

    PubMed

    Gebrekidan, Hagos; Gasser, Robin B; Stevenson, Mark A; McGrath, Sean; Jabbar, Abdul

    2017-02-01

    Oriental theileriosis caused by multiple genotypes of Theileria orientalis is an important tick-borne disease of bovines. Here, we assessed the performance of an established multiplexed tandem PCR (MT-PCR) for the diagnosis of the two recognized, pathogenic genotypes (chitose and ikeda) of T. orientalis in cattle using pooled blood samples. We used a total of 265 cattle blood samples, which were divided into two groups according to previous MT-PCR results for individual samples. Samples in group 1 (n = 155) were from a herd with a relatively high prevalence of T. orientalis infection; and those in group 2 (n = 110) were from four herds with a low prevalence. For group 1, 31 and 15 batches of five- and ten-pooled samples (selected at random), respectively, were formed. For group 2, 22 and 11 batches of five- and ten-pooled samples (selected at random), respectively, were formed. DNAs from individual pooled samples in each batch and group were then tested by MT-PCR. For group 1, the apparent prevalences estimated using the 31 batches of five-pooled samples (97%) and 15 batches of ten-pooled samples (100%) were significantly higher compared with individual samples (75%). For group 2, higher apparent prevalences (9% and 36%) were also recorded for the 22 and 11 batches of pooled samples, respectively, compared with individual samples (7%). Overall, the average infection intensity recorded for the chitose and ikeda genotypes was considerably lower in pooled compared with individual samples. The diagnostic specificities of MT-PCR were estimated at 95% and 94%, respectively, when batches of five- and ten-pooled samples were tested, and 94% for individual samples. The diagnostic sensitivity of this assay was estimated at 98%, the same for individual, five-pooled, and ten-pooled samples. This study shows that results from screening batches of five- and ten-pooled blood samples from cattle herds are similar to those obtained for individual samples, and, importantly, that the reduced cost of testing pooled samples represents a considerable saving to herd managers. Copyright © 2016 Elsevier Ltd. All rights reserved.
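
    For context on why pool-level positivity overstates individual prevalence, the standard back-calculation for pooled testing is sketched below. It assumes random allocation of animals to pools and a perfect test, and is an illustration rather than the estimator used by the authors.

```python
def individual_prevalence_from_pools(pool_positive_rate, pool_size):
    """Back-calculate individual prevalence from the proportion of positive pools,
    assuming random pooling and a perfect test: p = 1 - (1 - P_pool)^(1/k)."""
    return 1.0 - (1.0 - pool_positive_rate) ** (1.0 / pool_size)

# Illustration with pool-level positivity rates similar to those reported for group 2
for rate, k in [(0.36, 10), (0.09, 5)]:
    p = individual_prevalence_from_pools(rate, k)
    print(f"pool size {k:2d}, {rate:.0%} of pools positive -> individual prevalence ~ {p:.1%}")
```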

  13. Development of new demi-span equations from a nationally representative sample of adults to estimate maximal adult height.

    PubMed

    Hirani, Vasant; Tabassum, Faiza; Aresu, Maria; Mindell, Jennifer

    2010-08-01

    Various measures have been used to estimate height when assessing nutritional status. Current equations to obtain demi-span equivalent height (DEH(Bassey)) are based on a small sample from a single study. The objectives of this study were to develop more robust DEH equations from a large number of men (n = 591) and women (n = 830) aged 25-45 y from a nationally representative cross-sectional sample (Health Survey for England 2007). Sex-specific regression equations were produced from young adults' (aged 25-45 y) measured height and demi-span to estimate new DEH equations (DEH(new)). DEH in people aged ≥65 y was calculated using DEH(new). DEH(new) estimated current height in people aged 25-45 y with a mean difference of 0.04 in men (P = 0.80) and -0.29 in women (P = 0.05). Height, demi-span, DEH(new), and DEH(Bassey) declined by age group in both sexes aged ≥65 y (P < 0.05); DEH were larger than the measured height for all age groups (mean difference between DEH(new) and current height was -2.64 in men and -3.16 in women; both P < 0.001). Comparisons of DEH estimates showed good agreement, but DEH(new) was significantly higher than DEH(Bassey) in each age and sex group in older people. The new equations that are based on a large, randomly selected, nationally representative sample of young adults are more robust for predicting current height in young adults when height measurements are unavailable and can be used in the future to predict maximal adult height more accurately in currently young adults as they age.
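
    A demi-span equivalent height (DEH) equation of the kind described above is a least-squares regression of measured height on demi-span fitted within each sex. The sketch below uses synthetic calibration data and made-up coefficients, not the HSE 2007 data or the published equations.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "young adult male" calibration data (heights and demi-spans in cm)
demi_span = rng.normal(80.0, 4.0, size=600)
height = 1.4 * demi_span + 63.0 + rng.normal(0.0, 3.0, size=600)

# Fit a sex-specific DEH equation by ordinary least squares
slope, intercept = np.polyfit(demi_span, height, deg=1)

def deh(span_cm):
    """Estimated height (cm) from demi-span using the fitted equation."""
    return slope * span_cm + intercept

print(f"DEH (cm) = {slope:.2f} * demi-span + {intercept:.1f}")
print(f"demi-span 78 cm -> estimated height {deh(78):.1f} cm")
```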

  14. Trends and college-level characteristics associated with the non-medical use of prescription drugs among US college students from 1993 to 2001

    PubMed Central

    McCabe, Sean Esteban; West, Brady T.; Wechsler, Henry

    2007-01-01

    Aims The present study examines the prevalence trends and college-level characteristics associated with the non-medical use of prescription drugs (i.e. amphetamines, opioids, sedatives, tranquilizers) and illicit drug use among US college students between 1993 and 2001. Design Data were collected from self-administered mail surveys, sent to independent cross-sectional samples of college students from a nationally representative sample of 119 colleges in 4 years between 1993 and 2001. Setting Nationally representative 4-year US colleges and universities in 1993, 1997, 1999 and 2001. Participants Representative samples of 15 282, 14 428, 13 953 and 10 904 randomly selected college students at these colleges in 1993, 1997, 1999 and 2001, respectively. Findings The results indicate that life-time and 12-month prevalence rates of non-medical use of prescription drugs (NMPD) increased between 1993 and 2001. Specific college-level characteristics were found to be correlated positively (marijuana use) and negatively (historically black college status and commuter status) with NMPD, consistently across the four cross-sectional samples. Significant between-college variation in terms of trajectories in the prevalence of NMPD over time was found in hierarchical linear models, and selected college-level characteristics were not found to explain all of the variation in the trajectories, suggesting the need for further investigation of what determines between-college variance in the prevalence trends. Conclusions The findings of the present study suggest that continued monitoring of NMPD and illicit drug use among college students is needed and collegiate substance prevention programs should include efforts to reduce these drug use behaviors. PMID:17298654

  15. Variation in Death Rate After Abdominal Aortic Aneurysmectomy in the United States

    PubMed Central

    Dimick, Justin B.; Stanley, James C.; Axelrod, David A.; Kazmers, Andris; Henke, Peter K.; Jacobs, Lloyd A.; Wakefield, Thomas W.; Greenfield, Lazar J.; Upchurch, Gilbert R.

    2002-01-01

    Objective To determine whether high-volume hospitals (HVHs) have lower in-hospital death rates after abdominal aortic aneurysm (AAA) repair compared with low-volume hospitals (LVHs). Summary Background Data Select statewide studies have shown that HVHs have superior outcomes compared with LVHs for AAA repair, but they may not be representative of the true volume–outcome relationship for the entire United States. Methods Patients undergoing repair of intact or ruptured AAAs in the Nationwide Inpatient Sample (NIS) for 1996 and 1997 were included (n = 13,887) for study. The NIS represents a 20% stratified random sample representative of all U.S. hospitals. Unadjusted and case mix-adjusted analyses were performed. Results The overall death rate was 3.8% for intact AAA repair and 47% for ruptured AAA repair. For repair of intact AAAs, HVHs had a lower death rate than LVHs. The death rate after repair of ruptured AAA was also slightly lower at HVHs. In a multivariate analysis adjusting for case mix, having surgery at an LVH was associated with a 56% increased risk of in-hospital death. Other independent risk factors for in-hospital death included female gender, age older than 65 years, aneurysm rupture, urgent or emergent admission, and comorbid disease. Conclusions This study from a representative national database documents that HVHs have a significantly lower death rate than LVHs for repair of both intact and ruptured AAA. These data support the regionalization of patients to HVHs for AAA repair. PMID:11923615

  16. Culturally appropriate methodology in obtaining a representative sample of South Australian Aboriginal adults for a cross-sectional population health study: challenges and resolutions.

    PubMed

    Marin, Tania; Taylor, Anne Winifred; Grande, Eleonora Dal; Avery, Jodie; Tucker, Graeme; Morey, Kim

    2015-05-19

    The considerably lower average life expectancy of Aboriginal and Torres Strait Islander Australians, compared with non-Aboriginal and non-Torres Strait Islander Australians, has been widely reported. Prevalence data for chronic disease and health risk factors are needed to provide evidence-based estimates for Australian Aboriginal and Torres Strait Islander population health planning. Representative surveys for these populations are difficult due to complex methodology. The focus of this paper is to describe in detail the methodological challenges and resolutions of a representative South Australian Aboriginal population-based health survey. Using a stratified multi-stage sampling methodology based on the Australian Bureau of Statistics 2006 Census with culturally appropriate and epidemiologically rigorous methods, 11,428 randomly selected dwellings were approached from a total of 209 census collection districts. All persons eligible for the survey identified as Aboriginal and/or Torres Strait Islander and were selected from dwellings identified as having one or more Aboriginal person(s) living there at the time of the survey. Overall, the 399 interviews from an eligible sample of 691 SA Aboriginal adults yielded a response rate of 57.7%. These face-to-face interviews were conducted by ten interviewers retained from a total of 27 trained Aboriginal interviewers. Challenges were found in three main areas: identification and recruitment of participants; interviewer recruitment and retention; and appropriate engagement with communities. These challenges were resolved, or at least mainly overcome, by following local protocols with communities and their representatives, and reaching agreement on the process of research for Aboriginal people. Obtaining a representative sample of Aboriginal participants in a culturally appropriate way was methodologically challenging and required high levels of commitment and resources. Adhering to these principles has resulted in a rich and unique data set that provides an overview of the self-reported health status of Aboriginal people living in South Australia. This process provides some important principles to be followed when engaging with Aboriginal people and their communities for the purpose of health research.

  17. A random spatial sampling method in a rural developing nation

    Treesearch

    Michelle C. Kondo; Kent D.W. Bream; Frances K. Barg; Charles C. Branas

    2014-01-01

    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method...

  18. Use of psychotherapy in a representative adult community sample in São Paulo, Brazil

    PubMed Central

    Blay, Sergio L.; Fillenbaum, Gerda G.; da Silva, Paula Freitas R.; Peluso, Erica T.

    2014-01-01

    Little is known about the use of psychotherapy to treat common mental disorders in a major city in a middle income country. Data come from in-home interviews with a stratified random sample of 2,000 community residents age 18–65 in the city of São Paulo, Brazil. The information obtained included sociodemographic characteristics; psychotropic drugs; mental status; and lifetime, previous 12 months, and current use of psychotherapy. Logistic regression was used to examine determinants of use of psychotherapy. Of the sample, 22.7% met General Health Questionnaire-12 criteria for common mental disorders. Lifetime, previous 12 months, and current use of psychotherapy were reported by 14.6%, 4.6%, and 2.3% of the sample respectively. Users were typically women, more educated, higher income, not married, unemployed, with common mental disorders. Further analysis found that 47% (with higher education and income) paid out-of-pocket, and 53% used psychotropic medication. Psychotherapy does not appear to be the preferred treatment for common mental disorders. PMID:25118139

  19. Mental health, service use and social capital among Indian-Australians: findings of a wellbeing survey.

    PubMed

    Maheshwari, Rajesh; Steel, Zachary

    2012-10-01

    Indian-Australians represent a distinct immigrant group both demographically and culturally. Yet, despite an expanding body of research on transcultural mental health in Australia, there is a paucity of studies regarding mental health of Indian-Australians. This paper explores the extent of psychological morbidity and related service use in a representative sample of Indian-Australians. It further examines the association of mental health with social participation and networking in this ethnic community. Measures to assess current levels of psychological distress, functional disability, service use, and social capital were administered in a random sample of 71 Indian-Australian family groups living in Sydney. Amongst participants, 15% reported high to very high levels of psychological distress. Psychological distress was associated with increased days of functional disability and higher levels of functional impairment, and an increased likelihood of a GP consultation. However, 91% of participants with identifiable mental health needs did not seek any mental health consultation. Social capital was not found to be a significant predictor of psychological health or service use in this sample. Psychological morbidity in the Indian-Australian community is associated with high levels of functional disability, both in number of days and extent of severity, but only a small proportion seeks mental health help.

  20. Health Surveys Using Mobile Phones in Developing Countries: Automated Active Strata Monitoring and Other Statistical Considerations for Improving Precision and Reducing Biases.

    PubMed

    Labrique, Alain; Blynn, Emily; Ahmed, Saifuddin; Gibson, Dustin; Pariyo, George; Hyder, Adnan A

    2017-05-05

    In low- and middle-income countries (LMICs), historically, household surveys have been carried out by face-to-face interviews to collect survey data related to risk factors for noncommunicable diseases. The proliferation of mobile phone ownership and the access it provides in these countries offers a new opportunity to remotely conduct surveys with increased efficiency and reduced cost. However, the near-ubiquitous ownership of phones, high population mobility, and low cost require a re-examination of statistical recommendations for mobile phone surveys (MPS), especially when surveys are automated. As with landline surveys, random digit dialing remains the most appropriate approach to develop an ideal survey-sampling frame. Once the survey is complete, poststratification weights are generally applied to reduce estimate bias and to adjust for selectivity due to mobile ownership. Since weights increase design effects and reduce sampling efficiency, we introduce the concept of automated active strata monitoring to improve representativeness of the sample distribution to that of the source population. Although some statistical challenges remain, MPS represent a promising emerging means for population-level data collection in LMICs. ©Alain Labrique, Emily Blynn, Saifuddin Ahmed, Dustin Gibson, George Pariyo, Adnan A Hyder. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 05.05.2017.
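
    The poststratification step mentioned above can be sketched in a few lines: each respondent is weighted so that the weighted sample matches known population shares of each stratum. The strata and shares below are invented for illustration; the automated "active strata monitoring" idea in the abstract aims to steer recruitment so that such weights stay small.

```python
from collections import Counter

def poststratification_weights(sample_strata, population_shares):
    """Weight respondents so the weighted sample matches known population stratum shares."""
    n = len(sample_strata)
    sample_share = {s: c / n for s, c in Counter(sample_strata).items()}
    return [population_shares[s] / sample_share[s] for s in sample_strata]

# Hypothetical mobile-phone sample over-representing younger men
respondents = ["young_m"] * 60 + ["young_f"] * 20 + ["older_m"] * 12 + ["older_f"] * 8
census_shares = {"young_m": 0.26, "young_f": 0.26, "older_m": 0.24, "older_f": 0.24}

weights = poststratification_weights(respondents, census_shares)
print({s: round(w, 2) for s, w in zip(respondents, weights)})  # one weight per stratum
```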

  1. Are Video Games a Gateway to Gambling? A Longitudinal Study Based on a Representative Norwegian Sample.

    PubMed

    Molde, Helge; Holmøy, Bjørn; Merkesdal, Aleksander Garvik; Torsheim, Torbjørn; Mentzoni, Rune Aune; Hanns, Daniel; Sagoe, Dominic; Pallesen, Ståle

    2018-06-05

    The scope and variety of video games and monetary gambling opportunities are expanding rapidly. In many ways, these forms of entertainment are converging on digital and online video games and gambling sites. However, little is known about the relationship between video gaming and gambling. The present study explored the possibility of a directional relationship between measures of problem gaming and problem gambling, while also controlling for the influence of sex and age. In contrast to most previous investigations which are based on cross-sectional designs and non-representative samples, the present study utilized a longitudinal design conducted over 2 years (2013, 2015) and comprising 4601 participants (males 47.2%, age range 16-74) drawn from a random sample from the general population. Video gaming and gambling were assessed using the Gaming Addiction Scale for Adolescents and the Canadian Problem Gambling Index, respectively. Using an autoregressive cross-lagged structural equation model, we found a positive relationship between scores on problematic gaming and later scores on problematic gambling, whereas we found no evidence of the reverse relationship. Hence, video gaming problems appear to be a gateway behavior to problematic gambling behavior. In future research, one should continue to monitor the possible reciprocal behavioral influences between gambling and video gaming.

  2. Prevalence of second-hand smoke exposure after introduction of the Italian smoking ban: the Florence and Belluno survey.

    PubMed

    Gorini, Giuseppe; Gasparrini, Antonio; Tamang, Elizabeth; Nebot, Manel; Lopez, Maria José; Albertini, Marco; Marcolina, Daniela; Fernandez, Esteve

    2008-01-01

    A law banning smoking in enclosed public places was implemented in Italy on January 10, 2005. The aim of this paper is to present a cross-sectional survey on two representative samples of non-smokers of two Italian towns (Florence and Belluno), conducted one year after the introduction of the ban, in order to assess prevalence of second-hand smoke exposure, to record the attitudes towards the ban, and the perception about its compliance in a representative sample of non-smokers. Computer-assisted telephone interviews were carried out in March 2006, from a random sample of households from telephone registries. Respondents were 402 non-smokers from Belluno and 1,073 from Florence. About 12% of Florentines and 7% of Belluno respondents were exposed at home; 39% and 19%, respectively, at work; 10% and 5% in hospitality venues; 20% and 10% in cars. The smoke-free law was almost universally supported (about 98%) even if a smaller proportion of people (about 90%) had the perception that the ban was observed. Second-hand smoke exposure at home and in hospitality premises has dropped to < or = 10%, whereas exposure at work remained higher. These results suggest the need for more controls in workplaces other than hospitality venues.

  3. Methods for assessing long-term mean pathogen count in drinking water and risk management implications.

    PubMed

    Englehardt, James D; Ashbolt, Nicholas J; Loewenstine, Chad; Gadzinski, Erik R; Ayenu-Prah, Albert Y

    2012-06-01

    Recently pathogen counts in drinking and source waters were shown theoretically to have the discrete Weibull (DW) or closely related discrete growth distribution (DGD). The result was demonstrated versus nine short-term and three simulated long-term water quality datasets. These distributions are highly skewed such that available datasets seldom represent the rare but important high-count events, making estimation of the long-term mean difficult. In the current work the methods, and data record length, required to assess long-term mean microbial count were evaluated by simulation of representative DW and DGD waterborne pathogen count distributions. Also, microbial count data were analyzed spectrally for correlation and cycles. In general, longer data records were required for more highly skewed distributions, conceptually associated with more highly treated water. In particular, 500-1,000 random samples were required for reliable assessment of the population mean ±10%, though 50-100 samples produced an estimate within one log (45%) below. A simple correlated first order model was shown to produce count series with 1/f signal, and such periodicity over many scales was shown in empirical microbial count data, for consideration in sampling. A tiered management strategy is recommended, including a plan for rapid response to unusual levels of routinely-monitored water quality indicators.
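
    To see why long records are needed for such skewed counts, the sketch below draws counts from one common parameterisation of the discrete Weibull distribution (survival function P(X >= x) = q**(x**beta)) and tracks the running mean; the parameters are illustrative only, not fitted to any of the datasets in the paper.

```python
import math
import random

def discrete_weibull_sample(q, beta, rng):
    """One draw from a type-I discrete Weibull distribution with
    P(X >= x) = q**(x**beta), generated by inverse transform."""
    u = rng.random()
    return math.floor((math.log(u) / math.log(q)) ** (1.0 / beta))

rng = random.Random(123)
q, beta = 0.5, 0.35   # heavily right-skewed example parameters

counts = [discrete_weibull_sample(q, beta, rng) for _ in range(1000)]
for n in (50, 100, 500, 1000):
    print(f"n = {n:4d}: running mean = {sum(counts[:n]) / n:.1f}")
```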

  4. Some practical problems in implementing randomization.

    PubMed

    Downs, Matt; Tucker, Kathryn; Christ-Schmidt, Heidi; Wittes, Janet

    2010-06-01

    While often theoretically simple, implementing randomization to treatment in a masked, but confirmable, fashion can prove difficult in practice. At least three categories of problems occur in randomization: (1) bad judgment in the choice of method, (2) design and programming errors in implementing the method, and (3) human error during the conduct of the trial. This article focuses on these latter two types of errors, dealing operationally with what can go wrong after trial designers have selected the allocation method. We offer several case studies and corresponding recommendations for lessening the frequency of problems in allocating treatment or for mitigating the consequences of errors. Recommendations include: (1) reviewing the randomization schedule before starting a trial, (2) being especially cautious of systems that use on-demand random number generators, (3) drafting unambiguous randomization specifications, (4) performing thorough testing before entering a randomization system into production, (5) maintaining a dataset that captures the values investigators used to randomize participants, thereby allowing the process of treatment allocation to be reproduced and verified, (6) resisting the urge to correct errors that occur in individual treatment assignments, (7) preventing inadvertent unmasking to treatment assignments in kit allocations, and (8) checking a sample of study drug kits to allow detection of errors in drug packaging and labeling. Although we performed a literature search of documented randomization errors, the examples that we provide and the resultant recommendations are based largely on our own experience in industry-sponsored clinical trials. We do not know how representative our experience is or how common errors of the type we have seen occur. Our experience underscores the importance of verifying the integrity of the treatment allocation process before and during a trial. Clinical Trials 2010; 7: 235-245. http://ctj.sagepub.com.
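
    Several of these recommendations (reviewing the schedule in advance, avoiding on-demand generators, keeping the process reproducible) amount to pre-generating the whole allocation list from a fixed seed. The sketch below is a generic permuted-block schedule generator, not any particular trial's system.

```python
import random

def blocked_schedule(n_blocks, block_size=4, arms=("A", "B"), seed=20100601):
    """Pre-generated permuted-block randomization schedule from a fixed seed,
    so the entire list can be reviewed and reproduced before the trial starts."""
    assert block_size % len(arms) == 0, "block size must be a multiple of the number of arms"
    rng = random.Random(seed)
    schedule = []
    for _ in range(n_blocks):
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)
        schedule.extend(block)
    return schedule

print(blocked_schedule(n_blocks=3))   # 12 assignments, balanced within each block of 4
```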

  5. A randomized trial found online questionnaires supplemented by postal reminders generated a cost-effective and generalizable sample but don't forget the reminders.

    PubMed

    Loban, Amanda; Mandefield, Laura; Hind, Daniel; Bradburn, Mike

    2017-12-01

    The objective of this study was to compare the response rates, data completeness, and representativeness of survey data produced by online and postal surveys. A randomized trial nested within a cohort study in Yorkshire, United Kingdom. Participants were randomized to receive either an electronic (online) survey questionnaire with paper reminder (N = 2,982) or paper questionnaire with electronic reminder (N = 2,855). Response rates were similar for electronic contact and postal contacts (50.9% vs. 49.7%, difference = 1.2%, 95% confidence interval: -1.3% to 3.8%). The characteristics of those responding to the two groups were similar. Participants nevertheless demonstrated an overwhelming preference for postal questionnaires, with the majority responding by post in both groups. Online survey questionnaire systems need to be supplemented with a postal reminder to achieve acceptable uptake, but doing so provides a similar response rate and case mix when compared to postal questionnaires alone. For large surveys, online survey systems may be cost saving. Copyright © 2017 Elsevier Inc. All rights reserved.
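
    The reported difference in response rates and its confidence interval can be approximately reproduced with the usual normal-approximation formula for two proportions, as below; small discrepancies from the published interval may come from rounding or exact respondent counts.

```python
import math

def diff_ci(p1, n1, p2, n2, z=1.96):
    """Normal-approximation CI for the difference between two proportions."""
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    diff = p1 - p2
    return diff, diff - z * se, diff + z * se

diff, lo, hi = diff_ci(0.509, 2982, 0.497, 2855)
print(f"difference {diff:.1%}, 95% CI {lo:.1%} to {hi:.1%}")   # about 1.2% (-1.4% to 3.8%)
```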

  6. Reduction of display artifacts by random sampling

    NASA Technical Reports Server (NTRS)

    Ahumada, A. J., Jr.; Nagel, D. C.; Watson, A. B.; Yellott, J. I., Jr.

    1983-01-01

    The application of random-sampling techniques to remove visible artifacts (such as flicker, moire patterns, and paradoxical motion) introduced in TV-type displays by discrete sequential scanning is discussed and demonstrated. Sequential-scanning artifacts are described; the window of visibility defined in spatiotemporal frequency space by Watson and Ahumada (1982 and 1983) and Watson et al. (1983) is explained; the basic principles of random sampling are reviewed and illustrated by the case of the human retina; and it is proposed that the sampling artifacts can be replaced by random noise, which can then be shifted to frequency-space regions outside the window of visibility. Vertical sequential, single-random-sequence, and continuously renewed random-sequence plotting displays generating 128 points at update rates up to 130 Hz are applied to images of stationary and moving lines, and best results are obtained with the single random sequence for the stationary lines and with the renewed random sequence for the moving lines.

  7. Fabrication of Bi2223 bulks with high critical current properties sintered in Ag tubes

    NASA Astrophysics Data System (ADS)

    Takeda, Yasuaki; Shimoyama, Jun-ichi; Motoki, Takanori; Kishio, Kohji; Nakashima, Takayoshi; Kagiyama, Tomohiro; Kobayashi, Shin-ichi; Hayashi, Kazuhiko

    2017-03-01

    Randomly grain-oriented Bi2223 sintered bulks are representative superconducting materials with a weak-link problem due to the very short coherence length, particularly along the c-axis, resulting in poor intergrain Jc properties. In our previous studies, sintering and/or post-annealing under moderately reducing atmospheres were found to be effective for improving grain coupling in Bi2223 sintered bulks. Further optimizations of the synthesis process for Bi2223 sintered bulks were attempted in the present study to enhance their intergrain Jc. Effects of applied pressure of uniaxial pressing and sintering conditions on microstructure and superconducting properties have been systematically investigated. The best sample showed an intergrain Jc of 2.0 kA cm-2 at 77 K and 8.2 kA cm-2 at 20 K, while its relative density was low (∼65%). These values are quite high for a randomly oriented sintered bulk of cuprate superconductors.

  8. Protection motivation theory and physical activity: a longitudinal test among a representative population sample of Canadian adults.

    PubMed

    Plotnikoff, Ronald C; Rhodes, Ryan E; Trinh, Linda

    2009-11-01

    The purpose of this study was to examine the Protection Motivation Theory (PMT) to predict physical activity (PA) behaviour in a large, population-based sample of adults. One thousand six hundred and two randomly selected individuals completed two telephone interviews over two consecutive six-month periods assessing PMT constructs. PMT explained 35 per cent and 20 per cent of the variance in intention and behaviour respectively. Coping cognitions as moderators of threat explained 1 per cent of the variance in intention and behaviour. Age and gender as moderators of threat did not provide additional variance in the models. We conclude that salient PMT predictors (e.g. self-efficacy) may guide the development of effective PA interventions in the general population.

  9. Quantification of the Flavonoid-Degrading Bacterium Eubacterium ramulus in Human Fecal Samples with a Species-Specific Oligonucleotide Hybridization Probe

    PubMed Central

    Simmering, Rainer; Kleessen, Brigitta; Blaut, Michael

    1999-01-01

    To investigate the occurrence of the flavonoid-degrading bacterium Eubacterium ramulus in the human intestinal tract, an oligonucleotide probe designated S-S-E.ram-0997-a-A-18 was designed and validated, with over 90 bacterial strains representing the dominant described human fecal flora. Application of S-S-E.ram-0997-a-A-18 to fecal samples from 20 subjects indicated the presence of E. ramulus in each individual tested in numbers from 4.4 × 107 to 2.0 × 109 cells/g of fecal dry mass. Six fecal E. ramulus isolates were recognized by S-S-E.ram-0997-a-A-18 but exhibited different band patterns when analyzed by randomly amplified polymorphic DNA. PMID:10427069

  10. Revisiting sample size: are big trials the answer?

    PubMed

    Lurati Buse, Giovanna A L; Botto, Fernando; Devereaux, P J

    2012-07-18

    The superiority of evidence generated in randomized controlled trials over observational data is not conditional on randomization alone. Randomized controlled trials require proper design and implementation to provide a reliable effect estimate. Adequate random sequence generation, allocation implementation, analyses based on the intention-to-treat principle, and sufficient power are crucial to the quality of a randomized controlled trial. Power, or the probability that the trial will detect a difference when a real difference between treatments exists, strongly depends on sample size. The quality of orthopaedic randomized controlled trials is frequently threatened by a limited sample size. This paper reviews basic concepts and pitfalls in sample-size estimation and focuses on the importance of large trials in the generation of valid evidence.
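
    As a concrete illustration of how power drives sample size, the standard normal-approximation formula for comparing two means is sketched below; this is a textbook formula, not something taken from the cited paper.

```python
import math
from statistics import NormalDist

def n_per_group(delta, sigma, alpha=0.05, power=0.80):
    """Per-group sample size for detecting a mean difference delta with common SD sigma:
    n = 2 * (sigma / delta)^2 * (z_{1-alpha/2} + z_{power})^2."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)
    z_beta = z.inv_cdf(power)
    return math.ceil(2 * (sigma / delta) ** 2 * (z_alpha + z_beta) ** 2)

# Example: a 5-point difference on an outcome with SD 20 needs roughly 252 patients per arm
print(n_per_group(delta=5, sigma=20))
```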

  11. Child protection and adult depression: evaluating the long-term consequences of evacuating children to foster care during World War II.

    PubMed

    Santavirta, Nina; Santavirta, Torsten

    2014-03-01

    This paper combined data collected from wartime government records with survey data including background characteristics, such as factors that affected eligibility, to examine the adult depression outcomes of individuals who were evacuated from Finland to temporary foster care in Sweden during World War II. Using wartime government records and survey data for a random sample of 723 exposed individuals and 1321 matched unexposed individuals, the authors conducted a least-squares adjusted means comparison to examine the association between evacuation and adult depression (Beck Depression Inventory). The random sample was representative of the whole population of evacuees who returned to their biological families after World War II. The authors found no statistically significant difference in depressive symptoms during late adulthood between the two groups; for example, the exposed group had a 0.41-percentage-point lower average Beck Depression Inventory score than the unexposed group (p = 0.907). This study provides no support for the hypothesis that family disruption during early childhood, caused by the onset of sudden shocks, elevates depressive symptoms during late adulthood. Copyright © 2013 John Wiley & Sons, Ltd.

  12. Quantifying Rock Weakening Due to Decreasing Calcite Mineral Content by Numerical Simulations

    PubMed Central

    2018-01-01

    The quantification of changes in geomechanical properties due to chemical reactions is of paramount importance for geological subsurface utilisation, since mineral dissolution generally reduces rock stiffness. In the present study, the effective elastic moduli of two digital rock samples, the Fontainebleau and Bentheim sandstones, are numerically determined based on micro-CT images. Reduction in rock stiffness due to the dissolution of 10% calcite cement by volume out of the pore network is quantified for three synthetic spatial calcite distributions (coating, partial filling and random) using representative sub-cubes derived from the digital rock samples. Due to the reduced calcite content, bulk and shear moduli decrease by 34% and 38% in maximum, respectively. Total porosity is clearly the dominant parameter, while spatial calcite distribution has a minor impact, except for a randomly chosen cement distribution within the pore network. Moreover, applying an initial stiffness reduced by 47% for the calcite cement results only in a slightly weaker mechanical behaviour. Using the quantitative approach introduced here substantially improves the accuracy of predictions in elastic rock properties compared to general analytical methods, and further enables quantification of uncertainties related to spatial variations in porosity and mineral distribution. PMID:29614776

  13. Quantifying Rock Weakening Due to Decreasing Calcite Mineral Content by Numerical Simulations.

    PubMed

    Wetzel, Maria; Kempka, Thomas; Kühn, Michael

    2018-04-01

    The quantification of changes in geomechanical properties due to chemical reactions is of paramount importance for geological subsurface utilisation, since mineral dissolution generally reduces rock stiffness. In the present study, the effective elastic moduli of two digital rock samples, the Fontainebleau and Bentheim sandstones, are numerically determined based on micro-CT images. Reduction in rock stiffness due to the dissolution of 10% calcite cement by volume out of the pore network is quantified for three synthetic spatial calcite distributions (coating, partial filling and random) using representative sub-cubes derived from the digital rock samples. Due to the reduced calcite content, bulk and shear moduli decrease by 34% and 38% in maximum, respectively. Total porosity is clearly the dominant parameter, while spatial calcite distribution has a minor impact, except for a randomly chosen cement distribution within the pore network. Moreover, applying an initial stiffness reduced by 47% for the calcite cement results only in a slightly weaker mechanical behaviour. Using the quantitative approach introduced here substantially improves the accuracy of predictions in elastic rock properties compared to general analytical methods, and further enables quantification of uncertainties related to spatial variations in porosity and mineral distribution.

  14. Using multivariate generalizability theory to assess the effect of content stratification on the reliability of a performance assessment.

    PubMed

    Keller, Lisa A; Clauser, Brian E; Swanson, David B

    2010-12-01

    In recent years, demand for performance assessments has continued to grow. However, performance assessments are notorious for lower reliability, and in particular, low reliability resulting from task specificity. Since reliability analyses typically treat the performance tasks as randomly sampled from an infinite universe of tasks, these estimates of reliability may not be accurate. For tests built according to a table of specifications, tasks are randomly sampled from different strata (content domains, skill areas, etc.). If these strata remain fixed in the test construction process, ignoring this stratification in the reliability analysis results in an underestimate of "parallel forms" reliability, and an overestimate of the person-by-task component. This research explores the effect of representing the stratification appropriately, or misrepresenting it, on the estimation of reliability and the standard error of measurement. Both multivariate and univariate generalizability studies are reported. Results indicate that the proper specification of the analytic design is essential in yielding the proper information both about the generalizability of the assessment and the standard error of measurement. Further, illustrative D studies present the effect under a variety of situations and test designs. Additional benefits of multivariate generalizability theory in test design and evaluation are also discussed.
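
    For reference, the generalizability coefficient for a fully random persons-by-tasks (p x t) design, in which all task variance is treated as random, is the textbook expression below (not a formula quoted from this article). Treating fixed content strata as if they were random task variance inflates the person-by-task term in the denominator and so understates "parallel forms" reliability, which is the effect the abstract describes.

```latex
E\hat{\rho}^{2} \;=\; \frac{\hat{\sigma}^{2}_{p}}{\hat{\sigma}^{2}_{p} + \hat{\sigma}^{2}_{pt,e}/n_{t}}
```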

  15. Female reproductive success variation in a Pseudotsuga menziesii seed orchard as revealed by pedigree reconstruction from a bulk seed collection.

    PubMed

    El-Kassaby, Yousry A; Funda, Tomas; Lai, Ben S K

    2010-01-01

    The impact of female reproductive success on the mating system, gene flow, and genetic diversity of the filial generation was studied using a random sample of 801 bulk seeds from a 49-clone Pseudotsuga menziesii seed orchard. We used microsatellite DNA fingerprinting and pedigree reconstruction to assign each seed's maternal and paternal parents and directly estimated clonal reproductive success, selfing rate, and the proportion of seed sired by outside pollen sources. Unlike most family-array mating system and gene flow studies conducted on natural and experimental populations, which used an equal number of seeds per maternal genotype and thus generated unbiased inferences only on male reproductive success, the random sample we used was representative of the entire seed crop and therefore provided a unique opportunity to draw unbiased inferences on both female and male reproductive success variation. Selfing rate and the number of seeds sired by outside pollen sources were found to be a function of female fertility variation. This variation also substantially and negatively affected the female effective population size. Additionally, the results provided convincing evidence that the use of clone size as a proxy for fertility is questionable and requires further consideration.
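
    The effect of unequal female contributions on the effective population size can be illustrated with a standard "status number"-style measure, N_e = 1 / sum(p_i^2), where p_i is clone i's share of the seed crop. The numbers below are hypothetical; this is not the estimator or the data from the study.

```python
def effective_number_of_parents(contributions):
    """Effective number of parents from per-clone seed contributions:
    N_e = 1 / sum(p_i^2); equal contributions give N_e equal to the number of clones."""
    total = sum(contributions)
    shares = [c / total for c in contributions]
    return 1.0 / sum(p * p for p in shares)

equal = [10] * 49                        # all 49 clones contribute equally
skewed = [120, 80, 60] + [5] * 46        # a few highly fertile clones dominate the crop
print(round(effective_number_of_parents(equal), 1))    # 49.0
print(round(effective_number_of_parents(skewed), 1))   # far fewer effective parents (~9)
```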

  16. Ensembles of Spiking Neurons with Noise Support Optimal Probabilistic Inference in a Dynamically Changing Environment

    PubMed Central

    Legenstein, Robert; Maass, Wolfgang

    2014-01-01

    It has recently been shown that networks of spiking neurons with noise can emulate simple forms of probabilistic inference through “neural sampling”, i.e., by treating spikes as samples from a probability distribution of network states that is encoded in the network. Deficiencies of the existing model are its reliance on single neurons for sampling from each random variable, and the resulting limitation in representing quickly varying probabilistic information. We show that both deficiencies can be overcome by moving to a biologically more realistic encoding of each salient random variable through the stochastic firing activity of an ensemble of neurons. The resulting model demonstrates that networks of spiking neurons with noise can easily track and carry out basic computational operations on rapidly varying probability distributions, such as the odds of getting rewarded for a specific behavior. We demonstrate the viability of this new approach towards neural coding and computation, which makes use of the inherent parallelism of generic neural circuits, by showing that this model can explain experimentally observed firing activity of cortical neurons for a variety of tasks that require rapid temporal integration of sensory information. PMID:25340749
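
    The core idea here, reading off a rapidly varying probability from the instantaneous activity of an ensemble rather than from a single neuron's spike train, can be caricatured with independent Bernoulli "neurons". The Python sketch below is a toy illustration under that assumption, not the authors' spiking-network model.

      import numpy as np

      rng = np.random.default_rng(0)
      T, N = 500, 50                                             # time bins, neurons per ensemble
      p_true = 0.5 + 0.4 * np.sin(np.linspace(0, 4 * np.pi, T))  # rapidly varying probability to encode
      spikes = rng.random((T, N)) < p_true[:, None]              # each neuron fires with prob p_true[t]

      p_ensemble = spikes.mean(axis=1)                           # instantaneous population readout
      p_single = np.convolve(spikes[:, 0].astype(float),         # one neuron needs temporal averaging,
                             np.ones(25) / 25, mode="same")      # which blurs fast changes

      print("ensemble MSE:", np.mean((p_ensemble - p_true) ** 2))
      print("single-neuron MSE:", np.mean((p_single - p_true) ** 2))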

  17. The male-taller norm: Lack of evidence from a developing country.

    PubMed

    Sohn, K

    2015-08-01

    In general, women prefer men taller than themselves; this is referred to as the male-taller norm. However, since women are shorter than men on average, it is difficult to determine whether the fact that married women are on average shorter than their husbands results from the norm or is a simple artifact generated by the shorter stature of women. This study addresses the question by comparing the rate of adherence to the male-taller norm between actual mating and hypothetical random mating. A total of 7954 actually married couples are drawn from the last follow-up of the Indonesian Family Life Survey, a nationally representative survey. Their heights were measured by trained nurses. About 10,000 individuals are randomly sampled from the actual couples and randomly matched. An alternative random mating of about 100,000 couples is also performed, taking into account an age difference of 5 years within a couple. The rate of adherence to the male-taller norm is 93.4% for actual couples and 88.8% for random couples. The difference between the two figures is statistically significant, but it is emphasized that it is very small. The alternative random mating produces a rate of 91.4%. The male-taller norm exists in Indonesia, but only in a statistical sense. The small difference suggests that the norm is mostly explained by the fact that women are shorter than men on average. Copyright © 2015 Elsevier GmbH. All rights reserved.
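
    The comparison at the heart of this study, adherence among actual couples versus adherence under hypothetical random mating, amounts to re-pairing the observed heights at random and recomputing the male-taller rate. A minimal Python sketch follows; the height distributions are assumed placeholders, not the IFLS measurements.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 7954
      husbands = rng.normal(163, 6, n)   # assumed male height distribution (cm), illustrative only
      wives = rng.normal(151, 5, n)      # assumed female height distribution (cm), illustrative only

      actual_rate = np.mean(husbands > wives)                       # "actual" couples in this toy data
      random_rates = [np.mean(husbands > rng.permutation(wives))    # hypothetical random mating:
                      for _ in range(200)]                          # shuffle wives across husbands

      print(f"actual couples: {actual_rate:.3f}")
      print(f"random mating:  {np.mean(random_rates):.3f}")
      # With real couples, any preference-driven assortment appears as the gap between the two rates.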

  18. Rare event simulation in radiation transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kollman, Craig

    1993-10-01

    This dissertation studies methods for estimating extremely small probabilities by Monte Carlo simulation. Problems in radiation transport typically involve estimating very rare events or the expected value of a random variable which is with overwhelming probability equal to zero. These problems often have high dimensional state spaces and irregular geometries so that analytic solutions are not possible. Monte Carlo simulation must be used to estimate the radiation dosage being transported to a particular location. If the area is well shielded, the probability of any one particular particle getting through is very small. Because of the large number of particles involved, even a tiny fraction penetrating the shield may represent an unacceptable level of radiation. It therefore becomes critical to be able to accurately estimate this extremely small probability. Importance sampling is a well known technique for improving the efficiency of rare event calculations. Here, a new set of probabilities is used in the simulation runs. The results are multiplied by the likelihood ratio between the true and simulated probabilities so as to keep the estimator unbiased. The variance of the resulting estimator is very sensitive to which new set of transition probabilities is chosen. It is shown that a zero variance estimator does exist, but that its computation requires exact knowledge of the solution. A simple random walk with an associated killing model for the scatter of neutrons is introduced. Large deviation results for optimal importance sampling in random walks are extended to the case where killing is present. An adaptive "learning" algorithm for implementing importance sampling is given for more general Markov chain models of neutron scatter. For finite state spaces this algorithm is shown to give, with probability one, a sequence of estimates converging exponentially fast to the true solution.
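
    The likelihood-ratio reweighting described here can be shown on a one-dimensional toy problem: estimating a far-tail normal probability by sampling from a shifted distribution. The Python sketch below only illustrates the unbiased importance-sampling estimator; it is not a model of the dissertation's neutron-transport problems.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      n = 100_000
      # rare event: P(Z > 6) for Z ~ N(0, 1)
      z = rng.standard_normal(n)
      p_naive = np.mean(z > 6)                                # plain Monte Carlo almost never hits it

      y = rng.normal(6.0, 1.0, n)                             # sample from a shifted proposal N(6, 1)
      w = stats.norm.pdf(y, 0, 1) / stats.norm.pdf(y, 6, 1)   # likelihood ratio keeps the estimator unbiased
      p_is = np.mean((y > 6) * w)

      print(p_naive, p_is, stats.norm.sf(6))                  # compare with the exact tail probability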

  19. Determinants of serum zinc in a random population sample of four Belgian towns with different degrees of environmental exposure to cadmium

    PubMed Central

    Thijs, Lutgarde; Staessen, Jan; Amery, Antoon; Bruaux, Pierre; Buchet, Jean-Pierre; Claeys, FranÇoise; De Plaen, Pierre; Ducoffre, Geneviève; Lauwerys, Robert; Lijnen, Paul; Nick, Laurence; Remy, Annie Saint; Roels, Harry; Rondia, Désiré; Sartor, Francis

    1992-01-01

    This report investigated the distribution of serum zinc and the factors determining serum zinc concentration in a large random population sample. The 1977 participants (959 men and 1018 women), 20–80 years old, constituted a stratified random sample of the population of four Belgian districts, representing two areas with low and two with high environmental exposure to cadmium. For each exposure level, a rural and an urban area were selected. The serum concentration of zinc, frequently used as an index for zinc status in human subjects, was higher in men (13.1 μmole/L, range 6.5–23.0 μmole/L) than in women (12.6 μmole/L, range 6.3–23.2 μmole/L). In men, 20% of the variance of serum zinc was explained by age (linear and squared term, R = 0.29), diurnal variation (r = 0.29), and total cholesterol (r = 0.16). After adjustment for these covariates, a negative relationship was observed between serum zinc and both blood (r = −0.10) and urinary cadmium (r = −0.14). In women, 11% of the variance could be explained by age (linear and squared term, R = 0.15), diurnal variation in serum zinc (r = 0.27), creatinine clearance (r = −0.11), log γ-glutamyltranspeptidase (r = 0.08), cholesterol (r = 0.07), contraceptive pill intake (r = −0.07), and log serum ferritin (r = 0.06). Before and after adjustment for significant covariates, serum zinc was, on average, lowest in the two districts where the body burden of cadmium, as assessed by urinary cadmium excretion, was highest. These results were not altered when subjects exposed to heavy metals at work were excluded from analysis. PMID:1486857

  20. Being “SMART” about Adolescent Conduct Problems Prevention: Executing a SMART Pilot Study in a Juvenile Diversion Agency

    PubMed Central

    August, Gerald J.; Piehler, Timothy F.; Bloomquist, Michael L.

    2014-01-01

    OBJECTIVE The development of adaptive treatment strategies (ATS) represents the next step in innovating conduct problems prevention programs within a juvenile diversion context. Towards this goal, we present the theoretical rationale, associated methods, and anticipated challenges for a feasibility pilot study in preparation for implementing a full-scale SMART (i.e., sequential, multiple assignment, randomized trial) for conduct problems prevention. The role of a SMART design in constructing ATS is presented. METHOD The SMART feasibility pilot study includes a sample of 100 youth (13–17 years of age) identified by law enforcement as early stage offenders and referred for pre-court juvenile diversion programming. Prior data on the sample population detail a high level of ethnic diversity and approximately equal representations of both genders. Within the SMART, youth and their families are first randomly assigned to one of two different brief-type evidence-based prevention programs, featuring parent-focused behavioral management or youth-focused strengths-building components. Youth who do not respond sufficiently to brief first-stage programming will be randomly assigned a second time to either extended parent-focused or youth-focused second-stage programming. Measures of proximal intervention response and measures of potential candidate tailoring variables for developing ATS within this sample are detailed. RESULTS Results of the described pilot study will include information regarding feasibility and acceptability of the SMART design. This information will be used to refine a subsequent full-scale SMART. CONCLUSIONS The use of a SMART to develop ATS for prevention will increase the efficiency and effectiveness of prevention programming for youth with developing conduct problems. PMID:25256135
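
    The sequencing logic of a SMART, a first-stage randomization, assessment of response, and re-randomization of non-responders only, can be sketched in a few lines of Python. The response probability and arm labels below are hypothetical placeholders, not the pilot study's data.

      from collections import Counter
      import numpy as np

      rng = np.random.default_rng(1)
      n = 100
      stage1 = rng.choice(["parent_brief", "youth_brief"], size=n)       # first randomization
      responded = rng.random(n) < 0.55                                   # assumed response rate to brief programming
      stage2 = np.where(responded, "none",
                        rng.choice(["parent_extended", "youth_extended"], size=n))  # re-randomize non-responders only

      # each (stage1, stage2) pattern corresponds to one embedded adaptive treatment strategy
      print(Counter(zip(stage1.tolist(), stage2.tolist())))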

  1. The frequency and nature of alcohol and tobacco advertising in televised sports, 1990 through 1992.

    PubMed

    Madden, P A; Grube, J W

    1994-02-01

    This study examines the frequency and nature of alcohol and tobacco advertising in a random sample of 166 televised sports events representing 443.7 hours of network programming broadcast from fall 1990 through summer 1992. More commercials appear for alcohol products than for any other beverage. Beer commercials predominate and include images at odds with recommendations from former Surgeon General Koop. The audience is also exposed to alcohol and tobacco advertising through the appearances of stadium signs, other on-site promotions, and verbal or visual brief product sponsorships. Moderation messages and public service announcements are rare.

  2. Prevalence of alcohol-impaired drivers based on random breath tests in a roadside survey in Catalonia (Spain).

    PubMed

    Alcañiz, Manuela; Guillén, Montserrat; Santolino, Miguel; Sánchez-Moscona, Daniel; Llatje, Oscar; Ramon, Lluís

    2014-04-01

    Sobriety checkpoints are not usually randomly located by traffic authorities. As such, information provided by non-random alcohol tests cannot be used to infer the characteristics of the general driving population. In this paper a case study is presented in which the prevalence of alcohol-impaired driving is estimated for the general population of drivers. A stratified probabilistic sample was designed to represent vehicles circulating in non-urban areas of Catalonia (Spain), a region characterized by its complex transportation network and dense traffic around the metropolis of Barcelona. Random breath alcohol concentration tests were performed during spring 2012 on 7596 drivers. The estimated prevalence of alcohol-impaired drivers was 1.29%, which is roughly a third of the rate obtained in non-random tests. Higher rates were found on weekends (1.90% on Saturdays and 4.29% on Sundays) and especially at night. The rate is higher for men (1.45%) than for women (0.64%) and it shows an increasing pattern with age. In vehicles with two occupants, the proportion of alcohol-impaired drivers is estimated at 2.62%, but when the driver was alone the rate drops to 0.84%, which might reflect the socialization of drinking habits. The results are compared with outcomes in previous surveys, showing a decreasing trend in the prevalence of alcohol-impaired drivers over time. Copyright © 2014 Elsevier Ltd. All rights reserved.
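
    Because the checkpoints were drawn from a stratified probabilistic design, the population prevalence is a design-weighted average of stratum-specific rates rather than the raw positive fraction. The Python sketch below illustrates that calculation with made-up stratum weights and counts; the survey's actual strata and figures are not reproduced here.

      # hypothetical strata (e.g., day type); weights and counts are illustrative only
      strata = {
          "weekday":  {"weight": 0.70, "tested": 5000, "positive": 45},
          "saturday": {"weight": 0.15, "tested": 1500, "positive": 28},
          "sunday":   {"weight": 0.15, "tested": 1096, "positive": 47},
      }

      # design-weighted prevalence: sum over strata of (population weight) x (stratum positivity rate)
      prevalence = sum(s["weight"] * s["positive"] / s["tested"] for s in strata.values())
      print(f"weighted prevalence of alcohol-impaired drivers ~ {100 * prevalence:.2f}%")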

  3. [Comparison study on sampling methods of Oncomelania hupensis snail survey in marshland schistosomiasis epidemic areas in China].

    PubMed

    An, Zhao; Wen-Xin, Zhang; Zhong, Yao; Yu-Kuan, Ma; Qing, Liu; Hou-Lang, Duan; Yi-di, Shang

    2016-06-29

    To optimize and simplify the survey method for Oncomelania hupensis snails in marshland regions endemic for schistosomiasis, and to increase the precision, efficiency and economy of the snail survey. A 50 m × 50 m quadrat experimental field was selected in the Chayegang marshland near Henghu farm in the Poyang Lake region, and a whole-coverage method was adopted to survey the snails. The simple random sampling, systematic sampling and stratified random sampling methods were applied to calculate the minimum sample size, relative sampling error and absolute sampling error. The minimum sample sizes of the simple random sampling, systematic sampling and stratified random sampling methods were 300, 300 and 225, respectively. The relative sampling errors of the three methods were all less than 15%. The absolute sampling errors were 0.221 7, 0.302 4 and 0.047 8, respectively. Spatially stratified sampling with altitude as the stratum variable is an efficient approach, with lower cost and higher precision, for the snail survey.
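
    The advantage of stratifying on a variable that drives snail density (here, altitude) can be reproduced on a synthetic field: at the same sample size, the stratified estimator of the mean has a smaller sampling error than simple random sampling. The field simulated in the Python sketch below is purely illustrative, not the Poyang Lake data.

      import numpy as np

      rng = np.random.default_rng(0)
      altitude = np.linspace(0, 1, 50)[:, None] * np.ones((1, 50))   # gradient across a 50 x 50 grid
      field = rng.poisson(lam=2 + 6 * altitude)                      # snail counts increase with altitude
      true_mean = field.mean()

      def srs(n):                                                    # simple random sample of n cells
          idx = rng.choice(field.size, n, replace=False)
          return field.ravel()[idx].mean()

      def stratified(n, n_strata=5):                                 # equal allocation over altitude bands
          bands = np.array_split(np.arange(50), n_strata)
          per = n // n_strata
          return np.mean([rng.choice(field[rows, :].ravel(), per, replace=False).mean()
                          for rows in bands])

      n = 225
      err_srs = np.std([srs(n) - true_mean for _ in range(1000)])
      err_str = np.std([stratified(n) - true_mean for _ in range(1000)])
      print(true_mean, err_srs, err_str)                             # stratified error should be smaller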

  4. From cars to bikes - the feasibility and effect of using e-bikes, longtail bikes and traditional bikes for transportation among parents of children attending kindergarten: design of a randomized cross-over trial.

    PubMed

    Bjørnarå, Helga Birgit; Berntsen, Sveinung; Te Velde, Saskia J; Fegran, Liv; Fyhri, Aslak; Deforche, Benedicte; Andersen, Lars Bo; Bere, Elling

    2017-12-28

    The present study aims to increase bicycling and level of physical activity (PA), and thereby promote health in parents of toddlers, by giving access to different bicycle types. There is a need for greater understanding of e-bikes and their role in the transportation network, and further effects on PA levels and health. Moreover, longtail bikes could meet certain practical needs not fulfilled by e-bikes or traditional bikes, hence increased knowledge regarding their feasibility should be obtained. No previous studies have investigated whether providing an e-bike or a longtail bike over an extended period in a sample of parents of toddlers influence objectively assessed amount of bicycling and total PA level, transportation habits, cardiorespiratory fitness, body composition and blood pressure. A randomized cross-over trial will be performed, entailing that participants in the intervention group (n = 18) complete the following intervention arms in random order: (i) three months access to an e-bicycle with trailer for child transportation (n = 6), (ii) three months access to a longtail bicycle (n = 6), and (iii) three months access to a regular bicycle with trailer (n = 6), in total nine months. Also, a control group (n = 18) maintaining usual transportation and PA habits will be included. A convenience sample consisting of 36 parents of toddlers residing in Kristiansand municipality, Southern Norway, will be recruited. Total amount of bicycling (distance and time), total level of PA, and transportation habits will be measured at baseline and in connection to each intervention arm. Cardiorespiratory fitness, body composition and blood pressure will be measured at baseline and post-intervention. Main outcome will be bicycling distance and time spent cycling. New knowledge relevant for the timely issues of public health and environmental sustainability will be provided among parents of toddlers, representing a target group of greatest importance. There is a call for research on the influence of e-bikes and longtail bikes on travel behavior and PA levels, and whether voluntary cycling could improve health. If the present study reveals promising results, it should be replicated in larger and more representative samples. Eventually, inclusion in national public health policies should be considered. ID NCT03131518 , made public 26.04.2017.

  5. A systematic random sampling scheme optimized to detect the proportion of rare synapses in the neuropil.

    PubMed

    da Costa, Nuno Maçarico; Hepp, Klaus; Martin, Kevan A C

    2009-05-30

    Synapses can only be morphologically identified by electron microscopy and this is often a very labor-intensive and time-consuming task. When quantitative estimates are required for pathways that contribute a small proportion of synapses to the neuropil, the problems of accurate sampling are particularly severe and the total time required may become prohibitive. Here we present a sampling method devised to count the percentage of rarely occurring synapses in the neuropil using a large sample (approximately 1000 sampling sites), with the strong constraint of doing it in reasonable time. The strategy, which uses the unbiased physical disector technique, resembles that used in particle physics to detect rare events. We validated our method in the primary visual cortex of the cat, where we used biotinylated dextran amine to label thalamic afferents and measured the density of their synapses using the physical disector method. Our results show that we could obtain accurate counts of the labeled synapses, even when they represented only 0.2% of all the synapses in the neuropil.
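
    The practical constraint discussed here, how many disector sites are needed before a rare proportion is estimated with tolerable error, follows from the binomial relative standard error. The short Python calculation below is a generic illustration with arbitrary target errors, not the paper's validation data.

      import math

      def sites_needed(p, rel_se):
          # smallest n with SE(p_hat)/p = sqrt((1 - p) / (n * p)) <= rel_se
          return math.ceil((1 - p) / (p * rel_se ** 2))

      for p in (0.002, 0.01, 0.05):                 # proportion of labeled synapses in the neuropil
          for rel_se in (0.7, 0.5):                 # tolerated relative standard error
              print(f"p = {p:.3f}, rel. SE {rel_se:.1f}: ~{sites_needed(p, rel_se)} sites")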

  6. Smoothing the redshift distributions of random samples for the baryon acoustic oscillations: applications to the SDSS-III BOSS DR12 and QPM mock samples

    NASA Astrophysics Data System (ADS)

    Wang, Shao-Jiang; Guo, Qi; Cai, Rong-Gen

    2017-12-01

    We investigate the impact of different redshift distributions of random samples on the baryon acoustic oscillations (BAO) measurements of D_V(z)r_d^fid/r_d from the two-point correlation functions of galaxies in the Data Release 12 of the Baryon Oscillation Spectroscopic Survey (BOSS). Big surveys, such as BOSS, usually assign redshifts to the random samples by randomly drawing values from the measured redshift distributions of the data, which would necessarily introduce fiducial signals of fluctuations into the random samples, weakening the signals of BAO, if the cosmic variance cannot be ignored. We propose a smooth function of redshift distribution that fits the data well to populate the random galaxy samples. The resulting cosmological parameters match the input parameters of the mock catalogue very well. The significance of the BAO signals is improved by 0.33σ for a low-redshift sample and by 0.03σ for a constant-stellar-mass sample, though the absolute values do not change significantly. Given the precision of current measurements of cosmological parameters, such improvements would be worthwhile for future measurements of galaxy clustering.
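
    The proposal can be mimicked on synthetic data: instead of assigning random-catalogue redshifts by resampling the data ("shuffled" randoms), fit a smooth n(z) and draw from it through the inverse CDF. The Python sketch below uses a polynomial fit of arbitrary degree on toy redshifts; it is not the BOSS pipeline.

      import numpy as np

      rng = np.random.default_rng(0)
      z_data = rng.normal(0.5, 0.1, 20_000)                    # toy galaxy redshifts

      # "shuffled" randoms: resample redshifts straight from the data (imprints its fluctuations)
      z_rand_shuffled = rng.choice(z_data, size=10 * z_data.size)

      # smooth randoms: fit a smooth n(z), then sample via the inverse CDF
      hist, edges = np.histogram(z_data, bins=60, density=True)
      centers = 0.5 * (edges[:-1] + edges[1:])
      nz_smooth = np.clip(np.polyval(np.polyfit(centers, hist, deg=6), centers), 0, None) + 1e-12
      cdf = np.cumsum(nz_smooth)
      cdf /= cdf[-1]
      z_rand_smooth = np.interp(rng.random(10 * z_data.size), cdf, centers)

      print(z_rand_shuffled.mean(), z_rand_smooth.mean())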

  7. Estimation of distribution overlap of urn models.

    PubMed

    Hampton, Jerrad; Lladser, Manuel E

    2012-01-01

    A classical problem in statistics is estimating the expected coverage of a sample, which has had applications in gene expression, microbial ecology, optimization, and even numismatics. Here we consider a related extension of this problem to random samples of two discrete distributions. Specifically, we estimate what we call the dissimilarity probability of a sample, i.e., the probability of a draw from one distribution not being observed in n draws from another distribution. We show our estimator of dissimilarity to be a U-statistic and a uniformly minimum variance unbiased estimator of dissimilarity over the largest appropriate range of n. Furthermore, despite the non-Markovian nature of our estimator when applied sequentially over n, we show it converges uniformly in probability to the dissimilarity parameter, and we present criteria under which it is approximately normally distributed and admits a consistent jackknife estimator of its variance. As proof of concept, we analyze V35 16S rRNA data to discern between various microbial environments. Other potential applications concern any situation where dissimilarity of two discrete distributions may be of interest. For instance, in SELEX experiments, each urn could represent a random RNA pool and each draw a possible solution to a particular binding site problem over that pool. The dissimilarity of these pools is then related to the probability of finding binding site solutions in one pool that are absent in the other.
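
    For two fully known discrete distributions, the dissimilarity probability has a closed form, the sum over categories of p_i (1 - q_i)^n, which a small Monte Carlo check reproduces. The Python sketch below works from assumed urn probabilities; the paper's U-statistic estimator, which operates on finite samples from both urns, is not implemented here.

      import numpy as np

      rng = np.random.default_rng(0)
      p = np.array([0.4, 0.3, 0.2, 0.1])     # urn P (illustrative probabilities)
      q = np.array([0.05, 0.15, 0.3, 0.5])   # urn Q
      n = 10                                  # number of draws from Q

      exact = np.sum(p * (1 - q) ** n)        # P(a draw from P is unseen among n draws from Q)

      reps, hits = 20_000, 0
      for _ in range(reps):
          draw_p = rng.choice(len(p), p=p)
          sample_q = rng.choice(len(q), size=n, p=q)
          hits += draw_p not in sample_q
      print(exact, hits / reps)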

  8. Full syndrome and subthreshold attention-deficit/hyperactivity disorder in a Korean community sample: comorbidity and temperament findings.

    PubMed

    Cho, Soo-Churl; Kim, Boong-Nyun; Kim, Jae-Won; Rohde, Luis Augusto; Hwang, Jun-Won; Chungh, Dong-Seon; Shin, Min-Sup; Lyoo, In Kyoon; Go, Bock-Ja; Lee, Sang-Eun; Kim, Hyo-Won

    2009-07-01

    The main objective of this study was to investigate the comorbid disorders and temperamental profiles of full syndrome and subthreshold attention-deficit/hyperactivity disorder (ADHD). A sample of 2,493 students was randomly selected from six representative elementary schools in Seoul, Korea. Among 245 children with full syndrome and subthreshold ADHD diagnosed by the diagnostic interview schedule for children-4th version, parents of 185 subjects (mean age 9.0 +/- 1.7 years) and of a random sample of 185 age- and gender-matched non-ADHD children completed the parent's version of the child behavior checklist (CBCL) and the juvenile temperament and character inventory (JTCI). The prevalence rates of full syndrome and subthreshold ADHD were, respectively, 5.90% (95% confidence interval = 4.74-7.06) and 9.00% (95% confidence interval = 7.58-10.41). Subthreshold ADHD cases did not differ from full syndrome ADHD in any JTCI profile, both showing higher novelty seeking, lower persistence and lower self-directedness than controls. Subthreshold ADHD also showed increased risk for externalizing disorders and higher scores in eight CBCL scales (somatic complaints, anxious/depressed, social problems, attention problems, delinquent behaviors, aggressive behaviors, externalizing problems and total behavioral problems) compared to the controls. These results support the clinical relevance of subthreshold ADHD in Asian culture. Increased clinical awareness for children with subthreshold ADHD is needed.

  9. Effectiveness of Systems Training for Emotional Predictability and Problem Solving (STEPPS) for borderline personality problems in a 'real-world' sample: moderation by diagnosis or severity?

    PubMed

    Bos, Elisabeth H; van Wel, E Bas; Appelo, Martin T; Verbraak, Marc J P M

    2011-01-01

    Systems Training for Emotional Predictability and Problem Solving (STEPPS) is a group treatment for borderline personality disorder (BPD). Two prior randomized controlled trials (RCTs) have shown the efficacy of this training. In both RCTs, patients with borderline features who did not meet the DSM-IV criteria for BPD were excluded, and these were numerous. We investigated the effectiveness of STEPPS in a sample representative of routine clinical practice and examined whether DSM-IV diagnosis and/or baseline severity were related to differential effectiveness. Patients diagnosed with BPD by their practicing clinician were randomized to STEPPS plus adjunctive individual therapy (STEPPS, n = 84) or to treatment as usual (TAU, n = 84). STEPPS recipients showed more improvement on measures of general and BPD-specific psychopathology as well as quality of life than TAU recipients, both at the end of treatment and at a 6-month follow-up. Presence of DSM-IV-diagnosed BPD was not related to differential treatment effectiveness, but dimensional measures of symptom severity were; STEPPS was superior to TAU particularly in patients with higher baseline severity scores. The findings show the effectiveness of STEPPS in a 'real-world' sample, and underscore the importance of dimensional versus categorical measures of personality disturbance. Copyright © 2011 S. Karger AG, Basel.

  10. Occurrence and distribution of methyl tert-butyl ether and other volatile organic compounds in drinking water in the Northeast and Mid-Atlantic regions of the United States, 1993-98

    USGS Publications Warehouse

    Grady, S.J.; Casey, G.D.

    2001-01-01

    Data on volatile organic compounds (VOCs) in drinking water supplied by 2,110 randomly selected community water systems (CWSs) in 12 Northeast and Mid-Atlantic States indicate 64 VOC analytes were detected at least once during 1993-98. Selection of the 2,110 CWSs inventoried for this study targeted 20 percent of the 10,479 active CWSs in the region and represented a random subset of the total distribution by State, source of water, and size of system. The data include 21,635 analyses of drinking water collected for compliance monitoring under the Safe Drinking Water Act; the data mostly represent finished drinking water collected at the point-of-entry to, or at more distal locations within, each CWS's distribution system following any water-treatment processes. VOC detections were more common in drinking water supplied by large systems (serving more than 3,300 people) that tap surface-water sources or both surface- and groundwater sources than in small systems supplied exclusively by ground-water sources. Trihalomethane (THM) compounds, which are potentially formed during the process of disinfecting drinking water with chlorine, were detected in 45 percent of the randomly selected CWSs. Chloroform was the most frequently detected THM, reported in 39 percent of the CWSs. The gasoline additive methyl tert-butyl ether (MTBE) was the most frequently detected VOC in drinking water after the THMs. MTBE was detected in 8.9 percent of the 1,194 randomly selected CWSs that analyzed samples for MTBE at any reporting level, and it was detected in 7.8 percent of the 1,074 CWSs that provided MTBE data at the 1.0-μg/L (microgram per liter) reporting level. As with other VOCs reported in drinking water, most MTBE concentrations were less than 5.0 μg/L, and less than 1 percent of CWSs reported MTBE concentrations at or above the 20.0-μg/L lower limit recommended by the U.S. Environmental Protection Agency's Drinking-Water Advisory. The frequency of MTBE detections in drinking water is significantly related to high-MTBE-use patterns. Detections are five times more likely in areas where MTBE is or has been used in gasoline at greater than 5 percent by volume as part of the oxygenated or reformulated (OXY/RFG) fuels program. Detection frequencies of the individual gasoline compounds (benzene, toluene, ethylbenzene, and xylenes (BTEX)) were mostly less than 3 percent of the randomly selected CWSs, but collectively, BTEX compounds were detected in 8.4 percent of CWSs. BTEX concentrations also were low and just three drinking-water samples contained BTEX at concentrations exceeding 20 μg/L. Co-occurrence of MTBE and BTEX was rare, and only 0.8 percent of CWSs reported simultaneous detections of MTBE and BTEX compounds. Low concentrations and co-occurrence of MTBE and BTEX indicate most gasoline contaminants in drinking water probably represent nonpoint sources. Solvents were frequently detected in drinking water in the 12-State area. One or more of 27 individual solvent VOCs were detected at any reporting level in 3,080 drinking-water samples from 304 randomly selected CWSs (14 percent) and in 206 CWSs (9.8 percent) at concentrations at or above 1.0 μg/L. High co-occurrence among solvents probably reflects common sources and the presence of transformation by-products. Other VOCs were relatively rarely detected in drinking water in the 12-State area. Six percent (127) of the 2,110 randomly selected CWSs reported concentrations of 16 VOCs at or above drinking-water criteria.
The 127 CWSs collectively serve 2.6 million people. The occurrence of VOCs in drinking water was significantly associated (p<0.0001) with high population- density urban areas. New Jersey, Massachusetts, and Rhode Island, States with substantial urbanization and high population density, had the highest frequency of VOC detections among the 12 States. More than two-thirds of the randomly selected CWSs in New Jersey reported detecting VOC concentrations in drinking water at or above 1

  11. Changes in Brain Volume and Cognition in a Randomized Trial of Exercise and Social Interaction in a Community-Based Sample of Non-Demented Chinese Elders

    PubMed Central

    Mortimer, James A.; Ding, Ding; Borenstein, Amy R.; DeCarli, Charles; Guo, Qihao; Wu, Yougui; Zhao, Qianhua; Chu, Shugang

    2013-01-01

    Physical exercise has been shown to increase brain volume and improve cognition in randomized trials of non-demented elderly. Although greater social engagement was found to reduce dementia risk in observational studies, randomized trials of social interventions have not been reported. A representative sample of 120 elderly from Shanghai, China was randomized to four groups (Tai Chi, Walking, Social Interaction, No Intervention) for 40 weeks. Two MRIs were obtained, one before the intervention period, the other after. A neuropsychological battery was administered at baseline, 20 weeks, and 40 weeks. Comparison of changes in brain volumes in intervention groups with the No Intervention group were assessed by t-tests. Time-intervention group interactions for neuropsychological measures were evaluated with repeated-measures mixed models. Compared to the No Intervention group, significant increases in brain volume were seen in the Tai Chi and Social Intervention groups (p < 0.05). Improvements also were observed in several neuropsychological measures in the Tai Chi group, including the Mattis Dementia Rating Scale score (p = 0.004), the Trailmaking Test A (p = 0.002) and B (p = 0.0002), the Auditory Verbal Learning Test (p = 0.009), and verbal fluency for animals (p = 0.01). The Social Interaction group showed improvement on some, but fewer neuropsychological indices. No differences were observed between the Walking and No Intervention groups. The findings differ from previous clinical trials in showing increases in brain volume and improvements in cognition with a largely non-aerobic exercise (Tai Chi). In addition, intellectual stimulation through social interaction was associated with increases in brain volume as well as with some cognitive improvements. PMID:22451320

  12. External validity of randomized controlled trials in older adults, a systematic review.

    PubMed

    van Deudekom, Floor J; Postmus, Iris; van der Ham, Danielle J; Pothof, Alexander B; Broekhuizen, Karen; Blauw, Gerard J; Mooijaart, Simon P

    2017-01-01

    To critically assess the external validity of randomized controlled trials (RCTs), it is important to know which older adults have been enrolled in the trials. The aim of this systematic review is to study what proportion of trials specifically designed for older patients report on somatic status, physical and mental functioning, social environment and frailty in the patient characteristics. PubMed was searched for articles published in 2012 and only RCTs were included. Articles were further excluded if they were not conducted in humans or reported only secondary analyses. A random sample of 10% was drawn. The current review analyzed this random sample and further selected trials when the reported mean age was ≥ 60 years. We extracted geriatric assessments from the population descriptives or the in- and exclusion criteria. In total 1396 trials were analyzed and 300 trials included. The median of the reported mean age was 66 (IQR 63-70) and the median percentage of men in the trials was 60 (IQR 45-72). In 34% of the RCTs specifically designed for older patients, somatic status, physical and mental functioning, social environment or frailty were reported in the population descriptives or the in- and exclusion criteria. Physical and mental functioning were reported most frequently (22% and 14%). When selecting RCTs with a reported mean age of 70 or 80 years, geriatric assessments appeared in the patient characteristics of 46% and 85% of trials, respectively, but these trials represent only 5% and 1% of the total. Somatic status, physical and mental functioning, social environment and frailty are underreported even in RCTs specifically designed for older patients published in 2012. Therefore, it is unclear for clinicians to which older patients the results can be applied. We recommend systematically and transparently reporting these relevant characteristics of older participants included in RCTs.

  13. Changes in brain volume and cognition in a randomized trial of exercise and social interaction in a community-based sample of non-demented Chinese elders.

    PubMed

    Mortimer, James A; Ding, Ding; Borenstein, Amy R; DeCarli, Charles; Guo, Qihao; Wu, Yougui; Zhao, Qianhua; Chu, Shugang

    2012-01-01

    Physical exercise has been shown to increase brain volume and improve cognition in randomized trials of non-demented elderly. Although greater social engagement was found to reduce dementia risk in observational studies, randomized trials of social interventions have not been reported. A representative sample of 120 elderly from Shanghai, China was randomized to four groups (Tai Chi, Walking, Social Interaction, No Intervention) for 40 weeks. Two MRIs were obtained, one before the intervention period, the other after. A neuropsychological battery was administered at baseline, 20 weeks, and 40 weeks. Comparison of changes in brain volumes in intervention groups with the No Intervention group were assessed by t-tests. Time-intervention group interactions for neuropsychological measures were evaluated with repeated-measures mixed models. Compared to the No Intervention group, significant increases in brain volume were seen in the Tai Chi and Social Intervention groups (p < 0.05). Improvements also were observed in several neuropsychological measures in the Tai Chi group, including the Mattis Dementia Rating Scale score (p = 0.004), the Trailmaking Test A (p = 0.002) and B (p = 0.0002), the Auditory Verbal Learning Test (p = 0.009), and verbal fluency for animals (p = 0.01). The Social Interaction group showed improvement on some, but fewer neuropsychological indices. No differences were observed between the Walking and No Intervention groups. The findings differ from previous clinical trials in showing increases in brain volume and improvements in cognition with a largely non-aerobic exercise (Tai Chi). In addition, intellectual stimulation through social interaction was associated with increases in brain volume as well as with some cognitive improvements.

  14. Employing a Multi-level Approach to Recruit a Representative Sample of Women with Recent Gestational Diabetes Mellitus into a Randomized Lifestyle Intervention Trial.

    PubMed

    Nicklas, Jacinda M; Skurnik, Geraldine; Zera, Chloe A; Reforma, Liberty G; Levkoff, Sue E; Seely, Ellen W

    2016-02-01

    The postpartum period is a window of opportunity for diabetes prevention in women with recent gestational diabetes (GDM), but recruitment for clinical trials during this period of life is a major challenge. We adapted a social-ecologic model to develop a multi-level recruitment strategy at the macro (high or institutional level), meso (mid or provider level), and micro (individual) levels. Our goal was to recruit 100 women with recent GDM into the Balance after Baby randomized controlled trial over a 17-month period. Participants were asked to attend three in-person study visits at 6 weeks, 6, and 12 months postpartum. They were randomized into a control arm or a web-based intervention arm at the end of the baseline visit at six weeks postpartum. At the end of the recruitment period, we compared population characteristics of our enrolled subjects to the entire population of women with GDM delivering at Brigham and Women's Hospital (BWH). We successfully recruited 107 of 156 (69 %) women assessed for eligibility, with the majority (92) recruited during pregnancy at a mean 30 (SD ± 5) weeks of gestation, and 15 recruited postpartum, at a mean 2 (SD ± 3) weeks postpartum. 78 subjects attended the initial baseline visit, and 75 subjects were randomized into the trial at a mean 7 (SD ± 2) weeks postpartum. The recruited subjects were similar in age and race/ethnicity to the total population of 538 GDM deliveries at BWH over the 17-month recruitment period. Our multilevel approach allowed us to successfully meet our recruitment goal and recruit a representative sample of women with recent GDM. We believe that our most successful strategies included using a dedicated in-person recruiter, integrating recruitment into clinical flow, allowing for flexibility in recruitment, minimizing barriers to participation, and using an opt-out strategy with providers. Although the majority of women were recruited while pregnant, women recruited in the early postpartum period were more likely to present for the first study visit. Given the increased challenges of recruiting postpartum women with GDM into research studies, we believe our findings will be useful to other investigators seeking to study this population.

  15. A Portuguese value set for the SF-6D.

    PubMed

    Ferreira, Lara N; Ferreira, Pedro L; Pereira, Luis N; Brazier, John; Rowen, Donna

    2010-08-01

    The SF-6D is a preference-based measure of health derived from the SF-36 that can be used for cost-effectiveness analysis based on cost per quality-adjusted life-year. This study seeks to estimate system weights for the SF-6D for Portugal and to compare the results with the UK system weights. A sample of 55 health states defined by the SF-6D has been valued by a representative random sample of the Portuguese population, stratified by sex and age (n = 140), using the Standard Gamble (SG). Several models are estimated at both the individual and aggregate levels for predicting health-state valuations. Models with main effects, with interaction effects and with the constant forced to unity are presented. Random effects (RE) models are estimated using generalized least squares (GLS) regressions. Generalized estimation equations (GEE) are used to estimate RE models with the constant forced to unity. Estimations at the individual level were performed using 630 health-state valuations. Alternative functional forms are considered to account for the skewed distribution of health-state valuations. The models are analyzed in terms of their coefficients, overall fit, and their ability to predict the SG values. The RE models estimated using GLS and through GEE produce significant coefficients, which are robust across model specification. However, there are concerns regarding some inconsistent estimates, and so parsimonious consistent models were estimated. There is evidence of underprediction in some states assigned to poor health. The results are consistent with the UK results. The models estimated provide preference-based quality of life weights for the Portuguese population when health status data have been collected using the SF-36. Although the sample was randomly drawn, the findings should be treated with caution given the small sample size, even though the models were estimated at the individual level.

  16. Assessing sample representativeness in randomized controlled trials: application to the National Institute of Drug Abuse Clinical Trials Network.

    PubMed

    Susukida, Ryoko; Crum, Rosa M; Stuart, Elizabeth A; Ebnesajjad, Cyrus; Mojtabai, Ramin

    2016-07-01

    To compare the characteristics of individuals participating in randomized controlled trials (RCTs) of treatments of substance use disorder (SUD) with individuals receiving treatment in usual care settings, and to provide a summary quantitative measure of differences between characteristics of these two groups of individuals using propensity score methods. Design: Analyses using data from RCT samples from the National Institute of Drug Abuse Clinical Trials Network (CTN) and target populations of patients drawn from the Treatment Episodes Data Set-Admissions (TEDS-A). Settings: Multiple clinical trial sites and nation-wide usual SUD treatment settings in the United States. A total of 3592 individuals from 10 CTN samples and 1 602 226 individuals selected from TEDS-A between 2001 and 2009. Measurements: The propensity scores for enrolling in the RCTs were computed based on the following nine observable characteristics: sex, race/ethnicity, age, education, employment status, marital status, admission to treatment through criminal justice, intravenous drug use and the number of prior treatments. Findings: The proportion of those with ≥ 12 years of education and the proportion of those who had full-time jobs were significantly higher among RCT samples than among target populations (in seven and nine trials, respectively, at P < 0.001). The pooled difference in the mean propensity scores between the RCTs and the target population was 1.54 standard deviations and was statistically significant at P < 0.001. In the United States, individuals recruited into randomized controlled trials of substance use disorder treatments appear to be very different from individuals receiving treatment in usual care settings. Notably, RCT participants tend to have more years of education and a greater likelihood of full-time work compared with people receiving care in usual care settings. © 2016 Society for the Study of Addiction.
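
    The summary measure used here, a standardized difference in propensity scores between trial enrollees and the usual-care population, can be sketched with a logistic model on synthetic covariates. The variable shifts and sample sizes in the Python sketch below are invented stand-ins, not the CTN/TEDS-A data.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      n_rct, n_pop = 400, 4000
      X_rct = rng.normal(0.5, 1.0, size=(n_rct, 5))      # trial enrollees, shifted on e.g. education/employment
      X_pop = rng.normal(0.0, 1.0, size=(n_pop, 5))      # usual-care treatment population
      X = np.vstack([X_rct, X_pop])
      y = np.r_[np.ones(n_rct), np.zeros(n_pop)]         # 1 = enrolled in the RCT

      prob = LogisticRegression(max_iter=1000).fit(X, y).predict_proba(X)[:, 1]
      logit = np.log(prob / (1 - prob))                  # propensity score on the logit scale
      pooled_sd = np.sqrt((logit[y == 1].var() + logit[y == 0].var()) / 2)
      std_diff = (logit[y == 1].mean() - logit[y == 0].mean()) / pooled_sd
      print(f"standardized difference in propensity scores ~ {std_diff:.2f} SD")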

  17. Forecasting the brittle failure of heterogeneous, porous geomaterials

    NASA Astrophysics Data System (ADS)

    Vasseur, Jérémie; Wadsworth, Fabian; Heap, Michael; Main, Ian; Lavallée, Yan; Dingwell, Donald

    2017-04-01

    Heterogeneity develops in magmas during ascent and is dominated by the development of crystal and, importantly, bubble populations or pore-network clusters which grow, interact, localize, coalesce, outgas and resorb. Pore-scale heterogeneity is also ubiquitous in sedimentary basin fill during diagenesis. As a first step, we construct numerical simulations in 3D in which randomly generated heterogeneous and polydisperse spheres are placed in volumes and permitted to overlap with one another, designed to represent the random growth and interaction of bubbles in a liquid volume. We use these simulated geometries to show that statistical predictions of the inter-bubble lengthscales and evolving bubble surface area or cluster densities can be made based on fundamental percolation theory. As a second step, we take a range of well constrained random heterogeneous rock samples including sandstones, andesites, synthetic partially sintered glass bead samples, and intact glass samples and subject them to a variety of stress loading conditions at a range of temperatures until failure. We record in real time the evolution of the number of acoustic events that precede failure and show that in all scenarios, the acoustic event rate accelerates toward failure, consistent with previous findings. Applying tools designed to forecast the failure time based on these precursory signals, we constrain the absolute error on the forecast time. We find that for all sample types, the error associated with an accurate forecast of failure scales non-linearly with the lengthscale between the pore clusters in the material. Moreover, using a simple micromechanical model for the deformation of porous elastic bodies, we show that the ratio of the equilibrium sub-critical crack length emanating from the pore clusters to the inter-pore lengthscale provides a scaling for the error in forecast accuracy. Thus for the first time we provide a potential quantitative correction for forecasting the failure of porous brittle solids that build the Earth's crust.
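
    One standard way to turn accelerating precursory event rates into a failure-time forecast is the inverse-rate method: when the rate grows roughly as 1/(t_f - t), the inverse rate falls linearly and its extrapolation to zero estimates t_f. The Python sketch below uses synthetic data under that assumption; it is not the authors' forecasting tool or laboratory record.

      import numpy as np

      rng = np.random.default_rng(0)
      t_fail = 100.0                                              # "true" failure time of the synthetic run
      t = np.arange(1.0, 96.0)                                    # observation window before failure
      rate = 50.0 / (t_fail - t) + rng.normal(0, 0.05, t.size)    # accelerating acoustic-event rate

      inv_rate = 1.0 / rate
      slope, intercept = np.polyfit(t, inv_rate, 1)               # linear trend of the inverse rate
      t_forecast = -intercept / slope                             # time at which 1/rate extrapolates to zero
      print(f"forecast failure time ~ {t_forecast:.1f} (true value {t_fail})")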

  18. Evaluation of different approaches for identifying optimal sites to predict mean hillslope soil moisture content

    NASA Astrophysics Data System (ADS)

    Liao, Kaihua; Zhou, Zhiwen; Lai, Xiaoming; Zhu, Qing; Feng, Huihui

    2017-04-01

    The identification of representative soil moisture sampling sites is important for the validation of remotely sensed mean soil moisture in a certain area and ground-based soil moisture measurements in catchment or hillslope hydrological studies. Numerous approaches have been developed to identify optimal sites for predicting mean soil moisture. Each method has certain advantages and disadvantages, but they have rarely been evaluated and compared. In our study, surface (0-20 cm) soil moisture data from January 2013 to March 2016 (a total of 43 sampling days) were collected at 77 sampling sites on a mixed land-use (tea and bamboo) hillslope in the hilly area of Taihu Lake Basin, China. A total of 10 methods (temporal stability (TS) analyses based on 2 indices, K-means clustering based on 6 kinds of inputs and 2 random sampling strategies) were evaluated for determining optimal sampling sites for mean soil moisture estimation. They were TS analyses based on the smallest index of temporal stability (ITS, a combination of the mean relative difference and standard deviation of relative difference (SDRD)) and based on the smallest SDRD, K-means clustering based on soil properties and terrain indices (EFs), repeated soil moisture measurements (Theta), EFs plus one-time soil moisture data (EFsTheta), and the principal components derived from EFs (EFs-PCA), Theta (Theta-PCA), and EFsTheta (EFsTheta-PCA), and global and stratified random sampling strategies. Results showed that the TS analysis based on the smallest ITS was better (RMSE = 0.023 m3 m-3) than that based on the smallest SDRD (RMSE = 0.034 m3 m-3). The K-means clustering based on EFsTheta (-PCA) was better (RMSE <0.020 m3 m-3) than those based on EFs (-PCA) and Theta (-PCA). The sampling design stratified by the land use was more efficient than the global random method. Forty and 60 sampling sites are needed for stratified sampling and global sampling, respectively, to make their performances comparable to the best K-means method (EFsTheta-PCA). Overall, TS required only one site, but its accuracy was limited. The best K-means method required <8 sites and yielded high accuracy, but extra soil and terrain information is necessary when using this method. The stratified sampling strategy can only be used if no pre-knowledge about soil moisture variation is available. This information will help in selecting the optimal methods for estimating the area mean soil moisture.
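
    The K-means strategy evaluated here boils down to clustering the sites on covariates and/or repeated moisture observations, sampling the site nearest each cluster centre, and forming a cluster-size-weighted mean. The Python sketch below does this on a synthetic hillslope; the covariates, cluster count and moisture model are placeholders, not the Taihu Lake Basin data.

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      n_sites, n_surveys = 77, 43
      covariates = rng.normal(size=(n_sites, 4))                           # stand-ins for soil/terrain "EFs"
      theta = 0.30 + 0.05 * covariates[:, :1] + rng.normal(0, 0.02, (n_sites, n_surveys))
      features = np.hstack([covariates, theta])                            # an "EFs + Theta" style input

      km = KMeans(n_clusters=6, n_init=10, random_state=0).fit(features)
      reps = []                                                            # site closest to each cluster centre
      for c in range(km.n_clusters):
          idx = np.where(km.labels_ == c)[0]
          d = np.linalg.norm(features[idx] - km.cluster_centers_[c], axis=1)
          reps.append(idx[np.argmin(d)])

      weights = np.bincount(km.labels_) / n_sites                          # cluster-size weights
      est_mean = np.sum(weights[:, None] * theta[reps], axis=0)            # estimated hillslope mean per survey
      rmse = np.sqrt(np.mean((est_mean - theta.mean(axis=0)) ** 2))
      print(f"RMSE of the 6-site estimate: {rmse:.4f} m3 m-3")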

  19. Quantum Entanglement in Neural Network States

    NASA Astrophysics Data System (ADS)

    Deng, Dong-Ling; Li, Xiaopeng; Das Sarma, S.

    2017-04-01

    Machine learning, one of today's most rapidly growing interdisciplinary fields, promises an unprecedented perspective for solving intricate quantum many-body problems. Understanding the physical aspects of the representative artificial neural-network states has recently become highly desirable in the applications of machine-learning techniques to quantum many-body physics. In this paper, we explore the data structures that encode the physical features in the network states by studying the quantum entanglement properties, with a focus on the restricted-Boltzmann-machine (RBM) architecture. We prove that the entanglement entropy of all short-range RBM states satisfies an area law for arbitrary dimensions and bipartition geometry. For long-range RBM states, we show by using an exact construction that such states could exhibit volume-law entanglement, implying a notable capability of RBM in representing quantum states with massive entanglement. Strikingly, the neural-network representation for these states is remarkably efficient, in the sense that the number of nonzero parameters scales only linearly with the system size. We further examine the entanglement properties of generic RBM states by randomly sampling the weight parameters of the RBM. We find that their averaged entanglement entropy obeys volume-law scaling and at the same time deviates strongly from the Page entropy of completely random pure states. We show that their entanglement spectrum has no universal part associated with random matrix theory and exhibits Poisson-type level statistics. Using reinforcement learning, we demonstrate that RBM is capable of finding the ground state (with power-law entanglement) of a model Hamiltonian with a long-range interaction. In addition, we show, through a concrete example of the one-dimensional symmetry-protected topological cluster states, that the RBM representation may also be used as a tool to analytically compute the entanglement spectrum. Our results uncover the unparalleled power of artificial neural networks in representing quantum many-body states regardless of how much entanglement they possess, which paves a novel way to bridge computer-science-based machine-learning techniques to outstanding quantum condensed-matter physics problems.

  20. Efficacy of a Mandibular Advancement Appliance on Sleep Disordered Breathing in Children: A Study Protocol of a Crossover Randomized Controlled Trial.

    PubMed

    Idris, Ghassan; Galland, Barbara; Robertson, Christopher J; Farella, Mauro

    2016-01-01

    Sleep-Disordered Breathing (SDB) varies from habitual snoring to partial or complete obstruction of the upper airway and can be found in up to 10% of children. SDB can significantly affect children's wellbeing, as it can cause growth disorders, educational and behavioral problems, and even life-threatening conditions, such as cardiorespiratory failure. Adenotonsillectomy represents the primary treatment for pediatric SDB where adeno-tonsillar hypertrophy is indicated. For those with craniofacial anomalies, or for whom adenotonsillectomy or other treatment modalities have failed, or surgery is contra-indicated, mandibular advancement splints (MAS) may represent a viable treatment option. Whilst the efficacy of these appliances has been consistently demonstrated in adults, there is little information about their effectiveness in children. To determine the efficacy of mandibular advancement appliances for the management of SDB and related health problems in children. The study will be designed as a single-blind crossover randomized controlled trial with administration of both an "Active MAS" (Twin-block) and a "Sham MAS." Eligible participants will be children aged 8-12 years whose parents report they snore ≥3 nights per week. Sixteen children will enter the full study after confirming other inclusion criteria, particularly Skeletal class I or class II confirmed by lateral cephalometric radiograph. Each child will be randomly assigned to either a treatment sequence starting with the Active or the Sham MAS. Participants will wear the appliances for 3 weeks separated by a 2-week washout period. For each participant, home-based polysomnographic data will be collected four times; once before and once after each treatment period. The Apnea Hypopnea Index (AHI) will represent the main outcome variable. Secondary outcomes will include, snoring frequency, masseter muscle activity, sleep symptoms, quality of life, daytime sleepiness, children behavior, and nocturnal enuresis. In addition, blood samples will be collected to assess growth hormone changes. This study was registered in the Australian New Zealand Clinical Trials Registry (ANZCTR): [ACTRN12614001013651].

  1. Advance notification letters increase adherence in colorectal cancer screening: a population-based randomized trial.

    PubMed

    van Roon, A H C; Hol, L; Wilschut, J A; Reijerink, J C I Y; van Vuuren, A J; van Ballegooijen, M; Habbema, J D F; van Leerdam, M E; Kuipers, Ernst J

    2011-06-01

    The population benefit of screening depends not only on the effectiveness of the test, but also on adherence, which, for colorectal cancer (CRC) screening, remains low. An advance notification letter may increase adherence; however, no population-based randomized trials have been conducted to provide evidence of this. In 2008, a representative sample of the Dutch population (aged 50-74 years) was randomized. All 2493 invitees in group A were sent an advance notification letter, followed two weeks later by a standard invitation. The 2507 invitees in group B only received the standard invitation. Non-respondents in both groups were sent a reminder 6 weeks after the invitation. The advance notification letters resulted in significantly higher adherence (64.4% versus 61.1%, p-value 0.019). Multivariate logistic regression analysis showed no significant interactions between group and age, sex, or socio-economic status. Cost analysis showed that the incremental cost per additional detected advanced neoplasia due to sending an advance notification letter was € 957. This population-based randomized trial demonstrates that sending an advance notification letter significantly increases adherence by 3.3%. The incremental cost per additional detected advanced neoplasia is acceptable. We therefore recommend that such letters be incorporated within the standard CRC-screening invitation process. Copyright © 2011 Elsevier Inc. All rights reserved.
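
    The headline comparison, 64.4% versus 61.1% adherence in two arms of roughly 2500 invitees each, is a two-proportion z-test; reconstructing the counts from the reported percentages gives a p-value close to the one quoted. A minimal check in Python:

      import math
      from scipy.stats import norm

      n1, n2 = 2493, 2507                              # invitees per arm, as reported
      x1, x2 = round(0.644 * n1), round(0.611 * n2)    # adherent counts reconstructed from the percentages
      p1, p2 = x1 / n1, x2 / n2
      p_pool = (x1 + x2) / (n1 + n2)
      se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
      z = (p1 - p2) / se
      print(f"z = {z:.2f}, two-sided p = {2 * norm.sf(abs(z)):.3f}")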

  2. Evaluation of physical activity interventions in children via the reach, efficacy/effectiveness, adoption, implementation, and maintenance (RE-AIM) framework: A systematic review of randomized and non-randomized trials.

    PubMed

    McGoey, Tara; Root, Zach; Bruner, Mark W; Law, Barbi

    2016-01-01

    Existing reviews of physical activity (PA) interventions designed to increase PA behavior exclusively in children (ages 5 to 11years) focus primarily on the efficacy (e.g., internal validity) of the interventions without addressing the applicability of the results in terms of generalizability and translatability (e.g., external validity). This review used the RE-AIM (Reach, Efficacy/Effectiveness, Adoption, Implementation, Maintenance) framework to measure the degree to which randomized and non-randomized PA interventions in children report on internal and external validity factors. A systematic search for controlled interventions conducted within the past 12years identified 78 studies that met the inclusion criteria. Based on the RE-AIM criteria, most of the studies focused on elements of internal validity (e.g., sample size, intervention location and efficacy/effectiveness) with minimal reporting of external validity indicators (e.g., representativeness of participants, start-up costs, protocol fidelity and sustainability). Results of this RE-AIM review emphasize the need for future PA interventions in children to report on real-world challenges and limitations, and to highlight considerations for translating evidence-based results into health promotion practice. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Handling Correlations between Covariates and Random Slopes in Multilevel Models

    ERIC Educational Resources Information Center

    Bates, Michael David; Castellano, Katherine E.; Rabe-Hesketh, Sophia; Skrondal, Anders

    2014-01-01

    This article discusses estimation of multilevel/hierarchical linear models that include cluster-level random intercepts and random slopes. Viewing the models as structural, the random intercepts and slopes represent the effects of omitted cluster-level covariates that may be correlated with included covariates. The resulting correlations between…

  4. Task shifting of frontline community health workers for cardiovascular risk reduction: design and rationale of a cluster randomised controlled trial (DISHA study) in India.

    PubMed

    Jeemon, Panniyammakal; Narayanan, Gitanjali; Kondal, Dimple; Kahol, Kashvi; Bharadwaj, Ashok; Purty, Anil; Negi, Prakash; Ladhani, Sulaiman; Sanghvi, Jyoti; Singh, Kuldeep; Kapoor, Deksha; Sobti, Nidhi; Lall, Dorothy; Manimunda, Sathyaprakash; Dwivedi, Supriya; Toteja, Gurudyal; Prabhakaran, Dorairaj

    2016-03-15

    Effective task-shifting interventions targeted at reducing the global cardiovascular disease (CVD) epidemic in low and middle-income countries (LMICs) are urgently needed. DISHA is a cluster randomised controlled trial conducted across 10 sites (5 in phase 1 and 5 in phase 2) in India in 120 clusters. At each site, 12 clusters were randomly selected from a district. A cluster is defined as a small village with 250-300 households and well defined geographical boundaries. They were then randomly allocated to intervention and control clusters in a 1:1 allocation sequence. If any of the intervention and control clusters were <10 km apart, one was dropped and replaced with another randomly selected cluster from the same district. The study included a representative baseline cross-sectional survey, development of a structured intervention model, delivery of intervention for a minimum period of 18 months by trained frontline health workers (mainly Anganwadi workers and ASHA workers) and a post intervention survey in a representative sample. The study staff had no information on intervention allocation until the completion of the baseline survey. In order to ensure comparability of data across sites, the DISHA study follows a common protocol and manual of operation with standardized measurement techniques. Our study is the largest community based cluster randomised trial in low and middle-income country settings designed to test the effectiveness of 'task shifting' interventions involving frontline health workers for cardiovascular risk reduction. CTRI/2013/10/004049 . Registered 7 October 2013.

  5. Randomized controlled trials of simulation-based interventions in Emergency Medicine: a methodological review.

    PubMed

    Chauvin, Anthony; Truchot, Jennifer; Bafeta, Aida; Pateron, Dominique; Plaisance, Patrick; Yordanov, Youri

    2018-04-01

    The number of trials assessing Simulation-Based Medical Education (SBME) interventions has rapidly expanded. Many studies show that potential flaws in the design, conduct and reporting of randomized controlled trials (RCTs) can bias their results. We conducted a methodological review of RCTs assessing SBME interventions in Emergency Medicine (EM) and examined their methodological characteristics. We searched MEDLINE via PubMed for RCTs that assessed a simulation intervention in EM, published in 6 general and internal medicine journals and in the top 10 EM journals. The Cochrane Collaboration risk of bias tool was used to assess risk of bias, intervention reporting was evaluated based on the "template for intervention description and replication" checklist, and methodological quality was evaluated with the Medical Education Research Study Quality Instrument (MERSQI). Report selection and data extraction were done by 2 independent researchers. From 1394 RCTs screened, 68 trials assessed an SBME intervention. They represent one quarter of our sample. Cardiopulmonary resuscitation (CPR) was the most frequent topic (81%). Random sequence generation and allocation concealment were performed correctly in 66 and 49% of trials. Blinding of participants and assessors was performed correctly in 19 and 68%. Risk of attrition bias was low in three-quarters of the studies (n = 51). Risk of selective reporting bias was unclear in nearly all studies. The mean MERSQI score was 13.4/18; 4% of the reports provided a description allowing replication of the intervention. Trials assessing simulation represent one quarter of RCTs in EM. Their quality remains unclear, and reproducing the interventions appears challenging due to reporting issues.

  6. Determination of the influence of dispersion pattern of pesticide-resistant individuals on the reliability of resistance estimates using different sampling plans.

    PubMed

    Shah, R; Worner, S P; Chapman, R B

    2012-10-01

    Pesticide resistance monitoring includes resistance detection and subsequent documentation/measurement. Resistance detection would require at least one (≥1) resistant individual to be present in a sample to initiate management strategies. Resistance documentation, on the other hand, would attempt to estimate the entire population (≥90%) of resistant individuals. A computer simulation model was used to compare the efficiency of simple random and systematic sampling plans to detect resistant individuals and to document their frequencies when the resistant individuals were randomly or patchily distributed. A patchy dispersion pattern of resistant individuals influenced the sampling efficiency of systematic sampling plans, while the efficiency of random sampling was independent of such patchiness. When resistant individuals were randomly distributed, sample sizes required to detect at least one resistant individual (resistance detection) with a probability of 0.95 were 300 (1%) and 50 (10% and 20%); whereas, when resistant individuals were patchily distributed, using systematic sampling, sample sizes required for such detection were 6000 (1%), 600 (10%) and 300 (20%). Sample sizes of 900 and 400 would be required to detect ≥90% of resistant individuals (resistance documentation) with a probability of 0.95 when resistant individuals were randomly dispersed and present at a frequency of 10% and 20%, respectively; whereas, when resistant individuals were patchily distributed, using systematic sampling, sample sizes of 3000 and 1500, respectively, were necessary. Small sample sizes either underestimated or overestimated the resistance frequency. A simple random sampling plan is, therefore, recommended for insecticide resistance detection and subsequent documentation.
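
    The detection sample sizes quoted for randomly distributed resistance follow from a standard closed-form calculation: the smallest n such that a simple random sample contains at least one resistant individual with probability 0.95. The sketch below (not from the paper) illustrates it; the published figures of 300 and 50 presumably reflect the discrete sample sizes examined in the simulation.

```python
# Smallest n with P(at least one resistant individual) >= prob,
# assuming random (binomial) dispersion of resistant individuals.
from math import ceil, log

def detection_sample_size(freq, prob=0.95):
    """Return the smallest n such that 1 - (1 - freq)**n >= prob."""
    return ceil(log(1.0 - prob) / log(1.0 - freq))

for freq in (0.01, 0.10, 0.20):
    print(f"frequency {freq:.0%}: n = {detection_sample_size(freq)}")
# frequency 1%: n = 299; frequency 10%: n = 29; frequency 20%: n = 14
```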

  7. Investigating the Randomness of Numbers

    ERIC Educational Resources Information Center

    Pendleton, Kenn L.

    2009-01-01

    The use of random numbers is pervasive in today's world. Random numbers have practical applications in such far-flung arenas as computer simulations, cryptography, gambling, the legal system, statistical sampling, and even the war on terrorism. Evaluating the randomness of extremely large samples is a complex, intricate process. However, the…

  8. Recruitment of Older Adults: Success May Be in the Details

    PubMed Central

    McHenry, Judith C.; Insel, Kathleen C.; Einstein, Gilles O.; Vidrine, Amy N.; Koerner, Kari M.; Morrow, Daniel G.

    2015-01-01

    Purpose: Describe recruitment strategies used in a randomized clinical trial of a behavioral prospective memory intervention to improve medication adherence for older adults taking antihypertensive medication. Results: Recruitment strategies represent 4 themes: accessing an appropriate population, communication and trust-building, providing comfort and security, and expressing gratitude. Recruitment activities resulted in 276 participants with a mean age of 76.32 years, and study enrollment included 207 women, 69 men, and 54 persons representing ethnic minorities. Recruitment success was linked to cultivating relationships with community-based organizations, face-to-face contact with potential study participants, and providing service (e.g., blood pressure checks) as an access point to eligible participants. Seventy-two percent of potential participants who completed a follow-up call and met eligibility criteria were enrolled in the study. The attrition rate was 14.34%. Implications: The projected increase in the number of older adults intensifies the need to study interventions that improve health outcomes. The challenge is to recruit sufficient numbers of participants who are also representative of older adults to test these interventions. Failing to recruit a sufficient and representative sample can compromise statistical power and the generalizability of study findings. PMID:22899424

  9. Who Governs Federally Qualified Health Centers?

    PubMed Central

    Wright, Brad

    2017-01-01

    To make them more responsive to their community’s needs, federally qualified health centers (FQHCs) are required to have a governing board comprised of at least 51% consumers. However, the extent to which consumer board members actually resemble the typical FQHC patient has not been assessed, which according to the political science literature on representation may influence the board’s ability to represent the community. This mixed-methods study uses four years of data from the Health Resources and Services Administration, combined with Uniform Data System, Bureau of Labor Statistics, and Area Resource File data to describe and identify factors associated with the composition of FQHC governing boards. Board members are classified into one of three groups: non-consumers, non-representative consumers (who do not resemble the typical FQHC patient), and representative consumers (who resemble the typical FQHC patient). The analysis finds that a minority of board members are representative consumers, and telephone interviews with a stratified random sample of 30 FQHC board members confirmed the existence of significant socioeconomic gaps between consumer board members and FQHC patients. This may make FQHCs less responsive to the needs of the predominantly low-income communities they serve. PMID:23052684

  10. Authors of clinical trials reported individual and financial conflicts of interest more frequently than institutional and nonfinancial ones: a methodological survey.

    PubMed

    Hakoum, Maram B; Jouni, Nahla; Abou-Jaoude, Eliane A; Hasbani, Divina Justina; Abou-Jaoude, Elias A; Lopes, Luciane Cruz; Khaldieh, Mariam; Hammoud, Mira Z; Al-Gibbawi, Mounir; Anouti, Sirine; Guyatt, Gordon; Akl, Elie A

    2017-07-01

    Conflicts of interest (COIs) are increasingly recognized as important to disclose and manage in health research. The objective of this study was to assess the reporting of both financial and nonfinancial COI by authors of randomized controlled trials published in a representative sample of clinical journals. We searched Ovid Medline and included a random sample of 200 randomized controlled trials published in 2015 in one of the 119 Core Clinical Journals. We classified COI using a comprehensive framework that includes the following: individual COIs (financial, professional, scholarly, advocatory, personal) and institutional COIs (financial, professional, scholarly, and advocatory). We conducted descriptive and regression analyses. Of the 200 randomized controlled trials, 188 (94%) reported authors' COI disclosures that were available in the main document (92%) and as International Committee of Medical Journal Editors forms accessible online (12%). Of the 188 trials, 57% had at least one author reporting at least one COI; in all these trials, at least one author reported financial COI. Institutional COIs (11%) and nonfinancial COIs (4%) were less commonly reported. References to COI disclosure statements for editors (1%) and medical writers (0%) were seldom present. Regression analyses showed positive associations between reporting individual financial COI and higher journal impact factor (odds ratio [OR] = 1.06, 95% confidence interval [CI] = 1.02-1.10), larger number of authors (OR = 1.10, 95% CI 1.02-1.20), affiliation with an institution from a high-income country (OR = 16.75, 95% CI 3.38-82.87), and trials reporting on pharmacological interventions (OR = 2.28, 95% CI 1.13-4.62). More than half of published randomized controlled trials report that at least one author has a COI. Trial authors report financial COIs more often than nonfinancial COIs and individual COIs more frequently than institutional COIs. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. Pollution gets personal! A first population-based human biomonitoring study in Austria.

    PubMed

    Hohenblum, Philipp; Steinbichl, Philipp; Raffesberg, Wolfgang; Weiss, Stefan; Moche, Wolfgang; Vallant, Birgit; Scharf, Sigrid; Haluza, Daniela; Moshammer, Hanns; Kundi, Michael; Piegler, Brigitte; Wallner, Peter; Hutter, Hans-Peter

    2012-02-01

    Humans are exposed to a broad variety of man-made chemicals. Human biomonitoring (HBM) data reveal the individual body burden irrespective of sources and routes of uptake. A first population-based study was started in Austria in 2008 and was finished at the end of May 2011. This cross-sectional study aims at documenting the extent, the distribution and the determinants of human exposure to industrial chemicals, as well as proving the feasibility of a representative HBM study. Overall, 150 volunteers (50 families) were selected by stratified random sampling. Exposure to phthalates, trisphosphates, polybrominated diphenyl ethers (PBDE), bisphenol A (along with nonyl- and octyl phenol) and methyl mercury was assessed. Sixteen of the 18 PBDE congeners determined were detected above the limit of quantification (LOQ) in blood samples, with #153 and #197 being the most abundant species. Bisphenol A in urine was measured in a subsample of 25, with only 4 samples found above the LOQ. In 3 of 100 urine samples, at least one of the 8 trisphosphate compounds assessed was above the LOQ. These first analytical results of the human biomonitoring data show that the body burden of the Austrian population with respect to the assessed compounds is comparable to, or even lower than, that in other European countries. Overall, the study revealed that, in order to develop a feasible protocol for representative human biomonitoring studies, procedures have to be optimized to allow for non-invasive sampling of body tissues in accordance with the main metabolic pathways. Participant recruitment was, however, labor intensive and has to be improved. Copyright © 2011 Elsevier GmbH. All rights reserved.

  12. Subsampling for dataset optimisation

    NASA Astrophysics Data System (ADS)

    Ließ, Mareike

    2017-04-01

    Soil-landscapes have formed by the interaction of soil-forming factors and pedogenic processes. In modelling these landscapes in their pedodiversity and the underlying processes, a representative unbiased dataset is required. This concerns model input as well as output data. However, very often big datasets are available which are highly heterogeneous and were gathered for various purposes, but not to model a particular process or data space. As a first step, the overall data space and/or landscape section to be modelled needs to be identified including considerations regarding scale and resolution. Then the available dataset needs to be optimised via subsampling to well represent this n-dimensional data space. A couple of well-known sampling designs may be adapted to suit this purpose. The overall approach follows three main strategies: (1) the data space may be condensed and de-correlated by a factor analysis to facilitate the subsampling process. (2) Different methods of pattern recognition serve to structure the n-dimensional data space to be modelled into units which then form the basis for the optimisation of an existing dataset through a sensible selection of samples. Along the way, data units for which there is currently insufficient soil data available may be identified. And (3) random samples from the n-dimensional data space may be replaced by similar samples from the available dataset. While being a presupposition to develop data-driven statistical models, this approach may also help to develop universal process models and identify limitations in existing models.
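
    One way to realise the pattern-recognition strategy sketched above is to cluster the available dataset in its (optionally factor-reduced) data space and keep the observation closest to each cluster centre. The following Python sketch uses scikit-learn's KMeans for that purpose; the dataset, its size and the number of clusters are hypothetical.

```python
# Subsample a large dataset so that it covers the n-dimensional data space:
# cluster the data and keep the nearest real observation to each cluster centre.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
dataset = rng.normal(size=(10000, 6))   # stand-in for a big, heterogeneous dataset

n_subsample = 200
km = KMeans(n_clusters=n_subsample, n_init=10, random_state=0).fit(dataset)
distances = km.transform(dataset)       # distance of every sample to every centre
representatives = np.unique(distances.argmin(axis=0))
subsample = dataset[representatives]    # roughly space-filling subset of the data
print(subsample.shape)
```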

  13. IndeCut evaluates performance of network motif discovery algorithms.

    PubMed

    Ansariola, Mitra; Megraw, Molly; Koslicki, David

    2018-05-01

    Genomic networks represent a complex map of molecular interactions which are descriptive of the biological processes occurring in living cells. Identifying the small over-represented circuitry patterns in these networks helps generate hypotheses about the functional basis of such complex processes. Network motif discovery is a systematic way of achieving this goal. However, a reliable network motif discovery outcome requires generating random background networks which are the result of a uniform and independent graph sampling method. To date, there has been no method to numerically evaluate whether any network motif discovery algorithm performs as intended on realistically sized datasets-thus it was not possible to assess the validity of resulting network motifs. In this work, we present IndeCut, the first method to date that characterizes network motif finding algorithm performance in terms of uniform sampling on realistically sized networks. We demonstrate that it is critical to use IndeCut prior to running any network motif finder for two reasons. First, IndeCut indicates the number of samples needed for a tool to produce an outcome that is both reproducible and accurate. Second, IndeCut allows users to choose the tool that generates samples in the most independent fashion for their network of interest among many available options. The open source software package is available at https://github.com/megrawlab/IndeCut. megrawm@science.oregonstate.edu or david.koslicki@math.oregonstate.edu. Supplementary data are available at Bioinformatics online.

  14. The Saving and Empowering Young Lives in Europe (SEYLE) Randomized Controlled Trial (RCT): methodological issues and participant characteristics

    PubMed Central

    2013-01-01

    Background Mental health problems and risk behaviours among young people are of great public health concern. Consequently, within the VII Framework Programme, the European Commission funded the Saving and Empowering Young Lives in Europe (SEYLE) project. This Randomized Controlled Trial (RCT) was conducted in eleven European countries, with Sweden as the coordinating centre, and was designed to identify an effective way to promote mental health and reduce suicidality and risk-taking behaviours among adolescents. Objective To describe the methodological and field procedures in the SEYLE RCT among adolescents, as well as to present the main characteristics of the recruited sample. Methods Analyses were conducted to determine: 1) representativeness of study sites compared to respective national data; 2) response rate of schools and pupils, and drop-out rates from baseline to the 3- and 12-month follow-ups; 3) comparability of samples among the four Intervention Arms; 4) properties of the standard scales employed: Beck Depression Inventory, Second Edition (BDI-II), Zung Self-Rating Anxiety Scale (Z-SAS), Strengths and Difficulties Questionnaire (SDQ), World Health Organization Well-Being Scale (WHO-5). Results Participants at baseline comprised 12,395 adolescents (M/F: 5,529/6,799; mean age=14.9±0.9) from Austria, Estonia, France, Germany, Hungary, Ireland, Israel, Italy, Romania, Slovenia and Spain. At the 3- and 12-month follow-ups, participation rates were 87.3% and 79.4%, respectively. Demographic characteristics of participating sites were found to be reasonably representative of their respective national populations. Overall response rate of schools was 67.8%. All scales utilised in the study had good to very good internal reliability, as measured by Cronbach’s alpha (BDI-II: 0.864; Z-SAS: 0.805; SDQ: 0.740; WHO-5: 0.799). Conclusions SEYLE achieved its objective of recruiting a large representative sample of adolescents within participating European countries. Analysis of SEYLE data will shed light on the effectiveness of important interventions aimed at improving adolescent mental health and well-being, reducing risk-taking and self-destructive behaviour and preventing suicidality. Trial registration US National Institute of Health (NIH) clinical trial registry (NCT00906620) and the German Clinical Trials Register (DRKS00000214). PMID:23679917

  15. RECAL: A Computer Program for Selecting Sample Days for Recreation Use Estimation

    Treesearch

    D.L. Erickson; C.J. Liu; H. Ken Cordell; W.L. Chen

    1980-01-01

    Recreation Calendar (RECAL) is a computer program in PL/I for drawing a sample of days for estimating recreation use. With RECAL, a sampling period of any length may be chosen; simple random, stratified random, and factorial designs can be accommodated. The program randomly allocates days to strata and locations.
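
    The allocation step that RECAL automates can be sketched in a few lines of modern code. The example below (hypothetical strata and allocation, not the original PL/I program) draws a stratified random sample of days from a one-month sampling period.

```python
# Stratified random selection of sample days (weekday/weekend strata assumed).
import random
from datetime import date, timedelta

random.seed(1980)
period = [date(2024, 7, 1) + timedelta(days=i) for i in range(31)]
strata = {
    "weekday": [d for d in period if d.weekday() < 5],
    "weekend": [d for d in period if d.weekday() >= 5],
}
allocation = {"weekday": 6, "weekend": 4}   # hypothetical days per stratum
sample_days = {name: sorted(random.sample(days, allocation[name]))
               for name, days in strata.items()}
print(sample_days)
```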

  16. Sample Selection in Randomized Experiments: A New Method Using Propensity Score Stratified Sampling

    ERIC Educational Resources Information Center

    Tipton, Elizabeth; Hedges, Larry; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Caverly, Sarah

    2014-01-01

    Randomized experiments are often seen as the "gold standard" for causal research. Despite the fact that experiments use random assignment to treatment conditions, units are seldom selected into the experiment using probability sampling. Very little research on experimental design has focused on how to make generalizations to well-defined…

  17. [Regional and individual factors of stress experience in Germany: results of a representative survey with the perceived stress questionnaire (PSQ)].

    PubMed

    Kocalevent, R-D; Hinz, A; Brähler, E; Klapp, B F

    2011-12-01

    The aim of the present study was to determine, in addition to prevalence data, regional and individual factors of stress experience in a representative sample of the German general population. Regional factors were examined separately by federal state and by the size of the political locality. Individual factors were defined according to the severity of the stress experience as well as on the basis of central social factors such as marital status, profession and income. The Perceived Stress Questionnaire (PSQ), a validated self-report instrument recording subjective frequency estimates of stress experiences, was used. Data acquisition was carried out by a market research institute using a multi-topic questionnaire (N=2,552). Households were selected by the random route procedure; target persons were also selected at random. The prevalence rate for an elevated stress experience was 14.5%, and that for a very high stress experience was 3.1% of the sample. People without education exhibited the highest rates of stress experience (36.8%), followed by the unemployed (30.6%). Individual and social factors that favour an increased stress experience are a subjectively poor state of health (OR: 3.42) or belonging to a lower socioeconomic status group (OR: 1.30). Furthermore, there are indications of regional factors such as the size of the locality as well as differences between the individual federal states. An east-west comparison did not show any significant differences with regard to stress experiences. In the light of the illness burden associated with chronic stress, preventative measures in cases of unemployment or low level of education should be given priority. © Georg Thieme Verlag KG Stuttgart · New York.

  18. Intersectionality takes it to the streets: Mobilizing across diverse interests for the Women’s March

    PubMed Central

    Fisher, Dana R.; Dow, Dawn M.; Ray, Rashawn

    2017-01-01

    Can a diverse crowd of individuals whose interests focus on distinct issues related to racial identity, class, gender, and sexuality mobilize around a shared issue? If so, how does this process work in practice? To date, limited research has explored intersectionality as a mobilization tool for social movements. This paper unpacks how intersectionality influences the constituencies represented in one of the largest protests ever observed in the United States: the Women’s March on Washington in January 2017. Analyzing a data set collected from a random sample of participants, we explore how social identities influenced participation in the Women’s March. Our analysis demonstrates how individuals’ motivations to participate represented an intersectional set of issues and how coalitions of issues emerge. We conclude by discussing how these coalitions enable us to understand and predict the future of the anti-Trump resistance. PMID:28948230

  19. Intersectionality takes it to the streets: Mobilizing across diverse interests for the Women's March.

    PubMed

    Fisher, Dana R; Dow, Dawn M; Ray, Rashawn

    2017-09-01

    Can a diverse crowd of individuals whose interests focus on distinct issues related to racial identity, class, gender, and sexuality mobilize around a shared issue? If so, how does this process work in practice? To date, limited research has explored intersectionality as a mobilization tool for social movements. This paper unpacks how intersectionality influences the constituencies represented in one of the largest protests ever observed in the United States: the Women's March on Washington in January 2017. Analyzing a data set collected from a random sample of participants, we explore how social identities influenced participation in the Women's March. Our analysis demonstrates how individuals' motivations to participate represented an intersectional set of issues and how coalitions of issues emerge. We conclude by discussing how these coalitions enable us to understand and predict the future of the anti-Trump resistance.

  20. A Comparison of Techniques for Scheduling Fleets of Earth-Observing Satellites

    NASA Technical Reports Server (NTRS)

    Globus, Al; Crawford, James; Lohn, Jason; Pryor, Anna

    2003-01-01

    Earth observing satellite (EOS) scheduling is a complex real-world domain representative of a broad class of over-subscription scheduling problems. Over-subscription problems are those where requests for a facility exceed its capacity. These problems arise in a wide variety of NASA and terrestrial domains and are an important class of scheduling problems because such facilities often represent large capital investments. We have run experiments comparing multiple variants of the genetic algorithm, hill climbing, simulated annealing, squeaky wheel optimization and iterated sampling on two variants of a realistically-sized model of the EOS scheduling problem. These are implemented as permutation-based methods, that is, methods that search in the space of priority orderings of observation requests and evaluate each permutation by using it to drive a greedy scheduler. Simulated annealing performs best, and random mutation operators outperform our squeaky (more intelligent) operator. Furthermore, taking smaller steps towards the end of the search improves performance.

  1. [The state of quality management implementation in ambulatory care nursing and inpatient nursing].

    PubMed

    Farin, E; Hauer, J; Schmidt, E; Kottner, J; Jäckel, W H

    2013-02-01

    The demands being made on quality assurance and quality management in ambulatory care nursing and inpatient nursing facilities continue to grow. In contrast to health-care facilities such as hospitals and rehabilitation centres, we know of no empirical studies addressing the current state of quality management in nursing institutions. The aim of this investigation was, by means of a questionnaire, to analyse the current (as of spring 2011) dissemination of quality management and certification in nursing facilities using a random sample as representative as possible of in- and outpatient institutions. To obtain our sample we compiled 800 inpatient and 800 outpatient facilities as a stratified random sample. Federal state, holder and, for inpatient facilities, the number of beds were used as stratification variables. 24% of the questionnaires were returned, giving us information on 188 outpatient and 220 inpatient institutions. While the distribution in the sample of outpatient institutions is equivalent to the population distribution, we observed discrepancies in the inpatient facilities sample. As these do not seem to be related to any demonstrable bias, we assume that our data are sufficiently representative. Four out of five responding facilities claim to employ their own quality management system, although the degree to which quality management mechanisms are actually in use is an estimated 75%. Almost 90% of all the facilities have a quality management representative, who often possesses specific additional qualifications. Many relevant quality management instruments (i.e., nursing standards of care, questionnaires, quality circles) are used in 75% of the responding institutions. Various factors in our data give the impression that quality management and certification efforts have made more progress in the inpatient facilities. Although 80% of the outpatient institutions claim to have a quality management system, only 32.1% of them report having already been certified (or being in current preparation to be certified), a figure that was 41.5% among the inpatient facilities. These percentages are smaller when one relies on information provided by the certifying institutions themselves rather than on the nursing facilities. Most frequent is certification according to the DIN EN ISO 9001 standard, since the care-specific certification procedures most widespread on the market enable facilities to combine a care-specific certificate with one according to DIN norms. Quality management has become very widespread in nursing facilities: every third institution claims to have been certified, and the trend towards certification has clearly intensified over the last few years. We observe overall very great acceptance of both internal quality management and external quality assurance. We suspect that the current use of quality management instruments in many nursing facilities will not fall behind such efforts in hospitals and rehabilitation centres. © Georg Thieme Verlag KG Stuttgart · New York.

  2. Understanding and comparisons of different sampling approaches for the Fourier Amplitudes Sensitivity Test (FAST)

    PubMed Central

    Xu, Chonggang; Gertner, George

    2013-01-01

    Fourier Amplitude Sensitivity Test (FAST) is one of the most popular uncertainty and sensitivity analysis techniques. It uses a periodic sampling approach and a Fourier transformation to decompose the variance of a model output into partial variances contributed by different model parameters. Until now, the FAST analysis is mainly confined to the estimation of partial variances contributed by the main effects of model parameters, but does not allow for those contributed by specific interactions among parameters. In this paper, we theoretically show that FAST analysis can be used to estimate partial variances contributed by both main effects and interaction effects of model parameters using different sampling approaches (i.e., traditional search-curve based sampling, simple random sampling and random balance design sampling). We also analytically calculate the potential errors and biases in the estimation of partial variances. Hypothesis tests are constructed to reduce the effect of sampling errors on the estimation of partial variances. Our results show that compared to simple random sampling and random balance design sampling, sensitivity indices (ratios of partial variances to variance of a specific model output) estimated by search-curve based sampling generally have higher precision but larger underestimations. Compared to simple random sampling, random balance design sampling generally provides higher estimation precision for partial variances contributed by the main effects of parameters. The theoretical derivation of partial variances contributed by higher-order interactions and the calculation of their corresponding estimation errors in different sampling schemes can help us better understand the FAST method and provide a fundamental basis for FAST applications and further improvements. PMID:24143037

  3. Understanding and comparisons of different sampling approaches for the Fourier Amplitudes Sensitivity Test (FAST).

    PubMed

    Xu, Chonggang; Gertner, George

    2011-01-01

    Fourier Amplitude Sensitivity Test (FAST) is one of the most popular uncertainty and sensitivity analysis techniques. It uses a periodic sampling approach and a Fourier transformation to decompose the variance of a model output into partial variances contributed by different model parameters. Until now, the FAST analysis is mainly confined to the estimation of partial variances contributed by the main effects of model parameters, but does not allow for those contributed by specific interactions among parameters. In this paper, we theoretically show that FAST analysis can be used to estimate partial variances contributed by both main effects and interaction effects of model parameters using different sampling approaches (i.e., traditional search-curve based sampling, simple random sampling and random balance design sampling). We also analytically calculate the potential errors and biases in the estimation of partial variances. Hypothesis tests are constructed to reduce the effect of sampling errors on the estimation of partial variances. Our results show that compared to simple random sampling and random balance design sampling, sensitivity indices (ratios of partial variances to variance of a specific model output) estimated by search-curve based sampling generally have higher precision but larger underestimations. Compared to simple random sampling, random balance design sampling generally provides higher estimation precision for partial variances contributed by the main effects of parameters. The theoretical derivation of partial variances contributed by higher-order interactions and the calculation of their corresponding estimation errors in different sampling schemes can help us better understand the FAST method and provide a fundamental basis for FAST applications and further improvements.
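
    For readers unfamiliar with the search-curve sampling that FAST uses, the sketch below estimates first-order sensitivity indices for a toy additive model. It is an illustration only: the frequency set and harmonic order are chosen ad hoc (interference-free sets from the FAST literature should be used in practice), and the estimation bias discussed in the abstract applies here as well.

```python
# Classical FAST first-order indices via search-curve sampling (illustrative only).
import numpy as np

def fast_first_order(model, omegas, n_samples, max_harmonic=4):
    s = np.linspace(-np.pi, np.pi, n_samples, endpoint=False)
    # Search curve mapping s onto [0, 1] for each parameter (triangle-wave form).
    x = 0.5 + np.arcsin(np.sin(np.outer(s, omegas))) / np.pi
    y = model(x)
    harmonics = np.arange(1, (n_samples - 1) // 2 + 1)
    a = np.array([np.mean(y * np.cos(j * s)) for j in harmonics])
    b = np.array([np.mean(y * np.sin(j * s)) for j in harmonics])
    spectrum = a ** 2 + b ** 2
    total_variance = 2.0 * spectrum.sum()
    partial = [2.0 * sum(spectrum[p * w - 1] for p in range(1, max_harmonic + 1))
               for w in omegas]                 # variance at harmonics of each omega
    return np.array(partial) / total_variance

# Toy additive model: true indices are proportional to the squared coefficients.
model = lambda x: 4.0 * x[:, 0] + 2.0 * x[:, 1] + 1.0 * x[:, 2]
print(fast_first_order(model, omegas=[11, 21, 27], n_samples=513))
# Expected roughly (0.76, 0.19, 0.05) for coefficients 4, 2 and 1.
```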

  4. Statistical modeling of interfractional tissue deformation and its application in radiation therapy planning

    NASA Astrophysics Data System (ADS)

    Vile, Douglas J.

    In radiation therapy, interfraction organ motion introduces a level of geometric uncertainty into the planning process. Plans, which are typically based upon a single instance of anatomy, must be robust against daily anatomical variations. For this problem, a model of the magnitude, direction, and likelihood of deformation is useful. In this thesis, principal component analysis (PCA) is used to statistically model the 3D organ motion for 19 prostate cancer patients, each with 8-13 fractional computed tomography (CT) images. Deformable image registration and the resultant displacement vector fields (DVFs) are used to quantify the interfraction systematic and random motion. By applying the PCA technique to the random DVFs, principal modes of random tissue deformation were determined for each patient, and a method for sampling synthetic random DVFs was developed. The PCA model was then extended to describe the principal modes of systematic and random organ motion for the population of patients. A leave-one-out study tested both the systematic and random motion models' ability to represent PCA training set DVFs. The random and systematic DVF PCA models allowed the reconstruction of these data with absolute mean errors between 0.5-0.9 mm and 1-2 mm, respectively. To the best of the author's knowledge, this study is the first successful effort to build a fully 3D statistical PCA model of systematic tissue deformation in a population of patients. By sampling synthetic systematic and random errors, organ occupancy maps were created for bony and prostate-centroid patient setup processes. By thresholding these maps, a PCA-based planning target volume (PTV) was created and tested against conventional margin recipes (van Herk for bony alignment and a 5 mm fixed [3 mm posterior] margin for centroid alignment) in a virtual clinical trial for low-risk prostate cancer. Deformably accumulated delivered dose served as a surrogate for clinical outcome. For the bony landmark setup subtrial, the PCA PTV significantly (p<0.05) reduced D30, D20, and D5 to bladder and D50 to rectum, while increasing rectal D20 and D5. For the centroid-aligned setup, the PCA PTV significantly reduced all bladder DVH metrics and trended toward lower rectal toxicity metrics. All PTVs covered the prostate with the prescription dose.
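
    A minimal numerical sketch of the core idea, with randomly generated stand-in data in place of real displacement vector fields (DVFs): compute the principal modes of the per-fraction DVFs and draw synthetic random DVFs as the mean plus normally distributed scores along those modes.

```python
# PCA model of random interfraction deformation and synthetic DVF sampling (sketch).
import numpy as np

rng = np.random.default_rng(0)
n_fractions, n_voxels = 10, 5000                        # hypothetical sizes
dvfs = rng.normal(size=(n_fractions, 3 * n_voxels))     # stand-in for real x/y/z DVFs

mean_dvf = dvfs.mean(axis=0)
centered = dvfs - mean_dvf
_, singular_values, modes = np.linalg.svd(centered, full_matrices=False)
eigenvalues = singular_values ** 2 / (n_fractions - 1)  # variance of each mode

n_modes = 3                                             # keep the dominant modes
def sample_synthetic_dvf():
    """Mean DVF plus sum_k sqrt(lambda_k) * z_k * mode_k with z_k ~ N(0, 1)."""
    z = rng.normal(size=n_modes)
    return mean_dvf + (z * np.sqrt(eigenvalues[:n_modes])) @ modes[:n_modes]

synthetic_dvf = sample_synthetic_dvf()
print(synthetic_dvf.shape)
```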

  5. Isolation of leptospira Serovars Canicola and Copenhageni from cattle urine in the state of ParanÁ, Brazil

    PubMed Central

    Zacarias, Francielle Gibson da Silva; Vasconcellos, Silvio Arruda; Anzai, Eleine Kuroki; Giraldi, Nilson; de Freitas, Julio Cesar; Hartskeerl, Rudy

    2008-01-01

    In 2001, 698 urine samples were randomly collected from cattle at a slaughterhouse in the State of Paraná, Brazil. Direct examination using dark field microscopy was carried out immediately after collection. Five putative positive samples were cultured in modified EMJH medium, yielding two positive cultures (LO-14 and LO-10). Typing with monoclonal antibodies revealed that the two isolates were similar to Canicola (LO-14) and Copenhageni (LO-10). Microscopic agglutination test results show that Hardjo is the most common serovar in cattle in Brazil. Rats and dogs are the common maintenance hosts of serovars Copenhageni and Canicola. The excretion of highly pathogenic serovars such as Copenhageni and Canicola by cattle can represent an increasing risk of severe leptospirosis in large populations, mainly those living in rural areas. PMID:24031301

  6. Understanding the Effects of Sampling on Healthcare Risk Modeling for the Prediction of Future High-Cost Patients

    NASA Astrophysics Data System (ADS)

    Moturu, Sai T.; Liu, Huan; Johnson, William G.

    Rapidly rising healthcare costs represent one of the major issues plaguing the healthcare system. Data from the Arizona Health Care Cost Containment System, Arizona's Medicaid program provide a unique opportunity to exploit state-of-the-art machine learning and data mining algorithms to analyze data and provide actionable findings that can aid cost containment. Our work addresses specific challenges in this real-life healthcare application with respect to data imbalance in the process of building predictive risk models for forecasting high-cost patients. We survey the literature and propose novel data mining approaches customized for this compelling application with specific focus on non-random sampling. Our empirical study indicates that the proposed approach is highly effective and can benefit further research on cost containment in the healthcare industry.

  7. Shoulder strength value differences between genders and age groups.

    PubMed

    Balcells-Diaz, Eudald; Daunis-I-Estadella, Pepus

    2018-03-01

    The strength of a normal shoulder differs according to gender and decreases with age. Therefore, the Constant score, a shoulder function measurement tool that allocates 25% of the final score to strength, differs in absolute values between groups but likely still reflects a normal shoulder. To compare group results, a normalized Constant score is needed, and the first step to achieving normalization involves statistically establishing the gender differences and age-related decline. In this investigation, we sought to verify the gender difference and age-related decline in strength. We obtained a randomized representative sample of the general population in a small to medium-sized Spanish city. We then invited this population to participate in our study, and we measured their shoulder strength. We performed a statistical analysis with a power of 80% and a P value < .05. We observed a statistically significant difference between the genders and a statistically significant decline with age. To the best of our knowledge, this is the first investigation to study a representative sample of the general population from which conclusions can be drawn regarding Constant score normalization. Copyright © 2017 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.

  8. Resilience-promoting factors in war-exposed adolescents: an epidemiologic study.

    PubMed

    Fayyad, John; Cordahi-Tabet, C; Yeretzian, J; Salamoun, M; Najm, C; Karam, E G

    2017-02-01

    Studies of war-exposed children have not investigated a comprehensive array of resilience-promoting factors, nor representative samples of children and adolescents. A representative sample of N = 710 adolescents was randomly selected from communities recently exposed to war. All those who had experienced war trauma were administered questionnaires measuring war exposure, family violence, availability of leisure activities, school-related problems, interpersonal and peer problems, socialization, daily routine problems, displacement, availability of parental supervision and contact and medical needs as well as coping skills related to religious coping, denial, self-control, avoidance and problem solving. Mental health was measured by the Strengths and Difficulties Questionnaire (SDQ) and the Child-Revised Impact of Events Scale (CRIES). Resilient adolescents were defined as those who experienced war trauma, but did not manifest any symptoms on the SDQ or CRIES. Resilience was related to being male, using problem-solving techniques, having leisure activities, and having parents who spent time with their adolescents and who supported them with school work. Interventions designed for war-traumatized youth must build individual coping skills of children and adolescents, yet at the same time target parents and teachers in an integrated manner.

  9. Methods for sample size determination in cluster randomized trials

    PubMed Central

    Rutterford, Clare; Copas, Andrew; Eldridge, Sandra

    2015-01-01

    Background: The use of cluster randomized trials (CRTs) is increasing, along with the variety in their design and analysis. The simplest approach for their sample size calculation is to calculate the sample size assuming individual randomization and inflate this by a design effect to account for randomization by cluster. The assumptions of a simple design effect may not always be met; alternative or more complicated approaches are required. Methods: We summarise a wide range of sample size methods available for cluster randomized trials. For those familiar with sample size calculations for individually randomized trials but with less experience in the clustered case, this manuscript provides formulae for a wide range of scenarios with associated explanation and recommendations. For those with more experience, comprehensive summaries are provided that allow quick identification of methods for a given design, outcome and analysis method. Results: We present first those methods applicable to the simplest two-arm, parallel group, completely randomized design followed by methods that incorporate deviations from this design such as: variability in cluster sizes; attrition; non-compliance; or the inclusion of baseline covariates or repeated measures. The paper concludes with methods for alternative designs. Conclusions: There is a large amount of methodology available for sample size calculations in CRTs. This paper gives the most comprehensive description of published methodology for sample size calculation and provides an important resource for those designing these trials. PMID:26174515
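
    The simplest approach described in the abstract can be written down directly. The sketch below (illustrative parameter values) computes the per-arm sample size for comparing two proportions under individual randomization and inflates it by the design effect 1 + (m-1)*ICC for equal cluster sizes.

```python
# Design-effect inflation of an individually randomized sample size (sketch).
from math import ceil
from scipy.stats import norm

def n_individual(p0, p1, alpha=0.05, power=0.80):
    """Per-arm n for comparing two proportions (normal approximation)."""
    z_a, z_b = norm.ppf(1 - alpha / 2), norm.ppf(power)
    return (z_a + z_b) ** 2 * (p0 * (1 - p0) + p1 * (1 - p1)) / (p1 - p0) ** 2

def n_cluster_trial(p0, p1, cluster_size, icc, **kwargs):
    design_effect = 1 + (cluster_size - 1) * icc
    n = ceil(n_individual(p0, p1, **kwargs) * design_effect)
    return n, ceil(n / cluster_size)      # individuals and clusters per arm

print(n_cluster_trial(p0=0.30, p1=0.20, cluster_size=50, icc=0.02))
```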

  10. Systematic bias in genomic classification due to contaminating non-neoplastic tissue in breast tumor samples.

    PubMed

    Elloumi, Fathi; Hu, Zhiyuan; Li, Yan; Parker, Joel S; Gulley, Margaret L; Amos, Keith D; Troester, Melissa A

    2011-06-30

    Genomic tests are available to predict breast cancer recurrence and to guide clinical decision making. These predictors provide recurrence risk scores along with a measure of uncertainty, usually a confidence interval. The confidence interval conveys random error and not systematic bias. Standard tumor sampling methods make this problematic, as it is common to have a substantial proportion (typically 30-50%) of a tumor sample comprised of histologically benign tissue. This "normal" tissue could represent a source of non-random error or systematic bias in genomic classification. To assess the performance characteristics of genomic classification to systematic error from normal contamination, we collected 55 tumor samples and paired tumor-adjacent normal tissue. Using genomic signatures from the tumor and paired normal, we evaluated how increasing normal contamination altered recurrence risk scores for various genomic predictors. Simulations of normal tissue contamination caused misclassification of tumors in all predictors evaluated, but different breast cancer predictors showed different types of vulnerability to normal tissue bias. While two predictors had unpredictable direction of bias (either higher or lower risk of relapse resulted from normal contamination), one signature showed predictable direction of normal tissue effects. Due to this predictable direction of effect, this signature (the PAM50) was adjusted for normal tissue contamination and these corrections improved sensitivity and negative predictive value. For all three assays quality control standards and/or appropriate bias adjustment strategies can be used to improve assay reliability. Normal tissue sampled concurrently with tumor is an important source of bias in breast genomic predictors. All genomic predictors show some sensitivity to normal tissue contamination and ideal strategies for mitigating this bias vary depending upon the particular genes and computational methods used in the predictor.

  11. Statistical and population genetics issues of two Hungarian datasets from the aspect of DNA evidence interpretation.

    PubMed

    Szabolcsi, Zoltán; Farkas, Zsuzsa; Borbély, Andrea; Bárány, Gusztáv; Varga, Dániel; Heinrich, Attila; Völgyi, Antónia; Pamjav, Horolma

    2015-11-01

    When a DNA profile from a crime scene matches that of a suspect, the weight of the DNA evidence depends on the unbiased estimation of the match probability of the profiles. For this reason, it is necessary to establish and expand databases that reflect the actual allele frequencies in the population concerned. 21,473 complete DNA profiles from Databank samples were used to establish the allele frequency database to represent the population of Hungarian suspects. We used fifteen STR loci (PowerPlex ESI16), including five new ESS loci. The aim was to calculate the statistical, forensic efficiency parameters for the Databank samples and compare the newly detected data to the earlier report. The population substructure caused by relatedness may influence the estimated frequency of profiles. As our Databank profiles were considered non-random samples, possible relationships between the suspects can be assumed. Therefore, the population inbreeding effect was estimated using the FIS calculation. The overall inbreeding parameter was found to be 0.0106. Furthermore, we tested the impact of the two allele frequency datasets on 101 randomly chosen STR profiles, including full and partial profiles. The 95% confidence interval estimates for the profile frequencies (pM) resulted in a tighter range when we used the new dataset compared to the previously published ones. We found that FIS had less effect on frequency values in the 21,473 samples than the application of a minimum allele frequency. No genetic substructure was detected by STRUCTURE analysis. Due to the low level of inbreeding effect and the high number of samples, the new dataset provides unbiased and precise estimates of LR for statistical interpretation of forensic casework and allows us to use lower allele frequencies. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
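
    As an illustration of how an inbreeding parameter of the size reported above enters match-probability calculations, the sketch below (hypothetical allele frequencies, not from the Hungarian dataset) applies the standard inbreeding-corrected genotype frequencies and multiplies them across loci.

```python
# Profile match probability with an inbreeding correction F_IS (illustrative).
def genotype_frequency(p, q=None, f_is=0.0106):
    """Homozygote p/p if q is None, otherwise heterozygote p/q."""
    if q is None:
        return p * p + p * (1 - p) * f_is
    return 2 * p * q * (1 - f_is)

# Hypothetical 3-locus profile: (allele frequency, partner allele frequency or None).
profile = [(0.12, 0.08), (0.21, None), (0.05, 0.30)]

match_probability = 1.0
for p, q in profile:
    match_probability *= genotype_frequency(p, q)
print(f"profile frequency: {match_probability:.2e}")
```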

  12. Knowledge and attitude towards total knee arthroplasty among the public in Saudi Arabia: a nationwide population-based study.

    PubMed

    Al-Mohrej, Omar A; Alshammari, Faris O; Aljuraisi, Abdulrahman M; Bin Amer, Lujain A; Masuadi, Emad M; Al-Kenani, Nader S

    2018-04-01

    Studies on total knee arthroplasty (TKA) in Saudi Arabia are scarce, and none have reported public knowledge of, and attitudes towards, the procedure. Our study aims to measure knowledge and attitudes regarding TKA among the adult Saudi population. To obtain a representative sample for this cross-sectional survey, all 13 administrative areas were used as ready-made geographical clusters. For each cluster, stratified random sampling was performed to maximize participation in the study. In each area, random samples of mobile phone numbers were selected with a probability proportional to the administrative area population size. Sample size calculation was based on the assumption that 50% of the participants would have some level of knowledge, with a 2% margin of error and 95% confidence level. To reach our intended sample size of 1540, we contacted 1722 participants, with a response rate of 89.4%. The expected percentage of public knowledge was 50%; however, the actual percentage revealed by this study was much lower (29.7%). A stepwise multiple logistic regression was used to assess the factors that positively affected the knowledge score regarding TKA. Age [P = 0.016 with OR of 0.47], higher income [P = 0.001 with OR of 0.52] and a positive history of TKA or knowing someone who had undergone the surgery [P < 0.001 with OR of 0.15] had a positive impact on the total knowledge score. There are still misconceptions among the public in Saudi Arabia concerning TKA, its indications and its results. We recommend that doctors use the results of our survey to guide their conversations with patients and to determine whether the results of the procedure are adequately explained.
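
    The sample size calculation mentioned in the abstract corresponds to the standard formula for estimating a proportion. The sketch below reproduces that calculation with the quoted assumptions; note that the published target of 1,540 may additionally reflect design, non-response or finite-population adjustments not described in the abstract.

```python
# Sample size for estimating a proportion with a given margin of error (sketch).
from math import ceil
from scipy.stats import norm

def proportion_sample_size(p=0.50, margin=0.02, confidence=0.95):
    z = norm.ppf(1 - (1 - confidence) / 2)
    return ceil(z ** 2 * p * (1 - p) / margin ** 2)

print(proportion_sample_size())              # about 2,401 with a 2% margin
print(proportion_sample_size(margin=0.025))  # about 1,537 with a 2.5% margin
```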

  13. Three-dimensional analysis of the uniqueness of the anterior dentition in orthodontically treated patients and twins.

    PubMed

    Franco, A; Willems, G; Souza, P H C; Tanaka, O M; Coucke, W; Thevissen, P

    2017-04-01

    Dental uniqueness can be proven if no perfect match is detected in pair-wise morphological comparisons of human dentitions. Establishing these comparisons in a worldwide random population is practically unfeasible due to the need for a large and representative sample size. Sample stratification is an option to reduce sample size. The present study investigated the uniqueness of the human dentition in randomly selected subjects (Group 1), orthodontically treated patients (Group 2), twins (Group 3), and orthodontically treated twins (Group 4) in comparison with a threshold control sample of identical dentitions (Group 5). The samples consisted of digital cast files (DCF) obtained through extraoral 3D scanning. A total of 2,013 pair-wise morphological comparisons were performed (Group 1 n=110, Group 2 n=1,711, Group 3 n=172, Group 4 n=10, Group 5 n=10) with the Geomagic Studio® (3D Systems®, Rock Hill, SC, USA) software package. Comparisons within groups were performed by quantifying the morphological differences between DCF in Euclidean distances. Comparisons between groups were established applying one-way ANOVA. To ensure fair comparisons, a post-hoc power analysis was performed. ROC analysis was applied to distinguish unique from non-unique dentitions. Identical DCF were not detected within the experimental groups (1 to 4). The most similar DCF had a Euclidean distance of 5.19 mm in Group 1, 2.06 mm in Group 2, 2.03 mm in Group 3, and 1.88 mm in Group 4. Groups 2 and 3 were statistically different from Group 5 (p<0.05). A statistically significant difference between Groups 4 and 5 could potentially be revealed by including more pair-wise comparisons in both groups. The ROC analysis revealed a sensitivity rate of 80% and specificity between 66.7% and 81.6%. Evidence to sustain the uniqueness of the human dentition in random and stratified populations was observed in the present study. Further studies testing the influence of the quantity of tooth material on morphological differences between dentitions and its impact on uniqueness remain necessary. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Lunar Meteorites: A Global Geochemical Dataset

    NASA Technical Reports Server (NTRS)

    Zeigler, R. A.; Joy, K. H.; Arai, T.; Gross, J.; Korotev, R. L.; McCubbin, F. M.

    2017-01-01

    To date, the world's meteorite collections contain over 260 lunar meteorite stones representing at least 120 different lunar meteorites. Additionally, there are 20-30 as yet unnamed stones currently in the process of being classified. Collectively these lunar meteorites likely represent 40-50 distinct sampling locations from random locations on the Moon. Although the exact provenance of each individual lunar meteorite is unknown, collectively the lunar meteorites represent the best global average of the lunar crust. The Apollo sites are all within or near the Procellarum KREEP Terrane (PKT), thus lithologies from the PKT are overrepresented in the Apollo sample suite. Nearly all of the lithologies present in the Apollo sample suite are found within the lunar meteorites (high-Ti basalts are a notable exception), and the lunar meteorites contain several lithologies not present in the Apollo sample suite (e.g., magnesian anorthosite). This chapter will not be a sample-by-sample summary of each individual lunar meteorite. Rather, the chapter will summarize the different types of lunar meteorites and their relative abundances, comparing and contrasting the lunar meteorite sample suite with the Apollo sample suite. This chapter will act as one of the introductory chapters to the volume, introducing lunar samples in general and setting the stage for more detailed discussions in later, more specialized chapters. The chapter will begin with a description of how lunar meteorites are ejected from the Moon, from what depths samples are being excavated, what the likely pairing relationships are among the lunar meteorite samples, and how the lunar meteorites can help to constrain the impactor flux in the inner solar system. There will be a discussion of the biases inherent to the lunar meteorite sample suite in terms of underrepresented lithologies or regions of the Moon, and an examination of the contamination and limitations of lunar meteorites due to terrestrial weathering. The bulk of the chapter will use examples from the lunar meteorite suite to examine important recent advances in lunar science, including (but not limited to) the following: (1) understanding the global compositional diversity of the lunar surface; (2) understanding the formation of the ancient lunar primary crust; (3) understanding the diversity and timing of mantle melting and secondary crust formation; (4) comparing KREEPy lunar meteorites to KREEPy Apollo samples as evidence of variability within the PKT; and (5) a better understanding of the South Pole-Aitken Basin through lunar meteorites whose provenance is within that Terrane.

  15. Leveraging the rice genome sequence for monocot comparative and translational genomics.

    PubMed

    Lohithaswa, H C; Feltus, F A; Singh, H P; Bacon, C D; Bailey, C D; Paterson, A H

    2007-07-01

    Common genome anchor points across many taxa greatly facilitate translational and comparative genomics and will improve our understanding of the Tree of Life. To add to the repertoire of genomic tools applicable to the study of monocotyledonous plants in general, we aligned Allium and Musa ESTs to Oryza BAC sequences and identified candidate Allium-Oryza and Musa-Oryza conserved intron-scanning primers (CISPs). A random sample of 96 CISP primer pairs, representing loci from 11 of the 12 chromosomes in rice, was tested on seven members of the order Poales and on representatives of the Arecales, Asparagales, and Zingiberales monocot orders. The single-copy amplification success rates of Allium (31.3%), Cynodon (31.4%), Hordeum (30.2%), Musa (37.5%), Oryza (61.5%), Pennisetum (33.3%), Sorghum (47.9%), Zea (33.3%), Triticum (30.2%), and representatives of the palm family (32.3%) suggest that subsets of these primers will provide DNA markers suitable for comparative and translational genomics in orphan crops, as well as for applications in conservation biology, ecology, invasion biology, population biology, systematic biology, and related fields.

  16. [Sensitivity of four representative angular cephalometric measures].

    PubMed

    Xü, T; Ahn, J; Baumrind, S

    2000-05-01

    We examined the sensitivity of four representative cephalometric angles to the detection of different vectors of craniofacial growth. Landmark coordinate data from a stratified random sample of 48 adolescent subjects were used to calculate conventional values for changes between the pretreatment and end-of-treatment lateral cephalograms. By modifying the end-of-treatment coordinate values appropriately, the angular changes could be recalculated to reflect three hypothetical situations: Case 1. What if there were no downward landmark displacement between timepoints? Case 2. What if there were no forward landmark displacement between timepoints? Case 3. What if there were no Nasion change? These questions were asked for four representative cephalometric angles: SNA, ANB, NAPg and UI-SN. For Case 1, the associations (r) between the baseline and the modified measure for the three angles were very highly significant (P < 0.001), with r2 values no lower than 0.94. For Case 2, however, the associations were much weaker and no r value reached significance. These angular measurements are less sensitive for measuring downward landmark displacement than they are for measuring forward landmark displacement.

  17. Improving the Representativeness of Behavioral and Clinical Surveillance for Persons with HIV in the United States: The Rationale for Developing a Population-Based Approach

    PubMed Central

    McNaghten, A. D.; Wolfe, Mitchell I.; Onorato, Ida; Nakashima, Allyn K.; Valdiserri, Ronald O.; Mokotoff, Eve; Romaguera, Raul A.; Kroliczak, Alice; Janssen, Robert S.; Sullivan, Patrick S.

    2007-01-01

    The need for a new surveillance approach to understand the clinical outcomes and behaviors of people in care for HIV evolved from the new challenges for monitoring clinical outcomes in the HAART era, the impact of the epidemic on an increasing number of areas in the US, and the need for representative data to describe the epidemic and related resource utilization and needs. The Institute of Medicine recommended that the Centers for Disease Control and Prevention and the Health Resources and Services Administration coordinate efforts to survey a random sample of HIV-infected persons in care, in order to more accurately measure the need for prevention and care services. The Medical Monitoring Project (MMP) was created to meet these needs. This manuscript describes the evolution and design of MMP, a new nationally representative clinical outcomes and behavioral surveillance system, and describes how MMP data will be used locally and nationally to identify care and treatment utilization needs, and to plan for prevention interventions and services. PMID:17579722

  18. Influences of sampling size and pattern on the uncertainty of correlation estimation between soil water content and its influencing factors

    NASA Astrophysics Data System (ADS)

    Lai, Xiaoming; Zhu, Qing; Zhou, Zhiwen; Liao, Kaihua

    2017-12-01

    In this study, seven random combination sampling strategies were applied to investigate the uncertainties in estimating the hillslope mean soil water content (SWC) and correlation coefficients between the SWC and soil/terrain properties on a tea + bamboo hillslope. One of the sampling strategies is the global random sampling and the other six are the stratified random sampling on the top, middle, toe, top + mid, top + toe and mid + toe slope positions. When each sampling strategy was applied, sample sizes were gradually reduced and each sampling size contained 3000 replicates. Under each sampling size of each sampling strategy, the relative errors (REs) and coefficients of variation (CVs) of the estimated hillslope mean SWC and correlation coefficients between the SWC and soil/terrain properties were calculated to quantify the accuracy and uncertainty. The results showed that the uncertainty of the estimates decreased as the sample size increased. However, larger sample sizes were required to reduce the uncertainty in correlation coefficient estimation than in hillslope mean SWC estimation. Under global random sampling, 12 randomly sampled sites on this hillslope were adequate to estimate the hillslope mean SWC with RE and CV ≤10%. However, at least 72 randomly sampled sites were needed to ensure the estimated correlation coefficients with REs and CVs ≤10%. Among all sampling strategies, reducing the number of sites on the middle slope had the least influence on the estimation of the hillslope mean SWC and correlation coefficients. Under this strategy, 60 sites (10 on the middle slope and 50 on the top and toe slopes) were enough to ensure the estimated correlation coefficients with REs and CVs ≤10%. This suggested that when designing the SWC sampling, the proportion of sites on the middle slope can be reduced to 16.7% of the total number of sites. Findings of this study will be useful for optimal SWC sampling design.
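
    The accuracy and uncertainty measures used above (RE and CV over repeated random subsamples) can be illustrated with a short sketch. This is a toy reconstruction, not the authors' code: the hillslope census, the covariate, and the per-size summaries are all made up for illustration.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical "full census" of a hillslope: SWC and one terrain covariate.
        n_sites = 200
        elevation = rng.uniform(0, 50, n_sites)
        swc = 0.35 - 0.002 * elevation + rng.normal(0, 0.02, n_sites)
        true_mean, true_r = swc.mean(), np.corrcoef(swc, elevation)[0, 1]

        def sampling_uncertainty(sample_size, replicates=3000):
            """RE (%) and CV (%) of the mean-SWC and correlation estimates
            computed over many random subsamples of the given size."""
            means, rs = [], []
            for _ in range(replicates):
                idx = rng.choice(n_sites, sample_size, replace=False)
                means.append(swc[idx].mean())
                rs.append(np.corrcoef(swc[idx], elevation[idx])[0, 1])
            means, rs = np.array(means), np.array(rs)
            return (100 * abs(means.mean() - true_mean) / true_mean,   # RE of mean SWC
                    100 * means.std() / means.mean(),                  # CV of mean SWC
                    100 * abs(rs.mean() - true_r) / abs(true_r),       # RE of correlation
                    100 * rs.std() / abs(rs.mean()))                   # CV of correlation

        for n in (12, 72):
            print(n, sampling_uncertainty(n))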

  19. Injury-related mortality in South Africa: a retrospective descriptive study of postmortem investigations.

    PubMed

    Matzopoulos, Richard; Prinsloo, Megan; Pillay-van Wyk, Victoria; Gwebushe, Nomonde; Mathews, Shanaaz; Martin, Lorna J; Laubscher, Ria; Abrahams, Naeemah; Msemburi, William; Lombard, Carl; Bradshaw, Debbie

    2015-05-01

    To investigate injury-related mortality in South Africa using a nationally representative sample and compare the results with previous estimates. We conducted a retrospective descriptive study of medico-legal postmortem investigation data from mortuaries using a multistage random sample, stratified by urban and non-urban areas and mortuary size. We calculated age-specific and age-standardized mortality rates for external causes of death. Postmortem reports revealed 52,493 injury-related deaths in 2009 (95% confidence interval, CI: 46,930-58,057). Almost half (25,499) were intentionally inflicted. Age-standardized mortality rates per 100,000 population were as follows: all injuries: 109.0 (95% CI: 97.1-121.0); homicide: 38.4 (95% CI: 33.8-43.0); suicide: 13.4 (95% CI: 11.6-15.2); and road-traffic injury: 36.1 (95% CI: 30.9-41.3). Using postmortem reports, we found more than three times as many deaths from homicide and road-traffic injury as had been recorded by vital registration for this period. The homicide rate was similar to the estimate for South Africa from a global analysis, but road-traffic and suicide rates were almost fourfold higher. This is the first nationally representative sample of injury-related mortality in South Africa. It provides more accurate estimates and cause-specific profiles that are not available from other sources.

  20. Differences Between Landline and Mobile Phone Users in Sexual Behavior Research.

    PubMed

    Badcock, Paul B; Patrick, Kent; Smith, Anthony M A; Simpson, Judy M; Pennay, Darren; Rissel, Chris E; de Visser, Richard O; Grulich, Andrew E; Richters, Juliet

    2017-08-01

    This study investigated differences between the demographic characteristics, participation rates (i.e., agreeing to respond to questions about sexual behavior), and sexual behaviors of landline and mobile phone samples in Australia. A nationally representative sample of Australians aged 18 years and over was recruited via random digit dialing in December 2011 to collect data via computer-assisted telephone interviews. A total of 1012 people (370 men, 642 women) completed a landline interview and 1002 (524 men, 478 women) completed a mobile phone interview. Results revealed that telephone user status was significantly related to all demographic variables: gender, age, educational attainment, area of residence, country of birth, household composition, and current ongoing relationship status. In unadjusted analyses, telephone status was also associated with women's participation rates, participants' number of other-sex sexual partners in the previous year, and women's lifetime sexual experience. However, after controlling for significant demographic factors, telephone status was only independently related to women's participation rates. Post hoc analyses showed that significant, between-group differences for all other sexual behavior outcomes could be explained by demographic covariates. Results also suggested that telephone status may be associated with participation bias in research on sexual behavior. Taken together, these findings highlight the importance of sampling both landline and mobile phone users to improve the representativeness of sexual behavior data collected via telephone interviews.

  1. Sequential time interleaved random equivalent sampling for repetitive signal.

    PubMed

    Zhao, Yijiu; Liu, Jingjing

    2016-12-01

    Compressed sensing (CS) based sampling techniques exhibit many advantages over other existing approaches for sparse signal spectrum sensing; they have also been incorporated into non-uniform sampling signal reconstruction, such as random equivalent sampling (RES), to improve efficiency. However, in CS based RES, only one sample of each acquisition is considered in the signal reconstruction stage, which results in more acquisition runs and a longer sampling time. In this paper, a sampling sequence is taken in each RES acquisition run, and the corresponding block measurement matrix is constructed using a Whittaker-Shannon interpolation formula. All the block matrices are combined into an equivalent measurement matrix with respect to all sampling sequences. We implemented the proposed approach with a multi-core analog-to-digital converter (ADC) whose cores are time-interleaved. A prototype realization of this proposed CS based sequential random equivalent sampling method has been developed. It is able to capture an analog waveform at an equivalent sampling rate of 40 GHz while physically sampling at 1 GHz. Experiments indicate that, for a sparse signal, the proposed CS based sequential random equivalent sampling exhibits high efficiency.
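
    A minimal sketch of the block-measurement-matrix idea described above, assuming a generic Whittaker-Shannon (sinc) interpolation onto a fine equivalent-rate grid; the rates, grid size, and run layout are illustrative assumptions, not the authors' implementation.

        import numpy as np

        f_eq, f_phys, n_grid = 40e9, 1e9, 512   # equivalent rate, physical ADC rate, unknowns
        T = 1.0 / f_eq                          # spacing of the fine reconstruction grid

        def block_matrix(t_start, n_samples):
            """Rows map the fine-grid signal x[n] to the physical samples taken at
            t_k = t_start + k / f_phys via Whittaker-Shannon (sinc) interpolation."""
            t = t_start + np.arange(n_samples) / f_phys
            n = np.arange(n_grid)
            return np.sinc((t[:, None] - n[None, :] * T) / T)   # np.sinc is sin(pi x)/(pi x)

        # One RES acquisition run = one short sample sequence with a random start offset;
        # stacking the per-run blocks gives the equivalent measurement matrix Phi.
        rng = np.random.default_rng(1)
        blocks = [block_matrix(rng.uniform(0, 0.5 * n_grid * T), 4) for _ in range(6)]
        Phi = np.vstack(blocks)
        print(Phi.shape)   # (runs * samples_per_run, n_grid)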

  2. Challenges to Recruiting Population Representative Samples of Female Sex Workers in China Using Respondent Driven Sampling1

    PubMed Central

    Merli, M. Giovanna; Moody, James; Smith, Jeffrey; Li, Jing; Weir, Sharon; Chen, Xiangsheng

    2014-01-01

    We explore the network coverage of a sample of female sex workers (FSWs) in China recruited through Respondent Driven Sampling (RDS) as part of an effort to evaluate, with empirical data, the claim that RDS yields population-representative samples. We take advantage of unique information on the social networks of FSWs obtained from two overlapping studies, RDS and a venue-based sampling approach (PLACE), and use an exponential random graph modeling (ERGM) framework fitted to local networks to construct a likely network from which our observed RDS sample is drawn. We then run recruitment chains over this simulated network to assess the assumption that the RDS chain referral process samples participants in proportion to their degree, and the extent to which RDS satisfactorily covers certain parts of the network. We find evidence that, contrary to assumptions, RDS oversamples low-degree nodes and geographically central areas of the network. Unlike previous evaluations of RDS, which have explored the performance of RDS sampling chains on a non-hidden population or the performance of simulated chains over previously mapped realistic social networks, our study provides a robust, empirically grounded evaluation of the performance of RDS chains on a real-world hidden population. PMID:24834869

  3. Sampling large random knots in a confined space

    NASA Astrophysics Data System (ADS)

    Arsuaga, J.; Blackstone, T.; Diao, Y.; Hinson, K.; Karadayi, E.; Saito, M.

    2007-09-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (such as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is on the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.

  4. The factors that influence job satisfaction among royal Malaysian customs department employee

    NASA Astrophysics Data System (ADS)

    Ammar Shafi, Muhammad; Saifullah Rusiman, Mohd; Nor, Maria Elena; Khamis, Azme; Nabilah Syuhada Abdullah, Siti; Syafiq Azmi, Mohd; Sakinah Zainal Abidin, Munirah; Ali, Maselan

    2018-04-01

    This research aims to identify the factors that influence job satisfaction among Royal Malaysian Customs Department employees. Primary data were used in this research and were collected from employees who work in five different departments at the Royal Malaysian Customs Department Tower Johor. Those departments were Customs, Internal Taxes, Technical Services, Management and Prevention. The research used stratified random sampling to collect the sample and Structural Equation Modelling (SEM), implemented in AMOS software, to measure the relationships between variables. A total of 127 employees from the five departments were selected as respondents. The result showed that ‘Organizational Commitment’ (p-value = 0.001) had a significant direct effect on job satisfaction, compared to the ‘Stress Condition’ (p-value = 0.819) and ‘Motivation’ (p-value = 0.978) factors. It was also concluded that ‘Organizational Commitment’ was the most influential factor in job satisfaction among Royal Malaysian Customs Department employees at Tower Custom Johor, Johor Bahru.

  5. Microbiological survey of raw and ready-to-eat leafy green vegetables marketed in Italy.

    PubMed

    Losio, M N; Pavoni, E; Bilei, S; Bertasi, B; Bove, D; Capuano, F; Farneti, S; Blasi, G; Comin, D; Cardamone, C; Decastelli, L; Delibato, E; De Santis, P; Di Pasquale, S; Gattuso, A; Goffredo, E; Fadda, A; Pisanu, M; De Medici, D

    2015-10-01

    The presence of foodborne pathogens (Salmonella spp., Listeria monocytogenes, Escherichia coli O157:H7, thermotolerant Campylobacter, Yersinia enterocolitica and norovirus) in fresh leafy (FL) and ready-to-eat (RTE) vegetable products, sampled at random on the Italian market, was investigated to evaluate the level of risk to consumers. Nine regional laboratories, representing 18 of the 20 regions of Italy and in which 97.7% of the country's population resides, were involved in this study. All laboratories used the same sampling procedures and analytical methods. The vegetable samples were screened using validated real-time PCR (RT-PCR) methods and standardized reference ISO culturing methods. The results show that 3.7% of 1372 fresh leafy vegetable products and 1.8% of 1160 "fresh-cut" or "ready-to-eat" (RTE) vegetable products retailed in supermarkets or farm markets were contaminated with one or more foodborne pathogens harmful to human health. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Random bit generation at tunable rates using a chaotic semiconductor laser under distributed feedback.

    PubMed

    Li, Xiao-Zhou; Li, Song-Sui; Zhuang, Jun-Ping; Chan, Sze-Chun

    2015-09-01

    A semiconductor laser with distributed feedback from a fiber Bragg grating (FBG) is investigated for random bit generation (RBG). The feedback perturbs the laser to emit chaotically with the intensity being sampled periodically. The samples are then converted into random bits by a simple postprocessing of self-differencing and selecting bits. Unlike a conventional mirror that provides localized feedback, the FBG provides distributed feedback which effectively suppresses the information of the round-trip feedback delay time. Randomness is ensured even when the sampling period is commensurate with the feedback delay between the laser and the grating. Consequently, in RBG, the FBG feedback enables continuous tuning of the output bit rate, reduces the minimum sampling period, and increases the number of bits selected per sample. RBG is experimentally investigated at a sampling period continuously tunable from over 16 ns down to 50 ps, while the feedback delay is fixed at 7.7 ns. By selecting 5 least-significant bits per sample, output bit rates from 0.3 to 100 Gbps are achieved with randomness examined by the National Institute of Standards and Technology test suite.
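
    The post-processing described (self-differencing, then keeping a few least-significant bits of each sample) can be sketched in a few lines. The chaotic intensity samples are replaced by random integers here, and the 8-bit resolution, lag, and 5-LSB choices loosely mirror the abstract rather than the actual hardware.

        import numpy as np

        rng = np.random.default_rng(2)

        # Stand-in for periodically sampled chaotic laser intensity, digitized to 8 bits.
        samples = rng.integers(0, 256, size=100_000, dtype=np.uint16)

        delay, n_lsb = 3, 5                                   # self-differencing lag; bits kept
        diff = (samples[delay:] - samples[:-delay]) & 0xFF    # 8-bit wraparound difference
        lsbs = diff & ((1 << n_lsb) - 1)                      # keep the 5 least-significant bits

        # Unpack each retained value into its n_lsb bits to form the output bit stream.
        bits = ((lsbs[:, None] >> np.arange(n_lsb)) & 1).ravel()
        print(bits[:16], bits.mean())                         # mean near 0.5 for a balanced stream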

  7. Optimal spatial sampling techniques for ground truth data in microwave remote sensing of soil moisture

    NASA Technical Reports Server (NTRS)

    Rao, R. G. S.; Ulaby, F. T.

    1977-01-01

    The paper examines optimal sampling techniques for obtaining accurate spatial averages of soil moisture, at various depths and for cell sizes in the range 2.5-40 acres, with a minimum number of samples. Both simple random sampling and stratified sampling procedures are used to reach a set of recommended sample sizes for each depth and for each cell size. Major conclusions from statistical sampling test results are that (1) the number of samples required decreases with increasing depth; (2) when the total number of samples cannot be prespecified or the moisture in only one single layer is of interest, then a simple random sample procedure should be used which is based on the observed mean and SD for data from a single field; (3) when the total number of samples can be prespecified and the objective is to measure the soil moisture profile with depth, then stratified random sampling based on optimal allocation should be used; and (4) decreasing the sensor resolution cell size leads to fairly large decreases in sample sizes with stratified sampling procedures, whereas only a moderate decrease is obtained in simple random sampling procedures.
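
    The "optimal allocation" in conclusion (3) is presumably Neyman allocation, which assigns stratum sample sizes in proportion to stratum size times stratum standard deviation; the sketch below uses that standard formula with made-up depth strata.

        import numpy as np

        def neyman_allocation(n_total, stratum_sizes, stratum_sds):
            """Allocate n_total samples across strata in proportion to N_h * S_h."""
            w = np.asarray(stratum_sizes, float) * np.asarray(stratum_sds, float)
            return np.maximum(1, np.rint(n_total * w / w.sum()).astype(int))

        # Hypothetical depth layers: number of candidate points and observed moisture SD.
        sizes = [40, 40, 40]          # 0-5 cm, 5-15 cm, 15-30 cm (illustrative)
        sds = [0.06, 0.03, 0.015]     # moisture varies most near the surface
        print(neyman_allocation(30, sizes, sds))   # most samples go to the shallow layer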

  8. Chi-Squared Test of Fit and Sample Size-A Comparison between a Random Sample Approach and a Chi-Square Value Adjustment Method.

    PubMed

    Bergh, Daniel

    2015-01-01

    Chi-square statistics are commonly used for tests of fit of measurement models. Chi-square is also sensitive to sample size, which is why several approaches to handle large samples in test of fit analysis have been developed. One strategy to handle the sample size problem may be to adjust the sample size in the analysis of fit. An alternative is to adopt a random sample approach. The purpose of this study was to analyze and to compare these two strategies using simulated data. Given an original sample size of 21,000, for reductions of sample sizes down to the order of 5,000 the adjusted sample size function works as well as the random sample approach. In contrast, when applying adjustments to sample sizes of lower order, the adjustment function is less effective at approximating the chi-square value for an actual random sample of the relevant size. Hence, the fit is exaggerated and misfit underestimated using the adjusted sample size function. Although there are big differences in chi-square values between the two approaches at lower sample sizes, the inferences based on the p-values may be the same.
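
    The two strategies can be mimicked with a small simulation: rescale the full-sample chi-square to a smaller nominal n (a generic linear adjustment is used here, which may differ from the exact function in the article) and compare it with the statistic recomputed on actual random subsamples. The data and the weak association are fabricated for illustration.

        import numpy as np
        from scipy.stats import chi2_contingency

        rng = np.random.default_rng(3)

        # Hypothetical data set of 21,000 cases with two weakly associated 3-level variables.
        n_full = 21_000
        x = rng.integers(0, 3, n_full)
        y = np.where(rng.random(n_full) < 0.15, x, rng.integers(0, 3, n_full))

        def chi2_of(xs, ys):
            table = np.array([[np.sum((xs == i) & (ys == j)) for j in range(3)] for i in range(3)])
            return chi2_contingency(table)[0]

        chi2_full = chi2_of(x, y)

        def adjusted_chi2(n_target):
            return chi2_full * n_target / n_full   # generic linear rescaling with sample size

        def subsample_chi2(n_target):
            idx = rng.choice(n_full, n_target, replace=False)
            return chi2_of(x[idx], y[idx])

        for n in (5_000, 500):
            print(n, adjusted_chi2(n), np.mean([subsample_chi2(n) for _ in range(50)]))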

  9. Who are your public? A survey comparing the views of a population-based sample with those of a community-based public forum in Scotland.

    PubMed

    Emslie, Margaret J; Andrew, Jane; Angus, Val; Entwistle, Vikki; Walker, Kim

    2005-03-01

    This paper describes a questionnaire survey, carried out in the NHS Grampian area of NE Scotland. It compares responses from 84 members of a community-based public forum (39 of whom were sent questionnaires) and a random sample of 10,000 adults registered with general practices in Grampian (2,449 of whom were sent questionnaires). The outcomes compared were differences in demographic profiles and opinions about different feedback mechanisms (patient representative, telephone helpline and NHS feedback website) and their likely effectiveness in three different scenarios. 46% of community forum members consented to take part compared to 24% of the population sample. Younger people and residents in more deprived areas were under-represented in both groups. Community forum members were older (only one under 40 years of age) and more likely to be retired and not in employment. Internet access was similar in both groups. Opinions about different systems of feeding back views to the NHS varied, but community forum members were more likely to be positive in their opinions about the value of different feedback mechanisms and less likely to think they were 'a waste of NHS money'. Responses to three scenarios revealed similar opinions, but on some issues there were key differences in the responses from the two groups. Community forum members were more likely than respondents in the main group to consider writing a letter as a means of getting something done about a problem and more likely to talk to their GP if experiencing a problem. In general their responses were more positive towards the NHS. There is a need to ensure a broad basis for membership of public forums and/or proactively seek the views of groups that are under-represented if public forums are to be used to represent the views of the wider population and inform decision making in the NHS.

  10. Comparing Study Populations of Men Who Have Sex with Men: Evaluating Consistency Within Repeat Studies and Across Studies in the Seattle Area Using Different Recruitment Methodologies

    PubMed Central

    Burt, Richard D.; Oster, Alexandra M.; Golden, Mathew R.; Thiede, Hanne

    2013-01-01

    There is no gold standard for recruiting unbiased samples of men who have sex with men (MSM). To assess differing recruitment methods, we compared Seattle-area MSM samples from: venue-day-time sampling-based National HIV Behavioral Surveillance (NHBS) surveys in 2008 and 2011, random-digit-dialed (RDD) surveys in 2003 and 2006, and STD clinic patient data 2001–2011. We compared sociodemographics, sexual and drug-associated behavior, and HIV status and testing. There was generally good consistency between the two NHBS surveys and within STD clinic data across time. NHBS participants reported higher levels of drug-associated and lower levels of sexual risk than STD clinic patients. RDD participants differed from the other study populations in sociodemographics and some risk behaviors. While neither NHBS nor the STD clinic study populations may be representative of all MSM, both appear to provide consistent samples of MSM subpopulations across time that can provide useful information to guide HIV prevention. PMID:23900958

  11. Radon in harvested rainwater at the household level, Palestine.

    PubMed

    Al-Khatib, Issam A; Al Zabadi, Hamzeh; Saffarini, Ghassan

    2017-04-01

    The main objective of this study was to assess Radon concentration in harvested rainwater (HRW) at the household level in the Yatta area, Palestine. HRW is mainly used for drinking as it is the major source of water for domestic uses due to water scarcity. Ninety HRW samples from household cisterns were collected from six localities (a town and five villages) and Radon concentrations were measured. The samples were randomly collected from different households to represent the Yatta area. Fifteen samples were collected from each locality on the same day. A RAD7 device was used for analysis and each sample was measured in duplicate. Radon concentrations ranged from 0.037 to 0.26 Bq/L with a mean ± standard deviation of 0.14 ± 0.06 Bq/L. The estimated annual effective radiation doses for babies, children and adults were all far below the maximum limit of 5 mSv/year set by the National Council on Radiation Protection and Measurements. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Control of diabetes and fibrinogen levels as well as improvement in health care might delay low cognitive performance in societies aging progressively.

    PubMed

    Lopes, Daniele Almeida; Moraes, Suzana Alves de; Freitas, Isabel Cristina Martins de

    2015-01-01

    To determine the prevalence of, and factors associated with, low cognitive performance in a representative sample of the adult population of a society aging progressively. Cross-sectional population-based study carried out using three-stage sampling: 81 census tracts (primary sampling units) were randomly selected, followed by 1,672 households and 2,471 participants (weighted sample) corresponding to the second and third stages, respectively. The outcome prevalence was calculated according to sociodemographic, behavioral and health-related variables. Crude and adjusted prevalence ratios were estimated using Poisson regression. The prevalence of low cognitive performance was high, mainly among females, and showed linear trends across categories of age, schooling, income, plasma fibrinogen and self-reported health status. In multivariate models, gender, diabetes, fibrinogen and self-reported health status presented positive associations, while schooling, employment and sitting time presented negative associations with the outcome. Interventions related to the control of diabetes and fibrinogen levels, as well as improvements in health care, might delay low cognitive performance in progressively aging societies such as the study population.

  13. A post-mortem survey on end-of-life decisions using a representative sample of death certificates in Flanders, Belgium: research protocol

    PubMed Central

    Chambaere, Kenneth; Bilsen, Johan; Cohen, Joachim; Pousset, Geert; Onwuteaka-Philipsen, Bregje; Mortier, Freddy; Deliens, Luc

    2008-01-01

    Background Reliable studies of the incidence and characteristics of medical end-of-life decisions with a certain or possible life-shortening effect (ELDs) are indispensable for an evidence-based medical and societal debate on this issue. This article presents the protocol drafted for the 2007 ELD Study in Flanders, Belgium, and outlines how the main aims and challenges of the study (i.e. making reliable incidence estimates of end-of-life decisions, even rare ones, and describing their characteristics; allowing comparability with past ELD studies; guaranteeing strict anonymity given the sensitive nature of the research topic; and attaining a sufficient response rate) are addressed in a post-mortem survey using a representative sample of death certificates. Study design Reliable incidence estimates are achievable by using large random samples of death certificates of persons who died in Flanders (aged one year or older). This entails the cooperation of the appropriate administrative authorities. To further ensure the reliability of the estimates and descriptions, especially of less prevalent end-of-life decisions (e.g. euthanasia), a stratified sample is drawn. A questionnaire is sent out to the certifying physician of each death sampled. The questionnaire, tested thoroughly and avoiding emotionally charged terms, is based largely on questions that have been validated in previous national and European ELD studies. Anonymity of both patient and physician is guaranteed through a rigorous procedure, involving a lawyer as intermediary between responding physicians and researchers. To increase response, we follow the Total Design Method (TDM) with a maximum of three follow-up mailings. Also, a non-response survey is conducted to gain insight into the reasons for lack of response. Discussion The protocol of the 2007 ELD Study in Flanders, Belgium, is appropriate for achieving the objectives of the study; as past studies in Belgium, the Netherlands, and other European countries have shown, strictly anonymous and thorough surveys among physicians using a large, stratified, and representative death certificate sample are most suitable in nationwide studies of incidence and characteristics of end-of-life decisions. There are, however, also some limitations to the study design. PMID:18752659

  14. The frequency and nature of alcohol and tobacco advertising in televised sports, 1990 through 1992.

    PubMed Central

    Madden, P A; Grube, J W

    1994-01-01

    This study examines the frequency and nature of alcohol and tobacco advertising in a random sample of 166 televised sports events representing 443.7 hours of network programming broadcast from fall 1990 through summer 1992. More commercials appear for alcohol products than for any other beverage. Beer commercials predominate and include images at odds with recommendations from former Surgeon General Koop. The audience is also exposed to alcohol and tobacco advertising through the appearances of stadium signs, other on-site promotions, and verbal or visual brief product sponsorships. Moderation messages and public service announcements are rare. PMID:8296959

  15. Work as a cultural and personal value: attitudes towards work in Polish society.

    PubMed

    Skarzyńska, Krystyna

    2002-01-01

    The meaning of work for Poles is analyzed here from 2 perspectives: macrosocial and individual. From the macrosocial perspective work attitudes are explained by 3 factors: traditional Polish Catholicism, cultural patterns (influence of noble class tradition), and experience of "real socialism." From an individual perspective some psychological and demographic predictors of an autonomous (intrinsic) work attitude are empirically tested. The autonomous attitude towards work is understood here as treating work as an important autonomous value versus only an instrumental means for earning money. The data was collected by means of standardized interviews run on a representative random sample of adult working Poles, N = 1340.

  16. Health plan auditing: 100-percent-of-claims vs. random-sample audits.

    PubMed

    Sillup, George P; Klimberg, Ronald K

    2011-01-01

    The objective of this study was to examine the relative efficacy of two different methodologies for auditing self-funded medical claim expenses: 100-percent-of-claims auditing versus random-sampling auditing. Multiple data sets of claim errors or 'exceptions' from two Fortune-100 corporations were analysed and compared to 100 simulated audits of 300- and 400-claim random samples. Random-sample simulations failed to identify a significant number and amount of the errors that ranged from $200,000 to $750,000. These results suggest that health plan expenses of corporations could be significantly reduced if they audited 100% of claims and embraced a zero-defect approach.

  17. The study of combining Latin Hypercube Sampling method and LU decomposition method (LULHS method) for constructing spatial random field

    NASA Astrophysics Data System (ADS)

    WANG, P. T.

    2015-12-01

    Groundwater modeling requires assigning hydrogeological properties to every numerical grid cell. Due to the lack of detailed information and the inherent spatial heterogeneity, geological properties can be treated as random variables. A hydrogeological property is assumed to follow a multivariate distribution with spatial correlations. By sampling random numbers from a given statistical distribution and assigning a value to each grid cell, a random field for modeling can be completed. Therefore, statistical sampling plays an important role in the efficiency of the modeling procedure. Latin Hypercube Sampling (LHS) is a stratified random sampling procedure that provides an efficient way to sample variables from their multivariate distributions. This study combines the stratified random procedure from LHS with simulation by LU decomposition to form LULHS. Both conditional and unconditional simulations of LULHS were developed. The simulation efficiency and spatial correlation of LULHS are compared to three other simulation methods. The results show that, for both conditional and unconditional simulation, the LULHS method is more efficient in terms of computational effort. Fewer realizations are required to achieve the required statistical accuracy and spatial correlation.
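
    A rough sketch of the unconditional LULHS idea under simple assumptions (1-D grid, exponential covariance, Cholesky in place of a general LU factor): draw Latin-hypercube-stratified uniforms, convert them to normal scores, and impose spatial correlation by multiplying with the lower-triangular factor of the covariance matrix. This is not the authors' code.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(4)

        # 1-D grid of 50 cells with an exponential covariance model (assumed parameters).
        n = 50
        x = np.arange(n, dtype=float)
        cov = np.exp(-np.abs(x[:, None] - x[None, :]) / 10.0)
        L = np.linalg.cholesky(cov)            # lower-triangular factor playing the LU role

        def lhs_normal(n_vars, n_real):
            """Latin hypercube sample of standard normals: one stratum per realization."""
            strata = np.tile(np.arange(n_real), (n_vars, 1))
            u = (rng.permuted(strata, axis=1) + rng.random((n_vars, n_real))) / n_real
            return norm.ppf(u)                 # shape (n_vars, n_realizations)

        z = lhs_normal(n, 200)                 # stratified, spatially uncorrelated normal scores
        fields = (L @ z).T                     # 200 correlated random fields, one per row
        print(fields.shape, np.corrcoef(fields[:, 0], fields[:, 1])[0, 1])  # neighbors correlate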

  18. Representativeness of direct observations selected using a work-sampling equation.

    PubMed

    Sharp, Rebecca A; Mudford, Oliver C; Elliffe, Douglas

    2015-01-01

    Deciding on appropriate sampling to obtain representative samples of behavior is important but not straightforward, because the relative duration of the target behavior may affect its observation in a given sampling interval. Work-sampling methods, which offer a way to adjust the frequency of sampling according to a priori or ongoing estimates of the behavior to achieve a preselected level of representativeness, may provide a solution. Full-week observations of 7 behaviors were conducted for 3 students with autism spectrum disorder and intellectual disabilities. Work-sampling methods were used to select momentary time samples from the full time-of-interest, which produced representative samples. However, work sampling required impractically high numbers of time samples to obtain representative samples. More practical momentary time samples produced less representative samples, particularly for low-duration behaviors. The utility and limits of work-sampling methods for applied behavior analysis are discussed. © Society for the Experimental Analysis of Behavior.
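
    The "work-sampling equation" for choosing the number of momentary time samples is typically the binomial-precision formula n = z²(1 − p) / (p·e²), where p is the estimated proportion of time the behavior occurs and e the acceptable relative error; the sketch below assumes that standard formula, which may differ in detail from the one applied in the article.

        import math

        def work_sampling_n(p, rel_error, z=1.96):
            """Momentary time samples needed so the occurrence proportion p is
            estimated within the given relative error at the z-implied confidence."""
            return math.ceil(z ** 2 * (1 - p) / (p * rel_error ** 2))

        # A behavior occupying about 10% of the observation period, 10% relative error:
        print(work_sampling_n(p=0.10, rel_error=0.10))   # several thousand samples needed

    The rapid growth of this count as p shrinks matches the observation above that low-duration behaviors demand impractically many time samples.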

  19. Compositions, Random Sums and Continued Random Fractions of Poisson and Fractional Poisson Processes

    NASA Astrophysics Data System (ADS)

    Orsingher, Enzo; Polito, Federico

    2012-08-01

    In this paper we consider the relation between random sums and compositions of different processes. In particular, for independent Poisson processes $N_\alpha(t)$, $N_\beta(t)$, $t>0$, we have that $N_\alpha(N_\beta(t)) \stackrel{d}{=} \sum_{j=1}^{N_\beta(t)} X_j$, where the $X_j$ are Poisson random variables. We present a series of similar cases, where the outer process is Poisson with different inner processes. We highlight generalisations of these results where the external process is infinitely divisible. A section of the paper concerns compositions of the form $N_\alpha(\tau_k^{\nu})$, $\nu \in (0,1]$, where $\tau_k^{\nu}$ is the inverse of the fractional Poisson process, and we show how these compositions can be represented as random sums. Furthermore we study compositions of the form $\Theta(N(t))$, $t>0$, which can be represented as random products. The last section is devoted to studying continued fractions of Cauchy random variables with a Poisson number of levels. We evaluate the exact distribution and derive the scale parameter in terms of ratios of Fibonacci numbers.
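
    The first identity is easy to check numerically: compose two independent Poisson processes and compare the result with the corresponding random sum of i.i.d. Poisson variables. The rates, time point, and replication counts below are arbitrary choices.

        import numpy as np

        rng = np.random.default_rng(5)
        alpha, beta, t = 2.0, 1.5, 3.0

        # Left side: N_alpha evaluated at the random argument N_beta(t).
        n_beta = rng.poisson(beta * t, 200_000)
        lhs = rng.poisson(alpha * n_beta)        # N_alpha(n) ~ Poisson(alpha * n) given n

        # Right side: sum_{j=1}^{N_beta(t)} X_j with X_j i.i.d. Poisson(alpha).
        rhs = np.array([rng.poisson(alpha, n).sum() for n in n_beta[:20_000]])

        print(lhs.mean(), rhs.mean())            # both near alpha * beta * t = 9
        print(lhs.var(), rhs.var())              # both near alpha * beta * t * (1 + alpha) = 27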

  20. Contextuality in canonical systems of random variables

    NASA Astrophysics Data System (ADS)

    Dzhafarov, Ehtibar N.; Cervantes, Víctor H.; Kujala, Janne V.

    2017-10-01

    Random variables representing measurements, broadly understood to include any responses to any inputs, form a system in which each of them is uniquely identified by its content (that which it measures) and its context (the conditions under which it is recorded). Two random variables are jointly distributed if and only if they share a context. In a canonical representation of a system, all random variables are binary, and every content-sharing pair of random variables has a unique maximal coupling (the joint distribution imposed on them so that they coincide with maximal possible probability). The system is contextual if these maximal couplings are incompatible with the joint distributions of the context-sharing random variables. We propose to represent any system of measurements in a canonical form and to consider the system contextual if and only if its canonical representation is contextual. As an illustration, we establish a criterion for contextuality of the canonical system consisting of all dichotomizations of a single pair of content-sharing categorical random variables. This article is part of the themed issue `Second quantum revolution: foundational questions'.

  1. A prospective cohort and extended comprehensive-cohort design provided insights about the generalizability of a pragmatic trial: the ProtecT prostate cancer trial.

    PubMed

    Donovan, Jenny L; Young, Grace J; Walsh, Eleanor I; Metcalfe, Chris; Lane, J Athene; Martin, Richard M; Tazewell, Marta K; Davis, Michael; Peters, Tim J; Turner, Emma L; Mills, Nicola; Khazragui, Hanan; Khera, Tarnjit K; Neal, David E; Hamdy, Freddie C

    2018-04-01

    Randomized controlled trials (RCTs) deliver robust internally valid evidence but generalizability is often neglected. Design features built into the Prostate testing for cancer and Treatment (ProtecT) RCT of treatments for localized prostate cancer (PCa) provided insights into its generalizability. Population-based cluster randomization created a prospective study of prostate-specific antigen (PSA) testing and a comprehensive-cohort study including groups choosing treatment or excluded from the RCT, as well as those randomized. Baseline information assessed selection and response during RCT conduct. The prospective study (82,430 PSA-tested men) represented healthy men likely to respond to a screening invitation. The extended comprehensive cohort comprised 1,643 randomized, 997 choosing treatment, and 557 excluded with advanced cancer/comorbidities. Men choosing treatment were very similar to randomized men except for having more professional/managerial occupations. Excluded men were similar to the randomized socio-demographically but different clinically, representing less healthy men with more advanced PCa. The design features of the ProtecT RCT provided data to assess the representativeness of the prospective cohort and generalizability of the findings of the RCT. Greater attention to collecting data at the design stage of pragmatic trials would better support later judgments by clinicians/policy-makers about the generalizability of RCT findings in clinical practice. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  2. A closer look at cross-validation for assessing the accuracy of gene regulatory networks and models.

    PubMed

    Tabe-Bordbar, Shayan; Emad, Amin; Zhao, Sihai Dave; Sinha, Saurabh

    2018-04-26

    Cross-validation (CV) is a technique to assess the generalizability of a model to unseen data. This technique relies on assumptions that may not be satisfied when studying genomics datasets. For example, random CV (RCV) assumes that a randomly selected set of samples, the test set, well represents unseen data. This assumption doesn't hold true where samples are obtained from different experimental conditions, and the goal is to learn regulatory relationships among the genes that generalize beyond the observed conditions. In this study, we investigated how the CV procedure affects the assessment of supervised learning methods used to learn gene regulatory networks (or in other applications). We compared the performance of a regression-based method for gene expression prediction estimated using RCV with that estimated using a clustering-based CV (CCV) procedure. Our analysis illustrates that RCV can produce over-optimistic estimates of the model's generalizability compared to CCV. Next, we defined the 'distinctness' of test set from training set and showed that this measure is predictive of performance of the regression method. Finally, we introduced a simulated annealing method to construct partitions with gradually increasing distinctness and showed that performance of different gene expression prediction methods can be better evaluated using this method.
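
    The contrast between random CV and clustering-based CV can be reproduced with standard tools; the sketch below is a generic illustration using scikit-learn (KFold versus GroupKFold with k-means clusters as groups) on fabricated condition-structured data, not the authors' pipeline.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.linear_model import Ridge
        from sklearn.model_selection import GroupKFold, KFold, cross_val_score

        rng = np.random.default_rng(6)

        # Hypothetical expression data: 120 samples drawn from 5 underlying conditions.
        n, p, k = 120, 30, 5
        effects = rng.normal(0, 1, (k, p))
        cond = rng.integers(0, k, n)
        X = effects[cond] + rng.normal(0, 0.3, (n, p))
        y = X[:, :3].sum(axis=1) + effects[cond, 0] + rng.normal(0, 0.3, n)

        model = Ridge(alpha=1.0)

        # Random CV: folds mix samples from all conditions.
        rcv = cross_val_score(model, X, y, cv=KFold(n_splits=5, shuffle=True, random_state=0))

        # Clustering-based CV: entire clusters of similar samples are held out together.
        clusters = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)
        ccv = cross_val_score(model, X, y, cv=GroupKFold(n_splits=5), groups=clusters)

        print("RCV R^2:", rcv.mean(), "CCV R^2:", ccv.mean())   # RCV is typically the larger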

  3. A Distributed-Memory Package for Dense Hierarchically Semi-Separable Matrix Computations Using Randomization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rouet, François-Henry; Li, Xiaoye S.; Ghysels, Pieter

    In this paper, we present a distributed-memory library for computations with dense structured matrices. A matrix is considered structured if its off-diagonal blocks can be approximated by a rank-deficient matrix with low numerical rank. Here, we use Hierarchically Semi-Separable (HSS) representations. Such matrices appear in many applications, for example, finite-element methods, boundary element methods, and so on. Exploiting this structure allows for fast solution of linear systems and/or fast computation of matrix-vector products, which are the two main building blocks of matrix computations. The compression algorithm that we use, which computes the HSS form of an input dense matrix, relies on randomized sampling with a novel adaptive sampling mechanism. We discuss the parallelization of this algorithm and also present the parallelization of structured matrix-vector product, structured factorization, and solution routines. The efficiency of the approach is demonstrated on large problems from different academic and industrial applications, on up to 8,000 cores. Finally, this work is part of a more global effort, the STRUctured Matrices PACKage (STRUMPACK) software package for computations with sparse and dense structured matrices. Hence, although useful in their own right, the routines also represent a step in the direction of a distributed-memory sparse solver.
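
    The randomized sampling that drives such compressions can be illustrated with the standard randomized range-finder (this generic sketch is not STRUMPACK's interface): multiply a numerically low-rank block by a random matrix, orthogonalize the product, and use the resulting basis for a low-rank factorization.

        import numpy as np

        rng = np.random.default_rng(7)

        def randomized_lowrank(A, rank, oversample=10):
            """Approximate A ~ Q @ (Q.T @ A) by randomly sampling the range of A."""
            omega = rng.standard_normal((A.shape[1], rank + oversample))
            Q, _ = np.linalg.qr(A @ omega)       # orthonormal basis for the sampled column space
            return Q, Q.T @ A

        # An off-diagonal block of a smooth kernel (well-separated points => low numerical rank).
        x = np.linspace(0.0, 1.0, 300)
        A = 1.0 / (1.0 + np.abs(x[:, None] - (x[None, :] + 2.0)))
        Q, B = randomized_lowrank(A, rank=8)
        print(np.linalg.norm(A - Q @ B) / np.linalg.norm(A))   # small relative error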

  5. 40 CFR 761.130 - Sampling requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... sampling scheme and the guidance document are available on EPA's PCB Web site at http://www.epa.gov/pcb, or... § 761.125(c) (2) through (4). Using its best engineering judgment, EPA may sample a statistically valid random or grid sampling technique, or both. When using engineering judgment or random “grab” samples, EPA...

  6. 40 CFR 761.130 - Sampling requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... sampling scheme and the guidance document are available on EPA's PCB Web site at http://www.epa.gov/pcb, or... § 761.125(c) (2) through (4). Using its best engineering judgment, EPA may sample a statistically valid random or grid sampling technique, or both. When using engineering judgment or random “grab” samples, EPA...

  7. Quality of life and visual function in Nigeria: findings from the National Survey of Blindness and Visual Impairment.

    PubMed

    Tran, Hang My; Mahdi, Abdull M; Sivasubramaniam, Selvaraj; Gudlavalleti, Murthy V S; Gilbert, Clare E; Shah, Shaheen P; Ezelum, C C; Abubakar, Tafida; Bankole, Olufunmilayo O

    2011-12-01

    To assess associations of visual function (VF) and quality of life (QOL) by visual acuity (VA), causes of blindness and types of cataract procedures in Nigeria. Multi-stage stratified cluster random sampling was used to identify a nationally representative sample of persons aged ≥ 40 years. VF/QOL questionnaires were administered to participants with VA <6/60 in one or both eyes and/or Mehra-Minassian cataract grade 2B or 3 in one or both eyes and a random sample of those with bilateral VA ≥ 6/12. VF/QOL questionnaires were administered to 2076 participants. Spearman's rank correlation showed a strong correlation between decreasing VA and VF/QOL scores (p<0.0001) with greatest impact on social (p<0.0001) and mobility-related activities (p<0.0001). People who were blind due to glaucoma had lower VF and QOL scores than those who were blind due to cataract. Mean VF and QOL scores were lower after couching compared with conventional cataract surgery (mean VF score=51.0 vs 63.0 and mean QOL score=71.3 vs 79.3). Finally, VF and QOL scores were lower among populations with specific characteristics. Populations with the following characteristics should be targeted to improve VF and QOL: people who are blind, older people, women, manual labourers, people living in rural areas, those living in the northern geopolitical zones, those practising Islamic and Traditionalism faith, those not currently married and those who have undergone couching.

  8. Effects of major depression on moment-in-time work performance.

    PubMed

    Wang, Philip S; Beck, Arne L; Berglund, Pat; McKenas, David K; Pronk, Nicolaas P; Simon, Gregory E; Kessler, Ronald C

    2004-10-01

    Although major depression is thought to have substantial negative effects on work performance, the possibility of recall bias limits self-report studies of these effects. The authors used the experience sampling method to address this problem by collecting comparative data on moment-in-time work performance among service workers who were depressed and those who were not depressed. The group studied included 105 airline reservation agents and 181 telephone customer service representatives selected from a larger baseline sample; depressed workers were deliberately oversampled. Respondents were given pagers and experience sampling method diaries for each day of the study. A computerized autodialer paged respondents at random time points. When paged, respondents reported on their work performance in the diary. Moment-in-time work performance was assessed at five random times each day over a 7-day data collection period (35 data points for each respondent). Seven conditions (allergies, arthritis, back pain, headaches, high blood pressure, asthma, and major depression) occurred often enough in this group of respondents to be studied. Major depression was the only condition significantly related to decrements in both of the dimensions of work performance assessed in the diaries: task focus and productivity. These effects were equivalent to approximately 2.3 days absent because of sickness per depressed worker per month of being depressed. Previous studies based on days missed from work significantly underestimate the adverse economic effects associated with depression. Productivity losses related to depression appear to exceed the costs of effective treatment.
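
    The random-paging schedule itself is simple to sketch; the five pages per day comes from the abstract, while the shift start and shift length below are assumptions.

        import random
        from datetime import datetime, timedelta

        random.seed(0)

        def paging_times(shift_start, shift_hours=8, pages_per_day=5):
            """Draw pages_per_day random moments within one work shift, sorted in time."""
            secs = sorted(random.uniform(0, shift_hours * 3600) for _ in range(pages_per_day))
            return [shift_start + timedelta(seconds=s) for s in secs]

        day_one = datetime(2024, 3, 4, 9, 0)        # hypothetical 09:00 shift start
        for page in paging_times(day_one):
            print(page.strftime("%H:%M:%S"))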

  9. Management of low-grade cervical abnormalities detected at screening: which method do women prefer?

    PubMed

    Whynes, D K; Woolley, C; Philips, Z

    2008-12-01

    To establish whether women with low-grade abnormalities detected during screening for cervical cancer prefer to be managed by cytological surveillance or by immediate colposcopy. TOMBOLA (Trial of Management of Borderline and Other Low-grade Abnormal smears) is a randomized controlled trial comparing alternative management strategies following the screen-detection of low-grade cytological abnormalities. At exit, a sample of TOMBOLA women completed a questionnaire eliciting opinions on their management, contingent valuations (CV) of the management methods and preferences. Within-trial quality of life (EQ-5D) data collected for a sample of TOMBOLA women throughout their follow-up enabled the comparison of self-reported health at various time points, by management method. Once management had been initiated, self-reported health in the colposcopy arm rose relative to that in the surveillance arm, although the effect was short-term only. For the majority of women, the satisfaction ratings and the CV indicated approval of the management method to which they had been randomized. Of the minority manifesting a preference for the method which they had not experienced, relatively more would have preferred colposcopy than would have preferred surveillance. The findings must be interpreted in the light of sample bias with respect to preferences, whereby enthusiasm for colposcopy was probably over-represented amongst trial participants. The study suggests that neither of the management methods is preferred unequivocally; rather, individual women have individual preferences, although many would be indifferent between methods.

  10. Infrared spectra alteration in water proximate to the palms of therapeutic practitioners.

    PubMed

    Schwartz, Stephan A; De Mattei, Randall J; Brame, Edward G; Spottiswoode, S James P

    2015-01-01

    Through standard techniques of infrared (IR) spectrophotometry, sterile water samples in randomly selected sealed vials showed alteration of IR spectra after being proximate to the palms of the hands of both Practicing and Non-practicing Therapy Practitioners, each of whom employed a personal variation of the Laying-on-of-Hands/Therapeutic Touch processes. This pilot study presents 14 cases, involving 14 Practitioners and 14 Recipients. The first hypothesis, that a difference between all (84) Treated spectra and all (57) Control spectra would be observed in the 2.5-3.0µm range, was confirmed (P = .02). Overall, 10% (15) of the spectra were done using a germanium internal reflection element (IRE), and 90% of the spectra (126) were done with a zinc selenide IRE. The difference in refractive index between the two IREs skews the data. The zinc selenide IRE spectra alone yield P = .005. The authors believe the most representative evidence for the effect appeared in the sample group of Treated vs Calibration Controls using the zinc selenide IRE (P = .0004). The second hypothesis, that there existed a direct relationship between intensity of effect and time of exposure, was not confirmed. This study replicates earlier findings under conditions of blinding, randomization, and several levels of controls. Environmental factors are considered as explanations for the observed IR spectrum alteration, including temperature, barometric pressure, and variations dependent on sampling order. They do not appear to explain the effect. Copyright © 2015. Published by Elsevier Inc.

  11. Misrepresenting random sampling? A systematic review of research papers in the Journal of Advanced Nursing.

    PubMed

    Williamson, Graham R

    2003-11-01

    This paper discusses the theoretical limitations of the use of random sampling and probability theory in the production of a significance level (or P-value) in nursing research. Potential alternatives, in the form of randomization tests, are proposed. Research papers in nursing, medicine and psychology frequently misrepresent their statistical findings, as the P-values reported assume random sampling. In this systematic review of studies published between January 1995 and June 2002 in the Journal of Advanced Nursing, 89 (68%) studies broke this assumption because they used convenience samples or entire populations. As a result, some of the findings may be questionable. The key ideas of random sampling and probability theory for statistical testing (for generating a P-value) are outlined. The result of a systematic review of research papers published in the Journal of Advanced Nursing is then presented, showing how frequently random sampling appears to have been misrepresented. Useful alternative techniques that might overcome these limitations are then discussed. REVIEW LIMITATIONS: This review is limited in scope because it is applied to one journal, and so the findings cannot be generalized to other nursing journals or to nursing research in general. However, it is possible that other nursing journals are also publishing research articles based on the misrepresentation of random sampling. The review is also limited because in several of the articles the sampling method was not completely clearly stated, and in this circumstance a judgment has been made as to the sampling method employed, based on the indications given by author(s). Quantitative researchers in nursing should be very careful that the statistical techniques they use are appropriate for the design and sampling methods of their studies. If the techniques they employ are not appropriate, they run the risk of misinterpreting findings by using inappropriate, unrepresentative and biased samples.
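
    The randomization (permutation) tests proposed as an alternative condition on the observed data rather than assuming a random sample from a population; a minimal sketch of one such test, for a difference in group means on fabricated convenience-sample data, follows.

        import numpy as np

        rng = np.random.default_rng(8)

        def permutation_test(a, b, n_perm=10_000):
            """Two-sided p-value for a difference in means obtained by shuffling group labels."""
            observed = a.mean() - b.mean()
            pooled = np.concatenate([a, b])
            hits = 0
            for _ in range(n_perm):
                perm = rng.permutation(pooled)
                diff = perm[:len(a)].mean() - perm[len(a):].mean()
                hits += abs(diff) >= abs(observed)
            return (hits + 1) / (n_perm + 1)

        # Hypothetical scores from two convenience samples of patients.
        group_a = rng.normal(5.0, 1.0, 25)
        group_b = rng.normal(5.6, 1.0, 25)
        print(permutation_test(group_a, group_b))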

  12. Characterization of rock thermal conductivity by high-resolution optical scanning

    USGS Publications Warehouse

    Popov, Y.A.; Pribnow, D.F.C.; Sass, J.H.; Williams, C.F.; Burkhardt, H.

    1999-01-01

    We compared three laboratory methods for thermal conductivity measurements: divided-bar, line-source and optical scanning. These methods are widely used in geothermal and petrophysical studies, particularly as applied to research on cores from deep scientific boreholes. The relatively new optical scanning method has recently been perfected and applied to geophysical problems. A comparison among these methods for determining the thermal conductivity tensor for anisotropic rocks is based on a representative collection of 80 crystalline rock samples from the KTB continental deep borehole (Germany). Despite substantial thermal inhomogeneity of rock thermal conductivity (up to 40-50% variation) and high anisotropy (with ratios of principal values attaining 2 and more), the results of measurements agree very well among the different methods. The discrepancy for measurements along the foliation is negligible (<1%). The component of thermal conductivity normal to the foliation reveals somewhat larger differences (3-4%). Optical scanning allowed us to characterize the thermal inhomogeneity of rocks and to identify a three-dimensional anisotropy in thermal conductivity of some gneiss samples. The merits of optical scanning include minor random errors (1.6%), the ability to record the variation of thermal conductivity along the sample, the ability to sample deeply using a slow scanning rate, freedom from constraints on sample size, shape, and quality of mechanical treatment of the sample surface, a contactless mode of measurement, high speed of operation, and the ability to measure on a cylindrical sample surface. More traditional methods remain superior for characterizing bulk conductivity at elevated temperature.

  13. The Expected Sample Variance of Uncorrelated Random Variables with a Common Mean and Some Applications in Unbalanced Random Effects Models

    ERIC Educational Resources Information Center

    Vardeman, Stephen B.; Wendelberger, Joanne R.

    2005-01-01

    There is a little-known but very simple generalization of the standard result that for uncorrelated random variables with common mean μ and variance σ², the expected value of the sample variance is σ². The generalization justifies the use of the usual standard error of the sample mean in possibly…
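
    Presumably the generalization in question is that, for uncorrelated variables with a common mean but possibly unequal variances, the expected sample variance equals the average of the individual variances; a short reconstruction of that derivation (not quoted from the article) is:

        E\,(X_i-\bar X)^2 = \operatorname{Var}(X_i-\bar X)
            = \sigma_i^2 - \frac{2\sigma_i^2}{n} + \frac{1}{n^2}\sum_{j=1}^{n}\sigma_j^2,
        \qquad\text{so}\qquad
        E[S^2] = \frac{1}{n-1}\sum_{i=1}^{n}E\,(X_i-\bar X)^2
            = \frac{1}{n-1}\Bigl(1-\frac{1}{n}\Bigr)\sum_{i=1}^{n}\sigma_i^2
            = \frac{1}{n}\sum_{i=1}^{n}\sigma_i^2,

    which reduces to σ² when all the variances are equal.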

  14. Methods for estimating the amount of vernal pool habitat in the northeastern United States

    USGS Publications Warehouse

    Van Meter, R.; Bailey, L.L.; Grant, E.H.C.

    2008-01-01

    The loss of small, seasonal wetlands is a major concern for a variety of state, local, and federal organizations in the northeastern U.S. Identifying and estimating the number of vernal pools within a given region is critical to developing long-term conservation and management strategies for these unique habitats and their faunal communities. We use three probabilistic sampling methods (simple random sampling, adaptive cluster sampling, and the dual frame method) to estimate the number of vernal pools on protected, forested lands. Overall, these methods yielded similar values of vernal pool abundance for each study area, and suggest that photographic interpretation alone may grossly underestimate the number of vernal pools in forested habitats. We compare the relative efficiency of each method and discuss ways of improving precision. Acknowledging that the objectives of a study or monitoring program ultimately determine which sampling designs are most appropriate, we recommend that some type of probabilistic sampling method be applied. We view the dual-frame method as an especially useful way of combining incomplete remote sensing methods, such as aerial photograph interpretation, with a probabilistic sample of the entire area of interest to provide more robust estimates of the number of vernal pools and a more representative sample of existing vernal pool habitats.

  15. Impact of different privacy conditions and incentives on survey response rate, participant representativeness, and disclosure of sensitive information: a randomized controlled trial.

    PubMed

    Murdoch, Maureen; Simon, Alisha Baines; Polusny, Melissa Anderson; Bangerter, Ann Kay; Grill, Joseph Patrick; Noorbaloochi, Siamak; Partin, Melissa Ruth

    2014-07-16

    Anonymous survey methods appear to promote greater disclosure of sensitive or stigmatizing information compared to non-anonymous methods. Higher disclosure rates have traditionally been interpreted as being more accurate than lower rates. We examined the impact of 3 increasingly private mailed survey conditions-ranging from potentially identifiable to completely anonymous-on survey response and on respondents' representativeness of the underlying sampling frame, completeness in answering sensitive survey items, and disclosure of sensitive information. We also examined the impact of 2 incentives ($10 versus $20) on these outcomes. A 3X2 factorial, randomized controlled trial of 324 representatively selected, male Gulf War I era veterans who had applied for United States Department of Veterans Affairs (VA) disability benefits. Men were asked about past sexual assault experiences, childhood abuse, combat, other traumas, mental health symptoms, and sexual orientation. We used a novel technique, the pre-merged questionnaire, to link anonymous responses to administrative data. Response rates ranged from 56.0% to 63.3% across privacy conditions (p = 0.49) and from 52.8% to 68.1% across incentives (p = 0.007). Respondents' characteristics differed by privacy and by incentive assignments, with completely anonymous respondents and $20 respondents appearing least different from their non-respondent counterparts. Survey completeness did not differ by privacy or by incentive. No clear pattern of disclosing sensitive information by privacy condition or by incentive emerged. For example, although all respondents came from the same sampling frame, estimates of sexual abuse ranged from 13.6% to 33.3% across privacy conditions, with the highest estimate coming from the intermediate privacy condition (p = 0.007). Greater privacy and larger incentives do not necessarily result in higher disclosure rates of sensitive information than lesser privacy and lower incentives. Furthermore, disclosure of sensitive or stigmatizing information under differing privacy conditions may have less to do with promoting or impeding participants' "honesty" or "accuracy" than with selectively recruiting or attracting subpopulations that are higher or lower in such experiences. Pre-merged questionnaires bypassed many historical limitations of anonymous surveys and hold promise for exploring non-response issues in future research.

  16. A stochastic convolution/superposition method with isocenter sampling to evaluate intrafraction motion effects in IMRT.

    PubMed

    Naqvi, Shahid A; D'Souza, Warren D

    2005-04-01

    Current methods to calculate dose distributions with organ motion can be broadly classified as "dose convolution" and "fluence convolution" methods. In the former, a static dose distribution is convolved with the probability distribution function (PDF) that characterizes the motion. However, artifacts are produced near the surface and around inhomogeneities because the method assumes shift invariance. Fluence convolution avoids these artifacts by convolving the PDF with the incident fluence instead of the patient dose. In this paper we present an alternative method that improves the accuracy, generality as well as the speed of dose calculation with organ motion. The algorithm starts by sampling an isocenter point from a parametrically defined space curve corresponding to the patient-specific motion trajectory. Then a photon is sampled in the linac head and propagated through the three-dimensional (3-D) collimator structure corresponding to a particular MLC segment chosen randomly from the planned IMRT leaf sequence. The photon is then made to interact at a point in the CT-based simulation phantom. Randomly sampled monoenergetic kernel rays issued from this point are then made to deposit energy in the voxels. Our method explicitly accounts for MLC-specific effects (spectral hardening, tongue-and-groove, head scatter) as well as changes in SSD with isocentric displacement, assuming that the body moves rigidly with the isocenter. Since the positions are randomly sampled from a continuum, there is no motion discretization, and the computation takes no more time than a static calculation. To validate our method, we obtained ten separate film measurements of an IMRT plan delivered on a phantom moving sinusoidally, with each fraction starting with a random phase. For 2 cm motion amplitude, we found that a ten-fraction average of the film measurements gave an agreement with the calculated infinite fraction average to within 2 mm in the isodose curves. The results also corroborate the existing notion that the interfraction dose variability due to the interplay between the MLC motion and breathing motion averages out over typical multifraction treatments. Simulation with motion waveforms more representative of real breathing indicate that the motion can produce penumbral spreading asymmetric about the static dose distributions. Such calculations can help a clinician decide to use, for example, a larger margin in the superior direction than in the inferior direction. In the paper we demonstrate that a 15 min run on a single CPU can readily illustrate the effect of a patient-specific breathing waveform, and can guide the physician in making informed decisions about margin expansion and dose escalation.
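
    A minimal 1-D sketch of the "dose convolution" idea the abstract contrasts with its sampling approach: a hypothetical static dose profile is averaged over isocenter shifts drawn from a continuum of sinusoidal-motion phases, so no motion discretization is introduced. The field shape, motion amplitude, and grid are illustrative assumptions, not the paper's data or beam model.

```python
import numpy as np

# Hypothetical 1-D static dose profile (flat field with smooth penumbra), 1 mm grid
x = np.arange(-60.0, 60.0, 1.0)                                    # mm
static_dose = 0.5 * (np.tanh((x + 25) / 3) - np.tanh((x - 25) / 3))

# Rigid sinusoidal motion of amplitude A; phases are drawn from a continuum,
# so the trajectory is never discretized (cf. the isocenter-sampling idea).
A = 20.0                                                            # mm
rng = np.random.default_rng(0)
shifts = A * np.sin(rng.uniform(0.0, 2 * np.pi, size=5000))

# "Dose convolution": average the rigidly shifted static dose over the motion PDF
blurred = np.zeros_like(static_dose)
for s in shifts:
    blurred += np.interp(x - s, x, static_dose, left=0.0, right=0.0)
blurred /= len(shifts)

def penumbra_width(dose):
    """Millimetres of the right-hand field edge lying between 20% and 80% dose."""
    right = dose[x >= 0]
    return float(np.sum((right > 0.2) & (right < 0.8)))            # 1 mm grid

print(f"right penumbra width: static {penumbra_width(static_dose):.0f} mm, "
      f"blurred {penumbra_width(blurred):.0f} mm")
```

    An asymmetric breathing waveform in place of the sinusoid would spread the penumbra asymmetrically, which is the margin-design question the abstract raises.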

  17. Accelerating Convergence in Molecular Dynamics Simulations of Solutes in Lipid Membranes by Conducting a Random Walk along the Bilayer Normal.

    PubMed

    Neale, Chris; Madill, Chris; Rauscher, Sarah; Pomès, Régis

    2013-08-13

    All molecular dynamics simulations are susceptible to sampling errors, which degrade the accuracy and precision of observed values. The statistical convergence of simulations containing atomistic lipid bilayers is limited by the slow relaxation of the lipid phase, which can exceed hundreds of nanoseconds. These long conformational autocorrelation times are exacerbated in the presence of charged solutes, which can induce significant distortions of the bilayer structure. Such long relaxation times represent hidden barriers that induce systematic sampling errors in simulations of solute insertion. To identify optimal methods for enhancing sampling efficiency, we quantitatively evaluate convergence rates using generalized ensemble sampling algorithms in calculations of the potential of mean force for the insertion of the ionic side chain analog of arginine in a lipid bilayer. Umbrella sampling (US) is used to restrain solute insertion depth along the bilayer normal, the order parameter commonly used in simulations of molecular solutes in lipid bilayers. When US simulations are modified to conduct random walks along the bilayer normal using a Hamiltonian exchange algorithm, systematic sampling errors are eliminated more rapidly and the rate of statistical convergence of the standard free energy of binding of the solute to the lipid bilayer is increased 3-fold. We compute the ratio of the replica flux transmitted across a defined region of the order parameter to the replica flux that entered that region in Hamiltonian exchange simulations. We show that this quantity, the transmission factor, identifies sampling barriers in degrees of freedom orthogonal to the order parameter. The transmission factor is used to estimate the depth-dependent conformational autocorrelation times of the simulation system, some of which exceed the simulation time, and thereby identify solute insertion depths that are prone to systematic sampling errors and estimate the lower bound of the amount of sampling that is required to resolve these sampling errors. Finally, we extend our simulations and verify that the conformational autocorrelation times estimated by the transmission factor accurately predict correlation times that exceed the simulation time scale-something that, to our knowledge, has never before been achieved.
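
    A sketch of one way to compute a transmission factor from a single replica's trajectory along the order parameter. The published quantity is defined in terms of replica flux in the Hamiltonian exchange simulations, which this trajectory-wise count only approximates; the region boundaries and the toy trajectory below are assumptions for illustration.

```python
import numpy as np

def transmission_factor(z, a, b):
    """Fraction of entries into the region a < z < b that exit on the far side.

    z is one replica's time series along the order parameter (e.g. insertion
    depth).  Entries and exits are counted trajectory-wise, which is only an
    approximation to the replica-flux definition used in the paper.
    """
    entered = transmitted = 0
    inside = a < z[0] < b
    side = None                                    # boundary crossed on last entry
    for prev, cur in zip(z[:-1], z[1:]):
        if not inside and a < cur < b:             # entering the region
            inside = True
            entered += 1
            side = "low" if prev <= a else "high"
        elif inside and not (a < cur < b):         # leaving the region
            inside = False
            exit_side = "low" if cur <= a else "high"
            if side is not None and exit_side != side:
                transmitted += 1                   # crossed rather than reflected
            side = None
    return transmitted / entered if entered else float("nan")

# Toy order-parameter trajectory: an autocorrelated AR(1) walk (arbitrary units)
rng = np.random.default_rng(1)
steps = rng.normal(0.0, 0.05, size=200_000)
traj = np.empty_like(steps)
traj[0] = 0.0
for i in range(1, len(steps)):
    traj[i] = 0.999 * traj[i - 1] + steps[i]

print(f"transmission factor across [-0.5, 0.5]: {transmission_factor(traj, -0.5, 0.5):.2f}")
```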

  18. Infinite hidden conditional random fields for human behavior analysis.

    PubMed

    Bousmalis, Konstantinos; Zafeiriou, Stefanos; Morency, Louis-Philippe; Pantic, Maja

    2013-01-01

    Hidden conditional random fields (HCRFs) are discriminative latent variable models that have been shown to successfully learn the hidden structure of a given classification problem (provided an appropriate validation of the number of hidden states). In this brief, we present the infinite HCRF (iHCRF), which is a nonparametric model based on hierarchical Dirichlet processes and is capable of automatically learning the optimal number of hidden states for a classification task. We show how we learn the model hyperparameters with an effective Markov-chain Monte Carlo sampling technique, and we explain the process that underlies our iHCRF model with the Restaurant Franchise Rating Agencies analogy. We show that the iHCRF is able to converge to a correct number of represented hidden states, and outperforms the best finite HCRFs--chosen via cross-validation--for the difficult tasks of recognizing instances of agreement, disagreement, and pain. Moreover, the iHCRF manages to achieve this performance in significantly less total training, validation, and testing time.

  19. Treatment selection in a randomized clinical trial via covariate-specific treatment effect curves.

    PubMed

    Ma, Yunbei; Zhou, Xiao-Hua

    2017-02-01

    For time-to-event data in a randomized clinical trial, we proposed two new methods for selecting an optimal treatment for a patient based on the covariate-specific treatment effect curve, which is used to represent the clinical utility of a predictive biomarker. To select an optimal treatment for a patient with a specific biomarker value, we proposed pointwise confidence intervals for each covariate-specific treatment effect curve and the difference between covariate-specific treatment effect curves of two treatments. Furthermore, to select an optimal treatment for a future biomarker-defined subpopulation of patients, we proposed confidence bands for each covariate-specific treatment effect curve and the difference between each pair of covariate-specific treatment effect curves over a fixed interval of biomarker values. We constructed the confidence bands based on a resampling technique. We also conducted simulation studies to evaluate finite-sample properties of the proposed estimation methods. Finally, we illustrated the application of the proposed method in a real-world data set.

  20. Environmental Health Practice: Statistically Based Performance Measurement

    PubMed Central

    Enander, Richard T.; Gagnon, Ronald N.; Hanumara, R. Choudary; Park, Eugene; Armstrong, Thomas; Gute, David M.

    2007-01-01

    Objectives. State environmental and health protection agencies have traditionally relied on a facility-by-facility inspection-enforcement paradigm to achieve compliance with government regulations. We evaluated the effectiveness of a new approach that uses a self-certification random sampling design. Methods. Comprehensive environmental and occupational health data from a 3-year statewide industry self-certification initiative were collected from representative automotive refinishing facilities located in Rhode Island. Statistical comparisons between baseline and postintervention data facilitated a quantitative evaluation of statewide performance. Results. The analysis of field data collected from 82 randomly selected automotive refinishing facilities showed statistically significant improvements (P<.05, Fisher exact test) in 4 major performance categories: occupational health and safety, air pollution control, hazardous waste management, and wastewater discharge. Statistical significance was also shown when a modified Bonferroni adjustment for multiple comparisons was performed. Conclusions. Our findings suggest that the new self-certification approach to environmental and worker protection is effective and can be used as an adjunct to further enhance state and federal enforcement programs. PMID:17267709
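
    A small sketch of the statistical comparison the abstract describes: baseline versus post-intervention compliance compared per category with Fisher's exact test, using a plain Bonferroni threshold in place of the paper's modified adjustment. The counts below are illustrative only, not the Rhode Island study data.

```python
from scipy.stats import fisher_exact

# Illustrative (compliant, non-compliant) counts at baseline and post-intervention
# for each performance category; these are not the study's data.
categories = {
    "occupational health and safety": ((45, 37), (70, 12)),
    "air pollution control":          ((50, 32), (68, 14)),
    "hazardous waste management":     ((40, 42), (63, 19)),
    "wastewater discharge":           ((55, 27), (72, 10)),
}

alpha, m = 0.05, len(categories)          # m comparisons -> Bonferroni threshold
for name, (baseline, post) in categories.items():
    _, p = fisher_exact([list(baseline), list(post)])
    print(f"{name}: p = {p:.4f}, significant after Bonferroni: {p < alpha / m}")
```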

  1. Modeling Soil Organic Carbon Variation Along Climatic and Topographic Trajectories in the Central Andes

    NASA Astrophysics Data System (ADS)

    Gavilan, C.; Grunwald, S.; Quiroz, R.; Zhu, L.

    2015-12-01

    The Andes represent the largest and highest mountain range in the tropics. Geological and climatic differentiation favored landscape and soil diversity, resulting in ecosystems adapted to very different climatic patterns. Although several studies support the fact that the Andes are a vast sink of soil organic carbon (SOC) only few have quantified this variable in situ. Estimating the spatial distribution of SOC stocks in data-poor and/or poorly accessible areas, like the Andean region, is challenging due to the lack of recent soil data at high spatial resolution and the wide range of coexistent ecosystems. Thus, the sampling strategy is vital in order to ensure the whole range of environmental covariates (EC) controlling SOC dynamics is represented. This approach allows grasping the variability of the area, which leads to more efficient statistical estimates and improves the modeling process. The objectives of this study were to i) characterize and model the spatial distribution of SOC stocks in the Central Andean region using soil-landscape modeling techniques, and to ii) validate and evaluate the model for predicting SOC content in the area. For that purpose, three representative study areas were identified and a suite of variables including elevation, mean annual temperature, annual precipitation and Normalized Difference Vegetation Index (NDVI), among others, was selected as EC. A stratified random sampling (namely conditioned Latin Hypercube) was implemented and a total of 400 sampling locations were identified. At all sites, four composite topsoil samples (0-30 cm) were collected within a 2 m radius. SOC content was measured using dry combustion and SOC stocks were estimated using bulk density measurements. Regression Kriging was used to map the spatial variation of SOC stocks. The accuracy, fit and bias of SOC models was assessed using a rigorous validation assessment. This study produced the first comprehensive, geospatial SOC stock assessment in this undersampled region that serves as a baseline reference to assess potential impacts of climate and land use change.
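
    A crude stand-in for the conditioned Latin Hypercube selection mentioned above: candidate cells are scored by how evenly a proposed sample covers the marginal quantile strata of each covariate, and a simple random search replaces the simulated-annealing swaps of the real cLHS algorithm. The covariate fields are synthetic and the objective is simplified.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic covariates at candidate grid cells (the real study used elevation,
# temperature, precipitation, NDVI, etc. as environmental covariates).
n_cells, n_sites = 50_000, 400
covs = np.column_stack([
    rng.uniform(2500, 4800, n_cells),     # elevation (m)
    rng.uniform(-2, 18, n_cells),         # mean annual temperature (deg C)
    rng.uniform(0.05, 0.85, n_cells),     # NDVI
])

# Latin-hypercube objective: one sample per marginal quantile stratum of each covariate
edges = [np.quantile(c, np.linspace(0, 1, n_sites + 1)) for c in covs.T]

def objective(idx):
    cost = 0.0
    for j, e in enumerate(edges):
        counts, _ = np.histogram(covs[idx, j], bins=e)
        cost += np.abs(counts - 1).sum()
    return cost

# Random-search stand-in for the simulated-annealing step of real cLHS
best_idx, best_cost = None, np.inf
for _ in range(200):
    idx = rng.choice(n_cells, size=n_sites, replace=False)
    cost = objective(idx)
    if cost < best_cost:
        best_idx, best_cost = idx, cost

print(f"{n_sites} sampling locations selected, objective = {best_cost:.0f}")
```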

  2. Physical activity intervention for elderly patients with reduced physical performance after acute coronary syndrome (HULK study): rationale and design of a randomized clinical trial.

    PubMed

    Tonet, Elisabetta; Maietti, Elisa; Chiaranda, Giorgio; Vitali, Francesco; Serenelli, Matteo; Bugani, Giulia; Mazzoni, Gianni; Ruggiero, Rossella; Myers, Jonathan; Villani, Giovanni Quinto; Corvi, Ursula; Pasanisi, Giovanni; Biscaglia, Simone; Pavasini, Rita; Lucchi, Giulia Ricci; Sella, Gianluigi; Ferrari, Roberto; Volpato, Stefano; Campo, Gianluca; Grazzi, Giovanni

    2018-05-21

    Reduced physical performance and impaired mobility are common in elderly patients after acute coronary syndrome (ACS) and they represent independent risk factors for disability, morbidity, hospital readmission and mortality. Regular physical exercise represents a means for improving functional capacity. Nevertheless, its clinical benefit has been less investigated in elderly patients in the early phase after ACS. The HULK trial aims to investigate the clinical benefit of an early, tailored low-cost physical activity intervention in comparison to standard of care in elderly ACS patients with reduced physical performance. HULK is an investigator-initiated, prospective multicenter randomized controlled trial (NCT03021044). After successful management of the ACS acute phase and an uneventful first month, elderly (≥70 years) patients showing reduced physical performance are randomized (1:1 ratio) to either standard of care or physical activity intervention. Reduced physical performance is defined as a short physical performance battery (SPPB) score of 4-9. The early, tailored, low-cost physical intervention includes 4 sessions of physical activity with a supervisor and a home-based program of physical exercise. The chosen primary endpoint is the 6-month SPPB value. Secondary endpoints briefly include quality of life, on-treatment platelet reactivity, some laboratory data and clinical adverse events. To demonstrate an increase of at least one SPPB point in the experimental arm, a sample size of 226 patients is needed. The HULK study will test the hypothesis that an early, tailored low-cost physical activity intervention improves physical performance, quality of life, frailty status and outcome in elderly ACS patients with reduced physical performance. Clinicaltrials.gov, identifier NCT03021044, first posted January 13th, 2017.

  3. Matched-filter algorithm for subpixel spectral detection in hyperspectral image data

    NASA Astrophysics Data System (ADS)

    Borough, Howard C.

    1991-11-01

    Hyperspectral imagery, spatial imagery with associated wavelength data for every pixel, offers a significant potential for improved detection and identification of certain classes of targets. The ability to make spectral identifications of objects which only partially fill a single pixel (due to range or small size) is of considerable interest. Multiband imagery such as Landsat's 5 and 7 band imagery has demonstrated significant utility in the past. Hyperspectral imaging systems with hundreds of spectral bands offer improved performance. To explore the application of different subpixel spectral detection algorithms, a synthesized set of hyperspectral image data (hypercubes) was generated utilizing NASA earth resources and other spectral data. The data was modified using LOWTRAN 7 to model the illumination, atmospheric contributions, attenuations and viewing geometry to represent a nadir view from 10,000 ft. altitude. The base hypercube (HC) represented 16 by 21 spatial pixels with 101 wavelength samples from 0.5 to 2.5 micrometers for each pixel. Insertions were made into the base data to provide random location, random pixel percentage, and random material. Fifteen different hypercubes were generated for blind testing of candidate algorithms. An algorithm utilizing a matched filter in the spectral dimension proved surprisingly good, yielding 100% detections for pixels filled greater than 40% with a standard camouflage paint, and a 50% probability of detection for pixels filled 20% with the paint, with no false alarms. The false alarm rate as a function of the number of spectral bands in the range from 101 to 12 bands was measured and found to increase from zero to 50%, illustrating the value of a large number of spectral bands. This test was on imagery without system noise; the next step is to incorporate typical system noise sources.
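
    A minimal sketch of a spectral matched filter applied to linearly mixed, sub-pixel targets. The target and background spectra, noise level, and fill fraction below are invented stand-ins, not the synthesized hypercubes described in the abstract; only the filter construction (covariance whitening and projection onto the background-subtracted target signature) is the technique named above.

```python
import numpy as np

rng = np.random.default_rng(0)
n_bands = 101                                    # 0.5-2.5 um, as in the abstract

# Hypothetical reflectance spectra standing in for the paint target and scene background
wl = np.linspace(0.0, 1.0, n_bands)
target = 0.30 + 0.10 * np.sin(6 * np.pi * wl)
background = 0.50 + 0.05 * np.cos(3 * np.pi * wl)

# Simulated pixels: background plus noise; the first 50 pixels also contain a
# sub-pixel fraction f of the target under a linear mixing model.
n_pix, f = 2000, 0.4
pixels = background + 0.02 * rng.normal(size=(n_pix, n_bands))
pixels[:50] = (1 - f) * background + f * target + 0.02 * rng.normal(size=(50, n_bands))

# Spectral matched filter: whiten with the scene covariance, project onto the
# background-subtracted target signature, normalize to estimate the fill fraction.
mu = pixels.mean(axis=0)
cov = np.cov(pixels, rowvar=False) + 1e-6 * np.eye(n_bands)
d = target - mu
w = np.linalg.solve(cov, d)
scores = (pixels - mu) @ w / (d @ w)

print(f"mean score, target-bearing pixels: {scores[:50].mean():.2f}")
print(f"mean score, background pixels:     {scores[50:].mean():.2f}")
```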

  4. Assessment of the effect of population and diary sampling methods on estimation of school-age children exposure to fine particles.

    PubMed

    Che, W W; Frey, H Christopher; Lau, Alexis K H

    2014-12-01

    Population and diary sampling methods are employed in exposure models to sample simulated individuals and their daily activity on each simulation day. Different sampling methods may lead to variations in estimated human exposure. In this study, two population sampling methods (stratified-random and random-random) and three diary sampling methods (random resampling, diversity and autocorrelation, and Markov-chain cluster [MCC]) are evaluated. Their impacts on estimated children's exposure to ambient fine particulate matter (PM2.5 ) are quantified via case studies for children in Wake County, NC for July 2002. The estimated mean daily average exposure is 12.9 μg/m(3) for simulated children using the stratified population sampling method, and 12.2 μg/m(3) using the random sampling method. These minor differences are caused by the random sampling among ages within census tracts. Among the three diary sampling methods, there are differences in the estimated number of individuals with multiple days of exposures exceeding a benchmark of concern of 25 μg/m(3) due to differences in how multiday longitudinal diaries are estimated. The MCC method is relatively more conservative. In case studies evaluated here, the MCC method led to 10% higher estimation of the number of individuals with repeated exposures exceeding the benchmark. The comparisons help to identify and contrast the capabilities of each method and to offer insight regarding implications of method choice. Exposure simulation results are robust to the two population sampling methods evaluated, and are sensitive to the choice of method for simulating longitudinal diaries, particularly when analyzing results for specific microenvironments or for exposures exceeding a benchmark of concern. © 2014 Society for Risk Analysis.

  5. A random sampling approach for robust estimation of tissue-to-plasma ratio from extremely sparse data.

    PubMed

    Chu, Hui-May; Ette, Ene I

    2005-09-02

    This study was performed to develop a new nonparametric approach for the estimation of robust tissue-to-plasma ratio from extremely sparsely sampled paired data (ie, one sample each from plasma and tissue per subject). Tissue-to-plasma ratio was estimated from paired/unpaired experimental data using independent time points approach, area under the curve (AUC) values calculated with the naïve data averaging approach, and AUC values calculated using sampling based approaches (eg, the pseudoprofile-based bootstrap [PpbB] approach and the random sampling approach [our proposed approach]). The random sampling approach involves the use of a 2-phase algorithm. The convergence of the sampling/resampling approaches was investigated, as well as the robustness of the estimates produced by different approaches. To evaluate the latter, new data sets were generated by introducing outlier(s) into the real data set. One to 2 concentration values were inflated by 10% to 40% from their original values to produce the outliers. Tissue-to-plasma ratios computed using the independent time points approach varied between 0 and 50 across time points. The ratio obtained from AUC values acquired using the naive data averaging approach was not associated with any measure of uncertainty or variability. Calculating the ratio without regard to pairing yielded poorer estimates. The random sampling and pseudoprofile-based bootstrap approaches yielded tissue-to-plasma ratios with uncertainty and variability. However, the random sampling approach, because of the 2-phase nature of its algorithm, yielded more robust estimates and required fewer replications. Therefore, a 2-phase random sampling approach is proposed for the robust estimation of tissue-to-plasma ratio from extremely sparsely sampled data.
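
    A sketch of sampling-based estimation of the tissue-to-plasma AUC ratio in the spirit of the pseudoprofile-based bootstrap mentioned above; the paper's two-phase random sampling algorithm is not reproduced, and the sparse concentration data are simulated for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical sparse design: at each time point a few subjects each contribute
# one plasma and one tissue concentration (destructive, one sample per subject).
times = np.array([0.5, 1, 2, 4, 8, 24])                                 # h
plasma = {t: rng.lognormal(np.log(10 * np.exp(-0.15 * t)), 0.2, 4) for t in times}
tissue = {t: rng.lognormal(np.log(25 * np.exp(-0.15 * t)), 0.2, 4) for t in times}

def pseudo_auc(conc_by_time):
    """Trapezoidal AUC of one pseudoprofile (one value drawn per time point)."""
    y = np.array([rng.choice(conc_by_time[t]) for t in times])
    return np.trapz(y, times)

# Sampling-based estimate of the tissue-to-plasma AUC ratio with its variability
ratios = np.array([pseudo_auc(tissue) / pseudo_auc(plasma) for _ in range(2000)])
low, high = np.percentile(ratios, [2.5, 97.5])
print(f"tissue-to-plasma ratio: {ratios.mean():.2f} (2.5-97.5 percentile {low:.2f}-{high:.2f})")
```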

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pavlou, A. T.; Betzler, B. R.; Burke, T. P.

    Uncertainties in the composition and fabrication of fuel compacts for the Fort St. Vrain (FSV) high temperature gas reactor have been studied by performing eigenvalue sensitivity studies that represent the key uncertainties for the FSV neutronic analysis. The uncertainties for the TRISO fuel kernels were addressed by developing a suite of models for an 'average' FSV fuel compact that models the fuel as (1) a mixture of two different TRISO fuel particles representing fissile and fertile kernels, (2) a mixture of four different TRISO fuel particles representing small and large fissile kernels and small and large fertile kernels and (3) a stochastic mixture of the four types of fuel particles where every kernel has its diameter sampled from a continuous probability density function. All of the discrete diameter and continuous diameter fuel models were constrained to have the same fuel loadings and packing fractions. For the non-stochastic discrete diameter cases, the MCNP compact model arranged the TRISO fuel particles on a hexagonal honeycomb lattice. This lattice-based fuel compact was compared to a stochastic compact where the locations (and kernel diameters for the continuous diameter cases) of the fuel particles were randomly sampled. Partial core configurations were modeled by stacking compacts into fuel columns containing graphite. The differences in eigenvalues between the lattice-based and stochastic models were small but the runtime of the lattice-based fuel model was roughly 20 times shorter than with the stochastic-based fuel model. (authors)
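
    A sketch of the stochastic ingredient described above: kernel diameters drawn from a continuous probability density while holding the total fuel loading fixed across models. The distributions and volume budgets below are invented for illustration and do not reflect the actual FSV fuel specifications.

```python
import numpy as np

rng = np.random.default_rng(7)

# Assumed kernel-diameter distributions (micrometres) and per-compact kernel
# volume budgets; the real FSV specifications are not reproduced here.
specs = {
    "fissile": {"mu": 350.0, "sd": 30.0, "volume": 1.0e9},   # um, um, um^3
    "fertile": {"mu": 500.0, "sd": 40.0, "volume": 1.0e9},
}

def sample_kernels(mu, sd, volume_budget):
    """Draw kernel diameters from a continuous PDF until the requested total
    kernel volume (i.e. the fuel loading) is reached, so loading stays fixed
    across discrete-diameter and continuous-diameter models."""
    diameters, volume = [], 0.0
    while volume < volume_budget:
        d = rng.normal(mu, sd)
        if d > 0:
            diameters.append(d)
            volume += np.pi * d**3 / 6.0
    return np.array(diameters)

for kind, s in specs.items():
    kernels = sample_kernels(s["mu"], s["sd"], s["volume"])
    print(f"{kind}: {len(kernels)} kernels, mean diameter {kernels.mean():.0f} um")
```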

  7. Conic Sampling: An Efficient Method for Solving Linear and Quadratic Programming by Randomly Linking Constraints within the Interior

    PubMed Central

    Serang, Oliver

    2012-01-01

    Linear programming (LP) problems are commonly used in analysis and resource allocation, frequently surfacing as approximations to more difficult problems. Existing approaches to LP have been dominated by a small group of methods, and randomized algorithms have not enjoyed popularity in practice. This paper introduces a novel randomized method of solving LP problems by moving along the facets and within the interior of the polytope along rays randomly sampled from the polyhedral cones defined by the bounding constraints. This conic sampling method is then applied to randomly sampled LPs, and its runtime performance is shown to compare favorably to the simplex and primal affine-scaling algorithms, especially on polytopes with certain characteristics. The conic sampling method is then adapted and applied to solve a certain quadratic program, which computes a projection onto a polytope; the proposed method is shown to outperform the proprietary software Mathematica on large, sparse QP problems constructed from mass spectrometry-based proteomics. PMID:22952741

  8. Predicate Argument Structure Frames for Modeling Information in Operative Notes

    PubMed Central

    Wang, Yan; Pakhomov, Serguei; Melton, Genevieve B.

    2015-01-01

    The rich information about surgical procedures contained in operative notes is a valuable data source for improving the clinical evidence base and clinical research. In this study, we propose a set of Predicate Argument Structure (PAS) frames for surgical action verbs to assist in the creation of an information extraction (IE) system to automatically extract details about the techniques, equipment, and operative steps from operative notes. We created PropBank style PAS frames for the 30 top surgical action verbs based on examination of randomly selected sample sentences from 3,000 Laparoscopic Cholecystectomy notes. To assess completeness of the PAS frames to represent usage of same action verbs, we evaluated the PAS frames created on sample sentences from operative notes of 6 other gastrointestinal surgical procedures. Our results showed that the PAS frames created with one type of surgery can successfully denote the usage of the same verbs in operative notes of broader surgical categories. PMID:23920664

  9. Family caregiving to those with dementia in rural Alabama: racial similarities and differences.

    PubMed

    Kosberg, Jordan I; Kaufman, Allan V; Burgio, Louis D; Leeper, James D; Sun, Fei

    2007-02-01

    This study explored differences and similarities in the experiences of African American and White family caregivers of dementia patients living in rural Alabama. This cross-sectional survey used a caregiving stress model to investigate the interrelationships between caregiving burden, mediators, and outcomes. Random-digit-dialing telephone interviews were used to obtain data on a probability sample of 74 non-Hispanic White and 67 African American caregivers. White caregivers were more likely to be married and older, used acceptance and humor as coping styles, and had fewer financial problems. African American caregivers gave more hours of care, used religion and denial as coping styles, and were less burdened. The authors have developed a methodology for obtaining a representative sample of African American and White rural caregivers. Further investigations are needed of the interactions between urban/rural location and ethnic/racial backgrounds of dementia caregivers for heuristic and applied reasons.

  10. A machine learning model with human cognitive biases capable of learning from small and biased datasets.

    PubMed

    Taniguchi, Hidetaka; Sato, Hiroshi; Shirakawa, Tomohiro

    2018-05-09

    Human learners can generalize a new concept from a small number of samples. In contrast, conventional machine learning methods require large amounts of data to address the same types of problems. Humans have cognitive biases that promote fast learning. Here, we developed a method to reduce the gap between human beings and machines in this type of inference by utilizing cognitive biases. We implemented a human cognitive model into machine learning algorithms and compared their performance with the currently most popular methods, naïve Bayes, support vector machine, neural networks, logistic regression and random forests. We focused on the task of spam classification, which has been studied for a long time in the field of machine learning and often requires a large amount of data to obtain high accuracy. Our models achieved superior performance with small and biased samples in comparison with other representative machine learning methods.
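
    The cognitive-bias model itself is not described in this record, so the sketch below only reproduces the baseline comparison the abstract mentions: several standard classifiers trained on a deliberately small, class-imbalanced ("biased") training set. The data are synthetic stand-ins generated with scikit-learn, not a real spam corpus.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

# Synthetic stand-in for a spam task: a small, imbalanced training set of 50
# messages (10% positive) and a large held-out test set.
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
rng = np.random.default_rng(0)
pos, neg = np.where(y == 1)[0], np.where(y == 0)[0]
train = np.concatenate([rng.choice(pos, 5, replace=False),
                        rng.choice(neg, 45, replace=False)])
test = np.setdiff1d(np.arange(len(y)), train)

models = {
    "naive Bayes": GaussianNB(),
    "SVM": SVC(),
    "logistic regression": LogisticRegression(max_iter=1000),
    "random forest": RandomForestClassifier(random_state=0),
}
for name, model in models.items():
    model.fit(X[train], y[train])
    print(f"{name}: test accuracy = {model.score(X[test], y[test]):.3f}")
```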

  11. Incidence and seroprevalence of tularaemia in Finland, 1995 to 2013: regional epidemics with cyclic pattern.

    PubMed

    Rossow, H; Ollgren, J; Hytonen, J; Rissanen, H; Huitu, O; Henttonen, H; Kuusi, M; Vapalahti, O

    2015-08-20

    We studied the incidence of reported tularaemia by year and region and the prevalence of antibodies against Francisella tularensis in the adult general population in Finland. Moreover, we assessed the correlation between vole population cycles and human tularaemia outbreaks. The seroprevalence study made use of serum samples from a nationwide population-based health survey (Health 2000). The samples of 1,045 randomly selected persons, representative for the Finnish population in each region, were screened with an enzyme-linked immunosorbent assay (ELISA) for the presence of IgG antibodies against F. tularensis, and positive results were further confirmed by immunoblotting. A serological response to F. tularensis was found in 2% (95% confidence interval: 1.1–3.5) of the population. Incidence and seroprevalence were highest in the same areas, and vole population peaks clearly preceded tularaemia outbreaks one year later.

  12. The relationship among the resiliency practices in supply chain, financial performance, and competitive advantage in manufacturing firms in Indonesia and Sierra Leone

    NASA Astrophysics Data System (ADS)

    Musa, I.; Nyoman Pujawan, I.

    2018-04-01

    Current supply chain management (SCM) has become a potentially treasured way of safeguarding competitive advantage and improving organizational performance since competition is no longer between organizations, but among supply chains. This research conceptualizes and develops four resiliency practices (Flexibility, Redundancy, Collaboration and Agility) and tests the relationships between organizations’ financial performance and competitive advantage in manufacturing firms. The study involves manufacturing firms in Indonesia and Sierra Leone. The study used stratified random sampling to pick a sample size of 95 manufacturing firms, which represented different industrial sectors. The respondents were mainly managers of different manufacturing companies. The relationships proposed in the conceptual framework were tested using correlation analysis. The results indicate that higher levels of resilience practices in manufacturing firms can lead to enhanced competitive advantage and improved financial performance.

  13. Uncertainty, learning, and the optimal management of wildlife

    USGS Publications Warehouse

    Williams, B.K.

    2001-01-01

    Wildlife management is limited by uncontrolled and often unrecognized environmental variation, by limited capabilities to observe and control animal populations, and by a lack of understanding about the biological processes driving population dynamics. In this paper I describe a comprehensive framework for management that includes multiple models and likelihood values to account for structural uncertainty, along with stochastic factors to account for environmental variation, random sampling, and partial controllability. Adaptive optimization is developed in terms of the optimal control of incompletely understood populations, with the expected value of perfect information measuring the potential for improving control through learning. The framework for optimal adaptive control is generalized by including partial observability and non-adaptive, sample-based updating of model likelihoods. Passive adaptive management is derived as a special case of constrained adaptive optimization, representing a potentially efficient suboptimal alternative that nonetheless accounts for structural uncertainty.

  14. [Validation of the Eating Attitudes Test as a screening instrument for eating disorders in general population].

    PubMed

    Peláez-Fernández, María Angeles; Ruiz-Lázaro, Pedro Manuel; Labrador, Francisco Javier; Raich, Rosa María

    2014-02-20

    To validate the best cut-off point of the Eating Attitudes Test (EAT-40), Spanish version, for the screening of eating disorders (ED) in the general population. This was a transversal cross-sectional study. The EAT-40 Spanish version was administered to a representative sample of 1,543 students, age range 12 to 21 years, in the Region of Madrid. Six hundred and two participants (probable cases and a random sample of controls) were interviewed. The best diagnostic prediction was obtained with a cut-off point of 21, with sensitivity: 88.2%; specificity: 62.1%; positive predictive value: 17.7%; negative predictive value: 62.1%. Use of a cut-off point of 21 is recommended in epidemiological studies of eating disorders in the Spanish general population. Copyright © 2012 Elsevier España, S.L. All rights reserved.

  15. Organizational characteristics influencing nursing home social service directors' qualifications: a national study.

    PubMed

    Simons, Kelsey V

    2006-11-01

    This research sought to identify organizational characteristics associated with the amount of professional qualifications among a nationally representative sample of nursing home social service directors. A self-administered survey was sent to directors in 675 facilities randomly sampled from a federal database, excluding facilities with fewer than 120 beds that are not required to staff a full-time social worker. The response rate was 45 percent (N = 299). Univariate results showed that most respondents possessed a social work degree, most lacked licensure, and few were clinically supervised. A multiple regression analysis found that nonprofit, independently owned facilities in rural areas staffed social service directors who were significantly more qualified than directors in for-profit, chain-affiliated facilities in urban and suburban areas. Facilities with fewer psychosocial deficiencies and higher occupancy rates employed social service directors with greater qualifications. The implications of these findings for social work education, practice, policy, and research are discussed.

  16. Cognitive-Behavioral Treatment of Panic Disorder in Adolescence

    ERIC Educational Resources Information Center

    Pincus, Donna B.; May, Jill Ehrenreich; Whitton, Sarah W.; Mattis, Sara G.; Barlow, David H.

    2010-01-01

    This investigation represents the first randomized controlled trial to evaluate the feasibility and efficacy of Panic Control Treatment for Adolescents (PCT-A). Thirteen adolescents, ages 14 to 17, were randomized to 11 weekly sessions of PCT-A treatment, whereas 13 were randomized to a self-monitoring control group. Results indicate that…

  17. Randomized branch sampling

    Treesearch

    Harry T. Valentine

    2002-01-01

    Randomized branch sampling (RBS) is a special application of multistage probability sampling (see Sampling, environmental), which was developed originally by Jessen [3] to estimate fruit counts on individual orchard trees. In general, the method can be used to obtain estimates of many different attributes of trees or other branched plants. The usual objective of RBS is...

  18. Deterministic multidimensional nonuniform gap sampling.

    PubMed

    Worley, Bradley; Powers, Robert

    2015-12-01

    Born from empirical observations in nonuniformly sampled multidimensional NMR data relating to gaps between sampled points, the Poisson-gap sampling method has enjoyed widespread use in biomolecular NMR. While the majority of nonuniform sampling schemes are fully randomly drawn from probability densities that vary over a Nyquist grid, the Poisson-gap scheme employs constrained random deviates to minimize the gaps between sampled grid points. We describe a deterministic gap sampling method, based on the average behavior of Poisson-gap sampling, which performs comparably to its random counterpart with the additional benefit of completely deterministic behavior. We also introduce a general algorithm for multidimensional nonuniform sampling based on a gap equation, and apply it to yield a deterministic sampling scheme that combines burst-mode sampling features with those of Poisson-gap schemes. Finally, we derive a relationship between stochastic gap equations and the expectation value of their sampling probability densities. Copyright © 2015 Elsevier Inc. All rights reserved.
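
    A sketch of the random Poisson-gap schedule whose average behavior the deterministic method above is built on. The sine-weighted gap profile and the density-adjustment loop follow the usual description of Poisson-gap sampling; the grid size, sample count, and weighting are arbitrary choices for illustration.

```python
import numpy as np

def poisson_gap(n_grid, n_samples, w=2.0, seed=0):
    """Random Poisson-gap schedule on a 1-D Nyquist grid (sketch).

    Gaps between kept grid points are 1 plus a Poisson deviate whose mean
    follows a sine profile (small gaps near both ends of the grid).  The
    rate is rescaled until a draw with exactly n_samples points is found,
    which usually happens within a few tens of attempts.
    """
    rng = np.random.default_rng(seed)
    adjust = 1.0
    points = []
    for _ in range(200):
        points, i = [], 0
        while i < n_grid:
            points.append(i)
            lam = adjust * w * np.sin(np.pi * (i + 0.5) / n_grid)
            i += 1 + rng.poisson(lam)
        if len(points) == n_samples:
            break
        adjust *= len(points) / n_samples     # too many points -> larger gaps
    return np.array(points)

schedule = poisson_gap(n_grid=256, n_samples=64)
print(len(schedule), "points, first ten:", schedule[:10])
```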

  19. Assessing the Relationship of Ancient and Modern Populations

    PubMed Central

    Schraiber, Joshua G.

    2018-01-01

    Genetic material sequenced from ancient samples is revolutionizing our understanding of the recent evolutionary past. However, ancient DNA is often degraded, resulting in low coverage, error-prone sequencing. Several solutions exist to this problem, ranging from simple approaches, such as selecting a read at random for each site, to more complicated approaches involving genotype likelihoods. In this work, we present a novel method for assessing the relationship of an ancient sample with a modern population, while accounting for sequencing error and postmortem damage by analyzing raw reads from multiple ancient individuals simultaneously. We show that, when analyzing SNP data, it is better to sequence more ancient samples to low coverage: two samples sequenced to 0.5× coverage provide better resolution than a single sample sequenced to 2× coverage. We also examined the power to detect whether an ancient sample is directly ancestral to a modern population, finding that, with even a few high coverage individuals, even ancient samples that are very slightly diverged from the modern population can be detected with ease. When we applied our approach to European samples, we found that no ancient samples represent direct ancestors of modern Europeans. We also found that, as shown previously, the most ancient Europeans appear to have had the smallest effective population sizes, indicating a role for agriculture in modern population growth. PMID:29167200

  20. What proportion of people who try one cigarette become daily smokers? A meta analysis of representative surveys.

    PubMed

    Birge, Max; Duffy, Stephen; Miler, Joanna Astrid; Hajek, Peter

    2017-11-04

    The 'conversion rate' from initial experimentation to daily smoking is a potentially important metric of smoking behavior, but estimates of it based on current representative data are lacking. The Global Health Data Exchange was searched for representative surveys conducted in English speaking, developed countries after year 2000 that included questions about ever trying a cigarette and ever smoking daily. The initial search identified 2776 surveys that were further screened for language, location, year, sample size, survey structure and representativeness. 44 surveys that passed the screening process were accessed and their codebooks were examined to see whether the two questions of interest were included. Eight datasets allowed extraction or estimation of relevant information. Survey quality was assessed with regards to response rates, sampling methods and data collection procedures. PRISMA guidelines were followed, with explicit rules for approaching derived variables and skip patterns. Proportions were pooled using random effects meta-analysis. The eight surveys used representative samples of the general adult population. Response rates varied from 45% to 88%. Survey methods were on par with the best practice in this field. Altogether 216,314 respondents were included of whom 60.3% (95%CI 51.3-69.3) ever tried a cigarette. Among those, 68.9% (95% CI 60.9-76.9%) progressed to daily smoking. Over two thirds of people who try one cigarette become, at least temporarily, daily smokers. The finding provides strong support for the current efforts to reduce cigarette experimentation among adolescents. The transition from trying the first cigarette through occasional to daily smoking usually implies that a recreational activity is turning into a compulsive need that has to be satisfied virtually continuously. The 'conversion rate' from initial experimentation to daily smoking is thus a potentially important metric of smoking behavior, but estimates of it based on representative data are lacking. The present meta analysis addressed this gap. Currently, about two thirds of non-smokers experimenting with cigarettes progress to daily smoking. The finding supports strongly the current efforts to reduce cigarette experimentation among adolescents. © The Author 2017. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
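
    A sketch of random-effects pooling of proportions by the DerSimonian-Laird method, which is one common way to carry out the meta-analysis described above. The survey counts are invented, not the eight surveys the paper analysed, and published analyses often pool on the logit scale, which is omitted here for brevity.

```python
import numpy as np

# Illustrative counts (ever-daily smokers among those who ever tried a cigarette)
# for eight hypothetical surveys.
events = np.array([1200,  800, 2500,  640, 3100,  450,  980, 1500])
n      = np.array([1700, 1250, 3500, 1000, 4300,  700, 1400, 2200])

p = events / n
var = p * (1 - p) / n                         # within-survey variance
w = 1 / var

# DerSimonian-Laird estimate of between-survey heterogeneity tau^2
p_fixed = np.sum(w * p) / np.sum(w)
Q = np.sum(w * (p - p_fixed) ** 2)
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - (len(p) - 1)) / c)

# Random-effects pooled proportion and 95% confidence interval
w_re = 1 / (var + tau2)
p_re = np.sum(w_re * p) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
print(f"pooled proportion: {p_re:.3f} (95% CI {p_re - 1.96 * se:.3f} to {p_re + 1.96 * se:.3f})")
```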

  1. Recruitment of Older Adults: Success May Be in the Details.

    PubMed

    McHenry, Judith C; Insel, Kathleen C; Einstein, Gilles O; Vidrine, Amy N; Koerner, Kari M; Morrow, Daniel G

    2015-10-01

    Describe recruitment strategies used in a randomized clinical trial of a behavioral prospective memory intervention to improve medication adherence for older adults taking antihypertensive medication. Recruitment strategies represent 4 themes: accessing an appropriate population, communication and trust-building, providing comfort and security, and expressing gratitude. Recruitment activities resulted in 276 participants with a mean age of 76.32 years, and study enrollment included 207 women, 69 men, and 54 persons representing ethnic minorities. Recruitment success was linked to cultivating relationships with community-based organizations, face-to-face contact with potential study participants, and providing service (e.g., blood pressure checks) as an access point to eligible participants. Seventy-two percent of potential participants who completed a follow-up call and met eligibility criteria were enrolled in the study. The attrition rate was 14.34%. The projected increase in the number of older adults intensifies the need to study interventions that improve health outcomes. The challenge is to recruit sufficient numbers of participants who are also representative of older adults to test these interventions. Failing to recruit a sufficient and representative sample can compromise statistical power and the generalizability of study findings. © The Author 2012. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  2. A stratified two-stage sampling design for digital soil mapping in a Mediterranean basin

    NASA Astrophysics Data System (ADS)

    Blaschek, Michael; Duttmann, Rainer

    2015-04-01

    The quality of environmental modelling results often depends on reliable soil information. In order to obtain soil data in an efficient manner, several sampling strategies are at hand depending on the level of prior knowledge and the overall objective of the planned survey. This study focuses on the collection of soil samples considering available continuous secondary information in an undulating, 16 km²-sized river catchment near Ussana in southern Sardinia (Italy). A design-based, stratified, two-stage sampling design has been applied aiming at the spatial prediction of soil property values at individual locations. The stratification based on quantiles from density functions of two land-surface parameters - topographic wetness index and potential incoming solar radiation - derived from a digital elevation model. Combined with four main geological units, the applied procedure led to 30 different classes in the given test site. Up to six polygons of each available class were selected randomly excluding those areas smaller than 1ha to avoid incorrect location of the points in the field. Further exclusion rules were applied before polygon selection masking out roads and buildings using a 20m buffer. The selection procedure was repeated ten times and the set of polygons with the best geographical spread were chosen. Finally, exact point locations were selected randomly from inside the chosen polygon features. A second selection based on the same stratification and following the same methodology (selecting one polygon instead of six) was made in order to create an appropriate validation set. Supplementary samples were obtained during a second survey focusing on polygons that have either not been considered during the first phase at all or were not adequately represented with respect to feature size. In total, both field campaigns produced an interpolation set of 156 samples and a validation set of 41 points. The selection of sample point locations has been done using ESRI software (ArcGIS) extended by Hawth's Tools and later on its replacement the Geospatial Modelling Environment (GME). 88% of all desired points could actually be reached in the field and have been successfully sampled. Our results indicate that the sampled calibration and validation sets are representative for each other and could be successfully used as interpolation data for spatial prediction purposes. With respect to soil textural fractions, for instance, equal multivariate means and variance homogeneity were found for the two datasets as evidenced by significant (P > 0.05) Hotelling T²-test (2.3 with df1 = 3, df2 = 193) and Bartlett's test statistics (6.4 with df = 6). The multivariate prediction of clay, silt and sand content using a neural network residual cokriging approach reached an explained variance level of 56%, 47% and 63%. Thus, the presented case study is a successful example of considering readily available continuous information on soil forming factors such as geology and relief as stratifying variables for designing sampling schemes in digital soil mapping projects.
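
    A GIS-free sketch of the two stages described above: polygons smaller than 1 ha are excluded, up to six polygons are drawn at random per stratum, and one random location is then drawn inside each selected polygon. The stratum labels and areas are simulated stand-ins for the geology-by-wetness-by-radiation classes; a real workflow would operate on the actual polygon geometries.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical polygon table standing in for the GIS layer: a stratum label
# (geology x TWI quantile x radiation quantile, 30 classes) and an area in ha.
n_poly = 3000
strata = rng.integers(0, 30, size=n_poly)
areas = rng.lognormal(mean=0.5, sigma=1.0, size=n_poly)

eligible = areas >= 1.0                            # exclude polygons < 1 ha
first_stage = []                                   # stage 1: polygons per stratum
for s in range(30):
    pool = np.where(eligible & (strata == s))[0]
    if pool.size:
        k = min(6, pool.size)                      # up to six polygons per class
        first_stage.extend(rng.choice(pool, size=k, replace=False))

# Stage 2: one random location inside each selected polygon.  A real workflow
# would draw a coordinate within the polygon geometry (after buffering roads
# and buildings); here a fractional (u, v) position stands in for that step.
second_stage = {int(p): tuple(rng.uniform(0, 1, 2)) for p in first_stage}
print(f"{len(first_stage)} polygons selected, {len(second_stage)} point locations generated")
```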

  3. A Bayesian sequential design with adaptive randomization for 2-sided hypothesis test.

    PubMed

    Yu, Qingzhao; Zhu, Lin; Zhu, Han

    2017-11-01

    Bayesian sequential and adaptive randomization designs are gaining popularity in clinical trials thanks to their potentials to reduce the number of required participants and save resources. We propose a Bayesian sequential design with adaptive randomization rates so as to more efficiently attribute newly recruited patients to different treatment arms. In this paper, we consider 2-arm clinical trials. Patients are allocated to the 2 arms with a randomization rate to achieve minimum variance for the test statistic. Algorithms are presented to calculate the optimal randomization rate, critical values, and power for the proposed design. Sensitivity analysis is implemented to check the influence on design by changing the prior distributions. Simulation studies are applied to compare the proposed method and traditional methods in terms of power and actual sample sizes. Simulations show that, when total sample size is fixed, the proposed design can obtain greater power and/or cost smaller actual sample size than the traditional Bayesian sequential design. Finally, we apply the proposed method to a real data set and compare the results with the Bayesian sequential design without adaptive randomization in terms of sample sizes. The proposed method can further reduce required sample size. Copyright © 2017 John Wiley & Sons, Ltd.
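
    The paper's Bayesian algorithm for the optimal randomization rate is not reproduced here; as a stand-in, the sketch uses the classical variance-minimizing (Neyman) allocation, updating the randomization rate from the arms' estimated standard deviations as outcomes accrue. The outcome distributions and burn-in rule are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2017)

# Hypothetical outcome distributions for the two arms (unknown to the trial)
true_mean = {"A": 0.0, "B": 0.5}
true_sd   = {"A": 1.0, "B": 2.0}

outcomes = {"A": [], "B": []}
p_assign_A = 0.5                                   # start with 1:1 randomization

for patient in range(200):
    arm = "A" if rng.uniform() < p_assign_A else "B"
    outcomes[arm].append(rng.normal(true_mean[arm], true_sd[arm]))

    # After a short burn-in, move the randomization rate toward the allocation
    # that minimizes the variance of the difference-in-means statistic
    # (Neyman allocation: proportional to each arm's estimated SD).
    if min(len(outcomes["A"]), len(outcomes["B"])) >= 10:
        sd_a = np.std(outcomes["A"], ddof=1)
        sd_b = np.std(outcomes["B"], ddof=1)
        p_assign_A = sd_a / (sd_a + sd_b)

print(f"final sizes: A = {len(outcomes['A'])}, B = {len(outcomes['B'])}; "
      f"target fraction to A = {p_assign_A:.2f}")
```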

  4. 40 CFR Appendix I to Part 261 - Representative Sampling Methods

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 25 2010-07-01 2010-07-01 false Representative Sampling Methods I...—Representative Sampling Methods The methods and equipment used for sampling waste materials will vary with the form and consistency of the waste materials to be sampled. Samples collected using the sampling...

  5. 40 CFR Appendix I to Part 261 - Representative Sampling Methods

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 26 2011-07-01 2011-07-01 false Representative Sampling Methods I...—Representative Sampling Methods The methods and equipment used for sampling waste materials will vary with the form and consistency of the waste materials to be sampled. Samples collected using the sampling...

  6. Breaking through the bandwidth barrier in distributed fiber vibration sensing by sub-Nyquist randomized sampling

    NASA Astrophysics Data System (ADS)

    Zhang, Jingdong; Zhu, Tao; Zheng, Hua; Kuang, Yang; Liu, Min; Huang, Wei

    2017-04-01

    The round-trip time of the light pulse limits the maximum detectable frequency response range of vibration in phase-sensitive optical time domain reflectometry (φ-OTDR). We propose a method to break the frequency response range restriction of a φ-OTDR system by modulating the light pulse interval randomly, which enables random sampling at every vibration point along a long sensing fiber. This sub-Nyquist randomized sampling method is suited for detecting sparse, wideband-frequency vibration signals. A resonance vibration signal with dozens of frequency components extending up to the MHz range and a 1.153 MHz single-frequency vibration signal are clearly identified over a sensing range of 9.6 km with a 10 kHz maximum sampling rate.
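
    A sketch of the core idea: with randomized pulse intervals, a tone far above the mean sampling rate can still be located by a direct nonuniform spectral estimate, because the aliases a regular clock would produce are spread into a low noise floor. The signal frequency, mean rate, jitter distribution, and noise level are illustrative assumptions, not the fiber-sensing system itself.

```python
import numpy as np

rng = np.random.default_rng(5)

f_sig = 1.153e6           # Hz: vibration tone far above the mean sampling rate
mean_rate = 10e3          # Hz: mean pulse repetition (sampling) rate
n = 1000

# Randomized pulse intervals give nonuniform sample times (sub-Nyquist on average)
intervals = rng.uniform(0.5, 1.5, size=n) / mean_rate
t = np.cumsum(intervals)
x = np.sin(2 * np.pi * f_sig * t) + 0.2 * rng.normal(size=n)

# Direct nonuniform spectral estimate over the scanned band: the true MHz-range
# tone stands out above the smeared alias floor.
freqs = np.arange(1.0e6, 1.3e6, 5.0)
power = np.array([np.abs(np.exp(-2j * np.pi * f * t) @ x) for f in freqs]) / n
print(f"detected peak at {freqs[np.argmax(power)] / 1e6:.4f} MHz")
```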

  7. Diagnostic-test evaluation of immunoassays for anti-Toxoplasma gondii IgG antibodies in a random sample of Mexican population.

    PubMed

    Caballero-Ortega, Heriberto; Castillo-Cruz, Rocío; Murieta, Sandra; Ortíz-Alegría, Luz Belinda; Calderón-Segura, Esther; Conde-Glez, Carlos J; Cañedo-Solares, Irma; Correa, Dolores

    2014-05-14

    There are few articles on evaluation of Toxoplasma gondii serological tests. Besides, commercially available tests are not always useful and are expensive for studies in the open population. The aim of this study was to evaluate in-house ELISA and western blot for IgG antibodies in a representative sample of people living in Mexico. Three hundred and five serum samples were randomly selected from two national seroepidemiological survey banks; they were taken from men and women of all ages and from all areas of the country. ELISA cut-off was established using the mean plus three standard deviations of negative samples. Western blots were analysed by two experienced technicians and positivity was established according to the presence of at least three diagnostic bands. A commercial ELISA kit was used as a third test. Two reference standards were built up: one using concordant results of two assays leaving the evaluated test out (OUT) and the other in which the evaluated test was included (IN) with at least two concordant results to define diagnosis. The lowest values of diagnostic parameters were obtained with the OUT reference standards: in-house ELISA had 96.9% sensitivity, 62.1% specificity, 49.6% PPV, 98.1% NPV and 71.8% accuracy, while western blot presented 81.8%, 89.7%, 84.0%, 88.2% and 86.6% values and the best kappa coefficient (0.72-0.82). The in-house ELISA is useful for screening people of Mexico, due to its high sensitivity, while western blot may be used to confirm diagnosis. These techniques might prove useful in other Latin American countries.

  8. Survey of Obstetrician-Gynecologists in the United States About Toxoplasmosis

    PubMed Central

    Dietz, Vance J.; Power, Michael; Lopez, Adriana; Wilson, Marianna; Navin, Thomas R.; Gibbs, Ronald; Schulkin, Jay

    2001-01-01

    Background: Although the incidence of toxoplasmosis is low in the United States, up to 6000 congenital cases occur annually. In September 1998, the Centers for Disease Control and Prevention held a conference about toxoplasmosis; participants recommended a survey of the toxoplasmosis-related knowledge and practices of obstetrician-gynecologists and the development of professional educational materials for them. Methods: In the fall of 1999, surveys were mailed to a 2% random sample of American College of Obstetricians and Gynecologists (ACOG) members and to a demographically representative group of ACOG members known as the Collaborative Ambulatory Research Network (CARN). Responses were not significantly different for the random and CARN groups for most questions (p value shown when different). Results: Among 768 US practicing ACOG members surveyed, 364 (47%) responded. Seven per cent (CARN 10%, random 5%) had diagnosed one or more case(s) of acute toxoplasmosis in the past year. Respondents were well-informed about how to prevent toxoplasmosis. However, only 12% (CARN 11%, random 12%) indicated that a positive Toxoplasma IgM test might be a false-positive result, and only 11% (CARN 14%, random 9%) were aware that the Food and Drug Administration sent an advisory to all ACOG members in 1997 stating that some Toxoplasma IgM test kits have high false-positive rates. Most of those surveyed (CARN 70%, random 59%; χ2 p < 0.05) were opposed to universal screening of pregnant women. Conclusions: Many US obstetrician-gynecologists will encounter acute toxoplasmosis during their careers, but they are frequently uncertain about interpretation of the laboratory tests for the disease. Most would not recommend universal screening of pregnant women. PMID:11368255

  9. Nonlinear Spatial Inversion Without Monte Carlo Sampling

    NASA Astrophysics Data System (ADS)

    Curtis, A.; Nawaz, A.

    2017-12-01

    High-dimensional, nonlinear inverse or inference problems usually have non-unique solutions. The distribution of solutions are described by probability distributions, and these are usually found using Monte Carlo (MC) sampling methods. These take pseudo-random samples of models in parameter space, calculate the probability of each sample given available data and other information, and thus map out high or low probability values of model parameters. However, such methods would converge to the solution only as the number of samples tends to infinity; in practice, MC is found to be slow to converge, convergence is not guaranteed to be achieved in finite time, and detection of convergence requires the use of subjective criteria. We propose a method for Bayesian inversion of categorical variables such as geological facies or rock types in spatial problems, which requires no sampling at all. The method uses a 2-D Hidden Markov Model over a grid of cells, where observations represent localized data constraining the model in each cell. The data in our example application are seismic properties such as P- and S-wave impedances or rock density; our model parameters are the hidden states and represent the geological rock types in each cell. The observations at each location are assumed to depend on the facies at that location only - an assumption referred to as `localized likelihoods'. However, the facies at a location cannot be determined solely by the observation at that location as it also depends on prior information concerning its correlation with the spatial distribution of facies elsewhere. Such prior information is included in the inversion in the form of a training image which represents a conceptual depiction of the distribution of local geologies that might be expected, but other forms of prior information can be used in the method as desired. The method provides direct (pseudo-analytic) estimates of posterior marginal probability distributions over each variable, so these do not need to be estimated from samples as is required in MC methods. On a 2-D test example the method is shown to outperform previous methods significantly, and at a fraction of the computational cost. In many foreseeable applications there are therefore no serious impediments to extending the method to 3-D spatial models.
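
    A 1-D analogue of the sampling-free inference described above: with localized Gaussian likelihoods and a Markov prior over two hypothetical facies, the forward-backward recursions give exact posterior marginals at every cell without any Monte Carlo sampling. The 2-D Hidden Markov Model and training-image prior of the abstract are not reproduced; all parameters below are illustrative.

```python
import numpy as np

# Two hypothetical facies (0 = shale, 1 = sand) along a 1-D line of cells,
# a Markov prior encoding spatial continuity, and Gaussian impedance likelihoods.
trans = np.array([[0.9, 0.1],
                  [0.2, 0.8]])
prior0 = np.array([0.5, 0.5])
means, sd = np.array([6.0, 8.0]), 1.0

rng = np.random.default_rng(0)
true_states = [0]
for _ in range(49):
    true_states.append(rng.choice(2, p=trans[true_states[-1]]))
obs = rng.normal(means[true_states], sd)

def gauss(x, mu, s):
    return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))

like = gauss(obs[:, None], means[None, :], sd)     # localized likelihoods

# Forward (filtering) and backward passes, normalized cell by cell
alpha, beta = np.zeros_like(like), np.ones_like(like)
alpha[0] = prior0 * like[0]
alpha[0] /= alpha[0].sum()
for k in range(1, len(obs)):
    alpha[k] = like[k] * (alpha[k - 1] @ trans)
    alpha[k] /= alpha[k].sum()
for k in range(len(obs) - 2, -1, -1):
    beta[k] = trans @ (like[k + 1] * beta[k + 1])
    beta[k] /= beta[k].sum()

posterior = alpha * beta
posterior /= posterior.sum(axis=1, keepdims=True)  # exact marginals, no sampling
accuracy = np.mean(posterior.argmax(axis=1) == true_states)
print(f"marginal MAP accuracy against the true facies: {accuracy:.2f}")
```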

  10. 78 FR 57033 - United States Standards for Condition of Food Containers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-17

    ... containers during production. Stationary lot sampling is the process of randomly selecting sample units from.... * * * * * Stationary lot sampling. The process of randomly selecting sample units from a lot whose production has been... less than 1/16-inch Stringy seal (excessive plastic threads showing at edge of seal 222 area...

  11. Health indicators: eliminating bias from convenience sampling estimators.

    PubMed

    Hedt, Bethany L; Pagano, Marcello

    2011-02-28

    Public health practitioners are often called upon to make inference about a health indicator for a population at large when the sole available information are data gathered from a convenience sample, such as data gathered on visitors to a clinic. These data may be of the highest quality and quite extensive, but the biases inherent in a convenience sample preclude the legitimate use of powerful inferential tools that are usually associated with a random sample. In general, we know nothing about those who do not visit the clinic beyond the fact that they do not visit the clinic. An alternative is to take a random sample of the population. However, we show that this solution would be wasteful if it excluded the use of available information. Hence, we present a simple annealing methodology that combines a relatively small, and presumably far less expensive, random sample with the convenience sample. This allows us to not only take advantage of powerful inferential tools, but also provides more accurate information than that available from just using data from the random sample alone. Copyright © 2011 John Wiley & Sons, Ltd.

  12. FastRNABindR: Fast and Accurate Prediction of Protein-RNA Interface Residues.

    PubMed

    El-Manzalawy, Yasser; Abbas, Mostafa; Malluhi, Qutaibah; Honavar, Vasant

    2016-01-01

    A wide range of biological processes, including regulation of gene expression, protein synthesis, and replication and assembly of many viruses are mediated by RNA-protein interactions. However, experimental determination of the structures of protein-RNA complexes is expensive and technically challenging. Hence, a number of computational tools have been developed for predicting protein-RNA interfaces. Some of the state-of-the-art protein-RNA interface predictors rely on position-specific scoring matrix (PSSM)-based encoding of the protein sequences. The computational efforts needed for generating PSSMs severely limits the practical utility of protein-RNA interface prediction servers. In this work, we experiment with two approaches, random sampling and sequence similarity reduction, for extracting a representative reference database of protein sequences from more than 50 million protein sequences in UniRef100. Our results suggest that random sampled databases produce better PSSM profiles (in terms of the number of hits used to generate the profile and the distance of the generated profile to the corresponding profile generated using the entire UniRef100 data as well as the accuracy of the machine learning classifier trained using these profiles). Based on our results, we developed FastRNABindR, an improved version of RNABindR for predicting protein-RNA interface residues using PSSM profiles generated using 1% of the UniRef100 sequences sampled uniformly at random. To the best of our knowledge, FastRNABindR is the only protein-RNA interface residue prediction online server that requires generation of PSSM profiles for query sequences and accepts hundreds of protein sequences per submission. Our approach for determining the optimal BLAST database for a protein-RNA interface residue classification task has the potential of substantially speeding up, and hence increasing the practical utility of, other amino acid sequence based predictors of protein-protein and protein-DNA interfaces.
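
    A sketch of uniform random subsampling of a FASTA database, keeping each record with probability 0.01 (roughly the 1% of UniRef100 used above). The file names are hypothetical and the paper's exact sampling procedure may differ; the reduced file could subsequently be formatted as a BLAST database for PSSM generation.

```python
import random

def sample_fasta(path_in, path_out, fraction=0.01, seed=0):
    """Keep each FASTA record independently with probability `fraction`,
    giving an approximately uniform random subsample of the database.
    The reduced file can then be formatted as a BLAST database
    (e.g. with makeblastdb) and used for PSSM generation."""
    rng = random.Random(seed)
    keep = False
    with open(path_in) as fin, open(path_out, "w") as fout:
        for line in fin:
            if line.startswith(">"):          # header line starts a new record
                keep = rng.random() < fraction
            if keep:
                fout.write(line)

# Hypothetical file names; uniref100.fasta is assumed to exist locally.
# sample_fasta("uniref100.fasta", "uniref100_1pct.fasta", fraction=0.01)
```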

  13. WE-AB-207A-04: Random Undersampled Cone Beam CT: Theoretical Analysis and a Novel Reconstruction Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, C; Chen, L; Jia, X

    2016-06-15

    Purpose: Reducing x-ray exposure and speeding up data acquisition have motivated studies on projection data undersampling. An important question is, for a given undersampling ratio, what the optimal undersampling approach is. In this study, we propose a new undersampling scheme: random-ray undersampling. We will mathematically analyze its projection matrix properties and demonstrate its advantages. We will also propose a new reconstruction method that simultaneously performs CT image reconstruction and projection domain data restoration. Methods: By representing the projection operator in the basis of singular vectors of the full projection operator, matrix representations for an undersampling case can be generated and numerical singular value decomposition can be performed. We compared properties of matrices among three undersampling approaches: regular-view undersampling, regular-ray undersampling, and the proposed random-ray undersampling. To accomplish CT reconstruction for random undersampling, we developed a novel method that iteratively performs CT reconstruction and missing projection data restoration via regularization approaches. Results: For a given undersampling ratio, random-ray undersampling preserved the mathematical properties of the full projection operator better than the other two approaches. This translates into advantages of reconstructing CT images at lower errors. Different types of image artifacts were observed depending on the undersampling strategy, which were ascribed to the unique singular vectors of the sampling operators in the image domain. We tested the proposed reconstruction algorithm on a FORBILD phantom with only 30% of the projection data randomly acquired. Reconstructed image error was reduced from 9.4% with a TV method to 7.6% with the proposed method. Conclusion: The proposed random-ray undersampling is mathematically advantageous over other typical undersampling approaches and may permit better image reconstruction at the same undersampling ratio. The novel algorithm suited to random-ray undersampling was able to reconstruct high-quality images.
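
    The difference between regular-view and random-ray undersampling can be sketched as a sampling mask over the sinogram, as below. The sinogram dimensions and the 30% ratio are illustrative assumptions, and the iterative reconstruction/restoration solver from the abstract is not implemented here.

      import numpy as np

      rng = np.random.default_rng(0)
      n_views, n_det = 360, 512          # full sinogram: views x detector bins
      ratio = 0.30                       # target fraction of rays actually measured

      # Regular-view undersampling: keep every k-th projection view entirely
      # (approximately matching the target ratio).
      k = int(round(1 / ratio))
      view_mask = np.zeros((n_views, n_det), dtype=bool)
      view_mask[::k, :] = True

      # Random-ray undersampling: keep each (view, detector) ray independently.
      ray_mask = rng.random((n_views, n_det)) < ratio

      print("measured rays  regular-view:", view_mask.sum(), " random-ray:", ray_mask.sum())
      # A reconstruction would then alternate between updating the image (e.g. with a
      # TV-regularized solver) and restoring the missing sinogram entries, as the
      # abstract describes; that solver is not sketched here.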

  14. 21 CFR 111.80 - What representative samples must you collect?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Process Control System § 111.80 What representative samples must you collect? The representative samples... unique lot within each unique shipment); (b) Representative samples of in-process materials for each manufactured batch at points, steps, or stages, in the manufacturing process as specified in the master...

  15. Measles deaths in Nepal: estimating the national case-fatality ratio.

    PubMed

    Joshi, Anand B; Luman, Elizabeth T; Nandy, Robin; Subedi, Bal K; Liyanage, Jayantha B L; Wierzba, Thomas F

    2009-06-01

    To estimate the case-fatality ratio (CFR) for measles in Nepal, determine the role of risk factors, such as political instability, for measles mortality, and compare the use of a nationally representative sample of outbreaks versus routine surveillance or a localized study to establish the national CFR (nCFR). This was a retrospective study of measles cases and deaths in Nepal. Through two-stage random sampling, we selected 37 districts with selection probability proportional to the number of districts in each region, and then randomly selected within each district one outbreak among all those that had occurred between 1 March and 1 September 2004. Cases were identified by interviewing a member of each and every household and tracing contacts. Bivariate analyses were performed to assess the risk factors for a high CFR and determine the time from rash onset until death. Each factor's contribution to the CFR was determined through multivariate logistic regression. From the number of measles cases and deaths found in the study we calculated the total number of measles cases and deaths for all of Nepal during the study period and in 2004. We identified 4657 measles cases and 64 deaths in the study period and area. This yielded a total of about 82 000 cases and 900 deaths for all outbreaks in 2004 and a national CFR of 1.1% (95% confidence interval, CI: 0.5-2.3). CFR ranged from 0.1% in the eastern region to 3.4% in the mid-western region and was highest in politically insecure areas, in the Ganges plains and among cases < 5 years of age. Vitamin A treatment and measles immunization were protective. Most deaths occurred during the first week of illness. To our knowledge, this is the first CFR study based on a nationally representative sample of measles outbreaks. Routine surveillance and studies of a single outbreak may not yield an accurate nCFR. Increased fatalities associated with political insecurity are a challenge for health-care service delivery. The short period from disease onset to death and reduced mortality from treatment with vitamin A suggest the need for rapid, field-based treatment early in the outbreak.
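
    The two-stage design can be sketched as follows: districts are drawn within each region in proportion to the region's share of districts, and one outbreak is then drawn at random within each selected district. The frame of regions, districts, and outbreaks below is entirely hypothetical and serves only to illustrate the selection logic.

      import random

      random.seed(1)

      # Hypothetical frame: regions with their districts and recorded outbreaks.
      regions = {
          "eastern":     {f"E{i}": [f"E{i}-ob{j}" for j in range(3)] for i in range(16)},
          "central":     {f"C{i}": [f"C{i}-ob{j}" for j in range(2)] for i in range(19)},
          "western":     {f"W{i}": [f"W{i}-ob{j}" for j in range(2)] for i in range(15)},
          "mid-western": {f"M{i}": [f"M{i}-ob{j}" for j in range(2)] for i in range(15)},
          "far-western": {f"F{i}": [f"F{i}-ob{j}" for j in range(2)] for i in range(10)},
      }

      n_districts = 37
      total = sum(len(d) for d in regions.values())

      selected = []
      for region, districts in regions.items():
          # Stage 1: districts drawn per region in proportion to the region's share
          # of all districts (rounding may make the total differ slightly from 37).
          n_region = round(n_districts * len(districts) / total)
          for dist in random.sample(list(districts), n_region):
              # Stage 2: one outbreak selected at random within each chosen district.
              selected.append((region, dist, random.choice(districts[dist])))

      print(len(selected), "district-outbreak pairs, e.g.", selected[:2])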

  16. Tigers on trails: occupancy modeling for cluster sampling.

    PubMed

    Hines, J E; Nichols, J D; Royle, J A; MacKenzie, D I; Gopalaswamy, A M; Kumar, N Samba; Karanth, K U

    2010-07-01

    Occupancy modeling focuses on inference about the distribution of organisms over space, using temporal or spatial replication to allow inference about the detection process. Inference based on spatial replication strictly requires that replicates be selected randomly and with replacement, but the importance of these design requirements is not well understood. This paper focuses on an increasingly popular sampling design based on spatial replicates that are not selected randomly and that are expected to exhibit Markovian dependence. We develop two new occupancy models for data collected under this sort of design, one based on an underlying Markov model for spatial dependence and the other based on a trap response model with Markovian detections. We then simulated data under the model for Markovian spatial dependence and fit the data to standard occupancy models and to the two new models. Bias of occupancy estimates was substantial for the standard models, smaller for the new trap response model, and negligible for the new spatial process model. We also fit these models to data from a large-scale tiger occupancy survey recently conducted in Karnataka State, southwestern India. In addition to providing evidence of a positive relationship between tiger occupancy and habitat, model selection statistics and estimates strongly supported the use of the model with Markovian spatial dependence. This new model provides another tool for the decomposition of the detection process, which is sometimes needed for proper estimation and which may also permit interesting biological inferences. In addition to designs employing spatial replication, we note the likely existence of temporal Markovian dependence in many designs using temporal replication. The models developed here will be useful either directly, or with minor extensions, for these designs as well. We believe that these new models represent important additions to the suite of modeling tools now available for occupancy estimation in conservation monitoring. More generally, this work represents a contribution to the topic of cluster sampling for situations in which there is a need for specific modeling (e.g., reflecting dependence) for the distribution of the variable(s) of interest among subunits.
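
    A small simulation helps make the design issue concrete: sites (e.g., trails) are occupied with probability psi, local presence along consecutive spatial replicates follows a two-state Markov chain, and detections occur only where the animal is locally present. The parameter values below are illustrative, and the sketch only generates data; it does not fit the occupancy models developed in the paper.

      import numpy as np

      rng = np.random.default_rng(7)

      n_sites, n_reps = 200, 5              # trail segments serve as spatial replicates
      psi = 0.6                             # probability a site (trail) is occupied
      p_det = 0.5                           # detection probability given local presence
      theta_stay, theta_arrive = 0.8, 0.3   # Markov transitions for local presence

      z = rng.random(n_sites) < psi                 # true occupancy per site
      detections = np.zeros((n_sites, n_reps), dtype=int)

      for i in range(n_sites):
          if not z[i]:
              continue                              # unoccupied sites yield no detections
          present = rng.random() < theta_arrive     # local presence on the first segment
          for j in range(n_reps):
              if present:
                  detections[i, j] = rng.random() < p_det
              # local presence on the next segment depends on the current one (Markov)
              present = rng.random() < (theta_stay if present else theta_arrive)

      # A naive estimate that treats replicates as independent is biased; the models
      # in the paper explicitly account for this spatial dependence.
      naive_occ = (detections.sum(axis=1) > 0).mean()
      print(f"true psi = {psi}, naive detection-based occupancy = {naive_occ:.2f}")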

  17. Patterns of Childhood Abuse and Neglect in a Representative German Population Sample

    PubMed Central

    Schilling, Christoph; Weidner, Kerstin; Brähler, Elmar; Glaesmer, Heide; Häuser, Winfried; Pöhlmann, Karin

    2016-01-01

    Background Different types of childhood maltreatment, like emotional abuse, emotional neglect, physical abuse, physical neglect and sexual abuse are interrelated because of their co-occurrence. Different patterns of childhood abuse and neglect are associated with the degree of severity of mental disorders in adulthood. The purpose of this study was (a) to identify different patterns of childhood maltreatment in a representative German community sample, (b) to replicate the patterns of childhood neglect and abuse recently found in a clinical German sample, (c) to examine whether participants reporting exposure to specific patterns of child maltreatment would report different levels of psychological distress, and (d) to compare the results of the typological approach and the results of a cumulative risk model based on our data set. Methods In a cross-sectional survey conducted in 2010, a representative random sample of 2504 German participants aged between 14 and 92 years completed the Childhood Trauma Questionnaire (CTQ). General anxiety and depression were assessed by standardized questionnaires (GAD-2, PHQ-2). Cluster analysis was conducted with the CTQ-subscales to identify different patterns of childhood maltreatment. Results Three different patterns of childhood abuse and neglect could be identified by cluster analysis. Cluster one showed low values on all CTQ-scales. Cluster two showed high values in emotional and physical neglect. Only cluster three showed high values in physical and sexual abuse. The three patterns of childhood maltreatment showed different degrees of depression (PHQ-2) and anxiety (GAD-2). Cluster one showed lowest levels of psychological distress, cluster three showed highest levels of mental distress. Conclusion The results show that different types of childhood maltreatment are interrelated and can be grouped into specific patterns of childhood abuse and neglect, which are associated with differing severity of psychological distress in adulthood. The results correspond to those recently found in a German clinical sample and support a typological approach in the research of maltreatment. While cumulative risk models focus on the number of maltreatment types, the typological approach takes the number as well as the severity of the maltreatment types into account. Thus, specific patterns of maltreatment can be examined with regard to specific long-term psychological consequences. PMID:27442446
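
    The pattern-identification step can be sketched with a generic cluster analysis of CTQ subscale scores, as below. The synthetic scores and the use of k-means are stand-ins for illustration; the study's actual clustering procedure and sample are not reproduced here.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(3)

      # Synthetic stand-in for CTQ subscale scores (emotional abuse, emotional neglect,
      # physical abuse, physical neglect, sexual abuse), one row per participant.
      ctq = np.vstack([
          rng.normal([6, 7, 5, 6, 5],     1.0, size=(1800, 5)),   # low on all scales
          rng.normal([12, 14, 7, 13, 6],  2.0, size=(500, 5)),    # high neglect
          rng.normal([10, 10, 15, 9, 14], 2.0, size=(200, 5)),    # high abuse
      ])

      X = StandardScaler().fit_transform(ctq)
      labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

      # Inspect cluster profiles (mean subscale scores) to label the patterns.
      for c in range(3):
          print(f"cluster {c}: n={np.sum(labels == c):4d}, "
                f"means={ctq[labels == c].mean(axis=0).round(1)}")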

  18. Design and methods of the Midwest Stream Quality Assessment (MSQA), 2013

    USGS Publications Warehouse

    Garrett, Jessica D.; Frey, Jeffrey W.; Van Metre, Peter C.; Journey, Celeste A.; Nakagaki, Naomi; Button, Daniel T.; Nowell, Lisa H.

    2017-10-18

    During 2013, the U.S. Geological Survey (USGS) National Water-Quality Assessment Project (NAWQA), in collaboration with the USGS Columbia Environmental Research Center, the U.S. Environmental Protection Agency (EPA) National Rivers and Streams Assessment (NRSA), and the EPA Office of Pesticide Programs, assessed stream quality across the Midwestern United States. This Midwest Stream Quality Assessment (MSQA) simultaneously characterized watershed and stream-reach water-quality stressors along with instream biological conditions to better understand regional stressor-effects relations. The MSQA design focused on effects from the widespread agriculture in the region and urban development because of their importance as ecological stressors of particular concern to Midwest region resource managers. A combined random stratified selection and a targeted selection based on land-use data were used to identify and select sites representing gradients in agricultural intensity across the region. During a 14-week period from May through August 2013, 100 sites were selected and sampled 12 times for contaminants, nutrients, and sediment. This 14-week water-quality “index” period culminated with an ecological survey of habitat, periphyton, benthic macroinvertebrates, and fish at all sites. Sediment was collected during the ecological survey for analysis of sediment chemistry and toxicity testing. Of the 100 sites, 50 were selected for the MSQA random stratified group from 154 NRSA sites planned for the region, and the other 50 MSQA sites were selected as targeted sites to more evenly cover agricultural and urban stressor gradients in the study area. Of the 50 targeted sites, 12 were in urbanized watersheds and 21 represented “good” biological conditions or “least disturbed” conditions. The remaining 17 targeted sites were selected to improve coverage of the agricultural intensity gradient or because of historical data collection to provide temporal context for the study. This report provides a detailed description of the MSQA study components, including surveys of ecological conditions, routine water sampling, deployment of passive polar organic compound integrative samplers, and stream sediment sampling at all sites. Component studies that were completed to provide finer scale temporal data or more extensive analysis at selected sites included continuous water-quality monitoring, daily pesticide sampling, laboratory and in-stream water toxicity testing efforts, and deployment of passive suspended-sediment samplers.
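
    A simplified sketch of the site-selection logic is given below: a stratified random draw per land-use stratum is complemented by targeted picks that fill gaps along an agricultural-intensity gradient. The site frame, strata, and gap-filling rule are illustrative assumptions, not the actual MSQA/NRSA selection procedure.

      import random

      random.seed(2013)

      # Hypothetical site frame: candidate stream sites tagged with a land-use
      # stratum and an agricultural-intensity score in [0, 1].
      sites = [{"id": f"site{i:03d}",
                "stratum": random.choice(["agricultural", "urban", "least disturbed"]),
                "ag_intensity": random.random()}
               for i in range(400)]

      # Stratified random component: a fixed number of sites per stratum.
      per_stratum = {"agricultural": 30, "urban": 10, "least disturbed": 10}
      random_sites = []
      for stratum, n in per_stratum.items():
          pool = [s for s in sites if s["stratum"] == stratum]
          random_sites += random.sample(pool, n)

      # Targeted component: pick the remaining sites whose intensity values are
      # farthest from any already-selected site, to fill gaps along the gradient.
      chosen_int = sorted(s["ag_intensity"] for s in random_sites)
      remaining = [s for s in sites if s not in random_sites]
      remaining.sort(key=lambda s: -min(abs(s["ag_intensity"] - v) for v in chosen_int))
      targeted_sites = remaining[:50]

      print(len(random_sites), "stratified-random sites,", len(targeted_sites), "targeted sites")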

  19. After-School Multifamily Groups: A Randomized Controlled Trial Involving Low-Income, Urban, Latino Children

    ERIC Educational Resources Information Center

    McDonald, Lynn; Moberg, D. Paul; Brown, Roger; Rodriguez-Espiricueta, Ismael; Flores, Nydia I.; Burke, Melissa P.; Coover, Gail

    2006-01-01

    This randomized controlled trial evaluated a culturally representative parent engagement strategy with Latino parents of elementary school children. Ten urban schools serving low-income children from mixed cultural backgrounds participated in a large study. Classrooms were randomly assigned either to an after-school, multifamily support…

  20. A Systematic Review of Surgical Randomized Controlled Trials: Part 2. Funding Source, Conflict of Interest, and Sample Size in Plastic Surgery.

    PubMed

    Voineskos, Sophocles H; Coroneos, Christopher J; Ziolkowski, Natalia I; Kaur, Manraj N; Banfield, Laura; Meade, Maureen O; Chung, Kevin C; Thoma, Achilleas; Bhandari, Mohit

    2016-02-01

    The authors examined industry support, conflict of interest, and sample size in plastic surgery randomized controlled trials that compared surgical interventions. They hypothesized that industry-funded trials demonstrate statistically significant outcomes more often, and randomized controlled trials with small sample sizes report statistically significant results more frequently. An electronic search identified randomized controlled trials published between 2000 and 2013. Independent reviewers assessed manuscripts and performed data extraction. Funding source, conflict of interest, primary outcome direction, and sample size were examined. Chi-squared and independent-samples t tests were used in the analysis. The search identified 173 randomized controlled trials, of which 100 (58 percent) did not acknowledge funding status. A relationship between funding source and trial outcome direction was not observed. Both funding status and conflict of interest reporting improved over time. Only 24 percent (six of 25) of industry-funded randomized controlled trials reported authors to have independent control of data and manuscript contents. The mean number of patients randomized was 73 per trial (median, 43, minimum, 3, maximum, 936). Small trials were not found to be positive more often than large trials (p = 0.87). Randomized controlled trials with small sample size were common; however, this provides great opportunity for the field to engage in further collaboration and produce larger, more definitive trials. Reporting of trial funding and conflict of interest is historically poor, but it greatly improved over the study period. Underreporting at author and journal levels remains a limitation when assessing the relationship between funding source and trial outcomes. Improved reporting and manuscript control should be goals that both authors and journals can actively achieve.

  1. Random phase detection in multidimensional NMR.

    PubMed

    Maciejewski, Mark W; Fenwick, Matthew; Schuyler, Adam D; Stern, Alan S; Gorbatyuk, Vitaliy; Hoch, Jeffrey C

    2011-10-04

    Despite advances in resolution accompanying the development of high-field superconducting magnets, biomolecular applications of NMR require multiple dimensions in order to resolve individual resonances, and the achievable resolution is typically limited by practical constraints on measuring time. In addition to the need for measuring long evolution times to obtain high resolution, the need to distinguish the sign of the frequency constrains the ability to shorten measuring times. Sign discrimination is typically accomplished by sampling the signal with two different receiver phases or by selecting a reference frequency outside the range of frequencies spanned by the signal and then sampling at a higher rate. In the parametrically sampled (indirect) time dimensions of multidimensional NMR experiments, either method imposes an additional factor of 2 sampling burden for each dimension. We demonstrate that by using a single detector phase at each time sample point, but randomly altering the phase for different points, the sign ambiguity that attends fixed single-phase detection is resolved. Random phase detection enables a reduction in experiment time by a factor of 2 for each indirect dimension, amounting to a factor of 8 for a four-dimensional experiment, albeit at the cost of introducing sampling artifacts. Alternatively, for fixed measuring time, random phase detection can be used to double resolution in each indirect dimension. Random phase detection is complementary to nonuniform sampling methods, and their combination offers the potential for additional benefits. In addition to applications in biomolecular NMR, random phase detection could be useful in magnetic resonance imaging and other signal processing contexts.

  2. Detecting Beer Intake by Unique Metabolite Patterns.

    PubMed

    Gürdeniz, Gözde; Jensen, Morten Georg; Meier, Sebastian; Bech, Lene; Lund, Erik; Dragsted, Lars Ove

    2016-12-02

    Evaluation of the health related effects of beer intake is hampered by the lack of accurate tools for assessing intakes (biomarkers). Therefore, we identified plasma and urine metabolites associated with recent beer intake by untargeted metabolomics and established a characteristic metabolite pattern representing raw materials and beer production as a qualitative biomarker of beer intake. In a randomized, crossover, single-blinded meal study (MSt1), 18 participants were given, one at a time, four different test beverages: strong, regular, and nonalcoholic beers and a soft drink. Four participants were assigned to have two additional beers (MSt2). In addition to plasma and urine samples, test beverages, wort, and hops extract were analyzed by UPLC-QTOF. A unique metabolite pattern reflecting beer metabolome, including metabolites derived from beer raw material (i.e., N-methyl tyramine sulfate and the sum of iso-α-acids and tricyclohumols) and the production process (i.e., pyro-glutamyl proline and 2-ethyl malate), was selected to establish a compliance biomarker model for detection of beer intake based on MSt1. The model predicted the MSt2 samples collected before and up to 12 h after beer intake correctly (AUC = 1). A biomarker model including four metabolites representing both beer raw materials and production steps provided a specific and accurate tool for measurement of beer consumption.

  3. Elevated incidence rates of diabetes in Peru: report from PERUDIAB, a national urban population-based longitudinal study

    PubMed Central

    Seclen, Segundo Nicolas; Rosas, Moises Ernesto; Arias, Arturo Jaime; Medina, Cecilia Alexandra

    2017-01-01

    Objective A recent report from a non-nationally representative, geographically diverse sample in four separate communities in Peru suggests an unusually high diabetes incidence. We aimed to estimate the national diabetes incidence rate using PERUDIAB, a probabilistic, national urban population-based longitudinal study. Research design and methods 662 subjects without diabetes, selected by multistage, cluster, random sampling of households, representing the 24 administrative and the 3 (coast, highlands and jungle) natural regions across the country, from both sexes, aged 25+ years at baseline, enrolled in 2010–2012, were followed for 3.8 years. New diabetes cases were defined as fasting blood glucose ≥126 mg/dL or on medical diabetes treatment. Results There were 49 cases of diabetes in 2408 person-years follow-up. The weighted cumulative incidence of diabetes was 7.2% while the weighted incidence rate was estimated at 19.5 (95% CI 13.9 to 28.3) new cases per 1000 person-years. Older age, obesity and technical or higher education were statistically associated with the incidence of diabetes. Conclusion Our results confirm that the incidence of diabetes in Peru is among the highest reported globally. The fast economic growth in the last 20 years, high overweight and obesity rates may have triggered this phenomenon. PMID:28878935

  4. Elevated incidence rates of diabetes in Peru: report from PERUDIAB, a national urban population-based longitudinal study.

    PubMed

    Seclen, Segundo Nicolas; Rosas, Moises Ernesto; Arias, Arturo Jaime; Medina, Cecilia Alexandra

    2017-01-01

    A recent report from a non-nationally representative, geographically diverse sample in four separate communities in Peru suggests an unusually high diabetes incidence. We aimed to estimate the national diabetes incidence rate using PERUDIAB, a probabilistic, national urban population-based longitudinal study. 662 subjects without diabetes, selected by multistage, cluster, random sampling of households, representing the 24 administrative and the 3 (coast, highlands and jungle) natural regions across the country, from both sexes, aged 25+ years at baseline, enrolled in 2010-2012, were followed for 3.8 years. New diabetes cases were defined as fasting blood glucose ≥126 mg/dL or on medical diabetes treatment. There were 49 cases of diabetes in 2408 person-years follow-up. The weighted cumulative incidence of diabetes was 7.2% while the weighted incidence rate was estimated at 19.5 (95% CI 13.9 to 28.3) new cases per 1000 person-years. Older age, obesity and technical or higher education were statistically associated with the incidence of diabetes. Our results confirm that the incidence of diabetes in Peru is among the highest reported globally. The fast economic growth in the last 20 years, high overweight and obesity rates may have triggered this phenomenon.

  5. Injury-related mortality in South Africa: a retrospective descriptive study of postmortem investigations

    PubMed Central

    Prinsloo, Megan; Pillay-van Wyk, Victoria; Gwebushe, Nomonde; Mathews, Shanaaz; Martin, Lorna J; Laubscher, Ria; Abrahams, Naeemah; Msemburi, William; Lombard, Carl; Bradshaw, Debbie

    2015-01-01

    Objective To investigate injury-related mortality in South Africa using a nationally representative sample and compare the results with previous estimates. Methods We conducted a retrospective descriptive study of medico-legal postmortem investigation data from mortuaries using a multistage random sample, stratified by urban and non-urban areas and mortuary size. We calculated age-specific and age-standardized mortality rates for external causes of death. Findings Postmortem reports revealed 52 493 injury-related deaths in 2009 (95% confidence interval, CI: 46 930–58 057). Almost half (25 499) were intentionally inflicted. Age-standardized mortality rates per 100 000 population were as follows: all injuries: 109.0 (95% CI: 97.1–121.0); homicide: 38.4 (95% CI: 33.8–43.0); suicide: 13.4 (95% CI: 11.6–15.2); and road-traffic injury: 36.1 (95% CI: 30.9–41.3). Using postmortem reports, we found more than three times as many deaths from homicide and road-traffic injury as had been recorded by vital registration for this period. The homicide rate was similar to the estimate for South Africa from a global analysis, but road-traffic and suicide rates were almost fourfold higher. Conclusion This is the first nationally representative sample of injury-related mortality in South Africa. It provides more accurate estimates and cause-specific profiles that are not available from other sources. PMID:26229201

  6. Family meals and body weight in US adults.

    PubMed

    Sobal, Jeffery; Hanson, Karla

    2011-09-01

    Family meals are an important ritual in contemporary societies and many studies have reported associations of family meals with several biopsychosocial outcomes among children and adolescents. However, few representative analyses of family meals have been conducted in samples of adults, and adults may differ from young people in predictors and outcomes of family meal consumption. We examined the prevalence and predictors of adult family meals and body weight outcomes. The cross-sectional 2009 Cornell National Social Survey (CNSS) included questions about the frequency of family meals, body weight as BMI and sociodemographic characteristics. The CNSS telephone survey used random digit dialling to sample individuals. We analysed data from 882 adults living with family members in a nationally representative US sample. Prevalence of family meals among these adults revealed that 53 % reported eating family meals seven or more times per week. Predictive results revealed that adults who more frequently ate family meals were more likely to be married and less likely to be employed full-time, year-round. Outcome results revealed that the overall frequency of family meals among adults was not significantly associated with any measure of body weight. However, interaction term analysis suggested an inverse association between frequency of family meals and BMI for adults with children in the household, and no association among adults without children. These findings suggest that family meals among adults are commonplace, associated with marital and work roles, and marginally associated with body weight only in households with children.
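
    The interaction-term analysis mentioned above can be sketched as an ordinary least-squares model in which the family-meals effect on BMI is allowed to differ by whether children live in the household. The data below are synthetic and chosen only to show the model specification, not to reproduce the CNSS estimates.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(9)

      # Synthetic stand-in for the survey variables: weekly family-meal frequency,
      # an indicator for children in the household, and BMI (with a meals effect
      # present only in households with children, mirroring the reported pattern).
      n = 882
      meals = rng.integers(0, 15, n)
      kids = rng.binomial(1, 0.5, n)
      bmi = 27 - 0.15 * meals * kids + rng.normal(0, 4, n)

      df = pd.DataFrame({"bmi": bmi, "meals": meals, "kids": kids})

      # Interaction model: `meals * kids` expands to main effects plus interaction.
      model = smf.ols("bmi ~ meals * kids", data=df).fit()
      print(model.params[["meals", "meals:kids"]])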

  7. High-speed imaging using CMOS image sensor with quasi pixel-wise exposure

    NASA Astrophysics Data System (ADS)

    Sonoda, T.; Nagahara, H.; Endo, K.; Sugiyama, Y.; Taniguchi, R.

    2017-02-01

    Several recent studies in compressive video sensing have realized scene capture beyond the fundamental trade-off limit between spatial resolution and temporal resolution using random space-time sampling. However, most of these studies showed results for higher frame rate video that were produced by simulation experiments or using an optically simulated random sampling camera, because there are currently no commercially available image sensors with random exposure or sampling capabilities. We fabricated a prototype complementary metal oxide semiconductor (CMOS) image sensor with quasi pixel-wise exposure timing that can realize nonuniform space-time sampling. The prototype sensor can reset exposures independently by columns and fix the amount of exposure by rows for each 8x8 pixel block. This CMOS sensor is not fully controllable at the pixel level and has line-dependent controls, but it offers flexibility when compared with regular CMOS or charge-coupled device sensors with global or rolling shutters. We propose a method to realize pseudo-random sampling for high-speed video acquisition that uses the flexibility of the CMOS sensor. We reconstruct the high-speed video sequence from the images produced by pseudo-random sampling using an over-complete dictionary.
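
    A sketch of a pseudo-random space-time exposure pattern under the stated hardware constraint (exposure start set per column and exposure length per row within each 8x8 block) is given below. The sensor dimensions, the number of time slots, and the exact constraint encoding are assumptions for illustration; the dictionary-based reconstruction is not included.

      import numpy as np

      rng = np.random.default_rng(5)

      H, W, B = 64, 64, 8       # sensor size and block size
      n_slots = 8               # time slots within one frame

      # Within each 8x8 block, the exposure start is chosen per column and the
      # exposure length per row, so a pixel's exposure window is determined by its
      # row and column inside the block (assumed encoding of the constraint).
      start_per_col = rng.integers(0, n_slots, size=(H // B, W // B, B))
      length_per_row = rng.integers(1, n_slots + 1, size=(H // B, W // B, B))

      mask = np.zeros((H, W, n_slots), dtype=bool)
      for by in range(H // B):
          for bx in range(W // B):
              for r in range(B):
                  for c in range(B):
                      s = start_per_col[by, bx, c]
                      e = min(n_slots, s + length_per_row[by, bx, r])
                      mask[by * B + r, bx * B + c, s:e] = True

      # mask[y, x, t] indicates whether pixel (y, x) integrates light during slot t;
      # a compressive-video reconstruction would use this space-time sampling pattern.
      print("mean exposure duty cycle:", mask.mean().round(3))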

  8. Efficacy of a Mandibular Advancement Appliance on Sleep Disordered Breathing in Children: A Study Protocol of a Crossover Randomized Controlled Trial

    PubMed Central

    Idris, Ghassan; Galland, Barbara; Robertson, Christopher J.; Farella, Mauro

    2016-01-01

    Background: Sleep-Disordered Breathing (SDB) varies from habitual snoring to partial or complete obstruction of the upper airway and can be found in up to 10% of children. SDB can significantly affect children's wellbeing, as it can cause growth disorders, educational and behavioral problems, and even life-threatening conditions, such as cardiorespiratory failure. Adenotonsillectomy represents the primary treatment for pediatric SDB where adeno-tonsillar hypertrophy is indicated. For those with craniofacial anomalies, or for whom adenotonsillectomy or other treatment modalities have failed, or surgery is contra-indicated, mandibular advancement splints (MAS) may represent a viable treatment option. Whilst the efficacy of these appliances has been consistently demonstrated in adults, there is little information about their effectiveness in children. Aims: To determine the efficacy of mandibular advancement appliances for the management of SDB and related health problems in children. Methods/design: The study will be designed as a single-blind crossover randomized controlled trial with administration of both an “Active MAS” (Twin-block) and a “Sham MAS.” Eligible participants will be children aged 8–12 years whose parents report they snore ≥3 nights per week. Sixteen children will enter the full study after confirming other inclusion criteria, particularly Skeletal class I or class II confirmed by lateral cephalometric radiograph. Each child will be randomly assigned to either a treatment sequence starting with the Active or the Sham MAS. Participants will wear the appliances for 3 weeks separated by a 2-week washout period. For each participant, home-based polysomnographic data will be collected four times; once before and once after each treatment period. The Apnea Hypopnea Index (AHI) will represent the main outcome variable. Secondary outcomes will include, snoring frequency, masseter muscle activity, sleep symptoms, quality of life, daytime sleepiness, children behavior, and nocturnal enuresis. In addition, blood samples will be collected to assess growth hormone changes. Trial registration: This study was registered in the Australian New Zealand Clinical Trials Registry (ANZCTR): [ACTRN12614001013651]. PMID:27594841

  9. Mass media influence spreading in social networks with community structure

    NASA Astrophysics Data System (ADS)

    Candia, Julián; Mazzitello, Karina I.

    2008-07-01

    We study an extension of Axelrod's model for social influence, in which cultural drift is represented as random perturbations, while mass media are introduced by means of an external field. In this scenario, we investigate how the modular structure of social networks affects the propagation of mass media messages across a society. The community structure of social networks is represented by coupled random networks, in which two random graphs are connected by intercommunity links. Considering inhomogeneous mass media fields, we study the conditions for successful message spreading and find a novel phase diagram in the multidimensional parameter space. These findings show that social modularity effects are of paramount importance for designing successful, cost-effective advertising campaigns.
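
    The coupled-random-network substrate can be sketched with networkx: two Erdos-Renyi graphs, one per community, joined by a fixed number of random intercommunity links. The sizes and densities are illustrative assumptions, and the Axelrod cultural dynamics and mass-media field are not implemented here.

      import random
      import networkx as nx

      random.seed(0)

      n, p = 500, 0.02          # nodes and edge probability per community
      n_inter = 100             # number of intercommunity links to add

      # Two Erdos-Renyi random graphs, one per community.
      g1 = nx.fast_gnp_random_graph(n, p, seed=1)
      g2 = nx.fast_gnp_random_graph(n, p, seed=2)

      # Relabel the second community, merge, then add random intercommunity links.
      g2 = nx.relabel_nodes(g2, {v: v + n for v in g2.nodes})
      g = nx.compose(g1, g2)
      for _ in range(n_inter):
          g.add_edge(random.randrange(n), n + random.randrange(n))

      inter = sum(1 for u, v in g.edges if (u < n) != (v < n))
      print(g.number_of_nodes(), "nodes,", g.number_of_edges(), "edges,",
            "intercommunity fraction:", round(inter / g.number_of_edges(), 3))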

  10. Pharmacy journal abstracts published in PubMed that abide by the CONsolidated Standards Of Reporting Trials (CONSORT) guidelines

    PubMed Central

    Blair, Daniel A.; Woolley, Thomas W.

    2014-01-01

    The purpose of this research was to determine the proportion of abstracts in pharmacy journals that are prepared according to the CONsolidated Standards Of Reporting Trials (CONSORT) criteria for abstracts. Certain abstracts for randomized controlled clinical trials (RCTs) indexed in PubMed were eligible for inclusion, with the primary endpoint being median overall compliance to CONSORT recommendations for abstracts. A total of 63 RCT abstracts were included in the analysis, with only 56% of the recommended CONSORT items represented in the sample. It is recommended that pharmacy journals encourage authors to follow CONSORT recommendations for abstracts when submitting RCTs for publication. PMID:24860268

  11. Behavior intentions of the public after bans on smoking in restaurants and bars.

    PubMed Central

    Biener, L; Siegel, M

    1997-01-01

    OBJECTIVES: This study assessed the potential effect of smoke-free policies on bar and restaurant patronage. METHODS: Random-digit dialing techniques were used in surveying a representative sample of Massachusetts adults (n = 2356) by telephone. RESULTS: Approximately 61% of the respondents predicted no change in their use of restaurants in response to smoke-free policies, 30% predicted increased use, and 8% predicted decreased use. In turn, 69% of the respondents predicted no change in their patronage of bars, while 20% predicted increased use and 11% predicted decreased use. CONCLUSIONS: These results suggest that smoke-free policies are likely to increase overall patronage of bars and restaurants. PMID:9431301

  12. Mapping Soil Properties of Africa at 250 m Resolution: Random Forests Significantly Improve Current Predictions

    PubMed Central

    Hengl, Tomislav; Heuvelink, Gerard B. M.; Kempen, Bas; Leenaars, Johan G. B.; Walsh, Markus G.; Shepherd, Keith D.; Sila, Andrew; MacMillan, Robert A.; Mendes de Jesus, Jorge; Tamene, Lulseged; Tondoh, Jérôme E.

    2015-01-01

    80% of arable land in Africa has low soil fertility and suffers from physical soil problems. Additionally, significant amounts of nutrients are lost every year due to unsustainable soil management practices. This is partially the result of insufficient use of soil management knowledge. To help bridge the soil information gap in Africa, the Africa Soil Information Service (AfSIS) project was established in 2008. Over the period 2008–2014, the AfSIS project compiled two point data sets: the Africa Soil Profiles (legacy) database and the AfSIS Sentinel Site database. These data sets contain over 28 thousand sampling locations and represent the most comprehensive soil sample data sets of the African continent to date. Utilizing these point data sets in combination with a large number of covariates, we have generated a series of spatial predictions of soil properties relevant to the agricultural management—organic carbon, pH, sand, silt and clay fractions, bulk density, cation-exchange capacity, total nitrogen, exchangeable acidity, Al content and exchangeable bases (Ca, K, Mg, Na). We specifically investigate differences between two predictive approaches: random forests and linear regression. Results of 5-fold cross-validation demonstrate that the random forests algorithm consistently outperforms the linear regression algorithm, with average decreases of 15–75% in Root Mean Squared Error (RMSE) across soil properties and depths. Fitting and running random forests models takes an order of magnitude more time and the modelling success is sensitive to artifacts in the input data, but as long as quality-controlled point data are provided, an increase in soil mapping accuracy can be expected. Results also indicate that globally predicted soil classes (USDA Soil Taxonomy, especially Alfisols and Mollisols) help improve continental scale soil property mapping, and are among the most important predictors. This indicates a promising potential for transferring pedological knowledge from data rich countries to countries with limited soil data. PMID:26110833

  13. Randomized Cross-Sectional Study to Compare HIV-1 Specific Antibody and Cytokine Concentrations in Female Genital Secretions Obtained by Menstrual Cup and Cervicovaginal Lavage.

    PubMed

    Archary, Derseree; Liebenberg, Lenine J; Werner, Lise; Tulsi, Sahil; Majola, Nelisile; Naicker, Nivashnee; Dlamini, Sarah; Hope, Thomas J; Samsunder, Natasha; Abdool Karim, Salim S; Morris, Lynn; Passmore, Jo-Ann S; Garrett, Nigel J

    2015-01-01

    Optimizing methods for genital specimen collection to accurately characterize mucosal immune responses is a priority for the HIV prevention field. The menstrual cup (MC) has been proposed as an alternative to other methods including cervicovaginal lavage (CVL), but no study has yet formally compared these two methods. Forty HIV-infected, antiretroviral therapy-naïve women from the CAPRISA 002 acute HIV infection cohort study were randomized to have genital fluid collected using the MC with subsequent CVL, or by CVL alone. Qualitative data, which assessed levels of comfort and acceptability of MC using a 5-point Likert scale, was collected. Luminex multiplex assays were used to measure HIV-specific IgG against multiple gene products and 48 cytokines. The majority (94%) of participants indicated that insertion, wearing and removal of the MC was comfortable. Nineteen MCs with 18 matching, subsequent CVLs and 20 randomized CVLs were available for analysis. Mucosal IgG responses against four HIV-antigens were detected in 99% of MCs compared to only 80% of randomized CVLs (p = 0.029). Higher specific antibody activity and total antibodies were observed in MCs compared to CVL (all p<0.001). In MCs, 42/48 (88%) cytokines were in the detectable range in all participants compared to 27/48 (54%) in CVL (p<0.001). Concentrations of 22/41 cytokines (53.7%) were significantly higher in fluid collected by MC. Both total IgG (r = 0.63; p = 0.005) and cytokine concentrations (r = 0.90; p<0.001) correlated strongly between MC and corresponding post-MC CVL. MC sampling improves the detection of mucosal cytokines and antibodies, particularly those present at low concentrations. MC may therefore represent an ideal tool to assess immunological parameters in genital secretions, without interfering with concurrent collection of conventional CVL samples.

  14. Mapping Soil Properties of Africa at 250 m Resolution: Random Forests Significantly Improve Current Predictions.

    PubMed

    Hengl, Tomislav; Heuvelink, Gerard B M; Kempen, Bas; Leenaars, Johan G B; Walsh, Markus G; Shepherd, Keith D; Sila, Andrew; MacMillan, Robert A; Mendes de Jesus, Jorge; Tamene, Lulseged; Tondoh, Jérôme E

    2015-01-01

    80% of arable land in Africa has low soil fertility and suffers from physical soil problems. Additionally, significant amounts of nutrients are lost every year due to unsustainable soil management practices. This is partially the result of insufficient use of soil management knowledge. To help bridge the soil information gap in Africa, the Africa Soil Information Service (AfSIS) project was established in 2008. Over the period 2008-2014, the AfSIS project compiled two point data sets: the Africa Soil Profiles (legacy) database and the AfSIS Sentinel Site database. These data sets contain over 28 thousand sampling locations and represent the most comprehensive soil sample data sets of the African continent to date. Utilizing these point data sets in combination with a large number of covariates, we have generated a series of spatial predictions of soil properties relevant to the agricultural management--organic carbon, pH, sand, silt and clay fractions, bulk density, cation-exchange capacity, total nitrogen, exchangeable acidity, Al content and exchangeable bases (Ca, K, Mg, Na). We specifically investigate differences between two predictive approaches: random forests and linear regression. Results of 5-fold cross-validation demonstrate that the random forests algorithm consistently outperforms the linear regression algorithm, with average decreases of 15-75% in Root Mean Squared Error (RMSE) across soil properties and depths. Fitting and running random forests models takes an order of magnitude more time and the modelling success is sensitive to artifacts in the input data, but as long as quality-controlled point data are provided, an increase in soil mapping accuracy can be expected. Results also indicate that globally predicted soil classes (USDA Soil Taxonomy, especially Alfisols and Mollisols) help improve continental scale soil property mapping, and are among the most important predictors. This indicates a promising potential for transferring pedological knowledge from data rich countries to countries with limited soil data.
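
    The model comparison at the core of the study can be sketched with scikit-learn: the same 5-fold cross-validation RMSE is computed for a linear regression and a random forest on the same covariate matrix. The data below are synthetic stand-ins for the soil-property and covariate tables, so the size of the RMSE gap is illustrative only.

      import numpy as np
      from sklearn.datasets import make_regression
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.linear_model import LinearRegression
      from sklearn.model_selection import cross_val_score

      # Synthetic stand-in for "soil property ~ covariates"; a mild non-linearity
      # is added so the tree ensemble has something to exploit.
      X, y = make_regression(n_samples=2000, n_features=30, noise=25.0, random_state=0)
      y = y + 0.02 * X[:, 0] ** 2

      for name, model in [("linear regression", LinearRegression()),
                          ("random forest", RandomForestRegressor(n_estimators=200,
                                                                  random_state=0))]:
          scores = cross_val_score(model, X, y, cv=5,
                                   scoring="neg_root_mean_squared_error")
          print(f"{name:18s} 5-fold CV RMSE = {-scores.mean():.1f}")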

  15. Adaptive Peer Sampling with Newscast

    NASA Astrophysics Data System (ADS)

    Tölgyesi, Norbert; Jelasity, Márk

    The peer sampling service is a middleware service that provides random samples from a large decentralized network to support gossip-based applications such as multicast, data aggregation and overlay topology management. Lightweight gossip-based implementations of the peer sampling service have been shown to provide good quality random sampling while also being extremely robust to many failure scenarios, including node churn and catastrophic failure. We identify two problems with these approaches. The first problem is related to message drop failures: if a node experiences a higher-than-average message drop rate then the probability of sampling this node in the network will decrease. The second problem is that the application layer at different nodes might request random samples at very different rates, which can result in very poor random sampling especially at nodes with high request rates. We propose solutions for both problems. We focus on Newscast, a robust implementation of the peer sampling service. Our solution is based on simple extensions of the protocol and an adaptive self-control mechanism for its parameters: without involving failure detectors, nodes passively monitor local protocol events and use them as feedback in a local control loop that self-tunes the protocol parameters. The proposed solution is evaluated by simulation experiments.

  16. Matching a Distribution by Matching Quantiles Estimation

    PubMed Central

    Sgouropoulos, Nikolaos; Yao, Qiwei; Yastremiz, Claudia

    2015-01-01

    Motivated by the problem of selecting representative portfolios for backtesting counterparty credit risks, we propose a matching quantiles estimation (MQE) method for matching a target distribution by that of a linear combination of a set of random variables. An iterative procedure based on the ordinary least-squares estimation (OLS) is proposed to compute MQE. MQE can be easily modified by adding a LASSO penalty term if a sparse representation is desired, or by restricting the matching within certain range of quantiles to match a part of the target distribution. The convergence of the algorithm and the asymptotic properties of the estimation, both with or without LASSO, are established. A measure and an associated statistical test are proposed to assess the goodness-of-match. The finite sample properties are illustrated by simulation. An application in selecting a counterparty representative portfolio with a real dataset is reported. The proposed MQE also finds applications in portfolio tracking, which demonstrates the usefulness of combining MQE with LASSO. PMID:26692592
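
    A minimal sketch of the iterative OLS idea, as we read it from the abstract, is given below: the sorted target quantiles are repeatedly re-paired with the rows of X ordered by the rank of the current fit, and the coefficients are refit by ordinary least squares. The data, starting values, and stopping rule are illustrative, and the LASSO variant and goodness-of-match test from the paper are omitted.

      import numpy as np

      rng = np.random.default_rng(0)

      # Candidate series (columns of X) and a target series y whose distribution we
      # want to match with a linear combination of the columns.
      n, k = 2000, 5
      X = rng.normal(size=(n, k)) @ rng.normal(size=(k, k))
      y = np.sort(rng.standard_t(df=5, size=n) * 1.5)       # sorted target quantiles

      beta = np.linalg.lstsq(X, rng.permutation(y), rcond=None)[0]   # crude start
      for _ in range(50):
          order = np.argsort(X @ beta)
          # Re-pair the sorted target quantiles with the rows of X ordered by the
          # rank of the current fit, then refit by ordinary least squares.
          beta_new = np.linalg.lstsq(X[order], y, rcond=None)[0]
          if np.max(np.abs(beta_new - beta)) < 1e-8:
              beta = beta_new
              break
          beta = beta_new

      match_err = np.mean((np.sort(X @ beta) - y) ** 2)
      print("quantile matching MSE:", round(float(match_err), 4))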

  17. Mining the human gut microbiota for effector strains that shape the immune system

    PubMed Central

    Ahern, Philip P.; Faith, Jeremiah J.; Gordon, Jeffrey I.

    2014-01-01

    Summary The gut microbiota co-develops with the immune system beginning at birth. Mining the microbiota for bacterial strains responsible for shaping the structure and dynamic operations of the innate and adaptive arms of the immune system represents a formidable combinatorial problem but one that needs to be overcome to advance mechanistic understanding of microbial community-immune system co-regulation, and in order to develop new diagnostic and therapeutic approaches that promote health. Here, we discuss a scalable, less biased approach for identifying effector strains in complex microbial communities that impact immune function. The approach begins by identifying uncultured human fecal microbiota samples that transmit immune phenotypes to germ-free mice. Clonally-arrayed sequenced collections of bacterial strains are constructed from representative donor microbiota. If the collection transmits phenotypes, effector strains are identified by testing randomly generated subsets with overlapping membership in individually-housed germ-free animals. Detailed mechanistic studies of effector strain-host interactions can then be performed. PMID:24950201

  18. Learning Semantic Tags from Big Data for Clinical Text Representation.

    PubMed

    Li, Yanpeng; Liu, Hongfang

    2015-01-01

    In clinical text mining, it is one of the biggest challenges to represent medical terminologies and n-gram terms in sparse medical reports using either supervised or unsupervised methods. Addressing this issue, we propose a novel method for word and n-gram representation at semantic level. We first represent each word by its distance with a set of reference features calculated by reference distance estimator (RDE) learned from labeled and unlabeled data, and then generate new features using simple techniques of discretization, random sampling and merging. The new features are a set of binary rules that can be interpreted as semantic tags derived from word and n-grams. We show that the new features significantly outperform classical bag-of-words and n-grams in the task of heart disease risk factor extraction in i2b2 2014 challenge. It is promising to see that semantics tags can be used to replace the original text entirely with even better prediction performance as well as derive new rules beyond lexical level.

  19. Lessons learned and insights from the implementation of a food and physical activity policy to prevent obesity in Mexican schools: An analysis of nationally representative survey results.

    PubMed

    Théodore, Florence L; Moreno-Saracho, Jessica E; Bonvecchio, Anabelle; Morales-Ruán, María Del Carmen; Tolentino-Mayo, Lizbeth; López-Olmedo, Nancy; Shamah-Levy, Teresa; Rivera, Juan A

    2018-01-01

    Obesity is a serious problem among children in Mexico. In 2010, the government implemented a national food and physical activity policy in elementary schools to prevent obesity. The goal of this study is to assess the implementation of this policy, using the logic model and drawing on a descriptive survey with national representativeness at the elementary school level, based on a stratified cluster design. We used systematic random sampling of schools (n = 122), stratified into public and private. We administered questionnaires to 116 principals, 165 members of the Food and Physical Activity Committees, 132 school food vendors, 119 teachers, and 348 parents. This study evidences a significant deviation in implementation from what had been planned. The lessons learned underscore the importance of basing the design and implementation of the policy on a theoretical framework, making programs appealing to stakeholders, selecting concrete and measurable objectives or goals, and supporting stakeholders during the implementation process.

  20. Final report : sampling plan for pavement condition ratings of secondary roads.

    DOT National Transportation Integrated Search

    1984-01-01

    The purpose of this project was to develop a random sampling plan for use in selecting segments of the secondary highway system for evaluation under the Department's PMS. The plan developed is described here. It is a simple, workable, random sampling...
