[Effect sizes, statistical power and sample sizes in "the Japanese Journal of Psychology"].
Suzukawa, Yumi; Toyoda, Hideki
2012-04-01
This study analyzed the statistical power of research studies published in the "Japanese Journal of Psychology" in 2008 and 2009. Sample effect sizes and sample statistical powers were calculated for each statistical test and analyzed with respect to the analytical methods and the fields of the studies. The results show that in fields such as perception, cognition, and learning, the effect sizes were relatively large although the sample sizes were small; even so, some meaningful effects could not be detected because of the small samples. In the other fields, the large sample sizes allowed even negligible effects to reach significance. This implies that researchers who cannot obtain large effect sizes tend to use larger samples in order to obtain significant results.
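The kind of post-hoc power calculation described above can be sketched in code. The following is a minimal illustration using the normal approximation to the noncentral t distribution (a simplification of the exact computation; the function names and the hard-coded critical value are assumptions of this sketch, not details from the paper):

```python
import math

def normal_cdf(x: float) -> float:
    """Standard normal cumulative distribution function via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def approx_power_two_sample(d: float, n_per_group: int) -> float:
    """Approximate power of a two-sided, alpha = 0.05 two-sample test for
    standardized effect size d (Cohen's d), using a normal approximation."""
    z_crit = 1.959964  # two-sided 5% critical value of the standard normal
    ncp = d * math.sqrt(n_per_group / 2.0)  # noncentrality parameter
    return 1.0 - normal_cdf(z_crit - ncp) + normal_cdf(-z_crit - ncp)
```

For example, with 25 participants per group the approximate power is about 0.81 for a large effect (d = 0.8) but only about 0.11 for a small effect (d = 0.2), which illustrates why small samples detect only large effects.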
Dahling, Daniel R
2002-01-01
Large-scale virus studies of groundwater systems require practical and sensitive procedures for both sample processing and viral assay. Filter adsorption-elution procedures have traditionally been used to process large-volume water samples for viruses. In this study, five filter elution procedures using cartridge filters were evaluated for their effectiveness in processing samples. Of the five procedures tested, the third method, which incorporated two separate beef extract elutions (one being an overnight filter immersion in beef extract), recovered 95% of seeded poliovirus compared with recoveries of 36 to 70% for the other methods. For viral enumeration, an expanded roller bottle quantal assay was evaluated using seeded poliovirus. This cytopathic-based method was considerably more sensitive than the standard plaque assay method. The roller bottle system was more economical than the plaque assay for the evaluation of comparable samples. Using roller bottles required less time and manipulation than the plaque procedure and greatly facilitated the examination of large numbers of samples. The combination of the improved filter elution procedure and the roller bottle assay for viral analysis makes large-scale virus studies of groundwater systems practical. This procedure was subsequently field tested during a groundwater study in which large-volume samples (exceeding 800 L) were processed through the filters.
Big Data and Large Sample Size: A Cautionary Note on the Potential for Bias
Chambers, David A.; Glasgow, Russell E.
2014-01-01
A number of commentaries have suggested that large studies are more reliable than smaller studies, and there is a growing interest in the analysis of “big data” that integrates information from many thousands of persons and/or different data sources. We consider a variety of biases that are likely in the era of big data, including sampling error, measurement error, multiple comparisons errors, aggregation error, and errors associated with the systematic exclusion of information. Using examples from epidemiology, health services research, studies on determinants of health, and clinical trials, we conclude that it is necessary to exercise greater caution to be sure that big sample size does not lead to big inferential errors. Despite the advantages of big studies, large sample size can magnify the bias associated with error resulting from sampling or study design. Clin Trans Sci 2014; Volume #: 1–5 PMID:25043853
USE OF DISPOSABLE DIAPERS TO COLLECT URINE IN EXPOSURE STUDIES
Large studies of children's health as it relates to exposures to chemicals in the environment often require measurements of biomarkers of chemical exposures or effects in urine samples. But collection of urine samples from infants and toddlers is difficult. For large exposure s...
Flexible sampling large-scale social networks by self-adjustable random walk
NASA Astrophysics Data System (ADS)
Xu, Xiao-Ke; Zhu, Jonathan J. H.
2016-12-01
Online social networks (OSNs) have become an increasingly attractive gold mine for academic and commercial researchers. However, research on OSNs faces a number of difficult challenges. One bottleneck lies in the massive quantity, and often the unavailability, of OSN population data. Sampling may therefore be the only feasible solution. How to draw samples that represent the underlying OSNs has remained a formidable task for a number of conceptual and methodological reasons. In particular, most empirically driven studies on network sampling are confined to simulated data or sub-graph data, which are fundamentally different from real, complete-graph OSNs. In the current study, we propose a flexible sampling method, called Self-Adjustable Random Walk (SARW), and test it against the population data of a real large-scale OSN. We evaluate the strengths of the method in comparison with four prevailing methods: uniform, breadth-first search (BFS), random walk (RW), and revised RW (i.e., MHRW) sampling. We mix both induced-edge and external-edge information of sampled nodes in the same sampling process. Our results show that the SARW method generates unbiased samples of OSNs with maximal precision and minimal cost. The study is helpful for the practice of OSN research by providing a highly needed sampling tool, for the methodological development of large-scale network sampling through comparative evaluation of existing methods, and for the theoretical understanding of human networks by highlighting discrepancies and contradictions between existing knowledge or assumptions and large-scale real OSN data.
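Of the baseline methods named above, the revised random walk (MHRW) is simple to sketch. The code below illustrates MHRW only, not the authors' SARW algorithm, whose details are not reproduced here; the adjacency-list representation is an assumption of the sketch:

```python
import random

def mhrw_sample(adj, start, n_steps, seed=0):
    """Metropolis-Hastings random walk (MHRW): a random walk whose acceptance
    rule min(1, deg(i)/deg(j)) corrects the plain random walk's bias toward
    high-degree nodes, so the stationary distribution is uniform over nodes."""
    rng = random.Random(seed)
    node, sample = start, []
    for _ in range(n_steps):
        cand = rng.choice(adj[node])
        # Accept the move with probability deg(current) / deg(candidate).
        if rng.random() < len(adj[node]) / len(adj[cand]):
            node = cand
        sample.append(node)  # repeats on rejection, as MHRW requires
    return sample
```

On a small star graph, for example, the acceptance rule makes the walk leave the high-degree hub only occasionally, so long-run visit frequencies approach uniformity over all nodes rather than concentrating on the hub.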
Got power? A systematic review of sample size adequacy in health professions education research.
Cook, David A; Hatala, Rose
2015-03-01
Many education research studies employ small samples, which in turn lowers statistical power. We re-analyzed the results of a meta-analysis of simulation-based education to determine study power across a range of effect sizes, and the smallest effect that could be plausibly excluded. We systematically searched multiple databases through May 2011, and included all studies evaluating simulation-based education for health professionals in comparison with no intervention or another simulation intervention. Reviewers working in duplicate abstracted information to calculate standardized mean differences (SMDs). We included 897 original research studies. Among the 627 no-intervention-comparison studies, the median sample size was 25. Only two studies (0.3%) had ≥80% power to detect a small difference (SMD > 0.2 standard deviations) and 136 (22%) had power to detect a large difference (SMD > 0.8). A total of 110 no-intervention-comparison studies failed to find a statistically significant difference, but none excluded a small difference and only 47 (43%) excluded a large difference. Among the 297 studies comparing alternate simulation approaches, the median sample size was 30. Only one study (0.3%) had ≥80% power to detect a small difference and 79 (27%) had power to detect a large difference. Of the 128 studies that did not detect a statistically significant effect, 4 (3%) excluded a small difference and 91 (71%) excluded a large difference. In conclusion, most education research studies are powered only to detect effects of large magnitude. For most studies that do not reach statistical significance, the possibility of large and important differences still exists.
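The standardized mean difference abstracted by the reviewers is Cohen's d: the difference in group means divided by the pooled standard deviation. A minimal sketch of the computation (function name is illustrative):

```python
import math

def standardized_mean_difference(group1, group2):
    """Cohen's d: (mean1 - mean2) / pooled standard deviation."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    ss1 = sum((x - m1) ** 2 for x in group1)  # sum of squared deviations
    ss2 = sum((x - m2) ** 2 for x in group2)
    pooled_sd = math.sqrt((ss1 + ss2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd
```

The review's thresholds then read directly off this scale: |d| > 0.2 counts as at least a small difference and |d| > 0.8 as a large one.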
Kevin McKelvey; Michael Young; W. L. Knotek; K. J. Carim; T. M. Wilcox; T. M. Padgett-Stewart; Michael Schwartz
2016-01-01
This study tested the efficacy of environmental DNA (eDNA) sampling to delineate the distribution of bull trout Salvelinus confluentus in headwater streams in western Montana, U.S.A. Surveys proved fast, reliable and sensitive: 124 samples were collected across five basins by a single crew in c. 8 days. Results were largely consistent with past electrofishing,...
An internal pilot design for prospective cancer screening trials with unknown disease prevalence.
Brinton, John T; Ringham, Brandy M; Glueck, Deborah H
2015-10-13
For studies that compare the diagnostic accuracy of two screening tests, the sample size depends on the prevalence of disease in the study population, and on the variance of the outcome. Both parameters may be unknown during the design stage, which makes finding an accurate sample size difficult. To solve this problem, we propose adapting an internal pilot design. In this adapted design, researchers will accrue some percentage of the planned sample size, then estimate both the disease prevalence and the variances of the screening tests. The updated estimates of the disease prevalence and variance are used to conduct a more accurate power and sample size calculation. We demonstrate that in large samples, the adapted internal pilot design produces no Type I inflation. For small samples (N less than 50), we introduce a novel adjustment of the critical value to control the Type I error rate. We apply the method to two proposed prospective cancer screening studies: 1) a small oral cancer screening study in individuals with Fanconi anemia and 2) a large oral cancer screening trial. Conducting an internal pilot study without adjusting the critical value can cause Type I error rate inflation in small samples, but not in large samples. An internal pilot approach usually achieves goal power and, for most studies with sample size greater than 50, requires no Type I error correction. Further, we have provided a flexible and accurate approach to bound Type I error below a goal level for studies with small sample size.
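The core variance re-estimation step of a generic internal pilot design can be sketched as below. This is a plain normal-approximation sample-size formula, not the authors' exact procedure, which additionally updates the disease prevalence and adjusts the critical value in small samples:

```python
import math

def reestimated_n_per_group(pilot_sd, delta, z_alpha=1.959964, z_beta=0.841621):
    """Recompute the per-group sample size for a two-group comparison from the
    pilot estimate of the standard deviation:
        n = 2 * sigma^2 * (z_alpha + z_beta)^2 / delta^2
    Defaults correspond to alpha = 0.05 (two-sided) and 80% power."""
    n = 2.0 * pilot_sd ** 2 * (z_alpha + z_beta) ** 2 / delta ** 2
    return math.ceil(n)
```

After accruing the planned fraction of participants, the study would plug the observed pilot standard deviation into this formula and continue accrual to the updated total.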
Software engineering the mixed model for genome-wide association studies on large samples
USDA-ARS?s Scientific Manuscript database
Mixed models improve the ability to detect phenotype-genotype associations in the presence of population stratification and multiple levels of relatedness in genome-wide association studies (GWAS), but for large data sets the resource consumption becomes impractical. At the same time, the sample siz...
Lin, Yu-Pin; Chu, Hone-Jay; Wang, Cheng-Long; Yu, Hsiao-Hsuan; Wang, Yung-Chieh
2009-01-01
This study applies variogram analyses of normalized difference vegetation index (NDVI) images derived from SPOT HRV images obtained before and after the Chi-Chi earthquake in the Chenyulan watershed, Taiwan, as well as images after four large typhoons, to delineate the spatial patterns, spatial structures and spatial variability of landscapes caused by these large disturbances. The conditional Latin hypercube sampling approach was applied to select samples from multiple NDVI images. Kriging and sequential Gaussian simulation with sufficient samples were then used to generate maps of the NDVI images. The variogram analysis of the NDVI images demonstrates that the spatial patterns of disturbed landscapes in the study areas were successfully delineated. The high-magnitude Chi-Chi earthquake created spatial landscape variations in the study area. After the earthquake, the cumulative impacts of typhoons on landscape patterns depended on the magnitudes and paths of the typhoons, but were not always evident in the spatiotemporal variability of the landscape. The statistics and spatial structures of the multiple NDVI images were captured by 3,000 samples drawn from the 62,500 grid cells of the NDVI images. Kriging and sequential Gaussian simulation with these 3,000 samples effectively reproduced the spatial patterns of the NDVI images. Overall, the proposed approach, which integrates conditional Latin hypercube sampling, variograms, kriging and sequential Gaussian simulation on remotely sensed images, efficiently monitors, samples and maps the effects of large chronological disturbances on the spatial characteristics of landscape changes, including spatial variability and heterogeneity.
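The study uses conditional Latin hypercube sampling on the NDVI values; as a sketch of the underlying stratification idea, plain Latin hypercube sampling on the unit cube can be written as follows (the conditional variant additionally matches the empirical distribution of the covariate data, which is not shown here):

```python
import random

def latin_hypercube(n_samples, n_dims, seed=0):
    """Plain Latin hypercube sample on the unit cube: each dimension is cut
    into n_samples equal strata and exactly one point falls in each stratum."""
    rng = random.Random(seed)
    cols = []
    for _ in range(n_dims):
        # One draw per stratum, then shuffle so dimensions pair up randomly.
        strata = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(strata)
        cols.append(strata)
    return [tuple(col[i] for col in cols) for i in range(n_samples)]
```

The stratification guarantees that even a modest sample (such as the 3,000 points drawn from 62,500 grid cells here) covers the full range of each variable, which is why the approach reproduces the image statistics efficiently.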
ERIC Educational Resources Information Center
Pfaffel, Andreas; Spiel, Christiane
2016-01-01
Approaches to correcting correlation coefficients for range restriction have been developed under the framework of large sample theory. The accuracy of missing data techniques for correcting correlation coefficients for range restriction has thus far only been investigated with relatively large samples. However, researchers and evaluators are…
Hunt, Kathleen E.; Moore, Michael J.; Rolland, Rosalind M.; Kellar, Nicholas M.; Hall, Ailsa J.; Kershaw, Joanna; Raverty, Stephen A.; Davis, Cristina E.; Yeates, Laura C.; Fauquier, Deborah A.; Rowles, Teresa K.; Kraus, Scott D.
2013-01-01
Large whales are subjected to a variety of conservation pressures that could be better monitored and managed if physiological information could be gathered readily from free-swimming whales. However, traditional approaches to studying physiology have been impractical for large whales, because there is no routine method for capture of the largest species and there is presently no practical method of obtaining blood samples from free-swimming whales. We review the currently available techniques for gathering physiological information on large whales using a variety of non-lethal and minimally invasive (or non-invasive) sample matrices. We focus on methods that should produce information relevant to conservation physiology, e.g. measures relevant to stress physiology, reproductive status, nutritional status, immune response, health, and disease. The following four types of samples are discussed: faecal samples, respiratory samples (‘blow’), skin/blubber samples, and photographs. Faecal samples have historically been used for diet analysis but increasingly are also used for hormonal analyses, as well as for assessment of exposure to toxins, pollutants, and parasites. Blow samples contain many hormones as well as respiratory microbes, a diverse array of metabolites, and a variety of immune-related substances. Biopsy dart samples are widely used for genetic, contaminant, and fatty-acid analyses and are now being used for endocrine studies along with proteomic and transcriptomic approaches. Photographic analyses have benefited from recently developed quantitative techniques allowing assessment of skin condition, ectoparasite load, and nutritional status, along with wounds and scars from ship strikes and fishing gear entanglement. Field application of these techniques has the potential to improve our understanding of the physiology of large whales greatly, better enabling assessment of the relative impacts of many anthropogenic and ecological pressures. PMID:27293590
SAMPLING LARGE RIVERS FOR ALGAE, BENTHIC MACROINVERTEBRATES AND FISH
Multiple projects are currently underway to increase our understanding of the effects of different sampling methods and designs used for the biological assessment and monitoring of large (boatable) rivers. Studies include methods used to assess fish, benthic macroinvertebrates, ...
2013-01-01
Background The composition of the microbiota of the equine intestinal tract is complex. Determining whether the microbial composition of fecal samples is representative of proximal compartments of the digestive tract could greatly simplify future studies. The objectives of this study were to compare the microbial populations of the duodenum, ileum, cecum, colon and rectum (feces) within and between healthy horses, and to determine whether rectal (fecal) samples are representative of proximal segments of the gastrointestinal tract. Intestinal samples were collected from ten euthanized horses. 16S rRNA gene PCR-based TRFLP was used to investigate microbiota richness in various segments of the gastrointestinal tract, and Dice similarity indices were calculated to compare the samples. Results Within horses, large variations in microbial populations were seen along the gastrointestinal tract. The microbiota in rectal samples was only partially representative of other intestinal compartments. The highest similarity was obtained when feces were compared to the cecum. Large compartmental variations were also seen when microbial populations were compared between six horses with similar dietary and housing management. Conclusion Rectal samples were not entirely representative of intestinal compartments in the small or large intestine. This should be taken into account when designing studies using fecal sampling to assess other intestinal compartments. Similarity between horses with similar dietary and husbandry management was also limited, suggesting that parts of the intestinal microbiota were unique to each animal in this study. PMID:23497580
Shen, You-xin; Liu, Wei-li; Li, Yu-hui; Guan, Hui-lin
2014-01-01
A large number of small-sized samples invariably shows that woody species are absent from forest soil seed banks, a large discrepancy with the seedling bank on the forest floor. We ask: 1) Does this conventional sampling strategy limit the detection of seeds of woody species? 2) Are larger sample areas and sample sizes needed for higher recovery of seeds of woody species? We collected 100 samples of 10 cm (length) × 10 cm (width) × 10 cm (depth), referred to as a large number of small-sized samples (LNSS), in a 1 ha forest plot and placed them to germinate in a greenhouse, and collected 30 samples of 1 m × 1 m × 10 cm, referred to as a small number of large-sized samples (SNLS), placing 10 each in a nearby secondary forest, shrub land and grassland. Only 15.7% of the woody plant species of the forest stand were detected by the 100 LNSS, in contrast with 22.9%, 37.3% and 20.5% of woody plant species detected by SNLS in the secondary forest, shrub land and grassland, respectively. The curves of species number versus sampled area confirmed power-law relationships for the forest stand, the LNSS and the SNLS at all three recipient sites. Our results, although based on one forest, indicate that the conventional LNSS strategy did not yield a high percentage of detection for woody species, whereas the SNLS strategy yielded a higher percentage of detection in the seed bank when samples were exposed to a better field germination environment. The 4 m2 minimum sample area derived from the power equations is larger than the sampled area in most studies in the literature. An increased sample size is also needed to obtain an increased sample area if the number of samples is to remain relatively low.
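The species-area power-law relationship referred to above, S = c * A^z, is conventionally fitted by least squares on the log-log scale; a minimal sketch (function name is illustrative):

```python
import math

def fit_power_law(areas, species_counts):
    """Fit S = c * A^z by ordinary least squares on log S = log c + z * log A."""
    xs = [math.log(a) for a in areas]
    ys = [math.log(s) for s in species_counts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    z = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))   # slope = power-law exponent
    c = math.exp(my - z * mx)                # intercept back-transformed
    return c, z
```

Solving the fitted equation for the area at which a target species count is reached is how a minimum sample area, such as the 4 m2 reported above, can be derived.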
FIELD-SCALE STUDIES: HOW DOES SOIL SAMPLE PRETREATMENT AFFECT REPRESENTATIVENESS ? (ABSTRACT)
Samples from field-scale studies are very heterogeneous and can contain large soil and rock particles. Oversize materials are often removed before chemical analysis of the soil samples because it is not practical to include these materials. Is the extracted sample representativ...
FIELD-SCALE STUDIES: HOW DOES SOIL SAMPLE PRETREATMENT AFFECT REPRESENTATIVENESS?
Samples from field-scale studies are very heterogeneous and can contain large soil and rock particles. Oversize materials are often removed before chemical analysis of the soil samples because it is not practical to include these materials. Is the extracted sample representativ...
Harada, Sei; Hirayama, Akiyoshi; Chan, Queenie; Kurihara, Ayako; Fukai, Kota; Iida, Miho; Kato, Suzuka; Sugiyama, Daisuke; Kuwabara, Kazuyo; Takeuchi, Ayano; Akiyama, Miki; Okamura, Tomonori; Ebbels, Timothy M D; Elliott, Paul; Tomita, Masaru; Sato, Asako; Suzuki, Chizuru; Sugimoto, Masahiro; Soga, Tomoyoshi; Takebayashi, Toru
2018-01-01
Cohort studies with metabolomics data are becoming more widespread; however, large-scale studies involving tens of thousands of participants are still limited, especially in Asian populations. We therefore started the Tsuruoka Metabolomics Cohort Study, enrolling 11,002 community-dwelling adults in Japan and using capillary electrophoresis-mass spectrometry (CE-MS) and liquid chromatography-mass spectrometry. The CE-MS method is highly amenable to absolute quantification of polar metabolites; however, its reliability for large-scale measurement is unclear. The aim of this study is to examine the reproducibility and validity of large-scale CE-MS measurements. In addition, the study presents absolute concentrations of polar metabolites in human plasma, which can be used in the future as reference ranges in a Japanese population. Metabolomic profiling of 8,413 fasting plasma samples was completed using CE-MS, and 94 polar metabolites were structurally identified and quantified. Quality control (QC) samples were injected every ten samples and assessed throughout the analysis. Inter- and intra-batch coefficients of variation of QC and participant samples, and technical intraclass correlation coefficients, were estimated. Passing-Bablok regression of plasma concentrations by CE-MS on serum concentrations by standard clinical chemistry assays was conducted for creatinine and uric acid. In QC samples, the coefficient of variation was less than 20% for 64 metabolites and less than 30% for 80 of the 94 metabolites. The inter-batch coefficient of variation was less than 20% for 81 metabolites. The estimated technical intraclass correlation coefficient was above 0.75 for 67 metabolites. The slope of the Passing-Bablok regression was estimated as 0.97 (95% confidence interval: 0.95, 0.98) for creatinine and 0.95 (0.92, 0.96) for uric acid.
Compared to published data from other large cohort measurement platforms, reproducibility of metabolites common to the platforms was similar to or better than in the other studies. These results show that our CE-MS platform is suitable for conducting large-scale epidemiological studies.
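The coefficient of variation used as the quality-control statistic above is straightforward to compute; a minimal sketch using the sample (n-1) standard deviation:

```python
def coefficient_of_variation(values):
    """CV (%) = 100 * sd / mean, with the sample (n-1) standard deviation."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    return 100.0 * sd / mean
```

Applied to the repeated QC injections for one metabolite, a CV below 20% would meet the stricter of the two thresholds reported above.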
ERIC Educational Resources Information Center
Kaplan, David; Su, Dan
2016-01-01
This article presents findings on the consequences of matrix sampling of context questionnaires for the generation of plausible values in large-scale assessments. Three studies are conducted. Study 1 uses data from PISA 2012 to examine several different forms of missing data imputation within the chained equations framework: predictive mean…
Bailey, S R; Townsend, C L; Dent, H; Mallet, C; Tsaliki, E; Riley, E M; Noursadeghi, M; Lawley, T D; Rodger, A J; Brocklehurst, P; Field, N
2017-12-28
Few data are available to guide biological sample collection around the time of birth for large-scale birth cohorts. We are designing a large UK birth cohort to investigate the role of infection and the developing immune system in determining future health and disease. We undertook a pilot to develop methodology for the main study, gain practical experience of collecting samples, and understand the acceptability of sample collection to women in late pregnancy. Between February and July 2014, we piloted the feasibility and acceptability of collecting maternal stool, baby stool and cord blood samples from participants recruited at prolonged pregnancy and planned pre-labour caesarean section clinics at University College London Hospital. Participating women were asked to complete acceptability questionnaires. Overall, 265 women were approached and 171 (65%) participated, with ≥1 sample collected from 113 women or their baby (66%). Women had a mean age of 34 years, were primarily of white ethnicity (130/166, 78%), and half were nulliparous (86/169, 51%). Women undergoing planned pre-labour caesarean section were more likely than those who delivered vaginally to provide ≥1 sample (98% vs 54%), but less likely to provide maternal stool (10% vs 43%). Pre-sample questionnaires were completed by 110/171 women (64%). Most women reported feeling comfortable with samples being collected from their baby (<10% uncomfortable), but were less comfortable about their own stool (19% uncomfortable) or a vaginal swab (24% uncomfortable). It is possible to collect a range of biological samples from women around the time of delivery, and this was acceptable for most women. These data inform study design and protocol development for large-scale birth cohorts.
Sampling errors in the estimation of empirical orthogonal functions. [for climatology studies
NASA Technical Reports Server (NTRS)
North, G. R.; Bell, T. L.; Cahalan, R. F.; Moeng, F. J.
1982-01-01
Empirical Orthogonal Functions (EOFs), eigenvectors of the spatial cross-covariance matrix of a meteorological field, are reviewed with special attention given to the necessary weighting factors for gridded data and the sampling errors incurred when too small a sample is available. The geographical shape of an EOF shows large intersample variability when its associated eigenvalue is 'close' to a neighboring one. A rule of thumb indicating when an EOF is likely to be subject to large sampling fluctuations is presented. An explicit example, based on the statistics of the 500 mb geopotential height field, displays large intersample variability in the EOFs for sample sizes of a few hundred independent realizations, a size seldom exceeded by meteorological data sets.
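The rule of thumb presented in this study can be sketched numerically: the sampling error of eigenvalue k is roughly lambda_k * sqrt(2/N), and the corresponding EOF is suspect when that error is comparable to the gap to a neighbouring eigenvalue. The function below is an illustrative implementation of that rule (names are this sketch's own):

```python
import math

def eof_sampling_errors(eigenvalues, n_samples):
    """Flag EOFs likely subject to large sampling fluctuations.

    eigenvalues: eigenvalue spectrum in descending order.
    n_samples:   number of independent realizations N.
    Returns (errors, suspect) where errors[k] = lambda_k * sqrt(2/N) and
    suspect[k] is True when that error reaches the gap to a neighbour."""
    errs = [lam * math.sqrt(2.0 / n_samples) for lam in eigenvalues]
    suspect = []
    for k, (lam, err) in enumerate(zip(eigenvalues, errs)):
        gaps = []
        if k > 0:
            gaps.append(eigenvalues[k - 1] - lam)
        if k + 1 < len(eigenvalues):
            gaps.append(lam - eigenvalues[k + 1])
        suspect.append(err >= min(gaps))
    return errs, suspect
```

For a spectrum with two nearly degenerate leading eigenvalues and a few hundred realizations, both leading EOFs are flagged, which matches the intersample variability described in the abstract.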
Evaluation of Sampling Methods for Bacillus Spore ...
Following a wide area release of biological materials, mapping the extent of contamination is essential for orderly response and decontamination operations. HVAC filters process large volumes of air and therefore collect highly representative particulate samples in buildings. HVAC filter extraction may have great utility in rapidly estimating the extent of building contamination following a large-scale incident. However, until now, no studies have been conducted comparing the two most appropriate sampling approaches for HVAC filter materials: direct extraction and vacuum-based sampling.
Research on the self-absorption corrections for PGNAA of large samples
NASA Astrophysics Data System (ADS)
Yang, Jian-Bo; Liu, Zhi; Chang, Kang; Li, Rui
2017-02-01
When a large sample is analysed by prompt gamma neutron activation analysis (PGNAA), neutron self-shielding and gamma self-absorption affect the accuracy. A method for correcting the detection efficiency of each element relative to H in a large sample is described. The influences of the thickness and density of cement samples on the H detection efficiency, as well as of the impurities Fe2O3 and SiO2 on the prompt γ-ray yield of each element in the cement samples, were studied. Phase functions for Ca, Fe and Si relative to H, as functions of sample thickness and density, were provided. These avoid the complicated procedure of preparing a separate density or thickness calibration for each measured value and present a simplified method for the measurement efficiency scale in PGNAA.
A COMPARISON OF SIX BENTHIC MACROINVERTEBRATE SAMPLING METHODS IN FOUR LARGE RIVERS
In 1999, a study was conducted to compare six macroinvertebrate sampling methods in four large (boatable) rivers that drain into the Ohio River. Two methods each were adapted from existing methods used by the USEPA, USGS and Ohio EPA. Drift nets were unable to collect a suffici...
NASA Technical Reports Server (NTRS)
Zeigler, R. A.
2015-01-01
From 1969-1972 the Apollo missions collected 382 kg of lunar samples from six distinct locations on the Moon. Studies of the Apollo sample suite have shaped our understanding of the formation and early evolution of the Earth-Moon system, and have had important implications for studies of the other terrestrial planets (e.g., through the calibration of the crater counting record) and even the outer planets (e.g., the Nice model of the dynamical evolution of the Solar System). Despite nearly 50 years of detailed research on Apollo samples, scientists are still developing new theories about the origin and evolution of the Moon. Three areas of active research are: (1) the abundance of water (and other volatiles) in the lunar mantle, (2) the timing of the formation of the Moon and the duration of lunar magma ocean crystallization, (3) the formation of evolved lunar lithologies (e.g., granites) and implications for tertiary crustal processes on the Moon. In order to fully understand these (and many other) theories about the Moon, scientists need access to "new" lunar samples, particularly new plutonic samples. Over 100 lunar meteorites have been identified over the past 30 years, and the study of these samples has greatly aided in our understanding of the Moon. However, terrestrial alteration and the lack of geologic context limit what can be learned from the lunar meteorites. Although no "new" large plutonic samples (i.e., hand-samples) remain to be discovered in the Apollo sample collection, there are many large polymict breccias in the Apollo collection containing relatively large (approximately 1 cm or larger) previously identified plutonic clasts, as well as a large number of unclassified lithic clasts. In addition, new, previously unidentified plutonic clasts are potentially discoverable within these breccias. 
The question becomes how to non-destructively locate and identify new lithic clasts of interest while minimizing the contamination and physical degradation of the samples.
Annealing Increases Stability Of Iridium Thermocouples
NASA Technical Reports Server (NTRS)
Germain, Edward F.; Daryabeigi, Kamran; Alderfer, David W.; Wright, Robert E.; Ahmed, Shaffiq
1989-01-01
Metallurgical studies carried out on samples of iridium versus iridium/40-percent rhodium thermocouples in condition received from manufacturer. Metallurgical studies included x-ray, macroscopic, resistance, and metallographic studies. Revealed large amount of internal stress caused by cold-working during manufacturing, and large number of segregations and inhomogeneities. Samples annealed in furnace at temperatures from 1,000 to 2,000 degree C for intervals up to 1 h to study effects of heat treatment. Wire annealed by this procedure found to be ductile.
Capers, Patrice L.; Brown, Andrew W.; Dawson, John A.; Allison, David B.
2015-01-01
Background: Meta-research can involve manual retrieval and evaluation of research, which is resource intensive. The creation of high-throughput methods (e.g., search heuristics, crowdsourcing) has improved the feasibility of large meta-research questions, but possibly at the cost of accuracy. Objective: To evaluate the use of double sampling combined with multiple imputation (DS + MI) to address meta-research questions, using as an example the adherence of PubMed entries to two simple Consolidated Standards of Reporting Trials (CONSORT) guidelines for titles and abstracts. Methods: For the DS large sample, we retrieved all PubMed entries satisfying the filters: RCT, human, abstract available, and English language (n = 322,107). For the DS subsample, we randomly sampled 500 entries from the large sample. The large sample was evaluated with a lower-rigor, higher-throughput (RLOTHI) method using search heuristics, while the subsample was evaluated using a higher-rigor, lower-throughput (RHITLO) human rating method. Multiple imputation of the missing-completely-at-random RHITLO data for the large sample was informed by: RHITLO data from the subsample; RLOTHI data from the large sample; whether a study was an RCT; and country and year of publication. Results: The RHITLO and RLOTHI methods largely agreed in the subsample (phi coefficients: title = 1.00, abstract = 0.92). Compliance with abstract and title criteria has increased over time, with non-US countries improving more rapidly. DS + MI logistic regression estimates were more precise than subsample estimates (e.g., 95% CI for change in title and abstract compliance by year: subsample RHITLO 1.050–1.174 vs. DS + MI 1.082–1.151). As evidence of improved accuracy, DS + MI coefficient estimates were closer to RHITLO than the large-sample RLOTHI estimates. Conclusion: Our results support our hypothesis that DS + MI would result in improved precision and accuracy.
This method is flexible and may provide a practical way to examine large corpora of literature. PMID:25988135
ERIC Educational Resources Information Center
Boivin, Michel; Perusse, Daniel; Dionne, Ginette; Saysset, Valerie; Zoccolillo, Mark; Tarabulsy, George M.; Tremblay, Nathalie; Tremblay, Richard E.
2005-01-01
Background: Given the importance of parenting for the child's early socio-emotional development, parenting perceptions and behaviours, and their correlates, should be assessed as early as possible in the child's life. The goals of the present study were 1) to confirm, in two parallel population-based samples, including a large sample of twins, the…
Low-cost floating emergence net and bottle trap: Comparison of two designs
Cadmus, Pete; Pomeranz, Justin; Kraus, Johanna M.
2016-01-01
Sampling emergent aquatic insects is of interest to many freshwater ecologists. Many quantitative emergence traps require the use of aspiration for collection. However, aspiration is infeasible in studies with the large amount of replication often required in large biomonitoring projects. We designed an economical, collapsible, pyramid-shaped floating emergence trap with an external collection bottle that avoids the need for aspiration. This design was compared experimentally to a design of similar dimensions that relied on aspiration, to ensure comparable results. The pyramid-shaped design captured twice as many total emerging insects. When a preservative was used in the bottle collectors, >95% of the emergent abundance was collected in the bottle. When no preservative was used, >81% of the total insects were collected from the bottle. In addition to capturing fewer emergent insects, the traps that required aspiration took significantly longer to sample. Large studies and studies sampling remote locations could benefit from the economical construction, speed of sampling, and capture efficiency of this design.
A self-sampling method to obtain large volumes of undiluted cervicovaginal secretions.
Boskey, Elizabeth R; Moench, Thomas R; Hees, Paul S; Cone, Richard A
2003-02-01
Studies of vaginal physiology and pathophysiology sometimes require larger volumes of undiluted cervicovaginal secretions than can be obtained by current methods. A convenient method for self-sampling these secretions outside a clinical setting can facilitate such studies of reproductive health. The goal was to develop a vaginal self-sampling method for collecting large volumes of undiluted cervicovaginal secretions. A menstrual collection device (the Instead cup) was inserted briefly into the vagina to collect secretions that were then retrieved from the cup by centrifugation in a 50-ml conical tube. All 16 women asked to perform this procedure found it feasible and acceptable. Among 27 samples, an average of 0.5 g of secretions (range, 0.1-1.5 g) was collected. This is a rapid and convenient self-sampling method for obtaining relatively large volumes of undiluted cervicovaginal secretions. It should prove suitable for a wide range of assays, including those involving sexually transmitted diseases, microbicides, vaginal physiology, immunology, and pathophysiology.
Association of CLU and PICALM variants with Alzheimer's disease
Kamboh, M.I.; Minster, R. L.; Demirci, F.Y.; Ganguli, M.; DeKosky, S.T.; Lopez, O.L.; Barmada, M.M.
2010-01-01
Two recent large genome-wide association studies have reported significant associations in the CLU (APOJ), CR1 and PICALM genes. In order to replicate these findings, we examined 7 single nucleotide polymorphisms (SNPs) most significantly implicated by these studies in a large case-control sample comprising 2,707 individuals. Principal components analysis revealed no population substructure in our sample. While no association was observed with CR1 SNPs (P=0.30–0.457), a trend of association was seen with the PICALM (P=0.071–0.086) and CLU (P=0.148–0.258) SNPs. A meta-analysis of three studies revealed significant associations with all three genes. Our data from an independent and large case-control sample suggest that these gene regions should be followed up by comprehensive resequencing to find functional variants. PMID:20570404
NASA Astrophysics Data System (ADS)
Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander
2016-09-01
In the last decades, an increasing number of studies analyzed spatial patterns in throughfall by means of variograms. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and a layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation method on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with large outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling) and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments (non-robust and robust estimators) and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the number recommended by studies dealing with Gaussian data by up to 100 %. 
Given that most previous throughfall studies relied on method-of-moments variogram estimation and sample sizes ≪200, currently available data are prone to large uncertainties.
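The method-of-moments (Matheron) estimator discussed above computes, for each distance bin h, half the average squared difference between all sample pairs separated by a distance in that bin. A minimal NumPy sketch (the function and binning choices are illustrative, not the authors' implementation):

```python
import numpy as np

def empirical_variogram(coords, values, bin_edges):
    """Matheron's method-of-moments semivariogram estimator:
    gamma(h) = 1 / (2 N(h)) * sum over pairs (i, j) with
    |x_i - x_j| in bin h of (z_i - z_j)^2."""
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    n = len(values)
    # All pairwise distances and squared value differences
    d = np.sqrt(((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1))
    sq = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(n, k=1)          # each pair counted once
    d, sq = d[iu], sq[iu]
    gamma = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (d >= lo) & (d < hi)
        gamma.append(sq[mask].mean() / 2 if mask.any() else np.nan)
    return np.array(gamma)
```

Because each bin average is a mean of squared differences, a few large outliers can dominate it, which is one reason the study finds that method-of-moments estimation of non-Gaussian throughfall data needs substantially larger samples than robust or likelihood-based alternatives.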
NASA Astrophysics Data System (ADS)
Breier, J. A.; Sheik, C. S.; Gomez-Ibanez, D.; Sayre-McCord, R. T.; Sanger, R.; Rauch, C.; Coleman, M.; Bennett, S. A.; Cron, B. R.; Li, M.; German, C. R.; Toner, B. M.; Dick, G. J.
2014-12-01
A new tool was developed for large volume sampling to facilitate marine microbiology and biogeochemical studies. It was developed for remotely operated vehicle and hydrocast deployments, and allows for rapid collection of multiple sample types from the water column and dynamic, variable environments such as rising hydrothermal plumes. It was used successfully during a cruise to the hydrothermal vent systems of the Mid-Cayman Rise. The Suspended Particulate Rosette V2 large volume multi-sampling system allows for the collection of 14 sample sets per deployment. Each sample set can include filtered material, whole (unfiltered) water, and filtrate. Suspended particulate can be collected on filters up to 142 mm in diameter and pore sizes down to 0.2 μm. Filtration is typically at flowrates of 2 L min-1. For particulate material, filtered volume is constrained only by sampling time and filter capacity, with all sample volumes recorded by digital flowmeter. The suspended particulate filter holders can be filled with preservative and sealed immediately after sample collection. Up to 2 L of whole water, filtrate, or a combination of the two, can be collected as part of each sample set. The system is constructed of plastics with titanium fasteners and nickel alloy spring loaded seals. There are no ferrous alloys in the sampling system. Individual sample lines are prefilled with filtered, deionized water prior to deployment and remain sealed unless a sample is actively being collected. This system is intended to facilitate studies concerning the relationship between marine microbiology and ocean biogeochemistry.
Evaluating noninvasive genetic sampling techniques to estimate large carnivore abundance.
Mumma, Matthew A; Zieminski, Chris; Fuller, Todd K; Mahoney, Shane P; Waits, Lisette P
2015-09-01
Monitoring large carnivores is difficult because of intrinsically low densities and can be dangerous if physical capture is required. Noninvasive genetic sampling (NGS) is a safe and cost-effective alternative to physical capture. We evaluated the utility of two NGS methods (scat detection dogs and hair sampling) to obtain genetic samples for abundance estimation of coyotes, black bears and Canada lynx in three areas of Newfoundland, Canada. We calculated abundance estimates using program capwire, compared sampling costs and cost per sample for each method relative to species and study site, and performed simulations to determine the sampling intensity necessary to achieve abundance estimates with coefficients of variation (CV) of <10%. Scat sampling was effective for both coyotes and bears, and hair snags effectively sampled bears in two of three study sites. Rub pads were ineffective in sampling coyotes and lynx. The precision of abundance estimates was dependent upon the number of captures/individual. Our simulations suggested that ~3.4 captures/individual will result in a <10% CV for abundance estimates when populations are small (23-39), but fewer captures/individual may be sufficient for larger populations. We found scat sampling was more cost-effective for sampling multiple species, but suggest that hair sampling may be less expensive at study sites with limited road access for bears. Given the dependence of sampling scheme on species and study site, the optimal sampling scheme is likely to be study-specific, warranting pilot studies in most circumstances. © 2015 John Wiley & Sons Ltd.
Replicability and Robustness of GWAS for Behavioral Traits
Rietveld, Cornelius A.; Conley, Dalton; Eriksson, Nicholas; Esko, Tõnu; Medland, Sarah E.; Vinkhuyzen, Anna A.E.; Yang, Jian; Boardman, Jason D.; Chabris, Christopher F.; Dawes, Christopher T.; Domingue, Benjamin W.; Hinds, David A.; Johannesson, Magnus; Kiefer, Amy K.; Laibson, David; Magnusson, Patrik K. E.; Mountain, Joanna L.; Oskarsson, Sven; Rostapshova, Olga; Teumer, Alexander; Tung, Joyce Y.; Visscher, Peter M.; Benjamin, Daniel J.; Cesarini, David; Koellinger, Philipp D.
2015-01-01
A recent genome-wide association study (GWAS) of educational attainment identified three single-nucleotide polymorphisms (SNPs) that, despite their small effect sizes (each R2 ≈ 0.02%), reached genome-wide significance (p < 5×10−8) in a large discovery sample and replicated in an independent sample (p < 0.05). The study also reported associations between educational attainment and indices of SNPs called “polygenic scores.” We evaluate the robustness of these findings. Study 1 finds that all three SNPs replicate in another large (N = 34,428) independent sample. We also find that the scores remain predictive (R2 ≈ 2%) with stringent controls for stratification (Study 2) and in new within-family analyses (Study 3). Our results show that large and therefore well-powered GWASs can identify replicable genetic associations with behavioral traits. The small effect sizes of individual SNPs are likely to be a major contributing explanation for the striking contrast between our results and the disappointing replication record of most candidate gene studies. PMID:25287667
Wei, Binnian; Feng, June; Rehmani, Imran J; Miller, Sharyn; McGuffey, James E; Blount, Benjamin C; Wang, Lanqing
2014-09-25
Most sample preparation methods characteristically involve intensive and repetitive labor, which is inefficient when preparing large numbers of samples from population-scale studies. This study presents a robotic system designed to meet the sampling requirements for large population-scale studies. Using this robotic system, we developed and validated a method to simultaneously measure urinary anatabine, anabasine, nicotine and seven major nicotine metabolites: 4-Hydroxy-4-(3-pyridyl)butanoic acid, cotinine-N-oxide, nicotine-N-oxide, trans-3'-hydroxycotinine, norcotinine, cotinine and nornicotine. We analyzed robotically prepared samples using high-performance liquid chromatography (HPLC) coupled with triple quadrupole mass spectrometry in positive electrospray ionization mode using scheduled multiple reaction monitoring (sMRM) with a total runtime of 8.5 min. The optimized procedure was able to deliver linear analyte responses over a broad range of concentrations. Responses of urine-based calibrators delivered coefficients of determination (R²) of >0.995. Sample preparation recovery was generally higher than 80%. The robotic system was able to prepare four 96-well plates (384 urine samples) per day, and the overall method afforded an accuracy range of 92-115% and an imprecision of <15.0% on average. The validation results demonstrate that the method is accurate, precise, sensitive, robust, and, most significantly, labor-saving for sample preparation, making it efficient and practical for routine measurements in large population-scale studies such as the National Health and Nutrition Examination Survey (NHANES) and the Population Assessment of Tobacco and Health (PATH) study. Published by Elsevier B.V.
Nelson, Jennifer C.; Marsh, Tracey; Lumley, Thomas; Larson, Eric B.; Jackson, Lisa A.; Jackson, Michael
2014-01-01
Objective: Estimates of treatment effectiveness in epidemiologic studies using large observational health care databases may be biased due to inaccurate or incomplete information on important confounders. Study methods that collect and incorporate more comprehensive confounder data on a validation cohort may reduce confounding bias. Study Design and Setting: We applied two such methods, imputation and reweighting, to Group Health administrative data (full sample) supplemented by more detailed confounder data from the Adult Changes in Thought study (validation sample). We used influenza vaccination effectiveness (with an unexposed comparator group) as an example and evaluated each method’s ability to reduce bias using the control time period prior to influenza circulation. Results: Both methods reduced, but did not completely eliminate, the bias compared with traditional effectiveness estimates that do not utilize the validation sample confounders. Conclusion: Although these results support the use of validation sampling methods to improve the accuracy of comparative effectiveness findings from healthcare database studies, they also illustrate that the success of such methods depends on many factors, including the ability to measure important confounders in a representative and large enough validation sample, the comparability of the full sample and validation sample, and the accuracy with which data can be imputed or reweighted using the additional validation sample information. PMID:23849144
Accuracy or precision: Implications of sample design and methodology on abundance estimation
Kowalewski, Lucas K.; Chizinski, Christopher J.; Powell, Larkin A.; Pope, Kevin L.; Pegg, Mark A.
2015-01-01
Sampling by spatially replicated counts (point counts) is an increasingly popular method of estimating the population size of organisms. Challenges exist when sampling by the point-count method: it is often impractical to sample the entire area of interest and impossible to detect every individual present. Ecologists encounter logistical limitations that force them to sample either a few large sample units or many small sample units, introducing biases to sample counts. We generated a computer environment and simulated sampling scenarios to test the role of the number of samples, sample unit area, number of organisms, and distribution of organisms in the estimation of population sizes using N-mixture models. Many sample units of small area provided estimates that were consistently closer to true abundance than sample scenarios with few sample units of large area. However, sample scenarios with few sample units of large area provided more precise abundance estimates than those derived from sample scenarios with many sample units of small area. The accuracy and precision of abundance estimates should be weighed during the sample design process, with study goals and objectives fully recognized; in practice, however, this consideration is often an afterthought that occurs during the data analysis process.
Likelihood inference of non-constant diversification rates with incomplete taxon sampling.
Höhna, Sebastian
2014-01-01
Large-scale phylogenies provide a valuable source to study background diversification rates and investigate if the rates have changed over time. Unfortunately most large-scale, dated phylogenies are sparsely sampled (fewer than 5% of the described species) and taxon sampling is not uniform. Instead, taxa are frequently sampled to obtain at least one representative per subgroup (e.g. family) and thus to maximize diversity (diversified sampling). So far, such complications have been ignored, potentially biasing the conclusions that have been reached. In this study I derive the likelihood of a birth-death process with non-constant (time-dependent) diversification rates and diversified taxon sampling. Using simulations I test if the true parameters and the sampling method can be recovered when the trees are small or medium sized (fewer than 200 taxa). The results show that the diversification rates can be inferred and the estimates are unbiased for large trees but are biased for small trees (fewer than 50 taxa). Furthermore, model selection by means of Akaike's Information Criterion favors the true model if the true rates differ sufficiently from alternative models (e.g. the birth-death model is favored over a pure-birth model if the extinction rate is sufficiently large). Finally, I applied six different diversification rate models--ranging from a constant-rate pure birth process to a birth-death process with decreasing speciation rate, but excluding any rate-shift models--to three large-scale empirical phylogenies (ants, mammals and snakes, with respectively 149, 164 and 41 sampled species). All three phylogenies were constructed by diversified taxon sampling, as stated by the authors. However, only the snake phylogeny supported diversified taxon sampling. Moreover, a parametric bootstrap test revealed that none of the tested models provided a good fit to the observed data.
The model assumptions, such as homogeneous rates across species or no rate shifts, appear to be violated.
Dynamics of airborne fungal populations in a large office building
NASA Technical Reports Server (NTRS)
Burge, H. A.; Pierson, D. L.; Groves, T. O.; Strawn, K. F.; Mishra, S. K.
2000-01-01
The increasing concern with bioaerosols in large office buildings prompted this prospective study of airborne fungal concentrations in a newly constructed building on the Gulf coast. We collected volumetric culture plate air samples on 14 occasions over the 18-month period immediately following building occupancy. On each sampling occasion, we collected duplicate samples from three sites on three floors of this six-story building, and an outdoor sample. Fungal concentrations indoors were consistently below those outdoors, and no sample clearly indicated fungal contamination in the building, although visible growth appeared in the ventilation system during the course of the study. We conclude that modern mechanically ventilated buildings prevent the intrusion of most of the outdoor fungal aerosol, and that even relatively extensive air sampling protocols may not sufficiently document the microbial status of buildings.
Su, Xiaoquan; Xu, Jian; Ning, Kang
2012-10-01
Effectively comparing different microbial communities (also referred to as 'metagenomic samples' here) at a large scale has long intrigued scientists: given a set of unknown samples, find similar metagenomic samples from a large repository and examine how similar these samples are. With the metagenomic samples accumulated so far, it is possible to build a database of metagenomic samples of interest. Any metagenomic sample could then be searched against this database to find the most similar metagenomic sample(s). However, on one hand, current databases with a large number of metagenomic samples mostly serve as data repositories that offer few functionalities for analysis; on the other hand, methods to measure the similarity of metagenomic data work well only for small sets of samples by pairwise comparison. It is not yet clear how to efficiently search for metagenomic samples against a large metagenomic database. In this study, we have proposed a novel method, Meta-Storms, that can systematically and efficiently organize and search metagenomic data. It includes the following components: (i) creating a database of metagenomic samples based on their taxonomical annotations, (ii) efficient indexing of samples in the database based on a hierarchical taxonomy indexing strategy, (iii) searching for a metagenomic sample against the database by a fast scoring function based on quantitative phylogeny and (iv) managing the database by index export, index import, data insertion, data deletion and database merging. We have collected more than 1300 metagenomic datasets from the public domain and in-house facilities, and tested the Meta-Storms method on these datasets. Our experimental results show that Meta-Storms is capable of database creation and effective searching for a large number of metagenomic samples, and it can achieve similar accuracies compared with the current popular significance testing-based methods.
The Meta-Storms method would serve as a suitable database management and search system to quickly identify similar metagenomic samples from a large pool of samples. Supplementary data are available at Bioinformatics online.
NASA Astrophysics Data System (ADS)
Ustione, A.; Cricenti, A.; Piacentini, M.; Felici, A. C.
2006-09-01
A new implementation of a shear-force microscope is described that uses a shear-force detection system to perform topographical imaging of large areas (~1×1 mm²). This implementation finds very interesting application in the study of archaeological or artistic samples. Three dc motors are used to move the sample during a scan, allowing the probe tip to follow the surface and to accommodate height differences of several tens of micrometers. This large-area topographical imaging mode exploits new subroutines that were added to the existing homemade software; these subroutines were created in the Microsoft VISUAL BASIC 6.0 programming language. With this new feature our shear-force microscope can be used to study topographical details over large areas of archaeological samples in a nondestructive way. We show results detecting worn reliefs on a coin.
Logistics and quality control for DNA sampling in large multicenter studies.
Nederhand, R J; Droog, S; Kluft, C; Simoons, M L; de Maat, M P M
2003-05-01
To study associations between genetic variation and disease, large bio-banks need to be created in multicenter studies. Therefore, we studied the effects of storage time and temperature on DNA quality and quantity in a simulation experiment with storage of up to 28 days frozen, at 4 degrees C, and at room temperature. In the simulation experiment, none of the conditions degraded the amount or quality of DNA to an unsatisfactory level. However, the amount of extracted DNA was decreased in frozen samples and in samples that were stored for >7 days at room temperature. In samples from patients in 24 countries of the EUROPA trial, obtained by mail with transport times of up to 1 month, DNA yield and quality were adequate. From these results we conclude that transport of non-frozen blood by ordinary mail is usable and practical for DNA isolation for polymerase chain reaction in clinical and epidemiological studies.
Zhao, Shanrong; Prenger, Kurt; Smith, Lance
2013-01-01
RNA-Seq is becoming a promising replacement for microarrays in transcriptome profiling and differential gene expression studies. Technical improvements have decreased sequencing costs and, as a result, the size and number of RNA-Seq datasets have increased rapidly. However, the increasing volume of data from large-scale RNA-Seq studies poses a practical challenge for data analysis in a local environment. To meet this challenge, we developed Stormbow, a cloud-based software package, to process large volumes of RNA-Seq data in parallel. The performance of Stormbow has been tested by practically applying it to analyse 178 RNA-Seq samples in the cloud. In our test, it took 6 to 8 hours to process an RNA-Seq sample with 100 million reads, and the average cost was $3.50 per sample. Utilizing Amazon Web Services as the infrastructure for Stormbow allows us to easily scale up to handle large datasets with on-demand computational resources. Stormbow is a scalable, cost-effective, open-source tool for large-scale RNA-Seq data analysis. Stormbow can be freely downloaded and used out of the box to process Illumina RNA-Seq datasets. PMID:25937948
How Large Should a Statistical Sample Be?
ERIC Educational Resources Information Center
Menil, Violeta C.; Ye, Ruili
2012-01-01
This study serves as a teaching aid for teachers of introductory statistics. The aim of this study was limited to determining various sample sizes when estimating population proportion. Tables on sample sizes were generated using a C[superscript ++] program, which depends on population size, degree of precision or error level, and confidence…
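The formula that presumably underlies such tables, for an infinite population, is n = z²·p(1−p)/E², where E is the allowable margin of error and p = 0.5 is the conservative worst case; a finite-population correction would additionally fold in the population size the abstract mentions. A minimal sketch in Python rather than the authors' C++:

```python
import math

def sample_size_proportion(margin, z=1.96, p=0.5):
    """Minimum n to estimate a population proportion to within
    +/- margin at the confidence level implied by z:
    n = ceil(z^2 * p * (1 - p) / margin^2).
    p = 0.5 maximizes p*(1-p), so it is the conservative choice."""
    return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

# 95% confidence (z = 1.96), margin of +/- 5 percentage points
n = sample_size_proportion(0.05)
```

For a 5% margin at 95% confidence this gives the familiar n = 385; tightening the margin to 3% roughly triples the required sample, which is the trade-off such teaching tables make visible.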
Internal pilots for a class of linear mixed models with Gaussian and compound symmetric data
Gurka, Matthew J.; Coffey, Christopher S.; Muller, Keith E.
2015-01-01
An internal pilot design uses interim sample size analysis, without interim data analysis, to adjust the final number of observations. The approach helps to choose a sample size sufficiently large (to achieve the statistical power desired), but not too large (which would waste money and time). We report on recent research in cerebral vascular tortuosity (curvature in three dimensions) which would benefit greatly from internal pilots due to uncertainty in the parameters of the covariance matrix used for study planning. Unfortunately, observations correlated across the four regions of the brain and small sample sizes preclude using existing methods. However, as in a wide range of medical imaging studies, tortuosity data have no missing or mistimed data, a factorial within-subject design, the same between-subject design for all responses, and a Gaussian distribution with compound symmetry. For such restricted models, we extend exact, small-sample univariate methods for internal pilots to linear mixed models with any between-subject design (not just two groups). Planning a new tortuosity study illustrates how the new methods help to avoid sample sizes that are too small or too large while still controlling the type I error rate. PMID:17318914
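The core move of an internal pilot, recomputing the target sample size from an interim variance estimate without looking at the treatment effect, can be sketched with the usual normal-approximation power formula for a two-group comparison. This is a generic illustration, not the authors' exact small-sample method:

```python
import math
from statistics import NormalDist

def reestimated_n(interim_sd, delta, alpha=0.05, power=0.80):
    """Per-group sample size for a two-sample z-test, recomputed
    from the interim standard deviation estimate:
    n = 2 * ((z_{1-alpha/2} + z_{power}) * sd / delta)^2."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return math.ceil(2 * ((z_a + z_b) * interim_sd / delta) ** 2)

# If the interim data suggest sd = 1.0 and we want to detect a
# difference of 0.5 at 80% power, the final per-group n is recomputed
n_final = reestimated_n(interim_sd=1.0, delta=0.5)
```

Because only the variance (not the effect) is examined at the interim look, the type I error inflation is modest, though, as the abstract notes, exact small-sample methods are needed to control it strictly.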
Nelson, Jennifer Clark; Marsh, Tracey; Lumley, Thomas; Larson, Eric B; Jackson, Lisa A; Jackson, Michael L
2013-08-01
Estimates of treatment effectiveness in epidemiologic studies using large observational health care databases may be biased owing to inaccurate or incomplete information on important confounders. Study methods that collect and incorporate more comprehensive confounder data on a validation cohort may reduce confounding bias. We applied two such methods, namely imputation and reweighting, to Group Health administrative data (full sample) supplemented by more detailed confounder data from the Adult Changes in Thought study (validation sample). We used influenza vaccination effectiveness (with an unexposed comparator group) as an example and evaluated each method's ability to reduce bias using the control time period before influenza circulation. Both methods reduced, but did not completely eliminate, the bias compared with traditional effectiveness estimates that do not use the validation sample confounders. Although these results support the use of validation sampling methods to improve the accuracy of comparative effectiveness findings from health care database studies, they also illustrate that the success of such methods depends on many factors, including the ability to measure important confounders in a representative and large enough validation sample, the comparability of the full sample and validation sample, and the accuracy with which the data can be imputed or reweighted using the additional validation sample information. Copyright © 2013 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Pestle, Sarah L.; Chorpita, Bruce F.; Schiffman, Jason
2008-01-01
The Penn State Worry Questionnaire for Children (PSWQ-C; Chorpita, Tracey, Brown, Collica, & Barlow, 1997) is a 14-item self-report measure of worry in children and adolescents. Although the PSWQ-C has demonstrated favorable psychometric properties in small clinical and large community samples, this study represents the first psychometric…
ERIC Educational Resources Information Center
Roseth, Cary J.; Missall, Kristen N.; McConnell, Scott R.
2012-01-01
Early literacy individual growth and development indicators (EL-IGDIs) assess preschoolers' expressive vocabulary development and phonological awareness. This study investigated longitudinal change in EL-IGDIs using a large (N=7355), internet-based sample of 36- to 60-month-old United States preschoolers without identified risks for later…
A non-destructive method for quantifying small-diameter woody biomass in southern pine forests
D. Andrew Scott; Rick Stagg; Morris Smith
2006-01-01
Quantifying the impact of silvicultural treatments on woody understory vegetation largely has been accomplished by destructive sampling or through estimates of frequency and coverage. In studies where repeated measures of understory biomass across large areas are needed, destructive sampling and percent cover estimates are not satisfactory. For example, estimates of...
David, Victor; Galaon, Toma; Aboul-Enein, Hassan Y
2014-01-03
Recent studies showed that the injection of large volumes of hydrophobic solvents used as sample diluents can be applied in reversed-phase liquid chromatography (RP-LC). This study reports a systematic investigation of the influence of a series of aliphatic alcohols (from methanol to 1-octanol) on the retention process in RP-LC when large volumes of sample are injected on the column. Several model analytes with low hydrophobic character were studied by RP-LC, with mobile phases containing methanol or acetonitrile as organic modifiers in different proportions with the aqueous component. It was found that, starting with 1-butanol, the aliphatic alcohols can be used as sample solvents and can be injected in high volumes, but they may influence the retention factor and peak shape of the dissolved solutes. The dependence of the retention factor of the studied analytes on the injection volume of these alcohols is linear, with its value decreasing as the sample volume is increased. The retention process when injecting up to 200 μL of the upper alcohols is also dependent on the content of the organic modifier (methanol or acetonitrile) in the mobile phase. Copyright © 2013 Elsevier B.V. All rights reserved.
The Relationship between Sample Sizes and Effect Sizes in Systematic Reviews in Education
ERIC Educational Resources Information Center
Slavin, Robert; Smith, Dewi
2009-01-01
Research in fields other than education has found that studies with small sample sizes tend to have larger effect sizes than those with large samples. This article examines the relationship between sample size and effect size in education. It analyzes data from 185 studies of elementary and secondary mathematics programs that met the standards of…
When the Test of Mediation is More Powerful than the Test of the Total Effect
O'Rourke, Holly P.; MacKinnon, David P.
2014-01-01
Although previous research has studied power in mediation models, the extent to which the inclusion of a mediator will increase power has not been investigated. First, a study compared analytical power of the mediated effect to the total effect in a single mediator model to identify the situations in which the inclusion of one mediator increased statistical power. Results from the first study indicated that including a mediator increased statistical power in small samples with large coefficients and in large samples with small coefficients, and when coefficients were non-zero and equal across models. Next, a study identified conditions where power was greater for the test of the total mediated effect compared to the test of the total effect in the parallel two mediator model. Results indicated that including two mediators increased power in small samples with large coefficients and in large samples with small coefficients, the same pattern of results found in the first study. Finally, a study assessed analytical power for a sequential (three-path) two mediator model and compared power to detect the three-path mediated effect to power to detect both the test of the total effect and the test of the mediated effect for the single mediator model. Results indicated that the three-path mediated effect had more power than the mediated effect from the single mediator model and the test of the total effect. Practical implications of these results for researchers are then discussed. PMID:24903690
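The pattern the study reports can be reproduced with a small simulation. The sketch below is an illustration, not the authors' analytical power computations: it compares the rejection rate of a test of the total effect against a simple joint-significance mediation test in a fully mediated single-mediator model (c' = 0). The path values a = b = 0.3, the sample size n = 100, and the use of the unadjusted y-on-m regression for the b path are all simplifying assumptions.

```python
import random
import statistics

random.seed(11)

def slope_z(xs, ys):
    """z statistic for the OLS slope of ys on xs (large-sample normal approximation)."""
    n = len(xs)
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    sxx = sum((x - mx) ** 2 for x in xs)
    beta = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    resid_ss = sum((y - my - beta * (x - mx)) ** 2 for x, y in zip(xs, ys))
    se = (resid_ss / (n - 2)) ** 0.5 / sxx ** 0.5
    return beta / se

def power_comparison(a=0.3, b=0.3, n=100, reps=500):
    """Rejection rates (alpha = .05) of the total-effect test vs. a
    joint-significance mediation test in a fully mediated model (c' = 0)."""
    hit_total = hit_joint = 0
    for _ in range(reps):
        x = [random.gauss(0, 1) for _ in range(n)]
        m = [a * xi + random.gauss(0, 1) for xi in x]
        y = [b * mi + random.gauss(0, 1) for mi in m]
        if abs(slope_z(x, y)) > 1.96:          # test of the total effect
            hit_total += 1
        if abs(slope_z(x, m)) > 1.96 and abs(slope_z(m, y)) > 1.96:
            hit_joint += 1                      # both paths significant
    return hit_total / reps, hit_joint / reps

total_power, mediation_power = power_comparison()
print(mediation_power > total_power)  # True
```

With these small, equal coefficients the total effect is only ab = 0.09, so the total-effect test has little power, while each individual path is comfortably detectable — the situation in which the abstract reports a power advantage for the mediation test.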
Hu, Jian Zhi; Sears, Jr., Jesse A.; Hoyt, David W.; Mehta, Hardeep S.; Peden, Charles H. F.
2015-11-24
A continuous-flow (CF) magic angle sample spinning (CF-MAS) NMR rotor and probe are described for investigating reaction dynamics, stable intermediates/transition states, and mechanisms of catalytic reactions in situ. The rotor includes a sample chamber of a flow-through design with a large sample volume that delivers a flow of reactants through a catalyst bed contained within the sample cell, allowing in-situ investigations of reactants and products. Flow through the sample chamber improves diffusion of reactants and products through the catalyst. The large volume of the sample chamber enhances sensitivity, permitting in situ ¹³C CF-MAS studies at natural abundance.
2013-01-01
Introduction Small-study effects refer to the phenomenon that trials with limited sample sizes are more likely to report larger beneficial effects than large trials. However, this has never been investigated in critical care medicine. Thus, the present study aimed to examine the presence and extent of small-study effects in critical care medicine. Methods Critical care meta-analyses involving randomized controlled trials and reporting mortality as an outcome measure were considered eligible for the study. Component trials were classified as large (≥100 patients per arm) or small (<100 patients per arm) according to their sample sizes. A ratio of odds ratios (ROR) was calculated for each meta-analysis, and the RORs were then combined using a meta-analytic approach. ROR<1 indicated a larger beneficial effect in small trials. Small and large trials were compared on methodological qualities including sequence generation, blinding, allocation concealment, intention to treat and sample size calculation. Results A total of 27 critical care meta-analyses involving 317 trials were included. Of them, five meta-analyses showed statistically significant RORs <1, and the other meta-analyses did not reach statistical significance. Overall, the pooled ROR was 0.60 (95% CI: 0.53 to 0.68); the heterogeneity was moderate, with an I2 of 50.3% (chi-squared = 52.30; P = 0.002). Large trials showed significantly better reporting quality than small trials in terms of sequence generation, allocation concealment, blinding, intention to treat, sample size calculation and incomplete follow-up data. Conclusions Small trials are more likely to report larger beneficial effects than large trials in critical care medicine, which could be partly explained by the lower methodological quality in small trials. Caution should be exercised in the interpretation of meta-analyses involving small trials. PMID:23302257
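The ratio-of-odds-ratios calculation is simple to sketch. The abstract does not specify the exact weighting scheme used to combine RORs, so the example below assumes fixed-effect inverse-variance pooling on the log scale, and the 2x2 mortality tables are invented for illustration:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio and log-scale variance from a 2x2 table
    (a, b = deaths, survivors in the treatment arm; c, d in the control arm)."""
    return (a * d) / (b * c), 1/a + 1/b + 1/c + 1/d

def pooled_ror(pairs):
    """Fixed-effect inverse-variance pooling of log RORs.
    pairs: list of (small_trial_table, large_trial_table) tuples."""
    num = den = 0.0
    for small, large in pairs:
        or_s, v_s = odds_ratio(*small)
        or_l, v_l = odds_ratio(*large)
        log_ror = math.log(or_s / or_l)  # < 0 when small trials report larger benefit
        w = 1.0 / (v_s + v_l)            # variance of a log ratio of independent ORs
        num += w * log_ror
        den += w
    return math.exp(num / den)

# Invented mortality tables: (deaths_tx, surv_tx, deaths_ctl, surv_ctl)
pairs = [((10, 40, 20, 30), (150, 350, 180, 320)),
         ((8, 42, 18, 32), (140, 360, 170, 330))]
print(round(pooled_ror(pairs), 2))  # 0.47
```

A pooled ROR well below 1, as in the toy output here and in the study's 0.60, means the small trials in each pair report systematically larger treatment benefits than their large counterparts.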
Radiocarbon dating of extinct fauna in the Americas recovered from tar pits
NASA Astrophysics Data System (ADS)
Jull, A. J. T.; Iturralde-Vinent, M.; O'Malley, J. M.; MacPhee, R. D. E.; McDonald, H. G.; Martin, P. S.; Moody, J.; Rincón, A.
2004-08-01
We have obtained radiocarbon dates by accelerator mass spectrometry on bones of extinct large mammals from tar pits. Results on some samples of Glyptodon and Holmesina (extinct large mammals similar to armadillos) yielded ages of >25 and >21 ka, respectively. We also studied the radiocarbon ages of three different samples of bones from the extinct Cuban ground sloth, Parocnus browni, which yielded dates ranging from 4960 ± 280 to 11 880 ± 420 yr BP. In order to remove the tar component and pretreat the samples sufficiently to obtain reliable dates, we cleaned the samples by Soxhlet extraction in benzene. The resulting samples of collagenous material were often small.
Measuring discharge with ADCPs: Inferences from synthetic velocity profiles
Rehmann, C.R.; Mueller, D.S.; Oberg, K.A.
2009-01-01
Synthetic velocity profiles are used to determine guidelines for sampling discharge with acoustic Doppler current profilers (ADCPs). The analysis allows the effects of instrument characteristics, sampling parameters, and properties of the flow to be studied systematically. For mid-section measurements, the averaging time required for a single profile measurement always exceeded the 40 s usually recommended for velocity measurements, and it increased with increasing sample interval and increasing time scale of the large eddies. Similarly, simulations of transect measurements show that discharge error decreases as the number of large eddies sampled increases. The simulations allow sampling criteria that account for the physics of the flow to be developed. © 2009 ASCE.
Exploring Collaborative Culture and Leadership in Large High Schools
ERIC Educational Resources Information Center
Jeffers, Michael P.
2013-01-01
The purpose of this exploratory study was to analyze how high school principals approached developing a collaborative culture and providing collaborative leadership in a large high school setting. The population sample for this study was 82 principals of large comprehensive high schools of grades 9 through 12 or some combination thereof with…
Cooperative investigation of precision and accuracy: In chemical analysis of silicate rocks
Schlecht, W.G.
1951-01-01
This is the preliminary report of the first extensive program ever organized to study the analysis of igneous rocks, a study sponsored by the United States Geological Survey, the Massachusetts Institute of Technology, and the Geophysical Laboratory of the Carnegie Institution of Washington. Large samples of two typical igneous rocks, a granite and a diabase, were carefully prepared and divided. Small samples (about 70 grams) of each were sent to 25 rock-analysis laboratories throughout the world; analyses of one or both samples were reported by 34 analysts in these laboratories. The results, which showed rather large discrepancies, are presented in histograms. The great discordance in results reflects the present unsatisfactory state of rock analysis. It is hoped that the ultimate establishment of standard samples and procedures will contribute to the improvement of quality of analyses. The two rock samples have also been thoroughly studied spectrographically and petrographically. Detailed reports of all the studies will be published.
Vitamin D receptor gene and osteoporosis - author's response
DOE Office of Scientific and Technical Information (OSTI.GOV)
Looney, J.E.; Yoon, Hyun Koo; Fischer, M.
1996-04-01
We appreciate the comments of Dr. Nguyen et al. about our recent study, but we disagree with their suggestion that the lack of an association between low bone density and the BB VDR genotype, which we reported, is an artifact generated by the small sample size. Furthermore, our results are consistent with similar conclusions reached by a number of other investigators, as recently reported by Peacock. Peacock states "Taken as a whole, the results of studies outlined ... indicate that VDR alleles cannot account for the major part of the heritable component of bone density as indicated by Morrison et al." The majority of the 17 studies cited in this editorial could not confirm an association between the VDR genotype and the bone phenotype. Surely one cannot criticize this combined work as representing an artifact because of a too-small sample size. We do not dispute the suggestion by Nguyen et al. that large sample sizes are required to analyze small biological effects. This is evident in both Peacock's summary and in their own bone density studies. We did not design our study with a larger sample size because, based on the work of Morrison et al., we had hypothesized a large biological effect; large sample sizes are only needed for small biological effects. 4 refs.
Technology Tips: Sample Too Small? Probably Not!
ERIC Educational Resources Information Center
Strayer, Jeremy F.
2013-01-01
Statistical studies are referenced in the news every day, so frequently that people are sometimes skeptical of reported results. Often, no matter how large a sample size researchers use in their studies, people believe that the sample size is too small to make broad generalizations. The tasks presented in this article use simulations of repeated…
NASA Technical Reports Server (NTRS)
Tueller, P. T.
1977-01-01
Large-scale 70-mm aerial photography is a valuable supplementary tool for rangeland studies. A wide assortment of applications was developed, ranging from vegetation mapping to assessing environmental impact on rangelands. Color and color-infrared stereo pairs are useful for effectively sampling sites limited by ground accessibility. They allow an increased sample size at similar or lower cost than ground sampling techniques and provide a permanent record.
Pestle, Sarah L; Chorpita, Bruce F; Schiffman, Jason
2008-04-01
The Penn State Worry Questionnaire for Children (PSWQ-C; Chorpita, Tracey, Brown, Collica, & Barlow, 1997) is a 14-item self-report measure of worry in children and adolescents. Although the PSWQ-C has demonstrated favorable psychometric properties in small clinical and large community samples, this study represents the first psychometric evaluation of the PSWQ-C in a large clinical sample (N = 491). Factor analysis indicated a two-factor structure, in contrast to all previously published findings on the measure. The PSWQ-C demonstrated favorable psychometric properties in this sample, including high internal consistency, high convergent validity with related constructs, and acceptable discriminative validity between diagnostic categories. The performance of the 3 reverse-scored items was closely examined, and results indicated retaining all 14 items.
ERIC Educational Resources Information Center
Zullig, Keith J.; Collins, Rani; Ghani, Nadia; Patton, Jon M.; Huebner, E. Scott; Ajamie, Jean
2014-01-01
Background: The School Climate Measure (SCM) was developed and validated in 2010 in response to a dearth of psychometrically sound school climate instruments. This study sought to further validate the SCM on a large, diverse sample of Arizona public school adolescents (N = 20,953). Methods: Four SCM domains (positive student-teacher relationships,…
ERIC Educational Resources Information Center
Buttell, Frederick P.; Carney, Michelle Mohr
2006-01-01
Objective: The purpose of the present study was to (a) evaluate a 26-week batterer intervention program by investigating changes in psychological variables related to abuse (i.e., truthfulness, violence, lethality, control, alcohol use, drug use, and stress coping abilities) between pretreatment and posttreatment assessments in a large sample of…
ERIC Educational Resources Information Center
Lewis, Gary J.; Ritchie, Stuart J.; Bates, Timothy C.
2011-01-01
High levels of religiosity have been linked to lower levels of intelligence in a number of recent studies. These results have generated both controversy and theoretical interest. Here in a large sample of US adults we address several issues that restricted the generalizability of these previous results. We measured six dimensions of religiosity…
Mucci, A; Galderisi, S; Merlotti, E; Rossi, A; Rocca, P; Bucci, P; Piegari, G; Chieffi, M; Vignapiano, A; Maj, M
2015-07-01
The Brief Negative Symptom Scale (BNSS) was developed to address the main limitations of the existing scales for the assessment of negative symptoms of schizophrenia. The initial validation of the scale by the group involved in its development demonstrated good convergent and discriminant validity, and a factor structure confirming the two domains of negative symptoms (reduced emotional/verbal expression and anhedonia/asociality/avolition). However, only relatively small samples of patients with schizophrenia were investigated. Further independent validation in large clinical samples might be instrumental to the broad diffusion of the scale in clinical research. The present study aimed to examine the BNSS inter-rater reliability, convergent/discriminant validity and factor structure in a large Italian sample of outpatients with schizophrenia. Our results confirmed the excellent inter-rater reliability of the BNSS (the intraclass correlation coefficient ranged from 0.81 to 0.98 for individual items and was 0.98 for the total score). The convergent validity measures had r values from 0.62 to 0.77, while the divergent validity measures had r values from 0.20 to 0.28 in the main sample (n=912) and in a subsample without clinically significant levels of depression and extrapyramidal symptoms (n=496). The BNSS factor structure was supported in both groups. The study confirms that the BNSS is a promising measure for quantifying negative symptoms of schizophrenia in large multicenter clinical studies. Copyright © 2015 Elsevier Masson SAS. All rights reserved.
Gonzalez Murcia, Josue D; Schmutz, Cameron; Munger, Caitlin; Perkes, Ammon; Gustin, Aaron; Peterson, Michael; Ebbert, Mark T W; Norton, Maria C; Tschanz, Joann T; Munger, Ronald G; Corcoran, Christopher D; Kauwe, John S K
2013-12-01
Recent studies have identified the rs75932628 (R47H) variant in TREM2 as an Alzheimer's disease risk factor with estimated odds ratios ranging from 2.9 to 5.1. The Cache County Memory Study is a large, population-based sample designed for the study of memory and aging. We genotyped R47H in 2974 samples (427 cases and 2540 control subjects) from the Cache County study using a custom TaqMan assay. We observed 7 heterozygous cases and 12 heterozygous control subjects with an odds ratio of 3.5 (95% confidence interval, 1.3-8.8; p = 0.0076). The minor allele frequency and population attributable fraction for R47H were 0.0029 and 0.004, respectively. This study replicates the association between R47H and Alzheimer's disease risk in a large, population-based sample, and estimates the population frequency and attributable risk of this rare variant. Copyright © 2013 Elsevier Inc. All rights reserved.
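The reported odds ratio follows directly from the carrier counts given in the abstract. A quick check with the standard Wald approximation (the paper's 1.3-8.8 interval was presumably computed by a slightly different method, so the interval below lands close to, but not exactly on, the published one):

```python
import math

# Carrier counts from the abstract: 7 R47H heterozygotes among 427 cases,
# 12 among 2540 controls
a, b = 7, 427 - 7        # cases: carriers, non-carriers
c, d = 12, 2540 - 12     # controls: carriers, non-carriers

or_ = (a * d) / (b * c)                  # cross-product odds ratio
se = math.sqrt(1/a + 1/b + 1/c + 1/d)    # Wald SE of the log odds ratio
lo = math.exp(math.log(or_) - 1.96 * se)
hi = math.exp(math.log(or_) + 1.96 * se)
print(round(or_, 1), round(lo, 1), round(hi, 1))  # 3.5 1.4 9.0
```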
Grossmann, Sebastian; Nowak, Piotr; Neogi, Ujjwal
2015-01-01
HIV-1 near full-length genome (HIV-NFLG) sequencing from plasma is an attractive multidimensional tool to apply in large-scale population-based molecular epidemiological studies. It also enables genotypic resistance testing (GRT) for all drug target sites allowing effective intervention strategies for control and prevention in high-risk population groups. Thus, the main objective of this study was to develop a simplified subtype-independent, cost- and labour-efficient HIV-NFLG protocol that can be used in clinical management as well as in molecular epidemiological studies. Plasma samples (n=30) were obtained from HIV-1B (n=10), HIV-1C (n=10), CRF01_AE (n=5) and CRF01_AG (n=5) infected individuals with minimum viral load >1120 copies/ml. The amplification was performed with two large amplicons of 5.5 kb and 3.7 kb, sequenced with 17 primers to obtain HIV-NFLG. GRT was validated against ViroSeq™ HIV-1 Genotyping System. After excluding four plasma samples with low-quality RNA, a total of 26 samples were attempted. Among them, NFLG was obtained from 24 (92%) samples with the lowest viral load being 3000 copies/ml. High (>99%) concordance was observed between HIV-NFLG and ViroSeq™ when determining the drug resistance mutations (DRMs). The N384I connection mutation was additionally detected by NFLG in two samples. Our high efficiency subtype-independent HIV-NFLG is a simple and promising approach to be used in large-scale molecular epidemiological studies. It will facilitate the understanding of the HIV-1 pandemic population dynamics and outline effective intervention strategies. Furthermore, it can potentially be applicable in clinical management of drug resistance by evaluating DRMs against all available antiretrovirals in a single assay.
Qualitative Meta-Analysis on the Hospital Task: Implications for Research
ERIC Educational Resources Information Center
Noll, Jennifer; Sharma, Sashi
2014-01-01
The "law of large numbers" indicates that as sample size increases, sample statistics become less variable and more closely estimate their corresponding population parameters. Different research studies investigating how people consider sample size when evaluating the reliability of a sample statistic have found a wide range of…
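The law-of-large-numbers behavior that the hospital task probes is easy to demonstrate with a short simulation; the population distribution, sample sizes, and replicate count below are arbitrary choices for illustration:

```python
import random
import statistics

random.seed(42)

def sample_mean_spread(n, reps=2000):
    """Standard deviation of the sample mean across `reps` samples of size n
    drawn from a uniform(0, 1) population (true mean 0.5, sd ~0.289)."""
    means = [statistics.fmean(random.random() for _ in range(n))
             for _ in range(reps)]
    return statistics.stdev(means)

spread_small = sample_mean_spread(10)    # theory: 0.289/sqrt(10)  ~ 0.091
spread_large = sample_mean_spread(250)   # theory: 0.289/sqrt(250) ~ 0.018
print(spread_small > spread_large)  # True: larger samples, steadier statistics
```

The spread of the sample mean shrinks in proportion to 1/√n, which is precisely the intuition the hospital task asks people to apply.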
Abrahamson, Melanie; Hooker, Elizabeth; Ajami, Nadim J; Petrosino, Joseph F; Orwoll, Eric S
2017-09-01
The relationship of the gastrointestinal microbiome to health and disease is of major research interest, including the effects of the gut microbiota on age-related conditions. Here we report on the outcome of a project to collect stool samples from a large number of community-dwelling elderly men using the OMNIgene-GUT stool/feces collection kit (OMR-200, DNA Genotek, Ottawa, Canada). Among 1,328 men who were eligible for stool collection, 982 (74%) agreed to participate and 951 submitted samples. The collection process was reported to be acceptable, almost all samples obtained were adequate, and sample handling by mail was uniformly successful. The DNA obtained provided excellent results in microbiome analyses, yielding an abundance of species and a diversity of taxa, as would be predicted. Our results suggest that population studies of older participants involving remote stool sample collection are feasible. These approaches would allow large-scale research projects on the association of the gut microbiota with important clinical outcomes.
NASA Astrophysics Data System (ADS)
Murasawa, Go; Yeduru, Srinivasa R.; Kohl, Manfred
2016-12-01
This study investigated macroscopic inhomogeneous deformation in single-crystal Ni-Mn-Ga foils under uniaxial tensile loading. Two types of single-crystal Ni-Mn-Ga foil samples were examined: as-received and after thermo-mechanical training. Local strain and the strain field were measured under tensile loading using laser speckle and digital image correlation. The as-received sample showed a strongly inhomogeneous, intermittent strain field under progressive deformation, whereas the trained sample showed a homogeneous strain field across the entire specimen surface. The as-received sample is in a largely polycrystalline-like state composed of a domain structure; it contains many domain boundaries and large domains, which would cause intermittent nucleation of large local strain bands. The trained sample, by contrast, approaches an ideal single-crystalline state with a preferential orientation of variants for transformation, because almost all domain boundaries and large domain structures vanish during thermo-mechanical training. As a result, macroscopically homogeneous deformation occurs on the trained sample surface during loading.
Comparison of different biopsy forceps models for tissue sampling in eosinophilic esophagitis.
Bussmann, Christian; Schoepfer, Alain M; Safroneeva, Ekaterina; Haas, Nadine; Godat, Sébastien; Sempoux, Christine; Simon, Hans-Uwe; Straumann, Alex
2016-12-01
Background and aims: Eosinophilic esophagitis (EoE) is a mixed inflammatory and fibrostenotic disease. Unlike superficial inflammatory changes, subepithelial fibrosis is not routinely sampled in esophageal biopsies. This study aimed to evaluate the efficacy and safety of deep esophageal sampling with four different types of biopsy forceps. Patients and methods: In this cross-sectional study, esophageal biopsies were taken from 30 adult patients by one expert endoscopist. Biopsies sampled from the distal esophagus using a static jaw forceps (Olympus, FB-11K-1) were compared with proximal biopsies sampled with static jaw (Olympus, FB-45Q-1), alligator jaw (Olympus, FB-210K), and large-capacity forceps (Boston Scientific, Radial Jaw 4). One pathologist calculated the surface area of the epithelial and subepithelial layers in hematoxylin and eosin (H&E)-stained biopsies. Results: Subepithelial tissue was acquired in 97% (static jaw FB-11K-1), 93% (static jaw FB-45Q-1), 80% (alligator jaw), and 55% (large-capacity) of samples. Median (interquartile range [IQR]) ratios of the surface area of epithelial to subepithelial tissue were: static jaw FB-45Q-1, 1.07 (0.65-4.465); static jaw FB-11K-1, 1.184 (0.608-2.545); alligator jaw, 2.353 (1.312-4.465); and large-capacity, 2.71 (1.611-4.858). The static jaw models obtained a larger surface area of subepithelial tissue compared with the alligator jaw (P < 0.001 and P = 0.037 for FB-11K-1 and FB-45Q-1, respectively) and the large-capacity forceps (P < 0.001 for both static jaw models). No esophageal perforations occurred. Conclusions: The static jaw forceps models allowed sampling of subepithelial tissue in >90% of biopsies and appear to be superior to alligator or large-capacity forceps for sampling larger amounts of subepithelial tissue. © Georg Thieme Verlag KG Stuttgart · New York.
A METHODS COMPARISON FOR COLLECTING MACROINVERTEBRATES IN THE OHIO RIVER
Collection of representative benthic macroinvertebrate samples from large rivers has been challenging researchers for many years. The objective of our study was to develop an appropriate method(s) for sampling macroinvertebrates from the Ohio River. Four existing sampling metho...
Workforce Readiness: A Study of University Students' Fluency with Information Technology
ERIC Educational Resources Information Center
Kaminski, Karen; Switzer, Jamie; Gloeckner, Gene
2009-01-01
This study, with data collected from a large sample of freshmen in 2001 and a random stratified sample of seniors in 2005, examined students' perceived FITness (fluency with information technology). In the fall of 2001, freshmen at a medium-sized research-one institution completed a survey, and in spring 2005 a random sample of graduating seniors…
PASSIM--an open source software system for managing information in biomedical studies.
Viksna, Juris; Celms, Edgars; Opmanis, Martins; Podnieks, Karlis; Rucevskis, Peteris; Zarins, Andris; Barrett, Amy; Neogi, Sudeshna Guha; Krestyaninova, Maria; McCarthy, Mark I; Brazma, Alvis; Sarkans, Ugis
2007-02-09
One of the crucial aspects of day-to-day laboratory information management is the collection, storage and retrieval of information about research subjects and biomedical samples. An efficient link between sample data and experiment results is absolutely imperative for a successful outcome of a biomedical study. Currently available software solutions are largely limited to large-scale, expensive commercial Laboratory Information Management Systems (LIMS). Acquiring such a LIMS can indeed bring laboratory information management to a higher level, but often implies a substantial investment of time, effort and funds, which are not always available. There is a clear need for lightweight open source systems for patient and sample information management. We present a web-based tool for submission, management and retrieval of sample and research subject data. The system secures confidentiality by separating anonymized sample information from individuals' records. It is simple and generic, and can be customised for various biomedical studies. Information can be both entered and accessed using the same web interface. User groups and their privileges can be defined. The system is open-source and is supplied with an on-line tutorial and necessary documentation. It has proven to be successful in a large international collaborative project. The presented system closes the gap between the need for and the availability of lightweight software solutions for managing information in biomedical studies involving human research subjects.
Molecular epidemiology biomarkers-Sample collection and processing considerations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holland, Nina T.; Pfleger, Laura; Berger, Eileen
2005-08-07
Biomarker studies require processing and storage of numerous biological samples with the goals of obtaining a large amount of information and minimizing future research costs. An efficient study design includes provisions for processing of the original samples, such as cryopreservation, DNA isolation, and preparation of specimens for exposure assessment. Use of standard, two-dimensional and nanobarcodes and customized electronic databases assures efficient management of large sample collections and tracking of data analysis results. Standard operating procedures and quality control plans help to protect sample quality and to assure validity of the biomarker data. Specific state, federal and international regulations are in place regarding research with human samples, governing areas including custody, safety of handling, and transport of human samples. Appropriate informed consent must be obtained from the study subjects prior to sample collection and confidentiality of results maintained. Finally, examples of three biorepositories of different scale (European Cancer Study, National Cancer Institute, and School of Public Health Biorepository, University of California, Berkeley) are used to illustrate challenges faced by investigators and the ways to overcome them. New software and biorepository technologies are being developed by many companies that will help to bring biological banking to the new level required by molecular epidemiology of the 21st century.
Mars, Phobos, and Deimos Sample Return Enabled by ARRM Alternative Trade Study Spacecraft
NASA Technical Reports Server (NTRS)
Englander, Jacob A.; Vavrina, Matthew; Merrill, Raymond G.; Qu, Min; Naasz, Bo J.
2014-01-01
The Asteroid Robotic Redirect Mission (ARRM) has been the topic of many mission design studies since 2011. The reference ARRM spacecraft uses a powerful solar electric propulsion (SEP) system and a bag device to capture a small asteroid from an Earth-like orbit and redirect it to a distant retrograde orbit (DRO) around the moon. The ARRM Option B spacecraft uses the same propulsion system and multi-Degree of Freedom (DoF) manipulators device to retrieve a very large sample (thousands of kilograms) from a 100+ meter diameter farther-away Near Earth Asteroid (NEA). This study will demonstrate that the ARRM Option B spacecraft design can also be used to return samples from Mars and its moons - either by acquiring a large rock from the surface of Phobos or Deimos, and/or by rendezvousing with a sample-return spacecraft launched from the surface of Mars.
An adaptive two-stage sequential design for sampling rare and clustered populations
Brown, J.A.; Salehi, M.M.; Moradi, M.; Bell, G.; Smith, D.R.
2008-01-01
How to design an efficient large-area survey continues to be an interesting question for ecologists. In sampling large areas, as is common in environmental studies, adaptive sampling can be efficient because it ensures survey effort is targeted to subareas of high interest. In two-stage sampling, higher density primary sample units are usually of more interest than lower density primary units when populations are rare and clustered. Two-stage sequential sampling has been suggested as a method for allocating second stage sample effort among primary units. Here, we suggest a modification: adaptive two-stage sequential sampling. In this method, the adaptive part of the allocation process means the design is more flexible in how much extra effort can be directed to higher-abundance primary units. We discuss how best to design an adaptive two-stage sequential sample. © 2008 The Society of Population Ecology and Springer.
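One way the adaptive part of the allocation might look in practice is sketched below. This is an illustrative rule, not the authors' design or estimator: after a first-stage count in each primary unit, the extra second-stage budget goes only to primary units whose count exceeds a threshold, in proportion to those counts. The unit names, budget, and threshold are invented.

```python
def allocate_second_stage(first_stage_counts, extra_budget, threshold):
    """Illustrative adaptive rule: primary units whose first-stage count
    exceeds the threshold share the extra second-stage budget in proportion
    to those counts; rounding may leave a unit or two of budget unspent."""
    hot = {u: c for u, c in first_stage_counts.items() if c > threshold}
    total = sum(hot.values())
    if total == 0:
        return {u: 0 for u in first_stage_counts}  # nothing exceeded the threshold
    return {u: round(extra_budget * hot.get(u, 0) / total)
            for u in first_stage_counts}

# Hypothetical first-stage counts for three primary units
print(allocate_second_stage({"A": 0, "B": 6, "C": 14}, extra_budget=20, threshold=2))
# {'A': 0, 'B': 6, 'C': 14}
```

For a rare, clustered population, unit A (no detections) gets no further effort, while the apparent cluster in unit C attracts most of the remaining budget — the flexibility the abstract describes.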
The multiple imputation method: a case study involving secondary data analysis.
Walani, Salimah R; Cleland, Charles M
2015-05-01
To illustrate the use of the multiple imputation method for replacing missing data, using a secondary data analysis study as an example. Most large public datasets have missing data, which need to be handled by researchers conducting secondary data analysis studies. Multiple imputation is a technique widely used to replace missing values while preserving the sample size and sampling variability of the data. The 2004 National Sample Survey of Registered Nurses. The authors created a model to impute missing values using the chained equation method. They used imputation diagnostics procedures and conducted regression analysis of imputed data to determine the differences between the log hourly wages of internationally educated and US-educated registered nurses. The authors used multiple imputation procedures to replace missing values in a large dataset with 29,059 observations. Five multiple imputed datasets were created. Imputation diagnostics using time series and density plots showed that imputation was successful. The authors also present an example of the use of multiple imputed datasets to conduct regression analysis to answer a substantive research question. Multiple imputation is a powerful technique for imputing missing values in large datasets while preserving the sample size and variance of the data. Even though the chained equation method involves complex statistical computations, recent innovations in software and computation have made it possible for researchers to conduct this technique on large datasets. The authors recommend nurse researchers use multiple imputation methods for handling missing data to improve the statistical power and external validity of their studies.
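In the simplest single-variable case, the chained-equation idea reduces to stochastic regression imputation repeated m times, with the estimates then pooled across the imputed datasets. The sketch below is a minimal illustration of that idea, not the survey analysis itself; the toy data, seed, and helper names are invented.

```python
import random
import statistics

random.seed(7)

def impute_once(pairs):
    """One stochastic regression imputation of missing y values given x."""
    complete = [(x, y) for x, y in pairs if y is not None]
    xs = [x for x, _ in complete]
    mx = statistics.fmean(xs)
    my = statistics.fmean(y for _, y in complete)
    beta = (sum((x - mx) * (y - my) for x, y in complete)
            / sum((x - mx) ** 2 for x in xs))
    alpha = my - beta * mx
    resid_sd = statistics.stdev(y - (alpha + beta * x) for x, y in complete)
    # Fill each gap with the regression prediction plus random residual noise,
    # so the imputed values preserve the data's variability
    return [(x, y if y is not None else alpha + beta * x + random.gauss(0, resid_sd))
            for x, y in pairs]

def pooled_mean_y(pairs, m=5):
    """Rubin-style point estimate: average the mean of y over m imputed datasets."""
    return statistics.fmean(
        statistics.fmean(y for _, y in impute_once(pairs)) for _ in range(m))

# Invented toy data: y is missing for two of the six observations
data = [(1, 2.1), (2, 3.9), (3, 6.2), (4, None), (5, 9.8), (6, None)]
print(pooled_mean_y(data))  # close to 7 for this strongly linear toy data
```

Because each imputation draws fresh residual noise, the m datasets differ slightly, and the spread of estimates across them is what multiple imputation uses to propagate the uncertainty due to the missing values.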
Three Conceptual Replication Studies in Group Theory
ERIC Educational Resources Information Center
Melhuish, Kathleen
2018-01-01
Many studies in mathematics education research occur with a nonrepresentative sample and are never replicated. To challenge this paradigm, I designed a large-scale study evaluating student conceptions in group theory that surveyed a national, representative sample of students. By replicating questions previously used to build theory around student…
Legal & ethical compliance when sharing biospecimen.
Klingstrom, Tomas; Bongcam-Rudloff, Erik; Reichel, Jane
2018-01-01
When obtaining samples from biobanks, resolving ethical and legal concerns is a time-consuming task where researchers need to balance the needs of privacy, trust and scientific progress. The Biobanking and Biomolecular Resources Research Infrastructure-Large Prospective Cohorts project has resolved numerous such issues through intense communication between involved researchers and experts in its mission to unite large prospective study sets in Europe. To facilitate efficient communication, it is useful for nonexperts to have an at least basic understanding of the regulatory system for managing biological samples. Laws regulating research oversight are based on national law and normally share core principles founded on international charters. In interview studies among donors, chief concerns are privacy, efficient sample utilization and access to information generated from their samples. Despite a lack of clear evidence regarding which concern takes precedence, scientific as well as public discourse has largely focused on privacy concerns and the right of donors to control the usage of their samples. It is therefore important to proactively deal with ethical and legal issues to avoid complications that delay or prevent samples from being accessed. To help biobank professionals avoid making unnecessary mistakes, we have developed this basic primer covering the relationship between ethics and law, the concept of informed consent and considerations for returning findings to donors. © The Author 2017. Published by Oxford University Press.
Bhaskar, Anand; Wang, Y X Rachel; Song, Yun S
2015-02-01
With the recent increase in study sample sizes in human genetics, there has been growing interest in inferring historical population demography from genomic variation data. Here, we present an efficient inference method that can scale up to very large samples, with tens or hundreds of thousands of individuals. Specifically, by utilizing analytic results on the expected frequency spectrum under the coalescent and by leveraging the technique of automatic differentiation, which allows us to compute gradients exactly, we develop a very efficient algorithm to infer piecewise-exponential models of the historical effective population size from the distribution of sample allele frequencies. Our method is orders of magnitude faster than previous demographic inference methods based on the frequency spectrum. In addition to inferring demography, our method can also accurately estimate locus-specific mutation rates. We perform extensive validation of our method on simulated data and show that it can accurately infer multiple recent epochs of rapid exponential growth, a signal that is difficult to pick up with small sample sizes. Lastly, we use our method to analyze data from recent sequencing studies, including a large-sample exome-sequencing data set of tens of thousands of individuals assayed at a few hundred genic regions. © 2015 Bhaskar et al.; Published by Cold Spring Harbor Laboratory Press.
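The exact-gradient idea behind automatic differentiation can be illustrated with forward-mode dual numbers; this toy class sketches the principle only and is not the authors' implementation:

```python
class Dual:
    """Number of the form val + eps * der, with eps**2 == 0; arithmetic on
    Dual values carries exact derivatives alongside function values."""

    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def _lift(self, other):
        return other if isinstance(other, Dual) else Dual(other)

    def __add__(self, other):
        o = self._lift(other)
        return Dual(self.val + o.val, self.der + o.der)

    __radd__ = __add__

    def __mul__(self, other):
        o = self._lift(other)
        # product rule, applied exactly rather than by finite differences
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)

    __rmul__ = __mul__


def grad(f, x):
    """Exact derivative of f at x; no truncation or round-off from stepping."""
    return f(Dual(x, 1.0)).der
```

For example, `grad(lambda x: x * x + 3 * x, 2.0)` returns exactly 7.0, which is why gradient-based likelihood optimization of the kind described above can avoid the error and cost of numerical differentiation.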
NASA Astrophysics Data System (ADS)
Wu, Qiujie; Tan, Liu; Xu, Sen; Liu, Dabin; Min, Li
2018-04-01
Numerous accidents involving emulsion explosives (EE) have been attributed to uncontrolled thermal decomposition of ammonium nitrate emulsion (ANE, the intermediate of EE) and of EE at large scale. To study the thermal decomposition characteristics of ANE and EE at different scales, the present study applied a large-scale modified vented pipe test (MVPT) and two laboratory-scale tests, differential scanning calorimetry (DSC) and accelerating rate calorimetry (ARC). Both scale and water content play an important role in the thermal stability of ANE and EE. The decomposition temperatures of ANE and EE measured in the MVPT are 146°C and 144°C, respectively, much lower than those in DSC and ARC. As the size of the same sample in DSC, ARC, and MVPT successively increases, the onset temperatures decrease. In the same test, the measured onset temperature of ANE is higher than that of EE, and the water content of the sample has a stabilizing effect. The large-scale MVPT can provide information relevant to real-life operations: large-scale operations carry greater risk, and continuous overheating should be avoided.
ASSESSMENT OF LARGE RIVER MACROINVERTEBRATES: HOW FAR IS ENOUGH?
During the summer of 2001, twelve sites were sampled for macroinvertebrates, six each on the Great Miami and Kentucky Rivers. Sites were chosen to reflect a disturbance gradient in each river using sites sampled in a 1999 methods comparison study. Our sampling protocol improves...
Dainer-Best, Justin; Lee, Hae Yeon; Shumake, Jason D; Yeager, David S; Beevers, Christopher G
2018-06-07
Although the self-referent encoding task (SRET) is commonly used to measure self-referent cognition in depression, many different SRET metrics can be obtained. The current study used best subsets regression with cross-validation and independent test samples to identify the SRET metrics most reliably associated with depression symptoms in three large samples: a college student sample (n = 572), a sample of adults from Amazon Mechanical Turk (n = 293), and an adolescent sample from a school field study (n = 408). Across all 3 samples, SRET metrics associated most strongly with depression severity included number of words endorsed as self-descriptive and rate of accumulation of information required to decide whether adjectives were self-descriptive (i.e., drift rate). These metrics had strong intratask and split-half reliability and high test-retest reliability across a 1-week period. Recall of SRET stimuli and traditional reaction time (RT) metrics were not robustly associated with depression severity. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
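Best subsets regression with cross-validation can be sketched as follows; the five synthetic predictors stand in for SRET metrics (with columns 0 and 2 playing the role of the signal-carrying metrics such as endorsements and drift rate), and none of the numbers come from the study:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(7)
n, p = 300, 5
X = rng.normal(size=(n, p))
# only predictors 0 and 2 carry signal in this simulation
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(scale=0.5, size=n)

def cv_mse(cols, k=5):
    """k-fold cross-validated mean squared error of OLS on the given columns."""
    idx = np.arange(n)
    errors = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        A = np.column_stack([np.ones(train.size), X[np.ix_(train, list(cols))]])
        beta, *_ = np.linalg.lstsq(A, y[train], rcond=None)
        B = np.column_stack([np.ones(fold.size), X[np.ix_(fold, list(cols))]])
        errors.append(np.mean((y[fold] - B @ beta) ** 2))
    return float(np.mean(errors))

# exhaustive search over all non-empty subsets, scored out-of-sample
subsets = [s for k in range(1, p + 1) for s in combinations(range(p), k)]
best = min(subsets, key=cv_mse)
```

Scoring each candidate subset out-of-sample, rather than by in-sample fit, is what lets this procedure identify metrics that associate with the outcome reliably rather than by overfitting.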
ERIC Educational Resources Information Center
Anderson, J. M.
1978-01-01
A method is described for preparing large gelatine-embedded soil sections for ecological studies. Sampling methods reduce structural disturbance of the samples to a minimum and include freezing the samples in the field to kill soil invertebrates in their natural microhabitats. Projects are suggested for upper secondary school students. (Author/BB)
Software engineering the mixed model for genome-wide association studies on large samples.
Zhang, Zhiwu; Buckler, Edward S; Casstevens, Terry M; Bradbury, Peter J
2009-11-01
Mixed models improve the ability to detect phenotype-genotype associations in the presence of population stratification and multiple levels of relatedness in genome-wide association studies (GWAS), but for large data sets the resource consumption becomes impractical. At the same time, the sample size and number of markers used for GWAS is increasing dramatically, resulting in greater statistical power to detect those associations. The use of mixed models with increasingly large data sets depends on the availability of software for analyzing those models. While multiple software packages implement the mixed model method, no single package provides the best combination of fast computation, ability to handle large samples, flexible modeling and ease of use. Key elements of association analysis with mixed models are reviewed, including modeling phenotype-genotype associations using mixed models, population stratification, kinship and its estimation, variance component estimation, use of best linear unbiased predictors or residuals in place of raw phenotype, improving efficiency and software-user interaction. The available software packages are evaluated, and suggestions made for future software development.
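A minimal numerical sketch of the mixed-model idea is generalized least squares with a known kinship matrix and known variance components; real GWAS software must estimate both, and all numbers below are simulated:

```python
import numpy as np

rng = np.random.default_rng(1)
n_families, fam_size = 100, 3
n = n_families * fam_size

# block-diagonal kinship: relatedness 0.5 within families, 0 between
block = 0.5 * np.ones((fam_size, fam_size)) + 0.5 * np.eye(fam_size)
K = np.kron(np.eye(n_families), block)
sigma_g, sigma_e = 1.0, 1.0
V = sigma_g * K + sigma_e * np.eye(n)  # phenotypic covariance

# design: intercept plus a SNP coded as 0/1/2 dosage
X = np.column_stack([np.ones(n), rng.integers(0, 3, n).astype(float)])
beta_true = np.array([1.0, 0.3])
y = X @ beta_true + np.linalg.cholesky(V) @ rng.normal(size=n)

# GLS accounts for the relatedness structure in V when testing the SNP
Vinv = np.linalg.inv(V)
beta_hat = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)
```

Ignoring V (ordinary least squares) would leave the family structure in the residuals, which is exactly the stratification/relatedness confounding the abstract describes; much of the software engineering discussed above concerns avoiding the explicit `inv(V)` for large n.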
Medvedovici, Andrei; Udrescu, Stefan; Albu, Florin; Tache, Florentin; David, Victor
2011-09-01
Liquid-liquid extraction of target compounds from biological matrices followed by the injection of a large volume from the organic layer into the chromatographic column operated under reversed-phase (RP) conditions would successfully combine the selectivity and the straightforward character of the procedure in order to enhance sensitivity, compared with the usual approach of involving solvent evaporation and residue re-dissolution. Large-volume injection of samples in diluents that are not miscible with the mobile phase was recently introduced in chromatographic practice. The risk of random errors produced during the manipulation of samples is also substantially reduced. A bioanalytical method designed for the bioequivalence of fenspiride containing pharmaceutical formulations was based on a sample preparation procedure involving extraction of the target analyte and the internal standard (trimetazidine) from alkalinized plasma samples in 1-octanol. A volume of 75 µl from the octanol layer was directly injected on a Zorbax SB C18 Rapid Resolution, 50 mm length × 4.6 mm internal diameter × 1.8 µm particle size column, with the RP separation being carried out under gradient elution conditions. Detection was made through positive ESI and MS/MS. Aspects related to method development and validation are discussed. The bioanalytical method was successfully applied to assess bioequivalence of a modified release pharmaceutical formulation containing 80 mg fenspiride hydrochloride during two different studies carried out as single-dose administration under fasting and fed conditions (four arms), and multiple doses administration, respectively. The quality attributes assigned to the bioanalytical method, as resulting from its application to the bioequivalence studies, are highlighted and fully demonstrate that sample preparation based on large-volume injection of immiscible diluents has an increased potential for application in bioanalysis.
Survey of Large Methane Emitters in North America
NASA Astrophysics Data System (ADS)
Deiker, S.
2017-12-01
It has been theorized that methane emissions in the oil and gas industry follow log-normal or "fat-tail" distributions, with large numbers of small sources for every very large source. Such distributions would have significant policy and operational implications. Unfortunately, by their very nature such distributions require large sample sizes to verify. Until recently, such large-scale studies would have been prohibitively expensive. The largest public study to date sampled 450 wells, an order of magnitude too few to effectively constrain these models. During 2016 and 2017, Kairos Aerospace conducted a series of surveys using the LeakSurveyor imaging spectrometer, mounted on light aircraft. This small, lightweight instrument was designed to rapidly locate large emission sources. The resulting survey covers over three million acres of oil and gas production, including over 100,000 wells, thousands of storage tanks, and over 7,500 miles of gathering lines. This data set allows us to probe the distribution of large methane emitters. Results of this survey, and implications for the methane emission distribution, methane policy, and LDAR, will be discussed.

The cost of large numbers of hypothesis tests on power, effect size and sample size.
Lazzeroni, L C; Ray, A
2012-01-01
Advances in high-throughput biology and computer science are driving an exponential increase in the number of hypothesis tests in genomics and other scientific disciplines. Studies using current genotyping platforms frequently include a million or more tests. In addition to the monetary cost, this increase imposes a statistical cost owing to the multiple testing corrections needed to avoid large numbers of false-positive results. To safeguard against the resulting loss of power, some have suggested sample sizes on the order of tens of thousands that can be impractical for many diseases or may lower the quality of phenotypic measurements. This study examines the relationship between the number of tests on the one hand and power, detectable effect size or required sample size on the other. We show that once the number of tests is large, power can be maintained at a constant level, with comparatively small increases in the effect size or sample size. For example, at the 0.05 significance level, a 13% increase in sample size is needed to maintain 80% power for ten million tests compared with one million tests, whereas a 70% increase in sample size is needed for 10 tests compared with a single test. Relative costs are less when measured by increases in the detectable effect size. We provide an interactive Excel calculator to compute power, effect size or sample size when comparing study designs or genome platforms involving different numbers of hypothesis tests. The results are reassuring in an era of extreme multiple testing.
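The 13% and 70% figures can be reproduced with the standard normal-approximation sample-size formula for a two-sided test under Bonferroni correction; this is a sketch of the calculation, and the paper's own calculator may differ in details:

```python
from scipy.stats import norm

def z_sum(alpha, power=0.80):
    # required n is proportional to (z_{alpha/2} + z_{power})**2
    # for a two-sided z-test at significance alpha and the given power
    return norm.isf(alpha / 2) + norm.ppf(power)

def inflation(m_tests, alpha=0.05, power=0.80):
    """Sample-size factor for m Bonferroni-corrected tests vs a single test."""
    return (z_sum(alpha / m_tests, power) / z_sum(alpha, power)) ** 2

ten_vs_one = inflation(10)  # roughly a 70% increase
ten_million_vs_one_million = (z_sum(0.05 / 1e7) / z_sum(0.05 / 1e6)) ** 2  # ~13%
```

The key intuition is that the critical value z grows only logarithmically in the number of tests, so the squared ratio changes slowly once m is already large.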
Hannett, George E.; Stone, Ward B.; Davis, Stephen W.; Wroblewski, Danielle
2011-01-01
The genetic relatedness of Clostridium botulinum type E isolates associated with an outbreak of wildlife botulism was studied using random amplification of polymorphic DNA (RAPD). Specimens were collected from November 2000 to December 2008 during a large outbreak of botulism affecting birds and fish living in and around Lake Erie and Lake Ontario. In our present study, a total of 355 wildlife samples were tested for the presence of botulinum toxin and/or organisms. Type E botulinum toxin was detected in 110 samples from birds, 12 samples from fish, and 2 samples from mammals. Sediment samples from Lake Erie were also examined for the presence of C. botulinum. Fifteen of 17 sediment samples were positive for the presence of C. botulinum type E. Eighty-one C. botulinum isolates were obtained from plants, animals, and sediments; of these isolates, 44 C. botulinum isolates produced type E toxin, as determined by mouse bioassay, while the remaining 37 isolates were not toxic for mice. All toxin-producing isolates were typed by RAPD; that analysis showed 12 different RAPD types and multiple subtypes. Our study thus demonstrates that multiple genetically distinct strains of C. botulinum were involved in the present outbreak of wildlife botulism. We found that C. botulinum type E is present in the sediments of Lake Erie and that a large range of bird and fish species is affected. PMID:21115703
A prototype splitter apparatus for dividing large catches of small fish
Stapanian, Martin A.; Edwards, William H.
2012-01-01
Due to financial and time constraints, it is often necessary in fisheries studies to divide large samples of fish and estimate total catch from the subsample. The subsampling procedure may involve potential human biases or may be difficult to perform in rough conditions. We present a prototype gravity-fed splitter apparatus for dividing large samples of small fish (30–100 mm TL). The apparatus features a tapered hopper with a sliding and removable shutter. The apparatus provides a comparatively stable platform for objectively obtaining subsamples, and it can be modified to accommodate different sizes of fish and different sample volumes. The apparatus is easy to build, inexpensive, and convenient to use in the field. To illustrate the performance of the apparatus, we divided three samples (total N = 2,000 fish) composed of four fish species. Our results indicated no significant bias in estimating either the number or proportion of each species from the subsample. Use of this apparatus or a similar apparatus can help to standardize subsampling procedures in large surveys of fish. The apparatus could be used for other applications that require dividing a large amount of material into one or more smaller subsamples.
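Estimating the total catch from one split is simple arithmetic; the split count, species names, and subsample numbers below are hypothetical, not the paper's data:

```python
# one of n_splits equal fractions from the splitter is counted in full
n_splits = 8
subsample = {"emerald shiner": 180, "rainbow smelt": 45, "round goby": 25}

counted = sum(subsample.values())       # fish counted in the one fraction
total_estimate = counted * n_splits     # estimated fish in the whole catch
proportions = {sp: c / counted for sp, c in subsample.items()}
# binomial standard error of each estimated species proportion
std_errors = {sp: (p * (1 - p) / counted) ** 0.5 for sp, p in proportions.items()}
```

A bias check like the one in the paper would compare these subsample proportions against a full count of the same catch, e.g. with a goodness-of-fit test.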
Low-energy transmission electron diffraction and imaging of large-area graphene
Zhao, Wei; Xia, Bingyu; Lin, Li; Xiao, Xiaoyang; Liu, Peng; Lin, Xiaoyang; Peng, Hailin; Zhu, Yuanmin; Yu, Rong; Lei, Peng; Wang, Jiangtao; Zhang, Lina; Xu, Yong; Zhao, Mingwen; Peng, Lianmao; Li, Qunqing; Duan, Wenhui; Liu, Zhongfan; Fan, Shoushan; Jiang, Kaili
2017-01-01
Two-dimensional (2D) materials have attracted interest because of their excellent properties and potential applications. A key step in realizing industrial applications is to synthesize wafer-scale single-crystal samples. Until now, single-crystal samples, such as graphene domains up to the centimeter scale, have been synthesized. However, a new challenge is to efficiently characterize large-area samples. Currently, the crystalline characterization of these samples still relies on selected-area electron diffraction (SAED) or low-energy electron diffraction (LEED), which is more suitable for characterizing very small local regions. This paper presents a highly efficient characterization technique that adopts a low-energy electrostatically focused electron gun and a super-aligned carbon nanotube (SACNT) film sample support. It allows rapid crystalline characterization of large-area graphene through a single photograph of a transmission-diffracted image at a large beam size. Additionally, the low-energy electron beam enables the observation of a unique diffraction pattern of adsorbates on the suspended graphene at room temperature. This work presents a simple and convenient method for characterizing the macroscopic structures of 2D materials, and the instrument we constructed allows the study of the weak interaction with 2D materials. PMID:28879233
DOE Office of Scientific and Technical Information (OSTI.GOV)
David, Sabrina N.; Zhai, Yao; van der Zande, Arend M.
Two-dimensional (2D) atomic materials such as graphene and transition metal dichalcogenides (TMDCs) have attracted significant research and industrial interest for their electronic, optical, mechanical, and thermal properties. While large-area crystal growth techniques such as chemical vapor deposition have been demonstrated, the presence of grain boundaries and orientation of grains arising in such growths substantially affect the physical properties of the materials. There is currently no scalable characterization method for determining these boundaries and orientations over a large sample area. We here present a second-harmonic generation based microscopy technique for rapidly mapping grain orientations and boundaries of 2D TMDCs. We experimentally demonstrate the capability to map large samples to an angular resolution of ±1° with minimal sample preparation and without involved analysis. A direct comparison of the all-optical grain orientation maps against results obtained by diffraction-filtered dark-field transmission electron microscopy plus selected-area electron diffraction on identical TMDC samples is provided. This rapid and accurate tool should enable large-area characterization of TMDC samples for expedited studies of grain boundary effects and the efficient characterization of industrial-scale production techniques.
An overview of the genetic dissection of complex traits.
Rao, D C
2008-01-01
Thanks to recent revolutionary genomic advances such as the International HapMap consortium, resolution of the genetic architecture of common complex traits is beginning to look achievable. While demonstrating the feasibility of genome-wide association (GWA) studies, the pathbreaking Wellcome Trust Case Control Consortium (WTCCC) study also serves to underscore the critical importance of very large sample sizes and draws attention to potential problems, which need to be addressed as part of the study design. Even the large WTCCC study had vastly inadequate power for several of the associations reported (and confirmed) and, therefore, most of the regions harboring relevant associations may not be identified anytime soon. This chapter provides an overview of some of the key developments in the methodological approaches to genetic dissection of common complex traits. Constrained Bayesian networks are suggested as especially useful for analysis of pathway-based SNPs. Likewise, composite likelihood is suggested as a promising method for modeling complex systems. The chapter discusses the key steps in a study design, with an emphasis on GWA studies. Potential limitations highlighted by the WTCCC GWA study are discussed, including problems associated with massive genotype imputation, analysis of pooled national samples, shared controls, and the critical role of interactions. GWA studies clearly need massive sample sizes that are only possible through genuine collaborations. After all, for common complex traits, the question is not whether we can find some pieces of the puzzle, but how large and what kind of a sample we need to (nearly) solve the genetic puzzle.
State Estimates of Disability in America. Disability Statistics Report 3.
ERIC Educational Resources Information Center
LaPlante, Mitchell P.
This study presents and discusses existing data on disability by state, from the 1980 and 1990 censuses, the Current Population Survey (CPS), and the National Health Interview Survey (NHIS). The study used direct methods for states with large sample sizes and synthetic estimates for states with low sample sizes. The study's highlighted findings…
Gebauer, Roman; Řepka, Radomír; Šmudla, Radek; Mamoňová, Miroslava; Ďurkovič, Jaroslav
2016-01-01
Although spine variation within cacti species or populations is assumed to be large, the minimum sample size of different spine anatomical and morphological traits required for species description is less studied. There are studies where only two spines were used for taxonomical comparison among species. Therefore, the spine structure variation within areoles and individuals of one population of Gymnocalycium kieslingii subsp. castaneum (Ferrari) Slaba was analyzed. Fifteen plants were selected, and from each plant one areole from the basal, middle and upper part of the plant body was sampled. Scanning electron microscopy was used for spine surface description and light microscopy for measurements of spine width, thickness, cross-section area, fiber diameter and fiber cell wall thickness. The spine surface was more visible and less damaged in the upper part of the plant body than in the basal part. Large spine and fiber differences were found between the upper and lower parts of the plant body, but also within single areoles. In general, the examined traits had values 8-17% higher in the upper part than in the lower parts. The variation of spine and fiber traits within areoles was lower than the differences between individuals. The minimum sample size was largely influenced by the studied spine and fiber traits, ranging from 1 to 70 spines. The results provide pioneering information useful for spine sample collection in the field for taxonomical, biomechanical and structural studies. Nevertheless, similar studies should be carried out for other cacti species before generalizations can be made. The large spine and fiber variation within areoles observed in our study indicates a very complex spine morphogenesis.
A follow-up study of hygiene in catering premises at large-scale events in the United Kingdom.
Willis, C; Elviss, N; McLauchlin, J
2015-01-01
To investigate food hygiene practices at large events by assessing the microbiological quality of ready-to-eat food, drinking water, food preparation surfaces, cleaning cloths and wristbands worn by food handlers for event security purposes. Over a 7-month period, 1662 samples were collected at 153 events and examined for microbiological contamination. Eight per cent of food samples were of unsatisfactory quality. A further one per cent contained potentially hazardous levels of human pathogenic bacteria. Twenty-seven per cent of water samples, 32% of swabs and 56% of cloths were also unsatisfactory. These results represented an improvement in hygiene compared with a previous study carried out 12 months earlier. A fifth of food handler wristbands were contaminated with Enterobacteriaceae, Escherichia coli and/or coagulase-positive staphylococci, with those bands made from fabric being more frequently contaminated than those made from plastic or other materials. This study provides evidence that the food hygiene at large-scale events may have improved. However, there is still a need for continued efforts to maintain an ongoing improvement in cleaning regimes and food hygiene management. This study was part of an ongoing focus on large events in the lead-up to the London 2012 Olympics. Lessons learnt here will be important in the planning of future large events. © 2014 Crown copyright. © 2014 Society for Applied Microbiology. This article is published with the permission of the Controller of HMSO and Queen's Printer for Scotland.
This project consisted of a laboratory study to evaluate an extraction and analysis method for quantifying biomarkers of pesticide exposure and creatinine in urine samples collected with commercially-available disposable diapers. For large exposure studies, such as the National ...
Methods to increase reproducibility in differential gene expression via meta-analysis
Sweeney, Timothy E.; Haynes, Winston A.; Vallania, Francesco; Ioannidis, John P.; Khatri, Purvesh
2017-01-01
Findings from clinical and biological studies are often not reproducible when tested in independent cohorts. Due to the testing of a large number of hypotheses and relatively small sample sizes, results from whole-genome expression studies in particular are often not reproducible. Compared to single-study analysis, gene expression meta-analysis can improve reproducibility by integrating data from multiple studies. However, there are multiple choices in designing and carrying out a meta-analysis. Yet, clear guidelines on best practices are scarce. Here, we hypothesized that studying subsets of very large meta-analyses would allow for systematic identification of best practices to improve reproducibility. We therefore constructed three very large gene expression meta-analyses from clinical samples, and then examined meta-analyses of subsets of the datasets (all combinations of datasets with up to N/2 samples and K/2 datasets) compared to a ‘silver standard’ of differentially expressed genes found in the entire cohort. We tested three random-effects meta-analysis models using this procedure. We showed relatively greater reproducibility with more-stringent effect size thresholds combined with relaxed significance thresholds; relatively lower reproducibility when imposing extraneous constraints on residual heterogeneity; and that Benjamini–Hochberg correction underestimated the actual false-positive rate. In addition, multivariate regression showed that the accuracy of a meta-analysis increased significantly with more included datasets even when controlling for sample size. PMID:27634930
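A random-effects pooled estimate of the kind compared in the study can be sketched with the DerSimonian-Laird estimator; this is one of several possible random-effects models, and the effect sizes in the example are made up:

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled effect via the DerSimonian-Laird tau^2 estimator."""
    y = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v                                  # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)           # Cochran's Q heterogeneity
    df = len(y) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                # between-study variance
    w_star = 1.0 / (v + tau2)                    # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2
```

For example, `dersimonian_laird([0.5, 0.8, 0.2, 0.6], [0.04, 0.09, 0.05, 0.02])` returns a pooled effect between the smallest and largest study estimates, with a standard error that inflates as between-study heterogeneity (tau^2) grows.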
Recovery of diverse microbes in high turbidity surface water samples using dead-end ultrafiltration
Mull, Bonnie; Hill, Vincent R.
2015-01-01
Dead-end ultrafiltration (DEUF) has been reported to be a simple, field-deployable technique for recovering bacteria, viruses, and parasites from large-volume water samples for water quality testing and waterborne disease investigations. While DEUF has been reported for application to water samples having relatively low turbidity, little information is available regarding recovery efficiencies for this technique when applied to sampling turbid water samples such as those commonly found in lakes and rivers. This study evaluated the effectiveness of a DEUF technique for recovering MS2 bacteriophage, enterococci, Escherichia coli, Clostridium perfringens, and Cryptosporidium parvum oocysts in surface water samples having elevated turbidity. Average recovery efficiencies for each study microbe across all turbidity ranges were: MS2 (66%), C. parvum (49%), enterococci (85%), E. coli (81%), and C. perfringens (63%). The recovery efficiencies for MS2 and C. perfringens exhibited an inversely proportional relationship with turbidity; however, no significant differences in recovery were observed for C. parvum, enterococci, or E. coli. Although ultrafilter clogging was observed, the DEUF method was able to process 100-L surface water samples at each turbidity level within 60 min. This study supports the use of the DEUF method for recovering a wide array of microbes in large-volume surface water samples having medium to high turbidity. PMID:23064261
Error in the Sampling Area of an Optical Disdrometer: Consequences in Computing Rain Variables
Fraile, R.; Castro, A.; Fernández-Raga, M.; Palencia, C.; Calvo, A. I.
2013-01-01
The aim of this study is to improve the estimation of the characteristic uncertainties of optical disdrometers, in an attempt to calculate the effective sampling area according to the size of the drop and to study how this influences the computation of other parameters, taking into account that the real sampling area is always smaller than the nominal area. For large raindrops (a little over 6 mm), the effective sampling area may be half the area indicated by the manufacturer. The error in the sampling area propagates to all the variables that depend on this surface, such as the rain intensity and the reflectivity factor. Both variables tend to underestimate the real value if the sampling area is not corrected. For example, the rainfall intensity errors may be up to 50% for large drops, those slightly larger than 6 mm. The same occurs with reflectivity values, which may be up to twice the reflectivity calculated using the uncorrected constant sampling area. The Z-R relationships appear to have little dependence on the sampling area, because both variables depend on it in the same way. These results were obtained by studying one particular rain event that occurred on April 16, 2006. PMID:23844393
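The error propagation described above can be illustrated with a toy calculation: rain rate scales inversely with the sampling area, so computing it with the nominal area when the effective area is smaller underestimates the rate. The geometry below (a drop is fully sensed only if it fits inside the beam, so the usable width shrinks with drop diameter) is hypothetical, chosen so that a 6 mm drop sees half the nominal area, matching the factor-of-two errors reported:

```python
import math

def rain_rate_mm_per_h(diameters_mm, sampling_area_m2, interval_s):
    """Rain rate from the drop diameters measured in one interval:
    R = (pi/6) * sum(D^3) / (A * dt), converted to mm/h."""
    volume_mm3 = (math.pi / 6.0) * sum(d ** 3 for d in diameters_mm)
    depth_mm = volume_mm3 / (sampling_area_m2 * 1e6)  # 1 m^2 = 1e6 mm^2
    return depth_mm * 3600.0 / interval_s

def effective_area(nominal_area_m2, drop_diameter_mm, beam_width_mm=12.0):
    """Hypothetical size-dependent correction: the drop center must lie at
    least D/2 from the beam edge, so the usable width shrinks linearly with
    drop diameter (illustrative geometry, not the instrument's calibration)."""
    return nominal_area_m2 * max(0.0, 1.0 - drop_diameter_mm / beam_width_mm)
```

With this toy geometry, a 6 mm drop sees half the nominal area, so a rate computed with the uncorrected area is too low by a factor of two.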
Phenotypic Association Analyses With Copy Number Variation in Recurrent Depressive Disorder.
Rucker, James J H; Tansey, Katherine E; Rivera, Margarita; Pinto, Dalila; Cohen-Woods, Sarah; Uher, Rudolf; Aitchison, Katherine J; Craddock, Nick; Owen, Michael J; Jones, Lisa; Jones, Ian; Korszun, Ania; Barnes, Michael R; Preisig, Martin; Mors, Ole; Maier, Wolfgang; Rice, John; Rietschel, Marcella; Holsboer, Florian; Farmer, Anne E; Craig, Ian W; Scherer, Stephen W; McGuffin, Peter; Breen, Gerome
2016-02-15
Defining the molecular genomic basis of the likelihood of developing depressive disorder is a considerable challenge. We previously associated rare, exonic deletion copy number variants (CNV) with recurrent depressive disorder (RDD). Sex chromosome abnormalities also have been observed to co-occur with RDD. In this reanalysis of our RDD dataset (N = 3106 cases; 459 screened control samples and 2699 population control samples), we further investigated the role of larger CNVs and chromosomal abnormalities in RDD and performed association analyses with clinical data derived from this dataset. We found an enrichment of Turner's syndrome among cases of depression compared with the frequency observed in a large population sample (N = 34,910) of live-born infants collected in Denmark (two-sided p = .023, odds ratio = 7.76 [95% confidence interval = 1.79-33.6]), a case of diploid/triploid mosaicism, and several cases of uniparental isodisomy. In contrast to our previous analysis, large deletion CNVs were no more frequent in cases than control samples, although deletion CNVs in cases contained more genes than control samples (two-sided p = .0002). After statistical correction for multiple comparisons, our data do not support a substantial role for CNVs in RDD, although (as has been observed in similar samples) occasional cases may harbor large variants with etiological significance. Genetic pleiotropy and sample heterogeneity suggest that very large sample sizes are required to study conclusively the role of genetic variation in mood disorders. Copyright © 2016 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Steel, Jennifer L.; Herlitz, Claes A.
2005-01-01
Objective: Several studies with small and ''high risk'' samples have demonstrated that a history of childhood or adolescent sexual abuse (CASA) is associated with sexual risk behaviors (SRBs). However, few studies with large random samples from the general population have specifically examined the relationship between CASA and SRBs with a…
Optimizing liquid effluent monitoring at a large nuclear complex.
Chou, Charissa J; Barnett, D Brent; Johnson, Vernon G; Olson, Phil M
2003-12-01
Effluent monitoring typically requires a large number of analytes and samples during the initial or startup phase of a facility. Once a baseline is established, the analyte list and sampling frequency may be reduced. Although there is a large body of literature relevant to the initial design, few, if any, published papers exist on updating established effluent monitoring programs. This paper statistically evaluates four years of baseline data to optimize the liquid effluent monitoring efficiency of a centralized waste treatment and disposal facility at a large defense nuclear complex. Specific objectives were to: (1) assess temporal variability in analyte concentrations, (2) determine operational factors contributing to waste stream variability, (3) assess the probability of exceeding permit limits, and (4) streamline the sampling and analysis regime. Results indicated that the probability of exceeding permit limits was one in a million under normal facility operating conditions, sampling frequency could be reduced, and several analytes could be eliminated. Furthermore, indicators such as gross alpha and gross beta measurements could be used in lieu of more expensive specific isotopic analyses (radium, cesium-137, and strontium-90) for routine monitoring. Study results were used by the state regulatory agency to modify monitoring requirements for a new discharge permit, resulting in an annual cost savings of US $223,000. This case study demonstrates that statistical evaluation of effluent contaminant variability coupled with process knowledge can help plant managers and regulators streamline analyte lists and sampling frequencies based on detection history and environmental risk.
Jin, Sheng Chih; Benitez, Bruno A; Deming, Yuetiva; Cruchaga, Carlos
2016-01-01
Analyses of genome-wide association studies (GWAS) for complex disorders usually identify common variants with a relatively small effect size that only explain a small proportion of phenotypic heritability. Several studies have suggested that a significant fraction of heritability may be explained by low-frequency (minor allele frequency (MAF) of 1-5 %) and rare variants that are not contained in the commercial GWAS genotyping arrays (Schork et al., Curr Opin Genet Dev 19:212, 2009). Rare variants can also have relatively large effects on risk for developing human diseases or disease phenotype (Cruchaga et al., PLoS One 7:e31039, 2012). However, it is necessary to perform next-generation sequencing (NGS) studies in a large population (>4,000 samples) to detect a significant rare-variant association. Several NGS methods, such as custom capture sequencing and amplicon-based sequencing, are designed to screen a small proportion of the genome, but most of these methods are limited in the number of samples that can be multiplexed (i.e., most sequencing kits only provide 96 distinct indexes). Additionally, the sequencing library preparation for 4,000 samples remains expensive, and thus conducting NGS studies with the aforementioned methods is not feasible for most research laboratories. The need for low-cost, large-scale rare-variant detection makes pooled-DNA sequencing an ideally efficient and cost-effective technique to identify rare variants in target regions by sequencing hundreds to thousands of samples. Our recent work has demonstrated that pooled-DNA sequencing can accurately detect rare variants in targeted regions in multiple DNA samples with high sensitivity and specificity (Jin et al., Alzheimers Res Ther 4:34, 2012).
In these studies we used a well-established pooled-DNA sequencing approach and a computational package, SPLINTER (short indel prediction by large deviation inference and nonlinear true frequency estimation by recursion) (Vallania et al., Genome Res 20:1711, 2010), for accurate identification of rare variants in large DNA pools. Given an average sequencing coverage of 30× per haploid genome, SPLINTER can detect rare variants and short indels up to 4 base pairs (bp) with high sensitivity and specificity (up to 1 haploid allele in a pool as large as 500 individuals). Step-by-step instructions on how to conduct pooled-DNA sequencing experiments and data analyses are described in this chapter.
An atomic-absorption method for the determination of gold in large samples of geologic materials
VanSickle, Gordon H.; Lakin, Hubert William
1968-01-01
A laboratory method for the determination of gold in large (100-gram) samples has been developed for use in the study of the gold content of placer deposits and of trace amounts of gold in other geologic materials. In this method the sample is digested with bromine and ethyl ether, the gold is extracted into methyl isobutyl ketone, and the determination is made by atomic-absorption spectrophotometry. The lower limit of detection is 0.005 part per million in the sample. The few data obtained so far by this method agree favorably with those obtained by assay and by other atomic-absorption methods. About 25 determinations can be made per man-day.
NASA Astrophysics Data System (ADS)
Straus, D. M.
2006-12-01
The transitions between portions of the state space of the large-scale flow are studied from daily wintertime data over the Pacific North America region using the NCEP reanalysis data set (54 winters) and very large suites of hindcasts made with the COLA atmospheric GCM with observed SST (55 members for each of 18 winters). The partition of the large-scale state space is guided by cluster analysis, whose statistical significance and relationship to SST is reviewed (Straus and Molteni, 2004; Straus, Corti and Molteni, 2006). The global nature of the flow through state space is studied using Markov chains (Crommelin, 2004). In particular, the non-diffusive part of the flow is contrasted in nature (small data sample) and the AGCM (large data sample). The intrinsic error growth associated with different portions of the state space is studied through sets of identical-twin AGCM simulations. The goal is to obtain realistic estimates of predictability times for large-scale transitions that should be useful in long-range forecasting.
External beam pixe programs at the University of California, Davis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eldred, R.A.
A PIXE system in which large or delicate samples are excited by a low-current external proton beam is described. This system has been used to analyze historical printed books and manuscripts, as well as a large variety of archeological artifacts. The steps used to protect the sample from unnecessary beam current are examined. A recent thorough study of the first volume of the Gutenberg 42-line Bible is described in some detail.
When the test of mediation is more powerful than the test of the total effect.
O'Rourke, Holly P; MacKinnon, David P
2015-06-01
Although previous research has studied power in mediation models, the extent to which the inclusion of a mediator will increase power has not been investigated. To address this deficit, in a first study we compared the analytical power values of the mediated effect and the total effect in a single-mediator model, to identify the situations in which the inclusion of one mediator increased statistical power. The results from this first study indicated that including a mediator increased statistical power in small samples with large coefficients and in large samples with small coefficients, and when coefficients were nonzero and equal across models. Next, we identified conditions under which power was greater for the test of the total mediated effect than for the test of the total effect in the parallel two-mediator model. These results indicated that including two mediators increased power in small samples with large coefficients and in large samples with small coefficients, the same pattern of results that had been found in the first study. Finally, we assessed the analytical power for a sequential (three-path) two-mediator model and compared the power to detect the three-path mediated effect to the power to detect both the test of the total effect and the test of the mediated effect for the single-mediator model. The results indicated that the three-path mediated effect had more power than the mediated effect from the single-mediator model and the test of the total effect. Practical implications of these results for researchers are then discussed.
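The power comparison described above can be sketched with a normal-approximation calculation (an illustrative simplification assuming standardized variables and a common standard error of 1/sqrt(n) per path; it is not the authors' exact analytical power derivation):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def power_z(effect, se):
    """Two-sided power of a z-test of H0: coefficient = 0, at alpha = .05."""
    z_crit = 1.959963984540054   # Phi^{-1}(1 - .05/2)
    z = effect / se
    return (1.0 - norm_cdf(z_crit - z)) + norm_cdf(-z_crit - z)

def mediation_vs_total(a, b, c_prime, n):
    """Compare the joint-significance power for the mediated effect a*b with
    the power for the total effect c = a*b + c' in a single-mediator model,
    using a crude common SE of 1/sqrt(n) for every path."""
    se = 1.0 / math.sqrt(n)
    p_mediated = power_z(a, se) * power_z(b, se)   # joint significance test
    p_total = power_z(a * b + c_prime, se)
    return p_mediated, p_total
```

With n = 50 and a = b = 0.5 (c' = 0), the joint-significance test of a*b is markedly more powerful than the test of the total effect, illustrating the small-sample, large-coefficient pattern the study reports.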
Winkelman, D.L.; Van Den Avyle, M.J.
2002-01-01
The objective of this study was to determine dietary overlap between blueback herring and threadfin shad in J. Strom Thurmond Reservoir, South Carolina/Georgia. We also evaluated prey selectivity for each species and diet differences between two size categories of blueback herring. Diet and zooplankton samples were collected every other month from April 1992 to February 1994. We examined stomachs containing prey from 170 large blueback herring (>140 mm), 96 small blueback herring (<140 mm), and 109 threadfin shad, and we also examined 45 zooplankton samples. Large blueback herring diets differed significantly from threadfin shad diets on 11 of 12 sampling dates, and small blueback herring diets differed from threadfin shad diets on all sampling dates. In general, blueback herring consumed proportionally more copepods and fewer Bosmina sp. and rotifers than threadfin shad. Large and small blueback herring diets were significantly different on five of eight sampling dates, primarily due to the tendency of small blueback herring to eat proportionally more Bosmina sp. than large blueback herring. Both blueback herring and threadfin shad fed selectively during some periods of the year. Diet differences between the species may contribute to their coexistence; however, both blueback herring and threadfin shad showed a strong preference for Bosmina sp., increasing the chance that they may negatively influence one another.
Sample Size and Correlational Inference
ERIC Educational Resources Information Center
Anderson, Richard B.; Doherty, Michael E.; Friedrich, Jeff C.
2008-01-01
In 4 studies, the authors examined the hypothesis that the structure of the informational environment makes small samples more informative than large ones for drawing inferences about population correlations. The specific purpose of the studies was to test predictions arising from the signal detection simulations of R. B. Anderson, M. E. Doherty,…
Leadership Coaching for Principals: A National Study
ERIC Educational Resources Information Center
Wise, Donald; Cavazos, Blanca
2017-01-01
Surveys were sent to a large representative sample of public school principals in the United States asking if they had received leadership coaching. Comparison of responses to actual numbers of principals indicates that the sample represents the first national study of principal leadership coaching. Results indicate that approximately 50% of all…
Sampling studies to estimate the HIV prevalence rate in female commercial sex workers.
Pascom, Ana Roberta Pati; Szwarcwald, Célia Landmann; Barbosa Júnior, Aristides
2010-01-01
We investigated sampling methods being used to estimate the HIV prevalence rate among female commercial sex workers. The studies were classified according to the adequacy or not of the sample size to estimate HIV prevalence rate and according to the sampling method (probabilistic or convenience). We identified 75 studies that estimated the HIV prevalence rate among female sex workers. Most of the studies employed convenience samples. The sample size was not adequate to estimate HIV prevalence rate in 35 studies. The use of convenience sample limits statistical inference for the whole group. It was observed that there was an increase in the number of published studies since 2005, as well as in the number of studies that used probabilistic samples. This represents a large advance in the monitoring of risk behavior practices and HIV prevalence rate in this group.
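A standard adequacy check for the sample sizes reviewed here is the normal-approximation formula n = z^2 p(1-p) / d^2 for estimating a prevalence p to within absolute margin d (a textbook criterion, not necessarily the one the review applied):

```python
import math

def prevalence_sample_size(p_expected, margin, confidence_z=1.96):
    """Minimum n to estimate a prevalence p_expected with absolute precision
    `margin` at ~95% confidence (simple random sampling, normal approx.)."""
    return math.ceil(confidence_z ** 2 * p_expected * (1 - p_expected)
                     / margin ** 2)
```

For an expected prevalence of 5% estimated to within plus or minus 2 percentage points, this gives n = 457; many small convenience samples would fall short of such a target, and convenience sampling additionally invalidates the simple-random-sampling assumption behind the formula.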
A Pipeline for Large Data Processing Using Regular Sampling for Unstructured Grids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berres, Anne Sabine; Adhinarayanan, Vignesh; Turton, Terece
2017-05-12
Large simulation data requires a lot of time and computational resources to compute, store, analyze, visualize, and run user studies. Today, the largest cost of a supercomputer is not hardware but maintenance, in particular energy consumption. Our goal is to balance energy consumption and cognitive value of visualizations of resulting data. This requires us to go through the entire processing pipeline, from simulation to user studies. To reduce the amount of resources, data can be sampled or compressed. While this adds more computation time, the computational overhead is negligible compared to the simulation time. We built a processing pipeline using regular sampling as an example. The reasons for this choice are two-fold: using a simple example reduces unnecessary complexity, as we know what to expect from the results. Furthermore, it provides a good baseline for future, more elaborate sampling methods. We measured time and energy for each test we did, and we conducted user studies in Amazon Mechanical Turk (AMT) for a range of different results we produced through sampling.
Assessing the sustainable construction of large construction companies in Malaysia
NASA Astrophysics Data System (ADS)
Adewale, Bamgbade Jibril; Mohammed, Kamaruddeen Ahmed; Nasrun, Mohd Nawi Mohd
2016-08-01
Considering the increasing concern for sustainability issues in construction project delivery within the construction industry, this paper assesses the extent of sustainable construction among Malaysian large contractors, in order to ascertain the level of the industry's impacts on both the environment and society. Sustainable construction describes the construction industry's responsibility to efficiently utilise finite resources while also reducing construction impacts on both humans and the environment throughout the phases of construction. This study used proportionate stratified random sampling to conduct a field study with a sample of 172 contractors out of the 708 administered questionnaires. Data were collected from large contractors in the eleven states of peninsular Malaysia. Using a five-level rating scale (1 = Very Low; 2 = Low; 3 = Moderate; 4 = High; 5 = Very High) to describe the level of sustainable construction of Malaysian contractors based on previous studies, statistical analysis reveals that the environmental, social and economic sustainability of Malaysian large contractors is high.
Willis, C; Elviss, N; Aird, H; Fenelon, D; McLauchlin, J
2012-08-01
To investigate hygiene practices of caterers at large events in order to support the production of guidance on catering at such events, to compare hygiene standards at weekends with other times in the week, and to learn lessons in preparation for the London Olympics in 2012. UK-wide study of caterers at large events, including questionnaires on hygiene procedures and microbiological examination of food, water and environmental samples. In total, 1364 samples of food, water, surface swabs and cloths were collected at 139 events by local authority sampling officers and transported to laboratories for microbiological analysis. Eight percent of food samples were of an unsatisfactory quality, and a further 2% contained potentially hazardous levels of Bacillus spp. A significantly higher proportion of unsatisfactory food samples were taken from vendors without adequate food safety procedures in place. Fifty-two percent of water samples, 38% of swabs and 71% of cloths were also unsatisfactory. The majority of samples (57%) were collected on Saturdays, Sundays or bank holidays. Environmental swab results were significantly poorer at weekends compared with other days of the week. This study reinforces the fact that food hygiene is a continuing cause for concern in mobile vendors, and indicates a need for an ongoing programme of training and monitoring of caterers in preparation for the London Olympics. Copyright © 2012 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
Tracing the trajectory of skill learning with a very large sample of online game players.
Stafford, Tom; Dewar, Michael
2014-02-01
In the present study, we analyzed data from a very large sample (N = 854,064) of players of an online game involving rapid perception, decision making, and motor responding. Use of game data allowed us to connect, for the first time, rich details of training history with measures of performance from participants engaged for a sustained amount of time in effortful practice. We showed that lawful relations exist between practice amount and subsequent performance, and between practice spacing and subsequent performance. Our methodology allowed an in situ confirmation of results long established in the experimental literature on skill acquisition. Additionally, we showed that greater initial variation in performance is linked to higher subsequent performance, a result we link to the exploration/exploitation trade-off from the computational framework of reinforcement learning. We discuss the benefits and opportunities of behavioral data sets with very large sample sizes and suggest that this approach could be particularly fecund for studies of skill acquisition.
A hard-to-read font reduces the framing effect in a large sample.
Korn, Christoph W; Ries, Juliane; Schalk, Lennart; Oganian, Yulia; Saalbach, Henrik
2018-04-01
How can apparent decision biases, such as the framing effect, be reduced? Intriguing findings within recent years indicate that foreign-language settings reduce framing effects, which has been explained in terms of deeper cognitive processing. Because hard-to-read fonts have been argued to trigger deeper cognitive processing, so-called cognitive disfluency, we tested whether hard-to-read fonts reduce framing effects. We found no reliable evidence for an effect of hard-to-read fonts on four framing scenarios in a laboratory study (final N = 158) and an online study (N = 271). However, in a preregistered online study with a rather large sample (N = 732), a hard-to-read font reduced the framing effect in the classic "Asian disease" scenario (in a one-sided test). This suggests that hard-to-read fonts can modulate decision biases, albeit with rather small effect sizes. Overall, our findings stress the importance of large samples for the reliability and replicability of modulations of decision biases.
Large strain cruciform biaxial testing for FLC detection
NASA Astrophysics Data System (ADS)
Güler, Baran; Efe, Mert
2017-10-01
Selection of a proper test method, specimen design and analysis method are key issues for studying the formability of sheet metals and detecting their forming limit curves (FLC). Materials with complex microstructures may need an additional micro-mechanical investigation and accurate modelling. The cruciform biaxial test stands as an alternative to standard tests, as it achieves frictionless, in-plane, multi-axial stress states with a single sample geometry. In this study, we introduce a small-scale (less than 10 cm) cruciform sample allowing micro-mechanical investigation at stress states ranging from plane strain to equibiaxial. With successful specimen design and surface finish, large forming limit strains are obtained at the test region of the sample. The large forming limit strains obtained by experiments are compared to the values obtained from the Marciniak-Kuczynski (M-K) local necking model and the Cockcroft-Latham damage model. This comparison shows that the experimental limiting strains are beyond the theoretical values, approaching the fracture strain of the two test materials: Al-6061-T6 aluminum alloy and DC-04 high-formability steel.
A Review of Biological Agent Sampling Methods and ...
This study was conducted to evaluate current sampling and analytical capabilities, from a time and resource perspective, for a large-scale biological contamination incident. The analysis will be useful for strategically directing future research investment.
Evaluation of Buccal Cell Samples for Studies of Oral Microbiota.
Yu, Guoqin; Phillips, Steve; Gail, Mitchell H; Goedert, James J; Humphrys, Michael; Ravel, Jacques; Ren, Yanfang; Caporaso, Neil E
2017-02-01
The human microbiota is postulated to affect cancer risk, but collecting microbiota specimens with prospective follow-up for diseases will take time. Buccal cell samples have been obtained from mouthwash for the study of human genomic DNA in many cohort studies. Here, we evaluate the feasibility of using buccal cell samples to examine associations of human microbiota and disease risk. We obtained buccal cells from mouthwash in 41 healthy participants using a protocol that is widely employed to obtain buccal cells for the study of human DNA. We compared oral microbiota from buccal cells with that from eight other oral sample types collected by following the protocols of the Human Microbiome Project. Microbiota profiles were determined by sequencing the 16S rRNA gene V3-V4 region. Compared with each of the eight other oral samples, the buccal cell samples had significantly more observed species (P < 0.002) and higher alpha diversity (Shannon index, P < 0.02). The microbial communities were more similar (smaller beta diversity) among buccal cell samples than among the other samples (P < 0.001 for 12 of 16 weighted and unweighted UniFrac distance comparisons). Buccal cell microbial profiles closely resembled saliva but were distinct from dental plaque and tongue dorsum. Stored buccal cell samples in prospective cohort studies are a promising resource for studying associations of oral microbiota with disease. The feasibility of using existing buccal cell collections in large prospective cohorts makes investigations of the role of oral microbiota in chronic disease etiology possible in large population studies today. Cancer Epidemiol Biomarkers Prev; 26(2); 249-53. ©2016 American Association for Cancer Research (AACR).
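The two alpha-diversity measures compared across sample types, observed species and the Shannon index, can be computed from a per-taxon read-count vector as follows (a generic sketch of the standard definitions, not the study's actual analysis pipeline):

```python
import math

def alpha_diversity(counts):
    """Richness (observed species) and Shannon index H = -sum p_i ln p_i
    from a vector of per-taxon read counts for one sample."""
    total = sum(counts)
    observed = sum(1 for c in counts if c > 0)
    shannon = -sum((c / total) * math.log(c / total) for c in counts if c > 0)
    return observed, shannon
```

An evenly spread community of four taxa gives H = ln 4, about 1.39; skewing the counts toward a few dominant taxa lowers H, which is the sense in which buccal cell samples showed "higher alpha diversity" than the other oral sample types.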
Kirk, Michelle R.; Jonker, Arjan; McCulloch, Alan
2015-01-01
Analysis of rumen microbial community structure based on small-subunit rRNA marker genes in metagenomic DNA samples provides important insights into the dominant taxa present in the rumen and allows assessment of community differences between individuals or in response to treatments applied to ruminants. However, natural animal-to-animal variation in rumen microbial community composition can limit the power of a study considerably, especially when only subtle differences are expected between treatment groups. Thus, trials with large numbers of animals may be necessary to overcome this variation. Because ruminants pass large amounts of rumen material to their oral cavities when they chew their cud, oral samples may contain good representations of the rumen microbiota and be useful in lieu of rumen samples to study rumen microbial communities. We compared bacterial, archaeal, and eukaryotic community structures in DNAs extracted from buccal swabs to those in DNAs from samples collected directly from the rumen by use of a stomach tube for sheep on four different diets. After bioinformatic depletion of potential oral taxa from libraries of samples collected via buccal swabs, bacterial communities showed significant clustering by diet (R = 0.37; analysis of similarity [ANOSIM]) rather than by sampling method (R = 0.07). Archaeal, ciliate protozoal, and anaerobic fungal communities also showed significant clustering by diet rather than by sampling method, even without adjustment for potentially orally associated microorganisms. These findings indicate that buccal swabs may in future allow quick and noninvasive sampling for analysis of rumen microbial communities in large numbers of ruminants. PMID:26276109
A Systematic Evaluation of Blood Serum and Plasma Pre-Analytics for Metabolomics Cohort Studies
Jobard, Elodie; Trédan, Olivier; Postoly, Déborah; André, Fabrice; Martin, Anne-Laure; Elena-Herrmann, Bénédicte; Boyault, Sandrine
2016-01-01
The recent thriving development of biobanks and associated high-throughput phenotyping studies requires the elaboration of large-scale approaches for monitoring biological sample quality and compliance with standard protocols. We present a metabolomic investigation of human blood samples that delineates pitfalls and guidelines for the collection, storage and handling procedures for serum and plasma. A series of eight pre-processing technical parameters is systematically investigated along variable ranges commonly encountered across clinical studies. While metabolic fingerprints, as assessed by nuclear magnetic resonance, are not significantly affected by altered centrifugation parameters or delays between sample pre-processing (blood centrifugation) and storage, our metabolomic investigation highlights that both the delay and storage temperature between blood draw and centrifugation are the primary parameters impacting serum and plasma metabolic profiles. Storing the blood drawn at 4 °C is shown to be a reliable routine to confine variability associated with idle time prior to sample pre-processing. Based on their fine sensitivity to pre-analytical parameters and protocol variations, metabolic fingerprints could be exploited as valuable ways to determine compliance with standard procedures and quality assessment of blood samples within large multi-omic clinical and translational cohort studies. PMID:27929400
A topological analysis of large-scale structure, studied using the CMASS sample of SDSS-III
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parihar, Prachi; Gott, J. Richard III; Vogeley, Michael S.
2014-12-01
We study the three-dimensional genus topology of large-scale structure using the northern region of the CMASS Data Release 10 (DR10) sample of the SDSS-III Baryon Oscillation Spectroscopic Survey. We select galaxies with redshift 0.452 < z < 0.625 and with a stellar mass M_stellar > 10^11.56 M_☉. We study the topology at two smoothing lengths: R_G = 21 h^-1 Mpc and R_G = 34 h^-1 Mpc. The genus topology studied at the R_G = 21 h^-1 Mpc scale results in the highest genus amplitude observed to date. The CMASS sample yields a genus curve that is characteristic of one produced by Gaussian random phase initial conditions. The data thus support the standard model of inflation where random quantum fluctuations in the early universe produced Gaussian random phase initial conditions. Modest deviations in the observed genus from random phase are as expected from shot noise effects and the nonlinear evolution of structure. We suggest the use of a fitting formula motivated by perturbation theory to characterize the shift and asymmetries in the observed genus curve with a single parameter. We construct 54 mock SDSS CMASS surveys along the past light cone from the Horizon Run 3 (HR3) N-body simulations, where gravitationally bound dark matter subhalos are identified as the sites of galaxy formation. We study the genus topology of the HR3 mock surveys with the same geometry and sampling density as the observational sample and find the observed genus topology to be consistent with ΛCDM as simulated by the HR3 mock samples. We conclude that the topology of the large-scale structure in the SDSS CMASS sample is consistent with cosmological models having primordial Gaussian density fluctuations growing in accordance with general relativity to form galaxies in massive dark matter halos.
Psychopathic Traits in a Large Community Sample: Links to Violence, Alcohol Use, and Intelligence
ERIC Educational Resources Information Center
Neumann, Craig S.; Hare, Robert D.
2008-01-01
Numerous studies conducted with offender or forensic psychiatric samples have revealed that individuals with psychopathic traits are at risk for violence and other externalizing psychopathology. These traits appear to be continuously distributed in these samples, leading investigators to speculate on the presence of such traits in the general…
Precision Timing Calorimeter for High Energy Physics
Anderson, Dustin; Apresyan, Artur; Bornheim, Adolf; ...
2016-04-01
Here, we present studies on the performance and characterization of the time resolution of LYSO-based calorimeters. Results for an LYSO sampling calorimeter and an LYSO-tungsten Shashlik calorimeter are presented. We also demonstrate that a time resolution of 30 ps is achievable for the LYSO sampling calorimeter. Timing calorimetry is described as a tool for mitigating the effects due to the large number of simultaneous interactions in the high luminosity environment foreseen for the Large Hadron Collider.
Estimation of reference intervals from small samples: an example using canine plasma creatinine.
Geffré, A; Braun, J P; Trumel, C; Concordet, D
2009-12-01
According to international recommendations, reference intervals should be determined from at least 120 reference individuals, which often are impossible to achieve in veterinary clinical pathology, especially for wild animals. When only a small number of reference subjects is available, the possible bias cannot be known and the normality of the distribution cannot be evaluated. A comparison of reference intervals estimated by different methods could be helpful. The purpose of this study was to compare reference limits determined from a large set of canine plasma creatinine reference values, and large subsets of this data, with estimates obtained from small samples selected randomly. Twenty sets each of 120 and 27 samples were randomly selected from a set of 1439 plasma creatinine results obtained from healthy dogs in another study. Reference intervals for the whole sample and for the large samples were determined by a nonparametric method. The estimated reference limits for the small samples were minimum and maximum, mean +/- 2 SD of native and Box-Cox-transformed values, 2.5th and 97.5th percentiles by a robust method on native and Box-Cox-transformed values, and estimates from diagrams of cumulative distribution functions. The whole sample had a heavily skewed distribution, which approached Gaussian after Box-Cox transformation. The reference limits estimated from small samples were highly variable. The closest estimates to the 1439-result reference interval for 27-result subsamples were obtained by both parametric and robust methods after Box-Cox transformation but were grossly erroneous in some cases. For small samples, it is recommended that all values be reported graphically in a dot plot or histogram and that estimates of the reference limits be compared using different methods.
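The estimators compared in this record can be illustrated with synthetic data. The sketch below (Python with NumPy/SciPy) contrasts the nonparametric 2.5th-97.5th percentile interval from a large sample with small-sample parametric estimates on native and Box-Cox-transformed values; the lognormal "creatinine-like" values and every parameter are invented, not the study's data:

```python
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

rng = np.random.default_rng(42)
# Invented skewed "creatinine-like" reference values (units arbitrary)
reference_pop = rng.lognormal(mean=4.4, sigma=0.25, size=1439)

# Nonparametric reference interval from the full sample (needs n >= 120)
lo, hi = np.percentile(reference_pop, [2.5, 97.5])

# A small subsample, as in the study's n = 27 comparison
small = rng.choice(reference_pop, size=27, replace=False)

# Parametric limits on native values: biased when the distribution is skewed
native = (small.mean() - 2 * small.std(ddof=1),
          small.mean() + 2 * small.std(ddof=1))

# Parametric limits after Box-Cox transformation, back-transformed
bc, lam = stats.boxcox(small)
transformed = (inv_boxcox(bc.mean() - 2 * bc.std(ddof=1), lam),
               inv_boxcox(bc.mean() + 2 * bc.std(ddof=1), lam))
```

Re-running the subsampling step shows the variability the study reports: the small-sample limits move substantially from one 27-value subset to the next, which is why the authors recommend plotting all values and comparing several estimators.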
Presolar Materials in a Giant Cluster IDP of Probable Cometary Origin
NASA Technical Reports Server (NTRS)
Messenger, S.; Brownlee, D. E.; Joswiak, D. J.; Nguyen, A. N.
2015-01-01
Chondritic porous interplanetary dust particles (CP-IDPs) have been linked to comets by their fragile structure, primitive mineralogy, dynamics, and abundant interstellar materials. But differences have emerged between 'cometary' CP-IDPs and comet 81P/Wild 2 Stardust Mission samples. Particles resembling Ca-Al-rich inclusions (CAIs), chondrules, and amoeboid olivine aggregates (AOAs) in Wild 2 samples are rare in CP-IDPs. Unlike IDPs, presolar materials are scarce in Wild 2 samples. These differences may be due to selection effects, such as destruction of fine grained (presolar) components during the 6 km/s aerogel impact collection of Wild 2 samples. Large refractory grains observed in Wild 2 samples are also unlikely to be found in most (less than 30 micrometers) IDPs. Presolar materials provide a measure of primitive-ness of meteorites and IDPs. Organic matter in IDPs and chondrites shows H and N isotopic anomalies attributed to low-T interstellar or protosolar disk chemistry, where the largest anomalies occur in the most primitive samples. Presolar silicates are abundant in meteorites with low levels of aqueous alteration (Acfer 094 approximately 200 ppm) and scarce in altered chondrites (e.g. Semarkona approximately 20 ppm). Presolar silicates in minimally altered CP-IDPs range from approximately 400 ppm to 15,000 ppm, possibly reflecting variable levels of destruction in the solar nebula or statistical variations due to small sample sizes. Here we present preliminary isotopic and mineralogical studies of a very large CP-IDP. The goals of this study are to more accurately determine the abundances of presolar components of CP-IDP material for comparison with comet Wild 2 samples and meteorites. The large mass of this IDP presents a unique opportunity to accurately determine the abundance of pre-solar grains in a likely cometary sample.
Seretis, Charalampos; Seretis, Fotios; Lagoudianakis, Emmanuel; Politou, Marianna; Gemenetzis, George; Salemis, Nikolaos S.
2012-01-01
Background. The objective of our study is to investigate the potential effect of adjusting preoperative platelet to lymphocyte ratio, an emerging biomarker of survival in cancer patients, for the fraction of large platelets. Methods. A total of 79 patients with breast neoplasias, 44 with fibroadenomas, and 35 with invasive ductal carcinoma were included in the study. Both conventional platelet to lymphocyte ratio (PLR) and the adjusted marker, large platelet to lymphocyte ratio (LPLR), were correlated with laboratory and histopathological parameters of the study sample. Results. LPLR elevation was significantly correlated with the presence of malignancy, advanced tumor stage, metastatic spread in the axillary nodes and HER2/neu overexpression, while PLR was only correlated with the number of infiltrated lymph nodes. Conclusions. This is the first study evaluating the effect of adjustment for large platelet count on improving PLR accuracy, when correlated with the basic independent markers of survival in a sample of breast cancer patients. Further studies are needed in order to assess the possibility of applying our adjustment as standard in terms of predicting survival rates in cancer. PMID:23304480
Lyons, Anthony; Heywood, Wendy; Fileborn, Bianca; Minichiello, Victor; Barrett, Catherine; Brown, Graham; Hinchliff, Sharron; Malta, Sue; Crameri, Pauline
2017-09-01
Older people are often excluded from large studies of sexual health, as it is assumed that they are not having sex or are reluctant to talk about sensitive topics and are therefore difficult to recruit. We outline the sampling and recruitment strategies from a recent study on sexual health and relationships among older people. Sex, Age and Me was a nationwide Australian study that examined sexual health, relationship patterns, safer-sex practices and STI knowledge of Australians aged 60 years and over. The study used a mixed-methods approach to establish baseline levels of knowledge and to develop deeper insights into older adults' understandings and practices relating to sexual health. Data collection took place in 2015, with 2137 participants completing a quantitative survey and 53 participating in one-on-one semi-structured interviews. As the feasibility of this type of study has been largely untested until now, we provide detailed information on the study's recruitment strategies and methods. We also compare key characteristics of our sample with national estimates to assess its degree of representativeness. This study provides evidence to challenge the assumptions that older people will not take part in sexual health-related research and details a novel and successful way to recruit participants in this area.
Apollo 15 coarse fines (4-10 mm): Sample classification, description and inventory
NASA Technical Reports Server (NTRS)
Powell, B. N.
1972-01-01
A particle-by-particle binocular microscopic examination of all of the Apollo 15 4-10 mm fines samples is reported. These particles are classified according to their macroscopic lithologic features in order to provide a basis for sample allocations and future study. The relatively large size of these particles renders them too valuable to permit treatment along with the other bulk fines, yet they are too small (and numerous) to practically receive full individual descriptive treatment as given the larger rock samples. This examination, classification and description of subgroups represents a compromise treatment. In most cases and for many types of investigation the individual particles should be large enough to permit the application of more than one type of analysis.
LARGE RIVER ASSESSMENT METHODS FOR BENTHIC MACROINVERTEBRATES AND FISH
Multiple projects are currently underway to increase our understanding of the varying results of different sampling methods and designs used for the biological assessment and monitoring of large (boatable) rivers. Studies include methods used to assess fish, benthic macroinverte...
Sampling pig farms at the abattoir in a cross-sectional study - Evaluation of a sampling method.
Birkegård, Anna Camilla; Halasa, Tariq; Toft, Nils
2017-09-15
A cross-sectional study design is relatively inexpensive, fast and easy to conduct when compared to other study designs. Careful planning is essential to obtaining a representative sample of the population, and the recommended approach is to use simple random sampling from an exhaustive list of units in the target population. This approach is rarely feasible in practice, and other sampling procedures must often be adopted. For example, when slaughter pigs are the target population, sampling the pigs on the slaughter line may be an alternative to on-site sampling at a list of farms. However, it is difficult to sample a large number of farms from an exact predefined list, due to the logistics and workflow of an abattoir. Therefore, it is necessary to have a systematic sampling procedure and to evaluate the obtained sample with respect to the study objective. We propose a method for 1) planning, 2) conducting, and 3) evaluating the representativeness and reproducibility of a cross-sectional study when simple random sampling is not possible. We used an example of a cross-sectional study with the aim of quantifying the association of antimicrobial resistance and antimicrobial consumption in Danish slaughter pigs. It was not possible to visit farms within the designated timeframe. Therefore, it was decided to use convenience sampling at the abattoir. Our approach was carried out in three steps: 1) planning: using data from meat inspection to plan at which abattoirs and how many farms to sample; 2) conducting: sampling was carried out at five abattoirs; 3) evaluation: representativeness was evaluated by comparing sampled and non-sampled farms, and the reproducibility of the study was assessed through simulated sampling based on meat inspection data from the period where the actual data collection was carried out. In the cross-sectional study samples were taken from 681 Danish pig farms, during five weeks from February to March 2015. 
The evaluation showed that the sampling procedure was reproducible with results comparable to the collected sample. However, the sampling procedure favoured sampling of large farms. Furthermore, both under-sampled and over-sampled areas were found using scan statistics. In conclusion, sampling conducted at abattoirs can provide a spatially representative sample. Hence it is a possible cost-effective alternative to simple random sampling. However, it is important to assess the properties of the resulting sample so that any potential selection bias can be addressed when reporting the findings. Copyright © 2017 Elsevier B.V. All rights reserved.
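One part of the evaluation step, checking whether convenience sampling at the abattoir favoured large farms, can be sketched with a simple rank test. Everything below is illustrative (invented herd sizes, not the study's meat-inspection data), and the 0.05 threshold is a conventional choice, not the paper's:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Invented herd sizes: the sampled farms are skewed toward larger herds
non_sampled = rng.lognormal(mean=6.0, sigma=0.8, size=2000)
sampled = rng.lognormal(mean=6.4, sigma=0.8, size=681)

# One-sided Mann-Whitney U: are sampled farms stochastically larger?
u, p = stats.mannwhitneyu(sampled, non_sampled, alternative='greater')
biased_toward_large = p < 0.05
```

A significant result here is the kind of selection bias the authors say must be reported alongside the findings; spatial over- and under-sampling would need a separate check such as the scan statistics they mention.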
Decoder calibration with ultra small current sample set for intracortical brain-machine interface
NASA Astrophysics Data System (ADS)
Zhang, Peng; Ma, Xuan; Chen, Luyao; Zhou, Jin; Wang, Changyong; Li, Wei; He, Jiping
2018-04-01
Objective. Intracortical brain-machine interfaces (iBMIs) aim to restore efficient communication and movement ability for paralyzed patients. However, frequent recalibration is required for consistency and reliability, and every recalibration requires a relatively large current sample set. The aim in this study is to develop an effective decoder calibration method that can achieve good performance while minimizing recalibration time. Approach. Two rhesus macaques implanted with intracortical microelectrode arrays were trained separately on movement and sensory paradigms. Neural signals were recorded to decode reaching positions or grasping postures. A novel principal component analysis-based domain adaptation (PDA) method was proposed to recalibrate the decoder with only an ultra-small current sample set by taking advantage of large historical data, and the decoding performance was compared with three other calibration methods for evaluation. Main results. The PDA method closed the gap between historical and current data effectively, and made it possible to take advantage of large historical data for decoder recalibration in current data decoding. Using only an ultra-small current sample set (five trials of each category), the decoder calibrated using the PDA method achieved much better and more robust performance in all sessions than the three other calibration methods in both monkeys. Significance. (1) With this study, transfer learning theory was brought into iBMI decoder calibration for the first time. (2) Unlike most transfer learning studies, the target data in this study were an ultra-small sample set and were transferred to the source data. (3) By taking advantage of historical data, the PDA method was demonstrated to be effective in reducing recalibration time for both the movement and the sensory paradigm, indicating viable generalization.
By reducing the demand for large current training data, this new method may facilitate the application of intracortical brain-machine interfaces in clinical practice.
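The paper's PDA method is not specified in enough detail here to reproduce. As a hedged illustration of the general idea only (projecting both large historical data and an ultra-small current sample set into a shared PCA subspace after removing each domain's own mean, then decoding by nearest class centroid), here is a NumPy sketch; all function names, the constant-offset domain shift, and the nearest-centroid decoder are invented assumptions, not the authors' algorithm:

```python
import numpy as np

def pca_components(X, k):
    """Top-k principal axes of row-wise data X (n_samples x n_features)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)  # rows of Vt = axes
    return Vt[:k]

def adapt_and_decode(hist_X, hist_y, cur_X, cur_y, test_X, k=2):
    """Hypothetical sketch: center historical and current data on their own
    means (removing a constant domain shift), project everything into the
    historical PCA subspace, pool the labeled projections, and decode new
    trials by nearest class centroid."""
    axes = pca_components(hist_X, k)
    H = (hist_X - hist_X.mean(axis=0)) @ axes.T
    C = (cur_X - cur_X.mean(axis=0)) @ axes.T
    T = (test_X - cur_X.mean(axis=0)) @ axes.T
    X = np.vstack([H, C])
    y = np.concatenate([hist_y, cur_y])
    centroids = {c: X[y == c].mean(axis=0) for c in np.unique(y)}
    labels = np.array(sorted(centroids))
    dists = np.stack([np.linalg.norm(T - centroids[c], axis=1) for c in labels])
    return labels[dists.argmin(axis=0)]
```

The point the abstract makes survives in the sketch: with only five labeled trials per category in the new session, the pooled historical projections carry most of the class structure.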
ERIC Educational Resources Information Center
White, Stuart F.; Brislin, Sarah; Sinclair, Stephen; Fowler, Katherine A.; Pope, Kayla; Blair, R. James R.
2013-01-01
Background: The presence of a large cavum septum pellucidum (CSP) has been previously associated with antisocial behavior/psychopathic traits in an adult community sample. Aims: The current study investigated the relationship between a large CSP and symptom severity in disruptive behavior disorders (DBD; conduct disorder and oppositional defiant…
Rimehaug, Tormod; Wallander, Jan
2010-07-01
The study compared anxiety and depression prevalence between parents and non-parents in a society with family- and parenthood-friendly social politics, controlling for family status and family history, age, gender, education and social class. All participants aged 30-49 (N = 24,040) in the large, non-sampled Norwegian HUNT2 community health study completed the Hospital Anxiety and Depression Scales. The slightly elevated anxiety and depression among non-parents compared to parents in the complete sample was not confirmed as statistically significant within any subgroups. Married parents and (previously unmarried) cohabiting parents did not differ, both showing low anxiety and depression prevalence. Anxiety was associated with single parenthood, living alone or being divorced, while elevated depression was found only among those living alone. Burdening selection and cultural/political context are suggested as interpretative perspectives on the contextual and personal influences on the complex relationship between parenthood and mental health.
NASA Technical Reports Server (NTRS)
Pitts, D. E.; Badhwar, G.
1980-01-01
The development of agricultural remote sensing systems requires knowledge of agricultural field size distributions so that the sensors, sampling frames, image interpretation schemes, registration systems, and classification systems can be properly designed. Malila et al. (1976) studied the field size distribution for wheat and all other crops in two Kansas LACIE (Large Area Crop Inventory Experiment) intensive test sites using ground observations of the crops and measurements of their field areas based on current year rectified aerial photomaps. The field area and size distributions reported in the present investigation are derived from a representative subset of a stratified random sample of LACIE sample segments. In contrast to previous work, the obtained results indicate that most field-size distributions are not log-normally distributed. The most common field size observed in this study was 10 acres for most crops studied.
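The log-normality finding in this record can be checked, for any sample of field areas, by applying a normality test to the log-transformed sizes. A hedged sketch using SciPy's Shapiro-Wilk test; the acreage samples are invented and the function name is illustrative:

```python
import numpy as np
from scipy import stats

def lognormal_pvalue(field_sizes):
    """Shapiro-Wilk test on log(sizes): a small p-value rejects the
    hypothesis that field sizes are log-normally distributed."""
    w, p = stats.shapiro(np.log(field_sizes))
    return p

rng = np.random.default_rng(3)
# Invented acreages: one sample genuinely lognormal, one heavy-tailed (Pareto)
lognormal_fields = rng.lognormal(mean=np.log(10), sigma=0.6, size=500)
pareto_fields = 5.0 * (1.0 + rng.pareto(1.5, size=500))
```

A rejection for real data is what the study reports for most crops: the observed field-size distributions were not log-normal.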
Chambaere, Kenneth; Bilsen, Johan; Cohen, Joachim; Pousset, Geert; Onwuteaka-Philipsen, Bregje; Mortier, Freddy; Deliens, Luc
2008-01-01
Background Reliable studies of the incidence and characteristics of medical end-of-life decisions with a certain or possible life shortening effect (ELDs) are indispensable for an evidence-based medical and societal debate on this issue. This article presents the protocol drafted for the 2007 ELD Study in Flanders, Belgium, and outlines how the main aims and challenges of the study (i.e. making reliable incidence estimates of end-of-life decisions, even rare ones, and describing their characteristics; allowing comparability with past ELD studies; guaranteeing strict anonymity given the sensitive nature of the research topic; and attaining a sufficient response rate) are addressed in a post-mortem survey using a representative sample of death certificates. Study design Reliable incidence estimates are achievable by using large random samples of death certificates of deceased persons in Flanders (aged one year or older). This entails the cooperation of the appropriate administrative authorities. To further ensure the reliability of the estimates and descriptions, especially of less prevalent end-of-life decisions (e.g. euthanasia), a stratified sample is drawn. A questionnaire is sent out to the certifying physician of each death sampled. The questionnaire, tested thoroughly and avoiding emotionally charged terms, is based largely on questions that have been validated in previous national and European ELD studies. Anonymity of both patient and physician is guaranteed through a rigorous procedure, involving a lawyer as intermediary between responding physicians and researchers. To increase response we follow the Total Design Method (TDM) with a maximum of three follow-up mailings. Also, a non-response survey is conducted to gain insight into the reasons for lack of response. 
Discussion The protocol of the 2007 ELD Study in Flanders, Belgium, is appropriate for achieving the objectives of the study; as past studies in Belgium, the Netherlands, and other European countries have shown, strictly anonymous and thorough surveys among physicians using a large, stratified, and representative death certificate sample are most suitable in nationwide studies of incidence and characteristics of end-of-life decisions. There are however also some limitations to the study design. PMID:18752659
NASA Astrophysics Data System (ADS)
Du, Shihong; Zhang, Fangli; Zhang, Xiuyuan
2015-07-01
While most existing studies have focused on extracting geometric information on buildings, only a few have concentrated on semantic information. The lack of semantic information cannot satisfy many demands on resolving environmental and social issues. This study presents an approach to semantically classify buildings into much finer categories than those of existing studies by learning random forest (RF) classifier from a large number of imbalanced samples with high-dimensional features. First, a two-level segmentation mechanism combining GIS and VHR image produces single image objects at a large scale and intra-object components at a small scale. Second, a semi-supervised method chooses a large number of unbiased samples by considering the spatial proximity and intra-cluster similarity of buildings. Third, two important improvements in RF classifier are made: a voting-distribution ranked rule for reducing the influences of imbalanced samples on classification accuracy and a feature importance measurement for evaluating each feature's contribution to the recognition of each category. Fourth, the semantic classification of urban buildings is practically conducted in Beijing city, and the results demonstrate that the proposed approach is effective and accurate. The seven categories used in the study are finer than those in existing work and more helpful to studying many environmental and social problems.
The ALHAMBRA survey: evolution of galaxy clustering since z ˜ 1
NASA Astrophysics Data System (ADS)
Arnalte-Mur, P.; Martínez, V. J.; Norberg, P.; Fernández-Soto, A.; Ascaso, B.; Merson, A. I.; Aguerri, J. A. L.; Castander, F. J.; Hurtado-Gil, L.; López-Sanjuan, C.; Molino, A.; Montero-Dorta, A. D.; Stefanon, M.; Alfaro, E.; Aparicio-Villegas, T.; Benítez, N.; Broadhurst, T.; Cabrera-Caño, J.; Cepa, J.; Cerviño, M.; Cristóbal-Hornillos, D.; del Olmo, A.; González Delgado, R. M.; Husillos, C.; Infante, L.; Márquez, I.; Masegosa, J.; Moles, M.; Perea, J.; Pović, M.; Prada, F.; Quintana, J. M.
2014-06-01
We study the clustering of galaxies as a function of luminosity and redshift in the range 0.35 < z < 1.25 using data from the Advanced Large Homogeneous Area Medium-Band Redshift Astronomical (ALHAMBRA) survey. The ALHAMBRA data used in this work cover 2.38 deg2 in seven independent fields, after applying a detailed angular selection mask, with accurate photometric redshifts, σz ≲ 0.014(1 + z), down to IAB < 24. Given the depth of the survey, we select samples in B-band luminosity down to Lth ≃ 0.16L* at z = 0.9. We measure the real-space clustering using the projected correlation function, accounting for photometric redshift uncertainties. We infer the galaxy bias, and study its evolution with luminosity. We study the effect of sample variance, and confirm earlier results that the Cosmic Evolution Survey (COSMOS) and European Large Area ISO Survey North 1 (ELAIS-N1) fields are dominated by the presence of large structures. For the intermediate and bright samples, Lmed ≳ 0.6L*, we obtain a strong dependence of bias on luminosity, in agreement with previous results at similar redshift. We are able to extend this study to fainter luminosities, where we obtain an almost flat relation, similar to that observed at low redshift. Regarding the evolution of bias with redshift, our results suggest that the different galaxy populations studied reside in haloes covering a range in mass between log10[Mh/( h-1 M⊙)] ≳ 11.5 for samples with Lmed ≃ 0.3L* and log10[Mh/( h-1 M⊙)] ≳ 13.0 for samples with Lmed ≃ 2L*, with typical occupation numbers in the range of ˜1-3 galaxies per halo.
A Comparison of Men Who Committed Different Types of Sexual Assault in a Community Sample
ERIC Educational Resources Information Center
Abbey, Antonia; Parkhill, Michele R.; Clinton-Sherrod, A. Monique; Zawacki, Tina
2007-01-01
This study extends past research by examining predictors of different types of sexual assault perpetration in a community sample. Computer-assisted self-interviews were conducted with a representative sample of 163 men in one large urban community. As hypothesized, many variables that are significant predictors of sexual assault perpetration in…
The large sample size fallacy.
Lantz, Björn
2013-06-01
Significance in the statistical sense has little to do with significance in the common practical sense. Statistical significance is a necessary but not a sufficient condition for practical significance. Hence, results that are extremely statistically significant may be highly nonsignificant in practice. The degree of practical significance is generally determined by the size of the observed effect, not the p-value. The results of studies based on large samples are often characterized by extreme statistical significance despite small or even trivial effect sizes. Interpreting such results as significant in practice without further analysis is referred to as the large sample size fallacy in this article. The aim of this article is to explore the relevance of the large sample size fallacy in contemporary nursing research. Relatively few nursing articles display explicit measures of observed effect sizes or include a qualitative discussion of observed effect sizes. Statistical significance is often treated as an end in itself. Effect sizes should generally be calculated and presented along with p-values for statistically significant results, and observed effect sizes should be discussed qualitatively through direct and explicit comparisons with the effects in related literature. © 2012 Nordic College of Caring Science.
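The fallacy is easy to demonstrate numerically. In this sketch (Python/SciPy, synthetic data), a true effect of Cohen's d = 0.05, trivial by any conventional benchmark, comes out extremely statistically significant simply because n is large:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 100_000                       # very large groups
a = rng.normal(0.00, 1.0, n)      # control
b = rng.normal(0.05, 1.0, n)      # trivial true effect: d = 0.05

t, p = stats.ttest_ind(a, b)

# Cohen's d: mean difference in pooled-SD units
pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
d = (b.mean() - a.mean()) / pooled_sd

# an extremely small p-value alongside a practically trivial effect size
print(f"p = {p:.2e}, Cohen's d = {d:.3f}")
```

This is exactly the pattern the article warns about: the p-value measures evidence against the null, not the magnitude of the effect, so effect sizes should be reported and discussed alongside it.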
Jose Ricardo Barradas; Lucas G. Silva; Bret C. Harvey; Nelson F. Fontoura
2012-01-01
1. The objective of this study was to identify longitudinal distribution patterns of large migratory fish species in the Uruguay River basin, southern Brazil, and construct statistical distribution models for Salminus brasiliensis, Prochilodus lineatus, Leporinus obtusidens and Pseudoplatystoma corruscans. 2. The sampling programme resulted in 202 interviews with old...
Factors Affecting Adult Student Dropout Rates in the Korean Cyber-University Degree Programs
ERIC Educational Resources Information Center
Choi, Hee Jun; Kim, Byoung Uk
2018-01-01
Few empirical studies of adult distance learners' decisions to drop out of degree programs have used large enough sample sizes to generalize the findings or data sets drawn from multiple online programs that address various subjects. Accordingly, in this study, we used a large administrative data set drawn from multiple online degree programs to…
Study Healthy Ageing and Intellectual Disabilities: Recruitment and Design
ERIC Educational Resources Information Center
Hilgenkamp, Thessa I. M.; Bastiaanse, Luc P.; Hermans, Heidi; Penning, Corine; van Wijck, Ruud; Evenhuis, Heleen M.
2011-01-01
Problems encountered in epidemiologic health research in older adults with intellectual disabilities (ID) are how to recruit a large-scale sample of participants and how to measure a range of health variables in such a group. This cross-sectional study into healthy ageing started with founding a consort of three large care providers with a total…
ERIC Educational Resources Information Center
Luna-Torres, Maria; McKinney, Lyle; Horn, Catherine; Jones, Sara
2018-01-01
This study examined a sample of community college students from a diverse, large urban community college system in Texas. To gain a deeper understanding about the effects of background characteristics on student borrowing behaviors and enrollment outcomes, the study employed descriptive statistics and regression techniques to examine two separate…
Bedload Rating and Flow Competence Curves Vary With Watershed and Bed Material Parameters
NASA Astrophysics Data System (ADS)
Bunte, K.; Abt, S. R.
2003-12-01
Bedload transport rating curves and flow competence curves (largest bedload size for specified flow) are usually not known for streams unless a large number of bedload samples has been collected and analyzed. However, this information is necessary for assessing instream flow needs and stream responses to watershed effects. This study therefore analyzed whether bedload transport rating and flow competence curves were related to stream parameters. Bedload transport rating curves and flow competence curves were obtained from extensive bedload sampling in six gravel- and cobble-bed mountain streams. Samples were collected using bedload traps and a large net sampler, both of which provide steep and relatively well-defined bedload rating and flow competence curves due to a long sampling duration, a large sampler opening and a large sampler capacity. The sampled streams have snowmelt regimes, steep (1-9%) gradients, and watersheds that are mainly forested and relatively undisturbed with basin area sizes of 8 to 105 km2. The channels are slightly incised and can contain flows of more than 1.5 times bankfull with little overbank flow. Exponents of bedload rating and flow competence curves obtained from these measurements were found to systematically increase with basin area size and decrease with the degree of channel armoring. By contrast, coefficients of bedload rating and flow competence curves decreased with basin size and increased with armoring. All of these relationships were well-defined (0.86 < r2 < 0.99). Data sets from other studies in coarse-bedded streams fit the indicated trend if the sampling device used allows measuring bedload transport rates over a wide range and if bedload supply is somewhat low. The existence of a general positive trend between bedload rating curve exponents and basin area, and a negative trend between coefficients and basin area, is confirmed by a large data set of bedload rating curves obtained from Helley-Smith samples. 
However, in this case the trends only become visible when basin area sizes span a wide range (1-10,000 km2). The well-defined relationships obtained from the bedload trap and the large net sampler suggest that exponents and coefficients of bedload transport rating curves (and flow competence curves) are predictable from an easily obtainable parameter such as basin size. However, the relationships of bedload rating curve exponents and coefficients with basin size and armoring appear to be influenced by the sampling device used and by watershed sediment production.
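The rating-curve coefficient and exponent that the abstract relates to basin area and armoring are conventionally obtained by fitting a power law, Qb = a * Q**b, to paired discharge/transport measurements in log-log space. A minimal sketch with made-up data (the values and noise model are illustrative, not from the study):

```python
import numpy as np

# Synthetic example: bedload transport rate Qb as a power law of
# discharge Q, Qb = a * Q**b -- the usual rating-curve form whose
# coefficient a and exponent b the study relates to basin properties.
rng = np.random.default_rng(0)
Q = np.linspace(0.5, 10.0, 40)                 # discharge, m^3/s (hypothetical)
a_true, b_true = 0.01, 2.5
Qb = a_true * Q**b_true * rng.lognormal(0.0, 0.05, Q.size)  # noisy "samples"

# Fit in log-log space: log Qb = log a + b * log Q
b_fit, log_a_fit = np.polyfit(np.log(Q), np.log(Qb), 1)
a_fit = np.exp(log_a_fit)
print(a_fit, b_fit)
```

Fitting in log space weights relative (multiplicative) scatter, which is the standard choice for transport rating curves whose residuals grow with discharge.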
Intratumoral histologic heterogeneity of gliomas. A quantitative study.
Paulus, W; Peiffer, J
1989-07-15
Quantitative data on intratumoral histologic heterogeneity were obtained by investigating ten small and ten large punched samples from each of 50 unembedded supratentorial gliomas. The 1000 samples were diagnosed according to the World Health Organization (WHO) classification, and six histopathologic features associated with malignancy were evaluated (cellular density, nuclear pleomorphism, necroses, histologic architecture, vessels, and mitoses), each with defined gradations. The slides were read independently by two observers. The initially high interobserver variability (grade, 22.2%; type, 10.3%; tumor presence/absence, 7.1%) was for the most part due to intermediate grades and types and was reduced to 1.7% after mutual review. Small samples showed a lower mean grade than large samples and more often contained no tumor (7.6% versus 2.4%). Of all gliomas, 48% showed differently typed samples, 82% differently graded samples, and 62% both benign and malignant grades. Intratumoral heterogeneity was higher for necroses than for the other histopathologic features. Our results underscore the importance of extensive tissue sampling.
Kittelmann, Sandra; Kirk, Michelle R; Jonker, Arjan; McCulloch, Alan; Janssen, Peter H
2015-11-01
Analysis of rumen microbial community structure based on small-subunit rRNA marker genes in metagenomic DNA samples provides important insights into the dominant taxa present in the rumen and allows assessment of community differences between individuals or in response to treatments applied to ruminants. However, natural animal-to-animal variation in rumen microbial community composition can limit the power of a study considerably, especially when only subtle differences are expected between treatment groups. Thus, trials with large numbers of animals may be necessary to overcome this variation. Because ruminants pass large amounts of rumen material to their oral cavities when they chew their cud, oral samples may contain good representations of the rumen microbiota and be useful in lieu of rumen samples to study rumen microbial communities. We compared bacterial, archaeal, and eukaryotic community structures in DNAs extracted from buccal swabs to those in DNAs from samples collected directly from the rumen by use of a stomach tube for sheep on four different diets. After bioinformatic depletion of potential oral taxa from libraries of samples collected via buccal swabs, bacterial communities showed significant clustering by diet (R = 0.37; analysis of similarity [ANOSIM]) rather than by sampling method (R = 0.07). Archaeal, ciliate protozoal, and anaerobic fungal communities also showed significant clustering by diet rather than by sampling method, even without adjustment for potentially orally associated microorganisms. These findings indicate that buccal swabs may in future allow quick and noninvasive sampling for analysis of rumen microbial communities in large numbers of ruminants. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
Effects of Sample Selection Bias on the Accuracy of Population Structure and Ancestry Inference
Shringarpure, Suyash; Xing, Eric P.
2014-01-01
Population stratification is an important task in genetic analyses. It provides information about the ancestry of individuals and can be an important confounder in genome-wide association studies. Public genotyping projects have made a large number of datasets available for study. However, practical constraints dictate that only a small number of individuals from any geographical/ethnic population are genotyped. The resulting data are a sample from the entire population. If the distribution of sample sizes is not representative of the populations being sampled, the accuracy of population stratification analyses of the data could be affected. We attempt to understand the effect of biased sampling on the accuracy of population structure analysis and individual ancestry recovery. We examined two commonly used methods for analyses of such datasets, ADMIXTURE and EIGENSOFT, and found that the accuracy of recovery of population structure is affected to a large extent by the sample used for analysis and how representative it is of the underlying populations. Using simulated data and real genotype data from cattle, we show that sample selection bias can affect the results of population structure analyses. We develop a mathematical framework for sample selection bias in models for population structure and also propose a correction for sample selection bias using auxiliary information about the sample. We demonstrate that such a correction is effective in practice using simulated and real data. PMID:24637351
Estimating the Size of a Large Network and its Communities from a Random Sample
Chen, Lin; Karbasi, Amin; Crawford, Forrest W.
2017-01-01
Most real-world networks are too large to be measured or studied directly and there is substantial interest in estimating global network properties from smaller sub-samples. One of the most important global properties is the number of vertices/nodes in the network. Estimating the number of vertices in a large network is a major challenge in computer science, epidemiology, demography, and intelligence analysis. In this paper we consider a population random graph G = (V, E) from the stochastic block model (SBM) with K communities/blocks. A sample is obtained by randomly choosing a subset W ⊆ V and letting G(W) be the induced subgraph in G of the vertices in W. In addition to G(W), we observe the total degree of each sampled vertex and its block membership. Given this partial information, we propose an efficient PopULation Size Estimation algorithm, called PULSE, that accurately estimates the size of the whole population as well as the size of each community. To support our theoretical analysis, we perform an exhaustive set of experiments to study the effects of sample size, K, and SBM model parameters on the accuracy of the estimates. The experimental results also demonstrate that PULSE significantly outperforms a widely-used method called the network scale-up estimator in a wide variety of scenarios. PMID:28867924
Estimating the Size of a Large Network and its Communities from a Random Sample.
Chen, Lin; Karbasi, Amin; Crawford, Forrest W
2016-01-01
Most real-world networks are too large to be measured or studied directly and there is substantial interest in estimating global network properties from smaller sub-samples. One of the most important global properties is the number of vertices/nodes in the network. Estimating the number of vertices in a large network is a major challenge in computer science, epidemiology, demography, and intelligence analysis. In this paper we consider a population random graph G = (V, E) from the stochastic block model (SBM) with K communities/blocks. A sample is obtained by randomly choosing a subset W ⊆ V and letting G(W) be the induced subgraph in G of the vertices in W. In addition to G(W), we observe the total degree of each sampled vertex and its block membership. Given this partial information, we propose an efficient PopULation Size Estimation algorithm, called PULSE, that accurately estimates the size of the whole population as well as the size of each community. To support our theoretical analysis, we perform an exhaustive set of experiments to study the effects of sample size, K, and SBM model parameters on the accuracy of the estimates. The experimental results also demonstrate that PULSE significantly outperforms a widely-used method called the network scale-up estimator in a wide variety of scenarios.
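The observation scheme described in these two abstracts (induced subgraph plus total degree of each sampled vertex) already supports a much simpler estimator than PULSE, which illustrates why that partial information is enough to recover N. This is a hypothetical sketch of that basic idea, not the PULSE algorithm: under uniform vertex sampling, each edge of a sampled vertex lands inside the sample W with probability about (|W|-1)/(N-1), so comparing induced degrees to total degrees lets us solve for N.

```python
import random
import itertools

random.seed(7)

# Population graph: an Erdos-Renyi G(N, p) stand-in for the SBM
# (one block), with N unknown to the estimator.
N, p = 1000, 0.02
edges = {frozenset(e) for e in itertools.combinations(range(N), 2)
         if random.random() < p}
deg = {v: 0 for v in range(N)}           # true total degrees
for e in edges:
    for v in e:
        deg[v] += 1

W = set(random.sample(range(N), 150))    # uniform random vertex sample
induced = sum(1 for e in edges if e <= W)    # edges inside the sample
total_deg_W = sum(deg[v] for v in W)         # observed total degrees

# E[2 * induced] ~= total_deg_W * (|W| - 1) / (N - 1)  =>  solve for N
N_hat = total_deg_W * (len(W) - 1) / (2 * induced) + 1
print(round(N_hat))
```

PULSE refines this kind of moment matching with block memberships to estimate each community's size as well; the sketch above only targets the total.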
Risk Factors for Severe Inter-Sibling Violence: A Preliminary Study of a Youth Forensic Sample
ERIC Educational Resources Information Center
Khan, Roxanne; Cooke, David J.
2008-01-01
The perpetration of severe inter-sibling violence (SISV) remains a largely unexplored area of family violence. This article describes an investigation of risk factors for intentional SISV perpetration. A sample of 111 young people under the care of the Scottish criminal justice or welfare systems was studied. A SISV perpetration interview schedule…
Assessment of sampling stability in ecological applications of discriminant analysis
Williams, B.K.; Titus, K.
1988-01-01
A simulation study was undertaken to assess the sampling stability of the variable loadings in linear discriminant function analysis. A factorial design was used for the factors of multivariate dimensionality, dispersion structure, configuration of group means, and sample size. A total of 32,400 discriminant analyses were conducted, based on data from simulated populations with appropriate underlying statistical distributions. A review of 60 published studies and 142 individual analyses indicated that total sample sizes in ecological studies often have met the minimum suggested by the simulation results. However, individual group sample sizes frequently were very unequal, and checks of assumptions usually were not reported. The authors recommend that ecologists obtain group sample sizes that are at least three times as large as the number of variables measured.
A SEDIMENT TOXICITY EVALUATION OF THREE LARGE RIVER SYSTEMS
Sediment toxicity samples were collected from selected sites on the Ohio River, Missouri River and upper Mississippi River as part of the 2004 and 2005 Environmental Monitoring and Assessment Program-Great Rivers Ecosystems Study (EMAP-GRE). Samples were collected by compositing...
Beckwith, Michael A.
2003-01-01
Water-quality samples were collected at 10 sites in the Clark Fork-Pend Oreille and Spokane River Basins in water years 1999–2001 as part of the Northern Rockies Intermontane Basins (NROK) National Water-Quality Assessment (NAWQA) Program. Sampling sites were located in varied environments ranging from small streams and rivers in forested, mountainous headwater areas to large rivers draining diverse landscapes. Two sampling sites were located immediately downstream from large lakes; five sites were located downstream from large-scale historical mining and ore-processing areas, which are now the two largest “Superfund” (environmental remediation) sites in the Nation. Samples were collected during a wide range of streamflow conditions, more frequently during increasing and high streamflow and less frequently during receding and base-flow conditions. Sample analyses emphasized major ions, nutrients, and selected trace elements. Streamflow during the study ranged from more than 130 percent of the long-term average in 1999 at some sites to 40 percent of the long-term average in 2001. River and stream water in the study area exhibited small values for specific conductance, hardness, alkalinity, and dissolved solids. Dissolved oxygen concentrations in almost all samples were near saturation. Median total nitrogen and total phosphorus concentrations in samples from most sites were smaller than median concentrations reported for many national programs and other NAWQA Program study areas. The only exceptions were two sites downstream from large wastewater-treatment facilities, where median concentrations of total nitrogen exceeded the national median. Maximum concentrations of total phosphorus in samples from six sites exceeded the 0.1 milligram per liter threshold recommended for limiting nuisance aquatic growth.
Concentrations of arsenic, cadmium, copper, lead, mercury, and zinc were largest in samples from sites downstream from historical mining and ore-processing areas in the upper Clark Fork in Montana and the South Fork Coeur d’Alene River in Idaho. Concentrations of dissolved lead in all 32 samples from the South Fork Coeur d’Alene River exceeded the Idaho chronic criterion for the protection of aquatic life at the median hardness level measured during the study. Concentrations of dissolved zinc in all samples collected at this site exceeded both the chronic and acute criteria at all hardness levels measured. When all data from all NROK sites were combined, median concentrations of dissolved arsenic, dissolved and total recoverable copper, total recoverable lead, and total recoverable zinc in the NROK study area appeared to be similar to or slightly smaller than median concentrations at sites in other NAWQA Program study areas in the Western United States affected by historical mining activities. Although the NROK median total recoverable lead concentration was the smallest among the three Western study areas compared, concentrations in several NROK samples were an order of magnitude larger than the maximum concentrations measured in the Upper Colorado River and Great Salt Lake Basins. Dissolved cadmium, dissolved lead, and total recoverable zinc concentrations at NROK sites were more variable than in the other study areas; concentrations ranged over almost three orders of magnitude between minimum and maximum values; the range of dissolved zinc concentrations in the NROK study area exceeded three orders of magnitude.
Heritability of metabolic syndrome traits in a large population-based sample
van Dongen, Jenny; Willemsen, Gonneke; Chen, Wei-Min; de Geus, Eco J. C.; Boomsma, Dorret I.
2013-01-01
Heritability estimates of metabolic syndrome traits vary widely across studies. Some studies have suggested that the contribution of genes may vary with age or sex. We estimated the heritability of 11 metabolic syndrome-related traits and height as a function of age and sex in a large population-based sample of twin families (N = 2,792–27,021, for different traits). A moderate-to-high heritability was found for all traits [from H2 = 0.47 (insulin) to H2 = 0.78 (BMI)]. The broad-sense heritability (H2) showed little variation between age groups in women; it differed somewhat more in men (e.g., for glucose, H2 = 0.61 in young females, H2 = 0.56 in older females, H2 = 0.64 in young males, and H2= 0.27 in older males). While nonadditive genetic effects explained little variation in the younger subjects, nonadditive genetic effects became more important at a greater age. Our findings show that in an unselected sample (age range, ∼18–98 years), the genetic contribution to individual differences in metabolic syndrome traits is moderate to large in both sexes and across age. Although the prevalence of the metabolic syndrome has greatly increased in the past decades due to lifestyle changes, our study indicates that most of the variation in metabolic syndrome traits between individuals is due to genetic differences. PMID:23918046
ERIC Educational Resources Information Center
Jackson, Allen W.; Morrow, James R., Jr.; Bowles, Heather R.; FitzGerald, Shannon J.; Blair, Steven N.
2007-01-01
Valid measurement of physical activity is important for studying the risks for morbidity and mortality. The purpose of this study was to examine evidence of construct validity of two similar single-response items assessing physical activity via self-report. Both items are based on the stages of change model. The sample was 687 participants (men =…
Who Is at Greatest Risk of Adverse Long-Term Outcomes? The Finnish from a Boy to a Man Study
ERIC Educational Resources Information Center
Sourander, Andre; Jensen, Peter; Davies, Mark; Niemela, Solja; Elonheimo, Henrik; Ristkari, Terja; Helenius, Hans; Sillanmaki, Lauri; Piha, Jorma; Kumpulainen, Kirsti; Tamminen, Tuula; Moilanen, Irma; Almqvist, Fredrik
2007-01-01
Objective: To study associations between comorbid psychopathology and long-term outcomes in a large birth cohort sample from age 8 to early adulthood. Method: The sample included long-term outcome data on 2,556 Finnish boys born in 1981. The aim was to study the impact of early childhood psychopathology types (externalizing versus internalizing…
Martinez-Maza, Cayetana; Alberdi, Maria Teresa; Nieto-Diaz, Manuel; Prado, José Luis
2014-01-01
Histological analyses of fossil bones have provided clues on the growth patterns and life history traits of several extinct vertebrates that would be unavailable to classical morphological studies. We analyzed the bone histology of Hipparion to infer features of its life history traits and growth pattern. Microscope analysis of thin sections of a large sample of humeri, femora, tibiae and metapodials of Hipparion concudense from the upper Miocene site of Los Valles de Fuentidueña (Segovia, Spain) has shown that the number of growth marks is similar among the different limb bones, suggesting that equivalent skeletochronological inferences for this Hipparion population might be achieved by means of any of the elements studied. Considering their abundance, we conducted a skeletochronological study based on the large sample of third metapodials from Los Valles de Fuentidueña together with another large sample from the Upper Miocene locality of Concud (Teruel, Spain). The data obtained enabled us to distinguish four age groups in both samples and to determine that Hipparion concudense tended to reach skeletal maturity during its third year of life. Integration of bone microstructure and skeletochronological data allowed us to identify ontogenetic changes in bone structure and growth rate and to distinguish three histologic ontogenetic stages corresponding to immature, subadult and adult individuals. Data on secondary osteon density revealed an increase in bone remodeling throughout the ontogenetic stages and a lesser degree of remodeling in the Concud population, which indicates different biomechanical stresses in the two populations, likely due to environmental differences. Several individuals showed atypical growth patterns in the Concud sample, which may also reflect environmental differences between the two localities.
Finally, classification of the specimens’ age within groups enabled us to characterize the age structure of both samples, which is typical of attritional assemblages. PMID:25098950
ERIC Educational Resources Information Center
Hoepfner, Ralph; And Others
Sampling techniques used in "A Study of the Sustaining Effects of Compensatory Education" are described in detail. The Sustaining Effects Study is a large, multi-faceted study of issues related to the compensatory education of elementary school students. Public elementary schools that include grades between one and six are eligible for…
Surface and vertical temperature data will be obtained from several large lakes with surface areas large enough to be effectively sampled with AVHRR imagery. Yearly and seasonal patterns of surface and whole water column thermal values will be compared to estimates of surface tem...
Effect of the three-dimensional microstructure on the sound absorption of foams: A parametric study.
Chevillotte, Fabien; Perrot, Camille
2017-08-01
The purpose of this work is to systematically study the effect of the throat and pore sizes on the sound absorbing properties of open-cell foams. The three-dimensional idealized unit cell used in this work makes it possible to mimic the acoustical macro-behavior of a large class of cellular solid foams. This study is carried out for normal incidence and also for a diffuse-field excitation, over a relatively large range of sample thicknesses. The transport and sound absorbing properties are numerically studied as a function of the throat size, the pore size, and the sample thickness. The resulting diagrams show the ranges of throat sizes and pore sizes where the sound absorption grading is maximized by the pore morphology as a function of the sample thickness, and how this correlates with the corresponding transport parameters. These charts demonstrate, together with typical examples, how the morphological characteristics of a foam could be modified in order to increase the visco-thermal dissipation effects.
Identifying airborne fungi in Seoul, Korea using metagenomics.
Oh, Seung-Yoon; Fong, Jonathan J; Park, Myung Soo; Chang, Limseok; Lim, Young Woon
2014-06-01
Fungal spores are widespread and common in the atmosphere. In this study, we use a metagenomic approach to study the fungal diversity in six total air samples collected from April to May 2012 in Seoul, Korea. This springtime period is important in Korea because of the peak in fungal spore concentration and Asian dust storms, although the year of this study (2012) was unique in that there were no major Asian dust events. Clustering sequences for operational taxonomic unit (OTU) identification recovered 1,266 unique OTUs in the combined dataset, with between 223᾿96 OTUs present in individual samples. OTUs from three fungal phyla were identified. For Ascomycota, Davidiella (anamorph: Cladosporium) was the most common genus in all samples, often accounting for more than 50% of all sequences in a sample. Other common Ascomycota genera identified were Alternaria, Didymella, Khuskia, Geosmitha, Penicillium, and Aspergillus. While several Basidiomycota genera were observed, Chytridiomycota OTUs were only present in one sample. Consistency was observed within sampling days, but there was a large shift in species composition from Ascomycota dominant to Basidiomycota dominant in the middle of the sampling period. This marked change may have been caused by meteorological events. A potential set of 40 allergy-inducing genera were identified, accounting for a large proportion of the diversity present (22.5᾿7.2%). Our study identifies high fungal diversity and potentially high levels of fungal allergens in springtime air of Korea, and provides a good baseline for future comparisons with Asian dust storms.
NASA Astrophysics Data System (ADS)
Queloz, Pierre; Bertuzzo, Enrico; Carraro, Luca; Botter, Gianluca; Miglietta, Franco; Rao, P. S. C.; Rinaldo, Andrea
2015-04-01
This paper reports on the experimental evidence collected on the transport of five fluorobenzoate tracers injected under controlled conditions into a vegetated hydrologic volume, a large lysimeter (fitted with load cells, sampling ports, and an underground chamber) in which two willows producing large evapotranspiration fluxes had been grown. The relevance of the study lies in the direct and indirect measures of the ways in which hydrologic fluxes, in this case evapotranspiration from the upper surface and discharge from the bottom drainage, sample water and solutes in storage at different times under variable hydrologic forcings. Methods involve the accurate control of hydrologic inputs and outputs and a large number of chemical analyses of discharge water samples. Mass extraction from biomass was also performed ex post. The results of the 2 year long experiment established that our initial premises on the tracers' behavior, known to be sorption-free under saturated conditions (which we verified in column leaching tests), were unsuitable, as large differences in mass recovery appeared. Issues of reactivity thus arose and were addressed in the paper, in this case attributed to microbial degradation and solute plant uptake. Our results suggest previously unknown features of fluorobenzoate compounds as hydrologic tracers, potentially interesting for catchment studies owing to their suitability for distinguishable multiple injections, and an outlook on direct experimental closures of mass balance in hydrologic transport volumes involving fluxes that are likely to sample differently stored water and solutes.
Bellenguez, Céline; Strange, Amy; Freeman, Colin; Donnelly, Peter; Spencer, Chris C A
2012-01-01
High-throughput genotyping arrays provide an efficient way to survey single nucleotide polymorphisms (SNPs) across the genome in large numbers of individuals. Downstream analysis of the data, for example in genome-wide association studies (GWAS), often involves statistical models of genotype frequencies across individuals. The complexities of the sample collection process and the potential for errors in the experimental assay can lead to biases and artefacts in an individual's inferred genotypes. Rather than attempting to model these complications, it has become standard practice to remove individuals whose genome-wide data differ from the sample at large. Here we describe a simple, but robust, statistical algorithm to identify samples with atypical summaries of genome-wide variation. Its use as a semi-automated quality control tool is demonstrated using several summary statistics, selected to identify different potential problems, and it is applied to two different genotyping platforms and sample collections. The algorithm is written in R and is freely available at www.well.ox.ac.uk/chris-spencer. Supplementary data are available at Bioinformatics online.
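The general idea described here, flagging individuals whose genome-wide summary statistics sit far from the bulk of the sample, can be sketched with a median/MAD robust z-score per statistic. This is an illustrative stand-in (in Python, with invented numbers), not the authors' R algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-sample summary statistic, e.g. genome-wide
# heterozygosity; sample 0 is made to look contaminated.
n = 500
het = rng.normal(0.32, 0.01, n)
het[0] = 0.45

# Robust z-scores: median and MAD resist the very outliers we seek,
# unlike the mean and standard deviation.
med = np.median(het)
mad = np.median(np.abs(het - med)) * 1.4826   # scaled to match sigma for normals
robust_z = (het - med) / mad
outliers = np.flatnonzero(np.abs(robust_z) > 6)
print(outliers)
```

In practice one would compute such scores for several statistics at once (heterozygosity, missingness, relatedness summaries) so that different failure modes are caught by different statistics, which mirrors the multi-statistic design the abstract describes.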
The topology of large-scale structure. III - Analysis of observations
NASA Astrophysics Data System (ADS)
Gott, J. Richard, III; Miller, John; Thuan, Trinh X.; Schneider, Stephen E.; Weinberg, David H.; Gammie, Charles; Polk, Kevin; Vogeley, Michael; Jeffrey, Scott; Bhavsar, Suketu P.; Melott, Adrian L.; Giovanelli, Riccardo; Hayes, Martha P.; Tully, R. Brent; Hamilton, Andrew J. S.
1989-05-01
A recently developed algorithm for quantitatively measuring the topology of large-scale structures in the universe was applied to a number of important observational data sets. The data sets included an Abell (1958) cluster sample out to Vmax = 22,600 km/sec, the Giovanelli and Haynes (1985) sample out to Vmax = 11,800 km/sec, the CfA sample out to Vmax = 5000 km/sec, the Thuan and Schneider (1988) dwarf sample out to Vmax = 3000 km/sec, and the Tully (1987) sample out to Vmax = 3000 km/sec. It was found that, when the topology is studied on smoothing scales significantly larger than the correlation length (i.e., smoothing length, lambda, not below 1200 km/sec), the topology is spongelike and is consistent with the standard model in which the structure seen today has grown from small fluctuations caused by random noise in the early universe. When the topology is studied on the scale of lambda of about 600 km/sec, a small shift is observed in the genus curve in the direction of a 'meatball' topology.
The topology of large-scale structure. III - Analysis of observations. [in universe
NASA Technical Reports Server (NTRS)
Gott, J. Richard, III; Weinberg, David H.; Miller, John; Thuan, Trinh X.; Schneider, Stephen E.
1989-01-01
A recently developed algorithm for quantitatively measuring the topology of large-scale structures in the universe was applied to a number of important observational data sets. The data sets included an Abell (1958) cluster sample out to Vmax = 22,600 km/sec, the Giovanelli and Haynes (1985) sample out to Vmax = 11,800 km/sec, the CfA sample out to Vmax = 5000 km/sec, the Thuan and Schneider (1988) dwarf sample out to Vmax = 3000 km/sec, and the Tully (1987) sample out to Vmax = 3000 km/sec. It was found that, when the topology is studied on smoothing scales significantly larger than the correlation length (i.e., smoothing length, lambda, not below 1200 km/sec), the topology is spongelike and is consistent with the standard model in which the structure seen today has grown from small fluctuations caused by random noise in the early universe. When the topology is studied on the scale of lambda of about 600 km/sec, a small shift is observed in the genus curve in the direction of a 'meatball' topology.
Lack of association between digit ratio (2D:4D) and assertiveness: replication in a large sample.
Voracek, Martin
2009-12-01
Findings regarding within-sex associations of digit ratio (2D:4D), a putative pointer to long-lasting effects of prenatal androgen action, and sexually differentiated personality traits have generally been inconsistent or unreplicable, suggesting that effects in this domain, if any, are likely small. In contrast to evidence from Wilson's important 1983 study, a forerunner of modern 2D:4D research, two recent studies (in 2005 and 2008, by Freeman et al. and Hampson et al.) showed that assertiveness, a presumably male-typed personality trait, was not associated with 2D:4D; however, these studies were clearly statistically underpowered. Hence this study examined the question anew, based on a large sample of 491 men and 627 women. Assertiveness was only modestly sexually differentiated, favoring men, and was a positive correlate of age and education and a negative correlate of weight and Body Mass Index among women, but not men. Replicating the two prior studies, 2D:4D was throughout unrelated to assertiveness scores. This null finding was preserved with controls for correlates of assertiveness, in nonparametric analysis, and with tests for curvilinear relations. Discussed are implications of this specific null finding, now replicated in a large sample, for studies of 2D:4D and personality in general, and novel research approaches to proceed in this field.
ASSESSMENT OF LARGE RIVER BENTHIC MACROINVERTEBRATE ASSEMBLAGES
During the summer of 2001, twelve sites were sampled for macroinvertebrates, six each on the Great Miami and Kentucky Rivers. Sites were chosen in each river from those sampled in the 1999 methods comparison study to reflect a disturbance gradient. At each site, a total distanc...
ASSESSMENT OF LARGE RIVER MACROINVERTEBRATE ASSEMBLAGES
During the summer of 2001, twelve sites were sampled for macroinvertebrates, six each on the Great Miami and Kentucky Rivers. Sites were chosen in each river from those sampled in the 1999 methods comparison study to reflect a disturbance gradient. At each site, a total distanc...
Sample sizes to control error estimates in determining soil bulk density in California forest soils
Youzhi Han; Jianwei Zhang; Kim G. Mattson; Weidong Zhang; Thomas A. Weber
2016-01-01
Characterizing forest soil properties with high variability is challenging, sometimes requiring large numbers of soil samples. Soil bulk density is a standard variable needed along with element concentrations to calculate nutrient pools. This study aimed to determine the optimal sample size, the number of observation (n), for predicting the soil bulk density with a...
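Studies of this kind typically size their sampling effort with the textbook formula n = (z * CV / E)^2, the number of observations needed to estimate a mean within relative error E at a given confidence, given the coefficient of variation CV. A general sketch of that calculation (not the paper's exact procedure, and the example numbers are invented):

```python
import math
from statistics import NormalDist

def sample_size(cv: float, rel_error: float, alpha: float = 0.05) -> int:
    """Observations needed to estimate a mean to within rel_error
    (as a fraction of the mean) at confidence 1 - alpha, given the
    coefficient of variation cv. Large-sample normal approximation."""
    z = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided critical value
    return math.ceil((z * cv / rel_error) ** 2)

# e.g. bulk density with CV = 20%, target +/-10% of the mean, 95% confidence
print(sample_size(0.20, 0.10))
```

For small pilot samples, a t critical value with n - 1 degrees of freedom (iterated to convergence) is the usual refinement of the normal approximation used here.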
Course Shopping in Urban Community Colleges: An Analysis of Student Drop and Add Activities
ERIC Educational Resources Information Center
Hagedorn, Linda Serra; Maxwell, William E.; Cypers, Scott; Moon, Hye Sun; Lester, Jaime
2007-01-01
This study examined the course shopping behaviors among a sample of approximately 5,000 community college students enrolled across nine campuses of a large urban district. The sample was purposely designed as an analytic, rather than a random, sample that sought to obtain adequate numbers of students in course areas that were of theoretical and of…
The ARIEL mission reference sample
NASA Astrophysics Data System (ADS)
Zingales, Tiziano; Tinetti, Giovanna; Pillitteri, Ignazio; Leconte, Jérémy; Micela, Giuseppina; Sarkar, Subhajit
2018-02-01
The ARIEL (Atmospheric Remote-sensing Exoplanet Large-survey) mission concept is one of the three M4 mission candidates selected by the European Space Agency (ESA) for a Phase A study, competing for a launch in 2026. ARIEL has been designed to study the physical and chemical properties of a large and diverse sample of exoplanets and, through those, understand how planets form and evolve in our galaxy. Here we describe the assumptions made to estimate an optimal sample of exoplanets - including already known exoplanets and expected ones yet to be discovered - observable by ARIEL and define a realistic mission scenario. To achieve the mission objectives, the sample should include gaseous and rocky planets with a range of temperatures around stars of different spectral type and metallicity. The current ARIEL design enables the observation of ~1000 planets, covering a broad range of planetary and stellar parameters, during its four-year mission lifetime. This nominal list of planets is expected to evolve over the years depending on the new exoplanet discoveries.
Sun, Yanqing; Sun, Liuquan; Zhou, Jie
2013-07-01
This paper studies the generalized semiparametric regression model for longitudinal data where the covariate effects are constant for some and time-varying for others. Different link functions can be used to allow more flexible modelling of longitudinal data. The nonparametric components of the model are estimated using a local linear estimating equation and the parametric components are estimated through a profile estimating function. The method automatically adjusts for heterogeneity of sampling times, allowing the sampling strategy to depend on the past sampling history as well as possibly time-dependent covariates without specifically modelling such dependence. A [Formula: see text]-fold cross-validation bandwidth selection is proposed as a working tool for locating an appropriate bandwidth. A criterion for selecting the link function is proposed to provide a better fit to the data. Large sample properties of the proposed estimators are investigated. Large sample pointwise and simultaneous confidence intervals for the regression coefficients are constructed. Formal hypothesis testing procedures are proposed to check for the covariate effects and whether the effects are time-varying. A simulation study is conducted to examine the finite sample performances of the proposed estimation and hypothesis testing procedures. The methods are illustrated with a data example.
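The k-fold cross-validation bandwidth selection mentioned above can be sketched with a toy example. This is an illustrative stand-in only: a simple Nadaraya-Watson smoother on simulated data replaces the paper's local linear estimating equation, and the candidate bandwidths and simulated curve are invented.

```python
# Toy k-fold cross-validation for kernel-smoother bandwidth selection.
import math, random

def nw_smooth(x0, xs, ys, h):
    """Nadaraya-Watson estimate at x0 with a Gaussian kernel of bandwidth h."""
    wsum = fsum = 0.0
    for x, y in zip(xs, ys):
        w = math.exp(-0.5 * ((x - x0) / h) ** 2)
        wsum += w
        fsum += w * y
    return fsum / wsum if wsum > 0 else 0.0

def kfold_cv_error(xs, ys, h, k=5):
    """Mean squared prediction error of the smoother under k-fold CV."""
    idx = list(range(len(xs)))
    random.Random(0).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    sse, m = 0.0, 0
    for fold in folds:
        held = set(fold)
        txs = [xs[i] for i in idx if i not in held]
        tys = [ys[i] for i in idx if i not in held]
        for i in fold:
            sse += (ys[i] - nw_smooth(xs[i], txs, tys, h)) ** 2
            m += 1
    return sse / m

# Simulated data: y = sin(2*pi*x) + noise
rng = random.Random(42)
xs = [i / 100 for i in range(100)]
ys = [math.sin(2 * math.pi * x) + rng.gauss(0, 0.2) for x in xs]

candidates = [0.01, 0.03, 0.05, 0.1, 0.2, 0.5]
best_h = min(candidates, key=lambda h: kfold_cv_error(xs, ys, h))
print("selected bandwidth:", best_h)
```

The selected bandwidth trades off bias (oversmoothing at large h) against variance (undersmoothing at small h), which is the same trade-off the paper's data-driven selector resolves.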
Simulation of Wind Profile Perturbations for Launch Vehicle Design
NASA Technical Reports Server (NTRS)
Adelfang, S. I.
2004-01-01
Ideally, a statistically representative sample of measured high-resolution wind profiles with wavelengths as small as tens of meters is required in design studies to establish aerodynamic load indicator dispersions and vehicle control system capability. At most potential launch sites, high-resolution wind profiles may not exist. Representative samples of Rawinsonde wind profiles to altitudes of 30 km are more likely to be available from the extensive network of measurement sites established for routine sampling in support of weather observing and forecasting activity. Such a sample, large enough to be statistically representative of relatively large wavelength perturbations, would be inadequate for launch vehicle design assessments because the Rawinsonde system accurately measures wind perturbations with wavelengths no smaller than 2000 m (1000 m altitude increment). The Kennedy Space Center (KSC) Jimsphere wind profiles (150/month and seasonal 2 and 3.5-hr pairs) are the only adequate samples of high-resolution profiles (approx. 150 to 300 m effective resolution, but over-sampled at 25 m intervals) that have been used extensively for launch vehicle design assessments. Therefore, a simulation process has been developed for enhancement of measured low-resolution Rawinsonde profiles that would be applicable in preliminary launch vehicle design studies at launch sites other than KSC.
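The enhancement idea can be sketched as: interpolate a coarse Rawinsonde profile onto a fine grid, then superpose simulated small-wavelength perturbations below the 2000 m resolution limit. This is a highly simplified illustration; the wavelengths, amplitudes, and random-phase sinusoids below are arbitrary placeholders, not the perturbation statistics used in the NASA simulation process.

```python
# Minimal sketch: enhance a low-resolution wind profile with simulated
# small-wavelength perturbations (illustrative parameters only).
import math, random

def enhance_profile(alt_m, wind_ms, dz=25.0, seed=0):
    """Linearly interpolate a coarse profile to dz spacing, then superpose
    random-phase sinusoids with wavelengths below the Rawinsonde limit."""
    rng = random.Random(seed)
    fine_z = [i * dz for i in range(int(alt_m[-1] / dz) + 1)]
    fine_w = []
    j = 0
    for z in fine_z:
        while j + 1 < len(alt_m) and alt_m[j + 1] < z:
            j += 1
        z0, z1 = alt_m[j], alt_m[j + 1]
        w0, w1 = wind_ms[j], wind_ms[j + 1]
        fine_w.append(w0 + (w1 - w0) * (z - z0) / (z1 - z0))
    for wavelength in (300.0, 600.0, 1200.0):   # m, below the 2000 m limit
        amp = 0.5 * rng.random()                # illustrative amplitude, m/s
        phase = rng.uniform(0, 2 * math.pi)
        for i, z in enumerate(fine_z):
            fine_w[i] += amp * math.sin(2 * math.pi * z / wavelength + phase)
    return fine_z, fine_w

coarse_z = [0, 1000, 2000, 3000]     # m (Rawinsonde altitude increments)
coarse_w = [5.0, 12.0, 18.0, 22.0]   # m/s
z, w = enhance_profile(coarse_z, coarse_w)
print(len(z), "levels at 25 m spacing")
```

The 25 m output spacing mirrors the Jimsphere over-sampling interval cited in the abstract.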
Integrating resource selection into spatial capture-recapture models for large carnivores
Proffitt, Kelly M.; Goldberg, Joshua; Hebblewhite, Mark; Russell, Robin E.; Jimenez, Ben; Robinson, Hugh S.; Pilgrim, Kristine; Schwartz, Michael K.
2015-01-01
Wildlife managers need reliable methods to estimate large carnivore densities and population trends; yet large carnivores are elusive, difficult to detect, and occur at low densities making traditional approaches intractable. Recent advances in spatial capture-recapture (SCR) models have provided new approaches for monitoring trends in wildlife abundance and these methods are particularly applicable to large carnivores. We applied SCR models in a Bayesian framework to estimate mountain lion densities in the Bitterroot Mountains of west central Montana. We incorporate an existing resource selection function (RSF) as a density covariate to account for heterogeneity in habitat use across the study area and include data collected from harvested lions. We identify individuals through DNA samples collected by (1) biopsy darting mountain lions detected in systematic surveys of the study area, (2) opportunistically collecting hair and scat samples, and (3) sampling all harvested mountain lions. We included 80 DNA samples collected from 62 individuals in the analysis. Including information on predicted habitat use as a covariate on the distribution of activity centers reduced the median estimated density by 44%, the standard deviation by 7%, and the width of 95% credible intervals by 10% as compared to standard SCR models. Within the two management units of interest, we estimated a median mountain lion density of 4.5 mountain lions/100 km2 (95% CI = 2.9, 7.7) and 5.2 mountain lions/100 km2 (95% CI = 3.4, 9.1). Including harvested individuals (dead recovery) did not create a significant bias in the detection process by introducing individuals that could not be detected after removal. However, the dead recovery component of the model did have a substantial effect on results by increasing sample size. 
The ability to account for heterogeneity in habitat use provides a useful extension to SCR models, and will enhance the ability of wildlife managers to reliably and economically estimate density of wildlife populations, particularly large carnivores.
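Two building blocks of the SCR approach described above can be sketched: a half-normal detection function (detection probability decays with distance from an animal's activity center) and a density surface weighted by a resource selection function. The parameter values and grids below are invented placeholders, not estimates from the Montana mountain lion analysis.

```python
# Sketch of SCR building blocks: half-normal detection and RSF-weighted
# activity-center density. All numbers are illustrative placeholders.
import math

def detection_prob(center, trap, p0=0.3, sigma=1.5):
    """Half-normal detection: probability falls off with squared distance
    from the activity center to the trap/search location."""
    d2 = (center[0] - trap[0]) ** 2 + (center[1] - trap[1]) ** 2
    return p0 * math.exp(-d2 / (2 * sigma ** 2))

def density_weights(rsf_values, beta=1.0):
    """Activity-center prior proportional to exp(beta * RSF), normalized
    over the discretized state space."""
    raw = [math.exp(beta * v) for v in rsf_values]
    total = sum(raw)
    return [r / total for r in raw]

traps = [(0.0, 0.0), (2.0, 0.0), (4.0, 0.0)]
center = (1.0, 1.0)
probs = [detection_prob(center, t) for t in traps]
weights = density_weights([0.2, 0.8, 0.5, 0.1])  # hypothetical RSF values
print([round(p, 3) for p in probs])
```

Setting beta to zero recovers a uniform activity-center prior, i.e., a standard SCR model without the habitat-use covariate.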
Enhanced Ligand Sampling for Relative Protein–Ligand Binding Free Energy Calculations
2016-01-01
Free energy calculations are used to study how strongly potential drug molecules interact with their target receptors. The accuracy of these calculations depends on the accuracy of the molecular dynamics (MD) force field as well as proper sampling of the major conformations of each molecule. However, proper sampling of ligand conformations can be difficult when there are large barriers separating the major ligand conformations. An example of this is for ligands with an asymmetrically substituted phenyl ring, where the presence of protein loops hinders the proper sampling of the different ring conformations. These ring conformations become more difficult to sample when the size of the functional groups attached to the ring increases. The Adaptive Integration Method (AIM) has been developed, which adaptively changes the alchemical coupling parameter λ during the MD simulation so that conformations sampled at one λ can aid sampling at the other λ values. The Accelerated Adaptive Integration Method (AcclAIM) builds on AIM by lowering potential barriers for specific degrees of freedom at intermediate λ values. However, these methods may not work when there are very large barriers separating the major ligand conformations. In this work, we describe a modification to AIM that improves sampling of the different ring conformations, even when there is a very large barrier between them. This method combines AIM with conformational Monte Carlo sampling, giving improved convergence of ring populations and the resulting free energy. This method, called AIM/MC, is applied to study the relative binding free energy for a pair of ligands that bind to thrombin and a different pair of ligands that bind to aspartyl protease β-APP cleaving enzyme 1 (BACE1). These protein–ligand binding free energy calculations illustrate the improvements in conformational sampling and the convergence of the free energy compared to both AIM and AcclAIM. PMID:25906170
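The core sampling problem above can be illustrated with a toy Metropolis sampler on a double-well torsion potential: local moves alone essentially never cross a ~10 kT barrier, but an explicit 180-degree "flip" proposal hops it directly. This is a pedagogical sketch of why conformational Monte Carlo moves help, not the AIM/MC algorithm from the paper; the potential and parameters are invented.

```python
# Toy Metropolis sampler on a double-well torsion potential, mixing small
# local moves with explicit barrier-hopping flip moves (illustrative only).
import math, random

def energy(phi, barrier=10.0):
    """Double-well torsion potential (kT units): minima at 0 and pi,
    with a barrier of ~10 kT between them."""
    return barrier * 0.5 * (1.0 - math.cos(2.0 * phi))

def metropolis(n_steps=20000, seed=1):
    rng = random.Random(seed)
    phi = 0.0
    in_second_well = 0
    for _ in range(n_steps):
        if rng.random() < 0.1:
            prop = phi + math.pi              # barrier-hopping "flip" move
        else:
            prop = phi + rng.gauss(0.0, 0.1)  # small local move
        prop %= 2.0 * math.pi
        # standard Metropolis acceptance
        if rng.random() < math.exp(min(0.0, energy(phi) - energy(prop))):
            phi = prop
        if math.pi / 2 < phi < 3 * math.pi / 2:
            in_second_well += 1
    return in_second_well / n_steps

frac = metropolis()
print("fraction of samples in the second well:", round(frac, 2))
```

Because the two wells are symmetric, a well-mixed sampler should spend about half its time in each; without the flip moves, the local sampler would remain trapped in its starting well.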
Study on Aerosol Penetration Through Clothing and Individual Protective Equipment
2009-05-01
8.4×10⁻³ mg·m⁻³ (2.57×10⁵ particles per cubic meter of air) over a 30-minute period. This scenario represents a very high end threat with a large... Isokinetic air sampling was applied and the effect of aerosol losses in sampling lines and other parts of the test rig was incorporated in the analysis...eliminate any "memory" effect. The aerosol sampling (airflow direction control, start of sampling) was operated manually. Isokinetic sampling conditions
Prohl, Annette; Ostermann, Carola; Lohr, Markus; Reinhold, Petra
2014-07-03
There is an ongoing search for alternative animal models in research of respiratory medicine. Depending on the goal of the research, large animals as models of pulmonary disease often resemble the situation of the human lung much better than mice do. Working with large animals also offers the opportunity to sample the same animal repeatedly over a certain course of time, which allows long-term studies without sacrificing the animals. The aim was to establish in vivo sampling methods for the use in a bovine model of a respiratory Chlamydia psittaci infection. Sampling should be performed at various time points in each animal during the study, and the samples should be suitable to study the host response, as well as the pathogen under experimental conditions. Bronchoscopy is a valuable diagnostic tool in human and veterinary medicine. It is a safe and minimally invasive procedure. This article describes the intrabronchial inoculation of calves as well as sampling methods for the lower respiratory tract. Videoendoscopic, intrabronchial inoculation leads to very consistent clinical and pathological findings in all inoculated animals and is, therefore, well-suited for use in models of infectious lung disease. The sampling methods described are bronchoalveolar lavage, bronchial brushing and transbronchial lung biopsy. All of these are valuable diagnostic tools in human medicine and could be adapted for experimental purposes to calves aged 6-8 weeks. The samples obtained were suitable for both pathogen detection and characterization of the severity of lung inflammation in the host.
NASA Astrophysics Data System (ADS)
Perera, I. K.; Kantartzoglou, S.; Dyer, P. E.
1996-12-01
We have performed experiments to explore the characteristics of the matrix-assisted laser desorption/ionization (MALDI) process and to ascertain optimal operational conditions for observing intact molecular ions of large proteins. In this study, several methods have been adopted for the preparation of analyte samples. Of these, the samples prepared with the simple dried-droplet method were found to be the most suitable for the generation of the large molecular clusters, while the near-uniform spin-coated samples were observed to produce highly reproducible molecular ion signals of relatively high mass resolutions. A resulting mass spectrum which illustrates the formation of cluster ions up to the 26-mer [26M+H]+ of bovine insulin corresponding to a mass of about 150,000 Da, is presented. The effect of fluence on the extent of clustering of protein molecules has been studied, the results revealing the existence of an optimum fluence for detecting the large cluster ions. Investigations have also indicated that the use of polyethylene-coated metallic substrates as sample supports can considerably reduce the fragmentation of the matrix/analyte molecular ions and the desorption of "neat" MALDI matrices deposited on these polyethylene-coated sample probes enhance their aggregation, forming up to the heptamer [7M+H]+ of the matrix, ferulic acid. The dependence of the mass resolution on the applied acceleration voltage and the desorption fluence has been examined and the results obtained are discussed in terms of a simple analysis of the linear time-of-flight mass spectrometer. A spectrum of chicken egg lysozyme (M ~ 14,306) displaying the high mass resolutions (M/ΔM ~ 690) that can be attained when the mass spectrometer is operated in the reflectron mode is also presented.
Mosaic construction, processing, and review of very large electron micrograph composites
NASA Astrophysics Data System (ADS)
Vogt, Robert C., III; Trenkle, John M.; Harmon, Laurel A.
1996-11-01
A system of programs is described for acquisition, mosaicking, cueing and interactive review of large-scale transmission electron micrograph composite images. This work was carried out as part of a final-phase clinical analysis study of a drug for the treatment of diabetic peripheral neuropathy. More than 500 nerve biopsy samples were prepared, digitally imaged, processed, and reviewed. For a given sample, typically 1000 or more 1.5 megabyte frames were acquired, for a total of between 1 and 2 gigabytes of data per sample. These frames were then automatically registered and mosaicked together into a single virtual image composite, which was subsequently used to perform automatic cueing of axons and axon clusters, as well as review and marking by qualified neuroanatomists. Statistics derived from the review process were used to evaluate the efficacy of the drug in promoting regeneration of myelinated nerve fibers. This effort demonstrates a new, entirely digital capability for doing large-scale electron micrograph studies, in which all of the relevant specimen data can be included at high magnification, as opposed to simply taking a random sample of discrete locations. It opens up the possibility of a new era in electron microscopy--one which broadens the scope of questions that this imaging modality can be used to answer.
ERIC Educational Resources Information Center
Hallett, Victoria; Ronald, Angelica; Colvert, Emma; Ames, Catherine; Woodhouse, Emma; Lietz, Stephanie; Garnett, Tracy; Gillan, Nicola; Rijsdijk, Fruhling; Scahill, Lawrence; Bolton, Patrick; Happé, Francesca
2013-01-01
Background: Although many children with autism spectrum disorders (ASDs) experience difficulties with anxiety, the manifestation of these difficulties remains unresolved. The current study assessed anxiety in a large population-based twin sample, aged 10-15 years. Phenotypic analyses were used to explore anxiety symptoms in children with ASDs,…
van der Loos, Matthijs J H M; Haring, Robin; Rietveld, Cornelius A; Baumeister, Sebastian E; Groenen, Patrick J F; Hofman, Albert; de Jong, Frank H; Koellinger, Philipp D; Kohlmann, Thomas; Nauck, Matthias A; Rivadeneira, Fernando; Uitterlinden, André G; van Rooij, Frank J A; Wallaschofski, Henri; Thurik, A Roy
2013-07-02
Previous research has suggested a positive association between testosterone (T) and entrepreneurial behavior in males. However, this evidence was found in a study with a small sample size and has not been replicated. In the present study, we aimed to verify this association using two large, independent, population-based samples of males. We tested the association of T with entrepreneurial behavior, operationalized as self-employment, using data from the Rotterdam Study (N=587) and the Study of Health in Pomerania (N=1697). Total testosterone (TT) and sex hormone-binding globulin (SHBG) were measured in the serum. Free testosterone (FT), non-SHBG-bound T (non-SHBG-T), and the TT/SHBG ratio were calculated and used as measures of bioactive serum T, in addition to TT adjusted for SHBG. Using logistic regression models, we found no significant associations between any of the serum T measures and self-employment in either of the samples. To our knowledge, this is the first large-scale study on the relationship between serum T and entrepreneurial behavior. Copyright © 2013 Elsevier Inc. All rights reserved.
Sampling procedures for throughfall monitoring: A simulation study
NASA Astrophysics Data System (ADS)
Zimmermann, Beate; Zimmermann, Alexander; Lark, Richard Murray; Elsenbeer, Helmut
2010-01-01
What is the most appropriate sampling scheme to estimate event-based average throughfall? A satisfactory answer to this seemingly simple question has yet to be found, a failure which we attribute to previous efforts' dependence on empirical studies. Here we try to answer this question by simulating stochastic throughfall fields based on parameters for statistical models of large monitoring data sets. We subsequently sampled these fields with different sampling designs and variable sample supports. We evaluated the performance of a particular sampling scheme with respect to the uncertainty of possible estimated means of throughfall volumes. Even for a relative error limit of 20%, an impractically large number of small, funnel-type collectors would be required to estimate mean throughfall, particularly for small events. While stratification of the target area is not superior to simple random sampling, cluster random sampling involves the risk of being less efficient. A larger sample support, e.g., the use of trough-type collectors, considerably reduces the necessary sample sizes and eliminates the sensitivity of the mean to outliers. Since the gain in time associated with the manual handling of troughs versus funnels depends on the local precipitation regime, the employment of automatically recording clusters of long troughs emerges as the most promising sampling scheme. Even so, a relative error of less than 5% appears out of reach for throughfall under heterogeneous canopies. We therefore suspect a considerable uncertainty of input parameters for interception models derived from measured throughfall, in particular, for those requiring data of small throughfall events.
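The abstract's central question — how many collectors are needed to keep the relative error of the estimated mean within a limit — can be approximated with the standard normal-approximation sample-size formula n ≥ (z·CV/E)². The coefficient-of-variation values below are invented illustrations, not the fitted model parameters from the simulation study.

```python
# Back-of-envelope sample sizes for estimating mean throughfall within a
# relative error limit E at ~95% confidence: n >= (z * CV / E)^2.
import math

def required_n(cv, rel_error, z=1.96):
    """Sample size so the mean estimate stays within +/- rel_error (95% CI),
    given the coefficient of variation of the throughfall field."""
    return math.ceil((z * cv / rel_error) ** 2)

# Small events tend to have much higher spatial variability (higher CV),
# which is why the abstract finds them hardest to estimate.
for label, cv in [("large event (CV=0.3)", 0.3), ("small event (CV=1.0)", 1.0)]:
    print(label, "| n for 20% error:", required_n(cv, 0.20),
          "| n for 5% error:", required_n(cv, 0.05))
```

The steep growth of n as the error limit tightens from 20% to 5% matches the abstract's conclusion that a 5% relative error is out of reach with practical numbers of funnel collectors, and that larger sample supports (troughs) are needed instead.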
The Sampling Design of the China Family Panel Studies (CFPS)
Xie, Yu; Lu, Ping
2018-01-01
The China Family Panel Studies (CFPS) is an on-going, nearly nationwide, comprehensive, longitudinal social survey that is intended to serve research needs on a large variety of social phenomena in contemporary China. In this paper, we describe the sampling design of the CFPS sample for its 2010 baseline survey and methods for constructing weights to adjust for sampling design and survey nonresponses. Specifically, the CFPS used a multi-stage probability strategy to reduce operation costs and implicit stratification to increase efficiency. Respondents were oversampled in five provinces or administrative equivalents for regional comparisons. We provide operation details for both sampling and weights construction. PMID:29854418
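The weight-construction logic described above can be sketched in two steps: a design (base) weight equal to the inverse of the multi-stage inclusion probability, then a nonresponse adjustment dividing by the response rate of the respondent's adjustment cell. The probabilities and response rates below are made-up numbers for illustration, not CFPS figures.

```python
# Sketch of survey weight construction: inverse-probability base weights
# with a cell-level nonresponse adjustment (illustrative numbers only).
def survey_weights(incl_prob, resp_rate):
    """Base weight 1/p divided by the cell response rate, so respondents
    also represent the nonrespondents in their adjustment cell."""
    return [1.0 / p / r for p, r in zip(incl_prob, resp_rate)]

incl_prob = [0.001, 0.002, 0.0005]   # hypothetical inclusion probabilities
resp_rate = [0.8, 0.9, 0.7]          # hypothetical cell response rates
w = survey_weights(incl_prob, resp_rate)
print([round(x, 1) for x in w])
```

Oversampled strata (higher inclusion probability) get smaller weights, which is how estimates from the five oversampled provinces are scaled back to national totals.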
Preparation of highly multiplexed small RNA sequencing libraries.
Persson, Helena; Søkilde, Rolf; Pirona, Anna Chiara; Rovira, Carlos
2017-08-01
MicroRNAs (miRNAs) are ~22-nucleotide-long small non-coding RNAs that regulate the expression of protein-coding genes by base pairing to partially complementary target sites, preferentially located in the 3´ untranslated region (UTR) of target mRNAs. The expression and function of miRNAs have been extensively studied in human disease, as well as the possibility of using these molecules as biomarkers for prognostication and treatment guidance. To identify and validate miRNAs as biomarkers, their expression must be screened in large collections of patient samples. Here, we develop a scalable protocol for the rapid and economical preparation of a large number of small RNA sequencing libraries using dual indexing for multiplexing. Combined with the use of off-the-shelf reagents, more samples can be sequenced simultaneously on large-scale sequencing platforms at a considerably lower cost per sample. Sample preparation is simplified by pooling libraries prior to gel purification, which allows for the selection of a narrow size range while minimizing sample variation. A comparison with publicly available data from benchmarking of miRNA analysis platforms showed that this method captures absolute and differential expression as effectively as commercially available alternatives.
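The dual-indexing idea above — combining two short index sets so their product addresses many samples with few oligos — can be sketched as a small demultiplexing table. The index sequences and sample names below are invented placeholders, not the indices used in the protocol.

```python
# Sketch of dual-index multiplexing: each sample is addressed by a unique
# (i7, i5) pair, and reads are assigned by matching both indices.
from itertools import product

i7 = ["ATCACG", "CGATGT", "TTAGGC"]   # hypothetical i7 index reads
i5 = ["AGGCTA", "TCCGGA"]             # hypothetical i5 index reads

# 3 x 2 combinations -> 6 samples addressable with only 5 index oligos
sample_map = {pair: f"sample_{k}" for k, pair in enumerate(product(i7, i5))}

def demultiplex(index1, index2):
    """Assign a read to a sample by its dual-index pair (exact match only;
    real pipelines usually tolerate ~1 mismatch per index)."""
    return sample_map.get((index1, index2), "undetermined")

print(len(sample_map), "samples addressable")
print(demultiplex("CGATGT", "TCCGGA"))
```

The multiplicative scaling is the point: n i7 indices and m i5 indices address n×m libraries per lane, which is what drives down the per-sample sequencing cost.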
In large-scale studies, it is often neither feasible nor necessary to obtain the large samples of 400 particles advocated by many geomorphologists to adequately quantify streambed surface particle-size distributions. Synoptic surveys such as U.S. Environmental Protection Agency...
Grain Size and Parameter Recovery with TIMSS and the General Diagnostic Model
ERIC Educational Resources Information Center
Skaggs, Gary; Wilkins, Jesse L. M.; Hein, Serge F.
2016-01-01
The purpose of this study was to explore the degree of grain size of the attributes and the sample sizes that can support accurate parameter recovery with the General Diagnostic Model (GDM) for a large-scale international assessment. In this resampling study, bootstrap samples were obtained from the 2003 Grade 8 TIMSS in Mathematics at varying…
When Teachers Give Up: Teacher Burnout, Teacher Turnover and Their Impact on Children.
ERIC Educational Resources Information Center
Dworkin, Anthony Gary
A large-scale sociological study of teacher burnout in the public schools is summarized. Data presented in the study consist of: a sample of 3,500 teachers in Houston, whose attitudes were monitored in 1977; exit interviews of every teacher in the initial sample who subsequently quit teaching over a 5-year period; achievement and attendance…
Mwanyika, Gaspary; Call, Douglas R; Rugumisa, Benardether; Luanda, Catherine; Murutu, Rehema; Subbiah, Murugan; Buza, Joram
2016-09-01
Given the potential public health risks associated with a burgeoning goat meat industry in Tanzania, we estimated the load of Escherichia coli and the prevalence of antibiotic-resistant strains for goat meat by using a cross-sectional study design (June to July 2015). Five large (n = 60 samples) and five small (n = 64 samples) slaughterhouses were sampled over a period of four to six visits each. Meat rinsate was prepared and plated onto MacConkey agar, and presumptive E. coli colonies were enumerated and reported as CFU per milliliter of rinsate. In total, 2,736 presumptive E. coli isolates were tested for antibiotic drug sensitivity by using breakpoint assays against 11 medically important antibiotics. E. coli was recovered from almost all the samples (96.8%), with counts ranging from 2 to 4 log CFU ml⁻¹, and there was no significant difference (P = 0.43) in recovery according to facility size (average, 3.37 versus 3.13 log CFU ml⁻¹, large and small, respectively). Samples from large facilities had relatively higher prevalence (P = 0.026) of antibiotic-resistant E. coli compared with small facilities. This was mostly explained by more ampicillin (30.1 versus 12.8%) and amoxicillin (17.6 versus 4.5%) resistance for large versus small facilities, respectively, and more tetracycline resistance for small facilities (5.6 versus 10.6%, respectively). Large slaughter operations may serve as foci for dissemination of antibiotic-resistant bacteria via food products. More effective hygiene practices during slaughter and meat handling would limit the probability of transmitting antibiotic-resistant E. coli in goat meat.
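Prevalence comparisons of the kind reported above (e.g., ampicillin resistance at large versus small facilities) are typically tested with a two-proportion z-test. The isolate counts below are illustrative stand-ins chosen to mirror the reported percentages, not the actual counts from the study.

```python
# Two-proportion z-test with a pooled standard error (illustrative counts).
import math

def two_prop_z(x1, n1, x2, n2):
    """z statistic for H0: p1 == p2, using the pooled proportion."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical: 301/1000 resistant isolates at large facilities vs 128/1000
# at small facilities (mirroring the 30.1% vs 12.8% ampicillin figures).
z = two_prop_z(301, 1000, 128, 1000)
print("z =", round(z, 2))
```

A |z| above 1.96 corresponds to P < 0.05 under the normal approximation; with counts of this size the difference is overwhelmingly significant.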
Improved specimen adequacy using jumbo biopsy forceps in patients with Barrett's esophagus
Martinek, Jan; Maluskova, Jana; Stefanova, Magdalena; Tuckova, Inna; Suchanek, Stepan; Vackova, Zuzana; Krajciova, Jana; Kollar, Marek; Zavoral, Miroslav; Spicak, Julius
2015-01-01
AIM: To assess the sampling quality of four different forceps (three large capacity and one jumbo) in patients with Barrett’s esophagus. METHODS: This was a prospective, single-blind study. A total of 37 patients with Barrett’s esophagus were enrolled. Targeted or random biopsies with all four forceps were obtained from each patient using a diagnostic endoscope during a single endoscopy. The following forceps were tested: A: FB-220K disposable large capacity; B: BI01-D3-23 reusable large capacity; C: GBF-02-23-180 disposable large capacity; and jumbo: disposable Radial Jaw 4 jumbo. The primary outcome measurement was specimen adequacy, defined as a well-oriented biopsy sample 2 mm or greater with the presence of muscularis mucosa. RESULTS: A total of 436 biopsy samples were analyzed. We found a significantly higher proportion of adequate biopsy samples with jumbo forceps (71%) (P < 0.001 vs forceps A: 26%, forceps B: 17%, and forceps C: 18%). Biopsies with jumbo forceps had the largest diameter (median 2.4 mm) (P < 0.001 vs forceps A: 2 mm, forceps B: 1.6 mm, and forceps C: 2 mm). There was a trend for higher diagnostic yield per biopsy with jumbo forceps (forceps A: 0.20, forceps B: 0.22, forceps C: 0.27, and jumbo: 0.28). No complications related to specimen sampling were observed with any of the four tested forceps. CONCLUSION: Jumbo biopsy forceps, when used with a diagnostic endoscope, provide more adequate specimens as compared to large-capacity forceps in patients with Barrett’s esophagus. PMID:25954107
Lensfree diffractive tomography for the imaging of 3D cell cultures
NASA Astrophysics Data System (ADS)
Berdeu, Anthony; Momey, Fabien; Dinten, Jean-Marc; Gidrol, Xavier; Picollet-D'hahan, Nathalie; Allier, Cédric
2017-02-01
New microscopes are needed to help reach the full potential of 3D organoid culture studies by gathering large quantitative and systematic data over extended periods of time while preserving the integrity of the living sample. In order to reconstruct large volumes while preserving the ability to catch every single cell, we propose new imaging platforms based on lens-free microscopy, a technique that addresses these needs in the context of 2D cell culture by providing label-free and non-phototoxic acquisition of large datasets. We built lens-free diffractive tomography setups performing multi-angle acquisitions of 3D organoid cultures embedded in Matrigel and developed dedicated 3D holographic reconstruction algorithms based on the Fourier diffraction theorem. Nonetheless, holographic setups do not record the phase of the incident wave front, and the biological samples in Petri dishes strongly limit the angular coverage. These limitations introduce numerous artefacts in the sample reconstruction. We developed several methods to overcome them, such as multi-wavelength imaging or iterative phase retrieval. The most promising technique currently in development is based on a regularised inverse problem approach applied directly to the 3D volume to reconstruct. 3D reconstructions were performed on several complex samples, such as 3D networks or spheroids embedded in capsules, with large reconstructed volumes up to 25 mm³ while still being able to identify single cells. To our knowledge, this is the first time that such an inverse problem approach has been implemented in the context of lens-free diffractive tomography, enabling the reconstruction of large, fully 3D volumes of unstained biological samples.
Spatial considerations during cryopreservation of a large volume sample.
Kilbride, Peter; Lamb, Stephen; Milne, Stuart; Gibbons, Stephanie; Erro, Eloy; Bundy, James; Selden, Clare; Fuller, Barry; Morris, John
2016-08-01
There have been relatively few studies on the implications of the physical conditions experienced by cells during large volume (litres) cryopreservation - most studies have focused on the problem of cryopreservation of smaller volumes, typically up to 2 ml. This study explores the effects of ice growth by progressive solidification, generally seen during larger scale cryopreservation, on encapsulated liver hepatocyte spheroids, and it develops a method to reliably sample different regions across the frozen cores of samples experiencing progressive solidification. These issues are examined in the context of a Bioartificial Liver Device which requires cryopreservation of a 2 L volume in a strict cylindrical geometry for optimal clinical delivery. Progressive solidification cannot be avoided in this arrangement. In such a system optimal cryoprotectant concentrations and cooling rates are known. However, applying these parameters to a large volume is challenging due to the thermal mass and subsequent thermal lag, and the specific impact of this on the cryopreservation outcome needs to be determined. Under conditions of progressive solidification, the spatial location of Encapsulated Liver Spheroids had a strong impact on post-thaw recovery. Cells in areas first and last to solidify demonstrated significantly impaired post-thaw function, whereas areas solidifying through the majority of the process exhibited higher post-thaw outcome. It was also found that samples where the ice thawed more rapidly had greater viability 24 h post-thaw (75.7 ± 3.9% vs. 62.0 ± 7.2%). These findings have implications for the cryopreservation of large volumes with a rigid shape and for the cryopreservation of a Bioartificial Liver Device. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Hatemi, Peter K.; Medland, Sarah E.; Klemmensen, Robert; Oskarrson, Sven; Littvay, Levente; Dawes, Chris; Verhulst, Brad; McDermott, Rose; Nørgaard, Asbjørn Sonne; Klofstad, Casey; Christensen, Kaare; Johannesson, Magnus; Magnusson, Patrik K.E.; Eaves, Lindon J.; Martin, Nicholas G.
2014-01-01
Almost forty years ago, evidence from large studies of adult twins and their relatives suggested that between 30-60% of the variance in social and political attitudes could be explained by genetic influences. However, these findings have not been widely accepted or incorporated into the dominant paradigms that explain the etiology of political ideology. This has been attributed in part to measurement and sample limitations, as well as the relative absence of molecular genetic studies. Here we present results from original analyses of a combined sample of over 12,000 twin pairs, ascertained from nine different studies conducted in five democracies, sampled over the course of four decades. We provide evidence that genetic factors play a role in the formation of political ideology, regardless of how ideology is measured, the era, or the population sampled. The only exception is a question that explicitly uses the phrase “Left-Right”. We then present results from one of the first genome-wide association studies on political ideology using data from three samples: a 1990 Australian sample involving 6,894 individuals from 3,516 families; a 2008 Australian sample of 1,160 related individuals from 635 families and a 2010 Swedish sample involving 3,334 individuals from 2,607 families. No polymorphisms reached genome-wide significance in the meta-analysis. The combined evidence suggests that political ideology constitutes a fundamental aspect of one’s genetically informed psychological disposition, but as Fisher proposed long ago, genetic influences on complex traits will be composed of thousands of markers of very small effects and it will require extremely large samples to have enough power in order to identify specific polymorphisms related to complex social traits. PMID:24569950
Loughland, Carmel; Draganic, Daren; McCabe, Kathryn; Richards, Jacqueline; Nasir, Aslam; Allen, Joanne; Catts, Stanley; Jablensky, Assen; Henskens, Frans; Michie, Patricia; Mowry, Bryan; Pantelis, Christos; Schall, Ulrich; Scott, Rodney; Tooney, Paul; Carr, Vaughan
2010-11-01
This article describes the establishment of the Australian Schizophrenia Research Bank (ASRB), which operates to collect, store and distribute linked clinical, cognitive, neuroimaging and genetic data from a large sample of people with schizophrenia and healthy controls. Recruitment sources for the schizophrenia sample include a multi-media national advertising campaign, inpatient and community treatment services and non-government support agencies. Healthy controls have been recruited primarily through multi-media advertisements. All participants undergo an extensive diagnostic and family history assessment, neuropsychological evaluation, and blood sample donation for genetic studies. Selected individuals also complete structural MRI scans. Preliminary analyses of 493 schizophrenia cases and 293 healthy controls are reported. Mean age was 39.54 years (SD = 11.1) for the schizophrenia participants and 37.38 years (SD = 13.12) for healthy controls. Compared to the controls, features of the schizophrenia sample included a higher proportion of males (cases 65.9%; controls 46.8%), fewer living in married or de facto relationships (cases 16.1%; controls 53.6%) and fewer years of education (cases 13.05, SD = 2.84; controls 15.14, SD = 3.13), as well as lower current IQ (cases 102.68, SD = 15.51; controls 118.28, SD = 10.18). These and other sample characteristics are compared to those reported in another large Australian sample (i.e. the Low Prevalence Disorders Study), revealing some differences that reflect the different sampling methods of these two studies. The ASRB is a valuable and accessible schizophrenia research facility for use by approved scientific investigators. As recruitment continues, the approach to sampling for both cases and controls will need to be modified to ensure that the ASRB samples are as broadly representative as possible of all cases of schizophrenia and healthy controls.
Hatemi, Peter K; Medland, Sarah E; Klemmensen, Robert; Oskarsson, Sven; Littvay, Levente; Dawes, Christopher T; Verhulst, Brad; McDermott, Rose; Nørgaard, Asbjørn Sonne; Klofstad, Casey A; Christensen, Kaare; Johannesson, Magnus; Magnusson, Patrik K E; Eaves, Lindon J; Martin, Nicholas G
2014-05-01
Almost 40 years ago, evidence from large studies of adult twins and their relatives suggested that between 30 and 60% of the variance in social and political attitudes could be explained by genetic influences. However, these findings have not been widely accepted or incorporated into the dominant paradigms that explain the etiology of political ideology. This has been attributed in part to measurement and sample limitations, as well as the relative absence of molecular genetic studies. Here we present results from original analyses of a combined sample of over 12,000 twin pairs, ascertained from nine different studies conducted in five democracies, sampled over the course of four decades. We provide evidence that genetic factors play a role in the formation of political ideology, regardless of how ideology is measured, the era, or the population sampled. The only exception is a question that explicitly uses the phrase "Left-Right". We then present results from one of the first genome-wide association studies on political ideology using data from three samples: a 1990 Australian sample involving 6,894 individuals from 3,516 families; a 2008 Australian sample of 1,160 related individuals from 635 families; and a 2010 Swedish sample involving 3,334 individuals from 2,607 families. No polymorphisms reached genome-wide significance in the meta-analysis. The combined evidence suggests that political ideology constitutes a fundamental aspect of one's genetically informed psychological disposition, but as Fisher proposed long ago, genetic influences on complex traits will be composed of thousands of markers of very small effects, and it will require extremely large samples to have enough power to identify specific polymorphisms related to complex social traits.
2013-01-01
Background Stereotypic behaviours, i.e. repetitive behaviours induced by frustration, repeated attempts to cope and/or brain dysfunction, are intriguing as they occur in a variety of domestic and captive species without any clear adaptive function. Among the different hypotheses, the coping hypothesis predicts that stereotypic behaviours provide a way for animals in unfavourable environmental conditions to adjust. As such, stereotypic animals are expected to have a lower physiological stress level (glucocorticoids) than non-stereotypic animals. Attempts to link stereotypic behaviours with glucocorticoids, however, have yielded contradictory results. Here we investigated correlates of oral and motor stereotypic behaviours and glucocorticoid levels in two large samples of domestic horses (N = 55 in Study 1, N = 58 in Study 2), kept in sub-optimal conditions (e.g. confinement, social isolation) and already known to experience poor welfare states. Each horse was observed in its box using focal sampling (Study 1) and instantaneous scan sampling (Study 2). Plasma samples (collected in Study 1) and non-invasive faecal samples (collected in both studies) were retrieved in order to assess cortisol levels. Results Results showed that 1) plasma cortisol and faecal cortisol metabolite concentrations did not differ between horses displaying stereotypic behaviours and non-stereotypic horses, and 2) neither oral nor motor stereotypic behaviour levels predicted plasma cortisol or faecal cortisol metabolite concentrations. Conclusions Cortisol measures, collected in two large samples of horses using both plasma sampling and faecal sampling (the latter minimizing bias thanks to its non-invasive sampling procedure), therefore do not indicate that stereotypic horses cope better, at least in terms of adrenocortical activity. PMID:23289406
Evaluation of Existing Methods for Human Blood mRNA Isolation and Analysis for Large Studies
Meyer, Anke; Paroni, Federico; Günther, Kathrin; Dharmadhikari, Gitanjali; Ahrens, Wolfgang; Kelm, Sørge; Maedler, Kathrin
2016-01-01
Aims Prior to implementing gene expression analyses from blood in a larger cohort study, an evaluation to set up a reliable and reproducible method is mandatory but challenging due to the specific characteristics of the samples as well as their collection methods. In this pilot study we optimized a combination of blood sampling and RNA isolation methods and present reproducible gene expression results from human blood samples. Methods The established PAXgene™ blood collection method (Qiagen) was compared with the more recent Tempus™ collection and storing system. RNA from blood samples collected by both systems was extracted on columns with the corresponding Norgen and PAX RNA extraction kits. RNA quantity and quality were compared photometrically, with Ribogreen and by real-time PCR analyses of various reference genes (PPIA, β-ACTIN and TUBULIN) and exemplarily of SIGLEC-7. Results Combining different sampling methods and extraction kits caused strong variations in gene expression. The use of the PAXgene™ and Tempus™ collection systems resulted in RNA of good quality and quantity for the respective RNA isolation system. No large inter-donor variations could be detected for either system. However, it was not possible to extract sufficient RNA of good quality with the PAXgene™ RNA extraction system from samples collected in Tempus™ collection tubes. Comparing only the Norgen RNA extraction methods, RNA from blood collected either by the Tempus™ or the PAXgene™ collection system delivered sufficient amount and quality of RNA, but the Tempus™ collection delivered higher RNA concentration than the PAXgene™ collection system. The established PreAnalytiX PAXgene™ RNA extraction system together with the PAXgene™ blood collection system showed the lowest CT values, i.e. the highest RNA concentration of good quality. Expression levels of all tested genes were stable and reproducible.
Conclusions This study confirms that it is not possible to mix or change sampling or extraction strategies during the same study because of large variations of RNA yield and expression levels. PMID:27575051
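The CT-based comparison above relies on the usual qPCR relationship: with near-perfect doubling per cycle, the relative starting template scales as efficiency^(CT_reference - CT_sample), which is why lower CT values indicate higher RNA concentration. A minimal sketch (function name and values are illustrative, not from the study):

```python
def relative_amount(ct_sample, ct_reference, efficiency=2.0):
    """Relative starting template of a sample versus a reference,
    assuming the given per-cycle amplification efficiency
    (2.0 = perfect doubling per PCR cycle)."""
    return efficiency ** (ct_reference - ct_sample)

# A sample whose CT is 3 cycles lower holds ~8x more starting template
fold = relative_amount(ct_sample=22.0, ct_reference=25.0)
print(fold)  # 8.0
```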
Next-generation sequencing provides unprecedented access to genomic information in archival FFPE tissue samples. However, costs and technical challenges related to RNA isolation and enrichment limit use of whole-genome RNA-sequencing for large-scale studies of FFPE specimens. Rec...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iwamoto, A.; Mito, T.; Takahata, K.
Heat transfer of large copper plates (18 x 76 mm) in liquid helium has been measured as a function of orientation and treatment of the heat transfer surface. The results relate to applications of large scale superconductors. In order to clarify the influence of the area where the surface treatment peels off, the authors studied six types of heat transfer surfaces: (a) a 100% polished copper sample, (b) and (c) two 50% oxidized copper samples having different patterns of oxidation, (d) a 75% oxidized copper sample, (e) a 90% oxidized copper sample, and (f) a 100% oxidized copper sample. They observed that the critical heat flux depends on the heat transfer surface orientation. The critical heat flux is a maximum at angles of 0° to 30° and decreases monotonically with increasing angles above 30°, where the angle is taken in reference to the horizontal axis. On the other hand, the minimum heat flux is less dependent on the surface orientation. More than 75% oxidation on the surface makes the critical heat flux increase. The minimum heat fluxes of the 50 and 90% oxidized Cu samples approximately agree with that of the 100% oxidized Cu sample. Experiments and calculations show that the critical and the minimum heat fluxes are a bilinear function of the fraction of oxidized surface area.
Estimating accuracy of land-cover composition from two-stage cluster sampling
Stehman, S.V.; Wickham, J.D.; Fattorini, L.; Wade, T.D.; Baffetta, F.; Smith, J.H.
2009-01-01
Land-cover maps are often used to compute land-cover composition (i.e., the proportion or percent of area covered by each class), for each unit in a spatial partition of the region mapped. We derive design-based estimators of mean deviation (MD), mean absolute deviation (MAD), root mean square error (RMSE), and correlation (CORR) to quantify accuracy of land-cover composition for a general two-stage cluster sampling design, and for the special case of simple random sampling without replacement (SRSWOR) at each stage. The bias of the estimators for the two-stage SRSWOR design is evaluated via a simulation study. The estimators of RMSE and CORR have small bias except when sample size is small and the land-cover class is rare. The estimator of MAD is biased for both rare and common land-cover classes except when sample size is large. A general recommendation is that rare land-cover classes require large sample sizes to ensure that the accuracy estimators have small bias. © 2009 Elsevier Inc.
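For the simplest case of a simple random sample of units, the four accuracy measures reduce to familiar formulas; the sketch below computes them for mapped versus reference proportions of a single class (a toy illustration with invented numbers, not the paper's design-based two-stage estimators):

```python
import math

def composition_accuracy(map_props, ref_props):
    """Toy accuracy measures for land-cover composition over sampled
    units: mean deviation (MD), mean absolute deviation (MAD), root
    mean square error (RMSE) and correlation (CORR) between mapped
    and reference class proportions."""
    n = len(map_props)
    devs = [m - r for m, r in zip(map_props, ref_props)]
    md = sum(devs) / n
    mad = sum(abs(d) for d in devs) / n
    rmse = math.sqrt(sum(d * d for d in devs) / n)
    mean_m = sum(map_props) / n
    mean_r = sum(ref_props) / n
    cov = sum((m - mean_m) * (r - mean_r)
              for m, r in zip(map_props, ref_props))
    var_m = sum((m - mean_m) ** 2 for m in map_props)
    var_r = sum((r - mean_r) ** 2 for r in ref_props)
    corr = cov / math.sqrt(var_m * var_r)
    return md, mad, rmse, corr

# Mapped vs. reference proportions of one class in four sampled units
md, mad, rmse, corr = composition_accuracy([0.2, 0.5, 0.7, 0.4],
                                           [0.25, 0.45, 0.75, 0.35])
```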
Using Facebook to recruit young adult veterans: online mental health research.
Pedersen, Eric R; Helmuth, Eric D; Marshall, Grant N; Schell, Terry L; PunKay, Marc; Kurz, Jeremy
2015-06-01
Veteran research has primarily been conducted with clinical samples and those already involved in health care systems, but much is to be learned about veterans in the community. Facebook is a novel yet largely unexplored avenue for recruiting veteran participants for epidemiological and clinical studies. In this study, we utilized Facebook to recruit a sample of young adult veterans for the first phase of an online alcohol intervention study. We describe the successful Facebook recruitment process, including data collection from over 1000 veteran participants in approximately 3 weeks, procedures to verify participation eligibility, and comparison of our sample with nationally available norms. Participants were young adult veterans aged 18-34 recruited through Facebook as part of a large study to document normative drinking behavior among a large community sample of veterans. Facebook ads were targeted toward young veterans to collect information on demographics and military characteristics, health behaviors, mental health, and health care utilization. We obtained a sample of 1023 verified veteran participants over a period of 24 days at an advertising price of approximately US $7.05 per verified veteran participant. Our recruitment strategy yielded a sample similar to the US population of young adult veterans in most demographic areas except for race/ethnicity and previous branch of service; weighting the sample on race/ethnicity and branch produced a closer match to the population data. The Facebook sample recruited veterans who were engaged in a variety of risky health behaviors such as binge drinking and marijuana use. One fourth of veterans had never been to an appointment for physical health care since discharge, and about half had attended an appointment for service compensation review. Only half had attended any appointment for a mental health concern at any clinic or hospital.
Despite more than half screening positive for current probable mental health disorders such as post-traumatic stress disorder, depression, and anxiety, only about 1 in 3 received mental health care in the past year and only 1 in 50 received such care within the past month. This work expands on studies that have examined clinical samples of veterans only and suggests Facebook can be an adequate method of obtaining samples of veterans in need of care. Clinicaltrials.gov NCT02187887; http://clinicaltrials.gov/ct2/show/NCT02187887 (Archived by WebCite at http://www.webcitation.org/6YiUKRsXY).
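The weighting step mentioned above is standard post-stratification: each stratum receives a weight so that the weighted sample matches known population shares. A generic sketch (invented numbers; not the authors' exact procedure or strata):

```python
def poststratification_weights(sample_counts, population_props):
    """Generic post-stratification: weight each stratum so the
    weighted sample matches known population proportions. Returns a
    per-person weight for each stratum. (Illustrative only; the
    study's actual weighting variables were race/ethnicity and
    service branch.)"""
    n = sum(sample_counts.values())
    return {s: (population_props[s] * n) / sample_counts[s]
            for s in sample_counts}

# Toy example: the sample over-represents stratum "A"
weights = poststratification_weights({"A": 600, "B": 400},
                                     {"A": 0.5, "B": 0.5})
weighted_A = weights["A"] * 600  # weighted stratum sizes now match
weighted_B = weights["B"] * 400  # the population shares (500 each)
```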
High-Throughput Analysis and Automation for Glycomics Studies.
Shubhakar, Archana; Reiding, Karli R; Gardner, Richard A; Spencer, Daniel I R; Fernandes, Daryl L; Wuhrer, Manfred
This review covers advances in analytical technologies for high-throughput (HTP) glycomics. Our focus is on structural studies of glycoprotein glycosylation to support biopharmaceutical realization and the discovery of glycan biomarkers for human disease. For biopharmaceuticals, there is increasing use of glycomics in Quality by Design studies to help optimize glycan profiles of drugs with a view to improving their clinical performance. Glycomics is also used in comparability studies to ensure consistency of glycosylation both throughout product development and between biosimilars and innovator drugs. In clinical studies there is also expanding interest in the use of glycomics (for example, in Genome-Wide Association Studies) to follow changes in glycosylation patterns of biological tissues and fluids with the progress of certain diseases, including cancers, neurodegenerative disorders and inflammatory conditions. Despite rising activity in this field, there are significant challenges in performing large-scale glycomics studies. The requirement is accurate identification and quantitation of individual glycan structures. However, glycoconjugate samples are often very complex and heterogeneous and contain many diverse branched glycan structures. In this article we cover HTP sample preparation and derivatization methods, sample purification, robotization, optimized glycan profiling by UHPLC, MS and multiplexed CE, as well as hyphenated techniques and automated data analysis tools. Throughout, we summarize the advantages of and challenges with each of these technologies. The issues considered include reliability of the methods for glycan identification and quantitation, sample throughput, labor intensity, and affordability for large sample numbers.
Guerra Valero, Yarmarly C; Wallis, Steven C; Lipman, Jeffrey; Stove, Christophe; Roberts, Jason A; Parker, Suzanne L
2018-03-01
Conventional sampling techniques for clinical pharmacokinetic studies often require the removal of large blood volumes from patients. This can result in a physiological or emotional burden, particularly for neonates or pediatric patients. Antibiotic pharmacokinetic studies are typically performed on healthy adults or general ward patients. These may not account for alterations to a patient's pathophysiology and can lead to suboptimal treatment. Microsampling offers an important opportunity for clinical pharmacokinetic studies in vulnerable patient populations, where smaller sample volumes can be collected. This systematic review provides a description of currently available microsampling techniques and an overview of studies reporting the quantitation and validation of antibiotics using microsampling. A comparison of microsampling to conventional sampling in clinical studies is included.
Cuffney, T.F.; Gurtz, M.E.; Meador, M.R.
1993-01-01
Benthic invertebrate samples are collected as part of the U.S. Geological Survey's National Water-Quality Assessment Program. This is a perennial, multidisciplinary program that integrates biological, physical, and chemical indicators of water quality to evaluate status and trends and to develop an understanding of the factors controlling observed water quality. The Program examines water quality in 60 study units (coupled ground- and surface-water systems) that encompass most of the conterminous United States and parts of Alaska and Hawaii. Study-unit teams collect and process qualitative and semi-quantitative invertebrate samples according to standardized procedures. These samples are processed (elutriated and subsampled) in the field to produce as many as four sample components: large-rare, main-body, elutriate, and split. Each sample component is preserved in 10-percent formalin, and two components, large-rare and main-body, are sent to contract laboratories for further processing. The large-rare component is composed of large invertebrates that are removed from the sample matrix during field processing and placed in one or more containers. The main-body sample component consists of the remaining sample materials (sediment, detritus, and invertebrates) and is subsampled in the field to achieve a volume of 750 milliliters or less. The remaining two sample components, elutriate and split, are used for quality-assurance and quality-control purposes. Contract laboratories are used to identify and quantify invertebrates from the large-rare and main-body sample components according to the procedures and guidelines specified within this document. These guidelines allow the use of subsampling techniques to reduce the volume of sample material processed and to facilitate identifications. These processing procedures and techniques may be modified if the modifications provide equal or greater levels of accuracy and precision. 
The intent of sample processing is to determine the quantity of each taxon present in the semi-quantitative samples or to list the taxa present in qualitative samples. The processing guidelines provide standardized laboratory forms, sample labels, detailed sample processing flow charts, standardized format for electronic data, quality-assurance procedures and checks, sample tracking standards, and target levels for taxonomic determinations. The contract laboratory (1) is responsible for identifications and quantifications, (2) constructs reference collections, (3) provides data in hard copy and electronic forms, (4) follows specified quality-assurance and quality-control procedures, and (5) returns all processed and unprocessed portions of the samples. The U.S. Geological Survey's Quality Management Group maintains a Biological Quality-Assurance Unit, located at the National Water-Quality Laboratory, Arvada, Colorado, to oversee the use of contract laboratories and ensure the quality of data obtained from these laboratories according to the guidelines established in this document. This unit establishes contract specifications, reviews contractor performance (timeliness, accuracy, and consistency), enters data into the National Water Information System-II data base, maintains in-house reference collections, deposits voucher specimens in outside museums, and interacts with taxonomic experts within and outside the U.S. Geological Survey. This unit also modifies the existing sample processing and quality-assurance guidelines, establishes criteria and testing procedures for qualifying potential contract laboratories, identifies qualified taxonomic experts, and establishes voucher collections.
Microbial community analysis using MEGAN.
Huson, Daniel H; Weber, Nico
2013-01-01
Metagenomics, the study of microbes in the environment using DNA sequencing, depends upon dedicated software tools for processing and analyzing very large sequencing datasets. One such tool is MEGAN (MEtaGenome ANalyzer), which can be used to interactively analyze and compare metagenomic and metatranscriptomic data, both taxonomically and functionally. To perform a taxonomic analysis, the program places the reads onto the NCBI taxonomy, while functional analysis is performed by mapping reads to the SEED, COG, and KEGG classifications. Samples can be compared taxonomically and functionally, using a wide range of different charting and visualization techniques. PCoA analysis and clustering methods allow high-level comparison of large numbers of samples. Different attributes of the samples can be captured and used within analysis. The program supports various input formats for loading data and can export analysis results in different text-based and graphical formats. The program is designed to work with very large samples containing many millions of reads. It is written in Java and installers for the three major computer operating systems are available from http://www-ab.informatik.uni-tuebingen.de. © 2013 Elsevier Inc. All rights reserved.
Impact of spatial variability and sampling design on model performance
NASA Astrophysics Data System (ADS)
Schrape, Charlotte; Schneider, Anne-Kathrin; Schröder, Boris; van Schaik, Loes
2017-04-01
Many environmental physical and chemical parameters, as well as species distributions, display spatial variability at different scales. When measurements are costly in labour, time, or money, a choice has to be made between high sampling resolution at small scales with low spatial cover of the study area, and lower small-scale sampling resolution, which brings local data uncertainties but better spatial cover of the whole area. This dilemma is often faced in the design of field sampling campaigns for large-scale studies. When the gathered field data are subsequently used for modelling purposes, the choice of sampling design and the resulting data quality influence the model performance criteria. We studied this influence with a virtual model study based on a large dataset of field information on the spatial variation of earthworms at different scales. To this end, we built a virtual map of anecic earthworm distributions over the Weiherbach catchment (Baden-Württemberg, Germany). First, the field-scale abundance of earthworms was estimated using a catchment-scale model based on 65 field measurements. Subsequently, the high small-scale variability was added using semi-variograms, based on five fields with a total of 430 measurements divided in a spatially nested sampling design over these fields, to estimate the nugget, range and standard deviation of measurements within the fields. With the produced maps, we performed virtual samplings of one up to 50 random points per field. We then used these data to rebuild the catchment-scale models of anecic earthworm abundance with the same model parameters as in the work by Palm et al. (2013). The results show clearly that a large part of the unexplained deviance of the models is due to the very high small-scale variability in earthworm abundance: models based on single virtual sampling points on average obtain an explained deviance of 0.20 and a correlation coefficient of 0.64.
With increasing numbers of sampling points per field, we averaged the measured abundances within each field to obtain a more representative value of the field average. Doubling the sampling points per field strongly improved the model performance criteria (explained deviance 0.38 and correlation coefficient 0.73). With 50 sampling points per field the performance criteria were 0.91 and 0.97, respectively, for explained deviance and correlation coefficient. The relationship between the number of samplings and the performance criteria can be described with a saturation curve; beyond five samples per field the model improvement becomes rather small. With this contribution we wish to discuss the impact of data variability at the sampling scale on model performance and the implications for sampling design, the assessment of model results, and ecological inferences.
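The saturation behaviour described above follows from simple error averaging: the variance of a field mean estimated from k point samples shrinks as 1/k, so model performance rises quickly at first and then levels off. A minimal simulation of this effect (invented numbers, not the study's earthworm data):

```python
import random

random.seed(1)

def corr(xs, ys):
    """Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# 50 "fields", each with a true abundance observed through noisy point
# samples (noise SD larger than the between-field SD, mimicking high
# small-scale variability)
true_vals = [random.gauss(10, 2) for _ in range(50)]

def observed(k):
    """Average of k noisy point samples per field."""
    return [sum(t + random.gauss(0, 4) for _ in range(k)) / k
            for t in true_vals]

r1 = corr(true_vals, observed(1))    # one point sample per field
r10 = corr(true_vals, observed(10))  # ten point samples per field
# r10 exceeds r1: averaging within-field samples recovers the
# field-scale signal, with diminishing returns as k grows
```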
Some limit theorems for ratios of order statistics from uniform random variables.
Xu, Shou-Fang; Miao, Yu
2017-01-01
In this paper, we study the ratios of order statistics based on samples drawn from uniform distribution and establish some limit properties such as the almost sure central limit theorem, the large deviation principle, the Marcinkiewicz-Zygmund law of large numbers and complete convergence.
Current bioassessment efforts are focused on small wadeable streams, at least partly because assessing ecological conditions in non-wadeable large rivers poses many additional challenges. In this study, we sampled 20 sites in each of seven large rivers in the Pacific Northwest, U...
Determining cereal starch amylose content using a dual wavelength iodine binding 96 well plate assay
USDA-ARS?s Scientific Manuscript database
Cereal starch amylose/amylopectin (AM/AP) ratios are critical to functional properties for food and industrial applications. Conventional determination of AM/AP ratios of cereal starches is very time consuming and labor intensive, making it very difficult to screen large sample sets. Studying these large...
Large Verbal--Non-Verbal Ability Differences and Underachievement.
ERIC Educational Resources Information Center
Whittington, Joyce
1988-01-01
Describes study conducted in England, Scotland, and Wales based on a national sample of 11-year-olds that investigated the relationship between large verbal and non-verbal differences in ability and underachievement in mathematics and reading. Sex differences are also examined and further research needs are suggested. (14 references) (LRW)
Gyarmathy, V Anna; Johnston, Lisa G; Caplinskiene, Irma; Caplinskas, Saulius; Latkin, Carl A
2014-02-01
Respondent driven sampling (RDS) and incentivized snowball sampling (ISS) are two sampling methods that are commonly used to reach people who inject drugs (PWID). We generated a set of simulated RDS samples on an actual sociometric ISS sample of PWID in Vilnius, Lithuania ("original sample") to assess if the simulated RDS estimates were statistically significantly different from the original ISS sample prevalences for HIV (9.8%), Hepatitis A (43.6%), Hepatitis B (Anti-HBc 43.9% and HBsAg 3.4%), Hepatitis C (87.5%), syphilis (6.8%) and Chlamydia (8.8%) infections and for selected behavioral risk characteristics. The original sample consisted of a large component of 249 people (83% of the sample) and 13 smaller components with 1-12 individuals. Generally, as long as all seeds were recruited from the large component of the original sample, the simulation samples simply recreated the large component. There were no significant differences between the large component and the entire original sample for the characteristics of interest. Altogether 99.2% of 360 simulation sample point estimates were within the confidence interval of the original prevalence values for the characteristics of interest. When population characteristics are reflected in large network components that dominate the population, RDS and ISS may produce samples that have statistically non-different prevalence values, even though some isolated network components may be under-sampled and/or statistically significantly different from the main groups. This so-called "strudel effect" is discussed in the paper. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
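The mechanism behind the reported behaviour, that chains seeded in the dominant component simply recreate that component, can be sketched with a toy recruitment simulation (the network, seeds, and recruitment rule below are invented simplifications of RDS):

```python
# Toy network: one large connected component (a ring of 100 nodes)
# plus five isolated pairs that recruitment chains can never reach.
adj = {i: [(i - 1) % 100, (i + 1) % 100] for i in range(100)}
for k in range(5):
    a, b = 100 + 2 * k, 101 + 2 * k
    adj[a], adj[b] = [b], [a]

def rds_sample(seeds, size):
    """Highly simplified RDS: each recruit passes coupons to its
    contacts; already-recruited contacts are skipped. Real RDS adds
    coupon limits per wave and inclusion-probability weighting."""
    sampled, frontier = set(seeds), list(seeds)
    while frontier and len(sampled) < size:
        node = frontier.pop(0)
        for nxt in adj[node]:
            if nxt not in sampled and len(sampled) < size:
                sampled.add(nxt)
                frontier.append(nxt)
    return sampled

sample = rds_sample(seeds=[0, 25, 50], size=60)
# Every recruit stays inside the large component; the isolated pairs
# are never reached (the "strudel effect")
outside = [n for n in sample if n >= 100]
```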
A Bifactor Model of Negative Affectivity: Fear and Distress Components among Younger and Older Youth
ERIC Educational Resources Information Center
Ebesutani, Chad; Smith, Ashley; Bernstein, Adam; Chorpita, Bruce F.; Higa-McMillan, Charmaine; Nakamura, Brad
2011-01-01
The Positive and Negative Affect Schedule for Children (PANAS-C) is a 27-item youth-report measure of positive affectivity and negative affectivity. Using 2 large school-age youth samples (clinic-referred sample: N = 662; school-based sample: N = 911), in the present study, we thoroughly examined the structure of the PANAS-C NA and PA scales and…
ERIC Educational Resources Information Center
Zuckerman, Katharine E.; Hill, Alison P.; Guion, Kimberly; Voltolina, Lisa; Fombonne, Eric
2014-01-01
Autism Spectrum Disorders (ASDs) and childhood obesity (OBY) are rising public health concerns. This study aimed to evaluate the prevalence of overweight (OWT) and OBY in a sample of 376 Oregon children with ASD, and to assess correlates of OWT and OBY in this sample. We used descriptive statistics, bivariate, and focused multivariate analyses to…
MetaSRA: normalized human sample-specific metadata for the Sequence Read Archive.
Bernstein, Matthew N; Doan, AnHai; Dewey, Colin N
2017-09-15
The NCBI's Sequence Read Archive (SRA) promises great biological insight if one could analyze the data in the aggregate; however, the data remain largely underutilized, in part, due to the poor structure of the metadata associated with each sample. The rules governing submissions to the SRA do not dictate a standardized set of terms that should be used to describe the biological samples from which the sequencing data are derived. As a result, the metadata include many synonyms, spelling variants and references to outside sources of information. Furthermore, manual annotation of the data remains intractable due to the large number of samples in the archive. For these reasons, it has been difficult to perform large-scale analyses that study the relationships between biomolecular processes and phenotype across diverse diseases, tissues and cell types present in the SRA. We present MetaSRA, a database of normalized SRA human sample-specific metadata following a schema inspired by the metadata organization of the ENCODE project. This schema involves mapping samples to terms in biomedical ontologies, labeling each sample with a sample-type category, and extracting real-valued properties. We automated these tasks via a novel computational pipeline. The MetaSRA is available at metasra.biostat.wisc.edu via both a searchable web interface and bulk downloads. Software implementing our computational pipeline is available at http://github.com/deweylab/metasra-pipeline. cdewey@biostat.wisc.edu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
Jjunju, Fred P M; Maher, Simon; Damon, Deidre E; Barrett, Richard M; Syed, S U; Heeren, Ron M A; Taylor, Stephen; Badu-Tawiah, Abraham K
2016-01-19
Direct analysis and identification of the long-chain aliphatic primary diamine Duomeen O (n-oleyl-1,3-diaminopropane), a corrosion inhibitor, in raw water samples taken from a large medium-pressure water tube boiler plant has been demonstrated for the first time at low LODs (<0.1 pg), without any sample preparation, using paper spray mass spectrometry (PS-MS). The presence of Duomeen O in water samples was confirmed via tandem mass spectrometry using collision-induced dissociation and supported by exact mass measurement and reactive paper spray experiments using an LTQ Orbitrap Exactive instrument. The data shown herein indicate that paper spray ambient ionization can be readily used as a rapid and robust method for in situ direct analysis of polyamine corrosion inhibitors in an industrial water boiler plant and other related samples in the water treatment industry. This approach was applied to the analysis of three complex water samples, including feedwater, condensate water, and boiler water, all collected from large medium-pressure (MP) water tube boiler plants known to be dosed with varying amounts of polyamine and amine corrosion inhibitor components. Polyamine chemistry is widely used, for example, in large high-pressure (HP) boilers operating in municipal waste and recycling facilities to prevent corrosion of metals. The samples used in this study are from such a facility, a waste treatment plant in Coventry, U.K., which has 3 × 40 tonne/hour boilers operating at 17.5 bar.
NASA Astrophysics Data System (ADS)
Li, Bao-Ping; Zhao, Jian-Xin; Greig, Alan; Collerson, Kenneth D.; Zhuo, Zhen-Xi; Feng, Yue-Xin
2005-11-01
We compare the trace element and Sr isotopic compositions of stoneware bodies made in Yaozhou and Jizhou to characterise these Chinese archaeological ceramics and examine the potential of Sr isotopes in provenance studies. Element concentrations determined by ICP-MS achieved distinct characterisation of the Jizhou samples due to their restricted variation, yet had limited success with the Yaozhou wares because of their large variability. In contrast, 87Sr/86Sr ratios in the Yaozhou samples show very small variation and are all significantly lower than those of the Jizhou samples, which show large variation and cannot be well characterised with Sr isotopes. Geochemical interpretation reveals that 87Sr/86Sr ratios will have greater potential to characterise ceramics made of low Rb/Sr materials such as kaolin clay, yet will show larger variations in ceramics made of high Rb/Sr materials such as porcelain stone.
Advantages and challenges in automated apatite fission track counting
NASA Astrophysics Data System (ADS)
Enkelmann, E.; Ehlers, T. A.
2012-04-01
Fission track thermochronometer data are often a core element of modern tectonic and denudation studies. Soon after the development of the fission track method, interest emerged in developing an automated counting procedure to replace the time-consuming labor of counting fission tracks under the microscope. Automated track counting became feasible in recent years with increasing improvements in computer software and hardware. One such example used in this study is the commercial automated fission track counting procedure from Autoscan Systems Pty, which has been highlighted through several venues. We conducted experiments designed to reliably and consistently test the ability of this fully automated counting system to recognize fission tracks in apatite and a muscovite external detector. Fission tracks were analyzed in samples with a step-wise increase in sample complexity. The first set of experiments used a large (mm-size) slice of Durango apatite cut parallel to the prism plane. Second, samples with 80-200 μm large apatite grains of Fish Canyon Tuff were analyzed. This second sample set is characterized by complexities often found in apatites in different rock types. In addition to the automated counting procedure, the same samples were also analyzed using conventional counting procedures. We found for all samples that the fully automated fission track counting procedure using the Autoscan System yields a larger scatter in the measured fission track densities compared to conventional (manual) track counting. This scatter typically resulted from the false identification of tracks due to surface and mineralogical defects, regardless of the image filtering procedure used. Large differences between track densities analyzed with the automated counting persisted between different grains analyzed in one sample as well as between different samples.
As a result of these differences a manual correction of the fully automated fission track counts is necessary for each individual surface area and grain counted. This manual correction procedure significantly increases (up to four times) the time required to analyze a sample with the automated counting procedure compared to the conventional approach.
Foster, Gregory D.; Gates, Paul M.; Foreman, William T.; McKenzie, Stuart W.; Rinella, Frank A.
1993-01-01
Concentrations of pesticides in the dissolved phase of surface-water samples from the Yakima River basin, WA, were determined using preconcentration in the Goulden large-sample extractor (GLSE) and gas chromatography/mass spectrometry (GC/MS) analysis. Sample volumes ranging from 10 to 120 L were processed with the GLSE, and the results from the large-sample analyses were compared to those derived from 1-L continuous liquid-liquid extractions. Few of the 40 target pesticides were detected in 1-L samples, whereas large-sample preconcentration in the GLSE provided detectable levels for many of the target pesticides. The number of pesticides detected in GLSE-processed samples was usually directly proportional to sample volume, although the measured concentrations of the pesticides were generally lower at the larger sample volumes for the same water source. The GLSE can be used to provide lower detection levels relative to conventional liquid-liquid extraction in GC/MS analysis of pesticides in samples of surface water.
Participation in Learning and Depressive Symptoms
ERIC Educational Resources Information Center
Jenkins, Andrew
2012-01-01
This paper reports the findings of research on relationships between depression and participation in learning using data from a large sample of older adults. The objective was to establish whether learning can reduce the risk of depression. Data were obtained from the English Longitudinal Study of Ageing, a nationally-representative sample of…
An Astronomical Survey Conducted in Belgium
ERIC Educational Resources Information Center
Nazé, Yaël; Fantaine, Sébastien
2014-01-01
This paper presents the results of the first survey conducted in Belgium about the interest in and knowledge of astronomy. Two samples were studied, the public at large (667 questionnaires) and students (2589 questionnaires), but the results are generally similar in both samples. We evaluated people's interest, main information source and…
How College Students Conceptualize and Practice Responsible Drinking
ERIC Educational Resources Information Center
Barry, Adam E.; Goodson, Patricia
2011-01-01
Objective: This study sought to employ a mixed-methods approach to (a) qualitatively explore responsible drinking beliefs and behaviors among a sample of college students, and (b) quantitatively assess the prevalence of those behaviors. Participants: Convenience samples, drawn from currently enrolled students attending a large public university in…
ASSESSMENT OF LARGE RIVER MACROINVERTEBRATE ASSEMBLAGES: HOW FAR IS ENOUGH?
During the summer of 2001, twelve sites were sampled for macroinvertebrates, six each on the Great Miami and Kentucky Rivers. Sites were chosen in each river from those sampled in the 1999 methods comparison study to reflect a disturbance gradient. At each site, a total distanc...
Processing and analyzing solid waste samples from large and costly sampling events in a timely manner is often difficult. As part of a Cooperative Research and Development Agreement (CRADA), the U.S. EPA and Waste Management Inc. (WMI) are investigating the conversion of landfill...
INFORMATION MANAGEMENT AND RELATED QUALITY ASSURANCE FOR A LARGE SCALE, MULTI-SITE RESEARCH PROJECT
During the summer of 2000, as part of a U.S. Environmental Protection Agency study designed to improve microbial water quality monitoring protocols at public beaches, over 11,000 water samples were collected at five selected beaches across the country. At each beach, samples wer...
Control Variate Estimators of Survivor Growth from Point Samples
Francis A. Roesch; Paul C. van Deusen
1993-01-01
Two estimators of the control variate type for survivor growth from remeasured point samples are proposed and compared with more familiar estimators. The large reductions in variance, observed in many cases for estimators constructed with control variates, are also realized in this application. A simulation study yielded consistent reductions in variance which were often...
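The variance-reduction mechanism the abstract relies on can be illustrated with a minimal, generic Monte Carlo sketch (not the authors' survivor-growth estimators; the toy model and all names below are illustrative): an auxiliary variable X with known mean serves as the control for estimating E[Y].

```python
import random
import statistics

def control_variate_estimate(ys, xs, mu_x):
    """Estimate E[Y] using X as a control variate with known mean mu_x."""
    n = len(ys)
    mean_y = statistics.fmean(ys)
    mean_x = statistics.fmean(xs)
    # Optimal coefficient: beta = Cov(Y, X) / Var(X), estimated from the sample.
    cov_yx = sum((y - mean_y) * (x - mean_x) for y, x in zip(ys, xs)) / (n - 1)
    beta = cov_yx / statistics.variance(xs)
    # The correction term has expectation zero, so the estimator stays
    # (asymptotically) unbiased while correlated noise is subtracted out.
    return mean_y - beta * (mean_x - mu_x)

rng = random.Random(42)
xs = [rng.random() for _ in range(10_000)]        # X ~ Uniform(0,1), E[X] = 0.5
ys = [2.0 * x + rng.gauss(0.0, 0.1) for x in xs]  # true E[Y] = 1.0

naive = statistics.fmean(ys)
cv = control_variate_estimate(ys, xs, mu_x=0.5)
print(f"naive: {naive:.4f}  control-variate: {cv:.4f}")
```

Because Y is strongly correlated with X, subtracting beta·(mean(X) − E[X]) removes most of the sampling noise; across repeated runs the control-variate estimate clusters far more tightly around the true mean than the naive sample mean does.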
Evaluation of residual uranium contamination in the dirt floor of an abandoned metal rolling mill.
Glassford, Eric; Spitz, Henry; Lobaugh, Megan; Spitler, Grant; Succop, Paul; Rice, Carol
2013-02-01
A single, large, bulk sample of uranium-contaminated material from the dirt floor of an abandoned metal rolling mill was separated into different types and sizes of aliquots to simulate samples that would be collected during site remediation. The facility rolled approximately 11,000 tons of hot-forged ingots of uranium metal approximately 60 y ago, and it has not been used since that time. Thirty small-mass (≈0.7 g) and 15 large-mass (≈70 g) samples were prepared from the heterogeneously contaminated bulk material to determine how measurements of the uranium contamination vary with sample size. Aliquots of bulk material were also resuspended in an exposure chamber to produce six samples of respirable particles that were obtained using a cascade impactor. Samples of removable surface contamination were collected by wiping 100 cm² of the interior surfaces of the exposure chamber with 47-mm-diameter fiber filters. Uranium contamination in each of the samples was measured directly using high-resolution gamma ray spectrometry. As expected, results for isotopic uranium (i.e., 235U and 238U) measured with the large-mass and small-mass samples are significantly different (p < 0.001), and the coefficient of variation (COV) for the small-mass samples was greater than for the large-mass samples. The uranium isotopic concentrations measured in the air and on the wipe samples were not significantly different and were also not significantly different (p > 0.05) from results for the large- or small-mass samples. Large-mass samples are more reliable for characterizing heterogeneously distributed radiological contamination than small-mass samples, since they exhibit the least variation compared to the mean. Thus, samples should be sufficiently large in mass to ensure that the results are truly representative of the heterogeneously distributed uranium contamination present at the facility.
Monitoring exposure of workers and the public as a result of uranium contamination resuspended during site remediation should be evaluated using samples of sufficient size and type to accommodate the heterogeneous distribution of uranium in the bulk material.
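The abstract's point about small- versus large-mass aliquots — a larger sample averages over spatial heterogeneity and so shows a smaller coefficient of variation — can be sketched with a toy simulation (hypothetical lognormal activities, not the study's data):

```python
import random
import statistics

def cov_percent(values):
    """Coefficient of variation (COV): standard deviation as a % of the mean."""
    return 100.0 * statistics.stdev(values) / statistics.fmean(values)

rng = random.Random(1)

# Toy model of heterogeneous contamination: the activity of a ~0.7 g aliquot is
# a single lognormal draw, while a ~70 g aliquot effectively averages ~100 such
# sub-aliquots. Sample counts (30 small, 15 large) mirror the abstract.
small_mass = [rng.lognormvariate(0.0, 0.8) for _ in range(30)]
large_mass = [statistics.fmean(rng.lognormvariate(0.0, 0.8) for _ in range(100))
              for _ in range(15)]

print(f"small-mass COV: {cov_percent(small_mass):.0f}%")
print(f"large-mass COV: {cov_percent(large_mass):.0f}%")
```

Averaging n independent sub-aliquots divides the standard deviation of the mean by roughly √n, which is exactly why the large-mass samples track the bulk mean so much more tightly.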
Videvall, Elin; Strandh, Maria; Engelbrecht, Anel; Cloete, Schalk; Cornwallis, Charlie K
2017-01-01
The gut microbiome of animals is emerging as an important factor influencing ecological and evolutionary processes. A major bottleneck in obtaining microbiome data from large numbers of samples is the time-consuming laboratory procedures required, specifically the isolation of DNA and generation of amplicon libraries. Recently, direct PCR kits have been developed that circumvent conventional DNA extraction steps, thereby streamlining the laboratory process by reducing preparation time and costs. However, the reliability and efficacy of direct PCR for measuring host microbiomes have not yet been investigated other than in humans with 454 sequencing. Here, we conduct a comprehensive evaluation of the microbial communities obtained with direct PCR and the widely used Mo Bio PowerSoil DNA extraction kit in five distinct gut sample types (ileum, cecum, colon, feces, and cloaca) from 20 juvenile ostriches, using 16S rRNA Illumina MiSeq sequencing. We found that direct PCR was highly comparable over a range of measures to the DNA extraction method in cecal, colon, and fecal samples. However, the two methods significantly differed in samples with comparably low bacterial biomass: cloacal and especially ileal samples. We also sequenced 100 replicate sample pairs to evaluate repeatability during both extraction and PCR stages and found that both methods were highly consistent for cecal, colon, and fecal samples (r_s > 0.7) but had low repeatability for cloacal (r_s = 0.39) and ileal (r_s = -0.24) samples. This study indicates that direct PCR provides a fast, cheap, and reliable alternative to conventional DNA extraction methods for retrieving 16S rRNA data, which can aid future gut microbiome studies. IMPORTANCE The microbial communities of animals can have large impacts on their hosts, and the number of studies using high-throughput sequencing to measure gut microbiomes is rapidly increasing.
However, the library preparation procedure in microbiome research is both costly and time-consuming, especially for large numbers of samples. We investigated a cheaper and faster direct PCR method designed to bypass the DNA isolation steps during 16S rRNA library preparation and compared it with a standard DNA extraction method. We used both techniques on five different gut sample types collected from 20 juvenile ostriches and sequenced samples with Illumina MiSeq. The methods were highly comparable and highly repeatable in three sample types with high microbial biomass (cecum, colon, and feces), but larger differences and low repeatability were found in the microbiomes obtained from the ileum and cloaca. These results will help microbiome researchers assess library preparation procedures and plan their studies accordingly.
ERIC Educational Resources Information Center
Shin, Hye Sook
2009-01-01
Using data from a nationwide, large-scale experimental study of the effects of a connected classroom technology on student learning in algebra (Owens et al., 2004), this dissertation focuses on challenges that can arise in estimating treatment effects in educational field experiments when samples are highly heterogeneous in terms of various…
Francy, D.S.; Bushon, R.N.; Brady, A.M.G.; Bertke, E.E.; Kephart, C.M.; Likirdopulos, C.A.; Mailot, B.E.; Schaefer, F. W.; Lindquist, H.D. Alan
2009-01-01
Aims: To compare the performance of traditional methods to quantitative polymerase chain reaction (qPCR) for detecting five biological agents in large-volume drinking-water samples concentrated by ultrafiltration (UF). Methods and Results: Drinking-water samples (100 l) were seeded with Bacillus anthracis, Cryptosporidium parvum, Francisella tularensis, Salmonella Typhi, and Vibrio cholerae and concentrated by UF. Recoveries by traditional methods were variable between samples and between some replicates; recoveries were not determined by qPCR. Francisella tularensis and V. cholerae were detected in all 14 samples after UF, B. anthracis was detected in 13, and C. parvum was detected in 9 out of 14 samples. Numbers found by qPCR after UF were significantly, or nearly significantly, related to those found by traditional methods for all organisms except C. parvum. A qPCR assay for S. Typhi was not available. Conclusions: qPCR can be used to rapidly detect biological agents after UF as effectively as traditional methods, but additional work is needed to improve qPCR assays for several biological agents, determine recoveries by qPCR, and expand the study to other areas. Significance and Impact of the Study: To our knowledge, this is the first study to compare the use of traditional and qPCR methods to detect biological agents in large-volume drinking-water samples. © 2009 The Society for Applied Microbiology.
Gyarmathy, V. Anna; Johnston, Lisa G.; Caplinskiene, Irma; Caplinskas, Saulius; Latkin, Carl A.
2014-01-01
Background Respondent driven sampling (RDS) and Incentivized Snowball Sampling (ISS) are two sampling methods that are commonly used to reach people who inject drugs (PWID). Methods We generated a set of simulated RDS samples on an actual sociometric ISS sample of PWID in Vilnius, Lithuania (“original sample”) to assess if the simulated RDS estimates were statistically significantly different from the original ISS sample prevalences for HIV (9.8%), Hepatitis A (43.6%), Hepatitis B (Anti-HBc 43.9% and HBsAg 3.4%), Hepatitis C (87.5%), syphilis (6.8%) and Chlamydia (8.8%) infections and for selected behavioral risk characteristics. Results The original sample consisted of a large component of 249 people (83% of the sample) and 13 smaller components with 1 to 12 individuals. Generally, as long as all seeds were recruited from the large component of the original sample, the simulation samples simply recreated the large component. There were no significant differences between the large component and the entire original sample for the characteristics of interest. Altogether 99.2% of 360 simulation sample point estimates were within the confidence interval of the original prevalence values for the characteristics of interest. Conclusions When population characteristics are reflected in large network components that dominate the population, RDS and ISS may produce samples that have statistically non-different prevalence values, even though some isolated network components may be under-sampled and/or statistically significantly different from the main groups. This so-called “strudel effect” is discussed in the paper. PMID:24360650
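The recruitment dynamic described in the Results — seeds placed in the large component simply recreate that component, while isolated components are never reached — can be sketched with a toy coupon-based RDS simulation (the network and parameters are illustrative, not the Vilnius data):

```python
import random
from collections import deque

def simulate_rds(adjacency, seeds, coupons=3, rng=None):
    """Toy respondent-driven sampling: each recruit passes at most `coupons`
    coupons to not-yet-recruited network neighbours (breadth-first)."""
    rng = rng or random.Random()
    recruited = set(seeds)
    queue = deque(seeds)
    while queue:
        person = queue.popleft()
        eligible = [p for p in adjacency[person] if p not in recruited]
        rng.shuffle(eligible)  # recruitment among neighbours is random
        for recruit in eligible[:coupons]:
            recruited.add(recruit)
            queue.append(recruit)
    return recruited

# Toy sociometric network: one large component (nodes 0-9, a ring with chords)
# plus an isolated dyad {10, 11}, loosely mirroring the component structure
# described in the abstract.
adjacency = {i: {(i - 1) % 10, (i + 1) % 10, (i + 5) % 10} for i in range(10)}
adjacency.update({10: {11}, 11: {10}})

sample = simulate_rds(adjacency, seeds=[0], rng=random.Random(7))
print(sorted(sample))  # the whole large component, and nothing outside it
```

Since recruitment can only follow network ties, any component that contains no seed is structurally unreachable; this is the mechanism behind both the component-recreation result and the under-sampling of isolated components ("strudel effect") discussed in the paper.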
Hanley, Sharon Jb; Fujita, Hiromasa; Yokoyama, Susumu; Kunisawa, Shiori; Tamakoshi, Akiko; Dong, Peixin; Kobayashi, Noriko; Watari, Hidemichi; Kudo, Masataka; Sakuragi, Noriaki
2016-09-01
Cervical cancer incidence and mortality are increasing in Japanese women under age 50. Screening uptake is low and proactive recommendations for human papillomavirus vaccination have been suspended. Other cervical cancer prevention initiatives are urgently needed. We assessed whether human papillomavirus self-sampling might be an acceptable alternative to physician-led screening, particularly in women with limited experience of tampon use. We also sought to identify any practical, logistical, or safety issues in women already attending for screening, before carrying out further large-scale studies in non-responders. In total, 203 women aged 20-49 attending their annual workplace health check in Sapporo, northern Japan, performed unsupervised human papillomavirus self-sampling before undergoing a physician-led cervical smear and human papillomavirus test, and completing a measure of acceptability for both tests. Ninety per cent of participants stated they would use self-sampling again. They found the instructions easy to follow and reported no issues with the usability of the self-sampling device. Compared with physician-led testing, women found self-sampling significantly less painful, less embarrassing and could relax more (p < 0.001), regardless of history of tampon use, which was associated with negative experiences in physician sampling (p = 0.034). Women lacked confidence that the test had been performed correctly, despite no unsatisfactory samples. No safety issues were reported. Self-sampling was highly acceptable in this population of women. They could perform the test safely unsupervised, but lacked confidence that it had been carried out correctly. Japanese women need to be educated about the accuracy of human papillomavirus self-sampling, and further large-scale studies are necessary in non-responders. © The Author(s) 2016.
A high-throughput core sampling device for the evaluation of maize stalk composition
2012-01-01
Background A major challenge in the identification and development of superior feedstocks for the production of second generation biofuels is the rapid assessment of biomass composition in a large number of samples. Currently, highly accurate and precise robotic analysis systems are available for the evaluation of biomass composition, on a large number of samples, with a variety of pretreatments. However, the lack of an inexpensive and high-throughput process for large scale sampling of biomass resources is still an important limiting factor. Our goal was to develop a simple mechanical maize stalk core sampling device that can be utilized to collect uniform samples of a dimension compatible with robotic processing and analysis, while allowing the collection of hundreds to thousands of samples per day. Results We have developed a core sampling device (CSD) to collect maize stalk samples compatible with robotic processing and analysis. The CSD facilitates the collection of thousands of uniform tissue cores consistent with high-throughput analysis required for breeding, genetics, and production studies. With a single CSD operated by one person with minimal training, more than 1,000 biomass samples were obtained in an eight-hour period. One of the main advantages of using cores is the high level of homogeneity of the samples obtained and the minimal opportunity for sample contamination. In addition, the samples obtained with the CSD can be placed directly into a bath of ice, dry ice, or liquid nitrogen maintaining the composition of the biomass sample for relatively long periods of time. Conclusions The CSD has been demonstrated to successfully produce homogeneous stalk core samples in a repeatable manner with a throughput substantially superior to the currently available sampling methods. Given the variety of maize developmental stages and the diversity of stalk diameter evaluated, it is expected that the CSD will have utility for other bioenergy crops as well. 
PMID:22548834
ERIC Educational Resources Information Center
Priebe, Gisela; Svedin, Carl Goran
2008-01-01
Objectives: The aim of this study was to investigate disclosure rates and disclosure patterns and to examine predictors of non-disclosure in a sample of male and female adolescents with self-reported experiences of sexual abuse. Method: A sample of 4,339 high school seniors (2,324 girls, 2,015 boys) was examined with a questionnaire concerning…
ChiLeung So; Jennifer Myszewski; Thomas Elder; Les Groom
2013-01-01
There have been several recent studies employing near infrared (NIR) spectroscopy for the rapid determination of microfibril angle (MFA). However, only a few have utilized samples cut from individual rings of increment cores, and none have been as large as the present study, which sampled over 600 trees from two test sites, producing over 3000 individual ring...
Self-Esteem Development across the Life Span: A Longitudinal Study with a Large Sample from Germany
ERIC Educational Resources Information Center
Orth, Ulrich; Maes, Jürgen; Schmitt, Manfred
2015-01-01
The authors examined the development of self-esteem across the life span. Data came from a German longitudinal study with 3 assessments across 4 years of a sample of 2,509 individuals ages 14 to 89 years. The self-esteem measure used showed strong measurement invariance across assessments and birth cohorts. Latent growth curve analyses indicated…
ERIC Educational Resources Information Center
Tesio, Luigi
2012-01-01
Outcome studies in biomedical research usually focus on testing mean changes across samples of subjects and, in so doing, often obscure changes in individuals. These changes, however, may be very informative in studies in which large or homogeneous samples are unavailable and mechanisms of action are still under scrutiny, as is often the case for…
Ströher, Patrícia R.; Firkowski, Carina R.; Freire, Andrea S.; Pie, Marcio R.
2011-01-01
The decapod Grapsus grapsus is commonly found on oceanic islands of the Pacific and Atlantic coasts of the Americas. In this study, a simple, quick and reliable method for detecting its larvae in plankton samples is described, which makes it ideal for large-scale studies of larval dispersal patterns in the species. PMID:21931530
ERIC Educational Resources Information Center
Olatunji, Bunmi O.; Broman-Fulks, Joshua J.; Bergman, Shawn M.; Green, Bradley A.; Zlomke, Kimberly R.
2010-01-01
Worry has been described as a core feature of several disorders, particularly generalized anxiety disorder (GAD). The present study examined the latent structure of worry by applying 3 taxometric procedures (MAXEIG, MAMBAC, and L-Mode) to data collected from 2 large samples. Worry in the first sample (Study 1) of community participants (n = 1,355)…
Contaminant bioaccumulation studies often rely on fish muscle filets as the tissue of choice for the measurement of nitrogen stable isotope ratios (δ15N) and mercury (Hg). Lethal sampling techniques may not be suitable for studies on limited populations from smaller sized aquati...
Dagostino, Concetta; De Gregori, Manuela; Gieger, Christian; Manz, Judith; Gudelj, Ivan; Lauc, Gordan; Divizia, Laura; Wang, Wei; Sim, Moira; Pemberton, Iain K; MacDougall, Jane; Williams, Frances; Van Zundert, Jan; Primorac, Dragan; Aulchenko, Yurii; Kapural, Leonardo; Allegri, Massimo
2017-01-01
Chronic low back pain (CLBP) is one of the most common medical conditions, ranking as the greatest contributor to global disability and accounting for huge societal costs based on the Global Burden of Disease 2010 study. Large genetic and -omics studies provide a promising avenue for the screening, development and validation of biomarkers useful for personalized diagnosis and treatment (precision medicine). Multicentre studies are needed for such an effort, and a standardized and homogeneous approach is vital for recruitment of large numbers of participants among different centres (clinical and laboratories) to obtain robust and reproducible results. To date, no validated standard operating procedures (SOPs) for genetic/-omics studies in chronic pain have been developed. In this study, we validated an SOP model that will be used in the multicentre (5 centres) retrospective "PainOmics" study, funded by the European Community in the 7th Framework Programme, which aims to develop new biomarkers for CLBP through three different -omics approaches: genomics, glycomics and activomics. The SOPs describe the specific procedures for (1) blood collection, (2) sample processing and storage, (3) shipping details and (4) cross-check testing and validation before assays that all the centres involved in the study have to follow. Multivariate analysis revealed the absolute specificity and homogeneity of the samples collected by the five centres for all genetics, glycomics and activomics analyses. The SOPs used in our multicenter study have been validated. Hence, they could represent an innovative tool for the correct management and collection of reliable samples in other large-omics-based multicenter studies.
NASA Astrophysics Data System (ADS)
Amaro, Raquel; Coelho, Sónia D.; Pastorinho, M. Ramiro; Taborda-Barata, Luís; Vaz-Patto, Maria A.; Monteiro, Marisa; Nepomuceno, Miguel C. S.; Lanzinha, João C. G.; Teixeira, João P.; Pereira, Cristiana C.; Sousa, Ana C. A.
2016-11-01
Fungi are a group of microbes found with particular incidence in the indoor environment. Their direct toxicity or capability of generating toxic compounds has been associated with a large number of adverse health effects, such as infectious diseases and allergies. Given that in modern society people spend a large part of their time indoors, characterization of the fungal communities of this environmental compartment assumes paramount importance in the comprehension of health effects. House dust is an easy-to-obtain, time-integrative matrix, and its use in epidemiological studies on human exposure to environmental contaminants is highly recommended. Furthermore, dust can carry a great variety of fungal content that undergoes a large number of processes that modulate and further complicate human exposure. Our study aims to identify and quantify the fungal community in house dust samples collected using two different methodologies (an approach not often seen in the literature): active sampling (vacuum cleaner bags) and passive sampling (dust settled in Petri dishes). Sampling was performed as part of the ongoing 6 × 60 × 6 Project, in which six houses from Covilhã (Portugal), with building dates representative of six decades, were studied for a period of sixty days.
Protocol and methodology of the epidemiological mental health study in Andalusia: PISMA-ep.
Cervilla, Jorge A; Ruiz, Isabel; Rodríguez-Barranco, Miguel; Rivera, Margarita; Ibáñez-Casas, Inmaculada; Molina, Esther; Valmisa, Eulalio; Carmona-Calvo, José; Moreno-Küstner, Berta; Muñoz-Negro, José Eduardo; Ching-López, Ana; Gutiérrez, Blanca
This paper describes the general methods of a cross-sectional study that aims to determine the prevalence of major mental disorders in Andalusia (Southern Spain), and their correlates or potential risk factors, using a large representative sample of community-dwelling adults. We undertook multistage sampling using different standard stratification levels and aimed to interview 4,518 randomly selected participants living in all 8 provinces of the Andalusian region utilizing a door-knocking approach. The Spanish version of the MINI International Neuropsychiatric Interview, a valid screening instrument ascertaining ICD-10/DSM-IV compatible mental disorder diagnoses, was used as our main diagnostic tool. A large battery of other instruments was used to explore global functionality, medical comorbidity, personality traits, cognitive function and exposure to potential psychosocial risk factors. A saliva sample for DNA extraction was also obtained for a genetic sub-study. The interviews were administered and completed by fully trained interviewers, although most of the tools used are compatible with lay-interviewer use. A total of 3,892 (70.8%) of the 5,496 initially attempted households had to be substituted with equivalent ones due to either no response (37.7%) or not fulfilling the required participant quota (33%). Thus, of the 5,496 eligible participants finally approached, 4,507 (83.7%) agreed to take part in the study, completed the interview and were included (n=4,507), and 4,286 (78%) participants also agreed and consented to provide a saliva sample for the DNA study. The remaining 989 (16.3%) potential participants refused to take part. This is the largest mental health epidemiological study conducted in the Spanish region of Andalusia. The response rates and representativeness of the sample obtained are fairly high.
The methods are particularly comprehensive for this type of study and include both personality and cognitive assessments, as well as a large array of bio-psycho-social risk measures. Copyright © 2016 SEP y SEPB. Published by Elsevier España, S.L.U. All rights reserved.
Outcome-Dependent Sampling with Interval-Censored Failure Time Data
Zhou, Qingning; Cai, Jianwen; Zhou, Haibo
2017-01-01
Epidemiologic studies and disease prevention trials often seek to relate an exposure variable to a failure time that suffers from interval-censoring. When the failure rate is low and the time intervals are wide, a large cohort is often required so as to yield reliable precision on the exposure-failure-time relationship. However, large cohort studies with simple random sampling could be prohibitive for investigators with a limited budget, especially when the exposure variables are expensive to obtain. Alternative cost-effective sampling designs and inference procedures are therefore desirable. We propose an outcome-dependent sampling (ODS) design with interval-censored failure time data, where we enrich the observed sample by selectively including certain more informative failure subjects. We develop a novel sieve semiparametric maximum empirical likelihood approach for fitting the proportional hazards model to data from the proposed interval-censoring ODS design. This approach employs the empirical likelihood and sieve methods to deal with the infinite-dimensional nuisance parameters, which greatly reduces the dimensionality of the estimation problem and eases the computational difficulty. The consistency and asymptotic normality of the resulting regression parameter estimator are established. The results from our extensive simulation study show that the proposed design and method work well in practical situations and are more efficient than the alternative designs and competing approaches. An example from the Atherosclerosis Risk in Communities (ARIC) study is provided for illustration. PMID:28771664
Lederbogen, Florian; Kühner, Christine; Kirschbaum, Clemens; Meisinger, Christa; Lammich, Josefine; Holle, Rolf; Krumm, Bertram; von Lengerke, Thomas; Wichmann, Heinz-Erich; Deuschle, Michael; Ladwig, Karl-Heinz
2010-09-01
Analysis of salivary cortisol concentrations and derived indices is increasingly used in clinical and scientific medicine. However, comprehensive data on these parameters in the general population are scarce. The aim of this study was to evaluate the concentrations of salivary cortisol in a large middle-aged community sample and to identify major factors associated with altered hormone levels. We conducted a cross-sectional study within the Cooperative Health Research in the Region of Augsburg (KORA)-F3 study. A total of 1484 participants aged 50-69 years (52% women) had agreed to provide four saliva samples during a regular weekday. We measured salivary cortisol concentrations at wake-up (F0) and 0.5 h (F0.5), 8 h (F8), and 14 h (F14) after waking. We calculated the cortisol awakening response (CAR), slope, and area under the curve (AUC(G)) of circadian cortisol secretion. Sociodemographic and clinical characteristics were evaluated by interview and questionnaires, sampling conditions by protocol. In total, 1208 participants returned saliva samples; exclusion criteria left 990 subjects for the final analyses. Salivary cortisol levels were (mean ± s.d.) F0 = 13.7 ± 7.6, F0.5 = 20.5 ± 9.8, F8 = 5.4 ± 3.3, and F14 = 2.0 ± 1.8 nmol/l. Earlier sampling times were associated with a higher CAR and a smaller slope. Cortisol secretion was also influenced by gender and smoking habits. Higher perceived social support was associated with a lower AUC(G) and a smaller slope. We provide data on salivary cortisol concentrations in a large middle-aged community sample. Gender, sampling time, smoking habits, and perceived social support appeared as determinants of cortisol secretion.
McNeill, K S; Cancilla, D A
2009-03-01
Soil samples from three USA airports representing low-, mid-, and large-volume users of aircraft deicing fluids (ADAFs) were analyzed by LC/MS/MS for the presence of triazoles, a class of corrosion inhibitors historically used in ADAFs. Triazoles, specifically 4-methyl-1H-benzotriazole and 5-methyl-1H-benzotriazole, were detected in a majority of samples at concentrations ranging from 2.35 to 424.19 microg/kg. Previous studies have focused primarily on ground- and surface-water impacts of larger-volume ADAF users. The detection of triazoles in soils at low-volume ADAF-use airports suggests that deicing activities may have a broader environmental impact than previously considered.
An improved camera trap for amphibians, reptiles, small mammals, and large invertebrates
Hobbs, Michael T.; Brehme, Cheryl S.
2017-01-01
Camera traps are valuable sampling tools commonly used to inventory and monitor wildlife communities but are challenged to reliably sample small animals. We introduce a novel active camera trap system enabling the reliable and efficient use of wildlife cameras for sampling small animals, particularly reptiles, amphibians, small mammals and large invertebrates. It surpasses the detection ability of commonly used passive infrared (PIR) cameras for this application and eliminates problems such as high rates of false triggers and high variability in detection rates among cameras and study locations. Our system, which employs a HALT trigger, is capable of coupling to digital PIR cameras and is designed for detecting small animals traversing small tunnels, narrow trails, small clearings and along walls or drift fencing.
Molecular epidemiology of pathogenic Leptospira spp. among large ruminants in the Philippines.
Villanueva, Marvin A; Mingala, Claro N; Balbin, Michelle M; Nakajima, Chie; Isoda, Norikazu; Suzuki, Yasuhiko; Koizumi, Nobuo
2016-12-01
The extent of Leptospira infection in large ruminants, and the resulting economic burden on the livestock industry, has not been extensively explored in a leptospirosis-endemic country like the Philippines. Therefore, we determined the prevalence and carrier status of leptospirosis in large ruminants using molecular techniques and assessed the risk factors for acquiring leptospirosis in these animals. Water buffalo and cattle urine samples (n=831) collected from 21 farms during 2013-2015 were subjected to flaB-nested PCR to detect pathogenic Leptospira spp. Leptospiral flaB was detected in both species, with a detection rate of 16.1%. Leptospiral DNA was detected only in samples from animals managed in communal farms. Sequence analysis of Leptospira flaB in large ruminants revealed the formation of three major clusters with L. borgpetersenii or L. kirschneri. One farm contained Leptospira flaB sequences from all clusters identified in this study, suggesting that this farm was the main source of leptospires for the other farms. This study suggests that these large ruminants are infected with various pathogenic Leptospira species, causing possible major economic losses in the livestock industry, and that they are potential Leptospira reservoirs that can transmit infection to humans and other animals in the Philippines.
Study of Evaporation Rate of Water in Hydrophobic Confinement using Forward Flux Sampling
NASA Astrophysics Data System (ADS)
Sharma, Sumit; Debenedetti, Pablo G.
2012-02-01
Drying of hydrophobic cavities is of interest for understanding biological self-assembly, protein stability, and the opening and closing of ion channels. The liquid-to-vapor transition of water in confinement is associated with large kinetic barriers, which preclude its study using conventional simulation techniques. Using forward flux sampling to study the kinetics of the transition between two hydrophobic surfaces, we show that (a) the free energy barriers to evaporation scale linearly with the distance between the two surfaces, d; (b) the evaporation rates increase as the lateral size of the surfaces, L, increases; and (c) the transition state to evaporation for sufficiently large L is a cylindrical vapor cavity connecting the two hydrophobic surfaces. Finally, we decouple the effects of confinement geometry and surface chemistry on the evaporation rates.
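The forward flux sampling scheme underlying this study can be sketched in miniature for a 1D barrier-crossing problem: the rate is estimated as the flux through the first interface multiplied by the product of interface-to-interface crossing probabilities. The Python sketch below uses an illustrative double-well potential and arbitrary parameters; it is not the simulation setup of the paper.

```python
import numpy as np

# Toy forward flux sampling (FFS) for a 1D overdamped particle in the
# double-well potential U(x) = (x^2 - 1)^2. The rate from basin A (x ~ -1)
# to basin B (x ~ +1) is estimated as the flux through the first interface
# times the product of interface-to-interface crossing probabilities.
# All parameters here are illustrative.
rng = np.random.default_rng(2)
beta, dt = 4.0, 1e-3

def step(x):
    force = -4.0 * x * (x**2 - 1.0)            # -dU/dx
    return x + force * dt + np.sqrt(2.0 * dt / beta) * rng.standard_normal()

interfaces = [-0.8, -0.4, 0.0, 0.4, 0.8]

# Stage 1: flux through the first interface from a long run in basin A.
x, prev, crossings, steps, configs = -1.0, -1.0, 0, 100_000, []
for _ in range(steps):
    x = step(x)
    if prev < interfaces[0] <= x:              # forward crossing of lambda_0
        crossings += 1
        configs.append(x)
    if x > interfaces[-1]:                     # reached basin B; restart in A
        x = -1.0
    prev = x
flux = crossings / (steps * dt)

# Stage 2: from stored configurations at each interface, fire trial runs and
# record the fraction that reach the next interface before falling back to A.
p_total = 1.0
for i in range(len(interfaces) - 1):
    if not configs:
        p_total = 0.0
        break
    successes, new_configs = 0, []
    for _ in range(200):
        x = configs[rng.integers(len(configs))]
        while -1.0 < x < interfaces[i + 1]:    # run until next interface or basin A
            x = step(x)
        if x >= interfaces[i + 1]:
            successes += 1
            new_configs.append(x)
    p_total *= successes / 200
    configs = new_configs

rate = flux * p_total
print(f"flux = {flux:.3g}, P(B|lambda_0) = {p_total:.3g}, rate = {rate:.3g}")
```

The point of the staged estimate is exactly the one motivating its use here: a direct simulation would almost never cross the full barrier, but each interface-to-interface hop is probable enough to sample.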
Probst, Thomas; Pryss, Rüdiger C.; Langguth, Berthold; Spiliopoulou, Myra; Landgrebe, Michael; Vesala, Markku; Harrison, Stephen; Schobel, Johannes; Reichert, Manfred; Stach, Michael; Schlee, Winfried
2017-01-01
For understanding the heterogeneity of tinnitus, large samples are required. However, investigations of how samples recruited by different methods differ from each other are lacking. In the present study, three large samples, each recruited by different means, were compared: N = 5017 individuals registered at a self-help web platform for tinnitus (crowdsourcing platform Tinnitus Talk), N = 867 users of a smart mobile application for tinnitus (crowdsensing platform TrackYourTinnitus), and N = 3786 patients contacting an outpatient tinnitus clinic (Tinnitus Center of the University Hospital Regensburg). The three samples were compared regarding age, gender, and duration of tinnitus (months or years perceiving tinnitus; self-reported) using chi-squared tests. The three samples significantly differed from each other in age, gender and tinnitus duration (p < 0.05). Users of the TrackYourTinnitus crowdsensing platform were younger, users of the Tinnitus Talk crowdsourcing platform were more often female, and users of both newer technologies (crowdsourcing and crowdsensing) more frequently had acute/subacute tinnitus (<3 months and 4–6 months) as well as a very long tinnitus duration (>20 years). The implication of these findings for clinical research is that newer technologies such as crowdsourcing and crowdsensing platforms offer the possibility of reaching individuals who are hard to contact through an outpatient tinnitus clinic. Depending on the aims and the inclusion/exclusion criteria of a given study, different recruiting strategies (clinic and/or newer technologies) offer different advantages and disadvantages. In general, the representativeness of study results might be increased when tinnitus study samples are recruited in the clinic as well as via crowdsourcing and crowdsensing. PMID:28484389
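The chi-squared comparison of categorical characteristics across recruitment sources can be illustrated with a short sketch; the contingency table below is a hypothetical placeholder, not the published counts.

```python
import math
import numpy as np

# Pearson chi-squared test of independence for a recruitment-source x gender
# contingency table, as used to compare the three samples. The counts below
# are illustrative placeholders, not the published figures.
table = np.array([
    [2300.0, 2717.0],   # Tinnitus Talk: female, male
    [ 300.0,  567.0],   # TrackYourTinnitus
    [1400.0, 2386.0],   # outpatient clinic
])

row = table.sum(axis=1, keepdims=True)
col = table.sum(axis=0, keepdims=True)
expected = row @ col / table.sum()            # expected counts under independence
chi2 = ((table - expected) ** 2 / expected).sum()
dof = (table.shape[0] - 1) * (table.shape[1] - 1)

# With dof = 2 the chi-squared survival function has the closed form exp(-x/2),
# so no statistics library is needed for the p-value here.
p = math.exp(-chi2 / 2.0)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.3g}")
```

For the full three-sample comparison in the study, the same test is applied per characteristic (age bands, gender, duration bands), each giving its own table and p-value.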
The Apollo 17 samples: The Massifs and landslide
NASA Technical Reports Server (NTRS)
Ryder, Graham
1992-01-01
More than 50 kg of rock and regolith samples, a little less than half the total Apollo 17 sample mass, were collected from the highland stations at Taurus-Littrow. Twice as much material was collected from the North Massif as from the South Massif and its landslide (the apparent disproportionate collecting at the mare sites is mainly a reflection of the large size of a few individual basalt samples). Descriptions of the collection, documentation, and nature of the samples are given. A comprehensive catalog is currently being produced. Many of the samples have been intensely studied over the last 20 years, and some of the rocks have become very familiar, having been depicted in popular works; notable examples are the dunite clast (72415), the troctolite sample (76535), and the station 6 boulder samples. Most of the boulder samples have been studied in Consortium mode, and many of the rake samples have received a basic petrological/geochemical characterization.
Baxter, Amanda J.; Hughes, Maria Celia; Kvaskoff, Marina; Siskind, Victor; Shekar, Sri; Aitken, Joanne F.; Green, Adele C.; Duffy, David L.; Hayward, Nicholas K.; Martin, Nicholas G.; Whiteman, David C.
2013-01-01
Cutaneous malignant melanoma (CMM) is a major health issue in Queensland, Australia, which has the world's highest incidence. Recent molecular and epidemiologic studies suggest that CMM arises through multiple etiological pathways involving gene-environment interactions. Understanding the potential mechanisms leading to CMM requires larger studies than those previously conducted. This article describes the design and baseline characteristics of Q-MEGA, the Queensland study of Melanoma: Environmental and Genetic Associations, which followed up four population-based samples of CMM patients in Queensland, including children, adolescents, men aged over 50, and a large sample of adult cases and their families, including twins. Q-MEGA aims to investigate the roles of genetic and environmental factors, and their interaction, in the etiology of melanoma. In total, 3,471 participants took part in the follow-up study and were administered a computer-assisted telephone interview in 2002-2005. Updated data on environmental and phenotypic risk factors, and 2,777 blood samples, were collected from interviewed participants as well as from a subset of relatives. This study provides a large and well-described population-based sample of CMM cases with follow-up data. Characteristics of the cases, and the repeatability of sun exposure and phenotype measures between the baseline and follow-up surveys, from six to 17 years later, are also described. PMID:18361720
Applying Active Learning to Assertion Classification of Concepts in Clinical Text
Chen, Yukun; Mani, Subramani; Xu, Hua
2012-01-01
Supervised machine learning methods for clinical natural language processing (NLP) research require a large number of annotated samples, which are very expensive to obtain because of the involvement of physicians. Active learning, an approach that selectively samples from a large pool, provides an alternative solution. Its major goal in classification is to reduce the annotation effort while maintaining the quality of the predictive model. However, few studies have investigated its use in clinical NLP. This paper reports an application of active learning to a clinical text classification task: determining the assertion status of clinical concepts. The annotated corpus for the assertion classification task in the 2010 i2b2/VA Clinical NLP Challenge was used in this study. We implemented several existing and newly developed active learning algorithms and assessed their performance. The outcome is reported as the global ALC score, based on the area under the average learning curve of the AUC (Area Under the Curve) score. Results showed that when the same number of annotated samples was used, active learning strategies could generate better classification models (best ALC = 0.7715) than the passive learning method of random sampling (ALC = 0.7411). Moreover, to achieve the same classification performance, active learning strategies required fewer samples than the random sampling method. For example, to achieve an AUC of 0.79, the random sampling method used 32 samples, while our best active learning algorithm required only 12 samples, a reduction of 62.5% in manual annotation effort. PMID:22127105
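Pool-based active learning of the kind evaluated above can be sketched as follows; the classifier, synthetic data, uncertainty criterion, and annotation budget are illustrative choices, not the specific algorithms of the paper.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Pool-based active learning with uncertainty sampling: each round, "annotate"
# the pool example whose predicted probability is closest to 0.5 (the model's
# least confident prediction). Data, model and budget are illustrative.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
labeled = [int(i) for i in np.where(y == 0)[0][:5]] + \
          [int(i) for i in np.where(y == 1)[0][:5]]   # seed set with both classes
pool = [i for i in range(len(y)) if i not in labeled]

model = LogisticRegression(max_iter=1000)
for _ in range(20):                                   # 20 annotation rounds
    model.fit(X[labeled], y[labeled])
    probs = model.predict_proba(X[pool])[:, 1]
    pick = pool[int(np.argmin(np.abs(probs - 0.5)))]  # most uncertain example
    labeled.append(pick)
    pool.remove(pick)

print(f"annotated {len(labeled)} of {len(y)} samples")
```

The learning curve comparison in the paper amounts to repeating this loop with random `pick`s as the passive baseline and plotting test AUC against the number of annotated samples.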
Large-scale environments of narrow-line Seyfert 1 galaxies
NASA Astrophysics Data System (ADS)
Järvelä, E.; Lähteenmäki, A.; Lietzen, H.; Poudel, A.; Heinämäki, P.; Einasto, M.
2017-09-01
Studying the large-scale environments of narrow-line Seyfert 1 (NLS1) galaxies gives a new perspective on their properties, particularly their radio loudness. The large-scale environment is believed to have an impact on the evolution and intrinsic properties of galaxies; however, NLS1 sources have not been studied in this context before. We have a large and diverse sample of 1341 NLS1 galaxies and three separate environment data sets constructed using the Sloan Digital Sky Survey. We use various statistical methods to investigate how the properties of NLS1 galaxies are connected to the large-scale environment, and compare the large-scale environments of NLS1 galaxies with those of other active galactic nuclei (AGN) classes, for example, other jetted AGN and broad-line Seyfert 1 (BLS1) galaxies, to study how they are related. NLS1 galaxies reside in less dense environments than any of the comparison samples, thus confirming their young age. The average large-scale environment density and environmental distribution of NLS1 sources are clearly different compared to BLS1 galaxies; it is therefore improbable that BLS1 galaxies could be the parent population of NLS1 galaxies, unified by orientation. Within the NLS1 class there is a trend of increasing radio loudness with increasing large-scale environment density, indicating that the large-scale environment affects their intrinsic properties. Our results suggest that the NLS1 class of sources is not homogeneous and, furthermore, that a considerable fraction of them are misclassified. We further support a published proposal to replace the traditional classification into radio-loud and radio-quiet or radio-silent sources with a division into jetted and non-jetted sources.
Müller, Ueli C; Asherson, Philip; Banaschewski, Tobias; Buitelaar, Jan K; Ebstein, Richard P; Eisenberg, Jaques; Gill, Michael; Manor, Iris; Miranda, Ana; Oades, Robert D; Roeyers, Herbert; Rothenberger, Aribert; Sergeant, Joseph A; Sonuga-Barke, Edmund Js; Thompson, Margaret; Faraone, Stephen V; Steinhausen, Hans-Christoph
2011-04-07
The International Multi-centre ADHD Genetics (IMAGE) project, with 11 participating centres from 7 European countries and Israel, has collected a large behavioural and genetic database for present and future research. Behavioural data were collected from 1068 probands with ADHD and 1446 unselected siblings. The aim was to describe and analyse questionnaire data and IQ measures from all probands and siblings, and in particular to investigate the influence of age, gender, family status (proband vs. sibling), informant, and centre on sample homogeneity in psychopathological measures. Conners' Questionnaires, Strengths and Difficulties Questionnaires, and Wechsler Intelligence Scores were used to describe the phenotype of the sample. Data were analysed by use of robust statistical multi-way procedures. Besides main effects of age, gender, informant, and centre, there were considerable interaction effects on questionnaire data. The larger differences between probands and siblings at home than at school may reflect contrast effects in the parents. Furthermore, there were marked gender-by-status effects on the ADHD symptom ratings, with girls scoring one standard deviation higher than boys in the proband sample but lower than boys in the sibling sample. The multi-centre design is another important source of heterogeneity, particularly in its interaction with family status. To a large extent, the centres differed from each other with regard to differences between proband and sibling scores. When ADHD probands are diagnosed by use of fixed symptom counts, the severity of the disorder in the proband sample may markedly differ between boys and girls and across age, particularly in samples with a large age range. A multi-centre design carries the risk of considerable phenotypic differences between centres and, consequently, of additional heterogeneity of the sample even if standardized diagnostic procedures are used.
These possible sources of variance should be counteracted in genetic analyses by using age- and gender-adjusted diagnostic procedures and regional normative data, by adjusting for design artefacts with covariate statistics, by eliminating outliers, or by other methods suitable for reducing heterogeneity.
Efficient computation of the joint sample frequency spectra for multiple populations.
Kamm, John A; Terhorst, Jonathan; Song, Yun S
2017-01-01
A wide range of studies in population genetics have employed the sample frequency spectrum (SFS), a summary statistic which describes the distribution of mutant alleles at a polymorphic site in a sample of DNA sequences and provides a highly efficient dimensional reduction of large-scale population genomic variation data. Recently, there has been much interest in analyzing the joint SFS data from multiple populations to infer parameters of complex demographic histories, including variable population sizes, population split times, migration rates, admixture proportions, and so on. SFS-based inference methods require accurate computation of the expected SFS under a given demographic model. Although much methodological progress has been made, existing methods suffer from numerical instability and high computational complexity when multiple populations are involved and the sample size is large. In this paper, we present new analytic formulas and algorithms that enable accurate, efficient computation of the expected joint SFS for thousands of individuals sampled from hundreds of populations related by a complex demographic model with arbitrary population size histories (including piecewise-exponential growth). Our results are implemented in a new software package called momi (MOran Models for Inference). Through an empirical study we demonstrate our improvements to numerical stability and computational complexity.
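The sample frequency spectrum itself, the summary statistic at the heart of this work, is simple to compute from data; the sketch below builds an unfolded single-population SFS from a simulated genotype matrix. Note that momi's contribution, computing the *expected* joint SFS under a demographic model, is a far harder problem than this tabulation.

```python
import numpy as np

# Unfolded sample frequency spectrum (SFS): for n haploid sequences, entry k
# counts the polymorphic sites at which exactly k sequences carry the derived
# allele (k = 1 .. n-1). The genotype matrix below is simulated noise, not
# the output of any demographic model.
rng = np.random.default_rng(0)
n_seq, n_sites = 20, 1000
site_freqs = rng.random(n_sites) * 0.3                  # per-site derived-allele frequency
genotypes = rng.random((n_seq, n_sites)) < site_freqs   # True = derived allele

derived_counts = genotypes.sum(axis=0)                  # derived-allele count per site
# bincount over 0..n_seq, then drop the non-segregating classes k = 0 and k = n.
sfs = np.bincount(derived_counts, minlength=n_seq + 1)[1:n_seq]

print("segregating sites:", sfs.sum())
```

The joint SFS generalizes this to a multi-dimensional array indexed by the derived-allele count in each population, which is why its size, and the cost of computing its expectation, grows so quickly with the number of populations.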
OXTR polymorphism in depression and completed suicide-A study on a large population sample.
Wasilewska, Krystyna; Pawlak, Aleksandra; Kostrzewa, Grażyna; Sobczyk-Kopcioł, Agnieszka; Kaczorowska, Aleksandra; Badowski, Jarosław; Brzozowska, Małgorzata; Drygas, Wojciech; Piwoński, Jerzy; Bielecki, Wojciech; Płoski, Rafał
2017-03-01
In the light of contradictory results concerning the OXTR polymorphism rs53576 and depression, we decided to verify the potential association between the two in 1) a large, ethnically homogeneous sample of 1185 individuals who completed the Beck Depression Inventory (BDI), and 2) a sample of 763 suicide victims. In the population sample, AA males showed significantly lower BDI scores (p=0.005, corrected p=0.030). Exploratory analyses suggested that this effect was limited to a subgroup within the 0-9 BDI score range (p=0.0007, Mann-Whitney U test), whereas no main effect on depressive symptoms (BDI>9) was found. In the suicide sample, no association with rs53576 genotype was present. Exploratory analyses in suicides revealed a higher blood alcohol concentration (BAC) among AA than GG/GA males (p=0.014, Mann-Whitney U test). Our results show that the OXTR rs53576 variant modulates mood in males and may positively correlate with alcohol intake among male suicides, but is not associated with suicide or depression. The study adds to the growing knowledge of rs53576 genotype characteristics. Copyright © 2016 Elsevier Ltd. All rights reserved.
Scaling Impact-Melt and Crater Dimensions: Implications for the Lunar Cratering Record
NASA Technical Reports Server (NTRS)
Cintala, Mark J.; Grieve, Richard A. F.
1997-01-01
The consequences of impact on the solid bodies of the solar system are manifest and legion. Although the visible effects on planetary surfaces, such as the Moon's, are the most obvious testimony to the spatial and temporal importance of impacts, less dramatic chemical and petrographic characteristics of materials affected by shock abound. Both the morphologic and petrologic aspects of impact cratering are important in deciphering lunar history, and, ideally, each should complement the other. In practice, however, a gap has persisted in relating large-scale cratering processes to petrologic and geochemical data obtained from lunar samples. While this is due in no small part to the fact that no Apollo mission unambiguously sampled the deposits of a large crater, it can also be attributed to the general state of our knowledge of cratering phenomena, particularly those accompanying large events. The most common shock-metamorphosed lunar samples are breccias, but a substantial number are impact-melt rocks. Indeed, numerous workers have called attention to the importance of impact-melt rocks spanning a wide range of ages in the lunar sample collection. Photogeologic studies have also demonstrated the widespread occurrence of impact-melt lithologies in and around lunar craters. Thus, it is clear that impact melting has been a fundamental process operating throughout lunar history, at scales ranging from pits formed on individual regolith grains to the largest impact basins. This contribution examines the potential relationship between impact melting on the Moon and the interior morphologies of large craters and peak-ring basins. It then examines some of the implications of impact melting at such large scales for lunar-sample provenance and the evolution of the lunar crust.
A comparison of liver sampling techniques in dogs.
Kemp, S D; Zimmerman, K L; Panciera, D L; Monroe, W E; Leib, M S; Lanz, O I
2015-01-01
The liver sampling technique in dogs that consistently provides samples adequate for accurate histopathologic interpretation is not known. The aim of this prospective study was to compare the histopathologic results of liver samples obtained by punch, cup, and 14-gauge needle with those of large wedge samples collected at necropsy in seventy dogs. Liver specimens were obtained from the left lateral liver lobe with an 8 mm punch, a 5 mm cup, and a 14-gauge needle. After sample acquisition, two larger tissue samples were collected near the center of the left lateral lobe to be used as a histologic standard for comparison. Histopathologic features and the number of portal triads in each sample were recorded. The mean number of portal triads obtained by each sampling method was 2.9 in needle samples, 3.4 in cup samples, 12 in punch samples, and 30.7 in the necropsy samples. The diagnoses in 66% of needle samples, 60% of cup samples, and 69% of punch samples were in agreement with the necropsy samples, and these proportions were not significantly different from each other. The corresponding kappa coefficients were 0.59 for needle biopsies, 0.52 for cup biopsies, and 0.62 for punch biopsies. The histopathologic interpretation of a liver sample in the dog is unlikely to vary if the liver biopsy specimen contains at least 3-12 portal triads. However, in comparison with large necropsy samples, the accuracy of all tested methods was relatively low. Copyright © 2014 by the American College of Veterinary Internal Medicine.
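The kappa coefficients reported above measure chance-corrected agreement between each biopsy method and the necropsy standard. Cohen's kappa can be computed as in this sketch, which uses made-up diagnosis labels rather than the study's data.

```python
import numpy as np

# Cohen's kappa: chance-corrected agreement between a biopsy method's
# diagnoses and the necropsy "gold standard". The paired labels below are
# made-up examples, not the study's data.
def cohens_kappa(a, b):
    cats = sorted(set(a) | set(b))
    idx = {c: i for i, c in enumerate(cats)}
    m = np.zeros((len(cats), len(cats)))
    for x, y in zip(a, b):
        m[idx[x], idx[y]] += 1                          # confusion-matrix counts
    n = m.sum()
    po = np.trace(m) / n                                # observed agreement
    pe = (m.sum(axis=0) * m.sum(axis=1)).sum() / n**2   # agreement expected by chance
    return (po - pe) / (1 - pe)

needle   = ["hepatitis", "normal", "hepatitis", "neoplasia", "normal", "normal"]
necropsy = ["hepatitis", "normal", "normal",    "neoplasia", "normal", "hepatitis"]
print(f"kappa = {cohens_kappa(needle, necropsy):.2f}")   # 0.45 for these labels
```

Kappa discounts the agreement that identical label frequencies would produce by chance, which is why it can be modest (0.5-0.6 in the study) even when raw percent agreement is around two-thirds.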
ERIC Educational Resources Information Center
Pujadas Botey, Anna; Vinturache, Angela; Bayrampour, Hamideh; Breitkreuz, Rhonda; Bukutu, Cecilia; Gibbard, Ben; Tough, Suzanne
2017-01-01
Parents and non-parental adults who interact with children influence child development. This study evaluates the knowledge of child development in two large and diverse samples of adults from Alberta in 2007 and 2013. Telephone interviews were completed by two random samples (1,443 in 2007; 1,451 in 2013). Participants were asked when specific…
ERIC Educational Resources Information Center
Poteat, V. Paul; Espelage, Dorothy L.; Koenig, Brian K.
2009-01-01
In this study, heterosexual students' willingness to remain friends with peers who disclose that they are gay or lesbian and their willingness to attend schools that include gay and lesbian students were examined among two large middle school and high school samples (Sample 1: n = 20,509; 50.7% girls; Sample 2: n = 16,917; 50.2% girls). Boys were…
Narumi, Ryohei; Tomonaga, Takeshi
2016-01-01
Mass spectrometry-based phosphoproteomics is an indispensable technique for the discovery and quantification of phosphorylation events on proteins in biological samples. The application of this technique to tissue samples is especially useful for the discovery of biomarkers as well as for biological studies. We herein describe the application of large-scale phosphoproteome analysis and SRM/MRM-based quantitation to develop a strategy for the systematic discovery and validation of biomarkers using tissue samples.
Assessment and associated features of prolonged grief disorder among Chinese bereaved individuals.
Li, Jie; Prigerson, Holly G
2016-04-01
Most research on the assessment and characteristics of prolonged grief disorder (PGD) has been conducted in Western bereaved samples. Limited information about PGD in Chinese samples exists. This study aims to validate the Chinese version of the Inventory of Complicated Grief (ICG), examine the distinctiveness of PGD symptoms from symptoms of bereavement-related depression and anxiety, and explore the prevalence of PGD in a Chinese sample. Responses from 1358 bereaved Chinese adults were collected through an on-line survey. They completed the Chinese version of the ICG and a questionnaire measuring depression and anxiety symptoms. The findings indicate that the Chinese ICG has sound validity and high internal consistency. The ICG cut-off score for PGD "caseness" in this large Chinese sample was 48. The distinctiveness of PGD symptoms from those of depression and anxiety was supported by the results of the confirmatory factor analysis and the fact that PGD occurred in isolation in the studied sample. The prevalence of PGD was 13.9%. The ICG is a valid instrument for use in the Chinese context. Several key characteristics of PGD in Chinese samples, either different from or comparable to findings in Western samples, may stimulate further research and clinical interest in the concept by providing empirical evidence from a large and influential Eastern country. Copyright © 2015 Elsevier Inc. All rights reserved.
BLIND ordering of large-scale transcriptomic developmental timecourses.
Anavy, Leon; Levin, Michal; Khair, Sally; Nakanishi, Nagayasu; Fernandez-Valverde, Selene L; Degnan, Bernard M; Yanai, Itai
2014-03-01
RNA-Seq enables the efficient transcriptome sequencing of many samples from small amounts of material, but the analysis of these data remains challenging. In particular, in developmental studies, RNA-Seq is challenged by the morphological staging of samples, such as embryos, since these often lack clear markers at any particular stage. In such cases, the automatic identification of the stage of a sample would enable previously infeasible experimental designs. Here we present the 'basic linear index determination of transcriptomes' (BLIND) method for ordering samples comprising different developmental stages. The method is an implementation of a traveling salesman algorithm to order the transcriptomes according to their inter-relationships as defined by principal components analysis. To establish the direction of the ordered samples, we show that an appropriate indicator is the entropy of transcriptomic gene expression levels, which increases over developmental time. Using BLIND, we correctly recover the annotated order of previously published embryonic transcriptomic timecourses for frog, mosquito, fly and zebrafish. We further demonstrate the efficacy of BLIND by collecting 59 embryos of the sponge Amphimedon queenslandica and ordering their transcriptomes according to developmental stage. BLIND is thus useful in establishing the temporal order of samples within large datasets and is of particular relevance to the study of organisms with asynchronous development and when morphological staging is difficult.
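The BLIND pipeline described above (principal components analysis, traveling-salesman ordering, entropy-based orientation) can be sketched on synthetic data. This is an illustrative reconstruction, not the authors' code: it uses invented expression profiles and substitutes a greedy nearest-neighbour tour for the full traveling-salesman step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 10 samples along a smooth developmental trajectory
# in a 50-gene expression space, then shuffled.
n_samples, n_genes = 10, 50
t = np.linspace(0, 1, n_samples)
b0, b1 = rng.normal(size=n_genes), rng.normal(size=n_genes)
X = np.outer(t, b0) + 0.3 * np.outer(t ** 2, b1)
X += rng.normal(scale=0.01, size=X.shape)
perm = rng.permutation(n_samples)
X_shuf = X[perm]

# Step 1: principal components via SVD of the centered data matrix.
Xc = X_shuf - X_shuf.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pcs = U[:, :2] * S[:2]

# Step 2: greedy nearest-neighbour tour starting from the point
# farthest from the centroid (a simple stand-in for the
# traveling-salesman ordering used by BLIND).
start = int(np.argmax(np.linalg.norm(pcs, axis=1)))
order, remaining = [start], set(range(n_samples)) - {start}
while remaining:
    nxt = min(remaining, key=lambda i: np.linalg.norm(pcs[i] - pcs[order[-1]]))
    order.append(nxt)
    remaining.remove(nxt)

# Step 3: orient the tour so transcriptome entropy increases, since
# the paper reports entropy rising over developmental time.
def entropy(x):
    p = np.abs(x) / np.abs(x).sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

if entropy(X_shuf[order[0]]) > entropy(X_shuf[order[-1]]):
    order.reverse()

recovered = perm[order]  # positions along the true trajectory
```

With enough samples along a smooth trajectory, the tour recovers the true order up to overall direction, which the entropy criterion then attempts to fix.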
Validity of Bioelectrical Impedance Analysis to Estimate Fat-Free Mass in Army Cadets.
Langer, Raquel D; Borges, Juliano H; Pascoa, Mauro A; Cirolini, Vagner X; Guerra-Júnior, Gil; Gonçalves, Ezequiel M
2016-03-11
Bioelectrical Impedance Analysis (BIA) is a fast, practical, non-invasive, and frequently used method for fat-free mass (FFM) estimation. The aims of this study were to validate published predictive BIA equations for FFM estimation in Army cadets and to develop and validate a specific BIA equation for this population. A total of 396 male Brazilian Army cadets, aged 17-24 years, were included. The study used eight published predictive BIA equations and a newly developed population-specific equation for FFM estimation, with dual-energy X-ray absorptiometry (DXA) as the reference method. Student's t-test (for paired samples), linear regression analysis, and the Bland-Altman method were used to test the validity of the BIA equations. The published predictive BIA equations showed significant differences in FFM compared to DXA (p < 0.05) and large limits of agreement by Bland-Altman. Predictive BIA equations explained 68% to 88% of FFM variance. The specific BIA equation showed no significant differences in FFM compared to DXA values. Published BIA predictive equations showed poor accuracy in this sample. The specific BIA equation developed in this study demonstrated validity for this sample, although it should be used with caution in samples with a large range of FFM.
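The Bland-Altman agreement analysis used above can be sketched as follows. This is an illustrative example on synthetic fat-free mass values, not the study's data; the variable names and the assumed +1.5 kg bias are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic fat-free mass (kg): a reference method (labelled "DXA"
# for illustration) and a hypothetical predictive equation that
# over-estimates by 1.5 kg with 2 kg of random scatter.
ffm_dxa = rng.normal(60.0, 6.0, size=200)
ffm_eq = ffm_dxa + 1.5 + rng.normal(0.0, 2.0, size=200)

def bland_altman(a, b):
    """Mean bias and 95% limits of agreement between two methods."""
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

bias, low, high = bland_altman(ffm_eq, ffm_dxa)
```

Limits of agreement that are wide relative to a clinically acceptable error are what the abstract calls "large limits of agreement".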
Corporate psychopathy: Talking the walk.
Babiak, Paul; Neumann, Craig S; Hare, Robert D
2010-01-01
There is a very large literature on the important role of psychopathy in the criminal justice system. We know much less about corporate psychopathy and its implications, in large part because of the difficulty in obtaining the active cooperation of business organizations. This has left us with only a few small-sample studies, anecdotes, and speculation. In this study, we had a unique opportunity to examine psychopathy and its correlates in a sample of 203 corporate professionals selected by their companies to participate in management development programs. The correlates included demographic and status variables, as well as in-house 360-degree assessments and performance ratings. The prevalence of psychopathic traits, as measured by the Psychopathy Checklist-Revised (PCL-R) and a Psychopathy Checklist: Screening Version (PCL:SV) "equivalent," was higher than that found in community samples. The results of confirmatory factor analysis (CFA) and structural equation modeling (SEM) indicated that the underlying latent structure of psychopathy in our corporate sample was consistent with the model found in community and offender studies. Psychopathy was positively associated with in-house ratings of charisma/presentation style (creativity, good strategic thinking, and communication skills) but negatively associated with ratings of responsibility/performance (being a team player, management skills, and overall accomplishments).
Patel, Rashmi; Jayatilleke, Nishamali; Broadbent, Matthew; Chang, Chin-Kuo; Foskett, Nadia; Gorrell, Genevieve; Hayes, Richard D; Jackson, Richard; Johnston, Caroline; Shetty, Hitesh; Roberts, Angus; McGuire, Philip; Stewart, Robert
2015-09-07
To identify negative symptoms in the clinical records of a large sample of patients with schizophrenia using natural language processing and assess their relationship with clinical outcomes. Observational study using an anonymised electronic health record case register. South London and Maudsley NHS Trust (SLaM), a large provider of inpatient and community mental healthcare in the UK. 7678 patients with schizophrenia receiving care during 2011. Hospital admission, readmission and duration of admission. 10 different negative symptoms were ascertained with precision statistics above 0.80. 41% of patients had 2 or more negative symptoms. Negative symptoms were associated with younger age, male gender and single marital status, and with increased likelihood of hospital admission (OR 1.24, 95% CI 1.10 to 1.39), longer duration of admission (β-coefficient 20.5 days, 7.6-33.5), and increased likelihood of readmission following discharge (OR 1.58, 1.28 to 1.95). Negative symptoms were common and associated with adverse clinical outcomes, consistent with evidence that these symptoms account for much of the disability associated with schizophrenia. Natural language processing provides a means of conducting research in large representative samples of patients, using data recorded during routine clinical practice. Published by the BMJ Publishing Group Limited.
Incidental Learning of Melodic Structure of North Indian Music
ERIC Educational Resources Information Center
Rohrmeier, Martin; Widdess, Richard
2017-01-01
Musical knowledge is largely implicit. It is acquired without awareness of its complex rules, through interaction with a large number of samples during musical enculturation. Whereas several studies explored implicit learning of mostly abstract and less ecologically valid features of Western music, very little work has been done with respect to…
Learning Instructor Intervention from MOOC Forums: Early Results and Issues
ERIC Educational Resources Information Center
Kumar, Muthu; Kan, Min-Yen; Tan, Bernard C. Y.; Ragupathi, Kiruthika
2015-01-01
With large student enrollment, MOOC instructors face the unique challenge in deciding when to intervene in forum discussions with their limited bandwidth. We study this problem of "instructor intervention." Using a large sample of forum data culled from 61 courses, we design a binary classifier to predict whether an instructor should…
Development and validation of a low-density SNP panel related to prolificacy in sheep
USDA-ARS?s Scientific Manuscript database
High-density SNP panels (e.g., 50,000 and 600,000 markers) have been used in exploratory population genetic studies with commercial and minor breeds of sheep. However, routine genetic diversity evaluations of large numbers of samples with large panels are in general cost-prohibitive for gene banks. ...
Graham, Jay P; VanDerslice, James
2007-06-01
Many communities along the US-Mexico border remain without infrastructure for water and sewage. Residents in these communities often collect and store their water in open 55-gallon drums. This study evaluated changes in drinking water quality resulting from an intervention that provided large closed water storage tanks (2,500 gallons) to individual homes lacking a piped water supply. After the intervention, many of the households did not change the source of their drinking water to the large storage tanks. Therefore, water quality results were first compared based on the source of the household's drinking water: store or vending machine, large tank, or collected from a public supply and transported by the household. Among the households that used the large storage tank as their drinking water supply, the water was generally of poorer quality. Fifty-four percent of samples collected prior to intervention had detectable levels of total coliforms, while 82% of samples were positive nine months after the intervention (p < 0.05). Exploratory analyses were also carried out to measure water quality at different points between collection by water delivery trucks and delivery to the household's large storage tank. Thirty percent of the samples taken immediately after water was delivered to the home had high total coliforms (> 10 CFU/100 ml). Mean free chlorine levels dropped from 0.43 mg/l, where the trucks filled their tanks, to 0.20 mg/l inside the household's tank immediately after delivery. Results of this study have implications for interventions that focus on safe water treatment and storage in the home, and for guidelines regarding the level of free chlorine required in water delivered by water delivery trucks.
O'Neal, Wanda K; Anderson, Wayne; Basta, Patricia V; Carretta, Elizabeth E; Doerschuk, Claire M; Barr, R Graham; Bleecker, Eugene R; Christenson, Stephanie A; Curtis, Jeffrey L; Han, Meilan K; Hansel, Nadia N; Kanner, Richard E; Kleerup, Eric C; Martinez, Fernando J; Miller, Bruce E; Peters, Stephen P; Rennard, Stephen I; Scholand, Mary Beth; Tal-Singer, Ruth; Woodruff, Prescott G; Couper, David J; Davis, Sonia M
2014-01-08
As a part of the longitudinal Chronic Obstructive Pulmonary Disease (COPD) study, Subpopulations and Intermediate Outcome Measures in COPD study (SPIROMICS), blood samples are being collected from 3200 subjects with the goal of identifying blood biomarkers for sub-phenotyping patients and predicting disease progression. To determine the most reliable sample type for measuring specific blood analytes in the cohort, a pilot study was performed from a subset of 24 subjects comparing serum, Ethylenediaminetetraacetic acid (EDTA) plasma, and EDTA plasma with proteinase inhibitors (P100). 105 analytes, chosen for potential relevance to COPD, arranged in 12 multiplex and one simplex platform (Myriad-RBM) were evaluated in duplicate from the three sample types from 24 subjects. The reliability coefficient and the coefficient of variation (CV) were calculated. The performance of each analyte and mean analyte levels were evaluated across sample types. 20% of analytes were not consistently detectable in any sample type. Higher reliability and/or smaller CV were determined for 12 analytes in EDTA plasma compared to serum, and for 11 analytes in serum compared to EDTA plasma. While reliability measures were similar for EDTA plasma and P100 plasma for a majority of analytes, CV was modestly increased in P100 plasma for eight analytes. Each analyte within a multiplex produced independent measurement characteristics, complicating selection of sample type for individual multiplexes. There were notable detectability and measurability differences between serum and plasma. Multiplexing may not be ideal if large reliability differences exist across analytes measured within the multiplex, especially if values differ based on sample type. For some analytes, the large CV should be considered during experimental design, and the use of duplicate and/or triplicate samples may be necessary. 
These results should prove useful for studies evaluating selection of samples for evaluation of potential blood biomarkers.
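The coefficient of variation and a simple reliability coefficient from duplicate measurements, as used in the pilot study above, can be sketched as follows. This is an illustrative toy calculation on synthetic analyte levels, not the Myriad-RBM pipeline; all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic duplicate measurements of one analyte in 24 subjects:
# true between-subject variation plus within-subject technical noise.
n = 24
true_level = rng.normal(100.0, 20.0, size=n)
dup1 = true_level + rng.normal(0.0, 5.0, size=n)
dup2 = true_level + rng.normal(0.0, 5.0, size=n)

# Within-subject CV from duplicates (root-mean-square method).
pair_mean = (dup1 + dup2) / 2
pair_sd = np.abs(dup1 - dup2) / np.sqrt(2)
cv = float(np.sqrt(np.mean((pair_sd / pair_mean) ** 2)))

# Reliability coefficient: the proportion of total variance that is
# between-subject rather than technical (within-pair) variance.
within_var = np.mean((dup1 - dup2) ** 2) / 2
total_var = np.var(np.concatenate([dup1, dup2]), ddof=1)
reliability = float(1 - within_var / total_var)
```

A small CV with a reliability near 1 indicates that duplicate measurements agree well and that most variation reflects real subject differences, which is the basis for preferring one sample type over another.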
NASA Astrophysics Data System (ADS)
Spinoglio, L.; Alonso-Herrero, A.; Armus, L.; Baes, M.; Bernard-Salas, J.; Bianchi, S.; Bocchio, M.; Bolatto, A.; Bradford, C.; Braine, J.; Carrera, F. J.; Ciesla, L.; Clements, D. L.; Dannerbauer, H.; Doi, Y.; Efstathiou, A.; Egami, E.; Fernández-Ontiveros, J. A.; Ferrara, A.; Fischer, J.; Franceschini, A.; Gallerani, S.; Giard, M.; González-Alfonso, E.; Gruppioni, C.; Guillard, P.; Hatziminaoglou, E.; Imanishi, M.; Ishihara, D.; Isobe, N.; Kaneda, H.; Kawada, M.; Kohno, K.; Kwon, J.; Madden, S.; Malkan, M. A.; Marassi, S.; Matsuhara, H.; Matsuura, M.; Miniutti, G.; Nagamine, K.; Nagao, T.; Najarro, F.; Nakagawa, T.; Onaka, T.; Oyabu, S.; Pallottini, A.; Piro, L.; Pozzi, F.; Rodighiero, G.; Roelfsema, P.; Sakon, I.; Santini, P.; Schaerer, D.; Schneider, R.; Scott, D.; Serjeant, S.; Shibai, H.; Smith, J.-D. T.; Sobacchi, E.; Sturm, E.; Suzuki, T.; Vallini, L.; van der Tak, F.; Vignali, C.; Yamada, T.; Wada, T.; Wang, L.
2017-11-01
IR spectroscopy in the range 12-230 μm with the SPace IR telescope for Cosmology and Astrophysics (SPICA) will reveal the physical processes governing the formation and evolution of galaxies and black holes through cosmic time, bridging the gap between the James Webb Space Telescope and the upcoming Extremely Large Telescopes at shorter wavelengths and the Atacama Large Millimeter Array at longer wavelengths. The SPICA, with its 2.5-m telescope actively cooled to below 8 K, will obtain the first spectroscopic determination, in the mid-IR rest-frame, of both the star-formation rate and black hole accretion rate histories of galaxies, reaching lookback times of 12 Gyr, for large statistically significant samples. Densities, temperatures, radiation fields, and gas-phase metallicities will be measured in dust-obscured galaxies and active galactic nuclei, sampling a large range in mass and luminosity, from faint local dwarf galaxies to luminous quasars in the distant Universe. Active galactic nuclei and starburst feedback and feeding mechanisms in distant galaxies will be uncovered through detailed measurements of molecular and atomic line profiles. The SPICA's large-area deep spectrophotometric surveys will provide mid-IR spectra and continuum fluxes for unbiased samples of tens of thousands of galaxies, out to redshifts of z ≈ 6.
Dielectric studies on PEG-LTMS based polymer composites
NASA Astrophysics Data System (ADS)
Patil, Ravikumar V.; Praveen, D.; Damle, R.
2018-02-01
PEG-LTMS-based polymer composites were prepared and studied for the variation of the dielectric constant with frequency and temperature, as potential candidates with better dielectric properties. The solution-cast technique was used to prepare polymer composites with five different compositions. The samples show variation in the dielectric constant with frequency and temperature: the dielectric constant is large at low frequencies and high temperatures, and samples with larger space charge showed a larger dielectric constant. The highest dielectric constant observed was about 29244, for the PEG25LTMS sample at 100 Hz and 312 K.
Bartsch, L.A.; Richardson, W.B.; Naimo, T.J.
1998-01-01
Estimation of benthic macroinvertebrate populations over large spatial scales is difficult due to the high variability in abundance and the cost of sample processing and taxonomic analysis. To determine a cost-effective, statistically powerful sample design, we conducted an exploratory study of the spatial variation of benthic macroinvertebrates in a 37 km reach of the Upper Mississippi River. We sampled benthos at 36 sites within each of two strata, contiguous backwater and channel border. Three standard ponar (525 cm²) grab samples were obtained at each site ('Original Design'). Analysis of variance and sampling cost of strata-wide estimates for abundance of Oligochaeta, Chironomidae, and total invertebrates showed that only one ponar sample per site ('Reduced Design') yielded essentially the same abundance estimates as the Original Design, while reducing the overall cost by 63%. A posteriori statistical power analysis (alpha = 0.05, beta = 0.20) on the Reduced Design estimated that at least 18 sites per stratum were needed to detect differences in mean abundance between contiguous backwater and channel border areas for Oligochaeta, Chironomidae, and total invertebrates. Statistical power was nearly identical for the three taxonomic groups. The abundances of several taxa of concern (e.g., Hexagenia mayflies and Musculium fingernail clams) were too spatially variable to estimate power with our method. Resampling simulations indicated that to achieve adequate sampling precision for Oligochaeta, at least 36 sample sites per stratum would be required, whereas a sampling precision of 0.2 would not be attained with any sample size for Hexagenia in channel border areas, or Chironomidae and Musculium in both strata given the variance structure of the original samples. Community-wide diversity indices (Brillouin and 1-Simpson's) increased as sample area per site increased. The backwater area had higher diversity than the channel border area.
The number of sampling sites required to sample benthic macroinvertebrates during our sampling period depended on the study objective and ranged from 18 to more than 40 sites per stratum. No single sampling regime would efficiently and adequately sample all components of the macroinvertebrate community.
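The a posteriori power analysis (alpha = 0.05, beta = 0.20) behind figures like "18 sites per stratum" corresponds to a standard two-sample size calculation. A minimal sketch using the normal approximation, with invented effect sizes rather than the study's variance estimates:

```python
from math import ceil
from statistics import NormalDist

def sites_per_stratum(delta, sd, alpha=0.05, power=0.80):
    """Sites needed per stratum for a two-sample z-test to detect a
    mean difference `delta` given a common standard deviation `sd`."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # 1.96 for alpha = 0.05
    z_b = NormalDist().inv_cdf(power)           # 0.84 for power = 0.80
    return ceil(2 * ((z_a + z_b) * sd / delta) ** 2)

# Illustrative: detecting a between-stratum difference equal to one
# standard deviation requires 16 sites per stratum.
n = sites_per_stratum(delta=1.0, sd=1.0)
```

Halving the detectable difference roughly quadruples the required number of sites, which is why the highly variable taxa in the abstract could not be adequately sampled at any feasible size.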
Cantiello, John; Fottler, Myron D; Oetjen, Dawn; Zhang, Ning Jackie
2015-05-12
The large number of uninsured individuals in the United States creates negative consequences for those who are uninsured and for those who are covered by health insurance plans. Young adults between the ages of 18 and 24 are the largest uninsured population subgroup and warrant analysis. The major aim of this study is to determine why young adults between the ages of 18 and 24 are the largest population subgroup not covered by private health insurance. Data on perceived health status, perceived need, perceived value, socioeconomic status, gender, and race were obtained from a national sample of 1,340 young adults from the 2005 Medical Expenditure Panel Survey and examined for possible explanatory variables, as well as data on the same variables from a national sample of 1,463 from the 2008 Medical Expenditure Panel Survey. Results of the structural equation model analysis indicate that insurance coverage in the 2005 sample was largely a function of higher socioeconomic status and being a non-minority. Perceived health status, perceived need, perceived value, and gender were not significant predictors of private health insurance coverage in the 2005 sample. However, in the 2008 sample, these indicators changed. Socioeconomic status, minority status, perceived health, perceived need, and perceived value were significant predictors of private health insurance coverage. The results of this study show that coverage by a private health insurance plan in the 2005 sample was largely a matter of having a higher socioeconomic status and having a non-minority status. In 2008 each of the attitudinal variables (perceived health, perceived value, and perceived need) predicted whether subjects carried private insurance.
Our findings suggest that among those sampled, the young adult subgroup between the ages of 18 and 24 does not necessarily represent a unique segment of the population, with behaviors differing from the rest of the sample.
NASA Technical Reports Server (NTRS)
Lau, William K. M. (Technical Monitor); Bell, Thomas L.; Steiner, Matthias; Zhang, Yu; Wood, Eric F.
2002-01-01
The uncertainty of rainfall estimated from averages of discrete samples collected by a satellite is assessed using a multi-year radar data set covering a large portion of the United States. The sampling-related uncertainty of rainfall estimates is evaluated for all combinations of 100 km, 200 km, and 500 km space domains, 1 day, 5 day, and 30 day rainfall accumulations, and regular sampling time intervals of 1 h, 3 h, 6 h, 8 h, and 12 h. These extensive analyses are combined to characterize the sampling uncertainty as a function of space and time domain, sampling frequency, and rainfall characteristics by means of a simple scaling law. Moreover, it is shown that both parametric and non-parametric statistical techniques of estimating the sampling uncertainty produce comparable results. Sampling uncertainty estimates, however, do depend on the choice of technique for obtaining them. They can also vary considerably from case to case, reflecting the great variability of natural rainfall, and should therefore be expressed in probabilistic terms. Rainfall calibration errors are shown to affect comparison of results obtained by studies based on data from different climate regions and/or observation platforms.
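The dependence of sampling uncertainty on the sampling interval described above can be illustrated by subsampling synthetic hourly rainfall series. This is a toy stand-in for the radar analysis: the rainfall model (mostly dry hours with gamma-distributed bursts) and all parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(3)
hours = 30 * 24  # one 30-day accumulation at hourly resolution

def relative_rmse(series, interval):
    """Relative error of the accumulation estimated from samples taken
    every `interval` hours, root-mean-squared over sampling offsets."""
    total = series.sum()
    errs = [(series[k::interval].mean() * len(series) - total) / total
            for k in range(interval)]
    return np.sqrt(np.mean(np.square(errs)))

def mean_error(interval, trials=200):
    """Average the sampling error over many synthetic rainfall series
    (intermittent: each hour is dry with probability 0.85)."""
    out = []
    for _ in range(trials):
        s = rng.gamma(0.2, 2.0, size=hours) * (rng.random(hours) < 0.15)
        out.append(relative_rmse(s, interval))
    return float(np.mean(out))

err_3h, err_12h = mean_error(3), mean_error(12)
```

Coarser sampling intervals leave fewer samples per accumulation period and therefore larger relative errors, the qualitative behaviour the paper's scaling law quantifies.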
Exploratory Factor Analysis with Small Sample Sizes
ERIC Educational Resources Information Center
de Winter, J. C. F.; Dodou, D.; Wieringa, P. A.
2009-01-01
Exploratory factor analysis (EFA) is generally regarded as a technique for large sample sizes ("N"), with N = 50 as a reasonable absolute minimum. This study offers a comprehensive overview of the conditions in which EFA can yield good quality results for "N" below 50. Simulations were carried out to estimate the minimum required "N" for different…
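The simulation approach for estimating a minimum required N can be sketched: generate samples of varying size from a known factor model and measure how well the loadings are recovered. This illustrative version uses principal-component extraction as a stand-in for EFA, with invented loadings and sample sizes.

```python
import numpy as np

rng = np.random.default_rng(4)

# True structure: 8 variables, each loading on one of 2 orthogonal
# factors (loadings 0.8 and 0.6, chosen so the factors are distinct).
L = np.zeros((8, 2))
L[:4, 0], L[4:, 1] = 0.8, 0.6
psi = 1 - (L ** 2).sum(axis=1)  # unique variances

def mean_congruence(n, trials=100):
    """Average Tucker congruence between true and recovered loadings
    for samples of size n (principal-component extraction)."""
    out = []
    for _ in range(trials):
        F = rng.normal(size=(n, 2))
        X = F @ L.T + rng.normal(size=(n, 8)) * np.sqrt(psi)
        R = np.corrcoef(X, rowvar=False)
        w, v = np.linalg.eigh(R)
        est = v[:, -2:] * np.sqrt(w[-2:])  # two leading components
        cos = np.abs(est.T @ L) / np.outer(
            np.linalg.norm(est, axis=0), np.linalg.norm(L, axis=0))
        out.append(cos.max(axis=0).mean())  # best match per true factor
    return float(np.mean(out))

large_n, small_n = mean_congruence(200), mean_congruence(20)
```

Repeating this over a grid of sample sizes, loading magnitudes, and variable counts is the kind of simulation the study uses to map out when small-N factor analysis still recovers the true structure.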
Associating Pregnancy with Partner Violence against Chinese Women
ERIC Educational Resources Information Center
Chan, Ko Ling; Brownridge, Douglas A.; Tiwari, Agnes; Fong, Daniel Y. T.; Leung, Wing Cheong; Ho, Pak Chung
2011-01-01
The present study discusses if pregnancy is a risk factor for intimate partner violence using a large, representative sample containing detailed information on partner violence including physical and sexual abuse as well as perpetrator-related risk factors. Data from a representative sample of 2,225 men were analyzed. The self-reported prevalence…
PROPOSED STANDARDIZED ASSESSMENT METHODS (SAMS) FOR ELECTROFISHING LARGE RIVERS
The effects of electrofishing design and sampling distance were studied at 49 sites across four boatable rivers ranging in drainage area from 13,947 to 23,041 km² in the Ohio River basin. Two general types of sites were sampled: Run-of-the-River (Free-flowing sites or with smal...
Production ecology of Thuja occidentalis
Philip V. Hofmeyer; Robert S. Seymour; Laura S. Kenefic
2010-01-01
Equations to predict branch and tree leaf area, foliar mass, and stemwood volume were developed from 25 destructively sampled northern white-cedar (Thuja occidentalis L.) trees, a species whose production ecology has not been studied. Resulting models were applied to a large sample of 296 cored trees from 60 sites stratified across a soil gradient...
Career Transitions: The Experiences of Unemployed Women Managers
ERIC Educational Resources Information Center
Sheridan, Terry A.
2008-01-01
A sample of 45 women managers was surveyed in a qualitative study to explore their experiences of being unemployed. The sample was purposeful, and the data were collected on a website-based survey. The experience of unemployment for female managers was far different from what was previously presumed from research largely drawn from male …
ERIC Educational Resources Information Center
Whitehouse, Andrew J. O.; Mattes, Eugen; Maybery, Murray T.; Sawyer, Michael G.; Jacoby, Peter; Keelan, Jeffrey A.; Hickey, Martha
2012-01-01
Background: Preliminary evidence suggests that prenatal testosterone exposure may be associated with language delay. However, no study has examined a large sample of children at multiple time-points. Methods: Umbilical cord blood samples were obtained at 861 births and analysed for bioavailable testosterone (BioT) concentrations. When…
Reported Childhood Sexual Abuse and Eating-Disordered Cognitions and Behaviors
ERIC Educational Resources Information Center
van Gerko, K.; Hughes, M.L.; Hamill, M.; Waller, G.
2005-01-01
Objective: This study assessed links between reported childhood sexual abuse and a range of eating behaviors and attitudes, among a large sample of eating-disordered women. It tested the hypothesis that there will be links to bulimic behaviors and body dissatisfaction, rather than restriction. Method: The sample consisted of 299 women, meeting…
Kostich, Mitchell S; Batt, Angela L; Lazorchak, James M
2014-01-01
We measured concentrations of 56 active pharmaceutical ingredients (APIs) in effluent samples from 50 large wastewater treatment plants across the US. Hydrochlorothiazide was found in every sample. Metoprolol, atenolol, and carbamazepine were found in over 90% of the samples. Valsartan had the highest concentration (5300 ng/L), and also had the highest average concentration (1600 ng/L) across all 50 samples. Estimates of potential risks to healthy human adults were greatest for six anti-hypertensive APIs (lisinopril, hydrochlorothiazide, valsartan, atenolol, enalaprilat, and metoprolol), but nevertheless suggest that risks of exposure to individual APIs as well as their mixtures are generally very low. Estimates of potential risks to aquatic life were also low for most APIs, but suggest that more detailed study of potential ecological impacts from four analytes (sertraline, propranolol, desmethylsertraline, and valsartan) is warranted. Published by Elsevier Ltd.
Does size matter? Statistical limits of paleomagnetic field reconstruction from small rock specimens
NASA Astrophysics Data System (ADS)
Berndt, Thomas; Muxworthy, Adrian R.; Fabian, Karl
2016-01-01
As samples of ever-decreasing sizes are being studied paleomagnetically, care has to be taken that the underlying assumptions of statistical thermodynamics (Maxwell-Boltzmann statistics) are being met. Here we determine how many grains and how large a magnetic moment a sample needs to have to be able to accurately record an ambient field. It is found that for samples with a thermoremanent magnetic moment larger than 10⁻¹¹ Am² the assumption of a sufficiently large number of grains is usually given. Standard 25 mm diameter paleomagnetic samples usually contain enough magnetic grains such that statistical errors are negligible, but "single silicate crystal" works on, for example, zircon, plagioclase, and olivine crystals are approaching the limits of what is physically possible, leading to statistical errors in both the angular deviation and paleointensity that are comparable to other sources of error. The reliability of nanopaleomagnetic imaging techniques capable of resolving individual grains (used, for example, to study the cloudy zone in meteorites), however, is questionable due to the limited area of the material covered.
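The statistical limit discussed above can be illustrated with a toy two-state blocking model: each grain's moment freezes parallel or antiparallel to the ambient field, and the relative scatter of the net moment shrinks as roughly 1/√N with the number of grains. The parameterization below is an assumed simplification for illustration, not the authors' thermodynamic treatment.

```python
import numpy as np

rng = np.random.default_rng(5)

def trm_scatter(n_grains, a=0.1, trials=400):
    """Relative scatter of the net moment over repeated 'coolings' of
    n_grains identical single-domain grains. Each grain blocks
    parallel to the field with probability p = (1 + tanh(a)) / 2,
    where `a` is a toy field-alignment parameter (an assumed
    simplification of the Boltzmann blocking statistics)."""
    p = 0.5 * (1 + np.tanh(a))
    signs = np.where(rng.random((trials, n_grains)) < p, 1.0, -1.0)
    net = signs.mean(axis=1)  # normalized net moment per cooling
    return float(net.std() / net.mean())

# Few grains: the recorded moment scatters strongly between coolings;
# many grains: the scatter is an order of magnitude smaller.
few, many = trm_scatter(100), trm_scatter(10_000)
```

In this toy model a 100-fold increase in grain count reduces the relative scatter about tenfold, the 1/√N behaviour that sets the minimum sample moment for reliable recording.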
NASA Astrophysics Data System (ADS)
Piazzi, L.; Bonaviri, C.; Castelli, A.; Ceccherelli, G.; Costa, G.; Curini-Galletti, M.; Langeneck, J.; Manconi, R.; Montefalcone, M.; Pipitone, C.; Rosso, A.; Pinna, S.
2018-07-01
In the Mediterranean Sea, Cystoseira species are the most important canopy-forming algae in shallow rocky bottoms, hosting highly biodiverse sessile and mobile communities. A large-scale study has been carried out to investigate the structure of the Cystoseira-dominated assemblages at different spatial scales and to test the hypotheses that alpha and beta diversity of the assemblages, the abundance and the structure of epiphytic macroalgae, epilithic macroalgae, sessile macroinvertebrates and mobile macroinvertebrates associated with Cystoseira beds changed among scales. A hierarchical sampling design in a total of five sites across the Mediterranean Sea (Croatia, Montenegro, Sardinia, Tuscany and Balearic Islands) was used. A total of 597 taxa associated with Cystoseira beds were identified, with a mean number per sample ranging between 141.1 ± 6.6 (Tuscany) and 173.9 ± 8.5 (Sardinia). A high variability at small (among samples) and large (among sites) scales was generally highlighted, but the studied assemblages showed different patterns of spatial variability. The relative importance of the different scales of spatial variability should be considered to optimize sampling designs and propose monitoring plans for this habitat.
Bond, Alexander L; Provencher, Jennifer F; Elliot, Richard D; Ryan, Pierre C; Rowe, Sherrylynn; Jones, Ian L; Robertson, Gregory J; Wilhelm, Sabina I
2013-12-15
Plastic ingestion by seabirds is a growing conservation issue, but there are few time series of plastic ingestion with large sample sizes for which one can assess temporal trends. Common and Thick-billed Murres (Uria aalge and U. lomvia) are pursuit-diving auks that are legally harvested in Newfoundland and Labrador, Canada. Here, we combined previously unpublished data on plastic ingestion (from the 1980s to the 1990s) with contemporary samples (2011-2012) to evaluate changes in murres' plastic ingestion. Approximately 7% of murres had ingested plastic, with no significant change in the frequency of ingestion among species or periods. The number of pieces of plastic/bird, and mass of plastic/bird were highest in the 1980s, lowest in the late 1990s, and intermediate in contemporary samples. Studying plastic ingestion in harvested seabird populations links harvesters to conservation and health-related issues and is a useful source of large samples for diet and plastic ingestion studies. Copyright © 2013 Elsevier Ltd. All rights reserved.
Xu, Man K; Morin, Alexandre J S; Marsh, Herbert W; Richards, Marcus; Jones, Peter B
2016-08-01
The factorial structure of the Parental Bonding Instrument (PBI) has been frequently studied in diverse samples, but no study has examined its psychometric properties in large, population-based samples. In particular, important questions such as measurement invariance across parental and offspring gender have not been addressed. We evaluated the PBI based on responses from a large, representative population-based sample, using an exploratory structural equation modeling method appropriate for categorical data. The analysis revealed a three-factor structure representing "care," "overprotection," and "autonomy" parenting styles. In terms of psychometric measurement validity, our results supported the complete invariance of PBI ratings across sons and daughters rating their mothers and fathers. The PBI ratings were also robust in relation to personality and mental health status. In terms of predictive value, paternal care showed a protective effect on mental health at age 43 in sons. The PBI is a sound instrument for capturing perceived parenting styles, and is predictive of mental health in middle adulthood. © The Author(s) 2016.
Nondestructive Analysis of Astromaterials by Micro-CT and Micro-XRF Analysis for PET Examination
NASA Technical Reports Server (NTRS)
Zeigler, R. A.; Righter, K.; Allen, C. C.
2013-01-01
An integral part of any sample return mission is the initial description and classification of returned samples by the preliminary examination team (PET). The goal of the PET is to characterize and classify returned samples and make this information available to the larger research community, who then conduct more in-depth studies on the samples. The PET tries to minimize the impact its work has on the sample suite, which has in the past limited PET work to largely visual, nonquantitative measurements (e.g., optical microscopy). More modern techniques can also be utilized by a PET to nondestructively characterize astromaterials in a much more rigorous way. Here we discuss our recent investigations into the applications of micro-CT and micro-XRF analyses with Apollo samples and ANSMET meteorites, and assess the usefulness of these techniques for future PETs. Results: The application of micro computerized tomography (micro-CT) to astromaterials is not a new concept. The technique involves scanning samples with high-energy x-rays and constructing 3-dimensional images of the density of materials within the sample. The technique can routinely measure large samples (up to approx. 2700 cu cm) with a small individual voxel size (approx. 30 microns), and has the sensitivity to distinguish the major rock-forming minerals and identify clast populations within brecciated samples. We have recently run a test sample of a terrestrial breccia with a carbonate matrix and multiple igneous clast lithologies. The test results are promising, and we will soon analyze an approx. 600 g piece of Apollo sample 14321 to map out the clast population within the sample. Benchtop micro x-ray fluorescence (micro-XRF) instruments can rapidly scan large areas (approx. 100 sq cm) with a small pixel size (approx. 25 microns) and measure the (semi)quantitative composition of largely unprepared surfaces for all elements between Be and U, often with sensitivity on the order of approx. 100 ppm.
Our recent testing of meteorite and Apollo samples on micro-XRF instruments has shown that they can easily detect small zircons and phosphates (approx. 10 microns), distinguish different clast lithologies within breccias, and identify different lithologies within small rock fragments (2-4 mm Apollo soil fragments).
Sampling Large Graphs for Anticipatory Analytics
2015-05-15
Edwards, Lauren; Johnson, Luke; Milosavljevic, Maja; Gadepally, Vijay; Miller, Benjamin A. (Lincoln Laboratory)
Excerpts: "Random area sampling [8] is a 'snowball' sampling method in which a set of random seed vertices are selected and areas ..." "... systems, greater human-in-the-loop involvement, or through complex algorithms. We are investigating the use of sampling to mitigate these challenges."
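The "snowball" expansion mentioned in the excerpt can be sketched as follows. This is a generic illustration, not the report's implementation; the toy graph, seed choice, and node budget are invented.

```python
import random
from collections import deque

def snowball_sample(adj, seeds, max_nodes):
    """Snowball-style area sampling: expand breadth-first around random
    seed vertices until the sample budget is exhausted. `adj` maps each
    node to its neighbor list (a toy stand-in for a large graph)."""
    sampled, frontier = set(seeds), deque(seeds)
    while frontier and len(sampled) < max_nodes:
        node = frontier.popleft()
        for nb in adj[node]:
            if nb not in sampled and len(sampled) < max_nodes:
                sampled.add(nb)
                frontier.append(nb)
    return sampled

# Toy graph: a 2x5 grid encoded by hand.
adj = {
    0: [1, 5], 1: [0, 2, 6], 2: [1, 3, 7], 3: [2, 4, 8], 4: [3, 9],
    5: [0, 6], 6: [5, 1, 7], 7: [6, 2, 8], 8: [7, 3, 9], 9: [8, 4],
}
random.seed(0)
sample = snowball_sample(adj, seeds=[random.choice(list(adj))], max_nodes=6)
print(sorted(sample))
```

Because the sample grows outward from the seeds, it captures contiguous "areas" of the graph rather than a scattered uniform subset, which is the trade-off this family of methods makes.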
Procedures and equipment for staining large numbers of plant root samples for endomycorrhizal assay.
Kormanik, P P; Bryan, W C; Schultz, R C
1980-04-01
A simplified method of clearing and staining large numbers of plant roots for vesicular-arbuscular (VA) mycorrhizal assay is presented. Equipment needed for handling multiple samples is described, and two formulations for the different chemical solutions are presented. Because one formulation contains phenol, its use should be limited to basic studies for which adequate laboratory exhaust hoods are available and great clarity of fungal structures is required. The second staining formulation, utilizing lactic acid instead of phenol, is less toxic, requires less elaborate laboratory facilities, and has proven to be completely satisfactory for VA assays.
Using demographics to predict smoking behavior: large sample evidence from an emerging market.
Prinsloo, Melani; Tudhope, Lynne; Pitt, Leyland; Campbell, Colin
2008-01-01
Smoking and nicotine addiction are among the major preventable causes of disease and mortality. Being able to target promotional campaigns effectively relies on a good understanding of the demographics of smokers and potential smokers. This study reports on the results of a large sample survey of the demographics of smokers and non-smokers in South African townships. Using logistic regression, it finds that smokers tend to be significantly older, less educated males and, somewhat surprisingly, to have no religious affiliation. Implications for public health policy are identified, and avenues for future research are recognized.
Barcelos, Adriana Renata; Bobrowiec, Paulo Estefano D; Sanaiotti, Tânia Margarete; Gribel, Rogério
2013-03-01
This study evaluated the potential of lowland tapirs as seed dispersers in the northern Brazilian Amazon. The study analyzed the viability of seeds after passage through the gut. Fecal samples were collected from 6 different vegetation physiognomies in Viruá National Park during the dry season. The samples were then kept in a greenhouse for 16 months to allow the seeds to germinate. The seedling species were identified and classified according to the type of fruit, plant habit, seed size and type of ingestion. Of the 111 fecal samples, 94 (84.7%) had viable seeds of 75 species. Melastomataceae was the most frequent family with viable seeds in the fecal samples (69.1% of samples, N = 18 species). The data suggest that the importance of lowland tapirs as dispersers is not restricted to the species consumed actively by frugivory but also extends to species accidentally consumed during browsing. The occurrence of both large and small viable seeds in the fecal samples, as well as a number of large drupes that probably cannot be transported via endozoochory by any other animal species, provides evidence of the ecological importance of lowland tapirs to the dynamics of the forest-campinarana vegetation mosaic in the region. © 2012 Wiley Publishing Asia Pty Ltd, ISZS and IOZ/CAS.
Psychometric evaluation of the thought-action fusion scale in a large clinical sample.
Meyer, Joseph F; Brown, Timothy A
2013-12-01
This study examined the psychometric properties of the 19-item Thought-Action Fusion (TAF) Scale, a measure of maladaptive cognitive intrusions, in a large clinical sample (N = 700). An exploratory factor analysis (n = 300) yielded two interpretable factors: TAF Moral (TAF-M) and TAF Likelihood (TAF-L). A confirmatory bifactor analysis was conducted on the second portion of the sample (n = 400) to account for possible sources of item covariance using a general TAF factor (subsuming TAF-M) alongside the TAF-L domain-specific factor. The bifactor model provided an acceptable fit to the sample data. Results indicated that global TAF was more strongly associated with a measure of obsessive-compulsiveness than measures of general worry and depression, and the TAF-L dimension was more strongly related to obsessive-compulsiveness than depression. Overall, results support the bifactor structure of the TAF in a clinical sample and its close relationship to its neighboring obsessive-compulsiveness construct.
Electrofishing effort requirements for estimating species richness in the Kootenai River, Idaho
Watkins, Carson J.; Quist, Michael C.; Shepard, Bradley B.; Ireland, Susan C.
2016-01-01
This study was conducted on the Kootenai River, Idaho, to provide insight on sampling requirements and to optimize future monitoring efforts associated with the response of fish assemblages to habitat rehabilitation. Our objective was to define the electrofishing effort (m) needed to have a 95% probability of sampling 50, 75, and 100% of the observed species richness, and to evaluate the relative influence of depth, velocity, and instream woody cover on sample size requirements. Side-channel habitats required more sampling effort to achieve 75 and 100% of the total species richness than main-channel habitats. The sampling effort required to have a 95% probability of sampling 100% of the species richness was 1100 m for main-channel sites and 1400 m for side-channel sites. We hypothesized that the difference in sampling requirements between main- and side-channel habitats was largely due to differences in habitat characteristics and species richness between the two habitat types. In general, main-channel habitats had lower species richness than side-channel habitats. Habitat characteristics (i.e., depth, current velocity, and woody instream cover) were not related to sample size requirements. Our guidelines will improve sampling efficiency during monitoring efforts in the Kootenai River and provide insight on sampling designs for other large western river systems where electrofishing is used to assess fish assemblages.
Clendenen, Tess V; Rendleman, Justin; Ge, Wenzhen; Koenig, Karen L; Wirgin, Isaac; Currie, Diane; Shore, Roy E; Kirchhoff, Tomas; Zeleniuch-Jacquotte, Anne
2015-01-01
Large epidemiologic studies have the potential to make valuable contributions to the assessment of gene-environment interactions because they have prospectively collected detailed exposure data. Some of these studies, however, have only serum or plasma samples as a low-quantity source of DNA. We examined whether DNA isolated from serum can be used to reliably and accurately genotype single nucleotide polymorphisms (SNPs) using Sequenom multiplex SNP genotyping technology. We genotyped 81 SNPs using samples from 158 participants in the NYU Women's Health Study. Each participant had DNA from serum and at least one paired DNA sample isolated from a high-quality source of DNA, i.e. clots and/or cell precipitates, for comparison. We observed that 60 of the 81 SNPs (74%) had high call frequencies (≥95%) using DNA from serum, only slightly lower than the 85% of SNPs with high call frequencies in DNA from clots or cell precipitates. Of the 57 SNPs with high call frequencies for serum, clot, and cell precipitate DNA, 54 (95%) had highly concordant (>98%) genotype calls across all three sample types. High purity was not a critical factor for successful genotyping. Our results suggest that this multiplex SNP genotyping method can be used reliably on DNA from serum in large-scale epidemiologic studies.
Wang, Jingwen; Skoog, Tiina; Einarsdottir, Elisabet; Kaartokallio, Tea; Laivuori, Hannele; Grauers, Anna; Gerdhem, Paul; Hytönen, Marjo; Lohi, Hannes; Kere, Juha; Jiao, Hong
2016-01-01
High-throughput sequencing using pooled DNA samples can facilitate genome-wide studies on rare and low-frequency variants in a large population. Major questions concerning the pooled sequencing strategy are whether rare and low-frequency variants can be detected reliably, and whether estimated minor allele frequencies (MAFs) represent the actual values obtained from individually genotyped samples. In this study, we evaluated MAF estimates using three variant detection tools with two sets of pooled whole exome sequencing (WES) data and one set of pooled whole genome sequencing (WGS) data. Both GATK and Freebayes displayed high sensitivity, specificity and accuracy when detecting rare or low-frequency variants. For the WGS study, 56% of the low-frequency variants on the Illumina array had identical MAFs, and 26% had a one-allele difference, between the sequencing and individual genotyping data. The MAF estimates from WGS correlated well (r = 0.94) with those from the Illumina arrays. The MAFs from the pooled WES data also showed high concordance (r = 0.88) with those from the individual genotyping data. In conclusion, MAFs estimated from pooled DNA sequencing data reflect the MAFs in individually genotyped samples well. The pooling strategy can thus be a rapid and cost-effective approach for the initial screening in large-scale association studies. PMID:27633116
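The comparison underlying the reported correlations can be illustrated with a minimal sketch. All read counts, genotypes, and helper names below are invented for illustration; the study's actual pipeline works from GATK/Freebayes variant calls, not this toy calculation.

```python
def maf_from_pool(alt_reads, total_reads):
    """MAF estimate from a DNA pool: fraction of reads carrying the alt allele."""
    return alt_reads / total_reads

def maf_from_genotypes(genotypes):
    """MAF from individual genotypes coded 0/1/2 (count of minor alleles)."""
    return sum(genotypes) / (2 * len(genotypes))

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Three toy variants: (alt reads, total reads) from the pool, and the
# genotypes of the same 10 individuals typed individually.
pooled = [maf_from_pool(a, t) for a, t in [(12, 200), (55, 180), (3, 150)]]
individual = [maf_from_genotypes(g) for g in [
    [0, 0, 0, 1, 0, 0, 0, 0, 0, 0],   # rare variant
    [1, 1, 0, 2, 0, 1, 0, 1, 0, 0],   # low-frequency variant
    [0, 0, 0, 0, 0, 0, 0, 1, 0, 0],   # rare variant
]]
print(f"r = {pearson_r(pooled, individual):.2f}")
```

A high correlation between the two MAF columns is exactly the evidence the study uses to argue that pooled estimates stand in for individual genotyping.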
Large-Angular-Scale Clustering as a Clue to the Source of UHECRs
NASA Astrophysics Data System (ADS)
Berlind, Andreas A.; Farrar, Glennys R.
We explore what can be learned about the sources of UHECRs from their large-angular-scale clustering (referred to as their "bias" by the cosmology community). Exploiting the clustering on large scales has the advantage over small-scale correlations of being insensitive to uncertainties in source direction from magnetic smearing or measurement error. In a Cold Dark Matter cosmology, the amplitude of large-scale clustering depends on the mass of the system, with more massive systems such as galaxy clusters clustering more strongly than less massive systems such as ordinary galaxies or AGN. Therefore, studying the large-scale clustering of UHECRs can help determine a mass scale for their sources, given the assumption that their redshift depth is as expected from the GZK cutoff. We investigate the constraining power of a given UHECR sample as a function of its cutoff energy and number of events. We show that current and future samples should be able to distinguish between the cases of their sources being galaxy clusters, ordinary galaxies, or sources that are uncorrelated with the large-scale structure of the universe.
Floyd A. Johnson
1961-01-01
This report assumes a knowledge of the principles of point sampling as described by Grosenbaugh, Bell and Alexander, and others. Whenever trees are counted at every point in a sample of points (large sample) and measured for volume at a portion (small sample) of these points, the sampling design could be called ratio double sampling. If the large...
Lelental, Natalia; Brandner, Sebastian; Kofanova, Olga; Blennow, Kaj; Zetterberg, Henrik; Andreasson, Ulf; Engelborghs, Sebastiaan; Mroczko, Barbara; Gabryelewicz, Tomasz; Teunissen, Charlotte; Mollenhauer, Brit; Parnetti, Lucilla; Chiasserini, Davide; Molinuevo, Jose Luis; Perret-Liaudet, Armand; Verbeek, Marcel M; Andreasen, Niels; Brosseron, Frederic; Bahl, Justyna M C; Herukka, Sanna-Kaisa; Hausner, Lucrezia; Frölich, Lutz; Labonte, Anne; Poirier, Judes; Miller, Anne-Marie; Zilka, Norbert; Kovacech, Branislav; Urbani, Andrea; Suardi, Silvia; Oliveira, Catarina; Baldeiras, Ines; Dubois, Bruno; Rot, Uros; Lehmann, Sylvain; Skinningsrud, Anders; Betsou, Fay; Wiltfang, Jens; Gkatzima, Olymbia; Winblad, Bengt; Buchfelder, Michael; Kornhuber, Johannes; Lewczuk, Piotr
2016-03-01
Assay-vendor independent quality control (QC) samples for neurochemical dementia diagnostics (NDD) biomarkers are so far commercially unavailable. This requires that NDD laboratories prepare their own QC samples, for example by pooling leftover cerebrospinal fluid (CSF) samples. Our aim was to prepare and test alternative matrices for QC samples that could facilitate intra- and inter-laboratory QC of the NDD biomarkers. Three matrices were validated in this study: (A) human pooled CSF, (B) Aβ peptides spiked into human prediluted plasma, and (C) Aβ peptides spiked into a solution of bovine serum albumin in phosphate-buffered saline. All matrices were also tested after supplementation with an antibacterial agent (sodium azide). We analyzed the short- and long-term stability of the biomarkers with ELISA and chemiluminescence assays (Fujirebio Europe, MSD, IBL International), and performed an inter-laboratory variability study. The NDD biomarkers turned out to be stable in almost all samples stored at the tested conditions for up to 14 days, as well as in samples stored deep-frozen (at -80°C) for up to one year. Sodium azide did not influence biomarker stability. Inter-center variability of the samples sent at room temperature (pooled CSF, freeze-dried CSF, and four artificial matrices) was comparable to the results obtained on deep-frozen samples in other large-scale projects. Our results suggest that it is possible to replace self-made, CSF-based QC samples with large-scale volumes of QC materials prepared with artificial peptides and matrices. This would greatly facilitate intra- and inter-laboratory QC schedules for NDD measurements.
Margalit, Ili; Cohen, Eytan; Goldberg, Elad; Krause, Ilan
2018-07-01
In a recent small sample study, red blood cell distribution width (RDW) was suggested as a predictor of homocysteine levels. The current study aimed to reexamine this association in a large-scale sample. We performed a retrospective cross-sectional study of healthy adults at Rabin Medical Center during 2000-2014. Data were retrieved from the medical charts, and a logistic regression controlling for interfering factors was carried out. Sensitivity analysis was implemented by excluding individuals with anaemia. Five thousand five hundred and fifty-four healthy individuals were included. Mean serum homocysteine level was 10.10 (SD 2.72) μmol/L, and 34.4% of the study population had a homocysteine level higher than the upper limit of normal (10.8 μmol/L). Homocysteine showed no association with RDW (OR 1.00; 95% CI 0.97-1.03), but increased with age (OR 1.05; 95% CI 1.04-1.06) and decreased with a rise in haemoglobin (OR 0.77; 95% CI 0.71-0.83) and in mean corpuscular volume (OR 0.86; 95% CI 0.85-0.88). Exclusion of individuals with anaemia did not reveal an association between homocysteine and RDW, but found a somewhat smaller association between haemoglobin and RDW (OR 0.82; 95% CI 0.73-0.91). In our large-scale sample we did not find an association between RDW and serum homocysteine.
Danis, Ildiko; Scheuring, Noemi; Papp, Eszter; Czinner, Antal
2012-06-01
A new instrument for assessing depressive mood, the first version of the Depression Scale Questionnaire (DS1K), was published in 2008 by Halmai et al. This scale was used in our large sample study, conducted in the framework of the For Healthy Offspring project and involving parents of young children. The original questionnaire was developed in small samples, so our aim was to assist further development of the instrument through psychometric analysis of the data in our large sample (n=1164). The DS1K scale was chosen to measure the parents' mood and mental state in the For Healthy Offspring project. The questionnaire was completed by 1063 mothers and 328 fathers, yielding a sample heterogeneous with respect to age and socio-demographic status. Analyses included the main descriptive statistics, assessment of the scale's internal consistency, and some comparisons. Results were checked in both the original and the multiply imputed datasets. According to our results, the reliability of the scale was much worse than in the original study (Cronbach's alpha: 0.61 versus 0.88). Detailed item analysis made clear that two items contributed to the observed decrease in coherence. We assumed a problem related to misreading in the case of one of these items, an assumption we checked by cross-analysis by assumed reading level. The reliability of the scale increased in both the lower and the higher education level groups when one or both of these problematic items were excluded. However, as the number of items decreased, the relative sensitivity of the scale was also reduced, with fewer persons categorized in the risk group compared to the original scale. As an alternative solution, we suggest that the authors redefine the problematic items and retest the reliability of the instrument in a sample with diverse socio-demographic characteristics.
Sampling from complex networks using distributed learning automata
NASA Astrophysics Data System (ADS)
Rezvanian, Alireza; Rahmati, Mohammad; Meybodi, Mohammad Reza
2014-02-01
A complex network provides a framework for modeling many real-world phenomena in the form of a network. In general, a complex network is a graph of a real-world phenomenon such as a biological network, ecological network, technological network, information network or, in particular, a social network. Recently, many studies have been reported on the characterization of social networks, owing to a growing trend in the analysis of online social networks as dynamic, complex, large-scale graphs. Because real networks are large and access to them is limited, the network model is characterized by sampling an appropriate part of the network. In this paper, a new sampling algorithm based on distributed learning automata is proposed for sampling from complex networks. In the proposed algorithm, a set of distributed learning automata cooperate with each other in order to take appropriate samples from the given network. To investigate the performance of the proposed algorithm, several simulation experiments were conducted on well-known complex networks. Experimental results are compared with several sampling methods in terms of different measures, and demonstrate the superiority of the proposed algorithm over the others.
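As a rough illustration of the idea, here is a minimal sketch: a single linear reward-inaction automaton at each node, rewarded when its chosen neighbor is new to the sample. The paper's distributed learning automata algorithm is more elaborate; the class, parameters, and toy graph below are invented for illustration.

```python
import random

class LearningAutomaton:
    """Linear reward-inaction (L_RI) automaton over a node's neighbors."""
    def __init__(self, actions, lr=0.2):
        self.actions = actions
        self.p = [1 / len(actions)] * len(actions)   # action probabilities
        self.lr = lr

    def choose(self):
        return random.choices(range(len(self.actions)), weights=self.p)[0]

    def reward(self, i):
        # Increase the chosen action's probability; shrink the others so
        # the probability vector still sums to one.
        for j in range(len(self.p)):
            if j == i:
                self.p[j] += self.lr * (1 - self.p[j])
            else:
                self.p[j] *= (1 - self.lr)

def la_sample(adj, start, budget, lr=0.2):
    """Walk the graph, letting each node's automaton pick the next hop;
    moves that discover unsampled nodes are rewarded."""
    automata = {v: LearningAutomaton(adj[v], lr) for v in adj}
    sampled, node = {start}, start
    while len(sampled) < budget:
        la = automata[node]
        i = la.choose()
        nxt = la.actions[i]
        if nxt not in sampled:
            sampled.add(nxt)
            la.reward(i)
        node = nxt
    return sampled

random.seed(1)
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 4], 3: [1, 4], 4: [2, 3]}
print(sorted(la_sample(adj, start=0, budget=4)))
```

The reward update biases future choices toward productive directions, which is the mechanism that lets cooperating automata concentrate sampling effort on informative parts of the graph.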
Evaluating information content of SNPs for sample-tagging in re-sequencing projects.
Hu, Hao; Liu, Xiang; Jin, Wenfei; Hilger Ropers, H; Wienker, Thomas F
2015-05-15
Sample-tagging is designed for identification of accidental sample mix-up, which is a major issue in re-sequencing studies. In this work, we develop a model to measure the information content of SNPs, so that we can optimize a panel of SNPs that approaches the maximal information for discrimination. The analysis shows that as few as 60 optimized SNPs can differentiate the individuals in a population as large as the present world population, and only 30 optimized SNPs are in practice sufficient for labeling up to 100 thousand individuals. In simulated populations of 100 thousand individuals, the average Hamming distance generated by the optimized set of 30 SNPs is larger than 18, and the duality frequency is lower than 1 in 10 thousand. This strategy of sample discrimination proved robust for large sample sizes and across different datasets. The optimized sets of SNPs are designed for whole exome sequencing, and a program is provided for SNP selection, allowing for customized SNP numbers and genes of interest. A sample-tagging plan based on this framework will improve re-sequencing projects in terms of reliability and cost-effectiveness.
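The abstract's figures (30 SNPs, average Hamming distance above 18) can be reproduced in spirit with a small simulation. This is an illustrative sketch, not the authors' program; it assumes maximally informative biallelic SNPs (MAF ~ 0.5) with genotype frequencies 0.25/0.5/0.25 under Hardy-Weinberg equilibrium.

```python
import itertools
import random

def hamming(a, b):
    """Number of positions at which two genotype vectors differ."""
    return sum(x != y for x, y in zip(a, b))

random.seed(42)
n_snps, n_people = 30, 200

def genotype():
    # Genotypes coded 0/1/2 (minor-allele count); frequencies assume
    # MAF = 0.5 under Hardy-Weinberg: 0.25 / 0.5 / 0.25.
    return random.choices([0, 1, 2], weights=[0.25, 0.5, 0.25])[0]

# Each individual's SNP panel acts as a barcode ("tag").
tags = [tuple(genotype() for _ in range(n_snps)) for _ in range(n_people)]

dists = [hamming(a, b) for a, b in itertools.combinations(tags, 2)]
print(f"mean pairwise Hamming distance: {sum(dists) / len(dists):.1f}")
print(f"identical tags: {sum(d == 0 for d in dists)}")
```

Per SNP, two random individuals differ with probability 1 - (0.25² + 0.5² + 0.25²) = 0.625, so the expected distance over 30 SNPs is 18.75, consistent with the "larger than 18" reported; collisions (identical tags) are vanishingly rare.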
Razaq, Aamir; Mihranyan, Albert; Welch, Ken; Nyholm, Leif; Strømme, Maria
2009-01-15
The electrochemically controlled anion absorption properties of a novel large surface area composite paper material composed of polypyrrole (PPy) and cellulose derived from Cladophora sp. algae, synthesized with two oxidizing agents, iron(III) chloride and phosphomolybdic acid (PMo), were analyzed in four different electrolytes containing anions (i.e., chloride, aspartate, glutamate, and p-toluenesulfonate) of varying size.The composites were characterized with scanning and transmission electron microscopy, N2 gas adsorption,and conductivity measurements. The potential-controlled ion exchange properties of the materials were studied by cyclic voltammetry and chronoamperometry at varying potentials. The surface area and conductivity of the iron(III) chloride synthesized sample were 58.8 m2/g and 0.65 S/cm, respectively, while the corresponding values for the PMo synthesized sample were 31.3 m2/g and 0.12 S/cm. The number of absorbed ions per sample mass was found to be larger for the iron(III) chloride synthesized sample than for the PMo synthesized one in all four electrolytes. Although the largest extraction yields were obtained in the presence of the smallest anion (i.e., chloride) for both samples, the relative degree of extraction for the largest ions (i.e., glutamate and p-toluenesulfonate) was higher for the PMo sample. This clearly shows that it is possible to increase the extraction yield of large anions by carrying out the PPy polymerization in the presence of large anions. The results likewise show that high ion exchange capacities, as well as extraction and desorption rates, can be obtained for large anions with high surface area composites coated with relatively thin layers of PPy.
An atlas of L-T transition brown dwarfs with VLT/XShooter
NASA Astrophysics Data System (ADS)
Marocco, F.; Day-Jones, A. C.; Jones, H. R. A.; Pinfield, D. J.
In this contribution we present the first results from a large observing campaign we are carrying out using VLT/XShooter to obtain spectra of a large sample (~250 objects) of L-T transition brown dwarfs. Here we report results based on the first ~120 spectra already obtained. The large sample, and the wide spectral coverage (300-2480 nm) given by XShooter, will allow a new and powerful analysis at an unprecedented level. By fitting the absorption lines of a given element (e.g. Na) at different wavelengths, we can test ultracool atmospheric models and draw for the first time a 3D picture of stellar atmospheres at temperatures down to 1000 K. Determining the atmospheric parameters (e.g. temperature, surface gravity and metallicity) of a large sample of brown dwarfs will allow us to understand the role of these parameters in the formation of their spectra. The large number of objects in our sample will also allow a statistically significant test of birth rate and initial mass function predictions for brown dwarfs. Determining the shape of the initial mass function for very low mass objects is a fundamental task for improving galaxy models, as recent studies (van Dokkum & Conroy 2010) have shown that low-mass objects dominate in massive elliptical galaxies.
Reduction and analysis of VLA maps for 281 radio-loud quasars using the UNLV Cray Y-MP supercomputer
NASA Technical Reports Server (NTRS)
Ding, Ailian; Hintzen, Paul; Weistrop, Donna; Owen, Frazer
1993-01-01
The identification of distorted radio-loud quasars provides a potentially very powerful tool for basic cosmological studies. If large morphological distortions are correlated with membership of the quasars in rich clusters of galaxies, optical observations can be used to identify rich clusters of galaxies at large redshifts. Hintzen, Ulvestad, and Owen (1983, HUO) undertook a VLA A array snapshot survey at 20 cm of 123 radio-loud quasars, and they found that among triple sources in their sample, 17 percent had radio axes which were bent more than 20 deg and 5 percent were bent more than 40 deg. Their subsequent optical observations showed that excess galaxy densities within 30 arcsec of 6 low-redshift distorted quasars were on average 3 times as great as those around undistorted quasars (Hintzen 1984). At least one of the distorted quasars observed, 3C275.1, apparently lies in the first-ranked galaxy at the center of a rich cluster of galaxies (Hintzen and Romanishin, 1986). Although their sample was small, these results indicated that observations of distorted quasars could be used to identify clusters of galaxies at large redshifts. The purpose of this project is to increase the available sample of distorted quasars to allow optical detection of a significant sample of quasar-associated clusters of galaxies at large redshifts.
Comparison of water-quality samples collected by siphon samplers and automatic samplers in Wisconsin
Graczyk, David J.; Robertson, Dale M.; Rose, William J.; Steur, Jeffrey J.
2000-01-01
In small streams, flow and water-quality concentrations often change quickly in response to meteorological events. Hydrologists, field technicians, or locally hired stream observers involved in water-data collection are often unable to reach streams quickly enough to observe or measure these rapid changes. Therefore, in hydrologic studies designed to describe changes in water quality, a combination of manual and automated sampling methods has commonly been used: manual methods when flow is relatively stable, and automated methods when flow is rapidly changing. Automated sampling, which makes use of equipment programmed to collect samples in response to changes in stage and flow of a stream, has been shown to be an effective method for describing rapid changes in water quality (Graczyk and others, 1993). Because of the high cost of automated sampling, however, especially for studies examining a large number of sites, alternative methods have been considered for collecting samples during rapidly changing stream conditions. One such method employs the siphon sampler (fig. 1), also referred to as the "single-stage sampler." Siphon samplers are inexpensive to build (about $25-$50 per sampler), operate, and maintain, so they are cost-effective to use at a large number of sites. Their ability to collect samples representing the average quality of water passing through the entire cross section of a stream, however, has not been fully demonstrated for many types of stream sites.
Gibbs sampling on large lattice with GMRF
NASA Astrophysics Data System (ADS)
Marcotte, Denis; Allard, Denis
2018-02-01
Gibbs sampling is routinely used to sample truncated Gaussian distributions. These distributions naturally occur when associating latent Gaussian fields with category fields obtained by discrete simulation methods such as multipoint, sequential indicator, and object-based simulation. The latent Gaussians are often used in data assimilation and history-matching algorithms. When Gibbs sampling is applied on a large lattice, the computing cost can become prohibitive. The usual practice of using local neighborhoods is unsatisfying, as it can diverge and does not reproduce the desired covariance exactly. A better approach is to use Gaussian Markov random fields (GMRF), which make it possible to compute the conditional distribution at any point without computing and inverting the full covariance matrix. Because the GMRF is locally defined, it allows simultaneous updating of all points that do not share neighbors (coding sets). We propose a new simultaneous Gibbs updating strategy on coding sets that can be computed efficiently by convolution and applied with an acceptance/rejection method in the truncated case. We study empirically the speed of convergence and the effects of the choice of boundary conditions, the correlation range, and GMRF smoothness. We show that convergence is slower in the Gaussian case on the torus than for the finite case studied in the literature. In the truncated Gaussian case, however, we show that short-scale correlation is quickly restored and that the conditioning categories at each lattice point imprint the long-scale correlation. Our approach therefore makes it practical to apply Gibbs sampling on large 2D or 3D lattices with the desired GMRF covariance.
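The coding-set idea can be illustrated in its simplest form: a two-color (checkerboard) decomposition of a first-order lattice GMRF, where sites of one color share no neighbors and so can be Gibbs-updated simultaneously. The sketch below is illustrative only (untruncated Gaussian case, torus boundary, arbitrary parameters), not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64                      # lattice side
phi, sigma = 0.24, 1.0      # neighbor weight (phi < 0.25 keeps the CAR model valid)
x = rng.standard_normal((n, n))

# Checkerboard masks: sites of one color share no neighbors, so their
# full conditionals are mutually independent and can be sampled
# simultaneously (the coding-set idea), vectorized over the lattice.
ii, jj = np.indices((n, n))
black = (ii + jj) % 2 == 0

def neighbor_sum(x):
    # 4-neighbor sum with periodic (torus) boundary conditions
    return (np.roll(x, 1, 0) + np.roll(x, -1, 0) +
            np.roll(x, 1, 1) + np.roll(x, -1, 1))

for sweep in range(200):
    for mask in (black, ~black):
        mean = phi * neighbor_sum(x)          # conditional means at all sites
        prop = mean + sigma * rng.standard_normal((n, n))
        x[mask] = prop[mask]                  # update one coding set at once
```

Each sweep touches every site in two vectorized half-updates instead of n² sequential single-site draws, which is what makes the approach tractable on large lattices.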
NASA Technical Reports Server (NTRS)
Allton, J. H.; Bevill, T. J.
2003-01-01
The strategy of raking rock fragments from the lunar regolith as a means of acquiring representative samples has wide support owing to science return, spacecraft simplicity (reliability), and economy [3, 4, 5]. While there is widespread agreement that raking or sieving the bulk regolith is a good strategy, there is lively discussion about the minimum sample size. Advocates of consortium studies desire fragments large enough to support petrologic and isotopic studies; fragments from 5 to 10 mm are thought adequate [4, 5]. Yet Jolliff et al. [6] demonstrated the use of 2-4 mm fragments as representative of larger rocks. Here we make use of curatorial records and sample catalogs to give a different perspective on minimum sample size for a robotic sample collector.
ERIC Educational Resources Information Center
Trent, Lindsay Rae; Buchanan, Erin; Ebesutani, Chad; Ale, Chelsea M.; Heiden, Laurie; Hight, Terry L.; Damon, John D.; Young, John
2013-01-01
This study examined the psychometric properties of the Revised Child Anxiety and Depression Scale in a large sample of youth from the Southern United States. The authors aimed to determine (a) if the established six-factor Revised Child Anxiety and Depression Scale structure could be replicated in this Southern sample and (b) if scores were…
Emissions of nitrous oxide from biomass burning
NASA Technical Reports Server (NTRS)
Winstead, Edward L.; Cofer, Wesley R., III; Levine, Joel S.
1991-01-01
A study has been conducted comparing N2O results obtained over large prescribed fires or wildfires by 'grab sampling' with storage against N2O measurements made in near-real time. CO2-normalized emission ratios obtained initially from laboratory fires are substantially lower than those obtained over large-scale biomass fires. Combustion may not be the only source of N2O in large fire smoke plumes; physical, chemical, and biochemical processes in the soil may be altered by large biomass fires, leading to large N2O releases.
Generalized Ensemble Sampling of Enzyme Reaction Free Energy Pathways
Wu, Dongsheng; Fajer, Mikolai I.; Cao, Liaoran; Cheng, Xiaolin; Yang, Wei
2016-01-01
Free energy path sampling plays an essential role in the computational understanding of chemical reactions, particularly those occurring in enzymatic environments. Among the variety of molecular dynamics simulation approaches, the generalized ensemble sampling strategy is uniquely attractive because it not only enhances the sampling of rare chemical events but also naturally ensures consistent exploration of environmental degrees of freedom. In this review, we provide a tutorial-like tour of an emerging topic: generalized ensemble sampling of enzyme reaction free energy paths. The discussion is largely focused on our own studies, particularly those based on the metadynamics free energy sampling method and the on-the-path random walk path sampling method. We hope that this mini-review will provide interested practitioners with meaningful guidance for future algorithm formulation and application studies. PMID:27498634
A Computational Approach to Qualitative Analysis in Large Textual Datasets
Evans, Michael S.
2014-01-01
In this paper I introduce computational techniques to extend qualitative analysis into the study of large textual datasets. I demonstrate these techniques by using probabilistic topic modeling to analyze a broad sample of 14,952 documents published in major American newspapers from 1980 through 2012. I show how computational data mining techniques can identify and evaluate the significance of qualitatively distinct subjects of discussion across a wide range of public discourse. I also show how examining large textual datasets with computational methods can overcome methodological limitations of conventional qualitative methods, such as how to measure the impact of particular cases on broader discourse, how to validate substantive inferences from small samples of textual data, and how to determine if identified cases are part of a consistent temporal pattern. PMID:24498398
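As a rough illustration of the kind of probabilistic topic modeling the paper applies, here is a minimal collapsed Gibbs sampler for LDA on a toy corpus. The corpus, hyperparameters, and topic count are placeholders for exposition, not the paper's setup (which analyzed 14,952 newspaper documents):

```python
import numpy as np

# Toy collapsed Gibbs sampler for latent Dirichlet allocation (LDA).
rng = np.random.default_rng(1)

docs = [["cell", "research", "funding", "cell"],
        ["research", "cell", "bill", "funding"],
        ["team", "championship", "victory", "team"],
        ["playoff", "team", "victory", "championship"]]
vocab = sorted({w for d in docs for w in d})
wid = {w: i for i, w in enumerate(vocab)}

K, V, D = 2, len(vocab), len(docs)
alpha, beta = 0.5, 0.1                       # symmetric Dirichlet priors
ndk = np.zeros((D, K)); nkw = np.zeros((K, V)); nk = np.zeros(K)
z = []                                       # topic assignment per token

for d, doc in enumerate(docs):               # random initialization
    zs = []
    for w in doc:
        k = rng.integers(K)
        zs.append(k); ndk[d, k] += 1; nkw[k, wid[w]] += 1; nk[k] += 1
    z.append(zs)

for it in range(200):                        # Gibbs sweeps
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k = z[d][i]                      # remove token's current count
            ndk[d, k] -= 1; nkw[k, wid[w]] -= 1; nk[k] -= 1
            # collapsed conditional: p(k) ∝ (n_dk + α)(n_kw + β)/(n_k + Vβ)
            p = (ndk[d] + alpha) * (nkw[:, wid[w]] + beta) / (nk + V * beta)
            k = rng.choice(K, p=p / p.sum())
            z[d][i] = k
            ndk[d, k] += 1; nkw[k, wid[w]] += 1; nk[k] += 1

theta = (ndk + alpha) / (ndk + alpha).sum(axis=1, keepdims=True)
print(theta.round(2))                        # per-document topic mixtures
```

The per-document topic mixtures (theta) and per-topic word counts (nkw) are the quantities a qualitative analyst would then inspect to identify and read representative documents for each subject of discussion.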
The CAMELS data set: catchment attributes and meteorology for large-sample studies
NASA Astrophysics Data System (ADS)
Addor, Nans; Newman, Andrew J.; Mizukami, Naoki; Clark, Martyn P.
2017-10-01
We present a new data set of attributes for 671 catchments in the contiguous United States (CONUS) minimally impacted by human activities. This complements the daily time series of meteorological forcing and streamflow provided by Newman et al. (2015b). To produce this extension, we synthesized diverse and complementary data sets to describe six main classes of attributes at the catchment scale: topography, climate, streamflow, land cover, soil, and geology. The spatial variations among basins over the CONUS are discussed and compared using a series of maps. The large number of catchments, combined with the diversity of the attributes we extracted, makes this new data set well suited for large-sample studies and comparative hydrology. In comparison to the similar Model Parameter Estimation Experiment (MOPEX) data set, this data set relies on more recent data, it covers a wider range of attributes, and its catchments are more evenly distributed across the CONUS. This study also involves assessments of the limitations of the source data sets used to compute catchment attributes, as well as detailed descriptions of how the attributes were computed. The hydrometeorological time series provided by Newman et al. (2015b, https://doi.org/10.5065/D6MW2F4D) together with the catchment attributes introduced in this paper (https://doi.org/10.5065/D6G73C3Q) constitute the freely available CAMELS data set, which stands for Catchment Attributes and MEteorology for Large-sample Studies.
Health workers cohort study: methods and study design.
Denova-Gutiérrez, Edgar; Flores, Yvonne N; Gallegos-Carrillo, Katia; Ramírez-Palacios, Paula; Rivera-Paredez, Berenice; Muñoz-Aguirre, Paloma; Velázquez-Cruz, Rafael; Torres-Ibarra, Leticia; Meneses-León, Joacim; Méndez-Hernández, Pablo; Hernández-López, Rubí; Salazar-Martínez, Eduardo; Talavera, Juan O; Tamayo, Juan; Castañón, Susana; Osuna-Ramírez, Ignacio; León-Maldonado, Leith; Flores, Mario; Macías, Nayeli; Antúnez, Daniela; Huitrón-Bravo, Gerardo; Salmerón, Jorge
2016-01-01
To examine different health outcomes that are associated with specific lifestyle and genetic factors. From March 2004 to April 2006, a sample of employees from three different health and academic institutions, as well as their family members, were enrolled in the study after providing informed consent. At baseline and follow-up (2010-2013), participants completed a self-administered questionnaire, a physical examination, and provided blood samples. A total of 10 729 participants aged 6 to 94 years were recruited at baseline. Of these, 70% were females, and 50% were from the Mexican Social Security Institute. Nearly 42% of the adults in the sample were overweight, while 20% were obese. Our study can offer new insights into disease mechanisms and prevention through the analysis of risk factor information in a large sample of Mexicans.
A novel computational approach towards the certification of large-scale boson sampling
NASA Astrophysics Data System (ADS)
Huh, Joonsuk
Recent proposals of boson sampling and the corresponding experiments point toward a possible disproof of the extended Church-Turing thesis. Furthermore, the application of boson sampling to molecular computation has been suggested theoretically. To date, however, only small-scale experiments with a few photons have been performed successfully. Boson sampling experiments with 20-30 photons are expected to reveal the computational superiority of the quantum device. A theoretical proposal for large-scale boson sampling using microwave photons is highly promising owing to its deterministic photon sources and scalability. A certification protocol for large-scale boson sampling experiments is therefore needed to complete the story. We propose, in this presentation, a computational protocol toward the certification of large-scale boson sampling. The correlations of paired photon modes and the time-dependent characteristic functional with its Fourier components can show the fingerprint of large-scale boson sampling. This work was supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (NRF-2015R1A6A3A04059773), the ICT R&D program of MSIP/IITP [2015-019, Fundamental Research Toward Secure Quantum Communication] and Mueunjae Institute for Chemistry (MIC) postdoctoral fellowship.
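The difficulty of direct certification comes from the matrix permanent: in the standard Aaronson-Arkhipov formulation, each boson sampling outcome's probability is proportional to the squared modulus of the permanent of a submatrix of the interferometer unitary, and even the best exact algorithms (e.g., Ryser's formula) scale as O(2^n · n). A sketch of Ryser's formula, on an arbitrary example matrix, makes concrete why brute-force verification fails at 20-30 photons:

```python
from itertools import combinations

def permanent(A):
    """Matrix permanent via Ryser's formula:
    perm(A) = (-1)^n * sum over nonempty column subsets S of
              (-1)^{|S|} * prod_i sum_{j in S} a_ij.
    Still exponential in n, which is why computing outcome
    probabilities directly is infeasible beyond a few tens of modes."""
    n = len(A)
    total = 0.0
    for r in range(1, n + 1):
        for cols in combinations(range(n), r):
            prod = 1.0
            for row in A:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** r * prod
    return (-1) ** n * total

print(permanent([[1, 2], [3, 4]]))   # 1*4 + 2*3 = 10
```

Circumventing this cost is exactly what motivates certification via indirect signatures such as the mode-pair correlations and characteristic functional described above.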
An improved camera trap for amphibians, reptiles, small mammals, and large invertebrates
2017-01-01
Camera traps are valuable sampling tools commonly used to inventory and monitor wildlife communities but are challenged to reliably sample small animals. We introduce a novel active camera trap system enabling the reliable and efficient use of wildlife cameras for sampling small animals, particularly reptiles, amphibians, small mammals and large invertebrates. It surpasses the detection ability of commonly used passive infrared (PIR) cameras for this application and eliminates problems such as high rates of false triggers and high variability in detection rates among cameras and study locations. Our system, which employs a HALT trigger, is capable of coupling to digital PIR cameras and is designed for detecting small animals traversing small tunnels, narrow trails, small clearings and along walls or drift fencing. PMID:28981533
Nabi, Hermann; Bochud, Murielle; Glaus, Jennifer; Lasserre, Aurélie M; Waeber, Gérard; Vollenweider, Peter; Preisig, Martin
2013-10-01
Studies on the association between homocysteine levels and depression have shown conflicting results. To examine the association between serum total homocysteine (tHcy) levels and major depressive disorder (MDD) in a large community sample with an extended age range. A total of 3392 men and women aged 35-66 years participating in the CoLaus study and its psychiatric arm (PsyCoLaus) were included in the analyses. High tHcy measured from fasting blood samples was defined as a concentration ≥15μmol/L. MDD was assessed using the semi-structured Diagnostic Interview for Genetics Studies. In multivariate analyses, elevated tHcy levels were associated with greater odds of meeting the diagnostic criteria for lifetime MDD among men (OR=1.71; 95% CI, 1.18-2.50). This was particularly the case for remitted MDD. Among women, there was no significant association between tHcy levels and MDD and the association tended to be in the opposite direction (OR=0.61; 95% CI, 0.34-1.08). In this large population-based study, elevated tHcy concentrations are associated with lifetime MDD and particularly with remitted MDD among men. Copyright © 2013 Elsevier Ltd. All rights reserved.
A large-scale study of the random variability of a coding sequence: a study on the CFTR gene.
Modiano, Guido; Bombieri, Cristina; Ciminelli, Bianca Maria; Belpinati, Francesca; Giorgi, Silvia; Georges, Marie des; Scotet, Virginie; Pompei, Fiorenza; Ciccacci, Cinzia; Guittard, Caroline; Audrézet, Marie Pierre; Begnini, Angela; Toepfer, Michael; Macek, Milan; Ferec, Claude; Claustres, Mireille; Pignatti, Pier Franco
2005-02-01
Coding single nucleotide substitutions (cSNSs) have been studied on hundreds of genes using small samples (n_g ≈ 100-150 genes). In the present investigation, a large random European population sample (average n_g ≈ 1500) was studied for a single gene, the CFTR (Cystic Fibrosis Transmembrane conductance Regulator). The nonsynonymous (NS) substitutions exhibited, in accordance with previous reports, a mean probability of being polymorphic (q > 0.005) much lower than that of the synonymous (S) substitutions, but they showed a similar rate of subpolymorphic (q < 0.005) variability. This indicates that, in autosomal genes that may have harmful recessive alleles (nonduplicated genes with important functions), genetic drift overwhelms selection in the subpolymorphic range of variability, making disadvantageous alleles behave as neutral. These results imply that the majority of the subpolymorphic nonsynonymous alleles of these genes are selectively negative or even pathogenic.
Detecting a Weak Association by Testing its Multiple Perturbations: a Data Mining Approach
NASA Astrophysics Data System (ADS)
Lo, Min-Tzu; Lee, Wen-Chung
2014-05-01
Many risk factors/interventions in epidemiologic/biomedical studies have minuscule effects. To detect such weak associations, one needs a study with a very large sample size (the number of subjects, n). The n of a study can be increased, but unfortunately only to an extent. Here, we propose a novel method which hinges on increasing sample size in a different direction: the total number of variables (p). We construct a p-based "multiple perturbation test", and conduct power calculations and computer simulations to show that it can achieve very high power to detect weak associations when p can be made very large. As a demonstration, we apply the method to a genome-wide association study on age-related macular degeneration and identify two novel genetic variants that are significantly associated with the disease. The p-based method may set the stage for a new paradigm of statistical tests.
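The idea of growing p rather than n can be caricatured in a simulation: derive many perturbed copies of a weak predictor, combine their test statistics, and calibrate the combined statistic by permutation. This sketch is a schematic stand-in for the authors' multiple perturbation test, not a reproduction of it; the effect size, perturbation scheme, and combining rule are all arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)
n, p = 500, 200
x = rng.standard_normal(n)
y = 0.08 * x + rng.standard_normal(n)        # minuscule true effect

# p perturbations of the predictor (here: noise-injected copies)
X = x[:, None] + 0.5 * rng.standard_normal((n, p))

# z-statistic of each perturbed variable's correlation with y
r = (X - X.mean(0)).T @ (y - y.mean()) / (n * X.std(0) * y.std())
z = np.sqrt(n) * r

stat = np.sum(z ** 2)                        # combined statistic over all p
# Permutation null: shuffling y breaks any x-y association while
# preserving the correlation structure among the perturbations.
null = []
for _ in range(200):
    yp = rng.permutation(y)
    rp = (X - X.mean(0)).T @ (yp - yp.mean()) / (n * X.std(0) * yp.std())
    null.append(np.sum(n * rp ** 2))
pval = (1 + sum(s >= stat for s in null)) / 201
print(f"combined p-value: {pval:.3f}")
```

No single perturbed variable carries a detectable signal at this n, but pooling all p of them concentrates the weak association into one statistic, which is the power gain the abstract describes.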
Chemical composition of snow in the northern Sierra Nevada and other areas
Feth, John Henry Frederick; Rogers, S.M.; Roberson, Charles Elmer
1964-01-01
Melting snow provides a large part of the water used throughout the western conterminous United States for agriculture, industry, and domestic supply. It is an active agent in chemical weathering, supplies moisture for forest growth, and sustains fish and wildlife. Despite its importance, virtually nothing has been known of the chemical character of snow in the western mountains until the present study. Analysis of more than 100 samples, most from the northern Sierra Nevada, but some from Utah, Denver, Colo., and scattered points, shows that melted snow is a dilute solution containing measurable amounts of some or all of the inorganic constituents commonly found in natural water. There are significant regional differences in chemical composition; the progressive increase in calcium content with increasing distance eastward from the west slope of the Sierra Nevada is the most pronounced. The chemical character of individual snowfalls is variable. Some show predominant influence of oceanic salt; others show strong effects of mineralization from continental sources, probably largely dust. Silica and boron were found in about half the samples analyzed for these constituents; precipitation is seldom analyzed for these substances. Results of the chemical analyses for major constituents in snow samples are summarized in the following table. The median and mean values for individual constituents are derived from 41-78 samples of Sierra Nevada snow, 6-18 samples of Utah snow, and 6-17 samples of Denver, Colo., snow. The sodium, chloride, and perhaps boron found in snow are probably incorporated in moisture-laden air masses as they move over the Pacific Ocean. Silica, although abundant in the silicate-mineral nuclei found in some snowflakes, may be derived in soluble form largely from dust. Calcium, magnesium, and some bicarbonate are probably added by dust of continental origin.
The sources of the other constituents remain unknown. When snowmelt comes in contact with the lithosphere, the earlier diversity of chemical type largely disappears. The melt water rapidly increases its content of dissolved solids and becomes calcium magnesium bicarbonate in type. Silica, whose concentration increases more than tenfold, shows the largest gain; calcium and bicarbonate contents also increase markedly. Most of the additional mineral matter is from soft and weathered rock; bicarbonate, however, is largely from the soil atmosphere. Investigators, some reporting as much as a century ago, concentrated attention largely on nitrogen compounds and seldom reported other constituents except chloride and sulfate. The Northern European precipitation-sampling network provides the most comprehensive collection of data on precipitation chemistry, but it does not segregate snow from other forms of precipitation. The present study establishes with confidence the chemical character of snow in the Sierra Nevada, and suggests that the dissolved-solids content of precipitation increases with increasing distance inland from the Pacific Coast.
Taylor, Mark J.; Charman, Tony; Robinson, Elise B.; Hayiou-Thomas, Marianna E.; Happé, Francesca; Dale, Philip S.; Ronald, Angelica
2015-01-01
Language difficulties have historically been viewed as integral to autism spectrum conditions (ASC), leading molecular genetic studies to consider whether ASC and language difficulties have overlapping genetic bases. The extent of genetic, and also environmental, overlap between ASC and language is, however, unclear. We hence conducted a twin study of the concurrent association between autistic traits and receptive language abilities. Internet-based language tests were completed by ~3,000 pairs of twins, while autistic traits were assessed via parent ratings. Twin model fitting explored the association between these measures in the full sample, while DeFries-Fulker analysis tested these associations at the extremes of the sample. Phenotypic associations between language ability and autistic traits were modest and negative. The degree of genetic overlap was also negative, indicating that genetic influences on autistic traits lowered language scores in the full sample (mean genetic correlation = −0.13). Genetic overlap was also low at the extremes of the sample (mean genetic correlation = 0.14), indicating that genetic influences on quantitatively defined language difficulties were largely distinct from those on extreme autistic traits. Variation in language ability and autistic traits were also associated with largely different nonshared environmental influences. Language and autistic traits are influenced by largely distinct etiological factors. This has implications for molecular genetic studies of ASC and understanding the etiology of ASC. Additionally, these findings lend support to forthcoming DSM-5 changes to ASC diagnostic criteria that will see language difficulties separated from the core ASC communication symptoms, and instead listed as a clinical specifier. PMID:25088445
Lambertini, Elisabetta; Spencer, Susan K.; Bertz, Phillip D.; Loge, Frank J.; Kieke, Burney A.; Borchardt, Mark A.
2008-01-01
Available filtration methods to concentrate waterborne viruses are either too costly for studies requiring large numbers of samples, limited to small sample volumes, or not very portable for routine field applications. Sodocalcic glass wool filtration is a cost-effective and easy-to-use method to retain viruses, but its efficiency and reliability are not adequately understood. This study evaluated glass wool filter performance to concentrate the four viruses on the U.S. Environmental Protection Agency contaminant candidate list, i.e., coxsackievirus, echovirus, norovirus, and adenovirus, as well as poliovirus. Total virus numbers recovered were measured by quantitative reverse transcription-PCR (qRT-PCR); infectious polioviruses were quantified by integrated cell culture (ICC)-qRT-PCR. Recovery efficiencies averaged 70% for poliovirus, 14% for coxsackievirus B5, 19% for echovirus 18, 21% for adenovirus 41, and 29% for norovirus. Virus strain and water matrix affected recovery, with significant interaction between the two variables. Optimal recovery was obtained at pH 6.5. No evidence was found that water volume, filtration rate, and number of viruses seeded influenced recovery. The method was successful in detecting indigenous viruses in municipal wells in Wisconsin. Long-term continuous filtration retained viruses sufficiently for their detection for up to 16 days after seeding for qRT-PCR and up to 30 days for ICC-qRT-PCR. Glass wool filtration is suitable for large-volume samples (1,000 liters) collected at high filtration rates (4 liters min−1), and its low cost makes it advantageous for studies requiring large numbers of samples. PMID:18359827
Jenkins, Jill A.; Jeske, Clinton W.; Allain, Larry K.
2011-01-01
The implementation of freshwater diversions in large-scale coastal restoration schemes presents several scientific and management considerations. Large-scale environmental restructuring necessitates aquatic biomonitoring, and during such field studies, photographs that document animals and habitat may be captured. Among the biomonitoring studies performed in conjunction with the Davis Pond freshwater diversion structure south of New Orleans, Louisiana, only postdiversion study images are readily available, and these are presented here.
Zhu, Shanyou; Zhang, Hailong; Liu, Ronggao; Cao, Yun; Zhang, Guixin
2014-01-01
Sampling designs are commonly used to estimate deforestation over large areas, but comparisons between different sampling strategies are required. Using PRODES deforestation data as a reference, deforestation in the state of Mato Grosso in Brazil from 2005 to 2006 is evaluated using Landsat imagery and a nearly synchronous MODIS dataset. The MODIS-derived deforestation is used to assist in sampling and extrapolation. Three sampling designs are compared according to the estimated deforestation of the entire study area based on simple extrapolation and linear regression models. The results show that stratified sampling for strata construction and sample allocation using the MODIS-derived deforestation hotspots provided more precise estimations than simple random and systematic sampling. Moreover, the relationship between the MODIS-derived and TM-derived deforestation provides a precise estimate of the total deforestation area as well as the distribution of deforestation in each block.
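The core comparison can be caricatured in a few lines: stratifying on a hotspot indicator (standing in here for the MODIS-derived deforestation hotspots) reduces estimator variance relative to simple random sampling whenever the strata means differ sharply. All values below are synthetic, not the Mato Grosso data:

```python
import numpy as np

rng = np.random.default_rng(3)
n_blocks = 1000
hotspot = rng.random(n_blocks) < 0.2          # ~20% of blocks flagged as hotspots
defor = np.where(hotspot,
                 rng.gamma(4.0, 5.0, n_blocks),   # heavy clearing in hotspots
                 rng.gamma(0.5, 1.0, n_blocks))   # little clearing elsewhere
true_total = defor.sum()

def srs_estimate(sample_size):
    # Simple random sampling: expand the sample mean to the population.
    idx = rng.choice(n_blocks, sample_size, replace=False)
    return n_blocks * defor[idx].mean()

def stratified_estimate(sample_size):
    # Proportional allocation across the hotspot / non-hotspot strata.
    est = 0.0
    for mask in (hotspot, ~hotspot):
        Nh = mask.sum()
        nh = max(2, round(sample_size * Nh / n_blocks))
        idx = rng.choice(np.flatnonzero(mask), nh, replace=False)
        est += Nh * defor[idx].mean()
    return est

srs = [srs_estimate(100) for _ in range(500)]
strat = [stratified_estimate(100) for _ in range(500)]
print(f"true {true_total:.0f}  SRS sd {np.std(srs):.0f}  "
      f"stratified sd {np.std(strat):.0f}")
```

Stratification removes the between-strata component of the sampling variance, which is why hotspot-based strata outperform simple random and systematic designs when deforestation is spatially clustered.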
NASA Technical Reports Server (NTRS)
Zeigler, Ryan A.
2014-01-01
An integral part of any sample return mission is the initial description and classification of returned samples by the preliminary examination team (PET). The goal of a PET is to characterize and classify the returned samples, making this information available to the general research community who can then conduct more in-depth studies on the samples. A PET strives to minimize the impact their work has on the sample suite, which often limits the PET work to largely visual measurements and observations like optical microscopy. More modern techniques can also be utilized by future PET to nondestructively characterize astromaterials in a more rigorous way. Here we present our recent analyses of Apollo samples 14321 and 14305 by micro-CT and micro-XRF (respectively), assess the potential for discovery of "new" Apollo samples for scientific study, and evaluate the usefulness of these techniques in future PET efforts.
Magnotti, John F; Basu Mallick, Debshila; Feng, Guo; Zhou, Bin; Zhou, Wen; Beauchamp, Michael S
2015-09-01
Humans combine visual information from mouth movements with auditory information from the voice to recognize speech. A common method for assessing multisensory speech perception is the McGurk effect: When presented with particular pairings of incongruent auditory and visual speech syllables (e.g., the auditory speech sounds for "ba" dubbed onto the visual mouth movements for "ga"), individuals perceive a third syllable, distinct from the auditory and visual components. Chinese and American cultures differ in the prevalence of direct facial gaze and in the auditory structure of their languages, raising the possibility of cultural- and language-related group differences in the McGurk effect. There is no consensus in the literature about the existence of these group differences, with some studies reporting less McGurk effect in native Mandarin Chinese speakers than in English speakers and others reporting no difference. However, these studies sampled small numbers of participants tested with a small number of stimuli. Therefore, we collected data on the McGurk effect from large samples of Mandarin-speaking individuals from China and English-speaking individuals from the USA (total n = 307) viewing nine different stimuli. Averaged across participants and stimuli, we found similar frequencies of the McGurk effect between Chinese and American participants (48 vs. 44 %). In both groups, we observed a large range of frequencies both across participants (range from 0 to 100 %) and stimuli (15 to 83 %) with the main effect of culture and language accounting for only 0.3 % of the variance in the data. High individual variability in perception of the McGurk effect necessitates the use of large sample sizes to accurately estimate group differences.
Meade, R.H.; Stevens, H.H.
1990-01-01
A Lagrangian strategy for sampling large rivers, which was developed and tested in the Orinoco and Amazon Rivers of South America during the early 1980s, is now being applied to the study of toxic chemicals in the Mississippi River. A series of 15-20 cross-sections of the Mississippi mainstem and its principal tributaries is sampled by boat in downstream sequence, beginning upriver of St. Louis and concluding downriver of New Orleans 3 weeks later. The timing of the downstream sampling sequence approximates the travel time of the river water. Samples at each cross-section are discharge-weighted to provide concentrations of dissolved and suspended constituents that are converted to fluxes. Water-sediment mixtures are collected from 10-40 equally spaced points across the river width by sequential depth integration at a uniform vertical transit rate. Essential equipment includes (i) a hydraulic winch, for sensitive control of vertical transit rates, and (ii) a collapsible-bag sampler, which allows integrated samples to be collected at all depths in the river. A section is usually sampled in 4-8 h, for a total sample recovery of 100-120 L. Sampled concentrations of suspended silt and clay are reproducible within 3%.
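Discharge-weighting as described amounts to weighting each vertical's concentration by the local discharge before converting to a flux. A toy computation with hypothetical numbers (four verticals instead of the 10-40 used in practice):

```python
# Discharge-weighted cross-section concentration and constituent flux.
# The verticals below are hypothetical, purely to show the arithmetic.
verticals = [
    # (local discharge, m3/s ; measured concentration, mg/L)
    (120.0, 410.0),
    (340.0, 455.0),
    (310.0, 430.0),
    (150.0, 395.0),
]

q_total = sum(q for q, _ in verticals)
c_weighted = sum(q * c for q, c in verticals) / q_total   # mg/L
# mg/L * m3/s = g/s, so divide by 1000 for kg/s
flux = q_total * c_weighted / 1000.0

print(f"discharge-weighted concentration: {c_weighted:.1f} mg/L")
print(f"flux: {flux:.2f} kg/s")
```

Weighting by discharge rather than averaging the verticals directly ensures that fast-flowing portions of the cross section, which carry most of the load, dominate the estimate, so the concentration converts cleanly to a flux.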
John B. Bradford; Peter Weishampel; Marie-Louise Smith; Randall Kolka; Richard A. Birdsey; Scott V. Ollinger; Michael G. Ryan
2010-01-01
Assessing forest carbon storage and cycling over large areas is a growing challenge that is complicated by the inherent heterogeneity of forest systems. Field measurements must be conducted and analyzed appropriately to generate precise estimates at scales large enough for mapping or comparison with remote sensing data. In this study we examined...
USDA-ARS?s Scientific Manuscript database
The objective of this study was to characterize Salmonella contamination on carcasses in two large U.S. commercial pork processing plants. Carcasses were sampled before scalding, after dehairing/polishing but before evisceration, and after chilling on two days in each of the four seasons. The prev...
ERIC Educational Resources Information Center
Ellett, Chad D.; Monsaas, Judy; Martin-Hansen, Lisa; Demir, Abdulkadir
2012-01-01
This study reports on the continued large-sample validation of the Inventory for Teaching and Learning (ITAL), a new teacher perception measure of "reformed (inquiry- and standards-based) and traditional teaching and learning" developed for use in science and mathematics classrooms. The continued validation of the ITAL used large samples…
Turkish Version of Students' Ideas about Nature of Science Questionnaire: A Validation Study
ERIC Educational Resources Information Center
Cansiz, Mustafa; Cansiz, Nurcan; Tas, Yasemin; Yerdelen, Sundus
2017-01-01
Mass assessment of large samples' nature of science views has been one of the core concerns in science education research. Due to the impracticality of using open-ended questionnaires or conducting interviews with large groups, another line of research has been required for meaningful mass assessment of pupils' nature of science conceptions.…
Sampling and handling artifacts can bias filter-based measurements of particulate organic carbon (OC). Several measurement-based methods for OC artifact reduction and/or estimation are currently used in research-grade field studies. OC frequently is not artifact-corrected in larg...
ERIC Educational Resources Information Center
Yuksel-Kaptanoglu, Ilknur; Turkyilmaz, Ahmet Sinan; Heise, Lori
2012-01-01
A large, nationally representative, cross-sectional survey was conducted in Turkey in 2008. In this survey, which used the WHO (World Health Organization) study module on violence, information about lifetime and current violence (past 12 months) was obtained using weighted, stratified, and multistage cluster sampling. This article describes…
Kandel, Saugat; Salomon-Ferrer, Romelia; Larsen, Adrien B; Jain, Abhinandan; Vaidehi, Nagarajan
2016-01-28
The Internal Coordinate Molecular Dynamics (ICMD) method is an attractive molecular dynamics (MD) method for studying the dynamics of bonded systems such as proteins and polymers. It offers a simple venue for coarsening the dynamics model of a system at multiple hierarchical levels. For example, large scale protein dynamics can be studied using torsional dynamics, where large domains or helical structures can be treated as rigid bodies and the loops connecting them as flexible torsions. ICMD with such a dynamic model of the protein, combined with enhanced conformational sampling methods such as temperature replica exchange, allows the sampling of large scale domain motion involving high energy barrier transitions. Once these large scale conformational transitions are sampled, all-torsion, or even all-atom, MD simulations can be carried out for the low energy conformations sampled via coarse grained ICMD to calculate the energetics of distinct conformations. Such hierarchical MD simulations can be carried out with standard all-atom forcefields without the need for compromising on the accuracy of the forces. Using constraints to treat bond lengths and bond angles as rigid can, however, distort the potential energy landscape of the system and reduce the number of dihedral transitions as well as conformational sampling. We present here a two-part solution to overcome such distortions of the potential energy landscape with ICMD models. To alleviate the intrinsic distortion that stems from the reduced phase space in torsional MD, we use the Fixman compensating potential. To additionally alleviate the extrinsic distortion that arises from the coupling between the dihedral angles and bond angles within a force field, we propose a hybrid ICMD method that allows the selective relaxing of bond angles. This hybrid ICMD method bridges the gap between all-atom MD and torsional MD.
We demonstrate with examples that these methods together offer a solution to eliminate the potential energy distortions encountered in constrained ICMD simulations of peptide molecules.
Wilsmore, Bradley R.; Grunstein, Ronald R.; Fransen, Marlene; Woodward, Mark; Norton, Robyn; Ameratunga, Shanthi
2013-01-01
Study Objectives: To determine the relationship between sleep complaints, primary insomnia, excessive daytime sleepiness, and lifestyle factors in a large community-based sample. Design: Cross-sectional study. Setting: Blood donor sites in New Zealand. Patients or Participants: 22,389 individuals aged 16-84 years volunteering to donate blood. Interventions: N/A. Measurements: A comprehensive self-administered questionnaire including personal demographics and validated questions assessing sleep disorders (snoring, apnea), sleep complaints (sleep quantity, sleep dissatisfaction), insomnia symptoms, excessive daytime sleepiness, mood, and lifestyle factors such as work patterns, smoking, alcohol, and illicit substance use. Additionally, direct measurements of height and weight were obtained. Results: One in three participants report less than 7-8 h of sleep on 5 or more nights per week, and 60% would like more sleep. Almost half the participants (45%) report suffering the symptoms of insomnia at least once per week, with one in five meeting more stringent criteria for primary insomnia. Excessive daytime sleepiness (evident in 9% of this large, predominantly healthy sample) was associated with insomnia (odds ratio [OR] 1.75, 95% confidence interval [CI] 1.50 to 2.05), depression (OR 2.01, CI 1.74 to 2.32), and sleep-disordered breathing (OR 1.92, CI 1.59 to 2.32). Long work hours, alcohol dependence, and rotating work shifts also increase the risk of daytime sleepiness. Conclusions: Even in this relatively young, healthy, non-clinical sample, sleep complaints and primary insomnia with subsequent excess daytime sleepiness were common. There were clear associations between many personal and lifestyle factors—such as depression, long work hours, alcohol dependence, and rotating shift work—and sleep problems or excessive daytime sleepiness. Citation: Wilsmore BR; Grunstein RR; Fransen M; Woodward M; Norton R; Ameratunga S.
Sleep habits, insomnia, and daytime sleepiness in a large and healthy community-based sample of New Zealanders. J Clin Sleep Med 2013;9(6):559-566. PMID:23772189
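The odds ratios quoted above follow standard epidemiological practice; for a single 2x2 cross-tabulation, an odds ratio with its Wald 95% confidence interval can be sketched as follows (the counts are invented for illustration and are not the study's data):

```python
from math import log, exp, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
       a = exposed cases,   b = exposed non-cases,
       c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = sqrt(1/a + 1/b + 1/c + 1/d)          # SE of log(OR)
    lo, hi = exp(log(or_) - z * se), exp(log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: daytime sleepiness (cases) by insomnia (exposure)
or_, lo, hi = odds_ratio_ci(300, 3700, 200, 4300)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

In practice the study's ORs would come from a multivariable logistic regression rather than a raw 2x2 table; the sketch shows only the basic quantity being reported.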
USDA-ARS's Scientific Manuscript database
Naturally-occurring inhibitory compounds are a major concern during qPCR and RT-qPCR analysis of environmental samples, particularly large volume water samples. Here, a standardized method for measuring and mitigating sample inhibition in environmental water concentrates is described. Specifically, ...
Factor Structure and Correlates of the Dissociative Experiences Scale in a Large Offender Sample
ERIC Educational Resources Information Center
Ruiz, Mark A.; Poythress, Norman G.; Lilienfeld, Scott O.; Douglas, Kevin S.
2008-01-01
The authors examined the psychometric properties, factor structure, and construct validity of the Dissociative Experiences Scale (DES) in a large offender sample (N = 1,515). Although the DES is widely used with community and clinical samples, minimal work has examined offender samples. Participants were administered self-report and interview…
Roos, Johannes Lodewikus; Pretorius, Herman Walter; Karayiorgou, Maria
2009-01-01
The clinical characteristics of an Afrikaner founder population sample recruited for a schizophrenia genetic study are described. Comparisons on several clinical characteristics between this sample and a U.S. sample of schizophrenia patients show that generalization of findings in a founder population to the population at large is applicable. The frequency of the 22q11 deletion in Afrikaner schizophrenia patients is approximately 2%, similar to findings in a U.S. sample. Results of analysis of early non-psychotic deviant behavior in subjects under the age of 10 years in the Afrikaner population broadly replicated findings in a U.S. sample. Approximately half of male schizophrenia patients and a quarter of female patients in the Afrikaner schizophrenia database used or abused cannabis. Male users of cannabis with severe early deviant behavior had the lowest mean age of criteria onset, namely 18.4 years. These results confirm previous findings that early deviance is linked to later disease outcome. The clinical characteristics and premorbid variables in 12 childhood-onset Afrikaner schizophrenia patients thus far recruited in this study compare favorably with what is known about childhood-onset schizophrenia in a U.S. sample. The prevalence of co-morbid OCD/OCS in this Afrikaner schizophrenia founder sample was 13.2%, which is in keeping with that of co-morbid OCD in schizophrenia, estimated at 12.2% by the U.S. National Institute of Mental Health. These findings confirm that the clinical characteristics of a schizophrenia sample drawn from the Afrikaner founder population can be generalized to the schizophrenia population at large when compared to findings reported in the literature.
Large-scale human skin lipidomics by quantitative, high-throughput shotgun mass spectrometry.
Sadowski, Tomasz; Klose, Christian; Gerl, Mathias J; Wójcik-Maciejewicz, Anna; Herzog, Ronny; Simons, Kai; Reich, Adam; Surma, Michal A
2017-03-07
The lipid composition of human skin is essential for its function; however, the simultaneous quantification of a wide range of stratum corneum (SC) and sebaceous lipids is not trivial. We developed and validated a quantitative high-throughput shotgun mass spectrometry-based platform for lipid analysis of tape-stripped SC skin samples. It features coverage of 16 lipid classes, total quantification to the level of individual lipid molecules, high reproducibility, and high-throughput capability. With this method we conducted a large lipidomic survey of 268 human SC samples, in which we investigated the relationship between sampling depth and lipid composition, examined lipidome variability in samples from 14 different sampling sites on the human body, and finally assessed the impact of age and sex on lipidome variability in 104 healthy subjects. We found sebaceous lipids to constitute an abundant component of the SC lipidome, as they diffuse into the topmost SC layers forming a gradient. Lipidomic variability with respect to sampling depth, site, and subject is considerable, and is mainly attributable to sebaceous lipids, while stratum corneum lipids vary less. This stresses the importance of sampling design and the role of sebaceous lipids in skin studies.
Eichelsheim, Veroni I; Buist, Kirsten L; Deković, Maja; Wissink, Inge B; Frijns, Tom; van Lier, Pol A C; Koot, Hans M; Meeus, Wim H J
2010-03-01
The aim of the present study is to examine whether the patterns of association between the quality of the parent-adolescent relationship on the one hand, and aggression and delinquency on the other, are the same for boys and girls of Dutch and Moroccan origin living in the Netherlands. Since inconsistent results have been found previously, the present study tests the replicability of the model of associations in two different Dutch samples of adolescents. Study 1 included 288 adolescents (M age = 14.9, range 12-17 years) all attending lower secondary education. Study 2 included 306 adolescents (M age = 13.2, range = 12-15 years) who were part of a larger community sample with oversampling of at-risk adolescents. Multigroup structural analyses showed no ethnic or gender differences, in either Study 1 or Study 2, in the patterns of associations between support, autonomy, disclosure, and negativity in the parent-adolescent relationship and aggression and delinquency. The patterns were largely similar across the two studies. In both studies, negative quality of the relationship in particular was strongly related to both aggression and delinquency. These results indicate that the family processes that affect adolescent development show a large degree of universality across gender and ethnicity.
Back to Africa: Tracing Dyslexia Genes in East Africa
ERIC Educational Resources Information Center
Grigorenko, Elena L.; Naples, Adam; Chang, Joseph; Romano, Christina; Ngorosho, Damaris; Kungulilo, Selemani; Jukes, Matthew; Bundy, Donald
2007-01-01
A sample of Swahili-speaking probands with reading difficulties was identified from a large representative sample of 1,500 school children in the rural areas of Tanzania. Families of these probands (n = 88) were invited to participate in the study. The proband and his/her siblings received a battery of reading-related tasks and performance on…
The Role of Temperament and Personality in Problem Behaviors of Children with ADHD
ERIC Educational Resources Information Center
De Pauw, Sarah S. W.; Mervielde, Ivan
2011-01-01
This study describes temperament, personality, and problem behaviors in children with Attention-Deficit Hyperactivity Disorder (ADHD) aged 6 to 14 years. It targets differences between an ADHD sample (N=54; 43 boys) and a large community sample (N=465; 393 boys) in means and variances, psychometric properties, and covariation between traits and…
Widespread White Matter Differences in Children and Adolescents with Autism Spectrum Disorder
ERIC Educational Resources Information Center
Vogan, V. M.; Morgan, B. R.; Leung, R. C.; Anagnostou, E.; Doyle-Thomas, K.; Taylor, M. J.
2016-01-01
Diffusion tensor imaging studies show white matter (WM) abnormalities in children with autism spectrum disorder (ASD). However, investigations are often limited by small samples, particularly problematic given the heterogeneity of ASD. We explored WM using DTI in a large sample of 130 children and adolescents (7-15 years) with and without ASD,…
Sexual Abuse and Suicidality: Gender Differences in a Large Community Sample of Adolescents
ERIC Educational Resources Information Center
Martin, Graham; Bergen, Helen A.; Richardson, Angela S.; Roeger, Leigh; Allison, Stephen
2004-01-01
Objective: A cross-sectional study of gender specific relationships between self-reported child sexual abuse and suicidality in a community sample of adolescents. Method: Students aged 14 years on average (N=2,485) from 27 schools in South Australia completed a questionnaire including items on sexual abuse and suicidality, and measures of…
Planning Community-Based Assessments of HIV Educational Intervention Programs in Sub-Saharan Africa
ERIC Educational Resources Information Center
Kelcey, Ben; Shen, Zuchao
2017-01-01
A key consideration in planning studies of community-based HIV education programs is identifying a sample size large enough to ensure a reasonable probability of detecting program effects if they exist. Sufficient sample sizes for community- or group-based designs are proportional to the correlation or similarity of individuals within communities.…
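The proportionality noted above is usually captured by the design effect 1 + (m - 1)ρ, where m is the number of individuals sampled per community and ρ is the intraclass correlation. A rough planning sketch using the standard normal-approximation sample-size formula (all input values below are hypothetical):

```python
import math
from statistics import NormalDist

def individuals_per_arm(d, icc, m, alpha=0.05, power=0.8):
    """Approximate individuals per arm for a two-arm comparison of means
    in a community- (cluster-) randomized design: the usual normal-
    approximation sample size, inflated by the design effect
    1 + (m - 1)*icc. Illustrative planning sketch only."""
    z = NormalDist()
    za, zb = z.inv_cdf(1 - alpha / 2), z.inv_cdf(power)
    n_srs = 2 * (za + zb) ** 2 / d ** 2   # per-arm n under simple random sampling
    return n_srs * (1 + (m - 1) * icc)    # inflate by the design effect

# Hypothetical inputs: standardized effect d = 0.3, within-community
# correlation 0.05, m = 20 individuals sampled per community.
n = individuals_per_arm(d=0.3, icc=0.05, m=20)
clusters = math.ceil(n / 20)  # communities needed per arm
print(round(n), clusters)     # -> 340 18
```

Even a modest within-community correlation nearly doubles the required sample here, which is the point the abstract is making about planning such evaluations.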
Characterization of the Theta to Beta Ratio in ADHD: Identifying Potential Sources of Heterogeneity
ERIC Educational Resources Information Center
Loo, Sandra K.; Cho, Alexander; Hale, T. Sigi; McGough, James; McCracken, James; Smalley, Susan L.
2013-01-01
Objective: The goal of this study is to characterize the theta to beta ratio (THBR) obtained from electroencephalogram (EEG) measures, in a large sample of community and clinical participants with regard to (a) ADHD diagnosis and subtypes, (b) common psychiatric comorbidities, and (c) cognitive correlates. Method: The sample includes 871…
How Big Is Big Enough? Sample Size Requirements for CAST Item Parameter Estimation
ERIC Educational Resources Information Center
Chuah, Siang Chee; Drasgow, Fritz; Luecht, Richard
2006-01-01
Adaptive tests offer the advantages of reduced test length and increased accuracy in ability estimation. However, adaptive tests require large pools of precalibrated items. This study looks at the development of an item pool for 1 type of adaptive administration: the computer-adaptive sequential test. An important issue is the sample size required…
Intellectual Abilities in a Large Sample of Children with Velo-Cardio-Facial Syndrome: An Update
ERIC Educational Resources Information Center
De Smedt, Bert; Devriendt, K.; Fryns, J. -P.; Vogels, A.; Gewillig, M.; Swillen, A.
2007-01-01
Background: Learning disabilities are one of the most consistently reported features in Velo-Cardio-Facial Syndrome (VCFS). Earlier reports on IQ in children with VCFS were, however, limited by small sample sizes and ascertainment biases. The aim of the present study was therefore to replicate these earlier findings and to investigate intellectual…
ERIC Educational Resources Information Center
Scheiber, Caroline; Reynolds, Matthew R.; Hajovsky, Daniel B.; Kaufman, Alan S.
2015-01-01
The purpose of this study was to investigate developmental gender differences in academic achievement areas, with the primary focus on writing, using the child and adolescent portion (ages 6-21 years) of the "Kaufman Test of Educational Achievement-Second Edition, Brief Form," norming sample (N = 1,574). Path analytic models with gender,…
Detecting Superior Face Recognition Skills in a Large Sample of Young British Adults
Bobak, Anna K.; Pampoulov, Philip; Bate, Sarah
2016-01-01
The Cambridge Face Memory Test Long Form (CFMT+) and Cambridge Face Perception Test (CFPT) are typically used to assess the face processing ability of individuals who believe they have superior face recognition skills. Previous large-scale studies have presented norms for the CFPT but not the CFMT+. However, previous research has also highlighted the necessity of establishing country-specific norms for these tests, indicating that norming data are required for both tests using young British adults. The current study addressed this issue in 254 British participants. In addition to providing the first norm for performance on the CFMT+ in any large sample, we also report the first UK-specific cut-off for superior face recognition on the CFPT. Further analyses identified a small advantage for females on both tests, and only small associations between objective face recognition skills and self-report measures. A secondary aim of the study was to examine the relationship between trait or social anxiety and face processing ability, and no associations were noted. The implications of these findings for the classification of super-recognizers are discussed. PMID:27713706
Cox, Nick L J; Cami, Jan; Farhang, Amin; Smoker, Jonathan; Monreal-Ibero, Ana; Lallement, Rosine; Sarre, Peter J; Marshall, Charlotte C M; Smith, Keith T; Evans, Christopher J; Royer, Pierre; Linnartz, Harold; Cordiner, Martin A; Joblin, Christine; van Loon, Jacco Th; Foing, Bernard H; Bhatt, Neil H; Bron, Emeric; Elyajouri, Meriem; de Koter, Alex; Ehrenfreund, Pascale; Javadi, Atefeh; Kaper, Lex; Khosroshadi, Habib G; Laverick, Mike; Le Petit, Franck; Mulas, Giacomo; Roueff, Evelyne; Salama, Farid; Spaans, Marco
2017-10-01
The carriers of the diffuse interstellar bands (DIBs) are largely unidentified molecules ubiquitously present in the interstellar medium (ISM). After decades of study, two strong and possibly three weak near-infrared DIBs have recently been attributed to the C60+ fullerene based on observational and laboratory measurements. There is great promise for the identification of the over 400 other known DIBs, as this result could provide chemical hints towards other possible carriers. In an effort to systematically study the properties of the DIB carriers, we have initiated a new large-scale observational survey: the ESO Diffuse Interstellar Bands Large Exploration Survey (EDIBLES). The main objective is to build on and extend existing DIB surveys to make a major step forward in characterising the physical and chemical conditions for a statistically significant sample of interstellar lines-of-sight, with the goal to reverse-engineer key molecular properties of the DIB carriers. EDIBLES is a filler Large Programme using the Ultraviolet and Visual Echelle Spectrograph at the Very Large Telescope at Paranal, Chile. It is designed to provide an observationally unbiased view of the presence and behaviour of the DIBs towards early-spectral-type stars whose lines-of-sight probe the diffuse-to-translucent ISM. Such a complete dataset will provide a deep census of the atomic and molecular content, physical conditions, chemical abundances and elemental depletion levels for each sightline. Achieving these goals requires a homogeneous set of high-quality data in terms of resolution (R ~ 70 000 - 100 000), sensitivity (S/N up to 1000 per resolution element), and spectral coverage (305-1042 nm), as well as a large sample size (100+ sightlines). In this first paper the goals, objectives and methodology of the EDIBLES programme are described and an initial assessment of the data is provided.
Sample size requirements for the design of reliability studies: precision consideration.
Shieh, Gwowen
2014-09-01
In multilevel modeling, the intraclass correlation coefficient based on the one-way random-effects model is routinely employed to measure the reliability or degree of resemblance among group members. To facilitate the advocated practice of reporting confidence intervals in future reliability studies, this article presents exact sample size procedures for precise interval estimation of the intraclass correlation coefficient under various allocation and cost structures. Although the suggested approaches do not admit explicit sample size formulas and require special algorithms for carrying out iterative computations, they are more accurate than the closed-form formulas constructed from large-sample approximations with respect to the expected width and assurance probability criteria. This investigation notes the deficiency of existing methods and expands sample size methodology for the design of reliability studies in ways not previously discussed in the literature.
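For context, the exact interval such procedures build on is the classical F-based confidence interval for the one-way random-effects ICC. The sketch below computes that interval from an observed F ratio, taking the two F quantiles as inputs so it stays dependency-free; the numerical values are illustrative, not from the article.

```python
def icc_confidence_interval(f_obs, k, f_crit_low, f_crit_up):
    """Exact CI for the one-way random-effects ICC.
    f_obs      : observed F = MSB/MSW from the one-way ANOVA
    k          : number of ratings per group (group size)
    f_crit_low : F quantile F_{1-alpha/2; n-1, n(k-1)}
    f_crit_up  : F quantile F_{1-alpha/2; n(k-1), n-1}
    The quantiles are supplied by the caller (from tables or a stats
    library) to keep this sketch self-contained."""
    fl = f_obs / f_crit_low
    fu = f_obs * f_crit_up
    return (fl - 1) / (fl + k - 1), (fu - 1) / (fu + k - 1)

# Hypothetical design: n = 30 groups of k = 4, observed F = 5.0,
# with illustrative 97.5% F quantiles looked up externally.
lo, hi = icc_confidence_interval(5.0, 4, f_crit_low=1.94, f_crit_up=2.08)
print(round(lo, 3), round(hi, 3))
```

Sample-size planning of the kind the article describes then amounts to searching over (n, k) until the expected width of this interval falls below a target.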
Waks, Zeev; Weissbrod, Omer; Carmeli, Boaz; Norel, Raquel; Utro, Filippo; Goldschmidt, Yaara
2016-12-23
Compiling a comprehensive list of cancer driver genes is imperative for oncology diagnostics and drug development. While driver genes are typically discovered by analysis of tumor genomes, infrequently mutated driver genes often evade detection due to limited sample sizes. Here, we address sample size limitations by integrating tumor genomics data with a wide spectrum of gene-specific properties to search for rare drivers, functionally classify them, and detect features characteristic of driver genes. We show that our approach, CAnceR geNe similarity-based Annotator and Finder (CARNAF), enables detection of potentially novel drivers that eluded over a dozen pan-cancer/multi-tumor type studies. In particular, feature analysis reveals a highly concentrated pool of known and putative tumor suppressors among the <1% of genes that encode very large, chromatin-regulating proteins. Thus, our study highlights the need for deeper characterization of very large, epigenetic regulators in the context of cancer causality.
State-space reduction and equivalence class sampling for a molecular self-assembly model.
Packwood, Daniel M; Han, Patrick; Hitosugi, Taro
2016-07-01
Direct simulation of a model with a large state space will generate enormous volumes of data, much of which is not relevant to the questions under study. In this paper, we consider a molecular self-assembly model as a typical example of a large state-space model, and present a method for selectively retrieving 'target information' from this model. This method partitions the state space into equivalence classes, as identified by an appropriate equivalence relation. The set of equivalence classes H, which serves as a reduced state space, contains none of the superfluous information of the original model. After construction and characterization of a Markov chain with state space H, the target information is efficiently retrieved via Markov chain Monte Carlo sampling. This approach represents a new breed of simulation techniques which are highly optimized for studying molecular self-assembly and, moreover, serves as a valuable guideline for analysis of other large state-space models.
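As a toy illustration of the idea (not the paper's actual self-assembly model), the sketch below lumps a 16-state configuration space into occupancy classes and runs Metropolis sampling on the reduced state space H, weighting each class by its size times a Boltzmann factor:

```python
import math
import random
from collections import defaultdict

random.seed(0)

# Toy model: a configuration is 4 binary site occupancies. The energy
# depends only on the number of occupied sites, so configurations
# related by permutation of sites form one equivalence class.
def energy(n_occupied):
    return -1.0 * n_occupied  # hypothetical: occupation is favourable

# Partition the 2**4 = 16 states into classes; record class sizes.
class_size = defaultdict(int)
for s in range(16):
    class_size[bin(s).count("1")] += 1  # class label -> number of states

T = 1.0
def weight(c):
    # target weight of class c: (states in class) * Boltzmann factor
    return class_size[c] * math.exp(-energy(c) / T)

labels = sorted(class_size)
current = 0
counts = defaultdict(int)
for _ in range(20000):
    prop = random.choice(labels)  # symmetric proposal over classes
    if random.random() < min(1.0, weight(prop) / weight(current)):
        current = prop
    counts[current] += 1

est = {c: counts[c] / 20000 for c in labels}
print(est)
```

The class-size factor is what keeps the reduced chain consistent with the original model: each class stands in for all of its member states, so lumping loses none of the target information about occupancy.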
Raaum, Ryan L; Al-Meeri, Ali; Mulligan, Connie J
2013-04-01
Studies of the impact of post-marital residence patterns on the distribution of genetic variation within populations have returned conflicting results. These studies have generally examined genetic diversity within and between groups with different post-marriage residence patterns. Here, we directly examine Y chromosome microsatellite variation in individuals carrying a chromosome in the same Y haplogroup. We analyze Y chromosome data from two samples of Yemeni males: a sample representing the entire country and a sample from a large highland village. Our results support a normative patrilocality in highland Yemeni tribal populations, but also suggest that patrilocality is violated often enough to break down the expected correlation of genetic and geographic distance. We propose that a great deal of variation in male dispersal distance distributions is subsumed under the "patrilocal" label and that few human societies are likely to realize the idealized male dispersal distribution expected under strict patrilocality. In addition, we found almost no specific correspondence between social kinship and genetic patriline at the level of the clan (large, extended patrilineal kinship group) within a large, highland Yemeni village. We discuss ethnographic accounts that offer several cultural practices that explain exceptions to patrilocality and means by which social kinship and genetic patriline may become disentangled. Copyright © 2013 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Lazebnik, Mariya; Popovic, Dijana; McCartney, Leah; Watkins, Cynthia B.; Lindstrom, Mary J.; Harter, Josephine; Sewall, Sarah; Ogilvie, Travis; Magliocco, Anthony; Breslin, Tara M.; Temple, Walley; Mew, Daphne; Booske, John H.; Okoniewski, Michal; Hagness, Susan C.
2007-10-01
The development of microwave breast cancer detection and treatment techniques has been driven by reports of substantial contrast in the dielectric properties of malignant and normal breast tissues. However, definitive knowledge of the dielectric properties of normal and diseased breast tissues at microwave frequencies has been limited by gaps and discrepancies across previously published studies. To address these issues, we conducted a large-scale study to experimentally determine the ultrawideband microwave dielectric properties of a variety of normal, malignant and benign breast tissues, measured from 0.5 to 20 GHz using a precision open-ended coaxial probe. Previously, we reported the dielectric properties of normal breast tissue samples obtained from reduction surgeries. Here, we report the dielectric properties of normal (adipose, glandular and fibroconnective), malignant (invasive and non-invasive ductal and lobular carcinomas) and benign (fibroadenomas and cysts) breast tissue samples obtained from cancer surgeries. We fit a one-pole Cole-Cole model to the complex permittivity data set of each characterized sample. Our analyses show that the contrast in the microwave-frequency dielectric properties between malignant and normal adipose-dominated tissues in the breast is considerable, as large as 10:1, while the contrast in the microwave-frequency dielectric properties between malignant and normal glandular/fibroconnective tissues in the breast is no more than about 10%.
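The one-pole Cole-Cole model mentioned above has a standard closed form for the complex relative permittivity. A minimal sketch follows; the parameter values used below are illustrative placeholders, not the fitted values reported in this study.

```python
import cmath  # complex math (cmath.exp etc. available if needed)
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def cole_cole(f_hz, eps_inf, delta_eps, tau, alpha, sigma_s):
    """One-pole Cole-Cole complex relative permittivity:
        eps*(w) = eps_inf + delta_eps / (1 + (j*w*tau)**(1 - alpha))
                  + sigma_s / (j*w*EPS0)
    with w = 2*pi*f. All parameter values passed in are assumptions for
    illustration, not this paper's fitted tissue parameters."""
    w = 2 * math.pi * f_hz
    return (eps_inf
            + delta_eps / (1 + (1j * w * tau) ** (1 - alpha))
            + sigma_s / (1j * w * EPS0))
```

As frequency increases through the dispersion, the real part relaxes from roughly `eps_inf + delta_eps` toward `eps_inf`, which is the behavior the fitted one-pole model captures for each tissue sample.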
Ellis, Ian O.; Green, Andrew R.; Hanka, Rudolf
2008-01-01
Background: We consider the problem of assessing inter-rater agreement when there are missing data and a large number of raters. Previous studies have shown only ‘moderate’ agreement between pathologists in grading breast cancer tumour specimens. We analyse a large but incomplete data-set consisting of 24,177 grades, on a discrete 1–3 scale, provided by 732 pathologists for 52 samples. Methodology/Principal Findings: We review existing methods for analysing inter-rater agreement for multiple raters and demonstrate two further methods. Firstly, we examine a simple non-chance-corrected agreement score based on the observed proportion of agreements with the consensus for each sample, which makes no allowance for missing data. Secondly, treating grades as lying on a continuous scale representing tumour severity, we use a Bayesian latent trait method to model cumulative probabilities of assigning grade values as functions of the severity and clarity of the tumour and of rater-specific parameters representing boundaries between grades 1–2 and 2–3. We simulate from the fitted model to estimate, for each rater, the probability of agreement with the majority. Both methods suggest that there are differences between raters in terms of rating behaviour, most often caused by consistent over- or under-estimation of the grade boundaries, and also considerable variability in the distribution of grades assigned to many individual samples. The Bayesian model addresses the tendency of the agreement score to be biased upwards for raters who, by chance, see a relatively ‘easy’ set of samples. Conclusions/Significance: Latent trait models can be adapted to provide novel information about the nature of inter-rater agreement when the number of raters is large and there are missing data. In this large study there is substantial variability between pathologists and uncertainty in the identity of the ‘true’ grade of many of the breast cancer tumours, a fact often ignored in clinical studies.
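The first, non-chance-corrected score described above (each rater's observed proportion of agreement with the per-sample consensus) is simple to compute even with missing data, since each rater is scored only over the samples they actually graded. A minimal sketch, with the majority grade standing in for the consensus; the data layout is our assumption:

```python
from collections import Counter

def consensus(grades_by_sample):
    """Majority grade per sample.
    grades_by_sample: {sample_id: {rater_id: grade}} with missing
    (rater, sample) pairs simply absent from the inner dicts."""
    return {s: Counter(g.values()).most_common(1)[0][0]
            for s, g in grades_by_sample.items()}

def agreement_scores(grades_by_sample):
    """Per-rater proportion of grades agreeing with the sample consensus,
    computed over only the samples each rater graded."""
    cons = consensus(grades_by_sample)
    per_rater = {}  # rater -> (n_agree, n_rated)
    for s, g in grades_by_sample.items():
        for rater, grade in g.items():
            a, n = per_rater.get(rater, (0, 0))
            per_rater[rater] = (a + (grade == cons[s]), n + 1)
    return {r: a / n for r, (a, n) in per_rater.items()}
```

This is exactly the score the abstract criticizes as biased upwards for raters who happen to see 'easy' samples, which motivates the latent trait model.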
Rodeles, Amaia A.; Galicia, David; Miranda, Rafael
2016-01-01
The study of freshwater fish species biodiversity and community composition is essential for understanding river systems, the effects of human activities on rivers, and the changes these animals face. Conducting this type of research requires quantitative information on fish abundance, ideally with long-term series and fish body measurements. This Data Descriptor presents a collection of 12 datasets containing a total of 146,342 occurrence records of 41 freshwater fish species sampled in 233 localities of various Iberian river basins. The datasets also contain 148,749 measurement records (length and weight) for these fish. Data were collected in different sampling campaigns (from 1992 to 2015). Eleven datasets represent large projects conducted over several years, and another combines small sampling campaigns. The Iberian Peninsula contains high fish biodiversity, with numerous endemic species threatened by various menaces, such as water extraction and invasive species. These data may support the development of large biodiversity conservation studies.
MicroRNA Expression in Laser Micro-dissected Breast Cancer Tissue Samples - a Pilot Study.
Seclaman, Edward; Narita, Diana; Anghel, Andrei; Cireap, Natalia; Ilina, Razvan; Sirbu, Ioan Ovidiu; Marian, Catalin
2017-10-28
Breast cancer continues to represent a significant public health burden despite outstanding research advances regarding the molecular mechanisms of cancer biology, biomarkers for diagnosis, and the prognostic and therapeutic management of this disease. Studies of microRNAs in breast cancer have underlined their potential as biomarkers and therapeutic targets; however, most of these studies are still done on largely heterogeneous whole breast tissue samples. In this pilot study we investigated the expression of four microRNAs (miR-21, -145, -155, and -92) known to be involved in breast cancer in homogeneous cell populations collected by laser capture microdissection from breast tissue section slides. MicroRNA expression was assessed by real-time PCR, and associations with clinical and pathological characteristics were also explored. Our results confirmed previous associations of miR-21 expression with poor-prognosis characteristics of breast cancers such as high stage and large, highly proliferative tumors. No statistically significant associations were found with the other microRNAs investigated, possibly because of the small sample size of our study. Our results also suggest that miR-484 could be a suitable endogenous control for data normalization in breast tissues, although this finding needs confirmation in future studies. In summary, our pilot study showed the feasibility of detecting microRNA expression in homogeneous laser-capture-microdissected invasive breast cancer samples and confirmed some previously reported associations with poor prognostic characteristics of breast tumors.
Goeman, Valerie R; Tinkler, Stacy H; Hammac, G Kenitra; Ruple, Audrey
2018-04-01
Environmental surveillance for Salmonella enterica can be used for early detection of contamination; thus routine sampling is an integral component of infection control programs in hospital environments. At the Purdue University Veterinary Teaching Hospital (PUVTH), the technique regularly employed in the large animal hospital for sample collection uses sterile gauze sponges for environmental sampling, which has proven labor-intensive and time-consuming. Alternative sampling methods use Swiffer brand electrostatic wipes for environmental sample collection, which are reportedly effective and efficient. It was hypothesized that use of Swiffer wipes for sample collection would be more efficient and less costly than the use of gauze sponges. A head-to-head comparison between the 2 sampling methods was conducted in the PUVTH large animal hospital and relative agreement, cost-effectiveness, and sampling efficiency were compared. There was fair agreement in culture results between the 2 sampling methods, but Swiffer wipes required less time and less physical effort to collect samples and were more cost-effective.
How well do we know the infaunal biomass of the continental shelf?
NASA Astrophysics Data System (ADS)
Powell, Eric N.; Mann, Roger
2016-03-01
Benthic infauna comprise a wide range of taxa of varying abundances and sizes, but large infaunal taxa are infrequently recorded in community surveys of the shelf benthos. These larger, but numerically rare, species may contribute disproportionately to biomass, however. We examine the degree to which standard benthic sampling gear and survey design provide an adequate estimate of the biomass of large infauna using the Atlantic surfclam, Spisula solidissima, on the continental shelf off the northeastern coast of the United States as a test organism. We develop a numerical model that simulates standard survey designs, gear types, and sampling densities to evaluate the effectiveness of vertically-dropped sampling gear (e.g., boxcores, grabs) for estimating density of large species. Simulations of randomly distributed clams at a density of 0.5-1 m^-2 within a 0.25-km^2 domain show that lower sampling densities (1-5 samples per sampling event) resulted in highly inaccurate estimates of clam density, with the presence of clams detected in less than 25% of the sampling events. In all cases in which patchiness was present in the simulated clam population, surveys were prone to very large errors (survey availability events) unless a dense (e.g., 100-sample) sampling protocol was imposed. Thus, commercial quantities of surfclams could easily go completely undetected by any standard benthic community survey protocol using vertically-dropped gear. Without recourse to modern high-volume sampling gear capable of sampling many meters at a swath, such as hydraulic dredges, biomass of the continental shelf will be grievously underestimated if large infauna are present even at moderate densities.
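Under the simplest idealization of the simulation described above, with randomly (Poisson) distributed animals, the chance that a set of vertically-dropped samples detects a large infaunal species at all can be written down directly. This sketch is our own illustration; the gear area is an assumed typical boxcore footprint, not a value taken from the paper.

```python
import math

def detection_probability(density_per_m2, gear_area_m2, n_samples):
    """P(at least one animal appears in n vertically-dropped samples),
    assuming animals are randomly (Poisson) distributed, so the expected
    count over all samples is lambda = density * area * n.
    gear_area_m2 is an assumption (boxcores typically sample ~0.1-0.25 m^2)."""
    lam = density_per_m2 * gear_area_m2 * n_samples
    return 1.0 - math.exp(-lam)
```

At the low sampling densities the abstract describes (a handful of 0.1-m^2 grabs against clams at ~0.5 m^-2), this probability stays well below one, consistent with clams going undetected in most sampling events; patchiness only worsens the picture.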
HUMAN EXPOSURE ASSESSMENT USING IMMUNOASSAY
The National Exposure Research Laboratory-Las Vegas is developing analytical methods for human exposure assessment studies. Critical exposure studies generate a large number of samples which must be analyzed in a reliable, cost-effective and timely manner. TCP (3,5,6-trichlor...
A Simple Sampling Method for Estimating the Accuracy of Large Scale Record Linkage Projects.
Boyd, James H; Guiver, Tenniel; Randall, Sean M; Ferrante, Anna M; Semmens, James B; Anderson, Phil; Dickinson, Teresa
2016-05-17
Record linkage techniques allow different data collections to be brought together to provide a wider picture of the health status of individuals. Ensuring high linkage quality is important to guarantee the quality and integrity of research. Current methods for measuring linkage quality typically focus on precision (the proportion of accepted links that are correct), given the difficulty of measuring the proportion of false negatives. The aim of this work is to introduce and evaluate a sampling-based method to estimate both precision and recall following record linkage. In the sampling-based method, record-pairs from each threshold (including those below the identified cut-off for acceptance) are sampled and clerically reviewed. These results are then applied to the entire set of record-pairs, providing estimates of false positives and false negatives. This method was evaluated on a synthetically generated dataset, where the true match status (which records belonged to the same person) was known. The sampled estimates of linkage quality were relatively close to the actual linkage quality metrics calculated for the whole synthetic dataset. The precision and recall measures for seven reviewers were very consistent, with little variation in the clerical assessment results (overall agreement using Fleiss' kappa statistic was 0.601). This method offers a practical means of accurately estimating matching quality and refining linkages in population-level linkage studies. The sampling approach is especially important for large project linkages where the number of record pairs produced may be very large, often running into the millions.
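A minimal sketch of the scale-up arithmetic behind the sampling method: clerical-review results within each similarity-score stratum are extrapolated to the whole stratum, then combined into precision and recall estimates. The data layout below is our assumption for illustration, not the authors' implementation.

```python
def estimate_linkage_quality(strata):
    """Estimate precision and recall from clerically reviewed samples.
    strata: list of (n_pairs, accepted, n_reviewed, n_true_in_review) tuples,
    one per similarity-score stratum, where `accepted` marks strata above
    the linkage acceptance cut-off. (Layout is an assumed illustration.)"""
    tp = fp = fn = 0.0
    for n_pairs, accepted, n_reviewed, n_true in strata:
        est_true = n_pairs * n_true / n_reviewed  # scale review sample up
        if accepted:            # stratum above the cut-off: links were kept
            tp += est_true
            fp += n_pairs - est_true
        else:                   # below the cut-off: true matches are misses
            fn += est_true
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return precision, recall
```

Because pairs below the cut-off are also sampled and reviewed, the otherwise hard-to-measure false-negative count (and hence recall) gets an estimate alongside precision.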
Hille, Katja; Möbius, Nadine; Akmatov, Manas K; Verspohl, Jutta; Rabold, Denise; Hartmann, Maria; Günther, Kathrin; Obi, Nadia; Kreienbrock, Lothar
2014-11-01
Cats and dogs live in more than 20% of German households, and the contact between these pets and their owners can be very close, so transmission of zoonotic pathogens may occur. To investigate whether zoonotic research questions can be examined in the context of population-based studies like the German National Cohort (GNC), two studies on different study populations were conducted as part of the feasibility tests of the GNC. The aim of the first study was to quantify the actual exposure of participants of the GNC to cats and dogs; the second study, summarised here, tested the feasibility of having owners sample their own cats and dogs. To quantify the exposure of GNC participants to cats and dogs, 744 participants of the GNC pretests were asked whether they had contact with animals. Currently, 10% have a dog and 14% have a cat in their household. These figures confirm that a large proportion of the German population has contact with pets and that there is a need for further zoonosis research. To establish the collection of biological samples from cats and dogs in large-scale population-based studies, feasible methods are needed. Therefore, a study was conducted to test whether pet owners can take samples from their cats and dogs and whether the quality of these samples is comparable to samples taken by a qualified veterinarian. A total of 82 dog owners and 18 cat owners were recruited in two veterinary practices in Hannover and the Clinic for Small Animals at the University of Veterinary Medicine Hannover. Sampling instructions and sample material for nasal and buccal swabs, faecal samples and, in the case of cat owners, a brush for fur samples were given to the pet owners. The pet owners were asked to take the samples from their pets at home and to send them by surface mail. Swab samples were cultured and bacterial growth was quantified independent of bacterial species.
The growth of Gram-positive and Gram-negative bacteria from samples taken by the veterinarian and by the pet owners was compared. For Gram-positive bacteria, the agreement of laboratory results was 71% for nasal swabs and 78% for oral swabs, while for Gram-negative bacteria it was 55% for nasal swabs and 87% for oral swabs. In conclusion, it was shown that participants of the GNC are exposed to cats and dogs and that the sampling of cats and dogs by their owners is a feasible method that can be a useful tool for zoonosis research in population-based studies.
The functional spectrum of low-frequency coding variation.
Marth, Gabor T; Yu, Fuli; Indap, Amit R; Garimella, Kiran; Gravel, Simon; Leong, Wen Fung; Tyler-Smith, Chris; Bainbridge, Matthew; Blackwell, Tom; Zheng-Bradley, Xiangqun; Chen, Yuan; Challis, Danny; Clarke, Laura; Ball, Edward V; Cibulskis, Kristian; Cooper, David N; Fulton, Bob; Hartl, Chris; Koboldt, Dan; Muzny, Donna; Smith, Richard; Sougnez, Carrie; Stewart, Chip; Ward, Alistair; Yu, Jin; Xue, Yali; Altshuler, David; Bustamante, Carlos D; Clark, Andrew G; Daly, Mark; DePristo, Mark; Flicek, Paul; Gabriel, Stacey; Mardis, Elaine; Palotie, Aarno; Gibbs, Richard
2011-09-14
Rare coding variants constitute an important class of human genetic variation, but are underrepresented in current databases that are based on small population samples. Recent studies show that variants altering amino acid sequence and protein function are enriched at low variant allele frequency, 2 to 5%, but because of insufficient sample size it is not clear if the same trend holds for rare variants below 1% allele frequency. The 1000 Genomes Exon Pilot Project has collected deep-coverage exon-capture data in roughly 1,000 human genes, for nearly 700 samples. Although medical whole-exome projects are currently afoot, this is still the deepest reported sampling of a large number of human genes with next-generation technologies. According to the goals of the 1000 Genomes Project, we created effective informatics pipelines to process and analyze the data, and discovered 12,758 exonic SNPs, 70% of them novel, and 74% below 1% allele frequency in the seven population samples we examined. Our analysis confirms that coding variants below 1% allele frequency show increased population-specificity and are enriched for functional variants. This study represents a large step toward detecting and interpreting low frequency coding variation, clearly lays out technical steps for effective analysis of DNA capture data, and articulates functional and population properties of this important class of genetic variation.
Metabarcoding analysis of strongylid nematode diversity in two sympatric primate species.
Pafčo, Barbora; Čížková, Dagmar; Kreisinger, Jakub; Hasegawa, Hideo; Vallo, Peter; Shutt, Kathryn; Todd, Angelique; Petrželková, Klára J; Modrý, David
2018-04-12
Strongylid nematodes in large terrestrial herbivores such as great apes, equids, elephants, and humans tend to occur in complex communities. However, identification of all species within strongylid communities using traditional methods based on coproscopy or single-nematode amplification and sequencing is virtually impossible. High-throughput sequencing (HTS) technologies provide opportunities to generate large amounts of sequence data and enable analyses of samples containing a mixture of DNA from multiple species/genotypes. We designed and tested an HTS approach for strain-level identification of gastrointestinal strongylids using ITS-2 metabarcoding on the Illumina MiSeq platform in samples from two free-ranging non-human primate species inhabiting the same environment but differing significantly in their host traits and ecology. Although we observed overlapping of particular haplotypes, overall the studied primate species differed in their strongylid nematode community composition. Using HTS, we revealed hidden diversity in the strongylid nematode communities of non-human primates: more than one haplotype was found in over 90% of samples, and coinfections with more than one putative species occurred in 80% of samples. In conclusion, the HTS approach to strongylid nematodes, preferably using fecal samples, represents a time- and cost-efficient way of studying strongylid communities and provides a resolution superior to traditional approaches.
Samusik, Nikolay; Wang, Xiaowei; Guan, Leying; Nolan, Garry P.
2017-01-01
Mass cytometry (CyTOF) has greatly expanded the capability of cytometry. It is now easy to generate multiple CyTOF samples in a single study, with each sample containing single-cell measurements of around 50 markers for hundreds of thousands of cells. Current methods do not adequately address the issues involved in combining multiple samples for subpopulation discovery, and these issues can be quickly and dramatically amplified with an increasing number of samples. To overcome this limitation, we developed Partition-Assisted Clustering and Multiple Alignments of Networks (PAC-MAN) for the fast automatic identification of cell populations in CyTOF data that closely matches expert manual discovery, and for alignments between subpopulations across samples to define dataset-level cellular states. PAC-MAN is computationally efficient, allowing the management of very large CyTOF datasets, which are increasingly common in clinical studies and cancer studies that monitor various tissue samples for each subject.
Wang, Y; Yin, D C; Liu, Y M; Shi, J Z; Lu, H M; Shi, Z H; Qian, A R; Shang, P
2011-03-01
A high-field superconducting magnet can provide both high magnetic fields and large field gradients, which can be used as a special environment for research or practical applications in materials processing, life science studies, physical and chemical reactions, etc. To make full use of a superconducting magnet, shared instruments (the operating platform, sample holders, temperature controller, and observation system) must be prepared as prerequisites. This paper introduces the design of a set of sample holders and a temperature controller in detail, with an emphasis on validating the performance of the force and temperature sensors in the high magnetic field.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joy, Lija K.; Sooraj, V.; Sethulakshmi, N.
2014-03-24
Commercial magnetite samples with sizes ranging from 25 to 30 nm were coated with polyaniline by radio-frequency plasma polymerization to achieve a core-shell structure of magnetic nanoparticle (core) and polyaniline (shell). High-resolution transmission electron microscopy images confirm the core-shell architecture of the polyaniline-coated iron oxide. The dielectric properties of the material were studied before and after plasma treatment. The polymer-coated magnetite particles exhibited a large dielectric permittivity relative to the uncoated samples. The dielectric behavior was modeled using a Maxwell–Wagner capacitor model. A plausible mechanism for the enhancement of the dielectric permittivity is proposed.
Ang, Rebecca P; Lowe, Patricia A; Yusof, Noradlin
2011-12-01
The present study investigated the factor structure, reliability, convergent and discriminant validity, and U.S. norms of the Revised Children's Manifest Anxiety Scale, Second Edition (RCMAS-2; C. R. Reynolds & B. O. Richmond, 2008a) scores in a Singapore sample of 1,618 school-age children and adolescents. Although there were small statistically significant differences in the average RCMAS-2 T scores found across various demographic groupings, on the whole, the U.S. norms appear adequate for use in the Asian Singapore sample. Results from item bias analyses suggested that biased items detected had small effects and were counterbalanced across gender and ethnicity, and hence, their relative impact on test score variation appears to be minimal. Results of factor analyses on the RCMAS-2 scores supported the presence of a large general anxiety factor, the Total Anxiety factor, and the 5-factor structure found in U.S. samples was replicated. Both the large general anxiety factor and the 5-factor solution were invariant across gender and ethnic background. Internal consistency estimates ranged from adequate to good, and 2-week test-retest reliability estimates were comparable to previous studies. Evidence providing support for convergent and discriminant validity of the RCMAS-2 scores was also found. Taken together, findings provide additional cross-cultural evidence of the appropriateness and usefulness of the RCMAS-2 as a measure of anxiety in Asian Singaporean school-age children and adolescents.
NASA Astrophysics Data System (ADS)
Price-Whelan, Adrian M.; Agueros, M. A.; Fournier, A.; Street, R.; Ofek, E.; Levitan, D. B.; PTF Collaboration
2013-01-01
Many current photometric, time-domain surveys are driven by specific goals such as searches for supernovae or transiting exoplanets, or studies of stellar variability. These goals in turn set the cadence with which individual fields are re-imaged. In the case of the Palomar Transient Factory (PTF), several such sub-surveys are being conducted in parallel, leading to extremely non-uniform sampling over the survey's nearly 20,000 sq. deg. footprint. While the typical 7.26 sq. deg. PTF field has been imaged 20 times in R-band, ~2300 sq. deg. have been observed more than 100 times. We use the existing PTF data (6.4x10^7 light curves) to study the trade-off that occurs when searching for microlensing events when one has access to a large survey footprint with irregular sampling. To examine the probability that microlensing events can be recovered in these data, we also test previous statistics used on uniformly sampled data to identify variables and transients. We find that one such statistic, the von Neumann ratio, performs best for identifying simulated microlensing events. We develop a selection method using this statistic and apply it to data from all PTF fields with >100 observations to uncover a number of interesting candidate events. This work can help constrain all-sky event rate predictions and test microlensing signal recovery in large datasets, both of which will be useful to future wide-field, time-domain surveys such as the LSST.
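The von Neumann ratio singled out above is simply the mean squared successive difference of a series divided by its sample variance; it is low for smooth, correlated variations (such as a microlensing bump) and near 2 for uncorrelated noise. A minimal plain-Python sketch (our own illustration, not the PTF pipeline):

```python
def von_neumann_ratio(x):
    """Von Neumann ratio: mean squared successive difference / variance.
    Values well below 2 indicate smooth, correlated variation in the
    series (e.g. a microlensing event); ~2 is expected for white noise.
    Assumes a regularly ordered light curve passed as a list of floats."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / (n - 1)
    d2 = sum((x[i + 1] - x[i]) ** 2 for i in range(n - 1)) / (n - 1)
    return d2 / var
```

One reason this statistic suits irregularly sampled survey data is that it needs only the ordering of the observations, not a uniform cadence.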
El Bali, Latifa; Diman, Aurélie; Bernard, Alfred; Roosens, Nancy H. C.; De Keersmaecker, Sigrid C. J.
2014-01-01
Human genomic DNA extracted from urine could be an interesting tool for large-scale public health studies involving characterization of genetic variations or DNA biomarkers as a result of the simple and noninvasive collection method. These studies, involving many samples, require a rapid, easy, and standardized extraction protocol. Moreover, for practicability, there is a necessity to collect urine at a moment different from the first void and to store it appropriately until analysis. The present study compared seven commercial kits to select the most appropriate urinary human DNA extraction procedure for epidemiological studies. DNA yield was determined using different quantification methods: two classical, i.e., NanoDrop and PicoGreen, and two species-specific real-time quantitative (q)PCR assays, as DNA extracted from urine contains microbial DNA in addition to human DNA, which contributes largely to the total DNA yield. In addition, the kits giving a good yield were also tested for the presence of PCR inhibitors. Further comparisons were performed regarding the sampling time and the storage conditions. Finally, as a proof of concept, an important gene related to smoking was genotyped using the developed tools. We could select one well-performing kit for human DNA extraction from urine suitable for molecular diagnostic real-time qPCR-based assays targeting genetic variations, applicable to large-scale studies. In addition, successful genotyping was possible using DNA extracted from urine stored at −20°C for several months, and an acceptable yield could also be obtained from urine collected at different moments during the day, which is particularly important for public health studies.
Dickerson, Daniel L; Johnson, Carrie L
2012-02-01
This study analyzes descriptive data among a clinical sample of American Indian/Alaska Native (AI/AN) youths receiving mental health services in a large California metropolitan area. Among 118 urban AI/AN youths, mood disorders (41.5%) and adjustment disorder (35.4%) were the most common mental health diagnoses. Alcohol (69.2%) and marijuana (50.0%) were the most commonly used substances. Witnessing domestic violence (84.2%) and living with someone who had a substance abuse problem (64.7%) were reported. The majority of patients demonstrated various behavior and emotional problems. Enhancing culturally relevant mental health and substance abuse treatment and prevention programs for urban AI/AN youth is suggested.
Sánchez-Marcos, J; Laguna-Marco, M A; Martínez-Morillas, R; Céspedes, E; Menéndez, N; Jiménez-Villacorta, F; Prieto, C
2012-11-01
Partially oxidized iron nanoclusters have been prepared by the gas-phase aggregation technique with typical sizes of 2-3 nm. This preparation technique has been reported to yield clusters with interesting magnetic properties such as very large exchange bias. In this paper, a study of sample composition carried out by Mössbauer and X-ray absorption spectroscopies is reported. The information obtained by these techniques, which is based on the iron short-range order, proves to be an ideal way to characterize the whole sample, since the obtained data are an average over a very large number of clusters. In addition, our results indicate the presence of ferrihydrite, a compound typically ignored when studying this type of system.
OSSOS. VI. Striking Biases in the Detection of Large Semimajor Axis Trans-Neptunian Objects
NASA Astrophysics Data System (ADS)
Shankman, Cory; Kavelaars, J. J.; Bannister, Michele T.; Gladman, Brett J.; Lawler, Samantha M.; Chen, Ying-Tung; Jakubik, Marian; Kaib, Nathan; Alexandersen, Mike; Gwyn, Stephen D. J.; Petit, Jean-Marc; Volk, Kathryn
2017-08-01
The accumulating but small set of large semimajor axis trans-Neptunian objects (TNOs) shows an apparent clustering in the orientations of their orbits. This clustering must either be representative of the intrinsic distribution of these TNOs, or else have arisen as a result of observation biases and/or statistically expected variations for such a small set of detected objects. The clustered TNOs were detected across different and independent surveys, which has led to claims that the detections are therefore free of observational bias. This apparent clustering has led to the so-called “Planet 9” hypothesis that a super-Earth currently resides in the distant solar system and causes this clustering. The Outer Solar System Origins Survey (OSSOS) is a large program that ran on the Canada–France–Hawaii Telescope from 2013 to 2017, discovering more than 800 new TNOs. One of the primary design goals of OSSOS was the careful determination of observational biases that would manifest within the detected sample. We demonstrate the striking and non-intuitive biases that exist for the detection of TNOs with large semimajor axes. The eight large semimajor axis OSSOS detections are an independent data set, of comparable size to the conglomerate samples used in previous studies. We conclude that the orbital distribution of the OSSOS sample is consistent with being detected from a uniform underlying angular distribution.
Intra-class correlation estimates for assessment of vitamin A intake in children.
Agarwal, Girdhar G; Awasthi, Shally; Walter, Stephen D
2005-03-01
In many community-based surveys, multi-level sampling is inherent in the design. When designing these studies, especially when calculating the appropriate sample size, investigators need good estimates of the intra-class correlation coefficient (ICC), along with the cluster size, to adjust for variance inflation due to clustering at each level. The present study used data on the assessment of clinical vitamin A deficiency and intake of vitamin A-rich food in children in a district in India. For the survey, 16 households were sampled from 200 villages nested within eight randomly selected blocks of the district. ICCs and components of variance were estimated from a three-level hierarchical random-effects analysis of variance model. Estimates of ICCs and variance components were obtained at the village and block levels. Between-cluster variation was evident at each level of clustering. The ICCs were inversely related to cluster size, but the design effect could be substantial for large clusters. At the block level, most ICC estimates were below 0.07. At the village level, many ICC estimates ranged from 0.014 to 0.45. These estimates may provide useful information for the design of epidemiological studies in which the sampled (or allocated) units range in size from households to large administrative zones.
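The variance inflation described above is conventionally summarized by the design effect, DEFF = 1 + (m − 1)·ICC for clusters of size m. A minimal sketch of how an ICC estimate feeds into sample-size planning; the function names and the illustrative numbers are ours, not taken from the study:

```python
import math

def design_effect(icc, cluster_size):
    """Variance inflation factor under cluster sampling: DEFF = 1 + (m - 1) * ICC."""
    return 1.0 + (cluster_size - 1) * icc

def clustered_sample_size(n_srs, icc, cluster_size):
    """Sample size needed under clustering, given the size n_srs that
    simple random sampling would require."""
    return math.ceil(n_srs * design_effect(icc, cluster_size))

# Illustrative: a village-level ICC of 0.05 with 16 households per village,
# matching the cluster size used in the survey design described above.
deff = design_effect(0.05, 16)                   # 1 + 15 * 0.05 = 1.75
n_needed = clustered_sample_size(400, 0.05, 16)  # 400 * 1.75 = 700
```

Even a modest ICC inflates the required sample size substantially once clusters are large, which is the "design effect could be substantial for large clusters" point made in the abstract.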
Zucker, Kenneth J; Blanchard, Ray; Kim, Tae-Suk; Pae, Chi-Un; Lee, Chul
2007-10-01
Two biodemographic variables - birth order and sibling sex ratio - have been examined in several Western samples of homosexual transsexual men. The results have consistently shown that homosexual transsexuals have a later birth order and come from sibships with an excess of brothers over sisters; the excess of brothers has been largely driven by the number of older brothers and hence has been termed the fraternal birth order effect. In the present study, birth order and sibling sex ratio were examined in an Asian sample of 43 homosexual transsexual men and 49 heterosexual control men from South Korea. Although the transsexual men had a significantly late birth order, so did the control men. Unlike Western samples, the Korean transsexuals had a significant excess of sisters, not brothers, as did the control men, and this was largely accounted for by older sisters. It is concluded that a male-preference stopping rule governing parental reproductive behavior had a strong impact on these two biodemographic variables. Future studies that examine birth order and sibling sex ratio in non-Western samples of transsexuals need to be vigilant for the influential role of stopping rules, including the one identified in the present study.
Brouwer, Lieke; van der Sanden, Sabine M G; Calis, Job C J; Bruning, Andrea H L; Wang, Steven; Wildenbeest, Joanne G; Rebers, Sjoerd P H; Phiri, Kamija S; Westerhuis, Brenda M; van Hensbroek, Michaël Boele; Pajkrt, Dasja; Wolthers, Katja C
2018-05-28
Enteroviruses (EVs) are among the most commonly detected viruses infecting humans worldwide. Although the prevalence of EVs is widely studied, the status of EV prevalence in sub-Saharan Africa remains largely unknown. The objective of our present study was therefore to increase our knowledge of EV circulation in sub-Saharan Africa. We obtained 749 fecal samples from a cross-sectional study conducted on Malawian children aged 6 to 60 months. We tested the samples for the presence of EVs using real-time PCR and typed the positive samples based on partial viral protein 1 (VP1) sequences. A large proportion of the samples was EV positive (89.9%). Of the typed samples, 12.9% belonged to EV species A (EV-A), 48.6% to species B (EV-B), and 38.5% to species C (EV-C). More than half of the EV-C strains (53%) belonged to subgroup C, which contains, among others, poliovirus (PV) types 1-3. The serotype most frequently isolated in our study was CVA-13, followed by EV-C99. The strains of CVA-13 showed a vast genetic diversity, possibly representing a new cluster, 'F'. The majority of the EV-C99 strains grouped together as cluster B. In conclusion, this study showed a vast circulation of EVs among Malawian children, with an EV prevalence of 89.9%. EV-C prevalences comparable to that of our study (38.5%) have previously been reported only in sub-Saharan Africa, and EV-C is rarely found outside this region. The data found in this study are an important contribution to our current knowledge of EV epidemiology within sub-Saharan Africa.
NASA Technical Reports Server (NTRS)
Veldhuis, Hugo; Hall, Forrest G. (Editor); Knapp, David E. (Editor)
2000-01-01
This data set contains the major soil properties of soil samples collected in 1994 at the tower flux sites in the Northern Study Area (NSA). The soil samples were collected by Hugo Veldhuis and his staff from the University of Manitoba. The mineral soil samples were largely analyzed by Barry Goetz, under the supervision of Dr. Harold Rostad at the University of Saskatchewan. The organic soil samples were largely analyzed by Peter Haluschak, under the supervision of Hugo Veldhuis at the Centre for Land and Biological Resources Research in Winnipeg, Manitoba. During the course of field investigation and mapping, selected surface and subsurface soil samples were collected for laboratory analysis. These samples were used as benchmark references for specific soil attributes in general soil characterization. Detailed soil sampling, description, and laboratory analysis were performed on selected modal soils to provide examples of common soil physical and chemical characteristics in the study area. The soil properties that were determined include soil horizon; dry soil color; pH; bulk density; total, organic, and inorganic carbon; electric conductivity; cation exchange capacity; exchangeable sodium, potassium, calcium, magnesium, and hydrogen; water content at 0.01, 0.033, and 1.5 MPa; nitrogen; phosphorus; particle size distribution; texture; pH of the mineral soil and of the organic soil; extractable acid; and sulfur. These data are stored in ASCII text files. The data files are available on a CD-ROM (see document number 20010000884), or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).
Lazebnik, Mariya; McCartney, Leah; Popovic, Dijana; Watkins, Cynthia B; Lindstrom, Mary J; Harter, Josephine; Sewall, Sarah; Magliocco, Anthony; Booske, John H; Okoniewski, Michal; Hagness, Susan C
2007-05-21
The efficacy of emerging microwave breast cancer detection and treatment techniques will depend, in part, on the dielectric properties of normal breast tissue. However, knowledge of these properties at microwave frequencies has been limited due to gaps and discrepancies in previously reported small-scale studies. To address these issues, we experimentally characterized the wideband microwave-frequency dielectric properties of a large number of normal breast tissue samples obtained from breast reduction surgeries at the University of Wisconsin and University of Calgary hospitals. The dielectric spectroscopy measurements were conducted from 0.5 to 20 GHz using a precision open-ended coaxial probe. The tissue composition within the probe's sensing region was quantified in terms of percentages of adipose, fibroconnective and glandular tissues. We fit a one-pole Cole-Cole model to the complex permittivity data set obtained for each sample and determined median Cole-Cole parameters for three groups of normal breast tissues, categorized by adipose tissue content (0-30%, 31-84% and 85-100%). Our analysis of the dielectric properties data for 354 tissue samples reveals that there is a large variation in the dielectric properties of normal breast tissue due to substantial tissue heterogeneity. We observed no statistically significant difference between the within-patient and between-patient variability in the dielectric properties.
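The one-pole Cole-Cole model mentioned above has the standard form ε*(ω) = ε∞ + Δε / (1 + (jωτ)^(1−α)) + σs / (jωε0). A minimal sketch of evaluating it over the measured band; the parameter values below are illustrative placeholders, not the paper's median fits:

```python
import numpy as np

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def cole_cole(freq_hz, eps_inf, delta_eps, tau, alpha, sigma_s):
    """One-pole Cole-Cole complex relative permittivity:
    eps(w) = eps_inf + delta_eps / (1 + (j*w*tau)^(1-alpha)) + sigma_s / (j*w*EPS0)
    """
    w = 2 * np.pi * np.asarray(freq_hz, dtype=float)
    return (eps_inf
            + delta_eps / (1 + (1j * w * tau) ** (1 - alpha))
            + sigma_s / (1j * w * EPS0))

# Evaluate across the 0.5-20 GHz measurement band with made-up parameters
# loosely typical of a high-adipose-content sample.
f = np.linspace(0.5e9, 20e9, 5)
eps = cole_cole(f, eps_inf=3.1, delta_eps=1.6, tau=13e-12, alpha=0.05, sigma_s=0.05)
```

The real part (dielectric constant) falls with frequency while the negative imaginary part captures dispersive and conductive losses, which is the behavior such fits summarize per tissue group.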
Haematology and plasma chemistry of the red top ice blue mbuna cichlid (Metriaclima greshakei).
Snellgrove, Donna L; Alexander, Lucille G
2011-10-01
Clinical haematology and blood plasma chemistry can be used as a valuable tool to provide substantial diagnostic information for fish. A wide range of parameters can be used to assess nutritional status, digestive function, disease identification, routine metabolic levels, general physiological status and even the assessment and management of wild fish populations. However, to evaluate such data accurately, baseline reference intervals for each measurable parameter must be established for the species of fish in question. Baseline data for ornamental fish species are limited, as research is more commonly conducted using commercially cultured fish. Blood samples were collected from sixteen red top ice blue cichlids (Metriaclima greshakei), an ornamental freshwater fish, to describe a range of haematology and plasma chemistry parameters. Since this cichlid is fairly large in comparison with most tropical ornamental fish, two independent blood samples were taken to assess a large range of parameters. No significant differences were noted between sample periods for any parameter. Values obtained for a large number of parameters were similar to those established for other closely related fish species such as tilapia (Oreochromis spp.). In addition to reporting the first set of blood values for M. greshakei, to our knowledge, this study highlights the possibility of using previously established data for cultured cichlid species in studies with ornamental cichlid fish.
NASA Astrophysics Data System (ADS)
Silva, T. F.; Rodrigues, C. L.; Added, N.; Rizzutto, M. A.; Tabacniks, M. H.; Mangiarotti, A.; Curado, J. F.; Aguirre, F. R.; Aguero, N. F.; Allegro, P. R. P.; Campos, P. H. O. V.; Restrepo, J. M.; Trindade, G. F.; Antonio, M. R.; Assis, R. F.; Leite, A. R.
2018-05-01
The elemental mapping of large areas using ion beam techniques is a desired capability for several scientific communities, working on topics ranging from geoscience to cultural heritage. Usually, the constraints for large-area mapping are not met in setups employing micro- and nano-probes implemented all over the world. A novel setup for mapping large-sized samples in an external beam was recently built at the University of São Paulo, employing a broad MeV-proton probe of sub-millimeter dimension coupled to a high-precision, large-range XYZ robotic stage (60 cm range in all axes and precision of 5 μm ensured by optical sensors). An important issue in large-area mapping is how to deal with irregularities of the sample's surface, which may introduce artifacts in the images due to variation of the measuring conditions. In our setup, we implemented an automatic system based on machine vision to correct the position of the sample to compensate for its surface irregularities. As an additional benefit, a 3D digital reconstruction of the scanned surface can also be obtained. Using this new and unique setup, we have produced large-area elemental maps of ceramics, stones, fossils, and other sorts of samples.
Do Study Abroad Programs Enhance the Employability of Graduates?
ERIC Educational Resources Information Center
Di Pietro, Giorgio
2015-01-01
Using data on a large sample of recent Italian graduates, this paper investigates the extent to which participation in study abroad programs during university studies impacts subsequent employment likelihood. To address the problem of endogeneity related to participation in study abroad programs, I use a combination of fixed effects and…
ASSOCIATION AMONG INVERTEBRATES AND HABITAT INDICATORS FOR LARGE RIVERS IN THE MIDWEST
Six reaches in each of two large rivers (one each in Kentucky and Ohio) were sampled using a prototype benthic macroinvertebrate sampling technique. The intent was to better understand the relationship between large river macroinvertebrate assemblages and habitat features. This...
ERIC Educational Resources Information Center
Reinhardt, Vanessa P.; Wetherby, Amy M.; Schatschneider, Christopher; Lord, Catherine
2015-01-01
Despite consistent and substantive research documenting a large male to female ratio in Autism Spectrum Disorder (ASD), only a modest body of research exists examining sex differences in characteristics. This study examined sex differences in developmental functioning and early social communication in children with ASD as compared to children with…
ERIC Educational Resources Information Center
Nichols, Joe D.; White, Janet J.; Price, Margret
2006-01-01
This study was designed to examine the epistemological beliefs about the nature of knowledge, views of intelligence and motivational perceptions. Two samples were drawn from two large urban high schools in the Southwest portion of the United States with large Hispanic/Latino student populations while a third was drawn from a majority Anglo student…
Truancy and Well-Being among Secondary School Pupils in England
ERIC Educational Resources Information Center
Attwood, Gaynor; Croll, Paul
2015-01-01
The paper considers two problematic aspects of the lives of young people: the long-standing issues of truancy from school and more recent concerns about the extent of mental well-being. It uses data from a large-scale survey, the Longitudinal Study of Young People in England (LSYPE). LSYPE provides a very large sample which allows for robust…
Reddy, Sushma; Kimball, Rebecca T; Pandey, Akanksha; Hosner, Peter A; Braun, Michael J; Hackett, Shannon J; Han, Kin-Lan; Harshman, John; Huddleston, Christopher J; Kingston, Sarah; Marks, Ben D; Miglia, Kathleen J; Moore, William S; Sheldon, Frederick H; Witt, Christopher C; Yuri, Tamaki; Braun, Edward L
2017-09-01
Phylogenomics, the use of large-scale data matrices in phylogenetic analyses, has been viewed as the ultimate solution to the problem of resolving difficult nodes in the tree of life. However, it has become clear that analyses of these large genomic data sets can also result in conflicting estimates of phylogeny. Here, we use the early divergences in Neoaves, the largest clade of extant birds, as a "model system" to understand the basis for incongruence among phylogenomic trees. We were motivated by the observation that trees from two recent avian phylogenomic studies exhibit conflicts. Those studies used different strategies: 1) collecting many characters [~42 megabase pairs (Mbp) of sequence data] from 48 birds, sometimes including only one taxon for each major clade; and 2) collecting fewer characters (~0.4 Mbp) from 198 birds, selected to subdivide long branches. However, the studies also used different data types: the taxon-poor data matrix comprised 68% non-coding sequences whereas coding exons dominated the taxon-rich data matrix. This difference raises the question of whether the primary reason for incongruence is the number of sites, the number of taxa, or the data type. To test among these alternative hypotheses we assembled a novel, large-scale data matrix comprising 90% non-coding sequences from 235 bird species. Although increased taxon sampling appeared to have a positive impact on phylogenetic analyses, the most important variable was data type. Indeed, by analyzing different subsets of the taxa in our data matrix we found that increased taxon sampling actually resulted in increased congruence with the tree from the previous taxon-poor study (which had a majority of non-coding data) instead of the taxon-rich study (which largely used coding data).
We suggest that the observed differences in the estimates of topology for these studies reflect data-type effects due to violations of the models used in phylogenetic analyses, some of which may be difficult to detect. If incongruence among trees estimated using phylogenomic methods largely reflects problems with model fit, developing more "biologically-realistic" models is likely to be critical for efforts to reconstruct the tree of life. [Birds; coding exons; GTR model; model fit; Neoaves; non-coding DNA; phylogenomics; taxon sampling.]
A SIMPLE METHOD FOR EVALUATING DATA FROM AN INTERLABORATORY STUDY
Large-scale laboratory- and method-performance studies involving more than about 30 laboratories may be evaluated by calculating the HORRAT ratio for each test sample (HORRAT = [experimentally found among-laboratories relative standard deviation] divided by [relative standard deviat...
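The denominator of the HORRAT ratio is conventionally the Horwitz-predicted among-laboratories RSD, PRSD_R(%) = 2^(1 − 0.5·log10 C), with C the analyte mass fraction. A short sketch of the calculation; the acceptance band of roughly 0.5-2 mentioned in the comment is the commonly cited convention, not stated in this excerpt:

```python
import math

def horwitz_prsd(mass_fraction):
    """Predicted among-laboratories RSD (%) from the Horwitz equation:
    PRSD_R = 2^(1 - 0.5 * log10(C)), where C is the analyte mass fraction."""
    return 2 ** (1 - 0.5 * math.log10(mass_fraction))

def horrat(found_rsd_percent, mass_fraction):
    """HORRAT ratio: experimentally found among-laboratories RSD
    divided by the Horwitz-predicted RSD."""
    return found_rsd_percent / horwitz_prsd(mass_fraction)

# An analyte at 1 ppm (C = 1e-6): Horwitz predicts 2^(1 + 3) = 16% RSD.
print(horwitz_prsd(1e-6))   # 16.0
print(horrat(12.0, 1e-6))   # 0.75 -- inside the commonly used 0.5-2 band
```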
NASA Astrophysics Data System (ADS)
Cécillon, Lauric; Quénéa, Katell; Anquetil, Christelle; Barré, Pierre
2015-04-01
Due to the large heterogeneity of soil at all scales (from the soil core to the globe), several measurements are often mandatory to obtain a meaningful value for a measured soil property, whatever the scale of the study. Moreover, several soil investigation techniques produce large and complex datasets; pyrolysis-gas chromatography-mass spectrometry (Py-GC-MS), for example, produces complex three-way data. In this context, straightforward methods designed to speed up data treatment are needed to deal with large datasets. Py-GC-MS is a powerful and frequently used tool to characterize soil organic matter (SOM). However, the treatment of the results of a Py-GC-MS analysis of a soil sample is time-consuming (number of peaks, co-elution, etc.), and the treatment of large sets of Py-GC-MS results is laborious. Moreover, peak-position shifts and baseline drifts between analyses make automated data treatment difficult. These problems can be addressed using Parallel Factor Analysis 2 (PARAFAC2; Kiers et al., 1999; Bro et al., 1999). This algorithm has frequently been applied to chromatography data but has never been applied to analyses of SOM. We developed a Matlab routine, based on existing Matlab packages, dedicated to the simultaneous treatment of dozens of pyro-chromatogram mass spectra, and applied it to 40 soil samples. The benefits and expected improvements of our method will be discussed in our poster. References: Kiers et al. (1999) PARAFAC2 - Part I. A direct fitting algorithm for the PARAFAC2 model. Journal of Chemometrics, 13: 275-294. Bro et al. (1999) PARAFAC2 - Part II. Modeling chromatographic data with retention time shifts. Journal of Chemometrics, 13: 295-309.
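The retention-time shifts between runs that PARAFAC2 is designed to absorb can be illustrated with a simple cross-correlation lag estimate between two chromatograms. This is only a toy NumPy illustration of the shift problem, not the authors' Matlab routine or a substitute for PARAFAC2:

```python
import numpy as np

def estimate_shift(reference, trace):
    """Estimate the retention-time shift (in scan points) between two
    chromatographic traces via the peak of their cross-correlation."""
    ref = reference - reference.mean()
    tr = trace - trace.mean()
    corr = np.correlate(tr, ref, mode="full")
    # Index len(ref)-1 of the full correlation corresponds to zero lag.
    return int(np.argmax(corr)) - (len(ref) - 1)

# Synthetic example: a single Gaussian peak shifted by 7 scan points.
t = np.arange(200)
peak = np.exp(-0.5 * ((t - 100) / 5.0) ** 2)
shifted = np.exp(-0.5 * ((t - 107) / 5.0) ** 2)
print(estimate_shift(peak, shifted))  # 7
```

A single global lag like this cannot handle peak-dependent shifts or baseline drift, which is why a model such as PARAFAC2, fitting all runs simultaneously, is preferred for real pyro-chromatograms.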
NASA Astrophysics Data System (ADS)
Kumar, Ajay; Sharma, Sumit
2015-08-01
Radon concentrations were measured in some areas of Pathankot district, Punjab, India, from the health-hazard point of view. Exposure to radon through drinking water is largely by inhalation and ingestion. A RAD7, an electronic solid-state silicon detector (Durridge Co., USA), was used to measure the radon concentration in drinking water samples from the study area. The recorded radon concentrations in these water samples are below the limits recommended by UNSCEAR and the European Commission: UNSCEAR recommends a limit of 4 to 40 Bq/l for radon in water samples [1], and the European Commission has recommended a safe limit of 100 Bq/l [2].
Is High Self-Esteem a Precondition of "Normal" Behavior?
ERIC Educational Resources Information Center
Vande Kamp, Mark E.; And Others
Self-esteem is widely perceived to be important. This study examined the role of self-esteem as a moderator of social behavior in a sample selected to represent a broad range on the self-esteem dimension. Student subjects representing high, medium, and low levels of self-esteem were selected from a large sample (N=1,051) such that those…
ERIC Educational Resources Information Center
Papaioannou, Sophia; Mouzaki, Angeliki; Sideridis, Georgios D.; Antoniou, Foteini; Padeliadu, Suzanna; Simos, Panagiotis G.
2016-01-01
The study assessed cognitive and academic performance of children demonstrating teacher-rated ADHD-related symptoms (Inattention [IA] and/or Hyperactivity/Impulsivity [H/I]) in a representative sample of largely untreated Greek elementary school students (N = 923). A battery of tests assessing short-term memory (STM), sustained attention,…
ERIC Educational Resources Information Center
Steinhausen, Hans-Christoph; Metzke, Christa Winkler
2007-01-01
Background: The goal of this study was to assess the course of functional-somatic symptoms from late childhood to young adulthood and the associations of these symptoms with young adult psychopathology. Methods: Data were collected in a large community sample at three different points in time (1994, 1997, and 2001). Functional-somatic symptoms…
ERIC Educational Resources Information Center
Quesen, Sarah
2016-01-01
When studying differential item functioning (DIF) with students with disabilities (SWD) focal groups typically suffer from small sample size, whereas the reference group population is usually large. This makes it possible for a researcher to select a sample from the reference population to be similar to the focal group on the ability scale. Doing…
ERIC Educational Resources Information Center
Hua, Haiyan; Burchfield, Shirley
A large-scale longitudinal study in Bolivia examined the relationship between adult women's basic education and their social and economic well-being and development. A random sample of 1,600 participants and 600 nonparticipants, aged 15-45, was tracked for 3 years (the final sample included 717 participants and 224 controls). The four adult…
A Comparative Study of Handicap-Free Life Expectancy of China in 1987 and 2006
ERIC Educational Resources Information Center
Lai, Dejian
2009-01-01
After the first large scale national sampling survey on handicapped persons in 1987, China conducted its second national sampling survey in 2006. Using the data from these two surveys and the national life tables, we computed and compared the expected years of life free of handicapped condition by the Sullivan method. The expected years of life…
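The Sullivan method referenced above weights life-table person-years in each age interval by the proportion free of the condition. A hedged sketch with a hypothetical abridged life table; all numbers here are invented for illustration and are not the surveys' actual data:

```python
def sullivan_hfle(person_years, prevalence, survivors_at_start):
    """Handicap-free life expectancy by the Sullivan method:
    HFLE = sum_i (1 - pi_i) * L_i / l_x
    where L_i are life-table person-years lived in each age interval,
    pi_i is the prevalence of handicap in that interval, and l_x is
    the number of survivors at the starting age."""
    weighted = sum((1 - p) * L for p, L in zip(prevalence, person_years))
    return weighted / survivors_at_start

# Hypothetical abridged life table from age 60 (radix l_60 = 1000):
Lx = [4800, 4400, 3700, 2600, 1800]   # person-years lived in 5-year intervals
pix = [0.05, 0.08, 0.12, 0.20, 0.35]  # prevalence of handicap in each interval
hfle = sullivan_hfle(Lx, pix, 1000)
print(round(hfle, 2))  # 15.11 years free of handicap at age 60
```

Comparing two such estimates computed from the 1987 and 2006 surveys is exactly the kind of contrast the study reports.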
ERIC Educational Resources Information Center
Ang, Rebecca P.; Lowe, Patricia A.; Yusof, Noradlin
2011-01-01
The present study investigated the factor structure, reliability, convergent and discriminant validity, and U.S. norms of the Revised Children's Manifest Anxiety Scale, Second Edition (RCMAS-2; C. R. Reynolds & B. O. Richmond, 2008a) scores in a Singapore sample of 1,618 school-age children and adolescents. Although there were small…
Stress, Social Support, and Outcomes in Two Probability Samples of Homeless Adults
ERIC Educational Resources Information Center
Toro, Paul A.; Tulloch, Elizabeth; Ouellette, Nicole
2008-01-01
This study investigated the main effects of social support measures and their stress-buffering effects in two samples of homeless adults (Ns = 249 and 219) obtained in the same large county (surrounding Detroit) at different points in time over an 8-year period (1992-1994 and 2000-2002). The findings suggest that the construct of social support,…
ERIC Educational Resources Information Center
McCauley, Jenna L.; Conoscenti, Lauren M.; Ruggiero, Kenneth J.; Resnick, Heidi S.; Saunders, Benjamin E.; Kilpatrick, Dean G.
2009-01-01
Incapacitated/drug-alcohol facilitated sexual assault (IS/DAFS) is rapidly gaining recognition as a distinct form of assault with unique public health implications. This study reports the prevalence, case characteristics, and associated health risks of IS/DAFS using a large, nationally representative sample of 1,763 adolescent girls. Results…
ERIC Educational Resources Information Center
Xiaorong, Wu
2015-01-01
Under the Inland Tibetan Classes and Schools Policy, China has trained a large number of personnel to facilitate the social, economic, and cultural development of Tibet. This study used a multistage, random sample survey to collect data on the comprehensive qualities of two sample groups of personnel in Tibet: graduates and nongraduates of inland…
ERIC Educational Resources Information Center
Nam, Yunju; Mason, Lisa Reyes; Kim, Youngmi; Clancy, Margaret; Sherraden, Michael
2013-01-01
This study examined whether and how survey response differs by race and Hispanic origin, using data from birth certificates and survey administrative data for a large-scale statewide experiment. The sample consisted of mothers of infants selected from Oklahoma birth certificates using a stratified random sampling method (N = 7,111). This study…
Parent-Reported Feeding and Feeding Problems in a Sample of Dutch Toddlers
ERIC Educational Resources Information Center
de Moor, Jan; Didden, Robert; Korzilius, Hubert
2007-01-01
Little is known about the feeding behaviors and problems with feeding in toddlers. In the present questionnaire study, data were collected on the feeding behaviors and feeding problems in a relatively large (n = 422) sample of Dutch healthy toddlers (i.e. 18-36 months old) who lived at home with their parents. Results show that three meals a day…
WISC-IV and Clinical Validation of the Four- and Five-Factor Interpretative Approaches
ERIC Educational Resources Information Center
Weiss, Lawrence G.; Keith, Timothy Z.; Zhu, Jianjun; Chen, Hsinyi
2013-01-01
The purpose of this study was to determine the constructs measured by the WISC-IV and the consistency of measurement across large normative and clinical samples. Competing higher order four- and five-factor models were analyzed using the WISC-IV normative sample and clinical subjects. The four-factor solution is the model published with the test…
ERIC Educational Resources Information Center
Xulu, Zhang; Cheng, Jiang; Lili, Li
2017-01-01
Using large sample data from the 2013 National College Graduate Employment Survey, this article compares and analyzes differences in the job-seeking process and results for college students with urban and rural household registrations and uses a measurement model to explore factors affecting the starting salaries of college students. The research…
The K2 Galactic Archaeology Program Data Release. I. Asteroseismic Results from Campaign 1
NASA Astrophysics Data System (ADS)
Stello, Dennis; Zinn, Joel; Elsworth, Yvonne; Garcia, Rafael A.; Kallinger, Thomas; Mathur, Savita; Mosser, Benoit; Sharma, Sanjib; Chaplin, William J.; Davies, Guy; Huber, Daniel; Jones, Caitlin D.; Miglio, Andrea; Silva Aguirre, Victor
2017-01-01
NASA's K2 mission is observing tens of thousands of stars along the ecliptic, providing data suitable for large-scale asteroseismic analyses to inform galactic archaeology studies. Its first campaign covered a field near the north Galactic cap, a region never covered before by large asteroseismic-ensemble investigations, and was therefore of particular interest for exploring this part of our Galaxy. Here we report the asteroseismic analysis of all stars selected by the K2 Galactic Archaeology Program during the mission's “north Galactic cap” campaign 1. Our consolidated analysis uses six independent methods to measure the global seismic properties, in particular the large frequency separation and the frequency of maximum power. From the full target sample of 8630 stars we find about 1200 oscillating red giants, a number comparable with estimates from galactic synthesis modeling. Thus, as a valuable by-product we find roughly 7500 stars to be dwarfs, which provide a sample well suited for galactic exoplanet occurrence studies because they originate from our simple and easily reproducible selection function. In addition, to facilitate the full potential of the data set for galactic archaeology, we assess the detection completeness of our sample of oscillating red giants. We find that the sample is at least nearly complete for stars with 40 ≲ ν_max/μHz ≲ 270 and ν_max,detect < 2.6×10^6 · 2^(−Kp) μHz. There is a detection bias against helium core burning stars with ν_max ~ 30 μHz, affecting the number of measurements of Δν and possibly also ν_max. Although we can detect oscillations down to Kp = 15, our campaign 1 sample lacks enough faint giants to assess the detection completeness for stars fainter than Kp ~ 14.5.
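The completeness boundary quoted in the abstract can be turned into a small helper. The function names are ours; the relations are just the abstract's empirical cuts, 40 ≲ ν_max/μHz ≲ 270 and ν_max,detect < 2.6×10^6 · 2^(−Kp) μHz:

```python
def numax_detection_limit(kp_mag):
    """Approximate upper nu_max (in uHz) at which oscillations remain
    detectable for a star of Kepler magnitude Kp, per the completeness
    boundary quoted in the abstract: 2.6e6 * 2^(-Kp) uHz."""
    return 2.6e6 * 2.0 ** (-kp_mag)

def in_complete_region(numax, kp_mag):
    """True if a red giant falls in the region where the sample is
    described as at least nearly complete."""
    return 40 <= numax <= 270 and numax < numax_detection_limit(kp_mag)

print(numax_detection_limit(12.0))      # 634.765625 uHz
print(in_complete_region(100.0, 12.0))  # True
print(in_complete_region(300.0, 12.0))  # False: above the 270 uHz cut
```

Because the limit halves with every magnitude of faintness, only stars brighter than roughly Kp ~ 13 keep the full 40-270 μHz window below the detection ceiling.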
Retrieving cosmological signal using cosmic flows
NASA Astrophysics Data System (ADS)
Bouillot, V.; Alimi, J.-M.
2011-12-01
To understand the origin of the anomalously high bulk flow at large scales, we use very large simulations of various cosmological models. To disentangle cosmological from environmental effects, we select samples with bulk flow profiles similar to the observational data of Watkins et al. (2009), which exhibit a maximum in the bulk flow at 53 h^{-1} Mpc. The estimation of the cosmological parameters Ω_M and σ_8 on those samples is correct when based on the rms mass fluctuation, whereas it gives completely false values when based on bulk flow measurements, showing a dependence of the velocity fields on larger scales. By drawing a clear link between velocity fields at 53 h^{-1} Mpc and asymmetric patterns of the density field at 85 h^{-1} Mpc, we show that the bulk flow can depend largely on the environment. The cosmological signal is retrieved by studying the convergence of the bulk flow towards the linear prediction at very large scales (~150 h^{-1} Mpc).
Large area optical mapping of surface contact angle.
Dutra, Guilherme; Canning, John; Padden, Whayne; Martelli, Cicero; Dligatch, Svetlana
2017-09-04
Top-down contact angle measurements have been validated and confirmed to be as good as, if not more reliable than, side-based measurements. A range of samples, including industrially relevant materials for roofing and printing, has been compared. Using the top-down approach, mapping in both 1-D and 2-D has been demonstrated. The method was applied to study the change in contact angle as a function of silver (Ag) nanoparticle size, controlled by thermal evaporation. Large-area mapping reveals good uniformity for commercial Aspen paper coated with black laser-printer ink. The forensic and chemical analysis potential in 2-D is demonstrated by uncovering hidden CsF initials made with mineral oil on the coated Aspen paper. The method promises to revolutionize nanoscale characterization, industrial monitoring, and chemical analyses by allowing rapid contact angle measurements over large areas or large numbers of samples in ways and times that have not been possible before.
Pore water sampling in acid sulfate soils: a new peeper method.
Johnston, Scott G; Burton, Edward D; Keene, Annabelle F; Bush, Richard T; Sullivan, Leigh A; Isaacson, Lloyd
2009-01-01
This study describes the design, deployment, and application of a modified equilibration dialysis device (peeper) optimized for sampling pore waters in acid sulfate soils (ASS). The modified design overcomes the limitations of traditional-style peepers when sampling firm ASS materials over relatively large depth intervals. The new peeper device uses removable, individual cells of 25 mL volume housed in a 1.5 m long rigid, high-density polyethylene rod. The rigid housing structure allows the device to be inserted directly into relatively firm soils without requiring a supporting frame. The use of removable cells eliminates the need for a large glove-box after peeper retrieval, thus simplifying physical handling. Removable cells are easily maintained in an inert atmosphere during sample processing, and the 25-mL sample volume is sufficient for undertaking multiple analyses. A field evaluation of equilibration times indicates that 32 to 38 d of deployment was necessary. Overall, the modified method is simple, effective, and well suited to the acquisition and processing of redox-sensitive pore water profiles >1 m deep in acid sulfate soils or any other firm wetland soils.
Non-wadeable rivers have been largely overlooked by bioassessment programs because of sampling difficulties and a lack of appropriate methods and biological indicators. We are in the process of developing a Large River Bioassessment Protocol (LR-BP) for sampling macroinvertebrat...
Watershed Scale Monitoring and Modeling of Natural Organic Matter (NOM) Generation and Transport
NASA Astrophysics Data System (ADS)
Adams, R.; Rees, P. L.; Reckhow, D. A.; Castellon, C. M.
2006-05-01
This study describes a coupled watershed-scale monitoring campaign, laboratory study, and hydrological modeling study focused on determining the sources and transport mechanisms of Natural Organic Matter (NOM) in a small, mostly forested New England watershed. For some time, the state conservation authorities and a large metropolitan water authority have been concerned that the level of disinfection byproducts in drinking water supplied by a large surface water reservoir (Watchusett Reservoir, MA) has been increasing over time. The resulting study has attempted to investigate how these compounds, which are mostly formed by the chlorination process at the water treatment plant, are related to NOM precursor compounds that are generated from organic matter and transported by runoff processes in the watershed of the Watchusett Reservoir. The laboratory study measures disinfection byproduct formation potential (DBPFP) through chlorination of raw water samples obtained through field monitoring. Samples are analysed for trihalomethanes (THMs) and haloacetic acids (HAAs). Samples are also analysed for dissolved organic carbon (DOC) and ultraviolet absorbance at 254 nm (UV254). The samples have been collected from as many components of the hydrological cycle as possible in one of the subcatchments of Watchusett Reservoir (Stillwater River). To date, the samples include stream runoff, water impounded naturally in small ponds by beaver dams, rainfall, snow, throughfall (drainage from tree canopies), and samples pumped from shallow suction lysimeters installed to monitor soil water in the riparian zone. The current monitoring program began in late summer 2005; however, infrequent stream samples are available dating back to 2000 from an earlier research project and from water quality monitoring by various regulatory authorities.
The monitoring program has been designed to capture as much seasonal variation in water chemistry as possible and also to capture a large spring snowmelt event. The modeling study has been designed to provide a method of estimating the export of NOM and DBPFP precursor compounds by running a series of simple macromodels. One of these models has already been developed for DOC transport based on a variant of the popular TOPMODEL hydrological model. Currently, historical daily streamflow and precipitation data have been used to calibrate the hydrological model, and the results from the current and previous monitoring programs are being used to improve the representation of DOM generation in the model. The ultimate aim is to produce a modeling tool which can be used to investigate changes both in land-use and climate in the watershed and the resulting effects on the export of NOM and DBPFP compounds into the reservoir.
An experimental study on dynamic response for MICP strengthening liquefiable sands
NASA Astrophysics Data System (ADS)
Han, Zhiguang; Cheng, Xiaohui; Ma, Qiang
2016-12-01
The technology of bio-grouting is a new technique for soft ground improvement. Many researchers have carried out a large number of experiments and studies on this topic. However, few studies have addressed the dynamic response of solidified sand samples, such as the reduction of liquefaction in sand. To study this characteristic of a microbially strengthened liquefiable sandy foundation, a microorganism formula and grouting scheme were applied. After grouting, the solidified samples were tested via dynamic triaxial testing to examine their cyclic performance. The results indicate that solidified sand samples with various strengths can be obtained to meet different engineering requirements, that the use of bacteria solution and nutritive salt is reduced, and that the solidification time is shortened to 1-2 days. Most importantly, in the study of the dynamic response, it is found that the MICP grouting scheme is effective in improving the characteristics of liquefiable sand, such as liquefaction resistance.
The Peru Cervical Cancer Prevention Study (PERCAPS): the technology to make screening accessible.
Levinson, Kimberly L; Abuelo, Carolina; Salmeron, Jorge; Chyung, Eunice; Zou, Jing; Belinson, Suzanne E; Wang, Guixiang; Ortiz, Carlos Santos; Vallejos, Carlos Santiago; Belinson, Jerome L
2013-05-01
This study utilized a combination of HPV self-sampling, iFTA elute specimen cards, and long-distance transport for centralized processing of specimens to determine the feasibility of large-scale screening in remote and transient populations. This study was performed in two locations in Peru (Manchay and Iquitos). The "Just For Me" cervico-vaginal brush and iFTA elute cards were used for the collection and transport of specimens. Samples were shipped via FedEx to China and tested for 14 types of high-risk HPV using PCR-based MALDI-TOF. HPV-positive women were treated with cryotherapy after VIA triage, and followed up with colposcopy, biopsy, ECC, and repeat HPV testing at 6 months. Six hundred and forty-three women registered, and 632 returned a sample over a 10-day period. Within 2 weeks, specimens were shipped, samples tested, and results received by study staff. Sixty-eight women (10.8%) tested positive, and these results were delivered over 4 days. Fifty-nine HPV-positive women (87%) returned for evaluation and treatment, and 2 had large lesions not suitable for cryotherapy. At 6 months, 42 women (74%) returned for follow-up, and 3 had CIN 2 (all positive samples from the endocervical canal). Ninety-eight percent of participants reported that they would participate in this type of program again. Utilizing HPV self-sampling, solid-media specimen cards for long-distance transport, and centralized high-throughput processing, we achieved rapid delivery of results, high satisfaction levels, and low loss to follow-up for cervical cancer screening in remote and transient populations. Copyright © 2013 Elsevier Inc. All rights reserved.
Investigation of spectral analysis techniques for randomly sampled velocimetry data
NASA Technical Reports Server (NTRS)
Sree, Dave
1993-01-01
It is well known that laser velocimetry (LV) generates individual-realization velocity data that are randomly or unevenly sampled in time. Spectral analysis of such data to obtain the turbulence spectra, and hence turbulence scale information, requires special techniques. The 'slotting' technique of Mayo et al., also described by Roberts and Ajmani, and the 'direct transform' method of Gaster and Roberts are well known in the LV community. The slotting technique is computationally faster than the direct transform method. There are practical limitations, however, on how high in frequency an accurate estimate can be made for a given mean sampling rate. These high-frequency estimates are important in obtaining the microscale information of the turbulence structure. Previous studies found that reliable spectral estimates can be made up to about the mean sampling frequency (mean data rate) or less. If the data were evenly sampled, the frequency range would be half the sampling frequency (i.e., up to the Nyquist frequency); otherwise, aliasing problems would occur. The mean data rate and the sample size (total number of points) basically limit the frequency range. Also, there are large variabilities or errors associated with the high-frequency estimates from randomly sampled signals. Roberts and Ajmani proposed certain pre-filtering techniques to reduce these variabilities, but at the cost of the low-frequency estimates; the prefiltering acts as a high-pass filter. Further, Shapiro and Silverman showed theoretically that, for Poisson-sampled signals, it is possible to obtain alias-free spectral estimates far beyond the mean sampling frequency. But the question is, how far?
During his tenure under the 1993 NASA-ASEE Summer Faculty Fellowship Program, the author's studies of spectral analysis techniques for randomly sampled signals showed that the spectral estimates can be enhanced or improved up to about 4-5 times the mean sampling frequency by using a suitable prefiltering technique. This increased bandwidth, however, comes at the cost of the lower-frequency estimates. The studies further showed that large data sets on the order of 100,000 points or more, high data rates, and Poisson sampling are crucial for obtaining reliable spectral estimates from randomly sampled data, such as LV data. Some of the results of the current study are presented.
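The slotting technique described above admits a compact sketch: every pair of samples contributes the product of its velocity fluctuations to a discrete lag bin ("slot"), and the per-slot means estimate the autocovariance, from which the spectrum follows by a cosine transform. The following is our own simplified rendering of that idea; the function name, test signal, and parameters are invented for the example and are not taken from the cited works.

```python
import numpy as np

def slotted_autocorrelation(t, u, max_lag, n_slots):
    """Estimate the autocovariance of an unevenly sampled signal u(t) by
    binning the lag products of all sample pairs into discrete lag "slots"
    (a simplified sketch of the slotting idea)."""
    u = u - u.mean()                        # work with velocity fluctuations
    dt = max_lag / n_slots                  # slot width
    sums = np.zeros(n_slots)
    counts = np.zeros(n_slots)
    n = len(t)
    for i in range(n):
        for j in range(i, n):               # t is sorted, so we can break early
            lag = t[j] - t[i]
            if lag >= max_lag:
                break
            k = min(int(lag / dt), n_slots - 1)
            sums[k] += u[i] * u[j]
            counts[k] += 1
    counts[counts == 0] = 1                 # leave empty slots at zero
    return sums / counts                    # mean lag product per slot

# Randomly sampled 0.5 Hz sine plus noise; a spectrum would follow from a
# cosine transform of this autocovariance estimate.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 100.0, 2000))
u = np.sin(2 * np.pi * 0.5 * t) + 0.1 * rng.standard_normal(2000)
rho = slotted_autocorrelation(t, u, max_lag=4.0, n_slots=40)
# rho[0] estimates the variance; slots near lag 1 s come out negative,
# reflecting the anticorrelation of a 2 s period sine at half a period.
```

The double loop makes the estimator O(n·k) in the number of pairs within `max_lag`, which is the practical reason the slotting approach is faster than direct-transform methods for long records.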
Baird, Andrew J; Haslam, Roger A
2013-12-01
Beliefs, cognitions, and behaviors relating to pain can be associated with a range of negative outcomes. In patients, certain beliefs are associated with increased levels of pain and related disability. There are few data, however, showing the extent to which beliefs of patients differ from those of the general population. This study explored pain beliefs in a large nonclinical population and a chronic low back pain (CLBP) sample using the Pain Beliefs Questionnaire (PBQ) to identify differences in scores and factor structures between and within the samples. This was a cross-sectional study. The samples comprised patients attending a rehabilitation program and respondents to a workplace survey. Pain beliefs were assessed using the PBQ, which incorporates 2 scales: organic and psychological. Exploratory factor analysis was used to explore variations in factor structure within and between samples. The relationship between the 2 scales also was examined. Patients reported higher organic scores and lower psychological scores than the nonclinical sample. Within the nonclinical sample, those who reported frequent pain scored higher on the organic scale than those who did not. Factor analysis showed variations in relation to the presence of pain. The relationship between scales was stronger in those not reporting frequent pain. This was a cross-sectional study; therefore, no causal inferences can be made. Patients experiencing CLBP adopt a more biomedical perspective on pain than nonpatients. The presence of pain is also associated with increased biomedical thinking in a nonclinical sample. However, the impact is not only on the strength of beliefs, but also on the relationship between elements of belief and the underlying belief structure.
Rolke, Daniel; Persigehl, Markus; Peters, Britta; Sterk, Guido; Blenau, Wolfgang
2016-11-01
This study was part of a large-scale monitoring project to assess the possible effects of Elado ® (10 g clothianidin & 2 g β-cyfluthrin/kg seed)-dressed oilseed rape seeds on different pollinators in Northern Germany. Firstly, residues of clothianidin and its active metabolites thiazolylnitroguanidine and thiazolylmethylurea were measured in nectar and pollen from Elado ® -dressed (test site, T) and undressed (reference site, R) oilseed rape collected by honey bees confined within tunnel tents. Clothianidin and its metabolites could not be detected or quantified in samples from R fields. Clothianidin concentrations in samples from T fields were 1.3 ± 0.9 μg/kg and 1.7 ± 0.9 μg/kg in nectar and pollen, respectively. Secondly, pollen and nectar for residue analyses were sampled from free flying honey bees, bumble bees and mason bees, placed at six study locations each in the R and T sites at the start of oilseed rape flowering. Honey samples were analysed from all honey bee colonies at the end of oilseed rape flowering. Neither clothianidin nor its metabolites were detectable or quantifiable in R site samples. Clothianidin concentrations in samples from the T site were below the limit of quantification (LOQ, 1.0 µg/kg) in most pollen and nectar samples collected by bees and 1.4 ± 0.5 µg/kg in honey taken from honey bee colonies. In summary, the study provides reliable semi-field and field data of clothianidin residues in nectar and pollen collected by different bee species in oilseed rape fields under common agricultural conditions.
Epistemological Issues in Astronomy Education Research: How Big of a Sample is "Big Enough"?
NASA Astrophysics Data System (ADS)
Slater, Stephanie; Slater, T. F.; Souri, Z.
2012-01-01
As astronomy education research (AER) continues to evolve into a sophisticated enterprise, we must begin to grapple with defining our epistemological parameters. Moreover, as we attempt to make pragmatic use of our findings, we must make a concerted effort to communicate those parameters in a sensible way to the larger astronomical community. One area of much current discussion concerns the methodologies, and the corresponding sample sizes, that should be considered appropriate for generating knowledge in the field. To address this question, we completed a meta-analysis of nearly 1,000 peer-reviewed studies published in top-tier professional journals. Data related to methodologies and sample sizes were collected from "hard science" and "human science" journals to compare the epistemological systems of these two bodies of knowledge. Working back in time from August 2011, the 100 most recent studies reported in each journal were used as a data source: Icarus, ApJ and AJ, NARST, IJSE and SciEd. In addition, data were collected from the 10 most recent AER dissertations, a set of articles determined by the science education community to be the most influential in the field, and the nearly 400 articles used as reference materials for the NRC's Taking Science to School. Analysis indicates these bodies of knowledge have a great deal in common: each relies on a large variety of methodologies, and each builds its knowledge through studies that proceed from surprisingly small sample sizes. While both fields publish a small percentage of studies with large sample sizes, the vast majority of top-tier publications consist of rich studies of a small number of objects. We conclude that rigor in each field is determined not by a circumscription of methodologies and sample sizes, but by peer judgments that the methods and sample sizes are appropriate to the research question.
Identification of gamma-irradiated foodstuffs by chemiluminescence measurements in Taiwan
NASA Astrophysics Data System (ADS)
Ma, Ming-Shia Chang; Chen, Li-Hsiang; Tsai, Zei-Tsan; Fu, Ying-Kai
In order to establish chemiluminescence (CL) measurements as an identification method for γ-irradiated foodstuffs in Taiwan, ten agricultural products including wheat flour, rice, ginger, potatoes, garlic, onions, red beans, mung beans, soy beans, xanthoxylon seeds and Japanese star anises were tested to compare CL intensities between untreated samples and samples subjected to a 10 kGy γ-irradiation dose. Among them, wheat flour is the product most amenable to identification by CL measurements. The CL intensities of un-irradiated and irradiated flour showed large differences, with a significant dose-effect relationship. The effects on CL intensities at various doses of three different flour protein contents, of sieving (unsieved versus 100-200 mesh), of measurement reproducibility, and of storage were investigated in this study. In addition, the white bulb part of onions showed some CL in irradiated samples. The CL data obtained from the other eight agricultural products showed large fluctuations and cannot be used to differentiate between irradiated and un-irradiated samples.
Atomistic origin of size effects in fatigue behavior of metallic glasses
NASA Astrophysics Data System (ADS)
Sha, Zhendong; Wong, Wei Hin; Pei, Qingxiang; Branicio, Paulo Sergio; Liu, Zishun; Wang, Tiejun; Guo, Tianfu; Gao, Huajian
2017-07-01
While many experiments and simulations on metallic glasses (MGs) have focused on their tensile ductility under monotonic loading, the fatigue mechanisms of MGs under cyclic loading still remain largely elusive. Here we perform molecular dynamics (MD) and finite element simulations of tension-compression fatigue tests in MGs to elucidate their fatigue mechanisms with focus on the sample size effect. Shear band (SB) thickening is found to be the inherent fatigue mechanism for nanoscale MGs. The difference in fatigue mechanisms between macroscopic and nanoscale MGs originates from whether the SB forms partially or fully through the cross-section of the specimen. Furthermore, a qualitative investigation of the sample size effect suggests that small sample size increases the fatigue life while large sample size promotes cyclic softening and necking. Our observations on the size-dependent fatigue behavior can be rationalized by the Gurson model and the concept of surface tension of the nanovoids. The present study sheds light on the fatigue mechanisms of MGs and can be useful in interpreting previous experimental results.
Quality variations in black musli (curculigo orchioides gaertn.).
Mathew, P P Joy Samuel; Savithri, K E; Skaria, Baby P; Kurien, Kochurani
2004-07-01
Black musli (Curculigo orchioides Gaertn.), one of the ayurvedic dasapushpa and a rejuvenating and aphrodisiac drug, is on the verge of extinction and needs to be conserved and cultivated. Large variations are also observed in the quality of the crude drug available in the market. A study on the quality of C. orchioides in its natural habitat, under cultivation, and in trade in south India showed that there was considerable variation with biotypes and habitats. Drug material collected from the natural habitat was superior in quality to that produced by cultivation. Among the market samples collected from the various zones of Kerala, those from the High Ranges were superior in most of the quality parameters, which indicates their suitability for high-quality drug formulation. Among the southern states, Tamil Nadu samples ranked next to the High Range samples in this respect. There exists large variability in the market samples, and there is a felt need for proper standardization of the crude drug to ensure quality in drug formulations.
Daboul, Amro; Ivanovska, Tatyana; Bülow, Robin; Biffar, Reiner; Cardini, Andrea
2018-01-01
Using 3D anatomical landmarks from adult human head MRIs, we assessed the magnitude of inter-operator differences in Procrustes-based geometric morphometric analyses. An in depth analysis of both absolute and relative error was performed in a subsample of individuals with replicated digitization by three different operators. The effect of inter-operator differences was also explored in a large sample of more than 900 individuals. Although absolute error was not unusual for MRI measurements, including bone landmarks, shape was particularly affected by differences among operators, with up to more than 30% of sample variation accounted for by this type of error. The magnitude of the bias was such that it dominated the main pattern of bone and total (all landmarks included) shape variation, largely surpassing the effect of sex differences between hundreds of men and women. In contrast, however, we found higher reproducibility in soft-tissue nasal landmarks, despite relatively larger errors in estimates of nasal size. Our study exemplifies the assessment of measurement error using geometric morphometrics on landmarks from MRIs and stresses the importance of relating it to total sample variance within the specific methodological framework being used. In summary, precise landmarks may not necessarily imply negligible errors, especially in shape data; indeed, size and shape may be differentially impacted by measurement error and different types of landmarks may have relatively larger or smaller errors. Importantly, and consistently with other recent studies using geometric morphometrics on digital images (which, however, were not specific to MRI data), this study showed that inter-operator biases can be a major source of error in the analysis of large samples, as those that are becoming increasingly common in the 'era of big data'.
Walker, Joseph F; Yang, Ya; Feng, Tao; Timoneda, Alfonso; Mikenas, Jessica; Hutchison, Vera; Edwards, Caroline; Wang, Ning; Ahluwalia, Sonia; Olivieri, Julia; Walker-Hale, Nathanael; Majure, Lucas C; Puente, Raúl; Kadereit, Gudrun; Lauterbach, Maximilian; Eggli, Urs; Flores-Olvera, Hilda; Ochoterena, Helga; Brockington, Samuel F; Moore, Michael J; Smith, Stephen A
2018-03-01
The Caryophyllales contain ~12,500 species and are known for their cosmopolitan distribution, convergence of trait evolution, and extreme adaptations. Some relationships within the Caryophyllales, like those of many large plant clades, remain unclear, and phylogenetic studies often recover alternative hypotheses. We explore the utility of broad and dense transcriptome sampling across the order for resolving evolutionary relationships in Caryophyllales. We generated 84 transcriptomes and combined these with 224 publicly available transcriptomes to perform a phylogenomic analysis of Caryophyllales. To overcome the computational challenge of ortholog detection in such a large data set, we developed an approach for clustering gene families that allowed us to analyze >300 transcriptomes and genomes. We then inferred the species relationships using multiple methods and performed gene-tree conflict analyses. Our phylogenetic analyses resolved many clades with strong support, but also showed significant gene-tree discordance. This discordance is not only a common feature of phylogenomic studies, but also represents an opportunity to understand processes that have structured phylogenies. We also found taxon sampling influences species-tree inference, highlighting the importance of more focused studies with additional taxon sampling. Transcriptomes are useful both for species-tree inference and for uncovering evolutionary complexity within lineages. Through analyses of gene-tree conflict and multiple methods of species-tree inference, we demonstrate that phylogenomic data can provide unparalleled insight into the evolutionary history of Caryophyllales. We also discuss a method for overcoming computational challenges associated with homolog clustering in large data sets. © 2018 The Authors. American Journal of Botany is published by Wiley Periodicals, Inc. on behalf of the Botanical Society of America.
Predicting sample lifetimes in creep fracture of heterogeneous materials
NASA Astrophysics Data System (ADS)
Koivisto, Juha; Ovaska, Markus; Miksic, Amandine; Laurson, Lasse; Alava, Mikko J.
2016-08-01
Materials under creep or constant loads flow and, finally, fail. The prediction of sample lifetimes is an important and highly challenging problem because of the inherently heterogeneous nature of most materials, which results in large sample-to-sample lifetime fluctuations even under the same conditions. We study creep deformation of paper sheets as an example of a heterogeneous material and show how to predict the lifetimes of individual samples by exploiting the "universal" features in the sample-inherent creep curves, particularly the passage to an accelerating creep rate. Using simulations of a viscoelastic fiber bundle model, we illustrate how deformation localization controls the shape of the creep curve and thus the degree of lifetime predictability.
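One simple way to exploit the "passage to an accelerating creep rate" as a lifetime predictor is to locate the minimum of the strain rate and scale its time by an empirical factor. The sketch below assumes, purely for illustration, a symmetric creep curve whose rate minimum falls at mid-life (factor 2.0); neither the factor nor the synthetic curve is taken from the study.

```python
import numpy as np

def predict_lifetime(t, strain, factor=2.0):
    """Predict failure time from the passage to accelerating creep: find the
    time of the minimum strain rate and scale it by an empirical factor
    (2.0 assumes the minimum sits at mid-life; an illustrative choice)."""
    rate = np.gradient(strain, t)      # numerical strain rate on the t grid
    t_min = t[np.argmin(rate)]         # time of slowest creep
    return factor * t_min

# Synthetic creep curve with a decelerating (primary) and an accelerating
# (tertiary) stage: strain rate ~ 1/t + 1/(tf - t), symmetric about tf/2.
tf = 100.0
t = np.linspace(1.0, 99.0, 981)
strain = np.log(t) - np.log(tf - t) + np.log(tf - 1.0)
pred = predict_lifetime(t, strain)     # close to the true lifetime tf = 100
```

In a heterogeneous material, the attraction of such a rule is that it uses only the sample's own creep curve up to the rate minimum, so no knowledge of the disorder realization is needed.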
Sample size for post-marketing safety studies based on historical controls.
Wu, Yu-te; Makuch, Robert W
2010-08-01
As part of a drug's entire life cycle, post-marketing studies are an important part of the identification of rare, serious adverse events. Recently, the US Food and Drug Administration (FDA) has begun to implement new post-marketing safety mandates as a consequence of increased emphasis on safety. The purpose of this research is to provide an exact sample size formula for the proposed hybrid design, based on a two-group cohort study with incorporation of historical external data. An exact sample size formula based on the Poisson distribution is developed, because the detection of rare events is our outcome of interest. The performance of the exact method is compared with that of its approximate large-sample counterpart. The proposed hybrid design requires a smaller sample size compared to the standard, two-group prospective study design. In addition, the exact method reduces the number of subjects required in the treatment group by up to 30% compared to the approximate method for the study scenarios examined. 2010 John Wiley & Sons, Ltd.
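To make the exact-Poisson idea concrete, the sketch below computes a sample size for the simpler one-group case of testing a rare adverse-event rate against a known background rate. The paper's hybrid two-group formula additionally incorporates the historical-control data, so the function and rates here are illustrative assumptions, not the authors' formula.

```python
import math

def poisson_cdf(k, mu):
    """P(X <= k) for X ~ Poisson(mu), by direct summation of the pmf."""
    term = math.exp(-mu)
    total = term
    for i in range(1, k + 1):
        term *= mu / i
        total += term
    return total

def exact_sample_size(rate0, rate1, alpha=0.05, power=0.8, max_n=100000):
    """Smallest amount of follow-up n (e.g. person-years) such that the exact
    one-sided Poisson test of H0: rate = rate0 rejects with the requested
    power when the true adverse-event rate is rate1 > rate0."""
    for n in range(1, max_n + 1):
        mu0, mu1 = n * rate0, n * rate1
        # critical count c: smallest value with P(X >= c | mu0) <= alpha
        c = 0
        while 1.0 - poisson_cdf(c, mu0) > alpha:
            c += 1
        c += 1                                   # reject when X >= c
        if 1.0 - poisson_cdf(c - 1, mu1) >= power:
            return n
    return None                                  # not attainable within max_n

# e.g. background rate of 1 per 1000 person-years vs. a fivefold elevation
n = exact_sample_size(0.001, 0.005)
```

Because the rejection region is built from the discrete Poisson tail rather than a normal approximation, the achieved power moves in small jumps as n grows, which is exactly why exact and approximate sample sizes can differ noticeably for rare events.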
1982-05-01
in May 1976, and, by July 1976, all sampling techniques were employed. In addition to routine displays of data analysis such as frequency tables and...amphibian and reptile communities in large aquatic habitats in Florida, comparison with similar herpetofaunal assemblages or populations is not possible... field environment was initiated at Lake Conway near Orlando, Fla., to study the effectiveness of the fish as a biological macrophyte control agent. A
Static versus dynamic sampling for data mining
DOE Office of Scientific and Technical Information (OSTI.GOV)
John, G.H.; Langley, P.
1996-12-31
As data warehouses grow to the point where one hundred gigabytes is considered small, the computational efficiency of data-mining algorithms on large databases becomes increasingly important. Using a sample from the database can speed up the data-mining process, but this is only acceptable if it does not reduce the quality of the mined knowledge. To this end, we introduce the "Probably Close Enough" criterion to describe the desired properties of a sample. Sampling usually refers to the use of static statistical tests to decide whether a sample is sufficiently similar to the large database, in the absence of any knowledge of the tools the data miner intends to use. We discuss dynamic sampling methods, which take into account the mining tool being used and can thus give better samples. We describe dynamic schemes that observe a mining tool's performance on training samples of increasing size and use these results to determine when a sample is sufficiently large. We evaluate these sampling methods on data from the UCI repository and conclude that dynamic sampling is preferable.
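The dynamic scheme described in the abstract can be sketched as follows: keep enlarging the training sample and re-running the mining tool until the accuracy gain per step becomes negligible. The fixed stopping threshold, the toy nearest-centroid "mining tool", and all names below are our illustrative assumptions; the paper's actual criterion is probabilistic rather than a fixed epsilon.

```python
import random

def accuracy(learner, train, test):
    """Train the mining tool on `train` and score it on held-out `test`."""
    predict = learner(train)
    return sum(predict(x) == y for x, y in test) / len(test)

def dynamic_sample_size(learner, pool, test, start=100, step=100, eps=0.005):
    """Grow the training sample until the accuracy gain per step drops
    below eps, i.e. the sample is judged "probably close enough"."""
    n = start
    prev = accuracy(learner, pool[:n], test)
    while n + step <= len(pool):
        n += step
        cur = accuracy(learner, pool[:n], test)
        if cur - prev < eps:           # the mining tool has stopped improving
            return n, cur
        prev = cur
    return len(pool), prev             # fell back to the full pool

def centroid_learner(train):
    """Toy mining tool: 1-D nearest-centroid classifier."""
    groups = {}
    for x, y in train:
        groups.setdefault(y, []).append(x)
    centers = {y: sum(v) / len(v) for y, v in groups.items()}
    return lambda x: min(centers, key=lambda y: abs(x - centers[y]))

# Two well-separated 1-D classes; the learning curve plateaus quickly,
# so the dynamic scheme stops long before the full pool is used.
random.seed(1)
data = [(random.gauss(c, 1.0), c) for _ in range(5000) for c in (0, 3)]
random.shuffle(data)
test, pool = data[:1000], data[1000:]
n, acc = dynamic_sample_size(centroid_learner, pool, test)
```

The design point this illustrates is the paper's key contrast: a static test would compare the sample to the database distribution, whereas here the stopping decision is driven by the behavior of the mining tool itself.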
Chen, Lei; Pei, Junxian; Huang, Xiaojia; Lu, Min
2018-06-05
On-site sample preparation is highly desirable because it avoids the transportation of large-volume samples and ensures the accuracy of the analytical results. In this work, a portable prototype of a tip microextraction device (TMD) was designed and developed for on-site sample pretreatment. The assembly procedure of the TMD is quite simple. Firstly, a polymeric ionic liquid (PIL)-based adsorbent was prepared in situ in a pipette tip. After that, the tip was connected to a syringe driven by a bidirectional motor. The flow rates in the adsorption and desorption steps were controlled accurately by the motor. To evaluate the practicability of the developed device, the TMD was used for on-site preparation of water samples and combined with high-performance liquid chromatography with diode array detection to measure trace estrogens in the samples. Under the most favorable conditions, the limits of detection (LODs, S/N = 3) for the target analytes were in the range of 4.9-22 ng/L, with good coefficients of determination. A confirmatory study shows that the extraction performance of the TMD is comparable to that of the traditional laboratory solid-phase extraction process, but the proposed TMD is simpler and more convenient. At the same time, the TMD avoids the complicated sampling and transfer steps of large-volume water samples. Copyright © 2018 Elsevier B.V. All rights reserved.
Electrofishing effort required to estimate biotic condition in southern Idaho Rivers
Maret, Terry R.; Ott, Douglas S.; Herlihy, Alan T.
2007-01-01
An important issue surrounding biomonitoring in large rivers is the minimum sampling effort required to collect an adequate number of fish for accurate and precise determinations of biotic condition. During the summer of 2002, we sampled 15 randomly selected large-river sites in southern Idaho to evaluate the effects of sampling effort on an index of biotic integrity (IBI). Boat electrofishing was used to collect sample populations of fish in river reaches representing 40 and 100 times the mean channel width (MCW; wetted channel) at base flow. Minimum sampling effort was assessed by comparing the relation between reach length sampled and change in IBI score. Thirty-two species of fish in the families Catostomidae, Centrarchidae, Cottidae, Cyprinidae, Ictaluridae, Percidae, and Salmonidae were collected. Of these, 12 alien species were collected at 80% (12 of 15) of the sample sites; alien species represented about 38% of all species (N = 32) collected during the study. A total of 60% (9 of 15) of the sample sites had poor IBI scores. A minimum reach length of about 36 times MCW was determined to be sufficient for collecting an adequate number of fish for estimating biotic condition based on an IBI score. For most sites, this equates to collecting 275 fish at a site. Results may be applicable to other semiarid, fifth-order through seventh-order rivers sampled during summer low-flow conditions.
Hemani, Gibran; Yang, Jian; Vinkhuyzen, Anna; Powell, Joseph E; Willemsen, Gonneke; Hottenga, Jouke-Jan; Abdellaoui, Abdel; Mangino, Massimo; Valdes, Ana M; Medland, Sarah E; Madden, Pamela A; Heath, Andrew C; Henders, Anjali K; Nyholt, Dale R; de Geus, Eco J C; Magnusson, Patrik K E; Ingelsson, Erik; Montgomery, Grant W; Spector, Timothy D; Boomsma, Dorret I; Pedersen, Nancy L; Martin, Nicholas G; Visscher, Peter M
2013-11-07
Evidence that complex traits are highly polygenic has been presented by population-based genome-wide association studies (GWASs) through the identification of many significant variants, as well as by family-based de novo sequencing studies indicating that several traits have a large mutational target size. Here, using a third study design, we show results consistent with extreme polygenicity for body mass index (BMI) and height. On a sample of 20,240 siblings (from 9,570 nuclear families), we used a within-family method to obtain narrow-sense heritability estimates of 0.42 (SE = 0.17, p = 0.01) and 0.69 (SE = 0.14, p = 6 × 10⁻⁷) for BMI and height, respectively, after adjusting for covariates. The genomic inflation factors from locus-specific linkage analysis were 1.69 (SE = 0.21, p = 0.04) for BMI and 2.18 (SE = 0.21, p = 2 × 10⁻¹⁰) for height. This inflation is free of confounding and congruent with polygenicity, consistent with observations of ever-increasing genomic-inflation factors from GWASs with large sample sizes, implying that those signals are due to true genetic signals across the genome rather than population stratification. We also demonstrate that the distribution of the observed test statistics is consistent with both rare and common variants underlying a polygenic architecture and that previous reports of linkage signals in complex traits are probably a consequence of polygenic architecture rather than the segregation of variants with large effects. The convergent empirical evidence from GWASs, de novo studies, and within-family segregation implies that family-based sequencing studies for complex traits require very large sample sizes because the effects of causal variants are small on average. Copyright © 2013 The American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
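The genomic inflation factor used above is the ratio of the median observed 1-df association statistic to its expected value under the null. A minimal sketch (not the authors' pipeline; the simulated statistics are purely illustrative):

```python
import numpy as np

CHI2_1DF_MEDIAN = 0.4549364  # median of the chi-square(1) distribution

def genomic_inflation(chi2_stats):
    """Lambda_GC: median observed 1-df test statistic divided by the
    null median. Values near 1 indicate no inflation; widespread
    polygenic signal (or confounding) pushes it above 1."""
    return float(np.median(chi2_stats) / CHI2_1DF_MEDIAN)

# Under a pure null, lambda sits near 1; the 1.69 and 2.18 reported
# above reflect genuine genome-wide signal instead.
rng = np.random.default_rng(0)
null_stats = rng.chisquare(df=1, size=100_000)
lam = genomic_inflation(null_stats)  # close to 1 under the null
```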
Ukai, Hirohiko; Ohashi, Fumiko; Samoto, Hajime; Fukui, Yoshinari; Okamoto, Satoru; Moriguchi, Jiro; Ezaki, Takafumi; Takada, Shiro; Ikeda, Masayuki
2006-04-01
The present study was initiated to examine the relationship between workplace concentrations and the estimated highest concentrations in solvent workplaces (SWPs), with special reference to enterprise size and type of solvent work. Results of a survey conducted in 1010 SWPs in 156 enterprises were taken as a database. Workplace air was sampled at ≥5 cross points in each SWP following a grid sampling strategy. An additional air sample was grab-sampled at the site where the worker's exposure was estimated to be highest (the estimated highest concentration, or EHC). The samples were analyzed for 47 solvents designated by regulation, and solvent concentrations in each sample were summed by use of an additivity formula. From the workplace concentrations at the ≥5 points, the geometric mean and geometric standard deviation were calculated as the representative workplace concentration (RWC) and the indicator of variation in workplace concentration (VWC). Comparison between RWC and EHC in the total of 1010 SWPs showed that EHC was 1.2 (in large enterprises with >300 employees) to 1.7 times [in small to medium (SM) enterprises with ≤300 employees] greater than RWC. When SWPs were classified into SM enterprises and large enterprises, both RWC and EHC were significantly higher in SM enterprises than in large enterprises. Further comparison by type of solvent work showed that the difference was more marked in printing, surface coating and degreasing/cleaning/wiping SWPs, whereas it was less remarkable in painting SWPs and essentially nil in testing/research laboratories. In conclusion, the present observations, discussed in reference to previous publications, suggest that RWC, EHC and the ratio EHC/RWC vary substantially among types of solvent work as well as enterprise size, and are typically highest in printing SWPs in SM enterprises.
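The representative workplace concentration and its variation indicator are the geometric mean and geometric standard deviation of the grid samples, which can be computed as follows (a sketch; the concentrations are hypothetical):

```python
import math

def gm_gsd(concentrations):
    """Geometric mean (the RWC) and geometric standard deviation (the
    VWC) of the >=5 grid-sampled concentrations from one SWP."""
    logs = [math.log(c) for c in concentrations]
    n = len(logs)
    mean_log = sum(logs) / n
    sd_log = math.sqrt(sum((x - mean_log) ** 2 for x in logs) / (n - 1))
    return math.exp(mean_log), math.exp(sd_log)

# Hypothetical summed-solvent readings (ppm) from five grid points:
rwc, vwc = gm_gsd([12.0, 8.5, 20.0, 15.0, 9.0])
```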
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ackermann, M.; Ajello, M.; Allafort, A.
We present a detailed statistical analysis of the correlation between radio and gamma-ray emission of the active galactic nuclei (AGNs) detected by Fermi during its first year of operation, with the largest data sets ever used for this purpose. We use both archival interferometric 8.4 GHz data (from the Very Large Array and ATCA, for the full sample of 599 sources) and concurrent single-dish 15 GHz measurements from the Owens Valley Radio Observatory (OVRO, for a subsample of 199 objects). Our unprecedentedly large sample permits us to assess with high accuracy the statistical significance of the correlation, using a surrogate data method designed to simultaneously account for common-distance bias and the effect of a limited dynamical range in the observed quantities. We find that the statistical significance of a positive correlation between the centimeter radio and the broadband (E > 100 MeV) gamma-ray energy flux is very high for the whole AGN sample, with a probability of <10⁻⁷ for the correlation appearing by chance. Using the OVRO data, we find that concurrent data improve the significance of the correlation from 1.6 × 10⁻⁶ to 9.0 × 10⁻⁸. Our large sample size allows us to study the dependence of correlation strength and significance on specific source types and gamma-ray energy band. We find that the correlation is very significant (chance probability < 10⁻⁷) for both flat spectrum radio quasars and BL Lac objects separately; a dependence of the correlation strength on the considered gamma-ray energy band is also present, but additional data will be necessary to constrain its significance.
Ackermann, M.; Ajello, M.; Allafort, A.; ...
2011-10-12
NASA Technical Reports Server (NTRS)
Ackermann, M.; Ajello, M.; Allafort, A.; Angelakis, E.; Axelsson, M.; Baldini, L.; Ballet, J.; Barbiellini, G.; Bastieri, D.; Bellazzini, R.;
2011-01-01
Sub-sampling genetic data to estimate black bear population size: A case study
Tredick, C.A.; Vaughan, M.R.; Stauffer, D.F.; Simek, S.L.; Eason, T.
2007-01-01
Costs for genetic analysis of hair samples collected for individual identification of bears average approximately US$50 [2004] per sample. This can easily exceed budgetary allowances for large-scale studies or studies of high-density bear populations. We used 2 genetic datasets from 2 areas in the southeastern United States to explore how reducing costs of analysis by sub-sampling affected precision and accuracy of resulting population estimates. We used several sub-sampling scenarios to create subsets of the full datasets and compared summary statistics, population estimates, and precision of estimates generated from these subsets to estimates generated from the complete datasets. Our results suggested that bias and precision of estimates improved as the proportion of total samples used increased, and heterogeneity models (e.g., Mh[CHAO]) were more robust to reduced sample sizes than other models (e.g., behavior models). We recommend that only high-quality samples (>5 hair follicles) be used when budgets are constrained, and efforts should be made to maximize capture and recapture rates in the field.
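The heterogeneity model Mh[CHAO] mentioned above is, in its simplest form, Chao's lower-bound estimator built from the numbers of animals captured exactly once and exactly twice. A sketch under that assumption (the capture histories are hypothetical):

```python
def chao_mh_estimate(capture_counts):
    """Chao's lower-bound population estimate for capture-recapture
    with individual heterogeneity (Mh): N = S + f1^2 / (2 * f2), where
    S is the number of distinct genotypes detected and f1, f2 are the
    counts of animals captured exactly once and exactly twice."""
    s_obs = len(capture_counts)
    f1 = sum(1 for c in capture_counts if c == 1)
    f2 = sum(1 for c in capture_counts if c == 2)
    if f2 == 0:  # bias-corrected variant avoids division by zero
        return s_obs + f1 * (f1 - 1) / 2
    return s_obs + f1 ** 2 / (2 * f2)

# Five bears detected: two once, two twice, one three times.
estimate = chao_mh_estimate([1, 1, 2, 2, 3])
```

Sub-sampling reduces f1 and f2, which is why estimator bias and precision degrade as the proportion of samples analyzed shrinks.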
Niama, Fabien Roch; Vidal, Nicole; Diop-Ndiaye, Halimatou; Nguimbi, Etienne; Ahombo, Gabriel; Diakabana, Philippe; Bayonne Kombo, Édith Sophie; Mayengue, Pembe Issamou; Kobawila, Simon-Charles; Parra, Henri Joseph; Toure-Kane, Coumba
2017-07-05
In this work, we investigated the genetic diversity of HIV-1 and the presence of mutations conferring antiretroviral drug resistance in 50 drug-naïve infected persons in the Republic of Congo (RoC). Samples were obtained before large-scale access to HAART, in 2002 and 2004. To assess HIV-1 genetic recombination, sequencing of the pol gene encoding the protease and partial reverse transcriptase was performed and analyzed with updated references, including newly characterized CRFs. The assessment of drug resistance was conducted according to the WHO protocol. Among the 50 samples analyzed for the pol gene, 50% were classified as intersubtype recombinants carrying complex structures within the pol fragment. Five samples could not be classified (noted U). The most prevalent subtypes were G with 10 isolates and D with 11 isolates. One isolate each of subtypes A, J and H and of CRF05, CRF18 and CRF37 was also found. Two samples (4%) harboring the mutations M230L and Y181C, associated with the TAMs M41L and T215Y, respectively, were found. This first study in the RoC, based on the WHO classification, shows that the threshold of transmitted drug resistance before large-scale access to antiretroviral therapy is 4%.
NASA Astrophysics Data System (ADS)
Wentworth, Gregory R.; Aklilu, Yayne-abeba; Landis, Matthew S.; Hsu, Yu-Mei
2018-04-01
During May 2016 a very large boreal wildfire burned throughout the Athabasca Oil Sands Region (AOSR) in central Canada, and in close proximity to an extensive air quality monitoring network. This study examines speciated 24-h integrated polycyclic aromatic hydrocarbon (PAH) and volatile organic compound (VOC) measurements collected every sixth day at four and seven sites, respectively, from May to August 2016. The sum of PAHs (ΣPAH) was on average 17 times higher in fire-influenced samples (852 ng m-3, n = 8), relative to non-fire influenced samples (50 ng m-3, n = 64). Diagnostic PAH ratios in fire-influenced samples were indicative of a biomass burning source, whereas ratios in June to August samples showed additional influence from petrogenic and fossil fuel combustion. The average increase in the sum of VOCs (ΣVOC) was minor by comparison: 63 ppbv for fire-influenced samples (n = 16) versus 46 ppbv for non-fire samples (n = 90). The samples collected on August 16th and 22nd had large ΣVOC concentrations at all sites (average of 123 ppbv) that were unrelated to wildfire emissions, and composed primarily of acetaldehyde and methanol suggesting a photochemically aged air mass. Normalized excess enhancement ratios (ERs) were calculated for 20 VOCs and 23 PAHs for three fire influenced samples, and the former were generally consistent with previous observations. To our knowledge, this is the first study to report ER measurements for a number of VOCs and PAHs in fresh North American boreal wildfire plumes. During May the aged wildfire plume intercepted the cities of Edmonton (∼380 km south) or Lethbridge (∼790 km south) on four separate occasions. No enhancement in ground-level ozone (O3) was observed in these aged plumes despite an assumed increase in O3 precursors. 
In the AOSR, the only daily-averaged VOCs which approached or exceeded the hourly Alberta Ambient Air Quality Objectives (AAAQOs) were benzene (during the fire) and acetaldehyde (on August 16th and 22nd). Implications for local and regional air quality as well as suggestions for supplemental air monitoring during future boreal fires, are also discussed.
Smith, D.R.; Rogala, J.T.; Gray, B.R.; Zigler, S.J.; Newton, T.J.
2011-01-01
Reliable estimates of abundance are needed to assess consequences of proposed habitat restoration and enhancement projects on freshwater mussels in the Upper Mississippi River (UMR). Although there is general guidance on sampling techniques for population assessment of freshwater mussels, the actual performance of sampling designs can depend critically on the population density and spatial distribution at the project site. To evaluate various sampling designs, we simulated sampling of populations, which varied in density and degree of spatial clustering. Because of logistics and costs of large river sampling and spatial clustering of freshwater mussels, we focused on adaptive and non-adaptive versions of single and two-stage sampling. The candidate designs performed similarly in terms of precision (CV) and probability of species detection for fixed sample size. Both CV and species detection were determined largely by density, spatial distribution and sample size. However, designs did differ in the rate that occupied quadrats were encountered. Occupied units had a higher probability of selection using adaptive designs than conventional designs. We used two measures of cost: sample size (i.e. number of quadrats) and distance travelled between the quadrats. Adaptive and two-stage designs tended to reduce distance between sampling units, and thus performed better when distance travelled was considered. Based on the comparisons, we provide general recommendations on the sampling designs for the freshwater mussels in the UMR, and presumably other large rivers.
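The design-comparison logic can be shown in miniature with a Monte Carlo sketch: hold total density fixed, vary spatial clustering, and compare the CV of the estimated mean across repeated surveys. This is a toy version of the simulations described, not the authors' candidate designs, and all densities are invented:

```python
import random
import statistics

def simulate_cv(population, quadrats, draws=500, seed=1):
    """CV of the estimated mean density across repeated surveys that
    each select `quadrats` units by simple random sampling."""
    rng = random.Random(seed)
    means = [statistics.mean(rng.sample(population, quadrats))
             for _ in range(draws)]
    return statistics.stdev(means) / statistics.mean(means)

# Hypothetical river bed of 1,000 quadrats holding the same expected
# number of mussels: clustered (a few dense patches) vs. spread evenly.
rng = random.Random(7)
clustered = [rng.choice([0, 0, 0, 0, 20]) for _ in range(1000)]
uniform = [4] * 1000

# With density fixed, spatial clustering alone inflates the survey CV,
# which is why design performance hinged on density and clustering.
assert simulate_cv(clustered, 50) > simulate_cv(uniform, 50)
```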
2010-01-01
Candidate gene association studies, linkage studies and genome-wide association studies have highlighted the role of genetic factors in the development of ischemic stroke. This research started over a decade ago, and can be separated into three major periods of research. In the first wave, classic susceptibility markers associated with other diseases (such as the Leiden mutation in Factor V and mutations in the prothrombin and 5,10-methylenetetrahydrofolate reductase (MTHFR) genes) were tested for their role in stroke. These first studies used just a couple of hundred samples or even fewer. The second and still ongoing period bridges the two other periods of research and has led to a rapid increase in the spectrum of functional variants of genes or genomic regions, discovered primarily in relation to other diseases, tested on larger stroke samples of clinically better stratified patients. Large numbers of these alleles were originally discovered by array-based genome-wide association studies. The third period of research involves the direct array screening of large samples; this approach represents significant progress for research in the field. Research into susceptibility genes for stroke has taught us that careful stratification of patients is critical, that susceptibility alleles are often shared between diseases, and that not all susceptibility factors that associate with clinical traits that are themselves risk factors for stroke (such as increase of triglycerides) necessarily represent susceptibility for stroke. Research so far has been mainly focused on large- and small-vessel associated stroke, and knowledge on other types of stroke, which represent much smaller population samples, is still very scarce. Although some susceptibility allele tests are on the palette of some direct-to-consumer companies, the clinical utility and clinical validity of these test results still do not support their use in clinical practice. PMID:20831840
Characteristics of the Healthy Brain Project Sample: Representing Diversity among Study Participants
ERIC Educational Resources Information Center
Bryant, Lucinda L.; Laditka, James N.; Laditka, Sarah B.; Mathews, Anna E.
2009-01-01
Purpose: Description of study participants and documentation of the desired diversity in the Prevention Research Centers Healthy Aging Research Network's Workgroup on Promoting Cognitive Health large multisite study designed to examine attitudes about brain health, behaviors associated with its maintenance, and information-receiving preferences…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-11
[Recoverable fragment: a notice concerning measures (emotional, motor and sensory) for use in large longitudinal or epidemiological studies of functioning and for establishing comparative norms; existing recruitment databases will be randomly sampled, with adult study participants completing one or two assessments.]
Free-decay time-domain modal identification for large space structures
NASA Technical Reports Server (NTRS)
Kim, Hyoung M.; Vanhorn, David A.; Doiron, Harold H.
1992-01-01
Concept definition studies for the Modal Identification Experiment (MIE), a proposed space flight experiment for the Space Station Freedom (SSF), have demonstrated advantages and compatibility of free-decay time-domain modal identification techniques with the on-orbit operational constraints of large space structures. Since practical experience with modal identification using actual free-decay responses of large space structures is very limited, several numerical and test data reduction studies were conducted. Major issues and solutions were addressed, including closely-spaced modes, wide frequency range of interest, data acquisition errors, sampling delay, excitation limitations, nonlinearities, and unknown disturbances during free-decay data acquisition. The data processing strategies developed in these studies were applied to numerical simulations of the MIE, test data from a deployable truss, and launch vehicle flight data. Results of these studies indicate free-decay time-domain modal identification methods can provide accurate modal parameters necessary to characterize the structural dynamics of large space structures.
Strategies for Improving Power in School-Randomized Studies of Professional Development.
Kelcey, Ben; Phelps, Geoffrey
2013-12-01
Group-randomized designs are well suited for studies of professional development because they can accommodate programs that are delivered to intact groups (e.g., schools), the collaborative nature of professional development, and extant teacher/school assignments. Though group designs may be theoretically favorable, prior evidence has suggested that they may be challenging to conduct in professional development studies because well-powered designs will typically require large sample sizes or expect large effect sizes. Using teacher knowledge outcomes in mathematics, we investigated when and the extent to which there is evidence that covariance adjustment on a pretest, teacher certification, or demographic covariates can reduce the sample size necessary to achieve reasonable power. Our analyses drew on multilevel models and outcomes in five different content areas for over 4,000 teachers and 2,000 schools. Using these estimates, we assessed the minimum detectable effect sizes for several school-randomized designs with and without covariance adjustment. The analyses suggested that teachers' knowledge is substantially clustered within schools in each of the five content areas and that covariance adjustment for a pretest or, to a lesser extent, teacher certification, has the potential to transform designs that are unreasonably large for professional development studies into viable studies. © The Author(s) 2014.
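The effect of covariance adjustment on a school-randomized design can be illustrated with the standard two-level minimum-detectable-effect-size formula; this is a sketch using conventional placeholder values, not the paper's estimates:

```python
import math

def mdes(J, n, icc, r2_school=0.0, r2_teacher=0.0, p=0.5, M=2.8):
    """Minimum detectable effect size for a two-level school-randomized
    design (Bloom-style formula): J schools with n teachers each, ICC
    `icc`, and R^2 explained by school- and teacher-level covariates
    such as a pretest. M ~= 2.8 approximates the multiplier for
    alpha = .05 (two-tailed) and power = .80."""
    var = (icc * (1 - r2_school)) / (p * (1 - p) * J) \
        + ((1 - icc) * (1 - r2_teacher)) / (p * (1 - p) * J * n)
    return M * math.sqrt(var)

# Adjusting for a strong pretest (high school-level R^2) shrinks the
# MDES markedly at the same sample size, the paper's central point.
unadjusted = mdes(J=60, n=5, icc=0.20)
adjusted = mdes(J=60, n=5, icc=0.20, r2_school=0.7, r2_teacher=0.5)
```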
Analysis of ²³⁹Pu and ²⁴¹Am in NAEG large-sized bovine samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Major, W.J.; Lee, K.D.; Wessman, R.A.
Methods are described for the analysis of environmental levels of ²³⁹Pu and ²⁴¹Am in large-sized bovine samples. Procedure modifications to overcome the complexities of sample preparation and analysis, and the special techniques employed to prepare and analyze different types of bovine samples, such as muscle, blood, liver, and bone, are discussed.
The prevalence of compulsive buying: a meta-analysis.
Maraz, Aniko; Griffiths, Mark D; Demetrovics, Zsolt
2016-03-01
To estimate the pooled prevalence of compulsive buying behaviour (CBB) in different populations, to determine the effect of age, gender, location and screening instrument on the reported heterogeneity in estimates of CBB, and to assess whether publication bias could be identified. Three databases were searched (Medline, PsycINFO, Web of Science) using the terms 'compulsive buying', 'pathological buying' and 'compulsive shopping' to estimate the pooled prevalence of CBB in different populations. Forty studies reporting 49 prevalence estimates from 16 countries were located (n = 32,000). To conduct the meta-analysis, data from non-clinical studies regarding mean age and gender proportion, geographical study location and screening instrument used to assess CBB were extracted by multiple independent observers and evaluated using a random-effects model. Four a priori subgroups were analysed using pooled estimation (Cohen's Q) and covariate testing (moderator and meta-regression analysis). The CBB pooled prevalence in adult representative studies was 4.9% (3.4-6.9%, eight estimates, 10,102 participants); estimates were higher among university students: 8.3% (5.9-11.5%, 19 estimates, 14,947 participants), in adult non-representative samples: 12.3% (7.6-19.1%, 11 estimates, 3929 participants) and in shopping-specific samples: 16.2% (8.8-27.8%, 11 estimates, 4686 participants). Being young and female were associated with an increased tendency, but location (United States versus non-United States) was not. Meta-regression revealed large heterogeneity within subgroups, due mainly to the diverse measures and time-frames (current versus life-time) used to assess CBB. A pooled estimate of compulsive buying behaviour in the populations studied is approximately 5%, but there is large variation between samples, accounted for largely by the use of different time-frames and measures. © 2016 Society for the Study of Addiction.
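Random-effects pooling of prevalence estimates of the sort described can be sketched with the DerSimonian-Laird method on the logit scale. The exact model the authors used may differ, and the study counts below are hypothetical:

```python
import math

def pooled_prevalence(events_totals):
    """DerSimonian-Laird random-effects pooling of prevalences on the
    logit scale: fixed-effect weights give Q, Q gives tau^2, and
    tau^2-adjusted weights give the pooled logit."""
    y, v = [], []
    for events, total in events_totals:
        p = events / total
        y.append(math.log(p / (1 - p)))          # logit of prevalence
        v.append(1 / (total * p * (1 - p)))      # variance of the logit
    w = [1 / vi for vi in v]
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)      # between-study variance
    w_star = [1 / (vi + tau2) for vi in v]
    pooled = sum(wi * yi for wi, yi in zip(w_star, y)) / sum(w_star)
    return 1 / (1 + math.exp(-pooled))           # back to a proportion

# Hypothetical study counts (compulsive buyers, sample size):
estimate = pooled_prevalence([(45, 900), (130, 1500), (80, 1200)])
```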
Cosmogenic nuclides in football-sized rocks.
NASA Technical Reports Server (NTRS)
Wahlen, M.; Honda, M.; Imamura, M.; Fruchter, J. S.; Finkel, R. C.; Kohl, C. P.; Arnold, J. R.; Reedy, R. C.
1972-01-01
The activity of long- and short-lived isotopes in a series of samples from a vertical column through the center of rock 14321 was measured. Rock 14321 is a 9 kg fragmental rock whose orientation was photographically documented on the lunar surface. Also investigated was a sample from the lower portion of rock 14310, where, in order to study target effects, two different density fractions (mineral separates) were analyzed. A few nuclides in a sample from the comprehensive fines 14259 were measured. This material has been collected largely from the top centimeter of the lunar soil. The study of the deep samples of 14321 and 14310 provided values for the activity of isotopes at points where only effects produced by galactic cosmic rays are significant.
Neumann, Craig S.; Malterer, Melanie B.; Newman, Joseph P.
2010-01-01
Recent exploratory factor analysis (EFA) of the Psychopathic Personality Inventory (PPI; Lilienfeld, 1990) with a community sample suggested that the PPI subscales may comprise two higher-order factors (Benning et al., 2003). However, little research has examined the PPI structure in offenders. The current study attempted to replicate the Benning et al. two-factor solution using a large (N = 1224) incarcerated male sample. Confirmatory factor analysis (CFA) of this model with the full sample resulted in poor model fit. Next, to identify a factor solution that would summarize the offender data, EFA was conducted using a split-half of the total sample, followed by an attempt to replicate the EFA solution via CFA with the other split-half sample. Using the recommendations of Prooijen and van der Kloot (2001) for recovering EFA solutions, model fit results provided some evidence that the EFA solution could be recovered via CFA. However, this model involved extensive cross-loadings of the subscales across three factors, suggesting item overlap across PPI subscales. In sum, the two-factor solution reported by Benning et al. (2003) was not a viable model for the current sample of offenders, and additional research is needed to elucidate the latent structure of the PPI. PMID:18557694
Clausen, J L; Georgian, T; Gardner, K H; Douglas, T A
2018-01-01
This study compares conventional grab sampling to incremental sampling methodology (ISM) to characterize metal contamination at a military small-arms range. Grab sample results had large variances, positively skewed non-normal distributions, extreme outliers, and poor agreement between duplicate samples even when samples were co-located within tens of centimeters of each other. The extreme outliers strongly influenced the grab sample means for the primary contaminants lead (Pb) and antimony (Sb). In contrast, median and mean metal concentrations were similar for the ISM samples. ISM significantly reduced the measurement uncertainty of estimates of the mean, increasing data quality (e.g., for environmental risk assessments) with fewer samples (e.g., decreasing total project costs). Based on Monte Carlo resampling simulations, grab sampling resulted in highly variable means and upper confidence limits of the mean relative to ISM.
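The variance-reduction logic of ISM versus grab sampling can be reproduced with a toy Monte Carlo on a skewed concentration field. This is illustrative only; the distribution parameters are invented:

```python
import random
import statistics

rng = random.Random(3)
# Positively skewed, outlier-prone Pb concentrations (mg/kg), mimicking
# the grab-sample behavior described above (values are made up).
site = [rng.lognormvariate(5, 1.5) for _ in range(5000)]

def grab_mean(n_grabs):
    """One survey: analyze n discrete grab samples and average them."""
    return statistics.mean(rng.sample(site, n_grabs))

def ism_mean(increments):
    """One ISM survey: a single analyzed sample that physically
    composites many field increments, averaging over locations."""
    return statistics.mean(rng.sample(site, increments))

grab = [grab_mean(5) for _ in range(500)]
ism = [ism_mean(50) for _ in range(500)]
# Compositing many increments sharply reduces the spread of the
# estimated site mean: the measurement-uncertainty gain reported above.
assert statistics.stdev(ism) < statistics.stdev(grab)
```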
Variance Estimation, Design Effects, and Sample Size Calculations for Respondent-Driven Sampling
2006-01-01
Hidden populations, such as injection drug users and sex workers, are central to a number of public health problems. However, because of the nature of these groups, it is difficult to collect accurate information about them, and this difficulty complicates disease prevention efforts. A recently developed statistical approach called respondent-driven sampling improves our ability to study hidden populations by allowing researchers to make unbiased estimates of the prevalence of certain traits in these populations. Yet, not enough is known about the sample-to-sample variability of these prevalence estimates. In this paper, we present a bootstrap method for constructing confidence intervals around respondent-driven sampling estimates and demonstrate in simulations that it outperforms the naive method currently in use. We also use simulations and real data to estimate the design effects for respondent-driven sampling in a number of situations. We conclude with practical advice about the power calculations that are needed to determine the appropriate sample size for a study using respondent-driven sampling. In general, we recommend a sample size twice as large as would be needed under simple random sampling. PMID:16937083
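A percentile bootstrap confidence interval of the general kind described can be sketched as follows. Note that the paper's RDS bootstrap resamples along recruitment chains; this plain i.i.d. version only illustrates the basic idea:

```python
import random
import statistics

def bootstrap_ci(sample, stat=statistics.mean, reps=2000, alpha=0.05,
                 seed=0):
    """Percentile bootstrap CI: resample with replacement, compute the
    statistic each time, and take the alpha/2 and 1-alpha/2 quantiles."""
    rng = random.Random(seed)
    boots = sorted(stat([rng.choice(sample) for _ in sample])
                   for _ in range(reps))
    lo = boots[int(reps * alpha / 2)]
    hi = boots[int(reps * (1 - alpha / 2)) - 1]
    return lo, hi

# 1 = has the trait of interest, 0 = does not (hypothetical respondents):
data = [1] * 30 + [0] * 70
low, high = bootstrap_ci(data)  # brackets the point estimate of 0.30
```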
Cultural influences on personality.
Triandis, Harry C; Suh, Eunkook M
2002-01-01
Ecologies shape cultures; cultures influence the development of personalities. There are both universal and culture-specific aspects of variation in personality. Some culture-specific aspects correspond to cultural syndromes such as complexity, tightness, individualism, and collectivism. A large body of literature suggests that the Big Five personality factors emerge in various cultures. However, caution is required in arguing for such universality, because most studies have not included emic (culture-specific) traits and have not studied samples that are extremely different in culture from Western samples.
A measure of the signal-to-noise ratio of microarray samples and studies using gene correlations.
Venet, David; Detours, Vincent; Bersini, Hugues
2012-01-01
The quality of gene expression data can vary dramatically from platform to platform, study to study, and sample to sample. As reliable statistical analysis rests on reliable data, determining such quality is of the utmost importance. Quality measures to spot problematic samples exist, but they are platform-specific, and cannot be used to compare studies. As a proxy for quality, we propose a signal-to-noise ratio for microarray data, the "Signal-to-Noise Applied to Gene Expression Experiments", or SNAGEE. SNAGEE is based on the consistency of gene-gene correlations. We applied SNAGEE to a compendium of 80 large datasets on 37 platforms, for a total of 24,380 samples, and assessed the signal-to-noise ratio of studies and samples. This allowed us to discover serious issues with three studies. We show that signal-to-noise ratios of both studies and samples are linked to the statistical significance of the biological results. We showed that SNAGEE is an effective way to measure data quality for most types of gene expression studies, and that it often outperforms existing techniques. Furthermore, SNAGEE is platform-independent and does not require raw data files. The SNAGEE R package is available in BioConductor.
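A signal-to-noise proxy based on the consistency of gene-gene correlations, in the spirit of SNAGEE, can be sketched as below. The published measure differs in detail, and the data here are simulated:

```python
import numpy as np

def correlation_consistency(expr, reference_corr):
    """Agreement between a study's gene-gene correlation matrix and a
    reference correlation matrix, summarized as the correlation of
    their off-diagonal entries (a rough SNAGEE-style proxy)."""
    study_corr = np.corrcoef(expr)              # expr is genes x samples
    iu = np.triu_indices_from(study_corr, k=1)  # off-diagonal pairs only
    return float(np.corrcoef(study_corr[iu], reference_corr[iu])[0, 1])

rng = np.random.default_rng(1)
latent = rng.normal(size=(20, 50))              # 20 genes, 50 samples
reference = np.corrcoef(latent)
clean = correlation_consistency(
    latent + 0.1 * rng.normal(size=latent.shape), reference)
noisy = correlation_consistency(
    latent + 2.0 * rng.normal(size=latent.shape), reference)
# Noisier measurements yield a lower consistency score.
assert clean > noisy
```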
Anomalies in Trace Metal and Rare-Earth Loads below a Waste-Water Treatment Plant
NASA Astrophysics Data System (ADS)
Antweiler, R.; Writer, J. H.; Murphy, S.
2013-12-01
The changes in chemical loads were examined for 54 inorganic elements and compounds in a 5.4-km reach of Boulder Creek, Colorado downstream of a waste water treatment plant (WWTP) outfall. Elements were partitioned into three categories: those showing a decrease in loading downstream, those showing an increase, and those which were conservative, at least over the length of the study reach. Dissolved loads which declined - generally indicative of in-stream loss via precipitation or sorption - were typically rapid (occurring largely before the first sampling site, 2.3 km downstream); elements showing this behavior were Bi, Cr, Cs, Ga, Ge, Hg, Se and Sn. These results were as expected before the experiment was performed. However, a large group (28 elements, including all the rare-earth elements, REE, except Gd) exhibited dissolved load increases indicating in-stream gains. These gains may be due to particulate matter dissolving or disaggregating, or that desorption is occurring below the WWTP. As with the in-stream loss group, the processes tended to be rapid, typically occurring before the first sampling site. Whole-water samples collected concurrently also had a large group of elements which showed an increase in load downstream of the WWTP. Among these were most of the group which had increases in the dissolved load, including all the REE (except Gd). Because whole-water samples include both dissolved and suspended particulates within them, increases in loads cannot be accounted for by invoking desorption or disaggregation mechanisms; thus, the only source for these increases is from the bed load of the stream. Further, the difference between the whole-water and dissolved loads is a measure of the particulate load, and calculations show that not only did the dissolved and whole-water loads increase, but so did the particulate loads. This implies that at the time of sampling the bed sediment was supplying a significant contribution to the suspended load. 
It seems untenable to suppose that the stream-bed material can permanently supply the in-stream load increases of such a large group of inorganic elements. We propose that the anomalous increase in loads was instead a function of the time of sampling (both diurnally and seasonally) and that sampling at different times of day or in different seasons would give results contradictory to those seen here. If so, inorganic loading studies must include multiple sampling both over the course of a day and across different seasons and flow regimes.
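The load comparisons above reduce to a simple calculation, load = concentration × discharge; a minimal sketch with hypothetical values (the units and numbers are illustrative, not from the study):

```python
def load_kg_per_day(conc_ug_per_l, discharge_l_per_s):
    """Instantaneous constituent load from concentration (ug/L) and
    discharge (L/s): ug/s * 86400 s/day / 1e9 ug/kg = kg/day."""
    return conc_ug_per_l * discharge_l_per_s * 86400 / 1e9

# Hypothetical in-stream gain between two sites at equal discharge:
upstream = load_kg_per_day(2.0, 1500)
downstream = load_kg_per_day(2.6, 1500)
gain = downstream - upstream   # positive => in-stream gain
```

Comparing dissolved, whole-water, and (by difference) particulate loads site by site in this way is what supports the bed-load attribution in the abstract.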
Marie Yee; Simon J. Grove; Alastair M.M. Richardson; Caroline L. Mohammed
2006-01-01
It is not clear why large-diameter logs generally host saproxylic beetle assemblages that are different from those of small-diameter logs. In a study in Tasmanian wet eucalypt forest, two size classes of Eucalyptus obliqua logs (>100 cm and 30-60 cm diameter) were destructively sampled to assess their beetle fauna and the associations of this fauna...
A metagenomic framework for the study of airborne microbial communities.
Yooseph, Shibu; Andrews-Pfannkoch, Cynthia; Tenney, Aaron; McQuaid, Jeff; Williamson, Shannon; Thiagarajan, Mathangi; Brami, Daniel; Zeigler-Allen, Lisa; Hoffman, Jeff; Goll, Johannes B; Fadrosh, Douglas; Glass, John; Adams, Mark D; Friedman, Robert; Venter, J Craig
2013-01-01
Understanding the microbial content of the air has important scientific, health, and economic implications. While studies have primarily characterized the taxonomic content of air samples by sequencing the 16S or 18S ribosomal RNA gene, direct analysis of the genomic content of airborne microorganisms has not been possible due to the extremely low density of biological material in airborne environments. We developed sampling and amplification methods to enable adequate DNA recovery for metagenomic profiling of air samples collected from indoor and outdoor environments. Air samples were collected from a large urban building, a medical center, a house, and a pier. Analyses of metagenomic data generated from these samples reveal airborne communities with a high degree of diversity and different genera abundance profiles. The identities of many of the taxonomic groups and protein families also allow identification of the likely sources of the sampled airborne bacteria.
CIHR Candrive Cohort Comparison with Canadian Household Population Holding Valid Driver's Licenses.
Gagnon, Sylvain; Marshall, Shawn; Kadulina, Yara; Stinchcombe, Arne; Bédard, Michel; Gélinas, Isabelle; Man-Son-Hing, Malcolm; Mazer, Barbara; Naglie, Gary; Porter, Michelle M; Rapoport, Mark; Tuokko, Holly; Vrkljan, Brenda
2016-06-01
We investigated whether convenience sampling is a suitable method to generate a sample of older drivers representative of the older Canadian driver population. Using equivalence testing, we compared a large convenience sample of older drivers (the Candrive II prospective cohort study) to a similarly aged population of older Canadian drivers. The Candrive sample consists of 928 community-dwelling older drivers from seven metropolitan areas of Canada. The population data were obtained from the Canadian Community Health Survey - Healthy Aging (CCHS-HA), a representative sample of older Canadians. Data for drivers aged 70 and older were extracted from the CCHS-HA database, for a total of 3,899 older Canadian drivers. The two samples were equivalent on the socio-demographic, health, and driving variables we compared, but not on driving frequency. We conclude that the convenience sampling used in the Candrive study produced a fairly representative sample of older Canadian drivers, with a few exceptions.
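The equivalence testing described can be sketched as a two one-sided tests (TOST) procedure under a normal approximation; the summary statistics and equivalence margin below are hypothetical illustrations, not values from the Candrive or CCHS-HA data:

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def tost_equivalence(mean1, sd1, n1, mean2, sd2, n2, margin):
    """Two one-sided tests (TOST) for equivalence of two means.

    Equivalence is concluded when both one-sided nulls (difference
    <= -margin, difference >= +margin) are rejected; the returned
    value is the larger of the two one-sided p-values."""
    se = math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    diff = mean1 - mean2
    p_lower = 1.0 - norm_cdf((diff + margin) / se)  # H0: diff <= -margin
    p_upper = norm_cdf((diff - margin) / se)        # H0: diff >= +margin
    return max(p_lower, p_upper)

# Hypothetical summary statistics (e.g. mean driving days per week):
p = tost_equivalence(mean1=5.1, sd1=1.8, n1=928,
                     mean2=5.0, sd2=1.9, n2=3899, margin=0.3)
# equivalence is declared at alpha = 0.05 when p < 0.05
```

Unlike an ordinary significance test, a non-significant TOST result does not establish equivalence; the test actively demonstrates that the difference lies inside the margin.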
Characterization of Large Structural Genetic Mosaicism in Human Autosomes
Machiela, Mitchell J.; Zhou, Weiyin; Sampson, Joshua N.; Dean, Michael C.; Jacobs, Kevin B.; Black, Amanda; Brinton, Louise A.; Chang, I-Shou; Chen, Chu; Chen, Constance; Chen, Kexin; Cook, Linda S.; Crous Bou, Marta; De Vivo, Immaculata; Doherty, Jennifer; Friedenreich, Christine M.; Gaudet, Mia M.; Haiman, Christopher A.; Hankinson, Susan E.; Hartge, Patricia; Henderson, Brian E.; Hong, Yun-Chul; Hosgood, H. Dean; Hsiung, Chao A.; Hu, Wei; Hunter, David J.; Jessop, Lea; Kim, Hee Nam; Kim, Yeul Hong; Kim, Young Tae; Klein, Robert; Kraft, Peter; Lan, Qing; Lin, Dongxin; Liu, Jianjun; Le Marchand, Loic; Liang, Xiaolin; Lissowska, Jolanta; Lu, Lingeng; Magliocco, Anthony M.; Matsuo, Keitaro; Olson, Sara H.; Orlow, Irene; Park, Jae Yong; Pooler, Loreall; Prescott, Jennifer; Rastogi, Radhai; Risch, Harvey A.; Schumacher, Fredrick; Seow, Adeline; Setiawan, Veronica Wendy; Shen, Hongbing; Sheng, Xin; Shin, Min-Ho; Shu, Xiao-Ou; VanDen Berg, David; Wang, Jiu-Cun; Wentzensen, Nicolas; Wong, Maria Pik; Wu, Chen; Wu, Tangchun; Wu, Yi-Long; Xia, Lucy; Yang, Hannah P.; Yang, Pan-Chyr; Zheng, Wei; Zhou, Baosen; Abnet, Christian C.; Albanes, Demetrius; Aldrich, Melinda C.; Amos, Christopher; Amundadottir, Laufey T.; Berndt, Sonja I.; Blot, William J.; Bock, Cathryn H.; Bracci, Paige M.; Burdett, Laurie; Buring, Julie E.; Butler, Mary A.; Carreón, Tania; Chatterjee, Nilanjan; Chung, Charles C.; Cook, Michael B.; Cullen, Michael; Davis, Faith G.; Ding, Ti; Duell, Eric J.; Epstein, Caroline G.; Fan, Jin-Hu; Figueroa, Jonine D.; Fraumeni, Joseph F.; Freedman, Neal D.; Fuchs, Charles S.; Gao, Yu-Tang; Gapstur, Susan M.; Patiño-Garcia, Ana; Garcia-Closas, Montserrat; Gaziano, J. 
Michael; Giles, Graham G.; Gillanders, Elizabeth M.; Giovannucci, Edward L.; Goldin, Lynn; Goldstein, Alisa M.; Greene, Mark H.; Hallmans, Goran; Harris, Curtis C.; Henriksson, Roger; Holly, Elizabeth A.; Hoover, Robert N.; Hu, Nan; Hutchinson, Amy; Jenab, Mazda; Johansen, Christoffer; Khaw, Kay-Tee; Koh, Woon-Puay; Kolonel, Laurence N.; Kooperberg, Charles; Krogh, Vittorio; Kurtz, Robert C.; LaCroix, Andrea; Landgren, Annelie; Landi, Maria Teresa; Li, Donghui; Liao, Linda M.; Malats, Nuria; McGlynn, Katherine A.; McNeill, Lorna H.; McWilliams, Robert R.; Melin, Beatrice S.; Mirabello, Lisa; Peplonska, Beata; Peters, Ulrike; Petersen, Gloria M.; Prokunina-Olsson, Ludmila; Purdue, Mark; Qiao, You-Lin; Rabe, Kari G.; Rajaraman, Preetha; Real, Francisco X.; Riboli, Elio; Rodríguez-Santiago, Benjamín; Rothman, Nathaniel; Ruder, Avima M.; Savage, Sharon A.; Schwartz, Ann G.; Schwartz, Kendra L.; Sesso, Howard D.; Severi, Gianluca; Silverman, Debra T.; Spitz, Margaret R.; Stevens, Victoria L.; Stolzenberg-Solomon, Rachael; Stram, Daniel; Tang, Ze-Zhong; Taylor, Philip R.; Teras, Lauren R.; Tobias, Geoffrey S.; Viswanathan, Kala; Wacholder, Sholom; Wang, Zhaoming; Weinstein, Stephanie J.; Wheeler, William; White, Emily; Wiencke, John K.; Wolpin, Brian M.; Wu, Xifeng; Wunder, Jay S.; Yu, Kai; Zanetti, Krista A.; Zeleniuch-Jacquotte, Anne; Ziegler, Regina G.; de Andrade, Mariza; Barnes, Kathleen C.; Beaty, Terri H.; Bierut, Laura J.; Desch, Karl C.; Doheny, Kimberly F.; Feenstra, Bjarke; Ginsburg, David; Heit, John A.; Kang, Jae H.; Laurie, Cecilia A.; Li, Jun Z.; Lowe, William L.; Marazita, Mary L.; Melbye, Mads; Mirel, Daniel B.; Murray, Jeffrey C.; Nelson, Sarah C.; Pasquale, Louis R.; Rice, Kenneth; Wiggs, Janey L.; Wise, Anastasia; Tucker, Margaret; Pérez-Jurado, Luis A.; Laurie, Cathy C.; Caporaso, Neil E.; Yeager, Meredith; Chanock, Stephen J.
2015-01-01
Analyses of genome-wide association study (GWAS) data have revealed that detectable genetic mosaicism involving large (>2 Mb) structural autosomal alterations occurs in a fraction of individuals. We present results for a set of 24,849 genotyped individuals (total GWAS set II [TGSII]) in whom 341 large autosomal abnormalities were observed in 168 (0.68%) individuals. Merging data from the new TGSII set with data from two prior reports (the Gene-Environment Association Studies and the total GWAS set I) generated a large dataset of 127,179 individuals; we then conducted a meta-analysis to investigate the patterns of detectable autosomal mosaicism (n = 1,315 events in 925 [0.73%] individuals). Restricting to events >2 Mb in size, we observed an increase in event frequency as event size decreased. The combined results underscore that the rate of detectable mosaicism increases with age (p value = 5.5 × 10⁻³¹) and is higher in men (p value = 0.002) but lower in participants of African ancestry (p value = 0.003). In a subset of 47 individuals from whom serial samples were collected up to 6 years apart, complex changes were noted over time and showed an overall increase in the proportion of mosaic cells with age. Our large combined sample provided a unique ability to characterize detectable genetic mosaicism involving large structural events and strengthens the emerging evidence of non-random erosion of the genome in the aging population. PMID:25748358
[The research protocol III. Study population].
Arias-Gómez, Jesús; Villasís-Keever, Miguel Ángel; Miranda-Novales, María Guadalupe
2016-01-01
The study population is defined as a determined, limited, and accessible set of cases that will constitute the subjects for selection of the sample; it must fulfill several characteristics and distinct criteria. The objectives of this manuscript focus on specifying each of the elements required to select the participants of a research project during the elaboration of the protocol, including the concepts of study population, sample, selection criteria and sampling methods. After delineating the study population, the researcher must specify the criteria with which each participant has to comply. The criteria that specify these characteristics are called selection or eligibility criteria: inclusion, exclusion and elimination criteria, which delineate the eligible population. Sampling methods are divided into two large groups: 1) probabilistic or random sampling and 2) non-probabilistic sampling. The difference lies in the use of statistical methods to select the subjects. In every study, it is necessary to establish at the outset the specific number of participants to be included to achieve the objectives of the study. This number is the sample size, and it can be calculated or estimated with mathematical formulas and statistical software.
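The closing remark, that sample size can be computed with standard formulas, can be illustrated for the common case of estimating a proportion; the precision and confidence figures below are illustrative defaults, not values from the article:

```python
import math
from statistics import NormalDist

def sample_size_proportion(p, d, confidence=0.95):
    """Minimum n to estimate a proportion expected to be near p
    to within +/- d, via the normal approximation
    n = z^2 * p * (1 - p) / d^2."""
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    return math.ceil(z ** 2 * p * (1 - p) / d ** 2)

# Conservative planning value p = 0.5, +/- 5 points, 95% confidence:
n = sample_size_proportion(0.5, 0.05)
```

Using p = 0.5 maximizes p(1 - p) and so gives the most conservative (largest) sample size when no prior estimate of the proportion is available.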
Tang, Hongmao; Beg, Khaliq R.; Al-Otaiba, Yousef
2006-01-01
Kuwait has a desert climate. Because of the extremely hot and dry conditions in this country, some analytical phenomena have been observed. We therefore report a systematic study of sampling and analyzing volatile organic compounds in air using GC-MS with a cryogenic trap. The study included comparisons of different sample containers, such as Tedlar bags and SUMMA canisters, and of different cryogenically frozen-out air volumes in the trap. Calibration curves for different compounds and improvements in the agreement of replicate analyses are also reported. The study found that different sample containers produced different results: ambient air samples collected in Tedlar bags yielded several volatile organic compounds at higher concentrations than samples collected in SUMMA canisters. Proper choice of sample container is therefore a key element in successfully completing a project. Because GC-MS with a cryogenic trap often generates replicate results with poor agreement, an internal standard added to gas standards and air samples with a gas syringe was tested; the results showed that it improved the agreement of replicate analyses. PMID:16699723
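The internal-standard correction described above works because run-to-run variation in injection volume and trap recovery affects the analyte and the internal standard alike, so it cancels in their peak-area ratio. A toy illustration with made-up replicate peak areas:

```python
from statistics import mean, stdev

def rsd(values):
    """Relative standard deviation (%) across replicate injections."""
    return 100.0 * stdev(values) / mean(values)

# Made-up replicate peak areas: the raw analyte areas drift with
# injection volume, but the internal-standard (IS) areas drift the
# same way, so the analyte/IS area ratio is far more repeatable.
analyte_areas = [980.0, 1150.0, 1040.0, 1210.0]
is_areas = [500.0, 590.0, 530.0, 615.0]
ratios = [a / i for a, i in zip(analyte_areas, is_areas)]
```

Here the raw areas scatter by roughly 10% RSD while the ratios agree to well under 1%, which is the kind of improvement in replicate agreement the abstract reports.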
Identification of New Lithic Clasts in Lunar Breccia 14305 by Micro-CT and Micro-XRF Analysis
NASA Technical Reports Server (NTRS)
Zeigler, Ryan A.; Carpenter, Paul K.; Jolliff, Bradley L.
2014-01-01
From 1969 to 1972, Apollo astronauts collected 382 kg of rocks, soils, and core samples from six locations on the surface of the Moon. The samples were initially characterized, largely by binocular examination, in a custom-built facility at Johnson Space Center (JSC), and the samples have been curated at JSC ever since. Despite over 40 years of study, demand for samples remains high (500 subsamples per year are allocated to scientists around the world), particularly for plutonic (e.g., anorthosites, norites, etc.) and evolved (e.g., granites, KREEP basalts) lithologies. The reason for the prolonged interest is that as new scientists and new techniques examine the samples, our understanding of how the Moon, Earth, and other inner Solar System bodies formed and evolved continues to grow. Scientists continually clamor for new samples to test their emerging hypotheses. Although all of the large Apollo samples that are igneous rocks have been classified, many Apollo samples are complex polymict breccias that have previously yielded large (cm-sized) igneous clasts. In this work we present the initial efforts to use the non-destructive techniques of micro-computed tomography (micro-CT) and micro x-ray fluorescence (micro-XRF) to identify large lithic clasts in Apollo 14 polymict breccia sample 14305. The sample in this study is 14305,483, a 150 g slab of regolith breccia 14305 measuring 10x6x2 cm (Figure 1a). The sample was scanned at the University of Texas High-Resolution X-ray CT Facility on an Xradia MicroXCT scanner. Two adjacent overlapping volumes were acquired at 49.2 µm resolution and stitched together, resulting in 1766 slices. Each volume was acquired at 100 kV accelerating voltage and 98 mA beam current with a 1 mm CaF2 filter, with 2161 views gathered over 360° at 3 seconds acquisition time per view. Micro-XRF analyses were done at Washington University in St. Louis, Missouri on an EDAX Orbis PC micro-XRF instrument.
Multiple scans were made at 40 kV accelerating voltage, 800 mA beam current, 30 µm beam diameter, and a beam spacing of 30-120 µm. The micro-CT scan of 14305,483 (Figure 2) identified several large lithic clasts (approx. 1 cm) within the interior of the slab. These clasts will be exposed by band-sawing or chipping of the slab, and their composition more fully characterized by subsequent micro-XRF analysis. In addition to lithic clasts, the micro-CT scans identified numerous mineral clasts, including many FeNi metal grains, as well as the prominent fractures within the slab. The micro-XRF analyses (Figure 1b,c) of the slab surfaces revealed the (qualitative) bulk chemical compositions of the different clast types observed. In particular, by looking at ratios of major elements (e.g., Ca:Mg:Fe), differences among the many observed clast types are readily apparent. Moreover, several clasts not apparent to the naked eye were revealed in the K:Al:Si ratio map. The scans are also able to identify small grains of Zr- and P-rich minerals (not shown), which could in turn yield important age dates for the samples.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shin, Jaejin; Woo, Jong-Hak; Mulchaey, John S.
We perform a comprehensive study of X-ray cavities using a large sample of X-ray targets selected from the Chandra archive. The sample is selected to cover a large dynamic range including galaxy clusters, groups, and individual galaxies. Using β-modeling and unsharp-masking techniques, we investigate the presence of X-ray cavities for 133 targets that have sufficient X-ray photons for analysis. We detect 148 X-ray cavities from 69 targets and measure their properties, including cavity size, angle, and distance from the center of the diffuse X-ray gas. We confirm the strong correlation between cavity size and distance from the X-ray center, similar to previous studies. We find that the detection rates of X-ray cavities are similar among galaxy clusters, groups and individual galaxies, suggesting that the formation mechanism of X-ray cavities is independent of environment.
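Unsharp masking, one of the two cavity-detection techniques named, subtracts a smoothed map so that cavities appear as negative residuals; a minimal pure-Python sketch on a toy surface-brightness map (not the authors' pipeline):

```python
def box_blur(img, k=1):
    """Mean filter over a (2k+1) x (2k+1) window, edges clamped."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            window = [img[min(max(i + di, 0), h - 1)][min(max(j + dj, 0), w - 1)]
                      for di in range(-k, k + 1) for dj in range(-k, k + 1)]
            out[i][j] = sum(window) / len(window)
    return out

def unsharp_residual(img, k=1):
    """Subtract the smoothed map; depressions ("cavities") come out
    as negative residuals."""
    smooth = box_blur(img, k)
    return [[img[i][j] - smooth[i][j] for j in range(len(img[0]))]
            for i in range(len(img))]

# Toy surface-brightness map: uniform background with one "cavity"
image = [[10.0] * 5 for _ in range(5)]
image[2][2] = 4.0
residual = unsharp_residual(image)
```

In practice a Gaussian kernel matched to the expected cavity scale replaces the box filter, and the residual map is inspected alongside a β-model fit of the smooth gas profile.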
Human Finger-Prick Induced Pluripotent Stem Cells Facilitate the Development of Stem Cell Banking
Tan, Hong-Kee; Toh, Cheng-Xu Delon; Ma, Dongrui; Yang, Binxia; Liu, Tong Ming; Lu, Jun; Wong, Chee-Wai; Tan, Tze-Kai; Li, Hu; Syn, Christopher; Tan, Eng-Lee; Lim, Bing; Lim, Yoon-Pin; Cook, Stuart A.
2014-01-01
Induced pluripotent stem cells (iPSCs) derived from somatic cells of patients can be a good model for studying human diseases and for future therapeutic regenerative medicine. Current initiatives to establish human iPSC (hiPSC) banking face challenges in recruiting large numbers of donors with diverse diseased, genetic, and phenotypic representations. In this study, we describe the efficient derivation of transgene-free hiPSCs from human finger-prick blood. Finger-prick sample collection can be performed on a “do-it-yourself” basis by donors and sent to the hiPSC facility for reprogramming. We show that single-drop volumes of finger-prick samples are sufficient for performing cellular reprogramming, DNA sequencing, and blood serotyping in parallel. Our novel strategy has the potential to facilitate the development of large-scale hiPSC banking worldwide. PMID:24646489
Schwanke, C.J.; Hubert, W.A.
2004-01-01
Alternatives to electrofishing are needed for sampling sexually mature rainbow trout Oncorhynchus mykiss during the spawning season in large Alaskan rivers. We compared hook and line, beach seining, and actively fished gill nets as sampling tools. Beach seining and active gill netting yielded similar catch rates, length frequencies, and sex ratios of sexually mature fish. Hook-and-line sampling was less effective, with a lower catch rate and selectivity for immature fish and sexually mature females. We conclude that both beach seining and active gill netting can serve as alternatives to electrofishing for sampling sexually mature rainbow trout stocks during the spawning season in large rivers with stable spring flows and spawning areas with few snags.
Phylogenetic effective sample size.
Bartoszek, Krzysztof
2016-10-21
In this paper I address the question of how large a phylogenetic sample is. I propose a definition of a phylogenetic effective sample size for Brownian motion and Ornstein-Uhlenbeck processes: the regression effective sample size. I discuss how mutual information can be used to define an effective sample size in the non-normal process case and compare these two definitions to an existing concept of effective sample size (the mean effective sample size). Through a simulation study I find that the AICc is robust if one corrects for the number of species or the effective number of species. Lastly, I discuss how the concept of the phylogenetic effective sample size can be useful for biodiversity quantification, identification of interesting clades and deciding on the importance of phylogenetic correlations.
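Under a multivariate-normal model, one generic "regression" notion of effective sample size for n correlated observations is 1ᵀR⁻¹1, where R is their correlation matrix (e.g. the one induced by shared phylogenetic history); the sketch below illustrates that idea and is not Bartoszek's exact estimator:

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def regression_ess(R):
    """Effective sample size 1' R^{-1} 1 for observations with
    correlation matrix R: equals n when R is the identity and
    shrinks toward 1 as correlations grow."""
    ones = [1.0] * len(R)
    return sum(solve(R, ones))
```

For two observations with correlation 0.5 this gives 4/3 rather than 2, quantifying how closely related species carry less independent information.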
Attenuation of species abundance distributions by sampling
Shimadzu, Hideyasu; Darnell, Ross
2015-01-01
Quantifying biodiversity aspects such as species presence/absence, richness and abundance is an important challenge in answering scientific and resource-management questions. In practice, biodiversity can only be assessed from biological material taken by surveys, a difficult task given limited time and resources. A type of random sampling, often called sub-sampling, is commonly used to reduce the time and effort of investigating large quantities of biological samples. However, it is not immediately clear how (sub-)sampling affects estimates of biodiversity from a quantitative perspective. This paper specifies the effect of (sub-)sampling as attenuation of the species abundance distribution (SAD), and articulates how sampling bias is induced in the SAD by random sampling. The framework presented also reveals some confusion in previous theoretical studies. PMID:26064626
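The attenuation described can be modeled as binomial thinning: under random sub-sampling with retention probability q, a species of true abundance n contributes Binomial(n, q) observed individuals and is detected at all with probability 1 - (1 - q)^n. A sketch (illustrative, not the paper's derivation):

```python
import random

def subsample(abundances, q, seed=0):
    """Binomial thinning: each individual of each species is retained
    independently with probability q, attenuating the SAD."""
    rng = random.Random(seed)
    return [sum(rng.random() < q for _ in range(n)) for n in abundances]

def detection_prob(n, q):
    """Probability that a species of true abundance n is detected."""
    return 1.0 - (1.0 - q) ** n
```

Because detection probability falls steeply with abundance, rare species are disproportionately lost, which is exactly why naive richness estimates from sub-samples are biased downward.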
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, R; Baer, E; Jee, K
Purpose: For proton therapy, an accurate model of the conversion of CT HU to relative stopping power (RSP) is essential. In current practice, validation of these models relies solely on measurements of tissue substitutes with standard compositions. Validation based on real tissue samples would be much more direct and can address variations between patients. This study aims to develop an efficient and accurate system, based on the concept of dose extinction, to measure WEPL and retrieve RSP for a large number of biological tissue types. Methods: A broad AP proton beam delivering a spread-out Bragg peak (SOBP) is used to irradiate the samples, with a Matrixx detector positioned immediately below. A water tank was placed on top of the samples, with the water level controllable to sub-millimeter precision by a remotely controlled dosing pump. While the water level was gradually lowered with the beam on, the transmission dose was recorded at 1 frame/s. The WEPL was determined as the difference between the known beam range of the delivered SOBP (80%) and the water level corresponding to 80% of the measured dose profiles in time. A Gammex 467 phantom was used to test the system, and various types of biological tissue were measured. Results: RSP values for all Gammex inserts, except the one made of lung-450 material (<2% error), were determined within ±0.5% error. Depending on the WEPL of the investigated phantom, a measurement takes around 10 min, which could be accelerated with a faster pump. Conclusion: Based on the concept of dose extinction, a system was explored to measure WEPL efficiently and accurately for a large number of samples. This allows the validation of CT HU to stopping power conversions based on large numbers of samples and real tissues. It also allows the assessment of beam uncertainties due to variations between patients, an issue that has not been sufficiently studied before.
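The 80%-crossing step of the dose-extinction method can be sketched as linear interpolation on the recorded dose-versus-water-level trace; the trace and the 10 cm beam range below are synthetic, not the authors' measurements:

```python
def crossing_level(levels, doses, frac=0.8):
    """Water level where the dose trace crosses frac * max(dose),
    by linear interpolation between adjacent readings."""
    target = frac * max(doses)
    for i in range(1, len(doses)):
        d0, d1 = doses[i - 1], doses[i]
        if (d0 - target) * (d1 - target) <= 0 and d0 != d1:
            t = (target - d0) / (d1 - d0)
            return levels[i - 1] + t * (levels[i] - levels[i - 1])
    return None

def wepl(levels, doses, r80, frac=0.8):
    """WEPL = known 80% beam range minus the water level at the
    80% dose crossing."""
    return r80 - crossing_level(levels, doses, frac)

# Synthetic trace: water level lowered while transmission dose rises
levels = [5.0, 4.0, 3.0, 2.0, 1.0]   # cm of water above the sample
doses = [0.0, 0.0, 0.5, 1.0, 1.0]    # normalized detector reading
w = wepl(levels, doses, r80=10.0)    # hypothetical 80% range: 10 cm
```

Repeating this per detector pixel of the 2D array yields a WEPL map of the whole slab of samples in one continuous drain, which is what makes the method efficient for many tissue types at once.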
Sharland, Michael J; Waring, Stephen C; Johnson, Brian P; Taran, Allise M; Rusin, Travis A; Pattock, Andrew M; Palcher, Jeanette A
2018-01-01
Assessing test performance validity is a standard clinical practice and although studies have examined the utility of cognitive/memory measures, few have examined attention measures as indicators of performance validity beyond the Reliable Digit Span. The current study further investigates the classification probability of embedded Performance Validity Tests (PVTs) within the Brief Test of Attention (BTA) and the Conners' Continuous Performance Test (CPT-II), in a large clinical sample. This was a retrospective study of 615 patients consecutively referred for comprehensive outpatient neuropsychological evaluation. Non-credible performance was defined two ways: failure on one or more PVTs and failure on two or more PVTs. Classification probability of the BTA and CPT-II into non-credible groups was assessed. Sensitivity, specificity, positive predictive value, and negative predictive value were derived to identify clinically relevant cut-off scores. When using failure on two or more PVTs as the indicator for non-credible responding compared to failure on one or more PVTs, highest classification probability, or area under the curve (AUC), was achieved by the BTA (AUC = .87 vs. .79). CPT-II Omission, Commission, and Total Errors exhibited higher classification probability as well. Overall, these findings corroborate previous findings, extending them to a large clinical sample. BTA and CPT-II are useful embedded performance validity indicators within a clinical battery but should not be used in isolation without other performance validity indicators.
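The reported AUCs have a direct probabilistic reading: the chance that a randomly chosen credible case outscores a randomly chosen non-credible one (ties counting half). A minimal Mann-Whitney computation with made-up scores:

```python
def auc(credible, noncredible):
    """AUC as the Mann-Whitney probability that a credible case
    scores higher than a non-credible one (ties count half)."""
    wins = sum(1.0 if c > n else 0.5 if c == n else 0.0
               for c in credible for n in noncredible)
    return wins / (len(credible) * len(noncredible))

# Made-up attention-test scores (higher = better performance):
a = auc(credible=[18, 19, 20, 17], noncredible=[10, 12, 17])
```

Sweeping a cutoff over the same scores and tabulating sensitivity and specificity at each value is how clinically usable cut-off scores, like those derived for the BTA and CPT-II, are chosen from an ROC curve.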
Correlates of suicidality in firefighter/EMS personnel.
Martin, Colleen E; Tran, Jana K; Buser, Sam J
2017-01-15
Firefighter and Emergency Medical Services (EMS) personnel experience higher rates of lifetime suicidal ideation and attempts than the general population and other protective service professions. Several correlates of suicidality (alcohol use, depression, posttraumatic stress) have been identified in the literature as applicable to firefighter/EMS populations; however, few studies to date have examined the specific correlates of suicidality (lifetime suicidal ideation and/or attempts) in a firefighter/EMS sample. Participants (N = 3,036) from a large, urban fire department completed demographic and self-report measures of alcohol dependence, depression, posttraumatic stress disorder (PTSD) symptom severity, and lifetime suicidal ideation and attempts. Participants in this sample performed both firefighter and EMS duties and were predominantly male (97%), White (61.6%), and 25-34 years old (32.1%). In hierarchical linear regressions, depression (β = .22, p < .05) and PTSD symptom severity (β = .21, p < .05) were significantly associated with lifetime suicidal ideation (R² = 17.5%). Depression (β = .15, p < .001) and PTSD symptom severity (β = .07, p < .01) were significantly associated with lifetime suicide attempts (R² = 5.1%). Several limitations apply to the current study: the survey was a pre-existing self-report dataset, and lifetime suicidal ideation and attempts were measured using sum scores. Additionally, the disproportionately large sample of males and the large, urban setting may not generalize to female firefighters and members of rural community fire departments. The current study highlights the importance of targeting depression and PTSD symptom severity in efforts to reduce suicidality in firefighter/EMS personnel.
ERIC Educational Resources Information Center
Balducci, Cristian; Mnich, Eva; McKee, Kevin J.; Lamura, Giovanni; Beckmann, Anke; Krevers, Barbro; Wojszel, Z. Beata; Nolan, Mike; Prouskas, Constantinos; Bien, Barbara; Oberg, Birgitta
2008-01-01
Purpose: The present study attempts to further validate the COPE Index on a large sample of carers drawn from six European countries. Design and Methods: We used a cross-sectional survey, with approximately 1,000 carers recruited in each of six countries by means of a common standard evaluation protocol. Our saturation recruitment of a designated…
Use of space-filling curves to select sample locations in natural resource monitoring studies
Andrew Lister; Charles T. Scott
2009-01-01
The establishment of several large area monitoring networks over the past few decades has led to increased research into ways to spatially balance sample locations across the landscape. Many of these methods are well documented and have been used in the past with great success. In this paper, we present a method using geographic information systems (GIS) and fractals...
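One generic way to spatially balance sample locations, in the spirit of the space-filling-curve approach described (though not the authors' GIS-and-fractal method), is to order candidate grid cells along a Z-order (Morton) curve and draw a systematic sample along it:

```python
def morton(x, y, bits=8):
    """Interleave the bits of x and y to get the cell's position
    along a Z-order (Morton) space-filling curve."""
    z = 0
    for i in range(bits):
        z |= ((x >> i) & 1) << (2 * i) | ((y >> i) & 1) << (2 * i + 1)
    return z

def spatially_balanced_sample(points, k):
    """Order candidate (x, y) locations along the curve and take a
    systematic sample of k of them; nearby cells sit close together
    on the curve, so the sample spreads across the landscape."""
    ordered = sorted(points, key=lambda p: morton(p[0], p[1]))
    step = len(ordered) / k
    return [ordered[int(i * step)] for i in range(k)]

# 8 x 8 grid of candidate locations, 4 sample points:
grid = [(x, y) for x in range(8) for y in range(8)]
chosen = spatially_balanced_sample(grid, 4)
```

Because points close in space map to nearby curve positions, systematic selection along the curve avoids the clumping that simple random sampling can produce over a monitoring region.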