Protocol for Detection of Yersinia pestis in Environmental ...
Methods Report. This is the first open-access, detailed protocol available to all government departments and agencies, and their contractors, for detecting Yersinia pestis, the pathogen that causes plague, in multiple environmental sample types including water. The protocol presents a step-by-step sample processing procedure for each sample type, and covers real-time PCR, traditional microbiological culture, and Rapid Viability PCR (RV-PCR) analytical methods. For large-volume water samples it also includes an ultrafiltration-based sample concentration procedure. Because the protocol is available without restriction to all government departments and agencies, and their contractors, the nation will now have increased laboratory capacity to analyze large numbers of samples during a wide-area plague incident.
Geochemical and isotopic water results, Barrow, Alaska, 2012-2013
Heikoop, Jeff; Wilson, Cathy; Newman, Brent
2012-07-18
Data include a large suite of analytes (geochemical and isotopic) for samples collected in Barrow, Alaska (2012-2013). Sample types are indicated, and include soil pore waters, drainage waters, snowmelt, precipitation, and permafrost samples.
SAMPLING LARGE RIVERS FOR ALGAE, BENTHIC MACROINVERTEBRATES AND FISH
Multiple projects are currently underway to increase our understanding of the effects of different sampling methods and designs used for the biological assessment and monitoring of large (boatable) rivers. Studies include methods used to assess fish, benthic macroinvertebrates, ...
Analysis of large soil samples for actinides
Maxwell, Sherrod L., III [Aiken, SC]
2009-03-24
A method of analyzing relatively large soil samples for actinides employs a separation process that includes cerium fluoride precipitation, which removes the soil matrix and co-precipitates plutonium, americium, and curium with cerium and hydrofluoric acid, followed by separation of these actinides using chromatography cartridges.
ERIC Educational Resources Information Center
Boivin, Michel; Perusse, Daniel; Dionne, Ginette; Saysset, Valerie; Zoccolillo, Mark; Tarabulsy, George M.; Tremblay, Nathalie; Tremblay, Richard E.
2005-01-01
Background: Given the importance of parenting for the child's early socio-emotional development, parenting perceptions and behaviours, and their correlates, should be assessed as early as possible in the child's life. The goals of the present study were 1) to confirm, in two parallel population-based samples, including a large sample of twins, the…
Holmberg, Rebecca C; Gindlesperger, Alissa; Stokes, Tinsley; Brady, Dane; Thakore, Nitu; Belgrader, Philip; Cooney, Christopher G; Chandler, Darrell P
2013-06-11
TruTip is a simple nucleic acid extraction technology whereby a porous, monolithic binding matrix is inserted into a pipette tip. The geometry of the monolith can be adapted for specific pipette tips ranging in volume from 1.0 to 5.0 ml. The large porosity of the monolith enables viscous or complex samples to readily pass through it with minimal fluidic backpressure. Bi-directional flow maximizes residence time between the monolith and sample, and enables large sample volumes to be processed within a single TruTip. The fundamental steps, irrespective of sample volume or TruTip geometry, include cell lysis, nucleic acid binding to the inner pores of the TruTip monolith, washing away unbound sample components and lysis buffers, and eluting purified and concentrated nucleic acids into an appropriate buffer. The attributes and adaptability of TruTip are demonstrated in three automated clinical sample processing protocols, implemented on Eppendorf epMotion 5070, Hamilton STAR, and Hamilton STARplus liquid handling robots, respectively: RNA isolation from nasopharyngeal aspirate, genomic DNA isolation from whole blood, and fetal DNA extraction and enrichment from large volumes of maternal plasma.
FIELD-SCALE STUDIES: HOW DOES SOIL SAMPLE PRETREATMENT AFFECT REPRESENTATIVENESS?
Samples from field-scale studies are very heterogeneous and can contain large soil and rock particles. Oversize materials are often removed before chemical analysis of the soil samples because it is not practical to include these materials. Is the extracted sample representativ...
Catastrophic incidents can generate a large number of samples with analytically diverse types including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface resid...
Generation and analysis of chemical compound libraries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gregoire, John M.; Jin, Jian; Kan, Kevin S.
2017-10-03
Various samples are generated on a substrate. Each sample includes or consists of one or more analytes. In some instances, the samples are generated through the use of gels or through vapor deposition techniques. The samples are used in an instrument for screening large numbers of analytes by locating the samples between a working electrode and a counter electrode assembly. The instrument also includes one or more light sources for illuminating each of the samples. The instrument is configured to measure the photocurrent formed through a sample as a result of the illumination of that sample.
Integrating resource selection into spatial capture-recapture models for large carnivores
Proffitt, Kelly M.; Goldberg, Joshua; Hebblewhite, Mark; Russell, Robin E.; Jimenez, Ben; Robinson, Hugh S.; Pilgrim, Kristine; Schwartz, Michael K.
2015-01-01
Wildlife managers need reliable methods to estimate large carnivore densities and population trends; yet large carnivores are elusive, difficult to detect, and occur at low densities, making traditional approaches intractable. Recent advances in spatial capture-recapture (SCR) models have provided new approaches for monitoring trends in wildlife abundance, and these methods are particularly applicable to large carnivores. We applied SCR models in a Bayesian framework to estimate mountain lion densities in the Bitterroot Mountains of west-central Montana. We incorporated an existing resource selection function (RSF) as a density covariate to account for heterogeneity in habitat use across the study area and included data collected from harvested lions. We identified individuals through DNA samples collected by (1) biopsy darting mountain lions detected in systematic surveys of the study area, (2) opportunistically collecting hair and scat samples, and (3) sampling all harvested mountain lions. We included 80 DNA samples collected from 62 individuals in the analysis. Including information on predicted habitat use as a covariate on the distribution of activity centers reduced the median estimated density by 44%, the standard deviation by 7%, and the width of 95% credible intervals by 10% as compared to standard SCR models. Within the two management units of interest, we estimated median mountain lion densities of 4.5 mountain lions/100 km² (95% CI = 2.9, 7.7) and 5.2 mountain lions/100 km² (95% CI = 3.4, 9.1). Including harvested individuals (dead recovery) did not create a significant bias in the detection process by introducing individuals that could not be detected after removal. However, the dead recovery component of the model did have a substantial effect on results by increasing sample size. The ability to account for heterogeneity in habitat use provides a useful extension to SCR models and will enhance the ability of wildlife managers to reliably and economically estimate density of wildlife populations, particularly large carnivores.
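A minimal sketch of the two model components named above, in standard SCR notation (symbols assumed here, not taken from the paper): detection probability declines with distance from an individual's activity center s, and the density of activity centers is modeled on the RSF,

    p(x \mid s) = p_0 \exp\!\left( -\frac{\lVert x - s \rVert^2}{2\sigma^2} \right),
    \qquad
    \log \lambda(s) = \beta_0 + \beta_1 \,\mathrm{RSF}(s)

where x is a detector or sample location, p_0 and sigma govern detectability, and beta_1 captures the habitat-use heterogeneity the authors account for.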
Extreme Mean and Its Applications
NASA Technical Reports Server (NTRS)
Swaroop, R.; Brownlow, J. D.
1979-01-01
Extreme value statistics obtained from normally distributed data are considered. An extreme mean is defined as the mean of a p-th-probability truncated normal distribution. An unbiased estimate of this extreme mean and its large-sample distribution are derived. The distribution of this estimate, even for very large samples, is found to be nonnormal. Further, as the sample size increases, the variance of the unbiased estimate converges to the Cramer-Rao lower bound. The computer program used to obtain the density and distribution functions of the standardized unbiased estimate, and the confidence intervals of the extreme mean for any data, is included for ready application. An example demonstrates the usefulness of the extreme mean in application.
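The abstract defines the extreme mean only verbally; one standard formalization for normal data, assuming truncation to the upper p-tail (a reading of the abstract, not notation from the report), is

    \mu_p = E\left[ X \mid X > \mu + \sigma z_{1-p} \right]
          = \mu + \sigma \, \frac{\phi(z_{1-p})}{p},
    \qquad z_{1-p} = \Phi^{-1}(1-p)

where phi and Phi are the standard normal density and distribution functions; the estimator studied in the report replaces mu and sigma with sample estimates.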
Hu, Jian Zhi; Sears, Jr., Jesse A.; Hoyt, David W.; Mehta, Hardeep S.; Peden, Charles H. F.
2015-11-24
A continuous-flow (CF) magic angle sample spinning (CF-MAS) NMR rotor and probe are described for investigating reaction dynamics, stable intermediates/transition states, and mechanisms of catalytic reactions in situ. The rotor includes a sample chamber of a flow-through design with a large sample volume that delivers a flow of reactants through a catalyst bed contained within the sample cell allowing in-situ investigations of reactants and products. Flow through the sample chamber improves diffusion of reactants and products through the catalyst. The large volume of the sample chamber enhances sensitivity permitting in situ ¹³C CF-MAS studies at natural abundance.
Importance sampling large deviations in nonequilibrium steady states. I.
Ray, Ushnish; Chan, Garnet Kin-Lic; Limmer, David T
2018-03-28
Large deviation functions contain information on the stability and response of systems driven into nonequilibrium steady states and in such a way are similar to free energies for systems at equilibrium. As with equilibrium free energies, evaluating large deviation functions numerically for all but the simplest systems is difficult because by construction they depend on exponentially rare events. In this first paper of a series, we evaluate different trajectory-based sampling methods capable of computing large deviation functions of time integrated observables within nonequilibrium steady states. We illustrate some convergence criteria and best practices using a number of different models, including a biased Brownian walker, a driven lattice gas, and a model of self-assembly. We show how two popular methods for sampling trajectory ensembles, transition path sampling and diffusion Monte Carlo, suffer from exponentially diverging correlations in trajectory space as a function of the bias parameter when estimating large deviation functions. Improving the efficiencies of these algorithms requires introducing guiding functions for the trajectories.
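For readers outside the field, the object being estimated can be stated compactly (standard definitions, not notation from the paper): for a time-integrated observable A_t = \int_0^t a(x_s)\,ds, the large deviation function is the scaled cumulant generating function

    \psi(\lambda) = \lim_{t \to \infty} \frac{1}{t} \ln \left\langle e^{\lambda A_t} \right\rangle

and the trajectory-sampling methods discussed above target the exponentially tilted path ensemble P_\lambda[x(t)] \propto P[x(t)]\, e^{\lambda A_t}, whose overlap with the unbiased ensemble shrinks exponentially in t; this shrinking overlap is the source of the diverging correlations the authors report.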
Findings and Implications of the Assignment Incentive Survey
2002-10-01
Submarine Base Kings Bay is in Georgia. In comparison, the Rotating to Sea sample (see figure 7) included Florida and Georgia and did not include...Because Kings Bay is a large submarine establishment, the Hawaii sample may have also been heavily in favor of FL/GA; however, the Hawaii sample was...Kings Bay, FL was listed as the most desirable fleet location for homebasing, followed by Bremerton/Bangor, Everett/Whidbey Island, Gulfport/Pas
Pestle, Sarah L; Chorpita, Bruce F; Schiffman, Jason
2008-04-01
The Penn State Worry Questionnaire for Children (PSWQ-C; Chorpita, Tracey, Brown, Collica, & Barlow, 1997) is a 14-item self-report measure of worry in children and adolescents. Although the PSWQ-C has demonstrated favorable psychometric properties in small clinical and large community samples, this study represents the first psychometric evaluation of the PSWQ-C in a large clinical sample (N = 491). Factor analysis indicated a two-factor structure, in contrast to all previously published findings on the measure. The PSWQ-C demonstrated favorable psychometric properties in this sample, including high internal consistency, high convergent validity with related constructs, and acceptable discriminative validity between diagnostic categories. The performance of the 3 reverse-scored items was closely examined, and the results supported retaining all 14 items.
Big Data and Large Sample Size: A Cautionary Note on the Potential for Bias
Chambers, David A.; Glasgow, Russell E.
2014-01-01
A number of commentaries have suggested that large studies are more reliable than smaller studies, and there is a growing interest in the analysis of "big data" that integrates information from many thousands of persons and/or different data sources. We consider a variety of biases that are likely in the era of big data, including sampling error, measurement error, multiple comparisons errors, aggregation error, and errors associated with the systematic exclusion of information. Using examples from epidemiology, health services research, studies on determinants of health, and clinical trials, we conclude that it is necessary to exercise greater caution to be sure that big sample size does not lead to big inferential errors. Despite the advantages of big studies, large sample size can magnify the bias associated with error resulting from sampling or study design. Clin Trans Sci 2014. PMID: 25043853
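A tiny simulation makes the caution concrete: under a systematic selection bias, a larger sample only tightens the confidence interval around the wrong value. This is an illustrative sketch with invented numbers, not an analysis from the article.

    import numpy as np

    rng = np.random.default_rng(0)
    true_mean = 0.0
    bias = 0.3  # hypothetical systematic error from non-random selection

    for n in (100, 10_000, 1_000_000):
        x = rng.normal(true_mean + bias, 1.0, size=n)  # draws from a biased frame
        half_width = 1.96 * x.std(ddof=1) / np.sqrt(n)  # 95% CI half-width
        print(f"n={n:>9,}: mean={x.mean():+.3f} +/- {half_width:.4f}")

    # The interval collapses around ~0.30 rather than the true 0.0:
    # large n magnifies confidence in the bias instead of correcting it.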
Lawson, Chris A
2018-09-01
Two experiments examined the extent to which category status influences children's attention to the composition of evidence samples provided by different informants. Children were told about two informants, each of whom presented different samples of evidence, and then were asked to judge which informant they would trust to help them learn something new. The composition of evidence samples was manipulated such that one sample included either a large number (n = 5) or a diverse range of exemplars relative to the other sample, which included either a small number (n = 2) or a homogeneous range of exemplars. Experiment 1 revealed that participants (N = 37; M age = 4.76 years) preferred to place their trust in the informant who presented the large or diverse sample when each informant was labeled "teacher" but exhibited no preference when each informant was labeled "child." Experiment 2 revealed developmental differences in responses when labels and sample composition were pitted against each other. Younger children (n = 32; M age = 3.42 years) consistently trusted the "teacher" regardless of the composition of the sample the informant was said to have provided, whereas older children (n = 30; M age = 5.54 years) consistently trusted the informant who provided the large or diverse sample regardless of whether it was provided by a "teacher" or a "child." These results have important implications for understanding the interplay between children's category knowledge and their evaluation of evidence.
Got power? A systematic review of sample size adequacy in health professions education research.
Cook, David A; Hatala, Rose
2015-03-01
Many education research studies employ small samples, which in turn lowers statistical power. We re-analyzed the results of a meta-analysis of simulation-based education to determine study power across a range of effect sizes, and the smallest effect that could be plausibly excluded. We systematically searched multiple databases through May 2011, and included all studies evaluating simulation-based education for health professionals in comparison with no intervention or another simulation intervention. Reviewers working in duplicate abstracted information to calculate standardized mean differences (SMDs). We included 897 original research studies. Among the 627 no-intervention-comparison studies, the median sample size was 25. Only two studies (0.3%) had ≥80% power to detect a small difference (SMD > 0.2 standard deviations) and 136 (22%) had power to detect a large difference (SMD > 0.8). 110 no-intervention-comparison studies failed to find a statistically significant difference, but none excluded a small difference and only 47 (43%) excluded a large difference. Among 297 studies comparing alternate simulation approaches, the median sample size was 30. Only one study (0.3%) had ≥80% power to detect a small difference and 79 (27%) had power to detect a large difference. Of the 128 studies that did not detect a statistically significant effect, 4 (3%) excluded a small difference and 91 (71%) excluded a large difference. In conclusion, most education research studies are powered only to detect effects of large magnitude. For most studies that do not reach statistical significance, the possibility of large and important differences still exists.
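To see why a median of 25 participants per arm implies such low power, a quick check with a standard two-sample t-test power calculation (a sketch using statsmodels, not the authors' code):

    from statsmodels.stats.power import TTestIndPower

    power = TTestIndPower()
    for smd in (0.2, 0.8):  # Cohen's small and large effect sizes
        p = power.power(effect_size=smd, nobs1=25, ratio=1.0, alpha=0.05)
        print(f"SMD = {smd}: power = {p:.2f} with 25 per arm")

    # Roughly 0.10 for SMD = 0.2 and 0.79 for SMD = 0.8, consistent with
    # the review's finding that typical studies can detect only large effects.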
Adaptive Oceanographic Sampling in a Coastal Environment Using Autonomous Gliding Vehicles
2003-08-01
cost autonomous vehicles with near-global range and modular sensor payload. Particular emphasis is placed on the development of adaptive sampling...environment. Secondary objectives include continued development of adaptive sampling strategies suitable for large fleets of slow-moving autonomous ... vehicles, and development and implementation of new oceanographic sensors and sampling methodologies. The main task completed was a complete redesign of
Abrahamson, Melanie; Hooker, Elizabeth; Ajami, Nadim J; Petrosino, Joseph F; Orwoll, Eric S
2017-09-01
The relationship of the gastrointestinal microbiome to health and disease is of major research interest, including the effects of the gut microbiota on age-related conditions. Here we report on the outcome of a project to collect stool samples from a large number of community-dwelling elderly men using the OMNIgene-GUT stool/feces collection kit (OMR-200, DNA Genotek, Ottawa, Canada). Among 1,328 men who were eligible for stool collection, 982 (74%) agreed to participate and 951 submitted samples. The collection process was reported to be acceptable; almost all samples obtained were adequate, and the process of sample handling by mail was uniformly successful. The DNA obtained provided excellent results in microbiome analyses, yielding an abundance of species and a diversity of taxa as would be predicted. Our results suggest that population studies of older participants involving remote stool sample collection are feasible. These approaches would allow large-scale research projects on the association of the gut microbiota with important clinical outcomes.
LARGE RIVER ASSESSMENT METHODS FOR BENTHIC MACROINVERTEBRATES AND FISH
Multiple projects are currently underway to increase our understanding of the varying results of different sampling methods and designs used for the biological assessment and monitoring of large (boatable) rivers. Studies include methods used to assess fish, benthic macroinverte...
Su, Xiaoquan; Xu, Jian; Ning, Kang
2012-10-01
Scientists have long sought to effectively compare different microbial communities (also referred to as 'metagenomic samples' here) on a large scale: given a set of unknown samples, find similar metagenomic samples from a large repository and examine how similar these samples are. With the metagenomic samples accumulated to date, it is possible to build a database of metagenomic samples of interest. Any metagenomic sample could then be searched against this database to find the most similar metagenomic sample(s). However, on one hand, current databases with a large number of metagenomic samples mostly serve as data repositories that offer few functionalities for analysis; on the other hand, methods to measure the similarity of metagenomic data work well only for small sets of samples by pairwise comparison. It is not yet clear how to efficiently search for metagenomic samples against a large metagenomic database. In this study, we propose a novel method, Meta-Storms, that can systematically and efficiently organize and search metagenomic data. It includes the following components: (i) creating a database of metagenomic samples based on their taxonomical annotations, (ii) efficient indexing of samples in the database based on a hierarchical taxonomy indexing strategy, (iii) searching for a metagenomic sample against the database by a fast scoring function based on quantitative phylogeny and (iv) managing the database by index export, index import, data insertion, data deletion and database merging. We have collected more than 1300 metagenomic datasets from the public domain and in-house facilities, and tested the Meta-Storms method on them. Our experimental results show that Meta-Storms is capable of database creation and effective searching for a large number of metagenomic samples, and it achieves accuracies similar to the current popular significance testing-based methods. Meta-Storms would serve as a suitable database management and search system to quickly identify similar metagenomic samples from a large pool of samples.
Partitioning heritability by functional annotation using genome-wide association summary statistics.
Finucane, Hilary K; Bulik-Sullivan, Brendan; Gusev, Alexander; Trynka, Gosia; Reshef, Yakir; Loh, Po-Ru; Anttila, Verneri; Xu, Han; Zang, Chongzhi; Farh, Kyle; Ripke, Stephan; Day, Felix R; Purcell, Shaun; Stahl, Eli; Lindstrom, Sara; Perry, John R B; Okada, Yukinori; Raychaudhuri, Soumya; Daly, Mark J; Patterson, Nick; Neale, Benjamin M; Price, Alkes L
2015-11-01
Recent work has demonstrated that some functional categories of the genome contribute disproportionately to the heritability of complex diseases. Here we analyze a broad set of functional elements, including cell type-specific elements, to estimate their polygenic contributions to heritability in genome-wide association studies (GWAS) of 17 complex diseases and traits with an average sample size of 73,599. To enable this analysis, we introduce a new method, stratified LD score regression, for partitioning heritability from GWAS summary statistics while accounting for linked markers. This new method is computationally tractable at very large sample sizes and leverages genome-wide information. Our findings include a large enrichment of heritability in conserved regions across many traits, a very large immunological disease-specific enrichment of heritability in FANTOM5 enhancers and many cell type-specific enrichments, including significant enrichment of central nervous system cell types in the heritability of body mass index, age at menarche, educational attainment and smoking behavior.
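The regression underlying the method can be stated in one line; in the standard form used in the stratified LD score regression literature,

    E[\chi^2_j] = N \sum_{C} \tau_C \, \ell(j, C) + N a + 1,
    \qquad
    \ell(j, C) = \sum_{k \in C} r^2_{jk}

where chi^2_j is the GWAS association statistic for SNP j, N the sample size, ell(j, C) the LD score of SNP j with respect to functional category C, tau_C the per-SNP heritability coefficient for that category, and a a term absorbing confounding; category-level heritability enrichments follow from the fitted tau_C.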
Flexible sampling large-scale social networks by self-adjustable random walk
NASA Astrophysics Data System (ADS)
Xu, Xiao-Ke; Zhu, Jonathan J. H.
2016-12-01
Online social networks (OSNs) have become an increasingly attractive gold mine for academic and commercial researchers. However, research on OSNs faces a number of difficult challenges. One bottleneck lies in the massive quantity and often unavailability of OSN population data. Sampling perhaps becomes the only feasible solution to the problems. How to draw samples that can represent the underlying OSNs has remained a formidable task for a number of conceptual and methodological reasons. In particular, most empirically driven studies on network sampling are confined to simulated data or sub-graph data, which are fundamentally different from real and complete-graph OSNs. In the current study, we propose a flexible sampling method, called Self-Adjustable Random Walk (SARW), and test it against the population data of a real large-scale OSN. We evaluate the strengths of the sampling method in comparison with four prevailing methods: uniform, breadth-first search (BFS), random walk (RW), and revised RW (i.e., MHRW) sampling. We mix both induced-edge and external-edge information of sampled nodes together in the same sampling process. Our results show that the SARW sampling method is able to generate unbiased samples of OSNs with maximal precision and minimal cost. The study is helpful for the practice of OSN research by providing a highly needed sampling tool, for the methodological development of large-scale network sampling through comparative evaluation of existing sampling methods, and for the theoretical understanding of human networks by highlighting discrepancies and contradictions between existing knowledge/assumptions and large-scale real OSN data.
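SARW's internals are not spelled out in the abstract, but one baseline it is compared against, the revised random walk (MHRW), is standard and easy to sketch. This is an illustrative sketch, not the authors' code; the neighbors(v) accessor is an assumed interface.

    import random

    def mhrw_sample(neighbors, start, n_steps):
        """Metropolis-Hastings random walk (MHRW) over a graph.

        Corrects the plain random walk's bias toward high-degree nodes so the
        visited nodes are approximately uniform samples. neighbors(v) is an
        assumed callable returning the list of v's neighbors.
        """
        v, samples = start, []
        for _ in range(n_steps):
            w = random.choice(neighbors(v))
            # Accept the move with probability min(1, deg(v)/deg(w));
            # otherwise stay at v. Either way, record the current node.
            if random.random() < min(1.0, len(neighbors(v)) / len(neighbors(w))):
                v = w
            samples.append(v)
        return samples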
A self-sampling method to obtain large volumes of undiluted cervicovaginal secretions.
Boskey, Elizabeth R; Moench, Thomas R; Hees, Paul S; Cone, Richard A
2003-02-01
Studies of vaginal physiology and pathophysiology sometimes require larger volumes of undiluted cervicovaginal secretions than can be obtained by current methods. A convenient method for self-sampling these secretions outside a clinical setting can facilitate such studies of reproductive health. The goal was to develop a vaginal self-sampling method for collecting large volumes of undiluted cervicovaginal secretions. A menstrual collection device (the Instead cup) was inserted briefly into the vagina to collect secretions that were then retrieved from the cup by centrifugation in a 50-ml conical tube. All 16 women asked to perform this procedure found it feasible and acceptable. Among 27 samples, an average of 0.5 g of secretions (range, 0.1-1.5 g) was collected. This is a rapid and convenient self-sampling method for obtaining relatively large volumes of undiluted cervicovaginal secretions. It should prove suitable for a wide range of assays, including those involving sexually transmitted diseases, microbicides, vaginal physiology, immunology, and pathophysiology.
Fisheries research and monitoring activities of the Lake Erie Biological Station, 2013
Kraus, Richard T.; Rogers, Mark W.; Kocovsky, Patrick; Edwards, William; Bodamer Scarbro, Betsy L.; Keretz, Kevin R.; Berkman, Stephanie A.
2014-01-01
In 2013, the U.S. Geological Survey's Lake Erie Biological Station successfully completed large vessel surveys in all three of Lake Erie's basins. The station's primary vessel surveys included the Western Basin Forage Fish Assessment and the East Harbor Forage Fish Assessment, as well as contributions to the cooperative multi-agency Central Basin Hydroacoustics Assessment and the Eastern Basin Coldwater Community Assessment (see the Forage Task Group and Coldwater Task Group reports, respectively). Further large vessel sampling included individual research data collection as well as assisting with university (e.g., University of Toledo) and agency (e.g., USFWS, USEPA) large vessel sampling needs. Our 2013 vessel operations began on April 4 and concluded on November 21, with a total of 77 large vessel sampling days (83 total days). During this time, crews of the R/V Muskie and R/V Bowfin deployed 174 trawls covering 147 km of lake bottom, set over 13 km of gillnet, collected hydroacoustic data extending over 250 km of the central and eastern basins, and took approximately 180 zooplankton, benthos, and water samples. 2013 was the first complete sampling year using the R/V Muskie, and technologies available on the new platform provided opportunities for LEBS to improve data sampling methods and results. An investment was made in mensuration gear for the trawls. This gear is attached to the trawl's headrope, footrope, and wings, allowing measurement of the area swept and conversion of catches to densities. Another improvement was real-time output of water parameter sonde profiles (e.g., temperature, dissolved oxygen); the ability to view profile data on a tablet allowed quick identification of thermoclines as well as the presence (or absence) of hypoxia. Minor modifications were made to survey designs relative to last year (see the 2013 report), and collection of long-term data from the R/V Muskie has commenced. One minor change is that we now index yellow perch maturation data during our fall trawl surveys in response to a request from the Lake Erie Yellow Perch Task Group. In the following sections, we describe results from our 2013 sampling efforts in Lake Erie.
Antarctic Meteorite Newsletter, Volume 11, Number 2, August 1988
NASA Technical Reports Server (NTRS)
1988-01-01
Presented are classifications and descriptions of a large number of meteorites, which include the last samples from the 1984 collection and the first samples from the 1987 collection. There is a particularly good selection of meteorites of special petrologic type in the 1987 collection. The achondrites include aubrites, ureilites, howardites, eucrites, and a diogenite. The howardites are particularly notable because of their size and previous scarcity in the Antarctic collection. Noteworthy among the 7 irons and 3 mesosiderites are 2 anomalous irons and 2 large mesosiderites. The carbonaceous chondrites include good suites of C2 and C4 meteorites, and 2 highly equilibrated carbonaceous chondrites tentatively identified as C5 and C6 meteorites. Also included are surveys of numerous meteorites for Al-26 and thermoluminescence. These studies provide information on the thermal and radiation histories of the meteorites and can be used as measures of their terrestrial ages.
An improved technique for taking hydraulic conductivity cores from forest soils
Gerald M. Aubertin
1969-01-01
Describes a large-diameter, heavy-duty soil sampler that makes it possible to obtain long, relatively undisturbed sample columns from stony, root-filled forest soils. The resultant samples include the roots, root channels, stones, and macro-voids common to forested soils.
The ARIEL mission reference sample
NASA Astrophysics Data System (ADS)
Zingales, Tiziano; Tinetti, Giovanna; Pillitteri, Ignazio; Leconte, Jérémy; Micela, Giuseppina; Sarkar, Subhajit
2018-02-01
The ARIEL (Atmospheric Remote-sensing Exoplanet Large-survey) mission concept is one of the three M4 mission candidates selected by the European Space Agency (ESA) for a Phase A study, competing for a launch in 2026. ARIEL has been designed to study the physical and chemical properties of a large and diverse sample of exoplanets and, through those, understand how planets form and evolve in our galaxy. Here we describe the assumptions made to estimate an optimal sample of exoplanets - including already known exoplanets and expected ones yet to be discovered - observable by ARIEL and define a realistic mission scenario. To achieve the mission objectives, the sample should include gaseous and rocky planets with a range of temperatures around stars of different spectral type and metallicity. The current ARIEL design enables the observation of ˜1000 planets, covering a broad range of planetary and stellar parameters, during its four year mission lifetime. This nominal list of planets is expected to evolve over the years depending on the new exoplanet discoveries.
A high-throughput microRNA expression profiling system.
Guo, Yanwen; Mastriano, Stephen; Lu, Jun
2014-01-01
As small noncoding RNAs, microRNAs (miRNAs) regulate diverse biological functions, including physiological and pathological processes. The expression and deregulation of miRNA levels contain rich information with diagnostic and prognostic relevance and can reflect pharmacological responses. The increasing interest in miRNA-related research demands global miRNA expression profiling on large numbers of samples. We describe here a robust protocol that supports high-throughput sample labeling and detection on hundreds of samples simultaneously. This method employs 96-well-based miRNA capturing from total RNA samples and on-site biochemical reactions, coupled with bead-based detection in 96-well format for hundreds of miRNAs per sample. With low-cost, high-throughput, high detection specificity, and flexibility to profile both small and large numbers of samples, this protocol can be adapted in a wide range of laboratory settings.
Ivanov, Alexander I.; Lushchikov, Vladislav I.; Shabalin, Eugeny P.; Maznyy, Nikita G.; Khvastunov, Michael M.; Rowland, Mark
2002-01-01
A detector for fissile materials provides integrity monitoring of fissile materials and can be used for nondestructive assay to confirm the presence of a stable content of fissile material in items. The detector has a sample cavity large enough to enable assay of large items of arbitrary configuration, utilizes neutron sources fabricated in spatially extended shapes mounted on the endcaps of the sample cavity, and incorporates a thermal neutron filter insert with reflector properties; the electronics module includes a neutron multiplicity coincidence counter.
Contemporaneous VLBA 5 GHz Observations of Large Area Telescope Detected Blazars
2012-01-10
Polarimetry Survey (VIPS) have been included in the sample, as well as 142 sources not found in VIPS. This very large, 5 GHz flux-limited sample of active...observing runs were follow-up observations on 90 sources in the VLBA Imaging and Polarimetry Survey (VIPS; Helmboldt et al. 2007) and new 5 GHz observations...Array (VLBA). In total, 232 sources were observed with the VLBA. Ninety sources that were previously observed as part of the VLBA Imaging and Polarimetry
Annealing Increases Stability Of Iridium Thermocouples
NASA Technical Reports Server (NTRS)
Germain, Edward F.; Daryabeigi, Kamran; Alderfer, David W.; Wright, Robert E.; Ahmed, Shaffiq
1989-01-01
Metallurgical studies were carried out on samples of iridium versus iridium/40-percent-rhodium thermocouples in the condition received from the manufacturer. The studies included x-ray, macroscopic, resistance, and metallographic examinations, and revealed a large amount of internal stress caused by cold-working during manufacturing, along with a large number of segregations and inhomogeneities. Samples were annealed in a furnace at temperatures from 1,000 to 2,000 degrees C for intervals up to 1 h to study the effects of heat treatment. Wire annealed by this procedure was found to be ductile.
Molecular epidemiology biomarkers-Sample collection and processing considerations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holland, Nina T.; Pfleger, Laura; Berger, Eileen
2005-08-07
Biomarker studies require processing and storage of numerous biological samples with the goals of obtaining a large amount of information and minimizing future research costs. An efficient study design includes provisions for processing of the original samples, such as cryopreservation, DNA isolation, and preparation of specimens for exposure assessment. Use of standard, two-dimensional and nanobarcodes and customized electronic databases assure efficient management of large sample collections and tracking results of data analyses. Standard operating procedures and quality control plans help to protect sample quality and to assure validity of the biomarker data. Specific state, federal and international regulations are in place regarding research with human samples, governing areas including custody, safety of handling, and transport of human samples. Appropriate informed consent must be obtained from the study subjects prior to sample collection and confidentiality of results maintained. Finally, examples of three biorepositories of different scale (European Cancer Study, National Cancer Institute and School of Public Health Biorepository, University of California, Berkeley) are used to illustrate challenges faced by investigators and the ways to overcome them. New software and biorepository technologies are being developed by many companies that will help to bring biological banking to a new level required by molecular epidemiology of the 21st century.
Rare Earth Element and Trace Element Data Associated with Hydrothermal Spring Reservoir Rock, Idaho
Quillinan, Scott; Bagdonas, Davin
2017-06-22
These data represent rock samples collected in Idaho that correspond with naturally occurring hydrothermal samples collected and analyzed by INL (Idaho Falls, ID). Representative samples of type rocks were selected to best represent the various regions of Idaho in which naturally occurring hydrothermal waters occur. This includes the Snake River Plain (SRP), Basin and Range type structures east of the SRP, and large-scale/deep-seated orogenic uplift of the Sawtooth Mountains, ID. Analysis includes ICP-OES and ICP-MS methods for major, trace, and rare earth element (REE) concentrations.
How Generalizable Is Your Experiment? An Index for Comparing Experimental Samples and Populations
ERIC Educational Resources Information Center
Tipton, Elizabeth
2014-01-01
Although a large-scale experiment can provide an estimate of the average causal impact for a program, the sample of sites included in the experiment is often not drawn randomly from the inference population of interest. In this article, we provide a generalizability index that can be used to assess the degree of similarity between the sample of…
ERIC Educational Resources Information Center
Anderson, J. M.
1978-01-01
A method is described for preparing large gelatine-embedded soil sections for ecological studies. Sampling methods reduce structural disturbance of the samples to a minimum and include freezing the samples in the field to kill soil invertebrates in their natural microhabitats. Projects are suggested for upper secondary school students. (Author/BB)
Anomalies in Trace Metal and Rare-Earth Loads below a Waste-Water Treatment Plant
NASA Astrophysics Data System (ADS)
Antweiler, R.; Writer, J. H.; Murphy, S.
2013-12-01
The changes in chemical loads were examined for 54 inorganic elements and compounds in a 5.4-km reach of Boulder Creek, Colorado downstream of a wastewater treatment plant (WWTP) outfall. Elements were partitioned into three categories: those showing a decrease in loading downstream, those showing an increase, and those which were conservative, at least over the length of the study reach. Declines in dissolved loads - generally indicative of in-stream loss via precipitation or sorption - were typically rapid, occurring largely before the first sampling site, 2.3 km downstream; elements showing this behavior were Bi, Cr, Cs, Ga, Ge, Hg, Se and Sn. These results were as expected before the experiment was performed. However, a large group (28 elements, including all the rare-earth elements (REE) except Gd) exhibited dissolved load increases, indicating in-stream gains. These gains may be due to particulate matter dissolving or disaggregating, or to desorption occurring below the WWTP. As with the in-stream loss group, the processes tended to be rapid, typically occurring before the first sampling site. Whole-water samples collected concurrently also had a large group of elements that showed an increase in load downstream of the WWTP. Among these were most of the elements whose dissolved loads increased, including all the REE except Gd. Because whole-water samples include both dissolved and suspended particulates, increases in their loads cannot be accounted for by invoking desorption or disaggregation mechanisms; thus, the only source for these increases is the bed load of the stream. Further, the difference between the whole-water and dissolved loads is a measure of the particulate load, and calculations show that not only did the dissolved and whole-water loads increase, but so did the particulate loads. This implies that at the time of sampling the bed sediment was supplying a significant contribution to the suspended load. In general, it seems untenable to suppose that the stream bed material can permanently supply the source of the in-stream load increases for a large group of inorganic elements. We propose that the anomalous increase in loads was more a function of the time of sampling (both diurnally and seasonally), and that sampling at a different time of day or in a different season would give results contradicting those seen here. If so, inorganic loading studies must include multiple samplings, both over the course of a day and across different seasons and flow regimes.
Nutrients and suspended sediments in streams and large rivers are two major issues facing state and federal agencies. Accurate estimates of nutrient and sediment loads are needed to assess a variety of important water-quality issues including total maximum daily loads, aquatic ec...
We measured the concentrations of 56 active pharmaceutical ingredients (APIs) and seven metabolites, including 50 prioritized APIs, in 24-hour composite effluent samples collected from 50 very large municipal wastewater treatment plants across the US. Hydrochlorothiazide was foun...
When the Test of Mediation is More Powerful than the Test of the Total Effect
O'Rourke, Holly P.; MacKinnon, David P.
2014-01-01
Although previous research has studied power in mediation models, the extent to which the inclusion of a mediator will increase power has not been investigated. First, a study compared analytical power of the mediated effect to the total effect in a single mediator model to identify the situations in which the inclusion of one mediator increased statistical power. Results from the first study indicated that including a mediator increased statistical power in small samples with large coefficients and in large samples with small coefficients, and when coefficients were non-zero and equal across models. Next, a study identified conditions where power was greater for the test of the total mediated effect compared to the test of the total effect in the parallel two mediator model. Results indicated that including two mediators increased power in small samples with large coefficients and in large samples with small coefficients, the same pattern of results found in the first study. Finally, a study assessed analytical power for a sequential (three-path) two mediator model and compared power to detect the three-path mediated effect to power to detect both the test of the total effect and the test of the mediated effect for the single mediator model. Results indicated that the three-path mediated effect had more power than the mediated effect from the single mediator model and the test of the total effect. Practical implications of these results for researchers are then discussed. PMID:24903690
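For reference, the quantities being compared in the single-mediator case (standard definitions consistent with, but not copied from, the paper): with a the X-to-M path and b the M-to-Y path adjusting for X, the mediated effect is ab and its usual first-order (Sobel) test statistic is

    z = \frac{\hat a \hat b}{\sqrt{\hat a^2 s_{\hat b}^2 + \hat b^2 s_{\hat a}^2}}

while the test of the total effect is z = \hat c / s_{\hat c} from regressing Y on X alone; the power comparisons above amount to asking which of these statistics is more often significant under a given design.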
Lin, Yu-Pin; Chu, Hone-Jay; Wang, Cheng-Long; Yu, Hsiao-Hsuan; Wang, Yung-Chieh
2009-01-01
This study applies variogram analyses of normalized difference vegetation index (NDVI) images derived from SPOT HRV images obtained before and after the ChiChi earthquake in the Chenyulan watershed, Taiwan, as well as images after four large typhoons, to delineate the spatial patterns, spatial structures and spatial variability of landscapes caused by these large disturbances. The conditional Latin hypercube sampling approach was applied to select samples from multiple NDVI images. Kriging and sequential Gaussian simulation with sufficient samples were then used to generate maps of NDVI images. The variography of NDVI image results demonstrate that spatial patterns of disturbed landscapes were successfully delineated by variogram analysis in study areas. The high-magnitude Chi-Chi earthquake created spatial landscape variations in the study area. After the earthquake, the cumulative impacts of typhoons on landscape patterns depended on the magnitudes and paths of typhoons, but were not always evident in the spatiotemporal variability of landscapes in the study area. The statistics and spatial structures of multiple NDVI images were captured by 3,000 samples from 62,500 grids in the NDVI images. Kriging and sequential Gaussian simulation with the 3,000 samples effectively reproduced spatial patterns of NDVI images. However, the proposed approach, which integrates the conditional Latin hypercube sampling approach, variogram, kriging and sequential Gaussian simulation in remotely sensed images, efficiently monitors, samples and maps the effects of large chronological disturbances on spatial characteristics of landscape changes including spatial variability and heterogeneity.
2013-01-01
Introduction: Small-study effects refer to the fact that trials with limited sample sizes are more likely to report larger beneficial effects than large trials. However, this has never been investigated in critical care medicine. Thus, the present study aimed to examine the presence and extent of small-study effects in critical care medicine.
Methods: Critical care meta-analyses that involved randomized controlled trials and reported mortality as an outcome measure were considered eligible for the study. Component trials were classified as large (≥100 patients per arm) and small (<100 patients per arm) according to their sample sizes. A ratio of odds ratios (ROR) was calculated for each meta-analysis, and the RORs were then combined using a meta-analytic approach. ROR < 1 indicated a larger beneficial effect in small trials. Small and large trials were compared in methodological qualities including sequence generation, blinding, allocation concealment, intention to treat and sample size calculation.
Results: A total of 27 critical care meta-analyses involving 317 trials were included. Of them, five meta-analyses showed statistically significant RORs < 1, and the other meta-analyses did not reach statistical significance. Overall, the pooled ROR was 0.60 (95% CI: 0.53 to 0.68); the heterogeneity was moderate with an I² of 50.3% (chi-squared = 52.30; P = 0.002). Large trials showed significantly better reporting quality than small trials in terms of sequence generation, allocation concealment, blinding, intention to treat, sample size calculation and incomplete follow-up data.
Conclusions: Small trials are more likely to report larger beneficial effects than large trials in critical care medicine, which could be partly explained by the lower methodological quality in small trials. Caution should be practiced in the interpretation of meta-analyses involving small trials. PMID:23302257
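Stated explicitly, the effect measure used above is (standard construction, symbols assumed)

    \mathrm{ROR} = \frac{\mathrm{OR}_{\text{small}}}{\mathrm{OR}_{\text{large}}}

computed within each meta-analysis from the pooled odds ratios of its small and large trials, with ROR < 1 indicating a larger apparent benefit in small trials; the log-RORs are then combined across meta-analyses, typically with a random-effects model, giving the pooled 0.60 reported above.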
National Databases for Neurosurgical Outcomes Research: Options, Strengths, and Limitations.
Karhade, Aditya V; Larsen, Alexandra M G; Cote, David J; Dubois, Heloise M; Smith, Timothy R
2017-08-05
Quality improvement, value-based care delivery, and personalized patient care depend on robust clinical, financial, and demographic data streams of neurosurgical outcomes, yet the neurosurgical literature lacks a comprehensive review of large national databases. The aim of this review was to assess the strengths and limitations of various resources for outcomes research in neurosurgery. A review of the literature was conducted to identify surgical outcomes studies using national data sets. The databases were assessed for the availability of patient demographics and clinical variables, longitudinal follow-up of patients, strengths, and limitations. The number of unique patients contained within each data set ranged from thousands (Quality Outcomes Database [QOD]) to hundreds of millions (MarketScan). Databases with both clinical and financial data included PearlDiver, Premier Healthcare Database, Vizient Clinical Data Base and Resource Manager, and the National Inpatient Sample. Outcomes collected by databases included patient-reported outcomes (QOD); 30-day morbidity, readmissions, and reoperations (National Surgical Quality Improvement Program); and disease incidence and disease-specific survival (Surveillance, Epidemiology, and End Results-Medicare). The strengths of large databases included large numbers of rare pathologies and multi-institutional, nationally representative sampling; the limitations of these databases included variable data veracity, variable data completeness, and missing disease-specific variables. The improvement of existing large national databases and the establishment of new registries will be crucial to the future of neurosurgical outcomes research.
LACIE large area acreage estimation. [United States of America
NASA Technical Reports Server (NTRS)
Chhikara, R. S.; Feiveson, A. H. (Principal Investigator)
1979-01-01
A sample-based wheat acreage estimate for a large area is obtained by multiplying its small-grains acreage estimate, as computed by the classification and mensuration subsystem, by the best available ratio of wheat to small-grains acreage obtained from historical data. In the United States, as in other countries with detailed historical data, an additional level of aggregation was required because sample allocation was made at the substratum level. The essential features of the estimation procedure for LACIE countries are included, along with procedures for estimating wheat acreage in the United States.
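The estimator described above can be written directly (symbols assumed here):

    \hat A_{\mathrm{wheat}} = \hat A_{\mathrm{sg}} \times
    \frac{A^{\mathrm{hist}}_{\mathrm{wheat}}}{A^{\mathrm{hist}}_{\mathrm{sg}}}

where \hat A_{\mathrm{sg}} is the small-grains acreage estimate from the classification and mensuration subsystem and the ratio is the best available historical proportion of wheat within small grains.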
User's guide for a large signal computer model of the helical traveling wave tube
NASA Technical Reports Server (NTRS)
Palmer, Raymond W.
1992-01-01
This guide describes the use of a successful large-signal, two-dimensional (axisymmetric), deformable-disk computer model of the helical traveling wave tube amplifier, an extensively revised and operationally simplified version. We also discuss program input and output and the auxiliary files necessary for operation. A sample problem, with its input data and output results, is included. Interested parties may now obtain from the author the FORTRAN source code, auxiliary files, and sample input data on a standard floppy diskette, the contents of which are described herein.
Gritsenko, Marina A; Xu, Zhe; Liu, Tao; Smith, Richard D
2016-01-01
Comprehensive, quantitative information on abundances of proteins and their posttranslational modifications (PTMs) can potentially provide novel biological insights into diseases pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labeling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.
Efficient computation of the joint sample frequency spectra for multiple populations.
Kamm, John A; Terhorst, Jonathan; Song, Yun S
2017-01-01
A wide range of studies in population genetics have employed the sample frequency spectrum (SFS), a summary statistic which describes the distribution of mutant alleles at a polymorphic site in a sample of DNA sequences and provides a highly efficient dimensional reduction of large-scale population genomic variation data. Recently, there has been much interest in analyzing the joint SFS data from multiple populations to infer parameters of complex demographic histories, including variable population sizes, population split times, migration rates, admixture proportions, and so on. SFS-based inference methods require accurate computation of the expected SFS under a given demographic model. Although much methodological progress has been made, existing methods suffer from numerical instability and high computational complexity when multiple populations are involved and the sample size is large. In this paper, we present new analytic formulas and algorithms that enable accurate, efficient computation of the expected joint SFS for thousands of individuals sampled from hundreds of populations related by a complex demographic model with arbitrary population size histories (including piecewise-exponential growth). Our results are implemented in a new software package called momi (MOran Models for Inference). Through an empirical study we demonstrate our improvements to numerical stability and computational complexity.
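For concreteness, the statistic whose expectation momi computes (standard definition, notation assumed here): for P populations with n_1, ..., n_P sampled sequences, the joint SFS is the array

    \xi_{i_1, \dots, i_P} = \#\{\text{segregating sites with } i_p
    \text{ derived copies in population } p,\; p = 1, \dots, P\}

and the methods above compute its expectation E[\xi] under a specified demographic model, which inference procedures then match against the observed spectrum.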
Associating Pregnancy with Partner Violence against Chinese Women
ERIC Educational Resources Information Center
Chan, Ko Ling; Brownridge, Douglas A.; Tiwari, Agnes; Fong, Daniel Y. T.; Leung, Wing Cheong; Ho, Pak Chung
2011-01-01
The present study examines whether pregnancy is a risk factor for intimate partner violence, using a large, representative sample containing detailed information on partner violence, including physical and sexual abuse, as well as perpetrator-related risk factors. Data from a representative sample of 2,225 men were analyzed. The self-reported prevalence…
NASA Technical Reports Server (NTRS)
Natesh, R.
1978-01-01
The various steps involved in obtaining quantitative information of structural defects in crystalline silicon samples are described. Procedures discussed include: (1) chemical polishing; (2) chemical etching; and (3) automated image analysis of samples on the QTM 720 System.
Tammas-Williams, S; Withers, P J; Todd, I; Prangnell, P B
2017-08-04
Without post-manufacture HIPing, the fatigue life of electron beam melting (EBM) additively manufactured parts is currently dominated by the presence of porosity and exhibits large amounts of scatter. Here we have shown that the size and location of these defects are crucial in determining the fatigue life of EBM Ti-6Al-4V samples. X-ray computed tomography has been used to characterise all the pores in fatigue samples prior to testing and to follow the initiation and growth of fatigue cracks. This shows that the initiation stage comprises a large fraction of life (>70%). In these samples the initiating defect was often some way from being the largest (merely within the top 35% of large defects). Using various ranking strategies including a range of parameters, we found that when the proximity to the surface and the pore aspect ratio were included, the actual initiating defect was within the top 3% of defects ranked most harmful. This lays the basis for considering how the deposition parameters can be optimised to ensure that the distribution of pores is tailored to the distribution of applied stresses in additively manufactured parts, to maximise the fatigue life for a given loading cycle.
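As an illustration of the ranking idea, the sketch below scores pores by a composite of size, surface proximity, and aspect ratio; the scoring function and weights are invented for illustration and are not the authors' actual ranking parameters.

```python
# Hedged sketch: rank pores by a composite "harm" score that, echoing the
# finding above, weights surface proximity and aspect ratio alongside size.
from dataclasses import dataclass

@dataclass
class Pore:
    diameter_um: float    # equivalent spherical diameter
    depth_um: float       # distance from the specimen surface
    aspect_ratio: float   # major/minor axis, >= 1

def harm_score(p: Pore) -> float:
    surface_factor = 1.0 / (1.0 + p.depth_um / p.diameter_um)  # near-surface pores score higher
    return p.diameter_um * p.aspect_ratio * surface_factor

pores = [Pore(80, 500, 1.2), Pore(40, 20, 2.5), Pore(120, 900, 1.0)]
for p in sorted(pores, key=harm_score, reverse=True):
    print(p, round(harm_score(p), 1))
```

Under this toy score the small, near-surface, elongated pore outranks the larger deep ones, mirroring the paper's observation that the initiating defect is often not the largest.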
SAS procedures for designing and analyzing sample surveys
Stafford, Joshua D.; Reinecke, Kenneth J.; Kaminski, Richard M.
2003-01-01
Complex surveys often are necessary to estimate occurrence (or distribution), density, and abundance of plants and animals for purposes of research and conservation. Most scientists are familiar with simple random sampling, where sample units are selected from a population of interest (sampling frame) with equal probability. However, the goal of ecological surveys often is to make inferences about populations over large or complex spatial areas where organisms are not homogeneously distributed or sampling frames are inconvenient or impossible to construct. Candidate sampling strategies for such complex surveys include stratified, multistage, and adaptive sampling (Thompson 1992, Buckland 1994).
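For concreteness, here is a minimal stratified-sampling estimator sketched in Python (a stand-in for the SAS procedures the paper covers); the strata, sizes, and data below are invented.

```python
# Stratified estimate of a population mean: weight each stratum mean by its
# population share; variance sums over strata with finite-population correction.
import numpy as np

strata = {                    # stratum -> (population size N_h, sampled values)
    "floodplain": (5_000, np.array([3.1, 2.8, 4.0, 3.5])),
    "upland":     (20_000, np.array([0.9, 1.2, 0.7, 1.1, 1.0])),
}
N = sum(N_h for N_h, _ in strata.values())
mean = sum(N_h / N * y.mean() for N_h, y in strata.values())
var = sum((N_h / N) ** 2 * y.var(ddof=1) / len(y) * (1 - len(y) / N_h)
          for N_h, y in strata.values())
print(f"stratified mean = {mean:.3f}, SE = {var ** 0.5:.3f}")
```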
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bailey, S.; Aldering, G.; Antilogus, P.
The use of Type Ia supernovae as distance indicators led to the discovery of the accelerating expansion of the universe a decade ago. Now that large second generation surveys have significantly increased the size and quality of the high-redshift sample, the cosmological constraints are limited by the currently available sample of ~50 cosmologically useful nearby supernovae. The Nearby Supernova Factory addresses this problem by discovering nearby supernovae and observing their spectrophotometric time development. Our data sample includes over 2400 spectra from spectral timeseries of 185 supernovae. This talk presents results from a portion of this sample including a Hubble diagram (relative distance vs. redshift) and a description of some analyses using this rich dataset.
Instrumentation of sampling aircraft for measurement of launch vehicle effluents
NASA Technical Reports Server (NTRS)
Wornom, D. E.; Woods, D. C.; Thomas, M. E.; Tyson, R. W.
1977-01-01
An aircraft was selected and instrumented to measure effluents emitted from large solid propellant rockets during launch activities. The considerations involved in aircraft selection, sampling probes, and instrumentation are discussed with respect to obtaining valid airborne measurements. Discussions of the data acquisition system used, the instrument power system, and operational sampling procedures are included. Representative measurements obtained from an actual rocket launch monitoring activity are also presented.
Laboratory Spectrometer for Wear Metal Analysis of Engine Lubricants.
1986-04-01
analysis, the acid digestion technique for sample pretreatment is the best approach available to date because of its relatively large sample size (1000...microliters or more). However, this technique has two major shortcomings limiting its application: (1) it requires the use of hydrofluoric acid (a...accuracy. Sample preparation including filtration or acid digestion may increase analysis times by 20 minutes or more. b. Repeatability In the analysis
ERIC Educational Resources Information Center
Poteat, V. Paul; Espelage, Dorothy L.; Koenig, Brian K.
2009-01-01
In this study, heterosexual students' willingness to remain friends with peers who disclose that they are gay or lesbian and their willingness to attend schools that include gay and lesbian students were examined among two large middle school and high school samples (Sample 1: n = 20,509; 50.7% girls; Sample 2: n = 16,917; 50.2% girls). Boys were…
NASA Technical Reports Server (NTRS)
Brunner, H.; Worrall, D. M.; Wilkes, Belinda J.; Elvis, Martin
1989-01-01
The dependence of the soft X-ray spectral slope on radio, optical, and X-ray properties, and on redshift, is reported for a large sample of Active Galactic Nuclei (AGN). The sample includes 317 optically and radio-selected AGN from a preliminary version of the Einstein Imaging Proportional Counter (IPC) quasar and AGN data base. The main results are: the difference in X-ray slope between radio-loud and radio-quiet AGN was confirmed for an independent and much larger sample of sources; a difference in X-ray slope between flat and steep radio spectrum AGN is observed only in the high-luminosity sub-sample; in flat radio spectrum AGNs there is an indication of a dependence of the X-ray spectral index on X-ray luminosity, redshift, and α_ox.
When the test of mediation is more powerful than the test of the total effect.
O'Rourke, Holly P; MacKinnon, David P
2015-06-01
Although previous research has studied power in mediation models, the extent to which the inclusion of a mediator will increase power has not been investigated. To address this deficit, in a first study we compared the analytical power values of the mediated effect and the total effect in a single-mediator model, to identify the situations in which the inclusion of one mediator increased statistical power. The results from this first study indicated that including a mediator increased statistical power in small samples with large coefficients and in large samples with small coefficients, and when coefficients were nonzero and equal across models. Next, we identified conditions under which power was greater for the test of the total mediated effect than for the test of the total effect in the parallel two-mediator model. These results indicated that including two mediators increased power in small samples with large coefficients and in large samples with small coefficients, the same pattern of results that had been found in the first study. Finally, we assessed the analytical power for a sequential (three-path) two-mediator model and compared the power to detect the three-path mediated effect to the power to detect both the test of the total effect and the test of the mediated effect for the single-mediator model. The results indicated that the three-path mediated effect had more power than the mediated effect from the single-mediator model and the test of the total effect. Practical implications of these results for researchers are then discussed.
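A Monte Carlo sketch of the single-mediator comparison is given below: it estimates the power of the test of the total effect (Y regressed on X) against a joint-significance test of the mediated effect (a and b paths both significant). The sample size and path coefficients are illustrative choices, not the paper's analytical conditions.

```python
# Power of the total-effect test vs. joint significance of the a and b paths
# in the single-mediator model M = a*X + e1, Y = c'*X + b*M + e2.
import numpy as np
from scipy import stats

def one_rep(rng, n=50, a=0.5, b=0.5, cprime=0.0):
    x = rng.normal(size=n)
    m = a * x + rng.normal(size=n)
    y = cprime * x + b * m + rng.normal(size=n)
    total_sig = stats.linregress(x, y).pvalue < 0.05       # test of total effect
    a_sig = stats.linregress(x, m).pvalue < 0.05           # a path
    X = np.column_stack([np.ones(n), x, m])                # b path via OLS of Y on X and M
    beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
    se_b = np.sqrt(res[0] / (n - 3) * np.linalg.inv(X.T @ X)[2, 2])
    b_sig = 2 * stats.t.sf(abs(beta[2] / se_b), n - 3) < 0.05
    return total_sig, a_sig and b_sig

rng = np.random.default_rng(1)
reps = [one_rep(rng) for _ in range(2000)]
print("power(total) =", np.mean([r[0] for r in reps]),
      "  power(mediated) =", np.mean([r[1] for r in reps]))
```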
The large sample size fallacy.
Lantz, Björn
2013-06-01
Significance in the statistical sense has little to do with significance in the common practical sense. Statistical significance is a necessary but not a sufficient condition for practical significance. Hence, results that are extremely statistically significant may be highly nonsignificant in practice. The degree of practical significance is generally determined by the size of the observed effect, not the p-value. The results of studies based on large samples are often characterized by extreme statistical significance despite small or even trivial effect sizes. Interpreting such results as significant in practice without further analysis is referred to as the large sample size fallacy in this article. The aim of this article is to explore the relevance of the large sample size fallacy in contemporary nursing research. Relatively few nursing articles display explicit measures of observed effect sizes or include a qualitative discussion of observed effect sizes. Statistical significance is often treated as an end in itself. Effect sizes should generally be calculated and presented along with p-values for statistically significant results, and observed effect sizes should be discussed qualitatively through direct and explicit comparisons with the effects in related literature. © 2012 Nordic College of Caring Science.
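The fallacy is easy to demonstrate numerically: with a large enough sample, a trivially small effect produces an extreme p-value. A minimal sketch with invented data:

```python
# A 0.04-SD difference in means is practically trivial, yet at n = 500,000 per
# group the t-test declares it overwhelmingly "significant".
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 500_000
a = rng.normal(0.00, 1, n)
b = rng.normal(0.04, 1, n)
t, p = stats.ttest_ind(a, b)
d = (b.mean() - a.mean()) / np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
print(f"p = {p:.2e}, Cohen's d = {d:.3f}")   # astronomically small p, trivial d
```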
Le Maréchal, M; Collange, F; Fressard, L; Peretti-Watel, P; Sebbah, R; Mikol, F; Agamaliyev, E; Gautier, A; Pulcini, C; Verger, P
2015-10-01
France is currently facing a vaccine-hesitancy crisis. We conducted a questionnaire-based telephone survey of a large sample of general practitioners (GPs), as they play a crucial role in the vaccination process. Our main objectives were to study the GPs' vaccination behaviors when it comes to their own vaccination and that of their relatives, and the vaccine recommendations they make to their patients. We also aimed to understand their opinions on the severity of vaccine-preventable diseases and to assess their trust in various sources of information. Finally, we enquired about their opinions on vaccination-related tools that could help them in their daily practice. This article presents the design of this panel and survey. Four samples of GPs (one national and three regional) were selected from all French GPs (metropolitan France) using random sampling. Five cross-sectional surveys are to be conducted with this panel. The mean targeted sample size is 2350 GPs for each survey. The survey dedicated to vaccination was conducted by telephone or on the Internet. GPs were included in the survey between December 2013 and February 2014. The national sample included 1582 GPs (response rate: 46%) and the three regional samples included 1297 GPs (response rate: 44%). The survey dedicated to vaccination was conducted between April and July 2014; the national sample response rate was 92% (1582/1712). The results of the first wave of surveys, conducted on a large sample of French GPs, provide important information to guide French vaccination policy. Copyright © 2015 Elsevier Masson SAS. All rights reserved.
NASA Astrophysics Data System (ADS)
Yoskowitz, Joshua; Clark, Morgan; Labrake, Scott; Vineyard, Michael
2015-10-01
We have developed an external beam facility for the 1.1-MV tandem Pelletron accelerator in the Union College Ion Beam Analysis Laboratory. The beam is extracted from an aluminum pipe through a 1/4-inch-diameter window with a 7.5-μm-thick Kapton foil. This external beam facility allows us to perform ion beam analysis on samples that cannot be put under vacuum, including wet samples and samples too large to fit into the scattering chamber. We have commissioned the new facility by performing proton induced X-ray emission (PIXE) analysis of several samples of environmental interest. These include samples of artificial turf, running tracks, and a human tooth with an amalgam filling. A 1.7-MeV external proton beam was incident on the samples positioned 2 cm from the window. The resulting X-rays were measured using a silicon drift detector and were analyzed using GUPIX software to determine the concentrations of elements in the samples. The results on the human tooth indicate that while significant concentrations of Hg, Ag, and Sn are present in the amalgam filling, only trace amounts of Hg appear to have leached into the tooth. The artificial turf and running tracks show rather large concentrations of a broad range of elements and trace amounts of Pb in the turf infill.
NASA Astrophysics Data System (ADS)
Hsu, L.; Lehnert, K. A.; Walker, J. D.; Chan, C.; Ash, J.; Johansson, A. K.; Rivera, T. A.
2011-12-01
Sample-based measurements in geochemistry are highly diverse, due to the large variety of sample types, measured properties, and idiosyncratic analytical procedures. In order to ensure the utility of sample-based data for re-use in research or education they must be associated with a high quality and quantity of descriptive, discipline-specific metadata. Without an adequate level of documentation, it is not possible to reproduce scientific results or have confidence in using the data for new research inquiries. The required detail in data documentation makes it challenging to aggregate large sets of data from different investigators and disciplines. One solution to this challenge is to build data systems with several tiers of intricacy, where the less detailed tiers are geared toward discovery and interoperability, and the more detailed tiers have higher value for data analysis. The Geoinformatics for Geochemistry (GfG) group, which is part of the Integrated Earth Data Applications facility (http://www.iedadata.org), has taken this approach to provide services for the discovery, access, and analysis of sample-based geochemical data for a diverse user community, ranging from the highly informed geochemist to non-domain scientists and undergraduate students. GfG builds and maintains three tiers in the sample based data systems, from a simple data catalog (Geochemical Resource Library), to a substantially richer data model for the EarthChem Portal (EarthChem XML), and finally to detailed discipline-specific data models for petrologic (PetDB), sedimentary (SedDB), hydrothermal spring (VentDB), and geochronological (GeoChron) samples. The data catalog, the lowest level in the hierarchy, contains the sample data values plus metadata only about the dataset itself (Dublin Core metadata such as dataset title and author), and therefore can accommodate the widest diversity of data holdings. The second level includes measured data values from the sample, basic information about the analytical method, and metadata about the samples such as geospatial information and sample type. The third and highest level includes detailed data quality documentation and more specific information about the scientific context of the sample. The three tiers are linked to allow users to quickly navigate to their desired level of metadata detail. Links are based on the use of unique identifiers: (a) DOI at the granularity of datasets, and (b) the International Geo Sample Number IGSN at the granularity of samples. Current developments in the GfG sample-based systems include new registry architecture for the IGSN to advance international implementation, growth and modification of EarthChemXML to include geochemical data for new sample types such as soils and liquids, and the construction of a hydrothermal vent data system. This flexible, tiered, model provides a solution for offering varying levels of detail in order to aggregate a large quantity of data and serve the largest user group of both disciplinary novices and experts.
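A toy sketch of the three-tier linkage follows; the field names are assumptions for illustration, not the actual EarthChemXML or discipline-database schemas.

```python
# Three metadata tiers joined by DOI (dataset granularity) and IGSN (sample
# granularity): catalog -> portal via DOI, portal -> discipline detail via IGSN.
from dataclasses import dataclass, field

@dataclass
class CatalogRecord:          # tier 1: Dublin Core-style discovery metadata only
    doi: str
    title: str
    author: str

@dataclass
class PortalRecord:           # tier 2: measured values plus basic context
    doi: str
    igsn: str
    analyte: str
    value: float
    units: str
    method: str
    lat: float
    lon: float

@dataclass
class DisciplineRecord:       # tier 3: full data-quality and scientific context
    igsn: str
    standards_run: list = field(default_factory=list)
    uncertainty: float = 0.0
    petrologic_context: str = ""
```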
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keifer, W.S.; Blumenthal, D.L.; Tommerdahl, J.B.
1981-09-01
As part of the EPRI sulfate regional experiment (SURE), Meteorology Research, Inc., (MRI) and Research Triangle Institute (RTI) conducted six air quality sampling programs in the eastern United States using instrumented aircraft. This volume includes the air quality and meteorological data obtained during the July 1978 Intensive when MRI sampled near the Duncan Falls, Ohio, SURE Station and RTI sampled near the Scranton, Pennsylvania, SURE Station. During the last part of the July 1978 sampling period, both MRI and RTI aircraft participated in a large regional-scale sampling program with Brookhaven National Laboratory (BNL) and Pacific Northwest Laboratory (PNL). Only the data obtained by the MRI and RTI aircraft during this regional-scale sampling program are included in this volume.
Houts, Carrie R; Edwards, Michael C; Wirth, R J; Deal, Linda S
2016-11-01
There has been a notable increase in the advocacy of using small-sample designs as an initial quantitative assessment of item and scale performance during the scale development process. This is particularly true in the development of clinical outcome assessments (COAs), where Rasch analysis has been advanced as an appropriate statistical tool for evaluating the developing COAs using a small sample. We review the benefits such methods are purported to offer from both a practical and statistical standpoint and detail several problematic areas, including both practical and statistical theory concerns, with respect to the use of quantitative methods, including Rasch-consistent methods, with small samples. The feasibility of obtaining accurate information and the potential negative impacts of misusing large-sample statistical methods with small samples during COA development are discussed.
ERIC Educational Resources Information Center
Sonnleitner, Philipp; Brunner, Martin; Keller, Ulrich; Martin, Romain
2014-01-01
Whereas the assessment of complex problem solving (CPS) has received increasing attention in the context of international large-scale assessments, its fairness in regard to students' cultural background has gone largely unexplored. On the basis of a student sample of 9th-graders (N = 299), including a representative number of immigrant students (N…
NASA Astrophysics Data System (ADS)
Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke
2017-08-01
In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupled with surface-enhanced Raman spectroscopy (SERS) was developed for rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consisted of a specially designed sampling cell, including a rigid sample container and a flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The LVCC sampling technique allowed a large, alterable, and well-controlled sampling volume, which kept the concentration of the gas target in the headspace phase constant during the entire sampling process and made the sampling result more representative. Moreover, absorption and derivatization of the gas target during the LVCC sampling process were efficiently merged in one step, using bromine-thiourea and OPA-NH4+ strategies for ethylene and SO2 respectively, which made the LVCC sampling technique conveniently adaptable to subsequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied to the rapid analysis of trace ethylene and SO2 from fruits. Trace ethylene and SO2 from real fruit samples could be accurately quantified by this method. The minor concentration fluctuations of ethylene and SO2 during the entire LVCC sampling process were shown to be < 4.3% and 2.1%, respectively. Good recoveries for ethylene and sulfur dioxide from fruit samples were achieved, in the ranges of 95.0-101% and 97.0-104% respectively. It is expected that the portable LVCC sampling technique will pave the way for rapid on-site analysis of accurate concentrations of trace gas targets from real samples by SERS.
Use of a temperature-programmable pre-separation column in the gas chromatographic injection port permits determination of a wide range of semi-volatile pesticides including organochlorines, organophosphates, triazines, and anilines in fatty composite dietary samples while reduci...
Sexual Abuse and Suicidality: Gender Differences in a Large Community Sample of Adolescents
ERIC Educational Resources Information Center
Martin, Graham; Bergen, Helen A.; Richardson, Angela S.; Roeger, Leigh; Allison, Stephen
2004-01-01
Objective: A cross-sectional study of gender specific relationships between self-reported child sexual abuse and suicidality in a community sample of adolescents. Method: Students aged 14 years on average (N=2,485) from 27 schools in South Australia completed a questionnaire including items on sexual abuse and suicidality, and measures of…
Characterization of the Theta to Beta Ratio in ADHD: Identifying Potential Sources of Heterogeneity
ERIC Educational Resources Information Center
Loo, Sandra K.; Cho, Alexander; Hale, T. Sigi; McGough, James; McCracken, James; Smalley, Susan L.
2013-01-01
Objective: The goal of this study is to characterize the theta to beta ratio (THBR) obtained from electroencephalogram (EEG) measures, in a large sample of community and clinical participants with regard to (a) ADHD diagnosis and subtypes, (b) common psychiatric comorbidities, and (c) cognitive correlates. Method: The sample includes 871…
A Large Ordinary Chondrite Shower in the Dominion Range
NASA Technical Reports Server (NTRS)
Satterwhite, C. E.; Righter, K.; Harrington, R.; McBride, K. M.; Funk, R.
2017-01-01
The US Antarctic Meteorite Program has visited the Dominion Range in the Transantarctic Mountains during several different seasons, including the 1985, 2003, 2008, 2010, and 2014 seasons. The total number of meteorites recovered from this region is over 2000. The 2008 and 2010 seasons have been fully classified, revealing the presence of a large meteorite shower that comprises approximately 60% of all samples recovered in those two seasons. The oil immersion classification suggests that this shower is LL chondrite material, whereas published magnetic susceptibility (MS; log chi) measurements yield L chondrite values. However, random sampling of a large collection like this would usually uncover EOC material, for which we have prepared thin sections. In this case, no LL chondrite materials have been found in thin section, suggesting that the shower might instead be an L chondrite. L and LL chondrites are notoriously difficult to distinguish using oil immersion techniques. To better characterize this large group of samples, we have decided to examine some of the large members of this group, using electron microprobe analysis (EMPA) of the olivines to verify the classifications. With a compositional link between this subset of samples and the MS measurements, we can more confidently classify the samples making up this pairing group. Subsequently, more accurate and meaningful comparisons may be drawn between this pairing group and other Antarctic pairing groups, such as those from the Queen Alexandra Range (QUE) and Lewis Cliffs Ice Tongue (LEW).
Dainer-Best, Justin; Lee, Hae Yeon; Shumake, Jason D; Yeager, David S; Beevers, Christopher G
2018-06-07
Although the self-referent encoding task (SRET) is commonly used to measure self-referent cognition in depression, many different SRET metrics can be obtained. The current study used best subsets regression with cross-validation and independent test samples to identify the SRET metrics most reliably associated with depression symptoms in three large samples: a college student sample (n = 572), a sample of adults from Amazon Mechanical Turk (n = 293), and an adolescent sample from a school field study (n = 408). Across all 3 samples, SRET metrics associated most strongly with depression severity included number of words endorsed as self-descriptive and rate of accumulation of information required to decide whether adjectives were self-descriptive (i.e., drift rate). These metrics had strong intratask and split-half reliability and high test-retest reliability across a 1-week period. Recall of SRET stimuli and traditional reaction time (RT) metrics were not robustly associated with depression severity. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
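The selection strategy, best subsets regression scored by cross-validation, can be sketched as follows; the data and metric names are placeholders, not the study's samples.

```python
# Exhaustively score every subset of candidate SRET metrics by 5-fold
# cross-validated R^2 and keep the best-performing subset.
from itertools import combinations
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 400
X = rng.normal(size=(n, 4))      # placeholder columns: endorsed, drift, RT, recall
y = 0.8 * X[:, 0] + 0.6 * X[:, 1] + rng.normal(size=n)
names = ["n_endorsed", "drift_rate", "mean_RT", "recall"]

best_score, best = -np.inf, None
for r in range(1, 5):
    for combo in combinations(range(4), r):
        score = cross_val_score(LinearRegression(), X[:, list(combo)], y, cv=5).mean()
        if score > best_score:
            best_score, best = score, combo
print("selected metrics:", [names[i] for i in best])
```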
Mariella, Jr., Raymond P.
2018-03-06
An isotachophoresis system for separating a sample containing particles into discrete packets including a flow channel, the flow channel having a large diameter section and a small diameter section; a negative electrode operably connected to the flow channel; a positive electrode operably connected to the flow channel; a leading carrier fluid in the flow channel; a trailing carrier fluid in the flow channel; and a control for separating the particles in the sample into discrete packets using the leading carrier fluid, the trailing carrier fluid, the large diameter section, and the small diameter section.
Berger, Philip; Messner, Michael J; Crosby, Jake; Vacs Renwick, Deborah; Heinrich, Austin
2018-05-01
Spore reduction can be used as a surrogate measure of Cryptosporidium natural filtration efficiency. Estimates of log10 (log) reduction were derived from spore measurements in paired surface and well water samples in Casper, Wyoming, and Kearney, Nebraska. We found that these data were suitable for testing the hypothesis (H0) that the average reduction at each site was 2 log or less, using a one-sided Student's t-test. After establishing data quality objectives for the test (expressed as tolerable Type I and Type II error rates), we evaluated the test's performance as a function of (a) the true log reduction, (b) the number of paired samples assayed, and (c) the variance of observed log reductions. We found that 36 paired spore samples are sufficient to achieve the objectives over a wide range of variance, including the variances observed in the two data sets. We also explored the feasibility of using smaller numbers of paired spore samples to supplement bioparticle counts for screening purposes in alluvial aquifers, to differentiate wells with large-volume surface-water-induced recharge from wells with negligible surface-water-induced recharge. With key assumptions, we propose a normal statistical test of the same hypothesis (H0), but with different performance objectives. As few as six paired spore samples appear adequate as a screening metric to supplement bioparticle counts to differentiate wells in alluvial aquifers with large-volume surface-water-induced recharge. For the case when all available information (including failure to reject H0 based on the limited paired spore data) leads to the conclusion that wells have large surface-water-induced recharge, we recommend further evaluation using additional paired biweekly spore samples. Published by Elsevier GmbH.
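A worked sketch of the proposed one-sided test, with invented paired log-reduction values:

```python
# H0: mean spore log reduction <= 2; reject only given evidence of > 2-log
# natural filtration. Values below are illustrative, not the site data.
import numpy as np
from scipy import stats

log_reductions = np.array([2.9, 3.1, 2.4, 3.3, 2.7, 3.0])   # paired surface-minus-well, log10
t, p_two_sided = stats.ttest_1samp(log_reductions, popmean=2.0)
p_one_sided = p_two_sided / 2 if t > 0 else 1 - p_two_sided / 2
print(f"t = {t:.2f}, one-sided p = {p_one_sided:.4f}")       # small p -> > 2-log reduction
```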
NASA Astrophysics Data System (ADS)
Breier, J. A.; Sheik, C. S.; Gomez-Ibanez, D.; Sayre-McCord, R. T.; Sanger, R.; Rauch, C.; Coleman, M.; Bennett, S. A.; Cron, B. R.; Li, M.; German, C. R.; Toner, B. M.; Dick, G. J.
2014-12-01
A new tool was developed for large-volume sampling to facilitate marine microbiology and biogeochemical studies. It was developed for remotely operated vehicle and hydrocast deployments, and allows for rapid collection of multiple sample types from the water column and dynamic, variable environments such as rising hydrothermal plumes. It was used successfully during a cruise to the hydrothermal vent systems of the Mid-Cayman Rise. The Suspended Particulate Rosette V2 large-volume multi-sampling system allows for the collection of 14 sample sets per deployment. Each sample set can include filtered material, whole (unfiltered) water, and filtrate. Suspended particulate can be collected on filters up to 142 mm in diameter and pore sizes down to 0.2 μm. Filtration is typically at flow rates of 2 L min⁻¹. For particulate material, filtered volume is constrained only by sampling time and filter capacity, with all sample volumes recorded by digital flowmeter. The suspended particulate filter holders can be filled with preservative and sealed immediately after sample collection. Up to 2 L of whole water, filtrate, or a combination of the two, can be collected as part of each sample set. The system is constructed of plastics with titanium fasteners and nickel alloy spring-loaded seals. There are no ferrous alloys in the sampling system. Individual sample lines are prefilled with filtered, deionized water prior to deployment and remain sealed unless a sample is actively being collected. This system is intended to facilitate studies concerning the relationship between marine microbiology and ocean biogeochemistry.
Software engineering the mixed model for genome-wide association studies on large samples.
Zhang, Zhiwu; Buckler, Edward S; Casstevens, Terry M; Bradbury, Peter J
2009-11-01
Mixed models improve the ability to detect phenotype-genotype associations in the presence of population stratification and multiple levels of relatedness in genome-wide association studies (GWAS), but for large data sets the resource consumption becomes impractical. At the same time, the sample size and number of markers used for GWAS is increasing dramatically, resulting in greater statistical power to detect those associations. The use of mixed models with increasingly large data sets depends on the availability of software for analyzing those models. While multiple software packages implement the mixed model method, no single package provides the best combination of fast computation, ability to handle large samples, flexible modeling and ease of use. Key elements of association analysis with mixed models are reviewed, including modeling phenotype-genotype associations using mixed models, population stratification, kinship and its estimation, variance component estimation, use of best linear unbiased predictors or residuals in place of raw phenotype, improving efficiency and software-user interaction. The available software packages are evaluated, and suggestions made for future software development.
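Two of the elements reviewed, marker-based kinship estimation and a mixed-model marker test, can be sketched as follows. For brevity the variance ratio is fixed rather than REML-estimated, so this is an illustration of the model structure, not any particular package's implementation.

```python
# Mixed model y = mu + x*beta + u + e with cov(u) = sigma_g^2 * K; test each
# marker by generalized least squares after inverting V = K + lambda*I.
import numpy as np

rng = np.random.default_rng(0)
n, m = 200, 500
G = rng.integers(0, 3, size=(n, m)).astype(float)    # 0/1/2 genotype matrix
Z = (G - G.mean(0)) / G.std(0)                       # standardized markers
K = Z @ Z.T / m                                      # marker-based kinship
y = 0.5 * Z[:, 0] + rng.multivariate_normal(np.zeros(n), 0.5 * K + 0.5 * np.eye(n))

lam = 1.0                                            # assumed sigma_e^2 / sigma_g^2
Vinv = np.linalg.inv(K + lam * np.eye(n))            # proportional to V^-1

def gls_marker_test(x):
    X = np.column_stack([np.ones(n), x])
    C = np.linalg.inv(X.T @ Vinv @ X)
    beta = C @ X.T @ Vinv @ y
    resid = y - X @ beta
    sigma2 = resid @ Vinv @ resid / (n - 2)          # absorbs the overall scale of V
    return beta[1], np.sqrt(sigma2 * C[1, 1])

b, se = gls_marker_test(G[:, 0])
print(f"marker 1: beta = {b:.3f}, z = {b / se:.2f}")
```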
NASA Astrophysics Data System (ADS)
Hurst, A.; Bowden, S. A.; Parnell, J.; Burchell, M. J.; Ball, A. J.
2007-12-01
There are a number of measurements relevant to planetary geology that can only be adequately performed by physically contacting a sample. This necessitates landing on the surface of a moon or planetary body, or returning samples to Earth. The need to physically contact a sample is particularly important in the case of measurements that could detect medium to low concentrations of large organic molecules present in surface materials. Large organic molecules, although a trace component of many meteoritic materials and rocks on the surface of Earth, carry crucial information concerning the processing of meteoritic material in the surface and subsurface environments, and can be crucial indicators for the presence of life. Unfortunately, landing on the surface of a small planetary body or moon is complicated, particularly if surface topography is only poorly characterised and the atmosphere is thin, thus requiring a propulsion system for a soft landing. One alternative to a surface landing may be to use an impactor launched from an orbiting spacecraft to launch material from the planet's surface and shallow sub-surface into orbit. Ejected material could then be collected by a follow-up spacecraft and analyzed. The mission scenario considered in the Europa-Ice Clipper mission proposal included both sample return and the analysis of captured particles. Employing such a sampling procedure to analyse large organic molecules is only viable if large organic molecules present in ices survive hypervelocity impacts (HVIs). To investigate the survival of large organic molecules in HVIs with icy bodies, a two-stage light gas gun was used to fire steel projectiles (1-1.5 mm diameter) at samples of water ice containing large organic molecules (amino acids, anthracene, and beta-carotene, a biological pigment) at velocities > 4.8 km/s. UV-VIS spectroscopy of ejected material detected beta-carotene, indicating that large organic molecules can survive hypervelocity impacts. These preliminary results are yet to be scaled up to a point where they can be accurately interpreted in the context of a likely mission scenario. However, they strongly indicate that in a low-mass payload mission scenario where a lander has been considered unfeasible, such a sampling strategy merits further consideration.
Bhaskar, Anand; Wang, Y X Rachel; Song, Yun S
2015-02-01
With the recent increase in study sample sizes in human genetics, there has been growing interest in inferring historical population demography from genomic variation data. Here, we present an efficient inference method that can scale up to very large samples, with tens or hundreds of thousands of individuals. Specifically, by utilizing analytic results on the expected frequency spectrum under the coalescent and by leveraging the technique of automatic differentiation, which allows us to compute gradients exactly, we develop a very efficient algorithm to infer piecewise-exponential models of the historical effective population size from the distribution of sample allele frequencies. Our method is orders of magnitude faster than previous demographic inference methods based on the frequency spectrum. In addition to inferring demography, our method can also accurately estimate locus-specific mutation rates. We perform extensive validation of our method on simulated data and show that it can accurately infer multiple recent epochs of rapid exponential growth, a signal that is difficult to pick up with small sample sizes. Lastly, we use our method to analyze data from recent sequencing studies, including a large-sample exome-sequencing data set of tens of thousands of individuals assayed at a few hundred genic regions. © 2015 Bhaskar et al.; Published by Cold Spring Harbor Laboratory Press.
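The model class being fit, a piecewise-exponential effective population size N(t), can be sketched directly; the epoch boundaries, sizes, and growth rates below are illustrative, whereas the paper infers them from the sample frequency spectrum.

```python
# N(t) with t = 0 at the present, increasing into the past. Each epoch lists
# (start time, size at epoch start, forward-time exponential growth rate).
import math

epochs = [(0.0, 1_000_000, 0.05), (100.0, 10_000, 0.0), (2_000.0, 20_000, 0.0)]

def pop_size(t: float) -> float:
    for start, size, rate in reversed(epochs):           # find the epoch containing t
        if t >= start:
            return size * math.exp(-rate * (t - start))  # growth forward in time = decay backward
    raise ValueError("t must be >= 0")

print(round(pop_size(0)), round(pop_size(50)), round(pop_size(500)))
```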
Conn, Kathleen E.; Black, Robert W.; Peterson, Norman T.; Senter, Craig A.; Chapman, Elena A.
2018-01-05
From August 2016 to March 2017, the U.S. Geological Survey (USGS) collected representative samples of filtered and unfiltered water and suspended sediment (including the colloidal fraction) at USGS streamgage 12113390 (Duwamish River at Golf Course, at Tukwila, Washington) during 13 periods of differing flow conditions. Samples were analyzed by Washington-State-accredited laboratories for a large suite of compounds, including metals, dioxins/furans, semivolatile compounds including polycyclic aromatic hydrocarbons, butyltins, the 209 polychlorinated biphenyl (PCB) congeners, and total and dissolved organic carbon. Concurrent with the chemistry sampling, water-quality field parameters were measured, and representative water samples were collected and analyzed for river suspended-sediment concentration and particle-size distribution. The results provide new data that can be used to estimate sediment and chemical loads transported by the Green River to the Lower Duwamish Waterway.
Modeling large woody debris recruitment for small streams of the Central Rocky Mountains
Don C. Bragg; Jeffrey L. Kershner; David W. Roberts
2000-01-01
As our understanding of the importance of large woody debris (LWD) evolves, planning for its production in riparian forest management is becoming more widely recognized. This report details the development of a model (CWD, version 1.4) that predicts LWD inputs, including descriptions of the field sampling used to parameterize parts of the model, the theoretical and...
Estimating snag and large tree densities and distributions on a landscape for wildlife management.
Lisa J. Bate; Edward O. Garton; Michael J. Wisdom
1999-01-01
We provide efficient and accurate methods for sampling snags and large trees on a landscape to conduct compliance and effectiveness monitoring for wildlife in relation to the habitat standards and guidelines on National Forests. Included online are the necessary spreadsheets, macros, and instructions to conduct all surveys and analyses pertaining to estimation of snag...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shin, Jaejin; Woo, Jong-Hak; Mulchaey, John S.
We perform a comprehensive study of X-ray cavities using a large sample of X-ray targets selected from the Chandra archive. The sample is selected to cover a large dynamic range including galaxy clusters, groups, and individual galaxies. Using β-modeling and unsharp masking techniques, we investigate the presence of X-ray cavities for 133 targets that have sufficient X-ray photons for analysis. We detect 148 X-ray cavities from 69 targets and measure their properties, including cavity size, angle, and distance from the center of the diffuse X-ray gas. We confirm the strong correlation between cavity size and distance from the X-ray center, similar to previous studies. We find that the detection rates of X-ray cavities are similar among galaxy clusters, groups and individual galaxies, suggesting that the formation mechanism of X-ray cavities is independent of environment.
Survey of Large Methane Emitters in North America
NASA Astrophysics Data System (ADS)
Deiker, S.
2017-12-01
It has been theorized that methane emissions in the oil and gas industry follow log-normal or "fat tail" distributions, with large numbers of small sources for every very large source. Such distributions would have significant policy and operational implications. Unfortunately, by their very nature such distributions would require large sample sizes to verify. Until recently, such large-scale studies would be prohibitively expensive. The largest public study to date sampled 450 wells, an order of magnitude too low to effectively constrain these models. During 2016 and 2017, Kairos Aerospace conducted a series of surveys with the LeakSurveyor imaging spectrometer, mounted on light aircraft. This small, lightweight instrument was designed to rapidly locate large emission sources. The resulting survey covers over three million acres of oil and gas production. This includes over 100,000 wells, thousands of storage tanks and over 7,500 miles of gathering lines. This data set now allows us to probe the distribution of large methane emitters. Results of this survey, and implications for methane emission distribution, methane policy, and LDAR, will be discussed.
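The fat-tail claim is easy to illustrate numerically: under a lognormal emitter distribution, a small fraction of sources carries most of the total emissions. The parameters below are invented, not fitted to the survey data.

```python
# Share of total emissions carried by the top 5% of sources under a
# heavy-tailed (lognormal) distribution of per-source emission rates.
import numpy as np

rng = np.random.default_rng(7)
rates = np.sort(rng.lognormal(mean=0.0, sigma=2.0, size=100_000))
top5_share = rates[-5_000:].sum() / rates.sum()
print(f"top 5% of sources emit {top5_share:.0%} of the total")
```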
Gerald, II, Rex E.; Sanchez, Jairo; Rathke, Jerome W.
2004-08-10
A video toroid cavity imager for in situ measurement of electrochemical properties of an electrolytic material sample includes a cylindrical toroid cavity resonator containing the sample and employs NMR and video imaging for providing high-resolution spectral and visual information of molecular characteristics of the sample on a real-time basis. A large magnetic field is applied to the sample under controlled temperature and pressure conditions to simultaneously provide NMR spectroscopy and video imaging capabilities for investigating electrochemical transformations of materials or the evolution of long-range molecular aggregation during cooling of hydrocarbon melts. The video toroid cavity imager includes a miniature commercial video camera with an adjustable lens, a modified compression coin cell imager with a flat circular principal detector element, and a sample mounted on a transparent circular glass disk, and provides NMR information as well as a video image of a sample, such as a polymer film, with micrometer resolution.
ERIC Educational Resources Information Center
Hua, Haiyan; Burchfield, Shirley
A large-scale longitudinal study in Bolivia examined the relationship between adult women's basic education and their social and economic well-being and development. A random sample of 1,600 participants and 600 nonparticipants, aged 15-45, was tracked for 3 years (the final sample included 717 participants and 224 controls). The four adult…
ERIC Educational Resources Information Center
Springer, Kristen W.; Sheridan, Jennifer; Kuo, Daphne; Carnes, Molly
2007-01-01
Objective: Child maltreatment has been linked to negative adult health outcomes; however, much past research includes only clinical samples of women, focuses exclusively on sexual abuse and/or fails to control for family background and childhood characteristics, both potential confounders. Further research is needed to obtain accurate,…
NASA Technical Reports Server (NTRS)
Natesh, R.; Smith, J. M.; Qidwai, H. A.
1979-01-01
The various steps involved in the chemical polishing and etching of silicon samples are described. Data on twins, dislocation pits, and grain boundaries from thirty-one (31) silicon samples are also discussed. A brief review of the changes made to upgrade the image analysis system is included.
ERIC Educational Resources Information Center
Gotham, Katherine; Marvin, Alison R.; Taylor, Julie Lounds; Warren, Zachary; Anderson, Connie M.; Law, Paul A.; Law, Jessica K.; Lipkin, Paul H.
2015-01-01
Using online survey data from a large sample of adults with autism spectrum disorder and legal guardians, we first report outcomes across a variety of contexts for participants with a wide range of functioning, and second, summarize these stakeholders' priorities for future research. The sample included n = 255 self-reporting adults with autism…
Area estimation using multiyear designs and partial crop identification
NASA Technical Reports Server (NTRS)
Sielken, R. L., Jr.
1984-01-01
Statistical procedures were developed for large area assessments using both satellite and conventional data. Crop acreages, other ground cover indices, and measures of change were the principal characteristics of interest. These characteristics are capable of being estimated from samples collected, possibly from several sources, at varying times and with different levels of identification. Multiyear analysis techniques were extended to include partially identified samples; the best current-year sampling design corresponding to a given sampling history was determined; weights reflecting the precision or confidence in each observation were identified and utilized; and the variation in estimates incorporating partially identified samples was quantified.
Rotor assembly and method for automatically processing liquids
Burtis, Carl A.; Johnson, Wayne F.; Walker, William A.
1992-01-01
A rotor assembly for performing a relatively large number of processing steps upon a sample, such as a whole blood sample, and a diluent, such as water, includes a rotor body for rotation about an axis and including a network of chambers within which various processing steps are performed upon the sample and diluent and passageways through which the sample and diluent are transferred. A transfer mechanism is movable through the rotor body by the influence of a magnetic field generated adjacent the transfer mechanism and movable along the rotor body, and the assembly utilizes centrifugal force, a transfer of momentum and capillary action to perform any of a number of processing steps such as separation, aliquoting, transference, washing, reagent addition and mixing of the sample and diluent within the rotor body. The rotor body is particularly suitable for automatic immunoassay analyses.
Evaluation of the Biological Sampling Kit (BiSKit) for Large-Area Surface Sampling
Buttner, Mark P.; Cruz, Patricia; Stetzenbach, Linda D.; Klima-Comba, Amy K.; Stevens, Vanessa L.; Emanuel, Peter A.
2004-01-01
Current surface sampling methods for microbial contaminants are designed to sample small areas and utilize culture analysis. The total number of microbes recovered is low because a small area is sampled, making detection of a potential pathogen more difficult. Furthermore, sampling of small areas requires a greater number of samples to be collected, which delays the reporting of results, taxes laboratory resources and staffing, and increases analysis costs. A new biological surface sampling method, the Biological Sampling Kit (BiSKit), designed to sample large areas and to be compatible with testing with a variety of technologies, including PCR and immunoassay, was evaluated and compared to other surface sampling strategies. In experimental room trials, wood laminate and metal surfaces were contaminated by aerosolization of Bacillus atrophaeus spores, a simulant for Bacillus anthracis, into the room, followed by settling of the spores onto the test surfaces. The surfaces were sampled with the BiSKit, a cotton-based swab, and a foam-based swab. Samples were analyzed by culturing, quantitative PCR, and immunological assays. The results showed that the large surface area (1 m²) sampled with the BiSKit resulted in concentrations of B. atrophaeus in samples that were up to 10-fold higher than the concentrations obtained with the other methods tested. A comparison of wet and dry sampling with the BiSKit indicated that dry sampling was more efficient (efficiency, 18.4%) than wet sampling (efficiency, 11.3%). The sensitivities of detection of B. atrophaeus on metal surfaces were 42 ± 5.8 CFU/m² for wet sampling and 100.5 ± 10.2 CFU/m² for dry sampling. These results demonstrate that the use of a sampling device capable of sampling larger areas results in higher sensitivity than that obtained with currently available methods and has the advantage of sampling larger areas, thus requiring collection of fewer samples per site. PMID:15574898
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iwamoto, A.; Mito, T.; Takahata, K.
Heat transfer of large copper plates (18 x 76 mm) in liquid helium has been measured as a function of orientation and treatment of the heat transfer surface. The results relate to applications of large scale superconductors. In order to clarify the influence of the area where the surface treatment peels off, the authors studied five types of heat transfer surface areas including: (a) 100% polished copper sample, (b) and (c) two 50% oxidized copper samples having different patterns of oxidation, (d) 75% oxidized copper sample, (e) 90% oxidized copper sample, and (f) 100% oxidized copper sample. They observed that the critical heat flux depends on the heat transfer surface orientation. The critical heat flux is a maximum at angles of 0°-30° and decreases monotonically with increasing angles above 30°, where the angle is taken in reference to the horizontal axis. On the other hand, the minimum heat flux is less dependent on the surface orientation. More than 75% oxidation on the surface makes the critical heat flux increase. The minimum heat fluxes of the 50 and 90% oxidized Cu samples approximately agree with that of the 100% oxidized Cu sample. Experiments and calculations show that the critical and the minimum heat fluxes are a bilinear function of the fraction of oxidized surface area.
Inclusion of angular momentum in FREYA
Randrup, Jørgen; Vogt, Ramona
2015-05-18
The event-by-event fission model FREYA generates large samples of complete fission events from which any observable can be extracted, including fluctuations of the observables and the correlations between them. We describe here how FREYA was recently refined to include angular momentum throughout. Subsequently we present some recent results for both neutron and photon observables.
ERIC Educational Resources Information Center
Tipton, Elizabeth; Fellers, Lauren; Caverly, Sarah; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Ruiz de Castilla, Veronica
2016-01-01
Recently, statisticians have begun developing methods to improve the generalizability of results from large-scale experiments in education. This work has included the development of methods for improved site selection when random sampling is infeasible, including the use of stratification and targeted recruitment strategies. This article provides…
Roeder, Peter; Gofton, Emma; Thornber, Carl
2006-01-01
The volume %, distribution, texture and composition of coexisting olivine, Cr-spinel and glass have been determined in quenched lava samples from Hawaii, Iceland and mid-oceanic ridges. The volume ratio of olivine to spinel varies from 60 to 2800, and samples with >0.02% spinel have a volume ratio of olivine to spinel of approximately 100. A plot of wt % MgO vs ppm Cr for natural and experimental basaltic glasses suggests that the general trend of the glasses can be explained by the crystallization of a cotectic ratio of olivine to spinel of about 100. One group of samples has an olivine to spinel ratio of approximately 100, with skeletal olivine phenocrysts and small (<50 μm) spinel crystals that tend to be spatially associated with the olivine phenocrysts. The large number of spinel crystals included within olivine phenocrysts is thought to be due to skeletal olivine phenocrysts coming into physical contact with spinel by synneusis during the chaotic conditions of ascent and extrusion. A second group of samples tends to have large olivine phenocrysts relatively free of included spinel, a few large (>100 μm) spinel crystals that show evidence of two stages of growth, and a volume ratio of olivine to spinel of 100 to well over 1000. The olivine and spinel in this group have crystallized more slowly with little physical interaction, and show evidence that they have accumulated in a magma chamber.
Upper atmosphere pollution measurements (GASP)
NASA Technical Reports Server (NTRS)
Rudey, R. A.; Holdeman, J. D.
1975-01-01
The environmental effects of engine effluents from future large fleets of aircraft operating in the stratosphere are discussed. Topics include: atmospheric properties, aircraft engine effluents, upper atmospheric measurements, global air sampling, and data reduction and analysis.
Stellato, Giuseppina; La Storia, Antonietta; De Filippis, Francesca; Borriello, Giorgia; Villani, Francesco; Ercolini, Danilo
2016-07-01
Microbial contamination in food processing plants can play a fundamental role in food quality and safety. The aims of this study were to learn more about the possible influence of the meat processing environment on initial fresh meat contamination and to investigate the differences between small-scale retail distribution (SD) and large-scale retail distribution (LD) facilities. Samples were collected from butcheries (n = 20), including LD (n = 10) and SD (n = 10) facilities, over two sampling campaigns. Samples included fresh beef and pork cuts and swab samples from the knife, the chopping board, and the butcher's hand. The microbiota of both meat samples and environmental swabs were very complex, including more than 800 operational taxonomic units (OTUs) collapsed at the species level. The 16S rRNA sequencing analysis showed that core microbiota were shared by 80% of the samples and included Pseudomonas spp., Streptococcus spp., Brochothrix spp., Psychrobacter spp., and Acinetobacter spp. Hierarchical clustering of the samples based on the microbiota showed a certain separation between meat and environmental samples, with higher levels of Proteobacteria in meat. In particular, levels of Pseudomonas and several Enterobacteriaceae members were significantly higher in meat samples, while Brochothrix, Staphylococcus, lactic acid bacteria, and Psychrobacter prevailed in environmental swab samples. Consistent clustering was also observed when metabolic activities were considered by predictive metagenomic analysis of the samples. An increase in carbohydrate metabolism was predicted for the environmental swabs and was consistently linked to Firmicutes, while increases in pathways related to amino acid and lipid metabolism were predicted for the meat samples and were positively correlated with Proteobacteria. Our results highlighted the importance of the processing environment in contributing to the initial microbial levels of meat and clearly showed that the type of retail facility (LD or SD) did not apparently affect the contamination. The study provides an in-depth description of the microbiota of meat and meat processing environments. It highlights the importance of the environment as a contamination source of spoilage bacteria, and it shows that the size of the retail facility does not affect the level and type of contamination. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
NASA Astrophysics Data System (ADS)
Tuckerman, Mark
2006-03-01
One of the computational grand challenge problems is to develop methodology capable of sampling conformational equilibria in systems with rough energy landscapes. If met, many important problems, most notably protein folding, could be significantly impacted. In this talk, two new approaches for addressing this problem will be presented. First, it will be shown how molecular dynamics can be combined with a novel variable transformation designed to warp configuration space in such a way that barriers are reduced and attractive basins stretched. This method rigorously preserves equilibrium properties while leading to very large enhancements in sampling efficiency. Extensions of this approach to the calculation/exploration of free energy surfaces will be discussed. Next, a new very large time-step molecular dynamics method will be introduced that overcomes the resonances which plague many molecular dynamics algorithms. The performance of the methods is demonstrated on a variety of systems including liquid water, long polymer chains, simple protein models, and oligopeptides.
Photometric Redshifts for the Large-Area Stripe 82X Multiwavelength Survey
NASA Astrophysics Data System (ADS)
Tasnim Ananna, Tonima; Salvato, Mara; Urry, C. Megan; LaMassa, Stephanie M.; STRIPE 82X
2016-06-01
The Stripe 82X survey currently includes 6000 X-ray sources in 31.3 square degrees of XMM-Newton and Chandra X-ray coverage, most of which are AGN. Using a maximum-likelihood approach, we identified optical and infrared counterparts in the SDSS, VHS K-band and WISE W1-band catalogs. The 1200 objects that had different best associations in different catalogs were checked by eye. Our most recent paper provided the multiwavelength catalogs for this sample. More than 1000 counterparts have spectroscopic redshifts, either from SDSS spectroscopy or our own follow-up program. Using the extensive multiwavelength data in this field, we provide photometric redshift estimates for most of the remaining sources, which are 80-90% accurate according to the training set. Our sample has a large number of candidates that are very faint in the optical and bright in the IR. We expect a large fraction of these objects to be the obscured AGN sample we need to complete the census of black hole growth at a range of redshifts.
Understanding Microplastic Distribution: A Global Citizen Monitoring Effort
NASA Astrophysics Data System (ADS)
Barrows, A.
2016-02-01
Understanding the distribution and abundance of microplastics in the world's oceans will continue to help inform global law-making. Through recruiting and training over 500 volunteers, our study has collected over 1000 samples from remote and populated areas worldwide. Samples include water collected at the sea surface and throughout the water column. Surface-to-depth sampling has provided insight into vertical plastic distribution. The development of unique field and laboratory methodology has enabled plastics to be quantified down to 50 µm. In 2015, the study expanded to include global freshwater systems. By understanding plastic patterns, distribution and concentration in large and small watersheds, we will better understand how freshwater systems are contributing to marine microplastic pollution.
MetaSRA: normalized human sample-specific metadata for the Sequence Read Archive.
Bernstein, Matthew N; Doan, AnHai; Dewey, Colin N
2017-09-15
The NCBI's Sequence Read Archive (SRA) promises great biological insight if one could analyze the data in the aggregate; however, the data remain largely underutilized, in part due to the poor structure of the metadata associated with each sample. The rules governing submissions to the SRA do not dictate a standardized set of terms that should be used to describe the biological samples from which the sequencing data are derived. As a result, the metadata include many synonyms, spelling variants and references to outside sources of information. Furthermore, manual annotation of the data remains intractable due to the large number of samples in the archive. For these reasons, it has been difficult to perform large-scale analyses that study the relationships between biomolecular processes and phenotype across diverse diseases, tissues and cell types present in the SRA. We present MetaSRA, a database of normalized SRA human sample-specific metadata following a schema inspired by the metadata organization of the ENCODE project. This schema involves mapping samples to terms in biomedical ontologies, labeling each sample with a sample-type category, and extracting real-valued properties. We automated these tasks via a novel computational pipeline. The MetaSRA is available at metasra.biostat.wisc.edu via both a searchable web interface and bulk downloads. Software implementing our computational pipeline is available at http://github.com/deweylab/metasra-pipeline. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
Jjunju, Fred P M; Maher, Simon; Damon, Deidre E; Barrett, Richard M; Syed, S U; Heeren, Ron M A; Taylor, Stephen; Badu-Tawiah, Abraham K
2016-01-19
Direct analysis and identification of the long chain aliphatic primary diamine Duomeen O (n-oleyl-1,3-diaminopropane), a corrosion inhibitor, in raw water samples taken from a large medium pressure water tube boiler plant at low LODs (<0.1 pg) has been demonstrated for the first time, without any sample preparation, using paper spray mass spectrometry (PS-MS). The presence of Duomeen O in water samples was confirmed via tandem mass spectrometry using collision-induced dissociation and supported by exact mass measurement and reactive paper spray experiments using an LTQ Orbitrap Exactive instrument. Data shown herein indicate that paper spray ambient ionization can be readily used as a rapid and robust method for in situ direct analysis of polyamine corrosion inhibitors in an industrial water boiler plant and other related samples in the water treatment industry. This approach was applied to the analysis of three complex water samples, including feedwater, condensate water, and boiler water, all collected from large medium pressure (MP) water tube boiler plants known to be dosed with varying amounts of polyamine and amine corrosion inhibitor components. Polyamine chemistry is widely used, for example, in large high pressure (HP) boilers operating in municipal waste and recycling facilities to prevent corrosion of metals. The samples used in this study are from one such facility, the Coventry waste treatment facility, U.K., which has 3 × 40 tonne/hour boilers operating at 17.5 bar.
Effects of Interim Assessments on Student Achievement: Evidence from a Large-Scale Experiment
ERIC Educational Resources Information Center
Konstantopoulos, Spyros; Miller, Shazia R.; van der Ploeg, Arie; Li, Wei
2016-01-01
We use data from a large-scale, school-level randomized experiment conducted in 2010-2011 in public schools in Indiana. Our sample includes more than 30,000 students in 70 schools. We examine the impact of two interim assessment programs (i.e., mCLASS in Grades K-2 and Acuity in Grades 3-8) on mathematics and reading achievement. Two-level models…
Ren, Xinxin; Liu, Jia; Zhang, Chengsen; Luo, Hai
2013-03-15
With the rapid development of ambient mass spectrometry, the hybrid laser-based ambient ionization methods which can generate multiply charged ions of large biomolecules and also characterize small molecules with good signal-to-noise in both positive and negative ion modes are of particular interest. An ambient ionization method termed high-voltage-assisted laser desorption ionization (HALDI) is developed, in which a 1064 nm laser is used to desorb various liquid samples from the sample target biased at a high potential without the need for an organic matrix. The pre-charged liquid samples are desorbed by the laser to form small charged droplets which may undergo an electrospray-like ionization process to produce multiply charged ions of large biomolecules. Various samples including proteins, oligonucleotides (ODNs), drugs, whole milk and chicken eggs have been analyzed by HALDI-MS in both positive and negative ion mode with little or no sample preparation. In addition, HALDI can generate intense signals with better signal-to-noise in negative ion mode than laser desorption spray post-ionization (LDSPI) from the same samples, such as ODNs and some carboxylic-group-containing small drug molecules. HALDI-MS can directly analyze a variety of liquid samples including proteins, ODNs, pharmaceuticals and biological fluids in both positive and negative ion mode without the use of an organic matrix. This technique may be further developed into a useful tool for rapid analysis in many different fields such as pharmaceutical, food, and biological sciences. Copyright © 2013 John Wiley & Sons, Ltd.
ERIC Educational Resources Information Center
Fredricks, Jennifer A.; Eccles, Jacquelynne S.
2008-01-01
In this study, we examined the associations between organized activity participation during early adolescence and adjustment in a large and economically diverse sample of African American and European American youth. The sample included 1,047 youth (51% female and 49% male and 67% African American and 33% European American). We used analysis of…
Who Is at Greatest Risk of Adverse Long-Term Outcomes? The Finnish from a Boy to a Man Study
ERIC Educational Resources Information Center
Sourander, Andre; Jensen, Peter; Davies, Mark; Niemela, Solja; Elonheimo, Henrik; Ristkari, Terja; Helenius, Hans; Sillanmaki, Lauri; Piha, Jorma; Kumpulainen, Kirsti; Tamminen, Tuula; Moilanen, Irma; Almqvist, Fredrik
2007-01-01
Objective: To study associations between comorbid psychopathology and long-term outcomes in a large birth cohort sample from age 8 to early adulthood. Method: The sample included long-term outcome data on 2,556 Finnish boys born in 1981. The aim was to study the impact of early childhood psychopathology types (externalizing versus internalizing…
MaCH-Admix: Genotype Imputation for Admixed Populations
Liu, Eric Yi; Li, Mingyao; Wang, Wei; Li, Yun
2012-01-01
Imputation in admixed populations is an important problem but challenging due to the complex linkage disequilibrium (LD) pattern. The emergence of large reference panels such as that from the 1000 Genomes Project enables more accurate imputation in general, and in particular for admixed populations and for uncommon variants. To efficiently benefit from these large reference panels, one key issue to consider in modern genotype imputation framework is the selection of effective reference panels. In this work, we consider a number of methods for effective reference panel construction inside a hidden Markov model and specific to each target individual. These methods fall into two categories: identity-by-state (IBS) based and ancestry-weighted approach. We evaluated the performance on individuals from recently admixed populations. Our target samples include 8,421 African Americans and 3,587 Hispanic Americans from the Women’s Health Initiative, which allow assessment of imputation quality for uncommon variants. Our experiments include both large and small reference panels; large, medium, and small target samples; and in genome regions of varying levels of LD. We also include BEAGLE and IMPUTE2 for comparison. Experiment results with large reference panel suggest that our novel piecewise IBS method yields consistently higher imputation quality than other methods/software. The advantage is particularly noteworthy among uncommon variants where we observe up to 5.1% information gain with the difference being highly significant (Wilcoxon signed rank test P-value < 0.0001). Our work is the first that considers various sensible approaches for imputation in admixed populations and presents a comprehensive comparison. PMID:23074066
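A rough sketch of the piecewise IBS idea described above is to score each reference haplotype by allele sharing with the target within local windows and keep the top scorers per window. This is not the MaCH-Admix implementation itself; the function name, window size, and panel size below are illustrative assumptions.

```python
import numpy as np

def select_reference(target, ref_panel, window=1000, k=200):
    """target: (L,) 0/1 haplotype array; ref_panel: (H, L) 0/1 array.
    Returns sorted indices of reference haplotypes chosen in any window."""
    L = target.shape[0]
    chosen = set()
    for start in range(0, L, window):
        seg = slice(start, min(start + window, L))
        # IBS sharing: number of sites identical by state in this window
        ibs = (ref_panel[:, seg] == target[seg]).sum(axis=1)
        chosen.update(np.argsort(ibs)[-k:].tolist())   # keep top-k most similar
    return sorted(chosen)
```

Selecting locally rather than genome-wide is what lets a single admixed target draw on different ancestries in different regions.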
Extensive Core Microbiome in Drone-Captured Whale Blow Supports a Framework for Health Monitoring.
Apprill, Amy; Miller, Carolyn A; Moore, Michael J; Durban, John W; Fearnbach, Holly; Barrett-Lennard, Lance G
2017-01-01
The pulmonary system is a common site for bacterial infections in cetaceans, but very little is known about their respiratory microbiome. We used a small, unmanned hexacopter to collect exhaled breath condensate (blow) from two geographically distinct populations of apparently healthy humpback whales (Megaptera novaeangliae), sampled in the Massachusetts coastal waters off Cape Cod (n = 17) and coastal waters around Vancouver Island (n = 9). Bacterial and archaeal small-subunit rRNA genes were amplified and sequenced from blow samples, including many of sparse volume, as well as seawater and other controls, to characterize the associated microbial community. The blow microbiomes were distinct from the seawater microbiomes and included 25 phylogenetically diverse bacteria common to all sampled whales. This core assemblage comprised on average 36% of the microbiome, making it one of the more consistent animal microbiomes studied to date. The closest phylogenetic relatives of 20 of these core microbes were previously detected in marine mammals, suggesting that this core microbiome assemblage is specialized for marine mammals and may indicate a healthy, noninfected pulmonary system. Pathogen screening was conducted on the microbiomes at the genus level, which showed that all blow and few seawater microbiomes contained relatives of bacterial pathogens; no known cetacean respiratory pathogens were detected in the blow. Overall, the discovery of a shared large core microbiome in humpback whales is an important advancement for health and disease monitoring of this species and of other large whales. IMPORTANCE The conservation and management of large whales rely in part upon health monitoring of individuals and populations, and methods generally necessitate invasive sampling. Here, we used a small, unmanned hexacopter drone to noninvasively fly above humpback whales from two populations, capture their exhaled breath (blow), and examine the associated microbiome. In the first extensive examination of the large-whale blow microbiome, we present surprising results about the discovery of a large core microbiome that was shared across individual whales from geographically separated populations in two ocean basins. We suggest that this core microbiome, in addition to other microbiome characteristics, could be a useful feature for health monitoring of large whales worldwide.
The prevalence of terraced treescapes in analyses of phylogenetic data sets.
Dobrin, Barbara H; Zwickl, Derrick J; Sanderson, Michael J
2018-04-04
The pattern of data availability in a phylogenetic data set may lead to the formation of terraces, collections of equally optimal trees. Terraces can arise in tree space if trees are scored with parsimony or with partitioned, edge-unlinked maximum likelihood. Theory predicts that terraces can be large, but their prevalence in contemporary data sets has never been surveyed. We selected 26 data sets and phylogenetic trees reported in recent literature and investigated the terraces to which the trees would belong, under a common set of inference assumptions. We examined terrace size as a function of the sampling properties of the data sets, including taxon coverage density (the proportion of taxon-by-gene positions with any data present) and a measure of gene sampling "sufficiency". We evaluated each data set in relation to the theoretical minimum gene sampling depth needed to reduce terrace size to a single tree, and explored the impact of the terraces found in replicate trees in bootstrap methods. Terraces were identified in nearly all data sets with taxon coverage densities < 0.90. They were not found, however, in high-coverage-density (i.e., ≥ 0.94) transcriptomic and genomic data sets. The terraces could be very large, and size varied inversely with taxon coverage density and with gene sampling sufficiency. Few data sets achieved a theoretical minimum gene sampling depth needed to reduce terrace size to a single tree. Terraces found during bootstrap resampling reduced overall support. If certain inference assumptions apply, trees estimated from empirical data sets often belong to large terraces of equally optimal trees. Terrace size correlates to data set sampling properties. Data sets seldom include enough genes to reduce terrace size to one tree. When bootstrap replicate trees lie on a terrace, statistical support for phylogenetic hypotheses may be reduced. Although some of the published analyses surveyed were conducted with edge-linked inference models (which do not induce terraces), unlinked models have been used and advocated. The present study describes the potential impact of that inference assumption on phylogenetic inference in the context of the kinds of multigene data sets now widely assembled for large-scale tree construction.
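Taxon coverage density, as defined above, is simply the fraction of taxon-by-gene cells that contain data. A minimal sketch follows; the matrix is illustrative, not the study's data.

```python
import numpy as np

def coverage_density(presence):
    """presence: (n_taxa, n_genes) boolean matrix, True where a taxon
    has sequence data for a gene."""
    return presence.mean()

presence = np.array([[True, True, False],
                     [True, False, False],
                     [True, True, True]])
print(coverage_density(presence))   # 6/9 ~ 0.67, below the ~0.90 threshold where terraces appeared
```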
Wickham, J.D.; Stehman, S.V.; Smith, J.H.; Wade, T.G.; Yang, L.
2004-01-01
Two-stage cluster sampling reduces the cost of collecting accuracy assessment reference data by constraining sample elements to fall within a limited number of geographic domains (clusters). However, because classification error is typically positively spatially correlated, within-cluster correlation may reduce the precision of the accuracy estimates. The detailed population information needed to quantify a priori the effect of within-cluster correlation on precision is typically unavailable. Consequently, a convenient, practical approach to evaluate the likely performance of a two-stage cluster sample is needed. We describe such an a priori evaluation protocol, focusing on the spatial distribution of the sample by land-cover class across different cluster sizes and on the costs of different sampling options, including options not imposing clustering. This protocol also assesses the two-stage design's adequacy for estimating the precision of accuracy estimates for rare land-cover classes. We illustrate the approach using two large-area, regional accuracy assessments from the National Land-Cover Data (NLCD), and describe how the a priori evaluation was used as a decision-making tool when implementing the NLCD design.
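For illustration, the basic two-stage design being evaluated can be sketched as follows: select clusters first, then sample elements within each selected cluster. Cluster definitions, names, and sizes here are hypothetical; this is not the NLCD protocol itself.

```python
import random

def two_stage_sample(clusters, n_clusters, n_per_cluster, seed=1):
    """clusters: dict mapping cluster id -> list of candidate sample elements.
    Stage 1 draws clusters; stage 2 draws elements within each drawn cluster."""
    rng = random.Random(seed)
    stage1 = rng.sample(sorted(clusters), n_clusters)
    return {c: rng.sample(clusters[c], min(n_per_cluster, len(clusters[c])))
            for c in stage1}
```

The cost saving comes from confining field or reference-data collection to the stage-1 clusters; the precision penalty comes from spatial correlation of classification error within them.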
Rotor assembly and method for automatically processing liquids
Burtis, C.A.; Johnson, W.F.; Walker, W.A.
1992-12-22
A rotor assembly is described for performing a relatively large number of processing steps upon a sample, such as a whole blood sample, and a diluent, such as water. It includes a rotor body for rotation about an axis, a network of chambers within which various processing steps are performed upon the sample and diluent, and passageways through which the sample and diluent are transferred. A transfer mechanism is movable through the rotor body by the influence of a magnetic field generated adjacent to the transfer mechanism and movable along the rotor body, and the assembly utilizes centrifugal force, a transfer of momentum, and capillary action to perform any of a number of processing steps such as separation, aliquoting, transference, washing, reagent addition, and mixing of the sample and diluent within the rotor body. The rotor body is particularly suitable for automatic immunoassay analyses. 34 figs.
Measuring Adolescent Social and Academic Self-Efficacy: Cross-Ethnic Validity of the SEQ-C
ERIC Educational Resources Information Center
Minter, Anthony; Pritzker, Suzanne
2017-01-01
Objective: This study examines the psychometric strength, including cross-ethnic validity, of two subscales of Muris' Self-Efficacy Questionnaire for Children: Academic Self-Efficacy (ASE) and Social Self-Efficacy (SSE). Methods: A large ethnically diverse sample of 3,358 early and late adolescents completed surveys including the ASE and SSE.…
Guide for preparing active solar heating systems operation and maintenance manuals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1991-01-01
This book presents a systematic and standardized approach to the preparation of operation and maintenance manuals for active solar heating systems. It provides an industry consensus on the best operating and maintenance procedures for large commercial-scale solar service water and space heating systems. A sample O&M manual is included (3-ring binder).
Undergraduates Who Do Not Apply for Financial Aid. Data Point. NCES 2016-406
ERIC Educational Resources Information Center
Ifill, Nicole
2016-01-01
This report is based on data from the 2011-12 National Postsecondary Student Aid Study (NPSAS:12), a large, nationally representative sample survey of students that focuses on how they finance their education. NPSAS includes data on the application for and receipt of financial aid, including grants, loans, assistantships, scholarships,…
Interpretation and Utilization of Scores on the Air Force Officer Qualifying Test.
ERIC Educational Resources Information Center
Miller, Robert E.
The report summarizes a large body of data relevant to the proper interpretation and use of aptitude scores on the Air Force Officer Qualifying Test (AFOQT). Included are descriptions of the AFOQT testing program and the test itself. Technical data include an extensive sampling of validation studies covering predictors of success in pilot…
[The research protocol III. Study population].
Arias-Gómez, Jesús; Villasís-Keever, Miguel Ángel; Miranda-Novales, María Guadalupe
2016-01-01
The study population is defined as a set of cases, determined, limited, and accessible, that will constitute the subjects for the selection of the sample and that must fulfill several distinct characteristics and criteria. The objectives of this manuscript are focused on specifying each one of the elements required to select the participants of a research project during the elaboration of the protocol, including the concepts of study population, sample, selection criteria and sampling methods. After delineating the study population, the researcher must specify the criteria that each participant has to meet. The criteria that comprise the specific characteristics are termed selection or eligibility criteria. These criteria are inclusion, exclusion and elimination criteria, and will delineate the eligible population. The sampling methods are divided into two large groups: 1) probabilistic or random sampling and 2) non-probabilistic sampling. The difference lies in the use of statistical methods to select the subjects. In every research project, it is necessary to establish at the outset the specific number of participants to be included to achieve the objectives of the study. This number is the sample size, which can be calculated or estimated with mathematical formulas and statistical software.
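As a worked illustration of that last point (a common textbook expression, not a formula quoted from the article), the sample size needed to estimate a proportion p with absolute precision d at confidence level 1 − α is:

```latex
n \;=\; \frac{z_{1-\alpha/2}^{\,2}\; p\,(1-p)}{d^{2}}
```

For example, with p = 0.5, d = 0.05 and z = 1.96, this gives n ≈ 385 participants before any adjustment for expected losses.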
Analysing home-ownership of couples: the effect of selecting couples at the time of the survey.
Mulder, C H
1996-09-01
"The analysis of events encountered by couple and family households may suffer from sample selection bias when data are restricted to couples existing at the moment of interview. The paper discusses the effect of sample selection bias on event history analyses of buying a home [in the Netherlands] by comparing analyses performed on a sample of existing couples with analyses of a more complete sample including past as well as current partner relationships. The results show that, although home-buying in relationships that have ended differs clearly from behaviour in existing relationships, sample selection bias is not alarmingly large." (SUMMARY IN FRE) excerpt
Loughland, Carmel; Draganic, Daren; McCabe, Kathryn; Richards, Jacqueline; Nasir, Aslam; Allen, Joanne; Catts, Stanley; Jablensky, Assen; Henskens, Frans; Michie, Patricia; Mowry, Bryan; Pantelis, Christos; Schall, Ulrich; Scott, Rodney; Tooney, Paul; Carr, Vaughan
2010-11-01
This article describes the establishment of the Australian Schizophrenia Research Bank (ASRB), which operates to collect, store and distribute linked clinical, cognitive, neuroimaging and genetic data from a large sample of people with schizophrenia and healthy controls. Recruitment sources for the schizophrenia sample include a multi-media national advertising campaign, inpatient and community treatment services and non-government support agencies. Healthy controls have been recruited primarily through multi-media advertisements. All participants undergo an extensive diagnostic and family history assessment, neuropsychological evaluation, and blood sample donation for genetic studies. Selected individuals also complete structural MRI scans. Preliminary analyses of 493 schizophrenia cases and 293 healthy controls are reported. Mean age was 39.54 years (SD = 11.1) for the schizophrenia participants and 37.38 years (SD = 13.12) for healthy controls. Compared to the controls, features of the schizophrenia sample included a higher proportion of males (cases 65.9%; controls 46.8%), fewer living in married or de facto relationships (cases 16.1%; controls 53.6%) and fewer years of education (cases 13.05, SD = 2.84; controls 15.14, SD = 3.13), as well as lower current IQ (cases 102.68, SD = 15.51; controls 118.28, SD = 10.18). These and other sample characteristics are compared to those reported in another large Australian sample (i.e. the Low Prevalence Disorders Study), revealing some differences that reflect the different sampling methods of these two studies. The ASRB is a valuable and accessible schizophrenia research facility for use by approved scientific investigators. As recruitment continues, the approach to sampling for both cases and controls will need to be modified to ensure that the ASRB samples are as broadly representative as possible of all cases of schizophrenia and healthy controls.
Oblinger, Carolyn J.
2004-01-01
The Triangle Area Water Supply Monitoring Project was initiated in October 1988 to provide long-term water-quality data for six area water-supply reservoirs and their tributaries. In addition, the project provides data that can be used to determine the effectiveness of large-scale changes in water-resource management practices, document differences in water quality among water-supply types (large multiuse reservoir, small reservoir, run-of-river), and provide tributary-loading and in-lake data for water-quality modeling of Falls and Jordan Lakes. By September 2001, the project had progressed in four phases and included as many as 34 sites (in 1991). Most sites were sampled and analyzed by the U.S. Geological Survey. Some sites were already a part of the North Carolina Division of Water Quality statewide ambient water-quality monitoring network and were sampled by the Division of Water Quality. The network has provided data on streamflow, physical properties, and concentrations of nutrients, major ions, metals, trace elements, chlorophyll, total organic carbon, suspended sediment, and selected synthetic organic compounds. Project quality-assurance activities include written procedures for sample collection, record management and archive, collection of field quality-control samples (blank samples and replicate samples), and monitoring the quality of field supplies. In addition to project quality-assurance activities, the quality of laboratory analyses was assessed through laboratory quality-assurance practices and an independent laboratory quality-control assessment provided by the U.S. Geological Survey Branch of Quality Systems through the Blind Inorganic Sample Project and the Organic Blind Sample Project.
Delgado, Alejandra; Posada-Ureta, Oscar; Olivares, Maitane; Vallejo, Asier; Etxebarria, Nestor
2013-12-15
In this study, priority organic pollutants usually found in environmental water samples were considered in two extraction and analysis approaches. These included organochlorine compounds, pesticides, phthalates, phenols, and residues of pharmaceutical and personal care products. The extraction and analysis steps were based on silicone rod (SR) extraction followed by liquid desorption in combination with large volume injection-programmable temperature vaporiser (LVI-PTV) and gas chromatography-mass spectrometry (GC-MS). Variables affecting the analytical response as a function of the programmable temperature vaporiser (PTV) parameters were first optimised following an experimental design approach. The SR extraction and desorption conditions were assessed afterwards, including matrix modification, extraction time, and stripping solvent composition. Subsequently, the possibility of performing membrane enclosed sorptive coating (MESCO) extraction as a modified extraction approach was also evaluated. The optimised method showed low method detection limits (3-35 ng L(-1)), acceptable accuracy (78-114%) and precision values (<13%) for most of the studied analytes regardless of the aqueous matrix. Finally, the developed approach was successfully applied to the determination of target analytes in aqueous environmental matrices including estuarine and wastewater samples. © 2013 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Andrews, G. D.; Davila Harris, P.; Brown, S. R.; Anderson, L.; Moreno, N.
2014-12-01
We completed a field sampling transect across the northern Sierra Madre Occidental silicic large igneous province (SMO) in December 2013. Here we present the first stratigraphic, petrological, and geochemical data from the transect between Hidalgo del Parral and Guadalupe y Calvo, Chihuahua, Mexico. This is the first new transect across the SMO in 25 years and the only one between the existing NE-SW transects at Chihuahua - Hermosillo and Durango - Mazatlan. The 245 km-long transect along Mexican Highway 24 crosses the boundary between the extended (Basin and Range) and non-extended (Sierra Madre Occidental plateau) parts of the SMO, and allowed sampling of previously undescribed Oligocene (?) - early Miocene (?) rhyolitic ignimbrites and lavas, and occasional post-rhyolite, Miocene (?) SCORBA basaltic andesite lavas. 54 samples of rhyolitic ignimbrites (40), rhyolitic lavas (7), and basaltic andesite lavas (7) were collected along the transect, including 8 canyon sections with more than one unit. The ignimbrites are overwhelmingly rhyodacitic (plagioclase and hornblende or biotite phyric) or rhyolitic (quartz (+/- sanidine) in addition to plagioclase and hornblende or biotite phyric) and sparsely to highly phyric. Preliminary petrographic (phenocryst abundance) and geochemical (major and trace element) data will be presented and compared to existing data from elsewhere in the SMO. Future work will include U-Pb zircon dating and whole-rock and in-zircon radiogenic isotope analyses.
Adaptive sampling in behavioral surveys.
Thompson, S K
1997-01-01
Studies of populations such as drug users encounter difficulties because the members of the populations are rare, hidden, or hard to reach. Conventionally designed large-scale surveys detect relatively few members of the populations so that estimates of population characteristics have high uncertainty. Ethnographic studies, on the other hand, reach suitable numbers of individuals only through the use of link-tracing, chain referral, or snowball sampling procedures that often leave the investigators unable to make inferences from their sample to the hidden population as a whole. In adaptive sampling, the procedure for selecting people or other units to be in the sample depends on variables of interest observed during the survey, so the design adapts to the population as encountered. For example, when self-reported drug use is found among members of the sample, sampling effort may be increased in nearby areas. Types of adaptive sampling designs include ordinary sequential sampling, adaptive allocation in stratified sampling, adaptive cluster sampling, and optimal model-based designs. Graph sampling refers to situations with nodes (for example, people) connected by edges (such as social links or geographic proximity). An initial sample of nodes or edges is selected and edges are subsequently followed to bring other nodes into the sample. Graph sampling designs include network sampling, snowball sampling, link-tracing, chain referral, and adaptive cluster sampling. A graph sampling design is adaptive if the decision to include linked nodes depends on variables of interest observed on nodes already in the sample. Adjustment methods for nonsampling errors such as imperfect detection of drug users in the sample apply to adaptive as well as conventional designs.
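Of the designs listed, adaptive cluster sampling is perhaps the easiest to sketch: draw an initial random sample of units, and whenever an observed unit satisfies the condition of interest (e.g., self-reported drug use), add its neighbours, repeating until no new qualifying units appear. The grid, condition, and names below are illustrative assumptions; Thompson's design additionally requires appropriate weighting of the resulting clusters to keep estimates unbiased, which this sketch omits.

```python
import random

def adaptive_cluster_sample(grid, n_initial, condition, seed=0):
    """grid: dict (row, col) -> observed value; condition: value -> bool."""
    rng = random.Random(seed)
    frontier = rng.sample(sorted(grid), n_initial)   # initial random sample
    sampled = set()
    while frontier:
        cell = frontier.pop()
        if cell in sampled or cell not in grid:      # skip revisits and off-grid cells
            continue
        sampled.add(cell)
        if condition(grid[cell]):                    # adapt: follow the four neighbours
            r, c = cell
            frontier.extend([(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)])
    return sampled
```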
NASA Astrophysics Data System (ADS)
Portegies Zwart, S. F.; Chen, H.-C.
2008-06-01
We reconstruct the initial two-body relaxation time at the half-mass radius for a sample of young (≲300 Myr) star clusters in the Large Magellanic Cloud. We achieve this by simulating star clusters with 12288 to 131072 stars using direct N-body integration. The equations of motion of all stars are calculated with high-precision direct N-body simulations which include the effects of the evolution of single stars and binaries. We find that the initial relaxation times of the sample of observed clusters in the Large Magellanic Cloud range from about 200 Myr to about 2 Gyr. The reconstructed initial half-mass relaxation times for these clusters have a much narrower distribution than the currently observed distribution, which spans more than two orders of magnitude.
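For context, the half-mass relaxation time being reconstructed is conventionally estimated with the Spitzer-Hart expression (an addition here for reference; the paper's exact prescription and Coulomb logarithm may differ):

```latex
t_{\mathrm{rh}} \;=\; \frac{0.138\, N^{1/2}\, r_{h}^{3/2}}{\bar{m}^{1/2}\, G^{1/2}\, \ln(0.4\,N)}
```

where N is the number of stars, r_h the half-mass radius, and m̄ the mean stellar mass.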
Willis, C; Elviss, N; Aird, H; Fenelon, D; McLauchlin, J
2012-08-01
To investigate the hygiene practices of caterers at large events in order to support the production of guidance on catering at such events, to compare hygiene standards at weekends with those at other times in the week, and to learn lessons in preparation for the London Olympics in 2012. A UK-wide study of caterers at large events, including questionnaires on hygiene procedures and microbiological examination of food, water and environmental samples. In total, 1364 samples of food, water, surface swabs and cloths were collected at 139 events by local authority sampling officers and transported to laboratories for microbiological analysis. Eight percent of food samples were of unsatisfactory quality, and a further 2% contained potentially hazardous levels of Bacillus spp. A significantly higher proportion of unsatisfactory food samples were taken from vendors without adequate food safety procedures in place. Fifty-two percent of water samples, 38% of swabs and 71% of cloths were also unsatisfactory. The majority of samples (57%) were collected on Saturdays, Sundays or bank holidays. Environmental swab results were significantly poorer at weekends compared with other days of the week. This study reinforces the fact that food hygiene is a continuing cause for concern in mobile vendors, and indicates a need for an ongoing programme of training and monitoring of caterers in preparation for the London Olympics. Copyright © 2012 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
Mixture model normalization for non-targeted gas chromatography/mass spectrometry metabolomics data.
Reisetter, Anna C; Muehlbauer, Michael J; Bain, James R; Nodzenski, Michael; Stevens, Robert D; Ilkayeva, Olga; Metzger, Boyd E; Newgard, Christopher B; Lowe, William L; Scholtens, Denise M
2017-02-02
Metabolomics offers a unique integrative perspective for health research, reflecting genetic and environmental contributions to disease-related phenotypes. Identifying robust associations in population-based or large-scale clinical studies demands large numbers of subjects and therefore sample batching for gas-chromatography/mass spectrometry (GC/MS) non-targeted assays. When run over weeks or months, technical noise due to batch and run-order threatens data interpretability. Application of existing normalization methods to metabolomics is challenged by unsatisfied modeling assumptions and, notably, failure to address batch-specific truncation of low abundance compounds. To curtail technical noise and make GC/MS metabolomics data amenable to analyses describing biologically relevant variability, we propose mixture model normalization (mixnorm) that accommodates truncated data and estimates per-metabolite batch and run-order effects using quality control samples. Mixnorm outperforms other approaches across many metrics, including improved correlation of non-targeted and targeted measurements and superior performance when metabolite detectability varies according to batch. For some metrics, particularly when truncation is less frequent for a metabolite, mean centering and median scaling demonstrate comparable performance to mixnorm. When quality control samples are systematically included in batches, mixnorm is uniquely suited to normalizing non-targeted GC/MS metabolomics data due to explicit accommodation of batch effects, run order and varying thresholds of detectability. Especially in large-scale studies, normalization is crucial for drawing accurate conclusions from non-targeted GC/MS metabolomics data.
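A deliberately simplified sketch of the QC-anchored idea follows: estimate each batch's shift from its quality control samples and subtract it. The actual mixnorm method instead fits a per-metabolite mixture model that also accommodates batch-specific truncation of low-abundance compounds, which this sketch does not; all names are illustrative.

```python
import numpy as np

def qc_batch_correct(values, batches, is_qc):
    """values: log abundances for one metabolite (NaN = undetected);
    batches: batch id per sample; is_qc: boolean mask of QC samples."""
    values = values.astype(float).copy()
    qc_grand = np.nanmedian(values[is_qc])               # overall QC level
    for b in np.unique(batches):
        in_b = batches == b
        shift = np.nanmedian(values[in_b & is_qc]) - qc_grand   # batch effect from QCs
        values[in_b] -= shift                                    # remove it from all samples
    return values
```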
Evaluation of human enteric viruses in surface water and drinking water resources in southern Ghana.
Gibson, Kristen E; Opryszko, Melissa C; Schissler, James T; Guo, Yayi; Schwab, Kellogg J
2011-01-01
An estimated 884 million people worldwide do not have access to an improved drinking water source, and the microbial quality of these sources is often unknown. In this study, a combined tangential flow, hollow fiber ultrafiltration (UF), and real-time PCR method was applied to large volume (100 L) groundwater (N = 4), surface water (N = 9), and finished (i.e., receiving treatment) drinking water (N = 6) samples for the evaluation of human enteric viruses and bacterial indicators. Human enteric viruses including norovirus GI and GII, adenovirus, and polyomavirus were detected in five different samples including one groundwater, three surface water, and one drinking water sample. Total coliforms and Escherichia coli assessed for each sample before and after UF revealed a lack of correlation between bacterial indicators and the presence of human enteric viruses.
Davis, T M; Parsons, C M; Utterback, P L; Kirstein, D
2015-05-01
Sixteen meat and bone meal (MBM) samples were obtained and selected from various company plants to provide a wide range in pepsin nitrogen digestibility values. Pepsin digestibility was determined using either 0.02 or 0.002% pepsin. Amino acid (AA) digestibility of the 16 MBM samples was then determined using a precision-fed cecectomized rooster assay. The 0.02% pepsin digestibility values were numerically higher than the 0.002% pepsin values. The values varied from 77 to 93% for 0.02% pepsin and from 67 to 91% for 0.002% pepsin. The rooster AA digestibility results showed a wide range of values among MBM samples, mostly due to the 4 samples having the lowest and highest AA digestibility. A precision-fed broiler chick ileal AA digestibility assay confirmed that there were large differences in AA digestibility among the MBM samples having the lowest and highest rooster digestibility values. Correlation analyses between pepsin and AA digestibility values showed that the correlation values (r) were highly significant (P < 0.0001) for all AA when all 16 MBM samples were included in the analysis. However, when the MBM samples with the 2 lowest and the 2 highest rooster digestibility values were not included in the correlation analyses, the correlation coefficient values (r) were generally very low and not significant (P > 0.05). The results indicated that the pepsin nitrogen digestibility assay is only useful for detecting large differences in AA digestibility among MBM. There also was no advantage to using 0.02 versus 0.002% pepsin. © 2015 Poultry Science Association Inc.
Multi-wavelength observations of barred, flocculent galaxies
NASA Astrophysics Data System (ADS)
Ratay, Douglas Lee
Although it is generally accepted that large galaxies form through the assemblage of smaller objects, an explanation for the morphology of galaxies is not available. Any complete theory of galaxy morphology must include production and dissolution mechanisms for galactic bars, rings, nuclear bars, spiral arms, and companions. This theory does not exist because of the lack of detailed data from many types of galaxies in different environments. We have defined a new sample of galaxies that are simultaneously flocculent, barred, and isolated. We have performed optical, near-infrared, and radio (HI) observations of the galaxies in this sample. We measured properties of our galaxies including bar length, bar axis ratio, HI diameter, HI mass, and dynamical mass. We found that our sample group is heterogeneous and compares well to standard samples of galaxies. We found two of our galaxies to possess companions, and two others to show evidence of current interactions. This is consistent with other observations indicating that local isolated galaxies do not possess a large number of small companions. We cannot rule out the possibility of very small companions. We find that as a group our sample is slightly less luminous than normal galaxies and may be more likely to be involved in interactions. We conclude that the bar and spiral arm features in our sample are due to processes internal to the galaxies, likely involving the interaction between the galactic disk and halo. We defined a control sample of barred, grand design galaxies to further determine the acceptability of barred, flocculent galaxies as a physically meaningful subset of galaxies.
ERIC Educational Resources Information Center
Goldstein, Sam; Naglieri, Jack A.; Rzepa, Sara; Williams, Kevin M.
2012-01-01
We examined the interrelationships among symptoms related to autism spectrum disorders (ASD) using a large representative sample and clinical groups of children aged 6 to 11 and youth aged 12 to 18 years rated by parents (N = 1,881) or teachers (N = 2,171). The samples included individuals from the United States and Canada from the standardization…
40Ar/39Ar technique of K-Ar dating: a comparison with the conventional technique
Brent, Dalrymple G.; Lanphere, M.A.
1971-01-01
K-Ar ages have been determined by the 40Ar/39Ar total fusion technique on 19 terrestrial samples whose conventional K-Ar ages range from 3.4 m.y. to nearly 1700 m.y. Sample materials included biotite, muscovite, sanidine, adularia, plagioclase, hornblende, actinolite, alunite, dacite, and basalt. For 18 samples there are no significant differences at the 95% confidence level between the K-Ar ages obtained by these two techniques; for one sample the difference is 4.3% and is statistically significant. For the neutron doses used in these experiments (~4 × 10¹⁸ nvt) it appears that corrections for interfering Ca- and K-derived Ar isotopes can be made without significant loss of precision for samples with K/Ca > 1 as young as about 5 × 10⁵ yr, and for samples with K/Ca < 1 as young as about 10⁷ yr. For younger samples the combination of large atmospheric Ar corrections and large corrections for Ca- and K-derived Ar may make the precision of the 40Ar/39Ar technique less than that of the conventional technique unless the irradiation parameters are adjusted to minimize these corrections. © 1971.
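For context, the total-fusion ⁴⁰Ar/³⁹Ar age is obtained from the standard age equation (a well-known relation, added here for reference rather than quoted from the abstract):

```latex
t \;=\; \frac{1}{\lambda}\,\ln\!\left(1 + J\,\frac{^{40}\mathrm{Ar}^{*}}{^{39}\mathrm{Ar}_{\mathrm{K}}}\right)
```

where λ is the total ⁴⁰K decay constant, ⁴⁰Ar* the radiogenic argon, ³⁹Ar_K the argon produced from potassium during the neutron irradiation, and J an irradiation fluence parameter determined from a co-irradiated monitor mineral of known age.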
Networks for image acquisition, processing and display
NASA Technical Reports Server (NTRS)
Ahumada, Albert J., Jr.
1990-01-01
The human visual system comprises layers of networks which sample, process, and code images. Understanding these networks is a valuable means of understanding human vision and of designing autonomous vision systems based on network processing. Ames Research Center has an ongoing program to develop computational models of such networks. The models predict human performance in detection of targets and in discrimination of displayed information. In addition, the models are artificial vision systems sharing properties with biological vision that has been tuned by evolution for high performance. Properties include variable density sampling, noise immunity, multi-resolution coding, and fault-tolerance. The research stresses analysis of noise in visual networks, including sampling, photon, and processing unit noises. Specific accomplishments include: models of sampling array growth with variable density and irregularity comparable to that of the retinal cone mosaic; noise models of networks with signal-dependent and independent noise; models of network connection development for preserving spatial registration and interpolation; multi-resolution encoding models based on hexagonal arrays (HOP transform); and mathematical procedures for simplifying analysis of large networks.
Pan, Feng; Tao, Guohua
2013-03-07
Full semiclassical (SC) initial value representation (IVR) for time correlation functions involves a double phase space average over a set of two phase points, each of which evolves along a classical path. Conventionally, the two initial phase points are sampled independently for all degrees of freedom (DOF) in the Monte Carlo procedure. Here, we present an efficient importance sampling scheme by including the path correlation between the two initial phase points for the bath DOF, which greatly improves the performance of the SC-IVR calculations for large molecular systems. Satisfactory convergence in the study of quantum coherence in vibrational relaxation has been achieved for a benchmark system-bath model with up to 21 DOF.
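A toy sketch of the correlated-sampling idea described above: rather than drawing the two initial bath phase points independently, draw a shared center and a small correlated displacement, so the pair stays close where the integrand is largest. Widths and names are illustrative assumptions, not the authors' actual sampling densities.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_pair(n_bath, sigma=1.0, eps=0.1):
    """Return two correlated bath phase points (positions only, for brevity)."""
    center = rng.normal(0.0, sigma, n_bath)        # shared phase-space center
    delta = rng.normal(0.0, eps * sigma, n_bath)   # small correlated difference
    return center + 0.5 * delta, center - 0.5 * delta

x1, x2 = sample_pair(21)   # e.g., 21 degrees of freedom, as in the benchmark model
```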
Nationwide Databases in Orthopaedic Surgery Research.
Bohl, Daniel D; Singh, Kern; Grauer, Jonathan N
2016-10-01
The use of nationwide databases to conduct orthopaedic research has expanded markedly in recent years. Nationwide databases offer large sample sizes, sampling of patients who are representative of the country as a whole, and data that enable investigation of trends over time. The most common use of nationwide databases is to study the occurrence of postoperative adverse events. Other uses include the analysis of costs and the investigation of critical hospital metrics, such as length of stay and readmission rates. Although nationwide databases are powerful research tools, readers should be aware of the differences between them and their limitations. These include variations and potential inaccuracies in data collection, imperfections in patient sampling, insufficient postoperative follow-up, and lack of orthopaedic-specific outcomes.
Rapid fusion method for the determination of Pu, Np, and Am in large soil samples
Maxwell, Sherrod L.; Culligan, Brian; Hutchison, Jay B.; ...
2015-02-14
A new rapid sodium hydroxide fusion method for the preparation of 10-20 g soil samples has been developed by the Savannah River National Laboratory (SRNL). The method enables lower detection limits for plutonium, neptunium, and americium in environmental soil samples. The method also significantly reduces sample processing time and acid fume generation compared to traditional soil digestion techniques using hydrofluoric acid. Ten gram soil aliquots can be ashed and fused using the new method in 1-2 hours, completely dissolving samples, including refractory particles. Pu, Np and Am are separated using stacked 2 mL cartridges of TEVA and DGA Resin and measured using alpha spectrometry. The method can be adapted for measurement by inductively-coupled plasma mass spectrometry (ICP-MS). Two 10 g soil aliquots of fused soil may be combined prior to chromatographic separations to further improve detection limits. Total sample preparation time, including chromatographic separations and alpha spectrometry source preparation, is less than 8 hours.
Meade, R.H.; Stevens, H.H.
1990-01-01
A Lagrangian strategy for sampling large rivers, which was developed and tested in the Orinoco and Amazon Rivers of South America during the early 1980s, is now being applied to the study of toxic chemicals in the Mississippi River. A series of 15-20 cross-sections of the Mississippi mainstem and its principal tributaries is sampled by boat in downstream sequence, beginning upriver of St. Louis and concluding downriver of New Orleans 3 weeks later. The timing of the downstream sampling sequence approximates the travel time of the river water. Samples at each cross-section are discharge-weighted to provide concentrations of dissolved and suspended constituents that are converted to fluxes. Water-sediment mixtures are collected from 10-40 equally spaced points across the river width by sequential depth integration at a uniform vertical transit rate. Essential equipment includes (i) a hydraulic winch, for sensitive control of vertical transit rates, and (ii) a collapsible-bag sampler, which allows integrated samples to be collected at all depths in the river. A section is usually sampled in 4-8 h, for a total sample recovery of 100-120 l. Sampled concentrations of suspended silt and clay are reproducible within 3%.
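In this scheme the discharge weighting reduces to a simple sum: if c_i is the depth-integrated concentration at vertical i and q_i the water discharge through the subsection that vertical represents, the constituent flux through the cross-section is (our notation, for illustration):

```latex
F \;=\; \sum_{i=1}^{n} c_{i}\, q_{i}
```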
Nelson, Jennifer Clark; Marsh, Tracey; Lumley, Thomas; Larson, Eric B; Jackson, Lisa A; Jackson, Michael L
2013-08-01
Estimates of treatment effectiveness in epidemiologic studies using large observational health care databases may be biased owing to inaccurate or incomplete information on important confounders. Study methods that collect and incorporate more comprehensive confounder data on a validation cohort may reduce confounding bias. We applied two such methods, namely imputation and reweighting, to Group Health administrative data (full sample) supplemented by more detailed confounder data from the Adult Changes in Thought study (validation sample). We used influenza vaccination effectiveness (with an unexposed comparator group) as an example and evaluated each method's ability to reduce bias using the control time period before influenza circulation. Both methods reduced, but did not completely eliminate, the bias compared with traditional effectiveness estimates that do not use the validation sample confounders. Although these results support the use of validation sampling methods to improve the accuracy of comparative effectiveness findings from health care database studies, they also illustrate that the success of such methods depends on many factors, including the ability to measure important confounders in a representative and large enough validation sample, the comparability of the full sample and validation sample, and the accuracy with which the data can be imputed or reweighted using the additional validation sample information. Copyright © 2013 Elsevier Inc. All rights reserved.
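One common way to implement the reweighting approach, sketched here under assumed names rather than as the authors' exact procedure, is inverse-probability weighting of the validation cohort: model each subject's probability of being in the validation sample from full-sample covariates, then weight validation records by the inverse of that probability so analyses using the richer confounders generalize to the full cohort.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def validation_weights(X_full, in_validation):
    """X_full: covariates available on everyone; in_validation: boolean array
    marking membership in the validation sample."""
    model = LogisticRegression().fit(X_full, in_validation)
    p = model.predict_proba(X_full)[:, 1]      # P(selected | covariates)
    return 1.0 / p[in_validation]              # inverse-probability weights
```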
Lim, Jun; Park, So Yeong; Huang, Jung Yun; Han, Sung Mi; Kim, Hong-Tae
2013-01-01
We developed an off-axis-illuminated zone-plate-based hard x-ray Zernike phase-contrast microscope beamline at Pohang Light Source. Owing to the condenser-optics-free, off-axis illumination, a large field of view was achieved. The pinhole-type Zernike phase plate affords high-contrast images of a cell with minimal artifacts such as the shade-off and halo effects. The setup, including the optics and the alignment, is simple and easy, and allows faster and easier imaging of large bio-samples.
Mondol, Samrat; Navya, R; Athreya, Vidya; Sunagar, Kartik; Selvaraj, Velu Mani; Ramakrishnan, Uma
2009-12-04
Leopards are the most widely distributed of the large cats, ranging from Africa to the Russian Far East. Because of habitat fragmentation, high human population densities and the inherent adaptability of this species, they now occupy landscapes close to human settlements. As a result, they are the most common species involved in human-wildlife conflict in India, necessitating their monitoring. However, their elusive nature makes such monitoring difficult. Recent advances in DNA methods along with non-invasive sampling techniques can be used to monitor populations and individuals across large landscapes, including human-dominated ones. In this paper, we describe a DNA-based method for leopard individual identification where we used fecal DNA samples to obtain genetic material. Further, we apply our methods to non-invasive samples collected in a human-dominated landscape to estimate the minimum number of leopards in this human-leopard conflict area in Western India. In this study, 25 of the 29 tested cross-specific microsatellite markers showed positive amplification in 37 wild-caught leopards. These loci revealed varied levels of polymorphism (4-12 alleles) and heterozygosity (0.05-0.79). Combining data on amplification success (including non-invasive samples) and locus-specific polymorphisms, we showed that eight loci provide a sibling probability of identity of 0.0005, suggesting that this panel can be used to discriminate individuals in the wild. When this microsatellite panel was applied to fecal samples collected from a human-dominated landscape, we identified 7 individuals, with a sibling probability of identity of 0.001. Amplification success of field-collected scats was up to 72%, and genotyping error ranged from 0-7.4%. Our results demonstrated that the selected panel of eight microsatellite loci can conclusively identify leopards from various kinds of biological samples. Our methods can be used to monitor leopards over small and large landscapes to assess population trends, and could also be tested for population assignment in forensic applications.
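The sibling probability of identity quoted above is conventionally computed per locus from allele frequencies p_i and multiplied across loci; the standard per-locus expression (e.g., Waits et al. 2001; added here for reference, not quoted from the abstract) is:

```latex
P_{(\mathrm{ID})\mathrm{sib}} \;=\; 0.25 \;+\; 0.5\sum_{i} p_i^{2} \;+\; 0.5\Bigl(\sum_{i} p_i^{2}\Bigr)^{2} \;-\; 0.25\sum_{i} p_i^{4}
```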
Danis, Ildiko; Scheuring, Noemi; Papp, Eszter; Czinner, Antal
2012-06-01
A new instrument for assessing depressive mood, the first version of the Depression Scale Questionnaire (DS1K), was published in 2008 by Halmai et al. This scale was used in our large-sample study, in the framework of the For Healthy Offspring project, involving parents of young children. The original questionnaire was developed in small samples, so our aim was to assist further development of the instrument through psychometric analysis of the data in our large sample (n=1164). The DS1K scale was chosen to measure the parents' mood and mental state in the For Healthy Offspring project. The questionnaire was completed by 1063 mothers and 328 fathers, yielding a heterogeneous sample with respect to age and socio-demographic status. Analyses included the main descriptive statistics, estimation of the scale's internal consistency, and several group comparisons. Results were checked in both the original and the multiply imputed datasets. According to our results, the reliability of the scale was much lower than in the original study (Cronbach alpha: 0.61 versus 0.88). Detailed item analysis made clear that two items were responsible for the reduced consistency. We suspected that one of these items was prone to misreading, and checked this assumption by cross-analysis stratified by assumed reading level. The reliability of the scale increased in both the lower and higher education-level groups when one or both of the problematic items were excluded. However, as the number of items decreased, the relative sensitivity of the scale was also reduced, with fewer persons categorized in the risk group compared to the original scale. As an alternative solution, we suggest that the authors redefine the problematic items and retest the reliability of the instrument in a sample with diverse socio-demographic characteristics.
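The reliability and item-level checks reported here follow the standard Cronbach alpha and alpha-if-item-deleted computations; a brief generic sketch (not the study's code):

```python
import numpy as np

def cronbach_alpha(items):
    """items: respondents x items score matrix."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    return k / (k - 1) * (1 - item_var / total_var)

def alpha_if_item_deleted(items):
    """Alpha recomputed with each item removed; it rises for problem items."""
    return [cronbach_alpha(np.delete(items, j, axis=1))
            for j in range(items.shape[1])]
```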
NASA Astrophysics Data System (ADS)
Billings, Andrew; Kaiser, Carl; Young, Craig M.; Hiebert, Laurel S.; Cole, Eli; Wagner, Jamie K. S.; Van Dover, Cindy Lee
2017-03-01
The current standard for large-volume (thousands of cubic meters) zooplankton sampling in the deep sea is the MOCNESS, a system of multiple opening-closing nets, typically lowered to within 50 m of the seabed and towed obliquely to the surface to obtain low-spatial-resolution samples that integrate across tens of meters of water depth. The SyPRID (Sentry Precision Robotic Impeller Driven) sampler is an innovative, deep-rated (6000 m) plankton sampler that partners with the Sentry Autonomous Underwater Vehicle (AUV) to obtain paired, large-volume plankton samples at specified depths and along survey lines within 1.5 m of the seabed, with simultaneous collection of sensor data. SyPRID uses a perforated Ultra-High-Molecular-Weight (UHMW) plastic tube to support a fine mesh net within an outer carbon composite tube (tube-within-a-tube design), with an axial flow pump located aft of the capture filter. The pump facilitates flow through the system and reduces or possibly eliminates the bow wave at the mouth opening. The cod end, a hollow truncated cone, is also made of UHMW plastic and includes a collection volume designed to provide an area where zooplankton can accumulate, out of the high-flow region. SyPRID attaches as a saddle-pack to the Sentry vehicle. Sentry itself is configured with a flight control system that enables autonomous survey paths at low altitudes. In its verification deployment at the Blake Ridge Seep (2160 m) on the US Atlantic Margin, SyPRID was operated for 6 h at an altitude of 5 m. It recovered plankton samples, including delicate living larvae, from the near-bottom stratum that is seldom sampled by a typical MOCNESS tow. The prototype SyPRID and its next generations will enable studies of plankton or other particulate distributions associated with localized physico-chemical strata in the water column or above patchy habitats on the seafloor.
Wilkison, D.H.; Armstrong, D.J.; Hampton, S.A.
2009-01-01
From 1998 through 2007, over 750 surface-water or bed-sediment samples in the Blue River Basin - a largely urban basin in metropolitan Kansas City - were analyzed for more than 100 anthropogenic compounds. Analytes included nutrients, fecal-indicator bacteria, suspended sediment, and pharmaceuticals and personal care products. Non-point-source runoff, hydrologic alterations, and numerous wastewater discharge points resulted in the routine detection of complex mixtures of anthropogenic compounds in samples from basin stream sites. Temporal and spatial variations in concentrations and loads of nutrients, pharmaceuticals, and organic wastewater compounds were observed, related primarily to a site's proximity to point-source discharges and to streamflow dynamics. © 2009 ASCE.
Apparatus for measuring surface particulate contamination
Woodmansee, Donald E.
2002-01-01
An apparatus for measuring surface particulate contamination includes a tool for collecting a contamination sample from a target surface, a mask having an opening of known area formed therein for defining the target surface, and a flexible connector connecting the tool to the mask. The tool includes a body portion having a large diameter section defining a surface and a small diameter section extending from the large diameter section. A particulate collector is removably mounted on the surface of the large diameter section for collecting the contaminants. The tool further includes a spindle extending from the small diameter section and a spool slidingly mounted on the spindle. A spring is disposed between the small diameter section and the spool for biasing the spool away from the small diameter section. An indicator is provided on the spindle so as to be revealed when the spool is pressed downward to compress the spring.
Whitfield, Pamela S.
2016-04-29
Here, quantitative phase analysis (QPA) using neutron powder diffraction more often than not involves non-ambient studies where no sample preparation is possible. The larger samples and greater penetration of neutrons versus X-rays make neutron diffraction less susceptible to inhomogeneity and large grain sizes, but most well-characterized QPA standard samples do not have these characteristics. Sample #4 from the International Union of Crystallography Commission on Powder Diffraction QPA round robin was one such sample. Data were collected using the POWGEN time-of-flight (TOF) neutron powder diffractometer and analysed together with historical data from the C2 diffractometer at Chalk River. The presence of magnetic reflections from Fe3O4 (magnetite) in the sample was an additional consideration and, given the frequency with which iron-containing and other magnetic compounds are present during in-operando studies, their possible impact on the accuracy of QPA is of interest. Additionally, thermal diffuse scattering in the high-Q region (< 0.6 Å) accessible with TOF data could affect QPA results during least-squares refinement because of the extreme peak overlap present in this region. Refinement of POWGEN data was largely insensitive to the modification of longer d-spacing reflections by magnetic contributions, but the constant-wavelength data were adversely impacted if the magnetic structure was not included. A robust refinement weighting was found to be effective in reducing quantification errors with the constant-wavelength neutron data both where intensities from magnetic reflections were ignored and where they were included. Results from the TOF data were very sensitive to inadequate modelling of the high-Q (low d-spacing) background using simple polynomials.
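In Rietveld-based QPA, the phase weight fractions behind results like these are conventionally obtained from the refined scale factors via the Hill & Howard (1987) relation, W_p = S_p(ZMV)_p / sum_i S_i(ZMV)_i. A minimal sketch with placeholder values (not the round-robin data; instrument-specific normalization is omitted):

```python
def weight_fractions(phases):
    # phases: dicts with refined scale factor S, formula units per cell Z,
    # formula mass M (g/mol), and unit-cell volume V (cubic angstroms).
    szmv = {p["name"]: p["S"] * p["Z"] * p["M"] * p["V"] for p in phases}
    total = sum(szmv.values())
    return {name: val / total for name, val in szmv.items()}

print(weight_fractions([
    {"name": "corundum",  "S": 1.2e-3, "Z": 6, "M": 101.96, "V": 254.8},
    {"name": "magnetite", "S": 4.0e-4, "Z": 8, "M": 231.53, "V": 592.1},
]))
```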
Assessing the sustainable construction of large construction companies in Malaysia
NASA Astrophysics Data System (ADS)
Adewale, Bamgbade Jibril; Mohammed, Kamaruddeen Ahmed; Nasrun, Mohd Nawi Mohd
2016-08-01
Considering the increasing concern for sustainability issues in construction project delivery, this paper assesses the extent of sustainable construction among large Malaysian contractors, in order to ascertain the level of the industry's impacts on both the environment and society. Sustainable construction describes the construction industry's responsibility to use finite resources efficiently while also reducing the impacts of construction on both humans and the environment throughout the phases of construction. This study used proportionate stratified random sampling to conduct a field study, with a sample of 172 responding contractors from 708 administered questionnaires. Data were collected from large contractors in the eleven states of peninsular Malaysia. Using a five-level rating scale (1 = Very Low; 2 = Low; 3 = Moderate; 4 = High; 5 = Very High) to describe the level of sustainable construction of Malaysian contractors, based on previous studies, statistical analysis reveals that the environmental, social and economic sustainability of large Malaysian contractors is high.
Large area optical mapping of surface contact angle.
Dutra, Guilherme; Canning, John; Padden, Whayne; Martelli, Cicero; Dligatch, Svetlana
2017-09-04
Top-down contact angle measurements have been validated and shown to be at least as reliable as conventional side-view measurements. A range of samples, including industrially relevant materials for roofing and printing, has been compared. Using the top-down approach, mapping in both 1-D and 2-D has been demonstrated. The method was applied to study the change in contact angle as a function of silver (Ag) nanoparticle size controlled by thermal evaporation. Large-area mapping reveals good uniformity for commercial Aspen paper coated with black laser-printer ink. The forensic and chemical analysis potential in 2-D is demonstrated by uncovering hidden "CsF" initials written with mineral oil on the coated Aspen paper. The method promises to revolutionize nanoscale characterization and industrial monitoring, as well as chemical analyses, by allowing rapid contact angle measurements over large areas or large numbers of samples in ways and times that have not been possible before.
NASA Astrophysics Data System (ADS)
Sun, Chengjun; Jiang, Fenghua; Gao, Wei; Li, Xiaoyun; Yu, Yanzhen; Yin, Xiaofei; Wang, Yong; Ding, Haibing
2017-01-01
Detection of sulfur-oxidizing bacteria has largely been dependent on targeted gene sequencing technology or traditional cell cultivation, which usually takes from days to months to carry out. This clearly does not meet the requirements of analysis for time-sensitive samples and/or complicated environmental samples. Since energy-dispersive X-ray spectrometry (EDS) can be used to simultaneously detect multiple elements in a sample, including sulfur, with minimal sample treatment, this technology was applied to detect sulfur-oxidizing bacteria using their high sulfur content within the cell. This article describes the application of scanning electron microscopy imaging coupled with EDS mapping for quick detection of sulfur oxidizers in contaminated environmental water samples, with minimal sample handling. Scanning electron microscopy imaging revealed the existence of dense granules within the bacterial cells, while EDS identified large amounts of sulfur within them. EDS mapping localized the sulfur to these granules. Subsequent 16S rRNA gene sequencing showed that the bacteria detected in our samples belonged to the genus Chromatium, which are sulfur oxidizers. Thus, EDS mapping made it possible to identify sulfur oxidizers in environmental samples based on localized sulfur within their cells, within a short time (within 24 h of sampling). This technique has wide ranging applications for detection of sulfur bacteria in environmental water samples.
Identification of gamma-irradiated foodstuffs by chemiluminescence measurements in Taiwan
NASA Astrophysics Data System (ADS)
Ma, Ming-Shia Chang; Chen, Li-Hsiang; Tsai, Zei-Tsan; Fu, Ying-Kai
In order to establish chemiluminescence (CL) measurement as an identification method for γ-irradiated foodstuffs in Taiwan, ten agricultural products, including wheat flour, rice, ginger, potatoes, garlic, onions, red beans, mung beans, soy beans, xanthoxylon seeds and Japanese star anise, were tested to compare CL intensities between untreated samples and samples subjected to a 10 kGy γ-irradiation dose. Among them, wheat flour is the product most amenable to identification by CL measurement: the CL intensities of un-irradiated and irradiated flour showed large differences, with a significant dose-effect relationship. The effects on CL intensity, at various doses, of three different flour protein contents, of sieving (unsieved versus sieved, 100-200 mesh), of reproducibility, and of storage were investigated in this study. In addition, the white bulb part of onions showed some CL in irradiated samples. The CL data obtained from the other eight agricultural products showed large fluctuations and cannot be used to differentiate between irradiated and un-irradiated samples.
Virtual Observatories, Data Mining, and Astroinformatics
NASA Astrophysics Data System (ADS)
Borne, Kirk
The historical, current, and future trends in knowledge discovery from data in astronomy are presented here. The story begins with a brief history of data gathering and data organization. A description of the development of new information science technologies for astronomical discovery is then presented. Among these are e-Science and the virtual observatory, with its data discovery, access, display, and integration protocols; astroinformatics and data mining for exploratory data analysis, information extraction, and knowledge discovery from distributed data collections; new sky surveys' databases, including rich multivariate observational parameter sets for large numbers of objects; and the emerging discipline of data-oriented astronomical research, called astroinformatics. Astroinformatics is described as the fourth paradigm of astronomical research, following the three traditional research methodologies: observation, theory, and computation/modeling. Astroinformatics research areas include machine learning, data mining, visualization, statistics, semantic science, and scientific data management. Each of these areas is now an active research discipline, with significant science-enabling applications in astronomy. Research challenges and sample research scenarios are presented in these areas, in addition to sample algorithms for data-oriented research. These information science technologies enable scientific knowledge discovery from the increasingly large and complex data collections in astronomy. The education and training of the modern astronomy student must consequently include skill development in these areas, whose practitioners have traditionally been limited to applied mathematicians, computer scientists, and statisticians. Modern astronomical researchers must cross these traditional discipline boundaries, thereby borrowing the best-of-breed methodologies from multiple disciplines. In the era of large sky surveys and numerous large telescopes, the potential for astronomical discovery is equally large, and so the data-oriented research methods, algorithms, and techniques that are presented here will enable the greatest discovery potential from the ever-growing data and information resources in astronomy.
Discriminant WSRC for Large-Scale Plant Species Recognition.
Zhang, Shanwen; Zhang, Chuanlei; Zhu, Yihai; You, Zhuhong
2017-01-01
In sparse representation based classification (SRC) and weighted SRC (WSRC), solving the global sparse representation problem is time-consuming. A discriminant WSRC (DWSRC) is proposed for large-scale plant species recognition, comprising two stages. First, several subdictionaries are constructed by dividing the dataset into several similar classes, and a subdictionary is chosen by the maximum similarity between the test sample and the typical sample of each similar class. Second, the weighted sparse representation of the test image is calculated with respect to the chosen subdictionary, and the leaf category is then assigned through the minimum reconstruction error. Unlike traditional SRC and its improved approaches, we sparsely represent the test sample on a subdictionary whose base elements are the training samples of the selected similar class, instead of using a generic overcomplete dictionary over the entire training set. The complexity of solving the sparse representation problem is thus reduced. Moreover, DWSRC adapts to newly added leaf species without rebuilding the dictionary. Experimental results on the ICL plant leaf database show that the method has low computational complexity and a high recognition rate, and can be clearly interpreted.
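A schematic sketch of the two-stage idea, assumed from the abstract (the data layout, the cosine similarity measure, and the use of an l1 solver to emulate the weighted sparse code are illustrative choices, not the authors' code):

```python
import numpy as np
from sklearn.linear_model import Lasso

def dwsrc_classify(y, subdicts, lam=0.01):
    """y: test feature vector; subdicts: list of (D, labels, centroid),
    where D is d x n with unit-norm columns (training samples)."""
    # Stage 1: pick the subdictionary whose typical sample (centroid)
    # is most similar to the test sample.
    cos = lambda a, b: a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    D, labels, _ = max(subdicts, key=lambda s: cos(y, s[2]))

    # Stage 2: locality-weighted l1 coding; distant atoms get larger
    # penalties, implemented by rescaling columns (x = z / w recovers
    # the weighted code from the ordinary lasso solution z).
    w = np.linalg.norm(D - y[:, None], axis=0) + 1e-12
    z = Lasso(alpha=lam, fit_intercept=False, max_iter=5000).fit(D / w, y).coef_
    x = z / w

    # Assign the class with minimum class-wise reconstruction error.
    errs = {c: np.linalg.norm(y - D[:, labels == c] @ x[labels == c])
            for c in np.unique(labels)}
    return min(errs, key=errs.get)
```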
Scalable metagenomic taxonomy classification using a reference genome database
Ames, Sasha K.; Hysom, David A.; Gardner, Shea N.; Lloyd, G. Scott; Gokhale, Maya B.; Allen, Jonathan E.
2013-01-01
Motivation: Deep metagenomic sequencing of biological samples has the potential to recover otherwise difficult-to-detect microorganisms and accurately characterize biological samples with limited prior knowledge of sample contents. Existing metagenomic taxonomic classification algorithms, however, do not scale well to analyze large metagenomic datasets, and balancing classification accuracy with computational efficiency presents a fundamental challenge. Results: A method is presented to shift computational costs to an off-line computation by creating a taxonomy/genome index that supports scalable metagenomic classification. Scalable performance is demonstrated on real and simulated data to show accurate classification in the presence of novel organisms on samples that include viruses, prokaryotes, fungi and protists. Taxonomic classification of the previously published 150-gigabase Tyrolean Iceman dataset was found to take <20 h on a single-node, 40-core, large-memory machine and provide new insights on the metagenomic contents of the sample. Availability: Software was implemented in C++ and is freely available at http://sourceforge.net/projects/lmat Contact: allen99@llnl.gov Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23828782
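A toy illustration of the core idea behind such a taxonomy/genome index, far simpler than the published software: precompute a map from k-mers to the lowest common ancestor (LCA) of all reference genomes containing them, then classify reads by look-up. The `taxonomy_lca` function and all inputs are assumed:

```python
from collections import Counter

def build_index(genomes, taxonomy_lca, k=20):
    """genomes: {taxid: sequence}; taxonomy_lca(a, b) -> LCA taxid (assumed)."""
    index = {}
    for taxid, seq in genomes.items():
        for i in range(len(seq) - k + 1):
            kmer = seq[i:i + k]
            prev = index.get(kmer)
            # Each k-mer maps to the LCA of every genome it occurs in.
            index[kmer] = taxid if prev is None else taxonomy_lca(prev, taxid)
    return index

def classify_read(read, index, k=20):
    # Look up every k-mer in the read; report the most frequent taxon hit.
    hits = Counter(index[read[i:i + k]]
                   for i in range(len(read) - k + 1)
                   if read[i:i + k] in index)
    return hits.most_common(1)[0][0] if hits else None
```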
The topology of large-scale structure. III - Analysis of observations
NASA Astrophysics Data System (ADS)
Gott, J. Richard, III; Miller, John; Thuan, Trinh X.; Schneider, Stephen E.; Weinberg, David H.; Gammie, Charles; Polk, Kevin; Vogeley, Michael; Jeffrey, Scott; Bhavsar, Suketu P.; Melott, Adrian L.; Giovanelli, Riccardo; Haynes, Martha P.; Tully, R. Brent; Hamilton, Andrew J. S.
1989-05-01
A recently developed algorithm for quantitatively measuring the topology of large-scale structures in the universe was applied to a number of important observational data sets. The data sets included an Abell (1958) cluster sample out to Vmax = 22,600 km/sec, the Giovanelli and Haynes (1985) sample out to Vmax = 11,800 km/sec, the CfA sample out to Vmax = 5000 km/sec, the Thuan and Schneider (1988) dwarf sample out to Vmax = 3000 km/sec, and the Tully (1987) sample out to Vmax = 3000 km/sec. It was found that, when the topology is studied on smoothing scales significantly larger than the correlation length (i.e., smoothing length, lambda, not below 1200 km/sec), the topology is spongelike and is consistent with the standard model in which the structure seen today has grown from small fluctuations caused by random noise in the early universe. When the topology is studied on the scale of lambda of about 600 km/sec, a small shift is observed in the genus curve in the direction of a 'meatball' topology.
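For context, the genus statistic used by this class of algorithm has a known analytic form for a Gaussian random field (e.g., Hamilton, Gott & Weinberg 1986), against which the sponge-like topology is judged; the standard expression is quoted here from the literature, not from this abstract:

```latex
g(\nu) \;=\; A\,(1-\nu^{2})\,e^{-\nu^{2}/2},
\qquad
A \;=\; \frac{1}{4\pi^{2}}\left(\frac{\langle k^{2}\rangle}{3}\right)^{3/2},
```

where ν is the density threshold in units of the standard deviation of the smoothed field and ⟨k²⟩ is a moment of its power spectrum. A shift of the measured genus curve relative to this symmetric form, toward isolated high-density clusters, is the 'meatball' signature referred to above.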
Turek, S; Rudan, I; Smolej-Narancić, N; Szirovicza, L; Cubrilo-Turek, M; Zerjavić-Hrabak, V; Rak-Kaić, A; Vrhovski-Hebrang, D; Prebeg, Z; Ljubicić, M; Janićijević, B; Rudan, P
2001-06-01
As the liberation of occupied Croatian territories ended the war in the country in 1995, the Ministry of Health and the Croatian Health Insurance Institute agreed to create a new framework for developing a long-term strategy of public health planning, prevention and intervention. They provided financial resources to develop the First Croatian Health Project, with the rest of the support coming from a World Bank loan and the National Institute of Public Health. A large cross-sectional study was designed to assess health attitudes, knowledge, behaviour and risks in the post-war Croatian population. The large field study was carried out by the Institute for Anthropological Research with technical support from the National Institute of Public Health. The field study was completed between 1995-1997. It included about 10,000 adult volunteers from all 21 Croatian counties. The geographic distribution of the sample covered both coastal and continental areas of Croatia and included rural and urban environments. The specific measurements included anthropometry (body mass index) and blood pressure. From each examinee a blood sample was collected, from which the levels of total plasma cholesterol (TC), triglycerides (TG), HDL-cholesterol (High Density Lipoprotein), LDL-cholesterol (Low Density Lipoprotein), lipoprotein Lp(a), and the haemostatic risk factor fibrinogen (F) were determined. Detailed data were collected on general knowledge of and attitudes toward health issues, followed by specific investigation of smoking history, alcohol consumption, nutrition habits, physical activity, family history of chronic non-communicable diseases and occupational exposures. From the initial database a targeted sample of 5,840 persons of both sexes, aged 18-65, was created, corresponding by age, sex and geographic distribution to the general Croatian population. This paper summarises and discusses the main findings of the project within this representative sample of the Croatian population.
NASA Astrophysics Data System (ADS)
Nelson, D. B.; Kahmen, A.
2016-12-01
The hydrogen and oxygen isotopic composition of water available for biosynthetic processes in vascular plants plays an important role in shaping the isotopic composition of organic compounds that these organisms produce, including leaf waxes and cellulose in leaves and tree rings. Characterizing changes in large scale spatial patterns of precipitation, soil water, stem water, and leaf water isotope values over time is therefore useful for evaluating how plants reflect changes in the isotopic composition of these source waters in different environments. This information can, in turn, provide improved calibration targets for understanding the environmental signals that plants preserve. The pathway of water through this continuum can include several isotopic fractionations, but the extent to which the isotopic composition of each of these water pools varies under normal field conditions and over space and time has not been systematically and concurrently evaluated at large spatial scales. Two season-long sampling campaigns were conducted at nineteen sites throughout Europe over the 2014 and 2015 growing seasons to track changes in the isotopic composition of plant-relevant waters. Samples of precipitation, soil water, stem water, and leaf water were collected over more than 200 field days and include more than 500 samples from each water pool. Measurements were used to validate continent-wide gridded estimates of leaf water isotope values derived from a combination of mechanistic and statistical modeling conducted with temperature, precipitation, and relative humidity data. Data-model comparison shows good agreement for summer leaf waters, and substantiates the incorporation of modeled leaf waters in evaluating how plants respond to hydroclimate changes at large spatial scales. These results also suggest that modeled leaf water isotope values might be used in future studies in similar ecosystems to improve the coverage density of spatial or temporal data.
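A common choice for the mechanistic component of such leaf-water isotope modeling is the steady-state Craig-Gordon formulation; the abstract does not specify the exact model used, so the standard form is quoted here only as an assumed illustration:

```latex
\Delta_{e} \;=\; \varepsilon^{+} + \varepsilon_{k}
  + \left(\Delta_{v} - \varepsilon_{k}\right)\frac{e_{a}}{e_{i}},
```

where Δ_e is the isotopic enrichment of leaf water at the evaporative sites relative to source water, ε⁺ and ε_k are the equilibrium and kinetic fractionation factors, Δ_v is the enrichment of ambient vapor, and e_a/e_i is the ratio of ambient to intercellular vapor pressure, which is where the temperature and relative humidity inputs mentioned above enter.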
Analysis of the research sample collections of Uppsala biobank.
Engelmark, Malin T; Beskow, Anna H
2014-10-01
Uppsala Biobank is the joint and only biobank organization of its two principals, Uppsala University and Uppsala University Hospital. Biobanks are required to keep updated registries of sample collection composition and management in order to comply with legal regulations. We report here the results of the first comprehensive analysis of the 131 research sample collections organized in the biobank. The results show that the median number of samples per collection was 700, and that collection sizes varied from fewer than 500 samples to over one million. Blood samples, such as whole blood, serum, and plasma, were included in the vast majority, 84.0%, of the research sample collections. Furthermore, 95.5% of the newly collected samples within healthcare included blood samples, which further supports the view that blood samples are of fundamental importance for medical research. Tissue samples were also commonly used and occurred in 39.7% of the research sample collections, often combined with other types of samples. In total, 96.9% of the 131 sample collections included samples collected for healthcare, showing the importance of healthcare as a research infrastructure. Of the collections that had accessed existing samples from healthcare, as many as 96.3% included tissue samples from the Department of Pathology, which shows the importance of pathology samples as a resource for medical research. Analysis of different research areas shows that the most common public health diseases are covered. The collections that had generated the most publications, up to over 300, contained large numbers of samples collected systematically and repeatedly over many years. More knowledge about existing biobank materials, together with public registries of sample collections, will support research collaborations, improve transparency, and bring us closer to the goal of biobanks: to save and prolong human lives and to improve health and quality of life.
Havens, Karl E; Harwell, Matthew C; Brady, Mark A; Sharfstein, Bruce; East, Therese L; Rodusky, Andrew J; Anson, Daniel; Maki, Ryan P
2002-04-09
A spatially intensive sampling program was developed for mapping the submerged aquatic vegetation (SAV) over an area of approximately 20,000 ha in a large, shallow lake in Florida, U.S. The sampling program integrates Geographic Information System (GIS) technology with traditional field sampling of SAV and has the capability of producing robust vegetation maps under a wide range of conditions, including high turbidity, variable depth (0 to 2 m), and variable sediment types. Based on sampling carried out in August-September 2000, we measured 1,050 to 4,300 ha of vascular SAV species and approximately 14,000 ha of the macroalga Chara spp. The results were similar to those reported in the early 1990s, when the last large-scale SAV sampling occurred. Occurrence of Chara was strongly associated with peat sediments, and maximal depths of occurrence varied between sediment types (mud, sand, rock, and peat). A simple model of Chara occurrence, based only on water depth, had an accuracy of 55%. It predicted occurrence of Chara over large areas where the plant actually was not found. A model based on sediment type and depth had an accuracy of 75% and produced a spatial map very similar to that based on observations. While this approach needs to be validated with independent data in order to test its general utility, we believe it may have application elsewhere. The simple modeling approach could serve as a coarse-scale tool for evaluating effects of water level management on Chara populations.
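The two occurrence models compared here (depth only versus sediment plus depth) can be reproduced in outline with a logistic classifier on synthetic data; everything below (variable names, the synthetic depth-sediment relationship) is assumed for illustration and is not the study's data:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(7)
df = pd.DataFrame({
    "depth_m":  rng.uniform(0, 2, 500),
    "sediment": rng.choice(["mud", "sand", "rock", "peat"], 500),
})
# Synthetic truth loosely matching the abstract: Chara favors shallow peat.
p = 0.8 * ((df.sediment == "peat") & (df.depth_m < 1.5)) + 0.1
df["chara"] = (rng.random(500) < p).astype(int)

X_depth = df[["depth_m"]]
X_both = pd.get_dummies(df[["depth_m", "sediment"]], columns=["sediment"])
for name, X in [("depth only", X_depth), ("depth + sediment", X_both)]:
    m = LogisticRegression(max_iter=1000).fit(X, df["chara"])
    print(name, accuracy_score(df["chara"], m.predict(X)))
```

On data like these, the sediment-aware model scores noticeably higher, mirroring the 55% versus 75% accuracies reported above.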
ERIC Educational Resources Information Center
Hoepfner, Ralph; And Others
Sampling techniques used in "A Study of the Sustaining Effects of Compensatory Education" are described in detail. The Sustaining Effects Study is a large, multi-faceted study of issues related to the compensatory education of elementary school students. Public elementary schools that include grades between one and six are eligible for…
Using Self-Report Assessment Methods to Explore Facets of Mindfulness
ERIC Educational Resources Information Center
Baer, Ruth A.; Smith, Gregory T.; Hopkins, Jaclyn; Krietemeyer, Jennifer; Toney, Leslie
2006-01-01
The authors examine the facet structure of mindfulness using five recently developed mindfulness questionnaires. Two large samples of undergraduate students completed mindfulness questionnaires and measures of other constructs. Psychometric properties of the mindfulness questionnaires were examined, including internal consistency and convergent…
Fuka, Mirna Mrkonjić; Wallisch, Stefanie; Engel, Marion; Welzl, Gerhard; Havranek, Jasmina; Schloter, Michael
2013-01-01
Microbial communities play an important role in cheese ripening and to a large extent determine the flavor and taste of different cheese types. However, under adverse conditions human pathogens may colonize cheese samples during ripening and may thus cause severe outbreaks of diarrhoea and other diseases. In the present study we therefore investigated the bacterial community structure of three raw ewe's milk cheese types, produced without the application of starter cultures, from two production sites, based on fingerprinting in combination with next-generation sequencing of 16S rRNA gene amplicons. A surprisingly high diversity was found in the analyzed samples, and overall up to 213 OTUs (at 97% sequence identity) could be assigned. Twenty of the major OTUs were present in all samples and mostly comprised lactic acid bacteria (LAB), mainly Lactococcus and Enterococcus species. The abundance and diversity of these genera differed to a large extent between the three investigated cheese types and in response to the ripening process. A large number of non-LAB genera could also be identified based on phylogenetic alignments, mainly Enterobacteriaceae and Staphylococcaceae. Some members of these two families could be clearly assigned to species known as potential human pathogens, such as Staphylococcus saprophyticus or Salmonella spp.; however, their abundance was reduced during cheese ripening. The genera Lactobacillus, Streptococcus, Leuconostoc, Bifidobacterium, Brevibacterium, Corynebacterium, Clostridium, Staphylococcus, Thermoanaerobacterium, Hafnia, Pseudomonas, Janthinobacterium, Petrotoga, Kosmotoga, Megasphaera, Macrococcus, Mannheimia, Aerococcus, Vagococcus, Weissella and Pediococcus, as well as E. coli, were identified at relatively low levels and only in selected samples. Overall, the microbial composition of the milk used and the management of the production units determined the bacterial community composition of all cheese types to a large extent, even at the late time points of cheese ripening.
Tools for T-RFLP data analysis using Excel.
Fredriksson, Nils Johan; Hermansson, Malte; Wilén, Britt-Marie
2014-11-08
Terminal restriction fragment length polymorphism (T-RFLP) analysis is a DNA-fingerprinting method that can be used for comparisons of the microbial community composition in a large number of samples. There is no consensus on how T-RFLP data should be treated and analyzed before comparisons between samples are made, and several different approaches have been proposed in the literature. The analysis of T-RFLP data can be cumbersome and time-consuming, and for large datasets manual data analysis is not feasible. The currently available tools for automated T-RFLP analysis, although valuable, offer little flexibility, and few, if any, options regarding what methods to use. To enable comparisons and combinations of different data treatment methods an analysis template and an extensive collection of macros for T-RFLP data analysis using Microsoft Excel were developed. The Tools for T-RFLP data analysis template provides procedures for the analysis of large T-RFLP datasets including application of a noise baseline threshold and setting of the analysis range, normalization and alignment of replicate profiles, generation of consensus profiles, normalization and alignment of consensus profiles and final analysis of the samples including calculation of association coefficients and diversity index. The procedures are designed so that in all analysis steps, from the initial preparation of the data to the final comparison of the samples, there are various different options available. The parameters regarding analysis range, noise baseline, T-RF alignment and generation of consensus profiles are all given by the user and several different methods are available for normalization of the T-RF profiles. In each step, the user can also choose to base the calculations on either peak height data or peak area data. The Tools for T-RFLP data analysis template enables an objective and flexible analysis of large T-RFLP datasets in a widely used spreadsheet application.
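As a rough Python analogue of the processing steps the template automates (noise baseline threshold, normalization to relative abundance, diversity index), assuming peak-height input; the macros themselves are Excel-based, so this is only a sketch of the logic:

```python
import numpy as np

def process_profile(heights, noise_threshold=50.0):
    # Zero out peaks below the noise baseline, then normalize to
    # relative abundances (peak-area data could be used instead).
    peaks = np.where(heights >= noise_threshold, heights, 0.0)
    total = peaks.sum()
    return peaks / total if total > 0 else peaks

def shannon_index(rel_abund):
    # Shannon diversity over the retained T-RFs.
    p = rel_abund[rel_abund > 0]
    return float(-(p * np.log(p)).sum())

profile = process_profile(np.array([12.0, 480.0, 95.0, 30.0, 760.0]))
print(shannon_index(profile))
```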
Lunga, Dalton D.; Yang, Hsiuhan Lexie; Reith, Andrew E.; ...
2018-02-06
Satellite imagery often exhibits large spatial extents that encompass object classes with considerable variability. This often limits large-scale model generalization with machine learning algorithms. Notably, acquisition conditions, including dates, sensor position, lighting condition, and sensor types, often translate into class distribution shifts that introduce complex nonlinear factors and hamper the potential impact of machine learning classifiers. This article investigates the challenge of exploiting satellite images using convolutional neural networks (CNN) for settlement classification where the class distribution shifts are significant. We present a large-scale human settlement mapping workflow based on multiple modules that adapt a pretrained CNN to address the negative impact of distribution shift on classification performance. To extend a locally trained classifier onto areas of large spatial extent we introduce several submodules: first, a human-in-the-loop element for relabeling of misclassified target-domain samples to generate representative examples for model adaptation; second, an efficient hashing module to minimize redundancy and noisy samples among the mass-selected examples; and third, a novel relevance-ranking module to minimize the dominance of source examples on the target domain. The workflow presents a novel and practical approach to achieving large-scale domain adaptation with binary classifiers based on CNN features. Experimental evaluations are conducted on areas of interest that encompass various image characteristics, including multisensor, multitemporal, and multiangular conditions. Domain adaptation is assessed on source-target pairs through the transfer loss and transfer ratio metrics to illustrate the utility of the workflow.
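Two of the submodules lend themselves to a compact sketch: hashing CNN feature vectors to drop near-duplicates, and ranking source examples by similarity to the target domain. The sketch below is an assumed, generic realization (SimHash-style random-hyperplane signatures, cosine ranking against a target centroid), not the authors' implementation:

```python
import numpy as np

def dedup_by_hash(feats, n_bits=64, seed=0):
    # Random-hyperplane signatures; samples sharing a signature are treated
    # as near-duplicates, and only the first in each bucket is kept.
    planes = np.random.default_rng(seed).normal(size=(feats.shape[1], n_bits))
    sigs = feats @ planes > 0
    _, keep = np.unique(sigs, axis=0, return_index=True)
    return np.sort(keep)

def rank_by_relevance(source_feats, target_feats):
    # Rank source samples by cosine similarity to the target-domain centroid,
    # so dominant but irrelevant source examples can be down-weighted.
    c = target_feats.mean(axis=0)
    sims = (source_feats @ c
            / (np.linalg.norm(source_feats, axis=1) * np.linalg.norm(c)))
    return np.argsort(-sims)
```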
A Simple Sampling Method for Estimating the Accuracy of Large Scale Record Linkage Projects.
Boyd, James H; Guiver, Tenniel; Randall, Sean M; Ferrante, Anna M; Semmens, James B; Anderson, Phil; Dickinson, Teresa
2016-05-17
Record linkage techniques allow different data collections to be brought together to provide a wider picture of the health status of individuals. Ensuring high linkage quality is important to guarantee the quality and integrity of research. Current methods for measuring linkage quality typically focus on precision (the proportion of accepted links that are correct), given the difficulty of measuring the proportion of false negatives. The aim of this work is to introduce and evaluate a sampling-based method to estimate both precision and recall following record linkage. In the sampling-based method, record pairs from each threshold band (including those below the identified cut-off for acceptance) are sampled and clerically reviewed. These results are then applied to the entire set of record pairs, providing estimates of false positives and false negatives. The method was evaluated on a synthetically generated dataset in which the true match status (which records belonged to the same person) was known. The sampled estimates of linkage quality were close to the actual linkage quality metrics calculated for the whole synthetic dataset. The precision and recall measures for seven reviewers were very consistent, with little variation in the clerical assessment results (overall agreement by the Fleiss kappa statistic was 0.601). This method offers a possible means of accurately estimating matching quality and refining linkages in population-level linkage studies. The sampling approach is especially important for large linkage projects where the number of record pairs produced may be very large, often running into the millions.
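The estimator described can be sketched directly: review a sample of record pairs in each score band, extrapolate the sampled true-match rate to the band's full pair count, and accumulate estimated true and false positives and negatives around the cut-off. The data layout below is assumed for illustration:

```python
def estimate_precision_recall(bands, cutoff):
    """bands: dicts with keys 'score', 'n_pairs', 'n_sampled',
    'n_true_in_sample' (clerical review results for that score band)."""
    tp = fp = fn = 0.0
    for b in bands:
        true_rate = b["n_true_in_sample"] / b["n_sampled"]
        est_true = true_rate * b["n_pairs"]   # extrapolate to the whole band
        if b["score"] >= cutoff:              # accepted links
            tp += est_true
            fp += b["n_pairs"] - est_true
        else:                                 # rejected pairs: missed matches
            fn += est_true
    return tp / (tp + fp), tp / (tp + fn)     # precision, recall
```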
NASA Astrophysics Data System (ADS)
Naylor, M.; Main, I. G.; Greenhough, J.; Bell, A. F.; McCloskey, J.
2009-04-01
The Sumatran Boxing Day earthquake and subsequent large events provide an opportunity to re-evaluate the statistical evidence for characteristic earthquake events in frequency-magnitude distributions. Our aims are to (i) improve intuition regarding the properties of samples drawn from power laws, (ii) illustrate using random samples how appropriate Poisson confidence intervals can both aid the eye and provide an appropriate statistical evaluation of data drawn from power-law distributions, and (iii) apply these confidence intervals to test for evidence of characteristic earthquakes in subduction-zone frequency-magnitude distributions. We find no need for a characteristic model to describe frequency magnitude distributions in any of the investigated subduction zones, including Sumatra, due to an emergent skew in residuals of power law count data at high magnitudes combined with a sample bias for examining large earthquakes as candidate characteristic events.
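The Poisson confidence intervals advocated in point (ii) can be attached to binned Gutenberg-Richter counts in a few lines; the sketch below uses synthetic magnitudes (b-value of 1) and exact Garwood intervals, with arbitrary parameter values chosen for illustration:

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(42)
# Exponential magnitudes above M4 with scale 1/ln(10) give N(>m) ~ 10**(-m).
mags = 4.0 + rng.exponential(scale=1 / np.log(10), size=5000)
counts, edges = np.histogram(mags, bins=np.arange(4.0, 9.1, 0.1))

# Exact (Garwood) 95% Poisson interval for each observed bin count n:
lo = np.where(counts > 0, chi2.ppf(0.025, 2 * counts) / 2, 0.0)
hi = chi2.ppf(0.975, 2 * (counts + 1)) / 2
```

Apparent bumps at large magnitudes that stay inside these intervals need no characteristic-earthquake term, which is the logic applied to the subduction-zone catalogs above.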
The large bright quasar survey. 6: Quasar catalog and survey parameters
NASA Astrophysics Data System (ADS)
Hewett, Paul C.; Foltz, Craig B.; Chaffee, Frederic H.
1995-04-01
Positions, redshifts, and magnitudes for the 1055 quasars in the Large Bright Quasar Survey (LBQS) are presented in a single catalog. Celestial positions have been derived using the PPM catalog to provide an improved reference frame. J2000.0 coordinates are given together with improved B1950.0 positions. Redshifts calculated via cross-correlation with a high signal-to-noise ratio composite quasar spectrum are included, and the small number of typographic and redshift misidentifications in the discovery papers are corrected. Spectra of the 12 quasars added to the sample since the publication of the discovery papers are included. Descriptions of the plate material, magnitude calibration, quasar candidate selection procedures, and the identification spectroscopy are given. The calculation of the effective area of the survey for the 1055 quasars comprising the well-defined LBQS sample is specified in detail. Number-redshift and number-magnitude relations for the quasars are derived, and the strengths and limitations of the LBQS sample are summarized. Comparison with existing surveys is made and a qualitative assessment of the effectiveness of the LBQS is undertaken. Positions, magnitudes, and optical spectra of the eight objects (less than 1%) in the survey that remain unidentified are also presented.
Moscow Test Well, INEL Oversight Program: Aqueous geochemistry
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCurry, M.; Fromm, J.; Welhan, J.
1992-09-29
This report presents a summary and interpretation of data gathered during sampling of the Moscow Test Well at Moscow, Idaho during April and May of 1992. The principal objectives of this chemical survey were to validate sampling procedures with a new straddle-packer sampling tool in a previously hydrologically well-characterized and simple sampling environment, and to compare analytical results from two independent labs for reproducibility. Analytes included a wide range of metals, anions, nutrients, BNAs, and VOCs. Secondary objectives included analyzing waters from a large distilled-water tank (used for all field laboratory purposes as "pure" stock water), water that had passed through a steamer used to clean the packer, and rinsates from the packer tool itself before it was lowered into the test well. Analyses were also obtained of blanks and spikes for data validation purposes.
Machine Learning Toolkit for Extreme Scale
DOE Office of Scientific and Technical Information (OSTI.GOV)
2014-03-31
Support Vector Machines (SVM) is a popular machine learning technique that has been applied to a wide range of domains, such as science, finance, and social networks, for supervised learning. MaTEx undertakes the challenge of designing a scalable parallel SVM training algorithm for large-scale systems, including commodity multi-core machines, tightly connected supercomputers, and cloud computing systems. Several techniques are proposed for improved speed and memory usage, including adaptive and aggressive elimination of samples for faster convergence, and sparse-format representation of data samples. Several heuristics, ranging from earliest-possible to lazy elimination of non-contributing samples, are considered in MaTEx. For the many cases where an early sample elimination might result in a false positive, low-overhead mechanisms for reconstruction of key data structures are proposed. The proposed algorithm and heuristics are implemented and evaluated on various publicly available datasets.
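As a toy illustration of margin-based sample elimination (an assumed, serial simplification; MaTEx's parallel algorithm and its reconstruction mechanisms are not reproduced here), one can periodically drop points far outside the margin, since they cannot become support vectors, and retrain on the shrinking set:

```python
import numpy as np
from sklearn.svm import LinearSVC

def train_with_elimination(X, y, rounds=3, slack=1.5):
    """X: feature matrix; y: labels in {-1, +1} (assumed)."""
    idx = np.arange(len(y))
    clf = LinearSVC(max_iter=5000).fit(X, y)
    for _ in range(rounds):
        # Functional margin of each retained sample under the current model.
        margin = y[idx] * clf.decision_function(X[idx])
        keep = idx[margin < slack]   # far-outside points cannot be SVs
        if len(np.unique(y[keep])) < 2:
            break                    # never eliminate an entire class
        idx = keep
        clf.fit(X[idx], y[idx])      # retrain on the reduced working set
    return clf
```

Aggressive elimination based on a preliminary model risks false eliminations, which is exactly the case the reconstruction mechanisms mentioned above are meant to handle cheaply.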
Possession experiences in dissociative identity disorder: a preliminary study.
Ross, Colin A
2011-01-01
Dissociative trance disorder, which includes possession experiences, was introduced as a provisional diagnosis requiring further study in the Diagnostic and Statistical Manual of Mental Disorders (4th ed.). Consideration is now being given to including possession experiences within dissociative identity disorder (DID) in the Diagnostic and Statistical Manual of Mental Disorders (5th ed.), which is due to be published in 2013. In order to provide empirical data relevant to the relationship between DID and possession states, I analyzed data on the prevalence of trance, possession states, sleepwalking, and paranormal experiences in 3 large samples: patients with DID from North America; psychiatric outpatients from Shanghai, China; and a general population sample from Winnipeg, Canada. Trance, sleepwalking, paranormal, and possession experiences were much more common in the DID patients than in the 2 comparison samples. The study is preliminary and exploratory in nature because the samples were not matched in any way.
The cost of large numbers of hypothesis tests on power, effect size and sample size.
Lazzeroni, L C; Ray, A
2012-01-01
Advances in high-throughput biology and computer science are driving an exponential increase in the number of hypothesis tests in genomics and other scientific disciplines. Studies using current genotyping platforms frequently include a million or more tests. In addition to the monetary cost, this increase imposes a statistical cost owing to the multiple testing corrections needed to avoid large numbers of false-positive results. To safeguard against the resulting loss of power, some have suggested sample sizes on the order of tens of thousands that can be impractical for many diseases or may lower the quality of phenotypic measurements. This study examines the relationship between the number of tests on the one hand and power, detectable effect size or required sample size on the other. We show that once the number of tests is large, power can be maintained at a constant level, with comparatively small increases in the effect size or sample size. For example at the 0.05 significance level, a 13% increase in sample size is needed to maintain 80% power for ten million tests compared with one million tests, whereas a 70% increase in sample size is needed for 10 tests compared with a single test. Relative costs are less when measured by increases in the detectable effect size. We provide an interactive Excel calculator to compute power, effect size or sample size when comparing study designs or genome platforms involving different numbers of hypothesis tests. The results are reassuring in an era of extreme multiple testing.
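The quoted 13% and 70% figures follow from the standard Bonferroni-adjusted power calculation for a two-sided z-test, in which the required sample size scales as (z_alpha + z_power)^2. A short sketch reproducing them (a generic calculation, not the authors' Excel calculator):

```python
from scipy.stats import norm

def n_ratio(m1, m2, alpha=0.05, power=0.80):
    # Required-sample-size ratio for m2 versus m1 hypothesis tests,
    # holding power and the detectable effect size fixed.
    zb = norm.ppf(power)
    z1 = norm.ppf(1 - alpha / (2 * m1))
    z2 = norm.ppf(1 - alpha / (2 * m2))
    return ((z2 + zb) / (z1 + zb)) ** 2

print(n_ratio(1e6, 1e7))  # ~1.13: 13% more samples for ten million vs one million tests
print(n_ratio(1, 10))     # ~1.70: 70% more samples for ten tests vs a single test
```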
The Mass Function of Abell Clusters
NASA Astrophysics Data System (ADS)
Chen, J.; Huchra, J. P.; McNamara, B. R.; Mader, J.
1998-12-01
The velocity dispersion and mass functions for rich clusters of galaxies provide important constraints on models of the formation of large-scale structure (e.g., Frenk et al. 1990). However, prior estimates of the velocity dispersion or mass function for galaxy clusters have been based on either very small samples of clusters (Bahcall and Cen 1993; Zabludoff et al. 1994) or large but incomplete samples (e.g., the Girardi et al. (1998) determination from a sample of clusters with more than 30 measured galaxy redshifts). In contrast, we approach the problem by constructing a volume-limited sample of Abell clusters. We collected individual galaxy redshifts for our sample from two major galaxy velocity databases: the NASA Extragalactic Database, NED, maintained at IPAC, and ZCAT, maintained at SAO. We assembled a database with velocity information for possible cluster members and then selected cluster members based on both spatial and velocity data. Cluster velocity dispersions and masses were calculated following the procedures of Danese, De Zotti, and di Tullio (1980) and Heisler, Tremaine, and Bahcall (1985), respectively. The final velocity dispersion and mass functions were analyzed to constrain cosmological parameters by comparison with the results of N-body simulations. Our data for the cluster sample as a whole and for the individual clusters in our sample (spatial maps and velocity histograms) are available on-line at http://cfa-www.harvard.edu/~huchra/clusters. This website will be updated as more data become available in the master redshift compilations, and will be expanded to include more clusters and large groups of galaxies.
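The mass procedure cited (Heisler, Tremaine, and Bahcall 1985) is, in its virial-theorem form, the projected estimator below; it is quoted from the literature for context and may differ in detail from the exact variant used here:

```latex
M_{\mathrm{VT}} \;=\; \frac{3\pi N}{2G}\,
  \frac{\sum_{i} v_{z,i}^{2}}{\sum_{i<j} 1/R_{\perp,ij}},
```

where N is the number of member galaxies, v_z,i are line-of-sight velocities relative to the cluster mean, and R_perp,ij are projected pairwise separations.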
Schlottmann, Jamie L.; Funkhouser, Ron A.
1991-01-01
Chemical analyses of water from eight test holes and geophysical logs for nine test holes drilled in the Central Oklahoma aquifer are presented. The test holes were drilled to investigate local occurrences of potentially toxic, naturally occurring trace substances in ground water. These trace substances include arsenic, chromium, selenium, residual alpha-particle activities, and uranium. Eight of the nine test holes were drilled near wells known to contain large concentrations of one or more of the naturally occurring trace substances. One test hole was drilled in an area known to have only small concentrations of any of the naturally occurring trace substances. Water samples were collected from one to eight individual sandstone layers within each test hole. A total of 28 water samples, including four duplicate samples, were collected. The temperature, pH, specific conductance, alkalinity, and dissolved-oxygen concentrations were measured at the sample site. Laboratory determinations included major ions, nutrients, dissolved organic carbon, and trace elements (aluminum, arsenic, barium, beryllium, boron, cadmium, chromium, hexavalent chromium, cobalt, copper, iron, lead, lithium, manganese, mercury, molybdenum, nickel, selenium, silver, strontium, vanadium and zinc). Radionuclide activities and stable isotope (δ) values also were determined, including gross-alpha-particle activity, gross-beta-particle activity, radium-226, radium-228, radon-222, uranium-234, uranium-235, uranium-238, total uranium, carbon-13/carbon-12, deuterium/hydrogen-1, oxygen-18/oxygen-16, and sulfur-34/sulfur-32. Additional analyses of arsenic and selenium species are presented for selected samples, as well as analyses of density and iodine for two samples, tritium for three samples, and carbon-14 for one sample. Geophysical logs for most test holes include caliper, neutron, gamma-gamma, and natural-gamma logs, spontaneous potential, long- and short-normal resistivity, and single-point resistance. Logs for test-hole NOTS 7 do not include long- and short-normal resistivity, spontaneous-potential, or single-point resistance logs. Logs for test-hole NOTS 7A include only caliper and natural-gamma logs.
High-Throughput Analysis and Automation for Glycomics Studies.
Shubhakar, Archana; Reiding, Karli R; Gardner, Richard A; Spencer, Daniel I R; Fernandes, Daryl L; Wuhrer, Manfred
This review covers advances in analytical technologies for high-throughput (HTP) glycomics. Our focus is on structural studies of glycoprotein glycosylation to support biopharmaceutical realization and the discovery of glycan biomarkers for human disease. For biopharmaceuticals, there is increasing use of glycomics in Quality by Design studies to help optimize the glycan profiles of drugs with a view to improving their clinical performance. Glycomics is also used in comparability studies to ensure consistency of glycosylation both throughout product development and between biosimilars and innovator drugs. In clinical studies there is also expanding interest in the use of glycomics, for example in Genome Wide Association Studies, to follow changes in the glycosylation patterns of biological tissues and fluids with the progress of certain diseases. These include cancers, neurodegenerative disorders and inflammatory conditions. Despite rising activity in this field, there are significant challenges in performing large-scale glycomics studies. The requirement is accurate identification and quantitation of individual glycan structures. However, glycoconjugate samples are often very complex and heterogeneous and contain many diverse branched glycan structures. In this article we cover HTP sample preparation and derivatization methods, sample purification, robotization, optimized glycan profiling by UHPLC, MS and multiplexed CE, as well as hyphenated techniques and automated data analysis tools. Throughout, we summarize the advantages and challenges of each of these technologies. The issues considered include the reliability of the methods for glycan identification and quantitation, sample throughput, labor intensity, and affordability for large sample numbers.
Evaluation of respondent-driven sampling.
McCreesh, Nicky; Frost, Simon D W; Seeley, Janet; Katongole, Joseph; Tarsh, Matilda N; Ndunguse, Richard; Jichi, Fatima; Lunel, Natasha L; Maher, Dermot; Johnston, Lisa G; Sonnenberg, Pam; Copas, Andrew J; Hayes, Richard J; White, Richard G
2012-01-01
Respondent-driven sampling is a novel variant of link-tracing sampling for estimating the characteristics of hard-to-reach groups, such as HIV prevalence among sex workers. Despite its use by leading health organizations, the performance of this method in realistic situations is still largely unknown. We evaluated respondent-driven sampling by comparing estimates from a respondent-driven sampling survey with total population data. Total population data on age, tribe, religion, socioeconomic status, sexual activity, and HIV status were available for a population of 2402 male household heads from an open cohort in rural Uganda. A respondent-driven sampling (RDS) survey was carried out in this population, using current methods of sampling (RDS sample) and statistical inference (RDS estimates). Analyses were carried out for the full RDS sample and then repeated for the first 250 recruits (small sample). We recruited 927 household heads. Full and small RDS samples were largely representative of the total population, but both samples underrepresented men who were younger, of higher socioeconomic status, and with unknown sexual activity and HIV status. Respondent-driven sampling statistical inference methods failed to reduce these biases. Only 31%-37% (depending on method and sample size) of RDS estimates were closer to the true population proportions than the RDS sample proportions. Only 50%-74% of respondent-driven sampling bootstrap 95% confidence intervals included the population proportion. Respondent-driven sampling produced a generally representative sample of this well-connected nonhidden population. However, current respondent-driven sampling inference methods failed to reduce bias when it occurred. Whether the data required to remove bias and measure precision can be collected in a respondent-driven sampling survey is unresolved. Respondent-driven sampling should be regarded as a (potentially superior) form of convenience sampling, and caution is required when interpreting findings based on this method.
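The abstract does not reproduce the inference methods it evaluates; for orientation, the sketch below shows the widely used RDS-II (Volz-Heckathorn) estimator, which reweights each recruit by the inverse of their reported network degree. This is a generic illustration with made-up data, not code from the study.

```python
import numpy as np

def rds_ii(trait, degree):
    """RDS-II (Volz-Heckathorn) proportion estimate.

    trait  : 1 if the recruit has the characteristic of interest, else 0
    degree : reported personal network size of each recruit
    """
    trait = np.asarray(trait, dtype=float)
    weights = 1.0 / np.asarray(degree, dtype=float)  # inverse-degree weights
    return np.sum(weights * trait) / np.sum(weights)

# Toy example: five recruits with known trait status and network sizes.
print(rds_ii([1, 0, 0, 1, 0], [10, 3, 5, 2, 8]))
```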
Preservation of samples for dissolved mercury
Hamlin, S.N.
1989-01-01
Water samples for dissolved mercury require special treatment because of the high chemical mobility and volatility of this element. Widespread use of mercury and its compounds has provided many avenues for contamination of water. Two laboratory tests were done to determine the relative permeabilities of glass and plastic sample bottles to mercury vapor. Plastic containers were confirmed to be quite permeable to airborne mercury; glass containers were virtually impermeable. Methods of preservation include the use of various combinations of acids, oxidants, and complexing agents. The combination of nitric acid and potassium dichromate successfully preserved mercury in a large variety of concentrations and dissolved forms. Because this acid-oxidant preservative acts as a sink for airborne mercury and plastic containers are permeable to mercury vapor, glass bottles are preferred for sample collection. To maintain a healthy work environment and minimize the potential for contamination of water samples, mercury and its compounds are isolated from the atmosphere while in storage. Concurrently, a program to monitor environmental levels of mercury vapor in areas of potential contamination is needed to define the extent of mercury contamination and to assess the effectiveness of mercury clean-up procedures.
Large-Scale Spatial Distribution Patterns of Gastropod Assemblages in Rocky Shores
Miloslavich, Patricia; Cruz-Motta, Juan José; Klein, Eduardo; Iken, Katrin; Weinberger, Vanessa; Konar, Brenda; Trott, Tom; Pohle, Gerhard; Bigatti, Gregorio; Benedetti-Cecchi, Lisandro; Shirayama, Yoshihisa; Mead, Angela; Palomo, Gabriela; Ortiz, Manuel; Gobin, Judith; Sardi, Adriana; Díaz, Juan Manuel; Knowlton, Ann; Wong, Melisa; Peralta, Ana C.
2013-01-01
Gastropod assemblages from nearshore rocky habitats were studied over large spatial scales to (1) describe broad-scale patterns in assemblage composition, including patterns by feeding modes, (2) identify latitudinal pattern of biodiversity, i.e., richness and abundance of gastropods and/or regional hotspots, and (3) identify potential environmental and anthropogenic drivers of these assemblages. Gastropods were sampled from 45 sites distributed within 12 Large Marine Ecosystem regions (LME) following the NaGISA (Natural Geography in Shore Areas) standard protocol (www.nagisa.coml.org). A total of 393 gastropod taxa from 87 families were collected. Eight of these families (9.2%) appeared in four or more different LMEs. Among these, the Littorinidae was the most widely distributed (8 LMEs) followed by the Trochidae and the Columbellidae (6 LMEs). In all regions, assemblages were dominated by few species, the most diverse and abundant of which were herbivores. No latitudinal gradients were evident in relation to species richness or densities among sampling sites. Highest diversity was found in the Mediterranean and in the Gulf of Alaska, while highest densities were found at different latitudes and represented by few species within one genus (e.g. Afrolittorina in the Agulhas Current, Littorina in the Scotian Shelf, and Lacuna in the Gulf of Alaska). No significant correlation was found between species composition and environmental variables (r≤0.355, p>0.05). Contributing variables to this low correlation included invasive species, inorganic pollution, SST anomalies, and chlorophyll-a anomalies. Despite data limitations in this study which restrict conclusions in a global context, this work represents the first effort to sample gastropod biodiversity on rocky shores using a standardized protocol across a wide scale. Our results will generate more work to build global databases allowing for large-scale diversity comparisons of rocky intertidal assemblages. PMID:23967204
Mosaic construction, processing, and review of very large electron micrograph composites
NASA Astrophysics Data System (ADS)
Vogt, Robert C., III; Trenkle, John M.; Harmon, Laurel A.
1996-11-01
A system of programs is described for acquisition, mosaicking, cueing and interactive review of large-scale transmission electron micrograph composite images. This work was carried out as part of a final-phase clinical analysis study of a drug for the treatment of diabetic peripheral neuropathy. More than 500 nerve biopsy samples were prepared, digitally imaged, processed, and reviewed. For a given sample, typically 1000 or more 1.5-megabyte frames were acquired, for a total of between 1 and 2 gigabytes of data per sample. These frames were then automatically registered and mosaicked together into a single virtual image composite, which was subsequently used to perform automatic cueing of axons and axon clusters, as well as review and marking by qualified neuroanatomists. Statistics derived from the review process were used to evaluate the efficacy of the drug in promoting regeneration of myelinated nerve fibers. This effort demonstrates a new, entirely digital capability for doing large-scale electron micrograph studies, in which all of the relevant specimen data can be included at high magnification, as opposed to simply taking a random sample of discrete locations. It opens up the possibility of a new era in electron microscopy, one which broadens the scope of questions that this imaging modality can be used to answer.
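The abstract does not specify the registration algorithm; a standard technique for estimating the translation between overlapping frames before blending them into a mosaic is phase correlation. The following is a minimal sketch of that general method, not the system's actual code.

```python
import numpy as np

def phase_correlation(ref, frame):
    """Estimate the (row, col) translation of `frame` relative to `ref`."""
    F1 = np.fft.fft2(ref)
    F2 = np.fft.fft2(frame)
    cross_power = F1 * np.conj(F2)
    cross_power /= np.abs(cross_power) + 1e-12      # keep phase information only
    corr = np.fft.ifft2(cross_power).real
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape), dtype=float)
    dims = np.array(ref.shape, dtype=float)
    # Peaks beyond the midpoint correspond to negative shifts (FFT wrap-around).
    peak[peak > dims / 2] -= dims[peak > dims / 2]
    return peak
```

Once pairwise offsets are known, the frames can be placed into a shared coordinate system and blended into the composite.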
Niama, Fabien Roch; Vidal, Nicole; Diop-Ndiaye, Halimatou; Nguimbi, Etienne; Ahombo, Gabriel; Diakabana, Philippe; Bayonne Kombo, Édith Sophie; Mayengue, Pembe Issamou; Kobawila, Simon-Charles; Parra, Henri Joseph; Toure-Kane, Coumba
2017-07-05
In this work, we investigated the genetic diversity of HIV-1 and the presence of mutations conferring antiretroviral drug resistance in 50 drug-naïve infected persons in the Republic of Congo (RoC). Samples were obtained before large-scale access to HAART, in 2002 and 2004. To assess HIV-1 genetic recombination, sequencing of the pol gene region encoding the protease and partial reverse transcriptase was performed and analyzed against updated references, including newly characterized CRFs. The assessment of drug resistance was conducted according to the WHO protocol. Among the 50 samples analyzed for the pol gene, 50% were classified as intersubtype recombinants carrying complex structures inside the pol fragment. Five samples could not be classified (noted U). The most prevalent subtypes were G with 10 isolates and D with 11 isolates. One isolate each of A, J, H, CRF05, CRF18 and CRF37 was also found. Two samples (4%) harboring the mutations M230L and Y181C, associated with the TAMs M41L and T215Y, respectively, were found. This first study in the RoC based on the WHO classification shows that the threshold of transmitted drug resistance before large-scale access to antiretroviral therapy was 4%.
Daboul, Amro; Ivanovska, Tatyana; Bülow, Robin; Biffar, Reiner; Cardini, Andrea
2018-01-01
Using 3D anatomical landmarks from adult human head MRIs, we assessed the magnitude of inter-operator differences in Procrustes-based geometric morphometric analyses. An in depth analysis of both absolute and relative error was performed in a subsample of individuals with replicated digitization by three different operators. The effect of inter-operator differences was also explored in a large sample of more than 900 individuals. Although absolute error was not unusual for MRI measurements, including bone landmarks, shape was particularly affected by differences among operators, with up to more than 30% of sample variation accounted for by this type of error. The magnitude of the bias was such that it dominated the main pattern of bone and total (all landmarks included) shape variation, largely surpassing the effect of sex differences between hundreds of men and women. In contrast, however, we found higher reproducibility in soft-tissue nasal landmarks, despite relatively larger errors in estimates of nasal size. Our study exemplifies the assessment of measurement error using geometric morphometrics on landmarks from MRIs and stresses the importance of relating it to total sample variance within the specific methodological framework being used. In summary, precise landmarks may not necessarily imply negligible errors, especially in shape data; indeed, size and shape may be differentially impacted by measurement error and different types of landmarks may have relatively larger or smaller errors. Importantly, and consistently with other recent studies using geometric morphometrics on digital images (which, however, were not specific to MRI data), this study showed that inter-operator biases can be a major source of error in the analysis of large samples, as those that are becoming increasingly common in the 'era of big data'.
DESIGNING MONITORING AND ASSESSMENT STRATEGIES TO INCLUDE NEARSHORE ECOSYSTEMS OF THE GREAT LAKES
An expectation for monitoring and assessment of very large aquatic systems is that we can develop a strategy that recognizes and reports on ecologically-important subareas using spatially-stratified, probabilistic sampling designs. Ongoing efforts monitor the main-body, offshore ...
ERIC Educational Resources Information Center
Jaeger, Richard M.; Tesh, Anita S.
This study examined the degree and dimensions of professional satisfaction among a large, nationally-representative sample of practicing counselors. The objectives of the study included estimating the distribution of global professional satisfaction among practicing counselors; examining the relationships between counselors' global professional…
Bar-Code System for a Microbiological Laboratory
NASA Technical Reports Server (NTRS)
Law, Jennifer; Kirschner, Larry
2007-01-01
A bar-code system has been assembled for a microbiological laboratory that must examine a large number of samples. The system includes a commercial bar-code reader, computer hardware and software components, plus custom-designed database software. The software generates a user-friendly, menu-driven interface.
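For illustration only, here is a minimal sketch of the kind of sample-tracking logic such a system might be built on; the schema and field names are hypothetical, not those of the laboratory's custom database software.

```python
import sqlite3

# Hypothetical sample-tracking schema keyed on the scanned bar code.
conn = sqlite3.connect("lab_samples.db")
conn.execute("""CREATE TABLE IF NOT EXISTS samples (
    barcode     TEXT PRIMARY KEY,
    received    TEXT,
    sample_type TEXT,
    status      TEXT)""")

def log_scan(barcode: str, sample_type: str) -> None:
    # Called each time the bar-code reader scans a sample label.
    conn.execute(
        "INSERT OR REPLACE INTO samples VALUES (?, DATE('now'), ?, ?)",
        (barcode, sample_type, "received"),
    )
    conn.commit()
```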
Hair Testing for Drugs of Abuse and New Psychoactive Substances in a High-Risk Population.
Salomone, Alberto; Palamar, Joseph J; Gerace, Enrico; Di Corcia, Daniele; Vincenti, Marco
2017-06-01
Hundreds of new psychoactive substances (NPS) have emerged in the drug market over the last decade. Few drug surveys in the USA, however, ask about use of NPS, so prevalence and correlates of use are largely unknown. A large portion of NPS use is unintentional or unknown, as NPS are common adulterants in drugs like ecstasy/Molly, and most NPS are rapidly eliminated from the body, limiting the efficacy of urine, blood and saliva testing. We utilized a novel method of examining prevalence of NPS use in a high-risk population utilizing hair testing. Hair samples from high-risk nightclub and dance music attendees were tested for 82 drugs and metabolites (including NPS) using ultra-high performance liquid chromatography-tandem mass spectrometry. Eighty samples collected from different parts of the body were analyzed, 57 of which tested positive for at least one substance, either a traditional or a new drug. Among these, 26 samples tested positive for at least one NPS, the most common being butylone (25 samples). Other new drugs detected include methylone, methoxetamine, 5/6-APB, α-PVP and 4-FA. Hair analysis proved a powerful tool to gain objective biological drug-prevalence information, free from possible biases of unintentional or unknown intake and untruthful reporting of use. Such testing can be used actively or retrospectively to validate survey responses and inform research on consumption patterns, including intentional and unknown use, polydrug use, occasional NPS intake and frequent or heavy use.
Morphological changes in polycrystalline Fe after compression and release
NASA Astrophysics Data System (ADS)
Gunkelmann, Nina; Tramontina, Diego R.; Bringa, Eduardo M.; Urbassek, Herbert M.
2015-02-01
Despite a number of large-scale molecular dynamics simulations of shock compressed iron, the morphological properties of simulated recovered samples are still unexplored. Key questions remain open in this area, including the role of dislocation motion and deformation twinning in shear stress release. In this study, we present simulations of homogeneous uniaxial compression and recovery of large polycrystalline iron samples. Our results reveal significant recovery of the body-centered cubic grains with some deformation twinning driven by shear stress, in agreement with experimental results by Wang et al. [Sci. Rep. 3, 1086 (2013)]. The twin fraction agrees reasonably well with a semi-analytical model which assumes a critical shear stress for twinning. On reloading, twins disappear and the material reaches a very low strength value.
Gassner, Christoph; Rainer, Esther; Pircher, Elfriede; Markut, Lydia; Körmöczi, Günther F.; Jungbauer, Christof; Wessin, Dietmar; Klinghofer, Roswitha; Schennach, Harald; Schwind, Peter; Schönitzer, Diether
2009-01-01
Background: Validations of routinely used serological typing methods require intense performance evaluations, typically including large numbers of samples, before routine application. However, such evaluations could be improved by considering information about the frequency of standard blood groups and their variants. Methods: Using RHD and ABO population genetic data, a Caucasian-specific donor panel was compiled for a performance comparison of the three RhD and ABO serological typing methods MDmulticard (Medion Diagnostics), ID-System (DiaMed) and ScanGel (Bio-Rad). The final test panel included standard and variant RHD and ABO genotypes, e.g. RhD categories, partial and weak RhDs, RhD DELs, and ABO samples, mainly to interpret weak serological reactivity for blood group A specificity. All samples were from individuals recorded in our local DNA blood group typing database. Results: For 'standard' blood groups, results of performance were clearly interpretable for all three serological methods compared. However, when focusing on specific variant phenotypes, pronounced differences in reaction strengths and specificities were observed between them. Conclusions: A genetically and ethnically predefined donor test panel consisting of only 93 individual samples delivered highly significant results for serological performance comparisons. Such small panels offer representative power greater than that of panels assembled on the basis of statistical chance and large numbers alone. PMID:21113264
Baxter, Amanda J.; Hughes, Maria Celia; Kvaskoff, Marina; Siskind, Victor; Shekar, Sri; Aitken, Joanne F.; Green, Adele C.; Duffy, David L.; Hayward, Nicholas K.; Martin, Nicholas G.; Whiteman, David C.
2013-01-01
Cutaneous malignant melanoma (CMM) is a major health issue in Queensland, Australia, which has the world's highest incidence. Recent molecular and epidemiologic studies suggest that CMM arises through multiple etiological pathways involving gene-environment interactions. Understanding the potential mechanisms leading to CMM requires larger studies than those previously conducted. This article describes the design and baseline characteristics of Q-MEGA, the Queensland study of Melanoma: Environmental and Genetic Associations, which followed up four population-based samples of CMM patients in Queensland, including children, adolescents, men aged over 50, and a large sample of adult cases and their families, including twins. Q-MEGA aims to investigate the roles of genetic and environmental factors, and their interaction, in the etiology of melanoma. 3,471 participants took part in the follow-up study and were administered a computer-assisted telephone interview in 2002-2005. Updated data on environmental and phenotypic risk factors, and 2,777 blood samples, were collected from interviewed participants as well as a subset of relatives. This study provides a large and well-described population-based sample of CMM cases with follow-up data. Characteristics of the cases and the repeatability of sun exposure and phenotype measures between the baseline and follow-up surveys, six to 17 years later, are also described. PMID:18361720
Dean, Christopher; Kirkpatrick, Jamie B; Osborn, Jon; Doyle, Richard B; Fitzgerald, Nicholas B; Roxburgh, Stephen H
2018-01-01
There is high uncertainty in the contribution of land-use change to anthropogenic climate change, especially pertaining to below-ground carbon loss resulting from conversion of primary-to-secondary forest. Soil organic carbon (SOC) and coarse roots are concentrated close to tree trunks, a region usually unmeasured during soil carbon sampling. Soil carbon estimates and their variation with land-use change have not been correspondingly adjusted. Our aim was to deduce allometric equations that will allow improvement of SOC estimates and tree trunk carbon estimates, for primary forest stands that include large trees in rugged terrain. Terrestrial digital photography, photogrammetry and GIS software were used to produce 3D models of the buttresses, roots and humus mounds of large trees in primary forests dominated by Eucalyptus regnans in Tasmania. Models of 29, in situ eucalypts were made and analysed. 3D models of example eucalypt roots, logging debris, rainforest tree species, fallen trees, branches, root and trunk slices, and soil profiles were also derived. Measurements in 2D, from earlier work, of three buttress 'logs' were added to the data set. The 3D models had high spatial resolution. The modelling allowed checking and correction of field measurements. Tree anatomical detail was formulated, such as buttress shape, humus volume, root volume in the under-sampled zone and trunk hollow area. The allometric relationships developed link diameter at breast height and ground slope, to SOC and tree trunk carbon, the latter including a correction for senescence. These formulae can be applied to stand-level carbon accounting. The formulae allow the typically measured, inter-tree SOC to be corrected for not sampling near large trees. The 3D models developed are irreplaceable, being for increasingly rare, large trees, and they could be useful to other scientific endeavours. PMID:29593855
Irish study of high-density Schizophrenia families: Field methods and power to detect linkage
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kendler, K.S.; Straub, R.E.; MacLean, C.J.
Large samples of multiplex pedigrees will probably be needed to detect susceptibility loci for schizophrenia by linkage analysis. Standardized ascertainment of such pedigrees from culturally and ethnically homogeneous populations may improve the probability of detection and replication of linkage. The Irish Study of High-Density Schizophrenia Families (ISHDSF) was formed from standardized ascertainment of multiplex schizophrenia families in 39 psychiatric facilities covering over 90% of the population in Ireland and Northern Ireland. We here describe a phenotypic sample and a subset thereof, the linkage sample. Individuals were included in the phenotypic sample if adequate diagnostic information, based on personal interview and/or hospital record, was available. Only individuals with available DNA were included in the linkage sample. Inclusion of a pedigree into the phenotypic sample required at least two first-, second-, or third-degree relatives with non-affective psychosis (NAP), one of whom had schizophrenia (S) or poor-outcome schizoaffective disorder (PO-SAD). Entry into the linkage sample required DNA samples on at least two individuals with NAP, of whom at least one had S or PO-SAD. Affection was defined by narrow, intermediate, and broad criteria. 75 refs., 6 tabs.
Wygant, Dustin B; Ben-Porath, Yossef S; Arbisi, Paul A; Berry, David T R; Freeman, David B; Heilbronner, Robert L
2009-11-01
The current study examined the effectiveness of the MMPI-2 Restructured Form (MMPI-2-RF; Ben-Porath and Tellegen, 2008) over-reporting indicators in civil forensic settings. The MMPI-2-RF includes three revised MMPI-2 over-reporting validity scales and a new scale to detect over-reported somatic complaints. Participants dissimulated medical and neuropsychological complaints in two simulation samples, and a known-groups sample used symptom validity tests as a response bias criterion. Results indicated large effect sizes for the MMPI-2-RF validity scales, including a Cohen's d of .90 for Fs in a head injury simulation sample, 2.31 for FBS-r, 2.01 for F-r, and 1.97 for Fs in a medical simulation sample, and 1.45 for FBS-r and 1.30 for F-r in identifying poor effort on SVTs. Classification results indicated good sensitivity and specificity for the scales across the samples. This study indicates that the MMPI-2-RF over-reporting validity scales are effective at detecting symptom over-reporting in civil forensic settings.
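The effect sizes reported here are standard Cohen's d values; as a point of reference, the sketch below shows the usual pooled-standard-deviation computation. It illustrates the formula only and is not the study's analysis code.

```python
import numpy as np

def cohens_d(group1, group2):
    """Cohen's d for two independent groups, using the pooled SD."""
    x = np.asarray(group1, dtype=float)
    y = np.asarray(group2, dtype=float)
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2)
    return (x.mean() - y.mean()) / np.sqrt(pooled_var)

# Toy example: simulated over-reporters vs. honest responders on a validity scale.
rng = np.random.default_rng(1)
print(cohens_d(rng.normal(80, 10, 100), rng.normal(60, 10, 100)))  # d near 2
```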
Anxiety, Depression, Hostility and General Psychopathology: An Arabian Study.
ERIC Educational Resources Information Center
Ibrahim, Abdel-Sattar; Ibrahim, Radwa M.
In Arabian cultures, the psychosocial characteristics of psychopathological trends, including depression, anxiety, and hostility remain largely unknown. Scales measuring depression, anxiety, and hostility were administered to a voluntary sample of 989 Saudi Arabian men and 1,024 Saudi women coming from different social, economical, and educational…
Transport Coefficients from Large Deviation Functions
NASA Astrophysics Data System (ADS)
Gao, Chloe; Limmer, David
2017-10-01
We describe a method for computing transport coefficients from the direct evaluation of large deviation functions. This method is general, relying only on equilibrium fluctuations, and is statistically efficient, employing trajectory-based importance sampling. Equilibrium fluctuations of molecular currents are characterized by their large deviation function, which is a scaled cumulant generating function analogous to the free energy. A diffusion Monte Carlo algorithm is used to evaluate the large deviation functions, from which arbitrary transport coefficients are derivable. We find significant statistical improvement over traditional Green-Kubo based calculations. The systematic and statistical errors of this method are analyzed in the context of specific transport coefficient calculations, including the shear viscosity, interfacial friction coefficient, and thermal conductivity.
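For contrast, the traditional Green-Kubo route that the authors benchmark against integrates the equilibrium autocorrelation of a molecular current. Below is a minimal sketch of that baseline; the current time series, time step, and prefactor are placeholders for whichever transport coefficient is intended (e.g. an off-diagonal stress component with prefactor V/(k_B T) for shear viscosity).

```python
import numpy as np

def green_kubo(current, dt, prefactor=1.0, cutoff=None):
    # Transport coefficient ~ prefactor * integral_0^inf <J(0)J(t)> dt,
    # estimated from one long equilibrium time series of the current J.
    J = np.asarray(current, dtype=float)
    n = len(J)
    # Autocorrelation via FFT (Wiener-Khinchin), zero-padded to avoid wrap-around.
    f = np.fft.rfft(J, n=2 * n)
    acf = np.fft.irfft(f * np.conj(f))[:n]
    acf /= np.arange(n, 0, -1)       # average over the number of overlapping pairs
    if cutoff is not None:
        acf = acf[:cutoff]           # truncate before the noisy tail dominates
    # Trapezoidal integration of the correlation function.
    return prefactor * dt * (0.5 * acf[0] + acf[1:].sum())
```

The statistical difficulty with this estimator is precisely the long, noisy tail of the correlation function, which the large-deviation approach described above aims to avoid.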
CALIFA: a diameter-selected sample for an integral field spectroscopy galaxy survey
NASA Astrophysics Data System (ADS)
Walcher, C. J.; Wisotzki, L.; Bekeraité, S.; Husemann, B.; Iglesias-Páramo, J.; Backsmann, N.; Barrera Ballesteros, J.; Catalán-Torrecilla, C.; Cortijo, C.; del Olmo, A.; Garcia Lorenzo, B.; Falcón-Barroso, J.; Jilkova, L.; Kalinova, V.; Mast, D.; Marino, R. A.; Méndez-Abreu, J.; Pasquali, A.; Sánchez, S. F.; Trager, S.; Zibetti, S.; Aguerri, J. A. L.; Alves, J.; Bland-Hawthorn, J.; Boselli, A.; Castillo Morales, A.; Cid Fernandes, R.; Flores, H.; Galbany, L.; Gallazzi, A.; García-Benito, R.; Gil de Paz, A.; González-Delgado, R. M.; Jahnke, K.; Jungwiert, B.; Kehrig, C.; Lyubenova, M.; Márquez Perez, I.; Masegosa, J.; Monreal Ibero, A.; Pérez, E.; Quirrenbach, A.; Rosales-Ortega, F. F.; Roth, M. M.; Sanchez-Blazquez, P.; Spekkens, K.; Tundo, E.; van de Ven, G.; Verheijen, M. A. W.; Vilchez, J. V.; Ziegler, B.
2014-09-01
We describe and discuss the selection procedure and statistical properties of the galaxy sample used by the Calar Alto Legacy Integral Field Area (CALIFA) survey, a public legacy survey of 600 galaxies using integral field spectroscopy. The CALIFA "mother sample" was selected from the Sloan Digital Sky Survey (SDSS) DR7 photometric catalogue to include all galaxies with an r-band isophotal major axis between 45'' and 79.2'' and with a redshift 0.005 < z < 0.03. The mother sample contains 939 objects, 600 of which will be observed in the course of the CALIFA survey. The selection of targets for observations is based solely on visibility and thus keeps the statistical properties of the mother sample. By comparison with a large set of SDSS galaxies, we find that the CALIFA sample is representative of galaxies over a luminosity range of -19 > Mr > -23.1 and over a stellar mass range between 10^9.7 and 10^11.4 M⊙. In particular, within these ranges, the diameter selection does not lead to any significant bias against - or in favour of - intrinsically large or small galaxies. Only below luminosities of Mr = -19 (or stellar masses <10^9.7 M⊙) is there a prevalence of galaxies with larger isophotal sizes, especially of nearly edge-on late-type galaxies, but such galaxies form <10% of the full sample. We estimate volume-corrected distribution functions in luminosities and sizes and show that these are statistically fully compatible with estimates from the full SDSS when accounting for large-scale structure. For full characterization of the sample, we also present a number of value-added quantities determined for the galaxies in the CALIFA sample. These include consistent multi-band photometry based on growth curve analyses; stellar masses; distances and quantities derived from these; morphological classifications; and an overview of available multi-wavelength photometric measurements. We also explore different ways of characterizing the environments of CALIFA galaxies, finding that the sample covers environmental conditions from the field to genuine clusters. We finally consider the expected incidence of active galactic nuclei among CALIFA galaxies given the existing pre-CALIFA data, finding that the final observed CALIFA sample will contain approximately 30 Seyfert 2 galaxies. Based on observations collected at the Centro Astronómico Hispano Alemán (CAHA) at Calar Alto, operated jointly by the Max Planck Institute for Astronomy and the Instituto de Astrofísica de Andalucía (CSIC). Publicly released data products from CALIFA are made available on the webpage http://www.caha.es/CALIFA
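The mother-sample cuts quoted above reduce to two simple filters on the SDSS photometric catalogue. As a hedged illustration (the column names are placeholders, not necessarily those used by the actual CALIFA selection scripts):

```python
import pandas as pd

def califa_mother_sample(sdss: pd.DataFrame) -> pd.DataFrame:
    """Apply the CALIFA selection: r-band isophotal major axis between
    45 and 79.2 arcsec, and redshift 0.005 < z < 0.03."""
    return sdss[
        sdss["isoA_r"].between(45.0, 79.2)                      # arcsec
        & sdss["z"].between(0.005, 0.03, inclusive="neither")   # redshift cut
    ]
```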
Exploring the acceptability of human papillomavirus self-sampling among Muslim immigrant women.
Lofters, Aisha K; Vahabi, Mandana; Fardad, Mitra; Raza, Afrah
2017-01-01
With appropriate screening (ie, the Papanicolaou [Pap] test), cervical cancer is highly preventable, and high-income countries, including Canada, have observed significant decreases in cervical cancer mortality. However, certain subgroups, including immigrants from countries with large Muslim populations, experience disparities in cervical cancer screening. Little is known about the acceptability of human papillomavirus (HPV) self-sampling as a screening strategy among Muslim immigrant women in Canada. This study assessed cervical cancer screening practices, knowledge and attitudes, and acceptability of HPV self-sampling among Muslim immigrant women. A convenience sample of 30 women was recruited over a 3-month period (June-August 2015) in the Greater Toronto Area. All women were between 21 and 69 years old, foreign-born, and self-identified as Muslim, and had good knowledge of English. Data were collected through a self-completed questionnaire. More than half of the participants falsely indicated that Pap tests may cause cervical infection, and 46.7% indicated that the test is an intrusion on privacy. The majority of women reported that they would be willing to try HPV self-sampling, and more than half would prefer this method to provider-administered sampling methods. Barriers to self-sampling included confidence in the ability to perform the test and perceived cost, and facilitators included convenience and privacy being preserved. The results demonstrate that HPV self-sampling may provide a favorable alternative model of care to the traditional provider-administered Pap testing. These findings add important information to the literature related to promoting cancer screening among women who are under or never screened for cervical cancer.
Tissue Sampling Guides for Porcine Biomedical Models.
Albl, Barbara; Haesner, Serena; Braun-Reichhart, Christina; Streckel, Elisabeth; Renner, Simone; Seeliger, Frank; Wolf, Eckhard; Wanke, Rüdiger; Blutke, Andreas
2016-04-01
This article provides guidelines for organ and tissue sampling adapted to porcine animal models in translational medical research. Detailed protocols for the determination of sampling locations and numbers as well as recommendations on the orientation, size, and trimming direction of samples from ∼50 different porcine organs and tissues are provided in the Supplementary Material. The proposed sampling protocols include the generation of samples suitable for subsequent qualitative and quantitative analyses, including cryohistology, paraffin, and plastic histology; immunohistochemistry; in situ hybridization; electron microscopy; and quantitative stereology, as well as molecular analyses of DNA, RNA, proteins, metabolites, and electrolytes. With regard to the planned extent of sampling efforts, time, and personnel expenses, and dependent upon the scheduled analyses, different protocols are provided. These protocols are adjusted for (I) routine screenings, as used in general toxicity studies or in analyses of gene expression patterns or histopathological organ alterations, (II) advanced analyses of single organs/tissues, and (III) large-scale sampling procedures to be applied in biobank projects. Providing a robust reference for studies of porcine models, the described protocols will ensure the efficiency of sampling, the systematic recovery of high-quality samples representing the entire organ or tissue, as well as the intra-/interstudy comparability and reproducibility of results.
A phylogeny and revised classification of Squamata, including 4161 species of lizards and snakes
2013-01-01
Background The extant squamates (>9400 known species of lizards and snakes) are one of the most diverse and conspicuous radiations of terrestrial vertebrates, but no studies have attempted to reconstruct a phylogeny for the group with large-scale taxon sampling. Such an estimate is invaluable for comparative evolutionary studies, and to address their classification. Here, we present the first large-scale phylogenetic estimate for Squamata. Results The estimated phylogeny contains 4161 species, representing all currently recognized families and subfamilies. The analysis is based on up to 12896 base pairs of sequence data per species (average = 2497 bp) from 12 genes, including seven nuclear loci (BDNF, c-mos, NT3, PDC, R35, RAG-1, and RAG-2), and five mitochondrial genes (12S, 16S, cytochrome b, ND2, and ND4). The tree provides important confirmation for recent estimates of higher-level squamate phylogeny based on molecular data (but with more limited taxon sampling), estimates that are very different from previous morphology-based hypotheses. The tree also includes many relationships that differ from previous molecular estimates and many that differ from traditional taxonomy. Conclusions We present a new large-scale phylogeny of squamate reptiles that should be a valuable resource for future comparative studies. We also present a revised classification of squamates at the family and subfamily level to bring the taxonomy more in line with the new phylogenetic hypothesis. This classification includes new, resurrected, and modified subfamilies within gymnophthalmid and scincid lizards, and boid, colubrid, and lamprophiid snakes. PMID:23627680
Effect of clearance on cartilage tribology in hip hemi-arthroplasty.
Lizhang, Jia; Taylor, Simon D; Jin, Zhongmin; Fisher, John; Williams, Sophie
2013-12-01
Hemi-arthroplasty of the hip (an artificial femoral head articulating against the natural acetabulum) is used to treat fractured necks of femur; however, there is evidence that articulation causes erosion of the cartilage, resulting in pain for the patient. Parameters that may influence this cartilage erosion include head material and roughness, clearance between the head and acetabulum and activity levels of the patient. This study has assessed the effect of clearance of hemi-arthroplasty articulations on the contact stress, friction and cartilage deformation in an in vitro tribological simulation of the hemi-arthroplasty joint that applied dynamic loads and motion. It has been demonstrated that peak contact stress increased from 5.6 to 10.6 MPa as radial clearance increased from small (<0.6 mm) to extra-large (>1.8 mm). In all samples, friction factor increased with time and was significantly less with extra-large clearances compared to small (<0.6 mm), medium (0.6-1.2 mm) and large (1.2-1.8 mm) clearances. The cartilage deformation observed was significantly greater in acetabulum samples paired to give small or extra-large clearances compared to those with medium or large clearances.
Ryan, John Jake; Rawn, Dorothea F K
2014-09-01
Human milk samples were collected from individuals residing in various regions across Canada, mostly in the years 1992 to 2005. These included five large cities in southern Canada as well as samples from Nunavik in northern Quebec. Comparative samples were also collected from residents of Austin, Texas, USA in 2002 and 2004. More than 300 milk samples were analysed for the brominated flame retardants (BFRs) PBDEs and HBCD by extraction, purification and quantification using either isotope dilution gas chromatography-mass spectrometry (GC-MS) or liquid chromatography-MS. The Canadian total PBDE values in the years 2002-2005 show median levels of about 20 μg/kg on a lipid basis, a value significantly higher than in the 1980s and 1990s. Milk samples from Inuit donors in the northern region of Nunavik were slightly lower in PBDE concentrations than those from populated regions in the south of Quebec. Milk samples from Ontario contained slightly lower amounts of PBDEs in two time periods than those from Texas. HBCD levels in most milk samples were usually less than 1 ppb on a milk-lipid basis and were dominated by the α-isomer. This large data set of BFRs in Canadian human milk demonstrates an increase over the last few decades in human exposure to BFRs, which now appears to have stabilized.
"Tools For Analysis and Visualization of Large Time- Varying CFD Data Sets"
NASA Technical Reports Server (NTRS)
Wilhelms, Jane; vanGelder, Allen
1999-01-01
During the four years of this grant (including the one-year extension), we explored many aspects of the visualization of large CFD (Computational Fluid Dynamics) datasets. These have included new direct volume rendering approaches, hierarchical methods, volume decimation, error metrics, parallelization, hardware texture mapping, and methods for analyzing and comparing images. First, we implemented an extremely general direct volume rendering approach that can be used to render rectilinear, curvilinear, or tetrahedral grids, including overlapping multiple zone grids and time-varying grids. Next, we developed techniques for associating the sample data with a k-d tree, a simple hierarchical data model to approximate samples in the regions covered by each node of the tree, and an error metric for the accuracy of the model. We also explored a new method for determining the accuracy of approximate models based on the light field method described at ACM SIGGRAPH (Association for Computing Machinery Special Interest Group on Computer Graphics) '96. In our initial implementation, we automatically image the volume from 32 approximately evenly distributed positions on the surface of an enclosing tessellated sphere. We then calculate differences between these images under different conditions of volume approximation or decimation.
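A minimal sketch of the k-d tree idea described above: each node stores a cheap approximation (here, the mean) of the samples in its region together with an error metric, so a renderer can descend only where the error exceeds a tolerance. This illustrates the general technique, not the grant's implementation.

```python
import numpy as np

class KDNode:
    """k-d tree over scattered volume samples with a per-node approximation."""
    def __init__(self, points, values, depth=0, leaf_size=32):
        self.mean = float(values.mean())                      # model for this region
        self.error = float(np.abs(values - self.mean).max())  # error metric
        self.left = self.right = None
        if len(points) > leaf_size:
            axis = depth % points.shape[1]                    # cycle through x, y, z
            order = np.argsort(points[:, axis])
            mid = len(points) // 2
            self.left = KDNode(points[order[:mid]], values[order[:mid]],
                               depth + 1, leaf_size)
            self.right = KDNode(points[order[mid:]], values[order[mid:]],
                                depth + 1, leaf_size)

# Toy usage: 10,000 random samples of a smooth scalar field in 3D.
pts = np.random.rand(10000, 3)
vals = np.sin(4 * pts[:, 0]) * np.cos(3 * pts[:, 1]) + pts[:, 2]
root = KDNode(pts, vals)
print(root.error)   # worst-case error if the whole volume used one mean value
```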
Nonlinear Finite Element Analysis of Shells with Large Aspect Ratio
NASA Technical Reports Server (NTRS)
Chang, T. Y.; Sawamiphakdi, K.
1984-01-01
A higher-order degenerated shell element with nine nodes was selected for large deformation and post-buckling analysis of thick or thin shells. Elastic-plastic material properties are also included. The post-buckling analysis algorithm is given. Using a square plate, it was demonstrated that the nine-node element does not exhibit shear locking even when its aspect ratio is increased to the order of 10^8. Two sample problems are given to illustrate the analysis capability of the shell element.
Textual data in psychiatry: reasoning by analogy to quantitative principles.
Yang, Suzanne; Mulvey, Edward P; Falissard, Bruno
2012-08-01
Personal meaning in subjective experience is a key element in the treatment of persons with mental disorders. Open-response speech samples would appear to be suitable for studying this type of subjective experience, but there are still important challenges in using language as data. Scientific principles involved in sample size calculation, validity, and reliability may be applicable, by analogy, to data collected in the form of words. We describe a rationale for including computer-assisted techniques as one step of a qualitative analysis procedure that includes manual reading. Clarification of a framework for including language as data in psychiatric research may allow us to more effectively bridge biological and psychometric research with clinical practice, a setting where the patient's clinical "data" are, in large part, conveyed in words.
Project Hyreus: Mars Sample Return Mission Utilizing in Situ Propellant Production
NASA Technical Reports Server (NTRS)
Bruckner, A. P.; Thill, Brian; Abrego, Anita; Koch, Amber; Kruse, Ross; Nicholson, Heather; Nill, Laurie; Schubert, Heidi; Schug, Eric; Smith, Brian
1993-01-01
Project Hyreus is an unmanned Mars sample return mission that utilizes propellants manufactured in situ from the Martian atmosphere for the return voyage. A key goal of the mission is to demonstrate the considerable benefits of using indigenous resources and to test the viability of this approach as a precursor to manned Mars missions. The techniques, materials, and equipment used in Project Hyreus represent those that are currently available or that could be developed and readied in time for the proposed launch date in 2003. Project Hyreus includes such features as a Mars-orbiting satellite equipped with ground-penetrating radar, a large rover capable of sample gathering and detailed surface investigations, and a planetary science array to perform on-site research before samples are returned to Earth. Project Hyreus calls for the Mars Landing Vehicle to land in the Mangala Valles region of Mars, where it will remain for approximately 1.5 years. Methane and oxygen propellant for the Earth return voyage will be produced using carbon dioxide from the Martian atmosphere and a small supply of hydrogen brought from Earth. This process is key to returning a large Martian sample to Earth with a single Earth launch.
2017-12-01
[Table and footnote residue from a blast-test report: notes on full or partial loss of test data due to instrumentation/triggering failures and on gages not included in these tests, plus a fragment of Table 2, "Sample properties" (columns: Test, Description, Dimensions, Weight (lbs.), Strength (psi), Notes), listing a fully tempered glass window (4-ft x 6-ft), an estimate of prism strength for medium-weight CMU, and a 5.5-in thick solid reinforced concrete panel evaluated for strength.]
Cultural influences on personality.
Triandis, Harry C; Suh, Eunkook M
2002-01-01
Ecologies shape cultures; cultures influence the development of personalities. There are both universal and culture-specific aspects of variation in personality. Some culture-specific aspects correspond to cultural syndromes such as complexity, tightness, individualism, and collectivism. A large body of literature suggests that the Big Five personality factors emerge in various cultures. However, caution is required in arguing for such universality, because most studies have not included emic (culture-specific) traits and have not studied samples that are extremely different in culture from Western samples.
The Odd Man Out: How Fathers Navigate the Special Education System
ERIC Educational Resources Information Center
Mueller, Tracy Gershwin; Buckley, Pamela C.
2014-01-01
Research about parent experiences with the special education system is largely dominated by the perspectives of mothers. Using purposeful sampling techniques, we interviewed 20 active fathers about their experiences navigating the special education system. All the fathers described three primary roles they experienced, including acting as a…
17 CFR Appendix B to Part 420 - Sample Large Position Report
Code of Federal Regulations, 2014 CFR
2014-04-01
[Form excerpt, garbled in extraction: the sample large position report covers the security being reported and the date for which information is reported, with dollar amounts in millions at par value as of trade date; line items include principal components of the specific security, total net trading position, and gross financing, plus memoranda such as the gross par amount of fails to deliver.]
Quantitative Approaches to Group Research: Suggestions for Best Practices
ERIC Educational Resources Information Center
McCarthy, Christopher J.; Whittaker, Tiffany A.; Boyle, Lauren H.; Eyal, Maytal
2017-01-01
Rigorous scholarship is essential to the continued growth of group work, yet the unique nature of this counseling specialty poses challenges for quantitative researchers. The purpose of this proposal is to overview unique challenges to quantitative research with groups in the counseling field, including difficulty in obtaining large sample sizes…
An Investigation of Student Psychological Wellbeing: Honors versus Nonhonors Undergraduate Education
ERIC Educational Resources Information Center
Plominski, Abigail P.; Burns, Lawrence R.
2018-01-01
The purpose of this study was to describe the current state of psychological wellbeing in gifted and nongifted undergraduate student sample populations and identify undergraduate populations experiencing heightened levels of distress within a large Midwestern public university. Study participants included 641 honors and 386 nonhonors undergraduate…
Neglected but Exciting Concepts in Developmental and Neurobiological Psychology
ERIC Educational Resources Information Center
Jordan, Evan M.; Thomas, David G.
2017-01-01
This review provides an evaluative overview of five concepts specific to developmental and neurobiological psychology that are found to be largely overlooked in current textbooks. A sample of 19 introductory psychology texts was surveyed to develop a list, including glial cell signaling, grandmother cells, memory reconsolidation, brain plasticity,…
COMPARISON OF TWO DIFFERENT SOLID PHASE EXTRACTION/LARGE VOLUME INJECTION PROCEDURES FOR METHOD 8270
Two solid phase (SPE) and one traditional continuous liquid-liquid extraction method are compared for analysis of Method 8270 SVOCs. Productivity parameters include data quality, sample volume, analysis time and solvent waste.
One SPE system, unique in the U.S., uses aut...
Neurobehavioral studies pose unique challenges for dose-response modeling, including small sample size and relatively large intra-subject variation, repeated measurements over time, multiple endpoints with both continuous and ordinal scales, and time dependence of risk characteri...
ERIC Educational Resources Information Center
Holt, Melissa K.; Greif Green, Jennifer; Reid, Gerald; DiMeo, Amanda; Espelage, Dorothy L.; Felix, Erika D.; Furlong, Michael J.; Poteat, V. Paul; Sharkey, Jill D.
2014-01-01
Objectives: This study examined whether childhood bullying victimization was associated with psychosocial and academic functioning at college. Participants: The sample consisted of 413 first-year students from a large northeastern university. Methods: Students completed an online survey in February 2012 that included items assessing past bullying…
The Career Futures Inventory-Revised: Measuring Dimensions of Career Adaptability
ERIC Educational Resources Information Center
Rottinghaus, Patrick J.; Buelow, Kristine L.; Matyja, Anna; Schneider, Madalyn R.
2012-01-01
This study reports the development and initial validation of the "Career Futures Inventory-Revised" (CFI-R) in two large samples of university students. The 28-item CFI-R assesses aspects of career adaptability, including positive career planning attitudes, general outcome expectations, and components of Parsons' tripartite model and…
9 CFR 381.412 - Reference amounts customarily consumed per eating occasion.
Code of Federal Regulations, 2013 CFR
2013-01-01
... appropriate national food consumption surveys. (2) The Reference Amounts are calculated for an infant or child... are based on data set forth in appropriate national food consumption surveys. Such Reference Amounts... child under 4 years of age. (3) An appropriate national food consumption survey includes a large sample...
9 CFR 317.312 - Reference amounts customarily consumed per eating occasion.
Code of Federal Regulations, 2013 CFR
2013-01-01
... appropriate national food consumption surveys. (2) The Reference Amounts are calculated for an infant or child... are based on data set forth in appropriate national food consumption surveys. Such Reference Amounts... child under 4 years of age. (3) An appropriate national food consumption survey includes a large sample...
A Web-Hosted R Workflow to Simplify and Automate the Analysis of 16S NGS Data
Next-Generation Sequencing (NGS) produces large data sets that include tens-of-thousands of sequence reads per sample. For analysis of bacterial diversity, 16S NGS sequences are typically analyzed in a workflow that containing best-of-breed bioinformatics packages that may levera...
A robust method of thin plate spline and its application to DEM construction
NASA Astrophysics Data System (ADS)
Chen, Chuanfa; Li, Yanyan
2012-11-01
In order to avoid the ill-conditioning problem of the thin plate spline (TPS), the orthogonal least squares (OLS) method was introduced, and a modified OLS (MOLS) was developed. The MOLS of TPS (TPS-M) can not only select significant points, termed knots, from large and dense sampling data sets, but also easily compute the weights of the knots in terms of back-substitution. For interpolating large numbers of sampling points, we developed a local TPS-M, in which some neighboring sampling points around the point being estimated are selected for computation. Numerical tests indicate that, irrespective of sampling noise level, the average performance of TPS-M compares favourably with that of smoothing TPS. For the same simulation accuracy, the computational time of TPS-M decreases as the number of sampling points increases. The smooth fitting results on noisy lidar-derived data indicate that TPS-M has an obvious smoothing effect, on par with smoothing TPS. The example of constructing a series of large-scale DEMs, located in Shandong province, China, was employed to comparatively analyze the estimation accuracies of the two versions of TPS and the classical interpolation methods, including inverse distance weighting (IDW), ordinary kriging (OK) and universal kriging with a second-order drift function (UK). Results show that, regardless of sampling interval and spatial resolution, TPS-M is more accurate than the classical interpolation methods, except for smoothing TPS at the finest sampling interval of 20 m and the two versions of kriging at a spatial resolution of 15 m. In conclusion, TPS-M, which avoids the ill-conditioning problem, is considered a robust method for DEM construction.
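The paper's own TPS-M code is not reproduced here; as a runnable point of reference, SciPy's RBFInterpolator provides a thin plate spline with an optional nearest-neighbors mode that is similar in spirit to the local TPS-M described above. A hedged sketch on synthetic terrain:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)
xy = rng.uniform(0, 1000, size=(5000, 2))              # scattered sample locations (m)
z = np.sin(xy[:, 0] / 200) * np.cos(xy[:, 1] / 300)    # synthetic terrain heights
z += rng.normal(0, 0.05, len(z))                       # sampling noise

# Thin plate spline; `neighbors` makes the fit local (a small system per
# evaluation point), and `smoothing` trades exact interpolation for
# resistance to noise, analogous to smoothing TPS.
tps = RBFInterpolator(xy, z, kernel="thin_plate_spline",
                      neighbors=50, smoothing=1.0)

gx, gy = np.meshgrid(np.linspace(0, 1000, 101), np.linspace(0, 1000, 101))
dem = tps(np.column_stack([gx.ravel(), gy.ravel()])).reshape(gx.shape)
```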
Chen, Zongbao; Lin, Zian; Zhang, Lin; Cai, Yan; Zhang, Lan
2012-04-07
A novel method of microemulsion electrokinetic capillary chromatography (MEEKC) coupled with on-line large-volume sample stacking was developed for the analysis of six plant hormones, including indole-3-acetic acid, indole-3-butyric acid, indole-3-propionic acid, 1-naphthaleneacetic acid, abscisic acid and salicylic acid. Baseline separation of the six plant hormones was achieved within 10 min using a microemulsion background electrolyte containing 97.2% (w/w) 10 mM borate buffer at pH 9.2, 1.0% (w/w) ethyl acetate as oil droplets, 0.6% (w/w) sodium dodecyl sulphate as surfactant and 1.2% (w/w) 1-butanol as cosurfactant. In addition, an on-line concentration method based on large-volume sample stacking and multiple-wavelength detection was adopted to improve detection sensitivity for trace-level hormones in real samples. The optimized method provided an approximately 50-100-fold increase in detection sensitivity compared with the single MEEKC method, with detection limits (S/N = 3) between 0.005 and 0.02 μg mL(-1). The proposed method was simple, rapid and sensitive, and could be applied to the determination of the six plant hormones in spiked water samples and tobacco leaves, and of 1-naphthylacetic acid in leaf fertilizer. Recoveries ranged from 76.0% to 119.1%, and good reproducibility was obtained, with relative standard deviations (RSDs) below 6.6%.
Analysis and imaging of biocidal agrochemicals using ToF-SIMS.
Converso, Valerio; Fearn, Sarah; Ware, Ecaterina; McPhail, David S; Flemming, Anthony J; Bundy, Jacob G
2017-09-06
ToF-SIMS has been used increasingly in recent years to examine biological matrices, in particular for biomedical research, although considerable development is still needed to maximise the value of this technique in the life sciences. The main issue for biological matrices is the complexity of the mass spectra, and therefore the difficulty of specifically and precisely detecting analytes in a biological sample. Here we evaluated the use of ToF-SIMS in the agrochemical field, which remains a largely unexplored area for this technique. We profiled a large number of biocidal active ingredients (herbicides, fungicides, and insecticides); we then selected fludioxonil, a halogenated fungicide, as a model compound for more detailed study, including the effect of co-occurring biomolecules on detection limits. ToF-SIMS sensitivity varied widely across the different active ingredients, but fludioxonil was readily detected in real-world samples (wheat seeds coated with a commercial formulation). Fludioxonil did not penetrate the seed to any great depth, but was largely restricted to a layer coating the seed surface. ToF-SIMS has clear potential as a tool not only for detecting biocides in biological samples, but also for mapping their distribution.
The large impact process inferred from the geology of lunar multiring basins
NASA Technical Reports Server (NTRS)
Spudis, Paul D.
1992-01-01
The nature of the impact process has been inferred through the study of the geology of a wide variety of impact crater types and sizes. Some of the largest craters known are the multiring basins found in ancient terrains of the terrestrial planets. Of these features, those found on the Moon possess the most extensive and diverse data coverage, including morphological, geochemical, geophysical, and sample data. The study of the geology of lunar basins over the past 10 years has given us a rudimentary understanding of how these large structures have formed and evolved. The topics covered include basin morphology, basin ejecta, basin excavation, and basin ring formation.
Holderman, C J; Gezan, S A; Stone, A E S; Connelly, C R; Kaufman, P E
2018-01-10
Mosquito surveillance typically uses Centers for Disease Control and Prevention (CDC) mosquito light traps baited with CO2. From January 2013 to March 2015, we sampled seven field sites using three active mosquito-trapping techniques (two different aspirators and a sweep net) and the stationary CO2-baited CDC mosquito light trap to determine mosquito capture efficacy for each technique. Sampling occurred in four suburban backyards and three dog kennel facilities near Gainesville, FL, USA; species collection and relative abundance were measured. A total of 32 species and 70,090 individual mosquitoes were collected, including a new record for Alachua County, Florida, Aedes hendersoni (Cockerell). The dominant (>5% of total capture) mosquito species collected during the study included Aedes atlanticus (Dyar and Knab), Aedes infirmatus (Dyar and Knab), Anopheles crucians Wiedemann, Culiseta melanura (Coquillett), Culex erraticus (Dyar and Knab), Culex nigripalpus Theobald, and Uranotaenia sapphirina (Osten Sacken). The CDC trap captured the most species (29), followed by large aspirator (28), small aspirator (26), and the sweep net (23). All dominant species were captured with each sampling technique. Excluding Wyeomyia mitchellii (Theobald), all subdominant species (1-5% of total capture) were collected with each sampling technique. Future sampling should consider the utility (e.g., large numbers are readily collected) and limitations (e.g., personnel requirements) of aspirator collections when designing field-based mosquito sampling projects, especially those in residential areas or those focused upon species captured. © The Author(s) 2017. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
A method for analysing small samples of floral pollen for free and protein-bound amino acids.
Stabler, Daniel; Power, Eileen F; Borland, Anne M; Barnes, Jeremy D; Wright, Geraldine A
2018-02-01
Pollen provides floral visitors with essential nutrients including proteins, lipids, vitamins and minerals. As an important nutrient resource for pollinators, including honeybees and bumblebees, pollen quality is of growing interest in assessing the nutrition available to foraging bees. To date, quantifying the protein-bound amino acids in pollen has been difficult, and methods rely on large amounts of pollen, typically more than 1 g. More usual is to estimate a crude protein value based on the nitrogen content of pollen; however, such methods provide no information on the distribution of essential and non-essential amino acids constituting the proteins. Here, we describe a method of microwave-assisted acid hydrolysis using small amounts of pollen that allows exploration of amino acid composition, quantified using ultra-high-performance liquid chromatography (UHPLC), and a back-calculation to estimate the crude protein content of pollen. Reliable analysis of protein-bound and free amino acids, as well as an estimate of crude protein concentration, was obtained from pollen samples as small as 1 mg. Greater variation in both protein-bound and free amino acids was found in pollen sample sizes below 1 mg. Because of this variability in amino acid recovery at smaller sample sizes, we suggest a correction factor for specific pollen sample sizes in order to estimate total crude protein content. The method described in this paper will allow researchers to explore the amino acid composition of pollen and will aid research assessing the nutrition available to pollinating animals. It will be particularly useful for assaying the pollen of wild plants, from which it is difficult to obtain large sample weights.
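As an illustration of the kind of back-calculation the authors describe, the sketch below sums UHPLC amino acid measurements and applies a sample-size-dependent recovery correction. The amino acid values and correction factors are invented placeholders, not the paper's calibration:

    # Hypothetical sketch: estimate crude protein from protein-bound amino acids.
    # All numbers below are illustrative placeholders, not published values.
    AA_UG_PER_MG = {"leu": 60.0, "lys": 45.0, "pro": 70.0, "glu": 110.0}  # ug AA per mg pollen
    RECOVERY_BY_SAMPLE_MASS = {1.0: 0.95, 0.5: 0.85, 0.25: 0.70}          # assumed recovery vs mass (mg)

    def crude_protein_pct(aa_ug_per_mg, sample_mass_mg):
        """Crude protein (% w/w) = summed amino acids corrected for recovery."""
        total_aa = sum(aa_ug_per_mg.values())            # ug amino acids per mg pollen
        recovery = RECOVERY_BY_SAMPLE_MASS[sample_mass_mg]
        return total_aa / recovery / 1000.0 * 100.0      # ug/mg -> fraction -> percent

    print(f"{crude_protein_pct(AA_UG_PER_MG, 1.0):.1f}% crude protein")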
Interpreting and Reporting Radiological Water-Quality Data
McCurdy, David E.; Garbarino, John R.; Mullin, Ann H.
2008-01-01
This document provides information to U.S. Geological Survey (USGS) Water Science Centers on interpreting and reporting radiological results for samples of environmental matrices, most notably water. The information provided is intended to be broadly useful throughout the United States, but scientists who work at sites containing radioactive hazardous wastes are advised to consult additional sources for more detailed information. The document is largely based on recognized national standards and guidance documents for radioanalytical sample processing, most notably the Multi-Agency Radiological Laboratory Analytical Protocols Manual (MARLAP), and on documents published by the U.S. Environmental Protection Agency and the American National Standards Institute. It does not include discussion of standard USGS practices such as field quality-control sample analysis, interpretive report policies, and related issues, all of which shall always be included in any effort by the Water Science Centers. The use of 'shall' in this report signifies a policy requirement of the USGS Office of Water Quality.
Pristine Igneous Rocks and the Early Differentiation of Planetary Materials
NASA Technical Reports Server (NTRS)
Warren, Paul H.
1998-01-01
Our studies are highly interdisciplinary, but are focused on the processes and products of early planetary and asteroidal differentiation, especially the genesis of the ancient lunar crust. Most of the accessible lunar crust consists of materials hybridized by impact-mixing. Rare pristine (unmixed) samples reflect the original genetic diversity of the early crust. We studied the relative importance of internally generated melt (including the putative magma ocean) versus large impact melts in early lunar magmatism, through both sample analysis and physical modeling. Other topics under investigation included: lunar and SNC (martian?) meteorites; igneous meteorites in general; impact breccias, especially metal-rich Apollo samples and polymict eucrites; effects of regolith/megaregolith insulation on thermal evolution and geochronology; and planetary bulk compositions and origins. We investigated the theoretical petrology of impact melts, especially those formed in large masses, such as the unejected parts of the melts of the largest lunar and terrestrial impact basins. We developed constraints on several key effects that variations in melting/displacement ratio (a strong function of both crater size and planetary g) have on impact melt petrology. Modeling results indicate that the impact melt-derived rock in the sampled, megaregolith part of the Moon is probably material that was ejected from deeper average levels than the non-impact-melted material (fragmental breccias and unbrecciated pristine rocks). In the largest lunar impacts, most of the impact melt is of mantle origin and avoids ejection from the crater, while most of the crust, and virtually all of the impact-melted crust, in the area of the crater is ejected. We investigated numerous extraordinary meteorites and Apollo rocks, emphasizing pristine rocks, siderophile and volatile trace elements, and the identification of primary partial melts, as opposed to partial cumulates. Apollo 15 sample 15434,28 is an extraordinarily large glass spherule, nearly if not entirely free of meteoritic contamination, and provides insight into the diversity of mare basalts in the Hadley-Apennine region. Apollo 14 sample 14434 is in many respects a new rock type, intermediate between nonmare gabbronorites and mare basalts. We helped to both plan and implement a consortium to study the Yamato-793605 SNC/martian meteorite.
[Ethical aspects of biological sample banks].
Cambon-Thomsen, A; Rial-Sebbag, E
2003-02-01
Numerous activities in epidemiology require the constitution or use of biological sample banks. Such biobanks raise ethical issues, and a number of recommendations apply to this field, in France and elsewhere. Major principles applicable to biobanks include respect for personal autonomy, respect for the human body, and respect for confidentiality. These principles are translated into practice through the following procedures: giving people relevant information about the management of their samples before obtaining informed consent, obtaining the opinion of an independent ethics committee, and implementing real safeguards for samples and data. Although these principles may appear quite simple and obvious, research practice is largely international and biobanks vary widely, so it is not always easy for researchers to find their way. Attitudes vary between countries; numerous texts cover different types of biobanks; the same texts are interpreted differently in different institutions; new ethical opinions continue to be issued; and, above all, uses of samples that are possible today, especially in genetics, raise questions that were not foreseeable at the time of sampling. All of this makes the field difficult in practice. This article reviews the types of biobanks and the relevant ethical issues, and uses examples of practical situations to highlight areas that remain unclear or ambiguous.
Generalized analog thresholding for spike acquisition at ultralow sampling rates
He, Bryan D.; Wein, Alex; Varshney, Lav R.; Kusuma, Julius; Richardson, Andrew G.
2015-01-01
Efficient spike acquisition techniques are needed to bridge the divide from creating large multielectrode arrays (MEA) to achieving whole-cortex electrophysiology. In this paper, we introduce generalized analog thresholding (gAT), which achieves millisecond temporal resolution with sampling rates as low as 10 Hz. Consider the torrent of data from a single 1,000-channel MEA, which would generate more than 3 GB/min using standard 30-kHz Nyquist sampling. Recent neural signal processing methods based on compressive sensing still require Nyquist sampling as a first step and use iterative methods to reconstruct spikes. Analog thresholding (AT) remains the best existing alternative, where spike waveforms are passed through an analog comparator and sampled at 1 kHz, with instant spike reconstruction. By generalizing AT, the new method reduces sampling rates another order of magnitude, detects more than one spike per interval, and reconstructs spike width. Unlike compressive sensing, the new method reveals a simple closed-form solution to achieve instant (noniterative) spike reconstruction. The base method is already robust to hardware nonidealities, including realistic quantization error and integration noise. Because it achieves these considerable specifications using hardware-friendly components like integrators and comparators, generalized AT could translate large-scale MEAs into implantable devices for scientific investigation and medical technology. PMID:25904712
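The data-rate comparison in this abstract is simple arithmetic to reproduce; the sketch below assumes 16-bit samples, a bit depth the abstract does not state:

    # Data rates for a 1,000-channel MEA under three acquisition schemes.
    # Assumes 2 bytes (16 bits) per sample; the paper does not specify bit depth.
    channels, bytes_per_sample = 1000, 2
    for name, rate_hz in [("Nyquist (30 kHz)", 30_000), ("AT (1 kHz)", 1_000), ("gAT (10 Hz)", 10)]:
        gb_per_min = channels * rate_hz * bytes_per_sample * 60 / 1e9
        print(f"{name:18s} {gb_per_min:10.4f} GB/min")

Under these assumptions, Nyquist sampling yields 3.6 GB/min, matching the abstract's "more than 3 GB/min" figure, while the 10 Hz generalized AT rate is over three orders of magnitude lower.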
VizieR Online Data Catalog: The ESO DIBs Large Exploration Survey (Cox+, 2017)
NASA Astrophysics Data System (ADS)
Cox, N. L. J.; Cami, J.; Farhang, A.; Smoker, J.; Monreal-Ibero, A.; Lallement, R.; Sarre, P. J.; Marshall, C. C. M.; Smith, K. T.; Evans, C. J.; Royer, P.; Linnartz, H.; Cordiner, M. A.; Joblin, C.; van Loon, J. T.; Foing, B. H.; Bhatt, N. H.; Bron, E.; Elyajouri, M.; de Koter, A.; Ehrenfreund, P.; Javadi, A.; Kaper, L.; Khosroshadi, H. G.; Laverick, M.; Le Petit, F.; Mulas, G.; Roueff, E.; Salama, F.; Spaans, M.
2018-01-01
We constructed a statistically representative survey sample that probes a wide range of interstellar environment parameters including reddening E(B-V), visual extinction AV, total-to-selective extinction ratio RV, and molecular hydrogen fraction fH2. EDIBLES provides the community with optical (~305-1042nm) spectra at high spectral resolution (R~70000 in the blue arm and 100000 in the red arm) and high signal-to-noise (S/N; median value ~500-1000), for a statistically significant sample of interstellar sightlines. Many of the >100 sightlines included in the survey already have auxiliary available ultraviolet, infrared and/or polarisation data on the dust and gas components. (2 data files).
Examining the link between weight suppression and non-suicidal self-injurious behaviors.
Keel, Pamela K; Jean Forney, K; Buchman-Schmitt, Jennifer M; Kennedy, Grace A; Joiner, Thomas E
2018-06-02
Given the negative consequences of excess weight, a large portion of the US population is seeking to obtain and maintain weight loss. Weight Suppression (WS) represents the difference between previous highest adult weight and current weight and may have negative psychological consequences. The current study examined the link between WS and lifetime non-suicidal self-injurious (NSSI) behavior and explored indirect effects in this link using survey data in two large samples. Sample 1 included 1011 college students (67% female, mean age = 19 years); Sample 2 included 2461 participants from an epidemiological study (68% female, mean age = 34 years). Models of direct and indirect effects were tested in MPlus using bootstrapping. As hypothesized, greater WS was associated with increased likelihood of lifetime NSSI in both samples (OR = 1.05 and 1.02). In both samples, significant indirect effects of drive for thinness (Total R² = 0.06 and 0.09) and depressive symptoms (Total R² = 0.13 and 0.29) accounted for this association. Alternative models in which the indirect effect of WS was tested in associations between drive for thinness or depressive symptoms and NSSI were not supported. Results suggest that the link between WS and lifetime NSSI may be accounted for by eating or mood-related pathology. Future research should test whether addressing associated eating and mood problems would eliminate the link between WS and NSSI as a means for reducing suicide risk. Copyright © 2018 Elsevier Ltd. All rights reserved.
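To make the reported odds ratios concrete, here is a minimal logistic-regression sketch on simulated data of the same shape (continuous WS predicting lifetime NSSI). The simulated slope and variable names are ours, not the study's:

    import numpy as np
    import statsmodels.api as sm

    # Simulated data shaped like the study: weight suppression (WS) predicting
    # lifetime NSSI (0/1). The true log-odds slope here is chosen arbitrarily.
    rng = np.random.default_rng(0)
    ws = rng.gamma(shape=2.0, scale=8.0, size=2000)   # WS in weight units
    logit_p = -2.5 + 0.02 * ws                        # assumed effect size
    nssi = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

    model = sm.Logit(nssi, sm.add_constant(ws)).fit(disp=0)
    print("OR per unit WS:", float(np.exp(model.params[1])))  # ~1.02, cf. reported ORs of 1.02-1.05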
Application of an ETV-ICP system for the determination of elements in human hair
NASA Astrophysics Data System (ADS)
Plantikow-Voßgätter, F.; Denkhaus, E.
1996-01-01
When determining element contents in hair samples without sample digestion, it is necessary to analyze large sample volumes in order to minimize problems arising from the inhomogeneity of biological sample materials. Therefore an electrothermal vaporization (ETV) system is used for solid sample introduction into an inductively coupled plasma (ICP) for the determination of matrix and trace elements in hair. This paper concentrates on the instrumental aspects, without time-consuming sample preparation. The results obtained from optimization tests of the ETV and ICP operating parameters are shown and discussed. Standard additions are used for calibration in the determination of Zn, Mg, and Mn in human hair. Studies of reproducibility and detection limits for the chosen elements have been carried out on certified reference materials (CRMs). The reproducibility (relative standard deviation (RSD), n = 10) and detection limits (DLs) of Zn (RSD < 8.5%, DL < 0.8 μg g^-1), Mn (RSD < 14.1%, DL < 0.3 μg g^-1), and Mg (RSD < 7.4%, DL < 6.6 μg g^-1) are satisfactory. The concentration values found show good agreement with the corresponding certified values. Further sample preparation steps, including hair sampling, washing and homogenization, relating to measurements of real hair samples, are described.
Mano, Junichi; Hatano, Shuko; Nagatomi, Yasuaki; Futo, Satoshi; Takabatake, Reona; Kitta, Kazumi
2018-03-01
Current genetically modified organism (GMO) detection methods allow for sensitive detection. However, a further increase in sensitivity will enable more efficient testing for large grain samples and reliable testing for processed foods. In this study, we investigated real-time PCR-based GMO detection methods using a large amount of DNA template. We selected target sequences that are commonly introduced into many kinds of GM crops, i.e., 35S promoter and nopaline synthase (NOS) terminator. This makes the newly developed method applicable to a wide range of GMOs, including some unauthorized ones. The estimated LOD of the new method was 0.005% of GM maize events; to the best of our knowledge, this method is the most sensitive among the GM maize detection methods for which the LOD was evaluated in terms of GMO content. A 10-fold increase in the DNA amount as compared with the amount used under common testing conditions gave an approximately 10-fold reduction in the LOD without PCR inhibition. Our method is applicable to various analytical samples, including processed foods. The use of other primers and fluorescence probes would permit highly sensitive detection of various recombinant DNA sequences besides the 35S promoter and NOS terminator.
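The sensitivity gain is straightforward copy-number arithmetic: more template means more copies of a rare GM target in the reaction. The sketch below converts template mass to haploid maize genome copies, taking 1C ≈ 2.7 pg as an assumed genome mass (a commonly cited figure, not one stated in the abstract):

    # Haploid genome copies of a GM target in a real-time PCR template.
    MAIZE_1C_PG = 2.7  # assumed haploid genome mass for maize, in pg (not from the paper)

    def gm_copies(template_ng, gmo_fraction):
        total_copies = template_ng * 1000.0 / MAIZE_1C_PG   # ng -> pg -> genome copies
        return total_copies * gmo_fraction

    for ng in (100, 1000):  # 10-fold more template -> 10-fold more GM copies
        print(f"{ng:5d} ng at 0.005% GMO: {gm_copies(ng, 5e-5):.1f} copies")

Under these assumptions, 100 ng of template carries only about 2 copies of a 0.005% GM target, while 1000 ng carries about 19, which is why scaling up the DNA amount directly lowers the achievable LOD.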
Yuan, Tiezheng; Huang, Xiaoyi; Dittmar, Rachel L; Du, Meijun; Kohli, Manish; Boardman, Lisa; Thibodeau, Stephen N; Wang, Liang
2014-03-05
RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high-throughput sequencers. We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity for parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module "miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs, including Bowtie, miRDeep2, and miRspring, extends the program's functionality. eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory.
Leckelt, Marius; Wetzel, Eunike; Gerlach, Tanja M; Ackerman, Robert A; Miller, Joshua D; Chopik, William J; Penke, Lars; Geukes, Katharina; Küfner, Albrecht C P; Hutteman, Roos; Richter, David; Renner, Karl-Heinz; Allroggen, Marc; Brecheen, Courtney; Campbell, W Keith; Grossmann, Igor; Back, Mitja D
2018-01-01
Due to increased empirical interest in narcissism across the social sciences, there is a need for inventories that can be administered quickly while also reliably measuring both the agentic and antagonistic aspects of grandiose narcissism. In this study, we sought to validate the factor structure, provide representative descriptive data and reliability estimates, assess the reliability across the trait spectrum, and examine the nomological network of the short version of the Narcissistic Admiration and Rivalry Questionnaire (NARQ-S; Back et al., 2013). We used data from a large convenience sample (total N = 11,937) as well as data from a large representative sample (total N = 4,433) that included responses to other narcissism measures as well as related constructs, including the other Dark Triad traits, Big Five personality traits, and self-esteem. Confirmatory factor analysis and item response theory were used to validate the factor structure and estimate the reliability across the latent trait spectrum, respectively. Results suggest that the NARQ-S shows a robust factor structure and is a reliable and valid short measure of the agentic and antagonistic aspects of grandiose narcissism. We also discuss future directions and applications of the NARQ-S as a short and comprehensive measure of grandiose narcissism. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Rezeli, Melinda; Sjödin, Karin; Lindberg, Henrik; Gidlöf, Olof; Lindahl, Bertil; Jernberg, Tomas; Spaak, Jonas; Erlinge, David; Marko-Varga, György
2017-09-01
A multiple reaction monitoring (MRM) assay was developed for precise quantitation of 87 plasma proteins associated with cardiovascular diseases, including the three isoforms of apolipoprotein E (APOE), using nanoscale liquid chromatography separation and a stable isotope dilution strategy. The analytical performance of the assay was evaluated, and we found an average technical variation of 4.7% over a dynamic range of 4-5 orders of magnitude (≈0.2 mg/L to 4.5 g/L) from whole plasma digest. Here, we report a complete workflow, including sample processing adapted to 96-well plate format and a normalization strategy for large-scale studies. To further investigate the MS-based quantitation, the amounts of six selected proteins were also measured by routinely used clinical chemistry assays, and the two methods showed excellent correlation with high significance (p-value < 10e-5) for the six proteins, as well as for the cardiovascular predictor APOB:APOA1 ratio (r = 0.969, p-value < 10e-5). Moreover, we utilized the developed assay for screening of biobank samples from patients with myocardial infarction and performed a comparative analysis of patient groups with STEMI (ST-segment elevation myocardial infarction), NSTEMI (non-ST-segment elevation myocardial infarction) and type-2 AMI (type-2 myocardial infarction).
Faster the better: a reliable technique to sample anopluran lice in large hosts.
Leonardi, María Soledad
2014-06-01
Among Anoplura, the family Echinophthiriidae includes species that infest mainly the pinnipeds. Working with large hosts implies methodological considerations such as the time spent in sampling and the way in which the animal is restrained. Previous works on echinophthiriids combined a diverse array of analyses, including field counts of lice and in vitro observations. To collect lice, the authors used forceps, and each louse was collected individually. This implied a long manipulation time, i.e., ≈60 min, and the need to physically and/or chemically immobilize the animal. The present work describes and discusses for the first time a sampling technique that minimizes manipulation time and avoids the use of anesthesia. The method involves combing the host's pelage with a fine-toothed plastic comb, as used in the treatment of human pediculosis, and keeping the comb, with the lice retained, in a Ziploc® bag with ethanol. This technique has been used successfully in studies of population dynamics, habitat selection, and transmission patterns, and has proved reliable. Lice are collected whole and in good condition for mounting and study under light or scanning electron microscopy. Moreover, the plastic comb protects taxonomically important structures, such as spines, from damage, making the technique suitable for taxonomic and morphological work as well.
A simplified method to recover urinary vesicles for clinical applications, and sample banking.
Musante, Luca; Tataruch, Dorota; Gu, Dongfeng; Benito-Martin, Alberto; Calzaferri, Giulio; Aherne, Sinead; Holthofer, Harry
2014-12-23
Urinary extracellular vesicles provide a novel source of valuable biomarkers for kidney and urogenital diseases. Current isolation protocols include laborious, sequential centrifugation steps, which hampers their widespread research and clinical use. Furthermore, when large individual urine sample volumes or sizable target cohorts are to be processed (e.g. for biobanking), storage capacity becomes an additional problem. Thus, alternative methods are necessary to overcome such limitations. We have developed a practical vesicle isolation technique that yields easily manageable sample volumes in an exceptionally cost-efficient way, to facilitate full utilization in less privileged environments and maximize the benefit of biobanking. Urinary vesicles were isolated by hydrostatic dialysis with minimal interference from soluble proteins and minimal vesicle loss. Large volumes of urine were concentrated to as little as 1/100 of the original volume, and the dialysis step allowed equalization of the physico-chemical characteristics of the urine. The vesicle fractions were found suitable for all applications tested, including RNA analysis. In yield, our hydrostatic filtration dialysis system outperforms conventional ultracentrifugation-based methods, and the labour-intensive and potentially hazardous ultracentrifugation steps are eliminated. Likewise, the need for trained laboratory personnel and heavy initial investment is avoided. Thus, our method is well suited to laboratories working with urinary vesicles and to sample banking.
Eichelsheim, Veroni I; Buist, Kirsten L; Deković, Maja; Wissink, Inge B; Frijns, Tom; van Lier, Pol A C; Koot, Hans M; Meeus, Wim H J
2010-03-01
The aim of the present study is to examine whether the patterns of association between the quality of the parent-adolescent relationship on the one hand, and aggression and delinquency on the other, are the same for boys and girls of Dutch and Moroccan origin living in the Netherlands. Since inconsistent results have been found previously, the present study tests the replicability of the model of associations in two different Dutch samples of adolescents. Study 1 included 288 adolescents (M age = 14.9, range 12-17 years), all attending lower secondary education. Study 2 included 306 adolescents (M age = 13.2, range 12-15 years) who were part of a larger community sample with oversampling of at-risk adolescents. Multigroup structural analyses showed no ethnic or gender differences, in either Study 1 or Study 2, in the patterns of associations between support, autonomy, disclosure, and negativity in the parent-adolescent relationship and aggression and delinquency. The patterns were largely similar across the two studies. In both studies, negative relationship quality in particular was strongly related to both aggression and delinquency. The results indicate that the family processes affecting adolescent development are largely universal across gender and ethnicity.
DALMATIAN: An Algorithm for Automatic Cell Detection and Counting in 3D.
Shuvaev, Sergey A; Lazutkin, Alexander A; Kedrov, Alexander V; Anokhin, Konstantin V; Enikolopov, Grigori N; Koulakov, Alexei A
2017-01-01
Current 3D imaging methods, including optical projection tomography, light-sheet microscopy, block-face imaging, and serial two-photon tomography, enable visualization of large samples of biological tissue. Large volumes of data obtained at high resolution require the development of automatic image processing techniques, such as algorithms for automatic cell detection or, more generally, point-like object detection. Current approaches to automated cell detection struggle with particular cell types, cell populations of differing brightness, non-uniform staining, and overlapping cells. In this study, we present a set of algorithms for robust automatic cell detection in 3D. Our algorithms are suitable for, but not limited to, whole brain regions and individual brain sections. We used a watershed procedure to split regional maxima representing overlapping cells. We developed a bootstrap Gaussian fit procedure to evaluate the statistical significance of detected cells. We compared the cell detection quality of our algorithm and other software using 42 samples representing 6 staining and imaging techniques. The results provided by our algorithm matched manual expert quantification with signal-to-noise-dependent confidence, including for samples with cells of differing brightness, non-uniform staining, and overlapping cells, for whole brain regions and individual tissue sections. Our algorithm provided the best cell detection quality among the tested free and commercial software.
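The watershed-on-regional-maxima step the authors mention is a standard scikit-image pattern; the sketch below is a generic version of that step under assumed parameter values, not the DALMATIAN implementation itself:

    import numpy as np
    from skimage.feature import peak_local_max
    from skimage.segmentation import watershed

    def split_overlapping_cells(image, threshold, min_distance=3):
        """Split touching bright blobs by watershed seeded at regional maxima."""
        img = image.astype(float)                 # avoid unsigned-int wraparound under negation
        mask = img > threshold                    # foreground (assumed global threshold)
        peaks = peak_local_max(img, min_distance=min_distance, labels=mask.astype(int))
        seeds = np.zeros(img.shape, dtype=int)
        seeds[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
        return watershed(-img, markers=seeds, mask=mask)   # one label per cell candidate

The same call pattern works for 2D sections or 3D volumes, since peak_local_max and watershed both operate on n-dimensional arrays.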
Scaling up to address data science challenges
Wendelberger, Joanne R.
2017-04-27
Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. This article explores various challenges in Data Science and highlights statistical approaches that can facilitate the analysis of large-scale data, including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.
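Of the sampling and data-reduction methods the article points to, reservoir sampling is a classical example worth sketching: it keeps a uniform random sample of fixed size from a stream too large to hold in memory. This is a generic illustration, not an algorithm taken from the article:

    import random

    def reservoir_sample(stream, k, seed=None):
        """Uniform random sample of k items from a stream of unknown length."""
        rng = random.Random(seed)
        reservoir = []
        for i, item in enumerate(stream):
            if i < k:
                reservoir.append(item)       # fill the reservoir first
            else:
                j = rng.randint(0, i)        # each item survives with probability k/(i+1)
                if j < k:
                    reservoir[j] = item
        return reservoir

    print(reservoir_sample(range(10**6), 5, seed=42))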
NASA Astrophysics Data System (ADS)
Gunawardhana, M. L. P.; Hopkins, A. M.; Bland-Hawthorn, J.; Brough, S.; Sharp, R.; Loveday, J.; Taylor, E.; Jones, D. H.; Lara-López, M. A.; Bauer, A. E.; Colless, M.; Owers, M.; Baldry, I. K.; López-Sánchez, A. R.; Foster, C.; Bamford, S.; Brown, M. J. I.; Driver, S. P.; Drinkwater, M. J.; Liske, J.; Meyer, M.; Norberg, P.; Robotham, A. S. G.; Ching, J. H. Y.; Cluver, M. E.; Croom, S.; Kelvin, L.; Prescott, M.; Steele, O.; Thomas, D.; Wang, L.
2013-08-01
Measurements of the low-z Hα luminosity function, Φ, have a large dispersion in the local number density of sources (~0.5-1 Mpc^-3 dex^-1), and correspondingly in the star formation rate density (SFRD). The possible causes for these discrepancies include limited volume sampling, biases arising from survey sample selection, and different methods of correcting for dust obscuration and active galactic nucleus contamination. The Galaxy And Mass Assembly (GAMA) survey and Sloan Digital Sky Survey (SDSS) provide deep spectroscopic observations over a wide sky area, enabling detection of a large sample of star-forming galaxies spanning 0.001 < SFR_Hα (M⊙ yr^-1) < 100 with which to robustly measure the evolution of the SFRD in the low-z Universe. The large number of high-SFR galaxies present in our sample allows an improved measurement of the bright end of the luminosity function, indicating that the decrease in Φ at bright luminosities is best described by a Saunders functional form rather than the traditional Schechter function. This result is consistent with other published luminosity functions in the far-infrared and radio. For GAMA and SDSS, we find that the r-band apparent magnitude limit, combined with the subsequent requirement for Hα detection, leads to incompleteness due to missing bright Hα sources with faint r-band magnitudes.
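For reference, the two functional forms being compared are, in their usual textbook notation (our transcription, not equations quoted from the paper): the Schechter function \phi(L)\,dL = \phi^{*} (L/L^{*})^{\alpha} \exp(-L/L^{*})\, d(L/L^{*}), and the Saunders et al. (1990) form \phi(L) = \phi^{*} (L/L^{*})^{1-\alpha} \exp[-\log_{10}^{2}(1 + L/L^{*}) / (2\sigma^{2})]. The Saunders form rolls off like a lognormal rather than an exponential at the bright end, which is why it can accommodate an excess of very luminous, high-SFR galaxies.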
Methods to increase reproducibility in differential gene expression via meta-analysis
Sweeney, Timothy E.; Haynes, Winston A.; Vallania, Francesco; Ioannidis, John P.; Khatri, Purvesh
2017-01-01
Findings from clinical and biological studies are often not reproducible when tested in independent cohorts. Due to the testing of large numbers of hypotheses and relatively small sample sizes, results from whole-genome expression studies in particular are often not reproducible. Compared to single-study analysis, gene expression meta-analysis can improve reproducibility by integrating data from multiple studies. However, there are multiple choices in designing and carrying out a meta-analysis, and clear guidelines on best practices are scarce. Here, we hypothesized that studying subsets of very large meta-analyses would allow for systematic identification of best practices to improve reproducibility. We therefore constructed three very large gene expression meta-analyses from clinical samples, and then examined meta-analyses of subsets of the datasets (all combinations of datasets with up to N/2 samples and K/2 datasets) compared to a 'silver standard' of differentially expressed genes found in the entire cohort. We tested three random-effects meta-analysis models using this procedure. We found relatively greater reproducibility with more stringent effect size thresholds combined with relaxed significance thresholds; lower reproducibility when imposing extraneous constraints on residual heterogeneity; and an underestimation of the actual false positive rate by Benjamini-Hochberg correction. In addition, multivariate regression showed that the accuracy of a meta-analysis increased significantly with more included datasets, even when controlling for sample size. PMID:27634930
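A minimal version of the random-effects machinery such meta-analyses rest on is the DerSimonian-Laird estimator; the sketch below pools one gene's per-study effect sizes and is a generic textbook implementation, not the authors' pipeline:

    import numpy as np

    def dersimonian_laird(effects, variances):
        """Random-effects pooled effect for one gene across studies (DL estimator)."""
        effects, variances = np.asarray(effects, float), np.asarray(variances, float)
        w = 1.0 / variances                                  # fixed-effect weights
        mu_fe = np.sum(w * effects) / np.sum(w)
        q = np.sum(w * (effects - mu_fe) ** 2)               # Cochran's Q
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - (len(effects) - 1)) / c)        # between-study variance
        w_re = 1.0 / (variances + tau2)
        mu_re = np.sum(w_re * effects) / np.sum(w_re)
        return mu_re, np.sqrt(1.0 / np.sum(w_re)), tau2

    mu, se, tau2 = dersimonian_laird([0.8, 0.5, 1.1, 0.2], [0.04, 0.09, 0.05, 0.12])
    print(f"pooled effect {mu:.2f} (SE {se:.2f}, tau^2 = {tau2:.3f})")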
NASA Technical Reports Server (NTRS)
Hughes, William O.; McNelis, Anne M.
2010-01-01
The Earth Observing System (EOS) Terra spacecraft was launched on an Atlas IIAS launch vehicle on its mission to observe planet Earth in late 1999. Prior to launch, the new design of the spacecraft's pyroshock separation system was characterized by a series of 13 separation ground tests. This paper discusses the analysis methods used to evaluate this unusually large amount of shock data, with particular emphasis on population distributions and on finding statistically significant families of data, leading to an overall shock separation interface level. The wealth of ground test data also allowed derivation of a Mission Assurance level for the flight. All of the flight shock measurements were below the EOS Terra Mission Assurance level, thus contributing to the overall success of the EOS Terra mission. The paper demonstrates the effectiveness of this statistical methodology for characterizing the shock interface level and for developing a flight Mission Assurance level from a large sample of shock data.
Gaussian vs. Bessel light-sheets: performance analysis in live large sample imaging
NASA Astrophysics Data System (ADS)
Reidt, Sascha L.; Correia, Ricardo B. C.; Donnachie, Mark; Weijer, Cornelis J.; MacDonald, Michael P.
2017-08-01
Lightsheet fluorescence microscopy (LSFM) has rapidly progressed in the past decade from an emerging technology into an established methodology. This progress has largely been driven by its suitability to developmental biology, where it is able to give excellent spatial-temporal resolution over relatively large fields of view with good contrast and low phototoxicity. In many respects it is superseding confocal microscopy. However, it is no magic bullet and still struggles to image deeply in more highly scattering samples. Many solutions to this challenge have been presented, including Airy and Bessel illumination, 2-photon operation and deconvolution techniques. In this work, we show a comparison between a simple but effective Gaussian beam illumination and Bessel illumination for imaging in chicken embryos. Whilst Bessel illumination is shown to be of benefit when a greater depth of field is required, no benefit is observed when imaging into the highly scattering tissue of the chick embryo.
Scaling ice microstructures from the laboratory to nature: cryo-EBSD on large samples.
NASA Astrophysics Data System (ADS)
Prior, David; Craw, Lisa; Kim, Daeyeong; Peyroux, Damian; Qi, Chao; Seidemann, Meike; Tooley, Lauren; Vaughan, Matthew; Wongpan, Pat
2017-04-01
Electron backscatter diffraction (EBSD) has significantly extended our ability to conduct detailed quantitative microstructural investigations of rocks, metals and ceramics. EBSD on ice was first developed in 2004. Techniques have improved significantly in the last decade, and EBSD is now becoming more common in the microstructural analysis of ice. This is particularly true for laboratory-deformed ice where, in some cases, the fine grain sizes exclude the possibility of using a thin section of the ice. Having the orientations of all axes (rather than just the c-axis, as in an optical method) yields important new information about ice microstructure. It is important to examine natural ice samples in the same way so that we can scale laboratory observations to nature. In the case of ice deformation, higher strain rates are used in the laboratory than those seen in nature. These are achieved by increasing stress and/or temperature, and it is important to verify that the microstructures produced in the laboratory are comparable with those observed in nature. Natural ice samples are coarse grained: glacier and ice sheet ice has a grain size from a few mm up to several cm, and sea and lake ice has grain sizes of a few cm to many metres. Extending EBSD analysis to larger sample sizes is therefore needed to capture representative microstructures. The chief impediments to working on large ice samples are sample exchange, limitations on stage motion, and temperature control. Large ice samples cannot be transferred through a typical commercial cryo-transfer system, which limits sample sizes; we instead transfer through a nitrogen glove box that encloses the main scanning electron microscope (SEM) door. The nitrogen atmosphere prevents the cold stage and the sample from becoming covered in frost. A long optimal working distance for EBSD (around 30 mm for the Otago cryo-EBSD facility), obtained by moving the camera away from the pole piece, enables the stage to move without crashing into either the EBSD camera or the SEM pole piece (final lens). In theory a sample up to 100 mm perpendicular to the tilt axis by 150 mm parallel to the tilt axis can be analysed; in practice, the motion of our stage is restricted to maximum dimensions of 100 by 50 mm by a conductive copper braid on our cold stage. Temperature control becomes harder as the samples become larger: if the samples become too warm they will start to sublime, and the quality of EBSD data will degrade. Large samples need to be relatively thin (5 mm or less) so that conduction of heat to the cold stage is more effective at keeping the surface temperature low. In the Otago facility, samples of up to 40 mm by 40 mm present little problem and can be analysed for several hours without significant sublimation. Larger samples need more care, e.g. fast sample transfer to keep the sample very cold; the largest samples we work on routinely are 40 by 60 mm in size. We will show examples of EBSD data from glacial ice and sea ice from Antarctica and from large laboratory ice samples.
NASA Astrophysics Data System (ADS)
Huang, Po-Jung; Baghbani Kordmahale, Sina; Chou, Chao-Kai; Yamaguchi, Hirohito; Hung, Mien-Chie; Kameoka, Jun
2016-03-01
Signal transduction events, including multiple protein post-translational modifications (PTM), protein-protein interactions (PPI), and protein-nucleic acid interactions (PNI), play critical roles in cell proliferation and differentiation that are directly related to cancer biology. Traditional methods, like mass spectrometry, immunoprecipitation, fluorescence resonance energy transfer, and fluorescence correlation spectroscopy, require a large amount of sample and long processing times. The "microchannel for multiple-parameter analysis of proteins in single-complex" (mMAPS) approach we proposed can reduce the processing time and sample volume because the system is composed of microfluidic channels, fluorescence microscopy, and computerized data analysis. In this paper, we present an automated mMAPS, including an integrated microfluidic device, automated stage, and electrical relay, for high-throughput clinical screening. Based on our results, we estimate that this automated detection system will be able to screen approximately 150 patient samples in a 24-hour period, providing a practical means of analyzing tissue samples in a clinical setting.
Establishing an academic biobank in a resource-challenged environment.
Soo, Cassandra Claire; Mukomana, Freedom; Hazelhurst, Scott; Ramsay, Michele
2017-05-24
Past practices of informal sample collections and spreadsheets for data and sample management fall short of best-practice models for biobanking, and are neither cost effective nor efficient to adequately serve the needs of large research studies. The biobank of the Sydney Brenner Institute for Molecular Bioscience serves as a bioresource for institutional, national and international research collaborations. It provides high-quality human biospecimens from African populations, secure data and sample curation and storage, as well as monitored sample handling and management processes, to promote both non-communicable and infectious-disease research. Best-practice guidelines have been adapted to align with a low-resource setting and have been instrumental in the development of a quality-management system, including standard operating procedures and a quality-control regimen. Here, we provide a summary of 10 important considerations for initiating and establishing an academic research biobank in a low-resource setting. These include addressing ethical, legal, technical, accreditation and/or certification concerns and financial sustainability.
Integrated Blood Barcode Chips
Fan, Rong; Vermesh, Ophir; Srivastava, Alok; Yen, Brian K.H.; Qin, Lidong; Ahmad, Habib; Kwong, Gabriel A.; Liu, Chao-Chao; Gould, Juliane; Hood, Leroy; Heath, James R.
2008-01-01
Blood comprises the largest version of the human proteome [1]. Changes in plasma protein profiles can reflect physiological or pathological conditions associated with many human diseases, making blood the most important fluid for clinical diagnostics [2-4]. Nevertheless, only a handful of plasma proteins are utilized in routine clinical tests. This is due to a host of reasons, including the intrinsic complexity of the plasma proteome [1], the heterogeneity of human diseases, and the fast kinetics associated with protein degradation in sampled blood [5]. Simple technologies that can sensitively sample large numbers of proteins over broad concentration ranges, from small amounts of blood, and within minutes of sample collection, would assist in solving these problems. Herein, we report on an integrated microfluidic system, called the Integrated Blood Barcode Chip (IBBC). It enables on-chip blood separation and the rapid measurement of a panel of plasma proteins from small quantities of blood samples, including a fingerprick of whole blood. This platform holds potential for inexpensive, non-invasive, and informative clinical diagnoses, particularly at the point of care. PMID:19029914
Micro injector sample delivery system for charged molecules
Davidson, James C.; Balch, Joseph W.
1999-11-09
A micro injector sample delivery system for charged molecules. The injector is used for collecting and delivering controlled amounts of charged-molecule samples for subsequent analysis. The injector delivery system can be scaled to large numbers (>96) for sample delivery to massively parallel high-throughput analysis systems. The essence of the injector system is an electric-field-controllable loading tip that includes a section of porous material. By applying the appropriate polarity bias potential to the injector tip, charged molecules migrate into the porous material; by reversing the polarity bias potential, the molecules are ejected or forced away from the tip. The invention has application for the uptake of charged biological molecules (e.g. proteins, nucleic acids, polymers, etc.) for delivery to analytical systems, and can be used in automated sample delivery systems.
Marron, D.C.
1988-01-01
Samples from metal-contaminated flood-plain sediments at 9 sites downstream from Lead, in west-central South Dakota, were collected during the summers of 1985-87 to characterize aspects of the sedimentology, chemistry, and geometry of a deposit that resulted from the discharge of a large volume of mining wastes into a river system. Field and laboratory data include stratigraphic descriptions, chemical contents and grain-size distributions of samples, and surveyed flood-plain positions of samples. This report describes sampling-site locations, methods of sample collection and preservation, and subsequent laboratory analysis. Field and laboratory data are presented in 4 figures and 11 tables in the 'Supplemental Data' section at the back of the report. (USGS)
Development and Validation of the Minnesota Borderline Personality Disorder Scale (MBPD)
Bornovalova, Marina A.; Hicks, Brian M.; Patrick, Christopher J.; Iacono, William G.; McGue, Matt
2011-01-01
While large epidemiological datasets can inform research on the etiology and development of borderline personality disorder (BPD), they rarely include BPD measures. In some cases, however, proxy measures can be constructed using instruments already in these datasets. In this study we developed and validated a self-report measure of BPD from the Multidimensional Personality Questionnaire (MPQ). Items for the new instrument—the Minnesota BPD scale (MBPD)—were identified and refined using three large samples: undergraduates, community adolescent twins, and urban substance users. We determined the construct validity of the MBPD by examining its association with (1) diagnosed BPD, (2) questionnaire reported BPD symptoms, and (3) clinical variables associated with BPD: suicidality, trauma, disinhibition, internalizing distress, and substance use. We also tested the MBPD in two prison inmate samples. Across samples, the MBPD correlated with BPD indices and external criteria, and showed incremental validity above measures of negative affect, thus supporting its construct validity as a measure of BPD. PMID:21467094
A DNA methylation map of human cancer at single base-pair resolution.
Vidal, E; Sayols, S; Moran, S; Guillaumet-Adkins, A; Schroeder, M P; Royo, R; Orozco, M; Gut, M; Gut, I; Lopez-Bigas, N; Heyn, H; Esteller, M
2017-10-05
Although single base-pair resolution DNA methylation landscapes for embryonic and different somatic cell types provided important insights into epigenetic dynamics and cell-type specificity, such comprehensive profiling is incomplete across human cancer types. This prompted us to perform genome-wide DNA methylation profiling of 22 samples derived from normal tissues and associated neoplasms, including primary tumors and cancer cell lines. Unlike their invariant normal counterparts, cancer samples exhibited highly variable CpG methylation levels in a large proportion of the genome, involving progressive changes during tumor evolution. The whole-genome sequencing results from selected samples were replicated in a large cohort of 1112 primary tumors of various cancer types using genome-scale DNA methylation analysis. Specifically, we determined DNA hypermethylation of promoters and enhancers regulating tumor-suppressor genes, with potential cancer-driving effects. DNA hypermethylation events showed evidence of positive selection, mutual exclusivity and tissue specificity, suggesting their active participation in neoplastic transformation. Our data highlight the extensive changes in DNA methylation that occur in cancer onset, progression and dissemination.
Thorsen, Jonathan; Brejnrod, Asker; Mortensen, Martin; Rasmussen, Morten A; Stokholm, Jakob; Al-Soud, Waleed Abu; Sørensen, Søren; Bisgaard, Hans; Waage, Johannes
2016-11-25
There is an immense scientific interest in the human microbiome and its effects on human physiology, health, and disease. A common approach for examining bacterial communities is high-throughput sequencing of 16S rRNA gene hypervariable regions, aggregating sequence-similar amplicons into operational taxonomic units (OTUs). Strategies for detecting differential relative abundance of OTUs between sample conditions include classical statistical approaches as well as a plethora of newer methods, many borrowing from the related field of RNA-seq analysis. This effort is complicated by unique data characteristics, including sparsity, sequencing depth variation, and nonconformity of read counts to theoretical distributions, which is often exacerbated by exploratory and/or unbalanced study designs. Here, we assess the robustness of available methods for (1) inference in differential relative abundance analysis and (2) beta-diversity-based sample separation, using a rigorous benchmarking framework based on large clinical 16S microbiome datasets from different sources. Running more than 380,000 full differential relative abundance tests on real datasets with permuted case/control assignments and in silico-spiked OTUs, we identify large differences in method performance on a range of parameters, including false positive rates, sensitivity to sparsity and case/control balances, and spike-in retrieval rate. In large datasets, methods with the highest false positive rates also tend to have the best detection power. For beta-diversity-based sample separation, we show that library size normalization has very little effect and that the distance metric is the most important factor in terms of separation power. Our results, generalizable to datasets from different sequencing platforms, demonstrate how the choice of method considerably affects analysis outcome. Here, we give recommendations for tools that exhibit low false positive rates, have good retrieval power across effect sizes and case/control proportions, and have low sparsity bias. Result output from some commonly used methods should be interpreted with caution. We provide an easily extensible framework for benchmarking of new methods and future microbiome datasets.
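The permuted case/control check at the heart of such a benchmark can be sketched generically: shuffle sample labels so that no OTU is truly differential, run the test of interest, and record how often p < 0.05. The code below uses a Mann-Whitney test as a stand-in method; it illustrates the evaluation logic only, not the authors' framework:

    import numpy as np
    from scipy.stats import mannwhitneyu

    def permuted_fpr(counts, n_case, n_perm=100, alpha=0.05, seed=0):
        """False-positive rate of a per-OTU test under shuffled labels.
        counts: (n_otus, n_samples) matrix of OTU read counts."""
        rng = np.random.default_rng(seed)
        hits, tests = 0, 0
        for _ in range(n_perm):
            idx = rng.permutation(counts.shape[1])       # break any real group structure
            case, ctrl = counts[:, idx[:n_case]], counts[:, idx[n_case:]]
            for otu_case, otu_ctrl in zip(case, ctrl):
                p = mannwhitneyu(otu_case, otu_ctrl).pvalue
                hits += p < alpha
                tests += 1
        return hits / tests      # close to alpha for a well-calibrated test

A method whose permuted false-positive rate sits well above alpha, as several methods in the benchmark do, is detecting structure that is not there.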
Measurements of airborne methylene diphenyl diisocyanate (MDI) concentration in the U.S. workplace.
Booth, Karroll; Cummings, Barbara; Karoly, William J; Mullins, Sharon; Robert, William P; Spence, Mark; Lichtenberg, Fran W; Banta, J
2009-04-01
This article summarizes a large body of industry air sampling data (8134 samples) in which airborne MDI concentrations were measured in a wide variety of manufacturing processes that use either polymeric MDI (PMDI) or monomeric (pure) MDI. Data were collected during the period 1984 through 1999. A total of 606 surveys were conducted for 251 companies at 317 facilities. The database includes 3583 personal (breathing zone) samples and 4551 area samples. The data demonstrate that workplace airborne MDI concentrations are extremely low in the majority of manufacturing operations. Most (74.6%) of the airborne MDI concentrations measured in the personal samples were nondetectable, i.e., below the limits of quantification (LOQs). A variety of validated industrial hygiene sampling/analytical methods were used for data collection; most are modifications of OSHA Method 47. The LOQs for these methods ranged from 0.1 to 0.5 μg/sample. The very low vapor pressures of both monomeric MDI and PMDI largely explain the low airborne concentrations found in most operations. However, processes or applications in which the chemical is sprayed or heated may result in higher airborne concentrations and higher exposure potentials if appropriate control measures are not implemented. The data presented in this article will be a useful reference for employers in managing their health and safety programs as they relate to respiratory protection during MDI/PMDI applications.
Erosion of an ancient mountain range, the Great Smoky Mountains, North Carolina and Tennessee
Matmon, A.; Bierman, P.R.; Larsen, J.; Southworth, S.; Pavich, M.; Finkel, R.; Caffee, M.
2003-01-01
Analysis of 10Be and 26Al in bedrock (n=10), colluvium (n=5, including grain size splits), and alluvial sediments (n=59, including grain size splits), coupled with field observations and GIS analysis, suggests that erosion rates in the Great Smoky Mountains are controlled by subsurface bedrock erosion and diffusive slope processes. The results indicate rapid alluvial transport and minimal alluvial storage, and suggest that most of the cosmogenic nuclide inventory in sediments is accumulated while they are eroding from bedrock and traveling down hillslopes. Spatially homogeneous erosion rates of 25-30 mm/ky are calculated throughout the Great Smoky Mountains using measured concentrations of cosmogenic 10Be and 26Al in quartz separated from alluvial sediment. 10Be and 26Al concentrations in sediments collected from headwater tributaries that have no upstream samples (n=18) are consistent with an average erosion rate of 28 ± 8 mm/ky, similar to that of the outlet rivers (n=16, 24 ± 6 mm/ky), which carry most of the sediment out of the mountain range. Grain-size-specific analysis of 6 alluvial sediment samples shows higher nuclide concentrations in smaller grain sizes than in larger ones. The difference in concentrations arises from the large elevation distribution of the source of the smaller grains compared with the narrow and relatively low source elevation of the large grains. Large sandstone clasts disaggregate into sand-size grains rapidly during weathering and downslope transport; thus, only clasts from the lower parts of slopes reach the streams. 26Al/10Be ratios do not suggest significant burial periods for our samples. However, alluvial samples have lower 26Al/10Be ratios than bedrock and colluvial samples, a trend consistent with a longer integrated cosmic ray exposure history that includes periods of burial during downslope transport. The results confirm some of the basic ideas embedded in Davis' geographic cycle model, such as the reduction of relief through slope processes, and in Hack's dynamic equilibrium model, such as the similarity of erosion rates across different lithologies. Comparing cosmogenic nuclide data with other measured and calculated erosion rates for the Appalachians, we conclude that rates of erosion, integrated over time periods ranging from decades to a hundred million years, are similar, the result of equilibrium between erosion and isostatic uplift in the southern Appalachian Mountains.
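For readers unfamiliar with how such rates are derived, the sketch below applies the standard steady-state cosmogenic-nuclide relation N = P / (λ + ρε/Λ), solved for the erosion rate ε; the production rate, attenuation length, and density values are generic assumptions, not the study's calibrated inputs.

```python
import math

LAMBDA_10BE = math.log(2) / 1.39e6   # 10Be decay constant, 1/yr
ATTENUATION = 160.0                  # spallation attenuation length, g/cm^2 (assumed)
RHO = 2.7                            # rock density, g/cm^3 (assumed)

def erosion_rate_mm_per_ky(n_atoms_per_g, prod_atoms_per_g_yr):
    """Steady-state erosion rate from N = P / (lambda + rho*eps/Lambda)."""
    eps_cm_yr = (ATTENUATION / RHO) * (prod_atoms_per_g_yr / n_atoms_per_g
                                       - LAMBDA_10BE)
    return eps_cm_yr * 1.0e4         # cm/yr -> mm/ky

# e.g. N ~ 1.1e5 atoms/g with an assumed production rate of ~5 atoms/g/yr
print(f"{erosion_rate_mm_per_ky(1.1e5, 5.0):.1f} mm/ky")  # ~27 mm/ky
```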
Analysis of volatile organic compounds. [trace amounts of organic volatiles in gas samples
NASA Technical Reports Server (NTRS)
Zlatkis, A. (Inventor)
1977-01-01
An apparatus and method are described for reproducibly analyzing trace amounts of a large number of organic volatiles existing in a gas sample. Direct injection of the trapped volatiles into a cryogenic precolumn provides a sharply defined plug. Applications of the method include: (1) analyzing the headspace gas of body fluids and comparing a profile of the organic volatiles with standard profiles for the detection and monitoring of disease; (2) analyzing the headspace gas of foods and beverages and comparing the profile with standard profiles to monitor and control flavor and aroma; and (3) analyses for determining the organic pollutants in air or water samples.
NASA Technical Reports Server (NTRS)
Colwell, R. N.
1976-01-01
The Forestry Applications Project has been directed toward meeting the informational needs of resource managers by combining remote sensing data sources, including satellite data and conventional aerial photography, with direct measurement on the ground in whatever combinations best achieve these goals. It is recognized that sampling plays an important role in generating relevant information for managing large geographic populations. The central problem, therefore, is to define the kind and amount of sampling, and the place of remote sensing data sources in that sampling system, to do the best possible job of meeting the manager's informational needs.
Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed
2014-06-01
Sample preparation is an important analytical step for the isolation and concentration of desired components from complex matrices, and it greatly influences their reliable and accurate analysis and data quality. It is the most labor-intensive and error-prone process in analytical methodology and, therefore, may affect the analytical performance of target analyte quantification. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments, and low-cost operation through extremely low or no solvent consumption. Micro-extraction techniques, such as micro-extraction by packed sorbent (MEPS), have these advantages over the traditional techniques. This paper gives an overview of the MEPS technique, including the role of sample preparation in bioanalysis; a description of MEPS, namely its formats (on- and off-line), sorbents, and experimental setups and protocols; factors that affect MEPS performance; and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize recent applications of MEPS in bioanalysis. Copyright © 2014 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Straková, Petra; Laiho, Raija
2016-04-01
In this presentation, we assess the merits of using Fourier transform infrared (FTIR) spectra to estimate the organic matter composition of different plant biomass and peat soil samples. Infrared spectroscopy has great potential in large-scale peatland studies that require low-cost and high-throughput techniques, as it gives a unique "chemical overview" of a sample, with all the chemical compounds present contributing to the spectrum produced. Our extensive sample sets include soil samples ranging from boreal to tropical peatlands, including sites under different environmental and/or land-use changes; above- and below-ground biomass of different peatland plant species; and plant root mixtures. We mainly use FTIR to estimate (1) the chemical composition of the samples (e.g., total C and N, C:N ratio, holocellulose, lignin and ash content), (2) the proportion of each plant species in root mixtures, and (3) respiration of surface peat. The satisfactory results of our predictive models suggest that this approach can, for example, be used as a screening tool in the evaluation of organic matter composition in peatlands during monitoring of their degradation and/or restoration success.
A single mini-barcode test to screen for Australian mammalian predators from environmental samples
MacDonald, Anna J; Sarre, Stephen D
2017-01-01
Identification of species from trace samples is now possible through the comparison of diagnostic DNA fragments against reference DNA sequence databases. DNA detection of animals from non-invasive samples, such as predator faeces (scats) that contain traces of DNA from their species of origin, has proved to be a valuable tool for the management of elusive wildlife. However, application of this approach can be limited by the availability of appropriate genetic markers. Scat DNA is often degraded, meaning that longer DNA sequences, including standard DNA barcoding markers, are difficult to recover. Instead, short targeted markers are required to serve as diagnostic mini-barcodes. The mitochondrial genome is a useful source of such trace DNA markers because it provides good resolution at the species level and occurs in high copy numbers per cell. We developed a mini-barcode based on a short (178 bp) fragment of the conserved mitochondrial 12S ribosomal RNA gene, with the goal of discriminating amongst the scats of large mammalian predators of Australia. We tested the sensitivity and specificity of our primers and can accurately detect and discriminate amongst quolls, cats, dogs, foxes, and devils from trace DNA samples. Our approach provides a cost-effective, time-efficient, and non-invasive tool that enables identification of all 8 medium-large mammal predators in Australia, including native and introduced species, using a single test. With modification, this approach is likely to be of broad applicability elsewhere. PMID:28810700
The relation between statistical power and inference in fMRI
Wager, Tor D.; Yarkoni, Tal
2017-01-01
Statistically underpowered studies can result in experimental failure even when all other experimental considerations have been addressed impeccably. In fMRI, the combination of a large number of dependent variables, a relatively small number of observations (subjects), and the need to correct for multiple comparisons can decrease statistical power dramatically. This problem has been clearly addressed yet remains controversial, especially with regard to the expected effect sizes in fMRI, and especially for between-subjects effects such as group comparisons and brain-behavior correlations. We aimed to clarify the power problem by considering and contrasting two simulated scenarios of such possible brain-behavior correlations: weak diffuse effects and strong localized effects. Sampling from these scenarios shows that, particularly in the weak diffuse scenario, common sample sizes (n = 20-30) display extremely low statistical power, poorly represent the actual effects in the full sample, and show large variation on subsequent replications. Empirical data from the Human Connectome Project resemble the weak diffuse scenario much more than the strong localized scenario, which underscores the extent of the power problem for many studies. Possible solutions to the power problem include increasing the sample size, using less stringent thresholds, or focusing on a region of interest. However, these approaches are not always feasible and some have major drawbacks. The most prominent solutions that may help address the power problem include model-based (multivariate) prediction methods and meta-analyses with related synthesis-oriented approaches. PMID:29155843
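A minimal simulation in the spirit of the two scenarios contrasted above can make the point concrete: "weak diffuse" effects (many voxels, small correlations) versus "strong localized" effects (few voxels, large correlations). The effect sizes, voxel counts, and sample size below are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def detection_power(r_true, n_signal_voxels, n_total_voxels=1000,
                    n_subjects=25, n_sims=100, alpha=0.05):
    """Fraction of truly correlated voxels surviving Bonferroni correction."""
    alpha_corr = alpha / n_total_voxels
    hits = 0
    for _ in range(n_sims):
        behavior = rng.standard_normal(n_subjects)
        # signal voxels share variance with behavior at correlation r_true
        noise = rng.standard_normal((n_subjects, n_signal_voxels))
        signal = r_true * behavior[:, None] + np.sqrt(1 - r_true**2) * noise
        for j in range(n_signal_voxels):
            _, p = stats.pearsonr(behavior, signal[:, j])
            hits += p < alpha_corr
    return hits / (n_sims * n_signal_voxels)

print("weak diffuse (r=0.2, 200 voxels):   ", detection_power(0.2, 200))
print("strong localized (r=0.6, 10 voxels):", detection_power(0.6, 10))
```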
Derlet, Robert Wayne; Carlson, James Reynolds
2002-01-01
To determine the prevalence of microorganisms that are potentially pathogenic for humans in horse/mule manure along the John Muir Trail (JMT), random samples of horse/mule manure were collected along sections of the JMT in Yosemite, Kings Canyon, and Sequoia national parks (NP), as well as in portions of the Pacific Crest Trail (PCT) and selected JMT/PCT access trails. Convenience samples of wild animal scat found within 1 mile of trails were also collected. The fresh specimens were individually preserved both in 0.9% saline and in polyvinyl alcohol (PVA)-containing tubes and stored at 4 degrees C until the time of analysis. Bacteriological analysis was performed using standard microbiology laboratory procedures. PVA samples were stained with trichrome and then examined by a parasitologist. Collection: A total of 186 trail miles were sampled, including 113 on the JMT (Yosemite 37, Kings 53, and Sequoia 23). The PCT samplings included 24 miles, and NP and wilderness area access trails added an additional 49 miles. A total of 102 samples were collected, which included 81 samples from pack animals and 21 identified as having come from wild animals. Pack Animal Bacteria: All plated specimens grew large numbers of commensal gut flora. Potentially pathogenic bacteria were found in only 12 samples and included Hafnia alvei (4), Serratia odorifera (1), Citrobacter freundii (1), Escherichia vulneris (1), Clostridium clostridioforme (1), Yersinia enterocolitica (1), Sherwinella putraformus (1), and Enterobacter spp (4). No Escherichia coli O157, Salmonella, or Aeromonas were found. Microscopic examination for protozoal organisms revealed occasional commensal ciliates and 1 Giardia. Wild Animal Pathogens: One specimen grew Y enterocolitica, and another grew Enterobacter amnigenus. We found a low prevalence of human pathogens in pack animal manure on the JMT.
On the predictivity of pore-scale simulations: Estimating uncertainties with multilevel Monte Carlo
NASA Astrophysics Data System (ADS)
Icardi, Matteo; Boccardo, Gianluca; Tempone, Raúl
2016-09-01
A fast method with tunable accuracy is proposed to estimate errors and uncertainties in pore-scale and Digital Rock Physics (DRP) problems. The overall predictivity of these studies can, in fact, be hindered by many factors, including sample heterogeneity, computational and imaging limitations, model inadequacy, and imperfectly known physical parameters. The typical objective of pore-scale studies is the estimation of macroscopic effective parameters such as permeability, effective diffusivity, and hydrodynamic dispersion. However, these are often non-deterministic quantities (i.e., results obtained for a specific pore-scale sample and setup are not totally reproducible by another "equivalent" sample and setup). This stochastic nature can arise from multi-scale heterogeneity, the computational and experimental limitations in considering large samples, and the complexity of the physical models. These approximations, in fact, introduce an error that, being dependent on a large number of complex factors, can be modeled as random. We propose a general simulation tool, based on multilevel Monte Carlo, that can drastically reduce the computational cost needed to compute accurate statistics of effective parameters and other quantities of interest under any of these random errors. This is, to our knowledge, the first attempt to include Uncertainty Quantification (UQ) in pore-scale physics and simulation. The method can also provide estimates of the discretization error, and it is tested on three-dimensional transport problems in heterogeneous materials, where the sampling procedure is carried out by generation algorithms able to reproduce realistic consolidated and unconsolidated random sphere and ellipsoid packings and arrangements. A fully automatic workflow is developed in an open-source code [1] that includes rigid body physics and random packing algorithms, unstructured mesh discretization, finite volume solvers, and extrapolation and post-processing techniques. The proposed method can be used efficiently in many porous media applications for problems such as stochastic homogenization/upscaling, propagation of uncertainty from microscopic fluid and rock properties to macro-scale parameters, and robust estimation of Representative Elementary Volume size for arbitrary physics.
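The core of the approach is the multilevel Monte Carlo telescoping estimator, E[Q_L] = E[Q_0] + Σ_l E[Q_l − Q_{l−1}], sketched below on a toy problem in which a cheap coarse "solver" is corrected by a few expensive fine-level samples; the level model is a stand-in assumption, not the authors' pore-scale solver.

```python
import numpy as np

rng = np.random.default_rng(2)
TRUE_Q = 1.0  # "exact" effective parameter of the toy problem

def mlmc_estimate(n_per_level):
    """Telescoping sum E[Q_L] = E[Q_0] + sum_l E[Q_l - Q_{l-1}], with the
    level-l 'solver' standing in for a simulation on mesh size 2^-l."""
    # level 0: many cheap, coarse, noisy samples
    est = (TRUE_Q + 1.0 * rng.standard_normal(n_per_level[0])).mean()
    for lvl in range(1, len(n_per_level)):
        # in a real solver the fine and coarse runs share the same random
        # geometry (coupling); here that is mimicked by sharing the draw z
        z = rng.standard_normal(n_per_level[lvl])
        fine = TRUE_Q + 2.0 ** -lvl * z
        coarse = TRUE_Q + 2.0 ** -(lvl - 1) * z
        est += (fine - coarse).mean()
    return est

# many cheap coarse samples, progressively fewer expensive fine ones
print(mlmc_estimate([4000, 1000, 250, 60]))
```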
Loukas, Christos-Moritz; Mowlem, Matthew C; Tsaloglou, Maria-Nefeli; Green, Nicolas G
2018-05-01
This paper presents a novel portable sample filtration/concentration system, designed for use on samples of microorganisms with very low cell concentrations and large volumes, such as water-borne parasites, pathogens associated with faecal matter, or toxic phytoplankton. The example application used for demonstration was the in-field collection and concentration of microalgae from seawater samples. This type of organism is responsible for Harmful Algal Blooms (HABs), an example of which is commonly referred to as "red tides", which are typically the result of rapid proliferation and high biomass accumulation of harmful microalgal species in the water column or at the sea surface. For instance, Karenia brevis red tides cause aquatic organism mortality, and persistent blooms may cause widespread die-offs of populations of other organisms, including vertebrates. In order to respond to and adequately manage HABs, monitoring of toxic microalgae is required, and large-volume sample concentrators would be a useful tool for in situ monitoring of HABs. The filtering system presented in this work enables consistent sample collection and concentration from 1 L to 1 mL in five minutes, allowing for subsequent benchtop sample extraction and analysis using molecular methods such as NASBA and IC-NASBA. The microalga Tetraselmis suecica was successfully detected at concentrations ranging from 2 × 10⁵ cells/L down to 20 cells/L. Karenia brevis was also detected and quantified at concentrations between 10 cells/L and 10⁶ cells/L. Further analysis showed that the filter system, which concentrates cells from very large volumes with consequently more reliable sampling, produced samples that were more consistent than the independent non-filtered samples (benchtop controls), with a logarithmic dependency on increasing cell numbers. This filtering system provides simple, rapid, and consistent sample collection and concentration for further analysis, and could be applied to a wide range of different samples and target organisms in situations lacking laboratories. Copyright © 2018. Published by Elsevier B.V.
LSST Astroinformatics And Astrostatistics: Data-oriented Astronomical Research
NASA Astrophysics Data System (ADS)
Borne, Kirk D.; Stassun, K.; Brunner, R. J.; Djorgovski, S. G.; Graham, M.; Hakkila, J.; Mahabal, A.; Paegert, M.; Pesenson, M.; Ptak, A.; Scargle, J.; Informatics, LSST; Statistics Team
2011-01-01
The LSST Informatics and Statistics Science Collaboration (ISSC) focuses on research and scientific discovery challenges posed by the very large and complex data collection that LSST will generate. Application areas include astroinformatics, machine learning, data mining, astrostatistics, visualization, scientific data semantics, time series analysis, and advanced signal processing. Research problems to be addressed with these methodologies include transient event characterization and classification, rare class discovery, correlation mining, outlier/anomaly/surprise detection, improved estimators (e.g., for photometric redshift or early onset supernova classification), exploration of highly dimensional (multivariate) data catalogs, and more. We present sample science results from these data-oriented approaches to large-data astronomical research. We present results from LSST ISSC team members, including the EB (Eclipsing Binary) Factory, the environmental variations in the fundamental plane of elliptical galaxies, and outlier detection in multivariate catalogs.
Evaluation of Respondent-Driven Sampling
McCreesh, Nicky; Frost, Simon; Seeley, Janet; Katongole, Joseph; Tarsh, Matilda Ndagire; Ndunguse, Richard; Jichi, Fatima; Lunel, Natasha L; Maher, Dermot; Johnston, Lisa G; Sonnenberg, Pam; Copas, Andrew J; Hayes, Richard J; White, Richard G
2012-01-01
Background Respondent-driven sampling is a novel variant of link-tracing sampling for estimating the characteristics of hard-to-reach groups, such as HIV prevalence in sex workers. Despite its use by leading health organizations, the performance of this method in realistic situations is still largely unknown. We evaluated respondent-driven sampling by comparing estimates from a respondent-driven sampling survey with total-population data. Methods Total-population data on age, tribe, religion, socioeconomic status, sexual activity and HIV status were available for a population of 2402 male household heads from an open cohort in rural Uganda. A respondent-driven sampling (RDS) survey was carried out in this population, employing current methods of sampling (RDS sample) and statistical inference (RDS estimates). Analyses were carried out for the full RDS sample and then repeated for the first 250 recruits (small sample). Results We recruited 927 household heads. Full and small RDS samples were largely representative of the total population, but both samples under-represented men who were younger, of higher socioeconomic status, and with unknown sexual activity and HIV status. Respondent-driven-sampling statistical-inference methods failed to reduce these biases. Only 31%-37% (depending on method and sample size) of RDS estimates were closer to the true population proportions than the RDS sample proportions. Only 50%-74% of respondent-driven-sampling bootstrap 95% confidence intervals included the population proportion. Conclusions Respondent-driven sampling produced a generally representative sample of this well-connected non-hidden population. However, current respondent-driven-sampling inference methods failed to reduce bias when it occurred. Whether the data required to remove bias and measure precision can be collected in a respondent-driven sampling survey is unresolved. Respondent-driven sampling should be regarded as a (potentially superior) form of convenience sampling, and caution is required when interpreting findings based on this sampling method. PMID:22157309
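For context, a commonly used RDS inference method of the kind evaluated in this study is the RDS-II (Volz-Heckathorn) estimator, which reweights the sample by inverse reported network degree; the sketch below uses synthetic placeholder data, not the Ugandan cohort.

```python
import numpy as np

rng = np.random.default_rng(3)
degrees = rng.integers(1, 30, size=927)    # self-reported network sizes
has_trait = rng.random(927) < 0.3          # e.g. an HIV-status indicator

def rds_ii(trait, degree):
    """Inverse-degree weighting corrects for high-degree individuals being
    over-represented along recruitment chains."""
    w = 1.0 / degree
    return float(np.sum(w * trait) / np.sum(w))

print("naive sample proportion:", has_trait.mean())
print("RDS-II estimate:        ", rds_ii(has_trait, degrees))
```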
17 CFR Appendix B to Part 420 - Sample Large Position Report
Code of Federal Regulations, 2013 CFR
2013-04-01
[Form excerpt: the sample Large Position Report calls for the security being reported, the date for which information is being reported, position amounts in millions at par value as of trade date (including the total net trading position, gross financing position, and principal components of the specific security), and a memorandum reporting the gross par amount of fails to deliver included in the calculation of line item 3.]
17 CFR Appendix B to Part 420 - Sample Large Position Report
Code of Federal Regulations, 2012 CFR
2012-04-01
[Form excerpt: the sample Large Position Report calls for the security being reported, the date for which information is being reported, position amounts in millions at par value as of trade date (including the total net trading position, gross financing position, and principal components of the specific security), and a memorandum reporting the gross par amount of fails to deliver included in the calculation of line item 3.]
17 CFR Appendix B to Part 420 - Sample Large Position Report
Code of Federal Regulations, 2011 CFR
2011-04-01
[Form excerpt: the sample Large Position Report calls for the security being reported, the date for which information is being reported, position amounts in millions at par value as of trade date (including the total net trading position, gross financing position, and principal components of the specific security), and a memorandum reporting the gross par amount of fails to deliver included in the calculation of line item 3.]
Verbal and Performance IQ for Discrimination Among Psychiatric Diagnostic Groups
ERIC Educational Resources Information Center
Loro, Bert; Woodward, J. Arthur
1976-01-01
In view of the practical and theoretical importance of the issues involved, the current research was undertaken to investigate the diagnostic relevance of WAIS Verbal and Performance IQ in a large sample of psychiatric patients that included a variety of functional diagnostic groups as well as groups of mentally deficient and organic brain…
Longitudinal Surveys of Australian Youth (LSAY): 1995 Cohort: User Guide. Technical Report 49
ERIC Educational Resources Information Center
National Centre for Vocational Education Research (NCVER), 2009
2009-01-01
The Longitudinal Surveys of Australian Youth (LSAY) is a research program that tracks young people as they move from school into further study, work and other destinations. It uses large, nationally representative samples of young people to collect information about education and training, work, and social development. It includes surveys…
Development and Examination of the Social Appearance Anxiety Scale
ERIC Educational Resources Information Center
Hart, Trevor A.; Flora, David B.; Palyo, Sarah A.; Fresco, David M.; Holle, Christian; Heimberg, Richard G.
2008-01-01
The Social Appearance Anxiety Scale (SAAS) was created to measure anxiety about being negatively evaluated by others because of one's overall appearance, including body shape. This study examined the psychometric properties of the SAAS in three large samples of undergraduate students (respective ns = 512, 853, and 541). The SAAS demonstrated a…
ERIC Educational Resources Information Center
Bonney, Lewis A.
The steps taken by a large urban school district to develop and implement an objectives-based curriculum with criterion-referenced assessment of student progress are described. These steps include: goal setting, development of curriculum objectives, construction of assessment exercises, matrix sampling in test administration, and reporting of…
ERIC Educational Resources Information Center
Hall, William J.
2016-01-01
This article describes the development and preliminary validation of the Bullying, Harassment, and Aggression Receipt Measure (BullyHARM). The development of the BullyHARM involved a number of steps and methods, including a literature review, expert review, cognitive testing, readability testing, data collection from a large sample, reliability…
ERIC Educational Resources Information Center
Hatcher, Robert L.; Rogers, Daniel T.
2009-01-01
An Inventory of Interpersonal Strengths (IIS) was developed and validated in a series of large college student samples. Based on interpersonal theory and associated methods, the IIS was designed to assess positive characteristics representing the full range of interpersonal domains, including those generally thought to have negative qualities…
Learning Communities: Foundations for First-Year Students' Development of Pluralistic Outcomes
ERIC Educational Resources Information Center
Soria, Krista M.; Mitchell, Tania D.
2015-01-01
The purpose of this study was to investigate the associations between first-year undergraduates' (n = 1,701) participation in learning communities and their development of leadership and multicultural competence. The sample included first-year students who were enrolled at six large, public research universities in 2012 and completed the Student…
The Characteristics and Quality of Pre-School Education in Spain
ERIC Educational Resources Information Center
Sandstrom, Heather
2012-01-01
We examined 25 pre-school classrooms of four-year-olds from a random sample of 15 schools within a large city in southern Spain. Observational measures of classroom quality included the Early Childhood Environment Rating Scale-Revised, the Classroom Assessment Scoring System and the Observation of Activities in Pre-school. Findings revealed…
Educational Interests of Disadvantaged and Non-Disadvantaged Iowa Household Heads.
ERIC Educational Resources Information Center
Arendt, Donald Philip
A study was made of 538 disadvantaged and 247 non-disadvantaged household heads in Iowa -- their occupation, training desired, material possessions, membership and participation. The sample included 643 males and 142 females and was distributed in zones from open country to large urban areas. According to the prescribed criteria 14% of the…
The Marshall Islands Data Management Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stoker, A.C.; Conrado, C.L.
1995-09-01
This report is a resource document of the methods and procedures used currently in the Data Management Program of the Marshall Islands Dose Assessment and Radioecology Project. Since 1973, over 60,000 environmental samples have been collected. Our program includes relational database design, programming and maintenance; sample and information management; sample tracking; quality control; and data entry, evaluation and reduction. The usefulness of scientific databases involves careful planning in order to fulfill the requirements of any large research program. Compilation of scientific results requires consolidation of information from several databases, and incorporation of new information as it is generated. The success in combining and organizing all radionuclide analysis, sample information and statistical results into a readily accessible form is critical to our project.
Temperature Control Diagnostics for Sample Environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Santodonato, Louis J; Walker, Lakeisha MH; Church, Andrew J
2010-01-01
In a scientific laboratory setting, standard equipment such as cryocoolers are often used as part of a custom sample environment system designed to regulate temperature over a wide range. The end user may be more concerned with precise sample temperature control than with base temperature. But cryogenic systems tend to be specified mainly in terms of cooling capacity and base temperature. Technical staff at scientific user facilities (and perhaps elsewhere) often wonder how to best specify and evaluate temperature control capabilities. Here we describe test methods and give results obtained at a user facility that operates a large sample environment inventory. Although this inventory includes a wide variety of temperature, pressure, and magnetic field devices, the present work focuses on cryocooler-based systems.
Guerra Valero, Yarmarly C; Wallis, Steven C; Lipman, Jeffrey; Stove, Christophe; Roberts, Jason A; Parker, Suzanne L
2018-03-01
Conventional sampling techniques for clinical pharmacokinetic studies often require the removal of large blood volumes from patients. This can result in a physiological or emotional burden, particularly for neonates or pediatric patients. Antibiotic pharmacokinetic studies are typically performed on healthy adults or general ward patients. These may not account for alterations to a patient's pathophysiology and can lead to suboptimal treatment. Microsampling offers an important opportunity for clinical pharmacokinetic studies in vulnerable patient populations, where smaller sample volumes can be collected. This systematic review provides a description of currently available microsampling techniques and an overview of studies reporting the quantitation and validation of antibiotics using microsampling. A comparison of microsampling to conventional sampling in clinical studies is included.
Strijker, Marin; Gerritsen, Arja; van Hilst, Jony; Bijlsma, Maarten F; Bonsing, Bert A; Brosens, Lodewijk A; Bruno, Marco J; van Dam, Ronald M; Dijk, Frederike; van Eijck, Casper H; Farina Sarasqueta, Arantza; Fockens, Paul; Gerhards, Michael F; Groot Koerkamp, Bas; van der Harst, Erwin; de Hingh, Ignace H; van Hooft, Jeanin E; Huysentruyt, Clément J; Kazemier, Geert; Klaase, Joost M; van Laarhoven, Cornelis J; van Laarhoven, Hanneke W; Liem, Mike S; de Meijer, Vincent E; van Rijssen, L Bengt; van Santvoort, Hjalmar C; Suker, Mustafa; Verhagen, Judith H; Verheij, Joanne; Verspaget, Hein W; Wennink, Roos A; Wilmink, Johanna W; Molenaar, I Quintus; Boermeester, Marja A; Busch, Olivier R; Besselink, Marc G
2018-04-01
Large biobanks with uniform collection of biomaterials and associated clinical data are essential for translational research. The Netherlands has traditionally been well organized in multicenter clinical research on pancreatic diseases, including the nationwide multidisciplinary Dutch Pancreatic Cancer Group and Dutch Pancreatitis Study Group. To enable high-quality translational research on pancreatic and periampullary diseases, these groups established the Dutch Pancreas Biobank. The Dutch Pancreas Biobank is part of the Parelsnoer Institute and involves all 8 Dutch university medical centers and 5 nonacademic hospitals. Adult patients undergoing pancreatic surgery (all indications) are eligible for inclusion. Preoperative blood samples, tumor tissue from resected specimens, pancreatic cyst fluid, and follow-up blood samples are collected. Clinical parameters are collected in conjunction with the mandatory Dutch Pancreatic Cancer Audit. Between January 2015 and May 2017, 488 patients were included in the first 5 participating centers: 4 university medical centers and 1 nonacademic hospital. Over 2500 samples were collected: 1308 preoperative blood samples, 864 tissue samples, and 366 follow-up blood samples. Prospective collection of biomaterials and associated clinical data has started in the Dutch Pancreas Biobank. Subsequent translational research will aim to improve treatment decisions based on disease characteristics.
Phillips, P.; Chalmers, A.
2009-01-01
Some sources of organic wastewater compounds (OWCs) to streams, lakes, and estuaries, including wastewater-treatment-plant effluent, have been well documented, but other sources, particularly wet-weather discharges from combined-sewer overflows (CSOs) and urban runoff, may also be major sources of OWCs. Samples of wastewater-treatment-plant (WWTP) effluent, CSO effluent, urban streams, large rivers, a reference (undeveloped) stream, and Lake Champlain were collected from March to August 2006. The highest concentrations of many OWCs associated with wastewater were in WWTP-effluent samples, but high concentrations of some OWCs were also detected in samples of CSO effluent and storm runoff from urban streams subject to leaky sewer pipes or CSOs. Total concentrations and numbers of compounds detected differed substantially among sampling sites. The highest total OWC concentrations (10-100 μg/L) were in samples of WWTP and CSO effluent. Total OWC concentrations in samples from urban streams ranged from 0.1 to 10 μg/L, and urban stream stormflow samples had higher concentrations than baseflow samples because of contributions of OWCs from CSOs and leaking sewer pipes. The relations between OWC concentrations in WWTP effluent and those in CSO effluent and urban streams varied with the degree to which the compound is removed during normal wastewater treatment. Concentrations of compounds that are largely removed during normal wastewater treatment [including caffeine, tris(2-butoxyethyl) phosphate, and cholesterol] were generally similar to or higher in CSO effluent than in WWTP effluent (ranging from around 1 to over 10 μg/L) because CSO effluent is untreated, and were higher in urban-stream stormflow samples than in baseflow samples as a result of CSO discharge and leakage from near-surface sources during storms. Concentrations of compounds that are poorly removed during treatment, by contrast, were higher in WWTP effluent than in CSO effluent, due to dilution. Results indicate that CSO effluent and urban stormwater can be a significant source of OWCs entering large water bodies such as Burlington Bay. © 2008 American Water Resources Association.
Classifying sensory profiles of children in the general population.
Little, L M; Dean, E; Tomchek, S D; Dunn, W
2017-01-01
The aim of this study was to subtype groups of children in a community sample, with and without developmental conditions, based on sensory processing patterns. We used latent profile analysis to determine the number of sensory subtypes in a sample of n = 1132 children aged 3-14 years with typical development and developmental conditions, including autism spectrum disorder (ASD), attention-deficit hyperactivity disorder and learning disabilities. A five-subtype solution was found to best characterize the sample; the subtypes differed in the overall degree and differential presentation of sensory processing patterns. Children with and without developmental conditions were present across subtypes, and one subtype was significantly younger than the others (P < 0.05). Our results show that sensory subtypes include both children with typical development and those with developmental conditions. Sensory subtypes have previously been investigated in ASD only, and our results suggest that similar subtypes are present in a sample reflective of the general population of children, including those largely with typical development. Elevated scores on sensory processing patterns are not unique to ASD but rather reflect children's abilities to respond to environmental demands. © 2016 John Wiley & Sons Ltd.
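Latent profile analysis of this kind is closely related to fitting a Gaussian mixture model and selecting the number of profiles by an information criterion. The sketch below, with synthetic stand-in scores on an assumed four sensory-pattern variables, illustrates the model-selection step only, not the study's actual data or software.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)
# stand-in scores on four sensory patterns for 1132 children, drawn from
# three synthetic clusters so the selection step has something to find
X = np.vstack([rng.normal(m, 1.0, size=(n, 4))
               for m, n in [(0.0, 400), (1.5, 400), (3.0, 332)]])

bics = {k: GaussianMixture(n_components=k, random_state=0).fit(X).bic(X)
        for k in range(1, 8)}
best_k = min(bics, key=bics.get)
print("BIC by number of profiles:", bics)
print("selected number of subtypes:", best_k)
```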
United States planetary rover status: 1989
NASA Technical Reports Server (NTRS)
Pivirotto, Donna L. S.; Dias, William C.
1990-01-01
A spectrum of concepts for planetary rovers and rover missions is covered. Rovers studied range from tiny micro rovers to large and highly automated vehicles capable of traveling hundreds of kilometers and performing complex tasks. Rover concepts are addressed both for the Moon and Mars, including a Lunar/Mars common rover capable of supporting either program with relatively small modifications. Mission requirements considered include both Science and Human Exploration. Studies include a range of autonomy in rovers, from interactive teleoperated systems to those requiring an onboard System Executive making very high-level decisions. Both high and low technology rover options are addressed. Subsystems are described for a representative selection of these rovers, including: Mobility, Sample Acquisition, Science, Vehicle Control, Thermal Control, Local Navigation, Computation and Communications. System descriptions of rover concepts include diagrams, technology levels, system characteristics, and performance measured in terms of distance covered, samples collected, and area surveyed for specific representative missions. Rover development schedules and costs are addressed for Lunar and Mars exploration initiatives.
Dry, Sarah M; Garrett, Sarah B; Koenig, Barbara A; Brown, Arleen F; Burgess, Michael M; Hult, Jen R; Longstaff, Holly; Wilcox, Elizabeth S; Madrigal Contreras, Sigrid Karina; Martinez, Arturo; Boyd, Elizabeth A; Dohan, Daniel
2017-01-01
United States-based biorepositories are on the cusp of substantial change in regulatory oversight at the same time that they are increasingly including samples and data from large populations, e.g., all patients in a healthcare system. It is appropriate to engage stakeholders from these populations in new governance arrangements. We sought to describe community recommendations for biorepository governance and oversight using deliberative community engagement (DCE), a qualitative research method designed to elicit lay perspectives on complex technical issues. We asked stakeholders to provide input on governance of large biorepositories at the University of California (UC), a public university. We defined state residents as stakeholders and recruited residents from two large metropolitan areas, Los Angeles (LA) and San Francisco (SF). In LA, we recruited English and Spanish speakers; in SF the DCE was conducted in English only. We recruited individuals who had completed the 2009 California Health Interview Survey and were willing to be re-contacted for future studies. Using stratified random sampling (by age, education, race/ethnicity), we contacted 162 potential deliberants, of whom 53 agreed to participate and 51 completed the 4-day DCE in June (LA) and September-October (SF) 2013. Each DCE included discussion among deliberants facilitated by trained staff and simultaneously translated in LA. Deliberants also received a briefing book describing biorepository operations and regulation. During the final day of the DCE, deliberants voted on governance and oversight recommendations using an audience response system. This paper describes 23 recommendations (of 57 total) that address issues including: educating the public, sharing samples broadly, monitoring researcher behavior, using informative consent procedures, and involving community members in a transparent process of biobank governance. This project demonstrates the feasibility of obtaining meaningful input on biorepository governance from diverse lay stakeholders. Such input should be considered as research institutions respond to changes in biorepository regulation.
Large-format InGaAs focal plane arrays for SWIR imaging
NASA Astrophysics Data System (ADS)
Hood, Andrew D.; MacDougal, Michael H.; Manzo, Juan; Follman, David; Geske, Jonathan C.
2012-06-01
FLIR Electro Optical Components will present our latest developments in large InGaAs focal plane arrays, which are used for low-light-level imaging in the short wavelength infrared (SWIR) regime. We will show imaging from our latest small-pitch (15 μm) focal plane arrays in VGA and High Definition (HD) formats, present characterization of the FPAs, including dark current measurements and the use of correlated double sampling to reduce read noise, and show imagery as well as FPA-level characterization data.
Microfluidic ultrasonic particle separators with engineered node locations and geometries
Rose, Klint A.; Fisher, Karl A.; Wajda, Douglas A.; Mariella, Jr., Raymond P.; Bailey, Christopher; Dehlinger, Dietrich; Shusteff, Maxim; Jung, Byoungsok; Ness, Kevin D.
2016-04-26
An ultrasonic microfluidic system includes a separation channel for conveying a sample fluid containing small particles and large particles, flowing substantially parallel and adjacent to a recovery fluid, with which it is in contact. An acoustic transducer produces an ultrasound standing wave that generates a pressure field having at least one node of minimum pressure amplitude. An acoustic extension structure is located proximate to said separation channel for positioning said acoustic node off center in said acoustic area and concentrating the large particles in said recovery fluid stream.
Microfluidic ultrasonic particle separators with engineered node locations and geometries
Rose, Klint A; Fisher, Karl A; Wajda, Douglas A; Mariella, Jr., Raymond P; Bailey, Christopher; Dehlinger, Dietrich; Shusteff, Maxim; Jung, Byoungsok; Ness, Kevin D
2015-03-31
An ultrasonic microfluidic system includes a separation channel for conveying a sample fluid containing small particles and large particles, flowing substantially parallel and adjacent to a recovery fluid, with which it is in contact. An acoustic transducer produces an ultrasound standing wave that generates a pressure field having at least one node of minimum pressure amplitude. An acoustic extension structure is located proximate to said separation channel for positioning said acoustic node off center in said acoustic area and concentrating the large particles in said recovery fluid stream.
Microfluidic ultrasonic particle separators with engineered node locations and geometries
Rose, Klint A; Fisher, Karl A; Wajda, Douglas A; Mariella, Jr., Raymond P; Bailey, Christopher; Dehlinger, Dietrich; Shusteff, Maxim; Jung, Byoungsok; Ness, Kevin D
2014-05-20
An ultrasonic microfluidic system includes a separation channel for conveying a sample fluid containing small particles and large particles, flowing substantially parallel and adjacent to a recovery fluid, with which it is in contact. An acoustic transducer produces an ultrasound standing wave that generates a pressure field having at least one node of minimum pressure amplitude. An acoustic extension structure is located proximate to said separation channel for positioning said acoustic node off center in said acoustic area and concentrating the large particles in said recovery fluid stream.
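The physics behind these devices is that the primary acoustic radiation force in a standing wave scales with particle volume, so large particles migrate to the pressure node far faster than small ones. The sketch below evaluates the standard Gor'kov expression for a planar standing wave; the frequency, energy density, and material properties are illustrative assumptions, not values from the patents.

```python
import numpy as np

def radiation_force(radius_m, x_m, freq_hz=2e6, e_ac=10.0,
                    rho_p=1050.0, rho_f=1000.0, c_p=1600.0, c_f=1480.0):
    """F = 4*pi*Phi*k*R^3*E_ac*sin(2kx) with the Gor'kov contrast factor Phi;
    x is the distance from the pressure node, E_ac the acoustic energy density."""
    k = 2 * np.pi * freq_hz / c_f
    kappa_ratio = (rho_f * c_f**2) / (rho_p * c_p**2)  # compressibility ratio
    rho_ratio = rho_p / rho_f
    phi = (5 * rho_ratio - 2) / (3 * (2 * rho_ratio + 1)) - kappa_ratio / 3
    return 4 * np.pi * phi * k * radius_m**3 * e_ac * np.sin(2 * k * x_m)

x = 50e-6  # 50 um from the node
print("10 um particle:", radiation_force(5e-6, x), "N")   # ~1000x larger force
print("1 um particle: ", radiation_force(0.5e-6, x), "N")
```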
Occurrence of Radio Minihalos in a Mass-Limited Sample of Galaxy Clusters
NASA Technical Reports Server (NTRS)
Giacintucci, Simona; Markevitch, Maxim; Cassano, Rossella; Venturi, Tiziana; Clarke, Tracy E.; Brunetti, Gianfranco
2017-01-01
We investigate the occurrence of radio minihalos (diffuse radio sources of unknown origin observed in the cores of some galaxy clusters) in a statistical sample of 58 clusters drawn from the Planck Sunyaev-Zeldovich cluster catalog using a mass cut (M₅₀₀ > 6 × 10¹⁴ solar masses). We supplement our statistical sample with a similarly sized nonstatistical sample mostly consisting of clusters in the ACCEPT X-ray catalog with suitable X-ray and radio data, which includes lower-mass clusters. Where necessary (for nine clusters), we reanalyzed the Very Large Array archival radio data to determine whether a minihalo is present. Our total sample includes all 28 currently known and recently discovered radio minihalos, including six candidates. We classify clusters as cool-core or non-cool-core according to the value of the specific entropy floor in the cluster center, rederived or newly derived from the Chandra X-ray density and temperature profiles where necessary (for 27 clusters). Contrary to the common wisdom that minihalos are rare, we find that almost all cool cores (at least 12 out of 15, or 80%) in our complete sample of massive clusters exhibit minihalos. The supplementary sample shows that the occurrence of minihalos may be lower in lower-mass cool-core clusters. No minihalos are found in non-cool cores or "warm cores." These findings will help test theories of the origin of minihalos and provide information on the physical processes and energetics of the cluster cores.
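The cool-core criterion used in such studies can be made concrete: classify by the central entropy K = kT · n_e^(−2/3) derived from X-ray temperature and density profiles. The sketch below is a generic illustration; the 30 keV cm² dividing line is an assumed placeholder, not necessarily the paper's threshold.

```python
def specific_entropy(kt_kev, n_e_cm3):
    """X-ray 'entropy' K = kT * n_e^(-2/3), in keV cm^2."""
    return kt_kev * n_e_cm3 ** (-2.0 / 3.0)

def is_cool_core(kt_kev, n_e_cm3, k_threshold_kev_cm2=30.0):
    # the threshold is an assumed placeholder for the dividing line
    return specific_entropy(kt_kev, n_e_cm3) < k_threshold_kev_cm2

print(is_cool_core(3.0, 0.05))    # dense, cool center: K ~ 22 keV cm^2 -> True
print(is_cool_core(8.0, 0.005))   # diffuse, hot center: K ~ 274 keV cm^2 -> False
```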
Doré, Evelyne; Deshommes, Elise; Andrews, Robert C; Nour, Shokoufeh; Prévost, Michèle
2018-04-21
Legacy lead and copper components are ubiquitous in the plumbing of large buildings, including schools that serve the children most vulnerable to lead exposure. Lead and copper samples must be collected after varying stagnation times and interpreted with reference to different thresholds. A total of 130 outlets (fountains, bathroom and kitchen taps) were sampled for dissolved and particulate lead as well as copper. Sampling was conducted at 8 schools and 3 institutional (non-residential) buildings served by municipal water of varying corrosivity, with and without corrosion control (CC), and without a lead service line. Samples included first draw following overnight stagnation (>8 h), partially (30 s) and fully (5 min) flushed samples, and first draw after 30 min of stagnation. Total lead concentrations in first-draw samples after overnight stagnation varied widely, from 0.07 to 19.9 μg Pb/L (median: 1.7 μg Pb/L), for large buildings served with non-corrosive water. Higher concentrations were observed in schools with corrosive water without CC (0.9-201 μg Pb/L, median: 14.3 μg Pb/L), while levels in schools with CC ranged from 0.2 to 45.1 μg Pb/L (median: 2.1 μg Pb/L). Partial flushing (30 s) and full flushing (5 min) reduced concentrations by 88% and 92%, respectively, for corrosive waters without CC. Lead concentrations were <10 μg Pb/L in all samples following 5 min of flushing. However, after only 30 min of stagnation, first-draw concentrations rebounded to >45% of the values measured in first-draw samples collected after overnight stagnation. Concentrations of particulate Pb varied widely (≥0.02-846 μg Pb/L) and were the cause of the very high total Pb concentrations in the 2% of samples exceeding 50 μg Pb/L. Pb levels across outlets within the same building varied widely (up to 1000-fold), especially in corrosive water (0.85-851 μg Pb/L after 30 min of stagnation), confirming the need to sample at each outlet to identify high-risk taps. Based on the much higher concentrations observed in first-draw samples, even after a short stagnation, the first 250 mL should be discarded unless no sources of lead are present. The results question the cost-benefit of daily or weekly flushing as a remediation strategy. As such, current regulatory requirements may fail to protect children because they may not identify problematic taps and effective mitigation measures. Copyright © 2018 Elsevier Ltd. All rights reserved.
Guagliardo, Sarah Anne; Morrison, Amy C.; Barboza, Jose Luis; Requena, Edwin; Astete, Helvio; Vazquez-Prokopec, Gonzalo; Kitron, Uriel
2015-01-01
Background and Objectives The dramatic range expansion of the dengue vector Aedes aegypti is associated with various anthropogenic transport activities, but little is known about the underlying mechanisms driving this geographic expansion. We longitudinally characterized infestation of different vehicle types (cars, boats, etc.) to estimate the frequency and intensity of mosquito introductions into novel locations (propagule pressure). Methods Exhaustive adult and immature Ae. aegypti collections were performed on six different vehicle types at five ports and two bus/taxi departure points in the Amazonian city of Iquitos, Peru during 2013. Aquatic vehicles included 32 large and 33 medium-sized barges, 53 water taxis, and 41 speed boats. Terrestrial vehicles sampled included 40 buses and 30 taxis traveling on the only highway in the region. Ae. aegypti adult infestation rates and immature indices were analyzed by vehicle type, location within vehicles, and sampling date. Results Large barges (71.9% infested) and medium barges (39.4% infested) accounted for most of the infestations. Notably, buses had an overall infestation rate of 12.5%. On large barges, the greatest number of Ae. aegypti adults were found in October, whereas most immatures were found in February followed by October. The vast majority of larvae (85.9%) and pupae (76.7%) collected on large barges were produced in puddles formed in cargo holds. Conclusions Because large barges provide suitable mosquito habitats (due to dark, damp cargo storage spaces and ample oviposition sites), we conclude that they likely serve as significant contributors to mosquitoes' propagule pressure across long distances throughout the Peruvian Amazon. This information can help anticipate vector population mixing and future range expansions of dengue and other viruses transmitted by Ae. aegypti. PMID:25860352
Guagliardo, Sarah Anne; Morrison, Amy C; Barboza, Jose Luis; Requena, Edwin; Astete, Helvio; Vazquez-Prokopec, Gonzalo; Kitron, Uriel
2015-04-01
The dramatic range expansion of the dengue vector Aedes aegypti is associated with various anthropogenic transport activities, but little is known about the underlying mechanisms driving this geographic expansion. We longitudinally characterized infestation of different vehicle types (cars, boats, etc.) to estimate the frequency and intensity of mosquito introductions into novel locations (propagule pressure). Exhaustive adult and immature Ae. aegypti collections were performed on six different vehicle types at five ports and two bus/taxi departure points in the Amazonian city of Iquitos, Peru during 2013. Aquatic vehicles included 32 large and 33 medium-sized barges, 53 water taxis, and 41 speed boats. Terrestrial vehicles sampled included 40 buses and 30 taxis traveling on the only highway in the region. Ae. aegypti adult infestation rates and immature indices were analyzed by vehicle type, location within vehicles, and sampling date. Large barges (71.9% infested) and medium barges (39.4% infested) accounted for most of the infestations. Notably, buses had an overall infestation rate of 12.5%. On large barges, the greatest number of Ae. aegypti adults were found in October, whereas most immatures were found in February followed by October. The vast majority of larvae (85.9%) and pupae (76.7%) collected on large barges were produced in puddles formed in cargo holds. Because large barges provide suitable mosquito habitats (due to dark, damp cargo storage spaces and ample oviposition sites), we conclude that they likely serve as significant contributors to mosquitoes' propagule pressure across long distances throughout the Peruvian Amazon. This information can help anticipate vector population mixing and future range expansions of dengue and other viruses transmitted by Ae. aegypti.
Cozzolino, Daniel
2015-03-30
Vibrational spectroscopy encompasses a number of techniques and methods, including ultraviolet, visible, Fourier transform (mid) infrared, near infrared and Raman spectroscopy. The use and application of spectroscopy generates spectra containing hundreds of variables (absorbances at each wavenumber or wavelength), resulting in large data sets representing the chemical and biochemical wine fingerprint. Multivariate data analysis techniques are then required to handle the large amount of data generated and to interpret the spectra in a meaningful way when developing a specific application. This paper focuses on developments in sample presentation and the main sources of error when vibrational spectroscopy methods are applied to wine analysis. Recent and novel applications are discussed as examples of these developments. © 2014 Society of Chemical Industry.
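As a generic illustration of the multivariate step mentioned above (not a method from this paper), the sketch below compresses hundreds of collinear absorbance variables from simulated spectra into a few principal components, the usual precursor to building a calibration model.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
wavenumbers = np.linspace(900, 2500, 400)
# 60 synthetic "wine" spectra: two latent chemical signals plus noise
c1, c2 = rng.random((2, 60, 1))
spectra = (c1 * np.exp(-((wavenumbers - 1450) / 60) ** 2) +
           c2 * np.exp(-((wavenumbers - 1050) / 40) ** 2) +
           0.01 * rng.standard_normal((60, 400)))

pca = PCA(n_components=5).fit(spectra)
print("variance explained per component:",
      np.round(pca.explained_variance_ratio_, 3))
```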
Contact x-ray microscopy using Asterix
NASA Astrophysics Data System (ADS)
Conti, Aldo; Batani, Dimitri; Botto, Cesare; Masini, Alessandra; Bernardinello, A.; Bortolotto, Fulvia; Moret, M.; Poletti, G.; Piccoli, S.; Cotelli, F.; Lora Lamia Donin, C.; Stead, Anthony D.; Marranca, A.; Eidmann, Klaus; Flora, Francesco; Palladino, Libero; Reale, Lucia
1997-10-01
The use of a high-energy laser source for soft x-ray contact microscopy is discussed. Several different targets were used and their emission spectra compared. The x-ray emission, inside and outside the Water Window, was characterized in detail by means of many diagnostics, including pinhole and streak cameras. Up to 12 sample holders per shot were exposed thanks to the large x-ray flux and the geometry of the interaction chamber. Images of several biological samples were obtained, including Chlamydomonas and Crithidia green algae, fish and boar sperm, and Saccharomyces cerevisiae yeast cells. A 50 nm resolution was reached on the images of boar sperm. Original information concerning the density of inner structures of Crithidia green algae was obtained.
Scanning tunneling spectroscopy under large current flow through the sample.
Maldonado, A; Guillamón, I; Suderow, H; Vieira, S
2011-07-01
We describe a method for performing scanning tunneling microscopy/spectroscopy imaging at very low temperatures while driving a constant electric current of up to some tens of mA through the sample. It provides a new local probe, which we term current-driven scanning tunneling microscopy/spectroscopy. We show spectroscopic and topographic measurements under the application of a current in superconducting Al and NbSe₂ at 100 mK. Prospective applications of this local imaging method include local vortex-motion experiments and Doppler-shift studies of the local density of states.
Nowcasting and Forecasting the Monthly Food Stamps Data in the US Using Online Search Data
Fantazzini, Dean
2014-01-01
We propose the use of Google online search data for nowcasting and forecasting the number of food stamps recipients. We perform a large out-of-sample forecasting exercise with almost 3000 competing models and forecast horizons up to 2 years ahead, and we show that models including Google search data statistically outperform the competing models at all considered horizons. These results also hold under several robustness checks considering alternative keywords, a falsification test, different out-of-sample periods, directional accuracy, and forecasts at the state level. PMID:25369315
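A stripped-down version of this kind of exercise compares one-step-ahead out-of-sample errors from a baseline autoregression against the same model augmented with a search-volume regressor. The series below are simulated placeholders, not the actual food stamps or Google Trends data, and the model is far simpler than the nearly 3000 specifications the paper considers.

```python
import numpy as np

rng = np.random.default_rng(5)
T = 120
search = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):                 # target series led by search activity
    y[t] = 0.6 * y[t - 1] + 0.5 * search[t - 1] + 0.3 * rng.standard_normal()

def one_step_mse(use_search, train=80):
    """Recursive out-of-sample one-step-ahead forecasts via OLS."""
    errs = []
    for t in range(train, T - 1):
        X = np.column_stack([np.ones(t - 1), y[:t - 1]] +
                            ([search[:t - 1]] if use_search else []))
        beta, *_ = np.linalg.lstsq(X, y[1:t], rcond=None)
        x_new = (np.r_[1.0, y[t - 1], search[t - 1]] if use_search
                 else np.r_[1.0, y[t - 1]])
        errs.append((y[t] - x_new @ beta) ** 2)
    return float(np.mean(errs))

print("AR baseline MSE: ", one_step_mse(False))
print("AR + search MSE: ", one_step_mse(True))
```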
Sample Return from Small Solar System Bodies
NASA Astrophysics Data System (ADS)
Orgel, L.; A'Hearn, M.; Bada, J.; Baross, J.; Chapman, C.; Drake, M.; Kerridge, J.; Race, M.; Sogin, M.; Squyres, S.
With plans for multiple sample return missions in the next decade, NASA requested guidance from the National Research Council's Space Studies Board (SSB) on how to treat samples returned from solar system bodies such as planetary satellites, asteroids, and comets. A special Task Group assessed the potential for a living entity to be included in samples returned from various bodies, as well as the potential for large-scale effects if such an entity were inadvertently introduced into the Earth's biosphere. The Group also assessed differences among solar system bodies, identified investigations that could reduce uncertainty about the bodies, and considered the risks of returned samples compared to the natural influx of material to the Earth in the form of interplanetary dust particles, meteorites, and other small impactors. The final report (NRC, 1998) provides a decision-making framework for future missions and makes recommendations on how to handle samples from different planetary satellites and primitive solar system bodies.
Ion Beam Analyses Of Bark And Wood In Environmental Studies
NASA Astrophysics Data System (ADS)
Lill, J.-O.; Saarela, K.-E.; Harju, L.; Rajander, J.; Lindroos, A.; Heselius, S.-J.
2011-06-01
A large number of wood and bark samples have been analysed utilizing particle-induced X-ray emission (PIXE) and particle-induced gamma-ray emission (PIGE) techniques. Samples of common tree species like Scots Pine, Norway Spruce and birch were collected from a large number of sites in Southern and Southwestern Finland. Some of the samples were from a heavily polluted area in the vicinity of a copper-nickel smelter. The samples were dry ashed at 550 °C to remove the organic matrix and thereby increase the analytical sensitivity of the method. The sensitivity was enhanced by a factor of 50 for wood and slightly less for bark. The ashed samples were pressed into pellets and irradiated as thick targets with a millimetre-sized proton beam. Including the ashing procedure in the method reduced the statistical dispersion caused by elemental heterogeneities in the wood. As a by-product, information about the elemental composition of the ashes was obtained. Comparing the concentration of an element in bark ash with that in wood ash of the same tree yielded information useful from an environmental point of view: this ash ratio was used to distinguish elemental contributions from anthropogenic atmospheric sources from those of natural geochemical sources, such as soil and bedrock.
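The factor-of-50 sensitivity gain follows from simple mass conservation during ashing: if an element is fully retained, its concentration in the ash equals its dry-matter concentration divided by the ash yield. A minimal illustration (the 2% ash yield is an assumed value chosen to match the reported factor of about 50 for wood; volatile elements such as Hg would violate the retention assumption):

```python
def enrichment_factor(ash_fraction):
    """Concentration gain from dry ashing, assuming the element is fully
    retained: all of the element ends up in 1/ash_fraction as much matrix."""
    return 1.0 / ash_fraction

def ash_to_dry_basis(c_ash, ash_fraction):
    """Convert a concentration measured in the ash back to a dry-matter basis."""
    return c_ash * ash_fraction

print(enrichment_factor(0.02))        # 2% ash yield -> factor of 50, as for wood
print(ash_to_dry_basis(250.0, 0.02))  # 250 mg/kg in ash -> 5 mg/kg in dry wood
```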
Orbital Circularization of Hot and Cool Kepler Eclipsing Binaries
NASA Astrophysics Data System (ADS)
Van Eylen, Vincent; Winn, Joshua N.; Albrecht, Simon
2016-06-01
The rate of tidal circularization is predicted to be faster for relatively cool stars with convective outer layers, compared to hotter stars with radiative outer layers. Observing this effect is challenging because it requires large and well-characterized samples that include both hot and cool stars. Here we seek evidence of the predicted dependence of circularization upon stellar type, using a sample of 945 eclipsing binaries observed by Kepler. This sample complements earlier studies of this effect, which employed smaller samples of better-characterized stars. For each Kepler binary we measure e cos ω based on the relative timing of the primary and secondary eclipses. We examine the distribution of e cos ω as a function of period for binaries composed of hot stars, cool stars, and mixtures of the two types. At the shortest periods, hot-hot binaries are most likely to be eccentric; for periods shorter than four days, significant eccentricities occur frequently for hot-hot binaries, but not for hot-cool or cool-cool binaries. This is in qualitative agreement with theoretical expectations based on the slower dissipation rates of hot stars. However, the interpretation of our results is complicated by the largely unknown ages and evolutionary states of the stars in our sample.
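The timing measurement described here has a standard first-order form: for small eccentricity, the secondary eclipse is displaced from orbital phase 0.5 by (2/π)·e·cos ω. A minimal sketch of that conversion (illustrative function names; a real pipeline would also handle light-travel time and timing uncertainties):

```python
import numpy as np

def ecosw_from_eclipse_timing(t_primary, t_secondary, period):
    """First-order estimate of e*cos(omega) from the displacement of the
    secondary eclipse away from phase 0.5 (valid for small eccentricity)."""
    dphase = ((t_secondary - t_primary) / period) % 1.0
    return (np.pi / 2.0) * (dphase - 0.5)

# A secondary eclipse arriving 0.02 in phase after the midpoint
# implies e*cos(omega) ~ 0.031.
print(ecosw_from_eclipse_timing(0.0, 5.2, 10.0))
```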
Baranes, Adrien F; Oudeyer, Pierre-Yves; Gottlieb, Jacqueline
2014-01-01
Devising efficient strategies for exploration in large open-ended spaces is one of the most difficult computational problems of intelligent organisms. Because the available rewards are ambiguous or unknown during the exploratory phase, subjects must act in an intrinsically motivated fashion. However, the vast majority of behavioral and neural studies to date have focused on decision making in reward-based tasks, and the rules guiding intrinsically motivated exploration remain largely unknown. To examine this question we developed a paradigm for systematically testing the choices of human observers in a free-play context. Adult subjects played a series of short computer games of variable difficulty, and freely chose which game they wished to sample without external guidance or physical rewards. Subjects performed the task in three distinct conditions in which they sampled from a small or a large choice set (7 vs. 64 possible levels of difficulty), and in which they did or did not have the possibility of sampling new games at a constant level of difficulty. We show that despite the absence of external constraints, the subjects spontaneously adopted a structured exploration strategy whereby they (1) started with easier games and progressed to more difficult games, (2) sampled the entire choice set, including extremely difficult games that could not be learnt, (3) repeated moderate- and high-difficulty games much more frequently than predicted by chance, and (4) had higher repetition rates and chose higher speeds if they could generate new sequences at a constant level of difficulty. The results suggest that intrinsically motivated exploration is shaped by several factors, including task difficulty, novelty and the size of the choice set, and that these come into play to serve two internal goals: maximizing the subjects' knowledge of the available tasks (exploring the limits of the task set) and maximizing their competence (performance and skills) across the task set.
Zhong, Sheng; McPeek, Mary Sara
2016-01-01
We consider the problem of genetic association testing of a binary trait in a sample that contains related individuals, where we adjust for relevant covariates and allow for missing data. We propose CERAMIC, an estimating equation approach that can be viewed as a hybrid of logistic regression and linear mixed-effects model (LMM) approaches. CERAMIC extends the recently proposed CARAT method to allow samples with related individuals and to incorporate partially missing data. In simulations, we show that CERAMIC outperforms existing LMM and generalized LMM approaches, maintaining high power and correct type 1 error across a wider range of scenarios. CERAMIC results in a particularly large power increase over existing methods when the sample includes related individuals with some missing data (e.g., when some individuals with phenotype and covariate information have missing genotypes), because CERAMIC is able to use the relationship information to incorporate partially missing data in the analysis while correcting for dependence. Because CERAMIC is based on a retrospective analysis, it is robust to misspecification of the phenotype model, resulting in better control of type 1 error and higher power than that of prospective methods, such as GMMAT, when the phenotype model is misspecified. CERAMIC is computationally efficient for genome-wide analysis in samples of related individuals of almost any configuration, including small families, unrelated individuals and even large, complex pedigrees. We apply CERAMIC to data on type 2 diabetes (T2D) from the Framingham Heart Study. In a genome scan, 9 of the 10 smallest CERAMIC p-values occur in or near either known T2D susceptibility loci or plausible candidates, verifying that CERAMIC is able to home in on the important loci in a genome scan.
Steed, Chad A.; Halsey, William; Dehoff, Ryan; ...
2017-02-16
Flexible visual analysis of long, high-resolution, and irregularly sampled time series data from multiple sensor streams is a challenge in several domains. In the field of additive manufacturing, this capability is critical for realizing the full potential of large-scale 3D printers. Here, we propose a visual analytics approach that helps additive manufacturing researchers acquire a deep understanding of patterns in log and imagery data collected by 3D printers. Our specific goals include discovering patterns related to defects and system performance issues, optimizing build configurations to avoid defects, and increasing production efficiency. We introduce Falcon, a new visual analytics system that allows users to interactively explore large, time-oriented data sets from multiple linked perspectives. Falcon provides overviews, detailed views, and unique segmented time series visualizations, all with adjustable scale options. To illustrate the effectiveness of Falcon at providing thorough and efficient knowledge discovery, we present a practical case study involving experts in additive manufacturing and data from a large-scale 3D printer. The techniques described are applicable to the analysis of any quantitative time series, though the focus of this paper is on additive manufacturing.
Seretis, Charalampos; Seretis, Fotios; Lagoudianakis, Emmanuel; Politou, Marianna; Gemenetzis, George; Salemis, Nikolaos S.
2012-01-01
Background. The objective of our study was to investigate the potential effect of adjusting the preoperative platelet to lymphocyte ratio, an emerging biomarker of survival in cancer patients, for the fraction of large platelets. Methods. A total of 79 patients with breast neoplasias (44 with fibroadenomas and 35 with invasive ductal carcinoma) were included in the study. Both the conventional platelet to lymphocyte ratio (PLR) and the adjusted marker, the large platelet to lymphocyte ratio (LPLR), were correlated with laboratory and histopathological parameters of the study sample. Results. LPLR elevation was significantly correlated with the presence of malignancy, advanced tumor stage, metastatic spread in the axillary nodes and HER2/neu overexpression, while PLR was only correlated with the number of infiltrated lymph nodes. Conclusions. This is the first study evaluating whether adjustment for large platelet count improves PLR accuracy when correlated with basic independent markers of survival in a sample of breast cancer patients. Further studies are needed to assess the possibility of adopting our adjustment as standard for predicting survival rates in cancer.
Assessment of arsenic surface contamination in a museum anthropology department.
Gribovich, Andrey; Lacey, Steven; Franke, John; Hinkamp, David
2013-02-01
To assess potential arsenic (As) contamination of work surfaces and improve the control strategy at an anthropology department in a large natural history museum, work practices were observed and the control strategy reviewed to inform an occupational hygiene assessment strategy utilizing surface wipe sampling. A total of 35 sampling targets were identified, focusing on surfaces that receive high touch traffic, including workstations, artifact transport carts, and elevator buttons. Arsenic sampling and analysis were performed using reference method Occupational Safety and Health Administration ID-125G. Four of the sampling areas returned detectable levels of As, ranging from 0.052 to 0.350 μg/100 cm². Workplace observations and wipe sampling data enabled the development of recommendations to help further reduce potential occupational exposure to As. Continuous reduction of surface contamination is prudent for known human carcinogens.
Imchen, Madangchanok; Kumavath, Ranjith; Barh, Debmalya; Azevedo, Vasco; Ghosh, Preetam; Viana, Marcus; Wattam, Alice R
2017-08-18
In this study, we categorize the microbial communities in mangrove sediment samples from four locations within a vast mangrove system in Kerala, India. We compared these data to samples from other known mangrove sites, a tropical rainforest, and ocean sediment. An examination of the microbial communities from a large mangrove forest that stretches across southwestern India showed strong similarities at the higher taxonomic levels. When the ocean sediment and a single isolate from a tropical rainforest were included in the analysis, a strong pattern emerged, with bacteria from the phylum Proteobacteria being the prominent taxon among the forest samples. The ocean samples were predominantly Archaea, with Euryarchaeota as the dominant phylum. Principal component and functional analyses grouped the samples isolated from forests, including those from disparate mangrove forests and the tropical rainforest, apart from the ocean samples. Our findings show similar patterns among samples isolated from forests, and these were distinct from the ocean sediment isolates. The taxonomic structure was maintained down to the level of class, and functional analysis of the genes present displayed the same similarities. Our report shows, for the first time, the richness of microbial diversity on the Kerala coast and its differences from the tropical rainforest and ocean microbiomes.
Ferrari, Ulisse
2016-08-01
Maximum entropy models provide the least constrained probability distributions that reproduce statistical properties of experimental datasets. In this work we characterize the learning dynamics that maximizes the log-likelihood in the case of large but finite datasets. We first show how the steepest descent dynamics is not optimal, as it is slowed down by the inhomogeneous curvature of the model parameters' space. We then provide a way of rectifying this space which relies only on dataset properties and does not require large computational effort. We conclude by solving the long-time limit of the parameters' dynamics, including the randomness generated by the systematic use of Gibbs sampling. In this stochastic framework, rather than converging to a fixed point, the dynamics reaches a stationary distribution, which for the rectified dynamics reproduces the posterior distribution of the parameters. We sum up all these insights in a "rectified" data-driven algorithm that is fast and, by sampling from the parameters' posterior, avoids both under- and overfitting along all directions of the parameters' space. Through the learning of pairwise Ising models from recordings of a large population of retina neurons, we show how our algorithm outperforms the steepest descent method.
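As a point of reference for the steepest-descent baseline that the rectified algorithm improves upon, the sketch below fits a pairwise Ising model by plain moment-matching gradient ascent with Gibbs sampling. It is a minimal illustration under simplifying assumptions (small system, fixed learning rate), not the authors' rectified method, which additionally preconditions the parameter space using dataset properties.

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_sample(h, J, n_sweeps=50):
    """One configuration from P(s) ~ exp(h.s + 0.5 s.J.s), with s_i in {-1,+1}."""
    n = len(h)
    s = rng.choice([-1, 1], size=n)
    for _ in range(n_sweeps):
        for i in range(n):
            field = h[i] + J[i] @ s - J[i, i] * s[i]   # local field on spin i
            s[i] = 1 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * field)) else -1
    return s

def fit_ising_steepest(data, n_steps=50, lr=0.1, n_samples=50):
    """Steepest ascent on the log-likelihood: the gradient is the gap between
    data moments and model moments, the latter estimated by Gibbs sampling."""
    n = data.shape[1]
    h, J = np.zeros(n), np.zeros((n, n))
    m_data = data.mean(axis=0)
    c_data = data.T @ data / len(data)
    for _ in range(n_steps):
        samples = np.array([gibbs_sample(h, J) for _ in range(n_samples)])
        h += lr * (m_data - samples.mean(axis=0))
        dJ = lr * (c_data - samples.T @ samples / n_samples)
        np.fill_diagonal(dJ, 0.0)                      # no self-couplings
        J += dJ
    return h, J
```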
Froud, Robert; Rajendran, Dévan; Patel, Shilpa; Bright, Philip; Bjørkli, Tom; Eldridge, Sandra; Buchbinder, Rachelle; Underwood, Martin
2017-06-01
A systematic review of nonspecific low back pain trials published between 1980 and 2012. The aims were to explore what proportion of trials have been powered to detect different bands of effect size; whether there is evidence that sample size in low back pain trials has been increasing; what proportion of trial reports include a sample size calculation; and whether the likelihood of reporting sample size calculations has increased. Clinical trials should have a sample size sufficient to detect a minimally important difference for a given power and type I error rate. An underpowered trial is one in which the probability of type II error is too high. Meta-analyses do not mitigate underpowered trials. Reviewers independently abstracted data on sample size at the point of analysis, whether a sample size calculation was reported, and year of publication. Descriptive analyses were used to explore the ability to detect effect sizes, and regression analyses to explore the relationship between sample size, or the reporting of sample size calculations, and time. We included 383 trials. One-third were powered to detect a standardized mean difference of less than 0.5, and 5% were powered to detect less than 0.3. The average sample size was 153 people, which increased only slightly (∼4 people/yr) from 1980 to 2000, and declined slightly (∼4.5 people/yr) from 2005 to 2011 (P < 0.00005). Sample size calculations were reported in 41% of trials. The odds of reporting a sample size calculation (compared to not reporting one) increased until 2005 and then declined (Equation is included in full-text article.). Sample sizes in back pain trials, and the reporting of sample size calculations, may need to be increased. It may be justifiable to power a trial to detect only large effects in the case of novel interventions. Level of Evidence: 3.
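The power statements above follow from the usual normal-approximation sample size formula for a two-arm trial, n per group ≈ 2(z₁₋α/₂ + z_power)²/d² for a standardized mean difference d. A quick sketch showing why the review's average trial of 153 people is underpowered for modest effects:

```python
from scipy.stats import norm

def n_per_group(d, power=0.80, alpha=0.05):
    """Approximate per-group sample size to detect a standardized mean
    difference d in a two-arm trial (normal approximation)."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    return 2 * (z_a + z_b) ** 2 / d ** 2

print(round(n_per_group(0.5)))  # ~63 per group (~126 total)
print(round(n_per_group(0.3)))  # ~175 per group (~349 total), far above
                                # the review's average trial size of 153
```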
From large-eddy simulation to multi-UAVs sampling of shallow cumulus clouds
NASA Astrophysics Data System (ADS)
Lamraoui, Fayçal; Roberts, Greg; Burnet, Frédéric
2016-04-01
In-situ sampling of clouds that can provide simultaneous measurements at spatio-temporal resolutions sufficient to capture 3D small-scale physical processes continues to present challenges. This project (SKYSCANNER) aims to develop cloud sampling strategies using a swarm of unmanned aerial vehicles (UAVs) based on large-eddy simulation (LES). Multi-UAV field campaigns with a personalized sampling strategy for individual clouds and cloud fields will significantly improve the understanding of unresolved cloud physical processes. An extensive set of LES experiments for case studies from the ARM-SGP site has been performed using the MesoNH model at resolutions down to 10 m. These simulations led to a macroscopic model that quantifies the interrelationship between the micro- and macrophysical properties of shallow convective clouds. Both the geometry and the evolution of individual clouds are critical to multi-UAV cloud sampling and path planning. The preliminary findings of the project reveal several linear relationships linking cloud geometric parameters to cloud-related meteorological variables. In addition, the horizontal wind speed has a proportional impact on cloud number concentration as well as on triggering and prolonging the occurrence of cumulus clouds. In the framework of a joint collaboration involving a multidisciplinary team (including institutes specializing in aviation, robotics and atmospheric science), this model will be a reference point for multi-UAV sampling strategies and path planning.
An optimised protocol for molecular identification of Eimeria from chickens☆
Kumar, Saroj; Garg, Rajat; Moftah, Abdalgader; Clark, Emily L.; Macdonald, Sarah E.; Chaudhry, Abdul S.; Sparagano, Olivier; Banerjee, Partha S.; Kundu, Krishnendu; Tomley, Fiona M.; Blake, Damer P.
2014-01-01
Molecular approaches supporting identification of Eimeria parasites infecting chickens have been available for more than 20 years, although they have largely failed to replace traditional measures such as microscopy and pathology. Limitations of microscopy-led diagnostics, including a requirement for specialist parasitological expertise and low sample throughput, are yet to be outweighed by the difficulties associated with accessing genomic DNA from environmental Eimeria samples. A key step towards the use of Eimeria species-specific PCR as a sensitive and reproducible discriminatory tool for use in the field is the production of a standardised protocol that includes sample collection and DNA template preparation, as well as primer selection from the numerous PCR assays now published. Such a protocol will facilitate the development of valuable epidemiological datasets which may be easily compared between studies and laboratories. The outcome of an optimisation process undertaken in laboratories in India and the UK is described here, comprising four steps. First, samples were collected into a 2% (w/v) potassium dichromate solution. Second, oocysts were enriched by flotation in saturated saline. Third, genomic DNA was extracted using a QIAamp DNA Stool mini kit protocol that included a mechanical homogenisation step. Finally, nested PCR was carried out using previously published primers targeting the internal transcribed spacer region 1 (ITS-1). Alternative methods tested included sample processing in the presence of faecal material, DNA extraction using a traditional phenol/chloroform protocol, the use of SCAR multiplex PCR (one-tube and two-tube versions), and speciation using the morphometric tool COCCIMORPH for the first time with field samples.
Jiang, Zhenhong; He, Fei; Zhang, Ziding
2017-07-01
Through large-scale transcriptional data analyses, we highlight the importance of plant metabolism in plant immunity and identify 26 metabolic pathways that were frequently influenced by infection with 14 different pathogens. Reprogramming of plant metabolism is a common phenomenon in plant defense responses. Currently, a large number of transcriptional profiles of infected tissues in Arabidopsis (Arabidopsis thaliana) have been deposited in public databases, which provides a great opportunity to understand the expression patterns of metabolic pathways during plant defense responses at the systems level. Here, we performed a large-scale transcriptome analysis based on 135 previously published expression samples, covering 14 different pathogens, to explore the expression patterns of Arabidopsis metabolic pathways. Overall, the expression of metabolic genes changes significantly during plant defense responses. Upregulated metabolic genes are enriched in defense responses, and downregulated genes are enriched in photosynthesis and fatty acid and lipid metabolic processes. Gene set enrichment analysis (GSEA) identified 26 frequently differentially expressed metabolic pathways (FreDE_Paths) that are differentially expressed in more than 60% of infected samples. These pathways are involved in the generation of energy, fatty acid and lipid metabolism, and secondary metabolite biosynthesis. Clustering analysis based on the expression levels of these 26 metabolic pathways clearly distinguishes infected from control samples, further suggesting the importance of these pathways in plant defense responses. By comparing with FreDE_Paths from abiotic stresses, we find that the expression patterns of the 26 FreDE_Paths from biotic stresses are more consistent across different infected samples. By investigating the expression correlation between transcription factors (TFs) and FreDE_Paths, we identify several notable relationships. Collectively, this study will deepen our understanding of plant metabolism in plant immunity and provide new insights into disease-resistant crop improvement.
The analysis of soil cores polluted with certain metals using the Box-Cox transformation.
Meloun, Milan; Sánka, Milan; Nemec, Pavel; Krítková, Sona; Kupka, Karel
2005-09-01
To define the soil properties for a given area or country, including the level of pollution, soil survey and inventory programs are essential tools. Soil data transformations enable the expression of the original data on a new scale more suitable for data analysis. In the computer-aided interactive analysis of large data files of soil characteristics containing outliers, the diagnostic plots of exploratory data analysis (EDA) often reveal that the sample distribution is systematically skewed, or they reject sample homogeneity. Under such circumstances the original data should be transformed. The Box-Cox transformation improves sample symmetry and stabilizes spread. The logarithmic plot of the profile likelihood function enables the optimum transformation parameter to be found. Here, the proposed procedure for data transformation in univariate data analysis is illustrated by the determination of cadmium content in the plough zone of agricultural soils. A typical soil pollution survey concerns the determination of the elements Be (16 544 values available), Cd (40 317 values), Co (22 176 values), Cr (40 318 values), Hg (32 344 values), Ni (34 989 values), Pb (40 344 values), V (20 373 values) and Zn (36 123 values) in large samples.
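The profile-likelihood search for the optimum transformation parameter is available off the shelf; the sketch below applies it to hypothetical skewed concentration data standing in for the cadmium measurements (scipy maximizes the Box-Cox log-likelihood over λ, the same criterion described above).

```python
import numpy as np
from scipy import stats

# Hypothetical positively skewed Cd concentrations (mg/kg); a real survey
# would load the measured values instead.
cd = np.random.default_rng(1).lognormal(mean=-1.0, sigma=0.8, size=1000)

transformed, lam = stats.boxcox(cd)   # lam maximizes the profile log-likelihood
print(f"optimal lambda = {lam:.3f}")  # near 0 here, i.e. a log transform
```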
Validity of Bioelectrical Impedance Analysis to Estimate Fat-Free Mass in Army Cadets.
Langer, Raquel D; Borges, Juliano H; Pascoa, Mauro A; Cirolini, Vagner X; Guerra-Júnior, Gil; Gonçalves, Ezequiel M
2016-03-11
Bioelectrical impedance analysis (BIA) is a fast, practical, non-invasive, and frequently used method for fat-free mass (FFM) estimation. The aims of this study were to validate published predictive BIA equations for FFM estimation in Army cadets and to develop and validate a specific BIA equation for this population. A total of 396 male Brazilian Army cadets, aged 17-24 years, were included. The study evaluated eight published predictive BIA equations and a specific equation for FFM estimation, using dual-energy X-ray absorptiometry (DXA) as the reference method. Student's t-test (for paired samples), linear regression analysis, and the Bland-Altman method were used to test the validity of the BIA equations. The published predictive BIA equations showed significant differences in FFM compared to DXA (p < 0.05) and large limits of agreement by Bland-Altman, and explained 68% to 88% of the FFM variance. The specific BIA equation showed no significant difference in FFM compared to DXA values. Thus, the published BIA predictive equations showed poor accuracy in this sample, whereas the specific BIA equation developed in this study demonstrated validity for this sample, although it should be used with caution in samples with a large range of FFM.
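The Bland-Altman comparison used here reduces to a bias (mean difference) and 95% limits of agreement (bias ± 1.96 SD of the differences). A minimal sketch with hypothetical paired FFM estimates:

```python
import numpy as np

def bland_altman(ffm_bia, ffm_dxa):
    """Bias and 95% limits of agreement between paired BIA and DXA
    fat-free mass estimates (kg)."""
    diff = np.asarray(ffm_bia) - np.asarray(ffm_dxa)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, bias - half_width, bias + half_width

# e.g. bland_altman([62.1, 58.4, 70.2], [60.9, 59.0, 68.5]): wide limits of
# agreement signal poor individual-level accuracy even when the bias is small.
```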
Corporate psychopathy: Talking the walk.
Babiak, Paul; Neumann, Craig S; Hare, Robert D
2010-01-01
There is a very large literature on the important role of psychopathy in the criminal justice system. We know much less about corporate psychopathy and its implications, in large part because of the difficulty of obtaining the active cooperation of business organizations. This has left us with only a few small-sample studies, anecdotes, and speculation. In this study, we had a unique opportunity to examine psychopathy and its correlates in a sample of 203 corporate professionals selected by their companies to participate in management development programs. The correlates included demographic and status variables, as well as in-house 360-degree assessments and performance ratings. The prevalence of psychopathic traits, as measured by the Psychopathy Checklist-Revised (PCL-R) and a Psychopathy Checklist: Screening Version (PCL: SV) "equivalent", was higher than that found in community samples. The results of confirmatory factor analysis (CFA) and structural equation modeling (SEM) indicated that the underlying latent structure of psychopathy in our corporate sample was consistent with the model found in community and offender studies. Psychopathy was positively associated with in-house ratings of charisma/presentation style (creativity, good strategic thinking, and communication skills) but negatively associated with ratings of responsibility/performance (being a team player, management skills, and overall accomplishments).
Lee, Jae-Hoon; Hyeon, Ji-Yeon; Kim, Yun-Gyeong; Chon, Jung-Whan; Park, Jun-Ho; Park, Chankyu; Choi, In-Soo; Kim, Soo-Ki; Seo, Kun-Ho
2012-02-01
The prevalence of Shiga toxin-producing Escherichia coli (STEC) was investigated in 350 edible beef intestinal samples, including omasum (n=110), abomasum (n=120), and large intestine (n=120), collected from traditional beef markets in Seoul, Korea. A total of 23 STEC strains were isolated from 15 samples (four strains from three omasa, 10 from five abomasa, and nine from seven large intestines). The O serotypes and toxin gene types of all STEC isolates were identified, and antimicrobial resistance was assessed using the disk diffusion method. The isolation rates of STEC from edible beef intestines were 2.8% in omasum, 4.2% in abomasum, and 5.9% in large intestine. All STEC isolates harbored either stx1 alone or both stx1 and stx2. Among the 23 isolates, 13 strains were identified as belonging to 11 different O serogroups, and 10 strains were untypable. However, enterohemorrhagic Escherichia coli O157, O26, and O111 strains were not isolated. The highest resistance rate observed was against tetracycline (39%), followed by streptomycin (35%) and ampicillin (22%). Of the 23 isolates, 12 (52%) were resistant to at least one antibiotic, nine (39%) were resistant to two or more antibiotics, and one isolate from an abomasum carried resistance to nine antibiotics, including a beta-lactam/beta-lactamase inhibitor combination and cephalosporins. This study shows that edible beef by-products, which are often consumed as raw food in many countries, including Korea, can be potential vehicles for the transmission of antimicrobial-resistant pathogenic E. coli to humans.
Protocol and methodology of Study epidemiological mental health in Andalusia: PISMA-ep.
Cervilla, Jorge A; Ruiz, Isabel; Rodríguez-Barranco, Miguel; Rivera, Margarita; Ibáñez-Casas, Inmaculada; Molina, Esther; Valmisa, Eulalio; Carmona-Calvo, José; Moreno-Küstner, Berta; Muñoz-Negro, José Eduardo; Ching-López, Ana; Gutiérrez, Blanca
This paper describes the general methods of a cross-sectional study that aims to detect the prevalence of major mental disorders in Andalusia (Southern Spain), and their correlates or potential risk factors, using a large representative sample of community-dwelling adults. We undertook multistage sampling using standard stratification levels and aimed to interview 4,518 randomly selected participants living in all 8 provinces of the Andalusian region, utilizing a door-knocking approach. The Spanish version of the MINI International Neuropsychiatric Interview, a valid screening instrument ascertaining ICD-10/DSM-IV compatible mental disorder diagnoses, was used as our main diagnostic tool. A large battery of other instruments was used to explore global functionality, medical comorbidity, personality traits, cognitive function, and exposure to potential psychosocial risk factors. A saliva sample for DNA extraction was also obtained for a genetic sub-study. The interviews were administered and completed by fully trained interviewers, although most of the tools used are compatible with administration by lay interviewers. A total of 3,892 (70.8%) of 5,496 initially attempted households had to be substituted with equivalent ones owing to either no response (37.7%) or not fulfilling the required participant quota (33%). Of the 5,496 eligible participants finally approached, 4,507 (83.7%) agreed to take part in the study, completed the interview, and were included (n=4,507); 4,286 (78%) of participants also consented to provide a saliva sample for the DNA study. The remaining 989 (16.3%) potential participants approached declined to take part. This is the largest mental health epidemiological study conducted in the Spanish region of Andalusia. The response rate and representativeness of the sample obtained are fairly high, and the method is particularly comprehensive for this sort of study, including both personality and cognitive assessments as well as a large array of bio-psycho-social risk measures.
RECONSTRUCTING REDSHIFT DISTRIBUTIONS WITH CROSS-CORRELATIONS: TESTS AND AN OPTIMIZED RECIPE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthews, Daniel J.; Newman, Jeffrey A., E-mail: djm70@pitt.ed, E-mail: janewman@pitt.ed
2010-09-20
Many of the cosmological tests to be performed by planned dark energy experiments will require extremely well-characterized photometric redshift measurements. Current estimates for cosmic shear are that the true mean redshift of the objects in each photo-z bin must be known to better than 0.002(1 + z), and the width of the bin must be known to ∼0.003(1 + z), if errors in cosmological measurements are not to be degraded significantly. A conventional approach is to calibrate these photometric redshifts with large sets of spectroscopic redshifts. However, at the depths probed by Stage III surveys (such as DES), let alone Stage IV (LSST, JDEM, and Euclid), existing large redshift samples have all been highly (25%-60%) incomplete, with a strong dependence of success rate on both redshift and galaxy properties. A powerful alternative approach is to exploit the clustering of galaxies to perform photometric redshift calibrations. Measuring the two-point angular cross-correlation between objects in some photometric redshift bin and objects with known spectroscopic redshift, as a function of the spectroscopic z, allows the true redshift distribution of a photometric sample to be reconstructed in detail, even if it includes objects too faint for spectroscopy or if spectroscopic samples are highly incomplete. We test this technique using mock DEEP2 Galaxy Redshift Survey light cones constructed from the Millennium Simulation semi-analytic galaxy catalogs. From this realistic test, which incorporates the effects of galaxy bias evolution and cosmic variance, we find that the true redshift distribution of a photometric sample can, in fact, be determined accurately with cross-correlation techniques. We also compare the empirical error in the reconstruction of redshift distributions to previous analytic predictions, finding that additional components must be included in error budgets to match the simulation results. This extra error contribution is small for surveys that sample large areas of sky (≳10°-100°), but dominant for ∼1 deg² fields. We conclude by presenting a step-by-step, optimized recipe for reconstructing redshift distributions from cross-correlation information using standard correlation measurements.
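To make the estimator concrete, the toy below reconstructs a photometric sample's redshift distribution from cross-pair counts against spectroscopic bins on a uniform patch. It assumes constant galaxy bias and ignores edge effects and bias evolution, all of which the paper's full recipe addresses, so treat it as a sketch of the idea rather than the optimized recipe itself.

```python
import numpy as np
from scipy.spatial import cKDTree

def dndz_from_cross_correlation(phot_xy, spec_xy, spec_z, z_edges,
                                theta_max, area):
    """Toy dN/dz: the mean angular cross-correlation of the photometric
    sample with each spectroscopic z-bin is proportional to the fraction
    of photometric objects in that bin (constant-bias approximation)."""
    tree = cKDTree(phot_xy)
    surface_density = len(phot_xy) / area
    phi = np.zeros(len(z_edges) - 1)
    for k, (lo, hi) in enumerate(zip(z_edges[:-1], z_edges[1:])):
        sel = (spec_z >= lo) & (spec_z < hi)
        if not sel.any():
            continue
        # observed spec-phot pairs within theta_max of each spectroscopic object
        dd = sum(len(nbrs) for nbrs in tree.query_ball_point(spec_xy[sel], theta_max))
        # expected pairs for an unclustered photometric sample (no edge correction)
        dr = sel.sum() * surface_density * np.pi * theta_max ** 2
        phi[k] = dd / dr - 1.0            # mean cross-correlation in the bin
    return phi / phi.sum()                # normalize to a distribution
```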
Giraudon, I; Cathcart, S; Blomqvist, S; Littleton, A; Surman-Lee, S; Mifsud, A; Anaraki, S; Fraser, G
2009-06-01
To describe the epidemiology of an outbreak of Salmonella enteritidis phage type 1 (PT1) infection associated with a fast food premises, and to identify the causative factors leading to an acute outbreak with a high attack rate and severe illness, including hospital admission. This was an integrated descriptive study of epidemiology, food and environmental microbiology, and professional environmental health assessment, supplemented by a case-case analytical study. Cases were identified through multiple sources and were interviewed to identify the food items consumed. Descriptive epidemiology of all cases and a case-case analytical study of risk factors for severe illness were undertaken. Microbiological investigation included analysis and typing of pathogens from stools, blood, and environmental surfaces. A professional environmental health assessment of the premises was undertaken. S. enteritidis PT1 was recovered from two-thirds of faecal samples. Three cases had dual infection with enterotoxin-producing Clostridium perfringens. S. enteritidis PT1 was isolated from 14 of 40 food samples examined, and C. perfringens was isolated from eight food samples. Environmental health inspection of the premises revealed multiple deficiencies, including deficits in food preparation and hygiene consistent with multiple routes of cross-contamination, and time-temperature abuse of sauces used widely across menu items. Severe cases were associated with consumption of chips and salad. Outbreaks from fast food premises have been described infrequently. This outbreak demonstrates the potential for fast food premises with multiple deficiencies in food preparation and hygiene to produce large, intense community outbreaks with high attack rates and severe illness, highly confined in space and time.
Leavey, Katherine; Bainbridge, Shannon A; Cox, Brian J
2015-01-01
Preeclampsia (PE) is a life-threatening hypertensive pathology of pregnancy affecting 3-5% of all pregnancies. To date, PE has no cure, no early detection markers, and no effective treatments short of removal of what is thought to be the causative organ, the placenta, which may necessitate a preterm delivery. Additionally, numerous small placental microarray studies attempting to identify "PE-specific" genes have yielded inconsistent results. We therefore hypothesized that preeclampsia is a multifactorial disease encompassing several pathology subclasses, and that large-cohort placental gene expression analysis would reveal these groups. To address this hypothesis, we used established bioinformatic methods to aggregate 7 microarray data sets across multiple platforms, generating a large data set of 173 patient samples, including 77 with preeclampsia. Unsupervised clustering of these patient samples revealed three distinct molecular subclasses of PE: a "canonical" PE subclass demonstrating elevated expression of known PE markers and of genes associated with poor oxygenation and increased secretion, and two other subclasses potentially representing a poor maternal response to pregnancy and an immunological presentation of preeclampsia. Our analysis sheds new light on the heterogeneity of PE patients and offers additional avenues for future investigation. Hopefully, our subclassification of preeclampsia based on molecular diversity will finally lead to the development of robust diagnostics and patient-based treatments for this disorder.
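A minimal sketch of the unsupervised step, using average-linkage hierarchical clustering on a correlation distance between samples; this illustrates the general approach, not the authors' exact pipeline (the `expr` matrix of genes × samples is a hypothetical input):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def subclass_samples(expr, k=3):
    """Cluster expression profiles (columns of expr) into k molecular
    subclasses using 1 - Pearson correlation as the distance."""
    corr = np.corrcoef(expr.T)                        # sample-by-sample correlation
    condensed = 1.0 - corr[np.triu_indices_from(corr, k=1)]
    tree = linkage(condensed, method="average")
    return fcluster(tree, t=k, criterion="maxclust")  # subclass label per sample
```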
Civade, Raphaël; Dejean, Tony; Valentini, Alice; Roset, Nicolas; Raymond, Jean-Claude; Bonin, Aurélie; Taberlet, Pierre; Pont, Didier
2016-01-01
In the last few years, the study of environmental DNA (eDNA) has drawn attention for many reasons, including its advantages for monitoring and conservation purposes. So far, in aquatic environments, most eDNA research has focused on the detection of single species using species-specific markers. Recently, species inventories based on the analysis of a single generalist marker targeting a larger taxonomic group (eDNA metabarcoding) have proven useful for bony fish and amphibian biodiversity surveys. This approach involves in situ filtering of large volumes of water, followed by amplification and sequencing of a short discriminative fragment of the 12S rDNA mitochondrial gene. In this study, we went one step further by investigating the spatial representativeness (i.e. ecological reliability and signal variability in space) of eDNA metabarcoding for large-scale fish biodiversity assessment in a freshwater system including lentic and lotic environments. We tested the ability of this approach to characterize the large-scale organization of fish communities along a longitudinal gradient, from a lake to its outflowing river. First, our results confirm that eDNA metabarcoding is more efficient than a single traditional sampling campaign at detecting species presence, especially in rivers. Second, the species list obtained using this approach is comparable to the one obtained by cumulating all traditional sampling sessions since 1995 for the lake and since 1988 for the river. In conclusion, eDNA metabarcoding gives a faithful description of local fish biodiversity in the study system, more specifically within a range of a few kilometers along the river under our study conditions, i.e. a reach longer than a traditional fish sampling site.
Norton, Sophie; Huhtinen, Essi; Conaty, Stephen; Hope, Kirsty; Campbell, Brett; Tegel, Marianne; Boyd, Rowena; Cullen, Beth
2012-04-01
In January 2011, the Sydney South West Public Health Unit was notified of a large number of people presenting with gastroenteritis over two days at a local hospital emergency department (ED). Case-finding was conducted through hospital EDs and general practitioners, resulting in the notification of 154 possible cases, from which 83 outbreak cases were identified. Fifty-eight cases were interviewed about demographics, symptom profile and food histories. Stool samples were collected and submitted for analysis. An inspection was conducted at a Vietnamese bakery, and food samples were collected and submitted for analysis. Further case ascertainment occurred to ensure control measures were successful. Of the 58 interviewed cases, the symptom profile included diarrhoea (100%), vomiting (89.7%) and fever (79.3%). Salmonella Typhimurium multiple-locus variable-number tandem-repeat analysis (MLVA) type 3-10-8-9-523 was identified in 95.9% (47/49) of stool samples. Cases reported consuming chicken, pork or salad rolls from a single Vietnamese bakery. Environmental swabs detected widespread contamination with Salmonella at the premises. This was a large point-source outbreak associated with the consumption of Vietnamese-style pork, chicken and salad rolls. These foods have been responsible for significant outbreaks in the past. The typical ingredients of raw egg butter or mayonnaise and pâté are often implicated, as are the food-handling practices in food outlets. This indicates the need for education in better food-handling practices, including the benefits of using safer products. Ongoing surveillance will monitor the success of new food regulations introduced in New South Wales during 2011 for improving food-handling practices and reducing foodborne illness.
ERIC Educational Resources Information Center
Tolley, Patricia Ann Separ
2009-01-01
The purpose of this correlational study was to examine the effects of a residential learning community and enrollment in an introductory engineering course on engineering students' perceptions of the freshman year experience, academic performance, and persistence. The sample included students enrolled in a large, urban, public, research university…
ERIC Educational Resources Information Center
Legerstee, Jeroen S.; Tulen, Joke H. M.; Dierckx, Bram; Treffers, Philip D. A.; Verhulst, Frank C.; Utens, Elisabeth M. W. J.
2010-01-01
Background: This study examined whether treatment response to stepped-care cognitive-behavioural treatment (CBT) is associated with changes in threat-related selective attention and its specific components in a large clinical sample of anxiety-disordered children. Methods: Ninety-one children with an anxiety disorder were included in the present…
Changes in Pell Grant Participation and Median Income of Recipients. Data Point. NCES 2016-407
ERIC Educational Resources Information Center
Ifill, Nicole; Velez, Erin Dunlop
2016-01-01
This report is based on data from four iterations of the National Postsecondary Student Aid Study (NPSAS), a large, nationally representative sample survey of students that focuses on how they finance their education. NPSAS includes data on federal Pell Grant awards, which are need-based grants awarded to low-income students, primarily…
ERIC Educational Resources Information Center
Springer, Matthew G.; Pepper, Matthew J.; Ghosh-Dastidar, Bonnie
2014-01-01
This study examines the effect of supplemental education services (SES) on student test score gains and whether particular subgroups of students benefit more from NCLB tutoring services. Our sample includes information on students enrolled in third through eighth grades nested in 121 elementary and middle schools over a five-year period comprising…
ERIC Educational Resources Information Center
Baker, Claire E.
2015-01-01
Research Findings: There is growing evidence that home learning stimulation that includes informal numeracy experiences can promote math-related learning in school. Furthermore, national studies suggest that children who start kindergarten with stronger math skills are more likely to succeed in high school. This study used a large sample of…
ERIC Educational Resources Information Center
Vargas, Nestor Albert
2013-01-01
The objective of this study was to generate principal and teacher descriptions of what constitutes a teacher's "special fitness to perform" in a public urban continuation high school with a concentration of at-risk students. The sample included 6 continuation principals and 15 continuation teachers from a large urban school district in…
Speedy Acquisition of Surface-Contamination Samples
NASA Technical Reports Server (NTRS)
Puleo, J. R.; Kirschner, L. E.
1982-01-01
Biological contamination of large-area surfaces can be determined quickly, inexpensively, and accurately with the aid of a polyester bonded cloth. The cloth is highly effective in removing microbes from a surface and releasing them for biological assay. In releasing contaminants, polyester bonded cloth was found to be superior to other commercial cleanroom cloths, including spun-bonded polyamide cloths and cellulose cloths.
ERIC Educational Resources Information Center
Jackson, Dan; Bowden, Jack; Baker, Rose
2015-01-01
Moment-based estimators of the between-study variance are very popular when performing random effects meta-analyses. This type of estimation has many advantages including computational and conceptual simplicity. Furthermore, by using these estimators in large samples, valid meta-analyses can be performed without the assumption that the treatment…
ERIC Educational Resources Information Center
Burt, S. Alexandra; Klahr, Ashlea M.; Rueter, Martha A.; McGue, Matt; Iacono, William G.
2011-01-01
Background: A recent meta-analysis revealed moderate shared environmental influences (C) on most forms of child and adolescent psychopathology (Burt, 2009), including antisocial behavior. Critically, however, the research analyzed in this meta-analysis relied largely on specific informant-reports (and particularly parent and child reports), each…
Preparing to Teach Online as Transformative Faculty Development
ERIC Educational Resources Information Center
McQuiggan, Carol A.
2011-01-01
An action research study was conducted at a campus college of a large Research I institution of higher education to explore transformative learning among higher education faculty as a result of participating in a blended program to prepare them to teach online. The purposeful sample included six full-time and one adjunct faculty, teaching a mix of…
Gender in Adolescent Autonomy: Distinction between Boys and Girls Accelerates at 16 Years of Age
ERIC Educational Resources Information Center
Fleming, Manuela
2005-01-01
Introduction: Autonomy is a major developmental feature of adolescents. Its success mediates transition into adulthood. It involves a number of psychological parameters, including desire, conflict with parents and actual achievement. Method: How male and female adolescents view autonomy was investigated in a large sample of 12-17 year-old…
ERIC Educational Resources Information Center
O'Neal, Colleen R.
2018-01-01
The objective of this short-term longitudinal study was to examine individual versus classroom peer effects of grit on later individual literacy achievement in elementary school. The largely Latina/o sample of dual language learners included students from the 3rd through the 5th grades. Participants completed a literacy achievement performance task…
ERIC Educational Resources Information Center
Esler, Amy N.; Hall-Lande, Jennifer; Hewitt, Amy
2017-01-01
The potential for culture to impact diagnosis of autism spectrum disorder (ASD) is high, yet remains largely unstudied. This study examined differences across racial/ethnic groups in ASD symptoms, cognitive and adaptive skills, and related behaviors in children with ASD that included a unique subgroup, children from the Somali diaspora. Somali…
Serum markers for type II diabetes mellitus
Metz, Thomas O; Qian, Wei-Jun; Jacobs, Jon M; Polpitiya, Ashoka D; Camp, II, David G; Smith, Richard D
2014-03-18
A method for identifying persons with increased risk of developing type 2 diabetes mellitus utilizing selected biomarkers, described hereafter, either alone or in combination. The present invention allows for broad-based, reliable screening of large population bases and provides other advantages, including the formulation of effective strategies for characterizing, archiving, and contrasting data from multiple sample types under varying conditions.
Data Mining of University Philanthropic Giving: Cluster-Discriminant Analysis and Pareto Effects
ERIC Educational Resources Information Center
Le Blanc, Louis A.; Rucks, Conway T.
2009-01-01
A large sample of 33,000 university alumni records was cluster-analyzed to generate six groups relatively unique in their respective attribute values. The attributes used to cluster the former students included average gift to the university's foundation and to the alumni association of the same institution. Cluster detection is useful in this…
ERIC Educational Resources Information Center
Reinhorn, Stefanie K.; Johnson, Susan Moore; Simon, Nicole S.
2017-01-01
We studied how six high-performing, high-poverty schools in one large Massachusetts city implemented the state's new teacher evaluation policy. The sample includes traditional, turnaround, restart, and charter schools, each of which had received the state's highest accountability rating. We sought to learn how these successful schools approached…
State Profiles: Financing Public Higher Education. 1978 to 1998 Trend Data.
ERIC Educational Resources Information Center
Halstead, Kent
This report presents two large tables showing trends in the financing of public higher education since 1977-78. Introductory information explains how to use the tables, the data-time relationship (whether fiscal year, academic year, or calendar year), and includes a sample chart constructed from one state's data. The raw data used for these…
ERIC Educational Resources Information Center
Barksdale, Christopher J.
2017-01-01
The purpose of this sequential mixed-methods study was to examine the relationship between classroom climate and the student achievement of middle school students. This study included a review of data collected with the Learning Environment Inventory from a purposeful sample of middle school students in a large suburban school district. A purposeful…
The Agony and the Ecstasy: Teaching Marketing Metrics to Undergraduate Business Students
ERIC Educational Resources Information Center
Saber, Jane Lee; Foster, Mary K.
2011-01-01
The marketing department of a large business school introduced a new undergraduate course, marketing metrics and analysis. The main materials for this course consisted of a series of online spreadsheets with embedded text and practice problems, a 32-page online metrics primer that included assurance of learning questions and a sample examination…
SOME ASPECTS OF SCHOOL INTEGRATION IN A CALIFORNIA HIGH SCHOOL.
ERIC Educational Resources Information Center
HICKERSON, NATHANIEL
The purpose of this study was to learn whether Negro and non-Negro students receive similar kinds of formal and informal educational experiences. The sample consisted of Mexican-Americans and Filipino-Americans in the same high school. Use was made of student records including curriculum track, IQ score, and parents' occupations. The large majority…
Nursing Home Staff Turnover: Impact on Nursing Home Compare Quality Measures
ERIC Educational Resources Information Center
Castle, Nicholas G.; Engberg, John; Men, Aiju
2007-01-01
Purpose: We used data from a large sample of nursing homes to examine the association between staff turnover and quality. Design and Methods: The staff turnover measures came from primary data collected from 2,840 nursing homes in 2004 (representing a 71% response rate). Data collection included measures for nurse aides, licensed practical nurses,…
Analysis of Environmental Contamination resulting from ...
Catastrophic incidents can generate a large number of samples of analytically diverse types, including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface residue. Such samples may arise not only from contamination from the incident but also from the multitude of activities surrounding the response to the incident, including decontamination. This document summarizes a range of activities to help build laboratory capability in preparation for analysis following a catastrophic incident, including the selection and development of fit-for-purpose analytical methods for chemical, biological, and radiological contaminants. Fit-for-purpose methods are those which have been selected to meet project-specific data quality objectives. For example, methods could be fit for screening contamination in the early phases of investigation of contamination incidents because they are rapid and easily implemented, but those same methods may not be fit for the purpose of remediating the environment to safe levels when a more sensitive method is required. While the exact data quality objectives defining fitness-for-purpose can vary with each incident, a governing principle of the method selection and development process for environmental remediation and recovery is achieving high throughput while maintaining high-quality analytical results. This paper illu
An overview of the genetic dissection of complex traits.
Rao, D C
2008-01-01
Thanks to recent revolutionary genomic advances such as the International HapMap Consortium, resolution of the genetic architecture of common complex traits is beginning to look hopeful. While demonstrating the feasibility of genome-wide association (GWA) studies, the pathbreaking Wellcome Trust Case Control Consortium (WTCCC) study also serves to underscore the critical importance of very large sample sizes and draws attention to potential problems, which need to be addressed as part of the study design. Even the large WTCCC study had vastly inadequate power for several of the associations reported (and confirmed) and, therefore, most of the regions harboring relevant associations may not be identified anytime soon. This chapter provides an overview of some of the key developments in the methodological approaches to genetic dissection of common complex traits. Constrained Bayesian networks are suggested as especially useful for analysis of pathway-based SNPs. Likewise, composite likelihood is suggested as a promising method for modeling complex systems. The chapter discusses the key steps in a study design, with an emphasis on GWA studies. Potential limitations highlighted by the WTCCC GWA study are discussed, including problems associated with massive genotype imputation, analysis of pooled national samples, shared controls, and the critical role of interactions. GWA studies clearly need massive sample sizes that are only possible through genuine collaborations. After all, for common complex traits, the question is not whether we can find some pieces of the puzzle, but how large and what kind of a sample we need to (nearly) solve the genetic puzzle.
Carter, James L.; Resh, Vincent H.
2001-01-01
A survey of methods used by US state agencies for collecting and processing benthic macroinvertebrate samples from streams was conducted by questionnaire; 90 responses were received and used to describe trends in methods. The responses represented an estimated 13,000-15,000 samples collected and processed per year. Kicknet devices were used in 64.5% of the methods; other sampling devices included fixed-area samplers (Surber and Hess), artificial substrates (Hester-Dendy and rock baskets), grabs, and dipnets. Regional differences existed, e.g., the 1-m kicknet was used more often in the eastern US than in the western US. Mesh sizes varied among programs but 80.2% of the methods used a mesh size between 500 and 600 μm. Mesh size variations within US Environmental Protection Agency regions were large, with size differences ranging from 100 to 700 μm. Most samples collected were composites; the mean area sampled was 1.7 m2. Samples rarely were collected using a random method (4.7%); most samples (70.6%) were collected using "expert opinion", which may make data obtained operator-specific. Only 26.3% of the methods sorted all the organisms from a sample; the remainder subsampled in the laboratory. The most common method of subsampling was to remove 100 organisms (range = 100-550). The magnification used for sorting ranged from 1 (sorting by eye) to 30x, which results in inconsistent separation of macroinvertebrates from detritus. In addition to subsampling, 53% of the methods sorted large/rare organisms from a sample. The taxonomic level used for identifying organisms varied among taxa; Ephemeroptera, Plecoptera, and Trichoptera were generally identified to a finer taxonomic resolution (genus and species) than other taxa. Because there currently exists a large range of field and laboratory methods used by state programs, calibration among all programs to increase data comparability would be exceptionally challenging. However, because many techniques are shared among methods, limited testing could be designed to evaluate whether procedural differences affect the ability to determine levels of environmental impairment using benthic macroinvertebrate communities.
Review: Abortion care in Ghana: A critical review of the literature
Rominski, Sarah D; Lori, Jody R
2015-01-01
The Government of Ghana has taken important steps to mitigate the impact of unsafe abortion. However, the expected decline in maternal deaths is yet to be realized. This literature review aims to present findings from empirical research directly related to abortion provision in Ghana and identify gaps for future research. Four databases were searched with the keywords “Ghana and abortion,” and a hand review of reference lists was conducted. All abstracts were reviewed. The final included sample was 39 articles. Abortion-related complications represent a large component of admissions to gynecological wards in hospitals in Ghana as well as a large contributor to maternal mortality. Almost half of the included studies were hospital-based, mainly chart reviews. This review has identified gaps in the literature, including a lack of research interviewing women who have sought unsafe abortions and healthcare providers who may act as gatekeepers to women wishing to access safe abortion services. PMID:25438507
Ameur, Adam; Bunikis, Ignas; Enroth, Stefan; Gyllensten, Ulf
2014-01-01
CanvasDB is an infrastructure for management and analysis of genetic variants from massively parallel sequencing (MPS) projects. The system stores SNP and indel calls in a local database, designed to handle very large datasets, to allow for rapid analysis using simple commands in R. Functional annotations are included in the system, making it suitable for direct identification of disease-causing mutations in human exome- (WES) or whole-genome sequencing (WGS) projects. The system has a built-in filtering function implemented to simultaneously take into account variant calls from all individual samples. This enables advanced comparative analysis of variant distribution between groups of samples, including detection of candidate causative mutations within family structures and genome-wide association by sequencing. In most cases, these analyses are executed within just a matter of seconds, even when there are several hundreds of samples and millions of variants in the database. We demonstrate the scalability of canvasDB by importing the individual variant calls from all 1092 individuals present in the 1000 Genomes Project into the system, over 4.4 billion SNPs and indels in total. Our results show that canvasDB makes it possible to perform advanced analyses of large-scale WGS projects on a local server. Database URL: https://github.com/UppsalaGenomeCenter/CanvasDB PMID:25281234
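canvasDB itself exposes its filtering through R commands against a local database, so the following is only an illustration of the comparative-filtering idea described above, not the tool's actual API: a "present in every affected sample, absent from all controls" query written in Python against SQLite, with a hypothetical schema, sample names, and variants.

```python
import sqlite3

# Illustrative schema: one row per (variant, sample) call.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE calls (chrom TEXT, pos INTEGER, ref TEXT, alt TEXT, sample TEXT);
INSERT INTO calls VALUES
  ('chr1', 12345, 'A', 'G', 'affected_1'),
  ('chr1', 12345, 'A', 'G', 'affected_2'),
  ('chr1', 99999, 'C', 'T', 'affected_1'),
  ('chr1', 99999, 'C', 'T', 'control_1');
""")

affected = ["affected_1", "affected_2"]
controls = ["control_1"]

# Candidate variants: carried by every affected sample, absent from all controls.
sql = """
SELECT chrom, pos, ref, alt FROM calls
WHERE sample IN ({aff})
GROUP BY chrom, pos, ref, alt
HAVING COUNT(DISTINCT sample) = {n}
EXCEPT
SELECT DISTINCT chrom, pos, ref, alt FROM calls WHERE sample IN ({ctl})
""".format(aff=",".join("?" * len(affected)),
           n=len(affected),
           ctl=",".join("?" * len(controls)))

for row in con.execute(sql, affected + controls):
    print(row)   # -> ('chr1', 12345, 'A', 'G')
```

With an index on (chrom, pos, ref, alt) such set-difference queries stay fast as row counts grow, which is in the spirit of how a local variant store can remain responsive at the scale the authors report.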
CCD Parallaxes for 309 Late-type Dwarfs and Subdwarfs
NASA Astrophysics Data System (ADS)
Dahn, Conard C.; Harris, Hugh C.; Subasavage, John P.; Ables, Harold D.; Canzian, Blaise J.; Guetter, Harry H.; Harris, Fred H.; Henden, Arne H.; Leggett, S. K.; Levine, Stephen E.; Luginbuhl, Christian B.; Monet, Alice B.; Monet, David G.; Munn, Jeffrey A.; Pier, Jeffrey R.; Stone, Ronald C.; Vrba, Frederick J.; Walker, Richard L.; Tilleman, Trudy M.
2017-10-01
New, updated, and/or revised CCD parallaxes determined with the Strand Astrometric Reflector at the Naval Observatory Flagstaff Station are presented. Included are results for 309 late-type dwarf and subdwarf stars observed over the 30+ years that the program operated. For 124 of the stars, parallax determinations from other investigators have already appeared in the literature and we compare the different results. Also included here is new or updated VI photometry on the Johnson-Kron-Cousins system for all but a few of the faintest targets. Together with 2MASS JHKs near-infrared photometry, a sample of absolute-magnitude versus color and color versus color diagrams is constructed. Because large proper motion was a prime criterion for targeting the stars, the majority turn out to be either M-type subdwarfs or late M-type dwarfs. The sample also includes 50 dwarf or subdwarf L-type stars, and four T dwarfs. Possible halo subdwarfs are identified in the sample based on tangential velocity, subluminosity, and spectral type. Residuals from the solutions for parallax and proper motion for several stars show evidence of astrometric perturbations.
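As a worked example of how such parallaxes translate into the absolute magnitudes plotted in these diagrams, the standard distance modulus gives M = m - 5 log10(d/pc) + 5, with d = 1000/π for a parallax π in milliarcseconds. The star below is hypothetical.

```python
import math

def abs_mag(apparent_mag: float, parallax_mas: float) -> float:
    """Distance modulus: M = m - 5*log10(d/pc) + 5, with d = 1000/parallax[mas]."""
    d_pc = 1000.0 / parallax_mas
    return apparent_mag - 5.0 * math.log10(d_pc) + 5.0

# Hypothetical late-M dwarf: V = 19.2 mag at a parallax of 102 mas (d ~ 9.8 pc).
print(f"M_V = {abs_mag(19.2, 102.0):.2f}")   # close to 19.2, since d is near 10 pc
```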
NASA Technical Reports Server (NTRS)
Veldhuis, Hugo; Hall, Forrest G. (Editor); Knapp, David E. (Editor)
2000-01-01
This data set contains the major soil properties of soil samples collected in 1994 at the tower flux sites in the Northern Study Area (NSA). The soil samples were collected by Hugo Veldhuis and his staff from the University of Manitoba. The mineral soil samples were largely analyzed by Barry Goetz, under the supervision of Dr. Harold Rostad at the University of Saskatchewan. The organic soil samples were largely analyzed by Peter Haluschak, under the supervision of Hugo Veldhuis at the Centre for Land and Biological Resources Research in Winnipeg, Manitoba. During the course of field investigation and mapping, selected surface and subsurface soil samples were collected for laboratory analysis. These samples were used as benchmark references for specific soil attributes in general soil characterization. Detailed soil sampling, description, and laboratory analysis were performed on selected modal soils to provide examples of common soil physical and chemical characteristics in the study area. The soil properties that were determined include soil horizon; dry soil color; pH; bulk density; total, organic, and inorganic carbon; electric conductivity; cation exchange capacity; exchangeable sodium, potassium, calcium, magnesium, and hydrogen; water content at 0.01, 0.033, and 1.5 MPa; nitrogen; phosphorus; particle size distribution; texture; pH of the mineral soil and of the organic soil; extractable acid; and sulfur. These data are stored in ASCII text files. The data files are available on a CD-ROM (see document number 20010000884), or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).
Cai, Long-Fei; Zhu, Ying; Du, Guan-Sheng; Fang, Qun
2012-01-03
We described a microfluidic chip-based system capable of generating a droplet array with a large-scale concentration gradient by coupling the flow injection gradient technique with droplet-based microfluidics. Multiple modules, including sample injection, sample dispersion, gradient generation, droplet formation, mixing of sample and reagents, and online reaction within the droplets, were integrated into the microchip. In the system, nanoliter-scale sample solution was automatically injected into the chip under valveless flow injection analysis mode. The sample zone was first dispersed in the microchannel to form a concentration gradient along the axial direction of the microchannel and then segmented into a linear array of droplets by an immiscible oil phase. With the segmentation and protection of the oil phase, the concentration gradient profile of the sample was preserved in the droplet array with high fidelity. With a single injection of 16 nL of sample solution, an array of droplets with a concentration gradient spanning 3-4 orders of magnitude could be generated. The present system was applied in the enzyme inhibition assay of β-galactosidase to preliminarily demonstrate its potential in high throughput drug screening. With a single injection of 16 nL of inhibitor solution, more than 240 in-droplet enzyme inhibition reactions with different inhibitor concentrations could be performed with an analysis time of 2.5 min. Compared with multiwell plate-based screening systems, the inhibitor consumption was reduced 1000-fold. © 2011 American Chemical Society
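A toy calculation of the gradient-preservation idea (not the authors' device): an injected plug dispersed into a Gaussian axial profile is chopped into equal droplets, each trapping the local concentration. All parameter values are invented; the width is chosen so that 240 droplets span roughly the reported 3-4 orders of magnitude.

```python
import math

# Toy model: a dispersed sample plug follows a Gaussian axial profile and is
# chopped into equal droplets, each trapping the local concentration.
c0 = 1.0           # injected peak concentration (arbitrary units)
sigma = 28.0       # dispersion width, in units of droplet spacing (invented)
n_droplets = 240   # droplets spanning the dispersed zone

center = (n_droplets - 1) / 2
droplets = [c0 * math.exp(-((i - center) ** 2) / (2 * sigma ** 2))
            for i in range(n_droplets)]

print(f"gradient span: {max(droplets) / min(droplets):.1e}")  # ~1e4, i.e. ~4 orders
```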
Sun, Yangbo; Chen, Long; Huang, Bisheng; Chen, Keli
2017-07-01
As a mineral, the traditional Chinese medicine calamine is similar in appearance to many other minerals. Investigations of commercially available calamine samples have shown that many fake and inferior calamine goods are sold on the market. The conventional identification method for calamine is complicated, and given the large number of calamine samples to be screened, a rapid identification method is needed. To establish a qualitative model using near-infrared (NIR) spectroscopy for rapid identification of various calamine samples, large quantities of calamine samples, including crude products, counterfeits, and processed products, were collected and correctly identified using physicochemical and powder X-ray diffraction methods. The NIR spectroscopy method was used to analyze these samples by combining the multi-reference correlation coefficient (MRCC) method and the error back-propagation artificial neural network algorithm (BP-ANN), so as to realize the qualitative identification of calamine samples. The accuracy rate of the model based on NIR and MRCC methods was 85%; in addition, the model, which took multiple factors into consideration, can be used to identify crude calamine products, counterfeits, and processed products. Furthermore, by inputting the correlation coefficients against multiple references as the spectral feature data of samples into the BP-ANN, a BP-ANN model for qualitative identification was established, whose accuracy rate increased to 95%. The MRCC method can be used as an NIR-based method in the process of BP-ANN modeling.
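A minimal sketch of the modeling idea, assuming synthetic spectra rather than real NIR data: each sample is reduced to its correlation coefficients against several reference spectra (the MRCC step), and those features train a small back-propagation network. scikit-learn's MLPClassifier stands in for the paper's BP-ANN; all spectra and labels are simulated.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
wavelengths = 200  # number of NIR points (synthetic)

# Hypothetical reference spectra for three classes: crude, processed, counterfeit.
refs = rng.random((3, wavelengths))

def mrcc_features(spectrum: np.ndarray) -> np.ndarray:
    """Correlation coefficient of a spectrum against each reference (the MRCC idea)."""
    return np.array([np.corrcoef(spectrum, r)[0, 1] for r in refs])

# Simulate labeled samples as noisy copies of their class reference.
X, y = [], []
for label in range(3):
    for _ in range(40):
        spec = refs[label] + rng.normal(0, 0.15, wavelengths)
        X.append(mrcc_features(spec))
        y.append(label)
X, y = np.array(X), np.array(y)

# Small back-propagation network trained on the correlation features.
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0).fit(X, y)
print(f"training accuracy: {clf.score(X, y):.2f}")
```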
Wilsmore, Bradley R.; Grunstein, Ronald R.; Fransen, Marlene; Woodward, Mark; Norton, Robyn; Ameratunga, Shanthi
2013-01-01
Study Objectives: To determine the relationship between sleep complaints, primary insomnia, excessive daytime sleepiness, and lifestyle factors in a large community-based sample. Design: Cross-sectional study. Setting: Blood donor sites in New Zealand. Patients or Participants: 22,389 individuals aged 16-84 years volunteering to donate blood. Interventions: N/A. Measurements: A comprehensive self-administered questionnaire including personal demographics and validated questions assessing sleep disorders (snoring, apnea), sleep complaints (sleep quantity, sleep dissatisfaction), insomnia symptoms, excessive daytime sleepiness, mood, and lifestyle factors such as work patterns, smoking, alcohol, and illicit substance use. Additionally, direct measurements of height and weight were obtained. Results: One in three participants report < 7-8 h sleep, 5 or more nights per week, and 60% would like more sleep. Almost half the participants (45%) report suffering the symptoms of insomnia at least once per week, with one in five meeting more stringent criteria for primary insomnia. Excessive daytime sleepiness (evident in 9% of this large, predominantly healthy sample) was associated with insomnia (odds ratio [OR] 1.75, 95% confidence interval [CI] 1.50 to 2.05), depression (OR 2.01, CI 1.74 to 2.32), and sleep disordered breathing (OR 1.92, CI 1.59 to 2.32). Long work hours, alcohol dependence, and rotating work shifts also increase the risk of daytime sleepiness. Conclusions: Even in this relatively young, healthy, non-clinical sample, sleep complaints and primary insomnia with subsequent excess daytime sleepiness were common. There were clear associations between many personal and lifestyle factors, such as depression, long work hours, alcohol dependence, and rotating shift work, and sleep problems or excessive daytime sleepiness. Citation: Wilsmore BR; Grunstein RR; Fransen M; Woodward M; Norton R; Ameratunga S. Sleep habits, insomnia, and daytime sleepiness in a large and healthy community-based sample of New Zealanders. J Clin Sleep Med 2013;9(6):559-566. PMID:23772189
Bionimbus: a cloud for managing, analyzing and sharing large genomics datasets
Heath, Allison P; Greenway, Matthew; Powell, Raymond; Spring, Jonathan; Suarez, Rafael; Hanley, David; Bandlamudi, Chai; McNerney, Megan E; White, Kevin P; Grossman, Robert L
2014-01-01
Background As large genomics and phenotypic datasets are becoming more common, it is increasingly difficult for most researchers to access, manage, and analyze them. One possible approach is to provide the research community with several petabyte-scale cloud-based computing platforms containing these data, along with tools and resources to analyze them. Methods Bionimbus is an open source cloud-computing platform that is based primarily upon OpenStack, which manages on-demand virtual machines that provide the required computational resources, and GlusterFS, which is a high-performance clustered file system. Bionimbus also includes Tukey, which is a portal, and associated middleware that provides a single entry point and a single sign-on for the various Bionimbus resources; and Yates, which automates the installation, configuration, and maintenance of the required software infrastructure. Results Bionimbus is used by a variety of projects to process genomics and phenotypic data. For example, it is used by an acute myeloid leukemia resequencing project at the University of Chicago. The project requires several computational pipelines, including pipelines for quality control, alignment, variant calling, and annotation. For each sample, the alignment step required eight CPUs for about 12 h, and BAM file sizes ranged from 5 GB to 10 GB. Conclusions Most members of the research community have difficulty downloading large genomics datasets and obtaining sufficient storage and computer resources to manage and analyze the data. Cloud computing platforms, such as Bionimbus, with data commons that contain large genomics datasets, are one choice for broadening access to research data in genomics. PMID:24464852
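The per-sample figures quoted above lend themselves to a quick capacity estimate; the cluster size and cohort below are hypothetical, not Bionimbus's actual configuration.

```python
# Back-of-envelope capacity planning from the figures quoted above:
# alignment needs 8 CPUs for ~12 h per sample, BAMs are 5-10 GB each.
cores_per_sample, hours_per_sample = 8, 12
cluster_cores = 512            # hypothetical cloud allocation
samples = 1000                 # hypothetical cohort size

core_hours = cores_per_sample * hours_per_sample * samples   # 96,000
wall_days = core_hours / cluster_cores / 24
storage_tb = samples * 10 / 1024                             # worst-case BAM storage

print(f"{core_hours:,} core-hours, ~{wall_days:.1f} days, ~{storage_tb:.1f} TB")
```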
NASA Astrophysics Data System (ADS)
Pollard, David; Chang, Won; Haran, Murali; Applegate, Patrick; DeConto, Robert
2016-05-01
A 3-D hybrid ice-sheet model is applied to the last deglacial retreat of the West Antarctic Ice Sheet over the last ~20,000 yr. A large ensemble of 625 model runs is used to calibrate the model to modern and geologic data, including reconstructed grounding lines, relative sea-level records, elevation-age data and uplift rates, with an aggregate score computed for each run that measures overall model-data misfit. Two types of statistical methods are used to analyze the large-ensemble results: simple averaging weighted by the aggregate score, and more advanced Bayesian techniques involving Gaussian process-based emulation and calibration, and Markov chain Monte Carlo. The analyses provide sea-level-rise envelopes with well-defined parametric uncertainty bounds, but the simple averaging method only provides robust results with full-factorial parameter sampling in the large ensemble. Results for best-fit parameter ranges and envelopes of equivalent sea-level rise with the simple averaging method agree well with the more advanced techniques. Best-fit parameter ranges confirm earlier values expected from prior model tuning, including large basal sliding coefficients on modern ocean beds.
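A sketch of the simple score-weighted averaging, under invented misfit scores and sea-level projections; the exponential weighting is one plausible choice, not necessarily the scoring used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)
n_runs = 625

# Hypothetical ensemble: each run has an aggregate model-data misfit score
# and a projected equivalent sea-level rise (ESL, meters).
misfit = rng.gamma(shape=4.0, scale=1.0, size=n_runs)
esl = rng.normal(3.3, 0.8, size=n_runs)

# Weight each run by exp(-misfit), then form a weighted mean and a
# weighted 5-95% envelope over the ensemble.
w = np.exp(-misfit)
w /= w.sum()

mean_esl = np.sum(w * esl)
order = np.argsort(esl)
cdf = np.cumsum(w[order])
lo, hi = np.interp([0.05, 0.95], cdf, esl[order])
print(f"weighted mean ESL {mean_esl:.2f} m, 5-95% envelope [{lo:.2f}, {hi:.2f}] m")
```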
Bladergroen, Marco R.; van der Burgt, Yuri E. M.
2015-01-01
For large-scale and standardized applications in mass spectrometry- (MS-) based proteomics automation of each step is essential. Here we present high-throughput sample preparation solutions for balancing the speed of current MS-acquisitions and the time needed for analytical workup of body fluids. The discussed workflows reduce body fluid sample complexity and apply for both bottom-up proteomics experiments and top-down protein characterization approaches. Various sample preparation methods that involve solid-phase extraction (SPE) including affinity enrichment strategies have been automated. Obtained peptide and protein fractions can be mass analyzed by direct infusion into an electrospray ionization (ESI) source or by means of matrix-assisted laser desorption ionization (MALDI) without further need of time-consuming liquid chromatography (LC) separations. PMID:25692071
Zheng, Hui-Fei; Wang, Wen-Qiang; Li, Xin-Min; Rauw, Gail; Baker, Glen B
2017-01-01
A review is given of studies on the body fluid levels of neuroactive amino acids, including glutamate, glutamine, taurine, gamma-aminobutyric acid (GABA), glycine, tryptophan, D-serine, and others, in autism spectrum disorders (ASD). The results reported in the literature are generally inconclusive and contradictory, but there has been considerable variation among the previous studies in terms of factors such as age, gender, number of subjects, intelligence quotient, and psychoactive medication being taken. Future studies should include simultaneous analyses of a large number of amino acids [including D-serine and branched-chain amino acids (BCAAs)] and standardization of the factors mentioned above. It may also be appropriate to use saliva sampling to detect amino acids in ASD patients in the future; this noninvasive testing can be done easily and more frequently than other sampling, thus providing more dynamic monitoring.
BMI curves for preterm infants.
Olsen, Irene E; Lawson, M Louise; Ferguson, A Nicole; Cantrell, Rebecca; Grabich, Shannon C; Zemel, Babette S; Clark, Reese H
2015-03-01
Preterm infants experience disproportionate growth failure postnatally and may be large weight for length despite being small weight for age by hospital discharge. The objective of this study was to create and validate intrauterine weight-for-length growth curves using the contemporary, large, racially diverse US birth parameters sample used to create the Olsen weight-, length-, and head-circumference-for-age curves. Data from 391,681 US infants (Pediatrix Medical Group) born at 22 to 42 weeks' gestational age (born in 1998-2006) included birth weight, length, and head circumference, estimated gestational age, and gender. Separate subsamples were used to create and validate curves. Established methods were used to determine the weight-for-length ratio that was most highly correlated with weight and uncorrelated with length. Final smoothed percentile curves (3rd to 97th) were created by the Lambda Mu Sigma (LMS) method. The validation sample was used to confirm results. The final sample included 254,454 singleton infants (57.2% male) who survived to discharge. BMI was the best overall weight-for-length ratio for both genders and a majority of gestational ages. Gender-specific BMI-for-age curves were created (n = 127,446) and successfully validated (n = 126,988). Mean z scores for the validation sample were ~0 (~1 SD). BMI was different across gender and gestational age. We provide a set of validated reference curves (gender-specific) to track changes in BMI for prematurely born infants cared for in the NICU for use with weight-, length-, and head-circumference-for-age intrauterine growth curves. Copyright © 2015 by the American Academy of Pediatrics.
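For reference, the LMS method converts a measurement into a z-score via z = ((x/M)^L - 1)/(L·S), where L, M, and S are the fitted skewness, median, and coefficient-of-variation parameters at a given age. The parameter values below are invented, not the published Olsen curve values.

```python
def lms_zscore(x: float, L: float, M: float, S: float) -> float:
    """Z-score under the LMS method: z = ((x/M)**L - 1) / (L*S), for L != 0."""
    return ((x / M) ** L - 1.0) / (L * S)

# Hypothetical LMS triple for one gender/gestational-age bin (not the
# published Olsen values): L = 0.3, median BMI M = 8.5 kg/m^2, S = 0.11.
bmi = 9.4
print(f"z = {lms_zscore(bmi, 0.3, 8.5, 0.11):.2f}")   # ~0.93
```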
Impact of Processing Method on Recovery of Bacteria from Wipes Used in Biological Surface Sampling
Olson, Nathan D.; Filliben, James J.; Morrow, Jayne B.
2012-01-01
Environmental sampling for microbiological contaminants is a key component of hygiene monitoring and risk characterization practices utilized across diverse fields of application. However, confidence in surface sampling results, both in the field and in controlled laboratory studies, has been undermined by large variation in sampling performance results. Sources of variation include controlled parameters, such as sampling materials and processing methods, which often differ among studies, as well as random and systematic errors; however, the relative contributions of these factors remain unclear. The objective of this study was to determine the relative impacts of sample processing methods, including extraction solution and physical dissociation method (vortexing and sonication), on recovery of Gram-positive (Bacillus cereus) and Gram-negative (Burkholderia thailandensis and Escherichia coli) bacteria from directly inoculated wipes. This work showed that target organism had the largest impact on extraction efficiency and recovery precision, as measured by traditional colony counts. The physical dissociation method (PDM) had negligible impact, while the effect of the extraction solution was organism dependent. Overall, however, extraction of organisms from wipes using phosphate-buffered saline with 0.04% Tween 80 (PBST) resulted in the highest mean recovery across all three organisms. The results from this study contribute to a better understanding of the factors that influence sampling performance, which is critical to the development of efficient and reliable sampling methodologies relevant to public health and biodefense. PMID:22706055
NASA Astrophysics Data System (ADS)
Caulton, D.; Golston, L.; Li, Q.; Bou-Zeid, E.; Pan, D.; Lane, H.; Lu, J.; Fitts, J. P.; Zondlo, M. A.
2015-12-01
Recent work suggests that the distribution of methane emissions from fracking operations is skewed, with a small percentage of emitters contributing a large proportion of the total emissions. To provide statistically robust distributions of emitters and determine the presence of super-emitters, errors in current techniques need to be constrained and mitigated. The Marcellus shale, the most productive natural gas shale field in the United States, has received less intense focus at the well level and is here investigated to provide the distribution of methane emissions. In July of 2015 approximately 250 unique well pads were sampled using the Princeton Atmospheric Chemistry Mobile Acquisition Node (PAC-MAN). This mobile lab includes a Garmin GPS unit, a Vaisala weather station (WTX520), a LICOR 7700 CH4 open-path sensor, and a LICOR 7500 CO2/H2O open-path sensor. Sampling sites were preselected based on wind direction, sampling distance, and elevation grade. All sites were sampled during low boundary layer conditions (600-1000 and 1800-2200 local time). The majority of sites were sampled 1-3 times, while selected test sites were sampled multiple times or resampled several times during the day. For selected sites a sampling tower was constructed consisting of a Metek uSonic-3 Class A sonic anemometer and an additional LICOR 7700 and 7500. Data were recorded for at least one hour at these sites. A robust study and inter-comparison of different methodologies will be presented. The Gaussian plume model will be used to calculate fluxes for all sites and compare results from test sites with multiple passes. Tower data are used to provide constraints on the Gaussian plume model. Additionally, Large Eddy Simulation (LES) modeling will be used to calculate emissions from the tower sites. Alternative techniques will also be discussed. Results from these techniques will be compared to identify best practices and provide robust error estimates.
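A minimal sketch of the Gaussian plume inversion mentioned above: the standard ground-reflected point-source formula is evaluated for a unit emission rate and then scaled to match a measured enhancement. All numbers, including the dispersion parameters, are hypothetical.

```python
import math

def plume_conc(Q, u, y, z, h, sig_y, sig_z):
    """Ground-reflected Gaussian plume concentration (kg/m^3) at crosswind
    offset y and height z, downwind of a point source emitting Q kg/s
    into a mean wind speed u m/s from an effective release height h m."""
    lateral = math.exp(-y**2 / (2 * sig_y**2))
    vertical = (math.exp(-(z - h)**2 / (2 * sig_z**2)) +
                math.exp(-(z + h)**2 / (2 * sig_z**2)))
    return Q / (2 * math.pi * u * sig_y * sig_z) * lateral * vertical

# Invert for Q from one measured enhancement (all numbers hypothetical):
measured = 2.0e-6                 # kg/m^3 CH4 enhancement above background
u, y, z, h = 2.5, 0.0, 2.0, 3.0   # wind m/s; offsets and heights in m
sig_y, sig_z = 20.0, 10.0         # dispersion widths at this distance, m

unit = plume_conc(1.0, u, y, z, h, sig_y, sig_z)  # concentration per unit Q
print(f"estimated Q = {measured / unit:.3f} kg/s")
```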
Geophysics Under Pressure: Large-Volume Presses Versus the Diamond-Anvil Cell
NASA Astrophysics Data System (ADS)
Hazen, R. M.
2002-05-01
Prior to 1970, the legacy of Harvard physicist Percy Bridgman dominated high-pressure geophysics. Massive presses with large-volume devices, including piston-cylinder, opposed-anvil, and multi-anvil configurations, were widely used in both science and industry to achieve a range of crustal and upper mantle temperatures and pressures. George Kennedy of UCLA was a particularly influential advocate of large-volume apparatus for geophysical research prior to his death in 1980. The high-pressure scene began to change in 1959 with the invention of the diamond-anvil cell, which was designed simultaneously and independently by John Jamieson at the University of Chicago and Alvin Van Valkenburg at the National Bureau of Standards in Washington, DC. The compact, inexpensive diamond cell achieved record static pressures and had the advantage of optical access to the high-pressure environment. Nevertheless, members of the geophysical community, who favored the substantial sample volumes, geothermally relevant temperature range, and satisfying bulk of large-volume presses, initially viewed the diamond cell with indifference or even contempt. Several factors led to a gradual shift in emphasis from large-volume presses to diamond-anvil cells in geophysical research during the 1960s and 1970s. These factors include (1) their relatively low cost at a time of fiscal restraint, (2) Alvin Van Valkenburg's new position as a Program Director at the National Science Foundation in 1964 (when George Kennedy's proposal for a National High-Pressure Laboratory was rejected), (3) the development of lasers and micro-analytical spectroscopic techniques suitable for analyzing samples in a diamond cell, and (4) the attainment of record pressures (e.g., 100 GPa in 1975 by Mao and Bell at the Geophysical Laboratory). Today, a more balanced collaborative approach has been adopted by the geophysics and mineral physics community. Many high-pressure laboratories operate a new generation of less expensive large-volume presses side-by-side with a wide variety of diamond-anvil cells.
An Internationally Coordinated Science Management Plan for Samples Returned from Mars
NASA Astrophysics Data System (ADS)
Haltigin, T.; Smith, C. L.
2015-12-01
Mars Sample Return (MSR) remains a high priority of the planetary exploration community. Such an effort will undoubtedly be too large for any individual agency to conduct itself, and thus will require extensive global cooperation. To help prepare for an eventual MSR campaign, the International Mars Exploration Working Group (IMEWG) chartered the international Mars Architecture for the Return of Samples (iMARS) Phase II working group in 2014, consisting of representatives from 17 countries and agencies. The overarching task of the team was to provide recommendations for progressing towards campaign implementation, including a proposed science management plan. Building upon the iMARS Phase I (2008) outcomes, the Phase II team proposed the development of an International MSR Science Institute as part of the campaign governance, centering its deliberations around four themes: Organization: including an organizational structure for the Institute that outlines roles and responsibilities of key members and describes sample return facility requirements; Management: presenting issues surrounding scientific leadership, defining guidelines and assumptions for Institute membership, and proposing a possible funding model; Operations & Data: outlining a science implementation plan that details the preliminary sample examination flow, sample allocation process, and data policies; and Curation: introducing a sample curation plan that comprises sample tracking and routing procedures, sample sterilization considerations, and long-term archiving recommendations. This work presents a summary of the group's activities, findings, and recommendations, highlighting the role of international coordination in managing the returned samples.
Recent Progress and Emerging Issues in Measuring and Modeling Biomass Burning Emissions
NASA Astrophysics Data System (ADS)
Yokelson, R. J.; Stockwell, C.; Veres, P. R.; Hatch, L. E.; Barsanti, K. C.; Simpson, I. J.; Blake, D. R.; Alvarado, M.; Kreidenweis, S. M.; Robinson, A. L.; Akagi, S. K.; McMeeking, G. R.; Stone, E.; Gilman, J.; Warneke, C.; Sedlacek, A. J.; Kleinman, L. I.
2013-12-01
Nine recent multi-PI campaigns (6 airborne, 3 laboratory) have quantified biomass burning emissions and the subsequent smoke evolution in unprecedented detail. Among these projects were the Fourth Fire Lab at Missoula Experiment (FLAME-4) and the DOE airborne campaign BBOP (Biomass Burning Observation Project). Between 2009 and 2013 a large selection of fuels and ecosystems were probed, including: (1) 21 US prescribed fires in pine forests, chaparral, and shrublands; (2) numerous wildfires in the Pacific Northwest of the US; (3) 77 lab fires burning fuels collected from the sites of the prescribed fires; and (4) 158 lab fires burning authentic fuels in traditional cooking fires and advanced stoves; peat from Indonesia, Canada, and North Carolina; savanna grasses from Africa; temperate grasses from the US; crop waste from the US; rice straw from Taiwan, China, Malaysia, and California; temperate and boreal forest fuels collected in Montana and Alaska; chaparral fuels from California; trash; and tires. Instrumentation for gases included: FTIR, PTR-TOF-MS, 2D-GC, and whole air sampling. Particle measurements included filter sampling (with IC, elemental carbon (EC), organic carbon (OC), and GC-MS) and numerous real-time measurements such as: HR-AMS (high-resolution aerosol MS), SP-AMS (soot particle AMS), SP2 (single particle soot photometer), SP-MS (single particle MS), ice nuclei, CCN (cloud condensation nuclei), water soluble OC, size distribution, and optical properties in the UV-VIS. New data include: emission factors for over 400 gases, black carbon (BC), brown carbon (BrC), organic aerosol (OA), ions, metals, EC, and OC; and details of particle morphology, mixing state, optical properties, size distributions, and cloud nucleating activity. Large concentrations (several ppm) of monoterpenes were present in fresh smoke. About 30-70% of the initially emitted gas-phase non-methane organic compounds were semivolatile and could not be identified with current technology. The detection rate for the sampled US prescribed fires was zero by burned area and <30% by active fire detection. Smoke evolution was measured for numerous gas-phase precursors and products, ozone, OA, ions, and BC and BrC mixing state. BC particles were coated within one hour and the smoke evolution was, in general, strongly impacted by the unidentified low volatility gases. An informative synthesis of lab and field fire data with fuels from the same sites was carried out. A preliminary comparison of wildfire and prescribed fire emissions will be presented. Novel schemes are under development to summarize the new emissions data for models with limited mechanisms and to parameterize fast, sub-grid processes. Key current issues to be discussed include: packaging/parameterizing the recent explosion of emissions/evolution data for use in model mechanisms; addressing fires not detected from space; addressing the large amount of unidentified semi-volatile gases emitted by all fires; and developing appropriate airborne and ground-based sampling scales/strategies for local-global models. We briefly summarize a recently funded project that will sample emissions and quantify biomass consumption by peat fires in Indonesia and a pending proposal for comprehensive sampling of cooking fires, brick kilns, garbage burning, diesel super-emitters, etc. in South Asia.
Prevalence and clinical correlates of explosive outbursts in Tourette Syndrome
Chen, Kevin; Budman, Cathy L.; Herrera, Luis Diego; Witkin, Joanna E.; Weiss, Nicholas T.; Lowe, Thomas L.; Freimer, Nelson B.; Reus, Victor I.; Mathews, Carol A.
2012-01-01
The aim of this study was to examine the prevalence and clinical correlates of explosive outbursts in two large samples of individuals with TS, including one collected primarily from non-clinical sources. Participants included 218 TS-affected individuals who were part of a genetic study (N=104 from Costa Rica (CR) and N=114 from the US). The relationship between explosive outbursts and comorbid attention deficit hyperactivity disorder (ADHD), obsessive compulsive disorder (OCD), tic severity, and prenatal and perinatal complications were examined using regression analyses. Twenty percent of participants had explosive outbursts, with no significant differences in prevalence between the CR (non-clinical) and the US (primarily clinical) samples. In the overall sample, ADHD, greater tic severity, and lower age of tic onset were strongly associated with explosive outbursts. ADHD, prenatal exposure to tobacco, and male gender were significantly associated with explosive outbursts in the US sample. Lower age of onset and greater severity of tics were significantly associated with explosive outbursts in the CR sample. This study confirms previous studies that suggest that clinically significant explosive outbursts are common in TS and associated with ADHD and tic severity. An additional potential risk factor, prenatal exposure to tobacco, was also identified. PMID:23040794
DeBoever, Christopher; Reid, Erin G.; Smith, Erin N.; Wang, Xiaoyun; Dumaop, Wilmar; Harismendy, Olivier; Carson, Dennis; Richman, Douglas; Masliah, Eliezer; Frazer, Kelly A.
2013-01-01
Primary central nervous system lymphomas (PCNSL) have a dramatically increased prevalence among persons living with AIDS and are known to be associated with human Epstein Barr virus (EBV) infection. Previous work suggests that in some cases, co-infection with other viruses may be important for PCNSL pathogenesis. Viral transcription in tumor samples can be measured using next generation transcriptome sequencing. We demonstrate the ability of transcriptome sequencing to identify viruses, characterize viral expression, and identify viral variants by sequencing four archived AIDS-related PCNSL tissue samples and analyzing raw sequencing reads. EBV was detected in all four PCNSL samples and cytomegalovirus (CMV), JC polyomavirus (JCV), and HIV were also discovered, consistent with clinical diagnoses. CMV was found to express three long non-coding RNAs recently reported as expressed during active infection. Single nucleotide variants were observed in each of the viruses observed and three indels were found in CMV. No viruses were found in several control tumor types including 32 diffuse large B-cell lymphoma samples. This study demonstrates the ability of next generation transcriptome sequencing to accurately identify viruses, including DNA viruses, in solid human cancer tissue samples. PMID:24023918
Gyarmathy, V Anna; Johnston, Lisa G; Caplinskiene, Irma; Caplinskas, Saulius; Latkin, Carl A
2014-02-01
Respondent driven sampling (RDS) and incentivized snowball sampling (ISS) are two sampling methods that are commonly used to reach people who inject drugs (PWID). We generated a set of simulated RDS samples on an actual sociometric ISS sample of PWID in Vilnius, Lithuania ("original sample") to assess if the simulated RDS estimates were statistically significantly different from the original ISS sample prevalences for HIV (9.8%), Hepatitis A (43.6%), Hepatitis B (Anti-HBc 43.9% and HBsAg 3.4%), Hepatitis C (87.5%), syphilis (6.8%) and Chlamydia (8.8%) infections and for selected behavioral risk characteristics. The original sample consisted of a large component of 249 people (83% of the sample) and 13 smaller components with 1-12 individuals. Generally, as long as all seeds were recruited from the large component of the original sample, the simulation samples simply recreated the large component. There were no significant differences between the large component and the entire original sample for the characteristics of interest. Altogether 99.2% of 360 simulation sample point estimates were within the confidence interval of the original prevalence values for the characteristics of interest. When population characteristics are reflected in large network components that dominate the population, RDS and ISS may produce samples that have statistically non-different prevalence values, even though some isolated network components may be under-sampled and/or statistically significantly different from the main groups. This so-called "strudel effect" is discussed in the paper. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
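A toy version of the simulation logic (not the authors' code or the Vilnius data): seeds drawn from the large component of a synthetic network recruit neighbors via coupons, and the resulting sample is checked against component membership, illustrating the "strudel effect" of simply recreating the large component.

```python
import random
import networkx as nx

random.seed(0)

# Toy network standing in for the sociogram: one large component of 249 nodes
# plus 13 small isolated cliques (structure, not data, taken from the study).
G = nx.barabasi_albert_graph(249, 2)
for _ in range(13):
    G = nx.disjoint_union(G, nx.complete_graph(random.randint(1, 12)))

def rds_sample(G, seeds=5, coupons=3, size=150):
    """Simulated RDS: each recruit passes up to `coupons` coupons to
    not-yet-sampled neighbors until the target sample size is reached."""
    large = max(nx.connected_components(G), key=len)
    wave = random.sample(sorted(large), seeds)   # seeds from the large component
    sampled = set(wave)
    while wave and len(sampled) < size:
        nxt = []
        for node in wave:
            eligible = [n for n in G.neighbors(node) if n not in sampled]
            for n in random.sample(eligible, min(coupons, len(eligible))):
                sampled.add(n)
                nxt.append(n)
        wave = nxt
    return sampled

s = rds_sample(G)
in_large = len(s & max(nx.connected_components(G), key=len))
print(f"{in_large}/{len(s)} of the simulated RDS sample lies in the large component")
```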
Calvani, Nichola Eliza Davies; Windsor, Peter Andrew; Bush, Russell David; Šlapeta, Jan
2017-09-01
Fasciolosis, due to Fasciola hepatica and Fasciola gigantica, is a re-emerging zoonotic parasitic disease of worldwide importance. Human and animal infections are commonly diagnosed by the traditional sedimentation and faecal egg-counting technique. However, this technique is time-consuming and prone to sensitivity errors when a large number of samples must be processed or if the operator lacks sufficient experience. Additionally, diagnosis can only be made once the 12-week pre-patent period has passed. Recently, a commercially available coprological antigen ELISA has enabled detection of F. hepatica prior to the completion of the pre-patent period, providing earlier diagnosis and increased throughput, although species differentiation is not possible in areas of parasite sympatry. Real-time PCR offers the combined benefits of highly sensitive species differentiation for medium to large sample sizes. However, no molecular diagnostic workflow currently exists for the identification of Fasciola spp. in faecal samples. A new molecular diagnostic workflow for the highly-sensitive detection and quantification of Fasciola spp. in faecal samples was developed. The technique involves sedimenting and pelleting the samples prior to DNA isolation in order to concentrate the eggs, followed by disruption by bead-beating in a benchtop homogeniser to ensure access to DNA. Although both the new molecular workflow and the traditional sedimentation technique were sensitive and specific, the new molecular workflow enabled faster sample throughput in medium to large epidemiological studies, and provided the additional benefit of speciation. Further, good correlation (R2 = 0.74-0.76) was observed between the real-time PCR values and the faecal egg count (FEC) using the new molecular workflow for all herds and sampling periods. Finally, no effect of storage in 70% ethanol was detected on sedimentation and DNA isolation outcomes; enabling transport of samples from endemic to non-endemic countries without the requirement of a complete cold chain. The commercially-available ELISA displayed poorer sensitivity, even after adjustment of the positive threshold (65-88%), compared to the sensitivity (91-100%) of the new molecular diagnostic workflow. Species-specific assays for sensitive detection of Fasciola spp. enable ante-mortem diagnosis in both human and animal settings. This includes Southeast Asia where there are potentially many undocumented human cases and where post-mortem examination of production animals can be difficult. The new molecular workflow provides a sensitive and quantitative diagnostic approach for the rapid testing of medium to large sample sizes, potentially superseding the traditional sedimentation and FEC technique and enabling surveillance programs in locations where animal and human health funding is limited.
NASA Technical Reports Server (NTRS)
Milam, S. N.; Nuevo, M.; Sandford, S. A.; Cody, G. D.; Kilcoyne, A. L. D.; Stroud, R. M.; DeGregorio, B. T.
2010-01-01
The NASA Stardust mission successfully collected material from Comet 81P/Wild 2 [1], including authentic cometary grains [2]. X-ray absorption near-edge structure (XANES) spectroscopy analysis of these samples indicates the presence of oxygen-rich and nitrogen-rich organic materials, which contain a broad variety of functional groups (carbonyls, C=C bonds, aliphatic chains, amines, amides, etc.) [3]. One component of these organics appears to contain very little aromatic carbon and bears some similarity to the organic residues produced by the irradiation of ices of interstellar/cometary composition. Stardust samples were also recently shown to contain glycine, the smallest biological amino acid [4]. Organic residues produced from the UV irradiation of astrophysical ice analogs are already known to contain a large suite of organic molecules including amino acids [5-7], amphiphilic compounds (fatty acids) [8], and other complex species. This work presents a comparison between XANES spectra measured from organic residues formed in the laboratory and similar data for cometary samples collected by the Stardust mission.
Species-area relationships and extinction forecasts.
Halley, John M; Sgardeli, Vasiliki; Monokrousos, Nikolaos
2013-05-01
The species-area relationship (SAR) predicts that smaller areas contain fewer species. This is the basis of the SAR method that has been used to forecast large numbers of species committed to extinction every year due to deforestation. The method has a number of issues that must be handled with care to avoid error. These include the functional form of the SAR, the choice of equation parameters, the sampling procedure used, extinction debt, and forest regeneration. Concerns about the accuracy of the SAR technique often cite errors not much larger than the natural scatter of the SAR itself. Such errors do not undermine the credibility of forecasts predicting large numbers of extinctions, although they may be a serious obstacle in other SAR applications. Very large errors can arise from misinterpretation of extinction debt, inappropriate functional form, and ignoring forest regeneration. Major challenges remain to understand better the relationship between sampling protocol and the functional form of SARs and the dynamics of relaxation, especially in continental areas, and to widen the testing of extinction forecasts. © 2013 New York Academy of Sciences.
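The forecasting calculation at issue is compact: under the power-law SAR S = cA^z, reducing area to a fraction a of its original extent commits a fraction 1 - a^z of species to eventual extinction (the delayed losses being the extinction debt discussed above). A sketch with the canonical z = 0.25:

```python
def species_committed(area_fraction_remaining: float, z: float = 0.25) -> float:
    """Fraction of species eventually lost under S = c * A**z when area
    shrinks to the given fraction of its original extent."""
    return 1.0 - area_fraction_remaining ** z

# Hypothetical scenario: 30% of habitat area lost, canonical z = 0.25.
print(f"{species_committed(0.70):.1%} of species committed to extinction")  # ~8.5%
```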
Estimating abundance of mountain lions from unstructured spatial sampling
Russell, Robin E.; Royle, J. Andrew; Desimone, Richard; Schwartz, Michael K.; Edwards, Victoria L.; Pilgrim, Kristy P.; Mckelvey, Kevin S.
2012-01-01
Mountain lions (Puma concolor) are often difficult to monitor because of their low capture probabilities, extensive movements, and large territories. Methods for estimating the abundance of this species are needed to assess population status, determine harvest levels, evaluate the impacts of management actions on populations, and derive conservation and management strategies. Traditional mark–recapture methods do not explicitly account for differences in individual capture probabilities due to the spatial distribution of individuals in relation to survey effort (or trap locations). However, recent advances in the analysis of capture–recapture data have produced methods estimating abundance and density of animals from spatially explicit capture–recapture data that account for heterogeneity in capture probabilities due to the spatial organization of individuals and traps. We adapt recently developed spatial capture–recapture models to estimate density and abundance of mountain lions in western Montana. Volunteers and state agency personnel collected mountain lion DNA samples in portions of the Blackfoot drainage (7,908 km2) in west-central Montana using 2 methods: snow back-tracking mountain lion tracks to collect hair samples and biopsy darting treed mountain lions to obtain tissue samples. Overall, we recorded 72 individual capture events, including captures both with and without tissue sample collection and hair samples resulting in the identification of 50 individual mountain lions (30 females, 19 males, and 1 unknown sex individual). We estimated lion densities from 8 models containing effects of distance, sex, and survey effort on detection probability. Our population density estimates ranged from a minimum of 3.7 mountain lions/100 km2 (95% CI 2.3–5.7) under the distance only model (including only an effect of distance on detection probability) to 6.7 (95% CI 3.1–11.0) under the full model (including effects of distance, sex, survey effort, and distance × sex on detection probability). These numbers translate to a total estimate of 293 mountain lions (95% CI 182–451) to 529 (95% CI 245–870) within the Blackfoot drainage. Results from the distance model are similar to previous estimates of 3.6 mountain lions/100 km2 for the study area; however, results from all other models indicated greater numbers of mountain lions. Our results indicate that unstructured spatial sampling combined with spatial capture–recapture analysis can be an effective method for estimating large carnivore densities.
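The spatially explicit models referred to here typically let capture probability decay with distance from an animal's activity center, most often via a half-normal detection function; a sketch with invented parameters:

```python
import math

def p_detect(d_km: float, g0: float = 0.2, sigma_km: float = 3.0) -> float:
    """Half-normal detection function common in spatial capture-recapture:
    capture probability decays with distance d from an activity center,
    with baseline probability g0 and spatial scale sigma."""
    return g0 * math.exp(-d_km**2 / (2 * sigma_km**2))

# Expected detection probability at a few trap distances (parameters hypothetical):
for d in (0.0, 2.0, 5.0, 10.0):
    print(f"d = {d:4.1f} km -> p = {p_detect(d):.3f}")
```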
NASA Astrophysics Data System (ADS)
Reddy, Naveen A.; Steidel, Charles C.; Erb, Dawn K.; Shapley, Alice E.; Pettini, Max
2006-12-01
We present the results of a spectroscopic survey with LRIS-B on Keck of more than 280 star-forming galaxies and AGNs at redshifts 1.4 ≲ z ≲ 3.0 in the GOODS-N field. Candidates are selected by their UnGR colors using the "BM/BX" criteria to target redshift 1.4 ≲ z ≲ 2.5 galaxies and the LBG criteria to target redshift z ~ 3 galaxies; combined, these samples account for ~25%-30% of the R and Ks band counts to R = 25.5 and Ks(AB) = 24.4, respectively. The 212 BM/BX galaxies and 74 LBGs constitute the largest spectroscopic sample of galaxies at z > 1.4 in GOODS-N. Extensive multiwavelength data allow us to investigate the stellar populations, stellar masses, bolometric luminosities (Lbol), and extinction of z ~ 2 galaxies. Deep Chandra and Spitzer data indicate that the sample includes galaxies with a wide range in Lbol (≈10^10 to >10^12 L⊙) and 4 orders of magnitude in dust obscuration (Lbol/LUV). The sample includes galaxies with a large dynamic range in evolutionary state, from very young galaxies (ages ≈50 Myr) with small stellar masses (M* ≈ 10^9 M⊙) to evolved galaxies with stellar masses comparable to the most massive galaxies at these redshifts (M* > 10^11 M⊙). Spitzer data indicate that the optical sample includes some fraction of the obscured AGN population at high redshifts: at least 3 of 11 AGNs in the z > 1.4 sample are undetected in the deep X-ray data but exhibit power-law SEDs longward of ~2 μm (rest frame) indicative of obscured AGNs. The results of our survey indicate that rest-frame UV selection and spectroscopy presently constitute the most timewise efficient method of culling large samples of high-redshift galaxies with a wide range in intrinsic properties, and the data presented here will add significantly to the multiwavelength legacy of GOODS. Based on data obtained at the W. M. Keck Observatory, which is operated as a scientific partnership among the California Institute of Technology, the University of California, and NASA and was made possible by the generous financial support of the W. M. Keck Foundation.
3D-Laser-Scanning Technique Applied to Bulk Density Measurements of Apollo Lunar Samples
NASA Technical Reports Server (NTRS)
Macke, R. J.; Kent, J. J.; Kiefer, W. S.; Britt, D. T.
2015-01-01
In order to better interpret gravimetric data from orbiters such as GRAIL and LRO to understand the subsurface composition and structure of the lunar crust, it is important to have a reliable database of the density and porosity of lunar materials. To this end, we have been surveying these physical properties in both lunar meteorites and Apollo lunar samples. To measure porosity, both grain density and bulk density are required. For bulk density, our group has historically utilized sub-mm bead immersion techniques extensively, though several factors have made this technique problematic for our work with Apollo samples. Samples allocated for measurement are often smaller than optimal for the technique, leading to large error bars. Also, for some samples we were required to use pure alumina beads instead of our usual glass beads. The alumina beads were subject to undesirable static effects, producing unreliable results. Other investigators have tested the use of 3D laser scanners on meteorites for measuring bulk volumes. Early work, though promising, was plagued with difficulties including poor response on dark or reflective surfaces, difficulty reproducing sharp edges, and large processing time for producing shape models. Due to progress in technology, however, laser scanners have improved considerably in recent years. We tested this technique on 27 lunar samples in the Apollo collection using a scanner at NASA Johnson Space Center. We found it to be reliable and more precise than beads, with the added benefit that it involves no direct contact with the sample, enabling the study of particularly friable samples for which bead immersion is not possible.
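The quantity the scanned bulk volumes ultimately feed is porosity, computed from bulk and grain density; a sketch with invented values (the sample mass, volume, and grain density below are hypothetical, not measurements from the Apollo collection):

```python
def porosity(mass_g: float, scan_volume_cm3: float, grain_density: float) -> float:
    """Porosity from a 3D-scan bulk volume and an independently measured
    grain density: phi = 1 - rho_bulk / rho_grain."""
    bulk_density = mass_g / scan_volume_cm3
    return 1.0 - bulk_density / grain_density

# Hypothetical basalt chip: 52.1 g, 18.3 cm^3 scanned bulk volume,
# grain density 3.35 g/cm^3 (e.g., from helium pycnometry).
print(f"porosity = {porosity(52.1, 18.3, 3.35):.1%}")   # ~15%
```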
Capers, Patrice L.; Brown, Andrew W.; Dawson, John A.; Allison, David B.
2015-01-01
Background: Meta-research can involve manual retrieval and evaluation of research, which is resource intensive. Creation of high throughput methods (e.g., search heuristics, crowdsourcing) has improved feasibility of large meta-research questions, but possibly at the cost of accuracy. Objective: To evaluate the use of double sampling combined with multiple imputation (DS + MI) to address meta-research questions, using as an example adherence of PubMed entries to two simple Consolidated Standards of Reporting Trials guidelines for titles and abstracts. Methods: For the DS large sample, we retrieved all PubMed entries satisfying the filters: RCT, human, abstract available, and English language (n = 322,107). For the DS subsample, we randomly sampled 500 entries from the large sample. The large sample was evaluated with a lower rigor, higher throughput (RLOTHI) method using search heuristics, while the subsample was evaluated using a higher rigor, lower throughput (RHITLO) human rating method. Multiple imputation of the missing-completely-at-random RHITLO data for the large sample was informed by: RHITLO data from the subsample; RLOTHI data from the large sample; whether a study was an RCT; and country and year of publication. Results: The RHITLO and RLOTHI methods in the subsample largely agreed (phi coefficients: title = 1.00, abstract = 0.92). Compliance with abstract and title criteria has increased over time, with non-US countries improving more rapidly. DS + MI logistic regression estimates were more precise than subsample estimates (e.g., 95% CI for change in title and abstract compliance by year: subsample RHITLO 1.050–1.174 vs. DS + MI 1.082–1.151). As evidence of improved accuracy, DS + MI coefficient estimates were closer to the RHITLO estimates than the large-sample RLOTHI estimates. Conclusion: Our results support our hypothesis that DS + MI would result in improved precision and accuracy. This method is flexible and may provide a practical way to examine large corpora of literature. PMID:25988135
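A simplified sketch of the DS + MI idea under synthetic data: a model fitted on the double-rated subsample predicts the missing high-rigor labels for the large sample, the labels are imputed several times, and the estimates are pooled. The published analysis conditions on additional covariates (e.g., country), and standard MI pooling via Rubin's rules would combine within- and between-imputation variance rather than the naive spread shown here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-in for the design: a large sample scored only by the
# high-throughput heuristic (rlothi); a random subsample is also hand-rated.
N, n_sub, m_imputations = 20000, 500, 10
year = rng.integers(1995, 2015, N)
rlothi = rng.binomial(1, 0.4 + 0.01 * (year - 1995))        # heuristic label
truth = np.where(rng.random(N) < 0.9, rlothi, 1 - rlothi)   # latent hand rating
sub = rng.choice(N, n_sub, replace=False)                   # double-sampled rows

X = np.column_stack([rlothi, year - 2000])
model = LogisticRegression().fit(X[sub], truth[sub])        # fit on the subsample

# Impute the missing hand ratings m times and pool the compliance rate.
p = model.predict_proba(X)[:, 1]
rates = []
for _ in range(m_imputations):
    imputed = rng.binomial(1, p)
    imputed[sub] = truth[sub]        # keep observed ratings where available
    rates.append(imputed.mean())
print(f"pooled compliance estimate: {np.mean(rates):.3f} +/- {np.std(rates):.3f}")
```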
Tyrannosaur paleobiology: new research on ancient exemplar organisms.
Brusatte, Stephen L; Norell, Mark A; Carr, Thomas D; Erickson, Gregory M; Hutchinson, John R; Balanoff, Amy M; Bever, Gabe S; Choiniere, Jonah N; Makovicky, Peter J; Xu, Xing
2010-09-17
Tyrannosaurs, the group of dinosaurian carnivores that includes Tyrannosaurus rex and its closest relatives, are icons of prehistory. They are also the most intensively studied extinct dinosaurs, and thanks to large sample sizes and an influx of new discoveries, have become ancient exemplar organisms used to study many themes in vertebrate paleontology. A phylogeny that includes recently described species shows that tyrannosaurs originated by the Middle Jurassic but remained mostly small and ecologically marginal until the latest Cretaceous. Anatomical, biomechanical, and histological studies of T. rex and other derived tyrannosaurs show that large tyrannosaurs could not run rapidly, were capable of crushing bite forces, had accelerated growth rates and keen senses, and underwent pronounced changes during ontogeny. The biology and evolutionary history of tyrannosaurs provide a foundation for comparison with other dinosaurs and living organisms.
Nanoengineered capsules for selective SERS analysis of biological samples
NASA Astrophysics Data System (ADS)
You, Yil-Hwan; Schechinger, Monika; Locke, Andrea; Coté, Gerard; McShane, Mike
2018-02-01
Metal nanoparticles conjugated with DNA oligomers have been intensively studied for a variety of applications, including optical diagnostics. Assays based on aggregation of DNA-coated particles in proportion to the concentration of target analyte have not been widely adopted for clinical analysis, however, largely due to the nonspecific responses observed in complex biofluids. While sample pre-preparation such as dialysis is helpful to enable selective sensing, here we sought to prove that assay encapsulation in hollow microcapsules could remove this requirement and thereby facilitate more rapid analysis on complex samples. Gold nanoparticle-based assays were incorporated into capsules comprising polyelectrolyte multilayer (PEMs), and the response to small molecule targets and larger proteins were compared. Gold nanoparticles were able to selectively sense small Raman dyes (Rhodamine 6G) in the presence of large protein molecules (BSA) when encapsulated. A ratiometric based microRNA-17 sensing assay exhibited drastic reduction in response after encapsulation, with statistically-significant relative Raman intensity changes only at a microRNA-17 concentration of 10 nM compared to a range of 0-500 nM for the corresponding solution-phase response.
Measuring consistent masses for 25 Milky Way globular clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kimmig, Brian; Seth, Anil; Ivans, Inese I.
2015-02-01
We present central velocity dispersions, masses, mass-to-light ratios (M/Ls), and rotation strengths for 25 Galactic globular clusters (GCs). We derive radial velocities of 1951 stars in 12 GCs from single order spectra taken with Hectochelle on the MMT telescope. To this sample we add an analysis of available archival data of individual stars. For the full set of data we fit King models to derive consistent dynamical parameters for the clusters. We find good agreement between single-mass King models and the observed radial dispersion profiles. The large, uniform sample of dynamical masses we derive enables us to examine trends of M/L with cluster mass and metallicity. The overall values of M/L and the trends with mass and metallicity are consistent with existing measurements from a large sample of M31 clusters. This includes a clear trend of increasing M/L with cluster mass and lower than expected M/Ls for the metal-rich clusters. We find no clear trend of increasing rotation with increasing cluster metallicity suggested in previous work.
Bentzen, Amalie Kai; Marquard, Andrea Marion; Lyngaa, Rikke; Saini, Sunil Kumar; Ramskov, Sofie; Donia, Marco; Such, Lina; Furness, Andrew J S; McGranahan, Nicholas; Rosenthal, Rachel; Straten, Per Thor; Szallasi, Zoltan; Svane, Inge Marie; Swanton, Charles; Quezada, Sergio A; Jakobsen, Søren Nyboe; Eklund, Aron Charles; Hadrup, Sine Reker
2016-10-01
Identification of the peptides recognized by individual T cells is important for understanding and treating immune-related diseases. Current cytometry-based approaches are limited to the simultaneous screening of 10-100 distinct T-cell specificities in one sample. Here we use peptide-major histocompatibility complex (MHC) multimers labeled with individual DNA barcodes to screen >1,000 peptide specificities in a single sample, and detect low-frequency CD8 T cells specific for virus- or cancer-restricted antigens. When analyzing T-cell recognition of shared melanoma antigens before and after adoptive cell therapy in melanoma patients, we observe a greater number of melanoma-specific T-cell populations compared with cytometry-based approaches. Furthermore, we detect neoepitope-specific T cells in tumor-infiltrating lymphocytes and peripheral blood from patients with non-small cell lung cancer. Barcode-labeled pMHC multimers enable the combination of functional T-cell analysis with large-scale epitope recognition profiling for the characterization of T-cell recognition in various diseases, including in small clinical samples.
A DNA methylation map of human cancer at single base-pair resolution
Vidal, E; Sayols, S; Moran, S; Guillaumet-Adkins, A; Schroeder, M P; Royo, R; Orozco, M; Gut, M; Gut, I; Lopez-Bigas, N; Heyn, H; Esteller, M
2017-01-01
Although single base-pair resolution DNA methylation landscapes for embryonic and different somatic cell types provided important insights into epigenetic dynamics and cell-type specificity, such comprehensive profiling is incomplete across human cancer types. This prompted us to perform genome-wide DNA methylation profiling of 22 samples derived from normal tissues and associated neoplasms, including primary tumors and cancer cell lines. Unlike their invariant normal counterparts, cancer samples exhibited highly variable CpG methylation levels in a large proportion of the genome, involving progressive changes during tumor evolution. The whole-genome sequencing results from selected samples were replicated in a large cohort of 1112 primary tumors of various cancer types using genome-scale DNA methylation analysis. Specifically, we determined DNA hypermethylation of promoters and enhancers regulating tumor-suppressor genes, with potential cancer-driving effects. DNA hypermethylation events showed evidence of positive selection, mutual exclusivity and tissue specificity, suggesting their active participation in neoplastic transformation. Our data highlight the extensive changes in DNA methylation that occur in cancer onset, progression and dissemination. PMID:28581523
Foster, Gregory D.; Gates, Paul M.; Foreman, William T.; McKenzie, Stuart W.; Rinella, Frank A.
1993-01-01
Concentrations of pesticides in the dissolved phase of surface water samples from the Yakima River basin, WA, were determined using preconcentration in the Goulden large-sample extractor (GLSE) and gas chromatography/mass spectrometry (GC/MS) analysis. Sample volumes ranging from 10 to 120 L were processed with the GLSE, and the results from the large-sample analyses were compared to those derived from 1-L continuous liquid-liquid extractions. Few of the 40 target pesticides were detected in 1-L samples, whereas large-sample preconcentration in the GLSE provided detectable levels for many of the target pesticides. The number of pesticides detected in GLSE-processed samples was usually directly proportional to sample volume, although the measured concentrations of the pesticides were generally lower at the larger sample volumes for the same water source. The GLSE can be used to provide lower detection levels relative to conventional liquid-liquid extraction in GC/MS analysis of pesticides in samples of surface water.
A Review of Enhanced Sampling Approaches for Accelerated Molecular Dynamics
NASA Astrophysics Data System (ADS)
Tiwary, Pratyush; van de Walle, Axel
Molecular dynamics (MD) simulations have become a tool of immense use and popularity for simulating a variety of systems. With the advent of massively parallel computer resources, one now routinely sees applications of MD to systems as large as hundreds of thousands to even several million atoms, which is almost the size of most nanomaterials. However, it is not yet possible to reach laboratory timescales of milliseconds and beyond with MD simulations. Due to the essentially sequential nature of time, parallel computers have been of limited use in solving this so-called timescale problem. Instead, over the years a large range of statistical mechanics based enhanced sampling approaches have been proposed for accelerating molecular dynamics, and accessing timescales that are well beyond the reach of the fastest computers. In this review we provide an overview of these approaches, including the underlying theory, typical applications, and publicly available software resources to implement them.
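As a concrete taste of the machinery such reviews survey, the sketch below shows the Metropolis swap test used in parallel tempering (replica exchange), one widely used enhanced-sampling scheme; it is a generic illustration in reduced units, not code from the review itself.

```python
import numpy as np

def replica_swap_accepted(E_i, E_j, T_i, T_j, rng, k_B=1.0):
    """Metropolis test for exchanging configurations between two replicas
    held at temperatures T_i and T_j with potential energies E_i and E_j;
    accepting with this probability preserves Boltzmann sampling at each T."""
    beta_i, beta_j = 1.0 / (k_B * T_i), 1.0 / (k_B * T_j)
    delta = (beta_i - beta_j) * (E_i - E_j)
    if delta >= 0:                       # the swap raises the joint weight
        return True
    return rng.random() < np.exp(delta)

rng = np.random.default_rng(0)
print(replica_swap_accepted(E_i=-120.0, E_j=-118.5, T_i=1.0, T_j=1.2, rng=rng))
```

Swaps of this kind let a trajectory trapped in a deep free-energy minimum escape via the high-temperature replicas, which is how the method buys back some of the timescales plain MD cannot reach.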
A k-Vector Approach to Sampling, Interpolation, and Approximation
NASA Astrophysics Data System (ADS)
Mortari, Daniele; Rogers, Jonathan
2013-12-01
The k-vector search technique is a method designed to perform extremely fast range searching of large databases at computational cost independent of the size of the database. k-vector search algorithms have historically found application in satellite star-tracker navigation systems which index very large star catalogues repeatedly in the process of attitude estimation. Recently, the k-vector search algorithm has been applied to numerous other problem areas including non-uniform random variate sampling, interpolation of 1-D or 2-D tables, nonlinear function inversion, and solution of systems of nonlinear equations. This paper presents algorithms in which the k-vector search technique is used to solve each of these problems in a computationally efficient manner. In instances where these tasks must be performed repeatedly on a static (or nearly-static) data set, the proposed k-vector-based algorithms offer an extremely fast solution technique that outperforms standard methods.
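The core construction admits a compact sketch. The version below follows common descriptions of the k-vector (a reference line through the sorted extremes plus a precomputed count vector); the padding and index-widening details are our own choices, not necessarily those of this paper.

```python
import numpy as np

def build_kvector(values):
    """Sort the data once and precompute the k-vector: k[j] is the number of
    sorted elements at or below the straight line through the extremes."""
    s = np.sort(np.asarray(values, dtype=float))
    n = len(s)
    eps = 1e-9 * max(s[-1] - s[0], 1.0)        # pad so the line brackets all data
    m = (s[-1] - s[0] + 2.0 * eps) / (n - 1)   # slope of the reference line
    q = s[0] - eps                             # its value at index 0
    k = np.searchsorted(s, m * np.arange(n) + q, side="right")
    return s, k, m, q

def range_search(s, k, m, q, ya, yb):
    """Return all elements in [ya, yb]. The candidate window comes from two
    arithmetic index computations, so the cost is independent of len(s) up to
    a constant-size correction at the ends (handled by the final mask)."""
    n = len(s)
    jb = int(np.clip(np.floor((ya - q) / m) - 1, 0, n - 1))
    jt = int(np.clip(np.ceil((yb - q) / m) + 1, 0, n - 1))
    candidates = s[k[jb]:k[jt]]
    return candidates[(candidates >= ya) & (candidates <= yb)]

s, k, m, q = build_kvector(np.random.default_rng(0).uniform(0.0, 100.0, 100000))
hits = range_search(s, k, m, q, 42.0, 42.5)    # roughly 500 hits expected
```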
Krejci, Charlene B; Bissada, Nabil F
2012-01-01
To examine the literature with respect to periodontitis and issues specific to women's health, namely, hormonal changes, adverse pregnancy outcomes and osteoporosis. The literature was evaluated to review reported associations between periodontitis and gender-specific issues, namely, hormonal changes, adverse pregnancy outcomes and osteoporosis. Collectively, the literature provided a large body of evidence that supports various associations between periodontitis and hormonal changes, adverse pregnancy outcomes and osteoporosis; however, certain shortcomings were noted with respect to biases involving definitions, sample sizes and confounding variables. Specific cause-and-effect relationships could not be delineated at this time, and neither could definitive treatment interventions. Future research must include randomised controlled trials with consistent definitions, adequate controls and sufficiently large sample sizes in order to clarify specific associations, identify cause-and-effect relationships, define treatment options and determine treatment interventions that will lessen the untoward effects on at-risk populations.
Evaluation of residual uranium contamination in the dirt floor of an abandoned metal rolling mill.
Glassford, Eric; Spitz, Henry; Lobaugh, Megan; Spitler, Grant; Succop, Paul; Rice, Carol
2013-02-01
A single, large, bulk sample of uranium-contaminated material from the dirt floor of an abandoned metal rolling mill was separated into different types and sizes of aliquots to simulate samples that would be collected during site remediation. The facility rolled approximately 11,000 tons of hot-forged ingots of uranium metal approximately 60 y ago, and it has not been used since that time. Thirty small mass (≈ 0.7 g) and 15 large mass (≈ 70 g) samples were prepared from the heterogeneously contaminated bulk material to determine how measurements of the uranium contamination vary with sample size. Aliquots of bulk material were also resuspended in an exposure chamber to produce six samples of respirable particles that were obtained using a cascade impactor. Samples of removable surface contamination were collected by wiping 100 cm² of the interior surfaces of the exposure chamber with 47-mm-diameter fiber filters. Uranium contamination in each of the samples was measured directly using high-resolution gamma ray spectrometry. As expected, results for isotopic uranium (i.e., 235U and 238U) measured with the large-mass and small-mass samples are significantly different (p < 0.001), and the coefficient of variation (COV) for the small-mass samples was greater than for the large-mass samples. The uranium isotopic concentrations measured in the air and on the wipe samples were not significantly different and were also not significantly different (p > 0.05) from results for the large- or small-mass samples. Large-mass samples are more reliable for characterizing heterogeneously distributed radiological contamination than small-mass samples since they exhibit the least variation compared to the mean. Thus, samples should be sufficiently large in mass to ensure that the results are truly representative of the heterogeneously distributed uranium contamination present at the facility. Monitoring exposure of workers and the public as a result of uranium contamination resuspended during site remediation should be evaluated using samples of sufficient size and type to accommodate the heterogeneous distribution of uranium in the bulk material.
Cosart, Ted; Beja-Pereira, Albano; Luikart, Gordon
2014-11-01
The computer program EXONSAMPLER automates the sampling of thousands of exon sequences from publicly available reference genome sequences and gene annotation databases. It was designed to provide exon sequences for the efficient, next-generation gene sequencing method called exon capture. The exon sequences can be sampled by a list of gene name abbreviations (e.g. IFNG, TLR1), or by sampling exons from genes spaced evenly across chromosomes. It provides a list of genomic coordinates (a bed file), as well as a set of sequences in fasta format. User-adjustable parameters for collecting exon sequences include a minimum and maximum acceptable exon length, maximum number of exonic base pairs (bp) to sample per gene, and maximum total bp for the entire collection. It allows for partial sampling of very large exons. It can preferentially sample upstream (5 prime) exons, downstream (3 prime) exons, both external exons, or all internal exons. It is written in Python using freely available libraries. We describe the use of EXONSAMPLER to collect exon sequences from the domestic cow (Bos taurus) genome for the design of an exon-capture microarray to sequence exons from related species, including the zebu cow and wild bison. We collected ~10% of the exome (~3 million bp), including 155 candidate genes, and ~16,000 exons evenly spaced genome-wide. We prioritized the collection of 5 prime exons to facilitate discovery and genotyping of SNPs near upstream gene regulatory DNA sequences, which control gene expression and are often under natural selection. © 2014 John Wiley & Sons Ltd.
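The greedy selection logic described (length bounds, per-gene and total base-pair budgets, partial sampling of over-long exons) can be sketched as below; the record layout, defaults, and names are hypothetical, not EXONSAMPLER's actual code.

```python
def sample_exons(exons, min_len=100, max_len=10_000,
                 max_bp_per_gene=1_500, max_total_bp=3_000_000):
    """Collect BED-like exon records (gene, chrom, start, end), enforcing
    length bounds and per-gene/total bp budgets; exons that exceed the
    limits are sampled partially rather than discarded outright."""
    picked, per_gene, total = [], {}, 0
    for gene, chrom, start, end in exons:       # assume 5'->3' order per gene
        length = end - start
        if length < min_len:
            continue
        if length > max_len:                    # partial sampling of large exons
            end, length = start + max_len, max_len
        budget = min(max_bp_per_gene - per_gene.get(gene, 0), max_total_bp - total)
        if budget < min_len:
            continue
        if length > budget:
            end, length = start + budget, budget
        picked.append((chrom, start, end, gene))
        per_gene[gene] = per_gene.get(gene, 0) + length
        total += length
    return picked
```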
Occurrence of Radio Minihalos in a Mass-limited Sample of Galaxy Clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giacintucci, Simona; Clarke, Tracy E.; Markevitch, Maxim
2017-06-01
We investigate the occurrence of radio minihalos—diffuse radio sources of unknown origin observed in the cores of some galaxy clusters—in a statistical sample of 58 clusters drawn from the Planck Sunyaev–Zel’dovich cluster catalog using a mass cut (M_500 > 6 × 10^14 M_⊙). We supplement our statistical sample with a similarly sized nonstatistical sample mostly consisting of clusters in the ACCEPT X-ray catalog with suitable X-ray and radio data, which includes lower-mass clusters. Where necessary (for nine clusters), we reanalyzed the Very Large Array archival radio data to determine whether a minihalo is present. Our total sample includes all 28 currently known and recently discovered radio minihalos, including six candidates. We classify clusters as cool-core or non-cool-core according to the value of the specific entropy floor in the cluster center, rederived or newly derived from the Chandra X-ray density and temperature profiles where necessary (for 27 clusters). Contrary to the common wisdom that minihalos are rare, we find that almost all cool cores—at least 12 out of 15 (80%)—in our complete sample of massive clusters exhibit minihalos. The supplementary sample shows that the occurrence of minihalos may be lower in lower-mass cool-core clusters. No minihalos are found in non-cool cores or “warm cores.” These findings will help test theories of the origin of minihalos and provide information on the physical processes and energetics of the cluster cores.
Gyarmathy, V. Anna; Johnston, Lisa G.; Caplinskiene, Irma; Caplinskas, Saulius; Latkin, Carl A.
2014-01-01
Background Respondent driven sampling (RDS) and Incentivized Snowball Sampling (ISS) are two sampling methods that are commonly used to reach people who inject drugs (PWID). Methods We generated a set of simulated RDS samples on an actual sociometric ISS sample of PWID in Vilnius, Lithuania (“original sample”) to assess if the simulated RDS estimates were statistically significantly different from the original ISS sample prevalences for HIV (9.8%), Hepatitis A (43.6%), Hepatitis B (Anti-HBc 43.9% and HBsAg 3.4%), Hepatitis C (87.5%), syphilis (6.8%) and Chlamydia (8.8%) infections and for selected behavioral risk characteristics. Results The original sample consisted of a large component of 249 people (83% of the sample) and 13 smaller components with 1 to 12 individuals. Generally, as long as all seeds were recruited from the large component of the original sample, the simulation samples simply recreated the large component. There were no significant differences between the large component and the entire original sample for the characteristics of interest. Altogether 99.2% of 360 simulation sample point estimates were within the confidence interval of the original prevalence values for the characteristics of interest. Conclusions When population characteristics are reflected in large network components that dominate the population, RDS and ISS may produce samples that have statistically non-different prevalence values, even though some isolated network components may be under-sampled and/or statistically significantly different from the main groups. This so-called “strudel effect” is discussed in the paper. PMID:24360650
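A toy version of the simulation idea (seeded recruitment spreading through a network with a fixed coupon count) might look like the sketch below; the adjacency structure, seed choice, and parameters are illustrative, whereas the study ran its simulations on the observed Vilnius network.

```python
import random

def simulate_rds(adj, seeds, coupons=3, max_size=250, seed=0):
    """Breadth-first RDS-style recruitment on an adjacency dict: each sampled
    person passes up to `coupons` invitations to previously unsampled network
    neighbours until the target sample size is reached."""
    rng = random.Random(seed)
    sampled, queue = set(seeds), list(seeds)
    while queue and len(sampled) < max_size:
        ego = queue.pop(0)
        peers = [v for v in adj.get(ego, []) if v not in sampled]
        for v in rng.sample(peers, min(coupons, len(peers))):
            if len(sampled) >= max_size:
                break
            sampled.add(v)
            queue.append(v)
    return sampled
```

If every seed sits in one large connected component, the recruitment walk never leaves it, which is exactly the behaviour the authors report for their simulated RDS samples.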
Bardenheier, Barbara H; Bullard, Kai McKeever; Caspersen, Carl J; Cheng, Yiling J; Gregg, Edward W; Geiss, Linda S
2013-09-01
To use structural modeling to test a hypothesized model of causal pathways associated with prediabetes among older adults in the U.S. Cross-sectional study of 2,230 older adults (≥ 50 years) without diabetes included in the morning fasting sample of the 2001-2006 National Health and Nutrition Examination Surveys. Demographic data included age, income, marital status, race/ethnicity, and education. Behavioral data included physical activity (metabolic equivalent hours per week for vigorous or moderate muscle strengthening, walking/biking, and house/yard work), and poor diet (refined grains, red meat, added sugars, solid fats, and high-fat dairy). Structural-equation modeling was performed to examine the interrelationships among these variables with family history of diabetes, high blood pressure, BMI, large waist (waist circumference: women, ≥ 35 inches; men, ≥ 40 inches), triglycerides ≥ 200 mg/dL, and total and HDL (≥ 60 mg/dL) cholesterol. After dropping BMI and total cholesterol, our best-fit model included three single factors: socioeconomic position (SEP), physical activity, and poor diet. Large waist had the strongest direct effect on prediabetes (0.279), followed by male sex (0.270), SEP (-0.157), high blood pressure (0.122), family history of diabetes (0.070), and age (0.033). Physical activity had direct effects on HDL (0.137), triglycerides (-0.136), high blood pressure (-0.132), and large waist (-0.067); poor diet had direct effects on large waist (0.146) and triglycerides (0.148). Our results confirmed that, while including factors known to be associated with high risk of developing prediabetes, large waist circumference had the strongest direct effect. The direct effect of SEP on prediabetes suggests mediation by some unmeasured factor(s).
Removing the echoes from terahertz pulse reflection system and sample
NASA Astrophysics Data System (ADS)
Liu, Haishun; Zhang, Zhenwei; Zhang, Cunlin
2018-01-01
Due to echoes from both the terahertz (THz) pulse reflection system and the sample, the primary THz pulse is distorted. The system echoes are of two types: one preceding the main peak, probably caused by the ultrafast laser pulse, and one following the primary pulse, caused by the Fabry-Perot (F-P) etalon effect of the detector. We attempt to remove the corresponding echoes by using two kinds of deconvolution. A Si wafer of 400 μm was selected as the test sample. First, double Gaussian filter (DGF) deconvolution was used to remove the systematic echoes, and then another deconvolution technique was employed to eliminate the two obvious echoes of the sample. The results indicated that although the combination of two deconvolution techniques could not entirely remove the echoes of the sample and system, the echoes were largely reduced.
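One common way to realize DGF deconvolution is spectral division followed by a difference-of-Gaussians band-pass; the numpy sketch below illustrates that general approach, with cutoff frequencies and the regularization term chosen arbitrarily rather than taken from this paper.

```python
import numpy as np

def dgf_deconvolve(sample, reference, dt, f_high=3.0e12, f_low=0.1e12):
    """Deconvolve a THz time-domain trace by a reference pulse, then apply a
    double Gaussian filter (wide low-pass minus narrow low-pass, i.e., a
    band-pass) to suppress the noise that raw spectral division amplifies."""
    f = np.fft.rfftfreq(len(sample), dt)
    ref = np.fft.rfft(reference)
    transfer = np.fft.rfft(sample) / (ref + 1e-12 * np.max(np.abs(ref)))
    dgf = np.exp(-(f / f_high) ** 2) - np.exp(-(f / f_low) ** 2)
    return np.fft.irfft(transfer * dgf, n=len(sample))
```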
Pistón, Mariela; Dol, Isabel
2006-01-01
A multiparametric flow system based on multicommutation and binary sampling has been designed for the automated determination of sodium, potassium, calcium, and magnesium in large-volume parenteral solutions and hemodialysis concentrated solutions. The goal was to obtain a computer-controlled system capable of determining the four metals without extensive modifications. The system involved the use of five solenoid valves under software control, allowing the establishment of the appropriate flow conditions for each analyte, that is, sample size, dilution, reagent addition, and so forth. Detection was carried out by either flame atomic emission spectrometry (sodium, potassium) or flame atomic absorption spectrometry (calcium, magnesium). The influence of several operating parameters was studied. Validation was carried out by analyzing artificial samples. Figures of merit obtained include linearity, accuracy, precision, and sampling frequency. Linearity was satisfactory: sodium, r² > 0.999 (0.5–3.5 g/L); potassium, r² > 0.996 (50–150 mg/L); calcium, r² > 0.999 (30–120 mg/L); and magnesium, r² > 0.999 (20–40 mg/L). Precision (sr, %, n = 5) was better than 2.1%, and accuracy (evaluated through recovery assays) was in the range of 99.8%–101.0% (sodium), 100.8%–102.5% (potassium), 97.3%–101.3% (calcium), and 97.1%–99.8% (magnesium). Sampling frequencies (h⁻¹) were 70 (sodium), 75 (potassium), 70 (calcium), and 58 (magnesium). According to the results obtained, the use of an automated multiparametric system based on multicommutation offers several advantages for the quality control of large-volume parenteral solutions and hemodialysis concentrated solutions. PMID:17671619
Cryptosporidium source tracking in the Potomac River watershed.
Yang, Wenli; Chen, Plato; Villegas, Eric N; Landy, Ronald B; Kanetsky, Charles; Cama, Vitaliano; Dearen, Theresa; Schultz, Cherie L; Orndorff, Kenneth G; Prelewicz, Gregory J; Brown, Miranda H; Young, Kim Roy; Xiao, Lihua
2008-11-01
To better characterize Cryptosporidium in the Potomac River watershed, a PCR-based genotyping tool was used to analyze 64 base flow and 28 storm flow samples from five sites in the watershed. These sites included two water treatment plant intakes, as well as three upstream sites, each associated with a different type of land use. The uses, including urban wastewater, agricultural (cattle) wastewater, and wildlife, posed different risks in terms of the potential contribution of Cryptosporidium oocysts to the source water. Cryptosporidium was detected in 27 base flow water samples and 23 storm flow water samples. The most frequently detected species was C. andersoni (detected in 41 samples), while 14 other species or genotypes, almost all wildlife associated, were occasionally detected. The two common human-pathogenic species, C. hominis and C. parvum, were not detected. Although C. andersoni was common at all four sites influenced by agriculture, it was largely absent at the urban wastewater site. There were very few positive samples as determined by Environmental Protection Agency method 1623 at any site; only 8 of 90 samples analyzed (9%) were positive for Cryptosporidium as determined by microscopy. The genotyping results suggest that many of the Cryptosporidium oocysts in the water treatment plant source waters were from older calves and adult cattle and might not pose a significant risk to human health.
Evaluation of the impact of RNA preservation methods of spiders for de novo transcriptome assembly.
Kono, Nobuaki; Nakamura, Hiroyuki; Ito, Yusuke; Tomita, Masaru; Arakawa, Kazuharu
2016-05-01
With advances in high-throughput sequencing technologies, de novo transcriptome sequencing and assembly has become a cost-effective method to obtain comprehensive genetic information of a species of interest, especially in nonmodel species with large genomes such as spiders. However, high-quality RNA is essential for successful sequencing, and sample preservation conditions require careful consideration for the effective storage of field-collected samples. To this end, we report a streamlined feasibility study of various storage conditions and their effects on de novo transcriptome assembly results. The storage parameters considered include temperatures ranging from room temperature to -80°C; preservatives, including ethanol, RNAlater, TRIzol and RNAlater-ICE; and sample submersion states. As a result, intact RNA was extracted and assembly was successful when samples were preserved at low temperatures regardless of the type of preservative used. The assemblies as well as the gene expression profiles were shown to be robust to RNA degradation, when 30 million 150-bp paired-end reads are obtained. The parameters for sample storage, RNA extraction, library preparation, sequencing and in silico assembly considered in this work provide a guideline for the study of field-collected samples of spiders. © 2015 John Wiley & Sons Ltd.
Variation of clinical outcomes used in glaucoma randomised controlled trials: a systematic review.
Ismail, Rehab; Azuara-Blanco, Augusto; Ramsay, Craig R
2014-04-01
In randomised clinical trials (RCTs) the selection of appropriate outcomes is crucial to the assessment of whether one intervention is better than another. The purpose of this review is to identify different clinical outcomes reported in glaucoma trials. We conducted a systematic review of glaucoma RCTs. A sample or selection of glaucoma trials were included bounded by a time frame (between 2006 and March 2012). Only studies in English language were considered. All clinical measured and reported outcomes were included. The possible variations of clinical outcomes were defined prior to data analysis. Information on reported clinical outcomes was tabulated and analysed using descriptive statistics. Other data recorded included type of intervention and glaucoma, duration of the study, defined primary outcomes, and outcomes used for sample size calculation, if nominated. The search strategy identified 4323 potentially relevant abstracts. There were 315 publications retrieved, of which 233 RCTs were included. A total of 967 clinical measures were reported. There were large variations in the definitions used to describe different outcomes and their measures. Intraocular pressure was the most commonly reported outcome (used in 201 RCTs, 86%) with a total of 422 measures (44%). Safety outcomes were commonly reported in 145 RCTs (62%) whereas visual field outcomes were used in 38 RCTs (16%). There is a large variation in the reporting of clinical outcomes in glaucoma RCTs. This lack of standardisation may impair the ability to evaluate the evidence of glaucoma interventions.
Hunt, Geoffrey; Moloney, Molly; Fazio, Adam
2012-01-01
Qualitative research is often conceptualized as inherently small-scale research, primarily conducted by a lone researcher enmeshed in extensive and long-term fieldwork or involving in-depth interviews with a small sample of 20 to 30 participants. In the study of illicit drugs, traditionally this has often been in the form of ethnographies of drug-using subcultures. Such small-scale projects have produced important interpretive scholarship that focuses on the culture and meaning of drug use in situated, embodied contexts. Larger-scale projects are often assumed to be solely the domain of quantitative researchers, using formalistic survey methods and descriptive or explanatory models. In this paper, however, we will discuss qualitative research done on a comparatively larger scale—with in-depth qualitative interviews with hundreds of young drug users. Although this work incorporates some quantitative elements into the design, data collection, and analysis, the qualitative dimension and approach has nevertheless remained central. Larger-scale qualitative research shares some of the challenges and promises of smaller-scale qualitative work including understanding drug consumption from an emic perspective, locating hard-to-reach populations, developing rapport with respondents, generating thick descriptions and a rich analysis, and examining the wider socio-cultural context as a central feature. However, there are additional challenges specific to the scale of qualitative research, which include data management, data overload and problems of handling large-scale data sets, time constraints in coding and analyzing data, and personnel issues including training, organizing and mentoring large research teams. Yet large samples can prove to be essential for enabling researchers to conduct comparative research, whether that be cross-national research within a wider European perspective undertaken by different teams or cross-cultural research looking at internal divisions and differences within diverse communities and cultures. PMID:22308079
ERIC Educational Resources Information Center
Allen, Daniel N.; Thaler, Nicholas S.; Barchard, Kimberly A.; Vertinski, Mary; Mayfield, Joan
2012-01-01
The Comprehensive Trail Making Test (CTMT) is a relatively new version of the Trail Making Test that has a number of appealing features, including a large normative sample that allows raw scores to be converted to standard "T" scores adjusted for age. Preliminary validity information suggests that CTMT scores are sensitive to brain…
USDA-ARS?s Scientific Manuscript database
Toxoplasma gondii is a protozoan parasite that infects a large spectrum of warm-blooded animals, including humans. Small mammals and rodents play an important role in the epidemiology of T. gondii because they are sources of infection for domestic and feral cats. Serum samples from 151 rodents and 4...
Sex and age composition of Great Gray Owls (Strix nebulosa), winter 1995/1996
Robert W. Nero; Herbert W. R. Copland
1997-01-01
In winter 1995/1996, a nearly continent-wide movement of Great Gray Owls (Strix nebulosa) occurred. A sample of 126 owls examined during this period, mainly from northeast of Winnipeg, included a large number from the 1994 hatch-year. If our assumptions regarding molt are correct, 51 birds were from this age class. An inhibited molt condition found...
ERIC Educational Resources Information Center
Byrne, Bruce; Guy, Richard
2016-01-01
This article describes student perceptions and outcomes in relation to the use of a novel interteaching approach. The study sample (n = 260) was taken from a large human physiology class, which included both first- and second-year students. However, unlike the first-year students, the second-year students had significant prior knowledge, having…
ERIC Educational Resources Information Center
Almgren, Gunnar; Magarati, Maya; Mogford, Liz
2009-01-01
We investigate the factors that influence adolescent self-assessed health, based upon surveys conducted between 2000 and 2004 of high-school seniors in Washington State (N = 6853). A large proportion of the sample (30%) was first and second generation immigrants from Asia, Latin America, and Eastern Europe. Findings include a robust negative…
Substance Abuse and Behavioral Correlates of Sexual Assault among South African Adolescents
ERIC Educational Resources Information Center
King, Gary; Flisher, Alan J.; Noubary, Farzad; Reece, Robert; Marais, Adele; Lombard, Carl
2004-01-01
Objective: The aim of this article is twofold: first, to examine the prevalence of being the victim of actual and attempted rape among a large representative sample of Cape Town high school students; and second, to identify the correlates of sexual assault for both boys and girls, including alcohol, tobacco and other drug use, behavioral problems,…
Total Synthesis of Eleutherobin and Analogs and Study of Anti-Cancer Mechanism
2001-05-01
They're Not All at Home: Residential Placements of Early Adolescents in Special Education
ERIC Educational Resources Information Center
Chen, Chin-Chih; Culhane, Dennis P.; Metraux, Stephen; Park, Jung Min; Venable, Jessica C.; Burnett, T. C.
2016-01-01
Using an integrated administrative data set, out-of-home residential placements (i.e., child welfare, juvenile justice, mental health) were examined in a sample of early adolescents in a large urban school district. Out-of-home placements were tracked across Grades 7 to 9 in a population of 58,000 youth. This included 10,911 students identified…
ERIC Educational Resources Information Center
Konstantopoulos, Spyros
2013-01-01
Large-scale experiments that involve nested structures may assign treatment conditions either to subgroups such as classrooms or to individuals such as students within subgroups. Key aspects of the design of such experiments include knowledge of the variance structure in higher levels and the sample sizes necessary to reach sufficient power to…
The Changing of the Guard: Turnover and Structural Change in the Top-Management Positions.
ERIC Educational Resources Information Center
Harrison, J. Richard; And Others
1988-01-01
Using transition-rate analysis for a sample of 671 large U.S. manufacturing firms during 1978-1980, this paper explores the nature and relationship of the chief executive officer (CEO) and chair of the board of directors positions. One analysis examines CEO and board chair turnover; the other examines consolidations and separations. Includes 46…
ERIC Educational Resources Information Center
Archer, David Eric
2010-01-01
Scope and Method of Study: The purpose of this study was to discover the meanings female intercollegiate athletes ascribe to their experiences preceding exit from NCAA Division I competition. The study sample included five Division I female intercollegiate athletes. Four of these attended a large public research institution in the Southern Plains…
ERIC Educational Resources Information Center
Elkins, Irene J.; Malone, Steve; Keyes, Margaret; Iacono, William G.; McGue, Matt
2011-01-01
Whether gender differences exist in the impairment associated with attention-deficit/hyperactivity disorder (ADHD) is still largely unknown, because most samples have few affected girls or include only one sex. The current study evaluated whether ADHD affects adjustment differently for girls than boys in a population-based cohort of 11-year-olds…
Supra-galactic colour patterns in globular cluster systems
NASA Astrophysics Data System (ADS)
Forte, Juan C.
2017-07-01
An analysis of globular cluster systems associated with galaxies included in the Virgo and Fornax Hubble Space Telescope-Advanced Camera Surveys reveals distinct (g - z) colour modulation patterns. These features appear on composite samples of globular clusters and, most evidently, in galaxies with absolute magnitudes Mg in the range from -20.2 to -19.2. These colour modulations are also detectable on some samples of globular clusters in the central galaxies NGC 1399 and NGC 4486 (and confirmed on data sets obtained with different instruments and photometric systems), as well as in other bright galaxies in these clusters. After discarding field contamination, photometric errors and statistical effects, we conclude that these supra-galactic colour patterns are real and reflect some previously unknown characteristic. These features suggest that the globular cluster formation process was not entirely stochastic but included a fraction of clusters that formed in a rather synchronized fashion over large spatial scales, and in a tentative time lapse of about 1.5 Gyr at redshifts z between 2 and 4. We speculate that the putative mechanism leading to that synchronism may be associated with large scale feedback effects connected with violent star-forming events and/or with supermassive black holes.
Large-scale linkage analysis of 1302 affected relative pairs with rheumatoid arthritis
Hamshere, Marian L; Segurado, Ricardo; Moskvina, Valentina; Nikolov, Ivan; Glaser, Beate; Holmans, Peter A
2007-01-01
Rheumatoid arthritis is the most common systemic autoimmune disease and its etiology is believed to have both strong genetic and environmental components. We demonstrate the utility of including genetic and clinical phenotypes as covariates within a linkage analysis framework to search for rheumatoid arthritis susceptibility loci. The raw genotypes of 1302 affected relative pairs were combined from four large family-based samples (North American Rheumatoid Arthritis Consortium, United Kingdom, European Consortium on Rheumatoid Arthritis Families, and Canada). The familiality of the clinical phenotypes was assessed. The affected relative pairs were subjected to autosomal multipoint affected relative-pair linkage analysis. Covariates were included in the linkage analysis to take account of heterogeneity within the sample. Evidence of familiality was observed with age at onset (p << 0.001) and rheumatoid factor (RF) IgM (p << 0.001), but not definite erosions (p = 0.21). Genome-wide significant evidence for linkage was observed on chromosome 6. Genome-wide suggestive evidence for linkage was observed on chromosomes 13 and 20 when conditioning on age at onset, chromosome 15 conditional on gender, and chromosome 19 conditional on RF IgM after allowing for multiple testing of covariates. PMID:18466440
The use of Landsat for monitoring water parameters in the coastal zone
NASA Technical Reports Server (NTRS)
Bowker, D. E.; Witte, W. G.
1977-01-01
Landsats 1 and 2 have been successful in detecting and quantifying suspended sediment and several other important parameters in the coastal zone, including chlorophyll, particles, alpha (light transmission), tidal conditions, acid and sewage dumps, and in some instances oil spills. When chlorophyll a is present in detectable quantities, however, it is shown to interfere with the measurement of sediment. The Landsat banding problem impairs the instrument resolution and places a requirement on the sampling program to collect surface data from a sufficiently large area. A sampling method which satisfies this condition is demonstrated.
Methods of Sensing Land Pollution from Sanitary Landfills
NASA Technical Reports Server (NTRS)
Nosanov, Myron Ellis; Bowerman, Frank R.
1971-01-01
Major cities are congested and large sites suitable for landfill development are limited. Methane and other gases are produced at most sanitary landfills and dumps. These gases may migrate horizontally and vertically and have caused fatalities. Monitoring these gases provides data bases for design and construction of safe buildings on and adjacent to landfills. Methods of monitoring include: (1) a portable combustible gas indicator; and (2) glass flasks valved to allow simultaneous exhaust of the flask and aspiration of the sample into the flask. Samples are drawn through tubing from probes as deep as twenty-five feet below the surface.
Facebook or Twitter?: Effective recruitment strategies for family caregivers.
Herbell, Kayla; Zauszniewski, Jaclene A
2018-06-01
This brief details recent recruitment insights from a large all-online study of family caregivers that aimed to develop a measure to assess how family caregivers manage daily stresses. Online recruitment strategies included the use of Twitter and Facebook. Overall, 800 individuals responded to the recruitment strategy; 230 completed all study procedures. The most effective online recruitment strategy for targeting family caregivers was Facebook, yielding 86% of the sample. Future researchers may find social media recruitment appealing because it is an inexpensive, simple, and efficient method for obtaining national samples. Copyright © 2018 Elsevier Inc. All rights reserved.
An optimised protocol for molecular identification of Eimeria from chickens.
Kumar, Saroj; Garg, Rajat; Moftah, Abdalgader; Clark, Emily L; Macdonald, Sarah E; Chaudhry, Abdul S; Sparagano, Olivier; Banerjee, Partha S; Kundu, Krishnendu; Tomley, Fiona M; Blake, Damer P
2014-01-17
Molecular approaches supporting identification of Eimeria parasites infecting chickens have been available for more than 20 years, although they have largely failed to replace traditional measures such as microscopy and pathology. Limitations of microscopy-led diagnostics, including a requirement for specialist parasitological expertise and low sample throughput, are yet to be outweighed by the difficulties associated with accessing genomic DNA from environmental Eimeria samples. A key step towards the use of Eimeria species-specific PCR as a sensitive and reproducible discriminatory tool for use in the field is the production of a standardised protocol that includes sample collection and DNA template preparation, as well as primer selection from the numerous PCR assays now published. Such a protocol will facilitate development of valuable epidemiological datasets which may be easily compared between studies and laboratories. The outcome of an optimisation process undertaken in laboratories in India and the UK is described here, identifying four steps. First, samples were collected into a 2% (w/v) potassium dichromate solution. Second, oocysts were enriched by flotation in saturated saline. Third, genomic DNA was extracted using a QIAamp DNA Stool mini kit protocol including a mechanical homogenisation step. Finally, nested PCR was carried out using previously published primers targeting the internal transcribed spacer region 1 (ITS-1). Alternative methods tested included sample processing in the presence of faecal material, DNA extraction using a traditional phenol/chloroform protocol, the use of SCAR multiplex PCR (one tube and two tube versions) and speciation using the morphometric tool COCCIMORPH for the first time with field samples. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Johnson, H. T.; Linley, L. J.; Mansfield, J. A.
1982-01-01
A series of large-scale JP-4 fuel pool fire tests was conducted to refine existing mathematical models of large fires. Seven tests were conducted to make chemical concentration and temperature measurements in 7.5 and 15 meter-diameter pool fires. Measurements were made at heights of 0.7, 1.4, 2.9, 5.7, 11.4, and 21.3 meters above the fires. Temperatures were measured at up to 50 locations each second during the fires. Chemistry samples were taken at up to 23 locations within the fires and analyzed for combustion chemistry and soot concentration. Temperature and combustion chemistry profiles obtained during two 7.5 meter-diameter and two 15 meter-diameter fires are included.
Sample Preparation for Electron Probe Microanalysis—Pushing the Limits
Geller, Joseph D.; Engle, Paul D.
2002-01-01
There are two fundamental considerations in preparing samples for electron probe microanalysis (EPMA). The first one may seem obvious, but we often find it is overlooked. That is, the sample analyzed should be representative of the population from which it comes. The second is a direct result of the assumptions in the calculations used to convert x-ray intensity ratios, between the sample and standard, to concentrations. Samples originate from a wide range of sources. During their journey to being excited under the electron beam for the production of x rays there are many possibilities for sample alteration. Handling can contaminate samples by adding extraneous matter. In preparation, the various abrasives used in sizing the sample by sawing, grinding and polishing can embed themselves. The most accurate composition of a contaminated sample is, at best, not representative of the original sample; it is misleading. Our laboratory performs EPMA analysis on customer-submitted samples and prepares over 250 different calibration standards including pure elements, compounds, alloys, glasses and minerals. This large variety of samples does not lend itself to mass production techniques, including automatic polishing. Our manual preparation techniques are designed individually for each sample. The use of automated preparation equipment does not lend itself to this environment, and is not included in this manuscript. The final step in quantitative electron probe microanalysis is the conversion of x-ray intensity ratios, known as the “k-ratios,” to composition (in mass fraction or atomic percent) and/or film thickness. Of the many assumptions made in the ZAF (where these letters stand for atomic number, absorption and fluorescence) corrections, the localized geometry between the sample and electron beam, or takeoff angle, must be accurately known. Small angular errors can lead to significant errors in the final results. The sample preparation technique then becomes very important, and, under certain conditions, may even be the limiting factor in the analytical uncertainty budget. This paper considers preparing samples to get known geometries. It will not address the analysis of samples with irregular, unprepared surfaces or unknown geometries. PMID:27446757
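The conversion the authors describe can be stated, to first order, by the standard k-ratio relation; this is the generic ZAF formulation, not an equation quoted from the paper.

```latex
k = \frac{I_{\mathrm{sample}}}{I_{\mathrm{standard}}},
\qquad
C_{\mathrm{sample}} \approx k \, C_{\mathrm{standard}} \, [\mathrm{ZAF}]
```

The absorption term in [ZAF] depends on the path length of x rays leaving the sample, and hence on the takeoff angle, which is why small geometric errors introduced during preparation propagate directly into the computed concentrations.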
Oil-shale data, cores, and samples collected by the U.S. geological survey through 1989
Dyni, John R.; Gay, Frances; Michalski, Thomas C.; ,
1990-01-01
The U.S. Geological Survey has acquired a large collection of geotechnical data, drill cores, and crushed samples of oil shale from the Eocene Green River Formation in Colorado, Wyoming, and Utah. The data include about 250,000 shale-oil analyses from about 600 core holes. Most of the data is from Colorado where the thickest and highest-grade oil shales of the Green River Formation are found in the Piceance Creek basin. Other data on file but not yet in the computer database include hundreds of lithologic core descriptions, geophysical well logs, and mineralogical and geochemical analyses. The shale-oil analyses are being prepared for release on floppy disks for use on microcomputers. About 173,000 lineal feet of drill core of oil shale and associated rocks, as well as 100,000 crushed samples of oil shale, are stored at the Core Research Center, U.S. Geological Survey, Lakewood, Colo. These materials are available to the public for research.
Arnold, Terri L.; Desimone, Leslie A.; Bexfield, Laura M.; Lindsey, Bruce D.; Barlow, Jeannie R.; Kulongoski, Justin T.; Musgrove, MaryLynn; Kingsbury, James A.; Belitz, Kenneth
2016-06-20
Groundwater-quality data were collected from 748 wells as part of the National Water-Quality Assessment Project of the U.S. Geological Survey National Water-Quality Program from May 2012 through December 2013. The data were collected from four types of well networks: principal aquifer study networks, which assess the quality of groundwater used for public water supply; land-use study networks, which assess land-use effects on shallow groundwater quality; major aquifer study networks, which assess the quality of groundwater used for domestic supply; and enhanced trends networks, which evaluate the time scales during which groundwater quality changes. Groundwater samples were analyzed for a large number of water-quality indicators and constituents, including major ions, nutrients, trace elements, volatile organic compounds, pesticides, and radionuclides. These groundwater quality data are tabulated in this report. Quality-control samples also were collected; data from blank and replicate quality-control samples are included in this report.
Pokines, James T; Zinni, Debra Prince; Crowley, Kate
2016-01-01
A sample of 49 cases of cemetery remains received at the Office of the Chief Medical Examiner, Massachusetts (OCME-MA), in Boston was compared with published taphonomic profiles of cemetery remains. The present sample is composed of a cross section of typical cases in this region that ultimately derive from modern to historical coffin burials and are turned over to or seized by law enforcement. The sample included a large portion of isolated remains, and most were completely skeletonized. The most prevalent taphonomic characteristics included uniform staining (77.6%), coffin wear (46.9%), and cortical exfoliation (49.0%). Other taphonomic changes occurring due to later surface exposure of cemetery remains included subaerial weathering, animal gnawing, algae formation, and excavation marks. A case of one set of skeletal remains associated with coffin artifacts and cemetery offerings that was recovered from transported cemetery fill is also presented. © 2015 American Academy of Forensic Sciences.
Direct determination of fatty acids in fish tissues: quantifying top predator trophic connections.
Parrish, Christopher C; Nichols, Peter D; Pethybridge, Heidi; Young, Jock W
2015-01-01
Fatty acids are a valuable tool in ecological studies because of the large number of unique structures synthesized. They provide versatile signatures that are being increasingly employed to delineate the transfer of dietary material through marine and terrestrial food webs. The standard procedure for determining fatty acids generally involves lipid extraction followed by methanolysis to produce methyl esters for analysis by gas chromatography. By directly transmethylating ~50 mg wet samples and adding an internal standard, it was possible to greatly simplify the analytical methodology, enabling rapid throughput of 20-40 fish tissue fatty acid analyses a day, including instrumental analysis. This method was verified against the more traditional lipid methods using albacore tuna and great white shark muscle and liver samples, and it was shown to provide an estimate of sample dry mass, total lipid content, and a condition index. When large fatty acid data sets are generated in this way, multidimensional scaling, analysis of similarities, and similarity of percentages analysis can be used to define trophic connections among samples and to quantify them. These routines were used on albacore and skipjack tuna fatty acid data obtained by direct methylation coupled with literature values for krill. There were clear differences in fatty acid profiles among the species as well as spatial differences among albacore tuna sampled from different locations.
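The multivariate workflow mentioned (ordination of fatty acid profiles, then group and contribution tests) can be outlined with standard libraries; this sketch assumes a samples-by-fatty-acids percentage matrix and pairs Bray-Curtis dissimilarities with non-metric MDS, a common choice in such studies though not necessarily the authors' exact software (ANOSIM and SIMPER implementations exist in, e.g., scikit-bio).

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
profiles = rng.dirichlet(np.ones(20), size=30) * 100.0  # rows: samples, cols: % fatty acids

d = squareform(pdist(profiles, metric="braycurtis"))    # pairwise dissimilarities
coords = MDS(n_components=2, metric=False,              # non-metric MDS ordination
             dissimilarity="precomputed", random_state=0).fit_transform(d)
```

Samples that cluster together in the ordination share similar fatty acid signatures and thus, under the trophic-marker logic of the paper, similar dietary sources.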
Outcome-Dependent Sampling with Interval-Censored Failure Time Data
Zhou, Qingning; Cai, Jianwen; Zhou, Haibo
2017-01-01
Summary Epidemiologic studies and disease prevention trials often seek to relate an exposure variable to a failure time that suffers from interval-censoring. When the failure rate is low and the time intervals are wide, a large cohort is often required so as to yield reliable precision on the exposure-failure-time relationship. However, large cohort studies with simple random sampling could be prohibitive for investigators with a limited budget, especially when the exposure variables are expensive to obtain. Alternative cost-effective sampling designs and inference procedures are therefore desirable. We propose an outcome-dependent sampling (ODS) design with interval-censored failure time data, where we enrich the observed sample by selectively including certain more informative failure subjects. We develop a novel sieve semiparametric maximum empirical likelihood approach for fitting the proportional hazards model to data from the proposed interval-censoring ODS design. This approach employs the empirical likelihood and sieve methods to deal with the infinite-dimensional nuisance parameters, which greatly reduces the dimensionality of the estimation problem and eases the computation difficulty. The consistency and asymptotic normality of the resulting regression parameter estimator are established. The results from our extensive simulation study show that the proposed design and method work well for practical situations and are more efficient than the alternative designs and competing approaches. An example from the Atherosclerosis Risk in Communities (ARIC) study is provided for illustration. PMID:28771664
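The sampling step itself is simple to illustrate; the toy routine below draws a simple random subsample and then enriches it with additional observed-failure subjects, mirroring the design's intent (sizes and names are hypothetical, and the paper's sieve semiparametric estimator is far more involved than this selection step).

```python
import numpy as np

def ods_sample(failure_observed, n_srs, n_supplement, seed=1):
    """Outcome-dependent sampling: a simple random sample plus a supplement
    drawn only from subjects whose (interval-censored) failure was observed,
    who carry the most information about the exposure-failure relationship."""
    rng = np.random.default_rng(seed)
    n = len(failure_observed)
    srs = rng.choice(n, size=n_srs, replace=False)
    failures = np.setdiff1d(np.flatnonzero(failure_observed), srs)
    extra = rng.choice(failures, size=min(n_supplement, len(failures)), replace=False)
    return np.concatenate([srs, extra])
```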
OPTICAL–NEAR-INFRARED PHOTOMETRIC CALIBRATION OF M DWARF METALLICITY AND ITS APPLICATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hejazi, N.; Robertis, M. M. De; Dawson, P. C., E-mail: nedahej@yorku.ca, E-mail: mmdr@yorku.ca, E-mail: pdawson@trentu.ca
2015-04-15
Based on a carefully constructed sample of dwarf stars, a new optical–near-infrared photometric calibration to estimate the metallicity of late-type K and early-to-mid-type M dwarfs is presented. The calibration sample has two parts; the first part includes 18 M dwarfs with metallicities determined by high-resolution spectroscopy and the second part contains 49 dwarfs with metallicities obtained through moderate-resolution spectra. By applying this calibration to a large sample of around 1.3 million M dwarfs from the Sloan Digital Sky Survey and 2MASS, the metallicity distribution of this sample is determined and compared with those of previous studies. Using photometric parallaxes, the Galactic heights of M dwarfs in the large sample are also estimated. Our results show that stars farther from the Galactic plane, on average, have lower metallicity, which can be attributed to the age–metallicity relation. A scarcity of metal-poor dwarf stars in the metallicity distribution relative to the Simple Closed Box Model indicates the existence of the “M dwarf problem,” similar to the previously known G and K dwarf problems. Several more complicated Galactic chemical evolution models which have been proposed to resolve the G and K dwarf problems are tested and it is shown that these models could, to some extent, mitigate the M dwarf problem as well.
Pham, Giang
2017-01-01
Purpose Although language samples and standardized tests are regularly used in assessment, few studies provide clinical guidance on how to synthesize information from these testing tools. This study extends previous work on the relations between tests and language samples to a new population—school-age bilingual speakers with primary language impairment—and considers the clinical implications for bilingual assessment. Method Fifty-one bilingual children with primary language impairment completed narrative language samples and standardized language tests in English and Spanish. Children were separated into younger (ages 5;6 [years;months]–8;11) and older (ages 9;0–11;2) groups. Analysis included correlations with age and partial correlations between language sample measures and test scores in each language. Results Within the younger group, positive correlations with large effect sizes indicated convergence between test scores and microstructural language sample measures in both Spanish and English. There were minimal correlations in the older group for either language. Age related to English but not Spanish measures. Conclusions Tests and language samples complement each other in assessment. Wordless picture-book narratives may be more appropriate for ages 5–8 than for older children. We discuss clinical implications, including a case example of a bilingual child with primary language impairment, to illustrate how to synthesize information from these tools in assessment. PMID:28055056
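The partial-correlation computation referred to can be reproduced generically from regression residuals, as in this sketch (a language-sample measure and a test score, controlling for age); it is a textbook formulation, not the authors' code.

```python
import numpy as np

def partial_corr(x, y, z):
    """Correlation between x and y after linearly regressing out z, e.g., a
    narrative microstructure measure vs. a standardized test score, with
    chronological age as the controlled covariate."""
    rx = x - np.polyval(np.polyfit(z, x, 1), z)   # x residuals given z
    ry = y - np.polyval(np.polyfit(z, y, 1), z)   # y residuals given z
    return float(np.corrcoef(rx, ry)[0, 1])
```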
Holliday, Trenton W; Hilton, Charles E
2010-06-01
Given the well-documented fact that human body proportions covary with climate (presumably due to the action of selection), one would expect that the Ipiutak and Tigara Inuit samples from Point Hope, Alaska, would be characterized by an extremely cold-adapted body shape. Comparison of the Point Hope Inuit samples to a large (n > 900) sample of European and European-derived, African and African-derived, and Native American skeletons (including Koniag Inuit from Kodiak Island, Alaska) confirms that the Point Hope Inuit evince a cold-adapted body form, but analyses also reveal some unexpected results. For example, one might suspect that the Point Hope samples would show a more cold-adapted body form than the Koniag, given their more extreme environment, but this is not the case. Additionally, univariate analyses seldom show the Inuit samples to be more cold-adapted in body shape than Europeans, and multivariate cluster analyses that include a myriad of body shape variables such as femoral head diameter, bi-iliac breadth, and limb segment lengths fail to effectively separate the Inuit samples from Europeans. In fact, in terms of body shape, the European and the Inuit samples tend to be cold-adapted and tend to be separated in multivariate space from the more tropically adapted Africans, especially those groups from south of the Sahara. Copyright 2009 Wiley-Liss, Inc.
A cautionary note on Bayesian estimation of population size by removal sampling with diffuse priors.
Bord, Séverine; Bioche, Christèle; Druilhet, Pierre
2018-05-01
We consider the problem of estimating a population size by removal sampling when the sampling rate is unknown. Bayesian methods are now widespread and allow prior knowledge to be included in the analysis. However, we show that Bayes estimates based on default improper priors lead to improper posteriors or infinite estimates. Similarly, weakly informative priors give unstable estimators that are sensitive to the choice of hyperparameters. By examining the likelihood, we show that population size estimates can be stabilized by penalizing small values of the sampling rate or large values of the population size. Based on theoretical results and simulation studies, we propose some recommendations on the choice of the prior. Then, we applied our results to real datasets. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
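To see concretely why the prior matters, consider a grid approximation of the removal-sampling posterior; the catch numbers and grids below are invented for illustration, and the prior on the sampling rate is one simple stabilizing choice of the kind the authors' recommendations concern.

```python
import numpy as np
from scipy.stats import binom

def removal_posterior(catches, N_grid, p_grid, prior):
    """Posterior over (N, p) for sequential removal sampling, where pass k
    removes c_k ~ Binomial(N - previous removals, p). A prior that penalizes
    small p (equivalently, huge N) keeps the posterior proper and stable."""
    post = np.zeros((len(N_grid), len(p_grid)))
    for i, N in enumerate(N_grid):
        for j, p in enumerate(p_grid):
            remaining, like = N, 1.0
            for c in catches:
                like *= binom.pmf(c, remaining, p)
                remaining -= c
            post[i, j] = like * prior(N, p)
    return post / post.sum()

posterior = removal_posterior(
    catches=[38, 26, 17],                 # three removal passes (made up)
    N_grid=np.arange(81, 401),            # N must be at least the total catch
    p_grid=np.linspace(0.01, 0.99, 99),
    prior=lambda N, p: p,                 # density proportional to p: Beta(2,1)
)
print(posterior.sum(axis=1).argmax() + 81)   # posterior-modal population size
```

Replacing the penalizing prior with a flat one spreads posterior mass toward very large N and very small p, which is the instability the authors warn about.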
Spötl, Christoph
2005-09-01
The stable carbon isotopic composition of dissolved inorganic carbon (δ13C(DIC)) is traditionally determined using either direct precipitation or gas evolution methods in conjunction with offline gas preparation and measurement in a dual-inlet isotope ratio mass spectrometer. A gas evolution method based on continuous-flow technology is described here, which is easy to use and robust. Water samples (100-1500 μl depending on the carbonate alkalinity) are injected into He-filled autosampler vials in the field and analysed on an automated continuous-flow gas preparation system interfaced to an isotope ratio mass spectrometer. Sample analysis time including online preparation is 10 min and overall precision is 0.1‰. This method is thus fast and can easily be automated for handling large sample batches.
NASA Technical Reports Server (NTRS)
Score, Roberta; Lindstrom, Marilyn M.
1990-01-01
The state of the collection of Antarctic meteorites is summarized. This guide is intended to assist investigators in planning their meteorite research and in selecting and requesting samples. Useful information is presented for all classified meteorites from the 1976 to 1988 collections, as of Sept. 1989. The meteorite collection has grown over 13 years to include 4264 samples, of which 2754 have been classified. Most of the unclassified meteorites are ordinary chondrites because the collections have been culled for specimens of special petrologic type. The guide consists of two large classification tables, preceded by a list of sample locations and important notes to make the tables understandable.
Pollard, Robert Q; Sutter, Erika; Cerulli, Catherine
2014-01-01
A computerized sign language survey was administered to two large samples of deaf adults. Six questions regarding intimate partner violence (IPV) were included, querying lifetime and past-year experiences of emotional abuse, physical abuse, and forced sex. Comparison data were available from a telephone survey of local households. Deaf respondents reported high rates of emotional abuse and much higher rates of forced sex than general population respondents. Physical abuse rates were comparable between groups. More men than women in both deaf samples reported past-year physical and sexual abuse. Past-year IPV was associated with higher utilization of hospital emergency services. Implications for IPV research, education, and intervention in the Deaf community are discussed. PMID:24142445
Anomalous waveforms observed in laboratory-formed gas hydrate-bearing and ice-bearing sediments
Lee, Myung W.; Waite, William F.
2011-01-01
Acoustic transmission measurements of compressional, P, and shear, S, wave velocities rely on correctly identifying the P- and S-body wave arrivals in the measured waveform. In cylindrical samples that are much longer than the acoustic wavelength, these body waves can be obscured by high-amplitude waveform features arriving just after the relatively small-amplitude P-body wave. In this study, a normal mode approach is used to analyze this type of waveform, observed in sediment containing gas hydrate or ice. This analysis extends an existing normal-mode waveform propagation theory by including the effects of the confining medium surrounding the sample, and provides guidelines for estimating S-wave velocities from waveforms containing multiple large-amplitude arrivals. PMID:21476628
Christner, Martin; Trusch, Maria; Rohde, Holger; Kwiatkowski, Marcel; Schlüter, Hartmut; Wolters, Manuel; Aepfelbacher, Martin; Hentschke, Moritz
2014-01-01
In 2011 northern Germany experienced a large outbreak of Shiga-toxigenic Escherichia coli O104:H4. The large number of samples sent to microbiology laboratories for epidemiological assessment highlighted the importance of fast and inexpensive typing procedures. We therefore evaluated the applicability of a MALDI-TOF mass spectrometry based strategy for outbreak strain identification. Specific peaks in the outbreak strain's spectrum were identified by comparative analysis of archived pre-outbreak spectra that had been acquired for routine species-level identification. Proteins underlying these discriminatory peaks were identified by liquid chromatography tandem mass spectrometry and validated against publicly available databases. The resulting typing scheme was evaluated against PCR genotyping with 294 E. coli isolates from clinical samples collected during the outbreak. Comparative spectrum analysis revealed two characteristic peaks at m/z 6711 and m/z 10883. The underlying proteins were found to be of low prevalence among genome-sequenced E. coli strains. Marker peak detection correctly classified 292 of 293 study isolates, including all 104 outbreak isolates. MALDI-TOF mass spectrometry allowed for reliable outbreak strain identification during a large outbreak of Shiga-toxigenic E. coli. The applied typing strategy could probably be adapted to other typing tasks and might facilitate epidemiological surveys as part of the routine pathogen identification workflow.
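A marker-peak classifier of this kind reduces to checking a peak list for the two discriminatory masses. The Python sketch below is a minimal illustration, not the authors' pipeline; the ±500 ppm tolerance and the toy peak list are assumptions.

```python
MARKERS = (6711.0, 10883.0)        # outbreak-associated marker peaks (m/z)
TOL_PPM = 500.0                    # assumed matching tolerance, +/- 500 ppm

def has_peak(peaks, target, tol_ppm=TOL_PPM):
    """True if any detected peak lies within tol_ppm of the target m/z."""
    return any(abs(mz - target) / target * 1e6 <= tol_ppm for mz in peaks)

def is_outbreak_type(peaks):
    """Call the outbreak type only when both marker peaks are present."""
    return all(has_peak(peaks, m) for m in MARKERS)

spectrum = [4365.2, 6711.8, 9741.0, 10884.1]   # toy peak list for one isolate
print(is_outbreak_type(spectrum))              # -> True
```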
Understanding resilience in same-sex parented families: the work, love, play study
Power, Jennifer J; Perlesz, Amaryll; Schofield, Margot J; Pitts, Marian K; Brown, Rhonda; McNair, Ruth; Barrett, Anna; Bickerdike, Andrew
2010-03-09
Background While families headed by same-sex couples have achieved greater public visibility in recent years, there are still many challenges for these families in dealing with legal and community contexts that are not supportive of same-sex relationships. The Work, Love, Play study is a large longitudinal study of same-sex parents. It aims to investigate many facets of family life among this sample and examine how they change over time. The study focuses specifically on two key areas missing from the current literature: factors supporting resilience in same-sex parented families; and health and wellbeing outcomes for same-sex couples who undergo separation, including the negotiation of shared parenting arrangements post-separation. The current paper aims to provide a comprehensive overview of the design and methods of this longitudinal study and discuss its significance. Methods/Design The Work, Love, Play study is a mixed design, three wave, longitudinal cohort study of same-sex attracted parents. The sample includes lesbian, gay, bisexual and transgender parents in Australia and New Zealand (including single parents within these categories) caring for any children under the age of 18 years. The study will be conducted over six years from 2008 to 2014. Quantitative data are to be collected via three on-line surveys in 2008, 2010 and 2012 from the cohort of parents recruited in Wave1. Qualitative data will be collected via interviews with purposively selected subsamples in 2012 and 2013. Data collection began in 2008 and 355 respondents to Wave One of the study have agreed to participate in future surveys. Work is currently underway to increase this sample size. The methods and survey instruments are described. Discussion This study will make an important contribution to the existing research on same-sex parented families. Strengths of the study design include the longitudinal method, which will allow understanding of changes over time within internal family relationships and social supports. Further, the mixed method design enables triangulation of qualitative and quantitative data. A broad recruitment strategy has already enabled a large sample size with the inclusion of both gay men and lesbians. PMID:20211027
NASA Astrophysics Data System (ADS)
Kuo, Yi-Ming; Liu, Wen-Wen
2015-04-01
The Han River basin is one of the most important industrial and grain production bases in central China. Many factories and towns have been established along the river, with large farmlands located nearby. In recent decades the water quality of the Han River, particularly in the middle and lower reaches, has gradually declined. Agricultural nonpoint pollution and municipal and industrial point pollution significantly degrade the water quality of the Han River. Factor analysis can reduce the dimensionality of a data set consisting of a large number of inter-related variables, and cluster analysis can classify samples according to their similar characteristics. In this study, factor analysis is used to identify major pollution indicators, and cluster analysis is employed to classify the samples based on sample locations and hydrochemical variables. Water samples were collected from 12 sites between Xiangyang City (middle Han River) and Wuhan City (lower Han River). Correlations among 25 hydrochemical variables were statistically examined, and the important pollutants were determined by factor analysis. A three-factor model was obtained that explains over 85% of the total variation in river water quality. Factor 1, including SS, Chl-a, TN and TP, can be interpreted as nonpoint-source pollution. Factor 2, including Cl-, Br-, SO42-, Ca2+, Mg2+, K+, Fe2+ and PO43-, can be attributed to industrial pollution. Factor 3, including F- and NO3-, reflects the influence of groundwater or the self-purification capability of the river water. The various land uses along the Han River correlate well with the pollution types. In addition, the results show that water quality deteriorates gradually from the middle to the lower Han River. Some tributaries are seriously polluted and significantly influence the water quality of the Han River mainstream. Finally, the results show that nonpoint and point pollution both significantly influence water quality in the middle and lower Han River. This study provides an effective method for watershed management and pollution control in the Han River.
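For readers who want the computational skeleton, a minimal Python sketch of the same two-step analysis is given below: z-scoring the variables, fitting a three-factor model, then clustering sites on their factor scores. The use of scikit-learn and the random stand-in data are assumptions; the study's 12-site by 25-variable matrix is not reproduced here.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import FactorAnalysis
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(size=(12, 25))      # stand-in: 12 sites x 25 hydrochemical variables

Xz = StandardScaler().fit_transform(X)              # z-score each variable
fa = FactorAnalysis(n_components=3, random_state=0).fit(Xz)
scores = fa.transform(Xz)                           # per-site factor scores
loadings = fa.components_                           # (3 x 25) variable loadings

# Group the sampling sites by similarity of their factor scores.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
print("site clusters:", labels)
```

Inspecting which variables load heavily on each factor is what supports the interpretation of factors as nonpoint, industrial, or groundwater influences.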
Trujillano, D; Ramos, M D; González, J; Tornador, C; Sotillo, F; Escaramis, G; Ossowski, S; Armengol, L; Casals, T; Estivill, X
2013-07-01
Here we have developed a novel and much more efficient strategy for the complete molecular characterisation of the cystic fibrosis (CF) transmembrane regulator (CFTR) gene, based on multiplexed targeted resequencing. We tested this approach in a cohort of 92 samples with previously characterised CFTR mutations and polymorphisms. After enrichment of the pooled barcoded DNA libraries with a custom NimbleGen SeqCap EZ Choice array (Roche) and sequencing with a HiSeq2000 (Illumina) sequencer, we applied several bioinformatics tools to call mutations and polymorphisms in CFTR. The combination of several bioinformatics tools allowed us to detect all known pathogenic variants (point mutations, short insertions/deletions, and large genomic rearrangements) and polymorphisms (including the poly-T and poly-thymidine-guanine polymorphic tracts) in the 92 samples. In addition, we report the precise characterisation of the breakpoints of seven genomic rearrangements in CFTR, including those of a novel deletion of exon 22 and of a complex 85 kb inversion that includes two large deletions affecting exons 4-8 and 12-21, respectively. This work is a proof of principle that targeted resequencing is an accurate and cost-effective approach for the genetic testing of CF and CFTR-related disorders (i.e., male infertility), amenable to routine clinical practice and ready to substitute classical molecular methods in medical genetics.
A Hybrid Semi-supervised Classification Scheme for Mining Multisource Geospatial Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vatsavai, Raju; Bhaduri, Budhendra L
2011-01-01
Supervised learning methods such as Maximum Likelihood (ML) are often used in land cover (thematic) classification of remote sensing imagery. The ML classifier relies exclusively on the spectral characteristics of thematic classes, whose statistical distributions (class-conditional probability densities) are often overlapping. The spectral response distributions of thematic classes depend on many factors including elevation, soil types, and ecological zones. A second problem with statistical classifiers is the requirement for a large number of accurate training samples (10 to 30 × |dimensions|), which are often costly and time-consuming to acquire over large geographic regions. With the increasing availability of geospatial databases, it is possible to exploit the knowledge derived from these ancillary datasets to improve classification accuracies even when the class distributions are highly overlapping. Likewise, newer semi-supervised techniques can be adopted to improve the parameter estimates of the statistical model by utilizing a large number of easily available unlabeled training samples. Unfortunately there is no convenient multivariate statistical model that can be employed for multisource geospatial databases. In this paper we present a hybrid semi-supervised learning algorithm that effectively exploits freely available unlabeled training samples from multispectral remote sensing images and also incorporates ancillary geospatial databases. We have conducted several experiments on real datasets, and our new hybrid approach shows a 25 to 35% improvement in overall classification accuracy over conventional classification schemes.
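One plausible reading of the semi-supervised step is an EM loop that seeds Gaussian class models from the small labelled set and then refines means, covariances, and priors with soft counts from the unlabelled pixels. The Python sketch below illustrates that idea only; it is not the authors' algorithm, and the ancillary-knowledge and multisource components are omitted entirely.

```python
import numpy as np
from scipy.stats import multivariate_normal as mvn

def semi_supervised_ml(X_lab, y_lab, X_unl, n_iter=20):
    """EM refinement of Gaussian maximum-likelihood class models.

    Class parameters are seeded from the labelled set, then updated with
    soft assignments (responsibilities) of the unlabelled samples.
    """
    classes = np.unique(y_lab)
    d = X_lab.shape[1]
    pri = np.array([np.mean(y_lab == c) for c in classes])
    mu = np.array([X_lab[y_lab == c].mean(axis=0) for c in classes])
    cov = np.array([np.cov(X_lab[y_lab == c].T) + 1e-6 * np.eye(d) for c in classes])
    X_all = np.vstack([X_lab, X_unl])
    for _ in range(n_iter):
        # E-step: class responsibilities for every unlabelled sample.
        dens = np.column_stack([pri[k] * mvn.pdf(X_unl, mu[k], cov[k])
                                for k in range(len(classes))])
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: pooled update over labelled (hard) + unlabelled (soft) weights.
        for k, c in enumerate(classes):
            w = np.concatenate([(y_lab == c).astype(float), resp[:, k]])
            mu[k] = (w[:, None] * X_all).sum(axis=0) / w.sum()
            diff = X_all - mu[k]
            cov[k] = (w[:, None] * diff).T @ diff / w.sum() + 1e-6 * np.eye(d)
            pri[k] = w.sum() / len(X_all)
    return pri, mu, cov

# Toy usage: two well-separated classes, few labels, many unlabelled pixels.
rng = np.random.default_rng(0)
X_lab = np.vstack([rng.normal(0, 1, (15, 4)), rng.normal(3, 1, (15, 4))])
y_lab = np.repeat([0, 1], 15)
X_unl = np.vstack([rng.normal(0, 1, (500, 4)), rng.normal(3, 1, (500, 4))])
pri, mu, cov = semi_supervised_ml(X_lab, y_lab, X_unl)
print(np.round(mu, 2))    # refined class means, pulled toward the true 0 and 3
```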
High-energy x-ray diffraction from surfaces and nanoparticles
NASA Astrophysics Data System (ADS)
Hejral, U.; Müller, P.; Shipilin, M.; Gustafson, J.; Franz, D.; Shayduk, R.; Rütt, U.; Zhang, C.; Merte, L. R.; Lundgren, E.; Vonk, V.; Stierle, A.
2017-11-01
High-energy surface-sensitive x-ray diffraction (HESXRD) is a powerful high-photon-energy technique (E > 70 keV) that has in recent years been shown to allow fast data acquisition for the 3D structure determination of surfaces and nanoparticles under in situ and operando conditions. The use of a large-area detector facilitates the direct collection of nearly distortion-free diffraction patterns over a wide q range, including crystal truncation rods perpendicular to the surface and large-area reciprocal-space maps from epitaxial nanoparticles, which is not possible in the conventional low-photon-energy approach (E = 10-20 keV). Here, we present a comprehensive mathematical approach explaining the working principle of HESXRD for both single-crystal surfaces and epitaxial nanostructures on single-crystal supports. The angular calculations used in conventional crystal truncation rod measurements at low photon energies are adapted for the high-photon-energy regime, illustrating why and to what extent large reciprocal-space areas can be probed in a stationary geometry with the sample rotation fixed. We discuss how imperfections such as mosaicity and finite domain size aid in sampling a substantial part of reciprocal space without the need of rotating the sample. An exact account is given of the area probed in reciprocal space using such a stationary mode, which is essential for in situ or operando time-resolved experiments on surfaces and nanostructures.
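The core of the geometric argument can be reproduced in a few lines: convert photon energy to wavelength and compute the largest momentum transfer subtended by the detector edge. The snippet below is illustrative; the detector half-width and sample-detector distance are assumed values, not any particular beamline's.

```python
import numpy as np

def q_max(energy_keV, det_half_width_m=0.2, det_dist_m=1.0):
    """Largest momentum transfer recorded at the detector edge (elastic case).

    q = (4*pi/lambda) * sin(theta), where 2*theta is the scattering angle
    subtended by the detector edge. Geometry values are assumptions.
    """
    lam = 12.398 / energy_keV                 # wavelength in Angstrom
    two_theta = np.arctan(det_half_width_m / det_dist_m)
    return 4 * np.pi / lam * np.sin(two_theta / 2)

for E in (15, 70):                            # conventional vs high-energy setup
    print(f"E = {E:2d} keV -> q_max ~ {q_max(E):.1f} 1/Angstrom")
```

With identical geometry the accessible q range scales roughly with photon energy, which is the quantitative content of the claim that a stationary large-area detector at E > 70 keV captures wide reciprocal-space maps in a single exposure.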
Forest cover, socioeconomics, and reported flood frequency in developing countries
NASA Astrophysics Data System (ADS)
Ferreira, Susana; Ghimire, Ramesh
2012-08-01
In this paper, we analyze the determinants of the number of large floods reported since 1990. Using the same sample of countries as Bradshaw et al. (2007), and, like them, omitting socioeconomic characteristics from the analysis, we found that a reduction in natural forest cover is associated with an increase in the reported count of large floods. This result does not hold in any of three new analyses we perform. First, we expand the sample to include all the developing countries and all countries for which data were available but were omitted in their study. Second, and more importantly, since forest management is just one possible channel through which humans can influence reported flood frequency, we account for other important human-flood interactions. People are typically responsible for deforestation, but they are also responsible for other land use changes (e.g., urbanization), for floodplain and flood emergency management, and for reporting the floods. Thus, in our analysis we account for population, urban population growth, income, and corruption. Third, we exploit the panel nature of the data to control for unobserved country and time heterogeneity. We conclude that not only is the link between forest cover and reported flood frequency at the country level not robust, it also seems to be driven by sample selection and omitted variable bias. The human impact on the reported frequency of large floods at the country level is not through deforestation.
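A count-model specification in the spirit of the third analysis (country and year fixed effects on reported flood counts) can be sketched with statsmodels. Everything below, including the variable names and the synthetic panel, is illustrative rather than the authors' data or exact estimator.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in panel: reported large-flood counts by country-year,
# with forest cover and socioeconomic controls (all names are illustrative).
rng = np.random.default_rng(1)
n_c, n_t = 40, 15
df = pd.DataFrame({
    "country": np.repeat(np.arange(n_c), n_t),
    "year": np.tile(np.arange(n_t), n_c),
    "forest": rng.uniform(5, 90, n_c * n_t),       # % natural forest cover
    "log_pop": rng.normal(16, 1, n_c * n_t),
    "urban_growth": rng.normal(2, 1, n_c * n_t),
    "log_gdp": rng.normal(8, 1, n_c * n_t),
})
df["floods"] = rng.poisson(1.5, len(df))           # reported flood counts

# Poisson count model; C(country) and C(year) dummies absorb unobserved
# country and time heterogeneity, as in the panel analysis described above.
model = smf.poisson(
    "floods ~ forest + log_pop + urban_growth + log_gdp + C(country) + C(year)",
    data=df,
).fit(disp=0)
print(model.params[["forest", "log_pop", "urban_growth", "log_gdp"]])
```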
NASA Astrophysics Data System (ADS)
Ferrari, Ulisse
A maximum entropy model provides the least constrained probability distribution that reproduces the experimental averages of a set of observables. In this work we characterize the learning dynamics that maximizes the log-likelihood in the case of large but finite datasets. We first show how the steepest descent dynamics is not optimal, as it is slowed down by the inhomogeneous curvature of the model parameter space. We then provide a way to rectify this space which relies only on dataset properties and does not require large computational efforts. We conclude by solving the long-time limit of the parameter dynamics, including the randomness generated by the systematic use of Gibbs sampling. In this stochastic framework, rather than converging to a fixed point, the dynamics reaches a stationary distribution, which for the rectified dynamics reproduces the posterior distribution of the parameters. We sum up all these insights in a "rectified" data-driven algorithm that is fast and, by sampling from the parameter posterior, avoids both under- and over-fitting along all directions of the parameter space. Through the learning of pairwise Ising models from recordings of a large population of retinal neurons, we show how our algorithm outperforms the steepest descent method. This research was supported by a Grant from the Human Brain Project (HBP CLAP).
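For concreteness, the sketch below implements the baseline the abstract starts from: steepest ascent on the log-likelihood of a pairwise Ising model, where the gradient is the gap between data and model moments and the model moments are estimated by Gibbs sampling. All sizes, data, and the learning rate are toy assumptions; the paper's rectification and posterior-sampling steps are intentionally not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)
n, n_data = 5, 2000
data = rng.choice([-1, 1], size=(n_data, n))        # stand-in binarised activity

def moments(S):
    """Means <s_i> and second moments <s_i s_j> of a sample matrix."""
    return S.mean(axis=0), (S.T @ S) / len(S)

m_data, C_data = moments(data)
h, J = np.zeros(n), np.zeros((n, n))                # fields and couplings

def gibbs_sample(h, J, n_samples=1000, burn=100):
    """Single-site Gibbs sampler for P(s) proportional to exp(h.s + s.J.s/2)."""
    s = rng.choice([-1, 1], size=n)
    out = []
    for t in range(burn + n_samples):
        for i in range(n):
            field = h[i] + J[i] @ s - J[i, i] * s[i]    # local field on spin i
            s[i] = 1 if rng.random() < 1 / (1 + np.exp(-2 * field)) else -1
        if t >= burn:
            out.append(s.copy())
    return np.array(out)

eta = 0.1
for step in range(30):                              # plain (un-rectified) ascent
    m_mod, C_mod = moments(gibbs_sample(h, J))
    h += eta * (m_data - m_mod)                     # dL/dh_i  = <s_i>_data - <s_i>_model
    dJ = eta * (C_data - C_mod)                     # dL/dJ_ij follows the same pattern
    np.fill_diagonal(dJ, 0.0)                       # diagonal is fixed (s_i^2 = 1)
    J += dJ
print("max moment mismatch:", np.abs(C_data - moments(gibbs_sample(h, J))[1]).max())
```

The rectification discussed in the abstract amounts to reparametrizing this update so the effective curvature is homogeneous, which speeds convergence without changing the fixed point.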
Onukwugha, Eberechukwu; Qi, Ran; Jayasekera, Jinani; Zhou, Shujia
2016-02-01
Prognostic classification approaches are commonly used in clinical practice to predict health outcomes. However, there has been limited focus on use of the general approach for predicting costs. We applied a grouping algorithm designed for large-scale data sets and multiple prognostic factors to investigate whether it improves cost prediction among older Medicare beneficiaries diagnosed with prostate cancer. We analysed the linked Surveillance, Epidemiology and End Results (SEER)-Medicare data, which included data from 2000 through 2009 for men diagnosed with incident prostate cancer between 2000 and 2007. We split the survival data into two data sets (D0 and D1) of equal size. We trained the classifier of the Grouping Algorithm for Cancer Data (GACD) on D0 and tested it on D1. The prognostic factors included cancer stage, age, race and performance status proxies. We calculated the average difference between observed D1 costs and predicted D1 costs at 5 years post-diagnosis with and without the GACD. The sample included 110,843 men with prostate cancer. The median age of the sample was 74 years, and 10% were African American. The average difference (mean absolute error [MAE]) per person between the real and predicted total 5-year cost was US$41,525 (MAE US$41,790; 95% confidence interval [CI] US$41,421-42,158) with the GACD and US$43,113 (MAE US$43,639; 95% CI US$43,062-44,217) without the GACD. The 5-year cost prediction without grouping resulted in a sample overestimate of US$79,544,508. The grouping algorithm developed for complex, large-scale data improves the prediction of 5-year costs. The prediction accuracy could be improved by utilization of a richer set of prognostic factors and refinement of categorical specifications.
van Doorn, Remco; Zoutman, Willem H; Dijkman, Remco; de Menezes, Renee X; Commandeur, Suzan; Mulder, Aat A; van der Velden, Pieter A; Vermeer, Maarten H; Willemze, Rein; Yan, Pearlly S; Huang, Tim H; Tensen, Cornelis P
2005-06-10
To analyze the occurrence of promoter hypermethylation in primary cutaneous T-cell lymphoma (CTCL) on a genome-wide scale, focusing on epigenetic alterations with pathogenetic significance. DNA isolated from biopsy specimens of 28 patients with CTCL, including aggressive CTCL entities (transformed mycosis fungoides and CD30-negative large T-cell lymphoma) and an indolent entity (CD30-positive large T-cell lymphoma), were investigated. For genome-wide DNA methylation screening, differential methylation hybridization using CpG island microarrays was applied, which allows simultaneous detection of the methylation status of 8640 CpG islands. Bisulfite sequence analysis was applied for confirmation and detection of hypermethylation of eight selected tumor suppressor genes. The DNA methylation patterns of CTCLs emerging from differential methylation hybridization analysis included 35 CpG islands hypermethylated in at least four of the 28 studied CTCL samples when compared with benign T-cell samples. Hypermethylation of the putative tumor suppressor genes BCL7a (in 48% of CTCL samples), PTPRG (27%), and thrombospondin 4 (52%) was confirmed and demonstrated to be associated with transcriptional downregulation. BCL7a was hypermethylated at a higher frequency in aggressive (64%) than in indolent (14%) CTCL entities. In addition, the promoters of the selected tumor suppressor genes p73 (48%), p16 (33%), CHFR (19%), p15 (10%), and TMS1 (10%) were hypermethylated in CTCL. Malignant T cells of patients with CTCL display widespread promoter hypermethylation associated with inactivation of several tumor suppressor genes involved in DNA repair, cell cycle, and apoptosis signaling pathways. In view of this, CTCL may be amenable to treatment with demethylating agents.
Sampling Large Graphs for Anticipatory Analytics
Edwards, Lauren; Johnson, Luke; Milosavljevic, Maja; Gadepally, Vijay; Miller, Benjamin A.
2015-05-15
Random area sampling [8] is a "snowball" sampling method in which a set of random seed vertices are selected and areas... ...systems, greater human-in-the-loop involvement, or through complex algorithms. We are investigating the use of sampling to mitigate these challenges.
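As a concrete reference point, a random-area (snowball) sample of the kind the fragment describes can be written in a few lines of Python with networkx; the seed count, hop radius, and synthetic graph are illustrative choices, not the report's experimental setup.

```python
import random
import networkx as nx

def random_area_sample(G, n_seeds=5, hops=2, seed=0):
    """'Random area' (snowball) sample: pick random seed vertices, then keep
    every vertex within a fixed number of hops of each seed."""
    rng = random.Random(seed)
    seeds = rng.sample(list(G.nodes), n_seeds)
    keep = set()
    for s in seeds:
        keep |= set(nx.ego_graph(G, s, radius=hops).nodes)   # the "area" around s
    return G.subgraph(keep).copy()

G = nx.barabasi_albert_graph(10_000, 3, seed=42)   # stand-in for a large graph
S = random_area_sample(G)
print(S.number_of_nodes(), "of", G.number_of_nodes(), "vertices sampled")
```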
Energy Landscape of All-Atom Protein-Protein Interactions Revealed by Multiscale Enhanced Sampling
Moritsugu, Kei; Terada, Tohru; Kidera, Akinori
2014-01-01
Protein-protein interactions are regulated by a subtle balance of complicated atomic interactions and solvation at the interface. To understand such an elusive phenomenon, it is necessary to thoroughly survey the large configurational space from the stable complex structure to the dissociated states using the all-atom model in explicit solvent, and to delineate the energy landscape of protein-protein interactions. In this study, we carried out a multiscale enhanced sampling (MSES) simulation of the formation of a barnase-barstar complex, a protein complex characterized by extraordinarily tight and fast binding, to determine the energy landscape of atomistic protein-protein interactions. MSES adopts a multicopy and multiscale scheme to enable enhanced sampling of the all-atom model of large proteins including explicit solvent. During the 100-ns MSES simulation of the barnase-barstar system, we observed the association-dissociation processes of the atomistic protein complex in solution several times, which contained not only the native complex structure but also fully non-native configurations. The sampled distributions suggest that a large variety of non-native states went downhill to the stable complex structure, like fast folding on a funnel-like potential. This funnel landscape is attributed to dominant configurations in the early stage of the association process characterized by near-native orientations, which accelerate the native inter-molecular interactions. These configurations are guided mostly by the shape complementarity between barnase and barstar, and lead to the fast formation of the final complex structure along the downhill energy landscape. PMID:25340714
Wilsmore, Bradley R; Grunstein, Ronald R; Fransen, Marlene; Woodward, Mark; Norton, Robyn; Ameratunga, Shanthi
2013-06-15
To determine the relationship between sleep complaints, primary insomnia, excessive daytime sleepiness, and lifestyle factors in a large community-based sample. Cross-sectional study. Blood donor sites in New Zealand. 22,389 individuals aged 16-84 years volunteering to donate blood. A comprehensive self-administered questionnaire including personal demographics and validated questions assessing sleep disorders (snoring, apnea), sleep complaints (sleep quantity, sleep dissatisfaction), insomnia symptoms, excessive daytime sleepiness, mood, and lifestyle factors such as work patterns, smoking, alcohol, and illicit substance use. Additionally, direct measurements of height and weight were obtained. One in three participants report < 7-8 h sleep, 5 or more nights per week, and 60% would like more sleep. Almost half the participants (45%) report suffering the symptoms of insomnia at least once per week, with one in five meeting more stringent criteria for primary insomnia. Excessive daytime sleepiness (evident in 9% of this large, predominantly healthy sample) was associated with insomnia (odds ratio [OR] 1.75, 95% confidence interval [CI] 1.50 to 2.05), depression (OR 2.01, CI 1.74 to 2.32), and sleep-disordered breathing (OR 1.92, CI 1.59 to 2.32). Long work hours, alcohol dependence, and rotating work shifts also increase the risk of daytime sleepiness. Even in this relatively young, healthy, non-clinical sample, sleep complaints and primary insomnia with subsequent excess daytime sleepiness were common. There were clear associations between many personal and lifestyle factors (such as depression, long work hours, alcohol dependence, and rotating shift work) and sleep problems or excessive daytime sleepiness.
The impact of sample size on the reproducibility of voxel-based lesion-deficit mappings.
Lorca-Puls, Diego L; Gajardo-Vidal, Andrea; White, Jitrachote; Seghier, Mohamed L; Leff, Alexander P; Green, David W; Crinion, Jenny T; Ludersdorfer, Philipp; Hope, Thomas M H; Bowman, Howard; Price, Cathy J
2018-07-01
This study investigated how sample size affects the reproducibility of findings from univariate voxel-based lesion-deficit analyses (e.g., voxel-based lesion-symptom mapping and voxel-based morphometry). Our effect of interest was the strength of the mapping between brain damage and speech articulation difficulties, as measured in terms of the proportion of variance explained. First, we identified a region of interest by searching on a voxel-by-voxel basis for brain areas where greater lesion load was associated with poorer speech articulation using a large sample of 360 right-handed English-speaking stroke survivors. We then randomly drew thousands of bootstrap samples from this data set that included either 30, 60, 90, 120, 180, or 360 patients. For each resample, we recorded effect size estimates and p values after conducting exactly the same lesion-deficit analysis within the previously identified region of interest and holding all procedures constant. The results show (1) how often small effect sizes in a heterogeneous population fail to be detected; (2) how effect size and its statistical significance varies with sample size; (3) how low-powered studies (due to small sample sizes) can greatly over-estimate as well as under-estimate effect sizes; and (4) how large sample sizes (N ≥ 90) can yield highly significant p values even when effect sizes are so small that they become trivial in practical terms. The implications of these findings for interpreting the results from univariate voxel-based lesion-deficit analyses are discussed. Copyright © 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.
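The resampling design is easy to emulate. The Python sketch below bootstraps subsamples of increasing size from a synthetic cohort with a weak true lesion-deficit correlation and tracks effect size (variance explained) and significance; the effect of r ≈ 0.15 and all other numbers are stand-ins, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
N = 360
lesion = rng.normal(size=N)                     # lesion load per patient
deficit = 0.15 * lesion + rng.normal(size=N)    # weak true effect, r ~ 0.15

for n in (30, 60, 90, 120, 180, 360):
    r2, sig = [], 0
    for _ in range(2000):                       # bootstrap resamples of size n
        idx = rng.integers(0, N, size=n)
        r, p = stats.pearsonr(lesion[idx], deficit[idx])
        r2.append(r ** 2)                       # proportion of variance explained
        sig += p < 0.05
    q25, q75 = np.percentile(r2, [25, 75])
    print(f"n={n:3d}: median R2={np.median(r2):.3f} "
          f"IQR=[{q25:.3f}, {q75:.3f}] detection rate={sig / 2000:.2f}")
```

The wide interquartile ranges at small n, and the high detection rates for trivial effects at large n, reproduce the qualitative pattern the paper reports.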
Reid, Beth; Ho, Shirley; Padmanabhan, Nikhil; ...
2015-11-17
The Baryon Oscillation Spectroscopic Survey (BOSS), part of the Sloan Digital Sky Survey (SDSS) III project, has provided the largest survey of galaxy redshifts available to date, in terms of both the number of galaxy redshifts measured by a single survey, and the effective cosmological volume covered. Key to analysing the clustering of these data to provide cosmological measurements is understanding the detailed properties of this sample. Potential issues include variations in the target catalogue caused by changes either in the targeting algorithm or properties of the data used, the pattern of spectroscopic observations, the spatial distribution of targets for which redshifts were not obtained, and variations in the target sky density due to observational systematics. We document here the target selection algorithms used to create the galaxy samples that comprise BOSS. We also present the algorithms used to create large-scale structure catalogues for the final Data Release (DR12) samples and the associated random catalogues that quantify the survey mask. The algorithms are an evolution of those used by the BOSS team to construct catalogues from earlier data, and have been designed to accurately quantify the galaxy sample. Furthermore, the code used, designated mksample, is released with this paper.
Wu, Chenglin; de Miranda, Noel Fcc; Chen, Longyun; Wasik, Agata M; Mansouri, Larry; Jurczak, Wojciech; Galazka, Krystyna; Dlugosz-Danecka, Monika; Machaczka, Maciej; Zhang, Huilai; Peng, Roujun; Morin, Ryan D; Rosenquist, Richard; Sander, Birgitta; Pan-Hammarström, Qiang
2016-06-21
The genetic mechanisms underlying disease progression, relapse and therapy resistance in mantle cell lymphoma (MCL) remain largely unknown. Whole-exome sequencing was performed on 27 MCL samples from 13 patients, representing the largest analyzed series of consecutive biopsies obtained at diagnosis and/or relapse for this type of lymphoma. Eighteen genes were found to be recurrently mutated in these samples, including known (ATM, MEF2B and MLL2) and novel mutation targets (S1PR1 and CARD11). CARD11, a scaffold protein required for B-cell receptor (BCR)-induced NF-κB activation, was subsequently screened in an additional 173 MCL samples, and mutations were observed in 5.5% of cases. In in vitro cell line-based experiments, overexpression of CARD11 mutants was demonstrated to confer resistance to the BCR inhibitor ibrutinib and the NF-κB inhibitor lenalidomide. Genetic alterations acquired in the relapse samples were found to be largely non-recurrent, in line with the branched evolutionary pattern of clonal evolution observed in most cases. In summary, this study highlights the genetic heterogeneity of MCL, in particular at relapse, and provides for the first time genetic evidence of BCR/NF-κB activation in a subset of MCL.
MicroRNA signatures in B-cell lymphomas
Di Lisio, L; Sánchez-Beato, M; Gómez-López, G; Rodríguez, M E; Montes-Moreno, S; Mollejo, M; Menárguez, J; Martínez, M A; Alves, F J; Pisano, D G; Piris, M A; Martínez, N
2012-01-01
Accurate lymphoma diagnosis, prognosis and therapy still require additional markers. We explore the potential relevance of microRNA (miRNA) expression in a large series that included all major B-cell non-Hodgkin lymphoma (NHL) types. The data generated were also used to identify miRNAs differentially expressed in Burkitt lymphoma (BL) and diffuse large B-cell lymphoma (DLBCL) samples. A series of 147 NHL samples and 15 controls were hybridized on a human miRNA one-color platform containing probes for 470 human miRNAs. Each lymphoma type was compared against the entire set of NHLs. BL was also directly compared with DLBCL, and 43 preselected miRNAs were analyzed in a new series of routinely processed samples of 28 BLs and 43 DLBCLs using quantitative reverse transcription-polymerase chain reaction. A signature of 128 miRNAs enabled the characterization of lymphoma neoplasms, reflecting lymphoma type, cell of origin and/or discrete oncogene alterations. Comparative analysis of BL and DLBCL yielded 19 differentially expressed miRNAs, which were confirmed in a second series of 71 paraffin-embedded samples. The set of differentially expressed miRNAs found here expands the range of potential diagnostic markers for lymphoma diagnosis, especially when differential diagnosis of BL and DLBCL is required. PMID:22829247
WATER QUALITY MONITORING OF PHARMACEUTICALS ...
The demand on freshwater to sustain the needs of the growing population is of worldwide concern. Often this water is used, treated, and released for reuse by other communities. The anthropogenic contaminants present in this water may include complex mixtures of pesticides, prescription and nonprescription drugs, personal care and common consumer products, industrial and domestic-use materials, and degradation products of these compounds. Although the fate of these pharmaceuticals and personal care products (PPCPs) in wastewater treatment facilities is largely unknown, the limited data that do exist suggest that many of these chemicals survive treatment and some others are returned to their biologically active form via deconjugation of metabolites. Traditional water sampling methods (i.e., grab or composite samples) often require the concentration of large amounts of water to detect trace levels of PPCPs. A passive sampler, the polar organic chemical integrative sampler (POCIS), has been developed to integratively concentrate the trace levels of these chemicals, determine the time-weighted average water concentrations, and provide a method of estimating the potential exposure of aquatic organisms to these complex mixtures of waterborne contaminants. The POCIS (U.S. Patent number 6,478,961) consists of a hydrophilic microporous membrane, acting as a semipermeable barrier, enveloping various solid-phase sorbents that retain the sampled chemicals. Sampling rates f
Predictive value of callous-unemotional traits in a large community sample.
Moran, Paul; Rowe, Richard; Flach, Clare; Briskman, Jacqueline; Ford, Tamsin; Maughan, Barbara; Scott, Stephen; Goodman, Robert
2009-11-01
Callous-unemotional (CU) traits in children and adolescents are increasingly recognized as a distinctive dimension of prognostic importance in clinical samples. Nevertheless, comparatively little is known about the longitudinal effects of these personality traits on the mental health of young people from the general population. Using a large representative sample of children and adolescents living in Great Britain, we set out to examine the effects of CU traits on a range of mental health outcomes measured 3 years after the initial assessment. Parents were interviewed to determine the presence of CU traits in a representative sample of 7,636 children and adolescents. The parents also completed the Strengths and Difficulties Questionnaire, a broad measure of childhood psychopathology. Three years later, parents repeated the Strengths and Difficulties Questionnaire. At 3-year follow-up, CU traits were associated with conduct, hyperactivity, emotional, and total symptom scores. After adjusting for the effects of all covariates, including baseline symptom score, CU traits remained robustly associated with the overall levels of conduct problems and emotional problems and with total psychiatric difficulties at 3-year follow-up. Callous-unemotional traits are independently associated with future psychiatric difficulties in children and adolescents. An assessment of CU traits adds small but significant improvements to the prediction of future psychopathology.
Zucker, Kenneth J; Blanchard, Ray; Kim, Tae-Suk; Pae, Chi-Un; Lee, Chul
2007-10-01
Two biodemographic variables, birth order and sibling sex ratio, have been examined in several Western samples of homosexual transsexual men. The results have consistently shown that homosexual transsexuals have a later birth order and come from sibships with an excess of brothers to sisters; the excess of brothers has been largely driven by the number of older brothers and hence has been termed the fraternal birth order effect. In the present study the birth order and sibling sex ratio were examined in an Asian sample of 43 homosexual transsexual men and 49 heterosexual control men from South Korea. Although the transsexual men had a significantly late birth order, so did the control men. Unlike Western samples, the Korean transsexuals had a significant excess of sisters, not brothers, as did the control men, and this was largely accounted for by older sisters. It is concluded that a male-preference stopping rule governing parental reproductive behavior had a strong impact on these two biodemographic variables. Future studies that examine birth order and sibling sex ratio in non-Western samples of transsexuals need to be vigilant for the influential role of stopping rules, including the one identified in the present study.
Neurocognitive impairment in a large sample of homeless adults with mental illness.
Stergiopoulos, V; Cusi, A; Bekele, T; Skosireva, A; Latimer, E; Schütz, C; Fernando, I; Rourke, S B
2015-04-01
This study examines neurocognitive functioning in a large, well-characterized sample of homeless adults with mental illness and assesses demographic and clinical factors associated with neurocognitive performance. A total of 1500 homeless adults with mental illness enrolled in the At Home Chez Soi study completed neuropsychological measures assessing speed of information processing, memory, and executive functioning. Sociodemographic and clinical data were also collected. Linear regression analyses were conducted to examine factors associated with neurocognitive performance. Approximately half of our sample met criteria for psychosis, major depressive disorder, and alcohol or substance use disorder, and nearly half had experienced severe traumatic brain injury. Overall, 72% of participants demonstrated cognitive impairment, including deficits in processing speed (48%), verbal learning (71%) and recall (67%), and executive functioning (38%). The overall statistical model explained 19.8% of the variance in the neurocognitive summary score, with reduced neurocognitive performance associated with older age, lower education, first language other than English or French, Black or Other ethnicity, and the presence of psychosis. Homeless adults with mental illness experience impairment in multiple neuropsychological domains. Much of the variance in our sample's cognitive performance remains unexplained, highlighting the need for further research in the mechanisms underlying cognitive impairment in this population. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Salmonella testing of pooled pre-enrichment broth cultures for screening multiple food samples.
Price, W R; Olsen, R A; Hunter, J E
1972-04-01
A method has been described for testing multiple food samples for Salmonella without loss in sensitivity. The method pools multiple pre-enrichment broth cultures into single enrichment broths. The subsequent stages of the Salmonella analysis are not altered. The method was found applicable to several dry food materials including nonfat dry milk, dried egg albumin, cocoa, cottonseed flour, wheat flour, and shredded coconut. As many as 25 pre-enrichment broth cultures were pooled without apparent loss in the sensitivity of Salmonella detection as compared to individual sample analysis. The procedure offers a simple, yet effective, way to increase sample capacity in the Salmonella testing of foods, particularly where a large proportion of samples ordinarily is negative. It also permits small portions of pre-enrichment broth cultures to be retained for subsequent individual analysis if positive tests are found. Salmonella testing of pooled pre-enrichment broths provides increased consumer protection for a given amount of analytical effort as compared to individual sample analysis.
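The arithmetic behind the capacity gain is the classic two-stage pooling calculation: one test per pool, plus individual retests of every member of a positive pool. The sketch below, with an assumed 1% prevalence and perfect test sensitivity (consistent with the paper's finding of no sensitivity loss at pools of up to 25), shows the expected saving.

```python
def expected_tests_per_sample(pool_size, prevalence):
    """Two-stage (Dorfman-style) pooling: one test per pool, then individual
    retests of every member of a positive pool. Assumes a perfect test."""
    p_pool_positive = 1 - (1 - prevalence) ** pool_size
    return 1 / pool_size + p_pool_positive

# Individual testing costs exactly 1 test/sample; compare pooled designs
# at an assumed 1% prevalence of positive samples.
for k in (5, 10, 25):
    e = expected_tests_per_sample(k, prevalence=0.01)
    print(f"pool of {k:2d}: {e:.3f} tests/sample "
          f"({(1 - e) * 100:.0f}% fewer than individual testing)")
```

At 1% prevalence a pool of 25 needs roughly a quarter of the tests of individual analysis, which is why pooling pays off precisely when a large proportion of samples is ordinarily negative.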
Floyd A. Johnson
1961-01-01
This report assumes a knowledge of the principles of point sampling as described by Grosenbaugh, Bell and Alexander, and others. Whenever trees are counted at every point in a sample of points (large sample) and measured for volume at a portion (small sample) of these points, the sampling design could be called ratio double sampling. If the large...
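A numerical sketch of the estimator implied above (counts at all points, volumes at a subsample, and the volume-to-count ratio applied to the large-sample mean count) may help; the Poisson counts and per-tree volumes below are fabricated for illustration only.

```python
import numpy as np

rng = np.random.default_rng(7)

# Phase 1 (large sample): tree counts at every point.
counts_large = rng.poisson(6, size=400)

# Phase 2 (small subsample): volume also measured at some of those points.
idx = rng.choice(400, size=50, replace=False)
counts_small = counts_large[idx]
volumes_small = counts_small * rng.normal(0.45, 0.05, size=50)   # m^3 per point

# Ratio double-sampling estimate of mean volume per point:
# (volume-to-count ratio from the small sample) x (large-sample mean count).
ratio = volumes_small.sum() / counts_small.sum()
v_hat = ratio * counts_large.mean()
print(f"estimated mean volume per point: {v_hat:.2f} m^3")
```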
NASA Astrophysics Data System (ADS)
Mackey, A. D.; Gilmore, G. F.
2003-01-01
We have compiled a pseudo-snapshot data set of two-colour observations from the Hubble Space Telescope archive for a sample of 53 rich LMC clusters with ages of 10^6-10^10 yr. We present surface brightness profiles for the entire sample, and derive structural parameters for each cluster, including core radii, and luminosity and mass estimates. Because we expect the results presented here to form the basis for several further projects, we describe in detail the data reduction and surface brightness profile construction processes, and compare our results with those of previous ground-based studies. The surface brightness profiles show a large amount of detail, including irregularities in the profiles of young clusters (such as bumps, dips and sharp shoulders), and evidence for both double clusters and post-core-collapse (PCC) clusters. In particular, we find power-law profiles in the inner regions of several candidate PCC clusters, with slopes of approximately -0.7, but showing considerable variation. We estimate that 20 +/- 7 per cent of the old cluster population of the Large Magellanic Cloud (LMC) has entered PCC evolution, a similar fraction to that for the Galactic globular cluster system. In addition, we examine the profile of R136 in detail and show that it is probably not a PCC cluster. We also observe a trend in core radius with age that has been discovered and discussed in several previous publications by different authors. Our diagram has better resolution, however, and appears to show a bifurcation at several hundred Myr. We argue that this observed relationship reflects true physical evolution in LMC clusters, with some experiencing small-scale core expansion owing to mass loss, and others large-scale expansion owing to some unidentified characteristic or physical process.
Bionimbus: a cloud for managing, analyzing and sharing large genomics datasets.
Heath, Allison P; Greenway, Matthew; Powell, Raymond; Spring, Jonathan; Suarez, Rafael; Hanley, David; Bandlamudi, Chai; McNerney, Megan E; White, Kevin P; Grossman, Robert L
2014-01-01
As large genomics and phenotypic datasets are becoming more common, it is increasingly difficult for most researchers to access, manage, and analyze them. One possible approach is to provide the research community with several petabyte-scale cloud-based computing platforms containing these data, along with tools and resources to analyze them. Bionimbus is an open source cloud-computing platform that is based primarily upon OpenStack, which manages on-demand virtual machines that provide the required computational resources, and GlusterFS, which is a high-performance clustered file system. Bionimbus also includes Tukey, which is a portal, and associated middleware that provides a single entry point and a single sign on for the various Bionimbus resources; and Yates, which automates the installation, configuration, and maintenance of the software infrastructure required. Bionimbus is used by a variety of projects to process genomics and phenotypic data. For example, it is used by an acute myeloid leukemia resequencing project at the University of Chicago. The project requires several computational pipelines, including pipelines for quality control, alignment, variant calling, and annotation. For each sample, the alignment step requires eight CPUs for about 12 h. BAM file sizes ranged from 5 GB to 10 GB for each sample. Most members of the research community have difficulty downloading large genomics datasets and obtaining sufficient storage and computer resources to manage and analyze the data. Cloud computing platforms, such as Bionimbus, with data commons that contain large genomics datasets, are one choice for broadening access to research data in genomics. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Phenotyping of subjects for large scale studies on patients with IBS.
Boeckxstaens, G E; Drug, V; Dumitrascu, D; Farmer, A D; Hammer, J; Hausken, T; Niesler, B; Pohl, D; Pojskic, L; Polster, A; Simren, M; Goebel-Stengel, M; Van Oudenhove, L; Vassallo, M; Wensaas, K-A; Aziz, Q; Houghton, L A
2016-08-01
Irritable bowel syndrome (IBS) is a complex condition with multiple factors contributing to its aetiology and pathophysiology. Aetiologically these include genetics, life-time events and environment; physiologically, changes in motility, central processing, visceral sensitivity, immunity, epithelial permeability and gastrointestinal microflora. Such complexity means there is currently no specific reliable biomarker for IBS, and thus IBS continues to be diagnosed and classified according to symptom-based criteria, the Rome Criteria. Careful phenotyping and characterisation of a 'large' pool of IBS patients across Europe, and even the world, might however help identify sub-populations with accuracy and consistency. This will not only aid future research but also improve the tailoring of treatment and health care of IBS patients. The aim of this position paper is to discuss the requirements necessary to standardize the process of selecting and phenotyping IBS patients and how to organise the collection and storage of patient information/samples in such a large multi-centre pan-European/global study. We include information on general demographics, gastrointestinal symptom assessment, psychological factors, quality of life, physiological evaluation, genetic/epigenetic and microbiota analysis, and biopsy/blood sampling, together with discussion of the organisational, ethical and language issues associated with implementing such a study. The proposed approach and the documents selected for use in such a study were the result of a thoughtful and thorough four-year dialogue amongst experts associated with the European COST action BM1106 GENIEUR (www.GENIEUR.eu). © 2016 John Wiley & Sons Ltd.
Emery, R J
1997-03-01
Institutional radiation safety programs routinely use wipe test sampling and liquid scintillation counting analysis to indicate the presence of removable radioactive contamination. Significant volumes of liquid waste can be generated by such surveillance activities, and the subsequent disposal of these materials can sometimes be difficult and costly. In settings where large numbers of negative results are regularly obtained, limited grouping of samples for analysis, based on expected-value statistical techniques, is possible. To demonstrate the plausibility of the approach, single wipe samples exposed to varying amounts of contamination were analyzed concurrently with nine non-contaminated samples. Although sample grouping inevitably leads to increased quenching with liquid scintillation counting systems, the effect did not impact the ability to detect removable contamination in amounts well below recommended action levels. Opportunities to further improve this cost-effective semi-quantitative screening procedure are described, including improvements in sample collection procedures, enhancing sample-counting media contact through mixing and extending elution periods, increasing sample counting times, and adjusting institutional action levels.
Use of the ecf1 gene to detect Shiga toxin-producing Escherichia coli in beef samples.
Livezey, Kristin W; Groschel, Bettina; Becker, Michael M
2015-04-01
Escherichia coli O157:H7 and six serovars (O26, O103, O121, O111, O145, and O45) are frequently implicated in severe clinical illness worldwide. Standard testing methods using stx, eae, and O serogroup-specific gene sequences for detecting the top six non-O157 STEC bear the disadvantage that these genes may reside, independently, in different nonpathogenic organisms, leading to false-positive results. The ecf operon has previously been identified in the large enterohemolysin-encoding plasmid of eae-positive Shiga toxin-producing E. coli (STEC). Here, we explored the utility of the ecf operon as a single marker to detect eae-positive STEC from pure broth and primary meat enrichments. Analysis of 501 E. coli isolates demonstrated a strong correlation (99.6%) between the presence of the ecf1 gene and the combined presence of stx, eae, and ehxA genes. Two large studies were carried out to determine the utility of an ecf1 detection assay to detect non-O157 STEC strains in enriched meat samples in comparison to the results using the U.S. Department of Agriculture Food Safety and Inspection Service (FSIS) method that detects stx and eae genes. In ground beef samples (n = 1,065), the top six non-O157 STEC were detected in 4.0% of samples by an ecf1 detection assay and in 5.0% of samples by the stx- and eae-based method. In contrast, in beef samples composed largely of trim (n = 1,097), the top six non-O157 STEC were detected at 1.1% by both methods. Estimation of false-positive rates among the top six non-O157 STEC revealed a lower rate using the ecf1 detection method (0.5%) than using the eae and stx screening method (1.1%). Additionally, the ecf1 detection assay detected STEC strains associated with severe illness that are not included in the FSIS regulatory definition of adulterant STEC.
Understanding the role of conscientiousness in healthy aging: where does the brain come in?
Patrick, Christopher J
2014-05-01
In reviewing this impressive series of articles, I was struck by two points in particular: (a) the empirically oriented articles focused on analyses of data from very large samples, with the articles by Friedman, Kern, Hampson, and Duckworth (2014) and Kern, Hampson, Goldberg, and Friedman (2014) highlighting an approach to merging existing data sets through the use of "metric bridges" to address key questions not addressable through one data set alone; and (b) the articles as a whole included limited mention of neuroscientific (i.e., brain research) concepts, methods, and findings. One likely reason for the lack of reference to brain-oriented work is the persisting gap between smaller-sample lab-experimental and larger-sample multivariate-correlational approaches to psychological research. As a strategy for addressing this gap and bringing a distinct neuroscientific component to the National Institute on Aging's conscientiousness and health initiative, I suggest that the metric bridging approach highlighted by Friedman and colleagues could be used to connect existing large-scale data sets containing both neurophysiological variables and measures of individual difference constructs to other data sets containing richer arrays of nonphysiological variables, including data from longitudinal or twin studies focusing on personality and health-related outcomes (e.g., the Terman Life Cycle study and Hawaii longitudinal studies, as described in the article by Kern et al., 2014). (PsycINFO Database Record (c) 2014 APA, all rights reserved).
A NASTRAN primer for the analysis of rotating flexible blades
NASA Technical Reports Server (NTRS)
Lawrence, Charles; Aiello, Robert A.; Ernst, Michael A.; Mcgee, Oliver G.
1987-01-01
This primer provides documentation for using MSC NASTRAN in analyzing rotating flexible blades. The analysis of these blades includes geometrically nonlinear (large displacement) analysis under centrifugal loading, and frequency and mode shape (normal modes) determination. The geometrically nonlinear analysis using NASTRAN Solution Sequence 64 is discussed, along with the determination of frequencies and mode shapes using Solution Sequence 63. A sample problem with the complete NASTRAN input data is included. Items unique to rotating blade analyses, such as setting angle and centrifugal softening effects, are emphasized.
Using Moss to Detect Fine-Scaled Deposition of Heavy Metals in Urban Environments
NASA Astrophysics Data System (ADS)
Jovan, S.; Donovan, G.; Demetrios, G.; Monleon, V. J.; Amacher, M. C.
2017-12-01
Mosses are commonly used as bio-indicators of heavy metal deposition to forests. Their application in urban airsheds is relatively rare. Our objective was to develop fine-scaled, city-wide maps for heavy metals in Portland, Oregon, to identify pollution "hotspots" and serve as a screening tool for more effective placement of air quality monitoring instruments. In 2013 we measured twenty-two elements in epiphytic moss sampled on a 1 km × 1 km sampling grid (n = 346). We detected large hotspots of cadmium and arsenic in two neighborhoods associated with stained glass manufacturers. Air instruments deployed by local regulators measured cadmium concentrations 49 times and arsenic levels 155 times the state health benchmarks. Moss maps also detected a large nickel hotspot in a neighborhood near a forge where air instruments later measured concentrations 4 times the health benchmark. In response, the facilities implemented new pollution controls, air quality improved in all three affected neighborhoods, revisions of the regulations for stained glass furnace emissions are underway, and Oregon's governor launched an initiative to develop health-based (vs. technology-based) regulations for air toxics in the state. The moss maps also indicated a couple dozen smaller hotspots of heavy metals, including lead, chromium, and cobalt, in Portland neighborhoods. Ongoing follow-up work includes: 1) use of moss sampling by local regulators to investigate the source and extent of the smaller hotspots, 2) use of lead isotopes to determine the origins of the higher lead levels observed in moss collected from the inner city, and 3) co-location of air instruments and moss sampling to determine the accuracy, timeframe represented, and seasonality of heavy metals in moss.
[Epidemiological study of cytopenia among benzene-exposed workers and its influential factors].
Peng, Juan-juan; Liu, Mei-xia; Yang, Feng; Guo, Wei-wei; Zhuang, Ran; Jia, Xian-dong
2013-03-01
To evaluate the benzene exposure level and cytopenia among benzene-exposed workers in Shanghai, China, and to analyze the factors influencing the health of benzene-exposed workers. A total of 3314 benzene-exposed workers, who were from 85 benzene-related enterprises selected by stratified random sampling based on enterprise sizes and industries, were included in the study. The time-weighted average (TWA) concentration of benzene in each workshop was measured by individual sampling and fixed-point sampling, and the benzene exposure level in the workshop was evaluated accordingly. The occupational health examination results and health status of benzene-exposed workers were collected. The median of TWA concentrations of benzene was 0.3 mg/m3. The TWA concentrations measured at 7 (1.4%) of the 504 sampling points were above the safety limit. Of the 7 points, 3 were from large enterprises, 2 from medium enterprises, and 2 from small enterprises; 3 were from the shipbuilding industry, 1 from the chemical industry, and 3 from light industry. Of the 3314 benzene-exposed workers, 451 (13.6%) had cytopenia, including 339 males (339/2548, 13.3%) and 112 females (112/766, 14.6%). There were significant differences in the incidence rates of leukopenia and neutropenia among benzene-exposed workers of different sexes and ages (P<0.05); there were significant differences in the incidence rate of cytopenia among benzene-exposed workers of different ages and working years (P<0.05); and there were significant differences in the incidence of neutropenia among benzene-exposed workers of different working years (P<0.05). Monitoring and intervention measures should be enhanced to protect benzene-exposed workers in large enterprises in the shipbuilding industry and in medium and private enterprises in the chemical industry from occupational hazards.
NASA Astrophysics Data System (ADS)
Lo Faro, B.; Silva, L.; Franceschini, A.; Miller, N.; Efstathiou, A.
2015-03-01
We complement our previous analysis of a sample of z ∼ 1-2 luminous and ultraluminous infrared galaxies [(U)LIRGs] by adding deep Very Large Array radio observations at 1.4 GHz to a large data set from the far-UV to the submillimetre, including Spitzer and Herschel data. Given the relatively small number of (U)LIRGs in our sample with high signal-to-noise (S/N) radio data, and to extend our study to a different family of galaxies, we also include six well-sampled near-infrared (near-IR)-selected BzK galaxies at z ∼ 1.5. From our analysis based on the radiative transfer spectral synthesis code GRASIL, we find that, while the IR luminosity may be a biased tracer of the star formation rate (SFR) depending on the age of the stars dominating the dust heating, the inclusion of the radio flux offers significantly tighter constraints on the SFR. Our predicted SFRs are in good agreement with estimates based on the rest-frame radio luminosity and the Bell calibration. The extensive spectrophotometric coverage of our sample allows us to set important constraints on the star formation (SF) history of individual objects. For essentially all galaxies, we find evidence for a rather continuous SFR and a peak epoch of SF preceding the epoch of observation by a few Gyr. This seems to correspond to a formation redshift of z ∼ 5-6. We finally show that our physical analysis may affect the interpretation of the SFR-M⋆ diagram, possibly shifting, with respect to previous works, the position of the most dust-obscured objects to higher M⋆ and lower SFRs.
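For orientation, the radio SFR estimate referenced here is conventionally computed from the rest-frame 1.4 GHz luminosity. The sketch below uses the functional form and constants commonly quoted for the Bell calibration (Bell 2003); treat them as assumptions rather than a transcription of this paper's pipeline.

```python
# Hedged sketch: SFR from rest-frame 1.4 GHz luminosity using the
# functional form commonly quoted for the Bell (2003) radio calibration.
# Constants are as usually cited, not taken from this paper.

L_C = 6.4e21  # W/Hz; below this, radio underestimates SFR in the calibration

def sfr_radio_bell03(L_14GHz_W_Hz):
    """Star formation rate in Msun/yr from 1.4 GHz luminosity (W/Hz)."""
    if L_14GHz_W_Hz > L_C:
        return 5.52e-22 * L_14GHz_W_Hz
    # Suppression factor applied to low-luminosity galaxies
    return 5.52e-22 * L_14GHz_W_Hz / (0.1 + 0.9 * (L_14GHz_W_Hz / L_C) ** 0.3)

print(sfr_radio_bell03(1e23))  # ~55 Msun/yr for a luminous IR galaxy
```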
Cuadros-Inostroza, Alvaro; Caldana, Camila; Redestig, Henning; Kusano, Miyako; Lisec, Jan; Peña-Cortés, Hugo; Willmitzer, Lothar; Hannah, Matthew A
2009-12-16
Metabolite profiling, the simultaneous quantification of multiple metabolites in an experiment, is becoming increasingly popular, particularly with the rise of systems-level biology. The workhorse in this field is gas-chromatography hyphenated with mass spectrometry (GC-MS). The high-throughput of this technology coupled with a demand for large experiments has led to data pre-processing, i.e. the quantification of metabolites across samples, becoming a major bottleneck. Existing software has several limitations, including restricted maximum sample size, systematic errors and low flexibility. However, the biggest limitation is that the resulting data usually require extensive hand-curation, which is subjective and can typically take several days to weeks. We introduce the TargetSearch package, an open source tool which is a flexible and accurate method for pre-processing even very large numbers of GC-MS samples within hours. We developed a novel strategy to iteratively correct and update retention time indices for searching and identifying metabolites. The package is written in the R programming language with computationally intensive functions written in C for speed and performance. The package includes a graphical user interface to allow easy use by those unfamiliar with R. TargetSearch allows fast and accurate data pre-processing for GC-MS experiments and overcomes the sample number limitations and manual curation requirements of existing software. We validate our method by carrying out an analysis against both a set of known chemical standard mixtures and of a biological experiment. In addition we demonstrate its capabilities and speed by comparing it with other GC-MS pre-processing tools. We believe this package will greatly ease current bottlenecks and facilitate the analysis of metabolic profiling data.
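The retention-time-index bookkeeping that TargetSearch iterates can be illustrated, at its simplest, as a linear interpolation of observed retention times against a ladder of marker compounds (the van den Dool and Kratz convention). Below is a minimal Python sketch of that calculation only; the package itself is written in R and C, and the marker values here are hypothetical.

```python
import bisect

# Minimal sketch of retention-index (RI) calculation by linear
# interpolation between retention-time markers (van den Dool & Kratz
# convention for temperature-programmed GC). TargetSearch iterates a
# correction of this kind; the marker values below are hypothetical.

def retention_index(rt, marker_rts, marker_ris):
    """Map a retention time (s) to an RI using the bracketing markers."""
    i = bisect.bisect_left(marker_rts, rt)
    i = min(max(i, 1), len(marker_rts) - 1)        # clamp to a valid bracket
    t0, t1 = marker_rts[i - 1], marker_rts[i]
    r0, r1 = marker_ris[i - 1], marker_ris[i]
    return r0 + (r1 - r0) * (rt - t0) / (t1 - t0)  # linear interpolation

# Hypothetical n-alkane markers: RI 1000, 1200, 1400 at these times (s)
markers_t = [310.0, 420.0, 545.0]
markers_ri = [1000.0, 1200.0, 1400.0]
print(retention_index(365.0, markers_t, markers_ri))  # -> 1100.0
```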
Automated High-Throughput Permethylation for Glycosylation Analysis of Biologics Using MALDI-TOF-MS.
Shubhakar, Archana; Kozak, Radoslaw P; Reiding, Karli R; Royle, Louise; Spencer, Daniel I R; Fernandes, Daryl L; Wuhrer, Manfred
2016-09-06
Monitoring glycoprotein therapeutics for changes in glycosylation throughout the drug's life cycle is vital, as glycans significantly modulate the stability, biological activity, serum half-life, safety, and immunogenicity. Biopharma companies are increasingly adopting Quality by Design (QbD) frameworks for measuring, optimizing, and controlling drug glycosylation. Permethylation of glycans prior to analysis by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) is a valuable tool for glycan characterization and for screening of large numbers of samples in QbD drug realization. However, the existing protocols for manual permethylation and liquid-liquid extraction (LLE) steps are labor intensive and are thus not practical for high-throughput (HT) studies. Here we present a glycan permethylation protocol, based on 96-well microplates, that has been developed into a kit suitable for HT work. The workflow is largely automated using a liquid handling robot and includes N-glycan release, enrichment of N-glycans, permethylation, and LLE. The kit has been validated according to industry analytical performance guidelines and applied to characterize biopharmaceutical samples, including IgG4 monoclonal antibodies (mAbs) and recombinant human erythropoietin (rhEPO). The HT permethylation enabled glycan characterization and relative quantitation with minimal side reactions: the MALDI-TOF-MS profiles obtained were in good agreement with hydrophilic liquid interaction chromatography (HILIC) and ultrahigh performance liquid chromatography (UHPLC) data. Automated permethylation and extraction of 96 glycan samples were achieved in less than 5 h, and automated data acquisition on MALDI-TOF-MS took on average less than 1 min per sample. This automated, HT glycan preparation and permethylation workflow proved convenient, fast, and reliable, and can be applied for drug glycan profiling and clinical glycan biomarker studies.
NASA Astrophysics Data System (ADS)
Kotler, J.; Hinman, N. W.; Yan, B.; Stoner, D. L.; Scott, J. R.
2006-12-01
The jarosite group minerals have received increasing attention since the discovery by the Mars Exploration Rover Opportunity of jarosite on the Martian surface. The general chemical formula for jarosite is XFe3(SO4)2(OH)6, where X represents both monovalent and divalent cations that can occupy the axial positions in the crystal structure. Commonly found ions include K+, Na+, H3O+, NH4+, and Pb2+, with reports in the literature of other large ions occupying this position. Modeling efforts have been performed to confirm that jarosite has the ability to incorporate a variety of "foreign" cations. The mineral's unique ability to incorporate various large ions in its structure, and its association with biological activity in terrestrial environments, has led to investigations regarding its use as an indicator of aqueous and/or biological activity. The use of laser desorption Fourier transform mass spectrometry (LD-FTMS) has revealed the presence of organic matter, including the amino acid glycine, in several jarosite samples from various worldwide locations. Iron precipitates derived from acidophilic microbial cultures were also analyzed. Using attenuated total reflectance infrared spectroscopy (ATR-IR), signals indicative of microbes or microbial exudates were weak and ambiguous. In contrast, LD-FTMS clearly detected bioorganic constituents in some desorption spots. However, the signals were sporadic and required the laser scanning/imaging capability of our laboratory-built system to locate the microbial signatures in the heterogeneous samples. The ability to observe these bioorganic signatures in jarosite samples using the instrumental technique employed in this study furthers the goals of planetary geologists to determine whether signs of life (e.g., presence of biomolecules or biomolecule precursors) can be detected in the rock record of terrestrial and extraterrestrial samples.
A Large Catalog of Multiwavelength GRB Afterglows. I. Color Evolution and Its Physical Implication
NASA Astrophysics Data System (ADS)
Li, Liang; Wang, Yu; Shao, Lang; Wu, Xue-Feng; Huang, Yong-Feng; Zhang, Bing; Ryde, Felix; Yu, Hoi-Fung
2018-02-01
The spectrum of gamma-ray burst (GRB) afterglows can be studied with color indices. Here, we present a large comprehensive catalog of 70 GRBs with multiwavelength optical transient data on which we perform a systematic study to find the temporal evolution of color indices. We categorize them into two samples based on how well the color indices are evaluated. The Golden sample includes 25 bursts mostly observed by GROND, and the Silver sample includes 45 bursts observed by other telescopes. For the Golden sample, we find that 96% of the color indices do not vary over time. However, the color indices do vary during short periods in most bursts. The observed variations are consistent with effects of (i) the cooling frequency crossing the studied energy bands in a wind medium (43%) and in a constant-density medium (30%), (ii) early dust extinction (12%), (iii) transition from reverse-shock to forward-shock emission (5%), or (iv) an emergent SN emission (10%). We also study the evolutionary properties of the mean color indices for different emission episodes. We find that 86% of the color indices in the 70 bursts show constancy between consecutive ones. The color index variations occur mainly during the late GRB–SN bump, the flare, and early reverse-shock emission components. We further perform a statistical analysis of various observational properties and model parameters (spectral indices β_o^CI, electron spectral indices p^CI, etc.) using color indices. Overall, we conclude that ∼90% of colors are constant in time and can be accounted for by the simplest external forward-shock model, while the varying color indices call for more detailed modeling.
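As a reminder of why constant color indices map onto a non-evolving spectrum: for a power-law afterglow spectrum, the color index between two bands is linear in the spectral index. The relation below is the standard one, not a formula quoted from the paper.

```latex
% Color index between two bands for a power-law spectrum F_nu ~ nu^{-beta}
\[
\mathrm{CI} \equiv m_{\nu_1} - m_{\nu_2}
  = -2.5\,\log_{10}\frac{F_{\nu_1}}{F_{\nu_2}} + \mathrm{const}
  = 2.5\,\beta\,\log_{10}\frac{\nu_1}{\nu_2} + \mathrm{const},
\]
% so a time-constant CI is equivalent to a non-evolving spectral index beta.
```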
Pedersen, C B; Bybjerg-Grauholm, J; Pedersen, M G; Grove, J; Agerbo, E; Bækvad-Hansen, M; Poulsen, J B; Hansen, C S; McGrath, J J; Als, T D; Goldstein, J I; Neale, B M; Daly, M J; Hougaard, D M; Mors, O; Nordentoft, M; Børglum, A D; Werge, T; Mortensen, P B
2018-01-01
The Integrative Psychiatric Research (iPSYCH) consortium has established a large Danish population-based Case–Cohort sample (iPSYCH2012) aimed at unravelling the genetic and environmental architecture of severe mental disorders. The iPSYCH2012 sample is nested within the entire Danish population born between 1981 and 2005, including 1 472 762 persons. This paper introduces the iPSYCH2012 sample and outlines key future research directions. Cases were identified as persons with schizophrenia (N=3540), autism (N=16 146), attention-deficit/hyperactivity disorder (N=18 726) and affective disorder (N=26 380), of which 1928 had bipolar affective disorder. Controls were randomly sampled individuals (N=30 000). Within the sample of 86 189 individuals, a total of 57 377 individuals had at least one major mental disorder. DNA was extracted from the neonatal dried blood spot samples obtained from the Danish Neonatal Screening Biobank and genotyped using the Illumina PsychChip. Genotyping was successful for 90% of the sample. The assessments of exome sequencing, methylation profiling, metabolome profiling, vitamin-D, inflammatory and neurotrophic factors are in progress. For each individual, the iPSYCH2012 sample also includes longitudinal information on health, prescribed medicine, social and socioeconomic information, and analogous information among relatives. To the best of our knowledge, the iPSYCH2012 sample is the largest and most comprehensive data source for the combined study of genetic and environmental aetiologies of severe mental disorders. PMID:28924187
Erdal, Erik P; Mitra, Debanjali; Khangulov, Victor S; Church, Stephen; Plokhoy, Elizabeth
2017-03-01
Background: Despite advances in clinical chemistry testing, poor blood sample quality continues to impact laboratory operations and the quality of results. While previous studies have identified the preanalytical causes of lower sample quality, few studies have examined the economic impact of poor sample quality on the laboratory. Specifically, the costs associated with workarounds related to fibrin and gel contaminants remain largely unexplored. Methods: A quantitative survey of clinical chemistry laboratory stakeholders across 10 international regions, including countries in North America, Europe and Oceania, was conducted to examine current blood sample testing practices, sample quality issues and practices to remediate poor sample quality. Survey data were used to estimate costs incurred by laboratories to mitigate sample quality issues. Results: Responses from 164 participants were included in the analysis, which was focused on three specific issues: fibrin strands, fibrin masses and gel globules. Fibrin strands were the most commonly reported issue, with an overall incidence rate of ∼3%. Further, 65% of respondents indicated that these issues contribute to analyzer probe clogging, and the majority of laboratories had visual inspection and manual remediation practices in place to address fibrin- and gel-related quality problems (55% and 70%, respectively). Probe maintenance/replacement, visual inspection and manual remediation were estimated to carry significant costs for the laboratories surveyed. The annual cost associated with lower sample quality and remediation related to fibrin and/or gel globules for an average US laboratory was estimated to be $100,247. Conclusions: Measures to improve blood sample quality present an important step towards improved laboratory operations.
Rúgeles, Laura Cristina; Bai, Jing; Martínez, Aída Juliana; Vanegas, María Consuelo; Gómez-Duarte, Oscar Gilberto
2010-01-01
The prevalence of diarrheagenic E. coli in childhood diarrhea and the role of contaminated food products in disease transmission in Colombia are largely unknown. The aim of this study is to identify E. coli pathotypes, including E. coli O157:H7, from 108 stool samples from children with acute diarrhea, 38 meat samples, and 38 vegetable samples. Multiplex PCR and the DuPont BAX system were used for E. coli pathotype detection. Eighteen (9.8%) diarrheagenic E. coli pathotypes were detected among all clinical and food product samples tested. Four different pathotypes were identified from clinical samples: enteroaggregative E. coli, enterotoxigenic E. coli, Shiga toxin-producing E. coli, and enteropathogenic E. coli. Food product samples were positive for enteroaggregative and Shiga toxin-producing E. coli, suggesting that meat and vegetables may be involved in transmission of these E. coli pathotypes in the community. Most E. coli strains identified belong to the phylogenetic groups A and B1, known to be associated with intestinal rather than extraintestinal E. coli clones. Our data provide the first molecular confirmation of E. coli pathotypes circulating in Colombia among children with diarrhea and in food products for human consumption. Implementation of multiplex PCR technology in Latin America and other countries with limited resources may provide an important epidemiological tool for the surveillance of E. coli pathotypes from clinical isolates as well as from water and food product samples. PMID:20153069
Laboratory theory and methods for sediment analysis
Guy, Harold P.
1969-01-01
The diverse character of fluvial sediments makes the choice of laboratory analysis somewhat arbitrary and the processing of sediment samples difficult. This report presents some theories and methods used by the Water Resources Division for analysis of fluvial sediments to determine the concentration of suspended-sediment samples and the particle-size distribution of both suspended-sediment and bed-material samples. Other analyses related to these determinations may include particle shape, mineral content, and specific gravity, the organic matter and dissolved solids of samples, and the specific weight of soils. The merits and techniques of both the evaporation and filtration methods for concentration analysis are discussed. Methods used for particle-size analysis of suspended-sediment samples may include the sieve pipet, the VA tube-pipet, or the BW tube-VA tube, depending on the equipment available, the concentration and approximate size of sediment in the sample, and the settling medium used. The choice of method for most bed-material samples is usually limited to procedures suitable for sand or to some type of visual analysis for large sizes. Several tested forms are presented to help ensure a well-ordered system in the laboratory to handle the samples, to help determine the kind of analysis required for each, to conduct the required processes, and to assist in the required computations. Use of the manual should further 'standardize' methods of fluvial sediment analysis among the many laboratories and thereby help to achieve uniformity and precision of the data.
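As a worked example of the concentration determination (by either the evaporation or the filtration method), suspended-sediment concentration is conventionally expressed in parts per million by mass, i.e., the ratio of dry-sediment mass to the mass of the water-sediment mixture. The sample masses below are hypothetical:

```python
# Minimal sketch: suspended-sediment concentration in parts per million
# (ppm) by mass, the conventional ratio of dry-sediment mass to the mass
# of the water-sediment mixture. The masses below are hypothetical.

def ssc_ppm(dry_sediment_g, mixture_g):
    """Concentration in ppm = 1e6 * sediment mass / mixture mass."""
    return 1e6 * dry_sediment_g / mixture_g

# Example: 0.350 g of dry sediment recovered from a 352.4 g sample.
print(round(ssc_ppm(0.350, 352.4)))  # -> 993 ppm
```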
The small-scale treatability study sample exemption
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coalgate, J.
1991-01-01
In 1981, the Environmental Protection Agency (EPA) issued an interim final rule that conditionally exempted waste samples collected "solely for the purpose of monitoring or testing to determine their characteristics or composition" from RCRA Subtitle C hazardous waste regulations. This exemption (40 CFR 261.4(d)) applies to the transportation of samples between the generator and the testing laboratory, temporary storage of samples at the laboratory prior to and following testing, and storage at a laboratory for specific purposes such as an enforcement action. However, the exclusion did not include large-scale samples used in treatability studies or other testing at pilot plants or other experimental facilities. As a result of comments received subsequent to the issuance of the interim final rule, the EPA reopened the comment period on September 18, 1987, and specifically requested comments on whether or not the sample exclusion should be expanded to include waste samples used in small-scale treatability studies. Almost all respondents commented favorably on such a proposal. As a result, the EPA issued a final rule (53 FR 27290, July 19, 1988) conditionally exempting waste samples used in small-scale treatability studies from full regulation under Subtitle C of RCRA. The question of whether or not to extend the exclusion to larger scales, as proposed by the Hazardous Waste Treatment Council, was deferred until a later date. This Information Brief summarizes the requirements of the small-scale treatability exemption.
Lam, Raymond W; Wolinsky, Debra; Kinsella, Cynthia; Woo, Cindy; Cayley, Paula M; Walker, Anne B
2012-11-01
To determine the prevalence and characteristics of clients with depression attending an employee assistance program (EAP). Anonymized data were obtained from 10,794 consecutive clients, including 9105 employees, self-referred to PPC Canada, a large, external EAP. Assessment measures included the self-rated nine-item Patient Health Questionnaire (PHQ-9). Clinical characteristics of depressed clients (PHQ-9 score ≥ 10) were compared with those of nondepressed clients. Thirty-seven percent of the employee sample met PHQ-9 criteria for clinically significant depression. Compared with clients without depression, they had significantly higher rates of anxiety, psychotropic medication use, problem substance use, global problems with functioning, absenteeism, impairment in work-related tasks, and low job satisfaction. A large proportion of EAP clients were clinically depressed with associated negative effects on personal and occupational functioning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Broman, D.; Axelman, J.; Bandh, C.
In order to study the fate and occurrence of two groups of hydrophobic compounds in the Baltic aquatic environment, a large number of samples were collected from the southern Baltic proper to the northern Bothnian Bay for the analysis of polychlorinated biphenyls (PCBs) and polycyclic aromatic hydrocarbons (PAHs). The following sample matrices were collected: bottom surface sediments (0-1 cm, collected with a gravity corer), settling particulate matter (collected with sediment traps), open-water and over-water samples (suspended particulates and dissolved fraction sampled by filtration), and air samples (aerosols and vapor phase sampled by filtration). All samples (except over-water and air) were collected at open sea in the Baltic. The analytical results have been used to develop a model of the whole Baltic and to elucidate different aspects of the behavior of PAHs and PCBs there, such as the occurrence of the compounds in water and sediment, and the total content as well as the concentration variability over such a large geographical area. Further, the data on settling particulate matter and the air concentration data were used to estimate the total fluxes of PAHs and PCBs to the bottoms of the Baltic and to the total water area of the Baltic, respectively. Data on the PAH and PCB content in river water from four major rivers provide rough estimates of the riverine input to the Baltic. The dynamics of PAHs and PCBs within the water mass have also been studied in terms of settling velocities and residence times for these types of compounds in the open Baltic.
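The residence times mentioned at the end follow, in the simplest box-model reading, from dividing the standing inventory of a particle-bound compound in the water column by its settling flux. A minimal sketch with hypothetical numbers (not the authors' actual model):

```python
# Minimal box-model sketch of the residence time of a particle-bound
# contaminant in the water column: standing inventory divided by the
# settling flux out of it. All numbers below are hypothetical.

def residence_time_days(conc_ng_m3, depth_m, settling_flux_ng_m2_d):
    """Inventory per unit area (ng/m2) over settling flux (ng/m2/day)."""
    inventory = conc_ng_m3 * depth_m          # ng/m2 of water column
    return inventory / settling_flux_ng_m2_d  # days

# Example: 50 ng/m3 over a 60 m water column, trap flux 100 ng/m2/day
print(residence_time_days(50.0, 60.0, 100.0))  # -> 30.0 days
```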
DOE Office of Scientific and Technical Information (OSTI.GOV)
Romero, Vicente; Bonney, Matthew; Schroeder, Benjamin
When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: the central 95% of response, and a 10^-4 probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depend on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large database and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.
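One concrete (and deliberately simple) member of this family of approaches is the two-sided normal tolerance interval, which inflates the sample standard deviation so the interval covers the central 95% of the response with a stated confidence. The sketch below assumes rough normality and uses the common chi-square (Howe) approximation; it is one plausible instance of the class of methods studied, not the report's specific recipe.

```python
# Sketch of one simple sparse-sample bounding device: a two-sided normal
# tolerance interval covering the central 95% of a response with 95%
# confidence, via the common chi-square (Howe) approximation. This is a
# plausible instance of the class of methods studied, not the report's
# specific recipe, and it assumes the data are roughly normal.

import numpy as np
from scipy import stats

def tolerance_interval(x, coverage=0.95, confidence=0.95):
    n = len(x)
    mean, sd = np.mean(x), np.std(x, ddof=1)
    z = stats.norm.ppf((1 + coverage) / 2)
    chi2 = stats.chi2.ppf(1 - confidence, df=n - 1)   # lower chi2 quantile
    k = z * np.sqrt((n - 1) * (1 + 1 / n) / chi2)     # inflation factor
    return mean - k * sd, mean + k * sd

rng = np.random.default_rng(0)
x = rng.normal(10.0, 2.0, size=5)   # only five samples available
print(tolerance_interval(x))        # conservative bound on the central 95%
```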
Imaging System and Method for Biomedical Analysis
2013-03-11
...biological particles and items of interest. Broadly, Padmanabhan et al. utilize the diffraction of a laser light source in flow cytometry to count... spread of light from multiple LED devices over the entire sample surface. Preferably, light source 308 projects a full-spectrum white light. Light... for example, red blood cells, white blood cells (which may include lymphocytes, which are relatively large and easily detectable), T-helper cells
ERIC Educational Resources Information Center
Springer, Matthew G.; Pepper, Matthew J.; Ghosh-Dastidar, Bonnie
2009-01-01
This study examines the effect of supplemental educational services (SES) on student test score gains and whether particular subgroups of students benefit more from No Child Left Behind (NCLB) tutoring services. The sample used includes information on students enrolled in 3rd through 8th grades nested in 121 elementary and middle schools over a…
ERIC Educational Resources Information Center
Roberts, Donald F.
A study examined media use patterns among a large, nationally representative sample of children ages 2-18, and which explored how children choose and interact with the whole array of media available to them, including television, movies, computers, music, video games, radio, magazines, books, and newspapers. The goal was to provide a solid base…
Paul L. Patterson; Sara A. Goeking
2012-01-01
The annual forest inventory of New Mexico began as an accelerated inventory, and 8 of the 10 Phase 2 panels were sampled between 2008 and 2011. The inventory includes a large proportion of nonresponse. FIA's estimation process uses post-stratification and assumes that nonresponse occurs at random within each stratum. We construct an estimator for the New Mexico...
Stan T. Lebow; Daniel Foster
2005-01-01
A study was conducted to evaluate the environmental accumulation and mobility of total copper, chromium, and arsenic adjacent to a chromated copper arsenate (CCA-C)-treated wetland boardwalk. The study was considered a severe test because it included a large volume of treated wood at a site with high annual rainfall. Soil and sediment samples were collected before...