Sample records for methods representative samples

  1. 40 CFR Appendix I to Part 261 - Representative Sampling Methods

    Code of Federal Regulations, 2010 CFR

    2010-07-01

... 40 Protection of Environment 25, 2010-07-01. Appendix I—Representative Sampling Methods. The methods and equipment used for sampling waste materials will vary with the form and consistency of the waste materials to be sampled. Samples collected using the sampling...

  2. 40 CFR Appendix I to Part 261 - Representative Sampling Methods

    Code of Federal Regulations, 2011 CFR

    2011-07-01

... 40 Protection of Environment 26, 2011-07-01. Appendix I—Representative Sampling Methods. The methods and equipment used for sampling waste materials will vary with the form and consistency of the waste materials to be sampled. Samples collected using the sampling...

  3. 40 CFR Appendix I to Part 261 - Representative Sampling Methods

    Code of Federal Regulations, 2013 CFR

    2013-07-01

... 40 Protection of Environment 27, 2013-07-01. Appendix I to Part 261—Representative Sampling Methods. ENVIRONMENTAL PROTECTION AGENCY (CONTINUED), SOLID WASTES... The methods and equipment used for sampling waste materials will vary with the...

  4. Representativeness of direct observations selected using a work-sampling equation.

    PubMed

    Sharp, Rebecca A; Mudford, Oliver C; Elliffe, Douglas

    2015-01-01

    Deciding on appropriate sampling to obtain representative samples of behavior is important but not straightforward, because the relative duration of the target behavior may affect its observation in a given sampling interval. Work-sampling methods, which offer a way to adjust the frequency of sampling according to a priori or ongoing estimates of the behavior to achieve a preselected level of representativeness, may provide a solution. Full-week observations of 7 behaviors were conducted for 3 students with autism spectrum disorder and intellectual disabilities. Work-sampling methods were used to select momentary time samples from the full time-of-interest, which produced representative samples. However, work sampling required impractically high numbers of time samples to obtain representative samples. More practical momentary time samples produced less representative samples, particularly for low-duration behaviors. The utility and limits of work-sampling methods for applied behavior analysis are discussed. © Society for the Experimental Analysis of Behavior.
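Momentary time sampling, the scheme evaluated above, can be sketched as follows; the session length, behavior intervals, and number of probes are hypothetical, and the spread of the estimate illustrates why low-duration behaviors need many samples:

```python
import random

def momentary_time_sample(intervals, times):
    """Estimate the fraction of a session a behavior occupies by
    checking whether it is occurring at each sampled instant."""
    def occurring(t):
        return any(start <= t < end for start, end in intervals)
    return sum(occurring(t) for t in times) / len(times)

random.seed(0)
# Hypothetical behavior occupying 60 s of a 600 s session (10%).
behavior = [(100, 130), (400, 430)]
probes = [random.uniform(0, 600) for _ in range(2000)]
est = momentary_time_sample(behavior, probes)
print(round(est, 2))  # close to the true duration proportion, 0.10
```

Fewer probes widen the sampling error around the true proportion, which is the representativeness trade-off the study quantifies.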

  5. Respondent-Driven Sampling with Hard-to-Reach Emerging Adults: An Introduction and Case Study with Rural African Americans

    ERIC Educational Resources Information Center

    Kogan, Steven M.; Wejnert, Cyprian; Chen, Yi-fu; Brody, Gene H.; Slater, LaTrina M.

    2011-01-01

    Obtaining representative samples from populations of emerging adults who do not attend college is challenging for researchers. This article introduces respondent-driven sampling (RDS), a method for obtaining representative samples of hard-to-reach but socially interconnected populations. RDS combines a prescribed method for chain referral with a…

  6. CREPT-MCNP code for efficiency calibration of HPGe detectors with the representative point method.

    PubMed

    Saegusa, Jun

    2008-01-01

The representative point method for the efficiency calibration of volume samples has been previously proposed. To facilitate implementation of the method, a calculation code named CREPT-MCNP has been developed. The code estimates the position of the representative point, which is intrinsic to each shape of volume sample. Self-absorption correction factors are also given to correct the efficiencies measured at the representative point with a standard point source. Features of the CREPT-MCNP code are presented.

  7. Appearance-based representative samples refining method for palmprint recognition

    NASA Astrophysics Data System (ADS)

    Wen, Jiajun; Chen, Yan

    2012-07-01

Sparse representation can deal with the lack-of-samples problem because it utilizes all the training samples. However, the discrimination ability degrades when more training samples are used for representation. We propose a novel appearance-based palmprint recognition method that seeks a compromise between discrimination ability and the lack-of-samples problem so as to obtain a proper representation scheme. Under the assumption that the test sample can be well represented by a linear combination of a certain number of training samples, we first select the representative training samples according to their contributions. Then we further refine the training samples by an iterative procedure, each time excluding the training sample with the least contribution to the test sample. Experiments on the PolyU multispectral palmprint database and the two-dimensional and three-dimensional palmprint database show that the proposed method outperforms conventional appearance-based palmprint recognition methods. Moreover, we also explore the principles for setting the key parameters of the proposed algorithm, which facilitates obtaining high recognition accuracy.
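The iterative refinement step can be sketched as below; as a simplification, a sample's contribution is approximated by its normalized inner product with the test sample rather than the paper's representation coefficients, and the tiny vectors are hypothetical:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    return sum(x * x for x in a) ** 0.5

def refine_training_set(train, test, keep):
    """Iteratively drop the training sample contributing least to the
    test sample until `keep` samples remain. Contribution is
    approximated here by the normalized inner product with the test
    sample (a stand-in for the paper's representation coefficients)."""
    pool = list(train)
    while len(pool) > keep:
        scores = [abs(dot(s, test)) / norm(s) for s in pool]
        pool.pop(scores.index(min(scores)))
    return pool

test_vec = [1.0, 0.9, 0.1]
train = [[1.0, 1.0, 0.0],   # similar to the test sample
         [0.0, 0.1, 1.0],   # dissimilar
         [0.9, 0.8, 0.2]]   # similar
refined = refine_training_set(train, test_vec, keep=2)
print(len(refined))  # 2; the dissimilar sample is removed first
```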

  8. Method and apparatus for data sampling

    DOEpatents

    Odell, Daniel M. C.

    1994-01-01

    A method and apparatus for sampling radiation detector outputs and determining event data from the collected samples. The method uses high speed sampling of the detector output, the conversion of the samples to digital values, and the discrimination of the digital values so that digital values representing detected events are determined. The high speed sampling and digital conversion is performed by an A/D sampler that samples the detector output at a rate high enough to produce numerous digital samples for each detected event. The digital discrimination identifies those digital samples that are not representative of detected events. The sampling and discrimination also provides for temporary or permanent storage, either serially or in parallel, to a digital storage medium.
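A minimal sketch of the digital discrimination step, assuming a simple fixed baseline and threshold (the patent's actual discrimination logic may differ):

```python
def discriminate(samples, baseline, threshold):
    """Flag digitized detector samples as event candidates when they
    exceed the baseline by more than the discrimination threshold;
    all other samples are treated as non-event and can be discarded."""
    return [i for i, v in enumerate(samples) if v - baseline > threshold]

# Hypothetical digitized trace: mostly baseline noise with one pulse.
trace = [100, 101, 99, 140, 180, 150, 102, 100]
events = discriminate(trace, baseline=100, threshold=20)
print(events)  # [3, 4, 5]: indices of samples representing a detected event
```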

  9. Strengths and weaknesses of temporal stability analysis for monitoring and estimating grid-mean soil moisture in a high-intensity irrigated agricultural landscape

    NASA Astrophysics Data System (ADS)

    Ran, Youhua; Li, Xin; Jin, Rui; Kang, Jian; Cosh, Michael H.

    2017-01-01

    Monitoring and estimating grid-mean soil moisture is very important for assessing many hydrological, biological, and biogeochemical processes and for validating remotely sensed surface soil moisture products. Temporal stability analysis (TSA) is a valuable tool for identifying a small number of representative sampling points to estimate the grid-mean soil moisture content. This analysis was evaluated and improved using high-quality surface soil moisture data that were acquired by a wireless sensor network in a high-intensity irrigated agricultural landscape in an arid region of northwestern China. The performance of the TSA was limited in areas where the representative error was dominated by random events, such as irrigation events. This shortcoming can be effectively mitigated by using a stratified TSA (STSA) method, proposed in this paper. In addition, the following methods were proposed for rapidly and efficiently identifying representative sampling points when using TSA. (1) Instantaneous measurements can be used to identify representative sampling points to some extent; however, the error resulting from this method is significant when validating remotely sensed soil moisture products. Thus, additional representative sampling points should be considered to reduce this error. (2) The calibration period can be determined from the time span of the full range of the grid-mean soil moisture content during the monitoring period. (3) The representative error is sensitive to the number of calibration sampling points, especially when only a few representative sampling points are used. Multiple sampling points are recommended to reduce data loss and improve the likelihood of representativeness at two scales.
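The core TSA computation (the mean and spread of each location's relative difference from the grid mean) can be sketched as follows, with hypothetical sensor readings:

```python
def temporal_stability(moisture):
    """moisture[i][t]: soil moisture at location i and time t.
    Returns (mean_rel_diff, std_rel_diff) per location; a location
    whose mean is near 0 with small spread tracks the grid mean and
    is a candidate representative sampling point."""
    n_loc, n_t = len(moisture), len(moisture[0])
    grid_mean = [sum(moisture[i][t] for i in range(n_loc)) / n_loc
                 for t in range(n_t)]
    stats = []
    for i in range(n_loc):
        d = [(moisture[i][t] - grid_mean[t]) / grid_mean[t]
             for t in range(n_t)]
        m = sum(d) / n_t
        sd = (sum((x - m) ** 2 for x in d) / n_t) ** 0.5
        stats.append((m, sd))
    return stats

# Hypothetical 3-sensor network: location 1 tracks the grid mean exactly.
obs = [[0.30, 0.20, 0.25],
       [0.25, 0.15, 0.21],
       [0.20, 0.10, 0.17]]
stats = temporal_stability(obs)
best = min(range(len(stats)), key=lambda i: abs(stats[i][0]))
print(best)  # 1
```

Random events such as irrigation break this temporal persistence, which is why the paper stratifies the analysis (STSA) rather than applying one TSA to the whole grid.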

  10. Method and apparatus for data sampling

    DOEpatents

    Odell, D.M.C.

    1994-04-19

    A method and apparatus for sampling radiation detector outputs and determining event data from the collected samples is described. The method uses high speed sampling of the detector output, the conversion of the samples to digital values, and the discrimination of the digital values so that digital values representing detected events are determined. The high speed sampling and digital conversion is performed by an A/D sampler that samples the detector output at a rate high enough to produce numerous digital samples for each detected event. The digital discrimination identifies those digital samples that are not representative of detected events. The sampling and discrimination also provides for temporary or permanent storage, either serially or in parallel, to a digital storage medium. 6 figures.

  11. A comparison of four sampling methods among men having sex with men in China: implications for HIV/STD surveillance and prevention

    PubMed Central

    Guo, Yan; Li, Xiaoming; Fang, Xiaoyi; Lin, Xiuyun; Song, Yan; Jiang, Shuling; Stanton, Bonita

    2011-01-01

Sample representativeness remains one of the challenges in effective HIV/STD surveillance and prevention targeting men who have sex with men (MSM) worldwide. Although convenience samples are widely used in studies of MSM, previous studies suggested that these samples might not be representative of the broader MSM population. This issue becomes even more critical in many developing countries where the resources needed for probability sampling are limited. We examined variations in HIV and syphilis infections and sociodemographic and behavioral factors among 307 young migrant MSM recruited using four different convenience sampling methods (peer outreach, informal social network, Internet, and venue-based) in Beijing, China in 2009. The participants completed a self-administered survey and provided blood specimens for HIV/STD testing. Among the four MSM samples recruited by the different methods, rates of HIV infection were 5.1%, 5.8%, 7.8%, and 3.4%; rates of syphilis infection were 21.8%, 36.2%, 11.8%, and 13.8%; and rates of inconsistent condom use were 57%, 52%, 58%, and 38%. Significant differences were found in various sociodemographic characteristics (e.g., age, migration history, education, income, and places of employment) and risk behaviors (e.g., age at first sex, number of sex partners, involvement in commercial sex, and substance use) among the samples recruited by different sampling methods. The results confirmed the challenges of obtaining representative MSM samples and underscored the importance of using multiple sampling methods to reach MSM from diverse backgrounds and in different social segments, and to improve the representativeness of MSM samples when a probability sampling approach is not feasible. PMID:21711162

  12. A comparison of four sampling methods among men having sex with men in China: implications for HIV/STD surveillance and prevention.

    PubMed

    Guo, Yan; Li, Xiaoming; Fang, Xiaoyi; Lin, Xiuyun; Song, Yan; Jiang, Shuling; Stanton, Bonita

    2011-11-01

Sample representativeness remains one of the challenges in effective HIV/STD surveillance and prevention targeting men who have sex with men (MSM) worldwide. Although convenience samples are widely used in studies of MSM, previous studies suggested that these samples might not be representative of the broader MSM population. This issue becomes even more critical in many developing countries where the resources needed for probability sampling are limited. We examined variations in HIV and syphilis infections and sociodemographic and behavioral factors among 307 young migrant MSM recruited using four different convenience sampling methods (peer outreach, informal social network, Internet, and venue-based) in Beijing, China in 2009. The participants completed a self-administered survey and provided blood specimens for HIV/STD testing. Among the four MSM samples recruited by the different methods, rates of HIV infection were 5.1%, 5.8%, 7.8%, and 3.4%; rates of syphilis infection were 21.8%, 36.2%, 11.8%, and 13.8%; and rates of inconsistent condom use were 57%, 52%, 58%, and 38%. Significant differences were found in various sociodemographic characteristics (e.g., age, migration history, education, income, and places of employment) and risk behaviors (e.g., age at first sex, number of sex partners, involvement in commercial sex, and substance use) among the samples recruited by different sampling methods. The results confirmed the challenges of obtaining representative MSM samples and underscored the importance of using multiple sampling methods to reach MSM from diverse backgrounds and in different social segments, and to improve the representativeness of MSM samples when a probability sampling approach is not feasible.

  13. Lipidomic analysis of biological samples: Comparison of liquid chromatography, supercritical fluid chromatography and direct infusion mass spectrometry methods.

    PubMed

    Lísa, Miroslav; Cífková, Eva; Khalikova, Maria; Ovčačíková, Magdaléna; Holčapek, Michal

    2017-11-24

Lipidomic analysis of biological samples in clinical research is a challenging task for analytical methods, given the large number of samples and their extreme complexity. In this work, we compare direct infusion (DI) and chromatography-mass spectrometry (MS) lipidomic approaches, represented by three analytical methods, in terms of comprehensiveness, sample throughput, and validation results for the lipidomic analysis of biological samples: tumor tissue, surrounding normal tissue, plasma, and erythrocytes of kidney cancer patients. The methods are compared in one laboratory using an identical analytical protocol to ensure comparable conditions. An ultrahigh-performance liquid chromatography/MS (UHPLC/MS) method in hydrophilic interaction liquid chromatography mode and a DI-MS method, the most widely used methods for lipidomic analysis, are used for this comparison together with an ultrahigh-performance supercritical fluid chromatography/MS (UHPSFC/MS) method, which has shown promising results in metabolomic analyses. Nontargeted analysis of pooled samples is performed using all tested methods, and 610 lipid species within 23 lipid classes are identified. The DI method provides the most comprehensive results, owing to the identification of some polar lipid classes that are not identified by the UHPLC and UHPSFC methods. On the other hand, the UHPSFC method provides excellent sensitivity for less polar lipid classes and the highest sample throughput, with a 10 min method time. The sample consumption of the DI method is 125 times higher than that of the other methods, although only 40 μL of organic solvent is used per sample analysis, compared to 3.5 mL and 4.9 mL in the case of the UHPLC and UHPSFC methods, respectively. The methods are validated for the quantitative lipidomic analysis of plasma samples with one internal standard for each lipid class. The results show the applicability of all tested methods for the lipidomic analysis of biological samples, depending on the analysis requirements.
Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Strelka: accurate somatic small-variant calling from sequenced tumor-normal sample pairs.

    PubMed

    Saunders, Christopher T; Wong, Wendy S W; Swamy, Sajani; Becq, Jennifer; Murray, Lisa J; Cheetham, R Keira

    2012-07-15

    Whole genome and exome sequencing of matched tumor-normal sample pairs is becoming routine in cancer research. The consequent increased demand for somatic variant analysis of paired samples requires methods specialized to model this problem so as to sensitively call variants at any practical level of tumor impurity. We describe Strelka, a method for somatic SNV and small indel detection from sequencing data of matched tumor-normal samples. The method uses a novel Bayesian approach which represents continuous allele frequencies for both tumor and normal samples, while leveraging the expected genotype structure of the normal. This is achieved by representing the normal sample as a mixture of germline variation with noise, and representing the tumor sample as a mixture of the normal sample with somatic variation. A natural consequence of the model structure is that sensitivity can be maintained at high tumor impurity without requiring purity estimates. We demonstrate that the method has superior accuracy and sensitivity on impure samples compared with approaches based on either diploid genotype likelihoods or general allele-frequency tests. The Strelka workflow source code is available at ftp://strelka@ftp.illumina.com/. csaunders@illumina.com
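The mixture idea can be illustrated with a back-of-the-envelope calculation; this is a simplification for intuition, not Strelka's actual likelihood model:

```python
def expected_tumor_vaf(purity, somatic_vaf_in_tumor_cells=0.5,
                       normal_vaf=0.0):
    """Illustrative mixture: the observed tumor-sample variant allele
    frequency blends the normal background with the somatic signal,
    weighted by tumor purity. A simplification of the continuous
    allele-frequency modeling described above."""
    return purity * somatic_vaf_in_tumor_cells + (1 - purity) * normal_vaf

# A heterozygous somatic SNV in a hypothetical 40%-pure tumor sample
# appears at only ~20% allele frequency in the reads:
print(expected_tumor_vaf(0.4))  # 0.2
```

Because the model treats tumor allele frequency as continuous rather than forcing diploid genotypes, such low-frequency signals remain callable without an explicit purity estimate.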

  15. Evaluation of Respondent-Driven Sampling

    PubMed Central

    McCreesh, Nicky; Frost, Simon; Seeley, Janet; Katongole, Joseph; Tarsh, Matilda Ndagire; Ndunguse, Richard; Jichi, Fatima; Lunel, Natasha L; Maher, Dermot; Johnston, Lisa G; Sonnenberg, Pam; Copas, Andrew J; Hayes, Richard J; White, Richard G

    2012-01-01

Background: Respondent-driven sampling is a novel variant of link-tracing sampling for estimating the characteristics of hard-to-reach groups, such as HIV prevalence in sex workers. Despite its use by leading health organizations, the performance of this method in realistic situations is still largely unknown. We evaluated respondent-driven sampling by comparing estimates from a respondent-driven sampling survey with total-population data. Methods: Total-population data on age, tribe, religion, socioeconomic status, sexual activity and HIV status were available on a population of 2402 male household-heads from an open cohort in rural Uganda. A respondent-driven sampling (RDS) survey was carried out in this population, employing current methods of sampling (RDS sample) and statistical inference (RDS estimates). Analyses were carried out for the full RDS sample and then repeated for the first 250 recruits (small sample). Results: We recruited 927 household-heads. Full and small RDS samples were largely representative of the total population, but both samples under-represented men who were younger, of higher socioeconomic status, and with unknown sexual activity and HIV status. Respondent-driven-sampling statistical-inference methods failed to reduce these biases. Only 31%-37% (depending on method and sample size) of RDS estimates were closer to the true population proportions than the RDS sample proportions. Only 50%-74% of respondent-driven-sampling bootstrap 95% confidence intervals included the population proportion. Conclusions: Respondent-driven sampling produced a generally representative sample of this well-connected non-hidden population. However, current respondent-driven-sampling inference methods failed to reduce bias when it occurred. Whether the data required to remove bias and measure precision can be collected in a respondent-driven sampling survey is unresolved. Respondent-driven sampling should be regarded as a (potentially superior) form of convenience-sampling method, and caution is required when interpreting findings based on the sampling method. PMID:22157309
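One common RDS inference method, the RDS-II (Volz-Heckathorn) estimator, weights each respondent by the inverse of their reported network degree; a minimal sketch with hypothetical recruits:

```python
def rds_ii_proportion(traits, degrees):
    """RDS-II estimate of a trait proportion: respondents are weighted
    by inverse network degree to offset the higher inclusion
    probability of well-connected people in chain referral."""
    weights = [1.0 / d for d in degrees]
    num = sum(w for w, t in zip(weights, traits) if t)
    return num / sum(weights)

# Hypothetical recruits: trait carriers happen to have high degrees,
# so the naive sample proportion (2/4 = 0.5) overstates the estimate.
traits = [True, True, False, False]
degrees = [10, 10, 2, 2]
print(round(rds_ii_proportion(traits, degrees), 3))  # 0.167
```

As the abstract notes, such reweighting only removes bias if the chain-referral process and degree reports match the estimator's assumptions.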

  16. Design and Weighting Methods for a Nationally Representative Sample of HIV-infected Adults Receiving Medical Care in the United States-Medical Monitoring Project

    PubMed Central

    Iachan, Ronaldo; H. Johnson, Christopher; L. Harding, Richard; Kyle, Tonja; Saavedra, Pedro; L. Frazier, Emma; Beer, Linda; L. Mattson, Christine; Skarbinski, Jacek

    2016-01-01

    Background: Health surveys of the general US population are inadequate for monitoring human immunodeficiency virus (HIV) infection because the relatively low prevalence of the disease (<0.5%) leads to small subpopulation sample sizes. Objective: To collect a nationally and locally representative probability sample of HIV-infected adults receiving medical care to monitor clinical and behavioral outcomes, supplementing the data in the National HIV Surveillance System. This paper describes the sample design and weighting methods for the Medical Monitoring Project (MMP) and provides estimates of the size and characteristics of this population. Methods: To develop a method for obtaining valid, representative estimates of the in-care population, we implemented a cross-sectional, three-stage design that sampled 23 jurisdictions, then 691 facilities, then 9,344 HIV patients receiving medical care, using probability-proportional-to-size methods. The data weighting process followed standard methods, accounting for the probabilities of selection at each stage and adjusting for nonresponse and multiplicity. Nonresponse adjustments accounted for differing response at both facility and patient levels. Multiplicity adjustments accounted for visits to more than one HIV care facility. Results: MMP used a multistage stratified probability sampling design that was approximately self-weighting in each of the 23 project areas and nationally. The probability sample represents the estimated 421,186 HIV-infected adults receiving medical care during January through April 2009. Methods were efficient (i.e., induced small, unequal weighting effects and small standard errors for a range of weighted estimates). Conclusion: The information collected through MMP allows monitoring trends in clinical and behavioral outcomes and informs resource allocation for treatment and prevention activities. PMID:27651851
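The weighting process described (inverse stage-selection probabilities with nonresponse and multiplicity adjustments) can be sketched as follows; all numeric values are hypothetical, not MMP's actual rates:

```python
def mmp_style_weight(p_area, p_facility, p_patient,
                     facility_rr, patient_rr, n_facilities_visited):
    """Sketch of the weighting steps described above: the base weight
    is the inverse of the product of the three stage selection
    probabilities, inflated for nonresponse at the facility and
    patient levels and deflated for multiplicity (patients who
    visit more than one HIV care facility)."""
    base = 1.0 / (p_area * p_facility * p_patient)
    nonresponse = 1.0 / (facility_rr * patient_rr)
    multiplicity = 1.0 / n_facilities_visited
    return base * nonresponse * multiplicity

# Hypothetical patient: selection probs 0.5 x 0.1 x 0.05 across the
# three stages, 80%/50% response rates, seen at 2 facilities.
w = mmp_style_weight(0.5, 0.1, 0.05, facility_rr=0.8,
                     patient_rr=0.5, n_facilities_visited=2)
print(w)  # 500.0
```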

  17. A METHODS COMPARISON FOR COLLECTING MACROINVERTEBRATES IN THE OHIO RIVER

    EPA Science Inventory

    Collection of representative benthic macroinvertebrate samples from large rivers has been challenging researchers for many years. The objective of our study was to develop an appropriate method(s) for sampling macroinvertebrates from the Ohio River. Four existing sampling metho...

  18. REPRESENTATIVE SAMPLING AND ANALYSIS OF HETEROGENEOUS SOILS

    EPA Science Inventory

    Standard sampling and analysis methods for hazardous substances in contaminated soils currently are available and routinely employed. Standard methods inherently assume a homogeneous soil matrix and contaminant distribution; therefore only small sample quantities typically are p...

  19. Observational studies of patients in the emergency department: a comparison of 4 sampling methods.

    PubMed

    Valley, Morgan A; Heard, Kennon J; Ginde, Adit A; Lezotte, Dennis C; Lowenstein, Steven R

    2012-08-01

We evaluated the ability of 4 sampling methods to generate representative samples of the emergency department (ED) population. We analyzed the electronic records of 21,662 consecutive patient visits at an urban, academic ED. From this population, we simulated different models of study recruitment in the ED by using 2 sample sizes (n=200 and n=400) and 4 sampling methods: true random, random 4-hour time blocks by exact sample size, random 4-hour time blocks by a predetermined number of blocks, and convenience or "business hours." For each method and sample size, we obtained 1,000 samples from the population. Using χ² tests, we measured the number of statistically significant differences between the sample and the population for 8 variables (age, sex, race/ethnicity, language, triage acuity, arrival mode, disposition, and payer source). Then, for each variable, method, and sample size, we compared the proportion of the 1,000 samples that differed from the overall ED population to the expected proportion (5%). Only the true random samples represented the population with respect to sex, race/ethnicity, triage acuity, mode of arrival, language, and payer source in at least 95% of the samples. Patient samples obtained using random 4-hour time blocks and business-hours sampling systematically differed from the overall ED patient population for several important demographic and clinical variables, although the magnitude of these differences was not large. Common sampling strategies selected for ED-based studies may affect parameter estimates for several representative population variables; however, the potential for bias in these variables appears small. Copyright © 2012. Published by Mosby, Inc.
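The comparison procedure can be sketched as follows: draw a sample, compare its category counts against the population proportions with a Pearson χ² statistic, and check the statistic against the critical value. The category proportions below are hypothetical:

```python
import random

def chi_square_stat(counts, pop_props, n):
    """Pearson chi-square statistic comparing observed category counts
    in a sample of size n with the counts expected under the
    population proportions."""
    return sum((obs - n * p) ** 2 / (n * p)
               for obs, p in zip(counts, pop_props))

random.seed(1)
# Hypothetical population proportions for one variable (e.g. triage acuity).
pop_props = [0.5, 0.3, 0.2]
draw = random.choices([0, 1, 2], weights=pop_props, k=400)
counts = [draw.count(c) for c in range(3)]
stat = chi_square_stat(counts, pop_props, 400)
# With 2 degrees of freedom the 5% critical value is about 5.99; a true
# random sample should fall below it about 95% of the time.
print(stat < 5.99)
```

Repeating this for 1,000 samples per method and counting how often the test rejects is exactly the 5%-expected-rejection comparison the study reports.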

  20. An efficient reliability algorithm for locating design point using the combination of importance sampling concepts and response surface method

    NASA Astrophysics Data System (ADS)

    Shayanfar, Mohsen Ali; Barkhordari, Mohammad Ali; Roudak, Mohammad Amin

    2017-06-01

Monte Carlo simulation (MCS) is a useful tool for the computation of the probability of failure in reliability analysis. However, the large number of required random samples makes it time-consuming. The response surface method (RSM) is another common method in reliability analysis. Although RSM is widely used for its simplicity, it cannot be trusted in highly nonlinear problems due to its linear nature. In this paper, a new efficient algorithm employing the combination of importance sampling, as a class of MCS, and RSM is proposed. In the proposed algorithm, the analysis starts with importance sampling concepts, using a presented two-step rule for updating the design point. This part finishes after a small number of samples have been generated. Then RSM starts to work using the Bucher experimental design, with the last design point and a presented effective length as the center point and radius of Bucher's approach, respectively. Through illustrative numerical examples, the simplicity and efficiency of the proposed algorithm and the effectiveness of the presented rules are shown.
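The importance-sampling ingredient of such an analysis can be illustrated on a one-dimensional toy limit state; this is a generic sketch, not the authors' algorithm:

```python
import math
import random

def failure_prob_is(n, shift=3.0, seed=42):
    """Importance-sampling estimate of P[g(X) < 0] for the toy limit
    state g(x) = 3 - x with X ~ N(0, 1). Samples are drawn from
    N(shift, 1), centered near the design point x = 3, and each
    failure is reweighted by the density ratio phi(x) / phi(x - shift),
    which for unit variance reduces to exp((shift^2 - 2*shift*x) / 2)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(shift, 1.0)
        if 3.0 - x < 0.0:  # failure event
            total += math.exp((shift ** 2 - 2.0 * shift * x) / 2.0)
    return total / n

est = failure_prob_is(20000)
print(round(est, 5))  # near the exact tail probability P[X > 3] ~ 0.00135
```

Centering the sampling density at the design point is what lets a small number of samples resolve a rare failure event, which plain MCS would need millions of samples to see.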

  1. GROUND WATER PURGING AND SAMPLING METHODS: HISTORY VS. HYSTERIA

    EPA Science Inventory

    It has been over 10 years since the low-flow ground water purging and sampling method was initially reported in the literature. The method grew from the recognition that well purging was necessary to collect representative samples, bailers could not achieve well purging, and high...

  2. Face recognition based on symmetrical virtual image and original training image

    NASA Astrophysics Data System (ADS)

    Ke, Jingcheng; Peng, Yali; Liu, Shigang; Li, Jun; Pei, Zhao

    2018-02-01

In face representation-based classification methods, a high recognition rate can be obtained if a face has enough available training samples. However, in practical applications, only limited training samples are available. To obtain enough training samples, many methods simultaneously use the original training samples and corresponding virtual samples to strengthen the ability to represent the test sample. One approach directly uses the original training samples and the corresponding mirror samples to recognize the test sample. However, when the test sample is nearly symmetrical while the original training samples are not, the combination of the original training and mirror samples might not represent the test sample well. To tackle this problem, we propose a novel method for generating virtual samples by averaging the original training samples and the corresponding mirror samples. The original training samples and the virtual samples are then combined to recognize the test sample. Experimental results on five face databases show that the proposed method is able to partly overcome the challenges posed by the various poses, facial expressions and illuminations of the original face images.
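The virtual-sample construction (averaging an image with its horizontal mirror) can be sketched as:

```python
def mirror(image):
    """Horizontally mirror a face image given as rows of pixel values."""
    return [row[::-1] for row in image]

def virtual_sample(image):
    """Virtual sample as proposed above: the pixel-wise average of the
    original training image and its mirror image, which is exactly
    left-right symmetric by construction."""
    m = mirror(image)
    return [[(a + b) / 2 for a, b in zip(r1, r2)]
            for r1, r2 in zip(image, m)]

# Tiny hypothetical 2x3 "image": the virtual sample is symmetric.
img = [[1, 2, 3],
       [4, 5, 6]]
v = virtual_sample(img)
print(v)  # [[2.0, 2.0, 2.0], [5.0, 5.0, 5.0]]
```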

  3. Modulation Based on Probability Density Functions

    NASA Technical Reports Server (NTRS)

    Williams, Glenn L.

    2009-01-01

    A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.
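Building such a histogram-based PDF from waveform samples can be sketched as follows; the bin count and value interval are arbitrary choices:

```python
import math

def sample_histogram(samples, bins=8, lo=-1.0, hi=1.0):
    """Sort waveform samples by value into equal-width bins; the
    normalized counts approximate the waveform's PDF over the
    interval, as in the modulation scheme described above."""
    counts = [0] * bins
    width = (hi - lo) / bins
    for s in samples:
        idx = min(int((s - lo) / width), bins - 1)
        counts[idx] += 1
    return [c / len(samples) for c in counts]

# One full cycle of a sinusoid: its sample PDF piles up near the
# extremes (the arcsine distribution), so the outer bins dominate.
samples = [math.sin(2 * math.pi * k / 1000) for k in range(1000)]
hist = sample_histogram(samples)
print(hist[0] > hist[3] and hist[-1] > hist[4])  # True
```

The distinctive shape of this histogram is what makes it usable as a signature for conveying information.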

  4. A Mixed Methods Investigation of Mixed Methods Sampling Designs in Social and Health Science Research

    ERIC Educational Resources Information Center

    Collins, Kathleen M. T.; Onwuegbuzie, Anthony J.; Jiao, Qun G.

    2007-01-01

    A sequential design utilizing identical samples was used to classify mixed methods studies via a two-dimensional model, wherein sampling designs were grouped according to the time orientation of each study's components and the relationship of the qualitative and quantitative samples. A quantitative analysis of 121 studies representing nine fields…

  5. Prevalence and Predictors of Sexual Assault among a College Sample

    ERIC Educational Resources Information Center

    Conley, A. H.; Overstreet, C. M.; Hawn, S. E.; Kendler, K. S.; Dick, D. M.; Amstadter, A. B.

    2017-01-01

    Objective: This study examined the prevalence and correlates of precollege, college-onset, and repeat sexual assault (SA) within a representative student sample. Participants: A representative sample of 7,603 students. Methods: Incoming first-year students completed a survey about their exposure to broad SA prior to college, prior trauma,…

  6. Improved Sampling Method Reduces Isokinetic Sampling Errors.

    ERIC Educational Resources Information Center

    Karels, Gale G.

The particulate sampling system currently in use by the Bay Area Air Pollution Control District, San Francisco, California, is described in this presentation for the 12th Conference on Methods in Air Pollution and Industrial Hygiene Studies, University of Southern California, April 1971. The method represents a practical, inexpensive tool that can…

  7. An active learning representative subset selection method using net analyte signal.

    PubMed

    He, Zhonghai; Ma, Zhenhe; Luan, Jingmin; Cai, Xi

    2018-05-05

To guarantee accurate predictions, representative samples are needed when building a calibration model for spectroscopic measurements. However, in general, it is not known whether a sample is representative prior to measuring its concentration, which is both time-consuming and expensive. In this paper, a method to determine whether a sample should be selected into a calibration set is presented. The selection is based on the difference in the Euclidean norm of the net analyte signal (NAS) vector between candidate and existing samples. First, the concentrations and spectra of a group of samples are used to compute the projection matrix, NAS vectors, and scalar values. Next, the NAS vectors of candidate samples are computed by multiplying the projection matrix with the spectra of the samples. The scalar value of the NAS is obtained by norm computation. The distance between the candidate set and the selected set is computed, and the samples with the largest distance are added to the selected set sequentially. Last, the concentration of the analyte is measured so that the sample can be used as a calibration sample. Using a validation test, it is shown that the presented method is more efficient than random selection. As a result, the amount of time and money spent on reference measurements is greatly reduced. Copyright © 2018 Elsevier B.V. All rights reserved.
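The sequential selection step can be sketched as below; for brevity, plain Euclidean distance between spectra stands in for the paper's NAS-norm difference, and the tiny two-channel spectra are hypothetical:

```python
def max_min_selection(spectra, n_select):
    """Sequential selection sketch: start from the sample with the
    largest norm, then repeatedly add the candidate farthest (in
    Euclidean distance) from the already-selected set. Plain spectral
    distance stands in here for the paper's NAS-norm difference."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    norms = [dist(s, [0.0] * len(s)) for s in spectra]
    selected = [max(range(len(spectra)), key=lambda i: norms[i])]
    while len(selected) < n_select:
        rest = [i for i in range(len(spectra)) if i not in selected]
        far = max(rest, key=lambda i: min(dist(spectra[i], spectra[j])
                                          for j in selected))
        selected.append(far)
    return selected

spectra = [[0.0, 0.0], [1.0, 0.0], [0.9, 0.1], [0.0, 1.0]]
print(max_min_selection(spectra, 3))  # [1, 3, 0]: picks spread out,
# skipping index 2, which is nearly redundant with index 1
```

Only the samples picked this way need reference concentration measurements, which is where the claimed savings over random selection come from.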

  8. An active learning representative subset selection method using net analyte signal

    NASA Astrophysics Data System (ADS)

    He, Zhonghai; Ma, Zhenhe; Luan, Jingmin; Cai, Xi

    2018-05-01

To guarantee accurate predictions, representative samples are needed when building a calibration model for spectroscopic measurements. However, in general, it is not known whether a sample is representative prior to measuring its concentration, which is both time-consuming and expensive. In this paper, a method to determine whether a sample should be selected into a calibration set is presented. The selection is based on the difference in the Euclidean norm of the net analyte signal (NAS) vector between candidate and existing samples. First, the concentrations and spectra of a group of samples are used to compute the projection matrix, NAS vectors, and scalar values. Next, the NAS vectors of candidate samples are computed by multiplying the projection matrix with the spectra of the samples. The scalar value of the NAS is obtained by norm computation. The distance between the candidate set and the selected set is computed, and the samples with the largest distance are added to the selected set sequentially. Last, the concentration of the analyte is measured so that the sample can be used as a calibration sample. Using a validation test, it is shown that the presented method is more efficient than random selection. As a result, the amount of time and money spent on reference measurements is greatly reduced.

  9. Geostatistical Sampling Methods for Efficient Uncertainty Analysis in Flow and Transport Problems

    NASA Astrophysics Data System (ADS)

    Liodakis, Stylianos; Kyriakidis, Phaedon; Gaganis, Petros

    2015-04-01

    In hydrogeological applications involving flow and transport in heterogeneous porous media, the spatial distribution of hydraulic conductivity is often parameterized in terms of a lognormal random field based on a histogram and variogram model inferred from data and/or synthesized from relevant knowledge. Realizations of simulated conductivity fields are then generated using geostatistical simulation involving simple random (SR) sampling and are subsequently used as inputs to physically-based simulators of flow and transport in a Monte Carlo framework for evaluating the uncertainty in the spatial distribution of solute concentration due to the uncertainty in the spatial distribution of hydraulic conductivity [1]. Realistic uncertainty analysis, however, calls for a large number of simulated concentration fields and can hence become expensive in terms of both time and computer resources. A more efficient alternative to SR sampling is Latin hypercube (LH) sampling, a special case of stratified random sampling, which yields a more representative distribution of simulated attribute values with fewer realizations [2]. Here, the term representative implies realizations that efficiently span the range of possible conductivity values corresponding to the lognormal random field. In this work we investigate the efficiency of alternative methods to classical LH sampling within the context of simulation of flow and transport in a heterogeneous porous medium. More precisely, we consider the stratified likelihood (SL) sampling method of [3], in which attribute realizations are generated using the polar simulation method by exploring the geometrical properties of the multivariate Gaussian distribution function.
In addition, we propose a more efficient version of the above method, here termed minimum energy (ME) sampling, whereby a set of N representative conductivity realizations at M locations is constructed by: (i) generating a representative set of N points distributed on the surface of a M-dimensional, unit radius hyper-sphere, (ii) relocating the N points on a representative set of N hyper-spheres of different radii, and (iii) transforming the coordinates of those points to lie on N different hyper-ellipsoids spanning the multivariate Gaussian distribution. The above method is applied in a dimensionality reduction context by defining flow-controlling points over which representative sampling of hydraulic conductivity is performed, thus also accounting for the sensitivity of the flow and transport model to the input hydraulic conductivity field. The performance of the various stratified sampling methods, LH, SL, and ME, is compared to that of SR sampling in terms of reproduction of ensemble statistics of hydraulic conductivity and solute concentration for different sample sizes N (numbers of realizations). The results indicate that ME sampling constitutes an equally if not more efficient simulation method than LH and SL sampling, as it can reproduce to a similar extent statistics of the conductivity and concentration fields, yet with smaller sampling variability than SR sampling. References [1] Gutjahr A.L. and Bras R.L. Spatial variability in subsurface flow and transport: A review. Reliability Engineering & System Safety, 42, 293-316, (1993). [2] Helton J.C. and Davis F.J. Latin hypercube sampling and the propagation of uncertainty in analyses of complex systems. Reliability Engineering & System Safety, 81, 23-69, (2003). [3] Switzer P. Multiple simulation of spatial fields. 
In: Heuvelink G, Lemmens M (eds) Proceedings of the 4th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, Coronet Books Inc., pp 629–635 (2000).
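
    The Latin hypercube baseline that the stratified methods above are compared against can be sketched as follows; this is a minimal illustration for standard-normal marginals, and the function name and stratification details are this sketch's own, not the paper's.

```python
import numpy as np
from statistics import NormalDist

def latin_hypercube_normal(n, dims, seed=None):
    """Latin hypercube sample with standard-normal marginals: each of the
    n equal-probability strata of (0, 1) contributes exactly one point
    per dimension; strata are permuted independently per dimension."""
    rng = np.random.default_rng(seed)
    inv = NormalDist().inv_cdf
    out = np.empty((n, dims))
    for d in range(dims):
        u = (rng.permutation(n) + rng.random(n)) / n   # one draw per stratum
        u = np.clip(u, 1e-12, 1 - 1e-12)               # keep inv_cdf finite
        out[:, d] = [inv(v) for v in u]
    return out
```

    Because every stratum is hit exactly once per dimension, far fewer realizations are needed to cover the range of the marginal distribution than with simple random sampling.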

  10. [Respondent-Driven Sampling: a new sampling method to study visible and hidden populations].

    PubMed

    Mantecón, Alejandro; Juan, Montse; Calafat, Amador; Becoña, Elisardo; Román, Encarna

    2008-01-01

    The paper introduces a variant of chain-referral sampling: respondent-driven sampling (RDS). This sampling method shows that approaches based on network analysis can be combined with the statistical validity of standard probability sampling methods. In this sense, RDS appears to be a mathematical improvement of snowball sampling oriented to the study of hidden populations. Here, however, we test its validity with populations that are not covered by a sampling frame but can nonetheless be contacted without difficulty. The basics of RDS are explained through our research on young people (aged 14 to 25) who go clubbing, consume alcohol and other drugs, and have sex. Fieldwork was carried out between May and July 2007 in three Spanish regions: Baleares, Galicia and Comunidad Valenciana. The presentation of the study shows the utility of this type of sampling when the population is accessible but a difficulty arises from the lack of a sampling frame. However, the sample obtained is not a statistically representative random sample of the target population. It must be acknowledged that the final sample is representative of a 'pseudo-population' that approximates the target population but is not identical to it.

  11. Recruitment for Occupational Research: Using Injured Workers as the Point of Entry into Workplaces

    PubMed Central

    Koehoorn, Mieke; Trask, Catherine M.; Teschke, Kay

    2013-01-01

    Objective To investigate the feasibility, costs and sample representativeness of a recruitment method that used workers with back injuries as the point of entry into diverse working environments. Methods Workers' compensation claims were used to randomly sample workers from five heavy industries and to recruit their employers for ergonomic assessments of the injured worker and up to 2 co-workers. Results The final study sample included 54 workers from the workers’ compensation registry and 72 co-workers. This sample of 126 workers was based on an initial random sample of 822 workers with a compensation claim, or a ratio of 1 recruited worker to approximately 7 sampled workers. The average recruitment cost was CND$262/injured worker and CND$240/participating worksite including co-workers. The sample was representative of the heavy industry workforce, and was successful in recruiting the self-employed (8.2%), workers from small employers (<20 workers, 38.7%), and workers from diverse working environments (49 worksites, 29 worksite types, and 51 occupations). Conclusions The recruitment rate was low but the cost per participant reasonable and the sample representative of workers in small worksites. Small worksites represent a significant portion of the workforce but are typically underrepresented in occupational research despite having distinct working conditions, exposures and health risks worthy of investigation. PMID:23826387

  12. Optimization of groundwater sampling approach under various hydrogeological conditions using a numerical simulation model

    NASA Astrophysics Data System (ADS)

    Qi, Shengqi; Hou, Deyi; Luo, Jian

    2017-09-01

    This study presents a numerical model based on field data to simulate groundwater flow in both the aquifer and the well-bore for the low-flow sampling method and the well-volume sampling method. The numerical model was calibrated to reproduce field drawdown, and the calculated flow regime in the well was used to predict the variation of dissolved oxygen (DO) concentration during the purging period. The model was then used to analyze sampling representativeness and sampling time. Site characteristics, such as aquifer hydraulic conductivity, and sampling choices, such as purging rate and screen length, were found to be significant determinants of sampling representativeness and required sampling time. Results demonstrated that: (1) DO was the most useful water quality indicator for ensuring groundwater sampling representativeness, in comparison with turbidity, pH, specific conductance, oxidation reduction potential (ORP) and temperature; (2) it is not necessary to maintain a drawdown of less than 0.1 m when conducting low-flow purging; however, a high purging rate in a low-permeability aquifer may result in a dramatic decrease in sampling representativeness after an initial peak; (3) a short screen length may result in greater drawdown and a longer sampling time for low-flow purging. Overall, the present study suggests that this new numerical model is suitable for describing groundwater flow during the sampling process, and can be used to optimize sampling strategies under various hydrogeological conditions.

  13. An improved SRC method based on virtual samples for face recognition

    NASA Astrophysics Data System (ADS)

    Fu, Lijun; Chen, Deyun; Lin, Kezheng; Li, Ao

    2018-07-01

    The sparse representation classifier (SRC) performs classification by evaluating which class leads to the minimum representation error. In the real world, however, the number of available training samples is limited, and because of noise interference the training samples cannot accurately represent the test sample linearly. Therefore, in this paper, we first produce virtual samples by exploiting the original training samples, with the aim of increasing the number of training samples. Then, we take the intra-class difference as a data representation of partial noise, and utilize the intra-class differences and training samples simultaneously to represent the test sample in a linear way according to the theory of the SRC algorithm. Using weighted score-level fusion, the respective representation scores of the virtual samples and the original training samples are fused together to obtain the final classification result. The experimental results on multiple face databases show that our proposed method achieves a very satisfactory classification performance.
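
    The minimum-representation-error rule at the heart of SRC can be illustrated with a simplified sketch that substitutes per-class least squares for the sparse (l1) coding step and omits the paper's virtual-sample and score-fusion stages; all helper names are hypothetical.

```python
import numpy as np

def class_residual(test, train, labels, cls):
    """Residual of representing `test` with the training samples of one
    class, using least squares as an l2 stand-in for sparse coding."""
    A = train[labels == cls].T                  # columns = class samples
    coef, *_ = np.linalg.lstsq(A, test, rcond=None)
    return np.linalg.norm(test - A @ coef)

def classify(test, train, labels):
    """Assign the class whose training samples represent `test` with the
    smallest representation error (the SRC decision rule)."""
    classes = np.unique(labels)
    residuals = [class_residual(test, train, labels, c) for c in classes]
    return int(classes[int(np.argmin(residuals))])
```

    A test vector lying in the span of one class's samples gets a near-zero residual for that class and a large residual for the others, which is exactly the behaviour SRC exploits.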

  14. Sampling maternal care behaviour in domestic dogs: What's the best approach?

    PubMed

    Czerwinski, Veronika H; Smith, Bradley P; Hynd, Philip I; Hazel, Susan J

    2017-07-01

    Our understanding of the frequency and duration of maternal care behaviours in the domestic dog during the first two postnatal weeks is limited, largely due to the inconsistencies in the sampling methodologies that have been employed. In order to develop a more concise picture of maternal care behaviour during this period, and to help establish the sampling method that best represents these behaviours, we compared a variety of time sampling methods. Six litters were continuously observed for a total of 96 h over postnatal days 3, 6, 9 and 12 (24 h per day). Frequent (dam presence, nursing duration, contact duration) and infrequent maternal behaviours (anogenital licking duration and frequency) were coded using five different time sampling methods: 12-h night (1800-0600 h), 12-h day (0600-1800 h), one hour during the night (1800-0600 h), one hour during the day (0600-1800 h) and one hour at any time. Each one-hour time sampling method consisted of four randomly chosen 15-min periods. Two random sets of four 15-min periods were also analysed to ensure reliability. We then determined which of the time sampling methods averaged over the three 24-h periods best represented the frequency and duration of behaviours. As might be expected, frequently occurring behaviours were adequately represented by short (one-hour) sampling periods; however, this was not the case for the infrequent behaviours. Thus, we argue that the time sampling methodology employed must match the behaviour of interest. This caution applies to maternal behaviour in altricial species, such as canids, as well as to all systematic behavioural observations utilising time sampling methodology. Copyright © 2017. Published by Elsevier B.V.
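
    The core finding, that sparse momentary samples estimate frequent behaviours far better than infrequent ones, can be illustrated with a small simulation. The toy behaviour records below (independent per-second presence) are this sketch's own simplification of real behaviour bouts.

```python
import numpy as np

def momentary_time_sample(record, period):
    """Estimate the proportion of time a behaviour occurs by recording
    its state only at instants spaced `period` seconds apart."""
    return record[::period].mean()

# Toy one-day records, one value per second.
rng = np.random.default_rng(1)
day = 24 * 3600
frequent = (rng.random(day) < 0.40).astype(float)    # present ~40% of the time
infrequent = (rng.random(day) < 0.01).astype(float)  # present ~1% of the time

# Hourly checks give only 24 samples: usually adequate for the frequent
# behaviour, but very coarse for the infrequent one, whose estimate can
# easily come out as zero.
est_frequent = momentary_time_sample(frequent, 3600)
est_infrequent = momentary_time_sample(infrequent, 3600)
```

    With 24 hourly samples the infrequent behaviour's expected count is below one detection, which is why short sampling periods misrepresent it.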

  15. Evaluation of respondent-driven sampling.

    PubMed

    McCreesh, Nicky; Frost, Simon D W; Seeley, Janet; Katongole, Joseph; Tarsh, Matilda N; Ndunguse, Richard; Jichi, Fatima; Lunel, Natasha L; Maher, Dermot; Johnston, Lisa G; Sonnenberg, Pam; Copas, Andrew J; Hayes, Richard J; White, Richard G

    2012-01-01

    Respondent-driven sampling is a novel variant of link-tracing sampling for estimating the characteristics of hard-to-reach groups, such as HIV prevalence in sex workers. Despite its use by leading health organizations, the performance of this method in realistic situations is still largely unknown. We evaluated respondent-driven sampling by comparing estimates from a respondent-driven sampling survey with total population data. Total population data on age, tribe, religion, socioeconomic status, sexual activity, and HIV status were available on a population of 2402 male household heads from an open cohort in rural Uganda. A respondent-driven sampling (RDS) survey was carried out in this population, using current methods of sampling (RDS sample) and statistical inference (RDS estimates). Analyses were carried out for the full RDS sample and then repeated for the first 250 recruits (small sample). We recruited 927 household heads. Full and small RDS samples were largely representative of the total population, but both samples underrepresented men who were younger, of higher socioeconomic status, and with unknown sexual activity and HIV status. Respondent-driven sampling statistical inference methods failed to reduce these biases. Only 31%-37% (depending on method and sample size) of RDS estimates were closer to the true population proportions than the RDS sample proportions. Only 50%-74% of respondent-driven sampling bootstrap 95% confidence intervals included the population proportion. Respondent-driven sampling produced a generally representative sample of this well-connected nonhidden population. However, current respondent-driven sampling inference methods failed to reduce bias when it occurred. Whether the data required to remove bias and measure precision can be collected in a respondent-driven sampling survey is unresolved. 
Respondent-driven sampling should be regarded as a (potentially superior) form of convenience sampling method, and caution is required when interpreting findings based on the sampling method.

  16. GROUND WATER MONITORING AND SAMPLING: MULTI-LEVEL VERSUS TRADITIONAL METHODS WHAT'S WHAT?

    EPA Science Inventory

    After years of research and many publications, the question still remains: What is the best method to collect representative ground water samples from monitoring wells? Numerous systems and devices are currently available for obtaining both multi-level samples as well as traditi...

  17. Reaching a Representative Sample of College Students: A Comparative Analysis

    ERIC Educational Resources Information Center

    Giovenco, Daniel P.; Gundersen, Daniel A.; Delnevo, Cristine D.

    2016-01-01

    Objective: To explore the feasibility of a random-digit dial (RDD) cellular phone survey in order to reach a national and representative sample of college students. Methods: Demographic distributions from the 2011 National Young Adult Health Survey (NYAHS) were benchmarked against enrollment numbers from the Integrated Postsecondary Education…

  18. Estimating the probability that the sample mean is within a desired fraction of the standard deviation of the true mean.

    PubMed

    Schillaci, Michael A; Schillaci, Mario E

    2009-02-01

    The use of small sample sizes in human and primate evolutionary research is commonplace. Estimating how well small samples represent the underlying population, however, is not commonplace. Because the accuracy of determinations of taxonomy, phylogeny, and evolutionary process is dependent upon how well the study sample represents the population of interest, characterizing the uncertainty, or potential error, associated with analyses of small sample sizes is essential. We present a method for estimating the probability that the sample mean is within a desired fraction of the standard deviation of the true mean using small (n < 10) or very small (n ≤ 5) sample sizes. This method can be used by researchers to determine post hoc the probability that their sample is a meaningful approximation of the population parameter. We tested the method using a large craniometric data set commonly used by researchers in the field. Given our results, we suggest that sample estimates of the population mean can be reasonable and meaningful even when based on small, and perhaps even very small, sample sizes.
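
    Under a normality assumption, the probability described above has a closed form: the sample mean has standard deviation σ/√n, so P(|x̄ − μ| ≤ kσ) = 2Φ(k√n) − 1. The sketch below computes this quantity; it is a textbook derivation, not necessarily the authors' exact procedure.

```python
from math import sqrt
from statistics import NormalDist

def prob_within(k, n):
    """P(|sample mean - true mean| <= k * sigma) for a normal population:
    the sample mean is normal with standard deviation sigma / sqrt(n),
    so the probability equals 2 * Phi(k * sqrt(n)) - 1."""
    return 2 * NormalDist().cdf(k * sqrt(n)) - 1
```

    For example, with n = 4 the sample mean falls within one population standard deviation of the true mean with probability about 0.954, which is why even very small samples can yield meaningful mean estimates.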

  19. Apparatus and method for detecting full-capture radiation events

    DOEpatents

    Odell, D.M.C.

    1994-10-11

    An apparatus and method are disclosed for sampling the output signal of a radiation detector and distinguishing full-capture radiation events from Compton scattering events. The output signal of a radiation detector is continuously sampled. The samples are converted to digital values and input to a discriminator where samples that are representative of events are identified. The discriminator transfers only event samples, that is, samples representing full-capture events and Compton events, to a signal processor where the samples are saved in a three-dimensional count matrix with time (from the time of onset of the pulse) on the first axis, sample pulse current amplitude on the second axis, and number of samples on the third axis. The stored data are analyzed to separate the Compton events from full-capture events, and the energy of the full-capture events is determined without having determined the energies of any of the individual radiation detector events. 4 figs.

  20. Apparatus and method for detecting full-capture radiation events

    DOEpatents

    Odell, Daniel M. C.

    1994-01-01

    An apparatus and method for sampling the output signal of a radiation detector and distinguishing full-capture radiation events from Compton scattering events. The output signal of a radiation detector is continuously sampled. The samples are converted to digital values and input to a discriminator where samples that are representative of events are identified. The discriminator transfers only event samples, that is, samples representing full-capture events and Compton events, to a signal processor where the samples are saved in a three-dimensional count matrix with time (from the time of onset of the pulse) on the first axis, sample pulse current amplitude on the second axis, and number of samples on the third axis. The stored data are analyzed to separate the Compton events from full-capture events, and the energy of the full-capture events is determined without having determined the energies of any of the individual radiation detector events.
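
    The count matrix described in both patent records above can be sketched as a time-by-amplitude histogram whose entries hold the number of samples (the third axis). Bin sizes, the event representation, and the function name here are illustrative assumptions, not the patented implementation.

```python
import numpy as np

def accumulate_events(events, n_time_bins, n_amp_bins, amp_max):
    """Build the count matrix: rows index time since pulse onset,
    columns index sample amplitude, and each entry holds the number
    of samples observed in that (time, amplitude) cell."""
    counts = np.zeros((n_time_bins, n_amp_bins), dtype=int)
    for pulse in events:                       # one digitized pulse per event
        for t, amp in enumerate(pulse[:n_time_bins]):
            a = min(int(amp / amp_max * n_amp_bins), n_amp_bins - 1)
            counts[t, a] += 1
    return counts
```

    Full-capture and Compton events would then be separated by analyzing the shapes accumulated in this matrix, rather than by computing per-event energies.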

  1. Rare Earth Element and Trace Element Data Associated with Hydrothermal Spring Reservoir Rock, Idaho

    DOE Data Explorer

    Quillinan, Scott; Bagdonas, Davin

    2017-06-22

    These data represent rock samples collected in Idaho that correspond with naturally occurring hydrothermal samples that were collected and analyzed by INL (Idaho Falls, ID). Representative samples of type rocks were selected to best represent the various regions of Idaho in which naturally occurring hydrothermal waters occur. This includes the Snake River Plain (SRP), Basin and Range type structures east of the SRP, and large scale/deep seated orogenic uplift of the Sawtooth Mountains, ID. Analyses include ICP-OES and ICP-MS methods for major, trace, and rare earth element (REE) concentrations.

  2. Sampling trace organic compounds in water: a comparison of a continuous active sampler to continuous passive and discrete sampling methods

    USGS Publications Warehouse

    Coes, Alissa L.; Paretti, Nicholas V.; Foreman, William T.; Iverson, Jana L.; Alvarez, David A.

    2014-01-01

    A continuous active sampling method was compared to continuous passive and discrete sampling methods for the sampling of trace organic compounds (TOCs) in water. Results from each method are compared and contrasted in order to provide information for future investigators to use while selecting appropriate sampling methods for their research. The continuous low-level aquatic monitoring (CLAM) sampler (C.I.Agent® Storm-Water Solutions) is a submersible, low-flow-rate sampler that continuously draws water through solid-phase extraction media. CLAM samplers were deployed at two wastewater-dominated stream field sites in conjunction with the deployment of polar organic chemical integrative samplers (POCIS) and the collection of discrete (grab) water samples. All samples were analyzed for a suite of 69 TOCs. The CLAM and POCIS samples represent time-integrated samples that accumulate the TOCs present in the water over the deployment period (19–23 h for CLAM and 29 days for POCIS); the discrete samples represent only the TOCs present in the water at the time and place of sampling. Non-metric multi-dimensional scaling and cluster analysis were used to examine patterns in both TOC detections and relative concentrations between the three sampling methods. A greater number of TOCs were detected in the CLAM samples than in corresponding discrete and POCIS samples, but TOC concentrations in the CLAM samples were significantly lower than in the discrete and (or) POCIS samples. Thirteen TOCs of varying polarity were detected by all of the three methods. TOC detections and concentrations obtained by the three sampling methods, however, are dependent on multiple factors. This study found that stream discharge, constituent loading, and compound type all affected TOC concentrations detected by each method. In addition, TOC detections and concentrations were affected by the reporting limits, bias, recovery, and performance of each method.

  3. Sampling trace organic compounds in water: a comparison of a continuous active sampler to continuous passive and discrete sampling methods.

    PubMed

    Coes, Alissa L; Paretti, Nicholas V; Foreman, William T; Iverson, Jana L; Alvarez, David A

    2014-03-01

    A continuous active sampling method was compared to continuous passive and discrete sampling methods for the sampling of trace organic compounds (TOCs) in water. Results from each method are compared and contrasted in order to provide information for future investigators to use while selecting appropriate sampling methods for their research. The continuous low-level aquatic monitoring (CLAM) sampler (C.I.Agent® Storm-Water Solutions) is a submersible, low-flow-rate sampler that continuously draws water through solid-phase extraction media. CLAM samplers were deployed at two wastewater-dominated stream field sites in conjunction with the deployment of polar organic chemical integrative samplers (POCIS) and the collection of discrete (grab) water samples. All samples were analyzed for a suite of 69 TOCs. The CLAM and POCIS samples represent time-integrated samples that accumulate the TOCs present in the water over the deployment period (19-23 h for CLAM and 29 days for POCIS); the discrete samples represent only the TOCs present in the water at the time and place of sampling. Non-metric multi-dimensional scaling and cluster analysis were used to examine patterns in both TOC detections and relative concentrations between the three sampling methods. A greater number of TOCs were detected in the CLAM samples than in corresponding discrete and POCIS samples, but TOC concentrations in the CLAM samples were significantly lower than in the discrete and (or) POCIS samples. Thirteen TOCs of varying polarity were detected by all of the three methods. TOC detections and concentrations obtained by the three sampling methods, however, are dependent on multiple factors. This study found that stream discharge, constituent loading, and compound type all affected TOC concentrations detected by each method. In addition, TOC detections and concentrations were affected by the reporting limits, bias, recovery, and performance of each method. Published by Elsevier B.V.

  4. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such a...

  5. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such a...

  6. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such a...

  7. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such a...

  8. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such a...

  9. GROUND WATER SAMPLING USING LOW-FLOW TECHNIQUES

    EPA Science Inventory

    Obtaining representative ground water samples is important for site assessment and remedial performance monitoring objectives. The sampling device or method used to collect samples from monitoring or compliance well can significantly impact data quality and reliability. Low-flo...

  10. Cross-Validation of FITNESSGRAM® Health-Related Fitness Standards in Hungarian Youth

    ERIC Educational Resources Information Center

    Laurson, Kelly R.; Saint-Maurice, Pedro F.; Karsai, István; Csányi, Tamás

    2015-01-01

    Purpose: The purpose of this study was to cross-validate FITNESSGRAM® aerobic and body composition standards in a representative sample of Hungarian youth. Method: A nationally representative sample (N = 405) of Hungarian adolescents from the Hungarian National Youth Fitness Study (ages 12-18.9 years) participated in an aerobic capacity assessment…

  11. Predicting Posttraumatic Stress Symptoms Longitudinally in a Representative Sample of Hospitalized Injured Adolescents

    ERIC Educational Resources Information Center

    Zatzick, Douglas F.; Grossman, David C.; Russo, Joan; Pynoos, Robert; Berliner, Lucy; Jurkovich, Gregory; Sabin, Janice A.; Katon, Wayne; Ghesquiere, Angela; McCauley, Elizabeth; Rivara, Frederick P.

    2006-01-01

    Objective: Adolescents constitute a high-risk population for traumatic physical injury, yet few longitudinal investigations have assessed the development of posttraumatic stress disorder (PTSD) symptoms over time in representative samples. Method: Between July 2002 and August 2003,108 randomly selected injured adolescent patients ages 12 to 18 and…

  12. Sociodemographic Differences in Depressed Mood: Results from a Nationally Representative Sample of High School Adolescents

    ERIC Educational Resources Information Center

    Paxton, Raheem J.; Valois, Robert F.; Watkins, Ken W.; Huebner, E. Scott; Drane, J. Wanzer

    2007-01-01

    Background: Research on adolescent mental health suggests that prevalence rates for depressed mood are not uniformly distributed across all populations. This study examined demographic difference in depressed mood among a nationally representative sample of high school adolescents. Methods: The 2003 National Youth Risk Behavior Survey was utilized…

  13. Adaptive Importance Sampling for Control and Inference

    NASA Astrophysics Data System (ADS)

    Kappen, H. J.; Ruiz, H. C.

    2016-03-01

    Path integral (PI) control problems are a restricted class of non-linear control problems that can be solved formally as a Feynman-Kac path integral and can be estimated using Monte Carlo sampling. In this contribution we review PI control theory in the finite horizon case. We subsequently focus on the problem of how to compute and represent control solutions. We review the most commonly used methods in robotics and control. Within PI theory, the question of how to compute becomes a question of importance sampling. Efficient importance samplers are state feedback controllers, and their use requires an efficient representation. Learning and representing effective state-feedback controllers for non-linear stochastic control problems is a very challenging, and largely unsolved, problem. We show how to learn and represent such controllers using ideas from the cross entropy method. We derive a gradient descent method that allows feedback controllers to be learned using an arbitrary parametrisation. We refer to this method as the path integral cross entropy method, or PICE. We illustrate this method for some simple examples. PI control methods can be used to estimate the posterior distribution in latent state models. In neuroscience these problems arise when estimating connectivity from neural recording data using EM. We demonstrate the PI control method as an accurate alternative to particle filtering.
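
    The cross-entropy reweighting idea that PICE builds on can be illustrated on a plain one-dimensional optimization problem. This is a generic cross-entropy method sketch, not the PICE algorithm itself, and all parameter choices are illustrative.

```python
import numpy as np

def cross_entropy_minimize(f, mu, sigma, n_samples=200, n_elite=20,
                           iters=50, seed=0):
    """Generic cross-entropy method: draw candidates from a Gaussian,
    keep the lowest-cost elite fraction, and refit the sampling
    distribution to the elites."""
    rng = np.random.default_rng(seed)
    for _ in range(iters):
        x = rng.normal(mu, sigma, n_samples)
        elite = x[np.argsort(f(x))[:n_elite]]          # lowest-cost samples
        mu, sigma = elite.mean(), elite.std() + 1e-6   # floor keeps sampling alive
    return mu
```

    PICE applies the same sample-and-reweight update to the parameters of a state-feedback controller rather than to a scalar decision variable.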

  14. Recommendations for representative ballast water sampling

    NASA Astrophysics Data System (ADS)

    Gollasch, Stephan; David, Matej

    2017-05-01

    Until now, the purpose of ballast water sampling studies was predominantly limited to general scientific interest to determine the variety of species arriving in ballast water in a recipient port. Knowing the variety of species arriving in ballast water also contributes to the assessment of relative species introduction vector importance. Further, some sampling campaigns addressed awareness raising or the determination of organism numbers per water volume to evaluate the species introduction risk by analysing the propagule pressure of species. A new aspect of ballast water sampling, which this contribution addresses, is compliance monitoring and enforcement of ballast water management standards as set by, e.g., the IMO Ballast Water Management Convention. To achieve this, sampling methods which result in representative ballast water samples are essential. We recommend such methods based on practical tests conducted on two commercial vessels also considering results from our previous studies. The results show that different sampling approaches influence the results regarding viable organism concentrations in ballast water samples. It was observed that the sampling duration (i.e., length of the sampling process), timing (i.e., in which point in time of the discharge the sample is taken), the number of samples and the sampled water quantity are the main factors influencing the concentrations of viable organisms in a ballast water sample. Based on our findings we provide recommendations for representative ballast water sampling.

  15. Analysis of four recruitment methods for obtaining normative data through a Web-based questionnaire: a pilot study.

    PubMed

    Nolte, Michael T; Shauver, Melissa J; Chung, Kevin C

    2015-09-01

    Quality normative data requires a diverse sample of participants and plays an important role in the appropriate use of health outcomes. Using social media and other online resources for survey recruitment is a tempting prospect, but the effectiveness of these methods in collecting a diverse sample is unknown. The purpose of this study is to pilot test four methods of recruitment to determine their ability to produce a sample representative of the general US population. This project is part of a larger study to gather normative data for the Michigan Hand Outcomes Questionnaire (MHQ). We used flyers, e-mail, Facebook, and an institution-specific clinical research recruitment Web site to direct participants to complete an online version of the MHQ. Participants also provided comorbidity and demographic information. The institution-specific recruitment Web site yielded the greatest number of respondents in an age distribution that mirrored the US population. Facebook was effective for recruiting young adults, and e-mail was successful for recruiting the older adults. None of the methods was successful in reaching an ethnically diverse sample. Obtaining normative data that is truly representative of the US population is a difficult task. The use of any one recruitment method is unlikely to result in a representative sample, but a greater understanding of these methods will empower researchers to use them to target specific populations. This pilot analysis provides support for the use of Facebook and clinical research sites in addition to traditional methods of e-mail and paper flyers.

  16. Improvement of Predictive Ability by Uniform Coverage of the Target Genetic Space

    PubMed Central

    Bustos-Korts, Daniela; Malosetti, Marcos; Chapman, Scott; Biddulph, Ben; van Eeuwijk, Fred

    2016-01-01

    Genome-enabled prediction provides breeders with the means to increase the number of genotypes that can be evaluated for selection. One of the major challenges in genome-enabled prediction is how to construct a training set of genotypes from a calibration set that represents the target population of genotypes, where the calibration set is composed of a training and validation set. A random sampling protocol of genotypes from the calibration set will lead to low quality coverage of the total genetic space by the training set when the calibration set contains population structure. As a consequence, predictive ability will be affected negatively, because some parts of the genotypic diversity in the target population will be under-represented in the training set, whereas other parts will be over-represented. Therefore, we propose a training set construction method that uniformly samples the genetic space spanned by the target population of genotypes, thereby increasing predictive ability. To evaluate our method, we constructed training sets alongside the identification of corresponding genomic prediction models for four genotype panels that differed in the amount of population structure they contained (maize Flint, maize Dent, wheat, and rice). Training sets were constructed using uniform sampling, stratified-uniform sampling, stratified sampling and random sampling. We compared these methods with a method that maximizes the generalized coefficient of determination (CD). Several training set sizes were considered. We investigated four genomic prediction models: multi-locus QTL models, GBLUP models, combinations of QTL and GBLUPs, and Reproducing Kernel Hilbert Space (RKHS) models. For the maize and wheat panels, construction of the training set under uniform sampling led to a larger predictive ability than under stratified and random sampling. The results of our methods were similar to those of the CD method. 
For the rice panel, all training set construction methods led to similar predictive ability, a reflection of the very strong population structure in this panel. PMID:27672112
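
    The uniform-coverage idea can be sketched with a simple farthest-point (maximin) selection in principal-component space. This is an illustrative stand-in for the authors' method, not their actual algorithm, and the genotype panel below is synthetic:

```python
import numpy as np

def uniform_training_set(markers, n_train, seed=0):
    """Select a training set that covers the genetic space uniformly.

    Farthest-point (maximin) selection in PCA space: each new genotype
    is the one farthest from all genotypes already selected, which
    spreads the training set evenly over the target population.
    """
    rng = np.random.default_rng(seed)
    X = markers - markers.mean(axis=0)            # centre marker scores
    # project onto the two leading principal components via SVD
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    scores = X @ vt[:2].T                         # 2-D genetic space
    selected = [int(rng.integers(len(scores)))]   # random seed genotype
    dist = np.linalg.norm(scores - scores[selected[0]], axis=1)
    while len(selected) < n_train:
        nxt = int(dist.argmax())                  # farthest from current set
        selected.append(nxt)
        dist = np.minimum(dist, np.linalg.norm(scores - scores[nxt], axis=1))
    return selected

# toy panel: two subpopulations with different marker allele frequencies
rng = np.random.default_rng(1)
panel = np.vstack([rng.binomial(2, 0.2, (50, 100)),
                   rng.binomial(2, 0.8, (50, 100))]).astype(float)
train = uniform_training_set(panel, 10)
# uniform coverage should draw genotypes from both subpopulations
print(sum(i < 50 for i in train), sum(i >= 50 for i in train))
```

    Unlike random sampling, which mirrors the panel's population structure, the maximin rule keeps selecting from whichever region of the genetic space is currently worst covered.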

  17. Improvement of core drill methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gatz, J.L.

    1975-07-01

    This report documents results of a program to evaluate effectiveness of more or less conventional subsurface samplers in obtaining representative and undisturbed samples of noncohesive alluvial materials containing large quantities of gravels and cobbles. This is the first phase of a research program to improve core drill methods. Samplers evaluated consisted of the Lawrence Livermore Laboratory membrane sampler, 4-in. Denison sampler, 6-in. Denison sampler, 5-in. Modified Denison sampler, and 3-in. thinwall drive tube. Small representative samples were obtained with the Denison samplers; no undisturbed samples were obtained. The field work was accomplished in the Rhodes Canyon area, White Sands Missile Range, New Mexico.

  18. Study on high-resolution representation of terraces in Shanxi Loess Plateau area

    NASA Astrophysics Data System (ADS)

    Zhao, Weidong; Tang, Guo'an; Ma, Lei

    2008-10-01

    A new elevation-point sampling method, the TIN-based Sampling Method (TSM), and a new visualization method, the Elevation Addition Method (EAM), are put forth for representing the typical terraces of the Shanxi Loess Plateau area. The DEM Feature Points and Lines Classification (DEPLC), put forth by the authors in 2007, is refined for depicting the main path in the study area. The EAM is used to visualize the terraces and the path in the study area. 406 key elevation points and 15 feature constrained lines sampled by this method are used to construct CD-TINs, which depict the terraces and path correctly and effectively. Our case study shows that the TSM is reasonable and feasible. Complicated micro-terrains such as terraces and paths can be represented with high resolution and high efficiency by use of the refined DEPLC, TSM, and CD-TINs, and both the terraces and the main path are visualized well by the EAM even when the terrace height is no more than 1 m.

  19. Minimum and Maximum Times Required to Obtain Representative Suspended Sediment Samples

    NASA Astrophysics Data System (ADS)

    Gitto, A.; Venditti, J. G.; Kostaschuk, R.; Church, M. A.

    2014-12-01

    Bottle sampling is a convenient method of obtaining suspended sediment measurements for the development of sediment budgets. While these methods are generally considered to be reliable, recent analysis of depth-integrated sampling has identified considerable uncertainty in measurements of grain-size concentration between grain-size classes of multiple samples. Point-integrated bottle sampling is assumed to represent the mean concentration of suspended sediment, but the uncertainty surrounding this method is not well understood. Here we examine at-a-point variability in velocity, suspended sediment concentration, grain-size distribution, and grain-size moments to determine if traditional point-integrated methods provide a representative sample of suspended sediment. We present continuous hour-long observations of suspended sediment from the sand-bedded portion of the Fraser River at Mission, British Columbia, Canada, using a LISST laser-diffraction instrument. Spectral analysis shows no statistically significant peaks in energy density, suggesting the absence of periodic fluctuations in flow and suspended sediment. However, a slope break in the spectra at 0.003 Hz corresponds to a period of 5.5 minutes. This coincides with the threshold between large-scale turbulent eddies, which scale with channel width/mean velocity, and hydraulic phenomena related to channel dynamics. This suggests that suspended sediment samples taken over a period longer than 5.5 minutes incorporate variability at scales larger than turbulent phenomena in this channel. Examination of 5.5-minute periods of our time series indicates that ~20% of the time a stable mean value of volumetric concentration is reached within 30 seconds, a typical bottle-sample duration. In ~12% of measurements a stable mean was not reached over the 5.5-minute sample duration. The remaining measurements achieve a stable mean in an even distribution over the intervening interval.
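
    The "stable mean within a bottle-sample duration" question can be illustrated with a running-mean stability check on a concentration time series. The tolerance band and the synthetic record below are invented for illustration and are not the study's data:

```python
import numpy as np

def time_to_stable_mean(conc, dt=1.0, tol=0.05):
    """Return the time at which the running mean of a concentration
    series first enters, and stays within, +/- tol of the series mean.

    Mimics asking whether a bottle sample of a given duration would
    capture a representative mean concentration.
    """
    running = np.cumsum(conc) / np.arange(1, len(conc) + 1)
    target = conc.mean()
    inside = np.abs(running - target) <= tol * target
    outside = np.flatnonzero(~inside)   # indices still outside the band
    if len(outside) == 0:
        return dt                       # stable from the first sample
    if outside[-1] == len(conc) - 1:
        return None                     # never stabilised in this record
    return (outside[-1] + 2) * dt       # first time stably inside the band

# synthetic 5.5-minute record at 1 Hz: mean ~200 mg/L with turbulent noise
rng = np.random.default_rng(0)
series = 200 + 30 * rng.standard_normal(330)
t = time_to_stable_mean(series, dt=1.0)
print("stable after", t, "seconds")
```

    Applied to many 5.5-minute windows, the fraction of windows with `t` under 30 s corresponds to the ~20% figure the abstract reports for typical bottle durations.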

  20. A Synopsis of Technical Issues for Monitoring Sediment in Highway and Urban Runoff

    USGS Publications Warehouse

    Bent, Gardner C.; Gray, John R.; Smith, Kirk P.; Glysson, G. Douglas

    2000-01-01

    Accurate and representative sediment data are critical for assessing the potential effects of highway and urban runoff on receiving waters. The U.S. Environmental Protection Agency identified sediment as the most widespread pollutant in the Nation's rivers and streams, affecting aquatic habitat, drinking water treatment processes, and recreational uses of rivers, lakes, and estuaries. Representative sediment data are also necessary for quantifying and interpreting concentrations, loads, and effects of trace elements and organic constituents associated with highway and urban runoff. Many technical issues associated with the collecting, processing, and analyzing of samples must be addressed to produce valid (useful for intended purposes), current, complete, and technically defensible data for local, regional, and national information needs. All aspects of sediment data-collection programs need to be evaluated, and adequate quality-control data must be collected and documented so that the comparability and representativeness of data obtained for highway- and urban-runoff studies may be assessed. Collection of representative samples for the measurement of sediment in highway and urban runoff involves a number of interrelated issues. Temporal and spatial variability in runoff result from a combination of factors, including volume and intensity of precipitation, rate of snowmelt, and features of the drainage basin such as area, slope, infiltration capacity, channel roughness, and storage characteristics. In small drainage basins such as those found in many highway and urban settings, automatic samplers are often the most suitable method for collecting samples of runoff for a variety of reasons. Indirect sediment-measurement methods are also useful as supplementary and(or) surrogate means for monitoring sediment in runoff. All of these methods have limitations in addition to benefits, which must be identified and quantified to produce representative data. 
    Methods for processing raw sediment samples (including homogenization and subsampling) for subsequent analysis for total suspended solids or suspended-sediment concentration often increase variance and may introduce bias. Processing artifacts can be substantial if the methods used are not appropriate for the concentrations and particle-size distributions present in the samples collected. Analytical methods for determining sediment concentrations include the suspended-sediment concentration and the total suspended solids methods. Although the terms suspended-sediment concentration and total suspended solids are often used interchangeably to describe the total concentration of suspended solid-phase material, the analytical methods differ and can produce substantially different results. The total suspended solids method, which commonly is used to produce highway- and urban-runoff sediment data, may not be valid for studies of runoff water quality. Studies of fluvial and highway-runoff sediment data indicate that analyses of samples by the total suspended solids method tend to underrepresent the true sediment concentration, and that relations between total suspended solids and suspended-sediment concentration are not transferable from site to site even when grain-size distribution information is available. Total suspended solids data used to calculate suspended-sediment loads in highway and urban runoff may be fundamentally unreliable. Consequently, use of total suspended solids data may have adverse consequences for the assessment, design, and maintenance of sediment-removal best management practices. Therefore, it may be necessary to analyze water samples using the suspended-sediment concentration method. Data quality, comparability, and utility are important considerations in collection, processing, and analysis of sediment samples and interpretation of sediment data for highway- and urban-runoff studies.
    Results from sediment studies must be comparable and readily transferable.

  1. Assessing representativeness of sampling methods for reaching men who have sex with men: a direct comparison of results obtained from convenience and probability samples.

    PubMed

    Schwarcz, Sandra; Spindler, Hilary; Scheer, Susan; Valleroy, Linda; Lansky, Amy

    2007-07-01

    Convenience samples are used to determine HIV-related behaviors among men who have sex with men (MSM) without measuring the extent to which the results are representative of the broader MSM population. We compared results from a cross-sectional survey of MSM recruited from gay bars between June and October 2001 to a random digit dial telephone survey conducted between June 2002 and January 2003. The men in the probability sample were older, better educated, and had higher incomes than men in the convenience sample; the convenience sample enrolled more employed men and men of color. Substance use around the time of sex was higher in the convenience sample, but other sexual behaviors were similar. HIV testing was common among men in both samples. Periodic validation, through comparison of data collected by different sampling methods, may be useful when relying on survey data for program and policy development.

  2. Sample Selection for Training Cascade Detectors.

    PubMed

    Vállez, Noelia; Deniz, Oscar; Bueno, Gloria

    2015-01-01

    Automatic detection systems usually require large and representative training datasets in order to obtain good detection and false positive rates. Such datasets are typically imbalanced: the positive set has few samples, while the negative set must represent anything except the object of interest and therefore contains orders of magnitude more images than the positive set. However, imbalanced training databases lead to biased classifiers. In this paper, we focus our attention on a negative sample selection method to properly balance the training data for cascade detectors. The method is based on the selection of the most informative false positive samples generated in one stage to feed the next stage. The results show that the proposed cascade detector with sample selection obtains on average better partial AUC and smaller standard deviation than the other compared cascade detectors.
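
    The "most informative false positives" step can be sketched as hard-negative mining between cascade stages. This is a minimal illustration with a toy scorer and an assumed 0.5 acceptance threshold, not the paper's implementation:

```python
import numpy as np

def mine_hard_negatives(stage_scores, labels, n_keep, threshold=0.5):
    """Select the most informative false positives for the next stage.

    Among negatives that the current stage wrongly accepts (score above
    the threshold), keep the n_keep highest-scoring ones -- the
    "hardest" negatives -- to rebalance the next stage's training set.
    """
    negatives = np.flatnonzero(labels == 0)
    fp = negatives[stage_scores[negatives] > threshold]   # false positives
    hardest = fp[np.argsort(stage_scores[fp])[::-1]]      # hardest first
    return hardest[:n_keep]

# toy data: 1000 negatives, 100 positives, scores from a weak first stage
rng = np.random.default_rng(0)
labels = np.r_[np.zeros(1000, int), np.ones(100, int)]
scores = np.clip(rng.normal(0.3, 0.2, 1100) + 0.4 * labels, 0, 1)
hard = mine_hard_negatives(scores, labels, 50)
print(len(hard), labels[hard].sum())
```

    Feeding only these borderline negatives forward keeps each stage's training set balanced while concentrating effort on the mistakes the previous stage actually makes.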

  3. Accumulation of polycyclic aromatic hydrocarbons by Neocalanus copepods in Port Valdez, Alaska.

    PubMed

    Carls, Mark G; Short, Jeffrey W; Payne, James

    2006-11-01

    Sampling zooplankton is a useful strategy for observing trace hydrocarbon concentrations in water because samples represent an integrated average over a considerable effective sampling volume and are more representative of the sampled environment than discretely collected water samples. We demonstrate this method in Port Valdez, Alaska, an approximately 100 km(2) basin that receives about 0.5-2.4 kg of polynuclear aromatic hydrocarbons (PAH) per day. Total PAH (TPAH) concentrations (0.61-1.31 microg/g dry weight), composition, and spatial distributions in a lipid-rich copepod, Neocalanus, were consistent with the discharge as the source of contamination. Although Neocalanus acquire PAH from water or suspended particulate matter, total PAH concentrations in these compartments were at or below method detection limits, demonstrating that plankton can amplify trace concentrations to detectable levels useful for study.

  4. From picture to porosity of river bed material using Structure-from-Motion with Multi-View-Stereo

    NASA Astrophysics Data System (ADS)

    Seitz, Lydia; Haas, Christian; Noack, Markus; Wieprecht, Silke

    2018-04-01

    Common methods for in-situ determination of porosity of river bed material are time- and effort-consuming. Although mathematical predictors can be used for estimation, they do not adequately represent porosities. The objective of this study was to assess a new approach for the determination of porosity of frozen sediment samples. The method is based on volume determination by applying Structure-from-Motion with Multi View Stereo (SfM-MVS) to estimate a 3D volumetric model based on overlapping imagery. The method was applied on artificial sediment mixtures as well as field samples. In addition, the commonly used water replacement method was applied to determine porosities in comparison with the SfM-MVS method. We examined a range of porosities from 0.16 to 0.46 that are representative of the wide range of porosities found in rivers. SfM-MVS performed well in determining volumes of the sediment samples. A very good correlation (r = 0.998, p < 0.0001) was observed between the SfM-MVS and the water replacement method. Results further show that the water replacement method underestimated total sample volumes. A comparison with several mathematical predictors showed that for non-uniform samples the calculated porosity based on the standard deviation performed better than porosities based on the median grain size. None of the predictors were effective at estimating the porosity of the field samples.
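
    Once SfM-MVS gives the total sample volume, porosity follows from one extra measurement, the sample's dry mass. A minimal sketch, assuming the conventional 2.65 g/cm³ quartz particle density (the study's actual densities and masses are not given here):

```python
def porosity_from_volume(total_volume_cm3, dry_mass_g, particle_density=2.65):
    """Porosity of a sediment sample from its total (SfM-MVS) volume.

    The total volume comes from the 3-D model of the frozen sample; the
    solids volume is dry mass / particle density (2.65 g/cm3 is the
    usual quartz assumption, not a value from the study).
    """
    solids_volume = dry_mass_g / particle_density
    return 1.0 - solids_volume / total_volume_cm3

# a 500 cm3 frozen sample weighing 860 g dry (illustrative numbers)
n = porosity_from_volume(500.0, 860.0)
print(round(n, 3))   # -> 0.351
```

    Because porosity is computed from the volume, any systematic underestimate of total volume (as the abstract reports for water replacement) propagates directly into an underestimate of porosity.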

  5. Aggressive and Violent Behaviors in the School Environment among a Nationally Representative Sample of Adolescent Youth

    ERIC Educational Resources Information Center

    Rajan, Sonali; Namdar, Rachel; Ruggles, Kelly V.

    2015-01-01

    Background: The purpose of this study was to describe the prevalence of aggressive and violent behaviors in the context of the school environment in a nationally representative sample of adolescent youth and to illustrate these patterns during 2001-2011. Methods: We analyzed data from 84,734 participants via the Youth Risk Behavior Surveillance…

  6. Risk and Protective Factors Associated with Speech and Language Impairment in a Nationally Representative Sample of 4- to 5-Year-Old Children

    ERIC Educational Resources Information Center

    Harrison, Linda J.; McLeod, Sharynne

    2010-01-01

    Purpose: To determine risk and protective factors for speech and language impairment in early childhood. Method: Data are presented for a nationally representative sample of 4,983 children participating in the Longitudinal Study of Australian Children (described in McLeod & Harrison, 2009). Thirty-one child, parent, family, and community…

  7. Bullying Victimization Prevalence and Its Effects on Psychosomatic Complaints: Can Sense of Coherence Make a Difference?

    ERIC Educational Resources Information Center

    García-Moya, Irene; Suominen, Sakari; Moreno, Carmen

    2014-01-01

    Background: The aim of this study was to examine the prevalence of bullying victimization and its impact on physical and psychological complaints in a representative sample of adolescents and to explore the role of sense of coherence (SOC) in victimization prevalence and consequences. Methods: A representative sample of Spanish adolescents (N =…

  8. The Relationship between Child Abuse, Parental Divorce, and Lifetime Mental Disorders and Suicidality in a Nationally Representative Adult Sample

    ERIC Educational Resources Information Center

    Afifi, Tracie O.; Boman, Jonathan; Fleisher, William; Sareen, Jitender

    2009-01-01

    Objectives: To determine how the experiences of child abuse and parental divorce are related to long-term mental health outcomes using a nationally representative adult sample after adjusting for sociodemographic variables and parental psychopathology. Methods: Data were drawn from the National Comorbidity Survey (NCS, n=5,877; age 15-54 years;…

  9. Method for predicting peptide detection in mass spectrometry

    DOEpatents

    Kangas, Lars [West Richland, WA; Smith, Richard D [Richland, WA; Petritis, Konstantinos [Richland, WA

    2010-07-13

    A method of predicting whether a peptide present in a biological sample will be detected by analysis with a mass spectrometer. The method uses at least one mass spectrometer to perform repeated analysis of a sample containing peptides from proteins with known amino acids. The method then generates a data set of peptides identified as contained within the sample by the repeated analysis. The method then calculates the probability that a specific peptide in the data set was detected in the repeated analysis. The method then creates a plurality of vectors, where each vector has a plurality of dimensions, and each dimension represents a property of one or more of the amino acids present in each peptide and adjacent peptides in the data set. Using these vectors, the method then generates an algorithm from the plurality of vectors and the calculated probabilities that specific peptides in the data set were detected in the repeated analysis. The algorithm is thus capable of calculating the probability that a hypothetical peptide represented as a vector will be detected by a mass spectrometry based proteomic platform, given that the peptide is present in a sample introduced into a mass spectrometer.
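
    The vector representation described above can be sketched as follows. The particular dimensions chosen (length, mean Kyte-Doolittle hydrophobicity, charged-residue counts) are illustrative property choices, not the patent's exact feature set:

```python
from collections import Counter

# Kyte-Doolittle hydrophobicity values for a subset of residues
HYDRO = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'E': -3.5,
         'G': -0.4, 'K': -3.9, 'L': 3.8, 'S': -0.8, 'T': -0.7, 'V': 4.2}

def peptide_vector(seq):
    """Represent a peptide as a property vector: each dimension
    summarises one property of its amino acids (length, mean
    hydrophobicity, charged-residue counts), in the spirit of the
    patent's multi-dimensional vectors."""
    counts = Counter(seq)
    return [len(seq),
            sum(HYDRO.get(a, 0.0) for a in seq) / len(seq),
            counts['K'] + counts['R'],          # basic residues
            counts['D'] + counts['E']]          # acidic residues

def detection_probability(observations):
    """Fraction of repeated LC-MS analyses in which the peptide was seen."""
    return sum(observations) / len(observations)

vec = peptide_vector("ACDKLLGV")
p = detection_probability([1, 0, 1, 1, 1])       # seen in 4 of 5 runs
print(vec, p)
```

    Pairs of such vectors and observed detection probabilities are what the learned model is trained on; the model then scores hypothetical peptides that were never measured.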

  10. Investigation of the "true" extraction recovery of analytes from multiple types of tissues and its impact on tissue bioanalysis using two model compounds.

    PubMed

    Yuan, Long; Ma, Li; Dillon, Lisa; Fancher, R Marcus; Sun, Huadong; Zhu, Mingshe; Lehman-McKeeman, Lois; Aubry, Anne-Françoise; Ji, Qin C

    2016-11-16

    LC-MS/MS has been widely applied to the quantitative analysis of tissue samples. However, one key remaining issue is that the extraction recovery of analyte from spiked tissue calibration standard and quality control samples (QCs) may not accurately represent the "true" recovery of analyte from incurred tissue samples. This may affect the accuracy of LC-MS/MS tissue bioanalysis. Here, we investigated whether the recovery determined using tissue QCs by LC-MS/MS can accurately represent the "true" recovery from incurred tissue samples using two model compounds: BMS-986104, an S1P1 receptor modulator drug candidate, and its phosphate metabolite, BMS-986104-P. We first developed a novel acid and surfactant assisted protein precipitation method for the extraction of BMS-986104 and BMS-986104-P from rat tissues, and determined their recoveries using tissue QCs by LC-MS/MS. We then used radioactive incurred samples from rats dosed with 3H-labeled BMS-986104 to determine the absolute total radioactivity recovery in six different tissues. The recoveries determined using tissue QCs and incurred samples matched each other very well. The results demonstrated that, in this assay, tissue QCs accurately represented the incurred tissue samples for determining the "true" recovery, and that the LC-MS/MS assay was accurate for tissue bioanalysis. Another aspect we investigated is how the tissue QCs should be prepared to better represent the incurred tissue samples. We compared two different QC preparation methods (analyte spiked in tissue homogenates or in intact tissues) and demonstrated that the two methods had no significant difference when a good sample preparation was in place. The developed assay showed excellent accuracy and precision, and was successfully applied to the quantitative determination of BMS-986104 and BMS-986104-P in tissues in a rat toxicology study. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. BACTERIOLOGICAL ANALYSIS WITH SAMPLING AND SAMPLE PRESERVATION SPECIFICS

    EPA Science Inventory

    Current federal regulations (40CFR 503) specify that under certain conditions treated municipal biosolids must be analyzed for fecal coliform or salmonellae. The regulations state that representative samples of biosolids must be collected and analyzed using standard methods. Th...

  12. Comparison of strategies for the isolation of PCR-compatible, genomic DNA from a municipal biogas plant.

    PubMed

    Weiss, Agnes; Jérôme, Valérie; Freitag, Ruth

    2007-06-15

    The goal of the project was the extraction of PCR-compatible genomic DNA representative of the entire microbial community from municipal biogas plant samples (mash, bioreactor content, process water, liquid fertilizer). For the initial isolation of representative DNA from the respective lysates, methods were used that employed adsorption, extraction, or precipitation to specifically enrich the DNA. Since no dedicated method for biogas plant samples was available, preference was given to kits/methods suited to samples that resembled either the bioreactor feed, e.g. foodstuffs, or those intended for environmental samples including wastewater. None of the methods succeeded in preparing DNA that was directly PCR-compatible. Instead, the DNA still contained considerable amounts of difficult-to-remove enzyme inhibitors (presumably humic acids) that hindered the PCR reaction. Based on the isolation method that gave the highest yield/purity for all sample types, subsequent purification was attempted by agarose gel electrophoresis followed by electroelution, spermine precipitation, or dialysis through nitrocellulose membrane. A combination of phenol/chloroform extraction followed by purification via dialysis constituted the most efficient sample treatment. When such DNA preparations were diluted 1:100 they no longer inhibited PCR reactions, while still containing sufficient genomic DNA to allow specific amplification of target sequences.

  13. Evaluating performance of stormwater sampling approaches using a dynamic watershed model.

    PubMed

    Ackerman, Drew; Stein, Eric D; Ritter, Kerry J

    2011-09-01

    Accurate quantification of stormwater pollutant levels is essential for estimating overall contaminant discharge to receiving waters. Numerous sampling approaches exist that attempt to balance accuracy against the costs associated with the sampling method. This study employs a novel and practical approach to evaluating the accuracy of different stormwater monitoring methodologies, using stormflows and constituent concentrations produced by a fully validated continuous-simulation watershed model. A major advantage of using a watershed model to simulate pollutant concentrations is that a large number of storms representing a broad range of conditions can be applied in testing the various sampling approaches. Seventy-eight distinct methodologies were evaluated by "virtual samplings" of 166 simulated storms of varying size, intensity, and duration, representing 14 years of storms in Ballona Creek near Los Angeles, California. The 78 methods can be grouped into four general strategies: volume-paced compositing, time-paced compositing, pollutograph sampling, and microsampling. The performance of each sampling strategy was evaluated by comparing (1) the median relative error between the virtually sampled and the true modeled event mean concentration (EMC) of each storm (accuracy), (2) the median absolute deviation (MAD) of the relative error (precision), and (3) the percentage of storms for which sampling methods were within 10% of the true EMC (a combined measure of accuracy and precision). Finally, costs associated with site setup, sampling, and laboratory analysis were estimated for each method. Pollutograph sampling consistently outperformed the other three methods in terms of both accuracy and precision, but was the most costly method evaluated. Time-paced sampling consistently underestimated, while volume-paced sampling overestimated, the storm EMCs. Microsampling performance approached that of pollutograph sampling at a substantial cost savings. 
The most efficient method for routine stormwater monitoring in terms of a balance between performance and cost was volume-paced microsampling, with variable sample pacing to ensure that the entirety of the storm was captured. Pollutograph sampling is recommended if the data are to be used for detailed analysis of runoff dynamics.
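
    The event-mean-concentration comparison at the heart of this "virtual sampling" approach can be sketched as follows. The hydrograph, pollutograph, and sample pacing below are invented for illustration; this is not the authors' watershed model:

```python
import numpy as np

def event_mean_concentration(flow, conc, dt=1.0):
    """True EMC: flow-weighted mean concentration over the whole storm."""
    return np.sum(flow * conc * dt) / np.sum(flow * dt)

def volume_paced_composite(flow, conc, pace, dt=1.0):
    """EMC estimate from a volume-paced composite sampler: one equal
    aliquot is taken each time another `pace` units of runoff volume
    pass, and the composite is analysed as a single sample."""
    cumvol = np.cumsum(flow * dt)
    triggers = np.searchsorted(cumvol, np.arange(pace, cumvol[-1], pace))
    return conc[triggers].mean()

# synthetic storm: bell-shaped hydrograph with a first-flush pollutograph
t = np.arange(0, 120.0)
flow = np.exp(-((t - 40) / 25) ** 2) * 10 + 0.5      # m3/s
conc = 150 * np.exp(-t / 30) + 20                    # mg/L
true_emc = event_mean_concentration(flow, conc)
est_emc = volume_paced_composite(flow, conc, pace=25.0)
print(round(true_emc, 1), round(est_emc, 1))
```

    Repeating the "virtual sampling" over many simulated storms, and for each candidate strategy, gives the relative-error distributions from which the accuracy and precision statistics above are computed.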

  14. Detailed description of oil shale organic and mineralogical heterogeneity via Fourier transform infrared microscopy

    USGS Publications Warehouse

    Washburn, Kathryn E.; Birdwell, Justin E.; Foster, Michael; Gutierrez, Fernando

    2015-01-01

    Mineralogical and geochemical information on reservoir and source rocks is necessary to assess and produce from petroleum systems. The standard methods in the petroleum industry for obtaining these properties are bulk measurements on homogenized, generally crushed, and pulverized rock samples and can take from hours to days to perform. New methods using Fourier transform infrared (FTIR) spectroscopy have been developed to more rapidly obtain information on mineralogy and geochemistry. However, these methods are also typically performed on bulk, homogenized samples. We present a new approach to rock sample characterization incorporating multivariate analysis and FTIR microscopy to provide non-destructive, spatially resolved mineralogy and geochemistry on whole rock samples. We are able to predict bulk mineralogy and organic carbon content within the same margin of error as standard characterization techniques, including X-ray diffraction (XRD) and total organic carbon (TOC) analysis. Validation of the method was performed using two oil shale samples from the Green River Formation in the Piceance Basin with differing sedimentary structures. One sample represents laminated Green River oil shales, and the other is representative of oil shale breccia. The FTIR microscopy results on the oil shales agree with XRD and LECO TOC data from the homogenized samples but also give additional detail regarding sample heterogeneity by providing information on the distribution of mineral phases and organic content. While measurements for this study were performed on oil shales, the method could also be applied to other geological samples, such as other mudrocks, complex carbonates, and soils.

  15. A Comparison of EPI Sampling, Probability Sampling, and Compact Segment Sampling Methods for Micro and Small Enterprises

    PubMed Central

    Chao, Li-Wei; Szrek, Helena; Peltzer, Karl; Ramlagan, Shandir; Fleming, Peter; Leite, Rui; Magerman, Jesswill; Ngwenya, Godfrey B.; Pereira, Nuno Sousa; Behrman, Jere

    2011-01-01

    Finding an efficient method for sampling micro- and small-enterprises (MSEs) for research and statistical reporting purposes is a challenge in developing countries, where registries of MSEs are often nonexistent or outdated. This lack of a sampling frame creates an obstacle in finding a representative sample of MSEs. This study uses computer simulations to draw samples from a census of businesses and non-businesses in the Tshwane Municipality of South Africa, using three different sampling methods: the traditional probability sampling method, the compact segment sampling method, and the World Health Organization’s Expanded Programme on Immunization (EPI) sampling method. Three mechanisms by which the methods could differ are tested, the proximity selection of respondents, the at-home selection of respondents, and the use of inaccurate probability weights. The results highlight the importance of revisits and accurate probability weights, but the lesser effect of proximity selection on the samples’ statistical properties. PMID:22582004

  16. A method for development of a system of identification for Appalachian coal-bearing rocks

    USGS Publications Warehouse

    Ferm, J.C.; Weisenfluh, G.A.; Smith, G.C.

    2002-01-01

    The number of observable properties of sedimentary rocks is large and numerous classifications have been proposed for describing them. Some rock classifications, however, may be disadvantageous in situations such as logging rock core during coal exploration programs, where speed and simplicity are the essence. After experimenting with a number of formats for logging rock core in the Appalachian coal fields, a method of using color photographs accompanied by a rock name and numeric code was selected. In order to generate a representative collection of rocks to be photographed, sampling methods were devised, and empirically based techniques were developed to identify repeatedly recognizable rock types. A number of cores representing the stratigraphic and geographic range of the region were sampled so that every megascopically recognizable variety was included in the collection; the frequency of samples of any variety reflects the frequency with which it would be encountered during logging. In order to generate repeatedly recognizable rock classes, the samples were sorted to display variation in grain size, mineral composition, color, and sedimentary structures. Class boundaries for each property were selected on the basis of existing, widely accepted limits and the precision with which these limits could be recognized. The process of sorting the core samples demonstrated relationships between rock properties and indicated that similar methods, applied to other groups of rocks, could yield more widely applicable field classifications. © 2002 Elsevier Science B.V. All rights reserved.

  17. Sampling Strategies and Processing of Biobank Tissue Samples from Porcine Biomedical Models.

    PubMed

    Blutke, Andreas; Wanke, Rüdiger

    2018-03-06

    In translational medical research, porcine models have steadily become more popular. Considering the high value of individual animals, particularly of genetically modified pig models, and the often-limited number of available animals of these models, establishment of (biobank) collections of adequately processed tissue samples suited for a broad spectrum of subsequent analysis methods, including analyses not specified at the time point of sampling, represents a meaningful approach to taking full advantage of the translational value of the model. With respect to the peculiarities of porcine anatomy, comprehensive guidelines have recently been established for standardized generation of representative, high-quality samples from different porcine organs and tissues. These guidelines are essential prerequisites for the reproducibility of results and their comparability between different studies and investigators. The recording of basic data, such as organ weights and volumes, the determination of the sampling locations and of the numbers of tissue samples to be generated, as well as their orientation, size, processing and trimming directions, are relevant factors determining the generalizability and usability of the specimen for molecular, qualitative, and quantitative morphological analyses. Here, an illustrative, practical, step-by-step demonstration of the most important techniques for generation of representative, multi-purpose biobank specimen from porcine tissues is presented. 
The methods described here include determination of organ/tissue volumes and densities, the application of a volume-weighted systematic random sampling procedure for parenchymal organs by point-counting, determination of the extent of tissue shrinkage related to histological embedding of samples, and generation of randomly oriented samples for quantitative stereological analyses, such as isotropic uniform random (IUR) sections generated by the "Orientator" and "Isector" methods, and vertical uniform random (VUR) sections.
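
    The point-counting procedure mentioned above follows the classical Cavalieri estimator. A minimal sketch, with illustrative slice counts and grid spacing that are assumptions rather than values from the paper:

```python
# Sketch of Cavalieri point-counting volume estimation, as used in
# volume-weighted systematic random sampling of parenchymal organs.
# Slice counts and grid spacing below are illustrative assumptions.

def cavalieri_volume(points_per_slice, grid_spacing_cm, slice_thickness_cm):
    """Estimate organ volume from point counts on equidistant slices.

    V = t * a(p) * sum(P_i), where t is the slice thickness, a(p) the
    area associated with one grid point, and P_i the number of grid
    points hitting tissue on slice i.
    """
    area_per_point = grid_spacing_cm ** 2
    return slice_thickness_cm * area_per_point * sum(points_per_slice)

# Example: 5 slices cut 1 cm apart, counted with a 0.5 cm point grid,
# so a(p) = 0.25 cm^2 per point.
volume = cavalieri_volume([40, 85, 120, 90, 35], 0.5, 1.0)
print(volume)  # 92.5 (cm^3)
```

The estimate is unbiased as long as the first slice position is chosen uniformly at random within the slicing interval.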

  18. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality

    PubMed Central

    Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Introduction Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. Materials and methods The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. Results The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. Conclusions The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. 
Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit dialing of mobile phones offers promise for future data collection in Ghana and may be suitable for other developing countries. PMID:29351349
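
    The AAPOR-style rate calculation described above can be sketched as follows. Only the completed (9,469) and partial (3,547) interview counts come from the abstract; the refusal, noncontact, unknown-eligibility counts and the eligibility rate e are illustrative assumptions:

```python
# Minimal sketch of an AAPOR-style response rate (RR2: completes plus
# partials over all eligible or possibly eligible cases). Disposition
# counts other than completes/partials are illustrative, not the
# survey's actual call outcomes.

def aapor_rr2(completes, partials, refusals, noncontacts, unknown, e=1.0):
    """RR2 = (I + P) / ((I + P) + R + NC + e*UH), where e is the
    estimated eligibility rate among cases of unknown eligibility."""
    eligible = completes + partials + refusals + noncontacts + e * unknown
    return (completes + partials) / eligible

rate = aapor_rr2(completes=9469, partials=3547, refusals=2900,
                 noncontacts=10000, unknown=20000, e=0.5)
print(round(rate, 3))
```

Varying e between 0 and 1 brackets the response rate when the eligibility of unresolved numbers is uncertain, as is typical for random-digit-dial mobile frames.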

  19. Electrofracturing test system and method of determining material characteristics of electrofractured material samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bauer, Stephen J.; Glover, Steven F.; Pfeifle, Tom

    A device for electrofracturing a material sample and analyzing the material sample is disclosed. The device simulates an in situ electrofracturing environment so as to obtain electrofractured material characteristics representative of field applications while allowing permeability testing of the fractured sample under in situ conditions.

  20. ROLE OF LABORATORY SAMPLING DEVICES AND LABORATORY SUBSAMPLING METHODS IN OPTIMIZING REPRESENTATIVENESS STRATEGIES

    EPA Science Inventory

    Sampling is the act of selecting items from a specified population in order to estimate the parameters of that population (e.g., selecting soil samples to characterize the properties at an environmental site). Sampling occurs at various levels and times throughout an environmenta...

  1. Sampling in epidemiological research: issues, hazards and pitfalls.

    PubMed

    Tyrer, Stephen; Heyman, Bob

    2016-04-01

    Surveys of people's opinions are fraught with difficulties. It is easier to obtain information from those who respond to text messages or to emails than to attempt to obtain a representative sample. Samples of the population that are selected non-randomly in this way are termed convenience samples as they are easy to recruit. This introduces a sampling bias. Such non-probability samples have merit in many situations, but an epidemiological enquiry is of little value unless a random sample is obtained. If a sufficient number of those selected actually complete a survey, the results are likely to be representative of the population. This editorial describes probability and non-probability sampling methods and illustrates the difficulties and suggested solutions in performing accurate epidemiological research.

  2. Sampling in epidemiological research: issues, hazards and pitfalls

    PubMed Central

    Tyrer, Stephen; Heyman, Bob

    2016-01-01

    Surveys of people's opinions are fraught with difficulties. It is easier to obtain information from those who respond to text messages or to emails than to attempt to obtain a representative sample. Samples of the population that are selected non-randomly in this way are termed convenience samples as they are easy to recruit. This introduces a sampling bias. Such non-probability samples have merit in many situations, but an epidemiological enquiry is of little value unless a random sample is obtained. If a sufficient number of those selected actually complete a survey, the results are likely to be representative of the population. This editorial describes probability and non-probability sampling methods and illustrates the difficulties and suggested solutions in performing accurate epidemiological research. PMID:27087985

  3. CIHR Candrive Cohort Comparison with Canadian Household Population Holding Valid Driver's Licenses.

    PubMed

    Gagnon, Sylvain; Marshall, Shawn; Kadulina, Yara; Stinchcombe, Arne; Bédard, Michel; Gélinas, Isabelle; Man-Son-Hing, Malcolm; Mazer, Barbara; Naglie, Gary; Porter, Michelle M; Rapoport, Mark; Tuokko, Holly; Vrkljan, Brenda

    2016-06-01

    We investigated whether convenience sampling is a suitable method for generating a sample of older drivers representative of the older-Canadian driver population. Using equivalence testing, we compared a large convenience sample of older drivers (the Candrive II prospective cohort study) to a similarly aged population of older Canadian drivers. The Candrive sample consists of 928 community-dwelling older drivers from seven metropolitan areas of Canada. The population data were obtained from the Canadian Community Health Survey - Healthy Aging (CCHS-HA), which is a representative sample of older Canadians. Data for drivers aged 70 and older were extracted from the CCHS-HA database, for a total of 3,899 older Canadian drivers. The two samples were demonstrated to be equivalent on the socio-demographic, health, and driving variables that we compared, but not on driving frequency. We conclude that the convenience sampling used in the Candrive study created a fairly representative sample of older Canadian drivers, with a few exceptions.
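
    Equivalence testing of the kind used above is commonly done with two one-sided tests (TOST): the samples are declared equivalent only if the difference is shown to be both above a lower margin and below an upper margin. A hedged sketch on a difference of means; the margin, means, and standard error are illustrative assumptions, not the study's statistics:

```python
# Two one-sided tests (TOST) for equivalence of two means, a generic
# sketch of the equivalence-testing approach. All numbers below are
# illustrative assumptions.

def tost_equivalent(mean1, mean2, se_diff, margin, alpha_z=1.645):
    """Declare equivalence if both one-sided z-tests reject:
    H01: diff <= -margin  and  H02: diff >= +margin.
    alpha_z is the one-sided critical value (1.645 for alpha=0.05)."""
    diff = mean1 - mean2
    z_lower = (diff + margin) / se_diff   # test against the -margin bound
    z_upper = (diff - margin) / se_diff   # test against the +margin bound
    return z_lower > alpha_z and z_upper < -alpha_z

# e.g. hypothetical mean weekly driving days, sample vs. population
print(tost_equivalent(5.1, 5.0, se_diff=0.05, margin=0.25))  # True
```

Note the asymmetry with ordinary significance testing: failing to find a difference is not evidence of equivalence; TOST requires actively rejecting both non-equivalence hypotheses.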

  4. Application of advanced sampling and analysis methods to predict the structure of adsorbed protein on a material surface

    PubMed Central

    Abramyan, Tigran M.; Hyde-Volpe, David L.; Stuart, Steven J.; Latour, Robert A.

    2017-01-01

    The use of standard molecular dynamics simulation methods to predict the interactions of a protein with a material surface has the inherent limitations of being unable to determine the most likely conformations and orientations of the adsorbed protein on the surface or the level of convergence attained by the simulation. In addition, standard mixing rules are typically applied to combine the nonbonded force field parameters of the solution and solid phases of the system to represent interfacial behavior, without validation. As a means to circumvent these problems, the authors demonstrate the application of an efficient advanced sampling method (TIGER2A) for the simulation of the adsorption of hen egg-white lysozyme on a crystalline (110) high-density polyethylene surface plane. Simulations are conducted to generate a Boltzmann-weighted ensemble of sampled states using force field parameters that were validated to represent interfacial behavior for this system. The resulting ensembles of sampled states were then analyzed using an in-house-developed cluster analysis method to predict the most probable orientations and conformations of the protein on the surface based on the amount of sampling performed, from which free energy differences between the adsorbed states could be calculated. In addition, by conducting two independent sets of TIGER2A simulations combined with cluster analyses, the authors demonstrate a method to estimate the degree of convergence achieved for a given amount of sampling. The results from these simulations demonstrate that these methods enable the most probable orientations and conformations of an adsorbed protein to be predicted, and that the use of our validated interfacial force field parameter set provides closer agreement to available experimental results than standard CHARMM force field parameterization for representing molecular behavior at the interface. PMID:28514864

  5. Point-Sampling and Line-Sampling Probability Theory, Geometric Implications, Synthesis

    Treesearch

    L.R. Grosenbaugh

    1958-01-01

    Foresters concerned with measuring tree populations on definite areas have long employed two well-known methods of representative sampling. In list or enumerative sampling the entire tree population is tallied with a known proportion being randomly selected and measured for volume or other variables. In area sampling all trees on randomly located plots or strips...

  6. Jaccard distance based weighted sparse representation for coarse-to-fine plant species recognition.

    PubMed

    Zhang, Shanwen; Wu, Xiaowei; You, Zhuhong

    2017-01-01

    Leaf-based plant species recognition plays an important role in ecological protection; however, its application to large, modern leaf databases has long been hindered by computational cost and feasibility. Recognizing such limitations, we propose a Jaccard distance based sparse representation (JDSR) method, which adopts a two-stage, coarse-to-fine strategy for plant species recognition. In the first stage, we use the Jaccard distance between the test sample and each training sample to coarsely determine the candidate classes of the test sample. The second stage applies a Jaccard distance based weighted sparse representation based classification (WSRC), which aims to approximately represent the test sample in the training space and classify it by the approximation residuals. Since the training model of our JDSR method involves far fewer but more informative representatives, this method is expected to overcome the high computational and memory costs of traditional sparse representation based classification. Comparative experimental results on a public leaf image database demonstrate that the proposed method outperforms other existing feature extraction and SRC-based plant recognition methods in terms of both accuracy and computational speed.
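
    The coarse first stage above can be sketched as a Jaccard-distance shortlist over binarized feature sets. The feature sets and class labels below are toy assumptions standing in for real leaf descriptors; the fine-grained WSRC stage is omitted:

```python
# Coarse candidate-class selection by Jaccard distance, a sketch of
# the first stage of a coarse-to-fine recognition pipeline. Feature
# sets and labels are toy data.

def jaccard_distance(a, b):
    """1 - |A intersect B| / |A union B| for two feature sets."""
    a, b = set(a), set(b)
    return 1.0 - len(a & b) / len(a | b)

def candidate_classes(test_feats, train_db, k=2):
    """Return the k classes whose nearest training sample has the
    smallest Jaccard distance to the test sample."""
    best = {}
    for label, feats in train_db:
        d = jaccard_distance(test_feats, feats)
        best[label] = min(d, best.get(label, 1.0))
    return sorted(best, key=best.get)[:k]

train_db = [("oak", {1, 2, 3, 4}), ("maple", {2, 3, 5}), ("pine", {7, 8, 9})]
print(candidate_classes({1, 2, 3}, train_db, k=2))  # ['oak', 'maple']
```

Only the shortlisted classes then enter the expensive sparse-representation step, which is where the reported speedup comes from.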

  7. Nonlinear Spatial Inversion Without Monte Carlo Sampling

    NASA Astrophysics Data System (ADS)

    Curtis, A.; Nawaz, A.

    2017-12-01

    High-dimensional, nonlinear inverse or inference problems usually have non-unique solutions. The distribution of solutions is described by probability distributions, and these are usually found using Monte Carlo (MC) sampling methods. These take pseudo-random samples of models in parameter space, calculate the probability of each sample given available data and other information, and thus map out high or low probability values of model parameters. However, such methods converge to the solution only as the number of samples tends to infinity; in practice, MC is found to be slow to converge, convergence is not guaranteed to be achieved in finite time, and detection of convergence requires the use of subjective criteria. We propose a method for Bayesian inversion of categorical variables, such as geological facies or rock types in spatial problems, which requires no sampling at all. The method uses a 2-D Hidden Markov Model over a grid of cells, where observations represent localized data constraining the model in each cell. The data in our example application are seismic properties such as P- and S-wave impedances or rock density; our model parameters are the hidden states and represent the geological rock types in each cell. The observations at each location are assumed to depend on the facies at that location only, an assumption referred to as 'localized likelihoods'. However, the facies at a location cannot be determined solely by the observation at that location, as it also depends on prior information concerning its correlation with the spatial distribution of facies elsewhere. Such prior information is included in the inversion in the form of a training image, which represents a conceptual depiction of the distribution of local geologies that might be expected, but other forms of prior information can be used in the method as desired. 
The method provides direct (pseudo-analytic) estimates of posterior marginal probability distributions over each variable, so these do not need to be estimated from samples as is required in MC methods. On a 2-D test example the method is shown to outperform previous methods significantly, and at a fraction of the computational cost. In many foreseeable applications there are therefore no serious impediments to extending the method to 3-D spatial models.
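
    The idea of sampling-free posterior marginals can be illustrated on a 1-D chain, a simplified analogue of the paper's 2-D grid model: the forward-backward recursion of a hidden Markov model yields exact marginals over facies at each cell with no Monte Carlo sampling. The two-facies transition matrix and localized likelihood values below are illustrative assumptions:

```python
import numpy as np

# 1-D analogue of sampling-free HMM inversion: forward-backward
# recursion gives exact posterior marginals over hidden facies at
# each cell. Transition and likelihood values are toy numbers.

def posterior_marginals(trans, lik):
    """trans: (K,K) facies transition matrix; lik: (T,K) localized
    likelihoods p(obs_t | facies k). Returns (T,K) posteriors."""
    T, K = lik.shape
    fwd = np.zeros((T, K))
    bwd = np.ones((T, K))
    fwd[0] = lik[0] / K               # uniform prior on the first cell
    fwd[0] /= fwd[0].sum()
    for t in range(1, T):             # forward pass (normalized)
        fwd[t] = lik[t] * (fwd[t - 1] @ trans)
        fwd[t] /= fwd[t].sum()
    for t in range(T - 2, -1, -1):    # backward pass (normalized)
        bwd[t] = trans @ (lik[t + 1] * bwd[t + 1])
        bwd[t] /= bwd[t].sum()
    post = fwd * bwd
    return post / post.sum(axis=1, keepdims=True)

trans = np.array([[0.9, 0.1], [0.2, 0.8]])   # facies-persistence prior
lik = np.array([[0.8, 0.2], [0.5, 0.5], [0.1, 0.9]])
post = posterior_marginals(trans, lik)
print(post.round(3))  # one row of facies probabilities per cell
```

The marginals are pseudo-analytic in the sense above: they are computed by recursion, not estimated from samples, so there is no convergence detection problem.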

  8. Evaluation of Legionella Air Contamination in Healthcare Facilities by Different Sampling Methods: An Italian Multicenter Study.

    PubMed

    Montagna, Maria Teresa; De Giglio, Osvalda; Cristina, Maria Luisa; Napoli, Christian; Pacifico, Claudia; Agodi, Antonella; Baldovin, Tatjana; Casini, Beatrice; Coniglio, Maria Anna; D'Errico, Marcello Mario; Delia, Santi Antonino; Deriu, Maria Grazia; Guida, Marco; Laganà, Pasqualina; Liguori, Giorgio; Moro, Matteo; Mura, Ida; Pennino, Francesca; Privitera, Gaetano; Romano Spica, Vincenzo; Sembeni, Silvia; Spagnolo, Anna Maria; Tardivo, Stefano; Torre, Ida; Valeriani, Federica; Albertini, Roberto; Pasquarella, Cesira

    2017-06-22

    Healthcare facilities (HF) represent an at-risk environment for legionellosis transmission occurring after inhalation of contaminated aerosols. In general, the control of water is preferred to that of air because, to date, there are no standardized sampling protocols. Legionella air contamination was investigated in the bathrooms of 11 HF by active sampling (Surface Air System and Coriolis ® μ) and passive sampling using settling plates. During the 8-hour sampling, hot tap water was sampled three times. All air samples were evaluated using culture-based methods, whereas liquid samples collected using the Coriolis ® μ were also analyzed by real-time PCR. Legionella presence in the air and water was then compared by sequence-based typing (SBT) methods. Air contamination was found in four HF (36.4%) by at least one of the culturable methods. The culturable investigation by Coriolis ® μ did not yield Legionella in any enrolled HF. However, molecular investigation using Coriolis ® μ resulted in eight HF testing positive for Legionella in the air. Comparison of Legionella air and water contamination indicated that Legionella water concentration could be predictive of its presence in the air. Furthermore, a molecular study of 12 L. pneumophila strains confirmed a match between the Legionella strains from air and water samples by SBT for three out of four HF that tested positive for Legionella by at least one of the culturable methods. Overall, our study shows that Legionella air detection cannot replace water sampling because the absence of microorganisms from the air does not necessarily represent their absence from water; nevertheless, air sampling may provide useful information for risk assessment. The liquid impingement technique appears to have the greatest capacity for collecting airborne Legionella if combined with molecular investigations.

  9. Evaluation of Legionella Air Contamination in Healthcare Facilities by Different Sampling Methods: An Italian Multicenter Study

    PubMed Central

    Montagna, Maria Teresa; De Giglio, Osvalda; Cristina, Maria Luisa; Napoli, Christian; Pacifico, Claudia; Agodi, Antonella; Baldovin, Tatjana; Casini, Beatrice; Coniglio, Maria Anna; D’Errico, Marcello Mario; Delia, Santi Antonino; Deriu, Maria Grazia; Guida, Marco; Laganà, Pasqualina; Liguori, Giorgio; Moro, Matteo; Mura, Ida; Pennino, Francesca; Privitera, Gaetano; Romano Spica, Vincenzo; Sembeni, Silvia; Spagnolo, Anna Maria; Tardivo, Stefano; Torre, Ida; Valeriani, Federica; Albertini, Roberto; Pasquarella, Cesira

    2017-01-01

    Healthcare facilities (HF) represent an at-risk environment for legionellosis transmission occurring after inhalation of contaminated aerosols. In general, the control of water is preferred to that of air because, to date, there are no standardized sampling protocols. Legionella air contamination was investigated in the bathrooms of 11 HF by active sampling (Surface Air System and Coriolis®μ) and passive sampling using settling plates. During the 8-hour sampling, hot tap water was sampled three times. All air samples were evaluated using culture-based methods, whereas liquid samples collected using the Coriolis®μ were also analyzed by real-time PCR. Legionella presence in the air and water was then compared by sequence-based typing (SBT) methods. Air contamination was found in four HF (36.4%) by at least one of the culturable methods. The culturable investigation by Coriolis®μ did not yield Legionella in any enrolled HF. However, molecular investigation using Coriolis®μ resulted in eight HF testing positive for Legionella in the air. Comparison of Legionella air and water contamination indicated that Legionella water concentration could be predictive of its presence in the air. Furthermore, a molecular study of 12 L. pneumophila strains confirmed a match between the Legionella strains from air and water samples by SBT for three out of four HF that tested positive for Legionella by at least one of the culturable methods. Overall, our study shows that Legionella air detection cannot replace water sampling because the absence of microorganisms from the air does not necessarily represent their absence from water; nevertheless, air sampling may provide useful information for risk assessment. The liquid impingement technique appears to have the greatest capacity for collecting airborne Legionella if combined with molecular investigations. PMID:28640202

  10. Signal Sampling for Efficient Sparse Representation of Resting State FMRI Data

    PubMed Central

    Ge, Bao; Makkie, Milad; Wang, Jin; Zhao, Shijie; Jiang, Xi; Li, Xiang; Lv, Jinglei; Zhang, Shu; Zhang, Wei; Han, Junwei; Guo, Lei; Liu, Tianming

    2015-01-01

    As the size of brain imaging data such as fMRI grows explosively, it provides us with unprecedented and abundant information about the brain. Reducing the size of fMRI data without losing much information has become an increasingly pressing issue. Recent studies have addressed it with dictionary learning and sparse representation methods; however, their computational complexity remains high, which hampers the wider application of sparse representation to large-scale fMRI datasets. To effectively address this problem, this work proposes to represent resting state fMRI (rs-fMRI) signals of a whole brain via statistical-sampling-based sparse representation. First we sampled the whole brain's signals via different sampling methods, then the sampled signals were aggregated into an input data matrix to learn a dictionary, and finally this dictionary was used to sparsely represent the whole brain's signals and identify the resting state networks. Comparative experiments demonstrate that the proposed signal sampling framework can speed up reconstruction of concurrent brain networks by ten times without losing much information. Experiments on the 1000 Functional Connectomes Project further demonstrate its effectiveness and superiority. PMID:26646924
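
    The sampling step above amounts to drawing a subset of voxel time series to serve as the dictionary-learning input. A minimal sketch with toy matrix sizes (the dictionary-learning and sparse-coding stages themselves are omitted):

```python
import numpy as np

# Sketch of signal sampling for sparse representation: draw a random
# subset of voxel signals (rows) from the whole-brain signal matrix
# to use as dictionary-learning input. Sizes are toy values.

rng = np.random.default_rng(0)
n_voxels, n_timepoints = 10000, 120
signals = rng.standard_normal((n_voxels, n_timepoints))  # stand-in rs-fMRI signals

def sample_signals(X, fraction, rng):
    """Uniform random sampling of rows (voxel time series)."""
    n = max(1, int(X.shape[0] * fraction))
    idx = rng.choice(X.shape[0], size=n, replace=False)
    return X[idx]

subset = sample_signals(signals, fraction=0.1, rng=rng)
print(subset.shape)  # (1000, 120)
```

Because dictionary learning scales with the number of input signals, learning from the 10% subset and then sparse-coding all voxels against the learned dictionary is what produces the reported order-of-magnitude speedup.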

  11. Estimation of the sugar cane cultivated area from LANDSAT images using the two phase sampling method

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Cappelletti, C. A.; Mendonca, F. J.; Lee, D. C. L.; Shimabukuro, Y. E.

    1982-01-01

    A two-phase sampling method and the optimal sampling segment dimensions for the estimation of sugar cane cultivated area were developed. This technique employs visual interpretations of LANDSAT images and of panchromatic aerial photographs, the latter considered the ground truth. The estimates, as a mean value of 100 simulated samples, represent 99.3% of the true value with a CV of approximately 1%; the relative efficiency of the two-phase design was 157% when compared with a one-phase aerial photograph sample.
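
    A common estimator for this kind of two-phase (double sampling) design is the ratio estimator: a large first-phase sample is interpreted on imagery, and a smaller second-phase subsample is re-measured on the ground-truth source to calibrate it. The estimator form and all numbers below are illustrative assumptions, not the study's data:

```python
# Hedged sketch of a two-phase (double sampling) ratio estimator:
# phase 1 = many segments interpreted on LANDSAT imagery,
# phase 2 = a subsample re-interpreted on aerial photos (ground truth).
# Numbers are illustrative.

def two_phase_ratio_estimate(phase1_image_areas, phase2_image_areas,
                             phase2_truth_areas):
    """Estimate mean area per segment as the phase-1 image mean
    scaled by the truth/image ratio observed on the phase-2 subsample."""
    mean_phase1 = sum(phase1_image_areas) / len(phase1_image_areas)
    ratio = sum(phase2_truth_areas) / sum(phase2_image_areas)
    return mean_phase1 * ratio

est = two_phase_ratio_estimate(
    phase1_image_areas=[100, 120, 80, 110, 90],
    phase2_image_areas=[100, 80],
    phase2_truth_areas=[104, 82],
)
print(round(est, 2))  # 103.33
```

The efficiency gain comes from the image interpretation being cheap enough to apply to many segments, while the expensive ground truth is needed only on the subsample used to estimate the ratio.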

  12. Aerostat-Lofted Instrument Platform and Sampling Method for Determination of Emissions from Open Area Sources

    EPA Science Inventory

    Sampling emissions from open area sources, particularly sources of open burning, is difficult due to fast dilution of emissions and safety concerns for personnel. Representative emission samples can be difficult to obtain with flaming and explosive sources since personnel safety ...

  13. Surveying immigrants without sampling frames - evaluating the success of alternative field methods.

    PubMed

    Reichel, David; Morales, Laura

    2017-01-01

    This paper evaluates the sampling methods of an international survey, the Immigrant Citizens Survey, which aimed to survey immigrants from outside the European Union (EU) in 15 cities in seven EU countries. In five countries, no sampling frame was available for the target population. Consequently, alternative ways to obtain a representative sample had to be found. In three countries 'location sampling' was employed, while in two countries traditional methods were used with adaptations to reach the target population. The paper assesses the main methodological challenges of carrying out a survey among a group of immigrants for whom no sampling frame exists. The samples obtained in these five countries are compared to official statistics in order to assess the accuracy of the samples produced by the different sampling methods. It can be shown that alternative sampling methods can provide meaningful results in terms of core demographic characteristics, although some estimates differ to some extent from the census results.

  14. Dynamics Sampling in Transition Pathway Space.

    PubMed

    Zhou, Hongyu; Tao, Peng

    2018-01-09

    The minimum energy pathway contains important information describing the transition between two states on a potential energy surface (PES). Chain-of-states methods were developed to efficiently calculate minimum energy pathways connecting two stable states. In the chain-of-states framework, a series of structures is generated and optimized to represent the minimum energy pathway connecting two states. However, multiple pathways may exist between two states and should be identified to obtain a full view of the transitions. Therefore, we developed an enhanced sampling method, named the direct pathway dynamics sampling (DPDS) method, to facilitate exploration of a PES for multiple pathways connecting two stable states, as well as additional minima and their associated transition pathways. In the DPDS method, molecular dynamics simulations are carried out on the target PES within a chain-of-states framework to directly sample the transition pathway space. DPDS simulations can be regulated by two parameters controlling the distance among states along the pathway and the smoothness of the pathway. One advantage of the chain-of-states framework is that no specific reaction coordinates are necessary to generate the reaction pathway, because such information is implicitly represented by the structures along the pathway. The chain-of-states setup in DPDS greatly enhances sampling in the high-energy space between the two end states, such as transition states. By removing the constraint on the end states of the pathway, DPDS will also sample pathways connecting minima on a PES beyond the end points of the starting pathway. This feature makes DPDS an ideal method to directly explore transition pathway space. Three examples demonstrate the efficiency of the DPDS method in sampling the high-energy regions important for reactions on the PES.

  15. The comparison of automated clustering algorithms for resampling representative conformer ensembles with RMSD matrix.

    PubMed

    Kim, Hyoungrae; Jang, Cheongyun; Yadav, Dharmendra K; Kim, Mi-Hyun

    2017-03-23

    The accuracy of any 3D-QSAR, pharmacophore, or 3D-similarity based chemometric target fishing model is highly dependent on a reasonable sample of active conformations. A number of diverse conformational sampling algorithms exist that exhaustively generate enough conformers; however, model building methods rely on an explicit number of common conformers. In this work, we have attempted to devise clustering algorithms that can automatically find a reasonable number of representative conformer ensembles from an asymmetric dissimilarity matrix generated with the OpenEye toolkit. RMSD was the important descriptor (variable) of each column of the N × N matrix, considered as N variables describing the relationship (network) between each conformer (in a row) and the other N conformers. This approach was used to evaluate the performance of well-known clustering algorithms by comparing them in terms of generating representative conformer ensembles, and to test them over different matrix transformation functions with respect to stability. In the network, the representative conformer group could be resampled for four kinds of algorithms with implicit parameters. The directed dissimilarity matrix becomes the only input to the clustering algorithms. The Dunn index, Davies-Bouldin index, eta-squared values, and omega-squared values were used to evaluate the clustering algorithms with respect to compactness and explanatory power. The evaluation also includes the reduction (abstraction) rate of the data, the correlation between the sizes of the population and the samples, the computational complexity, and the memory usage. Every algorithm could find representative conformers automatically without any user intervention, and they reduced the data to 14-19% of the original values within 1.13 s per sample at most. The clustering methods are simple and practical, as they are fast and do not require any explicit parameters. 
RCDTC presented the maximum Dunn and omega-squared values of the four algorithms, in addition to a consistent reduction rate between the population size and the sample size. The performance of the clustering algorithms was consistent over different transformation functions. Moreover, the clustering method can also be applied to molecular dynamics sampling simulation results.
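
    The Dunn index used above to compare clusterings is simple to compute: minimum inter-cluster separation divided by maximum intra-cluster diameter, with higher values indicating compact, well-separated clusters. A sketch on toy 2-D points standing in for conformers in RMSD space:

```python
import numpy as np

# Dunn index: min inter-cluster distance / max intra-cluster diameter.
# Higher is better. Points below are toy data, not real conformers.

def dunn_index(points, labels):
    points = np.asarray(points, dtype=float)
    labels = np.asarray(labels)
    clusters = [points[labels == c] for c in np.unique(labels)]
    # maximum diameter within any single cluster
    diam = max(
        np.linalg.norm(a - b) for c in clusters for a in c for b in c
    )
    # minimum distance between points of different clusters
    sep = min(
        np.linalg.norm(a - b)
        for i, ci in enumerate(clusters)
        for cj in clusters[i + 1:]
        for a in ci for b in cj
    )
    return sep / diam

pts = [[0, 0], [0, 1], [5, 0], [5, 1]]
print(dunn_index(pts, [0, 0, 1, 1]))  # 5.0 (separation 5 / diameter 1)
```

With a precomputed RMSD matrix the same formula applies directly, replacing the Euclidean norms with matrix lookups.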

  16. Characterization of rock thermal conductivity by high-resolution optical scanning

    USGS Publications Warehouse

    Popov, Y.A.; Pribnow, D.F.C.; Sass, J.H.; Williams, C.F.; Burkhardt, H.

    1999-01-01

    We compared three laboratory methods for thermal conductivity measurements: divided-bar, line-source, and optical scanning. These methods are widely used in geothermal and petrophysical studies, particularly as applied to research on cores from deep scientific boreholes. The relatively new optical scanning method has recently been perfected and applied to geophysical problems. A comparison among these methods for determining the thermal conductivity tensor for anisotropic rocks is based on a representative collection of 80 crystalline rock samples from the KTB continental deep borehole (Germany). Despite substantial inhomogeneity of rock thermal conductivity (up to 40-50% variation) and high anisotropy (with ratios of principal values attaining 2 and more), the results of measurements agree very well among the different methods. The discrepancy for measurements along the foliation is negligible (<1%). The component of thermal conductivity normal to the foliation reveals somewhat larger differences (3-4%). Optical scanning allowed us to characterize the thermal inhomogeneity of rocks and to identify a three-dimensional anisotropy in the thermal conductivity of some gneiss samples. The merits of optical scanning include minor random errors (1.6%); the ability to record the variation of thermal conductivity along the sample; the ability to sample deeply using a slow scanning rate; freedom from constraints on sample size, shape, and quality of mechanical treatment of the sample surface; a contactless mode of measurement; high speed of operation; and the ability to measure on a cylindrical sample surface. More traditional methods remain superior for characterizing bulk conductivity at elevated temperature.

  17. A novel one-class SVM based negative data sampling method for reconstructing proteome-wide HTLV-human protein interaction networks.

    PubMed

    Mei, Suyu; Zhu, Hao

    2015-01-26

    Protein-protein interaction (PPI) prediction is generally treated as a problem of binary classification, wherein negative data sampling is still an open problem to be addressed. The commonly used random sampling is prone to yield less representative negative data with considerable false negatives. Meanwhile, rational constraints are seldom exerted on model selection to reduce the risk of false positive predictions for most existing computational methods. In this work, we propose a novel negative data sampling method based on a one-class SVM (support vector machine) to predict proteome-wide protein interactions between the HTLV retrovirus and Homo sapiens, wherein the one-class SVM is used to choose reliable and representative negative data, and a two-class SVM is used to yield proteome-wide outcomes as predictive feedback for rational model selection. Computational results suggest that a one-class SVM is better suited as a negative data sampling method than a two-class PPI predictor, and that the predictive-feedback-constrained model selection helps to yield a rational predictive model that reduces the risk of false positive predictions. Some predictions have been validated by the recent literature. Lastly, gene ontology based clustering of the predicted PPI networks is conducted to provide valuable cues for the pathogenesis of the HTLV retrovirus.
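
    The "reliable negative" idea above can be illustrated without a full one-class SVM: this sketch swaps in a much simpler centroid-distance filter, keeping only candidate pairs that lie far from the known positives in feature space. It is a stand-in for the one-class boundary, not the paper's method, and all feature vectors are toy data:

```python
import numpy as np

# Simplified stand-in for one-class negative sampling: keep candidate
# negative pairs lying farther from the positive-class centroid than
# nearly all positives do. A real pipeline would use a one-class SVM
# boundary instead. Feature vectors are toy data.

def select_reliable_negatives(positives, candidates, quantile=0.95):
    """Keep candidates whose distance to the positive centroid exceeds
    the given quantile of positive-to-centroid distances."""
    positives = np.asarray(positives, dtype=float)
    candidates = np.asarray(candidates, dtype=float)
    centroid = positives.mean(axis=0)
    pos_d = np.linalg.norm(positives - centroid, axis=1)
    threshold = np.quantile(pos_d, quantile)
    cand_d = np.linalg.norm(candidates - centroid, axis=1)
    return candidates[cand_d > threshold]

rng = np.random.default_rng(1)
pos = rng.normal(0.0, 1.0, size=(200, 8))   # features of known interacting pairs
cand = rng.normal(3.0, 1.0, size=(50, 8))   # unlabeled candidate pairs
negatives = select_reliable_negatives(pos, cand)
print(len(negatives), "of", len(cand), "candidates kept as reliable negatives")
```

Either way, the point is the same: negatives are drawn from the region the positive class does not occupy, which reduces the false negatives that plague random sampling.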

  18. Reduction in training time of a deep learning model in detection of lesions in CT

    NASA Astrophysics Data System (ADS)

    Makkinejad, Nazanin; Tajbakhsh, Nima; Zarshenas, Amin; Khokhar, Ashfaq; Suzuki, Kenji

    2018-02-01

    Deep learning (DL) has emerged as a powerful tool for object detection and classification in medical images. Building a well-performing DL model, however, requires a huge number of images for training, and it takes days to train a DL model even on a cutting-edge high-performance computing platform. This study is aimed at developing a method for selecting a "small" number of representative samples from a large collection of training samples to train a DL model that detects polyps in CT colonography (CTC), without compromising the classification performance. Our proposed method for representative sample selection (RSS) consists of a K-means clustering algorithm. For the performance evaluation, we applied the proposed method to select samples for the training of a massive-training artificial neural network based DL model, to be used for the classification of polyps and non-polyps in CTC. Our results show that the proposed method reduces the training time by a factor of 15, while maintaining classification performance equivalent to the model trained using the full training set. We compared the performance using the area under the receiver-operating-characteristic curve (AUC).
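    A minimal reading of the K-means-based selection: cluster the training pool, then keep the one sample nearest each cluster centre as its representative. This is a sketch of the general idea, not the authors' implementation; the farthest-point initialisation and all names are assumptions:

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Plain k-means with deterministic farthest-point initialisation."""
    X = np.asarray(X, dtype=float)
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min(((X[:, None] - np.array(centers)) ** 2).sum(-1), axis=1)
        centers.append(X[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers

def representative_subset(X, k):
    """Keep the one training sample closest to each cluster centre."""
    X = np.asarray(X, dtype=float)
    centers = kmeans(X, k)
    return np.unique([int(np.argmin(((X - c) ** 2).sum(-1))) for c in centers])
```

    Training on the k returned indices instead of the full pool is what yields the reported speed-up, provided the clusters cover the data's variability.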

  19. [Determination of five representative ultraviolet filters in water by gas chromatography-mass spectrometry].

    PubMed

    Ding, Yiran; Huang, Yun; Zhao, Tingting; Cai, Qian; Luo, Yu; Huang, Bin; Zhang, Yuxia; Pan, Xuejun

    2014-06-01

    A method for the determination of five representative organic UV filters: ethylhexyl methoxycinnamate (EHMC), benzophenone-3 (BP-3), 4-methylbenzylidene camphor (4-MBC), octocrylene (OC), and homosalate (HMS) in water was investigated. The method was based on derivatization and solid phase extraction (SPE), followed by determination with gas chromatography-mass spectrometry (GC-MS). The variables involved in the derivatization of BP-3 and HMS were optimized, and SPE conditions were studied. For derivatization, 100 microL N,O-bis(trimethylsilyl) trifluoroacetamide (BSTFA) was used as the derivatization reagent and reacted with BP-3 and HMS at 100 degrees C for 100 min. For SPE, the pH value of the water sample was adjusted to 3-5. Oasis HLB cartridges were employed, a solution of ethyl acetate and dichloromethane (1:1, v/v) was used as the eluting solvent, and good recoveries of the target compounds were obtained. The limits of detection (LODs) and the limits of quantification (LOQs) for the five target compounds in water samples were 0.5-1.2 ng/L and 1.4-4.0 ng/L, respectively. The recoveries of spiked water samples were 87.85%-102.34% with good repeatability and reproducibility (RSD < 5%, n = 3) for all the target compounds. Finally, the validated method was applied to analyze the representative UV filters in water samples collected from a wastewater treatment plant in Kunming city, Yunnan province.

  20. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    PubMed

    L'Engle, Kelly; Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. 
Random digit dialing of mobile phones offers promise for future data collection in Ghana and may be suitable for other developing countries.
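    The AAPOR outcome rates cited above reduce to simple ratios over call-outcome counts. A hedged sketch of AAPOR Response Rate 2, which counts partial interviews in the numerator (the counts used in the usage example are made up, not the survey's):

```python
def aapor_rr2(complete, partial, refusal, noncontact, other, unknown, e=1.0):
    """AAPOR Response Rate 2: completes plus partials over all known
    eligible cases plus an estimated eligible fraction e of the
    unknown-eligibility cases."""
    return (complete + partial) / (
        complete + partial + refusal + noncontact + other + e * unknown)
```

    The survey's 31% response rate would come from plugging its actual call-outcome counts into a formula of this shape, with e estimated for the many unassigned or non-working numbers.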

  1. K-Nearest Neighbor Algorithm Optimization in Text Categorization

    NASA Astrophysics Data System (ADS)

    Chen, Shufeng

    2018-01-01

    K-Nearest Neighbor (KNN) classification is one of the simplest methods of data mining. It has been widely used in classification, regression and pattern recognition. The traditional KNN method has some shortcomings, such as a large amount of sample computation and strong dependence on the sample library capacity. In this paper, a method of representative sample optimization based on the CURE algorithm is proposed. On this basis, a quick algorithm, QKNN (quick k-nearest neighbor), is presented to find the k nearest neighbor samples, which greatly reduces the similarity computation. The experimental results show that this algorithm can effectively reduce the number of samples and speed up the search for the k nearest neighbor samples, improving the performance of the algorithm.
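    The prototype-reduction idea can be sketched as: replace each class by a small set of representatives and run KNN against the reduced set only. Here a single per-class medoid is a crude stand-in for the paper's CURE-based selection; all names are assumptions:

```python
import numpy as np

def class_prototypes(X, y):
    """Keep one representative per class: the medoid (sample closest to
    the class mean). A crude stand-in for the paper's CURE-based
    representative sample optimization."""
    protos, labels = [], []
    for c in np.unique(y):
        Xc = X[y == c]
        m = Xc.mean(axis=0)
        protos.append(Xc[np.argmin(np.linalg.norm(Xc - m, axis=1))])
        labels.append(c)
    return np.array(protos), np.array(labels)

def knn_predict(query, X, y, k=3):
    """Majority vote among the k nearest samples. Run against the reduced
    prototype set, far fewer distance computations are needed per query."""
    nearest = np.argsort(np.linalg.norm(X - query, axis=1))[:k]
    return np.bincount(y[nearest]).argmax()
```

    With n samples reduced to p prototypes, each query costs p distance evaluations instead of n, which is the source of the speed-up the abstract reports.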

  2. Stratification of American hearing aid users by age and audiometric characteristics: a method for representative sampling.

    PubMed

    Aronoff, Justin M; Yoon, Yang-soo; Soli, Sigfrid D

    2010-06-01

    Stratified sampling plans can increase the accuracy and facilitate the interpretation of a dataset characterizing a large population. However, such sampling plans have found minimal use in hearing aid (HA) research, in part because of a paucity of quantitative data on the characteristics of HA users. The goal of this study was to devise a quantitatively derived stratified sampling plan for HA research, so that such studies will be more representative and generalizable, and the results obtained using this method are more easily reinterpreted as the population changes. Pure-tone average (PTA) and age information were collected for 84,200 HAs acquired in 2006 and 2007. The distribution of PTA and age was quantified for each HA type and for a composite of all HA users. Based on their respective distributions, PTA and age were each divided into three groups, the combination of which defined the stratification plan. The most populous PTA and age group was also subdivided, allowing greater homogeneity within strata. Finally, the percentage of users in each stratum was calculated. This article provides a stratified sampling plan for HA research, based on a quantitative analysis of the distribution of PTA and age for HA users. Adopting such a sampling plan will make HA research results more representative and generalizable. In addition, data acquired using such plans can be reinterpreted as the HA population changes.
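    A stratified plan of this kind ultimately turns into a sampling quota per stratum. A minimal proportional-allocation sketch (the stratum labels and percentages below are illustrative, not the paper's actual PTA-by-age strata):

```python
def proportional_allocation(strata_pct, n_total):
    """Allocate a total sample size across strata in proportion to each
    stratum's population percentage, using largest-remainder rounding so
    the allocations sum exactly to n_total."""
    raw = {s: n_total * p / 100.0 for s, p in strata_pct.items()}
    alloc = {s: int(v) for s, v in raw.items()}
    short = n_total - sum(alloc.values())
    # hand out the leftover units to the strata with the largest remainders
    for s in sorted(raw, key=lambda s: raw[s] - alloc[s], reverse=True)[:short]:
        alloc[s] += 1
    return alloc
```

    Recruiting to quotas of this form is what makes a study sample mirror the published distribution of hearing aid users.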

  3. Semi-Supervised Projective Non-Negative Matrix Factorization for Cancer Classification.

    PubMed

    Zhang, Xiang; Guan, Naiyang; Jia, Zhilong; Qiu, Xiaogang; Luo, Zhigang

    2015-01-01

    Advances in DNA microarray technologies have made gene expression profiles a significant candidate in identifying different types of cancers. Traditional learning-based cancer identification methods utilize labeled samples to train a classifier, but they are inconvenient for practical application because labels are quite expensive in the clinical cancer research community. This paper proposes a semi-supervised projective non-negative matrix factorization method (Semi-PNMF) to learn an effective classifier from both labeled and unlabeled samples, thus boosting subsequent cancer classification performance. In particular, Semi-PNMF jointly learns a non-negative subspace from concatenated labeled and unlabeled samples and indicates classes by the positions of the maximum entries of their coefficients. Because Semi-PNMF incorporates statistical information from the large volume of unlabeled samples in the learned subspace, it can learn more representative subspaces and boost classification performance. We developed a multiplicative update rule (MUR) to optimize Semi-PNMF and proved its convergence. The experimental results of cancer classification for two multiclass cancer gene expression profile datasets show that Semi-PNMF outperforms the representative methods.
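    The flavour of a multiplicative update rule (MUR) can be shown with the classic Lee-Seung updates for plain NMF; Semi-PNMF's actual MUR adds label-related terms that are not reproduced in this sketch:

```python
import numpy as np

def nmf_mur(V, r, iters=200, seed=0):
    """Classic Lee-Seung multiplicative update rules for V ~= W @ H under
    the Frobenius norm. Updates preserve non-negativity because every
    factor in them is non-negative."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, r)) + 0.1          # positive init keeps updates valid
    H = rng.random((r, m)) + 0.1
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-12)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-12)
    return W, H
```

    In the Semi-PNMF setting, class assignment then reads off the position of the maximum entry in each sample's coefficient column.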

  4. Defining And Characterizing Sample Representativeness For DWPF Melter Feed Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shine, E. P.; Poirier, M. R.

    2013-10-29

    Representative sampling is important throughout the Defense Waste Processing Facility (DWPF) process, and the demonstrated success of the DWPF process to achieve glass product quality over the past two decades is a direct result of the quality of information obtained from the process. The objective of this report was to present sampling methods that the Savannah River Site (SRS) used to qualify waste being dispositioned at the DWPF. The goal was to emphasize the methodology, not a list of outcomes from those studies. This methodology includes proven methods for taking representative samples, the use of controlled analytical methods, and data interpretation and reporting that considers the uncertainty of all error sources. Numerous sampling studies were conducted during the development of the DWPF process and still continue to be performed in order to evaluate options for process improvement. Study designs were based on use of statistical tools applicable to the determination of uncertainties associated with the data needs. Successful designs are apt to be repeated, so this report chose only to include prototypic case studies that typify the characteristics of frequently used designs. Case studies have been presented for studying in-tank homogeneity, evaluating the suitability of sampler systems, determining factors that affect mixing and sampling, comparing the final waste glass product chemical composition and durability to that of the glass pour stream sample and other samples from process vessels, and assessing the uniformity of the chemical composition in the waste glass product. Many of these studies efficiently addressed more than one of these areas of concern associated with demonstrating sample representativeness and provide examples of statistical tools in use for DWPF. The time when many of these designs were implemented was in an age when the sampling ideas of Pierre Gy were not as widespread as they are today.
Nonetheless, the engineers and statisticians used carefully thought out designs that systematically and economically provided plans for data collection from the DWPF process. Key shared features of the sampling designs used at DWPF and the Gy sampling methodology were the specification of a standard for sample representativeness, an investigation that produced data from the process to study the sampling function, and a decision framework used to assess whether the specification was met based on the data. Without going into detail with regard to the seven errors identified by Pierre Gy, as excellent summaries are readily available such as Pitard [1989] and Smith [2001], SRS engineers understood, for example, that samplers can be biased (Gy's extraction error), and developed plans to mitigate those biases. Experiments that compared installed samplers with more representative samples obtained directly from the tank may not have resulted in systematically partitioning sampling errors into the now well-known error categories of Gy, but did provide overall information on the suitability of sampling systems. Most of the designs in this report are related to the DWPF vessels, not the large SRS Tank Farm tanks. Samples from the DWPF Slurry Mix Evaporator (SME), which contains the feed to the DWPF melter, are characterized using standardized analytical methods with known uncertainty. The analytical error is combined with the established error from sampling and processing in DWPF to determine the melter feed composition. This composition is used with the known uncertainty of the models in the Product Composition Control System (PCCS) to ensure that the wasteform that is produced is comfortably within the acceptable processing and product performance region. 
    Having the advantage of many years of processing that meets the waste glass product acceptance criteria, the DWPF process has provided a considerable amount of data about itself in addition to the data from many special studies. Demonstrating representative sampling directly from the large Tank Farm tanks is a difficult, if not unsolvable enterprise due to limited accessibility. However, the consistency and the adequacy of sampling and mixing at SRS could at least be studied under the controlled process conditions based on samples discussed by Ray and others [2012a] in Waste Form Qualification Report (WQR) Volume 2 and the transfers from Tanks 40H and 51H to the Sludge Receipt and Adjustment Tank (SRAT) within DWPF. It is important to realize that the need for sample representativeness becomes more stringent as the material gets closer to the melter, and the tanks within DWPF have been studied extensively to meet those needs.

  5. Improving the Accuracy of Extracting Surface Water Quality Levels (SWQLs) Using Remote Sensing and Artificial Neural Network: a Case Study in the Saint John River, Canada

    NASA Astrophysics Data System (ADS)

    Sammartano, G.; Spanò, A.

    2017-09-01

    Delineating accurate surface water quality levels (SWQLs) always presents a great challenge to researchers. Existing methods of assessing surface water quality only provide individual concentrations at monitoring stations without providing overall SWQLs, so their results are usually difficult for decision-makers to understand. Conversely, the water quality index (WQI) can simplify the surface water quality assessment process and make it accessible to decision-makers. However, in most cases the WQI reflects inaccurate SWQLs due to the lack of representative water samples, which are costly and time-consuming to provide. To solve this problem, we introduce a cost-effective method which combines Landsat-8 imagery and artificial intelligence to develop models that derive representative water samples by correlating concentrations of ground truth water samples with satellite spectral information. Our method was validated: the correlation between concentrations of ground truth water samples and the concentrations predicted by the developed models reached a high coefficient of determination (R2 > 0.80). Afterwards, the predicted concentrations over each pixel of the study area were used as input to the WQI developed by the Canadian Council of Ministers of the Environment to extract accurate SWQLs, for drinking purposes, in the Saint John River. The results indicated that SWQL was observed as 67 (Fair) and 59 (Marginal) for the lower and middle basins of the river, respectively. These findings demonstrate the potential of using our approach in surface water quality management.
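    The CCME WQI combines the scope (F1), frequency (F2), and amplitude (F3) of objective exceedances into a single 0-100 score, with bands such as 65-79 "Fair" and 45-64 "Marginal" matching the levels quoted above. A sketch of the published formula for "not to exceed" objectives, reconstructed from memory of the CCME manual (verify against the official document before use):

```python
import math

def ccme_wqi(tests, objectives):
    """CCME Water Quality Index sketch for 'not to exceed' objectives.
    `tests` maps variable -> list of measured values; `objectives` maps
    variable -> maximum allowed value."""
    failed = [(v, x) for v in tests for x in tests[v] if x > objectives[v]]
    n_tests = sum(len(x) for x in tests.values())
    f1 = 100.0 * len({v for v, _ in failed}) / len(tests)      # scope
    f2 = 100.0 * len(failed) / n_tests                         # frequency
    nse = sum(x / objectives[v] - 1.0 for v, x in failed) / n_tests
    f3 = nse / (0.01 * nse + 0.01)                             # amplitude
    return 100.0 - math.sqrt(f1 ** 2 + f2 ** 2 + f3 ** 2) / 1.732
```

    In the study, the model-predicted per-pixel concentrations play the role of `tests`, which is how a full-basin index becomes possible without exhaustive ground sampling.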

  6. Are we using the appropriate reference samples to develop juvenile age estimation methods based on bone size? An exploration of growth differences between average children and those who become victims of homicide.

    PubMed

    Spake, Laure; Cardoso, Hugo F V

    2018-01-01

    The population on which forensic juvenile skeletal age estimation methods are applied has not been critically considered. Previous research suggests that child victims of homicide tend to be from socioeconomically disadvantaged contexts, and that these contexts impair linear growth. This study investigates whether juvenile skeletal remains examined by forensic anthropologists are short for age compared to their normal healthy peers. Cadaver lengths were obtained from records of autopsies of 1256 individuals, aged birth to eighteen years at death, conducted between 2000 and 2015 in Australia, New Zealand, and the U.S. Growth status of the forensic population, represented by homicide victims, and general population, represented by accident victims, were compared using height for age Z-scores and independent sample t-tests. Cadaver lengths of the accident victims were compared to growth references using one sample t-tests to evaluate whether accident victims reflect the general population. Homicide victims are shorter for age than accident victims in samples from the U.S., but not in Australia and New Zealand. Accident victims are more representative of the general population in Australia and New Zealand. Different results in Australia and New Zealand as opposed to the U.S. may be linked to socioeconomic inequality. These results suggest that physical anthropologists should critically select reference samples when devising forensic juvenile skeletal age estimation methods. Children examined in forensic investigations may be short for age, and thus methods developed on normal healthy children may yield inaccurate results. A healthy reference population may not necessarily constitute an appropriate growth comparison for the forensic anthropology population. Copyright © 2017 Elsevier B.V. All rights reserved.
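    The growth comparison above reduces to simple statistics: a height-for-age Z-score per child, then a two-group comparison. A sketch with hypothetical reference values (real growth references are age- and sex-specific tables, not single constants):

```python
import statistics

def haz(height_cm, ref_median, ref_sd):
    """Height-for-age Z-score: how many reference SDs a child's stature
    sits above or below the reference median for that age and sex."""
    return (height_cm - ref_median) / ref_sd

def welch_t(a, b):
    """Welch's t statistic for comparing two groups of Z-scores
    (unequal variances, as in an independent-samples t-test)."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    return (ma - mb) / (va / len(a) + vb / len(b)) ** 0.5
```

    A negative mean Z-score in the homicide group relative to the accident group is what "shorter for age" means operationally in the study.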

  7. A Rapid, Presumptive Procedure for the Detection of Salmonella in Foods and Food Ingredients

    PubMed Central

    Hoben, D. A.; Ashton, D. H.; Peterson, A. C.

    1973-01-01

    A rapid detection procedure was developed in which a lysine-iron-cystine-neutral red (LICNR) broth medium, originally described by Hargrove et al. in 1971, was modified and used to detect the presence of viable Salmonella organisms in a variety of foods, food ingredients, and feed materials by using a two-step enrichment technique. Tetrathionate broth was used to enrich samples with incubation at 41 C for 20 hr, followed by transfer to LICNR broth and incubation at 37 C for 24 hr for further enrichment and for the detection of Salmonella organisms by color change. One hundred ten samples representing 18 different sample types were evaluated for the presence of viable Salmonella. Ninety-four percent of the samples found to be presumptive positive by this method were confirmed as positive by a culture method. Fluorescent-antibody results also compared closely. A second study was conducted under quality-control laboratory conditions by using procedures currently employed for Salmonella detection. One hundred forty-three samples representing 19 different sample types were evaluated for the presence of viable Salmonella. No false negatives were observed with the rapid-detection method. The usefulness of the LICNR broth procedure as a screening technique to eliminate negative samples rapidly and to identify presumptive positive samples for the presence of viable Salmonella organisms was established in this laboratory. PMID:4568884

  8. 40 CFR 60.547 - Test methods and procedures.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... materials. In the event of dispute, Method 24 shall be the reference method. For Method 24, the cement or... sample will be representative of the material as applied in the affected facility. (2) Method 25 as the... by the Administrator. (3) Method 2, 2A, 2C, or 2D, as appropriate, as the reference method for...

  9. 40 CFR 60.547 - Test methods and procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... materials. In the event of dispute, Method 24 shall be the reference method. For Method 24, the cement or... sample will be representative of the material as applied in the affected facility. (2) Method 25 as the... by the Administrator. (3) Method 2, 2A, 2C, or 2D, as appropriate, as the reference method for...

  10. 40 CFR 60.547 - Test methods and procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... materials. In the event of dispute, Method 24 shall be the reference method. For Method 24, the cement or... sample will be representative of the material as applied in the affected facility. (2) Method 25 as the... by the Administrator. (3) Method 2, 2A, 2C, or 2D, as appropriate, as the reference method for...

  11. Norm Block Sample Sizes: A Review of 17 Individually Administered Intelligence Tests

    ERIC Educational Resources Information Center

    Norfolk, Philip A.; Farmer, Ryan L.; Floyd, Randy G.; Woods, Isaac L.; Hawkins, Haley K.; Irby, Sarah M.

    2015-01-01

    The representativeness, recency, and size of norm samples strongly influence the accuracy of inferences drawn from their scores. Inadequate norm samples may lead to inflated or deflated scores for individuals and poorer prediction of developmental and academic outcomes. The purpose of this study was to apply Kranzler and Floyd's method for…

  12. Combining linear polarization spectroscopy and the Representative Layer Theory to measure the Beer-Lambert law absorbance of highly scattering materials.

    PubMed

    Gobrecht, Alexia; Bendoula, Ryad; Roger, Jean-Michel; Bellon-Maurel, Véronique

    2015-01-01

    Visible and Near Infrared (Vis-NIR) Spectroscopy is a powerful nondestructive analytical method used to analyze major compounds in bulk materials and products, requiring no sample preparation. It is widely used in routine analysis and also in-line in industries, in-vivo in biomedical applications, or in-field for agricultural and environmental applications. However, highly scattering samples subvert the Beer-Lambert law's linear relationship between spectral absorbance and concentration. Instead of the spectral pre-processing commonly used by Vis-NIR spectroscopists to mitigate the scattering effect, we put forward an optical method, based on polarized light spectroscopy, to improve the absorbance signal measurement on highly scattering samples. This method selects the part of the signal that is less impacted by scattering. The resulting signal is combined in the absorption/remission function defined in Dahm's Representative Layer Theory to compute an absorbance signal fulfilling the Beer-Lambert law, i.e. being linearly related to the concentration of the chemicals composing the sample. The underpinning theories have been experimentally evaluated on scattering samples in liquid and powdered form. The method produced more accurate spectra, and the Pearson's coefficient assessing the linearity between the absorbance spectra and the concentration of the added dye improved from 0.94 to 0.99 for liquid samples and from 0.84 to 0.97 for powdered samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. A rapid method for soil cement design : Louisiana slope value method : part II : evaluation.

    DOT National Transportation Integrated Search

    1966-05-01

    This report is an evaluation of the recently developed "Louisiana Slope Value Method". The conclusions drawn are based on data from 637 separate samples representing nearly all major soil groups in Louisiana that are suitable for cement stabilizatio...

  14. Conceptual data sampling for breast cancer histology image classification.

    PubMed

    Rezk, Eman; Awan, Zainab; Islam, Fahad; Jaoua, Ali; Al Maadeed, Somaya; Zhang, Nan; Das, Gautam; Rajpoot, Nasir

    2017-10-01

    Data analytics have become increasingly complicated as the amount of data has increased. One technique used to enable data analytics on large datasets is data sampling, in which a portion of the data is selected so as to preserve the data's characteristics for use in analytics. In this paper, we introduce a novel data sampling technique rooted in formal concept analysis theory. This technique creates samples reliant on the data distribution across a set of binary patterns. The proposed sampling technique is applied to classifying regions of breast cancer histology images as malignant or benign. The performance of our method is compared to other classical sampling methods. The results indicate that our method is efficient and generates an illustrative sample of small size. It is also competitive with other sampling methods in terms of sample size and sample quality, as measured by classification accuracy and F1 score. Copyright © 2017 Elsevier Ltd. All rights reserved.
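    A distribution-preserving sample over binary patterns can be sketched very simply: group rows by their exact pattern and draw the same fraction from every group. This is a deliberate simplification of the paper's formal-concept-analysis machinery; all names are assumptions:

```python
import random

def pattern_stratified_sample(rows, frac, seed=0):
    """Group rows by their exact binary feature pattern and draw the same
    fraction from every group, so the sample preserves the distribution of
    patterns present in the data (at least one row kept per pattern)."""
    rng = random.Random(seed)
    groups = {}
    for r in rows:
        groups.setdefault(tuple(r), []).append(r)
    sample = []
    for g in groups.values():
        sample.extend(rng.sample(g, max(1, round(frac * len(g)))))
    return sample
```

    Because every pattern contributes proportionally, a classifier trained on the sample sees the same pattern mix as one trained on the full dataset.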

  15. Ultrasonic sensor and method of use

    DOEpatents

    Condreva, Kenneth J.

    2001-01-01

    An ultrasonic sensor system and method of use for measuring transit time through a liquid sample, using one ultrasonic transducer coupled to a precision time interval counter. The timing circuit captures changes in transit time, representing small changes in the velocity of sound transmitted, over necessarily small time intervals (nanoseconds) and uses the transit time changes to identify the presence of non-conforming constituents in the sample.

  16. Real-time combustion monitoring of PCDD/F indicators by REMPI-TOFMS

    EPA Science Inventory

    Analyses for polychlorinated dibenzodioxin and dibenzofuran (PCDD/F) emissions typically require a 4 h extractive sample taken on an annual or less frequent basis. This results in a potentially minimally representative monitoring scheme. More recently, methods for continual sampl...

  17. Differential auger spectrometry

    DOEpatents

    Strongin, Myron; Varma, Matesh Narayan; Anne, Joshi

    1976-06-22

    Differential Auger spectroscopy method for increasing the sensitivity of micro-Auger spectroanalysis of the surfaces of dilute alloys, by alternately periodically switching an electron beam back and forth between an impurity free reference sample and a test sample containing a trace impurity. The Auger electrons from the samples produce representative Auger spectrum signals which cancel to produce an Auger test sample signal corresponding to the amount of the impurity in the test samples.

  18. Reconstruction of three-dimensional porous media using generative adversarial neural networks

    NASA Astrophysics Data System (ADS)

    Mosser, Lukas; Dubrule, Olivier; Blunt, Martin J.

    2017-10-01

    To evaluate the variability of multiphase flow properties of porous media at the pore scale, it is necessary to acquire a number of representative samples of the void-solid structure. While modern x-ray computer tomography has made it possible to extract three-dimensional images of the pore space, assessment of the variability in the inherent material properties is often experimentally not feasible. We present a method to reconstruct the solid-void structure of porous media by applying a generative neural network that allows an implicit description of the probability distribution represented by three-dimensional image data sets. We show, by using an adversarial learning approach for neural networks, that this method of unsupervised learning is able to generate representative samples of porous media that honor their statistics. We successfully compare measures of pore morphology, such as the Euler characteristic, two-point statistics, and directional single-phase permeability of synthetic realizations with the calculated properties of a bead pack, Berea sandstone, and Ketton limestone. Results show that generative adversarial networks can be used to reconstruct high-resolution three-dimensional images of porous media at different scales that are representative of the morphology of the images used to train the neural network. The fully convolutional nature of the trained neural network allows the generation of large samples while maintaining computational efficiency. Compared to classical stochastic methods of image reconstruction, the implicit representation of the learned data distribution can be stored and reused to generate multiple realizations of the pore structure very rapidly.
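    One of the morphology statistics compared, the two-point probability function S2, can be computed for a binary pore image with an FFT autocorrelation. The sketch below assumes periodic boundaries (consistent with the fully convolutional, periodic treatment common in this setting, though not necessarily the authors' exact code):

```python
import numpy as np

def two_point_probability(img):
    """Two-point probability function S2 for a periodic binary image:
    the probability that two points separated by a given shift both lie
    in the pore phase (value 1), computed via FFT autocorrelation."""
    f = np.fft.fftn(np.asarray(img, dtype=float))
    return np.fft.ifftn(f * np.conj(f)).real / f.size
```

    At zero shift, S2 equals the porosity; comparing S2 of generated realizations against that of the training image is one of the checks the paper performs.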

  19. Monitoring benthic algal communities: A comparison of targeted and coefficient sampling methods

    USGS Publications Warehouse

    Edwards, Matthew S.; Tinker, M. Tim

    2009-01-01

    Choosing an appropriate sample unit is a fundamental decision in the design of ecological studies. While numerous methods have been developed to estimate organism abundance, they differ in cost, accuracy and precision. Using both field data and computer simulation modeling, we evaluated the costs and benefits associated with two methods commonly used to sample benthic organisms in temperate kelp forests. One of these methods, the Targeted Sampling method, relies on different sample units, each "targeted" for a specific species or group of species, while the other method relies on coefficients that represent ranges of bottom cover obtained from visual estimates within standardized sample units. Both the field data and the computer simulations suggest that the two methods yield remarkably similar estimates of organism abundance and among-site variability, although the Coefficient method slightly underestimates variability among sample units when abundances are low. In contrast, the two methods differ considerably in the effort needed to sample these communities; Targeted Sampling requires more time and twice the personnel to complete. We conclude that the Coefficient Sampling method may be better for environmental monitoring programs where changes in mean abundance are of central concern and resources are limiting, but that the Targeted Sampling method may be better for ecological studies where quantitative relationships among species and small-scale variability in abundance are of central concern.
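    The Coefficient method's information loss can be illustrated directly: each observation is recorded only as a cover class and later scored at the class midpoint. The bin edges below are illustrative, not the study's actual coefficients:

```python
def coefficient_estimate(covers, bins=((0, 5), (5, 25), (25, 50), (50, 75), (75, 100))):
    """Estimate mean percent cover the coefficient way: assign each true
    cover value to a cover class, score it at the class midpoint, and
    average the midpoints."""
    mids = [(lo + hi) / 2 for lo, hi in bins]

    def classify(c):
        for (lo, hi), m in zip(bins, mids):
            if lo <= c <= hi:
                return m
        return mids[-1]

    scored = [classify(c) for c in covers]
    return sum(scored) / len(scored)
```

    Averaging midpoints recovers the mean well but compresses within-class variation, which is consistent with the slight underestimate of among-unit variability at low abundances reported above.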

  20. Does Self-Selection Affect Samples’ Representativeness in Online Surveys? An Investigation in Online Video Game Research

    PubMed Central

    van Singer, Mathias; Chatton, Anne; Achab, Sophia; Zullino, Daniele; Rothen, Stephane; Khan, Riaz; Billieux, Joel; Thorens, Gabriel

    2014-01-01

    Background The number of medical studies performed through online surveys has increased dramatically in recent years. Despite their numerous advantages (eg, sample size, facilitated access to individuals presenting stigmatizing issues), selection bias may exist in online surveys. However, evidence on the representativeness of self-selected samples in online studies is patchy. Objective Our objective was to explore the representativeness of a self-selected sample of online gamers using online players’ virtual characters (avatars). Methods All avatars belonged to individuals playing World of Warcraft (WoW), currently the most widely used online game. Avatars’ characteristics were defined using various games’ scores, reported on the WoW’s official website, and two self-selected samples from previous studies were compared with a randomly selected sample of avatars. Results We used scores linked to 1240 avatars (762 from the self-selected samples and 478 from the random sample). The two self-selected samples of avatars had higher scores on most of the assessed variables (except for guild membership and exploration). Furthermore, some guilds were overrepresented in the self-selected samples. Conclusions Our results suggest that more proficient players or players more involved in the game may be more likely to participate in online surveys. Caution is needed in the interpretation of studies based on online surveys that used a self-selection recruitment procedure. Epidemiological evidence on the reduced representativeness of sample of online surveys is warranted. PMID:25001007

  1. Sampling pig farms at the abattoir in a cross-sectional study - Evaluation of a sampling method.

    PubMed

    Birkegård, Anna Camilla; Halasa, Tariq; Toft, Nils

    2017-09-15

    A cross-sectional study design is relatively inexpensive, fast and easy to conduct when compared to other study designs. Careful planning is essential to obtaining a representative sample of the population, and the recommended approach is to use simple random sampling from an exhaustive list of units in the target population. This approach is rarely feasible in practice, and other sampling procedures must often be adopted. For example, when slaughter pigs are the target population, sampling the pigs on the slaughter line may be an alternative to on-site sampling at a list of farms. However, it is difficult to sample a large number of farms from an exact predefined list, due to the logistics and workflow of an abattoir. Therefore, it is necessary to have a systematic sampling procedure and to evaluate the obtained sample with respect to the study objective. We propose a method for 1) planning, 2) conducting, and 3) evaluating the representativeness and reproducibility of a cross-sectional study when simple random sampling is not possible. We used an example of a cross-sectional study with the aim of quantifying the association between antimicrobial resistance and antimicrobial consumption in Danish slaughter pigs. It was not possible to visit farms within the designated timeframe. Therefore, it was decided to use convenience sampling at the abattoir. Our approach was carried out in three steps: 1) planning: using data from meat inspection to plan at which abattoirs and how many farms to sample; 2) conducting: sampling was carried out at five abattoirs; 3) evaluation: representativeness was evaluated by comparing sampled and non-sampled farms, and the reproducibility of the study was assessed through simulated sampling based on meat inspection data from the period when the actual data collection was carried out. In the cross-sectional study, samples were taken from 681 Danish pig farms during five weeks from February to March 2015. The evaluation showed that the sampling procedure was reproducible, with results comparable to the collected sample. However, the sampling procedure favoured sampling of large farms. Furthermore, both under-sampled and over-sampled areas were found using scan statistics. In conclusion, sampling conducted at abattoirs can provide a spatially representative sample. Hence, it is a possible cost-effective alternative to simple random sampling. However, it is important to assess the properties of the resulting sample so that any potential selection bias can be addressed when reporting the findings. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. A Comparison of Two Methods for Recruiting Children with an Intellectual Disability

    ERIC Educational Resources Information Center

    Adams, Dawn; Handley, Louise; Heald, Mary; Simkiss, Doug; Jones, Alison; Walls, Emily; Oliver, Chris

    2017-01-01

    Background: Recruitment is a widely cited barrier to representative intellectual disability research, yet it is rarely studied. This study aims to document the rates of recruiting children with intellectual disabilities using two methods and discuss the impact of such methods on sample characteristics. Methods: Questionnaire completion rates are…

  3. 30 CFR 870.18 - General rules for calculating excess moisture.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Collection of Coal Samples from Core; and, D1412-93, Standard Test Method for Equilibrium Moisture of Coal at... shipment or use. (5) Core sample means a cylindrical sample of coal that represents the thickness of a coal seam penetrated by drilling according to ASTM standard D5192-91. (6) Correction factor means the...

  4. 30 CFR 870.18 - General rules for calculating excess moisture.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Collection of Coal Samples from Core; and, D1412-93, Standard Test Method for Equilibrium Moisture of Coal at... shipment or use. (5) Core sample means a cylindrical sample of coal that represents the thickness of a coal seam penetrated by drilling according to ASTM standard D5192-91. (6) Correction factor means the...

  5. 30 CFR 870.18 - General rules for calculating excess moisture.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Collection of Coal Samples from Core; and, D1412-93, Standard Test Method for Equilibrium Moisture of Coal at... shipment or use. (5) Core sample means a cylindrical sample of coal that represents the thickness of a coal seam penetrated by drilling according to ASTM standard D5192-91. (6) Correction factor means the...

  6. 30 CFR 870.18 - General rules for calculating excess moisture.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Collection of Coal Samples from Core; and, D1412-93, Standard Test Method for Equilibrium Moisture of Coal at... shipment or use. (5) Core sample means a cylindrical sample of coal that represents the thickness of a coal seam penetrated by drilling according to ASTM standard D5192-91. (6) Correction factor means the...

  7. Exploring the hierarchical structure of the MMPI-2-RF Personality Psychopathology Five in psychiatric patient and university student samples.

    PubMed

    Bagby, R Michael; Sellbom, Martin; Ayearst, Lindsay E; Chmielewski, Michael S; Anderson, Jaime L; Quilty, Lena C

    2014-01-01

    In this study our goal was to examine the hierarchical structure of personality pathology as conceptualized by Harkness and McNulty's (1994) Personality Psychopathology Five (PSY-5) model, as recently operationalized by the MMPI-2-RF (Ben-Porath & Tellegen, 2011) PSY-5r scales. We used Goldberg's (2006) "bass-ackwards" method to obtain factor structure using PSY-5r item data, successively extracting from 1 to 5 factors in a sample of psychiatric patients (n = 1,000) and a sample of university undergraduate students (n = 1,331). Participants from these samples had completed either the MMPI-2 or the MMPI-2-RF. The results were mostly consistent across the 2 samples, with some differences at the 3-factor level. In the patient sample a factor structure representing 3 broad psychopathology domains (internalizing, externalizing, and psychoticism) emerged; in the student sample the 3-factor level represented what is more commonly observed in "normal-range" personality models (negative emotionality, introversion, and disconstraint). At the 5-factor level the basic structure was similar across the 2 samples and represented well the PSY-5r domains.

  8. Archaeal communities of Arctic methane-containing permafrost.

    PubMed

    Shcherbakova, Victoria; Yoshimura, Yoshitaka; Ryzhmanova, Yana; Taguchi, Yukihiro; Segawa, Takahiro; Oshurkova, Victoria; Rivkina, Elizaveta

    2016-10-01

    In the present study, we used culture-independent methods to investigate the diversity of methanogenic archaea and their distribution in five permafrost samples collected from a borehole in the Kolyma River Lowland (north-east of Russia). Total DNA was extracted from methane-containing permafrost samples of different age and amplified by PCR. The resulting DNA fragments were cloned. Phylogenetic analysis of the sequences showed the presence of archaea in all studied samples; 60%-95% of sequences belonged to the Euryarchaeota. Methanogenic archaea were novel representatives of the Methanosarcinales, Methanomicrobiales, Methanobacteriales and Methanocellales orders. Bathyarchaeota (Miscellaneous Crenarchaeota Group) representatives were found among nonmethanogenic archaea in all the samples studied. Thaumarchaeota representatives were not found in the upper sample, whereas Woesearchaeota (formerly DHVEG-6) were found in the three deepest samples. Unexpectedly, the greatest diversity of archaea was observed at a depth of 22.3 m, probably due to the availability of labile organic carbon and/or the migration of microbial cells with the advancing freezing front towards the bottom. © FEMS 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  9. Method for analyzing microbial communities

    DOEpatents

    Zhou, Jizhong [Oak Ridge, TN; Wu, Liyou [Oak Ridge, TN

    2010-07-20

    The present invention provides a method for quantitatively analyzing microbial genes, species, or strains in a sample that contains at least two species or strains of microorganisms. The method involves using an isothermal DNA polymerase to randomly and representatively amplify genomic DNA of the microorganisms in the sample, hybridizing the resultant polynucleotide amplification product to a polynucleotide microarray that can differentiate different genes, species, or strains of microorganisms of interest, and measuring hybridization signals on the microarray to quantify the genes, species, or strains of interest.

  10. Structural system reliability calculation using a probabilistic fault tree analysis method

    NASA Technical Reports Server (NTRS)

    Torng, T. Y.; Wu, Y.-T.; Millwater, H. R.

    1992-01-01

    The development of a new probabilistic fault tree analysis (PFTA) method for calculating structural system reliability is summarized. The proposed PFTA procedure includes: developing a fault tree to represent the complex structural system, constructing an approximation function for each bottom event, determining a dominant sampling sequence for all bottom events, and calculating the system reliability using an adaptive importance sampling method. PFTA is suitable for complicated structural problems that require computation-intensive calculations. A computer program has been developed to implement the PFTA.
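    The importance sampling step at the heart of this procedure can be illustrated in simplified, non-adaptive form. The sketch below is not from the article: the limit-state function (failure when a standard normal variable exceeds 4) and the proposal shift are hypothetical, chosen only to show the core idea of sampling from a proposal centred near the failure region and reweighting by the likelihood ratio.

    ```python
    import math
    import random

    def failure_prob_importance_sampling(g, shift, n=100_000, seed=1):
        """Estimate P(g(X) >= 0) for standard normal X by drawing from a
        unit-variance normal proposal centred at `shift` and reweighting
        each failing sample by the density ratio phi(x) / phi(x - shift)."""
        rng = random.Random(seed)
        total = 0.0
        for _ in range(n):
            x = rng.gauss(shift, 1.0)  # draw from the shifted proposal
            if g(x) >= 0:
                # likelihood ratio of target to proposal density
                total += math.exp(-x * x / 2 + (x - shift) ** 2 / 2)
        return total / n

    # Hypothetical limit state: failure when X exceeds 4; the true
    # probability P(X > 4) is about 3.17e-5, far too small to estimate
    # reliably with plain Monte Carlo at this sample size.
    p = failure_prob_importance_sampling(lambda x: x - 4.0, shift=4.0)
    ```

    Plain Monte Carlo with the same budget would see only a handful of failures (if any); shifting the proposal makes roughly half the draws fall in the failure region, and the weights restore an unbiased estimate.
    
    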

  11. Predicting Academic Library Circulations: A Forecasting Methods Competition.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.; Forys, John W., Jr.

    Based on sample data representing five years of monthly circulation totals from 50 academic libraries in Illinois, Iowa, Michigan, Minnesota, Missouri, and Ohio, a study was conducted to determine the most efficient smoothing forecasting methods for academic libraries. Smoothing forecasting methods were chosen because they have been characterized…

  12. Sampling Methods in Cardiovascular Nursing Research: An Overview.

    PubMed

    Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie

    2014-01-01

    Cardiovascular nursing research covers a wide array of topics from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling: simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion in the research study, and include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from cardiovascular research. Understanding the differences in sampling techniques may aid nurses in effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
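    The three most common probability designs named in this overview can be sketched in a few lines. This is an illustrative sketch only: the record data and the stratum key are made up, not drawn from the article.

    ```python
    import random

    def simple_random_sample(population, n, seed=0):
        """Each element has an equal, independent chance of selection."""
        rng = random.Random(seed)
        return rng.sample(population, n)

    def systematic_sample(population, n):
        """Take every k-th element from a fixed start, with k = N // n."""
        k = len(population) // n
        return population[::k][:n]

    def stratified_sample(population, stratum_key, n_per_stratum, seed=0):
        """Draw a simple random sample of fixed size from each stratum."""
        rng = random.Random(seed)
        strata = {}
        for item in population:
            strata.setdefault(stratum_key(item), []).append(item)
        sample = []
        for members in strata.values():
            sample.extend(rng.sample(members, min(n_per_stratum, len(members))))
        return sample

    # Hypothetical patient records stratified by (made-up) ward assignment.
    patients = [{"id": i, "ward": "cardiology" if i % 2 else "surgery"}
                for i in range(100)]
    by_ward = stratified_sample(patients, lambda p: p["ward"], n_per_stratum=5)
    ```

    Stratification guarantees representation of every subgroup, which simple random sampling only achieves in expectation.
    
    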

  13. Jack Healy Remembers - Anecdotal Evidence for the Origin of the Approximate 24-hour Urine Sampling Protocol Used for Worker Bioassay Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carbaugh, Eugene H.

    2008-10-01

    The origin of the approximate 24-hour urine sampling protocol used at Hanford for routine bioassay is attributed to an informal study done in the mid-1940s. While the actual data were never published and have been lost, anecdotal recollections by staff involved in the initial bioassay program design and administration suggest that the sampling protocol had a solid scientific basis. Numerous alternate methods for normalizing partial day samples to represent a total 24-hour collection have since been proposed and used, but no one method is obviously preferred.

  14. An Improved Computational Technique for Calculating Electromagnetic Forces and Power Absorptions Generated in Spherical and Deformed Body in Levitation Melting Devices

    NASA Technical Reports Server (NTRS)

    Zong, Jin-Ho; Szekely, Julian; Schwartz, Elliot

    1992-01-01

    An improved computational technique for calculating the electromagnetic force field, the power absorption and the deformation of an electromagnetically levitated metal sample is described. The technique is based on the volume integral method, but represents a substantial refinement; the coordinate transformation employed allows the efficient treatment of a broad class of rotationally symmetrical bodies. Computed results are presented to represent the behavior of levitation-melted metal samples in a multi-coil, multi-frequency levitation unit to be used in microgravity experiments. The theoretical predictions are compared with both analytical solutions and with the results of previous computational efforts for spherical samples, and the agreement has been very good. The treatment of problems involving deformed surfaces and actually predicting the deformed shape of the specimens breaks new ground and should be the principal application of the proposed method.

  15. Hybrid Optimal Design of the Eco-Hydrological Wireless Sensor Network in the Middle Reach of the Heihe River Basin, China

    PubMed Central

    Kang, Jian; Li, Xin; Jin, Rui; Ge, Yong; Wang, Jinfeng; Wang, Jianghao

    2014-01-01

    The eco-hydrological wireless sensor network (EHWSN) in the middle reaches of the Heihe River Basin in China is designed to capture the spatial and temporal variability and to estimate the ground truth for validating the remote sensing products. However, no prior information about the target variable is available. To meet both requirements, a hybrid model-based sampling method without any spatial autocorrelation assumptions is developed to optimize the distribution of EHWSN nodes based on geostatistics. This hybrid model incorporates two sub-criteria: one for the variogram modeling to represent the variability, another for improving the spatial prediction to evaluate remote sensing products. The reasonableness of the optimized EHWSN is validated in terms of representativeness, variogram modeling and spatial accuracy, using 15 types of simulation fields generated with unconditional geostatistical stochastic simulation. The sampling design shows good representativeness; variograms estimated from the samples have less than 3% mean error relative to the true variograms. Then, fields at multiple scales are predicted. As the scale increases, estimated fields have higher similarities to the simulation fields at block sizes exceeding 240 m. The validations prove that this hybrid sampling method is effective for both objectives when the characteristics of the optimized variables are unknown. PMID:25317762

  16. Hybrid optimal design of the eco-hydrological wireless sensor network in the middle reach of the Heihe River Basin, China.

    PubMed

    Kang, Jian; Li, Xin; Jin, Rui; Ge, Yong; Wang, Jinfeng; Wang, Jianghao

    2014-10-14

    The eco-hydrological wireless sensor network (EHWSN) in the middle reaches of the Heihe River Basin in China is designed to capture the spatial and temporal variability and to estimate the ground truth for validating the remote sensing products. However, no prior information about the target variable is available. To meet both requirements, a hybrid model-based sampling method without any spatial autocorrelation assumptions is developed to optimize the distribution of EHWSN nodes based on geostatistics. This hybrid model incorporates two sub-criteria: one for the variogram modeling to represent the variability, another for improving the spatial prediction to evaluate remote sensing products. The reasonableness of the optimized EHWSN is validated in terms of representativeness, variogram modeling and spatial accuracy, using 15 types of simulation fields generated with unconditional geostatistical stochastic simulation. The sampling design shows good representativeness; variograms estimated from the samples have less than 3% mean error relative to the true variograms. Then, fields at multiple scales are predicted. As the scale increases, estimated fields have higher similarities to the simulation fields at block sizes exceeding 240 m. The validations prove that this hybrid sampling method is effective for both objectives when the characteristics of the optimized variables are unknown.
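    The variogram-modeling criterion in this design rests on the experimental semivariogram, which measures how dissimilarity between observations grows with separation distance. Below is a minimal sketch for a regularly spaced 1-D transect; it is illustrative only and not the authors' optimization model.

    ```python
    def empirical_variogram(values, max_lag):
        """Experimental semivariogram of a regularly spaced 1-D transect:
        gamma(h) = half the mean squared increment at each lag h."""
        gammas = []
        for h in range(1, max_lag + 1):
            diffs = [(values[i + h] - values[i]) ** 2
                     for i in range(len(values) - h)]
            gammas.append(sum(diffs) / (2 * len(diffs)))
        return gammas

    # A strictly alternating series: maximal dissimilarity at lag 1,
    # perfect similarity at lag 2.
    gamma = empirical_variogram([0.0, 1.0] * 5, max_lag=2)
    ```

    Fitting a model (spherical, exponential, etc.) to such empirical values is what a sampling design must support well enough for the variogram parameters to be recoverable.
    
    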

  17. TEMPORAL VARIATION IN OHIO RIVER MACROINVERTEBRATES: A HISTORICAL ROCK BASKET COMPARISON, 1960'S TO PRESENT

    EPA Science Inventory

    Collection of representative macroinvertebrate samples has historically been a problem for researchers working on the Ohio River. The USEPA utilized rock basket artificial substrates to sample benthic assemblages from 1964-1971. By this method, a steel basket (7" diameter, 11" ...

  18. Evaluating the effect of disturbed ensemble distributions on SCFG based statistical sampling of RNA secondary structures.

    PubMed

    Scheid, Anika; Nebel, Markus E

    2012-07-09

    Over the past years, statistical and Bayesian approaches have become increasingly appreciated to address the long-standing problem of computational RNA structure prediction. Recently, a novel probabilistic method for the prediction of RNA secondary structures from a single sequence has been studied which is based on generating statistically representative and reproducible samples of the entire ensemble of feasible structures for a particular input sequence. This method samples the possible foldings from a distribution implied by a sophisticated (traditional or length-dependent) stochastic context-free grammar (SCFG) that mirrors the standard thermodynamic model applied in modern physics-based prediction algorithms. Specifically, that grammar represents an exact probabilistic counterpart to the energy model underlying the Sfold software, which employs a sampling extension of the partition function (PF) approach to produce statistically representative subsets of the Boltzmann-weighted ensemble. Although both sampling approaches have the same worst-case time and space complexities, it has been indicated that they differ in performance (both with respect to prediction accuracy and quality of generated samples), where neither of these two competing approaches generally outperforms the other. In this work, we will consider the SCFG based approach in order to perform an analysis on how the quality of generated sample sets and the corresponding prediction accuracy changes when different degrees of disturbances are incorporated into the needed sampling probabilities. This is motivated by the fact that if the results prove to be resistant to large errors on the distinct sampling probabilities (compared to the exact ones), then it will be an indication that these probabilities do not need to be computed exactly, but it may be sufficient and more efficient to approximate them. 
Thus, it might then be possible to decrease the worst-case time requirements of such an SCFG based sampling method without significant accuracy losses. If, on the other hand, the quality of sampled structures can be observed to strongly react to slight disturbances, there is little hope for improving the complexity by heuristic procedures. We hence provide a reliable test for the hypothesis that a heuristic method could be implemented to improve the time scaling of RNA secondary structure prediction in the worst-case - without sacrificing much of the accuracy of the results. Our experiments indicate that absolute errors generally lead to the generation of useless sample sets, whereas relative errors seem to have only small negative impact on both the predictive accuracy and the overall quality of resulting structure samples. Based on these observations, we present some useful ideas for developing a time-reduced sampling method guaranteeing an acceptable predictive accuracy. We also discuss some inherent drawbacks that arise in the context of approximation. The key results of this paper are crucial for the design of an efficient and competitive heuristic prediction method based on the increasingly accepted and attractive statistical sampling approach. This has indeed been indicated by the construction of prototype algorithms.

  19. Evaluating the effect of disturbed ensemble distributions on SCFG based statistical sampling of RNA secondary structures

    PubMed Central

    2012-01-01

    Background Over the past years, statistical and Bayesian approaches have become increasingly appreciated to address the long-standing problem of computational RNA structure prediction. Recently, a novel probabilistic method for the prediction of RNA secondary structures from a single sequence has been studied which is based on generating statistically representative and reproducible samples of the entire ensemble of feasible structures for a particular input sequence. This method samples the possible foldings from a distribution implied by a sophisticated (traditional or length-dependent) stochastic context-free grammar (SCFG) that mirrors the standard thermodynamic model applied in modern physics-based prediction algorithms. Specifically, that grammar represents an exact probabilistic counterpart to the energy model underlying the Sfold software, which employs a sampling extension of the partition function (PF) approach to produce statistically representative subsets of the Boltzmann-weighted ensemble. Although both sampling approaches have the same worst-case time and space complexities, it has been indicated that they differ in performance (both with respect to prediction accuracy and quality of generated samples), where neither of these two competing approaches generally outperforms the other. Results In this work, we will consider the SCFG based approach in order to perform an analysis on how the quality of generated sample sets and the corresponding prediction accuracy changes when different degrees of disturbances are incorporated into the needed sampling probabilities. This is motivated by the fact that if the results prove to be resistant to large errors on the distinct sampling probabilities (compared to the exact ones), then it will be an indication that these probabilities do not need to be computed exactly, but it may be sufficient and more efficient to approximate them. 
Thus, it might then be possible to decrease the worst-case time requirements of such an SCFG based sampling method without significant accuracy losses. If, on the other hand, the quality of sampled structures can be observed to strongly react to slight disturbances, there is little hope for improving the complexity by heuristic procedures. We hence provide a reliable test for the hypothesis that a heuristic method could be implemented to improve the time scaling of RNA secondary structure prediction in the worst-case – without sacrificing much of the accuracy of the results. Conclusions Our experiments indicate that absolute errors generally lead to the generation of useless sample sets, whereas relative errors seem to have only small negative impact on both the predictive accuracy and the overall quality of resulting structure samples. Based on these observations, we present some useful ideas for developing a time-reduced sampling method guaranteeing an acceptable predictive accuracy. We also discuss some inherent drawbacks that arise in the context of approximation. The key results of this paper are crucial for the design of an efficient and competitive heuristic prediction method based on the increasingly accepted and attractive statistical sampling approach. This has indeed been indicated by the construction of prototype algorithms. PMID:22776037

  20. Evapotranspiration Measurement and Estimation: Weighing Lysimeter and Neutron Probe Based Methods Compared with Eddy Covariance

    NASA Astrophysics Data System (ADS)

    Evett, S. R.; Gowda, P. H.; Marek, G. W.; Alfieri, J. G.; Kustas, W. P.; Brauer, D. K.

    2014-12-01

    Evapotranspiration (ET) may be measured by mass balance methods and estimated by flux sensing methods. The mass balance methods are typically restricted in terms of the area that can be represented (e.g., surface area of weighing lysimeter (LYS) or equivalent representative area of neutron probe (NP) and soil core sampling techniques), and can be biased with respect to ET from the surrounding area. The area represented by flux sensing methods such as eddy covariance (EC) is typically estimated with a flux footprint/source area model. The dimension, position of, and relative contribution of upwind areas within the source area are mainly influenced by sensor height, wind speed, atmospheric stability and wind direction. Footprints for EC sensors positioned several meters above the canopy are often larger than can be economically covered by mass balance methods. Moreover, footprints move with atmospheric conditions and wind direction to cover different field areas over time while mass balance methods are static in space. Thus, EC systems typically sample a much greater field area over time compared with mass balance methods. Spatial variability of surface cover can thus complicate interpretation of flux estimates from EC systems. The most commonly used flux estimation method is EC; and EC estimates of latent heat energy (representing ET) and sensible heat fluxes combined are typically smaller than the available energy from net radiation and soil heat flux (commonly referred to as lack of energy balance closure). Reasons for this are the subject of ongoing research. We compare ET from LYS, NP and EC methods applied to field crops for three years at Bushland, Texas (35° 11' N, 102° 06' W, 1170 m elevation above MSL) to illustrate the potential problems with and comparative advantages of all three methods. 
In particular, we examine how networks of neutron probe access tubes can be representative of field areas large enough to be equivalent in size to EC footprints, and how the ET data from these methods can address bias and accuracy issues.

  1. Effective School-Community Relations as a Key Performance Indicator for the Secondary School Administrator in Aba South District, Nigeria

    ERIC Educational Resources Information Center

    Abraham, Nath. M.; Ememe, Ogbonna N.

    2012-01-01

    This study investigates Effective School-Community Relations as a key Performance Indicator (KPI) for the Secondary School Administrator in Aba South District, Nigeria. A descriptive survey method was adopted. All 248 teachers made up both the population and the sample, a purposive sampling technique representing 100% of the population. A…

  2. Social and Emotional Components of Book Reading between Caregivers and Their Toddlers in a High-Risk Sample

    ERIC Educational Resources Information Center

    Cross, Jennifer Riedl; Fletcher, Kathryn L.; Speirs Neumeister, Kristie L.

    2011-01-01

    In this collective case study of caregiver behaviors with their toddlers, two-minute videotaped reading interactions were analyzed using a constant comparative method. Twenty-four caregiver-toddler dyads from a high-risk sample of children prenatally exposed to cocaine were selected from a larger sample because they represented the extremes of…

  3. Problematic Preferences? A Mixed Method Examination of Principals' Preferences for Teacher Characteristics in Chicago

    ERIC Educational Resources Information Center

    Engel, Mimi

    2013-01-01

    Purpose: Relatively little is known about how principals make decisions about teacher hiring. This article uses mixed methods to examine what characteristics principals look for in teachers. Research Methods: Data were gathered using a mixed method approach, including in-depth interviews with a representative sample of 31 principals as well as an…

  4. Inhibition Of Molecular And Biological Processes Using Modified Oligonucleotides

    DOEpatents

    Kozyavkin, Sergei A.; Malykh, Andrei G.; Polouchine, Nikolai N.; Slesarev, Alexei I.

    2003-04-15

    A method of inhibiting at least one molecular process in a sample, comprising administering to the sample an oligonucleotide or polynucleotide containing at least one monomeric unit having formula (I): wherein A is an organic moiety, n is at least 1, and each X is independently selected from the group consisting of --NRCOCONu, --NHCOCR.sub.2 CR.sub.2 CONu, --NHCOCR.dbd.CRCONu, and --NHCOSSCONu, wherein each R independently represents H or a substituted or unsubstituted alkyl group, and Nu represents a nucleophile, or a salt of the compound.

  5. Statistical scaling of geometric characteristics in stochastically generated pore microstructures

    DOE PAGES

    Hyman, Jeffrey D.; Guadagnini, Alberto; Winter, C. Larrabee

    2015-05-21

    In this study, we analyze the statistical scaling of structural attributes of virtual porous microstructures that are stochastically generated by thresholding Gaussian random fields. Characterization of the extent to which randomly generated pore spaces can be considered as representative of a particular rock sample depends on the metrics employed to compare the virtual sample against its physical counterpart. Typically, comparisons against features and/or patterns of geometric observables, e.g., porosity and specific surface area, flow-related macroscopic parameters, e.g., permeability, or autocorrelation functions are used to assess the representativeness of a virtual sample, and thereby the quality of the generation method. Here, we rely on manifestations of statistical scaling of geometric observables which were recently observed in real millimeter scale rock samples [13] as additional relevant metrics by which to characterize a virtual sample. We explore the statistical scaling of two geometric observables, namely porosity (Φ) and specific surface area (SSA), of porous microstructures generated using the method of Smolarkiewicz and Winter [42] and Hyman and Winter [22]. Our results suggest that the method can produce virtual pore space samples displaying the symptoms of statistical scaling observed in real rock samples. Order q sample structure functions (statistical moments of absolute increments) of Φ and SSA scale as a power of the separation distance (lag) over a range of lags, and extended self-similarity (linear relationship between log structure functions of successive orders) appears to be an intrinsic property of the generated media. The width of the range of lags where power-law scaling is observed and the Hurst coefficient associated with the variables we consider can be controlled by the generation parameters of the method.
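    The order-q sample structure functions referred to here have a simple computational form. The sketch below is illustrative only; the linear test series is made up, chosen because its scaling is exact rather than merely statistical.

    ```python
    def structure_function(series, q, lag):
        """Order-q sample structure function: the mean of |increment|**q
        over all pairs of points separated by the given lag."""
        incs = [abs(series[i + lag] - series[i])
                for i in range(len(series) - lag)]
        return sum(v ** q for v in incs) / len(incs)

    # For a linear series with slope 0.5, every lag-h increment equals 0.5*h,
    # so S_q(h) = (0.5*h)**q exactly: pure power-law scaling in the lag.
    series = [0.5 * i for i in range(20)]
    s2 = structure_function(series, q=2, lag=4)
    ```

    Power-law scaling in a real field is diagnosed by regressing log S_q against log lag; extended self-similarity is the analogous linearity of log S_q against log S_p for successive orders.
    
    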

  6. REAL TIME MONITORING OF PCDD/PCDF FOR TRANSIENT CHARACTERIZATION AND PROCESS CONTROL

    EPA Science Inventory

    Current sampling methods for PCDD/F emission compliance make use of a sample taken during steady state conditions which is assumed to be representative of facility performance. This is often less than satisfactory. The rapid variation of PCDDs, PCDF, and other co-pollutants due ...

  7. Associations among Adolescent Risk Behaviours and Self-Esteem in Six Domains

    ERIC Educational Resources Information Center

    Wild, Lauren G.; Flisher, Alan J.; Bhana, Arvin; Lombard, Carl

    2004-01-01

    Background: This study investigated associations among adolescents' self-esteem in 6 domains (peers, school, family, sports/athletics, body image and global self-worth) and risk behaviours related to substance use, bullying, suicidality and sexuality. Method: A multistage stratified sampling strategy was used to select a representative sample of…

  8. Evaluation of nutrient quality-assurance data for Alexanders and Mount Rock Spring basins, Cumberland County, Pennsylvania

    USGS Publications Warehouse

    Witt, E. C.; Hippe, D.J.; Giovannitti, R.M.

    1992-01-01

    A total of 304 nutrient samples were collected from May 1990 through September 1991 to determine concentrations and loads of nutrients in water discharged from two spring basins in Cumberland County, Pa. Fifty-four percent of these nutrient samples were for the evaluation of (1) laboratory consistency, (2) container and preservative cleanliness, (3) maintenance of analyte representativeness as affected by three different preservation methods, and (4) comparison of analyte results with the "Most Probable Value" for Standard Reference Water Samples. Results of 37 duplicate analyses indicate that the Pennsylvania Department of Environmental Resources, Bureau of Laboratories (principal laboratory) remained within its ±10 percent goal for all but one analyte. Results of the blank analyses show that the sampling containers did not compromise the water quality. However, mercuric-chloride-preservation blanks apparently contained measurable ammonium in four of five samples and ammonium plus organic nitrogen in two of five samples. Interlaboratory results indicate substantial differences in the determination of nitrate and ammonium plus organic nitrogen between the principal laboratory and the U.S. Geological Survey National Water-Quality Laboratory. In comparison with the U.S. Environmental Protection Agency Quality-Control Samples, the principal laboratory was sufficiently accurate in its determination of nutrient analytes. Analysis of replicate samples indicated that sulfuric-acid preservative best maintained the representativeness of the analytes nitrate and ammonium plus organic nitrogen, whereas mercuric chloride best maintained the representativeness of orthophosphate. Comparison of nutrient analyte determinations with the Most Probable Value for each preservation method shows that two of five analytes with no chemical preservative compare well, three of five with mercuric-chloride preservative compare well, and three of five with sulfuric-acid preservative compare well.

  9. Guidelines for sample collecting and analytical methods used in the U.S. Geological Survey for determining chemical composition of coal

    USGS Publications Warehouse

    Swanson, Vernon Emanuel; Huffman, Claude

    1976-01-01

    This report is intended to meet the many requests for information on current U.S. Geological Survey procedures in handling coal samples. In general, the exact type and number of samples of coal and associated rock to be collected are left to the best judgment of the geologist. Samples should be of unweathered coal or rock and representative of the bed or beds sampled; it is recommended that two channel samples, separated by 10 to 100 yards (10 to 100 metres) and weighing 4 to 5 pounds (1.8 to 2.3 kilograms) each, be collected for each 5 feet (1.5 metres) of vertical section. Care must be taken to avoid any sample contamination, and to record the exact locality, thickness, and stratigraphic information for each sample. Analytical methods are described for the determination of major, minor, and trace elements in coal. Hg, As, Sb, F, Se, U, and Th are determined in the raw coal, and the following 34 elements are determined after ashing the coal: Si, Al, Ca, Mg, Na, K, Fe (total), Cl, Ti, Mn, P, S (total), Cd, Li, Cu, Zn, Pb, B, Ba, Be, Co, Cr, Ga, La, Mo, Nb, Ni, Sc, Sr, Ti, V, Y, Yb, and Zr. The methods used to determine these elements include atomic absorption spectroscopy, X-ray fluorescence spectroscopy, optical emission spectroscopy, spectrophotometry, selective-ion electrode, and neutron activation analysis. A split of representative coal samples is submitted to the U.S. Bureau of Mines for proximate, ultimate, forms of sulfur, and Btu determinations.

  10. A mixture model with a reference-based automatic selection of components for disease classification from protein and/or gene expression levels

    PubMed Central

    2011-01-01

    Background Bioinformatics data analysis often uses a linear mixture model that represents samples as additive mixtures of components. Properly constrained blind matrix factorization methods extract those components using mixture samples only. However, automatic selection of the extracted components to be retained for classification analysis remains an open issue. Results The method proposed here is applied to well-studied protein and genomic datasets of ovarian, prostate and colon cancers to extract components for disease prediction. It achieves average sensitivities of 96.2% (sd = 2.7%), 97.6% (sd = 2.8%) and 90.8% (sd = 5.5%) and average specificities of 93.6% (sd = 4.1%), 99% (sd = 2.2%) and 79.4% (sd = 9.8%) in 100 independent two-fold cross-validations. Conclusions We propose an additive mixture model of a sample for feature extraction using, in principle, sparseness-constrained factorization on a sample-by-sample basis; existing methods, by contrast, factorize the complete dataset simultaneously. The sample model is composed of a reference sample representing control and/or case (disease) groups and a test sample. Each sample is decomposed into two or more components that are selected automatically (without using label information) as control specific, case specific or not differentially expressed (neutral). The number of components is determined by cross-validation. Automatic assignment of features (m/z ratios or genes) to a particular component is based on thresholds estimated from each sample directly. Owing to the locality of the decomposition, the strength of expression of each feature can vary across samples, yet each feature will still be allocated to the related disease- and/or control-specific component. Since label information is not used in the selection process, case- and control-specific components can be used for classification, which is not the case with standard factorization methods. Moreover, the component selected by the proposed method as disease specific can be interpreted as a sub-mode and retained for further analysis to identify potential biomarkers; unlike standard matrix factorization methods, this can be achieved on a sample (experiment)-by-sample basis. Postulating one or more components with indifferent features enables their removal from the disease- and control-specific components on a sample-by-sample basis. This yields selected components with reduced complexity and, generally, increased prediction accuracy. PMID:22208882

  11. Progress in multirate digital control system design

    NASA Technical Reports Server (NTRS)

    Berg, Martin C.; Mason, Gregory S.

    1991-01-01

    A new methodology for multirate sampled-data control design is described, based on a new generalized control law structure, two new parameter-optimization-based control law synthesis methods, and a new singular-value-based robustness analysis method. The control law structure can represent multirate sampled-data control laws of arbitrary structure and dynamic order, with arbitrarily prescribed sampling rates for all sensors and update rates for all processor states and actuators. The two control law synthesis methods employ numerical optimization to determine values for the control law parameters. The robustness analysis method is based on the multivariable Nyquist criterion applied to the loop transfer function for the sampling period equal to the period of repetition of the system's complete sampling/update schedule. The complete methodology is demonstrated by application to the design of a combination yaw damper and modal suppression system for a commercial aircraft.

  12. A Radio-Map Automatic Construction Algorithm Based on Crowdsourcing

    PubMed Central

    Yu, Ning; Xiao, Chenxian; Wu, Yinfeng; Feng, Renjian

    2016-01-01

    Traditional radio-map-based localization methods need to sample a large number of location fingerprints offline, which requires a huge amount of human and material resources. To solve the high sampling cost problem, an automatic radio-map construction algorithm based on crowdsourcing is proposed. The algorithm employs the crowd-sourced information provided by a large number of users as they walk through buildings as the source of location fingerprint data. Through the variation characteristics of users’ smartphone sensors, the indoor anchors (doors) are identified and their locations are regarded as reference positions of the whole radio-map. The AP-Cluster method is used to cluster the crowdsourced fingerprints to acquire the representative fingerprints. According to the reference positions and the similarity between fingerprints, the representative fingerprints are linked to their corresponding physical locations and the radio-map is generated. Experimental results demonstrate that the proposed algorithm reduces the cost of fingerprint sampling and radio-map construction and guarantees the localization accuracy. The proposed method does not require users’ explicit participation, which effectively solves the resource-consumption problem when a location fingerprint database is established. PMID:27070623
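
    The clustering step above reduces many crowdsourced fingerprints to one representative per cluster. As a simplified, hypothetical stand-in for AP-Cluster (which is based on affinity propagation), this sketch takes the medoid of a cluster of RSSI fingerprint vectors as its representative; all readings are illustrative.

```python
def rssi_distance(a, b):
    """Euclidean distance between two RSSI fingerprint vectors (one dBm value per AP)."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def representative_fingerprint(cluster):
    """Medoid: the member fingerprint minimizing total distance to the others."""
    return min(cluster, key=lambda f: sum(rssi_distance(f, g) for g in cluster))

# Crowdsourced fingerprints collected near one anchor (illustrative dBm readings
# from three access points); the last vector is an off-cluster stray.
cluster = [[-40, -70, -80], [-42, -69, -79], [-41, -71, -82], [-60, -50, -90]]
rep = representative_fingerprint(cluster)
```

    The representative would then be linked to the anchor's physical position to populate the radio-map.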

  13. Accelerated High-Dimensional MR Imaging with Sparse Sampling Using Low-Rank Tensors

    PubMed Central

    He, Jingfei; Liu, Qiegen; Christodoulou, Anthony G.; Ma, Chao; Lam, Fan

    2017-01-01

    High-dimensional MR imaging often requires long data acquisition time, thereby limiting its practical applications. This paper presents a low-rank tensor based method for accelerated high-dimensional MR imaging using sparse sampling. This method represents high-dimensional images as low-rank tensors (or partially separable functions) and uses this mathematical structure for sparse sampling of the data space and for image reconstruction from highly undersampled data. More specifically, the proposed method acquires two datasets with complementary sampling patterns, one for subspace estimation and the other for image reconstruction; image reconstruction from highly undersampled data is accomplished by fitting the measured data with a sparsity constraint on the core tensor and a group sparsity constraint on the spatial coefficients jointly using the alternating direction method of multipliers. The usefulness of the proposed method is demonstrated in MRI applications; it may also have applications beyond MRI. PMID:27093543

  14. Sample-space-based feature extraction and class preserving projection for gene expression data.

    PubMed

    Wang, Wenjun

    2013-01-01

    In order to overcome the problems of high computational complexity and serious matrix singularity in feature extraction using Principal Component Analysis (PCA) and Fisher's Linear Discriminant Analysis (LDA) on high-dimensional data, sample-space-based feature extraction is presented, which transforms the computation procedure of feature extraction from gene space to sample space by representing the optimal transformation vector as a weighted sum of samples. The technique is used in the implementation of PCA, LDA, and Class Preserving Projection (CPP), a newly proposed method for discriminant feature extraction, and the experimental results on gene expression data demonstrate the effectiveness of the method.
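
    The core idea, writing the transformation vector as a weighted sum of samples so that all computation happens in the n-dimensional sample space rather than the d-dimensional gene space, can be sketched for PCA with a Gram-matrix power iteration. The data and names below are illustrative, not from the paper.

```python
def top_component_sample_space(X, iters=200):
    """First principal direction of row-samples X (n x d), computed in sample space.

    The direction is represented as v = sum_i w[i] * Xc[i]; the weights w are the
    leading eigenvector of the n x n Gram matrix K = Xc Xc^T (cheap when n << d),
    found here by plain power iteration.
    """
    n, d = len(X), len(X[0])
    # Center the data feature-wise.
    means = [sum(row[j] for row in X) / n for j in range(d)]
    Xc = [[row[j] - means[j] for j in range(d)] for row in X]
    # n x n Gram matrix instead of the d x d covariance matrix.
    K = [[sum(a * b for a, b in zip(Xc[i], Xc[k])) for k in range(n)] for i in range(n)]
    w = [float(i + 1) for i in range(n)]  # asymmetric start to avoid orthogonal init
    for _ in range(iters):
        w = [sum(K[i][k] * w[k] for k in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        w = [x / norm for x in w]
    # Map the sample-space weights back to a unit vector in gene space.
    v = [sum(w[i] * Xc[i][j] for i in range(n)) for j in range(d)]
    nv = sum(x * x for x in v) ** 0.5
    return [x / nv for x in v]

# Four "samples" in a 6-dimensional "gene space", varying mainly along gene 0.
X = [[5.0, 0.1, 0.0, 0.2, 0.0, 0.1],
     [-5.0, 0.0, 0.1, 0.0, 0.2, 0.0],
     [4.0, 0.2, 0.0, 0.1, 0.0, 0.2],
     [-4.0, 0.1, 0.2, 0.0, 0.1, 0.0]]
v = top_component_sample_space(X)
```

    Only a 4x4 eigenproblem is solved here, while the naive covariance route would work with a 6x6 (and, for real gene data, a d x d with d in the thousands) matrix.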

  15. Efficient dynamic graph construction for inductive semi-supervised learning.

    PubMed

    Dornaika, F; Dahbi, R; Bosaghzadeh, A; Ruichek, Y

    2017-10-01

    Most graph construction techniques assume a transductive setting in which the whole data collection is available at construction time. Addressing graph construction for the inductive setting, in which data arrive sequentially, has received much less attention. For inductive settings, constructing the graph from scratch can be very time-consuming. This paper introduces a generic framework that is able to make any graph construction method incremental. This framework yields an efficient and dynamic graph construction method that adds new samples (labeled or unlabeled) to a previously constructed graph. As a case study, we use the recently proposed Two Phase Weighted Regularized Least Square (TPWRLS) graph construction method. The paper has two main contributions. First, we use the TPWRLS coding scheme to represent new sample(s) with respect to an existing database. The representative coefficients are then used to update the graph affinity matrix. The proposed method not only appends the new samples to the graph but also updates the whole graph structure by discovering which nodes are affected by the introduction of new samples and by updating their edge weights. The second contribution of the article is the application of the proposed framework to the problem of graph-based label propagation using multiple observations for vision-based recognition tasks. Experiments on several image databases show that, without any significant loss in the accuracy of the final classification, the proposed dynamic graph construction is more efficient than the batch graph construction. Copyright © 2017 Elsevier Ltd. All rights reserved.
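
    The incremental idea, appending a node and rewiring only the affected neighborhoods instead of rebuilding the whole graph, can be sketched with a plain kNN affinity graph. This is a generic sketch of the incremental pattern, not the TPWRLS coding scheme; data and names are illustrative.

```python
def euclid(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def build_knn_graph(data, k=2):
    """Batch construction: graph[i] = set of indices of the k nearest other samples."""
    graph = {}
    for i, x in enumerate(data):
        others = [(euclid(x, y), j) for j, y in enumerate(data) if j != i]
        graph[i] = {j for _, j in sorted(others)[:k]}
    return graph

def add_sample(graph, data, x, k=2):
    """Incrementally insert x: link the new node, then rewire only affected nodes.

    Every node is checked, but an existing node's edges change only if the new
    sample is closer than its current farthest neighbor.
    """
    new = len(data)
    dists = {i: euclid(x, y) for i, y in enumerate(data)}
    data.append(x)
    graph[new] = set(sorted(dists, key=dists.get)[:k])
    for i in list(graph):
        if i == new:
            continue
        farthest = max(graph[i], key=lambda j: euclid(data[i], data[j]))
        if dists[i] < euclid(data[i], data[farthest]):
            graph[i].discard(farthest)
            graph[i].add(new)
    return graph

data = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [5.0, 5.0]]
g = build_knn_graph(data, k=2)
add_sample(g, data, [5.0, 4.0], k=2)  # only the isolated node [5, 5] is rewired
```

    A coding-based construction like TPWRLS would replace the Euclidean weights with representation coefficients, but the update pattern is the same.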

  16. Small Body GN and C Research Report: G-SAMPLE - An In-Flight Dynamical Method for Identifying Sample Mass [External Release Version

    NASA Technical Reports Server (NTRS)

    Carson, John M., III; Bayard, David S.

    2006-01-01

    G-SAMPLE is an in-flight dynamical method for use by sample collection missions to identify the presence and quantity of collected sample material. The G-SAMPLE method implements a maximum-likelihood estimator to identify the collected sample mass, based on onboard force sensor measurements, thruster firings, and a dynamics model of the spacecraft. With G-SAMPLE, sample mass identification becomes a computation rather than an extra hardware requirement; the added cost of cameras or other sensors for sample mass detection is avoided. Realistic simulation examples are provided for a spacecraft configuration with a sample collection device mounted on the end of an extended boom. In one representative example, a 1000 gram sample mass is estimated to within 110 grams (95% confidence) under realistic assumptions of thruster profile error, spacecraft parameter uncertainty, and sensor noise. For convenience to future mission design, an overall sample-mass estimation error budget is developed to approximate the effect of model uncertainty, sensor noise, data rate, and thrust profile error on the expected estimate of collected sample mass.
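
    A toy, one-axis version of the estimation idea (not the actual G-SAMPLE estimator, which uses a full spacecraft dynamics model): with a known thrust profile F_t and measured accelerations a_t = F_t / m plus noise, the least-squares fit of the inverse mass yields the mass estimate. All numbers are illustrative.

```python
def estimate_mass(forces, accels):
    """Least-squares estimate of mass from a = (1/m) * F: fit the inverse mass
    beta minimizing sum (a_t - beta * F_t)^2, then invert it."""
    beta = sum(f * a for f, a in zip(forces, accels)) / sum(f * f for f in forces)
    return 1.0 / beta

# Illustrative thruster force profile (N) and the noiseless accelerations (m/s^2)
# it would produce for a 501.0 kg spacecraft-plus-sample stack.
true_mass = 501.0
forces = [10.0, 20.0, 15.0, 25.0]
accels = [f / true_mass for f in forces]
m_hat = estimate_mass(forces, accels)
```

    With sensor noise added, this least-squares fit coincides with the maximum-likelihood estimate under Gaussian errors, which is the spirit of the flight method; comparing before- and after-collection estimates would then bound the collected sample mass.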

  17. Proximate Composition Analysis.

    PubMed

    2016-01-01

    The proximate composition of foods includes moisture, ash, lipid, protein and carbohydrate contents. These food components may be of interest in the food industry for product development, quality control (QC) or regulatory purposes. Analyses used may be rapid methods for QC or more accurate but time-consuming official methods. Sample collection and preparation must be considered carefully to ensure analysis of a homogeneous and representative sample, and to obtain accurate results. Estimation methods for moisture content, ash value, crude lipid, total carbohydrates, starch, total free amino acids and total proteins are presented in a lucid manner.
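
    Two of the simplest proximate calculations can be shown directly: moisture from oven-drying weight loss, and total carbohydrate estimated "by difference" from the other fractions. The weights and percentages below are hypothetical.

```python
def moisture_percent(wet_g, dried_g):
    """Moisture (%) from oven-drying: weight lost relative to the wet sample."""
    return 100.0 * (wet_g - dried_g) / wet_g

def carbohydrate_by_difference(moisture, ash, protein, lipid):
    """Total carbohydrate (%) estimated by difference from the other proximate fractions."""
    return 100.0 - (moisture + ash + protein + lipid)

m = moisture_percent(10.0, 8.8)                                       # 12.0 %
carb = carbohydrate_by_difference(m, ash=2.0, protein=11.0, lipid=4.0)  # 71.0 %
```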

  18. Bayesian geostatistics in health cartography: the perspective of malaria.

    PubMed

    Patil, Anand P; Gething, Peter W; Piel, Frédéric B; Hay, Simon I

    2011-06-01

    Maps of parasite prevalences and other aspects of infectious diseases that vary in space are widely used in parasitology. However, spatial parasitological datasets rarely, if ever, have sufficient coverage to allow exact determination of such maps. Bayesian geostatistics (BG) is a method for finding a large sample of maps that can explain a dataset, in which maps that do a better job of explaining the data are more likely to be represented. This sample represents the knowledge that the analyst has gained from the data about the unknown true map. BG provides a conceptually simple way to convert these samples to predictions of features of the unknown map, for example regional averages. These predictions account for each map in the sample, yielding an appropriate level of predictive precision.
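
    The conversion from a sample of candidate maps to a prediction with uncertainty can be sketched directly: evaluate the regional average on each sampled map, then summarize the resulting distribution. The grids below are tiny illustrative stand-ins for posterior map samples.

```python
def regional_average(grid, cells):
    """Mean of the mapped values over the cells that make up a region."""
    return sum(grid[r][c] for r, c in cells) / len(cells)

def predict_regional_average(map_samples, cells):
    """Predictive mean and central 80% interval of a regional average,
    computed over a posterior sample of candidate maps."""
    avgs = sorted(regional_average(m, cells) for m in map_samples)
    n = len(avgs)
    mean = sum(avgs) / n
    return mean, (avgs[n // 10], avgs[-1 - n // 10])

# Ten illustrative posterior "maps" (2 x 2 prevalence grids).
base = [[0.20, 0.30], [0.40, 0.50]]
maps = [[[p + 0.01 * k for p in row] for row in base] for k in range(10)]
mean, (lo, hi) = predict_regional_average(maps, [(0, 0), (0, 1)])
```

    Because every sampled map contributes to the summary, the interval reflects how much the plausible maps disagree, which is the "appropriate level of predictive precision" the abstract refers to.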

  19. Bayesian geostatistics in health cartography: the perspective of malaria

    PubMed Central

    Patil, Anand P.; Gething, Peter W.; Piel, Frédéric B.; Hay, Simon I.

    2011-01-01

    Maps of parasite prevalences and other aspects of infectious diseases that vary in space are widely used in parasitology. However, spatial parasitological datasets rarely, if ever, have sufficient coverage to allow exact determination of such maps. Bayesian geostatistics (BG) is a method for finding a large sample of maps that can explain a dataset, in which maps that do a better job of explaining the data are more likely to be represented. This sample represents the knowledge that the analyst has gained from the data about the unknown true map. BG provides a conceptually simple way to convert these samples to predictions of features of the unknown map, for example regional averages. These predictions account for each map in the sample, yielding an appropriate level of predictive precision. PMID:21420361

  20. Calculating a Continuous Metabolic Syndrome Score Using Nationally Representative Reference Values.

    PubMed

    Guseman, Emily Hill; Eisenmann, Joey C; Laurson, Kelly R; Cook, Stephen R; Stratbucker, William

    2018-02-26

    The prevalence of metabolic syndrome in youth varies on the basis of the classification system used, prompting implementation of continuous scores; however, the use of these scores is limited to the sample from which they were derived. We sought to describe the derivation of the continuous metabolic syndrome score using nationally representative reference values in a sample of obese adolescents and a national sample obtained from National Health and Nutrition Examination Survey (NHANES) 2011-2012. Clinical data were collected from 50 adolescents seeking obesity treatment at a stage 3 weight management center. A second analysis relied on data from adolescents included in NHANES 2011-2012, performed for illustrative purposes. The continuous metabolic syndrome score was calculated by regressing individual values onto nationally representative age- and sex-specific standards (NHANES III). Resultant z scores were summed to create a total score. The final sample included 42 obese adolescents (15 male and 35 female subjects; mean age, 14.8 ± 1.9 years) and an additional 445 participants from NHANES 2011-2012. Among the clinical sample, the mean continuous metabolic syndrome score was 4.16 ± 4.30, while the NHANES sample mean was substantially lower, at -0.24 ± 2.8. We provide a method to calculate the continuous metabolic syndrome score by comparing individual risk factor values to age- and sex-specific percentiles from a nationally representative sample. Copyright © 2018 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.
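
    The scoring step itself is just a sum of reference-standardized z scores. A minimal sketch follows; the reference means and SDs are placeholders, not the NHANES III values, and HDL is inverted so that a higher score is always less favorable.

```python
def z_score(value, ref_mean, ref_sd):
    """Standardize one risk factor against an age- and sex-specific reference."""
    return (value - ref_mean) / ref_sd

def continuous_mets_score(subject, reference):
    """Sum of z scores across risk factors; higher = worse metabolic profile.

    `reference` maps factor name -> (mean, sd) for the subject's age/sex stratum.
    """
    score = 0.0
    for factor, value in subject.items():
        mean, sd = reference[factor]
        zval = z_score(value, mean, sd)
        score += -zval if factor == "hdl" else zval  # invert protective HDL
    return score

# Hypothetical reference stratum (placeholder values, not NHANES III).
ref = {"waist": (78.0, 10.0), "sbp": (112.0, 10.0), "glucose": (90.0, 8.0),
       "triglycerides": (90.0, 40.0), "hdl": (50.0, 10.0)}
subj = {"waist": 98.0, "sbp": 122.0, "glucose": 98.0,
        "triglycerides": 130.0, "hdl": 40.0}
score = continuous_mets_score(subj, ref)
```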

  1. Rapidly differentiating grape seeds from different sources based on characteristic fingerprints using direct analysis in real time coupled with time-of-flight mass spectrometry combined with chemometrics.

    PubMed

    Song, Yuqiao; Liao, Jie; Dong, Junxing; Chen, Li

    2015-09-01

    The seeds of grapevine (Vitis vinifera) are a byproduct of wine production. To examine the potential value of grape seeds, grape seeds from seven sources were subjected to fingerprinting using direct analysis in real time coupled with time-of-flight mass spectrometry combined with chemometrics. Firstly, we listed all reported components (56 components) from grape seeds and calculated the precise m/z values of the deprotonated ions [M-H](-) . Secondly, the experimental conditions were systematically optimized based on the peak areas of total ion chromatograms of the samples. Thirdly, the seven grape seed samples were examined using the optimized method. Information about 20 grape seed components was utilized to represent characteristic fingerprints. Finally, hierarchical clustering analysis and principal component analysis were performed to analyze the data. Grape seeds from seven different sources were classified into two clusters; hierarchical clustering analysis and principal component analysis yielded similar results. The results of this study lay the foundation for appropriate utilization and exploitation of grape seed samples. Due to the absence of complicated sample preparation methods and chromatographic separation, the method developed in this study represents one of the simplest and least time-consuming methods for grape seed fingerprinting. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
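
    The first step above, computing the precise m/z of the deprotonated [M-H](-) ion, is simply the monoisotopic mass minus the mass of a proton. The input mass in the example is illustrative, not a value from the paper.

```python
PROTON_MASS = 1.007276  # Da

def deprotonated_mz(monoisotopic_mass):
    """Precise m/z of the [M-H]- ion for negative-mode DART-TOF peak matching."""
    return monoisotopic_mass - PROTON_MASS

mz = deprotonated_mz(290.079038)  # hypothetical monoisotopic mass of a component
```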

  2. Conduct Disorder and Oppositional Defiant Disorder in a National Sample: Developmental Epidemiology

    ERIC Educational Resources Information Center

    Maughan, Barbara; Rowe, Richard; Messer, Julie; Goodman, Robert; Meltzer, Howard

    2004-01-01

    Background: Despite an expanding epidemiological evidence base, uncertainties remain over key aspects of the epidemiology of the "antisocial" disorders in childhood and adolescence. Methods: We used cross-sectional data on a nationally representative sample of 10,438 5-15-year-olds drawn from the 1999 British Child Mental Health Survey…

  3. A Multilevel Testlet Model for Dual Local Dependence

    ERIC Educational Resources Information Center

    Jiao, Hong; Kamata, Akihito; Wang, Shudong; Jin, Ying

    2012-01-01

    The applications of item response theory (IRT) models assume local item independence and that examinees are independent of each other. When a representative sample for psychometric analysis is selected using a cluster sampling method in a testlet-based assessment, both local item dependence and local person dependence are likely to be induced.…

  4. Disabilities and Degrees: Identifying Health Impairments That Predict Lower Chances of College Enrollment and Graduation in a Nationally Representative Sample

    ERIC Educational Resources Information Center

    Rosenbaum, Janet E.

    2018-01-01

    Objective: Colleges have increased postsecondary educational access for youth, including individuals with disabilities, but completion rates remain low. This study tests the hypothesis that health conditions that reduce social integration predict lower educational attainment among college students. Method: The sample from the nationally…

  5. Latin hypercube approach to estimate uncertainty in ground water vulnerability

    USGS Publications Warehouse

    Gurdak, J.J.; McCray, J.E.; Thyne, G.; Qi, S.L.

    2007-01-01

    A methodology is proposed to quantify prediction uncertainty associated with ground water vulnerability models that were developed through an approach that coupled multivariate logistic regression with a geographic information system (GIS). This method uses Latin hypercube sampling (LHS) to illustrate the propagation of input error and estimate uncertainty associated with the logistic regression predictions of ground water vulnerability. Central to the proposed method is the assumption that prediction uncertainty in ground water vulnerability models is a function of input error propagation from uncertainty in the estimated logistic regression model coefficients (model error) and the values of explanatory variables represented in the GIS (data error). Input probability distributions that represent both model and data error sources of uncertainty were simultaneously sampled using a Latin hypercube approach with logistic regression calculations of probability of elevated nonpoint source contaminants in ground water. The resulting probability distribution represents the prediction intervals and associated uncertainty of the ground water vulnerability predictions. The method is illustrated through a ground water vulnerability assessment of the High Plains regional aquifer. Results of the LHS simulations reveal significant prediction uncertainties that vary spatially across the regional aquifer. Additionally, the proposed method enables a spatial deconstruction of the prediction uncertainty that can lead to improved prediction of ground water vulnerability. © 2007 National Ground Water Association.
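
    A minimal Latin hypercube sampler and its use to propagate coefficient uncertainty through a logistic prediction can be sketched as follows. The intercept and slope ranges are toy values, not the study's fitted model.

```python
import math
import random

def latin_hypercube(n, dims, rng):
    """n points in [0, 1)^dims with exactly one point in each of the n equal
    strata of every dimension (stratified, randomly permuted sampling)."""
    cols = []
    for _ in range(dims):
        strata = list(range(n))
        rng.shuffle(strata)
        cols.append([(s + rng.random()) / n for s in strata])
    return list(zip(*cols))

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

rng = random.Random(1)
# Toy vulnerability model: uncertain intercept in (-1.5, -0.5) and slope in
# (0.5, 1.5); both are sampled with LHS and pushed through the logistic link
# at a fixed explanatory-variable value of 1.0.
points = latin_hypercube(200, 2, rng)
preds = [logistic((-1.5 + u0) + (0.5 + u1) * 1.0) for u0, u1 in points]
lo, hi = min(preds), max(preds)
```

    The spread of `preds` plays the role of the prediction interval; in the study the same propagation is done per GIS cell, which is what yields the spatially varying uncertainty.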

  6. Detecting representative data and generating synthetic samples to improve learning accuracy with imbalanced data sets.

    PubMed

    Li, Der-Chiang; Hu, Susan C; Lin, Liang-Sian; Yeh, Chun-Wu

    2017-01-01

    It is difficult for learning models to achieve high classification performance with imbalanced data sets, because when one of the classes is much larger than the others, most machine learning and data mining classifiers are overly influenced by the larger classes and ignore the smaller ones. As a result, the classification algorithms often have poor learning performance due to slow convergence in the smaller classes. To balance such data sets, this paper presents a strategy that involves reducing the size of the majority data and generating synthetic samples for the minority data. In the reducing operation, we use the box-and-whisker plot approach to exclude outliers and the Mega-Trend-Diffusion method to find representative data from the majority data. To generate the synthetic samples, we propose a counterintuitive hypothesis to find the distributed shape of the minority data, and then produce samples according to this distribution. Four real datasets were used to examine the performance of the proposed approach. We used paired t-tests to compare the Accuracy, G-mean, and F-measure scores of the proposed data pre-processing method merged with the D3C method (PPDP+D3C) with those of one-sided selection (OSS), the well-known SMOTEBoost (SB) method, the normal distribution-based oversampling (NDO) approach, and the proposed data pre-processing (PPDP) method alone. The results indicate that the classification performance of the proposed approach is better than that of the above-mentioned methods.
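
    The two balancing operations can be sketched generically: box-and-whisker (IQR) screening of the majority class, and drawing synthetic minority samples from an estimated distribution. A plain normal fit stands in here for the paper's diffusion-based shape estimate; all data are synthetic.

```python
import random
import statistics

def iqr_filter(values):
    """Keep values inside the box-and-whisker fences (Q1/Q3 +/- 1.5 * IQR)."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    lo, hi = q1 - 1.5 * (q3 - q1), q3 + 1.5 * (q3 - q1)
    return [v for v in values if lo <= v <= hi]

def synthesize(minority, n_new, rng):
    """Draw synthetic minority samples from a normal fit to the observed minority data."""
    mu = statistics.mean(minority)
    sd = statistics.stdev(minority)
    return [rng.gauss(mu, sd) for _ in range(n_new)]

rng = random.Random(7)
majority = [rng.gauss(0.0, 1.0) for _ in range(200)] + [9.0, -9.0]  # two planted outliers
minority = [2.0, 2.2, 1.9, 2.1, 2.05]

clean_majority = iqr_filter(majority)                 # outliers excluded
new_minority = minority + synthesize(minority, len(clean_majority) - len(minority), rng)
```

    After both steps the two classes are the same size, so a downstream classifier is no longer dominated by the majority class.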

  7. Application of a luminescent bacterial biosensor for the detection of tetracyclines in routine analysis of poultry muscle samples.

    PubMed

    Pikkemaat, M G; Rapallini, M L B A; Karp, M T; Elferink, J W A

    2010-08-01

    Tetracyclines are extensively used in veterinary medicine. For the detection of tetracycline residues in animal products, a broad array of methods is available. Luminescent bacterial biosensors represent an attractive, inexpensive, simple and fast method for screening large numbers of samples. A previously developed cell-biosensor method was subjected to an evaluation study using over 300 routine poultry samples, and the results were compared with a microbial inhibition test. The cell-biosensor assay yielded many more suspect samples, 10.2% versus 2% with the inhibition test, all of which could be confirmed by liquid chromatography-tandem mass spectrometry (LC-MS/MS). Only one sample contained a concentration above the maximum residue limit (MRL) of 100 microg kg(-1), while residue levels in most of the suspect samples were very low (<10 microg kg(-1)). The method appeared to be specific and robust. Using an experimental set-up comprising the analysis of a series of three sample dilutions allowed an appropriate cut-off for confirmatory analysis, limiting the number of samples requiring further analysis to a minimum.

  8. Rapid Analysis of Carbohydrates in Bioprocess Samples: An Evaluation of the CarboPac SA10 for HPAE-PAD Analysis by Interlaboratory Comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sevcik, R. S.; Hyman, D. A.; Basumallich, L.

    2013-01-01

    A technique for carbohydrate analysis of bioprocess samples has been developed, providing reduced analysis time compared to current practice in the biofuels R&D community. The Thermofisher CarboPac SA10 anion-exchange column enables isocratic separation of monosaccharides, sucrose and cellobiose in approximately 7 minutes. Additionally, use of a low-volume (0.2 mL) injection valve in combination with a high-volume detection cell minimizes the extent of sample dilution required to bring sugar concentrations into the linear range of the pulsed amperometric detector (PAD). Three laboratories, representing academia, industry, and government, participated in an interlaboratory study which analyzed twenty-one opportunistic samples representing biomass pretreatment, enzymatic saccharification, and fermentation samples. The technique's robustness, linearity, and interlaboratory reproducibility were evaluated and showed excellent-to-acceptable characteristics. Additionally, quantitation by the CarboPac SA10/PAD was compared with the current practice method utilizing a HPX-87P/RID. While these two methods showed good agreement, a statistical comparison found significant quantitation differences between them, highlighting the difference between selective and universal detection modes.

  9. ADHD and Method Variance: A Latent Variable Approach Applied to a Nationally Representative Sample of College Freshmen

    ERIC Educational Resources Information Center

    Konold, Timothy R.; Glutting, Joseph J.

    2008-01-01

    This study employed a correlated trait-correlated method application of confirmatory factor analysis to disentangle trait and method variance from measures of attention-deficit/hyperactivity disorder obtained at the college level. The two trait factors were "Diagnostic and Statistical Manual of Mental Disorders-Fourth Edition" ("DSM-IV")…

  10. Validation sampling can reduce bias in health care database studies: an illustration using influenza vaccination effectiveness.

    PubMed

    Nelson, Jennifer Clark; Marsh, Tracey; Lumley, Thomas; Larson, Eric B; Jackson, Lisa A; Jackson, Michael L

    2013-08-01

    Estimates of treatment effectiveness in epidemiologic studies using large observational health care databases may be biased owing to inaccurate or incomplete information on important confounders. Study methods that collect and incorporate more comprehensive confounder data on a validation cohort may reduce confounding bias. We applied two such methods, namely imputation and reweighting, to Group Health administrative data (full sample) supplemented by more detailed confounder data from the Adult Changes in Thought study (validation sample). We used influenza vaccination effectiveness (with an unexposed comparator group) as an example and evaluated each method's ability to reduce bias using the control time period before influenza circulation. Both methods reduced, but did not completely eliminate, the bias compared with traditional effectiveness estimates that do not use the validation sample confounders. Although these results support the use of validation sampling methods to improve the accuracy of comparative effectiveness findings from health care database studies, they also illustrate that the success of such methods depends on many factors, including the ability to measure important confounders in a representative and large enough validation sample, the comparability of the full sample and validation sample, and the accuracy with which the data can be imputed or reweighted using the additional validation sample information. Copyright © 2013 Elsevier Inc. All rights reserved.
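
    The reweighting idea can be illustrated with simple post-stratification: weight the full-sample strata so the confounder distribution matches the validation sample, then recompute the outcome rate. All numbers are hypothetical, and this is a simplified stand-in for the study's reweighting method.

```python
def poststratified_rate(full_by_stratum, validation_share):
    """Outcome rate in the full sample, reweighted to the validation sample's
    confounder distribution. `full_by_stratum` maps stratum -> (events, n)."""
    return sum(validation_share[s] * events / n
               for s, (events, n) in full_by_stratum.items())

# Confounder stratum "frail" vs "healthy": the administrative database
# under-represents frailty relative to the richer validation cohort.
full = {"frail": (30, 100), "healthy": (45, 900)}   # crude rate = 75/1000 = 7.5%
share = {"frail": 0.25, "healthy": 0.75}            # distribution in validation sample

adjusted = poststratified_rate(full, share)         # 0.25 * 0.30 + 0.75 * 0.05
```

    Comparing `adjusted` between vaccinated and unvaccinated groups, rather than the crude rates, is what removes the part of the confounding that the validation-sample covariates capture.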

  11. Optimal tumor sampling for immunostaining of biomarkers in breast carcinoma

    PubMed Central

    2011-01-01

    Introduction Biomarkers, such as Estrogen Receptor, are used to determine therapy and prognosis in breast carcinoma. Immunostaining assays of biomarker expression have a high rate of inaccuracy; for example, estimates are as high as 20% for Estrogen Receptor. Biomarkers have been shown to be heterogeneously expressed in breast tumors and this heterogeneity may contribute to the inaccuracy of immunostaining assays. Currently, no evidence-based standards exist for the amount of tumor that must be sampled in order to correct for biomarker heterogeneity. The aim of this study was to determine the optimal number of 20X fields that are necessary to estimate a representative measurement of expression in a whole tissue section for selected biomarkers: ER, HER-2, AKT, ERK, S6K1, GAPDH, Cytokeratin, and MAP-Tau. Methods Two collections of whole tissue sections of breast carcinoma were immunostained for biomarkers. Expression was quantified using the Automated Quantitative Analysis (AQUA) method of quantitative immunofluorescence. Simulated sampling of various numbers of fields (ranging from one to thirty five) was performed for each marker. The optimal number was selected for each marker via resampling techniques and minimization of prediction error over an independent test set. Results The optimal number of 20X fields varied by biomarker, ranging between three to fourteen fields. More heterogeneous markers, such as MAP-Tau protein, required a larger sample of 20X fields to produce representative measurement. Conclusions The optimal number of 20X fields that must be sampled to produce a representative measurement of biomarker expression varies by marker with more heterogeneous markers requiring a larger number. The clinical implication of these findings is that breast biopsies consisting of a small number of fields may be inadequate to represent whole tumor biomarker expression for many markers. 
Additionally, for biomarkers newly introduced into clinical use, especially if therapeutic response is dictated by level of expression, the optimal size of tissue sample must be determined on a marker-by-marker basis. PMID:21592345
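    The simulated-sampling step described above can be sketched as follows. The per-field scores are invented Gaussians and the 10% tolerance is an arbitrary choice, not the paper's error criterion; the sketch shows only the mechanics of picking the smallest number of fields whose sampled mean tracks the whole-section mean.

```python
import random
import statistics

random.seed(1)

# Hypothetical whole-section data: one AQUA-like score per 20X field.
# A heterogeneous marker would show a wide spread of per-field scores.
fields = [random.gauss(50, 15) for _ in range(200)]
whole_section_mean = statistics.mean(fields)

def sampling_error(k, reps=500):
    """Mean absolute error of a k-field mean vs the whole-section mean."""
    errs = []
    for _ in range(reps):
        sample = random.sample(fields, k)
        errs.append(abs(statistics.mean(sample) - whole_section_mean))
    return statistics.mean(errs)

# Smallest number of fields whose average error is within 10% of the mean.
tolerance = 0.10 * whole_section_mean
optimal_k = next(k for k in range(1, 36) if sampling_error(k) <= tolerance)
print(optimal_k)
```

    Raising the per-field spread in the simulation raises `optimal_k`, which mirrors the paper's finding that more heterogeneous markers such as MAP-Tau need more fields.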

  12. Methods for estimating the amount of vernal pool habitat in the northeastern United States

    USGS Publications Warehouse

    Van Meter, R.; Bailey, L.L.; Grant, E.H.C.

    2008-01-01

    The loss of small, seasonal wetlands is a major concern for a variety of state, local, and federal organizations in the northeastern U.S. Identifying and estimating the number of vernal pools within a given region is critical to developing long-term conservation and management strategies for these unique habitats and their faunal communities. We use three probabilistic sampling methods (simple random sampling, adaptive cluster sampling, and the dual frame method) to estimate the number of vernal pools on protected, forested lands. Overall, these methods yielded similar values of vernal pool abundance for each study area, and suggest that photographic interpretation alone may grossly underestimate the number of vernal pools in forested habitats. We compare the relative efficiency of each method and discuss ways of improving precision. Acknowledging that the objectives of a study or monitoring program ultimately determine which sampling designs are most appropriate, we recommend that some type of probabilistic sampling method be applied. We view the dual-frame method as an especially useful way of combining incomplete remote sensing methods, such as aerial photograph interpretation, with a probabilistic sample of the entire area of interest to provide more robust estimates of the number of vernal pools and a more representative sample of existing vernal pool habitats.
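    The dual-frame idea, combining an incomplete list frame (photo-interpreted pools) with an area frame (a probability sample of plots), can be sketched as follows. The landscape, detection rate, and sample sizes are all invented for illustration.

```python
import random

random.seed(2)

# Hypothetical landscape: 400 plots; each plot holds 0-3 vernal pools,
# and photo interpretation detects each pool with only 50% probability.
plots = []
for _ in range(400):
    n_pools = random.choice([0, 0, 1, 1, 2, 3])
    listed = sum(1 for _ in range(n_pools) if random.random() < 0.5)
    plots.append({"pools": n_pools, "listed": listed})

true_total = sum(p["pools"] for p in plots)
list_frame_total = sum(p["listed"] for p in plots)  # photo-interpreted count

# Dual-frame idea: keep the complete list-frame count, and expand the
# unlisted pools found in a simple random sample of field-visited plots.
sampled = random.sample(plots, 80)
unlisted_in_sample = sum(p["pools"] - p["listed"] for p in sampled)
expansion = len(plots) / len(sampled)  # inverse inclusion probability
estimate = list_frame_total + unlisted_in_sample * expansion
print(true_total, list_frame_total, round(estimate))
```

    The estimator keeps every listed pool and expands only the unlisted pools found in the field sample, which is why photographic interpretation alone (`list_frame_total`) grossly underestimates the total.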

  13. Utilization of breast cancer screening methods in a developing nation: results from a nationally representative sample of Malaysian households.

    PubMed

    Dunn, Richard A; Tan, Andrew K G

    2011-01-01

    As is the case in many developing nations, previous studies of breast cancer screening behavior in Malaysia have used relatively small samples that are not nationally representative, thereby limiting the generalizability of results. Therefore, this study uses nationally representative data from the Malaysia Non-Communicable Disease Surveillance-1 to investigate the role of socio-economic status on breast cancer screening behavior in Malaysia, particularly differences in screening behaviour between ethnic groups. The decisions of 816 women above age 40 in Malaysia to screen for breast cancer using mammography, clinical breast exams (CBE), and breast self-exams (BSE) are modeled using logistic regression. Results indicate that after adjusting for differences in age, education, household income, marital status, and residential location, Malay women are less likely than Chinese and Indian women to utilize mammography, but more likely to perform BSE. Education level and urban residence are positively associated with utilization of each method, but these relationships vary across ethnicity. Higher education levels are strongly related to using each screening method among Chinese women, but have no statistically significant relationship to screening among Malays. © 2011 Wiley Periodicals, Inc.
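    The kind of model used here, a logistic regression of a screening decision on covariates, can be fitted from scratch in a short sketch. The data below are simulated with invented coefficients (the study's covariates, sample, and software differ); the point is only the mechanics of the fit.

```python
import math
import random

random.seed(3)

# Hypothetical toy data: model the decision to use mammography (0/1)
# from years of education and urban residence.
def simulate(n=500):
    rows = []
    for _ in range(n):
        edu = random.uniform(0, 16)
        urban = float(random.random() < 0.6)
        logit = -3.0 + 0.15 * edu + 0.8 * urban
        y = float(random.random() < 1 / (1 + math.exp(-logit)))
        rows.append(([1.0, edu, urban], y))
    return rows

data = simulate()

# Fit by plain gradient ascent on the log-likelihood (no libraries).
beta = [0.0, 0.0, 0.0]
for _ in range(1000):
    grad = [0.0, 0.0, 0.0]
    for x, y in data:
        p = 1 / (1 + math.exp(-sum(b * xi for b, xi in zip(beta, x))))
        for j in range(3):
            grad[j] += (y - p) * x[j]
    beta = [b + 0.05 * g / len(data) for b, g in zip(beta, grad)]

print([round(b, 2) for b in beta])  # intercept, education, urban
```

    Positive fitted coefficients on education and urban residence reproduce, in miniature, the associations reported in the abstract.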

  14. Evaluation of the NCPDP Structured and Codified Sig Format for e-prescriptions.

    PubMed

    Liu, Hangsheng; Burkhart, Q; Bell, Douglas S

    2011-01-01

    To evaluate the ability of the structure and code sets specified in the National Council for Prescription Drug Programs Structured and Codified Sig Format to represent ambulatory electronic prescriptions. We parsed the Sig strings from a sample of 20,161 de-identified ambulatory e-prescriptions into variables representing the fields of the Structured and Codified Sig Format. A stratified random sample of these representations was then reviewed by a group of experts. For codified Sig fields, we attempted to map the actual words used by prescribers to the equivalent terms in the designated terminology. Proportion of prescriptions that the Format could fully represent; proportion of terms used that could be mapped to the designated terminology. The fields defined in the Format could fully represent 95% of Sigs (95% CI 93% to 97%), but ambiguities were identified, particularly in representing multiple-step instructions. The terms used by prescribers could be codified for only 60% of dose delivery methods, 84% of dose forms, 82% of vehicles, 95% of routes, 70% of sites, 33% of administration timings, and 93% of indications. The findings are based on a retrospective sample of ambulatory prescriptions derived mostly from primary care physicians. The fields defined in the Format could represent most of the patient instructions in a large prescription sample, but prior to its mandatory adoption, further work is needed to ensure that potential ambiguities are addressed and that a complete set of terms is available for the codified fields.
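    The parsing-and-codification step can be illustrated with a toy sketch. The regex, field names, and code values below are all invented placeholders, not the actual NCPDP Format or its designated terminologies; the second example shows how terms fall back to free text when no code exists, mirroring the partial codification rates reported above.

```python
import re

# Hypothetical mini code tables; the real Format designates full standard
# terminologies, and every code below is an invented placeholder.
DOSE_FORMS = {"tablet": "F001", "capsule": "F002"}
ROUTES = {"by mouth": "R001"}
TIMINGS = {"twice daily": "T-BID", "once daily": "T-QD"}

def parse_sig(sig):
    """Split a free-text Sig into structured fields, codifying what we can."""
    m = re.match(r"take (\d+) (\w+) (by mouth) (.*)", sig.lower())
    if not m:
        return {"parsed": False, "uncodified": [sig]}
    qty, form, route, timing = m.groups()
    fields = {
        "parsed": True,
        "dose_quantity": int(qty),
        "dose_form": DOSE_FORMS.get(form),
        "route": ROUTES.get(route),
        "timing": TIMINGS.get(timing),
    }
    # Terms the code tables cannot represent stay as free text.
    fields["uncodified"] = [t for t, c in [(form, fields["dose_form"]),
                                           (route, fields["route"]),
                                           (timing, fields["timing"])]
                            if c is None]
    return fields

print(parse_sig("Take 1 tablet by mouth twice daily"))
print(parse_sig("Take 2 lozenges by mouth as needed"))
```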

  15. Methodological Challenges in Collecting Social and Behavioural Data Regarding the HIV Epidemic among Gay and Other Men Who Have Sex with Men in Australia

    PubMed Central

    Holt, Martin; de Wit, John; Brown, Graham; Maycock, Bruce; Fairley, Christopher; Prestage, Garrett

    2014-01-01

    Background Behavioural surveillance and research among gay and other men who have sex with men (GMSM) commonly rely on non-random recruitment approaches. Methodological challenges limit their ability to accurately represent the population of adult GMSM. We compared the social and behavioural profiles of GMSM recruited via venue-based, online, and respondent-driven sampling (RDS) and discussed their utility for behavioural surveillance. Methods Data from four studies were selected to reflect each recruitment method. We compared demographic characteristics and the prevalence of key indicators including sexual and HIV testing practices obtained from samples recruited through different methods, and population estimates from respondent-driven sampling partition analysis. Results Overall, the socio-demographic profile of GMSM was similar across samples, with some differences observed in age and sexual identification. Men recruited through time-location sampling appeared more connected to the gay community, reported a greater number of sexual partners, but engaged in less unprotected anal intercourse with regular (UAIR) or casual partners (UAIC). The RDS sample overestimated the proportion of HIV-positive men and appeared to recruit men with an overall higher number of sexual partners. A single-website survey recruited a sample with characteristics which differed considerably from the population estimates with regards to age, ethnic diversity and behaviour. Data acquired through time-location sampling underestimated the rates of UAIR and UAIC, while RDS and online sampling both generated samples that underestimated UAIR. Simulated composite samples combining recruits from time-location and multi-website online sampling may produce characteristics more consistent with the population estimates, particularly with regards to sexual practices. 
    Conclusion Respondent-driven sampling produced the sample that was most consistent with population estimates, but this methodology is complex and logistically demanding. Time-location and online recruitment are more cost-effective and easier to implement; using these approaches in combination may offer the potential to recruit a more representative sample of GMSM. PMID:25409440
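    One standard way RDS corrects the over-representation of well-connected men is the RDS-II (Volz-Heckathorn) estimator, which weights each respondent by the inverse of their reported network degree. The sketch below uses invented data and is not necessarily the partition analysis used in the study.

```python
# Hypothetical RDS respondents: reported network degree and HIV status.
# RDS oversamples high-degree men, so when status correlates with degree
# the naive sample proportion overstates prevalence; weighting each
# respondent by 1/degree (RDS-II) corrects for this.
respondents = [
    {"degree": 2, "positive": False},
    {"degree": 3, "positive": False},
    {"degree": 5, "positive": False},
    {"degree": 10, "positive": True},
    {"degree": 20, "positive": True},
    {"degree": 25, "positive": True},
]

naive = sum(r["positive"] for r in respondents) / len(respondents)

num = sum(r["positive"] / r["degree"] for r in respondents)
den = sum(1 / r["degree"] for r in respondents)
rds_ii = num / den

print(round(naive, 3), round(rds_ii, 3))
```

    Here the degree-weighted estimate falls well below the naive 50%, the same direction of correction implied by the abstract's observation that the RDS sample overestimated the proportion of HIV-positive men.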

  16. [Blood sampling using "dried blood spot": a clinical biology revolution underway?].

    PubMed

    Hirtz, Christophe; Lehmann, Sylvain

    2015-01-01

    Blood testing using dried blood spots (DBS) has been used in clinical analysis since the 1960s, mainly in the framework of neonatal screening (the Guthrie test). Since then, numerous analytes such as nucleic acids, small molecules and lipids have been successfully measured on DBS. While this pre-analytical method represents an interesting alternative to classic blood sampling, its routine use is still limited. We review here the different clinical applications of DBS blood sampling and assess its future place, supported by new analytical methods such as LC-MS mass spectrometry.

  17. Segments from red blood cell units should not be used for quality testing.

    PubMed

    Kurach, Jayme D R; Hansen, Adele L; Turner, Tracey R; Jenkins, Craig; Acker, Jason P

    2014-02-01

    Nondestructive testing of blood components could permit in-process quality control and reduce discards. Tubing segments, generated during red blood cell (RBC) component production, were tested to determine their suitability as a sample source for quality testing. Leukoreduced RBC components were produced from whole blood (WB) by two different methods: WB filtration and buffy coat (BC). Components and their corresponding segments were tested on Days 5 and 42 of hypothermic storage (HS) for spun hematocrit (Hct), hemoglobin (Hb) content, percentage hemolysis, hematologic indices, and adenosine triphosphate concentration to determine whether segment quality represents unit quality. Segment samples overestimated hemolysis on Days 5 and 42 of HS in both BC- and WB filtration-produced RBCs (p < 0.001 for all). Hct and Hb levels in the segments were also significantly different from the units at both time points for both production methods (p < 0.001 for all). Indeed, for all variables tested different results were obtained from segment and unit samples, and these differences were not consistent across production methods. The quality of samples from tubing segments is not representative of the quality of the corresponding RBC unit. Segments are not suitable surrogates with which to assess RBC quality. © 2013 American Association of Blood Banks.

  18. Moving on From Representativeness: Testing the Utility of the Global Drug Survey.

    PubMed

    Barratt, Monica J; Ferris, Jason A; Zahnow, Renee; Palamar, Joseph J; Maier, Larissa J; Winstock, Adam R

    2017-01-01

    A decline in response rates in traditional household surveys, combined with increased internet coverage and decreased research budgets, has resulted in increased attractiveness of web survey research designs based on purposive and voluntary opt-in sampling strategies. In the study of hidden or stigmatised behaviours, such as cannabis use, web survey methods are increasingly common. However, opt-in web surveys are often heavily criticised due to their lack of sampling frame and unknown representativeness. In this article, we outline the current state of the debate about the relevance of pursuing representativeness, the state of probability sampling methods, and the utility of non-probability, web survey methods especially for accessing hidden or minority populations. Our article has two aims: (1) to present a comprehensive description of the methodology we use at Global Drug Survey (GDS), an annual cross-sectional web survey and (2) to compare the age and sex distributions of cannabis users who voluntarily completed (a) a household survey or (b) a large web-based purposive survey (GDS), across three countries: Australia, the United States, and Switzerland. We find that within each set of country comparisons, the demographic distributions among recent cannabis users are broadly similar, demonstrating that the age and sex distributions of those who volunteer to be surveyed are not vastly different between these non-probability and probability methods. We conclude that opt-in web surveys of hard-to-reach populations are an efficient way of gaining in-depth understanding of stigmatised behaviours and are appropriate, as long as they are not used to estimate drug use prevalence of the general population.

  19. Moving on From Representativeness: Testing the Utility of the Global Drug Survey

    PubMed Central

    Barratt, Monica J; Ferris, Jason A; Zahnow, Renee; Palamar, Joseph J; Maier, Larissa J; Winstock, Adam R

    2017-01-01

    A decline in response rates in traditional household surveys, combined with increased internet coverage and decreased research budgets, has resulted in increased attractiveness of web survey research designs based on purposive and voluntary opt-in sampling strategies. In the study of hidden or stigmatised behaviours, such as cannabis use, web survey methods are increasingly common. However, opt-in web surveys are often heavily criticised due to their lack of sampling frame and unknown representativeness. In this article, we outline the current state of the debate about the relevance of pursuing representativeness, the state of probability sampling methods, and the utility of non-probability, web survey methods especially for accessing hidden or minority populations. Our article has two aims: (1) to present a comprehensive description of the methodology we use at Global Drug Survey (GDS), an annual cross-sectional web survey and (2) to compare the age and sex distributions of cannabis users who voluntarily completed (a) a household survey or (b) a large web-based purposive survey (GDS), across three countries: Australia, the United States, and Switzerland. We find that within each set of country comparisons, the demographic distributions among recent cannabis users are broadly similar, demonstrating that the age and sex distributions of those who volunteer to be surveyed are not vastly different between these non-probability and probability methods. We conclude that opt-in web surveys of hard-to-reach populations are an efficient way of gaining in-depth understanding of stigmatised behaviours and are appropriate, as long as they are not used to estimate drug use prevalence of the general population. PMID:28924351

  20. Sampling and analysis plan for sludge located on the floor and in the pits of the 105-K basins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BAKER, R.B.

    1998-11-20

    This Sampling and Analysis Plan (SAP) provides direction for the sampling of the sludge found on the floor and in the remote pits of the 105-K Basins to provide: (1) basic data for the sludges that have not been characterized to date and (2) representative sludge material for process tests to be made by the SNF Project/K Basins sludge treatment process subproject. The sampling equipment developed will remove representative samples of the radioactive sludge from underwater at the K Basins, depositing them in shielded containers for transport to the Hanford Site laboratories. Included in the present document is the basic background logic for selection of the samples to meet the requirements established in the Data Quality Objectives (DQO), HNF-2033, for this sampling activity. The present document also includes the laboratory analyses, methods, procedures, and reporting that will be required to meet the DQO.

  1. Multiplex Microsphere Immunoassays for the Detection of IgM and IgG to Arboviral Diseases

    PubMed Central

    Basile, Alison J.; Horiuchi, Kalanthe; Panella, Amanda J.; Laven, Janeen; Kosoy, Olga; Lanciotti, Robert S.; Venkateswaran, Neeraja; Biggerstaff, Brad J.

    2013-01-01

    Serodiagnosis of arthropod-borne viruses (arboviruses) at the Division of Vector-Borne Diseases, CDC, employs a combination of individual enzyme-linked immunosorbent assays and microsphere immunoassays (MIAs) to test for IgM and IgG, followed by confirmatory plaque-reduction neutralization tests. Based upon the geographic origin of a sample, it may be tested concurrently for multiple arboviruses, which can be a cumbersome task. The advent of multiplexing represents an opportunity to streamline these types of assays; however, because serologic cross-reactivity of the arboviral antigens often confounds results, it is of interest to employ data analysis methods that address this issue. Here, we constructed 13-virus multiplexed IgM and IgG MIAs that included internal and external controls, based upon the Luminex platform. Results from samples tested using these methods were analyzed using 8 different statistical schemes to identify the best way to classify the data. Geographic batteries were also devised to serve as a more practical diagnostic format, and further samples were tested using the abbreviated multiplexes. Comparative error rates for the classification schemes identified a specific boosting method based on logistic regression “Logitboost” as the classification method of choice. When the data from all samples tested were combined into one set, error rates from the multiplex IgM and IgG MIAs were <5% for all geographic batteries. This work represents both the most comprehensive, validated multiplexing method for arboviruses to date, and also the most systematic attempt to determine the most useful classification method for use with these types of serologic tests. PMID:24086608
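    The boosting approach can be illustrated with a simplified sketch: regression stumps fitted to the gradient of the logistic loss on one invented, cross-reactive MIA signal. This is a LogitBoost-flavoured toy, not the authors' "Logitboost" implementation, and all data and tuning constants are made up.

```python
import math
import random

random.seed(4)

# Hypothetical 1-D serology data: an MIA signal for one antigen, with
# cross-reactivity making the negative and positive classes overlap.
data = [(random.gauss(2.0, 1.0), 0) for _ in range(100)] + \
       [(random.gauss(4.0, 1.0), 1) for _ in range(100)]

def fit_stump(data, residuals):
    """Regression stump minimizing squared error against the residuals."""
    best = None
    for t in sorted({x for x, _ in data}):
        left = [r for (x, _), r in zip(data, residuals) if x <= t]
        right = [r for (x, _), r in zip(data, residuals) if x > t]
        if not left or not right:
            continue
        lmean, rmean = sum(left) / len(left), sum(right) / len(right)
        err = sum((r - lmean) ** 2 for r in left) + \
              sum((r - rmean) ** 2 for r in right)
        if best is None or err < best[0]:
            best = (err, t, lmean, rmean)
    _, t, lmean, rmean = best
    return lambda x: lmean if x <= t else rmean

# Boost stumps on the gradient of the logistic loss (a simplified
# LogitBoost-style scheme, not the exact algorithm used in the paper).
F = [0.0] * len(data)
for _ in range(10):
    residuals = [y - 1 / (1 + math.exp(-f)) for (_, y), f in zip(data, F)]
    stump = fit_stump(data, residuals)
    F = [f + 0.5 * stump(x) for f, (x, _) in zip(F, data)]

accuracy = sum((f > 0) == bool(y) for f, (_, y) in zip(F, data)) / len(data)
print(round(accuracy, 3))
```

    The real assays are multivariate (one signal per antigen per sample), but the same loop applies with multidimensional stumps.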

  2. Effect of ground control mesh on dust sampling and explosion mitigation.

    PubMed

    Alexander, D W; Chasko, L L

    2015-07-01

    Researchers from the National Institute for Occupational Safety and Health's Office of Mine Safety and Health Research conducted an assessment of the effects that ground control mesh might have on rock and float coal dust distribution in a coal mine. The increased use of mesh to control roof and rib spall introduces additional elevated surfaces on which rock or coal dust can collect. It is possible to increase the potential for dust explosion propagation if any float coal dust is not adequately inerted. In addition, the mesh may interfere with the collection of representative dust samples when using the pan-and-brush sampling method developed by the U.S. Bureau of Mines and used by the Mine Safety and Health Administration for band sampling. This study estimates the additional coal or rock dust that could accumulate on mesh and develops a means to collect representative dust samples from meshed entries.

  3. Effect of ground control mesh on dust sampling and explosion mitigation

    PubMed Central

    Alexander, D.W.; Chasko, L.L.

    2017-01-01

    Researchers from the National Institute for Occupational Safety and Health’s Office of Mine Safety and Health Research conducted an assessment of the effects that ground control mesh might have on rock and float coal dust distribution in a coal mine. The increased use of mesh to control roof and rib spall introduces additional elevated surfaces on which rock or coal dust can collect. It is possible to increase the potential for dust explosion propagation if any float coal dust is not adequately inerted. In addition, the mesh may interfere with the collection of representative dust samples when using the pan-and-brush sampling method developed by the U.S. Bureau of Mines and used by the Mine Safety and Health Administration for band sampling. This study estimates the additional coal or rock dust that could accumulate on mesh and develops a means to collect representative dust samples from meshed entries. PMID:28936000

  4. Vitamin concentrations in human milk vary with time within feed, circadian rhythm, and single-dose supplementation

    USDA-ARS?s Scientific Manuscript database

    Importance: Human milk is the subject of many nutrition studies but methods for representative sample collection are not established. Our recently improved, validated methods for analyzing micronutrients in human milk now enable systematic study of factors affecting their concentration. Objective...

  5. Fat- and water-soluble vitamin concentrations in human milk: effects of collection protocol, circadian variation and acute maternal supplementation

    USDA-ARS?s Scientific Manuscript database

    Importance: Human milk is the subject of many nutrition studies but methods for representative sample collection are not established. Our recently improved, validated methods for analyzing micronutrients in human milk now enable systematic study of factors affecting their concentration. Objective:...

  6. TNO/Centaurs grouping tested with asteroid data sets

    NASA Astrophysics Data System (ADS)

    Fulchignoni, M.; Birlan, M.; Barucci, M. A.

    2001-11-01

    Recently, we discussed the possible subdivision into a few groups of a sample of 22 TNOs and Centaurs for which BVRIJ photometry was available (Barucci et al., 2001, A&A, 371, 1150). We obtained these results using the multivariate statistics adopted to define the current asteroid taxonomy, namely Principal Components Analysis and the G-mode method (Tholen & Barucci, 1989, in ASTEROIDS II). How well do these methods work with a statistical sample as small as the TNO/Centaurs one? Theoretically, the number of degrees of freedom of the sample is adequate: it is 88 in our case, and it must exceed 50 to meet the requirements of the G-mode. Does a random sampling of a small number of members of a large population contain enough information to reveal some structure in the population? We extracted several samples of 22 asteroids from a database of 86 objects of known taxonomic type for which BVRIJ photometry is available from ECAS (Zellner et al., 1985, ICARUS 61, 355), SMASS II (S.W. Bus, 1999, PhD Thesis, MIT), and the Bell et al. atlas of asteroid infrared spectra. The objects constituting the first sample were selected to give a good representation of the major asteroid taxonomic classes (at least three members per class): C, S, D, A, and G. Both methods were able to distinguish all these groups, confirming the validity of the adopted methods. The S class is hard to identify as a consequence of the choice of the I and J variables, which omit information on the absorption band at 1 micron. The other samples were obtained by random selection of objects. Not all the major groups were well represented (fewer than three members per group), but the general trend of the asteroid taxonomy was always recovered. We conclude that the proposed grouping of TNOs/Centaurs is representative of some physico-chemical structure of the small-body population of the outer solar system.
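    The representativeness question the authors raise, whether a random 22-object subsample of an 86-object catalogue still covers every taxonomic class, can be checked by simulation. The class sizes below are invented for illustration (the real catalogue's class balance differs).

```python
import random
from collections import Counter

random.seed(5)

# Hypothetical catalogue mimicking the 86-asteroid data set: each object
# carries one of the major taxonomic classes, with realistic imbalance.
classes = ["C"] * 30 + ["S"] * 30 + ["D"] * 10 + ["A"] * 8 + ["G"] * 8

def well_represented(sample, minimum=3):
    """Does the subsample keep >= `minimum` members of every class --
    the condition the authors required for a good representation?"""
    counts = Counter(sample)
    return all(counts[c] >= minimum for c in set(classes))

trials = 10000
hits = sum(well_represented(random.sample(classes, 22)) for _ in range(trials))
print(round(hits / trials, 3))
```

    The low hit rate shows why purely random subsamples often under-represent the minor classes, matching the authors' observation that randomly chosen samples left some groups with fewer than three members.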

  7. The software peculiarities of pattern recognition in track detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Starkov, N.

    Different kinds of nuclear track recognition algorithms are presented. Several complex examples of their use in physical experiments are considered, and some methods for processing complicated images are described.

  8. Combined sample collection and gas extraction for the measurement of helium isotopes and neon in natural waters

    NASA Astrophysics Data System (ADS)

    Roether, Wolfgang; Vogt, Martin; Vogel, Sandra; Sültenfuß, Jürgen

    2013-06-01

    We present a new method to obtain samples for the measurement of helium isotopes and neon in water, to replace the classical sampling procedure using clamped-off Cu tubing containers that we have been using so far. The new method saves the gas extraction step prior to admission to the mass spectrometer, which the classical method requires. Water is drawn into evacuated glass ampoules with subsequent flame sealing. Approximately 50% headspace is left, from which admission into the mass spectrometer occurs without further treatment. Extensive testing has shown that, with due care and with small corrections applied, the samples represent the gas concentrations in the water within ±0.07% (95% confidence level; ±0.05% with special handling). Fast evacuation is achieved by pumping on a small charge of water placed in the ampoule. The new method was successfully tested at sea in comparison with Cu-tubing sampling. We found that the ampoule samples were superior in data precision and that a lower percentage of samples were lost prior to measurement. Further measurements revealed agreement between the two methods in helium, 3He and neon within ±0.1%. The new method facilitates dealing with large sample sets and minimizes the delay between sampling and measurement. The method is also applicable to gases other than helium and neon.

  9. Sample preparation for thermo-gravimetric determination and thermo-gravimetric characterization of refuse derived fuel.

    PubMed

    Robinson, T; Bronson, B; Gogolek, P; Mehrani, P

    2016-02-01

    Thermo-gravimetric analysis (TGA) is a useful method for characterizing fuels. In the past it has been applied to the study of refuse derived fuel (RDF) and related materials. However, the heterogeneity of RDF makes the preparation of small representative samples very difficult, and this difficulty has limited the effectiveness of TGA for characterization of RDF. A TGA method was applied to a variety of materials prepared from a commercially available RDF using a variety of procedures. The applicability of the TGA method to determining the renewable content of RDF was considered. Cryogenic ball milling was found to be an effective means of preparing RDF samples for TGA. When combined with an effective sample preparation, TGA could be used as an alternative method for assessing the renewable content of RDF. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.

  10. Towards robust and repeatable sampling methods in eDNA based studies.

    PubMed

    Dickie, Ian A; Boyer, Stephane; Buckley, Hannah; Duncan, Richard P; Gardner, Paul; Hogg, Ian D; Holdaway, Robert J; Lear, Gavin; Makiola, Andreas; Morales, Sergio E; Powell, Jeff R; Weaver, Louise

    2018-05-26

    DNA based techniques are increasingly used for measuring the biodiversity (species presence, identity, abundance and community composition) of terrestrial and aquatic ecosystems. While there are numerous reviews of molecular methods and bioinformatic steps, there has been little consideration of the methods used to collect samples upon which these later steps are based. This represents a critical knowledge gap, as methodologically sound field sampling is the foundation for subsequent analyses. We reviewed field sampling methods used for metabarcoding studies of both terrestrial and freshwater ecosystem biodiversity over a nearly three-year period (n = 75). We found that 95% (n = 71) of these studies used subjective sampling methods, inappropriate field methods, and/or failed to provide critical methodological information. It would be possible for researchers to replicate only 5% of the metabarcoding studies in our sample, a poorer level of reproducibility than for ecological studies in general. Our findings suggest greater attention to field sampling methods and reporting is necessary in eDNA-based studies of biodiversity to ensure robust outcomes and future reproducibility. Methods must be fully and accurately reported, and protocols developed that minimise subjectivity. Standardisation of sampling protocols would be one way to help to improve reproducibility, and have additional benefits in allowing compilation and comparison of data from across studies. This article is protected by copyright. All rights reserved.

  11. Evaluation of common methods for sampling invertebrate pollinator assemblages: net sampling out-perform pan traps.

    PubMed

    Popic, Tony J; Davila, Yvonne C; Wardle, Glenda M

    2013-01-01

    Methods for sampling ecological assemblages strive to be efficient, repeatable, and representative. Unknowingly, common methods may be limited in terms of revealing species function and so of less value for comparative studies. The global decline in pollination services has stimulated surveys of flower-visiting invertebrates, using pan traps and net sampling. We explore the relative merits of these two methods in terms of species discovery, quantifying abundance, function, and composition, and responses of species to changing floral resources. Using a spatially-nested design we sampled across a 5000 km(2) area of arid grasslands, including 432 hours of net sampling and 1296 pan trap-days, between June 2010 and July 2011. Net sampling yielded 22% more species and 30% higher abundance than pan traps, and better reflected the spatio-temporal variation of floral resources. Species composition differed significantly between methods; from 436 total species, 25% were sampled by both methods, 50% only by nets, and the remaining 25% only by pans. Apart from being less comprehensive, if pan traps do not sample flower-visitors, the link to pollination is questionable. By contrast, net sampling functionally linked species to pollination through behavioural observations of flower-visitation interaction frequency. Netted specimens are also necessary for evidence of pollen transport. Benefits of net-based sampling outweighed minor differences in overall sampling effort. As pan traps and net sampling methods are not equivalent for sampling invertebrate-flower interactions, we recommend net sampling of invertebrate pollinator assemblages, especially if datasets are intended to document declines in pollination and guide measures to retain this important ecosystem service.

  12. Evaluation of Common Methods for Sampling Invertebrate Pollinator Assemblages: Net Sampling Out-Perform Pan Traps

    PubMed Central

    Popic, Tony J.; Davila, Yvonne C.; Wardle, Glenda M.

    2013-01-01

    Methods for sampling ecological assemblages strive to be efficient, repeatable, and representative. Unknowingly, common methods may be limited in terms of revealing species function and so of less value for comparative studies. The global decline in pollination services has stimulated surveys of flower-visiting invertebrates, using pan traps and net sampling. We explore the relative merits of these two methods in terms of species discovery, quantifying abundance, function, and composition, and responses of species to changing floral resources. Using a spatially-nested design we sampled across a 5000 km2 area of arid grasslands, including 432 hours of net sampling and 1296 pan trap-days, between June 2010 and July 2011. Net sampling yielded 22% more species and 30% higher abundance than pan traps, and better reflected the spatio-temporal variation of floral resources. Species composition differed significantly between methods; from 436 total species, 25% were sampled by both methods, 50% only by nets, and the remaining 25% only by pans. Apart from being less comprehensive, if pan traps do not sample flower-visitors, the link to pollination is questionable. By contrast, net sampling functionally linked species to pollination through behavioural observations of flower-visitation interaction frequency. Netted specimens are also necessary for evidence of pollen transport. Benefits of net-based sampling outweighed minor differences in overall sampling effort. As pan traps and net sampling methods are not equivalent for sampling invertebrate-flower interactions, we recommend net sampling of invertebrate pollinator assemblages, especially if datasets are intended to document declines in pollination and guide measures to retain this important ecosystem service. PMID:23799127

  13. Resilience to Adult Psychopathology Following Childhood Maltreatment: Evidence from a Community Sample

    ERIC Educational Resources Information Center

    Collishaw, Stephan; Pickles, Andrew; Messer, Julie; Rutter, Michael; Shearer, Christina; Maughan, Barbara

    2007-01-01

    Objective: Child abuse is an important risk for adult psychiatric morbidity. However, not all maltreated children experience mental health problems as adults. The aims of the present study were to address the extent of resilience to adult psychopathology in a representative community sample, and to explore predictors of a good prognosis. Methods:…

  14. One Sample, One Shot - Evaluation of sample preparation protocols for the mass spectrometric proteome analysis of human bile fluid without extensive fractionation.

    PubMed

    Megger, Dominik A; Padden, Juliet; Rosowski, Kristin; Uszkoreit, Julian; Bracht, Thilo; Eisenacher, Martin; Gerges, Christian; Neuhaus, Horst; Schumacher, Brigitte; Schlaak, Jörg F; Sitek, Barbara

    2017-02-10

    The proteome analysis of bile fluid represents a promising strategy to identify biomarker candidates for various diseases of the hepatobiliary system. However, to obtain substantive results in biomarker discovery studies, large patient cohorts need to be analyzed. Consequently, this would lead to an unmanageable number of samples if sample preparation protocols with extensive fractionation methods were applied. Hence, the performance of simple workflows allowing for "one sample, one shot" experiments has been evaluated in this study. In detail, sixteen different protocols involving modifications at the stages of desalting, delipidation, deglycosylation and tryptic digestion have been examined. Each method has been evaluated individually against various performance criteria, and comparative analyses have been conducted to uncover possible complementarities. Here, the best performance in terms of proteome coverage was achieved by a combination of acetone precipitation and in-gel digestion. Finally, a mapping of all obtained protein identifications against putative biomarkers for hepatocellular carcinoma (HCC) and cholangiocellular carcinoma (CCC) revealed several proteins that are easily detectable in bile fluid. These results can form the basis for future studies with large and well-defined patient cohorts in a more disease-related context. Human bile fluid is a proximal body fluid and is considered a potential source of disease markers. However, due to its biochemical composition, the proteome analysis of bile fluid remains a challenging task and is therefore mostly conducted using extensive fractionation procedures, which in turn leads to a high number of mass spectrometric measurements per biological sample. Considering that many biological samples must be analyzed in biomarker discovery studies to overcome biological variability, this creates the dilemma of an unmanageable number of necessary MS-based analyses. Hence, simple sample preparation protocols are needed that represent a compromise between proteome coverage and simplicity. In the present study, such protocols have been evaluated with regard to various technical criteria (e.g. identification rates, missed cleavages, chromatographic separation), uncovering the strengths and weaknesses of the various methods. Furthermore, a cumulative bile proteome list has been generated that extends the current bile proteome catalog by 248 proteins. Finally, a mapping against putative biomarkers for hepatocellular carcinoma (HCC) and cholangiocellular carcinoma (CCC) derived from tissue-based studies revealed that several of these proteins are easily and reproducibly detectable in human bile. Therefore, the presented technical work represents a solid base for future disease-related studies. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Fast Ordered Sampling of DNA Sequence Variants.

    PubMed

    Greenberg, Anthony J

    2018-05-04

    Explosive growth in the amount of genomic data is matched by increasing power of consumer-grade computers. Even applications that require powerful servers can be quickly tested on desktop or laptop machines if we can generate representative samples from large data sets. I describe a fast and memory-efficient implementation of an on-line sampling method developed for tape drives 30 years ago. Focusing on genotype files, I test the performance of this technique on modern solid-state and spinning hard drives, and show that it performs well compared to a simple sampling scheme. I illustrate its utility by developing a method to quickly estimate genome-wide patterns of linkage disequilibrium (LD) decay with distance. I provide open-source software that samples loci from several variant format files, a separate program that performs LD decay estimates, and a C++ library that lets developers incorporate these methods into their own projects. Copyright © 2018 Greenberg.
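    The on-line method referenced above can be sketched in a few lines. The following is a minimal illustration of one classic one-pass ordered sampling scheme (Knuth's selection sampling, Algorithm S), not the author's implementation; it assumes the total number of records is known in advance, as it is for a fixed-size genotype file.

```python
import random

def ordered_sample(stream, n_total, n_sample, rng=random):
    """One-pass ordered sampling without replacement (Knuth's Algorithm S).

    Each record is kept with probability (still needed) / (still remaining),
    which yields a uniformly random subset of size n_sample that preserves
    the original record order.
    """
    chosen = []
    needed = n_sample
    for t, record in enumerate(stream):
        remaining = n_total - t
        # Accept this record with probability needed / remaining
        if rng.random() * remaining < needed:
            chosen.append(record)
            needed -= 1
            if needed == 0:
                break
    return chosen
```

    Because records are accepted or rejected in stream order, the output preserves the input ordering, which is what makes the approach attractive for sequential media such as tape drives and for large variant files read front to back.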

  16. 40 CFR 98.264 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...-process phosphoric acid process line. You can use existing plant procedures that are used for accounting... the process line. Conduct the representative bulk sampling using the applicable standard method in the...

  17. Spline methods for approximating quantile functions and generating random samples

    NASA Technical Reports Server (NTRS)

    Schiess, J. R.; Matthews, C. G.

    1985-01-01

    Two cubic spline formulations are presented for representing the quantile function (inverse cumulative distribution function) of a random sample of data. Both B-spline and rational spline approximations are compared with analytic representations of the quantile function. It is also shown how these representations can be used to generate random samples for use in simulation studies. Comparisons are made on samples generated from known distributions and a sample of experimental data. The spline representations are more accurate for multimodal and skewed samples and require much less time to generate samples than the analytic representation.
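    As a sketch of the sampling idea, the snippet below inverts an interpolated empirical quantile function to generate new draws (inverse-transform sampling). It uses linear interpolation via `np.interp` as a stand-in for the paper's cubic splines, and the plotting positions are a common convention, not taken from the paper.

```python
import numpy as np

def empirical_quantile_sampler(data, n_draws, rng):
    """Generate new samples by inverting an interpolated quantile function.

    The sorted data define an empirical quantile function Q(p); drawing
    u ~ Uniform(0, 1) and returning Q(u) is inverse-transform sampling.
    """
    data = np.sort(np.asarray(data, dtype=float))
    probs = (np.arange(len(data)) + 0.5) / len(data)  # plotting positions
    u = rng.uniform(0.0, 1.0, size=n_draws)
    return np.interp(u, probs, data)  # Q(u), linearly interpolated
```

    A cubic spline fitted through `(probs, data)` would smooth the quantile function further, which is where the B-spline and rational-spline formulations of the paper come in.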

  18. Gas sampling system for reactive gas-solid mixtures

    DOEpatents

    Daum, Edward D.; Downs, William; Jankura, Bryan J.; McCoury, Jr., John M.

    1989-01-01

    An apparatus and method for sampling a gas containing a reactive particulate solid phase flowing through a duct and for communicating a representative sample to a gas analyzer. A sample probe sheath 32 with an angular opening 34 extends vertically into a sample gas duct 30. The angular opening 34 is opposite the gas flow. A gas sampling probe 36, concentrically located within sheath 32 along with calibration probe 40, partly extends into the sheath 32. Calibration probe 40 extends further into the sheath 32 than gas sampling probe 36 for purging the probe sheath area with a calibration gas during calibration.

  19. Gas sampling system for reactive gas-solid mixtures

    DOEpatents

    Daum, Edward D.; Downs, William; Jankura, Bryan J.; McCoury, Jr., John M.

    1990-01-01

    An apparatus and method for sampling gas containing a reactive particulate solid phase flowing through a duct and for communicating a representative sample to a gas analyzer. A sample probe sheath 32 with an angular opening 34 extends vertically into a sample gas duct 30. The angular opening 34 is opposite the gas flow. A gas sampling probe 36, concentrically located within sheath 32 along with calibration probe 40, partly extends into the sheath 32. Calibration probe 40 extends further into the sheath 32 than gas sampling probe 36 for purging the probe sheath area with a calibration gas during calibration.

  20. The National Comorbidity Survey Adolescent Supplement (NCS-A): II. Overview and Design

    PubMed Central

    Kessler, Ronald C.; Avenevoli, Shelli; Costello, E. Jane; Green, Jennifer Greif; Gruber, Michael J.; Heeringa, Steven; Merikangas, Kathleen R.; Pennell, Beth-Ellen; Sampson, Nancy A.; Zaslavsky, Alan M.

    2009-01-01

    OBJECTIVE To present an overview of the design and field procedures of the National Comorbidity Survey Replication Adolescent Supplement (NCS-A) METHOD The NCS-A is a nationally representative face-to-face household survey of the prevalence and correlates of DSM-IV mental disorders among US adolescents (ages 13–17) that was carried out between February 2001 and January 2004 by the Survey Research Center of the Institute for Social Research at the University of Michigan. The sample was based on a dual-frame design that included 904 adolescent residents of the households that participated in the National Comorbidity Survey Replication (85.9% response rate) and 9244 adolescent students selected from a representative sample of 320 schools in the same nationally representative sample of counties as the NCS-R (74.7% response rate). RESULTS Comparisons of sample and population distributions on Census socio-demographic variables and, in the school sample, school characteristics documented only minor differences that were corrected with post-stratification weighting. Comparisons of DSM-IV disorder prevalence estimates among household vs. school sample respondents in counties that differed in the use of replacement schools for originally selected schools that refused to participate showed that the use of replacement schools did not introduce bias into prevalence estimates. CONCLUSIONS The NCS-A is a rich nationally representative dataset that will substantially increase understanding of the mental health and well-being of adolescents in the United States. PMID:19242381
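    The post-stratification weighting used to correct the minor sample/population differences can be illustrated with a small sketch; the cell labels and shares below are made up for illustration, not NCS-A values.

```python
from collections import Counter

def poststratify(sample_cells, population_share):
    """Post-stratification: weight each respondent so weighted sample
    shares match known population shares for each cell (e.g. age x sex).

    population_share maps cell -> population proportion; the weight for
    a cell is (population share) / (sample share).
    """
    n = len(sample_cells)
    sample_share = {c: k / n for c, k in Counter(sample_cells).items()}
    return [population_share[c] / sample_share[c] for c in sample_cells]
```

    After weighting, any cell over-represented in the sample counts for proportionally less, and vice versa, so weighted totals reproduce the Census distribution.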

  1. Measuring solids concentration in stormwater runoff: comparison of analytical methods.

    PubMed

    Clark, Shirley E; Siu, Christina Y S

    2008-01-15

    Stormwater suspended solids typically are quantified using one of two methods: aliquot/subsample analysis (total suspended solids [TSS]) or whole-sample analysis (suspended solids concentration [SSC]). Interproject comparisons are difficult because of inconsistencies in the methods and in their application. To address this concern, the suspended solids content has been measured using both methodologies in many current projects, but the question remains about how to compare these values with historical water-quality data where the analytical methodology is unknown. This research was undertaken to determine the effect of analytical methodology on the relationship between these two methods of determination of the suspended solids concentration, including the effect of aliquot selection/collection method and of particle size distribution (PSD). The results showed that SSC was best able to represent the known sample concentration and that the results were independent of the sample's PSD. Correlations between the results and the known sample concentration could be established for TSS samples, but they were highly dependent on the sample's PSD and on the aliquot collection technique. These results emphasize the need to report not only the analytical method but also the particle size information on the solids in stormwater runoff.

  2. Modeling and enhanced sampling of molecular systems with smooth and nonlinear data-driven collective variables

    NASA Astrophysics Data System (ADS)

    Hashemian, Behrooz; Millán, Daniel; Arroyo, Marino

    2013-12-01

    Collective variables (CVs) are low-dimensional representations of the state of a complex system, which help us rationalize molecular conformations and sample free energy landscapes with molecular dynamics simulations. Given their importance, there is need for systematic methods that effectively identify CVs for complex systems. In recent years, nonlinear manifold learning has shown its ability to automatically characterize molecular collective behavior. Unfortunately, these methods fail to provide a differentiable function mapping high-dimensional configurations to their low-dimensional representation, as required in enhanced sampling methods. We introduce a methodology that, starting from an ensemble representative of molecular flexibility, builds smooth and nonlinear data-driven collective variables (SandCV) from the output of nonlinear manifold learning algorithms. We demonstrate the method with a standard benchmark molecule, alanine dipeptide, and show how it can be non-intrusively combined with off-the-shelf enhanced sampling methods, here the adaptive biasing force method. We illustrate how enhanced sampling simulations with SandCV can explore regions that were poorly sampled in the original molecular ensemble. We further explore the transferability of SandCV from a simpler system, alanine dipeptide in vacuum, to a more complex system, alanine dipeptide in explicit water.

  3. Modeling and enhanced sampling of molecular systems with smooth and nonlinear data-driven collective variables.

    PubMed

    Hashemian, Behrooz; Millán, Daniel; Arroyo, Marino

    2013-12-07

    Collective variables (CVs) are low-dimensional representations of the state of a complex system, which help us rationalize molecular conformations and sample free energy landscapes with molecular dynamics simulations. Given their importance, there is need for systematic methods that effectively identify CVs for complex systems. In recent years, nonlinear manifold learning has shown its ability to automatically characterize molecular collective behavior. Unfortunately, these methods fail to provide a differentiable function mapping high-dimensional configurations to their low-dimensional representation, as required in enhanced sampling methods. We introduce a methodology that, starting from an ensemble representative of molecular flexibility, builds smooth and nonlinear data-driven collective variables (SandCV) from the output of nonlinear manifold learning algorithms. We demonstrate the method with a standard benchmark molecule, alanine dipeptide, and show how it can be non-intrusively combined with off-the-shelf enhanced sampling methods, here the adaptive biasing force method. We illustrate how enhanced sampling simulations with SandCV can explore regions that were poorly sampled in the original molecular ensemble. We further explore the transferability of SandCV from a simpler system, alanine dipeptide in vacuum, to a more complex system, alanine dipeptide in explicit water.

  4. Evaluation of Sampling Methods for Bacillus Spore ...

    EPA Pesticide Factsheets

    Following a wide area release of biological materials, mapping the extent of contamination is essential for orderly response and decontamination operations. HVAC filters process large volumes of air and therefore collect highly representative particulate samples in buildings. HVAC filter extraction may have great utility in rapidly estimating the extent of building contamination following a large-scale incident. However, until now, no studies have been conducted comparing the two most appropriate sampling approaches for HVAC filter materials: direct extraction and vacuum-based sampling.

  5. Antenna pattern interpolation by generalized Whittaker reconstruction

    NASA Astrophysics Data System (ADS)

    Tjonneland, K.; Lindley, A.; Balling, P.

    Whittaker reconstruction is an effective tool for interpolation of band limited data. Whittaker originally introduced the interpolation formula termed the cardinal function as the function that represents a set of equispaced samples but has no periodic components of period less than twice the sample spacing. It appears that its use for reflector antennas was pioneered in France. The method is now a useful tool in the analysis and design of multiple beam reflector antenna systems. A good description of the method has been given by Bucci et al. This paper discusses some problems encountered with the method and their solution.
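    The cardinal-function idea can be shown in a minimal sketch: reconstructing a band-limited function from equispaced samples by sinc interpolation. The function name and the truncation of the infinite Whittaker series to the available samples are simplifications of mine, not details from the paper.

```python
import numpy as np

def whittaker_interpolate(samples, dx, x_new):
    """Whittaker (sinc) reconstruction from equispaced samples:
    f(x) = sum_k f_k * sinc((x - k*dx) / dx).

    np.sinc is the normalized sinc, sin(pi t) / (pi t), so sinc(0) = 1
    and sinc(n) = 0 for nonzero integers n: the series reproduces the
    samples exactly at the sample points.
    """
    k = np.arange(len(samples))
    # One row per evaluation point, one column per sample
    t = (np.asarray(x_new, dtype=float)[:, None] - k[None, :] * dx) / dx
    return np.sinc(t) @ np.asarray(samples, dtype=float)
```

    Truncating the infinite series to a finite set of samples introduces edge error in practice, which is one class of problem that refinements of the basic method address.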

  6. Recruitment of representative samples for low incidence cancer populations: Do registries deliver?

    PubMed Central

    2011-01-01

    Background Recruiting large and representative samples of adolescent and young adult (AYA) cancer survivors is important for gaining accurate data regarding the prevalence of unmet needs in this population. This study aimed to describe recruitment rates for AYAs recruited through a cancer registry with particular focus on: active clinician consent protocols, reasons for clinicians not providing consent and the representativeness of the final sample. Methods Adolescents and young adults aged 14 to 19 years inclusive and listed on the cancer registry from January 1, 2002 to December 31, 2007 were identified. An active clinician consent protocol was used whereby the registry sent a letter to AYAs primary treating clinicians requesting permission to contact the survivors. The registry then sent survivors who received their clinician's consent a letter seeking permission to forward their contact details to the research team. Consenting AYAs were sent a questionnaire which assessed their unmet needs. Results The overall consent rate for AYAs identified as eligible by the registry was 7.8%. Of the 411 potentially eligible survivors identified, just over half (n = 232, 56%) received their clinician's consent to be contacted. Of those 232 AYAs, 65% were unable to be contacted. Only 18 AYAs (7.8%) refused permission for their contact details to be passed on to the research team. Of the 64 young people who agreed to be contacted, 50% (n = 32) completed the questionnaire. Conclusions Cancer registries which employ active clinician consent protocols may not be appropriate for recruiting large, representative samples of AYAs diagnosed with cancer. Given that AYA cancer survivors are highly mobile, alternative methods such as treatment centre and clinic based recruitment may need to be considered. PMID:21235819

  7. Validation sampling can reduce bias in healthcare database studies: an illustration using influenza vaccination effectiveness

    PubMed Central

    Nelson, Jennifer C.; Marsh, Tracey; Lumley, Thomas; Larson, Eric B.; Jackson, Lisa A.; Jackson, Michael

    2014-01-01

    Objective Estimates of treatment effectiveness in epidemiologic studies using large observational health care databases may be biased due to inaccurate or incomplete information on important confounders. Study methods that collect and incorporate more comprehensive confounder data on a validation cohort may reduce confounding bias. Study Design and Setting We applied two such methods, imputation and reweighting, to Group Health administrative data (full sample) supplemented by more detailed confounder data from the Adult Changes in Thought study (validation sample). We used influenza vaccination effectiveness (with an unexposed comparator group) as an example and evaluated each method’s ability to reduce bias using the control time period prior to influenza circulation. Results Both methods reduced, but did not completely eliminate, the bias compared with traditional effectiveness estimates that do not utilize the validation sample confounders. Conclusion Although these results support the use of validation sampling methods to improve the accuracy of comparative effectiveness findings from healthcare database studies, they also illustrate that the success of such methods depends on many factors, including the ability to measure important confounders in a representative and large enough validation sample, the comparability of the full sample and validation sample, and the accuracy with which data can be imputed or reweighted using the additional validation sample information. PMID:23849144
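    A simple form of the reweighting idea can be sketched as follows, assuming stratified selection into the validation cohort; the strata and counts below are illustrative, not Group Health data.

```python
from collections import Counter

def validation_weights(full_strata, validation_strata):
    """Reweight a validation subsample to represent the full cohort.

    Each validation member gets weight N_stratum(full) / n_stratum(validation),
    the inverse of its stratum's sampling fraction, so weighted validation
    totals match the full-sample stratum counts. Analyses using the richer
    validation-only confounders can then be run on the weighted subsample.
    """
    full = Counter(full_strata)
    val = Counter(validation_strata)
    return [full[s] / val[s] for s in validation_strata]
```

    The success of such reweighting depends on the validation sample covering every stratum of the full sample, echoing the paper's caveat about comparability and size of the validation cohort.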

  8. A Bayesian nonparametric method for prediction in EST analysis

    PubMed Central

    Lijoi, Antonio; Mena, Ramsés H; Prünster, Igor

    2007-01-01

    Background Expressed sequence tags (ESTs) analyses are a fundamental tool for gene identification in organisms. Given a preliminary EST sample from a certain library, several statistical prediction problems arise. In particular, it is of interest to estimate how many new genes can be detected in a future EST sample of given size and also to determine the gene discovery rate: these estimates represent the basis for deciding whether to proceed sequencing the library and, in case of a positive decision, a guideline for selecting the size of the new sample. Such information is also useful for establishing sequencing efficiency in experimental design and for measuring the degree of redundancy of an EST library. Results In this work we propose a Bayesian nonparametric approach for tackling statistical problems related to EST surveys. In particular, we provide estimates for: a) the coverage, defined as the proportion of unique genes in the library represented in the given sample of reads; b) the number of new unique genes to be observed in a future sample; c) the discovery rate of new genes as a function of the future sample size. The Bayesian nonparametric model we adopt conveys, in a statistically rigorous way, the available information into prediction. Our proposal has appealing properties over frequentist nonparametric methods, which become unstable when prediction is required for large future samples. EST libraries, previously studied with frequentist methods, are analyzed in detail. Conclusion The Bayesian nonparametric approach we undertake yields valuable tools for gene capture and prediction in EST libraries. The estimators we obtain do not feature the kind of drawbacks associated with frequentist estimators and are reliable for any size of the additional sample. PMID:17868445

  9. Fast Physically Accurate Rendering of Multimodal Signatures of Distributed Fracture in Heterogeneous Materials.

    PubMed

    Visell, Yon

    2015-04-01

    This paper proposes a fast, physically accurate method for synthesizing multimodal, acoustic and haptic, signatures of distributed fracture in quasi-brittle heterogeneous materials, such as wood, granular media, or other fiber composites. Fracture processes in these materials are challenging to simulate with existing methods, due to the prevalence of large numbers of disordered, quasi-random spatial degrees of freedom, representing the complex physical state of a sample over the geometric volume of interest. Here, I develop an algorithm for simulating such processes, building on a class of statistical lattice models of fracture that have been widely investigated in the physics literature. This algorithm is enabled through a recently published mathematical construction based on the inverse transform method of random number sampling. It yields a purely time domain stochastic jump process representing stress fluctuations in the medium. The latter can be readily extended by a mean field approximation that captures the averaged constitutive (stress-strain) behavior of the material. Numerical simulations and interactive examples demonstrate the ability of these algorithms to generate physically plausible acoustic and haptic signatures of fracture in complex, natural materials interactively at audio sampling rates.
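    The inverse transform method named above can be illustrated generically. The sketch below draws jump magnitudes from a truncated power law, a distribution commonly associated with avalanche-like fracture statistics; the exponent, bounds, and function name are illustrative and not taken from the paper.

```python
import numpy as np

def powerlaw_jumps(n, alpha, x_min, x_max, rng):
    """Inverse-transform sampling of jump magnitudes from a truncated
    power law p(x) ~ x^(-alpha) on [x_min, x_max], for alpha != 1.

    The CDF F(x) = (x^(1-a) - x_min^(1-a)) / (x_max^(1-a) - x_min^(1-a))
    is inverted in closed form: x = F^{-1}(u) for u ~ Uniform(0, 1).
    """
    a = 1.0 - alpha
    u = rng.uniform(0.0, 1.0, size=n)
    return (x_min**a + u * (x_max**a - x_min**a)) ** (1.0 / a)
```

    Because each draw is a single closed-form evaluation, streams of stress-fluctuation events can be generated at audio sampling rates, which is the property the rendering application relies on.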

  10. Sampling studies to estimate the HIV prevalence rate in female commercial sex workers.

    PubMed

    Pascom, Ana Roberta Pati; Szwarcwald, Célia Landmann; Barbosa Júnior, Aristides

    2010-01-01

    We investigated sampling methods being used to estimate the HIV prevalence rate among female commercial sex workers. The studies were classified according to the adequacy or not of the sample size to estimate HIV prevalence rate and according to the sampling method (probabilistic or convenience). We identified 75 studies that estimated the HIV prevalence rate among female sex workers. Most of the studies employed convenience samples. The sample size was not adequate to estimate HIV prevalence rate in 35 studies. The use of convenience sample limits statistical inference for the whole group. It was observed that there was an increase in the number of published studies since 2005, as well as in the number of studies that used probabilistic samples. This represents a large advance in the monitoring of risk behavior practices and HIV prevalence rate in this group.

  11. Development and validation of a 48-target analytical method for high-throughput monitoring of genetically modified organisms.

    PubMed

    Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang

    2015-01-05

    The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic array-based high-throughput method to simultaneously detect 48 targets in 48 samples on a Fluidigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for 48 pooled targets was added to enrich the amount of template before performing dynamic chip assays. This dynamic chip-based method allowed the synchronous high-throughput detection of multiple targets in multiple samples. Thus, it represents an efficient, qualitative method for GMO multi-detection.

  12. Development and Validation of A 48-Target Analytical Method for High-throughput Monitoring of Genetically Modified Organisms

    PubMed Central

    Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang

    2015-01-01

    The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic array-based high-throughput method to simultaneously detect 48 targets in 48 samples on a Fluidigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for 48 pooled targets was added to enrich the amount of template before performing dynamic chip assays. This dynamic chip-based method allowed the synchronous high-throughput detection of multiple targets in multiple samples. Thus, it represents an efficient, qualitative method for GMO multi-detection. PMID:25556930

  13. Calibration of collection procedures for the determination of precipitation chemistry

    Treesearch

    James N. Galloway; Gene E. Likens

    1976-01-01

    Precipitation is currently collected by several methods, including several different designs of collection apparatus. We are investigating these differing methods and designs to determine which gives the most representative sample of precipitation for the analysis of some 25 chemical parameters. The experimental site, located in Ithaca, New York, has 22 collectors of...

  14. Rapid habitability assessment of Mars samples by pyrolysis-FTIR

    NASA Astrophysics Data System (ADS)

    Gordon, Peter R.; Sephton, Mark A.

    2016-02-01

    Pyrolysis Fourier transform infrared spectroscopy (pyrolysis FTIR) is a potential sample selection method for Mars Sample Return missions. FTIR spectroscopy can be performed on solid and liquid samples but also on gases following preliminary thermal extraction, pyrolysis or gasification steps. The detection of hydrocarbon and non-hydrocarbon gases can reveal information on sample mineralogy and past habitability of the environment in which the sample was created. The absorption of IR radiation at specific wavenumbers by organic functional groups can indicate the presence and type of any organic matter present. Here we assess the utility of pyrolysis-FTIR to release water, carbon dioxide, sulfur dioxide and organic matter from Mars relevant materials to enable a rapid habitability assessment of target rocks for sample return. For our assessment a range of minerals were analyzed by attenuated total reflectance FTIR. Subsequently, the mineral samples were subjected to single step pyrolysis and multi step pyrolysis and the products characterised by gas phase FTIR. Data from both single step and multi step pyrolysis-FTIR provide the ability to identify minerals that reflect habitable environments through their water and carbon dioxide responses. Multi step pyrolysis-FTIR can be used to gain more detailed information on the sources of the liberated water and carbon dioxide owing to the characteristic decomposition temperatures of different mineral phases. Habitation can be suggested when pyrolysis-FTIR indicates the presence of organic matter within the sample. Pyrolysis-FTIR, therefore, represents an effective method to assess whether Mars Sample Return target rocks represent habitable conditions and potential records of habitation and can play an important role in sample triage operations.

  15. Development of quantitative screen for 1550 chemicals with GC-MS.

    PubMed

    Bergmann, Alan J; Points, Gary L; Scott, Richard P; Wilson, Glenn; Anderson, Kim A

    2018-05-01

    With hundreds of thousands of chemicals in the environment, effective monitoring requires high-throughput analytical techniques. This paper presents a quantitative screening method for 1550 chemicals based on statistical modeling of responses with identification and integration performed using deconvolution reporting software. The method was evaluated with representative environmental samples. We tested biological extracts, low-density polyethylene, and silicone passive sampling devices spiked with known concentrations of 196 representative chemicals. A multiple linear regression (R² = 0.80) was developed with molecular weight, logP, polar surface area, and fractional ion abundance to predict chemical responses within a factor of 2.5. Linearity beyond the calibration had R² > 0.97 for three orders of magnitude. Median limits of quantitation were estimated to be 201 pg/μL (1.9× standard deviation). The number of detected chemicals and the accuracy of quantitation were similar for environmental samples and standard solutions. To our knowledge, this is the most precise method for the largest number of semi-volatile organic chemicals lacking authentic standards. Accessible instrumentation and software make this method cost effective in quantifying a large, customizable list of chemicals. When paired with silicone wristband passive samplers, this quantitative screen will be very useful for epidemiology where binning of concentrations is common. Graphical abstract A multiple linear regression of chemical responses measured with GC-MS allowed quantitation of 1550 chemicals in samples such as silicone wristbands.
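    The modeling step can be sketched with ordinary least squares on four descriptors. Everything below (function names, synthetic data) is illustrative; the published regression coefficients are not reproduced here.

```python
import numpy as np

def fit_response_model(X, y):
    """Ordinary least squares fit of instrument responses on chemical
    descriptors (e.g. molecular weight, logP, polar surface area,
    fractional ion abundance), mirroring the paper's multiple linear
    regression. X has one row per calibrated chemical, one column per
    descriptor; returns coefficients with an intercept first.
    """
    A = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict_response(coef, x):
    """Predicted response for one chemical's descriptor vector x."""
    return coef[0] + np.dot(coef[1:], x)
```

    Once fitted on chemicals with authentic standards, the model predicts responses, and hence approximate concentrations, for the remaining chemicals on the target list that lack standards.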

  16. Formal verification of medical monitoring software using Z language: a representative sample.

    PubMed

    Babamir, Seyed Morteza; Borhani, Mehdi

    2012-08-01

    Medical monitoring systems are useful aids that assist physicians in keeping patients under constant surveillance; however, ensuring that the systems make sound decisions is a physician's concern. As a result, verification of the systems' behavior in monitoring patients is a matter of significance. In modern medical systems, patient monitoring is undertaken by software, so software verification of such systems has attracted attention. This verification can be achieved with formal languages, which have mathematical foundations. Among others, the Z language is a suitable formal language that has been used for the formal verification of systems. This study aims to present a constructive method to verify a representative sample of a medical system, in which the system is visually specified and formally verified against patient constraints stated in the Z language. Exploiting our past experience in formally modeling the Continuous Infusion Insulin Pump (CIIP), we take the CIIP system as a representative sample of medical systems in the present study. The system is responsible for monitoring a diabetic's blood sugar.

  17. Evaluation of the NCPDP Structured and Codified Sig Format for e-prescriptions

    PubMed Central

    Burkhart, Q; Bell, Douglas S

    2011-01-01

    Objective To evaluate the ability of the structure and code sets specified in the National Council for Prescription Drug Programs Structured and Codified Sig Format to represent ambulatory electronic prescriptions. Design We parsed the Sig strings from a sample of 20 161 de-identified ambulatory e-prescriptions into variables representing the fields of the Structured and Codified Sig Format. A stratified random sample of these representations was then reviewed by a group of experts. For codified Sig fields, we attempted to map the actual words used by prescribers to the equivalent terms in the designated terminology. Measurements Proportion of prescriptions that the Format could fully represent; proportion of terms used that could be mapped to the designated terminology. Results The fields defined in the Format could fully represent 95% of Sigs (95% CI 93% to 97%), but ambiguities were identified, particularly in representing multiple-step instructions. The terms used by prescribers could be codified for only 60% of dose delivery methods, 84% of dose forms, 82% of vehicles, 95% of routes, 70% of sites, 33% of administration timings, and 93% of indications. Limitations The findings are based on a retrospective sample of ambulatory prescriptions derived mostly from primary care physicians. Conclusion The fields defined in the Format could represent most of the patient instructions in a large prescription sample, but prior to its mandatory adoption, further work is needed to ensure that potential ambiguities are addressed and that a complete set of terms is available for the codified fields. PMID:21613642
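The proportion-with-confidence-interval results reported above (e.g., 95%, 95% CI 93% to 97%) can be reproduced in spirit with a standard binomial interval. The counts below are hypothetical, and the study's stratified design would call for survey-weighted variants rather than this simple Wilson interval.

```python
import math

# Wilson score interval for a binomial proportion, a common basis for
# "X% (95% CI lo to hi)" style reporting.
def wilson_ci(successes, n, z=1.96):
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return centre - half, centre + half

# Hypothetical: 475 of 500 reviewed Sigs fully representable by the Format.
lo, hi = wilson_ci(475, 500)
```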

  18. ProUCL version 4.1.00 Documentation Downloads

    EPA Pesticide Factsheets

    ProUCL version 4.1.00 is a comprehensive statistical software package equipped with the statistical methods and graphical tools needed to address many environmental sampling and statistical issues, as described in various guidance documents.
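As one concrete example of the kind of statistic such packages compute, here is a one-sided 95% upper confidence limit (UCL) of a mean under a Student's-t assumption. The concentrations and units are invented; ProUCL itself offers many more estimators for skewed or nondetect-laden data.

```python
import math
import statistics

# Invented concentration data (e.g., mg/kg) for a 10-sample data set.
conc = [1.2, 0.8, 2.5, 1.1, 0.9, 3.0, 1.7, 0.6, 2.2, 1.4]
n = len(conc)
mean = statistics.mean(conc)
se = statistics.stdev(conc) / math.sqrt(n)  # standard error of the mean
t95 = 1.833  # one-sided 95% Student's-t quantile for df = n - 1 = 9
ucl95 = mean + t95 * se  # 95% UCL of the mean
```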

  19. A two-stage cluster sampling method using gridded population data, a GIS, and Google Earth(TM) imagery in a population-based mortality survey in Iraq.

    PubMed

    Galway, Lp; Bell, Nathaniel; Sae, Al Shatari; Hagopian, Amy; Burnham, Gilbert; Flaxman, Abraham; Weiss, William M; Rajaratnam, Julie; Takaro, Tim K

    2012-04-27

    Mortality estimates can measure and monitor the impacts of conflict on a population, guide humanitarian efforts, and help to better understand the public health impacts of conflict. Vital statistics registration and surveillance systems are rarely functional in conflict settings, so mortality must instead be estimated with retrospective population-based surveys. We present a two-stage cluster sampling method for application in population-based mortality surveys. The sampling method utilizes gridded population data and a geographic information system (GIS) to select clusters in the first sampling stage, and Google Earth™ imagery and sampling grids to select households in the second sampling stage. The sampling method was implemented in a household mortality study in Iraq in 2011. Factors affecting feasibility and methodological quality are described. Sampling is a challenge in retrospective population-based mortality studies, and alternatives that improve on the conventional approaches are needed. The sampling strategy presented here was designed to generate a representative sample of the Iraqi population while reducing the potential for bias and considering the context-specific challenges of the study setting. This sampling strategy, or variations on it, is adaptable and should be considered and tested in other conflict settings.
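A common way to implement the first (cluster) stage from gridded population data is systematic sampling with probability proportional to size (PPS). The paper's exact selection rule may differ; the grid-cell counts below are invented for illustration.

```python
import random

def pps_systematic(pop_counts, n_clusters, seed=42):
    """Systematic PPS sampling over grid cells.

    Returns indices of selected cells; cells with more people are
    proportionally more likely to be chosen.
    """
    total = sum(pop_counts)
    interval = total / n_clusters
    rng = random.Random(seed)
    start = rng.uniform(0, interval)
    targets = [start + k * interval for k in range(n_clusters)]
    selected, cum, i = [], 0, 0
    for t in targets:  # targets are increasing, so one forward pass suffices
        while cum + pop_counts[i] <= t:
            cum += pop_counts[i]
            i += 1
        selected.append(i)
    return selected

cells = [1200, 300, 4500, 800, 2500, 150, 3300, 900]  # people per grid cell
chosen = pps_systematic(cells, n_clusters=3)
```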

  20. A two-stage cluster sampling method using gridded population data, a GIS, and Google EarthTM imagery in a population-based mortality survey in Iraq

    PubMed Central

    2012-01-01

    Background Mortality estimates can measure and monitor the impacts of conflict on a population, guide humanitarian efforts, and help to better understand the public health impacts of conflict. Vital statistics registration and surveillance systems are rarely functional in conflict settings, so mortality must instead be estimated with retrospective population-based surveys. Results We present a two-stage cluster sampling method for application in population-based mortality surveys. The sampling method utilizes gridded population data and a geographic information system (GIS) to select clusters in the first sampling stage, and Google Earth™ imagery and sampling grids to select households in the second sampling stage. The sampling method was implemented in a household mortality study in Iraq in 2011. Factors affecting feasibility and methodological quality are described. Conclusion Sampling is a challenge in retrospective population-based mortality studies, and alternatives that improve on the conventional approaches are needed. The sampling strategy presented here was designed to generate a representative sample of the Iraqi population while reducing the potential for bias and considering the context-specific challenges of the study setting. This sampling strategy, or variations on it, is adaptable and should be considered and tested in other conflict settings. PMID:22540266

  1. Comparing concentration methods: parasitrap® versus Kato-Katz for studying the prevalence of Helminths in Bengo province, Angola.

    PubMed

    Mirante, Clara; Clemente, Isabel; Zambu, Graciette; Alexandre, Catarina; Ganga, Teresa; Mayer, Carlos; Brito, Miguel

    2016-09-01

    Helminth intestinal parasitoses are responsible for high levels of child mortality and morbidity. Hence, the capacity to diagnose these parasitoses, and consequently to ensure due treatment, is of great importance. The main objective of this study was to compare two concentration methods, Parasitrap® and Kato-Katz, for the diagnosis of intestinal parasitoses in faecal samples. We collected a total of 610 stool samples from pre-school and school-age children and processed each sample with both concentration methods: the commercial Parasitrap® method and the Kato-Katz method. The results show helminth parasites in 32.8% or 32.3% of the samples collected, depending on whether the Parasitrap® or the Kato-Katz concentration method was applied. We detected a relatively high percentage of samples testing positive for two or more species of helminth parasites. We would highlight that, in searching for larvae, the Kato-Katz method does not prove as appropriate as the Parasitrap® method. Both techniques are easily applicable even in field working conditions and return mutually consistent results. This study concludes in favour of the need for deworming programs and greater public awareness among the rural populations of Angola.

  2. [The actual possibilities of robotic microscopy in analysis automation and laboratory telemedicine].

    PubMed

    Medovyĭ, V S; Piatnitskiĭ, A M; Sokolinskiĭ, B Z; Balugian, R Sh

    2012-10-01

    The article discusses the possibilities of the automated microscopy complexes manufactured by Cellavision and MEKOS for performing medical analyses of blood films and other biomaterials. Joint work between the complex and the physician, in a regimen of automatic loading, screening, sampling, and sorting of cell types with simple morphology, followed by visual sorting of the sub-sample with complex morphology, provides a significant increase in method sensitivity, a decrease in workload, and better working conditions for the physician. The information technologies involved, including virtual slides and laboratory telemedicine, make it possible to build representative samples of rare cell types and pathologies, advancing both automation methods and medical research targets.

  3. Biomarkers for liver fibrosis

    DOEpatents

    Jacobs, Jon M.; Burnum-Johnson, Kristin E.; Baker, Erin M.; Smith, Richard D.; Gritsenko, Marina A.; Orton, Daniel

    2017-05-16

    Methods and systems for diagnosing or prognosing liver fibrosis in a subject are provided. In some examples, such methods and systems can include detecting liver fibrosis-related molecules in a sample obtained from the subject, comparing expression of the molecules in the sample to controls representing expression values expected in a subject who does not have liver fibrosis or who has non-progressing fibrosis, and diagnosing or prognosing liver fibrosis in the subject when differential expression of the molecules between the sample and the controls is detected. Kits for the diagnosis or prognosis of liver fibrosis in a subject are also provided which include reagents for detecting liver fibrosis related molecules.

  4. Biomarkers for liver fibrosis

    DOEpatents

    Jacobs, Jon M.; Burnum-Johnson, Kristin E.; Baker, Erin M.; Smith, Richard D.; Gritsenko, Marina A.; Orton, Daniel

    2015-09-15

    Methods and systems for diagnosing or prognosing liver fibrosis in a subject are provided. In some examples, such methods and systems can include detecting liver fibrosis-related molecules in a sample obtained from the subject, comparing expression of the molecules in the sample to controls representing expression values expected in a subject who does not have liver fibrosis or who has non-progressing fibrosis, and diagnosing or prognosing liver fibrosis in the subject when differential expression of the molecules between the sample and the controls is detected. Kits for the diagnosis or prognosis of liver fibrosis in a subject are also provided which include reagents for detecting liver fibrosis related molecules.

  5. Solid-Phase Extraction (SPE): Principles and Applications in Food Samples.

    PubMed

    Ötles, Semih; Kartal, Canan

    2016-01-01

    Solid-Phase Extraction (SPE) is a sample preparation method practised in numerous application fields owing to its many advantages over traditional methods. SPE was invented as an alternative to liquid/liquid extraction and eliminated several of its disadvantages, such as the use of large amounts of solvent, extended operation times and procedure steps, potential sources of error, and high cost. Moreover, SPE can be combined with other analytical methods and sample preparation techniques as needed. The SPE technique is a versatile tool suited to many purposes: isolation, concentration, purification, and clean-up are the main aims in practice. Foods represent a complicated matrix and can occur in different physical states, such as solid, viscous, or liquid. The sample preparation step therefore plays a particularly important role in the determination of specific compounds in foods. SPE offers many opportunities, not only for the analysis of a large diversity of food samples but also for optimization and further advances. This review aims to provide a comprehensive overview of the basic principles of SPE and its applications for many analytes in the food matrix.

  6. Sampling for mercury at subnanogram per litre concentrations for load estimation in rivers

    USGS Publications Warehouse

    Colman, J.A.; Breault, R.F.

    2000-01-01

    Estimation of constituent loads in streams requires collection of stream samples that are representative of constituent concentrations, that is, composites of isokinetic multiple verticals collected along a stream transect. An all-Teflon isokinetic sampler (DH-81) cleaned in 75 °C, 4 N HCl was tested using blank, split, and replicate samples to assess systematic and random sample contamination by mercury species. Mean mercury concentrations in field-equipment blanks were low: 0.135 ng/L for total mercury (ΣHg) and 0.0086 ng/L for monomethyl mercury (MeHg). Mean square errors (MSE) for ΣHg and MeHg duplicate samples collected at eight sampling stations were not statistically different from the MSE of samples split in the laboratory, which represents the analytical and splitting error. Low field-blank concentrations and statistically equal duplicate- and split-sample MSE values indicate that no measurable contamination was occurring during sampling. Standard deviations associated with example mercury load estimations were four to five times larger, on a relative basis, than standard deviations calculated from duplicate samples, indicating that the error of the load determination was primarily a function of the loading model used, not of the sampling or analytical methods.
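The duplicate-versus-split comparison above rests on the fact that the MSE of paired measurements is half the mean squared pairwise difference; if field duplicates and laboratory splits give similar MSEs, sampling adds no measurable variance beyond the analytical and splitting error. The concentrations below are illustrative only, not the study's data.

```python
# MSE of paired measurements: half the mean squared pairwise difference.
def pair_mse(pairs):
    return sum((a - b) ** 2 for a, b in pairs) / (2 * len(pairs))

field_duplicates = [(1.10, 1.18), (0.95, 0.99), (2.40, 2.31)]  # ng/L, invented
lab_splits = [(1.12, 1.16), (0.97, 0.96), (2.35, 2.41)]        # ng/L, invented
mse_dup, mse_split = pair_mse(field_duplicates), pair_mse(lab_splits)
# If mse_dup ~ mse_split, sampling contributes no measurable extra variance;
# an F-test on the ratio of the two MSEs formalizes the comparison.
```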

  7. Tritium monitor with improved gamma-ray discrimination

    DOEpatents

    Cox, S.A.; Bennett, E.F.; Yule, T.J.

    1982-10-21

    Apparatus and method are presented for selective measurement of tritium oxide in an environment which may include other radioactive components and gamma radiation, the measurement including the selective separation of tritium oxide from a sample gas through a membrane into a counting gas, the generation of electrical pulses individually representative by rise times of tritium oxide and other radioactivity in the counting gas, separation of the pulses by rise times, and counting of those pulses representative of tritium oxide. The invention further includes the separate measurement of any tritium in the sample gas by oxidizing the tritium to tritium oxide and carrying out a second separation and analysis procedure as described above.

  8. Tritium monitor with improved gamma-ray discrimination

    DOEpatents

    Cox, Samson A.; Bennett, Edgar F.; Yule, Thomas J.

    1985-01-01

    Apparatus and method for selective measurement of tritium oxide in an environment which may include other radioactive components and gamma radiation, the measurement including the selective separation of tritium oxide from a sample gas through a membrane into a counting gas, the generation of electrical pulses individually representative by rise times of tritium oxide and other radioactivity in the counting gas, separation of the pulses by rise times, and counting of those pulses representative of tritium oxide. The invention further includes the separate measurement of any tritium in the sample gas by oxidizing the tritium to tritium oxide and carrying out a second separation and analysis procedure as described above.

  9. Views of Old Forestry and New Among Reference Groups in the Pacific Northwest

    Treesearch

    Robert G. Ribe; Mollie Y. Matteson

    2002-01-01

    A public opinion survey was conducted in Washington and Oregon. It was not a representative poll sample but instead sampled groups of people favoring forest production, those favoring forest protection, and others not aligned with either of these viewpoints. There is strong consensus across groups regarding the unpopularity of established forestry methods and the need...

  10. Agreement and Diagnostic Performance of FITNESSGRAM®, International Obesity Task Force, and Hungarian National BMI Standards

    ERIC Educational Resources Information Center

    Laurson, Kelly R.; Welk, Gregory J.; Marton, Orsolya; Kaj, Mónika; Csányi, Tamás

    2015-01-01

    Purpose: This study examined agreement between all 3 standards (as well as relative diagnostic associations with metabolic syndrome) using a representative sample of youth from the Hungarian National Youth Fitness Study. Method: Body mass index (BMI) was assessed in a field sample of 2,352 adolescents (ages 10-18.5 years) and metabolic syndrome…

  11. Promotional methods used by representatives of drug companies: A prospective survey in general practice

    PubMed Central

    Schramm, Jesper; Andersen, Morten; Vach, Kirstin; Kragstrup, Jakob; Peter Kampmann, Jens; Søndergaard, Jens

    2007-01-01

    Objective To examine the extent and composition of pharmaceutical industry representatives’ marketing techniques with a particular focus on drug sampling in relation to drug age. Design A group of 47 GPs prospectively collected data on drug promotional activities during a six-month period, and a sub-sample of 10 GPs furthermore recorded the representatives’ marketing techniques in detail. Setting Primary healthcare. Subjects General practitioners in the County of Funen, Denmark. Main outcome measures Promotional visits and corresponding marketing techniques. Results The 47 GPs recorded 1050 visits, corresponding to a median of 19 (range 3 to 63) per GP in the six months. The majority of drugs promoted (52%) had been marketed more than five years earlier. There was a statistically significant decline in the proportion of visits where drug samples were offered with drug age, but the decline was small (OR 0.97 per year, 95% CI 0.95 to 0.98). Leaflets (68%), suggestions on how to improve therapy for a specific patient registered with the practice (53%), drug samples (48%), and gifts (36%) were the most frequently used marketing techniques. Conclusion Drug-industry representatives use a variety of promotional methods. The tendency to hand out drug samples was statistically significantly associated with drug age, but the decline was small. PMID:17497486

  12. Trace elemental analysis of glass and paint samples of forensic interest by ICP-MS using laser ablation solid sample introduction

    NASA Astrophysics Data System (ADS)

    Almirall, Jose R.; Trejos, Tatiana; Hobbs, Andria; Furton, Kenneth G.

    2003-09-01

    The importance of small amounts of glass and paint evidence as a means to associate a crime event to a suspect or a suspect to another individual has been demonstrated in many cases. Glass is a fragile material that is often found at the scenes of crimes such as burglaries, hit-and-run accidents and violent crime offenses. Previous work has demonstrated the utility of elemental analysis by solution ICP-MS of small amounts of glass for the comparison between a fragment found at a crime scene to a possible source of the glass. The multi-element capability and the sensitivity of ICP-MS combined with the simplified sample introduction of laser ablation prior to ion detection provides for an excellent and relatively non-destructive technique for elemental analysis of glass fragments. The direct solid sample introduction technique of laser ablation (LA) is reported as an alternative to the solution method. Direct solid sampling provides several advantages over solution methods and shows great potential for a number of solid sample analyses in forensic science. The advantages of laser ablation include the simplification of sample preparation, thereby reducing the time and complexity of the analysis, the elimination of handling acid dissolution reagents such as HF and the reduction of sources of interferences in the ionization plasma. Direct sampling also provides for essentially "non-destructive" sampling due to the removal of very small amounts of sample needed for analysis. The discrimination potential of LA-ICP-MS is compared with previously reported solution ICP-MS methods using external calibration with internal standardization and a newly reported solution isotope dilution (ID) method. A total of ninety-one different glass samples were used for the comparison study using the techniques mentioned. One set consisted of forty-five headlamps taken from a variety of automobiles representing a range of twenty years of manufacturing dates. 
A second set, consisting of forty-six automotive glasses (side windows and windshields) representing casework glass from different vehicle manufacturers over several years, was also characterized by RI and elemental composition analysis. The solution sample introduction techniques (external calibration and isotope dilution) provide for excellent sensitivity and precision but have the disadvantages of destroying the sample and also involve complex sample preparation. The laser ablation method was simpler, faster and produced comparable discrimination to the EC-ICP-MS and ID-ICP-MS. LA-ICP-MS can provide an excellent alternative to solution analysis of glass in forensic casework samples. Paints and coatings are frequently encountered as trace evidence samples submitted to forensic science laboratories. A LA-ICP-MS method has been developed to complement the commonly used techniques in forensic laboratories in order to better characterize these samples for forensic purposes. Time-resolved plots of each sample can be compared to associate samples to each other or to discriminate between samples. Additionally, the concentration of lead and the ratios of other elements have been determined in various automotive paints by the reported method. A sample set of eighteen (18) survey automotive paint samples has been analyzed with the developed method in order to determine the utility of LA-ICP-MS and to compare the method to the more commonly used scanning electron microscopy (SEM) method for elemental characterization of paint layers in forensic casework.

  13. Multi-edge X-ray absorption spectroscopy study of road dust samples from a traffic area of Venice using stoichiometric and environmental references

    NASA Astrophysics Data System (ADS)

    Valotto, Gabrio; Cattaruzza, Elti; Bardelli, Fabrizio

    2017-02-01

    The appropriate selection of representative pure compounds to be used as references is a crucial step for successful analysis of X-ray absorption near edge spectroscopy (XANES) data, and it is often not a trivial task. This is particularly true when complex environmental matrices are investigated, since their elemental speciation is a priori unknown. In this paper, an investigation of the speciation of Cu, Zn, and Sb based on the use of conventional (stoichiometric compounds) and non-conventional (environmental samples or relevant certified materials) references is explored. This method can be useful when the effectiveness of XANES analysis is limited by the difficulty of obtaining a set of references sufficiently representative of the investigated samples. Road dust samples collected along the bridge connecting Venice to the mainland were used to show the potentialities and the limits of this approach.
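Once a reference set is chosen, XANES speciation is typically done by linear-combination fitting: expressing the sample spectrum as a weighted sum of reference spectra. The sketch below uses synthetic sigmoidal "edges" rather than real Cu/Zn/Sb data, and unconstrained least squares where real fits usually impose non-negative weights summing to one.

```python
import numpy as np

energy = np.linspace(0.0, 1.0, 200)  # normalized energy axis

def edge(center, width=0.05):
    """Synthetic absorption edge: a sigmoid standing in for a XANES spectrum."""
    return 1.0 / (1.0 + np.exp(-(energy - center) / width))

refs = np.column_stack([edge(0.3), edge(0.5), edge(0.7)])  # three references
true_w = np.array([0.6, 0.0, 0.4])                         # invented mixture
noise = np.random.default_rng(0).normal(0, 0.01, energy.size)
sample = refs @ true_w + noise

# Unconstrained least squares; production LCF codes constrain the weights.
w, *_ = np.linalg.lstsq(refs, sample, rcond=None)
```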

  14. Sampling and Pooling Methods for Capturing Herd Level Antibiotic Resistance in Swine Feces using qPCR and CFU Approaches

    PubMed Central

    Mellerup, Anders; Ståhl, Marie

    2015-01-01

    The aim of this article was to define the sampling level and method combination that captures antibiotic resistance at pig herd level utilizing qPCR antibiotic resistance gene quantification and culture-based quantification of antibiotic resistant coliform indicator bacteria. Fourteen qPCR assays for commonly detected antibiotic resistance genes were developed, and used to quantify antibiotic resistance genes in total DNA from swine fecal samples that were obtained using different sampling and pooling methods. In parallel, the number of antibiotic resistant coliform indicator bacteria was determined in the same swine fecal samples. The results showed that the qPCR assays were capable of detecting differences in antibiotic resistance levels in individual animals that the coliform bacteria colony forming units (CFU) could not. Also, the qPCR assays more accurately quantified antibiotic resistance genes when comparing individual sampling and pooling methods. qPCR on pooled samples was found to be a good representative for the general resistance level in a pig herd compared to the coliform CFU counts. It had significantly reduced relative standard deviations compared to coliform CFU counts in the same samples, and therefore differences in antibiotic resistance levels between samples were more readily detected. To our knowledge, this is the first study to describe sampling and pooling methods for qPCR quantification of antibiotic resistance genes in total DNA extracted from swine feces. PMID:26114765
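The "significantly reduced relative standard deviations" claim above refers to the coefficient of variation across replicate measurements; lower RSD makes between-sample differences easier to resolve. A minimal sketch with invented replicate values:

```python
import statistics

# Relative standard deviation (coefficient of variation) in percent.
def rsd_percent(values):
    return statistics.stdev(values) / statistics.mean(values) * 100

qpcr_copies = [9.5e4, 1.05e5, 9.8e4, 1.02e5]  # gene copies per g, invented
coliform_cfu = [3.0e4, 8.0e4, 1.5e4, 6.0e4]   # resistant CFU per g, invented
rsd_qpcr, rsd_cfu = rsd_percent(qpcr_copies), rsd_percent(coliform_cfu)
```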

  15. Instrumentation development for In Situ 40Ar/39Ar planetary geochronology

    USGS Publications Warehouse

    Morgan, Leah; Munk, Madicken; Davidheiser-Kroll, Brett; Warner, Nicholas H.; Gupta, Sanjeev; Slaybaugh, Rachel; Harkness, Patrick; Mark, Darren

    2017-01-01

    The chronology of the Solar System, particularly the timing of formation of extra-terrestrial bodies and their features, is an outstanding problem in planetary science. Although various chronological methods for in situ geochronology have been proposed (e.g., Rb-Sr, K-Ar), and even applied (K-Ar), the reliability, accuracy, and applicability of the 40Ar/39Ar method makes it by far the most desirable chronometer for dating extra-terrestrial bodies. The method however relies on the neutron irradiation of samples, and thus a neutron source. Herein, we discuss the challenges and feasibility of deploying a passive neutron source to planetary surfaces for the in situ application of the 40Ar/39Ar chronometer. Requirements in generating and shielding neutrons, as well as analysing samples are described, along with an exploration of limitations such as mass, power and cost. Two potential solutions for the in situ extra-terrestrial deployment of the 40Ar/39Ar method are presented. Although this represents a challenging task, developing the technology to apply the 40Ar/39Ar method on planetary surfaces would represent a major advance towards constraining the timescale of solar system formation and evolution.

  16. Multi-laboratory survey of qPCR enterococci analysis method performance

    EPA Pesticide Factsheets

    Quantitative polymerase chain reaction (qPCR) has become a frequently used technique for quantifying enterococci in recreational surface waters, but there are several methodological options. Here we evaluated how three method permutations, type of mastermix, sample extract dilution and use of controls in results calculation, affect method reliability among multiple laboratories with respect to sample interference. Multiple samples from each of 22 sites representing an array of habitat types were analyzed using EPA Method 1611 and 1609 reagents with full strength and five-fold diluted extracts. The presence of interference was assessed three ways: using sample processing and PCR amplifications controls; consistency of results across extract dilutions; and relative recovery of target genes from spiked enterococci in water sample compared to control matrices with acceptable recovery defined as 50 to 200%. Method 1609, which is based on an environmental mastermix, was found to be superior to Method 1611, which is based on a universal mastermix. Method 1611 had over a 40% control assay failure rate with undiluted extracts and a 6% failure rate with diluted extracts. Method 1609 failed in only 11% and 3% of undiluted and diluted extracts analyses. Use of sample processing control assay results in the delta-delta Ct method for calculating relative target gene recoveries increased the number of acceptable recovery results. Delta-delta tended to bias recoveries fr
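The delta-delta Ct calculation mentioned above converts qPCR cycle-threshold values into a relative target-gene recovery, normalized by a sample-processing control. A minimal sketch, assuming ~100% amplification efficiency (a factor of 2 per cycle); the Ct values are invented.

```python
# Delta-delta Ct: recovery of the target in the water sample relative to a
# control matrix, normalized by a sample-processing control (SPC) assay.
def ddct_recovery(ct_target_sample, ct_target_control,
                  ct_spc_sample, ct_spc_control):
    d_sample = ct_target_sample - ct_spc_sample    # delta Ct in the sample
    d_control = ct_target_control - ct_spc_control # delta Ct in the control
    return 2.0 ** -(d_sample - d_control)          # assumes efficiency ~100%

rec = ddct_recovery(26.5, 25.0, 24.8, 24.0)
# The abstract's acceptable-recovery window of 50-200% is 0.5 <= rec <= 2.0.
```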

  17. Probabilistic methods for rotordynamics analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
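The eigenvalue instability criterion can be illustrated with plain Monte Carlo: sample the uncertain parameter, form the system matrix, and count draws with an eigenvalue in the right half-plane. The paper's contribution is doing this efficiently (fast probability integration, adaptive importance sampling); the single-degree-of-freedom system and parameter distribution below are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

def is_unstable(c, k=4.0):
    """Unstable if the companion matrix of x'' + c x' + k x = 0 has an
    eigenvalue with positive real part."""
    A = np.array([[0.0, 1.0], [-k, -c]])
    return np.max(np.linalg.eigvals(A).real) > 0

# Random damping coefficient; negative damping makes the system unstable.
damping = rng.normal(0.5, 0.4, 20000)
p_unstable = np.mean([is_unstable(c) for c in damping])
# Analytically, P(c < 0) = Phi(-0.5/0.4) ~ 0.106 for this distribution.
```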

  18. Electrical network method for the thermal or structural characterization of a conducting material sample or structure

    DOEpatents

    Ortiz, Marco G.

    1993-01-01

    A method for modeling a conducting material sample or structure system, as an electrical network of resistances in which each resistance of the network is representative of a specific physical region of the system. The method encompasses measuring a resistance between two external leads and using this measurement in a series of equations describing the network to solve for the network resistances for a specified region and temperature. A calibration system is then developed using the calculated resistances at specified temperatures. This allows for the translation of the calculated resistances to a region temperature. The method can also be used to detect and quantify structural defects in the system.

  19. Electrical network method for the thermal or structural characterization of a conducting material sample or structure

    DOEpatents

    Ortiz, M.G.

    1993-06-08

    A method for modeling a conducting material sample or structure system, as an electrical network of resistances in which each resistance of the network is representative of a specific physical region of the system. The method encompasses measuring a resistance between two external leads and using this measurement in a series of equations describing the network to solve for the network resistances for a specified region and temperature. A calibration system is then developed using the calculated resistances at specified temperatures. This allows for the translation of the calculated resistances to a region temperature. The method can also be used to detect and quantify structural defects in the system.

  20. Designing efficient nitrous oxide sampling strategies in agroecosystems using simulation models

    NASA Astrophysics Data System (ADS)

    Saha, Debasish; Kemanian, Armen R.; Rau, Benjamin M.; Adler, Paul R.; Montes, Felipe

    2017-04-01

    Annual cumulative soil nitrous oxide (N2O) emissions calculated from discrete chamber-based flux measurements have unknown uncertainty. We used outputs from simulations obtained with an agroecosystem model to design sampling strategies that yield accurate cumulative N2O flux estimates with a known uncertainty level. Daily soil N2O fluxes were simulated for Ames, IA (corn-soybean rotation), College Station, TX (corn-vetch rotation), Fort Collins, CO (irrigated corn), and Pullman, WA (winter wheat), representing diverse agro-ecoregions of the United States. Fertilization source, rate, and timing were site-specific. These simulated fluxes served as surrogates for daily measurements in the analysis. We "sampled" the fluxes using a fixed-interval (1-32 days) or a rule-based (decision-tree-based) sampling method. Two types of decision trees were built: a high-input tree (HI) that included soil inorganic nitrogen (SIN) as a predictor variable, and a low-input tree (LI) that excluded SIN. Other predictor variables were identified with Random Forest. The decision trees were inverted to be used as rules for sampling a representative number of members from each terminal node. The uncertainty of the annual N2O flux estimation increased with the fixed-interval length. A 4- and 8-day fixed sampling interval was required at College Station and Ames, respectively, to yield ±20% accuracy in the flux estimate; a 12-day interval rendered the same accuracy at Fort Collins and Pullman. Both the HI and the LI rule-based methods provided the same accuracy as the fixed-interval method with up to a 60% reduction in sampling events, particularly at locations with greater temporal flux variability. For instance, at Ames, the HI rule-based and the fixed-interval methods required 16 and 91 sampling events, respectively, to achieve the same absolute bias of 0.2 kg N ha⁻¹ yr⁻¹ in estimating cumulative N2O flux.
These results suggest that using simulation models along with decision trees can reduce the cost and improve the accuracy of the estimations of cumulative N2O fluxes using the discrete chamber-based method.
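The fixed-interval evaluation can be mimicked with a synthetic daily flux series: subsample every `step` days, integrate by the trapezoid rule, and compare with the full-series total. The flux curve below is invented; in the paper, daily fluxes come from the agroecosystem model.

```python
import math

# Synthetic daily N2O flux (kg N/ha/day): low baseline plus one emission pulse.
days = list(range(365))
daily = [0.02 + 0.2 * math.exp(-((d - 150) ** 2) / 200) for d in days]
true_annual = sum(daily)

def estimate(step):
    """Annual flux from a fixed-interval subsample via trapezoid integration."""
    idx = list(range(0, 365, step))
    total = 0.0
    for a, b in zip(idx, idx[1:]):
        total += (daily[a] + daily[b]) / 2 * (b - a)
    total += daily[idx[-1]] * (364 - idx[-1])  # carry last sample to year end
    return total

bias_4day = estimate(4) / true_annual - 1    # relative bias, 4-day interval
bias_16day = estimate(16) / true_annual - 1  # relative bias, 16-day interval
```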

  1. Estimating the Effective Sample Size of Tree Topologies from Bayesian Phylogenetic Analyses

    PubMed Central

    Lanfear, Robert; Hua, Xia; Warren, Dan L.

    2016-01-01

    Bayesian phylogenetic analyses estimate posterior distributions of phylogenetic tree topologies and other parameters using Markov chain Monte Carlo (MCMC) methods. Before making inferences from these distributions, it is important to assess their adequacy. To this end, the effective sample size (ESS) estimates how many truly independent samples of a given parameter the output of the MCMC represents. The ESS of a parameter is frequently much lower than the number of samples taken from the MCMC because sequential samples from the chain can be non-independent due to autocorrelation. Typically, phylogeneticists use a rule of thumb that the ESS of all parameters should be greater than 200. However, we have no method to calculate an ESS of tree topology samples, despite the fact that the tree topology is often the parameter of primary interest and is almost always central to the estimation of other parameters. That is, we lack a method to determine whether we have adequately sampled one of the most important parameters in our analyses. In this study, we address this problem by developing methods to estimate the ESS for tree topologies. We combine these methods with two new diagnostic plots for assessing posterior samples of tree topologies, and compare their performance on simulated and empirical data sets. Combined, the methods we present provide new ways to assess the mixing and convergence of phylogenetic tree topologies in Bayesian MCMC analyses. PMID:27435794
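For a scalar parameter, the ESS the abstract refers to (before this paper's extension to tree topologies) is conventionally computed from the chain's autocorrelations, ESS = N / (1 + 2·Σρ_k). A minimal pure-Python sketch, using one common truncation rule among several:

```python
import random

def autocorr(x, lag):
    # Sample autocorrelation of the sequence x at the given lag.
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x)
    cov = sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n - lag))
    return cov / var

def effective_sample_size(x):
    # ESS = N / (1 + 2 * sum of autocorrelations), truncating the sum
    # at the first non-positive autocorrelation (a common rule; other
    # truncation schemes exist).
    n = len(x)
    s = 0.0
    for lag in range(1, n // 2):
        r = autocorr(x, lag)
        if r <= 0:
            break
        s += r
    return n / (1 + 2 * s)

# A strongly autocorrelated chain has far fewer effective samples than
# its raw length; an AR(1)-style chain illustrates this.
random.seed(42)
chain, v = [], 0.0
for _ in range(500):
    v = 0.95 * v + random.gauss(0.0, 1.0)
    chain.append(v)
print(len(chain), round(effective_sample_size(chain), 1))
```

The paper's contribution is precisely that no analogue of this calculation existed for tree topologies, which are not scalar and have no natural ordering.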

  2. Analysis of Aircraft Fuels and Related Materials

    DTIC Science & Technology

    1982-09-01

    content by the Karl Fischer method. Each 2040 solvent sample represented a different step in a clean-up procedure conducted by Aero Propulsion...izes a potentiometric titration with alcoholic silver nitrate. This method has a minimum detectability of 1 ppm. It has a repeatability of 0.1 ppm... Method 163-80, which utilizes a potentiometric titration with alcoholic silver nitrate. This method has a minimum detectability of 1 ppm and has a

  3. A method for the extraction and quantitation of phycoerythrin from algae

    NASA Technical Reports Server (NTRS)

    Stewart, D. E.

    1982-01-01

    A new technique for the extraction and quantitation of phycoerythrin (PHE) from algal samples is summarized. Results of the analysis of four extracts representing three PHE types from algae, including cryptomonad and cyanophyte types, are presented. The method of extraction and an equation for quantitation are given. A graph is provided showing the relationship between concentration and fluorescence units that may be used with samples fluorescing around 575-580 nm (probably dominated by cryptophytes in estuarine waters) and 560 nm (dominated by cyanophytes characteristic of the open ocean).

  4. Comparison of the Reveal 20-hour method and the BAM culture method for the detection of Escherichia coli O157:H7 in selected foods and environmental swabs: collaborative study.

    PubMed

    Bird, C B; Hoerner, R J; Restaino, L

    2001-01-01

    Four different food types along with environmental swabs were analyzed by the Reveal for E. coli O157:H7 test (Reveal) and the Bacteriological Analytical Manual (BAM) culture method for the presence of Escherichia coli O157:H7. Twenty-seven laboratories representing academia and private industry in the United States and Canada participated. Sample types were inoculated with E. coli O157:H7 at 2 different levels. Of the 1,095 samples and controls analyzed and confirmed, 459 were positive and 557 were negative by both methods. No statistically significant differences (p < 0.05) were observed between the Reveal and BAM methods.

  5. Analysis and comparison of glass fragments by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) and ICP-MS.

    PubMed

    Trejos, Tatiana; Montero, Shirly; Almirall, José R

    2003-08-01

    The discrimination potential of laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) is compared with previously reported solution ICP-MS methods using external calibration (EC) with internal standardization and a newly reported solution isotope dilution (ID) method for the analysis of two different glass populations. A total of 91 different glass samples were used for the comparison study; refractive index and elemental composition were measured by the techniques mentioned above. One set consisted of 45 headlamps taken from a variety of automobiles representing a 20-year range of manufacturing dates. A second set consisted of 46 automotive glasses (side windows, rear windows, and windshields) representing casework glass from different vehicle manufacturers over several years. The element menu for the LA-ICP-MS and EC-ICP-MS methods includes Mg, Al, Ca, Mn, Ce, Ti, Zr, Sb, Ga, Ba, Rb, Sm, Sr, Hf, La, and Pb. The ID method was limited to the analysis of two isotopes each of Mg, Sr, Zr, Sb, Ba, Sm, Hf, and Pb. Laser ablation analyses were performed with a Q-switched Nd:YAG laser (266 nm, 6 mJ output energy), used in depth-profile mode while sampling with a 50 microm spot size for 50 sec at 10 Hz (500 shots). The typical bias for the analysis of NIST 612 by LA-ICP-MS was less than 5% for most isotopes. The precision for the vast majority of the element menu was generally better than 10% for all the methods when NIST 612 was measured (40 microg x g(-1)). Method detection limits (MDL) for the EC and LA-ICP-MS methods were similar and generally less than 1 microg x g(-1) for the analysis of NIST 612. While the solution sample introduction methods using EC and ID offered excellent sensitivity and precision, they have the disadvantages of destroying the sample and requiring complex sample preparation. The laser ablation method was simpler, faster, and produced discrimination comparable to EC-ICP-MS and ID-ICP-MS. LA-ICP-MS can offer an excellent alternative to solution analysis of glass in forensic casework samples.

  6. Hog Charm II tetracycline test screening results compared with a liquid chromatography tandem mass spectrometry 10-μg/kg method.

    PubMed

    Salter, Robert; Holmes, Steven; Legg, David; Coble, Joel; George, Bruce

    2012-02-01

    Pork tissue samples that tested positive and negative by the Charm II tetracycline test screening method in the slaughter plant laboratory were tested with the modified AOAC International liquid chromatography tandem mass spectrometry (LC-MS-MS) method 995.09 to determine the predictive value of the screening method at detecting total tetracyclines at 10 μg/kg of tissue, in compliance with Russian import regulations. There were 218 presumptive-positive tetracycline samples among 4,195 randomly tested hogs. Of these screening-test-positive samples, 83% (182) were positive (>10 μg/kg) by LC-MS-MS; 12.8% (28) were false-violative, greater than the limit of detection (LOD) but <10 μg/kg; and 4.2% (8) were not detected at the LC-MS-MS LOD. The 36 false-violative and not-detected samples represent 1% of the total samples screened. Twenty-seven of 30 randomly selected tetracycline screening-negative samples tested below the LC-MS-MS LOD, and 3 samples tested <3 μg/kg chlortetracycline. Results indicate that the Charm II tetracycline test is effective at predicting hogs containing >10 μg/kg total tetracyclines in compliance with Russian import regulations.
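The screening-performance figures in the abstract reduce to a simple predictive-value calculation; the sketch below only re-derives the reported proportions from the reported counts.

```python
# Counts reported in the abstract (Charm II screen vs. LC-MS-MS at 10 ug/kg).
screen_positive = 218
confirmed_violative = 182   # >10 ug/kg by LC-MS-MS
false_violative = 28        # detected (>LOD) but <10 ug/kg
not_detected = 8            # below the LC-MS-MS LOD
total_screened = 4195

# Positive predictive value of the screen at the 10 ug/kg threshold,
# and the share of all screened samples that were screen-positive but
# not confirmed violative.
ppv = confirmed_violative / screen_positive
unconfirmed_share = (false_violative + not_detected) / total_screened
print(f"PPV = {ppv:.1%}")                              # ~83%, as reported
print(f"unconfirmed share = {unconfirmed_share:.1%}")  # ~1% of all samples
```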

  7. A recreation quality rapid assessment method for visitor capacity management

    Treesearch

    Kenneth Chilman; Stuart Schneider; Les Wadzinski

    2007-01-01

    A rapid assessment method for inexpensively obtaining representative samples of place-specific visitor numbers and perceptions of visit quality was tested on Niobrara National Scenic River (NSR). Similar tests have been done on national forest areas in Indiana and Illinois. The data are used in meetings focusing on visitor capacity management. The rapid assessment...

  8. Monitoring Malaria Vector Control Interventions: Effectiveness of Five Different Adult Mosquito Sampling Methods

    PubMed Central

    Onyango, Shirley A.; Kitron, Uriel; Mungai, Peter; Muchiri, Eric M.; Kokwaro, Elizabeth; King, Charles H.; Mutuku, Francis M.

    2014-01-01

    Long-term success of ongoing malaria control efforts based on mosquito bed nets (long-lasting insecticidal net) and indoor residual spraying is dependent on continuous monitoring of mosquito vectors, and thus on effective mosquito sampling tools. The objective of our study was to identify the most efficient mosquito sampling tool(s) for routine vector surveillance for malaria and lymphatic filariasis transmission in coastal Kenya. We evaluated relative efficacy of five collection methods—light traps associated with a person sleeping under a net, pyrethrum spray catches, Prokopack aspirator, clay pots, and urine-baited traps—in four villages representing three ecological settings along the south coast of Kenya. Of the five methods, light traps were the most efficient for collecting female Anopheles gambiae s.l. (Giles) (Diptera: Culicidae) and Anopheles funestus (Giles) (Diptera: Culicidae) mosquitoes, whereas the Prokopack aspirator was most efficient in collecting Culex quinquefasciatus (Say) (Diptera: Culicidae) and other culicines. With the low vector densities here, and across much of sub-Saharan Africa, wherever malaria interventions, long-lasting insecticidal nets, and/or indoor residual spraying are in place, the use of a single mosquito collection method will not be sufficient to achieve a representative sample of mosquito population structure. Light traps will remain a relevant tool for host-seeking mosquitoes, especially in the absence of human landing catches. For a fair representation of the indoor mosquito population, light traps will have to be supplemented with aspirator use, which has potential for routine monitoring of indoor resting mosquitoes, and can substitute the more labor-intensive and intrusive pyrethrum spray catches. There are still no sufficiently efficient mosquito collection methods for sampling outdoor mosquitoes, particularly those that are bloodfed. PMID:24180120

  9. Molecular methods to assess Listeria monocytogenes route of contamination in a dairy processing plant.

    PubMed

    Alessandria, Valentina; Rantsiou, Kalliopi; Dolci, Paola; Cocolin, Luca

    2010-07-31

    In this study we investigated the occurrence of Listeria monocytogenes in a dairy processing plant during two sampling campaigns in 2007 and 2008. Samples, comprising semifinished and finished cheeses, swabs from the equipment, and brines from the salting step, were analyzed using both traditional and molecular methods, the latter represented mainly by quantitative PCR (qPCR). Comparison of the results obtained with the two approaches made it evident that traditional microbiological analysis underestimated the presence of L. monocytogenes in the dairy plant. In particular, samples of the brines and the equipment swabs were positive only by qPCR. For some equipment swabs it was possible to detect a load of 10(4)-10(5) cfu/cm(2), while the modified ISO method employed gave negative results both before and after the enrichment step. The evidence collected during the first sampling year, highlighting heavy contamination of the brines and of the equipment, led to the implementation of specific actions that decreased the contamination in these samples during the 2008 campaign. However, no reduction in the number of L. monocytogenes-positive final products was observed, suggesting that stricter control is necessary to avoid the presence of the pathogen. All the L. monocytogenes isolates were able to attach to abiotic surfaces and, interestingly, their molecular characterization made it evident that strains present in the brines were genetically connected with isolates from the equipment and from the final product, suggesting a clear route of contamination of the pathogen in the dairy plant. This study underlines the necessity of using appropriate analytical tools, such as molecular methods, to fully understand the spread and persistence of L. monocytogenes in food-producing companies. Copyright 2010 Elsevier B.V. All rights reserved.

  10. A Novel Method to Handle the Effect of Uneven Sampling Effort in Biodiversity Databases

    PubMed Central

    Pardo, Iker; Pata, María P.; Gómez, Daniel; García, María B.

    2013-01-01

    How reliable are results on the spatial distribution of biodiversity based on databases? Many studies have demonstrated the uncertainty of this kind of analysis due to sampling-effort bias and the need for its quantification. Although a number of methods are available for that purpose, little is known about their statistical limitations and discrimination capability, which could seriously constrain their use. We assess for the first time the discrimination capacity of two widely used methods and a newly proposed one (FIDEGAM), all based on species accumulation curves, under different scenarios of sampling exhaustiveness using Receiver Operating Characteristic (ROC) analyses. Additionally, we examine to what extent the output of each method represents the sampling completeness in a simulated scenario where the true species richness is known. Finally, we apply FIDEGAM to a real situation and explore the spatial patterns of plant diversity in a National Park. FIDEGAM showed an excellent capability to discriminate between well-sampled and poorly sampled areas regardless of sampling exhaustiveness, whereas the other methods failed. Accordingly, FIDEGAM values were strongly correlated with the true percentage of species detected in the simulated scenario, whereas sampling completeness estimated with the other methods showed no relationship, owing to their null discrimination capability. Quantifying sampling effort is necessary to account for the uncertainty in biodiversity analyses; however, not all proposed methods are equally reliable. Our comparative analysis demonstrated that FIDEGAM was the most accurate discriminator in all scenarios of sampling exhaustiveness and can therefore be efficiently applied to most databases to enhance the reliability of biodiversity analyses. PMID:23326357

  11. A novel method to handle the effect of uneven sampling effort in biodiversity databases.

    PubMed

    Pardo, Iker; Pata, María P; Gómez, Daniel; García, María B

    2013-01-01

    How reliable are results on the spatial distribution of biodiversity based on databases? Many studies have demonstrated the uncertainty of this kind of analysis due to sampling-effort bias and the need for its quantification. Although a number of methods are available for that purpose, little is known about their statistical limitations and discrimination capability, which could seriously constrain their use. We assess for the first time the discrimination capacity of two widely used methods and a newly proposed one (FIDEGAM), all based on species accumulation curves, under different scenarios of sampling exhaustiveness using Receiver Operating Characteristic (ROC) analyses. Additionally, we examine to what extent the output of each method represents the sampling completeness in a simulated scenario where the true species richness is known. Finally, we apply FIDEGAM to a real situation and explore the spatial patterns of plant diversity in a National Park. FIDEGAM showed an excellent capability to discriminate between well-sampled and poorly sampled areas regardless of sampling exhaustiveness, whereas the other methods failed. Accordingly, FIDEGAM values were strongly correlated with the true percentage of species detected in the simulated scenario, whereas sampling completeness estimated with the other methods showed no relationship, owing to their null discrimination capability. Quantifying sampling effort is necessary to account for the uncertainty in biodiversity analyses; however, not all proposed methods are equally reliable. Our comparative analysis demonstrated that FIDEGAM was the most accurate discriminator in all scenarios of sampling exhaustiveness and can therefore be efficiently applied to most databases to enhance the reliability of biodiversity analyses.
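FIDEGAM's internals are not given in the abstract, but all three methods compared are built on species accumulation curves. The sketch below shows only that shared building block, with a deliberately crude completeness proxy (the flatness of the curve's tail); it is not an implementation of FIDEGAM, and the `tail` parameter is my own simplification.

```python
import random

def accumulation_curve(samples):
    # Cumulative number of distinct species after each sampling event.
    seen, curve = set(), []
    for event in samples:
        seen.update(event)
        curve.append(len(seen))
    return curve

def completeness_proxy(samples, tail=5):
    # Crude indicator: 1 minus the mean per-event species gain over the
    # curve's last `tail` steps (flat tail -> ~1.0, still rising -> lower).
    curve = accumulation_curve(samples)
    gained = curve[-1] - curve[-1 - tail]
    return 1.0 - gained / tail

# A well-sampled area: a small species pool surveyed many times, so the
# curve saturates and the proxy approaches 1.
random.seed(1)
pool = list(range(20))
events = [[random.choice(pool) for _ in range(10)] for _ in range(30)]
print(completeness_proxy(events))
```

In an undersampled area the curve is still climbing at the end of the survey, so the same proxy stays well below 1; discriminating those two regimes reliably across sampling intensities is exactly what the paper evaluates with ROC analysis.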

  12. Component-based subspace linear discriminant analysis method for face recognition with one training sample

    NASA Astrophysics Data System (ADS)

    Huang, Jian; Yuen, Pong C.; Chen, Wen-Sheng; Lai, J. H.

    2005-05-01

    Many face recognition algorithms/systems have been developed in the last decade, and excellent performance has been reported when a sufficient number of representative training samples is available. In many real-life applications, such as passport identification, only one well-controlled frontal sample image is available for training. In this situation, the performance of existing algorithms degrades dramatically, or they may not be applicable at all. We propose a component-based linear discriminant analysis (LDA) method to solve the one-training-sample problem. The basic idea of the proposed method is to construct local facial feature component bunches by moving each local feature region in four directions. In this way, we not only generate more samples of lower dimension than the original image, but also account for face-detection localization error during training. We then propose a subspace LDA method, tailor-made for small numbers of training samples, for the local feature projection to maximize the discrimination power. Theoretical analysis and experimental results show that our proposed subspace LDA is efficient and overcomes the limitations of existing LDA methods. Finally, we combine the contributions of each local component bunch with a weighted combination scheme to reach the recognition decision. The FERET database is used to evaluate the proposed method, and results are encouraging.

  13. A liquid chromatography/tandem mass spectrometry assay for the analysis of atomoxetine in human plasma and in vitro cellular samples

    PubMed Central

    Appel, David I.; Brinda, Bryan; Markowitz, John S.; Newcorn, Jeffrey H.; Zhu, Hao-Jie

    2012-01-01

    A simple, rapid and sensitive method for quantification of atomoxetine by liquid chromatography-tandem mass spectrometry (LC-MS/MS) was developed. This assay represents the first LC-MS/MS quantification method for atomoxetine utilizing electrospray ionization. Deuterated atomoxetine (d3-atomoxetine) was adopted as the internal standard. Direct protein precipitation was utilized for sample preparation. This method was validated for both human plasma and in vitro cellular samples. The lower limit of quantification was 3 ng/ml and 10 nM for human plasma and cellular samples, respectively. The calibration curves were linear within the ranges of 3 ng/ml to 900 ng/ml and 10 nM to 10 μM for human plasma and cellular samples, respectively (r2 > 0.999). The intra- and inter-day assay accuracy and precision were evaluated using quality control samples at 3 different concentrations in both human plasma and cellular lysate. Sample run stability, assay selectivity, matrix effect, and recovery were also successfully demonstrated. The present assay is superior to previously published LC-MS and LC-MS/MS methods in terms of sensitivity or simplicity of sample preparation. This assay is applicable to the analysis of atomoxetine in both human plasma and in vitro cellular samples. PMID:22275222

  14. Characterization of nine polyphenols in fruits of Malus pumila Mill by high-performance liquid chromatography.

    PubMed

    Bai, Lu; Guo, Sen; Liu, Qingchao; Cui, Xueqin; Zhang, Xinxin; Zhang, Li; Yang, Xinwen; Hou, Manwei; Ho, Chi-Tang; Bai, Naisheng

    2016-04-01

    Polyphenols are important bioactive substances in apple. To explore the profiles of nine representative polyphenols in this fruit, a high-performance liquid chromatography method was established and validated. The validated method was successfully applied for the simultaneous characterization and quantification of these nine apple polyphenols in 11 apple extracts, obtained from six cultivars from Shaanxi Province, China. The results showed that only the abscission Fuji apple sample was rich in all nine apple polyphenols, while the polyphenol contents of the other samples varied. Although all the samples were collected in the same region, their contents of the nine polyphenols differed. The proposed method could serve as a prerequisite for quality control of Malus products. Copyright © 2015. Published by Elsevier B.V.

  15. An improved initialization center k-means clustering algorithm based on distance and density

    NASA Astrophysics Data System (ADS)

    Duan, Yanling; Liu, Qun; Xia, Shuyin

    2018-04-01

    To address the problem that the random initial cluster centers of the k-means algorithm leave the clustering results sensitive to outlier samples and unstable across repeated runs, a center-initialization method based on larger distance and higher density is proposed. The reciprocal of the weighted average distance is used to represent sample density, and the data samples with larger distance and higher density are selected as the initial cluster centers to optimize the clustering results. A clustering evaluation method based on distance and density is then designed to verify the feasibility and practicality of the algorithm; experimental results on UCI data sets show that the algorithm has good stability and practical value.
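The initialization rule described above (density as the reciprocal of an average distance; centers chosen for high density and large mutual separation) can be sketched as follows. The unweighted mean distance and the density-times-distance score are my simplifications; the paper's exact weighting scheme may differ.

```python
import math

def init_centers(points, k):
    """Choose k initial k-means centers using distance and density:
    density(i) = 1 / (mean distance from point i to all other points);
    the densest point seeds the set, then each next center maximizes
    density * distance-to-nearest-chosen-center."""
    n = len(points)
    density = []
    for i in range(n):
        mean_d = sum(math.dist(points[i], points[j])
                     for j in range(n) if j != i) / (n - 1)
        density.append(1.0 / mean_d if mean_d > 0 else float("inf"))

    centers = [points[max(range(n), key=lambda i: density[i])]]
    while len(centers) < k:
        def score(i):
            d_near = min(math.dist(points[i], c) for c in centers)
            return density[i] * d_near  # far from chosen centers AND dense
        centers.append(points[max(range(n), key=score)])
    return centers

# Two well-separated 2-D clusters: the two centers should land one per cluster.
pts = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0),
       (10.0, 10.0), (10.0, 11.0), (11.0, 10.0)]
print(init_centers(pts, 2))
```

Because an isolated outlier has very low density, its density-times-distance score stays small even though it is far from every chosen center, which is the mechanism by which this style of initialization avoids seeding a cluster on an outlier.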

  16. Measuring of electrical changes induced by in situ combustion through flow-through electrodes in a laboratory sample of core material

    DOEpatents

    Lee, D.O.; Montoya, P.C.; Wayland, J.R. Jr.

    1986-12-09

    Method and apparatus are provided for obtaining accurate dynamic measurements for passage of phase fronts through a core sample in a test fixture. Flow-through grid structures are provided for electrodes to permit data to be obtained before, during and after passage of a front therethrough. Such electrodes are incorporated in a test apparatus for obtaining electrical characteristics of the core sample. With the inventive structure a method is provided for measurement of instabilities in a phase front progressing through the medium. Availability of accurate dynamic data representing parameters descriptive of material characteristics before, during and after passage of a front provides a more efficient method for enhanced recovery of oil using a fire flood technique. 12 figs.

  17. Measuring of electrical changes induced by in situ combustion through flow-through electrodes in a laboratory sample of core material

    DOEpatents

    Lee, David O.; Montoya, Paul C.; Wayland, Jr., James R.

    1986-01-01

    Method and apparatus are provided for obtaining accurate dynamic measurements for passage of phase fronts through a core sample in a test fixture. Flow-through grid structures are provided for electrodes to permit data to be obtained before, during and after passage of a front therethrough. Such electrodes are incorporated in a test apparatus for obtaining electrical characteristics of the core sample. With the inventive structure a method is provided for measurement of instabilities in a phase front progressing through the medium. Availability of accurate dynamic data representing parameters descriptive of material characteristics before, during and after passage of a front provides a more efficient method for enhanced recovery of oil using a fire flood technique.

  18. Comparing adult cannabis treatment-seekers enrolled in a clinical trial with national samples of cannabis users in the United States

    PubMed Central

    McClure, Erin A.; King, Jacqueline S.; Wahle, Aimee; Matthews, Abigail G.; Sonne, Susan C.; Lofwall, Michelle R.; McRae-Clark, Aimee L.; Ghitza, Udi E.; Martinez, Melissa; Cloud, Kasie; Virk, Harvir S.; Gray, Kevin M.

    2017-01-01

    Background Cannabis use rates are increasing among adults in the United States (US) while the perception of harm is declining. This may result in an increased prevalence of cannabis use disorder and the need for more clinical trials to evaluate efficacious treatment strategies. Clinical trials are the gold standard for evaluating treatment, yet study samples are rarely representative of the target population; whether this holds for cannabis treatment trials has not yet been established. This study compared demographic and cannabis use characteristics of a cannabis cessation clinical trial sample (run through the National Drug Abuse Treatment Clinical Trials Network) with three nationally representative datasets from the US: 1) National Survey on Drug Use and Health, 2) National Epidemiologic Survey on Alcohol and Related Conditions-III, and 3) Treatment Episodes Data Set – Admissions. Methods Comparisons were made between the clinical trial sample and appropriate cannabis-using subsamples from the national datasets, and propensity scores were calculated to determine the degree of similarity between samples. Results The clinical trial sample was significantly different from all three national datasets, with greater representation of older adults, African Americans, Hispanics/Latinos, adults with more education, non-tobacco users, and daily and almost-daily cannabis users. Conclusions These results are consistent with previous studies of other substance use disorder populations and extend sample-representation concerns to a cannabis use disorder population. This illustrates the need to ensure representative samples within cannabis treatment clinical trials to improve the generalizability of promising findings. PMID:28511033

  19. Exploring Genomic Diversity Using Metagenomics of Deep-Sea Subsurface Microbes from the Louisville Seamount and the South Pacific Gyre

    NASA Astrophysics Data System (ADS)

    Tully, B. J.; Sylvan, J. B.; Heidelberg, J. F.; Huber, J. A.

    2014-12-01

    There are many limitations involved in sampling microbial diversity from deep-sea subsurface environments, ranging from physical sample collection and low microbial biomass to culturing at in situ conditions and inefficient nucleic acid extractions. As such, we are continually modifying our methods to obtain better results and expanding what we know about microbes in these environments. Here we present analysis of metagenome sequences from samples collected from 120 m within the Louisville Seamount and from the top 5-10 cm of the sediment in the center of the South Pacific Gyre (SPG). Both systems are low biomass, with ~10(2) and ~10(4) cells per cm(3) for the Louisville Seamount samples analyzed and the SPG sediment, respectively. The Louisville Seamount samples represent the first in situ subseafloor basalt microbial metagenomes, and the SPG sediments represent the first in situ low-biomass sediment microbial metagenomes. Both of these environments, subseafloor basalt and sediments underlying oligotrophic ocean gyres, represent large provinces of the seafloor that remain understudied. Despite the low biomass and DNA yields from these samples, we have generated 16 near-complete genomes (5 from Louisville and 11 from the SPG) from the two metagenomic datasets. These genomes are estimated to be between 51-100% complete and span a range of phylogenetic groups, including the Proteobacteria, Actinobacteria, Firmicutes, Chloroflexi, and unclassified bacterial groups. With these genomes, we have assessed the potential functional capabilities of these organisms and performed a comparative analysis between the environmental genomes and previously sequenced relatives to determine possible adaptations that may elucidate survival mechanisms for these low-energy environments. These methods illustrate a baseline analysis that can be applied to future metagenomic deep-sea subsurface datasets and will help to further our understanding of microbiology within these environments.

  20. Automated acid and base number determination of mineral-based lubricants by fourier transform infrared spectroscopy: commercial laboratory evaluation.

    PubMed

    Winterfield, Craig; van de Voort, F R

    2014-12-01

    The Fluid Life Corporation assessed and implemented Fourier transform infrared spectroscopy (FTIR)-based methods using American Society for Testing and Materials (ASTM)-like stoichiometric reactions for determination of acid and base number for in-service mineral-based oils. The basic protocols, quality control procedures, calibration, validation, and performance of these new quantitative methods are assessed. ASTM correspondence is attained using a mixed-mode calibration, using primary reference standards to anchor the calibration, supplemented by representative sample lubricants analyzed by ASTM procedures. A partial least squares calibration is devised by combining primary acid/base reference standards and representative samples, focusing on the main spectral stoichiometric response with chemometrics assisting in accounting for matrix variability. FTIR(AN/BN) methodology is precise, accurate, and free of most interference that affects ASTM D664 and D4739 results. Extensive side-by-side operational runs produced normally distributed differences with mean differences close to zero and standard deviations of 0.18 and 0.26 mg KOH/g, respectively. Statistically, the FTIR methods are a direct match to the ASTM methods, with superior performance in terms of analytical throughput, preparation time, and solvent use. FTIR(AN/BN) analysis is a viable, significant advance for in-service lubricant analysis, providing an economic means of trending samples instead of tedious and expensive conventional ASTM(AN/BN) procedures. © 2014 Society for Laboratory Automation and Screening.

  1. Quantification and semiquantification of multiple representative components for the holistic quality control of Allii Macrostemonis Bulbus by ultra high performance liquid chromatography with quadrupole time-of-flight tandem mass spectrometry.

    PubMed

    Qin, Zifei; Lin, Pei; Dai, Yi; Yao, Zhihong; Wang, Li; Yao, Xinsheng; Liu, Liyin; Chen, Haifeng

    2016-05-01

    Allii Macrostemonis Bulbus (named Xiebai in China) is a folk medicine used for the treatment of thoracic obstruction and cardialgia, as well as a food additive. However, there is no quantitative standard for Allii Macrostemonis Bulbus in the current edition of the Chinese Pharmacopeia. Hence, simultaneous assay of multiple components is urgently needed. In this study, chemometric methods were first applied to discover the components with significant fluctuation among multiple Allii Macrostemonis Bulbus samples based on optimized fingerprints. Meanwhile, the major components and the main components absorbed in rats were all selected as its representative components. Subsequently, a sensitive method was established for the simultaneous determination of 54 components (15 for quantification and 39 for semiquantification) by ultra high performance liquid chromatography coupled with quadrupole time-of-flight tandem mass spectrometry. Moreover, the validated method was successfully applied to evaluate the quality of multiple samples on the market, which were found to vary significantly and show poor consistency. This work illustrates that the proposed approach could improve the quality control of Allii Macrostemonis Bulbus, and it also provides a feasible method for the quality evaluation of other traditional Chinese medicines. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Sampling enhancement for the quantum mechanical potential based molecular dynamics simulations: a general algorithm and its extension for free energy calculation on rugged energy surface.

    PubMed

    Li, Hongzhi; Yang, Wei

    2007-03-21

    An approach is developed in the replica exchange framework to enhance conformational sampling for quantum mechanical (QM) potential based molecular dynamics simulations. Importantly, with our enhanced sampling treatment, decent convergence of the electronic structure self-consistent-field calculation is robustly guaranteed, which is made possible in our replica exchange design by avoiding direct structure exchanges between the QM-related replicas and the activated (scaled by low scaling parameters or treated with high "effective temperatures") molecular mechanical (MM) replicas. Although the present approach represents one of the early efforts in enhanced sampling developments specifically for quantum mechanical potentials, QM-based simulations treated with the present technique can possess sampling efficiency similar to that of MM-based simulations treated with the Hamiltonian replica exchange method (HREM). In the present paper, by combining this sampling method with one of our recent developments (the dual-topology alchemical HREM approach), we also introduce a method for sampling-enhanced QM-based free energy calculations.
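
    The replica-exchange machinery this work builds on rests on a Metropolis swap criterion between pairs of replicas. As an illustrative sketch only (plain temperature replica exchange, not the paper's QM/MM-aware exchange scheme):

```python
import math
import random

def exchange_accepted(beta_i, beta_j, E_i, E_j, rng=random.random):
    """Metropolis criterion for swapping the configurations of two
    replicas at inverse temperatures beta_i, beta_j whose current
    potential energies are E_i, E_j."""
    delta = (beta_i - beta_j) * (E_i - E_j)
    return delta >= 0 or rng() < math.exp(delta)

# A swap that increases the joint Boltzmann weight is always accepted...
print(exchange_accepted(1.0, 0.5, -5.0, -10.0))   # True
# ...an unfavorable swap is accepted only with probability exp(delta).
print(exchange_accepted(1.0, 0.5, -10.0, -5.0, rng=lambda: 0.999))  # False
```

    The design described in the abstract restricts which replica pairs may attempt such swaps, so that QM replicas never receive structures directly from highly activated MM replicas.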

  3. Direct analysis of organic priority pollutants by IMS

    NASA Technical Reports Server (NTRS)

    Giam, C. S.; Reed, G. E.; Holliday, T. L.; Chang, L.; Rhodes, B. J.

    1995-01-01

    Many routine methods for monitoring trace amounts of atmospheric organic pollutants consist of several steps. Typical steps are: (1) collection of the air sample; (2) trapping of organics from the sample; (3) extraction of the trapped organics; and (4) identification of the organics in the extract by GC (gas chromatography), HPLC (High Performance Liquid Chromatography), or MS (Mass Spectrometry). These methods are often cumbersome and time-consuming. A simple and fast method for monitoring atmospheric organics using an IMS (Ion Mobility Spectrometer) is proposed. This method has a short sampling time and does not require extraction of the organics since the sample is placed directly in the IMS. The purpose of this study was to determine the responses in the IMS to organic 'priority pollutants'. Priority pollutants including representative polycyclic aromatic hydrocarbons (PAHs), phthalates, phenols, chlorinated pesticides, and polychlorinated biphenyls (PCBs) were analyzed in both the positive and negative detection mode at ambient atmospheric pressure. Detection mode and amount detected are presented.

  4. An Accurate Framework for Arbitrary View Pedestrian Detection in Images

    NASA Astrophysics Data System (ADS)

    Fan, Y.; Wen, G.; Qiu, S.

    2018-01-01

    We consider the problem of detecting pedestrians in images collected under various viewpoints. This paper utilizes a novel framework called locality-constrained affine subspace coding (LASC). First, the positive training samples are clustered into entities that represent similar viewpoints. Then principal component analysis (PCA) is used to obtain the shared feature of each viewpoint. Finally, samples that can be reconstructed by linear approximation from their top-k nearest shared features with a small error are regarded as correct detections. No negative samples are required for our method. Histograms of oriented gradient (HOG) features are used as the feature descriptors, and a sliding-window scheme is adopted to detect humans in images. The proposed method exploits the sparse property of intrinsic information and the correlations among multiple-view samples. Experimental results on the INRIA and SDL human datasets show that the proposed method outperforms state-of-the-art methods in terms of both effectiveness and efficiency.
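
    The acceptance rule — a window counts as a detection when its feature vector is reconstructed with small error by the PCA subspace of a viewpoint cluster — can be sketched with toy data. Random 10-D vectors stand in for HOG descriptors, and a single cluster stands in for the k viewpoint clusters; this is an assumption-laden illustration, not the paper's pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy positive samples for one "viewpoint": points near a 2-D plane in R^10.
basis = rng.standard_normal((10, 2))
positives = (rng.standard_normal((200, 2)) @ basis.T
             + 0.01 * rng.standard_normal((200, 10)))

# Shared feature of the cluster: top-2 principal components (PCA via SVD).
mean = positives.mean(axis=0)
_, _, vt = np.linalg.svd(positives - mean, full_matrices=False)
components = vt[:2]

def reconstruction_error(x):
    """Distance from x to its linear approximation in the PCA subspace."""
    z = (x - mean) @ components.T
    return float(np.linalg.norm(x - (mean + z @ components)))

# A sample drawn from the viewpoint subspace reconstructs well (detection);
# an arbitrary vector (a "negative" window) does not (rejection).
on_plane = rng.standard_normal(2) @ basis.T
off_plane = 10 * rng.standard_normal(10)
print(reconstruction_error(on_plane) < 0.5)
print(reconstruction_error(off_plane) < 0.5)
```

    Because only reconstruction error against positive-sample subspaces is thresholded, no negative training samples are needed, matching the claim in the abstract.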

  5. Alpha Matting with KL-Divergence Based Sparse Sampling.

    PubMed

    Karacan, Levent; Erdem, Aykut; Erdem, Erkut

    2017-06-22

    In this paper, we present a new sampling-based alpha matting approach for the accurate estimation of foreground and background layers of an image. Previous sampling-based methods typically rely on certain heuristics in collecting representative samples from known regions, and thus their performance deteriorates if the underlying assumptions are not satisfied. To alleviate this, we take an entirely new approach and formulate sampling as a sparse subset selection problem where we propose to pick a small set of candidate samples that best explains the unknown pixels. Moreover, we describe a new dissimilarity measure for comparing two samples that is based on KL-divergence between the distributions of features extracted in the vicinity of the samples. The proposed framework is general and could be easily extended to video matting by additionally taking temporal information into account in the sampling process. Evaluation on standard benchmark datasets for image and video matting demonstrates that our approach provides more accurate results compared to the state-of-the-art methods.
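
    For Gaussian feature statistics, the KL-divergence between the neighborhoods of two samples has a closed form, and symmetrizing it gives a usable dissimilarity. The sketch below assumes Gaussian-fitted local features; the paper's exact feature set and estimator may differ.

```python
import numpy as np

def gaussian_kl(mu1, cov1, mu2, cov2):
    """KL(N1 || N2) for two multivariate Gaussians, in closed form."""
    k = mu1.shape[0]
    inv2 = np.linalg.inv(cov2)
    diff = mu2 - mu1
    return 0.5 * (np.trace(inv2 @ cov1)
                  + diff @ inv2 @ diff
                  - k
                  + np.log(np.linalg.det(cov2) / np.linalg.det(cov1)))

def symmetric_kl(mu1, cov1, mu2, cov2):
    """Symmetrized KL, suitable as a dissimilarity between two samples."""
    return gaussian_kl(mu1, cov1, mu2, cov2) + gaussian_kl(mu2, cov2, mu1, cov1)

mu, eye = np.zeros(3), np.eye(3)
print(symmetric_kl(mu, eye, mu, eye))        # identical distributions -> 0.0
print(symmetric_kl(mu, eye, mu + 1.0, eye))  # unit mean shift in 3-D -> 3.0
```

    In a matting context, `mu` and `cov` would be estimated from color/texture features gathered in a small window around each candidate sample.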

  6. Measuring the gut microbiome in birds: Comparison of faecal and cloacal sampling.

    PubMed

    Videvall, Elin; Strandh, Maria; Engelbrecht, Anel; Cloete, Schalk; Cornwallis, Charlie K

    2018-05-01

    The gut microbiomes of birds and other animals are increasingly being studied in ecological and evolutionary contexts. Numerous studies on birds and reptiles have made inferences about gut microbiota using cloacal sampling; however, it is not known whether the bacterial community of the cloaca provides an accurate representation of the gut microbiome. We examined the accuracy with which cloacal swabs and faecal samples measure the microbiota in three different parts of the gastrointestinal tract (ileum, caecum, and colon) using a case study on juvenile ostriches, Struthio camelus, and high-throughput 16S rRNA sequencing. We found that faeces were significantly better than cloacal swabs in representing the bacterial community of the colon. Cloacal samples had a higher abundance of Gammaproteobacteria and fewer Clostridia relative to the gut and faecal samples. However, both faecal and cloacal samples were poor representatives of the microbial communities in the caecum and ileum. Furthermore, the accuracy of each sampling method in measuring the abundance of different bacterial taxa was highly variable: Bacteroidetes was the most highly correlated phylum between all three gut sections and both methods, whereas Actinobacteria, for example, was only strongly correlated between faecal and colon samples. Based on our results, we recommend sampling faeces, whenever possible, as this sample type provides the most accurate assessment of the colon microbiome. The fact that neither sampling technique accurately portrayed the bacterial community of the ileum nor the caecum illustrates the difficulty in noninvasively monitoring gut bacteria located further up in the gastrointestinal tract. These results have important implications for the interpretation of avian gut microbiome studies. © 2017 John Wiley & Sons Ltd.

  7. In search of a representative sample of residential building work.

    PubMed

    Lobb, Brenda; Woods, Gregory R

    2012-09-01

    Most research investigating injuries in construction work is limited by reliance on work samples unrepresentative of the multiple, variable-cycle tasks involved, resulting in incomplete characterisation of ergonomic exposures. In this case study, a participatory approach was used including hierarchical task analysis and site observations of a typical team of house builders in New Zealand, over several working days, to obtain a representative work sample. The builders' work consisted of 14 goal-defined jobs using varying subsets of 15 task types, each taking from less than 1 s to more than 1 h and performed in a variety of postures. Task type and duration varied within and between participants and days, although all participants spent at least 25% of the time moving from place to place, mostly carrying materials, and more than half the time either reaching up or bending down to work. This research has provided a description of residential building work based on a work sample more nearly representative than those previously published and has demonstrated a simple, low-cost but robust field observation method that can provide a valid basis for further study of hazard exposures. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  8. Robust inference of population structure for ancestry prediction and correction of stratification in the presence of relatedness.

    PubMed

    Conomos, Matthew P; Miller, Michael B; Thornton, Timothy A

    2015-05-01

    Population structure inference with genetic data has been motivated by a variety of applications in population genetics and genetic association studies. Several approaches have been proposed for the identification of genetic ancestry differences in samples where study participants are assumed to be unrelated, including principal components analysis (PCA), multidimensional scaling (MDS), and model-based methods for proportional ancestry estimation. Many genetic studies, however, include individuals with some degree of relatedness, and existing methods for inferring genetic ancestry fail in related samples. We present a method, PC-AiR, for robust population structure inference in the presence of known or cryptic relatedness. PC-AiR utilizes genome-screen data and an efficient algorithm to identify a diverse subset of unrelated individuals that is representative of all ancestries in the sample. The PC-AiR method directly performs PCA on the identified ancestry representative subset and then predicts components of variation for all remaining individuals based on genetic similarities. In simulation studies and in applications to real data from Phase III of the HapMap Project, we demonstrate that PC-AiR provides a substantial improvement over existing approaches for population structure inference in related samples. We also demonstrate significant efficiency gains, where a single axis of variation from PC-AiR provides better prediction of ancestry in a variety of structure settings than using 10 (or more) components of variation from widely used PCA and MDS approaches. Finally, we illustrate that PC-AiR can provide improved population stratification correction over existing methods in genetic association studies with population structure and relatedness. © 2015 WILEY PERIODICALS, INC.
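
    The two-step PC-AiR idea — run PCA only on an ancestry-representative unrelated subset, then project all remaining individuals onto those axes — can be sketched with toy data. Random Gaussian vectors stand in for genotypes, and the unrelated subset is given rather than inferred (PC-AiR's actual relatedness inference is not shown).

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy genotype-like matrix: two ancestry groups, 100 SNP-like features.
unrelated = np.vstack([rng.normal(0.0, 1, (30, 100)),
                       rng.normal(2.0, 1, (30, 100))])
# "Related" individuals: near-duplicates of unrelated ones (e.g. siblings).
related = unrelated[:10] + 0.1 * rng.standard_normal((10, 100))

# Step 1: PCA on the ancestry-representative unrelated subset only.
mean = unrelated.mean(axis=0)
_, _, vt = np.linalg.svd(unrelated - mean, full_matrices=False)
axis1 = vt[0]  # top axis of variation, separating the two groups

# Step 2: project ALL individuals (unrelated and related) onto that axis.
scores_unrel = (unrelated - mean) @ axis1
scores_rel = (related - mean) @ axis1

# Relatives land next to the individuals they derive from, without having
# distorted the axes the way including them in the PCA itself would.
print(np.allclose(scores_rel, scores_unrel[:10], atol=2.0))
```

    Excluding relatives from the decomposition is what keeps the axes reflecting ancestry rather than family structure; the projection step then recovers ancestry coordinates for everyone.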

  9. Fast transient digitizer

    DOEpatents

    Villa, Francesco

    1982-01-01

    Method and apparatus for sequentially scanning a plurality of target elements with an electron scanning beam modulated in accordance with variations in a high-frequency analog signal to provide discrete analog signal samples representative of successive portions of the analog signal; coupling the discrete analog signal samples from each of the target elements to a different one of a plurality of high speed storage devices; converting the discrete analog signal samples to equivalent digital signals; and storing the digital signals in a digital memory unit for subsequent measurement or display.

  10. Advancing Research on Racial–Ethnic Health Disparities: Improving Measurement Equivalence in Studies with Diverse Samples

    PubMed Central

    Landrine, Hope; Corral, Irma

    2014-01-01

    To conduct meaningful, epidemiologic research on racial–ethnic health disparities, racial–ethnic samples must be rendered equivalent on other social status and contextual variables via statistical controls of those extraneous factors. The racial–ethnic groups must also be equally familiar with and have similar responses to the methods and measures used to collect health data, must have equal opportunity to participate in the research, and must be equally representative of their respective populations. In the absence of such measurement equivalence, studies of racial–ethnic health disparities are confounded by a plethora of unmeasured, uncontrolled correlates of race–ethnicity. Those correlates render the samples, methods, and measures incomparable across racial–ethnic groups, and diminish the ability to attribute health differences discovered to race–ethnicity vs. to its correlates. This paper reviews the non-equivalent yet normative samples, methodologies and measures used in epidemiologic studies of racial–ethnic health disparities, and provides concrete suggestions for improving sample, method, and scalar measurement equivalence. PMID:25566524

  11. Comparison of flume and towing methods for verifying the calibration of a suspended-sediment sampler

    USGS Publications Warehouse

    Beverage, J.P.; Futrell, J.C.

    1986-01-01

    Suspended-sediment samplers must sample isokinetically (at stream velocity) in order to collect representative water samples of rivers. Each sampler solo by the Federal Interagency Sedimentation Project or by the U.S. Geological Survey Hydrologic Instrumentation Facility has been adjusted to sample isokinetically and tested in a flume to verify the calibration. The test program for a modified U.S. P-61 sampler provided an opportunity to compare flume and towing tank tests. Although the two tests yielded statistically distinct results, the difference between them was quite small. The conclusion is that verifying the calibration of any suspended-sediment sampler by either the flume or towing method should give acceptable results.

  12. Fast, exact k-space sample density compensation for trajectories composed of rotationally symmetric segments, and the SNR-optimized image reconstruction from non-Cartesian samples.

    PubMed

    Mitsouras, Dimitris; Mulkern, Robert V; Rybicki, Frank J

    2008-08-01

    A recently developed method for exact density compensation of nonuniformly arranged samples relies on the analytically known cross-correlations of Fourier basis functions corresponding to the traced k-space trajectory. This method produces a linear system whose solution represents compensated samples that normalize the contribution of each independent element of information that can be expressed by the underlying trajectory. Unfortunately, linear system-based density compensation approaches quickly become computationally demanding with increasing number of samples (i.e., image resolution). Here, it is shown that when a trajectory is composed of rotationally symmetric interleaves, such as spiral and PROPELLER trajectories, this cross-correlations method leads to a highly simplified system of equations. Specifically, it is shown that the system matrix is circulant block-Toeplitz so that the linear system is easily block-diagonalized. The method is described and demonstrated for 32-way interleaved spiral trajectories designed for 256 image matrices; samples are compensated noniteratively in a few seconds by solving the small independent block-diagonalized linear systems in parallel. Because the method is exact and considers all the interactions between all acquired samples, up to a 10% reduction in reconstruction error concurrently with an up to 30% increase in signal-to-noise ratio are achieved compared to standard density compensation methods. (c) 2008 Wiley-Liss, Inc.
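
    The speed-up comes from the fact that (block-)circulant systems diagonalize under the discrete Fourier transform. The scalar analogue of that trick, on a hypothetical 4×4 system, is:

```python
import numpy as np

def solve_circulant(first_col, b):
    """Solve C x = b where C is circulant with the given first column.
    Since C x is the circular convolution of first_col with x, taking
    FFTs turns the system into independent scalar divisions."""
    eig = np.fft.fft(first_col)          # eigenvalues of C
    return np.real(np.fft.ifft(np.fft.fft(b) / eig))

c = np.array([4.0, 1.0, 0.0, 1.0])      # first column of a circulant matrix
b = np.array([1.0, 2.0, 3.0, 4.0])
x = solve_circulant(c, b)

# Check against the explicit dense system: C[i, j] = c[(i - j) mod 4].
C = np.array([[c[(i - j) % 4] for j in range(4)] for i in range(4)])
print(np.allclose(C @ x, b))
```

    In the paper the blocks rather than scalars are what decouple, so each FFT bin yields a small dense system solvable independently (and in parallel).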

  13. An Overview of Conventional and Emerging Analytical Methods for the Determination of Mycotoxins

    PubMed Central

    Cigić, Irena Kralj; Prosen, Helena

    2009-01-01

    Mycotoxins are a group of compounds produced by various fungi and excreted into the matrices on which they grow, often food intended for human consumption or animal feed. The high toxicity and carcinogenicity of these compounds and their ability to cause various pathological conditions has led to widespread screening of foods and feeds potentially polluted with them. Maximum permissible levels in different matrices have also been established for some toxins. As these are quite low, analytical methods for determination of mycotoxins have to be both sensitive and specific. In addition, an appropriate sample preparation and pre-concentration method is needed to isolate analytes from rather complicated samples. In this article, an overview of methods for analysis and sample preparation published in the last ten years is given for the most often encountered mycotoxins in different samples, mainly in food. Special emphasis is on liquid chromatography with fluorescence and mass spectrometric detection, while in the field of sample preparation various solid-phase extraction approaches are discussed. However, an overview of other analytical and sample preparation methods less often used is also given. Finally, different matrices where mycotoxins have to be determined are discussed with the emphasis on their specific characteristics important for the analysis (human food and beverages, animal feed, biological samples, environmental samples). Various issues important for accurate qualitative and quantitative analyses are critically discussed: sampling and choice of representative sample, sample preparation and possible bias associated with it, specificity of the analytical method and critical evaluation of results. PMID:19333436

  14. Assessment of Different Sampling Methods for Measuring and Representing Macular Cone Density Using Flood-Illuminated Adaptive Optics

    PubMed Central

    Feng, Shu; Gale, Michael J.; Fay, Jonathan D.; Faridi, Ambar; Titus, Hope E.; Garg, Anupam K.; Michaels, Keith V.; Erker, Laura R.; Peters, Dawn; Smith, Travis B.; Pennesi, Mark E.

    2015-01-01

    Purpose To describe a standardized flood-illuminated adaptive optics (AO) imaging protocol suitable for the clinical setting and to assess sampling methods for measuring cone density. Methods Cone density was calculated following three measurement protocols: 50 × 50-μm sampling window values every 0.5° along the horizontal and vertical meridians (fixed-interval method), the mean density of expanding 0.5°-wide arcuate areas in the nasal, temporal, superior, and inferior quadrants (arcuate mean method), and the peak cone density of a 50 × 50-μm sampling window within expanding arcuate areas near the meridian (peak density method). Repeated imaging was performed in nine subjects to determine intersession repeatability of cone density. Results Cone density montages could be created for 67 of the 74 subjects. Image quality was determined to be adequate for automated cone counting for 35 (52%) of the 67 subjects. We found that cone density varied with different sampling methods and regions tested. In the nasal and temporal quadrants, peak density most closely resembled histological data, whereas the arcuate mean and fixed-interval methods tended to underestimate the density compared with histological data. However, in the inferior and superior quadrants, arcuate mean and fixed-interval methods most closely matched histological data, whereas the peak density method overestimated cone density compared with histological data. Intersession repeatability testing showed that repeatability was greatest when sampling by arcuate mean and lowest when sampling by fixed interval. Conclusions We show that different methods of sampling can significantly affect cone density measurements. Therefore, care must be taken when interpreting cone density results, even in a normal population. PMID:26325414

  15. Flexible sampling large-scale social networks by self-adjustable random walk

    NASA Astrophysics Data System (ADS)

    Xu, Xiao-Ke; Zhu, Jonathan J. H.

    2016-12-01

    Online social networks (OSNs) have become an increasingly attractive gold mine for academic and commercial researchers. However, research on OSNs faces a number of difficult challenges. One bottleneck lies in the massive quantity and often unavailability of OSN population data. Sampling perhaps becomes the only feasible solution to the problems. How to draw samples that can represent the underlying OSNs has remained a formidable task because of a number of conceptual and methodological reasons. Especially, most of the empirically-driven studies on network sampling are confined to simulated data or sub-graph data, which are fundamentally different from real and complete-graph OSNs. In the current study, we propose a flexible sampling method, called Self-Adjustable Random Walk (SARW), and test it against the population data of a real large-scale OSN. We evaluate the strengths of the sampling method in comparison with four prevailing methods, including uniform, breadth-first search (BFS), random walk (RW), and revised RW (i.e., MHRW) sampling. We try to mix both induced-edge and external-edge information of sampled nodes together in the same sampling process. Our results show that the SARW sampling method has been able to generate unbiased samples of OSNs with maximal precision and minimal cost. The study is helpful for the practice of OSN research by providing a highly needed sampling tool, for the methodological development of large-scale network sampling by comparative evaluations of existing sampling methods, and for the theoretical understanding of human networks by highlighting discrepancies and contradictions between existing knowledge/assumptions of large-scale real OSN data.
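
    For contrast with SARW, the plain random-walk (RW) baseline and the degree bias it suffers from can be sketched in a few lines, on a toy star graph (real OSN crawls would fetch neighbor lists via an API rather than from a dict):

```python
import random
from collections import Counter

def random_walk_sample(adj, start, steps, rng):
    """Plain random-walk node sampling on an undirected graph given as an
    adjacency dict. RW visits nodes in proportion to their degree, which
    SARW-style methods adjust for during the walk."""
    node, visits = start, Counter()
    for _ in range(steps):
        node = rng.choice(adj[node])
        visits[node] += 1
    return visits

# Star graph: hub 0 connected to leaves 1..5 (degree 5 vs. degree 1).
adj = {0: [1, 2, 3, 4, 5], **{i: [0] for i in range(1, 6)}}
visits = random_walk_sample(adj, 0, 10_000, random.Random(0))

# The hub is visited on exactly half of all steps, although it is only
# one node in six -- the degree-proportional bias RW corrections target.
print(visits[0] / 10_000)
```

    Uniform sampling, BFS, MHRW, and SARW each trade off this bias against crawling cost in different ways, which is what the comparative evaluation in the abstract measures.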

  16. Lightning vulnerability of fiber-optic cables.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martinez, Leonard E.; Caldwell, Michele

    2008-06-01

    One reason to use optical fibers to transmit data is for isolation from unintended electrical energy. Using fiber optics in an application where the fiber cable/system penetrates the aperture of a grounded enclosure serves two purposes: first, it allows for control signals to be transmitted where they are required, and second, the insulating properties of the fiber system help to electrically isolate the fiber terminations on the inside of the grounded enclosure. A fundamental question is whether fiber optic cables can allow electrical energy to pass through a grounded enclosure, with a lightning strike representing an extreme but very important case. A DC test bed capable of producing voltages up to 200 kV was used to characterize electrical properties of a variety of fiber optic cable samples. Leakage currents in the samples were measured with a micro-ammeter. In addition to the leakage current measurements, samples were also tested to DC voltage breakdown. After the fiber optic cable samples were tested with DC methods, they were tested under representative lightning conditions at the Sandia Lightning Simulator (SLS). Simulated lightning currents of 30 kA and 200 kA were selected for this test series. This paper documents measurement methods and test results for DC high voltage and simulated lightning tests performed at the Sandia Lightning Simulator on fiber optic cables. The tests performed at the SLS evaluated whether electrical energy can be conducted inside or along the surface of a fiber optic cable into a grounded enclosure under representative lightning conditions.

  17. Determination of Plastic Properties of a Material by Spherical Indentation Based on the Representative Stress Approach

    NASA Astrophysics Data System (ADS)

    Budiarsa, I. N.; Gde Antara, I. N.; Dharma, Agus; Karnata, I. N.

    2018-04-01

    Under an indentation, the material undergoes complex deformation. One of the most effective ways to analyse indentation has been the representative stress method. The concept, coupled with finite element (FE) modelling, has been used successfully in analysing sharp indenters. It is of great importance to extend this method to spherical indentation and the associated hardness systems. One particular case is the Rockwell B test, where the hardness is determined by two points on the P-h curve of a spherical indenter. In this case, an established link between material parameters and P-h curves can naturally lead to direct hardness estimation from the material parameters (e.g. yield stress (σy) and work hardening coefficient (n)). This could provide a useful tool for both research and industrial applications. Two methods to predict the P-h curve in spherical indentation have been established: one uses a C1-C2 polynomial equation approach and the other a depth-based approach. Both approaches have been successful. An effective method of representing the P-h curves using a normalized representative stress concept was established. The concept and methodology developed are used to predict hardness (HRB) values of materials through direct analysis and validated with experimental data on selected steel samples.

  18. Semi-Supervised Sparse Representation Based Classification for Face Recognition With Insufficient Labeled Samples

    NASA Astrophysics Data System (ADS)

    Gao, Yuan; Ma, Jiayi; Yuille, Alan L.

    2017-05-01

    This paper addresses the problem of face recognition when there are only a few, or even only a single, labeled examples of the face that we wish to recognize. Moreover, these examples are typically corrupted by nuisance variables, both linear (i.e., additive nuisance variables such as bad lighting, wearing of glasses) and non-linear (i.e., non-additive pixel-wise nuisance variables such as expression changes). The small number of labeled examples means that it is hard to remove these nuisance variables between the training and testing faces to obtain good recognition performance. To address the problem we propose a method called Semi-Supervised Sparse Representation based Classification (S³RC). This is based on recent work on sparsity where faces are represented in terms of two dictionaries: a gallery dictionary consisting of one or more examples of each person, and a variation dictionary representing linear nuisance variables (e.g., different lighting conditions, different glasses). The main idea is that (i) we use the variation dictionary to characterize the linear nuisance variables via the sparsity framework, then (ii) prototype face images are estimated as a gallery dictionary via a Gaussian Mixture Model (GMM), with mixed labeled and unlabeled samples in a semi-supervised manner, to deal with the non-linear nuisance variations between labeled and unlabeled samples. We have done experiments with insufficient labeled samples, even when there is only a single labeled sample per person. Our results on the AR, Multi-PIE, CAS-PEAL, and LFW databases demonstrate that the proposed method is able to deliver significantly improved performance over existing methods.

  19. Efficient statistical tests to compare Youden index: accounting for contingency correlation.

    PubMed

    Chen, Fangyao; Xue, Yuqiang; Tan, Ming T; Chen, Pingyan

    2015-04-30

    Youden index is widely utilized in studies evaluating accuracy of diagnostic tests and performance of predictive, prognostic, or risk models. However, both one- and two-independent-sample tests on the Youden index have been derived ignoring the dependence (association) between sensitivity and specificity, resulting in potentially misleading findings. Moreover, a paired-sample test on the Youden index has been unavailable. This article develops efficient statistical inference procedures for one-sample, independent-sample, and paired-sample tests on the Youden index by accounting for contingency correlation, namely associations between sensitivity and specificity and paired samples typically represented in contingency tables. For one and two independent sample tests, the variances are estimated by the Delta method, and the statistical inference is based on the central limit theory, which are then verified by bootstrap estimates. For the paired-samples test, we show that the estimated covariance of the two sensitivities and specificities can be represented as a function of the kappa statistic so the test can be readily carried out. We then show the remarkable accuracy of the estimated variance using a constrained optimization approach. Simulation is performed to evaluate the statistical properties of the derived tests. The proposed approaches yield more stable type I errors at the nominal level and substantially higher power (efficiency) than does the original Youden's approach. Therefore, the simple explicit large sample solution performs very well. Because we can readily implement the asymptotic and exact bootstrap computation with common software like R, the method is broadly applicable to the evaluation of diagnostic tests and model performance. Copyright © 2015 John Wiley & Sons, Ltd.
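
    Youden's J itself is elementary to compute from a 2×2 contingency table, and a bootstrap gives a serviceable standard error where the paper derives Delta-method variances. The counts below are hypothetical.

```python
import random

def youden_index(tp, fn, tn, fp):
    """Youden's J = sensitivity + specificity - 1, from a 2x2 table."""
    return tp / (tp + fn) + tn / (tn + fp) - 1.0

# Hypothetical test: 90/100 diseased flagged, 80/100 healthy cleared.
j = youden_index(tp=90, fn=10, tn=80, fp=20)
print(round(j, 3))  # 0.9 + 0.8 - 1 = 0.7

def bootstrap_se(tp, fn, tn, fp, reps=2000, seed=0):
    """Bootstrap SE of J, resampling diseased and healthy groups separately
    (a simple alternative to the Delta-method variance discussed above)."""
    rng = random.Random(seed)
    n_pos, n_neg = tp + fn, tn + fp
    js = []
    for _ in range(reps):
        btp = sum(rng.random() < tp / n_pos for _ in range(n_pos))
        btn = sum(rng.random() < tn / n_neg for _ in range(n_neg))
        js.append(youden_index(btp, n_pos - btp, btn, n_neg - btn))
    m = sum(js) / reps
    return (sum((x - m) ** 2 for x in js) / (reps - 1)) ** 0.5

print(round(bootstrap_se(90, 10, 80, 20), 2))
```

    Note this independent-group bootstrap, like the classical variance, ignores any correlation between sensitivity and specificity; handling that correlation is precisely the paper's contribution.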

  20. On the use of the Reciprocity Gap Functional in inverse scattering with near-field data: An application to mammography

    NASA Astrophysics Data System (ADS)

    Delbary, Fabrice; Aramini, Riccardo; Bozza, Giovanni; Brignone, Massimo; Piana, Michele

    2008-11-01

    Microwave tomography is a non-invasive approach to the early diagnosis of breast cancer. However, the problem of visualizing tumors from diffracted microwaves is a difficult nonlinear ill-posed inverse scattering problem. We propose a qualitative approach to the solution of such a problem, whereby the shape and location of cancerous tissues can be detected by means of a combination of the Reciprocity Gap Functional method and the Linear Sampling method. We validate this approach on synthetic near-field data produced by a finite element method for boundary integral equations, where the breast is mimicked by the axial view of two nested cylinders, the external one representing the skin and the internal one representing the fat tissue.

  1. When continuous observations just won't do: developing accurate and efficient sampling strategies for the laying hen.

    PubMed

    Daigle, Courtney L; Siegford, Janice M

    2014-03-01

    Continuous observation is the most accurate way to determine animals' actual time budget and can provide a 'gold standard' representation of resource use, behavior frequency, and duration. Continuous observation is useful for capturing behaviors that are of short duration or occur infrequently. However, collecting continuous data is labor intensive and time consuming, making multiple individual or long-term data collection difficult. Six non-cage laying hens were video recorded for 15 h and behavioral data collected every 2 s were compared with data collected using scan sampling intervals of 5, 10, 15, 30, and 60 min and subsamples of 2 second observations performed for 10 min every 30 min, 15 min every 1 h, 30 min every 1.5 h, and 15 min every 2 h. Three statistical approaches were used to provide a comprehensive analysis to examine the quality of the data obtained via different sampling methods. General linear mixed models identified how the time budget from the sampling techniques differed from continuous observation. Correlation analysis identified how strongly results from the sampling techniques were associated with those from continuous observation. Regression analysis identified how well the results from the sampling techniques were associated with those from continuous observation, changes in magnitude, and whether a sampling technique had bias. Static behaviors were well represented with scan and time sampling techniques, while dynamic behaviors were best represented with time sampling techniques. Methods for identifying an appropriate sampling strategy based upon the type of behavior of interest are outlined and results for non-caged laying hens are presented. Copyright © 2013 Elsevier B.V. All rights reserved.
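
    The comparison above is easy to reproduce in miniature: simulate a 2-s-resolution behavior record and see how a momentary-time-sampling estimate of the time budget tracks the continuous "gold standard". The record below is an i.i.d. toy (it ignores the bout structure real behavior has), so it illustrates the mechanics rather than the hens' results.

```python
import random

rng = random.Random(42)

# Simulate 15 h of observation at 2-s resolution: 1 = behavior present.
# A short-duration behavior occupying ~10% of the true time budget.
n = 15 * 3600 // 2
record = [1 if rng.random() < 0.10 else 0 for _ in range(n)]

def scan_sample(record, every_n):
    """Momentary time sampling: inspect only every `every_n`-th point."""
    scans = record[::every_n]
    return sum(scans) / len(scans)

true_budget = sum(record) / n                     # continuous observation
estimate_5min = scan_sample(record, every_n=150)  # one scan per 5 minutes
print(round(true_budget, 3), round(estimate_5min, 3))
```

    With only 180 scans the estimate carries a sampling error of a couple of percentage points, and for rarer or shorter behaviors the error grows, which is why the study found low-duration behaviors poorly represented by sparse scans.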

  2. A Generalized Least Squares Regression Approach for Computing Effect Sizes in Single-Case Research: Application Examples

    ERIC Educational Resources Information Center

    Maggin, Daniel M.; Swaminathan, Hariharan; Rogers, Helen J.; O'Keeffe, Breda V.; Sugai, George; Horner, Robert H.

    2011-01-01

    A new method for deriving effect sizes from single-case designs is proposed. The strategy is applicable to small-sample time-series data with autoregressive errors. The method uses Generalized Least Squares (GLS) to model the autocorrelation of the data and estimate regression parameters to produce an effect size that represents the magnitude of…

  3. Risk Factors of Falls in Community-Dwelling Older Adults: Logistic Regression Tree Analysis

    ERIC Educational Resources Information Center

    Yamashita, Takashi; Noe, Douglas A.; Bailer, A. John

    2012-01-01

    Purpose of the Study: A novel logistic regression tree-based method was applied to identify fall risk factors and possible interaction effects of those risk factors. Design and Methods: A nationally representative sample of American older adults aged 65 years and older (N = 9,592) in the Health and Retirement Study 2004 and 2006 modules was used.…

  4. 40 CFR Appendix 6 to Subpart A of... - Reverse Phase Extraction (RPE) Method for Detection of Oil Contamination in Non-Aqueous Drilling...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... rigorous statistical experimental design and interpretation (Reference 16.4). 14.0 Pollution Prevention 14... fluids. 1.4 This method has been designed to show positive contamination for 5% of representative crude....1 Sample collection bottles/jars—New, pre-cleaned bottles/jars, lot-certified to be free of artifacts...

  5. 40 CFR Appendix 6 to Subpart A of... - Reverse Phase Extraction (RPE) Method for Detection of Oil Contamination in Non-Aqueous Drilling...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... rigorous statistical experimental design and interpretation (Reference 16.4). 14.0 Pollution Prevention 14... fluids. 1.4 This method has been designed to show positive contamination for 5% of representative crude....1 Sample collection bottles/jars—New, pre-cleaned bottles/jars, lot-certified to be free of artifacts...

  6. 40 CFR Appendix 6 to Subpart A of... - Reverse Phase Extraction (RPE) Method for Detection of Oil Contamination in Non-Aqueous Drilling...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... rigorous statistical experimental design and interpretation (Reference 16.4). 14.0 Pollution Prevention 14... fluids. 1.4 This method has been designed to show positive contamination for 5% of representative crude....1 Sample collection bottles/jars—New, pre-cleaned bottles/jars, lot-certified to be free of artifacts...

  7. Examination of Hydrate Formation Methods: Trying to Create Representative Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kneafsey, T.J.; Rees, E.V.L.; Nakagawa, S.

    2011-04-01

    Forming representative gas hydrate-bearing laboratory samples is important so that the properties of these materials may be measured while controlling the composition and other variables. Natural samples are rare, and have often experienced pressure and temperature changes that may affect the property to be measured [Waite et al., 2008]. Forming methane hydrate samples in the laboratory has been done a number of ways, each having advantages and disadvantages. The ice-to-hydrate method [Stern et al., 1996] contacts melting ice with methane at the appropriate pressure to form hydrate. The hydrate can then be crushed, mixed with mineral grains under controlled conditions, and compacted to create laboratory samples of methane hydrate in a mineral medium. The hydrate in these samples will be part of the load-bearing frame of the medium. In the excess gas method [Handa and Stupin, 1992], water is distributed throughout a mineral medium (e.g., packed moist sand, drained sand, moistened silica gel, other porous media) and the mixture is brought to hydrate-stable conditions (chilled and pressurized with gas), allowing hydrate to form. This method typically produces grain-cementing hydrate from pendular water in sand [Waite et al., 2004]. In the dissolved gas method [Tohidi et al., 2002], water with sufficient dissolved guest molecules is brought to hydrate-stable conditions where hydrate forms. In the laboratory, this can be done by pre-dissolving the gas of interest in water and then introducing it to the sample under the appropriate conditions. With this method, it is easier to form hydrate from more soluble gases such as carbon dioxide. It is thought that this method more closely simulates the way most natural gas hydrate has formed. Laboratory implementation, however, is difficult, and sample formation is prohibitively time consuming [Minagawa et al., 2005; Spangenberg and Kulenkampff, 2005].
In another version of this technique, a specified quantity of gas is placed in a sample, then the sample is flooded with water and cooled [Priest et al., 2009]. We have performed a number of tests in which hydrate was formed and the uniformity of the hydrate formation was examined. These tests have primarily used a variety of modifications of the excess gas method to make the hydrate, although we have also used a version of the excess water technique. Early on, we found difficulties in creating uniform samples with a particular sand/initial water saturation combination (F-110 sand, ~35% initial water saturation). In many of our tests we selected this combination intentionally to determine whether we could use a method to make the samples uniform. The following methods were examined: excess gas; freeze/thaw/form; freeze/pressurize/thaw; excess gas followed by water saturation; excess water; sand and kaolinite; use of a nucleation enhancer (SnoMax); and use of salt in the water. Below, each method, the underlying hypothesis, and our results are briefly presented, followed by a brief conclusion. Many of the hypotheses investigated are not our own, but were presented to us. Much of the data presented is from x-ray CT scanning of our samples. The x-ray CT scanner provides a three-dimensional density map of a sample. From this map and the physics occurring in our samples, we are able to gain an understanding of the spatial nature of the processes that occur, and attribute them to the locations where they occur.

  8. Method and apparatus for improved observation of in-situ combustion processes

    DOEpatents

    Lee, D.O.; Montoya, P.C.; Wayland, J.R. Jr.

    Method and apparatus are provided for obtaining accurate dynamic measurements for passage of phase fronts through a core sample in a test fixture. Flow-through grid structures are provided for electrodes to permit data to be obtained before, during and after passage of a front there-through. Such electrodes are incorporated in a test apparatus for obtaining electrical characteristics of the core sample. With the inventive structure a method is provided for measurement of instabilities in a phase front progressing through the medium. Availability of accurate dynamic data representing parameters descriptive of material characteristics before, during and after passage of a front provides a more efficient method for enhanced recovery of oil using a fire flood technique. 6 figures, 2 tables.

  9. Assessment of Different Sampling Methods for Measuring and Representing Macular Cone Density Using Flood-Illuminated Adaptive Optics.

    PubMed

    Feng, Shu; Gale, Michael J; Fay, Jonathan D; Faridi, Ambar; Titus, Hope E; Garg, Anupam K; Michaels, Keith V; Erker, Laura R; Peters, Dawn; Smith, Travis B; Pennesi, Mark E

    2015-09-01

    To describe a standardized flood-illuminated adaptive optics (AO) imaging protocol suitable for the clinical setting and to assess sampling methods for measuring cone density. Cone density was calculated following three measurement protocols: 50 × 50-μm sampling window values every 0.5° along the horizontal and vertical meridians (fixed-interval method), the mean density of expanding 0.5°-wide arcuate areas in the nasal, temporal, superior, and inferior quadrants (arcuate mean method), and the peak cone density of a 50 × 50-μm sampling window within expanding arcuate areas near the meridian (peak density method). Repeated imaging was performed in nine subjects to determine intersession repeatability of cone density. Cone density montages could be created for 67 of the 74 subjects. Image quality was determined to be adequate for automated cone counting for 35 (52%) of the 67 subjects. We found that cone density varied with different sampling methods and regions tested. In the nasal and temporal quadrants, peak density most closely resembled histological data, whereas the arcuate mean and fixed-interval methods tended to underestimate the density compared with histological data. However, in the inferior and superior quadrants, arcuate mean and fixed-interval methods most closely matched histological data, whereas the peak density method overestimated cone density compared with histological data. Intersession repeatability testing showed that repeatability was greatest when sampling by arcuate mean and lowest when sampling by fixed interval. We show that different methods of sampling can significantly affect cone density measurements. Therefore, care must be taken when interpreting cone density results, even in a normal population.
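
    The difference between the fixed-interval and peak-density protocols can be illustrated on a synthetic density grid; the grid values, location, and search radius below are assumptions for illustration, not the paper's data or window geometry.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic cone-density map (cones/mm^2), one value per 50 x 50-um window.
density = rng.normal(15000, 2000, size=(21, 21))

def fixed_interval(grid, row, col):
    """Density of the single sampling window at a fixed location."""
    return grid[row, col]

def peak_density(grid, row, col, radius=2):
    """Maximum window density within a small region around the location."""
    region = grid[max(0, row - radius):row + radius + 1,
                  max(0, col - radius):col + radius + 1]
    return region.max()

loc = (10, 10)
print(fixed_interval(density, *loc))
print(peak_density(density, *loc))
```

    By construction the peak estimate can never fall below the fixed-interval estimate at the same location, which is consistent with the abstract's finding that the peak method tends to report higher densities.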

  10. Efficacy of Pitfall Trapping, Winkler and Berlese Extraction Methods for Measuring Ground-Dwelling Arthropods in Moist-Deciduous Forests in the Western Ghats

    PubMed Central

    Sabu, Thomas K.; Shiju, Raj T.

    2010-01-01

    The present study provides data to decide on the most appropriate method for sampling ground-dwelling arthropods in a moist-deciduous forest in the Western Ghats in South India. The abundance of ground-dwelling arthropods was compared among large numbers of samples obtained using pitfall trapping and Berlese and Winkler extraction methods. The highest abundance and frequency of most of the represented taxa indicated pitfall trapping as the ideal method for sampling ground-dwelling arthropods. However, with its possible bias towards surface-active taxa, pitfall-trapping data are inappropriate for quantitative studies; Berlese extraction is the better method for quantitative measurements, whereas pitfall trapping would be appropriate for qualitative measurements. A comparison of the Berlese and Winkler extraction data shows that in a quantitative multigroup approach, Winkler extraction was inferior to Berlese extraction because the total number of arthropods caught was the lowest, and many of the taxa that were caught from an identical sample via Berlese extraction were not caught. A significantly greater frequency and abundance of arthropods belonging to Orthoptera, Blattaria, and Diptera occurred in pitfall-trapped samples, and of Psocoptera and Acariformes in Berlese-extracted samples, than were obtained with the other methods, indicating that both methods are useful, one complementing the other and eliminating the chance of under-representation of taxa in quantitative studies. PMID:20673122

  11. Method For Detecting The Presence Of A Ferromagnetic Object

    DOEpatents

    Roybal, Lyle G.

    2000-11-21

    A method for detecting a presence or an absence of a ferromagnetic object within a sensing area may comprise the steps of sensing, during a sample time, a magnetic field adjacent the sensing area; producing surveillance data representative of the sensed magnetic field; determining an absolute value difference between a maximum datum and a minimum datum comprising the surveillance data; and determining whether the absolute value difference has a positive or negative sign. The absolute value difference and the corresponding positive or negative sign thereof forms a representative surveillance datum that is indicative of the presence or absence in the sensing area of the ferromagnetic material.
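
    A minimal sketch of the surveillance datum described in the patent abstract: the magnitude of the max-min spread of magnetometer samples, paired with a sign. How the sign is assigned is an assumption here (sign of the larger excursion from a baseline), as is the threshold test; neither detail is spelled out in the abstract.

```python
def surveillance_datum(samples, baseline=0.0):
    """Absolute max-min spread of the sensed field, with an assumed sign."""
    spread = max(samples) - min(samples)
    # Assumed convention: positive if the maximum deviates further from
    # the baseline than the minimum does, negative otherwise.
    sign = 1 if abs(max(samples) - baseline) >= abs(min(samples) - baseline) else -1
    return sign * spread

def ferromagnetic_present(samples, threshold):
    """Flag a detection when the spread exceeds a noise threshold."""
    return abs(surveillance_datum(samples)) > threshold

quiet = [0.01, -0.02, 0.015, 0.0]      # background field only
disturbed = [0.01, 0.4, -0.1, 0.02]    # object passing through

print(ferromagnetic_present(quiet, threshold=0.1))      # False
print(ferromagnetic_present(disturbed, threshold=0.1))  # True
```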

  12. The Efficacy of Consensus Tree Methods for Summarizing Phylogenetic Relationships from a Posterior Sample of Trees Estimated from Morphological Data.

    PubMed

    O'Reilly, Joseph E; Donoghue, Philip C J

    2018-03-01

    Consensus trees are required to summarize trees obtained through MCMC sampling of a posterior distribution, providing an overview of the distribution of estimated parameters such as topology, branch lengths, and divergence times. Numerous consensus tree construction methods are available, each presenting a different interpretation of the tree sample. The rise of morphological clock and sampled-ancestor methods of divergence time estimation, in which times and topology are coestimated, has increased the popularity of the maximum clade credibility (MCC) consensus tree method. The MCC method assumes that the sampled, fully resolved topology with the highest clade credibility is an adequate summary of the most probable clades, with parameter estimates from compatible sampled trees used to obtain the marginal distributions of parameters such as clade ages and branch lengths. Using both simulated and empirical data, we demonstrate that MCC trees, and trees constructed using the similar maximum a posteriori (MAP) method, often include poorly supported and incorrect clades when summarizing diffuse posterior samples of trees. We demonstrate that the paucity of information in morphological data sets contributes to the inability of MCC and MAP trees to accurately summarize the posterior distribution. Conversely, majority-rule consensus (MRC) trees represent a lower proportion of incorrect nodes when summarizing the same posterior samples of trees. Thus, we advocate the use of MRC trees, in place of MCC or MAP trees, in attempts to summarize the results of Bayesian phylogenetic analyses of morphological data.
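
    The majority-rule consensus the authors advocate can be sketched by counting clade frequencies across a posterior sample and keeping only clades present in more than half of the trees. For illustration, each tree is reduced to a set of clades (frozensets of taxon labels); real consensus tools operate on full tree structures.

```python
from collections import Counter

def majority_rule_clades(tree_sample):
    """Clades appearing in more than half of the sampled trees."""
    counts = Counter(clade for tree in tree_sample for clade in tree)
    cutoff = len(tree_sample) / 2
    return {clade for clade, n in counts.items() if n > cutoff}

# Three posterior samples over taxa A-D; only clade {A, B} is in a majority.
trees = [
    {frozenset("AB"), frozenset("ABC")},
    {frozenset("AB"), frozenset("ABD")},
    {frozenset("AB"), frozenset("CD")},
]
print(majority_rule_clades(trees))  # only frozenset({'A', 'B'}) survives
```

    Clades below the 50% cutoff collapse into polytomies rather than being resolved arbitrarily, which is why MRC trees carry fewer incorrect nodes than MCC or MAP summaries of diffuse samples.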

  13. The Efficacy of Consensus Tree Methods for Summarizing Phylogenetic Relationships from a Posterior Sample of Trees Estimated from Morphological Data

    PubMed Central

    O’Reilly, Joseph E; Donoghue, Philip C J

    2018-01-01

    Abstract Consensus trees are required to summarize trees obtained through MCMC sampling of a posterior distribution, providing an overview of the distribution of estimated parameters such as topology, branch lengths, and divergence times. Numerous consensus tree construction methods are available, each presenting a different interpretation of the tree sample. The rise of morphological clock and sampled-ancestor methods of divergence time estimation, in which times and topology are coestimated, has increased the popularity of the maximum clade credibility (MCC) consensus tree method. The MCC method assumes that the sampled, fully resolved topology with the highest clade credibility is an adequate summary of the most probable clades, with parameter estimates from compatible sampled trees used to obtain the marginal distributions of parameters such as clade ages and branch lengths. Using both simulated and empirical data, we demonstrate that MCC trees, and trees constructed using the similar maximum a posteriori (MAP) method, often include poorly supported and incorrect clades when summarizing diffuse posterior samples of trees. We demonstrate that the paucity of information in morphological data sets contributes to the inability of MCC and MAP trees to accurately summarize the posterior distribution. Conversely, majority-rule consensus (MRC) trees represent a lower proportion of incorrect nodes when summarizing the same posterior samples of trees. Thus, we advocate the use of MRC trees, in place of MCC or MAP trees, in attempts to summarize the results of Bayesian phylogenetic analyses of morphological data. PMID:29106675

  14. Comparison of water-quality samples collected by siphon samplers and automatic samplers in Wisconsin

    USGS Publications Warehouse

    Graczyk, David J.; Robertson, Dale M.; Rose, William J.; Steur, Jeffrey J.

    2000-01-01

    In small streams, flow and water-quality concentrations often change quickly in response to meteorological events. Hydrologists, field technicians, or locally hired stream observers involved in water-data collection are often unable to reach streams quickly enough to observe or measure these rapid changes. Therefore, in hydrologic studies designed to describe changes in water quality, a combination of manual and automated sampling methods has commonly been used: manual methods when flow is relatively stable, and automated methods when flow is rapidly changing. Automated sampling, which makes use of equipment programmed to collect samples in response to changes in stage and flow of a stream, has been shown to be an effective method of sampling to describe rapid changes in water quality (Graczyk and others, 1993). Because of the high cost of automated sampling, however, especially for studies examining a large number of sites, alternative methods have been considered for collecting samples during rapidly changing stream conditions. One such method employs the siphon sampler (fig. 1), also referred to as the "single-stage sampler." Siphon samplers are inexpensive to build (about $25-$50 per sampler), operate, and maintain, so they are cost effective to use at a large number of sites. Their ability to collect samples representing the average quality of water passing through the entire cross section of a stream, however, has not been fully demonstrated for many types of stream sites.

  15. Progress in Developing Transfer Functions for Surface Scanning Eddy Current Inspections

    NASA Astrophysics Data System (ADS)

    Shearer, J.; Heebl, J.; Brausch, J.; Lindgren, E.

    2009-03-01

    As US Air Force (USAF) aircraft continue to age, additional inspections are required for structural components. The validation of new inspections typically requires a capability demonstration of the method using representative structure with representative damage. To minimize the time and cost required to prepare such samples, Electric Discharge machined (EDM) notches are commonly used to represent fatigue cracks in validation studies. However, the sensitivity to damage typically changes as a function of damage type. This requires a mathematical relationship to be developed between the responses from the two different flaw types to enable the use of EDM notched samples to validate new inspections. This paper reviews progress to develop transfer functions for surface scanning eddy current inspections of aluminum and titanium alloys found in structural aircraft components. Multiple samples with well characterized grown fatigue cracks and master gages with EDM notches, both with a range of flaw sizes, were used to collect flaw signals with USAF field inspection equipment. Analysis of this empirical data was used to develop a transfer function between the response from the EDM notches and grown fatigue cracks.

  16. Improving response rate and quality of survey data with a scratch lottery ticket incentive

    PubMed Central

    2012-01-01

    Background The quality of data collected in survey research is usually indicated by the response rate, the representativeness of the sample, and the rate of completed questions (item-response). In attempting to improve a generally declining response rate in surveys, considerable efforts are made through follow-up mailings and various types of incentives. This study examines the effects of including a scratch lottery ticket in the invitation letter to a survey. Method Questionnaires concerning oral health were mailed to a random sample of 2,400 adults. A systematically selected half of the sample (1,200 adults) received a questionnaire including a scratch lottery ticket. One reminder without the incentive was sent. Results The incentive increased the response rate and improved representativeness by reaching more respondents with lower education. Furthermore, it reduced item nonresponse. The initial incentive had no effect on the propensity to respond after the reminder. Conclusion When attempting to improve survey data, three issues become important: response rate, representativeness, and item-response. This study shows that including a scratch lottery ticket in the invitation letter performs well on all three. PMID:22515335

  17. A parallel spatiotemporal saliency and discriminative online learning method for visual target tracking in aerial videos.

    PubMed

    Aghamohammadi, Amirhossein; Ang, Mei Choo; A Sundararajan, Elankovan; Weng, Ng Kok; Mogharrebi, Marzieh; Banihashem, Seyed Yashar

    2018-01-01

    Visual tracking in aerial videos is a challenging task in computer vision and remote sensing technologies due to appearance variation difficulties. Appearance variations are caused by camera and target motion, low-resolution noisy images, scale changes, and pose variations. Various approaches have been proposed to deal with appearance variation difficulties in aerial videos, and amongst these methods, the spatiotemporal saliency detection approach reported promising results in the context of moving target detection. However, it is not accurate for moving target detection when visual tracking is performed under appearance variations. In this study, a visual tracking method is proposed based on spatiotemporal saliency and discriminative online learning methods to deal with appearance variation difficulties. Temporal saliency is used to represent moving target regions, and it was extracted based on the frame difference with Sauvola local adaptive thresholding algorithms. The spatial saliency is used to represent the target appearance details in candidate moving regions. SLIC superpixel segmentation, color, and moment features are used to compute feature uniqueness and spatial compactness of saliency measurements to detect spatial saliency. This is a time-consuming process, which prompted the development of a parallel algorithm to optimize and distribute the saliency detection processes across multiple processors. Spatiotemporal saliency is then obtained by combining the temporal and spatial saliencies to represent moving targets. Finally, a discriminative online learning algorithm was applied to generate a sample model based on spatiotemporal saliency. This sample model is then incrementally updated to detect the target under appearance variation conditions. Experiments conducted on the VIVID dataset demonstrated that the proposed visual tracking method is effective and is computationally efficient compared to state-of-the-art methods.
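
    A simplified sketch of the temporal-saliency step described above: frame differencing followed by thresholding to flag candidate moving pixels. The abstract uses Sauvola local adaptive thresholding; a fixed global threshold is substituted here, and the frames are synthetic, to keep the example self-contained.

```python
import numpy as np

def temporal_saliency(prev_frame, frame, threshold=30):
    """Boolean mask of pixels whose intensity changed by more than threshold."""
    diff = np.abs(frame.astype(int) - prev_frame.astype(int))
    return diff > threshold

prev_frame = np.zeros((4, 4), dtype=np.uint8)  # static background
frame = prev_frame.copy()
frame[1:3, 1:3] = 200                          # a bright "target" appears

mask = temporal_saliency(prev_frame, frame)
print(mask.sum())  # 4 pixels flagged as moving
```

    In the full method, this mask only proposes candidate moving regions; spatial saliency over superpixel features then refines the target's appearance within them.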

  18. A parallel spatiotemporal saliency and discriminative online learning method for visual target tracking in aerial videos

    PubMed Central

    2018-01-01

    Visual tracking in aerial videos is a challenging task in computer vision and remote sensing technologies due to appearance variation difficulties. Appearance variations are caused by camera and target motion, low-resolution noisy images, scale changes, and pose variations. Various approaches have been proposed to deal with appearance variation difficulties in aerial videos, and amongst these methods, the spatiotemporal saliency detection approach reported promising results in the context of moving target detection. However, it is not accurate for moving target detection when visual tracking is performed under appearance variations. In this study, a visual tracking method is proposed based on spatiotemporal saliency and discriminative online learning methods to deal with appearance variation difficulties. Temporal saliency is used to represent moving target regions, and it was extracted based on the frame difference with Sauvola local adaptive thresholding algorithms. The spatial saliency is used to represent the target appearance details in candidate moving regions. SLIC superpixel segmentation, color, and moment features are used to compute feature uniqueness and spatial compactness of saliency measurements to detect spatial saliency. This is a time-consuming process, which prompted the development of a parallel algorithm to optimize and distribute the saliency detection processes across multiple processors. Spatiotemporal saliency is then obtained by combining the temporal and spatial saliencies to represent moving targets. Finally, a discriminative online learning algorithm was applied to generate a sample model based on spatiotemporal saliency. This sample model is then incrementally updated to detect the target under appearance variation conditions. Experiments conducted on the VIVID dataset demonstrated that the proposed visual tracking method is effective and is computationally efficient compared to state-of-the-art methods. PMID:29438421

  19. 40 CFR Appendix A to Subpart E of... - Interim Transmission Electron Microscopy Analytical Methods-Mandatory and Nonmandatory-and...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... representative of the air entering the abatement site. c. Two field blanks are to be taken by removing the cap... of Sample Data Quality Objectives is shown in the following Table II: EC01AP92.003 C. Sample Shipment... or the Burdette procedure (Ref. 7 of Unit II.J.) ii. Plasma etching of the collapsed filter is...

  20. 40 CFR Appendix A to Subpart E of... - Interim Transmission Electron Microscopy Analytical Methods-Mandatory and Nonmandatory-and...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... representative of the air entering the abatement site. c. Two field blanks are to be taken by removing the cap... of Sample Data Quality Objectives is shown in the following Table II: EC01AP92.003 C. Sample Shipment... or the Burdette procedure (Ref. 7 of Unit II.J.) ii. Plasma etching of the collapsed filter is...

  1. Multivariate analysis of light scattering spectra of liquid dairy products

    NASA Astrophysics Data System (ADS)

    Khodasevich, M. A.

    2010-05-01

    Visible light scattering spectra from the surface layer of samples of commercial liquid dairy products are recorded with a colorimeter. The principal component method is used to analyze these spectra. Vectors representing the samples of dairy products in a multidimensional space of spectral counts are projected onto a three-dimensional subspace of principal components. The magnitudes of these projections are found to depend on the type of dairy product.
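
    A minimal sketch of the projection step described above, using SVD-based principal components on synthetic spectra (the colorimeter data themselves are not reproduced here): each sample's spectrum becomes a three-vector of projections onto the leading principal axes.

```python
import numpy as np

rng = np.random.default_rng(42)
spectra = rng.normal(size=(20, 100))  # 20 samples x 100 spectral counts

centered = spectra - spectra.mean(axis=0)
# SVD of the centered data: rows of Vt are the principal axes,
# ordered by decreasing explained variance.
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ Vt[:3].T  # each sample as a 3-vector of projections

print(scores.shape)  # (20, 3)
```

    The magnitudes of these three projections are what the study relates to the type of dairy product.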

  2. IndeCut evaluates performance of network motif discovery algorithms.

    PubMed

    Ansariola, Mitra; Megraw, Molly; Koslicki, David

    2018-05-01

    Genomic networks represent a complex map of molecular interactions which are descriptive of the biological processes occurring in living cells. Identifying the small over-represented circuitry patterns in these networks helps generate hypotheses about the functional basis of such complex processes. Network motif discovery is a systematic way of achieving this goal. However, a reliable network motif discovery outcome requires generating random background networks which are the result of a uniform and independent graph sampling method. To date, there has been no method to numerically evaluate whether any network motif discovery algorithm performs as intended on realistically sized datasets; thus, it was not possible to assess the validity of resulting network motifs. In this work, we present IndeCut, the first method to date that characterizes network motif finding algorithm performance in terms of uniform sampling on realistically sized networks. We demonstrate that it is critical to use IndeCut prior to running any network motif finder for two reasons. First, IndeCut indicates the number of samples needed for a tool to produce an outcome that is both reproducible and accurate. Second, IndeCut allows users to choose the tool that generates samples in the most independent fashion for their network of interest among many available options. The open source software package is available at https://github.com/megrawlab/IndeCut. megrawm@science.oregonstate.edu or david.koslicki@math.oregonstate.edu. Supplementary data are available at Bioinformatics online.

  3. Toward a Principled Sampling Theory for Quasi-Orders

    PubMed Central

    Ünlü, Ali; Schrepp, Martin

    2016-01-01

    Quasi-orders, that is, reflexive and transitive binary relations, have numerous applications. In educational theories, the dependencies of mastery among the problems of a test can be modeled by quasi-orders. Methods such as item tree or Boolean analysis that mine for quasi-orders in empirical data are sensitive to the underlying quasi-order structure. These data mining techniques have to be compared based on extensive simulation studies, with unbiased samples of randomly generated quasi-orders at their basis. In this paper, we develop techniques that can provide the required quasi-order samples. We introduce a discrete doubly inductive procedure for incrementally constructing the set of all quasi-orders on a finite item set. A randomization of this deterministic procedure allows us to generate representative samples of random quasi-orders. With an outer level inductive algorithm, we consider the uniform random extensions of the trace quasi-orders to higher dimension. This is combined with an inner level inductive algorithm to correct the extensions that violate the transitivity property. The inner level correction step entails sampling biases. We propose three algorithms for bias correction and investigate them in simulation. It is evident that, on even up to 50 items, the new algorithms create close to representative quasi-order samples within acceptable computing time. Hence, the principled approach is a significant improvement to existing methods that are used to draw quasi-orders uniformly at random but cannot cope with reasonably large item sets. PMID:27965601
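
    For very small item sets, the sampling problem the paper addresses can be illustrated with naive rejection sampling: draw random reflexive relations and keep only those that are transitive. This is uniform over quasi-orders but exponentially wasteful as the item set grows, which is exactly why the paper's inductive construction is needed; the code below is an illustration, not the paper's algorithm.

```python
import random

def is_quasi_order(rel, items):
    """Check reflexivity and transitivity of a relation (set of pairs)."""
    if any((i, i) not in rel for i in items):
        return False
    return all((a, c) in rel
               for (a, b) in rel for (b2, c) in rel if b == b2)

def sample_quasi_order(items, rng):
    """Rejection-sample a uniformly random quasi-order on `items`."""
    pairs = [(a, b) for a in items for b in items if a != b]
    while True:
        rel = {(i, i) for i in items}                       # force reflexivity
        rel |= {p for p in pairs if rng.random() < 0.5}     # random off-diagonal
        if is_quasi_order(rel, items):
            return rel

rng = random.Random(7)
q = sample_quasi_order([1, 2, 3], rng)
print(sorted(q))
```

    Because every reflexive relation is drawn with equal probability, conditioning on transitivity yields a uniform draw; the catch is that the acceptance rate collapses long before 50 items, the regime the paper's algorithms handle.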

  4. Toward a Principled Sampling Theory for Quasi-Orders.

    PubMed

    Ünlü, Ali; Schrepp, Martin

    2016-01-01

    Quasi-orders, that is, reflexive and transitive binary relations, have numerous applications. In educational theories, the dependencies of mastery among the problems of a test can be modeled by quasi-orders. Methods such as item tree or Boolean analysis that mine for quasi-orders in empirical data are sensitive to the underlying quasi-order structure. These data mining techniques have to be compared based on extensive simulation studies, with unbiased samples of randomly generated quasi-orders at their basis. In this paper, we develop techniques that can provide the required quasi-order samples. We introduce a discrete doubly inductive procedure for incrementally constructing the set of all quasi-orders on a finite item set. A randomization of this deterministic procedure allows us to generate representative samples of random quasi-orders. With an outer level inductive algorithm, we consider the uniform random extensions of the trace quasi-orders to higher dimension. This is combined with an inner level inductive algorithm to correct the extensions that violate the transitivity property. The inner level correction step entails sampling biases. We propose three algorithms for bias correction and investigate them in simulation. It is evident that, on even up to 50 items, the new algorithms create close to representative quasi-order samples within acceptable computing time. Hence, the principled approach is a significant improvement to existing methods that are used to draw quasi-orders uniformly at random but cannot cope with reasonably large item sets.

  5. Frontal crashworthiness characterisation of a vehicle segment using curve comparison metrics.

    PubMed

    Abellán-López, D; Sánchez-Lozano, M; Martínez-Sáez, L

    2018-08-01

    The objective of this work is to propose a methodology for characterizing the collision behaviour and crashworthiness of a segment of vehicles by selecting the vehicle that best represents that group. It would be useful in the development of deformable barriers to be used in crash tests intended to study vehicle compatibility, as well as for the definition of the representative standard pulses used in numerical simulations or component testing. The characterisation and selection of representative vehicles are based on the objective comparison of the occupant-compartment acceleration and barrier force pulses obtained during crash tests, using appropriate comparison metrics. This method is complemented with another based exclusively on the comparison of a few characteristic parameters of crash behaviour obtained from the previous curves. The method has been applied to different vehicle groups, using test data from a sample of vehicles. During this application, the performance of several metrics usually employed in the validation of simulation models was analysed, and the most efficient ones were selected for the task. The methodology finally defined is useful for vehicle segment characterisation, taking into account aspects of crash behaviour related to the shape of the curves that are difficult to represent with simple numerical parameters, and it may be tuned in future work when applied to larger and different samples. Copyright © 2018 Elsevier Ltd. All rights reserved.
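
    The abstract does not name its metrics, so the sketch below uses two generic curve-comparison measures, peak difference and correlation, on synthetic half-sine crash pulses to illustrate comparing one vehicle's pulse against another's. The pulse shapes and values are invented.

```python
import numpy as np

def curve_metrics(pulse_a, pulse_b):
    """Compare two equally sampled pulses: peak difference and shape correlation."""
    a, b = np.asarray(pulse_a, float), np.asarray(pulse_b, float)
    peak_diff = abs(a.max() - b.max())
    corr = float(np.corrcoef(a, b)[0, 1])  # shape similarity in [-1, 1]
    return peak_diff, corr

t = np.linspace(0, 0.12, 120)              # 120 ms crash pulse, deceleration in g
vehicle_1 = 30 * np.sin(np.pi * t / 0.12)
vehicle_2 = 28 * np.sin(np.pi * t / 0.12)  # similar shape, lower peak

peak_diff, corr = curve_metrics(vehicle_1, vehicle_2)
print(round(peak_diff, 2), round(corr, 3))
```

    A shape-sensitive score like the correlation captures what single parameters such as peak deceleration miss, which is the motivation for curve metrics in the paper.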

  6. Highly sensitive mode mapping of whispering-gallery modes by scanning thermocouple-probe microscopy.

    PubMed

    Klein, Angela E; Schmidt, Carsten; Liebsch, Mattes; Janunts, Norik; Dobynde, Mikhail; Tünnermann, Andreas; Pertsch, Thomas

    2014-03-01

    We propose a method for mapping optical near-fields with the help of a thermocouple scanning-probe microscope tip. As the tip scans the sample surface, its apex is heated by light absorption, generating a thermovoltage. The thermovoltage map represents the intensity distribution of light at the sample surface. The measurement technique has been employed to map optical whispering-gallery modes in fused silica microdisk resonators operating at near-infrared wavelengths. The method could potentially be employed for near-field imaging of a variety of systems in the near-infrared and visible spectral range.

  7. Analyzing thematic maps and mapping for accuracy

    USGS Publications Warehouse

    Rosenfield, G.H.

    1982-01-01

    Two problems which exist while attempting to test the accuracy of thematic maps and mapping are: (1) evaluating the accuracy of thematic content, and (2) evaluating the effects of the variables on thematic mapping. Statistical analysis techniques are applicable to both these problems and include techniques for sampling the data and determining their accuracy. In addition, techniques for hypothesis testing, or inferential statistics, are used when comparing the effects of variables. A comprehensive and valid accuracy test of a classification project, such as thematic mapping from remotely sensed data, includes the following components of statistical analysis: (1) sample design, including the sample distribution, sample size, size of the sample unit, and sampling procedure; and (2) accuracy estimation, including estimation of the variance and confidence limits. Careful consideration must be given to the minimum sample size necessary to validate the accuracy of a given classification category. The results of an accuracy test are presented in a contingency table sometimes called a classification error matrix. Usually the rows represent the interpretation, and the columns represent the verification. The diagonal elements represent the correct classifications. The remaining elements of the rows represent errors by commission, and the remaining elements of the columns represent the errors of omission. For tests of hypothesis that compare variables, the general practice has been to use only the diagonal elements from several related classification error matrices. These data are arranged in the form of another contingency table. The columns of the table represent the different variables being compared, such as different scales of mapping. The rows represent the blocking characteristics, such as the various categories of classification.
The values in the cells of the tables might be the counts of correct classification or the binomial proportions of these counts divided by either the row totals or the column totals from the original classification error matrices. In hypothesis testing, when the results of tests of multiple sample cases prove to be significant, some form of statistical test must be used to separate any results that differ significantly from the others. In the past, many analyses of the data in this error matrix were made by comparing the relative magnitudes of the percentage of correct classifications, for either individual categories, the entire map, or both. More rigorous analyses have used data transformations and (or) two-way classification analysis of variance. A more sophisticated approach would be to analyze the entire classification error matrices using the methods of discrete multivariate analysis or multivariate analysis of variance.
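    For concreteness, the commission and omission errors described above can be computed directly from a classification error matrix. The counts below are hypothetical; rows are interpretation and columns are verification, as in the text.

```python
# Hypothetical classification error (confusion) matrix:
# rows = interpretation, columns = verification.
matrix = [
    [45,  3,  2],   # category A
    [ 4, 50,  1],   # category B
    [ 1,  2, 42],   # category C
]

n = len(matrix)
total = sum(sum(row) for row in matrix)
correct = sum(matrix[i][i] for i in range(n))   # diagonal = correct classifications
overall_accuracy = correct / total

# Commission error for row i: off-diagonal row elements / row total
commission = [(sum(matrix[i]) - matrix[i][i]) / sum(matrix[i]) for i in range(n)]

# Omission error for column j: off-diagonal column elements / column total
col_totals = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
omission = [(col_totals[j] - matrix[j][j]) / col_totals[j] for j in range(n)]
```

    With these counts, overall accuracy is 137/150, and category A has a commission error of 5/50 and an omission error of 5/50; real analyses would go on to attach variance estimates and confidence limits, as the abstract notes.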

  8. Pilot Test of a Novel Method for Assessing Community Response to Low-Amplitude Sonic Booms

    NASA Technical Reports Server (NTRS)

    Fidell, Sanford; Horonjeff, Richard D.; Harris, Michael

    2012-01-01

    A pilot test of a novel method for assessing residents' annoyance to sonic booms was performed. During a two-week period, residents of the base housing area at Edwards Air Force Base provided data on their reactions to sonic booms using smartphone-based interviews. Noise measurements were conducted at the same time. The report presents information about data collection methods and about test participants' reactions to low-amplitude sonic booms. The latter information should not be viewed as definitive for several reasons: it may not be reliably generalized to the wider U.S. residential population (because it was not derived from a representative random sample), and the sample itself was not large.

  9. Methodological challenges in collecting social and behavioural data regarding the HIV epidemic among gay and other men who have sex with men in Australia.

    PubMed

    Zablotska, Iryna B; Frankland, Andrew; Holt, Martin; de Wit, John; Brown, Graham; Maycock, Bruce; Fairley, Christopher; Prestage, Garrett

    2014-01-01

    Behavioural surveillance and research among gay and other men who have sex with men (GMSM) commonly rely on non-random recruitment approaches. Methodological challenges limit their ability to accurately represent the population of adult GMSM. We compared the social and behavioural profiles of GMSM recruited via venue-based, online, and respondent-driven sampling (RDS) and discussed their utility for behavioural surveillance. Data from four studies were selected to reflect each recruitment method. We compared demographic characteristics and the prevalence of key indicators including sexual and HIV testing practices obtained from samples recruited through different methods, and population estimates from respondent-driven sampling partition analysis. Overall, the socio-demographic profile of GMSM was similar across samples, with some differences observed in age and sexual identification. Men recruited through time-location sampling appeared more connected to the gay community and reported a greater number of sexual partners, but engaged in less unprotected anal intercourse with regular (UAIR) or casual partners (UAIC). The RDS sample overestimated the proportion of HIV-positive men and appeared to recruit men with an overall higher number of sexual partners. A single-website survey recruited a sample with characteristics which differed considerably from the population estimates with regard to age, ethnic diversity and behaviour. Data acquired through time-location sampling underestimated the rates of UAIR and UAIC, while RDS and online sampling both generated samples that underestimated UAIR. Simulated composite samples combining recruits from time-location and multi-website online sampling may produce characteristics more consistent with the population estimates, particularly with regard to sexual practices. 
Respondent-driven sampling produced the sample that was most consistent with population estimates, but this methodology is complex and logistically demanding. Time-location and online recruitment are more cost-effective and easier to implement; using these approaches in combination may offer the potential to recruit a more representative sample of GMSM.

  10. A geostatistical approach to predicting sulfur content in the Pittsburgh coal bed

    USGS Publications Warehouse

    Watson, W.D.; Ruppert, L.F.; Bragg, L.J.; Tewalt, S.J.

    2001-01-01

    The US Geological Survey (USGS) is completing a national assessment of coal resources in the five top coal-producing regions in the US. Point-located data provide measurements on coal thickness and sulfur content. The sample data and their geologic interpretation represent the most regionally complete and up-to-date assessment of what is known about top-producing US coal beds. The sample data are analyzed using a combination of geologic and Geographic Information System (GIS) models to estimate tonnages and qualities of the coal beds. Traditionally, GIS practitioners use contouring to represent geographical patterns of "similar" data values. The tonnage and grade of coal resources are then assessed by using the contour lines as references for interpolation. An assessment taken to this point is only indicative of resource quantity and quality. Data users may benefit from a statistical approach that would allow them to better understand the uncertainty and limitations of the sample data. To develop a quantitative approach, geostatistics were applied to the data on coal sulfur content from samples taken in the Pittsburgh coal bed (located in the eastern US, in the southwestern part of the state of Pennsylvania, and in adjoining areas in the states of Ohio and West Virginia). Geostatistical methods that account for regional and local trends were applied to blocks 2.7 mi (4.3 km) on a side. The data and geostatistics support conclusions concerning the average sulfur content and its degree of reliability at regional- and economic-block scale over the large, contiguous part of the Pittsburgh outcrop, but not to a mine scale. To validate the method, a comparison was made with the sulfur contents in sample data taken from 53 coal mines located in the study area. The comparison showed a high degree of similarity between the sulfur content in the mine samples and the sulfur content represented by the geostatistically derived contours. Published by Elsevier Science B.V.

  11. Sampling challenges in a study examining refugee resettlement

    PubMed Central

    2011-01-01

    Background As almost half of all refugees currently under United Nations protection are from Afghanistan or Iraq and significant numbers have already been resettled outside the region of origin, it is likely that future research will examine their resettlement needs. A number of methodological challenges confront researchers working with culturally and linguistically diverse groups; however, few detailed articles are available to inform other studies. The aim of this paper is to outline challenges with sampling and recruitment of socially invisible refugee groups, describing the method adopted for a mixed methods exploratory study assessing mental health, subjective wellbeing and resettlement perspectives of Afghan and Kurdish refugees living in New Zealand and Australia. Sampling strategies used in previous studies with similar refugee groups were considered before determining the approach to recruitment. Methods A snowball approach was adopted for the study, with multiple entry points into the communities being used to choose as wide a range of people as possible to provide further contacts and reduce selection bias. Census data was used to assess the representativeness of the sample. Results A sample of 193 former refugee participants was recruited in Christchurch (n = 98) and Perth (n = 95); 47% were of Afghan and 53% Kurdish ethnicity. A good gender balance (males 52%, females 48%) was achieved overall, mainly as a result of the sampling method used. Differences in the demographic composition of groups in each location were observed, especially in relation to the length of time spent in a refugee situation and time since arrival, reflecting variations in national humanitarian quota intakes. Although some measures were problematic, Census data comparison to assess reasonable representativeness of the study sample was generally reassuring. 
Conclusions Snowball sampling, with multiple initiation points to reduce selection bias, was necessary to locate and identify participants, provide reassurance and break down barriers. Personal contact was critical for both recruitment and data quality, and highlighted the importance of interviewer cultural sensitivity. Cross-national comparative studies, particularly relating to refugee resettlement within different policy environments, also need to take into consideration the differing pre-migration experiences and time since arrival of refugee groups, as these can add additional layers of complexity to study design and interpretation. PMID:21406104

  12. Guidelines for Measuring Disease Episodes: An Analysis of the Effects on the Components of Expenditure Growth.

    PubMed

    Dunn, Abe; Liebman, Eli; Rittmueller, Lindsey; Shapiro, Adam Hale

    2017-04-01

    To provide guidelines to researchers measuring health expenditures by disease and compare these methodologies' implied inflation estimates. A convenience sample of commercially insured individuals over the 2003 to 2007 period from Truven Health. Population weights are applied, based on age, sex, and region, to make the sample of over 4 million enrollees representative of the entire commercially insured population. Different methods are used to allocate medical-care expenditures to distinct condition categories. We compare the estimates of disease-price inflation by method. Across a variety of methods, the compound annual growth rate stays within the range 3.1 to 3.9 percentage points. Disease-specific inflation measures are more sensitive to the selected methodology. The selected allocation method impacts aggregate inflation rates, but considering the variety of methods applied, the differences appear small. Future research is necessary to better understand these differences in other population samples and to connect disease expenditures to measures of quality. © Health Research and Educational Trust.

  13. A comparative evaluation between real time Roche COBas TAQMAN 48 HCV and bDNA Bayer Versant HCV 3.0.

    PubMed

    Giraldi, Cristina; Noto, Alessandra; Tenuta, Robert; Greco, Francesca; Perugini, Daniela; Spadafora, Mario; Bianco, Anna Maria Lo; Savino, Olga; Natale, Alfonso

    2006-10-01

    The HCV virus is a common human pathogen with a single-stranded RNA genome of approximately 9,600 nt. This work compared two different commercial methods used for HCV viral load, the bDNA Bayer Versant HCV 3.0 and the RealTime Roche COBAS TaqMan 48 HCV. We compared the reproducibility and linearity of the two methods. Seventy-five plasma samples with genotypes 1 to 4, which represent the population (45% genotype 1; 24% genotype 2; 13% genotype 3; 18% genotype 4), were directly processed with the Versant method based upon signal amplification; the same samples were first extracted (COBAS Ampliprep - TNAI) and then amplified using RealTime PCR (COBAS TaqMan 48). The results obtained indicate the same performance for both methods for samples with genotype 1, but in samples with genotypes 2, 3 and 4 the RealTime PCR Roche method gave an underestimation with respect to the Bayer bDNA assay.

  14. 21 CFR 111.80 - What representative samples must you collect?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Process Control System § 111.80 What representative samples must you collect? The representative samples... unique lot within each unique shipment); (b) Representative samples of in-process materials for each manufactured batch at points, steps, or stages, in the manufacturing process as specified in the master...

  15. SEIPS-based process modeling in primary care.

    PubMed

    Wooldridge, Abigail R; Carayon, Pascale; Hundt, Ann Schoofs; Hoonakker, Peter L T

    2017-04-01

    Process mapping, often used as part of the human factors and systems engineering approach to improve care delivery and outcomes, should be expanded to represent the complex, interconnected sociotechnical aspects of health care. Here, we propose a new sociotechnical process modeling method to describe and evaluate processes, using the SEIPS model as the conceptual framework. The method produces a process map and supplementary table, which identify work system barriers and facilitators. In this paper, we present a case study applying this method to three primary care processes. We used purposeful sampling to select staff (care managers, providers, nurses, administrators and patient access representatives) from two clinics to observe and interview. We show the proposed method can be used to understand and analyze healthcare processes systematically and identify specific areas of improvement. Future work is needed to assess usability and usefulness of the SEIPS-based process modeling method and further refine it. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. SEIPS-Based Process Modeling in Primary Care

    PubMed Central

    Wooldridge, Abigail R.; Carayon, Pascale; Hundt, Ann Schoofs; Hoonakker, Peter

    2016-01-01

    Process mapping, often used as part of the human factors and systems engineering approach to improve care delivery and outcomes, should be expanded to represent the complex, interconnected sociotechnical aspects of health care. Here, we propose a new sociotechnical process modeling method to describe and evaluate processes, using the SEIPS model as the conceptual framework. The method produces a process map and supplementary table, which identify work system barriers and facilitators. In this paper, we present a case study applying this method to three primary care processes. We used purposeful sampling to select staff (care managers, providers, nurses, administrators and patient access representatives) from two clinics to observe and interview. We show the proposed method can be used to understand and analyze healthcare processes systematically and identify specific areas of improvement. Future work is needed to assess usability and usefulness of the SEIPS-based process modeling method and further refine it. PMID:28166883

  17. Simultaneous determination of fluoroquinolones in environmental water by liquid chromatography-tandem mass spectrometry with direct injection: A green approach.

    PubMed

    Denadai, Marina; Cass, Quezia Bezerra

    2015-10-30

    This work describes an on-line multi-residue method for simultaneous quantification of ciprofloxacin, enrofloxacin, gemifloxacin, moxifloxacin, norfloxacin and ofloxacin in superficial and wastewater samples. For that, an octyl restricted-access media bovine serum albumin column (RAM-BSA C8) was used for sample clean-up, enrichment and analysis, with quantitation carried out by tandem mass spectrometry. For water sample volumes of only 500 μL, the method provided good selectivity, extraction efficiency, accuracy, and precision, with quantification limits in the order of 20-150 ng L(-1). Out of the six fluoroquinolones, only ciprofloxacin (195 ng L(-1)) and norfloxacin (270 ng L(-1)) were quantified in an influent sample of the wastewater treatment plant (WWTP) of São Carlos (SP, Brazil). None were found in the superficial water samples analyzed. The capability of injecting native samples in an automated mode provides high productivity and represents a greener approach in environmental sample analysis. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. A new method of snowmelt sampling for water stable isotopes

    USGS Publications Warehouse

    Penna, D.; Ahmad, M.; Birks, S. J.; Bouchaou, L.; Brencic, M.; Butt, S.; Holko, L.; Jeelani, G.; Martinez, D. E.; Melikadze, G.; Shanley, J.B.; Sokratov, S. A.; Stadnyk, T.; Sugimoto, A.; Vreca, P.

    2014-01-01

    We modified a passive capillary sampler (PCS) to collect snowmelt water for isotopic analysis. Past applications of PCSs have been to sample soil water, but the novel aspect of this study was the placement of the PCSs at the ground-snowpack interface to collect snowmelt. We deployed arrays of PCSs at 11 sites in ten partner countries on five continents representing a range of climate and snow cover worldwide. The PCS reliably collected snowmelt at all sites and caused negligible evaporative fractionation effects in the samples. PCS is low-cost, easy to install, and collects a representative integrated snowmelt sample throughout the melt season or at the melt event scale. Unlike snow cores, the PCS collects the water that would actually infiltrate the soil; thus, its isotopic composition is appropriate to use for tracing snowmelt water through the hydrologic cycle. The purpose of this Briefing is to show the potential advantages of PCSs and recommend guidelines for constructing and installing them based on our preliminary results from two snowmelt seasons.

  19. Multi-edge X-ray absorption spectroscopy study of road dust samples from a traffic area of Venice using stoichiometric and environmental references.

    PubMed

    Valotto, Gabrio; Cattaruzza, Elti; Bardelli, Fabrizio

    2017-02-15

    The appropriate selection of representative pure compounds to be used as references is a crucial step for successful analysis of X-ray absorption near edge spectroscopy (XANES) data, and it is often not a trivial task. This is particularly true when complex environmental matrices are investigated, since their elemental speciation is a priori unknown. In this paper, an investigation of the speciation of Cu, Zn, and Sb based on the use of conventional (stoichiometric compounds) and non-conventional (environmental samples or relevant certified materials) references is explored. This method can be useful when the effectiveness of XANES analysis is limited by the difficulty of obtaining a set of references sufficiently representative of the investigated samples. Road dust samples collected along the bridge connecting Venice to the mainland were used to show the potentialities and the limits of this approach. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Immunological detection of small organic molecules in the presence of perchlorates: relevance to the life marker chip and life detection on Mars.

    PubMed

    Rix, Catherine S; Sims, Mark R; Cullen, David C

    2011-11-01

    The proposed ExoMars mission, due to launch in 2018, aims to look for evidence of extant and extinct life in martian rocks and regolith. Previous attempts to detect organic molecules of biological or abiotic origin on Mars have been unsuccessful, which may be attributable to destruction of these molecules by perchlorate salts during pyrolysis sample extraction techniques. Organic molecules can also be extracted and measured with solvent-based systems. The ExoMars payload includes the Life Marker Chip (LMC) instrument, capable of detecting biomarker molecules of extant and extinct Earth-like life in liquid extracts of martian samples with an antibody microarray assay. The aim of the work reported here was to investigate whether the presence of perchlorate salts, at levels similar to those at the NASA Phoenix landing site, would compromise the LMC extraction and detection method. To test this, we implemented an LMC-representative sample extraction process with an LMC-representative antibody assay and used these to extract and analyze a model sample that consisted of a Mars analog sample matrix (JSC Mars-1) spiked with a representative organic molecular target (pyrene, an example of abiotic meteoritic infall targets) in the presence of perchlorate salts. We found no significant change in immunoassay function when using pyrene standards with added perchlorate salts. When model samples spiked with perchlorate salts were subjected to an LMC-representative liquid extraction, immunoassays functioned in a liquid extract and detected extracted pyrene. For the same model sample matrix without perchlorate salts, we observed anomalous assay signals that coincided with yellow coloration of the extracts. This unexpected observation is being studied further. 
This initial study indicates that the presence of perchlorate salts, at levels similar to those detected at the NASA Phoenix landing site, is unlikely to prevent the LMC from extracting and detecting organic molecules from martian samples.

  1. Rapid Radiochemical Methods for Asphalt Paving Material ...

    EPA Pesticide Factsheets

    Technical Brief Validated rapid radiochemical methods for alpha and beta emitters in solid matrices that are commonly encountered in urban environments were previously unavailable for public use by responding laboratories. A lack of tested rapid methods would delay the quick determination of contamination levels and the assessment of acceptable site-specific exposure levels. Of special concern are matrices with rough and porous surfaces, which allow the movement of radioactive material deep into the building material, making it difficult to detect. This research focuses on methods that address preparation, radiochemical separation, and analysis of asphalt paving materials and asphalt roofing shingles. These matrices, common to outdoor environments, challenge the capability and capacity of very experienced radiochemistry laboratories. Generally, routine sample preparation and dissolution techniques produce liquid samples (representative of the original sample material) that can be processed using available radiochemical methods. The asphalt materials are especially difficult because they do not readily lend themselves to these routine sample preparation and dissolution techniques. The HSRP and ORIA coordinate radiological reference laboratory priorities and activities in conjunction with HSRP’s Partner Process. As part of the collaboration, the HSRP worked with ORIA to publish rapid radioanalytical methods for selected radionuclides in building material matrices.

  2. A Critical Assessment of Bias in Survey Studies Using Location-Based Sampling to Recruit Patrons in Bars

    PubMed Central

    Morrison, Christopher; Lee, Juliet P.; Gruenewald, Paul J.; Marzell, Miesha

    2015-01-01

    Location-based sampling is a method to obtain samples of people within ecological contexts relevant to specific public health outcomes. Random selection increases generalizability; however, in some circumstances (such as surveying bar patrons) recruitment conditions increase risks of sample bias. We attempted to recruit representative samples of bars and patrons in six California cities, but low response rates precluded meaningful analysis. A systematic review of 24 similar studies revealed that none addressed the key shortcomings of our study. We recommend steps to improve studies that use location-based sampling: (i) purposively sample places of interest, (ii) utilize recruitment strategies appropriate to the environment, and (iii) provide full information on response rates at all levels of sampling. PMID:26574657

  3. Comparison of indoor air sampling and dust collection methods for fungal exposure assessment using quantitative PCR.

    PubMed

    Cox, Jennie; Indugula, Reshmi; Vesper, Stephen; Zhu, Zheng; Jandarov, Roman; Reponen, Tiina

    2017-10-18

    Evaluating fungal contamination indoors is complicated because of the many different sampling methods utilized. In this study, fungal contamination was evaluated using five sampling methods and four matrices for results. The five sampling methods were a 48-hour indoor air sample collected with a Button™ inhalable aerosol sampler and four types of dust samples: a vacuumed floor dust sample, newly settled dust collected for four weeks onto two types of electrostatic dust cloths (EDCs) in trays, and a wipe sample of dust from above-floor surfaces. The samples were obtained in the bedrooms of asthmatic children (n = 14). Quantitative polymerase chain reaction (qPCR) was used to analyze the dust and air samples for the 36 fungal species that make up the Environmental Relative Moldiness Index (ERMI). The results from the samples were compared by four matrices: total concentration of fungal cells, concentration of fungal species associated with indoor environments, concentration of fungal species associated with outdoor environments, and ERMI values (or ERMI-like values for air samples). The ERMI values for the dust samples and the ERMI-like values for the 48-hour air samples were not significantly different. The total cell concentrations of the 36 species obtained with the four dust collection methods correlated significantly (r = 0.64-0.79, p < 0.05), with the exception of the vacuumed floor dust and newly settled dust. In addition, fungal cell concentrations of indoor-associated species correlated well between all four dust sampling methods (r = 0.68-0.86, p < 0.01). No correlation was found between the fungal concentrations in the air and dust samples, primarily because of differences in concentrations of Cladosporium cladosporioides Type 1 and Epicoccum nigrum. A representative type of dust sample and a 48-hour air sample might both provide useful information about fungal exposures.
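    The between-method agreement reported above is a correlation of paired concentrations across sampling methods. A minimal sketch with hypothetical numbers (the r values in the abstract come from the study's own data, not from this example):

```python
def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical log10 fungal cell concentrations from two dust sampling methods
vacuum = [3.1, 4.0, 2.8, 3.6, 4.2, 3.3]
wipe   = [3.0, 3.8, 2.9, 3.5, 4.4, 3.1]
r = pearson_r(vacuum, wipe)
```

    Concentration data are often log-transformed before correlation, as sketched here, because fungal counts span orders of magnitude.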

  4. External Standards or Standard Addition? Selecting and Validating a Method of Standardization

    NASA Astrophysics Data System (ADS)

    Harvey, David T.

    2002-05-01

    A common feature of many problem-based laboratories in analytical chemistry is a lengthy independent project involving the analysis of "real-world" samples. Students research the literature, adapting and developing a method suitable for their analyte, sample matrix, and problem scenario. Because these projects encompass the complete analytical process, students must consider issues such as obtaining a representative sample, selecting a method of analysis, developing a suitable standardization, validating results, and implementing appropriate quality assessment/quality control practices. Most textbooks and monographs suitable for an undergraduate course in analytical chemistry, however, provide only limited coverage of these important topics. The need for short laboratory experiments emphasizing important facets of method development, such as selecting a method of standardization, is evident. The experiment reported here, which is suitable for an introductory course in analytical chemistry, illustrates the importance of matrix effects when selecting a method of standardization. Students also learn how a spike recovery is used to validate an analytical method, and obtain a practical experience in the difference between performing an external standardization and a standard addition.
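    As a worked illustration of the standard-addition calculation such an experiment teaches (the numbers below are hypothetical, not from the article): known increments of analyte are spiked into aliquots of the sample, signal is regressed on added concentration, and the line is extrapolated to zero signal.

```python
def linfit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    return slope, my - slope * mx

added  = [0.0, 1.0, 2.0, 3.0]        # spiked concentrations (ppm), hypothetical
signal = [0.30, 0.50, 0.70, 0.90]    # instrument response, hypothetical

slope, intercept = linfit(added, signal)

# The magnitude of the x-intercept gives the analyte concentration in the sample
c_sample = intercept / slope         # here: 1.5 ppm
```

    Because the calibration is built inside the sample matrix itself, this calculation is unaffected by proportional matrix effects that would bias an external-standard calibration, which is the contrast the experiment is designed to demonstrate.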

  5. A sampling approach for predicting the eating quality of apples using visible-near infrared spectroscopy.

    PubMed

    Martínez Vega, Mabel V; Sharifzadeh, Sara; Wulfsohn, Dvoralai; Skov, Thomas; Clemmensen, Line Harder; Toldam-Andersen, Torben B

    2013-12-01

    Visible-near infrared spectroscopy remains a method of increasing interest as a fast alternative for the evaluation of fruit quality. The success of the method is assumed to be achieved by using large sets of samples to produce robust calibration models. In this study we used representative samples of an early and a late season apple cultivar to evaluate model robustness (in terms of prediction ability and error) for the prediction of soluble solids content (SSC) and acidity, in the wavelength range 400-1100 nm. A total of 196 middle-early season and 219 late season apple (Malus domestica Borkh.) samples, cvs 'Aroma' and 'Holsteiner Cox', were used to construct spectral models for SSC and acidity. Partial least squares (PLS), ridge regression (RR) and elastic net (EN) models were used to build prediction models. Furthermore, we compared three sub-sample arrangements for forming training and test sets ('smooth fractionator', by date of measurement after harvest, and random). Using the 'smooth fractionator' sampling method, fewer spectral bands (26) and elastic net resulted in improved performance for SSC models of 'Aroma' apples, with a coefficient of variation CVSSC = 13%. The model showed consistently low errors and bias (PLS/EN: R(2) cal = 0.60/0.60; SEC = 0.88/0.88°Brix; Biascal = 0.00/0.00; R(2) val = 0.33/0.44; SEP = 1.14/1.03; Biasval = 0.04/0.03). However, prediction of acidity and SSC (CV = 5%) for the late cultivar 'Holsteiner Cox' produced inferior results compared with 'Aroma'. It was possible to construct local SSC and acidity calibration models for early season apple cultivars with CVs of SSC and acidity around 10%. The overall model performance of these data sets also depends on the proper selection of training and test sets. The 'smooth fractionator' protocol provided an objective method for obtaining training and test sets that capture the existing variability of the fruit samples for construction of visible-NIR prediction models. 
The implication is that by using such 'efficient' sampling methods for obtaining an initial sample of fruit that represents the variability of the population and for sub-sampling to form training and test sets it should be possible to use relatively small sample sizes to develop spectral predictions of fruit quality. Using feature selection and elastic net appears to improve the SSC model performance in terms of R(2), RMSECV and RMSEP for 'Aroma' apples. © 2013 Society of Chemical Industry.
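    The figures of merit quoted above (bias, SEC/SEP, R(2)) follow standard chemometric definitions; a minimal pure-Python sketch, with hypothetical function names and toy data, is:

```python
import math

def bias(pred, ref):
    # mean signed error between predicted and reference values
    return sum(p - r for p, r in zip(pred, ref)) / len(pred)

def sep(pred, ref):
    # standard error of prediction: spread of the errors after removing bias
    # (computed on a calibration set, the same formula gives the SEC)
    b = bias(pred, ref)
    return math.sqrt(sum((p - r - b) ** 2 for p, r in zip(pred, ref))
                     / (len(pred) - 1))

def r2(pred, ref):
    # coefficient of determination of predictions against reference values
    m = sum(ref) / len(ref)
    ss_res = sum((r - p) ** 2 for p, r in zip(pred, ref))
    ss_tot = sum((r - m) ** 2 for r in ref)
    return 1.0 - ss_res / ss_tot
```

    A model with a constant offset, for example, shows up as a non-zero bias but a zero SEP.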

  6. Effect of DNA Extraction Methods and Sampling Techniques on the Apparent Structure of Cow and Sheep Rumen Microbial Communities

    PubMed Central

    Henderson, Gemma; Cox, Faith; Kittelmann, Sandra; Miri, Vahideh Heidarian; Zethof, Michael; Noel, Samantha J.; Waghorn, Garry C.; Janssen, Peter H.

    2013-01-01

    Molecular microbial ecology techniques are widely used to study the composition of the rumen microbiota and to increase understanding of the roles they play. Therefore, sampling and DNA extraction methods that result in adequate yields of microbial DNA that also accurately represents the microbial community are crucial. Fifteen different methods were used to extract DNA from cow and sheep rumen samples. The DNA yield and quality, and its suitability for downstream PCR amplifications varied considerably, depending on the DNA extraction method used. DNA extracts from nine extraction methods that passed these first quality criteria were evaluated further by quantitative PCR enumeration of microbial marker loci. Absolute microbial numbers, determined on the same rumen samples, differed by more than 100-fold, depending on the DNA extraction method used. The apparent compositions of the archaeal, bacterial, ciliate protozoal, and fungal communities in identical rumen samples were assessed using 454 Titanium pyrosequencing. Significant differences in microbial community composition were observed between extraction methods, for example in the relative abundances of members of the phyla Bacteroidetes and Firmicutes. Microbial communities in parallel samples collected from cows by oral stomach-tubing or through a rumen fistula, and in liquid and solid rumen digesta fractions, were compared using one of the DNA extraction methods. Community representations were generally similar, regardless of the rumen sampling technique used, but significant differences in the abundances of some microbial taxa such as the Clostridiales and the Methanobrevibacter ruminantium clade were observed. The apparent microbial community composition differed between rumen sample fractions, and Prevotellaceae were most abundant in the liquid fraction. DNA extraction methods that involved phenol-chloroform extraction and mechanical lysis steps tended to be more comparable. However, comparison of data from studies in which different sampling techniques, different rumen sample fractions or different DNA extraction methods were used should be avoided. PMID:24040342

  7. Effect of DNA extraction methods and sampling techniques on the apparent structure of cow and sheep rumen microbial communities.

    PubMed

    Henderson, Gemma; Cox, Faith; Kittelmann, Sandra; Miri, Vahideh Heidarian; Zethof, Michael; Noel, Samantha J; Waghorn, Garry C; Janssen, Peter H

    2013-01-01

    Molecular microbial ecology techniques are widely used to study the composition of the rumen microbiota and to increase understanding of the roles they play. Therefore, sampling and DNA extraction methods that result in adequate yields of microbial DNA that also accurately represents the microbial community are crucial. Fifteen different methods were used to extract DNA from cow and sheep rumen samples. The DNA yield and quality, and its suitability for downstream PCR amplifications varied considerably, depending on the DNA extraction method used. DNA extracts from nine extraction methods that passed these first quality criteria were evaluated further by quantitative PCR enumeration of microbial marker loci. Absolute microbial numbers, determined on the same rumen samples, differed by more than 100-fold, depending on the DNA extraction method used. The apparent compositions of the archaeal, bacterial, ciliate protozoal, and fungal communities in identical rumen samples were assessed using 454 Titanium pyrosequencing. Significant differences in microbial community composition were observed between extraction methods, for example in the relative abundances of members of the phyla Bacteroidetes and Firmicutes. Microbial communities in parallel samples collected from cows by oral stomach-tubing or through a rumen fistula, and in liquid and solid rumen digesta fractions, were compared using one of the DNA extraction methods. Community representations were generally similar, regardless of the rumen sampling technique used, but significant differences in the abundances of some microbial taxa such as the Clostridiales and the Methanobrevibacter ruminantium clade were observed. The apparent microbial community composition differed between rumen sample fractions, and Prevotellaceae were most abundant in the liquid fraction. DNA extraction methods that involved phenol-chloroform extraction and mechanical lysis steps tended to be more comparable. However, comparison of data from studies in which different sampling techniques, different rumen sample fractions or different DNA extraction methods were used should be avoided.

  8. Reflexion on linear regression trip production modelling method for ensuring good model quality

    NASA Astrophysics Data System (ADS)

    Suprayitno, Hitapriya; Ratnasari, Vita

    2017-11-01

    Transport modelling is important. For certain cases the conventional model still has to be used, and a good trip production model is then essential. A good model can only be obtained from a good sample. Two basic principles of good sampling are that the sample must be capable of representing the population characteristics and of producing an acceptable error at a certain confidence level. These principles do not yet seem to be well understood or applied in trip production modelling. It is therefore necessary to investigate trip production modelling practice in Indonesia and to formulate a better modelling method that ensures model quality. The results are as follows. Statistics provides a method for calculating the span of a predicted value at a certain confidence level for linear regression, called the confidence interval of the predicted value. Common modelling practice uses R2 as the principal quality measure, while sampling practice varies and does not always conform to these sampling principles. An experiment indicates that a small sample can already give an excellent R2 value and that sample composition can significantly change the model. Hence, a good R2 value does not always mean good model quality. These findings lead to three basic ideas for ensuring good model quality: reformulating the quality measure, the calculation procedure, and the sampling method. The quality measure should require both a good R2 value and a good confidence interval of the predicted value. The calculation procedure must incorporate the appropriate statistical calculation methods and tests. A good sampling method must use random, well-distributed, stratified sampling with a certain minimum number of samples. These three ideas need further development and testing.
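    The confidence interval of the predicted value for simple linear regression can be sketched as follows; the function names and toy data are hypothetical illustrations, not the authors' implementation:

```python
import math

def fit_line(xs, ys):
    # ordinary least squares fit y = a + b*x
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
    return ybar - b * xbar, b

def prediction_interval(xs, ys, x0, t_crit):
    # span of the predicted value at x0 for a given t critical value
    a, b = fit_line(xs, ys)
    n = len(xs)
    xbar = sum(xs) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    s = math.sqrt(sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys)) / (n - 2))
    half = t_crit * s * math.sqrt(1.0 + 1.0 / n + (x0 - xbar) ** 2 / sxx)
    y0 = a + b * x0
    return y0 - half, y0 + half
```

    The interval widens for query points far from the sample mean, which is one reason a good R2 alone does not guarantee useful predictions.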

  9. Probabilistic Round Trip Contamination Analysis of a Mars Sample Acquisition and Handling Process Using Markovian Decompositions

    NASA Technical Reports Server (NTRS)

    Hudson, Nicolas; Lin, Ying; Barengoltz, Jack

    2010-01-01

    A method for evaluating the probability of a Viable Earth Microorganism (VEM) contaminating a sample during the sample acquisition and handling (SAH) process of a potential future Mars Sample Return mission is developed. A scenario in which multiple core samples would be acquired using a rotary percussive coring tool, deployed from an arm on a MER-class rover, is analyzed. The analysis is conducted in a structured way by decomposing the sample acquisition and handling process into a series of discrete time steps and breaking the physical system into a set of relevant components. At each discrete time step, two key functions are defined: the probability of a VEM being released from each component, and the transport matrix, which represents the probability of VEM transport from one component to another. By defining the expected number of VEMs on each component at the start of the sampling process, these decompositions allow the expected number of VEMs on each component at each sampling step to be represented as a Markov chain. This formalism provides a rigorous mathematical framework in which to analyze the probability of a VEM entering the sample chain, and it makes the analysis tractable by breaking the process down into small analyzable steps.
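    One propagation step of such a Markov chain can be sketched as below; the three components, their initial VEM counts, and the transport probabilities are invented for illustration only:

```python
def step(counts, transport):
    # expected VEMs on each component after one time step:
    # counts'[j] = sum_i counts[i] * P(transport from i to j)
    n = len(transport[0])
    return [sum(counts[i] * transport[i][j] for i in range(len(counts)))
            for j in range(n)]

# hypothetical components: coring tool, rover arm, sample tube
counts = [100.0, 10.0, 0.0]          # expected VEMs at the start
transport = [
    [0.90, 0.05, 0.05],              # most VEMs stay on the tool
    [0.00, 0.95, 0.05],
    [0.00, 0.00, 1.00],              # the sample tube is absorbing
]
after = step(counts, transport)
```

    Iterating `step` over the discrete time steps of the SAH sequence yields the expected number of VEMs in the sample chain at the end of the process.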

  10. Correction of bias in belt transect studies of immotile objects

    USGS Publications Warehouse

    Anderson, D.R.; Pospahala, R.S.

    1970-01-01

    Unless a correction is made, population estimates derived from a sample of belt transects will be biased if a fraction of the individuals on the sample transects is not counted. An approach useful for correcting this bias when sampling immotile populations using transects of a fixed width is presented. The method assumes that a searcher's ability to find objects near the center of the transect is nearly perfect. The method uses a mathematical equation, estimated from the data, to represent the searcher's inability to find all objects at increasing distances from the center of the transect. An example of the analysis of data, formation of the equation, and application is presented using waterfowl nesting data collected in Colorado.
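    The correction amounts to dividing the raw count by the average detection probability across the strip; a minimal sketch, with a hypothetical detectability function supplied by the caller, is:

```python
import math

def effective_strip_fraction(g, half_width, n=1000):
    # average detection probability across the half-strip, assuming
    # perfect detection on the centerline (g(0) == 1); midpoint rule
    total = 0.0
    for i in range(n):
        d = (i + 0.5) * half_width / n
        total += g(d)
    return total / n

def corrected_count(observed, g, half_width):
    # inflate the raw transect count by the mean detectability
    return observed / effective_strip_fraction(g, half_width)
```

    With a detectability that decays with distance, e.g. `g = lambda d: math.exp(-d)`, the corrected estimate exceeds the raw count, as expected.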

  11. Lead burdens and behavioral impairments of the lined shore crab Pachygrapsus crassipes

    USGS Publications Warehouse

    Hui, Clifford A.

    2002-01-01


  12. Quantitation of Mycotoxins Using Direct Analysis in Real Time Mass Spectrometry (DART-MS).

    PubMed

    Busman, Mark

    2018-05-01

    Ambient ionization represents a new generation of MS ion sources and is used for the rapid ionization of small molecules under ambient conditions. The combination of ambient ionization and MS allows the analysis of multiple food samples with simple or no sample treatment, or in conjunction with prevailing sample preparation methods. Two ambient ionization methods, desorption electrospray ionization (DESI) and direct analysis in real time (DART), have been adapted for food safety applications. Both ionization techniques provide unique advantages and capabilities. DART has been used for a variety of qualitative and quantitative applications. In particular, mycotoxin contamination of food and feed materials has been addressed by DART-MS. Applications to mycotoxin analysis by ambient ionization MS, and particularly DART-MS, are summarized.

  13. Applying of Factor Analyses for Determination of Trace Elements Distribution in Water from River Vardar and Its Tributaries, Macedonia/Greece

    PubMed Central

    Popov, Stanko Ilić; Stafilov, Trajče; Šajn, Robert; Tănăselia, Claudiu; Bačeva, Katerina

    2014-01-01

    A systematic study was carried out to investigate the distribution of fifty-six elements in water samples from the river Vardar (Republic of Macedonia and Greece) and its major tributaries. The samples were collected from 27 sampling sites. Analyses were performed by inductively coupled plasma mass spectrometry (ICP-MS) and inductively coupled plasma atomic emission spectrometry (ICP-AES). Cluster and R-mode factor analysis (FA) were used to identify and characterise element associations, and four associations of elements were determined by this multivariate statistical method. Three factors represent associations of elements that occur naturally in the river water, while Factor 3 represents an anthropogenic association of elements (Cd, Ga, In, Pb, Re, Tl, Cu, and Zn) introduced into the river waters with the waste waters from mining and metallurgical activities in the country. PMID:24587756
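    As a toy illustration of how a shared source produces an element association detectable by multivariate statistics, the sketch below builds a synthetic 27-site concentration matrix and eigendecomposes its correlation matrix (a principal-component stand-in for the R-mode factor analysis used in the study; all data are invented):

```python
import numpy as np

rng = np.random.default_rng(0)
source = rng.normal(size=(27, 1))             # one hypothetical common source
data = np.hstack([
    source + 0.1 * rng.normal(size=(27, 1)),  # two elements driven by the
    source + 0.1 * rng.normal(size=(27, 1)),  # same source -> one association
    rng.normal(size=(27, 1)),                 # an unrelated element
])
corr = np.corrcoef(data, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)            # ascending eigenvalues
share = eigvals[-1] / eigvals.sum()           # variance carried by the
                                              # strongest association
```

    The two correlated columns load on one dominant component, mirroring how co-varying elements group into a single factor.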

  14. Applying of factor analyses for determination of trace elements distribution in water from Vardar and its tributaries, Macedonia/Greece.

    PubMed

    Popov, Stanko Ilić; Stafilov, Trajče; Sajn, Robert; Tănăselia, Claudiu; Bačeva, Katerina

    2014-01-01

    A systematic study was carried out to investigate the distribution of fifty-six elements in water samples from the river Vardar (Republic of Macedonia and Greece) and its major tributaries. The samples were collected from 27 sampling sites. Analyses were performed by inductively coupled plasma mass spectrometry (ICP-MS) and inductively coupled plasma atomic emission spectrometry (ICP-AES). Cluster and R-mode factor analysis (FA) were used to identify and characterise element associations, and four associations of elements were determined by this multivariate statistical method. Three factors represent associations of elements that occur naturally in the river water, while Factor 3 represents an anthropogenic association of elements (Cd, Ga, In, Pb, Re, Tl, Cu, and Zn) introduced into the river waters with the waste waters from mining and metallurgical activities in the country.

  15. Survey of spatial data needs and land use forecasting methods in the electric utility industry

    NASA Technical Reports Server (NTRS)

    1981-01-01

    A representative sample of the electric utility industry in the United States was surveyed to determine industry need for spatial data (specifically LANDSAT and other remotely sensed data) and the methods used by the industry to forecast land use changes and future energy demand. Information was acquired through interviews, written questionnaires, and reports (both published and internal).

  16. A gel-based visual immunoassay for non-instrumental detection of chloramphenicol in food samples.

    PubMed

    Yuan, Meng; Sheng, Wei; Zhang, Yan; Wang, Junping; Yang, Yijin; Zhang, Shuguang; Goryacheva, Irina Yu; Wang, Shuo

    2012-11-02

    A gel-based non-instrumental immuno-affinity assay was developed for the rapid screening of chloramphenicol (CAP) in food samples, with a limit of detection (LOD) of 1 μg L(-1). The immuno-affinity test column (IATC) consisted of a test layer containing anti-CAP antibody-coupled gel and a control layer with anti-HRP antibody-coupled gel. Based on the direct competitive immuno-reaction and the horseradish peroxidase enzymatic reaction, the test results could be evaluated visually: blue color development represented a negative result, while the absence of color development represented a positive result. In this study, CAP-spiked samples of raw milk, pasteurized milk, UHT milk, skimmed milk powder, acacia honey, date honey, fish and shrimp were tested. Little or no sample pretreatment was required for this assay, and the whole procedure was completed within 10 min. In conclusion, the gel-based immuno-affinity test is a simple, rapid, and promising on-site screening method for CAP residues in food samples, with no instrumental requirement. Copyright © 2012 Elsevier B.V. All rights reserved.

  17. Systems and Methods for Correcting Optical Reflectance Measurements

    NASA Technical Reports Server (NTRS)

    Yang, Ye (Inventor); Shear, Michael A. (Inventor); Soller, Babs R. (Inventor); Soyemi, Olusola O. (Inventor)

    2014-01-01

    We disclose measurement systems and methods for measuring analytes in target regions of samples that also include features overlying the target regions. The systems include: (a) a light source; (b) a detection system; (c) a set of at least first, second, and third light ports which transmit light from the light source to a sample and receive and direct light reflected from the sample to the detection system, generating a first set of data including information corresponding to both an internal target within the sample and features overlying the internal target, and a second set of data including information corresponding to features overlying the internal target; and (d) a processor configured to remove information characteristic of the overlying features from the first set of data using the first and second sets of data to produce corrected information representing the internal target.

  18. Systems and methods for correcting optical reflectance measurements

    NASA Technical Reports Server (NTRS)

    Yang, Ye (Inventor); Soller, Babs R. (Inventor); Soyemi, Olusola O. (Inventor); Shear, Michael A. (Inventor)

    2009-01-01

    We disclose measurement systems and methods for measuring analytes in target regions of samples that also include features overlying the target regions. The systems include: (a) a light source; (b) a detection system; (c) a set of at least first, second, and third light ports which transmit light from the light source to a sample and receive and direct light reflected from the sample to the detection system, generating a first set of data including information corresponding to both an internal target within the sample and features overlying the internal target, and a second set of data including information corresponding to features overlying the internal target; and (d) a processor configured to remove information characteristic of the overlying features from the first set of data using the first and second sets of data to produce corrected information representing the internal target.

  19. Forensic discrimination of copper wire using trace element concentrations.

    PubMed

    Dettman, Joshua R; Cassabaum, Alyssa A; Saunders, Christopher P; Snyder, Deanna L; Buscaglia, JoAnn

    2014-08-19

    Copper may be recovered as evidence in high-profile cases such as thefts and improvised explosive device incidents; comparison of copper samples from the crime scene and those associated with the subject of an investigation can provide probative associative evidence and investigative support. A solution-based inductively coupled plasma mass spectrometry method for measuring trace element concentrations in high-purity copper was developed using standard reference materials. The method was evaluated for its ability to use trace element profiles to statistically discriminate between copper samples considering the precision of the measurement and manufacturing processes. The discriminating power was estimated by comparing samples chosen on the basis of the copper refining and production process to represent the within-source (samples expected to be similar) and between-source (samples expected to be different) variability using multivariate parametric- and empirical-based data simulation models with bootstrap resampling. If the false exclusion rate is set to 5%, >90% of the copper samples can be correctly determined to originate from different sources using a parametric-based model and >87% with an empirical-based approach. These results demonstrate the potential utility of the developed method for the comparison of copper samples encountered as forensic evidence.
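    The within-source versus between-source comparison with bootstrap resampling can be caricatured in one dimension as below; the data, threshold, and function name are hypothetical stand-ins for the study's multivariate parametric and empirical models:

```python
import random

def discrimination_rate(sample_a, sample_b, threshold, n_boot=2000, seed=42):
    # fraction of bootstrap resamples in which the mean trace element
    # concentrations of the two copper samples differ by more than `threshold`
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_boot):
        mean_a = sum(rng.choices(sample_a, k=len(sample_a))) / len(sample_a)
        mean_b = sum(rng.choices(sample_b, k=len(sample_b))) / len(sample_b)
        if abs(mean_a - mean_b) > threshold:
            hits += 1
    return hits / n_boot
```

    Applied to same-source pairs this estimates the false exclusion rate; applied to different-source pairs it estimates the discriminating power.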

  20. Rapid and sensitive determination of tellurium in soil and plant samples by sector-field inductively coupled plasma mass spectrometry.

    PubMed

    Yang, Guosheng; Zheng, Jian; Tagami, Keiko; Uchida, Shigeo

    2013-11-15

    In this work, we report a rapid and highly sensitive analytical method for the determination of tellurium in soil and plant samples using sector-field inductively coupled plasma mass spectrometry (SF-ICP-MS). Soil and plant samples were digested using aqua regia. After appropriate dilution, Te in soil and plant samples was analyzed directly, without any separation or preconcentration. This simple sample preparation approach minimized contamination and loss of Te prior to analysis. The developed analytical method was validated by the analysis of soil/sediment and plant reference materials. Satisfactory detection limits of 0.17 ng g(-1) for soil and 0.02 ng g(-1) for plant samples were achieved, which makes the developed method applicable to studying the soil-to-plant transfer factor of Te. This work provides the first soil-to-plant transfer factor data for Te in Japanese samples, which can be used for the estimation of the internal radiation dose from radioactive tellurium due to the Fukushima Daiichi Nuclear Power Plant accident. Copyright © 2013 Elsevier B.V. All rights reserved.

  1. Communication Modality Sampling for a Toddler with Angelman Syndrome

    ERIC Educational Resources Information Center

    Martin, Jolene Hyppa; Reichle, Joe; Dimian, Adele; Chen, Mo

    2013-01-01

    Purpose: Vocal, gestural, and graphic communication modes were implemented concurrently with a toddler with Angelman syndrome to identify the most efficiently learned communication mode to emphasize in an initial augmentative communication system. Method: Symbols representing preferred objects were introduced in vocal, gestural, and graphic…

  2. New Insights Toward Quantitative Relationships between Lignin Reactivity to Monomers and Their Structural Characteristics.

    PubMed

    Ma, Ruoshui; Zhang, Xiumei; Wang, Yi; Zhang, Xiao

    2018-04-27

    The heterogeneous and complex structural characteristics of lignin present a significant challenge to predicting its processability (e.g. depolymerization, modification) to valuable products. This study provides a detailed characterization and comparison of the structural properties of seven representative biorefinery lignin samples derived from forest and agricultural residues, which were subjected to representative pretreatment methods. A range of wet chemistry and spectroscopy methods were applied to determine specific lignin structural characteristics such as functional groups, inter-unit linkages and peak molecular weight. In parallel, oxidative depolymerization of these lignin samples to either monomeric phenolic compounds or dicarboxylic acids was conducted, and the product yields were quantified. Based on these results (lignin structural characteristics and monomer yields), we demonstrated for the first time the application of a multiple-variable linear estimation (MVLE) approach, using R statistics, to gain insight into a quantitative correlation between lignin structural properties and their conversion reactivity toward oxidative depolymerization to monomers. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Social surveys in HIV/AIDS: telling or writing? A comparison of interview and postal methods.

    PubMed

    McEwan, R T; Harrington, B E; Bhopal, R S; Madhok, R; McCallum, A

    1992-06-01

    We compare a probability sample postal questionnaire survey and a quota controlled interview survey, and review the literature on these subjects. In contrast to other studies, where quota samples were not representative because of biased selection of respondents by interviewers, our quota sample was representative. Response rates were similar in our postal and interview surveys (74 and 77%, respectively), although many previous similar postal surveys had poor response rates. As in other comparison studies, costs were higher in our interview survey, substantive responses and the quality of responses to closed-ended questions were similar, and responses to open-ended questions were better in the interview survey. 'Socially unacceptable' responses on sexual behaviour were less likely in interviews. Quota controlled surveys are appropriate in surveys on HIV/AIDS under certain circumstances, e.g. where the population parameters are well known, and where interviewers can gain access to the entire population. Postal questionnaires are better for obtaining information on sexual behaviour, if adequate steps are taken to improve response rates, and when in-depth answers are not needed. For most surveys in the HIV/AIDS field we recommend the postal method.

  4. Exploring Representativeness and Informativeness for Active Learning.

    PubMed

    Du, Bo; Wang, Zengmao; Zhang, Lefei; Zhang, Liangpei; Liu, Wei; Shen, Jialie; Tao, Dacheng

    2017-01-01

    How can we find a general way to choose the most suitable samples for training a classifier, even with very limited prior information? Active learning, which can be regarded as an iterative optimization procedure, plays a key role in constructing a refined training set to improve classification performance in a variety of applications, such as text analysis, image recognition, and social network modeling. Although combining the representativeness and informativeness of samples has been proven promising for active sampling, state-of-the-art methods perform well only under certain data structures. Can we then find a way to fuse the two active sampling criteria without any assumption on the data? This paper proposes a general active learning framework that effectively fuses the two criteria. Inspired by a two-sample discrepancy problem, triple measures are elaborately designed to guarantee that the query samples not only possess the representativeness of the unlabeled data but also reveal the diversity of the labeled data. Any appropriate similarity measure can be employed to construct the triple measures. Meanwhile, an uncertainty measure is leveraged to generate the informativeness criterion, which can be carried out in different ways. Rooted in this framework, a practical active learning algorithm is proposed, which exploits a radial basis function together with the estimated probabilities to construct the triple measures, and a modified best-versus-second-best strategy to construct the uncertainty measure, respectively. Experimental results on benchmark datasets demonstrate that our algorithm consistently achieves superior performance over state-of-the-art active learning algorithms.
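    Two of the building blocks named above, an RBF similarity and a best-versus-second-best (BvSB) uncertainty, can be sketched as follows; the helper names and toy probabilities are hypothetical, and this is only the similarity/informativeness core, not the paper's full triple-measure framework:

```python
import math

def rbf_similarity(x, y, gamma=1.0):
    # radial basis function kernel between two feature vectors
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def bvsb_uncertainty(probs):
    # best-versus-second-best margin: the smaller the gap between the two
    # top class probabilities, the more informative the sample
    top = sorted(probs, reverse=True)
    return 1.0 - (top[0] - top[1])

def pick_query(prob_rows):
    # index of the unlabeled sample with the most ambiguous prediction
    return max(range(len(prob_rows)),
               key=lambda i: bvsb_uncertainty(prob_rows[i]))

prob_rows = [[0.90, 0.05, 0.05],   # confident -> uninformative
             [0.40, 0.35, 0.25],   # ambiguous -> query this one
             [0.60, 0.30, 0.10]]
```

    In the full framework, such an uncertainty score would be fused with kernel-based representativeness terms before selecting the query.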

  5. 19 CFR 151.52 - Sampling procedures.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    .... Representative commercial moisture and assay samples shall be taken under Customs supervision for testing by the Customs laboratory. The samples used for the moisture test shall be representative of the shipment at the... verified commercial moisture sample and prepared assay sample certified to be representative of the...

  6. Bacteriophage-based nanoprobes for rapid bacteria separation

    NASA Astrophysics Data System (ADS)

    Chen, Juhong; Duncan, Bradley; Wang, Ziyuan; Wang, Li-Sheng; Rotello, Vincent M.; Nugen, Sam R.

    2015-10-01

    The lack of practical methods for bacterial separation remains a hindrance for the low-cost and successful development of rapid detection methods from complex samples. Antibody-tagged magnetic particles are commonly used to pull analytes from a liquid sample. While this method is well-established, improvements in capture efficiencies would result in an increase of the overall detection assay performance. Bacteriophages represent a low-cost and more consistent biorecognition element as compared to antibodies. We have developed nanoscale bacteriophage-tagged magnetic probes, where T7 bacteriophages were bound to magnetic nanoparticles. The nanoprobe allowed the specific recognition and attachment to E. coli cells. The phage magnetic nanoprobes were directly compared to antibody-conjugated magnetic nanoprobes. The capture efficiencies of bacteriophages and antibodies on nanoparticles for the separation of E. coli K12 at varying concentrations were determined. The results indicated a similar bacteria capture efficiency between the two nanoprobes. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr03779d

  7. Rectified factor networks for biclustering of omics data.

    PubMed

    Clevert, Djork-Arné; Unterthiner, Thomas; Povysil, Gundula; Hochreiter, Sepp

    2017-07-15

    Biclustering has become a major tool for analyzing large datasets given as a matrix of samples times features and has been successfully applied in the life sciences and e-commerce, for drug design and recommender systems, respectively. Factor Analysis for Bicluster Acquisition (FABIA), one of the most successful biclustering methods, is a generative model that represents each bicluster by two sparse membership vectors: one for the samples and one for the features. However, FABIA is restricted to about 20 code units because of the high computational complexity of computing the posterior. Furthermore, code units are sometimes insufficiently decorrelated and sample membership is difficult to determine. We propose to use the recently introduced unsupervised Deep Learning approach Rectified Factor Networks (RFNs) to overcome the drawbacks of existing biclustering methods. RFNs efficiently construct very sparse, non-linear, high-dimensional representations of the input via their posterior means. RFN learning is a generalized alternating minimization algorithm based on the posterior regularization method which enforces non-negative and normalized posterior means. Each code unit represents a bicluster, where samples for which the code unit is active belong to the bicluster and features that have activating weights to the code unit belong to the bicluster. On 400 benchmark datasets and on three gene expression datasets with known clusters, RFN outperformed 13 other biclustering methods including FABIA. On data from the 1000 Genomes Project, RFN could identify DNA segments which indicate that interbreeding with other hominins started already before the ancestors of modern humans left Africa. https://github.com/bioinf-jku/librfn. djork-arne.clevert@bayer.com or hochreit@bioinf.jku.at. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  8. Sampling the structure and chemical order in assemblies of ferromagnetic nanoparticles by nuclear magnetic resonance

    PubMed Central

    Liu, Yuefeng; Luo, Jingjie; Shin, Yooleemi; Moldovan, Simona; Ersen, Ovidiu; Hébraud, Anne; Schlatter, Guy; Pham-Huu, Cuong; Meny, Christian

    2016-01-01

    Assemblies of nanoparticles are studied in many research fields from physics to medicine. However, as it is often difficult to produce mono-dispersed particles, investigating the key parameters enhancing their efficiency is blurred by wide size distributions. Indeed, near-field methods analyse a part of the sample that might not be representative of the full size distribution and macroscopic methods give average information including all particle sizes. Here, we introduce temperature differential ferromagnetic nuclear resonance spectra that allow sampling the crystallographic structure, the chemical composition and the chemical order of non-interacting ferromagnetic nanoparticles for specific size ranges within their size distribution. The method is applied to cobalt nanoparticles for catalysis and allows extracting the size effect from the crystallographic structure effect on their catalytic activity. It also allows sampling of the chemical composition and chemical order within the size distribution of alloyed nanoparticles and can thus be useful in many research fields. PMID:27156575

  9. Validation of the 3M molecular detection system for the detection of listeria in meat, seafood, dairy, and retail environments.

    PubMed

    Fortes, Esther D; David, John; Koeritzer, Bob; Wiedmann, Martin

    2013-05-01

    There is a continued need to develop improved rapid methods for detection of foodborne pathogens. The aim of this project was to evaluate the 3M Molecular Detection System (3M MDS), which uses isothermal DNA amplification, and the 3M Molecular Detection Assay Listeria using environmental samples obtained from retail delicatessens and meat, seafood, and dairy processing plants. Environmental sponge samples were tested for Listeria with the 3M MDS after 22 and 48 h of enrichment in 3M Modified Listeria Recovery Broth (3M mLRB); enrichments were also used for cultural detection of Listeria spp. Among 391 samples tested for Listeria, 74 were positive by both the 3M MDS and the cultural method, 310 were negative by both methods, 2 were positive by the 3M MDS and negative by the cultural method, and one sample was negative by the 3M MDS and positive by the cultural method. Four samples were removed from the sample set, prior to statistical analyses, due to potential cross-contamination during testing. Listeria isolates from positive samples represented L. monocytogenes, L. innocua, L. welshimeri, and L. seeligeri. Overall, the 3M MDS and culture-based detection after enrichment in 3M mLRB did not differ significantly (α = 0.05) with regard to the number of positive samples, when chi-square analyses were performed for (i) number of positive samples after 22 h, (ii) number of positive samples after 48 h, and (iii) number of positive samples after 22 and/or 48 h of enrichment in 3M mLRB. Among 288 sampling sites that were tested with duplicate sponges, 67 each tested positive with the 3M MDS and the traditional U.S. Food and Drug Administration Bacteriological Analytical Manual method, further supporting that the 3M MDS performs equivalently to traditional methods when used with environmental sponge samples.

  10. Face recognition via sparse representation of SIFT feature on hexagonal-sampling image

    NASA Astrophysics Data System (ADS)

    Zhang, Daming; Zhang, Xueyong; Li, Lu; Liu, Huayong

    2018-04-01

    This paper investigates a face recognition approach based on the Scale Invariant Feature Transform (SIFT) feature and sparse representation. The approach takes advantage of SIFT, which is a local feature rather than the holistic feature used in the classical Sparse Representation based Classification (SRC) algorithm, and possesses strong robustness to expression, pose and illumination variations. Since hexagonal images have more inherent merits than square images for making the recognition process efficient, we extract SIFT keypoints from hexagonally sampled images. Instead of matching SIFT features directly, first the sparse representation of each SIFT keypoint is computed over a constructed dictionary; second, these sparse vectors are quantized according to the dictionary; finally, each face image is represented by a histogram, and these so-called Bag-of-Words vectors are classified by an SVM. Owing to the use of local features, the proposed method achieves good results even when the number of training samples is small. In the experiments, the proposed method gave higher face recognition rates than other methods on the ORL and Yale B face databases; the effectiveness of hexagonal sampling in the proposed method was also verified.
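    The quantize-then-histogram step described above can be sketched as follows. For brevity this sketch assigns each descriptor to its single nearest dictionary atom (a hard-assignment simplification of the paper's sparse coding), with illustrative 2-D descriptors in place of 128-D SIFT vectors:

```python
# Sketch of the bag-of-words representation: assign each local descriptor to
# its nearest dictionary atom, then represent the image by the normalized
# histogram of atom counts. Names and data are illustrative assumptions.

def nearest_atom(desc, dictionary):
    best, best_d = 0, float("inf")
    for k, atom in enumerate(dictionary):
        d = sum((a - b) ** 2 for a, b in zip(desc, atom))  # squared Euclidean distance
        if d < best_d:
            best, best_d = k, d
    return best

def bow_histogram(descriptors, dictionary):
    hist = [0.0] * len(dictionary)
    for desc in descriptors:
        hist[nearest_atom(desc, dictionary)] += 1.0
    total = sum(hist)
    return [h / total for h in hist]

dictionary = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]          # 3 "visual words"
descriptors = [(0.1, 0.1), (0.9, 0.1), (0.8, -0.1), (0.1, 0.9)]
print(bow_histogram(descriptors, dictionary))  # [0.25, 0.5, 0.25]
```

    The resulting fixed-length histograms can then be fed to any vector classifier such as an SVM, regardless of how many keypoints each face image produced.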

  11. Sample selection via angular distance in the space of the arguments of an artificial neural network

    NASA Astrophysics Data System (ADS)

    Fernández Jaramillo, J. M.; Mayerle, R.

    2018-05-01

    In the construction of an artificial neural network (ANN), proper splitting of the available samples plays a major role in the training process. The selection of subsets for training, testing and validation affects the generalization ability of the neural network, and the number of samples affects the time required to design and train the ANN. This paper introduces an efficient and simple method for reducing the set of samples used for training a neural network. The method reduces the time required to calculate the network coefficients while keeping diversity and avoiding overtraining of the ANN due to the presence of similar samples. The proposed method is based on the calculation of the angle between two vectors, each one representing one input of the neural network. When the angle formed between samples is smaller than a defined threshold, only one input is accepted for training. The accepted inputs are scattered throughout the sample space. Tidal records are used to demonstrate the proposed method. The results of a cross-validation show that with few inputs the quality of the outputs is not accurate and depends on the selection of the first sample, but as the number of inputs increases the accuracy improves and the differences among scenarios with different starting samples are substantially reduced. A comparison with the K-means clustering algorithm shows that, for this application, the proposed method produces a more accurate network with a smaller number of samples.
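    The angle-based selection rule described above can be sketched directly: accept a candidate input vector only if the angle it forms with every previously accepted vector exceeds a threshold. The threshold value and all names here are illustrative, not taken from the paper:

```python
# Sketch of angular-distance sample selection for ANN training sets:
# near-parallel input vectors are treated as redundant and only one is kept.
import math

def angle(u, v):
    """Angle in degrees between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    # clamp to [-1, 1] to guard against floating-point drift in acos
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (nu * nv)))))

def select_samples(samples, threshold_deg=15.0):
    accepted = []
    for s in samples:
        if all(angle(s, a) >= threshold_deg for a in accepted):
            accepted.append(s)
    return accepted

vectors = [(1.0, 0.0), (0.99, 0.05), (0.0, 1.0), (0.7, 0.7)]
kept = select_samples(vectors)
print(len(kept))  # the near-duplicate of the first vector is rejected
```

    As the abstract notes, the outcome depends on which sample is accepted first; iterating over several starting samples is one way to check the sensitivity of the reduced set.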

  12. Comparison of Sample Size by Bootstrap and by Formulas Based on Normal Distribution Assumption.

    PubMed

    Wang, Zuozhen

    2018-01-01

    The bootstrapping technique is distribution-independent and provides an indirect way to estimate the sample size for a clinical trial based on a relatively smaller sample. In this paper, sample size estimation for comparing two parallel-design arms with continuous data by a bootstrap procedure is presented for various test types (inequality, non-inferiority, superiority, and equivalence). Meanwhile, sample size calculation by mathematical formulas (under the normal distribution assumption) for the identical data is also carried out. The power difference between the two calculation methods is acceptably small for all the test types, showing that the bootstrap procedure is a credible technique for sample size estimation. We then compared the powers determined using the two methods on data that violate the normal distribution assumption. To accommodate this feature of the data, the nonparametric Wilcoxon test was applied to compare the two groups during bootstrap power estimation. As a result, the power estimated by the normal distribution-based formula is far larger than that estimated by bootstrap for each specific sample size per group. Hence, for this type of data, it is preferable that the bootstrap method be applied for sample size calculation from the beginning, and that the same statistical method as used in the subsequent statistical analysis be employed for each bootstrap sample during bootstrap sample size estimation, provided historical data are available that are well representative of the population to which the proposed trial plans to extrapolate.
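    The core bootstrap loop outlined above can be sketched briefly: resample a pilot dataset at a candidate per-arm size and count how often the test rejects. For a self-contained sketch, a plain two-sample z-test on means stands in for the Wilcoxon test used in the paper; all names, pilot data, and parameters are illustrative:

```python
# Sketch of bootstrap power estimation for a two-arm comparison:
# resample pilot arms with replacement at size n_per_arm, apply the test,
# and report the rejection fraction as the estimated power.
import random
import statistics

def bootstrap_power(pilot_a, pilot_b, n_per_arm, n_boot=2000, z_crit=1.96, seed=1):
    rng = random.Random(seed)
    rejections = 0
    for _ in range(n_boot):
        a = [rng.choice(pilot_a) for _ in range(n_per_arm)]
        b = [rng.choice(pilot_b) for _ in range(n_per_arm)]
        se = (statistics.pvariance(a) / n_per_arm
              + statistics.pvariance(b) / n_per_arm) ** 0.5
        if se > 0 and abs(statistics.mean(a) - statistics.mean(b)) / se > z_crit:
            rejections += 1
    return rejections / n_boot

# Hypothetical pilot arms with a true mean difference of about 1.0 (sd ~1).
rng = random.Random(0)
pilot_a = [rng.gauss(0.0, 1.0) for _ in range(30)]
pilot_b = [rng.gauss(1.0, 1.0) for _ in range(30)]
print(bootstrap_power(pilot_a, pilot_b, n_per_arm=20))
```

    To choose a sample size, this estimate would be computed over a grid of n_per_arm values and the smallest n reaching the target power (e.g., 80%) selected; in practice the z-test would be replaced by the same test planned for the final analysis.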

  13. Combined target factor analysis and Bayesian soft-classification of interference-contaminated samples: forensic fire debris analysis.

    PubMed

    Williams, Mary R; Sigman, Michael E; Lewis, Jennifer; Pitan, Kelly McHugh

    2012-10-10

    A Bayesian soft classification method combined with target factor analysis (TFA) is described and tested for the analysis of fire debris data. The method relies on analysis of the average mass spectrum across the chromatographic profile (i.e., the total ion spectrum, TIS) from multiple samples taken from a single fire scene. A library of TIS from reference ignitable liquids with assigned ASTM classifications is used as the target factors in TFA. The class-conditional distributions of correlations between the target and predicted factors for each ASTM class are represented by kernel functions and analyzed by Bayesian decision theory. The soft classification approach assists in assessing the probability that ignitable liquid residue from a specific ASTM E1618 class is present in a set of samples from a single fire scene, even in the presence of unspecified background contributions from pyrolysis products. The method is demonstrated with sample data sets and then tested on laboratory-scale burn data and large-scale field test burns. The overall performance achieved in laboratory and field tests of the method is approximately 80% correct classification of fire debris samples. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  14. Recent and ancient recharge deciphered by multi-dating tracer technique

    NASA Astrophysics Data System (ADS)

    Dogramaci, Shawan; Cook, Peter; McCallum, James; Purtschert, Roland

    2017-04-01

    Determining groundwater residence time from environmental tracer concentrations obtained from open bores or long screened intervals is fraught with difficulty because the sampled water represents a variety of ages. Information on the distribution of groundwater age is commonly obtained by measuring more than one tracer. We examined the use of a multi-tracer technique representing different time frames (39Ar, 85Kr, 14C, 3H, CFC-11, CFC-12, CFC-113, SF6 and Cl) to decipher the ages of groundwater sampled from long screened bores in a regional aquifer in the Pilbara region of northwest Australia. We then applied a technique that assumes limited details of the form of the age distribution. Tracer concentrations suggest that the groundwater samples are a mixture of young and old water - the former is inferred to represent localised recharge from an adjacent creek, and the latter diffuse recharge. Using our method, we were able to identify distinct age components in the groundwater. The results suggest the presence of four distinct age groups: between zero and 20 years, 50 to 100 years, 100 to 600 years and approximately 1000 years old. These relatively high recharge events were consistent with local recharge sources (50-100 years) and confirmed by the palaeo-climate record obtained from lake sediments. We found that although the ages of these components were well constrained, the relative proportions of each component were highly sensitive to errors in the environmental tracer data. Our results show that the method we implemented can identify distinct age groups in groundwater samples without prior knowledge of the age distribution. The presence of distinct recharge times gives insight into groundwater flow conditions over long periods of time.

  15. Relative extraction ratio (RER) for arsenic and heavy metals in soils and tailings from various metal mines, Korea.

    PubMed

    Son, Hye Ok; Jung, Myung Chae

    2011-01-01

    This study focused on the evaluation of leaching behaviours of arsenic and heavy metals (Cd, Cu, Ni, Pb and Zn) in soils and tailings contaminated by mining activities. Ten representative mine soils were taken at four representative metal mines in Korea. To evaluate the leaching characteristics of the samples, eight extraction methods were adopted, namely 0.1 M HCl, 0.5 M HCl, 1.0 M HCl, 3.0 M HCl, the Korean Standard Leaching Procedure for waste materials (KSLP), the Synthetic Precipitation Leaching Procedure (SPLP), the Toxicity Characteristic Leaching Procedure (TCLP) and aqua regia extraction (AR). In order to compare element concentrations across extraction methods, relative extraction ratios (RERs, %), defined as the element concentration extracted by an individual leaching method divided by that extracted by aqua regia based on USEPA method 3050B, were calculated. Although RER values vary with sample type and element, they increase with increasing ionic strength of the extracting solution. Thus, the RERs for arsenic and heavy metals in the samples increased in the order KSLP < SPLP < TCLP < 0.1 M HCl < 0.5 M HCl < 1.0 M HCl < 3.0 M HCl. For the same extraction method, the RER values for Cd and Zn were relatively higher than those for As, Cu, Ni and Pb. This may be due to differences in the geochemical behaviour of each element, namely the high solubility of Cd and Zn and the low solubility of As, Cu, Ni and Pb in the surface environment. Thus, the extraction results can give important information on the degree and extent of arsenic and heavy metal dispersion in the surface environment.
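    The RER definition above is a straightforward ratio, which the following sketch computes for one element across several leaching methods. The concentration values are hypothetical, for illustration only:

```python
# Sketch: relative extraction ratio (RER, %) = concentration extracted by a
# given leaching method / concentration extracted by aqua regia x 100.

def rer(leach_conc, aqua_regia_conc):
    """Relative extraction ratio (%) for one element and one method."""
    return 100.0 * leach_conc / aqua_regia_conc

# Hypothetical Zn concentrations (mg/kg) for one tailings sample.
aqua_regia_zn = 1200.0
methods = {"KSLP": 12.0, "TCLP": 96.0, "0.1M HCl": 240.0, "3.0M HCl": 1080.0}
ratios = {m: rer(c, aqua_regia_zn) for m, c in methods.items()}
print(ratios["3.0M HCl"])  # 90.0
```

    With real data, sorting methods by their RER reproduces the ionic-strength ordering reported in the abstract (KSLP lowest, 3.0 M HCl highest).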

  16. Detection methods for human enteric viruses in representative foods.

    PubMed

    Leggitt, P R; Jaykus, L A

    2000-12-01

    Although viral foodborne disease is a significant problem, foods are rarely tested for viral contamination, and when testing is done, it is limited to shellfish commodities. In this work, we report a method to extract and detect human enteric viruses from alternative food commodities using an elution-concentration approach followed by detection with reverse transcription-polymerase chain reaction (RT-PCR). Fifty-gram lettuce or hamburger samples were artificially inoculated with poliovirus type 1 (PV1), hepatitis A virus (HAV), or the Norwalk virus and processed by the sequential steps of homogenization, filtration, Freon extraction (hamburger), and polyethylene glycol (PEG) precipitation. To reduce volumes further and remove RT-PCR inhibitors, a secondary PEG precipitation was necessary, resulting in an overall 10- to 20-fold sample size reduction from 50 g to 3 to 5 ml. Virus recoveries in secondary PEG concentrates ranged from 10 to 70% for PV1 and 2 to 4% for HAV as evaluated by mammalian cell culture infectivity assay. Total RNA from PEG concentrates was extracted to a small volume (30 to 40 microl) and subjected to RT-PCR amplification of viral RNA sequences. Detection limit studies indicated that viral RNA was consistently detected by RT-PCR at initial inoculum levels ≥10(2) PFU/50-g food sample for PV1 and ≥10(3) PFU/50-g food sample for HAV. In similar studies with the Norwalk virus, detection at inoculum levels ≥1.5 × 10(3) PCR-amplifiable units/50-g sample was possible for both food products. All RT-PCR amplicons were confirmed by subsequent Southern hybridization. The procedure reported represents progress toward the development of methods to detect human enteric viral contamination in foods other than shellfish.

  17. Sampling and sample processing in pesticide residue analysis.

    PubMed

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.

  18. Do your extractable TPH concentrations represent dissolved petroleum? An update on applied research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zemo, D.A.

    1997-12-31

    Elevated concentrations of "dissolved-phase" extractable total petroleum hydrocarbons (TPH) in groundwater samples can be a significant impediment to site closure in states that regulate groundwater using TPH criteria. These analytical results are inconsistent with petroleum chemistry because of the relatively low water solubility of petroleum products. This paper presents an update of our research into the source of medium- to high-boiling TPH detections in groundwater samples and application of the results to multiple projects. This work follows from a 1995 publication in which positive interferences to the Method 8015M (GC-FID) TPH measurement by soluble, non-petroleum hydrocarbons resulting from intrinsic bioremediation or non-dissolved petroleum adhered to particulates were described. The 1995 paper was largely theoretical and focused on one case study. Since 1995, we have evaluated the source of TPH detections in groundwater at numerous petroleum sites and have demonstrated the significance of interferences to the Method 8015M measurement to the California regulatory community. Our work has shown conclusively that elevated concentrations of extractable TPH are not representative of dissolved petroleum constituents. We have shown that sample cleanup prior to analysis, using silica gel cleanup (to remove polar non-petroleum hydrocarbons) and/or laboratory filtration (to reduce petroleum-affected particulates), is required to overcome the false positives caused by interferences to the Method 8015M measurement.

  19. High-throughput immunomagnetic scavenging technique for quantitative analysis of live VX nerve agent in water, hamburger, and soil matrixes.

    PubMed

    Knaack, Jennifer S; Zhou, Yingtao; Abney, Carter W; Prezioso, Samantha M; Magnuson, Matthew; Evans, Ronald; Jakubowski, Edward M; Hardy, Katelyn; Johnson, Rudolph C

    2012-11-20

    We have developed a novel immunomagnetic scavenging technique for extracting cholinesterase inhibitors from aqueous matrixes using biological targeting and antibody-based extraction. The technique was characterized using the organophosphorus nerve agent VX. The limit of detection for VX in high-performance liquid chromatography (HPLC)-grade water, defined as the lowest calibrator concentration, was 25 pg/mL in a small, 500 μL sample. The method was characterized over the course of 22 sample sets containing calibrators, blanks, and quality control samples. Method precision, expressed as the mean relative standard deviation, was less than 9.2% for all calibrators. Quality control sample accuracy was 102% and 100% of the mean for VX spiked into HPLC-grade water at concentrations of 2.0 and 0.25 ng/mL, respectively. This method was successfully applied to aqueous extracts from soil, hamburger, and finished tap water spiked with VX. Recovery was 65%, 81%, and 100% from these matrixes, respectively. Biologically based extractions of organophosphorus compounds represent a new technique for sample extraction that provides an increase in extraction specificity and sensitivity.

  20. A systematic random sampling scheme optimized to detect the proportion of rare synapses in the neuropil.

    PubMed

    da Costa, Nuno Maçarico; Hepp, Klaus; Martin, Kevan A C

    2009-05-30

    Synapses can only be morphologically identified by electron microscopy and this is often a very labor-intensive and time-consuming task. When quantitative estimates are required for pathways that contribute a small proportion of synapses to the neuropil, the problems of accurate sampling are particularly severe and the total time required may become prohibitive. Here we present a sampling method devised to count the percentage of rarely occurring synapses in the neuropil using a large sample (approximately 1000 sampling sites), with the strong constraint of doing it in reasonable time. The strategy, which uses the unbiased physical disector technique, resembles that used in particle physics to detect rare events. We validated our method in the primary visual cortex of the cat, where we used biotinylated dextran amine to label thalamic afferents and measured the density of their synapses using the physical disector method. Our results show that we could obtain accurate counts of the labeled synapses, even when they represented only 0.2% of all the synapses in the neuropil.
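    A back-of-envelope binomial calculation (not the authors' derivation) illustrates why such rare synapses demand large samples: to estimate a proportion p with a relative standard error cv, roughly n ≈ (1 − p)/(p·cv²) independently sampled synapses are needed. Since each physical disector site inspects many synapses at once, the number of sampling sites can be correspondingly smaller, as in the ~1000 sites used here. The function name and the 20% CV target are illustrative assumptions:

```python
# Sketch: binomial sample size for estimating a rare proportion p with
# relative standard error cv, using n ~ (1 - p) / (p * cv^2).
import math

def samples_needed(p, cv=0.2):
    """Independent sampling units needed to estimate proportion p at relative SE cv."""
    return math.ceil((1.0 - p) / (p * cv * cv))

print(samples_needed(0.002))  # a 0.2% proportion calls for on the order of 10^4 sampled synapses
```

    Halving the target CV quadruples the required sample, which is why labeling efficiency and sampling design matter so much for rare pathways.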

  1. Determination of protein carbonyls in plasma, cell extracts, tissue homogenates, isolated proteins: Focus on sample preparation and derivatization conditions

    PubMed Central

    Weber, Daniela; Davies, Michael J.; Grune, Tilman

    2015-01-01

    Protein oxidation is involved in regulatory physiological events as well as in damage to tissues and is thought to play a key role in the pathophysiology of diseases and in the aging process. Protein-bound carbonyls represent a marker of global protein oxidation, as they are generated by multiple different reactive oxygen species in blood, tissues and cells. Sample preparation and stabilization are key steps in the accurate quantification of oxidation-related products and examination of physiological/pathological processes. This review therefore focuses on the sample preparation processes used in the most relevant methods to detect protein carbonyls after derivatization with 2,4-dinitrophenylhydrazine with an emphasis on measurement in plasma, cells, organ homogenates, isolated proteins and organelles. Sample preparation, derivatization conditions and protein handling are presented for the spectrophotometric and HPLC method as well as for immunoblotting and ELISA. An extensive overview covering these methods in previously published articles is given for researchers who plan to measure protein carbonyls in different samples. PMID:26141921

  2. Determination of protein carbonyls in plasma, cell extracts, tissue homogenates, isolated proteins: Focus on sample preparation and derivatization conditions.

    PubMed

    Weber, Daniela; Davies, Michael J; Grune, Tilman

    2015-08-01

    Protein oxidation is involved in regulatory physiological events as well as in damage to tissues and is thought to play a key role in the pathophysiology of diseases and in the aging process. Protein-bound carbonyls represent a marker of global protein oxidation, as they are generated by multiple different reactive oxygen species in blood, tissues and cells. Sample preparation and stabilization are key steps in the accurate quantification of oxidation-related products and examination of physiological/pathological processes. This review therefore focuses on the sample preparation processes used in the most relevant methods to detect protein carbonyls after derivatization with 2,4-dinitrophenylhydrazine with an emphasis on measurement in plasma, cells, organ homogenates, isolated proteins and organelles. Sample preparation, derivatization conditions and protein handling are presented for the spectrophotometric and HPLC method as well as for immunoblotting and ELISA. An extensive overview covering these methods in previously published articles is given for researchers who plan to measure protein carbonyls in different samples. © 2015 Published by Elsevier Ltd.

  3. International Study to Evaluate PCR Methods for Detection of Trypanosoma cruzi DNA in Blood Samples from Chagas Disease Patients

    PubMed Central

    Schijman, Alejandro G.; Bisio, Margarita; Orellana, Liliana; Sued, Mariela; Duffy, Tomás; Mejia Jaramillo, Ana M.; Cura, Carolina; Auter, Frederic; Veron, Vincent; Qvarnstrom, Yvonne; Deborggraeve, Stijn; Hijar, Gisely; Zulantay, Inés; Lucero, Raúl Horacio; Velazquez, Elsa; Tellez, Tatiana; Sanchez Leon, Zunilda; Galvão, Lucia; Nolder, Debbie; Monje Rumi, María; Levi, José E.; Ramirez, Juan D.; Zorrilla, Pilar; Flores, María; Jercic, Maria I.; Crisante, Gladys; Añez, Néstor; De Castro, Ana M.; Gonzalez, Clara I.; Acosta Viana, Karla; Yachelini, Pedro; Torrico, Faustino; Robello, Carlos; Diosque, Patricio; Triana Chavez, Omar; Aznar, Christine; Russomando, Graciela; Büscher, Philippe; Assal, Azzedine; Guhl, Felipe; Sosa Estani, Sergio; DaSilva, Alexandre; Britto, Constança; Luquetti, Alejandro; Ladzins, Janis

    2011-01-01

    Background A century after its discovery, Chagas disease still represents a major neglected tropical threat. Accurate diagnostic tools as well as surrogate markers of parasitological response to treatment are research priorities in the field. The purpose of this study was to evaluate the performance of PCR methods in detection of Trypanosoma cruzi DNA by an external quality evaluation. Methodology/Findings An international collaborative study was launched by expert PCR laboratories from 16 countries. Currently used strategies were challenged against serial dilutions of purified DNA from stocks representing T. cruzi discrete typing units (DTU) I, IV and VI (set A), human blood spiked with parasite cells (set B) and Guanidine Hydrochloride-EDTA blood samples from 32 seropositive and 10 seronegative patients from Southern Cone countries (set C). Forty-eight PCR tests were reported for set A and 44 for sets B and C; 28 targeted minicircle DNA (kDNA), 13 satellite DNA (Sat-DNA) and the remainder low copy number sequences. In set A, commercial master mixes and Sat-DNA Real Time PCR showed better specificity, but kDNA-PCR was more sensitive to detect DTU I DNA. In set B, commercial DNA extraction kits presented better specificity than solvent extraction protocols. Sat-DNA PCR tests had higher specificity, with sensitivities of 0.05–0.5 parasites/mL, whereas specific kDNA tests detected 5 × 10(−3) par/mL. Sixteen specific and coherent methods had a Good Performance in both sets A and B (10 fg/µl of DNA from all stocks, 5 par/mL spiked blood). The median values of sensitivities, specificities and accuracies obtained in testing the Set C samples with the 16 tests determined to be good performing by analyzing Sets A and B samples varied considerably. 
Of these, four methods showed the best performance parameters in all three sets of samples, detecting at least 10 fg/µl for each DNA stock, 0.5 par/mL and a sensitivity between 83.3–94.4%, specificity of 85–95%, accuracy of 86.8–89.5% and kappa index of 0.7–0.8 compared to consensus PCR reports of the 16 good performing tests, and 63–69%, 100%, 71.4–76.2% and 0.4–0.5, respectively, compared to serodiagnosis. Method LbD2 used solvent extraction followed by Sybr-Green based Real time PCR targeted to Sat-DNA; method LbD3 used solvent DNA extraction followed by conventional PCR targeted to Sat-DNA. The third method (LbF1) used glass fiber column based DNA extraction followed by TaqMan Real Time PCR targeted to Sat-DNA (cruzi 1/cruzi 2 and cruzi 3 TaqMan probe) and the fourth method (LbQ) used solvent DNA extraction followed by conventional hot-start PCR targeted to kDNA (primer pairs 121/122). These four methods were further evaluated at the coordinating laboratory in a subset of human blood samples, confirming the performance obtained by the participating laboratories. Conclusion/Significance This study represents a first crucial step towards international validation of PCR procedures for detection of T. cruzi in human blood samples. PMID:21264349

  4. Pooling sheep faecal samples for the assessment of anthelmintic drug efficacy using McMaster and Mini-FLOTAC in gastrointestinal strongyle and Nematodirus infection.

    PubMed

    Kenyon, Fiona; Rinaldi, Laura; McBean, Dave; Pepe, Paola; Bosco, Antonio; Melville, Lynsey; Devin, Leigh; Mitchell, Gillian; Ianniello, Davide; Charlier, Johannes; Vercruysse, Jozef; Cringoli, Giuseppe; Levecke, Bruno

    2016-07-30

    In small ruminants, faecal egg counts (FECs) and reduction in FECs (FECR) are the most common methods for the assessment of the intensity of gastrointestinal (GI) nematode infections and anthelmintic drug efficacy, respectively. The main limitation of these methods is the time and cost to conduct FECs on a representative number of individual animals. A cost-saving alternative would be to examine pooled faecal samples; however, little is known regarding whether pooling can give representative results. In the present study, we compared the FECR results obtained by both an individual and a pooled examination strategy across different pool sizes and analytical sensitivities of the FEC techniques. A survey was conducted on 5 sheep farms in Scotland, where anthelmintic resistance is known to be widespread. Lambs were treated with fenbendazole (4 groups), levamisole (3 groups), ivermectin (3 groups) or moxidectin (1 group). For each group, individual faecal samples were collected from 20 animals, at baseline (D0) and 14 days after (D14) anthelmintic administration. Faecal samples were analyzed as pools of 3-5, 6-10, and 14-20 individual samples. Both individual and pooled samples were screened for GI strongyle and Nematodirus eggs using two FEC techniques with three different levels of analytical sensitivity, including Mini-FLOTAC (analytical sensitivity of 10 eggs per gram of faeces (EPG)) and McMaster (analytical sensitivity of 15 or 50 EPG). For both Mini-FLOTAC and McMaster (analytical sensitivity of 15 EPG), there was a perfect agreement in classifying the efficacy of the anthelmintic as 'normal', 'doubtful' or 'reduced' regardless of pool size. When using the McMaster method (analytical sensitivity of 50 EPG), anthelmintic efficacy was often falsely classified as 'normal' or assessment was not possible due to zero FECs at D0, and this became more pronounced as the pool size increased. 
In conclusion, pooling ovine faecal samples holds promise as a cost-saving and efficient strategy for assessing GI nematode FECR. However, for the assessment of FECR one will need to consider the baseline FEC, pool size and analytical sensitivity of the method. Copyright © 2016. Published by Elsevier B.V.
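    The FECR assessment described above reduces to a simple percentage reduction plus a classification rule. In this sketch the WAAVP-style thresholds (efficacy 'normal' at ≥95% reduction, 'reduced' below 90%, 'doubtful' in between) are an assumption for illustration, not taken verbatim from this study, and the EPG values are hypothetical:

```python
# Sketch: faecal egg count reduction (FECR) and efficacy classification.

def fecr(mean_pre, mean_post):
    """Percent reduction in faecal egg counts after treatment."""
    return 100.0 * (1.0 - mean_post / mean_pre)

def classify(reduction):
    """Assumed WAAVP-style thresholds; actual criteria vary by guideline."""
    if reduction >= 95.0:
        return "normal"
    if reduction < 90.0:
        return "reduced"
    return "doubtful"

pooled_pre, pooled_post = 850.0, 25.5   # hypothetical EPG from pooled samples
r = fecr(pooled_pre, pooled_post)
print(round(r, 1), classify(r))  # 97.0 normal
```

    The abstract's caveat is visible here: with a coarse analytical sensitivity (e.g., 50 EPG), a low baseline pooled count rounds toward zero, making the denominator unreliable and the classification impossible or falsely 'normal'.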

  5. Extraction of Organic Molecules from Terrestrial Material: Quantitative Yields from Heat and Water Extractions

    NASA Technical Reports Server (NTRS)

    Beegle, L. W.; Abbey, W. A.; Tsapin, A. T.; Dragoi, D.; Kanik, I.

    2004-01-01

    In the robotic search for life on Mars, different proposed missions will analyze the chemical and biological signatures of life using different platforms. The analysis of samples via analytical instrumentation on the surface of Mars has thus far only been attempted by the two Viking missions. Robotic arms scooped regolith material into a pyrolysis oven attached to a GC/MS. No trace of organic material was found in either of the two samples at either of the two landing sites. This null result puts an upper limit on the amount of organics that might be present in Martian soil/rocks, although the level of detection for each individual molecular species is still debated. Determining the absolute limit of detection for each analytical instrument is essential so that null results can be understood. This includes investigating the trade-off between pyrolysis and liquid solvent extraction for releasing organic materials (in terms of extraction efficiencies and the complexity of the sample extraction process). Extraction of organics from field samples can be accomplished by a variety of methods, such as extraction with various solvents including HCl and pure water, supercritical fluid extraction, and Soxhlet extraction. Extraction with 6N HCl is one of the most commonly used methods and is frequently employed for extraction of organics from meteorites, but it is probably infeasible for robotic exploration due to the difficulty of storage and transport. Extraction with H2O is promising, but it could be less efficient than 6N HCl. Both the supercritical fluid and Soxhlet extraction methods require bulky hardware and complex steps, inappropriate for inclusion on a rover spacecraft. This investigation reports the efficiencies of pyrolysis and solvent extraction methods for amino acids in different terrestrial samples. The samples studied here, initially formed in aqueous environments, are sedimentary in nature. These particular samples were chosen because they possibly represent one of the best terrestrial analogs of Mars and one of the best-case scenarios for finding organic molecules on the Martian surface.

  6. Verification Of The Defense Waste Processing Facility's (DWPF) Process Digestion Methods For The Sludge Batch 8 Qualification Sample

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Click, D. R.; Edwards, T. B.; Wiedenman, B. J.

    2013-03-18

This report contains the results and comparison of data generated from inductively coupled plasma – atomic emission spectroscopy (ICP-AES) analysis of Aqua Regia (AR), Sodium Peroxide/Sodium Hydroxide Fusion Dissolution (PF) and Cold Chem (CC) method digestions, and from Cold Vapor Atomic Absorption analysis of Hg digestions using the DWPF Hg digestion method, for Sludge Batch 8 (SB8) Sludge Receipt and Adjustment Tank (SRAT) Receipt and SB8 SRAT Product samples. The SB8 SRAT Receipt and SB8 SRAT Product samples were prepared in the SRNL Shielded Cells, and the SRAT Receipt material is representative of the sludge that constitutes the SB8 Batch or qualification composition. This is the sludge in Tank 51 that is to be transferred into Tank 40, which will contain the heel of Sludge Batch 7b (SB7b), to form the SB8 Blend composition.

  7. Lipid Vesicle Shape Analysis from Populations Using Light Video Microscopy and Computer Vision

    PubMed Central

    Zupanc, Jernej; Drašler, Barbara; Boljte, Sabina; Kralj-Iglič, Veronika; Iglič, Aleš; Erdogmus, Deniz; Drobne, Damjana

    2014-01-01

We present a method for giant lipid vesicle shape analysis that combines manually guided large-scale video microscopy and computer vision algorithms to enable the analysis of vesicle populations. The method retains the benefits of light microscopy and enables non-destructive analysis of vesicles from suspensions containing up to several thousand lipid vesicles (1–50 µm in diameter). For each sample, image analysis was employed to extract data on vesicle quantity and on the distributions of their projected diameters and isoperimetric quotients (a measure of contour roundness). This process enables a comparison of samples from the same population over time, or the comparison of a treated population to a control. Although vesicles in suspensions are heterogeneous in size and shape and are distributed non-homogeneously throughout the suspension, this method allows for the capture and analysis of repeatable vesicle samples that are representative of the population inspected. PMID:25426933
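As a sketch of the two shape descriptors named above, the isoperimetric quotient Q = 4πA/P² and the projected (area-equivalent) diameter can be computed from a detected contour's area and perimeter; the helper names below are illustrative, not the paper's code:

```python
import math

def isoperimetric_quotient(area: float, perimeter: float) -> float:
    """Q = 4*pi*A / P**2; equals 1.0 for a perfect circle, <1 otherwise."""
    return 4.0 * math.pi * area / perimeter ** 2

def projected_diameter(area: float) -> float:
    """Diameter of a circle with the same projected area as the contour."""
    return 2.0 * math.sqrt(area / math.pi)

# A circular contour of radius 5: A = pi*25, P = 2*pi*5
A, P = math.pi * 25.0, 2.0 * math.pi * 5.0
print(round(isoperimetric_quotient(A, P), 3))  # 1.0 (perfectly round)
print(round(projected_diameter(A), 3))         # 10.0
```

A deflated or elongated vesicle contour yields Q well below 1, which is what makes the quotient a useful roundness measure for population comparisons.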

  8. A multiplex real-time PCR assay, based on invA and pagC genes, for the detection and quantification of Salmonella enterica from cattle lymph nodes.

    PubMed

    Bai, Jianfa; Trinetta, Valentina; Shi, Xiaorong; Noll, Lance W; Magossi, Gabriela; Zheng, Wanglong; Porter, Elizabeth P; Cernicchiaro, Natalia; Renter, David G; Nagaraja, Tiruvoor G

    2018-05-01

Cattle lymph nodes can harbor Salmonella and potentially contaminate beef products. We have developed and validated a new real-time PCR (qPCR) assay for the detection and quantification of Salmonella enterica in cattle lymph nodes. The assay targets both the invA and pagC genes, the most conserved molecular targets in Salmonella enterica. An 18S rRNA gene assay that amplifies from cattle and other animal species was also included as an internal control. Available DNA sequences for invA, pagC and 18S rRNA genes were used for primer and probe selection. Three Salmonella serotypes, S. Typhimurium, S. Anatum, and S. Montevideo, were used to assess the assay's analytical sensitivity. Correlation coefficients of standard curves generated for each target and for all three serotypes were >99%, and qPCR amplification efficiencies were between 93% and 110%. Assay sensitivity was also determined using standard curve data generated from Salmonella-negative cattle lymph nodes spiked with 10-fold dilutions of the three Salmonella serotypes. Assay specificity was determined using the Salmonella culture method and qPCR testing on 36 Salmonella strains representing 33 serotypes, 38 Salmonella strains of unknown serotypes, 252 E. coli strains representing 40 serogroups, and 31 other bacterial strains representing 18 different species. A collection of 647 cattle lymph node samples from steers procured from the Midwest region of the US was tested by the qPCR and compared to culture-based detection. Salmonella prevalence by qPCR for pre-enriched and enriched lymph nodes was 19.8% (128/647) and 94.9% (614/647), respectively. A majority of qPCR-positive pre-enriched samples (105/128) were at concentrations between 10^4 and 10^5 CFU/mL. The culture method detected Salmonella in 7.7% (50/647) and 80.7% (522/647) of pre- and post-enriched samples, respectively; 96.0% (48/50) of pre-enriched and 99.4% (519/522) of post-enriched culture-positive samples were also positive by qPCR. More samples tested positive by qPCR than by the culture method, indicating that the real-time PCR assay was more sensitive. Our data indicate that this triplex qPCR can be used to accurately detect and quantify Salmonella enterica strains from cattle lymph node samples. The assay may serve as a useful tool to monitor the prevalence of Salmonella in beef production systems. Copyright © 2018 Elsevier B.V. All rights reserved.
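For context, the amplification efficiencies quoted above are conventionally derived from the slope of a standard curve (Cq versus log10 template quantity) via E = 10^(-1/slope) - 1; a minimal sketch with an illustrative dilution series (not data from this study):

```python
def slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

def amplification_efficiency(log10_quantities, cq_values):
    """Percent qPCR efficiency from a standard curve.
    A slope of about -3.32 corresponds to ~100% (one doubling per cycle)."""
    m = slope(log10_quantities, cq_values)
    return (10 ** (-1.0 / m) - 1.0) * 100.0

# Illustrative 10-fold dilution series: Cq rises 3.32 per 10-fold dilution
logq = [5, 4, 3, 2, 1]
cq = [15.0, 18.32, 21.64, 24.96, 28.28]
print(round(amplification_efficiency(logq, cq), 1))  # ~100% efficiency
```

Efficiencies in the 93-110% range reported above correspond to standard-curve slopes between roughly -3.50 and -3.10.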

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurley, D.F.; Whitehouse, J.M.

A dedicated low-flow groundwater sample collection system was designed for implementation in a post-closure ACL monitoring program at the Yaworski Lagoon NPL site in Canterbury, Connecticut. The system includes dedicated bladder pumps with intake ports located in the screened interval of the monitoring wells. This sampling technique was implemented in the spring of 1993. The system was designed to simultaneously obtain samples directly from the screened interval of nested wells in three distinct water-bearing zones. Sample collection begins upon stabilization of field parameters. Other than the line volume, no prior purging of the well is required. It was found that dedicated low-flow sampling from the screened interval provides a method of representative sample collection without the bias of suspended solids introduced by traditional techniques of pumping and bailing. Analytical data indicate that measured chemical constituents are representative of groundwater migrating through the screened interval. Upon implementation of the low-flow monitoring system, analytical results exhibited a decrease in concentrations of some organic compounds and metals. The system has also proven to be a cost-effective alternative to pumping and bailing, which generate large volumes of purge water requiring containment and disposal.

  10. METHOD INDUCED EXPOSURE MISCLASSIFICATION FOR A RESPIRABLE DUST SAMPLED USING ISO/ACGIH/CEN CRITERIA. (R826786)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  11. Extracurricular Activities and Bullying Perpetration: Results from a Nationally Representative Sample

    ERIC Educational Resources Information Center

    Riese, Alison; Gjelsvik, Annie; Ranney, Megan L.

    2015-01-01

    Background: Bullying is a widespread problem for school-aged children and adolescents. Interventions to reduce bullying are not well disseminated. Extracurricular involvement is, however, common. This study aims to examine the relationship between parent-reported participation in extracurricular activities and bullying perpetration. Methods: Using…

  12. Developmental Trajectories of Early Communication Skills

    ERIC Educational Resources Information Center

    Maatta, Sira; Laakso, Marja-Leena; Tolvanen, Asko; Ahonen, Timo; Aro, Tuija

    2012-01-01

    Purpose: This study focused on developmental trajectories of prelinguistic communication skills and their connections to later parent-reported language difficulties. Method: The participants represent a subset of a community-based sample of 508 children. Data include parent reports of prelinguistic communication skills at 12, 15, 18, and 21 months…

  13. Sample Size Calculations for Population Size Estimation Studies Using Multiplier Methods With Respondent-Driven Sampling Surveys.

    PubMed

    Fearon, Elizabeth; Chabata, Sungai T; Thompson, Jennifer A; Cowan, Frances M; Hargreaves, James R

    2017-09-14

While guidance exists for obtaining population size estimates using multiplier methods with respondent-driven sampling surveys, we lack specific guidance for making sample size decisions. Our objective was to guide the design of multiplier-method population size estimation studies using respondent-driven sampling surveys so as to reduce the random error around the estimate obtained. The population size estimate is obtained by dividing the number of individuals receiving a service or the number of unique objects distributed (M) by the proportion of individuals in a representative survey who report receipt of the service or object (P). We have developed an approach to sample size calculation, interpreting methods to estimate the variance around estimates obtained using multiplier methods in conjunction with research into design effects and respondent-driven sampling. We describe an application to estimate the number of female sex workers in Harare, Zimbabwe. There is high variance in estimates. Random error around the size estimate reflects uncertainty from M and P, particularly when the estimate of P in the respondent-driven sampling survey is low. As expected, sample size requirements are higher when the design effect of the survey is assumed to be greater. We suggest a method for investigating the effects of sample size on the precision of a population size estimate obtained using multiplier methods and respondent-driven sampling. Uncertainty in the size estimate is high, particularly when P is small, so, balancing against other potential sources of bias, we advise researchers to consider longer service attendance reference periods and to distribute more unique objects, which is likely to result in a higher estimate of P in the respondent-driven sampling survey. ©Elizabeth Fearon, Sungai T Chabata, Jennifer A Thompson, Frances M Cowan, James R Hargreaves. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 14.09.2017.
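The estimator described above is simply N = M / P; a minimal sketch with illustrative numbers (not the Harare data), using a delta-method standard error in which the binomial variance of P is inflated by an assumed survey design effect:

```python
import math

def multiplier_estimate(M, p_hat, n, design_effect=1.0):
    """Population size N = M / P with a delta-method standard error.

    M: count of unique objects distributed (assumed known exactly)
    p_hat: proportion of survey respondents reporting receipt (P)
    n: survey sample size
    design_effect: variance inflation factor for the RDS survey design
    """
    N = M / p_hat
    # Var(P) under simple random sampling, inflated by the design effect
    var_p = design_effect * p_hat * (1.0 - p_hat) / n
    # Delta method: Var(M/P) ~ (M / P**2)**2 * Var(P)
    se_N = (M / p_hat ** 2) * math.sqrt(var_p)
    return N, se_N

# Illustrative: 500 objects distributed, 25% receipt among 400 respondents
N, se = multiplier_estimate(M=500, p_hat=0.25, n=400, design_effect=2.0)
print(round(N), round(se))  # 2000 245
```

Repeating the calculation with a smaller p_hat shows the behavior the abstract describes: the standard error grows sharply as P shrinks, which is why the authors favor designs likely to raise P.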

  14. Methods for fitting a parametric probability distribution to most probable number data.

    PubMed

    Williams, Michael S; Ebel, Eric D

    2012-07-02

    Every year hundreds of thousands, if not millions, of samples are collected and analyzed to assess microbial contamination in food and water. The concentration of pathogenic organisms at the end of the production process is low for most commodities, so a highly sensitive screening test is used to determine whether the organism of interest is present in a sample. In some applications, samples that test positive are subjected to quantitation. The most probable number (MPN) technique is a common method to quantify the level of contamination in a sample because it is able to provide estimates at low concentrations. This technique uses a series of dilution count experiments to derive estimates of the concentration of the microorganism of interest. An application for these data is food-safety risk assessment, where the MPN concentration estimates can be fitted to a parametric distribution to summarize the range of potential exposures to the contaminant. Many different methods (e.g., substitution methods, maximum likelihood and regression on order statistics) have been proposed to fit microbial contamination data to a distribution, but the development of these methods rarely considers how the MPN technique influences the choice of distribution function and fitting method. An often overlooked aspect when applying these methods is whether the data represent actual measurements of the average concentration of microorganism per milliliter or the data are real-valued estimates of the average concentration, as is the case with MPN data. In this study, we propose two methods for fitting MPN data to a probability distribution. The first method uses a maximum likelihood estimator that takes average concentration values as the data inputs. The second is a Bayesian latent variable method that uses the counts of the number of positive tubes at each dilution to estimate the parameters of the contamination distribution. 
The performance of the two fitting methods is compared for two data sets that represent Salmonella and Campylobacter concentrations on chicken carcasses. The results demonstrate a bias in the maximum likelihood estimator that increases with reductions in average concentration. The Bayesian method provided unbiased estimates of the concentration distribution parameters for all data sets. We provide computer code for the Bayesian fitting method. Published by Elsevier B.V.
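For context, the MPN point estimate itself is a maximum-likelihood fit: each tube at dilution volume v is positive with probability 1 - exp(-λv), and λ maximizes the product of binomial likelihoods across dilutions. A minimal grid-search sketch with a classic illustrative tube series (not the chicken-carcass data sets):

```python
import math

def mpn_log_likelihood(lam, tubes):
    """tubes: list of (volume_ml, n_tubes, n_positive) per dilution.
    A tube is positive with probability 1 - exp(-lam * volume)."""
    ll = 0.0
    for vol, n, pos in tubes:
        q = math.exp(-lam * vol)          # P(tube negative)
        if pos:
            ll += pos * math.log(1.0 - q)
        ll += (n - pos) * (-lam * vol)    # log(q) computed exactly
    return ll

def mpn_estimate(tubes):
    """Maximum-likelihood MPN (organisms per ml) via a log-spaced grid."""
    grid = [10 ** (k / 200.0) for k in range(-600, 601)]  # 1e-3 .. 1e3
    return max(grid, key=lambda lam: mpn_log_likelihood(lam, tubes))

# Classic 5-tube, 3-dilution series: 10, 1 and 0.1 ml; 5/5, 3/5, 1/5 positive
tubes = [(10.0, 5, 5), (1.0, 5, 3), (0.1, 5, 1)]
print(round(mpn_estimate(tubes), 2))  # close to the tabulated 1.1 MPN/ml
```

The authors' point that MPN values are real-valued *estimates* rather than direct measurements is visible here: the reported concentration is the argmax of a likelihood, with substantial uncertainty at low counts.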

  15. [Classical and molecular methods for identification and quantification of domestic moulds].

    PubMed

    Fréalle, E; Bex, V; Reboux, G; Roussel, S; Bretagne, S

    2017-12-01

To study the impact of the constant and inevitable inhalation of moulds, it is necessary to sample, identify and count the spores. Environmental sampling methods fall into three categories: surface sampling, which is easy to perform but not quantitative; air sampling, which is easy to calibrate but provides only time-limited information; and dust sampling, which is more representative of long-term exposure to moulds. The sampling strategy depends on the objectives (evaluation of the risk of exposure for individuals, quantification of household contamination, or evaluation of the efficacy of remediation). The mould colonies obtained in culture are identified using microscopy, MALDI-TOF mass spectrometry and/or DNA sequencing. Electrostatic dust collectors are an alternative to older methods for identifying and quantifying household mould spores; they are easy to use and relatively cheap. Colony counting should be progressively replaced by quantitative real-time PCR, which is already validated, while awaiting more standardised high-throughput sequencing methods for assessing mould contamination without technical bias. Despite some technical recommendations for obtaining reliable and comparable results, the huge diversity of environmental moulds, the variable quantity of spores inhaled, and the association with other allergens (mites, plants) make the evaluation of their impact on human health difficult. Hence there is a need for reliable and generally applicable quantitative methods. Copyright © 2017 SPLF. Published by Elsevier Masson SAS. All rights reserved.

  16. The contribution of cluster and discriminant analysis to the classification of complex aquifer systems.

    PubMed

    Panagopoulos, G P; Angelopoulou, D; Tzirtzilakis, E E; Giannoulopoulos, P

    2016-10-01

This paper presents an innovative method for assigning groundwater samples to common groups representing the hydrogeological units from which they were pumped. The method proved efficient even in areas with complex hydrogeological regimes. It requires chemical analyses of water samples only for major ions, meaning that it is applicable to most cases worldwide. Another benefit is that it gives further insight into the aquifer hydrogeochemistry, as it identifies the ions responsible for the discrimination of each group. The procedure begins with cluster analysis of the dataset in order to classify the samples into the corresponding hydrogeological units. The feasibility of the method is demonstrated by the fact that the samples of volcanic origin were separated into two different clusters, namely the lava units and the pyroclastic-ignimbritic aquifer. The second step is discriminant analysis of the data, which provides the functions that distinguish the groups from each other and the most significant variables that define the hydrochemical composition of each aquifer. The procedure was highly successful, as 94.7% of the samples were classified into the correct aquifer system. Finally, the resulting functions can be safely used to categorize samples of unknown or doubtful origin, thus improving the quality and size of existing hydrochemical databases.
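The two-step procedure (cluster, then discriminate) can be illustrated on a toy two-ion dataset; here a two-cluster k-means stands in for the paper's cluster analysis and a nearest-centroid rule stands in for the full discriminant functions; both are simplifications, and the ion values are invented:

```python
def dist2(a, b):
    """Squared Euclidean distance between two points."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans2(points, iters=20):
    """Two-cluster k-means with deterministic seeds at the two endpoints."""
    cents = [points[0], points[-1]]
    labels = []
    for _ in range(iters):
        labels = [0 if dist2(p, cents[0]) <= dist2(p, cents[1]) else 1
                  for p in points]
        for c in (0, 1):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                cents[c] = tuple(sum(v) / len(members) for v in zip(*members))
    return cents, labels

# Toy (Na+, HCO3-) values for two hypothetical aquifer units (meq/L)
unit_a = [(2.1, 5.0), (1.9, 4.8), (2.2, 5.1), (2.0, 4.9)]
unit_b = [(8.0, 1.2), (7.8, 1.0), (8.3, 1.1), (8.1, 0.9)]
cents, labels = kmeans2(unit_a + unit_b)

# Nearest-centroid rule stands in for the discriminant functions:
new_sample = (7.9, 1.05)  # a sample of unknown origin
group = 0 if dist2(new_sample, cents[0]) <= dist2(new_sample, cents[1]) else 1
print(labels, group)  # unit_a -> cluster 0, unit_b -> cluster 1; new -> 1
```

In the paper's workflow the clustering is run on full major-ion analyses and the discriminant step yields explicit linear functions (and their most significant ions) rather than a centroid rule.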

  17. Dried haematic microsamples and LC-MS/MS for the analysis of natural and synthetic cannabinoids.

    PubMed

    Protti, Michele; Rudge, James; Sberna, Angelo Eliseo; Gerra, Gilberto; Mercolini, Laura

    2017-02-15

Synthetic cannabinoids are new psychoactive substances (NPS) with effects similar to those of the natural cannabinoids found in Cannabis derivatives. They have rapidly entered the illicit market, often sold as alternatives to substances under international control. The need to identify and quantify an unprecedented and growing number of new compounds represents a unique challenge for toxicological, forensic and anti-doping analysis. Dried blood spots (DBS) have been used within the bioanalytical framework in place of plasma or serum, in order to reduce invasiveness, lower the required sample size, simplify handling, storage and shipping of samples, and facilitate home-based and on-field applications. However, DBS implementation has been limited mainly by concerns related to the haematocrit effect on method accuracy. Volumetric absorptive microsampling (VAMS™), a second-generation dried miniaturized sampling technology, has been developed specifically to eliminate the haematocrit effect, thus providing accurate sampling while still allowing feasible sample processing. An original LC-MS/MS method was herein developed and validated for the analysis of THC and its two main metabolites, together with 10 representative synthetic cannabinoids, in both DBS and VAMS dried microsamples. The ultimate goal of this work is to provide highly innovative DBS and VAMS analytical protocols, whose performances were extensively optimized and compared, in order to provide effective alternative tools that can be applied for natural and synthetic cannabinoid determination in place of classical analytical strategies. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Test plan for evaluating the operational performance of the prototype nested, fixed-depth fluidic sampler

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    REICH, F.R.

The PHMC will provide Low Activity Waste (LAW) tank wastes for final treatment by a privatization contractor from two double-shell feed tanks, 241-AP-102 and 241-AP-104. Concerns about the inability of the baseline "grab" sampling to provide large-volume samples within time constraints have led to the development of a nested, fixed-depth sampling system. This sampling system will provide large-volume, representative samples without the environmental, radiation-exposure, and sample-volume impacts of the current baseline "grab" sampling method. A plan has been developed for the cold testing of this nested, fixed-depth sampling system with simulant materials. The sampling system will fill the 500-ml bottles and provide inner packaging to interface with the Hanford Site's cask shipping systems (PAS-1 and/or "safe-send"). The sampling system will provide a waste stream that will be used for on-line, real-time measurements with an at-tank analysis system. The cold tests evaluate the system's performance and its ability to provide samples that are representative of the tanks' content within a 95 percent confidence interval, to sample while mixing pumps are operating, to provide large sample volumes (1-15 liters) within a short time interval, to sample supernatant wastes with over 25 wt% solids content, to recover from precipitation- and settling-based plugging, and the potential to operate over the 20-year expected time span of the privatization contract.

  19. Recruiting and retaining youth and young adults: challenges and opportunities in survey research for tobacco control.

    PubMed

    Cantrell, Jennifer; Hair, Elizabeth C; Smith, Alexandria; Bennett, Morgane; Rath, Jessica Miller; Thomas, Randall K; Fahimi, Mansour; Dennis, J Michael; Vallone, Donna

    2018-03-01

Evaluation studies of population-based tobacco control interventions often rely on large-scale survey data from numerous respondents across many geographic areas to provide evidence of their effectiveness. Significant challenges for survey research have emerged with the evolving communications landscape, particularly for surveying hard-to-reach populations such as youth and young adults. This study combines the comprehensive coverage of an address-based sampling (ABS) frame with the timeliness of online data collection to develop a nationally representative longitudinal cohort of young people aged 15-21. We constructed an ABS frame, partially supplemented with auxiliary data, to recruit this hard-to-reach sample. Branded and tested mail-based recruitment materials were designed to bring respondents online for screening, consent and surveying. Once enrolled, respondents completed online surveys every 6 months via computer, tablet or smartphone. Numerous strategies were utilized to enhance retention and representativeness. Results detail sample performance, representativeness and retention rates, as well as device utilization trends for survey completion among youth and young adult respondents. Panel development efforts resulted in a large, nationally representative sample with high retention rates. This study is among the first to employ this hybrid ABS-to-online methodology to recruit and retain youth and young adults in a probability-based online cohort panel. The approach is particularly valuable for conducting research among younger populations as it capitalizes on their increasing access to and comfort with digital communication. We discuss challenges and opportunities of panel recruitment and retention methods in an effort to provide valuable information for tobacco control researchers seeking to obtain representative, population-based samples of youth and young adults in the U.S. as well as across the globe.
© Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  20. Instrumental methods of analysis of sulfur compounds in synfuel process streams. Quarterly technical progress report, July-September 1984

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jordan, J.; Talbott, J.

    1984-01-01

Task 1. Methods development for the speciation of the polysulfides. Work on this task was completed in December 1983 and reported in DOE/PC/40783-T13. Task 2. Methods development for the speciation of dithionite and polythionates. Work on Task 2 was completed in June 1984 and reported in DOE/PC/40783-T15. Task 3. Total accounting of the sulfur balance in representative samples of synfuel process streams. A systematic and critical comparison of results obtained in the analysis of sulfur moieties in representative samples of coal conversion process streams revealed the following general trends. (a) In specimens of high pH (9-10) and low redox potential (-0.3 to -0.4 volt versus NHE), sulfidic and polysulfidic sulfur moieties predominate. (b) In process streams of lower pH and more positive redox potential, higher oxidation states of sulfur (notably sulfate) account for most of the total sulfur present. (c) Oxidative wastewater treatment by the PETC stripping process converts lower oxidation states of sulfur into thiosulfate and sulfate. In this context, remarkable similarities were observed between liquefaction and gasification process streams. However, the thiocyanate present in samples from the Grand Forks gasifier was impervious to the PETC stripping process. (d) Total sulfur contaminant levels in coal conversion process stream wastewater samples are determined more by the abundance of sulfur in the coal used as starting material than by the nature of the conversion process (liquefaction or gasification). 13 references.

  1. AST: an automated sequence-sampling method for improving the taxonomic diversity of gene phylogenetic trees.

    PubMed

    Zhou, Chan; Mao, Fenglou; Yin, Yanbin; Huang, Jinling; Gogarten, Johann Peter; Xu, Ying

    2014-01-01

    A challenge in phylogenetic inference of gene trees is how to properly sample a large pool of homologous sequences to derive a good representative subset of sequences. Such a need arises in various applications, e.g. when (1) accuracy-oriented phylogenetic reconstruction methods may not be able to deal with a large pool of sequences due to their high demand in computing resources; (2) applications analyzing a collection of gene trees may prefer to use trees with fewer operational taxonomic units (OTUs), for instance for the detection of horizontal gene transfer events by identifying phylogenetic conflicts; and (3) the pool of available sequences is biased towards extensively studied species. In the past, the creation of subsamples often relied on manual selection. Here we present an Automated sequence-Sampling method for improving the Taxonomic diversity of gene phylogenetic trees, AST, to obtain representative sequences that maximize the taxonomic diversity of the sampled sequences. To demonstrate the effectiveness of AST, we have tested it to solve four problems, namely, inference of the evolutionary histories of the small ribosomal subunit protein S5 of E. coli, 16 S ribosomal RNAs and glycosyl-transferase gene family 8, and a study of ancient horizontal gene transfers from bacteria to plants. Our results show that the resolution of our computational results is almost as good as that of manual inference by domain experts, hence making the tool generally useful to phylogenetic studies by non-phylogeny specialists. The program is available at http://csbl.bmb.uga.edu/~zhouchan/AST.php.

  2. AST: An Automated Sequence-Sampling Method for Improving the Taxonomic Diversity of Gene Phylogenetic Trees

    PubMed Central

    Zhou, Chan; Mao, Fenglou; Yin, Yanbin; Huang, Jinling; Gogarten, Johann Peter; Xu, Ying

    2014-01-01

    A challenge in phylogenetic inference of gene trees is how to properly sample a large pool of homologous sequences to derive a good representative subset of sequences. Such a need arises in various applications, e.g. when (1) accuracy-oriented phylogenetic reconstruction methods may not be able to deal with a large pool of sequences due to their high demand in computing resources; (2) applications analyzing a collection of gene trees may prefer to use trees with fewer operational taxonomic units (OTUs), for instance for the detection of horizontal gene transfer events by identifying phylogenetic conflicts; and (3) the pool of available sequences is biased towards extensively studied species. In the past, the creation of subsamples often relied on manual selection. Here we present an Automated sequence-Sampling method for improving the Taxonomic diversity of gene phylogenetic trees, AST, to obtain representative sequences that maximize the taxonomic diversity of the sampled sequences. To demonstrate the effectiveness of AST, we have tested it to solve four problems, namely, inference of the evolutionary histories of the small ribosomal subunit protein S5 of E. coli, 16 S ribosomal RNAs and glycosyl-transferase gene family 8, and a study of ancient horizontal gene transfers from bacteria to plants. Our results show that the resolution of our computational results is almost as good as that of manual inference by domain experts, hence making the tool generally useful to phylogenetic studies by non-phylogeny specialists. The program is available at http://csbl.bmb.uga.edu/~zhouchan/AST.php. PMID:24892935
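The subsampling goal AST addresses (maximizing the taxonomic diversity of the retained sequences) can be illustrated with a greedy sketch; the pool and the selection rule below are toy stand-ins, not AST's actual algorithm:

```python
def greedy_diverse_subset(seq_taxa, k):
    """Pick k sequence ids, preferring taxa not yet represented.

    seq_taxa: dict mapping sequence id -> taxon label (e.g. genus).
    Greedily adds a sequence from an unseen taxon when one exists,
    otherwise falls back to the lexicographically first remaining id."""
    chosen, seen = [], set()
    remaining = dict(seq_taxa)
    while remaining and len(chosen) < k:
        pick = next((s for s, t in sorted(remaining.items()) if t not in seen),
                    sorted(remaining)[0])
        chosen.append(pick)
        seen.add(remaining.pop(pick))
    return chosen

# Toy pool biased toward E. coli, as pools of extensively studied
# species often are (the scenario the abstract describes)
pool = {"ecoli_1": "Escherichia", "ecoli_2": "Escherichia",
        "ecoli_3": "Escherichia", "bsub_1": "Bacillus",
        "tmar_1": "Thermotoga"}
print(greedy_diverse_subset(pool, 3))  # one sequence per genus
```

AST itself works on phylogenetic structure rather than flat taxon labels, but the sketch shows the core idea: a biased pool is pruned so that each retained sequence contributes taxonomic novelty.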

  3. Methods to Obtain a Representative Sample of Ryan White-Funded Patients for a Needs Assessment in Los Angeles County: Results from a Replicable Approach.

    PubMed

    Dierst-Davies, Rhodri; Wohl, Amy Rock; Pinney, Glenda; Johnson, Christopher H; Vincent-Jones, Craig; Pérez, Mario J

The Health Resources and Services Administration requires that jurisdictions receiving Ryan White (RW) funding justify need, set priorities, and provide allocations using evidence-based methods. Methods and results from the 2011 Los Angeles Coordinated HIV/AIDS Needs Assessment-Care (LACHNA-Care) study are presented. Individual-level weights were applied to expand the sample from 400 to 18,912 persons, consistent with the 19,915 clients in the system. Awareness, need, and utilization for medical outpatient care were high (>90%). Other services (eg, child care) had limited awareness (21%). A majority of participants reported at least 1 service gap (81%). Lack of insurance (risk ratio [RR] = 3.0, 95% confidence interval [CI]: 1.5-6.2), substance use (RR = 2.9, 95% CI: 1.3-6.4), and past lapses in medical care (RR = 2.8, 95% CI: 1.3-5.9) were associated with gaps. Within clusters, past incarceration was associated with gaps for housing (RR = 13.5, 95% CI: 3.5-52.1), transportation (RR = 3.2, 95% CI: 1.2-8.4), and case management (RR = 4.0, 95% CI: 1.3-12.2). The applied methods resulted in representative data instrumental to RW program planning efforts.
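The expansion weighting described above (scaling 400 respondents to represent roughly 19,000 clients) can be sketched as simple post-stratification; the strata and counts below are illustrative, not the LACHNA-Care weights:

```python
def poststratification_weights(sample_counts, population_counts):
    """Weight for each stratum = population count / sample count.
    Summing weights over all respondents recovers the population total."""
    return {s: population_counts[s] / n for s, n in sample_counts.items()}

# Illustrative strata (not the study's): respondents per service region
sample = {"north": 150, "south": 250}
population = {"north": 7000, "south": 11912}

w = poststratification_weights(sample, population)
total = sum(w[s] * n for s, n in sample.items())
print(round(total))  # 18912: the weighted sample stands in for all clients
```

Real needs-assessment weights are typically built from several demographic and service-use dimensions at once, but the mechanics are the same: each respondent counts for population/sample "copies" of people like them.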

  4. [Bacteriological quality of traditional, organic and hydroponic cultured lettuce in Costa Rica].

    PubMed

    Monge, Claudio; Chaves, Carolina; Arias, María Laura

    2011-03-01

The main objective of this work was to evaluate the microbiological quality of lettuces commercialized in the Metropolitan Area of San José, Costa Rica, and cultured in different ways, in order to detect differences between the culturing methods and the risk that these products may represent for public health. The study was done at the Food Microbiology Laboratory, Universidad de Costa Rica, from March to July 2010. Thirty lettuce samples were analyzed (10 obtained by traditional culture, 10 by organic culture and 10 by hydroponics). All samples were obtained from markets where their origin was certified. Total aerobic plate count, total and fecal coliform counts and Escherichia coli were determined for all samples, as well as the presence/absence of Salmonella spp. and Listeria monocytogenes in 25 g. The results show no statistically significant difference (p < 0.001) between the different types of culture for any of the parameters evaluated. An important percentage of the samples presented coliforms; nevertheless, just one E. coli strain was isolated, from a traditionally cultured lettuce sample. Four different Salmonella spp. strains were isolated from the samples, as well as one Listeria monocytogenes strain. The data show that consumption of this product, raw or without adequate hygiene and disinfection, may represent a health risk. Also, from the bacteriological point of view, there is no significant difference between the culturing methods evaluated, suggesting either that the specific directions for each type of culture are not followed or that there is inadequate handling of the products or post-harvest contamination.

  5. Organic Matter Detection on Mars by Pyrolysis-FTIR: An Analysis of Sensitivity and Mineral Matrix Effects

    NASA Astrophysics Data System (ADS)

    Gordon, Peter R.; Sephton, Mark A.

    2016-11-01

Returning samples from Mars will require an effective method to assess and select the highest-priority geological materials. The ideal instrument for sample triage would be simple in operation, limited in its demand for resources, and rich in produced diagnostic information. Pyrolysis-Fourier transform infrared spectroscopy (pyrolysis-FTIR) is a potentially attractive triage instrument that considers both the past habitability of the sample depositional environment and the presence of organic matter that may reflect actual habitation. An important consideration for triage protocols is the sensitivity of the instrumental method. Experimental data indicate pyrolysis-FTIR sensitivities for organic matter at the tens of parts per million level. The mineral matrix in which the organic matter is hosted also has an influence on organic detection. To provide an insight into matrix effects, we mixed well-characterized organic matter with a variety of dry minerals, to represent the various inorganic matrices of Mars samples, prior to analysis. During pyrolysis-FTIR, serpentinites analogous to those on Mars indicative of the Phyllocian Era led to no negative effects on organic matter detection; sulfates analogous to those of the Theiikian Era led, in some instances, to the combustion of organic matter; and palagonites, which may represent samples from the Siderikian Era, led, in some instances, to the chlorination of organic matter. Any negative consequences brought about by these mineral effects can be mitigated by the correct choice of thermal extraction temperature. Our results offer an improved understanding of how pyrolysis-FTIR can perform during sample triage on Mars.

  6. Effects of Sample Selection Bias on the Accuracy of Population Structure and Ancestry Inference

    PubMed Central

    Shringarpure, Suyash; Xing, Eric P.

    2014-01-01

Population stratification is an important task in genetic analyses. It provides information about the ancestry of individuals and can be an important confounder in genome-wide association studies. Public genotyping projects have made a large number of datasets available for study. However, practical constraints dictate that only a small number of individuals from a given geographical or ethnic population are genotyped, so the resulting data are a sample from the entire population. If the distribution of sample sizes is not representative of the populations being sampled, the accuracy of population stratification analyses of the data could be affected. We attempt to understand the effect of biased sampling on the accuracy of population structure analysis and individual ancestry recovery. We examined two commonly used methods for analyses of such datasets, ADMIXTURE and EIGENSOFT, and found that the accuracy of recovery of population structure is affected to a large extent by the sample used for analysis and how representative it is of the underlying populations. Using simulated data and real genotype data from cattle, we show that sample selection bias can affect the results of population structure analyses. We develop a mathematical framework for sample selection bias in models for population structure and also propose a correction for sample selection bias using auxiliary information about the sample. We demonstrate that such a correction is effective in practice using simulated and real data. PMID:24637351

  7. Fractionated dynamic headspace sampling in the analysis of matrices of vegetable origin in the food field.

    PubMed

    Liberto, Erica; Cagliero, Cecilia; Cordero, Chiara; Rubiolo, Patrizia; Bicchi, Carlo; Sgorbini, Barbara

    2017-03-17

Recent technological advances in dynamic headspace sampling (D-HS) and the possibility to automate this sampling method have led to a marked improvement in its performance, a strong renewal of interest in it, and an extension of its fields of application. The introduction of in-parallel and in-series automatic multi-sampling and of new trapping materials, plus the possibility to design an effective sampling process by correctly applying breakthrough volume theory, have made profiling more representative and have enhanced selectivity and flexibility, also offering the possibility of fractionated enrichment, in particular for high-volatility compounds. This study deals with the ability of fractionated D-HS to produce a sample representative of the volatile fraction of solid or liquid matrices. Experiments were carried out on a model equimolar (0.5 mM) EtOH/water solution, comprising 16 compounds with different polarities and volatilities, structures ranging from C5 to C15 and vapor pressures from 4.15 kPa (2,3-pentanedione) to 0.004 kPa (t-β-caryophyllene), and on an Arabica roasted coffee powder. Three trapping materials were considered: Tenax TA™ (TX), polydimethylsiloxane foam (PDMS), and a three-carbon cartridge Carbopack B/Carbopack C/Carbosieve S-III™ (CBS). The influence of several parameters on the design of successful fractionated D-HS sampling, including the physical and chemical characteristics of the analytes and matrix, trapping material, analyte breakthrough, purge gas volume, and sampling temperature, was investigated. The results show that, by appropriately choosing sampling conditions, fractionated D-HS sampling based on component volatility can produce a fast and representative profile of the matrix volatile fraction, with total recoveries comparable to those obtained by full-evaporation D-HS for liquid samples, and very high concentration factors for solid samples. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. An integrated system for identifying the hidden assassins in traditional medicines containing aristolochic acids

    NASA Astrophysics Data System (ADS)

    Wu, Lan; Sun, Wei; Wang, Bo; Zhao, Haiyu; Li, Yaoli; Cai, Shaoqing; Xiang, Li; Zhu, Yingjie; Yao, Hui; Song, Jingyuan; Cheng, Yung-Chi; Chen, Shilin

    2015-08-01

Traditional herbal medicines adulterated and contaminated with plant materials from the Aristolochiaceae family, which contain aristolochic acids (AAs), cause aristolochic acid nephropathy. Approximately 256 traditional Chinese patent medicines containing Aristolochiaceous materials are still being sold in Chinese markets today. In order to protect consumers from health risks due to AAs, the hidden assassins, efficient methods to differentiate Aristolochiaceous herbs from their putative substitutes need to be established. In this study, 158 Aristolochiaceous samples representing 46 species and four genera as well as 131 non-Aristolochiaceous samples representing 33 species, 20 genera and 12 families were analyzed using DNA barcodes based on the ITS2 and psbA-trnH sequences. Aristolochiaceous materials and their non-Aristolochiaceous substitutes were successfully identified using BLAST, the nearest distance method and the neighbor-joining (NJ) tree. In addition, based on sequence information of ITS2, we developed a real-time PCR assay which successfully identified herbal material from the Aristolochiaceae family. Using ultra-high-performance liquid chromatography-high-resolution mass spectrometry (UHPLC-HR-MS), we demonstrated that most representatives of the Aristolochiaceae family contain toxic AAs. Therefore, the integrated system of DNA barcodes, real-time PCR assays using TaqMan probes and UHPLC-HR-MS provides an efficient and reliable authentication system to protect consumers from health risks due to the hidden assassins (AAs).

  9. Current status of contraceptive use among rural married women in Anhui Province of China.

    PubMed

    Zhang, X-J; Wang, G-Y; Shen, Q; Yu, Y-L; Sun, Y-H; Yu, G-B; Zhao, D; Ye, D-Q

    2009-11-01

This study aims to explore the current status of married women's use of contraceptive methods (permanent versus non-permanent methods) and to identify factors that affect the choice of contraceptive methods in rural areas of Anhui Province, China. Survey. Anhui, China. A total of 53,652 married women aged 18-49 years. A multistage probability sampling method was used to identify a representative sample of 53,652 married women aged 18-49 years. All women were asked to provide detailed information by completing detailed questionnaires. Contraceptive prevalence and influencing factors. The total birth control rate of the sample was 95.2%; permanent and non-permanent contraceptive methods accounted for 46.7% and 48.5%, respectively. Female sterilisation was the first choice, with a usage rate of 43.6%, followed by the intrauterine device (IUD), which was used by 41.1% of the sample. Single-variable analysis showed that the choice of contraceptive method was associated with age, education level, parity, monthly frequency of sexual intercourse, contraceptive knowledge, RTI symptoms and the gender of the last child of rural married women. There has been a significant increase in contraceptive use among rural married women in Anhui Province of China. Female sterilisation and the IUD still play the dominant role. Effective family planning methods should be advocated through adequate counselling on correct use and proper management, with consideration of the background of custom and belief.

  10. Research of mine water source identification based on LIF technology

    NASA Astrophysics Data System (ADS)

    Zhou, Mengran; Yan, Pengcheng

    2016-09-01

Because traditional chemical methods for mine water source identification take a long time, a rapid source identification system for mine water inrush based on laser-induced fluorescence (LIF) technology is put forward. The basic principle of LIF technology is analyzed in detail, the hardware composition of the LIF system is described, and the related modules were selected. Fluorescence spectra of water samples from a coal mine were obtained through fluorescence experiments in the LIF system. Traditional water source identification relies mainly on the concentrations of ions representative of the water, but these concentrations are hard to determine from fluorescence spectra. This paper proposes a simple and practical method for rapid identification of water by fluorescence spectrum: the spectral distance between unknown water samples and standard samples is measured, and then, based on clustering analysis, the category of the unknown water sample is obtained. Water source identification of unknown samples verified the reliability of the LIF system and addresses the problem that coal mines currently lack real-time online monitoring of water inrush, which is of great significance for safety in coal mine production.
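The spectral-distance step can be sketched as follows; the reference spectra, source names, and Gaussian band shapes are invented for illustration and are not taken from the LIF system described above:

```python
import numpy as np

wavelengths = np.linspace(300, 500, 50)   # nm, arbitrary grid

def band(center, width, height):
    """Toy Gaussian-shaped fluorescence band (illustrative only)."""
    return height * np.exp(-((wavelengths - center) / width) ** 2)

# Hypothetical standard spectra for three candidate water sources.
standards = {
    "sandstone water": band(350, 20, 1.0),
    "limestone water": band(400, 25, 0.8),
    "goaf water":      band(450, 15, 1.2),
}

def identify(sample):
    """Assign an unknown sample to the standard at minimum Euclidean distance."""
    return min(standards, key=lambda name: np.linalg.norm(sample - standards[name]))

# Unknown sample: a limestone-like spectrum plus measurement noise.
rng = np.random.default_rng(1)
unknown = band(402, 24, 0.78) + rng.normal(0, 0.02, wavelengths.size)
print(identify(unknown))
```

In practice the "standards" would be measured spectra of known inrush sources, and the minimum-distance assignment would feed the clustering analysis mentioned above.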

  11. Compressive Sensing of Roller Bearing Faults via Harmonic Detection from Under-Sampled Vibration Signals

    PubMed Central

    Tang, Gang; Hou, Wei; Wang, Huaqing; Luo, Ganggang; Ma, Jianwei

    2015-01-01

The Shannon sampling principle requires substantial amounts of data to ensure the accuracy of on-line monitoring of roller bearing fault signals. Challenges are often encountered as a result of the cumbersome data monitoring; thus, a novel method focused on compressed vibration signals for detecting roller bearing faults is developed in this study. Considering that harmonics often represent the fault characteristic frequencies in vibration signals, a compressive sensing frame of characteristic harmonics is proposed to detect bearing faults. A compressed vibration signal is first acquired from a sensing matrix with information preserved through a well-designed sampling strategy. A reconstruction process of the under-sampled vibration signal is then pursued as attempts are conducted to detect the characteristic harmonics from sparse measurements through a compressive matching pursuit strategy. In the proposed method, bearing fault features depend on the existence of characteristic harmonics, which are typically detected directly from the compressed data well before reconstruction is complete. The process of sampling and detection may then be performed simultaneously without complete recovery of the under-sampled signals. The effectiveness of the proposed method is validated by simulations and experiments. PMID:26473858
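A minimal sketch of harmonic detection from under-sampled measurements, assuming random time-domain sampling and a dictionary of candidate sinusoids; the fault frequency, sampling rate, and single matching-pursuit step are illustrative simplifications of the compressive matching pursuit strategy, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Full-rate grid: 1 s of vibration sampled at 1024 Hz (illustrative values).
fs, n = 1024, 1024
t = np.arange(n) / fs
fault_hz = 120.0   # hypothetical characteristic fault frequency
signal = np.sin(2 * np.pi * fault_hz * t) + 0.3 * rng.standard_normal(n)

# Compressed acquisition: keep a random 10% of the samples.
idx = np.sort(rng.choice(n, size=n // 10, replace=False))
y, ty = signal[idx], t[idx]

# One matching-pursuit step over a dictionary of candidate harmonics:
# the sinusoid most correlated with the compressed measurements wins,
# with no need to reconstruct the full signal first.
freqs = np.arange(10, 500)
atoms = np.exp(-2j * np.pi * np.outer(freqs, ty))
scores = np.abs(atoms @ y)
detected = int(freqs[np.argmax(scores)])
print(detected)
```

The harmonic stands out because its correlation adds coherently across the random samples, while noise and spectral leakage do not; this mirrors the paper's point that detection can precede (or replace) full reconstruction.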

  12. Core Hunter 3: flexible core subset selection.

    PubMed

    De Beukelaer, Herman; Davenport, Guy F; Fack, Veerle

    2018-05-31

Core collections provide genebank curators and plant breeders a way to reduce the size of their collections and populations, while minimizing the impact on genetic diversity and allele frequency. Many methods have been proposed to generate core collections, often using distance metrics to quantify the similarity of two accessions, based on genetic marker data or phenotypic traits. Core Hunter is a multi-purpose core subset selection tool that uses local search algorithms to generate subsets relying on one or more metrics, including several distance metrics and allelic richness. In version 3 of Core Hunter (CH3) we have incorporated two new, improved methods for summarizing distances to quantify diversity or representativeness of the core collection. A comparison of CH3 and Core Hunter 2 (CH2) showed that these new metrics can be effectively optimized with less complex algorithms than those used in CH2. CH3 is more effective at maximizing the improved diversity metric than CH2, still ensures a high average and minimum distance, and is faster for large datasets. Using CH3, a simple stochastic hill-climber is able to find highly diverse core collections, and the more advanced parallel tempering algorithm further increases the quality of the core and reduces variability across independent samples. We also evaluate the ability of CH3 to simultaneously maximize diversity and either representativeness or allelic richness, and compare the results with those of the GDOpt and SimEli methods. CH3 can sample cores as representative as those of GDOpt, which was specifically designed for this purpose, and is able to construct cores that are simultaneously more diverse and either more representative or higher in allelic richness than those obtained by SimEli. In version 3, Core Hunter has been updated to include two new core subset selection metrics that construct cores for representativeness or diversity, with improved performance. It combines the strengths of other methods and outperforms them, as it can (simultaneously) optimize a variety of metrics. In addition, CH3 is an improvement over CH2, with the option to use genetic marker data or phenotypic traits, or both, and improved speed. Core Hunter 3 is freely available at http://www.corehunter.org.
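As a rough illustration of distance-based core subset selection (not Core Hunter's actual local-search algorithms), a greedy maximin heuristic over a pairwise distance matrix might look like this; the marker data and distance measure are hypothetical:

```python
import numpy as np

def greedy_core(dist, k):
    """Greedy maximin core selection: repeatedly add the accession farthest
    from the current core (a crude stand-in for Core Hunter's local search)."""
    core = [int(np.argmax(dist.sum(axis=1)))]   # seed with the most remote accession
    while len(core) < k:
        d_to_core = dist[:, core].min(axis=1)   # each accession's distance to the core
        d_to_core[core] = -1.0                  # never re-select current members
        core.append(int(np.argmax(d_to_core)))
    return sorted(core)

# Toy data: 8 accessions scored at 5 binary markers (hypothetical).
rng = np.random.default_rng(42)
markers = rng.integers(0, 2, size=(8, 5))
# Simple proportion-of-differing-markers distance between accessions.
dist = np.abs(markers[:, None, :] - markers[None, :, :]).mean(axis=2).astype(float)

core = greedy_core(dist, 3)
print(core)
```

Maximizing the minimum distance within the core favors diversity; optimizing the distance from every accession to its nearest core member would instead favor representativeness, the trade-off the abstract describes.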

  13. Generating virtual training samples for sparse representation of face images and face recognition

    NASA Astrophysics Data System (ADS)

    Du, Yong; Wang, Yu

    2016-03-01

There are many challenges in face recognition. In real-world scenes, images of the same face vary with changing illumination, different expressions and poses, multiform ornaments, or even altered mental status. Limited available training samples cannot sufficiently convey these possible changes in the training phase, and this has become one of the obstacles to improving face recognition accuracy. In this article, we view the multiplication of two images of the face as a virtual face image to expand the training set, and devise a representation-based method to perform face recognition. The generated virtual samples reflect possible appearance and pose variations of the face. By multiplying a training sample with another sample from the same subject, we can strengthen the facial contour feature and greatly suppress the noise. Thus, more essential facial information is retained. Also, the uncertainty of the training data is reduced as the number of training samples increases, which is beneficial for the training phase. The devised representation-based classifier uses both the original and newly generated samples to perform the classification. In the classification phase, we first determine the K nearest training samples for the current test sample by calculating the Euclidean distances between the test sample and the training samples. Then, a linear combination of these selected training samples is used to represent the test sample, and the representation result is used to classify the test sample. The experimental results show that the proposed method outperforms some state-of-the-art face recognition methods.
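The scheme described above, virtual samples from pairwise image multiplication followed by K-nearest, representation-based classification, might be sketched as follows on toy vectors standing in for face images; the dimensions, noise levels, and rescaling of the virtual samples are assumptions, not the authors' settings:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

# Toy data: 3 subjects x 4 "images" each, flattened to 16-dim vectors.
def subject_images():
    base = rng.random(16)
    return [base + 0.05 * rng.standard_normal(16) for _ in range(4)]

train = {s: subject_images() for s in range(3)}

# Virtual samples: element-wise products of image pairs of the same subject,
# rescaled to a typical image norm (the rescaling is an assumption here).
for s in train:
    virtual = [a * b for a, b in combinations(train[s][:4], 2)]
    scale = np.linalg.norm(train[s][0])
    train[s] += [v / np.linalg.norm(v) * scale for v in virtual]

X = np.vstack([img for s in train for img in train[s]])
y = np.array([s for s in train for _ in train[s]])

def classify(test, k=6):
    # 1) pick the K nearest training samples by Euclidean distance
    near = np.argsort(np.linalg.norm(X - test, axis=1))[:k]
    # 2) represent the test sample as a linear combination of them
    coef, *_ = np.linalg.lstsq(X[near].T, test, rcond=None)
    # 3) assign to the class whose part of the combination fits best
    errs = {}
    for s in set(y[near]):
        mask = y[near] == s
        errs[s] = np.linalg.norm(test - X[near][mask].T @ coef[mask])
    return min(errs, key=errs.get)

probe = train[1][0] + 0.05 * rng.standard_normal(16)
print(classify(probe))
```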

  14. EPIFLUORESCENCE MICROSCOPY AND SOLID PHASE CYTOMETRY AS CONFIRMATORY METHODS FOR THE ENUMERATION OF PROTOZOA BY FLOW CYTOMETRY

    EPA Science Inventory

    The detection of infective protozoan parasites contained in large volume environmental samples represents a unique challenge in environmental parasitology. Compounding this problem is the fact that infective stages of many protozoan parasites do not readily replicate in media or ...

  15. Longitudinal Changes in Anthropometry and Body Composition in University Freshmen

    ERIC Educational Resources Information Center

    Hootman, Katie C.; Guertin, Kristin A.; Cassano, Patricia A.

    2017-01-01

    Objective: We investigated predictors of weight gain in college freshmen. Participants: A longitudinal cohort study followed a representative sample of freshmen (N = 264) from 8/2011 to 6/2012. Methods: Repeated measurements of anthropometry, dual-energy X-ray absorptiometry (DXA), physical activity, and diet were collected. We investigated…

  16. Method of identifying clusters representing statistical dependencies in multivariate data

    NASA Technical Reports Server (NTRS)

    Borucki, W. J.; Card, D. H.; Lyle, G. C.

    1975-01-01

The approach is first to cluster the data and then to compute spatial boundaries for the resulting clusters. The next step is to compute, from a set of Monte Carlo samples obtained from scrambled data, estimates of the probabilities of obtaining at least as many points within the boundaries as were actually observed in the original data.
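A minimal sketch of the scrambled-data Monte Carlo step, assuming for brevity that the "cluster boundary" is a bounding box around a known clump rather than the output of a clustering algorithm; all data are simulated:

```python
import numpy as np

rng = np.random.default_rng(7)

# Bivariate data with a genuine dependency: a dense clump on a uniform background.
clump = rng.normal([2.0, 2.0], 0.2, size=(30, 2))
background = rng.uniform(0.0, 4.0, size=(70, 2))
data = np.vstack([clump, background])

# "Cluster boundary": the bounding box of the clump (a clustering step would
# normally produce this region; the box is a simplification).
lo, hi = clump.min(axis=0), clump.max(axis=0)

def inside(pts):
    return int(np.all((pts >= lo) & (pts <= hi), axis=1).sum())

observed = inside(data)

# Scramble: permute each variable independently, destroying any dependence,
# and estimate P(at least as many points fall inside the boundary by chance).
trials, count = 500, 0
for _ in range(trials):
    scrambled = np.column_stack([rng.permutation(data[:, j]) for j in range(2)])
    count += inside(scrambled) >= observed
p_value = (count + 1) / (trials + 1)
print(observed, round(p_value, 3))
```

A small estimated probability indicates that the cluster reflects a real statistical dependency between the variables rather than an artifact of the marginal distributions.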

  17. IMMUNE FUNCTION AS A BIOMARKER FOR CONTAMINANT EXPOSURE IN SEABIRDS: DEVELOPMENT OF SAMPLE STORAGE AND ANALYSIS METHODS. (U915730)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  18. Suicide Ideation and Psychosocial Distress in Sub-Saharan African Youth

    ERIC Educational Resources Information Center

    Page, Randy M.; West, Joshua H.

    2011-01-01

Objectives: To determine if there is an association between psychosocial distress, health-risk behaviors and 12-month suicidal ideation among sub-Saharan African adolescents. Methods: Subjects included a cross-national sample of adolescents (N = 25,568) representing 7 African countries who completed the Global School-based Student Health Survey…

  19. Age, Marital Processes, and Depressed Affect

    ERIC Educational Resources Information Center

    Bookwala, Jamila; Jacobs, Jamie

    2004-01-01

Purpose: We examined age-cohort differences in the interrelationships among marital processes and depressed affect. Design and Methods: We used data from individuals in first marriages who participated in the National Survey of Families and Households (NSFH). The NSFH interviewed one adult per household of a nationally representative sample.…

  20. RELATIONSHIPS OF INDOOR, OUTDOOR, AND PERSONAL AIR (RIOPA). PART I. COLLECTION METHODS AND DESCRIPTIVE ANALYSES

    EPA Science Inventory

    The homes and subjects selected did not proportionally represent the greater population. Rather, homes close to sources were preferentially sampled in order to examine the impact of possibly high exposures. In addition, the characteristics of the subjects and the homes diff...

  1. Configurations of Common Childhood Psychosocial Risk Factors

    ERIC Educational Resources Information Center

    Copeland, William; Shanahan, Lilly; Costello, E. Jane; Angold, Adrian

    2009-01-01

    Background: Co-occurrence of psychosocial risk factors is commonplace, but little is known about psychiatrically-predictive configurations of psychosocial risk factors. Methods: Latent class analysis (LCA) was applied to 17 putative psychosocial risk factors in a representative population sample of 920 children ages 9 to 17. The resultant class…

  2. A sensitive method for detecting and genotyping Cryptosporidium parvum oocysts

    USDA-ARS?s Scientific Manuscript database

    Cryptosporidium parvum oocysts represent a considerable health risk to humans and animals because the parasite has a low infectious dose and usually exists in low numbers in environmental samples, which makes detection problematic. The purpose of this study was to evaluate Cryspovirus as a target f...

  3. Volatile organic compound emissions from engineered wood products

    Treesearch

    Steve Zylkowski; Charles Frihart

    2017-01-01

    Thirteen bonded engineered wood products representing those commonly used in building construction were evaluated for volatile organic chemicals using methods developed for interior bonded wood products. Although formaldehyde and acetaldehyde were emitted from all samples, they were not the dominant volatiles, which greatly depended on wood species and bonding...

  4. Learning Opportunities for Group Learning

    ERIC Educational Resources Information Center

    Gil, Alfonso J.; Mataveli, Mara

    2017-01-01

    Purpose: This paper aims to analyse the impact of organizational learning culture and learning facilitators in group learning. Design/methodology/approach: This study was conducted using a survey method applied to a statistically representative sample of employees from Rioja wine companies in Spain. A model was tested using a structural equation…

  5. Drunkorexia: Understanding the Co-Occurrence of Alcohol Consumption and Eating/Exercise Weight Management Behaviors

    ERIC Educational Resources Information Center

    Barry, Adam E.; Piazza-Gardner, Anna K.

    2012-01-01

    Objective: Examine the co-occurrence of alcohol consumption, physical activity, and disordered eating behaviors via a drunkorexia perspective. Participants: Nationally representative sample (n = 22,488) of college students completing the Fall 2008 National College Health Assessment. Methods: Hierarchical logistic regression was employed to…

  6. Redefining the WISC-R: Implications for Professional Practice and Public Policy.

    ERIC Educational Resources Information Center

    Macmann, Gregg M.; Barnett, David W.

    1992-01-01

    The factor structure of the Wechsler Intelligence Scale for Children (Revised) was examined in the standardization sample using new methods of factor analysis. The substantial overlap across factors was most parsimoniously represented by a single general factor. Implications for public policy regarding the purposes and outcomes of special…

  7. Obesity and Physical Inactivity in Rural America

    ERIC Educational Resources Information Center

    Patterson, Paul Daniel; Moore, Charity G.; Probst, Janice C.; Shinogle, Judith Ann

    2004-01-01

    Context and Purpose: Obesity and physical inactivity are common in the United States, but few studies examine this issue within rural populations. The present study uses nationally representative data to study obesity and physical inactivity in rural populations. Methods: Data came from the 1998 National Health Interview Survey Sample Adult and…

  8. Variables Related to MDTA Trainee Employment Success in Minnesota.

    ERIC Educational Resources Information Center

    Pucel, David J.

    To predict a person's use of his Manpower Development and Training Act (MDTA) training, this study attempted to supplement existing methods of evaluation, using personal descriptive data about trainees and General Aptitude Test Battery Scores. The sample under study included all students enrolled in ten MDTA projects, representing a geographical…

  9. Do we really need a large number of particles to simulate bimolecular reactive transport with random walk methods? A kernel density estimation approach

    NASA Astrophysics Data System (ADS)

    Rahbaralam, Maryam; Fernàndez-Garcia, Daniel; Sanchez-Vila, Xavier

    2015-12-01

Random walk particle tracking methods are a computationally efficient family of methods to solve reactive transport problems. While the number of particles in most realistic applications is on the order of 10^6-10^9, the number of reactive molecules even in diluted systems might be on the order of fractions of the Avogadro number. Thus, each particle actually represents a group of potentially reactive molecules. The use of a low number of particles may not only result in a loss of accuracy, but may also lead to an improper reproduction of the mixing process, limited by diffusion. Recent works have used this effect as a proxy to model incomplete mixing in porous media. In this work, we propose using a Kernel Density Estimation (KDE) of the concentrations that allows the expected results for a well-mixed solution to be obtained with a limited number of particles. The idea consists of treating each particle as a sample drawn from the pool of molecules that it represents; this way, the actual location of a tracked particle is seen as a sample drawn from the density function of the location of molecules represented by that given particle, rigorously represented by a kernel density function. The probability of reaction can be obtained by combining the kernels associated with two potentially reactive particles. We demonstrate that the observed deviation in the reaction vs. time curves in numerical experiments reported in the literature could be attributed to the statistical method used to reconstruct concentrations (fixed particle support) from discrete particle distributions, and not to the occurrence of true incomplete mixing. We further explore the evolution of the kernel size with time, linking it to the diffusion process.
Our results show that KDEs are powerful tools to improve computational efficiency and robustness in reactive transport simulations, and indicate that incomplete mixing in diluted systems should be modeled based on alternative mechanistic models and not on a limited number of particles.
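The core KDE idea, reconstructing a smooth concentration field from few particles by treating each as a kernel-smeared cloud of molecules, can be sketched as follows; the diffusion coefficient, particle count, and the Silverman bandwidth rule are illustrative choices, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(3)

# Particles from a diffusing plume at time t: positions ~ N(0, 2Dt).
D, t = 1e-3, 50.0
sigma = np.sqrt(2 * D * t)
particles = rng.normal(0.0, sigma, size=200)   # few particles, many molecules each

# KDE of concentration: each particle is a sample from the cloud of molecules
# it represents, smeared with a Gaussian kernel (Silverman bandwidth assumed).
h = 1.06 * particles.std() * particles.size ** (-1 / 5)
x = np.linspace(-1.5, 1.5, 301)
kde = np.exp(-0.5 * ((x[:, None] - particles) / h) ** 2).sum(axis=1)
kde /= particles.size * h * np.sqrt(2 * np.pi)

# Compare with the exact well-mixed (Gaussian) concentration profile.
exact = np.exp(-0.5 * (x / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
dx = x[1] - x[0]
l1_error = float(np.abs(kde - exact).sum() * dx)
print(round(l1_error, 3))
```

A histogram (fixed particle support) built from the same 200 particles would be far noisier, which is the effect the paper identifies as a spurious source of apparent incomplete mixing.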

  10. Integrating conventional and inverse representation for face recognition.

    PubMed

    Xu, Yong; Li, Xuelong; Yang, Jian; Lai, Zhihui; Zhang, David

    2014-10-01

Representation-based classification methods are all constructed on the basis of the conventional representation, which first expresses the test sample as a linear combination of the training samples and then exploits the deviation between the test sample and the expression result of every class to perform classification. However, this deviation does not always reflect the difference between the test sample and each class well. In this paper, we propose a novel representation-based classification method for face recognition. This method integrates conventional and inverse representation-based classification to better recognize the face. It first produces the conventional representation of the test sample, i.e., uses a linear combination of the training samples to represent the test sample. Then it obtains the inverse representation, i.e., provides an approximation representation of each training sample of a subject by exploiting the test sample and the training samples of the other subjects. Finally, the proposed method exploits the conventional and inverse representations to generate two kinds of scores of the test sample with respect to each class and combines them to recognize the face. The paper shows the theoretical foundation and rationale of the proposed method. Moreover, this paper for the first time shows that a basic property of the human face, i.e., its symmetry, can be exploited to generate new training and test samples. As these new samples reflect possible appearances of the face, their use enables higher accuracy. The experiments show that the proposed conventional and inverse representation-based linear regression classification (CIRLRC), an improvement to linear regression classification (LRC), can obtain very high accuracy and greatly outperforms naive LRC and other state-of-the-art conventional representation-based face recognition methods.
The accuracy of CIRLRC can be 10% greater than that of LRC.
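A naive LRC baseline, the method CIRLRC improves upon, can be sketched as follows; the per-class least-squares fit and residual comparison are the essential steps, while the toy "face" vectors, dimensions, and noise levels are invented:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy gallery: 2 subjects, 5 training vectors each (stand-ins for face images).
def subject_matrix(n=5, dim=20):
    base = rng.standard_normal(dim)
    return np.column_stack([base + 0.1 * rng.standard_normal(dim) for _ in range(n)])

gallery = {"subject_0": subject_matrix(), "subject_1": subject_matrix()}

def lrc(test):
    """Naive LRC: regress the test vector on each class's training matrix and
    pick the class with the smallest reconstruction residual."""
    residuals = {}
    for name, A in gallery.items():
        beta, *_ = np.linalg.lstsq(A, test, rcond=None)
        residuals[name] = float(np.linalg.norm(test - A @ beta))
    return min(residuals, key=residuals.get)

probe = gallery["subject_1"][:, 0] + 0.1 * rng.standard_normal(20)
print(lrc(probe))
```

CIRLRC adds a second, inverse score per class (how well the test sample helps reconstruct each training sample) and fuses the two, which is not reproduced in this sketch.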

  11. Quantitative Analysis and Stability of the Rodenticide TETS ...

    EPA Pesticide Factsheets

Journal Article: The determination of the rodenticide tetramethylenedisulfotetramine (TETS) in drinking water is reported, using automated sample preparation via solid-phase extraction and detection by isotope dilution gas chromatography-mass spectrometry. The method was characterized over twenty-two analytical batches with quality control samples. Accuracies for the low- and high-concentration quality control pools were 100 and 101%, respectively. The minimum reporting level (MRL) for TETS in this method is 0.50 µg/L. Five drinking waters representing a range of water quality parameters and disinfection practices were fortified with TETS at ten times the MRL and analyzed over a 28-day period to determine the stability of TETS in these waters. The amount of TETS measured in these samples averaged 100 ± 6% of the amount fortified, suggesting that tap water samples may be held for up to 28 days prior to analysis.

  12. Top-down analysis of protein samples by de novo sequencing techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vyatkina, Kira; Wu, Si; Dekker, Lennard J. M.

MOTIVATION: Recent technological advances have made high-resolution mass spectrometers affordable to many laboratories, thus boosting rapid development of top-down mass spectrometry, and implying a need for efficient methods for analyzing this kind of data. RESULTS: We describe a method for analysis of protein samples from top-down tandem mass spectrometry data, which capitalizes on de novo sequencing of fragments of the proteins present in the sample. Our algorithm takes as input a set of de novo amino acid strings derived from the given mass spectra using the recently proposed Twister approach, and combines them into aggregated strings endowed with offsets. The former typically constitute accurate sequence fragments of sufficiently well-represented proteins from the sample being analyzed, while the latter indicate their location in the protein sequence, and also bear information on post-translational modifications and fragmentation patterns.
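The combination step, merging de novo sequence tags into longer aggregated strings via overlaps, might be sketched as follows; this greedy merge is an illustration only, not the Twister algorithm or the authors' method, and the offset bookkeeping is simplified:

```python
def merge_tags(tags, min_overlap=3):
    """Greedily combine de novo sequence tags into longer aggregated strings
    via suffix/prefix overlaps (an illustration, not the authors' algorithm;
    offsets are tracked only for tags anchored at a known position)."""
    remaining = list(tags)
    merged = [(remaining.pop(0), 0)]          # (aggregated string, offset) pairs
    while remaining:
        tag = remaining.pop(0)
        placed = False
        for i, (agg, off) in enumerate(merged):
            # try to extend the aggregate on the right, longest overlap first
            for k in range(min(len(agg), len(tag)), min_overlap - 1, -1):
                if agg.endswith(tag[:k]):
                    merged[i] = (agg + tag[k:], off)
                    placed = True
                    break
            if placed:
                break
        if not placed:
            merged.append((tag, None))        # no anchor: offset unknown
    return merged

# Three overlapping de novo tags (invented amino acid strings).
print(merge_tags(["MKTAYIA", "AYIAKQR", "KQRQISF"]))
```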

  13. New color-based tracking algorithm for joints of the upper extremities

    NASA Astrophysics Data System (ADS)

    Wu, Xiangping; Chow, Daniel H. K.; Zheng, Xiaoxiang

    2007-11-01

To track the joints of the upper limb of stroke sufferers for rehabilitation assessment, this paper proposes a new tracking algorithm that utilizes a color-based particle filter and a novel strategy for handling occlusions. Objects are represented by their color histogram models, and a particle filter is introduced to track the objects within a probability framework. A Kalman filter, as a local optimizer, is integrated into the sampling stage of the particle filter; it steers samples to a region with high likelihood, so that fewer samples are required. A color clustering method and anatomic constraints are used to deal with the occlusion problem. Compared with the basic particle filtering method, the experimental results show that the new algorithm reduces the number of samples, and hence the computational cost, and better handles complete occlusion over a few frames.
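A minimal bootstrap particle filter conveys the sample-weight-resample cycle underlying such trackers; the color-histogram likelihood and Kalman-steered sampling described above are replaced here by a 1-D toy model with Gaussian noise, and all parameters are assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

n_particles, steps = 500, 30
true_pos, velocity = 0.0, 1.0
particles = rng.normal(0.0, 1.0, n_particles)

estimates = []
for _ in range(steps):
    true_pos += velocity                                       # hidden state moves
    obs = true_pos + rng.normal(0.0, 0.5)                      # noisy measurement
    particles += velocity + rng.normal(0.0, 0.3, n_particles)  # propagate samples
    weights = np.exp(-0.5 * ((obs - particles) / 0.5) ** 2)    # measurement likelihood
    weights /= weights.sum()
    estimates.append(float(np.sum(weights * particles)))       # weighted-mean estimate
    # Resample so the particle set concentrates in high-likelihood regions.
    particles = particles[rng.choice(n_particles, size=n_particles, p=weights)]

error = abs(estimates[-1] - true_pos)
print(round(error, 2))
```

The Kalman-steering idea in the paper moves the proposal distribution toward the measurement before sampling, so far fewer than 500 particles would be needed; that refinement is omitted here.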

  14. Whither RDS? An investigation of Respondent Driven Sampling as a method of recruiting mainstream marijuana users

    PubMed Central

    2010-01-01

    Background An important challenge in conducting social research of specific relevance to harm reduction programs is locating hidden populations of consumers of substances like cannabis who typically report few adverse or unwanted consequences of their use. Much of the deviant, pathologized perception of drug users is historically derived from, and empirically supported, by a research emphasis on gaining ready access to users in drug treatment or in prison populations with higher incidence of problems of dependence and misuse. Because they are less visible, responsible recreational users of illicit drugs have been more difficult to study. Methods This article investigates Respondent Driven Sampling (RDS) as a method of recruiting experienced marijuana users representative of users in the general population. Based on sampling conducted in a multi-city study (Halifax, Montreal, Toronto, and Vancouver), and compared to samples gathered using other research methods, we assess the strengths and weaknesses of RDS recruitment as a means of gaining access to illicit substance users who experience few harmful consequences of their use. Demographic characteristics of the sample in Toronto are compared with those of users in a recent household survey and a pilot study of Toronto where the latter utilized nonrandom self-selection of respondents. Results A modified approach to RDS was necessary to attain the target sample size in all four cities (i.e., 40 'users' from each site). The final sample in Toronto was largely similar, however, to marijuana users in a random household survey that was carried out in the same city. Whereas well-educated, married, whites and females in the survey were all somewhat overrepresented, the two samples, overall, were more alike than different with respect to economic status and employment. 
Furthermore, comparison with a self-selected sample suggests that (even modified) RDS recruitment is a cost-effective way of gathering respondents who are more representative of users in the general population than nonrandom methods of recruitment ordinarily produce. Conclusions Research on marijuana use, and other forms of drug use hidden in the general population of adults, is important for informing and extending harm reduction beyond its current emphasis on 'at-risk' populations. Expanding harm reduction in a normalizing context, through innovative research on users often overlooked, further challenges assumptions about reducing harm through prohibition of drug use and urges consideration of alternative policies such as decriminalization and legal regulation. PMID:20618944

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haudebourg, Raphael; Fichet, Pascal; Goutelard, Florence

    The detection (location and quantification) of possible contamination of nuclear facilities to be dismantled with short-range particle emitters ({sup 3}H, other low-energy β emitters, α emitters) remains a tedious and expensive task. Indeed, usual remote counters show too low a sensitivity to these non-penetrating radiations, while conventional wipe tests are irrelevant for fixed radioactivity evaluation. The only method to accurately measure activity levels consists in sampling and running advanced laboratory analyses (spectroscopy, liquid scintillation counting, pyrolysis...). Such measurements generally involve sample preparation, waste production (destructive analyses, solvents), nuclear material transportation, long durations, and significant labor mobilization. Therefore, the need to limit their number and cost easily conflicts with the necessity to perform a dense screening for sampling (to maximize the representativeness of the samples) in installations of thousands of square meters (floors, walls, ceilings), plus furniture, pipes, and other wastes. To overcome this contradiction, Digital Autoradiography (D.A.) was re-routed from biomolecular research to radiological mapping of nuclear installations under dismantling and to waste and sample analysis. After in-situ exposure to the possibly contaminated areas under investigation, commercial reusable radiosensitive phosphor screens (of a few hundred cm{sup 2}) were scanned in the appropriate laboratory device, and sharp quantitative images of the radioactivity could be obtained. The implementation of geostatistical tools in the data-processing software enabled the exhaustive characterization of concrete floors at a rate of 2 weeks per 100 m{sup 2}, at minimal cost. Various samples, such as drilled cores or tank and wood pieces, were also successfully evaluated with this method, with decisive results. Thanks to the accurate location of potential contamination spots, this approach ensures relevant and representative sampling for further laboratory analyses and should be included in the range of common tools used in dismantling. (authors)

  16. An opportunity cost approach to sample size calculation in cost-effectiveness analysis.

    PubMed

    Gafni, A; Walter, S D; Birch, S; Sendi, P

    2008-01-01

    The inclusion of economic evaluations as part of clinical trials has led to concerns about the adequacy of trial sample size to support such analysis. The analytical tool of cost-effectiveness analysis is the incremental cost-effectiveness ratio (ICER), which is compared with a threshold value (lambda) as a method to determine the efficiency of a health-care intervention. Accordingly, many of the methods suggested for calculating the sample size requirements for the economic component of clinical trials are based on the properties of the ICER. However, use of the ICER and a threshold value as a basis for determining efficiency has been shown to be inconsistent with the economic concept of opportunity cost. As a result, the validity of the ICER-based approaches to sample size calculations can be challenged. Alternative methods for determining improvements in efficiency that do not depend upon ICER values have been presented in the literature. In this paper, we develop an opportunity cost approach to calculating sample size for economic evaluations alongside clinical trials, and illustrate the approach using a numerical example. We compare the sample size requirement of the opportunity cost method with that of the ICER threshold method. In general, either method may yield the larger required sample size. However, the opportunity cost approach, although simple to use, has additional data requirements. We believe that the additional data requirements represent a small price to pay for being able to perform an analysis consistent with both the concept of opportunity cost and the problem faced by decision makers. Copyright (c) 2007 John Wiley & Sons, Ltd.
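    The threshold decision rule discussed in this abstract can be made concrete. Below is a minimal sketch of the standard ICER-versus-lambda rule that the authors critique (not their opportunity cost method), together with the algebraically equivalent net-monetary-benefit form; all numbers are hypothetical.

```python
# Illustrative sketch of the ICER threshold rule (the rule the authors
# critique), plus the equivalent net-monetary-benefit formulation.

def icer(delta_cost, delta_effect):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
    return delta_cost / delta_effect

def adopt_by_threshold(delta_cost, delta_effect, lam):
    """Threshold rule: adopt the new intervention if ICER < lambda."""
    return icer(delta_cost, delta_effect) < lam

def net_monetary_benefit(delta_cost, delta_effect, lam):
    """NMB > 0 is equivalent to ICER < lambda whenever delta_effect > 0."""
    return lam * delta_effect - delta_cost

# Hypothetical trial: $30,000 extra cost, 1.5 extra QALYs, lambda = $50,000/QALY.
print(icer(30_000, 1.5))                          # 20000.0
print(adopt_by_threshold(30_000, 1.5, 50_000))    # True
print(net_monetary_benefit(30_000, 1.5, 50_000))  # 45000.0
```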

  17. Comparison of Enterococcus Species Diversity in Marine Water and Wastewater Using Enterolert and EPA Method 1600

    PubMed Central

    Ferguson, Donna M.; Griffith, John F.; McGee, Charles D.; Weisberg, Stephen B.; Hagedorn, Charles

    2013-01-01

    EPA Method 1600 and Enterolert are used interchangeably to measure Enterococcus as an indicator of fecal contamination at public beaches, but the methods occasionally produce different results. Here we assess whether these differences are attributable to selectivity for certain species within the Enterococcus group. Both methods were used to obtain 1279 isolates from 17 environmental samples, including influent and effluent of four wastewater treatment plants, ambient marine water from seven different beaches, and freshwater urban runoff from two stream systems. The isolates were identified to species level. Detection of non-Enterococcus species was slightly higher with Enterolert (8.4%) than with EPA Method 1600 (5.1%). E. faecalis and E. faecium, commonly associated with human fecal waste, were predominant in wastewater; however, Enterolert had greater selectivity for E. faecalis, which was also shown using a laboratory-created sample. The same species selectivity was not observed for most beach water and urban runoff samples. These samples had relatively higher proportions of the plant-associated species E. casseliflavus (18.5%) and E. mundtii (5.7%) compared to wastewater, suggesting environmental inputs to beaches and runoff. The potential for species selectivity among water testing methods should be considered when assessing the sanitary quality of beaches so that public health warnings are based on indicators representative of fecal sources. PMID:23840233

  18. Analysis of laboratory compaction methods of roller compacted concrete

    NASA Astrophysics Data System (ADS)

    Trtík, Tomáš; Chylík, Roman; Bílý, Petr; Fládr, Josef

    2017-09-01

    Roller-Compacted Concrete (RCC) is an ordinary concrete poured and compacted with machines typically used for laying asphalt road layers. One of the problems connected with this technology is the preparation of representative samples in the laboratory. The aim of this work was to analyse two methods of preparation of RCC laboratory samples, with bulk density as the comparative parameter. The first method used dynamic compaction by a pneumatic hammer. The second method of compaction had a static character: the specimens were loaded by a precisely defined force in a laboratory loading machine to create the same conditions as during static rolling (in the Czech Republic, only static rolling is commonly used). Bulk densities obtained by the two compaction methods were compared with drill cores extracted from a real RCC structure. The results have shown that the samples produced by the pneumatic hammer tend to overestimate the bulk density of the material. For both compaction methods, an immediate bearing index test was performed to verify the quality of compaction. A fundamental difference between static and dynamic compaction was identified. In static compaction, the initial resistance to penetration of the mandrel was higher; after exceeding a certain limit, the resistance was constant. This means that the samples were well compacted only near the surface. Specimens made by the pneumatic hammer actively resisted throughout the test; the whole volume was uniformly compacted.

  19. Observation procedure, observer gender, and behavior valence as determinants of sampling error in a behavior assessment analogue

    PubMed Central

    Farkas, Gary M.; Tharp, Roland G.

    1980-01-01

    Several factors thought to influence the representativeness of behavioral assessment data were examined in an analogue study using a multifactorial design. Systematic and unsystematic methods of observing group behavior were investigated using 18 male and 18 female observers. Additionally, valence properties of the observed behaviors were inspected. Observers' assessments of a videotape were compared to a criterion code that defined the population of behaviors. Results indicated that systematic observation procedures were more accurate than unsystematic procedures, though this factor interacted with gender of observer and valence of behavior. Additionally, males tended to sample more representatively than females. A third finding indicated that the negatively valenced behavior was overestimated, whereas the neutral and positively valenced behaviors were accurately assessed. PMID:16795631

  20. Miniprimer PCR, a New Lens for Viewing the Microbial World▿ †

    PubMed Central

    Isenbarger, Thomas A.; Finney, Michael; Ríos-Velázquez, Carlos; Handelsman, Jo; Ruvkun, Gary

    2008-01-01

    Molecular methods based on the 16S rRNA gene sequence are used widely in microbial ecology to reveal the diversity of microbial populations in environmental samples. Here we show that a new PCR method using an engineered polymerase and 10-nucleotide “miniprimers” expands the scope of detectable sequences beyond those detected by standard methods using longer primers and Taq polymerase. After testing the method in silico to identify divergent ribosomal genes in previously cloned environmental sequences, we applied the method to soil and microbial mat samples, which revealed novel 16S rRNA gene sequences that would not have been detected with standard primers. Deeply divergent sequences were discovered with high frequency and included representatives that define two new division-level taxa, designated CR1 and CR2, suggesting that miniprimer PCR may reveal new dimensions of microbial diversity. PMID:18083877

  1. Malware analysis using visualized image matrices.

    PubMed

    Han, KyoungSoo; Kang, BooJoong; Im, Eul Gyu

    2014-01-01

    This paper proposes a novel malware visual analysis method that contains not only a visualization method to convert binary files into images, but also a similarity calculation method between these images. The proposed method generates RGB-colored pixels on image matrices using the opcode sequences extracted from malware samples and calculates the similarities for the image matrices. In particular, our proposed methods are applicable to packed malware samples by applying them to the execution traces extracted through dynamic analysis. When the images are generated, we can reduce the overheads by extracting the opcode sequences only from the blocks that include the instructions related to staple behaviors such as function and application programming interface (API) calls. In addition, we propose a technique that generates a representative image for each malware family in order to reduce the number of comparisons required to classify unknown samples; the colored pixel information in the image matrices is used to calculate the similarities between the images. Our experimental results show that the image matrices of malware can effectively be used to classify malware families both statically and dynamically, with accuracies of 0.9896 and 0.9732, respectively.
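    The pipeline described above (opcode sequences to colored image matrices to similarity scores) can be sketched as follows. The hash-based coordinate/color mapping and the pixel-match similarity below are our own simplifications for illustration, not the paper's exact scheme.

```python
# Sketch only: hash opcode 3-grams into a tiny RGB matrix, then compare two
# matrices. The mapping and similarity measure are illustrative assumptions.
import hashlib

SIZE = 8  # small matrix for demonstration

def opcode_matrix(opcodes, size=SIZE):
    """Map each 3-opcode window to (x, y, RGB) via a hash and write it into a matrix."""
    m = [[(0, 0, 0) for _ in range(size)] for _ in range(size)]
    for i in range(len(opcodes) - 2):
        h = hashlib.md5(" ".join(opcodes[i:i + 3]).encode()).digest()
        x, y = h[0] % size, h[1] % size
        m[y][x] = (h[2], h[3], h[4])  # last write wins; a real method may blend
    return m

def similarity(a, b):
    """Fraction of pixel positions whose RGB values match exactly."""
    flat_a = [p for row in a for p in row]
    flat_b = [p for row in b for p in row]
    return sum(pa == pb for pa, pb in zip(flat_a, flat_b)) / len(flat_a)

m1 = opcode_matrix(["push", "mov", "call", "ret", "mov"])
m2 = opcode_matrix(["push", "mov", "call", "ret", "mov"])
print(similarity(m1, m2))  # 1.0 for identical opcode sequences
```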

  2. Exhaled human breath measurement method for assessing exposure to halogenated volatile organic compounds.

    PubMed

    Pleil, J D; Lindstrom, A B

    1997-05-01

    The organic constituents of exhaled human breath are representative of blood-borne concentrations through gas exchange at the blood/breath interface in the lungs. The presence of specific compounds can be an indicator of recent exposure or represent a biological response of the subject. For volatile organic compounds (VOCs), sampling and analysis of breath is preferred to direct measurement from blood samples because breath collection is noninvasive, potentially infectious waste is avoided, and the measurement of gas-phase analytes is much simpler in a gas matrix than in a complex biological tissue such as blood. To exploit these advantages, we have developed the "single breath canister" (SBC) technique, a simple direct collection method for individual alveolar breath samples, and adapted conventional gas chromatography-mass spectrometry analytical methods for trace-concentration VOC analysis. The focus of this paper is to describe briefly the techniques for making VOC measurements in breath, to present some specific applications for which these methods are relevant, and to demonstrate how to estimate exposure to example VOCs on the basis of breath elimination. We present data from three different exposure scenarios: (a) vinyl chloride and cis-1,2-dichloroethene from showering with contaminated water from a private well, (b) chloroform and bromodichloromethane from high-intensity swimming in chlorinated pool water, and (c) trichloroethene from a controlled exposure chamber experiment. In all cases, for all subjects, the experiment is the same: preexposure breath measurement, exposure to halogenated VOC, and a postexposure time-dependent series of breath measurements. Data are presented only to demonstrate the use of the method and how to interpret the analytical results.
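    Estimating exposure from breath elimination, as described above, typically involves fitting a decay model to the post-exposure series. A minimal sketch under the simplifying assumption of single-exponential elimination, C(t) = C0·exp(−k·t); real breath washout is multi-compartment, and all numbers below are hypothetical, not from the paper.

```python
# Hypothetical post-exposure breath series; log-linear least squares recovers
# the elimination rate k and back-extrapolated initial concentration C0.
import math

times = [0, 10, 20, 40, 60]        # minutes after exposure (made up)
conc = [12.0, 7.3, 4.4, 1.6, 0.6]  # breath VOC concentration, ppb (made up)

# Ordinary least squares on ln(C) versus t.
n = len(times)
ln_c = [math.log(c) for c in conc]
mt, ml = sum(times) / n, sum(ln_c) / n
slope = sum((t - mt) * (l - ml) for t, l in zip(times, ln_c)) \
    / sum((t - mt) ** 2 for t in times)
k, c0 = -slope, math.exp(ml - slope * mt)
print(round(k, 3), round(c0, 1))  # elimination rate (1/min) and C0 (ppb)
```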

  3. Hematopoietic Lineage Transcriptome Stability and Representation in PAXgene™ Collected Peripheral Blood Utilising SPIA Single-Stranded cDNA Probes for Microarray

    PubMed Central

    Kennedy, Laura; Vass, J. Keith; Haggart, D. Ross; Moore, Steve; Burczynski, Michael E.; Crowther, Dan; Miele, Gino

    2008-01-01

    Peripheral blood as a surrogate tissue for transcriptome profiling holds great promise for the discovery of diagnostic and prognostic disease biomarkers, particularly when target tissues of disease are not readily available. To maximize the reliability of gene expression data generated from clinical blood samples, both the sample collection and the microarray probe generation methods should be optimized to provide stabilized, reproducible gene expression profiles that faithfully represent the transcriptional profiles of the constituent blood cell types present in the circulation. Given the increasing innovation in this field in recent years, we investigated a combination of methodological advances in both RNA stabilisation and microarray probe generation with the goal of achieving robust, reliable and representative transcriptional profiles from whole blood. To assess the whole blood profiles, the transcriptomes of purified blood cell types were measured and compared with the global transcriptomes measured in whole blood. The results demonstrate that a combination of PAXgene™ RNA stabilising technology and the single-stranded cDNA probe generation afforded by the NuGEN Ovation RNA amplification system V2™ enables an approach that yields faithful representation of specific hematopoietic cell lineage transcriptomes in whole blood without the necessity for prior sample fractionation, cell enrichment or globin reduction. Storage stability assessments of the PAXgene™ blood samples also advocate a short, fixed room-temperature storage time for all PAXgene™ blood samples collected for the purposes of global transcriptional profiling in clinical studies. PMID:19578521

  4. MCDA swing weighting and discrete choice experiments for elicitation of patient benefit-risk preferences: a critical assessment.

    PubMed

    Tervonen, Tommi; Gelhorn, Heather; Sri Bhashyam, Sumitra; Poon, Jiat-Ling; Gries, Katharine S; Rentz, Anne; Marsh, Kevin

    2017-12-01

    Multiple criteria decision analysis swing weighting (SW) and discrete choice experiments (DCE) are appropriate methods for capturing patient preferences on treatment benefit-risk trade-offs. This paper presents a qualitative comparison of the 2 methods. We review and critically assess similarities and differences of SW and DCE based on 6 aspects: comprehension by study participants, cognitive biases, sample representativeness, ability to capture heterogeneity in preferences, reliability and validity, and robustness of the results. The SW choice task can be more difficult, but the workshop context in which SW is conducted may provide more support to patients who are unfamiliar with the end points being evaluated or who have cognitive impairments. Both methods are similarly prone to a number of biases associated with preference elicitation, and DCE is prone to simplifying heuristics, which limits its application with large numbers of attributes. The low cost per patient of the DCE means that it can be better at achieving a representative sample, though SW does not require such large sample sizes due to the exact nature of the collected preference data. This also means that internal validity is automatically enforced with SW, while the internal validity of DCE results needs to be assessed manually. The choice between the 2 methods depends on the characteristics of the benefit-risk assessment, especially on how difficult the trade-offs are for the patients to make and how many patients are available. Although there exist some empirical studies on many of the evaluation aspects, critical evidence gaps remain. Copyright © 2017 John Wiley & Sons, Ltd.

  5. Radial line-scans as representative sampling strategy in dried-droplet laser ablation of liquid samples deposited on pre-cut filter paper disks

    NASA Astrophysics Data System (ADS)

    Nischkauer, Winfried; Vanhaecke, Frank; Bernacchi, Sébastien; Herwig, Christoph; Limbeck, Andreas

    2014-11-01

    Nebulising liquid samples and using the aerosol thus obtained for further analysis is the standard method in many current analytical techniques, also with inductively coupled plasma (ICP)-based devices. With such a set-up, quantification via external calibration is usually straightforward for samples with aqueous or close-to-aqueous matrix composition. However, there is a variety of more complex samples. Such samples can be found in medical, biological, technological and industrial contexts and can range from body fluids, like blood or urine, to fuel additives or fermentation broths. Specialized nebulizer systems or careful digestion and dilution are required to tackle such demanding sample matrices. One alternative approach is to convert the liquid into a dried solid and to use laser ablation for sample introduction. Up to now, this approach required the application of internal standards or matrix-adjusted calibration due to matrix effects. In this contribution, we show a way to circumvent these matrix effects while using simple external calibration for quantification. The principle of representative sampling that we propose uses radial line-scans across the dried residue. This compensates for the centro-symmetric inhomogeneities typically observed in dried spots. The effectiveness of the proposed sampling strategy is exemplified via the determination of phosphorus in biochemical fermentation media. However, we postulate that the presented measurement protocol is universally viable. Detection limits using laser ablation-ICP-optical emission spectrometry were on the order of 40 μg mL⁻¹ with a reproducibility of 10% relative standard deviation (n = 4, concentration = 10 times the quantification limit). The reported sensitivity is fit-for-purpose in the biochemical context described here, but could be improved using ICP-mass spectrometry, should future analytical tasks require it.
Trueness of the proposed method was investigated by cross-validation with conventional liquid measurements, and by analyzing IAEA-153 reference material (Trace Elements in Milk Powder); a good agreement with the certified value for phosphorus was obtained.

  6. RAPID SEPARATION METHOD FOR EMERGENCY WATER AND URINE SAMPLES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maxwell, S.; Culligan, B.

    2008-08-27

    The Savannah River Site Environmental Bioassay Lab participated in the 2008 NRIP Emergency Response program administered by the National Institute for Standards and Technology (NIST) in May, 2008. A new rapid column separation method was used for analysis of actinides and {sup 90}Sr in the NRIP 2008 emergency water and urine samples. Significant method improvements were applied to reduce analytical times. As a result, much faster analysis times were achieved, less than 3 hours for determination of {sup 90}Sr and 3-4 hours for actinides. This represents a 25%-33% improvement in analysis times from NRIP 2007 and a {approx}100% improvement compared to NRIP 2006 report times. Column flow rates were increased by a factor of two, with no significant adverse impact on the method performance. Larger sample aliquots, shorter count times, faster cerium fluoride microprecipitation and streamlined calcium phosphate precipitation were also employed. Based on initial feedback from NIST, the SRS Environmental Bioassay Lab had the most rapid analysis times for actinides and {sup 90}Sr analyses for NRIP 2008 emergency urine samples. High levels of potential matrix interferences may be present in emergency samples and rugged methods are essential. Extremely high levels of {sup 210}Po were found to have an adverse effect on the uranium results for the NRIP-08 urine samples, while uranium results for NRIP-08 water samples were not affected. This problem, which was not observed for NRIP-06 or NRIP-07 urine samples, was resolved by using an enhanced {sup 210}Po removal step, which will be described.

  7. The transferability of diatoms to clothing and the methods appropriate for their collection and analysis in forensic geoscience.

    PubMed

    Scott, Kirstie R; Morgan, Ruth M; Jones, Vivienne J; Cameron, Nigel G

    2014-08-01

    Forensic geoscience is concerned with the analysis of geological materials in order to compare and exclude environmental samples from a common source, or to identify an unknown provenance in a criminal investigation. Diatom analysis is currently an underused technique within the forensic geoscience approach, which has the potential to provide an independent ecological assessment of trace evidence. This study presents empirical data to provide a preliminary evidence base to understand the nature of diatom transfers to items of clothing, and the collection of transferred diatom trace evidence from a range of environments under experimental conditions. Three diatom extraction methods were tested on clothing that had been in contact with soil and water sites: rinsing in water (RW), rinsing in ethanol (RE), and submersion in H2O2 solution (H). Scanning electron microscopy (S.E.M.) analysis was undertaken in order to examine the degree of diatom retention on treated clothing samples. The total diatom yield and species richness data were recorded from each experimental sample in order to compare the efficacy of each method in collecting a representative sample for analysis. Similarity was explored using correspondence analysis. The results highlight the efficiency of H2O2 submersion in consistently extracting high diatom counts with representative species from clothing exposed to both aquatic and terrestrial sites. This is corroborated by S.E.M. analysis. This paper provides an important empirical evidence base both for establishing that diatoms do indeed transfer to clothing under forensic conditions in a range of environments, and for identifying that H2O2 extraction is the most efficient technique for the optimal collection of comparative samples. There is therefore potentially great value in collecting and analysing diatom components of geoforensic samples in order to aid in forensic investigation. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  8. Metadynamic metainference: Enhanced sampling of the metainference ensemble using metadynamics

    PubMed Central

    Bonomi, Massimiliano; Camilloni, Carlo; Vendruscolo, Michele

    2016-01-01

    Accurate and precise structural ensembles of proteins and macromolecular complexes can be obtained with metainference, a recently proposed Bayesian inference method that integrates experimental information with prior knowledge and deals with all sources of errors in the data as well as with sample heterogeneity. The study of complex macromolecular systems, however, requires an extensive conformational sampling, which represents a separate challenge. To address this challenge and to generate structural ensembles exhaustively and efficiently, we combine metainference with metadynamics and illustrate its application to the calculation of the free energy landscape of the alanine dipeptide. PMID:27561930

  9. Studies of erosion of solar max samples of Kapton and Teflon

    NASA Technical Reports Server (NTRS)

    Fristrom, R. M.; Benson, R. C.; Bargeron, C. B.; Phillips, T. E.; Vest, C. E.; Hoshall, C. H.; Satkiewicz, F. G.; Uy, O. M.

    1985-01-01

    Several samples of Kapton and Teflon that were exposed to solar radiation were examined. The samples represent material behavior in near-Earth space. They were searched for clues to the identity of the erosive processes and the responsible species. Interest centered on oxygen atoms, which are ubiquitous at these altitudes and are known to erode some metal surfaces. Three diagnostic methods were employed: optical microscopy, scanning electron microscopy, and Fourier transform infrared spectroscopy. Two types of simulation were used: a flow containing low-energy oxygen atoms and bombardment with 3000-volt Ar ions. Results and conclusions are presented.

  10. Does sampling using random digit dialling really cost more than sampling from telephone directories: Debunking the myths

    PubMed Central

    Yang, Baohui; Eyeson-Annan, Margo

    2006-01-01

    Background Computer assisted telephone interviewing (CATI) is widely used for health surveys. The advantages of CATI over face-to-face interviewing are timeliness and cost reduction to achieve the same sample size and geographical coverage. Two major CATI sampling procedures are used: sampling directly from the electronic white pages (EWP) telephone directory and list assisted random digit dialling (LA-RDD) sampling. EWP sampling covers telephone numbers of households listed in the printed white pages. LA-RDD sampling has a better coverage of households than EWP sampling but is considered to be more expensive due to interviewers dialling more out-of-scope numbers. Methods This study compared an EWP sample and a LA-RDD sample from the New South Wales Population Health Survey in 2003 on demographic profiles, health estimates, coefficients of variation in weights, design effects on estimates, and cost effectiveness, on the basis of achieving the same level of precision of estimates. Results The LA-RDD sample better represented the population than the EWP sample, with a coefficient of variation of weights of 1.03 for LA-RDD compared with 1.21 for EWP, and average design effects of 2.00 for LA-RDD compared with 2.38 for EWP. Also, a LA-RDD sample can save up to 14.2% in cost compared to an EWP sample to achieve the same precision for health estimates. Conclusion A LA-RDD sample better represents the population, which potentially leads to reduced bias in health estimates, and rather than costing more than EWP actually costs less. PMID:16504117
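    The reported coefficients of variation of the weights and the average design effects can be cross-checked with Kish's approximation for the design effect due to unequal weighting, deff ≈ 1 + CV(w)². The paper's design effects will also reflect other design features, so only rough agreement with the reported values is expected.

```python
# Kish's approximation: design effect from unequal weighting alone.

def kish_deff(cv_weights):
    """Approximate design effect given the coefficient of variation of the weights."""
    return 1 + cv_weights ** 2

print(round(kish_deff(1.03), 2))  # 2.06 -- near the reported LA-RDD deff of 2.00
print(round(kish_deff(1.21), 2))  # 2.46 -- near the reported EWP deff of 2.38
```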

  11. Evaluation of a Class of Simple and Effective Uncertainty Methods for Sparse Samples of Random Variables and Functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romero, Vicente; Bonney, Matthew; Schroeder, Benjamin

    When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: the central 95% of response, and the 10{sup -4} probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depend on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large database and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.
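    One simple member of this family of conservative sparse-sample methods, shown here purely as an illustration of the idea (not necessarily one of the approaches the report evaluates): with n independent samples from any continuous distribution, the sample maximum exceeds the population p-quantile with probability 1 − pⁿ, which yields a distribution-free one-sided bound.

```python
# Distribution-free one-sided bound from the sample maximum:
# P(max of n i.i.d. samples >= p-th quantile) = 1 - p**n.

def confidence_max_bounds_quantile(n, p):
    """Confidence that the maximum of n samples bounds the p-quantile."""
    return 1 - p ** n

# How many samples give 90% confidence that the max bounds the 95th percentile?
n = 1
while confidence_max_bounds_quantile(n, 0.95) < 0.90:
    n += 1
print(n)  # 45
```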

  12. Analysis of munitions constituents in groundwater using a field-portable GC-MS.

    PubMed

    Bednar, A J; Russell, A L; Hayes, C A; Jones, W T; Tackett, P; Splichal, D E; Georgian, T; Parker, L V; Kirgan, R A; MacMillan, D K

    2012-05-01

    The use of munitions constituents (MCs) at military installations can produce soil and groundwater contamination that requires periodic monitoring even after training or manufacturing activities have ceased. Traditional groundwater monitoring methods require large volumes of aqueous samples (e.g., 2-4 L) to be shipped under chain of custody, to fixed laboratories for analysis. The samples must also be packed on ice and shielded from light to minimize degradation that may occur during transport and storage. The laboratory's turn-around time for sample analysis and reporting can be as long as 45 d. This process hinders the reporting of data to customers in a timely manner; yields data that are not necessarily representative of current site conditions owing to the lag time between sample collection and reporting; and incurs significant shipping costs for samples. The current work compares a field portable Gas Chromatograph-Mass Spectrometer (GC-MS) for analysis of MCs on-site with traditional laboratory-based analysis using High Performance Liquid Chromatography with UV absorption detection. The field method provides near real-time (within ~1 h of sampling) concentrations of MCs in groundwater samples. Mass spectrometry provides reliable confirmation of MCs and a means to identify unknown compounds that are potential false positives for methods with UV and other non-selective detectors. Published by Elsevier Ltd.

  13. Simultaneous quantification of fluoxetine and norfluoxetine in colostrum and mature human milk using a 2-dimensional liquid chromatography-tandem mass spectrometry system.

    PubMed

    Lopes, Bianca Rebelo; Cassiano, Neila Maria; Carvalho, Daniela Miarelli; Moisés, Elaine Christine Dantas; Cass, Quezia Bezerra

    2018-02-20

    A two-dimensional liquid chromatography system coupled to a triple quadrupole tandem mass spectrometer (2D LC-MS/MS) was employed for the determination of fluoxetine (FLU) and norfluoxetine (N-FLU) in colostrum and mature milk by direct sample injection. With a run time of 12 min, representing a gain in analytical throughput, the validated methods furnished selectivity, extraction efficiency, accuracy, and precision in accordance with the criteria set out in the European Medicines Agency guidelines. With a linear range of 3.00-150 ng/mL for FLU and 4.00-200 ng/mL for N-FLU, the methods were applied to the analysis of colostrum and mature milk samples from nursing mothers. The paper discusses the differences and similarities in sample preparation for these two sample matrices. The reported methods advance sample preparation procedures, providing waste reduction and a sustainable approach. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. DNA barcoding and regional diversity of understudied Micropeplinae (Coleoptera: Staphylinidae) in Southwest China: phylogenetic implications and a new Micropeplus from Mount Emei.

    PubMed

    Grebennikov, Vasily V; Smetana, Aleš

    2015-02-18

    Extensive litter sampling at eight forested localities in Yunnan and Sichuan detected 381 specimens of Micropeplinae rove beetles. DNA barcoding data from 85 representative specimens were analysed to delimit species and infer their relationships. Statistical methods were implemented to assess regional species diversity of understudied Micropeplinae. The total number of sampled Micropeplinae species varied between 14 and 17, depending on a splitting versus lumping approach for allopatric populations. A single Micropeplinae species was sampled in six of eight studied localities, three species were found on Mount Gongga, while ten species were discovered on hyperdiverse Mount Emei in Sichuan. All Micropeplinae specimens from our samples belong either to the genus Cerapeplus, or to three other inclusive groups temporarily retained inside Micropeplus sensu lato. Each of the three groups potentially represents a separate genus: tesserula group, sculptus group and Micropeplus sensu stricto. A new species Micropeplus jason sp. n. from Mount Emei in Sichuan is described. Numerous illustrations introduce regional fauna and clarify the discussed morphological characters.

  15. Psychometric Validation of the Parental Bonding Instrument in a U.K. Population-Based Sample: Role of Gender and Association With Mental Health in Mid-Late Life.

    PubMed

    Xu, Man K; Morin, Alexandre J S; Marsh, Herbert W; Richards, Marcus; Jones, Peter B

    2016-08-01

    The factorial structure of the Parental Bonding Instrument (PBI) has been frequently studied in diverse samples, but no study has examined its psychometric properties in large, population-based samples. In particular, important questions, such as measurement invariance across parental and offspring gender, have not been addressed. We evaluated the PBI based on responses from a large, representative population-based sample, using an exploratory structural equation modeling method appropriate for categorical data. Analysis revealed a three-factor structure representing "care," "overprotection," and "autonomy" parenting styles. In terms of psychometric measurement validity, our results supported the complete invariance of the PBI ratings across sons and daughters rating their mothers and fathers. The PBI ratings were also robust in relation to personality and mental health status. In terms of predictive value, paternal care showed a protective effect on mental health at age 43 in sons. The PBI is a sound instrument for capturing perceived parenting styles, and is predictive of mental health in middle adulthood. © The Author(s) 2016.

  16. COMPARISON OF ECOLOGICAL COMMUNITIES: THE PROBLEM OF SAMPLE REPRESENTATIVENESS

    EPA Science Inventory

    Obtaining an adequate, representative sample of ecological communities to make taxon richness (TR) or compositional comparisons among sites is a continuing challenge. Sample representativeness literally means the similarity in species composition and relative abundance between a ...

  17. Rapid method for identification and enumeration of oral Actinomyces.

    PubMed Central

    Marucha, P T; Keyes, P H; Wittenberger, C L; London, J

    1978-01-01

    Serotype-specific antisera prepared against whole cells of Actinomyces viscosus, A. naeslundii, and A. israeli were labeled with fluorescein dye and used to detect and quantitate antigenically related microorganisms in human dental plaque. By relating the DNA content of the dental plaque microflora to the number of Actinomyces present in the plaque samples, a reproducible method was developed for specifically enumerating five serotypic representatives of this genus found in human plaque. PMID:711333

  18. Rarity and Incomplete Sampling in DNA-Based Species Delimitation.

    PubMed

    Ahrens, Dirk; Fujisawa, Tomochika; Krammer, Hans-Joachim; Eberle, Jonas; Fabrizi, Silvia; Vogler, Alfried P

    2016-05-01

    DNA-based species delimitation may be compromised by limited sampling effort and species rarity, including "singleton" representatives of species, which hampers estimates of intra- versus interspecies evolutionary processes. In a case study of southern African chafers (beetles in the family Scarabaeidae), many species and subclades were poorly represented and 48.5% of species were singletons. Using cox1 sequences from >500 specimens and ∼100 species, the Generalized Mixed Yule Coalescent (GMYC) analysis as well as various other approaches for DNA-based species delimitation (Automatic Barcode Gap Discovery (ABGD), Poisson tree processes (PTP), Species Identifier, Statistical Parsimony), frequently produced poor results if analyzing a narrow target group only, but the performance improved when several subclades were combined. Hence, low sampling may be compensated for by "clade addition" of lineages outside of the focal group. Similar findings were obtained in reanalysis of published data sets of taxonomically poorly known species assemblages of insects from Madagascar. The low performance of undersampled trees is not due to high proportions of singletons per se, as shown in simulations (with 13%, 40% and 52% singletons). However, the GMYC method was highly sensitive to variable effective population size (Ne), which was exacerbated by variable species abundances in the simulations. Hence, low sampling success and rarity of species affect the power of the GMYC method only if they reflect great differences in Ne among species. Potential negative effects of skewed species abundances and prevalence of singletons are ultimately an issue of the variation in Ne and the degree to which this is correlated with the census population size and sampling success. Clade addition beyond a limited study group can overcome poor sampling for the GMYC method, in particular under variable Ne. This effect was less pronounced for methods of species delimitation not based on coalescent models. © The Author(s) 2016. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  19. After site selection and before data analysis: sampling, sorting, and laboratory procedures used in stream benthic macroinvertebrate monitoring programs by USA state agencies

    USGS Publications Warehouse

    Carter, James L.; Resh, Vincent H.

    2001-01-01

    A survey of methods used by US state agencies for collecting and processing benthic macroinvertebrate samples from streams was conducted by questionnaire; 90 responses were received and used to describe trends in methods. The responses represented an estimated 13,000-15,000 samples collected and processed per year. Kicknet devices were used in 64.5% of the methods; other sampling devices included fixed-area samplers (Surber and Hess), artificial substrates (Hester-Dendy and rock baskets), grabs, and dipnets. Regional differences existed, e.g., the 1-m kicknet was used more often in the eastern US than in the western US. Mesh sizes varied among programs, but 80.2% of the methods used a mesh size between 500 and 600 μm. Mesh size variations within US Environmental Protection Agency regions were large, with size differences ranging from 100 to 700 μm. Most samples collected were composites; the mean area sampled was 1.7 m². Samples rarely were collected using a random method (4.7%); most samples (70.6%) were collected using "expert opinion", which may make data obtained operator-specific. Only 26.3% of the methods sorted all the organisms from a sample; the remainder subsampled in the laboratory. The most common method of subsampling was to remove 100 organisms (range = 100-550). The magnification used for sorting ranged from 1× (sorting by eye) to 30×, which results in inconsistent separation of macroinvertebrates from detritus. In addition to subsampling, 53% of the methods sorted large/rare organisms from a sample. The taxonomic level used for identifying organisms varied among taxa; Ephemeroptera, Plecoptera, and Trichoptera were generally identified to a finer taxonomic resolution (genus and species) than other taxa. Because there currently exists a large range of field and laboratory methods used by state programs, calibration among all programs to increase data comparability would be exceptionally challenging. 
However, because many techniques are shared among methods, limited testing could be designed to evaluate whether procedural differences affect the ability to determine levels of environmental impairment using benthic macroinvertebrate communities.

  20. Determination of some organophosphorus pesticides in water and watermelon samples by microextraction prior to high-performance liquid chromatography.

    PubMed

    Wang, Chun; Wu, Qiuhua; Wu, Chunxia; Wang, Zhi

    2011-11-01

    A novel method based on simultaneous liquid-liquid microextraction and carbon nanotube reinforced hollow fiber microporous membrane solid-liquid phase microextraction has been developed for the determination of six organophosphorus pesticides, i.e. isocarbophos, phosmet, parathion-methyl, triazophos, fonofos and phoxim, in water and watermelon samples prior to high-performance liquid chromatography (HPLC). Under the optimum conditions, the method shows good linearity within a range of 1-200 ng/mL for water samples and 5-200 ng/g for watermelon samples, with correlation coefficients (r) varying from 0.9990 to 0.9997 and 0.9986 to 0.9995, respectively. The limits of detection (LODs) were in the range between 0.1 and 0.3 ng/mL for water samples and between 1.0 and 1.5 ng/g for watermelon samples. The recoveries of the method at spiking levels of 5.0 and 50.0 ng/mL for water samples were between 85.4 and 100.8%, and at spiking levels of 5.0 and 50.0 ng/g for watermelon samples, they were between 82.6 and 92.4%, with relative standard deviations (RSDs) varying from 4.5-6.9% and 5.2-7.4%, respectively. The results suggest that the developed method offers a simple, low-cost procedure with high analyte preconcentration and excellent sample cleanup for the determination of organophosphorus pesticides in water and watermelon samples. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Sampling challenges in a study examining refugee resettlement.

    PubMed

    Sulaiman-Hill, Cheryl Mr; Thompson, Sandra C

    2011-03-15

    As almost half of all refugees currently under United Nations protection are from Afghanistan or Iraq and significant numbers have already been resettled outside the region of origin, it is likely that future research will examine their resettlement needs. A number of methodological challenges confront researchers working with culturally and linguistically diverse groups; however, few detailed articles are available to inform other studies. The aim of this paper is to outline challenges with sampling and recruitment of socially invisible refugee groups, describing the method adopted for a mixed methods exploratory study assessing mental health, subjective wellbeing and resettlement perspectives of Afghan and Kurdish refugees living in New Zealand and Australia. Sampling strategies used in previous studies with similar refugee groups were considered before determining the approach to recruitment. A snowball approach was adopted for the study, with multiple entry points into the communities used to reach as wide a range of people as possible, provide further contacts and reduce selection bias. Census data were used to assess the representativeness of the sample. A sample of 193 former refugee participants was recruited in Christchurch (n = 98) and Perth (n = 95); 47% were of Afghan and 53% of Kurdish ethnicity. A good gender balance (males 52%, females 48%) was achieved overall, mainly as a result of the sampling method used. Differences in the demographic composition of groups in each location were observed, especially in relation to the length of time spent in a refugee situation and time since arrival, reflecting variations in national humanitarian quota intakes. Although some measures were problematic, Census data comparison to assess reasonable representativeness of the study sample was generally reassuring. 
Snowball sampling, with multiple initiation points to reduce selection bias, was necessary to locate and identify participants, provide reassurance and break down barriers. Personal contact was critical for both recruitment and data quality, and highlighted the importance of interviewer cultural sensitivity. Cross-national comparative studies, particularly relating to refugee resettlement within different policy environments, also need to take into consideration the differing pre-migration experiences and time since arrival of refugee groups, as these can add additional layers of complexity to study design and interpretation.

  2. Behavioural cues of reproductive status in seahorses Hippocampus abdominalis.

    PubMed

    Whittington, C M; Musolf, K; Sommer, S; Wilson, A B

    2013-07-01

    A method is described to assess the reproductive status of male Hippocampus abdominalis on the basis of behavioural traits. The non-invasive nature of this technique minimizes handling stress and reduces sampling requirements for experimental work. It represents a useful tool to assist researchers in sample collection for studies of reproduction and development in viviparous syngnathids, which are emerging as important model species. © 2013 The Authors. Journal of Fish Biology © 2013 The Fisheries Society of the British Isles.

  3. Ultrasonic imaging of textured alumina

    NASA Technical Reports Server (NTRS)

    Stang, David B.; Salem, Jonathan A.; Generazio, Edward R.

    1989-01-01

    Ultrasonic images representing the bulk attenuation and velocity of a set of alumina samples were obtained by a pulse-echo contact scanning technique. The samples were taken from larger bodies that were chemically similar but were processed by extrusion or isostatic processing. The crack growth resistance and fracture toughness of the larger bodies were found to vary with processing method and test orientation. The results presented here demonstrate that differences in texture that contribute to variations in structural performance can be revealed by analytic ultrasonic techniques.

  4. Mental Health Disorders, Psychological Distress, and Suicidality in a Diverse Sample of Lesbian, Gay, Bisexual, and Transgender Youths

    PubMed Central

    Garofalo, Robert; Emerson, Erin M.

    2010-01-01

    Objectives. We examined associations of race/ethnicity, gender, and sexual orientation with mental disorders among lesbian, gay, bisexual, and transgender (LGBT) youths. Methods. We assessed mental disorders by administering a structured diagnostic interview to a community sample of 246 LGBT youths aged 16 to 20 years. Participants also completed the Brief Symptom Inventory 18 (BSI 18). Results. One third of participants met criteria for any mental disorder, 17% for conduct disorder, 15% for major depression, and 9% for posttraumatic stress disorder. Anorexia and bulimia were rare. Lifetime suicide attempts were frequent (31%) but less so in the prior 12 months (7%). Few racial/ethnic and gender differences were statistically significant. Bisexually identified youths had lower prevalences of every diagnosis. The BSI 18 had high negative predictive power (90%) and low positive predictive power (25%) for major depression. Conclusions. LGBT youths had higher prevalences of mental disorder diagnoses than youths in national samples, but were similar to representative samples of urban, racial/ethnic minority youths. Suicide behaviors were similar to those among representative youth samples in the same geographic area. Questionnaires measuring psychological distress may overestimate depression prevalence among this population. PMID:20966378

  5. A Four-Hour Yeast Bioassay for the Direct Measure of Estrogenic Activity in Wastewater without Sample Extraction, Concentration, or Sterilization

    PubMed Central

    Balsiger, Heather A.; de la Torre, Roberto; Lee, Wen-Yee; Cox, Marc B.

    2010-01-01

    The assay described here represents an improved yeast bioassay that provides a rapid yet sensitive screening method for EDCs with very little hands-on time and without the need for sample preparation. Traditional receptor-mediated reporter assays in yeast were performed twelve to twenty-four hours after ligand addition, used colorimetric substrates, and, in many cases, required high, non-physiological concentrations of ligand. With the advent of new chemiluminescent substrates, a ligand-induced signal can be detected within thirty minutes using high picomolar to low nanomolar concentrations of estrogen. As a result of the sensitivity (EC50 for estradiol is ~0.7 nM) and the very short assay time (2-4 hours), environmental water samples can typically be assayed directly without sterilization, extraction, and concentration. Thus, these assays represent rapid and sensitive approaches for determining the presence of contaminants in environmental samples. As proof of principle, we directly assayed wastewater influent and effluent taken from a wastewater treatment plant in the El Paso, TX area for the presence of estrogenic activity. The data obtained in the four-hour yeast bioassay directly correlated with GC-mass spectrometry analysis of these same water samples. PMID:20074779

  6. Percentile curves for skinfold thickness for Canadian children and youth.

    PubMed

    Kuhle, Stefan; Ashley-Martin, Jillian; Maguire, Bryan; Hamilton, David C

    2016-01-01

    Background. Skinfold thickness (SFT) measurements are a reliable and feasible method for assessing body fat in children, but their use and interpretation are hindered by the scarcity of reference values in representative populations of children. The objective of the present study was to develop age- and sex-specific percentile curves for five SFT measures (biceps, triceps, subscapular, suprailiac, medial calf) in a representative population of Canadian children and youth. Methods. We analyzed data from 3,938 children and adolescents between 6 and 19 years of age who participated in the Canadian Health Measures Survey cycles 1 (2007/2009) and 2 (2009/2011). Standardized procedures were used to measure SFT. Age- and sex-specific centiles for SFT were calculated using the GAMLSS method. Results. Percentile curves were materially different in absolute value and shape for boys and girls. Percentile curves in girls steadily increased with age, whereas percentile curves in boys were characterized by a peak centered on puberty. Conclusions. The current study has presented, for the first time, percentile curves for five SFT measures in a representative sample of Canadian children and youth.

  7. Percentile curves for skinfold thickness for Canadian children and youth

    PubMed Central

    Ashley-Martin, Jillian; Maguire, Bryan; Hamilton, David C.

    2016-01-01

    Background. Skinfold thickness (SFT) measurements are a reliable and feasible method for assessing body fat in children, but their use and interpretation are hindered by the scarcity of reference values in representative populations of children. The objective of the present study was to develop age- and sex-specific percentile curves for five SFT measures (biceps, triceps, subscapular, suprailiac, medial calf) in a representative population of Canadian children and youth. Methods. We analyzed data from 3,938 children and adolescents between 6 and 19 years of age who participated in the Canadian Health Measures Survey cycles 1 (2007/2009) and 2 (2009/2011). Standardized procedures were used to measure SFT. Age- and sex-specific centiles for SFT were calculated using the GAMLSS method. Results. Percentile curves were materially different in absolute value and shape for boys and girls. Percentile curves in girls steadily increased with age, whereas percentile curves in boys were characterized by a peak centered on puberty. Conclusions. The current study has presented, for the first time, percentile curves for five SFT measures in a representative sample of Canadian children and youth. PMID:27547554

  8. Reviewing the research methods literature: principles and strategies illustrated by a systematic overview of sampling in qualitative research.

    PubMed

    Gentles, Stephen J; Charles, Cathy; Nicholas, David B; Ploeg, Jenny; McKibbon, K Ann

    2016-10-11

    Overviews of methods are potentially useful means to increase clarity and enhance collective understanding of specific methods topics that may be characterized by ambiguity, inconsistency, or a lack of comprehensiveness. This type of review represents a distinct literature synthesis method, although to date, its methodology remains relatively undeveloped despite several aspects that demand unique review procedures. The purpose of this paper is to initiate discussion about what a rigorous systematic approach to reviews of methods, referred to here as systematic methods overviews, might look like by providing tentative suggestions for approaching specific challenges likely to be encountered. The guidance offered here was derived from experience conducting a systematic methods overview on the topic of sampling in qualitative research. The guidance is organized into several principles that highlight specific objectives for this type of review given the common challenges that must be overcome to achieve them. Optional strategies for achieving each principle are also proposed, along with discussion of how they were successfully implemented in the overview on sampling. We describe seven paired principles and strategies that address the following aspects: delimiting the initial set of publications to consider, searching beyond standard bibliographic databases, searching without the availability of relevant metadata, selecting publications on purposeful conceptual grounds, defining concepts and other information to abstract iteratively, accounting for inconsistent terminology used to describe specific methods topics, and generating rigorous verifiable analytic interpretations. 
Since a broad aim in systematic methods overviews is to describe and interpret the relevant literature in qualitative terms, we suggest that iterative decision making at various stages of the review process, and a rigorous qualitative approach to analysis are necessary features of this review type. We believe that the principles and strategies provided here will be useful to anyone choosing to undertake a systematic methods overview. This paper represents an initial effort to promote high quality critical evaluations of the literature regarding problematic methods topics, which have the potential to promote clearer, shared understandings, and accelerate advances in research methods. Further work is warranted to develop more definitive guidance.

  9. Looking for trees in the forest: summary tree from posterior samples

    PubMed Central

    2013-01-01

    Background Bayesian phylogenetic analysis generates a set of trees which are often condensed into a single tree representing the whole set. Many methods exist for selecting a representative topology for a set of unrooted trees, few exist for assigning branch lengths to a fixed topology, and even fewer for simultaneously setting the topology and branch lengths. However, there is very little research into locating a good representative for a set of rooted time trees like the ones obtained from a BEAST analysis. Results We empirically compare new and known methods for generating a summary tree. Some new methods are motivated by mathematical constructions such as tree metrics, while the rest employ tree concepts which work well in practice. These use more of the posterior than existing methods, which discard information not directly mapped to the chosen topology. Using results from a large number of simulations we assess the quality of a summary tree, measuring (a) how well it explains the sequence data under the model and (b) how close it is to the “truth”, i.e. to the tree used to generate the sequences. Conclusions Our simulations indicate that no single method is “best”. Methods producing good divergence time estimates have poor branch lengths and lower model fit, and vice versa. Using the results presented here, a user can choose the appropriate method based on the purpose of the summary tree. PMID:24093883

  10. Looking for trees in the forest: summary tree from posterior samples.

    PubMed

    Heled, Joseph; Bouckaert, Remco R

    2013-10-04

    Bayesian phylogenetic analysis generates a set of trees which are often condensed into a single tree representing the whole set. Many methods exist for selecting a representative topology for a set of unrooted trees, few exist for assigning branch lengths to a fixed topology, and even fewer for simultaneously setting the topology and branch lengths. However, there is very little research into locating a good representative for a set of rooted time trees like the ones obtained from a BEAST analysis. We empirically compare new and known methods for generating a summary tree. Some new methods are motivated by mathematical constructions such as tree metrics, while the rest employ tree concepts which work well in practice. These use more of the posterior than existing methods, which discard information not directly mapped to the chosen topology. Using results from a large number of simulations we assess the quality of a summary tree, measuring (a) how well it explains the sequence data under the model and (b) how close it is to the "truth", i.e. to the tree used to generate the sequences. Our simulations indicate that no single method is "best". Methods producing good divergence time estimates have poor branch lengths and lower model fit, and vice versa. Using the results presented here, a user can choose the appropriate method based on the purpose of the summary tree.

  11. Standard methods for sampling freshwater fishes: Opportunities for international collaboration

    USGS Publications Warehouse

    Bonar, Scott A.; Mercado-Silva, Norman; Hubert, Wayne A.; Beard, Douglas; Dave, Göran; Kubečka, Jan; Graeb, Brian D. S.; Lester, Nigel P.; Porath, Mark T.; Winfield, Ian J.

    2017-01-01

    With publication of Standard Methods for Sampling North American Freshwater Fishes in 2009, the American Fisheries Society (AFS) recommended standard procedures for North America. To explore interest in standardizing at intercontinental scales, a symposium attended by international specialists in freshwater fish sampling was convened at the 145th Annual AFS Meeting in Portland, Oregon, in August 2015. Participants represented all continents except Australia and Antarctica and were employed by state and federal agencies, universities, nongovernmental organizations, and consulting businesses. Currently, standardization is practiced mostly in North America and Europe. Participants described how standardization has been important for management of long-term data sets, promoting fundamental scientific understanding, and assessing efficacy of large spatial scale management strategies. Academics indicated that standardization has been useful in fisheries education because time previously used to teach how sampling methods are developed is now more devoted to diagnosis and treatment of problem fish communities. Researchers reported that standardization allowed increased sample size for method validation and calibration. Group consensus was to retain continental standards where they currently exist but to further explore international and intercontinental standardization, specifically identifying where synergies and bridges exist and identifying means to collaborate with scientists where standardization is limited but interest and need occur.

  12. Using ancestry matching to combine family-based and unrelated samples for genome-wide association studies‡

    PubMed Central

    Crossett, Andrew; Kent, Brian P.; Klei, Lambertus; Ringquist, Steven; Trucco, Massimo; Roeder, Kathryn; Devlin, Bernie

    2015-01-01

    We propose a method to analyze family-based samples together with unrelated cases and controls. The method builds on the idea of matched case–control analysis using conditional logistic regression (CLR). For each trio within the family, a case (the proband) and matched pseudo-controls are constructed, based upon the transmitted and untransmitted alleles. Unrelated controls, matched by genetic ancestry, supplement the sample of pseudo-controls; likewise, unrelated cases are paired with genetically matched controls. Within each matched stratum, the case genotype is contrasted with control and pseudo-control genotypes via CLR, using a method we call matched-CLR (mCLR). Eigenanalysis of numerous SNP genotypes provides a tool for mapping genetic ancestry. The result of such an analysis can be thought of as a multidimensional map, or eigenmap, in which the relative genetic similarities and differences amongst individuals are encoded. Once the map is constructed, new individuals can be projected onto it based on their genotypes. Successful differentiation of individuals of distinct ancestry depends on having a diverse, yet representative sample from which to construct the ancestry map. Once samples are well-matched, mCLR yields power comparable to competing methods while ensuring excellent control over Type I error. PMID:20862653
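    The trio step of this construction can be illustrated with a toy sketch (a hypothetical helper for this illustration, with genotypes modeled as unordered allele pairs): of the four genotypes a parental pair could transmit, one is the proband's, and the remaining three serve as its matched pseudo-controls.

```python
from itertools import product

def pseudo_controls(father, mother, proband):
    # Enumerate the 4 genotypes the parental pair could transmit;
    # one copy matches the proband (the case), and the other 3 are the
    # matched pseudo-controls contrasted with it in the conditional
    # logistic regression.
    possible = [tuple(sorted((f, m))) for f, m in product(father, mother)]
    possible.remove(tuple(sorted(proband)))  # drop one copy of the case genotype
    return possible

print(pseudo_controls(("A", "a"), ("A", "a"), ("A", "A")))
# → [('A', 'a'), ('A', 'a'), ('a', 'a')]
```

    Keeping duplicate pseudo-control genotypes (rather than deduplicating) preserves the transmission probabilities that the matched analysis conditions on.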

  13. Fast, Exact Bootstrap Principal Component Analysis for p > 1 million

    PubMed Central

    Fisher, Aaron; Caffo, Brian; Schwartz, Brian; Zipunnikov, Vadim

    2015-01-01

    Many have suggested a bootstrap procedure for estimating the sampling variability of principal component analysis (PCA) results. However, when the number of measurements per subject (p) is much larger than the number of subjects (n), calculating and storing the leading principal components from each bootstrap sample can be computationally infeasible. To address this, we outline methods for fast, exact calculation of bootstrap principal components, eigenvalues, and scores. Our methods leverage the fact that all bootstrap samples occupy the same n-dimensional subspace as the original sample. As a result, all bootstrap principal components are limited to the same n-dimensional subspace and can be efficiently represented by their low dimensional coordinates in that subspace. Several uncertainty metrics can be computed solely based on the bootstrap distribution of these low dimensional coordinates, without calculating or storing the p-dimensional bootstrap components. Fast bootstrap PCA is applied to a dataset of sleep electroencephalogram recordings (p = 900, n = 392), and to a dataset of brain magnetic resonance images (MRIs) (p ≈ 3 million, n = 352). For the MRI dataset, our method allows for standard errors for the first 3 principal components based on 1000 bootstrap samples to be calculated on a standard laptop in 47 minutes, as opposed to approximately 4 days with standard methods. PMID:27616801
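    The subspace argument can be sketched in NumPy (an illustrative reconstruction on simulated data, not the authors' code): every column resample of the data matrix stays inside the column space of the original sample, so an SVD of the small n × n coordinate matrix recovers each bootstrap PC exactly, without re-decomposing any p-dimensional array.

```python
import numpy as np

rng = np.random.default_rng(1)
p, n = 2000, 40
X = rng.normal(size=(p, n))          # p measurements per subject, n subjects

# Thin SVD of the original sample; columns of U span the n-dim subspace
U, s, Vt = np.linalg.svd(X, full_matrices=False)
R = np.diag(s) @ Vt                  # n x n low-dimensional coordinates, X = U @ R

idx = rng.integers(0, n, size=n)     # one bootstrap resample of subjects

# Fast route: SVD of the small coordinate matrix only
Ub, sb, _ = np.linalg.svd(R[:, idx], full_matrices=False)
pc_fast = U @ Ub[:, 0]               # leading bootstrap PC, mapped back to p-space

# Slow route: SVD of the full p x n bootstrap matrix, for comparison
Ud, sd, _ = np.linalg.svd(X[:, idx], full_matrices=False)

print(np.isclose(sb[0], sd[0]))                  # same leading singular value
print(np.isclose(abs(pc_fast @ Ud[:, 0]), 1.0))  # same leading PC, up to sign
```

    In a real bootstrap loop, only the n-dimensional coordinates Ub and sb would be stored per resample, which is what makes p in the millions tractable.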

  14. Histopathological Image Classification using Discriminative Feature-oriented Dictionary Learning

    PubMed Central

    Vu, Tiep Huu; Mousavi, Hojjat Seyed; Monga, Vishal; Rao, Ganesh; Rao, UK Arvind

    2016-01-01

    In histopathological image analysis, feature extraction for classification is a challenging task due to the diversity of histology features suitable for each problem as well as the presence of rich geometrical structures. In this paper, we propose an automatic feature discovery framework via learning class-specific dictionaries, and present a low-complexity method for classification and disease grading in histopathology. Essentially, our Discriminative Feature-oriented Dictionary Learning (DFDL) method learns class-specific dictionaries such that, under a sparsity constraint, the learned dictionaries allow a new image sample to be represented parsimoniously via the dictionary corresponding to the class identity of the sample. At the same time, each dictionary is designed to be poorly capable of representing samples from other classes. Experiments on three challenging real-world image databases: 1) histopathological images of intraductal breast lesions, 2) mammalian kidney, lung and spleen images provided by the Animal Diagnostics Lab (ADL) at Pennsylvania State University, and 3) brain tumor images from The Cancer Genome Atlas (TCGA) database, reveal the merits of our proposal over state-of-the-art alternatives. Moreover, we demonstrate that DFDL exhibits a more graceful decay in classification accuracy with the number of training images, which is highly desirable in practice, where generous training data are often not available. PMID:26513781
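    The decision rule implied by the abstract, assigning a sample to the class whose dictionary represents it best, can be sketched as follows (a toy least-squares stand-in for DFDL's sparse coding; the dictionaries here are random rather than learned):

```python
import numpy as np

rng = np.random.default_rng(0)

# Three hypothetical class dictionaries with orthonormal atoms (columns).
# In DFDL these would be learned; random ones suffice to show the decision rule.
d_feat, n_atoms = 64, 20
dicts = [np.linalg.qr(rng.normal(size=(d_feat, n_atoms)))[0] for _ in range(3)]

def classify(x, dicts):
    """Assign x to the class whose dictionary reconstructs it with the
    smallest residual (least squares stands in for DFDL's sparse coding)."""
    resid = []
    for D in dicts:
        code, *_ = np.linalg.lstsq(D, x, rcond=None)
        resid.append(np.linalg.norm(x - D @ code))
    return int(np.argmin(resid))

# A sample synthesized from class 1's dictionary is recovered as class 1.
x = dicts[1] @ rng.normal(size=n_atoms)
print(classify(x, dicts))  # -> 1
```

    DFDL's design goal maps directly onto this rule: a dictionary that represents its own class well but other classes poorly widens the residual gap that the argmin exploits.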

  15. Methods for assessing long-term mean pathogen count in drinking water and risk management implications.

    PubMed

    Englehardt, James D; Ashbolt, Nicholas J; Loewenstine, Chad; Gadzinski, Erik R; Ayenu-Prah, Albert Y

    2012-06-01

    Recently, pathogen counts in drinking and source waters were shown theoretically to follow the discrete Weibull (DW) or the closely related discrete growth distribution (DGD). The result was demonstrated against nine short-term and three simulated long-term water quality datasets. These distributions are highly skewed, such that available datasets seldom capture the rare but important high-count events, making estimation of the long-term mean difficult. In the current work, the methods, and the data record length, required to assess the long-term mean microbial count were evaluated by simulation of representative DW and DGD waterborne pathogen count distributions. Microbial count data were also analyzed spectrally for correlation and cycles. In general, longer data records were required for more highly skewed distributions, conceptually associated with more highly treated water. In particular, 500-1,000 random samples were required for reliable assessment of the population mean ±10%, though 50-100 samples produced an estimate within one log (45%) below. A simple correlated first-order model was shown to produce count series with a 1/f signal, and such periodicity over many scales was shown in empirical microbial count data, for consideration in sampling. A tiered management strategy is recommended, including a plan for rapid response to unusual levels of routinely monitored water quality indicators.
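    The sampling question can be reproduced in miniature. The sketch below (assuming the Type I discrete Weibull parameterization P(X ≥ x) = q^(x^β); parameter values are illustrative, not taken from the paper) draws counts by inverse transform and shows how slowly the sample mean stabilizes for a heavy-tailed (β < 1) distribution:

```python
import numpy as np

rng = np.random.default_rng(0)

def rdweibull(q, beta, size):
    """Inverse-transform draws from the Type I discrete Weibull:
    P(X >= x) = q**(x**beta) for x = 0, 1, 2, ..."""
    u = rng.random(size)
    return np.floor((np.log(u) / np.log(q)) ** (1.0 / beta)).astype(np.int64)

# beta < 1 gives the heavy right tail associated with highly treated water,
# where rare high-count events carry much of the long-term mean.
counts = rdweibull(q=0.5, beta=0.4, size=100_000)

# Running mean vs. record length: short records tend to miss the rare
# spikes, so small-sample means usually sit below the long-term mean.
for m in (50, 500, 5000, 100_000):
    print(f"n={m:>7}: mean={counts[:m].mean():.2f}")
```

    The underestimation by short records mirrors the paper's finding that 50-100 samples typically land well below the population mean, while hundreds to a thousand are needed for ±10% accuracy.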

  16. Low faunal diversity on Maltese sandy beaches: fact or artefact?

    NASA Astrophysics Data System (ADS)

    Deidun, Alan; Azzopardi, Marthese; Saliba, Stephen; Schembri, Patrick J.

    2003-10-01

    Eight sandy beaches on Malta and two on Gozo were sampled for macrofauna to test the hypothesis that Maltese beaches have an intrinsically low diversity. Stations distributed in the supralittoral (dry zone), mediolittoral (wet zone) and upper infralittoral (submerged zone to 1 m water depth) were sampled by sieving core samples and standardised searching during daytime, and by pitfall trapping and standardised sweeping of the water column using a hand-net at night, as appropriate. Physical parameters of the sediment were measured and human occupancy of the beaches was estimated. From the supralittoral and mediolittoral, 39 species represented by 1584 individuals were collected by the combined techniques of pitfall trapping, sieving and standardised searching. For Ramla beach, which had the highest diversity, 267 individuals representing 25 infaunal species were collected by sieving from a combined volume of 1.175 m³ of sand, and 149 individuals representing 28 epifaunal species were collected by standardised searching from a combined area of 700 m² of sand during two winter and two summer sampling sessions between 1992 and 1993. For nine other beaches sampled during the summer of 2000, only six macrofaunal species were collected from core samples, with overall population densities ranging from 4.13 to 45.45 individuals m⁻². Only 92 individuals belonging to 12 species were collected by hand-net from the uppermost infralittoral of five beaches sampled using this method during the summer of 2000. Taxa of gastropods, bivalves, decapods, mysids and staphylinid beetles, generally abundant on Mediterranean sandy beaches, were entirely absent from the beaches sampled. Few correlations that could explain the impoverishment of Maltese sandy beaches were found between physical parameters and faunal abundances, and other factors such as inadequate sampling effort, human disturbance and marine pollution were also excluded; however, seasonally biased sampling may partly explain the results obtained. One factor that may explain why certain species are missing could be lack of recruitment, due to Malta's geographical isolation from the European and African mainlands.

  17. SSAGES: Software Suite for Advanced General Ensemble Simulations.

    PubMed

    Sidky, Hythem; Colón, Yamil J; Helfferich, Julian; Sikora, Benjamin J; Bezik, Cody; Chu, Weiwei; Giberti, Federico; Guo, Ashley Z; Jiang, Xikai; Lequieu, Joshua; Li, Jiyuan; Moller, Joshua; Quevillon, Michael J; Rahimi, Mohammad; Ramezani-Dakhel, Hadi; Rathee, Vikramjit S; Reid, Daniel R; Sevgen, Emre; Thapar, Vikram; Webb, Michael A; Whitmer, Jonathan K; de Pablo, Juan J

    2018-01-28

    Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulation packages. SSAGES allows facile application of a variety of enhanced sampling techniques, including adaptive biasing force, string methods, and forward flux sampling, that extract meaningful free energy and transition path data from all-atom and coarse-grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables available in the current release of the suite. The code may be found at: https://github.com/MICCoM/SSAGES-public.

  18. Slaughterhouses Fungal Burden Assessment: A Contribution for the Pursuit of a Better Assessment Strategy

    PubMed Central

    Viegas, Carla; Faria, Tiago; dos Santos, Mateus; Carolino, Elisabete; Sabino, Raquel; Quintal Gomes, Anita; Viegas, Susana

    2016-01-01

    In slaughterhouses, biological risk arises not only from direct or indirect contact with animal matter, but also from exposure to bioaerosols. Fungal contamination has already been reported on the floors and walls of slaughterhouses. This study assesses fungal contamination by cultural and molecular methods in poultry, swine/bovine and large-animal slaughterhouses. Air samples were collected by an impaction method, while surface samples were collected by swabbing and subjected to further macroscopic and microscopic observation. In addition, we collected air samples using the impinger method in order to perform real-time quantitative PCR (qPCR) amplification of genes from specific fungal species, namely the A. flavus, A. fumigatus and A. ochraceus complexes. The poultry and swine/bovine slaughterhouses each presented two sampling sites that surpassed the guideline of 150 CFU/m³. Scopulariopsis candida was the most frequently isolated species (59.5%) in poultry slaughterhouse air, Cladosporium sp. (45.7%) in the swine/bovine slaughterhouse, and Penicillium sp. (80.8%) in the large-animal slaughterhouse. Molecular tools successfully amplified DNA from the A. fumigatus complex at six sampling sites where this species had not been identified by conventional methods. Besides suggesting indicators representative of harmful fungal contamination, this study also outlines a strategy that can serve as a protocol to ensure proper characterization of occupational exposure to fungi. PMID:27005642

  19. TPH detection in groundwater: Identification and elimination of positive interferences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zemo, D.A.; Synowiec, K.A.

    1996-01-01

    Groundwater assessment programs frequently require total petroleum hydrocarbon (TPH) analyses (Methods 8015M and 418.1). TPH analyses are often unreliable indicators of water quality because these methods are not constituent-specific and are vulnerable to significant sources of positive interferences. These positive interferences include: (a) non-dissolved petroleum constituents; (b) soluble, non-petroleum hydrocarbons (e.g., biodegradation products); and (c) turbidity, commonly introduced into water samples during sample collection. In this paper, we show that the portion of a TPH concentration not directly the result of water-soluble petroleum constituents can be attributed solely to these positive interferences. To demonstrate the impact of these interferences, we conducted a field experiment at a site affected by degraded crude oil. Although TPH was consistently detected in groundwater samples, BTEX was not detected. PNAs were not detected, except for very low concentrations of fluorene (<5 µg/L). Filtering and silica gel cleanup steps were added to sampling and analyses to remove particulates and biogenic by-products. Results showed that filtering lowered the Method 8015M concentrations and reduced the Method 418.1 concentrations to non-detectable levels. Silica gel cleanup reduced the Method 8015M concentrations to non-detectable levels. We conclude from this study that the TPH results from groundwater samples are artifacts of positive interferences caused by both particulates and biogenic materials and do not represent dissolved-phase petroleum constituents.

  20. TPH detection in groundwater: Identification and elimination of positive interferences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zemo, D.A.; Synowiec, K.A.

    1996-12-31

    Groundwater assessment programs frequently require total petroleum hydrocarbon (TPH) analyses (Methods 8015M and 418.1). TPH analyses are often unreliable indicators of water quality because these methods are not constituent-specific and are vulnerable to significant sources of positive interferences. These positive interferences include: (a) non-dissolved petroleum constituents; (b) soluble, non-petroleum hydrocarbons (e.g., biodegradation products); and (c) turbidity, commonly introduced into water samples during sample collection. In this paper, we show that the portion of a TPH concentration not directly the result of water-soluble petroleum constituents can be attributed solely to these positive interferences. To demonstrate the impact of these interferences, we conducted a field experiment at a site affected by degraded crude oil. Although TPH was consistently detected in groundwater samples, BTEX was not detected. PNAs were not detected, except for very low concentrations of fluorene (<5 µg/L). Filtering and silica gel cleanup steps were added to sampling and analyses to remove particulates and biogenic by-products. Results showed that filtering lowered the Method 8015M concentrations and reduced the Method 418.1 concentrations to non-detectable levels. Silica gel cleanup reduced the Method 8015M concentrations to non-detectable levels. We conclude from this study that the TPH results from groundwater samples are artifacts of positive interferences caused by both particulates and biogenic materials and do not represent dissolved-phase petroleum constituents.

  1. An Improved Nested Sampling Algorithm for Model Selection and Assessment

    NASA Astrophysics Data System (ADS)

    Zeng, X.; Ye, M.; Wu, J.; WANG, D.

    2017-12-01

    The multimodel strategy is a general approach for treating model structure uncertainty in recent research. The unknown groundwater system is represented by several plausible conceptual models, and each alternative conceptual model is assigned a weight representing its plausibility. In the Bayesian framework, the posterior model weight is computed as the product of the model prior weight and the marginal likelihood (also termed the model evidence). As a result, estimating marginal likelihoods is crucial for reliable model selection and assessment in multimodel analysis. The nested sampling estimator (NSE) is a newly proposed algorithm for marginal likelihood estimation. NSE searches the parameter space gradually, from low-likelihood areas to high-likelihood areas, and this evolution is carried out iteratively via a local sampling procedure; the efficiency of NSE is therefore dominated by the strength of that local sampling procedure. Currently, the Metropolis-Hastings (M-H) algorithm and its variants are often used for local sampling in NSE. However, M-H is not an efficient sampling algorithm for high-dimensional or complex likelihood functions. To improve the performance of NSE, a more efficient and elaborate sampling algorithm, DREAM(ZS), can be integrated into the local sampling step. In addition, to overcome the computational burden of the many repeated model executions required for marginal likelihood estimation, an adaptive sparse grid stochastic collocation method is used to build surrogates for the original groundwater model.
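    A bare-bones nested sampling estimator on a toy one-dimensional problem illustrates the algorithm's structure: the lowest-likelihood live point is repeatedly replaced by a draw from the constrained prior, and the evidence accumulates from geometrically shrinking prior-mass weights. Simple rejection sampling stands in for the local M-H/DREAM(ZS) step discussed above; all numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: uniform prior on [-5, 5] (density 1/10), standard normal
# likelihood. True evidence: Z = (1/10) * integral of N(0,1) over [-5, 5] ~ 0.1.
def log_like(theta):
    return -0.5 * theta**2 - 0.5 * np.log(2.0 * np.pi)

N, n_iter = 400, 2400                  # live points, iterations
live = rng.uniform(-5.0, 5.0, N)
live_ll = log_like(live)

log_z = -np.inf
for i in range(n_iter):
    worst = np.argmin(live_ll)
    # Prior mass shrinks geometrically, X_i ~ exp(-i/N); the evidence
    # contribution is L_worst * (X_i - X_{i+1}).
    log_wt = live_ll[worst] - i / N + np.log(1.0 - np.exp(-1.0 / N))
    log_z = np.logaddexp(log_z, log_wt)
    # Replace the worst point with a prior draw above the likelihood threshold.
    # This rejection step is the "local sampling" that M-H or DREAM(ZS) speeds up.
    while True:
        cand = rng.uniform(-5.0, 5.0)
        if log_like(cand) > live_ll[worst]:
            live[worst], live_ll[worst] = cand, log_like(cand)
            break

# Add the evidence still carried by the final live points.
log_z = np.logaddexp(log_z, np.log(np.exp(live_ll).mean()) - n_iter / N)
Z = np.exp(log_z)
print(Z)  # close to 0.1
```

    The rejection step's acceptance rate collapses as the likelihood threshold rises, which is exactly why a stronger local sampler such as DREAM(ZS) pays off in higher dimensions.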

  2. Minerals and Trace Elements in Milk, Milk Products, Infant Formula, and Adult/Pediatric Nutritional Formula, ICP-MS Method: Collaborative Study, AOAC Final Action 2015.06, ISO/DIS 21424, IDF 243.

    PubMed

    Pacquette, Lawrence H; Thompson, Joseph J; Malaviole, I; Zywicki, R; Woltjes, F; Ding, Y; Mittal, A; Ikeuchi, Y; Sadipiralla, B; Kimura, S; Veltman, H; Miura, A

    2018-03-01

    AOAC Final Action Official Method(SM) 2015.06, "Minerals and Trace Elements in Milk, Milk Products, Infant Formula and Adult/Pediatric Nutritional Formula, ICP-MS Method", was collaboratively studied. Note that "milk, milk products" has now been added to the title of the Final Action method because whole milk and several dairy ingredients were successfully incorporated into the collaborative study for the purpose of developing an International Organization for Standardization/International Dairy Federation standard (ISO/DIS 21424; in progress). The method determines sodium, magnesium, phosphorus, potassium, calcium, iron, manganese, zinc, copper, chromium, molybdenum, and selenium by inductively coupled plasma (ICP)-MS after microwave digestion. Ten laboratories participated in the study, and data from five different ICP-MS instrument models were represented. Thirteen products, five placebo products, and six dairy samples were tested as blind duplicates, along with a standard reference material, for a total of 50 samples. The overall repeatability and reproducibility for all samples met the Standard Method Performance Requirements put forth by the AOAC Stakeholder Panel on Infant Formula and Adult Nutritionals, with a few exceptions. Comparisons are made to ICP-atomic emission data from a collaborative study of AOAC Official Method 2011.14 carried out concurrently on these same samples.

  3. Evaluation of Legionella pneumophila contamination in Italian hotel water systems by quantitative real-time PCR and culture methods.

    PubMed

    Bonetta, Sa; Bonetta, Si; Ferretti, E; Balocco, F; Carraro, E

    2010-05-01

    This study was designed to define the extent of water contamination by Legionella pneumophila in certain Italian hotels and to compare quantitative real-time PCR with the conventional culture method. Nineteen Italian hotels of different sizes were investigated. In each hotel, three hot water samples (boiler, room showers, recycling) and one cold water sample (inlet) were collected. Physico-chemical parameters were also analysed. Legionella pneumophila was detected in 42% and 74% of the hotels investigated by the culture method and by real-time PCR, respectively. In 21% of samples analysed by the culture method, a concentration of >10⁴ CFU L⁻¹ was found, and Leg. pneumophila serogroup 1 was isolated from 10.5% of the hotels. The presence of Leg. pneumophila was significantly influenced by water sample temperature, while no association with water hardness or residual free chlorine was found. This study showed a high percentage of buildings colonized by Leg. pneumophila. Moreover, real-time PCR proved to be sensitive enough to detect lower levels of contamination than the culture method. This study indicates that Italian hotels represent a possible source of risk for Legionnaires' disease and confirms the sensitivity of the molecular method. To our knowledge, this is the first report to demonstrate Legionella contamination in Italian hotels using both real-time PCR and culture methods.

  4. Hydraulically controlled discrete sampling from open boreholes

    USGS Publications Warehouse

    Harte, Philip T.

    2013-01-01

    Groundwater sampling from open boreholes in fractured-rock aquifers is particularly challenging because of mixing and dilution of fluid within the borehole from multiple fractures. This note presents an alternative to traditional sampling in open boreholes with packer assemblies. The alternative system, called ZONFLO (zonal flow), is based on hydraulic control of borehole flow conditions. Fluid from discrete fracture zones is hydraulically isolated, allowing for the collection of representative samples. In rough-faced open boreholes and formations with less competent rock, hydraulic containment may offer an attractive alternative to physical containment with packers. Preliminary test results indicate that a discrete zone can be effectively hydraulically isolated from other zones within a borehole for the purpose of groundwater sampling using this new method.

  5. European validation of a real-time PCR-based method for detection of Listeria monocytogenes in soft cheese.

    PubMed

    Gianfranceschi, Monica Virginia; Rodriguez-Lazaro, David; Hernandez, Marta; González-García, Patricia; Comin, Damiano; Gattuso, Antonietta; Delibato, Elisabetta; Sonnessa, Michele; Pasquali, Frederique; Prencipe, Vincenza; Sreter-Lancz, Zuzsanna; Saiz-Abajo, María-José; Pérez-De-Juan, Javier; Butrón, Javier; Kozačinski, Lidija; Tomic, Danijela Horvatek; Zdolec, Nevijo; Johannessen, Gro S; Jakočiūnė, Džiuginta; Olsen, John Elmerdahl; De Santis, Paola; Lovari, Sarah; Bertasi, Barbara; Pavoni, Enrico; Paiusco, Antonella; De Cesare, Alessandra; Manfreda, Gerardo; De Medici, Dario

    2014-08-01

    The classical microbiological method for detection of Listeria monocytogenes requires around 7 days for final confirmation, and given the perishable nature of RTE food products, there is a clear need for an alternative methodology for detection of this pathogen. This study presents an international (European-level) ISO 16140-based validation trial of a non-proprietary real-time PCR-based methodology that can generate final results by the day after the analysis is started. The methodology is based on an ISO-compatible enrichment coupled to a bacterial DNA extraction and a consolidated real-time PCR assay. Twelve laboratories from six European countries participated in this trial, and soft cheese was selected as the food model since it can represent a difficult matrix for bacterial DNA extraction and real-time PCR amplification. The limit of detection observed was down to 10 CFU per 25 g of sample, with excellent concordance and accordance values between samples and laboratories (>75%). In addition, excellent values were obtained for relative accuracy, specificity and sensitivity (82.75%, 96.70% and 97.62%, respectively) when the results of the real-time PCR-based method were compared to those of the ISO 11290-1 standard method. An interesting observation was that L. monocytogenes detection by the real-time PCR method was less affected by the presence of Listeria innocua in contaminated samples, proving it to be more reliable than the reference method in this respect. The results of this international trial demonstrate that the evaluated real-time PCR-based method represents an excellent alternative to the ISO standard, since it shows higher performance and reduces the duration of the analytical process, and it can easily be implemented routinely by competent authorities and food industry laboratories. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. Analytical Validation of Quantitative Real-Time PCR Methods for Quantification of Trypanosoma cruzi DNA in Blood Samples from Chagas Disease Patients

    PubMed Central

    Ramírez, Juan Carlos; Cura, Carolina Inés; Moreira, Otacilio da Cruz; Lages-Silva, Eliane; Juiz, Natalia; Velázquez, Elsa; Ramírez, Juan David; Alberti, Anahí; Pavia, Paula; Flores-Chávez, María Delmans; Muñoz-Calderón, Arturo; Pérez-Morales, Deyanira; Santalla, José; Guedes, Paulo Marcos da Matta; Peneau, Julie; Marcet, Paula; Padilla, Carlos; Cruz-Robles, David; Valencia, Edward; Crisante, Gladys Elena; Greif, Gonzalo; Zulantay, Inés; Costales, Jaime Alfredo; Alvarez-Martínez, Miriam; Martínez, Norma Edith; Villarroel, Rodrigo; Villarroel, Sandro; Sánchez, Zunilda; Bisio, Margarita; Parrado, Rudy; Galvão, Lúcia Maria da Cunha; da Câmara, Antonia Cláudia Jácome; Espinoza, Bertha; de Noya, Belkisyole Alarcón; Puerta, Concepción; Riarte, Adelina; Diosque, Patricio; Sosa-Estani, Sergio; Guhl, Felipe; Ribeiro, Isabela; Aznar, Christine; Britto, Constança; Yadón, Zaida Estela; Schijman, Alejandro G.

    2015-01-01

    An international study was performed by 26 experienced PCR laboratories from 14 countries to assess the performance of duplex quantitative real-time PCR (qPCR) strategies on the basis of TaqMan probes for detection and quantification of parasitic loads in peripheral blood samples from Chagas disease patients. Two methods were studied: Satellite DNA (SatDNA) qPCR and kinetoplastid DNA (kDNA) qPCR. Both methods included an internal amplification control. Reportable range, analytical sensitivity, limits of detection and quantification, and precision were estimated according to international guidelines. In addition, inclusivity and exclusivity were estimated with DNA from stocks representing the different Trypanosoma cruzi discrete typing units and Trypanosoma rangeli and Leishmania spp. Both methods were challenged against 156 blood samples provided by the participant laboratories, including samples from acute and chronic patients with varied clinical findings, infected by oral route or vectorial transmission. kDNA qPCR showed better analytical sensitivity than SatDNA qPCR with limits of detection of 0.23 and 0.70 parasite equivalents/mL, respectively. Analyses of clinical samples revealed a high concordance in terms of sensitivity and parasitic loads determined by both SatDNA and kDNA qPCRs. This effort is a major step toward international validation of qPCR methods for the quantification of T. cruzi DNA in human blood samples, aiming to provide an accurate surrogate biomarker for diagnosis and treatment monitoring for patients with Chagas disease. PMID:26320872

  7. Rumen Bacterial Community Composition in Holstein and Jersey Cows Is Different under Same Dietary Condition and Is Not Affected by Sampling Method

    PubMed Central

    Paz, Henry A.; Anderson, Christopher L.; Muller, Makala J.; Kononoff, Paul J.; Fernando, Samodha C.

    2016-01-01

    The rumen microbial community in dairy cows plays a critical role in efficient milk production. However, there is a lack of data comparing the composition of the rumen bacterial community of the main dairy breeds. This study utilizes 16S rRNA gene sequencing to describe the rumen bacterial community composition in Holstein and Jersey cows fed the same diet, sampling the rumen microbiota via the rumen cannula (Holstein cows) or esophageal tubing (both Holstein and Jersey cows). After collection of the rumen sample via esophageal tubing, particles attached to the strainer were added to the sample to ensure representative sampling of both the liquid and solid fractions of the rumen contents. Alpha diversity metrics (Chao1 and observed-OTU estimates) displayed higher (P = 0.02) bacterial richness in Holstein than in Jersey cows, and no difference (P > 0.70) in bacterial community richness due to sampling method. Principal coordinate analysis displayed distinct clustering of bacterial communities by breed, suggesting that Holstein and Jersey cows harbor different rumen bacterial communities. Family-level classification of the most abundant (>1%) differential OTUs showed that OTUs from the bacterial families Lachnospiraceae and p-2534-18B5 were predominant in Holstein cows compared to Jersey cows. Additionally, OTUs belonging to the family Prevotellaceae were differentially abundant in the two breeds. Overall, the results from this study suggest that the bacterial communities of Holstein and Jersey cows differ, and that esophageal tubing with collection of the feed particles associated with the strainer provides a representative rumen sample similar to one collected via the rumen cannula. Thus, in future studies, esophageal tubing with the addition of retained particles can be used to collect rumen samples, reducing the cost of cannulation and allowing more animals to be included in microbiome investigations, thereby increasing the statistical power of rumen microbial community evaluations. PMID:27536291
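    The Chao1 estimator used for the richness comparison has a simple closed form, S_obs + F1²/(2·F2), driven by the counts of singleton and doubleton OTUs; a minimal implementation (with hypothetical example counts) is:

```python
import numpy as np

def chao1(otu_counts):
    """Chao1 richness: S_obs + F1^2 / (2 * F2), where F1 and F2 are the
    numbers of OTUs seen exactly once and exactly twice."""
    c = np.asarray(otu_counts)
    s_obs = int(np.sum(c > 0))
    f1 = int(np.sum(c == 1))
    f2 = int(np.sum(c == 2))
    if f2 == 0:
        return s_obs + f1 * (f1 - 1) / 2.0   # bias-corrected form when F2 = 0
    return s_obs + f1**2 / (2.0 * f2)

# Hypothetical OTU table for one sample: 7 observed OTUs, F1 = 2, F2 = 3.
print(chao1([5, 1, 1, 2, 2, 2, 10]))  # 7 + 4/6 ~= 7.67
```

    Because the correction term depends only on rare OTUs, Chao1 is sensitive to sequencing depth, which is one reason the study's finding of no sampling-method effect on richness matters.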

  8. A population-based nested case control study on recurrent pneumonias in children with severe generalized cerebral palsy: ethical considerations of the design and representativeness of the study sample.

    PubMed

    Veugelers, Rebekka; Calis, Elsbeth A C; Penning, Corine; Verhagen, Arianne; Bernsen, Roos; Bouquet, Jan; Benninga, Marc A; Merkus, Peter J F M; Arets, Hubertus G M; Tibboel, Dick; Evenhuis, Heleen M

    2005-07-19

    In children with severe generalized cerebral palsy, pneumonias are a major health issue. Malnutrition, dysphagia, gastro-oesophageal reflux, impaired respiratory function and constipation are hypothesized risk factors. Still, no data are available on the relative contribution of these possible risk factors in the described population. This paper describes the initiation of a study in 194 children with severe generalized cerebral palsy on the prevalence and impact of these hypothesized risk factors for recurrent pneumonias. A nested case-control design with 18 months of follow-up was chosen. Dysphagia, respiratory function and constipation will be assessed at baseline, and malnutrition and gastro-oesophageal reflux at the end of the follow-up. The study population consists of a representative population sample of children with severe generalized cerebral palsy. Inclusion was done through care centres in a predefined geographical area rather than through hospitals. All measurements will be done on-site, which sets high demands on all measurements; where these demands could not be met by "gold standard" methods, other methods were chosen. Although the inclusion period was prolonged, the desired sample size of 300 children was not met. With a consent rate of 33%, nearly 10% of all eligible children in The Netherlands were included (n = 194). The study population differs subtly from the non-participants with regard to severity of dysphagia and prevalence rates of pneumonias and gastro-oesophageal reflux. Ethical issues complicated the study design: assessment of malnutrition and gastro-oesophageal reflux at baseline was considered unethical, since these conditions can be easily treated, so these diagnostics were postponed until the end of the follow-up. In order to include a representative sample, all eligible children in a predefined geographical area had to be contacted. To increase the consent rate, on-site measurements are the first choice, but timely inclusion is then jeopardized. The initiation of this first study among children with severe neurological impairment led to specific, unexpected problems. Despite small differences between participants and non-participating children, our sample is as representative as can be expected from any population-based study and will provide important new information to bring us further towards effective interventions to prevent pneumonias in this population.

  9. Gender Differences in the Associations among Body Mass Index, Weight Loss, Exercise, and Drinking among College Students

    ERIC Educational Resources Information Center

    Barry, Adam E.; Whiteman, Shawn; Piazza-Gardner, Anna K.; Jensen, Alexander C.

    2013-01-01

    Objective: To explore gender differences regarding weight management behaviors of college drinkers. Participants: Nationally representative sample of college students from the fall 2008 American College Health Association's National College Health Assessment II ("N" = 26,062 students). Methods: Structural equation modeling was used…

  10. Tobacco Use and Cessation Behavior Among Adolescents Participating in Organized Sports

    ERIC Educational Resources Information Center

    Castrucci, Brian C.; Gerlach, Karen K.; Kaufman, Nancy J.; Orleans, C. Tracy

    2004-01-01

    Objectives: To examine the difference in tobacco use between adolescents who participate in organized sports and those who do not. Methods: Using a cross-sectional study design, this study uses data from a nationally representative sample of adolescents enrolled in public high schools in the United States. Results: Those participating in organized…

  11. Health Risk Behaviors in a Representative Sample of Bisexual and Heterosexual Female High School Students in Massachusetts

    ERIC Educational Resources Information Center

    White Hughto, Jaclyn M.; Biello, Katie B.; Reisner, Sari L.; Perez-Brumer, Amaya; Heflin, Katherine J.; Mimiaga, Matthew J.

    2016-01-01

    Background: Differences in sexual health-related outcomes by sexual behavior and identity remain underinvestigated among bisexual female adolescents. Methods: Data from girls (N = 875) who participated in the Massachusetts Youth Risk Behavior Surveillance survey were analyzed. Weighted logistic regression models were fit to examine sexual and…

  12. Stroke Knowledge among Urban and Frontier First Responders and Emergency Medical Technicians in Montana

    ERIC Educational Resources Information Center

    McNamara, Michael J.; Oser, Carrie; Gohdes, Dorothy; Fogle, Crystelle C.; Dietrich, Dennis W.; Burnett, Anne; Okon, Nicholas; Russell, Joseph A.; DeTienne, James; Harwell, Todd S.; Helgerson, Steven D.

    2008-01-01

    Purpose: To assess stroke knowledge and practice among frontier and urban emergency medical services (EMS) providers and to evaluate the need for additional prehospital stroke training opportunities in Montana. Methods: In 2006, a telephone survey of a representative sample of EMS providers was conducted in Montana. Respondents were stratified…
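    Stratified survey designs like the Montana EMS study above allocate sample slots across strata so that the sample mirrors the population's mix. A minimal sketch of proportional stratified sampling in Python (the roster, field names, and 80/20 urban/frontier split are hypothetical illustrations, not figures from the study):

    ```python
    import random

    def stratified_sample(population, strata_key, total_n, seed=0):
        """Draw a proportionally allocated stratified random sample.

        population: list of record dicts; strata_key: field defining strata.
        Each stratum contributes sample slots in proportion to its share of
        the population, which keeps the stratum mix representative.
        """
        rng = random.Random(seed)
        # Group records by stratum, preserving first-seen order.
        strata = {}
        for rec in population:
            strata.setdefault(rec[strata_key], []).append(rec)
        sample = []
        for members in strata.values():
            # Proportional allocation, at least one record per stratum.
            n = max(1, round(total_n * len(members) / len(population)))
            sample.extend(rng.sample(members, min(n, len(members))))
        return sample

    # Hypothetical provider roster: 80 urban, 20 frontier.
    roster = [{"id": i, "region": "urban" if i < 80 else "frontier"}
              for i in range(100)]
    picked = stratified_sample(roster, "region", total_n=20)
    counts = {}
    for rec in picked:
        counts[rec["region"]] = counts.get(rec["region"], 0) + 1
    print(counts)  # proportional: 16 urban, 4 frontier
    ```

    With proportional allocation, stratum estimates for small strata can rest on very few respondents; surveys that need reliable within-stratum comparisons often oversample small strata instead and reweight afterwards.
    
    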

  13. Multiple Health Risk Behaviors in Adolescents: An Examination of Youth Risk Behavior Survey Data

    ERIC Educational Resources Information Center

    Coleman, Casey; Wileyto, E. Paul; Lenhart, Clare M.; Patterson, Freda

    2014-01-01

    Background: Chronic disease risk factors tend to cooccur. Purpose: This study examined the cooccurrence of 8 negative health behaviors in a representative sample of urban adolescents to inform educational interventions. Methods: The prevalence, cooccurrence, and clustering of suicide attempt, lifetime history of sexual activity, tobacco use, cell…

  14. Trauma-Related Impairment in Children--A Survey in Sri Lankan Provinces Affected by Armed Conflict

    ERIC Educational Resources Information Center

    Elbert, Thomas; Schauer, Maggie; Schauer, Elisabeth; Huschka, Bianca; Hirth, Michael; Neuner, Frank

    2009-01-01

    Objectives: The present study examined traumatic experiences, PTSD, and co-morbid symptoms in relation to neuropsychological and school performance in school children affected by two decades of civil war and unrest. Method: The epidemiological survey of children's mental health included a representative sample of 420 school children. Local…

  15. Dietary Intake among U.S. Adults with Disability

    ERIC Educational Resources Information Center

    An, Ruopeng; Chiu, Chung-Yi

    2015-01-01

    Purpose: Physical, mental, and financial barriers among individuals with disability may limit their access to fruits and vegetables. In this study, we examined the relationship between disability status and vegetable, fruit, and fruit juice intake among U.S. adults aged 18 years and older using a large nationally representative sample. Methods:…

  16. The Prevalence of Stalking among College Students: The Disparity between Researcher- and Self-Identified Victimization

    ERIC Educational Resources Information Center

    McNamara, Corinne L.; Marsil, Dorothy F.

    2012-01-01

    Objective: Researchers examined the prevalence of self-identified and researcher-identified stalking victimization among college students. Participants and Methods: A representative sample of 1,573 (70.1% female; 29.9% male) student respondents completed an online stalking questionnaire. Results: Overall, 12% self-identified as having been…

  17. Being Bullied and Psychosocial Adjustment among Middle School Students in China

    ERIC Educational Resources Information Center

    Cheng, Yulan; Newman, Ian M.; Qu, Ming; Mbulo, Lazarous; Chai, Yan; Chen, Yan; Shell, Duane F.

    2010-01-01

    Background: Using the Chinese version of the Global School-based Health Survey (GSHS), this article describes the prevalence of being bullied among a nationally representative sample of Chinese students in grades 6-10 and explores the relationships between being bullied and selected indicators of psychosocial adjustment. Methods: A total of 9015…

  18. Dual Use of Cigarettes and Smokeless Tobacco among South African Adolescents

    ERIC Educational Resources Information Center

    Rantao, Masego; Ayo-Yusuf, Olalekan A.

    2012-01-01

    Objectives: To determine factors associated with dual use of tobacco products in a population of black South African adolescents. Methods: Data were obtained from a self-administered questionnaire completed by a representative sample of grade 8 students from 21 randomly selected secondary state schools in the Limpopo Province, South Africa (n =…

  19. Correlates of Sexual Abuse and Smoking among French Adults

    ERIC Educational Resources Information Center

    King, Gary; Guilbert, Philippe; Ward, D. Gant; Arwidson, Pierre; Noubary, Farzad

    2006-01-01

    Objective: The goal of this study was to examine the association between sexual abuse (SA) and initiation, cessation, and current cigarette smoking among a large representative adult population in France. Method: A random sample size of 12,256 adults (18-75 years of age) was interviewed by telephone concerning demographic variables, health…

  20. Parent and conjugated estrogens and progestagens in surface water of the Santa Ana River: Determination, occurrence, and risk assessment

    USDA-ARS?s Scientific Manuscript database

    This study presents a sensitive analytical method using high performance liquid chromatography tandem mass spectrometry for the simultaneous monitoring of five estrogen conjugates, six estrogens and two progestagens in surface water of the Santa Ana River. Samples at ten representative sites along t...

  1. Seismic Propagation in the Kuriles/Kamchatka Region

    DTIC Science & Technology

    1980-07-25

    model the final profile is well-represented by a spline interpolation. Figure 7 shows the sampling grid used to input velocity perturbations due to the... A modification of Cagniard’s method for solving seismic pulse problems, Appl. Sci. Res. B, 8, p. 349, 1960. Fuchs, K. and G. Muller, Computation of

  2. "Feedback" For Instructional Television.

    ERIC Educational Resources Information Center

    Schramm, Wilbur

    A number of different methods have been used by instructional television (ITV) projects to obtain audience feedback, and some of these are now being used in the ITV system in El Salvador. We know that pretesting programs on a representative sample can bring considerable gains in learning. Another feedback source can be a classroom of pupils in the…

  3. 12 CFR 715.8 - Requirements for verification of accounts and passbooks.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... selection; (ii) A sample which is representative of the population from which it was selected; (iii) An equal chance of selecting each dollar in the population; (iv) Sufficient accounts in both number and... consistent with GAAS if such methods provide for: (i) Sufficient accounts in both number and scope on which...
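    The "equal chance of selecting each dollar" requirement above describes monetary-unit (dollar-unit) sampling, in which an account's probability of selection is proportional to its balance. A minimal Python sketch under that reading (the account records and field names are hypothetical):

    ```python
    import random

    def dollar_unit_sample(accounts, n, seed=0):
        """Monetary-unit sampling sketch: every dollar in the population has
        an equal chance of selection, so accounts are picked with probability
        proportional to balance. Draws repeat until n distinct accounts are
        selected (n must not exceed the number of nonzero-balance accounts).
        """
        rng = random.Random(seed)
        total = sum(a["balance"] for a in accounts)
        picked = set()
        while len(picked) < n:
            dollar = rng.randrange(total)      # a random dollar, 0..total-1
            running = 0
            for a in accounts:                 # map that dollar to its account
                running += a["balance"]
                if dollar < running:
                    picked.add(a["id"])
                    break
        return picked

    # Hypothetical account balances; large accounts are selected more often.
    accounts = [{"id": i, "balance": b}
                for i, b in enumerate([5000, 300, 200, 150, 9350])]
    picked = dollar_unit_sample(accounts, 3)
    print(sorted(picked))
    ```

    Because selection probability tracks the dollars at stake, this design concentrates verification effort on the accounts that carry the most monetary risk, while still giving every account a nonzero chance of selection.
    
    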

  4. Gene-by-Preschool Interaction on the Development of Early Externalizing Problems

    ERIC Educational Resources Information Center

    Tucker-Drob, Elliot M.; Harden, K. Paige

    2013-01-01

    Background: Preschool involves an array of new social experiences that may impact the development of early externalizing behavior problems over the transition to grade school. Methods: Using longitudinal data from a nationally representative sample of over 600 pairs of US twins, we tested whether the genetic and environmental influences on…

  5. Sampling methods for bats.

    Treesearch

    D.W. Thomas; S.D. West

    1989-01-01

    Bats represent the second most diverse group of mammals inhabiting the western slopes of the Cascade Range in southern Washington and the Oregon Coast Range. Bat populations may well be sensitive to changes in forest age, structure, or distribution, but their nocturnal habits and high mobility render the study of the habitat requirements of bats problematical. Unlike...

  6. Ethics Education in Australian Preservice Teacher Programs: A Hidden Imperative?

    ERIC Educational Resources Information Center

    Boon, Helen J.; Maxwell, Bruce

    2016-01-01

    This paper provides a snapshot of the current approach to ethics education in accredited Australian pre-service teacher programs. Methods included a manual calendar search of ethics related subjects required in teacher programs using a sample of 24 Australian universities and a survey of 26 university representatives. Findings show a paucity of…

  7. Professional Development Activities and Support among Physical Education Teachers in the United States

    ERIC Educational Resources Information Center

    Cardina, Catherine E.; DeNysschen, Carol

    2018-01-01

    Purpose: This study described professional development (PD) among public school physical education (PE) teachers and compared PE teachers to teachers of other subjects. Method: Data were collected from a nationally representative sample of public school teachers in the United States. Descriptive statistics were used to describe teachers' support…

  8. Does Fall History Influence Residential Adjustments?

    ERIC Educational Resources Information Center

    Leland, Natalie; Porell, Frank; Murphy, Susan L.

    2011-01-01

    Purpose of the study: To determine whether reported falls at baseline are associated with an older adult's decision to make a residential adjustment (RA) and the type of adjustment made in the subsequent 2 years. Design and Methods: Observations (n = 25,036) were from the Health and Retirement Study, a nationally representative sample of…

  9. Physical Activity Breaks and Facilities in US Secondary Schools

    ERIC Educational Resources Information Center

    Hood, Nancy E.; Colabianchi, Natalie; Terry-McElrath, Yvonne M.; O'Malley, Patrick M.; Johnston, Lloyd D.

    2014-01-01

    Background: Research on physical activity breaks and facilities (indoor and outdoor) in secondary schools is relatively limited. Methods: School administrators and students in nationally representative samples of 8th (middle school) and 10th/12th grade (high school) students were surveyed annually from 2008-2009 to 2011-2012. School administrators…

  10. Methods for increasing cooperation rates for surveys of family forest owners

    Treesearch

    Brett J. Butler; Jaketon H. Hewes; Mary L. Tyrrell; Sarah M. Butler

    2016-01-01

    To maximize the representativeness of results from surveys, coverage, sampling, nonresponse, measurement, and analysis errors must be minimized. Although not a cure-all, one approach for mitigating nonresponse errors is to maximize cooperation rates. In this study, personalizing mailings, token financial incentives, and the use of real stamps were tested for their...

  11. Epidemiology of Attention Problems among Turkish Children and Adolescents: A National Study

    ERIC Educational Resources Information Center

    Erol, Nese; Simsek, Zeynep; Oner, Ozgur; Munir, Kerim

    2008-01-01

    Objective: To evaluate the epidemiology of attention problems using parent, teacher, and youth informants among a nationally representative Turkish sample. Method: The children and adolescents, 4 to 18 years old, were selected from a random household survey. Attention problems derived from the Child Behavior Checklist (CBCL) (N = 4,488), Teacher…

  12. Factors Influencing Openness to Future Smoking among Nonsmoking Adolescents

    ERIC Educational Resources Information Center

    Seo, Dong-Chul; Torabi, Mohammad R.; Weaver, Amy E.

    2008-01-01

    Background: To investigate the correlates of youth tobacco use in terms of nonsmoking adolescents' openness to future smoking, a secondary analysis of the 2000 and 2004 Indiana Youth Tobacco Survey (IYTS) was conducted. Methods: A representative sample of 1416 public high school students in grades 9-12 and 1516 public middle school students in…

  13. Increasing Prevalence of US Elementary School Gardens, but Disparities Reduce Opportunities for Disadvantaged Students

    ERIC Educational Resources Information Center

    Turner, Lindsey; Eliason, Meghan; Sandoval, Anna; Chaloupka, Frank J.

    2016-01-01

    Background: We examined the prevalence of school garden programs at US public elementary schools. The study examined time trends, demographic and regional disparities, and associations with related programs such as farm-to-school. Methods: Annual surveys were gathered from nationally representative samples of elementary schools between 2006-2007…

  14. Development and Validation of a Measure of Interpersonal Strengths: The Inventory of Interpersonal Strengths

    ERIC Educational Resources Information Center

    Hatcher, Robert L.; Rogers, Daniel T.

    2009-01-01

    An Inventory of Interpersonal Strengths (IIS) was developed and validated in a series of large college student samples. Based on interpersonal theory and associated methods, the IIS was designed to assess positive characteristics representing the full range of interpersonal domains, including those generally thought to have negative qualities…

  15. Trend Analysis of Bullying Victimization Prevalence in Spanish Adolescent Youth at School

    ERIC Educational Resources Information Center

    Sánchez-Queija, Inmaculada; García-Moya, Irene; Moreno, Carmen

    2017-01-01

    Background: We analyze trends in bullying victimization prevalence in a representative sample of Spanish adolescent schoolchildren in 2006, 2010, and 2014. Methods: We distinguish between reported bullying, which is assessed via the global question in the Revised Bully/Victim Questionnaire by Olweus, and observed bullying, which is a measure…

  16. Congruence of Standard Setting Methods for a Nursing Certification Examination.

    ERIC Educational Resources Information Center

    Fabrey, Lawrence J.; Raymond, Mark R.

    The American Nurses' Association certification provides professional recognition beyond licensure to nurses who pass an examination. To determine the passing score as it would be set by a representative peer group, a survey was mailed to a random sample of 200 recently certified nurses. Three questions were asked: (1) what percentage of examinees…

  17. Carbon fiber counting. [aircraft structures

    NASA Technical Reports Server (NTRS)

    Pride, R. A.

    1980-01-01

    A method was developed for characterizing the number and lengths of carbon fibers accidentally released by the burning of composite portions of civil aircraft structure in a jet fuel fire after an accident. Representative samplings of carbon fibers collected on transparent sticky film were counted from photographic enlargements with a computer aided technique which also provided fiber lengths.

  18. Detection of microbial concentration in ice-cream using the impedance technique.

    PubMed

    Grossi, M; Lanzoni, M; Pompei, A; Lazzarini, R; Matteuzzi, D; Riccò, B

    2008-06-15

    The detection of microbial concentration, essential for safe and high-quality food products, is traditionally performed with the plate count technique, which is reliable but slow and not easily implemented in automatic form, as required for direct use in industrial machines. For this purpose, the method based on impedance measurements represents an attractive alternative, since it can produce results in about 10 h, instead of the 24-48 h needed by standard plate counts, and can easily be implemented in automatic form. In this paper, such a method has been experimentally studied in the case of ice-cream products. In particular, all main ice-cream compositions of real interest have been considered, and no nutrient medium was used to dilute the samples. A measurement setup was realized using benchtop instruments for impedance measurements on samples whose bacterial concentration was independently measured by means of standard plate counts. The obtained results clearly indicate that impedance measurement is a feasible and reliable technique for detecting total microbial concentration in ice cream, suitable for implementation as an embedded system in industrial machines.

  19. Windowed cross-correlation and peak picking for the analysis of variability in the association between behavioral time series.

    PubMed

    Boker, Steven M; Xu, Minquan; Rotondo, Jennifer L; King, Kadijah

    2002-09-01

    Cross-correlation and most other longitudinal analyses assume that the association between 2 variables is stationary. Thus, a sample of occasions of measurement is expected to be representative of the association between variables regardless of the time of onset or number of occasions in the sample. The authors propose a method to analyze the association between 2 variables when the assumption of stationarity may not be warranted. The method results in estimates of both the strength of peak association and the time lag when the peak association occurred for a range of starting values of elapsed time from the beginning of an experiment.
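    The windowed approach described above can be sketched briefly: correlate a short window of one series against lagged windows of the other, then pick the lag with the strongest correlation in each window. A minimal Python illustration (toy deterministic series, nonnegative lags only; a sketch of the idea, not the authors' implementation):

    ```python
    from math import sqrt

    def pearson(x, y):
        """Pearson correlation of two equal-length sequences."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sx = sqrt(sum((v - mx) ** 2 for v in x))
        sy = sqrt(sum((v - my) ** 2 for v in y))
        if sx == 0 or sy == 0:
            return 0.0
        return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

    def windowed_xcorr_peaks(x, y, win, max_lag, step=1):
        """For each window start, correlate x[t:t+win] against y shifted by
        each lag 0..max_lag, and record the lag with the strongest
        correlation (peak picking). Returns (window_start, peak_lag, peak_r)
        tuples, so the lead/lag structure can vary over time."""
        peaks = []
        for start in range(0, len(x) - win - max_lag + 1, step):
            best = max(
                ((lag, pearson(x[start:start + win],
                               y[start + lag:start + lag + win]))
                 for lag in range(max_lag + 1)),
                key=lambda p: p[1],
            )
            peaks.append((start, best[0], best[1]))
        return peaks

    # Toy example: y is x delayed by 3 samples.
    x = [((t * 7) % 13) - 6 for t in range(60)]   # deterministic wiggly series
    y = [0, 0, 0] + x[:-3]
    for start, lag, r in windowed_xcorr_peaks(x, y, win=20, max_lag=5, step=10):
        print(start, lag, round(r, 2))
    ```

    Each window's peak recovers the built-in 3-sample delay; in real behavioral data the peak lag and strength would be free to drift from window to window, which is exactly the nonstationarity the method is designed to expose.
    
    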

  20. Plant protection product residues in white grapes and wines of "Malvasia Istriana" produced in Istria.

    PubMed

    Baša Česnik, Helena; Velikonja Bolta, Špela; Bavčar, Dejan; Radeka, Sanja; Lisjak, Klemen

    2016-12-01

    Monitoring of plant protection product residues was performed in 12 grape and 66 wine samples of the "Malvasia Istriana" variety, produced in the Istria winegrowing region of Croatia and Slovenia. The samples were analysed for the presence of 169 different active compounds using two multiresidual analytical methods: gas chromatography coupled with mass spectrometry and liquid chromatography coupled with tandem mass spectrometry. Residues were found in 58.3% of all the inspected grape samples and in 28.8% of all the inspected wine samples. Moreover, residue contents in grapes were below 10% of the maximum residue level values and should not represent any risk for "Malvasia Istriana" grape or wine consumers.
