Sample records for sample selection sample

  1. 40 CFR 761.247 - Sample site selection for pipe segment removal.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Natural Gas Pipeline: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe Samples § 761.247 Sample site selection for pipe segment removal. (a) General. (1) Select the pipe... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Sample site selection for pipe segment...

  2. 40 CFR 761.247 - Sample site selection for pipe segment removal.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 31 2011-07-01 2011-07-01 false Sample site selection for pipe segment... Natural Gas Pipeline: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe Samples § 761.247 Sample site selection for pipe segment removal. (a) General. (1) Select the pipe...

  3. 40 CFR 761.250 - Sample site selection for pipeline section abandonment.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Sample site selection for pipeline... Disposal of Natural Gas Pipeline: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe Samples § 761.250 Sample site selection for pipeline section abandonment. This procedure...

  4. 40 CFR 761.250 - Sample site selection for pipeline section abandonment.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 31 2011-07-01 2011-07-01 false Sample site selection for pipeline... Disposal of Natural Gas Pipeline: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe Samples § 761.250 Sample site selection for pipeline section abandonment. This procedure...

  5. 40 CFR 90.507 - Sample selection.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Sample selection. 90.507 Section 90... Auditing § 90.507 Sample selection. (a) Engines comprising a test sample will be selected at the location... cannot be selected in the manner specified in the test order, an alternative selection procedure may be...

  6. 40 CFR 89.507 - Sample selection.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Sample selection. 89.507 Section 89... Auditing § 89.507 Sample selection. (a) Engines comprising a test sample will be selected at the location... cannot be selected in the manner specified in the test order, an alternative selection procedure may be...

  7. 40 CFR 91.606 - Sample selection.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Sample selection. 91.606 Section 91....606 Sample selection. (a) Engines comprising a test sample will be selected at the location and in the... in the manner specified in the test order, an alternative selection procedure may be employed...

  8. Does self-selection affect samples' representativeness in online surveys? An investigation in online video game research.

    PubMed

    Khazaal, Yasser; van Singer, Mathias; Chatton, Anne; Achab, Sophia; Zullino, Daniele; Rothen, Stephane; Khan, Riaz; Billieux, Joel; Thorens, Gabriel

    2014-07-07

    The number of medical studies performed through online surveys has increased dramatically in recent years. Despite their numerous advantages (eg, sample size, facilitated access to individuals presenting stigmatizing issues), selection bias may exist in online surveys. However, evidence on the representativeness of self-selected samples in online studies is patchy. Our objective was to explore the representativeness of a self-selected sample of online gamers using online players' virtual characters (avatars). All avatars belonged to individuals playing World of Warcraft (WoW), currently the most widely used online game. Avatars' characteristics were defined using various games' scores, reported on the WoW's official website, and two self-selected samples from previous studies were compared with a randomly selected sample of avatars. We used scores linked to 1240 avatars (762 from the self-selected samples and 478 from the random sample). The two self-selected samples of avatars had higher scores on most of the assessed variables (except for guild membership and exploration). Furthermore, some guilds were overrepresented in the self-selected samples. Our results suggest that more proficient players or players more involved in the game may be more likely to participate in online surveys. Caution is needed in the interpretation of studies based on online surveys that used a self-selection recruitment procedure. Epidemiological evidence on the reduced representativeness of samples from online surveys is warranted.

  9. 40 CFR 761.353 - Second level of sample selection.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Second level of sample selection. 761...-Site Disposal, in Accordance With § 761.61 § 761.353 Second level of sample selection. The second level of sample selection reduces the size of the 19-liter subsample that was collected according to...

  10. Does Self-Selection Affect Samples’ Representativeness in Online Surveys? An Investigation in Online Video Game Research

    PubMed Central

    van Singer, Mathias; Chatton, Anne; Achab, Sophia; Zullino, Daniele; Rothen, Stephane; Khan, Riaz; Billieux, Joel; Thorens, Gabriel

    2014-01-01

    Background The number of medical studies performed through online surveys has increased dramatically in recent years. Despite their numerous advantages (eg, sample size, facilitated access to individuals presenting stigmatizing issues), selection bias may exist in online surveys. However, evidence on the representativeness of self-selected samples in online studies is patchy. Objective Our objective was to explore the representativeness of a self-selected sample of online gamers using online players’ virtual characters (avatars). Methods All avatars belonged to individuals playing World of Warcraft (WoW), currently the most widely used online game. Avatars’ characteristics were defined using various games’ scores, reported on the WoW’s official website, and two self-selected samples from previous studies were compared with a randomly selected sample of avatars. Results We used scores linked to 1240 avatars (762 from the self-selected samples and 478 from the random sample). The two self-selected samples of avatars had higher scores on most of the assessed variables (except for guild membership and exploration). Furthermore, some guilds were overrepresented in the self-selected samples. Conclusions Our results suggest that more proficient players or players more involved in the game may be more likely to participate in online surveys. Caution is needed in the interpretation of studies based on online surveys that used a self-selection recruitment procedure. Epidemiological evidence on the reduced representativeness of samples from online surveys is warranted. PMID:25001007

  11. Sample selection in foreign similarity regions for multicrop experiments

    NASA Technical Reports Server (NTRS)

    Malin, J. T. (Principal Investigator)

    1981-01-01

    The selection of sample segments in the U.S. foreign similarity regions for development of proportion estimation procedures and error modeling for Argentina, Australia, Brazil, and USSR in AgRISTARS is described. Each sample was chosen to be similar in crop mix to the corresponding indicator region sample. Data sets, methods of selection, and resulting samples are discussed.

  12. A novel heterogeneous training sample selection method on space-time adaptive processing

    NASA Astrophysics Data System (ADS)

    Wang, Qiang; Zhang, Yongshun; Guo, Yiduo

    2018-04-01

    The performance of ground target detection with space-time adaptive processing (STAP) degrades when clutter power becomes non-homogeneous because training samples are contaminated by target-like signals. To solve this problem, a novel non-homogeneous training sample selection method based on sample similarity is proposed, which converts training sample selection into a convex optimization problem. First, the deficiencies of sample selection using the generalized inner product (GIP) are analyzed. Second, the similarities of different training samples are obtained by calculating the mean-Hausdorff distance, so as to reject contaminated training samples. Third, the cell under test (CUT) and the remaining training samples are projected into the orthogonal subspace of the target in the CUT, and mean-Hausdorff distances between the projected CUT and training samples are calculated. Fourth, the distances are sorted by value, and the training samples with larger values are preferentially selected, realizing the dimension reduction. Finally, simulation results with the Mountain-Top data verify the effectiveness of the proposed method.
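    The selection loop described in this abstract can be sketched, under heavy simplification, as a similarity ranking. In the sketch below the mean-Hausdorff distance is computed on magnitude point sets, the target steering vector is assumed known, and every function and variable name is illustrative rather than taken from the paper.

```python
import numpy as np

def mean_hausdorff(a, b):
    # Mean-Hausdorff distance between two 1-D point sets a and b:
    # average nearest-neighbour distance, symmetrized.
    d = np.abs(a[:, None] - b[None, :])
    return 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())

def select_training_samples(cut, samples, target, k):
    """Rank training samples by mean-Hausdorff distance to the CUT
    after projecting both into the orthogonal subspace of the target."""
    t = target / np.linalg.norm(target)
    proj = np.eye(len(t)) - np.outer(t, t.conj())   # orthogonal-subspace projector
    cut_p = proj @ cut
    dists = [mean_hausdorff(np.abs(cut_p), np.abs(proj @ s)) for s in samples]
    # per the abstract, samples with the larger distance values are preferred
    order = np.argsort(dists)[::-1]
    return order[:k]
```

A sample identical to the CUT gets distance zero and is ranked last, so strongly target-like snapshots drop out of the selected set.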

  13. 40 CFR 761.283 - Determination of the number of samples to collect and sample collection locations.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... sampling points after the recleaning, but select three new pairs of sampling coordinates. (i) Beginning in the southwest corner (lower left when facing magnetic north) of the area to be sampled, measure in... new pair of sampling coordinates. Continue to select pairs of sampling coordinates until three are...

  14. Using maximum entropy modeling for optimal selection of sampling sites for monitoring networks

    USGS Publications Warehouse

    Stohlgren, Thomas J.; Kumar, Sunil; Barnett, David T.; Evangelista, Paul H.

    2011-01-01

    Environmental monitoring programs must efficiently describe state shifts. We propose using maximum entropy modeling to select dissimilar sampling sites to capture environmental variability at low cost, and demonstrate a specific application: sample site selection for the Central Plains domain (453,490 km2) of the National Ecological Observatory Network (NEON). We relied on four environmental factors: mean annual temperature and precipitation, elevation, and vegetation type. A “sample site” was defined as a 20 km × 20 km area (equal to NEON’s airborne observation platform [AOP] footprint), within which each 1 km2 cell was evaluated for each environmental factor. After each model run, the most environmentally dissimilar site was selected from all potential sample sites. The iterative selection of eight sites captured approximately 80% of the environmental envelope of the domain, an improvement over stratified random sampling and simple random designs for sample site selection. This approach can be widely used for cost-efficient selection of survey and monitoring sites.
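    A greatly simplified, greedy stand-in for the iterative "most dissimilar site" step above might look like the following: plain maximin distance on standardized environmental factors, not an actual maximum entropy model, with all names chosen for illustration.

```python
import numpy as np

def select_dissimilar_sites(env, n_sites):
    """Greedily pick sites that maximize the minimum distance to the
    already-selected sites in standardized environmental space.
    `env` is (n_candidates, n_factors), e.g. temperature, precipitation,
    elevation, vegetation code."""
    z = (env - env.mean(axis=0)) / env.std(axis=0)     # standardize each factor
    # seed with the site closest to the overall environmental centroid
    chosen = [int(np.argmin(np.linalg.norm(z, axis=1)))]
    while len(chosen) < n_sites:
        d = np.linalg.norm(z[:, None, :] - z[chosen][None, :, :], axis=2)
        min_d = d.min(axis=1)          # distance to nearest selected site
        min_d[chosen] = -1.0           # never re-pick a selected site
        chosen.append(int(np.argmax(min_d)))  # most dissimilar candidate
    return chosen
```

Each iteration adds the candidate least like anything already chosen, mirroring the paper's goal of covering the environmental envelope with few sites.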

  15. 40 CFR 205.57-2 - Test vehicle sample selection.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 25 2011-07-01 2011-07-01 false Test vehicle sample selection. 205.57... vehicle sample selection. (a) Vehicles comprising the batch sample which are required to be tested... test request from a batch of vehicles of the category or configuration specified in the test request...

  16. An active learning representative subset selection method using net analyte signal.

    PubMed

    He, Zhonghai; Ma, Zhenhe; Luan, Jingmin; Cai, Xi

    2018-05-05

    To guarantee accurate predictions, representative samples are needed when building a calibration model for spectroscopic measurements. However, in general, it is not known whether a sample is representative prior to measuring its concentration, which is both time-consuming and expensive. In this paper, a method to determine whether a sample should be selected into a calibration set is presented. The selection is based on the difference of Euclidean norm of net analyte signal (NAS) vector between the candidate and existing samples. First, the concentrations and spectra of a group of samples are used to compute the projection matrix, NAS vector, and scalar values. Next, the NAS vectors of candidate samples are computed by multiplying projection matrix with spectra of samples. Scalar value of NAS is obtained by norm computation. The distance between the candidate set and the selected set is computed, and samples with the largest distance are added to selected set sequentially. Last, the concentration of the analyte is measured such that the sample can be used as a calibration sample. Using a validation test, it is shown that the presented method is more efficient than random selection. As a result, the amount of time and money spent on reference measurements is greatly reduced.

  17. An active learning representative subset selection method using net analyte signal

    NASA Astrophysics Data System (ADS)

    He, Zhonghai; Ma, Zhenhe; Luan, Jingmin; Cai, Xi

    2018-05-01

    To guarantee accurate predictions, representative samples are needed when building a calibration model for spectroscopic measurements. However, in general, it is not known whether a sample is representative prior to measuring its concentration, which is both time-consuming and expensive. In this paper, a method to determine whether a sample should be selected into a calibration set is presented. The selection is based on the difference of Euclidean norm of net analyte signal (NAS) vector between the candidate and existing samples. First, the concentrations and spectra of a group of samples are used to compute the projection matrix, NAS vector, and scalar values. Next, the NAS vectors of candidate samples are computed by multiplying projection matrix with spectra of samples. Scalar value of NAS is obtained by norm computation. The distance between the candidate set and the selected set is computed, and samples with the largest distance are added to selected set sequentially. Last, the concentration of the analyte is measured such that the sample can be used as a calibration sample. Using a validation test, it is shown that the presented method is more efficient than random selection. As a result, the amount of time and money spent on reference measurements is greatly reduced.
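    The selection loop described in the two records above can be sketched as sequential farthest-point selection on NAS norms. The projection construction and all names below are an illustrative simplification, not the authors' code.

```python
import numpy as np

def nas_projection(interferents):
    """Projector onto the orthogonal complement of the interferent spectra,
    so only the net analyte signal (NAS) of a spectrum survives."""
    q, _ = np.linalg.qr(interferents.T)        # orthonormal basis of interferent space
    return np.eye(interferents.shape[1]) - q @ q.T

def select_calibration_set(spectra, interferents, n_select):
    """Sequentially add the candidate whose scalar NAS is farthest from
    the NAS values already in the selected set."""
    P = nas_projection(interferents)
    nas = np.linalg.norm(spectra @ P.T, axis=1)   # scalar NAS per candidate
    selected = [int(np.argmax(nas))]              # seed with the largest NAS
    while len(selected) < n_select:
        dist = np.min(np.abs(nas[:, None] - nas[selected][None, :]), axis=1)
        dist[selected] = -1.0
        selected.append(int(np.argmax(dist)))
    return selected
```

Only the samples chosen this way would then be sent for reference concentration measurement, which is where the claimed time and cost savings come from.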

  18. National accident sampling system sample design, phases 2 and 3 : executive summary

    DOT National Transportation Integrated Search

    1979-11-01

    This report describes the Phase 2 and 3 sample design for the National Accident Sampling System (NASS). It recommends a procedure for the first-stage selection of Primary Sampling Units (PSU's) and the second-stage design for the selection of a...

  19. 40 CFR 205.171-2 - Test exhaust system sample selection and preparation.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 26 2013-07-01 2013-07-01 false Test exhaust system sample selection... Systems § 205.171-2 Test exhaust system sample selection and preparation. (a)(1) Exhaust systems comprising the sample which are required to be tested under a test request in accordance with this subpart...

  20. 40 CFR 86.607-84 - Sample selection.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Sample selection. 86.607-84 Section 86... selection. (a) Vehicles comprising a test sample which are required to be tested, pursuant to a test order... specified in the test order, an alternative selection procedure may be employed: Provided, That the...

  1. Protein crystallography prescreen kit

    DOEpatents

    Segelke, Brent W.; Krupka, Heike I.; Rupp, Bernhard

    2007-10-02

    A kit for prescreening protein concentration for crystallization includes a multiplicity of vials, a multiplicity of pre-selected reagents, and a multiplicity of sample plates. The reagents and a corresponding multiplicity of samples of the protein in solutions of varying concentrations are placed on sample plates. The sample plates containing the reagents and samples are incubated. After incubation the sample plates are examined to determine which of the sample concentrations are too low and which are too high. The sample concentrations that are optimal for protein crystallization are selected and used.

  2. Protein crystallography prescreen kit

    DOEpatents

    Segelke, Brent W.; Krupka, Heike I.; Rupp, Bernhard

    2005-07-12

    A kit for prescreening protein concentration for crystallization includes a multiplicity of vials, a multiplicity of pre-selected reagents, and a multiplicity of sample plates. The reagents and a corresponding multiplicity of samples of the protein in solutions of varying concentrations are placed on sample plates. The sample plates containing the reagents and samples are incubated. After incubation the sample plates are examined to determine which of the sample concentrations are too low and which are too high. The sample concentrations that are optimal for protein crystallization are selected and used.

  3. Estimation of Variance in the Case of Complex Samples.

    ERIC Educational Resources Information Center

    Groenewald, A. C.; Stoker, D. J.

    In a complex sampling scheme it is desirable to select the primary sampling units (PSUs) without replacement to prevent duplications in the sample. Since the estimation of the sampling variances is more complicated when the PSUs are selected without replacement, L. Kish (1965) recommends that the variance be calculated using the formulas…

  4. Biological sample collector

    DOEpatents

    Murphy, Gloria A [French Camp, CA]

    2010-09-07

    A biological sample collector is adapted to collect several biological samples in a plurality of filter wells. A biological sample collector may comprise a manifold plate for mounting a filter plate thereon, the filter plate having a plurality of filter wells therein; a hollow slider for engaging and positioning a tube that slides therethrough; and a slide case within which the hollow slider travels to allow the tube to be aligned with a selected filter well of the plurality of filter wells, wherein when the tube is aligned with the selected filter well, the tube is pushed through the hollow slider and into the selected filter well to sealingly engage the selected filter well and to allow the tube to deposit a biological sample onto a filter in the bottom of the selected filter well. The biological sample collector may be portable.

  5. A sampling design framework for monitoring secretive marshbirds

    USGS Publications Warehouse

    Johnson, D.H.; Gibbs, J.P.; Herzog, M.; Lor, S.; Niemuth, N.D.; Ribic, C.A.; Seamans, M.; Shaffer, T.L.; Shriver, W.G.; Stehman, S.V.; Thompson, W.L.

    2009-01-01

    A framework for a sampling plan for monitoring marshbird populations in the contiguous 48 states is proposed here. The sampling universe is the breeding habitat (i.e. wetlands) potentially used by marshbirds. Selection protocols would be implemented within each of several large geographical strata, such as Bird Conservation Regions. Site selection will be done using a two-stage cluster sample. Primary sampling units (PSUs) would be land areas, such as legal townships, and would be selected by a procedure such as systematic sampling. Secondary sampling units (SSUs) will be wetlands or portions of wetlands in the PSUs. SSUs will be selected by a randomized spatially balanced procedure. For analysis, the use of a variety of methods is encouraged as a means of increasing confidence in the conclusions that may be reached. Additional effort will be required to work out details and implement the plan.
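    The proposed two-stage design (systematic PSU selection, then randomized SSU selection within each PSU) can be sketched as follows. The spatially balanced SSU step is simplified here to a plain random draw, and all names are illustrative.

```python
import random

def two_stage_cluster_sample(psus, n_psu, n_ssu, seed=0):
    """Stage 1: systematic sample of PSUs (e.g. legal townships).
    Stage 2: simple random sample of SSUs (e.g. wetlands) within each PSU.
    `psus` maps PSU name -> list of SSU names."""
    rng = random.Random(seed)
    names = sorted(psus)
    step = len(names) / n_psu           # systematic sampling interval
    start = rng.uniform(0, step)        # random start within the first interval
    picked_psus = [names[int(start + i * step)] for i in range(n_psu)]
    return {p: rng.sample(psus[p], min(n_ssu, len(psus[p]))) for p in picked_psus}
```

The systematic first stage spreads PSUs evenly across the frame, which is a common proxy for geographic spread when the frame is ordered spatially.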

  6. X-ray versus infrared selection of distant galaxy clusters: A case study using the XMM-LSS and SpARCS cluster samples

    NASA Astrophysics Data System (ADS)

    Willis, J. P.; Ramos-Ceja, M. E.; Muzzin, A.; Pacaud, F.; Yee, H. K. C.; Wilson, G.

    2018-04-01

    We present a comparison of two samples of z > 0.8 galaxy clusters selected using different wavelength-dependent techniques and examine the physical differences between them. We consider 18 clusters from the X-ray selected XMM-LSS distant cluster survey and 92 clusters from the optical-MIR selected SpARCS cluster survey. Both samples are selected from the same approximately 9 square degree sky area and we examine them using common XMM-Newton, Spitzer-SWIRE and CFHT Legacy Survey data. Clusters from each sample are compared employing aperture measures of X-ray and MIR emission. We divide the SpARCS distant cluster sample into three sub-samples: a) X-ray bright, b) X-ray faint, MIR bright, and c) X-ray faint, MIR faint clusters. We determine that X-ray and MIR selected clusters display very similar surface brightness distributions of galaxy MIR light. In addition, the average location and amplitude of the galaxy red sequence as measured from stacked colour histograms is very similar in the X-ray and MIR-selected samples. The sub-sample of X-ray faint, MIR bright clusters displays a distribution of BCG-barycentre position offsets which extends to higher values than all other samples. This observation indicates that such clusters may exist in a more disturbed state compared to the majority of the distant cluster population sampled by XMM-LSS and SpARCS. This conclusion is supported by stacked X-ray images for the X-ray faint, MIR bright cluster sub-sample that display weak, centrally-concentrated X-ray emission, consistent with a population of growing clusters accreting from an extended envelope of material.

  7. DATA QUALITY OBJECTIVES FOR SELECTING WASTE SAMPLES FOR BENCH-SCALE REFORMER TREATABILITY STUDIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BANNING DL

    2011-02-11

    This document describes the data quality objectives to select archived samples located at the 222-S Laboratory for Bench-Scale Reforming testing. The type, quantity, and quality of the data required to select the samples for Fluid Bed Steam Reformer testing are discussed. In order to maximize the efficiency and minimize the time to treat Hanford tank waste in the Waste Treatment and Immobilization Plant, additional treatment processes may be required. One of the potential treatment processes is the fluidized bed steam reformer. A determination of the adequacy of the fluidized bed steam reformer process to treat Hanford tank waste is required. The initial step in determining the adequacy of the fluidized bed steam reformer process is to select archived waste samples from the 222-S Laboratory that will be used in bench-scale tests. Analyses of the selected samples will be required to confirm the samples meet the shipping requirements and for comparison to the bench scale reformer (BSR) test sample selection requirements.

  8. Strong Selection at MHC in Mexicans since Admixture

    PubMed Central

    Zhou, Quan; Zhao, Liang; Guan, Yongtao

    2016-01-01

    Mexicans are a recent admixture of Amerindians, Europeans, and Africans. We performed local ancestry analysis of Mexican samples from two genome-wide association studies obtained from dbGaP, and discovered that at the MHC region Mexicans have excessive African ancestral alleles compared to the rest of the genome, which is the hallmark of recent selection for admixed samples. The estimated selection coefficients are 0.05 and 0.07 for two datasets, which put our finding among the strongest known selections observed in humans, namely, lactase selection in northern Europeans and sickle-cell trait in Africans. Using inaccurate Amerindian training samples was a major concern for the credibility of previously reported selection signals in Latinos. Taking advantage of the flexibility of our statistical model, we devised a model fitting technique that can learn Amerindian ancestral haplotype from the admixed samples, which allows us to infer local ancestries for Mexicans using only European and African training samples. The strong selection signal at the MHC remains without Amerindian training samples. Finally, we note that medical history studies suggest such a strong selection at MHC is plausible in Mexicans. PMID:26863142

  9. Robust model selection and the statistical classification of languages

    NASA Astrophysics Data System (ADS)

    García, J. E.; González-López, V. A.; Viola, M. L. L.

    2012-10-01

    In this paper we address the problem of model selection for the set of finite-memory stochastic processes with finite alphabet, when the data is contaminated. We consider m independent samples, with more than half of them being realizations of the same stochastic process with law Q, which is the one we want to retrieve. We devise a model selection procedure such that, for a sample size large enough, the selected process is the one with law Q. Our model selection strategy is based on estimating relative entropies to select a subset of samples that are realizations of the same law. Although the procedure is valid for any family of finite-order Markov models, we focus on the family of variable-length Markov chain models, which includes the fixed-order Markov chain model family. We define the asymptotic breakdown point (ABDP) for a model selection procedure, and we show the ABDP for our procedure. This means that if the proportion of contaminated samples is smaller than the ABDP, then, as the sample size grows, our procedure selects a model for the process with law Q. We also use our procedure in a setting where we have one sample formed by the concatenation of subsamples of two or more stochastic processes, with most of the subsamples having law Q, and we conducted a simulation study. In the application section we address the question of the statistical classification of languages according to their rhythmic features using speech samples. This is an important open problem in phonology. A persistent difficulty is that the speech samples correspond to several sentences produced by diverse speakers, corresponding to a mixture of distributions. The usual procedure has been to choose, by listening, a subset of the original sample which seems to best represent each language. In our application we use the full dataset without any preselection of samples. We apply our robust methodology, estimating a model which represents the main law for each language. Our findings agree with the linguistic conjecture related to the rhythm of the languages included in our dataset.
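    A zeroth-order sketch of the relative-entropy selection idea follows, using i.i.d. symbol distributions in place of the paper's variable-length Markov models; the smoothing scheme and all names are assumptions made for illustration.

```python
import math
from collections import Counter

def empirical_dist(sample, alphabet):
    # Add-one smoothing keeps the KL divergence finite for symbols
    # absent from a given sample.
    c = Counter(sample)
    total = len(sample) + len(alphabet)
    return {a: (c[a] + 1) / total for a in alphabet}

def kl(p, q):
    return sum(p[a] * math.log(p[a] / q[a]) for a in p)

def select_majority_samples(samples, alphabet):
    """Keep the majority subset of samples whose empirical distributions
    are closest to each other; contaminated samples accumulate large
    total divergence to the rest and are discarded."""
    dists = [empirical_dist(s, alphabet) for s in samples]
    m = len(samples)
    scores = [sum(kl(dists[i], dists[j]) + kl(dists[j], dists[i])
                  for j in range(m) if j != i) for i in range(m)]
    keep = m // 2 + 1                       # majority, matching the >half assumption
    order = sorted(range(m), key=lambda i: scores[i])
    return sorted(order[:keep])
```

As long as more than half the samples share the law Q, the shared-law samples score low and survive, which is the breakdown-point intuition in the abstract.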

  10. Approaches to sampling and case selection in qualitative research: examples in the geography of health.

    PubMed

    Curtis, S; Gesler, W; Smith, G; Washburn, S

    2000-04-01

    This paper focuses on the question of sampling (or selection of cases) in qualitative research. Although the literature includes some very useful discussions of qualitative sampling strategies, the question of sampling often seems to receive less attention in methodological discussion than questions of how data is collected or is analysed. Decisions about sampling are likely to be important in many qualitative studies (although it may not be an issue in some research). There are varying accounts of the principles applicable to sampling or case selection. Those who espouse 'theoretical sampling', based on a 'grounded theory' approach, are in some ways opposed to those who promote forms of 'purposive sampling' suitable for research informed by an existing body of social theory. Diversity also results from the many different methods for drawing purposive samples which are applicable to qualitative research. We explore the value of a framework suggested by Miles and Huberman [Miles, M., Huberman, A., 1994. Qualitative Data Analysis, Sage, London.], to evaluate the sampling strategies employed in three examples of research by the authors. Our examples comprise three studies which respectively involve selection of: 'healing places'; rural places which incorporated national anti-malarial policies; young male interviewees, identified as either chronically ill or disabled. The examples are used to show how in these three studies the (sometimes conflicting) requirements of the different criteria were resolved, as well as the potential and constraints placed on the research by the selection decisions which were made. We also consider how far the criteria Miles and Huberman suggest seem helpful for planning 'sample' selection in qualitative research.

  11. Analysing home-ownership of couples: the effect of selecting couples at the time of the survey.

    PubMed

    Mulder, C H

    1996-09-01

    "The analysis of events encountered by couple and family households may suffer from sample selection bias when data are restricted to couples existing at the moment of interview. The paper discusses the effect of sample selection bias on event history analyses of buying a home [in the Netherlands] by comparing analyses performed on a sample of existing couples with analyses of a more complete sample including past as well as current partner relationships. The results show that, although home-buying in relationships that have ended differs clearly from behaviour in existing relationships, sample selection bias is not alarmingly large."

  12. DATA QUALITY OBJECTIVES FOR SELECTING WASTE SAMPLES FOR THE BENCH STEAM REFORMER TEST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BANNING DL

    2010-08-03

    This document describes the data quality objectives to select archived samples located at the 222-S Laboratory for Fluid Bed Steam Reformer testing. The type, quantity and quality of the data required to select the samples for Fluid Bed Steam Reformer testing are discussed. In order to maximize the efficiency and minimize the time to treat Hanford tank waste in the Waste Treatment and Immobilization Plant, additional treatment processes may be required. One of the potential treatment processes is the fluid bed steam reformer (FBSR). A determination of the adequacy of the FBSR process to treat Hanford tank waste is required. The initial step in determining the adequacy of the FBSR process is to select archived waste samples from the 222-S Laboratory that will be used to test the FBSR process. Analyses of the selected samples will be required to confirm the samples meet the testing criteria.

  13. A Comparison of EPI Sampling, Probability Sampling, and Compact Segment Sampling Methods for Micro and Small Enterprises

    PubMed Central

    Chao, Li-Wei; Szrek, Helena; Peltzer, Karl; Ramlagan, Shandir; Fleming, Peter; Leite, Rui; Magerman, Jesswill; Ngwenya, Godfrey B.; Pereira, Nuno Sousa; Behrman, Jere

    2011-01-01

    Finding an efficient method for sampling micro- and small-enterprises (MSEs) for research and statistical reporting purposes is a challenge in developing countries, where registries of MSEs are often nonexistent or outdated. This lack of a sampling frame creates an obstacle in finding a representative sample of MSEs. This study uses computer simulations to draw samples from a census of businesses and non-businesses in the Tshwane Municipality of South Africa, using three different sampling methods: the traditional probability sampling method, the compact segment sampling method, and the World Health Organization’s Expanded Programme on Immunization (EPI) sampling method. Three mechanisms by which the methods could differ are tested, the proximity selection of respondents, the at-home selection of respondents, and the use of inaccurate probability weights. The results highlight the importance of revisits and accurate probability weights, but the lesser effect of proximity selection on the samples’ statistical properties. PMID:22582004
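    The intuition behind the proximity-selection mechanism tested above can be illustrated with a toy simulation: contiguous-index selection stands in for EPI-style geographic proximity, and the data and names are assumptions made for illustration, not the study's actual simulation.

```python
import random

def srs_mean(values, n, rng):
    # Simple random sample without replacement, then the sample mean.
    return sum(rng.sample(values, n)) / n

def epi_like_mean(values, n, rng):
    """EPI-style proximity selection: start at a random unit and take the
    next n-1 'nearest' units (here: adjacent list indices, standing in
    for geographic neighbours)."""
    start = rng.randrange(len(values) - n + 1)
    chunk = values[start:start + n]
    return sum(chunk) / n
```

With spatially clustered outcomes (neighbours resembling each other), the proximity estimator's replicates swing between cluster means, so its variance exceeds that of simple random sampling even though both are roughly unbiased.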

  14. 40 CFR 761.240 - Scope and definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe Samples § 761.240 Scope and definitions. (a) Use these procedures to select surface sampling sites for natural gas pipe to determine its PCB surface concentration for abandonment-in-place or removal and disposal off-site in...

  15. 40 CFR 761.240 - Scope and definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe Samples § 761.240 Scope and definitions. (a) Use these procedures to select surface sampling sites for natural gas pipe to determine its PCB surface concentration for abandonment-in-place or removal and disposal off-site in...

  16. 40 CFR 761.240 - Scope and definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe Samples § 761.240 Scope and definitions. (a) Use these procedures to select surface sampling sites for natural gas pipe to determine its PCB surface concentration for abandonment-in-place or removal and disposal off-site in...

  17. 40 CFR 761.240 - Scope and definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe Samples § 761.240 Scope and definitions. (a) Use these procedures to select surface sampling sites for natural gas pipe to determine its PCB surface concentration for abandonment-in-place or removal and disposal off-site in...

  18. CTEPP STANDARD OPERATING PROCEDURE FOR SAMPLE SELECTION (SOP-1.10)

    EPA Science Inventory

    The procedures for selecting CTEPP study subjects are described in the SOP. The primary, county-level stratification is by region and urbanicity. Six sample counties in each of the two states (North Carolina and Ohio) are selected using stratified random sampling and reflect ...

  19. The K-selected Butcher-Oemler Effect

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stanford, S A; De Propris, R; Dickinson, M

    2004-03-02

    We investigate the Butcher-Oemler effect using samples of galaxies brighter than observed frame K* + 1.5 in 33 clusters at 0.1 ≲ z ≲ 0.9. We attempt to duplicate as closely as possible the methodology of Butcher & Oemler. Apart from selecting in the K-band, the most important difference is that we use a brightness limit fixed at 1.5 magnitudes below an observed frame K* rather than the nominal limit of rest frame M(V) = -20 used by Butcher & Oemler. For an early-type galaxy at z = 0.1 our sample cutoff is 0.2 magnitudes brighter than rest frame M(V) = -20, while at z = 0.9 our cutoff is 0.9 magnitudes brighter. If the blue galaxies tend to be faint, then the difference in magnitude limits should result in our measuring lower blue fractions. A more minor difference from the Butcher & Oemler methodology is that the area covered by our galaxy samples has a radius of 0.5 or 0.7 Mpc at all redshifts rather than R_30, the radius containing 30% of the cluster population. In practice our field sizes are generally similar to those used by Butcher & Oemler. We find the fraction of blue galaxies in our K-selected samples to be lower on average than that derived from several optically selected samples, and that it shows little trend with redshift. However, at the redshifts z < 0.6 where our sample overlaps with that of Butcher & Oemler, the difference in f_B as determined from our K-selected samples and those of Butcher & Oemler is much reduced. The large scatter in the measured f_B, even in small redshift ranges, in our study indicates that determining f_B for a much larger sample of clusters from K-selected galaxy samples is important. As a test of our methods, our data allow us to construct optically selected samples down to rest frame M(V) = -20, as used by Butcher & Oemler, for four clusters that are common between our sample and that of Butcher & Oemler.
    For these rest-V-selected samples, we find similar fractions of blue galaxies to Butcher & Oemler, while the K-selected samples for the same four clusters yield blue fractions that are typically half as large. This comparison indicates that selecting in the K-band is the primary difference between our study and previous optically based studies of the Butcher & Oemler effect. Selecting in the observed K-band is more nearly a process of selecting galaxies by their mass than is the case for optically selected samples. Our results suggest that the Butcher-Oemler effect is at least partly due to low-mass galaxies whose optical luminosities are boosted. These lower mass galaxies could evolve into the rich dwarf population observed in nearby clusters.

  20. Avoid Early Selection for Growth Rate in Cottonwood

    Treesearch

    D. T. Cooper; Robert B. Ferguson

    1971-01-01

    A sample of 37 cottonwood clones from a selection program was compared with a sample of 40 random clones in a 14-year test at two sites near Stoneville, Mississippi. Throughout the test period, the select sample was slightly better in mean growth rate, but this difference decreased with age. Performance of ''blue tag" clones selected at age 5 and planted...

  1. ROLE OF LABORATORY SAMPLING DEVICES AND LABORATORY SUBSAMPLING METHODS IN OPTIMIZING REPRESENTATIVENESS STRATEGIES

    EPA Science Inventory

    Sampling is the act of selecting items from a specified population in order to estimate the parameters of that population (e.g., selecting soil samples to characterize the properties at an environmental site). Sampling occurs at various levels and times throughout an environmenta...

  2. 10 CFR 430.70 - Enforcement.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...), the Secretary may conduct testing of that covered product under this subpart by means of a test notice... be selected for testing, the method of selecting the test sample, the time at which testing shall be... shall select a batch, a batch sample, and test units from the batch sample in accordance with the...

  3. 40 CFR 90.706 - Engine sample selection.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Engine sample selection. 90.706 Section...) CONTROL OF EMISSIONS FROM NONROAD SPARK-IGNITION ENGINES AT OR BELOW 19 KILOWATTS Manufacturer Production Line Testing Program § 90.706 Engine sample selection. (a) At the start of each model year, the small...

  4. 40 CFR 90.706 - Engine sample selection.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Engine sample selection. 90.706... (CONTINUED) CONTROL OF EMISSIONS FROM NONROAD SPARK-IGNITION ENGINES AT OR BELOW 19 KILOWATTS Manufacturer Production Line Testing Program § 90.706 Engine sample selection. (a) At the start of each model year, the...

  5. 40 CFR 91.506 - Engine sample selection.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Engine sample selection. 91.506... (CONTINUED) CONTROL OF EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Manufacturer Production Line Testing Program § 91.506 Engine sample selection. (a) At the start of each model year, the marine SI engine...

  6. X-ray versus infrared selection of distant galaxy clusters: a case study using the XMM-LSS and SpARCS cluster samples

    NASA Astrophysics Data System (ADS)

    Willis, J. P.; Ramos-Ceja, M. E.; Muzzin, A.; Pacaud, F.; Yee, H. K. C.; Wilson, G.

    2018-07-01

    We present a comparison of two samples of z > 0.8 galaxy clusters selected using different wavelength-dependent techniques and examine the physical differences between them. We consider 18 clusters from the X-ray-selected XMM Large Scale Structure (LSS) distant cluster survey and 92 clusters from the optical-mid-infrared (MIR)-selected Spitzer Adaptation of the Red Sequence Cluster Survey (SpARCS). Both samples are selected from the same approximately 9 sq deg sky area and we examine them using common XMM-Newton, Spitzer Wide-Area Infrared Extragalactic (SWIRE) survey, and Canada-France-Hawaii Telescope Legacy Survey data. Clusters from each sample are compared employing aperture measures of X-ray and MIR emission. We divide the SpARCS distant cluster sample into three sub-samples: (i) X-ray bright, (ii) X-ray faint, MIR bright, and (iii) X-ray faint, MIR faint clusters. We determine that X-ray- and MIR-selected clusters display very similar surface brightness distributions of galaxy MIR light. In addition, the average location and amplitude of the galaxy red sequence as measured from stacked colour histograms is very similar in the X-ray- and MIR-selected samples. The sub-sample of X-ray faint, MIR bright clusters displays a distribution of brightest cluster galaxy-barycentre position offsets which extends to higher values than all other samples. This observation indicates that such clusters may exist in a more disturbed state compared to the majority of the distant cluster population sampled by XMM-LSS and SpARCS. This conclusion is supported by stacked X-ray images for the X-ray faint, MIR bright cluster sub-sample that display weak, centrally concentrated X-ray emission, consistent with a population of growing clusters accreting from an extended envelope of material.

  7. Representativeness of direct observations selected using a work-sampling equation.

    PubMed

    Sharp, Rebecca A; Mudford, Oliver C; Elliffe, Douglas

    2015-01-01

    Deciding on appropriate sampling to obtain representative samples of behavior is important but not straightforward, because the relative duration of the target behavior may affect its observation in a given sampling interval. Work-sampling methods, which offer a way to adjust the frequency of sampling according to a priori or ongoing estimates of the behavior to achieve a preselected level of representativeness, may provide a solution. Full-week observations of 7 behaviors were conducted for 3 students with autism spectrum disorder and intellectual disabilities. Work-sampling methods were used to select momentary time samples from the full time-of-interest, which produced representative samples. However, work sampling required impractically high numbers of time samples to obtain representative samples. More practical momentary time samples produced less representative samples, particularly for low-duration behaviors. The utility and limits of work-sampling methods for applied behavior analysis are discussed. © Society for the Experimental Analysis of Behavior.

  8. 78 FR 57033 - United States Standards for Condition of Food Containers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-17

    ... containers during production. Stationary lot sampling is the process of randomly selecting sample units from.... * * * * * Stationary lot sampling. The process of randomly selecting sample units from a lot whose production has been... less than 1/16-inch Stringy seal (excessive plastic threads showing at edge of seal 222 area...

  9. THREE-PEE SAMPLING THEORY and program 'THRP' for computer generation of selection criteria

    Treesearch

    L. R. Grosenbaugh

    1965-01-01

    Theory necessary for sampling with probability proportional to prediction ('three-pee,' or '3P,' sampling) is first developed and then exemplified by numerical comparisons of several estimators. Program 'THRP' for computer generation of appropriate 3P-sample-selection criteria is described, and convenient random integer dispensers are...
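    The 3P selection rule can be sketched in a few lines (an illustrative reconstruction, not Grosenbaugh's THRP program; the function name and the KZ framing are assumptions): each unit is compared against a random integer drawn from 1 to KZ, so its selection probability is proportional to its prediction.

```python
import random

def select_3p_sample(predictions, kz, seed=None):
    """Sketch of 3P (probability-proportional-to-prediction) selection:
    a unit is selected when a random integer in [1, kz] does not exceed
    its predicted value, giving P(select) = prediction / kz."""
    rng = random.Random(seed)
    return [i for i, pred in enumerate(predictions)
            if rng.randint(1, kz) <= pred]

# Units with larger predictions (e.g. ocular tree-volume estimates)
# are selected proportionally more often.
preds = [5, 20, 60]
sample = select_3p_sample(preds, kz=100, seed=1)
```

A unit whose prediction equals KZ is always selected, and one with a prediction of zero never is, which is the boundary behavior the proportionality implies.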

  10. Fluid sampling device

    NASA Technical Reports Server (NTRS)

    Studenick, D. K. (Inventor)

    1977-01-01

    An inlet leak for sampling gases, and more specifically for selectively sampling multiple fluids, is described. This fluid sampling device includes a support frame. A plurality of fluid inlet devices extend through the support frame and each of the fluid inlet devices include a longitudinal aperture. An opening device that is responsive to a control signal selectively opens the aperture to allow fluid passage. A closing device that is responsive to another control signal selectively closes the aperture for terminating further fluid flow.

  11. 9 CFR 592.450 - Procedures for selecting appeal samples.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 9 Animals and Animal Products 2 2013-01-01 2013-01-01 false Procedures for selecting appeal samples. 592.450 Section 592.450 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE EGG PRODUCTS INSPECTION VOLUNTARY INSPECTION OF EGG PRODUCTS Appeals § 592.450 Procedures for selecting appeal samples. (a)...

  12. 9 CFR 592.450 - Procedures for selecting appeal samples.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Procedures for selecting appeal samples. 592.450 Section 592.450 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE EGG PRODUCTS INSPECTION VOLUNTARY INSPECTION OF EGG PRODUCTS Appeals § 592.450 Procedures for selecting appeal samples. (a)...

  13. 9 CFR 592.450 - Procedures for selecting appeal samples.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 9 Animals and Animal Products 2 2014-01-01 2014-01-01 false Procedures for selecting appeal samples. 592.450 Section 592.450 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE EGG PRODUCTS INSPECTION VOLUNTARY INSPECTION OF EGG PRODUCTS Appeals § 592.450 Procedures for selecting appeal samples. (a)...

  14. 9 CFR 592.450 - Procedures for selecting appeal samples.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 9 Animals and Animal Products 2 2012-01-01 2012-01-01 false Procedures for selecting appeal samples. 592.450 Section 592.450 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE EGG PRODUCTS INSPECTION VOLUNTARY INSPECTION OF EGG PRODUCTS Appeals § 592.450 Procedures for selecting appeal samples. (a)...

  15. High throughput screening of ligand binding to macromolecules using high resolution powder diffraction

    DOEpatents

    Von Dreele, Robert B.; D'Amico, Kevin

    2006-10-31

    A process is provided for the high throughput screening of binding of ligands to macromolecules using high resolution powder diffraction data including producing a first sample slurry of a selected polycrystalline macromolecule material and a solvent, producing a second sample slurry of a selected polycrystalline macromolecule material, one or more ligands and the solvent, obtaining a high resolution powder diffraction pattern on each of said first sample slurry and the second sample slurry, and, comparing the high resolution powder diffraction pattern of the first sample slurry and the high resolution powder diffraction pattern of the second sample slurry whereby a difference in the high resolution powder diffraction patterns of the first sample slurry and the second sample slurry provides a positive indication for the formation of a complex between the selected polycrystalline macromolecule material and at least one of the one or more ligands.

  16. GeoLab Concept: The Importance of Sample Selection During Long Duration Human Exploration Mission

    NASA Technical Reports Server (NTRS)

    Calaway, M. J.; Evans, C. A.; Bell, M. S.; Graff, T. G.

    2011-01-01

    In the future when humans explore planetary surfaces on the Moon, Mars, and asteroids or beyond, the return of geologic samples to Earth will be a high priority for human spaceflight operations. All future sample return missions will have strict down-mass and volume requirements; methods for in-situ sample assessment and prioritization will be critical for selecting the best samples for return-to-Earth.

  17. Systematic evaluation of matrix effects in hydrophilic interaction chromatography versus reversed phase liquid chromatography coupled to mass spectrometry.

    PubMed

    Periat, Aurélie; Kohler, Isabelle; Thomas, Aurélien; Nicoli, Raul; Boccard, Julien; Veuthey, Jean-Luc; Schappler, Julie; Guillarme, Davy

    2016-03-25

    Reversed phase liquid chromatography (RPLC) coupled to mass spectrometry (MS) is the gold standard technique in bioanalysis. However, hydrophilic interaction chromatography (HILIC) could represent a viable alternative to RPLC for the analysis of polar and/or ionizable compounds, as it often provides higher MS sensitivity and alternative selectivity. Nevertheless, this technique can be also prone to matrix effects (ME). ME are one of the major issues in quantitative LC-MS bioanalysis. To ensure acceptable method performance (i.e., trueness and precision), a careful evaluation and minimization of ME is required. In the present study, the incidence of ME in HILIC-MS/MS and RPLC-MS/MS was compared for plasma and urine samples using two representative sets of 38 pharmaceutical compounds and 40 doping agents, respectively. The optimal generic chromatographic conditions in terms of selectivity with respect to interfering compounds were established in both chromatographic modes by testing three different stationary phases in each mode with different mobile phase pH. A second step involved the assessment of ME in RPLC and HILIC under the best generic conditions, using the post-extraction addition method. Biological samples were prepared using two different sample pre-treatments, i.e., a non-selective sample clean-up procedure (protein precipitation and simple dilution for plasma and urine samples, respectively) and a selective sample preparation, i.e., solid phase extraction for both matrices. The non-selective pretreatments led to significantly less ME in RPLC vs. HILIC conditions regardless of the matrix. On the contrary, HILIC appeared as a valuable alternative to RPLC for plasma and urine samples treated by a selective sample preparation. Indeed, in the case of selective sample preparation, the compounds influenced by ME were different in HILIC and RPLC, and lower and similar ME occurrence was generally observed in RPLC vs. 
HILIC for urine and plasma samples, respectively. The complementarity of both chromatographic modes was also demonstrated, as ME was observed only scarcely for urine and plasma samples when selecting the most appropriate chromatographic mode. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Sampling considerations for disease surveillance in wildlife populations

    USGS Publications Warehouse

    Nusser, S.M.; Clark, W.R.; Otis, D.L.; Huang, L.

    2008-01-01

    Disease surveillance in wildlife populations involves detecting the presence of a disease, characterizing its prevalence and spread, and subsequent monitoring. A probability sample of animals selected from the population and corresponding estimators of disease prevalence and detection provide estimates with quantifiable statistical properties, but this approach is rarely used. Although wildlife scientists often assume probability sampling and random disease distributions to calculate sample sizes, convenience samples (i.e., samples of readily available animals) are typically used, and disease distributions are rarely random. We demonstrate how landscape-based simulation can be used to explore properties of estimators from convenience samples in relation to probability samples. We used simulation methods to model what is known about the habitat preferences of the wildlife population, the disease distribution, and the potential biases of the convenience-sample approach. Using chronic wasting disease in free-ranging deer (Odocoileus virginianus) as a simple illustration, we show that using probability sample designs with appropriate estimators provides unbiased surveillance parameter estimates but that the selection bias and coverage errors associated with convenience samples can lead to biased and misleading results. We also suggest practical alternatives to convenience samples that mix probability and convenience sampling. For example, a sample of land areas can be selected using a probability design that oversamples areas with larger animal populations, followed by harvesting of individual animals within sampled areas using a convenience sampling method.
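    The bias the authors document can be illustrated with a toy landscape simulation (a sketch under assumed prevalence values, not the authors' chronic-wasting-disease model): if "accessible" areas carry higher disease prevalence, a convenience sample drawn only from those areas overestimates prevalence, while a simple random sample over all areas does not.

```python
import random

def simulate(n_areas=1000, n=200, trials=2000, seed=7):
    """Toy simulation: half the areas are 'accessible' with prevalence
    0.20, the rest have prevalence 0.05 (true mean 0.125). Compare a
    probability (simple random) sample over all areas with a
    convenience sample restricted to accessible areas."""
    rng = random.Random(seed)
    prev = [0.20 if i < n_areas // 2 else 0.05 for i in range(n_areas)]
    true_prev = sum(prev) / n_areas
    srs_est = conv_est = 0.0
    for _ in range(trials):
        # Probability sample: any area may be visited.
        srs = [rng.random() < prev[rng.randrange(n_areas)] for _ in range(n)]
        # Convenience sample: only accessible (high-prevalence) areas.
        conv = [rng.random() < prev[rng.randrange(n_areas // 2)] for _ in range(n)]
        srs_est += sum(srs) / n
        conv_est += sum(conv) / n
    return true_prev, srs_est / trials, conv_est / trials
```

Averaged over many trials, the random-sample estimate converges to the true prevalence while the convenience estimate stays near the accessible-area prevalence, mirroring the selection bias described above.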

  19. Rare-Earth Oxide (Yb2O3) Selective Emitter Fabrication and Evaluation

    NASA Technical Reports Server (NTRS)

    Jennette, Bryan; Gregory, Don A.; Herren, Kenneth; Tucker, Dennis; Smith, W. Scott (Technical Monitor)

    2001-01-01

    This investigation involved the fabrication and evaluation of rare-earth oxide selective emitters. The first goal of this study was to successfully fabricate the selective emitter samples using paper and ceramic materials processing techniques. The resulting microstructure was also analyzed using a Scanning Electron Microscope. All selective emitter samples fabricated for this study were made with ytterbium oxide (Yb2O3). The second goal of this study involved the measurement of the spectral emission and the radiated power of all the selective emitter samples. The final goal of this study involved the direct comparison of the radiated power emitted by the selective emitter samples to that of a standard blackbody at the same temperature and within the same wavelength range.

  20. Thermomechanical Methodology for Stabilizing Shape Memory Alloy (SMA) Response

    NASA Technical Reports Server (NTRS)

    Padula, II, Santo A (Inventor)

    2013-01-01

    Methods and apparatuses for stabilizing the strain-temperature response for a shape memory alloy are provided. To perform stabilization of a second sample of the shape memory alloy, a first sample of the shape memory alloy is selected for isobaric treatment and the second sample is selected for isothermal treatment. When applying the isobaric treatment to the first sample, a constant stress is applied to the first sample. Temperature is also cycled from a minimum temperature to a maximum temperature until a strain on the first sample stabilizes. Once the strain on the first sample stabilizes, the isothermal treatment is performed on the second sample. During isothermal treatment, different levels of stress on the second sample are applied until a strain on the second sample matches the stabilized strain on the first sample.

  1. Thermomechanical Methodology for Stabilizing Shape Memory Alloy (SMA) Response

    NASA Technical Reports Server (NTRS)

    Padula, Santo A., II (Inventor)

    2016-01-01

    Methods and apparatuses for stabilizing the strain-temperature response for a shape memory alloy are provided. To perform stabilization of a second sample of the shape memory alloy, a first sample of the shape memory alloy is selected for isobaric treatment and the second sample is selected for isothermal treatment. When applying the isobaric treatment to the first sample, a constant stress is applied to the first sample. Temperature is also cycled from a minimum temperature to a maximum temperature until a strain on the first sample stabilizes. Once the strain on the first sample stabilizes, the isothermal treatment is performed on the second sample. During isothermal treatment, different levels of stress on the second sample are applied until a strain on the second sample matches the stabilized strain on the first sample.

  2. Self-contained cryogenic gas sampling apparatus and method

    DOEpatents

    McManus, G.J.; Motes, B.G.; Bird, S.K.; Kotter, D.K.

    1996-03-26

    Apparatus for obtaining a whole gas sample, is composed of: a sample vessel having an inlet for receiving a gas sample; a controllable valve mounted for controllably opening and closing the inlet; a valve control coupled to the valve for opening and closing the valve at selected times; a portable power source connected for supplying operating power to the valve control; and a cryogenic coolant in thermal communication with the vessel for cooling the interior of the vessel to cryogenic temperatures. A method is described for obtaining an air sample using the apparatus described above, by: placing the apparatus at a location at which the sample is to be obtained; operating the valve control to open the valve at a selected time and close the valve at a selected subsequent time; and between the selected times maintaining the vessel at a cryogenic temperature by heat exchange with the coolant. 3 figs.

  3. Self-contained cryogenic gas sampling apparatus and method

    DOEpatents

    McManus, Gary J.; Motes, Billy G.; Bird, Susan K.; Kotter, Dale K.

    1996-01-01

    Apparatus for obtaining a whole gas sample, composed of: a sample vessel having an inlet for receiving a gas sample; a controllable valve mounted for controllably opening and closing the inlet; a valve control coupled to the valve for opening and closing the valve at selected times; a portable power source connected for supplying operating power to the valve control; and a cryogenic coolant in thermal communication with the vessel for cooling the interior of the vessel to cryogenic temperatures. A method of obtaining an air sample using the apparatus described above, by: placing the apparatus at a location at which the sample is to be obtained; operating the valve control to open the valve at a selected time and close the valve at a selected subsequent time; and between the selected times maintaining the vessel at a cryogenic temperature by heat exchange with the coolant.

  4. The quasar luminosity function from a variability-selected sample

    NASA Astrophysics Data System (ADS)

    Hawkins, M. R. S.; Veron, P.

    1993-01-01

    A sample of quasars is selected from a 10-yr sequence of 30 UK Schmidt plates. Luminosity functions are derived in several redshift intervals, which in each case show a featureless power-law rise towards low luminosities. There is no sign of the 'break' found in the recent UVX sample of Boyle et al. It is suggested that reasons for the disagreement are connected with biases in the selection of the UVX sample. The question of the nature of quasar evolution appears to be still unresolved.

  5. Hybrid selection for sequencing pathogen genomes from clinical samples

    PubMed Central

    2011-01-01

    We have adapted a solution hybrid selection protocol to enrich pathogen DNA in clinical samples dominated by human genetic material. Using mock mixtures of human and Plasmodium falciparum malaria parasite DNA as well as clinical samples from infected patients, we demonstrate an average of approximately 40-fold enrichment of parasite DNA after hybrid selection. This approach will enable efficient genome sequencing of pathogens from clinical samples, as well as sequencing of endosymbiotic organisms such as Wolbachia that live inside diverse metazoan phyla. PMID:21835008

  6. Quantifying recent erosion and sediment delivery using probability sampling: A case study

    Treesearch

    Jack Lewis

    2002-01-01

    Abstract - Estimates of erosion and sediment delivery have often relied on measurements from locations that were selected to be representative of particular terrain types. Such judgement samples are likely to overestimate or underestimate the mean of the quantity of interest. Probability sampling can eliminate the bias due to sample selection, and it permits the...

  7. Location uncertainty and the tri-areal design

    Treesearch

    Francis A. Roesch

    2007-01-01

    The U.S. Department of Agriculture Forest Service Forest Inventory and Analysis Program (FIA) uses a field plot design that incorporates multiple sample selection mechanisms. Not all of the five FIA units currently use the entire suite of available sample selection mechanisms. These sampling selection mechanisms could be described in a number of ways with respect to...

  8. 40 CFR 761.355 - Third level of sample selection.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... of sample selection further reduces the size of the subsample to 100 grams which is suitable for the... procedures in § 761.353 of this part into 100 gram portions. (b) Use a random number generator or random number table to select one 100 gram size portion as a sample for a procedure used to simulate leachate...

  9. 40 CFR 761.355 - Third level of sample selection.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... of sample selection further reduces the size of the subsample to 100 grams which is suitable for the... procedures in § 761.353 of this part into 100 gram portions. (b) Use a random number generator or random number table to select one 100 gram size portion as a sample for a procedure used to simulate leachate...

  10. 40 CFR 761.355 - Third level of sample selection.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... of sample selection further reduces the size of the subsample to 100 grams which is suitable for the... procedures in § 761.353 of this part into 100 gram portions. (b) Use a random number generator or random number table to select one 100 gram size portion as a sample for a procedure used to simulate leachate...

  11. 40 CFR 761.355 - Third level of sample selection.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... of sample selection further reduces the size of the subsample to 100 grams which is suitable for the... procedures in § 761.353 of this part into 100 gram portions. (b) Use a random number generator or random number table to select one 100 gram size portion as a sample for a procedure used to simulate leachate...

  12. 40 CFR 761.355 - Third level of sample selection.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... of sample selection further reduces the size of the subsample to 100 grams which is suitable for the... procedures in § 761.353 of this part into 100 gram portions. (b) Use a random number generator or random number table to select one 100 gram size portion as a sample for a procedure used to simulate leachate...

  13. Location uncertainty and the tri-areal design

    Treesearch

    Francis A. Roesch

    2005-01-01

    The U.S. Department of Agriculture Forest Service Forest Inventory and Analysis Program (FIA) uses a field plot design that incorporates multiple sample selection mechanisms. Not all of the five FIA units currently use the entire suite of available sample selection mechanisms. These sampling selection mechanisms could be described in a number of ways with respect to...

  14. Optimized probability sampling of study sites to improve generalizability in a multisite intervention trial.

    PubMed

    Kraschnewski, Jennifer L; Keyserling, Thomas C; Bangdiwala, Shrikant I; Gizlice, Ziya; Garcia, Beverly A; Johnston, Larry F; Gustafson, Alison; Petrovic, Lindsay; Glasgow, Russell E; Samuel-Hodge, Carmen D

    2010-01-01

    Studies of type 2 translation, the adaptation of evidence-based interventions to real-world settings, should include representative study sites and staff to improve external validity. Sites for such studies are, however, often selected by convenience sampling, which limits generalizability. We used an optimized probability sampling protocol to select an unbiased, representative sample of study sites to prepare for a randomized trial of a weight loss intervention. We invited North Carolina health departments within 200 miles of the research center to participate (N = 81). Of the 43 health departments that were eligible, 30 were interested in participating. To select a representative and feasible sample of 6 health departments that met inclusion criteria, we generated all combinations of 6 from the 30 health departments that were eligible and interested. From the subset of combinations that met inclusion criteria, we selected 1 at random. Of 593,775 possible combinations of 6 counties, 15,177 (3%) met inclusion criteria. Sites in the selected subset were similar to all eligible sites in terms of health department characteristics and county demographics. Optimized probability sampling improved generalizability by ensuring an unbiased and representative sample of study sites.
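    The protocol described in this abstract (enumerate all combinations of sites, keep those meeting inclusion criteria, then draw one at random) can be sketched as follows; the site data and the inclusion criterion here are hypothetical stand-ins, not the study's actual criteria.

```python
import itertools
import random

def sample_site_combination(sites, k, meets_criteria, seed=None):
    """Sketch of optimized probability sampling of study sites:
    enumerate all k-site combinations, filter by inclusion criteria,
    and select one eligible combination uniformly at random."""
    eligible = [c for c in itertools.combinations(sites, k)
                if meets_criteria(c)]
    rng = random.Random(seed)
    return rng.choice(eligible) if eligible else None

# Toy criterion: combined population of the chosen sites must reach a floor.
sites = {"A": 10, "B": 20, "C": 30, "D": 40, "E": 50}
chosen = sample_site_combination(
    list(sites), 2, lambda c: sum(sites[s] for s in c) >= 60, seed=0)
```

Because every eligible combination has equal probability of selection, each eligible site's inclusion probability is known, which is what makes the resulting sample a probability sample rather than a convenience one.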

  15. [The research protocol III. Study population].

    PubMed

    Arias-Gómez, Jesús; Villasís-Keever, Miguel Ángel; Miranda-Novales, María Guadalupe

    2016-01-01

    The study population is defined as a determined, limited, and accessible set of cases that will constitute the pool from which the sample is selected; its members must share several characteristics and meet distinct criteria. The objectives of this manuscript are focused on specifying each of the elements required to select the participants of a research project during the elaboration of the protocol, including the concepts of study population, sample, selection criteria, and sampling methods. After delineating the study population, the researcher must specify the criteria that each participant has to meet. The criteria that include the specific characteristics are denominated selection or eligibility criteria. These criteria are inclusion, exclusion, and elimination criteria, and they delineate the eligible population. Sampling methods are divided into two large groups: 1) probabilistic or random sampling and 2) non-probabilistic sampling. The difference lies in the use of statistical methods to select the subjects. In every study, it is necessary to establish at the outset the specific number of participants to be included to achieve the objectives of the study. This number is the sample size, and it can be calculated or estimated with mathematical formulas and statistical software.
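    The sample-size estimation mentioned above can be illustrated with the standard formula for estimating a proportion, n = z²·p(1-p)/e²; this is a generic sketch, not the article's specific procedure or software.

    ```python
    import math

    def sample_size_proportion(p=0.5, margin=0.05, z=1.96):
        """Minimum n to estimate a proportion p to within +/- margin at ~95%
        confidence (z = 1.96); p = 0.5 gives the most conservative size."""
        return math.ceil((z ** 2) * p * (1 - p) / margin ** 2)

    print(sample_size_proportion())             # 385
    print(sample_size_proportion(margin=0.03))  # 1068
    ```

    The familiar "about 385 respondents" for a 5% margin of error falls out directly; tightening the margin to 3% nearly triples the required sample.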

  16. The SDSS-IV MaNGA Sample: Design, Optimization, and Usage Considerations

    NASA Astrophysics Data System (ADS)

    Wake, David A.; Bundy, Kevin; Diamond-Stanic, Aleksandar M.; Yan, Renbin; Blanton, Michael R.; Bershady, Matthew A.; Sánchez-Gallego, José R.; Drory, Niv; Jones, Amy; Kauffmann, Guinevere; Law, David R.; Li, Cheng; MacDonald, Nicholas; Masters, Karen; Thomas, Daniel; Tinker, Jeremy; Weijmans, Anne-Marie; Brownstein, Joel R.

    2017-09-01

    We describe the sample design for the SDSS-IV MaNGA survey and present the final properties of the main samples along with important considerations for using these samples for science. Our target selection criteria were developed while simultaneously optimizing the size distribution of the MaNGA integral field units (IFUs), the IFU allocation strategy, and the target density to produce a survey defined in terms of maximizing signal-to-noise ratio, spatial resolution, and sample size. Our selection strategy makes use of redshift limits that depend only on I-band absolute magnitude (M_I) or, for a small subset of our sample, on M_I and color (NUV - I). Such a strategy ensures that all galaxies span the same range in angular size irrespective of luminosity and are therefore covered evenly by the adopted range of IFU sizes. We define three samples: the Primary and Secondary samples are selected to have a flat number density with respect to M_I and are targeted to have spectroscopic coverage to 1.5 and 2.5 effective radii (R_e), respectively. The Color-Enhanced supplement increases the number of galaxies in the low-density regions of color-magnitude space by extending the redshift limits of the Primary sample in the appropriate color bins. The samples cover the stellar mass range 5 × 10^8 ≤ M* ≤ 3 × 10^11 M_⊙ h^-2 and are sampled at median physical resolutions of 1.37 and 2.5 kpc for the Primary and Secondary samples, respectively. We provide weights that will statistically correct for our luminosity- and color-dependent selection function and IFU allocation strategy, thus correcting the observed sample to a volume-limited sample.

  17. A mixture model with a reference-based automatic selection of components for disease classification from protein and/or gene expression levels

    PubMed Central

    2011-01-01

    Background: Bioinformatics data analysis often uses a linear mixture model that represents samples as an additive mixture of components. Properly constrained blind matrix factorization methods extract those components using the mixture samples only. However, automatic selection of the extracted components to be retained for classification analysis remains an open issue. Results: The method proposed here is applied to well-studied protein and genomic datasets of ovarian, prostate and colon cancers to extract components for disease prediction. It achieves average sensitivities of 96.2% (sd = 2.7%), 97.6% (sd = 2.8%) and 90.8% (sd = 5.5%) and average specificities of 93.6% (sd = 4.1%), 99% (sd = 2.2%) and 79.4% (sd = 9.8%) in 100 independent two-fold cross-validations. Conclusions: We propose an additive mixture model of a sample for feature extraction using, in principle, sparseness-constrained factorization on a sample-by-sample basis; existing methods, by contrast, factorize the complete dataset simultaneously. The sample model is composed of a reference sample, representing the control and/or case (disease) group, and a test sample. Each sample is decomposed into two or more components that are selected automatically (without using label information) as control specific, case specific, or not differentially expressed (neutral). The number of components is determined by cross-validation. Automatic assignment of features (m/z ratios or genes) to a particular component is based on thresholds estimated from each sample directly. Because the decomposition is local, the strength of expression of each feature can vary across samples, yet features are still allocated to the related disease- and/or control-specific component. Since label information is not used in the selection process, the case- and control-specific components can be used for classification; that is not the case with standard factorization methods. Moreover, the component selected by the proposed method as disease specific can be interpreted as a sub-mode and retained for further analysis to identify potential biomarkers. As opposed to standard matrix factorization methods, this can be achieved on a sample (experiment)-by-sample basis. Postulating one or more components with indifferent features enables their removal from the disease- and control-specific components on a sample-by-sample basis. This yields selected components with reduced complexity and, generally, increases prediction accuracy. PMID:22208882

  18. A quick method based on SIMPLISMA-KPLS for simultaneously selecting outlier samples and informative samples for model standardization in near infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Li, Li-Na; Ma, Chang-Ming; Chang, Ming; Zhang, Ren-Cheng

    2017-12-01

    A novel method based on SIMPLe-to-use Interactive Self-modeling Mixture Analysis (SIMPLISMA) and Kernel Partial Least Squares (KPLS), named SIMPLISMA-KPLS, is proposed in this paper for the simultaneous selection of outlier samples and informative samples. It is a quick algorithm for model standardization (also called model transfer) in near infrared (NIR) spectroscopy. NIR data from a corn experiment for the analysis of protein content are used to evaluate the proposed method. Piecewise direct standardization (PDS) is employed for model transfer, and SIMPLISMA-PDS-KPLS is compared with KS-PDS-KPLS in terms of the prediction accuracy for protein content and the running speed of each algorithm. The conclusions are that SIMPLISMA-KPLS can serve as an alternative sample selection method for model transfer. Although its accuracy is similar to that of Kennard-Stone (KS), it differs from KS in that it uses concentration information during selection; this ensures that analyte information is involved in the analysis and that the spectra (X) of the selected samples are interrelated with the concentrations (y). It can also be used for outlier sample elimination simultaneously, via validation of the calibration. The timing results show that the sample selection process is more rapid when using KPLS. The speed of SIMPLISMA-KPLS is beneficial for online measurement using NIR spectroscopy.
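    SIMPLISMA-KPLS itself is not reproduced here, but the Kennard-Stone (KS) baseline it is compared against can be sketched; the demo vectors below are hypothetical stand-ins, and real use would operate on NIR spectra.

    ```python
    import math

    def kennard_stone(X, k):
        """Kennard-Stone sample selection: seed with the two most distant
        samples, then repeatedly add the candidate whose nearest selected
        neighbour is farthest away (max-min criterion). X is a list of
        feature vectors (e.g. spectra); returns the indices of k samples."""
        def dist(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

        n = len(X)
        # Seed with the most distant pair of samples.
        i0, j0 = max(((i, j) for i in range(n) for j in range(i + 1, n)),
                     key=lambda p: dist(X[p[0]], X[p[1]]))
        selected = [i0, j0]
        remaining = [i for i in range(n) if i not in selected]
        while len(selected) < k and remaining:
            # Distance from each candidate to its closest selected sample.
            best = max(remaining,
                       key=lambda i: min(dist(X[i], X[j]) for j in selected))
            selected.append(best)
            remaining.remove(best)
        return selected

    # Hypothetical 2-D "spectra" for illustration:
    demo = [[0.0, 0.0], [10.0, 0.0], [5.0, 0.0], [0.1, 0.0]]
    print(kennard_stone(demo, 3))  # [0, 1, 2]
    ```

    Note that KS, as shown, uses only the spectra (X); the abstract's point is that SIMPLISMA-KPLS additionally brings concentration information (y) into the selection.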

  19. The effect of branding on consumer palatability ratings of beef strip loin steaks.

    PubMed

    Wilfong, A K; McKillip, K V; Gonzalez, J M; Houser, T A; Unruh, J A; Boyle, E A E; O'Quinn, T G

    2016-11-01

    The objective of this study was to determine the influence of knowing the brand or USDA grade on consumer palatability ratings of beef strip loin steaks. Strip loins were selected to represent 5 USDA grades and brands: USDA Select, Choice, Prime, Certified Angus Beef (CAB; upper 2/3 Choice), and Angus Select (Select from carcasses of cattle classified as Angus on the basis of phenotype). After 21 d of aging, 2.5-cm-thick steaks were cut, and consecutively cut steaks were paired for consumer evaluation. Consumer panelists (n = 112) evaluated samples for tenderness, juiciness, flavor liking, and overall liking. Additionally, consumers rated each palatability trait as either acceptable or unacceptable. Samples were fed in 2 rounds on the same day: blind and informed testing. In the first round, blind testing, consumers were served 1 sample from each treatment, with no product information provided. In the second round, consumers were informed of the brand or quality grade prior to sampling. During blind testing, CAB rated similar (P > 0.05) to Choice for all palatability traits; however, CAB rated greater (P < 0.05) than Choice for all traits during informed testing. Additionally, Angus Select and Select were rated similar (P > 0.05) for all traits when tested blind, but Angus Select was rated greater (P < 0.05) than Select for flavor and overall liking when brand was declared. When comparing blind and informed ratings, Angus Select and CAB had greater (P < 0.05) ratings for juiciness, flavor liking, and overall liking, and Prime had increased (P < 0.05) ratings for flavor liking and overall liking because of brand disclosure. However, ratings for Choice and Select samples were unaffected (P > 0.05) when brand was disclosed. Brand knowledge increased (P < 0.05) the percentage of Prime samples rated as acceptable for flavor and the percentage of Angus Select samples rated as acceptable for flavor and overall liking. Conversely, there was no difference (P > 0.05) in the percentage of Choice and Select samples rated as acceptable for all palatability traits. These data indicate that Prime, CAB, and Angus Select steaks receive an increase in consumer palatability perception, or "brand lift," which does not occur for Choice and Select beef.

  20. Effects of Sample Selection Bias on the Accuracy of Population Structure and Ancestry Inference

    PubMed Central

    Shringarpure, Suyash; Xing, Eric P.

    2014-01-01

    Population stratification is an important task in genetic analyses. It provides information about the ancestry of individuals and can be an important confounder in genome-wide association studies. Public genotyping projects have made a large number of datasets available for study. However, practical constraints dictate that only a small number of individuals from a geographical/ethnic population are genotyped, so the resulting data are a sample from the entire population. If the distribution of sample sizes is not representative of the populations being sampled, the accuracy of population stratification analyses of the data could be affected. We attempt to understand the effect of biased sampling on the accuracy of population structure analysis and individual ancestry recovery. We examined two methods commonly used for analyses of such datasets, ADMIXTURE and EIGENSOFT, and found that the accuracy of recovery of population structure depends to a large extent on the sample used for analysis and how representative it is of the underlying populations. Using simulated data and real genotype data from cattle, we show that sample selection bias can affect the results of population structure analyses. We develop a mathematical framework for sample selection bias in models for population structure and also propose a correction for sample selection bias using auxiliary information about the sample. We demonstrate that such a correction is effective in practice using simulated and real data. PMID:24637351
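    The general idea of correcting a biased sample with auxiliary information can be sketched with a toy reweighting example. The populations, counts, and "known" proportions below are hypothetical, and this is an illustration of inverse-probability-style weighting, not the paper's actual estimator.

    ```python
    # Genotyped individuals per population (the biased sample), and the
    # underlying population proportions assumed known from auxiliary data.
    counts = {"popA": 80, "popB": 20}
    true_frac = {"popA": 0.5, "popB": 0.5}
    n = sum(counts.values())

    # Weight = target share / observed share, so over-sampled groups are
    # down-weighted and under-sampled groups are up-weighted.
    weights = {p: true_frac[p] / (counts[p] / n) for p in counts}

    # After weighting, both populations contribute equally, as in the truth.
    weighted = {p: weights[p] * counts[p] for p in counts}
    ```

    Here popB, sampled at a fifth of popA's rate, receives a weight of 2.5 versus popA's 0.625, restoring the 50/50 balance.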

  1. A re-evaluation of a case-control model with contaminated controls for resource selection studies

    Treesearch

    Christopher T. Rota; Joshua J. Millspaugh; Dylan C. Kesler; Chad P. Lehman; Mark A. Rumble; Catherine M. B. Jachowski

    2013-01-01

    A common sampling design in resource selection studies involves measuring resource attributes at sample units used by an animal and at sample units considered available for use. Few models can estimate the absolute probability of using a sample unit from such data, but such approaches are generally preferred over statistical methods that estimate a relative probability...

  2. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 31 2014-07-01 2014-07-01 false Sampling 1 meter square surfaces by...(b)(3) § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as...

  3. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 32 2013-07-01 2013-07-01 false Sampling 1 meter square surfaces by...(b)(3) § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as...

  4. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 32 2012-07-01 2012-07-01 false Sampling 1 meter square surfaces by...(b)(3) § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as...

  5. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 31 2011-07-01 2011-07-01 false Sampling 1 meter square surfaces by...(b)(3) § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as...

  6. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Sampling 1 meter square surfaces by...(b)(3) § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as...

  7. Duplex sampling apparatus and method

    DOEpatents

    Brown, Paul E.; Lloyd, Robert

    1992-01-01

    An improved apparatus is provided for sampling a gaseous mixture and for measuring mixture components. The apparatus includes two sampling containers connected in series, serving as a duplex sampling apparatus. The apparatus is adapted to independently determine the amounts of condensable and noncondensable gases in admixture from a single sample. More specifically, a first container includes a first port capable of selectively connecting to and disconnecting from a sample source and a second port capable of selectively connecting to and disconnecting from a second container. The second container also includes a first port capable of selectively connecting to and disconnecting from the second port of the first container and a second port capable of selectively connecting to and disconnecting from a differential pressure source. By cooling a mixture sample in the first container, the condensable vapors form a liquid, leaving noncondensable gases either as free gases or dissolved in the liquid. The condensed liquid is heated to drive out dissolved noncondensable gases, and all the noncondensable gases are transferred to the second container. Then the first and second containers are separated from one another in order to separately determine the amount of noncondensable gases and the amount of condensable gases in the sample.

  8. Observed Characteristics and Teacher Quality: Impacts of Sample Selection on a Value Added Model

    ERIC Educational Resources Information Center

    Winters, Marcus A.; Dixon, Bruce L.; Greene, Jay P.

    2012-01-01

    We measure the impact of observed teacher characteristics on student math and reading proficiency using a rich dataset from Florida. We expand upon prior work by accounting directly for nonrandom attrition of teachers from the classroom in a sample selection framework. We find evidence that sample selection is present in the estimation of the…

  9. 40 CFR Appendix A to Subpart G of... - Sampling Plans for Selective Enforcement Auditing of Marine Engines

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Enforcement Auditing of Marine Engines A Appendix A to Subpart G of Part 91 Protection of Environment...-IGNITION ENGINES Selective Enforcement Auditing Regulations Pt. 91, Subpt. G, App. A Appendix A to Subpart G of Part 91—Sampling Plans for Selective Enforcement Auditing of Marine Engines Table 1—Sampling...

  10. 40 CFR Appendix A to Subpart F of... - Sampling Plans for Selective Enforcement Auditing of Nonroad Engines

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Enforcement Auditing of Nonroad Engines A Appendix A to Subpart F of Part 89 Protection of Environment... NONROAD COMPRESSION-IGNITION ENGINES Selective Enforcement Auditing Pt. 89, Subpt. F, App. A Appendix A to Subpart F of Part 89—Sampling Plans for Selective Enforcement Auditing of Nonroad Engines Table 1—Sampling...

  11. 40 CFR Appendix A to Subpart G of... - Sampling Plans for Selective Enforcement Auditing of Marine Engines

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Enforcement Auditing of Marine Engines A Appendix A to Subpart G of Part 91 Protection of Environment...-IGNITION ENGINES Selective Enforcement Auditing Regulations Pt. 91, Subpt. G, App. A Appendix A to Subpart G of Part 91—Sampling Plans for Selective Enforcement Auditing of Marine Engines Table 1—Sampling...

  12. Local Feature Selection for Data Classification.

    PubMed

    Armanfard, Narges; Reilly, James P; Komeili, Majid

    2016-06-01

    Typical feature selection methods choose an optimal global feature subset that is applied over all regions of the sample space. In contrast, in this paper we propose a novel localized feature selection (LFS) approach whereby each region of the sample space is associated with its own distinct optimized feature set, which may vary both in membership and size across the sample space. This allows the feature set to optimally adapt to local variations in the sample space. An associated method for measuring the similarity of a query datum to each of the respective classes is also proposed. The proposed method makes no assumptions about the underlying structure of the samples; hence, the method is insensitive to the distribution of the data over the sample space. The method is efficiently formulated as a linear programming optimization problem. Furthermore, we demonstrate that the method is robust against the over-fitting problem. Experimental results on eleven synthetic and real-world data sets demonstrate the viability of the formulation and the effectiveness of the proposed algorithm. In addition, we show several examples where localized feature selection produces better results than a global feature selection method.

  13. An Hα-selected sample of cataclysmic variables - I. Observations of newly discovered systems

    NASA Astrophysics Data System (ADS)

    Pretorius, Magaretha L.; Knigge, Christian

    2008-04-01

    Strong selection effects are present in observational samples of cataclysmic variables (CVs), complicating comparisons to theoretical predictions. The selection criteria used to define most CV samples discriminate heavily against the discovery of short-period, intrinsically faint systems. The situation can be improved by selecting CVs for the presence of emission lines. For this reason, we have constructed a homogeneous sample of CVs selected on the basis of Hα emission. We present discovery observations of the 14 CVs and two additional CV candidates found in this search. The orbital periods of 11 of the new CVs were measured; all are above 3 h. There are two eclipsing systems in the sample, and one in which we observed a quasi-periodic modulation on a ~1000-s time-scale. We also detect the secondary star in the spectrum of one system, and measure its spectral type. Several of the new CVs have the spectroscopic appearance of nova-like variables, and a few display what may be SW Sex star behaviour. In a companion paper, we discuss the implications of this new sample for CV evolution.

  14. Methodology Series Module 5: Sampling Strategies.

    PubMed

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'sampling method'. There are essentially two types of sampling methods: 1) probability sampling, based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling, based on the researcher's choice of a population that is accessible and available. Some of the non-probability sampling methods are purposive sampling, convenience sampling, and quota sampling. Random sampling (such as a simple random sample or a stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and to state the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when a convenience sample was actually used). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
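    The probability versus non-probability distinction above can be made concrete with a minimal sketch; the patient frame and sample size are hypothetical.

    ```python
    import random

    # Hypothetical sampling frame of 500 potential participants.
    population = [f"patient_{i:03d}" for i in range(500)]

    # Probability sampling: chance decides -- every unit has a known, equal
    # probability of inclusion (simple random sample without replacement).
    rng = random.Random(42)
    random_sample = rng.sample(population, 50)

    # Non-probability sampling: the researcher's choice decides -- e.g. a
    # convenience sample of whoever is easiest to reach (here, the first 50).
    convenience_sample = population[:50]
    ```

    Both yield 50 participants, but only the first supports population-level inference; describing the second as a "random sample" would be exactly the misrepresentation the abstract warns against.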

  15. Enhancement of the spectral selectivity of complex samples by measuring them in a frozen state at low temperatures in order to improve accuracy for quantitative analysis. Part II. Determination of viscosity for lube base oils using Raman spectroscopy.

    PubMed

    Kim, Mooeung; Chung, Hoeil

    2013-03-07

    The use of selectivity-enhanced Raman spectra of lube base oil (LBO) samples achieved by the spectral collection under frozen conditions at low temperatures was effective for improving accuracy for the determination of the kinematic viscosity at 40 °C (KV@40). A collection of Raman spectra from samples cooled around -160 °C provided the most accurate measurement of KV@40. Components of the LBO samples were mainly long-chain hydrocarbons with molecular structures that were deformable when these were frozen, and the different structural deformabilities of the components enhanced spectral selectivity among the samples. To study the structural variation of components according to the change of sample temperature from cryogenic to ambient condition, n-heptadecane and pristane (2,6,10,14-tetramethylpentadecane) were selected as representative components of LBO samples, and their temperature-induced spectral features as well as the corresponding spectral loadings were investigated. A two-dimensional (2D) correlation analysis was also employed to explain the origin for the improved accuracy. The asynchronous 2D correlation pattern was simplest at the optimal temperature, indicating the occurrence of distinct and selective spectral variations, which enabled the variation of KV@40 of LBO samples to be more accurately assessed.

  16. Corn blight review: Sampling model and ground data measurements program

    NASA Technical Reports Server (NTRS)

    Allen, R. D.

    1972-01-01

    The sampling plan involved the selection of the study area, determination of the flightline and segment sample design within the study area, and determination of a field sample design. Initial interview survey data consisting of crop species acreage and land use were collected. On all corn fields, additional information such as seed type, row direction, population, planting date, etc. was also collected. From this information, sample corn fields were selected to be observed through the growing season on a biweekly basis by county extension personnel.

  17. Powder Handling Device for Analytical Instruments

    NASA Technical Reports Server (NTRS)

    Sarrazin, Philippe C. (Inventor); Blake, David F. (Inventor)

    2006-01-01

    Method and system for causing a powder sample in a sample holder to undergo at least one of three motions (vibration, rotation and translation) at a selected motion frequency in order to present several views of an individual grain of the sample. One or more measurements of diffraction, fluorescence, spectroscopic interaction, transmission, absorption and/or reflection can be made on the sample, using light in a selected wavelength region.

  18. Developments in Sampling and Analysis Instrumentation for Stationary Sources

    ERIC Educational Resources Information Center

    Nader, John S.

    1973-01-01

    Instrumentation for the measurement of pollutant emissions is considered, including sample-site selection, sample transport, sample treatment, sample analysis, and data reduction, display, and interpretation. The measurement approaches discussed involve sample extraction from within the stack and electro-optical methods. (BL)

  19. Sample Selection for Training Cascade Detectors.

    PubMed

    Vállez, Noelia; Deniz, Oscar; Bueno, Gloria

    2015-01-01

    Automatic detection systems usually require large and representative training datasets in order to obtain good detection and false positive rates. In such datasets the positive set has few samples, while the negative set must represent anything except the object of interest; consequently, the negative set typically contains orders of magnitude more images than the positive set. However, imbalanced training databases lead to biased classifiers. In this paper, we focus our attention on a negative sample selection method to properly balance the training data for cascade detectors. The method is based on the selection of the most informative false positive samples generated in one stage to feed the next stage. The results show that the proposed cascade detector with sample selection obtains on average a better partial AUC and a smaller standard deviation than the other cascade detectors compared.
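    The stage-to-stage negative selection described above is commonly sketched as hard-negative mining; the scoring function and the pool below are hypothetical stand-ins for a trained stage classifier and its set of negative windows.

    ```python
    import random

    rng = random.Random(0)
    # Stand-in "negative windows", each summarized by a single scoreable value.
    negative_pool = [rng.random() for _ in range(10_000)]

    def stage_score(x):
        """Hypothetical trained stage: higher score = more object-like."""
        return x

    threshold = 0.99  # the stage accepts anything scoring above this

    # The informative samples are this stage's false positives: negatives it
    # wrongly accepts. Only these feed the next stage's training set, which
    # keeps the negative set balanced against the small positive set.
    hard_negatives = [x for x in negative_pool if stage_score(x) > threshold]
    ```

    Each successive stage therefore trains only on negatives that all earlier stages failed to reject, so the negative set shrinks instead of dwarfing the positives.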

  20. Sampling and Analysis for Lead in Water and Soil Samples on a University Campus: A Student Research Project.

    ERIC Educational Resources Information Center

    Butala, Steven J.; Zarrabi, Kaveh

    1995-01-01

    Describes a student research project that determined concentrations of lead in water drawn from selected drinking fountains and in selected soil samples on the campus of the University of Nevada, Las Vegas. (18 references) (DDR)

  1. Effects of Sample Selection on Estimates of Economic Impacts of Outdoor Recreation

    Treesearch

    Donald B.K. English

    1997-01-01

    Estimates of the economic impacts of recreation often come from spending data provided by a self-selected subset of a random sample of site visitors. The subset is frequently less than half the onsite sample. Biased vectors of per-trip spending and impact estimates can result if self-selection is related to spending patterns and proper corrective procedures are not...

  2. Automated fluid analysis apparatus and techniques

    DOEpatents

    Szecsody, James E.

    2004-03-16

    An automated device that couples a pair of differently sized sample loops with a syringe pump and a source of degassed water. A fluid sample is mounted at an inlet port and delivered to the sample loops. A selected sample from the sample loops is diluted in the syringe pump with the degassed water and fed to a flow through detector for analysis. The sample inlet is also directly connected to the syringe pump to selectively perform analysis without dilution. The device is airtight and used to detect oxygen-sensitive species, such as dithionite in groundwater following a remedial injection to treat soil contamination.

  3. Sampling in epidemiological research: issues, hazards and pitfalls.

    PubMed

    Tyrer, Stephen; Heyman, Bob

    2016-04-01

    Surveys of people's opinions are fraught with difficulties. It is easier to obtain information from those who respond to text messages or to emails than to attempt to obtain a representative sample. Samples of the population that are selected non-randomly in this way are termed convenience samples as they are easy to recruit. This introduces a sampling bias. Such non-probability samples have merit in many situations, but an epidemiological enquiry is of little value unless a random sample is obtained. If a sufficient number of those selected actually complete a survey, the results are likely to be representative of the population. This editorial describes probability and non-probability sampling methods and illustrates the difficulties and suggested solutions in performing accurate epidemiological research.

  4. Sampling in epidemiological research: issues, hazards and pitfalls

    PubMed Central

    Tyrer, Stephen; Heyman, Bob

    2016-01-01

    Surveys of people's opinions are fraught with difficulties. It is easier to obtain information from those who respond to text messages or to emails than to attempt to obtain a representative sample. Samples of the population that are selected non-randomly in this way are termed convenience samples as they are easy to recruit. This introduces a sampling bias. Such non-probability samples have merit in many situations, but an epidemiological enquiry is of little value unless a random sample is obtained. If a sufficient number of those selected actually complete a survey, the results are likely to be representative of the population. This editorial describes probability and non-probability sampling methods and illustrates the difficulties and suggested solutions in performing accurate epidemiological research. PMID:27087985

  5. Methodology Series Module 5: Sampling Strategies

    PubMed Central

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the ‘sampling method’. There are essentially two types of sampling methods: 1) probability sampling, based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling, based on the researcher's choice of a population that is accessible and available. Some of the non-probability sampling methods are purposive sampling, convenience sampling, and quota sampling. Random sampling (such as a simple random sample or a stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and to state the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term ‘random sample’ when a convenience sample was actually used). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the ‘generalizability’ of the results. In such a scenario, the researcher may want to use ‘purposive sampling’ for the study. PMID:27688438

  6. Sampling Methods in Cardiovascular Nursing Research: An Overview.

    PubMed

    Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie

    2014-01-01

    Cardiovascular nursing research covers a wide array of topics, from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling: simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion in the research study, and include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from cardiovascular research. Understanding the differences in sampling techniques may aid nurses in the effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
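Two of the probability designs listed above, systematic and cluster sampling, can be sketched as follows. The roster, the ward size, and the target sample sizes are all hypothetical, chosen only to make the mechanics visible.

```python
import random

random.seed(1)
N = 240  # hypothetical patient roster, ordered by admission date
roster = list(range(N))

# Systematic sampling: every k-th element after a random start in [0, k).
def systematic_sample(frame, n):
    k = len(frame) // n
    start = random.randrange(k)
    return frame[start::k][:n]

# Cluster sampling: randomly select whole clusters (e.g. hospital wards),
# then include every element of the chosen clusters.
def cluster_sample(frame, cluster_size, n_clusters):
    clusters = [frame[i:i + cluster_size]
                for i in range(0, len(frame), cluster_size)]
    chosen = random.sample(clusters, n_clusters)
    return [x for c in chosen for x in c]

sys_s = systematic_sample(roster, 24)  # sampling interval k = 10
clu_s = cluster_sample(roster, 12, 2)  # 2 "wards" of 12 patients each
```

The systematic draw is evenly spaced through the ordered frame, which is efficient in the field but can bias results if the ordering itself is periodic; the cluster draw trades some precision for the practicality of visiting only two wards.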

  7. INCORPORATING PRIOR KNOWLEDGE IN ENVIRONMENTAL SAMPLING: RANKED SET SAMPLING AND OTHER DOUBLE SAMPLING PROCEDURES

    EPA Science Inventory

    Environmental sampling can be difficult and expensive to carry out. Those taking the samples would like to integrate their knowledge of the system of study or their judgment about the system into the sample selection process to decrease the number of necessary samples. However,...

  8. Selection of sampling rate for digital control of aircrafts

    NASA Technical Reports Server (NTRS)

    Katz, P.; Powell, J. D.

    1974-01-01

    The considerations in selecting the sample rates for digital control of aircraft are identified and evaluated using the optimal discrete method. A high-performance aircraft model that includes a bending mode and wind gusts was studied. The following factors that influence the selection of the sampling rates were identified: (1) the time and roughness of the response to control inputs; (2) the response to external disturbances; and (3) the sensitivity to variations of parameters. It was found that the time response to a control input and the response to external disturbances limit the selection of the sampling rate. The optimal discrete regulator, the steady-state Kalman filter, and the mean response to external disturbances are calculated.

  9. Learning from Past Classification Errors: Exploring Methods for Improving the Performance of a Deep Learning-based Building Extraction Model through Quantitative Analysis of Commission Errors for Optimal Sample Selection

    NASA Astrophysics Data System (ADS)

    Swan, B.; Laverdiere, M.; Yang, L.

    2017-12-01

    In the past five years, deep convolutional neural networks (CNNs) have been increasingly favored for computer vision applications due to their high accuracy and ability to generalize well in very complex problems; however, details of how they function, and in turn how they may be optimized, are still imperfectly understood. In particular, their complex and highly nonlinear network architecture, including many hidden layers and self-learned parameters, as well as its mathematical implications, presents open questions about how to effectively select training data. Without knowledge of the exact ways the model processes and transforms its inputs, intuition alone may fail as a guide to selecting highly relevant training samples. Working in the context of improving a CNN-based building extraction model used for the LandScan USA gridded population dataset, we have approached this problem by developing a semi-supervised, highly scalable approach to select training samples from a dataset of identified commission errors. Due to the large scope of this project, tens of thousands of potential samples could be derived from identified commission errors. To efficiently trim those samples down to a manageable and effective set for creating additional training samples, we statistically summarized the spectral characteristics of areas with commission errors at the image-tile level and grouped these tiles using affinity propagation. Highly representative members of each commission-error cluster were then used to select sites for training sample creation. The model will be incrementally re-trained with the new training data to allow for an assessment of how the addition of different types of samples affects model performance, such as precision and recall rates. By using quantitative analysis and data clustering techniques to select highly relevant training samples, we hope to improve model performance in a manner that is resource efficient, both in terms of the training process and of sample creation.
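The tile-grouping-then-exemplar step described above can be illustrated in miniature. The sketch below is an assumption-laden stand-in, not the study's pipeline: the per-tile features are invented, and a tiny k-means replaces affinity propagation (which, unlike k-means, does not require choosing the number of clusters in advance). The tile nearest each cluster centre plays the role of affinity propagation's exemplar.

```python
import math
import random

random.seed(2)

# Hypothetical per-tile summaries: (mean spectral value, commission-error rate),
# drawn from two loose groups to mimic two error regimes.
tiles = [(random.gauss(mu, 0.05), random.gauss(err, 0.02))
         for mu, err in [(0.2, 0.1)] * 30 + [(0.6, 0.3)] * 30]

def kmeans(points, k, iters=20):
    # Minimal Lloyd's algorithm; stands in for affinity propagation here.
    centers = random.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda j: math.dist(p, centers[j]))
            groups[i].append(p)
        centers = [tuple(sum(c) / len(g) for c in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups

centers, groups = kmeans(tiles, 2)

# Pick the most representative tile of each cluster (closest to its centre);
# these would be the sites chosen for new training-sample creation.
exemplars = [min(g, key=lambda p: math.dist(p, c))
             for c, g in zip(centers, groups) if g]
```

The point of the exemplar step is economy: instead of labeling tens of thousands of error tiles, only a handful of highly representative ones are promoted to training-sample sites.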

  10. Preparation of novel alumina nanowire solid-phase microextraction fiber coating for ultra-selective determination of volatile esters and alcohols from complicated food samples.

    PubMed

    Zhang, Zhuomin; Ma, Yunjian; Wang, Qingtang; Chen, An; Pan, Zhuoyan; Li, Gongke

    2013-05-17

    A novel alumina nanowire (ANW) solid-phase microextraction (SPME) fiber coating was prepared by a simple and rapid anodization-chemical etching method for the ultra-selective determination of volatile esters and alcohols from complicated food samples. Preparation conditions for the ANW SPME fiber coating, including corrosion solution concentration and corrosion time, were optimized in detail for better surface morphology and higher surface area based on scanning electron microscopy (SEM). Under the optimum conditions, a homogeneous alumina nanowire structure was achieved with an average thickness of around 20 μm. Compared with most commercial SPME fiber coatings, ANW SPME fiber coatings achieved higher extraction capacity and special selectivity for volatile esters and alcohols. Finally, an efficient gas sampling technique based on the ANW SPME fiber coating was established and successfully applied to the ultra-selective determination of trace volatile esters and alcohols from complicated banana and fermented glutinous rice samples, coupled with gas chromatography/mass spectrometry (GC/MS) detection. Interestingly, 25 esters and 2 alcohols among the 30 banana volatile organic compounds (VOCs) identified, and 4 esters and 7 alcohols among the 13 identified VOCs of fermented glutinous rice, were selectively sampled by ANW SPME fiber coatings. Furthermore, new analytical methods for the determination of some typical volatile esters and alcohols from banana and fermented glutinous rice samples at specific storage or brewing phases were developed and validated. Good recoveries for banana and fermented glutinous rice samples were achieved in the ranges of 108-115% with relative standard deviations (RSDs) of 2.6-6.7% and 80.0-91.8% with RSDs of 0.3-1.3% (n = 3), respectively. This work proposed a novel and efficient gas sampling technique, ANW SPME, which is quite suitable for ultra-selectively sampling trace volatile esters and alcohols from complicated food samples. Copyright © 2013 Elsevier B.V. All rights reserved.

  11. Local density measurement of additive manufactured copper parts by instrumented indentation

    NASA Astrophysics Data System (ADS)

    Santo, Loredana; Quadrini, Fabrizio; Bellisario, Denise; Tedde, Giovanni Matteo; Zarcone, Mariano; Di Domenico, Gildo; D'Angelo, Pierpaolo; Corona, Diego

    2018-05-01

    Instrumented flat indentation has been used to evaluate local density of additive manufactured (AM) copper samples with different relative density. Indentations were made by using tungsten carbide (WC) flat pins with 1 mm diameter. Pure copper powders were used in a selective laser melting (SLM) machine to produce samples to test. By changing process parameters, samples density was changed from the relative density of 63% to 71%. Indentation tests were performed on the xy surface of the AM samples. In order to make a correlation between indentation test results and sample density, the indentation pressure at fixed displacement was selected. Results show that instrumented indentation is a valid technique to measure density distribution along the geometry of an SLM part. In fact, a linear trend between indentation pressure and sample density was found for the selected density range.

  12. Ground-water-quality and ground-water-level data, Bernalillo County, central New Mexico, 1990-1993

    USGS Publications Warehouse

    Kues, G.E.; Garcia, B.M.

    1995-01-01

    Ground-water-quality and ground-water-level data were collected in four unincorporated areas of Bernalillo County during 1990-93. Twenty wells in the east mountain area of Bernalillo County were sampled approximately monthly between January 1990 and June 1993. The water samples were analyzed for concentrations of chloride and selected nutrient species; many of the samples also were analyzed for concentrations of total organic carbon and dissolved boron and iron. Eleven wells northeast of the city of Albuquerque, 20 wells in the Rio Grande Valley immediately north of Albuquerque, and 30 wells in the Rio Grande Valley immediately south of Albuquerque were sampled once each between December 1992 and September 1993; all water samples were analyzed for chloride and selected nutrient species, and selected samples from wells in the north and south valley areas were also analyzed for major dissolved constituents, iron, manganese, and methylene blue active substances. Samples from 10 of the wells in the north and south valley areas were analyzed for 47 selected pesticides. Field measurements of specific conductance, pH, temperature, and alkalinity were made on most samples at the time of sample collection. Water levels also were measured at the time of sample collection when possible. Results of the monthly water-quality and water-level monitoring in the east mountain area of Bernalillo County are presented in graphical form. Water-quality and water-level data collected from the other areas are presented in tabular form.

  13. Compendium of selected methods for sampling and analysis at geothermal facilities

    NASA Astrophysics Data System (ADS)

    Kindle, C. H.; Pool, K. H.; Ludwick, J. D.; Robertson, D. E.

    1984-06-01

    An independent study of the field has resulted in a compilation of the best methods for sampling, preservation, and analysis of potential pollutants from geothermally fueled electric power plants. These methods were selected as the most usable over the range of applications commonly experienced at the various geothermal plant sample locations. In addition to plant and well piping, techniques for sampling cooling towers, ambient gases, solids, and surface and subsurface waters are described. Emphasis is placed on the use of sampling probes to extract samples from heterogeneous flows. Certain sampling points, constituents, and phases of plant operation are more amenable than others to quality-assurance improvement in the emission measurements, and are so identified.

  14. Fuzziness-based active learning framework to enhance hyperspectral image classification performance for discriminative and generative classifiers

    PubMed Central

    2018-01-01

    Hyperspectral image classification with a limited number of training samples without loss of accuracy is desirable, as collecting such data is often expensive and time-consuming. However, classifiers trained with limited samples usually end up with a large generalization error. To overcome the said problem, we propose a fuzziness-based active learning framework (FALF), in which we implement the idea of selecting optimal training samples to enhance generalization performance for two different kinds of classifiers, discriminative and generative (e.g. SVM and KNN). The optimal samples are selected by first estimating the boundary of each class and then calculating the fuzziness-based distance between each sample and the estimated class boundaries. Those samples that are at smaller distances from the boundaries and have higher fuzziness are chosen as target candidates for the training set. Through detailed experimentation on three publicly available datasets, we showed that when trained with the proposed sample selection framework, both classifiers achieved higher classification accuracy and lower processing time with a small amount of training data, as opposed to the case where the training samples were selected randomly. Our experiments demonstrate the effectiveness of our proposed method, which compares favorably with the state-of-the-art methods. PMID:29304512
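The core selection idea, rank unlabeled samples by how ambiguous the current classifier finds them and query the most ambiguous, can be sketched as follows. This is a minimal sketch, not the paper's FALF implementation: the 1-D two-class data, the KNN probability estimator, and the batch size of 10 are all assumptions, and the fuzziness measure used here is the binary-entropy form (maximal at p = 0.5).

```python
import math
import random

random.seed(3)

# Hypothetical 1-D two-class pool; the class boundary sits near x = 0.5.
pool = ([(random.gauss(0.3, 0.15), 0) for _ in range(100)]
        + [(random.gauss(0.7, 0.15), 1) for _ in range(100)])
labeled = random.sample(pool, 20)  # small initial training set

def knn_proba(x, train, k=5):
    # Soft class-1 probability from the k nearest labeled neighbours.
    nearest = sorted(train, key=lambda t: abs(t[0] - x))[:k]
    return sum(y for _, y in nearest) / k

def fuzziness(p):
    # Binary entropy: 0 for confident predictions, 1 at p = 0.5.
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# Rank the unlabeled pool by fuzziness; the top of the ranking lies
# nearest the estimated class boundary and is queried first.
unlabeled = [s for s in pool if s not in labeled]
ranked = sorted(unlabeled,
                key=lambda s: fuzziness(knn_proba(s[0], labeled)),
                reverse=True)
query = ranked[:10]
```

Compared with random selection, this spends the labeling budget on boundary-region samples, which is what drives the accuracy gains the abstract reports.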

  15. Screening of ground water samples for volatile organic compounds using a portable gas chromatograph

    USGS Publications Warehouse

    Buchmiller, R.C.

    1989-01-01

    A portable gas chromatograph was used to screen 32 ground water samples for volatile organic compounds. Seven screened samples were positive; four of the seven had volatile organic substances identified by second-column confirmation. Four of the seven positive screened samples also tested positive in laboratory analyses of duplicate samples. No volatile organic compounds were detected in laboratory analyses of samples that headspace screening indicated to be negative. Samples that contained volatile organic compounds, as identified by laboratory analysis, and that contained a volatile organic compound present in a standard of selected compounds were correctly identified by using the portable gas chromatograph. Comparisons of screened-sample data with laboratory data indicate the ability to detect selected volatile organic compounds at concentrations of about 1 microgram per liter in the headspace of water samples by use of a portable gas chromatograph.

  16. Selected Factors Related to Selective Service Rejection and Rejection Rate in Delaware (1967): A Study of the Characteristics of Young Men Failing to Meet Mental Qualifications for Military Service.

    ERIC Educational Resources Information Center

    Price, Jay R.

    This study sought information about selective service rejection in Delaware, specifically rejectee characteristics, reasons for rejection, and the high rejection rate in Delaware. The basic design was a modified case study method in which a sample of individual records were examined. Differences between this sample and national samples were tested…

  17. Edge Effects in Line Intersect Sampling With

    Treesearch

    David L. R. Affleck; Timothy G. Gregoire; Harry T. Valentine

    2005-01-01

    Transects consisting of multiple, connected segments with a prescribed configuration are commonly used in ecological applications of line intersect sampling. The transect configuration has implications for the probability with which population elements are selected and for how the selection probabilities can be modified by the boundary of the tract being sampled. As...

  18. The Impact of Selection, Gene Conversion, and Biased Sampling on the Assessment of Microbial Demography.

    PubMed

    Lapierre, Marguerite; Blin, Camille; Lambert, Amaury; Achaz, Guillaume; Rocha, Eduardo P C

    2016-07-01

    Recent studies have linked demographic changes and epidemiological patterns in bacterial populations using coalescent-based approaches. We identified 26 studies using skyline plots and found that 21 inferred overall population expansion. This surprising result led us to analyze the impact of natural selection, recombination (gene conversion), and sampling biases on demographic inference using skyline plots and site frequency spectra (SFS). Forward simulations based on biologically relevant parameters from Escherichia coli populations showed that theoretical arguments on the detrimental impact of recombination, and especially natural selection, on the reconstructed genealogies cannot be ignored in practice. In fact, both processes systematically lead to spurious interpretations of population expansion in skyline plots (and in SFS for selection). Weak purifying selection, and especially positive selection, had important effects on skyline plots, showing patterns akin to those of population expansions. State-of-the-art techniques to remove recombination further amplified these biases. We simulated three common sampling biases in microbiological research: uniform, clustered, and mixed sampling. Alone, or together with recombination and selection, they further mislead demographic inferences, producing almost any possible skyline shape or SFS. Interestingly, sampling sub-populations also affected skyline plots and SFS, because the coalescent rates of populations and their sub-populations had different distributions. This study suggests that extreme caution is needed to infer demographic changes solely based on reconstructed genealogies. We suggest that the development of novel sampling strategies and the joint analyses of diverse population genetic methods are strictly necessary to estimate demographic changes in populations where selection, recombination, and biased sampling are present. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  19. Effect of finite sample size on feature selection and classification: a simulation study.

    PubMed

    Way, Ted W; Sahiner, Berkman; Hadjiiski, Lubomir M; Chan, Heang-Ping

    2010-02-01

    The small number of samples available for training and testing is often the limiting factor in finding the most effective features and designing an optimal computer-aided diagnosis (CAD) system. Training on a limited set of samples introduces bias and variance in the performance of a CAD system relative to that trained with an infinite sample size. In this work, the authors conducted a simulation study to evaluate the performances of various combinations of classifiers and feature selection techniques and their dependence on the class distribution, dimensionality, and the training sample size. The understanding of these relationships will facilitate development of effective CAD systems under the constraint of limited available samples. Three feature selection techniques, the stepwise feature selection (SFS), sequential floating forward search (SFFS), and principal component analysis (PCA), and two commonly used classifiers, Fisher's linear discriminant analysis (LDA) and support vector machine (SVM), were investigated. Samples were drawn from multidimensional feature spaces of multivariate Gaussian distributions with equal or unequal covariance matrices and unequal means, and with equal covariance matrices and unequal means estimated from a clinical data set. Classifier performance was quantified by the area under the receiver operating characteristic curve (Az). The mean Az values obtained by resubstitution and hold-out methods were evaluated for training sample sizes ranging from 15 to 100 per class. The number of simulated features available for selection was chosen to be 50, 100, and 200. It was found that the relative performance of the different combinations of classifier and feature selection method depends on the feature space distributions, the dimensionality, and the available training sample sizes. The LDA and SVM with radial kernel performed similarly for most of the conditions evaluated in this study, although the SVM classifier showed a slightly higher hold-out performance than LDA for some conditions and vice versa for other conditions. PCA was comparable to or better than SFS and SFFS for LDA at small sample sizes, but inferior for SVM with polynomial kernel. For the class distributions simulated from clinical data, PCA did not show advantages over the other two feature selection methods. Under this condition, the SVM with radial kernel performed better than the LDA when few training samples were available, while LDA performed better when a large number of training samples were available. None of the investigated feature selection-classifier combinations provided consistently superior performance under the studied conditions for different sample sizes and feature space distributions. In general, the SFFS method was comparable to the SFS method, while PCA may have an advantage for Gaussian feature spaces with unequal covariance matrices. The performance of the SVM with radial kernel was better than, or comparable to, that of the SVM with polynomial kernel under most conditions studied.
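The resubstitution-versus-hold-out gap at small sample sizes can be reproduced in miniature. The sketch below is not the study's simulation: it substitutes a nearest-centroid classifier for LDA, invents the dimensionality and sample sizes, and scores plain accuracy rather than Az, but it shows the same mechanism, a high-dimensional Gaussian problem where training on 15 samples per class makes the resubstitution estimate optimistic relative to an independent test set.

```python
import random
import statistics

random.seed(4)

DIM, N_TRAIN, N_TEST = 50, 15, 500  # few training samples, many features

def draw(n, label):
    # Only feature 0 carries class signal; the other 49 dimensions are
    # pure noise, mimicking a feature pool with few informative features.
    shift = 1.0 if label else 0.0
    return [([random.gauss(shift if d == 0 else 0.0, 1.0)
              for d in range(DIM)], label) for _ in range(n)]

train = draw(N_TRAIN, 0) + draw(N_TRAIN, 1)
test = draw(N_TEST, 0) + draw(N_TEST, 1)

def centroid(samples):
    return [statistics.fmean(col) for col in zip(*(x for x, _ in samples))]

c0 = centroid([s for s in train if s[1] == 0])
c1 = centroid([s for s in train if s[1] == 1])

def predict(x):
    # Nearest-centroid rule: assign to the closer class mean.
    d0 = sum((a - b) ** 2 for a, b in zip(x, c0))
    d1 = sum((a - b) ** 2 for a, b in zip(x, c1))
    return int(d1 < d0)

def accuracy(data):
    return sum(predict(x) == y for x, y in data) / len(data)

resub, holdout = accuracy(train), accuracy(test)  # resub is optimistic
```

Because each training point pulls its own class centroid toward itself across 50 noisy dimensions, `resub` typically overstates the accuracy that `holdout` reveals, which is exactly the finite-sample bias the study quantifies.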

  20. Journal of Chemical Education: Software.

    ERIC Educational Resources Information Center

    Journal of Chemical Education, 1988

    1988-01-01

    Describes a chemistry software program that emulates a modern binary gradient HPLC system with reversed-phase column behavior. Allows for solvent selection, adjustment of the gradient program, column selection, detector selection, handling of computer sample data, and sample preparation. (MVL)

  1. Mendelian breeding units versus standard sampling strategies: Mitochondrial DNA variation in southwest Sardinia

    PubMed Central

    Sanna, Daria; Pala, Maria; Cossu, Piero; Dedola, Gian Luca; Melis, Sonia; Fresu, Giovanni; Morelli, Laura; Obinu, Domenica; Tonolo, Giancarlo; Secchi, Giannina; Triunfo, Riccardo; Lorenz, Joseph G.; Scheinfeldt, Laura; Torroni, Antonio; Robledo, Renato; Francalacci, Paolo

    2011-01-01

    We report a sampling strategy based on Mendelian Breeding Units (MBUs), representing an interbreeding group of individuals sharing a common gene pool. The identification of MBUs is crucial for case-control experimental design in association studies. The aim of this work was to evaluate the possible existence of bias in terms of genetic variability and haplogroup frequencies in the MBU sample, due to severe sample selection. In order to reach this goal, the MBU sampling strategy was compared to a standard selection of individuals according to their surname and place of birth. We analysed mitochondrial DNA variation (first hypervariable segment and coding region) in unrelated healthy subjects from two different areas of Sardinia: the area around the town of Cabras and the western Campidano area. No statistically significant differences were observed when the two sampling methods were compared, indicating that the stringent sample selection needed to establish a MBU does not alter original genetic variability and haplogroup distribution. Therefore, the MBU sampling strategy can be considered a useful tool in association studies of complex traits. PMID:21734814

  2. Field-based random sampling without a sampling frame: control selection for a case-control study in rural Africa.

    PubMed

    Crampin, A C; Mwinuka, V; Malema, S S; Glynn, J R; Fine, P E

    2001-01-01

    Selection bias, particularly of controls, is common in case-control studies and may materially affect the results. Methods of control selection should be tailored both for the risk factors and disease under investigation and for the population being studied. We present here a control selection method devised for a case-control study of tuberculosis in rural Africa (Karonga, northern Malawi) that selects an age/sex frequency-matched random sample of the population, with a geographical distribution in proportion to the population density. We also present an audit of the selection process, and discuss the potential of this method in other settings.
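The density-proportional first stage of such a design can be sketched with weighted random selection. The enumeration areas, their population counts, and the number of control draws below are entirely hypothetical; in the actual field setting the frame is geographic (maps and walked transects) rather than a list.

```python
import random
from collections import Counter

random.seed(6)

# Hypothetical enumeration areas with rough population counts.
areas = {"A": 1200, "B": 300, "C": 500}

# Stage 1: pick control-selection sites with probability proportional to
# population, so the geographic spread of controls mirrors density.
names, weights = zip(*areas.items())
picked = random.choices(names, weights=weights, k=100)
counts = Counter(picked)

# Stage 2 (field step, not coded here): at each chosen site, a team would
# enrol a resident matching the case's age/sex frequency-matching stratum.
```

Densely populated areas receive proportionally more control draws, which is the property that keeps the control sample's geography representative without a household sampling frame.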

  3. The predictive validity of selection for entry into postgraduate training in general practice: evidence from three longitudinal studies

    PubMed Central

    Patterson, Fiona; Lievens, Filip; Kerrin, Máire; Munro, Neil; Irish, Bill

    2013-01-01

    Background The selection methodology for UK general practice is designed to accommodate several thousand applicants per year and targets six core attributes identified in a multi-method job-analysis study. Aim To evaluate the predictive validity of selection methods for entry into postgraduate training, comprising a clinical problem-solving test, a situational judgement test, and a selection centre. Design and setting A three-part longitudinal predictive validity study of selection into training for UK general practice. Method In sample 1, participants were junior doctors applying for training in general practice (n = 6824). In sample 2, participants were GP registrars 1 year into training (n = 196). In sample 3, participants were GP registrars sitting the licensing examination after 3 years, at the end of training (n = 2292). The outcome measures include: assessor ratings of performance in a selection centre comprising job simulation exercises (sample 1); supervisor ratings of trainee job performance 1 year into training (sample 2); and licensing examination results, including an applied knowledge examination and a 12-station clinical skills objective structured clinical examination (OSCE; sample 3). Results Performance ratings at selection predicted subsequent supervisor ratings of job performance 1 year later. Selection results also significantly predicted performance on both the clinical skills OSCE and applied knowledge examination for licensing at the end of training. Conclusion In combination, these longitudinal findings provide good evidence of the predictive validity of the selection methods, and are the first reported for entry into postgraduate training. Results show that the best predictor of work performance and training outcomes is a combination of a clinical problem-solving test, a situational judgement test, and a selection centre. Implications for selection methods for all postgraduate specialties are considered. PMID:24267856

  4. The predictive validity of selection for entry into postgraduate training in general practice: evidence from three longitudinal studies.

    PubMed

    Patterson, Fiona; Lievens, Filip; Kerrin, Máire; Munro, Neil; Irish, Bill

    2013-11-01

    The selection methodology for UK general practice is designed to accommodate several thousand applicants per year and targets six core attributes identified in a multi-method job-analysis study. To evaluate the predictive validity of selection methods for entry into postgraduate training, comprising a clinical problem-solving test, a situational judgement test, and a selection centre, a three-part longitudinal predictive validity study of selection into training for UK general practice was conducted. In sample 1, participants were junior doctors applying for training in general practice (n = 6824). In sample 2, participants were GP registrars 1 year into training (n = 196). In sample 3, participants were GP registrars sitting the licensing examination after 3 years, at the end of training (n = 2292). The outcome measures include: assessor ratings of performance in a selection centre comprising job simulation exercises (sample 1); supervisor ratings of trainee job performance 1 year into training (sample 2); and licensing examination results, including an applied knowledge examination and a 12-station clinical skills objective structured clinical examination (OSCE; sample 3). Performance ratings at selection predicted subsequent supervisor ratings of job performance 1 year later. Selection results also significantly predicted performance on both the clinical skills OSCE and applied knowledge examination for licensing at the end of training. In combination, these longitudinal findings provide good evidence of the predictive validity of the selection methods, and are the first reported for entry into postgraduate training. Results show that the best predictor of work performance and training outcomes is a combination of a clinical problem-solving test, a situational judgement test, and a selection centre. Implications for selection methods for all postgraduate specialties are considered.

  5. A Trade Study and Metric for Penetration and Sampling Devices for Possible Use on the NASA 2003 and 2005 Mars Sample Return Missions

    NASA Technical Reports Server (NTRS)

    McConnell, Joshua B.

    2000-01-01

    The scientific exploration of Mars will require the collection and return of subterranean samples to Earth for examination. This necessitates the use of some type of device or devices that possesses the ability to effectively penetrate the Martian surface, collect suitable samples, and return them to the surface in a manner consistent with imposed scientific constraints. The first opportunity for such a device will occur on the 2003 and 2005 Mars Sample Return missions being performed by NASA. This paper reviews the work completed on the compilation of a database containing viable penetrating and sampling devices, the performance of a system-level trade study comparing selected devices to a set of prescribed parameters, and the employment of a metric for the evaluation and ranking of the traded penetration and sampling devices, with respect to possible usage on the 2003 and 2005 sample return missions. The trade study performed is based on a select set of scientific, engineering, programmatic, and socio-political criteria. The use of a metric for the various penetration and sampling devices will act to expedite current and future device selection.

  6. Integrated fluorescence analysis system

    DOEpatents

    Buican, Tudor N.; Yoshida, Thomas M.

    1992-01-01

    An integrated fluorescence analysis system enables a component part of a sample to be virtually sorted within a sample volume after a spectrum of the component part has been identified from a fluorescence spectrum of the entire sample in a flow cytometer. Birefringent optics enables the entire spectrum to be resolved into a set of numbers representing the intensity of spectral components of the spectrum. One or more spectral components are selected to program a scanning laser microscope, preferably a confocal microscope, whereby the spectrum from individual pixels or voxels in the sample can be compared. Individual pixels or voxels containing the selected spectral components are identified and an image may be formed to show the morphology of the sample with respect to only those components having the selected spectral components. There is no need for any physical sorting of the sample components to obtain the morphological information.

  7. The Chandra Strong Lens Sample: Revealing Baryonic Physics In Strong Lensing Selected Clusters

    NASA Astrophysics Data System (ADS)

    Bayliss, Matthew

    2017-08-01

    We propose for Chandra imaging of the hot intra-cluster gas in a unique new sample of 29 galaxy clusters selected purely on their strong gravitational lensing signatures. This will be the first program targeting a purely strong lensing selected cluster sample, enabling new comparisons between the ICM properties and scaling relations of strong lensing and mass/ICM selected cluster samples. Chandra imaging, combined with high precision strong lens models, ensures powerful constraints on the distribution and state of matter in the cluster cores. This represents a novel angle from which we can address the role played by baryonic physics, the infamous 'gastrophysics', in shaping the cores of massive clusters, and opens up an exciting new galaxy cluster discovery space with Chandra.

  9. Construction of a remotely sensed area sampling frame for Southern Brazil

    NASA Technical Reports Server (NTRS)

    Fecso, R.; Gardner, W.; Hale, B.; Johnson, V.; Pavlasek, S. (Principal Investigator)

    1982-01-01

    A remotely sensed area sampling frame was constructed for selected areas in Southern Brazil. The sampling unit information was stored in digital form in a latitudinal/longitudinal characterized population. Computerized sampling procedures were developed which allow for flexibility in sample unit specifications and sampling designs.

  10. Adaptive web sampling.

    PubMed

    Thompson, Steven K

    2006-12-01

    A flexible class of adaptive sampling designs is introduced for sampling in network and spatial settings. In the designs, selections are made sequentially with a mixture distribution based on an active set that changes as the sampling progresses, using network or spatial relationships as well as sample values. The new designs have certain advantages compared with previously existing adaptive and link-tracing designs, including control over sample sizes and of the proportion of effort allocated to adaptive selections. Efficient inference involves averaging over sample paths consistent with the minimal sufficient statistic. A Markov chain resampling method makes the inference computationally feasible. The designs are evaluated in network and spatial settings using two empirical populations: a hidden human population at high risk for HIV/AIDS and an unevenly distributed bird population.
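    The sequential mixture selection described in the abstract can be sketched in a minimal form. This is an illustration of the general idea, not Thompson's exact design: the fixed mixture weight `d`, the data structures, and the function name are all assumptions.

```python
import random

def adaptive_web_sample(units, links, n, d=0.8, seed=0):
    """Draw a sample of size n sequentially.

    At each step, with probability d, follow a link from the current
    active (already sampled) set to an unsampled unit (adaptive
    selection); otherwise select uniformly at random from all
    unsampled units (conventional selection).
    """
    rng = random.Random(seed)
    sample = [rng.choice(list(units))]
    while len(sample) < n:
        unsampled = [u for u in units if u not in sample]
        if not unsampled:
            break
        # Links from sampled units that lead to unsampled units.
        frontier = [v for u in sample for v in links.get(u, [])
                    if v not in sample]
        if frontier and rng.random() < d:
            sample.append(rng.choice(frontier))    # adaptive step
        else:
            sample.append(rng.choice(unsampled))   # conventional step
    return sample
```

    Raising `d` shifts effort toward adaptive, link-following selections, which is the kind of design-level control over allocation that the abstract refers to.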

  11. 40 CFR 94.505 - Sample selection for testing.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... engine family. The required sample size is zero if a manufacturer's projected annual production for all Category 1 engine families is less than 100. (ii) The required sample size for a Category 2 engine family... manufacturer will begin to select engines from each Category 1 and Category 2 engine family for production line...

  12. Sample Selection in Randomized Experiments: A New Method Using Propensity Score Stratified Sampling

    ERIC Educational Resources Information Center

    Tipton, Elizabeth; Hedges, Larry; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Caverly, Sarah

    2014-01-01

    Randomized experiments are often seen as the "gold standard" for causal research. Despite the fact that experiments use random assignment to treatment conditions, units are seldom selected into the experiment using probability sampling. Very little research on experimental design has focused on how to make generalizations to well-defined…

  13. Sample Integrity Evaluation and EPA Method 325b Interlaboratory Comparison for Select Volatile Organic Compounds Collected Diffusively on Carbopack X Sorbent Tubes

    EPA Science Inventory

    Sample integrity evaluations and inter-laboratory comparisons were conducted in application of U.S. Environmental Protection Agency (EPA) Methods 325A/B for monitoring benzene and additional selected volatile organic compounds (VOCs) using passive-diffusive Carbopack X tube sample...

  14. The Atacama Cosmology Telescope: Physical Properties and Purity of a Galaxy Cluster Sample Selected Via the Sunyaev-Zel'Dovich Effect

    NASA Technical Reports Server (NTRS)

    Menanteau, Felipe; Gonzalez, Jorge; Juin, Jean-Baptiste; Marriage, Tobias; Reese, Erik D.; Acquaviva, Viviana; Aguirre, Paula; Appel, John Willam; Baker, Andrew J.; Barrientos, L. Felipe; hide

    2010-01-01

    We present optical and X-ray properties for the first confirmed galaxy cluster sample selected by the Sunyaev-Zel'dovich Effect from 148 GHz maps over 455 square degrees of sky made with the Atacama Cosmology Telescope. These maps, coupled with multi-band imaging on 4-meter-class optical telescopes, have yielded a sample of 23 galaxy clusters with redshifts between 0.118 and 1.066. Of these 23 clusters, 10 are newly discovered. The selection of this sample is approximately mass limited and essentially independent of redshift. We provide optical positions, images, redshifts and X-ray fluxes and luminosities for the full sample, and X-ray temperatures of an important subset. The mass limit of the full sample is around 8.0 x 10^14 solar masses, with a number distribution that peaks around a redshift of 0.4. For the 10 highest significance SZE-selected cluster candidates, all of which are optically confirmed, the mass threshold is 1 x 10^15 solar masses and the redshift range is 0.167 to 1.066. Archival observations from Chandra, XMM-Newton, and ROSAT provide X-ray luminosities and temperatures that are broadly consistent with this mass threshold. Our optical follow-up procedure also allowed us to assess the purity of the ACT cluster sample. Eighty (one hundred) percent of the 148 GHz candidates with signal-to-noise ratios greater than 5.1 (5.7) are confirmed as massive clusters. The reported sample represents one of the largest SZE-selected samples of massive clusters over all redshifts within a cosmologically-significant survey volume, which will enable cosmological studies as well as future studies on the evolution, morphology, and stellar populations in the most massive clusters in the Universe.

  15. Intermediate BL Lac objects

    NASA Astrophysics Data System (ADS)

    Bondi, M.; Marchã, M. J. M.; Dallacasa, D.; Stanghellini, C.

    2001-08-01

    The 200-mJy sample, defined by Marchã et al., contains about 60 nearby, northern, flat-spectrum radio sources. In particular, the sample has proved effective at finding nearby radio-selected BL Lac objects with radio luminosities comparable to those of X-ray-selected objects, and low-luminosity flat-spectrum weak emission-line radio galaxies (WLRGs). The 200-mJy sample contains 23 BL Lac objects (including 6 BL Lac candidates) and 19 WLRGs. We will refer to these subsamples as the 200-mJy BL Lac sample and the 200-mJy WLRG sample, respectively. We have started a systematic analysis of the morphological pc-scale properties of the 200-mJy radio sources using VLBI observations. This paper presents VLBI observations at 5 and 1.6 GHz of 14 BL Lac objects and WLRGs selected from the 200-mJy sample. The pc-scale morphology of these objects is briefly discussed. We derive the radio beaming parameters of the 200-mJy BL Lac objects and WLRGs and compare them with those of other BL Lac samples and with a sample of FR I radio galaxies. The overall broad-band radio, optical and X-ray properties of the 200-mJy BL Lac sample are discussed and compared with those of other BL Lac samples, radio- and X-ray-selected. We find that the 200-mJy BL Lac objects fill the gap between HBL and LBL objects in the colour-colour plot, and have intermediate α_RO and α_OX values, as expected in the spectral energy distribution unification scenario. Finally, we briefly discuss the role of the WLRGs.

  16. Sampling designs for HIV molecular epidemiology with application to Honduras.

    PubMed

    Shepherd, Bryan E; Rossini, Anthony J; Soto, Ramon Jeremias; De Rivera, Ivette Lorenzana; Mullins, James I

    2005-11-01

    Proper sampling is essential to characterize the molecular epidemiology of human immunodeficiency virus (HIV). HIV sampling frames are difficult to identify, so most studies use convenience samples. We discuss statistically valid and feasible sampling techniques that overcome some of the potential for bias due to convenience sampling and ensure better representation of the study population. We employ a sampling design called stratified cluster sampling. This first divides the population into geographical and/or social strata. Within each stratum, a population of clusters is chosen from groups, locations, or facilities where HIV-positive individuals might be found. Some clusters are randomly selected within strata and individuals are randomly selected within clusters. Variation and cost help determine the number of clusters and the number of individuals within clusters that are to be sampled. We illustrate the approach through a study designed to survey the heterogeneity of subtype B strains in Honduras.
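    The stratified cluster design described above (strata, then clusters within strata, then individuals within clusters) can be sketched as follows. The data layout and names are illustrative assumptions, and the design-based choice of how many clusters and individuals to draw (driven by variation and cost in the study) is taken as given.

```python
import random

def stratified_cluster_sample(strata, n_clusters, n_per_cluster, seed=0):
    """Two-stage stratified cluster sampling.

    strata maps stratum name -> {cluster name -> list of individuals}.
    Within each stratum, clusters are selected at random; within each
    selected cluster, individuals are selected at random.
    """
    rng = random.Random(seed)
    selected = {}
    for stratum, clusters in strata.items():
        chosen = rng.sample(sorted(clusters), min(n_clusters, len(clusters)))
        selected[stratum] = {
            c: rng.sample(clusters[c], min(n_per_cluster, len(clusters[c])))
            for c in chosen
        }
    return selected
```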

  17. Method and apparatus for chemical and topographical microanalysis

    NASA Technical Reports Server (NTRS)

    Kossakovski, Dmitri A. (Inventor); Baldeschwieler, John D. (Inventor); Beauchamp, Jesse L. (Inventor)

    2002-01-01

    A scanning probe microscope is combined with a laser induced breakdown spectrometer to provide spatially resolved chemical analysis of the surface correlated with the surface topography. Topographical analysis is achieved by scanning a sharp probe across the sample at constant distance from the surface. Chemical analysis is achieved by means of laser induced breakdown spectroscopy, by delivering pulsed laser radiation to the sample surface through the same sharp probe and subsequently collecting and analyzing emission spectra from plasma generated on the sample by the laser radiation. The method comprises performing microtopographical analysis of the sample with a scanning probe, selecting a scanned topological site on the sample, generating a plasma plume at the selected scanned topological site, and measuring a spectrum of optical emission from the plasma at the selected scanned topological site. The apparatus comprises a scanning probe, a pulsed laser optically coupled to the probe, an optical spectrometer, and a controller coupled to the scanner, laser and spectrometer for controlling their operation. The probe and scanner are used for topographical profiling of the sample. The probe is also used to deliver laser radiation to the sample for generating a plasma plume from the sample. Optical emission from the plasma plume is collected and delivered to the optical spectrometer, so that analysis of the emission spectrum allows identification of the chemical composition of the sample at user-selected sites.

  18. Sources and preparation of data for assessing trends in concentrations of pesticides in streams of the United States, 1992–2010

    USGS Publications Warehouse

    Martin, Jeffrey D.; Eberle, Michael; Nakagaki, Naomi

    2011-01-01

    This report updates a previously published water-quality dataset of 44 commonly used pesticides and 8 pesticide degradates suitable for a national assessment of trends in pesticide concentrations in streams of the United States. Water-quality samples collected from January 1992 through September 2010 at stream-water sites of the U.S. Geological Survey (USGS) National Water-Quality Assessment (NAWQA) Program and the National Stream Quality Accounting Network (NASQAN) were compiled, reviewed, selected, and prepared for trend analysis. The principal steps in data review for trend analysis were to (1) identify analytical schedule, (2) verify sample-level coding, (3) exclude inappropriate samples or results, (4) review pesticide detections per sample, (5) review high pesticide concentrations, and (6) review the spatial and temporal extent of NAWQA pesticide data and selection of analytical methods for trend analysis. The principal steps in data preparation for trend analysis were to (1) select stream-water sites for trend analysis, (2) round concentrations to a consistent level of precision for the concentration range, (3) identify routine reporting levels used to report nondetections unaffected by matrix interference, (4) reassign the concentration value for routine nondetections to the maximum value of the long-term method detection level (maxLT-MDL), (5) adjust concentrations to compensate for temporal changes in bias of recovery of the gas chromatography/mass spectrometry (GCMS) analytical method, and (6) identify samples considered inappropriate for trend analysis. Samples analyzed at the USGS National Water Quality Laboratory (NWQL) by the GCMS analytical method were the most extensive in time and space and, consequently, were selected for trend analysis. Stream-water sites with 3 or more water years of data with six or more samples per year were selected for pesticide trend analysis. 
The selection criteria described in the report produced a dataset of 21,988 pesticide samples at 212 stream-water sites. Only 21,144 pesticide samples, however, are considered appropriate for trend analysis.
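    Two of the preparation steps listed above, reassigning routine nondetections to the maxLT-MDL (step 4) and rounding concentrations (step 2), can be sketched as below. Representing a nondetection as `None` is an assumption, and rounding to two significant figures stands in for the report's range-dependent precision rules.

```python
def prepare_for_trends(concentrations, max_lt_mdl):
    """Reassign nondetections (None) to the maximum long-term method
    detection level and round detections to two significant figures."""
    prepared = []
    for conc in concentrations:
        if conc is None:                      # routine nondetection
            prepared.append(max_lt_mdl)
        else:                                 # detection
            prepared.append(float(f"{conc:.2g}"))
    return prepared
```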

  19. Penetrator role in Mars sample strategy

    NASA Technical Reports Server (NTRS)

    Boynton, William; Dwornik, Steve; Eckstrom, William; Roalstad, David A.

    1988-01-01

    The application of the penetrator to a Mars Return Sample Mission (MRSM) has direct advantages to meet science objectives and mission safety. Based on engineering data and work currently conducted at Ball Aerospace Systems Division, the concept of penetrators as scientific instruments is entirely practical. The primary utilization of a penetrator for MRSM would be to optimize the selection of the sample site location and to help in selection of the actual sample to be returned to Earth. It is recognized that the amount of sample to be returned is very limited, therefore the selection of the sample site is critical to the success of the mission. The following mission scenario is proposed. The site selection of a sample to be acquired will be performed by science working groups. A decision will be reached and a set of target priorities established based on data to give geochemical, geophysical and geological information. The first task of a penetrator will be to collect data at up to 4 to 6 possible landing sites. The penetrator can include geophysical, geochemical, geological and engineering instruments to confirm that scientific data requirements at that site will be met. This in situ near real-time data, collected prior to final targeting of the lander, will insure that the sample site is both scientifically valuable and also that it is reachable within limits of the capability of the lander.

  20. Analyses of native water, bottom material, elutriate samples, and dredged material from selected southern Louisiana waterways and selected areas in the Gulf of Mexico, 1979-81

    USGS Publications Warehouse

    Lurry, Dee L.

    1983-01-01

    The U.S. Geological Survey was requested by the U.S. Army Corps of Engineers, New Orleans District, to provide water-quality data to evaluate environmental effects of dredging activities in selected reaches of the Calcasieu River in southwestern Louisiana. Samples were collected from the upper and lower Calcasieu River between January 1980 and March 1981. Thirty-three samples (22 native-water and 11 effluent) were collected from eleven dredging sites. In addition, a series of elutriate studies were conducted between July 1979 and July 1981 to determine water quality as a basis for assessing possible environmental effects of proposed dredging activities in the following areas: Grand Bayou and Martins Canal near Happy Jack, unnamed bayou near Port Sulphur, Grand Bayou and Pipeline Canal near Port Sulphur and Bayou des Plantins near Empire; Mississippi River Gulf Outlet and Inner Harbor Navigation Canal; Southwest Pass; Barataria Bay; Atchafalaya Bay at Eugene Island; Calcasieu Ship Channel. Samples of native water and samples of bottom material were collected from 22 different sites and elutriate (mixtures of native water and bottom material) samples were prepared and analyzed. Four proposed ocean-disposal sites were sampled for bottom material only. Samples were analyzed for selected chemical and biological constituents and physical properties. (USGS)

  1. The coalescent of a sample from a binary branching process.

    PubMed

    Lambert, Amaury

    2018-04-25

    At time 0, start a time-continuous binary branching process, where particles give birth to a single particle independently (at a possibly time-dependent rate) and die independently (at a possibly time-dependent and age-dependent rate). A particular case is the classical birth-death process. Stop this process at time T>0. It is known that the tree spanned by the N tips alive at time T of the tree thus obtained (called a reduced tree or coalescent tree) is a coalescent point process (CPP), which basically means that the depths of interior nodes are independent and identically distributed (iid). Now select each of the N tips independently with probability y (Bernoulli sample). It is known that the tree generated by the selected tips, which we will call the Bernoulli sampled CPP, is again a CPP. Now instead, select exactly k tips uniformly at random among the N tips (a k-sample). We show that the tree generated by the selected tips is a mixture of Bernoulli sampled CPPs with the same parent CPP, over some explicit distribution of the sampling probability y. An immediate consequence is that the genealogy of a k-sample can be obtained by the realization of k random variables, first the random sampling probability Y and then the k-1 node depths which are iid conditional on Y=y.
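    The two tip-sampling schemes contrasted in the abstract are easy to state in code; the result itself, that a k-sample is distributed as a Bernoulli sampled CPP with a random sampling probability Y, concerns their distributions and is not demonstrated by this sketch. Names are illustrative.

```python
import random

def bernoulli_sample(tips, y, rng):
    """Keep each of the N tips independently with probability y."""
    return [t for t in tips if rng.random() < y]

def k_sample(tips, k, rng):
    """Select exactly k tips uniformly at random, without replacement."""
    return sorted(rng.sample(tips, k))
```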

  2. Selection bias in population-based cancer case-control studies due to incomplete sampling frame coverage.

    PubMed

    Walsh, Matthew C; Trentham-Dietz, Amy; Gangnon, Ronald E; Nieto, F Javier; Newcomb, Polly A; Palta, Mari

    2012-06-01

    Increasing numbers of individuals are choosing to opt out of population-based sampling frames due to privacy concerns. This is especially a problem in the selection of controls for case-control studies, as the cases often arise from relatively complete population-based registries, whereas control selection requires a sampling frame. If opt out is also related to risk factors, bias can arise. We linked breast cancer cases who reported having a valid driver's license from the 2004-2008 Wisconsin women's health study (N = 2,988) with a master list of licensed drivers from the Wisconsin Department of Transportation (WDOT). This master list excludes Wisconsin drivers that requested their information not be sold by the state. Multivariate-adjusted selection probability ratios (SPR) were calculated to estimate potential bias when using this driver's license sampling frame to select controls. A total of 962 cases (32%) had opted out of the WDOT sampling frame. Cases age <40 (SPR = 0.90), income either unreported (SPR = 0.89) or greater than $50,000 (SPR = 0.94), lower parity (SPR = 0.96 per one-child decrease), and hormone use (SPR = 0.93) were significantly less likely to be covered by the WDOT sampling frame (α = 0.05 level). Our results indicate the potential for selection bias due to differential opt out between various demographic and behavioral subgroups of controls. As selection bias may differ by exposure and study base, the assessment of potential bias needs to be ongoing. SPRs can be used to predict the direction of bias when cases and controls stem from different sampling frames in population-based case-control studies.
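    The quantity reported above can be illustrated with a crude, unadjusted version (the study reports multivariate-adjusted SPRs): the sampling-frame coverage rate of a subgroup divided by that of a reference group, with values below 1 indicating that the subgroup is under-covered by the frame. The function name and interface are assumptions.

```python
def selection_probability_ratio(covered_sub, total_sub, covered_ref, total_ref):
    """Unadjusted SPR: subgroup coverage rate / reference coverage rate."""
    return (covered_sub / total_sub) / (covered_ref / total_ref)
```

    For example, a subgroup covered by the frame at 60% against a reference group covered at 80% gives an SPR of 0.75, i.e. controls from that subgroup would be under-represented.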

  3. Apparatus and method for the characterization of respirable aerosols

    DOEpatents

    Clark, Douglas K.; Hodges, Bradley W.; Bush, Jesse D.; Mishima, Jofu

    2016-05-31

    An apparatus for the characterization of respirable aerosols, including: a burn chamber configured to selectively contain a sample that is selectively heated to generate an aerosol; a heating assembly disposed within the burn chamber adjacent to the sample; and a sampling segment coupled to the burn chamber and configured to collect the aerosol such that it may be analyzed. The apparatus also includes an optional sight window disposed in a wall of the burn chamber such that the sample may be viewed during heating. Optionally, the sample includes one of a Lanthanide, an Actinide, and a Transition metal.

  4. [Demonstration plan used in the study of human reproduction in the district of Sao Paulo. 1967].

    PubMed

    Silva, Eunice Pinho de Castro

    2006-10-01

    This work presents the sampling procedure used to obtain the sample for a "Human Reproduction Study in the District of São Paulo" (Brazil), carried out by the Department of Applied Statistics of the "Faculdade de Higiene e Saúde Pública da Universidade de São Paulo". The procedure was designed around constraints of cost and time and the lack of a frame from which a probability sample could be drawn within the fixed schedule and budget. It consisted of two-stage sampling, with dwelling units as primary units and women as secondary units. At the first stage, stratified sampling was used, with sub-districts taken as strata. To select primary units, points ("starting points") were selected on the maps of the sub-districts by a procedure similar to the "square grid" method but differing from it in several aspects. Fixed rules established a correspondence between each selected "starting point" and a set of three dwelling units in which at least one woman of the target population lived. In selected dwelling units where more than one woman of the target population lived, sub-sampling was used to select one of them, with each woman living in the dwelling unit having an equal probability of selection. Several "no-answer" cases and the corresponding instructions to be followed by the interviewers are also presented.

  5. Computerized stratified random site-selection approaches for design of a ground-water-quality sampling network

    USGS Publications Warehouse

    Scott, J.C.

    1990-01-01

    Computer software was written to randomly select sites for a ground-water-quality sampling network. The software uses digital cartographic techniques and subroutines from a proprietary geographic information system. The report presents the approaches, computer software, and sample applications. It is often desirable to collect ground-water-quality samples from various areas in a study region that have different values of a spatial characteristic, such as land-use or hydrogeologic setting. A stratified network can be used for testing hypotheses about relations between spatial characteristics and water quality, or for calculating statistical descriptions of water-quality data that account for variations that correspond to the spatial characteristic. In the software described, a study region is subdivided into areal subsets that have a common spatial characteristic to stratify the population into several categories from which sampling sites are selected. Different numbers of sites may be selected from each category of areal subsets. A population of potential sampling sites may be defined by either specifying a fixed population of existing sites, or by preparing an equally spaced population of potential sites. In either case, each site is identified with a single category, depending on the value of the spatial characteristic of the areal subset in which the site is located. Sites are selected from one category at a time. One of two approaches may be used to select sites. Sites may be selected randomly, or the areal subsets in the category can be grouped into cells and sites selected randomly from each cell.
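    The fixed-population variant of the approach described above can be sketched as follows: each potential site carries the category of the areal subset containing it, and a specified number of sites is drawn at random from each category. (The report's second approach, grouping a category's areal subsets into cells and sampling within each cell, is omitted here.) Names are illustrative.

```python
import random

def select_sites(sites, category_of, n_per_category, seed=0):
    """Stratified random selection of sampling sites by category."""
    rng = random.Random(seed)
    by_category = {}
    for site in sites:
        by_category.setdefault(category_of[site], []).append(site)
    return {
        cat: rng.sample(members, min(n_per_category.get(cat, 0), len(members)))
        for cat, members in by_category.items()
    }
```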

  6. Sample diversity and premise typicality in inductive reasoning: evidence for developmental change.

    PubMed

    Rhodes, Marjorie; Brickman, Daniel; Gelman, Susan A

    2008-08-01

    Evaluating whether a limited sample of evidence provides a good basis for induction is a critical cognitive task. We hypothesized that whereas adults evaluate the inductive strength of samples containing multiple pieces of evidence by attending to the relations among the exemplars (e.g., sample diversity), six-year-olds would attend to the degree to which each individual exemplar in a sample independently appears informative (e.g., premise typicality). To test these hypotheses, participants were asked to select between diverse and non-diverse samples to help them learn about basic-level animal categories. Across various between-subject conditions (N=133), we varied the typicality present in the diverse and non-diverse samples. We found that adults reliably selected to examine diverse over non-diverse samples, regardless of exemplar typicality, six-year-olds preferred to examine samples containing typical exemplars, regardless of sample diversity, and nine-year-olds were somewhat in the midst of this developmental transition.

  7. A soil sampling intercomparison exercise for the ALMERA network.

    PubMed

    Belli, Maria; de Zorzi, Paolo; Sansone, Umberto; Shakhashiro, Abduhlghani; Gondin da Fonseca, Adelaide; Trinkl, Alexander; Benesch, Thomas

    2009-11-01

    Soil sampling and analysis for radionuclides after an accidental or routine release is a key factor in calculating doses to members of the public and in establishing possible countermeasures. The IAEA organized a Soil Sampling Intercomparison Exercise (IAEA/SIE/01) for selected laboratories of the ALMERA (Analytical Laboratories for the Measurement of Environmental Radioactivity) network, with the objective of comparing the soil sampling procedures used by different laboratories. The ALMERA network is a world-wide network of analytical laboratories located in IAEA member states capable of providing reliable and timely analysis of environmental samples in the event of an accidental or intentional release of radioactivity. Ten ALMERA laboratories were selected to participate in the sampling exercise, which took place in November 2005 in an agricultural area qualified as a "reference site", aimed at assessing the uncertainties associated with soil sampling in agricultural, semi-natural, urban and contaminated environments and suitable for performing sampling intercomparisons. In this paper, the laboratories' sampling performance is evaluated.

  8. Ground-water-quality data for selected wells in the Beaver Creek watershed, West Tennessee

    USGS Publications Warehouse

    Williams, S.D.

    1996-01-01

    In 1993 the U.S. Geological Survey, in cooperation with the Tennessee Department of Environment and Conservation (TDEC), began an investigation of the quality of ground water in the Beaver Creek watershed in West Tennessee. A total of 408 water samples were collected from 91 wells during 5 sampling periods in 1994. Water samples were analyzed for selected water-quality properties, fecal coliform and streptococci bacteria, nutrients, and major inorganic constituents. Selected well- construction data and information on potential sources of contamination were also collected for the 91 wells sampled. Nitrate concentrations (measured as NO3) ranged from a detection limit of 0.1 to 91 milligrams per liter (mg/L). Nitrate concentrations exceeding 13 mg/L were detected in 71 of the samples collected. Nitrate concentrations in water samples collected from three wells exceeded the TDEC primary drinking water standard of 44 mg/L for nitrate (measured as NO3). Nitrite (measured as NO2), ammonium (measured as NH4), and orthophosphate (measured as PO4) concentrations in samples were generally less than 0.1 mg/L (detection limit). Fecal coliform bacteria were detected in 33 of the 408 water samples collected. Samples from 21 of the 91 wells contained fecal coliform bacteria during one or more of the five sampling periods. Fecal streptococci bacteria were detected in 123 of the 408 samples. Samples from 59 wells contained fecal streptococci bacteria during one or more of the five sampling periods.

  9. Selected trace metals and organic compounds and bioavailability of selected organic compounds in soils, Hackberry Flat, Tillman County, Oklahoma, 1994-95

    USGS Publications Warehouse

    Becker, M.F.

    1997-01-01

    In 1995 the Oklahoma Department of Wildlife Conservation acquired a drained wetland in southwest Oklahoma known as Hackberry Flat. Following restoration by Wildlife Conservation the wetland will be used by migratory birds and waterfowl. If naturally occurring trace metals and residual organic compounds from agriculture and industry were present, they could pose a potential biohazard and were therefore a concern for Wildlife Conservation. The U.S. Geological Survey, in cooperation with Wildlife Conservation and the Oklahoma Geological Survey, examined the soils of Hackberry Flat to determine trace metal concentrations, the presence of selected organic compounds, and the bioavailability of selected organic compounds in the soils. The purpose of this report is to present the data that establish the baseline concentrations of selected trace metals and organic compounds in the soils of Hackberry Flat prior to wetland restoration. Sampling and analysis were performed using two approaches. One was to collect soil samples and analyze their composition using standard laboratory practices. The second exposed composite soil samples to organic-free water and a semipermeable membrane device that mimics an organism, and then analyzed the device. Ten soil samples were collected in 1994 to be analyzed for trace metals, organochlorine pesticides, and polychlorinated biphenyls. Soil samples tested for bioavailability of selected organic compounds were collected in 1995. Most of the 182 soil samples collected were from the center of every 40-acre quarter-quarter section owned by Wildlife Conservation. The samples were grouped by geographical area with a maximum of 16 sample sites per group. Concentrations of most selected trace metals measured from soils in Hackberry Flat are within the range of mean concentrations measured in cultivated soils within the United States.
Organochlorine pesticides, polychlorinated biphenyls, and polyaromatic hydrocarbons were not found in the soil samples at concentrations above the analytical detection levels; if present, they occur at concentrations below the detection level of the analytical method used. Organochlorine pesticides, total polychlorinated biphenyls, and polyaromatic hydrocarbons were also not detected in any of the semipermeable membrane devices at the analytical detection levels.

  10. An Alternative View of Forest Sampling

    Treesearch

    Francis A. Roesch; Edwin J. Green; Charles T. Scott

    1993-01-01

    A generalized concept is presented for all of the commonly used methods of forest sampling. The concept views the forest as a two-dimensional picture which is cut up into pieces like a jigsaw puzzle, with the pieces defined by the individual selection probabilities of the trees in the forest. This concept results in a finite number of independently selected sample...

  11. 40 CFR Appendix Xi to Part 86 - Sampling Plans for Selective Enforcement Auditing of Light-Duty Vehicles

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Sampling Plans for Selective Enforcement Auditing of Light-Duty Vehicles XI Appendix XI to Part 86 Protection of Environment ENVIRONMENTAL... Enforcement Auditing of Light-Duty Vehicles 40% AQL Table 1—Sampling Plan Code Letter Annual sales of...

  12. Downselection for Sample Return — Defining Sampling Strategies Using Lessons from Terrestrial Field Analogues

    NASA Astrophysics Data System (ADS)

    Stevens, A. H.; Gentry, D.; Amador, E.; Cable, M. L.; Cantrell, T.; Chaudry, N.; Cullen, T.; Duca, Z.; Jacobsen, M.; Kirby, J.; McCaig, H.; Murukesan, G.; Rader, E.; Rennie, V.; Schwieterman, E.; Sutton, S.; Tan, G.; Yin, C.; Cullen, D.; Geppert, W.; Stockton, A.

    2018-04-01

    We detail multi-year field investigations in Icelandic Mars analogue environments that have yielded results that can help inform strategies for sample selection and downselection for Mars Sample Return.

  13. Determination of hazardous ingredients in personal care products using laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Abrar, M.; Iqbal, T.; Fahad, M.; Andleeb, M.; Farooq, Z.; Afsheen, S.

    2018-05-01

    In the present work, the laser-induced breakdown spectroscopy technique is applied to explore the concentration of toxic elements present in cosmetic materials. Chromium (Cr), magnesium (Mg), cadmium (Cd) and lead (Pb) are selected as major elements, and manganese (Mn), sodium (Na), potassium (K), sulfur (S), silicon (Si) and titanium (Ti) as minor elements in cosmetic products. In this technique, a plasma plume is generated by using an Nd:YAG laser of 532 nm wavelength, and spectral lines for the respective samples are observed. Four different samples of cosmetic products are selected, i.e. two samples of lipstick and two of eyeshadow. The observed spectral lines of all major and minor elements are used to calculate their concentrations in all samples through the intensity ratio method. Among the selected lipstick and eyeshadow samples, one sample is branded, and one is collected from the local market. It is observed that chromium, magnesium and lead have strong spectral lines and consequently show high concentrations. The calculated concentrations are then compared to permissible limits set by the Food and Drug Administration with regard to the cosmetics industry. The concentrations of these toxic elements in the selected local cosmetic samples exceed the safe permissible limit for human use and could lead to serious health problems.

  14. Feature Selection for Ridge Regression with Provable Guarantees.

    PubMed

    Paul, Saurabh; Drineas, Petros

    2016-04-01

    We introduce single-set spectral sparsification as a deterministic sampling-based feature selection technique for regularized least-squares classification, which is the classification analog to ridge regression. The method is unsupervised and gives worst-case guarantees on the generalization power of the classification function after feature selection with respect to the classification function obtained using all features. We also introduce leverage-score sampling as an unsupervised randomized feature selection method for ridge regression. We provide risk bounds for both single-set spectral sparsification and leverage-score sampling on ridge regression in the fixed design setting and show that the risk in the sampled space is comparable to the risk in the full-feature space. We perform experiments on synthetic and real-world data sets (a subset of the TechTC-300 data sets) to support our theory. Experimental results indicate that the proposed methods perform better than existing feature selection methods.
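As a rough illustration of the leverage-score idea, the sketch below samples feature (column) indices with probability proportional to their leverage scores over the top-k right singular subspace of the data matrix. The function name and parameters are hypothetical, and this is a generic sketch, not the authors' implementation:

```python
import numpy as np

def leverage_score_feature_sampling(X, n_select, k_rank, rng=None):
    """Pick n_select feature (column) indices of X at random, with
    probability proportional to leverage scores computed from the
    top k_rank right singular vectors of X. Illustrative only."""
    rng = np.random.default_rng(rng)
    # Rows of Vt are the right singular vectors of X
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    # Leverage score of feature j: squared norm of column j of the
    # top-k_rank block of Vt; the scores sum to k_rank
    scores = np.sum(Vt[:k_rank] ** 2, axis=0)
    probs = scores / scores.sum()
    return rng.choice(X.shape[1], size=n_select, replace=False, p=probs)
```

Sampling without replacement keeps each selected feature distinct; the risk bounds in the abstract concern a related sampling-with-rescaling scheme, which this toy omits.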

  15. Occurrence, distribution, and concentrations of selected contaminants in streambed- and suspended-sediment samples collected in Bexar County, Texas, 2007-09

    USGS Publications Warehouse

    Wilson, Jennifer T.

    2011-01-01

    High concentrations of sediment-associated contaminants are typically associated with urban areas such as San Antonio, Texas, in Bexar County, the seventh most populous city in the United States. U.S. Geological Survey personnel periodically collected surficial streambed-sediment samples during 2007-09 and collected suspended-sediment samples from selected streams after storms during 2008 and 2009. All sediment samples were analyzed for major and trace elements, pesticides, polychlorinated biphenyls, and polycyclic aromatic hydrocarbons.

  16. Coliform species recovered from untreated surface water and drinking water by the membrane filter, standard, and modified most-probable-number techniques.

    PubMed Central

    Evans, T M; LeChevallier, M W; Waarvick, C E; Seidler, R J

    1981-01-01

    The species of total coliform bacteria isolated from drinking water and untreated surface water by the membrane filter (MF), the standard most-probable-number (S-MPN), and modified most-probable-number (M-MPN) techniques were compared. Each coliform detection technique selected for a different profile of coliform species from both types of water samples. The MF technique indicated that Citrobacter freundii was the most common coliform species in water samples. However, the fermentation tube techniques displayed selectivity towards the isolation of Escherichia coli and Klebsiella. The M-MPN technique selected for more C. freundii and Enterobacter spp. from untreated surface water samples and for more Enterobacter and Klebsiella spp. from drinking water samples than did the S-MPN technique. The lack of agreement between the number of coliforms detected in a water sample by the S-MPN, M-MPN, and MF techniques was a result of the selection for different coliform species by the various techniques. PMID:7013706

  17. Systematic sampling for suspended sediment

    Treesearch

    Robert B. Thomas

    1991-01-01

    Abstract - Because of high costs or complex logistics, scientific populations cannot be measured entirely and must be sampled. Accepted scientific practice holds that sample selection be based on statistical principles to assure objectivity when estimating totals and variances. Probability sampling--obtaining samples with known probabilities--is the only method that...

  18. Random bit generation at tunable rates using a chaotic semiconductor laser under distributed feedback.

    PubMed

    Li, Xiao-Zhou; Li, Song-Sui; Zhuang, Jun-Ping; Chan, Sze-Chun

    2015-09-01

    A semiconductor laser with distributed feedback from a fiber Bragg grating (FBG) is investigated for random bit generation (RBG). The feedback perturbs the laser to emit chaotically with the intensity being sampled periodically. The samples are then converted into random bits by a simple postprocessing of self-differencing and selecting bits. Unlike a conventional mirror that provides localized feedback, the FBG provides distributed feedback which effectively suppresses the information of the round-trip feedback delay time. Randomness is ensured even when the sampling period is commensurate with the feedback delay between the laser and the grating. Consequently, in RBG, the FBG feedback enables continuous tuning of the output bit rate, reduces the minimum sampling period, and increases the number of bits selected per sample. RBG is experimentally investigated at a sampling period continuously tunable from over 16 ns down to 50 ps, while the feedback delay is fixed at 7.7 ns. By selecting 5 least-significant bits per sample, output bit rates from 0.3 to 100 Gbps are achieved with randomness examined by the National Institute of Standards and Technology test suite.
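The post-processing step described above (self-differencing of periodically sampled intensities, then keeping a few least-significant bits per sample) can be sketched as follows; the 8-bit sample quantization and LSB-first bit order are illustrative assumptions, not details from the paper:

```python
import numpy as np

def bits_from_samples(samples, n_lsb=5):
    """Toy post-processing in the spirit of the paper: difference
    each 8-bit sample with a delayed copy (self-differencing), then
    keep the n_lsb least-significant bits of each difference."""
    s = np.asarray(samples, dtype=np.uint8)
    diff = s[1:] - s[:-1]          # uint8 arithmetic wraps modulo 256
    bits = []
    for d in diff:
        for b in range(n_lsb):     # emit LSB first
            bits.append((int(d) >> b) & 1)
    return bits
```

With 5 bits kept per sample, the output bit rate is simply 5 times the sampling rate, which matches the scaling of bit rate with sampling period reported in the abstract.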

  19. Accounting for animal movement in estimation of resource selection functions: sampling and data analysis.

    PubMed

    Forester, James D; Im, Hae Kyung; Rathouz, Paul J

    2009-12-01

    Patterns of resource selection by animal populations emerge as a result of the behavior of many individuals. Statistical models that describe these population-level patterns of habitat use can miss important interactions between individual animals and characteristics of their local environment; however, identifying these interactions is difficult. One approach to this problem is to incorporate models of individual movement into resource selection models. To do this, we propose a model for step selection functions (SSF) that is composed of a resource-independent movement kernel and a resource selection function (RSF). We show that standard case-control logistic regression may be used to fit the SSF; however, the sampling scheme used to generate control points (i.e., the definition of availability) must be accommodated. We used three sampling schemes to analyze simulated movement data and found that ignoring sampling and the resource-independent movement kernel yielded biased estimates of selection. The level of bias depended on the method used to generate control locations, the strength of selection, and the spatial scale of the resource map. Using empirical or parametric methods to sample control locations produced biased estimates under stronger selection; however, we show that the addition of a distance function to the analysis substantially reduced that bias. Assuming a uniform availability within a fixed buffer yielded strongly biased selection estimates that could be corrected by including the distance function but remained inefficient relative to the empirical and parametric sampling methods. As a case study, we used location data collected from elk in Yellowstone National Park, USA, to show that selection and bias may be temporally variable. Because under constant selection the amount of bias depends on the scale at which a resource is distributed in the landscape, we suggest that distance always be included as a covariate in SSF analyses. 
This approach to modeling resource selection is easily implemented using common statistical tools and promises to provide deeper insight into the movement ecology of animals.

  20. Accounting for selection bias in association studies with complex survey data.

    PubMed

    Wirth, Kathleen E; Tchetgen Tchetgen, Eric J

    2014-05-01

    Obtaining representative information from hidden and hard-to-reach populations is fundamental to describe the epidemiology of many sexually transmitted diseases, including HIV. Unfortunately, simple random sampling is impractical in these settings, as no registry of names exists from which to sample the population at random. However, complex sampling designs can be used, as members of these populations tend to congregate at known locations, which can be enumerated and sampled at random. For example, female sex workers may be found at brothels and street corners, whereas injection drug users often come together at shooting galleries. Despite the logistical appeal, complex sampling schemes lead to unequal probabilities of selection, and failure to account for this differential selection can result in biased estimates of population averages and relative risks. However, standard techniques to account for selection can lead to substantial losses in efficiency. Consequently, researchers implement a variety of strategies in an effort to balance validity and efficiency. Some researchers fully or partially account for the survey design, whereas others do nothing and treat the sample as a realization of the population of interest. We use directed acyclic graphs to show how certain survey sampling designs, combined with subject-matter considerations unique to individual exposure-outcome associations, can induce selection bias. Finally, we present a novel yet simple maximum likelihood approach for analyzing complex survey data; this approach optimizes statistical efficiency at no cost to validity. We use simulated data to illustrate this method and compare it with other analytic techniques.
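The core idea of accounting for unequal selection probabilities can be illustrated with a Horvitz-Thompson-style weighted mean, in which each sampled unit is weighted by the inverse of its probability of selection. This is a generic sketch of the standard technique, not the authors' maximum likelihood estimator:

```python
import numpy as np

def weighted_mean(y, selection_prob):
    """Estimate a population mean from a sample drawn with unequal
    selection probabilities by weighting each unit by the inverse
    of its probability of selection. Generic illustrative sketch."""
    w = 1.0 / np.asarray(selection_prob, dtype=float)
    y = np.asarray(y, dtype=float)
    return np.sum(w * y) / np.sum(w)
```

For example, if units with outcome 1 were oversampled (selection probability 0.5) relative to a unit with outcome 0 (probability 0.1), the unweighted mean 2/3 overstates the population mean, while the weighted estimate down-weights the oversampled units.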

  1. Parallel cascade selection molecular dynamics for efficient conformational sampling and free energy calculation of proteins

    NASA Astrophysics Data System (ADS)

    Kitao, Akio; Harada, Ryuhei; Nishihara, Yasutaka; Tran, Duy Phuoc

    2016-12-01

    Parallel Cascade Selection Molecular Dynamics (PaCS-MD) was proposed as an efficient conformational sampling method to investigate conformational transition pathways of proteins. In PaCS-MD, cycles of (i) selection of initial structures for multiple independent MD simulations and (ii) conformational sampling by independent MD simulations are repeated until convergence of the sampling. The selection is conducted so that the protein conformation gradually approaches a target. The selection of snapshots is key to enhancing conformational changes by increasing the probability of rare event occurrence. Since the procedure of PaCS-MD is simple, no modification of MD programs is required; the selection of initial structures and the restart of the next cycle of MD simulations can be handled with relatively simple scripts and straightforward implementation. Trajectories generated by PaCS-MD were further analyzed with the Markov state model (MSM), which enables calculation of the free energy landscape. The combination of PaCS-MD and MSM is reported in this work.
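The cycle structure described above (select start structures, run independent short simulations, repeat) can be mimicked on a toy 1-D random walk standing in for MD. Everything here, including the distance-to-target ranking and all parameter values, is a simplified stand-in rather than the published protocol:

```python
import numpy as np

def pacs_md_toy(start, target, n_replicas=8, n_cycles=30,
                steps=50, step_size=0.1, seed=0):
    """Toy PaCS-MD-style loop: run independent short random walks
    (standing in for MD), rank all snapshots by distance to the
    target, and restart the next cycle from the best ones.
    Illustrative only, not the authors' code."""
    rng = np.random.default_rng(seed)
    starts = np.full(n_replicas, float(start))
    for _ in range(n_cycles):
        # (ii) independent short "MD" runs from the current starts
        walks = starts[:, None] + np.cumsum(
            rng.normal(0.0, step_size, (n_replicas, steps)), axis=1)
        snaps = walks.ravel()
        # (i) select the snapshots nearest the target as next starts
        starts = snaps[np.argsort(np.abs(snaps - target))[:n_replicas]]
        if np.abs(starts - target).min() < step_size:
            break  # sampling has reached the target region
    return starts
```

Because each cycle restarts from the snapshots closest to the target, rare forward progress is amplified, which is the mechanism the abstract describes for enhancing rare event occurrence.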

  2. An evaluation of flow-stratified sampling for estimating suspended sediment loads

    Treesearch

    Robert B. Thomas; Jack Lewis

    1995-01-01

    Abstract - Flow-stratified sampling is a new method for sampling water quality constituents such as suspended sediment to estimate loads. As with selection-at-list-time (SALT) and time-stratified sampling, flow-stratified sampling is a statistical method requiring random sampling, and yielding unbiased estimates of load and variance. It can be used to estimate event...

  3. 7 CFR 43.104 - Master table of single and double sampling plans.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 2 2012-01-01 2012-01-01 false Master table of single and double sampling plans. 43... STANDARD CONTAINER REGULATIONS STANDARDS FOR SAMPLING PLANS Sampling Plans § 43.104 Master table of single and double sampling plans. (a) In the master table, a sampling plan is selected by first determining...

  4. 7 CFR 43.104 - Master table of single and double sampling plans.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 2 2013-01-01 2013-01-01 false Master table of single and double sampling plans. 43... STANDARD CONTAINER REGULATIONS STANDARDS FOR SAMPLING PLANS Sampling Plans § 43.104 Master table of single and double sampling plans. (a) In the master table, a sampling plan is selected by first determining...

  5. 7 CFR 43.104 - Master table of single and double sampling plans.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 2 2014-01-01 2014-01-01 false Master table of single and double sampling plans. 43... STANDARD CONTAINER REGULATIONS STANDARDS FOR SAMPLING PLANS Sampling Plans § 43.104 Master table of single and double sampling plans. (a) In the master table, a sampling plan is selected by first determining...

  6. 7 CFR 43.104 - Master table of single and double sampling plans.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Master table of single and double sampling plans. 43... STANDARD CONTAINER REGULATIONS STANDARDS FOR SAMPLING PLANS Sampling Plans § 43.104 Master table of single and double sampling plans. (a) In the master table, a sampling plan is selected by first determining...

  7. 5 CFR 532.215 - Establishments included in regular appropriated fund surveys.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... in surveys shall be selected under standard probability sample selection procedures. In areas with... establishment list drawn under statistical sampling procedures. [55 FR 46142, Nov. 1, 1990] ...

  8. Compilation of a near-infrared library for the construction of quantitative models of amoxicillin and potassium clavulanate oral dosage forms

    NASA Astrophysics Data System (ADS)

    Zou, Wen-bo; Chong, Xiao-meng; Wang, Yan; Hu, Chang-qin

    2018-05-01

    The accuracy of NIR quantitative models depends on calibration samples with concentration variability. Conventional sample collection methods have some shortcomings, especially their time-consuming nature, which remains a bottleneck in the application of NIR models for Process Analytical Technology (PAT) control. A study was performed to solve the problem of sample selection for construction of NIR quantitative models. Amoxicillin and potassium clavulanate oral dosage forms were used as examples. The aim was to find a general approach to rapidly construct NIR quantitative models using an NIR spectral library, based on the idea of a universal model [2021]. The NIR spectral library of amoxicillin and potassium clavulanate oral dosage forms was defined and consisted of spectra of 377 batches of samples produced by 26 domestic pharmaceutical companies, including tablets, dispersible tablets, chewable tablets, oral suspensions, and granules. The correlation coefficient (rT) was used to indicate the similarity of the spectra. The calibration sample sets were selected from the spectral library according to the median rT of the samples to be analyzed. The rT values of the selected samples were close to the median rT; the spread in rT among those samples was 1.0% to 1.5%. We concluded that sample selection is not a problem when constructing NIR quantitative models using a spectral library, in contrast to conventional methods of determining universal models. Sample spectra with a suitable concentration range for the NIR models were collected quickly. In addition, the models constructed through this method were more easily targeted.
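The median-rT selection rule can be sketched as follows, assuming library spectra are stored as rows of a matrix and rT is the Pearson correlation between a library spectrum and the query spectrum; the function and its parameters are hypothetical, not the authors' software:

```python
import numpy as np

def select_by_median_rt(library, query, n_select=20):
    """Pick the calibration spectra whose correlation (rT) with the
    query spectrum lies closest to the median correlation, mimicking
    the median-rT rule described above. Illustrative sketch only."""
    # z-score the query and each library spectrum (rows)
    q = (query - query.mean()) / query.std()
    L = (library - library.mean(axis=1, keepdims=True)) / \
        library.std(axis=1, keepdims=True)
    r = L @ q / len(q)                 # Pearson r of each spectrum
    median_r = np.median(r)
    order = np.argsort(np.abs(r - median_r))
    return order[:n_select], r
```

Selecting around the median rather than the maximum avoids biasing the calibration set toward spectra that are unusually similar to the query, keeping the concentration variability the abstract emphasizes.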

  9. 40 CFR 761.257 - Determining the regulatory status of sampled pipe.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... COMMERCE, AND USE PROHIBITIONS Determining a PCB Concentration for Purposes of Abandonment or Disposal of Natural Gas Pipeline: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe... disposal of a pipe segment that has been sampled, the sample results for that segment determines its PCB...

  10. FLUID SELECTING APPARATUS

    DOEpatents

    Stinson, W.J.

    1958-09-16

    A valve designed to selectively sample fluids from a number of sources is described. The valve comprises a rotatable operating lever connected through a bellows seal to a rotatable assembly containing a needle valve, bearings, and a rotational lock. The needle valve is connected through a flexible tube to the sample fluid outlet. By rotating the lever, the needle valve is placed over one of several fluid sources and locked in position so that the fluid is transferred through the flexible tubing and outlet to a remote sampling system. The fluids from the nonselected sources are exhausted to a waste line. This valve constitutes a simple, dependable means of selecting a sample from one of several sources.

  11. A comparison of selection at list time and time-stratified sampling for estimating suspended sediment loads

    Treesearch

    Robert B. Thomas; Jack Lewis

    1993-01-01

    Time-stratified sampling of sediment for estimating suspended load is introduced and compared to selection at list time (SALT) sampling. Both methods provide unbiased estimates of load and variance. The magnitude of the variance of the two methods is compared using five storm populations of suspended sediment flux derived from turbidity data. Under like conditions,...

  12. Filtration of water-sediment samples for the determination of organic compounds

    USGS Publications Warehouse

    Sandstrom, Mark W.

    1995-01-01

    This report describes the equipment and procedures used for on-site filtration of surface-water and ground-water samples for determination of organic compounds. Glass-fiber filters and a positive displacement pumping system are suitable for processing most samples for organic analyses. An optional system that uses disposable in-line membrane filters is suitable for a specific gas chromatography/mass spectrometry, selected-ion monitoring analytical method for determination of organonitrogen herbicides. General procedures to minimize contamination of the samples include preparing a clean workspace at the site, selecting appropriate sample-collection materials, and cleaning of the equipment with detergent, tap water, and methanol.

  13. Mars sample return: Site selection and sample acquisition study

    NASA Technical Reports Server (NTRS)

    Nickle, N. (Editor)

    1980-01-01

    Various vehicle and mission options were investigated for the continued exploration of Mars; the cost of a minimum sample return mission was estimated; options and concepts were synthesized into program possibilities; and recommendations for the next Mars mission were made to the Planetary Program Office. Specific sites and all relevant spacecraft and ground-based data were studied in order to determine: (1) the adequacy of presently available data for identifying landing sites for a sample return mission that would assure the acquisition of material from the most important geologic provinces of Mars; (2) the degree of surface mobility required to assure sample acquisition at these sites; (3) techniques to be used in the selection and drilling of rock samples; and (4) the degree of mobility required at the two Viking sites to acquire these samples.

  14. AEGIS: Demographics of X-ray and Optically Selected Active Galactic Nuclei

    NASA Astrophysics Data System (ADS)

    Yan, Renbin; Ho, Luis C.; Newman, Jeffrey A.; Coil, Alison L.; Willmer, Christopher N. A.; Laird, Elise S.; Georgakakis, Antonis; Aird, James; Barmby, Pauline; Bundy, Kevin; Cooper, Michael C.; Davis, Marc; Faber, S. M.; Fang, Taotao; Griffith, Roger L.; Koekemoer, Anton M.; Koo, David C.; Nandra, Kirpal; Park, Shinae Q.; Sarajedini, Vicki L.; Weiner, Benjamin J.; Willner, S. P.

    2011-02-01

    We develop a new diagnostic method to classify galaxies into active galactic nucleus (AGN) hosts, star-forming galaxies, and absorption-dominated galaxies by combining the [O III]/Hβ ratio with rest-frame U - B color. This can be used to robustly select AGNs in galaxy samples at intermediate redshifts (z < 1). We compare the result of this optical AGN selection with X-ray selection using a sample of 3150 galaxies with 0.3 < z < 0.8 and I_AB < 22, selected from the DEEP2 Galaxy Redshift Survey and the All-wavelength Extended Groth Strip International Survey. Among the 146 X-ray sources in this sample, 58% are classified optically as emission-line AGNs, the rest as star-forming galaxies or absorption-dominated galaxies. The latter are also known as "X-ray bright, optically normal galaxies" (XBONGs). Analysis of the relationship between optical emission lines and X-ray properties shows that the completeness of optical AGN selection suffers from dependence on the star formation rate and the quality of observed spectra. It also shows that XBONGs do not appear to be a physically distinct population from other X-ray detected, emission-line AGNs. On the other hand, X-ray AGN selection also has strong bias. About 2/3 of all emission-line AGNs at L_bol > 10^44 erg s^-1 in our sample are not detected in our 200 ks Chandra images, most likely due to moderate or heavy absorption by gas near the AGN. The 2-7 keV detection rate of Seyfert 2s at z ~ 0.6 suggests that their column density distribution and Compton-thick fraction are similar to those of local Seyferts. Multiple sample selection techniques are needed to obtain as complete a sample as possible.

  15. Improved Compressive Sensing of Natural Scenes Using Localized Random Sampling

    PubMed Central

    Barranca, Victor J.; Kovačič, Gregor; Zhou, Douglas; Cai, David

    2016-01-01

    Compressive sensing (CS) theory demonstrates that by using uniformly-random sampling, rather than uniformly-spaced sampling, higher quality image reconstructions are often achievable. Considering that the structure of sampling protocols has such a profound impact on the quality of image reconstructions, we formulate a new sampling scheme motivated by physiological receptive field structure, localized random sampling, which yields significantly improved CS image reconstructions. For each set of localized image measurements, our sampling method first randomly selects an image pixel and then measures its nearby pixels with probability depending on their distance from the initially selected pixel. We compare the uniformly-random and localized random sampling methods over a large space of sampling parameters, and show that, for the optimal parameter choices, higher quality image reconstructions can be consistently obtained by using localized random sampling. In addition, we argue that the localized random CS optimal parameter choice is stable with respect to diverse natural images, and scales with the number of samples used for reconstruction. We expect that the localized random sampling protocol helps to explain the evolutionarily advantageous nature of receptive field structure in visual systems and suggests several future research areas in CS theory and its application to brain imaging. PMID:27555464
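The localized random sampling scheme (pick a pixel uniformly at random, then measure nearby pixels with a distance-dependent probability) can be sketched as a mask generator. The fixed radius and constant inclusion probability below are simplifying assumptions standing in for the paper's distance-dependent rule:

```python
import numpy as np

def localized_random_samples(shape, n_centers, radius, p_near, rng=None):
    """Build a boolean sampling mask: choose pixels uniformly at
    random, then include each neighbor within `radius` with
    probability p_near. A simplified stand-in for the paper's
    distance-dependent sampling rule; illustrative only."""
    rng = np.random.default_rng(rng)
    h, w = shape
    mask = np.zeros(shape, dtype=bool)
    ys, xs = np.mgrid[0:h, 0:w]
    for _ in range(n_centers):
        cy, cx = int(rng.integers(h)), int(rng.integers(w))
        mask[cy, cx] = True                      # always keep the center
        near = (ys - cy) ** 2 + (xs - cx) ** 2 <= radius ** 2
        mask |= near & (rng.random(shape) < p_near)
    return mask
```

Replacing `p_near` with a function that decays with distance from the center would bring the sketch closer to the receptive-field-like measurements the abstract describes.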

  16. A novel approach to assessing environmental disturbance based on habitat selection by zebra fish as a model organism.

    PubMed

    Araújo, Cristiano V M; Griffith, Daniel M; Vera-Vera, Victoria; Jentzsch, Paul Vargas; Cervera, Laura; Nieto-Ariza, Beatriz; Salvatierra, David; Erazo, Santiago; Jaramillo, Rusbel; Ramos, Luis A; Moreira-Santos, Matilde; Ribeiro, Rui

    2018-04-01

    Aquatic ecotoxicity assays used to assess ecological risk assume that organisms living in a contaminated habitat are unavoidably exposed to the contamination. This assumption neglects the ability of organisms to detect and avoid contamination by moving towards less disturbed habitats, as long as connectivity exists. In fluvial systems, many environmental parameters vary spatially and thus condition organisms' habitat selection. We assessed the preference of zebra fish (Danio rerio) when exposed to water samples from two western Ecuadorian rivers with apparently distinct disturbance levels: Pescadillo River (highly disturbed) and Oro River (moderately disturbed). Using a non-forced exposure system in which water samples from each river were arranged according to their spatial sequence in the field and connected to allow individuals to move freely among samples, we assayed habitat selection by D. rerio to assess environmental disturbance in the two rivers. Fish exposed to Pescadillo River samples preferred downstream samples near the confluence zone with the Oro River. Fish exposed to Oro River samples preferred upstream waters. When exposed to samples from both rivers simultaneously, fish exhibited the same pattern of habitat selection by preferring the Oro River samples. Given that the rivers are connected, preference for the Oro River enabled us to predict a depression in fish populations in the Pescadillo River. Although these findings indicate higher disturbance levels in the Pescadillo River, none of the physical-chemical variables measured was significantly correlated with the preference pattern towards the Oro River. Non-linear spatial patterns of habitat preference suggest that other environmental parameters like urban or agricultural contaminants play an important role in the model organism's habitat selection in these rivers.
The non-forced exposure system represents a habitat selection-based approach that can serve as a valuable tool to unravel the factors that dictate organisms' spatial distribution in connected ecosystems. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Site Plan Safety Submission for Sampling, Monitoring, and Decontamination of Mustard Agent - South Plant, Rocky Mountain Arsenal. Volume 1

    DTIC Science & Technology

    1988-10-01

    sample these ducts. This judgement was based on the following factors: 1. The ducts were open to the atmosphere. 2. RMA records of building area samples...selected based on several factors including piping arrangements, volume to be sampled, sampling equipment flow rates, and the flow rate necessary for...effective sampling. Therefore, each sampling point strategy and procedure was customized based on these factors. The individual specific sampling

  18. Well installation and documentation, and ground-water sampling protocols for the pilot National Water-Quality Assessment Program

    USGS Publications Warehouse

    Hardy, M.A.; Leahy, P.P.; Alley, W.M.

    1989-01-01

    Several pilot projects are being conducted as part of the National Water Quality Assessment (NAWQA) Program. The purpose of the pilot program is to test and refine concepts for a proposed full-scale program. Three of the pilot projects are specifically designed to assess groundwater. The purpose of this report is to describe the criteria that are being used in the NAWQA pilot projects for selecting and documenting wells, installing new wells, and sampling wells for different water quality constituents. Guidelines are presented for the selection of wells for sampling. Information needed to accurately document each well includes site characteristics related to the location of the well, land use near the well, and important well construction features. These guidelines ensure the consistency of the information collected and will provide comparable data for interpretive purposes. Guidelines for the installation of wells are presented and include procedures that need to be followed for preparations prior to drilling, the selection of the drilling technique and casing type, the grouting procedure, and the well-development technique. A major component of the protocols is related to water quality sampling. Tasks are identified that need to be completed prior to visiting the site for sampling. Guidelines are presented for purging the well prior to sampling, both in terms of the volume of water pumped and the chemical stability of field parameters. Guidelines are presented concerning sampler selection as related to both inorganic and organic constituents. Documentation needed to describe the measurements and observations related to sampling each well and treating and preserving the samples is also presented. Procedures are presented for the storage and shipping of water samples, equipment cleaning, and quality assurance.
Quality assurance guidelines include the description of the general distribution of the various quality assurance samples (blanks, spikes, duplicates, and reference samples) that will be used in the pilot program. (Lantz-PTT)

  19. Automated liver sampling using a gradient dual-echo Dixon-based technique.

    PubMed

    Bashir, Mustafa R; Dale, Brian M; Merkle, Elmar M; Boll, Daniel T

    2012-05-01

    Magnetic resonance spectroscopy of the liver requires input from a physicist or physician at the time of acquisition to ensure proper voxel selection, while in multiecho chemical shift imaging, numerous regions of interest must be manually selected in order to ensure analysis of a representative portion of the liver parenchyma. A fully automated technique could improve workflow by selecting representative portions of the liver prior to human analysis. Complete volumes from three-dimensional gradient dual-echo acquisitions with two-point Dixon reconstruction acquired at 1.5 and 3 T were analyzed in 100 subjects, using an automated liver sampling algorithm, based on ratio pairs calculated from signal intensity image data as fat-only/water-only and log(in-phase/opposed-phase) on a voxel-by-voxel basis. Using different gridding variations of the algorithm, the average correct liver volume samples ranged from 527 to 733 mL. The average percentage of sample located within the liver ranged from 95.4 to 97.1%, whereas the average incorrect volume selected was 16.5-35.4 mL (2.9-4.6%). Average run time was 19.7-79.0 s. The algorithm consistently selected large samples of the hepatic parenchyma with small amounts of erroneous extrahepatic sampling, and run times were feasible for execution on an MRI system console during exam acquisition. Copyright © 2011 Wiley Periodicals, Inc.

  20. A Hierarchical Feature and Sample Selection Framework and Its Application for Alzheimer’s Disease Diagnosis

    NASA Astrophysics Data System (ADS)

    An, Le; Adeli, Ehsan; Liu, Mingxia; Zhang, Jun; Lee, Seong-Whan; Shen, Dinggang

    2017-03-01

    Classification is one of the most important tasks in machine learning. Due to feature redundancy or outliers in samples, using all available data for training a classifier may be suboptimal. For example, Alzheimer’s disease (AD) is correlated with certain brain regions or single nucleotide polymorphisms (SNPs), and identification of relevant features is critical for computer-aided diagnosis. Many existing methods first select features from structural magnetic resonance imaging (MRI) or SNPs and then use those features to build the classifier. However, in the presence of many redundant features, the most discriminative features are difficult to identify in a single step. Thus, we formulate a hierarchical feature and sample selection framework to gradually select informative features and discard ambiguous samples in multiple steps for improved classifier learning. To positively guide the data manifold preservation process, we utilize both labeled and unlabeled data during training, making our method semi-supervised. For validation, we conduct experiments on AD diagnosis by selecting mutually informative features from both MRI and SNP data, and using the most discriminative samples for training. The superior classification results demonstrate the effectiveness of our approach compared with competing methods.

  1. Well-defined magnetic surface imprinted nanoparticles for selective enrichment of 2,4-dichlorophenoxyacetic acid in real samples.

    PubMed

    Sheng, Le; Jin, Yulong; He, Yonghuan; Huang, Yanyan; Yan, Liushui; Zhao, Rui

    2017-11-01

    Superparamagnetic core-shell molecularly imprinted polymer nanoparticles (MIPs) were prepared via surface-initiated reversible-addition fragmentation chain transfer (si-RAFT) polymerization for the selective recognition of 2,4-dichlorophenoxyacetic acid (2,4-D) in real samples. The construction of a uniform core-shell structure with a 50-nm MIP layer was successfully accomplished, which favored mass transfer and resulted in fast recognition kinetics. The static equilibrium experiments revealed the satisfactory adsorption capacity and imprinting efficiency of Fe3O4@MIP. Moreover, the Fe3O4@MIP exhibited high selectivity and affinity towards 2,4-D over structural analogues. The prepared Fe3O4@MIP nanoparticles were used for the selective enrichment of 2,4-D in tap water and Chinese cabbage samples. Combined with RP-HPLC, the recoveries of 2,4-D ranged from 93.1% to 103.3% with RSD of 1.7-5.4% (n = 3) in Chinese cabbage samples. This work provides a versatile approach for fabricating well-constructed core-shell MIP nanoparticles for rapid enrichment and highly selective separation of target molecules in real samples. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Aqueous nitrite ion determination by selective reduction and gas phase nitric oxide chemiluminescence

    NASA Technical Reports Server (NTRS)

    Dunham, A. J.; Barkley, R. M.; Sievers, R. E.; Clarkson, T. W. (Principal Investigator)

    1995-01-01

    An improved method of flow injection analysis for aqueous nitrite ion exploits the sensitivity and selectivity of the nitric oxide (NO) chemiluminescence detector. Trace analysis of nitrite ion in a small sample (5-160 microL) is accomplished by conversion of nitrite ion to NO by aqueous iodide in acid. The resulting NO is transported to the gas phase through a semipermeable membrane and subsequently detected by monitoring the photoemission of the reaction between NO and ozone (O3). Chemiluminescence detection is selective for measurement of NO, and, since the detection occurs in the gas phase, neither sample coloration nor turbidity interferes. The detection limit for a 100-microL sample is 0.04 ppb of nitrite ion. The precision at the 10 ppb level is 2% relative standard deviation, and 60-180 samples can be analyzed per hour. Samples of human saliva and food extracts were analyzed; the results from a standard colorimetric measurement are compared with those from the new chemiluminescence method in order to further validate the latter method. A high degree of selectivity is obtained due to the three discriminating steps in the process: (1) the nitrite ion to NO conversion conditions are virtually specific for nitrite ion, (2) only volatile products of the conversion will be swept to the gas phase (avoiding the turbidity or color problems of spectrophotometric methods), and (3) the NO chemiluminescence detector selectively detects the emission from the NO + O3 reaction. The method is free of interferences, offers detection limits of low parts per billion of nitrite ion, and allows the analysis of up to 180 microliter-sized samples per hour, with little sample preparation and no chromatographic separation. Much smaller samples can be analyzed by this method than in previously reported batch analysis methods, which typically require 5 mL or more of sample and often need chromatographic separations as well.

  3. A selective medium for the isolation of Microbacterium species in oral cavities.

    PubMed

    Tsuzukibashi, Osamu; Uchibori, Satoshi; Kobayashi, Taira; Saito, Masanori; Umezawa, Koji; Ohta, Mitsuhiro; Shinozaki-Kuwahara, Noriko

    2015-09-01

    The genus Microbacterium has been isolated from the environment, dairy goods, and human clinical specimens. Although some Microbacterium species were infrequently detected in oral samples collected from humans in our previous studies, there has been no report of these organisms, which are capable of causing serious systemic infections, being isolated from the human oral cavity. The aim of the present study was to develop a selective medium to isolate the representative Microbacterium species most frequently detected in human clinical specimens, and to reveal the distribution of individual Microbacterium species in the oral cavity. The growth recoveries of representative Microbacterium species on the selective medium, designated as MSM, were sufficient. Moreover, the growth of other representative oral bacteria was markedly inhibited on the selective medium. The proportion of Microbacterium species in the saliva samples of 60 subjects, 20 of whom were removable denture wearers, was then examined. The proportion of these organisms was also examined in environmental samples obtained by swabbing 20 washstands. PCR primers were designed for representative Microbacterium species. The genus Microbacterium was detected in 45% of the saliva and denture plaque samples collected from the twenty removable denture wearers, but was absent in the saliva of the forty non-denture wearers. On the other hand, these organisms were detected in all environmental samples. The genus Microbacterium accounted for 0.00003%, 0.0001%, and 12.6% of the total cultivable bacterial count on BHI medium in the saliva and denture plaque samples of removable denture wearers and in the environmental samples, respectively. The most predominant Microbacterium species in all positive samples was Microbacterium oxydans.
These results indicated that the genus Microbacterium was not a part of the normal flora in the human oral cavity, except for subjects wearing dentures that were contaminated by the environment, and the selective medium, designated as MSM, was useful for isolating Microbacterium species, which are frequently encountered in human clinical specimens, from the various samples. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Mars Rover Sample Return mission

    NASA Technical Reports Server (NTRS)

    Bourke, Roger D.; Kwok, Johnny H.; Friedlander, Alan

    1989-01-01

    To gain a detailed understanding of the character of the planet Mars, it is necessary to send vehicles to the surface and return selected samples for intensive study in earth laboratories. Toward that end, studies have been underway for several years to determine technically feasible means for exploring the surface and returning selected samples. This paper describes several MRSR mission concepts that have emerged from the most recent studies.

  5. Training Program Efficacy in Developing Health Life Skills among Sample Selected from Kindergarten Children

    ERIC Educational Resources Information Center

    Al Mohtadi, Reham Mohammad; Al Zboon, Habis Sa'ad

    2017-01-01

    This study aimed to identify the efficacy of a training program in developing health life skills among a sample selected from kindergarten children. The study sample consisted of 60 children of both genders, aged 5 to 6 years. We applied the pre- and post-test dimensions of a health life skills scale consisting of 28…

  6. 40 CFR Appendix X to Part 86 - Sampling Plans for Selective Enforcement Auditing of Heavy-Duty Engines and Light-Duty Trucks

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 20 2012-07-01 2012-07-01 false Sampling Plans for Selective Enforcement Auditing of Heavy-Duty Engines and Light-Duty Trucks X Appendix X to Part 86 Protection of... AND IN-USE HIGHWAY VEHICLES AND ENGINES (CONTINUED) Pt. 86, App. X Appendix X to Part 86—Sampling...

  7. 40 CFR Appendix X to Part 86 - Sampling Plans for Selective Enforcement Auditing of Heavy-Duty Engines and Light-Duty Trucks

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Sampling Plans for Selective Enforcement Auditing of Heavy-Duty Engines and Light-Duty Trucks X Appendix X to Part 86 Protection of... AND IN-USE HIGHWAY VEHICLES AND ENGINES (CONTINUED) Pt. 86, App. X Appendix X to Part 86—Sampling...

  8. 40 CFR Appendix X to Part 86 - Sampling Plans for Selective Enforcement Auditing of Heavy-Duty Engines and Light-Duty Trucks

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 19 2011-07-01 2011-07-01 false Sampling Plans for Selective Enforcement Auditing of Heavy-Duty Engines and Light-Duty Trucks X Appendix X to Part 86 Protection of... AND IN-USE HIGHWAY VEHICLES AND ENGINES (CONTINUED) Pt. 86, App. X Appendix X to Part 86—Sampling...

  9. 40 CFR Appendix X to Part 86 - Sampling Plans for Selective Enforcement Auditing of Heavy-Duty Engines and Light-Duty Trucks

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 19 2014-07-01 2014-07-01 false Sampling Plans for Selective Enforcement Auditing of Heavy-Duty Engines and Light-Duty Trucks X Appendix X to Part 86 Protection of... AND IN-USE HIGHWAY VEHICLES AND ENGINES Pt. 86, App. X Appendix X to Part 86—Sampling Plans for...

  10. 40 CFR Appendix X to Part 86 - Sampling Plans for Selective Enforcement Auditing of Heavy-Duty Engines and Light-Duty Trucks

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 20 2013-07-01 2013-07-01 false Sampling Plans for Selective Enforcement Auditing of Heavy-Duty Engines and Light-Duty Trucks X Appendix X to Part 86 Protection of... AND IN-USE HIGHWAY VEHICLES AND ENGINES (CONTINUED) Pt. 86, App. X Appendix X to Part 86—Sampling...

  11. Apparatus for transporting hazardous materials

    DOEpatents

    Osterman, Robert A.; Cox, Robert

    1992-01-01

    An apparatus and method are provided for selectively receiving, transporting, and releasing one or more radioactive or other hazardous samples for analysis on a differential thermal analysis (DTA) apparatus. The apparatus includes a portable sample transporting apparatus for storing and transporting the samples and includes a support assembly for supporting the transporting apparatus when a sample is transferred to the DTA apparatus. The transporting apparatus includes a storage member which includes a plurality of storage chambers arrayed circumferentially with respect to a central axis. An adjustable top door is located on the top side of the storage member, and the top door includes a channel capable of being selectively placed in registration with the respective storage chambers thereby permitting the samples to selectively enter the respective storage chambers. The top door, when closed, isolates the respective samples within the storage chambers. A plurality of spring-biased bottom doors are located on the bottom sides of the respective storage chambers. The bottom doors isolate the samples in the respective storage chambers when the bottom doors are in the closed position. The bottom doors permit the samples to leave the respective storage chambers from the bottom side when the respective bottom doors are in respective open positions. The bottom doors permit the samples to be loaded into the respective storage chambers after the analysis for storage and transport to a permanent storage location.

  12. Quantifying Uncertainties from Presence Data Sampling Methods for Species Distribution Modeling: Focused on Vegetation.

    NASA Astrophysics Data System (ADS)

    Sung, S.; Kim, H. G.; Lee, D. K.; Park, J. H.; Mo, Y.; Kil, S.; Park, C.

    2016-12-01

    The impact of climate change has been observed throughout the globe. Ecosystems are experiencing rapid changes such as vegetation shifts and species extinctions. In this context, the Species Distribution Model (SDM) is one of the most popular methods for projecting the impact of climate change on ecosystems. An SDM is fundamentally based on the niche of a given species, which means that presence point data are essential for finding the biological niche of that species. Running an SDM for plants requires certain considerations about the characteristics of vegetation. Vegetation data over large areas are normally produced with remote sensing techniques; as a result, the exact locations of presence points carry high uncertainties, since presence data sets are selected from polygon and raster datasets. Thus, sampling methods for vegetation presence data should be carefully selected. In this study, we used three different sampling methods for the selection of vegetation presence data: random sampling, stratified sampling, and site-index-based sampling. We used the R package BIOMOD2 to assess uncertainty from modeling, and included BioCLIM variables and other environmental variables as input data. Despite differences among the 10 SDMs, the sampling methods produced different ROC values: random sampling showed the lowest ROC value, while site-index-based sampling showed the highest. The study thus quantifies the uncertainties arising from presence data sampling methods and SDMs.
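    As a rough illustration of the difference between two of the sampling schemes named above, the sketch below draws presence cells either completely at random or proportionally per stratum. The function names, the stratum labels, and the cell coordinates are invented for the example; this is not the study's actual sampling code.

    ```python
    import random

    def random_presence_sample(cells, n, seed=0):
        """Simple random sample of n presence cells (each cell a (row, col) pair)."""
        return random.Random(seed).sample(cells, n)

    def stratified_presence_sample(cells_by_stratum, n, seed=0):
        """Draw presence cells proportionally from each stratum
        (e.g., an elevation band or a site-index class)."""
        rng = random.Random(seed)
        total = sum(len(c) for c in cells_by_stratum.values())
        picked = []
        for cells in cells_by_stratum.values():
            # Allocate sample size proportionally to stratum size,
            # keeping at least one cell per stratum.
            k = max(1, round(n * len(cells) / total))
            picked.extend(rng.sample(cells, min(k, len(cells))))
        return picked

    # Hypothetical strata: 80 lowland and 20 upland candidate presence cells.
    strata = {
        "lowland": [(r, 0) for r in range(80)],
        "upland": [(r, 1) for r in range(20)],
    }
    sample = stratified_presence_sample(strata, 10)
    ```

    With these made-up strata, the proportional allocation draws 8 lowland and 2 upland cells, whereas a simple random draw over the pooled cells could easily miss the small upland stratum entirely.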

  13. Microbiological sampling plan based on risk classification to verify supplier selection and production of served meals in food service operation.

    PubMed

    Lahou, Evy; Jacxsens, Liesbeth; Van Landeghem, Filip; Uyttendaele, Mieke

    2014-08-01

    Food service operations are confronted with a diverse range of raw materials and served meals. The implementation of a microbial sampling plan in the framework of verification of suppliers and of their own production process (the functionality of their prerequisite and HACCP programs) demands selection of food products and sampling frequencies. However, these are often selected without a well-described, scientifically underpinned sampling plan. Therefore, an approach is presented on how to set up a focused sampling plan, enabled by a microbial risk categorization of food products, for both incoming raw materials and meals served to consumers. The sampling plan was implemented as a case study during a one-year period in an institutional food service operation to test the feasibility of the chosen approach. This resulted in 123 samples of raw materials and 87 samples of meal servings (focused on high-risk categorized food products) which were analyzed for spoilage bacteria, hygiene indicators and foodborne pathogens. Although sampling plans are intrinsically limited in assessing the quality and safety of sampled foods, the plan was shown to be useful in revealing major non-compliances and opportunities to improve the food safety management system in place. Points of attention deduced in the case study were control of Listeria monocytogenes in raw meat spread and raw fish, as well as the overall microbial quality of served sandwiches and salads. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Adaptive sampling in behavioral surveys.

    PubMed

    Thompson, S K

    1997-01-01

    Studies of populations such as drug users encounter difficulties because the members of the populations are rare, hidden, or hard to reach. Conventionally designed large-scale surveys detect relatively few members of the populations so that estimates of population characteristics have high uncertainty. Ethnographic studies, on the other hand, reach suitable numbers of individuals only through the use of link-tracing, chain referral, or snowball sampling procedures that often leave the investigators unable to make inferences from their sample to the hidden population as a whole. In adaptive sampling, the procedure for selecting people or other units to be in the sample depends on variables of interest observed during the survey, so the design adapts to the population as encountered. For example, when self-reported drug use is found among members of the sample, sampling effort may be increased in nearby areas. Types of adaptive sampling designs include ordinary sequential sampling, adaptive allocation in stratified sampling, adaptive cluster sampling, and optimal model-based designs. Graph sampling refers to situations with nodes (for example, people) connected by edges (such as social links or geographic proximity). An initial sample of nodes or edges is selected and edges are subsequently followed to bring other nodes into the sample. Graph sampling designs include network sampling, snowball sampling, link-tracing, chain referral, and adaptive cluster sampling. A graph sampling design is adaptive if the decision to include linked nodes depends on variables of interest observed on nodes already in the sample. Adjustment methods for nonsampling errors such as imperfect detection of drug users in the sample apply to adaptive as well as conventional designs.
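    A minimal sketch of adaptive cluster sampling, one of the designs listed above, under simplifying assumptions invented for the example: units lie on a one-dimensional line, a unit "qualifies" when its observed value is positive, and the neighbourhood is the two adjacent units. The function name and the `initial` parameter are hypothetical, not from the paper.

    ```python
    import random

    def adaptive_cluster_sample(y, n_init, seed=0, initial=None):
        """Adaptive cluster sampling on a 1-D line of units.
        Start from a simple random sample (or a given initial set); whenever a
        sampled unit shows the condition of interest (y > 0), add its immediate
        neighbours, repeating until no new qualifying neighbours remain."""
        rng = random.Random(seed)
        if initial is not None:
            sampled = set(initial)
        else:
            sampled = set(rng.sample(range(len(y)), n_init))
        frontier = [i for i in sampled if y[i] > 0]
        while frontier:
            i = frontier.pop()
            for j in (i - 1, i + 1):
                if 0 <= j < len(y) and j not in sampled:
                    sampled.add(j)
                    if y[j] > 0:
                        frontier.append(j)
        return sorted(sampled)

    # A cluster of positive units at indices 2-4 is fully captured (plus its
    # bounding zero-valued "edge units") once any member lands in the sample.
    y = [0, 0, 1, 1, 1, 0, 0]
    units = adaptive_cluster_sample(y, n_init=1, initial={2})
    ```

    This captures the adaptive idea in the abstract: the final sample depends on the values observed during the survey, with effort expanding around units where the condition of interest is found.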

  15. Design and status of the RF-digitizer integrated circuit

    NASA Technical Reports Server (NTRS)

    Rayhrer, B.; Lam, B.; Young, L. E.; Srinivasan, J. M.; Thomas, J. B.

    1991-01-01

    An integrated circuit currently under development samples a bandpass-limited signal at a radio frequency in quadrature and then performs a simple sum-and-dump operation in order to filter and lower the rate of the samples. Downconversion to baseband is carried out by the sampling step itself through the aliasing effect of an appropriately selected subharmonic sampling frequency. Two complete RF digitizer circuits with these functions will be implemented with analog and digital elements on one GaAs substrate. An input signal, with a carrier frequency as high as 8 GHz, can be sampled at a rate as high as 600 Msamples/sec for each quadrature component. The initial version of the chip will sign-sample (1-bit) the input RF signal. The chip will contain a synthesizer to generate a sample frequency that is a selectable integer multiple of an input reference frequency. In addition to the usual advantages of compactness and reliability associated with integrated circuits, the single chip will replace several steps required by standard analog downconversion. Furthermore, when a very high initial sample rate is selected, the presampling analog filters can be given very large bandwidths, thereby greatly reducing phase and delay instabilities typically introduced by such filters, as well as phase and delay variation due to Doppler changes.

  16. Water-quality and biological data for selected streams, lakes, and wells in the High Point Lake watershed, Guilford County, North Carolina, 1988-89

    USGS Publications Warehouse

    Davenport, M.S.

    1993-01-01

    Water and bottom-sediment samples were collected at 26 sites in the 65-square-mile High Point Lake watershed area of Guilford County, North Carolina, from December 1988 through December 1989. Sampling locations included 10 stream sites, 8 lake sites, and 8 ground-water sites. Generally, six steady-flow samples were collected at each stream site and three storm samples were collected at five sites. Four lake samples and eight ground-water samples also were collected. Chemical analyses of stream and lake sediments and particle-size analyses of lake sediments were performed once during the study. Most stream and lake samples were analyzed for field characteristics, nutrients, major ions, trace elements, total organic carbon, and chemical-oxygen demand. Analyses were performed to detect concentrations of 149 selected organic compounds, including acid and base/neutral extractable and volatile constituents and carbamate, chlorophenoxy acid, triazine, organochlorine, and organophosphorus pesticides and herbicides. Selected lake samples were analyzed for all constituents listed in the Safe Drinking Water Act of 1986, including Giardia, Legionella, radiochemicals, asbestos, and viruses. Various chromatograms from organic analyses were submitted to computerized library searches. The results of these and all other analyses presented in this report are in tabular form.

  17. Sampling for Patient Exit Interviews: Assessment of Methods Using Mathematical Derivation and Computer Simulations.

    PubMed

    Geldsetzer, Pascal; Fink, Günther; Vaikath, Maria; Bärnighausen, Till

    2018-02-01

    (1) To evaluate the operational efficiency of various sampling methods for patient exit interviews; (2) to discuss under what circumstances each method yields an unbiased sample; and (3) to propose a new, operationally efficient, and unbiased sampling method. Literature review, mathematical derivation, and Monte Carlo simulations. Our simulations show that in patient exit interviews it is most operationally efficient if the interviewer, after completing an interview, selects the next patient exiting the clinical consultation. We demonstrate mathematically that this method yields a biased sample: patients who spend a longer time with the clinician are overrepresented. This bias can be removed by selecting the next patient who enters, rather than exits, the consultation room. We show that this sampling method is operationally more efficient than alternative methods (systematic and simple random sampling) in most primary health care settings. Under the assumption that the order in which patients enter the consultation room is unrelated to the length of time spent with the clinician and the interviewer, selecting the next patient entering the consultation room tends to be the operationally most efficient unbiased sampling method for patient exit interviews. © 2016 The Authors. Health Services Research published by Wiley Periodicals, Inc. on behalf of Health Research and Educational Trust.
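    The length bias described above can be demonstrated with a small simulation. This is an illustrative sketch of length-biased sampling, not the paper's actual Monte Carlo code, and the consultation durations are made up: an interviewer who finishes at an effectively random time and takes the next patient to exit is, in effect, picking the patient currently in consultation, whose probability of selection is proportional to consultation length.

    ```python
    import random

    def exit_selection_bias(durations, n_draws, seed=0):
        """Lay consultations end to end on a timeline; a uniform random time
        falls inside a given patient's consultation with probability
        proportional to its duration, mimicking 'select the next patient
        to exit'. Returns the index picked on each draw."""
        rng = random.Random(seed)
        total = sum(durations)
        picks = []
        for _ in range(n_draws):
            t = rng.uniform(0, total)
            acc = 0.0
            for i, d in enumerate(durations):
                acc += d
                if t < acc:
                    picks.append(i)
                    break
        return picks

    durations = [1.0, 9.0]  # one short and one long consultation
    picks = exit_selection_bias(durations, 10000)
    long_share = picks.count(1) / len(picks)  # close to 0.9, not the unbiased 0.5
    ```

    Selecting the next patient to *enter* the consultation room instead corresponds to drawing indices uniformly, which restores the 50/50 split regardless of consultation length.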

  18. The Complete Local-Volume Groups Sample (CLoGS): Early results from X-ray and radio observations

    NASA Astrophysics Data System (ADS)

    Vrtilek, Jan M.; O'Sullivan, Ewan; David, Laurence P.; Giacintucci, Simona; Kolokythas, Konstantinos

    2017-08-01

    Although the group environment is the dominant locus of galaxy evolution (in contrast to rich clusters, which contain only a few percent of galaxies), there has been a lack of reliable, representative group samples in the local Universe. In particular, X-ray selected samples are strongly biased in favor of the X-ray bright, centrally-concentrated cool-core systems. In response, we have designed the Complete Local-Volume Groups Sample (CLoGS), an optically-selected statistically-complete sample of 53 groups within 80 Mpc which is intended to overcome the limitations of X-ray selected samples and serve as a representative survey of groups in the local Universe. We have supplemented X-ray data from Chandra and XMM (70% complete to date, using both archival and new observations, with a 26-group high richness subsample 100% complete) with GMRT radio continuum observations (at 235 and 610 MHz, complete for the entire sample). CLoGS includes groups with a wide variety of properties in terms of galaxy population, hot gas content, and AGN power. We here describe early results from the survey, including the range of AGN activity observed in the dominant galaxies, the relative fraction of cool-core and non-cool-core groups in our sample, and the degree of disturbance observed in the IGM.

  19. Relativistic effects on galaxy redshift samples due to target selection

    NASA Astrophysics Data System (ADS)

    Alam, Shadab; Croft, Rupert A. C.; Ho, Shirley; Zhu, Hongyu; Giusarma, Elena

    2017-10-01

    In a galaxy redshift survey, the objects to be targeted for spectra are selected from a photometrically observed sample. The observed magnitudes and colours of galaxies in this parent sample will be affected by their peculiar velocities, through relativistic Doppler and relativistic beaming effects. In this paper, we compute the resulting expected changes in galaxy photometry. The magnitudes of the relativistic effects are a function of redshift, stellar mass, galaxy velocity and velocity direction. We focus on the CMASS sample from the Sloan Digital Sky Survey (SDSS) and Baryon Oscillation Spectroscopic Survey (BOSS), which is selected on the basis of colour and magnitude. We find that 0.10 per cent of the sample (∼585 galaxies) has been scattered into the targeted region of colour-magnitude space by relativistic effects, and conversely 0.09 per cent of the sample (∼532 galaxies) has been scattered out. Observational consequences of these effects include an asymmetry in clustering statistics, which we explore in a companion paper. Here, we compute a set of weights that can be used to remove the effect of modulations introduced into the density field inferred from a galaxy sample. We conclude by investigating the possible effects of these relativistic modulations on the large-scale clustering of the galaxy sample.

  20. Reweighting of the primary sampling units in the National Automotive Sampling System

    DOT National Transportation Integrated Search

    1997-09-01

    The original design of the National Automotive Sampling System - formerly the National Accident Sampling System - called for 75 PSUs randomly selected from PSUs which were grouped into various strata across the U.S. The implementation of the PSU samp...

  1. 7 CFR 275.14 - Review processing.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... addition, the sample of active and negative cases shall be selected in accordance with the sampling techniques described in the Quality Control Sampling Handbook, FNS Handbook 311. (c) Worksheets. The...

  2. Sampling Large Graphs for Anticipatory Analytics

    DTIC Science & Technology

    2015-05-15

    low. C. Random Area Sampling Random area sampling [8] is a “ snowball ” sampling method in which a set of random seed vertices are selected and areas... Sampling Large Graphs for Anticipatory Analytics Lauren Edwards, Luke Johnson, Maja Milosavljevic, Vijay Gadepally, Benjamin A. Miller Lincoln...systems, greater human-in-the-loop involvement, or through complex algorithms. We are investigating the use of sampling to mitigate these challenges

  3. CRUMP 2003 Selected Water Sample Results

    EPA Pesticide Factsheets

    Point locations and water sampling results obtained in 2003 by the Church Rock Uranium Monitoring Project (CRUMP), a consortium of organizations (Navajo Nation Environmental Protection Agency, US Environmental Protection Agency, New Mexico Scientific Laboratory Division, Navajo Tribal Utility Authority and NM Water Quality Control Commission). Samples include a general description of the wells sampled, general chemistry, heavy metals and aesthetic parameters, and selected radionuclides. Only six sampling results are presented in this point shapefile: Gross Alpha (U-Nat Ref.) (pCi/L), Gross Beta (Sr/Y-90 Ref.) (pCi/L), Radium-226 (pCi/L), Radium-228 (pCi/L), Total Uranium (pCi/L), and Uranium mass (ug/L). The CRUMP samples were collected in the area of Churchrock, NM in the Eastern AUM Region of the Navajo Nation.

  4. Water-quality, well-construction, and ground-water level data for an investigation of radionuclides in ground water, Hickman and Maury counties, Tennessee

    USGS Publications Warehouse

    Hileman, G.E.

    1990-01-01

    Water quality, well construction, and groundwater level data were collected for an investigation of radionuclides in groundwater in Maury and Hickman Counties, Tennessee. Seventeen wells and 3 springs were sampled in Hickman County, and 20 wells were sampled in Maury County. Samples from each site were analyzed for radionuclides, common and trace inorganic ions, indicators of redox conditions, selected nutrients, total organic carbon, and selected physical characteristics. Well-construction data were obtained to help determine the source of the water. Where possible, groundwater level measurements were made for each well sampled. Samples were collected from May 1989 through mid-August 1989. Data are presented in tables. Maps of each county show the location of the sites sampled. (USGS)

  5. Strategy for Ranking the Science Value of the Surface of Asteroid 101955 Bennu for Sample Site Selection for Osiris-REx

    NASA Technical Reports Server (NTRS)

    Nakamura-Messenger, K.; Connolly, H. C., Jr.; Lauretta, D. S.

    2014-01-01

    OSIRIS-REx is NASA's New Frontiers 3 sample return mission that will return at least 60 g of pristine surface material from near-Earth asteroid 101955 Bennu in September 2023. The scientific value of the sample increases enormously with the amount of knowledge captured about the geological context from which the sample is collected. The OSIRIS-REx spacecraft is highly maneuverable and capable of investigating the surface of Bennu at scales down to the sub-cm. The OSIRIS-REx instruments will characterize the overall surface geology, including spectral properties, microtexture, and geochemistry of the regolith at the sampling site in exquisite detail for up to 505 days after encountering Bennu in August 2018. The mission requires at the very minimum one acceptable location on the asteroid where a touch-and-go (TAG) sample collection maneuver can be successfully performed. Sample site selection requires that the following maps be produced: Safety, Deliverability, Sampleability, and finally Science Value. If areas on the surface are designated as safe, navigation can fly to them, and they have ingestible regolith, then the scientific value of one site over another will guide site selection.

  6. Studying hardness, workability and minimum bending radius in selectively laser-sintered Ti–6Al–4V alloy samples

    NASA Astrophysics Data System (ADS)

    Galkina, N. V.; Nosova, Y. A.; Balyakin, A. V.

    2018-03-01

    This research seeks to improve the mechanical and service performance of the Ti–6Al–4V titanium alloy obtained by selective laser sintering. For that purpose, sintered samples were annealed at 750 and 850°C for an hour. Sintered and annealed samples were tested for hardness, workability and microstructure. It was found that incomplete annealing of selectively laser-sintered Ti–6Al–4V samples results in an insignificant reduction in hardness and ductility. Sintered and incompletely annealed samples had a hardness of 32-33 HRC, which is lower than the value specified in standards for annealed parts. Complete annealing at 850°C reduces the hardness to 25 HRC and the ductility by 15-20%. Incomplete annealing lowers the ductility factor from 0.08 to 0.06; complete annealing lowers it to 0.025. Complete annealing probably results in the embrittlement of sintered samples, perhaps due to their oxidation and hydrogenation in air. Optical metallography showed lateral fractures in both sintered and annealed samples, which might be the reason for their lower hardness and ductility.

  7. Molecularly imprinted membrane extraction combined with high-performance liquid chromatography for selective analysis of cloxacillin from shrimp samples.

    PubMed

    Du, Wei; Sun, Min; Guo, Pengqi; Chang, Chun; Fu, Qiang

    2018-09-01

    Nowadays, the abuse of antibiotics in aquaculture has generated considerable problems for food safety. Therefore, it is imperative to develop a simple and selective method for monitoring the illegal use of antibiotics in aquatic products. In this study, a method combining molecularly imprinted membrane (MIM) extraction and liquid chromatography was developed for the selective analysis of cloxacillin from shrimp samples. The MIMs were synthesized by UV photopolymerization and characterized by scanning electron microscopy, Fourier transform infrared spectroscopy, thermogravimetric analysis and a swelling test. The results showed that the MIMs exhibited excellent permselectivity, high adsorption capacity and a fast adsorption rate for cloxacillin. Finally, the method was used to determine cloxacillin in shrimp samples, with good accuracy and acceptable relative standard deviation values for precision. The proposed method is a promising alternative for the selective analysis of cloxacillin in shrimp samples, due to its easy operation and excellent selectivity. Copyright © 2018. Published by Elsevier Ltd.

  8. Autocorrelation of location estimates and the analysis of radiotracking data

    USGS Publications Warehouse

    Otis, D.L.; White, Gary C.

    1999-01-01

    The wildlife literature has been contradictory about the importance of autocorrelation in radiotracking data used for home range estimation and hypothesis tests of habitat selection. By definition, the concept of a home range involves autocorrelated movements, but estimates or hypothesis tests based on sampling designs that predefine a time frame of interest, and that generate representative samples of an animal's movement during this time frame, should not be affected by length of the sampling interval and autocorrelation. Intensive sampling of the individual's home range and habitat use during the time frame of the study leads to improved estimates for the individual, but use of location estimates as the sample unit to compare across animals is pseudoreplication. We therefore recommend against use of habitat selection analysis techniques that use locations instead of individuals as the sample unit. We offer a general outline for sampling designs for radiotracking studies.

  9. Environmental sampling and analysis in support of NTI-3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGuire, R.R.; Harrar, J.E.; Haas, J.S.

    1991-04-06

    The third National Trial Inspection (NTI-3) took place at the Monsanto Chemical Plant in Luling, Louisiana. In order to test the effectiveness of environmental sampling (soil, water and air) in determining the nature of the chemical process in a given production plant, and to examine the distance from a process building at which samples can effectively be taken, we needed to select some materials that constituted components of process streams. Three materials were selected: 1. isopropyl amine, for air monitoring; 2. 4-nitrophenol, one of the precursors in the acetaminophen process; and 3. an intermediate in the production of glyphosate for ROUNDUP that is known simply as glyphosate intermediate. LLNL did not participate in the air sampling or the analysis for isopropyl amine. This paper discusses the steps of this experiment, including sample collection, sample workup, and sample analysis, followed by the results, discussion, and conclusions. 3 figs., 6 tabs.

  10. Statistical methods for efficient design of community surveys of response to noise: Random coefficients regression models

    NASA Technical Reports Server (NTRS)

    Tomberlin, T. J.

    1985-01-01

    Research studies of residents' responses to noise consist of interviews with samples of individuals drawn from a number of different compact study areas. The statistical techniques developed here provide a basis for the sample design decisions in such studies, and they are suitable for a wide range of sample survey applications. A sample may consist of a random sample of residents selected from a sample of compact study areas, or, in a more complex design, of a sample of residents selected from a sample of larger areas (e.g., cities). The techniques may be applied to estimates of the effects on annoyance of noise level, the number of noise events, the time of day of the events, ambient noise levels, or other factors. Methods are provided for determining, in advance, how accurately these effects can be estimated for different sample sizes and study designs. Using a simple cost function, they also provide for optimum allocation of the sample across the stages of the design for estimating these effects. These techniques are developed via a regression model in which the regression coefficients are assumed to be random, with components of variance associated with the various stages of a multi-stage sample design.
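
    The random-coefficients idea above can be illustrated with a small simulation (a hypothetical sketch, not the report's program; the area counts, variance components, and noise-exposure range are assumed values): each study area gets its own slope of annoyance on noise level, and the variance of the per-area OLS slopes decomposes into a between-area component plus within-area sampling variance.

```python
import numpy as np

rng = np.random.default_rng(7)
n_areas, n_resp = 50, 20
tau2 = 0.04        # between-area variance of the noise-effect slope (assumed)
sigma2 = 1.0       # residual variance within areas (assumed)
beta = 0.5         # mean effect of noise level on annoyance (assumed)

slopes_hat, slope_vars = [], []
for _ in range(n_areas):
    b = rng.normal(beta, np.sqrt(tau2))           # area-specific random slope
    x = rng.uniform(50, 80, size=n_resp)          # noise exposure in dB
    y = b * x + rng.normal(0, np.sqrt(sigma2), size=n_resp)
    xc = x - x.mean()
    slopes_hat.append(np.sum(xc * (y - y.mean())) / np.sum(xc**2))  # OLS slope
    slope_vars.append(sigma2 / np.sum(xc**2))     # its within-area sampling variance

# Method-of-moments estimate of the between-area slope variance:
tau2_hat = np.var(slopes_hat, ddof=1) - np.mean(slope_vars)
print(f"mean slope {np.mean(slopes_hat):.3f}, between-area variance {tau2_hat:.4f}")
```

    Comparing `tau2_hat` for different allocations of areas versus respondents per area is the kind of trade-off the cost-function analysis in the abstract optimizes.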

  11. Methods for producing silicon carbide architectural preforms

    NASA Technical Reports Server (NTRS)

    DiCarlo, James A. (Inventor); Yun, Hee (Inventor)

    2010-01-01

    Methods are disclosed for producing architectural preforms and high-temperature composite structures containing high-strength ceramic fibers with reduced preforming stresses within each fiber, with an in-situ grown coating on each fiber surface, with reduced boron within the bulk of each fiber, and with improved tensile creep and rupture resistance properties for each fiber. The methods include the steps of preparing an original sample of a preform formed from a pre-selected high-strength silicon carbide ceramic fiber type, placing the original sample in a processing furnace under a pre-selected preforming stress state and thermally treating the sample in the processing furnace at a pre-selected processing temperature and hold time in a processing gas having a pre-selected composition, pressure, and flow rate. For the high-temperature composite structures, the method includes additional steps of depositing a thin interphase coating on the surface of each fiber and forming a ceramic or carbon-based matrix within the sample.

  12. Arrayed Micro-Ring Spectrometer System and Method of Use

    NASA Technical Reports Server (NTRS)

    Choi, Sang H. (Inventor); Park, Yeonjoon (Inventor); King, Glen C. (Inventor); Elliott, James R. (Inventor)

    2012-01-01

    A spectrometer system includes an array of micro-zone plates (MZP) each having coaxially-aligned ring gratings, a sample plate for supporting and illuminating a sample, and an array of photon detectors for measuring a spectral characteristic of the predetermined wavelength. The sample plate emits an evanescent wave in response to incident light, which excites molecules of the sample to thereby cause an emission of secondary photons. A method of detecting the intensity of a selected wavelength of incident light includes directing the incident light onto an array of MZP, diffracting a selected wavelength of the incident light onto a target focal point using the array of MZP, and detecting the intensity of the selected portion using an array of photon detectors. An electro-optic layer positioned adjacent to the array of MZP may be excited via an applied voltage to select the wavelength of the incident light.

  13. Molecularly imprinted covalent organic polymers for the selective extraction of benzoxazole fluorescent whitening agents from food samples.

    PubMed

    Ding, Hui; Wang, Rongyu; Wang, Xiao; Ji, Wenhua

    2018-06-21

    Molecularly imprinted covalent organic polymers were constructed by an imine-linking reaction between 1,3,5-triformylphloroglucinol and 2,6-diaminopyridine and used for the selective solid-phase extraction of benzoxazole fluorescent whitening agents from food samples. Binding experiments showed that imprinting sites on the molecularly imprinted polymers had higher selectivity for targets compared with those of the corresponding non-imprinted polymers. Parameters affecting the solid-phase extraction procedure were examined. Under optimal conditions, actual samples were treated and the eluent was analyzed with high-performance liquid chromatography with diode-array detection. The results showed that the established method offered wide linearity, satisfactory detection and quantification limits, and acceptable recoveries. Thus, the developed method has practical potential for the selective determination of benzoxazole fluorescent whitening agents in complex food samples. This article is protected by copyright. All rights reserved.

  14. Soy sauce classification by geographic region and fermentation based on artificial neural network and genetic algorithm.

    PubMed

    Xu, Libin; Li, Yang; Xu, Ning; Hu, Yong; Wang, Chao; He, Jianjun; Cao, Yueze; Chen, Shigui; Li, Dongsheng

    2014-12-24

    This work demonstrated the possibility of using artificial neural networks to classify soy sauce from China. The aroma profiles of different soy sauce samples were differentiated using headspace solid-phase microextraction. The soy sauce samples were analyzed by gas chromatography-mass spectrometry, and 22 and 15 volatile aroma compounds were selected for sensitivity analysis to classify the samples by fermentation and geographic region, respectively. The 15 selected samples can be classified by fermentation and geographic region with a prediction success rate of 100%. Furans and phenols represented the variables with the greatest contribution in classifying soy sauce samples by fermentation and geographic region, respectively.

  15. Method and apparatus for sampling atmospheric mercury

    DOEpatents

    Trujillo, Patricio E.; Campbell, Evan E.; Eutsler, Bernard C.

    1976-01-20

    A method of simultaneously sampling particulate mercury, organic mercurial vapors, and metallic mercury vapor in the working and occupational environment and determining the amount of mercury derived from each such source in the sampled air. A known volume of air is passed through a sampling tube containing a filter for particulate mercury collection, a first adsorber for the selective adsorption of organic mercurial vapors, and a second adsorber for the adsorption of metallic mercury vapor. Carbon black molecular sieves are particularly useful as the selective adsorber for organic mercurial vapors. The amount of mercury adsorbed or collected in each section of the sampling tube is readily quantitatively determined by flameless atomic absorption spectrophotometry.

  16. The structured ancestral selection graph and the many-demes limit.

    PubMed

    Slade, Paul F; Wakeley, John

    2005-02-01

    We show that the unstructured ancestral selection graph applies to part of the history of a sample from a population structured by restricted migration among subpopulations, or demes. The result holds in the limit as the number of demes tends to infinity with proportionately weak selection, and we have also made the assumptions of island-type migration and that demes are equivalent in size. After an instantaneous sample-size adjustment, this structured ancestral selection graph converges to an unstructured ancestral selection graph with a mutation parameter that depends inversely on the migration rate. In contrast, the selection parameter for the population is independent of the migration rate and is identical to the selection parameter in an unstructured population. We show analytically that estimators of the migration rate, based on pairwise sequence differences, derived under the assumption of neutrality should perform equally well in the presence of weak selection. We also modify an algorithm for simulating genealogies conditional on the frequencies of two selected alleles in a sample. This permits efficient simulation of stronger selection than was previously possible. Using this new algorithm, we simulate gene genealogies under the many-demes ancestral selection graph and identify some situations in which migration has a strong effect on the time to the most recent common ancestor of the sample. We find that a similar effect also increases the sensitivity of the genealogy to selection.

  17. Statistical analysis tables for truncated or censored samples

    NASA Technical Reports Server (NTRS)

    Cohen, A. C.; Cooley, C. G.

    1971-01-01

    Compilation describes characteristics of truncated and censored samples, and presents six illustrations of practical use of tables in computing mean and variance estimates for normal distribution using selected samples.

  18. Designing Intervention Studies: Selected Populations, Range Restrictions, and Statistical Power

    PubMed Central

    Miciak, Jeremy; Taylor, W. Pat; Stuebing, Karla K.; Fletcher, Jack M.; Vaughn, Sharon

    2016-01-01

    An appropriate estimate of statistical power is critical for the design of intervention studies. Although the inclusion of a pretest covariate in the test of the primary outcome can increase statistical power, samples selected on the basis of pretest performance may demonstrate range restriction on the selection measure and other correlated measures. This can result in attenuated pretest-posttest correlations, reducing the variance explained by the pretest covariate. We investigated the implications of two potential range restriction scenarios: direct truncation on a selection measure and indirect range restriction on correlated measures. Empirical and simulated data indicated direct range restriction on the pretest covariate greatly reduced statistical power and necessitated sample size increases of 82%–155% (dependent on selection criteria) to achieve equivalent statistical power to parameters with unrestricted samples. However, measures demonstrating indirect range restriction required much smaller sample size increases (32%–71%) under equivalent scenarios. Additional analyses manipulated the correlations between measures and pretest-posttest correlations to guide planning experiments. Results highlight the need to differentiate between selection measures and potential covariates and to investigate range restriction as a factor impacting statistical power. PMID:28479943

  19. Designing Intervention Studies: Selected Populations, Range Restrictions, and Statistical Power.

    PubMed

    Miciak, Jeremy; Taylor, W Pat; Stuebing, Karla K; Fletcher, Jack M; Vaughn, Sharon

    2016-01-01

    An appropriate estimate of statistical power is critical for the design of intervention studies. Although the inclusion of a pretest covariate in the test of the primary outcome can increase statistical power, samples selected on the basis of pretest performance may demonstrate range restriction on the selection measure and other correlated measures. This can result in attenuated pretest-posttest correlations, reducing the variance explained by the pretest covariate. We investigated the implications of two potential range restriction scenarios: direct truncation on a selection measure and indirect range restriction on correlated measures. Empirical and simulated data indicated direct range restriction on the pretest covariate greatly reduced statistical power and necessitated sample size increases of 82%-155% (dependent on selection criteria) to achieve equivalent statistical power to parameters with unrestricted samples. However, measures demonstrating indirect range restriction required much smaller sample size increases (32%-71%) under equivalent scenarios. Additional analyses manipulated the correlations between measures and pretest-posttest correlations to guide planning experiments. Results highlight the need to differentiate between selection measures and potential covariates and to investigate range restriction as a factor impacting statistical power.
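
    The direct-truncation effect described in these two records is easy to reproduce numerically (a minimal sketch with assumed values: population pretest-posttest correlation 0.7, selection of the bottom 30% on the pretest): truncating on the pretest shrinks its variance and attenuates the observed pretest-posttest correlation, which is what erodes the covariate's contribution to statistical power.

```python
import numpy as np

rng = np.random.default_rng(42)
rho = 0.7          # population pretest-posttest correlation (assumed value)
n = 200_000
pre = rng.normal(size=n)
post = rho * pre + np.sqrt(1 - rho**2) * rng.normal(size=n)

# Direct truncation: keep only examinees below the 30th percentile on the pretest,
# as when an intervention sample is selected for low pretest performance.
cut = np.quantile(pre, 0.30)
mask = pre <= cut
r_full = np.corrcoef(pre, post)[0, 1]
r_trunc = np.corrcoef(pre[mask], post[mask])[0, 1]
print(f"full-range r = {r_full:.3f}, truncated r = {r_trunc:.3f}")
```

    The attenuated `r_trunc` is what the pretest covariate actually delivers in the selected sample, which is why the required sample-size increases above are so large under direct range restriction.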

  20. VARIABLE SELECTION IN NONPARAMETRIC ADDITIVE MODELS

    PubMed Central

    Huang, Jian; Horowitz, Joel L.; Wei, Fengrong

    2010-01-01

    We consider a nonparametric additive model of a conditional mean function in which the number of variables and additive components may be larger than the sample size but the number of nonzero additive components is “small” relative to the sample size. The statistical problem is to determine which additive components are nonzero. The additive components are approximated by truncated series expansions with B-spline bases. With this approximation, the problem of component selection becomes that of selecting the groups of coefficients in the expansion. We apply the adaptive group Lasso to select nonzero components, using the group Lasso to obtain an initial estimator and reduce the dimension of the problem. We give conditions under which the group Lasso selects a model whose number of components is comparable with the underlying model, and the adaptive group Lasso selects the nonzero components correctly with probability approaching one as the sample size increases and achieves the optimal rate of convergence. The results of Monte Carlo experiments show that the adaptive group Lasso procedure works well with samples of moderate size. A data example is used to illustrate the application of the proposed method. PMID:21127739
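
    The two-stage procedure (group Lasso for an initial estimate and dimension reduction, then adaptive group Lasso with inverse-norm weights) can be sketched as follows. This is a generic proximal-gradient implementation under simplifying assumptions, not the authors' code: a standardized cubic polynomial basis stands in for the B-spline expansion, and the toy model has two nonzero additive components out of four.

```python
import numpy as np

def group_soft_threshold(z, t):
    # Block soft-thresholding: shrink the whole coefficient group toward zero.
    nrm = np.linalg.norm(z)
    return np.zeros_like(z) if nrm <= t else (1.0 - t / nrm) * z

def group_lasso(X, y, groups, lam, weights=None, n_iter=2000):
    # Proximal gradient for (1/2n)||y - Xb||^2 + lam * sum_g w_g ||b_g||.
    n, p = X.shape
    w = np.ones(len(groups)) if weights is None else weights
    beta = np.zeros(p)
    step = 1.0 / (np.linalg.norm(X, 2) ** 2 / n)  # 1/L for the smooth part
    for _ in range(n_iter):
        z = beta - step * (X.T @ (X @ beta - y) / n)
        for g, idx in enumerate(groups):
            beta[idx] = group_soft_threshold(z[idx], step * lam * w[g])
    return beta

# Toy additive model: y depends on x0 and x1 only; x2 and x3 are irrelevant.
rng = np.random.default_rng(3)
n, d, deg = 400, 4, 3
Xraw = rng.uniform(-1, 1, size=(n, d))
y = np.sin(np.pi * Xraw[:, 0]) + 1.5 * Xraw[:, 1] ** 2 + 0.1 * rng.normal(size=n)
y = y - y.mean()

# One standardized polynomial group per variable (stand-in for B-splines).
cols, groups, start = [], [], 0
for j in range(d):
    B = np.column_stack([Xraw[:, j] ** k for k in range(1, deg + 1)])
    B = (B - B.mean(axis=0)) / B.std(axis=0)
    cols.append(B)
    groups.append(np.arange(start, start + deg))
    start += deg
X = np.hstack(cols)

# Stage 1: plain group Lasso for an initial estimate.
beta0 = group_lasso(X, y, groups, lam=0.05)
# Stage 2: adaptive group Lasso with inverse-norm weights from stage 1.
norms = np.array([np.linalg.norm(beta0[idx]) for idx in groups])
beta_ad = group_lasso(X, y, groups, lam=0.05, weights=1.0 / np.maximum(norms, 1e-8))
active = [g for g, idx in enumerate(groups) if np.linalg.norm(beta_ad[idx]) > 1e-6]
print("selected components:", active)
```

    Components whose stage-1 norms are near zero receive very large weights and are driven exactly to zero in stage 2, which is the mechanism behind the selection-consistency result in the abstract.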

  1. Point-Sampling and Line-Sampling Probability Theory, Geometric Implications, Synthesis

    Treesearch

    L.R. Grosenbaugh

    1958-01-01

    Foresters concerned with measuring tree populations on definite areas have long employed two well-known methods of representative sampling. In list or enumerative sampling the entire tree population is tallied with a known proportion being randomly selected and measured for volume or other variables. In area sampling all trees on randomly located plots or strips...

  2. 50 CFR 260.57 - How samples are drawn by inspectors or licensed samplers.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 50 Wildlife and Fisheries 7 2010-10-01 2010-10-01 false How samples are drawn by inspectors or... drawn by inspectors or licensed samplers. An inspector or a licensed sampler shall select samples, upon... representative sample of the lot. Samples drawn for inspection shall be furnished by the applicant at no cost to...

  3. Sampling Operations on Big Data

    DTIC Science & Technology

    2015-11-29

    ...gories. These include edge sampling methods, where edges are selected by a predetermined criterion; snowball sampling methods, where algorithms start... Sampling Operations on Big Data. Vijay Gadepally, Taylor Herr, Luke Johnson, Lauren Milechin, Maja Milosavljevic, Benjamin A. Miller, Lincoln... process and disseminate information for discovery and exploration under real-time constraints. Common signal processing operations such as sampling and...

  4. Assessing Generative Braille Responding Following Training in a Matching-to-Sample Format

    ERIC Educational Resources Information Center

    Putnam, Brittany C.; Tiger, Jeffrey H.

    2016-01-01

    We evaluated the effects of teaching sighted college students to select printed text letters given a braille sample stimulus in a matching-to-sample (MTS) format on the emergence of untrained (a) construction of print characters given braille samples, (b) construction of braille characters given print samples, (c) transcription of print characters…

  5. Galaxy And Mass Assembly (GAMA): Improved emission lines measurements in four representative samples at 0.07

    NASA Astrophysics Data System (ADS)

    Rodrigues, M.; Foster, C.; Taylor, E. N.; Wright, A. H.; Hopkins, A. M.; Baldry, I.; Brough, S.; Bland-Hawthorn, J.; Cluver, M. E.; Lara-López, M. A.; Liske, J.; López-Sánchez, Á. R.; Pimbblet, K. A.

    2016-05-01

    This paper presents a new catalog of emission lines based on the GAMA II data for galaxies between 0.07 < z < 0.34, representative down to log M∗ > 9.4 at z ~ 0.1 and log M∗ > 10.6 at z ~ 0.30. We have developed a dedicated code called MARVIN that automates the main steps of the data analysis but imposes individual visual quality control of each measurement. We use this catalog to investigate how the sample selection influences the shape of the stellar mass - metallicity (M - Z) relation. We find that commonly used selection criteria on line detections and AGN rejection can affect the shape and dispersion of the high-mass end of the M - Z relation. For log M∗ > 10.6, common selection criteria reject about 65% of the emission-line galaxies. We also find that the relation does not evolve significantly from z = 0.07 to z = 0.34 in the range of stellar mass for which the samples are representative (log M∗ > 10.6). For lower stellar masses (log M∗ < 10.2) we are able to show that the observed 0.15 dex metallicity decrease in the same redshift range is a consequence of a color bias arising from selecting targets in the r-band. We highlight that this color selection bias affects all samples selected in the r-band (e.g., GAMA and SDSS), even those drawn from volume-limited samples. Previously reported evolution of the M - Z relation at various redshifts may need to be revised to evaluate the effect of this selection bias.

  6. Instrumentation of sampling aircraft for measurement of launch vehicle effluents

    NASA Technical Reports Server (NTRS)

    Wornom, D. E.; Woods, D. C.; Thomas, M. E.; Tyson, R. W.

    1977-01-01

    An aircraft was selected and instrumented to measure effluents emitted from large solid propellant rockets during launch activities. The considerations involved in aircraft selection, sampling probes, and instrumentation are discussed with respect to obtaining valid airborne measurements. Discussions of the data acquisition system used, the instrument power system, and operational sampling procedures are included. Representative measurements obtained from an actual rocket launch monitoring activity are also presented.

  7. 2008 Homeland Security S and T Stakeholders Conference West volume 2 Monday

    DTIC Science & Technology

    2008-01-16

    ...per collection and pressure to be applied, etc. - Environmental effects; dry vs. wet surface (vs. type of sample swipe), clean vs. dirty surfaces... selection of collection via low-volume or high-volume sampling, distance to suspect item critical, etc. - Environmental effects; temperature (range of... selection of material, collection via hand wiping or sampling wand, area per collection and pressure to be applied, etc. - Environmental effects; dry...

  8. Robust online tracking via adaptive samples selection with saliency detection

    NASA Astrophysics Data System (ADS)

    Yan, Jia; Chen, Xi; Zhu, QiuPing

    2013-12-01

    Online tracking has shown to be successful in tracking previously unknown objects. However, two important factors lead to the drift problem in online tracking: one is how to select exactly labeled samples even when the target locations are inaccurate, and the other is how to handle confusors that have features similar to the target. In this article, we propose a robust online tracking algorithm with adaptive sample selection based on saliency detection to overcome the drift problem. To deal with the problem of degrading the classifiers with misaligned samples, we introduce a saliency detection method into our tracking problem. Saliency maps and the strong classifiers are combined to extract the most correct positive samples. Our approach employs a simple yet effective saliency detection algorithm based on image spectral residual analysis. Furthermore, instead of using random patches as the negative samples, we propose a reasonable selection criterion in which both the saliency confidence and similarity are considered, with the benefit that confusors in the surrounding background are incorporated into the classifier update process before drift occurs. The tracking task is formulated as binary classification via an online boosting framework. Experimental results on several challenging video sequences demonstrate the accuracy and stability of our tracker.
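
    The image spectral residual analysis the tracker relies on is compact enough to sketch in full (a pure-NumPy approximation in the style of the Hou-Zhang spectral-residual method; the 3×3 log-spectrum smoothing and the test image are illustrative choices, not the authors' exact settings):

```python
import numpy as np

def spectral_residual_saliency(image):
    """Saliency map from the residual of the log amplitude spectrum."""
    F = np.fft.fft2(image)
    log_amp = np.log(np.abs(F) + 1e-9)
    phase = np.angle(F)
    # Local average of the log spectrum via a 3x3 box filter (wrap-around).
    pad = np.pad(log_amp, 1, mode="wrap")
    avg = np.zeros_like(log_amp)
    for i in range(3):
        for j in range(3):
            avg += pad[i:i + log_amp.shape[0], j:j + log_amp.shape[1]]
    avg /= 9.0
    residual = log_amp - avg          # "spectral residual"
    # Reconstruct with the residual amplitude and the original phase.
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    return sal / sal.max()

# A bright block on a flat background as a toy target.
img = np.zeros((64, 64))
img[24:40, 24:40] = 1.0
sal = spectral_residual_saliency(img)
```

    In the tracker's setting, high values of `sal` around a candidate patch raise its saliency confidence when positive and negative samples are selected.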

  9. X-Ray Temperatures, Luminosities, and Masses from XMM-Newton Follow-up of the First Shear-selected Galaxy Cluster Sample

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deshpande, Amruta J.; Hughes, John P.; Wittman, David, E-mail: amrejd@physics.rutgers.edu, E-mail: jph@physics.rutgers.edu, E-mail: dwittman@physics.ucdavis.edu

    We continue the study of the first sample of shear-selected clusters from the initial 8.6 square degrees of the Deep Lens Survey (DLS); a sample with well-defined selection criteria corresponding to the highest ranked shear peaks in the survey area. We aim to characterize the weak lensing selection by examining the sample’s X-ray properties. There are multiple X-ray clusters associated with nearly all the shear peaks: 14 X-ray clusters corresponding to seven DLS shear peaks. An additional three X-ray clusters cannot be definitively associated with shear peaks, mainly due to large positional offsets between the X-ray centroid and the shear peak. Here we report on the XMM-Newton properties of the 17 X-ray clusters. The X-ray clusters display a wide range of luminosities and temperatures; the L_X - T_X relation we determine for the shear-associated X-ray clusters is consistent with X-ray cluster samples selected without regard to dynamical state, while it is inconsistent with self-similarity. For a subset of the sample, we measure X-ray masses using temperature as a proxy, and compare to weak lensing masses determined by the DLS team. The resulting mass comparison is consistent with equality. The X-ray and weak lensing masses show considerable intrinsic scatter (∼48%), which is consistent with X-ray selected samples when their X-ray and weak lensing masses are independently determined.

  10. A Monte Carlo Program for Simulating Selection Decisions from Personnel Tests

    ERIC Educational Resources Information Center

    Petersen, Calvin R.; Thain, John W.

    1976-01-01

    Given test and criterion parameters, cutting scores, the correlation coefficient, the sample size, and the number of samples to be drawn (all inputs), this program calculates decision classification rates across samples and for combined samples. Several other related indices are also computed. (Author)
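
    A minimal version of such a simulation is straightforward to write (a hypothetical sketch of the kind of computation the program performs; the parameter values, cut scores, and outcome labels are illustrative): draw test and criterion scores from a bivariate normal with the input correlation, apply the cutting scores, and tally the four decision outcomes across samples.

```python
import numpy as np

def classification_rates(r, n, n_samples, test_cut=0.0, crit_cut=0.0, seed=1):
    """Average decision-classification rates over repeated simulated samples."""
    rng = np.random.default_rng(seed)
    cov = [[1.0, r], [r, 1.0]]
    rates = np.zeros(4)
    for _ in range(n_samples):
        x, y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T
        sel, good = x >= test_cut, y >= crit_cut
        rates += np.array([
            np.mean(sel & good),     # hits: selected and successful
            np.mean(sel & ~good),    # false acceptances
            np.mean(~sel & good),    # false rejections
            np.mean(~sel & ~good),   # correct rejections
        ])
    return rates / n_samples

rates = classification_rates(r=0.5, n=500, n_samples=200)
print(dict(zip(["hit", "false_acc", "false_rej", "correct_rej"], rates.round(3))))
```

    Sweeping `test_cut` or `r` then shows how validity and selection ratio trade off against the four rates, which is the core use of such a program.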

  11. Identifying Fossil Biosignatures and Minerals in Mars Analog Materials Using Time-Resolved Raman Spectroscopy

    NASA Astrophysics Data System (ADS)

    Shkolyar, S.; Farmer, J.; Alerstam, E.; Maruyama, Y.; Blacksberg, J.

    2013-12-01

    Mars sample return has been identified as a top priority in the planetary science decadal survey. A Mars sample selection and caching mission would be the likely first step in this endeavor. Such a mission would aim to select and prioritize for return to Earth aqueously formed geological samples present at a selected site on Mars, based upon their potential for biosignature capture and preservation. If evidence of past life exists and is found, it is likely to come via the identification of fossilized carbonaceous matter of biological origin (kerogen) found in the selected samples analyzed in laboratories after return to Earth. Raman spectroscopy is considered one of the primary techniques for analyzing materials in situ and selecting the most promising samples for Earth return. We have previously performed a pilot study to better understand the complexities of identifying kerogen using Raman spectroscopy. For the study, we examined a variety of Mars analog materials representing a broad range of mineral compositions and kerogen maturities. The study revealed that kerogen identification in many of the most promising lithologies is often impeded by background fluorescence that originates from long (>10 ns to ms) and short (<1 ns) lifetime fluorophores in both the mineral matrixes and preserved organic matter in the samples. This work explores the potential for time-gated Raman spectroscopy to enable clear kerogen and mineral identifications in such samples. The JPL time-resolved Raman system uses time gating to reduce background fluorescence. It uses a custom-built SPAD (single photon avalanche diode) detector, featuring a 1-ns time-gate, and electronically variable gate delay. Results for a range of fluorescent samples show that the JPL system reduces fluorescence, allowing the identification of both kerogen and mineral components more successfully than with conventional Raman systems. 
In some of the most challenging samples, the detection of organic matter is hindered by a combination of short lifetime fluorescence and weak Raman scattering coming from preserved kerogen grains. Fluorescence Lifetime Imaging Microscopy (FLIM) measurements were also performed to characterize the lifetimes of both components in the samples and to inform future system improvements such as shorter time gating. Here, we will discuss the results, along with identified challenges to the consistent and reliable in situ identification of kerogen in samples on Mars.

  12. The Local Universe as Seen in the Far-Infrared and Far-Ultraviolet: A Global Point of View of the Local Recent Star Formation

    NASA Astrophysics Data System (ADS)

    Buat, V.; Takeuchi, T. T.; Iglesias-Páramo, J.; Xu, C. K.; Burgarella, D.; Boselli, A.; Barlow, T.; Bianchi, L.; Donas, J.; Forster, K.; Friedman, P. G.; Heckman, T. M.; Lee, Y.-W.; Madore, B. F.; Martin, D. C.; Milliard, B.; Morissey, P.; Neff, S.; Rich, M.; Schiminovich, D.; Seibert, M.; Small, T.; Szalay, A. S.; Welsh, B.; Wyder, T.; Yi, S. K.

    2007-12-01

    We select far-infrared (FIR: 60 μm) and far-ultraviolet (FUV: 1530 Å) samples of nearby galaxies in order to discuss the biases encountered by monochromatic surveys (FIR or FUV). Very different volumes are sampled by each selection, and much care is taken to apply volume corrections to all the analyses. The distributions of the bolometric luminosity of young stars are compared for both samples: they are found to be consistent with each other for galaxies of intermediate luminosities, but some differences are found for high (>5×10^10 Lsolar) luminosities. The shallowness of the IRAS survey prevents us from securing a comparison at low luminosities (<2×10^9 Lsolar). The ratio of the total infrared (TIR) luminosity to the FUV luminosity is found to increase with the bolometric luminosity in a similar way for both samples up to 5×10^10 Lsolar. Brighter galaxies are found to have a different behavior according to their selection: the LTIR/LFUV ratio of the FUV-selected galaxies brighter than 5×10^10 Lsolar reaches a plateau, whereas LTIR/LFUV continues to increase with the luminosity of bright galaxies selected in the FIR. The volume-averaged specific star formation rate (SFR per unit galaxy stellar mass, SSFR) is found to decrease toward massive galaxies within each selection. The mean values of the SSFR are found to be larger than those measured for optical and NIR-selected samples over the whole mass range for the FIR selection, and for masses larger than 10^10 Msolar for the FUV selection. Luminous and massive galaxies selected in the FIR appear as active as galaxies with similar characteristics detected at z~0.7.

  13. Residues of selected antibiotics in the South Moravian Rivers, Czech Republic.

    PubMed

    Jarova, Katerina; Vavrova, Milada; Koleckarova, Alice

    2015-01-01

    The aim of this study was to assess the contamination level of aquatic ecosystems of the Oslava and the Jihlava Rivers, and of the Nove Mlyny Water Reservoir, situated in the South Moravian Region (Czech Republic), by residues of selected veterinary pharmaceuticals. We isolated and determined 10 sulfonamide antibiotics in samples of surface water and bottom sediments using optimized analytical methods. A representative number of sampling sites in the entire basin of the selected waters were chosen. Samples were collected particularly near the larger cities in order to assess their possible impact on the aquatic ecosystems. Extraction, pre-concentration and purification of samples were performed using optimized methods of solid phase extraction and pressurized solvent extraction. Final identification and quantification were carried out by high-performance liquid chromatography coupled with a diode array detector. The concentrations of sulfonamides in water samples were all under the limit of detection. Regarding sediment samples, sulfadimidine was found at most sampling sites; its highest values were recorded in the Jihlava River (up to 979.8 µg kg^-1 dry matter). Other frequently detected sulfonamides were sulfamethoxazole and sulfamerazine. Most other sulfonamides were under the limit of detection or limit of quantification. Monitoring of antibiotic residues in the environment, especially in the aquatic ecosystem, is a current topic due to the growing worldwide use of antibiotics in both human and veterinary medicine. According to the obtained results, we document the pollution of the selected rivers and water reservoir by particular sulfonamides, which basically reflects their application in veterinary medicine.

  14. Optical spectroscopy and initial mass function of z = 0.4 red galaxies

    NASA Astrophysics Data System (ADS)

    Tang, Baitian; Worthey, Guy

    2017-05-01

    Spectral absorption features can be used to constrain the stellar initial mass function (IMF) in the integrated light of galaxies. Spectral indices used at low redshift are in the far red, and therefore increasingly hard to detect at higher and higher redshifts as they pass out of atmospheric transmission and CCD detector wavelength windows. We employ IMF-sensitive indices at bluer wavelengths. We stack spectra of red, quiescent galaxies around z = 0.4 from the DEEP2 Galaxy Redshift Survey. The z = 0.4 red galaxies have average ages of 2 Gyr, so they cannot be passively evolving precursors of nearby galaxies. They are slightly enhanced in C and Na, and slightly depressed in Ti. Split by luminosity, the fainter half appears to be older, a result that should be checked with larger samples in the future. We uncover no evidence for IMF evolution between z = 0.4 and now, but we highlight the importance of sample selection, finding that an SDSS sample culled to select archetypal elliptical galaxies at z ~ 0 is offset towards a more bottom-heavy IMF. Other samples, including our DEEP2 sample, show an offset towards a more spiral-galaxy-like IMF. All samples confirm that the reddest galaxies look bottom-heavy compared with bluer ones. Sample selection also influences age-colour trends: red, luminous galaxies always look old and metal rich, but the bluer ones can be more metal poor, of the same abundance, or more metal rich, depending on how they are selected.

  15. Sample selection via angular distance in the space of the arguments of an artificial neural network

    NASA Astrophysics Data System (ADS)

    Fernández Jaramillo, J. M.; Mayerle, R.

    2018-05-01

    In the construction of an artificial neural network (ANN), a proper split of the available samples plays a major role in the training process. This selection of subsets for training, testing and validation affects the generalization ability of the neural network. The number of samples also has an impact on the time required for the design and training of the ANN. This paper introduces an efficient and simple method for reducing the set of samples used for training a neural network. The method reduces the time required to calculate the network coefficients while preserving diversity and avoiding overtraining of the ANN due to the presence of similar samples. The proposed method is based on the calculation of the angle between two vectors, each one representing one input of the neural network. When the angle formed between samples is smaller than a defined threshold, only one input is accepted for the training. The accepted inputs are scattered throughout the sample space. Tidal records are used to demonstrate the proposed method. The results of a cross-validation show that with few inputs the outputs are not accurate and depend on the selection of the first sample, but as the number of inputs increases the accuracy improves and the differences among scenarios with different starting samples are substantially reduced. A comparison with the K-means clustering algorithm shows that, for this application, the proposed method produces a more accurate network with a smaller number of samples.
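
    The angular-threshold selection described above can be sketched as a greedy filter (a minimal illustration; the function name, greedy ordering, and threshold value are ours, not the authors'):

```python
import numpy as np

def select_by_angle(samples, threshold_deg=5.0):
    """Greedily keep only samples whose angle to every previously
    accepted sample is at least `threshold_deg`.

    Returns the accepted samples as unit vectors."""
    cos_thr = np.cos(np.radians(threshold_deg))
    accepted = []
    for x in samples:
        v = np.asarray(x, dtype=float)
        v = v / np.linalg.norm(v)
        # A candidate is rejected if it lies within the angular
        # threshold of any accepted sample (cosine too large).
        if all(np.dot(v, a) <= cos_thr for a in accepted):
            accepted.append(v)
    return accepted
```

Note that the first sample is always accepted, so the result depends on which sample is processed first, matching the starting-sample dependence reported in the abstract.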

  16. Use of a Smartphone as a Colorimetric Analyzer in Paper-based Devices for Sensitive and Selective Determination of Mercury in Water Samples.

    PubMed

    Jarujamrus, Purim; Meelapsom, Rattapol; Pencharee, Somkid; Obma, Apinya; Amatatongchai, Maliwan; Ditcharoen, Nadh; Chairam, Sanoe; Tamuang, Suparb

    2018-01-01

    A smartphone application, called CAnal, was developed as a colorimetric analyzer in paper-based devices for sensitive and selective determination of mercury(II) in water samples. Measurement on the double layer of a microfluidic paper-based analytical device (μPAD), fabricated by an alkyl ketene dimer (AKD) inkjet-printing technique with a special design and doped with unmodified silver nanoparticles (AgNPs) on the detection zones, was performed by monitoring the gray intensity in the blue channel of the AgNPs, which disintegrate when exposed to mercury(II) on the μPAD. Under the optimized conditions, the developed approach showed high sensitivity, a low limit of detection (0.003 mg L^-1; 3SD of the blank/slope of the calibration curve), small sample volume uptake (two aliquots of 2 μL), and short analysis time. The linear range of this technique extended from 0.01 to 10 mg L^-1 (r^2 = 0.993). Furthermore, practical analysis of various water samples was also demonstrated to have acceptable performance in agreement with data from cold vapor atomic absorption spectrophotometry (CV-AAS), a conventional method. The proposed technique allows for rapid, simple (instant report of the final mercury(II) concentration via the smartphone display), sensitive, selective, and on-site analysis with high sample throughput (48 samples h^-1, n = 3) of trace mercury(II) in water samples, and is suitable for end users unskilled in mercury(II) analysis.
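
    The quoted detection limit follows the common 3·SD(blank)/slope convention. The sketch below illustrates the arithmetic with invented calibration and blank readings; none of these numbers come from the paper:

```python
import numpy as np

# Hypothetical calibration: instrument response vs. Hg(II)
# concentration (mg/L); slope from a least-squares straight-line fit.
conc = np.array([0.01, 0.1, 1.0, 5.0, 10.0])
signal = np.array([2.1, 20.5, 198.0, 1010.0, 1995.0])
slope, intercept = np.polyfit(conc, signal, 1)

# Replicate blank measurements; LOD = 3 * SD(blank) / slope.
blank = np.array([0.10, 0.12, 0.09, 0.11, 0.10,
                  0.13, 0.08, 0.11, 0.10, 0.12])
lod = 3 * blank.std(ddof=1) / slope
```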

  17. Longitudinal study of Escherichia coli O157 shedding and super shedding in dairy heifers.

    PubMed

    Williams, K J; Ward, M P; Dhungyel, O P

    2015-04-01

    A longitudinal study was conducted to assess the methods available for detection of Escherichia coli O157 and to investigate the prevalence and occurrence of long-term shedding and super shedding in a cohort of Australian dairy heifers. Samples were obtained at approximately weekly intervals from heifers at pasture under normal management systems. Selective sampling techniques were used with the aim of identifying heifers with a higher probability of shedding or super shedding. Rectoanal mucosal swabs (RAMS) and fecal samples were obtained from each heifer. Direct culture of feces was used for detection and enumeration. Feces and RAMS were tested by enrichment culture. Selected samples were further tested retrospectively by immunomagnetic separation of enriched samples. Of 784 samples obtained, 154 (19.6%) were detected as positive using culture methods. Adjusting for selective sampling, the prevalence was 71 (15.6%) of 454. In total, 66 samples were detected as positive at >10^2 CFU/g, of which 8 were >10^4 CFU/g and classed as super shedding. A significant difference was observed in detection by enriched culture of RAMS and feces. Dairy heifers within this cohort exhibited variable E. coli O157 shedding, consistent with previous estimates of shedding. Super shedding was detected at a low frequency and inconsistently from individual heifers. All detection methods identified some samples as positive that were not detected by any other method, indicating that the testing methods used will influence survey results.

  18. Clustering behavior in microbial communities from acute endodontic infections.

    PubMed

    Montagner, Francisco; Jacinto, Rogério C; Signoretti, Fernanda G C; Sanches, Paula F; Gomes, Brenda P F A

    2012-02-01

    Acute endodontic infections harbor heterogeneous microbial communities in both the root canal (RC) system and apical tissues. Data comparing the microbial structure and diversity in endodontic infections in related ecosystems, such as RC with necrotic pulp and acute apical abscess (AAA), are scarce in the literature. The aim of this study was to examine the presence of selected endodontic pathogens in paired samples from necrotic RC and AAA using polymerase chain reaction (PCR) followed by the construction of cluster profiles. Paired samples of RC and AAA exudates were collected from 20 subjects and analyzed by PCR for the presence of selected strict and facultative anaerobic strains. The frequency of species was compared between the RC and the AAA samples. A stringent neighboring clustering algorithm was applied to investigate the existence of similar high-order groups of samples. A dendrogram was constructed to show the arrangement of the sample groups produced by the hierarchical clustering. All samples harbored bacterial DNA. Porphyromonas endodontalis, Prevotella nigrescens, Filifactor alocis, and Tannerella forsythia were frequently detected in both RC and AAA samples. The selected anaerobic species were distributed in diverse small bacterial consortia. The samples of RC and AAA that presented at least one of the targeted microorganisms were grouped in small clusters. Anaerobic species were frequently detected in acute endodontic infections, and heterogeneous microbial communities with low clustering behavior were observed in paired samples of RC and AAA. Copyright © 2012. Published by Elsevier Inc.

  19. Physical-property, water-quality, plankton, and bottom-material data for Devils Lake and East Devils Lake, North Dakota, September 1988 through October 1990

    USGS Publications Warehouse

    Sando, Steven K.; Sether, Bradley A.

    1993-01-01

    Physical properties were measured and water-quality, plankton, and bottom-material samples were collected at 10 sites in Devils Lake and East Devils Lake during September 1988 through October 1990 to study water-quality variability and water-quality and plankton relations in Devils Lake and East Devils Lake. Physical properties measured include specific conductance, pH, water temperature, dissolved-oxygen concentration, water transparency, and light transmission. Water-quality samples were analyzed for concentrations of major ions, selected nutrients, and selected trace elements. Plankton samples were examined for identification and enumeration of phytoplankton and zooplankton species, and bottom-material samples were analyzed for concentrations of selected nutrients. Data-collection procedures are discussed and the data are presented in tabular form.

  20. Apparatus for gas sorption measurement with integrated gas composition measurement device and gas mixing

    DOEpatents

    Micklash, II, Kenneth James; Dutton, Justin James; Kaye, Steven

    2014-06-03

    An apparatus for testing of multiple material samples includes a gas delivery control system operatively connectable to the multiple material samples and configured to provide gas to the multiple material samples. Both a gas composition measurement device and pressure measurement devices are included in the apparatus. The apparatus includes multiple selectively openable and closable valves and a series of conduits configured to selectively connect the multiple material samples individually to the gas composition device and the pressure measurement devices by operation of the valves. A mixing system is selectively connectable to the series of conduits and is operable to cause forced mixing of the gas within the series of conduits to achieve a predetermined uniformity of gas composition within the series of conduits and passages.

  1. Circular Samples as Objects for Magnetic Resonance Imaging - Mathematical Simulation, Experimental Results

    NASA Astrophysics Data System (ADS)

    Frollo, Ivan; Krafčík, Andrej; Andris, Peter; Přibil, Jiří; Dermek, Tomáš

    2015-12-01

    Circular samples are frequent objects of "in-vitro" investigation using imaging methods based on magnetic resonance principles. The goal of our investigation is the imaging of thin planar layers without using the slice-selection procedure, thus only 2D imaging, or the imaging of selected layers of samples in circular vessels or Eppendorf tubes, which necessarily uses slice selection. Although standard imaging methods were used, some specific issues arise when mathematical modeling of these procedures is introduced. In the paper, several mathematical models are presented and compared with real experimental results. Circular magnetic samples were placed in the homogeneous magnetic field of a low-field imager based on nuclear magnetic resonance. For experimental verification, a 0.178 T ESAOTE Opera MRI imager was used.

  2. The Coalescent Process in Models with Selection

    PubMed Central

    Kaplan, N. L.; Darden, T.; Hudson, R. R.

    1988-01-01

    Statistical properties of the process describing the genealogical history of a random sample of genes are obtained for a class of population genetics models with selection. For models with selection, in contrast to models without selection, the distribution of this process, the coalescent process, depends on the distribution of the frequencies of alleles in the ancestral generations. If the ancestral frequency process can be approximated by a diffusion, then the mean and the variance of the number of segregating sites due to selectively neutral mutations in random samples can be numerically calculated. The calculations are greatly simplified if the frequencies of the alleles are tightly regulated. If the mutation rates between alleles maintained by balancing selection are low, then the number of selectively neutral segregating sites in a random sample of genes is expected to substantially exceed the number predicted under a neutral model. PMID:3066685

  3. Estimating the carbon in coarse woody debris with perpendicular distance sampling. Chapter 6

    Treesearch

    Harry T. Valentine; Jeffrey H. Gove; Mark J. Ducey; Timothy G. Gregoire; Michael S. Williams

    2008-01-01

    Perpendicular distance sampling (PDS) is a design for sampling the population of pieces of coarse woody debris (logs) in a forested tract. In application, logs are selected at sample points with probability proportional to volume. Consequently, aggregate log volume per unit land area can be estimated from tallies of logs at sample points. In this chapter we provide...

  4. Systematic sampling of discrete and continuous populations: sample selection and the choice of estimator

    Treesearch

    Harry T. Valentine; David L. R. Affleck; Timothy G. Gregoire

    2009-01-01

    Systematic sampling is easy, efficient, and widely used, though it is not generally recognized that a systematic sample may be drawn from the population of interest with or without restrictions on randomization. The restrictions or the lack of them determine which estimators are unbiased, when using the sampling design as the basis for inference. We describe the...

  5. Eating and Exercising: Nebraska Adolescents' Attitudes and Behaviors. Technical Report 25.

    ERIC Educational Resources Information Center

    Newman, Ian M.

    This report describes selected eating and exercise patterns among a sample of 2,237 Nebraska youth in grades 9-12 selected from a random sample of 24 junior and senior high schools. The eating patterns reported cover food selection, body image, weight management, and weight loss methods. The exercise patterns relate to the frequency of…

  6. Academic Research Equipment in Selected Science Engineering Fields: 1982-83 to 1985-86.

    ERIC Educational Resources Information Center

    Burgdorf, Kenneth; Chaney, Bradford

    This report presents information for identification of the national trends in the amount, age, loss, condition, and perceived adequacy of academic research equipment in selected science and engineering fields. The data were obtained from a stratified probability sample of 55 colleges and universities and from a separately selected sample of 24…

  7. Accuracy and suitability of selected sampling methods within conifer dominated riparian zones

    Treesearch

    Theresa Marquardt; Hailemariam Temesgen; Paul D. Anderson

    2010-01-01

    Sixteen sampling alternatives were examined for their performance in quantifying selected attributes of overstory conifers in riparian areas of western Oregon. Each alternative was examined at eight headwater forest locations based on 0.52-ha square stem maps. The alternatives were evaluated for selected stand attributes (trees per hectare, basal area per hectare, and...

  8. Spatially balanced survey designs for natural resources

    EPA Science Inventory

    Ecological resource monitoring programs typically require the use of a probability survey design to select locations or entities to be physically sampled in the field. The ecological resource of interest, the target population, occurs over a spatial domain and the sample selecte...

  9. CONTINUOUS GAS ANALYZER

    DOEpatents

    Katz, S.; Weber, C.W.

    1960-02-16

    A reagent gas and a sample gas are chemically combined on a continuous basis in a reaction zone maintained at a selected temperature. The reagent gas and the sample gas are introduced to the reaction zone at preselected, constant molar rates of flow. The reagent gas and the selected gas in the sample mixture combine in the reaction zone to form a product gas having a different number of moles from the sum of the moles of the reactants. The difference in the total molar rates of flow into and out of the reaction zone is measured and indicated to determine the concentration of the selected gas.
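
    The measurement principle, inferring the amount of the selected gas from the net change in total molar flow across the reaction zone, can be sketched as follows (the function name, its arguments, and the example stoichiometry are illustrative, not from the patent):

```python
def analyte_mole_fraction(flow_in, flow_out, sample_flow, moles_change_per_mole):
    """Infer the mole fraction of the selected gas in the sample stream
    from the net change in total molar flow across the reaction zone.

    `moles_change_per_mole` is the signed change in total moles of gas
    per mole of analyte reacted, fixed by the reaction stoichiometry.
    All flows are molar rates in consistent units (e.g. mol/min)."""
    # Moles of analyte reacted per unit time, from the flow difference.
    reacted = (flow_out - flow_in) / moles_change_per_mole
    return reacted / sample_flow
```

For example, in a reaction like 2 H2 + O2 -> 2 H2O(g), three moles of gas become two, so each mole of H2 consumed reduces the total by 0.5 mole (`moles_change_per_mole = -0.5`); a measured drop from 10 to 9.5 mol/min with a 5 mol/min sample stream then implies an H2 mole fraction of 0.2.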

  10. Engineering molecularly imprinted polymers (MIPs) for the selective extraction and quantification of the novel psychoactive substance (NPS) methoxphenidine and its regioisomers.

    PubMed

    Lowdon, J W; Alkirkit, S M O; Mewis, R E; Fulton, D; Banks, C E; Sutcliffe, O B; Peeters, M

    2018-04-30

    In this communication, we present the first developed Molecularly Imprinted Polymers (MIPs) for the specific detection of a New Psychoactive Substance (NPS); namely, methoxphenidine (MXP) and its regioisomers. Selectivity of the MIP towards MXP is studied by analysing mixtures and an acquired street sample with High Performance Liquid Chromatography coupled to UV detection. The study demonstrates that the engineered polymers selectively extract MXP from heterogeneous samples, which makes for a very powerful diagnostic tool that can detect traces of MXP in complicated NPS samples.

  11. Optical monitor for water vapor concentration

    DOEpatents

    Kebabian, Paul

    1998-01-01

    A system for measuring and monitoring water vapor concentration in a sample uses as a light source an argon discharge lamp, which inherently emits light with a spectral line that is close to a water vapor absorption line. In a preferred embodiment, the argon line is split by a magnetic field of approximately 1575 gauss, parallel to the direction of light propagation from the lamp, into sets of components with downshifted and upshifted frequencies. The downshifted components are centered on a water vapor absorption line and are thus readily absorbed by water vapor in the sample; the upshifted components are moved away from that absorption line and are minimally absorbed. A polarization modulator alternately selects the upshifted components or downshifted components and passes the selected components to the sample. After transmission through the sample, the transmitted intensity of a component of the argon line varies as a result of absorption by the water vapor. The system then determines the concentration of water vapor in the sample based on differences in the transmitted intensity between the two sets of components. In alternative embodiments, alternate selection of the sets of components is achieved by selectively reversing the polarity of the magnetic field or by selectively supplying the magnetic field to the emitting plasma.

  12. Optical monitor for water vapor concentration

    DOEpatents

    Kebabian, P.

    1998-06-02

    A system for measuring and monitoring water vapor concentration in a sample uses as a light source an argon discharge lamp, which inherently emits light with a spectral line that is close to a water vapor absorption line. In a preferred embodiment, the argon line is split by a magnetic field of approximately 1575 gauss, parallel to the direction of light propagation from the lamp, into sets of components with downshifted and upshifted frequencies. The downshifted components are centered on a water vapor absorption line and are thus readily absorbed by water vapor in the sample; the upshifted components are moved away from that absorption line and are minimally absorbed. A polarization modulator alternately selects the upshifted components or downshifted components and passes the selected components to the sample. After transmission through the sample, the transmitted intensity of a component of the argon line varies as a result of absorption by the water vapor. The system then determines the concentration of water vapor in the sample based on differences in the transmitted intensity between the two sets of components. In alternative embodiments, alternate selection of the sets of components is achieved by selectively reversing the polarity of the magnetic field or by selectively supplying the magnetic field to the emitting plasma. 5 figs.

  13. 40 CFR 1065.1107 - Sample media and sample system preparation; sample system assembly.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) For capturing PM, we recommend using pure quartz filters with no binder. Select the filter diameter to minimize filter change intervals, accounting for the expected PM emission rate, sample flow rate, and... filter without replacing the sorbent or otherwise disassembling the batch sampler. In those cases...

  14. 40 CFR 761.289 - Compositing samples.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Remediation Waste and Porous Surfaces in Accordance With § 761.61(a)(6) § 761.289 Compositing samples. Compositing is a method of combining several samples of a specific type of bulk PCB remediation waste or... compositing bulk PCB remediation waste samples. These procedures are based on the method for selecting...

  15. Sampling Error in a Particulate Mixture: An Analytical Chemistry Experiment.

    ERIC Educational Resources Information Center

    Kratochvil, Byron

    1980-01-01

    Presents an undergraduate experiment demonstrating sampling error. The sampling system selected is a mixture of potassium hydrogen phthalate and sucrose; a self-zeroing, automatically refillable buret minimizes the titration time for multiple samples, and a dilute back-titrant is employed to obtain high end-point precision. (CS)

  16. 40 CFR 90.706 - Engine sample selection.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... = emission test result for an individual engine. x = mean of emission test results of the actual sample. FEL... test with the last test result from the previous model year and then calculate the required sample size.... Test results used to calculate the variables in the following Sample Size Equation must be final...

  17. 40 CFR 761.289 - Compositing samples.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Remediation Waste and Porous Surfaces in Accordance With § 761.61(a)(6) § 761.289 Compositing samples. Compositing is a method of combining several samples of a specific type of bulk PCB remediation waste or... compositing bulk PCB remediation waste samples. These procedures are based on the method for selecting...

  18. Fecal-indicator bacteria in the Allegheny, Monongahela, and Ohio Rivers and selected tributaries, Allegheny County, Pennsylvania, 2001-2005

    USGS Publications Warehouse

    Buckwalter, Theodore F.; Zimmerman, Tammy M.; Fulton, John W.

    2006-01-01

    Concentrations of fecal-indicator bacteria were determined in 1,027 water-quality samples collected from July 2001 through August 2005 during dry- (72-hour dry antecedent period) and wet-weather (48-hour dry antecedent period and at least 0.3 inch of rain in a 24-hour period) conditions in the Allegheny, Monongahela, and Ohio Rivers (locally referred to as the Three Rivers) and selected tributaries in Allegheny County. Samples were collected at five sampling sites on the Three Rivers and at eight sites on four tributaries to the Three Rivers having combined sewer overflows. Water samples were analyzed for three fecal-indicator organisms: fecal coliform, Escherichia coli (E. coli), and enterococci bacteria. Left-bank and right-bank surface-water samples were collected in addition to a cross-section composite sample at each site. Fecal coliform, E. coli, and enterococci were detected in 98.6, 98.5, and 87.7 percent of all samples, respectively. The maximum fecal-indicator bacteria concentrations were measured in samples from Sawmill Run, a tributary to the Ohio River; Sawmill Run at Duquesne Heights had concentrations of fecal coliform, E. coli, and enterococci of 410,000, 510,000, and 180,000 col/100 mL, respectively, following a large storm. The samples collected in the Three Rivers and selected tributaries frequently exceeded established recreational standards and criteria for bacteria. Concentrations of fecal coliform exceeded the Pennsylvania water-quality standard (200 col/100 mL) in approximately 63 percent of the samples. Sample concentrations of E. coli and enterococci exceeded the U.S. Environmental Protection Agency (USEPA) water-quality criteria (235 and 61 col/100 mL, respectively) in about 53 and 47 percent, respectively, of the samples. Fecal-indicator bacteria were most strongly correlated with streamflow, specific conductance, and turbidity. These correlations most frequently were observed in samples collected from tributary sites.
Fecal-indicator bacteria concentrations and turbidity were correlated to the location of sample collection in the cross section. Most differences were between bank and composite samples; differences between right-bank and left-bank samples were rarely observed. The Allegheny River sites had more significant correlations than the Monongahela or Ohio River sites. Comparisons were made between fecal-indicator bacteria in composite samples collected during dry-weather, wet-weather day-one, wet-weather day-two (tributary sites only), and wet-weather day-three (Three Rivers sites only) events in the Three Rivers and selected tributary sites. The lowest median bacteria concentrations generally were observed in the dry-weather composite samples. All median bacteria concentrations in dry-weather composite samples in the five Three Rivers sites were below water-quality standards and criteria; bacteria concentrations in the upstream tributary sites rarely met all standards or criteria. Only Turtle Creek, Thompson Run, and Chartiers Creek had at least one median bacteria concentration below water-quality standards or criteria. Median bacteria concentrations in the composite samples generally were higher the day after a wet-weather event compared to dry-weather composite samples and other wet-weather composite samples collected. In the five Three Rivers sites, median bacteria concentrations 3 days after a wet-weather event in composite samples tended to fall below the water-quality standards and criteria; in the eight tributary sites, median bacteria concentrations in the dry-weather and wet-weather composite samples generally were above the water-quality standards or criteria. Composite samples collected at the upstream sites on the Three Rivers and selected tributaries generally had lower median bacteria concentrations than composite samples collected at the downstream sites during dry- and wet-weather events. Higher concentrations downstream may be because o

  19. The genealogy of samples in models with selection.

    PubMed

    Neuhauser, C; Krone, S M

    1997-02-01

    We introduce the genealogy of a random sample of genes taken from a large haploid population that evolves according to random reproduction with selection and mutation. Without selection, the genealogy is described by Kingman's well-known coalescent process. In the selective case, the genealogy of the sample is embedded in a graph with a coalescing and branching structure. We describe this graph, called the ancestral selection graph, and point out differences and similarities with Kingman's coalescent. We present simulations for a two-allele model with symmetric mutation in which one of the alleles has a selective advantage over the other. We find that when the allele frequencies in the population are already in equilibrium, then the genealogy does not differ much from the neutral case. This is supported by rigorous results. Furthermore, we describe the ancestral selection graph for other selective models with finitely many selection classes, such as the K-allele models, infinitely-many-alleles models, DNA sequence models, and infinitely-many-sites models, and briefly discuss the diploid case.

  20. The Genealogy of Samples in Models with Selection

    PubMed Central

    Neuhauser, C.; Krone, S. M.

    1997-01-01

    We introduce the genealogy of a random sample of genes taken from a large haploid population that evolves according to random reproduction with selection and mutation. Without selection, the genealogy is described by Kingman's well-known coalescent process. In the selective case, the genealogy of the sample is embedded in a graph with a coalescing and branching structure. We describe this graph, called the ancestral selection graph, and point out differences and similarities with Kingman's coalescent. We present simulations for a two-allele model with symmetric mutation in which one of the alleles has a selective advantage over the other. We find that when the allele frequencies in the population are already in equilibrium, then the genealogy does not differ much from the neutral case. This is supported by rigorous results. Furthermore, we describe the ancestral selection graph for other selective models with finitely many selection classes, such as the K-allele models, infinitely-many-alleles models, DNA sequence models, and infinitely-many-sites models, and briefly discuss the diploid case. PMID:9071604

  1. A new comprehensive method for detection of livestock-related pathogenic viruses using a target enrichment system.

    PubMed

    Oba, Mami; Tsuchiaka, Shinobu; Omatsu, Tsutomu; Katayama, Yukie; Otomaru, Konosuke; Hirata, Teppei; Aoki, Hiroshi; Murata, Yoshiteru; Makino, Shinji; Nagai, Makoto; Mizutani, Tetsuya

    2018-01-08

    We tested the usefulness of SureSelect, a target enrichment system, as a comprehensive viral nucleic acid detection method for rapid identification of viral pathogens in fecal samples of cattle, pigs and goats. This system enriches nucleic acids of target viruses in clinical/field samples by using a library of biotinylated RNAs with sequences complementary to the target viruses. The enriched nucleic acids are amplified by PCR and subjected to next-generation sequencing to identify the target viruses. In many samples, the SureSelect target enrichment method increased the efficiency of detection of the viruses listed in the biotinylated RNA library. Furthermore, this method enabled us to determine the nearly full-length genome sequence of porcine parainfluenza virus 1 and greatly increased Breadth, a value indicating the ratio of the mapped consensus length to the reference genome length, in pig samples. Our data show the usefulness of the SureSelect target enrichment system for comprehensive analysis of the genomic information of various viruses in field samples. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. Hypocalcemic stimulation and nonselective venous sampling for localizing parathyroid adenomas: work in progress.

    PubMed

    Doppman, J L; Skarulis, M C; Chang, R; Alexander, H R; Bartlett, D; Libutti, S K; Marx, S J; Spiegel, A M

    1998-07-01

    To evaluate whether the release of parathyroid hormone (PTH) from parathyroid tumors during selective parathyroid arteriography can help localize the tumors. In 20 patients (six men, 14 women; age range, 24-72 years) with parathyroid tumors undergoing parathyroid arteriography after failed surgery, serial measurements of PTH were obtained during selective arteriography with nonionic contrast material. PTH levels were measured in the superior vena cava (SVC) before and at varying times from 20 to 120 seconds after arteriography. A 1.4-fold increase in the PTH level of the postarteriographic SVC samples enabled correct prediction of the site of the adenoma in 13 of the 20 patients (65%). Of nine patients with positive arteriograms, eight had positive results of postarteriographic sampling. Of 11 patients with negative arteriograms, five had positive results of postarteriographic sampling. Sampling the SVC for PTH gradients after selective parathyroid arteriography correctly indicated the site of the adenoma in 13 of 20 patients (65%).

  3. Detection of genomic loci associated with environmental variables using generalized linear mixed models.

    PubMed

    Lobréaux, Stéphane; Melodelima, Christelle

    2015-02-01

    We tested the use of Generalized Linear Mixed Models to detect associations between genetic loci and environmental variables, taking into account the population structure of sampled individuals. We used a simulation approach to generate datasets under demographically and selectively explicit models. These datasets were used to analyze and optimize GLMM capacity to detect the association between markers and selective coefficients as environmental data in terms of false and true positive rates. Different sampling strategies were tested, maximizing the number of populations sampled, sites sampled per population, or individuals sampled per site, and the effect of different selective intensities on the efficiency of the method was determined. Finally, we applied these models to an Arabidopsis thaliana SNP dataset from different accessions, looking for loci associated with spring minimal temperature. We identified 25 regions that exhibit unusual correlations with the climatic variable and contain genes with functions related to temperature stress. Copyright © 2014 Elsevier Inc. All rights reserved.

  4. A two-stage cluster sampling method using gridded population data, a GIS, and Google Earth(TM) imagery in a population-based mortality survey in Iraq.

    PubMed

    Galway, LP; Bell, Nathaniel; Sae, Al Shatari; Hagopian, Amy; Burnham, Gilbert; Flaxman, Abraham; Weiss, William M; Rajaratnam, Julie; Takaro, Tim K

    2012-04-27

    Mortality estimates can measure and monitor the impacts of conflict on a population, guide humanitarian efforts, and help to better understand the public health impacts of conflict. Vital statistics registration and surveillance systems are rarely functional in conflict settings, so mortality must often be estimated using retrospective population-based surveys. We present a two-stage cluster sampling method for application in population-based mortality surveys. The sampling method uses gridded population data and a geographic information system (GIS) to select clusters in the first sampling stage, and Google Earth(TM) imagery and sampling grids to select households in the second sampling stage. The sampling method was implemented in a household mortality study in Iraq in 2011. Factors affecting feasibility and methodological quality are described. Sampling is a challenge in retrospective population-based mortality studies, and alternatives that improve on the conventional approaches are needed. The sampling strategy presented here was designed to generate a representative sample of the Iraqi population while reducing the potential for bias and considering the context-specific challenges of the study setting. This sampling strategy, or variations on it, is adaptable and should be considered and tested in other conflict settings.
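    The two-stage design described above can be sketched in code. This is a minimal illustration, not the authors' implementation: the grid cells, population counts, and cluster/household numbers are invented, and `random.choices` stands in for the GIS-based probability-proportional-to-size cluster draw.

```python
import random

random.seed(42)

# Hypothetical grid cells with population counts (all names and numbers illustrative).
grid_cells = {"cell_%d" % i: pop
              for i, pop in enumerate([120, 800, 430, 60, 1500, 290])}

def pps_select_clusters(cells, n_clusters):
    """Stage 1: select grid cells with probability proportional to population
    (with replacement), standing in for the GIS-based cluster selection."""
    names = list(cells)
    return random.choices(names, weights=[cells[c] for c in names], k=n_clusters)

def select_households(cluster_pop, n_households):
    """Stage 2: simple random sample of household indices inside one cell,
    standing in for the Google Earth sampling-grid household selection."""
    return random.sample(range(cluster_pop), min(n_households, cluster_pop))

clusters = pps_select_clusters(grid_cells, n_clusters=3)
households = {c: select_households(grid_cells[c], 10) for c in set(clusters)}
```

    Selecting clusters proportional to gridded population keeps household-level inclusion probabilities roughly equal, which is what makes the resulting sample approximately self-weighting.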

  5. A two-stage cluster sampling method using gridded population data, a GIS, and Google EarthTM imagery in a population-based mortality survey in Iraq

    PubMed Central

    2012-01-01

    Background Mortality estimates can measure and monitor the impacts of conflict on a population, guide humanitarian efforts, and help to better understand the public health impacts of conflict. Vital statistics registration and surveillance systems are rarely functional in conflict settings, so mortality must often be estimated using retrospective population-based surveys. Results We present a two-stage cluster sampling method for application in population-based mortality surveys. The sampling method uses gridded population data and a geographic information system (GIS) to select clusters in the first sampling stage, and Google Earth(TM) imagery and sampling grids to select households in the second sampling stage. The sampling method was implemented in a household mortality study in Iraq in 2011. Factors affecting feasibility and methodological quality are described. Conclusion Sampling is a challenge in retrospective population-based mortality studies, and alternatives that improve on the conventional approaches are needed. The sampling strategy presented here was designed to generate a representative sample of the Iraqi population while reducing the potential for bias and considering the context-specific challenges of the study setting. This sampling strategy, or variations on it, is adaptable and should be considered and tested in other conflict settings. PMID:22540266

  6. Clonal selection in xenografted human T cell acute lymphoblastic leukemia recapitulates gain of malignancy at relapse.

    PubMed

    Clappier, Emmanuelle; Gerby, Bastien; Sigaux, François; Delord, Marc; Touzri, Farah; Hernandez, Lucie; Ballerini, Paola; Baruchel, André; Pflumio, Françoise; Soulier, Jean

    2011-04-11

    Genomic studies in human acute lymphoblastic leukemia (ALL) have revealed clonal heterogeneity at diagnosis and clonal evolution at relapse. In this study, we used genome-wide profiling to compare human T cell ALL samples at the time of diagnosis and after engraftment (xenograft) into immunodeficient recipient mice. Compared with paired diagnosis samples, the xenograft leukemia often contained additional genomic lesions in established human oncogenes and/or tumor suppressor genes. Mimicking such genomic lesions by short hairpin RNA-mediated knockdown in diagnosis samples conferred a selective advantage in competitive engraftment experiments, demonstrating that additional lesions can be drivers of increased leukemia-initiating activity. In addition, the xenograft leukemias appeared to arise from minor subclones existing in the patient at diagnosis. Comparison of paired diagnosis and relapse samples showed that, with regard to genetic lesions, xenograft leukemias more frequently resembled relapse samples than bulk diagnosis samples. Moreover, a cell cycle- and mitosis-associated gene expression signature was present in xenograft and relapse samples, and xenograft leukemia exhibited diminished sensitivity to drugs. Thus, the establishment of human leukemia in immunodeficient mice selects and expands a more aggressive malignancy, recapitulating the process of relapse in patients. These findings may contribute to the design of novel strategies to prevent or treat relapse.

  7. Concordance with dietary and lifestyle population goals for cancer prevention in Dutch, Scottish, Mexican, and Guatemalan population samples.

    PubMed

    Vossenaar, Marieke; Solomons, Noel W; Valdés-Ramos, Roxana; Anderson, Annie S

    2010-01-01

    We assessed concordance with selected population goal components of the 1997 World Cancer Research Fund/American Institute for Cancer Research (WCRF/AICR) diet and lifestyle recommendations to decrease cancer risk across four population samples. This was a prospectively designed survey examining concordance with the population goals of the WCRF/AICR recommendations using target criteria across sites. Population samples were from the Netherlands, Scotland, Mexico, and Guatemala. A total of 3564 men and women aged 18 to 70 y were recruited in equal proportions by site and gender. None of the four pooled samples met the target population average criteria for body mass index or refined sugar intake. The Guatemalan sample had concordance with the largest number of recommended cancer-prevention goals (10 of 12 selected WCRF/AICR components). Successively, Mexican, Scottish, and Dutch samples were concordant with seven, four, and three selected components, respectively. A prospectively designed research instrument and exhaustive prior examination of operative criteria allow for the assessment of group-level concordance with cancer-prevention goals. To the extent that the study samples reflect the respective national situations, geographic variance in concordance exists, with conditions and behaviors in Guatemala bringing that nation into more general compliance with the 1997 WCRF/AICR goals.

  8. Automated imaging of cellular spheroids with selective plane illumination microscopy on a chip (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Paiè, Petra; Bassi, Andrea; Bragheri, Francesca; Osellame, Roberto

    2017-02-01

    Selective plane illumination microscopy (SPIM) is an optical sectioning technique that allows imaging of biological samples at high spatio-temporal resolution. Standard SPIM devices require dedicated set-ups, complex sample preparation and accurate system alignment, thus limiting the automation of the technique, its accessibility and throughput. We present a millimeter-scaled optofluidic device that incorporates selective plane illumination and fully automatic sample delivery and scanning. To this end an integrated cylindrical lens and a three-dimensional fluidic network were fabricated by femtosecond laser micromachining into a single glass chip. This device can upgrade any standard fluorescence microscope to a SPIM system. We used SPIM on a CHIP to automatically scan biological samples under a conventional microscope, without the need of any motorized stage: tissue spheroids expressing fluorescent proteins were flowed in the microchannel at constant speed and their sections were acquired while passing through the light sheet. We demonstrate high-throughput imaging of the entire sample volume (with a rate of 30 samples/min), segmentation and quantification in thick (100-300 μm diameter) cellular spheroids. This optofluidic device gives access to SPIM analyses to non-expert end-users, opening the way to automatic and fast screening of a high number of samples at subcellular resolution.

  9. Determination of variables in the prediction of strontium distribution coefficients for selected sediments

    USGS Publications Warehouse

    Pace, M.N.; Rosentreter, J.J.; Bartholomay, R.C.

    2001-01-01

    Idaho State University and the US Geological Survey, in cooperation with the US Department of Energy, conducted a study to determine and evaluate strontium distribution coefficients (Kds) of subsurface materials at the Idaho National Engineering and Environmental Laboratory (INEEL). The Kds were determined to aid in assessing the variability of strontium Kds and their effects on chemical transport of strontium-90 in the Snake River Plain aquifer system. Data from batch experiments done to determine strontium Kds of five sediment-infill samples and six standard reference material samples were analyzed by using multiple linear regression analysis and the stepwise variable-selection method in the statistical program, Statistical Product and Service Solutions, to derive an equation of variables that can be used to predict strontium Kds of sediment-infill samples. The sediment-infill samples were from basalt vesicles and fractures from a selected core at the INEEL; strontium Kds ranged from about 201 to 356 ml g-1. The standard material samples consisted of clay minerals and calcite. The statistical analyses of the batch-experiment results showed that the amount of strontium in the initial solution, the amount of manganese oxide in the sample material, and the amount of potassium in the initial solution are the most important variables in predicting strontium Kds of sediment-infill samples.

  10. Analysis of training sample selection strategies for regression-based quantitative landslide susceptibility mapping methods

    NASA Astrophysics Data System (ADS)

    Erener, Arzu; Sivas, A. Abdullah; Selcuk-Kestel, A. Sevtap; Düzgün, H. Sebnem

    2017-07-01

    All quantitative landslide susceptibility mapping (QLSM) methods require two basic data types, namely, a landslide inventory and factors that influence landslide occurrence (landslide influencing factors, LIF). Depending on the type of landslides, the nature of triggers, and the LIF, the accuracy of QLSM methods differs. Moreover, how to balance the number of 0's (non-occurrence) and 1's (occurrence) in the training set obtained from the landslide inventory, and how to select which of the 1's and 0's to include in QLSM models, play a critical role in the accuracy of QLSM. Although the performance of various QLSM methods has been investigated extensively in the literature, the challenge of training set construction has not been adequately investigated for QLSM methods. To tackle this challenge, in this study three different training set selection strategies, along with the original data set, are used to test the performance of three regression methods, namely Logistic Regression (LR), Bayesian Logistic Regression (BLR), and Fuzzy Logistic Regression (FLR). The first sampling strategy is proportional random sampling (PRS), which applies a weighted selection of landslide occurrences in the sample set. The second method, non-selective nearby sampling (NNS), includes randomly selected sites and their surrounding neighboring points at certain preselected distances to capture the impact of clustering. Selective nearby sampling (SNS) is the third method, which concentrates on the group of 1's and their surrounding neighborhood; a randomly selected group of landslide sites and their neighborhood are considered in the analyses with parameters similar to NNS. It is found that the LR-PRS, FLR-PRS, and BLR-whole-data set-ups, in that order, yield the best fits among the alternatives. The results indicate that in QLSM based on regression models, avoiding spatial correlation in the data set is critical for model performance.
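    The proportional random sampling (PRS) idea of fixing the balance of 1's and 0's in the training set can be sketched as follows. The inventory, its size, and the 30% positive share are invented for illustration; this shows only the balancing step, not the paper's full procedure.

```python
import random

random.seed(0)

# Hypothetical landslide inventory: site id -> label (1 = occurrence, 0 = non-occurrence).
inventory = {i: (1 if i < 40 else 0) for i in range(400)}   # 10% occurrences

def proportional_random_sample(labels, n, pos_fraction):
    """PRS-style draw: fix the share of 1's in the training set and fill the
    remainder with randomly selected 0's (a sketch of the balancing idea)."""
    ones = [i for i, y in labels.items() if y == 1]
    zeros = [i for i, y in labels.items() if y == 0]
    n_pos = round(n * pos_fraction)
    return random.sample(ones, n_pos) + random.sample(zeros, n - n_pos)

training_set = proportional_random_sample(inventory, n=100, pos_fraction=0.3)
```

    NNS and SNS would additionally pull in neighbors of the selected sites at preset distances, which is exactly the spatial-correlation hazard the abstract warns about.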

  11. THE ENVIRONMENTAL MONITORING AND ASSESSMENT PROGRAM (EMAP) SAMPLE DESIGNS AND OVERVIEW

    EPA Science Inventory

    This presentation will discuss the fundamentals of the EMAP sample design and program elements. Central components of EMAP such as the methodology for site selection and data analysis, indicator selection and interpretation will be discussed. Examples from wadeable surface water,...

  12. X-Ray Temperatures, Luminosities, and Masses from XMM-Newton Follow-up of the First Shear-selected Galaxy Cluster Sample

    NASA Astrophysics Data System (ADS)

    Deshpande, Amruta J.; Hughes, John P.; Wittman, David

    2017-04-01

    We continue the study of the first sample of shear-selected clusters from the initial 8.6 square degrees of the Deep Lens Survey (DLS); a sample with well-defined selection criteria corresponding to the highest ranked shear peaks in the survey area. We aim to characterize the weak lensing selection by examining the sample’s X-ray properties. There are multiple X-ray clusters associated with nearly all the shear peaks: 14 X-ray clusters corresponding to seven DLS shear peaks. An additional three X-ray clusters cannot be definitively associated with shear peaks, mainly due to large positional offsets between the X-ray centroid and the shear peak. Here we report on the XMM-Newton properties of the 17 X-ray clusters. The X-ray clusters display a wide range of luminosities and temperatures; the L_X-T_X relation we determine for the shear-associated X-ray clusters is consistent with X-ray cluster samples selected without regard to dynamical state, while it is inconsistent with self-similarity. For a subset of the sample, we measure X-ray masses using temperature as a proxy, and compare to weak lensing masses determined by the DLS team. The resulting mass comparison is consistent with equality. The X-ray and weak lensing masses show considerable intrinsic scatter (~48%), which is consistent with X-ray selected samples when their X-ray and weak lensing masses are independently determined. Some of the data presented herein were obtained at the W.M. Keck Observatory, which is operated as a scientific partnership among the California Institute of Technology, the University of California, and the National Aeronautics and Space Administration. The Observatory was made possible by the generous financial support of the W. M. Keck Foundation.

  13. Perpendicular distance sampling: an alternative method for sampling downed coarse woody debris

    Treesearch

    Michael S. Williams; Jeffrey H. Gove

    2003-01-01

    Coarse woody debris (CWD) plays an important role in many forest ecosystem processes. In recent years, a number of new methods have been proposed to sample CWD. These methods select individual logs into the sample using some form of unequal probability sampling. One concern with most of these methods is the difficulty in estimating the volume of each log. A new method...

  14. Model selection with multiple regression on distance matrices leads to incorrect inferences.

    PubMed

    Franckowiak, Ryan P; Panasci, Michael; Jarvis, Karl J; Acuña-Rodriguez, Ian S; Landguth, Erin L; Fortin, Marie-Josée; Wagner, Helene H

    2017-01-01

    In landscape genetics, model selection procedures based on Information Theoretic and Bayesian principles have been used with multiple regression on distance matrices (MRM) to test the relationship between multiple vectors of pairwise genetic, geographic, and environmental distance. Using Monte Carlo simulations, we examined the ability of model selection criteria based on Akaike's information criterion (AIC), its small-sample correction (AICc), and the Bayesian information criterion (BIC) to reliably rank candidate models when applied with MRM while varying the sample size. The results showed a serious problem: all three criteria exhibit a systematic bias toward selecting unnecessarily complex models containing spurious random variables and erroneously suggest a high level of support for the incorrectly ranked best model. These problems effectively increased with increasing sample size. The failure of AIC, AICc, and BIC was likely driven by the inflated sample size and different sum-of-squares partitioned by MRM, and the resulting effect on delta values. Based on these findings, we strongly discourage the continued application of AIC, AICc, and BIC for model selection with MRM.
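    The inflated sample size the simulations implicate is easy to see numerically: MRM treats each pairwise distance as an observation, so m sites produce m(m-1)/2 rows, and the information criteria penalize complexity relative to that inflated n. A minimal sketch with invented numbers (the RSS value and parameter counts are illustrative only):

```python
import math

def aic(n, k, rss):
    """AIC for a least-squares model with k parameters, in the usual
    Gaussian log-likelihood form."""
    return n * math.log(rss / n) + 2 * k

def aicc(n, k, rss):
    """Small-sample corrected AIC."""
    return aic(n, k, rss) + 2 * k * (k + 1) / (n - k - 1)

def bic(n, k, rss):
    """BIC: the complexity penalty grows with log(n)."""
    return n * math.log(rss / n) + math.log(n) * k

# m sites yield n = m*(m-1)/2 non-independent pairwise "observations".
m_sites = 30
n_pairs = m_sites * (m_sites - 1) // 2          # 435 pairs from only 30 sites
# Penalty difference between a 5- and a 3-parameter model, computed at the
# inflated n rather than at the number of independent sites.
complexity_penalty = bic(n_pairs, 5, 80.0) - bic(n_pairs, 3, 80.0)
```

    Because the pairwise rows are not independent, the likelihood term is overstated relative to the penalty term, which is consistent with the systematic preference for overly complex models the simulations report.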

  15. Rapid and selective extraction of multiple macrolide antibiotics in foodstuff samples based on magnetic molecularly imprinted polymers.

    PubMed

    Zhou, Yusun; Zhou, Tingting; Jin, Hua; Jing, Tao; Song, Bin; Zhou, Yikai; Mei, Surong; Lee, Yong-Ill

    2015-05-01

    Magnetic molecularly imprinted polymers (MMIPs) were prepared by surface molecular imprinting using erythromycin (ERY) as the template molecule and Fe3O4 nanoparticles as the support substrate. The MMIPs possessed a high adsorption capacity of 94.1 mg/g for ERY, and the imprinting factor of 11.9 indicated a good imprinting effect for ERY. Selectivity evaluation demonstrated favorable selectivity of the MMIPs for multiple macrolide antibiotics (MACs). Using the MMIPs as the adsorptive material, a rapid and convenient magnetic solid-phase extraction (MSPE) procedure was established for simultaneous and selective separation of six MACs in pork, fish, and shrimp samples; the MACs were then subjected to high-performance liquid chromatography-ultraviolet (HPLC-UV) analysis. At different fortified concentrations, the extraction recoveries could reach 89.1% and the relative standard deviations were lower than 12.4%. The chromatograms revealed that the response signals of MACs in spiked samples were greatly enhanced and matrix interferences were effectively eliminated after treatment with MSPE. The proposed MSPE procedure coupled with HPLC-UV achieved selective and sensitive determination of multiple MACs in foodstuff samples. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Selected well and ground-water chemistry data for the Boise River Valley, southwestern Idaho, 1990-95

    USGS Publications Warehouse

    Parliman, D.J.; Boyle, Linda; Nicholls, Sabrina

    1996-01-01

    Water samples were collected from 903 wells in the Boise River Valley, Idaho, from January 1990 through December 1995. Selected well information and analyses of 1,357 water samples are presented. Analyses include physical properties and concentrations of nutrients, bacteria, major ions, selected trace elements, radon-222, volatile organic compounds, and pesticides.

  17. THE SELECTION OF A NATIONAL RANDOM SAMPLE OF TEACHERS FOR EXPERIMENTAL CURRICULUM EVALUATION.

    ERIC Educational Resources Information Center

    WELCH, WAYNE W.; AND OTHERS

    MEMBERS OF THE EVALUATION SECTION OF HARVARD PROJECT PHYSICS, DESCRIBING WHAT IS SAID TO BE THE FIRST ATTEMPT TO SELECT A NATIONAL RANDOM SAMPLE OF (HIGH SCHOOL PHYSICS) TEACHERS, LIST THE STEPS AS (1) PURCHASE OF A LIST OF PHYSICS TEACHERS FROM THE NATIONAL SCIENCE TEACHERS ASSOCIATION (MOST COMPLETE AVAILABLE), (2) SELECTION OF 136 NAMES BY A…

  18. Brief Report: The Effect of Delayed Matching to Sample on Stimulus Over-Selectivity

    ERIC Educational Resources Information Center

    Reed, Phil

    2012-01-01

    Stimulus over-selectivity occurs when one aspect of the environment controls behavior at the expense of other equally salient aspects. Participants were trained on a match-to-sample (MTS) discrimination task. Levels of over-selectivity in a group of children (4-18 years) with Autism Spectrum Disorders (ASD) were compared with a mental-aged matched…

  19. 40 CFR Appendix A to Subpart F of... - Sampling Plans for Selective Enforcement Auditing of Small Nonroad Engines

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Enforcement Auditing of Small Nonroad Engines A Appendix A to Subpart F of Part 90 Protection of Environment...-IGNITION ENGINES AT OR BELOW 19 KILOWATTS Selective Enforcement Auditing Pt. 90, Subpt. F, App. A Appendix A to Subpart F of Part 90—Sampling Plans for Selective Enforcement Auditing of Small Nonroad Engines...

  20. Re-Emergence of Under-Selected Stimuli, after the Extinction of Over-Selected Stimuli in an Automated Match to Samples Procedure

    ERIC Educational Resources Information Center

    Broomfield, Laura; McHugh, Louise; Reed, Phil

    2008-01-01

    Stimulus over-selectivity occurs when one of potentially many aspects of the environment comes to control behaviour. In two experiments, adults with no developmental disabilities were trained and tested in an automated match to samples (MTS) paradigm. In Experiment 1, participants completed two conditions, in one of which the over-selected…

  1. Establishing Interpretive Consistency When Mixing Approaches: Role of Sampling Designs in Evaluations

    ERIC Educational Resources Information Center

    Collins, Kathleen M. T.; Onwuegbuzie, Anthony J.

    2013-01-01

    The goal of this chapter is to recommend quality criteria to guide evaluators' selections of sampling designs when mixing approaches. First, we contextualize our discussion of quality criteria and sampling designs by discussing the concept of interpretive consistency and how it impacts sampling decisions. Embedded in this discussion are…

  2. On estimation in k-tree sampling

    Treesearch

    Christoph Kleinn; Frantisek Vilcko

    2007-01-01

    The plot design known as k-tree sampling involves taking the k nearest trees from a selected sample point as sample trees. While this plot design is very practical and easily applied in the field for moderate values of k, unbiased estimation remains a problem. In this article, we give a brief introduction to the...
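    A common naive estimator in k-tree sampling, and a source of the bias the authors discuss, divides k by the area of the circle through the k-th nearest tree. A toy sketch with invented coordinates (this is not the estimator proposed in the article):

```python
import math

# Hypothetical mapped tree coordinates in metres; the sample point is the origin.
trees = [(3.0, 1.0), (-2.5, 2.0), (6.0, -1.5), (0.5, -4.0), (8.0, 3.0), (-5.0, -5.5)]

def k_tree_density(trees, k):
    """Naive k-tree density estimate: k trees inside the circle through the
    k-th nearest tree, i.e. k / (pi * r_k**2). Known to be biased, which is
    the estimation problem the article addresses."""
    r_k = sorted(math.hypot(x, y) for x, y in trees)[k - 1]
    return k / (math.pi * r_k ** 2)

density = k_tree_density(trees, k=4)   # stems per square metre
```

    The practical appeal is clear from the sketch: the field crew only has to find the k nearest trees, with no fixed plot boundary to check, and the bias arises precisely because the circle's radius is determined by the trees themselves.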

  3. 76 FR 56141 - Notice of Intent To Request New Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-12

    ... level surveys of similar scope and size. The sample for each selected community will be strategically... of 2 hours per sample community. Full Study: The maximum sample size for the full study is 2,812... questionnaires. The initial sample size for this phase of the research is 100 respondents (10 respondents per...

  4. Community Violence and PTSD in Selected South African Townships

    ERIC Educational Resources Information Center

    Dinan, B. Ann; McCall, George J.; Gibson, Diana

    2004-01-01

    Given the high rates of crime in South Africa's townships, nonpolitical violence outside the home and its psychological impact on women were investigated within two samples, the primary a help-seeking sample and the secondary a community sample. In the help-seeking sample, two thirds of the women reported having experienced several traumatic…

  5. Calibrating SALT: a sampling scheme to improve estimates of suspended sediment yield

    Treesearch

    Robert B. Thomas

    1986-01-01

    Abstract - SALT (Selection At List Time) is a variable probability sampling scheme that provides unbiased estimates of suspended sediment yield and its variance. SALT performs better than standard schemes at estimating variance. Sampling probabilities are based on a sediment rating function, which promotes greater sampling intensity during periods of high...
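    The SALT idea, sampling with probability proportional to a sediment-rating prediction and then expanding with an unbiased (Hansen-Hurwitz-type) estimator, can be sketched as follows. The discharges, rating coefficients, and simulated "measurements" are all invented for illustration; this is not the calibration procedure of the paper.

```python
import random

random.seed(7)

# Hypothetical hourly discharges (m^3/s).
discharges = [5, 8, 40, 120, 15, 300, 22, 9, 75, 6]

def rating_curve(q, a=0.05, b=1.8):
    """Power-law sediment rating function s = a * q**b (illustrative coefficients)."""
    return a * q ** b

def salt_estimate(flows, n):
    """SALT-style sketch: sample periods with probability proportional to the
    rating-curve prediction, then estimate total load with a
    Hansen-Hurwitz-type estimator. The 'measured' load is simulated here as
    the prediction plus up to 10% noise."""
    predicted = [rating_curve(q) for q in flows]
    total_pred = sum(predicted)
    probs = [p / total_pred for p in predicted]
    idx = random.choices(range(len(flows)), weights=probs, k=n)
    measured = [predicted[i] * random.uniform(0.9, 1.1) for i in idx]
    return sum(m / probs[i] for i, m in zip(idx, measured)) / n

total_load_estimate = salt_estimate(discharges, n=5)
```

    Weighting selection toward predicted high-load periods is what "promotes greater sampling intensity during periods of high" flow, while dividing each measurement by its selection probability keeps the total unbiased.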

  6. Effects of sampling techniques on physical parameters and concentrations of selected persistent organic pollutants in suspended matter.

    PubMed

    Pohlert, Thorsten; Hillebrand, Gudrun; Breitung, Vera

    2011-06-01

    This study focuses on the effect of sampling techniques for suspended matter in stream water on the subsequently determined particle-size distribution and concentrations of total organic carbon and selected persistent organic pollutants. The key questions are whether differences between the sampling techniques are due to the separation principle of the devices or to the difference between time-proportional and integral sampling. Several multivariate homogeneity tests were conducted on an extensive set of field data covering the period from 2002 to 2007, when up to three different sampling techniques were deployed in parallel at four monitoring stations on the River Rhine. The results indicate homogeneity for polychlorinated biphenyls, but significant effects of the sampling techniques on particle size, organic carbon, and hexachlorobenzene. The effects can be amplified depending on the site characteristics of the monitoring stations.

  7. Evaluating adequacy of the representative stream reach used in invertebrate monitoring programs

    USGS Publications Warehouse

    Rabeni, C.F.; Wang, N.; Sarver, R.J.

    1999-01-01

    Selection of a representative stream reach is implicitly or explicitly recommended in many biomonitoring protocols using benthic invertebrates. We evaluated the adequacy of sampling a single stream reach selected on the basis of its appearance. We 1st demonstrated the precision of our within-reach sampling. Then we sampled 3 or 4 reaches (each ~20x mean width) within an 8-16 km segment on each of 8 streams in 3 ecoregions and calculated 4 common metrics: 1) total taxa; 2) Ephemeroptera, Plecoptera, and Trichoptera taxa; 3) biotic index; and 4) Shannon's diversity index. In only 6% of possible cases was the coefficient of variation for any of the metrics reduced >10% by sampling additional reaches. Sampling a 2nd reach on a stream improved the ability to detect impairment by an average of only 9.3%. Sampling a 3rd reach on a stream additionally improved ability to detect impairment by only 4.5%. We concluded that a single well-chosen reach, if adequately sampled, can be representative of an entire stream segment, and sampling additional reaches within a segment may not be cost effective.
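    Two of the four metrics are straightforward to compute from a reach sample. A minimal sketch with invented counts (taxa names and numbers are illustrative, not data from the study):

```python
import math

# Hypothetical invertebrate counts from a single sampled reach (taxon -> individuals).
counts = {"Baetis": 40, "Hydropsyche": 25, "Isoperla": 10, "Chironomus": 25}

def shannon_diversity(counts):
    """Shannon's diversity index H' = -sum(p_i * ln p_i) over observed taxa."""
    n = sum(counts.values())
    return -sum((c / n) * math.log(c / n) for c in counts.values() if c > 0)

total_taxa = len(counts)              # metric 1: taxa richness
h_prime = shannon_diversity(counts)   # metric 4: Shannon's diversity
```

    H' is maximized at ln(total_taxa) when all taxa are equally abundant, so comparing H' across reaches reflects evenness as well as richness.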

  8. A nonparametric method to generate synthetic populations to adjust for complex sampling design features.

    PubMed

    Dong, Qi; Elliott, Michael R; Raghunathan, Trivellore E

    2014-06-01

    Outside of the survey sampling literature, samples are often assumed to be generated by a simple random sampling process that produces independent and identically distributed (IID) samples. Many statistical methods are developed largely in this IID world. Application of these methods to data from complex sample surveys without making allowance for the survey design features can lead to erroneous inferences. Hence, much time and effort have been devoted to developing statistical methods to analyze complex survey data and account for the sample design. This issue is particularly important when generating synthetic populations using finite population Bayesian inference, as is often done in missing data or disclosure risk settings, or when combining data from multiple surveys. By extending previous work in the finite population Bayesian bootstrap literature, we propose a method to generate synthetic populations from a posterior predictive distribution in a fashion that inverts the complex sampling design features and generates simple random samples from a superpopulation point of view, adjusting the complex data so that they can be analyzed as simple random samples. We consider a simulation study with a stratified, clustered unequal-probability of selection sample design, and use the proposed nonparametric method to generate synthetic populations for the 2006 National Health Interview Survey (NHIS), and the Medical Expenditure Panel Survey (MEPS), which are stratified, clustered unequal-probability of selection sample designs.
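    A heavily simplified sketch of the idea, inverting the design by resampling with perturbed inverse-probability weights, is shown below. The weight-scaled exponential draws are only an illustrative stand-in for the Dirichlet/Polya-urn machinery of the finite population Bayesian bootstrap, and all values and probabilities are invented.

```python
import random

random.seed(1)

# Hypothetical complex-sample values and their known design inclusion probabilities.
sample_values = [10, 12, 15, 20, 30]
inclusion_probs = [0.50, 0.40, 0.20, 0.10, 0.05]
design_weights = [1 / p for p in inclusion_probs]   # inverse-probability weights

def synthetic_population(values, weights, pop_size):
    """Weighted Bayesian-bootstrap sketch: perturb the design weights with
    random exponential draws (an illustrative stand-in for the Dirichlet
    perturbation used in the finite population Bayesian bootstrap),
    normalize, and resample a synthetic population of pop_size units."""
    perturbed = [w * random.expovariate(1.0) for w in weights]
    total = sum(perturbed)
    probs = [g / total for g in perturbed]
    return random.choices(values, weights=probs, k=pop_size)

population = synthetic_population(sample_values, design_weights, pop_size=1000)
```

    Units that were under-represented by design (small inclusion probability) are replicated more often in the synthetic population, which is what lets downstream analyses treat draws from it as if they were simple random samples.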

  9. A nonparametric method to generate synthetic populations to adjust for complex sampling design features

    PubMed Central

    Dong, Qi; Elliott, Michael R.; Raghunathan, Trivellore E.

    2017-01-01

    Outside of the survey sampling literature, samples are often assumed to be generated by a simple random sampling process that produces independent and identically distributed (IID) samples. Many statistical methods are developed largely in this IID world. Application of these methods to data from complex sample surveys without making allowance for the survey design features can lead to erroneous inferences. Hence, much time and effort have been devoted to developing statistical methods to analyze complex survey data and account for the sample design. This issue is particularly important when generating synthetic populations using finite population Bayesian inference, as is often done in missing data or disclosure risk settings, or when combining data from multiple surveys. By extending previous work in the finite population Bayesian bootstrap literature, we propose a method to generate synthetic populations from a posterior predictive distribution in a fashion that inverts the complex sampling design features and generates simple random samples from a superpopulation point of view, adjusting the complex data so that they can be analyzed as simple random samples. We consider a simulation study with a stratified, clustered unequal-probability of selection sample design, and use the proposed nonparametric method to generate synthetic populations for the 2006 National Health Interview Survey (NHIS), and the Medical Expenditure Panel Survey (MEPS), which are stratified, clustered unequal-probability of selection sample designs. PMID:29200608

  10. Monitoring and identification of spatiotemporal landscape changes in multiple remote sensing images by using a stratified conditional Latin hypercube sampling approach and geostatistical simulation.

    PubMed

    Lin, Yu-Pin; Chu, Hone-Jay; Huang, Yu-Long; Tang, Chia-Hsi; Rouhani, Shahrokh

    2011-06-01

    This study develops a stratified conditional Latin hypercube sampling (scLHS) approach for multiple, remotely sensed, normalized difference vegetation index (NDVI) images. The objective is to sample, monitor, and delineate spatiotemporal landscape changes, including spatial heterogeneity and variability, in a given area. The scLHS approach, which is based on the variance quadtree technique (VQT) and the conditional Latin hypercube sampling (cLHS) method, selects samples in order to delineate landscape changes from multiple NDVI images. The images are then mapped for calibration and validation by using sequential Gaussian simulation (SGS) with the scLHS selected samples. Spatial statistical results indicate that in terms of their statistical distribution, spatial distribution, and spatial variation, the statistics and variograms of the scLHS samples resemble those of multiple NDVI images more closely than those of cLHS and VQT samples. Moreover, the accuracy of simulated NDVI images based on SGS with scLHS samples is significantly better than that of simulated NDVI images based on SGS with cLHS samples and VQT samples, respectively. Furthermore, the proposed approach efficiently monitors the spatial characteristics of landscape changes, including the statistics, spatial variability, and heterogeneity of NDVI images. In addition, SGS with the scLHS samples effectively reproduces spatial patterns and landscape changes in multiple NDVI images.
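    The Latin-hypercube ingredient, choosing samples so that each quantile stratum of a variable is represented, can be sketched greedily. This toy version stratifies only the first variable and omits both the conditioning on all variables and the annealing search that cLHS actually performs; all pixel values are simulated.

```python
import random

random.seed(3)

# Hypothetical NDVI values at candidate pixels for two dates (pixel -> (t1, t2)).
pixels = {i: (random.random(), random.random()) for i in range(200)}

def clhs_like_sample(data, n):
    """Greedy sketch of the Latin-hypercube idea: sort pixels by the first
    variable, split them into n equal quantile strata, and pick one pixel
    from each stratum so the sample reproduces that variable's marginal
    distribution (a toy stand-in for the full cLHS optimization)."""
    ids = sorted(data, key=lambda i: data[i][0])
    strata = [ids[j * len(ids) // n:(j + 1) * len(ids) // n] for j in range(n)]
    return [random.choice(s) for s in strata if s]

sample_ids = clhs_like_sample(pixels, n=10)
```

    Covering every stratum is what lets a small sample reproduce the statistics and variogram of the full image, which is the property the spatial statistical comparison above evaluates.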

  11. A new selective enrichment procedure for isolating Pasteurella multocida from avian and environmental samples

    USGS Publications Warehouse

    Moore, M.K.; Cicnjak-Chubbs, L.; Gates, R.J.

    1994-01-01

A selective enrichment procedure, using two new selective media, was developed to isolate Pasteurella multocida from wild birds and environmental samples. These media were developed by testing 15 selective agents with six isolates of P. multocida of wild avian origin and seven other bacteria representing genera frequently found in environmental and avian samples. The resulting media—Pasteurella multocida selective enrichment broth and Pasteurella multocida selective agar—consisted of a blood agar medium at pH 10 containing gentamicin, potassium tellurite, and amphotericin B. Media were tested to determine: 1) selectivity when attempting isolation from pond water and avian carcasses, 2) sensitivity for detection of low numbers of P. multocida from pure and mixed cultures, 3) host range specificity of the media, and 4) performance compared with standard blood agar. With the new selective enrichment procedure, P. multocida was isolated from inoculated (60 organisms/ml) pond water 84% of the time, whereas when standard blood agar was used, the recovery rate was 0%.

  12. Compressive strength of human openwedges: a selection method

    NASA Astrophysics Data System (ADS)

    Follet, H.; Gotteland, M.; Bardonnet, R.; Sfarghiu, A. M.; Peyrot, J.; Rumelhart, C.

    2004-02-01

A series of 44 samples of bone wedges of human origin, intended for allograft openwedge osteotomy and obtained without particular precautions during hip arthroplasty, were re-examined. After chemical treatment for viral inactivation, lyophilisation and radio-sterilisation (intended to produce optimal health safety), the compressive strength, independent of age, sex and the height of the sample (or angle of cut), proved to be too widely dispersed [10-158 MPa] in the first study. We propose a method for selecting samples which takes into account their geometry (width, length, thicknesses, cortical surface area). Statistical methods (Principal Components Analysis PCA, Hierarchical Cluster Analysis, Multilinear regression) allowed final selection of 29 samples having a mean compressive strength σ_max = 103 ± 26 MPa and a range of [61-158 MPa]. These results are equivalent to or greater than those of materials currently used in openwedge osteotomy.

  13. Assessment of water quality index of bore well water samples from some selected locations of South Gujarat, India.

    PubMed

    Tripathi, S; Patel, H M; Srivastava, P K; Bafna, A M

    2013-10-01

The present study calculates the water quality index (WQI) of some selected sites from South Gujarat (India) and assesses the impact of industries, agriculture and human activities. Chemical parameters were monitored for the calculation of WQI of some selected bore well samples. The results revealed that the WQI of some bore well samples exceeded acceptable levels due to the dumping of wastes from municipal, industrial and domestic sources, as well as agricultural runoff. Inverse Distance Weighting (IDW) was implemented for interpolation of each water quality parameter (pH, EC, alkalinity, total hardness, chloride, nitrate and sulphate) for the entire sampled area. The bore water is unsuitable for drinking, and if the present state of affairs continues, the bores may soon become ecologically dead.
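
    The Inverse Distance Weighting interpolation mentioned above can be sketched as follows (a generic IDW implementation, assuming the standard power-2 weighting; not tied to the study's GIS software):

    ```python
    import numpy as np

    def idw(xy_known, values, xy_query, power=2.0, eps=1e-12):
        """Inverse Distance Weighting: each query point gets a weighted
        mean of the known values, with weights = 1 / distance**power."""
        xy_known = np.asarray(xy_known, float)
        values = np.asarray(values, float)
        out = []
        for q in np.asarray(xy_query, float):
            d = np.linalg.norm(xy_known - q, axis=1)
            if d.min() < eps:            # query coincides with a sample site
                out.append(values[d.argmin()])
                continue
            w = 1.0 / d**power
            out.append((w * values).sum() / w.sum())
        return np.array(out)
    ```

    IDW is exact at the sample sites and reverts to a distance-weighted average elsewhere, which suits sparsely distributed bore well measurements.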

  14. Passive Sampling Methods for Contaminated Sediments: Practical Guidance for Selection, Calibration, and Implementation

    EPA Science Inventory

This article provides practical guidance on the use of passive sampling methods (PSMs) that target the freely dissolved concentration (Cfree) for improved exposure assessment of hydrophobic organic chemicals in sediments. Primary considerations for selecting a PSM for a specific a...

  15. Recovery of Sublethally Injured Bacteria Using Selective Agar Overlays.

    ERIC Educational Resources Information Center

    McKillip, John L.

    2001-01-01

    This experiment subjects bacteria in a food sample and an environmental sample to conditions of sublethal stress in order to assess the effectiveness of the agar overlay method to recover sublethally injured cells compared to direct plating onto the appropriate selective medium. (SAH)

  16. Methods for Producing High-Performance Silicon Carbide Fibers, Architectural Preforms, and High-Temperature Composite Structures

    NASA Technical Reports Server (NTRS)

    Yun, Hee-Mann (Inventor); DiCarlo, James A. (Inventor)

    2014-01-01

Methods are disclosed for producing architectural preforms and high-temperature composite structures containing high-strength ceramic fibers with reduced preforming stresses within each fiber, with an in-situ grown coating on each fiber surface, with reduced boron within the bulk of each fiber, and with improved tensile creep and rupture resistance properties for each fiber. The methods include the steps of preparing an original sample of a preform formed from a pre-selected high-strength silicon carbide ceramic fiber type, placing the original sample in a processing furnace under a pre-selected preforming stress state and thermally treating the sample in the processing furnace at a pre-selected processing temperature and hold time in a processing gas having a pre-selected composition, pressure, and flow rate. For the high-temperature composite structures, the method includes additional steps of depositing a thin interphase coating on the surface of each fiber and forming a ceramic or carbon-based matrix within the sample.

  17. Sampling design for the Study of Cardiovascular Risks in Adolescents (ERICA).

    PubMed

    Vasconcellos, Mauricio Teixeira Leite de; Silva, Pedro Luis do Nascimento; Szklo, Moyses; Kuschnir, Maria Cristina Caetano; Klein, Carlos Henrique; Abreu, Gabriela de Azevedo; Barufaldi, Laura Augusta; Bloch, Katia Vergetti

    2015-05-01

    The Study of Cardiovascular Risk in Adolescents (ERICA) aims to estimate the prevalence of cardiovascular risk factors and metabolic syndrome in adolescents (12-17 years) enrolled in public and private schools of the 273 municipalities with over 100,000 inhabitants in Brazil. The study population was stratified into 32 geographical strata (27 capitals and five sets with other municipalities in each macro-region of the country) and a sample of 1,251 schools was selected with probability proportional to size. In each school three combinations of shift (morning and afternoon) and grade were selected, and within each of these combinations, one class was selected. All eligible students in the selected classes were included in the study. The design sampling weights were calculated by the product of the reciprocals of the inclusion probabilities in each sampling stage, and were later calibrated considering the projections of the numbers of adolescents enrolled in schools located in the geographical strata by sex and age.
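
    The weight construction described above, the product of the reciprocals of the inclusion probabilities across sampling stages, can be sketched as follows (a minimal illustration; the subsequent calibration to enrollment projections is not shown):

    ```python
    def design_weight(inclusion_probs):
        """Design weight of a sampled student: the product of the
        reciprocals of the inclusion probabilities at each stage
        (here: school, shift-grade combination, class)."""
        w = 1.0
        for p in inclusion_probs:
            w *= 1.0 / p
        return w

    # e.g. school drawn with probability 0.1, shift-grade 0.5, class 0.25:
    # each sampled student then represents 1/(0.1*0.5*0.25) = 80 students
    ```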

  18. [Combining speech sample and feature bilateral selection algorithm for classification of Parkinson's disease].

    PubMed

    Zhang, Xiaoheng; Wang, Lirui; Cao, Yao; Wang, Pin; Zhang, Cheng; Yang, Liuyang; Li, Yongming; Zhang, Yanling; Cheng, Oumei

    2018-02-01

    Diagnosis of Parkinson's disease (PD) based on speech data has been proved to be an effective way in recent years. However, current researches just care about the feature extraction and classifier design, and do not consider the instance selection. Former research by authors showed that the instance selection can lead to improvement on classification accuracy. However, no attention is paid on the relationship between speech sample and feature until now. Therefore, a new diagnosis algorithm of PD is proposed in this paper by simultaneously selecting speech sample and feature based on relevant feature weighting algorithm and multiple kernel method, so as to find their synergy effects, thereby improving classification accuracy. Experimental results showed that this proposed algorithm obtained apparent improvement on classification accuracy. It can obtain mean classification accuracy of 82.5%, which was 30.5% higher than the relevant algorithm. Besides, the proposed algorithm detected the synergy effects of speech sample and feature, which is valuable for speech marker extraction.

  19. Analyses of water, core material, and elutriate samples collected near Buras, Louisiana (New Orleans to Venice, Louisiana, Hurricane Protection Project)

    USGS Publications Warehouse

    Leone, Harold A.

    1977-01-01

    Eight core-material-sampling sites were chosen by the U.S. Army Corps of Engineers as possible borrow areas for fill material to be used in levee contruction near Buras, La. Eleven receiving-water sites also were selected to represent the water that will contact the porposed levees. Analyses of selected nutrients, metals, pesticides, and other organic constitutents were performed upon these bed-material and native-water samples as well as upon elutriate samples of specific core material-receiving water systems. The results of these analyses are presented without interpretation. (Woodard-USGS)

  20. Current trends and challenges in sample preparation for metallic nanoparticles analysis in daily products and environmental samples: A review

    NASA Astrophysics Data System (ADS)

    De la Calle, Inmaculada; Menta, Mathieu; Séby, Fabienne

    2016-11-01

    Due to the increasing use of nanoparticles (NPs) in consumer products, it becomes necessary to develop different strategies for their detection, identification, characterization and quantification in a wide variety of samples. Since the analysis of NPs in consumer products and environmental samples is particularly troublesome, a detailed description of challenges and limitations is given here. This review mainly focuses on sample preparation procedures applied for the mostly used techniques for metallic and metal oxide NPs characterization in consumer products and most outstanding publications of biological and environmental samples (from 2006 to 2015). We summarize the procedures applied for total metal content, extraction/separation and/or preconcentration of NPs from the matrix, separation of metallic NPs from their ions or from larger particles and NPs' size fractionation. Sample preparation procedures specifically for microscopy are also described. Selected applications in cosmetics, food, other consumer products, biological tissues and environmental samples are presented. Advantages and inconveniences of those procedures are considered. Moreover, selected simplified schemes for NPs sample preparation, as well as usual techniques applied are included. Finally, promising directions for further investigations are discussed.

  1. Automating data analysis for two-dimensional gas chromatography/time-of-flight mass spectrometry non-targeted analysis of comparative samples.

    PubMed

    Titaley, Ivan A; Ogba, O Maduka; Chibwe, Leah; Hoh, Eunha; Cheong, Paul H-Y; Simonich, Staci L Massey

    2018-03-16

Non-targeted analysis of environmental samples, using comprehensive two-dimensional gas chromatography coupled with time-of-flight mass spectrometry (GC × GC/ToF-MS), poses significant data analysis challenges due to the large number of possible analytes. Non-targeted data analysis of complex mixtures is prone to human bias and is laborious, particularly for comparative environmental samples such as contaminated soil pre- and post-bioremediation. To address this research bottleneck, we developed OCTpy, a Python™ script that acts as a data reduction filter to automate GC × GC/ToF-MS data analysis from LECO® ChromaTOF® software and facilitates selection of analytes of interest based on peak area comparison between comparative samples. We used data from polycyclic aromatic hydrocarbon (PAH) contaminated soil, pre- and post-bioremediation, to assess the effectiveness of OCTpy in facilitating the selection of analytes that have formed or degraded following treatment. Using datasets from the soil extracts pre- and post-bioremediation, OCTpy selected, on average, 18% of the initial suggested analytes generated by the LECO® ChromaTOF® software Statistical Compare feature. Based on this list, 63-100% of the candidate analytes identified by a highly trained individual were also selected by OCTpy. This process was accomplished in several minutes per sample, whereas manual data analysis took several hours per sample. OCTpy automates the analysis of complex mixtures of comparative samples, reduces the potential for human error during heavy data handling and decreases data analysis time by at least tenfold. Copyright © 2018 Elsevier B.V. All rights reserved.
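
    The kind of data-reduction filter OCTpy implements, keeping only analytes whose peak areas differ substantially between comparative samples, can be sketched as follows (an illustrative stand-in, not OCTpy's actual code; the fold-change threshold and `min_area` cutoff are assumptions):

    ```python
    def select_changed_analytes(pre, post, fold=2.0, min_area=1000.0):
        """Keep analytes whose peak area grew or shrank by at least `fold`
        between comparative samples (e.g. pre- vs post-bioremediation).
        pre/post: dicts mapping analyte name -> peak area."""
        selected = {}
        for name in set(pre) | set(post):
            a, b = pre.get(name, 0.0), post.get(name, 0.0)
            if max(a, b) < min_area:       # ignore trace-level peaks
                continue
            # an analyte absent from one sample has formed or fully degraded
            if a == 0.0 or b == 0.0 or b / a >= fold or a / b >= fold:
                selected[name] = (a, b)
        return selected
    ```

    Screening on fold change rather than raw area keeps analytes that formed or degraded while discarding the stable background, which is what shrinks the candidate list handed to the analyst.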

  2. Soil sampling strategies: evaluation of different approaches.

    PubMed

    de Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria; Mufato, Renzo; Sartori, Giuseppe; Stocchero, Giulia

    2008-11-01

The National Environmental Protection Agency of Italy (APAT) performed a soil sampling intercomparison, inviting 14 regional agencies to test their own soil sampling strategies. The intercomparison was carried out at a reference site, previously characterised for metal mass fraction distribution. A wide range of sampling strategies, in terms of sampling patterns, type and number of samples collected, were used to assess the mean mass fraction values of some selected elements. The different strategies led in general to acceptable bias values (D) less than 2σ, calculated according to ISO 13258. Sampling on arable land was relatively easy, with comparable results between different sampling strategies.

  3. Obscured AGN at z ~ 1 from the zCOSMOS-Bright Survey. I. Selection and optical properties of a [Ne v]-selected sample

    NASA Astrophysics Data System (ADS)

    Mignoli, M.; Vignali, C.; Gilli, R.; Comastri, A.; Zamorani, G.; Bolzonella, M.; Bongiorno, A.; Lamareille, F.; Nair, P.; Pozzetti, L.; Lilly, S. J.; Carollo, C. M.; Contini, T.; Kneib, J.-P.; Le Fèvre, O.; Mainieri, V.; Renzini, A.; Scodeggio, M.; Bardelli, S.; Caputi, K.; Cucciati, O.; de la Torre, S.; de Ravel, L.; Franzetti, P.; Garilli, B.; Iovino, A.; Kampczyk, P.; Knobel, C.; Kovač, K.; Le Borgne, J.-F.; Le Brun, V.; Maier, C.; Pellò, R.; Peng, Y.; Perez Montero, E.; Presotto, V.; Silverman, J. D.; Tanaka, M.; Tasca, L.; Tresse, L.; Vergani, D.; Zucca, E.; Bordoloi, R.; Cappi, A.; Cimatti, A.; Koekemoer, A. M.; McCracken, H. J.; Moresco, M.; Welikala, N.

    2013-08-01

Aims: The application of multi-wavelength selection techniques is essential for obtaining a complete and unbiased census of active galactic nuclei (AGN). We present here a method for selecting z ~ 1 obscured AGN from optical spectroscopic surveys. Methods: A sample of 94 narrow-line AGN with 0.65 < z < 1.20 was selected from the 20k-Bright zCOSMOS galaxy sample by detection of the high-ionization [Ne v] λ3426 line. The presence of this emission line in a galaxy spectrum is indicative of nuclear activity, although the selection is biased toward low absorbing column densities on narrow-line region or galactic scales. A similar sample of unobscured (type 1) AGN was collected applying the same analysis to zCOSMOS broad-line objects. This paper presents and compares the optical spectral properties of the two AGN samples. Taking advantage of the large amount of data available in the COSMOS field, the properties of the [Ne v]-selected type 2 AGN were investigated, focusing on their host galaxies, X-ray emission, and optical line-flux ratios. Finally, a previously developed diagnostic, based on the X-ray-to-[Ne v] luminosity ratio, was exploited to search for the more heavily obscured AGN. Results: We found that [Ne v]-selected narrow-line AGN have Seyfert 2-like optical spectra, although their emission line ratios are diluted by a star-forming component. The ACS morphologies and stellar component in the optical spectra indicate a preference for our type 2 AGN to be hosted in early-type spirals with stellar masses greater than 10^9.5-10 M⊙, on average higher than those of the galaxy parent sample. The fraction of galaxies hosting [Ne v]-selected obscured AGN increases with the stellar mass, reaching a maximum of about 3% at ≈2 × 10^11 M⊙.
A comparison with other selection techniques at z ~ 1, namely the line-ratio diagnostics and X-ray detections, shows that the detection of the [Ne v] λ3426 line is an effective method for selecting AGN in the optical band, in particular the most heavily obscured ones, but cannot provide a complete census of type 2 AGN by itself. Finally, the high fraction of [Ne v]-selected type 2 AGN not detected in medium-deep (≈100-200 ks) Chandra observations (67%) is suggestive of the inclusion of Compton-thick (i.e., with NH > 10^24 cm^-2) sources in our sample. The presence of a population of heavily obscured AGN is corroborated by the X-ray-to-[Ne v] ratio; we estimated, by means of an X-ray stacking technique and simulations, that the Compton-thick fraction in our sample of type 2 AGN is 43 ± 4% (statistical errors only), which agrees well with standard assumptions by XRB synthesis models.

  4. Do men and women report their sexual partnerships differently? Evidence from Kisumu, Kenya.

    PubMed

    Clark, Shelley; Kabiru, Caroline; Zulu, Eliya

    2011-12-01

    It is generally believed that men and women misreport their sexual behaviors, which undermines the ability of researchers, program designers and health care providers to assess whether these behaviors compromise individuals' sexual and reproductive health. Data on 1,299 recent sexual partnerships were collected in a 2007 survey of 1,275 men and women aged 18-24 and living in Kisumu, Kenya. Chi-square and t tests were used to examine how sample selection bias and selective partnership reporting may result in gender differences in reported sexual behaviors. Correlation coefficients and kappa statistics were calculated in further analysis of a sample of 280 matched marital and nonmarital couples to assess agreement on reported behaviors. Even after adjustment for sample selection bias, men reported twice as many partnerships as women (0.5 vs. 0.2), as well as more casual partnerships. However, when selective reporting was controlled for, aggregate gender differences in sexual behaviors almost entirely disappeared. In the matched-couples sample, men and women exhibited moderate to substantial levels of agreement for most relationship characteristics and behaviors, including type of relationship, frequency of sex and condom use. Finally, men and women tended to agree about whether men had other nonmarital partners, but disagreed about women's nonmarital partners. Both sample selection bias and selective partnership reporting can influence the level of agreement between men's and women's reports of sexual behaviors. Although men report more casual partners than do women, accounts of sexual behavior within reported relationships are generally reliable.
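
    The kappa statistic used in the matched-couples agreement analysis can be computed as follows (a generic two-rater Cohen's kappa; the example data are hypothetical):

    ```python
    from collections import Counter

    def cohens_kappa(r1, r2):
        """Cohen's kappa: chance-corrected agreement between two raters
        (here, the two partners' reports of the same behaviour)."""
        assert len(r1) == len(r2) and len(r1) > 0
        n = len(r1)
        # observed agreement: fraction of identical answers
        p_obs = sum(a == b for a, b in zip(r1, r2)) / n
        # expected agreement under independence, from each rater's margins
        c1, c2 = Counter(r1), Counter(r2)
        p_exp = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / n**2
        return (p_obs - p_exp) / (1 - p_exp)
    ```

    Kappa near 0 means agreement no better than chance; values around 0.4-0.8 correspond to the "moderate to substantial" range reported above.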

  5. Assessing generative braille responding following training in a matching-to-sample format.

    PubMed

    Putnam, Brittany C; Tiger, Jeffrey H

    2016-12-01

    We evaluated the effects of teaching sighted college students to select printed text letters given a braille sample stimulus in a matching-to-sample (MTS) format on the emergence of untrained (a) construction of print characters given braille samples, (b) construction of braille characters given print samples, (c) transcription of print characters given braille sample sentences, and (d) vocal reading given braille sample passages. The results demonstrated the generative development of these repertoires given MTS instruction. © 2016 Society for the Experimental Analysis of Behavior.

  6. Sample selection and preservation techniques for the Mars sample return mission

    NASA Technical Reports Server (NTRS)

    Tsay, Fun-Dow

    1988-01-01

It is proposed that a miniaturized electron spin resonance (ESR) spectrometer be developed as an effective, nondestructive sample selection and characterization instrument for the Mars Rover Sample Return mission. The ESR instrument can meet rover science payload requirements and yet has the capability and versatility to perform the following in situ Martian sample analyses: (1) detection of active oxygen species, and characterization of Martian surface chemistry and photocatalytic oxidation processes; (2) determination of paramagnetic Fe(3+) in clay silicate minerals, Mn(2+) in carbonates, and ferromagnetic centers of magnetite, maghemite and hematite; (3) search for organic compounds in the form of free radicals in subsoil, and detection of Martian fossil organic matter likely to be associated with carbonate and other sedimentary deposits. The proposed instrument is further detailed.

  7. A Miniaturized Spectrometer for Optimized Selection of Subsurface Samples for Future MSR Missions

    NASA Astrophysics Data System (ADS)

    De Sanctis, M. C.; Altieri, F.; De Angelis, S.; Ferrari, M.; Frigeri, A.; Biondi, D.; Novi, S.; Antonacci, F.; Gabrieli, R.; Paolinetti, R.; Villa, F.; Ammannito, A.; Mugnuolo, R.; Pirrotta, S.

    2018-04-01

    We present the concept of a miniaturized spectrometer based on the ExoMars2020/Ma_MISS experiment. Coupled with a drill tool, it will allow an assessment of subsurface composition and optimize the selection of martian samples with a high astrobiological potential.

  8. Geochemical maps showing the distribution and abundance of selected elements in stream-sediment samples, Solomon and Bendeleben 1 degree by 3 degree quadrangles, Seward Peninsula, Alaska

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, S.C.; King, H.D.; O'Leary, R.M.

Geochemical maps showing the distribution and abundance of selected elements in stream-sediment samples, Solomon and Bendeleben 1° by 3° quadrangles, Seward Peninsula, Alaska is presented.

  9. Concentrations of Elements in Sediments and Selective Fractions of Sediments, and in Natural Waters in Contact with Sediments from Lake Roosevelt, Washington, September 2004

    USGS Publications Warehouse

    Paulson, Anthony J.; Wagner, Richard J.; Sanzolone, Richard F.; Cox, Steven E.

    2006-01-01

Twenty-eight composite and replicate sediment samples from 8 Lake Roosevelt sites were collected and analyzed for 10 alkali and alkaline earth elements, 2 non-metals, 20 metals, and 4 lanthanide and actinide elements. All elements were detected in all sediment samples except for silver, which was detected in only 4 samples (the elements were detected in 95 percent of the 1,008 analyses). Sequential selective extraction procedures were performed on single composite samples from the eight sites. The percentage of detections for the 31 elements analyzed ranged from 76 percent for the first extraction fraction using a weak extractant to 93 percent for the four-acid dissolution of the sediments remaining after the third sequential selective extraction. Water samples in various degrees of contact with the sediment were analyzed for 10 alkali and alkaline earth elements, 5 non-metals, 25 metals, and 16 lanthanide and actinide elements. The filtered water samples included 10 samples from the reservoir water column at 8 sites, 32 samples of porewater, 55 samples from reservoir water overlying sediments in 8 cores from the site incubated in a field laboratory, and 24 water samples that were filtered after being tumbled with sediments from 8 sites. Overall, the concentrations of only 37 percent of the 6,776 analyses of the 121 water samples were greater than the reporting limit. Selenium, bismuth, chromium, niobium, silver, and zirconium were not detected in any water samples. The percentage of concentrations for the water samples that were above the reporting limit ranged from 14 percent for the lanthanide and actinide elements to 77 percent for the alkali and alkaline earth elements. Concentrations were greater than reporting limits in only 23 percent of the analyses of reservoir water and 29 percent of the analyses of reservoir water overlying incubation cores. 
In contrast, 47 and 48 percent of the concentrations of porewater and water samples tumbled with sediments, respectively, were greater than the reporting limit.

  10. Sample integrity evaluation and EPA method 325B interlaboratory comparison for select volatile organic compounds collected diffusively on Carbopack X sorbent tubes

    NASA Astrophysics Data System (ADS)

    Oliver, Karen D.; Cousett, Tamira A.; Whitaker, Donald A.; Smith, Luther A.; Mukerjee, Shaibal; Stallings, Casson; Thoma, Eben D.; Alston, Lillian; Colon, Maribel; Wu, Tai; Henkle, Stacy

    2017-08-01

    A sample integrity evaluation and an interlaboratory comparison were conducted in application of U.S. Environmental Protection Agency (EPA) Methods 325A and 325B for diffusively monitoring benzene and other selected volatile organic compounds (VOCs) using Carbopack X sorbent tubes. To evaluate sample integrity, VOC samples were refrigerated for up to 240 days and analyzed using thermal desorption/gas chromatography-mass spectrometry at the EPA Office of Research and Development laboratory in Research Triangle Park, NC, USA. For the interlaboratory comparison, three commercial analytical laboratories were asked to follow Method 325B when analyzing samples of VOCs that were collected in field and laboratory settings for EPA studies. Overall results indicate that the selected VOCs collected diffusively on sorbent tubes generally were stable for 6 months or longer when samples were refrigerated. This suggests the specified maximum 30-day storage time of VOCs collected diffusively on Carbopack X passive samplers and analyzed using Method 325B might be able to be relaxed. Interlaboratory comparison results were in agreement for the challenge samples collected diffusively in an exposure chamber in the laboratory, with most measurements within ±25% of the theoretical concentration. Statistically significant differences among laboratories for ambient challenge samples were small, less than 1 part per billion by volume (ppbv). Results from all laboratories exhibited good precision and generally agreed well with each other.

  11. Sampling in health geography: reconciling geographical objectives and probabilistic methods. An example of a health survey in Vientiane (Lao PDR)

    PubMed Central

    Vallée, Julie; Souris, Marc; Fournet, Florence; Bochaton, Audrey; Mobillion, Virginie; Peyronnie, Karine; Salem, Gérard

    2007-01-01

    Background Geographical objectives and probabilistic methods are difficult to reconcile in a unique health survey. Probabilistic methods focus on individuals to provide estimates of a variable's prevalence with a certain precision, while geographical approaches emphasise the selection of specific areas to study interactions between spatial characteristics and health outcomes. A sample selected from a small number of specific areas creates statistical challenges: the observations are not independent at the local level, and this results in poor statistical validity at the global level. Therefore, it is difficult to construct a sample that is appropriate for both geographical and probability methods. Methods We used a two-stage selection procedure with a first non-random stage of selection of clusters. Instead of randomly selecting clusters, we deliberately chose a group of clusters, which as a whole would contain all the variation in health measures in the population. As there was no health information available before the survey, we selected a priori determinants that can influence the spatial homogeneity of the health characteristics. This method yields a distribution of variables in the sample that closely resembles that in the overall population, something that cannot be guaranteed with randomly-selected clusters, especially if the number of selected clusters is small. In this way, we were able to survey specific areas while minimising design effects and maximising statistical precision. Application We applied this strategy in a health survey carried out in Vientiane, Lao People's Democratic Republic. We selected well-known health determinants with unequal spatial distribution within the city: nationality and literacy. We deliberately selected a combination of clusters whose distribution of nationality and literacy is similar to the distribution in the general population. 
Conclusion This paper describes the conceptual reasoning behind the construction of the survey sample and shows that it can be advantageous to choose clusters using reasoned hypotheses, based on both probability and geographical approaches, in contrast to a conventional, random cluster selection strategy. PMID:17543100

  12. Sampling in health geography: reconciling geographical objectives and probabilistic methods. An example of a health survey in Vientiane (Lao PDR).

    PubMed

    Vallée, Julie; Souris, Marc; Fournet, Florence; Bochaton, Audrey; Mobillion, Virginie; Peyronnie, Karine; Salem, Gérard

    2007-06-01

    Geographical objectives and probabilistic methods are difficult to reconcile in a unique health survey. Probabilistic methods focus on individuals to provide estimates of a variable's prevalence with a certain precision, while geographical approaches emphasise the selection of specific areas to study interactions between spatial characteristics and health outcomes. A sample selected from a small number of specific areas creates statistical challenges: the observations are not independent at the local level, and this results in poor statistical validity at the global level. Therefore, it is difficult to construct a sample that is appropriate for both geographical and probability methods. We used a two-stage selection procedure with a first non-random stage of selection of clusters. Instead of randomly selecting clusters, we deliberately chose a group of clusters, which as a whole would contain all the variation in health measures in the population. As there was no health information available before the survey, we selected a priori determinants that can influence the spatial homogeneity of the health characteristics. This method yields a distribution of variables in the sample that closely resembles that in the overall population, something that cannot be guaranteed with randomly-selected clusters, especially if the number of selected clusters is small. In this way, we were able to survey specific areas while minimising design effects and maximising statistical precision. We applied this strategy in a health survey carried out in Vientiane, Lao People's Democratic Republic. We selected well-known health determinants with unequal spatial distribution within the city: nationality and literacy. We deliberately selected a combination of clusters whose distribution of nationality and literacy is similar to the distribution in the general population. 
This paper describes the conceptual reasoning behind the construction of the survey sample and shows that it can be advantageous to choose clusters using reasoned hypotheses, based on both probability and geographical approaches, in contrast to a conventional, random cluster selection strategy.
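
    The deliberate, non-random cluster choice described above, picking the combination of clusters whose pooled composition best matches the population, can be sketched as an exhaustive search (illustrative only; the cluster names, categories, and L1 distance criterion are assumptions):

    ```python
    from itertools import combinations

    def pick_clusters(clusters, pop_share, k):
        """Deliberately pick k clusters whose pooled composition is closest
        (L1 distance) to the population's, instead of sampling at random.
        clusters: dict name -> dict category -> count
        pop_share: dict category -> population proportion."""
        cats = sorted(pop_share)
        best, best_d = None, float('inf')
        for combo in combinations(clusters, k):
            total = sum(clusters[c][cat] for c in combo for cat in cats)
            # L1 distance between the pooled shares and the population shares
            d = sum(abs(sum(clusters[c][cat] for c in combo) / total
                        - pop_share[cat])
                    for cat in cats)
            if d < best_d:
                best, best_d = combo, d
        return best
    ```

    With few clusters the exhaustive search is cheap, and the chosen combination reproduces the population's distribution of the a priori determinants (here, e.g., literacy) in the way the survey design requires.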

  13. Eddy current nondestructive testing device for measuring variable characteristics of a sample utilizing Walsh functions

    DOEpatents

    Libby, Hugo L.; Hildebrand, Bernard P.

    1978-01-01

    An eddy current testing device for measuring variable characteristics of a sample generates a signal which varies with variations in such characteristics. A signal expander samples at least a portion of this generated signal and expands the sampled signal on a selected basis of square waves or Walsh functions to produce a plurality of signal components representative of the sampled signal. A network combines these components to provide a display of at least one of the characteristics of the sample.
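
    The Walsh-function expansion of a sampled signal can be sketched with a Sylvester-ordered Hadamard matrix (an illustration of the mathematical basis only, not the patented device's circuitry):

    ```python
    import numpy as np

    def walsh_matrix(n):
        """n x n matrix of +/-1 Walsh functions (Hadamard order), n = 2^k."""
        H = np.array([[1.0]])
        while H.shape[0] < n:
            H = np.block([[H, H], [H, -H]])
        return H

    def walsh_expand(signal):
        """Expand a sampled signal on the Walsh basis; returns the
        coefficients. Because H @ H = n * I, H @ coeffs reconstructs
        the signal exactly."""
        x = np.asarray(signal, float)
        H = walsh_matrix(len(x))
        return H @ x / len(x)
    ```

    Because Walsh functions take only the values +1 and -1, the expansion needs no multipliers, only sign flips and sums, which is what makes a square-wave basis attractive for test-instrument hardware.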

  14. Photoacoustic sample vessel and method of elevated pressure operation

    DOEpatents

    Autrey, Tom; Yonker, Clement R.

    2004-05-04

    An improved photoacoustic vessel and method of photoacoustic analysis. The photoacoustic sample vessel comprises an acoustic detector, an acoustic couplant, and an acoustic coupler having a chamber for holding the acoustic couplant and a sample. The acoustic couplant is selected from the group consisting of liquid, solid, and combinations thereof. Passing electromagnetic energy through the sample generates an acoustic signal within the sample, whereby the acoustic signal propagates through the sample to and through the acoustic couplant to the acoustic detector.

  15. Optical polarimetry and photometry of X-ray selected BL Lacertae objects

    NASA Technical Reports Server (NTRS)

    Jannuzi, Buell T.; Smith, Paul S.; Elston, Richard

    1993-01-01

    We present the data from 3 years of monitoring the optical polarization and apparent brightness of 37 X-ray-selected BL Lacertae objects. The monitored objects include a complete sample drawn from the Einstein Extended Medium Sensitivity Survey. We confirm the BL Lac identifications for 15 of these 22 objects. We include descriptions of the objects and samples in our monitoring program and of the existing complete samples of BL Lac objects, highly polarized quasars, optically violent variable quasars, and blazars.

  16. Studies of an x-ray selected sample of cataclysmic variables. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Silber, Andrew D.

    1986-01-01

    Just prior to the thesis research, an all-sky survey in hard x rays with the HEAO-1 satellite and further observations in the optical resulted in a catalog of about 700 x-ray sources with known optical counterparts. This sample includes 43 cataclysmic variables, which are binaries consisting of a detached white dwarf and a Roche-lobe-filling companion star. This thesis consists of studies of the x-ray selected sample of cataclysmic variables.

  17. The Earth Microbiome Project: Meeting report of the "1st EMP meeting on sample selection and acquisition" at Argonne National Laboratory October 6, 2010.

    PubMed

    Gilbert, Jack A; Meyer, Folker; Jansson, Janet; Gordon, Jeff; Pace, Norman; Tiedje, James; Ley, Ruth; Fierer, Noah; Field, Dawn; Kyrpides, Nikos; Glöckner, Frank-Oliver; Klenk, Hans-Peter; Wommack, K Eric; Glass, Elizabeth; Docherty, Kathryn; Gallery, Rachel; Stevens, Rick; Knight, Rob

    2010-12-25

    This report details the outcome of the first meeting of the Earth Microbiome Project to discuss sample selection and acquisition. The meeting, held at the Argonne National Laboratory on Wednesday, October 6th, 2010, focused on discussion of how to prioritize environmental samples for sequencing and metagenomic analysis as part of the global effort of the EMP to systematically determine the functional and phylogenetic diversity of microbial communities across the world.

  18. CALIFA: a diameter-selected sample for an integral field spectroscopy galaxy survey

    NASA Astrophysics Data System (ADS)

    Walcher, C. J.; Wisotzki, L.; Bekeraité, S.; Husemann, B.; Iglesias-Páramo, J.; Backsmann, N.; Barrera Ballesteros, J.; Catalán-Torrecilla, C.; Cortijo, C.; del Olmo, A.; Garcia Lorenzo, B.; Falcón-Barroso, J.; Jilkova, L.; Kalinova, V.; Mast, D.; Marino, R. A.; Méndez-Abreu, J.; Pasquali, A.; Sánchez, S. F.; Trager, S.; Zibetti, S.; Aguerri, J. A. L.; Alves, J.; Bland-Hawthorn, J.; Boselli, A.; Castillo Morales, A.; Cid Fernandes, R.; Flores, H.; Galbany, L.; Gallazzi, A.; García-Benito, R.; Gil de Paz, A.; González-Delgado, R. M.; Jahnke, K.; Jungwiert, B.; Kehrig, C.; Lyubenova, M.; Márquez Perez, I.; Masegosa, J.; Monreal Ibero, A.; Pérez, E.; Quirrenbach, A.; Rosales-Ortega, F. F.; Roth, M. M.; Sanchez-Blazquez, P.; Spekkens, K.; Tundo, E.; van de Ven, G.; Verheijen, M. A. W.; Vilchez, J. V.; Ziegler, B.

    2014-09-01

    We describe and discuss the selection procedure and statistical properties of the galaxy sample used by the Calar Alto Legacy Integral Field Area (CALIFA) survey, a public legacy survey of 600 galaxies using integral field spectroscopy. The CALIFA "mother sample" was selected from the Sloan Digital Sky Survey (SDSS) DR7 photometric catalogue to include all galaxies with an r-band isophotal major axis between 45'' and 79.2'' and with a redshift 0.005 < z < 0.03. The mother sample contains 939 objects, 600 of which will be observed in the course of the CALIFA survey. The selection of targets for observations is based solely on visibility and thus keeps the statistical properties of the mother sample. By comparison with a large set of SDSS galaxies, we find that the CALIFA sample is representative of galaxies over a luminosity range of -19 > Mr > -23.1 and over a stellar mass range between 10^9.7 and 10^11.4 M⊙. In particular, within these ranges, the diameter selection does not lead to any significant bias against - or in favour of - intrinsically large or small galaxies. Only below luminosities of Mr = -19 (or stellar masses <10^9.7 M⊙) is there a prevalence of galaxies with larger isophotal sizes, especially of nearly edge-on late-type galaxies, but such galaxies form <10% of the full sample. We estimate volume-corrected distribution functions in luminosities and sizes and show that these are statistically fully compatible with estimates from the full SDSS when accounting for large-scale structure. For full characterization of the sample, we also present a number of value-added quantities determined for the galaxies in the CALIFA sample. These include consistent multi-band photometry based on growth curve analyses; stellar masses; distances and quantities derived from these; morphological classifications; and an overview of available multi-wavelength photometric measurements.
We also explore different ways of characterizing the environments of CALIFA galaxies, finding that the sample covers environmental conditions from the field to genuine clusters. We finally consider the expected incidence of active galactic nuclei among CALIFA galaxies given the existing pre-CALIFA data, finding that the final observed CALIFA sample will contain approximately 30 Seyfert 2 galaxies. Based on observations collected at the Centro Astronómico Hispano Alemán (CAHA) at Calar Alto, operated jointly by the Max Planck Institute for Astronomy and the Instituto de Astrofísica de Andalucía (CSIC). Publicly released data products from CALIFA are made available on the webpage http://www.caha.es/CALIFA
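
The diameter-plus-redshift selection quoted above reduces to two simple cuts on a catalogue. The sketch below applies them to a toy record list; the field names (`isoA_r`, `z`) are invented, but the numerical limits (45'' < isophotal major axis < 79.2'', 0.005 < z < 0.03) are the ones stated in the abstract.

```python
def califa_like_cut(catalogue):
    """Keep galaxies passing the CALIFA-style diameter and redshift cuts.
    catalogue: list of dicts with hypothetical keys 'isoA_r' (arcsec)
    and 'z' (redshift)."""
    return [g for g in catalogue
            if 45.0 < g["isoA_r"] < 79.2 and 0.005 < g["z"] < 0.03]
```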

  19. Piecewise SALT sampling for estimating suspended sediment yields

    Treesearch

    Robert B. Thomas

    1989-01-01

    A probability sampling method called SALT (Selection At List Time) has been developed for collecting and summarizing data on delivery of suspended sediment in rivers. It is based on sampling and estimating yield using a suspended-sediment rating curve for high discharges and simple random sampling for low flows. The method gives unbiased estimates of total yield and...
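
The unbiasedness claimed for SALT rests on inverse-probability (Horvitz-Thompson) weighting: units selected with probability proportional to an auxiliary prediction (here, a rating-curve sediment estimate) contribute y_i / pi_i to the total. The sketch below uses Poisson sampling as the selection scheme; the data and the choice of Poisson sampling are illustrative assumptions, not the SALT algorithm itself.

```python
import random

def poisson_sample_total(y, aux, expected_n, rng):
    """Estimate the total of y from a Poisson sample with inclusion
    probabilities proportional to the auxiliary variable aux."""
    total_aux = sum(aux)
    pi = [min(1.0, expected_n * a / total_aux) for a in aux]
    estimate = 0.0
    for yi, p in zip(y, pi):
        if rng.random() < p:          # unit selected with probability p
            estimate += yi / p        # inverse-probability weight
    return estimate
```

Averaged over repeated draws, the estimator centres on the true total, which is the sense in which such schemes are unbiased.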

  20. Sampling methods for amphibians in streams in the Pacific Northwest.

    Treesearch

    R. Bruce Bury; Paul Stephen Corn

    1991-01-01

    Methods describing how to sample aquatic and semiaquatic amphibians in small streams and headwater habitats in the Pacific Northwest are presented. We developed a technique that samples 10-meter stretches of selected streams, which was adequate to detect presence or absence of amphibian species and provided sample sizes statistically sufficient to compare abundance of...

  1. SnagPRO: snag and tree sampling and analysis methods for wildlife

    Treesearch

    Lisa J. Bate; Michael J. Wisdom; Edward O. Garton; Shawn C. Clabough

    2008-01-01

    We describe sampling methods and provide software to accurately and efficiently estimate snag and tree densities at desired scales to meet a variety of research and management objectives. The methods optimize sampling effort by choosing a plot size appropriate for the specified forest conditions and sampling goals. Plot selection and data analyses are supported by...

  2. The development of a Martian atmospheric Sample collection canister

    NASA Astrophysics Data System (ADS)

    Kulczycki, E.; Galey, C.; Kennedy, B.; Budney, C.; Bame, D.; Van Schilfgaarde, R.; Aisen, N.; Townsend, J.; Younse, P.; Piacentine, J.

    The collection of an atmospheric sample from Mars would provide significant insight to the understanding of the elemental composition and sub-surface out-gassing rates of noble gases. A team of engineers at the Jet Propulsion Laboratory (JPL), California Institute of Technology have developed an atmospheric sample collection canister for Martian application. The engineering strategy has two basic elements: first, to collect two separately sealed 50 cubic centimeter unpressurized atmospheric samples with minimal sensing and actuation in a self contained pressure vessel; and second, to package this atmospheric sample canister in such a way that it can be easily integrated into the orbiting sample capsule for collection and return to Earth. Sample collection and integrity are demonstrated by emulating the atmospheric collection portion of the Mars Sample Return mission on a compressed timeline. The test results were achieved by varying the pressure inside of a thermal vacuum chamber while opening and closing the valve on the sample canister at Mars ambient pressure. A commercial off-the-shelf medical grade micro-valve is utilized in the first iteration of this design to enable rapid testing of the system. The valve has been independently leak tested at JPL to quantify and separate the leak rates associated with the canister. The results are factored into an overall system design that quantifies mass, power, and sensing requirements for a Martian atmospheric Sample Collection (MASC) canister as outlined in the Mars Sample Return mission profile. Qualitative results include the selection of materials to minimize sample contamination, preliminary science requirements, priorities in sample composition, flight valve selection criteria, a storyboard from sample collection to loading in the orbiting sample capsule, and contributions to maintaining "Earth-clean" exterior surfaces on the orbiting sample capsule.

  3. Sample Collection Information Document for Chemical & Radiochemical Analytes – Companion to Selected Analytical Methods for Environmental Remediation and Recovery (SAM) 2012

    EPA Pesticide Factsheets

    The Sample Collection Information Document is intended to provide sampling information to be used during site assessment, remediation and clearance activities following a chemical or radiological contamination incident.

  4. Multiplex Real-Time PCR for Detection of Staphylococcus aureus, mecA and Panton-Valentine Leukocidin (PVL) Genes from Selective Enrichments from Animals and Retail Meat

    PubMed Central

    Velasco, Valeria; Sherwood, Julie S.; Rojas-García, Pedro P.; Logue, Catherine M.

    2014-01-01

    The aim of this study was to compare a real-time PCR assay with a conventional culture/PCR method to detect S. aureus, mecA and Panton-Valentine Leukocidin (PVL) genes in animals and retail meat, using a two-step selective enrichment protocol. A total of 234 samples were examined (77 animal nasal swabs, 112 retail raw meat, and 45 deli meat). The multiplex real-time PCR targeted the genes: nuc (identification of S. aureus), mecA (associated with methicillin resistance) and PVL (virulence factor), and the primary and secondary enrichment samples were assessed. The conventional culture/PCR method included the two-step selective enrichment, selective plating, biochemical testing, and multiplex PCR for confirmation. The conventional culture/PCR method recovered 95/234 positive S. aureus samples. Application of real-time PCR on samples following primary and secondary enrichment detected S. aureus in 111/234 and 120/234 samples, respectively. For detection of S. aureus, the kappa statistic was 0.68–0.88 (from substantial to almost perfect agreement) and 0.29–0.77 (from fair to substantial agreement) for primary and secondary enrichments, using real-time PCR. For detection of the mecA gene, the kappa statistic was 0–0.49 (from no agreement beyond that expected by chance to moderate agreement) for primary and secondary enrichment samples. Two pork samples were mecA gene positive by all methods. The real-time PCR assay detected the mecA gene in samples that were negative for S. aureus, but positive for Staphylococcus spp. The PVL gene was not detected in any sample by the conventional culture/PCR method or the real-time PCR assay. Among S. aureus isolated by the conventional culture/PCR method, the sequence type ST398 and multi-drug resistant strains were found in animals and raw meat samples. The real-time PCR assay may be recommended as a rapid method for detection of S. aureus and the mecA gene, with further confirmation of methicillin-resistant S. aureus (MRSA) using the standard culture method. PMID:24849624
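
The kappa statistic reported above measures agreement between two methods beyond what chance would produce. A minimal sketch of Cohen's kappa from a 2x2 agreement table follows; the counts used in the test are invented, not the study's data.

```python
def cohens_kappa(both_pos, method1_only, method2_only, both_neg):
    """Cohen's kappa from a 2x2 table of positive/negative calls
    by two methods on the same samples."""
    n = both_pos + method1_only + method2_only + both_neg
    observed = (both_pos + both_neg) / n          # observed agreement
    p1 = (both_pos + method1_only) / n            # method 1 positive rate
    p2 = (both_pos + method2_only) / n            # method 2 positive rate
    expected = p1 * p2 + (1 - p1) * (1 - p2)      # chance agreement
    return (observed - expected) / (1 - expected)
```

Kappa is 1 for perfect agreement and 0 when agreement is no better than chance, matching the qualitative bands ("substantial", "moderate") quoted in the abstract.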

  5. Multiplex real-time PCR for detection of Staphylococcus aureus, mecA and Panton-Valentine Leukocidin (PVL) genes from selective enrichments from animals and retail meat.

    PubMed

    Velasco, Valeria; Sherwood, Julie S; Rojas-García, Pedro P; Logue, Catherine M

    2014-01-01

    The aim of this study was to compare a real-time PCR assay with a conventional culture/PCR method to detect S. aureus, mecA and Panton-Valentine Leukocidin (PVL) genes in animals and retail meat, using a two-step selective enrichment protocol. A total of 234 samples were examined (77 animal nasal swabs, 112 retail raw meat, and 45 deli meat). The multiplex real-time PCR targeted the genes: nuc (identification of S. aureus), mecA (associated with methicillin resistance) and PVL (virulence factor), and the primary and secondary enrichment samples were assessed. The conventional culture/PCR method included the two-step selective enrichment, selective plating, biochemical testing, and multiplex PCR for confirmation. The conventional culture/PCR method recovered 95/234 positive S. aureus samples. Application of real-time PCR on samples following primary and secondary enrichment detected S. aureus in 111/234 and 120/234 samples, respectively. For detection of S. aureus, the kappa statistic was 0.68-0.88 (from substantial to almost perfect agreement) and 0.29-0.77 (from fair to substantial agreement) for primary and secondary enrichments, using real-time PCR. For detection of the mecA gene, the kappa statistic was 0-0.49 (from no agreement beyond that expected by chance to moderate agreement) for primary and secondary enrichment samples. Two pork samples were mecA gene positive by all methods. The real-time PCR assay detected the mecA gene in samples that were negative for S. aureus, but positive for Staphylococcus spp. The PVL gene was not detected in any sample by the conventional culture/PCR method or the real-time PCR assay. Among S. aureus isolated by the conventional culture/PCR method, the sequence type ST398 and multi-drug resistant strains were found in animals and raw meat samples. The real-time PCR assay may be recommended as a rapid method for detection of S. aureus and the mecA gene, with further confirmation of methicillin-resistant S. aureus (MRSA) using the standard culture method.

  6. Training set optimization under population structure in genomic selection

    USDA-ARS?s Scientific Manuscript database

    The optimization of the training set (TRS) in genomic selection (GS) has received much interest in both animal and plant breeding, because it is critical to the accuracy of the prediction models. In this study, five different TRS sampling algorithms, stratified sampling, mean of the Coefficient of D...

  7. Does Marital Status Influence the Parenting Styles Employed by Parents?

    ERIC Educational Resources Information Center

    Ashiono, Benard Litali; Mwoma, Teresa B.

    2015-01-01

    The current study sought to establish whether parents' marital status influences their use of specific parenting styles in Kisauni District, Kenya. A correlational research design was employed to carry out this study. Stratified sampling technique was used to select preschools while purposive sampling technique was used to select preschool…

  8. 10 CFR 429.11 - General sampling requirements for selecting units to be tested.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... tested. 429.11 Section 429.11 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION CERTIFICATION, COMPLIANCE, AND ENFORCEMENT FOR CONSUMER PRODUCTS AND COMMERCIAL AND INDUSTRIAL EQUIPMENT Certification § 429.11 General sampling requirements for selecting units to be tested. (a) When testing of covered products or...

  9. 9 CFR 592.450 - Procedures for selecting appeal samples.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Procedures for selecting appeal samples. 592.450 Section 592.450 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE EGG PRODUCTS INSPECTION VOLUNTARY INSPECTION OF EGG PRODUCTS Appeals § 592.450...

  10. 40 CFR 205.57-2 - Test vehicle sample selection.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Test vehicle sample selection. 205.57-2 Section 205.57-2 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) NOISE ABATEMENT PROGRAMS TRANSPORTATION EQUIPMENT NOISE EMISSION CONTROLS Medium and Heavy Trucks § 205.57-2 Test...

  11. Development of a Simultaneous Extraction and Cleanup Method for Pyrethroid Pesticides from Indoor House Dust Samples

    EPA Science Inventory

    An efficient and reliable analytical method was developed for the sensitive and selective quantification of pyrethroid pesticides (PYRs) in house dust samples. The method is based on selective pressurized liquid extraction (SPLE) of the dust-bound PYRs into dichloromethane (DCM) wi...

  12. How Expert Special Educators Effectively Negotiate Their Job Demands

    ERIC Educational Resources Information Center

    Ortogero, Shawna P.; Black, Rhonda S.; Cook, Bryan G.

    2017-01-01

    This qualitative case study explored how three expert secondary special education teachers in Hawaii successfully negotiated their job demands. Purposeful sampling was used to select one secondary school on the Leeward coast of Oahu. We used reputational-case sampling to select participants that fit Dreyfus and Dreyfus' (1980) expert theoretical…

  13. MultiGeMS: detection of SNVs from multiple samples using model selection on high-throughput sequencing data.

    PubMed

    Murillo, Gabriel H; You, Na; Su, Xiaoquan; Cui, Wei; Reilly, Muredach P; Li, Mingyao; Ning, Kang; Cui, Xinping

    2016-05-15

    Single nucleotide variant (SNV) detection procedures are being utilized as never before to analyze the recent abundance of high-throughput DNA sequencing data, both on single and multiple sample datasets. Building on previously published work with the single sample SNV caller genotype model selection (GeMS), a multiple sample version of GeMS (MultiGeMS) is introduced. Unlike other popular multiple sample SNV callers, the MultiGeMS statistical model accounts for enzymatic substitution sequencing errors. It also addresses the multiple testing problem endemic to multiple sample SNV calling and utilizes high performance computing (HPC) techniques. A simulation study demonstrates that MultiGeMS ranks highest in precision among a selection of popular multiple sample SNV callers, while showing exceptional recall in calling common SNVs. Further, both simulation studies and real data analyses indicate that MultiGeMS is robust to low-quality data. We also demonstrate that accounting for enzymatic substitution sequencing errors not only improves SNV call precision at low mapping quality regions, but also improves recall at reference allele-dominated sites with high mapping quality. The MultiGeMS package can be downloaded from https://github.com/cui-lab/multigems. Contact: xinping.cui@ucr.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
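
Precision and recall, the two metrics the MultiGeMS evaluation reports, can be computed from a caller's variant set against a truth set. A minimal sketch follows; the positions in the test are invented, purely for illustration.

```python
def precision_recall(called, truth):
    """Precision and recall for a variant caller.
    called, truth: sets of variant positions (or any hashable IDs)."""
    tp = len(called & truth)                       # true positives
    precision = tp / len(called) if called else 0.0
    recall = tp / len(truth) if truth else 0.0
    return precision, recall
```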

  14. Analysis of nutrients, selected inorganic constituents, and trace elements in water from Illinois community-supply wells, 1984-91

    USGS Publications Warehouse

    Warner, Kelly L.

    2000-01-01

    The lower Illinois River Basin (LIRB) study unit is part of the National Water-Quality Assessment program that includes studies of most major aquifer systems in the United States. Retrospective water-quality data from community-supply wells in the LIRB and in the rest of Illinois are grouped by aquifer and depth interval. Concentrations of selected chemical constituents in water samples from community-supply wells within the LIRB vary with aquifer and depth of well. Ranked data for 16 selected trace elements and nutrients are compared by aquifer, depth interval, and between the LIRB and the rest of Illinois using nonparametric statistical analyses. For all wells, median concentrations of nitrate and nitrite (as Nitrogen) are highest in water samples from the Quaternary aquifer at well depths less than 100 ft; ammonia concentrations (as Nitrogen), however, are highest in samples from well depths greater than 200 ft. Chloride and sulfate concentrations are higher in samples from the older bedrock aquifers. Arsenic, lead, sulfate, and zinc concentrations are appreciably different between samples from the LIRB and samples from the rest of Illinois for ground water from the Quaternary aquifer. Arsenic concentration is highest in the deep Quaternary aquifer. Chromium, cyanide, lead, and mercury are not frequently detected in water samples from community-supply wells in Illinois.

  15. A new approach based on off-line coupling of high-performance liquid chromatography with gas chromatography-mass spectrometry to determine acrylamide in coffee brew.

    PubMed

    Blanch, Gracia Patricia; Morales, Francisco José; Moreno, Fernando de la Peña; del Castillo, María Luisa Ruiz

    2013-01-01

    A new method based on off-line coupling of LC with GC in replacement of conventional sample preparation techniques is proposed to analyze acrylamide in coffee brews. The method involves the preseparation of the sample by LC, the collection of the selected fraction, its concentration under nitrogen, and subsequent analysis by GC coupled with MS. The composition of the LC mobile phase and the flow rate were studied to select those conditions that allowed separation of acrylamide without coeluting compounds. Under the conditions selected recoveries close to 100% were achieved while LODs and LOQs equal to 5 and 10 μg/L for acrylamide in brewed coffee were obtained. The method developed enabled the reliable detection of acrylamide in spiked coffee beverage samples without further clean-up steps or sample manipulation. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. IRAS variables as galactic structure tracers - Classification of the bright variables

    NASA Technical Reports Server (NTRS)

    Allen, L. E.; Kleinmann, S. G.; Weinberg, M. D.

    1993-01-01

    The characteristics of the 'bright infrared variables' (BIRVs), a sample consisting of the 300 brightest stars in the IRAS Point Source Catalog with IRAS variability index VAR of 98 or greater, are investigated with the purpose of establishing which of the IRAS variables are AGB stars (e.g., oxygen-rich Miras and carbon stars, as was assumed by Weinberg (1992)). Results of the analysis of optical, infrared, and microwave spectroscopy of these stars indicate that, out of 88 stars in the BIRV sample identified with cataloged variables, 86 can be classified as Miras. Results of a similar analysis performed for a color-selected sample of stars, using the color limits employed by Habing (1988) to select AGB stars, showed that, out of the 52 percent of stars that could be classified, 38 percent are non-AGB stars, including H II regions, planetary nebulae, supergiants, and young stellar objects, indicating that studies using color-selected samples are subject to misinterpretation.

  17. Selected quality assurance data for water samples collected by the US Geological Survey, Idaho National Engineering Laboratory, Idaho, 1980 to 1988

    USGS Publications Warehouse

    Wegner, S.J.

    1989-01-01

    Multiple water samples from 115 wells and 3 surface water sites were collected between 1980 and 1988 for the ongoing quality assurance program at the Idaho National Engineering Laboratory. The reported results from the six laboratories involved were analyzed for agreement using descriptive statistics. The constituents and properties included: tritium, plutonium-238, plutonium-239, -240 (undivided), strontium-90, americium-241, cesium-137, total dissolved chromium, selected dissolved trace metals, sodium, chloride, nitrate, selected purgeable organic compounds, and specific conductance. Agreement could not be calculated for purgeable organic compounds, trace metals, some nitrates and blank sample analyses because analytical uncertainties were not consistently reported. However, differences between results for most of these data were calculated. The blank samples were not analyzed for differences. The laboratory results analyzed using descriptive statistics showed a median agreement between all useable data pairs of 95%. (USGS)

  18. Children Prefer Diverse Samples for Inductive Reasoning in the Social Domain.

    PubMed

    Noyes, Alexander; Christie, Stella

    2016-07-01

    Not all samples of evidence are equally conclusive: Diverse evidence is more representative than narrow evidence. Prior research showed that children did not use sample diversity in evidence selection tasks, indiscriminately choosing diverse or narrow sets (tiger-mouse; tiger-lion) to learn about animals. This failure is not due to a general deficit of inductive reasoning, but reflects children's belief about the category and property at test. Five- to 7-year-olds' inductive reasoning (n = 65) was tested in two categories (animals, people) and properties (toy preference, biological property). As stated earlier, children ignored diverse evidence when learning about animals' biological properties. When learning about people's toy preferences, however, children selected the diverse samples, providing the most compelling evidence to date of spontaneous selection of diverse evidence. © 2016 The Authors. Child Development © 2016 Society for Research in Child Development, Inc.

  19. High-Grading Lunar Samples for Return to Earth

    NASA Technical Reports Server (NTRS)

    Allen, Carlton; Sellar, Glenn; Nunez, Jorge; Winterhalter, Daniel; Farmer, Jack

    2009-01-01

    Astronauts on long-duration lunar missions will need the capability to "high-grade" their samples to select the highest-value samples for transport to Earth and to leave others on the Moon. We are supporting studies to define the "necessary and sufficient" measurements and techniques for high-grading samples at a lunar outpost. A glovebox, dedicated to testing instruments and techniques for high-grading samples, is in operation at the JSC Lunar Experiment Laboratory.

  20. Capturing heterogeneity: The role of a study area's extent for estimating mean throughfall

    NASA Astrophysics Data System (ADS)

    Zimmermann, Alexander; Voss, Sebastian; Metzger, Johanna Clara; Hildebrandt, Anke; Zimmermann, Beate

    2016-11-01

    The selection of an appropriate spatial extent of a sampling plot is one among several important decisions involved in planning a throughfall sampling scheme. In fact, the choice of the extent may determine whether or not a study can adequately characterize the hydrological fluxes of the studied ecosystem. Previous attempts to optimize throughfall sampling schemes focused on the selection of an appropriate sample size, support, and sampling design, while comparatively little attention has been given to the role of the extent. In this contribution, we investigated the influence of the extent on the representativeness of mean throughfall estimates for three forest ecosystems of varying stand structure. Our study is based on virtual sampling of simulated throughfall fields. We derived these fields from throughfall data sampled in a simply structured forest (young tropical forest) and two heterogeneous forests (old tropical forest, unmanaged mixed European beech forest). We then sampled the simulated throughfall fields with three common extents and various sample sizes for a range of events and for accumulated data. Our findings suggest that the size of the study area should be carefully adapted to the complexity of the system under study and to the required temporal resolution of the throughfall data (i.e. event-based versus accumulated). Generally, event-based sampling in complex structured forests (conditions that favor comparatively long autocorrelations in throughfall) requires the largest extents. For event-based sampling, the choice of an appropriate extent can be as important as using an adequate sample size.
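
The "virtual sampling" described above amounts to drawing collector positions at random within a plot of a given extent inside a larger simulated field, then estimating the mean. The sketch below uses a synthetic field rather than a geostatistical simulation; the field, extents, and sample sizes are all illustrative assumptions.

```python
import random

def virtual_sample_mean(field, extent, n, rng):
    """Estimate the field mean from n random point samples drawn within
    the top-left extent x extent window of a 2D field (list of rows)."""
    vals = [field[rng.randrange(extent)][rng.randrange(extent)]
            for _ in range(n)]
    return sum(vals) / n
```

Repeating this for several extents and comparing the estimates to the full-field mean reproduces the paper's point: a too-small extent cannot capture heterogeneity at scales larger than the plot, however many collectors are used.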

  1. Molecularly imprinted polymer for selective extraction of malachite green from seawater and seafood coupled with high-performance liquid chromatographic determination.

    PubMed

    Lian, Ziru; Wang, Jiangtao

    2012-12-01

    In this paper, a highly selective sample cleanup procedure combining molecular imprinting technique (MIT) and solid-phase extraction (SPE) was developed for the isolation of malachite green in seawater and seafood samples. The molecularly imprinted polymer (MIP) was prepared using malachite green as the template molecule, methacrylic acid as the functional monomer and ethylene glycol dimethacrylate as the cross-linking monomer. The imprinted polymer and non-imprinted polymer were characterized by scanning electron microscope and static adsorption experiments. The MIP showed a high adsorption capacity and was used as selective sorbent for the SPE of malachite green. An off-line molecularly imprinted solid-phase extraction (MISPE) method followed by high-performance liquid chromatography with diode-array detection for the analysis of malachite green in seawater and seafood samples was also established. Finally, five samples were analyzed. The results showed that malachite green concentration in one seawater sample was 1.30 μg L⁻¹ and the RSD (n=3) was 4.15%. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.

  2. Method and system for providing precise multi-function modulation

    NASA Technical Reports Server (NTRS)

    Davarian, Faramaz (Inventor); Sumida, Joe T. (Inventor)

    1989-01-01

    A method and system is disclosed which provides precise multi-function digitally implementable modulation for a communication system. The invention provides a modulation signal for a communication system in response to an input signal from a data source. A digitized time response is generated from samples of a time domain representation of a spectrum profile of a selected modulation scheme. The invention generates and stores coefficients for each input symbol in accordance with the selected modulation scheme. The output signal is provided by a plurality of samples, each sample being generated by summing the products of a predetermined number of the coefficients and a predetermined number of the samples of the digitized time response. In a specific illustrative implementation, the samples of the output signals are converted to analog signals, filtered and used to modulate a carrier in a conventional manner. The invention is versatile in that it allows for the storage of the digitized time responses and corresponding coefficient lookup table of a number of modulation schemes, any of which may then be selected for use in accordance with the teachings of the invention.
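
The scheme described, where each output sample is a sum of products of stored coefficients with samples of a digitized time response, is in essence overlap-add pulse shaping: each symbol scales a stored copy of the pulse table, shifted by the symbol period. The pulse table and symbol values below are invented for illustration; this is a sketch of the general technique, not the patent's specific implementation.

```python
def modulate(symbols, pulse, samples_per_symbol):
    """Overlap-add each symbol's scaled copy of the stored pulse samples
    (discrete convolution of the upsampled symbol stream with the pulse)."""
    out = [0.0] * (len(symbols) * samples_per_symbol + len(pulse) - 1)
    for k, s in enumerate(symbols):
        start = k * samples_per_symbol
        for i, p in enumerate(pulse):
            out[start + i] += s * p   # coefficient times stored response sample
    return out
```

Swapping in a different stored pulse table changes the modulation's spectrum profile without changing this loop, which is the versatility the record emphasizes.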

  3. Estimating Sampling Selection Bias in Human Genetics: A Phenomenological Approach

    PubMed Central

    Risso, Davide; Taglioli, Luca; De Iasio, Sergio; Gueresi, Paola; Alfani, Guido; Nelli, Sergio; Rossi, Paolo; Paoli, Giorgio; Tofanelli, Sergio

    2015-01-01

    This research is the first empirical attempt to calculate the various components of the hidden bias associated with the sampling strategies routinely-used in human genetics, with special reference to surname-based strategies. We reconstructed surname distributions of 26 Italian communities with different demographic features across the last six centuries (years 1447–2001). The degree of overlapping between "reference founding core" distributions and the distributions obtained from sampling the present day communities by probabilistic and selective methods was quantified under different conditions and models. When taking into account only one individual per surname (low kinship model), the average discrepancy was 59.5%, with a peak of 84% by random sampling. When multiple individuals per surname were considered (high kinship model), the discrepancy decreased by 8–30% at the cost of a larger variance. Criteria aimed at maximizing locally-spread patrilineages and long-term residency appeared to be affected by recent gene flows much more than expected. Selection of the more frequent family names following low kinship criteria proved to be a suitable approach only for historically stable communities. In any other case true random sampling, despite its high variance, did not return more biased estimates than other selective methods. Our results indicate that the sampling of individuals bearing historically documented surnames (founders' method) should be applied, especially when studying the male-specific genome, to prevent an over-stratification of ancient and recent genetic components that heavily biases inferences and statistics. PMID:26452043
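
    The discrepancy the authors quantify is the non-overlap between a reference founders' surname distribution and a sampled one. A toy illustration of that comparison under the low-kinship model (one individual per surname), with made-up surname counts; the overlap measure here is a simple shared-mass statistic chosen for illustration, not necessarily the authors' exact metric:

```python
def overlap(ref, sampled):
    """Percent overlap between two surname frequency distributions,
    computed as the shared probability mass; discrepancy = 100 - overlap."""
    total_ref = sum(ref.values())
    total_s = sum(sampled.values())
    shared = 0.0
    for name in set(ref) | set(sampled):
        shared += min(ref.get(name, 0) / total_ref,
                      sampled.get(name, 0) / total_s)
    return 100.0 * shared

# Hypothetical "reference founding core" surname counts
reference = {"Rossi": 40, "Bianchi": 30, "Ferrari": 20, "Russo": 10}
# Low-kinship model: one individual per surname flattens the distribution
low_kinship = {name: 1 for name in reference}
disc = 100.0 - overlap(reference, low_kinship)  # discrepancy in percent
```

    Even with every founder surname represented, the one-per-surname rule distorts the frequency structure, which is the kind of hidden bias the study measures.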

  4. Estimating Sampling Selection Bias in Human Genetics: A Phenomenological Approach.

    PubMed

    Risso, Davide; Taglioli, Luca; De Iasio, Sergio; Gueresi, Paola; Alfani, Guido; Nelli, Sergio; Rossi, Paolo; Paoli, Giorgio; Tofanelli, Sergio

    2015-01-01

    This research is the first empirical attempt to calculate the various components of the hidden bias associated with the sampling strategies routinely-used in human genetics, with special reference to surname-based strategies. We reconstructed surname distributions of 26 Italian communities with different demographic features across the last six centuries (years 1447-2001). The degree of overlapping between "reference founding core" distributions and the distributions obtained from sampling the present day communities by probabilistic and selective methods was quantified under different conditions and models. When taking into account only one individual per surname (low kinship model), the average discrepancy was 59.5%, with a peak of 84% by random sampling. When multiple individuals per surname were considered (high kinship model), the discrepancy decreased by 8-30% at the cost of a larger variance. Criteria aimed at maximizing locally-spread patrilineages and long-term residency appeared to be affected by recent gene flows much more than expected. Selection of the more frequent family names following low kinship criteria proved to be a suitable approach only for historically stable communities. In any other case true random sampling, despite its high variance, did not return more biased estimates than other selective methods. Our results indicate that the sampling of individuals bearing historically documented surnames (founders' method) should be applied, especially when studying the male-specific genome, to prevent an over-stratification of ancient and recent genetic components that heavily biases inferences and statistics.

  5. Star formation rates in isolated galaxies selected from the Two-Micron All-Sky Survey

    NASA Astrophysics Data System (ADS)

    Melnyk, O.; Karachentseva, V.; Karachentsev, I.

    2015-08-01

    We have considered the star formation properties of 1616 isolated galaxies from the 2MASS XSC (Extended Source Catalog) selected sample (2MIG) with the far-ultraviolet GALEX magnitudes. This sample was then compared with corresponding properties of isolated galaxies from the Local Orphan Galaxies (LOG) catalogue and paired galaxies. We found that different selection algorithms define different populations of isolated galaxies. The population of the LOG catalogue, selected from non-clustered galaxies in the Local Supercluster volume, mostly consists of low-mass spiral and late-type galaxies. The specific star formation rate (SSFR) upper limit in isolated and paired galaxies does not exceed the value of ˜dex(-9.4). This is probably common for galaxies of differing activity and environment (at least at z < 0.06). The fraction of quenched galaxies is nearly twice as high in the paired galaxy sample as in the 2MIG isolated galaxy sample. From the behaviour of the (S)SFR versus M* relations we deduced that the characteristic value influencing evolutionary processes is the galaxy mass. However, the environmental influence is notable: paired massive galaxies with logM* > 11.5 have higher (S)SFR than isolated galaxies. Our results suggest that the environment helps to trigger star formation in the highest mass galaxies. We found that the fraction of AGN in the paired sample is only a little higher than in our isolated galaxy sample. We assume that the AGN phenomenon is probably governed by secular galaxy evolution.

  6. Water-Quality Data for Selected National Park Units within the Southern Colorado Plateau Network, Arizona, Utah, Colorado, and New Mexico, Water Years 2005 and 2006

    USGS Publications Warehouse

    Macy, Jamie P.; Monroe, Stephen A.

    2006-01-01

    The National Park Service initiated a Level 1 Water-Quality Inventory program to provide water-quality data to park managers so that informed natural-resource management decisions could be made. Level 1 water-quality data were collected by the U.S. Geological Survey Arizona Water Science Center at 57 sites in 13 National Park units in the Southern Colorado Plateau Inventory and Monitoring network in water years 2005 and 2006. These data describe the current water quality at selected sites within the park units and provide information for monitoring future trends. Water samples were collected three times at each site; site types included wells, springs, seeps, tinajas, rivers, a lake, and an irrigation ditch. Field measurements at each site included pH, specific conductance, temperature, barometric pressure, dissolved oxygen, alkalinity, turbidity, and discharge rates where applicable. Water samples were sent to the U.S. Geological Survey National Water Quality Laboratory and analyzed for major ions, trace elements, and nutrients. The National Water Quality Laboratory also analyzed selected samples for mercury and petroleum hydrocarbons. Additional samples at selected sites were collected and analyzed for cyanide, radiochemistry, and suspended sediment by U.S. Geological Survey contract laboratories. Fecal-indicator bacteria (Escherichia coli) were sampled at selected sites as another indicator of water quality. Quality control for this study was achieved through proper training of field personnel, use of standard U.S. Geological Survey field and laboratory protocols, collection of sample blanks and replicates, and a thorough review of the water-quality analyses. Measured field pH ranged from 6.0 to 8.8, within the normal range for springs and rivers, at most sites. Concentrations of dissolved solids ranged from 48 to 8,680 mg/L, and the majority of samples had concentrations of dissolved solids below 900 mg/L. Trace-element concentrations at most sites were at or near the laboratory reporting levels. The highest overall trace-element concentrations were found at U.S. Highway 160 Spring near the park entrance to Mesa Verde National Park. Concentrations of uranium in samples at all sites ranged from below the detection limit to 55.7 μg/L. Water samples from selected sites were analyzed for total petroleum hydrocarbons, and concentrations of total petroleum hydrocarbons were at or above the laboratory detection limit in samples at six National Park units. Ten sites were sampled for Escherichia coli, and positive counts were found at 9 of the 10 sites; the highest colony counts were found at Chinle Creek at Chinle, AZ, in Canyon de Chelly National Monument. Measured concentrations of dissolved ammonia, nitrite, and nitrate were at or near laboratory reporting levels at most sites; nitrate concentrations ranged from below the reporting limit (0.047 mg/L) to 9.77 mg/L. Samples analyzed for mercury had concentrations at or below the laboratory reporting level. Concentrations of cyanide were less than the laboratory reporting level for all samples except those from two sites, Spruce Tree House Spring in Mesa Verde National Park and Pine Tree Canyon Tinaja in Canyon de Chelly National Monument, which had average concentrations of 0.042 and 0.011 μg/L, respectively. Gross alpha/beta radioactivity counts were below the U.S. Environmental Protection Agency maximum contaminant level except for samples from Casa Chiquita Well Middle at Chaco Culture National Historical Park, which averaged 35 pCi/L. Suspended-sediment concentrations were variable and ranged from 10 to 150,000 mg/L.

  7. Selective Laser Melting of Metal Powder of Steel 316L

    NASA Astrophysics Data System (ADS)

    Smelov, V. G.; Sotov, A. V.; Agapovichev, A. V.; Tomilina, T. M.

    2016-08-01

    In this article, the results of an experimental study of the structure and mechanical properties of materials obtained by selective laser melting (SLM) of 316L steel metal powder are presented. Before the samples were built, the morphology of the surface of the powder particles was studied and particle-size analysis was carried out as part of incoming inspection. In addition, 3D X-ray quality control of the as-built samples was carried out to detect hidden defects and to assess them qualitatively and quantitatively. To determine the strength characteristics of the samples synthesized by the SLM method, static tensile tests were conducted, and X-ray diffraction analysis was carried out to determine residual stresses in the sample material.

  8. Environmental Monitoring and Assessment Program Western Pilot Project - Information about selected fish and macroinvertebrates sampled from North Dakota perennial streams, 2000-2003

    USGS Publications Warehouse

    Vining, Kevin C.; Lundgren, Robert F.

    2008-01-01

    Sixty-five sampling sites, selected by a statistical design to represent lengths of perennial streams in North Dakota, were chosen to be sampled for fish and aquatic insects (macroinvertebrates) to establish unbiased baseline data. Channel catfish and common carp were the most abundant game and large fish species in the Cultivated Plains and Rangeland Plains, respectively. Blackflies were present in more than 50 percent of stream lengths sampled in the State; mayflies and caddisflies were present in more than 80 percent. Dragonflies were present in a greater percentage of stream lengths in the Rangeland Plains than in the Cultivated Plains.

  9. A complete hard X-ray selected sample of local, luminous AGNs

    NASA Astrophysics Data System (ADS)

    Burtscher, Leonard; Davies, Ric; Lin, Ming-yi; Orban de Xivry, Gilles; Rosario, David

    2016-08-01

    Choosing a very well defined sample is essential for studying the AGN phenomenon. Only the most luminous AGNs can be expected to require a coherent feeding mechanism to sustain their activity and since host galaxy properties and AGN activity are essentially uncorrelated, nuclear scales must be resolved in order to shed light on the feeding mechanisms of AGNs. For these reasons we are compiling a sample of the most powerful, local AGNs. In this talk we present our on-going programme to observe a complete volume limited sample of nearby active galaxies selected by their 14-195 keV luminosity, and outline its rationale for studying the mechanisms regulating gas inflow and outflow.

  10. The far-infrared properties of the CfA galaxy sample. I - The catalog

    NASA Technical Reports Server (NTRS)

    Thuan, T. X.; Sauvage, M.

    1992-01-01

    IRAS flux densities are presented for all galaxies in the Center for Astrophysics magnitude-limited sample (mB not greater than 14.5) detected in the IRAS Faint Source Survey (FSS), a total of 1544 galaxies. The detection rate in the FSS is slightly larger than in the PSC for the long-wavelength 60- and 100-micron bands, but improves by a factor of about 3 or more for the short wavelength 12- and 25-micron bands. This optically selected sample consists of galaxies which are, on average, much less IR-active than galaxies in IR-selected samples. It possesses accurate and complete redshift, morphological, and magnitude information, along with observations at other wavelengths.

  11. SAMPLING OSCILLOSCOPE

    DOEpatents

    Sugarman, R.M.

    1960-08-30

    An oscilloscope is designed for displaying transient signal waveforms having random time and amplitude distributions. The oscilloscope is a sampling device that selects for display a portion of only those waveforms having a particular range of amplitudes. For this purpose a pulse-height analyzer is provided to screen the pulses. A variable voltage-level shifter and a time-scale ramp-voltage generator take the pulse height relative to the start of the waveform. The variable voltage shifter produces a voltage level raised one step for each sequential signal waveform to be sampled, resulting in an unsmeared record of input signal waveforms. Appropriate delay devices permit each sample waveform to pass its peak amplitude before the circuit selects it for display.
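
    The screening step the patent describes, selecting only waveforms whose peak amplitude falls in a chosen window, has a straightforward software analogue. A minimal sketch (the function and data are hypothetical, illustrating the selection logic rather than the hardware circuit):

```python
def select_waveforms(waveforms, lo, hi):
    """Software analogue of the pulse-height analyzer: keep only the
    waveforms whose peak amplitude lies within the selected window."""
    return [w for w in waveforms if lo <= max(w) <= hi]

# Hypothetical digitized pulses with peaks of 3, 9, and 5 units
pulses = [[0, 1, 3, 1], [0, 2, 9, 2], [0, 1, 5, 1]]
displayed = select_waveforms(pulses, 2, 6)  # the 9-unit pulse is rejected
```

    The hardware additionally offsets each accepted waveform by one voltage step so successive traces do not smear over one another on the display.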

  12. Size selective isocyanate aerosols personal air sampling using porous plastic foams

    NASA Astrophysics Data System (ADS)

    Khanh Huynh, Cong; Duc, Trinh Vu

    2009-02-01

    As part of a European project (SMT4-CT96-2137), several European institutions specialized in occupational hygiene (BGIA, HSL, IOM, INRS, IST, Ambiente e Lavoro) established a program of scientific collaboration to develop one or more prototypes of a European personal sampler for the simultaneous collection of three dust fractions: inhalable, thoracic and respirable. These samplers, based on existing sampling heads (IOM, GSP and cassettes), use polyurethane foam (PUF) plugs whose porosity serves both as the sampling substrate and as the particle-size separator. In this study, the authors present an original application of size-selective personal air sampling using chemically impregnated PUF to capture and derivatize isocyanate aerosols in industrial spray-painting shops.

  13. Watershed trend analysis and water-quality assessment using bottom-sediment cores from Cheney Reservoir, south-central Kansas

    USGS Publications Warehouse

    Pope, Larry M.

    1998-01-01

    An examination of Cheney Reservoir bottom sediment was conducted in August 1997 to describe long-term trends and document the occurrence of selected constituents at concentrations that may be detrimental to aquatic organisms. Average concentrations of total phosphorus in bottom-sediment cores ranged from 94 to 674 milligrams per kilogram and were statistically related to silt- and (or) clay-size particles. Results from selected sampling sites in Cheney Reservoir indicate an increasing trend in total phosphorus concentrations. This trend is probably of nonpoint-source origin and may be related to an increase in fertilizer sales in the area, which more than doubled between 1965 and 1996, and to livestock production. Few organochlorine compounds were detected in bottom-sediment samples from Cheney Reservoir. DDT, its degradation products DDD and DDE, and dieldrin had detectable concentrations in the seven samples that were analyzed. DDT and DDD were each detected in one sample at concentrations of 1.0 and 0.65 microgram per kilogram, respectively. By far, the most frequently detected organochlorine insecticide was DDE, which was detected in all seven samples, ranging in concentration from 0.31 to 1.3 micrograms per kilogram. A decreasing trend in DDE concentrations was evident in sediment-core data from one sampling site. Dieldrin was detected in one sample from each of two sampling sites at concentrations of 0.21 and 0.22 micrograms per kilogram. Polychlorinated biphenyls were not detected in any bottom-sediment sample analyzed. Selected organophosphate, chlorophenoxy-acid, triazine, and acetanilide pesticides were analyzed in 18 bottom-sediment samples. Of the 23 pesticides analyzed, only the acetanilide herbicide metolachlor was detected (in 22 percent of the samples). Seven bottom-sediment samples were analyzed for major metals and trace elements. 
The median and maximum concentrations of arsenic and chromium, the maximum concentration of copper, and all concentrations of nickel in the seven samples were in the range where adverse effects on aquatic organisms occasionally occur. No time trends in trace elements were discernible in the August 1997 data.

  14. Impacts of sampling design and estimation methods on nutrient leaching of intensively monitored forest plots in the Netherlands.

    PubMed

    de Vries, W; Wieggers, H J J; Brus, D J

    2010-08-05

    Element fluxes through forest ecosystems are generally based on measurements of concentrations in soil solution at regular time intervals at plot locations sampled in a regular grid. Here we present spatially averaged annual element leaching fluxes in three Dutch forest monitoring plots using a new sampling strategy in which both sampling locations and sampling times are selected by probability sampling. Locations were selected by stratified random sampling with compact geographical blocks of equal surface area as strata. In each sampling round, six composite soil solution samples were collected, consisting of five aliquots, one per stratum. The plot-mean concentration was estimated by linear regression, so that the bias due to one or more strata being not represented in the composite samples is eliminated. The sampling times were selected in such a way that the cumulative precipitation surplus of the time interval between two consecutive sampling times was constant, using an estimated precipitation surplus averaged over the past 30 years. The spatially averaged annual leaching flux was estimated by using the modeled daily water flux as an ancillary variable. An important advantage of the new method is that the uncertainty in the estimated annual leaching fluxes due to spatial and temporal variation and resulting sampling errors can be quantified. Results of this new method were compared with the reference approach in which daily leaching fluxes were calculated by multiplying daily interpolated element concentrations with daily water fluxes and then aggregated to a year. Results show that the annual fluxes calculated with the reference method for the period 2003-2005, including all plots, elements and depths, lie within the range of the average ±2 times the standard error of the new method in only 53% of the cases.
Despite the differences in results, both methods indicate comparable N retention and strong Al mobilization in all plots, with Al leaching being nearly equal to the leaching of SO₄ and NO₃ with fluxes expressed in mol_c ha⁻¹ yr⁻¹. This illustrates that Al release, which is the clearest signal of soil acidification, is mainly due to the external input of SO₄ and NO₃.
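
    The time-selection rule described above, choosing sampling dates so that the cumulative precipitation surplus between consecutive samples is constant, can be sketched in a few lines. This is an illustrative implementation of the stated rule, not the authors' code; the example data are hypothetical:

```python
def sampling_times(daily_surplus, n_samples):
    """Pick sampling days so the cumulative precipitation surplus
    between consecutive sampling times is (approximately) constant."""
    total = sum(daily_surplus)
    step = total / n_samples  # equal surplus per sampling interval
    times, cum, target = [], 0.0, step
    for day, surplus in enumerate(daily_surplus):
        cum += surplus
        if cum >= target:
            times.append(day)
            target += step
    return times

# Example: a uniform surplus of 2 mm/day over 30 days, 6 sampling rounds
days = sampling_times([2.0] * 30, 6)  # evenly spaced when surplus is uniform
```

    With a non-uniform surplus series (wet winters, dry summers), the same rule automatically samples more often when water fluxes, and hence leaching, are largest.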

  15. Practical guidance on characterizing availability in resource selection functions under a use-availability design

    USGS Publications Warehouse

    Northrup, Joseph M.; Hooten, Mevin B.; Anderson, Charles R.; Wittemyer, George

    2013-01-01

    Habitat selection is a fundamental aspect of animal ecology, the understanding of which is critical to management and conservation. Global positioning system data from animals allow fine-scale assessments of habitat selection and typically are analyzed in a use-availability framework, whereby animal locations are contrasted with random locations (the availability sample). Although most use-availability methods are in fact spatial point process models, they often are fit using logistic regression. This framework offers numerous methodological challenges, for which the literature provides little guidance. Specifically, the size and spatial extent of the availability sample influence coefficient estimates, potentially causing interpretational bias. We examined the influence of availability on statistical inference through simulations and analysis of serially correlated mule deer GPS data. Bias in estimates arose from incorrectly assessing and sampling the spatial extent of availability. Spatial autocorrelation in covariates, which is common for landscape characteristics, exacerbated the error in availability sampling, leading to increased bias. These results have strong implications for habitat selection analyses using GPS data, which are increasingly prevalent in the literature. We recommend researchers assess the sensitivity of their results to their availability sample and, where bias is likely, take care with interpretations and use cross validation to assess robustness.
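
    The core problem, that the delimitation of the availability sample changes the inferred selection, can be demonstrated with a toy simulation. This sketch uses a crude mean-difference contrast instead of the logistic regression coefficients discussed above, and all the landscape values are hypothetical:

```python
import random

def selection_contrast(used, available):
    """Crude selection signal: difference between the mean covariate
    value at used locations and at the availability sample."""
    return (sum(used) / len(used)) - (sum(available) / len(available))

random.seed(1)
# Hypothetical covariate (e.g., elevation) at animal-used locations
used = [random.uniform(55, 65) for _ in range(200)]
# Availability drawn from the animal's actual home range...
home_range = [random.uniform(40, 80) for _ in range(1000)]
# ...versus availability drawn from a study area far larger than the range
whole_map = [random.uniform(0, 100) for _ in range(1000)]

c_small = selection_contrast(used, home_range)  # near zero: no apparent selection
c_large = selection_contrast(used, whole_map)   # strong apparent "selection"
```

    The same use data yield opposite conclusions depending on how availability was delimited, which is why the authors recommend sensitivity checks on the availability sample.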

  16. Validation of molecularly imprinted polymers for side chain selective phosphopeptide enrichment.

    PubMed

    Chen, Jing; Shinde, Sudhirkumar; Subedi, Prabal; Wierzbicka, Celina; Sellergren, Börje; Helling, Stefan; Marcus, Katrin

    2016-11-04

    Selective enrichment techniques are essential for mapping of protein posttranslational modifications (PTMs). Phosphorylation is one of the PTMs which continues to be associated with significant analytical challenges. Particularly problematic are tyrosine-phosphorylated peptides (pY-peptides) resulting from tryptic digestion, which commonly escape current chemo- or immuno-affinity enrichments and hence remain undetected. We here report on significant improvements in this regard using pY-selective molecularly imprinted polymers (pY-MIPs). The pY-MIP was compared with titanium dioxide (TiO2) affinity based enrichment and immunoprecipitation (IP) with respect to selective enrichment from a mixture of 13 standard peptides at different sample loads. At a low sample load (1 pmol of each peptide), IP resulted in enrichment of only a triply phosphorylated peptide, whereas TiO2 enriched phosphopeptides irrespective of the amino acid side chain. However, with increased sample complexity, TiO2 failed to enrich the doubly phosphorylated peptides. This contrasted with the pY-MIP, which showed enrichment of all four tyrosine-phosphorylated peptides at a 1 pmol sample load of each peptide, with a few other peptides binding unselectively. At an increased sample complexity consisting of the standard peptides spiked into mouse brain digest, the MIP showed clear enrichment of all four pY-peptides. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. The Bologna complete sample of nearby radio sources. II. Phase referenced observations of faint nuclear sources

    NASA Astrophysics Data System (ADS)

    Liuzzo, E.; Giovannini, G.; Giroletti, M.; Taylor, G. B.

    2009-10-01

    Aims: To study statistical properties of different classes of sources, it is necessary to observe a sample that is free of selection effects. To do this, we initiated a project to observe a complete sample of radio galaxies selected from the B2 Catalogue of Radio Sources and the Third Cambridge Revised Catalogue (3CR), with no selection constraint on the nuclear properties. We named this sample “the Bologna Complete Sample” (BCS). Methods: We present new VLBI observations at 5 and 1.6 GHz for 33 sources drawn from a sample not biased toward orientation. By combining these data with those in the literature, information on the parsec-scale morphology is available for a total of 76 of 94 radio sources with a range in radio power and kiloparsec-scale morphologies. Results: The fraction of two-sided sources at milliarcsecond resolution is high (30%), compared to the fraction found in VLBI surveys selected at centimeter wavelengths, as expected from the predictions of unified models. The parsec-scale jets are generally found to be straight and to line up with the kiloparsec-scale jets. A few peculiar sources are discussed in detail. Tables 1-4 are only available in electronic form at http://www.aanda.org

  18. Prediction of ECS and SSC Models for Flux-Limited Samples of Gamma-Ray Blazars

    NASA Technical Reports Server (NTRS)

    Lister, Matthew L.; Marscher, Alan P.

    1999-01-01

    The external Compton scattering (ECS) and synchrotron self-Compton (SSC) models make distinct predictions for the amount of Doppler boosting of high-energy gamma-rays emitted by blazars. We examine how these differences affect the predicted properties of active galactic nucleus (AGN) samples selected on the basis of gamma-ray emission. We create simulated flux-limited samples based on the ECS and SSC models, and compare their properties to those of identified EGRET blazars. We find that for small gamma-ray-selected samples, the two models make very similar predictions, and cannot be reliably distinguished. This is primarily due to the fact that not only the Doppler factor, but also the cosmological distance and intrinsic luminosity play a role in determining whether an AGN is included in a flux-limited gamma-ray sample.

  19. VLA observations of a complete sample of extragalactic X-ray sources. II

    NASA Technical Reports Server (NTRS)

    Schild, R.; Zamorani, G.; Gioia, I. M.; Feigelson, E. D.; Maccacaro, T.

    1983-01-01

    A complete sample of 35 X-ray selected sources found with the Einstein Observatory has been observed with the Very Large Array at 6 cm to investigate the relationship between radio and X-ray emission in extragalactic objects. Detections include three active galactic nuclei (AGNs), two clusters or groups of galaxies, two individual galaxies, and two BL Lac objects. The frequency of radio emission in X-ray selected AGNs is compared with that of optically selected quasars using the integral radio-optical luminosity function. The result suggests that the probability for X-ray selected quasars to be radio sources is higher than for those optically selected. No obvious correlation is found in the sample between the richness or X-ray luminosity of the cluster and the presence of a galaxy with radio luminosity at 5 GHz larger than 10 to the 30th ergs/s/Hz.

  20. Evolution of the major merger galaxy pair fraction at z < 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keenan, R. C.; Hsieh, B. C.; Lin, L.

    We present a study of the largest available sample of near-infrared selected (i.e., stellar mass selected) dynamically close pairs of galaxies at low redshifts (z < 0.3). We combine this sample with new estimates of the major merger pair fraction for stellar mass selected galaxies at z < 0.8, from the Red Sequence Cluster Survey (RCS1). We construct our low-redshift K-band selected sample using photometry from the UKIRT Infrared Deep Sky Survey and the Two Micron All Sky Survey (2MASS) in the K band (∼2.2 μm). Combined with all available spectroscopy, our K-band selected sample contains ∼250,000 galaxies and is >90% spectroscopically complete. The depth and large volume of this sample allow us to investigate the low-redshift pair fraction and merger rate of galaxies over a wide range in K-band luminosity. We find the major merger pair fraction to be flat at ∼2% as a function of K-band luminosity for galaxies in the range 10⁸-10¹² L☉, in contrast to recent results from studies in the Local Group that find a substantially higher low-mass pair fraction. This low-redshift major merger pair fraction is ∼40%-50% higher than previous estimates drawn from K-band samples, which were based on 2MASS photometry alone. Combining with the RCS1 sample, we find a much flatter evolution (m = 0.7 ± 0.1) in the relation f_pair ∝ (1 + z)^m than indicated in many previous studies. These results indicate that a typical L ∼ L* galaxy has undergone ∼0.2-0.8 major mergers since z = 1 (depending on the assumptions of merger timescale and percentage of pairs that actually merge).

  1. Methods for purifying carbon materials

    DOEpatents

    Dailly, Anne [Pasadena, CA; Ahn, Channing [Pasadena, CA; Yazami, Rachid [Los Angeles, CA; Fultz, Brent T [Pasadena, CA

    2009-05-26

    Methods of purifying samples are provided that are capable of removing carbonaceous and noncarbonaceous impurities from a sample containing a carbon material having a selected structure. Purification methods are provided for removing residual metal catalyst particles enclosed in multilayer carbonaceous impurities in samples generated by catalytic synthesis methods. Purification methods are provided wherein carbonaceous impurities in a sample are at least partially exfoliated, thereby facilitating subsequent removal of carbonaceous and noncarbonaceous impurities from the sample. Methods of purifying carbon nanotube-containing samples are provided wherein an intercalant is added to the sample and subsequently reacted with an exfoliation initiator to achieve exfoliation of carbonaceous impurities.

  2. SAS procedures for designing and analyzing sample surveys

    USGS Publications Warehouse

    Stafford, Joshua D.; Reinecke, Kenneth J.; Kaminski, Richard M.

    2003-01-01

    Complex surveys often are necessary to estimate occurrence (or distribution), density, and abundance of plants and animals for purposes of research and conservation. Most scientists are familiar with simple random sampling, where sample units are selected from a population of interest (sampling frame) with equal probability. However, the goal of ecological surveys often is to make inferences about populations over large or complex spatial areas where organisms are not homogeneously distributed or sampling frames are inconvenient or impossible to construct. Candidate sampling strategies for such complex surveys include stratified, multistage, and adaptive sampling (Thompson 1992, Buckland 1994).

  3. Characterization of tank 51 sludge samples (HTF-51-17-44/ HTF-51-17-48) in support of sludge batch 10 processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oji, L. N.

    The Savannah River National Laboratory (SRNL) was requested by Savannah River Remediation (SRR) Engineering (SRR-E) to provide sample characterization and analyses of Tank 51 sludge samples in support of Sludge Batch (SB) 10. The two Tank 51 sludge samples were collected and delivered to SRNL in May of 2017. These two Tank 51 sludge samples were combined into one composite sample and analyzed for corrosion-control analytes, selected radionuclides, chemical elements, density, weight percent total solids, and aluminum hydroxides (gibbsite and boehmite) by X-ray diffraction.

  4. Water-quality, bed-sediment, and biological data (October 2010 through September 2011) and statistical summaries of data for streams in the Clark Fork basin, Montana

    USGS Publications Warehouse

    Dodge, Kent A.; Hornberger, Michelle I.; Dyke, Jessica

    2013-01-01

    Water, bed sediment, and biota were sampled in streams from Butte to near Missoula, Montana, as part of a monitoring program in the upper Clark Fork basin of western Montana; additional water samples were collected from near Galen to near Missoula at select sites as part of a supplemental sampling program. The sampling program was conducted by the U.S. Geological Survey in cooperation with the U.S. Environmental Protection Agency to characterize aquatic resources in the Clark Fork basin, with emphasis on trace elements associated with historic mining and smelting activities. Sampling sites were located on the Clark Fork and selected tributaries. Water samples were collected periodically at 20 sites from October 2010 through September 2011. Bed-sediment and biota samples were collected once at 14 sites during August 2011. This report presents the analytical results and quality-assurance data for water-quality, bed-sediment, and biota samples collected at sites from October 2010 through September 2011. Water-quality data include concentrations of selected major ions, trace elements, and suspended sediment. Turbidity was analyzed for water samples collected at the four sites where seasonal daily values of turbidity were being determined. Daily values of suspended-sediment concentration and suspended-sediment discharge were determined for four sites. Bed-sediment data include trace-element concentrations in the fine-grained fraction. Biological data include trace-element concentrations in whole-body tissue of aquatic benthic insects. Statistical summaries of water-quality, bed-sediment, and biological data for sites in the upper Clark Fork basin are provided for the period of record since 1985.

  5. Water-quality, bed-sediment, and biological data (October 2015 through September 2016) and statistical summaries of data for streams in the Clark Fork Basin, Montana

    USGS Publications Warehouse

    Dodge, Kent A.; Hornberger, Michelle I.; Turner, Matthew A.

    2018-03-30

Water, bed sediment, and biota were sampled in selected streams from Butte to near Missoula, Montana, as part of a monitoring program in the upper Clark Fork Basin of western Montana. The sampling program was led by the U.S. Geological Survey, in cooperation with the U.S. Environmental Protection Agency, to characterize aquatic resources in the Clark Fork Basin, with emphasis on trace elements associated with historic mining and smelting activities. Sampling sites were on the Clark Fork and selected tributaries. Water samples were collected periodically at 20 sites from October 2015 through September 2016. Bed-sediment and biota samples were collected once at 13 sites during August 2016. This report presents the analytical results and quality-assurance data for water-quality, bed-sediment, and biota samples collected at sites from October 2015 through September 2016. Water-quality data include concentrations of selected major ions, trace elements, and suspended sediment. Samples for analysis of turbidity were collected at 13 sites, whereas samples for analysis of dissolved organic carbon were collected at 10 sites. In addition, samples for analysis of nitrogen (nitrate plus nitrite) were collected at two sites. Daily values of mean suspended-sediment concentration and suspended-sediment discharge were determined for three sites. Seasonal daily values of turbidity were determined for five sites. Bed-sediment data include trace-element concentrations in the fine-grained (less than 0.063 millimeter) fraction. Biological data include trace-element concentrations in whole-body tissue of aquatic benthic insects. Statistical summaries of water-quality, bed-sediment, and biological data for sites in the upper Clark Fork Basin are provided for the period of record.

  6. Adaptive Cluster Sampling for Forest Inventories

    Treesearch

    Francis A. Roesch

    1993-01-01

    Adaptive cluster sampling is shown to be a viable alternative for sampling forests when there are rare characteristics of the forest trees which are of interest and occur on clustered trees. The ideas of recent work in Thompson (1990) have been extended to the case in which the initial sample is selected with unequal probabilities. An example is given in which the...
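The mechanics of adaptive cluster sampling can be sketched in a few lines: draw an initial sample, then add the neighbors of any sampled unit exhibiting the rare characteristic, repeating until no new triggering units appear. The grid, trigger condition, and rook's-move neighborhood below are illustrative assumptions, not Roesch's design, and the unequal-probability initial selection discussed in the abstract is replaced here by a simple random start.

```python
import random

def adaptive_cluster_sample(values, initial_n, condition, rng):
    """Adaptive cluster sampling on a gridded population (toy sketch).

    values: dict mapping (row, col) -> measured value
    condition: a unit triggers neighborhood expansion when condition(value) is True
    """
    cells = list(values)
    sample = set(rng.sample(cells, initial_n))  # initial simple random sample
    frontier = [c for c in sample if condition(values[c])]
    while frontier:
        r, c = frontier.pop()
        # add the 4 rook's-move neighbors of any triggering unit
        for nb in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if nb in values and nb not in sample:
                sample.add(nb)
                if condition(values[nb]):
                    frontier.append(nb)
    return sample

rng = random.Random(42)
# sparse, clustered rare attribute: a few adjacent cells carry positive values
grid = {(r, c): 0 for r in range(10) for c in range(10)}
for cell in [(3, 3), (3, 4), (4, 3)]:
    grid[cell] = 1
s = adaptive_cluster_sample(grid, 15, lambda v: v > 0, rng)
print(len(s) >= 15)  # the adaptive step can only grow the sample
```

Because expansion stops at non-triggering units, effort concentrates on the clusters of interest while the rest of the forest receives only the initial-sample effort.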

  7. Instance-Based Learning: Integrating Sampling and Repeated Decisions from Experience

    ERIC Educational Resources Information Center

    Gonzalez, Cleotilde; Dutt, Varun

    2011-01-01

    In decisions from experience, there are 2 experimental paradigms: sampling and repeated-choice. In the sampling paradigm, participants sample between 2 options as many times as they want (i.e., the stopping point is variable), observe the outcome with no real consequences each time, and finally select 1 of the 2 options that cause them to earn or…

  8. The Accuracy of Estimated Total Test Statistics. Final Report.

    ERIC Educational Resources Information Center

    Kleinke, David J.

    In a post-mortem study of item sampling, 1,050 examinees were divided into ten groups 50 times. Each time, their papers were scored on four different sets of item samples from a 150-item test of academic aptitude. These samples were selected using (a) unstratified random sampling and stratification on (b) content, (c) difficulty, and (d) both.…

  9. Caution on the use of Viton® or FETFE® O-rings in carbon dioxide sample containers for δ18O analysis

    USGS Publications Warehouse

    Revesz, Kinga M.; Coplen, Tyler B.

    1991-01-01

    Caution needs to be exercised in selecting sample containers for CO2 isotope-ratio samples of < 200 μmol. If stopcocks are used in construction of containers for such samples, the use of all-glass stopcocks with Apiezon N® hydrocarbon-based grease will eliminate the fractionation of oxygen isotopes.

  10. 46 CFR 160.076-29 - Production oversight.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...) Samples selected for the indicated tests may not be used for more than one test. (iii) If a sample fails... 12 months. (iii) One sample of each means of marking on each type of fabric or finish used in PFD... if any sample fails one or more tests. (3) In lots of more than 200 PFDs, the lot must be rejected if...

  11. A Fast Algorithm of Convex Hull Vertices Selection for Online Classification.

    PubMed

    Ding, Shuguang; Nie, Xiangli; Qiao, Hong; Zhang, Bo

    2018-04-01

Reducing samples through convex hull vertices selection (CHVS) within each class is an important and effective method for online classification problems, since the classifier can be trained rapidly with the selected samples. However, the process of CHVS is NP-hard. In this paper, we propose a fast algorithm to select the convex hull vertices, based on the convex hull decomposition and the property of projection. In the proposed algorithm, the quadratic minimization problem of computing the distance between a point and a convex hull is converted into a linear equation problem with a low computational complexity. When the data dimension is high, an approximate, instead of exact, convex hull is allowed to be selected by setting an appropriate termination condition in order to delete more unimportant samples. In addition, the impact of outliers is also considered, and the proposed algorithm is improved by deleting the outliers in the initial procedure. Furthermore, a dimension conversion technique via the kernel trick is used to deal with nonlinearly separable problems. An upper bound is theoretically proved for the difference between the support vector machines based on the approximate convex hull vertices selected and all the training samples. Experimental results on both synthetic and real data sets show the effectiveness and validity of the proposed algorithm.
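The core subproblem, computing the distance from a point to the convex hull of a vertex set, can be illustrated with a conditional-gradient (Frank-Wolfe) sketch: each iteration replaces the quadratic program with a linear minimization over the vertices. This is an illustrative stand-in for the authors' linear-equation reduction, not their algorithm; the function name and exact-line-search step are assumptions.

```python
def hull_distance(vertices, p, iters=200):
    """Distance from point p to conv(vertices) via Frank-Wolfe.

    Each iteration solves only a *linear* subproblem: pick the vertex
    minimizing the inner product with the gradient of ||x - p||^2.
    """
    dim = len(p)
    x = list(vertices[0])  # feasible start: a hull vertex
    for _ in range(iters):
        grad = [2 * (x[i] - p[i]) for i in range(dim)]
        # linear minimization oracle over the hull's vertices
        s = min(vertices, key=lambda v: sum(g * vi for g, vi in zip(grad, v)))
        d = [s[i] - x[i] for i in range(dim)]
        dd = sum(di * di for di in d)
        if dd == 0:
            break
        # exact line search along x + gamma * d, clamped to [0, 1]
        gamma = sum((p[i] - x[i]) * d[i] for i in range(dim)) / dd
        gamma = max(0.0, min(1.0, gamma))
        if gamma == 0.0:
            break
        x = [x[i] + gamma * d[i] for i in range(dim)]
    return sum((x[i] - p[i]) ** 2 for i in range(dim)) ** 0.5

# closest point of the unit square to (2, 0.5) is (1, 0.5): distance 1
square = [(0, 0), (1, 0), (0, 1), (1, 1)]
print(round(hull_distance(square, (2.0, 0.5)), 6))
```

The linear oracle touches only vertices, which is what makes vertex-based sample reduction cheap relative to a full quadratic solve.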

  12. Effect of field view size and lighting on unique-hue selection using Natural Color System object colors.

    PubMed

    Shamey, Renzo; Zubair, Muhammad; Cheema, Hammad

    2015-08-01

The aim of this study was twofold: first, to determine the effect of field of view size and, second, the effect of illumination conditions on the selection of unique-hue samples (UHs: R, Y, G and B) from two rotatable trays, each containing forty highly chromatic Natural Color System (NCS) samples, one tray corresponding to a 1.4° and the other to a 5.7° field of view. UH selections were made by 25 color-normal observers who repeated assessments three times with a gap of at least 24 h between trials. Observers separately assessed UHs under four illumination conditions simulating illuminants D65, A, F2 and F11. An apparent hue shift (statistically significant for UR) was noted for UH selections at the 5.7° field of view compared to those at 1.4°. Observers' overall variability was higher for UH stimuli selections at the larger field of view. Intra-observer variability was approximately 18.7% of inter-observer variability in the selection of samples for both sample sizes. The highest intra-observer variability was under simulated illuminant D65, followed by A, F11, and F2.

  13. Concentrations of nitrate in drinking water in the lower Yakima River Basin, Groundwater Management Area, Yakima County, Washington, 2017

    USGS Publications Warehouse

    Huffman, Raegan L.

    2018-05-29

The U.S. Geological Survey, in cooperation with the lower Yakima River Basin Groundwater Management Area (GWMA) group, conducted an intensive groundwater sampling effort to collect nitrate concentration data in drinking water and provide a baseline for future nitrate assessments within the GWMA. About every 6 weeks from April through December 2017, a total of 1,059 samples were collected from 156 wells and 24 surface-water drains. The domestic wells were selected based on known location, completion depth, ability to collect a sample prior to treatment or filtration, and distribution across the GWMA. The drains were pre-selected by the GWMA group, and further assessed based on the ability to access sites and obtain a representative sample. More than 20 percent of samples from the domestic wells and 12.8 percent of drain samples had nitrate concentrations that exceeded the maximum contaminant level (MCL) of 10 milligrams per liter established by the U.S. Environmental Protection Agency. At least one nitrate concentration above the MCL was detected in 26 percent of wells and 33 percent of drains sampled. Nitrate was not detected in 13 percent of all samples collected.

  14. Fitting distributions to microbial contamination data collected with an unequal probability sampling design.

    PubMed

    Williams, M S; Ebel, E D; Cao, Y

    2013-01-01

The fitting of statistical distributions to microbial sampling data is a common application in quantitative microbiology and risk assessment. An underlying assumption of most fitting techniques is that data are collected with simple random sampling, which is often not the case. This study develops a weighted maximum likelihood estimation framework that is appropriate for microbiological samples collected with unequal probabilities of selection. Two examples, based on the collection of food samples during processing, are provided to demonstrate the method and highlight the magnitude of the bias in the maximum likelihood estimator when data are inappropriately treated as a simple random sample. Failure to properly weight samples to account for how data are collected can introduce substantial bias into inferences drawn from the data. The proposed methodology will reduce or eliminate an important source of bias in inferences drawn from the analysis of microbial data. This will also make comparisons between studies and the combination of results from different studies more reliable, which is important for risk assessment applications.
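The basic correction can be sketched as follows: weight each observation by the inverse of its selection probability, so the weighted likelihood represents the population the sample was drawn from. The exponential distribution (whose weighted MLE has a closed form) and the selection probabilities below are illustrative assumptions, not the study's data or model.

```python
def weighted_mle_exponential(samples):
    """Weighted MLE of an exponential rate from (value, selection_prob) pairs.

    Each observation is weighted by 1/p (inverse probability of selection);
    for the exponential distribution the weighted likelihood has the
    closed-form maximizer  lambda_hat = sum(w) / sum(w * x).
    """
    weights = [1.0 / p for _, p in samples]
    values = [x for x, _ in samples]
    return sum(weights) / sum(w * x for w, x in zip(weights, values))

# units with large values were twice as likely to be selected, so an
# unweighted fit overstates the mean (understates the rate)
data = [(0.5, 0.1), (0.5, 0.1), (2.0, 0.2), (2.0, 0.2)]
unweighted = len(data) / sum(x for x, _ in data)
weighted = weighted_mle_exponential(data)
print(unweighted < weighted)  # weighting pulls the rate estimate back up
```

Treating the same data as a simple random sample gives a rate of 0.8, while the inverse-probability weights recover a rate of 1.0: the direction of bias the abstract warns about.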

  15. Front end for GPS receivers

    NASA Technical Reports Server (NTRS)

    Thomas, Jr., Jess Brooks (Inventor)

    1999-01-01

    The front end in GPS receivers has the functions of amplifying, down-converting, filtering and sampling the received signals. In the preferred embodiment, only two operations, A/D conversion and a sum, bring the signal from RF to filtered quadrature baseband samples. After amplification and filtering at RF, the L1 and L2 signals are each sampled at RF at a high selected subharmonic rate. The subharmonic sample rates are approximately 900 MHz for L1 and 982 MHz for L2. With the selected subharmonic sampling, the A/D conversion effectively down-converts the signal from RF to quadrature components at baseband. The resulting sample streams for L1 and L2 are each reduced to a lower rate with a digital filter, which becomes a straight sum in the simplest embodiment. The frequency subsystem can be very simple, only requiring the generation of a single reference frequency (e.g. 20.46 MHz minus a small offset) and the simple multiplication of this reference up to the subharmonic sample rates for L1 and L2. The small offset in the reference frequency serves the dual purpose of providing an advantageous offset in the down-converted carrier frequency and in the final baseband sample rate.
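The down-conversion-by-sampling idea can be seen in a toy example: sampling a high-frequency tone at a rate whose integer multiple sits near the carrier leaves samples indistinguishable from a low-frequency baseband tone. The frequencies below are illustrative stand-ins, not the patent's ~900/982 MHz subharmonic rates.

```python
import math

fs = 20.0      # sample rate (Hz), deliberately far below the carrier
f_rf = 101.0   # carrier frequency (Hz); 101 = 5 * 20 + 1
f_alias = f_rf - round(f_rf / fs) * fs  # alias lands at 1 Hz

rf_samples = [math.cos(2 * math.pi * f_rf * n / fs) for n in range(40)]
baseband = [math.cos(2 * math.pi * f_alias * n / fs) for n in range(40)]

# the undersampled RF tone is sample-for-sample a 1 Hz baseband tone
print(max(abs(a - b) for a, b in zip(rf_samples, baseband)))
```

This is why the A/D conversion itself can serve as the mixer: the sampler cannot distinguish the carrier from its alias, so choosing the rate places the alias where the receiver wants it.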

  16. Formation and retention of methane in coal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hucka, V.J.; Bodily, D.M.; Huang, H.

    1992-05-15

The formation and retention of methane in coalbeds was studied for ten Utah coal samples, one Colorado coal sample and eight coal samples from the Argonne Premium Coal Sample Bank. Methane gas content of the Utah and Colorado coals varied from zero to 9 cm³/g. The Utah coals were all high volatile bituminous coals. The Colorado coal was a gassy medium volatile bituminous coal. The Argonne coals cover a range of rank from lignite to low volatile bituminous coal and were used to determine the effect of rank in laboratory studies. The methane content of six selected Utah coal seams and the Colorado coal seam was measured in situ using a special sample collection device and a bubble desorbometer. Coal samples were collected at each measurement site for laboratory analysis. The cleat and joint system was evaluated for the coal and surrounding rocks and geological conditions were noted. Permeability measurements were performed on selected samples and all samples were analyzed for proximate and ultimate analysis, petrographic analysis, ¹³C NMR dipolar-dephasing spectroscopy, and density analysis. The observed methane adsorption behavior was correlated with the chemical structure and physical properties of the coals.

  17. Formation and retention of methane in coal. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hucka, V.J.; Bodily, D.M.; Huang, H.

    1992-05-15

The formation and retention of methane in coalbeds was studied for ten Utah coal samples, one Colorado coal sample and eight coal samples from the Argonne Premium Coal Sample Bank. Methane gas content of the Utah and Colorado coals varied from zero to 9 cm³/g. The Utah coals were all high volatile bituminous coals. The Colorado coal was a gassy medium volatile bituminous coal. The Argonne coals cover a range of rank from lignite to low volatile bituminous coal and were used to determine the effect of rank in laboratory studies. The methane content of six selected Utah coal seams and the Colorado coal seam was measured in situ using a special sample collection device and a bubble desorbometer. Coal samples were collected at each measurement site for laboratory analysis. The cleat and joint system was evaluated for the coal and surrounding rocks and geological conditions were noted. Permeability measurements were performed on selected samples and all samples were analyzed for proximate and ultimate analysis, petrographic analysis, ¹³C NMR dipolar-dephasing spectroscopy, and density analysis. The observed methane adsorption behavior was correlated with the chemical structure and physical properties of the coals.

  18. Recent select Sample Analysis at Mars (SAM) Testbed analog results

    NASA Astrophysics Data System (ADS)

    Malespin, C.; McAdam, A.; Teinturier, S.; Eigenbrode, J. L.; Freissinet, C.; Knudson, C. A.; Lewis, J. M.; Millan, M.; Steele, A.; Stern, J. C.; Williams, A. J.

    2017-12-01

The Sample Analysis at Mars (SAM) testbed (TB) is a high fidelity replica of the flight instrument currently onboard the Curiosity rover in Gale Crater, Mars [1]. The SAM testbed is housed in a Mars environment chamber at NASA Goddard Space Flight Center (GSFC), which can replicate both thermal and environmental conditions. The testbed is used to validate and test new experimental procedures before they are implemented on Mars, but it is also used to analyze analog samples, which assists in the interpretation of results from the surface. Samples are heated using the same experimental protocol as on Mars to allow for direct comparison with Martian sampling conditions. Here we report preliminary results from select samples that were loaded into the SAM TB, including meteorites, an organically rich iron oxide, and a synthetic analog to the Martian Cumberland sample drilled by the rover at Yellowknife Bay. Each of these samples has been analyzed under SAM-like conditions using breadboard and lab instrument systems. By comparing the data from the lab systems and SAM TB, further insight on results from Mars can be gained. References: [1] Mahaffy, P. R., et al. (2013), Science, 341(6143), 263-266, doi:10.1126/science.1237966.

  19. Radionuclides, inorganic constituents, organic compounds, and bacteria in water from selected wells and springs from the southern boundary of the Idaho National Engineering Laboratory to the Hagerman Area, Idaho, 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartholomay, R.C.; Edwards, D.D.; Campbell, L.J.

    1994-11-01

The U.S. Geological Survey and the Idaho Department of Water Resources, in response to a request from the U.S. Department of Energy, sampled 18 sites as part of a long-term project to monitor water quality of the Snake River Plain aquifer from the southern boundary of the Idaho National Engineering Laboratory to the Hagerman area. Water samples were collected and analyzed for selected radionuclides, inorganic constituents, organic compounds, and bacteria. The samples were collected from 13 irrigation wells, 1 domestic well, 1 spring, 2 stock wells, and 1 public supply well. Quality assurance samples also were collected and analyzed. None of the samples analyzed for radionuclides, inorganic constituents, or organic compounds exceeded the established maximum contaminant levels for drinking water. Most of the radionuclide and inorganic constituent concentrations exceeded their respective reporting levels. Most of the samples analyzed for surfactants and dissolved organic carbon had concentrations that exceeded their reporting levels. None of the samples contained reportable concentrations of purgeable organic compounds or pesticides. Total coliform bacteria were present in nine samples.

  20. Optimizing larval assessment to support sea lamprey control in the Great Lakes

    USGS Publications Warehouse

    Hansen, Michael J.; Adams, Jean V.; Cuddy, Douglas W.; Richards, Jessica M.; Fodale, Michael F.; Larson, Geraldine L.; Ollila, Dale J.; Slade, Jeffrey W.; Steeves, Todd B.; Young, Robert J.; Zerrenner, Adam

    2003-01-01

Elements of the larval sea lamprey (Petromyzon marinus) assessment program that most strongly influence the chemical treatment program were analyzed, including selection of streams for larval surveys, allocation of sampling effort among stream reaches, allocation of sampling effort among habitat types, estimation of daily growth rates, and estimation of metamorphosis rates, to determine how uncertainty in each element influenced the stream selection program. First, the stream selection model based on current larval assessment sampling protocol significantly underestimated transforming sea lamprey abundance, transforming sea lampreys killed, and marginal costs per sea lamprey killed, compared to a protocol that included more years of data (especially for large streams). Second, larval density in streams varied significantly with Type-I habitat area, but not with total area or reach length. Third, the ratio of larval density between Type-I and Type-II habitat varied significantly among streams, and the optimal allocation of sampling effort varied with the proportion of habitat types and variability of larval density within each habitat. Fourth, mean length varied significantly among streams and years. Last, size at metamorphosis varied more among years than within or among regions, and metamorphosis varied significantly among streams within regions. Study results indicate that: (1) the stream selection model should be used to identify streams with potentially high residual populations of larval sea lampreys; (2) larval sampling in Type-II habitat should be initiated in all streams by increasing sampling in Type-II habitat to 50% of the sampling effort in Type-I habitat; and (3) methods should be investigated to reduce uncertainty in estimates of sea lamprey production, with emphasis on those that reduce the uncertainty associated with larval length at the end of the growing season and those used to predict metamorphosis.
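The habitat-allocation question in the third finding is the classic stratified-design trade-off. Neyman allocation, sketched below with illustrative numbers (this is the textbook rule, not the authors' protocol), splits effort in proportion to each stratum's size times its standard deviation.

```python
def neyman_allocation(total_n, strata):
    """Split total_n sampling units across strata.

    strata: list of (area, std_dev) pairs; effort per stratum is
    proportional to area * std_dev (Neyman allocation).
    """
    products = [area * sd for area, sd in strata]
    total = sum(products)
    return [total_n * p / total for p in products]

# Type-I habitat: small area, highly variable larval density;
# Type-II habitat: large area, low variability (illustrative numbers)
alloc = neyman_allocation(100, [(10.0, 8.0), (40.0, 1.0)])
print([round(a, 1) for a in alloc])  # → [66.7, 33.3]
```

Because the optimum depends on both the habitat proportions and the within-habitat variability, the best split differs stream by stream, which is exactly why the authors found no single allocation optimal everywhere.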

  1. Towards the harmonization between National Forest Inventory and Forest Condition Monitoring. Consistency of plot allocation and effect of tree selection methods on sample statistics in Italy.

    PubMed

    Gasparini, Patrizia; Di Cosmo, Lucio; Cenni, Enrico; Pompei, Enrico; Ferretti, Marco

    2013-07-01

In the frame of a process aiming at harmonizing National Forest Inventory (NFI) and ICP Forests Level I Forest Condition Monitoring (FCM) in Italy, we investigated (a) the long-term consistency between FCM sample points (a subsample of the first NFI, 1985, NFI_1) and recent forest area estimates (after the second NFI, 2005, NFI_2) and (b) the effect of tree selection method (tree-based or plot-based) on sample composition and defoliation statistics. The two investigations were carried out on 261 and 252 FCM sites, respectively. Results show that some individual forest categories (larch and stone pine, Norway spruce, other coniferous, beech, temperate oaks and cork oak forests) are over-represented and others (hornbeam and hophornbeam, other deciduous broadleaved and holm oak forests) are under-represented in the FCM sample. This is probably due to a change in forest cover, which has increased by 1,559,200 ha from 1985 to 2005. In case of a shift from a tree-based to a plot-based selection method, 3,130 (46.7%) of the original 6,703 sample trees will be abandoned, and 1,473 new trees will be selected. The balance between exclusion of former sample trees and inclusion of new ones will be particularly unfavourable for conifers (with only 16.4% of excluded trees replaced by new ones) and less for deciduous broadleaves (with 63.5% of excluded trees replaced). The total number of tree species surveyed will not be impacted, while the number of trees per species will, and the resulting (plot-based) sample composition will have a much larger frequency of deciduous broadleaved trees. The newly selected trees have, in general, smaller diameter at breast height (DBH) and defoliation scores. Given the larger rate of turnover, the deciduous broadleaved part of the sample will be more impacted. Our results suggest that both a revision of the FCM network to account for forest area change and a plot-based approach to permit statistical inference and avoid bias in the tree sample composition in terms of DBH (and likely age and structure) are desirable in Italy. As the adoption of a plot-based approach will keep a large share of the trees formerly selected, direct tree-by-tree comparison will remain possible, thus limiting the impact on time-series comparability. In addition, the plot-based design will favour the integration with NFI_2.

  2. Dielectric breakdown of additively manufactured polymeric materials

    DOE PAGES

    Monzel, W. Jacob; Hoff, Brad W.; Maestas, Sabrina S.; ...

    2016-01-11

Dielectric strength testing of selected Polyjet-printed polymer plastics was performed in accordance with ASTM D149. This dielectric strength data is compared to manufacturer-provided dielectric strength data for selected plastics printed using the stereolithography (SLA), fused deposition modeling (FDM), and selective laser sintering (SLS) methods. Tested Polyjet samples demonstrated dielectric strengths as high as 47.5 kV/mm for a 0.5 mm thick sample and 32.1 kV/mm for a 1.0 mm sample. As a result, the dielectric strength of the additively manufactured plastics evaluated as part of this study was lower than that of the majority of non-printed plastics by at least 15% (with the exception of polycarbonate).

  3. Dielectric breakdown of additively manufactured polymeric materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Monzel, W. Jacob; Hoff, Brad W.; Maestas, Sabrina S.

Dielectric strength testing of selected Polyjet-printed polymer plastics was performed in accordance with ASTM D149. This dielectric strength data is compared to manufacturer-provided dielectric strength data for selected plastics printed using the stereolithography (SLA), fused deposition modeling (FDM), and selective laser sintering (SLS) methods. Tested Polyjet samples demonstrated dielectric strengths as high as 47.5 kV/mm for a 0.5 mm thick sample and 32.1 kV/mm for a 1.0 mm sample. As a result, the dielectric strength of the additively manufactured plastics evaluated as part of this study was lower than that of the majority of non-printed plastics by at least 15% (with the exception of polycarbonate).

  4. A stratified two-stage sampling design for digital soil mapping in a Mediterranean basin

    NASA Astrophysics Data System (ADS)

    Blaschek, Michael; Duttmann, Rainer

    2015-04-01

The quality of environmental modelling results often depends on reliable soil information. In order to obtain soil data in an efficient manner, several sampling strategies are at hand, depending on the level of prior knowledge and the overall objective of the planned survey. This study focuses on the collection of soil samples considering available continuous secondary information in an undulating, 16 km² river catchment near Ussana in southern Sardinia (Italy). A design-based, stratified, two-stage sampling design was applied, aiming at the spatial prediction of soil property values at individual locations. The stratification was based on quantiles from density functions of two land-surface parameters, topographic wetness index and potential incoming solar radiation, derived from a digital elevation model. Combined with four main geological units, the applied procedure led to 30 different classes in the given test site. Up to six polygons of each available class were selected randomly, excluding areas smaller than 1 ha to avoid incorrect location of the points in the field. Further exclusion rules were applied before polygon selection, masking out roads and buildings using a 20 m buffer. The selection procedure was repeated ten times and the set of polygons with the best geographical spread was chosen. Finally, exact point locations were selected randomly from inside the chosen polygon features. A second selection based on the same stratification and following the same methodology (selecting one polygon instead of six) was made in order to create an appropriate validation set. Supplementary samples were obtained during a second survey focusing on polygons that had either not been considered at all during the first phase or were not adequately represented with respect to feature size. In total, both field campaigns produced an interpolation set of 156 samples and a validation set of 41 points. The selection of sample point locations was done using ESRI software (ArcGIS) extended by Hawth's Tools and, later, its replacement, the Geospatial Modelling Environment (GME). 88% of all desired points could actually be reached in the field and were successfully sampled. Our results indicate that the sampled calibration and validation sets are representative of each other and could be successfully used as interpolation data for spatial prediction purposes. With respect to soil textural fractions, for instance, equal multivariate means and variance homogeneity were found for the two datasets, as evidenced by non-significant (P > 0.05) Hotelling T²-test (2.3 with df1 = 3, df2 = 193) and Bartlett's test statistics (6.4 with df = 6). The multivariate prediction of clay, silt and sand content using a neural network residual cokriging approach reached explained variance levels of 56%, 47% and 63%. Thus, the presented case study is a successful example of considering readily available continuous information on soil-forming factors such as geology and relief as stratifying variables for designing sampling schemes in digital soil mapping projects.
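The quantile-based stratification step can be sketched generically: bin a continuous covariate (e.g. a wetness index) at its quantiles, cross the bins with a categorical layer (e.g. geology), and draw randomly within each resulting stratum. The bin count, class labels, and population sizes below are illustrative assumptions, not the Ussana design.

```python
import random

def quantile_bin(values, n_bins):
    """Assign each value the index of its quantile bin (0..n_bins-1)."""
    cuts = sorted(values)
    edges = [cuts[int(len(cuts) * k / n_bins)] for k in range(1, n_bins)]
    return [sum(v >= e for e in edges) for v in values]

rng = random.Random(7)
wetness = [rng.uniform(0, 1) for _ in range(200)]   # continuous covariate
geology = [rng.choice("ABCD") for _ in range(200)]  # categorical layer

bins = quantile_bin(wetness, 3)
strata = {}
for i, key in enumerate(zip(bins, geology)):
    strata.setdefault(key, []).append(i)

# one randomly chosen sample location per (wetness-bin, geology) stratum
sample = [rng.choice(members) for members in strata.values()]
print(len(sample) == len(strata))
```

Crossing three quantile bins with the categorical classes yields the strata; drawing within them guarantees that every combination of terrain and geology contributes at least one sampling location.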

  5. Can groundwater sampling techniques used in monitoring wells influence methane concentrations and isotopes?

    PubMed

    Rivard, Christine; Bordeleau, Geneviève; Lavoie, Denis; Lefebvre, René; Malet, Xavier

    2018-03-06

    Methane concentrations and isotopic composition in groundwater are the focus of a growing number of studies. However, concerns are often expressed regarding the integrity of samples, as methane is very volatile and may partially exsolve during sample lifting in the well and transfer to sampling containers. While issues concerning bottle-filling techniques have already been documented, this paper documents a comparison of methane concentration and isotopic composition obtained with three devices commonly used to retrieve water samples from dedicated observation wells. This work lies within the framework of a larger project carried out in the Saint-Édouard area (southern Québec, Canada), whose objective was to assess the risk to shallow groundwater quality related to potential shale gas exploitation. The selected sampling devices, which were tested on ten wells during three sampling campaigns, consist of an impeller pump, a bladder pump, and disposable sampling bags (HydraSleeve). The sampling bags were used both before and after pumping, to verify the appropriateness of a no-purge approach, compared to the low-flow approach involving pumping until stabilization of field physicochemical parameters. Results show that methane concentrations obtained with the selected sampling techniques are usually similar and that there is no systematic bias related to a specific technique. Nonetheless, concentrations can sometimes vary quite significantly (up to 3.5 times) for a given well and sampling event. Methane isotopic composition obtained with all sampling techniques is very similar, except in some cases where sampling bags were used before pumping (no-purge approach), in wells where multiple groundwater sources enter the borehole.

  6. The AlSi10Mg samples produced by selective laser melting: single track, densification, microstructure and mechanical behavior

    NASA Astrophysics Data System (ADS)

    Wei, Pei; Wei, Zhengying; Chen, Zhen; Du, Jun; He, Yuyang; Li, Junfeng; Zhou, Yatong

    2017-06-01

The densification behavior and attendant microstructural characteristics of the selective laser melting (SLM)-processed AlSi10Mg alloy, as affected by the processing parameters, were systematically investigated. Single-track samples were produced by SLM to study the influence of laser power and scanning speed on the surface morphologies of scan tracks. Additionally, bulk samples were produced to investigate the influence of laser power, scanning speed, and hatch spacing on the densification level and the resultant microstructure. The experimental results showed that the level of porosity of the SLM-processed samples was significantly governed by the laser beam energy density and the hatch spacing. The tensile properties of SLM-processed samples and the attendant fracture surface can be enhanced by decreasing the level of porosity. The microstructure of SLM-processed samples consists of a supersaturated Al-rich cellular structure along with eutectic Al/Si situated at the cellular boundaries. The Si content in the cellular boundaries increases with increasing laser power and decreasing scanning speed. The hardness of SLM-processed samples was significantly improved by this fine microstructure compared with cast samples. Moreover, the hardness of SLM-processed samples at overlaps was lower than the hardness observed at track cores.

  7. A model of directional selection applied to the evolution of drug resistance in HIV-1.

    PubMed

    Seoighe, Cathal; Ketwaroo, Farahnaz; Pillay, Visva; Scheffler, Konrad; Wood, Natasha; Duffet, Rodger; Zvelebil, Marketa; Martinson, Neil; McIntyre, James; Morris, Lynn; Hide, Winston

    2007-04-01

    Understanding how pathogens acquire resistance to drugs is important for the design of treatment strategies, particularly for rapidly evolving viruses such as HIV-1. Drug treatment can exert strong selective pressures and sites within targeted genes that confer resistance frequently evolve far more rapidly than the neutral rate. Rapid evolution at sites that confer resistance to drugs can be used to help elucidate the mechanisms of evolution of drug resistance and to discover or corroborate novel resistance mutations. We have implemented standard maximum likelihood methods that are used to detect diversifying selection and adapted them for use with serially sampled reverse transcriptase (RT) coding sequences isolated from a group of 300 HIV-1 subtype C-infected women before and after single-dose nevirapine (sdNVP) to prevent mother-to-child transmission. We have also extended the standard models of codon evolution for application to the detection of directional selection. Through simulation, we show that the directional selection model can provide a substantial improvement in sensitivity over models of diversifying selection. Five of the sites within the RT gene that are known to harbor mutations that confer resistance to nevirapine (NVP) strongly supported the directional selection model. There was no evidence that other mutations that are known to confer NVP resistance were selected in this cohort. The directional selection model, applied to serially sampled sequences, also had more power than the diversifying selection model to detect selection resulting from factors other than drug resistance. Because inference of selection from serial samples is unlikely to be adversely affected by recombination, the methods we describe may have general applicability to the analysis of positive selection affecting recombining coding sequences when serially sampled data are available.

  8. Maximizing the reliability of genomic selection by optimizing the calibration set of reference individuals: comparison of methods in two diverse groups of maize inbreds (Zea mays L.).

    PubMed

    Rincent, R; Laloë, D; Nicolas, S; Altmann, T; Brunel, D; Revilla, P; Rodríguez, V M; Moreno-Gonzalez, J; Melchinger, A; Bauer, E; Schoen, C-C; Meyer, N; Giauffret, C; Bauland, C; Jamin, P; Laborde, J; Monod, H; Flament, P; Charcosset, A; Moreau, L

    2012-10-01

    Genomic selection refers to the use of genotypic information for predicting breeding values of selection candidates. A prediction formula is calibrated with the genotypes and phenotypes of reference individuals constituting the calibration set. The size and the composition of this set are essential parameters affecting the prediction reliabilities. The objective of this study was to maximize reliabilities by optimizing the calibration set. Different criteria based on the diversity or on the prediction error variance (PEV) derived from the realized additive relationship matrix-best linear unbiased prediction model (RA-BLUP) were used to select the reference individuals. For the latter, we considered the mean of the PEV of the contrasts between each selection candidate and the mean of the population (PEVmean) and the mean of the expected reliabilities of the same contrasts (CDmean). These criteria were tested with phenotypic data collected on two diversity panels of maize (Zea mays L.) genotyped with a 50k SNP array. In the two panels, samples chosen based on CDmean gave higher reliabilities than random samples for various calibration set sizes. CDmean also appeared superior to PEVmean, which can be explained by the fact that it takes into account the reduction of variance due to the relatedness between individuals. Selected samples were close to optimality for a wide range of trait heritabilities, which suggests that the strategy presented here can efficiently sample subsets in panels of inbred lines. A script to optimize reference samples based on CDmean is available on request.
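    The PEV/CD idea can be sketched numerically under a simple GBLUP-style model. The conditional-variance form of PEV used here, and the greedy search, are illustrative assumptions for exposition, not the authors' script:

```python
import numpy as np

def cd_mean(A, calib, h2=0.5):
    """Mean expected reliability (CD) of genomic predictions over all
    individuals, given a calibration set `calib` (row indices of A).

    Illustrative GBLUP-style formulas, with lambda = (1 - h2) / h2:
        PEV_i = sigma_u^2 * (A_ii - A_iS (A_SS + lambda*I)^-1 A_Si)
        CD_i  = 1 - PEV_i / (sigma_u^2 * A_ii)
    """
    lam = (1.0 - h2) / h2
    S = np.asarray(calib)
    M = np.linalg.inv(A[np.ix_(S, S)] + lam * np.eye(len(S)))
    cds = []
    for i in range(A.shape[0]):
        a_iS = A[i, S]
        pev = A[i, i] - a_iS @ M @ a_iS
        cds.append(1.0 - pev / A[i, i])
    return float(np.mean(cds))

def greedy_calibration(A, size, h2=0.5):
    """Greedily grow a calibration set that maximizes mean CD."""
    chosen, remaining = [], list(range(A.shape[0]))
    for _ in range(size):
        best = max(remaining, key=lambda j: cd_mean(A, chosen + [j], h2))
        chosen.append(best)
        remaining.remove(best)
    return chosen
```

    The greedy loop mirrors the abstract's point that optimized sets beat random ones: each step adds the individual whose inclusion most reduces the average prediction error variance of the contrasts.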

  9. Do Men and Women Report Their Sexual Partnerships Differently? Evidence from Kisumu, Kenya

    PubMed Central

    Clark, Shelley; Kabiru, Caroline; Zulu, Eliya

    2012-01-01

    CONTEXT It is generally believed that men and women misreport their sexual behaviors, which undermines the ability of researchers, program designers and health care providers to assess whether these behaviors compromise individuals’ sexual and reproductive health. METHODS Data on 1,299 recent sexual partnerships were collected in a 2007 survey of 1,275 men and women aged 18–24 and living in Kisumu, Kenya. Chi-square and t tests were used to examine how sample selection bias and selective partnership reporting may result in gender differences in reported sexual behaviors. Correlation coefficients and kappa statistics were calculated in further analysis of a sample of 280 matched marital and nonmarital couples to assess agreement on reported behaviors. RESULTS Even after adjustment for sample selection bias, men reported twice as many partnerships as women (0.5 vs. 0.2), as well as more casual partnerships. However, when selective reporting was controlled for, aggregate gender differences in sexual behaviors almost entirely disappeared. In the matched-couples sample, men and women exhibited moderate to substantial levels of agreement for most relationship characteristics and behaviors, including type of relationship, frequency of sex and condom use. Finally, men and women tended to agree about whether men had other nonmarital partners, but disagreed about women’s nonmarital partners. CONCLUSIONS Both sample selection bias and selective partnership reporting can influence the level of agreement between men’s and women’s reports of sexual behaviors. Although men report more casual partners than do women, accounts of sexual behavior within reported relationships are generally reliable. PMID:22227625
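    The couple-level agreement statistics reported above can be illustrated with a minimal implementation of Cohen's kappa, which corrects raw agreement for chance. This is a sketch; the study's exact computation and any weighting are not specified here:

```python
def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two paired raters (here: the two partners'
    reports of the same relationship characteristic).
    kappa = (observed agreement - chance agreement) / (1 - chance agreement)
    """
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    categories = sorted(set(ratings_a) | set(ratings_b))
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement: product of each rater's marginal category rates.
    expected = sum(
        (ratings_a.count(c) / n) * (ratings_b.count(c) / n)
        for c in categories
    )
    return (observed - expected) / (1 - expected)
```

    Perfect agreement yields kappa = 1, chance-level agreement yields 0, and systematic disagreement yields negative values; "moderate to substantial" agreement conventionally corresponds to roughly 0.4-0.8.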

  10. The ATLAS3D project - I. A volume-limited sample of 260 nearby early-type galaxies: science goals and selection criteria

    NASA Astrophysics Data System (ADS)

    Cappellari, Michele; Emsellem, Eric; Krajnović, Davor; McDermid, Richard M.; Scott, Nicholas; Verdoes Kleijn, G. A.; Young, Lisa M.; Alatalo, Katherine; Bacon, R.; Blitz, Leo; Bois, Maxime; Bournaud, Frédéric; Bureau, M.; Davies, Roger L.; Davis, Timothy A.; de Zeeuw, P. T.; Duc, Pierre-Alain; Khochfar, Sadegh; Kuntschner, Harald; Lablanche, Pierre-Yves; Morganti, Raffaella; Naab, Thorsten; Oosterloo, Tom; Sarzi, Marc; Serra, Paolo; Weijmans, Anne-Marie

    2011-05-01

    The ATLAS3D project is a multiwavelength survey combined with a theoretical modelling effort. The observations span from the radio to the millimetre and optical, and provide multicolour imaging, two-dimensional kinematics of the atomic (H I), molecular (CO) and ionized gas (Hβ, [O III] and [N I]), together with the kinematics and population of the stars (Hβ, Fe5015 and Mg b), for a carefully selected, volume-limited (1.16 × 10^5 Mpc^3) sample of 260 early-type (elliptical E and lenticular S0) galaxies (ETGs). The models include semi-analytic, N-body binary mergers and cosmological simulations of galaxy formation. Here we present the science goals for the project and introduce the galaxy sample and the selection criteria. The sample consists of nearby (D < 42 Mpc, |δ − 29°| < 35°, |b| > 15°) morphologically selected ETGs extracted from a parent sample of 871 galaxies (8 per cent E, 22 per cent S0 and 70 per cent spirals) brighter than M_K < -21.5 mag (stellar mass M★ ≳ 6 × 10^9 M⊙). We analyse possible selection biases and we conclude that the parent sample is essentially complete and statistically representative of the nearby galaxy population. We present the size-luminosity relation for the spirals and ETGs and show that the ETGs in the ATLAS3D sample define a tight red sequence in a colour-magnitude diagram, with few objects in the transition from the blue cloud. We describe the strategy of the SAURON integral field observations and the extraction of the stellar kinematics with the pPXF method. We find typical 1σ errors of ΔV ≈ 6 km s^-1, Δσ ≈ 7 km s^-1, Δh3 ≈ Δh4 ≈ 0.03 in the mean velocity, the velocity dispersion and Gauss-Hermite (GH) moments for galaxies with effective dispersion σe ≳ 120 km s^-1. For galaxies with lower σe (≈ 40 per cent of the sample) the GH moments are gradually penalized by pPXF towards zero to suppress the noise produced by the spectral undersampling and only V and σ can be measured. We give an overview of the characteristics of the other main data sets already available for our sample and of the ongoing modelling projects.

  11. Final report : sampling plan for pavement condition ratings of secondary roads.

    DOT National Transportation Integrated Search

    1984-01-01

    The purpose of this project was to develop a random sampling plan for use in selecting segments of the secondary highway system for evaluation under the Department's PMS. The plan developed is described here. It is a simple, workable, random sampling...

  12. 40 CFR 761.243 - Standard wipe sample method and size.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., AND USE PROHIBITIONS Determining a PCB Concentration for Purposes of Abandonment or Disposal of Natural Gas Pipeline: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe.../Rinse Cleanup as Recommended by the Environmental Protection Agency PCB Spill Cleanup Policy,” dated...

  13. Size-selective separation of polydisperse gold nanoparticles in supercritical ethane.

    PubMed

    Williams, Dylan P; Satherley, John

    2009-04-09

    The aim of this study was to use supercritical ethane to selectively disperse alkanethiol-stabilized gold nanoparticles of one size from a polydisperse sample in order to recover a monodisperse fraction of the nanoparticles. A disperse sample of metal nanoparticles with diameters in the range of 1-5 nm was prepared using established techniques and then further purified by Soxhlet extraction. The purified sample was subjected to supercritical ethane at a temperature of 318 K in the pressure range 50-276 bar. Particles were characterized by UV-vis absorption spectroscopy, TEM, and MALDI-TOF mass spectrometry. The results show that with increasing pressure the dispersibility of the nanoparticles increases; this effect is most pronounced for smaller nanoparticles. At the highest pressure investigated, a sample of the particles was effectively stripped of all the smaller particles, leaving a monodisperse sample. The relationship between dispersibility and supercritical fluid density for two different size samples of alkanethiol-stabilized gold nanoparticles was considered using the Chrastil chemical equilibrium model.

  14. Determination of trichloroanisole and trichlorophenol in wineries' ambient air by passive sampling and thermal desorption-gas chromatography coupled to tandem mass spectrometry.

    PubMed

    Camino-Sánchez, F J; Bermúdez-Peinado, R; Zafra-Gómez, A; Ruíz-García, J; Vílchez-Quero, J L

    2015-02-06

    The present paper describes the calibration of selected passive samplers used in the quantitation of trichlorophenol and trichloroanisole in wineries' ambient air, by calculating the corresponding sampling rates. The method is based on passive sampling with sorbent tubes and involves thermal desorption-gas chromatography-triple quadrupole mass spectrometry analysis. Three commercially available sorbents were tested using sampling cartridges with a radial design instead of axial ones. The best results were found for Tenax TA™. Sampling rates (R-values) for the selected sorbents were determined. Passive sampling was also used for accurately determining the amount of compounds present in the air. Adequate correlation coefficients between the mass of the target analytes and exposure time were obtained. The proposed validated method is a useful tool for the early detection of trichloroanisole and its precursor trichlorophenol in wineries' ambient air while avoiding contamination of wine or winery facilities. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. Imaging a Large Sample with Selective Plane Illumination Microscopy Based on Multiple Fluorescent Microsphere Tracking

    NASA Astrophysics Data System (ADS)

    Ryu, Inkeon; Kim, Daekeun

    2018-04-01

    A typical selective plane illumination microscopy (SPIM) image size is basically limited by the field of view, which is a characteristic of the objective lens. If an image larger than the imaging area of the sample is to be obtained, image stitching, which combines step-scanned images into a single panoramic image, is required. However, accurately registering the step-scanned images is very difficult because the SPIM system uses a customized sample mount where uncertainties for the translational and the rotational motions exist. In this paper, an image registration technique based on multiple fluorescent microsphere tracking is proposed in the view of quantifying the constellations and measuring the distances between at least two fluorescent microspheres embedded in the sample. Image stitching results are demonstrated for optically cleared large tissue with various staining methods. Compensation for the effect of the sample rotation that occurs during the translational motion in the sample mount is also discussed.

  16. Automatic HTS force measurement instrument

    DOEpatents

    Sanders, Scott T.; Niemann, Ralph C.

    1999-01-01

    A device for measuring the levitation force of a high temperature superconductor sample with respect to a reference magnet includes a receptacle for holding several high temperature superconductor samples, each cooled to superconducting temperature. A rotatable carousel successively locates a selected one of the high temperature superconductor samples in registry with the reference magnet. A mechanism varies the distance between one of the high temperature superconductor samples and the reference magnet, and a sensor measures levitation force of the sample as a function of the distance between the reference magnet and the sample. A method is also disclosed.

  17. Enhanced monitor system for water protection

    DOEpatents

    Hill, David E [Knoxville, TN; Rodriquez, Jr., Miguel [Oak Ridge, TN; Greenbaum, Elias [Knoxville, TN

    2009-09-22

    An automatic, self-contained device for detecting toxic agents in a water supply includes an analyzer for detecting at least one toxic agent in a water sample, a means for introducing a water sample into the analyzer and discharging the water sample from the analyzer, a holding means for holding a water sample for a pre-selected period of time before the water sample is introduced into the analyzer, and an electronics package that analyzes raw data from the analyzer and emits a signal indicating the presence of at least one toxic agent in the water sample.

  18. Prospecting by sampling and analysis of airborne particulates and gases

    DOEpatents

    Sehmel, G.A.

    1984-05-01

    A method is claimed for prospecting by sampling airborne particulates or gases at a ground position and recording wind direction values at the time of sampling. The samples are subsequently analyzed to determine the concentrations of a desired material or the ratios of the desired material to other identifiable materials in the collected samples. By comparing the measured concentrations or ratios to expected background data in the vicinity sampled, one can select recorded wind directions indicative of the upwind position of the land-based source of the desired material.

  19. Radionuclides, inorganic constituents, organic compounds, and bacteria in water from selected wells and springs from the southern boundary of the Idaho National Engineering Laboratory to the Hagerman Area, Idaho, 1991

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartholomay, R.C.; Edwards, D.D.; Campbell, L.J.

    1993-11-01

    The US Geological Survey and the Idaho Department of Water Resources, in response to a request from the US Department of Energy, sampled 18 sites as part of a long-term project to monitor water quality of the Snake River Plain aquifer from the southern boundary of the Idaho National Engineering Laboratory to the Hagerman area. Water samples were collected and analyzed for manmade pollutants and naturally occurring constituents. The samples were collected from six irrigation wells, seven domestic wells, two springs, one stock well, one dairy well, and one observation well. Quality assurance samples also were collected and analyzed. The water samples were analyzed for selected radionuclides, inorganic constituents, organic compounds, and bacteria. None of the samples analyzed for radionuclides, inorganic constituents, or organic compounds exceeded the established maximum contaminant levels for drinking water. Most of the radionuclide and inorganic constituent concentrations exceeded their respective reporting levels. All the samples analyzed for dissolved organic carbon had concentrations that exceeded their reporting level. Concentrations of 1,1,1-trichloroethane exceeded the reporting level in two water samples. Two samples and a quality assurance replicate contained reportable concentrations of 2,4-D. One sample contained fecal coliform bacteria counts that exceeded established maximum contaminant levels for drinking water.

  20. Measurement Error Calibration in Mixed-Mode Sample Surveys

    ERIC Educational Resources Information Center

    Buelens, Bart; van den Brakel, Jan A.

    2015-01-01

    Mixed-mode surveys are known to be susceptible to mode-dependent selection and measurement effects, collectively referred to as mode effects. The use of different data collection modes within the same survey may reduce selectivity of the overall response but is characterized by measurement errors differing across modes. Inference in sample surveys…

  1. Validity and Reliability of Psychosocial Factors Related to Breast Cancer Screening.

    ERIC Educational Resources Information Center

    Zapka, Jane G.; And Others

    1991-01-01

    The construct validity of hypothesized survey items and data reduction procedures for selected psychosocial constructs frequently used in breast cancer screening research were investigated in telephone interviews with randomly selected samples of 1,184 and 903 women and a sample of 169 Hispanic clinic clients. Validity of the constructs is…

  2. 1977 Survey of the American Professoriate. Technical Report.

    ERIC Educational Resources Information Center

    Ladd, Everett Carll, Jr.; And Others

    The development and data validation of the 1977 Ladd-Lipset national survey of the American professoriate are described. The respondents were selected from a random sample of colleges and universities and from a random sample of individual faculty members from the universities. The 158 institutions in the 1977 survey were selected from 2,406…

  3. On using sample selection methods in estimating the price elasticity of firms' demand for insurance.

    PubMed

    Marquis, M Susan; Louis, Thomas A

    2002-01-01

    We evaluate a technique based on sample selection models that has been used by health economists to estimate the price elasticity of firms' demand for insurance. We demonstrate that this technique produces inflated estimates of the price elasticity. We show that alternative methods lead to valid estimates.

  4. Over-Selectivity as a Learned Response

    ERIC Educational Resources Information Center

    Reed, Phil; Petrina, Neysa; McHugh, Louise

    2011-01-01

    An experiment investigated the effects of different levels of task complexity in pre-training on over-selectivity in a subsequent match-to-sample (MTS) task. Twenty human participants were divided into two groups; exposed either to a 3-element, or a 9-element, compound stimulus as a sample during MTS training. After the completion of training,…

  5. Ecological Condition of Streams in Eastern and Southern Nevada: EPA R-EMAP Muddy-Virgin River Project

    EPA Science Inventory

    The report presents data collected during a one-year study period beginning in May of 2000. Sampling sites were selected using a probability-based design (as opposed to subjectively selected sites) based on the USEPA River Reach File version 3 (RF3). About 37 sites were sampled. ...

  6. Attenders versus Slackers: A Classroom Demonstration of Quasi-Experimentation and Self-Selecting Samples

    ERIC Educational Resources Information Center

    Stellmack, Mark A.

    2013-01-01

    Studies of the effects of class attendance on class performance typically are quasi-experimental because students choose whether or not to attend class; that is, the samples are self-selecting. The lack of random assignment prevents one from establishing a causal relationship between attendance and performance. Relating attendance to performance…

  7. A Mixed Methods Sampling Methodology for a Multisite Case Study

    ERIC Educational Resources Information Center

    Sharp, Julia L.; Mobley, Catherine; Hammond, Cathy; Withington, Cairen; Drew, Sam; Stringfield, Sam; Stipanovic, Natalie

    2012-01-01

    The flexibility of mixed methods research strategies makes such approaches especially suitable for multisite case studies. Yet the utilization of mixed methods to select sites for these studies is rarely reported. The authors describe their pragmatic mixed methods approach to select a sample for their multisite mixed methods case study of a…

  8. A Review of Selected Engineered Nanoparticles in the Atmosphere: Sources, Transformations, and Techniques for Sampling and Analysis

    EPA Science Inventory

    A state-of-the-science review was undertaken to identify and assess sampling and analysis methods to detect and quantify selected nanomaterials (NMs) in the ambient atmosphere. The review is restricted to five types of NMs of interest to the Office of Research and Development Nan...

  9. Microgravity Testing of a Surface Sampling System for Sample Return from Small Solar System Bodies

    NASA Technical Reports Server (NTRS)

    Franzen, M. A.; Preble, J.; Schoenoff, M.; Halona, K.; Long, T. E.; Park, T.; Sears, D. W. G.

    2004-01-01

    The return of samples from solar system bodies is becoming an essential element of solar system exploration. The recent National Research Council Solar System Exploration Decadal Survey identified six sample return missions as high priority missions: South-Aitken Basin Sample Return, Comet Surface Sample Return, Comet Surface Sample Return-sample from selected surface sites, Asteroid Lander/Rover/Sample Return, Comet Nucleus Sample Return-cold samples from depth, and Mars Sample Return [1], and the NASA Roadmap also includes sample return missions [2]. Sample collection methods that have been flown on robotic spacecraft to date return subgram quantities, but many scientific issues (like bulk composition, particle size distributions, petrology, chronology) require tens to hundreds of grams of sample. Many complex sample collection devices have been proposed; however, small robotic missions require simplicity. We present here the results of experiments done with a simple but innovative collection system for sample return from small solar system bodies.

  10. Improving the collection of knowledge, attitude and practice data with community surveys: a comparison of two second-stage sampling methods.

    PubMed

    Davis, Rosemary H; Valadez, Joseph J

    2014-12-01

    Second-stage sampling techniques, including spatial segmentation, are widely used in community health surveys when reliable household sampling frames are not available. In India, an unresearched technique for household selection is used in eight states, which samples the house with the last marriage or birth as the starting point. Users question whether this last-birth or last-marriage (LBLM) approach introduces bias affecting survey results. We conducted two simultaneous population-based surveys. One used segmentation sampling; the other used LBLM. LBLM sampling required modification before assessment was possible, and a more systematic approach was tested using last birth only. We compared coverage proportions produced by the two independent samples for six malaria indicators and demographic variables (education, wealth and caste). We then measured the level of agreement between the caste of the selected participant and the caste of the health worker making the selection. No significant difference between methods was found for the point estimates of six malaria indicators, education, caste or wealth of the survey participants (range of P: 0.06 to >0.99). A poor level of agreement occurred between the caste of the health worker used in household selection and the caste of the final participant (κ = 0.185), revealing little association between the two, and thereby indicating that caste was not a source of bias. Although LBLM was not testable, a systematic last-birth approach was tested. If documented concerns of last-birth sampling are addressed, this new method could offer an acceptable alternative to segmentation in India. However, inter-state caste variation could affect this result. Therefore, additional assessment of last birth is required before wider implementation is recommended. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2013; all rights reserved.

  11. EVIDENCE FOR THE UNIVERSALITY OF PROPERTIES OF RED-SEQUENCE GALAXIES IN X-RAY- AND RED-SEQUENCE-SELECTED CLUSTERS AT z ∼ 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foltz, R.; Wilson, G.; DeGroot, A.

    We study the slope, intercept, and scatter of the color–magnitude and color–mass relations for a sample of 10 infrared red-sequence-selected clusters at z ∼ 1. The quiescent galaxies in these clusters formed the bulk of their stars above z ≳ 3 with an age spread Δt ≳ 1 Gyr. We compare UVJ color–color and spectroscopic-based galaxy selection techniques, and find a 15% difference in the galaxy populations classified as quiescent by these methods. We compare the color–magnitude relations from our red-sequence selected sample with X-ray- and photometric-redshift-selected cluster samples of similar mass and redshift. Within uncertainties, we are unable to detect any difference in the ages and star formation histories of quiescent cluster members in clusters selected by different methods, suggesting that the dominant quenching mechanism is insensitive to cluster baryon partitioning at z ∼ 1.

  12. SamSelect: a sample sequence selection algorithm for quorum planted motif search on large DNA datasets.

    PubMed

    Yu, Qiang; Wei, Dingbang; Huo, Hongwei

    2018-06-18

    Given a set of t n-length DNA sequences, q satisfying 0 < q ≤ 1, and l and d satisfying 0 ≤ d < l < n, the quorum planted motif search (qPMS) finds l-length strings that occur in at least qt input sequences with up to d mismatches and is mainly used to locate transcription factor binding sites in DNA sequences. Existing qPMS algorithms have been able to efficiently process small standard datasets (e.g., t = 20 and n = 600), but they are too time consuming to process large DNA datasets, such as ChIP-seq datasets that contain thousands of sequences or more. We analyze the effects of t and q on the time performance of qPMS algorithms and find that a large t or a small q causes a longer computation time. Based on this information, we improve the time performance of existing qPMS algorithms by selecting a sample sequence set D' with a small t and a large q from the large input dataset D and then executing qPMS algorithms on D'. A sample sequence selection algorithm named SamSelect is proposed. The experimental results on both simulated and real data show (1) that SamSelect can select D' efficiently and (2) that the qPMS algorithms executed on D' can find implanted or real motifs in a significantly shorter time than when executed on D. We improve the ability of existing qPMS algorithms to process large DNA datasets from the perspective of selecting high-quality sample sequence sets so that the qPMS algorithms can find motifs in a short time in the selected sample sequence set D', rather than take an unfeasibly long time to search the original sequence set D. Our motif discovery method is an approximate algorithm.
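    The quorum condition at the heart of qPMS, as stated above, can be sketched directly. This is the condition a candidate motif must satisfy (with illustrative helper names), not the SamSelect sampling algorithm itself:

```python
import math

def hamming(a, b):
    """Number of mismatching positions between two equal-length strings."""
    return sum(x != y for x, y in zip(a, b))

def occurs_with_mismatches(seq, lmer, d):
    """True if lmer occurs in seq at some offset with at most d mismatches."""
    l = len(lmer)
    return any(hamming(seq[i:i + l], lmer) <= d
               for i in range(len(seq) - l + 1))

def is_quorum_motif(lmer, sequences, q, d):
    """The qPMS quorum condition: lmer occurs (with <= d mismatches)
    in at least ceil(q * t) of the t input sequences."""
    t = len(sequences)
    hits = sum(occurs_with_mismatches(s, lmer, d) for s in sequences)
    return hits >= math.ceil(q * t)
```

    The cost of enumerating candidate l-mers against this condition grows with t, which is why running the search on a small, high-quality sample set D' (as SamSelect selects) rather than the full dataset D pays off.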

  13. VizieR Online Data Catalog: Quasar luminosity function (Hawkins+, 1993)

    NASA Astrophysics Data System (ADS)

    Hawkins, M. R. S.; Veron, P.

    1994-11-01

    A sample of quasars is selected from a 10-yr sequence of 30 UK Schmidt plates. Luminosity functions are derived in several redshift intervals, which in each case show a featureless power-law rise towards low luminosities. There is no sign of the 'break' found in the recent UVX sample of Boyle, Shanks & Peterson. It is suggested that reasons for the disagreement are connected with biases in the selection of the UVX sample. The question of the nature of quasar evolution appears to be still unresolved. (1 data file).

  14. Analysis of native water, bed material, and elutriate samples of major Louisiana waterways, 1975

    USGS Publications Warehouse

    Demas, Charles R.

    1976-01-01

    The U.S. Geological Survey, in cooperation with the U.S. Army Corps of Engineers, conducted a series of elutriate studies in selected reaches of major navigable waterways of Louisiana. As defined by the U.S. Environmental Protection Agency, an elutriate is the supernatant resulting from the vigorous 30-minute shaking of one part bottom sediment from the dredging site with four parts water (vol/vol) collected from the dredging site followed by one hour settling time and appropriate centrifugation and a 0.45-micron filtration. The elutriate studies were initiated to evaluate possible environmental effects of proposed dredging activities in selected reaches of Louisiana waterways. The waterways investigated were the Mississippi River-Gulf Outlet, Breton Sound, Mississippi River downstream from Baton Rouge, Bayou Long, Intracoastal Waterway (east and west of the Harvey Canal), Three Rivers area, Ouachita River, Barataria Bay, Houma Navigation Canal, Atchafalaya Bay (Ship Channel), Berwick Bay, Intracoastal Waterway (Port Allen to Morgan City), Petite Anse area, and Calcasieu River and Ship Channel. The Geological Survey collected 227 samples of native water and bed (bottom) material from 130 different sites. These samples (as well as elutriates prepared from mixtures of native water and bed material) were analyzed for selected metal, pesticide, nutrient, and organic constituents. An additional 116 bed samples collected at 58 sites were analyzed for selected pesticides; and 4 additional native-water samples from 2 sites were analyzed for selected metal pesticide, nutrient, and organic constituents. (Woodard-USGS)

  15. Identification, genetic localization, and allelic diversity of selectively amplified microsatellite polymorphic loci in lettuce and wild relatives (Lactuca spp.).

    PubMed

    Witsenboer, H; Michelmore, R W; Vogel, J

    1997-12-01

    Selectively amplified microsatellite polymorphic locus (SAMPL) analysis is a method of amplifying microsatellite loci using generic PCR primers. SAMPL analysis uses one AFLP primer in combination with a primer complementary to microsatellite sequences. SAMPL primers based on compound microsatellite sequences provided the clearest amplification patterns. We explored the potential of SAMPL analysis in lettuce to detect PCR-based codominant microsatellite markers. Fifty-eight SAMPLs were identified and placed on the genetic map. Seventeen were codominant. SAMPLs were dispersed with RFLP markers on 11 of the 12 main linkage groups in lettuce, indicating that they have a similar genomic distribution. Some but not all fragments amplified by SAMPL analysis were confirmed to contain microsatellite sequences by Southern hybridization. Forty-five cultivars of lettuce and five wild species of Lactuca were analyzed to determine the allelic diversity for codominant SAMPLs. From 3 to 11 putative alleles were found for each SAMPL; 2-6 alleles were found within Lactuca sativa and 1-3 alleles were found among the crisphead genotypes, the most genetically homogeneous plant type of L. sativa. This allelic diversity is greater than that found for RFLP markers. Numerous new alleles were observed in the wild species; however, there were frequent null alleles. Therefore, SAMPL analysis is more applicable to intraspecific than to interspecific comparisons. A phenetic analysis based on SAMPLs resulted in a dendrogram similar to those based on RFLP and AFLP markers.

  16. Selectivity in analytical chemistry: two interpretations for univariate methods.

    PubMed

    Dorkó, Zsanett; Verbić, Tatjana; Horvai, George

    2015-01-01

    Selectivity is extremely important in analytical chemistry but its definition is elusive despite continued efforts by professional organizations and individual scientists. This paper shows that the existing selectivity concepts for univariate analytical methods broadly fall in two classes: selectivity concepts based on measurement error and concepts based on response surfaces (the response surface being the 3D plot of the univariate signal as a function of analyte and interferent concentration, respectively). The strengths and weaknesses of the different definitions are analyzed and contradictions between them unveiled. The error-based selectivity is very general and very safe but its application to a range of samples (as opposed to a single sample) requires the knowledge of some constraint about the possible sample compositions. The selectivity concepts based on the response surface are easily applied to linear response surfaces but may lead to difficulties and counterintuitive results when applied to nonlinear response surfaces. A particular advantage of this class of selectivity is that with linear response surfaces it can provide a concentration-independent measure of selectivity. In contrast, the error-based selectivity concept allows only a yes/no decision about selectivity. Copyright © 2014 Elsevier B.V. All rights reserved.
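    For the linear response-surface case, a concentration-independent selectivity measure can be sketched as the ratio of fitted sensitivities. The linear model and the slope-ratio construction below are an illustrative reading of the response-surface view, not the paper's exact definition:

```python
import numpy as np

def sensitivity_slopes(c_analyte, c_interferent, signal):
    """Least-squares fit of a linear response surface
        signal = a * c_analyte + b * c_interferent + intercept,
    returning the analyte and interferent sensitivities (a, b)."""
    X = np.column_stack([c_analyte, c_interferent, np.ones(len(signal))])
    coef, *_ = np.linalg.lstsq(X, signal, rcond=None)
    return coef[0], coef[1]
```

    Because the surface is linear, the ratio a/b is the same at every concentration, which is exactly the concentration-independence advantage noted above; for a nonlinear surface the local slopes, and hence any such ratio, vary across the surface.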

  17. Representativeness-based sampling network design for the State of Alaska

    Treesearch

    Forrest M. Hoffman; Jitendra Kumar; Richard T. Mills; William W. Hargrove

    2013-01-01

    Resource and logistical constraints limit the frequency and extent of environmental observations, particularly in the Arctic, necessitating the development of a systematic sampling strategy to maximize coverage and objectively represent environmental variability at desired scales. A quantitative methodology for stratifying sampling domains, informing site selection,...

  18. 40 CFR 761.240 - Scope and definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... PROHIBITIONS Determining a PCB Concentration for Purposes of Abandonment or Disposal of Natural Gas Pipeline: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe Samples § 761.240 Scope... determine its PCB surface concentration for abandonment-in-place or removal and disposal off-site in...

  19. 7 CFR 43.103 - Purpose and scope.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... SAMPLING PLANS Sampling Plans § 43.103 Purpose and scope. (a) This subpart contains selected single and double sampling plans for inspection by attributes. They are to serve as a source of plans for developing...

  20. Electrostatic discharge test apparatus

    NASA Technical Reports Server (NTRS)

    Smith, William Conrad (Inventor)

    1988-01-01

    Electrostatic discharge properties of materials are quantitatively measured and ranked. Samples are rotated on a turntable beneath selectable, co-available electrostatic chargers, one being a corona charging element and the other a sample-engaging triboelectric charging element. Samples then pass under a voltage meter to measure the amount of residual charge on the samples. After charging is discontinued, measurements are continued to record the charge decay history over time.

  1. Determining the fraction of reddened quasars in COSMOS with multiple selection techniques from X-ray to radio wavelengths

    NASA Astrophysics Data System (ADS)

    Heintz, K. E.; Fynbo, J. P. U.; Møller, P.; Milvang-Jensen, B.; Zabl, J.; Maddox, N.; Krogager, J.-K.; Geier, S.; Vestergaard, M.; Noterdaeme, P.; Ledoux, C.

    2016-10-01

The sub-population of quasars reddened by intrinsic or intervening clouds of dust is known to be underrepresented in optical quasar surveys. By defining a complete parent sample of the brightest and spatially unresolved quasars in the COSMOS field, we quantify the extent to which this sub-population is fundamental to our understanding of the true population of quasars. Using the available multiwavelength data of various surveys in the COSMOS field, we built a parent sample of 33 quasars brighter than J = 20 mag, identified by reliable X-ray to radio wavelength selection techniques. Spectroscopic follow-up with the NOT/ALFOSC was carried out for four candidate quasars that had not been targeted previously, to obtain 100% redshift completeness of the sample. The population of high AV quasars (HAQs), a specific sub-population of quasars selected from optical/near-infrared photometry, some of which were shown to be missed in large optical surveys such as SDSS, is found to contribute 21 (+9/-5)% of the parent sample. The full population of bright spatially unresolved quasars represented by our parent sample consists of 39 (+9/-8)% reddened quasars, defined as having AV > 0.1, with 21 (+9/-5)% of the sample having E(B-V) > 0.1 assuming the extinction curve of the Small Magellanic Cloud. We show that the HAQ selection works well for selecting reddened quasars, but some are missed because their optical spectra are too blue to pass the g-r color cut in the HAQ selection, due either to a low degree of dust reddening or to anomalous spectra. We find that the fraction of quasars with contributing light from the host galaxy, causing observed extended spatial morphology, is most dominant at z ≲ 1. At higher redshifts the population of spatially unresolved quasars selected by our parent sample is found to be representative of the full population of bright active galactic nuclei at J < 20 mag.
This work quantifies the bias against reddened quasars in studies that are based solely on optical surveys. Partly based on observations made with the Nordic Optical Telescope, operated by the Nordic Optical Telescope Scientific Association at the Observatorio del Roque de los Muchachos, La Palma, Spain, of the Instituto de Astrofisica de Canarias.

  2. Signal enhancement using a switchable magnetic trap

    DOEpatents

    Beer, Neil Reginald [Pleasanton, CA

    2012-05-29

A system for analyzing a sample includes providing a microchannel flow channel; associating the sample with magnetic nanoparticles or magnetic polystyrene-coated beads; moving the sample with the magnetic nanoparticles or magnetic polystyrene-coated beads in the microchannel flow channel; holding the sample with the magnetic nanoparticles or magnetic polystyrene-coated beads in a magnetic trap in the microchannel flow channel; and analyzing the sample to obtain an enhanced analysis signal. An apparatus for analysis of a sample includes magnetic particles connected to the sample, a microchip, a flow channel in the microchip, a source of carrier fluid connected to the flow channel for moving the sample in the flow channel, an electromagnet trap connected to the flow channel for selectively magnetically trapping the sample and the magnetic particles, and an analyzer for analyzing the sample.

  3. Variable selection based cotton bollworm odor spectroscopic detection

    NASA Astrophysics Data System (ADS)

    Lü, Chengxu; Gai, Shasha; Luo, Min; Zhao, Bo

    2016-10-01

Aiming at rapid, automatic pest detection for efficient, targeted pesticide application, and to avoid the problem of reflectance spectral signals being masked and attenuated by the plant itself, the feasibility of near-infrared spectroscopy (NIRS) detection of cotton bollworm odor was studied. Three cotton bollworm odor samples and 3 blank air samples were prepared. Different concentrations of cotton bollworm odor were prepared by mixing the above gas samples, yielding a calibration group of 62 samples and a validation group of 31 samples. The spectral collection system comprised a light source, optical fiber, sample chamber, and spectrometer. Spectra were pretreated by baseline correction, modeled with partial least squares (PLS), and optimized by genetic algorithm (GA) and competitive adaptive reweighted sampling (CARS) variable selection. Only minor differences in counts were found among spectra of different cotton bollworm odor concentrations. A PLS model built on all variables gave an RMSEV of 14 and an RV2 of 0.89; the physical basis is that insects volatilize specific odors, including pheromones and allelochemicals used for intra-specific and inter-specific communication, which can be detected by NIR spectroscopy. GA selected 28 sensitive variables, giving a model with an RMSEV of 14 and an RV2 of 0.90. By comparison, CARS selected 8 sensitive variables, giving an RMSEV of 13 and an RV2 of 0.92. The CARS model employs only 1.5% of the variables yet yields a smaller error than the all-variable model. The odor-based NIR technique thus shows potential for cotton bollworm detection.
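
    The pipeline described above (PLS on all wavelength variables, then a coefficient-driven variable reduction and refit) can be sketched as follows. This is a simplified stand-in on synthetic data, not the authors' spectra or their exact GA/CARS algorithms; the array sizes, the informative-band positions, and the 8-variable cut are illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.metrics import mean_squared_error

    rng = np.random.default_rng(0)

    # Synthetic stand-in for NIR odor spectra: 93 samples x 200 variables,
    # with a few informative "bands" driving the concentration response.
    n, p = 93, 200
    X = rng.normal(size=(n, p))
    informative = [20, 21, 50, 120]
    y = 3.0 * X[:, informative].sum(axis=1) + 0.1 * rng.normal(size=n)

    # Full-variable PLS baseline (cf. the paper's all-variable model).
    pls = PLSRegression(n_components=5).fit(X, y)
    rmse_full = mean_squared_error(y, pls.predict(X)) ** 0.5

    # Simplified CARS-like reduction: keep the variables with the largest
    # absolute PLS regression coefficients, then refit on the reduced set.
    coef = np.abs(pls.coef_).ravel()
    keep = np.argsort(coef)[-8:]          # retain 8 "sensitive" variables
    pls_sel = PLSRegression(n_components=4).fit(X[:, keep], y)
    rmse_sel = mean_squared_error(y, pls_sel.predict(X[:, keep])) ** 0.5
    ```

    Real CARS additionally uses Monte Carlo sampling and an exponentially shrinking variable count over many iterations; the single coefficient-ranking step here conveys only the core idea of competitive variable retention.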

  4. Exposure of jeepney drivers in Manila, Philippines, to selected volatile organic compounds (VOCs).

    PubMed

    Balanay, Jo Anne G; Lungu, Claudiu T

    2009-01-01

    The objective of this study was to assess the occupational exposure of jeepney drivers to selected volatile organic compounds (VOCs) in Manila, Philippines. Personal sampling was conducted on 15 jeepney drivers. Area sampling was conducted to determine the background VOC concentration in Manila as compared to that in a rural area. Both personal and area samples were collected for 5 working days. Samples were obtained using diffusive samplers and were analyzed for 6 VOCs (benzene, toluene, ethylbenzene, m,p-xylene and o-xylene) using gas chromatography. Results showed that the average personal exposure concentration of jeepney drivers was 55.6 (+/-9.3), 196.6 (+/-75.0), 17.9 (+/-9.0), 72.5 (+/-21.1) and 88.5 (+/-26.5) microg/m(3) for benzene, toluene, ethylbenzene, m,p-xylene and o-xylene, respectively. The urban ambient concentration was 11.8 (+/-2.2), 83.7 (+/-40.5) and 38.0 (+/-12.1) microg/m(3) for benzene, toluene and o-xylene, respectively. The rural ambient concentration was 14.0 (+/-6.0) and 24.7 (+/-11.9) microg/m(3) for toluene and o-xylene, respectively. The personal samples had significantly higher (p<0.05) concentrations for all selected VOCs than the urban area samples. Among the area samples, the urban concentrations of benzene and toluene were significantly higher (p<0.05) than the rural concentrations. The personal exposures for all the target VOCs were not significantly different among the jeepney drivers.

  5. A medium-bright quasar sample - New quasar surface densities in the magnitude range from 16.4 to 17.65 for B

    NASA Technical Reports Server (NTRS)

    Mitchell, K. J.; Warnock, A., III; Usher, P. D.

    1984-01-01

    A new medium-bright quasar sample (MBQS) is constructed from spectroscopic observations of 140 bright objects selected for varying degrees of blue and ultraviolet excess (B-UVX) in five Palomar 1.2 m Schmidt fields. The MBQS contains 32 quasars with B less than 17.65 mag. The new integral surface densities in the B range from 16.45 to 17.65 mag are approximately 40 percent (or more) higher than expected. The MBQS and its redshift distribution increase the area of the Hubble diagram covered by complete samples of quasars. The general spectroscopic results indicate that the three-color classification process used to catalog the spectroscopic candidates (1) has efficiently separated the intrinsically B-UVX stellar objects from the Population II subdwarfs and (2) has produced samples of B-UVX objects which are more complete than samples selected by (U - B) color alone.

  6. Quality of different in-clinic test systems for feline immunodeficiency virus and feline leukaemia virus infection.

    PubMed

    Hartmann, Katrin; Griessmayr, Pascale; Schulz, Bianka; Greene, Craig E; Vidyashankar, Anand N; Jarrett, Os; Egberink, Herman F

    2007-12-01

    Many new diagnostic in-house tests for identification of feline immunodeficiency virus (FIV) and feline leukaemia virus (FeLV) infection have been licensed for use in veterinary practice, and the question of the relative merits of these kits has prompted comparative studies. This study was designed to define the strengths and weaknesses of seven FIV and eight FeLV tests that are commercially available. In this study, 536 serum samples from randomly selected cats were tested. Those samples reacting FIV-positive in at least one of the tests were confirmed by Western blot, and those reacting FeLV-positive were confirmed by virus isolation. In addition, a random selection of samples testing negative in all test systems was re-tested by Western blot (100 samples) and by virus isolation (81 samples). Specificity, sensitivity, positive and negative predictive values of each test and the quality of the results were compared.

  7. Impact assessment and decontamination of pesticides from meat under different culinary processes.

    PubMed

    Sengupta, Dwaipayan; Aktar, Md Wasim; Alam, Samsul; Chowdhury, Ashim

    2010-10-01

A total of 75 animals between 1.5 and 8 years old were randomly selected for the study. Of these, 57.8% were cross-bred animals and the rest were non-descript. Moreover, 61.8% of the animals under study were brought for slaughter from local sources and the rest from farm houses. Samples collected from five districts revealed contamination with traces of organochlorine pesticides (0.01-0.22 microg g(-1)) and organophosphorus pesticides (0.098-0.111 microg g(-1)). In general, all the raw meat samples possessed dichlorodiphenyltrichloroethane at the highest level. Contamination was highest in cow meat samples and lowest in chicken samples. No particular district-wise trend was obtained for the pesticides selected for analysis. A subsequent decontamination study revealed that cooking is the best option for reducing pesticide load in raw meat samples. Cooked chicken is the safest foodstuff for consumption.

  8. Further search for selectivity of positron annihilation in the skin and cancerous systems

    NASA Astrophysics Data System (ADS)

    Liu, Guang; Chen, Hongmin; Chakka, Lakshmi; Cheng, Mei-Ling; Gadzia, Joseph E.; Suzuki, R.; Ohdaira, T.; Oshima, N.; Jean, Y. C.

    2008-10-01

Positron annihilation lifetime (PAL) spectroscopy and Doppler broadening energy spectra (DBES) have been used to search for selectivity and sensitivity in skin samples with and without cancer. This study further explores the melanoma cancerous system and other types of skin samples. We found that the S parameter in melanoma skin samples cut at 0.39 mm depth from the same patient's skin is smaller than near the skin surface. However, in 10 melanoma samples from different patients, the S parameters vary significantly. Similarly, among 10 normal skin samples without cancer, the S parameters also vary widely among patients. To understand the sensitivity of positron annihilation spectroscopy as a tool to detect cancer formation at an early stage, we propose a controlled, systematic study of in vivo experiments using UV-induced skin cancer in living animals.

  9. Preliminary geochemical assessment of water in selected streams, springs, and caves in the Upper Baker and Snake Creek drainages in Great Basin National Park, Nevada, 2009

    USGS Publications Warehouse

    Paul, Angela P.; Thodal, Carl E.; Baker, Gretchen M.; Lico, Michael S.; Prudic, David E.

    2014-01-01

    Water in caves, discharging from springs, and flowing in streams in the upper Baker and Snake Creek drainages are important natural resources in Great Basin National Park, Nevada. Water and rock samples were collected from 15 sites during February 2009 as part of a series of investigations evaluating the potential for water resource depletion in the park resulting from the current and proposed groundwater withdrawals. This report summarizes general geochemical characteristics of water samples collected from the upper Baker and Snake Creek drainages for eventual use in evaluating possible hydrologic connections between the streams and selected caves and springs discharging in limestone terrain within each watershed.Generally, water discharging from selected springs in the upper Baker and Snake Creek watersheds is relatively young and, in some cases, has similar chemical characteristics to water collected from associated streams. In the upper Baker Creek drainage, geochemical data suggest possible hydrologic connections between Baker Creek and selected springs and caves along it. The analytical results for water samples collected from Wheelers Deep and Model Caves show characteristics similar to those from Baker Creek, suggesting a hydrologic connection between the creek and caves, a finding previously documented by other researchers. Generally, geochemical evidence does not support a connection between water flowing in Pole Canyon Creek to that in Model Cave, at least not to any appreciable extent. The water sample collected from Rosethorn Spring had relatively high concentrations of many of the constituents sampled as part of this study. This finding was expected as the water from the spring travelled through alluvium prior to being discharged at the surface and, as a result, was provided the opportunity to interact with soil minerals with which it came into contact. 
Isotopic evidence does not preclude a connection between Baker Creek and the water discharging from Rosethorn Spring. The residence time of water discharging into the caves and from selected springs sampled as part of this study ranged from 10 to 25 years. Within the upper Snake Creek drainage, the results of this study show geochemical similarities between Snake Creek and Outhouse Spring, Spring Creek Spring, and Squirrel Spring Cave. The strontium isotope ratios (87Sr/86Sr) for intrusive rock samples representative of the Snake Creek drainage were similar to those of carbonate rock samples. The water sample collected from Snake Creek at the pipeline discharge point had lower strontium concentrations than the sample collected downstream and an 87Sr/86Sr value similar to those of the carbonate and intrusive rocks. The chemistry of this water sample was considered representative of upstream conditions in Snake Creek and indicates minimal influence of rock dissolution. The results of this study suggest that water discharging from Outlet Spring is not hydrologically connected to Snake Creek but rather is recharged at high altitude within the Snake Creek drainage. These findings for Outlet Spring largely stem from its relatively high specific conductance and chloride concentration, the lightest deuterium (δD) and oxygen-18 (δ18O) values, and the longest calculated residence time (60 to 90 years) of any sample collected as part of this study. With the exception of water sampled from Outlet Spring, the residence time of water discharging into Squirrel Spring Cave and from selected springs in the upper Snake Creek drainage was less than 30 years.

  10. Multiple Imputation in Two-Stage Cluster Samples Using The Weighted Finite Population Bayesian Bootstrap.

    PubMed

    Zhou, Hanzhi; Elliott, Michael R; Raghunathan, Trivellore E

    2016-06-01

    Multistage sampling is often employed in survey samples for cost and convenience. However, accounting for clustering features when generating datasets for multiple imputation is a nontrivial task, particularly when, as is often the case, cluster sampling is accompanied by unequal probabilities of selection, necessitating case weights. Thus, multiple imputation often ignores complex sample designs and assumes simple random sampling when generating imputations, even though failing to account for complex sample design features is known to yield biased estimates and confidence intervals that have incorrect nominal coverage. In this article, we extend a recently developed, weighted, finite-population Bayesian bootstrap procedure to generate synthetic populations conditional on complex sample design data that can be treated as simple random samples at the imputation stage, obviating the need to directly model design features for imputation. We develop two forms of this method: one where the probabilities of selection are known at the first and second stages of the design, and the other, more common in public use files, where only the final weight based on the product of the two probabilities is known. We show that this method has advantages in terms of bias, mean square error, and coverage properties over methods where sample designs are ignored, with little loss in efficiency, even when compared with correct fully parametric models. An application is made using the National Automotive Sampling System Crashworthiness Data System, a multistage, unequal probability sample of U.S. passenger vehicle crashes, which suffers from a substantial amount of missing data in "Delta-V," a key crash severity measure.
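
    The core of the weighted finite-population Bayesian bootstrap is a Pólya urn that regenerates a synthetic population from the observed sample and its case weights, after which the synthetic population can be treated as a simple random sample for imputation. A minimal one-stage sketch under simplifying assumptions (weights summing to an integer population size; not the authors' two-stage algorithm):

    ```python
    import numpy as np

    def weighted_fpbb(sample_ids, weights, rng):
        """One synthetic population via a weighted Polya urn (a simplified
        sketch of the weighted finite-population Bayesian bootstrap idea)."""
        n = len(sample_ids)
        N = int(round(weights.sum()))     # implied population size
        counts = np.zeros(n)              # times each unit has been re-drawn
        draws = []
        for _ in range(N - n):
            # Selection probability: initial weight minus the unit's own
            # appearance in the sample, plus a bonus for units already drawn.
            prob = weights - 1 + counts * (N - n) / n
            prob = prob / prob.sum()
            i = rng.choice(n, p=prob)
            counts[i] += 1
            draws.append(sample_ids[i])
        # Synthetic population = original sample + the N - n Polya draws.
        return list(sample_ids) + draws

    rng = np.random.default_rng(1)
    ids = np.arange(10)
    w = np.full(10, 5.0)                  # each sampled unit represents 5 people
    pop = weighted_fpbb(ids, w, rng)      # synthetic population of size 50
    ```

    Generating many such populations and imputing within each reproduces design-consistent variability without modeling the design features directly, which is the property the article exploits.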

  11. Multiple Imputation in Two-Stage Cluster Samples Using The Weighted Finite Population Bayesian Bootstrap

    PubMed Central

    Zhou, Hanzhi; Elliott, Michael R.; Raghunathan, Trivellore E.

    2017-01-01

    Multistage sampling is often employed in survey samples for cost and convenience. However, accounting for clustering features when generating datasets for multiple imputation is a nontrivial task, particularly when, as is often the case, cluster sampling is accompanied by unequal probabilities of selection, necessitating case weights. Thus, multiple imputation often ignores complex sample designs and assumes simple random sampling when generating imputations, even though failing to account for complex sample design features is known to yield biased estimates and confidence intervals that have incorrect nominal coverage. In this article, we extend a recently developed, weighted, finite-population Bayesian bootstrap procedure to generate synthetic populations conditional on complex sample design data that can be treated as simple random samples at the imputation stage, obviating the need to directly model design features for imputation. We develop two forms of this method: one where the probabilities of selection are known at the first and second stages of the design, and the other, more common in public use files, where only the final weight based on the product of the two probabilities is known. We show that this method has advantages in terms of bias, mean square error, and coverage properties over methods where sample designs are ignored, with little loss in efficiency, even when compared with correct fully parametric models. An application is made using the National Automotive Sampling System Crashworthiness Data System, a multistage, unequal probability sample of U.S. passenger vehicle crashes, which suffers from a substantial amount of missing data in “Delta-V,” a key crash severity measure. PMID:29226161

  12. Selection within households in health surveys

    PubMed Central

    Alves, Maria Cecilia Goi Porto; Escuder, Maria Mercedes Loureiro; Claro, Rafael Moreira; da Silva, Nilza Nunes

    2014-01-01

OBJECTIVE To compare the efficiency and accuracy of sampling designs including and excluding the sampling of individuals within sampled households in health surveys. METHODS From a population survey conducted in the Baixada Santista Metropolitan Area, SP, Southeastern Brazil, between 2006 and 2007, 1,000 samples were drawn for each design, and estimates for people aged 18 to 59 and 18 and over were calculated for each sample. In the first design, 40 census tracts, 12 households per tract, and one person per household were sampled. In the second, no sampling within the household was performed: 40 census tracts were sampled, with 6 households per tract for the 18 to 59-year-old group and 5 or 6 for the 18-and-over group. Precision and bias of proportion estimates for 11 indicators were assessed in the two final sets of 1,000 selected samples under the two designs. They were compared by means of relative measures: coefficient of variation, bias/mean ratio, bias/standard error ratio, and relative mean square error. Cost comparison contrasted the basic cost per person, household cost, and numbers of people and households. RESULTS Bias was found to be negligible for both designs. Lower precision and higher costs were found for the design including individual sampling within households. CONCLUSIONS The design excluding individual sampling achieved higher efficiency and accuracy and, accordingly, should be the first choice for investigators. Sampling of household dwellers should be adopted when there are reasons related to the study subject that could bias individual responses if multiple dwellers answered the proposed questionnaire. PMID:24789641
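
    The relative comparison measures named in METHODS can be computed from replicate estimates as follows. A generic sketch with assumed textbook definitions; the toy replicate values are illustrative, not data from the survey.

    ```python
    import statistics as st

    def design_metrics(estimates, true_value):
        """Relative measures used to contrast sampling designs:
        coefficient of variation, bias/mean ratio, bias/standard-error
        ratio, and relative mean square error."""
        mean = st.mean(estimates)
        se = st.stdev(estimates)          # spread of replicate estimates
        bias = mean - true_value
        mse = se**2 + bias**2             # MSE = variance + bias^2
        return {
            "cv": se / mean,
            "bias_over_mean": bias / mean,
            "bias_over_se": bias / se,
            "relative_mse": mse / true_value**2,
        }

    # Toy replicate estimates of a prevalence whose true value is 0.30:
    reps = [0.29, 0.31, 0.30, 0.32, 0.28, 0.30, 0.31, 0.29]
    m = design_metrics(reps, 0.30)
    ```

    Under the article's procedure, one such set of metrics would be computed per design from its 1,000 replicate samples, and the design with the smaller CV and relative MSE at comparable cost would be preferred.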

  13. Viscoelasticity and texture of spreadable cheeses with different fat contents at refrigeration and room temperatures.

    PubMed

    Bayarri, S; Carbonell, I; Costell, E

    2012-12-01

The effect of the 2 common consumption temperatures, refrigeration temperature (10°C) and room temperature (22°C), on the viscoelasticity, mechanical properties, and perceived texture of commercial cream cheeses was studied. Two samples with different fat contents, regular and low fat, from each of 4 selected commercial brands were analyzed. The selection criteria were based on identification of brands with different percentages of fat content reduction between the regular- and low-fat samples (35, 50, 84, and 98.5%). The fat content of regular-fat samples ranged from 19.8 to 26.0% (wt/wt), and that of low-fat samples ranged from 0.3 to 13.0% (wt/wt). Viscoelasticity was measured in a controlled-stress rheometer using parallel-plate geometry, and the mechanical characteristics of samples were measured using the spreadability test. Differences in the intensity of thickness, creaminess, and roughness between the regular- and low-fat samples of each commercial brand were evaluated at each of the selected temperatures by using the paired comparisons test. At 10°C, all samples showed higher viscoelastic modulus values, firmness, and stickiness, and lower spreadability than when they were measured at 22°C. Differences in viscoelasticity and mechanical properties between each pair of samples of the same brand were greater at 10°C than at 22°C because of the influence not only of fat content but also of fat state. Ingestion temperature did not modify the sensory differences detected between each pair of samples in terms of creaminess and roughness, but it did modify the differences detected in thickness. The joint consideration of sample composition, fat state, and product behavior during oral processing could explain the differences in perceived thickness at the two measurement temperatures. Copyright © 2012 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  14. Sorbent-based sampling methods for volatile and semi-volatile organic compounds in air Part 1: Sorbent-based air monitoring options.

    PubMed

    Woolfenden, Elizabeth

    2010-04-16

Sorbent tubes/traps are widely used in combination with gas chromatographic (GC) analytical methods to monitor the vapour-phase fraction of organic compounds in air. Target compounds range in volatility from acetylene and freons to phthalates and PCBs and include apolar, polar and reactive species. Airborne vapour concentrations will vary depending on the nature of the location, nearby pollution sources, weather conditions, etc. Levels can range from low percent concentrations in stack and vent emissions to low part per trillion (ppt) levels in ultra-clean outdoor locations. Hundreds, even thousands, of different compounds may be present in any given atmosphere. GC is commonly used in combination with mass spectrometry (MS) detection, especially for environmental monitoring or for screening uncharacterised workplace atmospheres. Given the complexity and variability of organic vapours in air, no one sampling approach suits every monitoring scenario. A variety of different sampling strategies and sorbent media have been developed to address specific applications. Key sorbent-based examples include: active (pumped) sampling onto tubes packed with one or more sorbents held at ambient temperature; diffusive (passive) sampling onto sorbent tubes/cartridges; on-line sampling of air/gas streams into cooled sorbent traps; and transfer of air samples from containers (canisters, Tedlar bags, etc.) into cooled sorbent focusing traps. Whichever sampling approach is selected, subsequent analysis almost always involves either solvent extraction or thermal desorption (TD) prior to GC(/MS) analysis. The overall performance of the air monitoring method will depend heavily on appropriate selection of key sampling and analytical parameters. This comprehensive review of air monitoring using sorbent tubes/traps is divided into two parts: (1) sorbent-based air sampling options; (2) sorbent selection and other aspects of optimizing sorbent-based air monitoring methods. The paper presents the current state of the art and recent developments in relevant areas such as sorbent research, sampler design, enhanced approaches to analytical quality assurance and on-tube derivatisation. Copyright 2009 Elsevier B.V. All rights reserved.

  15. Mars sample collection and preservation

    NASA Technical Reports Server (NTRS)

    Blanchard, Douglas P.

    1988-01-01

    The intensive exploration of Mars is a major step in the systematic exploration of the solar system. Mars, earth, and Venus provide valuable contrasts in planetary evolution. Mars exploration has progressed through the stages of exploration and is now ready for a sample-return mission. About 5 kg of intelligently selected samples will be returned from Mars. A variety of samples are wanted. This requires accurate landing in areas of high interest, surface mobility and analytical capability, a variety of sampling tools, and stringent preservation and isolation measures.

  16. Assessment of computer-related health problems among post-graduate nursing students.

    PubMed

    Khan, Shaheen Akhtar; Sharma, Veena

    2013-01-01

The study was conducted to assess computer-related health problems among post-graduate nursing students and to develop a Self-Instructional Module for prevention of computer-related health problems in a selected university situated in Delhi. A descriptive survey with a correlational design was adopted. A total of 97 subjects were selected from different faculties of Jamia Hamdard by multistage sampling with a systematic random sampling technique. Among the post-graduate students, the majority had only average compliance with computer-related ergonomics principles. As regards computer-related health problems, the majority had moderate computer-related health problems. The Self-Instructional Module developed for prevention of computer-related health problems was found to be acceptable to the post-graduate students.

  17. The effect of heavy metal concentration and soil pH on the abundance of selected microbial groups within ArcelorMittal Poland steelworks in Cracow.

    PubMed

    Lenart, Anna; Wolny-Koładka, Katarzyna

    2013-01-01

The present study aimed to identify the effect of heavy metal concentration and soil pH on the abundance of selected soil microorganisms within the ArcelorMittal Poland steelworks, Cracow. The analysis included 20 soil samples, in which the concentrations of Fe, Zn, Cd, Pb, Ni, Cu, Mn and Cr and soil pH were evaluated together with the numbers of mesophilic bacteria, fungi, Actinomycetes and Azotobacter spp. In the majority of samples, soil pH was alkaline. Permissible heavy metal limits were exceeded in eight samples; in one sample, the Zn concentration exceeded the limit 31-fold. Chromium was the element that most significantly limited the numbers of bacteria and Actinomycetes.

  18. Experimental Design in Clinical 'Omics Biomarker Discovery.

    PubMed

    Forshed, Jenny

    2017-11-03

This tutorial highlights some issues in the experimental design of clinical 'omics biomarker discovery: how to avoid bias and obtain quantities as close to the true values as possible from biochemical analyses, and how to select samples to improve the chance of answering the clinical question at issue. This includes the importance of defining the clinical aim and end point, knowing the variability in the results, randomization of samples, sample size, statistical power, and how to avoid confounding factors by including clinical data in the sample selection, that is, how to avoid unpleasant surprises at the point of statistical analysis. The aim of this Tutorial is to help translational clinical and preclinical biomarker candidate research and to improve the validity and potential of future biomarker candidate findings.
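
    On the sample-size and statistical-power point, the standard normal-approximation formula for a two-group comparison of means illustrates the kind of calculation involved. A generic textbook sketch, not a formula taken from this tutorial:

    ```python
    from math import ceil
    from statistics import NormalDist

    def n_per_group(effect_size, alpha=0.05, power=0.80):
        """Approximate per-group sample size for a two-sample comparison of
        means: n = 2 * ((z_{1-alpha/2} + z_{power}) / d)^2, where d is the
        standardized effect size (normal approximation)."""
        z = NormalDist().inv_cdf
        za = z(1 - alpha / 2)     # two-sided significance threshold
        zb = z(power)             # desired power
        return ceil(2 * ((za + zb) / effect_size) ** 2)

    # For a medium effect size (d = 0.5) at alpha = 0.05 and 80% power:
    n = n_per_group(0.5)  # 63 per group
    ```

    Running this calculation before sample collection, rather than after, is exactly the kind of "unpleasant surprise at the point of statistical analysis" the tutorial warns against.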

  19. A Novel Selective Deep Eutectic Solvent Extraction Method for Versatile Determination of Copper in Sediment Samples by ICP-OES.

    PubMed

    Bağda, Esra; Altundağ, Huseyin; Tüzen, Mustafa; Soylak, Mustafa

    2017-08-01

In the present study, a simple, single-step deep eutectic solvent (DES) extraction was developed for selective extraction of copper from sediment samples. Optimization of all experimental parameters (DES type, sample/DES ratio, contact time and temperature) was performed using BCR-280 R (lake sediment certified reference material). The limit of detection (LOD) and the limit of quantification (LOQ) were found to be 1.2 and 3.97 µg L-1, respectively. The RSD of the procedure was 7.5%. The proposed extraction method was applied to river and lake sediments sampled from Serpincik, Çeltek and Kızılırmak (Fadl and Tecer regions of the river), Sivas, Turkey.
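
    The reported LOD/LOQ pair has a ratio of roughly 3.3, suggesting a σ-based convention such as LOD = 3.3σ/S and LOQ = 10σ/S (the abstract does not state which formula was used). A sketch with illustrative blank-noise and calibration-slope values, which are assumptions rather than the paper's raw data:

    ```python
    def lod_loq(blank_sd, slope, k_lod=3.3, k_loq=10.0):
        """LOD and LOQ from the standard deviation of blank measurements
        (blank_sd) and the calibration slope, using the common
        k_lod*sigma/S and k_loq*sigma/S convention."""
        return k_lod * blank_sd / slope, k_loq * blank_sd / slope

    # Illustrative numbers only (assumed, not from the paper):
    lod, loq = lod_loq(blank_sd=0.018, slope=0.05)  # ~1.19 and 3.6 ug/L
    ```

    With this convention, tightening the blank noise or steepening the calibration curve lowers both limits proportionally, which is why method optimization (DES type, ratio, temperature) feeds directly into the achievable LOD.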

  20. Purposeful Sampling for Qualitative Data Collection and Analysis in Mixed Method Implementation Research.

    PubMed

    Palinkas, Lawrence A; Horwitz, Sarah M; Green, Carla A; Wisdom, Jennifer P; Duan, Naihua; Hoagwood, Kimberly

    2015-09-01

    Purposeful sampling is widely used in qualitative research for the identification and selection of information-rich cases related to the phenomenon of interest. Although there are several different purposeful sampling strategies, criterion sampling appears to be used most commonly in implementation research. However, combining sampling strategies may be more appropriate to the aims of implementation research and more consistent with recent developments in quantitative methods. This paper reviews the principles and practice of purposeful sampling in implementation research, summarizes types and categories of purposeful sampling strategies and provides a set of recommendations for use of single strategy or multistage strategy designs, particularly for state implementation research.

  1. In Situ Pre-Selection of Return Samples with Bio-Signatures by Combined Laser Mass Spectrometry and Optical Microscopy

    NASA Astrophysics Data System (ADS)

    Wiesendanger, R.; Wurz, P.; Tulej, M.; Wacey, D.; Neubeck, A.; Grimaudo, V.; Riedo, A.; Moreno, P.; Cedeño-López, A.; Ivarsson, M.

    2018-04-01

    The University of Bern has developed instrument prototypes that allow analysis of samples on Mars prior to bringing them back to Earth, maximizing the scientific outcome of the returned samples. We will present the systems and first results.

  2. Estimating total suspended sediment yield with probability sampling

    Treesearch

    Robert B. Thomas

    1985-01-01

    The "Selection At List Time" (SALT) scheme controls sampling of concentration for estimating total suspended sediment yield. The probability of taking a sample is proportional to its estimated contribution to total suspended sediment discharge. This procedure gives unbiased estimates of total suspended sediment yield and the variance of the...
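
    Sampling with probability proportional to an estimated contribution, then weighting each observation by the inverse of its selection probability, is the core of unbiased PPS estimation. A minimal sketch in that spirit, using the classic Hansen-Hurwitz estimator with hypothetical per-period sediment loads (not the SALT implementation itself):

```python
import random

def hansen_hurwitz_total(values, probs, n, rng=None):
    """Draw n units with replacement, unit i selected with probability
    probs[i], and return the unbiased Hansen-Hurwitz estimate of the
    population total: mean of values[i] / probs[i] over the draws."""
    rng = rng or random.Random(0)
    idx = rng.choices(range(len(values)), weights=probs, k=n)
    return sum(values[i] / probs[i] for i in idx) / n

# Hypothetical per-period sediment loads; when selection probabilities
# are exactly proportional to the values, every draw recovers the total.
loads = [10.0, 20.0, 30.0, 40.0]
probs = [v / sum(loads) for v in loads]
estimate = hansen_hurwitz_total(loads, probs, n=25)
```

    In practice the probabilities come from an auxiliary estimate of each sample's contribution, so individual draws scatter around the true total while remaining unbiased on average.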

  3. The "Anatomy" of a Performance-Enhancing Drug Test in Sports

    ERIC Educational Resources Information Center

    Werner, T. C.

    2012-01-01

    The components of a performance-enhancing drug (PED) test in sports include sample selection, collection, establishing sample integrity, sample pretreatment, analyte detection, data evaluation, reporting results, and action taken based on the result. Undergraduate curricula generally focus on the detection and evaluation steps of an analytical…

  4. Precision Efficacy Analysis for Regression.

    ERIC Educational Resources Information Center

    Brooks, Gordon P.

    When multiple linear regression is used to develop a prediction model, sample size must be large enough to ensure stable coefficients. If the derivation sample size is inadequate, the model may not predict well for future subjects. The precision efficacy analysis for regression (PEAR) method uses a cross-validity approach to select sample sizes…

  5. VizieR Online Data Catalog: LVL global optical photometry (Cook+, 2014)

    NASA Astrophysics Data System (ADS)

    Cook, D. O.; Dale, D. A.; Johnson, B. D.; van Zee, L.; Lee, J. C.; Kennicutt, R. C.; Calzetti, D.; Staudaher, S. M.; Engelbracht, C. W.

    2015-05-01

    The LVL sample consists of 258 of our nearest galaxy neighbours reflecting a statistically complete, representative sample of the local Universe. The sample selection and description are detailed in Dale et al. (2009ApJ...703..517D, Cat. J/ApJ/703/517). (4 data files).

  6. VizieR Online Data Catalog: LVL SEDs and physical properties (Cook+, 2014)

    NASA Astrophysics Data System (ADS)

    Cook, D. O.; Dale, D. A.; Johnson, B. D.; van Zee, L.; Lee, J. C.; Kennicutt, R. C.; Calzetti, D.; Staudaher, S. M.; Engelbracht, C. W.

    2015-05-01

    The LVL sample consists of 258 of our nearest galaxy neighbours reflecting a statistically complete, representative sample of the local Universe. The sample selection and description are detailed in Dale et al. (2009ApJ...703..517D, Cat. J/ApJ/703/517). (1 data file).

  7. Design unbiased estimation in line intersect sampling using segmented transects

    Treesearch

    David L.R. Affleck; Timothy G. Gregoire; Harry T. Valentine; Harry T. Valentine

    2005-01-01

    In many applications of line intersect sampling, transects consist of multiple, connected segments in a prescribed configuration. The relationship between the transect configuration and the selection probability of a population element is illustrated and a consistent sampling protocol, applicable to populations composed of arbitrarily shaped elements, is proposed. It...

  8. 29 CFR 1607.15 - Documentation of impact and validity evidence.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (essential). (6) Sample description. A description of how the research sample was identified and selected... the size of each subgroup (essential). A description of how the research sample compares with the...). Any quantitative data which identify or define the job constructs, such as factor analyses, should be...

  9. 29 CFR 1607.15 - Documentation of impact and validity evidence.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (essential). (6) Sample description. A description of how the research sample was identified and selected... the size of each subgroup (essential). A description of how the research sample compares with the...). Any quantitative data which identify or define the job constructs, such as factor analyses, should be...

  10. 29 CFR 1607.15 - Documentation of impact and validity evidence.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (essential). (6) Sample description. A description of how the research sample was identified and selected... the size of each subgroup (essential). A description of how the research sample compares with the...). Any quantitative data which identify or define the job constructs, such as factor analyses, should be...

  11. 29 CFR 1607.15 - Documentation of impact and validity evidence.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... (essential). (6) Sample description. A description of how the research sample was identified and selected... the size of each subgroup (essential). A description of how the research sample compares with the...). Any quantitative data which identify or define the job constructs, such as factor analyses, should be...

  12. Spectral Characterization of H2020/PTAL Mineral Samples: Implications for In Situ Martian Exploration and Mars Sample Selection

    NASA Astrophysics Data System (ADS)

    Lantz, C.; Pilorget, C.; Poulet, F.; Riu, L.; Dypvik, H.; Hellevang, H.; Rull Perez, F.; Veneranda, M.; Cousin, A.; Viennet, J.-C.; Werner, S. C.

    2018-04-01

    We present combined analysis performed in the framework of the Planetary Terrestrial Analogues Library (H2020 project). XRD, NIR, Raman, and LIBS spectroscopies are used to characterise samples to prepare ExoMars/ESA and Mars2020/NASA observations.

  13. Woody Species Diversity in Forest Plantations in a Mountainous Region of Beijing, China: Effects of Sampling Scale and Species Selection

    PubMed Central

    Zhang, Yuxin; Zhang, Shuang; Ma, Keming; Fu, Bojie; Anand, Madhur

    2014-01-01

    The role of forest plantations in biodiversity conservation has gained more attention in recent years. However, most work on evaluating the diversity of forest plantations focuses only on one spatial scale; thus, we examined the effects of sampling scale on diversity in forest plantations. We designed a hierarchical sampling strategy to collect data on woody species diversity in planted pine (Pinus tabuliformis Carr.), planted larch (Larix principis-rupprechtii Mayr.), and natural secondary deciduous broadleaf forests in a mountainous region of Beijing, China. Additive diversity partition analysis showed that, compared to natural forests, the planted pine forests had a different woody species diversity partitioning pattern at multiple scales (except the Simpson diversity in the regeneration layer), while the larch plantations did not show multi-scale diversity partitioning patterns that were obviously different from those in the natural secondary broadleaf forest. Compared to the natural secondary broadleaf forests, the effects of planted pine forests on woody species diversity depend on the sampling scale and the layers selected for analysis. Diversity in the planted larch forest, however, was not significantly different from that in the natural forest for all diversity components at all sampling levels. Our work demonstrated that the species selected for afforestation and the sampling scales selected for data analysis alter the conclusions on the levels of diversity supported by plantations. We suggest that a wide range of scales should be considered in the evaluation of the role of forest plantations in biodiversity conservation. PMID:25545860

  14. Sample classification for improved performance of PLS models applied to the quality control of deep-frying oils of different botanic origins analyzed using ATR-FTIR spectroscopy.

    PubMed

    Kuligowski, Julia; Carrión, David; Quintás, Guillermo; Garrigues, Salvador; de la Guardia, Miguel

    2011-01-01

    The selection of an appropriate calibration set is a critical step in multivariate method development. In this work, the effect of using different calibration sets, based on a previous classification of unknown samples, on partial least squares (PLS) regression model performance is discussed. As an example, attenuated total reflection (ATR) mid-infrared spectra of deep-fried vegetable oil samples from three botanical origins (olive, sunflower, and corn oil), with increasing polymerized triacylglyceride (PTG) content induced by a deep-frying process, were employed. The use of a one-class-classifier partial least squares-discriminant analysis (PLS-DA) and a rooted binary directed acyclic graph tree provided accurate oil classification. Oil samples fried without foodstuff could be classified correctly, independent of their PTG content. However, class separation of oil samples fried with foodstuff was less evident. The combined use of double-cross model validation with permutation testing was used to validate the obtained PLS-DA classification models, confirming the results. To demonstrate the usefulness of selecting an appropriate PLS calibration set, the PTG content was determined by calculating a PLS model based on the previously selected classes. In comparison to a PLS model calculated using a pooled calibration set containing samples from all classes, the root mean square error of prediction, ranging between 1.06 and 2.91% (w/w), could be improved significantly using PLS models based on the calibration sets selected by PLS-DA.
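
    The comparison criterion here is the root mean square error of prediction (RMSEP). Building the PLS models themselves requires a chemometrics library, but the metric is simple; a minimal sketch with made-up reference and predicted PTG values:

```python
import math

def rmsep(y_ref, y_pred):
    """Root mean square error of prediction over an external test set."""
    n = len(y_ref)
    return math.sqrt(sum((r - p) ** 2 for r, p in zip(y_ref, y_pred)) / n)

# Hypothetical PTG contents (% w/w): reference values vs. predictions
# from a calibration model; the numbers are illustrative only.
reference = [5.0, 10.0, 15.0, 20.0]
predicted = [5.8, 9.1, 16.2, 19.0]
error = rmsep(reference, predicted)
```

    Comparing this figure between a pooled-calibration model and class-specific models, as the study does, quantifies the gain from classifying samples before regression.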

  15. ANALYSIS OF SAMPLING TECHNIQUES FOR IMBALANCED DATA: AN N=648 ADNI STUDY

    PubMed Central

    Dubey, Rashmi; Zhou, Jiayu; Wang, Yalin; Thompson, Paul M.; Ye, Jieping

    2013-01-01

    Many neuroimaging applications deal with imbalanced imaging data. For example, in the Alzheimer’s Disease Neuroimaging Initiative (ADNI) dataset, the mild cognitive impairment (MCI) cases eligible for the study are nearly two times the Alzheimer’s disease (AD) patients for the structural magnetic resonance imaging (MRI) modality and six times the control cases for the proteomics modality. Constructing an accurate classifier from imbalanced data is a challenging task. Traditional classifiers that aim to maximize the overall prediction accuracy tend to classify all data into the majority class. In this paper, we study an ensemble system of feature selection and data sampling for the class imbalance problem. We systematically analyze various sampling techniques by examining the efficacy of different rates and types of undersampling, oversampling, and a combination of over- and undersampling approaches. We thoroughly examine six widely used feature selection algorithms to identify significant biomarkers and thereby reduce the complexity of the data. The efficacy of the ensemble techniques is evaluated using two different classifiers, Random Forest and Support Vector Machines, based on classification accuracy, area under the receiver operating characteristic curve (AUC), sensitivity, and specificity measures. Our extensive experimental results show that for various problem settings in ADNI, (1) a balanced training set obtained with K-Medoids-based undersampling gives the best overall performance among the data sampling techniques and the no-sampling approach; and (2) sparse logistic regression with stability selection achieves competitive performance among the various feature selection algorithms. Comprehensive experiments with various settings show that our proposed ensemble model of multiple undersampled datasets yields stable and promising results. PMID:24176869
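
    The study balances its training sets with K-Medoids-based undersampling; as a simpler stand-in, random undersampling of each class down to the size of the rarest class illustrates the same idea (features and labels below are hypothetical):

```python
import random

def undersample(X, y, rng=None):
    """Randomly undersample every class down to the size of the rarest
    class, returning a balanced training set. (A simple stand-in for
    the K-Medoids-based undersampling used in the study.)"""
    rng = rng or random.Random(0)
    by_class = {}
    for xi, yi in zip(X, y):
        by_class.setdefault(yi, []).append(xi)
    n_min = min(len(rows) for rows in by_class.values())
    X_bal, y_bal = [], []
    for label, rows in by_class.items():
        for xi in rng.sample(rows, n_min):  # without replacement
            X_bal.append(xi)
            y_bal.append(label)
    return X_bal, y_bal

# Hypothetical imbalanced data: 6 majority-class vs. 2 minority-class rows
X = [[float(i)] for i in range(8)]
y = [0, 0, 0, 0, 0, 0, 1, 1]
X_bal, y_bal = undersample(X, y)
```

    K-Medoids improves on this by choosing majority-class representatives that cover the feature space rather than discarding rows at random.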

  16. Nonmanufacturing Businesses. U.S. Metric Study Interim Report.

    ERIC Educational Resources Information Center

    Cornog, June R.; Bunten, Elaine D.

    This fifth interim report on the feasibility of a United States changeover to the metric system stems from the U.S. Metric Study. A primary stratified sample of 2,828 nonmanufacturing firms was randomly selected from 28,184 businesses taken from Social Security files, and a secondary sample of 2,258 firms was randomly selected for replacement…

  17. Developing and Fostering Passion in Academic and Nonacademic Domains

    ERIC Educational Resources Information Center

    Fredricks, Jennifer A.; Alfeld, Corinne; Eccles, Jacquelynne

    2010-01-01

    The purpose of this study was to explore how passion was manifested among gifted and talented youth selected from a larger longitudinal study of child and adolescent development. The gifted sample included 25 high school and college students who were selected because they were in a gifted program in elementary school. The talent sample included 41…

  18. Geochemical maps showing the distribution and abundance of selected elements in nonmagnetic heavy-mineral-concentrate samples from stream sediment, Solomon and Bendelehen 1 degree by 3 degree Quadrangles, Seward Peninsula, Alaska

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, H.D.; Smith, S.C.; Sutley, S.J.

    Geochemical maps showing the distribution and abundance of selected elements in nonmagnetic heavy-mineral-concentrate samples from stream sediment, Solomon and Bendelehen 1° by 3° Quadrangles, Seward Peninsula, Alaska are presented.

  19. ELT Research in Turkey: A Content Analysis of Selected Features of Published Articles

    ERIC Educational Resources Information Center

    Yagiz, Oktay; Aydin, Burcu; Akdemir, Ahmet Selçuk

    2016-01-01

    This study reviews a selected sample of 274 research articles on ELT, published between 2005 and 2015 in Turkish contexts. In the study, 15 journals in ULAKBIM database and articles from national and international journals accessed according to convenience sampling method were surveyed and relevant articles were obtained. A content analysis was…

  20. Evaluating Concentrations of Heavy Metals in the U.S. Peanut Crop in the Presence of Detection Limits

    USDA-ARS?s Scientific Manuscript database

    The concentration of mercury, cadmium, lead, and arsenic along with glyphosate and an extensive array of pesticides in the U.S. peanut crop was assessed for crop years 2013-2015. Samples were randomly selected from various buying points during the grading process. Samples were selected from the thre...

  1. The Relationship between Teachers Commitment and Female Students Academic Achievements in Some Selected Secondary School in Wolaita Zone, Southern Ethiopia

    ERIC Educational Resources Information Center

    Bibiso, Abyot; Olango, Menna; Bibiso, Mesfin

    2017-01-01

    The purpose of this study was to investigate the relationship between teachers' commitment and female students' academic achievement in selected secondary schools of Wolaita zone, Southern Ethiopia. The research method employed was a survey study and the sampling techniques were purposive, simple random and stratified random sampling. Questionnaire…

  2. Socio-Economic Background and Access to Internet as Correlates of Students' Achievement in Agricultural Science

    ERIC Educational Resources Information Center

    Adegoke, Sunday Paul; Osokoya, Modupe M.

    2015-01-01

    This study investigated access to the internet and socio-economic background as correlates of students' achievement in Agricultural Science among selected Senior Secondary School Two students in Ogbomoso South and North Local Government Areas. The study adopted a multi-stage sampling technique. Simple random sampling was used to select 30 students from…

  3. Assessing the performance of multiplexed tandem PCR for the diagnosis of pathogenic genotypes of Theileria orientalis using pooled blood samples from cattle.

    PubMed

    Gebrekidan, Hagos; Gasser, Robin B; Stevenson, Mark A; McGrath, Sean; Jabbar, Abdul

    2017-02-01

    Oriental theileriosis, caused by multiple genotypes of Theileria orientalis, is an important tick-borne disease of bovines. Here, we assessed the performance of an established multiplexed tandem PCR (MT-PCR) for the diagnosis of the two recognized pathogenic genotypes (chitose and ikeda) of T. orientalis in cattle using pooled blood samples. We used a total of 265 cattle blood samples, which were divided into two groups according to previous MT-PCR results for individual samples. Samples in group 1 (n = 155) were from a herd with a relatively high prevalence of T. orientalis infection; those in group 2 (n = 110) were from four herds with a low prevalence. For group 1, 31 batches of five-pooled samples and 15 batches of ten-pooled samples (selected at random) were formed. For group 2, 22 batches of five-pooled samples and 11 batches of ten-pooled samples (selected at random) were formed. DNAs from individual pooled samples in each batch and group were then tested by MT-PCR. For group 1, the apparent prevalences estimated using the 31 batches of five-pooled samples (97%) and 15 batches of ten-pooled samples (100%) were significantly higher than for individual samples (75%). For group 2, higher apparent prevalences (9% and 36%) were also recorded for the 22 and 11 batches of pooled samples, respectively, compared with individual samples (7%). Overall, the average infection intensities recorded for the chitose and ikeda genotypes were considerably lower in pooled than in individual samples. The diagnostic specificities of MT-PCR were estimated at 95% and 94% when batches of five- and ten-pooled samples, respectively, were tested, and 94% for individual samples. The diagnostic sensitivity of the assay was estimated at 98%, the same for individual, five- and ten-pooled samples. This study shows that the results of screening batches of five- and ten-pooled blood samples from cattle herds are similar to those obtained for individual samples and, importantly, that the reduced cost of testing pooled samples represents a considerable saving to herd managers.
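
    Pooling trades per-animal resolution for cost: a pool tests positive if any member is infected, which is why the apparent prevalence of pools exceeds the individual prevalence. Under the simplifying assumption of a perfect test (the abstract reports 94-98% sensitivity/specificity, so this is only an approximation), individual prevalence can be back-calculated from pool positivity. A minimal sketch with hypothetical counts:

```python
def pooled_prevalence(n_positive_pools, n_pools, pool_size):
    """Back-calculate individual-level prevalence from pooled test
    results, assuming a perfect test. A pool of size k is negative only
    if all k members are negative, so p = 1 - (1 - P_pool)**(1/k)."""
    p_pool = n_positive_pools / n_pools
    if p_pool >= 1.0:
        raise ValueError("all pools positive: prevalence not identifiable")
    return 1.0 - (1.0 - p_pool) ** (1.0 / pool_size)

# Hypothetical numbers: if 4,095 of 10,000 five-sample pools test
# positive, the implied individual prevalence is about 10%.
p = pooled_prevalence(4095, 10000, 5)
```

    The guard for 100%-positive pools matters in practice: in the high-prevalence herd above, all ten-pooled batches were positive, at which point pooling alone cannot pin down the individual prevalence.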

  4. A highly addressable static droplet array enabling digital control of a single droplet at pico-volume resolution.

    PubMed

    Jeong, Heon-Ho; Lee, Byungjin; Jin, Si Hyung; Jeong, Seong-Geun; Lee, Chang-Soo

    2016-04-26

    Droplet-based microfluidics enabling exquisite liquid handling has been developed for diagnosis, drug discovery and quantitative biology. Compartmentalization of samples into a large number of tiny droplets is a great approach to performing multiplex assays and to improving reliability and accuracy using a limited volume of sample. Despite significant advances in microfluidic technology, individual droplet handling at pico-volume resolution is still a challenge in obtaining more efficient and versatile multiplex assays. We present a highly addressable static droplet array (SDA) enabling individual digital manipulation of a single droplet using a microvalve system. In a conventional single-layer microvalve system, the number of microvalves required is dictated by the number of operation objects; thus, individual trap-and-release on a large-scale 2D array format is highly challenging. By integrating double-layer microvalves, we achieve a "balloon" valve that preserves the pressure-on state under released pressure; this valve allows the selective releasing and trapping of 7200 multiplexed pico-droplets using only 1 μL of sample without volume loss. This selectivity and addressability made it possible to arrange only single-cell-encapsulated droplets from a mixture of droplet compositions via repetitive selective trapping and releasing. The approach will thus be useful for the efficient handling of minuscule volumes of rare or clinical samples in multiplex or combinatory assays, and for the selective collection of samples.

  5. Training set optimization under population structure in genomic selection.

    PubMed

    Isidro, Julio; Jannink, Jean-Luc; Akdemir, Deniz; Poland, Jesse; Heslot, Nicolas; Sorrells, Mark E

    2015-01-01

    Population structure must be evaluated before optimization of the training set population. Maximizing the phenotypic variance captured by the training set is important for optimal performance. The optimization of the training set (TRS) in genomic selection has received much interest in both animal and plant breeding, because it is critical to the accuracy of the prediction models. In this study, five different TRS sampling algorithms, stratified sampling, mean of the coefficient of determination (CDmean), mean of predictor error variance (PEVmean), stratified CDmean (StratCDmean) and random sampling, were evaluated for prediction accuracy in the presence of different levels of population structure. In the presence of population structure, a sampling method that captures the most phenotypic variation in the TRS is desirable. The wheat dataset showed mild population structure, and the CDmean and stratified CDmean methods showed the highest accuracies for all traits except test weight and heading date. The rice dataset had strong population structure, and the approach based on stratified sampling showed the highest accuracies for all traits. In general, CDmean minimized the relationship between genotypes in the TRS while maximizing the relationship between the TRS and the test set. This makes it suitable as an optimization criterion for long-term selection. Our results indicated that the best selection criterion used to optimize the TRS seems to depend on the interaction of trait architecture and population structure.
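
    Of the five TRS strategies compared, stratified sampling is the simplest to sketch: allocate the training budget across population clusters in proportion to their size. The lines and cluster labels below are hypothetical, and CDmean/PEVmean are omitted since they require the genomic relationship matrix:

```python
import random

def stratified_trs(lines, clusters, n_train, rng=None):
    """Select a training set by sampling from each population cluster
    proportionally to its size (one of the compared TRS strategies)."""
    rng = rng or random.Random(42)
    groups = {}
    for line, c in zip(lines, clusters):
        groups.setdefault(c, []).append(line)
    chosen = []
    for c, members in groups.items():
        # proportional allocation, at least one line per cluster
        k = max(1, round(n_train * len(members) / len(lines)))
        chosen.extend(rng.sample(members, min(k, len(members))))
    return chosen

# Hypothetical panel: 70 lines in cluster A, 30 in cluster B,
# with a training budget of 10 lines.
lines = [f"line{i}" for i in range(100)]
clusters = ["A"] * 70 + ["B"] * 30
train = stratified_trs(lines, clusters, n_train=10)
```

    Proportional allocation guarantees every cluster is represented, which is why stratification helped most in the strongly structured rice dataset.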

  6. Presence of archaea and selected bacteria in infected root canal systems.

    PubMed

    Brzezińska-Błaszczyk, Ewa; Pawłowska, Elżbieta; Płoszaj, Tomasz; Witas, Henryk; Godzik, Urszula; Agier, Justyna

    2018-05-01

    Infections of the root canal have a polymicrobial etiology. The main group of microflora in the infected pulp is bacteria. There are limited data suggesting that archaea may be present in infected pulp tissue. The aim of this study was to check the prevalence of archaea in necrotic root canal samples obtained from patients with primary or post-treatment infection. The prevalence of selected bacterial species (Prevotella intermedia, Porphyromonas gingivalis, Tannerella forsythia, Treponema denticola, Synergistes sp.) in necrotic samples was evaluated as well. Sixty-four samples from root canals were collected for DNA and RNA extraction. A PCR assay based on the 16S rRNA gene was used to determine the presence of archaea and the selected bacteria. Of the 64 samples, 6 were analyzed by semiquantitative reverse transcription PCR to estimate expression profiles of 16S rRNA, and another 9 were selected for direct sequencing. Archaea were detected in 48.4% of samples. Statistical analysis indicated a negative association in coexistence between archaea and Treponema denticola (P < 0.05; Pearson's χ² test). The main representative of the Archaea domain found in infected pulp tissue was Methanobrevibacter oralis. Archaeal 16S rRNA gene expression was significantly lower than that of Synergistes sp., Porphyromonas gingivalis, and Tannerella forsythia (P < 0.05; Student's t test). Thus, it can be hypothesized that archaea may participate in the endodontic microbial community.

  7. Validating Analytical Protocols to Determine Selected Pesticides and PCBs Using Routine Samples.

    PubMed

    Pindado Jiménez, Oscar; García Alonso, Susana; Pérez Pastor, Rosa María

    2017-01-01

    This study aims to provide recommendations concerning the validation of analytical protocols by using routine samples. It is intended as a case study on how to validate analytical methods in different environmental matrices. In order to analyze the selected compounds (pesticides and polychlorinated biphenyls) in two different environmental matrices, the current work has performed and validated two analytical procedures by GC-MS. A description is given of the validation of the two protocols through the analysis of more than 30 samples of water and sediments collected over nine months. The present work also addresses the uncertainty associated with both analytical protocols. In detail, the uncertainty for the water samples was estimated through a conventional approach. For the sediment matrices, however, the estimation of proportional/constant bias is also included because of their inhomogeneity. Results for the sediment matrix are reliable, showing a 25-35% range of analytical variability associated with intermediate conditions. The analytical methodology for the water matrix determines the selected compounds with acceptable recoveries, and the combined uncertainty ranges between 20 and 30%. Analysis of routine samples is rarely applied to assess the trueness of novel analytical methods, and up to now this methodology had not been focused on organochlorine compounds in environmental matrices.

  8. 4D (x-y-z-t) imaging of thick biological samples by means of Two-Photon inverted Selective Plane Illumination Microscopy (2PE-iSPIM)

    PubMed Central

    Lavagnino, Zeno; Sancataldo, Giuseppe; d’Amora, Marta; Follert, Philipp; De Pietri Tonelli, Davide; Diaspro, Alberto; Cella Zanacchi, Francesca

    2016-01-01

    In the last decade, light sheet fluorescence microscopy techniques such as selective plane illumination microscopy (SPIM) have become a well-established method for developmental biology. However, conventional SPIM architectures hardly permit imaging of certain tissues, since the common sample mounting procedure, based on gel embedding, can interfere with the sample morphology. In this work we propose an inverted selective plane illumination microscopy (iSPIM) system, based on non-linear excitation, suitable for 3D tissue imaging. First, the iSPIM architecture provides flexibility in sample mounting, dispensing with the gel-based mounting typical of conventional SPIM and permitting 3D imaging of hippocampal slices from mouse brain. Moreover, all the advantages brought by two-photon excitation (2PE) in terms of reduced scattering and improved contrast are exploited, demonstrating improved image quality and contrast compared to single-photon excitation. The proposed system represents an optimal platform for tissue imaging and paves the way for the application of light sheet microscopy to a wider range of samples, including those that must be mounted on non-transparent surfaces. PMID:27033347

  9. Methods of analysis by the U. S. Geological Survey National Water Quality Laboratory - determination of organonitrogen herbicides in water by solid-phase extraction and capillary-column gas chromatography/mass spectrometry with selected-ion monitoring

    USGS Publications Warehouse

    Sandstrom, Mark W.; Wydoski, Duane S.; Schroeder, Michael P.; Zamboni, Jana L.; Foreman, William T.

    1992-01-01

    A method for the isolation of organonitrogen herbicides from natural water samples using solid-phase extraction and analysis by capillary-column gas chromatography/mass spectrometry with selected-ion monitoring is described. Water samples are filtered to remove suspended particulate matter and then are pumped through disposable solid-phase extraction cartridges containing octadecyl-bonded porous silica to remove the herbicides. The cartridges are dried using carbon dioxide, and adsorbed herbicides are removed from the cartridges by elution with 1.8 milliliters of hexane-isopropanol (3:1). Extracts of the eluants are analyzed by capillary-column gas chromatography/mass spectrometry with selected-ion monitoring of at least three characteristic ions. The method detection limits are dependent on sample matrix and each particular herbicide. The method detection limits, based on a 100-milliliter sample size, range from 0.02 to 0.25 microgram per liter. Recoveries averaged 80 to 115 percent for the 23 herbicides and 2 metabolites in 1 reagent-water and 2 natural-water samples fortified at levels of 0.2 and 2.0 micrograms per liter.

  10. A Critical Review on Clinical Application of Separation Techniques for Selective Recognition of Uracil and 5-Fluorouracil.

    PubMed

    Pandey, Khushaboo; Dubey, Rama Shankar; Prasad, Bhim Bali

    2016-03-01

    The most important objectives in bio-analytical chemistry involve applying tools to relevant medical/biological problems and refining these applications. Developing a reliable sample preparation step for the medical and biological fields is another primary objective in analytical chemistry, in order to extract and isolate the analytes of interest from complex biological matrices. The main inborn errors of metabolism (IEM) diagnosable through uracil analysis, and the therapeutic monitoring of toxic 5-fluorouracil (an important anti-cancer drug) in dihydropyrimidine dehydrogenase-deficient patients, require ultra-sensitive, reproducible, selective, and accurate analytical techniques for their measurement. Therefore, keeping in view the diagnostic value of uracil and 5-fluorouracil measurements, this article reviews several analytical techniques involved in the selective recognition and quantification of uracil and 5-fluorouracil from biological and pharmaceutical samples. The prospective study revealed that implementation of a molecularly imprinted polymer as a solid-phase material for sample preparation and preconcentration of uracil and 5-fluorouracil proved effective, as it obviates problems related to tedious separation techniques owing to protein binding and drastic interferences from the complex matrices in real samples such as blood plasma and serum.

  11. Selective Distance-Based K+ Quantification on Paper-Based Microfluidics.

    PubMed

    Gerold, Chase T; Bakker, Eric; Henry, Charles S

    2018-04-03

    In this study, paper-based microfluidic devices (μPADs) capable of K+ quantification in aqueous samples, as well as in human serum, using both colorimetric and distance-based methods are described. A lipophilic phase containing potassium ionophore I (valinomycin) was utilized to achieve highly selective quantification of K+ in the presence of Na+, Li+, and Mg2+ ions. Successful addition of a suspended lipophilic phase to a wax-printed paper-based device is described and offers a solution to current approaches that rely on organic solvents, which damage wax barriers. The approach provides an avenue for future alkali/alkaline quantification utilizing μPADs. Colorimetric spot tests allowed for K+ quantification from 0.1-5.0 mM using only 3.00 μL of sample solution. Selective distance-based quantification required small sample volumes (6.00 μL) and gave responses sensitive enough to distinguish between 1.0 and 2.5 mM sample K+. μPADs using distance-based methods were also capable of differentiating between 4.3 and 6.9 mM K+ in human serum samples. Distance-based methods required no digital analysis, electronic hardware, or pumps; any steps required for quantification could be carried out with the naked eye.

  12. Determination of selected quaternary ammonium compounds by liquid chromatography with mass spectrometry. Part II. Application to sediment and sludge samples in Austria.

    PubMed

    Martínez-Carballo, Elena; González-Barreiro, Carmen; Sitka, Andrea; Kreuzinger, Norbert; Scharf, Sigrid; Gans, Oliver

    2007-03-01

    Soxhlet extraction and high-performance liquid chromatography (HPLC) coupled to tandem mass spectrometry detection (MS/MS) was used for the determination of selected quaternary ammonium compounds (QACs) in solid samples. The method was applied for the determination of alkyl benzyl, dialkyl and trialkyl quaternary ammonium compounds in sediment and sludge samples in Austria. The overall method quantification limits range from 0.6 to 3 microg/kg for sediments and from 2 to 5 microg/kg for sewage sludges. Mean recoveries between 67% and 95% are achieved. In general, sediments were especially contaminated by C12 chain benzalkonium chloride (BAC-C12) as well as by the long C-chain dialkyldimethylammonium chloride (DDAC-C18) with maximum concentrations of 3.6 mg/kg and 2.1 mg/kg, respectively. Maxima of 27 mg/kg for DDAC-C10, 25 mg/kg for BAC-C12 and 23 mg/kg for BAC-C14 were determined for sludge samples. The sums of the 12 selected target compounds range from 22 mg/kg to 103 mg/kg in the sludge samples.

  13. Vapor permeation-stepwise injection simultaneous determination of methanol and ethanol in biodiesel with voltammetric detection.

    PubMed

    Shishov, Andrey; Penkova, Anastasia; Zabrodin, Andrey; Nikolaev, Konstantin; Dmitrenko, Maria; Ermakov, Sergey; Bulatov, Andrey

    2016-02-01

    A novel vapor permeation-stepwise injection (VP-SWI) method for the determination of methanol and ethanol in biodiesel samples is discussed. In the current study, stepwise injection analysis was successfully combined with voltammetric detection and vapor permeation. This method is based on the separation of methanol and ethanol from a sample using a vapor permeation module (VPM) with a selective polymer membrane based on poly(phenylene isophtalamide) (PA) containing high amounts of a residual solvent. After the evaporation into the headspace of the VPM, methanol and ethanol were transported, by gas bubbling, through a PA membrane to a mixing chamber equipped with a voltammetric detector. Ethanol was selectively detected at +0.19 V, and both compounds were detected at +1.20 V. Current subtractions (using a correction factor) were used for the selective determination of methanol. A linear range between 0.05 and 0.5% (m/m) was established for each analyte. The limits of detection were estimated at 0.02% (m/m) for ethanol and methanol. The sample throughput was 5 samples h(-1). The method was successfully applied to the analysis of biodiesel samples. Copyright © 2015 Elsevier B.V. All rights reserved.
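
    The two-potential subtraction scheme for selective methanol determination can be sketched in a few lines (the currents and the correction factor below are hypothetical, and linear, additive responses are assumed):

```python
def methanol_signal(i_high, i_low, k):
    """
    Selective methanol estimate by current subtraction.

    i_high: current at +1.20 V (responds to methanol + ethanol)
    i_low:  current at +0.19 V (responds to ethanol only)
    k:      empirical correction factor scaling the ethanol contribution
            from the low potential to the high one (hypothetical value;
            in practice it is determined by calibration with pure ethanol)
    """
    return i_high - k * i_low

# Hypothetical currents (arbitrary units): the ethanol-only channel reads
# 2.0, the combined channel reads 7.0, and calibration gave k = 1.5, so
# the methanol-attributable signal is 7.0 - 1.5 * 2.0 = 4.0.
i_meoh = methanol_signal(7.0, 2.0, 1.5)
```

The subtraction only works because ethanol is the sole contributor at the low potential; any species co-oxidizing at +0.19 V would bias the methanol estimate.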

  14. Solid sorbent air sampler

    NASA Technical Reports Server (NTRS)

    Galen, T. J. (Inventor)

    1986-01-01

    A fluid sampler for collecting a plurality of discrete samples over separate time intervals is described. The sampler comprises a sample assembly having an inlet and a plurality of discrete sample tubes each of which has inlet and outlet sides. A multiport dual acting valve is provided in the sampler in order to sequentially pass air from the sample inlet into the selected sample tubes. The sample tubes extend longitudinally of the housing and are located about the outer periphery thereof so that upon removal of an enclosure cover, they are readily accessible for operation of the sampler in an analysis mode.

  15. Automatic HTS force measurement instrument

    DOEpatents

    Sanders, S.T.; Niemann, R.C.

    1999-03-30

    A device is disclosed for measuring the levitation force of a high temperature superconductor sample with respect to a reference magnet. The device includes a receptacle for holding several high temperature superconductor samples, each cooled to superconducting temperature. A rotatable carousel successively locates a selected one of the high temperature superconductor samples in registry with the reference magnet. A mechanism varies the distance between one of the high temperature superconductor samples and the reference magnet, and a sensor measures the levitation force of the sample as a function of the distance between the reference magnet and the sample. A method is also disclosed. 3 figs.

  16. System Would Acquire Core and Powder Samples of Rocks

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph; Randolph, James; Bao, Xiaoqi; Sherrit, Stewart; Ritz, Chuck; Cook, Greg

    2006-01-01

    A system for automated sampling of rocks, ice, and similar hard materials at and immediately below the surface of the ground is undergoing development. The system, denoted a sample preparation, acquisition, handling, and delivery (SPAHD) device, would be mounted on a robotic exploratory vehicle that would traverse the terrain of interest on the Earth or on a remote planet. The SPAHD device would probe the ground to obtain data for optimization of sampling, prepare the surface, acquire samples in the form(s) of cores and/or powdered cuttings, and deliver the samples to a selected location for analysis and/or storage.

  17. Fixture for supporting and aligning a sample to be analyzed in an x-ray diffraction apparatus

    DOEpatents

    Green, L.A.; Heck, J.L. Jr.

    1985-04-23

    A fixture is provided for supporting and aligning small samples of material on a goniometer for x-ray diffraction analysis. A sample-containing capillary is accurately positioned for rotation in the x-ray beam by selectively adjusting the fixture to position the capillary relative to the x and y axes thereof to prevent wobble and position the sample along the z axis or the axis of rotation. By employing the subject fixture, relatively small samples of materials can be analyzed in an x-ray diffraction apparatus previously limited to the analysis of much larger samples.

  18. Fixture for supporting and aligning a sample to be analyzed in an X-ray diffraction apparatus

    DOEpatents

    Green, Lanny A.; Heck, Jr., Joaquim L.

    1987-01-01

    A fixture is provided for supporting and aligning small samples of material on a goniometer for X-ray diffraction analysis. A sample-containing capillary is accurately positioned for rotation in the X-ray beam by selectively adjusting the fixture to position the capillary relative to the x and y axes thereof to prevent wobble and position the sample along the z axis or the axis of rotation. By employing the subject fixture, relatively small samples of materials can be analyzed in an X-ray diffraction apparatus previously limited to the analysis of much larger samples.

  19. Mineralogical Characterization of the Miocene Olcese Formation, Southern San Joaquin Valley, California

    NASA Astrophysics Data System (ADS)

    Lopez, K. A.; Baron, D.; Guo, J.; Woolford, J. M.

    2016-12-01

    The early to middle Miocene Olcese Formation in the southern San Joaquin Valley of California consists of shallow marine shelf sands in its lower and upper parts, and non-marine, frequently pumiceous sands in its middle part, and varies in thickness up to 1800 ft. Little is known about the origin, nature, quantity, and distribution of clay minerals throughout the formation. This study examined 95 sidewall core samples from three wells, as well as 388 cutting samples from four wells and 12 samples from 3 outcrops. Well samples were from depths between 1,800 and 4,000 ft. Qualitative and quantitative mineralogy including clay minerals of the sidewall samples and selected cutting samples was determined by powder X-ray diffraction (XRD). XRD analyses were supplemented by scanning electron microscopy with energy-dispersive X-ray spectroscopy (SEM-EDS) and petrographic microscopy of selected samples. The main minerals of bulk samples include composite clay, quartz, potassium feldspar/plagioclase, calcite, and clinoptilolite. Content of composite clay varies between 17% and 51%. The clay-size fraction is predominantly composed of smectite, illite, kaolinite and chlorite, with smectite being the most abundant. Smectite and clinoptilolite may be the alteration products of deeper burial of volcanic materials. The formation permeability could be significantly lowered by these authigenic minerals.

  20. GalaxyGPCRloop: Template-Based and Ab Initio Structure Sampling of the Extracellular Loops of G-Protein-Coupled Receptors.

    PubMed

    Won, Jonghun; Lee, Gyu Rie; Park, Hahnbeom; Seok, Chaok

    2018-06-07

    The second extracellular loops (ECL2s) of G-protein-coupled receptors (GPCRs) are often involved in GPCR functions, and their structures have important implications in drug discovery. However, structure prediction of ECL2 is difficult because of its long length and the structural diversity among different GPCRs. In this study, a new ECL2 conformational sampling method involving both template-based and ab initio sampling was developed. Inspired by the observation of similar ECL2 structures of closely related GPCRs, a template-based sampling method employing loop structure templates selected from the structure database was developed. A new metric for evaluating similarity of the target loop to templates was introduced for template selection. An ab initio loop sampling method was also developed to treat cases without highly similar templates. The ab initio method is based on the previously developed fragment assembly and loop closure method. A new sampling component that takes advantage of secondary structure prediction was added. In addition, a conserved disulfide bridge restraining ECL2 conformation was predicted and analytically incorporated into sampling, reducing the effective dimension of the conformational search space. The sampling method was combined with an existing energy function for comparison with previously reported loop structure prediction methods, and the benchmark test demonstrated outstanding performance.

  1. Optimal flexible sample size design with robust power.

    PubMed

    Zhang, Lanju; Cui, Lu; Yang, Bo

    2016-08-30

    It is well recognized that sample size determination is challenging because of the uncertainty on the treatment effect size. Several remedies are available in the literature. Group sequential designs start with a sample size based on a conservative (smaller) effect size and allow early stop at interim looks. Sample size re-estimation designs start with a sample size based on an optimistic (larger) effect size and allow sample size increase if the observed effect size is smaller than planned. Different opinions favoring one type over the other exist. We propose an optimal approach using an appropriate optimality criterion to select the best design among all the candidate designs. Our results show that (1) for the same type of designs, for example, group sequential designs, there is room for significant improvement through our optimization approach; (2) optimal promising zone designs appear to have no advantages over optimal group sequential designs; and (3) optimal designs with sample size re-estimation deliver the best adaptive performance. We conclude that to deal with the challenge of sample size determination due to effect size uncertainty, an optimal approach can help to select the best design that provides most robust power across the effect size range of interest. Copyright © 2016 John Wiley & Sons, Ltd.
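
    The core tension the abstract describes, a design sized for an optimistic effect loses power when the true effect is smaller, can be illustrated with a normal-approximation power calculation (all numbers hypothetical; a fixed two-sample z-test stand-in, not the paper's adaptive designs or optimality criterion):

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def power_two_sample(delta, sigma, n_per_arm):
    """Approximate power of a two-sided, alpha = 0.05 two-sample z-test
    (probability of rejecting in the favourable direction)."""
    z_crit = 1.959963984540054  # z_{0.975}
    return normal_cdf(delta / sigma * math.sqrt(n_per_arm / 2.0) - z_crit)

# Two fixed designs, sigma = 1 (hypothetical): one sized for an optimistic
# delta = 0.5 (n = 63 per arm gives ~80% power there), one sized for a
# conservative delta = 0.3 (n = 175 per arm). Scanning plausible effect
# sizes shows how quickly the optimistic design's power erodes.
for delta in (0.3, 0.4, 0.5):
    p_small = power_two_sample(delta, 1.0, 63)
    p_large = power_two_sample(delta, 1.0, 175)
    print(f"delta={delta}: n=63/arm -> {p_small:.2f}, n=175/arm -> {p_large:.2f}")
```

An "optimal" flexible design in the abstract's sense would aim to keep the right-hand column's robustness without always paying the larger design's full sample size.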

  2. Radionuclides, inorganic constituents, organic compounds, and bacteria in water from selected wells and springs from the southern boundary of the Idaho National Engineering Laboratory to the Hagerman Area, Idaho, 1990

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartholomay, R.C.; Edwards, D.D.; Campbell, L.J.

    1992-03-01

    The US Geological Survey and the Idaho Department of Water Resources, in response to a request from the US Department of Energy, sampled 19 sites as part of a long-term project to monitor water quality of the Snake River Plain aquifer from the southern boundary of the Idaho National Engineering Laboratory to the Hagerman area. Water samples were collected and analyzed for manmade pollutants and naturally occurring constituents. The samples were collected from seven irrigation wells, five domestic wells, two springs, one stock well, two dairy wells, one observation well, and one commercial well. Two quality assurance samples also were collected and analyzed. The water samples were analyzed for selected radionuclides, inorganic constituents, organic compounds, and bacteria. None of the radionuclides, inorganic constituents, or organic compounds exceeded the established maximum contaminant levels for drinking water. Most of the radionuclide and inorganic constituent concentrations exceeded their respective reporting levels. All samples analyzed for surfactants and dissolved organic carbon had concentrations that exceeded their reporting level. Toluene concentrations exceeded the reporting level in one water sample. Two samples contained fecal coliform bacteria counts that exceeded established maximum contaminant levels for drinking water.

  3. Manifold Regularized Experimental Design for Active Learning.

    PubMed

    Zhang, Lining; Shum, Hubert P H; Shao, Ling

    2016-12-02

    Various machine learning and data mining tasks in classification require abundant data samples to be labeled for training. Conventional active learning methods aim at labeling the most informative samples for alleviating the labor of the user. Many previous studies in active learning select one sample after another in a greedy manner. However, this is not very effective because the classification model has to be retrained for each newly labeled sample. Moreover, many popular active learning approaches utilize the most uncertain samples by leveraging the classification hyperplane of the classifier, which is not appropriate since the classification hyperplane is inaccurate when the training data are small-sized. The problem of insufficient training data in real-world systems limits the potential applications of these approaches. This paper presents a novel method of active learning called manifold regularized experimental design (MRED), which can label multiple informative samples at one time for training. In addition, MRED gives an explicit geometric explanation for the selected samples to be labeled by the user. Different from existing active learning methods, our method avoids the intrinsic problems caused by insufficiently labeled samples in real-world applications. Various experiments on synthetic datasets, the Yale face database and the Corel image database have been carried out to show how MRED outperforms existing methods.

  4. Bayesian model selection: Evidence estimation based on DREAM simulation and bridge sampling

    NASA Astrophysics Data System (ADS)

    Volpi, Elena; Schoups, Gerrit; Firmani, Giovanni; Vrugt, Jasper A.

    2017-04-01

    Bayesian inference has found widespread application in Earth and Environmental Systems Modeling, providing an effective tool for prediction, data assimilation, parameter estimation, uncertainty analysis and hypothesis testing. Under multiple competing hypotheses, the Bayesian approach also provides an attractive alternative to traditional information criteria (e.g. AIC, BIC) for model selection. The key variable for Bayesian model selection is the evidence (or marginal likelihood), the normalizing constant in the denominator of Bayes' theorem; while it is fundamental for model selection, the evidence is not required for Bayesian inference. It is computed for each hypothesis (model) by averaging the likelihood function over the prior parameter distribution, rather than maximizing it as information criteria do; the larger a model's evidence, the more support it receives among a collection of hypotheses, as the simulated values assign relatively high probability density to the observed data. Hence, the evidence naturally acts as an Occam's razor, preferring simpler and more constrained models against the selection of over-fitted ones by information criteria that incorporate only the likelihood maximum. Since it is not particularly easy to estimate the evidence in practice, Bayesian model selection via the marginal likelihood has not yet found mainstream use. We illustrate here the properties of a new estimator of the Bayesian model evidence, which provides robust and unbiased estimates of the marginal likelihood; the method is coined Gaussian Mixture Importance Sampling (GMIS). GMIS uses multidimensional numerical integration of the posterior parameter distribution via bridge sampling (a generalization of importance sampling) of a mixture distribution fitted to samples of the posterior distribution derived from the DREAM algorithm (Vrugt et al., 2008; 2009). Some illustrative examples are presented to show the robustness and superiority of the GMIS estimator with respect to other commonly used approaches in the literature.
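
    The evidence estimate at the heart of this approach can be illustrated with a toy importance-sampling sketch (a minimal stand-in, not the authors' implementation): in a conjugate Gaussian problem the marginal likelihood has a closed form, and a proposal fitted to the posterior, here taken as the exact posterior in place of GMIS's fitted Gaussian mixture, recovers it.

```python
import math
import random

def normal_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# Toy conjugate problem (not the environmental models of the abstract):
# prior theta ~ N(0, 1), one observation y ~ N(theta, 1).
# The marginal likelihood is then N(y; 0, 2) in closed form.
y = 1.3
exact_evidence = normal_pdf(y, 0.0, 2.0)

# Importance-sampling estimator Z ~= mean of L(theta) p(theta) / q(theta)
# over draws theta ~ q. We use the known posterior N(y/2, 1/2) as the
# proposal q, standing in for the mixture GMIS fits to DREAM samples;
# because q equals the posterior exactly here, every weight equals the
# evidence and the estimator has zero variance.
random.seed(0)
mu_q, var_q = y / 2.0, 0.5
n_draws = 20000
total = 0.0
for _ in range(n_draws):
    theta = random.gauss(mu_q, math.sqrt(var_q))
    total += (normal_pdf(y, theta, 1.0) * normal_pdf(theta, 0.0, 1.0)
              / normal_pdf(theta, mu_q, var_q))
estimate = total / n_draws
```

The zero-variance property when the proposal matches the posterior is exactly why a close posterior fit (the Gaussian mixture in GMIS) makes the estimator robust; a poor proposal would instead give heavy-tailed weights and an unstable estimate.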

  5. Effects of select and reject control on equivalence class formation and transfer of function.

    PubMed

    Perez, William F; Tomanari, Gerson Y; Vaidya, Manish

    2015-09-01

    The present study used a single-subject design to evaluate the effects of select or reject control on equivalence class formation and transfer of function. Adults were exposed to a matching-to-sample task with observing requirements (MTS-OR) in order to bias the establishment of sample/S+ (select) or sample/S- (reject) relations. In Experiment 1, four sets of baseline conditional relations were taught: two under reject control (A1B2C1, A2B1C2) and two under select control (D1E1F1, D2E2F2). Participants were tested for transitivity, symmetry, equivalence and reflexivity. They also learned a simple discrimination involving one of the stimuli from the equivalence classes and were tested for the transfer of the discriminative function. In general, participants performed with high accuracy on all equivalence-related probes as well as the transfer of function probes under select control. Under reject control, participants had high scores only on the symmetry test; transfer of function was attributed to stimuli programmed as S-. In Experiment 2, the equivalence class under reject control was expanded to four members (A1B2C1D2; A2B1C2D1). Participants had high scores only on symmetry and on transitivity and equivalence tests involving two nodes. Transfer of function was extended to the programmed S- added to each class. Results from both experiments suggest that select and reject controls might differently affect the formation of equivalence classes and the transfer of stimulus functions. © Society for the Experimental Analysis of Behavior.

  6. An efficient sampling strategy for selection of biobank samples using risk scores.

    PubMed

    Björk, Jonas; Malmqvist, Ebba; Rylander, Lars; Rignell-Hydbom, Anna

    2017-07-01

    The aim of this study was to suggest a new sample-selection strategy based on risk scores in case-control studies with biobank data. An ongoing Swedish case-control study on fetal exposure to endocrine disruptors and overweight in early childhood was used as the empirical example. Cases were defined as children with a body mass index (BMI) ⩾18 kg/m² (n=545) at four years of age, and controls as children with a BMI of ⩽17 kg/m² (n=4472 available). The risk of being overweight was modelled using logistic regression based on available covariates from the health examination and prior to selecting samples from the biobank. A risk score was estimated for each child and categorised as low (0-5%), medium (6-13%) or high (⩾14%) risk of being overweight. The final risk-score model, with smoking during pregnancy (p=0.001), birth weight (p<0.001), BMI of both parents (p<0.001 for both), type of residence (p=0.04) and economic situation (p=0.12), yielded an area under the receiver operating characteristic curve of 67% (n=3945 with complete data). The case group (n=416) had the following risk-score profile: low (12%), medium (46%) and high risk (43%). Twice as many controls were selected from each risk group, with further matching on sex. Computer simulations showed that the proposed selection strategy with stratification on risk scores yielded consistent improvements in statistical precision. Using risk scores based on available survey or register data as a basis for sample selection may improve possibilities to study heterogeneity of exposure effects in biobank-based studies.
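
    The stratified control-selection step can be sketched as follows (a minimal stand-in with simulated data: the risk-score strata cut-points and the 2:1 control-to-case ratio follow the abstract, but the cohort, scores, and case rate are fabricated, and the logistic model fitting and sex matching are omitted):

```python
import random

random.seed(42)

def risk_stratum(score):
    """Map a predicted overweight risk to the strata used in the abstract."""
    if score < 0.06:
        return "low"
    if score < 0.14:
        return "medium"
    return "high"

# Hypothetical cohort: (child_id, predicted_risk, is_case). In the study the
# scores come from a logistic model fitted on covariates available before
# any biobank samples are drawn; here they are random placeholders.
cohort = [(i, random.random() * 0.3, random.random() < 0.1)
          for i in range(5000)]

cases = [r for r in cohort if r[2]]
controls = [r for r in cohort if not r[2]]

# Count cases per risk stratum, then draw twice as many controls from the
# same stratum, so the control set mirrors the cases' risk profile.
case_counts = {}
for _, score, _ in cases:
    s = risk_stratum(score)
    case_counts[s] = case_counts.get(s, 0) + 1

selected_controls = []
for stratum, n_cases in case_counts.items():
    pool = [r for r in controls if risk_stratum(r[1]) == stratum]
    selected_controls += random.sample(pool, min(2 * n_cases, len(pool)))
```

Matching the control draw to the cases' risk-score distribution is what concentrates the biobank assays where exposure contrasts are informative, which is the source of the precision gain the simulations report.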

  7. The use of laser-induced fluorescence or ultraviolet detectors for sensitive and selective analysis of tobramycin or erythropoietin in complex samples

    NASA Astrophysics Data System (ADS)

    Ahmed, Hytham M.; Ebeid, Wael B.

    2015-05-01

    Complex sample analysis is a challenge in pharmaceutical and biopharmaceutical analysis. In this work, tobramycin (TOB) analysis in human urine samples and recombinant human erythropoietin (rhEPO) analysis in the presence of a similar protein were selected as representative examples of such analyses. Assays of TOB in urine samples are difficult because of poor detectability. Therefore a laser induced fluorescence detector (LIF) was combined with a separation technique, micellar electrokinetic chromatography (MEKC), to determine TOB through derivatization with fluorescein isothiocyanate (FITC). Borate was used as background electrolyte (BGE) with negative-charged mixed micelles as additive. The method was successfully applied to urine samples. The LOD and LOQ for tobramycin in urine were 90 and 200 ng/ml, respectively, and recovery was >98% (n = 5). All urine samples were analyzed by direct injection without sample pre-treatment. Another hyphenated analytical technique, capillary zone electrophoresis (CZE) connected to an ultraviolet (UV) detector, was also used for sensitive analysis of rhEPO at low levels (2000 IU) in the presence of a large amount of human serum albumin (HSA). Analysis of rhEPO was achieved by the use of electrokinetic injection (EI) with discontinuous buffers. Phosphate buffer was used as BGE with metal ions as additive. The proposed method can be used for the estimation of a large number of quality control rhEPO samples in a short period.

  8. Evaluation of field sampling and preservation methods for strontium-90 in ground water at the Idaho National Engineering Laboratory, Idaho

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cecil, L.D.; Knobel, L.L.; Wegner, S.J.

    1989-09-01

    From 1952 to 1988, about 140 curies of strontium-90 have been discharged in liquid waste to disposal ponds and wells at the INEL (Idaho National Engineering Laboratory). The US Geological Survey routinely samples ground water from the Snake River Plain aquifer and from discontinuous perched-water zones for selected radionuclides, major and minor ions, and chemical and physical characteristics. Water samples for strontium-90 analyses collected in the field are unfiltered and preserved to an approximate 2-percent solution with reagent-grade hydrochloric acid. Water from four wells completed in the Snake River Plain aquifer was sampled as part of the US Geological Survey's quality-assurance program to evaluate the effect of filtration and preservation methods on strontium-90 concentrations in ground water at the INEL. The wells were selected for sampling on the basis of historical concentrations of strontium-90 in ground water. Water from each well was filtered through either a 0.45- or a 0.1-micrometer membrane filter; unfiltered samples also were collected. Two sets of filtered and two sets of unfiltered water samples were collected at each well. One set of water samples was preserved in the field to an approximate 2-percent solution with reagent-grade hydrochloric acid and the other set of samples was not acidified. 13 refs., 2 figs., 6 tabs.

  9. Photo ion spectrometer

    DOEpatents

    Gruen, Dieter M.; Young, Charles E.; Pellin, Michael J.

    1989-01-01

    A method and apparatus for extracting for quantitative analysis ions of selected atomic components of a sample. A lens system is configured to provide a slowly diminishing field region for a volume containing the selected atomic components, enabling accurate energy analysis of ions generated in the slowly diminishing field region. The lens system also enables focusing on a sample of a charged particle beam, such as an ion beam, along a path length perpendicular to the sample and extraction of the charged particles along a path length also perpendicular to the sample. Improvement of signal to noise ratio is achieved by laser excitation of ions to selected autoionization states before carrying out quantitative analysis. Accurate energy analysis of energetic charged particles is assured by using a preselected resistive thick film configuration disposed on an insulator substrate for generating predetermined electric field boundary conditions to achieve for analysis the required electric field potential. The spectrometer also is applicable in the fields of SIMS, ISS and electron spectroscopy.

  10. Photo ion spectrometer

    DOEpatents

    Gruen, D.M.; Young, C.E.; Pellin, M.J.

    1989-08-08

    A method and apparatus are described for extracting for quantitative analysis ions of selected atomic components of a sample. A lens system is configured to provide a slowly diminishing field region for a volume containing the selected atomic components, enabling accurate energy analysis of ions generated in the slowly diminishing field region. The lens system also enables focusing on a sample of a charged particle beam, such as an ion beam, along a path length perpendicular to the sample and extraction of the charged particles along a path length also perpendicular to the sample. Improvement of signal to noise ratio is achieved by laser excitation of ions to selected auto-ionization states before carrying out quantitative analysis. Accurate energy analysis of energetic charged particles is assured by using a preselected resistive thick film configuration disposed on an insulator substrate for generating predetermined electric field boundary conditions to achieve for analysis the required electric field potential. The spectrometer also is applicable in the fields of SIMS, ISS and electron spectroscopy. 8 figs.

  11. CO hydrogenation on PdCo/NaY catalysts: Effect of ion hydration on metal phases and selectivity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yuangen Yin; Zongchao Zhang; Sachtler, W.M.H.

    1993-02-01

    Exposure of calcined PdCo/NaY catalyst precursors to water vapor, prior to reduction, strongly affects the CO hydrogenation activity and selectivity of the reduced bimetal catalysts. With samples that had been exposed to H₂O before reduction, the formation of hydrocarbons prevails; nonhydrated reference samples of the same overall composition are mainly selective for oxygenates. After 6 h of reaction time, PdCo alloy particles of 5.8 nm are detected by XRD in H₂O-exposed catalysts, but in the reference samples the metal particles are below the limit of detection by XRD. The observed effects are attributed to the formation of mobile aquo-complexes of metal ions; after reduction they are converted to larger alloy particles, richer in Co than in the reference samples. Results obtained with NaOH-neutralized and Co-free Pd/NaY catalysts are also discussed. 23 refs., 13 figs., 1 tab.

  12. Magnetic dummy molecularly imprinted polymers based on multi-walled carbon nanotubes for rapid selective solid-phase extraction of 4-nonylphenol in aqueous samples.

    PubMed

    Rao, Wei; Cai, Rong; Yin, Yuli; Long, Fang; Zhang, Zhaohui

    2014-10-01

    In this paper, a highly selective sample clean-up procedure combining magnetic dummy molecular imprinting with solid-phase extraction was developed for rapid separation and determination of 4-nonylphenol (NP) in environmental water samples. The magnetic dummy molecularly imprinted polymers (mag-DMIPs) based on multi-walled carbon nanotubes were successfully synthesized with a surface molecular imprinting technique using 4-tert-octylphenol as the dummy template and tetraethylorthosilicate as the cross-linker. The maximum adsorption capacity of the mag-DMIPs for NP was 52.4 mg g(-1) and it took about 20 min to achieve adsorption equilibrium. The mag-DMIPs exhibited specific selective adsorption toward NP. Coupled with high performance liquid chromatography analysis, the mag-DMIPs were successfully applied to the solid-phase extraction and determination of NP in real water samples, with recoveries of 88.6-98.1%. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Phenotypic constraints promote latent versatility and carbon efficiency in metabolic networks.

    PubMed

    Bardoscia, Marco; Marsili, Matteo; Samal, Areejit

    2015-07-01

    System-level properties of metabolic networks may be the direct product of natural selection or arise as a by-product of selection on other properties. Here we study the effect of direct selective pressure for growth or viability in particular environments on two properties of metabolic networks: latent versatility to function in additional environments and carbon usage efficiency. Using a Markov chain Monte Carlo (MCMC) sampling based on flux balance analysis (FBA), we sample from a known biochemical universe random viable metabolic networks that differ in the number of directly constrained environments. We find that the latent versatility of sampled metabolic networks increases with the number of directly constrained environments and with the size of the networks. We then show that the average carbon wastage of sampled metabolic networks across the constrained environments decreases with the number of directly constrained environments and with the size of the networks. Our work expands the growing body of evidence about nonadaptive origins of key functional properties of biological networks.

  14. Integration of reconfigurable potentiometric electrochemical sensors into a digital microfluidic platform.

    PubMed

    Farzbod, Ali; Moon, Hyejin

    2018-05-30

    This paper presents the demonstration of on-chip fabrication of a potassium-selective sensor array enabled by electrowetting on dielectric digital microfluidics for the first time. This demonstration proves the concept that electrochemical sensors can be seamlessly integrated with sample preparation units in a digital microfluidic platform. More significantly, the successful on-chip fabrication of a sensor array indicates that sensors become reconfigurable and have longer lifetime in a digital microfluidic platform. The on-chip fabrication of ion-selective electrodes includes electroplating Ag, followed by forming an AgCl layer by chemical oxidation and depositing a thin layer of the desired polymer-based ion selective membrane on one of the sensor electrodes. In this study, potassium ionophores work as potassium ion channels and make the membrane selective to potassium ions. This selectiveness results in a voltage difference across the membrane layer, which is correlated with potassium ion concentration. The calibration curve of the fabricated potassium-selective electrode demonstrates a slope of 58 mV/dec for potassium concentration in KCl sample solutions and shows good agreement with the ideal Nernstian response. The proposed sensor platform is an outstanding candidate for portable home use for continuous monitoring of ions thanks to its advantages such as easy automation of sample preparation and detection processes, elongated sensor lifetime, minimal membrane and sample consumption, and user-definable/reconfigurable sensor array. Copyright © 2018 Elsevier B.V. All rights reserved.
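
    The "ideal Nernstian response" that the 58 mV/dec calibration slope is compared against follows directly from the Nernst equation; a quick computation of the theoretical slope at 25 °C (standard physical constants, monovalent K+):

```python
import math

R = 8.314462618   # gas constant, J mol^-1 K^-1
F = 96485.33212   # Faraday constant, C mol^-1
T = 298.15        # temperature, K (25 C)

def nernst_slope_mV_per_decade(z):
    """Ideal Nernstian slope for an ion of charge z, in mV per decade of
    activity: (RT ln 10) / (zF), converted from volts to millivolts."""
    return 1000.0 * R * T * math.log(10.0) / (z * F)

slope_K = nernst_slope_mV_per_decade(+1)  # ~59.2 mV/dec for K+ at 25 C
```

The measured 58 mV/dec is within a couple of millivolts of the ~59.2 mV/dec theoretical value, which is what "good agreement with the ideal Nernstian response" means quantitatively.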

  15. Sampling in Developmental Science: Situations, Shortcomings, Solutions, and Standards.

    PubMed

    Bornstein, Marc H; Jager, Justin; Putnick, Diane L

    2013-12-01

    Sampling is a key feature of every study in developmental science. Although sampling has far-reaching implications, too little attention is paid to sampling. Here, we describe, discuss, and evaluate four prominent sampling strategies in developmental science: population-based probability sampling, convenience sampling, quota sampling, and homogeneous sampling. We then judge these sampling strategies by five criteria: whether they yield representative and generalizable estimates of a study's target population, whether they yield representative and generalizable estimates of subsamples within a study's target population, the recruitment efforts and costs they entail, whether they yield sufficient power to detect subsample differences, and whether they introduce "noise" related to variation in subsamples and whether that "noise" can be accounted for statistically. We use sample composition of gender, ethnicity, and socioeconomic status to illustrate and assess the four sampling strategies. Finally, we tally the use of the four sampling strategies in five prominent developmental science journals and make recommendations about best practices for sample selection and reporting.
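    Two of the strategies the abstract contrasts (population-based probability sampling and quota sampling) can be illustrated on a synthetic sampling frame; all names and numbers below are hypothetical, not from the study:

```python
import random

random.seed(0)

# Hypothetical sampling frame: 10,000 "participants" with a binary gender
# and one of three SES levels.
frame = [{"gender": random.choice("MF"),
          "ses": random.choice(["low", "mid", "high"])}
         for _ in range(10_000)]

def probability_sample(frame, n):
    """Population-based probability sampling: every unit has equal chance."""
    return random.sample(frame, n)

def quota_sample(frame, quotas):
    """Quota sampling: fill fixed per-group counts, here by gender."""
    out = []
    remaining = dict(quotas)          # e.g. {"M": 50, "F": 50}
    for person in frame:              # walk the frame until quotas are met
        if remaining.get(person["gender"], 0) > 0:
            out.append(person)
            remaining[person["gender"]] -= 1
    return out

prob = probability_sample(frame, 100)
quota = quota_sample(frame, {"M": 50, "F": 50})
```

    The quota sample fixes subsample composition by construction, but its units are not drawn with known inclusion probabilities, which is why the two strategies fare differently on the representativeness and generalizability criteria above.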

  16. Sampling in Developmental Science: Situations, Shortcomings, Solutions, and Standards

    PubMed Central

    Bornstein, Marc H.; Jager, Justin; Putnick, Diane L.

    2014-01-01

    Sampling is a key feature of every study in developmental science. Although sampling has far-reaching implications, too little attention is paid to sampling. Here, we describe, discuss, and evaluate four prominent sampling strategies in developmental science: population-based probability sampling, convenience sampling, quota sampling, and homogeneous sampling. We then judge these sampling strategies by five criteria: whether they yield representative and generalizable estimates of a study’s target population, whether they yield representative and generalizable estimates of subsamples within a study’s target population, the recruitment efforts and costs they entail, whether they yield sufficient power to detect subsample differences, and whether they introduce “noise” related to variation in subsamples and whether that “noise” can be accounted for statistically. We use sample composition of gender, ethnicity, and socioeconomic status to illustrate and assess the four sampling strategies. Finally, we tally the use of the four sampling strategies in five prominent developmental science journals and make recommendations about best practices for sample selection and reporting. PMID:25580049

  17. [Comparison study on sampling methods of Oncomelania hupensis snail survey in marshland schistosomiasis epidemic areas in China].

    PubMed

    An, Zhao; Wen-Xin, Zhang; Zhong, Yao; Yu-Kuan, Ma; Qing, Liu; Hou-Lang, Duan; Yi-di, Shang

    2016-06-29

    To optimize and simplify the Oncomelania hupensis snail survey method in marshland schistosomiasis-endemic regions and to increase the precision, efficiency and economy of snail surveys, a 50 m × 50 m experimental quadrat was selected in the Chayegang marshland near Henghu farm in the Poyang Lake region, and a whole-coverage method was adopted to survey the snails. Simple random sampling, systematic sampling and stratified random sampling were applied to calculate the minimum sample size, relative sampling error and absolute sampling error. The minimum sample sizes for simple random sampling, systematic sampling and stratified random sampling were 300, 300 and 225, respectively. The relative sampling errors of the three methods were all less than 15%. The absolute sampling errors were 0.2217, 0.3024 and 0.0478, respectively. Spatial stratified sampling with altitude as the stratum variable is an efficient approach, with lower cost and higher precision, for the snail survey.
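    The three designs compared in the abstract can be sketched on a synthetic 50 × 50 grid of quadrats; the snail counts and altitude bands below are invented for illustration, not the paper's field data:

```python
import random

random.seed(1)

# Toy reconstruction of the survey design: a 50 x 50 grid of quadrats,
# each with a snail count that trends with a synthetic "altitude" band.
GRID = 50

def altitude_band(row):                 # 5 horizontal altitude strata
    return row // 10

counts = [[random.randint(0, 2) + 2 * altitude_band(r)
           for _ in range(GRID)] for r in range(GRID)]
cells = [(r, c) for r in range(GRID) for c in range(GRID)]
true_mean = sum(counts[r][c] for r, c in cells) / len(cells)

def mean_of(sample):
    return sum(counts[r][c] for r, c in sample) / len(sample)

# Simple random sampling: 300 cells drawn uniformly without replacement.
srs = random.sample(cells, 300)

# Systematic sampling: every k-th cell from a random start, 300 cells.
k = len(cells) // 300
start = random.randrange(k)
sysm = cells[start::k][:300]

# Stratified random sampling with altitude as the stratum variable:
# equal allocation of 45 cells to each of the 5 bands (225 total).
strata = {b: [(r, c) for r, c in cells if altitude_band(r) == b]
          for b in range(5)}
strat = [cell for band in strata.values()
         for cell in random.sample(band, 45)]

# Absolute sampling error = |estimate - true mean| for each design.
errors = {name: abs(mean_of(s) - true_mean)
          for name, s in [("simple", srs), ("systematic", sysm),
                          ("stratified", strat)]}
```

    Stratifying on a variable correlated with density (altitude here) removes the between-stratum variance from the estimate, which is why the stratified design can reach comparable precision with a smaller sample, matching the abstract's 225 versus 300 minimum sample sizes.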

  18. Water-quality, bed-sediment, and biological data (October 2014 through September 2015) and statistical summaries of data for streams in the Clark Fork Basin, Montana

    USGS Publications Warehouse

    Dodge, Kent A.; Hornberger, Michelle I.; Turner, Matthew A.

    2017-01-19

    Water, bed sediment, and biota were sampled in selected streams from Butte to near Missoula, Montana, as part of a monitoring program in the upper Clark Fork Basin of western Montana. The sampling program was led by the U.S. Geological Survey, in cooperation with the U.S. Environmental Protection Agency, to characterize aquatic resources in the Clark Fork Basin, with emphasis on trace elements associated with historical mining and smelting activities. Sampling sites were located on the Clark Fork and selected tributaries. Water samples were collected periodically at 20 sites from October 2014 through September 2015. Bed-sediment and biota samples were collected once at 13 sites during August 2015. This report presents the analytical results and quality-assurance data for water-quality, bed-sediment, and biota samples collected at sites from October 2014 through September 2015. Water-quality data include concentrations of selected major ions, trace elements, and suspended sediment. At 12 sites, samples for analysis of dissolved organic carbon and turbidity were collected. In addition, samples for analysis of nitrogen (nitrate plus nitrite) were collected at two sites. Daily values of mean suspended-sediment concentration and suspended-sediment discharge were determined for three sites. Seasonal daily values of turbidity were determined for four sites. Bed-sediment data include trace-element concentrations in the fine-grained fraction. Biological data include trace-element concentrations in whole-body tissue of aquatic benthic insects. Statistical summaries of water-quality, bed-sediment, and biological data for sites in the upper Clark Fork Basin are provided for the period of record.

  19. Stability of selected serum hormones and lipids after long-term storage in the Janus Serum Bank.

    PubMed

    Gislefoss, Randi E; Grimsrud, Tom K; Mørkrid, Lars

    2015-04-01

    The potential value of a biobank depends on the quality of its samples, i.e., how well they reflect the biological or biochemical state of the donors at the time of sampling. Documentation of sample quality has become a particularly important issue for researchers and users of biobank studies. The aim of this study was to investigate the long-term stability of selected components: cholesterol, high-density lipoprotein cholesterol (HDLC), low-density lipoprotein cholesterol (LDLC), apolipoprotein A1 (apo-A1), apolipoprotein B (apo-B), follicle-stimulating hormone (FSH), luteinizing hormone (LH), prolactin (PRL), thyroid-stimulating hormone (TSH) and free thyroxine (FT4). Samples from 520 men, aged 40-49 years at blood sampling and stored at -25°C, were distributed into equally sized groups (n=130) according to length of storage (0, 4, 17 and 29 years) in a cross-sectional design. The freshly collected serum samples were used as a reference group to calculate storage-related changes. The differences between fresh samples and samples stored for 29 years were substantial for apo-A1 (+12%), apo-B (+22.3%), HDLC (-69.2%), LDLC (+31.3%), and PRL (-33.5%), while total cholesterol, FSH, LH, TSH and FT4 did not show any significant difference. The study showed large differences in the serum levels of the selected components. The lipids and apolipoproteins all changed except for total cholesterol. Most hormones investigated (FSH, LH, TSH and FT4) proved to be stable after 29 years of storage, while PRL showed signs of degradation. The observed differences are probably due to long-term storage effects and/or external factors (i.e., diet and smoking). Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
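    The reported figures are relative changes of stored-group means against the fresh reference group. A minimal sketch of that calculation (the mean levels used below are hypothetical values chosen only to reproduce the reported +12% for apo-A1, not the study's raw data):

```python
# Storage-change calculation behind the reported percentages: the relative
# difference between a stored-sample mean and the fresh reference mean.
def storage_change_pct(stored_mean, fresh_mean):
    """Percent change of a stored-sample mean vs. the fresh reference."""
    return (stored_mean - fresh_mean) / fresh_mean * 100.0

# e.g. a hypothetical apo-A1 fresh mean of 1.40 g/L measured at 1.568 g/L
# after 29 years of storage corresponds to the reported +12%:
apo_a1_change = storage_change_pct(1.568, 1.40)
```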

  20. Mid-infrared spectroscopy combined with chemometrics to detect Sclerotinia stem rot on oilseed rape (Brassica napus L.) leaves.

    PubMed

    Zhang, Chu; Feng, Xuping; Wang, Jian; Liu, Fei; He, Yong; Zhou, Weijun

    2017-01-01

    Detection of plant diseases in a fast and simple way is crucial for timely disease control. Conventionally, plant diseases are accurately identified by DNA-, RNA- or serology-based methods, which are time-consuming, complex and expensive. Mid-infrared spectroscopy is a promising technique that simplifies the detection procedure. Mid-infrared spectroscopy was used to identify the spectral differences between healthy and infected oilseed rape leaves. Two different sample sets from two experiments were used to explore and validate the feasibility of using mid-infrared spectroscopy to detect Sclerotinia stem rot (SSR) on oilseed rape leaves. The average mid-infrared spectra showed differences between healthy and infected leaves, and the differences varied among sample sets. Optimal wavenumbers for the two sample sets selected by the second-derivative spectra were similar, indicating the efficacy of selecting optimal wavenumbers. Chemometric methods, including partial least squares-discriminant analysis, support vector machines and extreme learning machines, were further used to quantitatively detect oilseed rape leaves infected by SSR. The discriminant models using the full spectra and the optimal wavenumbers of the two sample sets were effective, with classification accuracies over 80%. The discriminant results for the two sample sets varied due to variations in the samples. The use of two sample sets confirmed the feasibility of using mid-infrared spectroscopy and chemometric methods for detecting SSR on oilseed rape leaves. The similarities among the selected optimal wavenumbers in different sample sets made it feasible to simplify the models and build practical ones. Mid-infrared spectroscopy is a reliable and promising technique for SSR control. This study supports the practical application of mid-infrared spectroscopy combined with chemometrics to detect plant disease.
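    The pipeline the abstract describes (second-derivative wavenumber selection followed by a discriminant model) can be sketched on synthetic "spectra". Everything below is invented for illustration: the data are random, and a simple nearest-centroid rule stands in for the paper's PLS-DA/SVM/ELM models:

```python
import random

random.seed(2)

# Synthetic two-class "spectra": infection shifts absorbance at a few bands.
N_BANDS = 100

def make_spectrum(infected):
    base = [1.0 + 0.3 * random.random() for _ in range(N_BANDS)]
    if infected:
        for b in (20, 45, 70):
            base[b] += 0.8
    return base

train = [(make_spectrum(i % 2 == 1), i % 2) for i in range(60)]
holdout = [(make_spectrum(i % 2 == 1), i % 2) for i in range(40)]

def second_derivative(s):
    return [s[i - 1] - 2 * s[i] + s[i + 1] for i in range(1, len(s) - 1)]

def class_mean(samples, label):
    members = [s for s, y in samples if y == label]
    return [sum(col) / len(members) for col in zip(*members)]

# Select the bands where the class-mean second derivatives differ most.
d0 = second_derivative(class_mean(train, 0))
d1 = second_derivative(class_mean(train, 1))
diffs = sorted(range(len(d0)), key=lambda i: -abs(d0[i] - d1[i]))
bands = [i + 1 for i in diffs[:6]]     # +1 undoes the derivative offset

# Nearest-centroid classifier on the selected bands.
m = {y: class_mean(train, y) for y in (0, 1)}
cent = {y: [m[y][b] for b in bands] for y in (0, 1)}

def classify(s):
    feats = [s[b] for b in bands]
    dist = {y: sum((f - c) ** 2 for f, c in zip(feats, cent[y]))
            for y in (0, 1)}
    return min(dist, key=dist.get)

accuracy = sum(classify(s) == y for s, y in holdout) / len(holdout)
```

    The second derivative suppresses the smooth baseline and emphasizes sharp local changes, which is why it is a common choice for picking informative wavenumbers before fitting a discriminant model.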
